Cloud: Datacenters, Meet Software!

The PC revolution has redefined the notion of a computer over the past four decades. Now it might be time to redefine the notion of a collection of computers, that is, to redefine the datacenter.

Datacenters are thought of as big places. Some of the more recent plants used by mega-users like Google, Amazon, Microsoft, and Facebook have acres of land under roof, with many tens of thousands of individual systems and power requirements that would support a small city.

Even your friendly local, on-site enterprise datacenter is likely to be a big room with a big budget commitment and a lot of people hired to manage it.

But what if a datacenter could fit in the corner of a room, or under a desk, or in the palm of your hand?

This seems to be the direction we’re headed, as data loads simultaneously grow exponentially and become ever more distributed. This is also part of the vision I saw and heard outlined at the recent Open Compute Summit in San Jose.

Transparency as a Service
The summit was sponsored by The Open Compute Project Foundation, whose goal is “to design and enable the delivery of the most efficient server, storage and data center hardware designs for scalable computing,” according to its mission statement. Members strive to share ideas, specs, and intellectual property in an open environment. The foundation is anchored by Facebook and the company’s commitment to transparency in how it builds out the massive datacenter infrastructure it requires.

One significant announcement at the summit was made by Vapor.io and company CEO Cole Crawford. The company aims for nothing less than the utter transformation of the datacenter, starting with a programmable, open-source-based management solution at the top of the stack.

Crawford and Chief Architect Steven White envision a modern, data-driven datacenter in which servers are “cattle, not pets,” following the still-new concept of software-defined servers and datacenters. “The Open Data Center Runtime Environment is the first accepted contribution to the Open Compute Foundation using the reciprocal license thus ensuring that forks and branches won’t exist,” according to Vapor.io. “We did this to ensure that when you are interacting with your data center, you’re communicating over a community owned, community standard.”

The company’s ultimate vision is a modern hardware configuration that brings new levels of efficiency and output to datacenters of all sizes.

Mobility & Then the IoT
Mobility is today’s primary driver of the fast-growing global dataflow. The worldwide proliferation of tablets and especially smartphones will push the total amount of data moving across the Internet past a zettabyte (1 billion terabytes) annually this year or next. That’s more than 30 terabytes per second.
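For the arithmetic-minded, here is a quick Python sanity check of that figure, using the decimal definition of a zettabyte (10^21 bytes):

    # One zettabyte per year, expressed in terabytes per second.
    ZETTABYTE = 10**21                  # bytes (decimal/SI definition)
    TERABYTE = 10**12                   # bytes
    SECONDS_PER_YEAR = 365 * 24 * 3600

    print(ZETTABYTE / TERABYTE / SECONDS_PER_YEAR)  # ~31.7 TB per second

So a zettabyte a year works out to roughly 31.7 terabytes every second, consistent with the “more than 30” above.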

Smartphone ownership will reach into the billions soon enough, and even in many developing countries, such as the Philippines, there are now more mobile phones than people.

But we ain’t seen nuttin’ yet: the Internet of Things (IoT) will soon add billions of new devices to the global Internet. Though much of the traffic it generates will be hyper-local (via Bluetooth and other short-range technologies), enough of it will travel the Internet to push global annual dataflows into the dozens of zettabytes by 2020, according to Gartner and others.

Think of it as cloud computing to the nth degree in all dimensions. Think of the phrase made famous by Sun Microsystems, “the network is the computer,” extending out to “the edge of the network is your computer.”

The edge of the network seems to me much like the edge of the universe: no single observer can ever find it. One person’s edge is another’s center. Cyberspace expands outward from wherever you are, and you will expect the same performance (some day) for your single device or your enterprise no matter where you are.

Big, bulky, centralized datacenters cannot provide this edge service ubiquitously and effectively. There is also the matter of energy consumption. Datacenters accounted for about 2% of all US electricity consumption in 2011, and that number has certainly risen since, although not as quickly as the EPA had originally estimated for the period 2007 to 2011.

Focus on Power Consumption
But let’s not get distracted by this particular metric. The big picture is one that features global power consumption and the aspiration of billions of people in developing nations to have better lives.

As I’ve written many times, and as much of our research at the Tau Institute explores, developing nations typically consume 3 to 5% as much electricity per person as the developed world.

We believe that an aggressive national commitment to IT is a primary indicator of sustained economic and societal growth. IT at scale is power-hungry, so achieving significant economic improvement means achieving significant new efficiencies in power consumption.

Right Direction
The vision laid out by Vapor.io seems to be a positive step in this direction. Crawford says the technology, which already has its first customer in Indiana in the US, aims for a PUE of 1.1, compared to an industry average of 1.9. (PUE, or power usage effectiveness, is the ratio of the total energy a datacenter draws to the energy used by its computing equipment; the overhead is eaten up primarily by air conditioning.)
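A minimal sketch of that ratio in Python; the kilowatt figures below are hypothetical, chosen only to reproduce the 1.9 and 1.1 values quoted above:

    def pue(total_facility_kw, it_equipment_kw):
        """Power usage effectiveness: total facility power divided by
        IT equipment power. An overhead-free datacenter would score 1.0."""
        return total_facility_kw / it_equipment_kw

    # Hypothetical figures chosen to match the numbers in the article:
    print(pue(1900, 1000))  # 1.9 (industry average)
    print(pue(1100, 1000))  # 1.1 (Vapor.io's target)

Put another way: at a PUE of 1.9, nearly half of every watt entering the building does no computing; at 1.1, that overhead drops to about 9%.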

Crawford and team go further, asserting that a new metric needs to be put into place: performance per watt per dollar, or PWD.
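The exact formula isn’t spelled out here, so the following Python sketch is only one plausible reading of the name, a simple quotient with hypothetical inputs:

    def pwd(performance, watts, dollars):
        """One hypothetical reading of performance per watt per dollar
        (PWD): useful work divided by power draw and by cost. The actual
        definition may differ; treat this as illustration only."""
        return performance / (watts * dollars)

The appeal of such a metric is that it would fold operating efficiency and capital cost into a single number, rather than measuring energy overhead alone, as PUE does.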

New efficiencies and new metrics are one big part of the puzzle. Another big part takes us back to the question near the beginning of this article. What if I could hold a datacenter in my hand? When will I be able to do this?

For now, the direction is being set. The world will need more mega-datacenter technology in smaller urban spaces, as mobility and the IoT inexorably drive dataflows upward. It will also need as much cloud-driven technology as we can imagine within buildings and, some day, per person.

The Software World
The third big piece of the puzzle involves the software that’s eating the world, in the phrase made notorious by Marc Andreessen in the Wall Street Journal in 2011. The world of cloud computing is a world of virtualization, containers, languages, platforms, architectures, and many things as-a-service.

It is a world unfamiliar to many people in the datacenter business. A grand conversation is beginning, and it will need to intensify dramatically to sync up where the world of data is going with where the world of datacenters should be going.
