The data center industry, and cloud providers in particular, has recently been in the spotlight for its massive power consumption. To put that in perspective, if the industry were a country, it would rank among the top 15 energy consumers in the world, somewhere between Spain and Italy. The 30+ gigawatts pushed through data centers across the globe have a far-reaching impact, including on the enterprise bottom line. Electricity isn't cheap and, to top it off, analyst firm IDC has estimated that unutilized server capacity amounts to $140 billion, more than 20 million servers, and 80 million tons of CO2 per year.
Yet virtualization technology, used in a public or private cloud scenario to run many operating system instances simultaneously on the same piece of hardware, can push efficiency as high as 80%. Pair this with Infrastructure as a Service and other cloud computing models where virtualization is the norm, and the potential for cost savings is tremendous.