I enjoyed reading Nicholas Carr’s “The Big Switch” when it came out a few years ago. It compared cloud services to water and electricity, and set me on course to write about cloud computing and its potential on a regular basis.
Now, as I consult with three different software development teams to develop and deploy their ideas via cloud computing, I have to ask, “When can I simply turn the spigot on and off?”
With public-cloud options, we still face buying instances, that is, fixed chunks of compute power. As usage approaches the limit of a given instance size, we have to leap to the next tier, with a proportional jump in price.
One blinding revelation I got from putting on my Captain Obvious hat was that buying compute power is more like construction than manufacturing; there are precious few economies of scale. So on one project, as we plan for growth from 100 users to 1,000 and then 100,000, our cost per customer barely drops.
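The stair-step pricing behind that observation can be sketched in a few lines. The tier capacities and prices below are illustrative assumptions, not any provider's real rate card; the point is only that when capacity and cost scale in lockstep, cost per customer stays flat:

```python
# Hypothetical instance tiers: (max users served, monthly cost in dollars).
# These figures are invented for illustration, not real cloud pricing.
TIERS = [(100, 80), (1_000, 800), (100_000, 80_000)]

def monthly_cost(users: int) -> int:
    """Return the cost of the smallest tier that can serve `users`."""
    for capacity, cost in TIERS:
        if users <= capacity:
            return cost
    raise ValueError("demand exceeds largest tier")

for users in (100, 1_000, 100_000):
    cost = monthly_cost(users)
    print(f"{users:>7} users -> ${cost:>6}/mo = ${cost / users:.2f} per user")
```

Run it and the per-user column never moves: a thousand-fold jump in customers buys no discount, which is exactly the construction-not-manufacturing problem.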
With private-cloud options, we face the normal IT challenge of capital expenditure. We don’t have an existing datacenter, so we’d be buying iron and software licenses the same as in the past. Sure, we should get higher usage from our resources, but we’d also need full-time people to keep our virtualization, stack, and UI running in joyful harmony around the clock.
One of our projects involves a presentation-driven business whose facilities are open only a few hours per month. Why even put in a broadband connection at almost $1,000 a year if we can provide almost the same experience to our customers with a laptop, monitor, and DVD, and then provision cloud capacity on an irregular basis as needed?
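The back-of-the-envelope math makes the case quickly. The $1,000-a-year broadband figure comes from the scenario above; the one-time hardware cost and per-session cloud cost are assumptions I’ve invented for the sketch:

```python
# Annual cost comparison for a site open only a few hours per month.
# BROADBAND_PER_YEAR comes from the scenario; the rest are assumed figures.
BROADBAND_PER_YEAR = 1_000   # always-on connection, dollars per year
OFFLINE_SETUP = 600          # assumed one-time laptop + monitor + DVD outlay
CLOUD_PER_SESSION = 5        # assumed cost to spin up cloud capacity once
SESSIONS_PER_YEAR = 24       # "open only a few hours per month"

offline_yearly = CLOUD_PER_SESSION * SESSIONS_PER_YEAR
years_to_break_even = OFFLINE_SETUP / (BROADBAND_PER_YEAR - offline_yearly)
print(f"offline + occasional cloud: ${offline_yearly}/yr after setup")
print(f"hardware pays for itself in {years_to_break_even:.1f} years")
```

With these assumed numbers the hardware pays for itself in well under a year, and after that the occasional-cloud approach runs at a small fraction of the broadband bill. Swap in real quotes and the structure of the comparison stays the same.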
When can we simply turn the spigot on and off, and get a good “laminar flow,” as the engineers say?