Who’s Responsible for Protecting Data Stored in the Cloud?

With the cloud comes a notion of liberation. The cloud is the natural evolution of the data center: easy to deploy, enormously scalable, and highly redundant. It is the shiny new component inside the storage controller, making it possible for an old dog to learn some very impressive new tricks. But with the cloud also comes responsibility.
An article recently appeared over at BusinessWeek explaining how many businesses now operate under the assumption that once their data is sent offsite, they need not be concerned with protecting it. In a perfect world, this is how it would work. One of the main selling points of outsourcing infrastructure is that IT has one less thing to worry about. However, before any business can trust a third party to protect its invaluable corporate IP, some due diligence must be conducted.


SoftLayer and RightScale Partner

SoftLayer Technologies and RightScale have partnered to provide unmatched scalability and automation solutions that allow Internet-centric companies to speed their time-to-market. Companies such as social gaming developer Broken Bulb Game Studios are able to use SoftLayer’s public and private cloud infrastructure with RightScale cloud management to easily deploy, automate and manage their computing workloads across the globe.
“Customers, such as Broken Bulb, are now experiencing the advantages of working with two leading cloud solutions providers,” said Duke Skarda, Chief Technology Officer of SoftLayer. The partnership between RightScale and SoftLayer offers a solution that is ideal for web-savvy companies that need to scale IT resources easily to meet the toughest workloads or the steepest Internet traffic demands. It enables users to rapidly roll out new web-based services, applications and games under a flexible, consumption-based monthly billing model.


Adobe, Looking to Stay Relevant, Goes Cloud

Adobe launched Creative Suite 6 Monday, the latest version of its flagship software kit for designers and web developers, and made it available by subscription as part of the company’s Creative Cloud.
Pricing starts at $50 a month with a one-year commitment, or $75 a month with no contract. Existing users may qualify for a $30-a-month promotion for the first year, and a version for business teams, priced at $70 a month, is planned but not yet available.
Users can download Photoshop, Illustrator, Dreamweaver, After Effects, InDesign and any of the other programs in the bundle (each also priced separately) to a PC or Mac, share files, and store work online in a 20GB locker.


The Taxonomy of IT Part 5 – Genus and Species

As the last (do I hear applause?) installment in this five-part series on the Taxonomy of IT, we have a bit of cleanup to do. There are two remaining “levels” of classification (Genus and Species), but there is also a need to distill this whole extravaganza into a meaningful summary.

Genus classifications allow us to subdivide the Family of IT into subcultures based on their commonality with one another, while the Species definition enables us to highlight sometimes subtle differences, such as color, range or specific habits. Therefore, to round out our Taxonomy, Genus will refer to how IT is physically architected, while Species will expose what that architecture may hide.

The physical architecture of an IT environment used to fall into only a couple of categories.  Most organizations built their platforms to address immediate needs, distributing systems based on the location of their primary users.  An inventory database would be housed at the warehouse facility, while financials would sit on systems at corporate.  This required the building and maintenance of complex links, both at the physical transport layer and also at the data level.  Because of the limits of access technology, people traveled to where the “data” was kept.

Twenty years ago, we began the transition to moving data to where the users could consume it, and a new Genus evolved around that idea. It’s vastly more efficient to ship a 5MB spreadsheet halfway across the country than it is to ship a 170lb accountant. In this architecture, the enablers were increases in available bandwidth, more efficient protocols, and better end-node processing power.
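To put a number on that efficiency, here is a quick back-of-the-envelope calculation. The link speeds are illustrative assumptions on my part, not figures from the series:

    # Rough transfer times for a 5 MB spreadsheet over different eras of links.
    # Link speeds below are illustrative assumptions, not figures from the series.
    FILE_SIZE_BITS = 5 * 1_000_000 * 8  # 5 MB expressed in bits

    links_bps = {
        "56 kbps dial-up": 56_000,
        "1.5 Mbps T1": 1_500_000,
        "100 Mbps Ethernet": 100_000_000,
    }

    for name, bps in links_bps.items():
        print(f"{name}: {FILE_SIZE_BITS / bps:,.1f} seconds")

Even over dial-up, the spreadsheet arrives in about twelve minutes; the accountant takes rather longer.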

As we move forward in time, we are continuing to push the efficiency envelope. Now we don’t even have to move that spreadsheet; we just have to move an image of it. And we don’t care where it needs to go, or even what route it takes to get there. We are all about lowest-cost routing of information from storage to consumption and back.

So, Genus is a way for us to gauge how far down that arc of advancement our customers have traveled.  Think in terms of a timeline of alignment with industry trends and capabilities.

Species, on the other hand, can be used to uncover the “gaps” between where in the timeline the customer is and what they have missed (intentionally or not) in terms of best practices.  Did they advance their security in line with their technology?  Have they established usage policies?  Can their storage sustain the transition?  What have they sacrificed to get where they are today, and what lies beneath the surface?

Using Genus and Species classifications, we can round out the taxonomy of any particular IT environment.  The combination of factors from each of the seven layers completes a picture that will allow us to guide our customers through the turbulent waters of today’s IT world.

To recap the seven layers (modeled in the brief sketch after this list):

Kingdom: How IT is viewed by the business

Phylum: IT’s general operating philosophy

Class: How IT is managed on a daily basis

Order: How IT is consumed, and why

Family: The structure of data flow within IT systems

Genus: How IT is physically architected

Species: What that architecture may hide
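
For the programmatically inclined, the seven layers can be captured as a simple record. This is a hypothetical sketch of my own; the class and example values are invented for illustration, since the series deliberately avoids fixed groupings:

    from dataclasses import dataclass

    @dataclass
    class ITTaxonomy:
        """One classification record for a customer's IT environment."""
        kingdom: str   # how IT is viewed by the business
        phylum: str    # IT's general operating philosophy
        class_: str    # how IT is managed daily ("class" is a reserved word)
        order: str     # how IT is consumed, and why
        family: str    # the structure of data flow within IT systems
        genus: str     # how IT is physically architected
        species: str   # what that architecture may hide

    example = ITTaxonomy(
        kingdom="IT viewed as a strategic partner",
        phylum="service-oriented",
        class_="centrally managed",
        order="consumed on demand by business units",
        family="hub-and-spoke data flow",
        genus="data moved to where users consume it",
        species="security policies lag behind the architecture",
    )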

It would be quite the undertaking to build out individual groupings in each of these categories. That is not what is important (although I did enjoy creating the pseudo-Latin neologisms in earlier parts of the series). What is key is that we consider all of these categories when creating an overall approach for our customers. It’s not all about the physical architecture, nor all about management. It’s about how the characteristics that flow from the top level all the way down to the bottom converge into a single picture.

In my opinion, it is a fatal mistake to apply technology and solutions indiscriminately across any of these levels. Assuming that because a customer fits into a specific category they “have” to leverage a specific technology or solution is to blind yourself (and ultimately your customer) to what may be more appropriate to their specific situation.

Each environment is as unique as our own strands of DNA, and as such, even environments that match all the way down to Species will branch onto different future paths. Perhaps there should be an eighth level, one that trumps all above it. It could be called “Individuality.”

Five Principles for Cloud Computing Success

Cloud computing is really coming into its own. After several years of predictions, we’ve finally seen adoption of some cloud technologies at the consumer level. Cloud storage, for example, has taken the marketplace by storm as companies like Dropbox popularize the public cloud model.

Meanwhile, companies continue to explore just how cloud computing solutions can meet their needs. Just because something is billed as a cloud solution, however, doesn’t mean it’s a good idea.

Here are five principles your company needs to follow if it’s going to have successful cloud computing deployments:

1.

Top Ten Reasons to Sponsor and Exhibit at Cloud Expo New York

In its recent “Sizing the Cloud” report, Forrester Research said it expects the global cloud computing market to reach $240 billion in 2020, up from $40 billion in 2010. Companies worldwide are keen to purchase solutions enabling enterprise-class cloud computing control, performance, availability, and scalability. But these companies need to see demos and speak with cloud vendors and providers face-to-face.
The $5 billion Big Data market is on the verge of a rapid growth spurt that will see it top the $50 billion mark worldwide within the next five years. The US Dept. of Defense, for example, is placing a big bet on big data, investing over $200M annually.


Big Data Expo New York Speaker Profile: Anjul Bhambhri – IBM

With Big Data Expo 2012 New York (www.BigDataExpo.net), co-located with the 10th Cloud Expo, now only seven weeks away, what better time to introduce you in greater detail to the distinguished individuals in our incredible Speaker Faculty for the technical and strategy sessions at the conference…

We have technical and strategy sessions for you every day at the combined event from June 11 through June 14 dealing with every nook and cranny of Cloud Computing and Big Data, but what of those who are presenting? Who are they, where do they work, what else have they written and/or said about the Cloud that is transforming the world of Enterprise IT, side by side with the exploding use of enterprise Big Data – processed in the Cloud – to drive value for businesses…?


Big Data – A Sea Change of Capabilities in IT

“Big data represents a sea change of capabilities in IT,” notes Matt McLarty, Vice President, Client Solutions at Layer 7, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. McLarty continued: “In conjunction with mobile and cloud, I think Big Data will provide a technological makeover to the typical enterprise infrastructure, drawing a hard API border in front of core business services while blurring the line between logic and data services.”
Cloud Computing Journal: Agree or disagree? – “While the IT savings aspect is compelling, the strongest benefit of cloud computing is how it enhances business agility.”
Matt McLarty: Agree. We have a number of customers who are able to use Layer 7 Gateways to protect their cloud deployments and leverage the elastic scaling model of the cloud to handle seasonal or sporadic bursts of traffic dynamically. Historically, these companies would have had to try to forecast this demand and risk over-buying infrastructure. So there is a big cost savings, but dynamic scaling is a new capability that only comes with the cloud model.
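To make the contrast with forecast-based buying concrete, here is a minimal sketch of the threshold-based scaling logic McLarty is describing. The capacity and utilization numbers are invented for illustration; a real deployment would drive a cloud provider’s auto-scaling API rather than this toy function:

    import math

    # Sizing constants invented for illustration only.
    TARGET_UTILIZATION = 0.60        # keep each instance ~60% busy
    CAPACITY_PER_INSTANCE = 100      # requests/sec one instance can serve
    MIN_INSTANCES, MAX_INSTANCES = 2, 20

    def desired_instances(requests_per_sec: float) -> int:
        """Size the fleet to the load actually observed, not a forecast."""
        needed = math.ceil(
            requests_per_sec / (CAPACITY_PER_INSTANCE * TARGET_UTILIZATION)
        )
        return max(MIN_INSTANCES, min(MAX_INSTANCES, needed))

    # A seasonal burst: traffic jumps from 300 req/s to 4,000 req/s.
    print(desired_instances(300))    # 5 instances at normal load
    print(desired_instances(4000))   # 20 instances (capped) during the burst

With static provisioning, the fleet would have to be sized for the 4,000 req/s peak year-round; the elastic model pays for that capacity only while the burst lasts.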


Big Data, Private Clouds, and the Enterprise WAN at Cloud Expo New York

By generating massive amounts of new data that in turn require more and more bandwidth, Big Data is stretching an already-congested enterprise WAN to the breaking point. Many companies have made sizable investments in Big Data technologies, and are now looking to emerging cloud technologies to reduce costs and improve performance.
In his session at the 10th International Cloud Expo, Raj Kanaya, CEO & Co-Founder of Infineta Systems, will discuss how, as private cloud build-out intersects with Big Data adoption, the WAN links gluing it all together must deliver enough performance and reliability to make the transformation worthwhile and ensure that the private cloud sustains its cost and performance advantages.
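Some rough arithmetic shows why the WAN becomes the bottleneck. The data volume, link speed, and utilization below are illustrative assumptions, not figures from the session:

    # Time to replicate one day's worth of new data across a WAN link.
    # Volume, link speed, and utilization are illustrative assumptions.
    DAILY_DATA_TB = 2               # new data generated per day
    LINK_MBPS = 622                 # an OC-12-class WAN link
    USABLE_FRACTION = 0.5           # share of the link replication may use

    bits_to_move = DAILY_DATA_TB * 10**12 * 8
    seconds = bits_to_move / (LINK_MBPS * 10**6 * USABLE_FRACTION)
    print(f"{seconds / 3600:.1f} hours to move {DAILY_DATA_TB} TB")
    # ~14.3 hours -- most of the day spent moving one day's data

Under these assumptions, replication barely keeps pace with data growth, which is exactly the gap that WAN optimization and smarter data placement are meant to close.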
