VDI and Cloud Infrastructure – Together at Last

This year, virtual desktop and cloud storage initiatives sit at the top of many IT organizations' wish lists. What is less obvious is how tightly intertwined these two initiatives have become as users embrace the Bring Your Own Device (BYOD) movement. The BYOD model forces IT groups to provide both secure user applications (VDI) and secure user data (cloud storage).
Although cloud storage services have had success in the enterprise, users still have concerns about the security of their information and about control over data in the cloud. Because of this, VDI will not fade away as cloud computing expands, as many have predicted, but will instead be used to complement the cloud. Experts predict VDI adoption will spike, with Gartner estimating that 60 percent of enterprises will deploy some form of VDI by the end of this year.


Smart Grid as a Service at Cloud Expo New York

The complexity of Smart Grid and Advanced Metering Infrastructure (AMI) implementations presents a challenge for most of the nation's electric utilities. From the availability of skilled resources to the operational complexities of installing and maintaining these systems, utilities have often struggled to achieve success. SAIC's cloud-based Smart Grid as a Service (SGS) solution was developed to address these complexities and ease the implementation of Smart Grid technology.
In his session at the 10th International Cloud Expo, Tim Crowell, Assistant Vice President and Chief Architect for SAIC’s Smart Grid division, will present SAIC’s SGS solution with specific focus given to how SAIC addressed the challenges of utility integration into their cloud-based infrastructure.


Who’s Responsible for Protecting Data Stored in the Cloud?

With cloud comes the notion of liberation. Cloud is the natural evolution of the data center: easy to deploy, infinitely scalable, and highly redundant. It is the shiny new component inside the storage controller, making it possible for an old dog to learn some very impressive new tricks. But with the cloud comes responsibility.
An article recently published at BusinessWeek explains how many businesses now operate under the assumption that once their data is sent offsite, they need not be concerned with protecting it. In a perfect world, that is how it would work; one of the main selling points of outsourcing infrastructure is that there is one less thing for IT to worry about. Before any business can trust a third party to protect its invaluable corporate IP, however, some due diligence must be conducted.


SoftLayer and RightScale Partner

SoftLayer Technologies and RightScale have partnered to provide unmatched scalability and automation solutions that allow Internet-centric companies to speed their time-to-market. Companies such as social gaming developer Broken Bulb Game Studios are able to use SoftLayer’s public and private cloud infrastructure with RightScale cloud management to easily deploy, automate and manage their computing workloads across the globe.
“Customers, such as Broken Bulb, are now experiencing the advantages of working with two leading cloud solutions providers,” said Duke Skarda, Chief Technology Officer of SoftLayer. The partnership between RightScale and SoftLayer offers a solution ideal for web-savvy companies that need to scale IT resources easily to meet the toughest workloads or steepest Internet traffic demands. It enables users to roll out new web-based services, applications, and games rapidly, on a flexible, consumption-based monthly billing cycle.


Adobe, Looking to Stay Relevant, Goes Cloud

Adobe launched Creative Suite 6 Monday, the latest version of its flagship software kit for designers and web developers, and made it subscription-based, part of the company’s Creative Cloud.
Pricing starts at $50 a month with a one-year commitment, or $75 a month with no contract. Existing users may qualify for a $30-a-month promotion for the first year, and a version for business teams, priced at $70 a month, is planned but not yet available.
Users can download Photoshop, Illustrator, Dreamweaver, After Effects, InDesign, and any of the other programs in the bundle (each also priced separately) to a PC or Mac, share files, and store work online in a 20GB locker.


The Taxonomy of IT Part 5 – Genus and Species

As the last (do I hear applause?) installment in this five-part series on the Taxonomy of IT, we have a bit of cleanup to do.  There are two remaining “levels” of classification (Genus and Species), and we also need to distill this whole extravaganza into a meaningful summary.

Genus classifications allow us to subdivide the Family of IT into subcultures based on what they have in common, while the Species definition lets us highlight sometimes subtle differences, such as color, range, or specific habits.  To round out our Taxonomy, then, Genus will refer to how IT is physically architected, while Species will expose what that architecture may hide.

The physical architecture of an IT environment used to fall into only a couple of categories.  Most organizations built their platforms to address immediate needs, distributing systems based on the location of their primary users: an inventory database would be housed at the warehouse facility, while financials would sit on systems at corporate.  This required building and maintaining complex links, both at the physical transport layer and at the data level.  Because of the limits of access technology, people traveled to where the “data” was kept.

Twenty years ago, we began moving the data to where the users could consume it, and a new Genus evolved to enable that shift.  It is vastly more efficient to ship a 5MB spreadsheet halfway across the country than to ship a 170-lb accountant.  In this architecture, the enablers were increases in available bandwidth, more efficient protocols, and better end-node processing power.

As we move forward in time, we continue to push the efficiency envelope.  Now we don’t even have to move that spreadsheet; we just have to move an image of it.  And we don’t care where it needs to go, or even what route it takes to get there.  We are all about lowest-cost routing of information from storage to consumption and back.

So, Genus is a way for us to gauge how far down that arc of advancement our customers have traveled.  Think in terms of a timeline of alignment with industry trends and capabilities.

Species, on the other hand, can be used to uncover the “gaps” between where in the timeline the customer is and what they have missed (intentionally or not) in terms of best practices.  Did they advance their security in line with their technology?  Have they established usage policies?  Can their storage sustain the transition?  What have they sacrificed to get where they are today, and what lies beneath the surface?

Using Genus and Species classifications, we can round out the taxonomy of any particular IT environment.  The combination of factors from each of the seven layers completes a picture that will allow us to guide our customers through the turbulent waters of today’s IT world.

To recap the seven layers:

Kingdom: How IT is viewed by the business

Phylum: IT’s general operating philosophy

Class: How IT is managed on a daily basis

Order: How IT is consumed, and why

Family: The structure of data flow within IT systems

Genus: How IT is physically architected

Species: What that architecture may hide
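
For readers who think in code, the seven layers above can be sketched as a simple data structure. This is a toy Python model, not anything from the series itself, and every field value in the example environment below is invented purely for illustration:

```python
from dataclasses import dataclass, fields

@dataclass
class ITTaxonomy:
    """One classification per layer, from Kingdom down to Species."""
    kingdom: str   # how IT is viewed by the business
    phylum: str    # IT's general operating philosophy
    class_: str    # how IT is managed daily ('class' is reserved in Python)
    order: str     # how IT is consumed, and why
    family: str    # the structure of data flow within IT systems
    genus: str     # how IT is physically architected
    species: str   # what that architecture may hide

def profile(env: ITTaxonomy) -> dict:
    # Collapse the seven layers into the "single picture" the series describes.
    return {f.name: getattr(env, f.name) for f in fields(env)}

# Hypothetical customer environment; all values are made up for illustration.
acme = ITTaxonomy(
    kingdom="cost center",
    phylum="reactive",
    class_="outsourced",
    order="self-service",
    family="hub-and-spoke",
    genus="centralized, remote access",
    species="security lagging the architecture",
)
```

A consultant could then compare `profile()` output across customers to spot the gaps (Species) relative to where each sits on the timeline (Genus).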

It would be quite the undertaking to build out individual groupings in each of these categories.  That is not what is important (although I did enjoy creating the pseudo-Latin neologisms in earlier parts of the series).  What is key is that we consider all of these categories when creating an overall approach for our customers.  It’s not all about the physical architecture, nor all about management.  It’s about how the collection of characteristics that flow from the top level all the way down to the bottom converge into a single picture.

In my opinion, it is a fatal mistake to apply technology and solutions indiscriminately across any of these levels.  Assuming that because a customer fits into a specific category they “have” to leverage a specific technology or solution is to blind yourself (and ultimately your customer) to what may be more appropriate for their specific situation.

Each environment is as unique as our own strands of DNA, and even environments that share a Species classification will branch onto different future paths.  Perhaps there should be an eighth level, one that trumps all above it.  It could be called “Individuality.”

Five Principles for Cloud Computing Success

Cloud computing is really coming into its own. After several years of predictions, we have finally seen adoption of some cloud technologies at the consumer level. Cloud storage, for example, has taken the marketplace by storm as companies like Dropbox get into the public cloud provider business.

Meanwhile, companies continue to explore just how cloud computing solutions can meet their needs. Just because something is billed as a cloud solution, however, doesn’t mean it’s a good idea.

Here are five principles your company needs to follow if it’s going to have successful cloud computing deployments:

1.

Top Ten Reasons to Sponsor and Exhibit at Cloud Expo New York

In its recent “Sizing the Cloud” report, Forrester Research said it expects the global cloud computing market to reach $240 billion by 2020, up from $40 billion in 2010. Companies worldwide are keen to purchase solutions enabling enterprise-class cloud computing control, performance, availability, and scalability. But these companies need to see demos and speak with cloud vendors and providers face-to-face.
The $5 billion Big Data market is on the verge of a rapid growth spurt that will see it top the $50 billion mark worldwide within the next five years. The US Dept. of Defense, for example, is placing a big bet on big data, investing over $200M annually.
