No one argues that networks haven't exploded in terms of speeds and feeds in the past decade. What with more consumers (and cows), more companies going “online”, and more content, it'd be hard to argue that there's less traffic out there today than there was even a mere four or five years ago. The increasing pressure on the network is often mentioned almost in passing, as though merely moving from 10Gbps to 40Gbps to 100Gbps will solve the problem. Move along now, nothing to see here but a higher flow of packets.
But it's that higher density of packets, along with greater diversity of content and distribution through cloud computing, that's creating other issues for network services whose purpose is to collect, analyze, and act upon those packets.
Cloud Security – Implementing a Secure Cloud Backup Case Study
Secure cloud backup is a scenario that is increasingly gaining traction. It allows organizations to implement off-site backup while keeping costs to a minimum. In this blog post I would like to focus on a specific use case of secure cloud backup. The system we describe comprises an on-premise replication server, Porticor Cloud Security, and Amazon S3 as the final backup destination, all integrated by one of our fine cloud integrators.
In this use case, an enterprise organization was struggling with an inefficient and costly offsite backup infrastructure that was meant to manage a steadily expanding database. An offsite server farm was costly to operate and maintain, and the tape backup and recovery methods used were time-consuming. Furthermore, the company failed to meet regulatory requirements with regard to data availability.
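The Porticor integration itself isn't detailed above, but the general pattern, encrypting the backup on-premise so that only ciphertext ever reaches S3, is easy to sketch. Below is a minimal illustration in Python; the bucket and file names are hypothetical, and Fernet (symmetric AES) stands in for Porticor's split-key encryption layer:

```python
# Minimal sketch of client-side-encrypted backup to S3.
# Assumptions: "example-offsite-backups" is a hypothetical bucket,
# and Fernet stands in for the actual encryption service.
import boto3
from cryptography.fernet import Fernet

BACKUP_BUCKET = "example-offsite-backups"

def backup_to_s3(local_path: str, key: bytes) -> None:
    """Encrypt local_path on-premise, then upload only the ciphertext."""
    with open(local_path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    boto3.client("s3").put_object(
        Bucket=BACKUP_BUCKET,
        Key=local_path + ".enc",
        Body=ciphertext,
    )

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, held by the key-management service
    backup_to_s3("db_dump.sql", key)
```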
Enabling Security and Compliance in Big Data
An interesting subplot of this burgeoning “capture everything” big data culture is whether a single, byte-sized piece of information really matters anymore. Big data, after all, is really about big-picture thinking. At a high level, it's about how we assemble, on a massive scale, unrelated bits of information to better inform our worldview.
Five tips for CIOs moving to the cloud
CIOs making strategic moves to the cloud that involve core infrastructure such as productivity tools, enterprise applications or collaboration capabilities need to be prepared for a new way of thinking and operating.
Moving to the cloud isn’t – and shouldn’t be – business as usual. It’s a switch that demands fresh attitudes to procurement, accounting, project management and, more than anything, ways of working.
This is the biggest shift in computing architecture since client/server, and inevitably there will be surprises along the way, but best practices and case studies are emerging. Based on over a decade of working with companies moving to the cloud, these can usefully be distilled into the following:
- Communicate. Any change in IT can lead to confusion. You need a strong business case for the Board to get buy-in at the highest levels, and this support will help mute any broader …
Dropbox, The Woz, & Public Cloud
Public cloud computing is in the news in a bad way, with Dropbox’s latest security breach and The Woz’s new fears about giving up local control of one’s data.
The Woz seemed to be thinking out loud more than making a prediction; he's allowed to say whatever he wants in any case. As a Founding Father of the personal-computing revolution, he's no doubt horrified at the idea of handing over one's personal stuff to some faceless corporation that promises to store said stuff somewhere, somehow. What if they lose it?
I don’t know whether Woz is also worried about government snooping of all this stuff. I know I certainly am, and I am thoroughly disheartened to see the Obama Administration continue to act as if 1984 has finally, and truly, arrived in this great nation of ours.
Meanwhile, Dropbox seems to be the latest public-cloud company to be victimized by its customers. People with bad intentions allegedly stole passwords from somewhere else in cyberspace and found that some of them also unlocked Dropbox accounts. The company has promised tighter security, including two-step authentication.
Really, it’s like leaving your laptop in your unlocked car or apartment. But Dropbox has hardly been pro-active in heading these problems off at the pass. eCommerce sites with similarly flimsy security at least have the good stuff – credit card info – encrypted or outsourced and doubly encrypted.
With Dropbox, the good stuff is rather larger in size than credit-card info, and my clear 20/20 hindsight says the company should have been more serious about securing it, dumb customers or no.
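Two-step verification of the kind Dropbox promised typically pairs a password with a time-based one-time code (TOTP, RFC 6238). As a rough sketch of how such a code is derived, using only Python's standard library (the shared secret below is purely illustrative):

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # illustrative secret, not a real one
```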
I don’t think these headlines are going to impede public cloud adoption by large enterprises. Any enterprise IT department that wishes, or has been directed to wish, for public cloud in its strategy will have a detailed checklist to head off amateur-hour mistakes.
I also don’t think these headlines will impede the growth of Apple’s iCloud. Myrmidons will continue to follow their departed leader’s vision for several more years, it seems.
I do think these headlines represent a grave threat to all who sell public-cloud ideas to small and medium-sized businesses. Public cloud security lapses look like the inevitable result of hare-brained schemes to these folks. Since a super-majority of them run their companies on Windows, it seems that Microsoft has a large opportunity here to jump in more aggressively with cloud, integrate ironclad security with Windows 8, and save the day.
The hard reality that our modern era of viruses and malware was largely spawned by Microsoft’s sloppiness in integrating Internet Explorer as an ostensibly key part of Windows gives me little faith this will happen.
Cloud Computing and Big Data Strategy
In this CEO Power Panel at the 10th International Cloud Expo, moderated by Cloud Expo Conference Chair Jeremy Geelan, David Canellos, President and CEO of PerspecSys; Lawrence Guillory, CEO of Racemi; John Keagy, Founder, Chairman and CEO of GoGrid; Treb Ryan, Co-Founder & CEO of OpSource; Joe Weinman, Sr. VP of Cloud Services & Strategy at Telx; Jeff Newlin, Vice President and General Manager of OutSystems North America; and Darryl Brown, CMO at Appcore discussed such topics as: Is it just wishful thinking to depict the Cloud as more than just a technology solution? If not, then what concrete examples best demonstrate cloud computing as an engine of business value?
The Age of Big Data: How to Gain Competitive Advantage
We have entered the “Age of Big Data” according to a recent New York Times article. This comes as no surprise to most organizations already struggling with the onslaught of data coming from an increasing number of sources and at an increasing rate. The 2011 IDC Digital Universe Study reported that data is growing faster than Moore’s Law. This trend points to a paradigm shift in how organizations process data where isolated islands and silos are being replaced by large clusters of commodity servers that keep data and compute resources together.
Another way of looking at this paradigm shift is that the growing volume and velocity of data require a new approach to networked computing. A good example of this change is found at Google. The industry now takes Google's dominance for granted, but when Google launched its beta search engine in 1998, the company was late entering the market. At the time, Yahoo! was dominant; other contenders included Infoseek, Excite, Lycos, Ask Jeeves, and AltaVista (which dominated technical searches). Within two years, Google was the dominant search provider. It wasn't until 2004, when Google published its paper on MapReduce, that the world got a glimpse into Google's back-end architecture.
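The MapReduce idea itself is simple enough to sketch: a map step emits key/value pairs, a shuffle groups them by key, and a reduce step folds each group into a result. Here is a toy, single-process word count in Python; real implementations distribute these same phases across a cluster of commodity servers:

```python
from collections import defaultdict

def map_phase(document: str):
    """Map: emit a (word, 1) pair for every word in a document."""
    for word in document.split():
        yield word.lower(), 1

def reduce_phase(word, counts):
    """Reduce: fold all counts for one word into a total."""
    return word, sum(counts)

documents = ["the quick brown fox", "the lazy dog", "the fox"]

# Shuffle: group every emitted value by its key.
groups = defaultdict(list)
for doc in documents:
    for word, count in map_phase(doc):
        groups[word].append(count)

print(dict(reduce_phase(w, c) for w, c in groups.items()))
# {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, 'lazy': 1, 'dog': 1}
```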
Data Centers: Where Big Data Will Be Exploited
On any given day, the world now generates some 2.5 quintillion bytes of data, pushing the amount of data that must be processed and managed to unimaginable levels. Because of the power and low-latency connectivity requirements that such data growth entails, many companies have become more inclined to outsource their big data needs to colocation data center facilities. In turn, this has created huge demand for colocation space as additional processing grounds for big data. According to analyst firm Nemertes, colocation providers will not have enough available space to capitalize on approximately $869 million of market demand by 2015. The demand is well founded: colocation data centers offer huge benefits for big data, including high-density power, opportunities to decrease latency, and a community of like-minded companies with which to cross-connect.
Public Cloud Reportedly Coming from VMware
VMware is going to go up against Amazon, Microsoft, Google, and presumably Rackspace with its own public cloud Infrastructure-as-a-Service, developed as Project Zephyr, according to CRN, which thinks it could create some channel conflict for the vCloud service providers that will have to compete with VMware.
The story is unconfirmed, but it quotes “sources with knowledge of the matter” saying that VMware has been quietly beta testing Zephyr for the last few months on Cisco UCS servers and EMC Avamar storage in a big data center space in Nevada.
Apparently VMware is afraid that if it waits much longer it'll be closed out of the market. Its market share will depend a lot on how aggressive its pricing is.
Leading a Horse to Water, Driving Out Uncertainty in IT Cloud Projects
“What Cloud solution is right for us?”
“What functionality will be available in this solution?”
“When will I get my training?”
Each of these questions reflects a person grappling with uncertainty at a different level of the organization. From the initial consideration of a change in IT strategy through design, implementation, and go-live, the project team is constantly working through uncertainty loops as uncertainty cascades down the organization. The senior decision maker starts uncertain and with zero commitment until committing to a strategy; then the IT manager faces the uncertainty of how to implement it, instinctively gathering information to fill in the blanks before making commitments to specific design components. As the final design gets closer to testing and rollout, end users have their own concerns and questions, and they will only be fully committed once they've had training and cut over to the new solution.
In each case, the person is expected to make a commitment but won't feel comfortable choosing until much of the uncertainty has been driven out, reducing the risk of the decision. Understanding this, and actively helping to drive out uncertainty to encourage commitment, can make the difference between analysis paralysis and steady progress toward the goal. Providing answers that reduce uncertainty is how you “lead the horse to water” when you want people to decide.
At GreenPages, we've done numerous assessments to create recommendations for companies on which cloud solutions are right for them. It's a tough decision for the CIO, and a big leap of faith for the company, especially when you're putting your IT organization's success in the hands of an external Cloud Provider. We research the providers, check references, and compare them to industry benchmarks, but it is still a tough decision.
When we meet with IT Managers in the process of implementing Cloud solutions, they grapple with how to fit standardized cloud services into their organizations. One company had a complex Active Directory environment, and although a particular tool claimed to integrate with AD, it had very limited functionality, allowing only a single OU to be selected; that customer is still working through how to get the tool to fit their organization. IT Managers know these things can happen, and they are skeptical until they see a solution first hand and experience it for themselves.
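The AD gap described above usually comes down to the search base a tool lets you configure. As a rough illustration (the server, credentials, and OU names here are hypothetical), this is the difference between querying one OU and a whole directory with the ldap3 library:

```python
from ldap3 import Server, Connection, SUBTREE

# Hypothetical server and service account, for illustration only.
conn = Connection(Server("ldaps://dc.example.com"),
                  user="EXAMPLE\\svc_cloud", password="********",
                  auto_bind=True)

# A tool limited to "a single OU selection" effectively hard-codes
# one search base:
conn.search("OU=Sales,DC=example,DC=com", "(objectClass=user)",
            search_scope=SUBTREE, attributes=["sAMAccountName"])
sales_users = [e.sAMAccountName.value for e in conn.entries]

# Covering a complex, multi-OU directory means searching from the
# domain root (or iterating over several bases) instead:
conn.search("DC=example,DC=com", "(objectClass=user)",
            search_scope=SUBTREE, attributes=["sAMAccountName"])
all_users = [e.sAMAccountName.value for e in conn.entries]
```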
When users hear about the changes coming, they have their own questions. This is the time when people wonder if their cheese is about to move. People want to know how their job will change with the new tools or strategy. These concerns can pop up unexpectedly if not addressed.
Knowing that people are extra sensitive to uncertainty, the resourceful IT professional can get out in front of people's most anxiety-producing concerns and help to drive out uncertainty:
- Including people affected downstream is a good way to get their input as well as lay the groundwork for commitment.
- Most people need an initial exposure to understand something new, time to contemplate the impact on them, and some forum to voice their concerns in order to really feel ownership and buy-in (i.e., commitment).
- Any information that can be provided to help people understand as early as possible can defuse possible frustration later.
- It’s important to respond clearly when people express their urgency to resolve uncertainty. They want to be heard, and frustration will continue to grow if not addressed.
- The goal is to help people to be comfortable in the project timeline, understanding the designs early, and seeing the actual output as implementation gets closer.
It’s risky to proceed to the next phase without fully addressing uncertainty. There will always be some uncertainty, but recklessly discounting someone’s concerns or putting off understanding the concerns will increase the risk of having the concern blow up unexpectedly at some point. The blowup will create rework as the foundation is questioned and the design is revisited.
You can lead a horse to water but you can’t make him drink. For Cloud projects, helping team members resolve their uncertainty leads them to water and makes them ready and able to take the drink.