In a recent IDG News survey, high-ranking IT executives in North America and Europe were asked about the effect that NSA snooping practices have had on their cloud computing strategies. Despite the furor over the NSA, these leaders remain committed to the cloud.
Linthicum talks dollars and cents: the efficiency and agility benefits the cloud provides to the enterprise far outweigh any concern that the NSA might be tapping into its communications. This echoes what we hear every day from our customers, but with a little more nuance, going beyond quantifiable business benefits.
Assessing Cloud Application Agility
Agility means being able to change direction rapidly in response to a change in market conditions. As business increasingly moves to online and mobile interfaces, the agility of enterprise applications becomes a critical success factor. With the advent of the cloud, making applications agile means moving them away from the on-premises data center. Cloud-based applications are not automatically agile, however: the “lift and shift” approach to porting applications from on-premises data centers to cloud infrastructure can actually impede agility.
Making cloud-based applications agile requires a commitment to platforms, tools, processes, and people. Enterprise applications were created to serve business needs, so business managers tend to define agility in the following terms: how quickly one can get feedback from the end customer or customer-facing systems; how quickly one can decipher that feedback; and how quickly one can decide what to do about it and implement the plan. To make this work, IT managers are moving applications to the cloud.
ZapThink’s 2013 Retrospective and Predictions for 2014
At last, the moment has arrived: the thirteenth and final installment of ZapThink’s annual retrospective and predictions written by one of ZapThink’s original members. ZapThink will live on as a division of Dovel Technologies, and I will continue my thought leadership at an as yet unselected opportunity, so in a year’s time, you may see the fourteenth installment of this annual tradition. But for today, let’s score ourselves one last time on last year’s predictions and make three new ones for 2014.
10 Storage Predictions for 2014
By Randy Weis, Consulting Architect, LogicsOne
As we wrap up 2013 and head into the new year, here are my 10 predictions for the storage market in 2014.
- DRaaS will be the hottest sector of cloud-based services: Deconstructing the cloud means breaking out specific services that fit the definition of a cloud-type service, such as Disaster Recovery as a Service (DRaaS), along with other specialized, targeted uses of shared multi-tenant computing and storage. Capital expenditures, time to market, and staff training are all issues that prevent companies from developing a disaster recovery strategy and actually implementing it. I predict that DRaaS will be the hottest sector of cloud-based services for small-to-medium businesses and commercial companies, and that it will affect secondary storage purchases.
- Integration of flash storage technology will explode: The market for flash storage is maturing and consolidating. EMC has finally entered the market, and Cisco has purchased Whiptail to integrate flash into its Unified Computing System (UCS) line. PCIe flash cards, server flash drives at different tiers of performance and endurance, hybrid flash arrays, and all-flash arrays will all continue to drive the adoption of solid-state storage in mainstream computing.
- Storage virtualization – software-defined storage on the rise: VMware will make its Virtual SAN (VSAN) technology generally available at the beginning of Q2 2014. This promises to create a brand-new tier of datacenter storage for virtual desktop solutions, disaster recovery, and other specific use cases. EMC has shipped the first release of its software-defined storage product, ViPR. It has a way to go before it really addresses software-defined storage requirements, but it is a huge play in the sense that it validates a segment of the market that has long had a minuscule share. DataCore has been the only major player in this space for 15 years, and it sees EMC’s announcement as a validation of its approach: decoupling storage management software from commodity hard drives and proprietary array controllers.
- Network Attached Storage (NAS) Revolution: We’re undergoing a revolution with the introduction and integration of scale-out NAS technologies. One of the most notable examples is Isilon, purchased by EMC and now appearing as a more fully integrated and broadly available solution with a wide variety of applications. Meanwhile, NetApp continues to innovate in the traditional scale-up NAS market with increasing adoption of ONTAP 8.x. New NAS systems feature support for SMB 3.0, Microsoft’s significant overhaul of its Windows file-sharing protocol (also known as CIFS). This has a significant impact on the design of Hyper-V storage and Windows file sharing in general. Client- and server-side failover are now possible with SMB 3.0, enabling the kind of high availability and resiliency for Hyper-V that VMware has long enjoyed as a competitive advantage.
- Mobile Cloud Storage – File Sharing Will Never Be the Same: Dropbox, Box, Google Drive, Huddle, and other smartphone-friendly ways to access data anywhere are revolutionizing how individual consumers reach their data. This creates security headaches for IT admins, but the vendors are responding with ever-better security built into their products. At the enterprise level, Syncplicity, Panzura, Citrix ShareFile, Nasuni, and other cloud and shared storage technologies provide deep integration with Active Directory and enable the transfer of large files across long distances quickly and securely. These technologies integrate with on-premises NAS systems and cloud storage. Plain and simple, file sharing will never be the same again.
- Hyper-Converged Infrastructure Will Be a Significant Trend: The growing momentum of Nutanix, SimpliVity (based in Westborough, MA), and VMware’s VSAN technology will change the way shared storage is viewed in datacenters of every size. These products will not replace shared storage arrays; instead, they provide an integrated, flexible, and modular way to scale virtualized application deployments such as VDI and virtual servers. These technologies all integrate compute, storage, networking (at different levels), and even data protection, eliminating multiple expenditures and multiple points of management. Most importantly, hyper-converged infrastructure will allow new deployments to begin small and then scale out without large up-front purchases. This will not work for every tier of application or every company, but it will be a significant trend in 2014.
- Big Data Will Spread Throughout Industries: Big data has become as much a buzzword as cloud, but actual use of the technologies we call big data is growing rapidly. Adoption is happening not only at internet giants like Google and companies that track online behavior, but also in industries such as insurance, life sciences, and retail. Integrating big data technologies (e.g., Hadoop and MapReduce) with more traditional SQL database technology lets service providers of any type extract data from traditional databases and process it at huge scale, more efficiently and more quickly, while still retaining the advantages of structured databases (see the map/reduce sketch after this list). This trend will continue to spread through the many industries that need to manage large amounts of structured and unstructured data.
- Object based storage will grow: Cloud storage will be big news in 2014 for two major reasons. The first stems from the shock waves of Nirvanix going out of business: corporate consumers of cloud storage will be much more cautious and will demand better SLAs in order to hold cloud storage providers accountable. The second has to do with the adoption of giant, geographically dispersed data sets. Object-based storage has been a little-known but important development in storage technology that allows data sets on the scale of petabytes to be stored and retrieved both by the people who generate the data and by those who consume it. These monstrous data sets cannot be protected by traditional RAID technologies, however. Providers such as Cleversafe have developed a means to spread data across multiple locations, preserving its integrity and improving resiliency while continuing to scale to massive capacities (see the dispersal sketch after this list).
- More Data Growth: This may seem redundant, but business data is predicted to double every two years, a rate that multiplies data volumes roughly 32-fold over a decade. While this may seem like great news for traditional storage vendors, it is even better news for those who provide data storage on a massive scale, and for the technology firms that enable mobile access to that data anywhere while integrating well with existing storage systems. This exponential growth will drive advances in file system technologies, object storage integration, deduplication, high-capacity drives, and storage resource/lifecycle management tools.
- Backup and Data Protection Evolution + Tape Will Not Die: The data protection market continues to change rapidly as more servers and applications are virtualized or converted to SaaS. Innovations in backup technology include the rapid rise of Veeam as a dominant backup and replication technology, not only for businesses but also for service providers. The Backup as a Service market seems to have stalled because feature sets are limited; however, the appliance model for backups and backup services continues to show high demand. The traditional market leaders face very strong competition from new players and from longtime competitor CommVault, which has evolved into a true storage resource management play and is rapidly gaining market share as an enterprise solution. Data deduplication has evolved from appliances such as Data Domain into a software feature set included in almost every backup product on the market: CommVault, Veeam, Backup Exec, and others all offer server-side deduplication, client-side deduplication, or both (see the deduplication sketch after this list). The appliance model for disk-based backups remains popular, with Data Domain, ExaGrid, and Avamar as leading examples; EMC dominates this market, and the competition is still trying to capture share. Symantec has even entered the game with its own backup appliances, which are essentially servers preconfigured with its popular software and internal storage. And tape will not die: long-term, high-capacity archives still require tape, primarily for economic reasons. The current generation of tape technology, such as LTO-6, holds up to 2.5 TB of uncompressed data (6.25 TB compressed) on a single tape, and tape drives now routinely ship with built-in encryption to avoid the data breaches that were more common in the past with unencrypted tapes.
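To make the big data prediction concrete, here is a minimal, illustrative sketch of the map/reduce pattern applied to rows extracted from a traditional SQL database. It uses Python's standard sqlite3 module as a stand-in for an enterprise database; the sales table and its columns are hypothetical, and a real deployment would run the same phases distributed across a Hadoop cluster rather than in a single process.

```python
import sqlite3
from collections import defaultdict

# Stand-in for a traditional relational source; table and columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100.0), ("west", 250.0), ("east", 75.0)])

# Extract: pull rows out of the structured store.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()

# Map: emit one (key, value) pair per input record.
mapped = [(region, amount) for region, amount in rows]

# Shuffle: group values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each group independently -- the step a framework
# like Hadoop parallelizes across many nodes.
totals = {key: sum(values) for key, values in groups.items()}
print(totals)  # {'east': 175.0, 'west': 250.0}
```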
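The object storage item mentions dispersing data across locations instead of relying on traditional RAID. As a simplified illustration of the idea, the sketch below splits a blob into shards plus a single XOR parity shard, so any one lost shard can be rebuilt from the survivors. This is only a toy: production systems such as Cleversafe use general erasure codes (e.g., Reed-Solomon) that tolerate multiple simultaneous failures.

```python
from functools import reduce

def disperse(data: bytes, k: int):
    """Split data into k equal shards plus one XOR parity shard."""
    if len(data) % k:
        data += b"\x00" * (k - len(data) % k)  # pad to a multiple of k
    size = len(data) // k
    shards = [data[i * size:(i + 1) * size] for i in range(k)]
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shards))
    return shards + [parity]

def rebuild(shards, missing: int):
    """Reconstruct the shard at index `missing` by XORing all survivors."""
    survivors = [s for i, s in enumerate(shards) if i != missing]
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))

pieces = disperse(b"petabyte-scale object data", 4)  # 4 sites + 1 parity site
lost = 2
assert rebuild(pieces, lost) == pieces[lost]  # any single site can fail safely
```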
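Finally, the deduplication feature set mentioned in the backup item rests on a simple idea: hash each chunk of data and store any given chunk only once. Here is a minimal sketch of client-side dedup, assuming fixed-size chunks and an in-memory index; real products typically use variable-size chunking and persistent indexes.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking; real products vary chunk boundaries
store = {}         # chunk hash -> chunk bytes (the "backup target")

def backup(data: bytes):
    """Return the recipe (ordered list of chunk hashes) needed to restore."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # client-side dedup: send only new chunks
            store[digest] = chunk
        recipe.append(digest)
    return recipe

def restore(recipe):
    return b"".join(store[d] for d in recipe)

first = backup(b"A" * 10000)   # two identical full chunks collapse to one
second = backup(b"A" * 10000)  # stores nothing new at all
assert restore(second) == b"A" * 10000 and len(store) == 2
```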
So there you have it, my 2014 storage predictions. What do you think? Which do you agree with/disagree with? Did I leave anything off that you think will have a major impact next year? As always, reach out if you have any questions!
Delivering Security ‘as a Service’
SecuritySolutionsWatch.com: Thank you for joining us today, Bryan. If you wouldn’t mind, please tell us a little bit about your background and your role at HP.
Bryan Coapstick: As the Director of Mobile Innovation, I am responsible for ensuring that HP’s mobility initiatives successfully help enterprises improve communications and interactions with customers, employees, constituents and partners to ensure they have the data they need, when they need it, and in a format that is easy to interact with and consume. Additionally, I serve as the industry chair for the Advanced Mobility Working group for the American Council for Technology/Industry Advisory Council (ACT/IAC), whose mission is to “foster collaboration and communication on issues regarding mobile computing in the Federal Government, including citizen services, remote connectivity, employee services, workforce productivity, digital publishing, and enterprise mobility.”
Prior to joining HP, I spent nearly two decades in both the public and private sectors, focused on emerging technologies and the critical intersections between business strategy and technology. I’ve worked with several Fortune 200 companies to increase their market presence and effectively leverage their mobile and digital channels. Additionally, I have built interactive teams, worked to guide digital mobile strategies, and led an Innovation Lab focused on accelerating digital interaction models, the convergence of big data analytics with context aware applications, and emerging mobile payments and transaction capabilities.
Five Hot Healthcare Innovations in 2013 and a Few to Look Out for in 2014
The ability to provide fast and accurate healthcare has always depended on having the latest technology and educating the relevant staff on how best to employ new devices and applications. Throughout 2013, a number of innovations appeared in the industry, changing the way we administer and receive care.
New technologies have the potential to disrupt the traditional, slow, and ineffective processes that are often ingrained in the healthcare system. They can offer more security to meet stringent compliance requirements, streamline distribution, and bring more consistency across a range of services. This year saw new technologies and trends that are having a real impact on the industry.
Cloud Shifts the Burden of Security to Development
This piece explores three ways to help development teams bear the burden of security: use pen test results to harden the application, leverage service virtualization for security scenarios, and adopt policy-driven development to help engineers understand and satisfy management’s security expectations.
The move to the cloud brings a number of new security challenges, but the application remains your last line of defense. Engineers are extremely well poised to perform the tasks critical for securing the application, provided that certain key obstacles are overcome.
Three Ways to Sell Desktop-as-a-Service
Cloud service providers face several key challenges when it comes to selling desktop virtualization technology. An understanding of the benefits of desktop-as-a-service (DaaS) rarely exists among buyers, and licensing concerns – and their rising costs – are not going away any time soon. Knowing the dos and don’ts of selling DaaS before diving in can solve problems on the front end while ensuring DaaS adoption in companies of all shapes and sizes.
1. Turn IT administrators into allies, not opponents.
Among the myriad reasons enterprises cite for turning to desktop-as-a-service (DaaS) technology, a few prove instrumental in driving adoption. When evaluating the decision drivers behind DaaS, it is important to consider what actually motivates a switch away from traditional desktop computing, and who must be motivated. While some of the most progressive executives of small and mid-sized businesses may themselves drive such a switch, they are not typically the individuals interacting with DaaS providers. Those who head a company’s IT department, on the other hand, have many potential motivators for spearheading a move from traditional compute to virtualized, hosted Windows desktops: the time, energy, and frustration DaaS can spare those who administer end-user compute for organizations with dozens – or hundreds – of desktops can be significant. If these individuals are made sufficiently aware of the benefits they stand to gain from a switch to more efficient DaaS technology, it follows that such heads of IT organizations would be among the strongest advocates for the shift.
Your Cloud Availability: 98%, 99.99% or 99.9999%?
We discussed my belief that the nines (99.99…%) are more marketing than reality today. What do you think? The market and the tools used to measure uptime are immature or do not really exist. The concept of availability in the cloud is determined by the level of responsibility and liability that vendors take on for their customers. That includes the ability to monitor, proactively fix, and maintain continuous communication with users, giving them clear and genuine visibility into what exactly is going on and when the system is expected to return to normal. It is also necessary to define the concept of downtime. At a very basic level, downtime is when the system is not available; the more precise answer depends on the criticality of specific features and components of an application or service.
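To ground the nines in numbers, a quick sketch: each availability figure maps directly to a yearly downtime budget, which is one concrete place to start measuring rather than marketing. The percentages below are the ones from the title; everything else is plain arithmetic.

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_budget(availability_pct: float) -> float:
    """Minutes of allowed downtime per year at a given availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for nines in (98.0, 99.9, 99.99, 99.9999):
    print(f"{nines}% -> {downtime_budget(nines):,.1f} min/year")
# 98.0%    -> 10,512.0 min/year (~7.3 days)
# 99.9%    ->    525.6 min/year (~8.8 hours)
# 99.99%   ->     52.6 min/year
# 99.9999% ->      0.5 min/year (~32 seconds)
```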
On that note, we are interested in hearing your point of view. First and foremost, what do you consider “downtime”? How do you approach the matter of downtime with your customers? How do you compensate them, if at all? Perhaps even before discussing compensation: do you measure downtime at all, and if so, can you calculate your own availability over the past week, month, or year? With your help, we can gain even greater insight into the topic.