Tag archive: security

CIF: ‘Lack of trust holding back cloud adoption’

CIF: ‘Cloud users are still citing the same inhibitors’

Security, privacy and lack of control are still the leading inhibitors holding enterprises back from adopting cloud services, according to the Cloud Industry Forum’s latest research.

The CIF polled 250 senior IT decision-makers in the UK earlier this year to better understand where cloud services fit into their overall IT strategies. When asked about their biggest concerns when deciding whether to move to the cloud, 70 per cent cited data security and 61 per cent cited data privacy.

Both are up from the 2014 figures of 61 per cent and 54 per cent, respectively.

“Hybrid will be the modus operandi for the majority of organisations for the foreseeable future, being either not yet ready to move everything to the cloud, or unwilling to. There are a number of contributing factors here: fear of losing control of IT systems, security and privacy concerns, and lack of budget currently stand in the way of greater adoption of cloud by businesses,” said Alex Hilton, chief executive of the CIF.

“The primary issue relates to trust: trust that cloud-based data will be appropriately secured, that it won’t be compromised or inadvertently accessed, and that businesses will be able to retrieve and migrate their data when a contract terminates.”

About 40 per cent of respondents were also concerned they would lose control/manageability of their IT systems when moving to cloud, up from 24 per cent last year.

Richard Pharro, chief executive of APM Group, the CIF’s independent certification partner, said cloud providers need to improve how they disclose their privacy and security practices in order to inspire more confidence among current and potential users.

“Some Cloud providers are opaque in the way that they operate,” Pharro said, pointing to the prevalence of click-through licenses, some of which he said are littered with unrealistic terms and conditions. He added that improving public disclosure in cloud contracts could go some way towards improving trust and confidence among customers.

Eagle Eye Networks CEO Dean Drako acquires cloud access firm for $50m

Eagle Eye’s CEO and former Barracuda Networks president is buying a cloud access and control company for $50m

Dean Drako, president and chief executive of Eagle Eye Networks and former Barracuda Networks president, has wholly acquired Brivo, a cloud access control firm, for $50m.

Brivo said its cloud-based access control system, a centralised management and security system for video surveillance cameras, currently services over 6 million users and 100,000 access points.

The acquisition will give Eagle Eye, a specialist in cloud-based video surveillance technology, a flexible access control tool to couple with its current offerings, Drako said.

“My goal was to acquire the physical security industry’s best access control system,” Drako explained.

“Brivo’s true cloud architecture and open API approach put it a generation ahead of other access control systems. Cloud solutions provide exceptional benefits and Brivo is clearly the market and technology leader. Brivo is also committed to strong, long-standing relationships with its channel partners, which I believe is the best strategy for delivering extremely high customer satisfaction.”

Though the two companies will remain autonomous, Drako will serve as Brivo’s chairman, while Steve Van Till, Brivo’s president and chief executive, will continue in that role.

He said Eagle Eye will work to integrate Brivo’s flagship solution, Brivo OnAir, with its cloud security camera system, which will help deliver video verification and natively viewable and searchable video.

“We are extremely excited that Dean Drako has acquired Brivo and is serving as chairman. In addition to Dean’s experience founding and leading Barracuda Networks to be a multi-billion dollar company, he has grown his latest company, Eagle Eye Networks, to be the technology leader in cloud video surveillance,” Van Till said.

“We both share the vision of delivering the tremendous advantages of cloud-based systems to our customers,” he added.

CSA tool helps cloud users evaluate data protection posture of providers

The CSA says the tool can help customers and providers improve their cloud data protection practices

The Cloud Security Alliance this week unveiled the next generation of a tool designed to enable cloud customers to evaluate the level of data protection precautions implemented by cloud service providers.

The Privacy Level Agreement (PLA) v2 tool aims to give customers a better sense of the extent to which their providers have practices, procedures and technologies in place to ensure data protection vis-à-vis European data privacy regulations.

It also provides guidance for cloud service providers on achieving compliance with privacy legislation in the EU, and on how they can disclose the level of personal data protection they offer to customers.

“The continued reliance and adoption of the PLA by cloud service providers worldwide has been an important building block for developing a modern and ethical privacy-rich framework to address the security challenges facing enterprises worldwide,” said Daniele Catteddu, EMEA managing director of CSA.

“This next version that addresses personal data protection compliance will be of significant importance in building the confidence of cloud consumers,” Catteddu said.

The tool, originally created in 2013, was developed by the PLA working group, which was organised to help transpose the Art. 29 Working Party and EU National Data Protection Regulator’s recommendations on cloud computing into an outline CSPs can use to disclose personal data handling practices.

“PLA v2 is a valuable tool to guide CSPs of any size to address EU personal data protection compliance,” said Paolo Balboni, co-chair of the PLA Working Group and founding partner of ICT Legal Consulting. “In a market where customers still struggle to assess CSP data protection compliance, PLA v2 aims to fill this gap and facilitate customer understanding.”

Telstra’s recent buy Pacnet suffers IT security breach

Pacnet’s IT network was hacked earlier this year

Telstra’s recently acquired datacentre and cloud specialist Pacnet suffered a security breach earlier this year in which a third party gained access to its IT network, the telco revealed this week.

Telstra was quick to point out that the breach occurred on Pacnet’s IT network (which isn’t connected to Telstra’s) before its acquisition of Pacnet was finalised in April, and that it has since done all it can to understand the reasons for the breach and its potential impact on customers.

The company has alerted customers, staff and regulators in the relevant jurisdictions.

Group executive of global enterprise services Brendon Riley said the investigation is ongoing, and that the company will apply its own tried and tested security technologies and techniques to Pacnet’s network.

“Our investigation found a third party had attained access to Pacnet’s corporate IT network, including email and other administrative systems, through a SQL vulnerability that enabled malicious software to be uploaded to the network,” Riley said.

“To protect against further activity we rectified the security vulnerabilities that allowed the unauthorised access. We have also put in place additional monitoring and incident response capabilities that we routinely apply to all of our networks.”
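
The class of flaw Riley describes – a SQL vulnerability that lets an attacker push their own content into backend systems – is classically mitigated with parameterised queries rather than string concatenation. A minimal, generic sketch follows, using Python’s built-in sqlite3 purely for illustration; Pacnet’s actual stack and the specific vulnerability have not been disclosed.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, role TEXT)")

user_input = "x' OR '1'='1"   # a typical injection attempt

# Vulnerable pattern: attacker-controlled input is concatenated into the SQL text,
# so the quote characters above change the meaning of the query.
# conn.execute("SELECT role FROM users WHERE email = '" + user_input + "'")

# Safe pattern: the driver binds the value as data, never as SQL.
rows = conn.execute("SELECT role FROM users WHERE email = ?", (user_input,)).fetchall()
print(rows)   # [] -- the input is treated as a literal string, not as SQL
```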

He said the firm is alerting customers of the potential impact of the breach, and hopes that the extra precautions the company has put in place will restore confidence in the firm.

The company has so far declined to comment on the scope or volume of data exposed to hackers.

Telstra seems keen to pre-empt any privacy-related regulatory challenges, something the company has had to deal with in recent years – which, it was eventually found, was due in part to its own negligence.

Last year, for instance, the firm was fined by the Australian Information Commissioner for making the personal details of almost 16,000 customers accessible via the internet between February 2012 and May 2013, after several spreadsheets containing customer data dating back to 2009 were found through Google Search.

Dropbox the latest to adopt public cloud privacy standard

Dropbox is the latest to adopt one of the first public cloud-focused data privacy standards

Cloud storage provider Dropbox said it has adopted ISO 27018, among the first international standards focusing on the protection of personal data in the public cloud.

The standard, published in August 2014, is aimed at clarifying the roles of data controllers and data processors in keeping Personally Identifiable Information (PII) private and secure in public cloud environments; it builds on other information security standards within the ISO 27000 family, and specifically, is an enhancement to the 27001 standard.

ISO 27018 also broadly requires adopting cloud providers to be more transparent about what they do with customer data and where they host it.

In a statement the company said the move would give users more confidence in its platform, particularly enterprise users.

“We’re pleased to be one of the first companies to achieve ISO 27018 certification. Privacy and data protection regulations and norms vary around the world, and we’re confident this certification will help our customers meet their global compliance needs,” it said.

Mark van der Linden, Dropbox country manager for the UK, said: “Businesses in the UK and all over the world are trusting Dropbox to make collaboration easier and boost productivity. Our ISO 27018 accreditation shows we put users in control of their data, we are transparent about where we store it, and we operate to the highest standards of security.”

Earlier this year Microsoft certified Azure, Intune, Office 365 and Dynamics CRM Online under the new ISO standard. At the time the company also said it was hopeful certifying under the standard would make it easier to satisfy compliance requirements, which can be trickier in some verticals than others.

IBM claims strong traction with cybersecurity cloud network

IBM says its recently announced cybersecurity cloud service is gaining traction

IBM said over 1,000 organisations have now joined its recently announced cloud-based cybersecurity service, dubbed X-Force Exchange.

The service includes hundreds of terabytes of raw aggregated threat intelligence data, and those who sign up can upload their own data, so the more organisations that join, the more robust the service becomes.

The initial data dump is based on over 25 billion web pages and images collected from a network of over 270 million endpoints, and includes data from over 15 billion monitored security events daily. But the company said participants have created more than 300 new collections of threat data since its launch.
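
For a sense of what contributing to a shared threat-intelligence exchange involves, here is a purely illustrative sketch of the “upload your own indicators” workflow described above. The endpoint, fields and token are invented for illustration; this is not IBM’s actual X-Force Exchange API.

```python
import requests

# Invented endpoint and schema, shown only to illustrate pushing a collection
# of threat indicators to a shared exchange -- not IBM's real API.
EXCHANGE_URL = "https://threat-exchange.example.com/api/v1/collections"
API_TOKEN = "example-token"   # placeholder credential

indicator = {
    "type": "ip",
    "value": "203.0.113.42",          # documentation/example address range
    "tags": ["scanner", "botnet-c2"],
    "first_seen": "2015-06-01T09:30:00Z",
}

resp = requests.post(
    EXCHANGE_URL,
    json={"name": "suspicious-scanners", "indicators": [indicator]},
    headers={"Authorization": "Bearer " + API_TOKEN},
    timeout=10,
)
resp.raise_for_status()
```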

“Cybercrime has become the equivalent of a pandemic — no company or country can battle it alone,” said Brendan Hannigan, general manager, IBM Security.

“We have to take a collective and collaborative approach across the public and private sectors to defend against cybercrime. Sharing and innovating around threat data is central to battling highly organized cybercriminals; the industry can no longer afford to keep this critical resource locked up in proprietary databases. With X-Force Exchange, IBM has opened access to our extensive threat data to advance collaboration and help public and private enterprises safeguard themselves,” Hannigan said.

Security isn’t a new area for IBM but offering real-time cyberthreat detection is, a move that has also put it in direct competition with a wide range of managed security service providers that have been playing in this space for years. Nevertheless, the company has a lot of clients so there’s a huge opportunity for the firm to harvest all of that data – particularly as it creates new partnerships with networking incumbents (like Cisco with VersaStack).

Bring Your Own Encryption: The case for standards

BYOE is the new black

Being free to choose the most suitable encryption for your business seems like a good idea, but it will only work in a context of recognised standards that span encryption systems and providers’ security platforms. Since the start of the 21st century, security has emerged from scare-story status to become one of IT users’ biggest issues – as survey after survey confirms. Along the way, a number of uncomfortable lessons are still being learned.

The first lesson is that security technology must always be considered in a human context. No one still believes in a technological fix that will put an end to all security problems, because time and again we hear news of new types of cyber attack that bypass sophisticated and secure technology by targeting human nature – from alarming e-mails ostensibly from official sources, to friendly social invitations to share a funny download; from a harmless-looking USB stick ‘accidentally’ dropped by the office entrance, to the fake policeman demanding a few personal details to verify that you are not criminally liable.

And that explains the article’s heading: a balance must be struck between achieving the desired level of protection and keeping all protection procedures quick and simple. Every minute spent making things secure is a minute lost to productivity – so the heading could equally have said “balancing security with efficiency”.

The second lesson still being learned is never to trust instinct fully in security matters. It is instinctive to obey instructions that appear to come from an authoritative source, or to respond in an open, friendly manner to a friendly approach – and those are just the sort of instincts that are exploited by IT scams. Instincts can open us to attack, and they can also evoke inappropriate caution.

In the first years of major cloud uptake there was the oft-repeated advice to business that the sensible course would be to use public cloud services to simplify mundane operations, but that critical or high priority data should not be trusted to a public cloud service but kept under control in a private cloud. Instinctively this made sense: you should not allow your secrets to float about in a cloud where you have no idea where they are stored or who is in charge of them.

The irony is that the cloud – being so obviously vulnerable and inviting to attackers – is constantly being reinforced with the most sophisticated security measures: so data in the cloud is probably far better protected than any SME could afford to secure its own data internally. It is like air travel: because flying is instinctively scary, so much has been spent to make it safe that you are less likely to die on a flight than you are driving the same journey in the “safety” of your own car. The biggest risk in air travel is the journey to the airport, just as the biggest risk in cloud computing lies in the data’s passage to the cloud – hence the importance of a secure line to a cloud service.

So let us look at encryption in the light of those two lessons. Instinctively it makes sense to keep full control of your own encryption and keys, rather than let them get into any stranger’s hands – so how far do we trust that instinct, bearing in mind the need also to balance security against efficiency?

BYOK

Hot on the heels of BYOD – or “Bring Your Own Device” to the workplace – comes the acronym for Bring Your Own Key (BYOK).

The idea of encryption is as old as the concept of written language: if a message might fall into enemy hands, then it is important to ensure that they will not be able to read it. We have recently been told that US forces used Native American communicators in WW2 because the chances of anyone in Japan understanding their language were near zero. More typically, encryption relies on some sort of “key” to unlock and make sense of the message it contains, and that transfers the problem of security to a new level: now that the message is secure, the focus shifts to protecting the key.
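
To make the point concrete, here is a minimal sketch using the third-party Python cryptography package (Fernet, a standard authenticated symmetric scheme): whoever holds the key can read the message, so protecting the key becomes the real problem.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key and encrypt a message with it.
key = Fernet.generate_key()
token = Fernet(key).encrypt(b"quarterly sales figures")

# The ciphertext is safe to store anywhere; only the key holder can recover it.
print(Fernet(key).decrypt(token))   # b'quarterly sales figures'

# Lose control of `key`, however, and the encryption protects nothing --
# which is why the rest of the discussion is about who holds the key.
```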

In the case of access to cloud services: if we are encrypting data because we are worried about its security in an unknown cloud, why then should we trust the same cloud to hold the encryption keys?

Microsoft, for instance, recently announced a new solution to this dilemma using HSMs (Hardware Security Modules) within its Windows Azure cloud: an enterprise customer uses its own internal HSM to produce a master key that is then transmitted to the HSM within the Windows Azure cloud. This provides secure encryption in the cloud, but it also means that not even Microsoft itself can read the data, because it does not hold the master key hidden in the enterprise HSM.
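
The pattern behind this – often called key wrapping, or envelope encryption – can be sketched in a few lines. Here the AES key-wrap function from the Python cryptography package stands in for what the HSMs do in hardware; this is an illustration of the concept, not Microsoft’s implementation.

```python
import os
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

# Master key: generated and kept inside the enterprise HSM; it never leaves.
master_key = os.urandom(32)

# Data key: the key actually used in the cloud to encrypt workload data.
data_key = os.urandom(32)

# Only the *wrapped* (encrypted) data key travels to, or is stored by, the provider.
wrapped = aes_key_wrap(master_key, data_key)

# Without the master key the provider cannot unwrap it, so it cannot read the data
# and has nothing useful to hand over in response to a subpoena.
assert aes_key_unwrap(master_key, wrapped) == data_key
```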

It is not so much that the enterprise cannot trust Microsoft to protect its data from attack, it is more to do with growing legal complexities. In the wake of Snowden revelations, it is becoming known that even the most well protected data might be at risk from a government or legal subpoena demanding to reveal its content. Under this BYOK system, however, Microsoft cannot be forced to reveal the enterprise’s secrets because it cannot access them itself, and the responsibility lies only with the owner.

This is increasingly important because of other legal pressures that insist on restricting access to certain types of data. A government can, for example, forbid anyone from allowing data of national importance to leave the country – not a simple matter in a globally connected IP network. There are also increasing legal pressures on holders of personal data to guarantee levels of privacy.

Instinctively it feels a lot more secure to manage your own key and use BYOK instead of leaving it to the cloud provider. As long as that instinct is backed by a suitable and strict in-house, HSM-based security policy, it can be trusted.

BYOE

BYOK makes the best of the cloud provider’s encryption offering, by giving the customer ultimate control over its key. But is the customer happy with the encryption provided?

Bearing in mind that balance between security and efficiency, you might prefer a higher level of encryption than that used by the cloud provider’s security system, or you might find the encryption mechanism is adding latency or inconvenience and would rather opt for greater nimbleness at the cost of lighter encryption. In this case you could go a step further and employ your own encryption algorithms or processes. Welcome to the domain of BYOE (Bring Your Own Encryption).

Again, we must balance security against efficiency. Take the example of an enterprise using the cloud for deep mining its sensitive customer data. This requires so much computing power that only a cloud provider can do the job, and that means trusting private data to be processed in a cloud service. This could infringe regulations, unless the data is protected by suitable encryption. But how can the data be processed if the provider cannot read it?

Taking the WW2 example above: if a Japanese wireless operator was asked to edit the Native American message so a shortened version could be sent to HQ for cryptanalysis, any attempt to edit an unknown language would create gobbledygook, because translation is not a “homomorphic mapping”.

Homomorphic encryption means that one can perform certain processes on the encrypted data, and the same processes will be performed on the source data without any need to decrypt the encrypted data. This usually implies arithmetical processes: the data mining software can do its mining on the encrypted data file while it remains encrypted, and the output data, when decrypted, will be the same as if the data had been processed without any intervening encryption.
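
A toy sketch of the idea, using textbook Paillier encryption with tiny, insecure parameters purely to show the additive property (real deployments use vetted libraries and far larger keys): the provider can add two values it never sees in the clear.

```python
import math, random

# Toy Paillier cryptosystem: multiplying ciphertexts adds the hidden plaintexts.
p, q = 293, 433                      # real keys use primes of ~1024 bits each
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)                 # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:       # randomiser must be coprime to n
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    x = pow(c, lam, n2)
    return (((x - 1) // n) * mu) % n

a, b = 17, 25
c_sum = (encrypt(a) * encrypt(b)) % n2   # performed entirely on ciphertexts
assert decrypt(c_sum) == a + b           # 42: the operator never saw 17 or 25
```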

It is like operating one of those automatic coffee vendors that grinds the beans, heats the water and adds milk and sugar according to which button was pressed: you do not know what type of coffee bean is used, whether the water is tap, filtered or spring, or whether the milk is whole cream, skimmed or soya. All you know is that what comes out will be a cappuccino with no sugar. In the data mining example, what comes out might be a neat spreadsheet summary of customers’ average buying habits based on millions of past transactions, without a single personal transaction detail being visible to the cloud provider.

The problem with the cloud provider allowing users to choose their own encryption is that the provider’s security platform has to be able to support the chosen encryption system. As an interim measure, the provider might offer a choice from a range of encryption offerings that have been tested for compatibility with the cloud offering, but that still requires one to trust another’s choice of encryption algorithms. A full homomorphic offering might be vital for one operation, but a waste of money and effort for a whole lot of other processes.

The call for standards

So what is needed for BYOE to become a practical solution is a globally standard cloud security platform with which any encryption offering can be registered for support. The customer chooses a cloud offering for its services and for its certified “XYZ standard” security platform, then goes shopping for an “XYZ certified” encryption system that matches its particular balance between security and practicality.

Just as in the BYOD revolution, this decision need not be made at an enterprise level, or even by the IT department. BYOE, if sufficiently standardised, could become the responsibility of the department, team or individual user: just as you can bring your own device to the office, you could ultimately take personal responsibility for your own data security.

What if you prefer to use your very own implementation of your own encryption algorithms? All the more reason to want a standard interface! This approach is not so new for those of us who remember the Java J2EE Crypto library – as long as we complied with the published interfaces, anyone could use their own crypto functions. This “the network is the computer” ideology becomes all the more relevant in the cloud age. As the computer industry has learned over the past 40 years, commonly accepted standards and architectures (for example the Von Neumann model or J2EE Crypto) play a key role in enabling progress.
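
As a sketch of what such a published interface might look like – no such global standard exists today, and the names below are invented purely for illustration – a platform could accept any encryption provider that implements an agreed contract:

```python
from abc import ABC, abstractmethod

class EncryptionProvider(ABC):
    """Hypothetical contract a cloud security platform could publish so that
    customers can register their own encryption implementations (BYOE)."""

    @abstractmethod
    def encrypt(self, plaintext: bytes) -> bytes: ...

    @abstractmethod
    def decrypt(self, ciphertext: bytes) -> bytes: ...


class MyCompanyCipher(EncryptionProvider):
    """A customer-supplied implementation; the internals are the customer's business."""

    def __init__(self, key: bytes):
        self._key = key

    def encrypt(self, plaintext: bytes) -> bytes:
        # Placeholder XOR 'cipher' purely to keep the example self-contained;
        # a real provider would plug its chosen algorithm in here.
        return bytes(b ^ self._key[i % len(self._key)] for i, b in enumerate(plaintext))

    def decrypt(self, ciphertext: bytes) -> bytes:
        return self.encrypt(ciphertext)   # XOR is its own inverse


def register_provider(provider: EncryptionProvider) -> None:
    """Stand-in for the platform-side registration call such a standard would define."""
    assert isinstance(provider, EncryptionProvider)

register_provider(MyCompanyCipher(b"secret"))
```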

BYOE could prove every bit as disruptive as BYOD – unless the industry can ensure that users choose their encryption from a set of globally sanctioned and standardised encryption systems or processes. If business is to reap the full benefits promised by cloud services, it must have the foundation of such an open cloud environment.

Written by Dr. Hongwen Zhang, chair security working group, CloudEthernet Forum.

Nearly half of Brits find wearables in the enterprise intrusive – study

How will wearables impact privacy in the enterprise?

A recently published study by UK mobile app developer Apadmi suggests UK workers are deeply concerned about the privacy implications of wearable, IP-connected technology in the workplace.

The study, which surveyed 500 adults living and working in the UK, found that 42 per cent of people in the UK thought that wearable technology posed a risk to their privacy, with only 18 per cent of respondents saying they didn’t feel it was a danger.

But there seemed to be a significant portion of respondents (40 per cent) that did not know whether wearable tech would pose a threat to their privacy.

“It’s obvious from our investigations that privacy is a very real issue for the wearable technology industry, although it’s by no means insurmountable,” said Nick Black, co-founder and director at Apadmi.

“A lot of commentators are flagging up the potential privacy implications of devices that can record and relay so much data about an individual. And consumers appear to be taking note, with quite a few admitting that these concerns weigh on their mind when considering whether or not to buy wearable technology.”

Wearables have started to gain favour with some larger enterprises in the US and UK, particularly when it comes to tracking health and fitness. Some private health insurers for instance monitor fitness data as a way to incentivise fitness activity, which reduces the risk of health issues and can lead to lower premiums.

But opinion on the privacy implications of mandating wearables in the workplace seems to be quite strong. When asked how they would feel if their employer required them to use wearable technology as part of their role, 25 per cent of respondents said they would consider changing jobs, while a further 24 per cent said they would be happy to do so.

“We also need to draw attention to the fact that a huge number of people still don’t have a firm grasp of how wearable technology might impact upon privacy in the first place, as demonstrated by the significant number of ‘don’t know’ respondents in our survey. People are naturally apprehensive about what they don’t understand. But it’s interesting that those who go on to purchase a device are overwhelmingly happy with their decision and the benefits it has brought to their lives,” Black explained.

“With this in mind, wearable tech businesses and app developers need to educate prospective customers around privacy concerns to alleviate these fears. Many people still don’t fully understand the privacy issues around wearable technology or appreciate its potential to dramatically improve lives in areas such as health and social care.”

Despite the potential privacy implications many believe use of wearables in the enterprise will rapidly increase over the next few years. Salesforce for instance claims use of wearables in the enterprise will more than triple in the next two years, with smartwatches emerging as a popular candidate to deliver sales and customer service improvements.

The company’s own survey of over 1,400 working adults shows 79 per cent of adopters agree wearables will be strategic to their company’s future success; 76 per cent report improvements in business performance since deploying wearables in the enterprise; and 86 per cent of adopters’ organisations plan to increase their wearables spend over the next 12 months.

EMC World 2015: Event Recap

After EMC World 2015, I’m languishing in airports today in post-conference burnout – an ideal time to deliver a report on the news, announcements and my prognostications on what this means to our business.

The big announcements were delivered in General Sessions on Monday (EMC Information Infrastructure & VCE) and on Tuesday (Federation: VMware & Pivotal). The Federation announcements are more developer and futures oriented, although important strategically, so I’ll pass on that for now.

EMC and VCE have updated their converged and Hyperconverged products pretty dramatically. Yes, VSPEX Blue is Hyperconverged, however unfortunate the name is in linking an EVO:RAIL solution to a reference architecture solution.

The products can be aligned as:

  1. Block
  2. Rack
  3. Appliances

EMC World 2015

The VCE Vblock product line adheres to its core value proposition closely.

  1. Time from order to completely deployed on the data center floor in 45 days. (GreenPages will provide the Deploy & Implementation services. We have three D&I engineers on staff now.)
  2. Cross component Unified upgrade through a Release Candidate Matrix – every single bit of hardware is tested in major and minor upgrades to ensure compatibility: storage, switch, blade, add-ons (RecoverPoint, Avamar, VPLEX).
  3. Unified support – one call to VCE, not to all the vendors in the build

However, VCE is adding options and variety to make the product less monolithic.

  1. VXblock – this is the XtremIO version, intended for large VDI or mission critical transactional deployments (trading, insurance, national healthcare claims processing). The Beast is a Vblock of eight 40 TB Xbrick nodes, 320 TB before dedupe and compression, or nearly 2 PB with realistic data reduction (see the quick arithmetic after this list). Yes, that is Two Petabytes of All Flash Array. Remote replication is now totally supported with RecoverPoint.
  2. VXRack – this is a Vblock without an array, but it isn’t VSAN either. It is… ScaleIO, a software storage solution that pools server storage into a shared pool. The minimum configuration is 100 compute nodes, which can be dense performance (4 node form factor in a 2U chassis) or capacity. The nodes can be bare metal or run a hypervisor of any sort. This can scale to 328 Petabytes. Yes, Petabytes. This is web-scale, but they call it “Rack Scale” computing (first generation). More on that later…
  3. Vscale – Networking! This is Leaf and Spine networking in a rack to tie a VXrack or Vblock deployment together, at scale. “One Ring to Rule Them All”. This is big, literally. Imagine ordering a petabyte installation of VXblock, VXrack and Vscale, and rolling it onto the floor in less than two months.
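
The raw-versus-effective numbers quoted for the VXblock above work out roughly as follows; the ~6:1 data reduction ratio is an assumption used only to match the “nearly 2 PB” claim, and actual reduction depends entirely on the workload.

```python
# Quick back-of-the-envelope for the VXblock "Beast" figures quoted above.
nodes = 8
tb_per_xbrick = 40
raw_tb = nodes * tb_per_xbrick            # 320 TB raw, before dedupe/compression

assumed_reduction = 6                     # assumed ~6:1 dedupe + compression ratio
effective_pb = raw_tb * assumed_reduction / 1000
print(raw_tb, "TB raw ->", effective_pb, "PB effective")   # 320 TB raw -> 1.92 PB effective
```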

So, that is Block and Rack. What about Appliance?

Enter VSPEX Blue, the EMC implementation of EVO:RAIL. This has definite value in…

  • Pricing
  • Unified management & support
  • The “app store” with
    • integrated backup (VDPA)
    • replication (vRPA)
    • Cloud Array integration (TwinStrata lives!), a virtual iSCSI controller that will present cloud storage to the system as a backup target or a capacity tier.

This post from Mike Colson provides a good explanation.

Future apps will include virus scanning, links to Public IaaS and others.

I set one up in the lab in 15 minutes, as advertised, although I had to wait for the configuration wizard to churn away after I initialized it and input all the networking. Professional Services will be required, as EMC is requiring PS to implement. Our team is and will be prepared to deploy this. We can discuss how this compares to other Hyperconverged appliances. Contact us for more information.

There are other announcements, some in sheer scale and some in desirable new features.

Data Domain Beast: DD9500, 58.7 TB/hr. and 1.7 PB of capacity. This is rated at 1.5x the performance and 4x the scalability of the nearest competitor.

VPLEX News: The VPLEX Witness can now be deployed in the public Cloud (naturally EMC recommends the EMC Hybrid Cloud or vCloud Air). The Witness has to be outside the fault domains of any protected site, so where better than the Cloud? It is a very lightweight VM.

CloudArray (TwinStrata’s Cloud Array Controller) is integrated with VPLEX. You can have a distributed volume spanning on-premises and cloud storage. I’m still trying to grasp the significance of this. The local cache for the CloudArray controller can be very fast, so this isn’t limited to latency-tolerant applications. The things you could do…

VPLEX is now available in a Virtual Edition (VPLEX/VE). This will obviously come with some caveats and restrictions, but this also is a fantastic new option for smaller organizations looking for the high availability that VPLEX provides, as well as data mobility and federation of workloads across metro distances.

VVOL: Chuck Hollis (@chuckhollis) led an entertaining and informative ‘Birds of a Feather’ session for VVOLs. Takeaway – this is NOT commonly deployed yet. Only a handful of people have even set it up, and mostly for test. This was in a room with at least 150 people, so high interest, but low deployment. Everyone sees the potential and is looking forward to real world policy based deployments on industry standard storage. This is an emerging technology that will be watched closely.

VNX/VNXe: I didn’t see or hear many striking features or upgrades in this product line, but an all flash VNXe was trumpeted. I’ll be looking at the performance and design specifications of this more closely to see how it might fit targeted use cases or general purpose storage for SMB and commercial level customers. There is talk around the virtualization of the VNX array, as well as Isilon, so pretty soon nearly every controller or device in the EMC portfolio will be available as a virtual appliance. This leads me to…

ViPR Controller and ViPR SRM: Software Defined Storage

ViPR Controller is definitely a real product with real usefulness. This is the automation and provisioning tool for a wide variety of infrastructure elements, allowing for creation of virtual arrays with policy based provisioning, leveraging every data service imaginable: dedupe, replication, snapshots, file services, block services and so on.
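
As a rough illustration of what policy-based provisioning looks like from the consumer’s side: the caller asks for capacity against a named policy and the controller chooses the backing hardware and data services. The endpoint and payload below are invented for illustration and are not ViPR’s actual API.

```python
import requests

# Hypothetical SDS controller endpoint -- illustrative only, not ViPR's real API.
CONTROLLER = "https://sds-controller.example.com/api"

# The request names a policy (a "virtual array" bundling replication, snapshots,
# dedupe, etc.) rather than a specific array or LUN.
volume_request = {
    "name": "erp-db-01",
    "size_gb": 500,
    "policy": "gold-replicated",   # the policy carries the data services, not the device
}

resp = requests.post(CONTROLLER + "/volumes", json=volume_request, timeout=30)
resp.raise_for_status()
print(resp.json())   # e.g. the provisioned volume's ID and export details
```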

ViPR SRM is the capacity reporting and monitoring tool that provides the management of capacity that is needed in an SDS environment. This is a much improved product with a very nice GUI and more intuitive approach to counters and metrics.

I’d recommend a Storage Transformation Workshop for people interested in exploring how SDS can change the way (and cost) of how you manage your information infrastructure.

More on EVO:RAIL/VSPEX Blue

I met with Mike McDonough, the mastermind behind EVO:RAIL. He is indeed a mastermind. The story of the rise of EVO:RAIL as a separate business unit is interesting enough (300 business cases submitted, 3 approved, and he won’t say what the other mystery products are), but the implementation, strategy and vision are what matter to us. The big factor here was boiling down the support cases to come up with the 370 most common reasons for support calls, all around configuration, management and hardware. The first version of EVO:RAIL addressed 240 of those issues. Think of this as having a safety rail around a vSphere appliance to prevent these common and easily avoidable issues, without restricting the flexibility too much. The next version will most likely incorporate NSX, with security and inspection as the emphases for the next iteration.

Partners and distributors were chosen carefully. GreenPages is one of only 9 national partners chosen for this, based on our long history as a strategic partner and our thought leadership! The tightly controlled hardware compatibility list is a strength, as future regression tests for software and other upgrades will keep the permutations down to a minimum. (By the way, the EMC server platform is Intel for VxRack, VSPEX Blue and, I think, for all of their compute modules across all their products.) The competitive implication is that appliance vendors buying white-box hardware on commodity contracts – which allow flexibility in drives, memory and CPU – will face an exponentially more difficult task in maintaining the growing permutations of hardware versions over time.

Final Blue Sky note:

Rack Scale is an Intel initiative that promises an interesting future of increased hardware awareness for hypervisors, but it is a very forward-looking project. Read Scott Lowe’s thoughts on this.

As always, contact us for more details and in-depth conversations about how we can help you build the data center of the future, today.

By Randy Weis, Practice Manager, Information Infrastructure