Quantum Exec Outlines "Massive Data Growth" Challenge

Storage is part of cloud computing’s brain, as essential as the microprocessors that work with it to bring cloud computing alive. Along these lines, I had a short but interesting conversation with Quantum Corp.’s SVP of Strategy Janae Lee in the days before our recent Cloud Expo in New York.

Janae’s been in the storage business a long time, having put in stints at IBM as well as two companies that were acquired by EMC. She’s been at Quantum since 2007.

I asked her a number of questions as I was preparing for New York, and she provided detailed answers. Now that I’ve had some time to recover from New York (even as we prepare for Cloud Expo Silicon Valley in November), it seems like a good time to report on what she said:

Cloud Computing Journal: Could you briefly describe where Quantum stands with regard to cloud computing and big data? How does its current vision and strategy connect to the company’s history?

Janae Lee: Quantum has long been in the business of intelligently capturing, managing and protecting data on a variety of storage devices so it can be retrieved/accessed when needed. As an example, we provide high-performance file management and policy-based tiering to disk, object store or tape (for cost-effectiveness) through our StorNext solutions.
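
To make the idea of policy-based tiering concrete, here is a minimal Python sketch of the kind of decision such a policy engine makes. The thresholds and tier names are illustrative assumptions only, not StorNext’s actual configuration or API.

```python
from datetime import datetime, timedelta

# Illustrative tiering thresholds -- not StorNext's actual policy parameters.
OBJECT_STORE_AFTER = timedelta(days=30)    # move cooling data to the object store
TAPE_AFTER = timedelta(days=365)           # move the coldest data to tape for cost

def choose_tier(last_access, now=None):
    """Return the storage tier a file should live on, based only on access age."""
    now = now or datetime.utcnow()
    age = now - last_access
    if age >= TAPE_AFTER:
        return "tape"
    if age >= OBJECT_STORE_AFTER:
        return "object_store"
    return "primary_disk"

# Example: a file untouched for 90 days lands on the object store tier.
print(choose_tier(datetime.utcnow() - timedelta(days=90)))  # -> object_store
```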

We see the storage cloud as a very interesting, new, high-growth opportunity for leveraging our intelligence value-add to help customers. As a result, we are deploying a range of private and public cloud models, including:

• A product reference architecture using StorNext and our Lattus object storage, for customers wanting flexible, low-cost storage in a private cloud.

• A set of cloud backup and DR products and services that we enable with partner MSPs, based on our DXi deduplication and vmPRO virtual data protection technology.

• A set of services we intend to offer directly to customers which – when released – will focus on specific use cases. We demonstrated an example of one such service at this year’s NAB (National Association of Broadcasters) show in conjunction with several partners (Adobe, Reach Engine and Telestream). This demonstration showed an entire film editing workflow completed in the cloud using StorNext, Lattus and partner products.

While these models sound discrete, our expectation based on customer input to date is that most of our cloud deployments will be hybrids of onsite and offsite models, with some of the onsite systems managed by the customer and others delivered as fully managed services by Quantum or its partners.

We will have a number of new offerings in this space this year. However, rather than the vanilla commodity storage offerings frequently announced in press releases, the Quantum solutions will be configured to enable specific customer use cases, thereby maximizing the value and ease of use to the customer.

CCJ: You recently wrote a brief item about Data vs. Information. This is an age-old challenge, as enterprise IT tries to turn data and information into knowledge and wisdom. How does Quantum address this challenge for its customers in an age when massive amounts of real-time data are starting to emerge?

Janae: Quantum has been working at the heart of this issue for over 20 years with our scale-out storage business around the StorNext file system. Two of the industries where our deployment of StorNext is strongest are Media and Entertainment and National Intelligence.

Both of these customer sets are on the leading edge of the issues that other, more general Big Data customers are only now beginning to see (massive data capture from devices – cameras and/or sensors – with the need to capture/analyze metadata). This experience has given us an appreciation of the challenges – and the expertise to deal with them. Our strong engagement with these communities is precisely what caused us to become an early proponent of object storage (now embedded in our Lattus system).

StorNext’s natural support for workflow-based architectures has also enabled (and even driven) us to build strong partnerships with vendors who specialize in the collection, creation and policy-based management of business metadata, such as media asset management and lab information system companies.

We leverage our API integration with these systems to enable a full end-to-end data workflow automation capability for customers – a capability that historically has enabled use cases such as multi-use editing or real-time human analytics, even with massive data volumes.
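
As a rough sketch of what that kind of API-level workflow automation can look like, the snippet below tags ingested files with business metadata and routes them through a simple policy. The function and field names are hypothetical, invented for illustration; they are not Quantum’s or any partner’s actual interfaces.

```python
# Hypothetical sketch of metadata-driven workflow automation.
# Function and field names are invented for illustration only.

def extract_metadata(path):
    """Stand-in for an asset-management system that inspects an ingested file."""
    kind = "video" if path.endswith(".mov") else "data"
    return {"path": path, "kind": kind, "project": "demo-project"}

def route(asset):
    """Decide the next workflow step from business metadata alone."""
    if asset["kind"] == "video":
        return "send-to-editing-tier"      # keep on fast disk for editors
    return "archive-to-object-store"       # everything else goes to capacity storage

for path in ["interview.mov", "sensor_dump.csv"]:
    asset = extract_metadata(path)
    print(asset["path"], "->", route(asset))
```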

Moving forward, I believe a broader set of customers are going to need this capability, particularly as they capture, store and manage data which must be shared between traditional business systems and Big Data analytics.

We even selectively invest in key metadata technology companies: a couple of years ago we invested in a startup company called NerVve Technologies, which has some unique technology for indexing and searching video by defined images rather than text.

CCJ: How is the Internet of Things affecting your thinking and shaping your strategy? How do you simultaneously focus on the big issues as well as the challenges of what could be exponential leaps in the number of devices hooked to the Internet and the data that flows through them?

Janae: Great question. Our strategy is a mix of executing more of what we know, guided by input from our customers, partners and suppliers; and some brand new innovations.

On the one hand, we see the IoT as just our classic environment on steroids. Understand that we’ve been dealing with hundreds of very rich endpoints (e.g., cameras) streaming high volume data – and keeping it – for a long time. Quantum has customers, for example, storing and using over 30PB of environmental data, pulled down from devices like satellites, on-ground sensors, mobile cameras, etc.

So on one level it’s just an extension of what we’ve always done. As a result we have continued to invest – and expanded the scalability of our products – performance scale, capacity scale, “number of things” scale and management scale (also known as ease of use).

This was a key driver of our three-plus-year development effort to release StorNext 5, which takes us to more than 5 billion files in a file system, and also a key driver of our investment in object storage with Lattus, which enables customers to grow their data exponentially over time without ever needing to worry about a forklift system migration.

At the same time, there are challenges in this massive endpoint and data growth that require brand new approaches – innovations in both product and go-to-market. New levels of data distribution and distributed user collaboration are driving the need for innovation in distributed data management.

This again is a metadata problem – how do you make sure everyone has the same view of the data? Or, more likely, given that the physical and economic realities of the network won’t allow for this, how do you address data coherency in a way that is practical and useful?
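
One common way to frame that coherency question is to keep a version number in each object’s metadata and compare a site’s local view against an authoritative catalog before trusting it. The sketch below is a generic illustration of that idea, not a description of Quantum’s implementation.

```python
# Minimal sketch: detect stale copies by comparing per-object version numbers.
# Generic illustration of metadata-based coherency, not Quantum's design.

catalog_site_a = {"clip-001": 7, "clip-002": 3}   # object -> version known at site A
catalog_site_b = {"clip-001": 7, "clip-002": 2}   # site B is behind on clip-002

def stale_objects(local, authoritative):
    """Objects whose local version lags the authoritative catalog."""
    return [obj for obj, ver in authoritative.items() if local.get(obj, -1) < ver]

print(stale_objects(catalog_site_b, catalog_site_a))  # -> ['clip-002']
```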

As a market example: flexibility, ease of administration and cost effectiveness are driving the cloud. For Quantum this means delivering more end-to-end services than we have in the past.

read more

Addressing Identity in the Digital Economy

How do we blaze a better path to a secure mobile future? How do we make today’s ubiquitous mobile devices as low risk as they are indispensable?
As smartphones have become de rigueur in the global digital economy, users want them to do more work, and businesses want them to be more productive for their employees — as well as powerful added channels to consumers.
But neither businesses nor mobile-service providers have a cross-domain architecture that supports all the new requirements for a secure digital economy, one that allows safe commerce, data sharing and user privacy.

read more

Learn More About Secure Web Conferencing

Time compression, travel limits, and increasing complexity in every walk of professional life have driven web and video conferencing from new application to essential service – just like phone, fax and email before it. This paper describes how OmniJoin™ cloud computing technology and widely available audio/video peripherals deliver easy, high-quality conferencing sessions within IT-friendly network bandwidth and security policies – the essential ingredients for successful business-to-business online meetings.

read more

VMware Horizon 6: Updates, Improvements, and Licensing Changes

By Chris Ward, CTO

I got a late start on this blog post, but I’m a fan of the saying “better late than never!” VMware officially shipped Horizon 6, the long-awaited major update to its end user computing product set, late last month. There are numerous updates and improvements across the product set with this major release, but there is also a change in how it is licensed. In the past, Horizon was consumed either as individual products (VIEW, Mirage, Workspace, etc.) or as a suite that included all components. With this new release, VMware has transitioned to its traditional product hierarchy, which includes Horizon Standard, Advanced, and Enterprise editions.

Each edition builds on the one below it, with additional features added into the mix. The Standard edition basically amounts to what we’ve known as VIEW in the past. It is the baseline VDI feature set, inclusive of the connection and security servers, the PCoIP protocol, ThinApp application virtualization, and linked clone functionality. Moving to the Advanced edition adds in Mirage management, Remote Desktop Session Host (RDSH), Horizon Workspace, and vSAN integration. The Enterprise edition adds vCOPS monitoring and vCAC/vCenter Orchestrator integration.

One of the more exciting features of Horizon 6 is RDSH application publishing. This is a big deal because it’s been a glaring missing checkbox when comparing Horizon to Citrix in the past. This feature allows you to set up Windows terminal server (RDSH) farms that publish individual applications rather than full desktop sessions, very closely resembling Citrix XenApp. Why’s this a big deal? Well, it can save a lot of back-end horsepower when 50 users can share a single RDSH VM to run a few applications rather than needing 50 desktop VMs in traditional VDI. This allows a more flexible architecture, so you can deliver each application in the best way possible rather than being forced into deploying only full desktop operating systems.
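
A quick back-of-envelope comparison shows why this matters. The per-VM resource figures below are illustrative assumptions only, not VMware sizing guidance.

```python
# Back-of-envelope: 50 users on one shared RDSH VM vs. 50 full desktop VMs.
# All resource figures are illustrative assumptions, not VMware sizing guidance.

users = 50
desktop_vm_ram_gb = 4      # assumed RAM per full desktop VM
rdsh_vm_ram_gb = 32        # assumed RAM for one RDSH VM hosting all 50 sessions

full_vdi_ram = users * desktop_vm_ram_gb
print(f"Full VDI:  {full_vdi_ram} GB RAM for {users} users")         # 200 GB
print(f"RDSH:      {rdsh_vm_ram_gb} GB RAM for {users} sessions")    # 32 GB
print(f"Savings:   {full_vdi_ram - rdsh_vm_ram_gb} GB")              # 168 GB
```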

Mirage integration with the traditional VIEW product has improved as well.  While not 100% there, you can now get some of the Mirage OS/application layering functionality inside the VDI environment while still being able to use Mirage in its native capacity as a physical desktop management platform.  vSAN integration is a big step forward in potentially minimizing the typically large storage costs for a VDI environment, and the inclusion of vCOPS in the Enterprise edition is great as it provides very deep insight into what’s going on under the covers with your Horizon infrastructure, including deep PCoIP analytics.  Finally, the Workspace component of Horizon has improved greatly, allowing you to provide your end users with a single web page whereby they can access VDI desktops, RDSH based published applications, Citrix XenApp published applications, ThinApp packaged applications, and SaaS based apps such as Office365, Google Apps, etc.

With this release, VMware seems to be delivering on its promise that the EUC space is one of its three strategic focus areas. I look forward to further improvements, along with the integration of AirWatch into the Horizon family in upcoming releases. For now, Horizon 6 is a very big step in the right direction.

Have you tried or migrated to Horizon 6 since the launch?  If so, please share your thoughts!

 

Are you interested in learning about how you can extend your data center into the cloud with VMware vCloud Hybrid Service? Register for our upcoming webinar!

 

 

A roundup of analytics, big data and business intelligence forecasts and estimates for 2014

From manufacturers looking to gain greater insights into streamlining production, reducing time-to-market and increasing product quality to financial services firms seeking to upsell clients, analytics is now essential for any business looking to stay competitive.  Marketing is going through its own transformation, away from traditional tactics to analytics- and data-driven strategies that deliver measurable results.

Analytics and the insights they deliver are changing competitive dynamics daily by delivering greater acuity and focus.  The high level of interest and hype surrounding analytics, Big Data and business intelligence (BI) is leading to a proliferation of market projections and forecasts, each providing a different perspective of these markets.

Presented below is a roundup of recent forecasts and market estimates:

  • The Advanced and Predictive Analytics (APA) software market is projected to grow from $2.2B in 2013 to $3.4B in 2018, attaining a 9.9% CAGR over the forecast period. The top three vendors in 2013 based on worldwide revenue were SAS ($768.3M, 35.4% market share), IBM ($370.3M, 17.1% market share) and Microsoft ($64.9M, 3% market share). IDC commented that simplified APA tools that provide less flexibility than standalone statistical modeling tools, yet have more intuitive graphical user interfaces and easier-to-use features, are fueling business analysts’ adoption. (See the short CAGR calculation after this roundup for a quick way to sanity-check these growth figures.) Source: http://www.idc.com/getdoc.jsp?containerId=249054
  • A.T. Kearney forecasts global spending on Big Data hardware, software and services will grow at a CAGR of 30% through 2018, reaching a total market size of $114B.  The average business expects to spend $8M on big data-related initiatives this year. Source: Beyond Big: The Analytically Powered Organization.
  • Cloud-based Business Intelligence (BI) is projected to grow from $0.75B in 2013 to $2.94B in 2018, attaining a CAGR of 31%. Redwood Capital’s recent Sector Report on Business Intelligence (free, no opt-in) provides a thorough analysis of the current and future direction of BI. Redwood Capital segments the BI market into traditional, mobile, cloud and social business intelligence. The following two charts from the Sector Report on Business Intelligence illustrate how Redwood Capital sees the progression of the BI market through 2018.

[Chart: Redwood Capital, global business intelligence market size]

  • Enterprises getting the most value out of analytics and BI have leaders that concentrate more on collaboration, instilling confidence in their teams, and creating an active analytics community, while laggards focus on technology alone.  A.T. Kearney and Carnegie Mellon University recently surveyed 430 companies around the world, representing a wide range of geographies and industries, for the inaugural Leadership Excellence in Analytic Practices (LEAP) study.  You can find the study here.  The following is a graphic from the study comparing the characteristics of leaders and laggards’ strategies for building a culture of analytics excellence.

[Graphic: leaders vs. laggards in building a culture of analytics excellence]

  • The worldwide market for Big Data related hardware, software and professional services is projected to reach $30B in 2014. Signals and Systems Telecom forecasts the market will attain a compound annual growth rate (CAGR) of 17% over the next six years, making Big Data a $76B market by 2020. Source: http://www.researchandmarkets.com/research/s2t239/the_big_data
  • Big Data is projected to be a $28.5B market in 2014, growing to $50.1B in 2015, according to Wikibon. Their report, Big Data Vendor Revenue and Market Forecast 2013-2017, is outstanding in its accuracy and depth of analysis. The following is a graphic from the study, illustrating Wikibon’s Big Data market forecast broken down by market component through 2017.

[Chart: Wikibon Big Data market forecast by component, 2013-2017]

  • SAP, IBM, SAS, Microsoft, Oracle, Information Builders, MicroStrategy, and Actuate are market leaders in BI according to Forrester’s latest Wave analysis of BI platforms. Their report, The Forrester Wave™: Enterprise Business Intelligence Platforms, Q4 2013 (free PDF, no opt in, courtesy of SAS) provides a thorough analysis of 11 different BI software providers using the research firm’s 72-criteria evaluation methodology.
  • Amazon Web Services, Cloudera, Hortonworks, IBM, MapR Technologies, Pivotal Software, and Teradata are Big Data Hadoop market leaders according to Forrester’s latest Wave analysis of Hadoop Solutions.  Their report, The Forrester Wave™: Big Data Hadoop Solutions, Q1 2014 (free PDF, no opt in, courtesy of MapR Technologies) provides a thorough analysis of nine different Big Data Hadoop software providers using the research firm’s 32-criteria evaluation methodology.
  • IDC forecasts the server market for high performance data analysis (HPDA) will grow at a 23.5% compound annual growth rate (CAGR), reaching $2.7B by 2018. In the same series of studies, IDC forecasts the related storage market will expand to $1.6B, also by 2018. HPDA is the term IDC created to describe the formative market for big data workloads using HPC. Source: http://www.idc.com/getdoc.jsp?containerId=prUS24938714
  • Global Big Data technology and services revenue will grow from $14.26B in 2014 to $23.76B in 2016, attaining a compound annual growth rate of 18.55%. These figures and a complete market analysis are available in IDC’s Worldwide Big Data Technology and Services 2012 – 2016 Forecast, which you can download here (free, no opt-in).

[Chart: Big Data technology and services market size]

  • Financial Services firms are projected to spend $6.4B on Big Data-related hardware, software and services in 2015, growing at a CAGR of 22% through 2020. Software and internet-related companies are projected to spend $2.8B in 2015, growing at a CAGR of 26% through 2020. These and other market forecasts and projections can be found in Bain & Company’s Insights Analysis, Big Data: The Organizational Challenge. An infographic of their research results is shown below.

[Infographic: Bain & Company, Big Data: The Organizational Challenge]

[Chart: potential payback of Big Data initiatives]
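
As referenced above, here is a short, generic CAGR calculation for sanity-checking the growth figures cited in this roundup. The formula is the standard compound-annual-growth-rate definition; the inputs are simply the start and end values quoted in the bullets.

```python
# Generic CAGR check for the forecasts above: (end / start) ** (1 / years) - 1.

def cagr(start, end, years):
    """Compound annual growth rate between two values over a number of years."""
    return (end / start) ** (1 / years) - 1

# IDC: Advanced and Predictive Analytics, $2.2B (2013) -> $3.4B (2018)
print(f"APA 2013-2018:      {cagr(2.2, 3.4, 5):.1%}")    # ~9.1%, near the cited 9.9%

# Redwood Capital: cloud BI, $0.75B (2013) -> $2.94B (2018)
print(f"Cloud BI 2013-2018: {cagr(0.75, 2.94, 5):.1%}")  # ~31.4%, matching the cited 31%
```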

Facebook and Google: The Race for the Next-Gen Communications Platform

In Part 1, I looked at what could be behind Facebook’s acquisition of WhatsApp and subsequent purchase of Oculus Rift. How are these seemingly different acquisitions related? There is no question that both Facebook and Google are in a race to build the next-generation computing/communications platform (after mobile) and the battle lines are being drawn between the over-the-top (OTT) players and telecommunications companies. After all, both Facebook and Google count on telcos to deliver services to their customer base. How will the WhatsApp and Oculus Rift acquisitions shape Facebook and impact the rest of the market including Google and mobile?
Facebook did not pay $19 billion for WhatsApp simply because it’s an SMS replacement or a Skype and Twitter competitor, or because it could grow its international subscriber base or attract customers among the coveted millennial demographic – although these reasons are icing on the cake. Facebook’s WhatsApp acquisition was a shot across the bow to telcos: we will be masters of our own destiny and not be reliant on you to reach our customers. With the addition of voice calling to WhatsApp, even more pressure falls on mobile providers, who are already feeling the heat from WhatsApp, which has cost them billions in lost SMS revenue. Now they have to worry about an even bigger impact on their bread-and-butter voice business if it follows a similar pricing-pressure trajectory. With a WhatsApp subscriber base already 450 million strong added to Facebook’s own customer base, Facebook has the ideal launch pad for the next-generation communications platform enabled by Oculus Rift. While Facebook and Google have similar business motivations (new subscribers and advertising revenue) and personal ones (securing a place in the history books) driving their overall strategies, their approaches are quite different.

read more

Cloud Storage Barriers and How to Bulldoze Through Them

Another week, another cloud storage price drop. The barrier of price is slowly melting away as the cloud storage wars rage and prices drop into the pennies per gigabyte per month. Not familiar with the price wars? Here’s an abridged version:
• In 2013, Amazon Web Services (AWS), Microsoft Azure, Google, and Rackspace dropped prices a combined 25 times.
• So far in 2014, AWS, Google and Azure have all continued to engage in price wars, dropping storage pricing more than 50 percent.
• New entrants like IBM SoftLayer recently got in the game with a price drop of up to 65 percent on storage.

read more

Cloudian Receives $24 Million in New Funding Round

Cloudian has announced that it has closed a $24 million financing round with new investors, INCJ and Fidelity Growth Partners, and existing Cloudian shareholders, including Intel Capital. All three funds actively seek well-positioned, high-growth, enterprise-focused companies for investment and targeted Cloudian as a key provider of hybrid cloud storage solutions. The funding will enable the company to extend its global sales and marketing reach through targeted programs and amplified market development.
“This substantial investment is strong validation of our unique and leading approach to enterprise on-premises storage and hybrid cloud storage,” said Michael Tso, CEO and co-founder of Cloudian. “The rapid growth of unstructured data is transforming the data storage landscape. With this funding, we will accelerate the deployment of our production-proven storage solutions and revolutionize the cost, scalability and availability models for storing unstructured data in the enterprise.”

read more

Unified IT Monitoring Software

up.time, from uptime software, monitors performance, availability and capacity across all servers, virtual machines, applications, IT services, and the network. It lets you proactively find IT system performance issues before they happen, report on total capacity, easily identify troublemakers, troubleshoot server and application problems fast, create and report on SLAs, and more.
It provides a unified IT dashboard that monitors and reports on all server and application health (performance, availability and capacity), plus an easily customizable GUI with drag-and-drop dashboard design. In minutes, you can create private dashboards, team dashboards (server team, application team, capacity team, etc.) and a NOC view for the entire datacenter.
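
As a generic illustration of the kind of threshold-based health check such a dashboard rolls up, here is a minimal sketch. The metric names and thresholds are illustrative assumptions, not up.time’s actual data model or API.

```python
# Generic sketch of a threshold-based health check, the kind of signal a unified
# monitoring dashboard aggregates. Metric names and thresholds are illustrative.

THRESHOLDS = {"cpu_pct": 90, "disk_used_pct": 85, "response_ms": 500}

def breaches(metrics):
    """Return (metric, value) pairs that exceed their alert thresholds."""
    return [(name, value) for name, value in metrics.items()
            if name in THRESHOLDS and value > THRESHOLDS[name]]

server = {"cpu_pct": 72, "disk_used_pct": 91, "response_ms": 180}
issues = breaches(server)
print("OK" if not issues else f"ALERT: {issues}")   # -> ALERT: [('disk_used_pct', 91)]
```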

read more