Category Archives: Data management

EMC outlines ‘Technical Debt’ challenges for data greedy enterprises


President of Core Technologies Division Guy Churchward (Left) and Jeremy Burton, President of Products and Marketing (Right) at EMC World 2016

Speaking at EMC World, President of Core Technologies Division at EMC Guy Churchward joined Jeremy Burton, President of Products and Marketing, to outline one of the industry’s primary challenges, technical debt.

The weight of technical debt is being felt by the new wave of IT professionals. This generation is under pressure from most areas of the business to innovate and create an agile, digitally enabled organization, but it still has commitments to the traditional IT systems on which the business currently operates. That commitment to legacy technologies, which can consume a significant proportion of a company’s IT budget and prevent future innovation, is what Churchward describes as technical debt.

“They know their business is transforming fast,” said Churchward. “Business has to use IT to make their organization a force to be reckoned with and remain competitive in the market, but all the money is taken up by the current IT systems. This is what we call technical debt. A lot of people have to do more with what they have and create innovation with a very limited budget. This is the first challenge for every organization.”

Churchward describes this technical debt as the first challenge every IT department will face when driving towards the modern data centre. It makes the business clunky and ineffective, but it is a necessity to keep the organization operating until the infrastructure can be modernized. Finding the budget to do so without compromising current operations can be tricky.

“When you live in an older house, where the layout doesn’t really work for the way you live your life and there aren’t enough closets to satiate your wife’s shoe fetish, maybe it’s time to modernize,” said Churchward on his blog. “But do you knock the whole house down and start again? Maybe it’s tempting, but what about the investment that you’ve already made in your home? It’s similar when you want to modernize your IT infrastructure. You have money sunk into your existing technology and you don’t want to face the disruption of completely starting again.

“For many companies, this debt includes a strategy for data storage that takes advantage of a shrinking per-gig cost of storage that enables them to keep everything. And that data is probably stored primarily on spinning disk with some high-availability workloads on flash in their primary data centre. The old way of doing things was to see volumes of data growing and address that on a point basis with more spinning disk. Data centres are bursting at the seams and it’s now time to modernize – but how?”

Churchward highlighted that the first step is to remove duplicate data sets – EMC launched its Enterprise Copy Data Management tool at EMC World this week – to reduce unnecessary spend within the data centres. While there are a number of reasons to duplicate and keep old data sets for a defined period of time, Churchward commented that this data is often forgotten and becomes an unnecessary expense. Although identifying and removing this data might seem a simple way to retire a portion of the technical debt, Churchward believes it could be a $50 billion business problem by 2018.

The Enterprise Copy Data Management software helps customers discover, automate and optimize copy data to reduce costs and streamline operations. The tool automatically identifies duplicate data sets within various data centres, and using data-driven decision making software, optimizes the storage plans, and in the necessary cases, deletes duplicate data sets.
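The article doesn’t describe how Enterprise Copy Data Management detects duplicates, but a common approach is content fingerprinting: hash each data set and group identical digests. A minimal, hypothetical sketch in Python (file-level hashing for illustration; production tools typically work at the block or object level against storage catalogs, not a filesystem walk):

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return a SHA-256 digest of a file's contents, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(root: Path) -> dict:
    """Group files under `root` by content hash; groups with more than
    one entry are byte-identical duplicates and candidates for removal."""
    groups: dict = {}
    for path in root.rglob("*"):
        if path.is_file():
            groups.setdefault(fingerprint(path), []).append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Hashing rather than byte-comparing every pair keeps the scan linear in the amount of data, which matters at data-centre scale.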

This is just one example of how the challenge of technical debt can be managed, though the team at EMC believes this challenge, the first in a series when transforming to a modern business, can be one of the largest. Whether this is one of the reasons cloud adoption within the mainstream market could be slower than anticipated remains to be seen, though the removal of redundant and/or duplicated data could free up some breathing room, and budget, for the journey towards the modern data centre.

Ctera now integrated with HP’s hybrid cloud manager

Ctera Networks says it has integrated its storage and data management systems with HP’s cloud service automation (HP CSA) as it seeks ways to simplify the management of enterprise file services across hybrid clouds.

The HP CSA ‘architecture’ now officially recognises and includes Ctera’s Enterprise File Services platform. The logic of the collaboration is that as the HP service helps companies build private and hybrid clouds they will need tighter data management in order to deliver new services to enterprise users, according to the vendors.

Ctera, which specialises in remote site storage, data protection, file synchronisation, file sharing and mobile collaboration services, has moved to make it easier to get those services on HP’s systems. According to Ctera, the new services can now be run on any organization’s HP CSA managed private or virtual private cloud infrastructure.

Enterprises that embrace the cloud need to modernise their file services and IT delivery models, according to Jeff Denworth, Ctera’s marketing SVP. The new addition of Ctera to HP CSA means they can easily manage file services from a single control point and quickly roll out the apps using a self-service portal, Denworth said.

“HP CSA helps IT managers become organisational heroes by accelerating the deployment of private and hybrid clouds and IT services,” said Denworth. The partnership with HP will result in a ‘broad suite’ of file services, increased agility and cheaper hybrid cloud services, according to Denworth.

The partnership should make things simpler for cloud managers, who are being forced to take on several roles, according to Atul Garg, HP’s general manager of cloud automation. “Today’s IT teams are becoming cloud services brokers, managing various products and services across hybrid environments and fundamentally changing how they deliver value to the broader organisation,” said Garg. Now file services can be deployed easily to tens of thousands of users, said Garg.

Storage Has Evolved – It Now Provides the Context & Management of Data

 

Information infrastructure takes storage, a fundamental part of any data center infrastructure, and puts context around it, adding value to what has typically been seen as a commodity item.

Bits in and of themselves have little value. Add context and assign value to that information, and it becomes an information infrastructure. Organizations should look to add value to their datacenter environments by leveraging the advanced technologies that have become part of the landscape: software-defined storage, solid-state storage, and cloud-based storage. Essentially, this is a new way to deliver an application data infrastructure in the datacenter.
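One concrete way to picture “adding context” on top of commodity capacity is a software-defined tiering policy that maps observed workload traits to flash, disk, or cloud tiers. The thresholds and tier names below are invented for illustration, not any vendor’s API:

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    iops_demand: int        # observed I/O operations per second
    days_since_access: int  # staleness of the data

def place(ds: DataSet) -> str:
    """Pick a tier from simple context: hot data to flash, cold data to
    cheap cloud capacity, everything else to spinning disk.
    Thresholds here are arbitrary examples."""
    if ds.iops_demand > 1000:
        return "flash"
    if ds.days_since_access > 90:
        return "cloud"
    return "disk"
```

The point is that the placement decision is driven by metadata about the data (its context), not by which array it happened to land on.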

Storage has evolved

 

http://www.youtube.com/watch?v=yzbwG0g-Y7c

Interested in learning more about the latest in storage technologies? Fill out this form and we’ll get back to you!

By Randy Weis, Practice Manager – Information Infrastructure

A Guide to Successful Big Data Adoption

By Randy Weis, Practice Manager, Data Management & Virtualization

In this video, storage expert Randy Weis talks about the impact big data is having on organizations and provides an outline for the correct approach companies should be taking in regards to big data analytics.

http://www.youtube.com/watch?v=jZ3V2ynOD44

What is your organization doing in regards to big data? Email us at socialmedia@greenpages.com if you would like to talk to Randy in more depth about big data, data management, storage, and more.

Why ParElastic Could Raise $5.7 Million

ParElastic Corporation has raised $5.7M in a Series A round financing led by General Catalyst Partners.  The company’s existing investors including Point Judith Capital, CommonAngels and LaunchCapital also participated in the round. The Series A brings ParElastic’s total financing to $8.7M. The ParElastic Database Virtualization Engine “dramatically increases the flexibility of your current relational database, improving performance and reliability while reducing storage and processing costs.”

Read more, watch a video, download a white paper here.

EMC Leads the Storage Market for a Reason

By Randy Weis, Consulting Architect, LogicsOne

There are reasons that EMC is a leader in the market. Is it because they come out first with the latest and greatest technological innovation? No, or at least not commonly. Is it because they rapidly turn over their old technology and do sweeping replacements of their product lines with the new stuff? No. It’s because there is significant investment in working through what will work commercially and what won’t and how to best integrate the stuff that passes that test into traditional storage technology and evolving product lines.

Storage Admins and Enterprise Datacenter Architects are notoriously conservative and resistant to change. It is purely economics that drives most of the change in datacenters, not the open source geeks (I mean that with respect), mad scientists and marketing wizards that are churning out & hyping revolutionary technology. The battle for market leadership and ever greater profits will always dominate the storage technology market. Why is anyone in business but to make money?

Our job as consulting technologists and architects is to match the technology with the business needs, not to deploy the cool stuff because we think it blows the doors off of the “old” stuff. I’d venture to say that most of the world’s data sits on regular spinning disk, and a very large chunk of that behind EMC disk. The shift to new technology will always be led by trailblazers and startups, people who can’t afford the traditional enterprise datacenter technology, people that accept the risk involved with new technology because the potential reward is great enough. Once the technology blender is done chewing up the weaker offerings, smart business oriented CIOs and IT directors will integrate the surviving innovations, leveraging proven manufacturers that have consistent support and financial history.

Those manufacturers that cling to the old ways of doing business (think enterprise software licensing models) are doomed to see ever-diminishing returns until they are blown apart into more nimble and creative fragments that can then begin to re-invent themselves into more relevant, yet reliable, technology vendors. EMC has avoided the problems that have plagued other vendors and continued to evolve and grow, although they will never make everyone happy (I don’t think they are trying to!). HP has had many ups and downs, and perhaps more downs, due to a lack of consistent leadership and vision. Are they on the right track with 3PAR? It is a heck of a lot more likely than it was before the acquisition, but they need to get a few miles behind them to prove that they will continue to innovate and support the technology while delivering business value, continued development and excellent post-sales support. Dell’s investments in Compellent, particularly, bode very well for the re-invention of the commodity manufacturer into a true enterprise solution provider and manufacturer. The Compellent technology, revolutionary and “risky” a few years ago, is proving to be a very solid technology that innovates while providing proven business value. Thank goodness for choices and competition! EMC is better because they take the success of their competitors at HP and Dell seriously.

If I were starting up a company now, using Kickstarter or other venture investment capital, I would choose the new products, the brand new storage or software that promises the same performance and reliability as the enterprise products at a much lower cost, knowing that I am exposed to these risks:

  • the company may not last long (poor management, acts of God, fickle investors),
  • the support might, frankly, suck, or
  • engineering development will diminish as the vendor’s investors wait for an acquisition and a quick payoff.

Meanwhile, large commercial organizations are starting to adopt cloud, flash and virtualization technologies precisely for all the above reasons. Their leadership needs to drive efficiency into datacenter technologies to increase speed to market and improve profitability. As the bleeding edge becomes the smart bet, brought to market by the market-leading vendors, we will continue to see success where business value and innovation intersect.

Actifio Announces PAS 5.0, Radically Simple Copy Data Management

Actifio today launched PAS 5.0, a major platform upgrade that extends the company’s signature capability – the ability to recover any application instantly for up to 90 percent less total cost of ownership (TCO) – to large scale enterprises and cloud service providers. This new version of its Protection and Availability Storage (PAS) platform – a proven virtualized storage solution with more than 100 users worldwide – addresses more deployment scenarios by delivering new cloud-based services including multi-tenancy, networking optimization and reporting capabilities. It also offers improved workflow to accelerate application development lifecycles, now including Oracle, with efficient self-service database cloning to lower test/development storage costs across large enterprises.

PAS has disrupted the enterprise storage industry by transforming its underlying economics with a purpose-built system optimized for the real driver of today’s data storage explosion: copy data. By focusing on smarter and more efficient copy data management – where it is not unusual for a business to maintain 13-120 redundant copies of production data – Actifio provides faster recovery and more reliable data protection for up to 90 percent less cost than traditional backup and recovery point tools.

“The copy data explosion is creating significant problems for the enterprise and the reason is that a whole slew of traditional, expensive and siloed data backup and recovery applications can’t instantly find, manage and protect ever-increasing data assets,” said Ash Ashutosh, founder and CEO of Actifio. “With PAS 5.0, enterprises can create and maintain a single copy of everything in their production environment, eliminating the need for multiple redundant copies. It also recovers data within seconds, which is more important than ever in mission-critical environments.”

Through Actifio’s radical simplification of the copy data management problem, organizations are freed to reinvest what is often millions of dollars in annually recurring expense into more strategic IT initiatives that drive growth and innovation. PAS users – including Boston University Medical Campus; Audax Group; NaviSite, a Time Warner Cable Company; City of South Portland; and Jones & Bartlett Learning – are already realizing such savings and putting those resources to better use inside their own organizations.

“Finding ways to extend our current IT investments into new areas can make a good technology decision a great technology decision,” said Erik Dubovik, vice president of IT, Audax Group. “This is exactly the case with PAS 5.0 because it lets me simplify copy data management for more of our environment in addition to lowering test and development storage costs by 95 percent.”

“Our work in the fight against disease is essential but it generates unwieldy ‘Big Data’ that’s hard to store, share and protect using conventional tools,” said Dr. John Meyers, assistant professor of medicine and director of technology for the Department of Medicine at Boston University School of Medicine. “PAS 5.0 will let us recover anything, physical or virtual, across our entire data center – instantly – with a single solution with 80 percent less dedicated backup storage. For us, that will save time, money and trouble – and allow us to focus on the advancement of science.”

This new release coincides with the accelerating worldwide adoption of PAS, which has been installed by hundreds of customers and sold by more than 120 value added resellers worldwide. This momentum has enabled Actifio to double revenues for seven consecutive quarters and expand in key regions across Europe, Asia/Pacific and the Middle East. It has also resulted in tremendous industry-wide acclaim for Actifio, which was recently chosen as one of Gartner’s Cool Vendors in Storage Technologies for 2012.

The new PAS 5.0 supports larger deployments and more diverse enterprises while accelerating the test and development process in Oracle environments. Actifio has significantly bolstered the scalability and performance of PAS 5.0 with these powerful new features:

  • 200 percent dedupe capacity increase
  • 2x increase in throughput
  • 10x bandwidth increase with 10GbE
  • 75 percent network bandwidth savings with DeDup Async™ replication
  • Faster on-boarding of remote applications via portable storage
  • Actifio Enterprise Manager – a new software capability that enables
    simple management of large scale PAS deployments
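Actifio doesn’t publish the mechanics behind the DeDup Async bandwidth figure, but savings of this kind typically come from deduplicated replication: split data into fixed blocks and ship only the blocks the target hasn’t already stored. A toy sketch under that assumption (the block size, manifest format, and `send` transport are all invented for illustration):

```python
import hashlib

BLOCK = 4096  # illustrative fixed block size

def split_blocks(data: bytes):
    for i in range(0, len(data), BLOCK):
        yield data[i:i + BLOCK]

def replicate(data: bytes, remote_hashes: set, send) -> tuple:
    """Ship only blocks the remote has not already stored.

    `remote_hashes` is the set of block digests the target already holds;
    `send` is a callable that transmits (digest, payload) pairs.
    Returns the ordered manifest of digests plus how many blocks went
    over the wire."""
    manifest = []
    sent = 0
    for block in split_blocks(data):
        h = hashlib.sha256(block).hexdigest()
        manifest.append(h)
        if h not in remote_hashes:
            send(h, block)
            remote_hashes.add(h)
            sent += 1
    return manifest, sent
```

The manifest alone is enough for the target to reassemble the data, which is why repeated or self-similar data streams cost almost no bandwidth after the first pass.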

PAS 5.0 gives developers and IT teams copies of production data within minutes to dramatically hasten the arduous testing process. Within Oracle Test and Development environments, PAS 5.0 eliminates more than 95 percent of storage costs by creating multiple copies at a fraction of the footprint. Actifio now enables immediate self-service database cloning – taking less than 15 minutes to clone a five terabyte database when it previously took several days using legacy approaches.
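Near-instant cloning of a multi-terabyte database is usually achieved with copy-on-write: the clone shares the parent’s blocks and stores only what it overwrites. A deliberately simplified model of the idea (not Actifio’s implementation; a real system would share the block map itself rather than copying it, and reference-count blocks for space reclamation):

```python
class Volume:
    """A toy block volume whose clones share data blocks until written."""

    def __init__(self, blocks=None):
        self.blocks = dict(blocks or {})  # block number -> immutable bytes

    def clone(self) -> "Volume":
        # A clone copies only the block *map*, not the data itself, which
        # is why cloning is fast regardless of how much data is mapped.
        return Volume(self.blocks)

    def write(self, n: int, data: bytes):
        # Writes land in this volume's own map; the parent is untouched.
        self.blocks[n] = data

    def read(self, n: int) -> bytes:
        return self.blocks[n]
```

This is why a test/dev clone consumes almost no extra storage until developers start changing data, which is the economic claim behind the 95 percent figure above.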

For cloud service providers, PAS 5.0 offers a wealth of public and private cloud data management services including backup, remote backup, and test and development. It also provides several new, in-demand features including DeDup Async, secure multi-tenancy and the Actifio Enterprise Reporter.

This new version of PAS now supports a wider variety of environments – including all file system data on Windows and Linux including DAS, NAS and SAN. It provides comprehensive protection and availability for heterogeneous networked and direct-attached server environments, supporting a range of in-band, out-of-band, physical, VMware, structured and unstructured environments.

Additional Multimedia Assets

  • [Video] – Solving the Big Data Problem with Actifio
  • [Video] – Boston University: Discovering Effective Big Data Management with Actifio
  • [Video] – NaviSite: Delivering the Economic Promise of Cloud Backup with Actifio
  • [Video] – Accelerated Oracle App Development, Protection & Delivery with Actifio PAS 5.0

PAS 5.0 will be available the week of June 18.  Contact info@actifio.com for more information.


D & B and GlobalSoft Partner with Informatica to Power-Charge Customer Master Data


D&B announced that GlobalSoft, a global software consultancy, and D&B have partnered to launch D&B360 for Informatica MDM. This marks the first time that D&B has used its Data-as-a-Service (DaaS) solution to integrate its unequaled database with an MDM solution, enabling companies to automatically update and streamline their data across the enterprise to increase productivity, reduce costs and maximize ROI.

“With the increasing volume, complexity and volatility of data, it has become a struggle for companies to not only effectively manage their data, but to gain the insight necessary to act upon that data to drive real business change,” said Mike Sabin, senior vice president of Sales and Marketing Solutions at D&B. “D&B360 for Informatica MDM helps companies get ahead by delivering timely business data with actionable perspective to provide a competitive advantage.”

D&B360 for Informatica MDM provides true data stewardship and entry-point integration, making it easier to address data consistency and quality issues. In addition, D&B360 for Informatica MDM automatically embeds the industry-standard D&B DUNS® Number and all related business data via the cloud, ensuring users have timely access to a current and complete source of data, enabling them to quickly synchronize information across other business systems.

“At the recent Gartner MDM Summit, a common theme on the minds of attendees was the need to evolve current MDM tools to support the ‘day in the life of the data steward’,” said Andrew White, research vice president and agenda manager for MDM and Analytics, Gartner. “An MDM solution that couples business information like credit rankings and revenue performance can help IT leaders deliver a compelling business case for the value a holistic data view offers to company executives.”

Specific customer benefits include:

  • Faster and easier data management: D&B360 for Informatica MDM
    can be used to match, group and master records more quickly.
  • Improved data stewardship: Timely matching service can
    automatically route records that meet specified criteria for more
    effective and efficient distribution of information.
  • Entry Point Integration: D&B360 for Informatica MDM Adapter
    eliminates data quality issues and duplicate challenges at the entry
    point, to dramatically enhance data quality, decrease MDM processing
    time, improve decision making and drive lower total cost of ownership.
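The adapter’s entry-point matching details aren’t public, but matching an incoming record against a mastered identity typically starts with name normalization before any fuzzy scoring. A deliberately simplified sketch (the suffix list and master-key scheme are invented; D&B’s actual matching against DUNS-numbered entities is far more sophisticated):

```python
import re
from typing import Optional

# Hypothetical corporate-suffix stop list -- illustrative only.
SUFFIXES = {"inc", "incorporated", "corp", "corporation", "llc", "ltd", "co"}

def normalize(name: str) -> str:
    """Canonicalize a company name so trivial variants compare equal:
    lowercase, strip punctuation, drop legal-form suffixes."""
    tokens = re.findall(r"[a-z0-9]+", name.lower())
    return " ".join(t for t in tokens if t not in SUFFIXES)

def match_at_entry(record: dict, master: dict) -> Optional[str]:
    """Look up an incoming record's master identifier (a DUNS-style id in
    this toy model) so duplicates never enter the system."""
    return master.get(normalize(record["name"]))
```

Catching the match at the point of entry, as the bullet above describes, is cheaper than deduplicating records after they have fanned out across downstream systems.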

“Together, as we expand our long-standing relationship with D&B, we continue to help our customers and the industry improve and unlock the true value of their data,” said Dennis Moore, senior vice president and general manager, MDM, Informatica. “Our combined offering allows companies to leverage their data more effectively to make better business decisions that benefit their customers – and the bottom line.”

D&B360 for Informatica MDM will be available in the United States beginning in May 2012. For additional information about D&B360 DaaS Solution for Informatica MDM, please visit: www.dnb.com.