Providing the data needed for application development and testing is a huge headache for most organizations. The problems are often the same across companies: speed, quality, cost, and control. Provisioning data can take days or weeks every time a refresh is required. Using dummy data leads to quality problems. Creating physical copies of large data sets and shipping them to distributed development teams eats up expensive storage and bandwidth. And all of these copies proliferating across the organization can lead to inconsistent masking and exposure of sensitive data.
But some organizations are adopting a new method of data management for DevOps that is delivering transformational business outcomes: faster time to market, lower costs, and greater control.
In his session at DevOps Summit, Brian Reagan, Managing Director of Blackthorne Consulting Group, an Actifio company, reviewed the core concepts of using data virtualization to power DevOps, including Central Administration by Operations, Self-Service for developers and testers, and Automated Data Masking for enhanced control of sensitive information.
He shared real-life case studies and offered a practical perspective on the benefits and considerations of these projects.
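To make the automated data masking concept concrete, the following is a minimal sketch in Python of a masking step that could run automatically when a virtual copy of production data is provisioned for a dev/test environment. The field names, salt, and masking rules are illustrative assumptions, not Actifio's implementation or API.

import hashlib

# Hypothetical example: fields treated as sensitive in this sketch.
SENSITIVE_FIELDS = {"ssn", "email", "credit_card"}

def mask_value(value: str, salt: str = "dev-env-salt") -> str:
    """Replace a sensitive value with a deterministic, irreversible token.

    Deterministic masking preserves referential integrity across tables
    (the same input always maps to the same token) while hiding the original.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]

def mask_record(record: dict) -> dict:
    """Mask every sensitive field in a record before it is exposed to
    developers or testers."""
    return {
        key: mask_value(str(val)) if key in SENSITIVE_FIELDS else val
        for key, val in record.items()
    }

if __name__ == "__main__":
    row = {"id": 42, "email": "jane@example.com", "ssn": "123-45-6789", "plan": "pro"}
    print(mask_record(row))

Applying a rule like this centrally, at provisioning time, is what keeps masking consistent across every copy handed to a distributed team, rather than leaving it to each developer to sanitize data after the fact.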