There is no doubt that Big Data is here and growing bigger every day. Building a Big Data infrastructure today is no easy task: there is an enormous number of database engines and technologies to choose from. To make things even more challenging, requirements are becoming more sophisticated, and the standard paradigm of supporting historical analytics queries is often just one facet of what is needed. As Big Data continues to grow, organizations are demanding real-time access to data, allowing immediate and actionable interpretation of events as they happen. Another challenge is delivering data in a meaningful way, one that gives end users what they need to maintain a competitive position in fast-changing markets.
What is needed is an agile approach to Big Data, one that affords architects and developers the freedom to rapidly support a wide variety of changing business requirements – without a full "reset" of the Big Data infrastructure and toolset.