DevOps and microservices will be huge for business – but many orgs are nowhere near it yet

Mind the gap please: according to new research from the Ponemon Institute, the gap between organisations’ ideal DevOps and microservices capabilities and what they are actually able to deliver is costing enterprises on average $34 million per year.

The study, which was sponsored by hybrid cloud management provider Embotics and which polled more than 600 cloud management professionals, found three quarters (74%) said DevOps enablement capabilities were either ‘essential’, ‘very important’, or ‘important’ to their organisations. Four in five (80%) said microservices were essential to important. Yet only a third said their company had the ability to push through those capabilities.

Ultimately, the root of the problem is how organisations are struggling to cope with the way employees consume cloud resources. Almost half (46%) of those polled said their company was ‘cloud direct’ – employees bypassing IT and going straight to AWS, Azure et al through native APIs or their own public cloud accounts.

This leads to issues around visibility and management. 70% of respondents said they have no visibility at all into the purpose or ownership of the VMs in their cloud environment, while a similar number (66%) said they were ‘constantly challenged’ with the management and tracking of assets in their cloud ecosystem.

The solution? DevOps, DevOps, DevOps, as a former Microsoft chief executive might have put it. The report puts this under the banner of CMP (Cloud Management Platform) 2.0 – a new era of hybrid cloud management. Almost three quarters (71%) of those polled say they have adopted or are planning to adopt DevOps methodologies, with the majority saying it will improve project quality (69%), delivery scheduling (61%), and budgeting (60%).

This is by no means the only study in recent weeks which has come to this conclusion. According to Puppet’s most recent note, issued earlier this month, there was not only a clear difference between high and low performers but a significant gap between lower performers depending on industry.

“To enable true digital business process transformation, enterprises need to find a way to bridge the gap between the speed and agility developers need and the control and governance required by the IT organisation,” said Larry Ponemon, chairman and founder of the Ponemon Institute in a statement. “The report shows that this isn’t happening with current cloud management strategies.”

You can read the full report here (email required).

Privacy Shield should be suspended, say MEPs


Joe Curtis

13 Jun, 2018

Privacy Shield, the agreement underpinning data transfers from Europe to America, must be suspended if the US does not fully meet its obligations come 1 September, MEPs voted last night.

The Civil Liberties (Libe) Committee’s decision isn’t binding, but puts pressure on the European Commission to ensure the data-transfer arrangement is being honoured by US authorities.

Drawn up to replace the abandoned Safe Harbor agreement in 2016 after that deal was brought down by a legal challenge, Privacy Shield now faces obstacles of its own.

The agreement was designed to extend EU-equivalent data protection to European residents’ data transferred to the US, but the Libe Committee now believes it, too, fails to provide that protection.

“While progress has been made to improve on the Safe Harbor agreement, the Privacy Shield in its current form does not provide the adequate level of protection required by EU data protection law and the EU Charter,” said Libe Committee member Claude Moraes.

“It is therefore up to the US authorities to effectively follow the terms of the agreement and for the Commission to take measures to ensure that it will fully comply with the GDPR.”

MEPs raised their concerns following the Cambridge Analytica scandal affecting Facebook, about which the social network’s CEO, Mark Zuckerberg, gave evidence to the European Parliament recently.

While the incident pre-dates Privacy Shield, the committee was concerned about US authorities’ ability to monitor US firms’ compliance with the agreement, given that both Cambridge Analytica’s affiliate company, SCL Elections, and Facebook are still listed on Privacy Shield.

MEPs also said they’re concerned about a new US law called the CLOUD Act, which gives US authorities the right to access data stored in foreign locations, as long as the organisation storing it is American.

The Libe Committee said this could clash with EU law on data protection.

Salesforce plans to pump £1.9 billion into UK initiatives


Dale Walker

13 Jun, 2018

Prime Minister Theresa May is set to announce today a slew of new commitments aimed at boosting the UK’s tech industry, including access to a £2.5 billion government investment pot.

The announcements coincide with London Tech Week, which will be the setting for a number of government-led roundtable events with leading technology firms looking to invest in the industry.

CRM giant Salesforce is expected to commit $2.5 billion (£1.87 billion) to the UK market over the next five years, which includes the building of a second data centre.

A further £300 million will come from the UAE-based Mubadala Investment Company in the form of a European investment fund, and an additional £41 million from Japanese firm NTT Data as part of its expansion into the UK.

The government’s own £2.5 billion commitment will be in the form of a British Patient Capital programme that aims to support businesses with high growth potential with access to long-term funding. This fund is also expected to be supported by a further £5 billion in private investment.

“The measures we are announcing today will allow innovative British startups to invest in their future – and in the UK – by hiring more skilled people, expanding their business and exporting their expertise across the world,” said May.

“It’s a great time to be in tech in the UK, and our modern Industrial Strategy will drive continued investment, ensuring the nation flourishes in the industries of the future and creating more high-paying jobs.”

In an effort to make it easier for businesses to source overseas talent, the Prime Minister will also scrap the current graduate visa programme and replace it with a ‘startup visa’, a streamlined route that will be available to foreign entrepreneurs when it launches in spring 2019.

“Britain is a digital dynamo with the government and tech sector working together to help make this country the best place in the world to start and grow a digital business,” said culture secretary Matt Hancock. “We’re encouraging the best and brightest tech talent to come to the UK and creating the right conditions for our high growth digital businesses to thrive.”

The government has also committed to the building of two new tech hubs in Brazil and South Africa, designed to encourage the development of digital skills in the regions and foster greater relationships with UK businesses.

NTT Data UK CEO Simon Williams said today’s announcement “shines an important light on the UK technology sector and the incredible talent emerging across the industry”.

Commenting on the company’s £41 million investment, he added: “NTT Data has a proud history of investment and innovation in the UK, which is one of the most competitive markets in the world.

“Companies like NTT Data recognise that by investing and succeeding in the UK, we are in a very strong position to succeed in other markets around the world.”

The UK technology industry attracted $7.8 billion in funding, almost double that of 2016 and $1.8 billion more than France and Germany, according to government figures.

Antony Walker, deputy CEO of trade industry body techUK, said: “This is another vote of confidence in the UK tech sector. The billions of pounds of investment and thousands of new jobs shows that the UK remains a global hub for tech.

“The government is clearly determined not to abandon the playing field to France and others when it comes to presenting a strong offering to tech entrepreneurs and investors. The Industrial Strategy has been very positive for tech. The challenge is to build on these strong foundations. We need to digitise our economy, grow our domestic digital market and identify new export opportunities.”

Six key data strategy considerations for your cloud-native transformation

Many organizations are making the move to cloud-native platforms as their strategy for digital transformation. Cloud native allows companies to deliver fast-responding, user-friendly applications with greater agility. However, the architecture of the data in support of cloud-native transformation is often ignored in the hope that it will take care of itself.

With data becoming the information currency of every organization, how do you avoid the data mistakes commonly made during this cloud transformation journey? What data questions should you ask when building cloud-native applications? How can you gain valuable insight from your data?

The ensuing discussion covers six key considerations companies must address when making the transition to cloud native.

Farewell, service oriented architecture (SOA) – welcome, microservices!

While many legacy applications are still SOA based, the architectural mindset has changed and microservices have gained much popularity. Rather than architecting monolithic applications, developers can achieve many benefits by creating many independent ‘services’ that work in concert. A microservice architecture delivers greater agility in application development and simpler codebases; services can be updated and scaled in isolation, written in different languages, and connected to different data tiers and platforms of choice. This strategy allows developers and operators to work together in a much more harmonious way. Such a componentized architecture demands a database platform that can support different data types, structures and programming languages with ease.
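
To make the idea concrete, here is a minimal sketch of a single-responsibility service that owns its own datastore and is reached only over HTTP. It uses just the Python standard library; the service name, port and in-memory ‘datastore’ are illustrative assumptions, not a reference implementation.

```python
# Minimal sketch of a single-responsibility microservice using only the
# Python standard library. The port, route and in-memory "datastore"
# are placeholders for illustration.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Each microservice owns its own datastore; a dict stands in for a
# dedicated database instance here.
ORDERS = {"42": {"status": "shipped"}}

class OrderService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /orders/<id>
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "orders" and parts[1] in ORDERS:
            body = json.dumps(ORDERS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Other services call this one over HTTP rather than sharing its schema.
    HTTPServer(("0.0.0.0", 8080), OrderService).serve_forever()
```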

12-factor app and cloud-native microservices

The Twelve-Factor App is a set of rules and guidelines for helping organizations build cloud native applications. It serves as an excellent starting point, but when it comes to data platforms, a couple of factors (#4 and #5) need further examination.

#4 – Treat backing services as attached resources: Backing services here refer, for the most part, to databases and datastores. This means that microservices demand dedicated, single ownership of the schema and the underlying datastore.

#5 – Strictly separate build and run stages: Separating the build and run stages means the application should execute as one or more stateless processes, with state offloaded onto the backing service. This further implies that the databases and datastores are expected to be stateful services.
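
A short, hedged sketch of what factors #4 and #5 can look like in practice: the datastore is attached purely through configuration, and the process itself keeps no state between requests. The ORDERS_DB environment variable and the SQLite file are illustrative placeholders, not a prescribed setup.

```python
# Sketch of factors #4 and #5: the backing service is attached via config
# only, and the process keeps no state of its own.
import os
import sqlite3

def get_connection():
    # Factor #4: swapping a local file for a managed database means
    # changing this variable, not the code. ORDERS_DB is a hypothetical
    # variable name chosen for the example.
    db_path = os.environ.get("ORDERS_DB", "orders.db")
    return sqlite3.connect(db_path)

def record_order(order_id: str, status: str) -> None:
    # Factor #5 and the stateless-process rule that follows from it:
    # nothing is cached in the process; every request round-trips to the
    # stateful backing service.
    with get_connection() as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, status TEXT)"
        )
        conn.execute(
            "INSERT OR REPLACE INTO orders VALUES (?, ?)", (order_id, status)
        )

if __name__ == "__main__":
    record_order("42", "shipped")
```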

Continuous integration/continuous delivery

The proliferation of service processes, where each service is deployable independently, requires automated mechanisms for deployment and rollback – continuous integration and continuous delivery (CI/CD). In reality, the value of microservices cannot be fully realized without a mature CI/CD capability to go along with them. Note that such a transient architecture means the database instances will also be ephemeral, and they must be able to spin up and spin down easily on demand.
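
As a rough illustration of that ephemerality, the sketch below spins up a throwaway Postgres container for an integration-test stage and tears it down when the stage finishes. It assumes the docker CLI is available on the build agent; the image tag, port and password are placeholder choices for the example.

```python
# Minimal sketch of an ephemeral database for a CI test stage: spin up a
# disposable Postgres container, hand back its connection string, and
# remove it when the stage is done. Assumes the docker CLI is installed.
import contextlib
import subprocess
import time

@contextlib.contextmanager
def ephemeral_postgres(port: int = 5433):
    container_id = subprocess.check_output(
        [
            "docker", "run", "-d", "--rm",
            "-e", "POSTGRES_PASSWORD=ci-only",
            "-p", f"{port}:5432",
            "postgres:15",
        ],
        text=True,
    ).strip()
    try:
        time.sleep(5)  # crude wait; a real pipeline would poll pg_isready
        yield f"postgresql://postgres:ci-only@localhost:{port}/postgres"
    finally:
        # Spin the instance back down as soon as the stage finishes.
        subprocess.run(["docker", "rm", "-f", container_id], check=False)

if __name__ == "__main__":
    with ephemeral_postgres() as dsn:
        print("run integration tests against", dsn)
```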

With the right cloud-native platform and supporting data platform, microservices become easily deployable. The cloud-native platform should handle the management of the services running on it, while your database should handle data scaling and monitoring, adding shards, rebalancing, re-sharding, or failing over when necessary. The combined database and cloud-native solution offloads the operational burden of monitoring the database and the platform, allowing companies to spend more time developing and deploying quality software.

The importance of a multi-cloud deployment model

Enterprises today are adopting a multi-cloud strategy for several reasons: to prepare for disaster recovery situations, to take advantage of the pricing differences between hosting applications on different cloud infrastructures, for enhanced security, or simply to avoid vendor lock-in. (Who isn’t wary of a powerful behemoth taking over the world?)

Your application code should be independent of the platform it’s expected to run on.

Monoliths versus non-monoliths

Traditional approaches to data access and data movement are prohibitively slow. Legacy approaches involved replicating data from the primary datastore into other operational datastores and data warehouses/data lakes, where data is updated only after many hours or days, typically in batches. As organizations adopt microservices and the associated design patterns, such delays in data movement across different types of datastores impede agility and prevent organizations from forging ahead with their business plans.

Incrementally migrating a monolithic application to the microservices architecture typically occurs with the adoption of the strangler pattern, gradually replacing specific pieces of functionality with new applications and services. This means that the associated datastores also need to be compartmentalized and componentized, further implying that each microservice can have its own associated datastore/database.

From the data perspective this means:

  • The number of database instances increases with each microservice – again pointing back to spinning up/down on demand
  • For these microservices to communicate with each other, additional HTTP calls – typically over a convenient REST API – are needed, demanding flexible extensibility across any platform and language. In many cases microservices simply publish events indicating changes, and listeners/subscribers update the associated applications (see the sketch after this list).
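
To illustrate that event-driven style, here is a toy sketch in which one service publishes a change event and another reacts to it. An in-memory bus stands in for whatever broker (Kafka, RabbitMQ, a cloud pub/sub service) a real deployment would use; the topic name and the billing listener are hypothetical.

```python
# Toy sketch of event-driven communication between services: a publisher
# emits a change event and subscribers react to it. An in-memory bus is
# used purely for illustration.
from collections import defaultdict
from typing import Callable, Dict, List

class EventBus:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()

# The billing service keeps its own view of orders up to date by listening,
# rather than querying the order service's database directly.
def billing_listener(event: dict) -> None:
    print(f"billing: invoicing order {event['order_id']}")

bus.subscribe("order.created", billing_listener)
bus.publish("order.created", {"order_id": "42", "total": 19.99})
```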

The fundamental requirements of a cloud-native database

High Performance: Back in the day, sub-millisecond response times were reserved for a few specialty applications. But, in today’s world of the microservices architecture, this is a must-have requirement for all applications. This latency requirement necessitates the highest-performance, most scalable database solution available.

Active-Active Data Replication: Data replication in batch mode used to be a popular approach, but for real-time applications, replication with an event store and event sourcing is gaining far more traction. Microservices apps that are loosely coupled yet need to share data require active/active data replication with tunable consistency (a toy quorum sketch follows the list below). Customers employ active/active deployment models for many reasons, such as:

  • Shared datasets among microservices that are being continually updated
  • Seamless migration of data across datacenters so user experience is not impacted
  • Mitigating failure scenarios by failing over to a second datacenter to minimize downtime
  • Handling high volumes of incoming traffic and distributing load across multiple servers with seamless syncs
  • Geographically distributed applications (like a multiplayer game or a real-time bidding/polling application) where data needs to be in sync across geos
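
The sketch below illustrates what ‘tunable consistency’ means in such a setup, using a toy quorum model: with N replicas, a write is acknowledged by W of them and a read consults R, and choosing R + W > N lets reads see the latest write. It is purely illustrative; the class and parameter names are invented for the example, and real systems add vector clocks, hinted handoff, anti-entropy repair and more.

```python
# Toy illustration of tunable consistency with quorum reads and writes.
import time
from typing import Optional

class Replica:
    def __init__(self) -> None:
        self.value: Optional[str] = None
        self.timestamp = 0.0

class QuorumStore:
    def __init__(self, n: int = 3, w: int = 2, r: int = 2) -> None:
        self.replicas = [Replica() for _ in range(n)]
        self.w, self.r = w, r

    def write(self, value: str) -> None:
        ts = time.time()
        # Acknowledge after W replicas accept the write (here: the first W).
        for replica in self.replicas[: self.w]:
            replica.value, replica.timestamp = value, ts

    def read(self) -> Optional[str]:
        # Consult R replicas and return the freshest version seen.
        sampled = self.replicas[-self.r:]
        latest = max(sampled, key=lambda rep: rep.timestamp)
        return latest.value

store = QuorumStore(n=3, w=2, r=2)  # R + W > N, so reads see the last write
store.write("eu-west profile update")
print(store.read())
```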

High Availability of Data: When you break a big monolithic application into microservices, each with its own lifecycle, how do you ensure data availability? The cloud-native app developer should choose the datastore based on the Recovery Point Objective (how much data can I afford to lose), the Recovery Time Objective (when a failure occurs, how long will it take for the service to come back), high-availability characteristics, installation topology and failover strategy. Single-node database instances hurt availability not just in failure scenarios but also during planned events such as version upgrades.

High availability requirements are typically dependent on the criticality of applications, but the combination of the right database and cloud native solution supports various HA installation strategies for a range of use cases, from internal to mission-critical applications.

Dynatrace to Exhibit at @CloudEXPO NY | @Dynatrace @DevOpsSUMMIT #Agile #DevOps #Serverless #CloudNative

Dynatrace is an application performance management software company with products for the information technology departments and digital business owners of medium and large businesses.

Building the Future of Monitoring with Artificial Intelligence

Today we can collect lots and lots of performance data. We build beautiful dashboards and even have fancy query languages to access and transform the data. Still, performance data is a secret language only a handful of people understand. The more digital a business becomes, the more stakeholders are interested in this data, including how it relates to the business. Some of these people have never used a monitoring tool before. They have a question on their mind like “How is my application doing?” but no idea how to get a proper answer.

Parallels Mac Management 7 Adds the Last Missing Piece to the Mac Management Puzzle

I’m excited to announce that Parallels® Mac Management 7 for Microsoft® SCCM is now available. Based on customer feedback, Parallels is delivering new features and enhancements including Internet-based Mac® client management, non-OSD task sequences, and simplified macOS® imaging. What’s in Parallels Mac Management 7? The feedback from our 7 beta testers was fantastic, and we […]

Google Cloud cameo steals the show at Cisco Live with partnership update top of the agenda

It’s a mad, mad, multi-cloud world all right. At Cisco Live US in Florida yesterday – the company’s flagship jamboree of all things networking – CEO Chuck Robbins was all but upstaged by Google Cloud chief Diane Greene.

Robbins is more than happy to let the great and good share the stage with him at Cisco’s events – Apple CEO Tim Cook appeared at Cisco Live Las Vegas last year to discuss securing the mobile workforce. It is also a testament to the strength of partnerships at this level from both sides; Apple and Google have plenty of consumer clout, but Cisco’s presence in the enterprise market is vital too.

Yet it was also telling that Robbins mentioned Cisco’s Catalyst 9000 platform after Greene’s cameo. The Catalyst 9000 is, in the words of the CEO, ‘the fastest ramping product in the history of Cisco’. But first, an update on the company’s partnership with Google Cloud – in which the word Kubernetes was said rather a lot.

Robbins asked the audience of Cisco customers and partners how many were testing, piloting, or generally getting a feel for Kubernetes today. The applause which came back suggested a positive uptake. Google and Cisco’s partnership, launched in October last year, aims to give Cisco customers the same experience when running Kubernetes applications either on-premise or in Google Kubernetes Engine (GKE). Or in other words, to enable organisations to tackle their cloud journeys at their own pace.

“It’s helping all of you keep doing what you’re doing – keep innovating, and non-disruptively keep disrupting what your company’s doing,” said Greene. “You can’t just rewrite your applications and move to a new environment, and so what we’re doing here is bringing you Kubernetes containers, and then you can let your application developers concentrate on what they’re doing.”

Greene said there were four stakeholder bases who would see benefit from the partnership; engineers, developers, ops, and security. “For engineers, being able to take this incremental approach, this non-disruptive way to keep disrupting what you’re capable of doing in a fast-moving company – that’s one huge advantage,” said Greene. “It really modernises the developer environment.

“I’ve been involved in software development for a long, long time, and I really think these modern technologies are almost giving a 10x productivity improvement,” added Greene. “The Kubernetes environment and Istio is just taking care of a lot of things that developers used to have to worry about. Now they can focus more on the business of the company.

“For the ops folks, it gives you a consistent environment that you can monitor,” Greene continued. “Istio’s going to be really powerful there. For security, to have one consistent model across everywhere that you’re running, that’s huge – and it’s really good for the developers because you don’t have these lowest common denominator rules that can get in the way of innovation.”

Perhaps talk of upstaging is a little harsh. Cisco’s vision is around how its network architecture underpins the innovation and partnerships taking place. “The reality, I believe, is it’s this architecture that brings together automation, security, analytics,” said Robbins. “For me, that’s what’s made the big difference – because you all understand how this can change the operational paradigm in your organisations and allow you to focus on other strategic things.”

Robbins touched on the importance of emerging technologies and their effects on the business in his opening salvo. “This is going to define how we think about the network’s next act: what does it have to do?” asked Robbins. “When you think about the complexity of the world you’re operating in now – which candidly is more complex than it was even four years ago – and then you introduce these new tech changes that bring incredible capabilities.

“Artificial intelligence, augmented reality, machine learning – you think about the requirements, and what you’re being asked to do by the business,” added Robbins. “Your business leaders in your organisation actually don’t care about the technology. They care deeply about the outcome that technology can deliver. They care deeply about moving faster. They care deeply about being able to execute on a strategy the minute they have a strategy.

“This is at the heart of how we defined our strategy that we first began to launch last year. If you step back and look at all of the connections, you have traffic going to public cloud, SaaS, applications, consuming M2M/IoT connectivity at the edge… the only common denominator is the network. Therefore the network has to become a secure platform that enables you to help your organisation achieve its strategies.”

As regular readers of this publication will be aware, Google’s cloud push has been a serious bet over the past couple of years, with validation of Greene’s work coming from the most recent Gartner Magic Quadrant for cloud IaaS. The analyst firm put Google in its leaders’ section for the first time in five years.

Cisco offered one other piece of news yesterday: the company is working with NetApp to deliver a new managed private cloud FlexPod product. FlexPod ‘combines Cisco UCS integrated infrastructure with NetApp data services to help organisations accelerate application delivery and transition to a hybrid cloud with a trusted platform for innovation’, in the company’s words.

CEBIT 2018: Huawei launches hybrid cloud offering on Azure Stack


Keumars Afifi-Sabet

12 Jun, 2018

Huawei has launched a hybrid cloud service built for Azure Stack, Microsoft’s offering that brings Azure into customers’ datacentres as a private cloud.

Built on Huawei’s FusionServer V5 servers and CloudEngine switches, Huawei said the tool will allow enterprises to enable digital transformation projects by bringing Azure cloud services to on-premise sites where there is low connectivity, such as an aircraft or an oil rig.

Huawei is one of many firms working with Microsoft on producing services for Azure Stack, but speaking at CEBIT 2018, Microsoft partner director for Azure Stack, Vijay Tewari, labelled the vendor’s relationship with Huawei in particular as deep and strong.

“In terms of working with partners, the amount of time that Huawei [took] to launch the product was the shortest time it took as compared to any other partner, so we have a very strong engineering relationship with [president of server product line] Qiu Long and others at Huawei,” he said.

Huawei believes it is pivotal to pair its infrastructure with partners’ applications as it designs technology for use in smart cities, the cloud, and networking.

The Chinese networking giant likened digital transformation to a “symphony” as it promoted partnerships with a range of companies including Microsoft and UK-based Purple Wi-Fi; Huawei is offering the latter its networking infrastructure so the Wi-Fi platform can extend the range of analytics tools it offers customers.

Purple Wi-Fi will be able to offer customers more detailed tracking information for consumers, with a view to boosting shopping experiences.

The company also outlined how it plans on using its partnerships with local companies to migrate projects to a global scale, with president of Huawei western Europe, Vincent Pang, outlining how a number of small-scale initiatives in Paris and London have helped the company win business elsewhere in the world.

“We want to build a local road here, we want to work with our local partners, we want to have more innovation to create end-to-end best practice here in Europe – but it’s not only for the local innovations, but how we can use these for the global market, and global vertical transformations,” he said.

Pang explained how a smart water project in Paris paved the way for expansion into Shanghai, while a smart logistics project with London’s DHL helped the company win a business case for a car manufacturer in China.

Huawei’s attempt to position itself as a leading player in the smart city scene arose with the launch of the ‘Rhine Cloud’, a smart city and public services cloud platform, expanding on an initial memorandum of understanding signed earlier this year.

The new framework agreement extends the commitment to building a smart city platform in Duisburg, Germany to serve as a model that the company is hoping to export to the rest of western Europe.

Huawei’s first smart city digital platform includes five resource coordination capabilities – IoT, big data, a geographic information system (GIS) map, video cloud, and converged communications – all combining to share basic resources with partners and facilitate the development of applications.

Martin Murrack, director of digitisation for Duisburg, outlined some of the benefits citizens should expect from the smart city collaboration with Huawei, including free Wi-Fi access and innovations in education, as well as unveiling the first Rhine Cloud-based SaaS platform, which digitises indoor environments, developed by Navvis.

Cohesity secures $250 million in series D round in hyperconverged storage boost

Cohesity, a hyperconverged storage provider, has announced it has raised $250 million (£186.7m) in an oversubscribed series D funding round to help further drive the company’s momentum.

The round was led by the SoftBank Vision Fund – making it only the second time the group has invested in an enterprise software company – with participation from Cisco Investments, Hewlett Packard Enterprise (HPE), Morgan Stanley Expansion Capital, and Sequoia Capital among others.

Cohesity offers hyperconverged storage for secondary data – data that sits outside primary production systems – with the company saying secondary data consumes up to 80% of enterprise storage capacity. With that data scattered across different repositories, such as backups, archives, test/dev and analytics, the company aims to simplify its management with its data platform.

The company has had significant success over the past 12 months, with more than 200 new enterprise customers – from Schneider Electric to the San Francisco Giants – coming on board in the past two quarters alone. Cohesity also grew its revenues by 600% between 2016 and 2017.

“My vision has always been to provide enterprises with cloud-like simplicity for their many fragmented applications and data – backup, test and development, analytics, and more,” said Mohit Aron, CEO and founder of Cohesity. “Cohesity has built significant momentum and market share during the last 12 months and we are just getting started. We succeed because our customers are some of the world’s brightest and most fanatical IT organisations and are an extension of our development efforts.”

“Cohesity pioneered hyperconverged secondary storage as a first stepping stone on the path to a much larger transformation of enterprise infrastructure spanning public and private clouds,” said Deep Nishar, senior managing partner of SoftBank Investment Advisers. “We believe that Cohesity’s web-scale Google-like approach, cloud-native architecture, and incredible simplicity is changing the business of IT in a fundamental way.”

Despite SoftBank’s leadership being the eye-catching headline, it is interesting to compare some of the other investors.

Cisco and HPE also put a stake in last April with Cohesity’s series C round of $90 million. As a Business Insider story put it at the time, eyebrows were raised given Cisco and HPE’s fierce competition across multiple business units – storage being one of the hotter ones. Both companies have a clear interest in the space; in January last year, HPE acquired hyperconverged infrastructure (HCI) provider SimpliVity for $650 million in cash, while Cisco caught up by snaffling fellow HCI firm Skyport Systems at the start of this year.

The series D funding has given Cohesity a total of $410 million raised.

Four major challenges of adopting cloud business intelligence – and how to overcome them

With nine in 10 sales and marketing teams insisting that cloud-based business intelligence (BI) technology is essential to managing their strategies, it’s no wonder that adoption has exploded in recent years. However, many businesses, especially small to medium-sized enterprises (SMEs), are still lagging behind.

BARC’s research study on BI and data management found that 45% of businesses had not adopted this technology yet, with 6% stating they were opposed to it altogether. Over half of SMEs rated their strategy usage as “low”, and the majority were still not using the cloud for any of their BI and data management programs.

Cloud-based BI can solve many data management issues that businesses face today. If used correctly, it can foster seamless and continuous utilization of information crucial to business growth. So why are so many companies still averse to it?

Let’s discuss.

User proficiency

It’s fair to say that no company can survive for long in the modern world without some form of business intelligence program, whether it be a simple database or a complex data library. As companies become more reliant on large volumes of data to manage their sales and marketing strategies, larger storage capacities and immediate access to information become absolute requirements.

However, switching an operation over to a cloud-based business intelligence system can be a cumbersome process. Since these systems can draw from a wide range of datasets, getting everyone in the company up to speed during the implementation phase takes time. The hours spent in training eat into company profits, yet it is necessary for all employees to get on board if the system is to be used to its fullest potential. Unfortunately, most businesses that use BI systems fall short in this regard, with 49% of employees reporting that they use less than half of the system’s features due to improper training.

Obviously, failing to properly onboard your team to a cloud-based system could lead to disaster, including decreased profits, longer sales cycles, and copious amounts of employee frustration. For example, say your company switches to a cloud CRM system so your sales associates have access to customer data at all times. If they are not properly trained, they could miss important consumer tidbits, such as the customer’s position along the sales funnel or previous encounters with other representatives. However, with proper training, they can gather important insights from the program to create more seamless experiences. To give you an idea, reps could use the information from POS data points to create more personalized sales strategies based on customers’ past purchases and preferences. 

Unless your employees are tech wizards who happen to be experts in cloud-based BI systems, thorough onboarding will be necessary to get them comfortable with the technology. Of the companies that were able to successfully integrate BI into their sales and marketing practices, 86% claimed to have more accurate reporting and analysis, 84% improved their business decision making process, and 79% saw an enhancement in employee satisfaction.

Additionally, operational efficiency increased, as did customer satisfaction and revenues. Numbers like this clearly show the benefits of proper training and how it can create a more competent and productive company.

Data governance

The sheer amount of data that companies have access to in today’s business landscape has created all kinds of gray areas in terms of data governance. Of course, there are many legal issues that come with the hierarchy of data access. For example, should every department within a business have access to sensitive and private information like financial or customer data? Perhaps not. But if every person needs to request permission to access necessary information, it could lead to an immense backlog and long waits to gather specific data points, which completely defeats the purpose of cloud-based BI.

There are some key guidelines companies should follow when setting up internal governance for data.

First of all, teams must recognize that not all data is created equal, and therefore, no single level of access can be used for every dataset all the time. There must be a chain of command in place and a series of checks and balances to ensure that data is being used correctly.

The second rule of thumb is to understand how linking data will create stronger strategies. However, this must be done wisely without using unnecessary datasets. Many sales teams will have their own sets of stored data from experiences and interactions with customers. Similarly, financial departments, customer service representatives, and marketing teams will have their own information as well. By combining these data points into a unified system (like a cloud-based BI program), every department can harness the power of this information to create better customer experiences.

Companies that understand the importance of a unified system for proper governance report remarkably higher success rates than businesses with multiple, inconsistent data sources. Additionally, failure to put a system in place leads to unreliable data that cannot be trusted, which creates confusion, and ultimately, failure.

There is no “one size fits all” rule that can be applied to data governance, at least not yet. The technology is changing very rapidly, and it may take time for legal restrictions to catch up. However, in the meantime, it is vital that businesses set their own standards and rules for proper lineage and chain of custody when it comes to data management.

Data security

Security is the number one issue that businesses have with implementing cloud-based BI systems, with 45% listing it as their top concern.

In the process of moving data to the cloud, security risks can increase significantly. Cloud-based data can sometimes be more exposed to hackers and system attacks, so this concern is certainly valid.

These days, data security is at the very top of most priority lists, and if it isn’t, it needs to be. However, it is entirely possible to keep data secure in cloud-based systems if the proper precautions and measures are taken.

In order to ensure that data points are safe and secure (especially during transfer), businesses must use cloud providers with network segmentation, strong password requirements, and strong encryption for maximum protection. Machine learning systems can be a powerful line of defense when it comes to data security, because of their ability to mold and adapt as threats grow and system needs change. In fact, machine learning-powered security systems are able to identify and combat credential attacks, which make up over 80% of hacking-related data breaches.

System agility

BI solutions are in a fast, perpetual state of advancement. If a solution becomes obsolete, businesses need to be able to change their system with very short adjustment times. This is, of course, a major concern for companies that equate technology changes with higher costs.

To avoid being overwhelmed by the power and complexity of cloud BI systems, it is best to focus only on the features that will best support your specific industry. Different businesses will require different focuses, depending on their key areas of service. For example, financial services firms would benefit greatly from end-user self-service programs, but have little need for deep data mining. Healthcare companies, on the other hand, need more data discovery capabilities, along with in-memory support and data cataloging.

Businesses considering this type of technology-forward BI system must be prepared to keep up with the changes. This will require constant re-evaluation of systems and comparison of old strategies with new and emerging ones. By narrowing your company’s focus to only the most important features, keeping up becomes far more manageable than trying to track cloud-based advancement as a whole.

Conclusion

Adopting cloud-based systems is rarely easy, but it is necessary in order to keep up with the advancing tech world, especially with regard to BI. Of course, no technology is without its weaknesses or challenges, and in the case of cloud-based BI programs, these obstacles can often seem insurmountable. Fortunately, the top four objections to implementing this technology are not without solutions.