The Globe and Mail moves to AWS, combining SageMaker with Sophi analytics platform

Publishing and media companies continue to utilise the cloud for their archival and architectural needs – and The Globe and Mail is the latest to move across.

The Canadian news brand has selected Amazon Web Services (AWS) as its preferred cloud provider, citing the Seattle giant’s artificial intelligence (AI) and machine learning (ML) capabilities as key to its decision.

The Globe and Mail’s usage of AI and ML technologies is extensive. Amazon SageMaker, Comprehend, Rekognition and Textract are all being utilised, while Amazon Polly is being used to convert text articles to audio in English, French, and Mandarin.
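By way of illustration, converting an article with Polly comes down to a single API call per language. Below is a minimal boto3 sketch; the voice choices and output paths are illustrative, not The Globe and Mail's actual pipeline.

```python
import boto3

# Minimal sketch of article-to-audio conversion with Amazon Polly.
# Voice IDs below are real Polly voices, but the pipeline is illustrative.
polly = boto3.client("polly")

ARTICLE_TEXT = "The Globe and Mail has selected AWS as its cloud provider."

# One voice per target language; in practice the input would be the
# translated article body for each language.
VOICES = {
    "en-US": "Joanna",   # English
    "fr-CA": "Chantal",  # French (Canadian)
    "cmn-CN": "Zhiyu",   # Mandarin Chinese
}

for lang, voice in VOICES.items():
    response = polly.synthesize_speech(
        Text=ARTICLE_TEXT,
        OutputFormat="mp3",
        VoiceId=voice,
    )
    with open(f"article_{lang}.mp3", "wb") as f:
        f.write(response["AudioStream"].read())
```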

One of the more interesting injections of machine learning for The Globe and Mail is through assessing the value of articles before they go live. The publisher’s proprietary analytics platform, Sophi, uses SageMaker among other AWS services to help the editorial team identify which stories should go behind a paywall, as well as which stories to promote and when.
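Sophi itself is proprietary, but scoring an article against a model hosted on SageMaker typically reduces to invoking an endpoint. A hypothetical sketch follows; the endpoint name and payload schema are invented for illustration.

```python
import json
import boto3

# Hypothetical sketch: Sophi's internals are proprietary, but invoking a
# SageMaker-hosted model to score an article generally looks like this.
runtime = boto3.client("sagemaker-runtime")

article = {
    "headline": "Example headline",   # illustrative feature set
    "section": "business",
    "word_count": 850,
}

response = runtime.invoke_endpoint(
    EndpointName="article-value-model",   # hypothetical endpoint name
    ContentType="application/json",
    Body=json.dumps(article),
)

# e.g. {"paywall_score": 0.82} -- a higher score might suggest the story
# has subscriber-only value; the response schema is also hypothetical.
score = json.loads(response["Body"].read())
print(score)
```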

“The Globe originally built Sophi for its own use, but has since begun offering Sophi as a service to other news organisations,” said Greg Doufas, chief technology and digital officer at The Globe and Mail. “With AWS, we are able to bring our tech experts and editorial leadership together to innovate and bring new ideas to the newsroom to provide great experiences for our readers.”

Many newly-announced AWS customers are keen to extol the virtues of AI and ML. Among the most recent, NASCAR became the latest sporting franchise – after Formula 1 and Major League Baseball – to sign up, back in June. The US motor racing governing body is looking to unlock its vast archive and release a periodic video series titled ‘This Moment in NASCAR History’, using Rekognition to automatically tag video frames with metadata for easier search.
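Tagging archive footage with Rekognition follows a simple asynchronous pattern: start a label-detection job, then poll for results. A minimal sketch, with a hypothetical bucket and file:

```python
import time
import boto3

# Sketch of archive tagging with Amazon Rekognition Video. The bucket and
# key are hypothetical; label detection is asynchronous, so we poll the job.
rekognition = boto3.client("rekognition")

job = rekognition.start_label_detection(
    Video={"S3Object": {"Bucket": "nascar-archive", "Name": "race-1979.mp4"}},
    MinConfidence=80,
)

while True:
    result = rekognition.get_label_detection(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(10)

# Each entry carries a timestamp (ms) and a label, e.g. "Car" or "Crowd",
# which becomes searchable metadata for the archive.
for detection in result.get("Labels", []):
    print(detection["Timestamp"], detection["Label"]["Name"])
```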

The latter is an interesting use case across media. Boston TV station WGBH has been migrating its archive from tape and hard disk drives, which took up to 72 hours to access, to Cloudian’s object storage, leading to more seamless retrieval.

“What companies want to do is learn from the data – they want to be able to analyse and benefit from it, and it’s very hard to do if that data is sitting on a shelf,” Jon Toor, Cloudian CMO, told CloudTech in 2018. “You need it to be sitting there with real-time access – preferably something that is cloud-integrated so you can also use tools in the cloud to help you learn about that data.”

AWS does not have a monopoly on North American publishers, however. The New York Times, when not working on a blockchain project to combat media misinformation, is a well-known Google Cloud customer. The publisher moved its gaming and crossword platform from Amazon in 2017, while in November it announced it was using Google’s AI to analyse its photo archive for story gathering.


VMware in talks to acquire Pivotal


Dale Walker

15 Aug, 2019

VMware has said it is in talks to buy Pivotal Software, the virtualisation company revealed in a filing on Wednesday.

The deal would see VMware acquire all outstanding shares of Class A common stock of Pivotal, priced at $15 per share, although all aspects of the agreement are subject to change and VMware is permitted to walk away at any time.

However, if agreed, that price would be substantially higher than Pivotal’s recent share price, which rebounded to around $14 on news of the deal. It is also the same price at which Pivotal listed during its IPO.

The deal is an unusual one for the industry, given that Dell continues to own a majority stake in both VMware and Pivotal. Since being spun off from EMC Corporation (now Dell EMC) in 2012, Pivotal has worked to champion Cloud Foundry, an open-source software platform used by most of the Fortune 500.

Despite initial growth that saw its shares rise to as much as $30, the company has struggled of late: a disastrous financial quarter in 2019 sent the stock as low as $8, with the company losing $31.7 million in the process.

For its part, VMware has continued to work closely with Pivotal. Alongside Dell EMC, VMware remains a Cloud Foundry Foundation platinum partner, which includes sales of Pivotal services to its customers.

“VMware regularly evaluates potential partnerships and acquisitions that would accelerate our strategy,” the company said in a statement. “Pivotal is a long-term strategic partner and we’re already successfully collaborating to help enterprises in their application development and infrastructure transformation.

“VMware’s Board of Directors will continue to act in the best interest of all shareholders. There can be no assurance that any such agreement regarding the potential transaction will occur, and VMware does not intend to communicate further on this matter unless and until a definitive agreement is reached.”

Cloud autonomics: Moving to the holy grail of automated management and optimisation

Opinion The holy grail of IT operations is a state in which all mundane, repeatable remediations happen without intervention, and a human is only called into service for actions that genuinely cannot be automated. This not only makes for restful nights; it also lets IT operations teams become more agile while maintaining a proactive, highly optimised enterprise cloud.

Getting to that state can sound like something out of an online fantasy game, but the growing popularity of “AIOps” gives real hope that it is closer to reality than once thought.

Sceptics will tell you that automation, orchestration and optimisation have been alive and well in the data centre for more than a decade. Platforms such as Microsoft System Center, IBM Tivoli and ServiceNow are just a few examples of tools that collect, analyse and act on sensor data derived from physical and virtual infrastructure and appliances.

But couple these capabilities with the advances brought by AIOps and you gain the previously missing components: big data analytics along with artificial intelligence (AI) and machine learning (ML).

As you can imagine, these advances have brought an explosion of new tooling and services from cloud ISVs intended to make the once-utopian “autonomic cloud” a reality. Palo Alto Networks’ Prisma Public Cloud product is a great example of a technology with autonomic-type capabilities. Its security and compliance features are impressive, but it also has a component known as User and Entity Behaviour Analytics (UEBA).

UEBA analyses user activity data from logs, network traffic and endpoints, and correlates it with security threat intelligence to identify activities – or behaviours – likely to indicate a malicious presence in your environment. After assessing the current vulnerability and risk landscape, it reports on that state and derives a set of guided remediations, which can be performed manually against the infrastructure in question or automated, giving a hands-off, proactive response that keeps vulnerabilities and security compliance continuously in check.
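To make the behavioural-baseline idea concrete, here is a toy sketch – emphatically not Prisma Public Cloud's implementation – that flags activity falling outside a user's historical profile:

```python
from collections import defaultdict

# Toy illustration of the idea behind UEBA: build a per-user baseline from
# historical activity, then flag events that fall outside it.
baseline = defaultdict(lambda: {"source_ips": set(), "hours": set()})

history = [
    {"user": "alice", "ip": "10.0.0.5", "hour": 9},
    {"user": "alice", "ip": "10.0.0.5", "hour": 14},
]
for event in history:
    baseline[event["user"]]["source_ips"].add(event["ip"])
    baseline[event["user"]]["hours"].add(event["hour"])

def is_anomalous(event):
    """Flag activity from an unseen IP or at an unusual hour."""
    profile = baseline[event["user"]]
    return (event["ip"] not in profile["source_ips"]
            or event["hour"] not in profile["hours"])

# 3am login from an unknown address: flagged for review or remediation.
print(is_anomalous({"user": "alice", "ip": "203.0.113.9", "hour": 3}))  # True
```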

Another ISV focused on AIOps is Moogsoft, which is bringing a next-generation platform for IT incident management to the cloud. Moogsoft has purpose-built machine learning algorithms designed to better correlate alerts and cut much of the noise across all those data points. Married to its AI capabilities for IT operations, this helps DevOps teams operate smarter and faster, and automate more of the traditional IT operations workload.
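The noise-reduction idea can be illustrated with a deliberately simple sketch – real products such as Moogsoft use far more sophisticated, similarity-based clustering – that collapses alerts on the same service within a short window into one incident:

```python
# Toy illustration of alert correlation: collapse alerts that share a
# service and arrive within a short window into a single incident.
alerts = [
    {"service": "checkout", "ts": 100, "msg": "high latency"},
    {"service": "checkout", "ts": 130, "msg": "error rate spike"},
    {"service": "search",   "ts": 400, "msg": "node down"},
]

WINDOW = 60  # seconds

def correlate(alerts):
    incidents = []
    for alert in sorted(alerts, key=lambda a: (a["service"], a["ts"])):
        last = incidents[-1] if incidents else None
        if (last and last["service"] == alert["service"]
                and alert["ts"] - last["alerts"][-1]["ts"] <= WINDOW):
            last["alerts"].append(alert)   # same burst: fold into incident
        else:
            incidents.append({"service": alert["service"], "alerts": [alert]})
    return incidents

print(len(correlate(alerts)))  # 2 incidents from 3 alerts
```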

As we move forward, expect to see more and more AI and ML-based functionality move into the core cloud management platforms as well.

Amazon recently released AWS Control Tower to aid companies on the journey towards AIOps. Alongside some impressive features for new account creation and multi-account visibility, it enforces service control policies (SCPs) based upon established guardrails (rules and policies). As new resources and accounts come online, Control Tower can enforce compliance with those policies automatically, preventing “bad behaviour” by users and eliminating the need for IT to reconfigure resources after the fact. Once Control Tower is in use, these guardrails apply across multi-account environments and to new accounts as they are created.
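Control Tower manages guardrails through its own console, but under the hood a preventive guardrail is a service control policy. Below is a sketch of what a region-restriction SCP looks like, attached here via the AWS Organizations API for illustration; the policy name and region list are hypothetical.

```python
import json
import boto3

# Illustrative SCP of the kind Control Tower applies as a preventive
# guardrail: deny all actions outside approved regions. Control Tower
# manages its own guardrails; this shows the raw Organizations mechanism.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsideApprovedRegions",
        "Effect": "Deny",
        # Global services are exempted so IAM and billing keep working.
        "NotAction": ["iam:*", "organizations:*", "support:*"],
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {
                "aws:RequestedRegion": ["us-east-1", "eu-west-1"]
            }
        },
    }],
}

orgs = boto3.client("organizations")
policy = orgs.create_policy(
    Name="region-guardrail",   # hypothetical policy name
    Description="Deny actions outside approved regions",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)
```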

It is an exciting time for autonomic systems capabilities in the cloud. Every company can now automate, orchestrate and proactively maintain and optimise its core cloud infrastructure.


Co-op embarks on major digital transformation project with SAP


Nicholas Fearn

14 Aug, 2019

Convenience store chain Co-op has turned to SAP to help it embark on its biggest digital transformation programme to date.

The Retail Business Transformation programme, launched with suppliers this week, uses SAP technology to modernise operations across the entire business.

Co-op said it will use technology from SAP to improve ranging, stock holding and availability, and to provide more accurate forecasting information.

Specifically, Co-op will deploy SAP’s Retail ECC Suite on HANA, an integrated business process and data management platform that aims to streamline retail operations.

The company explained that the HANA-based system will not only help it ensure the right products are available in stores to meet customer demand, but also help it better understand customer needs, enabling savvy price and promotion decisions.

It’s currently trialling the systems in 24 stores across five product categories, with the support of 15 suppliers including the likes of Coca-Cola and Heineken.

As part of this digital transformation initiative, the firm has also implemented a cloud-based supplier collaboration portal called Co-op Connect.

Michael Fletcher, chief commercial officer at Co-op, said: “The RBT programme is an integral part of Co-op’s on-going success, as we look to ensure that the technology we are using will future proof the business for many years to come. 

“This pilot will allow us to work with suppliers to ensure that it is working perfectly before we roll it out elsewhere, and we are already getting positive feedback that the new portal is faster and easier to navigate.”

Fletcher added that investing in new technology will allow the firm to “grow the business, helping to deliver a stronger Co-op that will result in stronger communities”.

Co-op’s project is yet another example of major companies, from banks to corner shop brands, exploring the latest data management and cloud-based technologies to overhaul their businesses – from streamlining certain operational aspects to completely reworking core systems and IT infrastructure.

Amazon claims AWS Rekognition can now detect fear


Bobby Hellard

14 Aug, 2019

Amazon Web Services has revealed an update to its facial recognition software, Rekognition, that can detect a person’s fear.

The update was announced on Monday alongside improvements to the accuracy and functionality of the company’s facial analysis feature, which can identify gender, emotions and age range.

The company claims the software can already accurately read seven ‘emotions’, but it has now added an eighth – the ability to spot fear.

However, some experts have pointed out that while there is scientific evidence that suggests there are correlations between facial expressions and emotions, the way they’re communicated across cultures and situations can vary dramatically.

“Today, we are launching accuracy and functionality improvements to our face analysis features,” the tech giant said. “Face analysis generates metadata about detected faces in the form of gender, age range, emotions, attributes such as ‘Smile’, face pose, face image quality and face landmarks.

“With this release, we have further improved the accuracy of gender identification. In addition, we have improved accuracy for emotion detection (for all 7 emotions: ‘Happy’, ‘Sad’, ‘Angry’, ‘Surprised’, ‘Disgusted’, ‘Calm’ and ‘Confused’) and added a new emotion: ‘Fear’.”
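In practice, that emotion metadata is returned by the DetectFaces API. A minimal boto3 sketch of reading it follows; the image location is hypothetical.

```python
import boto3

# Minimal sketch of reading Rekognition's emotion metadata. The bucket and
# key are hypothetical; Attributes=['ALL'] requests the full face analysis.
rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-images", "Name": "crowd.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    # Emotions is a list of {"Type": ..., "Confidence": ...}; with this
    # release the possible types include "FEAR" alongside the original seven.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(top["Type"], round(top["Confidence"], 1))
```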

The ethical use of facial recognition, and its accuracy, particularly when deployed on a crowd, has caused concern throughout the world. From the London Met Police’s use that resulted in zero arrests and a 98% failure rate, to San Francisco’s outright ban of the technology, it’s now more famous for its problems than its benefits.

The UK’s Information Commissioner’s Office (ICO) has announced an investigation into the privacy aspects of facial recognition, after the owner of a development site in King’s Cross confirmed this week that the technology was being used.

Amazon’s own Rekognition has also been a source of controversy after it was revealed that US law enforcement agencies used the technology. There were even reports that AWS tried to offer the software to US Immigration and Customs Enforcement (ICE), which sparked protests from Amazon staff.

Why a holistic approach to cloud transformation is key to success

As applications increasingly move to the cloud, businesses often voice concerns about soaring WAN costs as well as latency issues when accessing apps. The much-anticipated benefits of a cloud transformation, including greater efficiency and agility, risk being eroded when the user experience is unsatisfactory and costs spin out of control.

How, then, can organisations successfully tackle their transformation projects to avoid these pitfalls and fully realise the benefits of the cloud? This is a contentious issue, even for companies that have already begun their cloud journeys.

According to our own recent independent survey, which included 400 decision-makers in four European countries, fewer than one in 10 companies (nine percent) in Germany, England, France, and the Benelux region are employing a holistic transformation approach, which includes taking application, network, and security aspects into account at the same time.

Furthermore, 21 percent of companies reported starting their journey with applications, 26 percent used the network as the starting point, and one-third (33 percent) began by transforming security. In 11 percent of the companies surveyed, decision-makers considered the transformation of applications together with that of the network. The results demonstrate that there is no consistent approach to transformation projects.

Network topologies for the cloud?

Businesses are advised to take holistic considerations into account during an application transformation as early as the planning phase. Decisions for a cloud project should not be made in isolation by a single business unit, because such siloed thinking leads to poor performance and spiralling costs. If an application is pushed into the cloud without the network and security teams being involved at the planning stage, problems are inevitable.

A traditional network topology is not designed to meet the needs of the cloud. In a classic hub-and-spoke network, users are not directly connected to applications in the cloud. Whether at headquarters, at a branch office or at another remote location, they must always take a detour via the data centre, which adds latency, as this route to the internet is never the shortest or most time-saving path.

This detour also helps explain skyrocketing costs: traffic from remote users crosses the MPLS lines several times on its way out. On top of that, the growth in internet-bound traffic must be taken into account. Office 365, the most popular cloud-based application suite and the one that starts many companies’ journey to the cloud, can increase traffic substantially. For good reason, the recommendation in the Microsoft Design Guide is to rely on direct internet connections at each location, giving employees the shortest path to applications in the cloud.

Security for the cloud, from the cloud

Businesses must understand that a cloud-ready network should be built before deploying a cloud-based application, and part of that building process involves changes to the security infrastructure. If applications leave the network and mobile users access data in the cloud, security hardware at the perimeter becomes a bottleneck for this traffic. Here the second silo opens up: the security team must be at the table when a transformation project is planned, and the specific security requirements of cloud-based projects have to be considered.

If only the network team is consulted, but not the security expert, the following aspects are often overlooked in the planning phase:

  • Is the existing proxy designed to cope with increasing network traffic?
  • Is the appliance capable of scanning traffic for the rising volume of malware that hides behind SSL encryption?
  • Is the firewall also keeping up with the new data volume and the parallel connections required in the Office 365 example?

In short, not only is there more data traffic, but there are also new requirements for the security infrastructure as applications move to the cloud. If companies anticipate the move and provide local internet breakouts, the security infrastructure must be maintained locally too, because backhauling to a traditional security stack around the centralised data centre would, again, mean a detour.

The solution cannot be to install stacks of appliances at each site, as cost and administrative overhead bar such a move. To secure local breakouts, the solution is a security stack in the cloud with all the necessary security modules, from the next-generation firewall to cloud sandboxing and data loss prevention.

Cloud-delivered security as a service reduces the administrative burden through a high degree of integration and therefore a short path to log correlation. And security from the cloud scales easily with increased data volume and ensures the correct path for business-critical applications through bandwidth management.

Application, network, and security transformation must go hand in hand

According to our research, a third of decision-makers are already adapting security requirements as part of their transformation. Building on this progress, the network topology should also be made cloud-ready, to head off bottlenecks as applications move to the cloud. That means the one-quarter of companies that said they want to start with application transformation should reconsider their strategy. All in all, transformation efforts in all three areas must go hand in hand and be planned jointly by all departments from the start. In such a scenario, companies will benefit from their cloud transformation right from the outset.


Slack hands more power to large company admins


Bobby Hellard

14 Aug, 2019

Slack has revealed a set of features for admins that make it easier to manage organisations with large employee counts and busy channels.

The biggest change is that admins can now assign posting permissions more widely than a few select channels, and there is also a new set of APIs to automate the creation of workspaces with names, domains and descriptions.

“We believe this should be easier and today we’re introducing a couple of new features to do just that,” the company said in a blog post.

It comes a week after the company introduced more robust security measures for admins, including the ability to enforce data sharing limits and content blocks on certain devices, as well as a greater variety of two-factor authentication checks.

To start with, announcement channels create a single destination for key information, so teams no longer have to decide what gets shared via email and what gets shared via Slack.

“We’ve long encouraged teams to send announcements in channels, where your employees are already working,” the company said. “To broadcast those updates clearly and without distraction, admins have always been able to limit posting permissions in the default ‘general’ channel. Now, for teams on our Plus or Enterprise Grid plans, we’re allowing users to set posting permissions for any channel.”

These come in the form of ‘granular’ controls which limit who can post in a channel and keep chatter to a minimum, leaving the space clear for the most important updates.

As for the new admin APIs, admins will be able to:

  • invite thousands of members at a time, without needing to join the workspace themselves
  • invite guest accounts to specific channels (including private ones), set a guest expiration date and customise a welcome message
  • delegate admin responsibilities to a specific member
  • automatically trigger the events above based on information collected via web forms

“All these APIs work towards templated workspace creation and setup,” the company said. “In the future, admins can script the creation of new workspaces that will automatically be configured with their desired settings, content, apps and more.”
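For the invitation piece, here is a minimal sketch of calling the documented admin.users.invite Web API method with the slack_sdk Python client. The token, team ID, channel IDs and addresses are placeholders, and the method requires an Enterprise Grid org-level admin token.

```python
from slack_sdk import WebClient

# Sketch of bulk invitation via the admin.users.invite Web API method.
# Requires an Enterprise Grid org-level token with admin scopes; all
# IDs and addresses below are hypothetical.
client = WebClient(token="xoxp-org-admin-token")

new_hires = ["ada@example.com", "grace@example.com"]

for email in new_hires:
    client.admin_users_invite(
        team_id="T0123456789",         # target workspace
        email=email,
        channel_ids=["C0123456789"],   # channels to drop the user into
    )
```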

Microsoft and Reliance Jio team up in 10-year cloud deal to ‘transform Indian economy and society’

Indian network operator Reliance Jio has announced a 10-year partnership with Microsoft to utilise and promote Azure and ‘accelerate the digital transformation of the Indian economy and society.’

The alliance will comprise a variety of initiatives. Jio will move its non-network applications to Azure, as well as set up data centres across India in which Azure will be housed. The telco’s internal workforce will be supplied with the Microsoft 365 collaboration suite, while Jio’s connectivity infrastructure will promote the adoption of Azure as part of the company’s cloud-first strategy.

The move will extend beyond Jio internally to its customer base: startups will have greater access to cloud infrastructure, while Indian SMBs will gain access to a range of cloud-based productivity apps. For larger organisations, the companies say new Jio solutions can be leveraged alongside the Microsoft offerings already in use.

“In combining efforts, Jio and Microsoft aim to enhance the adoption of leading technologies like data analytics, AI, cognitive services, blockchain, Internet of Things and edge computing among small and medium enterprises to make them ready to compete and grow, while helping accelerate technology-led GDP growth in India and driving adoption of next-gen technology solutions at scale,” the companies said in a statement.

India’s role in the cloud computing ecosystem is an interesting one. The country’s potential is unmistakable: a report last November argued that more than one million cloud jobs will be created in India by 2022, while figures in April suggested the overall cloud computing market will break $7 billion by the same year.

Yet glaring weaknesses remain. The most recent report from the Asia Cloud Computing Association (ACCA), last April, ranked India above only China and Vietnam in its 14-nation ranking of best cloud nations within Asia Pacific. Many of India’s problems are similar to China’s – the country’s vast expanse means that while certain areas are prosperous, its overall scores for connectivity, sustainability and data centre risk are low.

The ACCA report noted at the time that one of the key areas where India could gain leverage is its tech-literate workforce, which could improve its attractiveness as a data centre hub. It is, however, a slow process. “Cloud infrastructure is the weakness that is weighing India down,” the report noted. “Lack of access to quality broadband and sustainable power remain serious issues throughout India, making it difficult for even the most polished security and governance frameworks to drive cloud adoption.”

Figures from Synergy Research last June showed that across APAC, AWS remained the leading public cloud provider, with Alibaba breaking Microsoft’s stranglehold on second place – primarily down to its Chinese dominance. Satyajit Sinha, an analyst at Counterpoint, told the Economic Times that the Jio-Microsoft tie-up would require AWS and Google to come up with ‘new, perhaps cheaper’ pricing models for the Indian market.


How public cloud will become the driving force for connected cars

Opinion The cars we drive today are very different to the first models introduced in the late 1800s. Every aspect of the driving experience has evolved, innovating to meet customer needs, industry standards and to ensure passenger safety. Our new cars now offer voice assistants, can be fuelled by electric power, and typically include satellite navigation and collision detection as standard.

It’s time for the next stage of innovation, which will make the cars of the future very different to the ones we have today. This phase includes the introduction of autonomous and connected cars, integrated with the public cloud.

As we embark on this period of cloud-based automotive development, data security and privacy must be at the heart of everything we do. Car manufacturers will collect more and more data about us through cloud applications, just as mobile phone providers do. The question will be: who controls that data? More to the point, who has access to it? After all, our cars could tell a hacker where we live, where we work, the route we take, and when and where we shop…

Consumers, governments and businesses alike are all waking up to the importance of secure, private data usage, so car manufacturers have an opportunity to build compliance in from the get-go.

The key challenges for automotive manufacturers creating connected cars are extracting the huge amounts of data generated by the multitude of sensors on modern vehicles, augmenting that data for additional value, anonymising it for GDPR compliance, storing it, and presenting it in a format that data scientists can analyse. Public cloud is an ideal solution, providing flexibility, scale, agility and security.
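To make the anonymisation step concrete, here is a small standard-library sketch – field names and the key are hypothetical – that pseudonymises the vehicle identifier with a keyed hash and coarsens GPS precision:

```python
import hashlib
import hmac

# Stdlib-only sketch of one GDPR-minded anonymisation step: pseudonymise
# the vehicle identifier and reduce location precision before storage.
# The field names and secret key are hypothetical.
SECRET_KEY = b"rotate-me-regularly"

def anonymise(record: dict) -> dict:
    return {
        # Keyed hash: a stable pseudonym, not reversible without the key.
        "vehicle_id": hmac.new(SECRET_KEY, record["vin"].encode(),
                               hashlib.sha256).hexdigest()[:16],
        # Two decimal places is roughly 1.1 km: enough for traffic
        # analytics, too coarse to identify a driveway.
        "lat": round(record["lat"], 2),
        "lon": round(record["lon"], 2),
        "speed_kph": record["speed_kph"],
    }

print(anonymise({"vin": "WVWZZZ1JZXW000001", "lat": 51.50722,
                 "lon": -0.12750, "speed_kph": 48}))
```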

Data-driven cars

As the now-famous quote has it, “data is the new oil”. It’s valuable, but unrefined it cannot be used – the autonomous vehicle must be fed structured data.

So what of all this data that is now collectable from our connected cars? Off-car analysis for the manufacturer’s gain is one thing, but how could it be put to good use as input to an intelligent city transport system, for example? Will Audi, BMW, Mercedes, Lexus, VW and JLR all share the data from their connected cars for the greater good of a more intelligent public transport system? Only time will tell.

It’s an exciting time for the automotive industry. The basic design of a car hasn’t changed for decades, but it’s now time for the industry to evolve at speed. Of course, there will always be concerns over privacy risks. It’s vital that the industry takes these concerns seriously and puts security at the heart of evolution.

There’s a real opportunity for all vehicle manufacturers to improve their customer experiences through data-driven cars. Not only can these cars improve travel experiences through integrated car and traffic management solutions, but they can also benefit society through reduced pollution.

The first car models from the 1800s seem basic to modern consumers, and it’s exciting to think that today’s electric, voice-assisted vehicles may spur the same reaction in the future. 


Analysing Microsoft Azure Dedicated Host and licensing changes: Risk, rage, and reward

Microsoft is modifying some of its Azure licensing terms for dedicated hosted cloud services, with a knock-on effect of making services more expensive for customers of Amazon Web Services (AWS), Google and Alibaba Cloud.

The move coincides with Microsoft launching Azure Dedicated Host, a service that enables users to run Linux and Windows VMs on single-tenant physical servers.

The overall rationale for the service was outlined by the Azure team in a blog post. “The emergence of dedicated hosted cloud services has blurred the line between traditional outsourcing and cloud services, and has led to the use of on-premises licenses on cloud services,” a post read. “Dedicated hosted cloud services by major public cloud providers typically offer global elastic scale, on-demand provisioning and a pay-as-you-go model, similar to multi-tenant cloud services.

“As a result, we’re updating the outsourcing terms for Microsoft on-premises licenses to clarify the distinction between on-premises/traditional outsourcing and cloud services and create more consistent licensing terms across multi-tenant and dedicated hosted cloud services.”

From October, on-premises licenses bought without ‘software assurance and mobility rights’ cannot be deployed on the dedicated hosted cloud services offered by the three big competitors, including VMware Cloud on AWS. Microsoft added that these changes do not apply to other providers.

Owen Rogers, research vice president for digital economics at 451 Research, noted an example of the potential change. “Back when Azure didn’t offer dedicated hosts, some AWS customers installed Windows Server Datacenter on an AWS dedicated host – as a result, all virtualised operating systems on the host were licensed to run Windows Server, from the single host license, which aided migrations and license management plus lowered costs,” Rogers told CloudTech.

“Now Microsoft is saying you won’t be able to install Windows Server on a dedicated host at all, unless you use Azure.”

It is safe to say that AWS responded to the news with claws out. Writing on LinkedIn, AWS vice president Sandy Carter argued the announcements “certainly seem like they’ve been taken from the old guard software vendor playbook.” Amazon CTO Werner Vogels was similarly dismissive on Twitter.

Carter cited eMarketer as an example of a customer which had begun its digital transformation journey on Azure but had moved to the other side. “The cloud enables your company’s agility and innovation. Do you really want to bring along the licensing baggage of the old world, especially if those rights continue to change?” wrote Carter. “At AWS, our goal is to provide our customers with choice.”

Choice at what cost, however? This statement may raise the odd eyebrow for those who have been monitoring the recent rumbles around open source and big cloud providers. MongoDB, Confluent and Redis Labs were three companies who had modified their licensing because of major cloud providers who ‘take the open source code, bake it into [their] cloud offering and put all their own investments into differentiated proprietary offerings’, as Confluent co-founder Jay Kreps put it last year.

Redis Labs CEO Ofer Bengal told this publication in February that, aside from AWS, ‘the mood [was] trying to change’, suggesting that partnerships between open source companies and the big clouds were on the horizon. Lo and behold, less than two months later, Google Cloud announced partnerships with seven open source vendors – including all of the above.

“[The move] is controversial because Azure is restricting freedom of choice regarding where its software can be hosted, and is using its software to undercut its competition,” added Rogers. “Many will say this is just good business sense – Microsoft has invested billions in its software and services over the years, why shouldn’t it use its assets to capitalise on the opportunity? Others will say that some enterprises will have to pay more as a result without getting more value in return.

“The risk is that this move doesn’t encourage customers to move to Azure, but rather encourages them to migrate to Microsoft’s competitors’ services,” Rogers added.

Ultimately, both sides appear to be looking out for number one – an understandable position given the long-standing supersonic growth from the hyperscale clouds appears to be on the wane. As Synergy Research puts it, this is more the ‘law of large numbers’ taking its inevitable effect – but perhaps the well-known proverb around stones and glass houses may also apply.
