All posts by James

Gartner and Synergy studies show continued cyclical cloud spend

The cloud industry continues to go up and up – both in terms of global public cloud revenue and spending on data centre hardware and software.

Those are the key findings from two separate research studies, from Gartner and Synergy Research respectively.

Gartner found the worldwide public cloud services market is set to hit $214.3 billion (£164.1bn) in 2019, up 17.5% from the previous year. The primary category will remain SaaS – or cloud application services as Gartner puts it – expecting to account for almost $95bn of that figure this year and $143.7bn by 2022. Yet the fastest growing category will be infrastructure as a service (IaaS), almost doubling in revenue between 2019 and 2022.

In total, SaaS contributes 44% of overall spending today, with cloud business process services (BPaaS) at 23%, IaaS at 18% and platform as a service (PaaS) at 8.8%. Gartner predicts SaaS’ share will remain broadly similar by 2022 at 43%, with IaaS rising to 23% and PaaS to 9.6%.

“At Gartner, we know of no vendor or service provider today whose business model offerings and revenue growth are not influenced by the increasing adoption of cloud-first strategies in organisations,” said Sid Nag, research vice president at Gartner. “What we see now is only the beginning, though. Through 2022, Gartner projects the market size and growth of the cloud services industry at nearly three times the growth of overall IT services.”

Synergy, meanwhile, found that worldwide spend on data centre hardware and software grew by 17% last year. The continued demand for public cloud services is driving this spend, with more extensive server configurations ramping up enterprise selling prices. Public cloud demand grew 30% while private cloud – or cloud-enabled – went up 23%. For traditional ‘non-cloud’ there was no change year on year.

Looking at the runners and riders, Dell EMC and Cisco took the top two slots in the public cloud vendor space, with HPE and Huawei tied for third; original design manufacturers (ODMs) collectively account for the largest market share. In private cloud, Dell EMC again came out on top, with Microsoft and HPE tied for second and Cisco fourth.

“Cloud service revenues continue to grow by almost 50% per year, enterprise SaaS revenues are growing by 30%, search [and] social networking revenues are growing by almost 25%, and eCommerce revenues are growing by over 30% – all of which are helping to drive big increases in spending on public cloud infrastructure,” said John Dinsdale, a chief analyst at Synergy.

“We are also now seeing some reasonably strong growth in enterprise data centre infrastructure spending, with the main catalysts being more complex workloads, hybrid cloud requirements, increased server functionality and higher component costs,” Dinsdale added. “We are not seeing much unit volume growth in enterprise, but vendors are benefiting from substantially higher average selling prices.”

These two research reports show that cloud spending is a virtuous circle. Organisations’ increasing demand drives cloud service adoption; the resulting revenue spurs greater capex from vendors in their infrastructure; and that in turn leads to higher data centre hardware and software spending in general.

“Organisations need cloud-related services to get onboarded onto public clouds and to transform their operations as they adopt public cloud services,” added Nag. “As cloud continues to become mainstream within most organisations, technology product managers for cloud-related service offerings will need to focus on delivering solutions that combine experience and execution with hyperscale providers’ offerings.”

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

The cloud migration landscape: Multi-cloud and hybrid battle for supremacy as security remains key

A new study from data virtualisation provider Denodo makes for an interesting snapshot of where cloud migration trends sit today, with security remaining top of mind and hybrid and multi-cloud architectures key.

The findings of the study, which surveyed more than 200 business executives and IT professionals, will not come as a major surprise to regular readers of this publication, though some of the figures may be a little on the low side. 36% of organisations polled said they were currently in the process of migrating workloads to the cloud, with almost 20% saying they were in the advanced stages of implementation.

46% of those polled said they leveraged a hybrid cloud model, with private cloud (33.6%), multi-cloud (32.6%) and public cloud (31.6%) all polling similarly. Cost optimisation (54%) was the most frequently cited motivating factor for multi-cloud, ahead of securing a best-of-breed offering (45%) – echoing a recent Turbonomic study – and avoiding vendor lock-in (38%).

Amazon Web Services (AWS) and Microsoft Azure were the clear one-two when it came to most widely used providers, polling 67% and 60% of the vote respectively. Google Cloud, cited by 26% of respondents, trailed.

When it came to security, more than half (52%) of respondents cited it as a key concern, ahead of managing and tracking cloud spend (44%) and a lack of cloud skills (31%). In one statistic that may have gone against the usual trend, Docker – cited by 31% of those polled – was the most popular container technology ahead of Kubernetes (21%).

Naturally, given its heritage, Denodo advocates data virtualisation – where applications can retrieve and manipulate data without knowing where it is located or how it is formatted – as a tool to help organisations manage the complexity of their multi-cloud workloads.

“While organisations continue to adopt cloud solutions at a fast pace, they soon realise that the migration of critical enterprise information resources is a challenge due to today’s complex, big data landscape,” said Ravi Shankar, Denodo chief marketing officer. “Using data virtualisation, businesses alleviate these pain points by building a data services architecture that allows them to gain the maximum benefits from their data and take advantage of cloud modernisation, analytics and hybrid data fabric.”

You can read the full report here (email required).

AWS makes S3 Glacier Deep Archive available for coldest cloud storage needs

It was promised at last year’s re:Invent, and now it is here: Amazon Web Services (AWS) has announced the general availability of S3 Glacier Deep Archive, aimed at being the lowest cost storage in the cloud.

When the company said it was the cheapest around, it wasn’t kidding: prices start at just $0.00099 per gigabyte per month, or roughly $1 per terabyte per month. This tier, like other cold storage, is aimed at organisations looking to move away from off-site archives or magnetic data tapes for data that must be retained but is only rarely accessed. “You have to be out of your mind to manage your own tape moving forward,” AWS CEO Andy Jassy told re:Invent attendees back in November.
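As a quick sanity check, the per-gigabyte and per-terabyte figures quoted above line up; a back-of-the-envelope sketch (assuming decimal units, i.e. 1 TB = 1,000 GB, which is how the headline figure works out):

```python
# Rough check on the S3 Glacier Deep Archive pricing quoted above.
# Assumes decimal units: 1 TB = 1,000 GB.

PRICE_PER_GB_MONTH = 0.00099  # USD, as quoted in the announcement

def monthly_cost_usd(terabytes: float) -> float:
    """Approximate monthly storage cost for a given number of terabytes."""
    return terabytes * 1000 * PRICE_PER_GB_MONTH

print(monthly_cost_usd(1))     # ~0.99 -- roughly the quoted $1/TB/month
print(monthly_cost_usd(1000))  # ~990 -- a petabyte for under $1,000/month
```

This ignores retrieval and request charges, which for any cold storage tier can dominate the bill if the data is accessed more often than expected.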

“We have customers who have exabytes of storage locked away on tape, who are stuck managing tape infrastructure for the rare event of data retrieval. It’s hard to do and that data is not close to the rest of their data if they want to do analytics and machine learning on it,” said Mai-Lan Tomsen Bukovec, Amazon Web Services VP of S3. “S3 Glacier Deep Archive costs just a dollar per terabyte per month and opens up rarely accessed storage for analysis whenever the business needs it, without having to deal with the infrastructure or logistics of tape access.”

Cold storage is not just the domain of AWS, of course. Google’s Coldline offering was subject to price cuts earlier this month, with the company opting for high levels of availability and low levels of latency as its calling card. Google said at the time that Coldline in multi-regional locations was now geo-redundant, meaning data was protected from regional failure by storing another copy at least 100 miles away in a different region. For comparison, AWS aims to offer eleven nines of durability, with restores completing within 12 hours or less.
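For a sense of scale, a rough sketch of what “eleven nines” implies, assuming the figure is an annual per-object durability probability (which is how AWS frames S3 durability):

```python
# What eleven nines of durability implies, assuming it is an annual
# per-object durability probability.

DURABILITY = 0.99999999999          # eleven nines
annual_loss_prob = 1 - DURABILITY   # ~1e-11 per object per year

# Expected number of objects lost per year from a 10-million-object archive:
objects = 10_000_000
expected_losses = objects * annual_loss_prob
print(expected_losses)  # ~0.0001 -- i.e. roughly one object every 10,000 years
```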

Customers using Glacier Deep Archive, AWS added, include video creation and distribution provider Deluxe, Vodacom, and the Academic Preservation Trust.

Organisations looking to get best of breed and guaranteed app availability with ‘true’ multi-cloud

Multi-cloud is rapidly becoming the de facto deployment model for organisations of all sizes – and a new report from hybrid cloud software provider Turbonomic has found that the vast majority of respondents ‘expect workloads to move freely across clouds.’

The study, the company’s 2019 State of Multicloud Survey, which polled almost 850 respondents across multiple IT functions, found the key drivers for ‘true’ multi-cloud elasticity were a desire to leverage best-of-breed cloud services and guaranteed application availability.

Of those services, Amazon Web Services, cited by 55% of respondents, and Microsoft Azure (52%) held the clear advantage. 45% of those polled said they still used private cloud of some sort, while Google (22%) and IBM (8%) trailed. “Choice is not only critical in terms of the freedom to choose the best services for their business, but it’s also about leverage,” the report explained. “Clouds must compete, which ultimately drives the industry as a whole forward with the innovation that will differentiate their offerings.”

If multi-cloud is the clear medium of choice, then containerisation is not far behind it. Almost two thirds (62%) of those polled said they had begun their cloud-native or container journey, with on average a quarter (26%) of environments currently using containerised applications. Almost a third of containerised apps were seen as mission-critical.

One of the clearest benefits of multi-cloud implementation, the report noted, was around saving time as the move to workload automation intensified. The exploration of artificial intelligence (AI) and machine learning (ML) is one which divides opinion. Will this time saved lead to more productive workforces, or reduced workforces? Naturally, those polled expressed an optimistic view. Nine in 10 said it would either elevate their careers, or have no impact. Almost half (45%) of organisations surveyed said they were adopting AI and ML for application management.

The report’s conclusion focused around a common theme for regular readers of this publication – the evolution of cloud and how emerging technologies and services are augmenting it. “Clouds today are not just infrastructure as a service, but providers of application and business services,” the report noted. “These services are their true differentiation and will increasingly become their competitive advantage. When clouds compete, customers win.

“Culture and complexity are frequently cited as the main obstacles to success,” it added. “How quickly can people – teams of people – adapt to the speed these technologies enable, embrace the new mindsets they necessitate, and manage the dynamic complexity they create? These questions are compelling organisations to value their people as creative problem solvers and innovators more than ever before.”

The report makes for interesting reading when compared to a study released by Turbonomic and Verizon in 2016. Back then, it was more about outlining a business case for multi-cloud and balking at the cost issues. 81% of those polled back then said choosing the right workloads for the right clouds was a problem yet to be solved.

“The move toward hybrid and multi-cloud is well underway,” said Tom Murphy, CMO at Turbonomic. “This move is driven by an acute need for IT modernisation, as IT continues to elevate its value by increasingly driving innovation and new revenue opportunities.

“Containers and cloud-native are central to IT modernisation initiatives, creating a tipping point in complexity,” Murphy added. “Across industries, IT staff are seeking to minimise human-assisted automation, which is why they are increasingly turning toward workload automation.”

You can read the full 2019 State of Multicloud survey here (email required).

How the HR industry has seen cost optimisation with SaaS: Exploring next steps

It is always interesting to see how different industries are seizing the opportunity of cloud adoption as well as coping with its challenges. As far as the HR industry is concerned, a new report has asserted that enterprises have realised the cost savings of software as a service (SaaS) and are now focused on building the next steps.

The study, from Information Services Group (ISG), polled more than 250 companies, ranging in size from 1,000 to more than 20,000 employees, and found a “distinct picture of the typical journey towards maturity in HR technology.”

Approximately half of those polled who had leveraged a SaaS model said they had achieved between 10% and 30% savings in both IT and technology operations and HR administration. 15% said they had achieved savings of 30% or more in both areas. When it came to a specific technology platform – perhaps not surprisingly given the sensitive data departments work with – data security was the key feature, cited by almost three quarters (72%) as a ‘must-have.’ Ease of maintenance (69%), ease of use (66%) and depth of functionality (66%) were also highly cited.

Despite all this however, only two in five (41%) respondents agreed that they had seen measurable business improvements by adopting SaaS. The report argued this was down to a discrepancy between investment and results.

The key finding here was that human capital management (HCM) solutions do not address all of an organisation’s needs. SaaS providers are coming around to this realisation too, the report added, investing in case management software or partnering with suitable providers to plug the gap. The report also advocated extensions through open APIs – Oracle, SAP and Workday are among the vendors that have explored this – as well as re-evaluating metrics.

Part of the solution for enterprises could be to combine HR with other systems, such as ERP. More than two thirds (68%) of organisations polled said they were exploring this, with approximately half actively seeking out a platform or deployment partner.

Ultimately however, the move to what ISG describes as the fourth generation of HR technologies is a way off yet. HR Tech 4.0, as the report put it, would necessitate central governance established by a centre of excellence, full SaaS capabilities and fully automated and optimised processes.

“While organisations are accelerating HR technology capability, they have not yet made similar advances in process, service delivery and self-service technology,” said Stacey Cadigan, partner at ISG HR Technology and report co-author. “To add true business value and solve HR challenges, organisations must combine the use of SaaS with a clear HR technology strategy, optimised processes, an end-to-end experience and change management designed to ensure technology adoption and drive business outcomes.”

You can read the full report here (email required).

Portworx secures $27 million series C funding, unveils update to container management platform

Portworx, a provider of storage and data management software for containers, has announced a $27 million (£20.5m) series C funding round – and touted record revenue and customer growth in the process.

The company’s series C, putting its total funding at $55.5m, was co-led by Sapphire Ventures and the venture arm of Mubadala Investment Company, with new investors Cisco Investments, HPE and NetApp joining Mayfield Fund and GE Ventures in the round.

The company noted its customer base had expanded by more than 100%, with total bookings going up 50% just between the third and fourth quarters of 2018. Among the new customers are HPE, with the infrastructure giant eating its own dog food in the funding stakes by investing after purchasing the Portworx Enterprise Platform.

Portworx also took the opportunity to unveil the latest flavour of its platform, Portworx Enterprise 2.1. New features include enhanced disaster recovery as well as a role-based security option, with organisations able to apply access controls on a per-container-volume basis, integrated with corporate authorisation and authentication.

The opportunity for Portworx and companies of its ilk is a big one. As this publication noted in September, a Portworx study found more than two thirds of companies were ‘making investment’ in containers, a sign the enterprise had caught up. Ronald Sens, director at A10 Networks, noted at the time that while Kubernetes is “clearly becoming the de facto standard”, certain areas, such as application load balancing, are not part of its service.

“Kubernetes alone is not sufficient to handle critical data services that power enterprise applications,” said Murli Thirumale, CEO and co-founder of Portworx in a statement. “Portworx cloud-native storage and data management solutions enable enterprises to run all their applications in containers in production.

“With this investment round the cloud-native industry recognises Portworx and its incredible team as the container storage and data management leader,” Thirumale added. “Our customer-first strategy continues to pay off.”

Alibaba Cloud looks to integrated and intelligent future at Beijing summit

Alibaba Cloud has taken the opportunity provided by its most recent Beijing Summit to elaborate on its near 10-year history – and explore a more integrated and intelligent future.

The company noted that its cloud arm was ‘becoming the main business focus of Alibaba Group’ and that cloud adoption was ‘expected to continue and become more immersive in the traditional sectors across China.’  

“Alibaba has championed cloud computing in China over the past 10 years and has been at the forefront of rapid technology development,” said Jeff Zhang, CTO at Alibaba Group and president of Alibaba Cloud. “In the future, our highly compatible and standards-based platform will allow SaaS partners to onboard easily and thrive.

“The offerings will also be enriched by our continued investment in research through the Damo Academy, [which] will align data science with the development of our products,” Zhang added. “To empower all participants in our ecosystem, we will boost the integrated development of technology, products and services on our open platform.”

The event saw three primary products announced: the SCC-GN6, claimed to be the most powerful super-computing bare metal server instance the company has issued to date; PolarDB, a cloud-native relational database service; and SaaS Accelerator, a platform for partners to build and launch SaaS applications as well as utilise Alibaba’s consultancy.

The past six months have seen a period of expansion for Alibaba Cloud. A new data centre complex in Indonesia was launched in January, while the London site opened its doors in October. The company says its IaaS dominance in China is such that it commands a larger market share than the second to eighth largest players put together.

Synergy Research noted in June that the top five cloud infrastructure players in China were all local companies, while across the whole of Asia Pacific (APAC) Alibaba ranked second, behind AWS. In October, data and analytics firm GlobalData noted how Alibaba was gaining across APAC as a whole, saying it was a ‘force to be reckoned with’ and ‘betting big on emerging markets such as India, Malaysia and Indonesia while competing with others in developed markets.’

Asia Pacific remains a region of vastly differing expectations when it comes to cloud computing, as the Asia Cloud Computing Association (ACCA) found in its most recent Cloud Readiness Index report. Those at the top end, such as Singapore and Hong Kong, have overall rankings – based on connectivity, cybersecurity and data centre infrastructures among others – ahead of the US and UK. India, China and Vietnam meanwhile, the bottom three nations, scored lower than 50%.

China itself, according to the ACCA report, had made progress despite retaining its modest ranking from 2016, but its lowest scoring areas – power sustainability and broadband quality – reflected the issues of nationwide adoption of cloud technologies across such a vast area. The report did note that the Chinese government “continues to devote considerable fiscal resources to the development and improvement of infrastructure… a move that will undoubtedly pay off in the next few years.”

HPE aims to deliver on hybrid cloud consultancy prowess with Right Mix Advisor launch

Hewlett Packard Enterprise (HPE) has been focused on the long road of what it calls the ‘innovative enterprise’ and building up its cloud consultancy with the acquisitions of RedPixie and Cloud Technology Partners. Now, it is ready to put that knowledge to the test.

The company has announced the launch of HPE Right Mix Advisor, which is claimed to be an industry-first product recommending the ‘ideal hybrid cloud mix’ to organisations.

The product is based on more than one thousand hybrid cloud ‘engagements’, as well as automated discovery. In one recent example, nine million IP addresses across six data centres were analysed alongside data from configuration management databases and external cloud vendor pricing models, producing a roadmap for which workloads best fit private and public cloud respectively.

Cloud Technology Partners, acquired in 2017 for its AWS expertise, and RedPixie a year later for Azure, both fall under HPE Pointnext, the company’s services and consultancy unit. Using this experience, HPE claims, an action plan for hybrid cloud can take only weeks as opposed to months. According to the company’s own work, migrating the right workloads can lead to up to a 40% reduction in cost of ownership.

“I like to tell customers there are a thousand things they could be doing – but they need to find the 10 most impactful things they should start on tomorrow morning,” said Erik Vogel, HPE Pointnext global vice president for hybrid cloud in a statement. “HPE Right Mix Advisor helps organisations get the insight and methodology that they need to drive innovation, deliver predictable optimised customer experiences and remain competitive.”

HPE’s interest in hybrid cloud has been well documented. The company’s Discover Madrid event in November saw it unveil the next part of its ‘composable strategy’ – bringing together on-premise hardware, software and cloud into a single server platform. In June, HPE announced it was investing $4 billion in what it calls the intelligent edge: technologies to deliver personalised user experiences and seamless interactions in real time.

As Antonio Neri, HPE president and CEO explained at the time, it’s all about the data – and where you invest in it. “Companies that can distil intelligence from their data – whether in a smart hospital or an autonomous car – will be the ones to lead,” he said. “HPE has been at the forefront of developing technologies and services for the intelligent edge, and with this investment, we are accelerating our ability to drive this growing category for the future.”

AWS will support NVIDIA’s T4 GPUs focusing on intensive machine learning workloads

Amazon Web Services (AWS) will release its latest GPU-equipped instance with support for NVIDIA’s T4 Tensor Core GPUs with a particular focus on machine learning workloads.

The full specifications are not yet confirmed, but so far AWS promises custom Intel CPUs, up to 384 gibibytes (GiB) of memory, up to 1.8 TB of fast local NVMe storage, and up to 100 Gbps of networking capacity.

From NVIDIA’s side, T4 will be supported by Amazon Elastic Container Service for Kubernetes. “Because T4 GPUs are extremely efficient for AI inference, they are well-suited for companies that seek powerful, cost-efficient cloud solutions for deploying machine learning models into production,” Ian Buck, NVIDIA general manager and vice president of accelerated computing wrote in a blog post.

“NVIDIA and AWS have worked together for a long time to help customers run compute-intensive AI workloads in the cloud and create incredible new AI solutions,” added Matt Garman, AWS vice president of compute services in a statement. “With our new T4-based G4 instances, we’re making it even easier and more cost-effective for customers to accelerate their machine learning inference and graphics-intensive applications.”

The announcement came at NVIDIA’s GPU Technology Conference in San Jose alongside various other news. From the AWS stable it was also announced that the NVIDIA Jetson AI platform now supports robotics service AWS RoboMaker, while AI Playground, as reported by sister publication AI News, is an accessible online space for users to become familiar with deep learning experiences.

It’s worth noting that AWS is not the first company to score in this area. In November Google touted itself as the ‘first and only’ major cloud vendor to offer T4 compatibility. In January, the company announced its T4 GPU instances were available in beta across six countries.

AWS’ contribution to Elasticsearch may only further entrench the open source vendor and cloud war

Last week, Amazon Web Services (AWS) announced it was launching an open source value-added distribution for search and analytics engine Elasticsearch. As AWS evangelist Jeff Barr put it, the launch would “help continue to accelerate open source Elasticsearch innovation” with the company “strong believers in and supporters of open source software.”

Yet for industry-watchers and those sympathetic to the open source space, this has been seen as the latest move in a long-running spat between the developers and software vendors on one side, and the cloud behemoths – in particular AWS – on the other. So who is right?

Previous moves in the market have seen a lot of heat thrown in AWS’ direction for, as the open source vendors see it, taking open source code to which they have not contributed and selling software as a service around it. MongoDB, Confluent and Redis Labs were the highest profile companies who changed their licensing to counter this threat, with reactions ranging from understanding through gritted teeth to outright hostility.

In December, Confluent co-founder Jay Kreps outlined the rationale for changing licensing conditions. “The major cloud providers all differ in how they approach open source,” he wrote in a blog post. “Some of these companies partner with the open source companies that offer hosted versions of their system as a service. Others take the open source code, bake it into the cloud offering, and put all their own investments into differentiated proprietary offerings.

“The point is not to moralise about this behaviour – these companies are simply following their commercial interests and acting within the bounds of what the license of the software allows,” Kreps added. “We think the right way to build fundamental infrastructure layers is with open code. As workloads move to the cloud we need a mechanism for preserving that freedom while also enabling a cycle of investment. This is our motivation for the licensing change.”

Redis Labs, when changing its licensing stipulations for the second time last month after developers voiced their concerns, sounded a note of cautious optimism. The company had noted how it was ‘seeing some cloud providers think differently about how they can collaborate with open source vendors.’ Speaking to CloudTech at the time, CEO Ofer Bengal said, AWS aside, “the mood [was] trying to change.”

So whither Amazon’s announcement last week? Several pundits have noted that AWS potentially can’t win in these situations: if the company doesn’t contribute to a particular project, it is stripping the technology away; if it does, it is impacting competitors.

AWS VP cloud architecture strategy Adrian Cockcroft – who said on Twitter the company had “proposed to give back jointly at a significant level and [was] turned down” – noted its official stance. Cockcroft had seemingly given short shrift to the moves Redis et al were making, saying there were examples where maintainers were ‘muddying the waters between the open source community and the proprietary code they create to monetise the open source.’

“At AWS, we believe that maintainers of an open source project have a responsibility to ensure that the primary open source distribution remains open and free of proprietary code so that the community can build on the project freely, and the distribution does not advantage any one company over another,” he wrote.

“When the core open source software is completely open for anyone to use and contribute to, the maintainer (and anyone else) can and should be able to build proprietary software to generate revenue,” Cockcroft added. “However, it should be kept separate from the open source distribution in order not to confuse downstream users, to maintain the ability for anyone to innovate on top of the open source project, and to not create ambiguity in the licensing of the software or restrict access to specific classes of users.”

Shay Banon, CEO of Elastic, does not see it the same way. A day after AWS announced Open Distro for Elasticsearch, Banon wrote a missive with three constant themes: keeping things open, not being distracted by overtures from elsewhere, and maintaining a stellar experience for users. “Our commercial code has been an ‘inspiration’ for others,” Banon wrote. “It has been bluntly copied by various companies, and even found its way back to certain distributions or forks, like the freshly minted Amazon one, sadly, painfully, with critical bugs.

“When companies came to us, seeing our success, and asked for [a] special working relationship in order to collaborate on code, demanding preferential treatment that would place them above our users, we told them no,” Banon added. “This happened numerous times over the years, and only recently again, this time with Amazon. Some have aligned and became wonderful partners to us and the community. Others, sadly, didn’t follow through.”

So what happens from here? Can all parties see eye to eye? Stephen O’Grady, principal analyst at RedMonk and a long-time follower of the space, noted both sides of the argument. “The Amazon and Elastic controversy is the product of a collision of models,” O’Grady wrote. “To Banon and Elastic’s credit, Elasticsearch proved to be an enormously popular piece of software. The permissive license the software was released under enabled that popularity… [but] permissive licenses also enable usage such as Amazon’s.

“The logical response for a cloud supplier to the addition of adverse licensing terms will be, in at least some cases, a fork, which is why a move like Amazon’s… was expected and inevitable,” O’Grady added. “It is also why the question of blame is difficult for non-partisans to assign. Both parties are essentially acting as might be expected given their respective outlooks, capabilities and legal rights.”

Ultimately, this is one which will run and run. As O’Grady explained, the status quo is likely to persist. “It is probable that Amazon’s move here will be the first but not the last,” he added. “As it and other cloud providers attempt to reconcile customer demand with the lack of legal restrictions against creating projects like the Open Distro for Elasticsearch, they are likely to come to the inescapable conclusion that their business interests lie in doing so.

“The incentives and motivations of both parties are clear, and understandable and logical within the context of their respective models,” he added. “Models which are and will continue to be intrinsically at odds even as they’re inextricably linked.”
