All posts by James

Cloudera looks to be a true multi-cloud home and calls out Amazon as its primary competitor

Cloudera posted total revenues of $144.5 million (£108.9m) for its most recent quarter, and while it may have disappointed investors, the company said its ‘enterprise data cloud’ strategy with Hortonworks on board will help turn things around.

Of total revenues for the quarter ending January 31 – up 36% from this time last year – 85% came from subscriptions, with the remainder from services. For the full financial year, revenue was $479.9m, up 28% on the previous year.

Naturally, the major talking point from Cloudera’s most recent quarter was its $5.2 billion merger with Hortonworks, announced back in October. At the beginning of this year, chief marketing officer Mick Hollison told this publication how the companies were seeing threats on two fronts: the proprietary big data vendors, as well as the public cloud behemoths.

Replying to an analyst question on who their main competitor was, Cloudera CEO Tom Reilly was unequivocal. “Who is our number one competitor? It’s Amazon,” he said. “It’s Amazon’s house offerings in the data management and analytic space – and we believe we are well positioned to compete against them.

“Our value proposition is to be an enterprise data cloud company… giving our customers multi-cloud, hybrid cloud is one enduring differentiator,” Reilly added. “And then our capabilities from the edge – our integrated capabilities from the edge to AI – we’re the only company that’s offering that today.”

This was a similar theme Hollison noted in January: the idea that the public cloud providers are never going to be truly multi-cloud. Of course, the big players do occasionally collaborate rather than butting heads – take the machine learning interface Gluon, launched by AWS and Microsoft in 2017 – and there are certain migration paths, but not to the level Cloudera can offer.

The concept of the enterprise data cloud not only includes support for every possible cloud implementation and analytic capability, but also a purely open philosophy spanning storage, compute and integration. Reilly also noted this changing customer demand.

“Enterprises are demanding a modern analytic experience across public, private, hybrid and multi-cloud environments. They want the agility, elasticity and ease of use of cloud infrastructure. But they also want to run analytic workloads wherever they choose, regardless of where their data may reside,” said Reilly. “They want open architectures and the flexibility to move those workloads to different cloud environments, public or private to avoid vendor lock-in.

“In summary, what enterprise customers want is an enterprise data cloud,” he added. “This is a new term for our industry and a new market category we are uniquely positioned to lead.”

Investors and analysts may see this differently – at least in the short-term. Benzinga noted analysts either were at neutral or outperform with unchanged or lowered price targets, while MarketWatch said the first year of the merged entity “looks like it’s going to be an adjustment period all around.”

You can read the full Cloudera fourth quarter and fiscal year results here.

Read more: The Cloudera-Hortonworks $5.2bn merger analysed: Challenges, competition and opportunities

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Google Cloud officially opens Zurich data centre region

Google Cloud has opened the doors to its Zurich data centre region, making it the sixth region for the provider in Europe and nineteenth overall.

The Zurich site, which was first announced in May, will have three availability zones and is available with the standard set of Google products, including Compute Engine, Google Kubernetes Engine, Cloud Bigtable, Cloud Spanner, and BigQuery.

Google is also looking at providing a wider adoption experience through Transfer Appliance, which enables large amounts of data to be transferred to Google Cloud Platform (GCP), as well as private network Cloud Interconnect.

The company’s European footprint appears much more assured, with Zurich joining Belgium, Finland, Frankfurt, London and the Netherlands. Upcoming regions are set to open in Osaka and Jakarta.   

As ever with Google announcements, the company rolled out a series of customers, including Swiss AviationSoftware and University Hospital Balgrist. Perhaps the most interesting of these came from a partner, in this instance Google-specific digital transformation enabler Wabion. “We have customers that are very interested in Google’s innovation who haven’t migrated because of the lack of a Swiss hub,” said Michael Gomez, Wabion co-manager. “The new Zurich region closes this gap, unlocking huge opportunities for Wabion to help customers on their Google Cloud journey.”

Zurich may well have been overdue a cloud data centre region, given it hosts Google’s largest engineering offices outside the US. More than 2,000 ‘Zooglers’ work in the city, with a particular focus on natural language processing and machine intelligence.

You can read the full announcement here.


Intel, Google, Microsoft and more team up for CXL consortium to supercharge data centre performance

Intel, Google and Microsoft are among nine tech giants who have teamed up to launch a new industry group to advance data centre performance.

The group, which also includes Alibaba, Cisco, Dell EMC, Facebook, HPE and Huawei, is looking at solidifying Compute Express Link (CXL), an emerging high-speed CPU-to-device and CPU-to-memory interconnect. The particular focus is on high performance computing (HPC) and artificial intelligence (AI) workloads among others.

The companies confirmed in a statement that they had ratified the CXL Specification 1.0. Built on PCI Express (PCIe) infrastructure, it aims to offer breakneck speeds while supporting an ecosystem that enables even faster performance going forward.

The press materials outlined how CXL worked. “CXL technology maintains memory coherency between the CPU memory space and memory on attached devices, which allows resource sharing for higher performance, reduced software stack complexity, and lower overall system cost,” the companies wrote. “This permits users to simply focus on target workloads as opposed to the redundant memory management hardware in their accelerators.

“CXL was designed to be an industry open standard interface for high-speed communications, as accelerators are increasingly used to complement CPUs in support of emerging applications such as artificial intelligence and machine learning.”
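The coherency idea in the passage above can be loosely illustrated in ordinary Python: copy-based offload forces the host to copy data to and from the accelerator, while a coherent shared buffer lets both sides see the same bytes with no copy step. This is only an analogy for the programming model CXL enables – not CXL code – and every name in it is invented for illustration.

```python
# Copy-based offload: the 'accelerator' works on its own copy of the
# data, so results must be copied back before the host can see them.
def accel_copy_based(data: bytes) -> bytes:
    local = bytearray(data)        # copy in to 'device memory'
    for i in range(len(local)):
        local[i] *= 2
    return bytes(local)            # copy out again

# Coherent-style sharing: host and 'accelerator' operate on one
# buffer, so writes are visible to the host with no copy-back.
def accel_shared(buf: memoryview) -> None:
    for i in range(len(buf)):
        buf[i] *= 2                # in-place; host sees it immediately

host = bytearray([1, 2, 3])
accel_shared(memoryview(host))
# host is now bytearray(b'\x02\x04\x06'), with no explicit copy step
```

The second style is what lets software "simply focus on target workloads" rather than on managing redundant copies of memory.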

The consortium is not stopping there; it is working on CXL Specification 2.0 and is looking for other companies to join, particularly cloud service providers, communications OEMs and system OEMs. “CXL is an important milestone for data-centric computing, and will be a foundational standard for an open, dynamic accelerator ecosystem,” said Jim Pappas, Intel director of technology initiatives.

The initiative is primarily led by Intel – the press materials being sent to this reporter through the Intel UK mailing list were a bit of a giveaway – and while anyone in the three primary categories can join, eagle-eyed readers will have spotted some notable absentees: a couple of leading cloud vendors, as well as the likes of NVIDIA and AMD.

You can find out more about CXL and the group here.


CloudBees, Google and Linux Foundation launch Continuous Delivery Foundation

Meet the Continuous Delivery Foundation (CDF), a new offshoot of the Linux Foundation which will aim to develop, nurture and promote open source projects and best practices around continuous delivery.

The CDF is being led by CloudBees, the stewards of the open source automation server Jenkins, and includes the Jenkins community, Google, and the Linux Foundation itself as collaborators.

Alongside Jenkins, CloudBees wanted to find a home for a newer flavour, Jenkins X, which aims to automate continuous integration and delivery (CI/CD) in the cloud. At its launch this time last year, various drivers were cited, from the rise of higher-performing DevOps teams to the near-ubiquity of Kubernetes. “All of this adds up to an increased demand for teams to have a solution for cloud native CI/CD with lots of automation,” wrote James Strachan, distinguished engineer at CloudBees.

The parallels between the companies engaged in this initiative are evident. Around a week or so earlier, the Cloud Native Computing Foundation (CNCF) announced Kubernetes had ‘graduated’, making it a production-ready technology ‘mature and resilient enough to manage containers at scale across any industry in companies of all sizes’, as CNCF put it at the time. Alongside that, the foundation announced 24 new members – one of which was CloudBees.

“The time has come for a robust, vendor-neutral organisation dedicated to advancing continuous delivery,” said Kohsuke Kawaguchi, creator of Jenkins and CTO at CloudBees. “The CDF represents an opportunity to raise the awareness of CD beyond the technology people.

“For projects like Jenkins and Jenkins X, it represents a whole new level of maturity,” Kawaguchi added. “We look forward to helping the CDF grow the CD ecosystem and foster collaboration between top developers, end users and vendors.”

The two benchmark reports in the DevOps industry come from Puppet and DORA (the DevOps Research and Assessment team), both released in September. Puppet found the vast majority (79%) of organisations polled were bang in the middle when it came to adoption and deployment – only 11% were ‘highly evolved’ – while DORA noted how those at the top were pulling significantly ahead thanks to their advanced cloud infrastructure.

Ultimately, the launch of the CDF will hope to bring further clarification and specification to an already fast-moving space. “As the market has shifted to containerised and cloud-native technologies, the ecosystem of CI/CD systems, DevOps and related tools has radically changed,” said Chris Aniszczyk, VP developer relations at the Linux Foundation. “The number of available tools has increased, and there’s no defining industry specifications around pipelines and other CI/CD tools.

“CloudBees, Google and the other CDF founding members recognise the need for a neutral home for collaboration and integration to solve this problem,” he added.


How ideal DevOps recruitment requires a mix of soft and technical skills

If you want to get ahead in DevOps, then automation and process skills are vital – but don't forget the soft skills either.

That's the key finding from a new report by the DevOps Institute. The study, titled Upskilling, polled more than 1,600 people and found a distinct split between 'must-have' and 'nice-to-have' skill sets. Automation, cited by 57% of respondents as a must-have, beat out process skills (55%) and soft skills (53%). Within the process category, software development lifecycle, cited by 47% of respondents, was of the most interest, ahead of understanding process flow and analysis (46%) and Agile methodologies (42%).

When it came to recruitment, there was an equal balance between those looking for soft skills and those looking for technical skills, both for internal and external hiring. For C-level executives and IT management, business skills – communication, influencing, negotiation and strategic thinking – were considered particularly important. Yet only 23% of individual contributors – compared with 47% of IT management and 43% of the C-suite – said these skills were very important.

In terms of specific adoption rates, the research found 43% of organisations were at project-level adoption, while 19% were enterprise-ready and 15% were only at the planning stage. 55% of respondents said they were 'knowledgeable' about DevOps, compared with 22% who said they were very knowledgeable. 22% said they either had very little knowledge or no familiarity.

It is evident, therefore, that different skills are required. Ultimately, professionals need to build their automation and process know-how, while soft skills need to be looked at in terms of collaboration, problem-solving, and sharing and knowledge transfer.

"We found the majority of leaders within the organisations we surveyed are hiring from within and are willing to develop an individual's abilities and provide opportunities," said analyst Eveline Oehrlich of Forrester Research. "Hiring managers see the DevOps human as a creative, knowledge-sharing, eager to learn individual with their skill sets and abilities being shareable. Our study provides insight into what skills the DevOps human should develop, in order to help drive a mindset and a culture for organisations and individuals."

You can read the full report here (email required).

Read more: Four reasons why your company might not be ready for DevOps just yet


Google Cloud launches new cloud storage plan to give enterprises more scalability options

Google Cloud is looking at helping enterprise organisations avoid bill shock with the launch of a storage growth plan for Google Cloud Storage.

The launch came about after Google noted the ‘astonishing and unpredictable’ rate at which data is created in its storage offerings. This is evidently not aimed at a handful of photos and videos: companies that sign up to the Storage Growth Plan are expected to hand over $10,000 per month for 12 months of Cloud Storage usage.

At the end of the year, customers can commit to the next 12 months at the rate of their peak usage; if that peak is within 30% of their original commitment, the previous year’s overage is free.
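That renewal arithmetic can be sketched in a few lines of Python (the function and parameter names are assumptions for illustration, not Google's actual billing terms):

```python
def renewal_terms(original_commit, peak_usage, tolerance=0.30):
    """Hypothetical sketch of the Storage Growth Plan renewal logic:
    customers renew at the rate of their peak usage, and if that peak
    is within `tolerance` of the original commitment, the previous
    year's overage is waived."""
    next_year_rate = peak_usage
    overage_waived = peak_usage <= original_commit * (1 + tolerance)
    return next_year_rate, overage_waived

# A customer committing $10,000/month who peaks at $12,500/month
# renews at $12,500 and, being within 30%, pays nothing for overage:
rate, waived = renewal_terms(10_000, 12_500)
```

At a $14,000/month peak, by contrast, the 30% threshold would be exceeded and the waiver would not apply.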

“We’ve developed the Storage Growth Plan to help enterprise customers manage storage costs and meet the forecasting and predictability that is often asked of IT organisations,” wrote Geoffrey Noer, Google Cloud product manager, and Darren Strange, product marketing, in a blog post.

“We heard from customers that data growth can be unpredictable, but costs can’t be. We’ve also heard that data can have unpredictable life cycles,” Noer and Strange added. “Consolidating storage into a centrally managed infrastructure resource can make life as a storage architect much easier. But the path to consolidation is fraught with complexity.”

This certainly makes sense. As this publication has frequently noted, different workloads suit different offerings – and some data sets, dormant for years, can suddenly find a new use at short notice; Google gives the example of a legacy image archive which could be reborn as an AI training set. As ever, Google wheeled out a customer already reaping the rewards: in this case pharmaceutical firm Recursion, whose rapidly growing biological image dataset is used to train neural networks.

The move can be seen as another way Google is trying to woo cloudy enterprise customers. In February, CEO Thomas Kurian used his speaking debut, at a Goldman Sachs conference, to announce aggressive sales plans and a stronger enterprise presence for Google. Google Cloud acquired Alooma, an enterprise-focused data pipeline provider, later in the month.

Google Cloud also announced price drops to its cold yet low latency storage offering Coldline. The move comes after the company said Coldline in multi-regional locations was now geo-redundant, meaning data was protected from regional failure thanks to copies stored at least 100 miles away in a different region.
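In practice, data usually reaches Coldline via object lifecycle rules on a bucket. As a hedged sketch, a lifecycle policy of the shape GCS accepts – here transitioning objects to Coldline after 90 days, an arbitrary example threshold rather than a Google recommendation – looks like this:

```python
import json

# Illustrative GCS-style lifecycle policy: objects older than 90 days
# are moved to the COLDLINE storage class.
lifecycle_policy = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 90},
        }
    ]
}

print(json.dumps(lifecycle_policy, indent=2))
```

A policy like this would typically be applied to a bucket with `gsutil lifecycle set`.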

You can read the full blog post here.


VMware posts strong 2019 financial results citing AWS partnership and ‘tech breaking out of tech’

VMware has closed off its 2019 fiscal year with record annual revenues, solid upticks across the board and considerable strength in its partnership with Amazon Web Services (AWS).

Revenue was $2.59 billion (£1.96bn) for the quarter, up 16% on this time last year, while for the fiscal year it was $8.97bn, a yearly increase of 14%. License revenues came in at $1.2bn, or 47.5% of overall quarterly revenues, compared with 52.5% for service revenues. For the fiscal year, services comprised 57% of total revenues, a slight dip from the previous year’s 59%.

“We were very pleased with this terrific quarter and fiscal 2019,” said VMware CEO Pat Gelsinger in an earnings call following publication of the results. “At VMware, we believe that software has the power to transform business and humanity. We understand that our products, operations and people collectively have an impact in the world, and we strive to generate positive global impact through all that we do.

“Customers look to VMware for solutions across hybrid cloud, multi-cloud, modern apps, networking and security and digital workspace to help enable their digital transformations,” Gelsinger added. “We demonstrated good Q4 results across our hybrid cloud and SaaS portfolio as we strategically focused on expanding that offering.

“We’ve remained committed to growing this business as customers continue to turn to us for the best solutions that span private and public clouds.”

With that statement in mind, one of the key highlights of the quarter focused around VMware’s continued partnership with AWS. The companies keep cropping up in each other’s events; AWS CEO Andy Jassy took to the stage at VMworld in Las Vegas back in August, while Gelsinger returned the favour at re:Invent in November to all but bring the house down with the announcement of AWS Outposts.

Outposts, AWS said at the time, aims to deliver a ‘truly consistent hybrid experience’ by bringing AWS services, infrastructure and operating models to ‘virtually any’ on-premises facility. VMware’s partnership is a key part of making this happen. According to RightScale’s 2019 State of the Cloud report, issued earlier this week, 12% of organisations polled said they were using Outposts right out of the gate, with a further 29% interested in deploying.

Gelsinger said the most recent quarter saw a $20 million deal brokered with VMware Cloud on AWS, with new customers including Freddie Mac, Nant Media Holdings and the United States Air Force Field Enterprise Data Center.

The other major news VMware issued over the past three months was the planned acquisition of Kubernetes provider Heptio. At the time, Heptio co-founder Craig McLuckie said the two companies’ visions were ‘uncanny’, with VMware seeing the deal as an opportunity to build a cloud-independent Kubernetes control plane for customers. “We will accelerate efforts to make Kubernetes the standard for customers building and running their applications across clouds, and continue to drive the open source community’s development of this critical platform,” added Gelsinger.

Gelsinger described the current state of the industry as ‘tech breaking out of tech’, in response to an analyst question around divergence between various infrastructure providers. This is a theme this publication has covered frequently, both in the rise of multi-cloud projects organisations are taking on and in the next wave of cloud services, whether serverless and containers, or quantum and machine learning.

“There’s going to be winners and losers,” Gelsinger explained. “We’ll continue to see lots of questions on cloud, private and public cloud, and how hybrid cloud transitions. We clearly are going to see these normal cycles of over[supply] and undersupply as people are building up rapidly in different geos. We’re no longer dependent on any geo or any individual product, but the real breadth of our portfolio is nicely rewarding us across the broad landscape of, we believe, a good tech market [that] is going to continue well into the future.

“There will be winners and losers inside of that because there is so much change going on in the marketplace with these powerful trends,” Gelsinger added. “I’ve talked about the superpowers: cloud, mobility, AI and edge and IoT, and all of those will have different effects of who’s going to be the winners and losers inside of it.”

You can read the full VMware fourth quarter and fiscal year 2019 results here.


RightScale State of the Cloud 2019: Azure gains again, cost optimisation key, PaaS explodes

Microsoft Azure continues to eat into Amazon Web Services’ (AWS) dominance in the enterprise market, while managing cloud spend and governance continues to be the primary concern, according to the 2019 RightScale State of the Cloud report.

The study, a yearly benchmark of cloud adoption which polled almost 800 executives with a relatively even spread between enterprise and SMB respondents, found cloud cost management was the primary concern for the third consecutive year. In the enterprise sector, optimising costs (84% this year, 80% in 2018) and governance (84% this year, 77% in 2018) are notably on the rise.

The study noted how organisations may be wasting more than even they expect on their services – hence the need for optimisation. Survey respondents estimated they wasted 27% of their cloud spend this year, yet Flexera – which bought RightScale last year – assesses the figure to be nearer 35%.

It’s fair to say that using the biggest cloud vendors can be a complex experience, given the sheer number of features available. Yet organisations are not helping themselves: only a handful of companies polled use automated policies to shut down unused workloads or rightsize instances. Indeed, less than half (47%) of AWS users are aware of and utilise AWS Reserved Instances, while take-up of Azure Reserved Instances (23%) is lower still.
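The automated shutdown policies the report mentions need not be sophisticated to pay off. A minimal sketch of the idea – every field, name and threshold here is hypothetical – might look like:

```python
from dataclasses import dataclass

@dataclass
class Instance:
    name: str
    avg_cpu_percent: float   # trailing-week average utilisation
    production: bool

def idle_candidates(instances, cpu_threshold=5.0):
    """Flag non-production instances whose utilisation suggests they
    are sitting idle and could be shut down or rightsized."""
    return [
        inst.name for inst in instances
        if not inst.production and inst.avg_cpu_percent < cpu_threshold
    ]

fleet = [
    Instance("build-agent-7", 1.2, production=False),
    Instance("checkout-api", 63.0, production=True),
    Instance("demo-env", 0.4, production=False),
]
idle = idle_candidates(fleet)   # → ['build-agent-7', 'demo-env']
```

Real policies would draw utilisation from the provider's monitoring APIs, but even a crude check like this catches the forgotten demo environments that drive much of the waste.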

Exploring the cloud behemoths, Azure adoption grew from 45% to 52% year on year overall, with Azure’s adoption now at 85% of AWS’ – up from 70% the year before. In enterprise-specific figures, Azure has risen to 60% while AWS remains flat at 67%. Google remains clear in third position. VMware Cloud on AWS saw growth of 50% across the board, while all other providers surveyed – including Oracle, IBM and Alibaba – saw enterprise gains.

One of the key areas where organisations are becoming increasingly comfortable is emerging platforms. Gartner has already noted this week how the industry is almost at the tipping point where platform as a service (PaaS) offerings become cloud-dominated, and this is reflected in the RightScale report. Serverless saw 50% growth year on year, with 36% of overall respondents using it, while machine learning, containers as a service, and IoT are also growing quickly.

Containers, meanwhile – and Kubernetes in particular – are seeing particularly strong adoption. 57% of respondents say they regularly use Docker, while Kubernetes has 48% adoption, almost double the previous year’s 27%. For enterprises, Docker (66%) and Kubernetes (60%) are even further entrenched. Of the big clouds’ container offerings, Azure Container Service saw the most notable growth, though AWS (44%) remains ahead of Azure (28%) and Google (15%).

As previously, the overriding theme – with so many sectors to cover – is one of hybrid IT and multi-cloud: making the most of your stack and finding the right avenues for particular workloads. If anything, the release of AWS Outposts – with the nod to VMware’s rise already seen in the report – helped truly legitimise, and normalise, this thinking. The report noted how private cloud growth was there, if a little slow; 12% of those polled are already using Outposts out of the gate, with a further 29% interested for the future. VMware vSphere, at a flat 50%, remains the primary tool.

Ultimately, these figures should make for solid reading across the industry. Yet Flexera and RightScale will perhaps be keener than most. As this publication put it when the acquisition was announced in October, the two companies’ proposed marriage, around Flexera’s IT and software asset management (SAM) portfolio, and RightScale’s cloud complexity problem solving, should be a happy one.

“The data is consistent with what we are hearing from our C-level customers: managing the rapid increase in cloud use requires new capabilities for cost optimisation and IT governance,” said Jim Ryan, CEO of Flexera. “With multi-cloud as the strategy of choice, most enterprises are already spending over $1m a year in public cloud. As a result, optimising costs is the top cloud priority for the third year in a row, and governance is the top challenge.”

You can read the full report here (email required).

Picture credits: RightScale, used under CC BY


Gartner says tipping point in cloud PaaS is almost complete – with $20bn market revenue in 2019

An interesting if brief note from the analysts at Gartner this week: according to their forecasts, almost half of today’s platform as a service (PaaS) service offerings are cloud-only, with a $20 billion (£15.02bn) market revenue by the end of this year.

The analyst firm’s landscape details more than 550 PaaS offerings from 360 vendors across 21 market segments; 48% of these offerings are cloud-only, and 90% operate within a single PaaS market segment. The overall market is forecast to grow from $20bn this year to $34 billion by 2022.

These figures make for interesting reading when looking through the record books. As far back as 2012, Gartner said PaaS market revenue would hit almost $3bn by 2016. Things have accelerated since then, albeit as part of a wider market acceleration: Gartner figures from April last year predicted the overall public cloud market would pass the $300bn mark by 2021, with a whopping 21% growth in 2018 alone. PaaS, however, comprised only 8% of the total public cloud market.

Naturally, what the PaaS market comprises is itself changing. In November 2013, Laurent Lachal, then a senior cloud computing analyst at Ovum, said the market would ‘remain confused’ as PaaS evolved over the coming two to five years. “PaaS offerings will mature and expand the depth and breadth of their features,” he wrote at the time. “For example, as part of the expansion of the scope of their ecosystem services, in the next two years PaaS offerings will increasingly provide not only business-level services but also application-level ecosystem services.”

According to Gartner’s latest analysis, the newest abstractions for platform services and applications are blockchain, digital experience, serverless, and artificial intelligence and machine learning.

“Although many organisations anticipate a long-term retention of on-premises computing, the vendors of nearly half of the cloud platform offerings bet on the prevailing growth of cloud deployments and chose the more modern and more efficient cloud-only delivery of their capabilities,” said Yefim Natis, research vice president and distinguished analyst at Gartner.

“Cloud computing is one of the key disruptive forces in IT markets that is getting mainstream trust.”


Red Hat: On bridging between the first wave of cloud and next generation platforms

MWC19 For Red Hat, it may sometimes be easier to list what the company doesn't do rather than what it does. The overall umbrella of 'making open source technologies for the enterprise' can range from containers, to cloud, to 5G. But ultimately, as the company has noted at MWC Barcelona this week, it's all developing into a hybrid universe – and it's a space where their customers and partners feel increasingly comfortable.

This is a message of which regular followers of the company – particularly since the acquisition by IBM – will be aware. Take the quotes issued at the time of the original announcement in October. IBM chief executive Ginni Rometty described it as 'the next chapter of the cloud… requir[ing] shifting business applications to hybrid cloud, extracting more data and optimising every part of the business, from supply chains to sales'.

Lo and behold, a similar message came forth last week, when Rometty keynoted IBM's annual Think conference in San Francisco. "I've often said we're entering chapter two – it's cloud and it's hybrid," she told delegates. "In chapter one, 20% of your work has moved to the cloud, and it has mostly been driven by customer-facing apps, new apps being put in, or maybe some inexpensive compute. But the next 80%… is the core of your business."

For Ashesh Badani, VP and general manager of the cloud business unit at Red Hat, it's a fair position to take. "The press around IBM acquiring Red Hat focused on us becoming a leader in hybrid cloud – a lot of the work we're doing in the cloud business is essential to some of that future direction that we expect to go in," he told CloudTech.

"The goal of the cloud business is to help customers with their journey to the cloud," Badani added. "Our firm belief is that, in as much as people talk about a revolution happening, most enterprises have decades of investment in existing assets, skills, as well as application services.

"How can we ensure that we move that set of technologies and leverage the skill our customers have to move towards what I'll call the next generation platform? Being able to bridge both of those worlds is what we're focused on at the moment."

This focus is around concepts such as cloud-native development, microservices-based architectures, and DevOps. But it may be prudent to take a step back for now: Badani sees customers in various traditionally slow-moving industries taking the plunge. As many companies have been realising – and as this publication put it earlier this week with one eye on AWS' release of Outposts last year – different services suit different workloads.

Red Hat sees it as 'footprints' – physical/bare metal, virtualised, private cloud and public cloud. The goal is to have these different workloads – for instance, mission critical workloads in virtualised environments, performance sensitive workloads in bare metal, compliance-sensitive in private cloud and test workloads in public cloud – but an overarching control plane to take care of it. "That abstraction, that commonality, is what we're looking to build," said Badani. "Whether you run OpenShift on bare metal, OpenShift on OpenStack, OpenShift on Amazon, Google, Azure – the interfaces that you're writing to, the application that you build will be ported across."
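The 'footprints' idea reduces to a single placement policy over many deployment targets. A hypothetical sketch – not Red Hat's actual scheduling logic, with made-up names throughout – of how such a control plane might route workloads:

```python
# Hypothetical placement policy over the four 'footprints' in the text.
FOOTPRINTS = ("bare_metal", "virtualised", "private_cloud", "public_cloud")

def place(workload: dict) -> str:
    """Route a workload to a footprint following the examples above:
    performance-sensitive to bare metal, mission-critical to
    virtualised environments, compliance-sensitive to private cloud,
    and everything else (e.g. test workloads) to public cloud."""
    if workload.get("performance_sensitive"):
        return "bare_metal"
    if workload.get("mission_critical"):
        return "virtualised"
    if workload.get("compliance_sensitive"):
        return "private_cloud"
    return "public_cloud"

# A compliance-sensitive workload lands in the private cloud:
target = place({"compliance_sensitive": True})   # → 'private_cloud'
```

The point of the abstraction is that the application never sees which branch was taken – the same interfaces hold whether the workload lands on bare metal or on Amazon, Google or Azure.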

How much of this is a technological challenge and how much is an organisational one? "Inevitably, you find that you can over time get there [technically], work with partners, parties that augment you. The one that oftentimes is harder is the cultural challenge," said Badani. "Just changing that mindset takes time. We work with partners, system integrators, or have our own practices around things like open innovation, to help companies transform and have smaller agile teams put in place."

The world of open source is itself changing. Earlier this month, Redis Labs announced it would be changing the licensing terms of its modules again. The rationale in the first instance – and indeed the second – was clear: stop the big cloud providers making profits from the technology without contributing back to it. In the case of Redis, however, the initial change confused and antagonised some developers.

Other companies have done similar things, from Confluent – whose co-founder said at the time the big cloud vendors shouldn't be judged, as the then-current licensing terms enabled their behaviour – to MongoDB. Badani saw the reasoning behind this.

"I cast no judgement on one side or the other," he said. "My expectation fully is that companies like Amazon, who want to work with open source-based technologies and communities, increasingly need to find ways to build bridges. You see them already doing that, but we just have to be careful not to overreact."

For Red Hat as 2019 carries on, it's a good theme to sum up their strategy – building bridges, building up and out and keeping an eye on all bases as enterprise cloud workloads become increasingly complex.

Picture credit: "Das Gesicht der Hoffnung", by Kai C. Schwarzer, used under CC BY-NC-ND 2.0
