All posts by James

Red Hat: On bridging between the first wave of cloud and next generation platforms

MWC19 For Red Hat, it may sometimes be easier to list what the company doesn't do rather than what it does. The overall umbrella of 'making open source technologies for the enterprise' can range from containers, to cloud, to 5G. But ultimately, as the company has noted at MWC Barcelona this week, it's all developing into a hybrid universe – and it's a space where its customers and partners feel increasingly comfortable.

This is a message of which regular followers of the company – particularly since the acquisition by IBM – will be aware. Take the quotes issued at the time of the original announcement in October. IBM chief executive Ginni Rometty described it as "the next chapter of the cloud… requir[ing] shifting business applications to hybrid cloud, extracting more data and optimising every part of the business, from supply chains to sales."

Lo and behold, a similar message came forth last week, when Rometty keynoted IBM's annual Think conference in San Francisco. "I've often said we're entering chapter two – it's cloud and it's hybrid," she told delegates. "In chapter one, 20% of your work has moved to the cloud, and it has mostly been driven by customer-facing apps, new apps being put in, or maybe some inexpensive compute. But the next 80%… is the core of your business."

For Ashesh Badani, VP and general manager of the cloud business unit at Red Hat, it's a fair position to take. "The press around IBM acquiring Red Hat and we focused on becoming a leader in hybrid cloud – a lot of work we're doing in the cloud business is essential to some of that future direction that we expect to go in," he told CloudTech.

"The goal of the cloud business is to help customers with their journey to the cloud," Badani added. "Our firm belief is that, in as much as people talk about a revolution happening, most enterprises have decades of investment in existing assets, skills, as well as application services.

"How can we ensure that we move that set of technologies and leverage the skill our customers have to move towards what I'll call the next generation platform? Being able to bridge both of those worlds is what we're focused on at the moment."

This focus is around such concepts as cloud-native development, microservices-based architectures, and DevOps. But it may be prudent to take a step back for now. Badani sees customers in various traditionally slow-moving industries taking the plunge. As many companies have been realising – and as this publication put it earlier this week with one eye on AWS' release of Outposts last year – different services suit different workloads.

Red Hat sees these as 'footprints' – physical/bare metal, virtualised, private cloud and public cloud. The goal is to run different workloads in the footprint that suits them – for instance, mission-critical workloads in virtualised environments, performance-sensitive workloads on bare metal, compliance-sensitive workloads in private cloud and test workloads in public cloud – with an overarching control plane to take care of it all. "That abstraction, that commonality, is what we're looking to build," said Badani. "Whether you run OpenShift on bare metal, OpenShift on OpenStack, OpenShift on Amazon, Google, Azure – the interfaces that you're writing to, the application that you build will be ported across."

How much of this is a technological challenge and how much is an organisational one? "Inevitably, you find that you can over time get there [technically], work with partners, parties that augment you. The one that oftentimes is harder is the cultural challenge," said Badani. "Just changing that mindset takes time. We work with partners, system integrators, or have our own practices around things like open innovation, to help companies transform and have smaller agile teams put in place."

The world of open source is also changing. Earlier this month, Redis Labs announced it would be changing the licensing terms of its modules again. The rationale in the first instance – and indeed the second – was clear: stop the big cloud providers making profits from its technology without previously contributing to it. However, in the case of Redis, the initial change confused and antagonised some developers.

Other companies have done similar things, from Confluent – whose co-founder said at the time the big cloud vendors shouldn't be judged, as the then-current licensing terms enabled it – to MongoDB. Badani saw the reasoning behind this.

"I cast no judgement on one side or the other," he said. "My expectation fully is that companies like Amazon, who want to work with open source-based technologies and communities, increasingly need to find ways to build bridges. You see them already doing that, but we just have to be careful not to overreact."

For Red Hat as 2019 carries on, it's a good theme to sum up its strategy – building bridges, building up and out, and keeping an eye on all bases as enterprise cloud workloads become increasingly complex.

Picture credit: "Das Gesicht der Hoffnung", by Kai C. Schwarzer, used under CC BY-NC-ND 2.0

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Check Point exposes yet more shared responsibility misunderstandings for cloud security

Almost one in five organisations polled by cybersecurity solutions provider Check Point Software say they have been the victim of a cloud security incident over the past year, while more than a quarter still believe security is the responsibility of the cloud provider.

These and other worrying findings appear in Check Point's latest study. The 2019 Security Report – the third instalment of the series, which combines threat data with survey responses from IT professionals and C-level executives – also found that more than half (59%) of IT respondents polled did not use mobile threat defences.

The report pulls no punches in its analysis. The first section, titled 'cloud is your weakest link', explores how cloud services are vulnerable across three main attack vectors: account hijacking, malware delivery, and data leaks. Citing a study last year from Dome9 – acquired by Check Point last year – which found 91% of organisations were concerned about cloud security, the report notes how exposure and default security settings remain an issue.

“65% of IT professionals still underestimate the damage they can cause,” the report explained. “The obvious concern is that organisations are not taking cloud security seriously enough. The breach of sensitive data held in the cloud is a huge risk for an organisation, and threat actors know it. The rate of cyber attacks against cloud-based targets is growing, and with little sign it will slow down.”

The statistic causing major concern is the three in 10 respondents who affirmed security was primarily the responsibility of the cloud service provider. This, as the report noted, 'negates recommendations' over shared, or mutual, responsibility.

This is a viewpoint which persists even though cloud providers have tried to remove some of the burden themselves. In November, Amazon Web Services (AWS) launched Amazon S3 Block Public Access, which aims to secure S3 buckets at the account level, on individual buckets, and on buckets created in future.

The move was to ensure users handled public buckets and objects ‘as needed while giving tools to make sure [users] don’t make them publicly accessible due to a simple mistake or misunderstanding’, in the words of AWS chief evangelist Jeff Barr at the time. Previously, AWS had revamped its design to include bright orange warning indicators to signify which buckets were public.
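In practice, S3 Block Public Access boils down to four boolean settings that can be applied account-wide. A minimal sketch of what switching them all on might look like (assuming boto3 and valid AWS credentials; the helper name is illustrative, not from AWS documentation):

```python
# The four settings S3 Block Public Access controls; setting all of them
# to True blocks both new and existing public ACLs and bucket policies.
BLOCK_ALL_PUBLIC_ACCESS = {
    "BlockPublicAcls": True,        # reject new public ACLs
    "IgnorePublicAcls": True,       # ignore any existing public ACLs
    "BlockPublicPolicy": True,      # reject new public bucket policies
    "RestrictPublicBuckets": True,  # restrict access via public bucket policies
}

def block_account_public_access(account_id: str) -> None:
    """Hypothetical helper: apply the block to every bucket in an account."""
    import boto3  # requires AWS credentials; not executed in this sketch
    s3control = boto3.client("s3control")
    s3control.put_public_access_block(
        AccountId=account_id,
        PublicAccessBlockConfiguration=BLOCK_ALL_PUBLIC_ACCESS,
    )
```

Applied at the account level, this overrides whatever a 'simple mistake or misunderstanding' does to an individual bucket – which is precisely the failure mode the report highlights.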

“As nearly 20% of organisations have experienced a cloud incident in the past year, it’s clear that criminals are looking to exploit these security gaps,” said Zohar Alon, head of the cloud product line at Check Point. “By reviewing and highlighting these developments in the report, organisations can get a better understanding of the threats they face, and how they prevent them impacting on their business.”

You can read the full report here (email required).


Redis Labs further changes licensing terms – to make developers happy and keep big cloud vendors at bay

Open source database provider Redis Labs has announced the Redis Source Available License (RSAL), a modification of the previous licensing terms for its modules which aims to clarify the company's position with the open source and developer communities.

The company had changed its terms in August to Apache 2.0 modified with Commons Clause, with more than one eye on the biggest cloud providers, who were packaging Redis technology into proprietary offerings and pocketing the resulting profits.

This was a move followed towards the end of last year by similar companies, such as MongoDB and Confluent. Writing at the time of the latter's $2.5 billion valuation – following a $125m series D funding round in January, as this publication reported – Confluent co-founder Jay Kreps outlined his company's position.

“The major cloud providers all differ in how they approach open source,” Kreps wrote in a blog post back in December. “Some of these companies partner with the open source companies that offer hosted versions of their system as a service. Others take the open source code, bake it into the cloud offering and put all their own investments into differentiated proprietary offerings.

“The point is not to moralise about this behaviour; these companies are simply following their commercial interests and acting within the bounds of what the license of the software allows,” Kreps added. “But we think the right way to build fundamental infrastructure layers is with open code.”

Hence the need to tighten things up. Yet the problem Redis Labs found was that the previous terms for its modules – the Redis database project itself remains unchanged – were too open to interpretation, or simply too confusing. Previously, under Apache 2.0 modified with Commons Clause, the rule was that users were not allowed to sell a product or service 'whose value derives entirely, or substantially, from the functionality of the software.' But as Redis subsequently noted, how substantial is 'substantially', exactly?

The new solution under RSAL is to communicate more clearly that developers can use the software, modify the source code, integrate it with an application, and use, distribute or sell that application. The only restriction is that the application cannot be a database, a caching engine, a stream processing engine, a search engine, an indexing engine, or a machine learning, deep learning, or artificial intelligence-serving engine.

“We are very open to our community,” Ofer Bengal, Redis Labs CEO, told CloudTech. “We got a lot of feedback and responses regarding Commons Clause which made us think there may be a better definition of license for our case.

“When we said [users were] not allowed to sell a product or service… this created concerns with some developers providing services around open source projects, like consulting services and support services,” Bengal added. “In order to get adoption you need to satisfy the needs of developers, and once we heard after we released Commons Clause that some developers weren’t happy – not with the concept but with the way it was presented and copyrighted, the language of the license – that was the point where we thought that we should correct it.

“We hope that once doing that developers would be happier and more receptive to using software under these licenses.”

For some users, however, that ship may have already sailed. In the aftermath of Redis’ original licensing changes, offshoot groups developed, in particular GoodFORM (Free and Open Redis Modules). Led by developers at Debian and Fedora, GoodFORM set out to fork Redis’ code ‘committed to making [it] available under an open source license permanently’ amid fears they were unable to ship Redis’ versions of affected modules to their users.

Bengal’s response to these projects was unequivocal. “With all due respect, they should wake up and smell the coffee,” he said. “They don’t realise that the world has changed and the exact concept of open source is challenging in today’s environment.

“What they have done is just to counter what we have done. They forked the Redis modules that we had at the time, but this means nothing because they have done nothing with it, and I suspect that they cannot do anything with it,” Bengal added. “You must realise that developing a database is a very complex thing, it’s not a small piece of software that someone can develop from his parents’ home garage. There are tons of nuances and complexities, and if you do not devote yourself 24/7 for years to develop a database there is no way you can really contribute to it.”

It has been a busy few days all told for Redis, with the announcement of $60m in a series E funding round confirmed earlier this week. The round, which was led by new investor Francisco Partners and also featured existing investors Goldman Sachs Private Capital Investing, Bain Capital Ventures, Viola Ventures and Dell Technologies Capital, is a particularly important one according to Bengal.

“We are now at the stage where we’re seeing that our opportunity is huge,” he said. “The race over market share as the market matures becomes fiercer and fiercer, and in order to have foothold and market share you need to move very quickly and aggressively.

“Compared to our peers, we decided that in order to move faster and accelerate our growth we need to be more aggressive on the sales side, marketing side, and even on the product development side,” Bengal added.

With regards to the cloud behemoths, there may be some light at the end of the tunnel. In a blog post explaining Redis' latest modules license changes, co-founder and CTO Yiftach Shoolman noted that the company was "seeing some cloud providers think differently about how they can collaborate with open source vendors." Bengal added that, Amazon Web Services (AWS) aside, 'the mood is trying to change', suggesting that partnerships between some cloud providers and the companies behind open source projects may not be too far away.

You can read the full explanation of Redis Source Available License (RSAL) here.

Read more: Confluent's $2.5 billion valuation may provide affirmation amid open source turbulence


Google Cloud acquires Alooma to bolster enterprise data migration capabilities

Google Cloud chief executive Thomas Kurian promised an aggressive approach to enterprise cloud strategy going forward – and the company has immediately put its money where its mouth is with the acquisition of California and Tel Aviv-based startup Alooma.

Alooma aims to solve a key problem for enterprise organisations in their move to the cloud by offering a single data pipeline able to crunch data from various sources – from Snowflake, to Google's own BigQuery, to Azure and Amazon Redshift – to provide real-time insights.
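The core idea behind such a pipeline is normalising records from heterogeneous sources into one common schema before loading them anywhere. A toy illustration of that step (not Alooma's actual code; the source names and field mappings are invented for the example):

```python
def normalise(source: str, record: dict) -> dict:
    """Map a source-specific record onto a common event schema."""
    # Hypothetical field mappings for two imaginary sources.
    field_map = {
        "ads": {"clicked_at": "timestamp", "cost": "value"},
        "crm": {"updated": "timestamp", "deal_size": "value"},
    }
    out = {"source": source}
    for src_field, common_field in field_map[source].items():
        out[common_field] = record[src_field]
    return out

def pipeline(batches):
    """Flatten (source, records) batches into one unified event stream."""
    for source, records in batches:
        for record in records:
            yield normalise(source, record)
```

Once every record shares one schema, a single destination – a warehouse, a dashboard, a model – can consume all of them, which is what makes the 'single pipeline' pitch attractive to enterprises juggling dozens of data sources.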

The company, which had received around $15 million across three funding rounds during its tenure, had previously been a long-term partner of Google with several native integrations, from Google Ads and Analytics to database service Cloud Spanner, not to mention BigQuery. Its roster of customers includes OkCupid, Sony, and The New York Times, which already uses Google App Engine for its gaming platform, having moved from Amazon Web Services (AWS) in 2017.

In a letter to Alooma’s customers and partners, published on the company’s blog, founders Yoni Broyde and Yair Weinberger noted the evolutionary nature of the acquisition. “The journey is not over,” Broyde and Weinberger wrote. “Alooma has always aimed to provide the simplest and most efficient path toward standardising enterprise data from every source and transforming it into actionable intelligence.

“Joining Google Cloud will bring us one step closer to delivering a full self-service database migration experience bolstered by the power of their cloud technology, including analytics, security, AI, and machine learning,” they added.

From Google's perspective, the acquisition focuses on three primary areas: the need for open source, the continued enterprise push, and bolstering its presence in Israel. Writing in a blog post, VP of engineering Amit Ganesh and director of product management Dominic Preuss noted parallels with Google's acquisition of cloud migration provider Velostrata last year, which ticked all three boxes.

Earlier this month, Kurian told delegates at a Goldman Sachs conference of his vision for the company and how its cloud offering differs from the likes of AWS and Azure. These were, in order, security and reliability for mission critical applications; hybrid and multi-cloud; ‘very advanced’ AI solutions; ‘vastly different’ capabilities for managing data at scale; and ‘integrating a number of Google’s technology advances with cloud to deliver industry solutions.’

The proposed acquisition of Alooma certainly focuses on managing data at scale, as well as promised initiatives around AI and machine learning – so watch this space.

Financial terms of the acquisition were not disclosed.


Exploring a data-centric approach to data privacy as cloud workloads proliferate

If your organisation, like many others, is putting more and more data into the cloud, you will already know that it’s probably making your security team have kittens. Greater amounts of data being transported in real-time – not to mention the vastly increased number of mobile devices and attack vectors – means the chances for catastrophe have proliferated.

A new study from data protection provider Virtru has looked at the steps towards a 'data-centric' approach to data protection and privacy. The report – conducted by Forrester Research, which polled more than 200 director-, VP- and C-suite-level employees across security, risk and IT – argues that organisations' current IT priorities conflict, and that data protection is not high on the list.

Almost half (46%) of those polled said that they were adopting a data-centric approach to data protection because they were putting more and more business data into the cloud. The same number said they were particularly concerned around protecting data from cybercriminals, as well as insider theft and abuse.

When it came to the primary capabilities organisations needed to execute data-centric protection, 85% of respondents said enforcing access control was either critical or very important. Encrypting data stored in cloud drives (79%), as well as encrypting data in motion and at rest within the enterprise (79%), were also highly cited.
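Enforcing access control at the data layer amounts, in principle, to checking a caller's entitlements before any record is touched. A minimal sketch of one common pattern (the roles, permissions and function names are illustrative, not taken from the report):

```python
import functools

# Hypothetical role entitlements for a data-centric access check.
ENTITLEMENTS = {
    "analyst": {"read"},
    "admin": {"read", "write"},
}

class AccessDenied(Exception):
    """Raised when a caller lacks the required permission."""

def requires(permission):
    """Decorator that refuses to run a data operation without the permission."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(role, *args, **kwargs):
            if permission not in ENTITLEMENTS.get(role, set()):
                raise AccessDenied(f"{role!r} lacks {permission!r}")
            return func(role, *args, **kwargs)
        return wrapper
    return decorator

@requires("write")
def update_record(role, record_id, value):
    # Stand-in for a real datastore write.
    return {"id": record_id, "value": value}
```

The appeal of putting the check on the data operation itself, rather than at the network edge, is that the policy travels with the data – which is the essence of the 'data-centric' approach the respondents describe.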

The key issue in putting this approach into practice is prioritisation. For those polled, the key aim this year is to deliver IT projects more quickly (45%). 41% said a major aim was to better comply with privacy regulations, while shifting resources to improve the customer experience (37%) and increasing the business' role in defining the priorities of IT investments (35%) were also cited.

Naturally, there is an impasse between those who prioritise greater productivity in the organisation and those who prioritise greater security. 39% said they feared data privacy controls would hinder productivity, while a third (34%) said their companies lacked staff with sufficient data privacy expertise. 30% said there was confusion around the differences between data privacy and security.

Yet there are a multitude of benefits to a data-centric protection approach. Almost half (49%) of those polled said the move would improve their organisation’s ability to meet regulatory requirements, while reduction of data theft (47%) and lowered risk of data loss (47%) were also key.

“As IT organisations seek to find ways to deliver on their initiatives more quickly and with a greater focus on regulatory compliance, many struggle to keep these two objectives from conflicting with one another,” the report concluded. “To address these challenges, firms are turning to data-centric data protection solutions, while seeking to overcome challenges with costs, use, and integration that can arise with onboarding new technologies.

“Putting data security and privacy front and centre will help firms realise numerous benefits like improved customer and partner relationships and lower risk of a data incident,” it added. “Failing to properly secure your data puts customer trust, the business’ reputation, and considerable revenues and potential penalties at risk.”


Healthcare firms go for the hybrid cloud approach with compliance and connectivity key

It continues to be a hybrid cloud-dominated landscape – and according to new research one of the traditionally toughest industries in terms of cloud adoption is now seeing it as a priority.

A report from enterprise cloud provider Nutanix has found that in two years’ time, more than a third (37%) of healthcare organisations polled said they would deploy hybrid cloud. This represents a major increase from less than a fifth (19%) today.

The study, which polled more than 2,300 IT decision makers, including those at 345 global healthcare organisations, found more than a quarter (28%) of respondents saw security and compliance as the number one factor in choosing where to run workloads. It's not entirely surprising. All data can be seen as equal, but healthcare data is certainly more equal than most. Factor in compliance initiatives, particularly HIPAA, and it's clear how vital the security message is.

Yet another key area is around IT spending. The survey found healthcare organisations were around 40% over budget when it came to public cloud spend, compared to a 35% average for other industries. Organisations polled who currently use public cloud spend around a quarter (26%) of their annual IT budget on it – a number which is expected to rise to 35% in two years.

Healthcare firms see ERP and CRM, analytics, containers and IoT – the latter an evident use case for connected medical devices – as important workloads for public cloud. Average penetration in healthcare is just above the global score. 88% of those polled said they expect hybrid cloud to positively impact their businesses – yet skills are a major issue, with hybrid cloud behind only AI and machine learning as an area where healthcare firms are struggling for talent.

It is certainly an area the largest vendors have been targeting in recent months. Amazon Web Services (AWS) announced in September a partnership with Accenture and Merck to build a cloud-based informatics research platform aiming to help life sciences organisations explore drug development. Google took the opportunity at healthcare conference HIMSS to launch a new cloud healthcare API, focusing on data types such as HL7, FHIR and DICOM.

Naturally, Nutanix is also in the business of helping healthcare organisations with their cloud migrations. Yet increased maturity across the industry will make for interesting reading. The healthcare IT stack of the future will require different workloads in different areas, with connectivity the key. More than half of those polled said ‘inter-cloud application mobility’ was essential going forward.

“Healthcare organisations especially need the flexibility, ease of management and security that the cloud delivers, and this need will only become more prominent as attacks on systems become more advanced, compliance regulations more stringent, and data storage needs more demanding,” said Chris Kozup, Nutanix SVP of global marketing. “As our findings predict, healthcare organisations are bullish on hybrid cloud growth for their core applications and will continue to see it as the ideal solution as we usher in the next era of healthcare.

“With the cloud giving way to new technologies and tools such as machine learning and automation, we expect to see positive changes leading to better healthcare solutions in the long run,” Kozup added.

Photo by Hush Naidoo on Unsplash


LA Clippers to use Amazon Web Services for CourtVision platform

The LA Clippers are moving their CourtVision game-watching platform onto Amazon Web Services (AWS) and using machine learning to drive greater insights.

Clippers CourtVision, which was created alongside the NBA’s video tracking technology provider Second Spectrum, will have its data stored and analysed on AWS in real-time. The system uses cameras in every NBA arena to collect 3D spatial data, including ball and player locations and movements.

The system will also utilise Amazon SageMaker to build, train and deploy machine learning-driven stats which will appear on live broadcasts and official NBA videos. Clippers fans will be able to access greater insights, from frame-by-frame shots and analysis of whether a shot will go in, to live layouts of basketball plays.

The move with the Clippers comes hot on the heels of the Golden State Warriors moving to Google Cloud, utilising the company’s analytics tools for scouting reports and planning to host a mobile app on Google Cloud Platform.

On the court the two clubs have had differing recent fortunes, with the Warriors having won three of the last four championships. Yet the sporting arena is one of mutual interest to the biggest cloud providers. One of the guest speakers during the AWS re:Invent keynote in November was Formula 1 managing director of motor sports Ross Brawn, who explained how the sport's machine learning projects were being ramped up after a softer launch this season. Alongside Formula 1, Major League Baseball is another important AWS customer.

“The combination of cloud computing and machine learning has the potential to fundamentally redefine how fans experience the sports they love,” said Mike Clayville, vice president of worldwide commercial sales at AWS. “With AWS, Second Spectrum and the LA Clippers leverage Amazon’s 20 years of experience in machine learning and AWS’s comprehensive suite of cloud services to provide fans with a deeper understanding of the action on the court.

“We look forward to working closely with both organisations as they invent new ways for fans to enjoy the game of basketball,” Clayville added.

Photo by Markus Spiske on Unsplash


IBM focuses on second chapter of cloud story at Think – hybrid and open but secure

It’s seconds out for round two of the cloud story – one which has hybrid and multi-cloud at its core, and is open but secured and managed properly.

That was the key message from IBM chief executive and chairman Ginni Rometty at IBM’s Think Conference in San Francisco earlier this week.

“I’ve often said we’re entering chapter two – it’s cloud and it’s hybrid,” Rometty told the audience. “In chapter one, 20% of your work has moved to the cloud, and it has mostly been driven by customer-facing apps, new apps being put in, or maybe some inexpensive compute. But the next 80%… is the core of your business. That means you’ve got to modernise apps to get there. We’re going from an era of cloud that was app-driven to ‘now we’re transforming mission critical.’

“It’s very clear to me that it’s hybrid,” Rometty added, “meaning you’ll have traditional IT, private clouds, [and] public clouds. On average, [if you] put your traditional aside, 40% will be private, 60% public. If you’re regulated it will be the other way around.

“The reason it’s so important to [have] open technologies is that skills are really scarce. But then you’ve got to have consistent security and management.”

Naturally, IBM has been reinforcing this strategic vision with action. The $34 billion acquisition of Red Hat announced in October, albeit not yet closed, is a clear marker of this. As this publication put it at the time, it plays nicely into containers and open technologies in general. Both sides needed each other: IBM gets the huge net of CIOs and developers Red Hat provides, while Red Hat gets a sugar daddy, as its open source revenues – albeit north of $3 billion a year – can't compete with the big boys'. It's interesting to note that at the time Rometty said this move represented "the next chapter of the cloud…shifting business applications to hybrid cloud, extracting more data and optimising every part of the business."

Rometty assured the audience at Think that IBM would continue to invest in this future journey. “This is going to be an era of co-creation,” she said. “It’s why we’ve put together the IBM Garage and the IBM Garage methodology. [It’s] design thinking, agile practices, prototype and DevOps… but with one switch – we do them all in a way that can immediately go from prototype and pilot to production scale.

“I think we are all standing at the beginning of chapter two of this digital reinvention,” Rometty added. “Chapter two will, in my mind, be enterprise driven.”

It is interesting to consider these remarks, along with those of Google Cloud’s new boss Thomas Kurian this week, and look back on what has previously happened in the process. Kurian told an audience at the Goldman Sachs Technology and Internet Conference that Google was going to compete aggressively in the enterprise space through 2019 and beyond. This would presumably be music to the ears of Amir Hermelin, formerly product management lead at Google Cloud, who upon leaving in October opined the company spent too long dallying over its enterprise strategy.

If IBM and Google are advocating a new chapter in the cloud, it may be because the opening stanzas did not work out as well as hoped for either. As this publication has opined variously, the original ‘cloud wars’ have long since been won and lost. Amazon Web Services (AWS) won big time, with Microsoft Azure getting a distant second place and the rest playing for table stakes. Google’s abovementioned enterprise issues contributed, as well as IBM losing the key CIA cloud contract to AWS.

Today, with multi-cloud continuing to be a key theme, attention turns to the next wave of technologies which will run on the cloud, from blockchain to quantum computing to artificial intelligence (AI). Rometty noted some of the lessons learned from AI initiatives, from putting the correct information architecture in place, to whether you take an ‘inside out’ or ‘outside in’ approach to scaling digital transformation.

There was one other key area Rometty discussed. “I think this chapter two of digital and AI is about scaling now, and embedding it everywhere in your business. I think this chapter two when it comes to the cloud is hybrid and is driven by mission critical apps now moving,” said Rometty. “But underpinning it for all of us is a chapter two in trust – and that’s going to be about responsible stewardship.”

The remark, which drew loud applause from the audience, is something we should expect to see a lot more of this year, if analyst firm CCS Insight is to be believed. At the company’s predictions event in October, the forecast was that the needle would move to trust as a key differentiator among cloud service providers in 2019. Vendors “recognise the importance of winning customers’ trust to set them apart from rivals, prompting a focus on greater transparency, compliance efforts and above all investment in security,” CCS wrote.

You can watch the full presentation here.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Google Cloud chief Kurian advocates aggressive enterprise sales strategy in opening salvo

Google Cloud chief executive Thomas Kurian is looking to give his new employer a stronger enterprise presence – by using old-school sales tactics.

Speaking at the Goldman Sachs Technology and Internet Conference in San Francisco, the Oracle veteran outlined his plan to hire more salespeople, focus more aggressively on large traditional companies, and explore fresh deals with systems integrators.

“In a nutshell, we at Google are focused on enabling organisations to get five different capabilities from our platform that help them get brand new ways to exploit the digital revolution,” he told delegates. “We’re seeing very strong momentum with customers around the world in six industries; we’re investing aggressively to grow our direct sales and distribution capacity, as well as striking a number of partnerships with partners who will not only provide value add to our solutions, but also distribute our offerings and capabilities.”

Kurian cited the five ways in which Google differentiates from the likes of AWS and Azure: security and reliability for mission critical applications; hybrid and multi-cloud capabilities; ‘very advanced’ artificial intelligence (AI) solutions; ‘vastly different’ capabilities for managing data at scale; and ‘integrating a number of Google’s technology advances with Cloud to deliver industry solutions’.

While some may quibble about the specifics of some of those areas, AI is certainly one where Google has key strength. Kurian told the audience Google had ‘material capability that is vastly differentiated from other providers in the market’, from frameworks such as TensorFlow to going further up the stack with computer vision and applying it to business use cases. Pharmaceuticals, entertainment and airlines were three industries Kurian singled out as benefitting.

Kurian’s comments make for interesting reading, particularly as this is his first major speaking engagement since taking over the leadership from Diane Greene at the start of this year. Speaking to this publication at the time of Greene’s departure, Nick McQuire, VP enterprise at analyst firm CCS Insight, noted the importance of Kurian revamping the sales process early in his tenure.

“The main thing he’s going to want to focus on is the sales and go-to-market piece,” he told CloudTech in November. “It’s part of a market education process that Google still needs to push. They need to start to bring in more evangelists and business-oriented salespeople who can articulate Google’s business value proposition around cloud.”

Google’s fourth quarter and fiscal year results, which hit the wires earlier this month, shone a light on how much investment is taking place in its cloud division. The department added the most new employees of any in the previous quarter, according to chief financial officer Ruth Porat, across both technical and sales roles.

Kurian’s remarks suggest that this trend will continue, but there remains an element of obfuscation in the details. The company said it had ‘more than doubled’ its $1 million cloud deals and multi-year contracts over the past year, but Porat declined to go into more specific detail following an analyst question on the earnings call. Google’s ‘other’ revenues, where Google Cloud among other products sits, came in at just under $6.5 billion for the most recent quarter, a 30% rise on the previous year.

This has led to speculation over how much Google’s cloud division is actually making. Microsoft obfuscates to a degree, as do IBM and Oracle, while AWS and Alibaba Cloud are among those which report specifics. Yet as Paul Miller, senior analyst at Forrester, told this publication, comparing apples to apples across the cloud behemoths is a tricky proposition – and it’s all about the wider ecosystem.

“That’s probably the real issue here – all of the major players carve their portfolio up in different ways, and all of them have different strengths and weaknesses,” Miller told CloudTech. “Make it too easy to pick out G Suite’s revenue, and it would look small in comparison to Microsoft’s Office revenue. Make it too easy to pick out GCP’s revenue, and it would look small in comparison to AWS.

“The real value for Google – and for most of the others – is in the way that these different components can be assembled and reassembled to deliver value to their customers,” he added. “That should be the story, not whether their revenue in a specific category is growing 2x, 3x or 10x.

“Are they continuing to invest in the foundational infrastructure upon which growth will depend? Yes. Are they continuing to win new customers, and to grow revenue with existing customers? Yes. Are they solving problems for those customers and delivering value to them? Yes,” concluded Miller. “Make it too easy to pick holes in their possibly growing revenue slower than a specific competitor, and that value story quickly gets buried, and that would be a shame.”

While plans are evidently being put in place to secure new customers, Kurian noted that the ones Google Cloud already has are happy.

“We have remarkable customer loyalty,” Kurian added. “The customers that we work with love our technology and as we go forward we’re putting in place a new ‘customers for life’ program that will attract, retain, and convert more customers into advocacy.”

You can listen to the full talk here (email required).

Picture credit: Oracle/Screenshot


Relational databases ‘here to stay’ in the enterprise amid greater cloud and data complexity

Reports of the death of the relational database have been greatly exaggerated – at least in the enterprise.

According to a new study from Progress Software – the company’s latest annual data connectivity report – while the on-premises relational database stalwarts of SQL Server, MySQL and Oracle have declined year on year, the move to the cloud is not yet forthcoming for mission-critical workloads.

55% of the more than 1,400 business and IT professionals surveyed said they or their customers were currently using SQL Server, compared with 40% for MySQL and 37% for Oracle. “For important data, there is no substitute for relational databases,” the report noted.

“Modern options such as big data and NoSQL still aren’t the right fit for all business needs, with analytical tooling for these modern databases still in its infancy,” the report added. “Thus RDBMS databases are keeping pace and are here to stay in the enterprise for the foreseeable future.”

Indeed, enterprise decision makers see cloud and NoSQL databases as solving different problems for businesses, such as enabling greater fluidity across different platforms and accelerating edge-to-cloud workloads.

In comparison, one in three (32%) organisations polled said they did not use any big data platforms or interfaces. Perhaps surprisingly, Amazon S3 polled only 19%, with a further 5% saying they plan to use it in the coming two years. Among the other big data technologies, Hadoop Hive (17%) was the most popular, ahead of Spark SQL (10%) and Hortonworks (7%).

When it came to NoSQL, MongoDB (27% currently supported, 7% plan to adopt) was by far the most popular technology among respondents. Regular readers of this publication will be aware of the changing atmosphere surrounding several of the leading open source providers. Towards the end of last year Confluent – which recently secured a $2.5 billion valuation – changed its licensing terms with one eye on the largest cloud vendors running software as a service on top of its technology. MongoDB has done the same, as has Redis Labs, used by 7% of those polled by Progress.

Speaking of which, Amazon Web Services (AWS) was the most popular cloud provider among survey respondents, cited by 44% of those polled. Microsoft Azure (39%) was not far behind, followed by a significant gap to VMware (22%) and Google Cloud (18%). Only 8% of respondents were IBM houses. AWS had seen a 12% uptick year on year, with Azure (+7%) and GCP (+3%) the others to see growth.

Underpinning all the data these organisations collect, whether they structure it with relational or non-relational databases, is business intelligence. Organisations use on average 2.5 different BI reporting tools, with Tableau usage increasing the most. Excel (42%) was, perhaps unsurprisingly, the most popular tool, ahead of Microsoft BI Platform (26%), Tableau (22%) and Power BI (18%) – a Microsoft-heavy top list. The report also noted how the need for data visualisation, as well as embedded analytics, has grown significantly.

“As enterprises grow, a wide variety of data is produced, consumed and stored at different parts of the organisation. The challenge is to effectively manage and leverage the volume, variety and velocity of data,” the report concluded. “At a time when budgets and resources remain tight, the old approach of trying constantly to increase expenditures on infrastructure and hardware assets can’t keep pace.

“The success of a business lies in its seamless data connectivity and integration technology in an increasingly hybrid cloud/on-premises world.”

You can read the full report here (email required).
