All posts by James

Google Cloud launches pre-packaged AI services around contact centre and talent acquisition

The importance of artificial intelligence (AI) and machine learning to both the biggest cloud providers and their customers continues to rise – and Google Cloud aims to get a step up on its rivals by offering pre-packaged AI services.

At Google Next back in July, Google Cloud AI chief scientist Fei-Fei Li noted that AI was ‘no longer a niche for the tech world’ but ‘the differentiator for businesses in every industry.’ It’s not difficult to see why: consider the various companies that cite AI and machine learning capability as a key factor in making the switch, whichever provider they choose – from Bloomberg with Google to Formula 1 with AWS.

Google’s pre-packaged AI offerings are based around improving the enterprise contact centre and talent acquisition respectively. The roster of partners the company is working with on the contact centre is almost a who’s who of the cloud networking space, from Cisco, to RingCentral, to Twilio, with Deloitte and KPMG among the integration partners.

A blog post from Apoorv Saxena, cloud AI product manager, and Geordy Kitchen, cloud group product manager, explains the benefits of the contact centre technology. “Instead of a phone tree, [Contact Center AI] greets callers in a natural and conversational manner,” the two write. “Whenever possible, it aims to resolve simple requests and tasks, such as billing enquiries or driving directions – and when it determines that a caller’s needs exceed its abilities to help, it seamlessly transitions the call to a live agent and switches to a supporting role.

“During the conversation, it surfaces information that can help the live agent, in real time, so agents have little need to put a caller on hold,” Saxena and Kitchen add. “It also captures important analytics, such as historical trends or whether a certain kind of contact is happening more frequently.”

Many will remember that, back in May, Google conducted a demo in which its Assistant software called a real hair salon to book an appointment, with the employee at the other end of the line purportedly unaware that an AI was calling them. Some had suspicions about the veracity of that demo – so such claims may be worth exploring further.

On an earnings call last month, Google CEO Sundar Pichai noted the company’s continued momentum, with larger and more strategic cloud deals, was ‘a natural extension of our long time strength in computing, data centres and machine learning.’ “We have developed these over many years and they power our own services in the cloud and are now helping others,” he told analysts.

Prometheus joins Kubernetes in graduating from Cloud Native Computing Foundation

Prometheus, an open source systems monitoring technology, has ‘graduated’ from the Cloud Native Computing Foundation (CNCF) – five months after Kubernetes took the plunge.

The graduation – which is essentially an affirmation of a technology having strong adoption and governance processes – was announced at the PromCon event. Prometheus has almost 20 active maintainers today, alongside more than 1,000 contributors and more than 13,000 commits.
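For context, Prometheus works on a pull model: it periodically scrapes HTTP metrics endpoints that applications expose, then stores the resulting time series for querying via its PromQL language. As an illustrative sketch – the job name and target address below are hypothetical – a minimal scrape configuration looks like this:

```yaml
# prometheus.yml – minimal illustrative configuration
global:
  scrape_interval: 15s    # how often to scrape each target

scrape_configs:
  - job_name: "example-app"            # hypothetical job name
    static_configs:
      - targets: ["localhost:8080"]    # endpoint serving /metrics
```

Applications typically expose that `/metrics` endpoint using one of the official Prometheus client libraries.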

The CNCF, a vendor-neutral home for emerging projects and technologies, has three stages of development. For those barely out of the womb, the only criterion for the inception stage is that the technology adds value to cloud native computing in some capacity. The incubation stage requires further accreditation in terms of documents and usage before finally ‘graduating’.

In a blog post, Prometheus’ Richard Hartmann noted how both the technology and the documentation had been overhauled since reaching incubation stage, including rewriting the storage backend and a push towards making adoption easier.

“Since its inception in 2012, Prometheus has become one of the top open source monitoring tools of choice for enterprises building modern cloud native applications,” said Chris Aniszczyk, CNCF chief operating officer. “Since being accepted as the second project in CNCF, Prometheus has cultivated an active developer and user community, giving the TOC full confidence to graduate the project.”

Kubernetes found its way to graduation in March, with the Google-developed container orchestration tool counting companies such as Bloomberg, The New York Times, and Uber among its users – the latter also being a keen Prometheus user. As this publication noted throughout last year, pretty much every leading cloud provider joined the CNCF, testament to Kubernetes’ growing influence.

For Prometheus, which was originally designed by engineers at SoundCloud, the push to graduation means a greater opportunity to push the technology into the enterprise. “By graduating Prometheus, CNCF shows that it’s confident in our code and feature velocity, our maturity and stability, and our governance and community processes,” added Hartmann. “This also acts as an external verification of quality for anyone in internal discussions around choice of monitoring tool.”

Alibaba Cloud focuses on Asia Pacific with latest launches – and expands Elasticsearch partnership

It has been another busy week at Alibaba Cloud, with the company’s latest releases focusing on both Asia Pacific and European expansion.

The increased focus on Asia Pacific comes through the launch of no fewer than nine products around cloud architecture, machine learning, the Internet of Things (IoT), and security.

These include PAI, Alibaba Cloud’s proprietary machine learning platform; IoT Platform, which will be introduced to the global market; and Data Lake Analytics, a serverless offering which aims to turn customers’ data lakes into insights. Other products being introduced into this market include a hybrid backup recovery tool, a dedicated host service, and anti-bot software.

The nine products also include partner offerings. Alibaba’s partnership with China Unicom has resulted in an enterprise network solution, HPE is on board to help organisations with hybrid cloud, and the analytics search engine Elasticsearch is available through a collaboration between Alibaba Cloud and Elastic.

The latter partnership is being expanded, in a separate announcement, to deliver the service globally. Elasticsearch was previously available only to Alibaba Cloud customers in China; the expanded collaboration also includes the visualisation offering Kibana, the data shipper Beats, and the processing tool Logstash.
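The products named here slot together as a pipeline: Beats ships data from hosts, Logstash processes it, Elasticsearch indexes it, and Kibana visualises it. A minimal, illustrative Logstash pipeline configuration (the port and host values are hypothetical) might look like:

```
# logstash.conf – illustrative pipeline: Beats in, Elasticsearch out
input {
  beats {
    port => 5044                    # Beats agents ship events here
  }
}

filter {
  grok {
    # parse standard Apache-style access logs into structured fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]     # the Elasticsearch cluster
  }
}
```

Kibana then reads the resulting indices from Elasticsearch to build searches and dashboards.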

Alibaba had earlier this month been focusing its expansion plans on Southeast Asia; the company had announced a second infrastructure zone in Malaysia, offering DDoS protection, database, networking and monitoring services, as well as certified SAP hosting.

“Asia Pacific is a unique market, and as a global cloud services provider with an Asian origin, we are committed to leverage our knowledge and experience to build a sustainable regional ecosystem and enrich our offerings to meet the needs of our customers in this digital age,” said Derek Wang, chief solution architect at Alibaba Cloud International, in a statement.

“This new suite of offerings includes products that are highly efficient, cost effective, and some of them are the first of their kind in the industry,” Wang added.

Photo source: www.alibabagroup.com

Risk and finance industries still see cloud as a concern, notes Gartner

Even the slowest industries are moving workloads to the cloud – take risk, audit and finance as an example. Yet there is still plenty more to be done before these verticals become truly comfortable.

That’s according to the latest report from analyst firm Gartner. In the company’s most recent Emerging Risks study, cloud computing remains the primary concern for those in risk and compliance. Cloud was ahead of cybersecurity disclosure and GDPR compliance as the biggest worry for the 110 senior executives polled.

Given that Gartner predicts cloud computing will be a $300 billion business by 2021 – and that it is a cornerstone of the company’s feted ‘digital business’ ethos – there is plenty of reason to get involved. Yet these organisations have a right to be concerned. Gartner added that through 2022, ‘at least’ 95% of cloud security failures will be attributable to the organisation itself – and that companies should expect to be affected by cybersecurity threats in ‘unpredictable’ ways.

Organisations continue to struggle with security despite record spending on information security over the past two years, according to Gartner. And it is only going to get worse from here: social engineering and the threat of GDPR were cited as the risks most likely to cause mayhem for enterprises if not properly addressed by risk management leaders.

This publication has explored the various strategies different industries have undertaken in adopting cloud-based technologies. Take healthcare as an example. Writing for this publication in February, Virtustream’s Roberto Mircoli noted that while to date health organisations had been ‘testing the waters by focusing on modernising the back end of systems… transforming core healthcare systems and applications’ was now the order of the day.

“Executives are right to expand cloud services as part of their digital business initiatives, but they need to ensure their cloud security strategy keeps up with this growth,” said Matthew Shinkman, practice leader at Gartner, in a statement. “Leaders should start by clearly identifying their most at-risk areas, which remain obscure to many large organisation leaders.”

IDG notes how cloud budgets are going up – and implementations increasing in complexity

It is one of the largest trends of this year, and now it has been affirmed by IDG: cloud initiatives are becoming ever more complex as the technology hits full maturity.

The findings appear in the media and analyst firm’s ‘Cloud Insights’ report for 2018. The study, which polled 550 respondents – all of whom are involved in the cloud buying process to some extent – found almost one in three (30%) were using both multi-cloud and hybrid cloud strategies.

For those who have a multi-cloud approach, the simple fact of having more options was the biggest benefit, cited by 59% of those polled. Quicker and easier disaster recovery (40%), and increased flexibility through working across multiple clouds (38%) were also cited.

Naturally, more IT budget is being devoted to these projects. While spend as a proportion of overall IT budget remains similar – 30% according to this study, compared with 28% in a similar study in 2016 – the monetary amount has gone up, from $1.62m two years ago to $2.2m this year. For enterprises, the figure is around $3.5m, while for SMBs it is $889,000, up from $286,000 in 2016.
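For a sense of scale, the growth rates implied by those survey figures can be checked with a quick calculation (the dollar amounts are taken from the study as reported above):

```python
# Growth in average annual cloud spend, per IDG's 2016 and 2018 surveys
overall_2016 = 1.62e6   # overall average spend, 2016
overall_2018 = 2.2e6    # overall average spend, 2018
smb_2016 = 286_000      # SMB average spend, 2016
smb_2018 = 889_000      # SMB average spend, 2018

overall_growth = (overall_2018 - overall_2016) / overall_2016 * 100
smb_growth = (smb_2018 - smb_2016) / smb_2016 * 100

print(f"Overall spend growth: {overall_growth:.0f}%")  # roughly 36%
print(f"SMB spend growth: {smb_growth:.0f}%")          # roughly 211%
```

In other words, SMB cloud budgets have more than tripled in two years, while overall budgets grew by around a third.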

Of this spend, the CIO, or uppermost IT executive, holds most sway. 71% of those polled in such a position said they had ‘significant influence’, with the CTO further back on 54%.

It is interesting here to look back at how far the industry has come over the past several years. In 2011, when the question was asked over organisations’ plans with regard to utilising computing infrastructure or applications via the cloud, just over half (51%) said at least a part of their infrastructure was cloud-based, with 21% saying they planned to move within three years. Today, those figures have changed to 73% and 10% respectively.

While it is implicit that not everything will be, or indeed can be, moved to the cloud – note the recent comments of Diane Bryant, late of Google, on how McDonald’s still has mainframes and is ‘not ashamed of it’ – the report studiously charts this progress.

Yet challenges remain among buyers. Vendor lock-in is the primary concern, cited by 47% of those polled, while worries around where data is stored (34%) and the general security of cloud computing solutions (34%) also feature.

“IT organisations are being asked to improve the speed of IT service delivery and react to changing market conditions. Cloud solutions provide the flexibility to do just that,” said Julie Ekstrom, SVP of IDG Communications. “Organisations are relying on a mix of cloud delivery models to meet this need; however it requires management of multiple vendors.

“As tech executives explore new areas of cloud investment, they examine their portfolio of cloud vendors to see what solutions can grow and what new vendors will work collaboratively with their existing portfolio for ease of adoption,” Ekstrom added.

You can find out more about the research here.

Oracle accused of misrepresenting its cloud revenue growth in lawsuit

Oracle is facing a legal threat after a lawsuit alleged the company used ‘threats and extortive tactics’ when selling its cloud products.

The City of Sunrise Firefighters’ Pension Fund is suing Oracle for allegedly misleading shareholders and ‘misrepresenting the true drivers of [its] cloud revenue growth.’

“Defendants falsely attributed the company’s revenue growth in its cloud segment to a variety of factors and initiatives, including, among other things, Oracle’s ‘unprecedented level of automation and cost savings’, as well as the company being ‘customer-focused’ and ‘intimate partners with our customer’,” the document stated. “In truth, Oracle drove sales of cloud products using threats and extortive tactics.”

As an example, the document alleged that Oracle threatened customers with audits of their non-cloud-based software until they agreed to take on a cloud license.

The document added that this practice became clear on March 19, 2018, around the time of Oracle’s third quarter financial results. According to those results, cloud revenues went up 32% year over year and represented 16% of the company’s overall revenue. Despite this, the financial press reported mixed results and the stock fell. The document alleged that ‘analysts and market commentators connected Oracle’s poor financial performance to its improper sales tactics.’

Many of Oracle’s announcements in recent months have been around its autonomous database and associated features. At an event in California last week, Oracle CTO and executive chairman Larry Ellison unveiled autonomous transaction processing (ATP) capabilities, based on a specific database type.

According to the document, the use of such tactics, including around automation, “concealed the lack of real demand for Oracle’s cloud services, making the growth unsustainable and ultimately driving away customers.”

Oracle has responded to the claims, with Deborah Hellinger, head of communications, saying: “The suit has no merit and Oracle will vigorously defend against these claims.”

SD-WAN infrastructure market to hit $4.5bn by 2022, says IDC

The software-defined wide area network (SD-WAN) market continues to grow rapidly – and IDC is predicting the overall infrastructure market will be worth $4.5 billion (£3.53bn) by 2022.

The figure, which appears in the analyst firm’s latest SD-WAN Infrastructure Forecast, takes into account the significant uptick in SD-WAN investment, with infrastructure revenues going up 83.3% in 2017 to reach $833 million.

Another report, IDC’s Market Share – the first from the company for this category – sees Cisco and VMware at the top of the tree. The analysis notes the recent M&A activity around the market, with Cisco acquiring Viptela and VMware buying VeloCloud last year. The analysis includes both hardware and software used in SD-WAN deployments.

IDC defines the technology as ‘an architecture that leverages a hybrid WAN using at least two or more connection types.’ This can include MPLS, broadband internet, 3G, 4G, and more.

“The emergence of SD-WAN technology has been one of the fastest industry transformations we have seen in years,” said Rohit Mehra, IDC vice president of network infrastructure. “Organisations of all sizes are modernising their wide area networks to provide improved user experience for a range of cloud-enabled applications.

“Incumbent networking vendors have quickly realigned their routing and WAN optimisation portfolios to take on the growing cadre of startups in this market,” added Mehra. “Enabled by a rapid uptake across the service provider domain, SD-WAN infrastructure will continue to grow rapidly in the coming years, providing a beachhead for other software-defined networking and security functions in the enterprise branch.”

Previous research from IHS Markit in April found that enterprises were getting their heads turned by the benefits of SD-WAN as they focus on more complex cloud deployments. According to the North America-based study, three quarters (74%) of organisations polled had conducted SD-WAN lab trials in 2017, with many of those trials planned to move into live production this year.

Regular watchers of the industry would have seen this expansion coming, however. Writing for this publication in November, Steve Brar, director of solutions marketing at Riverbed Technology, noted that the rise of SD-WAN has forced IT teams to rethink their entire networks.

“SD-WAN has proven invaluable to companies with large numbers of users spread across many sites who are accessing cloud-based applications,” Brar wrote. “With SD-WAN, these organisations are now able to centrally orchestrate and manage direct connections from geographically dispersed locations to the cloud. They can define and instantly apply policies that govern security and performance across the network using one management console.”

Intel spies $200bn in ‘data-centric’ opportunity combining cloud, edge and AI

Intel has upped its total addressable market (TAM) for what it calls the ‘data-centric’ era of computing from $160 billion to $200bn – with Navin Shenoy, president and general manager of the company’s data centre group, saying it is “the biggest opportunity in the history of the company.”

Shenoy was speaking at the company’s Data-Centric Innovation Summit in Santa Clara, and took to a company editorial to outline his plans.

“I find it astounding that 90% of the world’s data was generated in the past two years – and analysts forecast that by 2025 data will exponentially grow by 10 times and reach 163 zettabytes,” Shenoy wrote. “But we have a long way to go in harnessing the power of this data.

“A safe guess is that only about 1% of it is utilised, processed and acted upon – imagine what could happen if we were able to effectively leverage more of this data at scale.”

Shenoy noted how the confluence of edge computing, mapping, cloud, computer vision and artificial intelligence (AI) was making this opportunity more apparent. Naturally, the company has a variety of products which aim to make the process more seamless. Silicon photonics, combining a silicon integrated circuit and a semiconductor laser, aims to provide high performance computing in hyperscale data centres, while Intel’s Optane DC persistent memory focuses on quicker performance with greater affordability.

What’s more, Intel added that more than $1 billion in revenue came from its processors designed for artificial intelligence workloads.

“We’ve entered a new era of data-centric computing,” Shenoy explained. “The proliferation of the cloud beyond hyperscale and into the network and out to the edge, the impending transition to 5G, and the growth of AI and analytics have driven a profound shift in the market, creating massive amounts of largely untapped data.

“When you add the growth in processing power, breakthroughs in connectivity, storage, memory and algorithms, we end up with a completely new way of thinking about infrastructure,” he added.

“To help our customers move, store and process massive amounts of data, we have actionable plans to win in the highest growth areas, and we have an unparalleled portfolio to fuel our growth – including performance-leading products and a broad ecosystem that spans the entire data-centric market.”

Autonomous driving was cited as a key example of how these technologies will converge – Shenoy described it as having life-saving potential – and it makes sense given Intel’s other bets in this area. But perhaps a small note on the maths may be required. Last June, Intel said the ‘passenger economy’ – the value generated by autonomous cars, together with the potential gains from time saved driving – could hit $7 trillion across the market. Earlier that year, the company said it had ‘unwavering confidence’ in its chances of taking the autonomous driving market.

You can read Shenoy’s editorial in full here.

Samsung Heavy Industries chooses AWS to help take shipbuilding into the cloud

Another example of cloud computing infiltrating key enterprises: shipbuilding firm Samsung Heavy Industries is moving to Amazon Web Services (AWS) as its preferred cloud provider.

The company says it wants to be seen as a ‘cloud-first maritime business’, with Samsung using a variety of AWS’ services. These include EC2 and S3, naturally, alongside Amazon’s relational database service RDS, AWS Key Management Service, and the governance and compliance tool CloudTrail.

By putting sensors in a variety of devices and crunching the data those systems generate, all backed by cloud technologies, organisations in the shipping and maritime sector can make significant gains in efficiency and productivity. Take the Port of Rotterdam as an example. In February the port, Europe’s largest by cargo tonnage, said it was signing up with IBM to provide greater insights on water and weather conditions, as well as manage traffic and reduce waiting times at the port.

“We’re digitising our shipping fleet by using the most advanced technologies in the world to enhance our approaches to shipbuilding, operations, and delivery, and chose AWS as our preferred cloud provider to help us quickly transform Samsung Heavy Industries into a cloud-first maritime business,” said Dongyeon Lee, Samsung Heavy Industries director of ship and offshore performance research centre.

“By leveraging AWS, we’ve successfully released several smart shipping systems so that our customers can manage their ships and fleets more efficiently, and we continue to test new capabilities for ocean-bound vessel navigation and automation,” added Lee. “AWS delivers a highly flexible environment, with the broadest and deepest portfolio of cloud services, that is ideal for accelerating research and development across the company, and it has enabled our developers and data scientists to bring new ideas to market at an unprecedented pace.”

AWS, whose revenues went up 49% year over year to $6.1 billion, according to the most recent quarter’s financial report, has been issuing a flurry of recent customer wins. Alongside Samsung, Formula 1, Ryanair, and Major League Baseball were all confirmed as AWS users over the past three months.

Oracle marks ‘major milestone’ in autonomous strategy as Ellison takes more swipes at AWS

Oracle’s CTO and executive chairman Larry Ellison announced the launch of the company’s latest autonomous database service, for autonomous transaction processing (ATP), last night – but a recent report around Amazon’s database plans also caught his eye.

At an event in California, Ellison responded to a story, originally broken by CNBC, which claimed that Amazon was planning to move completely away from Oracle’s databases by 2020.

Responding to an analyst question around customers moving off Oracle on the company’s Q2 2018 earnings call back in December, Ellison said: “Let me tell you who’s not moving off of Oracle – a company you’ve heard of that gave us another $50 million this quarter. That company is Amazon. Our competitors, who have no reason to like us very much, continue to invest in and run their entire business on Oracle.”

Ellison reiterated the $50m figure and told attendees of his doubt that Amazon would reach its reported target. “They don’t like being our best reference,” he said. “They think of themselves as a competitor, so it’s kind of embarrassing when Amazon uses Oracle, but they want you to use Aurora and Redshift.”

Aurora and Redshift, of course, are Amazon’s primary database products for relational database workloads and data warehousing respectively. Ellison also took the opportunity to tout Oracle’s performance against its rival – with the autonomous transaction processing database claimed to be 12 times faster than Aurora for pure transaction processing, and more than 100 times faster for a mixed workload.

Oracle’s press materials accompanying the ATP release described it as ‘a major milestone in the company’s autonomous strategy’, and Ellison did not hold back in his praise of a technology he described as ‘revolutionary’ at last year’s OpenWorld.

“This machine learning-based technology not only can optimise itself for queries, for data warehouses and data marts, but it also optimises itself for transactions,” said Ellison. “It can run batch programs, reporting, Internet of Things, simple transactions, complex transactions, and mixed workloads. Between these two systems [for data warehousing and transaction processing], the Oracle autonomous database now handles all of your workloads.”

Another barb at Amazon – and it’s worth noting that Andy Jassy is not averse to firing shots back during his keynote speeches – came when Ellison described Oracle’s autonomous database as ‘truly elastic’. The service, he said, is truly pay as you go, with automatic provisioning and scaling – adding and deleting servers while running, and going serverless when not running.

“Amazon’s databases can’t do that,” he told the audience. “They can’t dynamically add a server when the system is running, they can’t dynamically add network capacity, they can’t dynamically take a server away when there is not demand and it’s not serverless when it’s idle. [Oracle] is a truly elastic system – you only pay for the infrastructure that you use.”

Ellison added that full autonomy – ‘nothing to learn, nothing to do’ became something of a mantra during the presentation – meant Oracle was “as simple to use as the simplest databases on the planet.”

CloudTech has reached out to AWS for comment and will update this piece accordingly.

Picture credits: Oracle/Screenshot