Category Archives: Google

Google cloud team launches damage control mission

Google will offer service credits to all customers affected by the Google Compute Engine outage, in what appears to be a damage control exercise as the company looks to gain ground on AWS and Microsoft Azure in the public cloud market segment.

On Monday, 11 April, Google Compute Engine instances in all regions lost external connectivity for a total of 18 minutes. The outage has been blamed on two separate bugs, neither of which would have caused major problems on its own, but which combined to bring the service down. Although the outage has seemingly caused embarrassment for the company, it did not affect more visible consumer services such as Google Maps or Gmail.

“We recognize the severity of this outage, and we apologize to all of our customers for allowing it to occur,” said Benjamin Treynor Sloss, VP of Engineering at Google, in a statement on the company’s blog. “As of this writing, the root cause of the outage is fully understood and GCE is not at risk of a recurrence. Additionally, our engineering teams will be working over the next several weeks on a broad array of prevention, detection and mitigation systems intended to add additional defence in depth to our existing production safeguards.

“We take all outages seriously, but we are particularly concerned with outages which affect multiple zones simultaneously because it is difficult for our customers to mitigate the effect of such outages. It is our hope that, by being transparent and providing considerable detail, we both help you to build more reliable services and we demonstrate our ongoing commitment to offering you a reliable Google Cloud platform.”

While the outage would not appear to have caused any major damage for the company, competitors in the space may secretly be pleased with the level of publicity the incident has received. Google has been ramping up efforts in recent months to bolster its cloud computing capabilities and tackle the public cloud market segment, with hires of industry heavyweights such as Diane Greene, rumoured acquisitions, and plans to open 12 new data centres by the end of 2017.

The company currently sits in third place in the public cloud market segment, behind AWS and Microsoft Azure, though it had been demonstrating healthy growth in the months prior to the outage.

Google plays catch-up with Cloud Machine Learning

Google has entered the machine learning market with the alpha release of Cloud Machine Learning.

Built on top of the company’s open source machine learning system TensorFlow, the offering will allow customers to build custom algorithms that make predictions for their business, aiding decision making.

“At Google, researchers collaborate closely with product teams, applying the latest advances in machine learning to existing products and services – such as speech recognition in the Google app, search in Google Photos and the Smart Reply feature in Inbox by Gmail,” said Slaven Bilac, Software Engineer at Google Research. “At GCP NEXT 2016, we announced the alpha release of Cloud Machine Learning, a framework for building and training custom models to be used in intelligent applications.”

The system already underpins a number of Google’s current offerings, though it is later to market than its competitors: AWS launched its machine learning service in April last year, while IBM’s Watson has been making noise in the industry for years.

Although later to market, Google has highlighted that customers will be able to export their TensorFlow models for use in other settings, including their own on-premise data centres. Competing offerings operate in a vendor lock-in situation, meaning customers have to run the machine-learning models they have built in the cloud through an API. Industry insiders have told BCN that avoiding vendor lock-in is a priority within their organisations, which could give Google an edge in the machine-learning market segment.

Cloud Machine Learning’s launch builds on the growing trend towards advanced data analytics and the use of data to refine automated decision making capabilities. A recent survey from Cloud World Forum showed that 85% of respondents believe data analytics is the biggest game changer for marketing campaigns in the last five years, while 82% said that data would define the way in which they interact with customers.

The company is still behind Microsoft and AWS in the public cloud space, though recent moves are showing Google’s intent to close the gap. At GCP NEXT 2016, Google’s cloud chief Diane Greene told the audience that machine learning and security will form the backbone of her new sales strategy. “If your customer is embracing machine learning, it’d be prudent for you to embrace it too,” said Greene.

Google continues public cloud charge with 12 new data centres

Google has continued its expansion in the public cloud sector, announcing it will open 12 new data centres by the end of 2017.

In recent weeks, Google has been expanding its footprint in the cloud space with rumoured acquisitions, hires of industry big-hitters and blue-chip client wins, and the new announcement adds weight to those moves. With two new data centres to open in Oregon and Tokyo by the end of 2016, and a further ten by the end of 2017, Google is positioning itself to challenge Microsoft and AWS for market share in the public cloud segment.

“We’re opening these new regions to help Cloud Platform customers deploy services and applications nearer to their own customers, for lower latency and greater responsiveness,” said Varun Sakalkar, Product Manager at Google. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance.”

Google currently operates in four cloud regions and the new data centres will give the company a presence in 15. AWS and Microsoft have built a market-share lead over Google thanks in part to the fact that they operate in 12 and 22 regions respectively, with Microsoft planning to open a further five.

Recent findings from Synergy Research Group show AWS is still the clear leader in the cloud space with a market share of 31%, with Microsoft accounting for 9% and Google controlling 4%. Owing to its private and hybrid cloud offerings, IBM accounts for 7% of the global market, according to Synergy.

Growth at AWS was measured at 63%, whereas Microsoft and Google reported 124% and 108% respectively. Industry insiders have told BCN that Microsoft and Google have been making moves to improve their offerings through talent and company acquisitions. Greater proactivity in the market from the two challengers could explain the difference in growth figures over the last quarter.

Alongside the new data centres, Google’s cloud business leader Diane Greene has announced a change to the way the company operates its sales and marketing divisions. According to Bloomberg Business, Greene told employees that Google will be going on a substantial recruitment drive, while also changing the way it sells its services, focusing more on customer interaction and feedback. This practice would not be unusual for its competitors, however Google’s model has so far been built on the idea of customer self-service. The cloud sales team on the west coast has already doubled in size to fifty, and the recruitment drive is set to widen.

While Google’s intentions have been made clear over recent months, there are still some who remain unconvinced. 451 Group Lead Analyst Carl Brooks believes the company is still not at the same level as its competitors, needing to add more enterprise compatibility, compliance, and security features. “They are probably the most advanced cloud operation on the planet. It also doesn’t matter,” he said.

Google said to be on cloud shopping spree

Google is rumoured to be planning the acquisition of a number of businesses to bolster its cloud computing platform and suite of workplace applications.

According to Re/code, the tech giant has amassed a short-list of various start-ups and niche service providers including automated app services start-up Metavine, e-commerce public company Shopify, and payroll and health benefits services business Namely. Re/code sources have stressed that the approaches are preliminary, and none of the companies involved have commented on the rumours.

The moves seem to address two challenges currently facing the Google team. Firstly, there is a notable gap of ‘middle range’ customers for Google Apps. The company traditionally does well with small and large companies, but has struggled with the lucrative market in between. Last year, Google attempted to lure the middle market onto Google Apps for Work by offering the service for free while customers saw out their current enterprise agreements, charging $25 per user after that point.

Secondly, the acquisitions would enable Google to move its internal systems to its cloud platform, potentially creating a more solid offering to challenge AWS and Microsoft Azure.

The reports back up recent moves in the market indicating Google’s intention to increase its stake in the cloud. While AWS and Microsoft are firmly planted as the number one and number two players in the public and private cloud space, Google is closing the gap, making a number of company and talent acquisitions to improve its proposition.

Aside from the recent hire of VMware founder Diane Greene to lead its cloud business, last year SVP of Technical Infrastructure Urs Hölzle highlighted that Google cloud platform revenues could surpass Google’s advertising revenue within five years.

“The goal is for us to talk about Google as a cloud company by 2020,” said Hölzle in October. “Our cloud growth rate is probably industry-leading…and we have lots of enterprise customers, happy enterprise customers.”

The rumours shouldn’t come as a surprise, as Hölzle also said that there would be a number of announcements which would “remove any doubt” from Google’s future plans.

While the approaches remain rumours, GCP NEXT 2016, the company’s cloud developer conference taking place this week, may provide some clarity on Google’s aspirations.

Apple reportedly defects iCloud from AWS to Google Cloud

Apple has moved some of its iCloud services onto Google Cloud, reducing its reliance on AWS, according to a CRN report.

Though Apple will remain an AWS customer, the report states that Google claims Apple will now be spending between $400 million and $600 million on its cloud platform. Last month, financial services firm Morgan Stanley estimated that Apple spends $1 billion annually on AWS public cloud, though this is likely to be reduced over the coming years as Apple invests more in its own datacentres.

The company currently operates four datacentres worldwide and apparently has plans to open three more. It has been widely reported that Apple has set aside $3.9 billion to open datacentres in Arizona, Ireland and Denmark, with plans to open the first later this year.

Google has been struggling to keep pace with AWS and Microsoft’s Azure, but recent deals indicate an improved performance. A recent survey from RightScale demonstrated AWS’ dominance in the market, accounting for 57% of public cloud market share, while Azure commands second place and Google accounts for only 6% of the market.

To bolster its cloud business Google hired VMware co-founder Diane Greene to lead the business unit, which includes Google for Work, Cloud Platform, and Google Apps. The appointment, together with the acquisition of bebop, which was founded by Greene, highlights the company’s ambitions in the cloud world, where it claims it has larger data centre capacity than any other public cloud provider.

Industry insiders have told BCN that acquisitions such as this are one of the main reasons the public cloud market segment is becoming more competitive. Despite AWS’ market dominance, which some insiders attribute to it being first to market, offerings like Azure and Google are becoming more attractive propositions thanks in part to company and talent acquisitions.

Last month, the Google team secured another significant win after confirming music streaming service Spotify as a customer. Spotify had toyed with the idea of managing its own datacentres but said in its blog: “The storage, compute and network services available from cloud providers are as high quality, high performance and low cost as what the traditional approach provides.” The company also highlighted that the decision was made based on Google’s value-adds in its data platform and tools.

While Google and Apple have yet to comment on the deal, an Amazon spokesperson has implied the deal may not have happened at all, sending BCN the following emailed statement: “It’s kind of a puzzler to us because vendors who understand doing business with enterprises respect NDAs with their customers and don’t imply competitive defection where it doesn’t exist.”

The rumoured Apple/Google deal caps a tough couple of weeks for AWS. Aside from Apple and Spotify, the company also lost the majority of Dropbox’s business. AWS still occupies a strong position in the public cloud market, but there are increasing signs its competitors are raising their game.

Google’s AlphaGo publicity stunt raises profile of AI and machine learning

World Go champion Lee Se-dol beat AlphaGo, an AI program developed by Google’s DeepMind unit, this weekend, though he still trails the program 3-1 in the series.

Google’s publicity stunt highlights the progress made in the world of artificial intelligence and machine learning: commentators had predicted a runaway victory for Se-dol.

DeepMind founder Demis Hassabis commented on Twitter: “Lee Sedol is playing brilliantly! #AlphaGo thought it was doing well, but got confused on move 87. We are in trouble now…” That confusion allowed Se-dol to win the fourth game in the five-game series. While the stunt demonstrates the potential of machine learning, Se-dol’s consolation victory proves that the technology is still capable of making mistakes.

The complexity of the game presented a number of problems for the DeepMind team. Traditional AI methods, which construct a search tree over all possible positions, would have required too much compute power due to the vast number of permutations within the game. Go is played largely on intuition and feel, presenting a complex challenge for AI researchers.

The DeepMind team created a program that combined an advanced tree search with deep neural networks, enabling the program to play thousands of games against itself. These games allowed the machine to adjust its behaviour, a technique called reinforcement learning, and improve its performance day by day. This technique allows the machine to play human opponents in its own right, as opposed to mimicking other players it has studied. Commentators who watched all four games have repeatedly questioned whether some of the moves put forward by AlphaGo were mistakes or simply unconventional strategies devised through reinforcement learning.
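The self-play loop described above can be sketched in miniature. The code below is an illustrative toy, not AlphaGo: it uses tabular Q-learning with a negamax-style backup on a tiny take-1-or-2-stones game, where the player who takes the last stone wins. The shape of the loop is the point: the program plays against itself, then nudges its value estimates toward the outcome. AlphaGo replaces the table with deep neural networks and adds Monte Carlo tree search, but the underlying reinforcement learning idea is the same.

```python
import random

random.seed(0)

TAKE = (1, 2)   # legal moves: remove 1 or 2 stones; taking the last stone wins
Q = {}          # Q[(stones_left, move)] -> value from the mover's perspective

def q(s, a):
    return Q.get((s, a), 0.0)

def legal(s):
    return [a for a in TAKE if a <= s]

def best(s):
    return max(legal(s), key=lambda a: q(s, a))

def self_play_episode(eps=0.2, alpha=0.5, start=10):
    """One game of the program against itself, updating Q as it goes."""
    s = start
    while s > 0:
        # Explore occasionally, otherwise play the current best move
        a = random.choice(legal(s)) if random.random() < eps else best(s)
        s2 = s - a
        if s2 == 0:
            target = 1.0                                # mover took the last stone
        else:
            target = -max(q(s2, b) for b in legal(s2))  # opponent moves next
        Q[(s, a)] = q(s, a) + alpha * (target - q(s, a))
        s = s2

for _ in range(5000):
    self_play_episode()

# After training, the table encodes the known winning strategy for this game:
# always leave the opponent a multiple of 3 stones.
print(best(4), best(5))
```

With a handful of states this converges in seconds; the real achievement of AlphaGo was making the same loop work on a game with more positions than atoms in the universe.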

Although the AlphaGo program demonstrates progress as well as an alternative means to build machine learning techniques, the defeat highlights that AI is still fallible; there is still some way to go before AI will become the norm in the business world.

In other AI news, Microsoft has also launched its own publicity stunt, through Minecraft. The AIX platform allows computer scientists to use the world of Minecraft as a test bed for their own artificial intelligence projects. The platform is currently available to a small number of academic researchers, though it will be released under an open-source licence during 2016.

Minecraft appeals to the mass market because of the endless possibilities it offers users, and that open-ended nature also lends itself to artificial intelligence research. From exploring an unknown environment to building structures, the platform offers researchers an open playing field to build custom scenarios and challenges for an artificial intelligence offering.

Aside from the limitless environment, Minecraft also offers a cheaper alternative for researchers. In the real world, researchers may deploy a robot in the field, where any mishap may damage the robot itself. For example, should the robot fail to navigate around a ditch, the result could be costly repairs or even replacing the robot entirely. Falling into a ditch in Minecraft simply means restarting the game and the experiment.

“Minecraft is the perfect platform for this kind of research because it’s this very open world,” said Katja Hofmann, lead researcher at the Machine Learning and Perception group at Microsoft Research Cambridge. “You can do survival mode, you can do ‘build battles’ with your friends, you can do courses, you can implement your own games. This is really exciting for artificial intelligence because it allows us to create games that stretch beyond current abilities.”

One of the main challenges the Microsoft team is aiming to address is the process of learning and solving new problems. Scientists have become very efficient at teaching machines to do specific tasks, but decision making in new situations is the next step in the journey. This “general intelligence” is closer to the complex manner in which humans learn and make decisions every day. “A computer algorithm may be able to take one task and do it as well or even better than an average adult, but it can’t compete with how an infant is taking in all sorts of inputs – light, smell, touch, sound, discomfort – and learning that if you cry chances are good that Mom will feed you,” Microsoft highlighted in its blog.

Spotify shifts all music from data centres to Google Cloud

Music streaming service Spotify has announced that it is switching how it stores tunes for customers, copying all the music from its data centres onto Google’s Cloud Platform.

In a blog post, Spotify’s VP of Engineering & Infrastructure Nicholas Harteau explained that though the company’s data centres had served it well, the cloud is now sufficiently mature to surpass the quality, performance and cost Spotify got from owning its infrastructure. Spotify will now get its platform infrastructure from Google Cloud Platform ‘everywhere’, Harteau revealed.

“This is a big deal,” he said. Though Spotify has taken a traditional approach to delivering its music streams, it no longer feels it needs to buy or lease data-centre space, server hardware and networking gear to guarantee being as close to its customers as possible, according to Harteau.

“Like good engineers, we asked ourselves: do we really need to do all this stuff? For a long time the answer was yes. Recently that balance has shifted,” he said.

Operating data centres was a painful necessity for Spotify since it began in 2008 because it was the only way to guarantee the quality, performance and cost for its cloud. However, these days the storage, computing and network services available from cloud providers are as high quality, high performance and low cost as anything Spotify could create from the traditional ownership model, said Harteau.

Harteau explained why Spotify preferred Google’s cloud service to that of runaway market leader Amazon Web Services (AWS). The decision was shaped by Spotify’s experience with Google’s data platform and tools. “Good infrastructure isn’t just about keeping things up and running, it’s about making all of our teams more efficient and more effective, and Google’s data stack does that for us in spades,” he continued.

Harteau cited Dataproc’s batch processing, event delivery with Pub/Sub and the ‘nearly magical’ capacity of BigQuery as the three most persuasive features of Google’s cloud service offering.

Google launches Dataproc after successful beta trials

Google has announced that its big data analysis tool Dataproc is now generally available. The utility, which was one of the factors that persuaded Spotify to choose Google’s Cloud Platform over Amazon Web Services, is a managed tool based on the Hadoop and Spark open source big data software.

The service first became available in beta in September and was tested by global music streaming service Spotify, which was evaluating whether to move its music files away from its own data centres and into the public cloud – and which cloud service could support that move. In its beta form, Dataproc supported the MapReduce engine, the Pig platform for writing programmes and the Hive data warehousing software. Google says it has added new features and sharpened the tool since then.

While in its beta testing phase, Cloud Dataproc added features such as property tuning, VM metadata and tagging and cluster versioning. “In general availability new versions of Cloud Dataproc will be frequently released with new features, functions and software components,” said Google product manager James Malone.

Cloud Dataproc aims to minimise cost and complexity, the two major headaches of data processing, according to Malone.

“Spark and Hadoop should not break the bank and you should pay for what you actually use,” he said. As a result, Cloud Dataproc is priced at 1 cent per virtual CPU per hour. Billing is by the minute with a 10-minute minimum.
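That pricing model – a flat rate per vCPU per hour, billed by the minute with a 10-minute minimum – can be expressed in a few lines of Python. Note that the figure quoted in the article is the Dataproc premium charged on top of the underlying Compute Engine VM costs, which are billed separately; the function below is a sketch of the stated rules, not an official calculator.

```python
def dataproc_premium(vcpus, minutes, rate_per_vcpu_hour=0.01, min_minutes=10):
    """Dataproc premium in dollars: 1 cent per vCPU per hour,
    billed per minute with a 10-minute minimum."""
    billed_minutes = max(minutes, min_minutes)
    return vcpus * (billed_minutes / 60) * rate_per_vcpu_hour

# A 16-vCPU cluster running for 45 minutes costs 16 * 0.75h * $0.01 = $0.12
print(round(dataproc_premium(16, 45), 2))

# Any job shorter than 10 minutes is billed as 10 minutes
print(dataproc_premium(16, 3) == dataproc_premium(16, 10))
```

Per-minute billing matters precisely because of the fast cluster start-up Malone describes: short, bursty analysis jobs pay for minutes of cluster time rather than whole hours.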

Analysis should run faster, Malone said, because clusters in Cloud Dataproc can start and stop in less than 90 seconds, where other big data systems take minutes. This can make analyses run up to ten times faster. The general release of Cloud Dataproc also promises easier management, since clusters don’t need specialist administrators or software.

Cloud Dataproc also tackles two other data processing bugbears, scale and productivity, promised Malone. This tool complements a separate service called Google Cloud Dataflow for batch and stream processing. The underlying technology for the service has been accepted as an Apache incubator project under the name Apache Beam.

AWS, Azure and Google intensify cloud price war

As price competition intensifies among the top three cloud service providers, one analyst has warned that cloud buyers should not get drawn into a race to the bottom.

Following price cuts by AWS and Google, last week Microsoft lowered the price bar further with cuts to its Azure service. Though smaller players will struggle to compete on cost, the cloud market is a long way from an oligopoly, according to Quocirca analyst Clive Longbottom.

Amazon Web Services began the bidding in early January as chief technology evangelist Jeff Barr announced the company’s 51st cloud price cut on his official AWS blog.

On January 8th, Google’s Julia Ferraioli argued in a blog post that Google is now the more cost-effective offering as a result of its discounting scheme. “Google is anywhere from 15 to 41% less expensive than AWS for compute resources,” said Ferraioli. The key to Google’s lead in cost effectiveness, she claimed, is automatic sustained usage discounts and custom machine types that AWS can’t match.
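The “automatic sustained usage discounts” Ferraioli refers to worked, as published at the time, by charging each successive quarter of the month’s usage at a lower multiplier (100%, 80%, 60% and 40% of the base rate), which comes out to a 30% discount for an instance running the full month. The sketch below uses those tiers for illustration; treat the figures as indicative rather than current pricing.

```python
# GCE sustained-use tiers as published at the time:
# (fraction of the month, multiplier on the base hourly rate)
TIERS = [(0.25, 1.00), (0.25, 0.80), (0.25, 0.60), (0.25, 0.40)]

def effective_multiplier(usage_fraction):
    """Average multiplier on the base hourly rate for an instance that
    runs for `usage_fraction` of the month (0 < usage_fraction <= 1)."""
    cost, remaining = 0.0, usage_fraction
    for width, mult in TIERS:
        part = min(remaining, width)
        cost += part * mult
        remaining -= part
        if remaining <= 0:
            break
    return cost / usage_fraction

# Full-month usage averages out to 70% of the base rate: a 30% discount
print(round(effective_multiplier(1.0), 4))

# Usage up to a quarter of the month gets no discount
print(effective_multiplier(0.25))
```

The discount being automatic is the competitive point: AWS’s equivalent savings at the time required committing to reserved instances up front.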

Last week, Microsoft’s Cloud Platform product marketing director Nicole Herskowitz announced the latest round of price competition in a company blog post, unveiling a 17% cut to the prices of its Dv2 Virtual Machines.

Herskowitz claimed that Microsoft offers better price performance because, unlike AWS EC2, Azure’s Dv2 instances include load balancing and auto-scaling built in at no extra charge.

Microsoft is also aiming to change the perception of AWS’s superiority as an infrastructure service provider. “Azure customers are using the rich set of services spanning IaaS and PaaS,” wrote Herskowitz. “Today, more than half of Azure IaaS customers are benefiting by adopting higher level PaaS services.”

Price is not everything in this market, warned Quocirca analyst Longbottom; an equally important side of any cloud deal is overall value. “Even though AWS, Microsoft and Google all offer high availability and there is little doubting their professionalism in putting the stack together, it doesn’t mean that these are the right platform for all workloads. They have all had downtime that shouldn’t have happened,” said Longbottom.

The level of risk the provider is willing to protect the customer from and the business and technical help they provide are still deal breakers, Longbottom said. “If you need more support, then it may well be that something like IBM SoftLayer is a better bet. If you want pre-prepared software as a service, then you need to look elsewhere. So it’s still horses for courses and these three are not the only horses in town.”

Snooper’s charter a potential disaster warns lobby of US firms

The ‘snooper’s charter’ could neutralise the contribution of Britain’s digital economy, according to a group of US tech corporations including Facebook, Google, Microsoft, Twitter and Yahoo.

In a collective submission to the Draft Investigatory Powers Bill Joint Committee, they argue that surveillance should be “targeted, lawful, proportionate, necessary, jurisdictionally bounded, and transparent.”

These principles, the collective informs the parliamentary committee, reflect the perspective of global companies that offer “borderless technologies to billions of people around the globe”.

Extraterritorial jurisdiction will create ‘conflicting legal obligations’ for them, the collective said. If the UK government instructs foreign companies what to do, foreign governments may follow suit, they warn. A better long-term resolution might be the development of an ‘international framework’ with ‘a common set of rules’ to resolve jurisdictional conflicts.

“Encryption is a fundamental security tool, important to the security of the digital economy and crucial to the safety of web users worldwide,” the submission said. “We reject any proposals that would require companies to deliberately weaken the security of their products via backdoors, forced decryption or any other means.”

Another area of concern is the bill’s proposed legislation on Computer Network Exploitation which, the companies say, gives intelligence services legal powers to break into any system. This would be a very dangerous precedent to set, the submission argues: “we would urge your Government to reconsider.”

Finally, Facebook and co registered concern that the new law would prevent any discussion of government surveillance, even in court. “We urge the Government to make clear that actions taken under authorization do not introduce new risks or vulnerabilities for users or businesses, and that the goal of eliminating vulnerabilities is one shared by the UK Government. Without this, it would be impossible to see how these provisions could meet the proportionality test.”

The group submission joins individual protests registered by Apple, EE, F-Secure, the Internet Service Providers’ Association, Mozilla, The Tor Project and Vodafone.

The interests of British citizens hang in a very tricky balance, according to analyst Clive Longbottom at Quocirca. “Forcing vendors to provide back door access to their systems and platforms is bloody stupid, as the bad guys will make just as much use of them. However, the problem with terrorism is that it respects no boundaries. Neither, to a greater extent, do any of these companies. They have built themselves on a basis of avoiding jurisdictions – only through such a means can they minimise their tax payments,” said Longbottom.