Archivo de la categoría: Google

Apple reportedly moves iCloud from AWS to Google Cloud

Apple has moved some of its iCloud services onto Google Cloud, reducing its reliance on AWS, according to a CRN report.

Though Apple will remain an AWS customer, the report states that Google claims Apple will now spend between $400 million and $600 million on its cloud platform. Last month, financial services firm Morgan Stanley estimated that Apple spends $1 billion annually on AWS public cloud, though this figure is likely to fall over the coming years as Apple invests more in its own datacentres.

The company currently operates four datacentres worldwide and apparently has plans to open three more. It has been widely reported that Apple has set aside $3.9 billion to open datacentres in Arizona, Ireland and Denmark, with plans to open the first later this year.

Google has been struggling to keep pace with AWS and Microsoft’s Azure, but recent deals indicate an improved performance. A recent survey from RightScale demonstrated AWS’ dominance in the market, accounting for 57% of public cloud market share, while Azure currently commands second place and Google only accounts for 6% of the market.

To bolster its cloud business, Google hired VMware co-founder Diane Greene to lead the business unit, which includes Google for Work, Cloud Platform, and Google Apps. The appointment, together with the acquisition of bebop, a company Greene founded, highlights Google’s ambitions in the cloud world, where it claims to have larger data centre capacity than any other public cloud provider.

Industry insiders have told BCN that acquisitions such as this are one of the main reasons the public cloud market segment is becoming more competitive. Despite AWS’ market dominance, which some insiders attribute to it being first to market, offerings like Azure and Google are becoming more attractive propositions thanks in part to company and talent acquisitions.

Last month, the Google team secured another significant win after confirming music streaming service Spotify as a customer. Spotify had weighed continuing to run its own datacentres but said in its blog: “The storage, compute and network services available from cloud providers are as high quality, high performance and low cost as what the traditional approach provides.” The company also highlighted that the decision was driven by Google’s value-adds in its data platform and tools.

While Google and Apple have yet to comment on the deal, an Amazon spokesperson has implied the deal may not have happened at all, sending BCN the following emailed statement. “It’s kind of a puzzler to us because vendors who understand doing business with enterprises respect NDAs with their customers and don’t imply competitive defection where it doesn’t exist.”

The rumoured Apple/Google deal caps a tough couple of weeks for AWS. Aside from Apple and Spotify, the company also lost the majority of Dropbox’s business. AWS still occupies a strong position in the public cloud market, but there are increasing signs its competitors are raising their game.

Google’s AlphaGo publicity stunt raises profile of AI and machine learning

World Go champion Lee Se-dol beat AlphaGo, the AI program developed by Google’s DeepMind unit, this weekend, though he still trails the program 3-1 in the series.

Google’s publicity stunt highlights the progress made in the world of artificial intelligence and machine learning: before the match, commentators had predicted a runaway victory for Se-dol.

DeepMind founder Demis Hassabis commented on Twitter: “Lee Sedol is playing brilliantly! #AlphaGo thought it was doing well, but got confused on move 87. We are in trouble now…” The lapse allowed Se-dol to win the fourth game of the five-game series. While the stunt demonstrates the potential of machine learning, Se-dol’s consolation victory proves the technology is still capable of making mistakes.

The complexity of the game presented a number of problems for the DeepMind team, as traditional machine learning techniques would not have enabled the program to succeed. Conventional AI methods, which construct a search tree over all possible positions, would have required too much compute power due to the vast number of permutations within the game. Go is played primarily through intuition and feel, presenting a complex challenge for AI researchers.

The DeepMind team created a program that combined an advanced tree search with deep neural networks, enabling the program to play thousands of games against itself. These games allowed the machine to readjust its behaviour, a technique called reinforcement learning, and improve its performance day by day. The approach allows the machine to play human opponents in its own right, rather than mimicking players it has studied. Commentators who have watched all four games have repeatedly questioned whether some of the moves put forward by AlphaGo were mistakes or simply unconventional strategies devised by the reinforcement learning technique.
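
AlphaGo’s training pipeline is far more elaborate, but the core idea of learning through self-play can be sketched in a few lines. The example below is purely illustrative (it is not DeepMind’s method): a tabular agent learns the toy game of Nim by playing complete games against itself and updating its action values from the results.

```python
import random
from collections import defaultdict

# Illustrative self-play reinforcement learning on the toy game of Nim:
# players alternately take 1-3 stones and whoever takes the last one wins.
# A far cry from AlphaGo's deep networks and tree search, but the loop
# (play yourself, then adjust your valuations from the result) is the
# same idea in miniature.

ALPHA, EPSILON, START = 0.5, 0.1, 21
Q = defaultdict(float)                       # (stones_left, move) -> value

def legal_moves(stones: int):
    return [m for m in (1, 2, 3) if m <= stones]

def choose(stones: int) -> int:
    """Epsilon-greedy choice from the shared value table."""
    if random.random() < EPSILON:
        return random.choice(legal_moves(stones))
    return max(legal_moves(stones), key=lambda m: Q[(stones, m)])

for _ in range(50_000):                      # one self-play game per loop
    stones, history = START, []
    while stones > 0:
        move = choose(stones)
        history.append((stones, move))
        stones -= move
    reward = 1.0                             # last mover took the last stone
    for state, move in reversed(history):    # alternate win/lose back up the game
        Q[(state, move)] += ALPHA * (reward - Q[(state, move)])
        reward = -reward

# Greedy policy for small positions: the agent rediscovers the classic
# trick of leaving its opponent a multiple of four stones.
print({s: max(legal_moves(s), key=lambda m: Q[(s, m)]) for s in range(1, 10)})
```

Nobody programs the winning strategy in explicitly; it emerges from the games themselves, which is precisely what makes some of AlphaGo’s “unconventional” moves so hard to classify.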

Although AlphaGo demonstrates progress, as well as an alternative way of building machine learning systems, the defeat highlights that AI is still fallible; there is some way to go before AI becomes the norm in the business world.

In other AI news, Microsoft has launched its own publicity stunt, through Minecraft. The AIX platform allows computer scientists to use the world of Minecraft as a test bed for their own artificial intelligence projects. The platform is currently available to a small number of academic researchers, though it will be released under an open-source licence during 2016.

Minecraft appeals to the mass market because of the endless possibilities it offers users, and that same open-ended nature also lends itself to artificial intelligence research. From searching an unknown environment to building structures, the platform offers researchers an open playing field on which to build custom scenarios and challenges for an artificial intelligence.

Aside from the limitless environment, Minecraft also offers a cheaper alternative for researchers. In a real-world environment, researchers may deploy a robot in the field, where any mishap can damage the robot itself. For example, should the robot fail to navigate around a ditch, the result could be costly repairs or even replacing the robot entirely. Falling into a ditch in Minecraft simply means restarting the game and the experiment.
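
The point is easy to demonstrate in miniature. The sketch below is not Microsoft’s AIX platform (which hooks into the real game); it is a hypothetical, self-contained gridworld in the same spirit, where an agent wandering into a ditch simply triggers a free reset rather than a repair bill.

```python
import random

# Hypothetical stand-in for a Minecraft-style experiment: a 5x5 gridworld
# containing a ditch. A real robot that falls into a ditch needs repairs;
# here the episode simply resets and the experiment carries on for free.

SIZE, START, DITCH, GOAL = 5, (0, 0), (2, 2), (4, 4)
MOVES = {"N": (0, -1), "S": (0, 1), "E": (1, 0), "W": (-1, 0)}

def step(pos, action):
    """Apply a move, clamping the agent to the edges of the map."""
    dx, dy = MOVES[action]
    return (min(max(pos[0] + dx, 0), SIZE - 1),
            min(max(pos[1] + dy, 0), SIZE - 1))

pos, falls, goals = START, 0, 0
for _ in range(10_000):                      # a simple random-walk agent
    pos = step(pos, random.choice(list(MOVES)))
    if pos == DITCH:
        falls += 1                           # "damage" costs nothing here:
        pos = START                          # just restart the episode
    elif pos == GOAL:
        goals += 1
        pos = START

print(f"{falls} falls into the ditch, {goals} goals reached, zero repair bills")
```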

“Minecraft is the perfect platform for this kind of research because it’s this very open world,” said Katja Hofmann, lead researcher at the Machine Learning and Perception group at Microsoft Research Cambridge. “You can do survival mode, you can do ‘build battles’ with your friends, you can do courses, you can implement your own games. This is really exciting for artificial intelligence because it allows us to create games that stretch beyond current abilities.”

One of the main challenges the Microsoft team is aiming to address is how machines learn and make decisions in new situations. Scientists have become very efficient at teaching machines to do specific tasks, but decision-making in unfamiliar situations is the next step in the journey. This “General Intelligence” is closer to the complex manner in which humans learn and make decisions every day. “A computer algorithm may be able to take one task and do it as well or even better than an average adult, but it can’t compete with how an infant is taking in all sorts of inputs – light, smell, touch, sound, discomfort – and learning that if you cry chances are good that Mom will feed you,” Microsoft highlighted in its blog.

Spotify shifts all music from data centres to Google Cloud

Music streaming service Spotify has announced that it is switching how it stores tunes for customers, copying all the music in its data centres onto Google’s Cloud Platform.

In a blog post, Spotify’s VP of Engineering & Infrastructure Nicholas Harteau explained that though the company’s data centres had served it well, the cloud is now sufficiently mature to surpass the level of quality, performance and cost Spotify got from owning its infrastructure. Spotify will now get its platform infrastructure from Google Cloud Platform ‘everywhere’, Harteau revealed.

“This is a big deal,” he said. Though Spotify has taken a traditional approach to delivering its music streams, it no longer feels it needs to buy or lease data-centre space, server hardware and networking gear to guarantee being as close to its customers as possible, according to Harteau.

“Like good engineers, we asked ourselves: do we really need to do all this stuff? For a long time the answer was yes. Recently that balance has shifted,” he said.

Operating data centres had been a painful necessity for Spotify since it began in 2008, because it was the only way to guarantee the quality, performance and cost of its service. These days, however, the storage, computing and network services available from cloud providers are as high quality, high performance and low cost as anything Spotify could create from the traditional ownership model, said Harteau.

Harteau explained why Spotify preferred Google’s cloud service to that of runaway market leader Amazon Web Services (AWS). The decision was shaped by Spotify’s experience with Google’s data platform and tools. “Good infrastructure isn’t just about keeping things up and running, it’s about making all of our teams more efficient and more effective, and Google’s data stack does that for us in spades,” he continued.

Harteau cited Dataproc’s batch processing, event delivery with Pub/Sub and the ‘nearly magical’ capacity of BigQuery as the three most persuasive features of Google’s cloud service offering.
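
Harteau’s post did not include code, but to give a flavour of that data stack, running an analytical query against BigQuery from Python takes only a few lines with Google’s client library. The project, dataset and table names below are invented for illustration.

```python
# pip install google-cloud-bigquery
# Illustrative only: project, dataset and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

# Aggregate plays per track for one day: the kind of ad-hoc analytical
# query Spotify credits BigQuery with making "nearly magical".
query = """
    SELECT track_id, COUNT(*) AS plays
    FROM `my-analytics-project.listening.events`
    WHERE DATE(played_at) = "2016-02-23"
    GROUP BY track_id
    ORDER BY plays DESC
    LIMIT 10
"""

for row in client.query(query).result():   # runs the job, waits, iterates
    print(row.track_id, row.plays)
```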

Google launches Dataproc after successful beta trials

Google has announced that its big data analysis tool Dataproc is now on general release. The utility, which was one of the factors that persuaded Spotify to choose Google’s Cloud Platform over Amazon Web Services, is a managed tool based on the Hadoop and Spark open source big data software.

The service first became available in beta in September and was tested by global music streaming service Spotify, which was evaluating whether it should move its music files away from its own data centres and into the public cloud – and which cloud service could support it. Dataproc in its beta form supported the MapReduce engine, the Pig platform for writing programmes and the Hive data warehousing software. Google says it has added new features and sharpened the tool since then.

While in its beta testing phase, Cloud Dataproc added features such as property tuning, VM metadata and tagging, and cluster versioning. “In general availability new versions of Cloud Dataproc will be frequently released with new features, functions and software components,” said Google product manager James Malone.

Cloud Dataproc aims to minimise cost and complexity, which are the two major distractions of data processing, according to Malone.

“Spark and Hadoop should not break the bank and you should pay for what you actually use,” he said. As a result, Cloud Dataproc is priced at 1 cent per virtual CPU per hour. Billing is by the minute with a 10-minute minimum.
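
At that rate the sums are straightforward. A minimal sketch of the billing rule as stated, applied to a hypothetical cluster (note that the quoted rate covers the Dataproc service itself; the underlying Compute Engine resources are billed separately):

```python
# Cloud Dataproc billing as described: $0.01 per vCPU per hour,
# charged by the minute with a 10-minute minimum.
def dataproc_cost(vcpus: int, minutes: float, rate: float = 0.01) -> float:
    billable = max(minutes, 10)          # 10-minute minimum charge
    return vcpus * rate * billable / 60  # per-minute billing

# A hypothetical 16-worker cluster of 4-vCPU machines (64 vCPUs)
# running a 25-minute job:
print(f"${dataproc_cost(64, 25):.2f}")   # $0.27
```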

Analysis should run faster, Malone said, because clusters in Cloud Dataproc can start and stop in less than 90 seconds, where they take minutes in other big data systems. This can make analyses run up to ten times faster. The new general release of Cloud Dataproc should also be easier to manage, since clusters don’t need specialist administrators or software.
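
Scripting that lifecycle is short enough to show. A minimal sketch using the Cloud SDK from Python, assuming gcloud is installed and authenticated; the cluster name and sizing are hypothetical, and flags should be checked against current documentation.

```python
# Minimal sketch of a scripted Dataproc cluster lifecycle via the Cloud SDK.
# Assumes `gcloud` is installed and authenticated; names and sizes are
# hypothetical.
import subprocess

def dataproc(*args: str) -> None:
    subprocess.run(["gcloud", "dataproc", "clusters", *args], check=True)

# Create a small cluster; Google says this completes in under 90 seconds.
dataproc("create", "adhoc-analysis",
         "--num-workers", "2", "--worker-machine-type", "n1-standard-4")

# ... submit Spark or Hadoop jobs against the cluster here ...

# Tear the cluster down as soon as the job finishes to stop the meter.
dataproc("delete", "adhoc-analysis", "--quiet")
```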

Cloud Dataproc also tackles two other data processing bugbears, scale and productivity, promised Malone. This tool complements a separate service called Google Cloud Dataflow for batch and stream processing. The underlying technology for the service has been accepted as an Apache incubator project under the name Apache Beam.

AWS, Azure and Google intensify cloud price war

As price competition intensifies among the top three cloud service providers, one analyst has warned that cloud buyers should not get drawn into a race to the bottom.

Following price cuts by AWS and Google, last week Microsoft lowered the price bar further with cuts to its Azure service. Though smaller players will struggle to compete on costs, the cloud services market is a long way from an oligopoly, according to Quocirca analyst Clive Longbottom.

Amazon Web Services began the bidding in early January as chief technology evangelist Jeff Barr announced the company’s 51st cloud price cut on his official AWS blog.

On January 8th, Google’s Julia Ferraioli argued via a blog post that Google is now the more cost-effective offering as a result of its discounting scheme. “Google is anywhere from 15 to 41% less expensive than AWS for compute resources,” said Ferraioli. The key to Google’s lead in cost effectiveness, Ferraioli claimed, is automatic sustained usage discounts and custom machine types that AWS can’t match.
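
Those sustained usage discounts work by billing each successive quarter of the month that an instance runs at a progressively lower rate, so a full month of use earns an effective 30% discount with no upfront reservation. A rough sketch of the published tier structure at the time (treat the exact multipliers as illustrative):

```python
# Sketch of the sustained usage discount tiers as published at the time:
# each successive 25% of the month an instance runs is billed at a lower
# fraction of the base rate.
TIERS = [1.00, 0.80, 0.60, 0.40]   # rate multiplier per quarter-month block

def effective_rate(fraction_of_month: float) -> float:
    """Average multiplier on the base price for a given usage fraction."""
    billed, remaining = 0.0, min(max(fraction_of_month, 0.0), 1.0)
    for multiplier in TIERS:
        block = min(remaining, 0.25)
        billed += block * multiplier
        remaining -= block
    return billed / fraction_of_month

for f in (0.25, 0.5, 0.75, 1.0):
    print(f"{f:.0%} of month -> pays {effective_rate(f):.0%} of base rate")
# A full month pays 70% of the base rate: the headline 30% discount.
```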

Last week, Microsoft’s Cloud Platform product marketing director Nicole Herskowitz opened the latest round of price competition with a company blog post announcing a 17% cut to the prices of its Dv2 Virtual Machines.

Herskowitz claimed that Microsoft offers better price performance because, unlike AWS EC2, Azure’s Dv2 instances include load balancing and auto-scaling built in at no extra charge.

Microsoft is also aiming to change the perception of AWS’s superiority as an infrastructure service provider. “Azure customers are using the rich set of services spanning IaaS and PaaS,” wrote Herskowitz. “Today, more than half of Azure IaaS customers are benefiting by adopting higher level PaaS services.”

Price is not everything in this market, warned Quocirca analyst Longbottom; an equally important side of any cloud deal is overall value. “Even though AWS, Microsoft and Google all offer high availability and there is little doubting their professionalism in putting the stack together, it doesn’t mean that these are the right platform for all workloads. They have all had downtime that shouldn’t have happened,” said Longbottom.

The level of risk the provider is willing to protect the customer from and the business and technical help they provide are still deal breakers, Longbottom said. “If you need more support, then it may well be that something like IBM SoftLayer is a better bet. If you want pre-prepared software as a service, then you need to look elsewhere. So it’s still horses for courses and these three are not the only horses in town.”

Snooper’s charter a potential disaster, warns lobby of US firms

The ‘snooper’s charter’ could neutralise the contribution of Britain’s digital economy, according to a group of US tech corporations including Facebook, Google, Microsoft, Twitter and Yahoo.

In a collective submission to the Draft Investigatory Powers Bill Joint Committee, they argue that surveillance should be “targeted, lawful, proportionate, necessary, jurisdictionally bounded, and transparent.”

These principles, the collective informs the parliamentary committee, reflect the perspective of global companies that offer “borderless technologies to billions of people around the globe”.

The bill’s extraterritorial jurisdiction will create ‘conflicting legal obligations’ for them, the collective said. If the UK government instructs foreign companies what to do, then foreign governments may follow suit, they warn. A better long-term resolution, they suggest, would be the development of an ‘international framework’ with ‘a common set of rules’ to resolve jurisdictional conflicts.

“Encryption is a fundamental security tool, important to the security of the digital economy and crucial to the safety of web users worldwide,” the submission said. “We reject any proposals that would require companies to deliberately weaken the security of their products via backdoors, forced decryption or any other means.”

Another area of concern is the bill’s proposed legislation on Computer Network Exploitation which, the companies say, gives intelligence services legal powers to break into any system. This would be a very dangerous precedent to set, the submission argues: “we would urge your Government to reconsider,” it said.

Finally, Facebook and co registered concern that the new law would prevent any discussion of government surveillance, even in court. “We urge the Government to make clear that actions taken under authorization do not introduce new risks or vulnerabilities for users or businesses, and that the goal of eliminating vulnerabilities is one shared by the UK Government. Without this, it would be impossible to see how these provisions could meet the proportionality test.”

The group submission joins individual protests registered by Apple, EE, F-Secure, the Internet Service Providers’ Association, Mozilla, The Tor Project and Vodafone.

The interests of British citizens hang in a very tricky balance, according to analyst Clive Longbottom at Quocirca. “Forcing vendors to provide back door access to their systems and platforms is bloody stupid, as the bad guys will make just as much use of them. However, the problem with terrorism is that it respects no boundaries. Neither, to a greater extent, do any of these companies. They have built themselves on a basis of avoiding jurisdictions – only through such a means can they minimise their tax payments,” said Longbottom.

Containers at Christmas: wrapping, cloud and competition

As anyone that’s ever been disappointed by a Christmas present will tell you – shiny packaging can be very misleading. As we hear all the time, it’s what’s inside that counts…

What then, are we to make of the Docker hype, centred precisely on shiny, new packaging? (Docker is the vendor that two years ago found a way to containerise applications; other types of containers, namely operating system containers, have been around for a couple of decades.)

It is not all about the packaging, of course. Perhaps we should say that it is more about what the package is placed on, and how it is managed (amongst other things), that matters most.

Regardless, containers are one part of a changing cloud, data centre and enterprise IT landscape, with the ‘cloud native’ movement widely seen as driving a significant shift in enterprise infrastructure and application development.

What the industry is trying to figure out, and what could prove the most disruptive angle to watch as more and more enterprises roll out containers into production, is the developing competition within this whole container/cloud/data centre market.

The question of competition is a very hot topic in the container, devops and cloud space. Nobody could have thought the OCI co-operation between Docker and CoreOS meant they were suddenly BFFs. Indeed, the drive to become the enterprise container of choice now seems to be at the forefront of both companies’ plans. Is this, however, the most dynamic relationship in the space? What about the Google-Docker-Mesos orchestration game? It would seem that Google’s trusted container experience is already allowing it to gain favour with enterprises, with Kubernetes taking a lead. And with CoreOS in bed with Google’s open source Kubernetes, placing it at the heart of Tectonic, does this mean that CoreOS has a stronger play in the enterprise market than Docker? We will wait and see…

We will also wait and see how the Big Cloud Three will come out of the expected container-driven market shift. Somebody described AWS as ‘a BT’ to me…that is, the incumbent who will be affected most by the new disruptive changes brought by containers, since it makes a lot of money from an older model of infrastructure….

Microsoft’s container ambition is also being watched closely. There is a lot of interest from both the development and IT Ops communities in its play in the emerging ecosystem. At a recent meet-up, an Azure evangelist had to field a number of deeply technical questions about exactly how Microsoft’s containers fare next to Linux’s. The question is whether, when assessing who will win the largest piece of the enterprise pie, this will prove the crux of the matter.

Containers are not merely changing the enterprise cloud game (with third place Google seemingly getting it very right) but also driving the IT Ops’ DevOps dream to reality; in fact, many are predicting that it could eventually prove a bit of a threat to Chef and Puppet’s future….

So, maybe kids at Christmas have got it right….it is all about the wrapping and boxes! We’ll have to wait a little longer than Christmas Day to find out.

Written by Lucy Ashton, Head of Content & Production, Container World

Google upgrades Cloud SQL, promises managed MySQL offerings

Google has announced the beta availability of a new improved Cloud SQL for Google Cloud Platform – and an alpha version of its much anticipated Content Delivery Network offering.

In a blog post, Brett Hesterberg, Product Manager for Google’s Cloud Platform, said the second generation of Cloud SQL aims to deliver better performance and more ‘scalability per dollar’.

In Google’s internal testing, the second generation Cloud SQL proved seven times faster than the first generation and it now scales to 10TB of data, 15,000 IOPS and 104GB of RAM per instance, Hesterberg said.

The upshot is that transactional databases now have a flexibility that was unachievable with traditional relational databases. “With Cloud SQL we’ve changed that,” Hesterberg said. “Flexibility means easily scaling a database up and down.”

Databases can now ramp up and down in size and in the number of queries they handle per day. The allocation of resources like CPU cores and RAM can be adapted more skilfully with Cloud SQL, using a variety of tools such as MySQL Workbench, Toad and the MySQL command line. Another promised improvement is that any client can be used for access, including Compute Engine, Managed VMs, Container Engine and workstations.

In the new cloud environment, databases need to be easy to stop and restart if they are only used occasionally for brief or infrequent tasks, according to Hesterberg. Cloud SQL now caters for these increasingly common patterns through the Cloud Console, the gcloud command line within Google’s Cloud SDK or a RESTful API. This makes administration a scriptable job and minimises costs by only running the databases when necessary.
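
As a flavour of that scripting, stopping and restarting an instance can be done by patching its activation policy from the gcloud command line. The sketch below is illustrative only; the instance name is hypothetical and the flag spelling should be checked against current documentation.

```python
# Sketch: stop and restart a Cloud SQL instance from a script.
# Assumes an authenticated Cloud SDK; the instance name is hypothetical.
import subprocess

def set_activation(instance: str, policy: str) -> None:
    # "NEVER" stops the instance; "ALWAYS" brings it back up.
    subprocess.run(["gcloud", "sql", "instances", "patch", instance,
                    "--activation-policy", policy], check=True)

set_activation("nightly-reports-db", "ALWAYS")   # wake the database
# ... run the brief or infrequent task here ...
set_activation("nightly-reports-db", "NEVER")    # stop it again to save cost
```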

Cloud SQL will create more manageable MySQL databases, claims Hesterberg, since Google will apply patches and updates to MySQL, manage backups, configure replication and provide automatic failover for High Availability (HA) in the event of a zone outage. “It means you get Google’s operational expertise for your MySQL database,” says Hesterberg. Subscribers signed up for Google Cloud Platform can now get a $300 credit to test drive Cloud SQL, it announced.

Meanwhile, in another blog post, Google announced an alpha release of its own content delivery network, Google Cloud CDN. The system may not be consistent and is not recommended for production use, Google warned.

Google Cloud CDN will speed up its cloud services using distributed edge caches to bring content closer to users in a bid to compensate for its relatively low global data centre coverage against rivals AWS and Azure.

Google signs five deals for green powering its cloud services

Cloud service giant Google has announced five new deals to buy 781MW of renewable energy from suppliers in the US, Sweden and Chile, according to a report on Bloomberg.

The deals add up to the biggest purchase of renewable energy ever by a company that is not a utility, according to Michael Terrell, Google’s principal of energy and global infrastructure.

Google will buy 200 megawatts of power from Renewable Energy Systems Americas’ Bluestem wind project in Oklahoma. Another 200 megawatts from the same US state will come from the Great Western wind project run by Electricite de France. In addition, Google will power its cloud services with 225 megawatts of wind power from independent power producer Invenergy.

Google’s data centres and cloud services in South America could become carbon free when the 80 megawatts of solar power that it has ordered from Acciona Energia’s El Romero farm in Chile comes online.

In Scandinavia, the cloud service provider has agreed to buy 76 megawatts of wind power from Eolus Vind’s Jenasen wind project, to be built in Vasternorrland County, Sweden.
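
For the record, the five agreements do add up to the headline figure:

```python
# The five purchases listed above, in megawatts:
deals = {"Bluestem": 200, "Great Western": 200, "Invenergy": 225,
         "El Romero": 80, "Jenasen": 76}
print(sum(deals.values()))   # 781, matching the headline figure
```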

In July, Google committed to tripling its purchases of renewable energy by 2025. At the time, it had contracts to buy 1.1 GW of sustainably sourced power.

Google’s first green power deal came in 2010, when it agreed to buy power from a wind farm in Iowa. Last week, it announced plans to purchase 61 megawatts from a solar farm in North Carolina.

Google appoints ex-VMware boss to lead enterprise web services business

Google has appointed former VMware CEO and current Google board member Diane Greene to head a new business-oriented cloud service.

Though Google is associated with consumer products and overshadowed by AWS in enterprise cloud computing, AWS’s lead is not unassailable, claimed Google CEO Sundar Pichai in the company’s official blog as the appointment was announced.

“More than 60% of the Fortune 500 are actively using a paid Google for Work product and only a tiny fraction of the world’s data is currently in the cloud,” he said. “Most businesses and applications aren’t cloud-based yet. This is an important and fast-growing area for Google and we’re investing for the future.”

Since all of Google’s own businesses run on its cloud infrastructure, the company has significantly larger data centre capacity than any other public cloud provider, Pichai argued. “That’s what makes it possible for customers to receive the best price and performance for compute and storage services. All of this demonstrates great momentum, but it’s really just the beginning,” he said.

Pichai stated the new business will bring together product, engineering, marketing and sales, and Greene’s brief will be to integrate them into one cohesive offering. “Diane has a huge amount of operational experience that will continue to help the company,” he said.

In addition, Google is to acquire bebop, a company founded by Greene, to simplify the building and maintenance of enterprise applications. “This will help many more businesses find great applications and reap the benefits of cloud computing,” said Pichai.

Bebop’s resources will be dedicated to building and integrating the entire range of Google’s cloud products from devices like Android and Chromebooks, through infrastructure and services in the Google Cloud Platform, to developer frameworks for mobile and enterprise users and finally end-user applications like Gmail and Docs.

The market for these cloud development tools will be worth $2.3 billion in 2019, up from $803 million this year, according to IDC. The knock-on effect is that more apps will run on the cloud of the service provider that supported their development, and that hosting business will triple to $22.6 billion by 2019, IDC says.

Greene and the bebop staff will join Google once the acquisition has completed. Greene’s division has yet to be named, but it will include Google for Work, Cloud Platform, and Google Apps, according to Android Central.