Category archive: Google

Spotify shifts all music from data centres to Google Cloud

Music streaming service Spotify has announced that it is changing how it stores music for its customers, copying everything from its own data centres onto Google’s Cloud Platform.

In a blog post, Spotify’s VP of Engineering & Infrastructure Nicholas Harteau explained that though the company’s data centres had served it well, the cloud is now sufficiently mature to surpass the quality, performance and cost Spotify got from owning its infrastructure. Spotify will now get its platform infrastructure from Google Cloud Platform ‘everywhere’, Harteau revealed.

“This is a big deal,” he said. Though Spotify has taken a traditional approach to delivering its music streams, it no longer feels it needs to buy or lease data-centre space, server hardware and networking gear to guarantee being as close to its customers as possible, according to Harteau.

“Like good engineers, we asked ourselves: do we really need to do all this stuff? For a long time the answer was yes. Recently that balance has shifted,” he said.

Operating data centres had been a painful necessity for Spotify since it launched in 2008, because it was the only way to guarantee the quality, performance and cost of its service. These days, however, the storage, computing and network services available from cloud providers are as high quality, high performance and low cost as anything Spotify could create under the traditional ownership model, said Harteau.

Harteau explained why Spotify preferred Google’s cloud service to that of runaway market leader Amazon Web Services (AWS). The decision was shaped by Spotify’s experience with Google’s data platform and tools. “Good infrastructure isn’t just about keeping things up and running, it’s about making all of our teams more efficient and more effective, and Google’s data stack does that for us in spades,” he continued.

Harteau cited Dataproc’s batch processing, event delivery with Pub/Sub and the ‘nearly magical’ capabilities of BigQuery as the three most persuasive features of Google’s cloud service offering.
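To make that concrete, event delivery and ad-hoc analysis of the kind Harteau describes might look something like the following minimal Python sketch using Google’s published client libraries; the project, topic and table names here are hypothetical.

    from google.cloud import bigquery, pubsub_v1

    # Publish a play event to a Pub/Sub topic (all names hypothetical).
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "track-plays")
    publisher.publish(topic_path, data=b'{"track_id": "abc123"}').result()

    # Ask BigQuery for the most-played tracks gathered from that stream.
    client = bigquery.Client(project="my-project")
    query = """
        SELECT track_id, COUNT(*) AS plays
        FROM `my-project.events.track_plays`
        GROUP BY track_id
        ORDER BY plays DESC
        LIMIT 10
    """
    for row in client.query(query):
        print(row.track_id, row.plays)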

Google launches Dataproc after successful beta trials

Google has announced that its big data analysis tool Dataproc is now on general release. The utility, which was one of the factors that persuaded Spotify to choose Google’s Cloud Platform over Amazon Web Services, is a managed service based on the open source big data frameworks Hadoop and Spark.

The service first became available in beta in September and was tested by global music streaming service Spotify, which was evaluating whether it should move its music files away from its own data centres and into the public cloud – and which cloud service could support it. Dataproc in its beta form supported the MapReduce engine, the Pig platform for writing programmes and the Hive data warehousing software. Google says it has added new features and sharpened the tool since then.
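As a rough sketch of what the managed workflow looks like in practice, the following Python fragment creates a small cluster and submits a PySpark job using the google-cloud-dataproc client library that Google now ships; the project, bucket and cluster names are hypothetical.

    from google.cloud import dataproc_v1

    REGION = "us-central1"
    ENDPOINT = {"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}

    # Create a small managed Hadoop/Spark cluster (names hypothetical).
    clusters = dataproc_v1.ClusterControllerClient(client_options=ENDPOINT)
    cluster = {
        "project_id": "my-project",
        "cluster_name": "demo-cluster",
        "config": {
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-4"},
        },
    }
    clusters.create_cluster(
        request={"project_id": "my-project", "region": REGION, "cluster": cluster}
    ).result()

    # Submit a PySpark job to the new cluster and wait for it to finish.
    jobs = dataproc_v1.JobControllerClient(client_options=ENDPOINT)
    job = {
        "placement": {"cluster_name": "demo-cluster"},
        "pyspark_job": {"main_python_file_uri": "gs://my-bucket/word_count.py"},
    }
    jobs.submit_job_as_operation(
        request={"project_id": "my-project", "region": REGION, "job": job}
    ).result()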

While in its beta testing phase, Cloud Dataproc added features such as property tuning, VM metadata and tagging, and cluster versioning. “In general availability new versions of Cloud Dataproc will be frequently released with new features, functions and software components,” said Google product manager James Malone.

Cloud Dataproc aims to minimise cost and complexity, which are the two major distractions of data processing, according to Malone.

“Spark and Hadoop should not break the bank and you should pay for what you actually use,” he said. As a result, Cloud Dataproc is priced at 1 cent per virtual CPU per hour. Billing is by the minute with a 10-minute minimum.
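On those figures the Dataproc premium is simple arithmetic; a minimal sketch (the underlying Compute Engine VM charges are billed separately and excluded here):

    # Dataproc premium: $0.01 per vCPU per hour, billed by the minute
    # with a 10-minute minimum; Compute Engine VM costs are separate.
    def dataproc_premium_usd(vcpus, minutes):
        billable_minutes = max(minutes, 10)        # 10-minute minimum
        return vcpus * 0.01 * billable_minutes / 60

    # A 16-vCPU cluster (say 4 nodes x 4 vCPUs) running for 25 minutes:
    print(round(dataproc_premium_usd(16, 25), 4))  # 0.0667 -> about 7 cents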

Analysis should run faster, Malone said, because clusters in Cloud Dataproc can start and stop operations in less than 90 seconds, where they take minutes in other big data systems; this can make analyses run up to ten times faster. The new general release of Cloud Dataproc will also be easier to manage, since clusters don’t need specialist administrators or software.

Cloud Dataproc also tackles two other data processing bugbears, scale and productivity, promised Malone. This tool complements a separate service called Google Cloud Dataflow for batch and stream processing. The underlying technology for the service has been accepted as an Apache incubator project under the name Apache Beam.

AWS, Azure and Google intensify cloud price war

As price competition intensifies among the top three cloud service providers, one analyst has warned that cloud buyers should not get drawn into a race to the bottom.

Following price cuts by AWS and Google, last week Microsoft lowered the price bar further with cuts to its Azure service. Though smaller players will struggle to compete on costs, the cloud services market is a long way from an oligopoly, according to Quocirca analyst Clive Longbottom.

Amazon Web Services began the bidding in early January as chief technology evangelist Jeff Barr announced the company’s 51st cloud price cut on his official AWS blog.

On January 8th, Google’s Julia Ferraioli argued in a blog post that Google is now the more cost-effective offering as a result of its discounting scheme. “Google is anywhere from 15 to 41% less expensive than AWS for compute resources,” said Ferraioli. The key to Google’s latest lead in cost effectiveness is automatic sustained use discounts and custom machine types that AWS can’t match, claimed Ferraioli.
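The sustained use discount Ferraioli points to worked, at the time, by billing each successive quarter of a month’s usage at 100%, 80%, 60% and 40% of the base rate, netting out at 30% off for an instance that runs all month. A minimal sketch of that tiering (the 15 to 41% range quoted also reflects base-price differences and custom machine types):

    # GCE sustained use discounts: each quarter of the month is billed
    # at a lower incremental rate (100%, 80%, 60%, 40% of base price).
    TIERS = [(0.25, 1.00), (0.25, 0.80), (0.25, 0.60), (0.25, 0.40)]

    def effective_rate(fraction_of_month):
        billed, remaining = 0.0, fraction_of_month
        for width, multiplier in TIERS:
            used = min(remaining, width)
            billed += used * multiplier
            remaining -= used
        return billed / fraction_of_month

    print(effective_rate(1.0))  # 0.70 -> 30% off for a full month
    print(effective_rate(0.5))  # 0.90 -> 10% off at half-month usage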

Last week, Microsoft’s Cloud Platform product marketing director Nicole Herskowitz announced the latest round of price competition in a company blog post, unveiling a 17% cut to the prices of its Dv2 virtual machines.

Herskowitz claimed that Microsoft offers better price performance because, unlike AWS EC2, Azure’s Dv2 instances include load balancing and auto-scaling built in at no extra charge.

Microsoft is also aiming to change the perception of AWS’s superiority as an infrastructure service provider. “Azure customers are using the rich set of services spanning IaaS and PaaS,” wrote Herskowitz. “Today, more than half of Azure IaaS customers are benefiting by adopting higher level PaaS services.”

Price is not everything in this market, warned Quocirca analyst Longbottom; an equally important side of any cloud deal is overall value. “Even though AWS, Microsoft and Google all offer high availability and there is little doubting their professionalism in putting the stack together, it doesn’t mean that these are the right platform for all workloads. They have all had downtime that shouldn’t have happened,” said Longbottom.

The level of risk the provider is willing to protect the customer from and the business and technical help they provide are still deal breakers, Longbottom said. “If you need more support, then it may well be that something like IBM SoftLayer is a better bet. If you want pre-prepared software as a service, then you need to look elsewhere. So it’s still horses for courses and these three are not the only horses in town.”

Snooper’s charter a potential disaster, warns lobby of US firms

The ‘snooper’s charter’ could neutralise the contribution of Britain’s digital economy, according to a coalition of US tech corporations including Facebook, Google, Microsoft, Twitter and Yahoo.

In a collective submission to the Draft Investigatory Powers Bill Joint Committee, they argue that surveillance should be “targeted, lawful, proportionate, necessary, jurisdictionally bounded, and transparent.”

These principles, the collective informs the parliamentary committee, reflect the perspective of global companies that offer “borderless technologies to billions of people around the globe”.

The bill’s extraterritorial jurisdiction will create ‘conflicting legal obligations’ for them, the collective said. If the UK government instructs foreign companies what to do, then foreign governments may follow suit, they warn. A better long-term resolution might be the development of an ‘international framework’ with ‘a common set of rules’ to resolve jurisdictional conflicts.

“Encryption is a fundamental security tool, important to the security of the digital economy and crucial to the safety of web users worldwide,” the submission said. “We reject any proposals that would require companies to deliberately weaken the security of their products via backdoors, forced decryption or any other means.”

Another area of concern is the bill’s proposed legislation on Computer Network Exploitation which, the companies say, gives intelligence services legal powers to break into any system. This would be a very dangerous precedent to set, the submission argues: “we would urge your Government to reconsider.”

Finally, Facebook and co registered concern that the new law would prevent any discussion of government surveillance, even in court. “We urge the Government to make clear that actions taken under authorization do not introduce new risks or vulnerabilities for users or businesses, and that the goal of eliminating vulnerabilities is one shared by the UK Government. Without this, it would be impossible to see how these provisions could meet the proportionality test.”

The group submission joins individual protests registered by Apple, EE, F-Secure, the Internet Service Providers’ Association, Mozilla, The Tor Project and Vodafone.

The interests of British citizens hang in a very tricky balance, according to analyst Clive Longbottom at Quocirca. “Forcing vendors to provide back door access to their systems and platforms is bloody stupid, as the bad guys will make just as much use of them. However, the problem with terrorism is that it respects no boundaries. Neither, to a greater extent, do any of these companies. They have built themselves on a basis of avoiding jurisdictions – only through such a means can they minimise their tax payments,” said Longbottom.

Containers at Christmas: wrapping, cloud and competition

As anyone that’s ever been disappointed by a Christmas present will tell you – shiny packaging can be very misleading. As we hear all the time, it’s what’s inside that counts…

What then, are we to make of the Docker hype, centred precisely on shiny, new packaging? (Docker is the vendor that two years ago found a way to containerise applications; other types of containers, operating system containers, have been around for a couple of decades.)

It is not all about the packaging, of course. Perhaps we should say that what matters most is what the package is placed on, and how it is managed (amongst other things)?

Regardless, containers are one part of a changing cloud, data centre and enterprise IT landscape, with the ‘cloud native’ movement widely seen as driving a significant shift in enterprise infrastructure and application development.

What the industry is trying to figure out, and what could prove the most disruptive angle to watch as more and more enterprises roll out containers into production, is the developing competition within this whole container/cloud/data centre market.

The question of competition is a very hot topic in the container, devops and cloud space. Nobody could have thought the OCI co-operation between Docker and CoreOS meant they were suddenly BFFs. Indeed, the drive to become the enterprise container of choice now seems to be at the forefront of both companies’ plans. Is this, however, the most dynamic relationship in the space? What about the Google-Docker-Mesos orchestration game? It would seem that Google’s trusted container experience is already allowing it to gain favour with enterprises, with Kubernetes taking a lead. And with CoreOS in bed with Google’s open source Kubernetes, placing it at the heart of Tectonic, does this mean that CoreOS has a stronger play in the enterprise market than Docker? We will wait and see…

We will also wait and see how the Big Cloud Three will come out of the expected container-driven market shift. Somebody described AWS as ‘a BT’ to me…that is, the incumbent who will be affected most by the new disruptive changes brought by containers, since it makes a lot of money from an older model of infrastructure….

Microsoft’s container ambition is also being watched closely. There is a lot of interest from both the development and IT Ops communities in its play in the emerging ecosystem. At a recent meet-up, an Azure evangelist had to field a number of deeply technical questions about exactly how Microsoft’s containers fare next to Linux’s. The question is whether, when assessing who will win the largest piece of the enterprise pie, this will prove the crux of the matter.

Containers are not merely changing the enterprise cloud game (with third-place Google seemingly getting it very right) but also turning IT Ops’ DevOps dream into reality; in fact, many are predicting that they could eventually prove a bit of a threat to Chef and Puppet’s future…

So, maybe kids at Christmas have got it right….it is all about the wrapping and boxes! We’ll have to wait a little longer than Christmas Day to find out.

Written by Lucy Ashton, Head of Content & Production, Container World

Google upgrades Cloud SQL, promises managed MySQL offerings

Google has announced the beta availability of a new improved Cloud SQL for Google Cloud Platform – and an alpha version of its much anticipated Content Delivery Network offering.

In a blog post Brett Hesterberg, Product Manager for Google’s Cloud Platform, says the second generation of Cloud SQL will aim to give better performance and more ‘scalability per dollar’.

In Google’s internal testing, the second generation Cloud SQL proved seven times faster than the first generation and it now scales to 10TB of data, 15,000 IOPS and 104GB of RAM per instance, Hesterberg said.

The upshot is that transactional databases now have a flexibility that was unachievable with traditional relational databases. “With Cloud SQL we’ve changed that,” Hesterberg said. “Flexibility means easily scaling a database up and down.”

Databases can now scale up and down in size and in the number of queries they handle per day. The allocation of resources like CPU cores and RAM can be adapted more skilfully with Cloud SQL, using a variety of tools such as MySQL Workbench, Toad and the MySQL command line. Another promised improvement is that any client can be used for access, including Compute Engine, Managed VMs, Container Engine and workstations.
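Because Cloud SQL exposes a standard MySQL endpoint, any ordinary driver works; a minimal sketch with the PyMySQL client, where the host address and credentials are hypothetical:

    import pymysql

    # Connect exactly as to any other MySQL server (details hypothetical).
    conn = pymysql.connect(host="203.0.113.5", user="app",
                           password="secret", database="catalog")
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")
        print(cur.fetchone())
    conn.close()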

In the new cloud environment, databases need to be easy to stop and restart if they are only used occasionally for brief or infrequent tasks, according to Hesterberg. Cloud SQL now caters for these increasingly common cloud applications of database technology through the Cloud Console, the gcloud command line in Google’s Cloud SDK or a RESTful API. This makes administration a scriptable job and minimises costs by only running the databases when necessary.
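Scripted stop/start of that kind could look like the following sketch against the Cloud SQL Admin REST API (sqladmin v1beta4) via the Google API Python client; the project and instance names are hypothetical.

    from googleapiclient import discovery

    service = discovery.build("sqladmin", "v1beta4")

    def set_activation(project, instance, policy):
        """policy: 'ALWAYS' to run the instance, 'NEVER' to stop it."""
        body = {"settings": {"activationPolicy": policy}}
        return service.instances().patch(
            project=project, instance=instance, body=body
        ).execute()

    set_activation("my-project", "reporting-db", "NEVER")   # stop after the nightly job
    set_activation("my-project", "reporting-db", "ALWAYS")  # start before the next run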

Cloud SQL will make MySQL databases more manageable, claims Hesterberg, since Google will apply patches and updates to MySQL, manage backups, configure replication and provide automatic failover for high availability (HA) in the event of a zone outage. “It means you get Google’s operational expertise for your MySQL database,” says Hesterberg. Those signing up for Google Cloud Platform can now get a $300 credit to test drive Cloud SQL, Google announced.

Meanwhile, in another blog post, Google announced an alpha release of its own content delivery network, Google Cloud CDN. Performance may not yet be consistent and the service is not recommended for production use, Google warned.

Google Cloud CDN will speed up its cloud services using distributed edge caches to bring content closer to users, in a bid to compensate for its relatively sparse global data centre coverage compared with rivals AWS and Azure.

Google signs five deals to green-power its cloud services

Cloud service giant Google has announced five new deals to buy 781MW of renewable energy from suppliers in the US, Sweden and Chile, according to a report on Bloomberg.

The deals add up to the biggest purchase of renewable energy ever made by a company that is not a utility, according to Michael Terrell, Google’s principal of energy and global infrastructure.

Google will buy 200 megawatts of power from Renewable Energy Systems Americas’ Bluestem wind project in Oklahoma. From the same US state, another 200 megawatts will come from the Great Western wind project run by Electricite de France. In addition, Google will power its cloud services with 225 megawatts of wind power from independent power producer Invenergy.

Google’s data centres and cloud services in South America could become carbon free when the 80 megawatts of solar power that it has ordered from Acciona Energia’s El Romero farm in Chile comes online.

In Scandinavia the cloud service provider has agreed to buy 76 megawatts of wind power from Eolus Vind’s Jenasen wind project to be built in Vasternorrland County, Sweden.

In July, Google committed to tripling its purchases of renewable energy by 2025. At the time, it had contracts to buy 1.1 GW of sustainably sourced power.

Google’s first ever green power deal was in 2010, when it agreed to buy power from a wind farm in Iowa. Last week, it announced plans to purchase 61 megawatts from a solar farm in North Carolina.

Google appoints ex-VMware boss to lead enterprise web services business

Google has appointed former VMware CEO and current Google board member Diane Greene to head a new business-oriented cloud service.

Though Google is associated with consumer products and overshadowed by AWS in enterprise cloud computing, the leader’s position is not unassailable, claimed Google CEO Sundar Pichai in the company’s official blog as the appointment was announced.

“More than 60% of the Fortune 500 are actively using a paid Google for Work product and only a tiny fraction of the world’s data is currently in the cloud,” he said. “Most businesses and applications aren’t cloud-based yet. This is an important and fast-growing area for Google and we’re investing for the future.”

Since all of Google’s own businesses run on its cloud infrastructure, the company has significantly larger data centre capacity than any other public cloud provider, Pichai argued. “That’s what makes it possible for customers to receive the best price and performance for compute and storage services. All of this demonstrates great momentum, but it’s really just the beginning,” he said.

Pichai stated the new business will bring together product, engineering, marketing and sales, and Greene’s brief will be to integrate them into one cohesive offering. “Diane has a huge amount of operational experience that will continue to help the company,” he said.

In addition, Google is to acquire bebop, a company founded by Greene to simplify the building and maintenance of enterprise applications. “This will help many more businesses find great applications and reap the benefits of cloud computing,” said Pichai.

Bebop’s resources will be dedicated to building and integrating the entire range of Google’s cloud products from devices like Android and Chromebooks, through infrastructure and services in the Google Cloud Platform, to developer frameworks for mobile and enterprise users and finally end-user applications like Gmail and Docs.

The market for these cloud development tools will be worth $2.3 billion in 2019, up from $803 million this year, according to IDC. The knock-on effect is that more apps will run on the cloud of the service provider that supported their development, and that hosting business will triple to $22.6 billion by 2019, IDC says.

Greene and the bebop staff will join Google once the acquisition has completed. Greene’s division has yet to be named, but will include units such as Google for Work, Cloud Platform and Google Apps, according to Android Central.

SAP unveils new powers within Analytics in the Cloud

SAP has unveiled a new user-friendly analytics service for enterprises which it claims will give better insights by offering an ‘unparalleled user experience’.

The SAP Cloud for Analytics will be delivered through a planned software as a service (SaaS) offering that unifies all SAP’s analytical functions into one convenient dashboard.

Built natively on the SAP HANA Cloud platform, it will be a scalable, multi-tenant environment at a price which SAP says is affordable to companies and individuals. The new offering aims to bring together a variety of existing services including business intelligence, planning, budgeting and predictive capacity.

According to SAP, it has fine-tuned workflows so that it is easier for users to get from insight to action, as a single application spirits them through this journey more rapidly. It achieves this by giving universal access to all data, digesting it and forwarding the right components to the right parts of the organisation. An intuitive user interface (UI) will help all users, from specialists such as finance professionals to generalists such as line-of-business analysts, to build connected planning models, analyse data and collaborate. It can extend to unstructured data, helping users to spot market trends within social media and correlate them with company inventories, SAP claims.

It’s all about breaking down the divisions between silos and blending the data to make visualization and forecasting possible, said Steve Lucas, president, Platform Solutions, SAP. “SAP Cloud for Analytics will be a new cloud analytics experience. That to me is more than visualization of data, that’s realization of success,” said Lucas.

SAP said it is also working with partners to provide seamless workflows.

SAP and Google are collaborating to extend the levels of analysis available to customers, according to Prabhakar Raghavan, VP of Engineering at Google Apps. “These innovations are planned to allow Google Apps for Work users to embed, refresh and edit SAP Cloud for Analytics content directly in Google Docs and Google Sheets,” said Raghavan.

Cloud security start-up Cloudflare gets $110 million in venture funding

Google, Microsoft and chip maker Qualcomm are among the investors to collectively stake $110 million in networking and cyber security start-up CloudFlare, according to a report in Fortune.

Cloudflare offers services that speed up cloud systems and web sites while beefing up security. Its main market proposition is to speed up the functioning of any services used by enterprises at the edge of their networks. By doing so it provides a cheaper alternative to the traditional model of on-premise appliances.

Cloudflare claims enterprises can quickly set up cloud-based firewall, load balancing, WAN optimisation, distributed denial of service (DDoS) mitigation, content delivery and domain name services worldwide without needing any hardware. It claims that in one day it saved Chinese users more than 243 years of time that would have been collectively spent waiting for web content to load.

Last week, Cloudflare finalised a joint venture with Chinese internet giant Baidu that allows both US-based and China-based companies to use CloudFlare’s website performance service while adhering to Chinese data laws.

Although CloudFlare maintains no physical operations in China, it has worked with Baidu to set up technology within Baidu’s facilities that mimics CloudFlare’s services elsewhere, said CloudFlare CEO Matthew Prince.

The funding round was led by Fidelity Investments, with Google Capital, Microsoft, Baidu and Qualcomm Ventures, the investment arm of Qualcomm, all contributing funds. CloudFlare now has $182 million in total funding.

Prince said Cloudflare didn’t need the funding as much as it needed the credibility that comes with top-brand association. The confidence that comes with the backing of Google and Microsoft could convince nervous buyers that this is a solid investment as the company prepares itself for an initial public offering, it was reported. However, the IPO is unlikely to happen this year, said Prince, who hinted that it would come no earlier than 2017.