Category Archives: Google

Google, OpenStack target containers as Project Magnum gets first glimpse

Otto, Collier and Parikh demoing Magnum at the OpenStack Summit in Vancouver this week

Google and OpenStack are working together to use Linux containers as a vehicle for integrating their respective cloud services and bolstering OpenStack’s appeal to hybrid cloud users.

The move follows a similar announcement earlier this year, in which pure-play OpenStack vendor Mirantis and Google committed to integrating Kubernetes with the OpenStack platform.

OpenStack chief operating officer Mark Collier said the platform needs to embrace heterogeneous workloads as it moves forward, with both containers and bare-metal solidly on the agenda for future iterations.

To that end, the OpenStack community revealed Magnum, which in March became an official OpenStack project. Magnum builds on Heat to provision Nova instances on which to run application containers, and it adds native capabilities (such as support for different scheduling techniques) that enable users and service providers to offer containers-as-a-service.

“As we think about Magnum and how that can take container support to the next level, you’ll hear more about all the different types of technologies available under one common set of APIs. And that’s what users are looking for,” Collier said. “You have a lot of workloads requiring a lot of different technologies to run them at their best, and putting them all together in one platform is a very powerful thing.”

Google’s technical solutions architect Sandeep Parikh and Magnum project leader Adrian Otto (an architect at Rackspace) were on hand to demo a Kubernetes cluster deployment in both Google Compute Engine and the Rackspace public cloud using the exact same code and Keystone identity federation.

“We’ve had container support in OpenStack for some time now. Recently there’s been NovaDocker, which is for containers we treat as machines, and that’s fine if you just want a small place to put something,” Otto said.

Magnum uses the concept of a bay – where the orchestration layer goes – which Otto said can be used to manipulate pretty much any Linux container technology, whether it’s Docker, Kubernetes or Mesos.

“This gives us the ability to offer a hybrid approach. Not everything is great for private cloud, and not everything is great for public [cloud],” Parikh said. “If I want to run a highly available deployment, I can now run my workload in multiple places and if something were to go down the workload will still stay live.”
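
Parikh’s point about running one workload across public and private clouds is easier to picture with a concrete sketch. The example below is illustrative only and uses the Kubernetes Python client rather than Magnum itself (the Vancouver demo relied on Magnum and Keystone identity federation); the kubeconfig context names and container image are assumptions.

```python
# Illustrative sketch: push the same container workload to two Kubernetes
# clusters (say, one on Google Compute Engine and one on Rackspace) from a
# single script. Context names and the image are hypothetical; the actual
# demo used Magnum and Keystone federation, not this client library.
from kubernetes import client, config

DEPLOYMENT = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="demo-app"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "demo-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "demo-app"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.9")]
            ),
        ),
    ),
)

# One kubeconfig context per cloud -- hypothetical names.
for context in ("gce-cluster", "rackspace-cluster"):
    api = client.AppsV1Api(config.new_client_from_config(context=context))
    api.create_namespaced_deployment(namespace="default", body=DEPLOYMENT)
    print("Deployed demo-app to " + context)
```

If one cluster goes down, the copy of the workload in the other cloud keeps serving, which is the highly available pattern Parikh describes.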

USA Freedom Act passes ending bulk data collection

The USA Freedom Act will end the kind of bulk data gathering made familiar by the PRISM programme and other NSA initiatives

The USA Freedom Act, a bipartisan bill aimed at reforming the US Patriot Act that would, among other things, end the kind of bulk data collection Edward Snowden revealed two years ago, passed the House of Representatives by a wide margin this week. The move may be welcome news to telcos and cloud service providers alike, many of which lobbied hard for US surveillance reform.

The bill, which passed in a 338 for – 88 against vote, ends the bulk collection of communications metadata under various legal authorities, covering not only telephony metadata collected under Section 215 but also internet metadata that has been or could be collected under other legal authorities.

It will also allow companies to be more transparent about the demands being placed on them by legal authorities, and will create new oversight and accountability mechanisms that will shed more light on the decisions reached by the Foreign Intelligence Surveillance Court (FISC), which has so far operated in a deeply secretive manner and with little interference.

“This bill is an extremely well-drafted compromise—the product of nearly two years of work.  It effectively protects Americans’ civil liberties and our national security.  I am very proud of the USA Freedom Act and am confident it is the most responsible path forward,” said Jim Sensenbrenner, Republican Representative for Wisconsin’s fifth district.

“If the Patriot Act authorities expire, and the FISC approves bulk collection under a different authority, how would the public know?  Without the USA Freedom Act, they won’t.  Allowing the PATRIOT Act authorities to expire sounds like a civil libertarian victory, but it will actually mean less privacy and more risk.”

“Let’s not kill these important reforms because we wish the bill did more.  There is no perfect.  Every bill we vote on could do more,” he added.

Others, including Ted Lieu (D-CA), voted against the proposed reforms because the bill didn’t go far enough.

“While I appreciate a number of the reforms in the bill and understand the need for secure counter-espionage and terrorism investigations, I believe our nation is better served by allowing Section 215 to expire completely and replacing it with a measure that finds a better balance between national security interests and protecting the civil liberties of Americans,” Lieu said.

“Beyond Section 215, I am troubled that the USA Freedom Act would leave in place Sections 505 and 702, provisions that also allow sweeping data collection and backdoor searches circumventing encryption that can result in the collection of information of US citizens not identified in warrants.  The loopholes left in place will continue to undermine the trust of the American people.”

“A federal district court struck down the NSA’s spying on Americans and called the NSA PRISM program ‘Orwellian.’ A federal appellate court ruled last week that the NSA’s bulk collection program was illegal. Despite these two court decisions, the NSA continues to operate its unconstitutional and illegal programs.”

Many cloud service providers and telecoms companies have, in the two years since Snowden’s NSA revelations, voiced concerns that failure to reform US surveillance practices could alienate customers both foreign and domestic. Microsoft and Google have been particularly vocal about this in recent months.

Google’s vice president of public policy and government affairs in the Americas, Susan Molinari, trumpeted her support for the bill. She said it takes a big step forward in surveillance reform “while preserving important national security authorities.”

“It ends bulk collection of communications metadata under various legal authorities, allows companies like Google to disclose national security demands with greater granularity, and creates new accountability and oversight mechanisms.”

“The bill’s authors have worked hard to forge a bipartisan consensus, and the approved bill is supported by the Obama Administration, including the intelligence community. The bill now moves to the other side of the Capitol, and we hope that the Senate will use the June 1 expiration of Section 215 and other legal authorities to modernize and reform our surveillance programs, while recognizing the importance of protecting Americans from harm,” she added.

US-based telco Verizon declined to comment on the passage of the bill.

Google Bigtable

Google’s new online data storage service has the potential to let large companies run big data analysis as a cloud service. Google Cloud Bigtable is based on technology that has been used within Google for many years and now powers many of the company’s core services, including Search, Gmail and Google Analytics.

The service could be used, for example, to store sensor data from an Internet of Things monitoring system. Finance, telecommunications, digital advertising, energy and biomedical companies are among the data-intensive businesses that could benefit from it.

Bigtable is a hosted NoSQL data store. Users read and write data through the API for Apache HBase, an open source implementation of the Bigtable architecture for storing data across multiple servers, which means customers can use the service with existing Hadoop software. Hadoop is an open source platform for processing large data sets. Bigtable can also work with other Google cloud services.
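
Because the service speaks the HBase API, code written against an HBase client should carry over. As a rough illustration, here is what a write and a read might look like with the happybase Python library; the endpoint, table and column family names are hypothetical, and wiring happybase up to Cloud Bigtable in practice requires extra configuration not shown here.

```python
# Minimal sketch of reading and writing through an HBase-style API with the
# happybase library. Host, table and column family names are placeholders,
# and the table (with a 'cf' column family) is assumed to already exist.
import happybase

connection = happybase.Connection("hbase-endpoint.example.com")
table = connection.table("sensor-readings")

# Write one row: a row key plus column-family:qualifier -> value pairs.
table.put(b"device42#2015-05-07T12:00", {b"cf:temperature": b"21.5"})

# Read it back by row key.
row = table.row(b"device42#2015-05-07T12:00")
print(row[b"cf:temperature"])

connection.close()
```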

 

Google claims that Bigtable is faster than other NoSQL stores. The service is fully managed: Google handles data replication for backup and encrypts data for security. Another interesting feature is that as customers add more data, Google automatically provides the additional storage capacity.

Pricing is based on several factors, including network usage, the number of nodes deployed and the amount of storage used. Taking all of these into account, Google says Bigtable’s total cost of ownership is less than half that of its direct competitors.

The post Google Bigtable appeared first on Cloud News Daily.

Google adds Crate to SQL services on GCE

Google has been on a big data push

Google has added open source distributed SQL data store Crate to the Google Compute Engine arsenal, the latest in a series of moves aimed at bolstering the company’s data services.

Crate is a distributed open source data store built on a high availability “shared-nothing” architecture that automatically shards and distributes data across all of its nodes (and maintains several replicas for fault tolerance).

It uses SQL syntax but packs some NoSQL goodies as well (Elasticsearch, Presto and Lucene are among the components it builds on).

“This means when a new node is added, the cluster automatically rebalances and can self-heal when a node is removed. All data is indexed, optimized, and compressed on ingest and is accessible using familiar SQL syntax through a RESTful API,” explained Tyler Randles, evangelist at Crate.

“Crate was built so developers won’t need to “glue” several technologies together to store documents or BLOBs, or support real-time search. It also helps dev-ops by eliminating the need for manual tuning, sharding, replication, and other operations required to keep a large data store in good health.”
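
Since Crate exposes standard SQL over its RESTful endpoint, getting data in and out from an application is straightforward. The sketch below uses the project’s Python DB API client (the “crate” package); the node address, table and columns are assumptions for illustration.

```python
# Sketch of talking to a Crate cluster through its HTTP endpoint using the
# crate Python DB API client. Node address, table and columns are assumptions.
from crate import client

connection = client.connect("http://crate-node.example.com:4200")
cursor = connection.cursor()

cursor.execute("""
    CREATE TABLE IF NOT EXISTS tweets (
        id STRING PRIMARY KEY,
        body STRING,
        created TIMESTAMP
    )
""")
cursor.execute(
    "INSERT INTO tweets (id, body, created) VALUES (?, ?, ?)",
    ("1", "hello crate", "2015-05-21T00:00:00"),
)

# Crate is eventually consistent; refresh before reading our own write.
cursor.execute("REFRESH TABLE tweets")
cursor.execute("SELECT body FROM tweets WHERE id = ?", ("1",))
print(cursor.fetchone())

connection.close()
```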

The move is yet another attempt by Google to bolster its data services. Earlier this week the company revealed Bigtable, a fully managed NoSQL database service the company said combines its own internal database technology with open source Apache HBase APIs.

Last month the company announced the beta launch of Google Cloud Dataflow, a Java-based service that lets users build, deploy and run data processing pipelines for other applications like ETL, analytics, real-time computation, and process orchestration, while abstracting away all the other infrastructure bits like cluster management.

Google reveals Bigtable, a NoSQL service based on what it uses internally

Google has punted another big data service, a variant of what it uses internally, into the wild

Search giant Google announced Bigtable, a fully managed NoSQL database service the company said combines its own internal database technology with open source Apache HBase APIs.

The company that helped give birth to MapReduce (which in turn inspired Hadoop) is now making available the same non-relational database tech driving a number of its services including Google Search, Gmail, and Google Analytics.

Google said Bigtable is powered by BigQuery underneath, and is extensible through the HBase API (which provides real-time read / write access capabilities).

“Google Cloud Bigtable excels at large ingestion, analytics, and data-heavy serving workloads. It’s ideal for enterprises and data-driven organizations that need to handle huge volumes of data, including businesses in the financial services, AdTech, energy, biomedical, and telecommunications industries,” explained Cory O’Connor, product manager at Google.

O’Connor said the service, which is now in beta, can deliver over two times the performance of its direct competition (which will likely depend on the use case), and has a TCO of less than half that of its direct competitors.

“As businesses become increasingly data-centric, and with the coming age of the Internet of Things, enterprises and data-driven organizations must become adept at efficiently deriving insights from their data. In this environment, any time spent building and managing infrastructure rather than working on applications is a lost opportunity.”
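
For a flavour of what working with the service looks like, here is a minimal sketch using the google-cloud-bigtable Python client, which postdates this announcement (the HBase API route is the one the article describes); the project, instance, table and column family names are placeholders.

```python
# Hedged sketch of a single write and read against Cloud Bigtable using the
# google-cloud-bigtable Python client. Project, instance, table and column
# family names are placeholders, and the table with a 'cf' family is assumed
# to already exist.
from google.cloud import bigtable

client = bigtable.Client(project="my-project")
instance = client.instance("my-instance")
table = instance.table("events")

# Write a cell: row key, column family, qualifier, value.
row = table.direct_row(b"user42#2015-05-06")
row.set_cell("cf", b"event", b"click")
row.commit()

# Read the row back and pull out the latest cell value.
result = table.read_row(b"user42#2015-05-06")
print(result.cells["cf"][b"event"][0].value)
```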

Bigtable is Google’s latest move to bolster its data services, a central pillar of its strategy to attract new customers to its growing platform. Last month the company announced the beta launch of Google Cloud Dataflow, a Java-based service that lets users build, deploy and run data processing pipelines for other applications like ETL, analytics, real-time computation, and process orchestration, while abstracting away all the other infrastructure bits like cluster management.

PwC, Rosslyn partner on cloudy big data

PwC is teaming up with Rosslyn to help bring analytics-based insights to clients

PricewaterhouseCoopers (PwC) announced a partnership with Rosslyn Analytics that will see the two firms jointly develop and offer cloud-based big data services to clients.

The two companies said they plan to use Rosslyn’s suite of cloud-enabled data technologies when advising clients on supply chain risk reduction, productivity optimisation and cost reduction, with PwC bringing its deep knowledge of different verticals to the table.

“For our clients, acquiring the knowledge most important to their operations, securing that information and using it optimally are critical – now more than ever before. We are delighted to be teaming up with Rosslyn to offer our joint knowledge and capabilities to clients – giving them one place to go, maximizing experience and assets from both organizations,” said Yann Bonduelle, PwC partner and head of data analytics.

“In our most recent survey of business leaders, 75 per cent of UK CEOs say that data and data analytics are proving valuable to them, whilst 79 per cent see data mining and analysis as one of the top digital technologies. This highlights how important it is to our clients to embrace the technology available to give them greater competitive advantage,” Bonduelle added.

Charles Clark, chief executive of Rosslyn Analytics, said: “Our collaboration is about helping clients to embrace their journey in analytics, and transform their organisations to thrive and maintain relevance in a rapidly changing world. An increasing number of companies, large and small, look to our data technologies to help them reduce costs and risks, and improve their revenue and productivity across their businesses.”

Like KPMG and others in the big four, PwC has struck several deals with cloud and data services providers in a bid to add more value to its client offerings. The company most recently struck a deal with Google that has seen it work closely with its clients to tailor Google Apps for Work to their specific business processes and needs, and help them optimise their operations.

Synergy Research: AWS still larger than four biggest rivals combined

AWS is larger than its four top rivals combined

Amazon pulled the curtain back from its AWS business last week, announcing that its cloud services now rake in over $5bn annually. John Dinsdale, chief analyst and research director at Synergy Research Group, said that now puts the e-commerce giant ahead of most of its largest competitors.

Amazon recently reported its cloud business took in revenues of $1.57bn in the first quarter of 2015, and enjoyed close to 50 per cent growth year on year. This is the first time the e-commerce giant has publicly disclosed AWS revenues.

Following on from that, some vendors which shall remain nameless (AWS competitors) worked behind the scenes to remind the press of how much more profitable their cloud businesses are by comparison. But Synergy Research data suggests AWS is far larger than most of its competitors combined, at least in the infrastructure services market specifically.

Microsoft enjoys the highest revenue growth rate and IBM leads the private & hybrid services segment, but according to Synergy, AWS continues to grow faster than the market as a whole, with its market share approaching 30 per cent in the most recently reported quarter.

Google is quietly gaining share though it remains just half the size of Microsoft in this market, the firm said.

“Across the full and varied spectrum of cloud activities there are now six companies that can lay a valid claim to having annual cloud revenue run rates in excess of $5 billion – AWS, IBM, Microsoft, HP, Cisco and salesforce – and all are able to claim leadership in different parts of the cloud market,” Dinsdale said.

“However, on a strict like-for-like basis AWS remains streets ahead of the competition in cloud infrastructure services. Furthermore, this part of the cloud market is growing much more rapidly than SaaS or cloud infrastructure hardware and software.”

Like-for-like comparisons are scarce in cloud revenue reporting, not least because it’s such a nascent sector. Considering the cloud market leader has only just started publicly disclosing revenues for that business, it may be some time before vendors and service providers come up with standard definitions for what can be reported as ‘cloud’ (for instance, IBM recently reported its annual cloud revenues now exceed $7.7bn).

Synergy estimates that quarterly cloud infrastructure service revenues (which include IaaS, PaaS and private & hybrid cloud) now exceed $5bn in total.

Amazon at the top of the Cloud Market

On Thursday, Amazon released its financial performance numbers, and they showed the company sitting at the top of the cloud market relative to its competitors. Though Amazon is best known as an online marketplace, much of its recent stock market gains and revenue growth have come from renting processing power to start-ups and enterprises.

Amazon led the way in popularizing cloud computing, and for a while it was the only company to offer such services, which gave it an advantage when others began to offer cloud computing of their own. Competitors saw the field as an opportunity to tap into hundreds of billions of dollars, and Microsoft has been especially committed to advancing in it.

Though Amazon is the leader by a long shot, its resources are much smaller than those of its competitors, which have billions of dollars stashed away. Cloud computing demands heavy investment in data centers around the world, as well as in research and development, if the field is to continue to grow and advance.

In its first quarter report, Amazon Web Services posted revenue of $1.57 billion and operating income of $265 million. These figures are striking coming from a company that often reports losses, and they drove Amazon shares up by more than 6% in after-hours trading to an all-time high.

Microsoft, which ranks second in cloud computing, reported that annual revenue from its commercial cloud business would be $6.3 billion based on recent performance; Amazon projected a similar figure of $5.16 billion. However, Microsoft’s number includes revenue from various online applications. Azure, the Microsoft equivalent of Amazon’s cloud services, was estimated to be about one-tenth the size of AWS.

AWS got its start about a decade ago as a way to provide computing power to different divisions of Amazon. It had such a positive impact that it was then offered to start-ups struggling to scale. After that, Amazon focused on expanding market share, as it usually does, and it worked.

AWS was expected to rival the other businesses within Amazon. The cloud business has been growing by roughly 40% per year, twice the rate of the company overall.

Recently, though, Google’s cloud service has been competing with AWS on pricing, which has hurt profitability. Amazon has cut prices many times at the expense of revenue growth; its response has been to provide other services such as database software and analytics, and to increase the number of resellers.

The big battle is going to be winning over the large companies that have the largest cloud computing needs. Many of them sat out the first years of the cloud; they were not ones to adopt the latest technology and had compliance and contracting processes to follow. Now, cloud computing is commonplace at these companies.

Microsoft’s cloud business has doubled in the last year, which is impressive considering how the company has been suffering from low PC sales. Analysts believe Microsoft has the edge in winning larger companies as clients, because it may be able to convince them to use its cloud services in addition to the Microsoft products they already use. For start-ups, though, cloud computing and AWS are synonymous.

The cloud computing market is going to continue to grow, and no single company can cover all aspects of it. It will be exciting to see where things go from here.

The post Amazon at the top of the Cloud Market appeared first on Cloud News Daily.

Google boosts cloud-based big data services

Google is bolstering its big data services

Google announced a series of big data service updates to its cloud platform this week in a bid to strengthen its growing portfolio of data services.

The company announced the beta launch of Google Cloud Dataflow, a Java-based service that lets users build, deploy and run data processing pipelines for other applications like ETL, analytics, real-time computation, and process orchestration, while abstracting away all the other infrastructure bits like cluster management.

The service is integrated with Google’s monitoring tools and the company said it’s built from the ground up for fault-tolerance.

“We’ve been tackling challenging big data problems for more than a decade and are well aware of the difference that simple yet powerful data processing tools make. We have translated our experience from MapReduce, FlumeJava, and MillWheel into a single product, Google Cloud Dataflow,” the company explained in a recent blog post.

“It’s designed to reduce operational overhead and make programming and data analysis your only job, whether you’re a data scientist, data analyst or data-centric software developer. Along with other Google Cloud Platform big data services, Cloud Dataflow embodies the kind of highly productive and fully managed services designed to use big data, the cloud way.”
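
The Cloud Dataflow SDK Google released alongside the service is Java-based; its programming model of composable, parallel transforms lives on in Apache Beam, which grew out of that SDK. As a hedged illustration of the model, the Python sketch below counts request paths in a log file and can run locally on Beam’s DirectRunner; the file names and log format are assumptions.

```python
# Illustrative Dataflow-style pipeline written with Apache Beam, the open
# source successor to the original (Java-only) Cloud Dataflow SDK. Input and
# output paths are placeholders, and the split()[6] index assumes a common
# web-server log format.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("access.log")               # ingest
        | "ExtractPath" >> beam.Map(lambda line: line.split()[6])    # simple ETL step
        | "CountPerPath" >> beam.combiners.Count.PerElement()        # aggregation
        | "Format" >> beam.Map(lambda kv: "%s\t%d" % kv)
        | "Write" >> beam.io.WriteToText("path_counts")
    )
```

The same pipeline code can be handed to the managed Dataflow runner instead of the local one, which is the “fully managed” part of Google’s pitch.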

The company also updated BigQuery, Google’s SQL-based cloud analytics service, adding row-level permissions for data protection, making it more performant by raising the streaming ingestion limit to 100,000 rows per second, and announcing the service’s availability in Europe.
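
The 100,000 rows-per-second figure refers to streaming ingestion. As a rough sketch of what that path looks like from code, the example below streams a couple of rows with the google-cloud-bigquery Python client, which postdates this announcement; the project, dataset, table and fields are hypothetical.

```python
# Rough sketch of streaming rows into a BigQuery table with the
# google-cloud-bigquery client (a newer library than existed at the time of
# this announcement). Project, dataset, table and fields are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
table_id = "my-project.analytics.page_views"

rows = [
    {"user_id": "u1", "path": "/home", "ts": "2015-04-17T10:00:00Z"},
    {"user_id": "u2", "path": "/pricing", "ts": "2015-04-17T10:00:01Z"},
]

# insert_rows_json uses BigQuery's streaming insert API under the hood.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Streaming insert reported errors:", errors)
else:
    print("Streamed %d rows" % len(rows))
```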

Google has largely focused its attention on other areas of the stack as of late. The company has been driving its container scheduling and deployment initiative Kubernetes quite hard, as well as its hybrid cloud initiatives (with Mirantis and VMware). It also recently introduced a log analysis service for Google Cloud and App Engine users.

YouTube brings Vitess MySQL scaling magic to Kubernetes

YouTube is working to integrate a beefed up version of MySQL with Kubernetes

YouTube is working to integrate Vitess, which improves the ability of MySQL databases to scale in containerised environments, with Kubernetes, an open source container deployment and management tool.

Vitess, which is available as an open source project and pitched as a high-concurrency alternative to NoSQL and vanilla MySQL databases, uses a BSON-based protocol that creates very lightweight connections (around 32KB each); its pooling feature uses Go’s concurrency support to map these lightweight connections onto a small pool of MySQL connections, allowing Vitess to handle thousands of connections at once.
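
Vitess implements that connection multiplexing in Go, but the underlying idea of funnelling many cheap client connections through a small, fixed set of real MySQL connections is easy to sketch. The example below is not Vitess code; it simply illustrates the pooling pattern with mysql-connector-python, and the credentials, database and table are placeholders.

```python
# Generic illustration of connection pooling, the idea behind Vitess's
# lightweight-connection handling (Vitess itself does this in Go with far
# more sophistication). Credentials, database and table are placeholders.
from mysql.connector import pooling

pool = pooling.MySQLConnectionPool(
    pool_name="app_pool",
    pool_size=10,          # a small, fixed number of real MySQL connections
    host="127.0.0.1",
    user="app",
    password="secret",
    database="videos",
)

def fetch_video_title(video_id):
    # Each request borrows a pooled connection instead of opening its own.
    conn = pool.get_connection()
    try:
        cursor = conn.cursor()
        cursor.execute("SELECT title FROM video_meta WHERE id = %s", (video_id,))
        return cursor.fetchone()
    finally:
        conn.close()  # returns the connection to the pool, doesn't tear it down
```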

It also handles horizontal and vertical sharding, and can dynamically rewrite queries that could impede database performance.

Anthony Yeh, a software engineer at YouTube, said the company is currently using Vitess to handle metadata for its video service, which serves billions of video views a day and takes in 300 hours of new video uploads per minute.

“Your new website is growing exponentially. After a few rounds of high fives, you start scaling to meet this unexpected demand. While you can always add more front-end servers, eventually your database becomes a bottleneck.”

“Vitess is available as an open source project and runs best in a containerized environment. With Kubernetes and Google Container Engine as your container cluster manager, it’s now a lot easier to get started. We’ve created a single deployment configuration for Vitess that works on any platform that Kubernetes supports,” he explained in a blog post on the Google Cloud Platform website. “In this environment, Vitess provides a MySQL storage layer with improved durability, scalability, and manageability.”

Yeh said the company is just getting started with the Kubernetes integration, but that once it is complete, users will be able to deploy Vitess in containers with Kubernetes on any cloud platform the tool supports.