Category archive: AWS

Google adds to its Cloud Platform as vendors compete with AWS Lambda

Google has added to Cloud Platform, its public cloud infrastructure for developers, with a new service that allows app writers to set up functions that can be triggered in response to events. The new Google Cloud Functions has drawn comparisons with the Lambda offering from Amazon Web Services (AWS).

The service was not announced to the public, but news filtered out after documentation began to appear on Google’s website, offering advice to developers. According to the briefing notes, Google Cloud Functions is a ‘lightweight, event-based, asynchronous’ computing system that can be used to create small, single-purpose functions in response to cloud events without the need to manage a server or a runtime environment. Access to the service is available to anyone who fills out a form on the website.

Google’s answer to AWS Lambda is the latest attempt to catch up with AWS by filling in the omissions in its own service. In September 2015 BCN reported how Google’s Cloud Platform is being sped up by the addition of four new content delivery networks, with CloudFlare, Fastly, Highwinds Network and Level 3 Communications adding to Google’s network of 70 points of presence in 33 countries as part of a new Google CDN Interconnect programme.

Google has also bolstered its cloud offering with new networking, containerisation and price cuts, BCN reported in November 2015. Google has also recruited VMware cofounder Diane Greene to lead all of its cloud businesses, as reported last year.

Google Cloud Functions run as Node.js modules and can be written in JavaScript. A function could be set up to react to, say, events in a user’s Google Cloud Storage, such as the upload of an unwanted type of picture file or title. The service also works with webhooks, which can speed up development and simplify code maintenance.

The prices for Cloud Functions were not listed, as the service is still in alpha.

Meanwhile a new start-up, Iron.io, has raised $11.5 million in venture capital to develop its own answer to Lambda and Cloud Functions. Microsoft is also rumoured to be developing its own version of Cloud Functions for Azure, according to a report in Forbes.

AWS targets games developers with Lumberyard and GameLift services

Amazon Web Services (AWS) has launched two new services, Lumberyard and GameLift, to help developers create games and build communities of fans. By simplifying the infrastructure work, the services could help keep games developers loyal to its cloud services.

Amazon Lumberyard is a free, 3D game engine which developers can use to create games on any IT platform while using the computing and storage resources of the AWS Cloud. According to Amazon, Lumberyard’s visual scripting tool can open up the games development market because it allows non-technical game developers to add cloud-connected features to a game. It claims that features such as community news feeds, daily gifts or server-side combat resolution can be added in minutes through a drag-and-drop graphical user interface.

The other new AWS service, GameLift, aims to simplify the launch and operational management of session-based multiplayer games. Used in combination, the two services will make it easier for games developers to ramp capacity up and down as demand for high-performance game servers fluctuates, without the expense of additional engineering effort or upfront costs, says AWS.

Amazon Lumberyard is free and available today in beta for developers building PC and console games. A version for mobile and virtual reality (VR) platforms is ‘coming soon’, it says. GameLift is charged on a per-player basis, with fees currently $1.50 per 1,000 daily active users, on top of standard AWS service fees.
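As a rough sketch of what that per-player fee implies: the function below reads the quoted $1.50 per 1,000 daily active users as a daily charge accumulated over a month, which is an assumption on our part, and it excludes the standard AWS service fees that GameLift sits on top of.

```python
def gamelift_player_fee(daily_active_users: int, days: int = 30) -> float:
    """Estimate GameLift's per-player fee: $1.50 per 1,000 daily active
    users, assumed here to be charged daily. Standard AWS service fees
    (EC2, bandwidth, etc.) are not included."""
    RATE_PER_1000_DAU = 1.50
    return daily_active_users / 1000 * RATE_PER_1000_DAU * days

# A game holding 50,000 daily active users for a 30-day month:
print(f"${gamelift_player_fee(50_000):,.2f}")  # $2,250.00
```

On this reading, the per-player fee stays modest next to the underlying compute bill, which is where the real cost of fluctuating demand lies.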

Developers typically need to integrate around 20 technology components to build the highest-quality games, according to Mike Frazzini, Amazon Games’ VP. The expense of resources such as real-time graphics rendering, animation systems and physics simulation makes this a prohibitive and risky market to be in.

“Game developers asked for a game engine with the power of commercial engines but significantly less expensive and deeply integrated with AWS for the back end,” said Frazzini. AWS now provides that with Lumberyard and GameLift, he said.

Developing and maintaining a back-end infrastructure for multiplayer games requires time, money and expertise that are beyond the reach of most developers, according to Chris Jones, Obsidian Entertainment’s CTO. “GameLift removes much of that burden from the developer, allowing them to focus their energy on bringing their game ideas to life,” said Jones.

AWS – we view open source as a companion

In one of the last instalments of our series marking the upcoming Container World (February 16 – 18, Santa Clara Convention Center, CA, USA), BCN talks to Deepak Singh, General Manager of Amazon EC2 Container Service, AWS.

Business Cloud News: First of all – how much of the container hype is justified would you say?

Deepak Singh: Over the last 2-3 years, starting with the launch of Docker in March 2013, we have seen a number of AWS customers adopt containers for their applications. While many customers are still early in their journey, we have seen AWS customers such as Linden Labs, Remind, Yelp, Segment, and Gilt Group all adopt Docker for production applications. In particular, we are seeing enterprise customers actively investigating Docker as they start re-architecting their applications to be less monolithic.

How is the evolution of containers influencing the cloud ecosystem?

Containers are helping people move faster towards architectures that are ideal for the AWS cloud. For example, one of the common patterns we have seen with customers using Docker is to adopt a microservices architecture. This is especially true for our enterprise customers who see Docker as a way to bring more applications onto AWS.

What opportunities does this open up to AWS?

For us, it all comes down to customer choice. When our customers ask us for a capability, we listen. They come to us because they want something the Amazon way: easy to use, easy to scale, lower cost, and where they don’t have to worry about the infrastructure running behind it.

As mentioned, many of our customers are adopting containers and they expect AWS to support them. Over the past few years we have launched a number of services and features to make it easier for customers to run Docker-based applications. These include Docker support in AWS Elastic Beanstalk and the Amazon EC2 Container Service (ECS). We also have a variety of certified partners that support Docker and AWS and integrate with various AWS services, including ECS.

What does the phenomenon of open source mean to AWS? Is it a threat or a friend?

We view open source as a companion to AWS’s business model. We use open source and have built most AWS services on top of open source technology. AWS supports a number of open source applications, either directly or through partners. Examples of open source solutions available as AWS services include Amazon RDS (which supports MySQL, Postgres, and MariaDB), Amazon Elastic MapReduce (EMR), and Amazon EC2 Container Service (ECS). We are also an active member of the open source community. The Amazon ECS agent is available under an Apache 2.0 license, and we accept pull requests and allow our customers to fork our agent as well. AWS contributes code to Docker (e.g. CloudWatch logs driver), and was a founder member of the Open Container Initiative, which is a community effort to develop specifications for container runtimes.

As we see customers asking for services based on various open source technologies, we’ll keep adding those services.

You’ll be appearing at Container World this February. What do you think the biggest discussions will be about?

We expect customers will be interested in learning how they can run container-based applications in production, hearing about the most popular use cases, and catching up on the latest innovations in this space.

AWS posts most profitable quarter ever

As investors reportedly grow nervous about its parent company, Amazon Web Services has reported its most profitable quarter ever.

Though sales of Amazon Web Services (AWS) grew 69% and profits tripled, the stock of parent company Amazon.com fell by 10% in pre-market trading after its latest earnings report, having jumped by 8% during Thursday trading.

The fall in market valuation of Amazon.com, described on Wall Street as a readjustment of ‘outsize expectations’ following Amazon’s previous declarations about cost management and investment in infrastructure, is unlikely to affect the cloud services business however.

Meanwhile, the AWS unit saw its quarterly operating profit triple to $687 million, compared with $240 million in the corresponding quarter last year, as sales revenue for its latest quarter rose 69% to $2.4 billion. That growth rate nevertheless represents a decline from the 78% recorded in the previous three months. Operating expenditure for AWS for the quarter came in at $1.78 billion, up from $1.18 billion a year earlier.
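Those headline figures can be sanity-checked with a little arithmetic; the derived values below follow from the reported numbers, not from any additional disclosure.

```python
revenue = 2.40e9            # latest quarter's sales revenue
op_income = 687e6           # latest quarter's operating income
op_income_prior = 240e6     # operating income a year earlier

# 69% growth implies prior-year quarterly revenue of roughly $1.42bn
prior_revenue = revenue / 1.69
# Operating margin for the quarter
margin = op_income / revenue
# "Tripled" operating profit, checked against the year-ago figure
growth_multiple = op_income / op_income_prior

print(f"prior revenue ~ ${prior_revenue / 1e9:.2f}bn")  # ~ $1.42bn
print(f"operating margin ~ {margin:.0%}")               # ~ 29%
print(f"profit multiple ~ {growth_multiple:.1f}x")      # ~ 2.9x
```

The 2.9x multiple bears out the “tripled” claim, and the roughly 29% operating margin shows why AWS matters to Amazon.com despite its modest share of group revenue.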

Though the rate of expansion may be slowing, AWS is still the fastest-growing division within Amazon and holds an unassailable lead in the cloud infrastructure market, according to Richard Brown, Senior VP for EMEA at Interactive Intelligence.

“Amazon’s latest financial results show that demand for cloud computing is booming and provides insight into the changing behaviours of organisations as they move to the cloud. To keep their foothold in this growing market, cloud vendors like Amazon, Google and Microsoft are prioritising work on their cloud platforms.”

Direct comparisons with AWS rivals such as Microsoft Azure, Google Cloud Platform and IBM SoftLayer are difficult as their parent companies’ financials are structured differently.

On January 18th BCN reported how increasing price competition among the top three cloud service providers may affect profitability in the cloud market in coming months.

This year, BCN reports, AWS plans to add 5 AWS regions and 11 Availability Zones to its current estate of 32 Availability Zones across 12 geographic regions worldwide, with new sites in London, China and India.

AWS, Azure and Google intensify cloud price war

As price competition intensifies among the top three cloud service providers, one analyst has warned that cloud buyers should not get drawn into a race to the bottom.

Following price cuts by AWS and Google, last week Microsoft lowered the price bar further with cuts to its Azure service. Though smaller players will struggle to compete on costs, the cloud service is a long way from an oligopoly, according to Quocirca analyst Clive Longbottom.

Amazon Web Services began the bidding in early January as chief technology evangelist Jeff Barr announced the company’s 51st cloud price cut on his official AWS blog.

On January 8th Google’s Julia Ferraioli argued in a blog post that Google is now the more cost-effective offering as a result of its discounting scheme. “Google is anywhere from 15 to 41% less expensive than AWS for compute resources,” said Ferraioli. The key to the latest Google lead in cost effectiveness is automatic sustained usage discounts and custom machine types that AWS can’t match, claimed Ferraioli.
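For readers unfamiliar with how the sustained-usage discount works, here is a small sketch. The tier rates (100%, 80%, 60% and 40% of the base price for successive quarters of the month) come from Google’s published scheme at the time rather than from Ferraioli’s post, so treat them as illustrative background.

```python
def sustained_use_multiplier(fraction_of_month: float) -> float:
    """Effective price multiplier under Google's sustained-use discount.
    Each successive quarter of the month an instance runs is billed at a
    lower rate: 100%, 80%, 60%, then 40% of the base price."""
    tiers = [1.0, 0.8, 0.6, 0.4]
    billed = 0.0
    remaining = fraction_of_month
    for rate in tiers:
        used = min(remaining, 0.25)
        billed += used * rate
        remaining -= used
    return billed / fraction_of_month  # average rate actually paid

# An instance running the entire month pays 70% of the list price:
print(f"{1 - sustained_use_multiplier(1.0):.0%} off for a full month")  # 30% off
```

The discount is applied automatically, which is the contrast Ferraioli drew with AWS, where comparable savings required committing to reserved instances up front.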

Last week Microsoft’s Cloud Platform product marketing director Nicole Herskowitz announced the latest round of price competition in a company blog post, unveiling a 17% cut to the prices of its Dv2 virtual machines.

Herskowitz claimed that Microsoft offers better price performance because, unlike AWS EC2, Azure’s Dv2 instances include load balancing and auto-scaling built in at no extra charge.

Microsoft is also aiming to change the perception of AWS’s superiority as an infrastructure service provider. “Azure customers are using the rich set of services spanning IaaS and PaaS,” wrote Herskowitz. “Today, more than half of Azure IaaS customers are benefiting by adopting higher-level PaaS services.”

Price is not everything in this market, warned Quocirca analyst Longbottom; an equally important side of any cloud deal is overall value. “Even though AWS, Microsoft and Google all offer high availability, and there is little doubting their professionalism in putting the stack together, it doesn’t mean that these are the right platforms for all workloads. They have all had downtime that shouldn’t have happened,” said Longbottom.

The level of risk the provider is willing to protect the customer from and the business and technical help they provide are still deal breakers, Longbottom said. “If you need more support, then it may well be that something like IBM SoftLayer is a better bet. If you want pre-prepared software as a service, then you need to look elsewhere. So it’s still horses for courses and these three are not the only horses in town.”

AWS adds hydro-powered Canadian region to its estate

AWS has announced it will add a new carbon-neutral Canadian region to its estate, as well as run a new free test-drive service for cloud service buyers.

AWS chief technology evangelist Jeff Barr announced on the AWS official blog that a new AWS region in Montreal, Canada will run on hydro power.

The addition of data centre facilities in the Canada-Montreal region means that AWS partners and customers can run workloads and store data in Canada. AWS has four regions in North America but they are all in the United States, with coverage in US East (Northern Virginia), US West (Northern California), US West (Oregon), and AWS GovCloud (US). There is also an additional region planned for Ohio, due some time in 2016. The Ohio and Montreal additions will give AWS 14 Availability Zones in North America.

AWS’s data centre estate now comprises 32 Availability Zones across 12 geographic regions worldwide, according to the AWS Global Infrastructure page. Another 5 AWS regions (and 11 Availability Zones) are in the pipeline including new sites in China and India. These will come online “throughout the next year” said Barr.

The Montreal facilities are not exclusive to Canadian customers and partners, and are open to all existing AWS customers who want to process and store data in Canada, said Barr.

Meanwhile, AWS announced a collaboration with data platform provider MapR to create a ‘try before you buy’ service. Through AWS facilities MapR is to offer free test drives of the Dataguise DgSecure, HPE Vertica, Apache Drill and TIBCO Spotfire services that it runs from its integrated Spark/Hadoop systems.

The AWS Test Drives for Big Data will provide private IT sandbox environments with preconfigured servers so that cloud service shoppers can launch, login and learn about popular third-party big data IT services as they research their buying options. MapR claims that it has made the system so easy that the whole process, from launching to learning, can be achieved within an hour using its step-by-step lab manual and video. The test drives are powered by AWS CloudFormation.

MapR is currently the only Hadoop distribution on the AWS Cloud that is available as an option on Amazon Elastic MapReduce (EMR), AWS Marketplace and now via AWS Test Drive.

AWS launches WorkMail with eye on Exchange defectors

Amazon Web Services (AWS) has put its WorkMail email and calendaring service on general release. Priced at $4 per user per month, it includes an Exchange migration tool to encourage defections by Microsoft customers. However, those with data sovereignty concerns should be aware that the service is mostly hosted in the US, with a solitary non-US data centre in Ireland.

After a year in preview, the service was announced on the blog of AWS chief evangelist Jeff Barr. The service, designed to work with existing desktops and mobile clients, has been strengthened since it emerged in preview form, with the new service offering greater security, ease of use and migration, Barr said. The system has an emphasis on mobility features, with location control and policies and actions for controlling mobile devices, along with regular security features such as encryption of stored data, message scanning for spam and virus protection.

The migration tool will make it easier for users to move away from Microsoft Exchange, according to Barr, which suggests that dissatisfied Exchange users could be the primary target market.

The $4 per user per month service comes with an allocation of 50GB of storage and will run from AWS’ US data centres in Northern Virginia and Oregon, with a single data centre in Ireland to serve European customers. “You can choose the region where you want to store your mailboxes and be confident that the stored data will not leave the region,” wrote Barr.

Other features include a Key Management Service (KMS) for creating and managing the keys that are used to encrypt data at rest and Self Certifications, so that WorkMail users can show they have achieved various ISO certifications.

WorkMail will support clients running on OS X, including Apple Mail and Outlook. It will also support clients using the Microsoft Exchange ActiveSync protocol including iPhone, iPad, Kindle Fire, Fire Phone, Android, Windows Phone, and BlackBerry 10. AWS is also working on interoperability support to give users a single Global Address Book and to access calendar information across both environments. A 30-day free trial is available for up to 25 users.

AWS opens up EC2 Container Registry to all

Cloud giant Amazon Web Services (AWS) has opened up its technology for storing and managing application container images to the public.

The AWS EC2 Container Registry Service (ECR) had been available exclusively to industry insiders who attended the launch at the AWS re:Invent conference in Las Vegas in October. However, AWS has now decided to level the playing field, its Senior Product Manager Andrew Thomas revealed, guest writing on the blog of AWS chief evangelist Jeff Barr. Thomas invited all interested cloud operators to apply for access.

As containers have become the de facto method for packaging application code, all cloud service providers are competing to fine-tune the process of running code within them, as an alternative to using virtual machines. But developers have reported teething problems to AWS, Thomas wrote in the blog.

ECR, explains Thomas, is a managed Docker container registry designed to simplify the management of Docker container images, which developers have told him is difficult. Running a self-hosted Docker image registry for a large-scale infrastructure project can involve pulling hundreds of images at once, a burden made worse by the added complexity of spanning two or more AWS regions. AWS clients also wanted fine-grained access control to images without having to manage certificates or credentials, Thomas said.

Management aside, there is a security dividend too, according to Thomas. “This makes it easier for developers to evaluate potential security threats before pushing to Amazon ECR,” he said. “It also allows developers to monitor their containers running in production.”

There is no charge for transferring data into the Amazon EC2 Container Registry. Storage costs 10 cents per gigabyte per month, and all new AWS customers will receive 500MB of free storage a month for a year.
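A quick sketch of what that storage pricing implies for a registry of typical image sizes; the assumption here is that the 500MB free allowance simply offsets billed storage, which is how such tiers normally work but is not spelled out in the announcement.

```python
def ecr_storage_cost(gb_stored: float, free_tier: bool = True) -> float:
    """Monthly Amazon ECR storage cost at the quoted $0.10 per GB-month,
    less the 500MB free tier available to new customers for a year
    (assumed to offset billed storage directly)."""
    PRICE_PER_GB = 0.10
    free_gb = 0.5 if free_tier else 0.0
    return max(gb_stored - free_gb, 0.0) * PRICE_PER_GB

print(ecr_storage_cost(10.5))  # 10.5GB of images -> $1.00 a month
print(ecr_storage_cost(0.3))   # 300MB sits inside the free tier -> $0.00
```

Even a sizeable registry therefore costs pennies to store; the pitch is the removed operational burden rather than the price of the gigabytes.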

The Registry is integrated with Amazon ECS and the Docker CLI (command line interface), in order to simplify development and production workflows. “Users can push container images to Amazon ECR using the Docker CLI from the development machine and Amazon ECS can pull them directly for production,” said Thomas.

The service has been effective since December 21st in the US East (Northern Virginia) region, with more regions on the way soon.

Containers at Christmas: wrapping, cloud and competition

As anyone that’s ever been disappointed by a Christmas present will tell you, shiny packaging can be very misleading. As we hear all the time, it’s what’s inside that counts…

What then are we to make of the Docker hype, centred precisely on shiny new packaging? (Docker is the vendor that two years ago found a way to containerise applications; other types of containers, operating-system containers, have been around for a couple of decades.)

It is not all about the packaging, of course. Perhaps we should say that it is more about on what the package is placed, and how it is managed (amongst other things) that matters most?

Regardless, containers are one part of a changing cloud, data centre and enterprise IT landscape, the ‘cloud native’ movement widely seen as driving a significant shift in enterprise infrastructure and application development.

What the industry is trying to figure out, and what could prove the most disruptive angle to watch as more and more enterprises roll out containers into production, is the developing competition within this whole container/cloud/data centre market.

The question of competition is a very hot topic in the container, devops and cloud space. Nobody could have thought the OCI co-operation between Docker and CoreOS meant they were suddenly BFFs. Indeed, the drive to become the enterprise container of choice now seems to be at the forefront of both companies’ plans. Is this, however, the most dynamic relationship in the space? What about the Google-Docker-Mesos orchestration game? It would seem that Google’s trusted container experience is already allowing it to gain favour with enterprises, with Kubernetes taking a lead. And with CoreOS in bed with Google’s open source Kubernetes, placing it at the heart of Tectonic, does this mean that CoreOS has a stronger play in the enterprise market than Docker? We will wait and see…

We will also wait and see how the Big Cloud Three will come out of the expected container-driven market shift. Somebody described AWS as ‘a BT’ to me…that is, the incumbent who will be affected most by the new disruptive changes brought by containers, since it makes a lot of money from an older model of infrastructure….

Microsoft’s container ambition is also being watched closely. There is a lot of interest from both the development and IT Ops communities in its play in the emerging ecosystem. At a recent meet-up, an Azure evangelist had to field a number of deeply technical questions about exactly how Microsoft’s containers fare next to Linux’s. The question is whether, when assessing who will win the largest piece of the enterprise pie, this will prove the crux of the matter.

Containers are not merely changing the enterprise cloud game (with third place Google seemingly getting it very right) but also driving the IT Ops’ DevOps dream to reality; in fact, many are predicting that it could eventually prove a bit of a threat to Chef and Puppet’s future….

So, maybe kids at Christmas have got it right….it is all about the wrapping and boxes! We’ll have to wait a little longer than Christmas Day to find out.

Written by Lucy Ashton, Head of Content & Production, Container World

AWS launches EC2 Dedicated Hosts feature to identify specific servers used

Amazon Web Services (AWS) has launched a new service for the nervous server hugger: it tells users exactly which physical server will be running their machines, and includes management features to prevent licensing costs escalating.

The new EC2 Dedicated Hosts service was created by AWS in reaction to the sense of unease that users experience when they never really know where their virtual machines (VMs) are running.

Announcing the new service on the company blog, AWS chief evangelist Jeff Barr said the four main areas of improvement would be licensing savings, compliance, usage tracking and better control over instances (AKA virtual machines).

The Dedicated Hosts (DH) service will allow users to port their existing server-based licences for Windows Server, SQL Server, SUSE Linux Enterprise Server and other products to the cloud. A feature of DH will be the ability to see the number of sockets and physical cores that are available to a customer before they invest in software licences, improving their chances of not overpaying. Similarly, the Track Usage feature will help users monitor and manage their hardware and software inventory more thriftily. By using AWS Config to track the history of instances that are started and stopped on each of their Dedicated Hosts, customers can verify usage against their licensing metrics, Barr says.
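To illustrate why seeing sockets and cores up front matters, here is a sketch of per-core licence budgeting. The pack size and the $200-per-core figure are entirely hypothetical, chosen only to show the shape of the calculation; real vendor licensing terms vary.

```python
import math

def license_cost(physical_cores: int, price_per_core: float,
                 core_pack_size: int = 2) -> float:
    """Cost of per-core licensing when licences are sold in fixed-size
    core packs (hypothetical terms). Knowing a Dedicated Host's exact
    core count before buying avoids paying for packs the hardware
    can never use."""
    packs = math.ceil(physical_cores / core_pack_size)
    return packs * price_per_core * core_pack_size

# A host exposing 20 physical cores, licensed at a notional $200 per core:
print(license_cost(20, 200.0))  # 4000.0
```

Without that visibility, a buyer sizing licences against a guessed core count either over-buys packs or risks non-compliance, which is the gap the DH socket/core disclosure closes.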

Another management improvement comes from the Control Instance Placement feature, which promises ‘fine-grained control’ over the placement of EC2 instances on each Dedicated Host.

The provision of a physical server may be the most welcome addition for the many cloud buyers dogged by doubts over compliance and regulatory requirements. “You can allocate Dedicated Hosts and use them to run applications on hardware that is fully dedicated to your use,” says Barr.

The service will help enterprises that have complicated portfolios of software licenses where prices are calculated on the numbers of CPU cores or sockets. However, Dedicated Hosts can only run in tandem with AWS’ Virtual Private Cloud (VPC) service and can’t work with its Auto Scaling tool yet.