Archivo de la categoría: AWS

AWS opens up EC2 Container Registry to all

Cloud giant Amazon Web Services (AWS) has opened its technology for storing and managing application container images to public consumption.

The AWS EC2 Container Registry (ECR) had been exclusive to industry insiders who attended the launch at the AWS re:Invent conference in Las Vegas in October. However, AWS has now decided to level the playing field, its Senior Product Manager Andrew Thomas revealed in a guest post on the blog of AWS chief evangelist Jeff Barr. Thomas invited all interested cloud operators to apply for access.

As containers have become the de facto method for packaging application code, all cloud service providers are competing to fine-tune the process of running code within them, as an alternative to using virtual machines. But developers have reported teething problems to AWS, Thomas notes in the post.

ECR, explains Thomas, is a managed Docker container registry designed to simplify the management of Docker container images, which, developers have told him, has proved difficult. Running a self-hosted image registry for a large-scale infrastructure project can involve pulling hundreds of images at once, and the difficulty is compounded when the registry must span two or more AWS regions. Clients also wanted fine-grained access control to images without having to manage certificates or credentials, Thomas said.

Management aside, there is a security dividend too, according to Thomas. “This makes it easier for developers to evaluate potential security threats before pushing to Amazon ECR,” he said. “It also allows developers to monitor their containers running in production.”

There is no charge for transferring data into the Amazon EC2 Container Registry. Storage costs 10 cents per gigabyte per month, and all new AWS customers receive 500MB of free storage a month for their first year.
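A rough back-of-the-envelope sketch of what that pricing means in practice. The $0.10/GB/month rate and 500MB free tier are the figures quoted above; the billing model here is deliberately simplified (real AWS billing pro-rates by actual usage):

```python
# Back-of-the-envelope ECR storage cost, using the rates quoted in the article.
# Simplified: assumes flat monthly billing, no pro-rating.
PRICE_PER_GB_MONTH = 0.10   # 10 cents per GB per month
FREE_TIER_GB = 0.5          # 500MB free per month for new customers

def monthly_storage_cost(stored_gb: float, free_tier: bool = True) -> float:
    """Estimated monthly bill in USD for a given amount of stored image data."""
    billable = max(stored_gb - (FREE_TIER_GB if free_tier else 0.0), 0.0)
    return round(billable * PRICE_PER_GB_MONTH, 2)

print(monthly_storage_cost(20))    # 19.5 billable GB -> 1.95
print(monthly_storage_cost(0.3))   # under the free tier -> 0.0
```

So a customer storing 20GB of images would pay under $2 a month, which illustrates why Thomas pitches the managed registry against the cost of self-hosting.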

The Registry is integrated with Amazon ECS and the Docker CLI (command line interface), in order to simplify development and production workflows. “Users can push container images to Amazon ECR using the Docker CLI from the development machine and Amazon ECS can pull them directly for production,” said Thomas.
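The push/pull workflow Thomas describes can be sketched roughly as follows. The one real convention relied on here is that an ECR authorization token base64-decodes to a `user:password` pair for `docker login`; the account ID, region and repository name are placeholders, and the `docker` commands are shown as comments:

```python
import base64

def decode_ecr_token(token: str) -> tuple:
    """An ECR authorization token is base64-encoded 'user:password'.
    In practice the token comes from the ECR GetAuthorizationToken API."""
    user, _, password = base64.b64decode(token).decode().partition(":")
    return user, password

# Illustrative registry URI; account ID, region and repo name are placeholders.
registry = "123456789012.dkr.ecr.us-east-1.amazonaws.com"
user, password = decode_ecr_token(base64.b64encode(b"AWS:example-token").decode())

# The Docker CLI workflow then looks like:
#   docker login -u {user} -p {password} {registry}
#   docker tag my-app:latest {registry}/my-app:latest
#   docker push {registry}/my-app:latest    # from the development machine
# and in production, ECS pulls {registry}/my-app:latest directly.
```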

The service has been effective since December 21st in the US East (Northern Virginia) region, with more regions on the way soon.

Containers at Christmas: wrapping, cloud and competition

As anyone who’s ever been disappointed by a Christmas present will tell you, shiny packaging can be very misleading. As we hear all the time, it’s what’s inside that counts…

What, then, are we to make of the Docker hype, centred precisely on shiny new packaging? (Docker is the vendor that, two years ago, found a way to containerise applications; other types of containers, such as operating system containers, have been around for a couple of decades.)

It is not all about the packaging, of course. Perhaps we should say that what the package is placed on, and how it is managed (amongst other things), matters most?

Regardless, containers are one part of a changing cloud, data centre and enterprise IT landscape, with the ‘cloud native’ movement widely seen as driving a significant shift in enterprise infrastructure and application development.

What the industry is trying to figure out, and what could prove the most disruptive angle to watch as more and more enterprises roll out containers into production, is the developing competition within this whole container/cloud/data centre market.

The question of competition is a very hot topic in the container, devops and cloud space. Nobody could have thought the OCI co-operation between Docker and CoreOS meant they were suddenly BFFs. Indeed, the drive to become the enterprise container of choice now seems to be at the forefront of both companies’ plans. Is this, however, the most dynamic relationship in the space? What about the Google-Docker-Mesos orchestration game? It would seem that Google’s trusted container experience is already allowing it to gain favour with enterprises, with Kubernetes taking a lead. And with CoreOS in bed with Google’s open source Kubernetes, placing it at the heart of Tectonic, does this mean that CoreOS has a stronger play in the enterprise market than Docker? We will wait and see…

We will also wait and see how the Big Cloud Three will come out of the expected container-driven market shift. Somebody described AWS as ‘a BT’ to me…that is, the incumbent who will be affected most by the new disruptive changes brought by containers, since it makes a lot of money from an older model of infrastructure….

Microsoft’s container ambition is also being watched closely. There is a lot of interest from both the development and IT Ops communities in its play in the emerging ecosystem. At a recent meet-up, an Azure evangelist had to field a number of deeply technical questions about exactly how Microsoft’s containers fare next to Linux’s. The question is whether this will prove the crux of the matter when assessing who will win the largest piece of the enterprise pie.

Containers are not merely changing the enterprise cloud game (with third-place Google seemingly getting it very right) but also turning IT Ops’ DevOps dream into reality; in fact, many are predicting that they could eventually prove a bit of a threat to Chef and Puppet’s future….

So, maybe kids at Christmas have got it right….it is all about the wrapping and boxes! We’ll have to wait a little longer than Christmas Day to find out.

Written by Lucy Ashton, Head of Content & Production, Container World

AWS launches EC2 Dedicated Hosts feature to identify specific servers used

Amazon Web Services (AWS) has launched a new service for the nervous server hugger: it gives users knowledge of the exact server that will be running their machines, and also includes management features to prevent licensing costs escalating.

The new EC2 Dedicated Hosts service was created by AWS in reaction to the sense of unease that users experience when they never really know where their virtual machines (VMs) are running.

Announcing the new service on the company blog, AWS chief evangelist Jeff Barr said the four main areas of improvement would be licensing savings, compliance, usage tracking and better control over instances (AKA virtual machines).

The Dedicated Hosts (DH) service will allow users to port their existing server-based licenses for Windows Server, SQL Server, SUSE Linux Enterprise Server and other products to the cloud. A feature of DH will be the ability to see the number of sockets and physical cores that are available before a customer invests in software licenses, improving their chances of not overpaying. Similarly, the Track Usage feature will help users monitor and manage their hardware and software inventory more thriftily. By using AWS Config to track the history of instances that are started and stopped on each of their Dedicated Hosts, customers can verify usage against their licensing metrics, Barr says.
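A toy illustration of why seeing socket and core counts up front matters for per-core licensing. The two-core license pack is a common vendor convention, but the numbers below are hypothetical, not taken from any price list:

```python
import math

def core_licenses_needed(sockets: int, cores_per_socket: int,
                         cores_per_license: int = 2) -> int:
    """Licenses required under per-core licensing (e.g. two-core packs).
    All figures here are illustrative, not vendor pricing facts."""
    total_cores = sockets * cores_per_socket
    return math.ceil(total_cores / cores_per_license)

# A hypothetical 2-socket, 10-cores-per-socket Dedicated Host:
print(core_licenses_needed(2, 10))  # 20 cores -> 10 two-core licenses
```

Knowing the host is 2 sockets x 10 cores before buying means the customer licenses exactly 20 cores rather than over-provisioning.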

Another management improvement comes from the Control Instance Placement feature, which promises ‘fine-grained control’ over the placement of EC2 instances on each Dedicated Host.

The provision of a physical server may be the most welcome addition for the many cloud buyers dogged by doubts over compliance and regulatory requirements. “You can allocate Dedicated Hosts and use them to run applications on hardware that is fully dedicated to your use,” says Barr.

The service will help enterprises that have complicated portfolios of software licenses where prices are calculated on the number of CPU cores or sockets. However, Dedicated Hosts can only run in tandem with AWS’ Virtual Private Cloud (VPC) service and cannot yet work with its Auto Scaling tool.

Equinix connects AWS direct to data centres in Dallas and London

Data centre operator Equinix has added an Amazon Web Services (AWS) Direct Connect facility in its Dallas data centre and in the data centres of its London International Business Exchange (IBX).

The AWS Direct Connect facility means that companies using Equinix data centres can connect their privately owned and managed infrastructure directly to AWS, the company claims. The arrangement creates a private connection to the AWS Cloud within the same infrastructure. This ‘hard-wiring’ of two infrastructures in the same building can cut costs and latency while boosting throughput and ultimately improving application performance, Equinix says. These two offerings bring the total number of Equinix data centres offering Direct Connect to AWS to ten.

The service is a response to increasing demand from clients for hybrid clouds. Equinix says it can configure this in its own data centres, through direct interconnection of the public cloud provider’s kit and the equipment belonging to clients. This Equinix-enabled hybrid is an instant way to achieve the scalability and cost benefits of the cloud, while maintaining the security and control standards offered by on-premise infrastructure.

Equinix claims that a recent study, Enterprise of the Future, found that hybrid deployments in enterprise cloud computing will double by 2017. According to the study’s respondents, 84% of IT leaders will by then deploy IT infrastructure where interconnection, defined as direct, secure physical or virtual connections, is at the core, compared to 38% today.

London is the second Equinix location in Europe, after Frankfurt, to get an AWS Direct Connect arrangement. It means that customers can get “native” connections to AWS Cloud offerings, whereas previously they tethered from Equinix in London into AWS’s Dublin facilities. Equinix’s Dallas IBX, DA5, is the fourth data centre in North America to offer AWS Direct Connect, joining Equinix’s facilities in Seattle, Silicon Valley and Washington. Equinix now offers AWS Direct Connect in ten global locations: Dallas, Frankfurt, London, Osaka, Seattle, Silicon Valley, Singapore, Sydney, Tokyo and Washington, D.C./Northern Virginia. Equinix customers in these areas experience lower network costs into and out of AWS and take advantage of reduced AWS Direct Connect data transfer rates.

Gemalto and NetApp to create secure cloud storage hybrid for AWS customers

Security vendor Gemalto and NetApp are to jointly create an integrated, encrypted key management system for securing data for Amazon Web Services (AWS) customers. The aim is to save time and improve security for end users, by simplifying the process of securing virtual data.

The two vendors, both AWS network partners, are to blend Gemalto’s SafeNet Virtual KeySecure and NetApp’s Cloud ONTAP as a unified service to be offered on the AWS Marketplace.

The SafeNet Virtual KeySecure for NetApp Cloud ONTAP (SVKNCO) service promises to make storing and encrypting data and applications much easier for companies using virtual environments. The system will pay for itself, the vendors claim, through the productivity gains and raised levels of security created when users enjoy more governance over their stored data.

The SVKNCO creates these benefits, it’s claimed, by centralising management and making it easy to create customisable security policies for data access in the cloud. It achieves this by combining NetApp’s modern storage infrastructure with Gemalto’s SafeNet key management. The hybrid of the two systems can protect customers’ data and encryption keys against unauthorised access, while giving them the most cost effective storage options at all times.

It’s about creating top levels of security, but not at ‘any cost’ according to Todd Moore, VP of Data Encryption Product Management at Gemalto. “AWS users can now turn to NetApp to manage, store and protect their data more confidently, while completely owning their encryption keys,” said Moore.

Meanwhile, data centre infrastructure vendor Nutanix has also announced that its Community Edition is to be made available for AWS customers. The free software tool aims to help AWS customers speed up the evaluation process when weighing their options for buying infrastructure.

Avnet launches Cloud Marketplace with AWS and IBM as key clients

Global distributor Avnet has launched a cloud shop and unveiled both IBM and AWS as headline partners.

The Avnet Cloud Marketplace is described as ‘the latest evolution’ in Avnet’s cloud offering. Avnet said it is incorporating all the insight gained from running 900,000 workloads in public, private and hybrid clouds in the past two years and making that wisdom available to its partners.

The shop offers top brands like AWS and IBM, with flexible payment models and a cloud management toolset. Avnet said its unique angle is that it allows partners in the US and Canada to offer cloud-based services to their customers through both consumption and subscription-based models. Channel partners will be able to create branded storefronts to offer complete solutions to their customers. Avnet’s Cloud Marketplace customers include VARs, ISVs, MSPs, systems integrators (SIs), technology manufacturers and end users. The Marketplace will help them roll out services more quickly, according to Sergio Farache, senior vice president of Avnet’s Solutions and Strategy business unit in the Americas.

“This is how Avnet helps partners use next-generation technologies and evolve,” said Farache.

The Avnet Cloud Marketplace is based on Avnet’s digital distribution strategy, where a combination of digital tools, processes and services can simplify the cloud and service provisioning.

Meanwhile, Avnet has announced that IBM’s cloud services SoftLayer and Bluemix will be provided through its portal. IBM’s Platform-as-a-Service (PaaS) will help developers integrate applications more rapidly, while its Software-as-a-Service (SaaS) offering will make cloud, analytics, mobile, social and security applications available to Avnet’s channel partners.

Avnet will also offer IBM Business Partners educational and training resources to further expand their cloud expertise. On November 6th Avnet revealed that packaged solutions powered by AWS, such as backup and disaster recovery solutions that integrate NetApp and Veritas software with Amazon Simple Storage Service (Amazon S3) and Amazon Glacier, will be offered to its channel. These offerings will make it easier for Avnet’s US and Canadian resellers, AR and service provider partners to sell the full range of AWS services.

AWS announces UK will be its third region in the EU by 2017

Amazon Web Services (AWS) is to add a UK region to its empire. On its opening date, mooted for the end of 2016 or early 2017, it will be the third region in the European Union and the 12th in the world.

The presence of an AWS region brings lower latency and strong data sovereignty to local users.

Amazon organises its ‘elastic computing’ by hosting it in multiple locations worldwide. These locations are subdivided into regions and Availability Zones. Each region is a separate geographical area containing multiple, isolated locations known as Availability Zones. The rationale is to give each computing ‘instance’ (or user) instant local response along with geographically diverse backup.
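The region/Availability Zone hierarchy described above can be pictured as a simple mapping. The zone names follow AWS's real naming convention, but the zone lists and the failover helper are purely illustrative:

```python
# Each region contains multiple isolated Availability Zones (AZs).
# Zone lists are illustrative, not an authoritative AWS inventory.
REGIONS = {
    "eu-west-1":    ["eu-west-1a", "eu-west-1b", "eu-west-1c"],  # Dublin
    "eu-central-1": ["eu-central-1a", "eu-central-1b"],          # Frankfurt
}

def failover_zone(region: str, primary_az: str) -> str:
    """Hypothetical helper: pick a different AZ in the same region,
    giving an instance geographically diverse backup with local latency."""
    return next(az for az in REGIONS[region] if az != primary_az)

print(failover_zone("eu-west-1", "eu-west-1a"))  # eu-west-1b
```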

Announcing the new UK base on his blog, Amazon CTO Werner Vogels promised that Britain’s local and global enterprises, institutes and government departments will get faster AWS Cloud services than before. The new region will be coupled, for failover purposes, with the existing AWS regions in Dublin and Frankfurt. This local presence, says AWS, will provide lower-latency access to websites, mobile applications, games, SaaS applications, big data analysis and Internet of Things (IoT) apps.

“We are committed to our customers’ need for capacity,” said Vogels, who promised ‘powerful AWS services that eliminate the heavy lifting of the underlying IT infrastructure’.

The UK government’s Trade and Investment Minister Lord Maude described the decision as ‘great news for the UK’. The choice of the UK as the third European presence for AWS is “further proof the UK is the most favoured location in Europe for inward investment,” said Maude.

By providing commercial cloud services from data centres in the UK, AWS will create more healthy competition and innovation in the UK data centre market, according to HM Government Chief Technology Officer Liam Maxwell. “This is good news for the UK government given the significant amount of data we hold that needs to be kept onshore,” said Maxwell.

Yesterday, AWS evangelist Jeff Barr revealed in his blog that AWS will be opening a region in South Korea in early 2016, its fifth region in Asia Pacific.

AWS profitability quadruples as revenue surges 78%

Amazon Web Services’ revenue grew by 78% year over year to $2.1 billion in the third quarter of 2015 and its operating profit more than quadrupled to $521 million. Its high profits – attributed to 500 new inventions and eight price cuts – contributed to earnings which surpassed analyst expectations and created a surge in parent company Amazon’s stock price.

The high growth rate in AWS profitability can partly be explained by last year’s low margins, caused by competitive price cuts on AWS services.

Meanwhile parent company Amazon reported an overall third-quarter operating profit of $406 million on $25.4 billion of sales. Amazon CFO Brian Olsavsky responded to suggestions that AWS alone is keeping the company profitable and that, in the face of cloud competition, it may have to cut prices again to ensure further growth.

“I will point out that this quarter showed a lot of innovation, a lot of new products and features and a lot of investment,” Olsavsky told analysts. “Globally we are investing very heavily in our Prime platform. We’ve launched multiple devices including e-readers, tablets priced under $50, Echo and Dash Buttons, so there’s a lot of investment going on, and there will continue to be, especially related to Prime. Innovation and investment will continue and can be lumpy over time.”

The pace of innovation in AWS and the scale of its business has allowed it to do the ‘heavy lifting for Amazon’ said one Wall Street blogger.

By constantly re-inventing itself AWS has been able to cut its prices eight times since April 2014, said Phil Hardin, Amazon’s director of investor relations, on an analyst conference call. “The company rolled out 539 new features and services in the past year alone, many of which have been designed so that its customers can access enterprise-grade services for a fraction of what they would traditionally cost on-premise,” said Hardin.

Amazon continues Internet of Things push with AWS IoT

The new AWS IoT platform is designed to allow IoT devices to connect to the AWS cloud, paired with a managed cloud service to assist with processing the data.

AWS IoT has been launched in beta, which usually means it’s not quite ready yet, but it needs people to try it out in order to iron out lingering bugs. In essence it appears to be Amazon’s play to put itself in the thick of the IoT land-grab, as the repository of all the data constantly being generated by the billions of sensors expected to comprise the IoT.

In many ways Amazon’s many previous launches and announcements at this year’s AWS re:Invent seem to have been leading up to this, as they’ve all been about making it easier to transfer data into the AWS cloud. Specifically, Amazon Kinesis Firehose, which is designed to make it easier to upload streaming data to the AWS cloud, seems to have been launched with IoT in mind.

“The promise of the Internet of Things is to make everyday products smarter for consumers, and for businesses to enable better, data-driven offerings that weren’t possible before,” said Marco Argenti, VP of Mobile and IoT at AWS.

“World-leading organizations like Philips, NASA JPL, and Sonos already use AWS services to support the back-end of their IoT applications. Now, AWS IoT enables a whole ecosystem of manufacturers, service providers, and application developers to easily connect their products to the cloud at scale, take action on the data they collect, and create a new class of applications that interact with the physical world.”

Device connections are handled by a device gateway, which provides tools for predetermining responses to data received. AWS IoT also creates a virtual version of each device in the cloud so it can be interacted with even in times of intermittent connectivity. A dedicated SDK aims to make it easier for developers to do clever things with IoT devices and a bunch of semiconductor companies have already got on-board by embedding the SDK into IoT chips, including Broadcom, Intel, Marvell, Mediatek, Microchip, Qualcomm and TI. There are also a bunch of IoT starter kits which can, of course, be bought on Amazon.
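The "virtual version of each device" mentioned above is what AWS calls a device shadow: it holds a desired state and the device's last-reported state, and the delta between the two tells the device what to change when it reconnects. A minimal sketch of that delta logic, with hypothetical state fields (the real shadow service operates on JSON documents over the device gateway):

```python
def shadow_delta(desired: dict, reported: dict) -> dict:
    """Fields where the desired state differs from what the device last
    reported -- roughly what the shadow service sends a device on reconnect.
    Simplified sketch; field names below are hypothetical."""
    return {k: v for k, v in desired.items() if reported.get(k) != v}

# Hypothetical smart-bulb state: the app asked for "on" while the bulb
# was offline, so only the power field needs to change on reconnect.
desired  = {"power": "on",  "brightness": 80}
reported = {"power": "off", "brightness": 80}
print(shadow_delta(desired, reported))  # {'power': 'on'}
```

This is the mechanism that lets applications interact with a device "even in times of intermittent connectivity": they write to the shadow, and the device catches up later.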

“At Philips we aim to empower people to take greater control of their health with digital solutions that support healthy living and improved care coordination,” said Jeroen Tas, CEO Healthcare Informatics, Solutions and Services at Philips. “Our HealthSuite digital platform and its device cloud are already managing more than seven million connected, medical-grade and consumer devices, sensors, and mobile apps.

“With the addition of AWS IoT, we will greatly accelerate the pursuit of our vision. It will be easier to acquire, process, and act upon data from heterogeneous devices in real-time. Our products, and the care they support, are enabled to grow smarter and more personalized over time.”

On top of moves like the Dash Button automated consumables-ordering service, this launch cements Amazon’s ambition to be a major IoT player, with AWS at the core. If it delivers on the promise of making IoT easier for companies and developers, all the other tech giants currently involved in the IoT land grab may need to raise their game.

Amazon Web Services makes aggressive customer acquisition play

At its AWS re:Invent event Amazon Web Services (AWS) announced a number of products and initiatives designed to make it easier for potential customers to move their business to the AWS Cloud.

AWS Snowball is a portable storage appliance designed to be an alternative to trying to upload data over networks, claiming to be able to move 100 TB of data to AWS in less than a week. Amazon is betting that companies are neither willing to prioritise their existing bandwidth, nor devote the time to do this over the network. In addition, the company launched Amazon Kinesis Firehose, which is designed to make it easier to upload streaming data to the AWS cloud.
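Streaming uploads of the kind Kinesis Firehose targets are typically sent in batches; the service's PutRecordBatch call accepts up to 500 records at a time. A sketch of the client-side batching, where that 500-record limit is the only assumed service detail (actually sending the batches would use the AWS SDK):

```python
def batch_records(records: list, max_batch: int = 500) -> list:
    """Split records into batches of at most max_batch items, mirroring
    the 500-record limit of Firehose's PutRecordBatch call. Purely
    client-side illustration; no data is actually sent anywhere."""
    return [records[i:i + max_batch] for i in range(0, len(records), max_batch)]

# Hypothetical stream of 1,200 sensor events:
batches = batch_records([f"event-{n}" for n in range(1200)])
print([len(b) for b in batches])  # [500, 500, 200]
```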

“It has never been easier or more cost-effective for companies to collect, store, analyze, and share data than it is today with the AWS Cloud,” said Bill Vass, VP of AWS Storage Services. “As customers have realized that their data contains key insights that can lead to competitive advantage, they’re looking to get as much data into AWS as quickly as possible. AWS Snowball and Amazon Kinesis Firehose give customers two more important tools to get their data into AWS.”

On top of these new products Amazon announced two new database services – AWS Database Migration Service and Amazon RDS for MariaDB – designed to make it easier for enterprises to bring their production databases to AWS, which seems to take aim at Oracle customers especially.

“With more than a hundred thousand active customers, and six database engines from which to choose, Amazon RDS has become the new normal for running relational databases in the cloud,” said Hal Berenson, VP of Relational Database Services, AWS. “With the AWS Database Migration Service, and its associated Schema Conversion Tool, customers can choose either to move the same database engine from on-premises to AWS, or change from one of the proprietary engines they’re running on-premises to one of the several open source engines available in Amazon RDS.”

Continuing the theme of taking on the big enterprise IT incumbents Amazon launched QuickSight, a cloud business intelligence service that would appear to compete directly with the likes of IBM, while aiming to undercut them with a low-price as-a-service model.

“After several years of development, we’re excited to bring Amazon QuickSight to our customers – a fast and easy-to-use BI service that addresses these needs at an affordable price,” said Raju Gulabani, VP of Database Services at AWS. “At the heart of Amazon QuickSight is the brand new SPICE in-memory calculation engine, which uses the power of the AWS Cloud to make queries run lightning fast on large datasets. We’re looking forward to our customers and partners being able to SPICE up their analytics.”

Lastly Amazon announced a new business group in partnership with Accenture that is also designed to make it easier for companies to move their business to the cloud. The Accenture AWS Business Group is a joint effort between the two and is another example of Accenture putting the cloud at the centre of its strategy.

“Accenture is already a market leader in cloud and the formation of the Accenture AWS Business Group is a key part of our Accenture Cloud First agenda,” said Omar Abbosh, Chief Strategy Officer of Accenture. “Cloud is increasingly becoming a starting point with our clients for their enterprise solutions. Whether our clients need to innovate faster, create new services, or maximize value from their investments, the Accenture AWS Business Group will help them get there faster, with lower risk and with solutions optimized for AWS.”