Category archive: Amazon

Amazon expands Anthropic partnership with $25 billion investment

Amazon is expanding its investment and infrastructure commitments in artificial intelligence through a deeper partnership with Anthropic, as cloud providers continue to scale compute capacity for large language models. The company has agreed to invest up to $25 billion more into Anthropic, adding to its previous $8 billion commitment. The latest agreement includes an initial […]

The post Amazon expands Anthropic partnership with $25 billion investment appeared first on Cloud Computing News.

Amazon’s retail stores are starting to run like cloud systems

A shift is underway in how large retailers run their physical stores. What once depended on manual stock checks and fixed supply chains is now moving away from siloed systems toward software-driven operations powered by cloud and AI. Amazon’s latest retail initiative offers a clear example of this change taking shape inside real-world environments. Amazon […]


Amazon invests $10B in North Carolina AI data centre

Amazon Web Services announced a $10 billion AI data centre investment in Richmond County, North Carolina, marking the company’s largest single investment in the state’s history. The Amazon AI data centre investment will establish new facilities specifically designed to handle generative artificial intelligence workloads and cloud computing operations. The June 4, 2025 announcement outlines plans for data centre infrastructure […]


Amazon pumps another €10B into Germany

In another bold step to strengthen its grip on Europe, Amazon has launched a €10 billion investment plan for Germany, with the goal of driving innovation and creating thousands of jobs across the country. This huge financial commitment underlines Amazon’s determination to improve its logistics network, cloud infrastructure, and research and development (R&D) capabilities in its…


AWS announces UK will be its third region in the EU by 2017

Amazon Web Services (AWS) is to add a UK region to its empire. On its opening date, mooted for the end of 2016 or early 2017, it will be the third region in the European Union and the 12th in the world.

The presence of an AWS region brings lower latency and strong data sovereignty to local users.

Amazon organises its ‘elastic computing’ by hosting it in multiple locations worldwide. These locations are divided into regions, and each region is a separate geographical area containing multiple, isolated locations known as Availability Zones. The rationale is to give fast local response while providing geographically diverse backup for each computing ‘instance’ (or user).

Announcing the new UK base on his blog, Amazon CTO Werner Vogels promised that Britain’s local and global enterprises, institutes and government departments will all get faster AWS Cloud services than they have been getting. The new region will be coupled – for failover purposes – with the existing AWS regions in Dublin and Frankfurt. This local presence, says AWS, will provide lower-latency access to websites, mobile applications, games, SaaS applications, big data analysis and Internet of Things (IoT) apps.

“We are committed to our customers’ need for capacity,” said Vogels, who promised ‘powerful AWS services that eliminate the heavy lifting of the underlying IT infrastructure’.

The UK government’s Trade and Investment Minister Lord Maude described the decision as ‘great news for the UK’. The choice of the UK as the third European presence for AWS is “further proof the UK is the most favoured location in Europe for inward investment,” said Maude.

By providing commercial cloud services from data centres in the UK, AWS will create healthier competition and more innovation in the UK data centre market, according to HM Government Chief Technology Officer Liam Maxwell. “This is good news for the UK government given the significant amount of data we hold that needs to be kept onshore,” said Maxwell.

Yesterday, AWS evangelist Jeff Barr revealed in his blog that AWS will be opening a region in South Korea in early 2016, its fifth region in Asia Pacific.

AWS profitability quadruples as revenue surges 78%

Amazon Web Services’ revenue grew by 78% year over year to $2.1 billion in the third quarter of 2015, and its operating profit more than quadrupled to $521 million. Its high profits – attributed to 500 new inventions and eight price cuts – contributed to earnings which surpassed analyst expectations and created a surge in parent company Amazon’s stock price.

The high growth rate in AWS profitability is partly a reflection of last year’s low margins, which were caused by competitive price cuts on AWS services.

Meanwhile parent company Amazon reported an overall third-quarter operating profit of $406 million on $25.4 billion of sales. Amazon CFO Brian Olsavsky responded to suggestions that AWS is keeping the company profitable and that, in the face of cloud competition, it may have to cut prices again to ensure further growth.

“I will point out that this quarter showed a lot of innovation, a lot of new products and features and a lot of investment,” Olsavsky told analysts. “Globally we are investing very heavily in our Prime platform. We’ve launched multiple devices including e-readers, tablets priced under $50, Echo, Dash Buttons, so there’s a lot of investment going on, and there will continue to be, especially related to Prime. Innovation and investment will continue and can be lumpy over time.”

The pace of innovation in AWS and the scale of its business has allowed it to do the ‘heavy lifting for Amazon’ said one Wall Street blogger.

By constantly re-inventing itself AWS has been able to cut its prices eight times since April 2014, said Phil Hardin, Amazon director of investor relations, on an analyst conference call. “The company rolled out 539 new features and services in the past year alone, many of which have been designed so that its customers can access enterprise-grade services for a fraction of what they would traditionally cost on-premise,” said Hardin.

Amazon continues Internet of Things push with AWS IoT

The new AWS IoT platform is designed to allow IoT devices to connect to the AWS cloud, along with a managed cloud service to assist with processing the data.

AWS IoT has been launched in beta, which usually means it’s not quite ready yet, but it needs people to try it out in order to iron out lingering bugs. In essence it appears to be Amazon’s play to put itself in the thick of the IoT land-grab, as the repository of all the data constantly being generated by the billions of sensors expected to comprise the IoT.

In many ways Amazon’s many previous launches and announcements at this year’s AWS re:Invent seem to have been leading up to this, as they’ve all been about making it easier to transfer data into the AWS cloud. Specifically Amazon Kinesis Firehose, which is designed to make it easier to upload wireless streaming data to the AWS cloud, seems to have been launched with IoT in mind.

“The promise of the Internet of Things is to make everyday products smarter for consumers, and for businesses to enable better, data-driven offerings that weren’t possible before,” said Marco Argenti, VP of Mobile and IoT at AWS.

“World-leading organizations like Philips, NASA JPL, and Sonos already use AWS services to support the back-end of their IoT applications. Now, AWS IoT enables a whole ecosystem of manufacturers, service providers, and application developers to easily connect their products to the cloud at scale, take action on the data they collect, and create a new class of applications that interact with the physical world.”

Device connections are handled by a device gateway, which provides tools for predetermining responses to data received. AWS IoT also creates a virtual version of each device in the cloud so it can be interacted with even in times of intermittent connectivity. A dedicated SDK aims to make it easier for developers to do clever things with IoT devices, and a bunch of semiconductor companies have already got on board by embedding the SDK into IoT chips, including Broadcom, Intel, Marvell, Mediatek, Microchip, Qualcomm and TI. There are also a bunch of IoT starter kits which can, of course, be bought on Amazon.
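The “virtual version of each device” idea can be illustrated with a minimal sketch (plain Python, no AWS SDK; the merge logic here is an assumption for illustration, not AWS’s actual implementation): the cloud keeps the last state a device reported plus the state applications desire, and the delta tells the device what to change once it reconnects.

```python
# Minimal illustration of a cloud-side device "shadow": the cloud stores
# the last state the device reported and the state applications desire,
# and computes the delta the device must apply when connectivity returns.

def shadow_delta(reported: dict, desired: dict) -> dict:
    """Return the keys the device still needs to change."""
    return {k: v for k, v in desired.items() if reported.get(k) != v}

# A lamp reported its state before dropping offline...
reported = {"power": "on", "brightness": 40}
# ...meanwhile an app asked for a dimmer setting.
desired = {"power": "on", "brightness": 10}

delta = shadow_delta(reported, desired)
print(delta)  # -> {'brightness': 10}
```

The point of the pattern is that applications talk to the shadow, not the device, so intermittent connectivity never blocks them.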

“At Philips we aim to empower people to take greater control of their health with digital solutions that support healthy living and improved care coordination,” said Jeroen Tas, CEO Healthcare Informatics, Solutions and Services at Philips. “Our HealthSuite digital platform and its device cloud are already managing more than seven million connected, medical-grade and consumer devices, sensors, and mobile apps.

“With the addition of AWS IoT, we will greatly accelerate the pursuit of our vision. It will be easier to acquire, process, and act upon data from heterogeneous devices in real-time. Our products, and the care they support, are enabled to grow smarter and more personalized over time.”

On top of moves like the Dash Button, its automated ordering service for consumables, this move cements Amazon’s ambition to be a major IoT player, with AWS at the core. If it delivers on the promise of making IoT easier for companies and developers, all the other tech giants currently involved in the IoT land grab may need to raise their game.

Amazon Web Services makes aggressive customer acquisition play

At its AWS re:Invent 2015 event Amazon Web Services (AWS) announced a number of products and initiatives designed to make it easier for potential customers to move their business to the AWS Cloud.

AWS Snowball is a portable storage appliance designed to be an alternative to trying to upload data over networks, claiming to be able to move 100 TB of data to AWS in less than a week. Amazon is betting that companies are neither willing to prioritise their existing bandwidth, nor devote the time to do this over the network. In addition the company launched Amazon Kinesis Firehose, which is designed to make it easier to upload wireless streaming data to the AWS cloud.
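A back-of-envelope calculation shows why a shipped appliance can beat the network for bulk data (the link speeds below are hypothetical examples, not figures from AWS):

```python
# Rough estimate: how long does 100 TB take over a network link,
# assuming the link is fully dedicated to the transfer?

def transfer_days(terabytes: float, gbps: float) -> float:
    bits = terabytes * 1e12 * 8      # decimal TB -> bits
    seconds = bits / (gbps * 1e9)    # link rate is in bits per second
    return seconds / 86400           # seconds -> days

print(f"{transfer_days(100, 1):.1f} days at 1 Gbps")     # ~9.3 days
print(f"{transfer_days(100, 0.1):.0f} days at 100 Mbps")  # ~93 days
```

Even a dedicated gigabit link needs over a week for 100 TB, and few companies will give a migration their whole pipe, which is the bet Amazon is making.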

“It has never been easier or more cost-effective for companies to collect, store, analyze, and share data than it is today with the AWS Cloud,” said Bill Vass, VP of AWS Storage Services. “As customers have realized that their data contains key insights that can lead to competitive advantage, they’re looking to get as much data into AWS as quickly as possible. AWS Snowball and Amazon Kinesis Firehose give customers two more important tools to get their data into AWS.”

On top of these new products Amazon announced two new database services – AWS Database Migration Service and Amazon RDS for MariaDB – designed to make it easier for enterprises to bring their production databases to AWS, which seems to take aim at Oracle customers especially.

“With more than a hundred thousand active customers, and six database engines from which to choose, Amazon RDS has become the new normal for running relational databases in the cloud,” said Hal Berenson, VP of Relational Database Services, AWS. “With the AWS Database Migration Service, and its associated Schema Conversion Tool, customers can choose either to move the same database engine from on-premises to AWS, or change from one of the proprietary engines they’re running on-premises to one of the several open source engines available in Amazon RDS.”

Continuing the theme of taking on the big enterprise IT incumbents Amazon launched QuickSight, a cloud business intelligence service that would appear to compete directly with the likes of IBM, while aiming to undercut them with a low-price as-a-service model.

“After several years of development, we’re excited to bring Amazon QuickSight to our customers – a fast and easy-to-use BI service that addresses these needs at an affordable price,” said Raju Gulabani, VP of Database Services at AWS. “At the heart of Amazon QuickSight is the brand new SPICE in-memory calculation engine, which uses the power of the AWS Cloud to make queries run lightning fast on large datasets. We’re looking forward to our customers and partners being able to SPICE up their analytics.”

Lastly Amazon announced a new business group in partnership with Accenture that is also designed to make it easier for companies to move their business to the cloud. The Accenture AWS Business Group is a joint effort between the two and is another example of Accenture putting the cloud at the centre of its strategy.

“Accenture is already a market leader in cloud and the formation of the Accenture AWS Business Group is a key part of our Accenture Cloud First agenda,” said Omar Abbosh, Chief Strategy Officer of Accenture. “Cloud is increasingly becoming a starting point with our clients for their enterprise solutions. Whether our clients need to innovate faster, create new services, or maximize value from their investments, the Accenture AWS Business Group will help them get there faster, with lower risk and with solutions optimized for AWS.”

Amazon enhances AWS with new analytics tools

On the eve of its AWS re:Invent 2015 event, internet giant Amazon is positioning itself for a run at the business intelligence market.

Already announced is the Amazon Elasticsearch Service, a managed service designed to make it easier to deploy and operate Elasticsearch in the AWS cloud; more on that later.

In addition the WSJ is reporting the likely launch of a new analytics service, codenamed SpaceNeedle, which is set to augment AWS with business intelligence tools. The reported strategic aim of this new service is to both strengthen Amazon’s relationship with AWS customers and allow it to broaden its total available market.

Back to the Elasticsearch service, BCN spoke to Ian Massingham, UK Technical Evangelist at AWS, to find out the thinking behind it. “This service is intended for developers running applications that use Elasticsearch today, or developers that are considering incorporating Elasticsearch into future applications,” he said. “Elasticsearch is a popular open-source search and analytics engine for use cases such as log analytics, real-time application monitoring, and click stream analytics.”

Apparently Wikipedia uses Elasticsearch to provide full-text search with highlighted search snippets, as well as search-as-you-type and did-you-mean suggestions, while The Guardian uses Elasticsearch to combine visitor logs with social network data to provide real-time feedback to its editors about the public’s response to new articles.

Expect more AWS news as the re:Invent event gets underway. Already Avere Systems has unveiled Avere CloudFusion, a file storage application for AWS that aims to provide a cloud file system leveraging Amazon Elastic Compute Cloud (EC2) and Amazon Elastic Block Store (EBS) with the cost efficiencies of Amazon Simple Storage Service (S3), all with the simplicity of network-attached storage.

Amazon Web Services to offer new hierarchical storage options after customer feedback

Amazon Web Services (AWS) is adding a new storage class to cut the cost of storing infrequently accessed information.

The announcement was made by AWS chief evangelist Jeff Barr on his company blog. Customer feedback prompted AWS to analyse usage patterns, Barr said. AWS’s analytical team discovered that many customers store rarely-read backup and log files, which compete for resources with shared documents or raw data that need immediate analysis. Most users access their files frequently shortly after uploading them, after which activity drops off significantly with age. Information that’s important but not immediately urgent needs a new storage model, said Barr.

In response AWS has introduced a hierarchy of S3 pricing options based on frequency of access. Customers now have the choice of three S3 storage classes – Standard, Standard – IA (infrequent access) and Glacier – all offering the same 99.999999999 per cent durability. The Standard – IA class has a service level agreement (SLA) of 99 per cent availability and is priced accordingly: from $0.0125 per gigabyte per month, with a 30-day minimum storage duration for billing and a $0.01 per gigabyte charge for retrieval. The usual data transfer and request charges apply.

For billing purposes, objects that are smaller than 128 kilobytes are charged for 128 kilobytes of storage. AWS says this new pricing model will make its storage class more economical for long-term storage, backups and disaster recovery.
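The figures above can be put together into a rough cost model (a sketch using only the prices quoted in the article; data transfer and request charges are ignored):

```python
# Rough monthly cost model for Standard - IA using the quoted figures:
# $0.0125 per GB-month stored, each object billed as at least 128 KB,
# plus $0.01 per GB retrieved. Transfer/request charges are omitted.

STORAGE_PER_GB = 0.0125
RETRIEVAL_PER_GB = 0.01
MIN_OBJECT_KB = 128

def monthly_cost(object_sizes_kb, retrieved_gb=0.0):
    # Small objects are rounded up to the 128 KB billing minimum.
    billable_kb = sum(max(size, MIN_OBJECT_KB) for size in object_sizes_kb)
    storage_gb = billable_kb / (1024 * 1024)   # KB -> GB
    return storage_gb * STORAGE_PER_GB + retrieved_gb * RETRIEVAL_PER_GB

# 1,000 tiny 4 KB log files are each billed as 128 KB of storage:
print(f"${monthly_cost([4] * 1000):.4f}")
```

The 128 KB minimum is why Standard – IA suits fewer, larger archival objects better than swarms of tiny files.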

AWS has also introduced a lifecycle policy option, in a system that emulates the hierarchical storage model of centralised computing. Users can now create policies that automate the movement of data between Amazon S3 storage classes over time. Typically, according to Barr, data uploaded to the Standard storage class will be moved by customers to the Standard – IA class when it is 30 days old, and on to the Amazon Glacier class after another 60 days, where storage will cost $0.01 per gigabyte per month.
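A lifecycle rule following the pattern Barr describes might look like the sketch below. The dict follows the shape that boto3’s `put_bucket_lifecycle_configuration` expects; the rule ID and prefix are hypothetical, and no AWS call is made here.

```python
# Objects transition to Standard - IA after 30 days and on to Glacier
# 60 days later (i.e. at day 90 from upload), per the pattern above.

lifecycle_config = {
    "Rules": [
        {
            "ID": "tier-down-logs",         # hypothetical rule name
            "Filter": {"Prefix": "logs/"},  # hypothetical key prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# With boto3 this would be applied along the lines of:
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-bucket", LifecycleConfiguration=lifecycle_config)
print(lifecycle_config["Rules"][0]["Transitions"])
```

Note that transition `Days` values are counted from object creation, so “another 60 days” after the 30-day move is expressed as `Days: 90`.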