Category archive: News & Analysis

Cloud computing will impact $1 trillion of IT spending decisions – Gartner

Analyst firm Gartner has predicted that more than $1 trillion in IT spending will be directly or indirectly impacted by the transition to cloud computing by 2020.

As IT spend steadily shifts from traditional offerings to the cloud, a process the Gartner team has coined the ‘cloud shift’, the rate at which enterprise organizations move to the cloud is expected to increase year-on-year. The aggregate amount of cloud shift in 2016 is estimated to reach $111 billion, rising to $216 billion in 2020. The Gartner team believes cloud computing will be one of the most disruptive forces on IT spending since the early days of the digital age.
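For a sense of scale, the growth implied by those two estimates can be sketched with a standard compound annual growth rate calculation (the $111 billion and $216 billion figures are Gartner's; the formula is the usual CAGR definition):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values."""
    return (end_value / start_value) ** (1 / years) - 1

# Gartner's aggregate cloud-shift estimates, in $bn
shift_2016 = 111
shift_2020 = 216

rate = cagr(shift_2016, shift_2020, years=4)
print(f"Implied annual growth: {rate:.1%}")  # roughly 18% per year
```

That is, reaching Gartner's 2020 figure from the 2016 base requires cloud shift to compound at just under a fifth per year.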

“Cloud-first strategies are the foundation for staying relevant in a fast-paced world,” said Ed Anderson, Research VP at Gartner. “The market for cloud services has grown to such an extent that it is now a notable percentage of total IT spending, helping to create a new generation of start-ups and “born in the cloud” providers.”

In terms of specific segments, IaaS is the largest market, accounting for $294 billion, though it demonstrates one of the lowest levels of cloud shift through 2016 at only 17%. Business Process Outsourcing, delivered as BPaaS, will represent the biggest cloud shift rate at 43%, though its expected market value through 2016 is $119 billion.


Cloud Shift Summary by Market Segment

While the potential of cloud computing has been exhaustively discussed over recent years, one of the growing debates in the industry has centred on the skills gap. Cloud requires not only new skills within the organization, but also a different approach to problem solving and a new business culture, if the benefits are to be realized. This challenge is currently being addressed by numerous organizations throughout the world.

“There is no doubt that cloud delivers unmatched business benefits in terms of usability, choice and agility,” said Angelo Di Ventura, Director at Trustmarque. “At the same time it requires wholly new skills and capabilities, and a complete IT transformation to maximise the value that businesses can gain from it – cloud can cause considerable disruption if left unchecked.

“The transition from an internet-enabled business to a digital business running in the cloud represents a huge jump for the majority of IT departments, whose existing infrastructure is designed for ‘business as usual’ operations. Ultimately, there is no one-size-fits-all model when it comes to making cloud work for a business.”

 

Microsoft continues cloud transformation with 100% Azure growth

Microsoft has reported 5% growth to $22.6 billion as the Intelligent Cloud business unit led the charge, with the Azure public cloud offering more than doubling in revenues and compute usage, reports Telecoms.com.

The Intelligent Cloud unit, which includes server products and cloud services, Azure and enterprise mobility offerings, grew 7% to $6.7 billion, while Productivity and Business Processes, which includes the Office commercial and consumer product lines as well as the Dynamics suite, grew 5% to $7 billion. Despite revenues in More Personal Computing declining 4% to $8.9 billion, Xbox Live monthly active users grew 33% year-on-year to 49 million and search advertising revenue grew 16% over the period.

“We delivered $22.6 billion in revenue this quarter, an increase of 5% for the quarter in constant currency,” said Satya Nadella, CEO at Microsoft. “This past year was pivotal in both our own transformation and in partnering with our customers who are navigating their own digital transformations. The Microsoft Cloud is seeing significant customer momentum and we’re well positioned to reach new opportunities in the year ahead.”

Cloud computing has once again brought Microsoft to the forefront of the technology industry following a challenging couple of years. The transition from software company to cloud computing brand appears to be being successfully navigated, though there were a few missteps along the way, most notably the team's foray into mobile. Microsoft is moving towards the position of ‘mega-vendor’, infiltrating almost all aspects of an organization (cloud, hardware, social, databases etc.) to make itself an indispensable part of a CIO's roster.

The Intelligent Cloud unit continues as the focal point of the company’s growth strategy, as Nadella claims nearly 60% of the Fortune 500 companies use at least three of the company’s cloud offerings, generating more than $12 billion in Commercial Cloud annualized revenue run rate.

“Companies looking to digitally transform need a trusted cloud partner and turn to Microsoft,” said Nadella. “As a result, Azure revenue and usage again grew by more than 100% this quarter. We see customers choose Microsoft for three reasons. They want a cloud provider that offers solutions that reflect the realities of today’s world and their enterprise-grade needs. They want higher level services to drive digital transformation, and they want a cloud open to developers of all types.”

AI has previously been positioned as one of the cornerstones of growth for the company, and this was reinforced during the earnings call, as Nadella highlighted it as a component of the Intelligent Cloud business unit. The Cortana Intelligence Suite, formerly known as the Cortana Analytics Suite, is built on the company's on-going research into big data, machine learning, perception, analytics and intelligent bots. The offering allows developers to build apps and bots which not only interact with customers in a personalized way, but also react to real-world developments in real time.

“Just yesterday, we announced Boeing will use Azure, our IoT suite, and Cortana Intelligence to drive digital transformation in commercial aviation, with connected airline systems optimization, predictive maintenance, and much more,” said Nadella. “This builds on great momentum in IoT. This is great progress, but our ambitions are set even higher. Our Intelligent Cloud also enables cognitive services. Cortana Intelligence Suite offers machine learning capabilities and advanced predictive analytics.

“Central to our Intelligent Cloud ambition is providing developers with the tools and capabilities they need to build apps and services for the platforms and devices of their choice. The new Azure Container service as well as .NET Core 1.0 for open source and our ongoing work with companies such as Red Hat, Docker, and Mesosphere reflects significant progress on this front. We continue to see traction from open source, with nearly a third of customer virtual machines on Azure running Linux.”

The company exceeded analyst expectations for the quarter, which was reflected in pre-market trading as shares in the giant grew 4%. In terms of outlook for the next quarter, most business units are expected to be down a fraction on the Q2 reported figures, unsurprising considering the summer period. Intelligent Cloud is expected to bring in between $6.1-6.3 billion, Productivity and Business Processes $6.4-6.6 billion, and More Personal Computing $8.7-9 billion.

BT outage impacts 10% of customers in capital

BT has confirmed around 10% of its customers experienced an outage this morning, which has reportedly been linked to a power incident at the former Telecity LD8 site in London, now owned by Equinix, reports Telecoms.com.

BT first acknowledged the outage this morning on Twitter, which took down broadband services for a number of customers in the London area.

The LD8 data centre in London’s Docklands currently houses the London Internet Exchange (LINX), one of the world’s largest Internet Exchanges with more than 700 members which include ISPs such as BT and Virgin Media, as well as content providers.

“We’re sorry that some BT and Plusnet customers experienced problems accessing some internet services this morning,” said a BT spokesperson. “Around 10% of customers’ internet usage was affected following power issues at one of our internet connection partners’ sites in London. The issue has now been fixed and services have been restored.”

While the statement said the problem was limited to London, BT's service status page indicates dozens of cities and towns across the UK experienced issues. These service challenges have not, to date, been directly linked to the same incident.

The LD8 data centre has been under the control of Equinix for a matter of months, since the US company acquired Telecity for $3.8 billion. Equinix claims it is now the largest retail colocation provider in Europe and globally after the deal added 34 data centres to its portfolio, though eight assets had to be off-loaded to satisfy the European Commission's competition authorities.

“Equinix can confirm that we experienced a brief outage at the former Telecity LD8 site in London earlier this morning,” said an Equinix spokesperson. “This impacted a limited number of customers, however service was restored within minutes. Equinix engineers are on site and actively working with customers to minimise the impact.”

During email exchanges with Telecoms.com, neither BT nor Equinix named the other party, though this is understandable given the sensitivity of the issue. Despite BT stating all services have been restored, at the time of writing the service status page lists dozens of towns and cities still experiencing problems. Although these have not been directly linked to the incident, as long as service problems continue BT is likely to face a mounting customer service challenge.

IBM makes cloud progress but reports another quarterly decline

IBM revenues continued to fall for a 17th consecutive quarter despite beating analyst expectations and demonstrating healthy growth in its cloud and data business units, reports Telecoms.com.

The company reported a drop in revenues for Q2 of 2.8% to $20.24 billion, though this was an improvement on analyst expectations of $20.03 billion, encouraging shares to rise 2.6% to $164 after hours. The business units which the company deems strategic imperatives (cloud, analytics and engagement) gained 12% year-on-year, though this wasn't enough to counter the impact of legacy technologies on reported earnings, which fell to $2.5 billion from $3.45 billion in 2015. Overall, revenues are now roughly 25% lower than the numbers reported in 2011.

“We continued to deliver double-digit revenue growth in our strategic imperatives,” said CFO Martin Schroeter on the company’s earnings call this week. “Over the last 12 months, strategic imperatives delivered $31 billion in revenue, and now represent 38% of IBM.

“Growth was led by cloud, where our revenue was up 30% to $3.4 billion in the quarter, and over $11.5 billion over the last year so good progress in cloud. Looking at revenue from a segment perspective, the strongest growth came from cognitive solutions led by our analytics and cognitive capabilities and security.”

Schroeter was keen to emphasise the impact Watson is having on the business, as the team continue its journey to redefine Big Blue in the age of cloud computing. Numerous customers were listed as wins for IBM in the cognitive computing sector, as IBM continues to champion Watson as a platform to bring together the digital business with digital intelligence to improve decision-making and add intelligence to products and processes. Watson will continue to be the jewel in the crown of Big Blue as the company moves towards the new digital era.

Despite revenues continuing to fall, the team made a number of positive launches throughout the quarter. Quantum computing is now available on the IBM cloud, a new partnership with Box was launched to counter the impact of EU-US Privacy Shield on its international business, and a broadened partnership with VMware extended the reach of its security portfolio.

In terms of the specific segments, revenues in the cognitive team rose 4%, though this is down from 9% growth in the previous quarter. Solutions software revenue was up 6% for the quarter, SaaS was another area recording triple-digit growth, and Schroeter claims IBM's security business outperformed the market three times over. The IBM interactive experience unit also demonstrated healthy growth as the team continues its journey into an entirely new market for Big Blue.

“We have opened over 30 digital studios around the globe including new studios in Singapore and Seoul,” said Schroeter. “We also completed the acquisition of Aperto, a digital agency in Berlin with over 300 employees and a roster of enterprise clients such as Airbus and Siemens.”

One area which has caught the headlines in recent weeks is the impact of Brexit on the fortunes of the technology sector. Despite concerns from various corners of the industry, it would not appear to have had a significant impact on the long-term vision of IBM.

“I don’t think that Brexit coming at the end of the quarter helped us at all, but we obviously finished kind of right where we expected to finish,” said Schroeter. “And when we look at our full view of the year, we don’t see an impact, if you will, that has any real materiality on us.

“What I typically observe in these kinds of instances is that our discussions with our clients have to go through a process of reprioritization. So as they reprioritize, the length of time that takes depends a lot on how much uncertainty they’re faced with. And obviously, the political leadership in Europe and the UK can help reduce that uncertainty, but we didn’t see – again, we don’t think it helped but it didn’t cause us to change our guidance.”

While revenues have continued to fall for the tech giant, it would appear to be heading in the right direction. The strategic imperatives business units now account for a larger proportion of the overall figures, at 38%, indicating the tide may be turning for IBM. Schroeter also highlighted the team is not content to rely solely on the progress of Watson, as IBM has acquired 20 companies in the last twelve months, which are now beginning to contribute in a more significant manner.

Although progress is starting to show, it has not been an entirely smooth ride for IBM. There have been numerous new product launches and advances into new market segments, though this has come at a cost of more than 70,000 redundancies over recent months. And while there has been a slight increase in share price following the announcement, past performance has weighed on IBM: shares in Big Blue have dropped 17% since CEO Virginia Rometty took over in January 2012, while the S&P 500 index rose 70% over the same period.

Security still viewed as a barrier to progress – Dell

A recent survey from Dell demonstrates security is still seen as a hindrance to innovation as companies aim to develop a more digitally orientated proposition for the market, reports Telecoms.com.

While a substantial 89% of the respondents highlighted their organization was in the middle of a digital transformation project, 76% agree security is brought into the equation too late in the development process, with 85% saying they actively avoid bringing security experts in due to the belief they will slow or even scupper the project.

“This survey produced some eye-opening results and reinforces what we’ve been hearing directly from our customers,” said John Milburn, GM of One Identity Products at Dell. “Organisations face challenges securing their digital transformations and recognise that their current security measures are exposing the business to risk.

Security has been one of the biggest talking points within the telecommunications and technology industry, generally due to a lack of understanding. Until recently, security challenges would appear to have been pushed to the side as there have not been any clear routes to success. It would seem companies are not willing to allow security concerns to stop progress, instead aiming to secure products retrospectively.

The survey demonstrates attitudes towards security are still relatively negligent. While numerous CEOs and board members have stated security sits at the top of the agenda, surveys such as this tell a different story, much to the disappointment of security professionals and vendors alike. One conclusion which could be drawn is that security is still considered a barrier to innovation. In fact, 37% of respondents agreed with the statement "it is likely that the security team will delay or block a new initiative presented to us today", and 49% agreed with "our security team does have a reputation for blocking projects based on the past, but now we do a better job of enabling the business".

“Our goal is to provide our customers with solutions that address these needs. When done right, security can enable organisations to aggressively adopt new technologies and practices that can have a direct, positive impact on revenue, profits, employee productivity and the customer experience. Done right, security also helps CISOs open their own ‘Department of Yes,’ empowering them to deliver the strategic projects and innovative initiatives that drive businesses forward.”

Security is, and will continue to be, a paramount facet of any organization, though the implications of this survey suggest there is still some way to go before organizations could consider themselves secure. One encouraging finding is that 91% of respondents agreed the security team could do a better job if given more resources. What is unclear is whether CEOs and other board members will follow up on the promise that security will receive more investment.

AT&T expands NFV and SDN offering worldwide

AT&T has expanded its Network on Demand solutions to now cover 76 countries around the world, reports Telecoms.com.

The service is built on the company's software-defined network technology, and is claimed to help businesses deploy a single universal piece of equipment, choose virtualized network functions and set them up in different countries. The service would appear to be designed to simplify the process of buying and adding network functions, reducing customers' reliance on hardware.

“Building networks by deploying network functions in software is a major shift in network design,” said Ralph de la Vega, CEO of AT&T Business Solutions and International. “We’ve broken through traditional, cost-prohibitive barriers. Our software platform delivers a simple, flexible and efficient experience for any business, virtually anywhere and anytime they need it.”

The service was initially launched in 2015, with AT&T claiming it now has more than 1,200 businesses signed up to the service. 76 countries are now supported by the service, with capabilities including Juniper Networks virtual routing, Cisco virtual router, Fortinet virtual security, and Riverbed virtual WAN optimisation. The service is the third the company has launched on the SDN platform.
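To make the "choose virtualized functions per country" idea concrete, here is a purely hypothetical sketch in Python. The catalogue entries mirror the capabilities listed above, but the data structure and the `provision` function are illustrative inventions, not AT&T's actual API or product structure:

```python
# Hypothetical catalogue of virtual network functions (VNFs), grouped
# by category, echoing the capabilities named in the article.
CATALOG = {
    "routing": ["Juniper Networks virtual routing", "Cisco virtual router"],
    "security": ["Fortinet virtual security"],
    "wan_opt": ["Riverbed virtual WAN optimisation"],
}

def provision(site_country, functions):
    """Select VNFs from the catalogue for one site; no hardware swap needed."""
    selected = []
    for category, choice in functions.items():
        if choice not in CATALOG.get(category, []):
            raise ValueError(f"{choice!r} not offered for {category}")
        selected.append(choice)
    return {"country": site_country, "vnfs": selected}

# One universal box at a German site, with routing and security in software
site = provision("DE", {"routing": "Cisco virtual router",
                        "security": "Fortinet virtual security"})
print(site)
```

The point of the model is that adding a function to a site is a catalogue selection rather than a hardware shipment, which is the reduction in hardware reliance the article describes.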

The launch builds on wider trends within the industry, as telcos aim to utilize the flexibility and speed of SDN and NFV to recoup lost revenues. Traditional revenue streams of voice calls and text messaging have been slowly eroded in recent years as more customers switch to OTT services such as WhatsApp. Creating new services for business customers is generally regarded as critical if the industry is to avoid being relegated to the status of a utility.

It would appear to have been a busy couple of weeks for the AT&T team, which also made a couple of announcements last week. On the enterprise side, the team announced it was adding faster internet speeds, up to 1 Gbps, for business customers using the AT&T Business Fiber service. On the consumer side, AT&T announced it has reached the trial phase of its national drone programme, which focuses on how AT&T customers can benefit from drone-based solutions, including enhanced LTE wireless coverage.

Are cyber attacks covering up server inadequacies at Pokémon Go?

Pokémon Go users have continued to struggle as the app's developer Niantic Labs recovers from hacker attacks and unprecedented demand for the game, reports Telecoms.com.

Claimed attacks from various hacker groups would appear to have covered up server inadequacies at Niantic Labs, as the team seemingly struggles to meet capacity demands following the game's launch in 27 countries worldwide.

Over the course of the weekend, various hacker groups including PoodleCorp and OurMine claimed responsibility for distributed denial of service (DDoS) attacks, causing a slow and clunky experience for many players around the world. Although the Niantic Labs team has played down the incidents, disruptions continued into Monday morning, with the Telecoms.com editorial team unable to access the game effectively. Whether this can be attributed to the claimed attacks or a lack of server capacity is unclear for the moment.

The hacker saga would appear to have started over the weekend, with OurMine stating on its website, “Today We will attack “Pokemon Go” Login Servers! so no one will be able to play this game till Pokemon Go contact us on our website to teach them how to protect it! We will attack it after 3-4 hours! Be ready! We will update you!” This was followed by another statement declaring the servers were down. PoodleCorp had claimed the day before (July 16) that it caused an outage, though it also said to expect a larger attack in the near future.

While both of these attacks have attracted headlines, it would also appear to have covered up shortcomings on the company’s infrastructure and its ability to deal with high demand. The launch of Pokémon Go has been well documented over the last few weeks as it has been lauded by numerous sources as the biggest mobile game in US history. Even before its official release in the UK, EE announced it saw 350,000 unique users of Pokémon GO on its network.

“This is the fastest take up of an app or game we’ve ever seen – and that’s before it’s officially launched! People across the country are going to be relying on a mobile data network that’s everywhere they go,” said Matt Stagg, EE head of video and content strategy.

Despite claims the server problems have been addressed, complaints have continued to be voiced. Server status tracking website Downdetector stated 39,013 complaints were registered at 22.00 (EST) on July 17. The Niantic Labs team are seemingly underestimating demand for Pokémon Go with each launch, which would be a nice problem to have.

While Telecoms.com was unable to identify Niantic Labs' specific cloud set-up, other reports have identified Google as the chosen platform. Although there are no specific announcements linking the two organizations, Niantic was spun out of Google in October last year, and currently has John Hanke at the helm, previously VP of Product Management for Google's Geo division, which includes Google Earth, Google Maps and StreetView. A job vacancy on the company's website also asks for experience with Google Cloud or AWS.

Although AWS is listed in the job vacancy, it would be fair to assume it is not currently involved, as CTO Werner Vogels couldn't resist a joke about the affair, stating “Dear cool folks at @NianticLabs please let us know if there is anything we can do to help!” on his Twitter account. This could imply some insider knowledge from Vogels, as the company would be most likely to take a swipe at its closest rivals in the public cloud market, namely Google and Microsoft Azure.

The claims of DDoS attacks would appear to have come at an opportune time, as they have taken the heat off the cloud infrastructure inadequacies. According to Business Insider, Hanke said the international roll-out of the game would be “paused until we're comfortable” in relation to the server capacity issues. Yet launches and server issues continued following that interview, so it would seem the company is prepared to ride the wave of demand, as well as complaints, and fix the server problem later.

Privacy Shield rubber stamped amid dissent

The European Commission has formally adopted the controversial ‘Privacy Shield’ framework intended to replace the previous Safe Harbour agreement, reports Telecoms.com.

Both schemes cover the transfer of data between the EU and the US, with the balance between the free movement of data and the protection of individuals a tricky one to strike. Privacy Shield has many critics who fear it does little to address the issues that felled Safe Harbour. In spite of that, the EC has decided to plough forward as anticipated.

“We have approved the new EU-US Privacy Shield today,” said Andrus Ansip, Commission VP for the Digital Single Market. “It will protect the personal data of our people and provide clarity for businesses. We have worked hard with all our partners in Europe and in the US to get this deal right and to have it done as soon as possible. Data flows between our two continents are essential to our society and economy – we now have a robust framework ensuring these transfers take place in the best and safest conditions.”

“The EU-U.S. Privacy Shield is a robust new system to protect the personal data of Europeans and ensure legal certainty for businesses,” said Věra Jourová, Commissioner for Justice, Consumers and Gender Equality. “It brings stronger data protection standards that are better enforced, safeguards on government access, and easier redress for individuals in case of complaints. The new framework will restore the trust of consumers when their data is transferred across the Atlantic. We have worked together with the European data protection authorities, the European Parliament, the Member States and our U.S. counterparts to put in place an arrangement with the highest standards to protect Europeans’ personal data”.

Not everyone in Brussels was convinced, however. “The Commission has today signed a blank cheque for the transfer of personal data of EU citizens to the US, without delivering equivalent data protection rights,” said the Green Party MEP Jan Philipp Albrecht. “The ‘Privacy Shield’ framework does not seem to address the concerns outlined by the European Court of Justice in ruling the Safe Harbour decision illegal. In particular the individual rights of consumers are still too weak and blanket surveillance measures are still in place. In this context, the Commission should not be simply accepting reassurances from the US authorities but should be insisting on improvements in the data protection guaranteed to European consumers.

“The European Parliament already underlined concerns about the lack of general data protection provisions in the US when the initial Safe Harbour decision was concluded in 2000. Independent data protection authorities are still lacking in the US. EU justice commissioner Jourova must now make clear that, once the EU’s new General Data Protection Regulation enter into force in 2018, there will also be a need to revise the Privacy Shield decision.”

Elodie Dowling, VP, EMEA General Counsel at BMC Software reckons there’s still plenty of work to do. “Following negotiations between EU and US officials, the formal adoption of Privacy Shield has officially started today in the EU’s 28 member states,” said Dowling. “Starting August 1, it will then be for businesses across the US and the EU to innovate and comply around this in order to create a culture of trust amongst their customers.

 

“However, with the ongoing discussions generated throughout the negotiation period, it’s unlikely that the official adoption of the Privacy Shield closes the loophole completely. For example, it remains unclear the type of ‘assurances’ the US has provided to the EU to ensure mass surveillance does not apply or, if it does, that it happens in a transparent and framed manner for EU citizens. Surely this particular item is going to be carefully considered by data privacy activists.”

EU moves forward with Privacy Shield despite EDPS warning

The European Commission has announced it will continue ahead with the EU-US Privacy Shield despite the European Data Protection Supervisor claiming the pact is not robust enough, reports Telecoms.com.

Since Safe Harbour was struck down by the European Court of Justice last year, the industry has been in limbo as politicians were unable to draft an agreement between the US and EU which met the criteria for data protection in the European market. In May, the European Data Protection Supervisor, Giovanni Buttarelli, outlined his concerns over whether the proposed agreement would provide adequate protection against indiscriminate surveillance, believing the pact would not be strong enough to stand up.

“Today Member States have given their strong support to the EU-U.S. Privacy Shield, the renewed safe framework for transatlantic data flows,” said Vice-President Andrus Ansip and Commissioner Věra Jourová in a joint statement. “This paves the way for the formal adoption of the legal texts and for getting the EU-U.S. Privacy Shield up and running. The EU-U.S. Privacy Shield will ensure a high level of protection for individuals and legal certainty for business.”

Despite the European Commission pushing forward with the draft, there have been a number of individuals and parties within the EU who have criticised the agreement. For some, the EU-US Privacy Shield is simply a reheated Safe Harbour, with very little to address the concerns of the original agreement.

The Article 29 Working Party, another influential group, has acknowledged to the industry that the pact has made progress, though it identified a number of shortcomings around mass surveillance and oversight. The new agreement does encourage organizations to be more considered and conservative when sharing data with the US, however critics have claimed there are still too many exceptions through which the US and its intelligence agencies can work around the agreement. Despite the concerns, the European Commission has ploughed ahead.

On the other side of the argument, Microsoft has somewhat unsurprisingly confirmed its support of the pact, though it has stated it should go further. In any case, a large vendor expressing its support for an agreement which would enable the organization to do more business in Europe should not be met with astonishment.

“It is fundamentally different from the old ‘Safe Harbour’: It imposes clear and strong obligations on companies handling the data and makes sure that these rules are followed and enforced in practice,” said the announcement. “For the first time, the U.S. has given the EU written assurance that the access of public authorities for law enforcement and national security will be subject to clear limitations, safeguards and oversight mechanisms and has ruled out indiscriminate mass surveillance of European citizens’ data.”

“And last but not least the Privacy Shield protects fundamental rights and provides for several accessible and affordable redress mechanisms. During the formal adoption process, the Commission has consulted as broadly as possible taking on board the input of key stakeholders, notably the independent data protection authorities and the European Parliament. Both consumers and companies can have full confidence in the new arrangement, which reflects the requirements of the European Court of Justice. Today’s vote by the Member States is a strong sign of confidence.”

It would appear the European Commission is moving forward to demonstrate to the industry that progress is being made, though this could be seen as a flimsy approach. Given the concerns expressed by influential and respected bodies within the industry, it should not come as a surprise if the agreement is once again struck down by the European Court of Justice.

The cloud is a utility, and we’re fine with that – AWS

While the telco industry is fighting to avoid being relegated to the status of a utility, AWS has already accepted cloud computing is commoditised, reports Telecoms.com.

As cloud as a concept continues to become normalized within the business world, the number of competitors is growing day by day. AWS would generally still be considered the leader in the market, though progress from Microsoft and Google, as well as a number of new players entering the space, has slightly eroded this dominant position. According to Brendan Bouffler, AWS' lead for the team responsible for developing the scientific computing segment, the prospect of cloud becoming a utility does not bother the market leader.

“It already is,” said Bouffler. “You can move in and out of our cloud whenever you like. There’s no long term commitment as our standard terms and conditions last for an hour. You can sign up for an hour and then move out. We see it all the time. We’re constantly holding our feet to the fire and forcing ourselves to innovate, that’s how we keep customers.”

Within the telco industry, operators are fighting against the tide to prevent the business being classed in the same bracket as utilities. Competing on price and constantly attempting to undercut challengers is not a battleground the industry wants to operate in. The telcos would rather compete on value-added services and brand equity, though Bouffler believes there is enough untapped business in the cloud market for the utility model to be successful.

Estimates of the value of the global cloud computing market vary, though Statista believes it is worth in the region of $114 billion this year. Should AWS continue its healthy start to 2016, it will account for $10 billion of that. By 2020 the market is predicted to grow to roughly $159 billion, offering plenty of opportunity for competitors to establish themselves, and for AWS to continue its growth.

“Running a company like a hardware vendor does where they are looking for high margins is a legitimate business model, but ours is different to that,” said Bouffler. “We’re a high volume, low margin business and it’s successful for us. It was pretty successful in disrupting the retail industry in the way books were sold. As a consumer of books, I’m in awe of that. You can put books in the hands of people for almost pennies. We democratized reading and we’re going to do the same for cloud.”

Bouffler believes the disruptive nature of Amazon and AWS is fuelling future growth within the business itself. Competing on price is not a worry for the team, as this was the origins of the Amazon book business. Amazon was launched in 1994 and shook up the retail book industry. It drove down prices, opened up new distribution channels and created an entirely new way of consuming literature. Bouffler believes the same is being done for computing.

Although the telco industry is concerned about the direction in which it is heading, the potential for growth within the cloud computing industry means being classed as a utility is not necessarily a terrible fate for AWS. While some organizations would like to create an industry with higher margins, Bouffler believes the origins of Amazon, the disruptive nature of the business and its experience operating in a low-margin, high-volume environment put the company in a strong position to compete and succeed in a utility market.

“This is only the tip of the iceberg,” said Bouffler. “Some of our customers are people doing something they wouldn’t have usually done without cloud computing. It wasn’t that they were substituting for money which would have been spent on a hardware cluster, these are projects that weren’t going to happen. This is net new stuff. This whole net new universe is still in front of us, I think we’re only just scratching the surface.

“It’s incredibly sustainable. Even though we’re a low margin business and a high volume business we’re good with that. We’ve been doing this since Amazon came into business (22 years ago), and the model is still working. I think there is still tons to be done before anyone writes obituaries about that business model.”