All posts by Jamie Davies

Machine Vision 5G use case demonstrated by Ericsson and Vodafone

Ericsson and Vodafone have successfully demonstrated another 5G Proof of Concept, this time focusing on a Machine Vision (MV) application, reports Telecoms.com.

The team created a 5G Smart Network Edge prototype, including a 5G-ready core, and demonstrated the benefits of network slicing and distributed cloud technology for MV. Making the announcement at the Innovation Days at Ericsson’s R&D Center in Aachen, the team demonstrated how the 5G Smart Network Edge enables much greater efficiency for industry. Thanks to reduced network latency, the recognition rate of a cloud-based face detection application increased. The PoC also confirmed data could be stored locally, decreasing the risk of breaches, loss or unauthorized access.

“Within only 3 months we created a 5G Smart Network Edge prototype by connecting our labs,” said Sonja Graf, Head of Vodafone Innovation Park at Vodafone Germany. “The Face Recognition use case is just one example demonstrating how 5G will meet the diverse needs of a wide range of industries.”

While MV is not a new concept for the industry, it is becoming increasingly commonplace for quality assurance, inspection and industrial robot guidance processes in the manufacturing industry. Examples of MV include wood quality inspection, robot guidance, checking the orientation of components and reading serial numbers. Actions off the back of the inspection can be automated, opening the door for artificial intelligence in the manufacturing industry.

“We are delighted that the Ericsson and Vodafone labs have come together to innovate and this first use case shows an excellent example of how 5G can enable industries to become more efficient as well as more secure and cost effective,” said Valter D’Avino, Head of Ericsson Western & Central Europe.

Google’s trans-Pacific submarine cable enters into service today

A consortium of tech giants including Google and NEC has completed the construction and end-to-end testing of a new trans-Pacific submarine cable system, reports Telecoms.com.

The 9,000km FASTER Cable System enters service today (30 June) and is claimed to be the first cable system designed from the outset to support digital coherent transmission technology, using optimized fibers throughout the submarine portion. The cable system lands in Oregon in the United States and at two points in Japan, Chiba and Mie. The team claims the cable will be able to deliver 60 Terabits per second (Tbps) of bandwidth across the Pacific.

“From the very beginning of the project, we repeatedly said to each other, ‘faster, Faster and FASTER,’ and at one point it became the project name, and today it becomes a reality,” said Hiromitsu Todokoro, Chairman of the FASTER Management Committee. “This is the outcome of six members’ collaborative contribution and expertise together with NEC’s support.”

The consortium includes China Mobile International, China Telecom Global, Global Transit, Google, KDDI and Singtel, of which Google has been one of the most vocal. On the official blog, Google said the new cable will help the team launch a new Google Cloud Platform East Asia region in Tokyo.

The new data centre in Tokyo is part of Google’s ambitions to dominate cloud computing and other enterprise service offerings. While it is generally considered to be ranked third in the public cloud stakes, with AWS and Microsoft Azure out ahead, it has been making strides in recent months. Alongside the Tokyo data centre launch, another was opened in Oregon, and there are plans for a further ten over the course of 2017.

Google has been investing in submarine cables since 2008, initially with the 7.68Tbps trans-Pacific Unity cable, which came online in 2010. The completion of the project now takes the number of Google-owned undersea cables up to four, though there are likely to be more added in the coming years.

“Today, Google’s latest investment in long-haul undersea fibre optic cabling comes online: the FASTER Cable System gives Google access to up to 10Tbps (Terabits per second) of the cable’s total 60Tbps bandwidth between the US and Japan,” said Alan Chin-Lun Cheung of Google Submarine Networking Infrastructure.

“We’ll use this capacity to support our users, including Google Apps and Cloud Platform customers. This is the highest-capacity undersea cable ever built — about ten million times faster than your average cable modem — and we’re beaming light through it starting today.”
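As a sanity check on that comparison, the arithmetic is simple: dividing the cable's total capacity by ten million yields the "average cable modem" speed the quote implies. The baseline below is derived from the quote, not a figure Google has stated.

```python
# Back-of-the-envelope check on the "ten million times faster" claim.
cable_capacity_bps = 60e12        # 60 Tbps total FASTER capacity
speedup_factor = 10_000_000       # "about ten million times faster"

# Implied baseline: the "average cable modem" speed this comparison assumes.
implied_modem_bps = cable_capacity_bps / speedup_factor
print(f"Implied cable modem speed: {implied_modem_bps / 1e6:.0f} Mbps")  # 6 Mbps

# Google's own 10 Tbps share is still roughly 1.7 million times that baseline.
google_share_bps = 10e12
print(f"Google share vs modem: {google_share_bps / implied_modem_bps:,.0f}x")
```

A 6 Mbps baseline is plausible for the consumer connections of the period, so the headline comparison holds together.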

Ericsson claims a world first with transcontinental 5G trial

Ericsson, Deutsche Telekom and SK Telecom have announced a partnership to deploy the world’s first transcontinental 5G trial network, reports Telecoms.com.

The objective of the agreement will be to provide optimized end-user experiences by providing consistent quality of services and roaming experiences for advanced 5G use cases with enhanced global reach. Ericsson will act as the sole supplier to the project, which will include technologies such as NFV, software defined infrastructure, distributed cloud, and network slicing.

Last October, Ericsson and SK Telecom conducted a successful demonstration of network slicing technology, which featured the creation of virtual network slices optimized for services including super multi-view and augmented reality/virtual reality, Internet of Things offerings and enterprise solutions.

“5G is very different from its predecessors in that the system is built as a platform to provide tailored services optimized for individual customer’s needs, at a global scale,” said Alex Jinsung Choi, CTO at SK Telecom. “Through this three-party collaboration, we will be able to better understand and build a 5G system that can provide consistent and enhanced user experience across the globe.”

Alongside the announcement, Ericsson and SK Telecom also successfully completed a demonstration of 5G software-defined telecommunications infrastructure, using the vendor’s Hyperscale Datacenter System (HDS) 8000 solution. The pair claims this is a world first and that it will enable dynamic composition of network components to meet the scale requirements of 5G services.

Software-defined telecommunications infrastructure is one of the enablers of network slicing, which will allow operators to create individual virtualized environments optimized for specific users. The demonstration itself focused on two use cases: ultra-micro-network end-to-end (E2E) slicing for personalized services, and ultra-large-network E2E slicing for high-capacity processing.
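Conceptually, a network slice is a bundle of resource and quality-of-service targets the orchestrator uses when composing network components. The sketch below is purely illustrative; none of the field names or figures come from the Ericsson/SK Telecom demonstration.

```python
from dataclasses import dataclass

@dataclass
class SliceProfile:
    """Illustrative network-slice descriptor (hypothetical fields)."""
    name: str
    max_latency_ms: float    # end-to-end latency target
    throughput_gbps: float   # aggregate capacity target
    isolated: bool           # whether the slice gets dedicated resources

# The two demo use cases, expressed as contrasting profiles: a small,
# latency-sensitive personal slice versus a bulk high-capacity slice.
personal_slice = SliceProfile("ultra-micro-e2e", max_latency_ms=5.0,
                              throughput_gbps=0.1, isolated=True)
bulk_slice = SliceProfile("ultra-large-e2e", max_latency_ms=50.0,
                          throughput_gbps=100.0, isolated=False)

for s in (personal_slice, bulk_slice):
    print(f"{s.name}: <= {s.max_latency_ms} ms, {s.throughput_gbps} Gbps")
```

The point of the abstraction is that both profiles run on the same physical infrastructure; only the virtualized resource allocation differs.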

“SDTI is an innovative technology that enhances network efficiency by flexibly constructing hardware components to satisfy the infrastructure performance requirements of diverse 5G services,” said Park Jin-hyo, Head of Network Technology R&D Center at SK Telecom.

Finally, Ericsson has announced another partnership with Japanese telco KDDI with the ambition of delivering IoT on a global scale and providing enhanced connectivity services to KDDI customers.

The partnership will focus on Ericsson’s cloud-based IoT platform to deliver services such as IoT connectivity management, subscription management, network connectivity administration and flexible billing services. The pair claims the new proposition will enable KDDI’s customers to deploy, manage and scale IoT connected devices and applications globally.

IoT represents a significant opportunity for enterprise customers and operators alike, as it significantly increases both the amount of data available and the number of access points to customers worldwide. Research firm Statista estimates the number of connected devices worldwide could exceed 50 billion, though definitions of what counts as an IoT-connected device vary.

“KDDI has for a long time been committed to building the communication environment to connect with world operators in order to support the global businesses of our customers,” said Keiichi Mori, GM of KDDI’s IoT Business Development Division. “We believe that by adopting DCP, we will be able to leverage Ericsson’s connection with world carriers and furthermore promote our unified service deployment globally to customers as they start worldwide IoT deployments.”

Storage Wars: Cloud vs. the Card for Storing Mobile Content

In May, Samsung announced what it describes as the world’s highest-capacity microSD card. The Samsung EVO+ 256GB microSD card has enough space to store more than 55,000 photos, 46 hours of HD video or 23,500 MP3 files and songs. It can be used for phones, tablets, video cameras and even drones. It’s set to be available in 50 countries worldwide.

The announcement of Samsung’s new card comes at a time when the amount of mobile content that consumers are creating, consuming and storing on their smartphones and mobile devices is increasing at an exponential rate. The growing number of connected devices with advanced features, including high-resolution cameras, 4K video filming and faster processors, is fuelling a global ‘content explosion’. The content being created today is richer and heavier than ever, placing a growing strain on device storage capacities, which can put data at risk and impair the user experience.

Earlier this year, 451 Research and Synchronoss Technologies charted the growth of smartphone content and found that the average smartphone user now generates 911MB of new content every month. At this rate, a typical 16GB smartphone – which already has almost 11GB of user content on it – will fill up in less than two months. Given that a high proportion of smartphone owners have low-capacity devices – 31% have 16GB, and 57% have 32GB or smaller – many will (if they haven’t already) quickly find themselves having to make difficult decisions. At the moment, this means having to frequently remove photos, videos and apps to make room for new ones.
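The fill-up claim only adds up once the operating system and pre-installed apps are accounted for. A rough reconstruction follows; note that the ~3.5GB system footprint is an assumption for illustration, not a figure from the report.

```python
# Rough reconstruction of the 451 Research/Synchronoss fill-rate claim.
device_capacity_gb = 16.0
user_content_gb = 11.0          # "almost 11GB of user content"
system_footprint_gb = 3.5       # assumed OS + preinstalled apps (not from the report)
monthly_new_content_gb = 0.911  # 911MB of new content per month

free_gb = device_capacity_gb - user_content_gb - system_footprint_gb
months_until_full = free_gb / monthly_new_content_gb
print(f"{free_gb:.1f}GB free -> full in about {months_until_full:.1f} months")
```

With that assumed footprint only about 1.5GB remains free, which runs out in well under two months at the reported content rate, consistent with the claim.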

It’s also surprising that almost half of smartphone users have no off-device storage in place at all, despite the variety of storage options available. One option is a hardware solution like a memory card. Samsung claims its new microSD card delivers a seamless user experience when accessing, sharing and storing content between different devices (depending on compatibility, of course). Samsung’s suggested price for this experience is $250; however, there is another storage option for end-users: the cloud.

Cloud-based storage arguably provides a more flexible and secure method for end-users to back up, transfer and restore their precious content. A memory card, like a phone, can be damaged, lost or stolen. In contrast, the cloud is an ever-present secure repository that retains and restores consumers’ files, photos and media, even if they lose or damage their device or card. However, even in the US, the most mature market for consumer uptake of cloud storage services, more than half of smartphone users are not currently using the cloud to manage their smartphone content.

But why should operators care? 

Subscriber loyalty to operators is being tested. Rather than receive a subsidised handset as part of a contract with an operator, growing numbers of people purchase their devices directly in a regular subscription agreement with the manufacturer instead. Rather than commit to a long-term contract, these consumers enter into no-obligation rolling connectivity-only agreements with their operator.

Offering consumers access to a personal cloud platform is an important opportunity for operators to re-engage with these consumers and keep them tied to their services. Helping subscribers manage the spiralling volumes of their content could be much more effective for operators than faddish offers and promotional bundles to keep subscribers connected to their brand and their ecosystem.

There is already plenty of cloud competition in the market, such as Google Drive, iCloud, Dropbox and Box. However, hosted storage and access has the potential to be much more than a “me too” play for operators, or even an answer to churn.

Cloud services can be a viable revenue generator for operators in their own right. They equip operators with an attractive channel for brand partnerships and developers to reach subscribers with an expanded ecosystem of services. Considerable productivity and profitability benefits can also be found, including reducing time on device-to-device content transfer and freeing up operators’ in-store staff for more in-depth customer engagement.

Operators shouldn’t approach the provision of cloud technology with unease. After all, their core business is all about providing secure wireless transport for voice and, increasingly, data: quickly, at scale, and to a wide range of mobile phones and other connected devices. Cloud storage and access is the natural extension of this business. Of course, given the current climate of heightened awareness around privacy and security, it’s crucial to work with a vendor with a strong track record. However, operators should realise they’re in a stronger position than they think when it comes to providing cloud services.

Written by Ted Woodbery, VP, Marketing & Product Strategy at Synchronoss Technologies

What does Clinton have in store for the tech industry?

Hillary Clinton has recently released her campaign promises for the technology sector should she be elected as President Obama’s successor in November, reports Telecoms.com.

The technology agenda focused on a vast and varied number of issues within the technology industry, including the digital job-front, universal high-speed internet for the US, data transmission across jurisdictions, technological innovation and the adoption of technology in government. Although the statement does indicate a strong stance on moving technology to the top of the political agenda, there does seem to be an element of ‘buzzword chasing’ to gain the support of the country’s tech giants.

“Today’s dynamic and competitive global economy demands an ambitious national commitment to technology, innovation and entrepreneurship,” the statement read. “America led the world in the internet revolution, and, today, technology and the internet are transforming nearly every sector of our economy—from manufacturing and transportation, to energy and healthcare.”

But what did we learn about America’s technology future?

Focus on 5G and new technologies

One of the more prominent buzzwords through the beginning of 2016 has been 5G, as it is seemingly the go-to phrase for the majority of new product launches and marketing campaigns. The Clinton campaign has aligned itself with the buzz in committing to deploying 5G networks (though no timeframe is given), as well as opening up opportunities for a variety of next-gen technologies.

“Widely deployed 5G networks, and new unlicensed and shared spectrum technologies, are essential platforms that will support the Internet of Things, smart factories, driverless cars, and much more—developments with enormous potential to create jobs and improve people’s lives,” the statement said.

The deployment of 5G has been split into two separate areas. Firstly, the use of the spectrum will be reviewed with the intention of identifying underutilized bands, including those reserved for the government, and reallocating to improve the speed of deployment. Secondly, government research grants will be awarded to various vendors to advance wireless and data technologies which are directed towards social priorities including healthcare, the environment, public safety and social welfare.

A recent report from Ovum highlighted that the US is on the right track for the deployment of 5G, as the team believes it will be one of the leading countries for the technology. Ovum analysts predict there will be at least 24 million 5G subscribers by the end of 2021, of which 40% will be located in North America.

Data Transmission between US and EU

From a data transmission perspective, the Clinton team is seemingly taking offence at the European Court of Justice’s decision to strike down Safe Harbour, and at the varied reception for the EU-US Privacy Shield. It would appear the Clinton team is under the assumption the deal between the EU and US was struck down for economic reasons, as opposed to data protection.

“The power of the internet is in part its global nature. Yet increasing numbers of countries have closed off their digital borders or are insisting on “data localization” to attempt to maintain control or unfairly advantage their own companies,” the statement said. “When Hillary was Secretary of State, the United States led the world in safeguarding the free flow of information including through the adoption by the OECD countries of the first Internet Policymaking Principles.

“Hillary supports efforts such as the U.S.-EU Privacy Shield to find alignment in national data privacy laws and protect data movement across borders. And she will promote the free flow of information in international fora.”

While it could be considered encouraging that the mission of the Clinton team is to open up the channels between the two regions again, it does seem to have missed the point of why the agreement was shot down in the first place. The statement seemingly implies EU countries refused the agreement on the grounds of promoting the interests of EU countries, as opposed to privacy concerns and the US attitude to government agencies’ access to personal information.

Safe Harbour, the initial transatlantic agreement, was shot down last October, though its proposed successor has come under similar criticism. Only last month, the European Data Protection Supervisor, Giovanni Buttarelli, outlined concerns on whether the proposed agreement will provide adequate protection against indiscriminate surveillance as well as obligations on oversight, transparency, redress and data protection rights.

“I appreciate the efforts made to develop a solution to replace Safe Harbour but the Privacy Shield as it stands is not robust enough to withstand future legal scrutiny before the Court,” said Buttarelli. “Significant improvements are needed should the European Commission wish to adopt an adequacy decision, to respect the essence of key data protection principles with particular regard to necessity, proportionality and redress mechanisms. Moreover, it’s time to develop a longer term solution in the transatlantic dialogue.”

The Clinton team can continue to discuss changes to the transatlantic data transmission policy should they choose, however it is highly unlikely any positive moves are to be made until it gets to grips with the basic concerns of EU policy makers.

Access to Government Data

Currently, certain offices and data sets are accessible to the general public, and this is an area which would be expanded under a Clinton administration. The concept is a sound one: giving entrepreneurs and businesses access to the data could provide insight into how money could be saved or used more efficiently, or how new technologies could be implemented to improve the effectiveness of government. There could, however, be a downside.

“The Obama Administration broke new ground in making open and machine-readable the default for new government information, launching Data.gov and charging each agency with maintaining data as a valuable asset,” the statement said. “Hillary will continue and accelerate the Administration’s open data initiatives, including in areas such as health care, education, and criminal justice.”

The downside has the potential to ruin any politician. The program is opening the door for criticism from all sides, and will offer ammunition to any opposition.

Connecting American Citizens

One of the most focused points of the document was the country’s commitment to ensuring each household and business has the opportunity to be connected to high-speed broadband. While this could be considered an effective sound-bite for the party, it is not a new idea by any means. A recent report highlighted that a surprising number of Americans do not currently have access to broadband. Although it may be expected that those in rural communities would struggle at times, the report indicated that 27% of New York and 25% of Los Angeles residents would be classed in the “Urban Broadband Unconnected” category, which could be considered more unusual.

The Connect America Fund, the Rural Utilities Service Program and the Broadband Technology Opportunities Program are all well-established operations (the Rural Utilities Service Program has been around since 1935), and drums that previous presidents have banged as well. Clinton has said very little new here and has made little fresh commitment to the initiatives.

The team has, however, committed to a $25 billion Infrastructure Bank which will enable local authorities to apply for grants to make improvements. This is a new concept which Clinton plans to introduce, though the details of how it will be funded, what the criteria for application will be, and whether there are any stipulations on which vendors the money can be spent with, are not provided.

NTT makes play for IaaS

NTT Communications has announced the deployment of managed private cloud solutions to HPE and NTT customers in the US in a play for the IaaS market.

Although not hitting the headlines as regularly as competitors such as AWS, Google and Microsoft, NTT has been recognized in the IaaS market by Gartner, and already has a strong presence in the US market. Although noted as a niche player in the IaaS segment, NTT does offer two platforms to global customers: NTT Enterprise Cloud and Cloudn. Gartner has noted that NTT does little to differentiate itself from the rest of the market, though it does have a healthy ecosystem of partners to compensate.

The new proposition will enable joint customers of HPE and NTT to purchase the company’s IaaS portfolio solutions, including cloud migration services, data centre consolidation, managed infrastructure services, and disaster recovery-as-a-service. The NTT team claim it is one of HPE’s first service provider partners capable of providing managed private cloud environments using the new HPE Helion CloudSystem.

“NTT America, the U.S. subsidiary of NTT Com, provides flexible, agile and cost-effective private hybrid cloud solutions to the NTT Com and HPE customer base,” said Indranil Sengupta, Regional Vice President of Product Management at NTT America. “These solutions can be delivered at NTT Com data centres, customer premises or at third party data centres.

“The solution architecture allows customers to leverage their current investments and augment with additional services that they need to run their business efficiently. All of NTT Com’s cloud solutions focus on the five key considerations of security, compliance, migration, legacy integration and change management.”

While a niche player in the market, the move could represent a strategic win for the NTT team, which already has a healthy reputation in North America and a growing customer base. It would also be considered a timely move, as trends in the industry are leaning more towards multi-cloud propositions, where decision makers are more open to working with different cloud providers for different workloads and data sets.

5G will be commercially ready by 2021 – Ovum

Analyst firm Ovum has released its inaugural 5G Subscription Forecasts this week, which estimates there will be 24 million 5G subscriptions worldwide at the end of 2021, reports Telecoms.com.

The team at Ovum believes 5G commercial services will be a normalized aspect of the telco landscape by 2020, though this will be dominated by North America and Asia, which will account for as much as 80% of global 5G subscriptions by 2021. Europe will take only 10% of the pie, with the Middle East and Africa splitting the remaining 10%.

While 5G could be considered an advertising buzzword within the telco industry on the whole, the need for speed has been driven primarily by advancements in user device capabilities. Claims to be the first vendor to achieve 5G status are not uncommon; Nokia was the latest to make such a statement, claiming its 5G network stack, which combines radio access technology with a cloud-based packet core running on top of an AirFrame data centre platform, is the foundation of a commercial 5G architecture. This in itself is a bold claim, as 5G standards are yet to be fully ratified.

“The main use case for 5G through 2021 will be enhanced mobile broadband services, although fixed broadband services will also be supported, especially in the US,” said Mike Roberts, Ovum Practice Leader for carrier strategy and technology. “Over time 5G will support a host of use cases including Internet of Things and mission-critical communications, but Ovum does not believe those use cases will be supported by standardized 5G services through 2021.”

While a number of organizations have made announcements claiming to be 5G ready, many of these have been excluded from the research. Numerous operators have claimed they will launch 5G prior to the 2020 date stated in the research, though these services will generally be deployed on networks and devices which don’t comply with 5G standards. These examples are seemingly nothing more than advertising ploys playing on the excitement around 5G. Ovum defines a 5G subscription as an active connection to a 5G network via a 5G device, and 5G itself as a system based on and complying with 3GPP 5G standards.

In the UK, Ofcom has stated 5G services could be commercially available by 2020, though this claim has not been backed up by the team at Ovum. While it has not directly commented on the state of play in the UK, Ovum believes the majority of subscribers will be located in the US, Japan, China, and South Korea, countries where operators have set more aggressive deadlines for the introduction of 5G services.

One area which has not been cleared up is the future spectrum requirements of 5G. Use cases for the high speed networks have already been outlined, including varied cases such as scientific research to financial trading and weather monitoring, though how the supply of spectrum will be split for immediate and future needs is still unknown.

“5G must deliver a further step change in the capacity of wireless networks, over and above that currently being delivered by 4G,” said Steve Unger, Group Director at Ofcom, on the regulator’s website. “No network has infinite capacity, but we need to move closer to the ideal of there always being sufficient capacity to meet consumers’ needs.”

Cisco cracks open wallet for $293m CloudLock acquisition

Cisco has announced its intent to acquire cloud security company CloudLock in a $293 million deal which is expected to close in Q1 2017, writes Telecoms.com.

CloudLock specializes in cloud access security broker (CASB) technology which provides insight and analytics focused on user behaviour and sensitive data in the cloud. The move builds on Cisco’s ‘Security Everywhere’ strategy, its initiative designed to provide protection from the cloud to the network to the endpoint.

“As companies are migrating to the cloud, they need a technology partner that can accelerate that transition and deliver critical security capabilities for all their users, apps and data in a seamless way,” said Rob Salvagno, VP of Cisco Corporate Development. “CloudLock brings a unique cloud-native, platform and API-based approach to cloud security which allows them to build powerful security solutions that are easy to deploy and simple to manage.”

CASB technology is an aspect of cloud security which has caught the attention of a number of decision makers in recent months. When we spoke to Intel Security CTO Raj Samani earlier this month, he told us CASB solutions were set to be one of the largest talking points for the cloud security market segment in the next couple of years.

“Companies will find controls and measures to give them a level of trust in a vendor to ensure they can operate effectively,” said Samani at the time. “CASBs will be one of the biggest trends we’ll see in the next couple of years.”

“CASB is the ultimate business case, because you can do things faster and more efficiently – you can actually put an ROI and a TCO next to it.”

The purpose of CASB solutions is to sit between the cloud provider and the cloud consumer to consolidate multiple types of security policies, including authentication, single sign-on, encryption, tokenization and malware detection. The CASB solution offers a level of assurance for those customers who have concerns over the security provided by cloud providers themselves, as the concepts of security and risk vary depending on the company.

By integrating CASB solutions, a cloud consumer can dictate how many additional layers of security are placed on top of a cloud provider’s offering, to allow the cloud consumer to define their own risk profile. The concept of CASB on the whole could go some way to mitigate the security concerns for those companies who have not currently adopted the cloud for more sensitive workloads.
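Conceptually, a CASB behaves like a chain of policy checks interposed on every request between the cloud consumer and the provider. The sketch below is a minimal illustration of that idea only; the policy names and API are invented for this example and are not CloudLock's.

```python
# Minimal sketch of the CASB concept: a chain of policy checks applied to
# each request before it is forwarded to the cloud provider.
# All names here are invented for illustration.
from typing import Callable

Policy = Callable[[dict], bool]  # returns True if the request may proceed

def require_auth(request: dict) -> bool:
    # e.g. authentication / single sign-on enforcement
    return request.get("user") is not None

def block_sensitive_upload(request: dict) -> bool:
    # e.g. block (or tokenize) payloads flagged as sensitive
    return not (request.get("action") == "upload" and request.get("sensitive"))

class CasbProxy:
    """Sits between consumer and provider, applying every policy in turn."""
    def __init__(self, policies: list[Policy]):
        self.policies = policies

    def forward(self, request: dict) -> str:
        if all(policy(request) for policy in self.policies):
            return "forwarded to provider"
        return "blocked by CASB policy"

proxy = CasbProxy([require_auth, block_sensitive_upload])
print(proxy.forward({"user": "alice", "action": "read"}))                    # forwarded
print(proxy.forward({"user": "bob", "action": "upload", "sensitive": True}))  # blocked
```

The design point is that the consumer, not the provider, chooses which policies go into the chain, which is how CASB lets customers define their own risk profile on top of a provider's baseline security.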

Aside from this acquisition, Cisco has been bolstering its security capabilities through additional purchases, including Lancope for $452.5 million last December. Lancope helps customers monitor, detect, analyse and respond to modern threats on enterprise networks through continuous network visibility and threat analysis.

Cisco is one of a number of tech giants which have been forced to reconsider their primary focus, as cloud computing continues to tighten its grip on IT decision making. During the company’s most recent earnings call, IoT was outlined as a target growth segment, though the security business unit was prominent in the financials, growing 17% year-on-year.

Oracle and BT team up to conquer the cloud

Oracle has announced a new partnership with BT as the company continues its efforts to redefine its offering and penetrate the cloud computing market segment, reports Telecoms.com.

Through the new partnership, customers will be able to use several features of the BT Cloud Connect environment to gain direct connectivity to the Oracle Cloud. The offering will provide options for connectivity from hybrid enterprise data centres to the Oracle Cloud, of which there are currently 19 spread around the world.

“Direct and reliable access to data and applications hosted in cloud environments has become critical to organisations as they embark on their digital transformation journeys,” said Luis Alvarez, CEO of Global Services at BT. “We are accelerating our drive to be the world’s leading cloud services integrator and I am proud that BT is becoming the first global network services provider to offer direct access to the Oracle Cloud.”

Both companies have launched new initiatives to capitalize on the burgeoning cloud computing industry. BT’s Cloud of Clouds offering was launched last year in April as part of the company’s new technology roadmap to move customers onto a cloud platform. The Cloud of Clouds offering allows customers to integrate BT’s private, public and hybrid cloud services, as well as services from partners including AWS, Microsoft Azure, Salesforce and Cisco.

Oracle’s journey to the cloud has been a more varied experience, though the team would appear to be prioritizing the market segment for future growth. The tech giant was seemingly very sceptical over the implementation of cloud initially, as Oracle Executive Chairman Larry Ellison said in an analyst briefing in 2008, “The computer industry is the only industry which is more fashion driven than women’s fashion. I was reading W and it said that orange is the new pink. Cloud is the new SaaS.”

Since this comment the company has changed its direction, acquiring several cloud vendors to boost its position in the market. Oracle has however taken a slightly different approach from others in the industry, targeting organizations which have a vertical specific cloud offering. Opower, a company which provides customer engagement and energy efficiency cloud services to the utilities industry, was acquired for $532 million in May, and Textura, a provider of construction contracts and payment management cloud services, was bought for $663 million in April.

Although Oracle has been late to the party, the company has committed heavily to the new market. During the quarterly call earlier this month, Ellison claimed Oracle is in a strong position to grow in IaaS, having invested heavily in second-generation data centres. Telecoms.com readers would appear to agree with Ellison’s confidence: when we asked in a flash poll whether the company could break AWS, Microsoft and Google’s dominance in the IaaS market, 64% agreed it could in time.

Oracle has committed heavily to the cloud computing market in recent years after an initial period of denial, which could be linked back to the company’s reliance on revenue from non-cloud products. The partnership would appear to be a move to justify the company’s position in the cloud market, as Oracle leans on BT’s credibility to push its cloud offering to BT customers.

AWS expands footprint in India with new data centre

AWS has expanded its reach in the Asia Pacific region, opening two new Availability Zones in Mumbai and taking the global total to 35.

The company already has 75,000 customers in the country, which is one of the fastest growing economies worldwide. According to the CIA World Factbook, India is listed as the 12th fastest growing nation with a 7.3% real GDP growth rate, as well as a population growth rate of 1.22% per annum. The new region will support numerous services including Elastic Compute Cloud (EC2), as well as Elastic Block Store (EBS), Virtual Private Cloud, Auto Scaling, and Elastic Load Balancing.

“Indian start-ups and enterprises have been using AWS for many years – with most Indian technology start-ups building their entire businesses on AWS, and numerous enterprises running mission-critical, core applications on AWS,” said Andy Jassy, CEO of AWS. “These same 75,000 Indian customers, along with others anxious to start using AWS, have asked for an AWS India Region so they can move their applications that require low latency and data sovereignty.

“We’re excited to make this available today, with the same pay-as-you-go pricing, ability to get started immediately without having to negotiate enterprise agreements or wait days for access, and unmatched functionality that customers enjoy in AWS Regions worldwide – all of which allows customers to go from idea to launch faster than ever before was possible.”

Although India is one of the company’s fastest growing markets worldwide, AWS has been slower to market than its competitors. Last year, Microsoft brought online three cloud data centres in India for its Azure offering, and IBM opened its first data centre in Chennai for SoftLayer. Google is yet to gain traction in the market.

Making the announcement through the official blog, the team also announced numerous local partners, ranging from Managed Service Providers such as Spruha Technologies to Consulting Partners including HCL, Tata Consultancy Services, and Wipro.