Category Archives: Datacentre

NTT Com buys Cyber CSF to boost Indonesian datacentre presence

NTT Com’s newest datacentre in Jakarta, Indonesia

NTT Communications announced it has reached an agreement to acquire PT. Cyber CSF, one of Indonesia’s largest datacentre and cloud service providers, for an undisclosed sum.

Headquartered in Jakarta, Cyber CSF was founded in 2012 and, with 2,800 racks across 7,700 square metres, claims to be the country’s largest datacentre operator. NTT Com plans to rename Cyber CSF as NTT Indonesia Nexcenter.

The company’s carrier-neutral facility connects to 32 domestic and overseas fibre operators, and will also give NTT Com another point of presence for its Arcstar VPN service, planned for October this year.

NTT said it wants to position itself ahead of what it sees as an impending boom in the regional cloud market.

Indonesia is one of the world’s most populous countries, and according to IDC its ICT market is expected to average about 10 per cent annual growth through 2017, exceeding growth rates in most other Southeast Asian countries.

Additionally, the company expects new legislation in the region to influence more financial services companies to outsource their datacentre operations and move more of their systems to the cloud.

The company has in recent months looked to bolster its datacentre presence globally. In April this year its American subsidiary completed a merger with Verio, which NTT acquired 15 years ago, and in March it bought a majority stake in e-shelter, one of Germany’s largest datacentre operators.

Will datacentre economics paralyse the Internet of Things?

The way data and datacentres are managed may need to change drastically in the IoT era

The statistics predicting what the Internet of Things (IoT) will look like and when it will take shape vary widely. Whether you believe there will be 25 billion or 50 billion internet-enabled devices by 2020, there will certainly be far more devices than there are today. Forrester has predicted that 82 per cent of companies will be using IoT applications by 2017. But unless CIOs pay close attention to the economics of the datacentre, they will struggle to be successful. The sheer volume of data we expect to manage across these IoT infrastructures could paralyse companies and their investments in technology.

The Value of Information is Relative

ABI Research has calculated that there will be 16 zettabytes of data by 2020. Consider that next to another industry estimate of 44 zettabytes by 2020, while others have said that humanity produced only 2.7 zettabytes up to 2013. Bottom line: the growth in data is huge and exponential.

The natural first instinct of any datacentre manager or CIO is to consider where he or she will put that data. Depending on the industry sector, regulatory and legal requirements mean companies will have to be able to collect, process and analyse runaway amounts of data; by 2019, another estimate suggests, that could mean processing two zettabytes a month.

One way to react is to simply buy more hardware. From a database perspective the traditional approach would be to create more clusters in order to manage such huge stores of data. However, a critical element of IoT is that it’s based on low-cost technology, and although the individual pieces of data have a value, there is a limit to that value. For example, you do not need to be told every hour by your talking fridge that you need more milk or be informed by your smart heating system what the temperature is at home.  While IoT will lead to smart devices everywhere, its value is relative to the actionable insight it offers.

A key element of the cost-benefit equation that needs more consideration is the impact of investment requirements at the back end of an IoT data infrastructure. Because the IoT creates a world of smart devices distributed across networks, CIOs have to decide whether collection, storage and analytics happen locally, near the device, or are driven to a centralised management system. Depending on the application, there can be some logic to keeping the intelligence local, because it can speed up the delivery of actionable insight. The company could use low-cost, commoditised devices to collect information, but the system will still become prohibitively expensive if the company has to buy vast numbers of costly database licences to make it perform efficiently – never mind the cost of integrating data from such a distributed architecture.
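To make the local-versus-central trade-off concrete, below is a minimal sketch of edge-side filtering, assuming a hypothetical fridge sensor and backend endpoint: the device samples continuously but only forwards readings that cross an actionable threshold, rather than streaming every sample into a costly central database.

```python
import json
import time
import random
import urllib.request

CENTRAL_ENDPOINT = "https://central.example.com/events"  # hypothetical backend
MILK_THRESHOLD = 0.25  # only levels below this are worth acting on


def read_milk_level() -> float:
    """Stand-in for a real sensor read; returns fraction of a full bottle."""
    return random.random()


def forward(event: dict) -> None:
    """Send one actionable event to the central store."""
    req = urllib.request.Request(
        CENTRAL_ENDPOINT,
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


while True:
    level = read_milk_level()
    # The hourly "milk level is X" reading stays on the device; only a
    # threshold crossing - the actionable insight - travels to the backend.
    if level < MILK_THRESHOLD:
        forward({"device": "fridge-42", "milk_level": level, "ts": time.time()})
    time.sleep(3600)  # sample hourly, report rarely
```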

As a result, the Internet of Things represents a great opportunity for open source software, thanks to its cost effectiveness versus traditional database solutions. Today, open source-based databases have the functionality, scalability and reliability to cope with the explosion in data that comes with the IoT while transforming the economics of the datacentre. It is a point Gartner’s recent Open Source Database Management report endorsed: “Open source RDBMSs have matured and today can be considered by information leaders, DBAs and application development management as a standard infrastructure choice for a large majority of new enterprise applications.”

The Cost of Integrating Structured and Unstructured Data

There are other key considerations when calculating the economic impact of the IoT on the datacentre. The world of IoT will be made up of a wide variety of data, structured and unstructured. Already, the need to work with unstructured data has given rise to NoSQL-only niche solutions. The deployment of these databases, spurred on by the rise of internet-based applications and their popularity with developers, is proliferating because they offer the affordability of open source. Yet their use is leading to operational and integration headaches, as limitations in these NoSQL-only solutions mean data silos spring up all around the IT infrastructure. In some cases, such as where ACID properties are required and robust DBA tools are available, it may be more efficient to use a relational database with NoSQL capabilities built in and get the best of both worlds, rather than create yet another data silo. In other cases, such as for very high-velocity data streams, keeping the data in these newer data stores and integrating them may be optimal.
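Postgres’s JSONB type illustrates what “NoSQL capabilities built in” can look like in practice: schemaless documents live alongside ordinary relational columns in the same ACID transaction. A minimal sketch using the psycopg2 driver, with connection details, table and field names that are purely illustrative:

```python
import psycopg2
from psycopg2.extras import Json

conn = psycopg2.connect("dbname=iot user=postgres")  # hypothetical DSN
cur = conn.cursor()

# Structured columns for the fields every reading shares; a schemaless
# JSONB column for whatever device-specific payload arrives.
cur.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        id        bigserial PRIMARY KEY,
        device_id text NOT NULL,
        taken_at  timestamptz NOT NULL DEFAULT now(),
        payload   jsonb
    )
""")

cur.execute(
    "INSERT INTO readings (device_id, payload) VALUES (%s, %s)",
    ("fridge-42", Json({"milk_level": 0.2, "door_open": False})),
)

# Query inside the document with the ->> operator, much as a document
# store would, but with full ACID semantics and standard DBA tooling.
cur.execute(
    "SELECT device_id FROM readings WHERE (payload->>'milk_level')::float < 0.25"
)
print(cur.fetchall())
conn.commit()
```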

A key priority for every CIO is integrating information as economically as possible, so the organisation can create a complete picture of its business and its customers. The Postgres community has been at the forefront of addressing this challenge with the creation of Foreign Data Wrappers (FDWs), which can integrate data from disparate sources like MongoDB, Hadoop and MySQL. FDWs link external data stores to Postgres databases, so users can access and manipulate data from foreign sources as if it were part of the native Postgres tables. Such simple, inexpensive ways of connecting the new data streams emerging alongside the Internet of Everything will be critical to unlocking value from data.
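As a sketch of how such a wrapper is wired up, the following uses EnterpriseDB’s open source mongo_fdw extension to expose a MongoDB collection as a Postgres table; the extension must already be installed, and the server address, column list and the readings table being joined are all hypothetical:

```python
import psycopg2

conn = psycopg2.connect("dbname=iot user=postgres")  # hypothetical DSN
cur = conn.cursor()

# Register the wrapper, point it at the foreign MongoDB server and map
# a collection onto a foreign table. Names here are illustrative.
cur.execute("CREATE EXTENSION IF NOT EXISTS mongo_fdw")
cur.execute("""
    CREATE SERVER mongo_srv FOREIGN DATA WRAPPER mongo_fdw
        OPTIONS (address 'mongo.internal', port '27017')
""")
cur.execute("CREATE USER MAPPING FOR CURRENT_USER SERVER mongo_srv")
cur.execute("""
    CREATE FOREIGN TABLE device_events (
        _id       name,
        device_id text,
        reading   double precision
    ) SERVER mongo_srv OPTIONS (database 'telemetry', collection 'events')
""")

# The foreign collection now joins against native Postgres tables as if
# it were local - no separate ETL pipeline between the two silos.
cur.execute("""
    SELECT r.device_id, avg(e.reading)
    FROM readings r JOIN device_events e USING (device_id)
    GROUP BY r.device_id
""")
print(cur.fetchall())
```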

The Internet of Things promises a great transformation in the ability of enterprises to holistically understand their business and customer environment in real time and deliver superior customer engagement.  It is critical, though, that CIOs understand the economic impact on their datacentre investments.  The IoT creates a number of new challenges, which can be addressed using the right technology strategy.

Written by Pierre Fricke, vice president of product, EnterpriseDB

Alibaba to bolster cloud performance, proposes data protection pact

Alibaba is boosting the performance of its cloud services and reassuring customers on data protection

Alibaba unveiled a series of performance upgrades to its cloud platform this week in a bid to compete more effectively for big data workloads with other large cloud incumbents, and clarified its position on data protection.

The company said it is adding solid state drive (SSD)-backed cloud storage, which will massively improve read-write performance over its existing HDD-based offerings, as well as virtual private cloud (VPC) services for high-performance compute and analytics workloads. It’s also boosting performance with virtualised GPU-based technology.

“The huge amount of data and advanced computing capacity has brought great business opportunities to the industry,” said Wensong Zhang, chief technology officer of Aliyun, Alibaba’s cloud division.

“Deep learning and high-performance computing have been widely adopted in Alibaba Group for internal use. Aliyun will roll out high-performance computing services and accelerators based on GPU technology that could be applied in image recognition and deep learning to expand the boundaries of business,” Zhang said.

The company also released what it is calling a data protection pact. In its proposal Alibaba said customers will have “absolute ownership” over all of the data generated or sent to the company’s cloud services, and the “right to select whatever services they choose to securely process their data.”

It also said it would strengthen its threat protection and disaster recovery capabilities in order to reassure customers of its ability to guard their data – and the data of their clients. The company did not, however, cite any specific standards or internationally recognised guidelines on data protection in its plans.

“Without the self-discipline exercised by the banking industry, the financial and economic prosperity that exists in modern-day society would not have ensued. Similarly, without common consensus and concrete action dedicated to data protection, the future for the [data technology] economy would be dim,” the company said in a statement.

“We hereby promise to strictly abide by this pledge, and encourage the entire industry to collectively exercise the self-regulation that is vital in promoting the sustainable development of this data technology economy.”

DataCentred ARM-based OpenStack cloud goes GA

DataCentred is moving its ARM-based OpenStack cloud into GA

It has been a big week for ARM in the cloud, with Manchester-based cloud services provider DataCentred announcing that its ARM AArch64-based OpenStack public cloud platform is moving into general availability. The move comes just days after OVH announced it would roll out an ARM-based cloud platform.

The company is running the platform on HP M400 ARM servers, and offering customers access to Intel and ARM architectures alongside one another within an OpenStack environment.

The platform, a product of the company’s partnership with Codethink that originally launched in March, comes in response to increasing demand for ARM-based workload support in the cloud, according to DataCentred’s head of cloud services Mark Jarvis.

“The flexibility of OpenStack’s architecture has allowed us to make the integration with ARM seamless. When users request an ARM-based OS image, it gets scheduled onto an ARM node, and aside from this the experience is identical to requesting x86 resources. Our early adopters have provided invaluable testing and feedback, helping us to get to a point where we’re confident about stability and support,” Jarvis explained.

“The platform is attracting businesses who are interested in taking advantage of the cost savings the lower-power chips offer as well as developers who are targeting ARM platforms. Developers are particularly interested because virtualised ARM is an incredibly cost-effective alternative to deploying physical ARM hardware on every developer’s desk,” he added.
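From the user’s side, that scheduling is driven by image metadata: a Glance image can carry the standard architecture property so the Nova scheduler places it on a matching host, and requesting an ARM instance then looks identical to requesting an x86 one. A rough sketch with the openstacksdk client, in which the cloud entry, image and flavour names are invented for illustration:

```python
import openstack

# Credentials come from a clouds.yaml entry; "datacentred" is hypothetical.
conn = openstack.connect(cloud="datacentred")

# An aarch64 image would be tagged with architecture metadata so the
# scheduler lands it on an ARM node; names below are illustrative.
image = conn.image.find_image("ubuntu-14.04-arm64")
flavor = conn.compute.find_flavor("arm.small")
network = conn.network.find_network("default")

server = conn.compute.create_server(
    name="arm-test",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)  # ACTIVE once scheduled, same flow as x86
```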

The company said the ARM architecture also offers environmental and space-saving benefits, because ARM chips can be deployed at higher density and require less power to run than more conventional x86 chips.

Mike Kelly, founder and chief executive of DataCentred, didn’t comment on customer numbers or revenue figures but stressed the move demonstrates the company has successfully commercialised OpenStack on ARM.

“The market currently lacks easy to use 64-bit ARM hardware and DataCentred’s innovation provides customers with large scale workloads across many cores. Open source software is the future of computing and the General Availability of DataCentred’s new development will make our services even more attractive to price-sensitive and environmentally-aware consumers,” Kelly said.

DataCentred isn’t alone in the belief that ARM has a strong future in the cloud. The move comes the same week French cloud and hosting provider OVH announced plans to add Cavium ARM-based processors to its public cloud platform by the end of next month.

The company, an early adopter of the Power architecture for cloud, said it will add Cavium’s flagship 48-core 64-bit ARMv8-A ThunderX workload-optimised processor to its RunAbove public cloud service.

Digital Realty to double datacentre footprint with Telx acquisition

Digital Realty is acquiring Telx to bolster its US datacentre footprint

Digital Realty is to acquire cloud and colocation solutions specialist Telx for $1.9bn in a move the company said would double its datacentre footprint in the US.

Telx is a direct competitor of Equinix, offering a combination of interconnection, cloud and colocation services to enterprises and IT service providers. As of March this year the company managed 1.3 million square feet across 20 facilities in the US – 11 of which are already leased from Digital Realty.

“This transformative transaction is consistent with our strategy of sourcing strategic and complementary assets to strengthen and diversify Digital Realty’s datacentre portfolio and expand our product mix and presence in the attractive colocation and interconnection space,” said William Stein, Digital Realty’s chief executive officer.

“Telx’s well-established colocation and interconnection businesses provide access to two rapidly-growing segments with long-standing customer relationships in top-tier metropolitan areas such as New York and Silicon Valley.”

“The fact that more than half of Telx’s 20 facilities are run out of Digital Realty properties further highlights the strategic fit as well as the potential incremental revenue opportunities we expect to be able to pursue as one company on a global basis.  This transaction advances our objective of ensuring that Digital Realty’s suite of products and services is able to best serve our customers’ current and future datacentre needs,” Stein added.

Chris Downie, chief executive officer of Telx said: “The combination of Telx’s colocation and interconnection capabilities with Digital Realty’s expansive wholesale platform provides greater flexibility and optionality for our customers and creates a global solutions provider covering wholesale customer applications and smaller performance-oriented deployments in select high-growth urban submarkets across the US.”

The transaction is due to close later this year.

China Mobile revamps private cloud with Nuage SDN

China Mobile, Alcatel-Lucent and their respective subsidiaries are working together on SDN in many contexts

China Mobile’s IT subsidiary Suzhou Software Technology Company has baked Nuage Networks’ software-defined networking technology into its private cloud architecture to enable federation across multiple China Mobile subsidiaries. The move comes the same week both parent companies – China Mobile and Alcatel-Lucent – demoed a virtualised radio access network (RAN), a key component of mobile networks.

The company deployed Nuage’s Virtualised Services Platform (VSP) and Virtual Services Assurance Platform (VSAP) for its internal private cloud platform in a bid to improve the scalability and flexibility of its infrastructure, and enable infrastructure federation between the company’s various subsidiaries.

Each subsidiary is allocated its own virtual private cloud with its own segmented chunk of the network, but enabling infrastructure federation between them means underutilised assets can be deployed in other parts of the company as needed.

“China Mobile is taking a visionary approach in designing and building its new DevOps private cloud architecture,” said Nuage Networks chief executive officer Sunil Khandekar.

“By combining open source software with Nuage Networks VSP, China Mobile is replacing and surpassing its previous legacy architecture in terms of power, sophistication and choice. It will change the way China Mobile operates internally and, ultimately, the cloud services they can provide to customers,” Khandekar said.

The move comes the same week China Mobile and Alcatel-Lucent trialled what the companies claimed to be the industry’s first virtualised RAN, which, for an operator with over 800 million subscribers, has the potential to deliver significant new efficiencies across its datacentres if deployed at scale.

IBM, Nvidia, Mellanox launch OpenPower design centre to target big data apps

IBM has set up another OpenPower design centre in Europe to target big data and HPC

IBM, Nvidia and Mellanox are setting up another OpenPower design centre in Europe to target development of high performance computing (HPC) apps based on the open source Power architecture.

The move will see technical experts from IBM, Nvidia and Mellanox jointly develop applications on OpenPower architecture which take advantage of the companies’ respective technologies – specifically IBM Power CPUs, Nvidia’s Tesla Accelerated Computing Platform and Mellanox InfiniBand networking solution.

The companies said the move will both advance development of HPC software and create new opportunities for software developers to acquire HPC-related skills and experience.

“Our launch of this new centre reinforces IBM’s commitment to open-source collaboration and is a next step in expanding the software and solution ecosystem around OpenPower,” said Dave Turek, IBM’s vice president of HPC Market Engagement.

“Teaming with Nvidia and Mellanox, the centre will allow us to leverage the strengths of each of our companies to extend innovation and bring higher value to our customers around the world,” Turek said.

The centre will be located in IBM’s client centre in Montpellier, France, and will complement a similar facility launched at the Jülich Supercomputing Centre in November last year.

IBM has been working with a broad range of stakeholders spanning the technology, research and government sectors on Power-based supercomputers in order to further its ambitions for the Power architecture. The company hopes Power will command roughly a third of the scale-out market over the next few years.

AWS to expand to India in 2016

AWS said India is the next big market for public cloud expansion

Amazon unveiled plans this week to bring its Amazon Web Services (AWS) infrastructure to India in 2016, in a bid to expand into the quickly growing public cloud services market there.

AWS is already available in India and the company claims to have over 10,000 local customers using the platform, but the recently announced move would see the company set up its own infrastructure in-country rather than relying on delivering the services from nearby regions such as Singapore.

The company says the move will likely improve the performance of the cloud services on offer to local organisations.

“Tens of thousands of customers in India are using AWS from one of AWS’s eleven global infrastructure regions outside of India. Several of these customers, along with many prospective new customers, have asked us to locate infrastructure in India so they can enjoy even lower latency to their end users in India and satisfy any data sovereignty requirements they may have,” said Andy Jassy, senior vice president, AWS.

“We’re excited to share that Indian customers will be able to use the world’s leading cloud computing platform in India in 2016 – and we believe India will be one of AWS’s largest regions over the long term.”

The India expansion comes at a time when the local market is maturing rapidly.

According to analyst and consulting house Gartner, public cloud services revenue in India will reach $838m by the end of 2015, an increase of almost 33 per cent – making it one of the fastest growing markets for public cloud services in the world (global average growth rates sit in the mid-twenties, depending on the analyst house). The firm believes many local organisations in India are shifting away from more traditional IT outsourcing and using public cloud services instead.

IBM stands up SoftLayer datacentre in Italy

IBM has added its first SoftLayer datacentre in Italy, and its 24th globally

IBM announced the launch of its first SoftLayer datacentre in Italy this week, which is located in Cornaredo, Milan.

The company said the datacentre in Milan, a growing hub for cloud services, will enable it to offer a local option for Italian businesses looking to deploy IBM cloud services. The facility, its 24th SoftLayer datacentre globally, has capacity for up to 11,000 servers, a power rating of 2.8 megawatts, and is designed to Tier III specification.

“The Italian IT sector is changing as startups and enterprises alike are increasingly turning to the cloud to optimize infrastructure, lower IT costs, create new revenue streams, and spur innovation,” said Marc Jones, chief technology officer for SoftLayer.

“The Milan datacentre extends the unique capabilities of our global platform by providing a fast, local onramp to the cloud. Customers have everything they need to quickly build out and test solutions that run the gamut from crunching big data to launching a mobile app globally,” Jones added.

Nicola Ciniero, general manager of IBM Italy said: “This datacentre represents a financial and technological investment made by a multinational company that has faith in this country’s potential. Having an IBM Cloud presence in Italy will provide local businesses with the right foundation to innovate and thrive on a global level.”

The move comes just a couple of months after IBM added a second SoftLayer datacentre in the Netherlands.

Facebook to build Open Compute datacentre in Ireland

Facebook plans to build a datacentre in Ireland, its second in Europe

Facebook this week revealed plans to build an Open Compute datacentre in Ireland in a bid to support its growth ambitions in Europe.

The proposed location of the new datacentre in County Meath will enable the company to make use of local renewable energy sources and talent, and the facility would be the social media giant’s second in Europe. The first, in Luleå, Sweden, uses 100 per cent hydroelectricity to power its servers.

Facebook said the datacentre could generate hundreds of millions of euros in economic benefits for the region. The project is being supported by the Department of Jobs through IDA Ireland. Martin Shanahan, the organisation’s chief executive, said: “Facebook’s existing relationship with Ireland is extremely strong and extensive in scope, but the news that the company wants to build its second European data centre in a regional location such as Meath will cement the relationship even further.”

“Ireland has been a home for Facebook since 2007 and today’s planning application demonstrates our continued interest to invest in Ireland,” said Facebook’s datacentre strategy head Rachel Peterson.

“We hope to build an innovative, environmentally friendly data centre that will help us continue to connect people in Ireland and around the world – while supporting local job creation and Ireland’s successful technology economy. We look forward to continuing our conversations with the Clonee community in coming weeks,” she said.

Facebook has fewer than a handful of datacentres globally, but the data volumes it generates – and the infrastructure it needs to support its services – are significant. The company adds 300 million new photos every day, has a data warehouse of over 300 petabytes and processes hundreds of terabytes of data daily. And given that nearly three quarters of Facebook users are outside the US, its build-out in Europe and other key strategic regions (India, for instance) will likely continue.