Archivo de la categoría: Datacentre

London’s Virtus Data Centres doubles annual revenues

London-based Virtus Data Centres has announced that it has doubled its revenues over the last twelve months, though the company has not released any specific numbers to substantiate the claim.

The company has recorded a healthy number of new customers throughout the period, including T-Systems, which runs its private and public cloud operations from the London2 location in Hayes as part of a five-year transition project to close its private data centre in Feltham. Virtus has 40MW of capacity across its three locations, having acquired the London4 site in Slough from Infinity SDC during the latter stages of 2015.

“Our aim is to combine cutting edge design and technology with transparent and agile commercials to offer the very best tailored solutions and service for our customers,” said Neil Cresswell, CEO at Virtus Data Centres. “This unique approach to data centre service delivery is the reason we see continued growth across all business lines with the likes of T-Systems and Symantec collocating in our leading facilities. It’s been a fantastic start to the year, and one which we seek to improve upon.”

The company, which has been in operation since 2008, offers traditional retail and wholesale colocation models through three locations in the London area (Enfield, Hayes and Slough), with a fourth set to open early next year. Virtus also claims the highest total colocation MW sales of any operator in the London market throughout 2015, according to findings from CBRE, and is one of only four data centre operators in London to have been awarded Tier III design certification from the Uptime Institute. Virtus has also been expanding its credentials and capabilities in recent months, achieving supplier status with the Crown Commercial Service as part of the G-Cloud 7 initiative.

Recent expansion initiatives have been driven by investment from ST Telemedia, announced in June last year. As part of the agreement, ST Telemedia will make what it claims is a ‘significant investment’ in Virtus, committing to a 49% stake via a joint venture with Virtus’ existing owner Brockton Capital. ST Telemedia has a healthy track record with data centre companies, having launched i-STT in 2000, which was later merged into Equinix (a stake it has since divested), as well as investments in Level 3 Communications and GDS Services.

New France-IX DNS server claims to reduce latency 10x

International internet exchange point provider France-IX has unveiled a new DNS server in Paris which it claims reduces internet traffic latency to 1-3ms.

France-IX is a member-only exchange point that has hundreds of members. It was founded in 2010 and now has operations around the world. The new K-root server is running on a Dell PowerEdge R430 server donated by Dalenys, one of its members.

“We have been connected to France-IX since 2012 to meet the demand of interconnection with internet networks,” said Frederic Dhieux, Deputy CTO at Dalenys. “We are glad to sponsor the K-root installation and actively contribute to France-IX.”

The K-root software is installed and maintained by the RIPE NCC, which has a close relationship with France-IX. “We’re delighted that France-IX has decided to host a K-root node,” said RIPE NCC CIO Kaveh Ranjbar. “The expansion of the K-root network increases its robustness, contributing to the resiliency of the global Internet. Through this kind of cooperation, we help improve the overall stability and global reachability of the Internet for all its users.”

“This project demonstrates the commitment of France-IX and our members and partners to collaborate closely to improve services for the global community,” said Franck Simon, MD of France-IX. “We continue to strive to bring the best possible quality of experience to our members and we are excited about the improved latency we are able to achieve with the new K-root server in Paris, in collaboration with RIPE NCC and Dalenys.”
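The 1-3ms figure is something readers on a well-peered network can observe for themselves by timing a DNS query to K-root's publicly documented anycast address, 193.0.14.129. The sketch below is an illustration using only Python's standard library, not a proper benchmarking tool: it hand-builds a minimal DNS query for the root zone's NS records and measures the UDP round trip.

```python
import socket
import struct
import time

def build_root_ns_query(query_id=0x1234):
    """Build a minimal DNS query asking for the root zone's NS records."""
    # Header: ID, flags (standard query, recursion desired), QDCOUNT=1,
    # ANCOUNT=0, NSCOUNT=0, ARCOUNT=0 -- six big-endian 16-bit fields.
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Question: the root name is a single zero byte, then QTYPE=NS (2)
    # and QCLASS=IN (1).
    question = b"\x00" + struct.pack(">HH", 2, 1)
    return header + question

def measure_rtt(server_ip, timeout=2.0):
    """Send the query over UDP and return the round-trip time in milliseconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        start = time.perf_counter()
        sock.sendto(build_root_ns_query(), (server_ip, 53))
        sock.recv(512)  # any response, truncated or not, ends the timer
        return (time.perf_counter() - start) * 1000.0
    finally:
        sock.close()
```

Calling `measure_rtt('193.0.14.129')` from a host near a well-connected exchange should return a low-millisecond figure; because K-root is anycast, the node that answers depends entirely on where the query originates, so results from other networks will vary.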

EMC outlines ‘Technical Debt’ challenges for data greedy enterprises


President of Core Technologies Division Guy Churchward (Left) and Jeremy Burton, President of Products and Marketing (Right) at EMC World 2016

Speaking at EMC World, President of Core Technologies Division at EMC Guy Churchward joined Jeremy Burton, President of Products and Marketing, to outline one of the industry’s primary challenges, technical debt.

The burden of technical debt is being felt by the new wave of IT professionals. This generation is under pressure from most areas of the business to innovate and create an agile, digitally enabled company, but still has commitments to the traditional IT systems on which the business currently operates. This commitment to legacy technologies, which can consume a significant proportion of a company’s IT budget and prevent future innovation, is what Churchward describes as technical debt.

“They know their business is transforming fast,” said Churchward. “Business has to use IT to make their organization a force to be reckoned with and remain competitive in the market, but all the money is taken up by the current IT systems. This is what we call technical debt. A lot of people have to do more with what they have and create innovation with a very limited budget. This is the first challenge for every organization.”

This technical debt is described by Churchward as the first challenge which every IT department will face when driving towards the modern data centre. It makes business clunky and ineffective, but is a necessity to ensure the organization continues to operate, until the infrastructure can be upgraded to a modern proposition. Finding the budget without compromising current operations can be a tricky proposition.

“When you live in an older house, where the layout doesn’t really work for the way you live your life and there aren’t enough closets to satiate your wife’s shoe fetish, maybe it’s time to modernize,” said Churchward on his blog. “But do you knock the whole house down and start again? Maybe it’s tempting, but what about the investment that you’ve already made in your home? It’s similar when you want to modernize your IT infrastructure. You have money sunk into your existing technology and you don’t want to face the disruption of completely starting again.

“For many companies, this debt includes a strategy for data storage that takes advantage of a shrinking per-gig cost of storage that enables them to keep everything. And that data is probably stored primarily on spinning disk with some high-availability workloads on flash in their primary data centre. The old way of doing things was to see volumes of data growing and address that on a point basis with more spinning disk. Data centres are bursting at the seams and it’s now time to modernize – but how?”

Churchward highlighted that the first step is to remove duplicate data sets – EMC launched its Enterprise Copy Data Management tool at EMC World this week – to reduce unnecessary spend within the data centre. While there are a number of reasons to duplicate and keep old data sets for a defined period of time, Churchward commented that this data is often forgotten and thus becomes an unnecessary expense. Although identifying and removing this data might seem a simple way to clear a portion of the technical debt, Churchward believes it could be a $50 billion business problem by 2018.

The Enterprise Copy Data Management software helps customers discover, automate and optimize copy data to reduce costs and streamline operations. The tool automatically identifies duplicate data sets within various data centres, and using data-driven decision making software, optimizes the storage plans, and in the necessary cases, deletes duplicate data sets.
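EMC has not published the internals of Enterprise Copy Data Management, but the core idea the tool automates – identifying identical copies of data so they can be reviewed or deleted – can be illustrated with content hashing. The sketch below is a hypothetical, simplified analogue at file-system scale, not the product's actual method: it groups files by the SHA-256 digest of their contents, so any group with more than one member is a set of exact duplicates.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by the SHA-256 digest of their contents.

    Returns only the groups containing more than one file, i.e. the
    sets of byte-for-byte identical copies.
    """
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

A production copy-data tool has to go much further – tracking snapshots, clones and policy-driven retention rather than exact byte matches – but the hash-and-group step above is the cheapest way to surface the forgotten duplicates Churchward describes.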

This is just one example of how the challenge of technical debt can be managed, though the team at EMC believe this challenge, the first in a series when transforming to a modern business, can be one of the largest. Whether this is one of the reasons cloud adoption within the mainstream market could be slower than anticipated remains to be seen, though the removal of redundant and/or duplicated data could provide some breathing room for innovation, and budget for the journey towards the modern data centre.

Welcome to the cloud party – Michael Dell launches Dell Technologies


Dell Founder and CEO Michael Dell

Speaking at EMC World in Las Vegas, Dell CEO Michael Dell and EMC CEO Joe Tucci outlined the rationale behind one of history’s largest mergers, and announced the name of the industry’s latest tech giant – Dell Technologies.

The group itself will be known as Dell Technologies upon the completion of the reported $67 billion merger, though there will also be several individual operating brands. Dell’s client services group will continue to be known as Dell, with the soon-to-be merged enterprise business known as Dell EMC.

“There are certain times once every two or three generations where everything changes,” said Tucci. “The industrial revolution went on for more than 100 years and changed everything they knew back then. Many new companies were born out of the opportunities that were created, and many failed as they didn’t. We are now on the cusp of an even bigger revolution, the digital revolution.”

Tucci, speaking at what he admitted, seemingly with some disappointment, would be his final EMC World, highlighted the vast scale of change the world is currently undergoing. IoT and the connected world specifically are redefining not only the way in which individuals communicate with each other, but also the way in which enterprise organizations are structured and operated. The merger enables two companies, which could potentially be perceived as being stuck in a traditional IT world, to create a new brand which can capitalize on digitalization trends.

“We have to change rapidly to be on the wave of this revolution,” said Tucci. “The merger with Dell allows the company to change the concept of the business and capitalize on the opportunities presented by the digital revolution.”

Michael Dell’s contribution to the opening keynote focused more on the rate of innovation, normalization and implementation of new technologies which are driving the digital revolution. EMC World has now been running for 15 years, debuting in 2001, the same year which saw the launch of the iPod, the Sun E25K as state-of-the-art data centre technology and the first availability of 3G networks. Dell commented that while these one-time innovations could now be seen as relics, it raises the question of what is possible during the next 15 years.


EMC CEO Joe Tucci and Dell CEO Michael Dell on stage at EMC World

“Think about 15 years from now, to the year 2031,” said Dell. “Currently, if you want to sequence the human genome it takes around 36 hours. In 2031 it will take 94 seconds. In 2031 more than half the cars on the road will be driverless, and there will be more than 200 million connected devices. There will be thousands of innovations which we can’t even begin to perceive. I believe that it could happen sooner as well. The marginal cost of making something intelligent is fast approaching zero.

“The new digital, connected world will require data centre infrastructure to be architected in a different way. It’s going to be cloud native and operated on a Devops methodology. EMC and Dell are merging to create a company which can deliver this concept.”

“We are combining Dell and EMC to help you navigate a successful path, to modernize your IT, reduce costs and create your digital future.”

The merger itself could be evidence of the weight of the digital world and the expectations placed on companies to succeed in the new ecosystem. Rather than attempting to change the perception of the organization they oversee, as IBM and Intel are doing for instance, the merger enables Tucci and Dell to create a new brand defined however and wherever they desire. Unlike companies in the process of redefining themselves for the cloud era, Dell Technologies can position itself wherever it chooses in the market, without the worry of legacy perceptions.

Dell also claimed the new company will have a significant advantage over competitors due to the fact it will be private. Leaning on the idea that Dell Technologies will not have the outside influences publicly traded organizations must contend with, Dell believes the new company can invest for long-term ambition, as opposed to the short-termist aims which could be perceived to damage technological innovation.

The IoT wave is continuing to grow, and as we see more devices deployed, more data collected and more cloud-orientated behaviour infiltrating the boardroom, the role of the data centre is likely to become more evident. Dell believes the modern data centre will be the centre of the new technology world, enabling innovation in an increasingly competitive market, and the merger has created a new organization which can capitalize on these trends. The success of the new company remains to be seen, though the new proposition and brand does have the potential to remove perceived doubt as to how traditional IT players can operate in “The Next Industrial Revolution” as Michael Dell highlighted.

Intel prioritizes cloud, IoT and 5G in new business strategy

Intel has outlined a new business strategy to capitalize on new trends within the industry, including cloud technology, IoT and 5G.

Speaking on the company’s blog, CEO Brian Krzanich outlined the organization’s new strategy, which is split into five sections: cloud technology, IoT, memory and programmable solutions, 5G and developing new technologies under the concept of Moore’s Law.

“Our strategy itself is about transforming Intel from a PC company to a company that powers the cloud and billions of smart, connected computing devices,” said Krzanich. “But what does that future look like? I want to outline how I see the future unfolding and how Intel will continue to lead and win as we power the next generation of technologies.

“There is a clear virtuous cycle here – the cloud and data centre, the Internet of Things, memory and FPGA’s are all bound together by connectivity and enhanced by the economics of Moore’s Law. This virtuous cycle fuels our business, and we are aligning every segment of our business to it.”

Krzanich believes the virtualization and software trends which are apparently redefining the concept of the data centre align well with the Intel business model and future proposition, through the company’s position in the high-performance computing food chain. Through continued investment in analytics, big data and machine learning technologies, the company aims to drive more of the footprint of the data centre to Intel architecture.

The company’s play for the potentially lucrative IoT market will be built on the idea of being ‘connected to the cloud’. Intel has highlighted it will focus on autonomous vehicles, industrial and retail as its primary growth drivers of the Internet of Things, combining its capabilities within the cloud ecosystem to drive growth within IoT.

While there were a number of buzzwords and trends highlighted throughout Krzanich’s post, Moore’s Law appeared to receive particular attention. While generally considered a plausible theory, Moore’s Law is regularly written off within the industry, a point with which Krzanich did not seem to agree.

“In my 34 years in the semiconductor industry, I have witnessed the advertised death of Moore’s Law no less than four times,” said Krzanich. “As we progress from fourteen nanometer technology to ten nanometer and plan for seven nanometer and five nanometer and even beyond, our plans are proof that Moore’s Law is alive and well. Intel’s industry leadership of Moore’s Law remains intact, and you will see continued investment in capacity and R&D to ensure so.”

Krzanich’s comments provide more clarity to last week’s announcement on how the company would be restructuring the business to accelerate its transformation project, as well as its quarterly earnings. The data centre and Internet of Things (IoT) businesses would appear to be Intel’s primary growth engines, delivering $2.2 billion in revenue growth last year and accounting for roughly 40% of revenue across the period.

The transformation project itself is part of a long-term ambition of the business, as it aims to move the perception of the company away from client computing (PCs and mobile devices) and towards IoT and the cloud. The announcements over the last week have had mixed results in the market; following its quarterlies the share price rose slightly, though it has declined over the subsequent days.

Verizon launches NFV OpenStack cloud deployment over five data centres

Verizon has completed the launch of its NFV OpenStack cloud deployment project across five of its US data centres, alongside Big Switch Networks, Dell and Red Hat.

The NFV project is claimed to be the largest OpenStack deployment in the industry, and the company is currently expanding it to a number of domestic data centres and aggregation sites. The company also expects the deployment to be adopted in edge network sites by the end of the year, as well as at a number of Verizon’s international locations, though a time-frame for the international sites was not disclosed.

“Building on our history of innovation, this NFV project is another step in building Verizon’s next-generation network – with implications for the industry,” said Adam Koeppe, VP of Network Technology Planning at Verizon. “New and emerging applications are highlighting the need for collaborative research and development in technologies like NFV. We consider this achievement to be foundational for building the Verizon cloud that serves our customers’ needs anywhere, anytime, any app.”

Verizon worked with Big Switch Networks, Dell and Red Hat to develop the OpenStack pod-based design, which went from idea to deployment of more than 50 racks in five data centres in nine months. The design includes a spine-leaf fabric for each pod, controlled through a Neutron plugin to the Red Hat OpenStack Platform. The multi-vendor project uses Big Switch’s SDN controller software to manage Dell switches, which are orchestrated by the Red Hat OpenStack Platform.

“Dell’s Open Networking initiative delivers on the promise of bringing innovative technology, services and choice to our customers and Verizon’s NFV project is a testament to that vision,” said Tom Burns, GM of Dell’s networking business unit. “With the open source leadership of Red Hat, the SDN expertise of Big Switch and the infrastructure, service and support at scale from Dell, this deployment demonstrates a level of collaboration that sets the tone for the Open Networking ecosystem. This is just the beginning.”

Software-Defined Data Centre to become a common fixture in US – survey

A survey from security and compliance company HyTrust claims the Software-Defined Data Centre (SDDC) is on the verge of becoming a common fixture in corporate America.

65% of the respondents predict faster deployment in 2016, while 62% anticipate increased adoption of the SDDC. Nearly half see greater adoption of network virtualization, while even more, 53%, anticipate increased adoption of storage virtualization. Half of the respondents also anticipate higher levels of adoption of public cloud across the course of 2016.

“This survey is truly interesting in that it uncovers a new level of maturity in organizations pursuing a SDDC leveraging virtualization and the cloud. It’s long been happening, but now faster and with greater conviction and comfort than perhaps ever before,” said Eric Chiu, President of HyTrust. “Security and privacy have always been the critical inhibitors, and no one denies that these issues still concern senior executives.

“But now we can also see that technologies like those offered by HyTrust, which balance a high level of security and control with smooth automation, are having a major impact. The benefits of virtualized and cloud infrastructures are undeniable—think agility, flexibility and lower cost, among many other advantages—and the obstacles to enjoying those benefits are increasingly being overcome.”

From a security perspective, 74% of the respondents believe security is less of an obstacle to adoption compared to 12 months ago, however that is not to say security challenges have been reduced significantly. 54% of the respondents believe there will be an increased number of breaches throughout 2016, whereas only 11% say the contrary. In terms of migration, 67% believe security will ultimately slow down the process, and 70% believe there will be the same or even greater levels of internal compliance and auditing challenges following the transition to a SDDC platform.

While the Software-Defined Data Centre should not be considered a new term or trend within the industry, levels of adoption and trust have been lower in comparison to other technologies in the cloud world. As the industry continues its journey towards automation, the SDDC trend will likely only become louder, as the survey demonstrates.

Equinix launches Data Hub solution for customers on the edge

Data centre company Equinix has launched its Data Hub solution to enable enterprise customers to develop large data repositories and dispense, consume and process the data at the edge.

As part of an Interconnection Oriented Architecture, the Data Hub is a bundled solution consisting of pre-configured colocation and power, combined with cloud-integrated data storage solutions, and will work in conjunction with the company’s Performance Hub solution. The company highlighted that while the Performance Hub solves for the deployment of network gear and interconnection inside Equinix data centres, Data Hub enables the deployment of IT gear integrated with Performance Hub.

The launch builds on a number of trends within the industry, including the growing volume of data utilized by enterprise organizations, brought on by the implementation of IoT and big data capabilities. According to research from Statista, the number of connected devices is forecast to reach 50 billion units worldwide by 2020. The company believes this healthy growth in data consumption will increase the need for organizations to re-think their existing IT infrastructure and develop an Interconnection Oriented Architecture at the edge.

“Data, and its exponential growth, continues to be an ongoing concern for enterprise IT,” said Lance Weaver, VP, Product Offers and Platform Strategy at Equinix. “And there is no expectation it will slow down in the near future. To keep up with this relentless data rise, it is critical for the enterprise to rethink its IT architecture and focus on an interconnection-first strategy. Data Hub is the newest solution from Equinix and we are confident that it will provide our enterprise customers with the data management solutions they need today, while providing for growth tomorrow.”

The company has claimed there are a number of use cases, including cloud-integrated tiered storage and big data analytics infrastructure, as well as multi-site deployment for data redundancy, allowing data to be synchronously replicated by an enterprise.

“With the explosive growth of mobile, social, cloud and big data, an enterprise data center strategy needs to evolve from merely housing servers to becoming the foundation for new data-driven business models,” said Dan Vesset, Group VP, Analytics and Information Management at research firm IDC. “Equinix, with its global platform of data centers and interconnection-first approach, offers the type of platform needed to create a flexible datacenter environment for innovation – specifically in the realm of data management at the network edge.”

Google continues public cloud charge with 12 new data centres

Google has continued its expansion plans in the public cloud sector after announcing it will open 12 new data centres by the end of 2017.

In recent weeks, Google has been expanding its footprint in the cloud space with rumoured acquisitions, hires of industry big-hitters and blue-chip client wins, however its new announcement adds weight to the moves. With two new data centres to open in Oregon and Tokyo by the end of 2016, and a further ten by the end of 2017, Google is positioning itself to challenge Microsoft and AWS for market share in the public cloud segment.

“We’re opening these new regions to help Cloud Platform customers deploy services and applications nearer to their own customers, for lower latency and greater responsiveness,” said Varun Sakalkar, Product Manager at Google. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance.”

Google currently operates in four cloud regions and the new data centres will give the company a presence in 15. AWS and Microsoft have built a market-share lead over Google thanks in part to the fact that they operate in 12 and 22 regions respectively, with Microsoft planning to open a further five.

Recent findings from Synergy Research Group show AWS is still the clear leader in the cloud space with a market share of 31%, with Microsoft accounting for 9% and Google controlling 4%. Owing to its private and hybrid cloud offerings, IBM accounts for 7% of the global market according to Synergy.

Growth at AWS was measured at 63%, whereas Microsoft and Google report 124% and 108% respectively. Industry insiders have told BCN that Microsoft and Google have been making moves to improve their offering, with talent and company acquisitions. Greater proactivity in the market from the two challengers could explain the difference in growth figures over the last quarter.

Alongside the new data centres, Google’s cloud business leader Diane Greene has announced a change to the way the company operates its sales and marketing divisions. According to Bloomberg Business, Greene told employees that Google will be going on a substantial recruitment drive, while also changing the way it sells its services, focusing more on customer interaction and feedback. This practice would not be seen as unusual for its competitors, however Google’s model has been so far built on the idea of customer self-service. The cloud sales team on the west coast has already doubled in size to fifty, with the team planning on widening this recruitment drive.

While Google’s intentions have been made clear over recent months, there are still some who remain unconvinced. 451 Group Lead Analyst Carl Brooks believes the company is still not at the same level as its competitors, needing to add more enterprise compatibility, compliance, and security features. “They are probably the most advanced cloud operation on the planet. It also doesn’t matter,” he said.

US revealed to have 46% of all data centres despite EU concerns

New findings from Synergy Research Group show that 46% of major cloud and internet data centre sites are located in the US, with second-placed China accounting for only 7%.

The research is based on an analysis of the data centre footprint of 17 of the world’s major cloud and internet service firms and highlights the dominance of the US in the cloud marketplace. Japan is listed third with a 6% share, and Germany was the largest European player with just 4%.

“Given that explosive growth in cloud usage is a global phenomenon, it is remarkable that the US still accounts for almost half of the world’s major data centres, but that is a reflection of the US dominance of cloud and internet technologies,” said John Dinsdale, Research Director at Synergy Research Group.

Considering the dominance of AWS, Microsoft and Google in the cloud market space, it’s unsurprising that the US is top of the rankings, though recent concerns from European countries regarding movement of its citizens’ data outside of the EU could complicate matters. Germany is one country which is sensitive to any changes in data protection policy and is considered to have some of the most stringent data protection laws worldwide.

“The other leading countries are there due to either their scale or the unique characteristics of their local markets. Perhaps the biggest surprise is that the UK does not feature more prominently, but that situation will change this year with AWS, Microsoft and Google all opening major data centres in the country,” said Dinsdale.

Back in October, the European Court of Justice decided that Safe Harbour did not give data transfers between Europe and the US adequate protection, and declared the agreement, which had been in place since 2000, void. The EU-US Privacy Shield, Safe Harbour’s successor, has also come under criticism in recent weeks as concerns have been raised as to how much protection the reformed regulations afford European parties.

While the new agreement has been initially accepted, privacy activist Max Schrems, who has been linked to the initial downfall of Safe Harbour, said in a statement reacting to Privacy Shield, “Basically, the US openly confirms that it violates EU fundamental rights in at least six cases. The commission claims that there is no ‘bulk surveillance’ any more, when its own documents say the exact opposite.” A letter from Robert Litt, General Counsel of the Office of the Director of National Intelligence, confirmed that there were six circumstances under which the NSA will be allowed to use data for undefined “counter-terrorism” purposes.

While the concentration of data centres in the US should not come as a huge surprise, it puts into further context the fears of European parties who are concerned with the effectiveness of any EU-US data protection policies.