Archivo de la categoría: data centre

Cisco reports 3% growth for Q3 and sets targets on IoT market

Cisco has reported 3% year-on-year growth for Q3, topping $12 billion for the quarter, with its security business leading the charge, though the team has reconfirmed that the IoT, software, cloud and collaboration markets are priorities for the future.

The security portfolio demonstrated revenue growth of 17%, while deferred revenue grew 31%, driven by the ongoing shift from hardware to software and subscription services. The collaboration portfolio grew 16%, while the team was also confident in the performance of its next-generation data centre portfolio. The ACI platform grew revenues by approximately 100%, exceeding a $2 billion annualized run-rate.

“We delivered strong Q3 results against the backdrop of a macro environment that continues to be uncertain,” said CEO Charles Robbins. “Despite this uncertainty we executed very well, with revenue growth of 3%. The operational changes we continue to make will further enable our customers to leverage the strategic role of the network as they transform their businesses to become digital.”

Regionally, the Americas accounted for a 4% lift, whereas EMEA and APJ were slightly less at 2% and 1% respectively. The emerging markets demonstrated healthy results for the business, with the BRICs up 4%, Mexico up 4%, China up 22% and India up 18%. The team highlighted that while there was good growth in the public sector and service provider segments, enterprise performance was less positive, which it attributed to pressure driven by macro uncertainty.

The quarter also saw Cisco as one of the more active players in the M&A market, completing five acquisitions over the course of the quarter. The $1.4 billion acquisition of Jasper Technologies now makes Cisco the largest cloud-based IoT service platform in the industry, the team claims. Cisco also completed the acquisitions of Acano, Synata, Leaba and CliQr during the period, the latter a $260 million orchestration platform that helps customers simplify and accelerate their private, public and hybrid cloud deployments. Cisco had already integrated CliQr with its Application Centric Infrastructure (ACI) and Unified Computing System (UCS) prior to the acquisition.

“These acquisitions are clearly focused on our key growth areas, including IoT, software, cloud and collaboration, as well as continuing to strengthen our core,” said Robbins.

The IoT market has been a long-time target for Cisco, with the Jasper deal adding to the ParStream acquisition last year. That acquisition offered the opportunity for instant analysis of masses of data at the network edge with minimal infrastructure or OPEX repercussions, the company claimed.

London’s Virtus Data Centres doubles annual revenues

London-based Virtus Data Centres has announced it has doubled its revenues over the last twelve months, though the team has not released any specific numbers to substantiate the claim.

The company has recorded a healthy number of new customers throughout the period, including T-Systems, which runs its private and public cloud operations from the London2 location in Hayes as part of a five-year transition project to close its private data centre in Feltham. Virtus has 40MW of capacity across its three locations, having acquired the London4 site in Slough from Infinity SDC during the latter stages of 2015.

“Our aim is to combine cutting edge design and technology with transparent and agile commercials to offer the very best tailored solutions and service for our customers,” said Neil Cresswell, CEO at Virtus Data Centres. “This unique approach to data centre service delivery is the reason we see continued growth across all business lines with the likes of T-Systems and Symantec collocating in our leading facilities. It’s been a fantastic start to the year, and one which we seek to improve upon.”

The company, which has been in operation since 2008, offers traditional retail and wholesale colocation models through three locations in the London area (Enfield, Hayes and Slough), with a fourth set to open early next year. Virtus also claims to have recorded the highest total colocation MW sales of any operator in the London market throughout 2015, according to findings from CBRE, and is one of only four data centre operators in London to have been awarded Tier III design certification from the Uptime Institute. Virtus has also been expanding its credentials and capabilities in recent months, achieving supplier status with the Crown Commercial Service as part of the G-Cloud 7 initiative.

Recent expansion initiatives have been driven through investment from ST Telemedia, announced last June. As part of the agreement, ST Telemedia will make what it claims is a ‘significant investment’ in Virtus, committing to a 49% stake via a joint venture with Virtus’ existing owner Brockton Capital. ST Telemedia has a healthy track record when it comes to data centre companies, having launched i-STT in 2000, which was later merged into Equinix (a stake it has since divested), as well as investments in Level 3 Communications and GDS Services.

Welcome to the cloud party – Michael Dell launches Dell Technologies


Dell Founder and CEO Michael Dell

Speaking at EMC World in Las Vegas, Dell CEO Michael Dell and EMC CEO Joe Tucci outlined the rationale behind one of history’s largest mergers, and announced the name of the industry’s latest tech giant – Dell Technologies.

The group itself will be known as Dell Technologies upon the completion of the reported $67 billion merger, though there will also be several individual operating brands. Dell’s client services group will continue to be known as Dell, with the soon-to-be merged enterprise business known as Dell EMC.

“There are certain times once every two or three generations where everything changes,” said Tucci. “The industrial revolution went on for more than 100 years and changed everything they knew back then. Many new companies were born out of the opportunities that were created, and many failed as they didn’t. We are now on the cusp of an even bigger revolution, the digital revolution.”

Tucci, speaking at what he somewhat wistfully admitted would be his final EMC World, highlighted the vast scale of change the world is currently undergoing. IoT and the connected world specifically are redefining not only the way in which individuals communicate with each other, but also the way in which enterprise organizations are structured and operated. The merger enables two companies, which could potentially be perceived as being stuck in a traditional IT world, to create a new brand which can capitalize on digitalization trends.

“We have to change rapidly to be on the wave of this revolution,” said Tucci. “The merger with Dell allows the company to change the concept of the business and capitalize on the opportunities presented by the digital revolution.”

Michael Dell’s contribution to the opening keynote focused more on the rate of innovation, normalization and implementation of the new technologies which are driving the digital revolution. EMC World has now been running for 15 years, debuting in 2001, the same year that saw the launch of the iPod, the Sun E25K as state-of-the-art data centre technology, and the first availability of 3G networks. Dell commented that while these former innovations could now be seen as relics, it raises the question of what will be possible during the next 15 years.


EMC CEO Joe Tucci and Dell CEO Michael Dell on stage at EMC World

“Think about 15 years from now, to the year 2031,” said Dell. “Currently, if you want to code the human genome it takes around 36 hours. In 2031 it will take 94 seconds. In 2031 more than half the cars on the road will be driverless, and there will be more than 200 million connected devices. There will be thousands of innovations which we can’t even begin to perceive. I believe that it could happen sooner as well. The marginal cost of making something intelligent is fast approaching zero.

“The new digital, connected world will require data centre infrastructure to be architected in a different way. It’s going to be cloud native and operated on a Devops methodology. EMC and Dell are merging to create a company which can deliver this concept.”

“We are combining Dell and EMC to help you navigate a successful path, to modernise your IT, reduce costs and help you create your digital future.”

The merger itself could be evidence of the weight of the digital world and the expectations placed on companies to succeed in the new ecosystem. Rather than attempting to change the perception of the organizations they oversee, as IBM and Intel are doing for instance, the merger enables Tucci and Dell to create a new brand which can be defined how and where they desire. Unlike companies which are in the process of redefining themselves for the cloud era, Dell Technologies can position itself wherever it chooses in the market, without worry of legacy perceptions.

Dell also claimed the new company will have a significant advantage over competitors due to the fact it will be private. Leaning on the idea that Dell Technologies will not have outside influences to be concerned about, as publicly traded organizations do, Dell believes the new company can invest for long-term ambition, as opposed to short-termist aims which could be perceived to damage technological innovation.

The IoT wave is continuing to grow, and as we see more devices deployed, more data collected and more cloud-orientated behaviour infiltrating the boardroom, the role of the data centre is likely to become more evident. Dell believes the modern data centre will be the centre of the new technology world, enabling innovation in an increasingly competitive market, and the merger has created a new organization which can capitalize on these trends. The success of the new company remains to be seen, though the new proposition and brand does have the potential to remove perceived doubt as to how traditional IT players can operate in “The Next Industrial Revolution” as Michael Dell highlighted.

IBM expands flash storage portfolio in continued transition to cloud positioning

IBM has announced the expansion of its flash storage portfolio to bolster its position in the cognitive computing and hybrid cloud market segments.

The FlashSystem arrays combine its FlashCore technology with scale-out architecture, in the company’s continued efforts to consolidate its position as a vendor to power cloud data centres which utilize cognitive computing technologies. Cognitive computing, and more specifically Watson, has seemingly formed the central pillar of IBM’s current marketing and PR campaigns, as it continues its journey to transform Big Blue into a major cloud player.

“The drastic increase in volume, velocity and variety of information is requiring businesses to rethink their approach to addressing storage needs, and they need a solution that is as fast as it is easy, if they want to be ready for the Cognitive Era,” said Greg Lotko, GM of IBM’s Storage and Software Defined Infrastructure business. “IBM’s flash portfolio enables businesses on their cognitive journey to derive greater value from more data in more varieties, whether on premises or in a hybrid cloud deployment.”

The company claims the new offering will provide an onramp to flash storage for IT service providers, reducing the cost of implementing an all-flash environment, as well as scalable storage for cloud service providers. Another feature built into the proposition will enable customers to deal with ‘noisy neighbour’ challenges and other network performance issues which can be present in a multi-tenant cloud environment.
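IBM has not detailed how its ‘noisy neighbour’ protection works, but storage quality-of-service of this kind is commonly implemented as per-tenant I/O throttling. A minimal token-bucket sketch of the general idea (all names and limits here are hypothetical, not IBM’s implementation):

```python
import time

class TenantThrottle:
    """Token-bucket IOPS cap for one tenant in a shared, multi-tenant array."""

    def __init__(self, iops_limit, clock=time.monotonic):
        self.rate = iops_limit        # tokens refilled per second
        self.capacity = iops_limit    # maximum burst allowance
        self.tokens = float(iops_limit)
        self.clock = clock
        self.last = clock()

    def allow(self):
        """Return True if one I/O may proceed now, False if it is throttled."""
        now = self.clock()
        # Refill tokens in proportion to elapsed time, capped at the burst size.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A noisy tenant capped at 100 IOPS cannot starve its neighbours: of a
# sudden burst of 1,000 requests, only roughly the first 100 are admitted.
noisy = TenantThrottle(iops_limit=100)
granted = sum(noisy.allow() for _ in range(1000))
```

Each tenant gets its own bucket, so one tenant exhausting its tokens has no effect on the admission decisions made for any other.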

“The workloads our department manages include CAD files for land mapping, geographic information system (GIS) applications and satellite imagery for the over 9.2 million acres of State Trust lands we’re responsible to oversee. The data we manage is tied directly to our goal to make this information available and to increase its analytical capabilities,” said William Reed, CTO at the Arizona State Land Department, one of IBM’s customers. “After exhaustive, comparative proof of concept testing we chose IBM’s FlashSystem, which has helped to increase our client productivity by 7 times while reducing our virtual machine boot times by over 85 percent.”

Intel prioritizes cloud, IoT and 5G in new business strategy

Intel has outlined a new business strategy to capitalize on new trends within the industry, including cloud technology, IoT and 5G.

Speaking on the company’s blog, CEO Brian Krzanich outlined the organization’s new strategy, which is split into five sections: cloud technology; IoT; memory and programmable solutions; 5G; and developing new technologies under the concept of Moore’s Law.

“Our strategy itself is about transforming Intel from a PC company to a company that powers the cloud and billions of smart, connected computing devices,” said Krzanich. “But what does that future look like? I want to outline how I see the future unfolding and how Intel will continue to lead and win as we power the next generation of technologies.

“There is a clear virtuous cycle here – the cloud and data centre, the Internet of Things, memory and FPGAs are all bound together by connectivity and enhanced by the economics of Moore’s Law. This virtuous cycle fuels our business, and we are aligning every segment of our business to it.”

Krzanich believes the virtualization and software trends which are redefining the concept of the data centre align well with the Intel business model and future proposition, given the company’s position in the high-performance computing food chain. Through continued investment in analytics, big data and machine learning technologies, the company aims to drive more of the data centre footprint onto Intel architecture.

The company’s play for the potentially lucrative IoT market will be built on the phrase ‘connected to the cloud’. Intel has highlighted it will focus on autonomous vehicles, industrial and retail as its primary growth drivers of the Internet of Things, combining its capabilities within the cloud ecosystem to drive growth within IoT.

While there were a number of buzzwords and trends highlighted throughout Krzanich’s post, Moore’s Law appeared to receive particular attention. Although generally considered a plausible theory, Moore’s Law is regularly written off within the industry, a view with which Krzanich did not agree.

“In my 34 years in the semiconductor industry, I have witnessed the advertised death of Moore’s Law no less than four times,” said Krzanich. “As we progress from fourteen nanometer technology to ten nanometer and plan for seven nanometer and five nanometer and even beyond, our plans are proof that Moore’s Law is alive and well. Intel’s industry leadership of Moore’s Law remains intact, and you will see continued investment in capacity and R&D to ensure so.”

Krzanich’s comments provide more clarity to last week’s announcement on how the company would restructure the business to accelerate its transformation project, and also to its quarterly earnings. The data centre and Internet of Things (IoT) businesses would appear to be Intel’s primary growth engines, delivering $2.2 billion in revenue growth last year and accounting for roughly 40% of revenue across the period.

The transformation project itself is part of a long-term ambition for the business, as it aims to move the perception of the company away from client computing (PCs and mobile devices) and towards IoT and the cloud. The announcements over the last week have had mixed results in the market; following its quarterly results the share price rose slightly, though it has declined over the subsequent days.

Verizon launches NFV OpenStack cloud deployment over five data centres

Verizon has completed the launch of its NFV OpenStack cloud deployment project across five of its US data centres, alongside Big Switch Networks, Dell and Red Hat.

The NFV project is claimed to be the largest OpenStack deployment in the industry, and the company is currently expanding it to a number of domestic data centres and aggregation sites. The company also expects the deployment to be adopted in edge network sites by the end of the year, as well as at a number of Verizon’s international locations, though a time-frame for the international sites was not disclosed.

“Building on our history of innovation, this NFV project is another step in building Verizon’s next-generation network – with implications for the industry,” said Adam Koeppe, VP of Network Technology Planning at Verizon. “New and emerging applications are highlighting the need for collaborative research and development in technologies like NFV. We consider this achievement to be foundational for building the Verizon cloud that serves our customers’ needs anywhere, anytime, any app.”

Verizon worked with Big Switch Networks, Dell and Red Hat to develop the OpenStack pod-based design, which went from idea to the deployment of more than 50 racks across five data centres in nine months. The design includes a spine-leaf fabric for each pod, controlled through a Neutron plugin to the Red Hat OpenStack Platform. The multi-vendor project uses Big Switch’s SDN controller software to manage Dell switches, orchestrated by the Red Hat OpenStack Platform.
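For readers unfamiliar with the term, a spine-leaf fabric connects every leaf (top-of-rack) switch to every spine switch, so any two servers in the pod are at most two switch hops apart. A toy model illustrates the property (switch counts are illustrative, not Verizon’s actual design):

```python
import itertools

def build_spine_leaf(num_spines, num_leaves):
    """Model the fabric as a full bipartite mesh: every leaf links to every spine."""
    return {(f"leaf{l}", f"spine{s}")
            for l in range(num_leaves) for s in range(num_spines)}

def hops(leaf_a, leaf_b, links):
    """Switch hops between two leaves: leaf -> any shared spine -> leaf."""
    if leaf_a == leaf_b:
        return 0
    shared = {s for (l, s) in links if l == leaf_a} & \
             {s for (l, s) in links if l == leaf_b}
    return 2 if shared else None

links = build_spine_leaf(num_spines=4, num_leaves=16)

# Every leaf pair is exactly two hops apart, giving the uniform east-west
# latency that makes this topology popular for cloud pods.
assert all(hops(a, b, links) == 2
           for a, b in itertools.combinations([f"leaf{l}" for l in range(16)], 2))
```

Because the path length is the same between any pair of racks, capacity can be scaled by adding spines without redesigning the pod.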

“Dell’s Open Networking initiative delivers on the promise of bringing innovative technology, services and choice to our customers and Verizon’s NFV project is a testament to that vision,” said Tom Burns, GM of Dell’s networking business unit. “With the open source leadership of Red Hat, the SDN expertise of Big Switch and the infrastructure, service and support at scale from Dell, this deployment demonstrates a level of collaboration that sets the tone for the Open Networking ecosystem. This is just the beginning.”

Equinix launches Data Hub solution for customers on the edge

Data centre company Equinix has launched its Data Hub solution to enable enterprise customers to develop large data repositories and dispense, consume and process the data at the edge.

As part of an Interconnection Oriented Architecture, the Data Hub is a bundled solution consisting of pre-configured colocation and power, combined with cloud-integrated data storage solutions, and will work in conjunction with the company’s Performance Hub solution. The company highlighted that while the Performance Hub solves for the deployment of network gear and interconnection inside Equinix data centres, Data Hub enables the deployment of IT gear integrated with Performance Hub.

The launch builds on a number of trends within the industry, including the growing volume of data utilized by enterprise organizations brought on by the implementation of IoT and big data capabilities. According to research from Statista, the number of connected devices is forecast to reach 50 billion units worldwide by 2020. The company believes the healthy growth of data consumption will increase the need for organizations to rethink their existing IT infrastructure and develop an Interconnection Oriented Architecture at the edge.

“Data, and its exponential growth, continues to be an ongoing concern for enterprise IT,” said Lance Weaver, VP of Product Offers and Platform Strategy at Equinix. “And there is no expectation it will slow down in the near future. To keep up with this relentless data rise, it is critical for the enterprise to rethink its IT architecture and focus on an interconnection-first strategy. Data Hub is the newest solution from Equinix and we are confident that it will provide our enterprise customers with the data management solutions they need today, while providing for growth tomorrow.”

The company has claimed there are a number of use cases, including cloud-integrated tiered storage and big data analytics infrastructure, as well as multi-site deployment for data redundancy, allowing data to be synchronously replicated by an enterprise.
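Synchronous replication, mentioned in the multi-site use case, means a write is acknowledged to the application only after every site has stored it, so the copies can never diverge. A minimal in-memory sketch of the principle (site names and structure are hypothetical, not Equinix’s product):

```python
class Site:
    """One data centre location holding a copy of the data."""
    def __init__(self, name):
        self.name = name
        self.store = {}

    def commit(self, key, value):
        self.store[key] = value
        return True  # acknowledge the write

class SyncReplicator:
    """Acknowledge a write only after all sites have committed it."""
    def __init__(self, sites):
        self.sites = sites

    def write(self, key, value):
        acks = [site.commit(key, value) for site in self.sites]
        # The caller's write completes only when every copy exists.
        return all(acks)

primary, secondary = Site("LD-A"), Site("LD-B")  # made-up site names
repl = SyncReplicator([primary, secondary])
ok = repl.write("order-42", {"status": "paid"})
# After an acknowledged write, both sites hold identical data.
assert ok and primary.store == secondary.store
```

The trade-off is latency: the acknowledgement cannot return faster than the slowest site, which is why synchronous replication is usually confined to metro-distance sites such as those within one city.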

“With the explosive growth of mobile, social, cloud and big data, an enterprise data center strategy needs to evolve from merely housing servers to becoming the foundation for new data-driven business models,” said Dan Vesset, Group VP, Analytics and Information Management at research firm IDC. “Equinix, with its global platform of data centers and interconnection-first approach, offers the type of platform needed to create a flexible datacenter environment for innovation – specifically in the realm of data management at the network edge.”

Google continues public cloud charge with 12 new data centres

Google has continued its expansion plans in the public cloud sector after announcing it will open 12 new data centres by the end of 2017.

In recent weeks, Google has been expanding its footprint in the cloud space with rumoured acquisitions, hires of industry big-hitters and blue-chip client wins, however its new announcement adds weight to the moves. With two new data centres to open in Oregon and Tokyo by the end of 2016, and a further ten by the end of 2017, Google is positioning itself to challenge Microsoft and AWS for market share in the public cloud segment.

“We’re opening these new regions to help Cloud Platform customers deploy services and applications nearer to their own customers, for lower latency and greater responsiveness,” said Varun Sakalkar, Product Manager at Google. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance.”

Google currently operates in four cloud regions and the new data centres will give the company a presence in 15. AWS and Microsoft have built a market-share lead over Google thanks in part to the fact that they operate in 12 and 22 regions respectively, with Microsoft planning to open a further five.

Recent findings from Synergy Research Group show AWS is still the clear leader in the cloud space with a market share of 31%, with Microsoft accounting for 9% and Google controlling 4%. Owing to its private and hybrid cloud offerings, IBM accounts for 7% of the global market according to Synergy.

Growth at AWS was measured at 63%, whereas Microsoft and Google reported 124% and 108% respectively. Industry insiders have told BCN that Microsoft and Google have been making moves to improve their offerings with talent and company acquisitions. Greater proactivity in the market from the two challengers could explain the difference in growth figures over the last quarter.

Alongside the new data centres, Google’s cloud business leader Diane Greene has announced a change to the way the company operates its sales and marketing divisions. According to Bloomberg Business, Greene told employees that Google will be going on a substantial recruitment drive, while also changing the way it sells its services, focusing more on customer interaction and feedback. This practice would not be seen as unusual for its competitors, however Google’s model has been so far built on the idea of customer self-service. The cloud sales team on the west coast has already doubled in size to fifty, with the team planning on widening this recruitment drive.

While Google’s intentions have been made clear over recent months, there are still some who remain unconvinced. 451 Group Lead Analyst Carl Brooks believes the company is still not at the same level as its competitors, needing to add more enterprise compatibility, compliance, and security features. “They are probably the most advanced cloud operation on the planet. It also doesn’t matter,” he said.

US revealed to have 46% of all data centres despite EU concerns

New findings from Synergy Research Group show that 46% of major cloud and internet data centre sites are located in the US, with second-placed China accounting for only 7%.

The research is based on an analysis of the data centre footprint of 17 of the world’s major cloud and internet service firms and highlights the dominance of the US in the cloud market place. Japan is listed at third with a 6% market share and Germany was the largest European player with just 4%.

“Given that explosive growth in cloud usage is a global phenomenon, it is remarkable that the US still accounts for almost half of the world’s major data centres, but that is a reflection of the US dominance of cloud and internet technologies,” said John Dinsdale, Research Director at Synergy Research Group.

Considering the dominance of AWS, Microsoft and Google in the cloud market space, it’s unsurprising that the US is top of the rankings, though recent concerns from European countries regarding movement of its citizens’ data outside of the EU could complicate matters. Germany is one country which is sensitive to any changes in data protection policy and is considered to have some of the most stringent data protection laws worldwide.

“The other leading countries are there due to either their scale or the unique characteristics of their local markets. Perhaps the biggest surprise is that the UK does not feature more prominently, but that situation will change this year with AWS, Microsoft and Google all opening major data centres in the country,” said Dinsdale.

Back in October, the European Court of Justice decided that Safe Harbour did not give data transfers between Europe and the US adequate protection, and declared the agreement, which had been in place since 2000, void. The EU-US Privacy Shield, Safe Harbour’s successor, has also come under criticism in recent weeks, as concerns have been raised over how much protection the reformed regulations actually offer European parties.

While the new agreement has been initially accepted, privacy activist Max Schrems, who has been linked to the initial downfall of Safe Harbour, said in a statement reacting to Privacy Shield: “Basically, the US openly confirms that it violates EU fundamental rights in at least six cases. The commission claims that there is no ‘bulk surveillance’ any more, when its own documents say the exact opposite.” A letter from Robert Litt, General Counsel of the Office of the Director of National Intelligence, confirmed that there were six circumstances in which the NSA will be allowed to use data for undefined “counter-terrorism” purposes.

While the concentration of data centres in the US should not come as a huge surprise, it puts into further context the fears of European parties who are concerned with the effectiveness of any EU-US data protection policies.

Toyota to build massive data centre and recruit partners to support smart car fleet

Car maker Toyota is to build a massive new IT infrastructure and data centre to support all the intelligence to be broadcast by its future range of smart cars. It is also looking for third-party partners to develop supporting services for its new fleet of connected vehicles.

The smart car maker unveiled its plans for a connected vehicle framework at the 2016 Consumer Electronics Show (CES) in Las Vegas.

A new data centre will be constructed and dedicated to collecting information from new Data Communication Modules (DCM), which are to be installed on the frameworks of all new vehicles. The Toyota Big Data Center (TBDC) – to be stationed in Toyota’s Smart Center – will analyse everything sent by the DCMs and ‘deploy services’ in response. As part of the connected car regime, Toyota cars could automatically summon the emergency services in response to all accidents, with calls being triggered by the release of an airbag. The airbag-induced emergency notification system will come as a standard feature, according to Toyota.
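Toyota has not published the DCM message format, but an airbag-triggered notification of this kind is essentially an event rule evaluated against incoming vehicle telemetry. A hypothetical sketch (field names and the dispatch interface are invented for illustration):

```python
def handle_telemetry(event, dispatch):
    """Trigger an emergency notification when a vehicle reports airbag deployment.

    `event` is a hypothetical telemetry dict from a car's Data Communication
    Module; `dispatch` is whatever callable summons the emergency services.
    """
    if event.get("airbag_deployed"):
        dispatch({
            "vin": event["vin"],
            "location": event.get("gps"),
            "reason": "airbag_deployment",
        })
        return True  # emergency call placed
    return False     # routine telemetry, no action

# A deployment event produces exactly one emergency call:
calls = []
handle_telemetry(
    {"vin": "JT123", "airbag_deployed": True, "gps": (35.05, 137.15)},
    dispatch=calls.append,
)
```

In a real system the rule would run server-side in the Toyota Big Data Center rather than in the vehicle, so the car only needs to report raw sensor state.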

The new data comms modules will appear as a feature in 2017 for Toyota models in the US market only, but the company will roll out the service into other markets later, as part of a plan to build a global DCM architecture by 2019. A global rollout is impossible until devices are standardised across the globe, it said.

Toyota said it is to invite third party developers to create services that will use the comms modules. It has already partnered with UIEvolution, which is building apps to provide vehicle data to Toyota-authorised third-party service providers.

Elsewhere at CES, Nvidia unveiled artificial-intelligence technology that will let cars sense the environment and decide their best course. NVIDIA CEO Jen-Hsun Huang promised that the DRIVE PX 2 will have ten times the performance of the first model. The new version will use an automotive supercomputing platform with 8 teraflops of processing power that can process 24 trillion deep learning operations a second.

Volvo said that next year it will lease out 100 XC90 luxury sports utility vehicles that will use DRIVE PX 2 technology to drive autonomously around Volvo’s hometown of Gothenburg. “The rear-view mirror is history,” said Huang.