Category archive: data centre

Intel prioritizes cloud, IoT and 5G in new business strategy

Intel has outlined a new business strategy to capitalize on emerging trends in the industry, including cloud technology, IoT and 5G.

Speaking on the company’s blog, CEO Brian Krzanich outlined the organization’s new strategy, which is split into five areas: cloud technology, IoT, memory and programmable solutions, 5G, and developing new technologies under the concept of Moore’s Law.

“Our strategy itself is about transforming Intel from a PC company to a company that powers the cloud and billions of smart, connected computing devices,” said Krzanich. “But what does that future look like? I want to outline how I see the future unfolding and how Intel will continue to lead and win as we power the next generation of technologies.

“There is a clear virtuous cycle here – the cloud and data centre, the Internet of Things, memory and FPGAs are all bound together by connectivity and enhanced by the economics of Moore’s Law. This virtuous cycle fuels our business, and we are aligning every segment of our business to it.”

Krzanich believes virtualization and software trends, which are redefining the concept of the data centre, align well with Intel’s business model and future proposition, given the company’s position in the high-performance computing food chain. Through continued investment in analytics, big data and machine learning technologies, the company aims to move more of the data centre footprint onto Intel architecture.

The company’s play for the potentially lucrative IoT market will be built around the idea of being ‘connected to the cloud’. Intel has highlighted that it will focus on autonomous vehicles, industrial and retail as its primary growth drivers for the Internet of Things, combining these with its capabilities in the cloud ecosystem to drive growth within IoT.

While a number of buzzwords and trends were highlighted throughout Krzanich’s post, Moore’s Law received particular attention. Though generally considered to still hold, Moore’s Law is regularly written off within the industry, a point with which Krzanich clearly did not agree.

“In my 34 years in the semiconductor industry, I have witnessed the advertised death of Moore’s Law no less than four times,” said Krzanich. “As we progress from fourteen nanometer technology to ten nanometer and plan for seven nanometer and five nanometer and even beyond, our plans are proof that Moore’s Law is alive and well. Intel’s industry leadership of Moore’s Law remains intact, and you will see continued investment in capacity and R&D to ensure so.”

Krzanich’s comments add clarity to last week’s announcement of how Intel would be restructuring the business to accelerate its transformation project, as well as to its quarterly earnings. The data centre and Internet of Things (IoT) businesses would appear to be Intel’s primary growth engines, delivering $2.2 billion in revenue growth last year and accounting for roughly 40% of revenue across the period.

The transformation project itself is part of a long-term ambition for the business, as it aims to move the perception of the company away from client computing (PCs and mobile devices) and towards IoT and the cloud. The announcements over the last week have had mixed results in the market; following the quarterly results the share price rose slightly, though it has declined over the subsequent days.

Verizon launches NFV OpenStack cloud deployment over five data centres

Verizon has completed the launch of its NFV OpenStack cloud deployment project across five of its US data centres, alongside Big Switch Networks, Dell and Red Hat.

The NFV project is claimed to be the largest OpenStack deployment in the industry, and Verizon is currently expanding it to a number of domestic data centres and aggregation sites. The company also expects the deployment to reach edge network sites by the end of the year, as well as a number of Verizon’s international locations, though a time-frame for the international sites was not disclosed.

“Building on our history of innovation, this NFV project is another step in building Verizon’s next-generation network – with implications for the industry,” said Adam Koeppe, VP of Network Technology Planning at Verizon. “New and emerging applications are highlighting the need for collaborative research and development in technologies like NFV. We consider this achievement to be foundational for building the Verizon cloud that serves our customers’ needs anywhere, anytime, any app.”

Verizon worked with Big Switch Networks, Dell and Red Hat to develop the OpenStack pod-based design, which went from idea to deployment of more than 50 racks across five data centres in nine months. The design includes a spine-leaf fabric for each pod, controlled through a Neutron plugin to the Red Hat OpenStack Platform. The multi-vendor project uses Big Switch’s SDN controller software to manage Dell switches, orchestrated by the Red Hat OpenStack Platform.
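The article does not detail how Verizon’s plugin is configured, but for readers unfamiliar with the mechanism, the sketch below shows how a third-party fabric controller is typically wired into Neutron’s ML2 framework. It is illustrative only: option names loosely follow the open-source networking-bigswitch plugin, exact names vary by plugin and OpenStack release, and the controller addresses and credentials are placeholders.

```ini
# ml2_conf.ini -- illustrative sketch of attaching an SDN fabric
# controller to Neutron's ML2 framework; treat all names as indicative.
[ml2]
type_drivers = vlan,vxlan
tenant_network_types = vxlan
# The vendor mechanism driver sits alongside the standard OVS driver
mechanism_drivers = openvswitch,bsn_ml2

[restproxy]
# Redundant SDN controller pair -- placeholder addresses
servers = 192.0.2.10:8443,192.0.2.11:8443
server_auth = admin:CHANGE_ME
server_ssl = True
```

In a spine-leaf pod design like the one described, Neutron hands network create/update calls to the mechanism driver, which programs the fabric controllers, so the physical switches follow the OpenStack orchestration layer rather than being configured by hand.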

“Dell’s Open Networking initiative delivers on the promise of bringing innovative technology, services and choice to our customers and Verizon’s NFV project is a testament to that vision,” said Tom Burns, GM of Dell’s networking business unit. “With the open source leadership of Red Hat, the SDN expertise of Big Switch and the infrastructure, service and support at scale from Dell, this deployment demonstrates a level of collaboration that sets the tone for the Open Networking ecosystem. This is just the beginning.”

Equinix launches Data Hub solution for customers on the edge

Data centre company Equinix has launched its Data Hub solution to enable enterprise customers to develop large data repositories and to dispense, consume and process that data at the edge.

As part of an Interconnection Oriented Architecture, Data Hub is a bundled solution consisting of pre-configured colocation and power combined with cloud-integrated data storage, and will work in conjunction with the company’s Performance Hub solution. The company highlighted that while Performance Hub addresses the deployment of network gear and interconnection inside Equinix data centres, Data Hub enables the deployment of IT gear integrated with Performance Hub.

The launch builds on a number of trends within the industry, including the growing volume of data utilized by enterprise organizations as a result of IoT and big data initiatives. According to research from Statista, the number of connected devices is forecast to reach 50 billion units worldwide by 2020. The company believes this sustained growth in data consumption will increase the need for organizations to rethink their existing IT infrastructure and develop an Interconnection Oriented Architecture at the edge.

“Data, and its exponential growth, continues to be an ongoing concern for enterprise IT,” said Lance Weaver, VP, Product Offers and Platform Strategy at Equinix. “And there is no expectation it will slow down in the near future. To keep up with this relentless data rise, it is critical for the enterprise to rethink its IT architecture and focus on an interconnection-first strategy. Data Hub is the newest solution from Equinix and we are confident that it will provide our enterprise customers with the data management solutions they need today, while providing for growth tomorrow.”

The company has claimed there are a number of use cases, including cloud-integrated tiered storage, big data analytics infrastructure, and multi-site deployment for data redundancy, allowing data to be synchronously replicated by an enterprise.

“With the explosive growth of mobile, social, cloud and big data, an enterprise data center strategy needs to evolve from merely housing servers to becoming the foundation for new data-driven business models,” said Dan Vesset, Group VP, Analytics and Information Management at research firm IDC. “Equinix, with its global platform of data centers and interconnection-first approach, offers the type of platform needed to create a flexible datacenter environment for innovation – specifically in the realm of data management at the network edge.”

Google continues public cloud charge with 12 new data centres

Google has continued its expansion in the public cloud sector, announcing it will open 12 new data centres by the end of 2017.

In recent weeks, Google has been expanding its footprint in the cloud space with rumoured acquisitions, hires of industry big-hitters and blue-chip client wins; the new announcement adds weight to those moves. With two new data centres to open in Oregon and Tokyo by the end of 2016, and a further ten by the end of 2017, Google is positioning itself to challenge Microsoft and AWS for market share in the public cloud segment.

“We’re opening these new regions to help Cloud Platform customers deploy services and applications nearer to their own customers, for lower latency and greater responsiveness,” said Varun Sakalkar, Product Manager at Google. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance.”

Google currently operates in four cloud regions, and the new data centres will give the company a presence in 15. AWS and Microsoft have built a market-share lead over Google thanks in part to their broader footprints, operating in 12 and 22 regions respectively, with Microsoft planning to open a further five.

Recent findings from Synergy Research Group show AWS is still the clear leader in the cloud space with a market share of 31%, while Microsoft accounts for 9% and Google controls 4%. Owing to its private and hybrid cloud offerings, IBM accounts for 7% of the global market, according to Synergy.

Growth at AWS was measured at 63%, whereas Microsoft and Google reported 124% and 108% respectively. Industry insiders have told BCN that Microsoft and Google have been making moves to improve their offerings, with talent and company acquisitions. Greater proactivity in the market from the two challengers could explain the difference in growth figures over the last quarter.
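As a back-of-the-envelope illustration of why those growth differentials matter, the short calculation below projects the quoted figures forward. Treating share points as a revenue proxy is a rough simplification for illustration only, not Synergy’s methodology or a forecast.

```python
import math

# Figures quoted above (Synergy Research Group): market share and
# year-on-year growth for AWS and Google.
aws_share, aws_growth = 31.0, 0.63
google_share, google_growth = 4.0, 1.08

# Years until Google would catch AWS if both sustained these rates
# indefinitely -- an unrealistic assumption, but it shows the scale
# of the gap the challengers face.
years = math.log(aws_share / google_share) / \
        math.log((1 + google_growth) / (1 + aws_growth))
print(f"~{years:.1f} years to parity at sustained growth rates")  # ~8.4
```

Even at triple-digit growth, in other words, closing a 31%-to-4% gap takes most of a decade, which is why the challengers are also competing on footprint and enterprise features rather than growth rate alone.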

Alongside the new data centres, Google’s cloud business leader Diane Greene has announced a change to the way the company operates its sales and marketing divisions. According to Bloomberg Business, Greene told employees that Google will be going on a substantial recruitment drive, while also changing the way it sells its services, focusing more on customer interaction and feedback. Such a practice would not be unusual among its competitors; Google’s model, however, has so far been built on the idea of customer self-service. The cloud sales team on the west coast has already doubled in size to fifty, and the company plans to widen this recruitment drive.

While Google’s intentions have been made clear over recent months, there are still some who remain unconvinced. 451 Group Lead Analyst Carl Brooks believes the company is still not at the same level as its competitors, needing to add more enterprise compatibility, compliance, and security features. “They are probably the most advanced cloud operation on the planet. It also doesn’t matter,” he said.

US revealed to have 46% of all data centres despite EU concerns

New findings from Synergy Research Group show that 46% of major cloud and internet data centre sites are located in the US, with second-placed China accounting for just 7%.

The research is based on an analysis of the data centre footprint of 17 of the world’s major cloud and internet service firms, and highlights the dominance of the US in the cloud marketplace. Japan is listed third with a 6% share, and Germany was the largest European player with just 4%.

“Given that explosive growth in cloud usage is a global phenomenon, it is remarkable that the US still accounts for almost half of the world’s major data centres, but that is a reflection of the US dominance of cloud and internet technologies,” said John Dinsdale, Research Director at Synergy Research Group.

Considering the dominance of AWS, Microsoft and Google in the cloud market, it is unsurprising that the US tops the rankings, though recent concerns from European countries over the movement of their citizens’ data outside the EU could complicate matters. Germany is one country which is particularly sensitive to any changes in data protection policy, and is considered to have some of the most stringent data protection laws worldwide.

“The other leading countries are there due to either their scale or the unique characteristics of their local markets. Perhaps the biggest surprise is that the UK does not feature more prominently, but that situation will change this year with AWS, Microsoft and Google all opening major data centres in the country,” said Dinsdale.

Back in October, the European Court of Justice ruled that Safe Harbour did not give data transfers between Europe and the US adequate protection, declaring the agreement, which had been in place since 2000, void. The EU-US Privacy Shield, Safe Harbour’s successor, has also come under criticism in recent weeks, as concerns have been raised over how much protection the reformed regulations actually offer European parties.

While the new agreement has been initially accepted, privacy activist Max Schrems, who was instrumental in the initial downfall of Safe Harbour, said in a statement reacting to Privacy Shield: “Basically, the US openly confirms that it violates EU fundamental rights in at least six cases. The commission claims that there is no ‘bulk surveillance’ any more, when its own documents say the exact opposite.” A letter from Robert Litt, General Counsel of the Office of the Director of National Intelligence, confirmed that there were six circumstances in which the NSA would be allowed to use data for undefined “counter-terrorism” purposes.

While the concentration of data centres in the US should not come as a huge surprise, it puts into further context the fears of European parties concerned about the effectiveness of any EU-US data protection policies.

Toyota to build massive data centre and recruit partners to support smart car fleet

Car maker Toyota is to build a massive new IT infrastructure and data centre to support all the intelligence to be broadcast by its future range of smart cars. It is also looking for third-party partners to develop supporting services for its new fleet of connected vehicles.

The smart car maker unveiled its plans for a connected vehicle framework at the 2016 Consumer Electronics Show (CES) in Las Vegas.

A new data centre will be constructed and dedicated to collecting information from new Data Communication Modules (DCM), which are to be installed on the frameworks of all new vehicles. The Toyota Big Data Center (TBDC) – to be stationed in Toyota’s Smart Center – will analyse everything sent by the DCMs and ‘deploy services’ in response. As part of the connected car regime, Toyota cars could automatically summon the emergency services in response to all accidents, with calls being triggered by the release of an airbag. The airbag-induced emergency notification system will come as a standard feature, according to Toyota.
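Toyota has not published how the DCM-to-Smart-Center pipeline is implemented, but the event-driven flow described above, in which an airbag deployment automatically triggers an emergency notification, can be sketched roughly as follows. Every name, field and function here is a hypothetical illustration, not Toyota’s actual API.

```python
import json

def on_dcm_event(event: dict) -> None:
    """Handle an event reported by a vehicle's Data Communication Module."""
    # Airbag deployment is the trigger for the automatic emergency call.
    if event.get("type") == "airbag_deployed":
        notify_emergency_services(event)

def notify_emergency_services(event: dict) -> None:
    """Assemble the payload an emergency-call service would need."""
    payload = {
        "vehicle_id": event["vehicle_id"],
        "location": event["gps"],          # last GPS fix from the DCM
        "event": "airbag_deployed",
        "timestamp": event["timestamp"],
    }
    # A real system would transmit this to the Smart Center / emergency
    # call centre over the cellular link; here we just print it.
    print("emergency notification:", json.dumps(payload))

# Example event, as a DCM might report it (all values invented):
on_dcm_event({
    "vehicle_id": "JT-EXAMPLE-123",
    "type": "airbag_deployed",
    "gps": {"lat": 36.2048, "lon": 138.2529},
    "timestamp": "2017-01-05T12:00:00Z",
})
```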

The new data comms modules will appear as a feature in 2017 on Toyota models in the US market only, but the company will roll the service out to other markets later, as part of a plan to build a global DCM architecture by 2019. A global rollout is impossible until devices are standardised across the globe, it said.

Toyota said it is to invite third party developers to create services that will use the comms modules. It has already partnered with UIEvolution, which is building apps to provide vehicle data to Toyota-authorised third-party service providers.

Elsewhere at CES, Nvidia unveiled artificial intelligence technology that will let cars sense their environment and decide their best course. Nvidia CEO Jen-Hsun Huang promised that the DRIVE PX 2 will have ten times the performance of the first model. The new version is an automotive supercomputing platform with 8 teraflops of processing power, capable of 24 trillion deep learning operations a second.
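Those two figures are counted in different units, which is easy to gloss over: dividing the quoted numbers gives three “deep learning operations” per general-purpose flop, which suggests the deep-learning figure counts lower-precision operations separately. That reading is an inference from the arithmetic, not a detail Nvidia spelled out here.

```python
flops = 8e12    # quoted general-purpose processing power (8 teraflops)
dl_ops = 24e12  # quoted deep learning operations per second
print(dl_ops / flops)  # -> 3.0 deep-learning ops counted per flop
```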

Volvo said that next year it will lease out 100 XC90 luxury sport utility vehicles that will use DRIVE PX 2 technology to drive autonomously around Volvo’s hometown of Gothenburg. “The rear-view mirror is history,” said Huang.

Criteo to build giant private big data platform on Huawei servers

Performance marketing specialist Criteo has chosen Huawei to supply 700 servers for its new Hadoop cluster data centre in Pantin, Seine-Saint-Denis, near Paris.

Huawei won the tender after its FusionServer RH2288H V3 impressed in a strict comparative study, the company says. The servers were chosen for their abundance of high-capacity disks, which give the Criteo data centre better storage density and cut energy consumption by 10 per cent, it claims.

The new Hadoop platform of Huawei servers will boost Criteo’s processing performance by 30 per cent, it is claimed. In the first stage of the project, the 700 machines in the Paris data centre outperformed Criteo’s Amsterdam data centre in terms of computing power and storage, even though the Dutch site has 1,200 servers at its disposal, according to Criteo’s Senior Engineering Manager for Infrastructure Operations, Matthieu Blumberg.

“This is the biggest private Hadoop platform in Europe as of today,” said Blumberg, “Huawei has undeniably good ICT solutions and extensive knowledge of Big Data. We were really impressed.”

As a result, Criteo now plans to install up to 5,000 servers, taking up 350 square meters of rack space, at its Pantin data centre. The total power consumption will rise to 2 MW as the power of the Huawei server estate grows, according to Blumberg.
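For a rough sense of scale, the quoted figures work out to about 400 W per server at full build-out, a plausible budget for disk-heavy storage nodes. The calculation below simply divides the stated numbers and is illustrative only.

```python
servers = 5000
total_power_w = 2e6   # quoted total power consumption (2 MW)
rack_area_m2 = 350    # quoted rack space

print(total_power_w / servers)       # -> 400.0 W per server
print(total_power_w / rack_area_m2)  # -> ~5714 W per square metre
```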

“We are proud to have built this partnership with Criteo: this is the kind of project we love to develop,” said Robert Yang, Head of the Huawei France Enterprise Business Group.