All posts by Business Cloud News

HP inks deal with SunEdison to power its cloud with clean energy

HP is the latest cloud player to boost its green credentials

HP signed a 12-year power purchase agreement with SunEdison this week that will see it power its cloud datacentres with renewable energy.

The deal will see SunEdison begin construction on a massive wind farm in Texas, which when completed will generate 300 MW of power. The wind farm will be acquired by TerraForm, a global owner and operator of renewable energy plants, in 2016 once it becomes operational.

HP said the 12 MW of wind power it has agreed to purchase annually is enough to power all of its Texas-based datacentre operations, and will enable the company to reach its 2020 operational greenhouse gas (GHG) emissions reduction goal by the end of this year, five years ahead of schedule.

“This agreement represents the latest step we are taking on HP’s journey to reduce our carbon footprint across our entire value chain, while creating a stronger, more resilient company and a sustainable world,” said Gabi Zedlmayer, vice president and chief progress officer, Corporate Affairs, HP.

“It’s an important milestone in driving HP Living Progress as we work to create a better future for everyone through our actions and innovations,” Zedlmayer said.

Paul Gaynor, executive vice president, Americas and EMEA, SunEdison said: “By powering their data centers with renewable energy, HP is taking an important step toward a clean energy future while lowering their operating costs. At the same time, HP’s commitment allows us to build this project which creates valuable local jobs and ensures Texan electricity customers get cost-effective energy.”

HP is the latest cloud player to bolster its green credentials. Amazon recently announced two clean energy projects in the US within a month of one another, one in Virginia and the other in North Carolina.

Google buys Pixate to strengthen mobile app prototyping, design

Google acquired mobile design and prototyping firm Pixate this week

Google quietly acquired Pixate for an undisclosed sum this week. The company, which offers a platform that helps developers design and prototype mobile apps, may help Google bolster the UX of its own apps while expanding the range of services it already offers to developers.

A post on the Pixate blog written by chief executive Paul Colton confirmed the acquisition.

“Our small team at Pixate has some really big ideas, and with the help of Google we’ll be able to bring those ideas to the design community at scale. We’ve become an essential part of the workflow for tens of thousands of designers, and are excited about expanding our mission at Google to reach millions of product teams worldwide,” Colton explained.

“Starting today we’re making Pixate Studio free and dramatically reducing the cost of the Pixate cloud service,” he added.

Google said “Pixate adds to our ongoing effort to develop new design and prototyping tools.”

Pixate said it counts companies like Apple, Disney and Amazon as past customers. The company’s services will no doubt complement the cloud-based testing service for Android apps Google unveiled earlier this year at its I/O conference. That service, based on technology from Appurify – an acquisition announced at last year’s conference – allows developers to run their applications on simulated versions of thousands of different Android devices.

Verizon, Qualcomm among Mcity partners testing IoT, automated cars

Verizon is teaming up with the University of Michigan to test connected and automated cars

Verizon and Qualcomm are among 15 partners launching Mcity at the University of Michigan this week, a controlled testing environment for connected and automated vehicles that the project participants claim could clear the path for mass-market adoption of driverless cars.

The facility will allow researchers to simulate environments where connected and automated vehicles will be most challenged – for instance where road signs may be defaced by graffiti, or when traffic lights become faulty or break.

“There are many challenges ahead as automated vehicles are increasingly deployed on real roadways,” said Peter Sweatman, director of the University of Michigan Mobility Transformation Center. “Mcity is a safe, controlled, and realistic environment where we are going to figure out how the incredible potential of connected and automated vehicles can be realized quickly, efficiently and safely.”

Michigan – particularly the City of Detroit – has a longstanding (and to some extent troubled) history in automotive, but the university said the facility will help the state regain its leadership in the sector. The project builds on a 3,000-vehicle connected car project launched three years ago and co-funded by the Michigan Economic Development Corporation.

As part of its participation in the project, Verizon will contribute its telematics technology, In-Drive, and offer its own research into vehicle-to-vehicle and vehicle-to-infrastructure technologies. It will also help explore various ways to combine mobility, telematics and IoT services.

Other project partners include Iteris, Navistar, Denso, Ford, General Motors, Qualcomm and Xerox; each partner is investing about $1m into the project over the next three years.

Amit Jain, director of corporate strategy, IoT verticals at Verizon said the project will help create new vendor-agnostic and OEM-agnostic services that could improve road and pedestrian safety.

“Placing the onus on OEMs only to deploy technologies such as Dedicated Short Range Communications (DSRC), for example, could take up to 37 years according to the National Highway Traffic Safety Administration (NHTSA). That’s why creating opportunities like Mcity to pool research and share best practices to expedite innovation is so important,” Jain said.

“Consider the fact that there are more than 30,000 fatalities in the US annually caused by vehicle accidents – of which 14 per cent involve pedestrians. As part of our participation in Mcity, we will be involved in tailored research to explore how smart phones can be used to further enhance vehicle-to-pedestrian communications.”

Verizon has moved over the past few years to bolster its legacy M2M portfolio (industrial M2M, telematics) with the addition of new IoT services, which according to the telco now constitute a growing portion of its overall revenues – particularly connected cars. In a Q2 2015 earnings call with journalists and analysts this week Verizon’s chief financial officer Francis Shammo said that although IoT is still quite a nascent sector it raked in about $165m for the quarter and $320m year-to-date.

“As far as Internet of Things, we think that the transportation, healthcare, and energy industries in particular present great opportunities for us and we are very active fostering innovation in these areas,” Shammo said. “We are very well-positioned to capitalize on these new growth opportunities and we will continue to develop business models to monetize usage on our network and at the platform level.”

Alibaba to bolster cloud performance, proposes data protection pact

Alibaba is boosting the performance of its cloud services and reassuring customers on data protection

Alibaba unveiled a series of performance upgrades to its cloud platform this week in a bid to compete more effectively for big data workloads with other large cloud incumbents, and clarified its position on data protection.

The company said it is adding solid state drive (SSD) backed cloud storage, which will massively improve read-write performance over its existing HDD-based offerings, and virtual private cloud services (VPC) for high performance compute and analytics workloads. It’s also boosting performance with virtualised GPU-based technology.

“The huge amount of data and advanced computing capacity has brought great business opportunities to the industry,” said Wensong Zhang, chief technology officer of Aliyun, Alibaba’s cloud division.

“Deep learning and high-performance computing have been widely adopted in Alibaba Group for internal use. Aliyun will roll out high-performance computing services and accelerators based on GPU technology that could be applied in image recognition and deep learning to expand the boundaries of business,” Zhang said.

The company also released what it is calling a data protection pact. In its proposal Alibaba said customers will have “absolute ownership” over all of the data generated or sent to the company’s cloud services, and the “right to select whatever services they choose to securely process their data.”

It also said it would strengthen its threat protection and disaster recovery capabilities in order to reassure customers of its ability to guard their data – and the data of their clients. The company did not, however, cite any specific standards or internationally recognised guidelines on data protection in its plans.

“Without the self-discipline exercised by the banking industry, the financial and economic prosperity that exists in modern-day society would not have ensued. Similarly, without common consensus and concrete action dedicated to data protection, the future for the [data technology] economy would be dim,” the company said in a statement.

“We hereby promise to strictly abide by this pledge, and encourage the entire industry to collectively exercise the self-regulation that is vital in promoting the sustainable development of this data technology economy.”

Box, Docker, eBay, Google among newly formed Cloud Native Computing Foundation

The Cloud Native Computing Foundation is putting Linux containers at the core of its definition of 'cloud-native' apps

The Linux Foundation, along with a number of enterprises, cloud service providers, telcos and vendors, has banded together to form the Cloud Native Computing Foundation in a bid to standardise and advance Linux containerisation for the cloud.

The newly formed open source foundation, a Linux Foundation collaborative project, plans to create and drive adoption of common container technologies at the orchestration level, and integrate hosts and services by defining common APIs and standards.

The organisation also plans to assemble specifications to address a “comprehensive set of container application infrastructure needs.”

The members at launch include AT&T, Box, Cisco, Cloud Foundry Foundation, CoreOS, Cycle Computing, Docker, eBay, Goldman Sachs, Google, Huawei, IBM, Intel, Joyent, Kismatic, Mesosphere, Red Hat, Switch Supernap, Twitter, Univa, VMware and Weaveworks.

“The Cloud Native Computing Foundation will help facilitate collaboration among developers and operators on common technologies for deploying cloud native applications and services,” said Jim Zemlin, executive director at The Linux Foundation.

“By bringing together the open source community’s very best talent and code in a neutral and collaborative forum, the Cloud Native Computing Foundation aims to advance the state-of-the-art of application development at Internet scale,” Zemlin said.

The central goal of the foundation will be to harmonise container standards and techniques. A big challenge with containers today is that there are many ways to implement them, with a range of ‘open ecosystems’ and vendor-specific approaches creating a heterogeneous, messy pool of technologies that don’t always play well together.

That said, the foundation expects to build on existing open source container initiatives, including Docker’s recently announced Open Container Initiative (OCI), whose container image spec it will work to build into the standards it develops. Google also announced that it would hand governance of Kubernetes, which reached v1.0 this week, over to the foundation.

“Google is committed to advancing the state of computing, and to helping businesses everywhere benefit from the patterns that have proven so effective to us in operating at Internet scale,” said Craig McLuckie, product manager at Google. “We believe that this foundation will help harmonize the broader ecosystem, and are pleased to contribute Kubernetes, the open source cluster scheduler, to the foundation as a seed technology.”

Ben Golub, chief executive of Docker, said that while the OCI offers a solid foundation for container-based computing, many standards and fine details have yet to be agreed.

“At the orchestration layer of the stack, there are many competing solutions and the standard has yet to be defined. Through our participation in the Cloud Native Computing Foundation, we are pleased to be part of a collaborative effort that will establish interoperable reference stacks for container orchestration, enabling greater innovation and flexibility among developers. This is in line with the Docker Swarm integration with Mesos,” Golub said.

Microsoft signs GE in massive cloud deal

General Electric has signed up to use Microsoft's cloud software

Microsoft announced this week that it has signed up long-time tech partner GE to its cloud-based productivity software in a multimillion dollar deal.

The move will see GE deploy Microsoft’s cloud productivity suite Office 365 to GE’s more than 300,000 employees in 170 countries.

Jamie Miller, senior vice president and chief information officer of GE said: “As we deepen our investments in employee productivity, Microsoft’s innovative approach to collaboration made Office 365 our first choice for providing scalable productivity tools to our employees worldwide.”

GE said it will integrate a number of its line-of-business applications with Office 365 and deploy cloud-based email, Skype for Business calling and meetings, real-time document co-authoring, and team collaboration.

“Microsoft and GE share many values in common — openness, transparency, data-driven intelligence and innovation — all of which are driving forces behind Microsoft’s own mission to help people and organizations achieve more,” said John Case, corporate vice president of Microsoft Office. “As one of the most innovative companies in the world, GE understands what it takes to unleash the potential of its employees. We’re delighted GE has selected Office 365 as the productivity and collaboration solution to empower its global workforce.”

GE and Microsoft are longtime technology partners. The two companies have even set up a joint venture together – Caradigm, a company that develops and sells a healthcare technology platform for clinical applications and population management.

Nevertheless, the deal comes at a critical time for Microsoft, and is in some ways a validation of its effort to turn its business around after a number of strategic stumbles by refocusing on its core strengths in software and the cloud. Earlier this month the company said it would write off its entire Nokia acquisition and shed about 7,800 jobs in the process, mostly from its phone business.

Google says trade agreement amendment hinders security vulnerability research

Google says the US DoC amendments would massively hinder its own security research

Google hit out at the US Department of Commerce and the Bureau of Industry and Security this week over proposed amendments to trade legislation related to the Wassenaar Arrangement, a multilateral export control agreement, arguing they will negatively impact cybersecurity vulnerability research.

The Wassenaar Arrangement is a voluntary multilateral agreement among 41 countries intended to control the export of certain “dual use” technologies – including security technologies – and its power depends on each member country passing its own legislation to align its trade laws with the agreement. The US is among the agreement’s members.

As of 2013, software specifically designed or modified to avoid detection by monitoring tools has been included on that list of technologies. A recent proposal put forward by the US DoC and BIS to align national legislation with the agreement suggests adding to the list of potentially regulated technologies “systems, equipment, components and software specially designed for the generation, operation or delivery of, or communication with, intrusion software”, which includes network penetration testing products that use intrusion software to identify vulnerabilities of computers and network-capable devices, as well as “technology for the development of intrusion software”, which includes proprietary research on the vulnerabilities and exploitation of computers and network-capable devices.

Google said the US DoC amendments would effectively force it to obtain thousands of export licenses just to be able to research and develop potential security vulnerabilities, since companies like Google depend on a massive global pool of talent (hackers) that experiments with or uses many of the same technologies the US proposes to regulate.

“We believe that these proposed rules, as currently written, would have a significant negative impact on the open security research community. They would also hamper our ability to defend ourselves, our users, and make the web safer. It would be a disastrous outcome if an export regulation intended to make people more secure resulted in billions of users across the globe becoming persistently less secure,” explained Neil Martin, export compliance counsel at Google Legal, and Tim Willis, hacker philanthropist on the Chrome security team, in a recent blog post.

“Since Google operates in many different countries, the controls could cover our communications about software vulnerabilities, including: emails, code review systems, bug tracking systems, instant messages – even some in-person conversations! BIS’ own FAQ states that information about a vulnerability, including its causes, wouldn’t be controlled, but we believe that it sometimes actually could be controlled information,” the company said.

Google also said the proposed amendment is worded far too vaguely, and called for clarification of both the DoC’s proposals and the Wassenaar Arrangement itself.

“The time and effort it takes to uncover bugs is significant, and the marketplace for these vulnerabilities is competitive. That’s why we provide cash rewards for quality security research that identifies problems in our own products or proactive improvements to open-source products. We’ve paid more than $4 million to researchers from all around the world.”

“If we have information about intrusion software, we should be able to share that with our engineers, no matter where they physically sit,” it said.

Accenture: For most enterprises, IT-as-a-Service will have to wait

Enterprises are slow to adopt ITaaS

Enterprises are looking to adopt IT-as-a-service (ITaaS) models and modernise their digital systems in a bid to become more competitive, but recently published research suggests most aren’t budging on their existing strategies. Michael Corcoran, senior managing director, global growth and strategy at Accenture, the firm that commissioned the research, told BCN that leaning more on cloud services, using analytics and becoming more automated could help them speed up the transition.

The transition to ITaaS is up there with DevOps and Agile when it comes to cultural and organisational modernisation and service improvement. It implies IT moving from being a monolithic procurement centre to a dynamic internal service provider, something most big organisations need to do in order to more effectively compete in digital.

Accenture and HfS Research surveyed 716 enterprise service buyers and found that 53 per cent of senior executives view ITaaS as critical for their organisation, yet 68 per cent of respondents said their core enterprise processes will not be delivered as-a-service for five or more years.

The research suggests this may be partly due to differing opinions or objectives within the organisations polled. More than half of service buyer senior leaders view aaS as critical and 61 per cent are ready to replace legacy providers in order to achieve their desired outcomes. But the same can’t be said for middle managers and delivery staff: just 29 per cent see the value of aaS in the same way.

“Many enterprise operations executives and service providers must make intrinsic changes to how they operate to stay relevant in an uncertain and challenging future,” said Phil Fersht, chief executive and founder, HfS Research. “It’s the forward-thinking service buyers and providers who set out their vision and path forward for sourcing with defined business outcomes aligned to the as-a-service ideals, that will achieve success. The conservative among us who refuse to accept these times of unprecedented, disruptive transition will be competitively challenged.”

Corcoran told BCN that much of the onus is on service providers, which need to invest in developing as-a-service capabilities. But enterprises also need to deploy the right mix of technologies and invest in the right skills to make the transition happen.

“By effectively moving to the cloud and applying the right digital technology, automation, artificial intelligence and analytics to unlock competitive advantage from data, and utilizing talent smartly, companies are in a better position to innovate faster, create new services and drive business outcomes that positively impact their top and bottom-line,” Corcoran explained.

“49 per cent of today’s enterprise buyers expect to move to a “wide-scale transformation of business processes enabled by new technology tools/platforms” in just two years. So it’s clear that many operational leaders are recognizing the need to steer their enterprises away from legacy delivery models and move towards the cloud and its material business outcomes.”

CenturyLink open sources more cloud tech

CenturyLink has open sourced a batch of cloud tools

CenturyLink has open sourced a number of tools aimed at improving provisioning for Chef on VMware infrastructure as well as Docker deployment, orchestration and monitoring.

The projects open sourced by the company include a Chef provisioning driver for vSphere; Lorry.io, a tool for creating, composing and validating Docker images; and imagelayers.io, a tool that improves Docker image visualisation to give developers more visibility into their workloads.

“The embrace of open-source technologies within the enterprise continues to rise, and we are proud to be huge open-source advocates and contributors at CenturyLink,” said Jared Wray, senior vice president of platforms at CenturyLink.

“We believe it’s critical to be active in the open-source community, building flexible and feature-rich tools that enable new possibilities for developers.”

While CenturyLink’s cloud platform is proprietary and developed in-house, Wray has repeatedly said open source technologies form an essential part of the cloud ecosystem – Wray himself was a big contributor to Cloud Foundry, the open source PaaS tool, when developing Iron Foundry.

The company has previously open sourced other tools, too. Last summer it released Panamax, a Docker management platform designed to ease the development and deployment of any application sitting within a Docker environment. It has also open sourced a number of tools designed to help developers assess the total cost of ownership of multiple cloud platforms.

Symantec, Frost Data Capital to incubate startups solving IoT security challenges

Symantec and FDC are to incubate ten IoT security startups per year

Symantec is teaming up with venture capital firm Frost Data Capital to incubate startups primarily developing solutions to secure the Internet of Things.

The companies initially plan to create and seed up to ten early-stage startups with funding, resources and expertise, with Symantec offering access to its own security technologies and Frost Data Capital to its data analytics platforms.

“We’re taking a fresh look at driving innovation in the market and this partnership will enable Symantec to transform raw ideas and concepts into meaningful security companies,” said Jeff Scheel, senior vice president, strategy, alliances and corporate development at Symantec. “By collaborating with Frost Data Capital, we create an environment primed to incubate new, innovative and disruptive startups in cyber security – especially in the realm of IoT technologies where verticals like process control, automotive, health care and energy require specialized skills.”

The goal is to encourage development of threat detection analytics services capable of being applied in IoT architectures, where data volume and velocity can be particularly acute challenges when it comes to security and performance.

“We’re seeing a huge opportunity in the IoT security market,” said John Vigouroux, managing partner and president of Frost Data Capital. “We’re excited to work with Symantec to bring cutting-edge, relevant security analytics solutions to market rapidly, in order to prevent next generation cyber attacks on corporate infrastructures. Symantec brings to the table world-class security technology, global presence and strategic relationships that will be instrumental to launching these startups.”

Symantec and FDC are not the only firms looking to incubate startups with a view towards developing IoT solutions that complement their own offerings. Cisco recently announced significant efforts to incubate French and UK startups innovating in the area of IoT networks, while Intel and Deutsche Telekom unveiled similar moves in Europe last year.