All posts by Business Cloud News

Colt gears up for cloud with new CTO appointment

Colt has announced that its newly appointed Chief Technology Officer, Rajiv Datta, will be given the brief of driving the service provider’s future cloud strategy.

Datta’s duties are described as ‘the creation of the next generation of products and services, including SDN-based networks and digital customer experience’.

Datta joins Colt’s executive leadership team, led by Carl Grivner, who took over as CEO on 1 January. Datta was recruited from fibre infrastructure provider AboveNet Communications, where as chief operating officer he saw annual revenues grow from $190m to $500m. Grivner said Datta was recruited for his track record of transforming businesses.

Competition from cloud-based services such as Skype has affected the revenues of all European telcos, according to Reuters, which reported that Colt exited the wholesale voice market in 2014.

However, according to a recent report from the European Telecommunications Network Operators’ Association (ETNO), 2016 should see a return to growth in the sector, thanks in part to a thinning out of players such as Colt, which has also pulled out of the IT services market.

With an estate of 34 carrier-neutral data centres in Europe and seven managed facilities in Asia Pacific, Colt is to concentrate on cloud and data centre services. However, the data centre market is entering its own phase of consolidation: in November BCN reported that market leader Equinix had bought its rival TelecityGroup. In this climate, Datta’s change management skills will be invaluable, according to Grivner.

“Rajiv’s track record of transforming business performance will be invaluable in creating the levels of focus, speed and innovation that we need,” he said.

Salesforce buys SteelBrick from California and wind power from Virginia

Cloud giant Salesforce.com has announced two new acquisitions, one intended to boost its bottom line and the other to shrink its carbon footprint.

The acquisition of California-based start-up SteelBrick gives it a quoting and billing system for SMEs that runs within the Salesforce cloud platform. Meanwhile, it has announced an agreement to source 40 megawatts of green power from a new West Virginia wind farm through a virtual power purchase agreement (VPPA).

SteelBrick, which announced the takeover on its company blog, makes configure-price-quote and subscription-billing apps for small and medium-sized enterprises. It recently added the subscription billing functions after buying UK-based cloud app maker Invoice IT. The apps automate the processes between researching customers and collecting payment, and clients include Silicon Valley cloud vendors Cloudera and Nutanix.
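For readers unfamiliar with the category, configure-price-quote in miniature looks something like the sketch below. The price book and discount tiers are invented for illustration and bear no relation to SteelBrick’s actual products or APIs.

```python
# Invented price book and volume discounts, for illustration only.
PRICE_BOOK = {"standard": 25.0, "pro": 60.0}               # USD per seat per month
VOLUME_DISCOUNTS = [(100, 0.15), (50, 0.10), (10, 0.05)]   # (min seats, discount)

def quote(tier: str, seats: int) -> float:
    """Configure (pick a product), price (apply discounts), quote (total)."""
    unit_price = PRICE_BOOK[tier]
    discount = next((d for threshold, d in VOLUME_DISCOUNTS if seats >= threshold), 0.0)
    return round(seats * unit_price * (1 - discount), 2)

print(quote("pro", 75))   # 75 seats at $60 with a 10% volume break -> 4050.0
```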

Salesforce already part-owned SteelBrick, having funded the start-up through its investment arm, Salesforce Ventures. In December it announced plans to acquire the rest of SteelBrick for $360 million, and it aims to close the deal by the end of April.

SteelBrick CEO Godard Abel said that, having seen Salesforce pioneer the shift to enterprise cloud computing, company founder Max Rudman had been on a six-year mission to simplify the process of selling. Abel was brought in as CEO, having previously founded BigMachines, which Oracle acquired in 2013.

Meanwhile, San Francisco-based Salesforce is to buy electricity from a West Virginia wind farm approximately 2,700 miles away. The two parties have signed a 12-year wind energy agreement for 40 megawatts (MW) of power, provided through a virtual power purchase agreement (VPPA).

The electricity generated under the agreement is expected to be 125,000 megawatt hours annually, which exceeds Salesforce’s data centre electricity consumption in its full fiscal year 2015. The wind farm is expected to be operational by December 2016 and will deliver clean energy to the same regional electricity grid that currently powers the majority of Salesforce’s data centre load.
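A quick sanity check of those figures, assuming only standard capacity-factor arithmetic (nothing below comes from Salesforce’s announcement):

```python
# Back-of-envelope check of the announced wind deal figures.
capacity_mw = 40
expected_mwh_per_year = 125_000

max_mwh_per_year = capacity_mw * 24 * 365        # 350,400 MWh running flat out
capacity_factor = expected_mwh_per_year / max_mwh_per_year

print(f"{capacity_factor:.0%}")                  # ~36%, plausible for onshore wind
```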

This announcement follows Salesforce’s recent commitments to achieve net-zero greenhouse-gas emissions by 2050 and to power all its global operations with renewable energy.

Pivotal buys UK-based CloudCredo to acquire Cloud Foundry skills

London-based Cloud Foundry services provider CloudCredo has been bought by San Francisco-based software vendor Pivotal, a VMware spin-off. The acquisition includes CloudCredo subsidiary stayUp, which specialises in log analysis.

The logic of the acquisition is that it will make it easier for enterprises to use the new Pivotal Cloud Foundry, according to Pivotal CEO Rob Mee.

CloudCredo will continue to operate from London and serve its existing customers, but its new brief includes driving Pivotal Cloud Foundry’s growth worldwide. CloudCredo’s expertise will be applied to helping enterprise customers adopt the Pivotal Cloud Foundry Cloud Native platform more quickly and fine-tune their techniques for creating the appropriate software.

Cloud Foundry skills are at a premium, as there is a scant supply of IT experts in Europe with the necessary skills for running open source platforms as a service (PaaS), according to analyst James Governor, founder of research company RedMonk. “The pool of Cloud Foundry systems talent in Europe is limited, and service companies with a proven track record are even rarer,” he said.

CloudCredo has extensive knowledge of running Cloud Foundry for some of the world’s largest brands, according to Pivotal CEO Mee. “With this expertise, we can better help our customers adopt Pivotal’s Cloud Native platform more quickly,” he said.

Joining Pivotal means that, overnight, CloudCredo can operate at a global scale, said its CEO Colin Humphreys.

Oracle gets tax breaks to build cloud campus in Texas

Oracle has unveiled plans for a technology campus in Austin, Texas, in a bid to expand its workforce there by 50% in three years. By creating a combined office and housing complex, it hopes to attract millennials who want to live on site while selling cloud computing systems.

Oracle is also to close its Oregon offices and incorporate those facilities into the new Texas complex. No details were given about staff relocation.

The move is part of a state initiative, including tax breaks and low regulation, as Texas positions itself as a home for innovation and technology. “I will continue to pursue policies that invite the expansion and relocation of tech companies to Texas,” said Texas State Governor Greg Abbott.

The site will include cheap accommodation, as Oracle competes for talent in a region with a high concentration of technology start-ups. Its recruitment drive will target graduates and technical professionals early in their careers, with the majority of new jobs created in Oracle’s cloud sales organisation, Oracle Direct.

Oracle is to work with local firms to build the campus. In the first phase it will construct a 560,000 square foot complex on the waterfront of Austin’s Lady Bird Lake, with a 295-apartment housing complex next door for employees.

Austin’s technology community is teeming with creative and innovative thinkers, and the city is a natural choice for investment and growth, claimed Oracle Direct senior VP Scott Armour. “Our campus will inspire, support and attract top talent, with a special focus on the needs of millennials,” said Armour.

Austin’s biggest problems are affordability and mobility, according to Austin’s Mayor Steve Adler. “I look forward to working with Oracle to tackle our biggest challenges,” he said.

AWS opens up EC2 Container Registry to all

Cloud giant Amazon Web Services (AWS) has opened up its technology for storing and managing application container images to the public.

The AWS EC2 Container Registry (ECR) had been exclusively for industry insiders who attended the launch at the AWS re:Invent conference in Las Vegas in October. However, AWS has now decided to level the playing field, its senior product manager Andrew Thomas revealed in a guest post on the blog of AWS chief evangelist Jeff Barr. Thomas invited all interested cloud operators to apply for access.

As containers have become the de facto method for packaging application code, all cloud service providers are competing to fine-tune the process of running code in containers as an alternative to using virtual machines. But developers have reported teething problems to AWS, Thomas says in the post.

ECR, Thomas explains, is a managed Docker container registry designed to simplify the management of Docker container images, which developers have told him is difficult. Self-hosting an image registry becomes unwieldy at scale: a large infrastructure job can involve pulling hundreds of images at once, with the added complexity of spanning two or more AWS regions. AWS customers also wanted fine-grained access control to their images without having to manage certificates or credentials, Thomas said.

Management aside, there is a security dividend too, according to Thomas. “This makes it easier for developers to evaluate potential security threats before pushing to Amazon ECR,” he said. “It also allows developers to monitor their containers running in production.”

There is no charge for transferring data into the Amazon EC2 Container Registry. Storage costs 10 cents per gigabyte per month, and all new AWS customers receive 500MB of storage a month for their first year.
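As a quick illustration of those rates (the 20GB of stored images is an invented example):

```python
STORAGE_RATE_USD_PER_GB_MONTH = 0.10
FREE_TIER_GB = 0.5   # 500MB a month for new customers' first year

def monthly_storage_cost(stored_gb: float, free_tier: bool = True) -> float:
    """ECR storage cost; data transfer in is free, per the article."""
    billable_gb = max(stored_gb - (FREE_TIER_GB if free_tier else 0.0), 0.0)
    return round(billable_gb * STORAGE_RATE_USD_PER_GB_MONTH, 2)

print(monthly_storage_cost(20))   # 20GB of images -> $1.95 a month
```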

The Registry is integrated with Amazon ECS and the Docker CLI (command line interface), in order to simplify development and production workflows. “Users can push container images to Amazon ECR using the Docker CLI from the development machine and Amazon ECS can pull them directly for production,” said Thomas.
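The push itself runs through the Docker CLI, but the registry can also be driven programmatically. Below is a minimal boto3 sketch, with an illustrative repository name and region, of creating a repository and fetching the temporary login token the CLI uses.

```python
import base64

import boto3  # AWS SDK for Python

ecr = boto3.client("ecr", region_name="us-east-1")

# Create a repository to hold container images.
repo = ecr.create_repository(repositoryName="my-app")
repo_uri = repo["repository"]["repositoryUri"]

# Fetch a temporary token; it decodes to a "user:password" pair
# that `docker login` can use against the registry endpoint.
auth = ecr.get_authorization_token()["authorizationData"][0]
user, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
registry = auth["proxyEndpoint"]

# A development machine would then run, for example:
#   docker login -u <user> -p <password> <registry>
#   docker tag my-app:latest <repo_uri>:latest
#   docker push <repo_uri>:latest
```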

The service has been live since December 21st in the US East (Northern Virginia) region, with more regions on the way soon.

ESI installs HPC data centre to support virtual prototyping

Manufacturing service provider ESI Group has announced that a new high performance computing (HPC) system is powering its cloud-based virtual prototyping service for a range of industries across Europe.

The new European HPC-driven data centre is based on the Teratec Campus near Paris, close to Europe’s biggest HPC centre, the Très Grand Centre de Calcul, the data centre of the French Alternative Energies and Atomic Energy Commission (CEA). The location was chosen to make collaborative HPC projects possible, according to ESI. The 13,000 square metre CEA campus houses a supercomputer with a peak performance of 200 teraflops and the CURIE supercomputer, capable of 2 petaflops.

ESI’s virtual prototyping, a product development process that uses computer-aided design (CAD), computer-automated design (CAutoD) and computer-aided engineering (CAE) software to validate designs, is increasingly run in the cloud, the company reports. Before committing to a physical prototype, manufacturers create a 3D computer-generated model and simulate different test environments.

The launch of the new HPC data centre gives ESI a cloud computing point of delivery (PoD) to serve all 40 of ESI’s offices across Europe and the world. The HPC cloud PoD will also act as a platform for ESI’s new software development and engineering services.

The HPC facility was built by data centre specialist Legrand. The new system is needed to meet the change in workloads driven by virtualization and cloud computing, with annual data growth expected to rise from 50% in 2010 to 4,400% in 2020, according to Pascal Perrin, data centre business development manager at Legrand.

Legrand subsidiary Minkels supplied and installed the supporting data centre hardware, including housing, UPS, cooling, monitoring and power distribution systems. The main challenge in supporting a supercomputer, which can ramp up CPU activity by the petaflop while petabytes of data move in and out of memory, is securing the supporting resources, said Perrin. “Our solutions ensure the electrical and digital supply of the data centre at all times,” he said.

Royal Mail bags couriering SaaS specialist NetDespatch

The UK’s Royal Mail has bought cloud-based parcel management specialist NetDespatch in a bid to expand its range of services and global reach.

NetDespatch will operate as an independent standalone subsidiary, so it can continue to serve existing clients and remains free to offer services to Royal Mail competitors in future. NetDespatch directors Matthew Robertson and Matthew Clark will remain in charge of operations, and all existing client terms and conditions will remain unchanged.

The service provider helps carriers (such as its new parent company, Royal Mail) manage the transport of parcels for 130,000 business customers in 100 countries. The NetDespatch parcel management system is a software as a service (SaaS) platform that carriers use to track the movement of parcels. It has grown in popularity because it makes it easier to integrate ecommerce websites, sales order processing and warehouse systems at the point of despatch, and it lets users print shipping labels, customs documents and manifests while automatically pre-advising the carrier of incoming parcels.
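The article does not document NetDespatch’s API, so the sketch below is purely illustrative of the pre-advice workflow it describes: the endpoint, field names and behaviour are all invented.

```python
import json
import urllib.request

# Invented endpoint and payload, illustrating the described workflow only.
PREADVICE_URL = "https://api.example.com/v1/shipments"  # placeholder, not a real URL

shipment = {
    "carrier": "royal-mail",
    "reference": "ORDER-1042",
    "weight_kg": 1.2,
    "destination": {"postcode": "EC1A 1BB", "country": "GB"},
}

# Registering a shipment would, per the workflow described, return
# shipping-label and customs-document URLs and pre-advise the carrier.
request = urllib.request.Request(
    PREADVICE_URL,
    data=json.dumps(shipment).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(request)  # not run: placeholder endpoint
```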

The aim is to make a relatively complex process simple, make logistics more efficient and save money for everyone in the supply chain from retailer to carrier to consumer, said Matthew Robertson, NetDespatch’s Commercial Director. “E-commerce is exploding in the run up to Christmas and we expect to continue to steam ahead in 2016 and beyond,” he said.

NetDespatch’s cloud software has made the integration of the Royal Mail’s systems with its customers’ complex IT estates a lot quicker, according to Nick Landon, Managing Director of Royal Mail Parcels. “This acquisition will support our parcels business with new and innovative software solutions,” he said. The fee for the transaction was not disclosed.

Microsoft acquires Metanautix with Quest for intelligent cloud

Microsoft has bought Californian start-up Metanautix for an undisclosed fee in a bid to improve the flow of analytics data as part of its ‘intelligent cloud’ strategy.

The Palo Alto vendor was launched by Theo Vassilakis and Toli Lerios in 2014 with $7 million in funding. The Google and Facebook veterans had impressed venture capitalists with their plans for more penetrating analysis of disparate data. The strategy was to integrate enterprises’ data supply chains by building a data computing engine, Quest, that provides scalable SQL access to any data.

Modern corporations aspire to data-driven strategies but have far too much information to deal with, according to Metanautix. With so many sources of data, only a fraction can be analysed, often because too many information silos are impervious to query tools.

Metanautix uses SQL, the most popular query language, to interrogate sources as diverse as data warehouses, open source databases, business systems and in-house, on-premises systems. The upshot is that all data is equally accessible, whether it lives in Salesforce or SQL Server, Teradata or MongoDB.
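To make that concrete, a federated query of the kind described might join a SaaS CRM to an on-premises warehouse in a single statement. The schema names and client calls below are illustrative assumptions, not Quest’s actual interface.

```python
# Hypothetical federated query spanning two data silos.
QUERY = """
SELECT a.account_name,
       SUM(o.amount) AS total_orders
FROM   salesforce.accounts AS a   -- SaaS CRM source
JOIN   teradata.orders     AS o   -- on-premises warehouse source
       ON o.account_id = a.id
GROUP  BY a.account_name;
"""

# engine = quest.connect(...)     # hypothetical client call
# rows = engine.execute(QUERY)    # one SQL statement across both silos
```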

“As someone who has led complex, large scale data warehousing projects myself, I am excited about building the intelligent cloud and helping to realize the full value of data,” said Joseph Sirosh, corporate VP of Microsoft’s Data Group, announcing the takeover on the company website.

Metanautix’s technology, which promises to connect to all data regardless of type, size or location, will no longer be available as a branded product or service. Microsoft will initially integrate it with its SQL Server and Cortana Analytics systems, with details of integration with the rest of Microsoft’s service portfolio to be announced in the coming months, Sirosh said.

The blog posting from Metanautix CEO Theo Vassilakis hinted at further developments. “We look forward to being part of Microsoft’s important efforts with Azure and SQL Server to give enterprise customers a unified view of all of their data across cloud and on-premises systems,” he said.

Microsoft launches hybrid physical and cloud storage service

Microsoft has announced new data storage products that it says will simplify management and cut costs in complex hybrid clouds.

The StorSimple Virtual Array and StorSimple 8000 Series Update 2 are designed to simplify the task of storing data across on-premises IT equipment and the cloud. The new additions to the StorSimple range are necessary because the range of enterprises using hybrid storage is widening, according to Microsoft.

In a beta testing exercise, aviation giant Textron claimed the new systems helped it save a million dollars a month, while carmaker Mazda claimed the StorSimple additions helped it lower its overall costs by 95%.

The StorSimple Virtual Array creates a hybrid cloud storage system using a virtual machine running on Hyper-V or VMware hypervisors, and can operate as either network attached storage (NAS) or a storage area network (SAN) device. It integrates primary storage, data protection, archiving and disaster recovery into one system for small environments with minimal IT infrastructure and management. With the Virtual Array, users no longer need to centralise data protection and disaster recovery at the company’s main data centre; instead they get a simple, upgradeable system they can manage themselves with StorSimple Manager.

Meanwhile, the StorSimple 8000 Series Update 2 introduces local volumes and a new high-performance StorSimple Cloud Appliance. Local volumes let clients keep primary data on the array without tiering it to Azure, which gives better performance for applications that cannot tolerate cloud latency. A local volume suits on-premises workloads with high input/output requirements, such as SQL Server, as well as Microsoft Hyper-V and VMware virtual machines. These additions do not stop customers using Azure for data protection and location-independent disaster recovery, the vendor says.

Enterprises can now adopt a hybrid storage strategy based on StorSimple, said Mike Neil, Microsoft’s corporate VP for Cloud and Enterprise. “It will transform their businesses by cutting costs, simplifying IT and helping increase IT agility in support of business goals,” he said.

Intel teams up with NEC on Cloud RAN development

Base stations could get smaller, cheaper and more powerful if a new virtualization project reaches fruition in 2016, reports Telecoms.com.

Kit maker NEC and Intel Corporation are to jointly develop a Cloud Radio Access Network (Cloud-RAN) that can virtualize the functions of mobile base stations. The first joint proof-of-concept trial of Cloud-RAN will run in early 2016.

The partners say they aim to virtualize two major components of the next-generation mobile base station: the digital unit (DU), which handles data processing, and the radio unit (RU), which sends and receives radio waves. The new Cloud-RAN system will separate DU functions from mobile base stations so they can run on general-purpose Intel servers with multi-core processors. Centralising DU functions in this way allows multiple radio units to be controlled from a single general-purpose server, as the sketch below illustrates.
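As a toy illustration of that split (the class and method names are invented, not NEC or Intel’s design), one centralised digital unit scheduling several radio units might look like this:

```python
# Toy model of the Cloud-RAN DU/RU split described above.
class RadioUnit:
    """Sends and receives radio waves at the cell site."""
    def __init__(self, cell_id: int):
        self.cell_id = cell_id

    def transmit(self, frame: bytes) -> None:
        print(f"RU {self.cell_id}: sending {len(frame)} bytes over the air")

class DigitalUnit:
    """Baseband data processing, moved off the base station onto a server."""
    def __init__(self, radio_units: list["RadioUnit"]):
        self.radio_units = radio_units

    def schedule(self, frame: bytes) -> None:
        # Central control of every RU enables coordinated interference management.
        for ru in self.radio_units:
            ru.transmit(frame)

du = DigitalUnit([RadioUnit(i) for i in range(3)])
du.schedule(b"\x00" * 1500)
```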

This re-engineering boosts base stations’ communication performance, since a centralised DU has more precise control of radio interference between the radio units. Consolidating the servers also cuts power and space consumption. The upshot of Cloud-RAN should be more powerful base stations that are cheaper to run, according to NEC. Virtualization has long been a work in progress at NEC, said Nozomu Watanabe, general manager of NEC’s Mobile Radio Access Network Division.

“We have been working with Intel on the virtualization of mobile core networks and customer premises equipment and are pleased to extend our collaboration in Network Functions Virtualization to mobile base stations,” said Watanabe.

NEC is to strengthen its relationship with Intel to advance NFV as the core technology supporting 5G, said Watanabe. NEC contributes to the SDN- and NFV-related standards bodies the Open Networking Foundation (ONF), OpenDaylight, ETSI NFV and the Open Platform for NFV (OPNFV). It also runs the NEC SDN Partner Space programme to promote the development and use of network virtualization technologies.