All posts by Business Cloud News

New Microsoft Trust Center aims to offer stability in shifting cloud industry

Microsoft has aggregated all the information about its cloud services into a single point of reference, in an attempt to clarify and simplify the increasingly ethereal nature of the cloud.

The announcement, in a blog post on the company web site, comes in the same week that one of the new incarnations of HP, Hewlett Packard Enterprise (HPE), repositioned itself as a reseller of Microsoft’s Azure public cloud services.

With cloud computing reshaping both the IT industry and its customers, the software company turned cloud service vendor has moved to clarify the picture for enterprise buyers.

The new Microsoft Trust Center aims to unify all the different strands of its enterprise cloud services, as confused customers began to clamour for a single version of the truth, instead of having to choose between multiple references issued by a range of Microsoft resources. Under the new scheme, the Microsoft Trust Center will be a consistent source of information about all of its enterprise cloud services, including Microsoft Azure, Microsoft Dynamics CRM Online, Microsoft Intune and Microsoft Office 365.

The Microsoft blog post says the Trust Center will be built on security, privacy and control, compliance and transparency. To this end it will advise cloud buyers on how Microsoft’s cloud services observe international and regional standards, privacy and data protection policies, and security features and functions.

On Tuesday it was announced that HPE was to become a Microsoft Azure reseller partner, while in return HPE will become a preferred cloud services provider when Microsoft customers need help. The new arrangement, revealed by HPE CEO Meg Whitman in a quarterly analyst call, illustrates how the IT industry is being reshaped around the new hybridisation of computing services. The arrangement means HPE can sell its own hardware and cloud computing software to companies for the private, ‘on-premise’ part of the private-public combination. Meanwhile, the public cloud will be provided by Microsoft’s Azure computing service.

Transparency, according to the Microsoft Trust Center blog, is to be one of the foundations of cloud services.

Data consumption outgrows personal storage capabilities – research

The sheer weight of data is growing far too fast for personal storage devices to cope, says new research which, if true, will delight device makers and disappoint cloud storage providers. The study suggests that the majority of consumers are stressed by the prospect of deleting content to overcome capacity issues. The only way to cope with the newly identified syndrome, Post Deletion Stress Disorder (PDSD), is to buy more capacity. However, consumers want to own the device that stores their data, rather than rent it, says Western Digital.

According to the independent study conducted on storage device maker WD’s behalf, 56% of UK consumers have been forced to delete content from a technology device due to capacity issues, and regretted it. Researcher Vanson Bourne, which talked to 1,000 UK consumers, found that 7% run out of storage on their mobile daily, 16% reach full memory at least weekly and 31% run out of storage capacity at least monthly.

However, given a choice, consumers generally prefer to own storage outright rather than rent it, the study claimed. Meanwhile storage allocations on devices are being pushed to their limits, with 77% downloading an app to a mobile device at least monthly, and around one third (33%) downloading a feature film to a mobile device this often. Photographs take up the most storage capacity across a range of consumer devices, while 44% of consumers admit they ‘don’t know’ what content is taking up storage capacity on their devices.

With 44% of the survey sample trying to manage on just 64GB of device storage, not enough people (just 33%) have invested in an external hard drive to solve their storage problems, says WD, a maker of external hard drives. By contrast, just 2% of those surveyed use a paid cloud storage service and only 16% use a free one.

“Clearly consumers are sacrificing precious memories and valuable content to make more space on their devices,” said Jim Welsh, general manager of Content Solutions at WD. “We believe consumers will look for external storage solutions that bring more value, with features that help them store, share and back up their digital content from mobile devices and computers.”

IBM open-sources machine learning SystemML

IBM is aiming to popularise its proprietary machine learning programme SystemML through the open-source community.

Announcing the decision to share the system’s source code on the company blog, IBM’s Analytics VP Rob Thomas said application developers are in need of a good translator. This was a reference to the huge challenges developers face when combining information from different sources into data-heavy applications running on a variety of computers. It was also a reference to the planned transformation of a little-used proprietary IBM system into a popular, widely adopted machine learning tool for the big data market. The vehicle for this transformation, according to Thomas, will be the open-source community.

IBM says SystemML is now freely available to share and modify through the Apache Software Foundation. Joining Apache, which manages some 150 open-source projects, represents the first step to widespread adoption, Thomas said. The new Apache Incubator project will be code-named Apache SystemML.

The machine learning platform originally came out of IBM’s Almaden research lab ten years ago, when IBM was looking for ways to simplify the creation of customized machine-learning software, Thomas said. Now that it is in the public domain, it could be used by a developer of cloud-based services to create risk-modeling and fraud prevention software for the financial services industry, he said.

The current version of SystemML works well with the Apache Spark project, Thomas said, since Spark is designed for processing large amounts of data streaming in from continuous sources such as monitors and smartphones. SystemML will save companies valuable time by allowing developers to write a single machine learning algorithm and automatically scale it up using the open-source data analytics tools Spark and Hadoop.
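The “write once, scale automatically” pitch can be made concrete with a toy example. Below is the kind of single-machine algorithm a developer writes once — ordinary least squares in plain Python, purely illustrative and not IBM’s code. Expressed in SystemML’s R-like DML language, the same logic would be compiled by the engine into either local or distributed Spark/Hadoop execution plans, with no rewriting by the developer:

```python
# Illustrative only: a simple algorithm written once, with no
# distribution logic. SystemML's pitch is that scripts at this level
# of abstraction are compiled to single-node or Spark/Hadoop plans.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (pure Python)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var          # slope
    a = mean_y - b * mean_x  # intercept
    return a, b

a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])  # exact fit: y = 1 + 2x
```

The point is that the algorithm’s author never writes any cluster code; the runtime decides between single-node and distributed execution based on data size.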

MLLib, the machine learning library for Spark, provides developers with a rich set of machine learning algorithms, according to Thomas, and SystemML enables developers to translate those algorithms so they can easily digest different kinds of data and run on different kinds of computers.

“We believe that Apache Spark is the most important new open-source project in a decade. We’re embedding Spark into our Analytics and Commerce platforms, offering Spark as a service on IBM Cloud, and putting more than 3,500 IBM researchers and developers to work on Spark-related projects,” said Thomas.

While other tech companies have open-sourced machine learning technologies, these are generally niche, specialised tools for training neural networks. IBM aims to popularise machine learning within Spark and Hadoop, where ubiquity will be critical in the long run, said Thomas.

Shareholders question value in Dell/EMC deal

The prospect of a shareholder revolt has changed the terms of Dell’s takeover of EMC.

Under a new proposal EMC will retain a majority stake in Virtustream and has dropped plans to integrate it with VMware, according to sources quoted by Reuters.

Shares in VMware have lost a quarter of their value since Dell’s $67 billion deal to buy EMC was reported in BCN in October. The fall in share value could jeopardise the takeover, given the complicated stock-related funding of the transaction. Dell was originally set to pay EMC shareholders $24.05 per share in cash, along with a special stock that tracks the common shares of EMC’s majority-owned virtualisation company VMware.

Under the terms of the Dell deal, EMC shareholders will receive a 0.111 share of VMware tracking stock for each EMC share. However, with VMware shares falling, the value of one of EMC’s most precious assets is a major concern to stakeholders on both sides of the takeover.
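Using only the figures in this story, the per-share arithmetic (an illustrative sketch, treating the tracking stock as moving with VMware’s common shares) shows how VMware’s price feeds directly into what EMC holders receive:

```python
# Approximate per-EMC-share consideration under the reported deal terms:
# $24.05 in cash plus 0.111 of a VMware tracking share.
CASH_PER_SHARE = 24.05   # USD cash component
TRACKING_RATIO = 0.111   # VMware tracking shares per EMC share

def emc_share_value(vmware_price):
    """Cash plus the market value of the tracking-stock component."""
    return CASH_PER_SHARE + TRACKING_RATIO * vmware_price

# At VMware's quoted $60.35, each EMC share is worth roughly $30.75;
# every dollar VMware falls shaves about 11 cents off that figure.
print(round(emc_share_value(60.35), 2))  # -> 30.75
```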

A new plan has been hatched, reports Reuters, under which EMC would assume Virtustream’s losses by keeping a majority stake, while VMware would hold a minority stake, distancing itself from the effects of the loss-maker.

News of the new arrangement lifted VMware’s common shares by 3.85% at the close of play on the New York Stock Exchange yesterday, to $60.35 a share. Uncertainty about the future of VMware has affected its ability to close deals, according to reports, while a disappointing fourth-quarter revenue forecast was blamed on currency fluctuations in China, Russia and Brazil.

Investors are asking EMC to launch a share buyback programme for VMware, according to Reuters, but no decisions have been made. Activist hedge fund Elliott Management, one of the architects of the strategy change at virtualisation company Citrix, is a top EMC shareholder.

Buying back shares could prove expensive, reported Recode, since $5.7 billion of VMware’s $7.2 billion in cash and short-term investments is held outside the US and would be subject to corporate taxes if repatriated. Some shareholders pushing for the buyback have suggested taking on debt to pay for it.

EMC bought Virtustream for $1.2 billion in July and its ownership is shared between parent EMC and VMware on a 50/50 basis. Ending the joint venture arrangement could relieve pressure on VMware and cut the amount of capital spending and additional investment Virtustream would need, according to Bernstein analyst Toni Sacconaghi, in a research note seen by Reuters.

New players ally to G-Cloud 7 amid accusations of anti-cloud behaviour

A number of new service providers have announced their participation in the latest iteration of the UK government’s computing services framework, G-Cloud 7. Among the new suppliers pledging to meet the conditions of the latest framework are Fordway, Acuity, Company 85, RedCentric and Komodo Digital.

However, critics have argued that the Crown Commercial Service (CCS) has introduced distinctly uncloud-like conditions, with newly introduced limits that could hinder buyers from expanding their use of cloud services.

Under the new rules in G-Cloud 7, users will be forced to re-tender via G-Cloud if they intend to buy additional capacity or services costing more than 20% of their original contract’s value. This, according to industry body EuroCloud UK, goes against scalability, the defining principle of cloud computing.
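The effect of the cap is simple to state (a sketch of the rule as described above, not CCS’s legal wording):

```python
# Illustrative: when does the G-Cloud 7 rule force a re-tender?
def must_retender(original_value, additional_spend):
    """True if added capacity or services exceed 20% of the original contract value."""
    return additional_spend > 0.20 * original_value

# A buyer on a 100,000 GBP contract can add up to 20,000 GBP of extra
# capacity; scaling any further triggers a fresh tender -- which is the
# critics' point about pay-per-use growth being penalised.
print(must_retender(100_000, 25_000))  # -> True
```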

“It deters buyers from using the G-Cloud framework, because it actively discourages the pay per use principle,” said Neil Bacon, MD of Global Introductions and a member of EuroCloud’s G-Cloud working group. Worse still, he said, it will prevent buyers from getting the economies of scale that are the original motivation for their buying decision.

Several G-Cloud providers, including EduServ and Skyscape, outlined their concerns about the move in writing to the Cabinet Office. However, Surrey-based service provider Fordway has committed to the new system, launching its Cloud Intermediation Service (CIS) on G-Cloud 7.

The new service helps clients assess, plan, transform and migrate their infrastructure partly or completely to public cloud. It promises agile project management, bundling together the resources that clients will need to support their in-house team at each stage of the transition.

Fordway claims its relationships with public cloud providers such as Amazon Web Services, Microsoft and Google allow it to create a pivotal single point of contact to manage a transition irrespective of the target platforms.

In Fordway’s case, clients should not be subject to unexpected fluctuations in capacity demand, according to MD Richard Blanford.

“Most IT teams will only migrate their systems to cloud once, and it’s a big step. For the sake of their organisation and their own careers it needs to be planned and delivered successfully, on time and within budget, without any surprises,” said Blanford.

ERP uptake set to boom as risk diminishes – research

A new survey provides potentially exciting news of lucrative opportunities for cloud service providers. Nearly a third of all enterprise resource planning (ERP) systems in the world will begin migrating to the cloud in the next two years, if a study commissioned by Hitachi Solutions Europe is accurate.

With rare cloud migration skills at a premium, the mass migration could prove lucrative for experts in this area, according to Hitachi.

The managers of ERP systems have been the most reluctant of all IT users to move to the cloud, according to Tim Rowe, Director of Hitachi Solutions Europe. The importance of this foundation system and its inherent complexity have dissuaded companies from taking any risks. However, as perception of the benefits of cloud computing spreads, the pressure to move is beginning to outweigh the resistance to change, said Rowe. The complexity of ERP systems, once a sales blocker, is now set to become a massive source of margin, according to Hitachi.

“Now we are starting to see a shift as the benefits start to outweigh the perceived risks,” said Rowe.

The survey found that though 31% of organisations have moved all or part of their ERP systems to the cloud, or are in the process of doing so, that leaves a healthy 69% who are still keeping ERP in house. However, 44% of the survey group of 315 senior finance professionals said they would contemplate moving to the cloud in the next two years. If this is an accurate representation of global sentiment, then over that period around 30% of all ERP systems will begin an expensive migration to the cloud.
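The 30% figure can be sanity-checked against the survey numbers, on the reading that the 44% who would contemplate a move refers to the 69% still running ERP in house (illustrative arithmetic):

```python
# Sanity check of the survey arithmetic: 44% of the 69% still running
# ERP in house works out to roughly 30% of all ERP systems.
still_in_house = 0.69   # share of organisations keeping ERP on-premise
would_migrate = 0.44    # of those, share contemplating a move to cloud
share_of_all = still_in_house * would_migrate

print(round(share_of_all * 100))  # -> 30
```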

Among the companies with 500 employees there was just as much enthusiasm for a cloud-based approach to ERP: 27% of this demographic said they have moved all or part of their ERP to the cloud, or are in the process of doing so, roughly in line with the overall average of 31%.

Enterprises conducting feasibility research through peer reviews will be encouraged by the feedback from early adopters, according to Hitachi. Its study found that 80% of the survey group rated their experience of using cloud-based ERP as excellent or good.

The main obstacles to cloud-based ERP projects will be data security and privacy risk (ranked as the top concern by 38% of respondents) and dependency on a third-party provider, nominated as the top fear by 35%.

AWS launches EC2 Dedicated Hosts feature to identify specific servers used

Amazon Web Services (AWS) has launched a new service for the nervous server hugger: it gives users knowledge of the exact server that will be running their machines, and includes management features to stop licensing costs escalating.

The new EC2 Dedicated Hosts service was created by AWS in reaction to the sense of unease that users experience when they never really know where their virtual machines (VMs) are running.

Announcing the new service on the company blog, AWS chief evangelist Jeff Barr said the four main areas of improvement are licensing savings, compliance, usage tracking and better control over instances (aka virtual machines).

The Dedicated Hosts (DH) service will allow users to port their existing server-based licenses for Windows Server, SQL Server, SUSE Linux Enterprise Server and other products to the cloud. A feature of DH is the ability to see the number of sockets and physical cores available to a customer before they invest in software licenses, improving their chances of not overpaying. Similarly, the usage tracking feature will help users monitor and manage their hardware and software inventory more thriftily. By using AWS Config to track the history of instances started and stopped on each of their Dedicated Hosts, customers can verify usage against their licensing metrics, Barr says.
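In practice, allocating a Dedicated Host and pinning an instance to it looks something like the following AWS CLI sketch (the region, instance type, AMI and host ID are illustrative placeholders, not values from the announcement):

```shell
# Allocate a Dedicated Host: a physical server reserved for this account.
aws ec2 allocate-hosts --instance-type m4.large \
    --availability-zone us-east-1a --quantity 1

# Launch an instance onto that specific host, so per-socket or per-core
# licenses can be tied to known hardware (host ID is a placeholder).
aws ec2 run-instances --image-id ami-12345678 --instance-type m4.large \
    --placement "Tenancy=host,HostId=h-0123456789abcdef0"
```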

Another management improvement comes from the Control Instance Placement feature, which promises ‘fine-grained control’ over the placement of EC2 instances on each Dedicated Host.

The provision of a physical server may be the most welcome addition for the many cloud buyers dogged by doubts over compliance and regulatory requirements. “You can allocate Dedicated Hosts and use them to run applications on hardware that is fully dedicated to your use,” says Barr.

The service will help enterprises that have complicated portfolios of software licenses where prices are calculated on the numbers of CPU cores or sockets. However, Dedicated Hosts can only run in tandem with AWS’ Virtual Private Cloud (VPC) service and can’t work with its Auto Scaling tool yet.

Verizon and VMTurbo collaborate over smart cloud brokerage

US telco Verizon and control system maker VMTurbo have jointly created Verizon Intelligent Cloud Control, to help Verizon customers migrate workloads to the most suitable public cloud service.

The system works by calculating the enterprise customer’s performance and resource needs and matching them up to the most likely provider. The partners claim this is the first automated system of its kind on the market.

Existing cloud brokerages, they claim, have to manually recommend workload placements with public cloud service providers (CSPs). Intelligent Cloud Control, by contrast, gives Verizon customers a system that automatically makes instant calculations on price, performance and placement, while taking compliance considerations into account. It also makes all the sizing and configuration decisions needed to install and migrate workloads to public cloud providers.
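VMTurbo’s internals are not described here, but the idea of automated placement can be sketched as a toy scoring function — entirely illustrative, not the product’s actual algorithm, with made-up price and performance figures — that filters on compliance and weighs performance against price:

```python
# Toy illustration only -- not VMTurbo's actual algorithm or API.
# Score candidate public clouds and place the workload with the
# best-scoring provider that meets the compliance requirement.
providers = {
    "AWS":       {"price": 0.10, "perf": 0.9, "compliant": True},
    "SoftLayer": {"price": 0.12, "perf": 0.8, "compliant": True},
    "Azure":     {"price": 0.09, "perf": 0.7, "compliant": False},
}

def place(workload, providers):
    # Drop providers that fail a hard compliance requirement...
    eligible = {name: p for name, p in providers.items()
                if p["compliant"] or not workload["needs_compliance"]}
    # ...then reward high performance and low price.
    return max(eligible, key=lambda n: eligible[n]["perf"] / eligible[n]["price"])

best = place({"needs_compliance": True}, providers)  # -> "AWS"
```

A real broker would fold in many more dimensions (sizing, region, data-transfer cost), but the trade-off structure is the same.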

Verizon claims the system will be easy to use, with a single interface and detailed cost controls that will eliminate billing surprises. The system will also help end users keep on top of performance and compliance issues through rigorous cloud monitoring.

The ‘Verizon Intelligent Cloud Control powered by VMTurbo’ service will launch during the first quarter of 2016. Initially the service will include connections to Amazon Web Services, IBM SoftLayer and Microsoft Azure.

Verizon’s customers said they needed a better way to manage their risk when moving to the public cloud, according to Victoria Lonker, director of enterprise networking for Verizon. “We are removing the complexities and myriad trade-offs between price, performance and compliance in various public cloud services,” said Lonker. “Now they can focus on the applications and services.”

VMTurbo’s Application Performance Control system is used by 1,200 enterprises to guarantee quality of service for applications and to make full use of resources in cloud and virtualised environments.

“Intelligent Cloud Control is different from today’s cloud brokers and managers as it factors in application performance and price,” said Endre Sara, VP of advanced solutions at VMTurbo.

SAP to become a Genband reseller as Kandy improves relationship in the cloud

Texas-based comms software specialist Genband has signed SAP as a global reseller of its comms platform-as-a-service (PaaS) Kandy. Under the terms of the arrangement, Kandy will be repackaged by Genband as the SAP Real-Time Communicator Web application.

The system is designed to help enterprises of any size improve their workflow by making their communications processes simpler to use and more effective. This is achieved by making it easier for sales, service and business professionals to adopt the chat, videoconferencing and collaboration systems that are often under-used in many companies. By improving real-time communications between customers and co-workers, SAP says, its cloud offering will make its enterprise clients far more effective sales organisations.

SAP claims its Real-Time Communicator creates personalized engagement and helps its clients stand out from competitors through a superior customer experience. In its capacity as a reseller, SAP has integrated Real-Time Communicator into the rest of its portfolio and embedded communications within its business applications, giving them presence, instant messaging, voice and video chat, and conferencing. Real-Time Communicator is integrated natively into SAP Cloud for Customer, and can be integrated with the SAP Hybris Commerce system.

Genband’s executive VP of Strategy and Cloud Services Paul Pluschkell said SAP, as the world’s top cloud player, is the ideal reseller partner to collaborate with. “Integrating with SAP creates a powerful customer experience that empowers customers to work smarter and more efficiently,” said Pluschkell.

The combination creates dramatic improvements in productivity for clients, said Nayaki Nayyar, senior VP of Cloud for Customer Engagement at SAP. Managing vital relationships makes the experience richer, more contextual and highly efficient, said Nayyar. SAP chose to resell Genband, she said, because it has created an advanced market offering, and the only one that could help SAP launch new offerings across its applications. “Genband’s technology performance leadership, global presence and comprehensive product portfolio all factored into our decision to select this platform,” said Nayyar.

Equinix connects AWS direct to data centres in Dallas and London

Data centre operator Equinix has added an Amazon Web Services (AWS) Direct Connect facility to its Dallas data centre and to data centres in its London International Business Exchange (IBX).

The AWS Direct Connect facility means that companies using Equinix data centres can connect their privately owned and managed infrastructure directly to AWS, it claims. The arrangement creates a private connection to the AWS cloud within the same infrastructure. This ‘hard-wiring’ of two infrastructures in the same building can cut costs and latency while boosting throughput, ultimately delivering better application performance, Equinix says. The two offerings bring the total number of Equinix data centres offering Direct Connect to AWS to ten.

The service is a response to increasing demand from clients for hybrid clouds. Equinix says it can configure this in its own data centres, through direct interconnection of the public cloud provider’s kit and the equipment belonging to clients. This Equinix-enabled hybrid is an instant way to achieve the scalability and cost benefits of the cloud, while maintaining the security and control standards offered by an on premise infrastructure.

Equinix claims that a recent study, Enterprise of the Future, found that hybrid deployments will double in enterprise cloud computing by 2017. According to the study’s feedback, 84% of IT leaders will deploy IT infrastructure with interconnection, defined as direct, secure physical or virtual connections, at its core, compared to 38% today.

London is the second Equinix location in Europe, after Frankfurt, to get an AWS Direct Connect arrangement. It means customers can get “native” connections to AWS cloud offerings, where previously they tethered from Equinix in London into AWS’s Dublin facilities. Equinix’s Dallas IBX, DA5, is the fourth data centre in North America to offer AWS Direct Connect, joining Equinix facilities in Seattle, Silicon Valley and Washington. Equinix now offers AWS Direct Connect in ten global locations: Dallas, Frankfurt, London, Osaka, Seattle, Silicon Valley, Singapore, Sydney, Tokyo and Washington, D.C./Northern Virginia. Equinix customers in these areas see lower network costs into and out of AWS, and take advantage of reduced AWS Direct Connect data transfer rates.