Category Archives: Enterprise IT

Containers at Christmas: wrapping, cloud and competition

As anyone who’s ever been disappointed by a Christmas present will tell you – shiny packaging can be very misleading. As we hear all the time, it’s what’s inside that counts…

What, then, are we to make of the Docker hype, centred precisely on shiny new packaging? (Docker is the vendor that, two years ago, found a way to containerise applications; other types of containers, operating system containers, have been around for a couple of decades.)

It is not all about the packaging, of course. Perhaps we should say that it is what the package sits on, and how it is managed (amongst other things), that matters most?

Regardless, containers are one part of a changing cloud, data centre and enterprise IT landscape, with the ‘cloud native’ movement widely seen as driving a significant shift in enterprise infrastructure and application development.

What the industry is trying to figure out, and what could prove the most disruptive angle to watch as more and more enterprises roll out containers into production, is the developing competition within this whole container/cloud/data centre market.

The question of competition is a very hot topic in the container, DevOps and cloud space. Nobody could have thought the OCI co-operation between Docker and CoreOS meant they were suddenly BFFs. Indeed, the drive to become the enterprise container of choice now seems to be at the forefront of both companies’ plans. Is this, however, the most dynamic relationship in the space? What about the Google-Docker-Mesos orchestration game? It would seem that Google’s trusted container experience is already allowing it to gain favour with enterprises, with Kubernetes taking a lead. And with CoreOS in bed with Google’s open source Kubernetes, placing it at the heart of Tectonic, does this mean that CoreOS has a stronger play in the enterprise market than Docker? We will wait and see…

We will also wait and see how the Big Cloud Three come out of the expected container-driven market shift. Somebody described AWS to me as ‘a BT’… that is, the incumbent that will be affected most by the disruptive changes containers bring, since it makes a lot of money from an older model of infrastructure…

Microsoft’s container ambition is also being watched closely. There is a lot of interest from both the development and IT Ops communities in its play in the emerging ecosystem. At a recent meet-up, an Azure evangelist had to field a number of deeply technical questions about exactly how Microsoft’s containers fare next to Linux’s. The question is whether this will prove the crux of the matter when assessing who will win the largest piece of the enterprise pie.

Containers are not merely changing the enterprise cloud game (with third-placed Google seemingly getting it very right) but also turning IT Ops’ DevOps dream into reality; in fact, many are predicting that they could eventually prove a bit of a threat to Chef and Puppet’s future…

So, maybe kids at Christmas have got it right… it is all about the wrapping and boxes! We’ll have to wait a little longer than Christmas Day to find out.

Written by Lucy Ashton, Head of Content & Production, Container World

Azure Backup gets fine-tuned with speed, cache and retention improvements

Microsoft’s Azure has promised more speed, lower cache demands and better data retention among a range of improvements to its cloud backup services for enterprise data.

Azure Backup now uses a Windows technology called the Update Sequence Number (USN) journal to track the files that have changed between consecutive backups. The USN journal records changes to files and directories on the volume, which means changed files can be identified quickly without rescanning everything.
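Conceptually, change-journal-based incremental backup works like the minimal sketch below; the record structure and helper are illustrative assumptions, not Azure Backup’s actual implementation.

```python
# Minimal sketch of change-journal-based incremental backup selection.
# The UsnRecord structure and the journal list are illustrative assumptions,
# not Azure Backup's actual implementation.
from dataclasses import dataclass

@dataclass
class UsnRecord:
    usn: int      # monotonically increasing change-sequence number
    path: str     # file or directory the change touched
    reason: int   # bitmask describing the change (write, rename, delete...)

def files_changed_since(last_backup_usn: int, journal: list[UsnRecord]) -> set[str]:
    """Return paths touched after the USN recorded at the last backup."""
    return {rec.path for rec in journal if rec.usn > last_backup_usn}

# At backup time: read the journal, copy only the changed files, then
# persist the highest USN seen so the next run starts from there.
```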

The upshot of this tweak is a faster backup time. “We’ve seen up to a 50% reduction of backup times when using this optimization,” said Giridhar Mosay, Azure’s Program Manager for Cloud and Enterprise. Individual file server backup times will vary according to the number and size of files and the directory structure, Mosay warned.

A new algorithm for computing metadata has slashed the amount of cache space needed for each Azure Backup by 66%. The standard allocation of cache space equal to 15% of the volume size being backed up had proved prohibitive for volumes greater than 10TB. The new algorithm catalogues the file space to be backed up far more efficiently, creating so much less metadata that it demands cache space of only 5% of the volume size, or less. Azure is accordingly cutting its cache space requirement to a third of the old level.
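A quick calculation with the percentages quoted above shows what that means for the 10TB volumes that were previously problematic:

```python
# Quick arithmetic using the percentages quoted above.
volume_tb = 10                     # a 10 TB volume, the size cited as prohibitive
old_cache_tb = 0.15 * volume_tb    # old requirement: 15% of volume size
new_cache_tb = 0.05 * volume_tb    # new requirement: 5% of volume size
print(old_cache_tb, new_cache_tb)  # 1.5 vs 0.5 TB: a two-thirds (66%) reduction
```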

Meanwhile, the resilience of the system has improved as Azure Backup has increased the number of recovery points for cloud backups. This allows for flexible retention policies that meet stringent compliance requirements, such as HIPAA (the federal Health Insurance Portability and Accountability Act of 1996), for large enterprises. The maximum number of recovery points has increased from 366 to 9999.
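To put the new ceiling in context, assuming a policy that keeps one recovery point per day:

```python
# What the recovery-point ceilings allow, assuming one recovery point per day.
old_max, new_max = 366, 9999
print(old_max / 365.25)   # ~1.0  -> about a year of daily recovery points
print(new_max / 365.25)   # ~27.4 -> over 27 years of daily recovery points
```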

Other tweaks include more timeouts across the various phases of the backup process to ensure that long-running jobs complete reliably. Cloud backups will also run a little more efficiently thanks to a decoupling of the cataloguing and uploading of backup data. Intermittent failures in the incremental backup service have also been identified and resolved, according to Mosay. “We are continuing our journey to make Azure Backup enterprise grade,” he said.

IBM acquires Clearleap’s cloud-based video service

IBM says it has acquired cloud-based video service provider Clearleap in a bid to make video a strategic source of data on any device at any time.

Clearleap’s video services will be offered through IBM Cloud data centres around the world, which will give clients global 24×7 service and technical support for problem identification and resolution. Clients using the service can now share data and content across geographies and hybrid clouds. IBM will offer the Clearleap APIs on IBM Bluemix in 2016 so clients can build new video offerings quickly and easily.

IBM says Clearleap’s open API framework makes it easy to build video into applications and adapt it to specific business needs like custom workflows and advanced analytics. The framework also works with many third-party applications that customers may already have.
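As a rough illustration of what building video into an application through an open REST API can look like, here is a minimal sketch; the endpoint, fields and auth scheme are placeholder assumptions, not Clearleap’s actual API.

```python
# Hypothetical sketch of publishing a video asset through a REST API in the
# style described above. The endpoint, fields and auth scheme are placeholder
# assumptions, not Clearleap's actual API.
import requests

API_BASE = "https://api.example-video-platform.com/v1"  # placeholder URL

def publish_video(title: str, source_url: str, token: str) -> str:
    """Upload a video asset and return its playback identifier."""
    resp = requests.post(
        f"{API_BASE}/assets",
        headers={"Authorization": f"Bearer {token}"},
        json={"title": title, "source": source_url},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["asset_id"]
```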

In addition, the Clearleap platform includes subscription and monetisation services, plus data centres from which to host digital video assets. This means IBM customers can pass the multi-screen video experience on to their own clients.

Clearleap will be integrated into the IBM Cloud platform to make it easy for clients to make money from user video experiences. IBM says this is part of its broader strategy to help clients realise the value of video as it becomes increasingly important in business.

With businesses increasingly using video for CEO webcasts, conference keynotes, customer care and how-to videos, a secure, scalable and open cloud-based system for managing these services has become a priority, says IBM.

Clearleap’s ability to instantly ramp up capacity has won it clients such as HBO, A+E Networks, the NFL, BBC America, Sony Movie Channel, Time Warner Cable and Verizon Communications. Clearleap is headquartered in Atlanta and has data centres in Atlanta, Las Vegas, Frankfurt, and Amsterdam.

“Clearleap joins IBM as visual communications are exploding across every industry,” said Robert LeBlanc, Senior VP of IBM Cloud. “Clients want content delivered quickly and economically to any device in the most natural way.”

Meanwhile, in a move that will support the delivery of video services over the cloud, IBM announced a new system that lets developers create apps that tap into vast amounts of unstructured data.

IBM Object Storage, now available on Bluemix, promises simple and secure storage and access functions. According to IBM, 80% of the 2.5 billion gigabytes of data created every day is unstructured content, with most of it video.

Verizon announces IBM integration partnership for SCI customers

Verizon has announced IBM as the latest partner in its Secure Cloud Interconnect (SCI) service, bringing the total to eight cloud service options for its clients.

Verizon Secure Cloud Interconnect customers can now connect to IBM Cloud data centre sites in Dallas and San Jose in the US and Tokyo and Sydney in the Asia Pacific region. Two additional sites are planned in Europe for the beginning of 2016.

Verizon’s interconnect supports IBM’s broader portfolio of Direct Link services, which allow customers to link their existing IT infrastructure to cloud compute resources on the IBM Cloud. The service has three offerings: Cloud Exchange, Network Service Provider (NSP) and Colocation, a range IBM says will cover all public, private and hybrid eventualities.

The new IBM Cloud addition means Verizon’s Secure Cloud Interconnect now offers access to eight cloud providers. It already has links with AWS, Google Cloud Platform, HPE Rapid Connect, Microsoft ExpressRoute for Office 365, Microsoft Azure ExpressRoute, Microsoft Azure Government and Verizon’s own cloud service, along with services from data centre providers CoreSite, Equinix and Verizon. The service is available at 50 locations globally, across the Americas, Latin America, Europe and the Asia-Pacific region.

Users of Verizon’s Secure Cloud Interconnect are promised a direct line to IBM Cloud services through a secure, flexible private link that promises to move workloads easily between clouds. Verizon says it gives enterprise clients more options for storing data: it can sit in a traditional IT environment, a dedicated on- or off-premises cloud, or a shared off-premises cloud. This, says Verizon, makes the adoption of a hybrid cloud more achievable and provides a cloud computing estate that is easier to adjust as business requirements change.

“With SDN at the heart of our Secure Cloud Interconnect solution, IBM customers will find it delivers an unbeatable combination,” said Shawn Hakl, VP of enterprise networking and innovation for Verizon. Yesterday Telecoms.com reported on a similar deal between HPE and NTT.

Elsewhere, Verizon has also announced the expansion of its IoT portfolio, launching what it claims is the world’s first Cat 1 LTE network feature for IoT. In addition, it announced that it will be giving developers additional tools on its ThingSpace platform, with more application programming interfaces (APIs) and application enablement platforms (AEPs), including an integration of Bug Labs’ dweet APIs and freeboard visualisation engine.

Ingram confirms Odin deal to boost cloud app channel

Parallels is to sell its cloud management technology Odin Service Automation to IT distributor Ingram Micro for an undisclosed sum, in a deal expected to close by 2016.

The deal includes intellectual property and the Odin brand. Odin publishes a range of cloud applications covering web server management, server virtualisation, provisioning and billing automation. It is used by 10,000 service providers who sell applications to their small and medium-sized business clients. According to Parallels, the services reach a subscriber base of around 10 million SMEs.

Ingram Micro has been a customer of Parallels since 2014, when it began using the Odin system as a cloud distribution service, allowing it to repackage applications for its channel partners, who then white-label them, resell them or manage them for clients. Ingram’s partner base includes resellers, managed service providers, system integrators and hosting providers.

Ingram branded its Odin-enabled cloud brokering service as the Cloud Marketplace.

The sell-off will enable parent company Parallels Holdings to concentrate on its core business and divest itself of a commodity, according to its CEO Birger Steen. “Now we can sharpen our focus as a company and continue to deliver market-leading products under the Parallels, Plesk and Virtuozzo brands.”

Parallels’ solutions business unit, its Plesk web management unit and its Virtuozzo container virtualisation unit will each continue to operate as standalone companies, all owned and controlled by Parallels Holdings Limited.

It looks good for Ingram but not for Parallels, according to one analyst. “I was surprised when Parallels spun off Odin as a separate company; I felt it had some real value,” said Quocirca analyst Clive Longbottom. “Ingram looks like it has gained control of a system that helps it deliver its own products to the channel and allows it to become a cloud aggregator.”

Where this deal leaves Parallels is more of an issue, said Longbottom. “It missed the boat when Docker made more noise on containers, leaving Virtuozzo in the mud. It has not managed to make enough noise for people to know that it is there, relying instead on word of mouth and just being known. I think that Ingram comes out well from this. Meanwhile, watch out for others buying up the rest of Parallels.”

HPE launches Synergy to help balance hybrid clouds

Hewlett Packard Enterprise (HPE) has launched a new service aimed at helping hybrid cloud users strike the right work-cloud balance.

As companies adopt hybrid clouds, they will become increasingly aware that these half-private, half-public clouds do not provide an instant one-size-fits-all solution; HPE Synergy, it says, will give hybrids the fluidity to adjust.

HPE Synergy will work with existing systems from established brands such as Arista, Capgemini, Chef, Docker, Microsoft, Nvidia and VMware, said HPE in a statement. It will be available to customers and channel partners around April 2016.

The new HPE Synergy service is an intelligent system with a simplified application programming interface (API). This combination of artificial intelligence and a portal will apparently create liquidity in the computing resources of the public and private cloud, meaning that conditions can be constantly monitored and adjustments constantly calculated. The upshot, according to HPE, is a system that can load balance between its public and private capacities and create the right blend for each set of circumstances.

Synergy creates resource pools comprising compute, storage and fabric networking capacity. These can be sized for each case, according to its needs and the available resources. This capacity management is achieved through a system that can cater for physical, virtual and containerised workloads.

According to HPE, Synergy’s software-defined intelligence self-discovers and self-assembles the best configuration and infrastructure possible, given the available resources, for repeatable, frictionless updates. Meanwhile, the single unified API offers the chance to program and control the bare-metal infrastructure through an infrastructure-as-a-service interface. The HPE OneView user interface acts as a window on the entire range of storage types an enterprise might have.
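A minimal sketch of what driving such a unified API from code might look like is below; the endpoint, payload fields and header are placeholder assumptions for illustration, not HPE’s actual Synergy or OneView API.

```python
# Illustrative sketch of composing infrastructure through a single unified
# API, in the style HPE describes for Synergy. The endpoint, payload fields
# and auth header are placeholder assumptions, not HPE's actual API.
import requests

COMPOSER = "https://composer.example.local/rest"  # placeholder appliance URL

def compose_node(name: str, token: str) -> dict:
    """Request a node assembled from pooled compute, storage and fabric."""
    payload = {
        "name": name,
        "compute": {"cpus": 16, "memoryGiB": 128},
        "storage": {"bootVolumeGiB": 200},
        "fabric": {"networks": ["prod-net"], "bandwidthGbps": 10},
    }
    resp = requests.post(
        f"{COMPOSER}/nodes",
        headers={"Auth": token},
        json=payload,
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()  # description of the composed node
```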

The rationale is that everyone is moving to hybrid computing, so it makes sense to help them shift their resources across the border between private and public cloud as easily as possible, according to HPE general manager Antonio Neri.

“Hybrids of traditional IT and private clouds will dominate the market over the next five years,” said Neri. Clients will want the speed and agility of the cloud and the reliability and security of their own data centres. “With HPE Synergy, IT can deliver infrastructure as code and give businesses a cloud experience in their data centre,” said Neri.

HPE says new Cloud Service Broker could put IT back in control

Hewlett Packard Enterprise (HPE) has launched a new service to help clients regain control over their increasingly unwieldy cloud estates.

The new HPE Helion Managed Cloud Broker is a managed service that aims to simplify the management of cloud services across multiple workloads and providers. HPE says it allows businesses to provision, access, consolidate and securely control services. It’s necessary, HPE says, because easily accessible cloud applications that bypass all controls threaten to overrun many IT departments and cause chaos.

New systems are increasingly being ordered and set up without the approval of the IT department, so the cloud threatens the security and management of IT estates. Cloud fever also undermines the potential cost savings achievable through a hybrid infrastructure.

The new Helion Managed Cloud Broker will give IT administrators control and instant visibility over their IT assets, be they traditional IT kit, private clouds or public services. The Cloud Broker will orchestrate all these assets and improve responsiveness, financial management and end-user satisfaction, claims HPE.

The Cloud Broker will support HPE’s entire Helion portfolio, including the Managed Virtual Private Cloud, CloudSystem and OpenStack, as well as VMware technology and a range of public cloud providers such as Microsoft Azure and Amazon Web Services. The service will be generally available in 2016 and charged on a pay-per-use basis.
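The underlying pattern is a single control plane that fronts several providers; a toy sketch of the idea (the provider classes and methods are illustrative, not HPE’s implementation):

```python
# Toy sketch of the cloud-broker pattern: one control plane that routes
# provisioning requests to whichever provider a policy selects. The provider
# classes and methods are illustrative, not HPE's implementation.
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    @abstractmethod
    def provision_vm(self, cpus: int, memory_gib: int) -> str: ...

class AzureProvider(CloudProvider):
    def provision_vm(self, cpus, memory_gib):
        return f"azure-vm({cpus}cpu/{memory_gib}GiB)"

class AwsProvider(CloudProvider):
    def provision_vm(self, cpus, memory_gib):
        return f"aws-vm({cpus}cpu/{memory_gib}GiB)"

class Broker:
    def __init__(self, providers: dict[str, CloudProvider]):
        self.providers = providers

    def provision(self, target: str, cpus: int, memory_gib: int) -> str:
        # A real broker would also apply policy, track cost and log for audit.
        return self.providers[target].provision_vm(cpus, memory_gib)

broker = Broker({"azure": AzureProvider(), "aws": AwsProvider()})
print(broker.provision("azure", cpus=4, memory_gib=16))
```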

Features include a self-service portal with a direct interface to service providers. The Broker’s management options cover security, performance, finances, compliance, audits, catalogues, subscriptions and service requests. It also provides monitoring tools, dashboards and reports.

The service was built from HPE Cloud Orchestration Software, ITSM automation software and operations bridge software.

Cloud computing promises speed, agility and cost advantages, but they’re soon lost in a sprawl of unmanaged, uncoordinated cloud instances, according to Eugene O’Callaghan, VP of Enterprise Services Workload and Cloud at HPE. “HPE unifies all enterprise cloud resources together, giving our clients a single view,” said O’Callaghan.

New Microsoft Trust Center aims to offer stability in shifting cloud industry

Microsoft has aggregated all the information about its cloud services into one single point of reference, in an attempt to clarify and simplify the increasingly ethereal nature of the cloud.

The announcement, in a blog on the company web site, comes in the same week that one of the new incarnations of HP, Hewlett Packard Enterprise (HPE), repositioned itself as a reseller of Microsoft’s Azure public cloud services.

With cloud computing reshaping both the IT industry and the companies it serves, the software company turned cloud service vendor has moved to clarify the picture for enterprise buyers.

The new Microsoft Trust Center aims to unify all the different strands of its enterprise cloud services, as confused customers began to clamour for a single version of the truth, instead of having to choose between multiple references issued by a choice of Microsoft trusted resources. In the new scheme, the Microsoft Trust Center will be a consistent source of information about all its enterprise cloud services, such as Microsoft Azure, Microsoft Dynamics CRM Online, Microsoft Intune and Microsoft Office 365.

The Microsoft blog post says the Trust Center will be built on security, privacy and control, compliance and transparency. To this end it will advise cloud buyers on how Microsoft’s cloud services observe international and regional standards, privacy and data protection policies, and security features and functions.

On Tuesday it was announced that HPE was to become a Microsoft Azure reseller partner, while in return HPE will become a preferred cloud services provider when Microsoft customers need help. The new arrangement, revealed by HPE CEO Meg Whitman in a quarterly analyst call, illustrates how the IT industry is being reshaped around the new hybridisation of computing services. The arrangement means HPE can sell its own hardware and cloud computing software to companies for the private, ‘on-premise’ part of the private-public combination. Meanwhile, the public cloud will be provided by Microsoft’s Azure computing service.

Transparency, according to the Microsoft Trust Center blog, is to be one of the foundations of cloud services.

ERP uptake set to boom as risk diminishes – research

A new survey provides potentially exciting news of lucrative opportunities for cloud service providers. Nearly a third of all enterprise resource planning (ERP) systems in the world will attempt the migration to the cloud in the next two years, if a study commissioned by Hitachi Solutions Europe is accurate.

With rare cloud migration skills at a premium, the mass migration could prove lucrative for experts in this area, according to Hitachi.

The managers of ERP systems have been the most reluctant of all IT users to move to the cloud, according to Tim Rowe, Director of Hitachi Solutions Europe. The importance of this foundation system and its inherent complexity have dissuaded companies from taking any risks. However, as perception of the benefits of cloud computing spreads, the pressure to move is beginning to outweigh the resistance to change, said Rowe. The complexity of ERP systems, once a sales blocker, is now set to become a massive source of margin, according to Hitachi.

“Now we are starting to see a shift as the benefits start to outweigh the perceived risks,” said Rowe.

The survey found that though 31% of organisations have moved all or part of their ERP systems to the cloud, or are in the process of doing so, that leaves a healthy 69% who still keep ERP in house. However, 44% of the survey group of 315 senior finance professionals said they would contemplate moving to the cloud in the next two years. If this is an accurate representation of global sentiment, then in the next two years around 30% of all ERP systems will begin an expensive migration to the cloud, as the quick calculation below shows.
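The 30% figure follows if the 44% who would contemplate a move is read against the 69% whose ERP is still in house (an assumption, since the survey write-up does not spell this out):

```python
# The arithmetic behind the ~30% estimate, assuming the 44% applies to the
# 69% of organisations whose ERP is still in house.
still_in_house = 0.69
would_contemplate = 0.44
print(round(still_in_house * would_contemplate, 3))  # 0.304, i.e. about 30%
```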

Among the companies with 500 employees there was just as much enthusiasm for a cloud-based approach to ERP: with 27% of this demographic saying they have moved all or part of their ERP to the cloud, or are in the process, the proportion is roughly in line with the overall average (31%).

Enterprises conducting feasibility research through peer reviews will be encouraged by the feedback from early adopters, according to Hitachi. Its study found that 80% of the surveyed finance professionals rated their experience of using cloud-based ERP as excellent or good.

The main blockers to cloud-based ERP projects will be data security and privacy risk (ranked as the top concern by 38% of respondents) and dependency on a third-party provider, nominated as the top fear by 35% of respondents.

Google appoints ex-VMware boss to lead enterprise web services business

Google has appointed former VMware CEO and current Google board member Diane Greene to head a new business-oriented cloud service.

Though Google is associated with consumer products and overshadowed by AWS in enterprise cloud computing, the lead is not unassailable, claimed Google CEO Sundar Pichai in the company’s official blog as the appointment was announced.

“More than 60% of the Fortune 500 are actively using a paid Google for Work product and only a tiny fraction of the world’s data is currently in the cloud,” he said. “Most businesses and applications aren’t cloud-based yet. This is an important and fast-growing area for Google and we’re investing for the future.”

Since all of Google’s own businesses run on its cloud infrastructure, the company has significantly larger data centre capacity than any other public cloud provider, Pichai argued. “That’s what makes it possible for customers to receive the best price and performance for compute and storage services. All of this demonstrates great momentum, but it’s really just the beginning,” he said.

Pichai stated the new business will bring together product, engineering, marketing and sales, and Greene’s brief will be to integrate them into one cohesive offering. “Diane has a huge amount of operational experience that will continue to help the company,” he said.

In addition, Google is to acquire bebop, a company founded by Greene to simplify the building and maintenance of enterprise applications. “This will help many more businesses find great applications and reap the benefits of cloud computing,” said Pichai.

Bebop’s resources will be dedicated to building and integrating the entire range of Google’s cloud products, from devices like Android and Chromebooks, through infrastructure and services in the Google Cloud Platform, to developer frameworks for mobile and enterprise users, and finally end-user applications like Gmail and Docs.

The market for these cloud development tools will be worth $2.3 billion in 2019, up from $803 million this year, according to IDC. The knock-on effect is that more apps will run on the cloud of the service provider that supported their development, and that hosting business will triple to $22.6 billion by 2019, IDC says.
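A back-of-envelope check of the growth rate those IDC figures imply (treating “this year” as 2015):

```python
# Growth implied by the IDC figures, treating "this year" as 2015.
tools_2015, tools_2019 = 0.803, 2.3              # market size in $bn
years = 2019 - 2015
cagr = (tools_2019 / tools_2015) ** (1 / years) - 1
print(round(cagr, 2))                            # ~0.30, about 30% a year
```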

Greene and the bebop staff will join Google once the acquisition has completed. Greene’s division has yet to be named, but it will include units such as Google for Work, Cloud Platform and Google Apps, according to Android Central.