All posts by Business Cloud News

Software market frustrating for enterprise users says Gemalto research

Software licensing is still causing enterprises grief, according to new research by security firm Gemalto. The biggest pain points and causes of frustration are the inflexibility of licensing arrangements and the unhelpful delivery options.

According to the State of Software Monetization report, software vendors must change if they’re to satisfy enterprise user demand. This means delivering software as a service and making it accessible across multiple devices, it concludes.

The disparity between customer demand and vendor supply has been created by a shift in tastes among enterprise software customers. This is a function of the ‘bring your own device’ (BYOD) phenomenon, driven in part by intelligent device manufacturers and mobile phone makers. However, despite creating this demand for flexibility, they have not matched it with a similarly flexible and adaptable approach to software licensing and packaging, the report says.

The most frequently voiced complaint, from 87% of the survey sample, concerned the cost of renewing and managing licenses. Almost as many (83%) complained about time needlessly wasted on unfriendly processes for renewing and managing licenses, while 82% cited time and costs lost to non-product-related development. Most of the survey sample (68%) said they had little idea of how the products they buy are actually used in the enterprise.

Four out of five respondents believe that software needs to be future-proofed to be successful.

The report was compiled from feedback from 600 enterprise software users and 180 independent software vendors (ISVs) about the headaches of software licensing and packaging.

Software consumption is changing and customers only want to pay for what they use, according to Shlomo Weiss, Senior VP for Software Monetization at Gemalto. “Delivering software, in ways that customers want to consume it, is critical for creating a user experience that sells,” said Weiss.

Linux Foundation wants to extend Swagger in connected buildings

Members of the Linux Foundation have met in San Francisco to push its newly announced Open API Initiative. The collective wants to harmonise efforts in the development of connected building technology.

Founding members of the Open API Initiative, including Google, IBM, Intuit, Microsoft and PayPal, want to extend the range of Swagger, the popular framework for building application programming interfaces (APIs). Their collective ambition is to create a vendor-neutral, portable and open specification for providing metadata for APIs based on representational state transfer (REST).

A new open specification could let humans and computers discover and understand the potential of their proposed connected building services with minimal implementation logic. The Initiative will also promote the use of an open API standard.

Swagger, created in 2010 and released under an open source license a year later, is a description format used by developers in a broad range of industries. It’s used to design and deliver APIs that support a variety of connected applications and services. Downloads of Swagger and Swagger tooling have tripled in the last year as it became the most popular open source framework for defining and creating RESTful APIs.
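To give a flavour of what such machine-readable API metadata looks like, here is a minimal sketch of a Swagger 2.0-style description of a single REST endpoint, written as a Python dictionary and serialised to JSON. The field names follow the public Swagger specification, but the API itself (the host, path and parameters) is invented purely for illustration.

```python
import json

# Hypothetical, minimal Swagger 2.0-style document describing one
# endpoint of an imagined connected-building sensors API.
spec = {
    "swagger": "2.0",
    "info": {"title": "Building Sensors API", "version": "1.0.0"},
    "host": "api.example.com",
    "basePath": "/v1",
    "paths": {
        "/sensors/{id}/temperature": {
            "get": {
                "summary": "Read the latest temperature for a sensor",
                "parameters": [
                    {"name": "id", "in": "path",
                     "required": True, "type": "string"}
                ],
                "responses": {
                    "200": {"description": "Current reading in Celsius"}
                },
            }
        }
    },
}

# The same JSON document serves humans (documentation) and machines
# (code generators, test tooling) alike.
print(json.dumps(spec, indent=2))
```

The point of the Initiative is that a document like this can be consumed with minimal implementation logic: tooling can discover the endpoint, its parameters and its responses without reading any server code.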

SmartBear recently acquired the Swagger API open source project from Reverb Technologies and today is working with its industry peers to ensure the specification and format can continue to evolve. The open governance model for the Open API Initiative includes a Technical Developer Committee (TDC) that will manage the specification and keep users abreast of developments.

“Swagger is considered one of the most popular frameworks for building APIs. When an open source project reaches this level of maturity, it just can’t be managed by one entity,” said Jim Zemlin, executive director at The Linux Foundation. “The Open API Initiative will extend this technology to advance connected application development through open standards.”

AWS announces UK will be its third region in the EU by 2017

Amazon Web Services (AWS) is to add a UK region to its empire. On its opening date, mooted for the end of 2016 or early 2017, it will be the third region in the European Union and the 12th in the world.

The presence of an AWS region brings lower latency and strong data sovereignty to local users.

Amazon organises its ‘elastic computing’ by hosting it in multiple locations worldwide. Each location is a region: a separate geographical area containing multiple, isolated locations known as Availability Zones. The rationale is to give each computing ‘instance’ (or user) instant local response along with geographically diverse backup.
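The region/Availability Zone hierarchy described above can be sketched in a few lines of code. This is an illustrative toy model only: the zone names mirror AWS’s naming style for the Dublin and Frankfurt regions mentioned in this article, but the data is hand-written, not fetched from AWS, and the round-robin placement function is a hypothetical helper, not an AWS API.

```python
# Toy model: each region holds several isolated Availability Zones.
# Names follow AWS conventions but are hard-coded for illustration.
regions = {
    "eu-west-1": ["eu-west-1a", "eu-west-1b", "eu-west-1c"],  # Dublin
    "eu-central-1": ["eu-central-1a", "eu-central-1b"],       # Frankfurt
}

def spread_instances(region, count):
    """Round-robin instances across a region's isolated zones, so the
    loss of one zone does not take out every copy of a workload."""
    zones = regions[region]
    return [zones[i % len(zones)] for i in range(count)]

# Four instances in Dublin land in zones a, b, c, then wrap to a again.
print(spread_instances("eu-west-1", 4))
```

This is the rationale the article describes: responses stay local to the region, while spreading copies across isolated zones provides the geographically diverse backup.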

Announcing the new UK base on his blog, Amazon CTO Werner Vogels promised that Britain’s local and global enterprises, institutions and government departments will all get faster AWS cloud services than they have been getting. The new region will be coupled, for failover purposes, with the existing AWS regions in Dublin and Frankfurt. This local presence, says AWS, will provide lower-latency access to websites, mobile applications, games, SaaS applications, big data analysis and Internet of Things (IoT) apps.

“We are committed to our customers’ need for capacity,” said Vogels, who promised ‘powerful AWS services that eliminate the heavy lifting of the underlying IT infrastructure’.

The UK government’s Trade and Investment Minister Lord Maude described the decision as ‘great news for the UK’. The choice of the UK as the third European presence for AWS is “further proof the UK is the most favoured location in Europe for inward investment,” said Maude.

By providing commercial cloud services from data centres in the UK, AWS will create more healthy competition and innovation in the UK data centre market, according to HM Government Chief Technology Officer Liam Maxwell. “This is good news for the UK government given the significant amount of data we hold that needs to be kept onshore,” said Maxwell.

Yesterday, AWS evangelist Jeff Barr revealed in his blog that AWS will be opening a region in South Korea in early 2016, its fifth region in Asia Pacific.

EC calls for Safer Harbour agreement – issues new guidance

The European Commission has issued new guidance to companies on transatlantic data transfers and has called for a rapid creation of a new framework.

In October BCN reported how a ruling in the case of Schrems vs Data Protection Commissioner rendered the US-EU Safe Harbour Agreement invalid, after it was revealed that EU citizens’ data was being accessed by the US National Security Agency (NSA).

The Commission said it has stepped up talks with US authorities on a new framework and issued guidance to help companies comply with the ruling and work with alternative transfer tools.

“We need an agreement with our US partners in the next three months,” said EC Vice-President Andrus Ansip, who is responsible for the Digital Single Market. “The Commission has been asked to take swift action: this is what we are doing. Today we provide clear guidelines and we commit to a clear timeframe to conclude current negotiations.”

“Citizens need robust safeguards of their fundamental rights and businesses need clarity in the transition period,” said Commissioner Vera Jourová, adding that 4,000 companies currently rely on the transatlantic data pact.

The EC guidelines advise businesses on how data transfers can continue in the interim period, covering issues such as contractual solutions, Binding Corporate Rules for intra-group transfers, derogations, and the conclusion or performance of a contract. The guideline document, which is 7,981 words long, runs to 16 pages of challenging reading and is open to interpretation.

“As confirmed by the Article 29 Working Party, alternative tools authorising data flows can still be used by companies for lawful data transfers to third countries like the United States,” concludes the guidance document. “However, the Commission considers that a renewed and sound framework for transfers of personal data to the United States remains a key priority.”

Enforcement against non-compliance with the Safe Harbour court ruling comes into force at the end of January 2016.

Lenovo and Nutanix combo to run private cloud over hyperconverged datacentres

Datacentre hardware maker Lenovo is to install Nutanix software in a bid to speed up the process of building the infrastructure that supports private clouds.

The new family of hyperconverged appliances will be sold by Lenovo’s sales teams and its global network of partners.

Nutanix makes its own units that converge storage, server and virtualisation services into an integrated ‘scale-out’ appliance, but in this partnership Lenovo will use its own hardware to run the Nutanix software. The objective is to simplify data centre building by pre-engineering most of the integration tasks, and to make data centre management easier. This, say the manufacturers, will cut both the cost and the construction time of creating the foundation for a private cloud. It also, claims Nutanix, lowers the cost of ownership by creating modules in which moves and changes are easier to conduct and management is simpler.

By running the jointly created convergence appliances on Lenovo hardware, the partners can take full advantage of Lenovo’s close ties with Intel and run its latest processors. Lenovo said it is making ‘sizeable investments’ in a dedicated global sales team to support the new converged appliances for datacentre builders. Lenovo and Nutanix say they are jointly planning further co-development in platform engineering and coding, as well as joint marketing initiatives.

“Lenovo can bring a new perspective to the global enterprise space,” said Lenovo CEO Yang Yuanqing, “Nutanix’s well recognised technology leadership can dramatically reduce complexity in data centres of all sizes.”

The Lenovo OEM partnership with Nutanix goes well beyond typical alliances, said analyst Matt Eastwood, Senior VP for IDC’s Enterprise Infrastructure and Datacenter Group. “This partnership will accelerate the reach of hyperconverged infrastructure,” he said.

IBM targets API Harmony in the cloud

IBM is to use the cloud to deliver IT developers into a state of API Harmony.

The vendor turned service provider has launched an intelligent cloud-based matchmaking technology that helps coders instantly find the right application programming interface (API) for the right occasion. The service, API Harmony, could save the world’s developers significant time and money: IBM’s internal research predicts the global API economy will be worth $2.2 trillion by 2018.

The system uses cognitive technologies to anticipate the needs of a developer as they build new apps. It then pre-empts some of the delays that developers face by anticipating their interface challenges and researching the answers. It then makes useful time saving recommendations on which APIs to use, API relationships and anything that might be missing.

The API economy is characterised by IBM research as a commercial exchange of business functions and competencies in APIs. IBM says it is the driving force behind most digital transformation across industries today. By 2018, according to research company Ovum, the number of enterprises having an API program will have grown by 150%.

There are three pillars of harmoniousness in the API economy, according to IBM. Accordingly, its API Harmony service has three main components: Strategy, Technologies and Ecosystems. The Strategy element consists of IBM’s API Economy Journey Map, in which consultants will help clients identify key opportunities and gauge their readiness for the journey. The Technologies aspect of the service is built on the previously described intelligent matchmaking system and cloud delivery. The Ecosystem for the services is the fruit of an IBM collaboration with the Linux Foundation to create an open platform for building, managing and integrating open APIs.

IBM’s Watson APIs are managed by IBM API Management on Bluemix, bringing approximately 30 cognitive-based APIs to industries.

“To succeed in the API Economy enterprises need an open ecosystem and IBM is helping guide clients every step of the way,” said Marie Wieck, General Manager for IBM Middleware.

Cloud migrations driven by bosses, business leaders and board – report

The majority of cloud migrations are driven by the three Bs – bosses, board members and business leaders – as technology experts become marginalised, says a new report. However, the report also indicated that most projects end up being led by a technology-savvy third party.

Hosting vendor Rackspace’s new ‘Anatomy of a Cloud Migration’ study found that CEOs, directors and other business leaders are behind 61% of cloud migrations, rather than IT experts. Perhaps surprisingly, 37% of these laymen and laywomen see their cloud migration projects right through to completion, the study found.

The report, which compiled feedback from a survey of 500 UK IT and business decision-makers, also revealed what’s in the cloud, why it’s there and how much IT has been moved to the cloud already. There was some good news for the technology expert, as the report also indicates that one of the biggest lessons learned was that cloud migration is not a good experience and that the majority of companies end up consulting a third-party supplier. However, in the end, nine out of ten organisations were able to report that their business goals were met, albeit only ‘to some extent’. The report was compiled for Rackspace by Vanson Bourne.

Among the 500 companies quizzed, an average of 43% of the IT estate is now in the cloud. Cost cutting was the main motive in 61% of cases.

Surprisingly, 29% of respondents said they migrated their business-critical applications first, rather than embarking on a painful learning curve with a less important application. The report did not cross-reference this figure with the figures for migrations led by CIOs. However, 69% of the survey said they learned lessons from their migration that will affect future projects, which almost matches the 71% of respondents who didn’t make a mission-critical application their pilot migration project.

Other hoped-for outcomes nominated by the survey group were improvements in resilience (in 50% of cases), security (38%), agility (38%) and stabilising platforms and applications (37%).

A move to the cloud is no longer an exclusive function of the IT department, concluded Darren Norfolk, UK MD of Rackspace. “Whether business leaders understand the practicalities of a cloud migration project or not, there appears to be broad acceptance that they can do it,” he said.

Microsoft Azure to become a Red Hat Enterprise Linux channel partner

A new Microsoft-Red Hat partnership could make hybrid cloud computing a lot easier and less binding, in a surprise move that sees Microsoft become a channel partner for an open source company.

The availability of Red Hat’s Enterprise Linux-based systems on Microsoft Azure was the key component of a joint announcement on Wednesday. Microsoft will offer Red Hat Enterprise Linux as the preferred choice for enterprise Linux workloads on Microsoft Azure.

The two vendors also announced plans to jointly tackle issues that commonly arise when enterprises, ISVs and developers try to build, install and manage applications on Red Hat software across private and public clouds.

Under the terms of the partnership Red Hat systems will be available natively to Microsoft Azure customers and Microsoft Azure will become a Red Hat Certified Cloud and Service Provider. In return, Red Hat Cloud Access subscribers will also be able to bring their own virtual machine images to run in Microsoft Azure.

Microsoft Azure customers can now make full use of Red Hat applications such as JBoss Enterprise, JBoss Web Server, Red Hat Gluster Storage and OpenShift, Red Hat’s platform-as-a-service offering.

The two partners will jointly offer enterprise-grade support for hybrid computing set ups. The cross-platform, cross-company support will span both Microsoft and Red Hat offerings. In a new initiative, support teams from both vendors will be located on the same sites, in a bid to achieve the level of support cohesion the public cloud lacks, according to Red Hat.

The two partners will also work together to unify workload management across hybrid clouds. This will see Red Hat CloudForms interoperate with Microsoft Azure and Microsoft System Center Virtual Machine Manager. As a result, customers should be able to manage Red Hat Enterprise Linux on both Hyper-V and Microsoft Azure. Extra support for managing Azure workloads from Red Hat CloudForms is expected ‘in the next few months’.

There will also be a level of collaboration on .NET for a new generation of application development options, Red Hat said. Developers will have access to .NET technologies across Red Hat offerings, including Red Hat OpenShift and Red Hat Enterprise Linux.

“The data centre is heterogeneous, and the cloud is hybrid,” said Paul Cormier, president of Products and Technologies at Red Hat. “Together, we’re offering the most comprehensive support agreement for our mixed technologies to support customers.”

Orange Business Services and Akamai offer 10x faster cloud access

Orange Business Services (OBS) and content delivery specialist Akamai claim they have worked out a way to give enterprise clients up to 10 times faster access to business-critical cloud applications.

The new Business VPN Internet Accelerate service, available from OBS, was created using Akamai’s Cloud Networking technology. It optimizes software as a service (SaaS) access so that users don’t have to wait for cloud-based applications, dashboards and documents to open and save.

This, says OBS, will make their customer relationship management, enterprise resource planning and business intelligence activities more potent and productive. The performance improvement is made by tweaking the transport mechanism across the OBS virtual private networks that extend, via an IPSec tunnel, to branch offices in enterprises.

OBS said it improves the cloud user experience through five customer support centres and eight CyberSecurity Operations Centres where it analyses network traffic, constantly configures service levels and monitors security. The Orange business-grade Internet service relies on the global Akamai Intelligent Platform, a content delivery accelerator which has 200,000 servers in 110 countries to localize material. It uses Orange’s global private network of mobile, satellite and wireline access links.

Orange is currently running pilots of Business VPN Internet Accelerate with a number of enterprises. The service is scheduled to be globally available in early 2016.

Getting networks fit for a cloud future with business grade connectivity is vital, said Pierre-Louis Biaggi, OBS’s VP of Connectivity Solutions. “Business VPN Internet Accelerate allows enterprises to embrace global hybrid networks and the Internet,” said Biaggi.

The claim for ten times faster access speeds was based on the results of an Orange Proof-of-Concept between Paris and Singapore. The vendor did not disclose what system was used as a benchmark on which the ten-fold improvement was made.

Wind River launches comprehensive cloud suite

Embedded software vendor Wind River has launched what it describes as a ‘comprehensive cloud suite’ for multi-architecture operating systems.

The new Wind River range includes the Helix Cloud, Rocket and Pulsar Linux offerings which are designed to communicate across multiple devices, gateways and microcontroller units (MCUs).

The Helix Cloud is a family of software-as-a-service (SaaS) products including development tools, virtual labs and deployed devices. Their joint mission is to simplify and automate the building and managing of IoT technologies at every stage of the life cycle of a system, from design to decommissioning. The Helix Lab Cloud is a virtual hardware lab for simulating and testing IoT devices and complex systems. Meanwhile, the Device Cloud is designed for managing IoT devices and their data.

Wind River claims it can simplify edge-to-cloud development with a single operating system controlling all dialogue between the device and the cloud. Wind River’s Rocket is described as a tiny-footprint, commercial-grade real-time operating system that connects directly to its Helix Cloud. This, it claims, provides support for multiple architectures and applications running on the type of 32-bit MCUs used in small-footprint sensor hubs, wearables and edge devices.

Pulsar Linux is a small-footprint, commercial-grade binary Linux OS based on the Wind River Linux distribution that connects directly to the Helix Cloud and runs applications scaling from 32-bit MCUs to 64-bit CPUs.

The platform-independent Rocket and Pulsar Linux support Intel and ARM architectures and a range of mainstream commercial boards, so that apps can run on any device and developers can create an open, collaborative ecosystem.

Wind River partners include Advantech, Freescale, HCL Technologies, Texas Instruments and Xilinx. It has also launched a new developer programme for ISVs, OEMs, systems integrators, ODMs and cloud operators.