Category archive: Vendor

NetSuite ditches AWS in Microsoft partnership

NetSuite and Microsoft are linking their cloud services, and NetSuite is moving its services onto Azure

NetSuite has inked a deal with Microsoft in a move that will see the two companies link up the cloud-based financial and ERP platform with Microsoft Office, Windows and Azure.

As part of the deal the two companies have already integrated NetSuite and Azure Active Directory to enable single sign-on (SSO) for customers using NetSuite together with Azure AD, and in the coming months plan to drive further integration between NetSuite and Office 365 – for instance, to be able to do things like connect NetSuite data to Microsoft Excel and PowerBI in a more seamless way.
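
The Excel and Power BI hooks are not spelled out in the announcement, but the general pattern (pulling NetSuite records over a REST-style endpoint and writing them to an Excel workbook for analysis or Power BI import) can be sketched roughly as follows; the endpoint URL, token and field names are hypothetical rather than NetSuite’s documented API.

```python
# Sketch only: the endpoint, token and response shape are hypothetical placeholders,
# not NetSuite's documented API.
import requests
import pandas as pd

NETSUITE_URL = "https://example.api.netsuite.com/records/salesorder"  # hypothetical endpoint
TOKEN = "REPLACE_ME"  # access token obtained through the company's SSO/OAuth setup

resp = requests.get(NETSUITE_URL, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
resp.raise_for_status()

# Flatten the returned records into a table and export to Excel (requires openpyxl).
df = pd.DataFrame(resp.json()["items"])  # the 'items' key is an assumption
df.to_excel("netsuite_sales_orders.xlsx", index=False)
```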

The partnership will also see NetSuite move its service off Amazon Web Services, a long-time partner of the firm, and shift its on-premise deployments into Azure, now its “preferred cloud” provider, by the end of the year.

“We’re at the ‘end of the beginning’ of the cloud, in that the cloud business model that NetSuite pioneered in 1998 is becoming the de facto standard for how fast-growth businesses are run,” said Zach Nelson, NetSuite chief executive.

“We’re thrilled to work with Microsoft to deliver a fluid cloud environment across the key NetSuite and Microsoft applications that companies and their employees rely on to continually improve their day-to-day operations and run their business better and more efficiently,” Nelson said.

Steve Guggenheimer, corporate vice president of developer platform & evangelism and chief evangelist for Microsoft also commented on the deal: “I’m excited about NetSuite’s support for Azure Active Directory for single sign-on, cloud-to-cloud integration and increasing our collaboration across mobile and cloud solutions. Our joint vision is all about giving people the freedom to get more done through the broadening set of devices they interact with that in turn helps businesses innovate and grow.”

IBM closes Phytel acquisition as healthcare partnerships continue

IBM has closed its acquisition of Phytel

IBM announced this week it has closed the acquisition of Phytel, which provides cloud-based software that helps healthcare providers and care teams coordinate activities across medical facilities by automating certain aspects of patient care.

The company originally announced the acquisition back in April, when it also bought Explorys, a provider of cognitive cloud-based analytics that generates insights for care facilities from datasets derived from numerous and diverse financial, operational and medical record systems.

“The acquisition of Phytel supports our goal to advance the quality and effectiveness of personal healthcare by enabling secure access to individualised insights and a more complete picture of the many factors that can affect people’s health,” said Mike Rhodin, senior vice president, IBM Watson.

At the time, IBM said the acquisitions would bolster its efforts to sell advanced analytics and cognitive computing to primary care providers, large hospital systems and physician networks.

To that end the company also created a special healthcare unit within its Watson business unit to develop solutions specifically for the sector, based on the company’s cognitive compute platform.

Just last week the company redoubled its efforts to target health services, this time through social health and mobile platforms. It announced a deal with Japan Post and Apple that will see Japan Post deploy custom iOS apps built by IBM Global Business Services, which will provide services like medication reminders, exercise and diet tracking, community activity scheduling and grocery shopping as part of the post group’s Watch Over service for the elderly.

NCR offers cloud control for Android-based ATMs

NCR says Kalpana can nearly halve the time it takes to deploy new services

NCR has announced a radical new approach to ATM network deployment, with a cloud-based enterprise application allowing banks to control and manage thin-client devices running a locked-down version of the Android operating system.

Called Kalpana, the software can deliver a 40 per cent reduction in cost of ownership and halve the time it takes to develop and deploy new services, the company claims.

Robert Johnson, global director of software solutions at NCR, said banks are under pressure to improve services and reduce costs and so are “making very careful technical choices”. With increased emphasis on digital mobile and internet banking the ATM channel “has started looking a little disconnected”, he added.

ATM architecture development has been relatively static over the past 10 to 15 years, and while the devices have become more sophisticated in terms of features such as cash deposit and recycling, colour screens and online chat capabilities, essentially they are all customised PCs, creating problems of management, security and cost.

According to Andy Monahan, vice president of software engineering and general manager for Kalpana at NCR, the recent migration from Microsoft Windows XP to Windows 7 “has forced a rethink” and a few years ago the company decided that Monahan’s team at NCR’s Global R&D centre in Dundee, Scotland, should “take a blank sheet of paper and ask ‘what should we build next’”.

“There are two main discussions that we have with CIOs,” says Monahan. “One is whether there is a viable alternative to Windows, and the second is about the fact that the banks have a fairly clear idea of the IT architecture they want, and they want it to be consistent.”

The choice of operating system was relatively straightforward, he said. “When you look across the spectrum of embedded operating systems Android is pretty standard.”

By having a thin-client Android based device “everything that is customised is removed from the device and taken into the enterprise”, from where it can be configured and managed remotely. It also dramatically improves security. “By removing everything from the ATM thin-client and taking it into the cloud you create a locked-down environment that is very secure: there’s no BIOS and there’s no hard drives, just a secure boot loader that validates the kernel and checks all the certificates,” said Monahan.
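
NCR has not published how Kalpana’s boot chain is implemented, but the idea Monahan describes, a boot loader that only hands control to a kernel whose signature checks out against a trusted certificate, can be sketched in outline as follows; the file paths, key handling and failure policy are illustrative assumptions rather than NCR’s design.

```python
# Rough sketch of signature-verified boot logic; paths, key handling and failure policy
# are illustrative assumptions, not NCR's implementation.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def kernel_is_trusted(kernel_path: str, sig_path: str, pubkey_path: str) -> bool:
    """Return True only if the kernel image carries a valid vendor signature."""
    with open(pubkey_path, "rb") as f:
        public_key = serialization.load_pem_public_key(f.read())
    with open(kernel_path, "rb") as f:
        kernel_image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        public_key.verify(signature, kernel_image, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    if kernel_is_trusted("boot/kernel.img", "boot/kernel.sig", "keys/vendor_pub.pem"):
        print("kernel verified, continuing boot")
    else:
        print("verification failed, halting")
```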

By having the management systems in the enterprise software stack it is easier to develop new applications and services alongside other channels such as mobile and internet, ensuring a faster time-to-market and a more consistent customer experience, as well as to simplify management and maintenance tasks.

The first ATM to work in the Kalpana environment is NCR’s new Cx110, which uses Android tablet technology. Cardtronics, the world’s largest retail ATM owner/operator, has already taken delivery of the Kalpana software and Cx110 ATMs and plans to pilot them at locations in the Dallas-Fort Worth area, beginning “in the next few weeks”.

Hortonworks buys SequenceIQ to speed up cloud deployment of Hadoop

SequenceIQ will help boost Hortonworks’ position in the Hadoop ecosystem

Hortonworks has acquired SequenceIQ, a Hungary-based startup delivering infrastructure agnostic tools to improve Hadoop deployments. The company said the move will bolster its ability to offer speedy cloud deployments of Hadoop.

SequenceIQ’s flagship offering, Cloudbreak, is a Hadoop-as-a-Service API for multi-tenant clusters that applies some of the capabilities of Ambari Blueprints (which let you create a Hadoop cluster without having to use the Ambari Cluster Install Wizard) and Periscope (autoscaling for Hadoop YARN) to help speed up deployment of Hadoop on different cloud infrastructures.
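
For context, an Ambari blueprint is simply a JSON description of a cluster’s stack, services and host groups, which is what lets Cloudbreak stamp out identical clusters without walking through the install wizard. A minimal sketch of registering one against Ambari’s REST API follows; the Ambari host, credentials and component list are illustrative rather than a production cluster definition.

```python
# Minimal Ambari blueprint registration sketch; host, credentials and components
# below are illustrative, not a production cluster definition.
import requests

blueprint = {
    "Blueprints": {"blueprint_name": "single-node-hdp", "stack_name": "HDP", "stack_version": "2.2"},
    "host_groups": [
        {
            "name": "host_group_1",
            "cardinality": "1",
            "components": [
                {"name": "NAMENODE"}, {"name": "DATANODE"},
                {"name": "RESOURCEMANAGER"}, {"name": "NODEMANAGER"},
            ],
        }
    ],
}

resp = requests.post(
    "http://ambari.example.com:8080/api/v1/blueprints/single-node-hdp",
    json=blueprint,
    auth=("admin", "admin"),
    headers={"X-Requested-By": "ambari"},  # Ambari rejects write requests without this header
    timeout=30,
)
resp.raise_for_status()
```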

The two companies have partnered extensively in the Hadoop community, and Hortonworks said the move will enhance its position among a growing number of Hadoop incumbents.

“This acquisition enriches our leadership position by providing technology that automates the launching of elastic Hadoop clusters with policy-based auto-scaling on the major cloud infrastructure platforms including Microsoft Azure, Amazon Web Services, Google Cloud Platform, and OpenStack, as well as platforms that support Docker containers. Put simply, we now provide our customers and partners with both the broadest set of deployment choices for Hadoop and quickest and easiest automation steps,” Tim Hall, vice president of product management at Hortonworks, explained.

“As Hortonworks continues to expand globally, the SequenceIQ team further expands our European presence and firmly establishes an engineering beachhead in Budapest. We are thrilled to have them join the Hortonworks team.”

Hall said the company also plans to contribute the Cloudbreak code back to the Apache Software Foundation sometime this year, though whether it will do so as part of an existing project or as a standalone one has yet to be decided.

Hortonworks’ bread and butter is in supporting enterprise adoption of Hadoop and bringing the services component to the table, but it’s interesting to see the company commit to feeding the Cloudbreak code – which could, at least temporarily, give it a competitive edge – back into the ecosystem.

“This move is in line with our belief that the fastest path to innovation is through open source developed within an open community,” Hall explained.

The big data M&A space has seen more consolidation over the past few months, with Hitachi Data Systems acquiring big data and analytics specialist Pentaho and Infosys’ $200m acquisition of Panaya.

IBM goes after healthcare with acquisitions, Apple HealthKit partnership, new business unit

IBM is pushing hard to bring Watson to the healthcare sector

IBM announced a slew of moves aimed at strengthening its presence in the healthcare sector, including two strategic acquisitions, a HealthKit-focused partnership with Apple, and the creation of a new Watson and cloud-centric healthcare business unit.

IBM announced it has reached an agreement to acquire Explorys, which deploys cognitive cloud-based analytics on datasets derived from numerous and diverse financial, operational and medical record systems, and Phytel, which provides cloud-based software that helps healthcare providers and care teams coordinate activities across medical facilities by automating certain aspects of patient care.

The company said the acquisitions would bolster IBM’s efforts to sell advanced analytics and cognitive computing to primary care providers, large hospital systems and physician networks.

“As healthcare providers, health plans and life sciences companies face a deluge of data, they need a secure, reliable and dynamic way to share that data for new insight to deliver quality, effective healthcare for the individual,” said Mike Rhodin, senior vice president, IBM Watson. “To address this opportunity, IBM is building a holistic platform to enable the aggregation and discovery of health data to share it with those who can make a difference.”

That ‘holistic platform’ is being developed by the recently announced Watson Health unit, which as the name suggests will put IBM’s cognitive compute cloud service Watson at the heart of a number of healthcare-focused cloud storage and analytics solutions. The unit has also developed the Watson Health Cloud platform, which allows the medical data it collects to be anonymized, shared and combined with a constantly-growing aggregated set of clinical, research and social health data.
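
IBM has not detailed how that anonymisation works under the hood, but the basic de-identification step, stripping direct identifiers and replacing the patient ID with a salted hash before records are aggregated, can be sketched along these lines; the field names and salt handling are purely illustrative.

```python
# Illustrative de-identification sketch; field names and salt handling are assumptions,
# not IBM's documented Watson Health Cloud pipeline.
import hashlib

DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "date_of_birth"}
SALT = b"replace-with-a-secret-salt"

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and replace the patient ID with a salted hash."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["patient_key"] = hashlib.sha256(SALT + str(record["patient_id"]).encode()).hexdigest()
    del cleaned["patient_id"]
    return cleaned

record = {"patient_id": 12345, "name": "Jane Doe", "date_of_birth": "1970-01-01",
          "blood_pressure": "120/80", "steps_today": 8421}
print(deidentify(record))
```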

“All this data can be overwhelming for providers and patients alike, but it also presents an unprecedented opportunity to transform the ways in which we manage our health,” said John E. Kelly III, IBM senior vice president, solutions portfolio and research. “We need better ways to tap into and analyze all of this information in real-time to benefit patients and to improve wellness globally.”

Lastly, IBM announced an expanded partnership with Apple that will see IBM offer its Watson Health Cloud platform as a storage and analytics service for HealthKit data aggregated from iOS devices, and open the platform up for health and fitness app developers as well as medical researchers.

Many of IBM’s core technologies that have since found their way into Watson (natural language processing, proprietary algorithms and so on) are already in use by a number of pioneering medical facilities globally, so it makes sense for IBM to pitch its cognitive compute capabilities to the healthcare sector – particularly in the US, where facilities are legally incentivised to use new technologies to reduce the cost of patient care while keeping quality of service high. Commercial deals around Watson have so far been scarce, but it’s clear the company is keen to do what it can to create a market for cloud-based cognitive computing.

Fujitsu, Microsoft collaborate on Azure, Internet of Things

Fujitsu and Microsoft are partnering on IoT for farming and agriculture

Fujitsu and Microsoft announced an Internet of Things partnership focused on blending the former’s devices and IoT services for agriculture and manufacturing, powered by Windows software and Azure cloud services.

The move will see the two companies offer a solution that blends Fujitsu’s Eco-Management Dashboard, an IoT service for the agricultural sector, and Microsoft’s Azure database services so that data collected from sensors deployed throughout the operations can be analysed to help firms save money and streamline processes.
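
Neither company has said exactly how the sensor data reaches Azure, but a common pattern today is to push readings through Azure IoT Hub with the azure-iot-device SDK; the sketch below assumes that route, and the connection string and payload fields are placeholders.

```python
# Sketch of pushing a field-sensor reading into Azure via IoT Hub using the
# azure-iot-device SDK; connection string and payload fields are placeholders.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

reading = {"device_id": "greenhouse-07", "soil_moisture": 0.31, "air_temp_c": 22.4}
client.send_message(Message(json.dumps(reading)))  # lands in IoT Hub for downstream analytics
client.shutdown()
```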

The companies said the platform has uses in other sectors and can be tailored to a range of different niche verticals.

“Leveraging the Fujitsu Eco-Management Dashboard solution alongside Microsoft Azure and the Fujitsu IoT/M2M platform, we are able to deliver real-time visualisation of the engineering process for big data analytics to improve the entire production process and inform decision-making,” said Hiroyuki Sakai, corporate executive officer, executive vice president, head of global marketing at Fujitsu.

“We are proud to partner with Fujitsu to enable the next generation of manufacturing business models and services enabled by IoT along with advanced analytics capabilities like machine learning,” said Sanjay Ravi, managing director, Discrete Manufacturing Industry at Microsoft. “Fujitsu’s innovation will drive new levels of operational excellence and accelerate the pace of digital business transformation in manufacturing.”

Fujitsu has been doubling down on IoT this year, with manufacturing looking to be a strong sector for those kinds of services, according to analysts. In January the company announced plans to expand its two core datacentres in Japan to keep pace with accelerating demand for its cloud and IoT services.

AWS doubles down on DaaS with virtual desktop app marketplace

AWS is bolstering its ecosystem around desktops

Amazon has launched an application marketplace for AWS WorkSpaces, the company’s public cloud-based desktop-as-a-service, which it said would help users deploy virtualised desktop apps more quickly while keeping costs and permissioning under control.

Last year AWS launched WorkSpaces to appeal to mobile enterprises and the thin-client crowd, and the company said the app marketplace will allow users to quickly provision and deploy software directly onto virtual desktops – with software subscriptions charged monthly, and Amazon handling all of the billing.
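
Provisioning of the underlying WorkSpaces desktops is already exposed through the standard AWS API, which the new marketplace builds on; a minimal boto3 sketch looks like the following, where the directory, bundle and user identifiers are placeholders for values from your own account.

```python
# Sketch of provisioning a WorkSpaces desktop with boto3; directory, bundle and
# user identifiers are placeholders for values from your own AWS account.
import boto3

ws = boto3.client("workspaces", region_name="us-east-1")

resp = ws.create_workspaces(
    Workspaces=[{
        "DirectoryId": "d-0123456789",  # placeholder directory ID
        "UserName": "alice",            # placeholder directory user
        "BundleId": "wsb-0123456789",   # placeholder bundle (OS plus hardware profile)
    }]
)
print("pending:", resp["PendingRequests"])
print("failed:", resp["FailedRequests"])

# Later, list what is running:
for w in ws.describe_workspaces()["Workspaces"]:
    print(w["WorkspaceId"], w["UserName"], w["State"])
```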

To complement the marketplace the company unveiled the WorkSpaces Application Manager, which will enable IT managers to track and manage application usage, cost, and permissions.

“With just a few clicks in the AWS Management Console, Amazon WorkSpaces customers are able to provision a high-quality, cloud-based desktop experience for their end users at half the cost of other virtual desktop infrastructure solutions,” said Gene Farrell, general manager of AWS Enterprise Applications.

“By introducing the AWS Marketplace for Desktop Apps and Amazon WAM, AWS is adding even more value to the Amazon WorkSpaces experience by helping organizations reduce the complexity of selecting, provisioning, and deploying applications. With pay-as-you-go monthly pricing and end-user self-provisioning of applications, customers will lower the costs associated with provisioning and maintaining applications for their workforce,” Farrell said.

AWS has spent the better part of the last 9 years building up a fairly vibrant ecosystem of third-party services around its core set of infrastructure offerings, and it will be interesting to see whether the company can replicate that success on the desktop. Amazon says many companies, particularly the larger ones, deploy a mix of upwards of 200 software titles to their desktops, which would suggest a huge opportunity for the cloud giant and its partners.

Microsoft unveils Hyper-V containers, nano servers

Microsoft has unveiled Hyper-V containers and nano servers

Microsoft has unveiled a number of updates to Windows Server including Hyper-V containers, which are essentially Docker containers embedded in Hyper-V VMs, and nano servers, a slimmed down Windows server image.

Microsoft said Hyper-V containers are ideal for users that want virtualisation-grade isolation, but still want to run their workloads within Docker containers in a Windows ecosystem.

“Through this new first-of-its-kind offering, Hyper-V Containers will ensure code running in one container remains isolated and cannot impact the host operating system or other containers running on the same host,” explained Mike Neil, general manager for Windows Server at Microsoft, in a recent blog post.

“In addition, applications developed for Windows Server Containers can be deployed as a Hyper-V Container without modification, providing greater flexibility for operators who need to choose degrees of density, agility, and isolation in a multi-platform, multi-application environment.”
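
In practice the isolation level is selectable per container on Windows hosts; a rough sketch using the Docker SDK for Python is shown below, where the image tag is illustrative and a Hyper-V-capable Windows host is assumed.

```python
# Rough sketch: running a Windows container under Hyper-V isolation via the Docker
# SDK for Python. Requires a suitable Windows host; the image tag is illustrative.
import docker

client = docker.from_env()

output = client.containers.run(
    "mcr.microsoft.com/windows/nanoserver:ltsc2022",  # illustrative Windows base image
    ["cmd", "/c", "echo", "hello from an isolated container"],
    isolation="hyperv",  # "process" would give standard Windows Server container isolation
    remove=True,
)
print(output.decode())
```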

Windows Server Containers will be enabled in the next release of Windows Server, due to be demoed in the coming weeks, making good on Microsoft’s commitment to make the Windows Server ecosystem (including Azure) Docker-friendly.

The company also unveiled what it’s calling nano servers, a “purpose-built OS” that is essentially a stripped-down Windows Server image optimised for cloud and container workloads. It can be deployed onto bare metal and, because Microsoft has stripped out large amounts of code, it boots and runs more quickly.

“To achieve these benefits, we removed the GUI stack, 32 bit support (WOW64), MSI and a number of default Server Core components. There is no local logon or Remote Desktop support. All management is performed remotely via WMI and PowerShell. We are also adding Windows Server Roles and Features using Features on Demand and DISM. We are improving remote manageability via PowerShell with Desired State Configuration as well as remote file transfer, remote script authoring and remote debugging.  We are working on a set of new Web-based management tools to replace local inbox management tools,” the company explained.

“Because Nano Server is a refactored version of Windows Server it will be API-compatible with other versions of Windows Server within the subset of components it includes. Visual Studio is fully supported with Nano Server, including remote debugging functionality and notifications when APIs reference unsupported Nano Server components.”
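
That remote-only model means day-to-day administration happens over PowerShell remoting rather than at a local console; from a management workstation the pattern looks roughly like this, using the third-party pywinrm package, with the host name and credentials as placeholders.

```python
# Sketch of remote PowerShell administration against a headless server using the
# third-party pywinrm package; host name and credentials are placeholders.
import winrm

session = winrm.Session("nano-host.example.com",
                        auth=("administrator", "REPLACE_ME"),
                        transport="ntlm")

# Query running services remotely; there is no local console on the box itself.
result = session.run_ps("Get-Service | Where-Object Status -eq 'Running' | Select-Object -First 5 Name")
print(result.status_code)
print(result.std_out.decode())
```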

The move is a sign Microsoft is keen to keep its on-premise and cloud platform ahead of the technology curve, and is likely to appeal to .NET developers who are attracted to some of the benefits of containers while wanting to stay firmly within a Windows world in terms of the tools and code used. That said, the company said it is working with Chef to ensure nano servers work well with Chef’s DevOps tools.

Why did anyone think HP was in it for public cloud?

HP president and chief executive officer Meg Whitman (pictured right) is leading HP’s largest restructuring ever

Many have jumped on a recently published interview with Bill Hilf, the head of HP’s cloud business, as a sign HP is finally coming to terms with its inability to make a dent in Amazon’s public cloud business. But what had me scratching my head is not that HP would so blatantly seem to cede ground in this segment – but why many assume it wanted to in the first place.

For those of you that didn’t see the NYT piece, or the subsequent pieces from the hordes of tech insiders and journalists more or less toeing the “I told you so” line, Hilf was quoted as candidly saying: “We thought people would rent or buy computing from us. It turns out that it makes no sense for us to go head-to-head [with AWS].”

HP has made mistakes in this space – the list is long, and others have done a wonderful job of fleshing out the classic “large incumbent struggles to adapt to a new paradigm” narrative that the company’s story, so far, smacks of.

I would only add that it’s a shame HP didn’t pull a “Dell” and publicly get out of the business of directly offering public cloud services to enterprise users; that exit served Dell well. Standing up public cloud services is by most accounts an extremely capital-intensive exercise that a company like HP, given its current state, is simply not best positioned to see through.

But it’s also worth pointing out that a number of interrelated factors have been pushing HP towards private and hybrid cloud for some time now, and despite HP’s insistence that it still runs the largest OpenStack public cloud – a claim other vendors have made in the past – its dedication to public cloud has always seemed superficial at best (particularly if you’ve had the, um, privilege of sitting through years of sermons from HP executives at conferences and exhibitions).

HP’s heritage is in hardware – desktops, printers and servers – and servers still account for a reasonably large chunk of the company’s revenue, something it has no choice but to keep in mind as it seeks to move up the stack in other areas (its recent NFV and cloud workload management-focused acquisitions attest to this, beyond the broader industry trend). According to the latest Synergy Research figures the company still has a lead in the cloud infrastructure market, but primarily in private cloud.

It wants to keep that lead in private cloud, no doubt, but it also wants to bolster its pitch to the scale-out market (where telcos are quite keen to play) without alienating its enterprise customers. This also means delivering capabilities that are starting to see increased demand among that segment, like hybrid cloud workload management, security and compliance tools, and offering a platform that has enough buy-in to ensure a large ecosystem of applications and services will be developed for it.

Whether OpenStack is the best way of hitting those sometimes competing objectives remains to be seen – HP hasn’t had these products in the market very long, and take-up has been slow – but that’s exactly what Helion is to HP.

Still, it’s worth pointing out that OpenStack, while trying to evolve capabilities that would whet the appetites of communications services providers and others in the scale-out segment (NFV, object storage, etc.), is seeing much more take-up from the private cloud crowd. Indeed, one of the key benefits of OpenStack is easy burstability from private into public OpenStack-based clouds and, still more of a work in progress, federatability between them. The latter, by the way, is definitely consistent with the logic underpinning HP’s latest cloud partnership with the European Commission, which looks at – among other things – the potential federatability of regional clouds that have strong security and governance requirements.

Even HP’s acquisition strategy – particularly its purchase of Eucalyptus, a software platform that makes it easy to shift workloads between on-premise systems and AWS – seems in line with the view that a private cloud needs to be able to lean on someone else’s datacentre from time to time.

HP has clearly chosen its mechanism for doing just that, just as VMware looked at the public cloud and thought much the same in terms of extending vSphere and other legacy offerings. Like HP, it wanted to hedge its bets and stand up its own public cloud platform because, apart from the “me too” aspect, it thought doing so was in line with where users were heading, and, to a much lesser extent, it didn’t want to let AWS, Microsoft and Google have all the fun if it didn’t have to. But public cloud definitely doesn’t seem front-of-mind for HP, or VMware, or most other vendors coming at this from an on-premise heritage (HP’s executives mentioned “public cloud” just once in the past three quarterly results calls with journalists and analysts).

Funnily enough, even VMware has come up with its own OpenStack distribution, and now touts a kind of “one cloud, any app, any device” mantra that has hybrid cloud written all over it – ‘hybrid cloud service’ being what the previous incarnation of its public cloud service was called.

All of this is of course happening against the backdrop of the slow crawl up the stack with NFV, SDN, cloud resource management software, PaaS, and so forth – not just at HP. Cisco, Dell and IBM are all looking to make inroads in software, while at the same time, on the hardware side, fighting off lower-cost Asian ODMs that are – with the exception of IBM – starting to significantly encroach on their turf, particularly in the scale-out markets.

The point is, HP, like many old-hat enterprise vendors, knows that what ultimately makes AWS so appealing isn’t its cost (it can actually be quite expensive, though prices – and margins – are dropping) or ease of procurement as an elastic hosting provider. It’s the massive ecosystem of services that gives the platform so much value, and the ability to tap into them fairly quickly. HP has bet the farm on OpenStack’s capacity to evolve into a formidable competitor to AWS in that sense (IBM and Cisco are also, to varying degrees, toeing a similar line), and it shouldn’t be dismissed outright given the massive buy-in that open source community has.

But – and some would view this as part of the company’s problem – HP’s bread and butter has been, and continues to be, offering the technologies and tools to stand up predominantly private clouds (or, in the case of service providers, very large private clouds; it’s also big on converged infrastructure), and supporting those technologies and tools. That really isn’t, directly, the business that AWS is in, despite there being substantial overlap in the enterprise customers they go after.

AWS, on the other hand, while it started in this space as an elastic hosting provider offering CDN and storage services, has more or less evolved into a kind of application marketplace, where any app can be deployed on almost infinitely scalable compute and storage platforms. Interestingly, AWS’s messaging has shifted from outright hostility towards the private cloud crowd (and private cloud vendors) towards being more open to the idea that some enterprises simply don’t want to expose their workloads or host them on shared infrastructure – in part because it understands there’s growing overlap, and because it wants them to on-board their workloads onto AWS.

HP’s problem isn’t that it tried and failed at the public cloud game – you can’t really fail at something if you don’t have a proper go at it; and on the private cloud front, Helion is still quite young, as are OpenStack, Cloud Foundry and many of the technologies at the core of its revamped strategy.

Rather, it’s that HP, for all its restructuring efforts, talk of change and trumpeting of cloud, still risks getting stuck in its old-world thinking, which could ultimately hinder the company further as it seeks to transform itself. AWS senior vice president Andy Jassy, who hit out at tech companies like HP at the unveiling of Amazon’s Frankfurt-based cloud service last year, hit the nail on the head: “They’re pushing private cloud because it’s not all that different from their existing operating model. But now people are voting with their workloads… It remains to be seen how quickly [these companies] will change, because you can’t simply change your operating model overnight.”

Hybrid cloud management vendor CliQr scores $20m from Polaris Partners, Google

CliQr has raised $20m in its third round of funding, which the vendor said will be used to bolster its international expansion

CliQr, a provider of hybrid cloud management services, has secured $20m in series C funding in an investment round led by Polaris Partners with participation from Foundation Capital, Google Ventures and TransLink Capital.

Managing workloads that sit on diverse public and private cloud platforms from a single pane of glass can be tricky, and making sure those workloads port cleanly between cloud platforms that are distributed to varying degrees is harder still.

That’s the space CliQr fills with its flagship software offering, Cloud Centre; the company says its offering is based on proprietary “app-centric” technologies that enable hybrid cloud workload lifecycle management without having to do any scripting or tweaking under the hood of the migrated apps.

The latest funding round, which brings the total amount secured by the company since its founding to $38m, will be used to bolster CliQr’s expansion globally.

“CliQr built its technology and reputation by listening to customers about their requirements for the cloud,” said Gaurav Manglik, chief executive officer and co-founder of CliQr.

“To meet our customers’ needs, we are delivering on our vision for unshackling applications from the complexity of ever-changing hybrid cloud environments. Our approach is validated by a strong product platform, enterprise customers, worldwide partners and top-tier investors. We’re ready to put this new investment to work by helping us expand globally to meet skyrocketing demand for our platform,” Manglik said.