Category archive: News & Analysis

Microsoft announces R Server availability inside Azure HDInsight

Microsoft has announced the availability of R Server inside Azure HDInsight, the company’s Hadoop-as-a-service component of Azure Data Lake.

The announcement, made at Strata + Hadoop World, suggests the company is hoping to capitalize on the growing adoption of open source technologies. Microsoft R is now fully compatible with open source R, and any existing R library can be used in the R Server context.

Microsoft acquired Revolution Analytics in early 2015 as a means of entering the R-based analytics market, and has since delivered SQL Server R Services on SQL Server 2016 CTP3. R is one of the world’s most widely used programming languages for predictive analytics.

“By making R Server available as a workload inside HDInsight, we remove obstacles for users to unlock the power of R by eliminating memory and processing constraints and extending analytics from the laptop to large multi-node Hadoop and Spark clusters,” said Oliver Chiu, Product Marketing, Hadoop/Big Data and Data Warehousing at Microsoft. “This enables the ability to train and run ML models on larger datasets than previously possible to make more accurate predictions that affect the business.”

The company claims that making R Server available as a workload inside HDInsight will remove memory and processing constraints, allowing developers to make better use of Hadoop and Spark clusters. If correct, organizations will be able to run machine learning models on larger datasets, increasing the accuracy of the business predictions those models produce.

“This gives you the familiarity of the R language for machine learning while leveraging the scalability and reliability built into Hadoop and Spark,” said Chiu. “It also eliminates memory and processing constraints and easily extends their code from their laptop to large multi-terabyte files producing models that are more powerful and accurate.”
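Chiu’s point about eliminating memory constraints rests on a well-established pattern: rather than loading a whole dataset into RAM, scale-out analytics engines accumulate sufficient statistics chunk by chunk. As a rough illustration of the idea only (a Python sketch, not Microsoft’s actual ScaleR implementation), a linear regression can be fitted from running sums computed over chunks of bounded size:

```python
def chunks(stream, size):
    """Yield lists of at most `size` rows, so memory use stays bounded."""
    buf = []
    for row in stream:
        buf.append(row)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:
        yield buf

def streaming_fit(rows, chunk_size=10_000):
    """Fit y = slope*x + intercept from running sums, one chunk at a time."""
    n = sx = sy = sxx = sxy = 0.0
    for chunk in chunks(rows, chunk_size):
        for x, y in chunk:
            n += 1
            sx += x
            sy += y
            sxx += x * x
            sxy += x * y
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# simulate a stream too large to hold comfortably in memory: y = 2x + 1
data = ((float(i), 2.0 * i + 1.0) for i in range(100_000))
slope, intercept = streaming_fit(data)
print(slope, intercept)
```

Because only the running sums are kept, the same logic works whether the stream holds a thousand rows or a billion; computing the per-chunk sums on separate Hadoop or Spark nodes and combining them is the natural parallel extension.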

IBM launches brain-inspired supercomputer with Lawrence Livermore National Laboratory

IBM and Lawrence Livermore National Laboratory have launched a new project to build a brain-inspired supercomputing platform for deep learning inference.

The project will be built on IBM’s TrueNorth chip, which the company claims can process the equivalent of 16 million neurons and 4 billion synapses while consuming roughly the energy of a tablet computer. The neural network design of IBM’s Neuromorphic System aims to perform complex cognitive tasks, such as pattern recognition and integrated sensory processing, far more economically than conventional chips.

“The delivery of this advanced computing platform represents a major milestone as we enter the next era of cognitive computing,” said Dharmendra Modha, Chief Scientist for Brain-inspired Computing at IBM Research. “We value our relationships with the national labs. In fact, prior to design and fabrication, we simulated the IBM TrueNorth processor using LLNL’s Sequoia supercomputer. This collaboration will push the boundaries of brain-inspired computing to enable future systems that deliver unprecedented capability and throughput, while helping to minimize the capital, operating and programming costs – keeping our nation at the leading edge of science and technology.”

The technology will be utilized in a number of ways within the National Nuclear Security Administration (NNSA), including the organization’s Stockpile Stewardship Program, which tests and maintains the reliability of its nuclear weapons without the use of live nuclear testing.

“Neuromorphic computing opens very exciting new possibilities and is consistent with what we see as the future of the high performance computing and simulation at the heart of our national security missions,” said Jim Brase, Livermore National Laboratory’s Deputy Associate Director for Data Science. “The potential capabilities neuromorphic computing represents and the machine intelligence that these will enable will change how we do science.”

While artificial intelligence has been one of the more prominent trends in the cloud computing world, the success of the technology, and of the PR stunts built around it, has varied.

AlphaGo is an example of AI’s success, as Google DeepMind’s program beat world Go champion Lee Se-dol in a five-game series. As traditional machine learning techniques could not be applied in this instance, the team combined an advanced tree search with deep neural networks, allowing the program to adjust its behaviour through reinforcement learning. The win came as a surprise to commentators, as Go is a game that relies on intuition and feel.
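AlphaGo’s actual pipeline is far more elaborate, but the reinforcement-learning idea mentioned above, adjusting behaviour from reward feedback rather than labelled examples, can be sketched with tabular Q-learning on a toy corridor environment (an illustrative example only, not DeepMind’s method):

```python
import random

N_STATES = 5          # states 0..4; state 4 is the terminal goal
ACTIONS = [-1, +1]    # 0 = step left, 1 = step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1   # learning rate, discount, exploration

def step(state, action):
    """Apply a move; the only reward is 1.0 for reaching the goal state."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

def train(episodes=500, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]   # Q[state][action]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # epsilon-greedy: mostly exploit the current estimates,
            # occasionally explore at random
            if rng.random() < EPS:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] >= q[s][1] else 1
            nxt, r, done = step(s, ACTIONS[a])
            # Q-learning update towards the bootstrapped target
            target = r if done else r + GAMMA * max(q[nxt])
            q[s][a] += ALPHA * (target - q[s][a])
            s = nxt
    return q

q = train()
# greedy action per non-terminal state (1 = move right)
policy = [0 if q[s][0] >= q[s][1] else 1 for s in range(N_STATES - 1)]
print(policy)
```

The trained greedy policy should move right in every state, learned from nothing but the end-of-corridor reward; AlphaGo replaces the table with deep neural networks and the corridor with the vastly larger state space of Go.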

On the opposite end of the spectrum, Microsoft has had to release an apology after its Twitter-based AI stunt backfired. The program tweeted controversial comments, as it was unable to grasp the politically incorrect nature of the messages it received from users, as reported by the Independent.

NTT Data to acquire Dell Services for $3.06 billion

Japan’s NTT Data is to acquire Dell’s IT Services business for $3.06 billion, in an effort to bolster its footprint in the North American region.

The announcement confirms speculation over recent months as to the future of the IT Services division, as Dell has been rumoured to be searching for a buyer for the business unit to aid financing of the EMC deal. Dell Services was initially formed through the acquisition of Perot Systems in 2009 for $3.9 billion. The new agreement with NTT Data will see Dell absorb an $800 million loss on the division and could indicate that financing the EMC acquisition is more difficult than initially expected.

In December, BCN reported Dell had been facing challenges in financing one of the biggest financial deals in history. For the $63 billion EMC acquisition to proceed, Dell has had to reduce its levels of debt with the Perot Systems business unit rumoured to be a favourite for sale.

Dell Services will initially remain under the leadership of Suresh Vaswani, its current President, who will continue to report to Dell CEO Michael Dell until the deal completes. It is believed that, as part of the acquisition, NTT Data will take on 28,000 Dell employees, though the future leadership of the business has not been confirmed.

“NTT Data is pleased with the unique opportunity to acquire such high-calibre talent, and a corporate culture that shares common values with NTT Data, with emphasis on client first, foresight, teamwork and a commitment to innovation,” said Toshio Iwamoto, President and CEO of NTT Data Corporation. “Welcoming Dell Services to NTT DATA is expected to strengthen our leadership position in the IT Services market and initiates an important business relationship with Dell.”

NTT Data’s acquisition of the IT services division is its largest to date and continues to bolster its North American footprint. The company’s revenues in overseas markets have more than doubled since 2011, and in the same period it has spent more than $600 million on acquisitions. It has prioritized growth in the North America region, primarily targeting lucrative contracts in the healthcare, banking, financial services and insurance sectors.

Since 2011 NTT Data has been proactive in bolstering its overseas business with a number of acquisitions throughout the world. In Europe it acquired companies including Everis and Value Team, in North America Optimal Solutions Integration and Carlisle & Gallagher were added, whereas iPay88 increased the company’s footprint in Malaysia.

“There are few acquisition targets in our market that provide this type of unique opportunity to increase our competitiveness and the depth of our market offerings,” said John McCain, CEO of NTT Data. “Dell Services is a very well-run business and we believe its employee base, long-standing client relationships, and the mix of long term and project-based work will enhance our portfolio.”

Dell Services as a business unit had reportedly been valued in the region of $5 billion, so the lower sale price could highlight Dell’s urgency in completing the sale. If the reports are correct, NTT Data appears to have negotiated a good deal.

Oracle expands cloud offering into customer datacentres

Oracle has launched Cloud at Customer, a new service designed to extend Oracle’s cloud into a customer’s datacentre.

The service allows companies to place an Oracle cloud server within their own datacentre, creating a hybrid environment in which customers can choose whether to run workloads on the Oracle cloud or on premise. Oracle claims the new offering will remove a number of barriers to cloud adoption, as the customer retains control over where data is stored, addressing residency and security concerns around business-critical data.

“We are committed to helping our customers move to the cloud to help speed their innovation, fuel their business growth, and drive business transformation,” said Oracle’s President of Product Development Thomas Kurian. “Today’s news is unprecedented. We announced a number of new Cloud Services and we are now the first Public Cloud Vendor to offer organizations the ultimate in choice on where and how they want to run their Oracle cloud.”

The company claims it is the first in the industry to offer such a service and aims to address security and regulatory barriers for cloud adoption. Oracle stressed the service complies with many security and data regulations including FedRAMP for the US federal government, Germany’s Federal Data Protection Act and the United Kingdom’s Data Protection Act.

Data security and residency have been topics of healthy discussion in recent months, following the EU decision to strike down Safe Harbour and the introduction of its successor, Privacy Shield. Oracle could be capitalizing on the concerns of cloud buyers, as it claims the offering answers the business, legislative and regulatory obstacles enterprise organizations face when considering the transition to a cloud platform.

The new product launch forms part of Oracle’s general cloud offensive. “Oracle is now selling more new SaaS and PaaS annually recurring cloud revenue than any other company in the world including Salesforce.com,” said Executive Chairman Larry Ellison, during the quarterly earnings call.

“We are growing much faster than Salesforce.com, more than twice as fast. Because we sell into a lot more SaaS and PaaS market than they do. We compete directly with Salesforce.com in every segment of the SaaS customer experience market including sales, service and market.”

Google plays catch-up with Cloud Machine Learning

Google has entered the machine learning market with the alpha release of Cloud Machine Learning.

Built on top of the company’s open source machine learning system TensorFlow, the offering will allow customers to build custom algorithms that make predictions for their business, aiding decision making.

“At Google, researchers collaborate closely with product teams, applying the latest advances in machine learning to existing products and services – such as speech recognition in the Google app, search in Google Photos and the Smart Reply feature in Inbox by Gmail,” said Slaven Bilac, Software Engineer at Google Research. “At GCP NEXT 2016, we announced the alpha release of Cloud Machine Learning, a framework for building and training custom models to be used in intelligent applications.”

The system is already used in a number of Google’s current offerings, though it is later to market than its competitors: AWS launched its machine learning service in April last year, while IBM’s Watson has been making noise in the industry for years.

Although later to market, Google has highlighted that it will allow customers to export their TensorFlow models for use in other settings, including their own on-premise data centres. Competing offerings operate in a vendor lock-in situation, meaning customers must run the machine-learning models they have built in the cloud through an API. Industry insiders have told BCN that avoiding vendor lock-in would be a priority within their organizations, which could give Google an edge in the machine-learning market segment.
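Whatever the hosting model, “training a custom model” of the kind these services offer ultimately means iteratively minimising a loss function on data. A minimal framework-free sketch (plain Python gradient descent on a linear model, not TensorFlow or Cloud Machine Learning itself) shows the core loop:

```python
def train(xs, ys, lr=0.05, epochs=500):
    """Fit y ~ w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # gradients of the mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        # take a small step against the gradient
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# toy training data generated from y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * x + 1.0 for x in xs]
w, b = train(xs, ys)
print(w, b)
```

Frameworks such as TensorFlow generalise this loop: gradients are derived automatically and updates can be distributed across hardware, but the train-by-descending-the-loss structure is the same, which is what makes a trained model a portable artefact.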

Cloud Machine Learning’s launch builds on the growing trend towards advanced data analytics and the use of data to refine automated decision making capabilities. A recent survey from Cloud World Forum showed that 85% of respondents believe data analytics is the biggest game changer for marketing campaigns in the last five years, while 82% said that data would define the way in which they interact with customers.

The company is still behind Microsoft and AWS in the public cloud space, though recent moves show Google’s intent to close the gap. At GCP NEXT 2016, Google’s cloud chief Diane Greene told the audience that machine learning and security will form the backbone of her new sales strategy. “If your customer is embracing machine learning, it’d be prudent for you to embrace it too,” said Greene.

Google continues public cloud charge with 12 new data centres

Google has continued its expansion in the public cloud sector, announcing it will open 12 new data centres by the end of 2017.

In recent weeks, Google has been expanding its footprint in the cloud space with rumoured acquisitions, hires of industry big-hitters and blue-chip client wins; the new announcement adds weight to those moves. With two new data centres opening in Oregon and Tokyo by the end of 2016, and a further ten by the end of 2017, Google is positioning itself to challenge Microsoft and AWS for market share in the public cloud segment.

“We’re opening these new regions to help Cloud Platform customers deploy services and applications nearer to their own customers, for lower latency and greater responsiveness,” said Varun Sakalkar, Product Manager at Google. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance.”

Google currently operates in four cloud regions and the new data centres will give the company a presence in 15. AWS and Microsoft have built a market-share lead over Google thanks in part to the fact that they operate in 12 and 22 regions respectively, with Microsoft planning to open a further five.

Recent findings from Synergy Research Group show AWS is still the clear leader in the cloud space with a market share of 31%, with Microsoft accounting for 9% and Google 4%. Owing to its private and hybrid cloud offerings, IBM accounts for 7% of the global market, according to Synergy.

Growth at AWS was measured at 63%, whereas Microsoft and Google reported 124% and 108% respectively. Industry insiders have told BCN that Microsoft and Google have been making moves to improve their offerings through talent and company acquisitions. Greater proactivity in the market from the two challengers could explain the difference in growth figures over the last quarter.

Alongside the new data centres, Google’s cloud business leader Diane Greene has announced a change to the way the company operates its sales and marketing divisions. According to Bloomberg Business, Greene told employees that Google will go on a substantial recruitment drive, while also changing the way it sells its services, focusing more on customer interaction and feedback. This practice would not be unusual for its competitors; Google’s model, however, has so far been built on the idea of customer self-service. The cloud sales team on the west coast has already doubled in size to fifty, and the company plans to widen this recruitment drive.

While Google’s intentions have been made clear over recent months, there are still some who remain unconvinced. 451 Group Lead Analyst Carl Brooks believes the company is still not at the same level as its competitors, needing to add more enterprise compatibility, compliance, and security features. “They are probably the most advanced cloud operation on the planet. It also doesn’t matter,” he said.

Suppliers over-promise on cloud deliverables – survey

A recent survey from law firm Eversheds claims 27% of cloud deals have fallen through due to suppliers failing to meet client expectations during contract negotiations.

Despite 77% of respondents claiming they intend to increase cloud spend over the next 18 months, the research found that a number of deals have not come to fruition because suppliers over-promised on what could be delivered within the agreement.

Differing views on what should and can be delivered are only coming to light in the final stages of contract negotiation, when buyers discover that suppliers cannot deliver what had previously been promised.

On top of the 27% who have terminated talks, a further 10% said they were tempted to walk away from a deal because of such differences. Suppliers backed up these statistics: 57% of supplier-side respondents said they had lost deals at the contract stage.

“The number of deals breaking down at the last minute is unnecessarily high given that customers and suppliers have typically reached agreement, at least in principle, before deals get to contract negotiation,” said Charlotte Walker-Osborn, Technology & Outsourcing Partner at Eversheds.

“In cloud negotiations, issues which are both legal and commercial in nature tend to come out during contractual discussions because this is when both parties take an in-depth look at the agreed parameters around the deal. Only then, can it become apparent that differing views may be shared on certain key areas such as data privacy and related security issues,” said Walker-Osborn.

Data protection and residency have once again proved contentious: 33% of the customers surveyed said this was the reason they walked away from a deal. Visibility over the supplier’s supply chain was another factor, cited by 28% of customers as the reason for a breakdown.

“Cloud purchasers are anxious about where data is hosted for two reasons. The first is regulatory. Data protection and privacy regulations vary across jurisdictions, but most countries require companies to know where their data is hosted and being processed,” said Paula Barrett, Global Head of Privacy at Eversheds.

“Conscientious suppliers will ensure relevant regulatory requirements are covered by the contractual terms. However, some suppliers still fail to include fairly mandatory terms that the law requires their clients to have in place. The second reason is because government authorities in some jurisdictions have the right to access personal data, so it is natural that businesses are concerned about where their data will reside,” said Barrett.

The survey also hints that the mass market is still to be convinced of the reliability and robustness of cloud platforms, despite early adopters having demonstrated their value. 48% of cloud buyers said they would like service credits as a failsafe to compensate for losses in the event of a service outage, whereas only 20% of service providers were likely to include them in an agreement.

Despite the industry’s evident appetite for cloud services, a lack of clarity from suppliers at the outset has seemingly quashed a number of potential deals. Many of these breakdowns could be avoided if suppliers took a more proactive stance on data protection concerns and a more sympathetic view of customer caution when implementing new technologies.

Red Hat CEO pins 21% growth on hybrid cloud market

Red Hat demonstrated healthy growth in its quarterly earnings, with CEO James Whitehurst attributing the success to the growing hybrid cloud market.

The company reported Q4 revenues at $544 million and total revenues for the year at $2.05 billion, both an increase of 21% on the previous year (constant currency). It now claims to be the only open-source company to have breached the $2 billion milestone.

“Our results reflect the fact that enterprises are increasingly adopting hybrid cloud infrastructures and open source technologies, and they are turning to Red Hat as their strategic partner as they do so,” said Whitehurst. “First, the fourth quarter marked our 56th consecutive quarter of revenue growth which contributed to Red Hat’s first year of crossing the $2 billion in total revenue milestone.”

While public cloud has been dominating the headlines in recent weeks, the Red Hat team remain positive that the hybrid cloud market will ultimately deliver on expectations. “Public cloud has been a great resource for us to reach new customers, including small and medium-sized businesses,” said Whitehurst.

“During meetings Frank (Frank Calderoni, CFO) and I have hosted over the quarter, investors have asked whether the public cloud is a positive driver for Red Hat. We firmly believe that it will be a hybrid cloud world, where applications will run across four – all four footprints; physical, virtual, public cloud, and private cloud.

“Our revenue from private IaaS, PaaS and cloud management technologies is growing at nearly twice as fast as our public cloud revenue did when it was at the same size.”

Although it is unsurprising that Red Hat strongly backs the hybrid cloud model, security and data protection concerns in the industry add weight to its position. Despite progress in the delivery and management of public cloud platforms, recent research has shown that enterprise decision makers remain concerned not only about the level of security offered in public cloud, but also about where data will reside geographically. Both concerns are seemingly driving hybrid cloud adoption, which gives enterprises full control over how and where company-critical data is stored.

Over the last 12 months, Red Hat has also confirmed a number of partnerships with major players in the public cloud space to increase its footprint. Last year, it announced a partnership under which Microsoft became a Red Hat Certified Cloud and Service Provider, enabling customers to run their Red Hat Enterprise Linux applications and workloads on Microsoft Azure. The Certified Cloud and Service Provider platform also includes relationships with Google and Rackspace. Red Hat claims these relationships have generated more than $100 million in revenue, a 90% increase year-on-year.

“In Q4, we further expanded our technology offerings that can be consumed in the cloud. For instance, RHEL on-demand is activated on Azure in February,” said Whitehurst. “OpenShift, our PaaS solution, and our storage technology will be added to the Google cloud. And RHEL OpenStack platform is now available at RackSpace as a managed service.”

Despite increased competition in the market in recent years, Red Hat has proved effective at holding onto customers. All of the 25 largest contracts up for renewal in the last quarter were renewed, and the new deals were 25% larger in aggregate. The company also claims that 498 of its largest 500 deals over the last five years have been renewed.

“We never want to lose a deal, if we do, we never give up trying to win back the business,” said Calderoni. “This quarter, I am pleased to report that we closed a multi-million-dollar ‘win-back’ of one of those two former top deals.”

The company also estimates that revenues will grow to between $558 million and $566 million for Q1, and to between $2.38 billion and $2.42 billion for the financial year.

Apple enters consumer e-health market

Apple has announced the launch of CareKit, an open-source software framework which enables consumers and doctors to proactively keep track of their health by monitoring symptoms and medications in real-time.

The open-source framework follows the launch of ResearchKit last year and enables consumers to use data collected from various sources to understand their health. Apps built with it also enable consumers to record feedback on how well they are feeling, or how they are recovering from a procedure, which can be shared remotely with family members and their doctor.

“We’re thrilled with the profound impact ResearchKit has already had on the pace and scale of conducting medical research, and have realised that many of the same principles could help with individual care,” said Jeff Williams, Apple’s COO. “We believe that giving individuals the tools to understand what is happening with their health is incredibly powerful, and apps designed using CareKit make this a reality by empowering people to take a more active role in their care.”

From next month, the developer community will be able to build its own apps on the open-source framework; in the first instance, Apple has designed four modules. Care Card is a to-do list reminding consumers to take medication or perform certain exercises, which can be tracked through various Apple devices. The Symptoms and Measurement Tracker enables consumers to record their symptoms and progress. The Insight Dashboard compares those symptoms against the Care Card data to check that treatment is effective, and the Connect module shares all the information with the person’s doctor.

CareKit is one of the few data analytics use cases aimed squarely at the consumer market, and the open-source framework will offer opportunities for developers. While the framework is not yet available to the wider community, Apple has been working with a number of developers to demonstrate its use. One example, Glow Nature, is an app incorporating the CareKit modules to guide women through a healthier pregnancy.

The launch of CareKit follows healthy adoption of ResearchKit, a similar open-source framework designed for medical researchers. ResearchKit enables doctors, scientists and other researchers to gather data from participants anywhere in the world using iPhone apps. While ResearchKit enables researchers to more accurately gather data and further their research, CareKit provides these organizations an alternative means to communicate with the mass audience.

“With ResearchKit, we quickly realised the power of mobile apps for running inexpensive, high-quality clinical studies with unprecedented reach,” said Ray Dorsey, Professor of Neurology at the University of Rochester Medical Centre. “We hope that CareKit will help us close the gap between our research findings and how we care for our Parkinson’s patients day-to-day. It’s opening up a whole new opportunity for the democratisation of research and medicine.”

Micro Focus moves into DevOps space with $540 million Serena acquisition

UK enterprise software vendor Micro Focus has announced its intention to acquire US firm Serena Software, in a bid to improve its position in the DevOps space.

Subject to competition clearances from the US and Germany, the $540 million acquisition is set to close in May. It will enable Micro Focus to enhance its DevOps credentials and capitalize on one of the industry’s fastest growing trends.

“Today’s announcement marks another significant milestone for Micro Focus, bringing together two highly complementary solution sets that enable customers to build better applications, adapt to changing business conditions more rapidly and maximise the value of existing investments to drive further innovation within their business,” said Stephen Murdoch, CEO at Micro Focus.

“With Serena, we are further positioned to deliver richer solutions for the complex business demands customers are solving today – with greater reliability, predictability and less risk of failure.”

Micro Focus claims Serena’s capabilities around “true DevOps” will improve its position in a growing market and enable it to deliver new business services through automated release and deployment solutions.

“This is an exciting announcement that promises to offer substantial value to Serena customers and partners,” said Greg Hughes, CEO of Serena. “Our complementary strengths in software development and IT Operations will only serve to provide a stronger foundation for the next-generation of applications and services they require to meet ongoing business demands.”

Micro Focus, which says it specializes in enterprise application modernization, has been bolstering its capabilities in recent years, with a focus on diversifying its customer base and portfolio. In 2014, the company announced it was merging with Attachmate Group, owner of Novell and SUSE Linux, for approximately US$1.2 billion.

At the time, Kevin Loosemore, Executive Chairman of Micro Focus, said: “This is a transformational event that enables Micro Focus International to further meet the needs and demands of our customers and global partner network with greater scale, a broader portfolio and the global reach their businesses require.”