IBM launches 26 new cloud services for data scientists

IBM is launching 26 new services on its IBM Cloud, which it describes as a ‘sweeping portfolio for data scientists and app developers’. The new offering includes 150 publicly available datasets.

The new initiative aims to help developers build and manage applications and help data scientists to read events in the cloud more intuitively. The hybrid cloud service scans multiple cloud providers and uses open systems which, IBM says, will create a ready flow of data across different services.

The new cloud offerings will provide self-service data preparation, migration and integration, IBM claims, with users given tools for advanced data exploration and modelling. The four main pillars of the new offering are Compose Enterprise, Graph, Predictive Analytics and Analytics Exchange.

IBM Compose Enterprise is a managed platform that aims to help developers build web-scale apps faster by giving them access to resources such as open source databases and their own dedicated cloud servers. Graph is a managed graph database service built on Apache TinkerPop, with a stack of business-ready apps suited to real-time recommendations, fraud detection, IoT and network analysis. Predictive Analytics promises to let developers build machine learning models themselves from a library of predictive apps generally used by data scientists. Analytics Exchange contains the catalogue of 150 publicly available datasets.

Apache TinkerPop and its Gremlin graph traversal language will be the primary interface to IBM’s Graph service. IBM previously pushed for TinkerPop to join the Apache Software Foundation. In September BCN reported that IBM was to open a San Francisco facility with resources dedicated to the Apache Spark processing technology, as the vendor seeks to get Spark users interested in IBM’s Watson developer cloud.
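
For readers new to Gremlin, the sketch below shows what a traversal looks like through TinkerPop’s gremlinpython client. This is a minimal illustration, not IBM Graph specifics: the websocket endpoint and the toy graph are hypothetical placeholders.

```python
# A minimal Gremlin traversal sketch using TinkerPop's gremlinpython client.
# The endpoint below is a hypothetical placeholder, not an IBM Graph URL.
from gremlin_python.process.anonymous_traversal import traversal
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection

conn = DriverRemoteConnection('ws://localhost:8182/gremlin', 'g')
g = traversal().withRemote(conn)

# Build a two-person graph: alice -knows-> bob.
alice = g.addV('person').property('name', 'alice').next()
bob = g.addV('person').property('name', 'bob').next()
g.addE('knows').from_(alice).to(bob).iterate()

# Traverse it: who does alice know?
print(g.V().has('person', 'name', 'alice').out('knows').values('name').toList())
# -> ['bob']

conn.close()
```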

Data handlers are currently handicapped by having to use disparate systems for their data needs, IBM claims. “Our goal is to move data into a one-stop shop,” said Derek Schoettle, General Manager, Analytics Platform and Cloud Data Services.

$19 billion Western Digital acquisition of SanDisk gets EC approval

The European Commission has announced its approval of the proposed takeover of storage vendor SanDisk by Western Digital. The merger of the two US-based storage rivals will not adversely affect competition in Europe, the EC has ruled.

In October 2015 BCN reported that Western Digital had announced plans to buy chip maker SanDisk for around $19 billion. Flash specialist SanDisk is ranked by IDC as the largest manufacturer of NAND flash memory chips. The capacity of NAND flash memory products to store data in a small footprint, while using less power and granting faster access to data, has made NAND the storage technology of choice in data centres that support cloud computing.

The market for NAND flash chips was worth $28.9 billion in 2014, according to IDC.

The Commission found that the only overlap between the activities of the two manufacturers is in selling flash memory storage systems and solid-state drives to the enterprise market. In this case, the effects of the merger on competition will be minimal, it has ruled, despite their relatively high combined market share. The presence of Intel, Toshiba, Micron and Samsung in the same market will exert sufficient competitive pressure to prevent the creation of a Western Digital hegemony, the Commission found.

The Commission also investigated the vertical link between SanDisk’s production of flash memory and the downstream markets for enterprise flash memory storage systems. With flash memory an essential component of solid-state drives and other flash memory storage systems, the EC’s investigators examined whether Western Digital would be in a position to block competitors’ access to flash memory.

It also studied the likelihood that competing producers of flash memory might find themselves with an unsustainable customer base. However, SanDisk’s presence on the upstream flash memory market was judged as ‘limited’ and the presence of several active competitors makes this a manageable risk.

“This multi-billion dollar deal can go ahead without delay,” said competition policy commissioner Margrethe Vestager.

Microsoft creates Azure hub for Internet of Things

Microsoft has made its new Azure IoT Hub generally available. In a statement, it claims the new system will act as a simple bridge between its customers’ devices and their systems in the cloud. It claims that the new preconfigured IoT offering, when used with the Azure IoT Suite, can be used to create a machine-to-machine network and a storage system for its data in minutes.

The new Azure IoT Hub promises ‘secure, reliable two-way communication from device to cloud and cloud to device’. It uses open protocols widely adopted in machine-to-machine technology, such as MQTT, HTTPS and AMQPS. Microsoft claims the IoT Hub will integrate easily with other Azure services such as Azure Machine Learning and Azure Stream Analytics. The Machine Learning service uses algorithms in an attempt to spot patterns (such as unusual activity, hacking attempts or commercial trends) that might be useful to data scientists. Azure Stream Analytics lets data scientists and decision makers act on those insights in real time, through a system with the capacity to simultaneously monitor millions of devices and take automatic action.
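
As a rough illustration of the device-to-cloud half of that channel, here is a minimal Python sketch using Microsoft’s azure-iot-device SDK. The connection string and device names are placeholders, and the SDK shown is a present-day client rather than the exact library of the launch period:

```python
# Minimal device-to-cloud telemetry sketch using the azure-iot-device SDK.
# The connection string is a placeholder; a real one is issued per device
# by the IoT Hub. Install the client with: pip install azure-iot-device
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

# Send one telemetry reading; the hub can route it on to services
# such as Azure Stream Analytics for real-time processing.
reading = {"deviceId": "sensor-01", "temperature": 21.7}
client.send_message(Message(json.dumps(reading)))

client.shutdown()
```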

Microsoft launched the Azure IoT Suite in September 2015 with a pledge to guarantee standards through its Certified for IoT programme, promising to verify partners that work with operating systems such as Linux, mbed, RTOS and Windows. Microsoft claims its initial backers were Arduino, Beagleboard, Freescale, Intel, Raspberry Pi, Samsung and Texas Instruments. In the three months since the IoT Suite’s launch it has added ‘nearly 30’ more partners, it claims, notably Advantech, Dell, HPE, and Libelium.

“IoT is poised for dramatic growth in 2016 and we can’t wait to see what our customers and partners will continue to build on our offerings. We’re just getting started,” wrote blog author Sam George, Microsoft’s partner director for Azure IoT.

IBM: “The level of innovation is being accelerated”

Dr. Angel Diaz joined the research division of IBM in the late nineties, where he helped co-author many of the web standards we enjoy today. Nowadays, he’s responsible for all of IBM’s cloud and mobile technology, as well as architecture for its ambient cloud. Here, ahead of his appearance at Container World (February 16 – 18, Santa Clara Convention Center, CA) later this month, BCN caught up with him to find out more about the tech giant’s evolving cloud strategy.

BCN: How would you compare your early days at IBM, working with the likes of Tim Berners-Lee, with the present?

Dr. Angel Diaz: Back then, the industry was focused on developing web standards for a very academic purpose, in particular the sharing of technical information. IBM had a strategy around accelerating adoption and increasing skill. This resulted in a democratization of technology, by getting developers to work together in open source and standards. If you fast forward to where we are now with cloud, mobile, data, analytics and cognitive, you see a clear evolution of open source.

The aperture of open source development and ecosystems has grown to include users and is now grounded on solid open governance and meritocracy models. What we have built is an open cloud architecture, starting with an open IaaS based on OpenStack, an open PaaS with Cloud Foundry and an open container model with the Open Container Initiative and Cloud Native Computing Foundation. When you combine an open cloud architecture with open APIs defined by the Open API Initiative, applications break free. I have always said that no application is an island – these technologies make it so.

What’s the ongoing strategy at IBM, and where do containers come into it?

It’s very much hybrid cloud. We’ve been leveraging containers to help deliver hybrid applications and accelerate development through DevOps, so that people can transform and improve their business processes. This is very similar to what we did in the early days of the web – better business processes mean better business. At the end of the day, the individual benefits. Applications can be tailored to the way we like to work, and the way that we like to behave.

A lot of people in the container space say: wow, containers have been around a long time, why are we all so interested in this now? Well, it’s gotten easier to use, open communities have rallied around it, and it provides a very nice way of marrying concepts of operations and service-oriented architecture, which the industry missed in the 2000s.

What does all this innovation ultimately mean for the ‘real world’?

It’s not an exact analogy, but if we remember the impact of HTML, JavaScript – they allowed almost anyone to become a webmaster. That led to the Internet explosion. If you look at where we are now, what we’re doing with cloud: that stack of books you need to go buy has been reduced, the concept count of things you need to know to develop an application, the level of sophistication of what you need to know in order to build an application, scale an application, secure an application, is being reduced.

So what does that do? It increases participation in the business process, in what you end up delivering. Whether it’s human facing or whether it’s an internal business process, it reduces that friction and it allows you to move faster. What’s starting to happen is the level of innovation is being accelerated.

And how do containers fit into this process? 

Previously there was this strict line: you develop software and then operate it and make tweaks, but you never really fundamentally changed the architecture of the application. Because of the ability to quickly stand up containers, to quickly iterate, etc., people are changing their architectures because of operations and getting better operations because of it. That’s where the microservices notion comes in.

And you’ll be talking at Container World. What message are you bringing to the event?

My goal is to help people take a step back and understand the moment we’re in, because sometimes we all forget that. Whether you’re struggling with security in a Linux kernel or trying to define a microservice, you can forget what it is you’re trying to accomplish.

We are in a very special moment of digital disruption, and the container technology we’re building here allows much quicker iteration on the business process. That’s one dimension. The second is what IBM is doing, not just in our own implementation of containers but in the open source world, to help democratize the technology, so that the level of skill required drops and the number of people who build on this grows.

Terminal Server Setup Guide: Fast-Start Environment

With Terminal Services, organizations can provide employees with access to Windows applications from virtually any device, no matter the geographic location. Terminal Services (renamed Remote Desktop Services, or RDS, beginning with Windows Server 2008 R2) is a server role in Windows Server that enables the server to host multiple, simultaneous client sessions to Windows desktops and applications. This provides organizations […]

The post Terminal Server Setup Guide: Fast-Start Environment appeared first on Parallels Blog.

Why You Need Parallels Desktop Business Edition—in One Sentence

If you’re still contemplating whether or not Parallels Desktop for Mac Business Edition is right for you, let us help you decide. You already know the benefits of virtualizing Windows on Mac, but what if you need to do so for a large group? How on earth do you manage it all? That’s where the […]

The post Why You Need Parallels Desktop Business Edition—in One Sentence appeared first on Parallels Blog.

Latest cloud infrastructure research shows yet more AWS dominance

This is not the most surprising news you will hear today: a note published by Synergy Research argues Amazon Web Services’ (AWS) domination of the cloud infrastructure market continues. The Seattle giant held 31% of the worldwide market in 2015, the researchers argue, with Microsoft (9%), IBM (7%) and Google (4%) trailing.

Despite the yawning gap between the market leader and the rest, all four companies continue to grow faster than the overall market; Synergy chief analyst John Dinsdale argues the big four are “continuing to run away with the market.” The second-tier vendors, which include Salesforce, Rackspace, Oracle, Alibaba and Fujitsu among others, are characterised by the research firm as niche players, general IT service providers, or companies lacking the scale or focus to really challenge the top vendors.

Picture credit: Synergy Research Group

As was reported in July with the last figures, AWS, Microsoft, IBM and Google continue to control more than half of the overall cloud infrastructure market. Microsoft and Google have the highest growth rates among the market leaders, at 124% and 108% year on year respectively, but are making a minimal dent in the lead of AWS, which itself grew 63% over the year.

Synergy argues quarterly cloud infrastructure service revenues, including IaaS, PaaS, private and hybrid cloud, are approaching $7 billion, with trailing 12 month revenues at more than $23 billion.

Opinions on the IaaS market, which has shown a relatively unchanging growth pattern for the past couple of years, range from those who think it’s a three horse race to those who think the race has long been run. Kelly Stirman, VP strategy at MongoDB, told this publication in July there will only be Amazon, Google and Microsoft – as everyone else will eventually run out of money.

Bootlegging, Piracy and Document Management Have Things in Common By @IanKhanLive | @CloudExpo #Cloud

No seriously, I mean it. There are real correlations between smuggling alcohol and piracy in any form, whether it’s real-life bad guys on ghost ships at sea or digital pirates who download stuff off the internet thinking it’s all free while some artist starves, or, for that matter, a certain type of enterprise user: the ones who love free applications and not only start using them but spread them to the entire department. You probably have a Bob or Sally or Travis in your department who sends you an innocent link to Dropbox or Google Drive with the Excel file that was too big to send through email. How about that file from accounting that is conveniently shared on Google Drive and can be accessed by everyone, including the CFO from his home? Convenient, right? Well yes, convenient, but also an early step towards being sued by your customers or probably tens of other people as well. I am really not trying to scare you off, and I know this drum has been beaten many times, but only 19% of organizations have adopted tools that do not put their business at risk of doing something almost illegal.

[slides] Managing Remote Operations Teams By @WebairSagi | @CloudExpo #Cloud

Advances in technology and ubiquitous connectivity have made the utilization of a dispersed workforce more common. Whether that remote team is located across the street or across the country, management styles and approaches will have to be adjusted to accommodate this new dynamic.
In his session at 17th Cloud Expo, Sagi Brody, Chief Technology Officer at Webair Internet Development Inc., focused on the challenges of managing remote teams, providing real-world examples that demonstrate what works and what doesn’t. He covered proper training and integration of these teams into the corporate structure, and the most effective ways to introduce them to customers. He also discussed proper vetting of third-party teams should these functions be outsourced.

Why Application Modularization Matters: Testing By @MassHaste | @CloudExpo #Cloud

A few weeks ago my colleague PJ Hagerty wrote about driving your existing monolithic application toward a more modular design. This time around I’ll dive a little deeper into the importance and benefits of application modularization.
One of the most important best practices in application development is testing, and more specifically automated and continuous testing. When your application is a monolith with multiple functionalities and responsibilities, automated testing becomes massively unwieldy. To illustrate the issue, consider a conceptual (and admittedly trivial) photo gallery web application, sketched below.
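
By way of a hedged sketch (the module and function names below are invented for illustration, not taken from the original post), pulling one responsibility, say thumbnail sizing, out of the monolith yields a small module that can be unit tested in isolation, with no web framework or database involved:

```python
# gallery/thumbnails.py -- a hypothetical module extracted from a
# photo gallery monolith. It has one responsibility and no dependencies.
def thumbnail_size(width, height, max_edge=200):
    """Scale (width, height) so the longest edge equals max_edge,
    preserving the aspect ratio."""
    if width <= 0 or height <= 0:
        raise ValueError("dimensions must be positive")
    scale = max_edge / max(width, height)
    return round(width * scale), round(height * scale)


# test_thumbnails.py -- the extracted module is trivially unit-testable,
# with no web server, templates or database to stand up first.
def test_landscape_image_is_scaled_to_max_edge():
    assert thumbnail_size(400, 200) == (200, 100)

def test_square_image():
    assert thumbnail_size(300, 300) == (200, 200)
```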
