All posts by Business Cloud News

IBM unveils plans for Watson supercomputer to lead the cognitive era

IBM CEO Ginni Rometty used the Consumer Electronics Show in Las Vegas to showcase a range of new partnership projects that will help supercomputer Watson usher in the ‘Cognitive Era’.

Among the new advances promised are health and fitness programmes, robotic apps for banking, retail and hospitality, intelligent home appliances and computers that understand and adapt to human behaviour.

Under Armour and IBM have jointly developed a new cognitive coach that gives athletes evidence-based advice on health and fitness-related issues, Rometty revealed. The system combines IBM Watson’s technology with data from the 160 million members of Under Armour’s Connected Fitness community.

Meanwhile Watson is being used by Medtronic to bring its analytics powers to diabetes management. A joint effort by the two companies means that hypoglycemic events can be predicted up to three hours in advance, helping to avert potentially deadly health episodes.

Watson has also been infused, via the cloud, into SoftBank Robotics’ ‘empathetic’ robot Pepper, boosting its thought processing powers so it can understand and answer questions. This, argued Rometty, could be applied to businesses such as banking, retail and hospitality.

Rometty said IBM and SoftBank Robotics will tap into data and knowledge across the IoT so Watson-powered Pepper robots can make sense of the hidden meaning in social media, video, images and text. This, Rometty said, ushers in a new era of computing in which systems understand the world the way humans do: through senses, learning and experience.

Appliance maker Whirlpool is being hooked into the Watson IoT cloud to create new cognitive products and services, such as an oven that can learn about its user’s eating habits, health issues and food preferences. IBM demonstrated Whirlpool’s Jenn-Air oven equipped with the Chef Watson cooking app.

The developments mark a new cognitive era of computing, in which IT works around humans, a reversal of standard practice, according to IBM. “As the first system of the cognitive era, Watson infuses a kind of thinking ability into digital applications, products and systems,” said John Kelly, senior VP of IBM Research and Solutions Portfolio.

A Watson software development kit (SDK) is available to give developers the chance to tailor the interaction experience. IBM will give clients access to Watson APIs and various pre-packaged applications designed to address a variety of personal and professional needs.
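By way of illustration, a developer integration with such a cloud service might look something like the following Python sketch; the endpoint URL, authentication scheme and response fields here are hypothetical placeholders, not IBM’s documented Watson API.

```python
# Hypothetical sketch of calling a Watson-style question-answering API.
# The URL, auth header and JSON fields are illustrative placeholders,
# not IBM's actual API surface.
import requests

WATSON_ENDPOINT = "https://api.example.com/watson/v1/ask"  # placeholder URL

def ask_watson(question: str, api_key: str) -> str:
    """POST a natural-language question and return the top answer."""
    response = requests.post(
        WATSON_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"question": question},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["answer"]

print(ask_watson("What can I cook with chicken and fennel?", "demo-key"))
```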

BT Cloud Connect to give customers a direct link to Salesforce

Telco BT is to give its corporate customers the option of a hotline to Salesforce’s cloud service through its BT Cloud Connect service.

The telco claimed it can provide a high performance link to Salesforce’s Customer Success Platform, giving customers faster and more reliable performance from the system, as part of its Cloud of Clouds programme. BT’s global network connects 200 data centres, 48 of which it owns and operates itself.

The service will be rolled out incrementally from February 2016, starting in the US, followed by Europe and then the rest of the world.

Clients desperately want the cloud to help them manage and access vast amounts of valuable data, but it needs to be made easier for them, according to Keith Langridge, VP of network services at BT Global Services. “Our Cloud of Clouds makes it easier by providing direct, secure and high performance connectivity to the applications. Salesforce is a true pioneer of the technology so this is an important milestone in the delivery of our vision,” said Langridge.

The methods that companies use to connect with the cloud need to be refined, according to Salesforce’s UK MD Andrew Lawson. “BT is accelerating this shift for its customers,” said Lawson. The addition of Salesforce to its Cloud of Clouds roster will transform the way BT’s clients can connect with customers, partners and employees.

OVH claims integration of TimeSeries PaaS with Sigfox IoT cloud

France-based ISP turned cloud operator OVH has announced that its TimeSeries platform service is now integrated with the IoT cloud service provided by IoT specialist Sigfox.

The integration of the two services was announced at the CES 2016 show in Las Vegas, where the two service providers demonstrated how the OVH TimeSeries platform as a service (PaaS) can analyse and act on data fed in from 7,000 sensors connected to the Sigfox IoT network.

OVH claimed that machine-learning algorithms within the TimeSeries service can identify patterns and automatically trigger actions in response to the perceived situation. Having started as an ISP in Roubaix, France, OVH has evolved into a cloud service provider operating in France, Germany, Italy, Poland, Spain, Ireland, the United Kingdom, the Netherlands, Lithuania, the Czech Republic and Finland.
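For a rough sense of what “identify patterns and trigger actions” can mean in practice, here is a toy Python sketch; the rolling-average rule below is a stand-in for whatever machine-learning models OVH actually runs, and all names are invented.

```python
# Toy sketch of a pattern-detect-then-act loop over a sensor stream.
# A simple rolling-average threshold stands in for a real ML model.
from collections import deque

WINDOW, THRESHOLD = 10, 2.0

def make_detector(trigger):
    history = deque(maxlen=WINDOW)
    def feed(value: float) -> None:
        if len(history) == WINDOW:
            baseline = sum(history) / WINDOW
            if abs(value - baseline) > THRESHOLD:
                trigger(value, baseline)   # automatically triggered action
        history.append(value)
    return feed

feed = make_detector(lambda v, b: print(f"alert: {v} deviates from baseline {b:.2f}"))
for reading in [1.0] * 12 + [5.5]:   # steady readings, then an anomaly
    feed(reading)
```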

In November, BCN reported that OVH had launched a public cloud service in the UK with customisable security, as protection against cyber attacks becomes a major selling point alongside open systems mobility. The company recently expanded its offering into the US and Canada. It currently has 220,000 servers in 17 data centres but says it will have opened 12 new data centres by 2018.

The new integration means that companies can now use OVH’s PaaS TimeSeries application programming interfaces to retrieve data. This frees them from having to build and manage their own databases, OVH claims.
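A retrieval call against such a hosted time-series API might look roughly like the following Python sketch; the host, path, token and JSON shape are assumptions for illustration, not OVH’s documented interface.

```python
# Illustrative sketch only: the host, token and JSON shape are assumptions,
# not OVH's documented TimeSeries API.
import requests

BASE = "https://timeseries.example.ovh"  # placeholder host

def fetch_sensor_readings(sensor_id: str, start: str, end: str, token: str):
    """Retrieve raw datapoints for one sensor over a time window."""
    resp = requests.get(
        f"{BASE}/series/{sensor_id}/points",
        params={"from": start, "to": end},
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["points"]  # e.g. [{"ts": ..., "value": ...}, ...]
```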

The integration and demo at CES will help companies to understand the entire value chain of the Internet of Things, according to OVH CEO Laurent Allard. “A turn-key system for storing and managing data and hosting business applications makes it much simpler, quicker and cheaper to get running with the IoT,” said Allard.

In other news, Sigfox has also announced a pilot programme with the French postal service company La Poste. The two companies are collaborating to invent a new kind of online postal service.

The Domino programme aims to automate the ordering of parcel pickup and delivery via Sigfox’s IoT network. A regional rollout will start in the first half of 2016.

Cloudability buys DataHero for more accurate cost analysis

Accounting start-up Cloudability has acquired data visualisation service provider DataHero, another cloud start-up that formed at about the same time.

Oregon-based Cloudability’s growth came from helping companies track their spending on public cloud infrastructure. It announced the addition of San Francisco-based DataHero on the company blog and hailed the extension of its presence 500 miles away in California.

However, while 12 of DataHero’s staff are to join Cloudability, its founder and CEO Chris Neumann will not, nor will the company’s CFO, engineering VP or VP of marketing. DataHero will continue to operate as normal for the foreseeable future until it can be integrated into the Cloudability portfolio.

Cloudability CEO Mat Ellis said the process will involve building a connector to make it easier for clients to use DataHero to ingest different information streams, such as invoices from Zuora and conversions from Google Analytics, into Cloudability. The upshot, he said, is to help clients see what’s happening in their business and get a sense of the business costs that matter, such as the IT cost per new customer or the unit contribution margin after the cost of goods sold.

The technology matters to cloud users because it helps companies save money on research and development by bringing the best of both products together in one service. “We both have an awesome dashboard which cost a lot of money to get right,” said Ellis. “Now there’s no need to do that twice.”
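To make the unit-economics idea concrete, here is a minimal Python sketch of the kind of roll-up Ellis describes, joining a cloud-spend figure with business metrics; the data structure and numbers are invented for illustration.

```python
# Minimal sketch of unit-economics metrics from joined data feeds.
# Fields and figures are invented; real inputs would come from spend
# reports, CRM exports and analytics feeds.
from dataclasses import dataclass

@dataclass
class MonthlyFigures:
    cloud_spend_usd: float   # e.g. from a Cloudability-style spend report
    new_customers: int       # e.g. from a CRM or analytics export
    revenue_usd: float
    cogs_usd: float          # cost of goods sold, including cloud spend

def it_cost_per_new_customer(m: MonthlyFigures) -> float:
    return m.cloud_spend_usd / m.new_customers

def unit_contribution_margin(m: MonthlyFigures) -> float:
    """Contribution per new customer after cost of goods sold."""
    return (m.revenue_usd - m.cogs_usd) / m.new_customers

march = MonthlyFigures(cloud_spend_usd=42_000, new_customers=1_200,
                       revenue_usd=300_000, cogs_usd=110_000)
print(f"IT cost per new customer: ${it_cost_per_new_customer(march):.2f}")
print(f"Unit contribution margin: ${unit_contribution_margin(march):.2f}")
```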

As the cloud makes it harder for managers to get a clear picture of their asset performance, the data visualisation market has entered a period of consolidation. Salesforce bought EdgeSpring, Zendesk bought BIME, Microsoft bought Datazen and Cloudability previously acquired DataPad. Cloudability has also acquired start-ups in other areas, such as CloudVertical, RipFog and Attribo.

DataHero had previously raised $10 million in venture funding, the latest award of $6.1 million being announced in May 2015.

“As companies spend more to run applications on public clouds, managing that cloud spending becomes increasingly urgent, difficult and risk prone,” wrote Ellis on his own blog. “Mastering the cloud at scale requires us to think about spending in a completely new way.”

Instead of asking macro economic questions about how much IT is costing every year, the new challenge is to provide micro-economic detail about the cost of every activity, he argued. “We should ask the cost of almost anything: each web page served, widget sold, ride taken across town or flight to the other side of the planet,” he said.

The cost of the DataHero acquisition was not released.

Apple to build new cloud infrastructure as Verizon sells off data centres – reports

Two US tech giants are heading in opposite directions regarding data centres, according to a couple of recent reports.

Local US news sources report that Apple has filed a permit application with Washoe County, Nevada, to build a new cluster of data centre facilities near its original Reno site. The planning application for Apple’s new ‘Project Huckleberry’ involves the construction of the full shell of a new data centre, several data centre clusters and a support building. The new Huckleberry project will have essentially the same design as an earlier installation in Reno, dubbed Project Mills, according to Trevor Lloyd, senior planner for Washoe County Planning and Development’s Community Services.

Apple was first attracted to invest in the area in 2012 when it received an $89 million tax abatement incentive to locate in Reno Technology Park. Apple recently applied for permission to build a new substation to support further development as the existing site is reaching its capacity, according to Lloyd.

Permission for the site, based on past trends, should be granted by the end of January, according to Lloyd. Tax incentives for cloud infrastructure projects can make economic sense for regional development authorities given their long term impact, according to Mike Kazmierski, president of western Nevada’s Economic Development Authority. “When you put tens of hundreds of millions of dollars on a huge data centre project, you’re in it for the long haul,” said Kazmierski.

Cloud service provider Rackspace is also planning to build a data centre at Reno Technology Park.

The demands that data centres make on the local community are minor compared to the benefits that a cloud computing infrastructure brings to the community through economic investment, and owners of data centres should use this in negotiations, according to Kazmierski.

Meanwhile, a large stock of cloud infrastructure could come on the market as telco Verizon Communications has reportedly begun a process to sell its global estate of 48 data centres. According to insiders quoted by Reuters, Verizon is aiming to raise over $2.5 billion and streamline its business. The colocation portfolio currently generates $275 million in EBITDA.

Telcos such as AT&T, CenturyLink and Windstream have also divested themselves of their data centre businesses in recent years.

Software defined storage and security drive cloud growth, say studies

Data centre builders and cloud service developers are at loggerheads over their priorities, according to two new reports.

The explosive growth of modern data centres is being catalysed by new hyperconverged infrastructures and software defined storage, says one study. Meanwhile another claims that enthusiasm for cloud projects to run over this infrastructure is being suffocated by security fears.

A global study by ActualTech Media for Atlantis Computing suggests that a large majority of data centres are now using or considering hyperconverged infrastructure systems (HCIS) and software defined storage (SDS) in the race to build out their computing estates. Of the 1,267 leaders quizzed in 53 countries, 71 per cent said they are using or considering HCIS and SDS to beef up their infrastructure. However, another study, conducted on behalf of hosting company Rackspace, found that security was the overriding concern among the parties who will use these facilities.

The Hyperconverged Infrastructure and Software-Defined Storage 2016 Survey shows there is much confusion and hype in these markets, according to Scott D. Lowe, a partner at ActualTech Media, who said there is not enough data about real-world usage available.

While 75 per cent of the data centres surveyed use disk-based storage, only 44 per cent see a long term role for it in their infrastructure, and 19 per cent plan to ditch it for HCIS or SDS. These decisions are motivated by the need for speed, convenience and cost savings, according to the survey, which named performance (72 per cent), high availability (68 per cent) and cost (68 per cent) as the top requirements.

However, the developers of software seem to have a different set of priorities, according to the Anatomy of a Cloud Migration study conducted for Rackspace by market researcher Vanson Bourne. The verdict from this survey group – 500 business decision makers rather than technology builders – was that security will be the most important catalyst and can either speed up or slow down cloud adoption.

Company security was the key consideration in the top three motives named by the survey group. The biggest identified threat the group wanted to eliminate was escalating IT costs, named by 61 per cent. The next biggest threat they want to avert is downtime, with 50 per cent identifying a need for better resilience and disaster recovery from the cloud. More than a third (38 per cent) identified IT itself as a source of threats (such as viruses and denial of service) that they would want a cloud project to address.

“Cloud has long been associated with a loss of control over information,” said Rackspace’s Chief Security Officer Brian Kelly, “but businesses are now realising this is a misconception.”

AWS launches WorkMail with eye on Exchange defectors

Amazon Web Services (AWS) has put its WorkMail email and calendaring service on general release. Priced at $4 per user per month, it includes an Exchange migration tool to encourage defections by Microsoft customers. However, those with data sovereignty concerns should be aware that the service is mostly hosted in the US, with a solitary non-US data centre in Ireland.

After a year in preview, the service was announced on the blog of AWS chief evangelist Jeff Barr. Designed to work with existing desktop and mobile clients, the service has been strengthened since it emerged in preview form, with greater security, ease of use and easier migration, Barr said. The system has an emphasis on mobility, with policies and actions for controlling mobile devices, along with regular security features such as encryption of stored data, message scanning for spam and virus protection.

The migration tool will make it easier for users to move away from Microsoft Exchange, according to Barr, which suggests that dissatisfied Exchange users could be the primary target market.

The $4 per user per month service comes with an allocation of 50GB of storage and will be run from AWS’ data centres in Northern Virginia and Oregon, with a single data centre in Ireland to serve European customers. “You can choose the region where you want to store your mailboxes and be confident that the stored data will not leave the region,” wrote Barr.

Other features include the Key Management Service (KMS) for creating and managing the keys used to encrypt data at rest, and self-certifications, so that WorkMail users can show they have achieved various ISO certifications.
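For a flavour of the KMS pattern, the following boto3 sketch creates a key and round-trips a small payload; it illustrates KMS usage generally, under the assumption of valid AWS credentials, and is not WorkMail’s internal code.

```python
# Minimal sketch of encrypting data at rest with AWS KMS via boto3.
# Requires AWS credentials with KMS permissions to actually run.
import boto3

kms = boto3.client("kms", region_name="us-east-1")

# Create a key (in practice you would reuse an existing one).
key_id = kms.create_key(Description="demo mailbox key")["KeyMetadata"]["KeyId"]

ciphertext = kms.encrypt(KeyId=key_id, Plaintext=b"confidential message body")
plaintext = kms.decrypt(CiphertextBlob=ciphertext["CiphertextBlob"])

assert plaintext["Plaintext"] == b"confidential message body"
```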

WorkMail will support clients running on OS X, including Apple Mail and Outlook. It will also support clients using the Microsoft Exchange ActiveSync protocol including iPhone, iPad, Kindle Fire, Fire Phone, Android, Windows Phone, and BlackBerry 10. AWS is also working on interoperability support to give users a single Global Address Book and to access calendar information across both environments. A 30-day free trial is available for up to 25 users.

Toyota to build massive data centre and recruit partners to support smart car fleet

Car maker Toyota is to build a massive new IT infrastructure and data centre to support all the intelligence to be broadcast by its future range of smart cars. It is also looking for third party partners to develop supporting services for its new fleet of connected vehicles.

The smart car maker unveiled its plans for a connected vehicle framework at the 2016 Consumer Electronics Show (CES) in Las Vegas.

A new data centre will be constructed and dedicated to collecting information from new Data Communication Modules (DCM), which are to be installed on the frameworks of all new vehicles. The Toyota Big Data Center (TBDC) – to be stationed in Toyota’s Smart Center – will analyse everything sent by the DCMs and ‘deploy services’ in response. As part of the connected car regime, Toyota cars could automatically summon the emergency services in response to accidents, with calls being triggered by the deployment of an airbag. The airbag-induced emergency notification system will come as a standard feature, according to Toyota.
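A hypothetical sketch of the airbag-triggered flow might look like the following Python; the event fields and dispatch function are invented, since Toyota has not published its DCM message format.

```python
# Hypothetical sketch of the airbag-triggered notification flow.
# Event fields and the dispatch function are invented for illustration.
from dataclasses import dataclass

@dataclass
class DcmEvent:
    vehicle_id: str
    event_type: str          # e.g. "airbag_deployed", "telemetry"
    latitude: float
    longitude: float

def handle_dcm_event(event: DcmEvent) -> None:
    """Route an incoming Data Communication Module event."""
    if event.event_type == "airbag_deployed":
        # Per Toyota, airbag deployment triggers an automatic
        # emergency-services notification as a standard feature.
        dispatch_emergency_services(event.vehicle_id,
                                    event.latitude, event.longitude)

def dispatch_emergency_services(vehicle_id: str, lat: float, lon: float) -> None:
    print(f"Emergency call for {vehicle_id} at ({lat}, {lon})")

handle_dcm_event(DcmEvent("VIN123", "airbag_deployed", 36.17, -115.14))
```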

The new data comms modules will appear as a feature on 2017 Toyota models in the US market only, but the company will roll the service out to other markets later, as part of a plan to build a global DCM architecture by 2019. A global rollout is impossible until devices are standardised across the globe, it said.

Toyota said it is to invite third party developers to create services that will use the comms modules. It has already partnered with UIEvolution, which is building apps to provide vehicle data to Toyota-authorised third-party service providers.

Elsewhere at CES, Nvidia unveiled artificial intelligence technology that will let cars sense their environment and decide the best course. Nvidia CEO Jen-Hsun Huang promised that the DRIVE PX 2 will have ten times the performance of the first model. The new version is an automotive supercomputing platform with 8 teraflops of processing power that can perform 24 trillion deep learning operations a second.

Volvo said that next year it will lease out 100 XC90 luxury sports utility vehicles that will use DRIVE PX 2 technology to drive autonomously around Volvo’s hometown of Gothenburg. “The rear-view mirror is history,” said Huang.

Paradigm4 puts oncology in the cloud with Onco-SciDB

Boston-based cloud database specialist Paradigm4 has launched a new system designed to speed up the process of cancer research among biopharmaceutical companies.

The new Onco-SciDB (oncology scientific database) features a graphical user interface designed for exploring data from The Cancer Genome Atlas (TCGA) and other relevant public data sources.

The Onco application runs on top of Paradigm4’s SciDB database management system, which is devised for analysing multi-dimensional data in the cloud. The management system was built by database pioneer Michael Stonebraker to use the cloud for massively parallel processing and to offer an elastic supply of computing resources.

A cloud-based database system gives research departments cost control and the capacity to ramp up production when needed, according to Paradigm4 CEO Marilyn Matz. “The result is that research teams spend less time curating and accessing data and more time on interactive exploration,” she said.

Currently, the bioinformatics industry lacks the requisite analytical tools and user interfaces to deal with the growing mass of molecular, image, functional, and clinical data, according to Matz. By simplifying the day-to-day challenge of working with multiple lines of evidence, Paradigm4 claims that SciDB supports clinical guidance for programmes like precision anti-cancer chemotherapy drug treatment. By making massively parallel processing possible in the cloud, it claims, it can provide sufficient affordable computing power for budget-constrained research institutes to trawl through petabytes of information and create hypotheses over the various sources of molecular, clinical and image data.
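As a toy illustration of why array-oriented analysis suits this data, the numpy sketch below slices a (gene × sample × assay) array; the dimensions, threshold and values are invented, and a real deployment would run equivalent queries inside SciDB rather than in local memory.

```python
# Toy numpy sketch: genomic measurements as a multi-dimensional array
# sliced along genes, samples and assays. Dimensions and values invented.
import numpy as np

rng = np.random.default_rng(0)
# axes: (gene, tumour_sample, assay) -- e.g. expression, copy number
data = rng.normal(size=(5_000, 300, 2))

# Select genes whose mean expression (assay 0) across samples exceeds a
# threshold -- the kind of slice-and-aggregate query SciDB parallelises.
mean_expression = data[:, :, 0].mean(axis=1)
candidate_genes = np.nonzero(mean_expression > 0.05)[0]
print(f"{candidate_genes.size} candidate genes")
```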

Database management system SciDB serves as the foundation for the 1000 Genomes Project and is used by biotech companies such as Novartis, Complete Genomics, Agios and Lincoln Labs. A custom version of Onco-SciDB has been beta tested at cancer research institute Foundation Medicine.

Industry veteran Stonebraker, the original creator of the Ingres and Postgres systems that formed the basis of IBM’s Informix and EMC’s Greenplum, won the Association for Computing Machinery’s Turing Award, which carries $1 million in prize money funded by Google, for his pioneering work in database design.

Microsoft maps out 2016 BizTalk Server, Azure Stack and cloud integration plans

Microsoft has unveiled its plans to integrate its cast of cloud services and servers in the coming year. Cloud users can now download a roadmap for the direction of its integration products such as the BizTalk application-integration server, Azure Stack and the Logic Apps included in the Azure App Service offering.

The initiative is the idea of new Azure CTO Mark Russinovich, in a bid to keep customers aware of the changes being made now that many integration processes are out of their domain. Traditionally, integration has been conducted on the customer’s premises or through a business-to-business arrangement, but in the cloud era the systems they want integrated are typically outside of their control, Russinovich said on the company blog. “Everything from sales leads to invoicing, email and social media, is going to be well beyond the corporate firewall,” he said.

As modern integration moves from corporate computer systems to an increasingly mobile world, there needs to be a change of approach at both ends. On a technical level, this change is underpinned by application programming interfaces (APIs) built on lightweight, modern, HTTP/REST-based protocols using JSON, Russinovich said. On a cultural level, Microsoft is opening more channels of communication with its cloud users through updates such as this roadmap.
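In the lightweight style Russinovich describes, an integration call reduces to a small HTTP request with a JSON body, as in this Python sketch; the endpoint and payload are placeholders, not a Microsoft API.

```python
# Minimal sketch of lightweight HTTP/REST-plus-JSON integration.
# The endpoint and payload are placeholders for illustration.
import requests

def push_sales_lead(lead: dict) -> dict:
    """POST a JSON sales lead to a cloud integration endpoint."""
    resp = requests.post(
        "https://integration.example.com/api/leads",  # placeholder URL
        json=lead,
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

push_sales_lead({"name": "Ada Lovelace", "source": "social_media"})
```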

Ahead of the tenth release of BizTalk Server, due in Q4 2016, Microsoft will release a Community Technology Preview and a beta of the product in Q3. BizTalk Server 2016 is to align with Windows Server 2016, SQL Server 2016, Office 2016 and the latest Visual Studio. The latest BizTalk release will straddle both the on-premises and cloud worlds, supporting SQL Server 2016’s AlwaysOn Availability Groups whether they are hosted on Azure or in house.

BizTalk Server will also have better interfaces with Salesforce.com and Office 365, as Microsoft bids to improve the hybrid experience. New, improved BizTalk adapters for Informix, MQ and DB2 have also been promised, along with better PowerShell integration.

Halfway through 2016, Microsoft will host another integration summit, Integrate 2016, as the vendor signals its intent to take its integration platform as a service (iPaaS) responsibilities seriously too.

According to the roadmap Microsoft should imminently release a preview of a planned Logic Apps Update with Logic Apps becoming generally available in Q2. Azure Stack will be available in Q4.