[video] @Ericsson Keynote: Bringing #IoT to Society | @ThingsExpo @EricssonIT

It is one thing to build single industrial IoT applications, but what will it take to build the Smart Cities and truly society-changing applications of the future? The technology won’t be the problem; it will be the number of parties that need to work together and be aligned in their motivation to succeed. In his Day 2 Keynote at @ThingsExpo, Henrik Kenani Dahlgren, Portfolio Marketing Manager at Ericsson, discussed how to plan to cooperate, partner, and form lasting all-star teams to change the world, and how it all starts with business models and monetization strategies.

read more

[slides] #SecOps in Cloud | @CloudExpo @BMCSoftware #AI #ML #DevOps

Many private cloud projects were built to deliver self-service access to development and test resources. While those clouds delivered faster access to resources, they lacked the visibility, control and security needed for production deployments. In their session at 18th Cloud Expo, Steve Anderson, Product Manager at BMC Software, and Rick Lefort, Principal Technical Marketing Consultant at BMC Software, discussed how a cloud designed for production operations not only helps accelerate developer innovation but also delivers the control that IT Operations needs to run a production cloud without getting in the way.

read more

The Two Faces of #Microservices | @CloudExpo @JPMorgenthal #AI #DevOps

Microservices (μServices) are a fascinating evolution of the Distributed Object Computing (DOC) paradigm. DOC was originally designed to simplify the development of complex distributed applications by applying object-oriented design principles to disparate components operating across networked infrastructure. In this model, DOC “hid” the complexity of making this work from the developer, regardless of the deployment architecture, through the use of complex frameworks such as the Common Object Request Broker Architecture (CORBA) and the Distributed Component Object Model (DCOM).
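
For contrast, here is a minimal sketch (not from the session) of the microservices side of that evolution: the service exposes an explicit HTTP/JSON contract, so the network boundary is visible to the caller rather than hidden behind a broker-mediated remote object.

```python
# Minimal microservice sketch using only the standard library: the caller
# knows it is crossing the network, unlike the "hidden" remoting of CORBA/DCOM.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Latency and failure are part of the interface, not masked by a framework.
        body = json.dumps({"sku": self.path.strip("/"), "price": 9.99}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), PriceService).serve_forever()
```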

read more

A Look into Cloudian’s HyperStore 4000

Cloudian has come up with a new cloud data archiving product called HyperStore 4000 that promises a range of features to support the changing storage needs of businesses.

HyperStore 4000 offers object storage, which means data is stored as objects rather than as files. The obvious advantage is that each object acts as a single, self-contained repository instead of living in a file system where documents are nested within subfolders.
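
As a concrete illustration (a minimal sketch, not Cloudian documentation), HyperStore exposes an S3-compatible API, so an object can be written and read back by bucket and key; the endpoint URL, bucket name and credentials below are placeholders.

```python
# Illustrative sketch: writing and reading an object over an S3-compatible API.
# Endpoint, bucket and credentials are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://hyperstore.example.local",  # hypothetical on-premises endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Each object is addressed by bucket + key in a flat namespace; the "/" in the
# key is just part of the name, not a real subfolder.
s3.put_object(Bucket="archive", Key="2017/invoices/inv-0001.pdf", Body=b"...")
obj = s3.get_object(Bucket="archive", Key="2017/invoices/inv-0001.pdf")
print(obj["Body"].read())
```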

This cloud archiving option comes with 700 TB of storage and two separate compute nodes in every chassis. Essentially, this gives a high level of flexibility for data storage, as the system can be configured as a three-way cluster. In addition, its built-in cloud tiering helps ensure high levels of data availability.

With this product, customers can choose to store data on their own premises or on popular cloud platforms like AWS, Microsoft Azure and Google Cloud, and the data can tier to Cloudian’s public cloud as well.

HyperStore 4000 is the perfect addition to Cloudian’s line of products as it combines the company’s existing expertise with what businesses need today. Data is growing at astronomical speeds: according to a report published by IDC, the amount of data is doubling in size every two years, which means that by 2020 we’ll reach 44 zettabytes (equivalent to 44 trillion gigabytes). Data growing that quickly needs ample storage space, and that is exactly what HyperStore 4000 offers with its 700 TB capacity.
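
A quick back-of-the-envelope check of those growth figures (the ~4.4 ZB estimate for 2013 is an assumption taken from IDC’s earlier Digital Universe study, not from this article):

```python
import math

# Figures quoted above: ~44 ZB by 2020, data doubling roughly every two years.
# The 2013 starting point (~4.4 ZB) is an assumed value from IDC's earlier
# Digital Universe estimates, used here only for arithmetic.
zb_2013, zb_2020 = 4.4, 44.0
years = 2020 - 2013

print(f"44 ZB = {44e21 / 1e9:,.0f} GB (about 44 trillion gigabytes)")
print(f"Implied doubling time: {years / math.log2(zb_2020 / zb_2013):.1f} years")  # ~2.1 years
```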

According to Jon Toor, the Chief Marketing Officer of Cloudian, this device will be particularly useful for industries that deal with large amounts of data, such as genome sequencing companies, video surveillance units, and the entertainment industry. These industries prefer large on-premises storage over the cloud simply because it is safer and they know where their data is located. He also suggested it could replace tape archives, which are slowly going out of use.

In terms of pricing, too, Cloudian’s HyperStore 4000 is a competitive deal. A report released by the company says the device offers on-premises performance at the price of cloud services, as it costs 40 percent less per GB than other Cloudian products. This lower pricing is Cloudian’s way of establishing itself in a market dominated by cloud providers with extensive infrastructure.

In fact, one of the biggest reasons many companies move to the cloud is cost saving. If Cloudian can offer the same cost, there is a good chance customers will buy HyperStore 4000. On top of that, the many recent reports of hacking and data loss have renewed concerns about cloud security. So when Cloudian offers a solution that is safer than the public cloud at the same price, many businesses are sure to look into it.

In short, Cloudian has come up with a competitive product to bridge the gap between cloud and on-premises storage. Its competitive price, abundant storage and easy-to-use features may well make businesses reconsider their data archiving strategies.

The post A Look into Cloudian’s HyperStore 4000 appeared first on Cloud News Daily.

IDC says global spending on public cloud services to hit $122.5bn in 2017

Global spending on public cloud services and infrastructure will hit $122.5 billion by the end of this year, an increase of almost 25% on 2016, according to the latest note from IDC.

The analyst house argues that discrete manufacturing, professional services, and banking are the primary public cloud service industries based on market share today, while professional services (23.9% compound annual growth rate), retail (22.8%) and media (22.5%) are expected to be the fastest growers. Almost half of all public cloud spending will come from businesses with more than 1,000 employees.

The United States will remain the largest market for public cloud services generating more than 60% of revenue through the forecast period, IDC argues. Yet seven out of eight regions are expected to reach CAGRs of more than 20% over the next five years – with the US lagging behind at a mere 19.9% CAGR.
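
For readers who want to translate those rates into totals, a compound annual growth rate of r over n years simply multiplies the starting figure by (1 + r)^n; the snippet below applies the rates quoted above and nothing else.

```python
# CAGR arithmetic for the rates quoted above; purely illustrative.
def cagr_multiplier(rate: float, years: int) -> float:
    """Growth multiplier implied by a compound annual growth rate over `years`."""
    return (1 + rate) ** years

print(f"20% CAGR over 5 years grows spending by x{cagr_multiplier(0.20, 5):.2f}")     # ~2.49
print(f"19.9% CAGR over 5 years grows spending by x{cagr_multiplier(0.199, 5):.2f}")  # ~2.48
```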

“European companies have been slower in the adoption of cloud when compared to their US counterparts, but now the market is maturing and it is the right time for cloud providers to target and capture the untapped segments,” said Serena Da Rold, IDC senior research manager for customer insights and analysis.

This missive makes sense when examining the various moves providers have made in the continent over recent months. Last month reports said that Facebook was expanding its data centre empire with a new site in Denmark, while Rackspace moved operations to Germany in November last year and IBM became the latest vendor to build a UK site at the same time.

Software as a service will remain dominant, albeit slowing over time – accounting for two thirds of all public cloud spending in 2017 and moving to 60% by 2020 – yet the figures also need to be tempered with some lateral thinking, IDC argues.

“As cloud adoption expands over the next four years, what clouds are and what they can do will evolve dramatically – in several important ways,” said Frank Gens, senior vice president and chief analyst at IDC. “The cloud will become more distributed – through Internet of Things edge services and multicloud services – more trusted, more intelligent, more industry and workload specialised, and more channel mediated.

“As the cloud evolves these important new capabilities – what IDC calls ‘Cloud 2.0’ – the use cases for the cloud will dramatically expand,” Gens added.

According to a note from Synergy Research published at the beginning of this month, quarterly public cloud infrastructure service revenues – including public infrastructure as a service and platform as a service – have hit more than $7 billion, continuing to grow at almost 50% per year.

Why the cloud could hold the cure to diseases

We constantly hear about programs such as Race for the Cure, Breast Cancer Awareness Month, The Ice Bucket Challenge, and other fundraising or awareness initiatives for diseases. However, hearing that a disease has been cured almost never happens. With billions of dollars being spent on disease research around the world, many people have started asking why more progress hasn’t been made. Researchers re-examined their processes and realised two things. First, research methods have been largely unchanged in many disease-fighting fields: foundations, doctors and researchers would conduct studies independently of any other group studying the same disease and draw conclusions from their limited data sets.

One example of this was Parkinson’s disease, where individual doctors instinctively measured the progression of symptoms during well visits. “Nearly 200 years after Parkinson’s disease was first described by Dr. James Parkinson in 1817, we are still subjectively measuring Parkinson’s disease largely the same way doctors did then,” said Todd Sherer, Ph.D., CEO of The Michael J. Fox Foundation. With few data points and poor collection of that data, Parkinson’s researchers weren’t able to see trends or delve into which treatments were having a positive effect.

The second realisation was that cloud technology was the perfect vehicle for sharing patient data with other researchers. Big data has been called the “next big tech disrupter,” and many companies already use it to identify customer trends. Similarly, the scientific community has started using the cloud to collect data and discover trends in patient and genetic data. Today, the Michael J. Fox Foundation is working on collecting the “world’s largest collection of data about life with Parkinson’s” via smart watches that upload patient data directly to the cloud.

Many disease-fighting organisations are working to implement the cloud as a data sharing vehicle. Nancy Brown, CEO of the American Heart Association, explains why the cloud is a game changer when it comes to curing disease. “To push new novel discoveries, we need the ability to allow scientists and researchers to have access to multiple data sets,” Brown said. “There’s a lot of data out there — data from clinical trials, data from hospitals and their electronic health records, data from the Framingham Heart study. Traditionally, all of that has been kept by individual companies or data owners.”

One beauty of the cloud is hybrid cloud computing: the ability to share data without compromising intellectual property. This way, individual entities can share data sets with the public cloud while simultaneously maintaining a private cloud to store their proprietary findings. Everyone then has access to the large shared data sets and can download, manipulate and store the data within their own private cloud as they do research.
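
A minimal sketch of that hybrid pattern (bucket names, the private endpoint and the data sets below are all hypothetical): de-identified data is published to a public cloud bucket, while proprietary findings stay in a private cloud store.

```python
# Hypothetical hybrid-cloud sketch: share de-identified data publicly, keep
# proprietary findings in a private cloud. Names and endpoint are made up.
import boto3

public_s3 = boto3.client("s3")  # public cloud account holding the shared data set
private_s3 = boto3.client("s3", endpoint_url="https://storage.research.internal")  # private cloud

# Publish a de-identified cohort for other researchers to download and analyse.
public_s3.put_object(Bucket="shared-parkinsons-data", Key="cohort_v3.csv",
                     Body=b"patient_id,age,updrs_score\n...")

# Keep the group's own proprietary analysis results on the private side.
private_s3.put_object(Bucket="internal-findings", Key="biomarker_model_results.csv",
                      Body=b"marker,effect_size\n...")
```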

The importance of the cloud is highlighted by the National Cancer Institute’s program, the Cancer Moonshot, headed by former vice president Joe Biden. The program is designed to double the rate of progress in cancer prevention, diagnosis, and treatment, and to do in five years what might otherwise take a decade. One of the ten “transformative research recommendations” created to achieve the aggressive Cancer Moonshot goal is building a national cancer data ecosystem using the cloud.

Beyond the data sharing capabilities of the cloud are its computing capabilities. Where desktop computers aren’t able to handle and analyse the massive streams of data being collected, cloud computing can. Mark Kaganovich, founder of SolveBio and a doctoral candidate in genetics, explained that the challenge companies and researchers are actively working on is building tools to sift through the “data tornado” and take advantage of the “huge opportunity to use statistical learning for medicine.”
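
As a purely illustrative sketch of what “statistical learning for medicine” can look like in code, the example below fits a simple classifier on synthetic genomic features to predict treatment response; no real patient data or validated model is implied.

```python
# Illustrative only: synthetic genomic features and a synthetic response label.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))            # 500 patients, 20 genomic features (synthetic)
y = (X[:, 0] + X[:, 3] > 0).astype(int)   # synthetic "responded to treatment" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```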

One real-world cloud application is sharing how patients with certain genomes react to certain drug treatments. Eric Dishman, Director of Proactive Health Research at Intel, shared that when he had a rare form of kidney cancer, doctors tried a variety of treatments without success. It wasn’t until his genome was sequenced that his doctors were able to treat him effectively, knowing which drugs were likely to work best.

Currently, cancer organisations are working on sharing data about how cancer patients with similar genomic patterns respond to their treatments, enabling doctors to choose treatments for future patients more effectively. As Clay Christensen explains in his book on health care disruption, the cloud has the ability to take our current system of intuitive medicine and transform it into precision medicine.
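
One hypothetical way to picture that precision-medicine matching (all data and drug names below are made up): find prior patients with the most similar shared genomic profile and see which treatment worked best for them.

```python
# Hypothetical nearest-neighbour sketch of genomic treatment matching.
import numpy as np

prior_profiles = np.random.default_rng(1).normal(size=(1000, 50))  # shared genomic features (synthetic)
prior_best_drug = np.random.default_rng(2).choice(["drug_a", "drug_b", "drug_c"], size=1000)

def suggest_treatment(new_profile: np.ndarray, k: int = 25) -> str:
    # Rank prior patients by distance to the new genomic profile.
    dists = np.linalg.norm(prior_profiles - new_profile, axis=1)
    nearest = prior_best_drug[np.argsort(dists)[:k]]
    drugs, counts = np.unique(nearest, return_counts=True)
    return drugs[np.argmax(counts)]  # most common effective drug among the neighbours

print(suggest_treatment(np.zeros(50)))
```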

All you need to know about Parallels Tools installation in Parallels Desktop for Mac

Parallels Support team guest author: Dineshraj Yuvaraj. When you set up your first virtual machine in Parallels Desktop for Mac, you may have noticed Parallels Tools installing automatically (in the guest operating system). So what is Parallels Tools, and why is it so important for Windows/Linux/Mac VMs in Parallels Desktop? Why does it install automatically? How […]

The post All you need to know about Parallels Tools installation in Parallels Desktop for Mac appeared first on Parallels Blog.

Ten Attributes of Serverless Computing Platforms | @CloudExpo #FaaS #Cloud #Serverless

Serverless computing, or Functions as a Service (FaaS), is gaining momentum. Amazon is fueling the innovation by expanding Lambda to edge devices and content distribution networks. IBM, Microsoft, and Google have their own FaaS offerings in the public cloud. There are over half a dozen open source serverless projects that are getting the attention of developers. This year, expect to see new platforms emerging in…
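
For readers new to FaaS, a minimal handler in the AWS Lambda style looks like the sketch below; the event shape is illustrative, and the platform, not the developer, provisions and scales the runtime.

```python
import json

def lambda_handler(event, context):
    # The developer supplies only this function; the platform handles servers,
    # scaling and billing per invocation. Event fields here are illustrative.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": f"hello, {name}"})}
```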

read more