Migrate Data from Legacy to Cloud By @WayneLam888 | @CloudExpo #Cloud

As virtualized and cloud systems are so prevalent and integral to data storage, the need to migrate data from a legacy storage system to a virtual or cloud-based one is inevitable. But, here’s the rub: migrating data is a lot harder than you might think, despite all the marketing noise about smooth transitions, ease-of-use, and turnkey solutions.
While the term ‘cloud’ means many things to many people, I believe it refers to a large-scale virtualized data center that has numerous clusters of VMware or other hypervisor host servers connected to a SAN storage farm with petabytes of data.

The Consumption Cloud Economy by @IanKhanLive | @CloudExpo #IoT #Cloud

Tech giant Amazon recently announced that it will pay writers only for the pages that are actually read, rather than for the purchase of an entire book. This is arguably the biggest change the world of publishing has seen since the invention of the printing press, because it fundamentally changes how books are consumed. This, exactly, is the Consumption Cloud Economy.
The fundamental concept behind the Consumption Cloud is offering services and solutions so that end users pay only for what they use, and vendors and suppliers are paid only for what is actually consumed. In other words, you are now part of a system where, as a consumer, you pay ONLY for what you use. That's fantastic, and it helps end users save more than just a few bucks. Expanded to the multitude of services that can be offered in this form, the model reaches across IT, solutions, services and more. I spoke about this concept at the recent Cloud Expo in New York, where I gave examples of car tires and shoes.
The consumption economy has some key aspects that need to be locked down before you roll out your solutions and services on this model.
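As a rough illustration, the pay-for-what-you-use principle reduces to a simple metering calculation. The sketch below is hypothetical; the rate, free tier and function name are invented, not taken from any vendor's pricing:

```python
def consumption_charge(units_consumed: float, rate_per_unit: float,
                       free_tier: float = 0.0) -> float:
    """Bill only for what was actually consumed, beyond any free allowance."""
    billable = max(units_consumed - free_tier, 0.0)
    return round(billable * rate_per_unit, 2)

# e.g. 120 pages read at $0.05 per page, with the first 20 pages free
charge = consumption_charge(120, 0.05, free_tier=20)
```

The same shape applies whether the metered unit is pages read, gigabyte-hours or API calls; only the meter changes.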

Cisco to acquire OpenDNS to strengthen cloud security for IoT

Cisco plans to acquire OpenDNS for $635m

Cisco is to acquire cloud-based network security provider OpenDNS for $635m.

OpenDNS’ offering combines DNS services with a managed network security service that tracks devices and traffic and helps mitigate malware and denial-of-service threats. It also adds predictive intelligence capabilities, using big data analytics to digest real-time behaviour and machine learning algorithms to automate mitigating action.
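OpenDNS's actual implementation is proprietary, but the DNS-layer idea can be sketched in a few lines: check each requested name (and its parent domains) against a threat list before answering. Everything below (blocklist entries, the sinkhole address, the record table) is invented for illustration:

```python
# Toy illustration of DNS-layer filtering: block a lookup when the
# requested name, or any parent domain, appears on a threat list.

BLOCKLIST = {"malware-c2.example", "phishing.example"}
SINKHOLE = "0.0.0.0"  # address returned instead of the real record


def resolve(domain: str, records: dict) -> str:
    """Return the A record for `domain`, or a sinkhole address if blocked."""
    # Walk up the label hierarchy: "a.b.example" also checks "b.example".
    labels = domain.split(".")
    for i in range(len(labels) - 1):
        candidate = ".".join(labels[i:])
        if candidate in BLOCKLIST:
            return SINKHOLE
    return records.get(domain, "NXDOMAIN")
```

A real service layers threat intelligence, behavioural analytics and per-device policy on top, but the enforcement point is the same: the DNS lookup itself.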

Cisco said the acquisition would strengthen its security services portfolio, a core element of its Internet of Things (IoT) strategy.

“As more people, processes, data and things become connected, opportunities for security breaches and malicious threats grow exponentially when away from secure enterprise networks,” said Hilton Romanski, Cisco chief technology and strategy officer.

“OpenDNS has a strong team with deep security expertise and key technology that complements Cisco’s security vision. Together, we will help customers protect their extended network wherever the user is and regardless of the device.”

As part of the deal, which is expected to close sometime in the first quarter of next year, the OpenDNS team will join the Cisco Security Business Group led by David Goeckeler, the division’s vice president and general manager.

Targeting the network has become an increasingly important component of enterprise IT security, particularly with the explosion of malware and denial of service attacks – and will continue growing in importance as the IoT brings vast volumes of automated connectivity and data transactions.

The trend has seen more emphasis placed on cloud-based security services, which can act as a security perimeter without requiring anything to be installed in the datacentre. According to Gartner, the cloud-based security market will grow from $2.1bn in 2013 to $3.1bn this year.

BT, Accenture and Cisco form Wireless IoT Forum board

The Wireless IoT Forum has announced its founding board members

Following its launch in March, the Wireless IoT Forum has announced its founding board members, featuring BT, Cisco, Accenture and Telensa among others, reports Telecoms.com.

The forum is a collaborative industry effort designed to help define the requirements of the wireless WAN in an IoT era. Specifically, the group says it is looking to drive widespread adoption of wireless WAN technology by removing fragmentation and driving consolidation around a minimal set of standards for licensed and licence-exempt wireless solutions.

Ensuring the interoperability of solutions running throughout the entire IoT stack is one of the primary challenges in bringing the Internet of Things to fruition. As such, Wireless IoT Forum CEO William Webb believes solving compatibility issues remains key to driving broad-scale adoption of IoT.

“…the risk presented by fragmentation remains very real,” he said. “Without widely-agreed open standards we risk seeing pockets of proprietary technology developing independently, preventing the benefits of mass-market scale. We are delighted today to be announcing our inaugural membership and to begin work to drive towards a collective view on the right way to deliver widespread IoT services.”

BT’s Mark Harrop reckons the IoT industry should look at using GSM as a benchmark for collaboration. “As the success of the GSM standard in the mobile world showed, working to open industry standards is critical to creating the necessary situation for mass market success,” he said. “By aligning the complete value chain in defining and promoting these standards the Wireless IoT Forum is ideally suited to make the Internet of Things a success.”

Under the remit of the association are a variety of working groups focusing on four core areas: marketing and requirements; review of applications and standard APIs; connectivity and networking challenges, including configurations, security and radio access for low-power wide-area networks; and regulation.

CSA, CipherCloud look to standardise APIs for cloud access security brokerage

The CSA and CipherCloud are leading an initiative to standardise API implementation for cloud access security brokerage

The Cloud Security Alliance (CSA) and cloud security vendor CipherCloud are forming a working group to jointly develop best practice around API deployment for cloud access security brokerage services.

The Cloud Security Open API Working Group, which at its founding includes contributions from Deloitte, Infosys, Intel Security and SAP, among others, will jointly define protocols, guidelines and best practices for implementing data security services – encryption, tokenisation and other technologies – across cloud environments.
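Tokenisation, one of the data security services the group will cover, can be illustrated with a minimal sketch: swap each sensitive value for an opaque token and keep the mapping on the trusted side, so the cloud application only ever sees tokens. The class and method names here are hypothetical, not part of any CSA specification:

```python
import secrets


class TokenVault:
    """Minimal tokenisation sketch: sensitive values are replaced with
    opaque random tokens, and the mapping stays on the trusted side so
    the cloud application never sees the real data."""

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        """Issue a fresh opaque token for a sensitive value."""
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; only the trusted side can do this."""
        return self._token_to_value[token]
```

A production broker would add persistence, access control and format-preserving options, but the trust boundary is the core idea a standard API would have to capture.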

The CSA said the working group plans to develop API specifications and reference architectures to guide cloud-based data protection.

“Standards are an important frontier for the cloud security ecosystem,” said Jim Reavis, chief executive of CSA.

“The right set of working definitions can boost adoption. This working group will help foster a secure cloud-computing environment – a win for vendors, partners and users. Standardising APIs will help the ecosystem coalesce around a universal language and process for integrating security tools into the cloud applications,” Reavis said.

Pravin Kothari, founder and chief executive of CipherCloud said: “Cloud is the killer app for security innovation. But currently, inefficiencies at the technical level in the form of custom connector protocols can hold back innovations in cloud security. Defining a uniform set of standards can enable us all to operate from the same playbook. As a pioneer in [cloud access security brokerage], we are excited to co-lead this initiative with CSA to accelerate security across clouds.”

The initiative may enhance the ability to integrate various cloud services securely, according to Jeff Margolies, principal at Deloitte, and open up what is generally considered to be a fairly closed, proprietary-dominated space.

“Currently the cloud security ecosystem lacks basic integration standards for connecting third-party security solutions to cloud applications, platforms and infrastructure,” he said, adding that the working group may help consolidate standards among vendors and cloud customers.

Cloud providers aren’t sharing their metadata – and it’s eroding customer trust


The verdict on cloud vendors has come in from a new research study by Forrester: must do better at assuaging compliance fears, addressing support issues, and making customers feel wanted.

The survey, of 275 IT decision makers in the United States, United Kingdom and Singapore, revealed some eye-opening findings. 39% of UK respondents agreed with the statement “my cloud provider doesn’t know me or my company”. One in three argued “my provider charges me for every little question or incident.” 45% agreed with the statement “if I were a bigger customer, my cloud provider would care more about my success.”

Lilac Schoenbeck is VP product marketing and product management at iland, which sponsored the study. She admitted surprise over the level of negative sentiment customers have towards their providers. “As a cloud provider, that kind of thing just feels really bad,” she tells CloudTech. “I would hate to think that somebody that was basing so much of their business on their infrastructure had those negative feelings about me.”

The perceived lack of support from cloud vendors, Schoenbeck argued, was much less surprising. 26% of respondents said the onboarding process took too long, 21% said onboarding lacked a sufficient human aspect, and 18% had bill shock over their support costs.

Nor were the findings on compliance surprising. 62% said on-demand access to necessary reports would ease the pressure, along with complete reporting of the compliance status of the cloud provider (54%) and suggestions for achieving compliance (43%). “[Compliance is] something that’s really risen to the fore in the last 12 months around cloud,” Schoenbeck explains. “You can see people being much more aware of their responsibilities.”

The nub of the issue is the release – or not – of metadata, information about the performance, configuration and operations of each cloud workload. While typically most cloud providers have access to it, it’s a different story for the customers. Every respondent in the survey said they were financially or operationally affected by unavailable metadata; but Schoenbeck argues this is not the full story.

“Cloud vendors have [the metadata], it’s just a decision about whether or not they’re going to invest in sharing it with their customers – and that’s not a small investment,” she says. “It’s easy to share a CSV that’s far too large and far too old for anybody to care about. For performance data to be useful, you need to get it in almost real time, and make accurate decisions based on it almost immediately.”

She adds: “I think that’s often the challenge. I was a little disappointed to see the breadth of that challenge across the industry.”
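Schoenbeck's point about freshness can be made concrete with a small sketch: before acting on provider metadata, discard anything too old to matter. The sample format (timestamp, value pairs) and the 60-second cutoff are assumptions for illustration, not any provider's actual API:

```python
import time


def fresh_metrics(samples, max_age_seconds=60, now=None):
    """Keep only performance samples recent enough to act on.

    `samples` is a list of (unix_timestamp, value) pairs; anything older
    than `max_age_seconds` is treated as stale and dropped. A huge, old
    CSV export would fail this filter entirely.
    """
    now = time.time() if now is None else now
    return [(ts, v) for ts, v in samples if now - ts <= max_age_seconds]
```

The real investment a provider makes is in the pipeline that delivers samples inside that window at all; the consumer-side filter is the trivial part.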

This change in customer expectation correlates with the maturity of the cloud market. Over 70% of companies surveyed said they had been using cloud services for more than one year, while 84% and 76% of UK and Singapore respondents respectively said they relied on two or more cloud providers. It’s a challenge vendors increasingly face. When enterprise storage provider Egnyte won Red Bull North America as a client from Box, head of EMEA Ian McEwan explained that for customers with previous experience, it’s about trust, and understanding that the vendor can better meet their business needs.

“What I’d like to see is that this experience of multiple vendors creates a more savvy customer,” Schoenbeck explains. “Ultimately, in any kind of technology, but specifically in an emerging technology like cloud, having a more savvy customer means that you can more powerfully partner with them and achieve their business ends in a better and faster way.”

But with this research, is iland throwing cloud vendors under the bus? “I think it is more an element of prioritisation – almost an element of benign neglect,” says Schoenbeck. “If you looked at the arc of most technologies, they do find themselves in this place where the technology is paramount, and that is the interesting piece, and nobody is really concerned with the end user. Over time, things shift.

“I think we might be at a pivot point where that benign neglect in the name of technological progress is giving way to an understanding that, ultimately, this is a business model and you need to serve your customer,” she adds.

Forrester gave three takeaways for cloud customers following the research:

  • Get the metadata you require: ensure your provider exposes performance, security and cost data.
  • Find clarity on compliance: look for a cloud provider with strong reporting and compliance alerting.
  • Ensure your cloud is supported by humans: don’t settle for being ‘just another account number’.

Colt bows to competition, exits IT services

Colt is bowing out of the increasingly saturated IT services market

Colt is bowing out of the increasingly saturated IT services market

In a bid to increase profitability, Colt announced this week that it would exit the IT services market and put greater focus on its “core” services, including its network, voice and datacentre services.

The company said its “managed exit” from the IT services market would also allow it to focus on offering datacentre services (colocation, cloud) and optimise use of its assets.

“Our IT services business would continue to need considerable investment in the short-to-medium term in order to deliver profitability and we do not believe this business can compete and grow successfully with a level of risk that is acceptable,” the company said in a statement Tuesday.

“Colt will continue to honour existing customer contracts through to termination, but will no longer seek new business.”

“The recent performance of IT Services has shown few signs of improving in accordance with the targets we set to deliver appropriate profit and cash returns in the medium term.”

The company anticipates the move will save about €25m annually, though it expects to incur cash impairment charges of €45m to €55m and non-cash impairment charges of around €90m. Revenue from IT services is expected to decline by €20m annually and become immaterial by 2018, it said.

“The fundamentals of our core network services and voice services businesses remain solid, and we are driving improvements in our datacentre services business. We are taking decisive action to become a more focused and disciplined organisation which we believe will accelerate the performance of our Core Business,” said Rakesh Bhasin, Colt chief executive.

“Overall, we believe the prospects for the Group are good and I am confident that, with the recent changes we have made within the senior management team, we will be able to deliver improved profitability and cash returns,” he added.

Colt still owns and will continue to operate its 22 carrier-neutral datacentres in Europe and seven in the Asia Pacific region, including those acquired through Japanese IT services provider KVH last year. Its retreat from IT services may also mean a pivot towards becoming more of a systems integrator – a market which, like IT services, is quite competitive – and it isn’t entirely clear how the company intends to differentiate itself from the large incumbents in that space.

AWS to expand to India in 2016

AWS said India is the next big market for public cloud expansion

AWS said India is the next big market for public cloud expansion

Amazon unveiled plans this week to bring its Amazon Web Services (AWS) infrastructure to India by 2016 in a bid to expand into the quickly growing public cloud services market there.

AWS is already available in India and the company claims to have over 10,000 local customers using the platform, but the recently announced move would see the company set up its own infrastructure in-country rather than delivering the services from nearby regions like Singapore.

The company says the move will likely improve the performance of the cloud services on offer to local organisations.

“Tens of thousands of customers in India are using AWS from one of AWS’s eleven global infrastructure regions outside of India. Several of these customers, along with many prospective new customers, have asked us to locate infrastructure in India so they can enjoy even lower latency to their end users in India and satisfy any data sovereignty requirements they may have,” said Andy Jassy, senior vice president, AWS.

“We’re excited to share that Indian customers will be able to use the world’s leading cloud computing platform in India in 2016 – and we believe India will be one of AWS’s largest regions over the long term.”

The India expansion comes at a time when the local market is maturing rapidly.

According to analyst and consulting house Gartner, public cloud services revenue in India will reach $838m by the end of 2015, an increase of almost 33 per cent – making it one of the fastest growing markets for public cloud services in the world (global average growth rates sit in the mid-twenties, depending on the analyst house). The firm believes many local organisations in India are shifting away from more traditional IT outsourcing and using public cloud services instead.

Google Pairs Up with Broad Institute

Google has paired up with the high-profile Broad Institute of MIT and Harvard to develop its cloud genomics platform. The scientific community needs new technologies to deal with the scale of genomic information, and Google and the Broad Institute are looking to provide them.

This technology will store, process and share genomic information, as well as making it useful and accessible. The institute released a statement saying: “The goal is to enable any genomic researcher to upload, store, and analyze data in a cloud-based environment that combines the Broad Institute’s best-in-class genomic analysis tools with the scale and computing power of Google.”

Broad Institute will make its Genome Analysis Toolkit, or GATK, available as a service on the Google Cloud Platform. Initial access to the GATK will be limited, but eventually the service will be made more widely available. Any user will be able to upload their data to the cloud and GATK will analyze it using Google’s computing capacity.

Product manager at Google Genomics, Jonathan Bingham, wrote in a blog post: “In order to scale up by the next order of magnitude, Broad and Google will work together to explore how to build new tools and find new insights to propel bio-medical research, using deep bioinformatics expertise, powerful analytics, and massive computing infrastructure.”

This partnership allows researchers to outsource the configuration of technical specifications and maintenance of computing capacity to Google.

GATK could give Google an edge over other cloud computing companies, such as Amazon Web Services, in the genomics field; however, Google’s partnership with Broad is not exclusive.

The post Google Pairs Up with Broad Institute appeared first on Cloud News Daily.

Using External Displays in Parallels Desktop

Guest blog by Dishant Tripathi, Parallels Support Team. Have you ever thought of working with Parallels Desktop on multiple screens? Even if you haven’t, you’ll be amazed by how much it can enhance your experience. (In other words, it’s awesome.) Thankfully, getting started with Parallels Desktop on multiple screens is really simple. All you have to do […]

The post Using External Displays in Parallels Desktop appeared first on Parallels Blog.