How planning is the perfect tonic for avoiding the ‘cloud hangover’

(c)iStock.com/mediaphotos

If there is one shortage we will never suffer in the technology sector, it is one of attention-grabbing sound-bites.

The latest one to do the rounds is the ‘cloud hangover’. It has been coined to describe the undesirable after-effects of a move to the cloud, the most painful of which is the significant, unplanned-for expenditure organisations incur in maintaining and supporting their newly acquired cloud IT.

While you may wince at the phrase, the issue is an important one for any organisation thinking about any sort of cloud adoption.

If two of the top benefits of cloud IT are flexibility and reduced costs, no IT lead wants to end up in a place where their new technology infrastructure is harder to manage and costs far more than was initially planned.

The good news is that, in my experience, such problems are easily avoided, particularly once you recognise that, like its real-life equivalent, a ‘cloud hangover’ comes down to a series of poor choices.

So what should organisations be thinking about?

Invest time before you invest money

The first issue to recognise is that any move to the cloud requires an investment of time to map out your needs and create a plan for migration to the cloud. This should be aligned to and prioritised against business needs. It’s at this time when you should involve suppliers to help with your thinking.

A supplier should work with you to assess the costs, risks and rewards of cloud for your organisation, and then create an implementation roadmap. Of all the investments you make, this planning phase is the most critical because it sets the context for every other choice you make.

Avoid tactical cloud adoption

It is critical to resist the temptation to implement cloud solutions on the basis that they deliver quick cost savings or because they are an easy-to-implement replacement for existing technology, unless it is part of a wider plan. Going down this road can increase complexity and multiply IT silos in your organisation, embedding unnecessary costs or leaving you with a bigger clear-up job further down the line.

The first phase of cloud migration following strategy definition and planning should focus on putting in place the right IT infrastructure to enable future flexibility and cost savings. For many organisations, this may involve a co-location arrangement as a precursor to a fuller cloud IT estate.

Get a plan for your data

Understanding and assessing what data you have currently, how it needs to be used and its security requirements is critical to the success of a move to cloud.

Do this piece of work up front and you will ensure you don’t pay for data to be held in a security environment that is more costly than you really need.

An IT team which can deliver

Moving to the cloud demands a team with a different set of skills from those needed to manage and maintain in-house IT infrastructure.

Planning for a move to the cloud is as much about people as technology. Once you have identified the IT structure that you want to put in place, then you should be scoping and shaping an internal IT function that has the capability to support it.

Managing suppliers, service level agreements, change management and project management are critical skills for cloud IT, along with strategic planning skills. Keeping the right skills in house and ensuring there is knowledge transfer from your suppliers is critical to keeping a handle on costs and activity.

In my experience, organisations that invest in planning across all of these areas upfront are best placed to avoid nasty surprises further down the line. It also helps crystallise the real cost of moving to the cloud, so that forecasts for potential like-for-like savings, return on investment and total cost of ownership can be made accurately.

Read more: Report reveals public sector fighting hidden costs of cloud computing

Disaster Recovery Survival By @NaviSite | @CloudExpo #IoT #BigData #DevOps

Does your data center strategy contain a few potentially dark rain clouds of unknown costs, or doubts about the sustainability of business continuity? Business continuity has to be purpose-driven, with real, practical goals, while maintaining an achievable cost of ownership.
In his session at 17th Cloud Expo, Joe Thykattil, Technology Architect & Sales at TimeWarner / Navisite, will discuss how to leverage the partnership between the service provider and VMware vCloud Air disaster recovery to the cloud to create your BCDR strategy.


How to Save Your Smartphone Battery (Interactive Infographic)

Raise your hand if you’ve drained your battery more than once in a day! I love my smartphone, but I clearly use it more than what this little battery can handle. (I’m looking at you, Fallout Shelter—that game sucks more battery and time out of my day than anything else.) There are lots of tips […]

The post How to Save Your Smartphone Battery (Interactive Infographic) appeared first on Parallels Blog.

How to Fix the “Black Screen” in Windows

Guest blog by Maria Golubeva, Parallels Support Team Some people are tech-savvy, and some aren’t. Once new software is installed on a Mac, some folks just start using it with the default settings, but others (and I’d say I’m one of them) optimize it by changing various settings. Sometimes I just add more RAM and […]

The post How to Fix the “Black Screen” in Windows appeared first on Parallels Blog.

Imperva Inc.’s New Senior Vice President of Cloud Services

Imperva Inc., a company dedicated to protecting critical data and applications in the cloud, has announced that Meg Bear will become the company’s Senior Vice President of Cloud Services, responsible for expanding its range of cloud services internationally. Bear brings more than twenty years of experience across many aspects of the software business and holds eleven patents for innovations in data management, social business, recruitment and talent management. She is a pivotal figure in the transition to cloud-based business models. Bear was previously Group Vice President for Social Cloud at Oracle, responsible for delivering an integrated global social suite, and held several other leadership roles at Oracle, including Vice President of Human Capital Management (HCM) Development and Senior Director of Development for Oracle PeopleSoft.


“Hiring Meg is one more example of our continued investment in our rapidly growing cloud business, and we are thrilled to bring her on board,” said Anthony Bettencourt. “With the 98% year-over-year increase in subscription revenue reported in our Q2 earnings, we are well positioned with Meg’s experience and leadership to maintain momentum and take advantage of mounting cloud opportunities, adoption rates, and technological innovations.”

“Customers today need more than traditional endpoint and network security solutions that clearly don’t go far enough, given recent high-profile data breaches,” said Bear. “The cloud is an engine of innovation for many companies, and that requires new views on security. I look forward to working with the world class Imperva cloud teams to address those challenges, all with the aim of helping customers protect their business critical data and apps.”

Bear is also an Advisory Board Member at Unitive and Brand Amper, previously served as an Advisory Board Member at Storyvite, and has held advisory roles with many other organisations, such as Watermark. She graduated with a degree in Economics and Entrepreneurship from the University of Arizona. Clearly, Bear has a lot to offer Imperva, the leading provider of cyber security solutions in a business setting.

The post Imperva Inc.’s New Senior Vice President of Cloud Services appeared first on Cloud News Daily.

[session] Running QEMU/KVM Virtual Machines By @DomRodcom | @CloudExpo #Cloud

At first adopted by enterprises to consolidate physical servers, virtualization is now widely used in cloud computing to offer elasticity and scalability. On the other hand, Docker has developed a new way to handle Linux containers, inspired by version control software such as Git, which allows you to keep all development versions.
In his session at 17th Cloud Expo, Dominique Rodrigues, co-founder and CTO of Nanocloud Software, will discuss how, in order to also version QEMU/KVM virtual machines, they have developed a new tool called Vm_commit, which can create commits, back up any commit, roll back to a previous commit or rebase all the commits up to a given one for any running virtual machine, on the fly.


Preparing for ‘Bring Your Own Cloud’

In 2015, experts expect to see more sync and sharing platforms like Google Drive, SharePoint and Dropbox offer unlimited storage to users at no cost – and an increasing number of employees will no doubt take advantage of these simple to use consumer platforms to store corporate documents, whether they are sanctioned by IT or not, turning companies into ‘Bring Your Own Cloud’ free-for-alls.

How can IT leaders prepare for this trend in enterprise?
Firstly, it’s important to realise it is going to happen. This isn’t something IT managers can stop or block – so businesses need to accept reality and plan for it.

IT leaders should consider what’s really important to manage, and select a solution that solves the problem they need to solve. Opting for huge solutions that do everything isn’t always the best option, so teams should identify whether they need to protect data or devices.

Planning how to communicate the new solution to users is something to consider early, and partnering with the business units to deliver the message in terms that are important to them is an invaluable part of the process. The days of IT deploying solutions and expecting usage are long gone.

Using a two-pronged approach is recommended – IT managers should utilise both internal marketing and education to spread awareness about the benefits of the solution, and implement policies to set standards on what is required. Often end users aren’t aware that their organisation even has data security policies, and education can go a long way to getting compliance without being punitive.

What are the benefits of allowing employees to use these services to store corporate information?

The key benefits are mobility, increased productivity, improved user experience, and greater employee satisfaction and control.

What are the biggest implications for security?

The biggest implications for security involve the loss of valuable intellectual property and internal information such as financials and HR data, as well as data leakage, leading to privacy violations and loss of sensitive customer data. In addition, there are potential violations of regulatory policies for healthcare, financial services, and similar industries.

How can companies manage and control the use of these cloud storage apps when employees are using them in a BYOD environment?

In BYO use cases, companies should look for solutions that are focused on securing and managing data rather than devices. In a BYOD environment, IT managers can’t rely on the ability to lock down devices through traditional methods.

Instead, companies must be able to provide workspaces that have secure IT oversight, but also integrate with what is in the current environment.

Often the current environment has data in many places: file servers, private clouds, public clouds, etc. Choosing a data management solution that integrates with where the company’s data lives today will be more suitable than forcing data to be moved to a single location. This will reduce deployment time and give more flexibility later on to choose where to store the data.

How can organisations educate users and create suitable policies around the use of these tools?

Organisations should consider classifying corporate data. Does every piece of data need to be treated the same way?

Creating realistic policies that protect the company from real harm is so important, as is treating highly sensitive data differently from other data and training employees to know the difference.  Teams will also find it useful to integrate data security essentials into regular organisational onboarding and training programs, and update them as policies evolve.
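To make the classification idea concrete, here is a minimal sketch of what a tiered data-handling policy check might look like. The tier names and rules below are invented for illustration; they are not drawn from any particular product or the author's recommendations.

```python
# Hypothetical sensitivity tiers and handling rules (illustrative only).
SENSITIVITY_RULES = {
    "public":     {"cloud_sync_allowed": True,  "encryption_required": False},
    "internal":   {"cloud_sync_allowed": True,  "encryption_required": True},
    "restricted": {"cloud_sync_allowed": False, "encryption_required": True},
}

def may_sync_to_consumer_cloud(doc_tier: str, encrypted: bool) -> bool:
    """Return True if a document of this tier may live in a consumer sync service."""
    rule = SENSITIVITY_RULES[doc_tier]
    if not rule["cloud_sync_allowed"]:
        return False  # restricted data never leaves sanctioned storage
    return encrypted or not rule["encryption_required"]

print(may_sync_to_consumer_cloud("public", encrypted=False))     # True
print(may_sync_to_consumer_cloud("restricted", encrypted=True))  # False
```

Even a rule set this small forces the organisation to answer the key question in the text: which data genuinely needs stricter treatment, and which doesn't.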

How can companies find the most suitable alternatives to the free unlimited cloud storage users are turning to, and how do you convince employees to use them over consumer options?

The best solutions balance user experience for end users with robust security, management, and audit controls on the IT side. From a user experience perspective, companies should choose a solution with broad platform adoption, especially for BYOD environments. From a security perspective, choosing a solution that is flexible enough to provide secure IT oversight and that integrates with what you have today will stand the company in good stead. The last thing IT managers want to do is to manage a huge data migration project just to get a data security solution off the ground.

How can companies get around the costs and resources needed to manage their own cloud storage solutions?
Again, flexibility is key here. The best solutions will be flexible enough to integrate with what you have today, but also will allow you to use lower-cost cloud storage when you are ready.

What’s the future of the market for consumer cloud storage – can we expect their use to continue with employees?

Cloud storage in general isn’t going anywhere. The benefits and economics are just too compelling for both consumers and organisations. However, there is and has always been a need to manage corporate data — wherever it resides — in a responsible way. The best way to do this is by using solutions that deliver workspaces that are secure, manageable, and integrated with what businesses and consumers have today.

 

Written by Chanel Chambers, Director of Product Marketing, ShareFile Enterprise, Citrix.

Cloudera announces tighter security measures for Hadoop

Cloudera has announced a new open source project that aims to support real-time analytical applications in Hadoop, along with an open source security layer for unified access control enforcement.

Kudu, a new storage engine for Hadoop, aims to give developers more choice rather than limiting their options. Currently developers must choose between fast analytics with HDFS or the ability to update data with HBase; combining the two is, according to Cloudera, perilous for any team that tries, since both systems are highly complex.

Cloudera says Kudu eliminates the complexities involved in processes like time series analysis, machine data analytics and online reporting. It does this by supporting high-performance sequential and random reads and writes, enabling fast analytics on changing data.
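The trade-off Kudu targets can be illustrated with a toy sketch (this is not Kudu's API, just a conceptual model): a column-oriented table whose primary-key index allows in-place updates, while analytics still scan a single contiguous column.

```python
# Toy columnar table supporting both random upserts and sequential column scans.
# Purely illustrative of the HDFS-vs-HBase trade-off described in the article.
class ColumnarTable:
    def __init__(self, columns):
        self.columns = {name: [] for name in columns}  # column-oriented storage
        self.index = {}  # primary key -> row position, enables random updates

    def upsert(self, key, row):
        if key in self.index:
            pos = self.index[key]          # random write: update in place
            for name, value in row.items():
                self.columns[name][pos] = value
        else:
            self.index[key] = len(self.index)
            for name in self.columns:      # append a new row across columns
                self.columns[name].append(row[name])

    def scan(self, column):
        # Sequential scan touches one contiguous column, as in columnar analytics.
        return list(self.columns[column])

t = ColumnarTable(["host", "cpu"])
t.upsert("a", {"host": "a", "cpu": 10})
t.upsert("b", {"host": "b", "cpu": 90})
t.upsert("a", {"host": "a", "cpu": 55})   # changed data, updated in place
print(t.scan("cpu"))  # [55, 90]
```

Real Kudu does this at scale and on disk, of course; the point is simply that one store serves both access patterns that HDFS and HBase otherwise split between them.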

Cloudera co-authored Kudu with Intel, which helped it make better use of in-memory hardware and Intel’s 3D XPoint technology. Other contributors included Xiaomi, AtScale, Splice Machine and Zoomdata.

“Our infrastructure team has been working with Cloudera to develop Kudu, taking advantage of its unique ability to support columnar scans and fast inserts and updates to continue to expand our Hadoop ecosystem footprint,” Baoqiu Cui, chief architect at smartphone developer Xiaomi, told CIO magazine. “Using Kudu, alongside interactive SQL tools like Impala, has allowed us to build a next-generation data analytics platform for real-time analytics and online reporting.”

Meanwhile, a new core security layer for Hadoop has been launched. RecordService sits between Hadoop’s storage and compute engines and consistently enforces, across every Hadoop access path, the role-based access controls defined by Sentry. RecordService also provides dynamic data masking across Hadoop, protecting sensitive data as it is accessed.
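The combination of role-based access control and dynamic masking can be sketched as follows. The roles, column names and masking rule here are hypothetical, intended only to illustrate the concept rather than RecordService's or Sentry's actual interfaces.

```python
# Illustrative RBAC-with-masking sketch; role and column names are invented.
ROLE_GRANTS = {
    "analyst": {"readable": {"region", "revenue"}, "masked": {"customer_ssn"}},
    "auditor": {"readable": {"region", "revenue", "customer_ssn"}, "masked": set()},
}

def read_record(role, record):
    """Apply the role's grants to a record: mask, pass through, or drop columns."""
    grants = ROLE_GRANTS[role]
    result = {}
    for column, value in record.items():
        if column in grants["masked"]:
            result[column] = "***MASKED***"   # dynamic masking at access time
        elif column in grants["readable"]:
            result[column] = value
        # columns with no grant at all are dropped entirely
    return result

row = {"region": "EMEA", "revenue": 1200, "customer_ssn": "123-45-6789"}
print(read_record("analyst", row))
```

Because the check happens in one shared layer at read time, every engine that goes through it sees the same policy, which is the "unified enforcement" property the article describes.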

“Security is a critical part of Hadoop, but for it to evolve the security needs to become universal across the platform. With RecordService, the Hadoop community fulfils the vision of unified fine-grained access controls for every Hadoop access path,” said Mike Olson, co-founder and chief strategy officer at Cloudera.

Microsoft selects Ubuntu for first Linux-based Azure offering

Microsoft has announced plans to simplify Big Data and widen its use through Azure.

In a blog post, T K Rengarajan, Microsoft’s corporate VP for Data Platforms, described how the expanded Microsoft Azure Data Lake Store, available in preview later this year, will provide a single repository that captures data of any size, type and speed without forcing changes to applications as data scales. In the store, data can be securely shared for collaboration and is accessible for processing and analytics from HDFS applications and tools.

Another new addition is Azure Data Lake Analytics, a service built on Apache YARN that dynamically scales, which Microsoft says will stop people being side tracked from work by needing to know about distributed architecture. This service, available in preview later this year, will include U-SQL, a language that unifies the benefits of SQL with the expressive power of user code. U-SQL’s scalable distributed querying is intended to help users analyse data in the store and across SQL Servers in Azure, Azure SQL Database and Azure SQL Data Warehouse.

Meanwhile, Microsoft has selected Ubuntu for its first Linux-based Azure offering. The Hadoop-based big data service offering, HDInsight, will run on Canonical’s open source operating system Ubuntu.

Azure HDInsight uses a range of open source analytics engines including Hive, Spark, HBase and Storm. Microsoft says it is now on general release with a 99.9 per cent uptime service level agreement.

Meanwhile Azure Data Lake Tools for Visual Studio will provide an integrated development environment that aims to ‘dramatically’ simplify authoring, debugging and optimization for processing and analytics at any scale, according to Rengarajan. “Leading Hadoop applications that span security, governance, data preparation and analytics can be easily deployed from the Azure Marketplace on top of Azure Data Lake,” said Rengarajan.

Azure Data Lake removes the complexities of ingesting and storing all of your data while making it faster to get up and running with batch, streaming, and interactive analytics, said Rengarajan.

China Unicom and Telefónica in global data centre sharing agreement

Telefónica and China Unicom have agreed to share their international data centre capacity for multinational clients across Europe, the Americas and Asia.

The initial agreement covers three major data centres from each operator but, the companies say, this could be the first step towards larger scale cloud cooperation.

The pooling of resources means China Unicom can support customers seeking to expand into Europe and the Americas, while Telefónica can strengthen its proposition across Asia. The cloud computing aspect of the agreement includes IaaS (Infrastructure as a Service) with virtual servers and multi-cloud solutions.

China Unicom’s customers can now benefit from Telefónica’s data centre presence in Sao Paolo in Brazil, Miami in the US and Alcalá de Henares in Madrid, Spain. The extent of the partnership will expand as Telefónica commits additional investment towards new infrastructure, facilities and multi-cloud solutions, it says. Conversely, Telefónica can use the cloud capacity of China Unicom’s data centres located across China in Langfang, Shanghai and Chongqing. This, it says, means it can offer end-to-end service delivery for its multinational customers.

China Unicom’s data centre provides international connectivity services including virtual private networks, multi protocol label switching (MPLS) and Global LAN. Internet access for customers’ servers can be offered as an option to the standard service. Every China Unicom Point of Presence comes with ‘meet me’ room services. The mutual colocation service also offers local support for customer equipment.

Telefónica has a presence in 21 countries and a customer base of 329 million accesses around the world, with its major markets being in Spain, Europe and Latin America. Telecom operator China Unicom offers mobile broadband (WCDMA, LTE FDD, TD-LTE), fixed-line broadband, GSM, fixed-line local access, ICT, data communications and related services. It has a total of 439 million subscribers.