Multi-cloud gains more and more traction – but cost is still the key factor

(c)iStock.com/Serjio74

New research released by Turbonomic and Verizon has revealed several key reasons for businesses wanting to adopt a multi-cloud strategy, from business continuity to resilience – yet cost is still a burning issue.

The study, which polled 1,821 IT decision makers, saw the need for business continuity as the most important business driver for adoption of multi-cloud, cited by 77% of respondents, ahead of increased resilience (74%), reduction of operational expenditure (70%), and reduction of capital expenditure (69%). However, with regard to selecting vendors, 70% said pricing was the primary consideration, ahead of the 51% who opted for service level agreements or quality of service.

More than four in five organisations identified at least 12 different management issues they were facing, including balancing performance and cost (90%), delivering IT services to the appropriate budget (89%), and ensuring consistent application performance (86%).

Yet these aren’t the only challenges faced. 81% of those polled said choosing the right workloads for the right clouds is an issue, while 84% had problems with evaluating cloud vendors to meet business and technical requirements.

Charles Crouchman, CTO at Turbonomic – formerly VMTurbo – argued that the one factor which will change the mindset of organisations that look at the cloud as cost-first is ‘experience.’

“As the cloud moves from shadow IT to enterprise IT, the costs will become more transparent – in particular, the cost of assuring that your end users are getting what they expect from your service or brand,” he told CloudTech. “Smart companies understand that customer delight is the path to success – and focusing solely on cost is not a recipe for assuring customers are delighted with your service.”

Crouchman added that he wasn’t surprised by the main survey result – that organisations are facing so many management challenges – but that organisations need to look at cost versus performance from a different perspective. “The reality is that while price and cost impact your CFO and company, performance impacts your customer,” he said. “Without them you don’t have revenue, so you have to focus on all of the trade-offs.

“The reality of the impact poor performance or downtime has on your customer means that you will pay for performance no matter what – and it is in each cloud’s economic interest for you to pay as much as possible.

“So CIOs who believe that the cloud will invariably reduce CapEx or OpEx costs will soon face the reality of cloud bills that are larger than they expected,” Crouchman added.

Earlier this month, Google announced the acquisition of cloud commerce platform provider Orbitera, ostensibly with multi-cloud strategy in mind.

Gearing up for #DigitalTransformation | @CloudExpo @PagerDuty #IoT #ML

Modern organizations face great challenges as they embrace innovation and integrate new tools and services. They begin to mature and move away from the complacency of maintaining traditional technologies and systems that only solve individual, siloed problems and work “well enough.” In order to build…

The post Gearing up for Digital Transformation appeared first on PagerDuty.

read more

GreenPages’ Chris Williams Hosting #vBrownBag AWS Exam Prep Webinar

GreenPages Enterprise Consultant Chris Williams will be hosting a webinar on August 24th to help IT professionals pass the AWS Solutions Architect Associate exam. The series is brought to you by #vBrownBag and follows the official certification blueprint published by Amazon. If you’re thinking of taking this AWS exam, you don’t want to miss out!

The webinar is titled Domain 1.0 Part 4 – Designing highly available, cost-efficient, fault-tolerant, scalable systems. In the webinar Chris will cover:

  • Hybrid IT architectures
  • Direct Connect
  • Storage Gateway
  • VPC
  • Directory Services


To learn more about the webinar and to register, visit the #vBrownBag site!


About #vBrownBag

vBrownBag is a community of IT infrastructure professionals who help others in the industry perform their jobs at a higher level by providing helpful resources and advice through podcasts, TechTalks, and other media.

Brexit? What Brexit? Datapipe acquires UK-based managed cloud firm Adapt

(c)iStock.com/Jonathan Maddock

US cloud provider Datapipe has announced it is acquiring Britain-based managed cloud firm and AWS partner Adapt in an attempt to expand the firm’s European presence.

The firms took the unusual step of issuing two separate press releases to cover the same news. From the Adapt side, the company noted the move to Datapipe was a ‘non-disruptive addition’ due to the firm’s similar roadmaps and propositions.

“Adapt and Datapipe both have cultures that focus on proactive, high touch customer service and a commitment to customer-specific solutions designed to meet clients’ individual business challenges,” said Robb Allen, Datapipe CEO. “Our similar approach to guiding clients on their cloud journey makes the acquisition a natural fit for us and will increase our scale and service capabilities in the United Kingdom, and the broader European market.”

Looking at the acquisition in a wider context, however, it shows demand for UK tech businesses in a post-Brexit landscape. Recent research and analysis has been less than kind to Blighty after the referendum vote to leave the European Union in June. A note from consulting firm BroadGroup earlier this month argued that while demand for European data centre space continues to rise, Brexit could send UK ambitions off-kilter, although Amazon Web Services (AWS) recently confirmed its plans for a UK data centre remain on track.

It was a similar story from analyst house IDC in July, praising EMEA infrastructure but warning of a UK downturn. This sentiment was echoed by Adapt CEO Stewart Smythe. “We are seeing emerging customer requirements for a tactical and strategic presence overseas, so it makes sense for us to advance the UK’s capability in a global market rather than create more bulky domestic organisations,” he said.

“UK-only consolidations in our space can get very messy and can be short-sighted. We have chosen a far more exciting path.”

Financial terms of the deal were not disclosed.

AWS and Azure get the highest federal security rating: What happens from here?

(c)iStock.com/Kevin Smart

Cloud services have been able to store customers’ data for many years now, but the number of prospective clients for several vendors has recently increased dramatically.

Back in late June, the announcement was made that three vendors had received special certifications from the federal government, allowing them to store sensitive data that the government had on hand. Two of those providers are among the most popular within the cloud market, Amazon Web Services (AWS) and Microsoft Azure, while the third is CSRA’s ARC-P IaaS, a vendor that might not be as universally known as the others but still carries enough weight for those in the know. The news was certainly noteworthy for those providers, but it also has tremendous implications for federal agencies as well as the cloud market as a whole.

The federal government is no stranger to storing data using other cloud vendors. The majority of cloud providers knew this early on and even provided special services specifically tailored for government needs. Back in 2011, for example, Amazon launched its AWS GovCloud. Microsoft also started a similar service. The idea was for government agencies to use the cloud to store data of various types. Thousands of government customers quickly hopped on board with the idea, but as helpful as the service was, the most sensitive information the government had still couldn’t be placed on the cloud, at least not until rigorous security standards were met.

A newly created list of requirements, called the High Baseline, was drawn up under the Federal Risk and Authorization Management Program (FedRAMP). It contains more than 400 standards set by the National Institute of Standards and Technology that cloud providers have to meet in order to receive certification allowing them to store sensitive government data. That’s the goal providers like Amazon and Microsoft have been working toward – qualifying for what would be a new influx of government customers. Those providers could already compete for about half of the federal government’s IT spending; the other half could be opened to them if their security standards were raised. Based on the announcement, the government says the three named vendors have cleared the bar.

So what types of data will the government be able to store in the cloud? The announcement states that the data in question is considered highly sensitive but unclassified. This includes but is not limited to things like financial data from agencies, patient medical records – likely from healthcare programs run by the government – and even law enforcement data. Data of this sort is considered high-impact because its leak or theft would be a great detriment to an agency’s operations, clients, resources, or employees.

The announcement is certainly big news for government agencies as being able to store government data securely will make their jobs much easier. But that’s not the only area that will benefit greatly. The cloud market itself stands to gain quite a bit from the news. For many years, as the cloud has picked up steam and grown in popularity, those organisations that have been reluctant to transition to cloud services have often cited concerns over cloud security.

It’s not easy to hand over potentially sensitive information to a third party, and many executives wanted to know their data would be in good hands. Cloud providers have answered many of those concerns with greater attention being placed on security improvements, but doubts persisted among businesses of all types. With the news that the federal government is willing to trust some cloud providers with highly sensitive data, it shows a great degree of confidence being placed in vendors’ ability to protect information. This can lead to more companies turning to the cloud for their storage needs.

Security standards from the government have now been met by some of the most popular cloud providers, allowing them to house sensitive government data. With the standards established, we may see other providers work hard to improve their own security so that government agencies can consider them for data storage purposes. Considering how competitive the cloud market is, it would be a wise move for providers to ensure they can work with as many clients as possible.

Read more: AWS and Microsoft get FedRAMP approval for sensitive cloud data

Upgrade now to Parallels Desktop 12 for Mac!

We are so happy to introduce the latest version of Parallels Desktop for Mac and give our amazing Parallels Desktop 10, 11, and Pro Edition users first dibs on upgrading to our new edition. Welcome to Parallels Desktop 12 for Mac! We’ll have more blog posts this week detailing all you need to know about upgrading […]

The post Upgrade now to Parallels Desktop 12 for Mac! appeared first on Parallels Blog.

[session] Machine Learning – It’s All About the Data | @CloudExpo #API #Cloud #MachineLearning

Data is the fuel that drives the machine learning algorithmic engines and ultimately provides the business value.
In his session at Cloud Expo, Ed Featherston, a director and senior enterprise architect at Collaborative Consulting, will discuss the key considerations around quality, volume, timeliness, and pedigree that must be dealt with in order to properly fuel that engine.

read more

A CIO writes (sort of): I’m terrified of cloud lock-in – what should I do?

(c)iStock.com/kuarmungadd

“Dear Deidre,

I’m stuck in a loveless marriage.

I’m in a three-year relationship which I don’t think I can get out of. It started off great, but now I’m getting bled dry and I don’t have the freedom I thought I would have. What should I do?

Regards,

C.I. O’Really.”

Some of you will be suffering from the same concerns as poor Mr. O’Really. Over the last two years it seems like every organisation has been enthusiastically walking down the aisle with whichever cloud vendor had the best deal at the time.

In this ‘heartache column’, I want to look at the inhibitors of cloud adoption and the factors driving the fear of cloud lock-in, then talk through the two crucial steps users can take to get the best of both worlds – the business velocity provided by the cloud, without the risk of locking themselves into a specific vendor.

Our own Cloud Brief research has found 82% of respondents are strategically using or evaluating the cloud today. So they should: cloud computing is helping businesses build better services more quickly and at lower cost. Better, quicker, cheaper. It’s what we’re all looking for in a relationship with the vendors we love the most.

The number one driver for cloud adoption is agility – the need to roll out new applications faster. This was reinforced at a recent meeting I had in London with developers from a leading global financial institution. They complained it takes three months for hardware supporting a new project to be procured, installed, racked, and stacked. Clearly unacceptable in today’s hyper-competitive market governed by agile development, continuous integration, and elastic scaling.

Where did it all go wrong?

The cloud is providing serious value, but hitching your wagon to one provider could present serious competitive disadvantage. Another cloud vendor could make all sorts of changes to become more appealing. New services, features or region availability are all changes that could give your competitors an advantage.

So what form does that lock-in take? It’s not about the hardware, operating systems and software of the past; instead it’s about APIs, services and data. The underlying infrastructure – compute, storage and networking – is largely a commodity and can be exchanged between cloud providers. But as we move up the stack, the APIs and the data these services exchange become less portable.

There are a number of potential friction points: security, management, continuous integration pipelines and container orchestration. That’s not all, though – businesses also need to consider their content management, search, databases, data warehouses, and analytics.

It’s the data management services which cause most concern. You may have heard the term ‘data gravity’. It was coined a few years ago, but it has real resonance today. As you’ll remember from fourth form physics, as an object’s mass increases, so does the strength of its gravitational pull. Well, the same is true for data: the more data you have in a specific location, the harder it is to move.
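A rough back-of-the-envelope sketch makes the point. The figures below (link speed, utilisation) are illustrative assumptions, not from any vendor, but they show how migration time balloons with data volume:

```python
# Illustrative sketch: estimated time to move a dataset out of a cloud over a
# fixed-bandwidth link. The 10 Gbit/s link and 70% utilisation are assumptions
# chosen purely to demonstrate the "data gravity" effect.

def migration_days(dataset_tb, gbit_per_sec=10, utilisation=0.7):
    """Estimate days to transfer dataset_tb terabytes at the given link speed."""
    bits = dataset_tb * 8 * 10**12                        # TB -> bits (decimal)
    seconds = bits / (gbit_per_sec * 10**9 * utilisation) # effective throughput
    return seconds / 86400

for tb in (1, 100, 10_000):
    print(f"{tb:>6} TB -> {migration_days(tb):8.2f} days")
```

A terabyte moves in minutes; ten petabytes ties up the same link for months – before you even count egress charges, which most providers bill per gigabyte transferred out.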

So you’ve got data gravity tugging on your heartstrings and you’re staring down the barrel of being locked-in to an endless stream of unfulfilling and expensive date nights. What are the options?

Open your heart

There are no easy answers, especially if your organisation needs the simplicity and convenience of an “as-a-service solution”. However, there are two things you can do to greatly improve your options.

The first step is to find a service that can be run on multiple public cloud platforms. Would you buy a car that can only be driven on 30% of roads? Don’t buy a service that only works on Amazon, Google or Microsoft.

Second, find services which have open source alternatives. That way, if you tire of your current vendor relationship, you can give them the heave-ho, download the same software, and run it yourself anywhere. Having an open source choice means you can run your deployment on another cloud or on your own private infrastructure. Going from an as-a-service model to running it yourself does require planning, but it gives you the ultimate freedom.
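In practice, keeping that freedom usually means coding against a neutral interface rather than calling one vendor’s SDK throughout your codebase. The sketch below is a minimal illustration of the idea – all names are hypothetical, and a real project would plug in adapters for its actual providers:

```python
# A minimal sketch of a vendor-neutral storage interface. The names here
# (ObjectStore, InMemoryStore, save_report) are illustrative, not from any
# real SDK; the point is that application code never touches a vendor API.

from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Contract the application codes against, regardless of backend."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend for the sketch; real adapters would wrap each vendor."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def save_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Business logic depends only on the interface; swapping clouds (or an
    # open source alternative you host yourself) means swapping one adapter.
    store.put(f"reports/{name}", body)

store = InMemoryStore()
save_report(store, "q3.txt", b"quarterly figures")
print(store.get("reports/q3.txt"))
```

The abstraction isn’t free – you give up vendor-specific features – but it is what turns “we’re moving clouds” from a rewrite into a configuration change.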

To demonstrate why both of the above points matter, take a look at Comparethemarket.com. The largest price comparison site in the UK made the switch from managing its own on-premises infrastructure to Amazon Web Services (AWS). As part of that move, the IT team considered the AWS DynamoDB NoSQL database service. However, concerns around exposing itself to excessive AWS control led Comparethemarket.com to eliminate DynamoDB as an option.

So our advice to C.I. O’Really is blunt: get a divorce. If your vendor isn’t treating you right then get one that is. Find a solution that is cloud-agnostic and that has an open source version available. These two points will go a long way to giving you the joy of the cloud, along with the freedom. They may seem charming but please, just date your cloud provider, don’t marry them.

Editor’s note: Dear Deidre is a well-known UK agony aunt column in The Sun newspaper. Other agony aunts are also available.

Sponsor Big Data at @CloudExpo | #BigData #IoT #DigitalTransformation

Cloud computing is being adopted in one form or another by 94% of enterprises today. Tens of billions of new devices are being connected to the Internet of Things.
And Big Data is driving this bus. An exponential increase is expected in the amount of information being processed, managed, analyzed, and acted upon by enterprise IT. This amazing growth is not part of some distant future – it is happening today. One report shows a 650% increase in enterprise data by 2020. Other estimates are even higher.
Big Data at Cloud Expo Silicon Valley is the place where you can see the technologies and use cases that are delivering Big Data to enterprise IT.

read more