All posts by Cloud Pro

Q&A: UK Cloud Awards judge Mitchell Feldman


Cloud Pro

4 Mar, 2020

Please could you tell us a little bit more about who you are and what you do?

I have been in the IT industry for 20-plus years with 10-plus years in the cloud space. As part of Hewlett Packard Enterprise (HPE), my role is to promote and amplify the amazing work we do in hybrid cloud.

I am creative by design and I’m never happier than when I am building content that wins the hearts, minds and, where possible, the souls of our audience.

How would you describe the UK Cloud Awards in three words?

Forward-thinking, inspirational, prestigious

What appealed to you about becoming a judge for this year’s UK Cloud Awards?

The UK Cloud Awards have been very kind to me as a previous winner, but more so I love their passion for creating a better industry.

What are you most looking forward to about being involved in this year’s awards?

I can’t lie, I love reading the entries. It’s fascinating to learn about how businesses are challenging the status quo and creating amazing new outcomes by leveraging the cloud. This is digital transformation at its best.

This year’s awards have had a bit of a makeover, with new categories and some other tweaks. Tell us why people should be getting excited about all of that, and about the awards in general?

Winning an award at this event will bring more success than the award itself. Winning (and even being a runner-up) will showcase your business in front of some of the most important people in the industry. It’s win-win.

Do you have a category/categories you’re most excited about?

The geek in me makes me gravitate to transformational projects so it has to be Internet of Things (IoT) Project of the Year.

What are you looking for when you’re reading an entry? How can people make sure theirs stands out?

Video is king! For me, the more you invest in your entry, the greater chance you have of showing the judges just how good you are. I have seen amazing use cases fail due to a low-quality entry. Invest in this initiative like it’s the best customer you could ever win; it will pay dividends for years to come.

What would you say to those thinking about entering but haven’t fully decided to do so as yet?

Why wouldn’t you want to make your business more famous?

Do you have a standout cloud moment from 2019?

The industry changing its narrative and coming to the realisation that we live in a hybrid cloud world.

What are your top three cloud predictions for 2020?

Containerisation will continue to dominate.

AI use cases will be more pervasive than ever before.

Social media platforms will have more accountability to protect society.

Is there anything else you would like to add?

Third-party endorsement of your business’s success (i.e. winning an award) is one of the most powerful marketing tools you will ever have. Take advantage of this amazing opportunity to raise the profile of your business and become the one that everyone else aspires to be.

Q&A: UK Cloud Awards judge Anthony Hodson


Cloud Pro

3 Mar, 2020

Please could you tell us a little bit more about who you are and what you do?

I am an AWS Solution Architect and Consultant working for the Managed Service Provider (MSP) Ensono, which serves mid-tier to large enterprises. I help enterprises understand the opportunities the cloud offers, and then move workloads into, or grow workloads within, AWS’ cloud. My background spans traditional managed hosting, DevOps tooling and advisory, and Fintech. I enjoy seeing the progress that cloud can bring.

How would you describe the UK Cloud Awards in three words? 

Commending Cloud Creativity

What appealed to you about becoming a judge for this year’s UK Cloud Awards? 

In addition to the splendid party, I wanted to hear about the innovations and results that have been delivered in 2019. I can then use this to inspire those I speak with in my day job.

What are you most looking forward to about being involved in this year’s awards?

The quality of submissions last year was high. I’m looking forward to more stories of innovation and collaboration and I’m hoping for some that don’t just take a bigger share of the market but make the market bigger.

This year’s awards have had a bit of a makeover, with new categories and some other tweaks. Tell us why people should be getting excited about all of that, and about the awards in general?

This year we’re focusing less on products (three categories) and more on successful projects (ten categories) and recognition for outstanding contributions (five categories). This means that being recognised is about what people managed to achieve. We want to recognise those who’ve made the biggest and most innovative strides forward; after all, productisation can only come after a successful pilot.

Do you have a category/categories you’re most excited about? 

The Internet of Things (IoT) is an area where we see technology enter our day-to-day world. This year we have a new category, ‘IoT Project of the Year’. I hope to learn how these sensors can be used to improve life on the spinning rock we call home.

What are you looking for when you’re reading an entry? How can people make sure theirs stands out? 

When I read an entry I’m looking for a good story: what was the problem and how did you know there was one? Why should we care? What challenges did you have to overcome? What roles were played and who made up the team? Finally, what measurable outcome was there and what lies ahead in the sequel? Sending marketing material lifted off the website usually does not achieve this.

What would you say to those thinking about entering but haven’t fully decided to do so as yet?

Putting together an entry that explains what you’ve created, why it was hard, what outcomes were achieved and for whom is the basis of many a good sales pitch. Crystallising this into a concise and moving piece will not only offer the chance of industry recognition, it’ll also arm you internally to win more backing for future projects, thereby extending the ‘DevOps ripple of progress’.

Do you have a standout cloud moment from 2019?

Personally, I particularly enjoyed my time with the CIF and the Containers and Functions as a Service webinar. From a technology perspective, I was excited to see AWS release its ‘serverless’ Kubernetes offering, AWS Fargate for EKS, in doing so taking out more ‘undifferentiated heavy lifting’ and shortening the cycle from idea to delivery.

What are your top 3 cloud predictions for 2020?

1. AWS will provide a fully integrated Disaster Recovery as a Service, manifested through a check-box in the console (or the API, of course).
2. Hyperscale providers will continue to hope their customers align with a single hyperscale cloud; the market (driven by compliance and risk mitigation) will move towards multi-hyperscale cloud, with the savviest using Kubernetes to do so. Google will take the ground as the secondary site for these differentiating workloads (being the Kubernetes mothership).
3. AWS CEO Andy Jassy’s love for vintage rock will spill out of the AWS re:Invent keynote into the re:Play party, with the Eagles returning from retirement…

Is there anything else you would like to add?

All too often in technology, I see great work where it’s hard to prove it made a measurable impact. When you start a project (or write about a successful project), find some data that gives a baseline: ideally quantitative, but it could be qualitative (even surveys). For advice and inspiration, read (or listen to) Nicole Forsgren, Jez Humble and Gene Kim’s book ‘Accelerate’. Note there are free chapters on Google Books.

Q&A: UK Cloud Awards judge Rob Lamb


Cloud Pro

2 Mar, 2020

Please could you tell us a little bit more about who you are and what you do?

My role at Dell Technologies is to bring industry expertise and transformation experience to help customers achieve key business outcomes in times of big change. My aim is to counsel them on how they can accelerate their IT transformation while balancing the need for consistent delivery and helping drive the cultural change associated with such initiatives – the magnitude of cultural and operating model change is often underestimated.

How would you describe the UK Cloud Awards in a nutshell?

The opportunity for people to receive industry recognition for their efforts and initiatives.

What appealed to you about becoming a judge for this year’s UK Cloud Awards?

The quality and breadth of the entries last year was fantastic, and I thoroughly enjoyed reading and reviewing them.

What are you most looking forward to about being involved in this year’s awards?

The new categories are exciting and I’m looking forward to reading the entries.

This year’s awards have had a bit of a makeover, with new categories and some other tweaks. Tell us why people should be getting excited about all of that, and about the awards in general?

Now in their seventh year, the UK Cloud Awards celebrate the diversity, innovation and excellence of entries across 20 categories, and will provide entrants with a showcase for their efforts. The new categories really broaden the appeal of the awards.

Do you have a category/categories you’re most excited about?

I am really looking forward to the new people-centric categories, and especially the Positive Action Award.

What are you looking for when you’re reading an entry? How can people make sure theirs stands out?

Make it real – talk about tangible business outcomes – then prove them. It mustn’t be technology for technology’s sake; there must be a positive impact. Don’t play down the challenges: we all know they happen, so don’t gloss over them.

Talking about the challenges can bring your story to life. Short, sharp and punchy catches the eye.

What would you say to those thinking about entering but haven’t fully decided to do so as yet?

What are you waiting for? If you’re proud of a project, if it had a real business outcome and made a difference, then why aren’t you writing it up and submitting it?

Do you have a standout cloud moment from 2019?

I think for me it has been the realisation by the industry, especially customers/consumers, that a single cloud isn’t the answer and that multi-cloud is going to be of greater importance in enterprises’ strategies in order to address all their workloads.

What are your top three cloud predictions for 2020?

1) 2019 saw the realisation that multi-cloud is the industry’s direction of travel. This, along with edge computing, will continue to be the aspiration for customers in 2020.
2) Security will continue to be a significant focus. Recent breaches have brought attention to the challenge of securing apps and data in a multi-cloud world.
3) The way this evolving cloud landscape is administered will change fundamentally.

Q&A: UK Cloud Awards head judge Jez Back


Cloud Pro

24 Feb, 2020

For the past six years, the UK Cloud Awards has been celebrating the best and brightest of the cloud industry’s talent, looking at the projects, products and people that keep this fast-moving sector progressing at pace. 

In that time, the event has seen impressive entries from leading cloud vendors and startups alike, including the likes of Red Hat, Oracle NetSuite, Dropbox and more, with an even more expansive list of entrants expected for this year’s ceremony. The man with the unenviable task of overseeing the selection process to determine a shortlist from all these nominations is Jez Back, who returns as head judge for the second year in a row.  

We sat down with Back to get the low-down on everything you need to know about the UK Cloud Awards, as well as his thoughts on the broader cloud industry.

Please could you tell us a little bit more about who you are and what you do?

I like to describe my situation as having three jobs. I am the Managing Director of Erebus Digital; we’re a small, boutique consultancy that focuses on client business outcomes using technology. Erebus Digital offers three main strands in its portfolio: Digital Transformation, Technology Cost Optimisation and Digital Design Services.

My second job is as a board director for a Rugby Club where I lead the strategy, marketing and communications teams. Finally, I am the head judge for the UK Cloud Awards! I often remind myself that I have a family as well.

How would you describe the UK Cloud Awards in three words?

Credibility with Integrity.

This is now your second year as head judge – congratulations and welcome back! What made you want to do it all over again?

Getting to see what people are doing in the market, hearing their stories and seeing how positive outcomes occur is something I really enjoy, so it was a bit of a no-brainer to come back.

I also get to work with some of the brightest and best our industry has to offer as my colleagues on the judging panel. I come away knowing a little more every time I have a conversation with any of them – that’s another great reason.

Finally, to work with my colleagues in the Cloud Industry Forum & Dennis Publishing to promote what our country has to offer in the cloud market is a great privilege and one I take really seriously.

You bring a great breadth and depth of experience to the judging panel in your role. What are you most looking forward to about this year’s awards?

The stories that people have to share. I love to see a great story on how positive outcomes for clients are achieved when I read the entries. I love to see old friends in the industry and make new ones at the awards night as I discover what they have been doing, too. Essentially, it’s the people!

This year’s awards have had a bit of a makeover, with new categories and some other tweaks. Can you share details of the key changes and why they’ve been made?

Obviously, there has been a bit of a shift and I think it very much reflects an outlook that I discussed earlier. It is much more about outcomes and less about the products themselves. That’s why we have reduced the number of product categories and have increased the number of project ones, whilst also adding new people awards.

Given those tweaks, what advice would you give to a) those entering for the first time and b) those returning for another year either to maintain their crown or win one of the accolades for the first time?

I would give three main pieces of advice. Firstly, read the entry criteria carefully. Judges want to give the points, but entrants often lose them by simply not answering all of the criteria in the entry. Secondly, provide evidence for everything you say in the entry, backed up with client testimonials. That gives your entry more credibility. Finally, don’t sell the product or service to us by adding marketing brochures – sell the product and service through the stories and outcomes in the entry!

When you’re reading through an entry, what are you looking for and is there anything that will really pique your interest?

It is always about the tangible outcomes; that’s what I enjoy the most. The stories of positivity that are backed up with testimonials from delighted customers.

What would you say to those thinking about entering but haven’t fully decided to do so as yet?

Do it – celebrate the achievements of you and your teams, and tell the world about the difference you make.

Do you have a category that is particularly close to your heart?

It has to be the People category; seeing what achievements people have made and celebrating them is a major reason why I love being Head Judge.

We’re now in 2020 – how do you think the cloud landscape has changed since the beginning of the decade? What are the key trends and challenges that remain front of mind for you?

Wow, this is a big question and one I could talk for hours on. For me, I would summarise the change in two main areas. Firstly, the evolution and maturation of many of the major providers in the cloud market – look at Microsoft’s transformation as a really obvious example. Second, the acceleration of business value that has come from cloud to businesses both big and small.

That means that the key trends and challenges that lie ahead for the industry will be to tackle the question of cloud security once and for all; it truly bothers me that those perceptions still exist. I also see the major providers doing a lot more co-operation and forming more alliances as their service offerings seek to become more powerful. I’m seeing an increased demand for Cost Management and Optimisation of cloud spend, especially in managing compliance spend on software licensing in Hybrid and Multi-Cloud. There are more, but these are the ones that are at the forefront of my mind.

Do you have a standout cloud moment from 2019?

Clearly, it has to be being the head judge for last year’s UK Cloud Awards!

What are your top 3 cloud predictions for 2020?

Cost optimisation will become more important, the major providers will do more alliances and partnerships and finally, Service Meshes will become the next battleground.

Is there anything else you would like to add?

If you’ve got a cloud achievement that you want to celebrate, submit your entry today!

IT Pro 20/20: What the year ahead holds for technology


Cloud Pro

31 Jan, 2020

Welcome to the first issue of IT Pro 20/20, a brand new digital magazine that brings all of the month’s most important tech issues into clear view.

Each month, we will shine a spotlight on the content that we feel every IT professional should be aware of, only in a condensed version that can be read on the go, at a time that suits you.

This month’s issue is all about the year ahead. We put to you our predictions of what the industry is likely to face over the next twelve months, including the technology likely to dominate news headlines.

We’ve also got a handful of exclusive articles for you that you won’t find online:

  • First, we take a look at whether 2020’s job candidates really need a degree to get ahead in the IT industry.
  • We’ve also commissioned our own postmortem examination of the now sadly departed Windows 7 to see what made it so successful.

DOWNLOAD THIS MONTH’S ISSUE OF IT PRO 20/20 HERE

We hope you enjoy reading this month’s issue. If you would like to receive each issue in your inbox as they release, you can subscribe to our mailing list here.

The next IT Pro 20/20 will be available on 29th February.

Announcing the launch of IT Pro 20/20


Cloud Pro

20 Jan, 2020

We understand that as IT professionals, your free time is incredibly valuable. You need to be able to figure out what stories truly matter and what trends are worth your attention, and do so quickly and with confidence. While we take immense pride in the content that we create online, we recognise that a website can be too mercurial for some of our readers.

It’s for this reason that we decided to launch our very own podcast at the end of last year, helping to diversify the types of content we deliver and offering a whole new way of consuming the latest industry stories.

Well, we’re not stopping there. Today, we’re excited to announce the launch of ‘IT Pro 20/20’, a digital magazine that shines a spotlight on the content that we feel every IT professional should be aware of, only in a condensed version that can be read on the go, at a time that suits you.

Each issue will act as a snapshot of the month gone by, serving as an essential guide to the industry’s most important developments and trends, some of which may have been overshadowed by transient news headlines. What’s more, IT Pro 20/20 will also feature exclusive content, including features and analysis, that has yet to be published on our website. We want to provide IT decision-makers with a clear view of the month gone by, and a glimpse of what issues could shape their roles in the months and years to come.

The magazine allows us to showcase our content in a way that we’ve not been able to previously, providing additional context and insight in a format that works on any device, whether online or offline. It’s perfect for those that enjoy a long read, whether that’s on the commute to or from work, or during a lunch break.

IT Pro 20/20 will be available to download for free, released on the last day of every month, starting from 31st January. As each issue will be reflecting the main themes of each month, we thought it would be best to start with a look at what this year is likely to hold for tech, and a look at some of the key issues facing IT decision-makers. As we progress through the year, we’re also going to major on certain key events, so keep an eye out for those.

It’s the same high-quality content that you’ve come to expect from IT Pro, only in a format that complements your hectic schedule.

Head over to our sign up page on our sister site to register for a free issue of IT Pro 20/20, delivered on the last day of each month.

Revamped UK Cloud Awards 2020 now open for entries


Cloud Pro

20 Jan, 2020

The Cloud Industry Forum (CIF) and Cloud Pro are excited to announce that the seventh annual UK Cloud Awards is open for business – with a revamped list of categories and a new location.

Aiming to celebrate the very best successes in digital transformation, diversity, innovation and customer experience, the 2020 UK Cloud Awards is seeking entries for its 20 new-look categories.

Organisations have between now and 20 March to submit an entry to the 2020 UK Cloud Awards through its website, with the shortlist announced on 28 April. The final awards ceremony will then be hosted on 4 June at the Brewery, in London.

“Back for a seventh year, the UK Cloud Awards continue to evolve,” said CEO of CIF, Alex Hilton. “We have completely reshaped them, with a fantastic new location and twenty transformational awards that reflect the pace of change and innovation in the UK industry.

“Our emphasis for 2020 is recognising people, projects and innovative companies that are making an impact with their customers. We are thrilled to continue to be involved with shaping the cloud industry.”

The awards are no longer broken down into categories and subcategories, and the list has been expanded from the 17 categories organisations could enter last year.

This beefed-up list in 2020 includes project-based awards for excellence in particular areas, such as FinTech Project of the Year, Big Data/Analytics Project of the Year, and DevOps Project of the Year.

There will also be awards recognising cloud innovation, including Cloud Product Innovation of the Year and Cloud Security Innovation of the Year.

All entries will be assessed and scrutinised by a panel of expert judges, spearheaded by returning head judge Jez Back, managing director of Erebus Technology Consulting.

Joining him are a string of recognised experts, including Splunk chief technology advocate Andi Mann and industry analyst Jon Collins, among other widely recognised industry figures.

“I am delighted to be back once again as Chair of Judges, for what really has become one of the most credible, stand-out events in the UK’s thriving cloud industry,” Back said.

“This year the awards are bigger than ever, with a new list of twenty categories available to enter, covering AI, DevOps, partnerships, cloud project implementation and more.

“We aim to celebrate diversity and talent, and through the UK Cloud Awards recognise the leading figures at the forefront of this innovation.”

The full list of award categories is as follows:

  • Cloud Product Innovation of the Year
  • Cloud Security Innovation of the Year
  • Cloud Collaboration Platform of the Year
  • SaaS Implementation Project of the Year
  • Internet of Things Project of the Year
  • FinTech Project of the Year
  • Big Data/Analytics Project of the Year
  • Machine Learning/Artificial Intelligence Project of the Year
  • DevOps Project of the Year
  • Public Sector Project of the Year
  • Enterprise Project of the Year
  • SMB Project of the Year
  • Digital Transformation Project of the Year
  • Cloud Start Up Company of the Year
  • Cloud Growth Business of the Year
  • Cloud Technology Positive Action Award of the Year
  • Cloud Team of the Year
  • Cloud Channel Partner of the Year
  • Cloud Leader of the Year
  • Newcomer of the Year

“The UK Cloud Awards have gone from strength to strength since their creation,” said Dennis group editor and editorial director of B2B, Maggie Holland.

“I remain as excited about the event and everything it stands for now as I was from day one. As a journalist writing about cloud computing, I get to see first-hand just how much talent and innovation this industry has to offer.

“Yet, as a judge, I am always still surprised and delighted by how the bar keeps on being raised.

“I can’t wait to see the entries for this year’s awards and to be able to help those shortlisted and the winners, as well as the great and good of the cloud industry, celebrate that success at the awards event night itself.”

Why ‘lift-and-shift’ is an outdated approach


Cloud Pro

17 Jan, 2020

There are many reasons why a business might consider moving some or all of its applications to public cloud platforms like AWS, Azure or Google Cloud. One of the most compelling, however, is the cloud’s ability to reduce the complexity of an organisation’s IT. Removing the need for management of the physical infrastructure that applications run on can yield big benefits for simplification.

Running your applications in the cloud allows you to take advantage of a new compute economy and scale the capacity up and down as needed, as well as letting you hook into a huge range of additional tools and services. While this is well and good for building new applications, porting pre-existing legacy apps over to cloud-based services can often prove challenging.

Organisations that want to migrate pre-existing workloads to a public cloud are faced with a choice: do they re-architect their applications for cloud, or do they simply attempt to port it to their chosen platform wholesale, with no alteration? For many companies, the latter approach – known as the ‘lift-and-shift’ method – initially sounds like the more attractive option. It allows them to get into the cloud faster, with a smaller amount of work, meaning the IT team has more time to devote to other elements of the migration or to developing entirely new capabilities.

Sadly it’s not quite as simple as that. While some applications can be moved over fairly seamlessly, not all apps are suited to this method. Compatibility is the first issue that companies are liable to run into with lift-and-shift; particularly when dealing with legacy applications, there’s a good chance the original code relies on old, outdated software or defunct libraries. This could make running that app in the cloud difficult, if not impossible, without modification. Organisations also misinterpret the business continuity options available in public cloud and sometimes assume the options are the same as their on-premises counterparts.

“In a lot of cases with server-side applications, they’re not delivered and packaged as well as workspace applications are on an end-user’s desktop,” says Lee Wynne, CDW’s Public Cloud Architecture Practice Lead, “so finding somebody who actually installed the application on the server in the first place can be difficult.”

This, Wynne points out, along with a lack of documentation and issues with upgrading the original OS that a virtual machine runs on, can prove “very costly and time consuming” when trying to port legacy applications to the cloud with an old OS. In terms of business continuity, Wynne says:

“It can take a fair amount of explaining that in the public cloud domain, the ability to move machines from host to host with zero downtime across availability zones isn’t really a thing, therefore if you are moving a critical business workload from your current data centre that is highly protected by various VMware HA features, you need to consider how that will remain online through availability zone outages. In other words, you have to architect for failure”.

Cost modelling is also a critical component, Wynne says, and organisations need to make sure that the cost modelling they’re doing is an accurate representation of what their actual usage will look like.

“The accuracy element of cost modelling is really critical when you’re assessing at scale. You’re not just assessing a couple of VMs, you’re assessing a whole data centre or a few thousand; you’ve got to be accurate with the costs, and you’ve got to be able to get the instance types that are displayed during those accurate cost assessments.

“Therefore picking the tooling and the right telemetry at the beginning, and getting those costs accurate for your business case, is probably one of the first risks that you’ll come across with a cloud migration. Otherwise, you just end up making it three times more expensive than it actually is, and therefore providing executives and decision makers with the wrong information.

“If you think way back when we went from physical servers to virtual servers, no one did an as-is migration of those physical machines – they monitored them over a two-to-three month period, and then they migrated them based on real utilisation. So they cut down memory, cut down CPU, so they could fit as much as possible on the target VMware cluster. And this is exactly the same with public cloud. That’s why you ensure that you do your cost modelling right. It needs to be lean and optimised, as you are paying by the minute or, in some cases, by the second.”
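To make the right-sizing point concrete, here is a minimal sketch of utilisation-based cost modelling. It is purely illustrative: the instance names, sizes and hourly prices are invented placeholders rather than real AWS or Azure rates, and a real assessment would draw its peak-utilisation figures from monitoring telemetry rather than hard-coded values.

# Hypothetical illustration of utilisation-based cost modelling.
# Instance types and hourly prices are made-up placeholders, not real cloud rates.

HOURS_PER_MONTH = 730

# Catalogue of (vCPU, GiB RAM, price per hour) for some hypothetical instance types.
CATALOGUE = {
    "small":  (2, 8,  0.05),
    "medium": (4, 16, 0.10),
    "large":  (8, 32, 0.20),
}

def smallest_fit(vcpu_needed, ram_needed):
    """Pick the cheapest catalogue entry that covers the observed peak demand."""
    candidates = [
        (price, name) for name, (vcpu, ram, price) in CATALOGUE.items()
        if vcpu >= vcpu_needed and ram >= ram_needed
    ]
    return min(candidates)[1] if candidates else "large"

# Each VM: the size it is provisioned at today vs its observed peak utilisation.
vms = [
    {"name": "app01", "provisioned": "large", "peak_vcpu": 2, "peak_ram": 6},
    {"name": "db01",  "provisioned": "large", "peak_vcpu": 4, "peak_ram": 14},
]

as_is = sum(CATALOGUE[vm["provisioned"]][2] for vm in vms) * HOURS_PER_MONTH
right_sized = sum(
    CATALOGUE[smallest_fit(vm["peak_vcpu"], vm["peak_ram"])][2] for vm in vms
) * HOURS_PER_MONTH

print(f"As-is monthly estimate:       ${as_is:,.2f}")
print(f"Right-sized monthly estimate: ${right_sized:,.2f}")

Even in this toy example the gap between the as-is and right-sized figures is large, which is exactly the kind of distortion that can mislead executives if cost modelling is based on provisioned capacity rather than real utilisation.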

It’s important to establish how your apps interact with each other, too. Very few business applications exist in isolation, so making sure that all of the integrations and connections between software still function as required – both during and after the migration – is vital. For this reason, CDW includes a dependency mapping service as part of its Cloud Plan offering, which analyses the connections between VMs and then groups them together into functions, so that they can be migrated in smaller groups.

“That reduces risk significantly,” Wynne says. “It’s naive to think that if you’re looking to do a migration on a couple of hundred virtual machines, that you’re going to do them all in one go. It’s not the way it works, you do it batch by batch. So what you don’t want to do is introduce risk by migrating a batch of virtual machines over to public cloud and then realise afterwards that actually, these machines are going to communicate back to the source data centre on an application protocol, which is latency-sensitive – so it’ll break it, it won’t work, it’ll be too slow. So you end up having to roll back, or migrate more VMs really quickly that you didn’t plan for.”
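As a rough illustration of the dependency-grouping idea (a generic sketch, not CDW’s actual Cloud Plan tooling), the snippet below treats observed VM-to-VM connections as edges in a graph and batches VMs by connected component, so machines that talk to each other move in the same wave. The VM names and connections are invented for the example.

# Generic illustration of dependency grouping: VMs that talk to each other
# are batched together so a migration wave doesn't split a chatty application.
# The VM names and connections are invented for this example.
from collections import defaultdict

connections = [
    ("web01", "app01"), ("app01", "db01"),   # one three-tier application
    ("report01", "warehouse01"),             # an unrelated reporting pair
]

graph = defaultdict(set)
for a, b in connections:
    graph[a].add(b)
    graph[b].add(a)

def migration_batches(graph):
    """Return connected components: each component is one migration batch."""
    seen, batches = set(), []
    for start in graph:
        if start in seen:
            continue
        batch, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in batch:
                continue
            batch.add(node)
            stack.extend(graph[node] - batch)
        seen |= batch
        batches.append(sorted(batch))
    return batches

print(migration_batches(graph))
# [['app01', 'db01', 'web01'], ['report01', 'warehouse01']]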

With all this in mind, it’s absolutely key that when starting a cloud migration project, companies take the time to look at their applications realistically, identifying which of them need to be retooled before they can be moved. There may even be cases where it’s faster to rebuild an application from the ground up, rather than tweaking the existing version for a cloud deployment.

The reduction of operational complexity is a key issue for organisations of all types, and it’s one that the cloud can play a huge role in, but be warned – the process of cloud migration isn’t always a simple matter of scooping up your VMs en masse and dumping them into your chosen cloud environment. A good cloud migration involves looking long and hard at which of your applications truly belong in the cloud, and then investing time and effort in making sure that those applications are engineered to get the maximum benefit from a cloud environment.

Organisations that are looking to start this process don’t have to do it alone; CDW is a tried-and-tested partner that can help guide your business to the cloud and make sure that your applications are delivering the most value with the least overhead.

Get in touch with a CDW Account Director and ask for details on CloudCare® JumpStart and CloudCare® CloudPlan. Visit uk.cdw.com

Future-proofing your cloud strategy


Cloud Pro

20 Jan, 2020

Adopting cloud technologies is something that many organisations have struggled with, and these difficulties have been caused in no small part by large amounts of pre-existing technical debt. Legacy applications and physical infrastructure have, in many cases, become a millstone around the necks of companies who want to capitalise on the economic and transformational benefits of the cloud, leading to long migration roadmaps.

This is a challenge in and of itself, but as companies race to embrace the cloud, it’s imperative to ensure they don’t repeat the same mistakes that caused these headaches in the first place. Companies have often found themselves locked into expensive and inflexible relationships with vendors, and while this is less of an issue with the cloud than with on-premise systems, it has by no means disappeared altogether.

There are ways to mitigate the risk of cloud vendor lock-in, however, and the most important factor is proper planning. Planning is always the first phase of any cloud migration, and it’s vital to ensure that before you even open an account with a cloud provider, you’ve worked out all your application dependencies, compatibility requirements and budgets. You can then assess which cloud providers meet those needs, making it easier to identify an alternative provider to move to in the future.

Businesses should also take a long, hard look at which services they actually need to move to the cloud. Contrary to some popular wisdom, not every application and service is necessarily better off being run in the cloud. Establishing what does and doesn’t need to be migrated can save time and money later on, both in the initial migration process and potentially subsequent ones if those workloads are repatriated to on-premise systems.

Organisations should be mindful of contract lengths, too. Long lease terms on physical infrastructure are one of the biggest impediments to organisations’ cloud migrations, and although many cloud services will offer discounts on the price of longer contract lengths, this also contributes towards those same lock-in issues that you’re trying to avoid.

It’s important to remember that you can design and set up most elements of your public cloud infrastructure without spending a single penny, and you should do this as early as possible. Although public cloud providers market themselves on their ability to get up and running very quickly, mature organisations that want to undertake a comprehensive cloud deployment will find that setting this up, along with all the associated account structures, security and compliance frameworks, quickly eats up unexpectedly large amounts of time.

“None of this costs any money in terms of consumption,” says Lee Wynne, CDW’s public cloud architecture practice lead. “You can build out a well-architected multi-account AWS or Azure design, you can do all your billing strategy, and you can design and provision all your platform architecture and it costs you no money. There are no cloud consumption costs at all to do that; you have to know what you are doing but it won’t cost you a penny. So when you’re ready, when that urgent project comes along, and they go ‘yeah, we need to spin this up in Azure or AWS, or GCP ASAP’, then you’re ready to go. Nobody’s waiting for anything, it’s just done and dusted, a big tick in the box for your IT services. There’s nothing dependent on this from a project perspective, no cost assessments, no buzzwords about machine learning and AI strategies, no politics.

“Now, when it comes to thinking about vendor lock-in you start thinking ‘well, should I just be doing this for one platform? Or should I do it for all of them?’, because it’s the same. That strategy is the same across AWS, Azure or Google Cloud. They all have the same thing: a multi-account structure and a platform architecture, all can be designed and configured with no consumption so it’s not too difficult to do that for all of them. And then all of a sudden, you’ve got a little bit more choice and bit more flexibility about where you choose to deploy certain elements. The only complexity is how you set up network comms between them, if that even needs to happen.”

Designing your workloads and applications in such a way that they can be deployed across multiple clouds from the outset makes jumping ship to a new provider much easier, and it also has the added bonus of making them much leaner and more portable in general.

Vendors themselves have even started warming to the idea of customers deploying across multiple providers, and multi-cloud tools and functionalities are becoming increasingly common. VMware, for example, has been rapidly expanding its portfolio of multi-cloud management tools and has recently announced its plans to incorporate Kubernetes management tools into vSphere, making multi-cloud workload management even simpler. Microsoft has also got in on the trend, announcing earlier this year that Azure Cost Management will help customers manage their AWS spending.

Another important consideration is the tools that you use to build and run your apps. Most organisations will likely construct their apps using open source (or at the very least widely supported) programming languages, but once those applications are in the cloud, it’s very easy to end up accidentally bolting additional compatibility requirements onto them.

“For example, AWS has CodeStar and Azure has Azure DevOps, and they have a lot of native tools in their ecosystem that you can use to wrap around your source code, test and deployment pipelines,” Wynne points out. “Then you become reliant on it; it becomes part of your DevOps process, and then it becomes part of your culture, and it becomes part of your team. So over time, you can easily end up locked into one of those cloud platforms, if you consume a lot of their native services around your source code, and you begin to extend it further into proprietary database services such as Google Firebase, Azure Cosmos and AWS Aurora and serverless monitoring and debugging tools.”

Vendor lock-in remains a major problem for many IT departments, and it can be just as much of a risk when it comes to the cloud. With thorough planning, good design and careful monitoring, however, it is possible to architect your cloud estate in a way that minimises the chance of your organisation repeating the woes of times gone by. For businesses that want to ensure they’re implementing the right cloud strategies and designing their platforms in the most flexible way possible, CDW is an ideal partner for planning and orchestrating your cloud migration, bringing years of experience and unparalleled cloud expertise to bear on your business challenges.

Get in touch with a CDW Account Director and ask for details on CloudCare® JumpStart or CloudPlan. Visit uk.cdw.com

What is blockchain?


Cloud Pro

15 Jan, 2020

Blockchain is an advanced way of logging and protecting data, and changes to that data, in a decentralised database that makes it difficult to manipulate. It’s the technology that underpins digital currencies, such as Bitcoin, and helps to protect against double-spending and hyperinflation in banking. It’s even used to achieve automated supply chain management in manufacturing.

Blockchain is a type of distributed ledger, one that operates as a public platform of data records that isn’t “owned” by any individual. It allows people to exchange information in real-time, with that information changing hands multiple times at once, all while being verified to ensure changes are legitimate.

Blockchain is among the most experimental of emerging technologies, given the sheer number of moving parts needed to ensure the information is correct at all times, wherever it resides. There’s also a pressing need to ensure it’s accurate, and that everybody can access the same information as each other.

Data resides in a series of “blocks” that together make up a “chain”, hence the name blockchain. Any data sent or received over the chain of data blocks can be viewed by any person, at any time, and any changes to the chain are confirmed and uploaded at the same time. Because the chain doesn’t reside in a single location, such as a database or server, it’s incredibly difficult to disrupt or hack; this would require every single node supporting the network to be compromised at the same time.

Although the technology was originally developed for digital currencies, businesses are now seeing the benefits of implementing forms of distributed ledger in their own organisations, particularly to protect sensitive data in environments such as hospitals, or by estate agents to secure property purchases.

New use cases

No-one really knows who invented blockchain. Its initial research paper was published under the name ‘Satoshi Nakamoto‘, the same name credited with the creation of Bitcoin, but it’s likely that the name on the paper was a pseudonym for a group of people who all had a hand in the technology’s development.

Blockchain solved the problem of ‘double spending’, recording what transactions had taken place on the network and preventing users from using the same digital token more than once. It also presented the opportunity for the currency to be decentralised, so governments and other authorities were not required to regulate or oversee it, making it a completely free, global currency.

However, the idea of having a distributed ledger that is not owned by anyone clearly has benefits. For one, it’s super-secure, because no one owns the original file and it can be updated without the threat of being hacked.

It also means data, even the most sensitive information such as that related to personal identities, medical information and insurance records, can be stored in a place that can be made accessible to all parties in a way that’s trusted.

Now that the technology has been in the public domain for a good few years, companies are finding innovative ways of deploying it. There are, for example, a slew of cannabis startups using blockchain to get a head start in an emerging industry. Most recently, startup TruTrace Technologies partnered with auditing firm Deloitte to track cannabis using blockchain technology, according to Proactive Investors.

The system tracks the drug from seed to sale in order for customers and retailers to know the history of the product as it passes through the supply and consumption chain. 

The rise of blocks

Blockchain relies on blocks of data connected in a chain, as its name suggests. The chain is cryptographically secured and distributed across a network among those that want to change or tweak parts of it. As the chain evolves, new blocks are added, and the person or node that adds a block is responsible for authorising it and ensuring it’s correct.

What’s unique about blockchain technologies is that none of the blocks can be changed or removed after being added – all the more reason to make sure a block is correct and accurate before adding it to the chain.

The way blockchains are created makes them perfect for highly regulated industries that need to keep a paper trail of changes. Because it’s tamper-proof, the financial sector is one of the industries taking the technology seriously; indeed, it was created for Bitcoin for exactly this reason.

Bitcoin miners add the blocks, acting as nodes in a huge peer-to-peer (P2P) network. Everyone works together to validate transactions, without changing anything in the chain. Because every block is linked to the next, nothing can be changed without breaking the chain, and altering anything would require every person who has ever added a block to change their additions – an impossible task when so many people are using a single network.
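A minimal sketch of that linking, assuming a drastically simplified block containing only an index, a data string and the previous block’s hash (no transactions, timestamps or consensus): each block stores the hash of its predecessor, so altering any earlier block changes its hash and invalidates everything after it.

# Minimal illustration of hash-linked blocks; real blockchains add
# transactions, timestamps, consensus rules and much more.
import hashlib

def block_hash(index, data, prev_hash):
    return hashlib.sha256(f"{index}|{data}|{prev_hash}".encode()).hexdigest()

def build_chain(entries):
    chain, prev = [], "0" * 64  # the genesis block points at a dummy hash
    for i, data in enumerate(entries):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev_hash": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["index"], block["data"], block["prev_hash"]):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = build_chain(["Alice pays Bob 5", "Bob pays Carol 2"])
print(is_valid(chain))                   # True
chain[0]["data"] = "Alice pays Bob 500"  # tamper with an earlier block
print(is_valid(chain))                   # False: the recomputed hash no longer matches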

Not all blockchains are built the same, and the time it takes to process blocks of transactions can vary. Given the nature of buying and selling, cryptocurrency blockchains tend to be the quickest examples. The Ethereum blockchain, which supports the Ether cryptocurrency as well as countless other industry projects, is able to process transactions in around 15 seconds, whereas Bitcoin’s network generally takes around 10 minutes.

More affordable and efficient

Blockchain networks can operate through multiple computers across the world, sometimes thousands, in an open P2P configuration. There is no centralised database or server, and because of this users, or nodes, can organise and audit information quicker and more effectively. But the time taken to verify information does scale with the size of the network.

There are benefits to the nature of blockchain networks, with implications for privacy and security. For instance, the fact the data is not stored in any one location means it is difficult, if not impossible, to hack these networks and steal any data, or shut them down. They are also able to withstand the risk of outages, as all nodes would have to be individually taken down for the blockchain to be knocked offline.

Cooperation and collaboration are normally at the heart of most blockchain networks too, with the various users operating under a shared goal. For example, users in the financial services sector would be working to build a safer and more secure method for storing and processing transaction information. While a physical file room may have once been a fixture of such operations, a blockchain network enables data to be transmitted far more quickly and accurately.

The scope for blockchain to reduce the risks of fraud, and allow for more affordable financial processes, is greater too – with many systems such as these, albeit in their infancy, already producing some results. Santander, for example, earlier this year rolled out a blockchain technology based on Ripple that could accelerate payments across borders.

Public vs private

Much like the field of cloud computing, the function and implementation of blockchain can vary significantly depending on whether it’s designed to be public or private. The primary distinction between these types comes down to who can access a system.

Public

Public blockchains operate a shared network that allows anyone to maintain the ledger and participate in the execution of blockchain protocol – in other words, authorise the creation of blocks. It’s essential for services such as Bitcoin, which operates the largest public blockchain, as it needs to encourage as many users as possible to its ledger to ensure the currency grows.

Public blockchains are considered entirely decentralised, but in order to maintain trust, they typically employ economic incentives, such as cryptocurrencies, and cryptographic verification. This verification process requires every user, or ‘node’, to solve increasingly complex and resource-intensive problems known as ‘proof of work’ in order to stay in sync.

This means public blockchains often require immense computational power to maintain the ledger, which only worsens as more nodes are added, and predicting how much that will increase is difficult. Given the number of voices in the community, it’s also incredibly difficult to reach a consensus on any technical changes to a public blockchain – as demonstrated by Bitcoin’s two recent hard forks.
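As a toy illustration of the proof-of-work idea (simplified far beyond what Bitcoin actually does, with invented block data and a tiny difficulty), the sketch below searches for a nonce that makes a block’s hash start with a given number of zeros. Each extra zero makes the search roughly 16 times more expensive, while verifying a found nonce stays cheap for every other node.

# Toy proof-of-work: find a nonce so the block hash starts with N zeros.
# Real networks use far harder targets and adjust difficulty automatically.
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block 42: Alice pays Bob 5", difficulty=4)
print(nonce, digest)
# Verification is cheap: any node can re-hash the data with the found nonce
# and confirm it meets the target, without redoing the search.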

Private

Private blockchains are arguably the antithesis of what the technology was originally designed for. Instead of a decentralised, open ledger, a private blockchain is entirely centralised, maintained by nodes belonging to a single organisation or entity.

It’s a novel design tweak that has allowed the technology to flourish within those organisations looking for the same streamlined transactions afforded by public blockchains, only with highly restricted access. As there are fewer participants on the network, transactions are normally cheaper and verified far quicker on private chains, and fixes to faults or network upgrades can be implemented almost immediately.

In order to share the data stored on a private chain, these networks often operate using a permission-based system, in which node participants are able to grant read access to external parties, such as auditors or regulators looking to check the inner workings of a company.

Unfortunately, as there are fewer nodes maintaining the blockchain, it can’t offer the same high levels of security afforded by decentralised chains.

Consortium

‘Consortium’ is best described as the ‘hybrid cloud’ of blockchain. It provides the robust controls and ‘high trust’ transactions of private blockchains, only without being confined to the oversight of a single entity.

It sits somewhere in the middle. Consortium blockchains provide the same limited access and high efficiency afforded by private blockchains, but dedicated nodes are set aside to be controlled by external companies or agents, instead of those parties having only the read access they would get under a private blockchain.

The easiest way to understand how it differs is to think of consortium blockchains as the equivalent of a council group – with each member having responsibility for maintaining the blockchain, and each having permissions to give read access.

Given its collaborative design, it’s a perfect solution for supporting the work of government committees or industry action groups where a number of companies may come together to tackle an issue – whether that be industries working to combat climate change or maintaining a shared ledger to support the work of the United Nations.

Blockchain vs Distributed Ledger Technology

The term ‘blockchain’ is often deployed to refer to a host of similar yet different technologies, and is often falsely used to refer to any decentralised distributed database. Blockchain is, in reality, only one form of the emerging distributed ledger technology (DLT).

DLT is a form of technology comparable to a database but distributed across multiple physical sites and locations, regardless of how near or far they are from one another. The purpose is to avoid having to rely on a centralised storage system, or the need for a middle-man, like a network, to authorise and record changes to the records. When changes are requested, the lack of a centralised system means approval is demanded from all nodes across the DLT network.

This concept is being adopted by businesses and organisations at a fast pace, and across various industries. This is not just an innovation developed and taken up by tech companies, but sectors like manufacturing and finance.

There are a number of formats in which DLT arises, but the central idea of a diversity of control is at the heart of all of them. One form of distributed ledger, for example, allows data to be stored on separate nodes, such as banking records beginning with each letter of the alphabet being dispersed among different locations. Rather than being replicated to each location, as in a database as we’ve always known it, the data is spread across parts of a network.

Blockchain simply refers to one iteration of this form of technology, more specifically, a data structure that takes the shape of entries stored in blocks. This form of structuring data offers an element of synchronisation between parts of a network – and it’s essential for supporting innovations like Bitcoin.

Despite its success as the building block of currencies like Bitcoin, the system doesn’t necessarily need to have miners and tokens to qualify as a blockchain – the term simply refers to the structure of arranging data into blocks. Blockchains, as a result, are decentralised ledgers where data is replicated rather than distributed.

Unfortunately, the frequency at which blockchain and distributed ledger are used interchangeably has created confusion over the technology as a whole, leading many to dismiss blockchain as simply a tool for Bitcoin.