Why ‘lift-and-shift’ is an outdated approach

Cloud Pro

17 Jan, 2020

There are many reasons why a business might consider moving some or all of its applications to public cloud platforms like AWS, Azure or Google Cloud. One of the most compelling, however, is the cloud’s ability to reduce the complexity of an organisation’s IT. Removing the need for management of the physical infrastructure that applications run on can yield big benefits for simplification.

Running your applications in the cloud allows you to take advantage of a new compute economy, scaling capacity up and down as needed, as well as letting you hook into a huge range of additional tools and services. While this is all well and good for building new applications, porting pre-existing legacy apps over to cloud-based services can often prove challenging.

Organisations that want to migrate pre-existing workloads to a public cloud are faced with a choice: do they re-architect their applications for cloud, or do they simply attempt to port them to their chosen platform wholesale, with no alteration? For many companies, the latter approach – known as the ‘lift-and-shift’ method – initially sounds like the more attractive option. It allows them to get into the cloud faster, with a smaller amount of work, meaning the IT team has more time to devote to other elements of the migration or to developing entirely new capabilities.

Sadly it’s not quite as simple as that. While some applications can be moved over fairly seamlessly, not all apps are suited to this method. Compatibility is the first issue companies are liable to run into with lift-and-shift; particularly with legacy applications, there’s a good chance the original code relies on old, outdated software or defunct libraries. This could make running that app in the cloud difficult, if not impossible, without modification. Organisations also misinterpret the business continuity options available in public cloud, sometimes assuming they are the same as their on-premises counterparts.

“In a lot of cases with server-side applications, they’re not delivered and packaged as well as workspace applications are on an end-user’s desktop,” says Lee Wynne, CDW’s Public Cloud Architecture Practice Lead, “so finding somebody who actually installed the application on the server in the first place can be difficult.”

This, Wynne points out, along with a lack of documentation and issues with upgrading the original OS that a virtual machine runs on, can make porting legacy applications to the cloud “very costly and time consuming”. In terms of business continuity, Wynne says:

“It can take a fair amount of explaining that in the public cloud domain, the ability to move machines from host to host with zero downtime across availability zones isn’t really a thing, therefore if you are moving a critical business workload from your current data centre that is highly protected by various VMware HA features, you need to consider how that will remain online through availability zone outages. In other words, you have to architect for failure”.

Cost modelling is also a critical component, Wynne says, and organisations need to make sure that the cost modelling they’re doing is an accurate representation of what their actual usage will look like.

“The accuracy element of cost modelling is really critical when you’re assessing at scale. You’re not just assessing a couple of VMs, you’re assessing a whole data centre or a few thousand; you’ve got to be accurate with the costs, and you’ve got to be able to get the instance types that are displayed during those accurate cost assessments.

“Therefore picking the tooling and the right telemetry at the beginning, and getting those costs accurate for your business case, is probably one of the first risks that you’ll come across with a cloud migration. Otherwise, you just end up making it three times more expensive than it actually is, and therefore providing executives and decision makers with the wrong information.

“If you think way back when we went from physical servers to virtual servers, no one did an as-is migration of those physical machines – they monitored them over a two-to-three month period, and then they migrated them based on real utilisation. So they cut down memory, cut down CPU, so they could fit as much as possible on the target VMware cluster. And this is exactly the same with public cloud. That’s why you ensure that you do your cost modelling right. It needs to be lean and optimised, as you are paying by the minute or, in some cases, by the second.”
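The right-sizing Wynne describes – monitoring real utilisation, then picking the smallest target that fits – can be sketched in a few lines. This is an illustrative sketch only; the instance names and prices below are invented for the example, not any provider’s actual catalogue or CDW’s tooling:

```python
# Illustrative right-sizing: pick the cheapest instance that covers a VM's
# *observed* peak usage, rather than mirroring its provisioned spec.
# Instance names and prices here are hypothetical.
INSTANCE_TYPES = [
    # (name, vCPUs, memory in GiB, hourly price in USD)
    ("small", 2, 4, 0.05),
    ("medium", 4, 8, 0.10),
    ("large", 8, 16, 0.20),
]

def right_size(peak_cpu: float, peak_mem_gib: float):
    """Return the cheapest instance type covering the observed peaks."""
    candidates = [t for t in INSTANCE_TYPES
                  if t[1] >= peak_cpu and t[2] >= peak_mem_gib]
    if not candidates:
        raise ValueError("no instance type is large enough")
    return min(candidates, key=lambda t: t[3])

# A VM provisioned with 8 vCPUs and 16GiB that only ever peaks at
# 3 cores and 6GiB maps to the mid-tier instance at half the cost.
name, vcpus, mem_gib, price = right_size(peak_cpu=3, peak_mem_gib=6)
```

Feeding the model observed peaks rather than provisioned specs is what keeps the resulting cost estimate, in Wynne’s words, lean and optimised.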

It’s important to establish how your apps interact with each other, too. Very few business applications exist in isolation, so making sure that all of the integrations and connections between software still function as required – both during and after the migration – is vital. For this reason, CDW includes a dependency mapping service as part of its Cloud Plan offering, which analyses the connections between VMs and then groups them together into functions, so that they can be migrated in smaller groups.

“That reduces risk significantly,” Wynne says. “It’s naive to think that if you’re looking to do a migration on a couple of hundred virtual machines, that you’re going to do them all in one go. It’s not the way it works, you do it batch by batch. So what you don’t want to do is introduce risk by migrating a batch of virtual machines over to public cloud and then realise afterwards that actually, these machines are going to communicate back to the source data centre on an application protocol, which is latency-sensitive – so it’ll break it, it won’t work, it’ll be too slow. So you end up having to roll back, or migrate more VMs really quickly that you didn’t plan for.”
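Dependency mapping of the sort Wynne describes is, at heart, a graph problem: treat each VM as a node, each observed connection as an edge, and migrate each connected group as one batch. A minimal sketch, with hypothetical VM names (real tooling also weighs protocols and latency sensitivity, as Wynne notes):

```python
# Illustrative dependency grouping: connected components of the VM graph
# become candidate migration batches. VM names are made up for the example.
from collections import defaultdict

def group_vms(connections):
    """Return the connected components of the VM dependency graph."""
    adj = defaultdict(set)
    for a, b in connections:
        adj[a].add(b)
        adj[b].add(a)
    seen, groups = set(), []
    for vm in adj:
        if vm in seen:
            continue
        stack, group = [vm], set()
        while stack:
            node = stack.pop()
            if node in group:
                continue
            group.add(node)
            stack.extend(adj[node] - group)
        seen |= group
        groups.append(group)
    return groups

# Two independent functions: a web/app/db stack and a reporting pair.
batches = group_vms([("web1", "app1"), ("app1", "db1"), ("rpt1", "rpt-db")])
```

Here the web stack and the reporting pair come out as separate batches, so each can be moved in its own wave without splitting a latency-sensitive link across data centres.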

With all this in mind, it’s absolutely key that when starting a cloud migration project, companies take the time to look at their applications realistically, identifying which of them need to be retooled before they can be moved. There may even be cases where it’s faster to rebuild an application from the ground up, rather than tweaking the existing version for a cloud deployment.

The reduction of operational complexity is a key issue for organisations of all types, and it’s one that the cloud can play a huge role in, but be warned – the process of cloud migration isn’t always a simple matter of scooping up your VMs en masse and dumping them into your chosen cloud environment. A good cloud migration involves looking long and hard at which of your applications truly belong in the cloud, and then investing time and effort in making sure that those applications are engineered to get the maximum benefit from a cloud environment.

Organisations that are looking to start this process don’t have to do it alone; CDW is a tried and tested partner that can help guide your business to the cloud and make sure that your applications are delivering the most value with the least overhead.

Get in touch with a CDW Account Director and ask for details on CloudCare® JumpStart and CloudCare® CloudPlan. Visit uk.cdw.com

Microsoft plots ‘carbon negative’ target for 2030

Keumars Afifi-Sabet

17 Jan, 2020

Microsoft has outlined a set of ambitious plans to remove more carbon from the atmosphere than it emits by the end of the decade.

By 2030, Microsoft is aiming to be ‘carbon negative’, in that the carbon it removes from the atmosphere outweighs the carbon emitted, including the activity of its wider supply chain.

This is in addition to a $1 billion climate fund to accelerate research and development into carbon reduction, capture and removal technology that doesn’t already exist today.

Moreover, Microsoft wants to continue the trend of lowering emissions while increasing carbon removal so that by 2050 it will, on paper, have removed all the carbon it has emitted since its foundation in 1975.

“While the world will need to reach net-zero, those of us who can afford to move faster and go further should do so,” said Microsoft president Brad Smith.

“We recognize that progress requires not just a bold goal but a detailed plan. As described below, we are launching today an aggressive program to cut our carbon emissions by more than half by 2030, both for our direct emissions and for our entire supply and value chain.

“While we at Microsoft have worked hard to be “carbon neutral” since 2012, our recent work has led us to conclude that this is an area where we’re far better served by humility than pride. And we believe this is true not only for ourselves, but for every business and organization on the planet.”

The industry stalwart is the latest in a string of companies, including Amazon and HP, to enter an arms race geared toward reducing carbon footprints and embracing cleaner, greener technologies.

Amazon, for example, has pledged to be carbon neutral by 2040, while Google Cloud hit its 100% renewable energy goal in April 2018, powering its data centres and offices from renewable sources, including solar and wind. Salesforce, similarly, achieved net-zero greenhouse gas emissions the previous year.

HP, on the other hand, has committed to releasing routine sustainability reports that track its progress in its aims to reduce its carbon footprint. It has started to build many of its products with sustainability in mind, including the forthcoming HP Elite Dragonfly business 2-in-1.

Microsoft says it can achieve its own “aggressive” set of targets by first investing in nature-based initiatives, such as planting trees, with the goal of shifting to technology-based programmes when they become more viable.

The wider strategy, however, encompasses a set of smaller goals that Microsoft hopes to hit along the way to achieving its major targets for 2030 and 2050.

By 2025, for instance, Microsoft is hoping to shift to a 100% supply of renewable energy, while aiming to fully electrify its global campus operations vehicle fleet by 2030.

The firm is hoping to implement new procurement processes and tools to incentivise its suppliers to reduce their carbon emissions too, with these pencilled in for July 2021. For customers, meanwhile, Microsoft will roll out a sustainability calculator and dashboard that estimates emissions from Azure services.

Google-parent Alphabet now worth $1 trillion

Bobby Hellard

17 Jan, 2020

Google’s parent company Alphabet has become the fourth US tech company to reach a market value of $1 trillion (£765 billion), ending trading at $1,451.70 per share on Thursday.

It will hold a quarterly conference call to discuss its Q4 and full-year 2019 financial results on 3 February, but Wall Street analysts are expecting it to report revenue of $46.9 billion – up 20% year-on-year.

Alphabet is now part of a US tech elite with Apple, Amazon and Microsoft all having reached the $1 trillion market cap over the past two years. The iPhone maker was the first to surpass the mark in August 2018 with Amazon hitting it a month later. Microsoft was the third company, doing so in April 2019.

As with Amazon and Microsoft, revenues from its cloud ventures are believed to have contributed heavily to Alphabet’s overall growth, with the Google Cloud Platform doubling its revenue run rate to $2 billion per quarter between February 2018 and July 2019, according to CNBC.

Cloud, Google’s Play app store and Google’s hardware division have all been key drivers for the company, according to its earnings reports. In Q3 of 2019, revenue from these segments of the business increased 39% year-on-year.

Sundar Pichai’s recent promotion to CEO of Alphabet has also helped to increase optimism among stockholders, according to Pivotal Research Group analyst Michael Levine.

“We are incrementally more constructive about what we perceive as multiple ways to get paid under the recently appointed Pichai regime,” Levine wrote, as reported by Markets Insider. His evaluation is reportedly based on an estimate of Alphabet’s 2021 EBITDA (earnings before interest, tax, depreciation and amortization).

Pichai took over as CEO of both Alphabet and Google after co-founders Larry Page and Sergey Brin stepped down in December. The change of leadership offers “the most optionality for multiple expansion for the stock we have seen in years,” according to Levine, who also cited Thomas Kurian’s stewardship of Google’s cloud business as a positive sign.

Google Cloud unveils premium support offering to further woo enterprise customers

Google Cloud continues to court an enterprise base with the launch of a premium support offering for enterprise and mission-critical requirements.

The service builds on Google Cloud’s existing offerings of technical account managers and 15-minute SLOs (service level objectives). Any companies with premium support will have their cases handled directly by the best of the best – or ‘context-aware experts’, for the jargon version. Context-aware, in this instance, means support staff who understand their customers’ peak events and will work before, during and after them to ensure no issues arise.

Google also promises a case management API, which aims to integrate vendor and customer systems directly, while premium members will also get access to Google Cloud’s training library, as well as a sneak peek at previews of key product launches.

“Premium Support has been designed to better meet the needs of our customers running modern cloud technology,” wrote Atul Nanda, vice president of support at Google Cloud. “We’ve made investments to improve the customer experience, with an updated support model that is proactive, unified, centred around the customer, and flexible to meet the differing needs of their businesses.”

It has been a busy start to the year for Google. The company made its coldest storage tier, Archive, generally available last week, before taking the opportunity afforded by retail show NRF to announce updates for retailers to get on board with Google’s cloud. Kohl’s, Lowe’s and Wayfair are just three of the recently announced major retailers confirmed as Google Cloud customers.

Focusing on the enterprise space and building up the sales and marketing channels have been the key priority for Thomas Kurian in the 12 months since he became Google Cloud CEO. Indeed, Kurian used his first major speaking slot last February to advocate the use of old-school sales tactics to woo the enterprise customers. The previous October, former product management lead Amir Hermelin delivered a valedictory post which argued his previous employer had missed the boat in the enterprise.

Since then, many of Google Cloud’s moves – or at least the marketing messages behind them – have had the enterprise in mind. Take the storage growth plan announced in March for companies who spend $10,000 per month for a year, or the acquisitions of Looker and Elastifile, or security partnerships with Palo Alto and McAfee among others. The question for Google now is how to convert these moves into decision making from the highest level at the world’s largest companies.

Premium Support is available now with Google Cloud promising additional features and support plans throughout the year. You can read the full Google blog here.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Cloud hardware sales slide, but still dominate wider IT market

Nicole Kobie

16 Jan, 2020

Sales of hardware for cloud infrastructure slid slightly over the past year as part of overall weaker sales in IT, according to IDC, but they still exceeded spending on non-cloud infrastructure for the second time ever.

The analyst firm said cloud IT infrastructure spending declined for the second quarter in a row in the third quarter of 2019, down 1.8% from the same period last year.

While public cloud infrastructure declined 3.7% on the year, that segment is still worth $11.9 billion in quarterly sales. And while sales for public cloud dropped versus the same period last year, they were up by 24% from the quarter before.

“As the overall segment is generally trending up, it tends to be more volatile quarterly as a significant part of the public cloud IT segment is represented by a few hyperscale service providers,” IDC reported, adding that public cloud makes up most of the spending.

With such volatile quarterly figures, it’s easier to look at the market on an annual basis, and IDC noted that public cloud IT infrastructure has seen stable growth since it began tracking the segment. “In [the third quarter of 2019], vendor revenues from private cloud environments increased 3.2% year over year, reaching nearly $5 billion. IDC expects spending in this segment to grow 7.2% year over year in 2019 to $21.4 billion,” the company predicted.

IDC splits infrastructure into three areas: Ethernet switches, compute platforms, and storage. While compute will remain the largest segment for cloud infrastructure spending, it’s expected to see growth of just 3% in 2019, while storage will be flat. The Ethernet switches segment is predicted to climb 11% on the year.

IDC said that the IT infrastructure industry is reaching the point where cloud will outstrip spending on traditional, non-cloud systems. Up until this last quarter, cloud spending had topped non-cloud only once, back in the third quarter of 2018. In the last quarter, cloud hardware accounted for 53% of overall spending.

“However, for the full year 2019, spending on cloud IT infrastructure is expected to stay just below the 50% mark at 49.8%,” IDC added. “This year is expected to become the tipping point with spending on cloud IT infrastructure staying in the 50+% range.”

IDC forecasts traditional infrastructure will make up 42% of sales by 2023, down from 52% in 2018. That’s in part due to spending on traditional environments declining, with IDC predicting it to fall by 5.3% over 2019.

“This share loss and the growing share of cloud environments in overall spending on IT infrastructure is common across all regions,” added IDC. “While the industry overall is moving toward greater use of cloud, there are certain types of workloads and business practices, and sometimes end user inertia, which keep demand for traditional dedicated IT infrastructure afloat.”

Compleat targets global expansion with Sage Intacct integration

Daniel Todd

16 Jan, 2020

Spend management software provider Compleat Software has expanded its relationship with Sage, adding a new integration between iCompleat and Sage Intacct.

Designed to streamline the accounts payable process, the expanded relationship will see the firm market and sell its AI-powered iCompleat Buy to Pay service on the Sage Intacct Marketplace.

The move becomes the third integration between the two companies, following previous Sage 50 and Sage 500 additions, and forms part of Compleat’s plans to expand internationally in places such as North America and Australia.

“The simplicity, rapid deployment, increased levels of automation, and cost savings achieved when buying online makes iCompleat an asset for every business,” said Annabel Sim, VP of global sales at Compleat. “This expanded partnership will help our solution to stand out and be successfully adopted in the USA and Australia, in addition to the success achieved in the UK already.”

Initially focused on North America, Sage Intacct was recently launched in the UK and Australia, and is the first and only preferred financial management service of the American Institute of Certified Public Accountants (AICPA).

Compleat’s software is the first accounts payable automation add-on with a built-in online buying service available on the platform. Its latest product allows users to make Amazon Business purchases from iCompleat, enabling automated invoice matching with simple product ordering tasks.

Through AI and machine learning, the software automates the cycle of buying, invoice capture and approval processes. This increases accounts payable efficiency, the firm said, and eliminates the risk of fraud by providing full visibility and control of company spend.

“We continue to add powerful solutions to the Sage Intacct Marketplace,” commented Eileen Wiens, VP of business development for Sage Intacct. “Sage Intacct customers can now benefit from the built-in online buying functionality and Compleat’s partnership with Amazon Business which takes the Buy to Pay software functionality to a whole new level of automation and end convenience.”

Cloud infrastructure trends: Usage continues to rise – with AWS-VMware workloads soaring in parallel

85% of organisations expect to have the majority of their workloads cloud-based by the end of 2020, according to a new study from AllCloud.

The study, which polled more than 150 IT decision makers at organisations where at least 300 employees were using cloud infrastructure, found seven in 10 respondents already ran at least half of their workloads on the cloud.

When it came to organisations’ primary goals when deciding on their cloud platform of choice, three areas stood out. Not surprisingly, security came out on top, cited by 27.6% of those polled, yet reliability (26.3%) and flexibility (22.4%) fared similarly. This makes for an interesting comparison with cost, cited by only 13.8% of respondents.

Almost half of those who were using a multi-cloud approach had Microsoft Azure (49.3%) as their platform of choice. Google Cloud Platform, cited by 40.1% of those polled, came next, with IBM (32.2%) and Oracle’s (20.4%) clouds trailing.

Given AllCloud’s focus is primarily on supporting AWS initiatives – alongside Salesforce and NetSuite – it makes no attempt to hide the fact it is an AWS-centric report. When it came to which specific services – of the more than 170 in AWS’ portfolio – organisations planned to explore next year, database, cited by 21.1% of those polled, was the most frequently mentioned. IoT services (17.1%) also featured strongly, alongside containers and microservices (14.5%).

Perhaps the most illuminating statistic came through AWS’ partnership with VMware. According to the data, almost three quarters (73%) of enterprise private workloads are using VMware. Expect this to continue this year, AllCloud asserts. “The existing partnership is likely to grow stronger and broader, with more accessibility released between the technologies,” the report notes. “This will allow a faster rate of enterprise adoption for organisations that want to leverage the benefits of the cloud.”

“The report’s findings are consistent with feedback that AllCloud has been receiving from its clients across the globe – which is that their use of cloud infrastructure and related technologies has been growing – and fast,” the report notes. “As these companies have grown, and their digital transformation programs have progressed, many have embraced AWS as their foundation.”


Small businesses and innovators benefit from £100m government boost

Keumars Afifi-Sabet

15 Jan, 2020

Up to £100 million is being poured into researchers and small businesses as part of public sector efforts to invest in emerging technologies like artificial intelligence (AI).

The government’s Future Leaders Fellowships scheme will receive £78 million to be invested in 78 researchers working on scientific and technological discoveries.

The remaining £20 million will be allocated to universities to support small businesses in rapidly-growing industries including AI, but also areas like clean growth and agri-food.

The 20 University Enterprise Zones (UEZs) will provide specialist support to small businesses and raise the level of knowledge-sharing between academics and entrepreneurs through frequent collaborations.

Through these UEZs, startups and small businesses will be given the facilities and expertise to help take their ideas through from a concept into the production and marketing stages.

These programmes will run across the UK in cities like Exeter, Falmouth, and Durham, not just London, with the government hoping this regional diversity will lead to several improvements to local economies.

These packages are part of the government’s UK Research and Innovation (UKRI) programme, which has seen various sums allocated to boosting aspects of tech growth in recent months.

The NHS, for example, this month received £69.5 million to fund four projects developing therapies and technologies to treat genetic mutations that lead to life-threatening conditions like cancer and arthritis.

The UKRI programme even funded three R&D projects in Bristol with a £50,000 round of investment last March.

“UKRI is committed to creating modern research and innovation careers and our Future Leaders Fellowships aim to support and retain the most talented people, including those with flexible career paths,” said UKRI chief executive Professor Sir Mark Walport.

“These 20 University Enterprise Zones funded by Research England will be important focal points for collaboration in business-friendly environments, driving innovation and delivering benefits that will be felt across economies at the local, regional and national scale.”

The largest recipient of the £20 million UEZ fund is the University of Southampton, which will use a £1.5 million boost to fund the Future Towns Innovation Hub.

Other prominent projects include Oxford Brookes’ £1.2 million AI & Data Analysis Incubator, and Lancaster University’s Secure Digitisation UEZ.

What is blockchain?

Cloud Pro

15 Jan, 2020

Blockchain is a way of logging and protecting data, and changes to it, in a decentralised database that is difficult to manipulate. It’s the technology that underpins digital currencies, such as Bitcoin, and helps to protect against double-spending and hyperinflation in banking. It’s even used to automate supply chain management in manufacturing.

Blockchain is a type of distributed ledger, one that operates as a public platform of data records that isn’t “owned” by any individual. It allows people to exchange information in real-time, with that information changing hands multiple times at once, all while being verified to ensure changes are legitimate.

Blockchain is among the most experimental of emerging technologies, given the sheer number of moving parts needed to ensure the information is correct at all times, wherever it resides. There’s also a pressing need to ensure it’s accurate, and that everybody can access the same information as each other.

Data resides in a series of “blocks” that together make up a “chain”, hence the name blockchain. Any data sent or received over the chain can be viewed by anyone, at any time, and any changes to the chain are confirmed and uploaded simultaneously. Because it doesn’t reside in a single location, such as a database or server, it is incredibly difficult to disrupt or hack; doing so would require every node supporting the network to be compromised at the same time.
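The tamper-resistance described above comes from hash-linking: each block stores a hash of its predecessor, so altering any earlier block invalidates everything after it. A toy sketch in Python, not a production blockchain:

```python
# Toy hash-linked chain: each block records a SHA-256 hash of its
# predecessor, so changing an earlier block breaks every later link.
import hashlib
import json

def make_block(data, prev_hash):
    """Build a block whose hash covers its data and its predecessor's hash."""
    body = {"data": data, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev_hash": prev_hash},
                   sort_keys=True).encode()).hexdigest()
    return body

genesis = make_block("genesis", "0" * 64)
second = make_block("alice pays bob 1 BTC", genesis["hash"])

# A tampered copy of the first block produces a different hash, so the
# link stored in the second block no longer matches it.
tampered = make_block("genesis (altered)", "0" * 64)
assert second["prev_hash"] == genesis["hash"]
assert second["prev_hash"] != tampered["hash"]
```

Real blockchains hash far more than this – timestamps, transaction Merkle roots and nonces – but the linking principle is the same.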

Although blockchain was originally developed for digital currencies, businesses are now seeing the benefits of implementing forms of distributed ledger in their own organisations, whether to protect sensitive data in environments such as hospitals, or to secure property purchases for estate agents.

New use cases

No-one really knows who invented blockchain. Its initial research paper was published under the name ‘Satoshi Nakamoto‘, the same name attributed to the creation of Bitcoin, but it’s likely that this was a pseudonym for a group of people who all had a hand in the technology’s development.

Blockchain solved the problem of ‘double spending’, recording what transactions had taken place on the network and preventing users from using the same digital token more than once. It also presented the opportunity for the currency to be decentralised, so governments and other authorities were not required to regulate or oversee it, making it a completely free, global currency.

However, the idea of having a distributed ledger that is not owned by anyone clearly has benefits. For one, it’s highly secure, because no one owns the original file and it can be updated without the threat of hacking.

It also means data – even the most sensitive information, such as that relating to personal identities, medical records and insurance – can be stored in a place that is accessible to all parties in a way that’s trusted.

Now that the technology has been in the public domain for a good few years, companies are finding innovative ways of deploying it. There are, for example, a slew of cannabis startups using blockchain to get a head start in an emerging industry. Most recently, startup TruTrace Technologies partnered with auditing firm Deloitte to track cannabis using blockchain technology, according to Proactive Investors.

The system tracks the drug from seed to sale in order for customers and retailers to know the history of the product as it passes through the supply and consumption chain. 

The rise of blocks

Blockchain relies on blocks of data connected in a chain, as its name suggests. The chain is cryptographically secured and distributed among the participants of a network. As the chain evolves, new blocks are added, and the person or node that adds a block is responsible for authorising it and ensuring it’s correct.

What’s unique about blockchain technologies is that none of the blocks can be changed or removed after being added – all the more reason to ensure a block is accurate before it joins the chain.

The way blockchains are created makes them well suited to highly regulated industries that need a paper trail of changes. Because the technology is tamper-proof, the financial sector is among the industries taking it seriously – indeed, it was created for Bitcoin for exactly this reason.

Bitcoin miners add the blocks, acting as nodes in a huge peer-to-peer (P2P) network. Everyone works together to validate transactions, without changing anything in the chain. Because every block is linked together in a chain, nothing can be changed without breaking the chain and to change anything, it would need every person who’s ever added a block to change their additions – an impossible task when so many people are using a single network. 
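That chain-integrity argument can be expressed as a simple validation check: a chain holds only if each block’s stored reference matches a freshly computed hash of the block before it. A toy illustration (real networks hash many more fields, such as timestamps and Merkle roots):

```python
# Toy chain validation: a chain is valid only when every block's stored
# "prev_hash" matches a recomputed hash of the preceding block.
import hashlib

def block_hash(data, prev_hash):
    """Hash a block's contents together with its predecessor's hash."""
    return hashlib.sha256((data + prev_hash).encode()).hexdigest()

def is_valid(chain):
    """chain: list of (data, prev_hash) pairs, genesis first."""
    for i in range(1, len(chain)):
        prev_data, prev_ref = chain[i - 1]
        if chain[i][1] != block_hash(prev_data, prev_ref):
            return False
    return True

chain = [("genesis", "0")]
chain.append(("tx1", block_hash(*chain[-1])))
chain.append(("tx2", block_hash(*chain[-1])))
assert is_valid(chain)

# Altering a middle block's data breaks the link the next block stores.
chain[1] = ("tx1 (altered)", chain[1][1])
assert not is_valid(chain)
```

Every node can run this kind of check independently, which is why tampering is caught without any central authority.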

Not all blockchains are built the same, and the time it takes to process blocks of transactions can vary. Given the nature of buying and selling, cryptocurrency blockchains tend to be the quickest examples. The Ethereum blockchain, which supports the Ether cryptocurrency as well as countless other industry projects, is able to process transactions in around 15 seconds, whereas Bitcoin’s network generally takes around 10 minutes.

More affordable and efficient

Blockchain networks can operate through multiple computers across the world, sometimes thousands, in an open P2P configuration. There is no centralised database or server, and because of this, users, or nodes, can organise and audit information more quickly and effectively. But the time taken to verify information does scale with the size of the network.

There are benefits to the nature of blockchain networks, with implications for privacy and security. For instance, the fact the data is not stored in any one location means it is difficult, if not impossible, to hack these networks and steal any data, or shut them down. They are also able to withstand the risk of outages, as all nodes would have to be individually taken down for the blockchain to be knocked offline.

Cooperation and collaboration are normally at the heart of most blockchain networks, with the various users operating under a shared goal. For example, users in the financial services sector would be working towards building a safer and more secure method for storing and processing transaction information. Where a physical file room may once have been a fixture of such operations, a blockchain network can transmit data far more quickly and accurately.

The scope for blockchain to reduce the risks of fraud, and allow for more affordable financial processes, is greater too – with many systems such as these, albeit in their infancy, already producing some results. Santander, for example, earlier this year rolled out a blockchain technology based on Ripple that could accelerate payments across borders.

Public vs private

Much like the field of cloud computing, the function and implementation of blockchain can vary significantly depending on whether it’s designed to be public or private. The primary distinction between these types comes down to who can access a system.


Public blockchains operate a shared network that allows anyone to maintain the ledger and participate in the execution of the blockchain protocol – in other words, authorise the creation of blocks. This openness is essential for services such as Bitcoin, which operates the largest public blockchain, as it needs to attract as many users as possible to its ledger for the currency to grow.

Public blockchains are considered entirely decentralised, but in order to maintain trust they typically employ economic incentives, such as cryptocurrencies, and cryptographic verification. This verification process requires every user, or ‘node’, to solve increasingly complex and resource-intensive problems known as ‘proof of work’ in order to stay in sync.
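A minimal sketch of the proof-of-work idea, assuming a simplified difficulty rule (find a nonce whose hash starts with a given number of zeros – real networks compare against a precise numeric target, but the principle is the same):

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int) -> int:
    """Find a nonce so that sha256(data + nonce) starts with
    `difficulty` hex zeros. The expected cost grows ~16x per extra zero."""
    nonce = 0
    prefix = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

# Finding the nonce takes thousands of hash attempts...
nonce = proof_of_work("block-42", difficulty=4)
# ...but verifying it takes exactly one
digest = hashlib.sha256(f"block-42{nonce}".encode()).hexdigest()
print(digest.startswith("0000"))  # True
```

This asymmetry – expensive to produce, trivial to verify – is what makes the work a credible economic commitment, and it is why the computational cost of public chains climbs as difficulty is raised.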

This means public blockchains often require immense computational power to maintain the ledger, which only worsens as more nodes are added, and predicting how much that will increase is difficult. Given the number of voices in the community, it’s also incredibly difficult to reach a consensus on any technical changes to a public blockchain – as demonstrated by Bitcoin’s two recent hard forks.


Private blockchains are arguably the antithesis of what the technology was originally designed for. Instead of a decentralised, open ledger, a private blockchain is entirely centralised, maintained by nodes belonging to a single organisation or entity.

It’s a novel design tweak that has allowed the technology to flourish within those organisations looking for the same streamlined transactions afforded by public blockchains, only with highly restricted access. As there are fewer participants on the network, transactions are normally cheaper and verified far quicker on private chains, and fixes to faults or network upgrades can be implemented almost immediately.

In order to share the data stored on a private chain, these networks often operate a permission-based system, in which node participants can grant read access to external parties, such as auditors or regulators looking to inspect the inner workings of a company.
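The permission model described above can be sketched in a few lines of Python. Everything here – the class name, the parties, the entries – is hypothetical, purely to illustrate the split between a single writing organisation and read-only external parties:

```python
class PermissionedLedger:
    """Hypothetical sketch of a private chain's read-permission model."""

    def __init__(self, owner: str):
        self.owner = owner
        self.readers = {owner}   # the owner can always read its own ledger
        self.entries = []

    def append(self, node: str, entry: str):
        # Only nodes belonging to the owning organisation may write
        if node != self.owner:
            raise PermissionError("only the owning organisation can write")
        self.entries.append(entry)

    def grant_read(self, node: str, external_party: str):
        # The owner grants read access to e.g. an auditor or regulator
        if node != self.owner:
            raise PermissionError("only the owner can grant access")
        self.readers.add(external_party)

    def read(self, party: str):
        if party not in self.readers:
            raise PermissionError(f"{party} has no read access")
        return list(self.entries)

ledger = PermissionedLedger(owner="acme-corp")
ledger.append("acme-corp", "Q3 settlement batch")
ledger.grant_read("acme-corp", "auditor")
print(ledger.read("auditor"))  # ['Q3 settlement batch']
```

Real permissioned platforms enforce this with certificates and membership services rather than an in-memory set, but the access-control shape is the same.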

Unfortunately, as there are fewer nodes maintaining the blockchain, it can’t offer the same high levels of security afforded by decentralised chains.


‘Consortium’ is best described as the ‘hybrid cloud’ of blockchain. It provides the robust controls and ‘high trust’ transactions of private blockchains, only without being confined to the oversight of a single entity.

It sits somewhere in the middle. Consortium blockchains provide the same limited access and high efficiency afforded by private blockchains, but dedicated nodes are set aside to be controlled by external companies or agents, rather than those parties having only read access as they would on a private blockchain.

The easiest way to understand how it differs is to think of consortium blockchains as the equivalent of a council group – with each member having responsibility for maintaining the blockchain, and each having permissions to give read access.

Given its collaborative design, it’s a perfect solution for supporting the work of government committees or industry action groups where a number of companies may come together to tackle an issue – whether that be industries working to combat climate change or maintaining a shared ledger to support the work of the United Nations.

Blockchain vs Distributed Ledger Technology

The term ‘blockchain’ is often deployed to refer to a host of similar yet different technologies, and is often wrongly used to describe any decentralised distributed database. Blockchain is, in reality, only one form of the emerging distributed ledger technology (DLT).

DLT is a form of technology comparable to a database, but distributed across multiple physical sites and locations, however near or far from one another. The purpose is to avoid relying on a centralised storage system, or on a middleman to authorise and record changes to the records. Because there is no central authority, any requested change demands approval from nodes across the DLT network.

This concept is being adopted by businesses and organisations at a fast pace, and across various industries. This is not just an innovation developed and taken up by tech companies, but sectors like manufacturing and finance.

There are a number of formats in which DLT arises, but the central idea of distributed control is at the heart of all of them. One form of distributed ledger, for example, allows data to be stored on separate nodes – such as banking records beginning with each letter of the alphabet dispersed among different locations. Rather than being replicated to each location, as in a traditional database, the data is spread across parts of the network.

Blockchain simply refers to one iteration of this form of technology, more specifically, a data structure that takes the shape of entries stored in blocks. This form of structuring data offers an element of synchronisation between parts of a network – and it’s essential for supporting innovations like Bitcoin.

Despite its success as the building block of currencies like Bitcoin, a system doesn’t necessarily need miners and tokens to qualify as a blockchain – the term simply refers to the structure of arranging data into blocks. Blockchains, as a result, are decentralised ledgers where data is replicated to every node rather than distributed across them.

Unfortunately, the frequency at which blockchain and distributed ledger are used interchangeably has created confusion over the technology as a whole, leading many to dismiss blockchain as simply a tool for Bitcoin.

Encrypting a small business: Why remote working could be your blindspot

David Howell

15 Jan, 2020

In a business environment where cybercrime continues to pose a real and present danger to businesses of all sizes, paying close attention to how data and devices are protected is now of paramount importance. As Werner Vogels, Amazon’s chief technology officer, recently put it, we need to “encrypt everything”.

There has perhaps never been a more urgent time to look at encryption strategies. Government research from 2018 revealed that over two in five businesses (43%) have identified security breaches in their systems in the last 12 months. Some of the most common attacks included staff receiving fraudulent emails (75% of those breached), individuals impersonating the organisation online (28%) and viruses and malware (24%). What’s more, security breaches on average cost organisations £894 per incident over the past year.

Legacy systems such as desktop PCs and servers generally use high levels of encryption. However, mobile digital devices often use reduced levels of encrypted security, if indeed they use any encryption at all. According to Sophos, only a third of businesses encrypt the smartphones and tablets they hand out to employees.

Then there’s the cloud to consider, which has become a new battlefield in the fight against cyber crime. As cloud adoption has increased, businesses have slowly handed off the responsibility for encrypting data to service providers that are themselves becoming a favoured target for cyber criminals.

Businesses understand that their customer data, in particular, must be encrypted. Highly regulated industries, such as financial services, have long used strong encryption to meet their compliance responsibilities, with other sectors reacting to high-profile security breaches by enhancing their use of encryption tools and protocols.

For example, the Payment Card Industry Data Security Standard (PCI DSS) has strict requirements on how merchants must employ encryption to protect stored cardholder data. The Data Protection Act 2018 and the General Data Protection Regulation (GDPR) both make it mandatory for businesses to take practical steps to protect customer data.

Data dispersal

However, companies are seeing that work is changing and that modern workplace practices, such as remote working, are creating new challenges when it comes to protecting data. Many businesses now operate with a highly dispersed workforce, one that still requires secure lines of communication to the office.

Some technologies have helped in this regard. Virtual private networks (VPNs) that use built-in encryption protocols are now becoming widespread, particularly across the small business community because of their relatively low cost and efficient deployment.

Yet this dispersal of employees is often a “barrier to a successful encryption strategy”, according to findings from the Ponemon Institute’s 2019 Global Encryption Trends report, with many businesses unable to pinpoint where their sensitive data resides.

Some 69% of those surveyed said that data discovery was their biggest headache when it came to encrypting data, 42% found difficulties when first deploying new technologies, and 32% said they struggled to identify what data they should be encrypting as a priority.

For Martin Whitworth, research director of European Data Security and Privacy for IDC, businesses need to have an understanding of the application of encryption, specifically what it can and can’t do.

“It is important for all organisations to have a stance, and policy, on encryption,” says Whitworth. “However, this should not just be shelfware – it must reflect a well thought out position. In fact, one of the real benefits of developing an encryption policy is that it should drive a greater understanding of the topic, what it can do and what it can’t do.” 

He adds that even where businesses do have encryption policies in place, these often fail to fully protect data once it has been transmitted to remote workers outside the organisation’s firewall.

“Most small businesses are probably already using encryption – specifically encryption of data in transit, via their use of ‘secure’ web sites (SSL/TLS) and possibly VPNs for remote access,” Whitworth adds. “But they should also be seriously looking at encryption of data at rest; whether this is full disk encryption of laptops and/or smartphones to protect the sensitive data that they have.”

Despite there being an abundance of security tools available for businesses of all sizes, he believes that many of these are “off-putting to small businesses” as they are “not easy to integrate with existing applications”.

“What is often missing are the skills and knowledge to implement, maintain and operate them appropriately,” adds Whitworth, something which hits small businesses the hardest.

Understanding the basics

Despite the challenge facing small businesses, it’s possible to simplify the process of encryption, provided you have a well-defined and communicated policy across your business. Data is now your business’s most precious commodity – a commodity that must be protected.

The Ponemon Institute research found that 44% of businesses performed encryption on-premise before sending data to the cloud, using keys their organisation generates and manages. However, 35% of respondents perform this encryption in the cloud, with cloud providers generating and managing those keys. Some 21% of respondents are using some form of Bring Your Own Key (BYOK) approach.

Regardless of the favoured approach to encryption, there are basic steps that all businesses should be taking. “Encryption is no longer an additional expense, it’s something you can enable on most new devices,” explains Oscar Arean, technical operations manager at Databarracks.

“A password on a laptop doesn’t make the data secure. It’s relatively easy to get access to the data either on the laptop or by removing the disk itself. BitLocker is a good start on new Windows laptops, while Macs have FileVault. Neither is enabled by default, however, so the first and most important step is actually to enable encryption.”

David Sutton is the author of Cyber Security: A practitioner’s guide, published by BCS. His advice is provided in a private capacity and doesn’t necessarily reflect the views of BCS. He believes that encryption can be turned into a fairly straightforward exercise for small businesses, but that they should be aware of the added restrictions it could place on day-to-day operations.

“Most commercial encryption software is (or has a product) suitable for small business use,” explains Sutton. “For file and disc encryption, there are really no cons.”

However, he adds that “for email encryption, both sender and receiver must operate the same encryption standard, which can lead to complications when dealing with other organisations who already operate different systems. On the pro side, it’s normally win-win on all types.”

How to use encryption

Having a full understanding of the data landscape across your enterprise will help you figure out what types of encryption you need. When data is at rest – stored on hard drives, servers or mobile devices, for instance – file or full-drive encryption should be considered.

It’s when data is in motion that encryption becomes even more vital. When data moves over your business’s network or out onto the wider internet, it must have some form of encryption. It’s likely your business has continued to expand its use of the cloud in some capacity and is probably developing hybrid cloud deployments. If that’s the case, data must be encrypted at rest as well as when it’s being transmitted.
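For data in motion, much of the work is using TLS defaults correctly. As a sketch, Python's standard-library ssl module creates a context that already enforces certificate validation and hostname verification out of the box:

```python
import ssl

# A default context enforces certificate validation and hostname checks,
# and refuses protocol versions below the library's configured minimum.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True

# Wrapping any TCP socket with ctx.wrap_socket(sock, server_hostname=...)
# yields an encrypted channel; HTTP libraries use this under the hood.
```

The design point is that applications should not hand-roll these settings: weakening `verify_mode` or `check_hostname` is what turns “encrypted” traffic into traffic that is trivially intercepted.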

Ramon Krikken, research VP Analyst at Gartner, tells IT Pro: “Encryption is considered a baseline control and often provides a first technical step in compliance programs. Encrypted communications, such as TLS (Transport Layer Security), provide a strong control.

“Data-at-rest encryption is more challenging,” he adds, “because the layer at which it is deployed determines how much protection it provides – it’s but a small part of a larger control set that includes monitoring and access control. In addition, encryption key management for data-at-rest encryption is a critical element, because losing the keys means losing the data.”
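The point about key management can be made concrete with a small standard-library sketch: at-rest encryption keys are commonly derived from a secret using a KDF such as PBKDF2, and without the exact secret and salt the key – and therefore the data it protects – is unrecoverable. The passphrases below are illustrative only:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256: a standard, deliberately slow key derivation
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)

# The same passphrase and salt always reproduce the key...
print(derive_key("correct horse battery staple", salt) == key)  # True

# ...but a near-miss passphrase (or a lost salt) yields a completely
# different key, leaving the encrypted data permanently unreadable.
print(derive_key("correct horse battery stapler", salt) == key)  # False
```

This is why key backup and escrow procedures matter as much as the cipher itself: losing the keys means losing the data.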

Of course, the quality of any encryption policy comes down to how keys are generated, applied and managed. For larger businesses, this is somewhat of an easier task despite the quantity of data that needs to be encrypted. Cryptography is often managed by in-house experts equipped with expensive hardware and software.

These resources aren’t something that’s typically available to small businesses, and investing in in-house expertise isn’t usually feasible. As a small business, you’ll likely find yourself working more closely with service providers. However, if that isn’t an option that works for you, you can call upon key management products that are provided as a service. These tend to give you more control over encryption keys, but generally, it’s more difficult to maintain full control unless you have the resources to do so.

What has become clear for all business owners is that encryption must form a fundamental component of their data security policies. Where data is stored, who has access and, importantly, how data is protected in transit and at rest all require strong encryption protocols.

The use of mobile devices has also moved the perimeter of the security environment businesses have to manage outside of the control of their premises. Ensuring all data communications use strong encryption is now critical to meet data protection and regulatory privacy requirements.

Also, don’t forget your staff. Consistently, one of the weakest links in any security system is the people handling the data. Ensure your business provides detailed, ongoing education and training covering the encryption tools you use, so that they are always applied correctly rather than avoided or forgotten.