All posts by Cloud Pro

Why bare metal infrastructure is the future of the online media industry

Cloud Pro

15 Dec, 2021

Content providers hoping to compete in today’s ferocious online media market need a server infrastructure that can handle enormous volumes of data on an hourly basis, while also being able to scale as demand grows.

To truly understand what online media companies are up against, let’s take Twitch as an example. The live streaming platform, popular among gamers and vloggers, generated around 1,400 billion minutes of video content in 2021*, serving an average of around 2.76 million concurrent users. Given that an average 720p HD stream consumes around 900 megabytes per hour on a server, we can say that Twitch’s servers handled roughly 46.5 million gigabytes every day throughout the year.
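That estimate can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below uses only the figures quoted above; the exact result depends on the bitrate and rounding assumed, so it lands in the same tens-of-millions range rather than on a precise figure:

```python
# Back-of-the-envelope estimate of Twitch's daily data volume,
# using the figures quoted above (all values are rough approximations).

MINUTES_PER_YEAR = 1_400e9   # ~1,400 billion minutes of video in 2021
GB_PER_HOUR_720P = 0.9       # ~900 MB per hour of 720p video

hours_per_year = MINUTES_PER_YEAR / 60
gb_per_year = hours_per_year * GB_PER_HOUR_720P
gb_per_day = gb_per_year / 365

print(f"{gb_per_day / 1e6:.1f} million GB per day")  # prints "57.5 million GB per day"
```

Whichever assumptions you plug in, the daily volume comes out in the tens of millions of gigabytes.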

Of course, any content provider hoping to overtake a platform like Twitch needs to match, or even exceed, the company’s server capabilities. However, the market is now full of media giants all dealing with similar volumes of data each day, and just staying relevant can be a challenge. Even the smallest degradation in performance can, in the minds of customers, render your platform inferior to rival services.

Bare metal servers are by far the most efficient and most reliable solution for dealing with this extraordinary demand, whether on their own or as a node in a wider infrastructure strategy. The single tenancy approach means you never have to worry about fluctuations in bandwidth, even when your demand scales.

What are bare metal servers?

A bare metal server is a highly customisable physical server that is utilised by a single tenant, i.e. only one customer at a time.

The ‘bare metal’ moniker refers to the idea that the customer benefits from direct access to the hardware, including the storage and memory, without having to deal with pre-installed operating systems or hypervisors. The fact that bare metal servers operate with only one tenant means customers get sole ownership of the server’s resources, avoiding the problem of performance fluctuations during high demand from other users.

You may be thinking that bare metal servers sound an awful lot like dedicated servers, and you’d be right. The two share a number of similarities, including sole tenancy and complete access to hardware. However, bare metal might be best thought of as the next generation of server deployment, and is usually paired with the latest in processor, storage, and memory technology.

Another subtle deviation from dedicated servers is that bare metal is typically far more flexible when it comes to payment options. Hosting companies will usually provide dedicated servers on yearly or monthly contracts – with bare metal servers it’s a case of monthly or hourly billing, although cost will vary depending on the package you take. This makes bare metal incredibly useful for those customers that deal with fluctuations in demand, whether that’s a retail site offering Black Friday sales, or a box office app selling tickets for a popular show.
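To see why hourly billing suits bursty demand, consider a simple, entirely hypothetical comparison between a flat monthly contract and an hourly tariff; the prices below are invented for illustration:

```python
# Hypothetical comparison of monthly vs hourly bare metal billing for a
# bursty workload. All prices and hours are illustrative, not real tariffs.

MONTHLY_FLAT = 400.0   # flat monthly price for a comparable dedicated server
HOURLY_RATE = 0.90     # hourly rate for an equivalent bare metal server

def cheapest_option(hours_needed: float) -> str:
    """Return which billing model is cheaper for a given usage level."""
    hourly_cost = hours_needed * HOURLY_RATE
    return "hourly" if hourly_cost < MONTHLY_FLAT else "monthly"

# A short Black Friday burst: ~5 days of extra capacity
print(cheapest_option(5 * 24))    # 120h * 0.90 = 108.00 -> "hourly"

# Sustained 24/7 demand for the whole month
print(cheapest_option(30 * 24))   # 720h * 0.90 = 648.00 -> "monthly"
```

The crossover point depends entirely on the tariffs on offer, but the shape of the trade-off is the same: short spikes favour hourly billing, sustained load favours a contract.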

When it comes to deployment, bare metal servers can shine as either a sole hosting option or in a supporting role. A company may rely solely on bare metal, or utilise multiple types of hosting with bare metal acting as a highly configurable overflow. Given that bare metal supports multiple operating systems, including hypervisors that usually come with a set of tools for linking to other parts of a network, and that it’s simple to provision with automated deployment, it’s no surprise that it’s becoming the go-to option in a myriad of situations.

G-Core Labs is one such provider that’s leading the way when it comes to simplified provisioning. Its bare metal as a service platform is built around the principle of automation, allowing customers to provision equipment, configure hardware and spin up new dedicated servers all through APIs. Simply put, this offers scaling potential that’s difficult to challenge.
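G-Core Labs’ real API isn’t reproduced here, but the general shape of API-driven provisioning is easy to sketch. Everything in this snippet (the endpoint, field names and flavour identifiers) is invented for illustration and is not the provider’s actual interface:

```python
import json

# Illustrative only: the endpoint and field names below are invented for
# this example and are NOT a real provider's API.

def build_provision_request(region: str, cpu: str, ram_gb: int, storage: str) -> dict:
    """Assemble a provisioning request for a hypothetical bare metal API."""
    return {
        "method": "POST",
        "url": "https://api.example.com/v1/baremetal",  # placeholder endpoint
        "body": json.dumps({
            "region": region,
            "flavor": {"cpu": cpu, "ram_gb": ram_gb, "storage": storage},
            "os_image": "ubuntu-22.04",
        }),
    }

req = build_provision_request("eu-west", "xeon-icelake-32c", 128, "2x960GB-ssd")
print(req["method"], req["url"])
```

The point is that the whole lifecycle, from choosing hardware to installing an OS image, becomes a request your own automation can issue, which is what makes this model scale.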

Why is bare metal so useful to online media platforms?

Protecting one of the most-targeted industries

Security is one of the most challenging aspects of operating a business online, and no matter what industry you operate in, or the size of your company, you’re almost guaranteed to experience a cyber attack in one form or another.

Those in the media industry need to be on higher alert than most businesses, with IBM ranking it the 8th most-targeted industry in 2020**.

Bare metal is one of the most secure forms of server infrastructure available. Given that bare metal operates as a single-tenant environment, resources are allocated to, and controlled by, one customer, isolating them from any system vulnerabilities that may otherwise be present through ‘neighbouring’ users. That control also allows customers to pick and choose what operating systems and tools are deployed, adding and removing software as vulnerabilities are discovered.

Service providers will also offer additional protections on top of the secure foundations of bare metal. Enhanced protection against distributed denial of service (DDoS) attacks is one of a number of safeguards offered by G-Core Labs, which automatically redirects any suspicious traffic to a threat mitigation system (TMS). The TMS is capable of detecting DDoS attacks while also filtering legitimate traffic through to the server, based on policies configured by the customer, meaning the server can stay up and running while the attack is mitigated.
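At its simplest, policy-based filtering of this kind boils down to separating sources that behave normally from sources that flood the server. The toy sketch below illustrates the idea with an invented per-source rate cap; it is not G-Core Labs’ actual TMS logic:

```python
from collections import Counter

# Toy illustration of policy-based traffic filtering, loosely in the spirit
# of a threat mitigation system: drop sources that exceed a configured rate
# cap within a time window. The policy value and traffic are invented.

REQUESTS_PER_WINDOW_CAP = 100   # per-source policy limit (illustrative)

def filter_traffic(requests: list[str]) -> list[str]:
    """Keep requests from sources under the cap; drop the rest."""
    per_source = Counter(requests)
    return [src for src in requests if per_source[src] <= REQUESTS_PER_WINDOW_CAP]

traffic = ["10.0.0.1"] * 5 + ["203.0.113.9"] * 500   # one normal source, one flooding
legitimate = filter_traffic(traffic)
print(len(legitimate))   # prints 5: the normal requests survive, the flood is dropped
```

Real mitigation systems are far more sophisticated, of course, combining signatures, behavioural analysis and customer-defined policies, but the principle of separating and dropping abusive traffic before it reaches the server is the same.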

Server performance

The biggest advantage of any kind of dedicated server is that it can handle the sort of resource-intensive tasks that would otherwise be difficult to support on virtual machines shared with other users. Bare metal, as you might imagine, is considered the best option in this regard.

In fact, side-by-side comparisons show that performance can be as much as 17% lower on virtual machines than on bare metal, according to a benchmark report published in January in the journal Applied Sciences†.

The main reason for this is that bare metal offers pure hardware, with none of the software that comes preinstalled on virtual machines. It’s up to the user what is installed and how it’s all configured, meaning you can create a bespoke system that’s suited perfectly to your needs as a business, free of any unnecessary layers that may otherwise be a drain on resources. You get precisely what you pay for and you know you’ll get every last drop of performance from the hardware configuration you choose.

What hardware is available depends on the service provider, however. G-Core Labs is one such provider that has started integrating high-performance hardware to support its customers, including NVMe disks and Intel Xeon Scalable Ice Lake processors.

Opportunities to cut costs

Keeping control of resources is a challenge for most businesses, and this is especially true for those running online media platforms requiring 100% uptime and 24/7 availability. Dedicated servers provide a sure-fire way to optimise costs, and bare metal servers improve your options even further.

Bare metal gives you the opportunity to create a bespoke server that fits your needs exactly and, most importantly, you only pay for what you need. You can then monitor the performance of your platform and expand or shrink your server footprint as and when you require. Simply put, you will never feel like you’re paying more than you need at any given time. This degree of flexibility is incredibly important for media companies that will naturally experience fluctuations in daily traffic, and enormous spikes in activity around the release of popular content.

Depending on the packages offered by the service provider, it’s also possible to configure traffic and bandwidth on top of raw server performance. Changing to a new tariff can usually be done without additional cost to the customer. 

For example, G-Core Labs’ dedicated server tariff allows customers to customise their deployments, including the option to install RAID or change its type, increase storage volumes and the number of deployed disks, increase memory (RAM) size, opt for solid state drives (SSDs) or hard disk drives (HDDs) based on need, and install 10Gbit/sec network cards. This level of customisation means you never have to pay for something you don’t need.

Even when compared to traditional dedicated servers, the benefits of bare metal are clear. With price plans available on an hourly basis, it’s a level of flexibility and cost predictability that’s difficult to replicate.

Learn more about G-Core Labs’ services



What should you really be asking about your remote access software?

Cloud Pro

17 Nov, 2021

Of all the tools underpinning modern IT management, VNC may be one of the most prolific. First developed in the mid-1990s by the team behind RealVNC, it has spent the past two and a half decades enabling IT teams to access remote systems throughout their estate, and has found its way into the toolbelt of every support technician.

It may seem like a simple tool – and in many ways, it is. The underlying RFB protocol that powers VNC-based remote access tools hasn’t changed all that much since it was first introduced, and although its open-source origins have resulted in a huge number of different VNC solutions, they all share most of the same core capabilities. 

This can lead many organisations to treat remote access software as “part of the furniture”; something that’s useful to have around, but not worth giving any particular thought or consideration to. Indeed, when selecting a VNC solution, many IT professionals simply gravitate towards the first option that comes to mind.

However, while remote access software may not necessarily be a transformative part of your IT stack, selecting a provider should be given a significant amount of scrutiny. Not only is it a foundational part of many operational tasks, but making the wrong choice can also have serious consequences further down the line, and there are a number of important questions that any IT department should be asking of potential partners.

For instance, certain options may present unforeseen logistical challenges, like platform compatibility. Just because your chosen flavour of remote access software allows technicians to access Windows desktops doesn’t necessarily mean it’s going to play nicely with every operating system, and that could be a problem if you also need to access Linux-based servers. Some providers will even support connections from mobile devices, giving technicians remote access while on the go.

Deployment is also something that bears consideration. If you’re managing a large fleet of devices, you’ll need a remote access provider that supports automated remote configuration and deployment. If not, support staff will be faced with the tedious prospect of manually installing agents on every machine, one by one. Alternatively, you may want to opt for a provider that offers agentless, on-demand connections to reduce on-device footprints.

Connectivity, meanwhile, is frequently one of the most frustrating elements of IT support, particularly with remote access software that involves laboriously configuring firewalls and port access rules to permit connections. Modern enterprise-level vendors like RealVNC, by contrast, offer cloud-based connection brokering that bypasses fiddly firewall customisation, as well as direct peer-to-peer connections for high-security, privacy-conscious or offline environments.

You should look at performance, too. While the fundamental technology underpinning VNC may not have changed all that much since its inception, it isn’t always implemented as efficiently as it could be, so pay attention to any performance guarantees offered by vendors to ensure that your remote sessions are as seamless as possible.

The biggest area of focus, however, should be security. The purpose of remote access software is to give IT teams an easy way to interact with systems from wherever they are, including back-end servers as well as employee desktops and laptops. By definition, then, this software will likely be installed on most – if not all – of a company’s machines. 

While this is convenient for providing support to colleagues, it can be a double-edged sword under the wrong circumstances. If your technicians can access every computer in your estate from anywhere in the world, it also means that if an intruder gains access to your systems, then they can too. Think of it like giving your neighbour a spare key to your house; it can come in handy for a great many things, but you have to be absolutely certain that you trust them to keep it safe.

This isn’t just about making sure that potential suppliers provide robust privilege management and account authentication options – although these are essential for preventing account takeovers and exploitation by insider threats. It’s also about making sure you trust the security of the software itself.

Open source technology has a reasonably good record for security, but instances like the Heartbleed OpenSSL bug prove that it’s not immune from being compromised. Commercial vendors, on the other hand, can take the foundation created by open source protocols and layer additional protections on top of it, as well as proactively monitoring for potential bugs and vulnerabilities before they become an issue.

Of course, that’s not to say that vendors should be implicitly trusted. Over the last several years, we’ve seen a number of high-profile supply chain attacks on vendors like Kaseya, SolarWinds and others, which have allowed hackers to smuggle malware into customers’ environments. It’s therefore prudent to treat all vendors with a healthy dose of scepticism and to make sure that you’re comfortable with the level of security they offer, as well as the data they’re collecting and storing from your activity.

RealVNC’s offering, VNC Connect, has been built from the ground up for business deployments, specifically tailored to meet these needs. It offers a wide range of automated deployment options, integration with existing enterprise tools, comprehensive platform support and a battery of security protections, including 256-bit AES encryption, multi-factor authentication and granular permissions controls.

It’s also built by the original authors of the RFB protocol, so it’s optimised for performance with patented technology to help keep connections stable even when faced with low bandwidth. It also includes printing, chat and file transfer functionality, and a robust management console.

Remote access software may not be the newest technology in the world of IT, and it’s certainly not the sexiest – but that doesn’t make it any less important. It’s one of the most versatile and widely-used IT tools in the world, and it deserves to be chosen with an appropriate level of care. You wouldn’t leave the keys to your house under the doormat – so don’t do the same with the keys to your IT estate.

Learn more about RealVNC’s services

Why the financial industry is turning to the cloud

Cloud Pro

25 Oct, 2021

It should come as no surprise that the financial services industry is vast. According to data from Research and Markets, the sector is expected to reach a global value of more than $22 trillion by the end of 2021. It’s home to a huge range of organisations, encompassing everything from traditional banks, lenders and insurance companies, to payment providers, wealth management firms and more.

All of these organisations have one thing in common: they rely on immense technical capabilities in order to run their businesses. Financial services organisations have to process vast amounts of data in order to track things like investment trends, market conditions and credit ratings, and all of these analytical processes have to be as close to real-time as possible. After all, time is money.

Historically, this has meant that the financial services sector has almost exclusively been the preserve of giant monolithic organisations, or those with sizeable amounts of pre-existing capital. This is because establishing these technical capabilities traditionally involves significant investment, in the form of data centre equipment and personnel. 

Not only do you need large quantities of high-end server equipment to perform the necessary analytics tasks, you also need storage and networking infrastructure to support it, data centre space to house it in (along with the attendant cooling, power and maintenance costs that go along with it) and a team of highly skilled technical staff to ensure that your data centre remains operational and performant.

That all adds up. Modern cloud platforms like G-Core Labs, however, have opened the financial services market up to organisations that don’t have these resources. Since the turn of the century, the financial services space has exploded with fintech startups, most of whom have leveraged cloud technology to quickly establish their services without needing huge capital investments. This includes household names like PayPal and Venmo, as well as new digital-native challenger banks, boutique lenders and even insurtech firms like TempCover.

Many startups have used the ‘minimum viable product’ approach when designing their applications, which fits well with the cloud model. This method involves focusing on a single, small-scale product or feature, then expanding it over time – which means that the cloud infrastructure required to deliver the service to customers is comparatively cheap and easy to manage. Instead of needing a big, expensive server deployment – which, chances are good, you won’t be fully utilising – you can spin up as much cloud capacity as needed and pay for it on a consumption-based model.

In many cases, this approach also allows cloud-based financial services organisations to be more agile than their more established traditional counterparts, as they can build and test new capabilities much faster. Rather than having to wait for server resources to become available, the combination of highly elastic cloud infrastructure and containerised applications allows for rapid iteration and deployment of new features, which lets financial services companies respond rapidly to the ever-changing needs of the market.

The Royal Bank of Canada, for instance, has used cloud infrastructure since 2018 to speed up the development of its software products. The project is not just an optimisation exercise, but part of the bank’s global strategy to shift business towards its data-driven enterprise segment.

The cloud has allowed financial services organisations both old and new to easily leverage another key capability, in the form of big data and AI applications. This emerging technology has enabled financial organisations to rapidly speed up a number of processes, including automatically flagging potentially fraudulent transactions, automating credit reporting and analysing market trends.

For example, Shanghai-based SPD Bank has been using the cloud since 2017 to develop and implement more than 60 applications – including some critical applications – that use artificial intelligence. This enabled it to become an honorary member of the Cloud Native Computing Foundation in acknowledgement of its active use of cloud technology in application development.

This function of the cloud gives organisations the ability to offer new products and services to their customers, as well as introducing cost savings by freeing up employees to focus on more complex and nuanced tasks. Without low-cost, high-capacity cloud resources, however, the infrastructure needed to support these workloads would make them prohibitively expensive for many organisations. That’s why G-Core Labs has introduced a cloud AI platform that gives customers access to ready-made machine learning models and templates to speed up development of these applications.

It’s not just about spinning up new services, either – the highly scalable nature of cloud platforms makes them ideally suited for coping with fluctuations in server load. Financial systems in particular require an extremely high level of stability, and being able to rapidly add additional server capacity minimises the chances of an unexpected and costly outage.
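The basic mechanic of scaling against load can be sketched in a few lines: estimate how many servers a given request rate needs, keeping some headroom so capacity is added before existing servers saturate. All of the numbers here are illustrative assumptions:

```python
import math

# Minimal sketch of threshold-based scaling: add capacity when projected
# load exceeds what current servers can absorb. Numbers are illustrative.

CAPACITY_PER_SERVER = 1_000   # requests/sec one server can handle (assumed)
HEADROOM = 0.8                # scale before servers hit 100% utilisation

def servers_needed(load_rps: float) -> int:
    """Return how many servers keep utilisation under the headroom target."""
    effective = CAPACITY_PER_SERVER * HEADROOM   # 800 rps usable per server
    return max(1, math.ceil(load_rps / effective))

print(servers_needed(500))     # quiet period -> 1 server
print(servers_needed(12_000))  # sudden spike -> 15 servers
```

On a cloud platform the output of a calculation like this feeds directly into provisioning, which is exactly what makes absorbing an unexpected spike fast and an outage less likely.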

The cloud also allows for rapid expansion, as multi-region cloud providers like G-Core Labs allow services to be extended to new geographies at the push of a button. Rather than renting space in a new data centre and installing appliances, existing cloud systems can simply be replicated and placed in the new territory with little to no additional configuration. On top of this, organisations can choose where they hold their data to meet compliance regulations, and cloud platforms like G-Core Labs hold various certifications to ensure regulatory standards are adhered to.

Security is a top priority for every financial business, and many organisations have chosen to adopt hybrid cloud strategies, allowing them to keep their most sensitive data on-site while also taking advantage of the benefits of public cloud. Alongside a range of robust security protections like comprehensive backup, audit and disaster recovery functionality, G-Core Labs also offers customers the option of maintaining a secure loop within their own security perimeter for additional peace of mind.

The time of big, all-pervading financial monoliths is over. New technologies have levelled the playing field, and nimble startups are taking the opportunity to outmanoeuvre and outperform their legacy competitors. For financial services organisations that want to remain at the cutting edge of data efficiency and customer satisfaction, cloud platforms like G-Core Labs offer the key to ensuring that digital transformation and agility is at the heart of your business.

Learn more about G-Core Labs’ services

How the cloud is supporting retailers in the age of Black Friday

Cloud Pro

30 Sep, 2021

Anyone who’s worked in retail will tell you that it’s often far from easy. That doesn’t just apply to those on the shop floor, though – running a retail business can present a range of challenges for those working behind the scenes, too. It’s an incredibly fast-moving industry, with myriad shifting patterns that can see customers ebb and flow on an almost daily basis.

In such a fluid sector, the demands on IT departments are high. For ecommerce businesses, infrastructure has to be stable, performant and available at all times, and user experience is of the utmost importance. This can be a tough balancing act, particularly for organisations that only have a limited technical workforce, and these pressures have driven many companies into the cloud.

Over the last decade, the world has been increasingly shifting towards online shopping. In fact, according to figures from Statista, global ecommerce sales have grown by more than 200% since 2014 and are projected to surpass $6 trillion by 2024. That growth in demand has driven a vast proliferation of channels, moving from desktop websites to mobile sites, dedicated apps and even social media storefronts – all of which require time, talent and resources to maintain.

Online shopping has also sharpened the effects of seasonal peaks and troughs. Alongside traditionally busy periods like Christmas, summer and Easter, increasing globalisation has added new entries to the retail calendar, including Black Friday and Cyber Monday. As shoppers eagerly flock to their favourite store pages in search of bargains, the infrastructure behind them has to cope with a sudden influx of traffic which can be orders of magnitude higher than the normal average.

As if that wasn’t enough, the pandemic threw further complications into the path of retailers with national lockdowns and stay-at-home orders. Almost overnight, businesses were forced to transition to a fully digital business model, and the legions of people easing the boredom of being stuck at home by shopping online caused problems even for digitally native organisations.

The consequence of all of this is that ecommerce businesses have had to undergo a rapid transformation in order to make sure their infrastructure is flexible, scalable and responsive enough to support these trends. Cloud technologies have been instrumental in this; when traffic to an online storefront spikes, more infrastructure capacity is needed to ensure that visitors aren’t put off by poor performance and long load times. However, in an on-premises environment, adding more capacity means installing and spinning up more physical appliances.

A cloud-based model, on the other hand, allows extra capacity to be quickly and easily added as it becomes necessary, and because it’s charged on a consumption basis, you can turn it off again once the spike passes. Content delivery networks (CDNs) such as G-Core Labs’ can be particularly helpful here, automatically performing load-balancing duties to ensure that traffic is spread out over as many servers as necessary in order to ensure stable performance. This scalability and elasticity makes cloud infrastructure like G-Core Labs Cloud significantly more cost-effective for dealing with unexpected surges than traditional servers.
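The load-balancing idea at the heart of a CDN can be illustrated with the simplest possible strategy, round robin, which hands each incoming request to the next server in rotation. Real CDNs also weigh in latency, geography and server health, so treat this as a minimal sketch with invented server names:

```python
from itertools import cycle

# Simplified round-robin distribution: the basic idea behind load balancing
# is spreading incoming requests evenly across the available servers.
# Server names are invented for the example.

servers = ["edge-1", "edge-2", "edge-3"]
rotation = cycle(servers)   # endlessly repeats the server list in order

assignments = [next(rotation) for _ in range(7)]   # route 7 incoming requests
print(assignments)
# ['edge-1', 'edge-2', 'edge-3', 'edge-1', 'edge-2', 'edge-3', 'edge-1']
```

Because no single server absorbs the whole spike, each one stays within its capacity, which is why performance remains stable as traffic surges.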

For instance, major Asian online retailer Zalora found that its infrastructure was no longer able to cope with the traffic demands placed on it. Zalora moved its entire infrastructure to the cloud and can now handle site traffic increasing by 300-400% during sales without experiencing any dip in performance.

The growth of online shopping has also opened up new markets for retailers, who can reach customers all over the world. The same is also true for their rivals, though, and a global digital economy means more competition for sales. Retailers need to be smarter about winning and retaining customers, and must rely on more than discounts to entice people to their page.

Appropriately, digital marketing technology has exploded in order to fill this need. Brands can now engage with their customers across a huge range of channels, including email, social media and instant messaging platforms, with cloud-based tools available not just to automate these communications efficiently, but to track interactions with customers across all of an organisation’s channels. This helps retailers form deeper and more meaningful connections with customers, increasing brand loyalty and strengthening the relationship.

Customer-facing technology like online storefronts and digital marketing aren’t the only tools that have drawn retailers to the cloud, however. There are many back-office and line of business roles within retail that have benefitted from the recent growth of SaaS applications, including areas like stock control, logistics management, payroll and more. It’s even helped modernise physical stores, and cloud-based AI systems can now be used to perform complex operations like measuring footfall numbers or stock levels from CCTV footage.

Arguably the most significant change that the cloud has introduced, however, is a focus on data-driven decision-making. All of the tools and techniques we’ve spoken about so far generate information about who shoppers are, what products and services they’re most interested in, when they make purchases, what device they make purchases with, and much more. All of that data can be collected, harnessed and analysed in order to increase your store’s effectiveness.

This can be something as simple as changing what time your email newsletter goes out in order to match your customers’ activity patterns, to a more in-depth change like analysing bounce rates to make your store easier to navigate. The shrewdest retailers are taking all of their available customer and sales data and combining it into what’s known as a “single customer view”, representing a near-complete picture of that brand’s customer base. Rather than investing in large data centre deployments in order to support this, however, many organisations have turned to cloud platforms like G-Core Labs in order to facilitate these efforts. The G-Core Labs infrastructure is based on Intel solutions, including the latest 3rd Gen Intel Xeon Scalable (Ice Lake) processors, ensuring enterprise-grade performance for AI and data workloads.
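At its core, a single customer view is a merge of per-channel records keyed by customer. The data and field names below are invented for illustration:

```python
# Toy illustration of building a "single customer view": merging records for
# the same customer from separate channels into one profile. All data here
# is invented for the example.

web_orders = {"alice@example.com": {"orders": 3, "last_device": "mobile"}}
email_stats = {"alice@example.com": {"newsletter_opens": 12}}

def single_customer_view(*sources: dict) -> dict:
    """Merge per-channel records into one profile per customer key."""
    view: dict = {}
    for source in sources:
        for customer, fields in source.items():
            view.setdefault(customer, {}).update(fields)
    return view

profile = single_customer_view(web_orders, email_stats)
print(profile["alice@example.com"])
# {'orders': 3, 'last_device': 'mobile', 'newsletter_opens': 12}
```

Real implementations must also reconcile customers who appear under different identifiers across channels, which is where the bulk of the engineering effort tends to go.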

For example, US department store Macy’s uses big data to create price lists for each of its 800 outlets – and is able to do so in real time using the cloud. It also uses analytics to create personalised offers for its customers, and the number of variations for a single mailing campaign can reach an impressive 500,000.

Of course, all of that data also makes an attractive target for hackers, and retailers in particular need to ensure that their cyber security practices are up to standard. Cloud-based endpoint protection systems can help safeguard back-office staff, while robust monitoring and alerting tools can flag suspicious activity on any public-facing sites. Intel SGX sensitive data protection technology is also integrated into the G-Core Labs cloud.

Distributed denial of service (DDoS) attacks are one common tactic that cyber criminals often use against ecommerce businesses, flooding sites with traffic until they crash, then offering to shut the traffic off in exchange for a payment. The perpetrators of these kinds of attacks deliberately time them around peak shopping times like Black Friday or Christmas, making them potentially one of the most financially damaging situations an ecommerce business can face. Thankfully, modern DDoS mitigation services like G-Core Labs’ DDoS protection have evolved to cope with these kinds of attacks, putting a layer in front of the site that can detect and intercept malicious traffic before the target’s infrastructure can be overwhelmed.

Online health and beauty retailer eVitamins tried multiple solutions to reduce the impact of DDoS attacks – including adopting intrusion prevention systems, blocking suspicious IP addresses and analysing logs – but failed to sufficiently reduce the disruption and cost they caused the business. It finally achieved effective DDoS protection by transferring its infrastructure to a public cloud environment.

The world of retail has changed enormously over the past decade, and it’s not going to stop any time soon. Both high street and ecommerce businesses are in a period of rapid evolution, and cloud services are essential for staying ahead of the curve and on top of your competition. Whether you want to increase customer retention, maximise transactions or simply make life easier for the technical teams keeping your business moving, G-Core Labs’ cloud technology is an essential tool in your arsenal.

Learn more about G-Core Labs’ services

Why you should modernise your systems based on need

Cloud Pro

30 Jun, 2021

The business world has been rocked by huge changes recently thanks to lockdown restrictions and new remote-working models, not to mention a complete shift in the way that consumers engage with products and services. Due to the sudden pivots necessitated by COVID-19, it’s understandable that many organisations have adopted rapid digital transformation and an increased dependence on the cloud in order to remain relevant and competitive, to improve customer experience, to satisfy stakeholders and ultimately grow revenue in the face of this disruption.

However, such rapid transformations can be associated with serious risks if not managed properly. Businesses may have been tempted to adopt a one-size-fits-all cloud solution, or to transform in a piecemeal way across multiple cloud silos without following a carefully laid-out strategy. The former can easily result in business transformation that does not fit the specific needs of your business, running counter to the goal of modernising, with the potential to harm rather than help your bottom line.

For the latter, without a strategy you can create various issues for your IT infrastructure, from fragmented systems that do not allow for monitoring and automation across cloud silo boundaries, to increased risk of security breaches and an overall complexity that hampers your ability to quickly update applications and systems to meet the needs of your business and customers.

Continuous modernisation

Developing a robust, detailed strategy is key to successful modernisation, as is understanding that there is no simple, one-size-fits-all solution you can apply to your business. Before you commit to a hasty ‘rip and replace’ strategy, it is important to consider the needs of your organisation and the best approach to meeting them.

Companies like Micro Focus can assist you in developing a winning, smart digital transformation strategy that will meet your modernisation goals and avoid potential pitfalls as you build, deliver and run your modernisation project. Micro Focus’s open, integrated, backwards-compatible software bridges existing and emerging technologies, so you can innovate faster, with less risk, as you transform. With AI-powered automation, you can manage and monitor multiple clouds, applications, data centres and networks from one portal. This offers enhanced, joined-up security, quickly highlighting any issues that arise.

Evolving your IT practices – without jeopardising critical business systems and processes developed over decades – is a constant balancing act. That is why bridging new and existing applications and infrastructure is particularly important. Evidence is emerging to suggest that modernisation projects using an incremental and continuous improvement model are more likely to achieve positive results than more drastic approaches like ripping and replacing your core business applications. The ‘Endless Modernization’ research authored by Micro Focus and The Standish Group found that companies choosing to modernise an existing application rather than fully replace it had a 71% success and 1% failure ratio, compared to a 26% success and 20% failure ratio for those choosing to scrap a software application and start from scratch.

A smart approach to digital transformation allows businesses to transform as they continue to run core applications, modernising continuously and incrementally to strike a balance between innovating and maintaining proper business operations. This incremental approach endorsed by Micro Focus allows you to maintain continuity during your ongoing modernisation work and, ultimately, execute business transformation based on the needs of your organisation, rather than rushing to make it happen as quickly as possible and ending up with a solution that doesn’t properly meet your requirements. And unlike unstructured, piecemeal approaches, this type of transformation is backed by a robust strategy that maintains unity and control across your IT infrastructure.

Beyond the cloud

While the cloud has proved to be a powerful tool in modernisation – particularly when it comes to the dispersed-working models that have gained dominance since 2020 – it is important to understand that transferring all your infrastructure to the cloud may not be the best solution for your business. In a modernisation project built on the needs of your business, a fully cloud-based solution might not give the right results. It’s important to consider the modernisation route that will really suit your business best, whether that’s mainframe-centric, cloud-first, a hybrid approach, DevOps-driven or service-oriented.

Remember that mainframe is just as viable as any other platform when it comes to modernisation. While the cloud is both fast growing and the most high-profile solution today, AWS reports that more than 70% of Fortune 500 companies still run their business-critical apps on mainframes – and for good reason. These adaptable, resilient incumbent systems have been steadily modernised and built upon over the years, becoming one of the most trusted elements of the IT team’s arsenal.

Of course, mission-critical mainframe applications need to be kept up-to-date with modern business demands, but that doesn’t necessitate a ‘rip and replace’ approach. Part of building on existing successes is recognising what is working and using that as a cornerstone to develop a long-term strategy. Using its modernisation maturity model, Micro Focus can support your preferred modernisation route and advise you on the most efficient and effective way of modernising your mainframe without disrupting business operations.

Working with a transformation partner like Micro Focus enables you to extend the value of your existing technology investments. Every business’s needs are different. Before committing to a wholesale ‘rip and replace’ approach, it’s important to assess your organisation’s requirements. Armed with this understanding, you can execute an incremental and continuous modernisation approach to achieve smart digital transformation by building on your existing infrastructure.

Learn more in Micro Focus’s Race to the Cloud ebook 

IT Pro 20/20: A quantum leap for security

Cloud Pro

30 Jun, 2020

Welcome to the sixth issue of IT Pro 20/20, our sister title’s digital magazine that brings all of the previous month’s most important tech issues into clear view.

Each month, we will shine a spotlight on the content that we feel every IT professional should be aware of, only in a condensed version that can be read on the go, at a time that suits you.

Cyber security has proven to be one of the most challenging facets of the lockdown, with organisations having to maintain a robust posture at a time when workers are dispersed and often outside the company firewall. Fittingly, June’s issue of IT Pro 20/20 is all about cyber security. We’ve pulled together stories that examine the current state of the industry, including how current technology is being used and how future trends are likely to reshape our understanding of cyber security.

Our lead feature looks at the rise of quantum computing, still a fledgling area of the tech industry but one that promises to upend cyber security as we know it. Yet these remain promises, and it’s unclear whether we will ever see the future that proponents of the technology envisage.

Turning to trends that are a little more pressing today, we also share an industry hacking story that should serve as a lesson in how not to get hacked. I won’t spoil the story here, but I will say it involves a LinkedIn account, a gullible PA, a chief executive’s shoe size, and Tottenham Hotspur Football Club.

In our last exclusive article, we question whether there is, in fact, any weight at all behind the idea that remote working poses a danger to a business’s cyber security, or if these threats have been somewhat exaggerated. It’s likely to be a contentious issue for many, so we’ll leave it up to you to decide.


The next IT Pro 20/20 will be available on Friday 31 July. Previous issues can be found here.

We hope you enjoy reading this month’s issue. If you would like to receive each issue in your inbox as soon as it’s released, you can subscribe to our mailing list here.

IT Pro 20/20: How regulation is shaping innovation

Cloud Pro

1 Jun, 2020

Welcome to the fifth issue of IT Pro 20/20, our sister title’s digital magazine that brings all of the previous month’s most important tech issues into clear view.

Each month, we will shine a spotlight on the content that we feel every IT professional should be aware of, only in a condensed version that can be read on the go, at a time that suits you.

To coincide with the second birthday of the General Data Protection Regulation, this month we take a look at the role of regulation in innovation. Rather than focus on GDPR principles and the importance of compliance, we thought it would be far more valuable to show how new rules are working to promote, and in some cases moderate, new technology and ways of thinking.

Our lead feature looks at the nature of corporate travel and how future regulations, as well as societal changes introduced as a result of the coronavirus pandemic, are likely to redefine what it means to travel for business. We also look at how authorities are attempting to rein in the development of cutting edge technology and whether it’s even possible to police something like an algorithm.

For those businesses confused about data laws in a post-Brexit UK, we’ve also put some of the most common issues to a panel of data protection lawyers to assess what the regulatory landscape might look like after the end of the transition period in January 2021.

As ever, you’ll also find a roundup of the four biggest stories of the month that are likely to reverberate throughout 2020.


The next IT Pro 20/20 will be available on Tuesday 30 June. Previous issues can be found here.

We hope you enjoy reading this month’s issue. If you would like to receive each issue in your inbox as soon as it’s released, you can subscribe to our mailing list here.

IT Pro 20/20: Living at the mercy of technology

Cloud Pro

4 May, 2020

Welcome to the fourth issue of IT Pro 20/20, our brand-new digital magazine that brings all of the previous month’s most important tech issues into clear view.

Each month, we will shine a spotlight on the content that we feel every IT professional should be aware of, only in a condensed version that can be read on the go, at a time that suits you.

This month we’re taking a look at how technology is reshaping our lives, for better and for worse. While it’s something of a cliché to say, we really are at the mercy of technology. The coronavirus has pushed the majority of businesses to extreme limits, and without technology to keep us connected and productive, many would surely have collapsed by now. It’s this idea of technology continuing to reshape how we interact with the world around us that is at the core of this month’s theme.

Our lead feature takes a look at what Microsoft is doing to help repair the damage caused by the launch of Tay, a chatbot so ill-suited to its purpose that it serves as a warning for anyone developing AI for public consumption. We also highlight how technology has influenced management styles in recent years, and how overbearing cyber security training has the potential to turn employees against a business.

As ever, you’ll also find a roundup of the four biggest stories of the month that are likely to reverberate throughout 2020.


The next IT Pro 20/20 will be available on Friday 29 May. Previous issues can be found here.

We hope you enjoy reading this month’s issue. If you would like to receive each issue in your inbox as soon as it’s released, you can subscribe to our mailing list here.

IT Pro 20/20: Turning to the cloud in a crisis

Cloud Pro

31 Mar, 2020

Welcome to the third issue of IT Pro 20/20, our brand-new digital magazine that brings all of the month’s most important tech issues into clear view.

Each month, we will shine a spotlight on the content that we feel every IT professional should be aware of, only in a condensed version that can be read on the go, at a time that suits you.

This month we’re taking a look at how cloud innovation is helping to support the technology industry and wider society through a global pandemic. Now that most of us are working remotely, it’s important you have the best tools in place to keep employees secure and productive, and so we’ve highlighted a number of areas where the cloud is helping to drive this effort. From free software and remote working tips, to industry leadership and changing technology paradigms, the cloud is behind it all.

We also take a look at the growing trend of screenless content and provide some tips for helping your organisation develop a much-needed audio strategy, as well as the growth of AI as a service, both of which are exclusive to this month’s issue.

As ever, you’ll also find a roundup of the four biggest stories of the month that are likely to reverberate throughout 2020.


We hope you enjoy reading this month’s issue. If you would like to receive each issue in your inbox as soon as it’s released, you can subscribe to our mailing list here.

The next IT Pro 20/20 will be available on 30 April. Previous issues can be found here.

Q&A: UK Cloud Awards judge Andi Mann

Cloud Pro

5 Mar, 2020

Please could you tell us a little bit more about who you are and what you do?

I am a lifelong technologist with a global perspective, having worked for 30 years in multiple roles for enterprise IT teams, as a leading industry analyst, and with several software vendors across Europe, the US, and Asia-Pacific.

Currently, I work at Splunk as a strategic advisor, learning from research, customers, and thought leaders so that I can champion and lead innovative product development internally, and advise customers and others externally at conferences, in journals, and directly with technology and business leaders.

How would you describe the UK Cloud Awards in a nutshell?

Best in show! The UK Cloud Awards offer a revealing look at how IT is driving UK businesses forward and recognises ‘the best of the best’ in excellence and innovation.

What appealed to you about becoming a judge for this year’s UK Cloud Awards?

As a newbie to the UK Cloud Awards, I was particularly excited just to learn from all the entrants, and see the innovation and expertise they are bringing to the industry.

Beyond that, I was also attracted by the opportunity to leverage my own experience and expertise in cloud, digital, automation, and more to help recognise modern leaders who are making a difference to how businesses benefit from technology.

What are you most looking forward to about being involved in this year’s awards?

Without a doubt, I am most looking forward to learning about the amazing developments, innovations, and especially the business impact that this year’s award nominees are bringing to the UK, and to the world.

This year’s awards have had a bit of a makeover, with new categories and some other tweaks. Tell us why people should be getting excited about all of that?

Today, every business is a technology business – and not just a cloud business, or an IoT business, or a digital business. Today, every technology matters – it may be a new DevOps approach that tips you over the edge to beat your closest competitors; it may be a collaboration project that drives customer satisfaction to record levels; it may be a big data analytics outcome that delivers new sales and builds revenue; and so on.

It is not enough to have excellence in just one area, so this year’s awards recognise that technology differentiation matters across the board, by shining a light on outstanding achievements across many different technologies and methodologies.

Do you have a category/categories you’re most excited about?

I am most excited for the Digital Transformation Project of the Year, the DevOps Project of the Year, and the ML/AI Project of the Year categories. Digital transformation is a buzzword made real that is changing not just the IT industry but the world, so I have high expectations of being amazed by entrants in this category. I have been deeply engaged with DevOps for about a decade – almost as long as anyone – but love to keep learning from practitioners, so I expect to learn from this category too.

In my work, I am heavily focused on how businesses bring data, analytics, machine learning, and artificial intelligence to everything from IT ops and app dev to cyber security, BI, edge and IoT, so I expect to be fascinated by the innovative developments in this category.

What are you looking for when you’re reading an entry? How can people make sure theirs stands out?

I will certainly be looking at how innovative or differentiated the entry is, but primarily I will look for the impact that the project, technology, or approach has had on business goals – or for non-profits, on constituent or member outcomes. We can all deliver new technology, but innovation is much more than having a good idea – imagination without execution is merely hallucination!

For me, it is not enough to demonstrate how amazing a technology implementation is per se; entrants must show why it mattered. If they can show – especially in real, measurable ways – how their entry has delivered on specific, important, business-level goals, they will have a much better chance of getting a vote from me.

What would you say to those thinking about entering but haven’t fully decided to do so as yet?

Why wait?! You cannot win if you don’t enter! Even making the shortlist would be a major source of inspiration, not just for managers, not just for marketing, but for the teams of individual contributors who did the hard work, likely over many months or even years, to make your project a success.

So, if you are not going to enter for the amazing prizes, for the accolades of your peers, for the management recognition, or for the opportunity to market your win to customers, do it to show the hard workers who made your project possible that this is an achievement you are proud of, and which you want to show off to the world.

Do you have a standout cloud moment from 2019?

Not one single moment, but I would cite the number and severity of major breaches of cloud-based data that have thrown some cold water on the raging fire of cloud computing. This continues to be something our industry needs to address.

Security, privacy, compliance, governance – these should be job #1 for any business. Customers are demanding it, but too many cloud providers are not living up to their promises. It is up to cloud customers to demand that change and make security a priority.