Three essential questions to ask VDI providers in the cloud era

As the workforce has become more dispersed, organisations have looked to virtual desktop infrastructure (VDI) to provide 24/7, anywhere access to corporate apps and desktops. But legacy VDI has fallen far short of its promises, especially when it comes to performance and simplifying IT. However, with cloud adoption accelerating, organisations have the opportunity to completely transform end user computing, deliver beyond the promise of VDI, and take advantage of new levels of agility that drive growth. 

The cloud VDI market, sometimes referred to as desktop as a service (DaaS), is experiencing explosive growth, but within the space, there are multiple approaches and their differences are vitally important for IT leaders to understand. To help determine what the best deployment and management approach is to meet an organisation’s unique needs, there are three primary questions to ask VDI and DaaS providers: 

  • Is the solution globally available? 
  • Is the solution customisable for enterprise needs? 
  • Is the solution turnkey? 

Let’s take a closer look at the three questions you need to ask virtual desktop providers.

Is the solution globally available? 

Most enterprise software was built for one customer and for one data centre. So, the focus was on making the software as scalable as possible within one data centre. That’s the case with legacy VDI. 

Operating VDI demands significant resources and has proven cumbersome for most organisations even within a single data centre – let alone three or four. So, a customer might choose to deploy the VDI stack in just one data centre to reduce complexity and management overhead. However, in a larger organisation, most users are not close to that data centre. Because they’re remote, they’re subject to high-latency, low-bandwidth connections. These networks can also suffer high packet loss, which requires expensive leased lines to solve.

The upshot is high latency and unhappy users. Even though various remoting protocols have been employed to try to optimise performance, they can’t change the laws of physics.

But a new approach exists now, thanks to the public cloud. Instead of one data centre, the public cloud is made up of many data centres distributed across the globe. Today Microsoft Azure has 54 cloud regions worldwide and more coming, which means that every organisation using Azure now has 54 data centres. It’s time to put them to use.

Because Azure is a globally distributed cloud, it calls for a globally distributed cloud VDI solution – and that comes down to solution architecture. Look for a cloud-native solution based on a globally distributed architecture; it’s the only way to get the performance, security and scalability benefits that have eluded traditional VDI – whether on-premises or in the cloud – but which are very real today. With the right cloud VDI solution, IT teams can deploy desktops at the edge of the cloud region nearest each user. These edge-native virtual desktops can be deployed within about 25 milliseconds of any user in the world, which means users get incredible performance.
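The placement decision described above – put each user’s desktop in the lowest-latency region, within a latency budget – can be sketched in a few lines. The region names, latency figures and the 25 ms budget below are illustrative assumptions, not real measurements:

```python
# Illustrative sketch: given measured round-trip latencies (in ms) from a
# user to candidate cloud regions, pick the nearest region and check it
# against a latency budget. Region names and numbers are hypothetical.

def nearest_region(latencies_ms: dict, budget_ms: float = 25.0):
    """Return the lowest-latency region and whether it meets the budget."""
    region = min(latencies_ms, key=latencies_ms.get)
    return region, latencies_ms[region] <= budget_ms

measured = {"westeurope": 18.0, "eastus": 95.0, "southeastasia": 210.0}
region, within_budget = nearest_region(measured)
print(region, within_budget)  # westeurope True
```

In practice a provider would measure these latencies continuously rather than take them as a static table, but the selection logic is the same.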

Is the solution customisable for enterprise needs? 

Ideally, IT teams will be able to manage all their virtual resources – apps, desktops and workstations – using their existing tools and processes, from a single pane of glass. That’s one way to simplify management processes and reduce the need for retraining. Enterprises also need the ability to use existing corporate images, integrate with popular enterprise apps such as Skype and Microsoft Teams, and apply established GPOs, security policies and multifactor authentication.

One critical aspect of security is strong authentication, but more is needed. A virtual desktop architecture that separates the control plane from the data plane is fundamental to security, because it means that an organisation’s valuable data stays safe and doesn’t enter the virtual desktop control plane. Find out whether potential solutions offer this separation.

Because the IT budget is perennially tight, another important requirement to add to your list is the ability to easily integrate with existing IT service management solutions, such as ServiceNow and BMC Remedy. This makes it possible to efficiently orchestrate desktop provisioning and resource lifecycles.
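As a rough illustration of what such an ITSM integration looks like, ServiceNow exposes a REST Table API for creating records. The sketch below builds (but does not send) a desktop-provisioning request; the instance URL, table name (`x_desktop_request`) and field names are assumptions for the example – check your own ServiceNow configuration:

```python
# Hedged sketch: filing a desktop-provisioning record via ServiceNow's
# Table API (POST /api/now/table/{table}). Instance, table and fields
# below are illustrative placeholders, not a real deployment.
import base64
import json
import urllib.request

def build_provision_request(instance, table, payload, user, password):
    """Build (but do not send) a Table API POST for a provisioning record."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url=f"https://{instance}/api/now/table/{table}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Accept": "application/json",
                 "Authorization": f"Basic {token}"},
        method="POST",
    )

req = build_provision_request(
    "example.service-now.com", "x_desktop_request",
    {"short_description": "Provision virtual desktop", "region": "westeurope"},
    "api_user", "api_password")
# urllib.request.urlopen(req) would submit it; omitted here.
```

A workflow engine would then watch the record’s state to drive provisioning and, later, deprovisioning – the lifecycle orchestration mentioned above.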

Is the solution turnkey?

There is the option to build your own VDI solution from the many available products, but experience has shown this route requires significant IT team and financial resources. The other option is to choose a turnkey service. Deploying a cloud VDI solution can be as easy as buying a physical PC: simply choose your configuration – CPU, memory and storage, and whether or not you want a GPU – then deploy it in the cloud region closest to the end user. Even better, what if you could scale horizontally across cloud regions – globally – in minutes? Ask providers what’s possible now.

With a truly turnkey service, the provider should own the reliability and availability service level agreement (SLA). Many IT teams want to get out of the desktop SLA business, because it’s time-consuming busy-work that detracts from accomplishing more strategic objectives that contribute to business growth. Know what a vendor’s uptime stats mean and what you’re willing to tolerate in terms of downtime. There’s no reason to accept less than 99.95% desktop availability. 
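To make those uptime numbers concrete, an availability percentage translates directly into allowed downtime per period – a quick calculation worth doing for any quoted SLA:

```python
# Worked example: convert an availability SLA into allowed downtime.
def downtime_minutes(availability_pct, period_hours):
    """Minutes of permitted downtime for a given availability over a period."""
    return (1 - availability_pct / 100) * period_hours * 60

# 99.95% allows about 21.6 minutes of downtime in a 30-day month,
# or roughly 4.4 hours over a 365-day year.
monthly = downtime_minutes(99.95, 30 * 24)
yearly_hours = downtime_minutes(99.95, 365 * 24) / 60
print(round(monthly, 1), round(yearly_hours, 2))  # 21.6 4.38
```

By contrast, a 99.9% SLA permits over 43 minutes of monthly downtime – nearly double – which is why the decimal places matter when comparing vendors.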

Goodbye to “good enough”

Some organisations have gotten used to settling for a VDI solution that is slow and expensive because it’s better than nothing. But something better is now possible, thanks to the availability of cloud-native VDI solutions that quickly and easily scale across cloud regions; enterprises don’t have to settle for “good enough” anymore.

By asking the three questions above, you will be able to vet providers and find the VDI solution that best matches your enterprise’s needs and provides the business agility you need to thrive. Interested in hearing industry leaders discuss subjects like this and share their experiences and use cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Microsoft warns of remote execution exploit in Excel

Connor Jones

27 Jun, 2019

A new vulnerability in a Microsoft Excel business intelligence tool has been found to give attackers an opportunity to remotely launch malware and take over a user’s system.

Researchers at Mimecast discovered a vulnerability in Power Query (PQ), a powerful and scalable business intelligence tool in Microsoft Excel that allows users to integrate spreadsheets with other areas of their business, such as external databases, text documents and web pages.

The vulnerability is based on Dynamic Data Exchange (DDE), a method of data communication between applications used across the Microsoft Office suite. DDE attacks are nothing new – many successful malware campaigns have used the method to compromise documents – but this particular attack grants perpetrators significant admin privileges.

“In an email attack scenario, an attacker could leverage the DDE protocol by sending a specially crafted file to the user and then convincing the user to open the file, typically by way of an enticement in an email,” said Microsoft. “The attacker would have to convince the user to disable Protected Mode and click through one or more additional prompts.”

Using the exploit, attackers can fingerprint individual systems belonging to victims, allowing them to deliver harmful code that appears harmless to both sandboxes and other security software the victim may be running.

Mimecast researcher Ofir Shlomo also said that the Power Query exploit could be used to launch sophisticated, difficult-to-detect attacks that combine several attack surfaces.

“Using Power Query, attackers could embed malicious content in a separate data source, and then load the content into the spreadsheet when it is opened,” said Shlomo in a research blog shared with IT Pro. “The malicious code could be used to drop and execute malware that can compromise the user’s machine.”

DDE attacks are infamous for targeting enterprises due to their widespread reliance on Microsoft Office software in workplaces around the world.

APT28 and APT37, Russian and North Korean-linked hacking groups respectively, have both used the technique to good effect in recent years, with other groups utilising malformed Word documents for use in spear phishing campaigns.

“Such attacks are usually hard to detect and gives threat actors more chances to compromise the victim’s host,” said Shlomo. “Using the potential weakness in Power Query, attackers could potentially embed any malicious payload that as designed won’t be saved inside the document itself but downloaded from the web when the document is opened.”

Mimecast discovered the issue and disclosed it to Microsoft as part of Microsoft’s Coordinated Vulnerability Disclosure process. While Microsoft has yet to offer a fix, it did share a workaround.

Microsoft published an advisory document (advisory 4053440) that offers tips and guidance on how to secure applications when they process DDE fields. This includes instructions on how to create custom registry entries for Office and other methods too, each with benefits and drawbacks listed.
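For illustration, the advisory’s Excel-specific mitigation comes down to a registry entry along the lines of the following. This is a sketch based on the advisory’s published guidance, shown here for Office 16.0; verify the exact key paths, values and trade-offs against advisory 4053440 for your Office version before deploying:

```reg
Windows Registry Editor Version 5.00

; Disables automatic DDE launch for Excel 2016 per advisory guidance.
; Note the trade-off: legitimate DDE-based external links stop updating.
[HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Excel\Security]
"WorkbookLinkWarnings"=dword:00000002
```

The advisory documents equivalent entries for other Office applications and versions, each with its own side effects.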

“Attackers are looking to subvert the detections that victims have,” said Shlomo. “While there is a chance that this kind of attack may be detected over time as threat intelligence is shared between various security experts and information sharing platforms, Mimecast strongly recommends all Microsoft Excel customers implement the workarounds suggested by Microsoft as the potential threat to these Microsoft users is real and the exploit could be damaging.”

How to know if AWS Blockchain is right for your business

In April, Amazon made its managed blockchain available to the public. Two weeks later, the company announced that it would offer $10,000 to any employee who quit Amazon and formed their own delivery company.

While the two announcements may not seem immediately connected, the relationship is there if you scratch the surface. But before I get into why, let me back up a little. Since blockchain exploded into the public consciousness with the price surge of Bitcoin in late 2017, business leaders have been wondering whether they should be exploring blockchain solutions. When Amazon made launching a blockchain as easy as opening a Facebook account this year, the question took on new urgency.

Today, though, it’s still not a viable technology for most business applications. Here’s why.

When blockchain makes sense

Blockchain is a technology that consists of a digital “chain” made up of individual “blocks” or data inputs. The chain is immutable, meaning that once a block is added, it cannot be changed or removed by any party. It’s also shared; no single party has control over the chain.
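The hash-chaining that makes the chain immutable can be shown in a few lines. This is a toy illustration of the data structure only – no consensus, distribution or mining, all of which a real blockchain adds on top:

```python
# Minimal sketch of the "chain of blocks" idea: each block stores the hash
# of its predecessor, so altering any earlier block breaks every link
# after it. A toy illustration, not a production ledger.
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash commits to its data and predecessor."""
    body = {"data": data, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev_hash": prev_hash},
                   sort_keys=True).encode()).hexdigest()
    return body

def valid(chain):
    """Check that every block links to the hash of the one before it."""
    return all(chain[i]["prev_hash"] == chain[i - 1]["hash"]
               for i in range(1, len(chain)))

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("alice pays bob 5", chain[-1]["hash"]))
chain.append(make_block("bob pays carol 2", chain[-1]["hash"]))
print(valid(chain))  # True

# Rewriting history invalidates the chain: block 2 no longer links up.
chain[1] = make_block("alice pays bob 500", chain[0]["hash"])
print(valid(chain))  # False
```

In a shared deployment, that broken link is what lets every other party detect the tampering – which is also why no single party can quietly edit the record.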

So, at a very high level, blockchain tends to make sense for businesses that have a need for an immutable record of transactions and are willing to share their data with a third party (or multiple third parties).

While the first half of that proposition appeals to a lot of organizations (banks, for example), the second half – the part about sharing data – often does not.

“A solution in search of a problem”

I’ve talked to a lot of business leaders interested in exploring how blockchain could support their operations. But it’s rare that blockchain is a simpler and more efficient solution than whatever the organization is currently using.

In fact, most blockchain efforts out there (with the exception of cryptocurrency) are still in the pilot stage as various groups try to see what will and won’t work.

Take, for example, the state of Illinois, which has partnered with blockchain company Hashed Health to create a blockchain-based pilot program to help issue and track medical licenses. The use case is compelling: government data is a great candidate for sharing publicly, and medical licenses are the kind of sensitive data it makes sense to make tamper-proof.

And just because we haven’t found many workable applications yet doesn’t mean they’re not out there. That’s exactly what happened with economist Alvin Roth’s model for trading houses without money – a lot of people made fun of him, until it found its home as the structure for America’s kidney donation bank.

Blockchain for partnerships?

One blockchain application many analysts have proposed is partnerships that require close collaboration. In that situation, blockchain would allow for a sort of “trust but verify” approach, ensuring that no party could manipulate the transaction record.

An example, you say? Oh, I don’t know, maybe… a major online retailer like Amazon partnering with a lot of dispersed delivery companies and interested in transparently managing the supply chain.

To be clear, I’m speculating here. But Amazon rarely does things that don’t benefit Amazon directly, and the timing coincidence of the two announcements is striking.

Should your company use blockchain?

As I said in the beginning, blockchain isn’t a viable solution for most companies right now. But that doesn’t mean you should write it off without considering it. Generally, I’d say blockchain may benefit your company if:

  • You’re pretty big – medium to enterprise level – or you have a dedicated dev team used to building software and developing systems. The blockchain by itself is useless; to make it work, you’ll have to figure out how to build it into an application, which means you’ll have to have the internal resources for that project
  • You’re willing to share your data. That’s baked into the nature of blockchain. It’s non-negotiable
  • You need an immutable ledger. This is probably the easiest criterion to meet, but note that needing an immutable ledger alone isn’t reason enough to adopt blockchain
  • Blockchain is simpler and less complex than any other solution out there. This is the real kicker. If you’re only using blockchain to use blockchain – and using it adds complexity to your company – it’s not a good solution

In other words, if you’re mainly interested in blockchain because you feel like you should be, you’re probably safe letting other people experiment before you invest a bunch of resources.

Interested in hearing more in person? Find out more at the Blockchain Expo World Series, Global, Europe and North America.

Oracle calls time on DNS specialist Dyn

Jane McCallion

26 Jun, 2019

Oracle has announced it’s shuttering Dyn, three years after it bought the DNS service, which will be integrated into the Oracle Cloud Infrastructure. 

In a post to its website last night, the company said: “Since the acquisition of Dyn in 2016 and the subsequent acquisition of Zenedge, the engineering teams have been working diligently to integrate Dyn’s products and services into the Oracle Cloud Infrastructure platform. Enterprises can now leverage the best-in-class DNS, web application security, and email delivery services within Oracle Cloud Infrastructure and enhance their applications with a comprehensive platform to build, scale, and operate their cloud infrastructure.”

While the FAQ page gives no deadline for migration to Oracle Cloud Infrastructure, customers have reposted emails which give 31 May 2020 as the End of Life (EOL) date.

Some customers have hit out at the decision, citing the consequent culling of a free tier and loss of some functionality, particularly dynamic DNS.

On Hacker News, someone under the username jpollock said: “As a lesson to anyone else hoping to do a shutdown with a migration to a different service with your company.

“If you are going to treat me the same as any new subscriber, where I have to re-signup, re-add my payment method, export my settings and then import them again, you’re asking me to buy all over again.

“If you ask me to buy, then I will re-evaluate the relationship, and if it’s just as easy to migrate to another supplier I will move.

“Migrating internally should have been ‘push this button to accept the new terms and pricing, you don’t even need to talk with your registrar.’

“I’ve been a Dyn customer for over a decade, and now I’m moving because it’s just as easy to move as it is to stay.”

Others took to Twitter to announce their displeasure.

Cloud Pro contacted Oracle for confirmation of the May 2020 deadline and further comment, but hadn’t received a response at the time of publication.

NHS Wales goes all in with Microsoft 365

Bobby Hellard

26 Jun, 2019

Every NHS Wales worker will be given access to Microsoft’s enterprise programs such as Outlook and Teams, aiding digital transformation in Britain’s healthcare system.

As part of a country-wide focus on digital transformation, more than 100,000 NHS employees, including GPs, consultants, nurses, therapists, paramedics and support staff, will have access to Microsoft 365.

This will include Outlook, Teams, OneDrive, Word, Excel, PowerPoint, OneNote, SharePoint and Yammer and will be available to staff on multiple devices, such as phones, tablets and laptops.

According to Microsoft, the move is expected to help NHS staff save money and time by not having to travel to face-to-face meetings, freeing them up to focus on patients who need the most help.

“This new national agreement is part of our commitment to refresh NHS Wales IT infrastructure and ensure it supports the transformational changes taking place across health and social care,” said Andrew Griffiths, director of NHS Wales Informatics Service. “It moves our digital estate away from locally managed services and into cloud-based services, delivering efficiencies and economies of scale.

“Frontline staff who work in our health and care services rely on technology, to help them deliver services in new, innovative ways that put the needs of patients first. I am very pleased that we are able to deliver the most up to date tools to our NHS Wales staff to help them with the fantastic work they do every day.”

This is not the first deal Microsoft has gone into with the Welsh public sector. In March, the country became one of the first in the world to give all local authority schools access to Microsoft 365.

The Welsh government paid for all 1,521 “maintained” schools to have access to Microsoft programs in a bid to boost the use of technology among pupils and reduce costs for families and headteachers, as part of a £1.2 million investment.

The agreement with NHS Wales will see Microsoft migrate all of the organisation’s digital estate from locally managed services to a cloud-based service, which carries potential risks, particularly for an organisation as large as NHS Wales. As an example of what can go wrong, TSB bank spent the majority of 2018 fixing a botched IT upgrade that cost it almost £100 million.

Then there is the risk of outside threats, similar to the WannaCry attacks. According to Microsoft, the deal also includes an upgrade to Windows 10 E5, which comes with cutting-edge security features to prevent, detect, investigate and respond to potential risks.

“It’s essential that NHS Wales has secure systems that health staff and patients trust and this agreement will help achieve that,” Griffiths added. “It will increase resilience and mean our services are running on the most up-to-date operating system at all times.”

How cloud is transforming manufacturing and financial services in 2019

Two of the world’s most traditional industries have ascended to the cloud.

To meet new demands from an increasingly tech-centric world, manufacturing and financial services are harnessing digital to transform how they do business. These longstanding industries are often perceived as less flashy in comparison with some of their more technology-forward counterparts, like media and entertainment or life sciences, but the reality is that they’ve found creative, compelling ways to adopt cloud computing for business advantage.

Any company in manufacturing or financial services must fight to stay relevant. They occupy huge global markets and face steep competition from legacy players and up-and-comers alike. It was therefore incumbent on both industries to rethink their business models – or risk being left behind. Cloud-based solutions help alleviate the challenges that come with large-scale digital transformation: digitally-savvy consumers who expect far more efficient, customer-centric services, and an influx of new, more technologically advanced competitors.

No wonder, then, that both industries have put more money behind the cloud. Our analysis of more than a billion dollars of cloud spend between 2018 and 2019 showed that overall cloud spend in financial services and manufacturing grew significantly faster than that of the majority of their peers.

Since making the shift, manufacturing and financial services companies have found that the cloud offers a world of benefits and strategies to meet the demands of a global, technology-driven market. The challenge, however, is keeping the cloud tightly managed without stifling innovation: unchecked spend, security risks, and lack of visibility can derail an otherwise successful cloud program. Here are two prime examples:

  • Cox Automotive relies on the public cloud to take advantage of economies of scale. It enables them to be agile and test new ideas with limited investment. By consolidating data centres and leveraging AWS they are able to focus on products, services, and clients
  • Intuit, a financial software powerhouse, doesn’t let cloud complexity get in the way of its mission to power prosperity throughout the world. The company automates and optimises its cloud spend, avoiding manual processes that can be labor intensive and error prone

Both companies have found creative ways to drive accountability and avoid cost spikes through careful management of their cloud environment. As a result, they are achieving the objectives that bring so many people to the cloud in the first place – namely, agility, innovation and competitive advantage.

The adoption of cloud technology has enabled manufacturing and financial services to meet the ‘anywhere, anytime’ demand for access and to differentiate at the usage and consumption level. Specifically, there are three key areas in the public cloud that these industries are adopting at a faster rate – containers, machine learning, and serverless. Each of these provides a unique ability to improve the top or bottom line. For instance, containers can reduce overhead costs (they enable developers to move more quickly and require fewer system resources than traditional environments) while offering increased flexibility.

Containers run virtually anywhere – from the branch to the data centre to the public cloud, across virtual machines or bare metal – and they allow companies to move fast, deploy software efficiently, and operate at unprecedented scale. This agility and improved application delivery help reduce costs, making containers a strategic, long-term investment. Indeed, when we analysed the data collected for the last financial year, we found that both industries increased their spend on containers by a sizable 37%.
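To make the portability point concrete, a container image is defined in a few declarative lines that build identically on a laptop, in the data centre or in any cloud region. The sketch below assumes a hypothetical Python service; `app.py` and `requirements.txt` are placeholders for your own code:

```dockerfile
# Minimal illustrative Dockerfile for a small Python service.
# The same image runs unchanged wherever a container runtime exists.
FROM python:3.11-slim
WORKDIR /srv/app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
CMD ["python", "app.py"]
```

Because the image bundles only the application and its dependencies – not a full guest operating system – it starts faster and consumes fewer resources than a traditional virtual machine, which is the overhead saving described above.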

Cloud technology also unlocks access to advanced analysis and deeper insight into data (e.g. consumer and supplier engagement), which has enabled both industries to establish more cost-effective solutions and innovative strategies moving forward. Financial services companies increased their spend on analytics by nearly 2x – with top use cases like faster reporting and deeper analytics and insights.

The most popular cloud services are data warehousing, search, and big data analytics. Big data analytics is widely deployed by manufacturing companies looking to gain new insights from the vast amounts of information their machines collect. Consuming these offerings as managed services from a cloud provider means less overhead and maintenance for the organisation, as it is no longer responsible for the application operations.

The agile nature of the cloud also allows for quick responses to changing markets and developments, in a way that can accelerate a company’s business strategy and help operations run more efficiently. Manufacturing companies are able to remain fast and responsive to customer demands, resulting in shorter product cycles and less time to market without sacrificing quality. Interestingly, the shift by financial services to the cloud was also spurred by the desire for greater sustainability, largely driven by stockholders and employees.

Perhaps the most fundamental way the cloud has transformed these industries is through serverless technology. According to our study, there’s been more than a 3x increase in serverless spend by both manufacturing and financial services over the past year. The latter in particular has seen immense value. Why? Scale. One of the largest benefits serverless technology can offer companies is high scalability – it is highly responsive and provides strong differentiation. For financial services like insurance and banking, it’s useful for predicting consumption and usership outcomes with a high level of accuracy.

Serverless technology is also extremely nimble: it has the ability to create applications for quick dissemination to the public, supporting radical change and real-time innovation. For financial services, where the level of differentiation between products is often small, this delivers a competitive advantage to born-in-the-cloud fintech companies, digital challenger banks and non-bank payment providers and pay services from technology giants such as Google, Apple and Amazon.

For manufacturing organisations, serverless is an ideal technology to power edge computing and IoT. Now, even when remote sites don’t have consistent network connections or robust infrastructure, serverless functions can trigger basic actions such as recording information or messaging a user.
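A hedged sketch of such a function is below. The handler shape mirrors the common serverless convention of an `(event, context)` entry point, but the event fields, the 75°C threshold and the returned actions are illustrative assumptions, not a specific platform’s API:

```python
# Illustrative serverless-style function reacting to an IoT telemetry
# event from a remote site: record the reading, and notify someone if it
# crosses a (hypothetical) threshold.
import json

def handler(event, context=None):
    """React to a telemetry reading; return the actions taken."""
    reading = event.get("temperature_c")
    actions = []
    if reading is not None:
        actions.append({"action": "record", "value": reading})
        if reading > 75:
            actions.append({"action": "notify",
                            "message": f"High temperature: {reading}C"})
    return {"statusCode": 200, "body": json.dumps(actions)}

result = handler({"temperature_c": 82})
print(result["statusCode"])  # 200
```

Because the function only runs when an event arrives, a site with an intermittent connection pays nothing for idle time – the platform scales the function from zero on demand.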

Cloud computing is transforming virtually every facet of every industry. It is clear that cloud adoption is no longer simply a technology decision for companies – it’s a business strategy that can give them agility, speed, and insight. It’s the way to do business in the modern age.

Box: We’re in ‘wait-and-see mode’ with blockchain

Adam Shepherd

25 Jun, 2019

Box CEO Aaron Levie has confirmed that the company has no plans to integrate blockchain technology into its product portfolio, citing the fact that it’s still too early to have a meaningful impact for its customers.

The cloud collaboration company’s co-founder began his keynote at Box’s annual CIO Summit today by joking that he was teaching his new son – who was born late last month – ‘blockchain for babies’ so he would be prepared for the future. Joking aside, however, Levie admitted that the company is not actively exploring blockchain technology.

“We have no specific products that we are working on at the moment, however, we do have people within the organisation that are either researching or always evaluating what might make sense,” he said.

“I think, frankly, we’re a little bit in wait and see mode to see where the trends are going. And we would certainly be there from a product standpoint, when we think it’s very meaningful for our customers… In general, it’s still probably early from a market standpoint, and relative to our technology.”

One branch of technology that Box is actively integrating into its portfolio, though, is artificial intelligence (AI) and machine learning (ML). The company has already begun deploying this technology on a limited basis via its Box Skills Kit feature, which allows customers to build their own integrations with other services, but Levie said that the company is planning to weave AI into its products more widely.

In particular, he says, data classification and security are areas in which the company could implement machine learning in order to benefit customers.

“You take something that used to be an unstructured blob of data, that we didn’t really know what was inside of it,” he said. “Now we can structure it, we can better help our customers manage it, and security and governance.”

In future years, the company is also looking into the possibility of adding complex AI to Box Relay to predict things like which action a person should take next as part of a workflow based on contextual information.

“It’s one thing to streamline a business process and describe that business process and software – it’s a whole other thing if you can actually go and automate that business process predictively or intelligently using AI or machine learning. And that’s the holy grail, frankly, for the entire industry,” Levie added. 

“But it’s something that we’re going to be investing quite a bit in, over the three to five year period.”

AWS hopes to entice more cloud customers with streamlined security tools

Dale Walker

25 Jun, 2019

AWS has made its Control Tower and Security Hub services generally available to all customers; both are designed to make it easier for organisations to manage security policies across their cloud environments.

Both platforms aim to ease the process for organisations looking to shift over to the cloud by removing a great deal of the heavy lifting involved, something that is being echoed by rivals in the industry.

In the case of Control Tower, automated and preconfigured services allow organisations to deploy a set of guardrails for their cloud environment, safe in the knowledge that these are built to AWS best practices. Importantly, AWS isn’t charging extra for this tool, and users can apply it to any AWS service that they currently pay for.

Once set up, organisations are able to build a secure AWS environment using these preconfigured best practices, defining policies around areas such as compliance and permissions. Control Tower should be especially useful for those organisations who need more prescriptive guidance on how to set up secure environments across multiple accounts, as the guardrails will prevent users from deploying tools that don’t conform to security policies.
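Under the hood, Control Tower’s preventive guardrails are enforced as AWS service control policies (SCPs). As a hedged illustration of the kind of policy involved – the approved-region list and the exempted global services below are assumptions for the example, not a real guardrail definition – a region-restriction guardrail looks roughly like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "DenyOutsideApprovedRegions",
    "Effect": "Deny",
    "NotAction": ["iam:*", "organizations:*", "route53:*", "support:*"],
    "Resource": "*",
    "Condition": {
      "StringNotEquals": {"aws:RequestedRegion": ["us-east-1", "eu-west-1"]}
    }
  }]
}
```

Because the policy is attached at the organisation level, individual account users simply cannot deploy resources outside the approved regions, regardless of their own IAM permissions – which is what makes the guardrail "preventive" rather than merely detective.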

The second release this week, AWS Security Hub, aims to solve the problem of having too many disparate security packages running across an organisation and being unable to manage them centrally. Now, AWS customers can access all security tools within a single dashboard view, including those provided by third parties.

The platform is similar to those offered by rivals Microsoft and Google – Azure Security Center and Google Cloud Security Command Center respectively – though even smaller companies such as Box are pushing for all-in-one windows for security management.

The cloud giant says it already has companies such as GoDaddy, Rackspace, Splunk and PagerDuty, T-Mobile, Uber and Sony Interactive signed up to either Security Hub or Control Tower.

AWS Control Tower is available to all customers using US East (N Virginia), US East (Ohio), US West (Oregon), and EU (Ireland) data centres, with additional regions coming in the near future.

While Control Tower is free, Security Hub is generally available to all customers on a per-usage pricing scheme, although there is a 30-day trial for new users.

Overcoming the skills gap for cloud and digital: Where does security and automation fit in?

Digital transformation initiatives require a distinct technological and cultural change – and the element which binds both together is skills.

Yet getting the right skills remains a near-impossibility. The skills gap shows no sign of lessening, with two recent studies proving this point. In December OpsRamp found the vast majority of businesses continued to struggle to find the right talent for cloud environments. Nine out of 10 hiring managers polled agreed the digital skills gap was anywhere between 'somewhat big' and 'huge'. In the same month, Cloudera found similar results with machine learning (ML); more than half of 200 European IT managers polled said they were reluctant to adopt ML technologies because they did not have enough knowledge of the area.

With no real sign of change, IT solutions provider Kainos fears the worst. The company warned last week that skills gaps would continue to widen unless positive action was taken to ensure ‘joined up’ digital skills training initiatives. The company argued current initiatives, particularly in the UK, had scratched the surface but done little more. Earlier this year 12 technology institutes were launched to ‘offer top-quality, higher level technical education [and] help close skills gaps in key STEM areas.’

For Kainos, enterprises, educational establishments and governments need to work more closely to achieve real change, rather than just paying lip service. This is by no means an idle statement either; the company has its own academy, with more than 5,000 users benefiting from it. Its outreach spans business, education and government, and extends to educating parents on potential careers for their children in IT.

Accenture released a report earlier this month focused on expectation versus reality in cloud initiatives. Organisations see the benefits overall, with over 90% satisfaction on average, but only a third of companies polled said they were fully satisfied on cost, speed, and business enablement metrics.

This suggests a gap in itself. “If you listen to what is happening at the CEO table and also what we have seen in our global survey, is that you see there is a clear understanding of the benefits of cloud, the adoption,” Marco Franzen, Accenture Netherlands managing director for technology consulting, tells CloudTech. “If you then look at the results and analyse them, two out of three [companies] think we’re not there yet. They have implemented cloud to some extent but there’s still a new leap, a new platform to reach, above the normal TCO.”

What are the drivers of this gap? Skillsets are certainly one; Franzen notes that smaller companies in particular may lack a key skill needed to 'make the next move and go all-in on cloud'. Complexity of internal change was also cited. But the biggest barrier, as tends to be the case in any cloud study, is security. Two thirds of those polled in the Accenture survey said security compliance – particularly handling security in a new cloud-based environment – was a concern.

Is security therefore the biggest stumbling block when it comes to achieving digital skills and digital transformation? It could be argued that it is one area where all the investment in the world won't bridge the gap. Last month, Oracle released a report arguing that enterprise cybersecurity would be better remedied through automation than through a surge in employee training or the hiring of more security talent. The report, 'Security in the Age of AI' (pdf), found the default response for almost half (47%) of respondents was to invest in more people, rather than more technology, when it comes to security.

Tom Gray, CTO at Kainos, notes the importance of automation in bridging the gap on security to some degree, but warns against making it the bedrock of any long-term strategy.

“There are an increasing number of areas where the volume of data or the complexity and velocity of the environment makes it impossible for humans, however skilled, to be as effective as an automated solution,” Gray tells CloudTech. “The management of compute and storage infrastructure has become increasingly automated as this infrastructure has evolved from large numbers of modest sized on-premise installations to a small number of large-scale environments whose scale and complexity makes human operation unviable.”

Gray argues organisations need to invest ‘strategically’ in their training to focus on which areas will need further investment and where employees will progress. “It remains to be seen whether automation will create more new jobs than will be lost through automation but, as with today, skills in creating digital technology – as opposed to simply using it – remain in short supply,” he says.

“There is a growing need for skills in identifying opportunities for automation, making best use of the current automation techniques, and understanding and managing the human aspect of automation on both the organisation and broader society,” Gray adds. “Conversely, some of the more repetitive tasks in solution delivery – including some programming, testing, deployment and operations tasks – will inevitably be ripe for automation.

“It behoves organisations to consider accelerating automation in these areas, rather than trying to build skills that may become unnecessary or, at the very least, prioritise building foundational knowledge and transferable skills to ensure that the individuals and organisation is responsive and resilient to the automation opportunity.”

View from the airport: HPE Discover 2019

Jane McCallion

25 Jun, 2019

Less than two years into Antonio Neri's tenure as CEO at HPE, the company is rather a different beast from the one Meg Whitman left in November 2017.

Not much has changed structurally, but this year’s Discover conference in Las Vegas to me pointed to a change in culture. Of course, we had the usual product announcements – some of which were quite exciting – but there was a lot of time spent talking about more “businessy” elements.

HPE CEO Antonio Neri at HPE Discover 2019

GreenLake is a standout example of this. The consumption-based service was launched back in November 2017 (just as Neri took the reins, in fact) but was absolutely the star of the show this year.

It’s clear the company is now aggressively pursuing an ‘as a service’ model, rather than selling once and hoping for repeat custom down the line, as has been the case in the past, with a pledge to make the full HPE portfolio available through GreenLake by 2022. While there is a bit of a fudge (you can still buy on a one-off basis rather than on consumption if you really want), this to me is a big step away from the years of talking about “hybrid IT” under Whitman, which I saw largely as an attempt to give a veneer of cloudiness to HPE’s products while really maintaining a traditional, legacy vendor profile.

Speaking of the cloud, the company has really started to embrace this technology in partnership with cloud vendors – and not just old friends like Microsoft, but also AWS and Google. In particular, it’s using containerisation technology to its advantage, as well as public cloud companies’ realisation that they need to play nice with traditional vendors in order to maximise their customer pool (not everything can be hosted on a public cloud, after all).

Aruba chief Keerti Melkote on stage with HPE CEO Antonio Neri during HPE Discover 2019

HPE has also brought its networking business, Aruba, front and centre in the cloud conversation with a major update to Aruba Central. Indeed, it was given one of the handfuls of product slots during the keynote, with Aruba co-founder Keerti Melkote taking to the stage alongside Neri to talk about this cloud management service.

We also saw the company’s in-memory computing efforts start to bear some commercial fruit. While The Machine, as it was called, was quietly downgraded over the past few years from in-memory computing moonshot product to research project to, really, just another part of HPE Labs, its technology lives on in the shape of Primera.

This appliance – which its beaming creators were clearly delighted with at the press launch – comes with a 100% uptime guarantee (I’m not rounding up there, either) and was described to me by an independent analyst as the company’s most important storage launch in years.

HPE Primera storage appliance reveal at HPE Discover 2019

There was also a lot of emphasis on the company’s corporate social responsibility (CSR) initiatives, including its continued partnership with Purdue University aimed at using data and analytics to solve world hunger (no, really). As we were repeatedly told, the company’s commitment to social good goes all the way back to “Bill and David” (Hewlett and Packard respectively, in case you’re not on first name terms with the company’s founders), although not an awful lot of evidence was brought along to support this claim.

Overall, particularly as someone who missed 2018’s Discover and Discover Europe, I feel there’s been a palpable change in the company over the last 18 months – and I’m not the only one – which can be credited almost entirely to Neri.

As a couple of people I spoke to at the event pointed out, he’s an unusual breed of CEO nowadays; he joined HPE’s predecessor, HP, as a call centre operative in 1995 and over the intervening 24 years worked his way up to the top job, rather than being a direct transplant like the four CEOs who preceded him. I understand he’s quite hands-on and will have been closely involved with the development of Aruba Central (he was HP’s networking head for a while, after all), the staking of Primera’s 100% uptime claim and the pivot to a consumption-based business.

Without wishing to sound like I’m completely toeing the party line, HPE really does feel like a more services-oriented and collaborative business than it was just two short years ago. I’ll be intrigued to see next year how much business is actually being done through GreenLake (of the 600 customers currently counted as doing business through the scheme, two-thirds are actually “in the pipeline”), and whether that 100% uptime claim has held up. If it has, we could be at a very exciting point in the development of storage technology, and I would expect to see some more products using the same technology – very literally – unveiled.

All images: Jane McCallion/Dennis Publishing. All rights reserved.