AWS ditches Nvidia for in-house ‘Inferentia’ silicon


Bobby Hellard

13 Nov, 2020

Amazon Web Services (AWS) will ditch Nvidia chips responsible for the processing of Alexa queries and will instead use its own in-house silicon, the company confirmed on Friday.

The cloud giant will also be shifting data processing for its cloud-based facial recognition system, ‘Rekognition’, over to these in-house chips, according to Reuters.

Alexa queries, issued through Amazon’s Echo line of smart speakers, are sent to the company’s data centres, where they undergo several stages of processing – including converting the processed text into audible speech – before an answer is returned to the user.

The company said that the “majority” of this processing will now be handled by its own “Inferentia” chips. These were first launched in 2018 as Amazon’s first custom-designed silicon for accelerating deep learning workloads.

Amazon has said that the shift to Inferentia for Alexa processing has resulted in 25% lower latency and a 30% reduction in cost. The firm hopes for similar gains with its Rekognition system, which has also started to adopt the Inferentia chip.
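Running a model on Inferentia typically means compiling it with AWS’s Neuron SDK before deploying it to an Inf1 instance. The snippet below is a minimal, illustrative sketch – it assumes the torch-neuron package from the Neuron SDK and uses a generic image-classification model as a stand-in, not Amazon’s actual Alexa or Rekognition pipelines:

```python
# Minimal sketch: compiling a PyTorch model for Inferentia with the AWS Neuron SDK.
# Assumes the torch-neuron package is installed and that the resulting artefact
# will be served on an EC2 Inf1 instance; the ResNet model is a stand-in example.
import torch
import torch_neuron  # noqa: F401 - registers the torch.neuron API
from torchvision import models

model = models.resnet50(pretrained=True)
model.eval()

example = torch.rand(1, 3, 224, 224)  # example input used to trace the graph

# Compile the traced graph so it runs on Inferentia NeuronCores
neuron_model = torch.neuron.trace(model, example_inputs=[example])
neuron_model.save("resnet50_neuron.pt")

# On the Inf1 instance, the saved artefact loads like any TorchScript model
served = torch.jit.load("resnet50_neuron.pt")
print(served(example).shape)
```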

The cloud giant didn’t specify which company previously handled Rekognition processing, but the service has come under some scrutiny from civil rights groups for its involvement with law enforcement. Police were temporarily banned from using it earlier in the year, following the Black Lives Matter protests.

Nvidia and Intel are two of the biggest providers of computing chips, particularly for data centres, with companies like Amazon and Microsoft among their clientele. However, a number of firms have begun to move away from these vendors and bring the technology in-house. Apple, for example, has recently started moving away from Intel chips in favour of its own Arm-based processors, which will be used in its machines going forward.

Salesforce UK to create 100,000 new digital roles by 2024


Bobby Hellard

12 Nov, 2020

Salesforce has said it aims to add over 100,000 new skilled jobs to the UK market over the next four years through a partnership with training provider QA.

Three apprenticeship programmes and a Developer Bootcamp will be created to boost the country’s digital skills and produce a cohort of graduates trained in Salesforce certifications, the company announced on Thursday.

According to IDC, the initiatives could add more than 100,000 skilled jobs to the UK over the next four years, while also expanding Salesforce’s own ecosystem of customers and partners.

Demand for digital skills has been high for years, but it has become particularly acute during the pandemic as businesses make greater use of cloud platforms. QA, an established provider of Salesforce training, aims to bridge the gap by expanding its work with the company.

In the UK, there is a growing demand for Salesforce technology, according to QA, which is fuelling the need for businesses to quickly find and hire new skilled Salesforce talent.

The Developer Bootcamp is a 12-week intensive course that provides the specialist skills required to design data models, user interfaces and security for custom applications, as well as the ability to customise them for mobile use. The first Bootcamp is expected to start in March 2021.

The apprenticeship programmes will provide practical learning closely aligned to specific career paths, namely service desk engineering, marketing professional, and business analytics roles within the Salesforce ecosystem.

All three apprenticeships are available for immediate starts, and both initiatives will be complemented by content from Salesforce’s online learning platform, Trailhead, which allows participants to continue developing their Salesforce skills after they have completed their programmes.

“We care passionately about developing the next generation of skilled professionals for our industry,” said Adam Spearing, Salesforce’s EMEA CTO. “The Salesforce ecosystem represents a growing opportunity and urgent need for talent within our customers, marketplace, partners and developers, and we want to kick-start the careers and pathways for young adults. We are excited to be partnering with QA to launch the Developer Bootcamp and Apprenticeship programmes.”

Google slashes free Drive storage to 15GB


Keumars Afifi-Sabet

12 Nov, 2020

Google will restrict the online cloud storage capacity for high-quality photos and videos to 15GB from next year as the firm looks to capitalise on the millions of users who have come to rely on the service.

From June 2021, new high-quality content uploaded to Google Photos will count towards a free 15GB storage capacity, with the company making several pricing tiers available to those who need to store more data. The limit will also apply to files that users keep on Drive, specifically Google Docs, Sheets, Slides, Drawings, Forms, and Jamboard files.

Google is framing these plans as a way to continue providing everybody with a great storage experience while keeping pace with the growing demand for its free services.

Currently, files created through Google’s productivity apps, as well as photos smaller than 2,048 x 2,048 pixels and videos shorter than 15 minutes, don’t count towards the cap. Under the new storage calculations, ‘high quality’ uploads will include photos larger than 16MP and videos above 1080p resolution, both of which are compressed when uploaded under this setting.

“For many, this will come as a disappointment. We know. We wrestled with this decision for a long time, but we think it’s the right one to make,” said the firm’s product lead for Google Photos, David Lieb.

“Since so many of you rely on Google Photos as the home of your life’s memories, we believe it’s important that it’s not just a great product, but that it is able to serve you over the long haul. To ensure this is possible not just now, but for the long term, we’ve decided to align the primary cost of providing the service (storage of your content) with the primary value users enjoy (having a universally accessible and useful record of your life).”

More than one billion people rely on Google Photos and Google Drive, Lieb added, uploading more than 28 billion photos and videos every week on top of more than four trillion already uploaded onto the service.

The change will only apply to content newly uploaded from 1 June next year, with all existing high-quality content remaining exempt from the storage quota. This includes all content uploaded between now and then.

Users who wish to upgrade to a larger storage plan will have to sign up to the company’s paid-for cloud storage platform Google One, with packages beginning at 100GB, alongside other features including access to Google experts and shared family plans.

Currently, Google One is priced at $1.99 per month for 100GB of storage, $2.99 per month for 200GB, and $9.99 per month for 2TB.

Google is also rolling out a host of new tools, which the firm hopes will help justify the additional cost for those who need to pay for a higher tier.

Among these tools is software that can make it easier to identify and delete unwanted content, such as blurry photos and long videos, though the firm is set to make more announcements in the coming months. Google has in the last few years leant on AI to improve the functionality of its flagship products, including Gmail and Google Docs.

The firm is also introducing new policies for users who are inactive or over their storage limit across Google’s cloud-based services. Those who are inactive in one or more of these services for two years may see their content deleted in those specific products, while users over their storage limit for two years may see their content deleted across the board.

AWS launches visual data preparation tool DataBrew


Sabina Weston

12 Nov, 2020

Amazon Web Services (AWS) has announced the general availability of its new visual data preparation tool that lets users clean and normalise data without having to write code.

Built as part of its AWS Glue service, the new DataBrew tool aims to make visual data preparation more accessible for a greater number of users.

According to AWS, DataBrew facilitates data exploration and experimentation directly from AWS data lakes, data warehouses, and databases. Its users will be able to choose from over 250 built-in functions to combine, pivot, and transpose the data, with the tool also providing transformations that use advanced machine learning techniques such as natural language processing.

DataBrew is serverless and fully managed, with AWS claiming that users will never need to configure, provision, or manage any compute resources directly.
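DataBrew can also be driven programmatically through the AWS SDKs rather than the visual console. The snippet below is a rough, unofficial sketch – it assumes boto3’s ‘databrew’ client, and the bucket names, role ARN and job names are placeholders rather than anything from AWS’s announcement – showing a CSV file in S3 being registered as a dataset and profiled:

```python
# Minimal sketch using boto3's DataBrew client. Bucket names, the IAM role ARN
# and the dataset/job names are placeholders, not values from AWS's launch.
import boto3

databrew = boto3.client("databrew", region_name="eu-west-1")

# Register a CSV file already sitting in S3 as a DataBrew dataset
databrew.create_dataset(
    Name="sales-raw",
    Input={"S3InputDefinition": {"Bucket": "example-data-bucket",
                                 "Key": "raw/sales.csv"}},
)

# Create and start a profiling job that summarises the dataset (column types,
# missing values, value distributions) without writing any transformation code
databrew.create_profile_job(
    Name="sales-profile",
    DatasetName="sales-raw",
    OutputLocation={"Bucket": "example-results-bucket", "Key": "profiles/"},
    RoleArn="arn:aws:iam::123456789012:role/ExampleDataBrewRole",
)
databrew.start_job_run(Name="sales-profile")
```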

“AWS customers are using data for analytics and machine learning at an unprecedented pace”, commented Raju Gulabani, AWS vice president of Database and Analytics. “However, these customers regularly tell us that their teams spend too much time on the undifferentiated, repetitive, and mundane tasks associated with data preparation. Customers love the scalability and flexibility of code-based data preparation services like AWS Glue, but they could also benefit from allowing business users, data analysts, and data scientists to visually explore and experiment with data independently, without writing code.

“AWS Glue DataBrew features an easy-to-use visual interface that helps data analysts and data scientists of all technical levels understand, combine, clean, and transform data,” he added.

AWS Glue DataBrew is generally available starting today in Ireland and Frankfurt, Germany, in select parts of the United States, including Ohio and Oregon, and in parts of the Asia Pacific region. AWS said it will announce availability in additional regions “soon” but has yet to confirm when the tool will arrive in the UK.

When it comes to pricing, AWS said DataBrew users will not face any “upfront commitments or costs” to use the tool, but will pay to create and run transformations on datasets. AWS did not immediately respond to IT Pro’s query regarding specific pricing.

The role of cloud native at the edge


Keri Allan

12 Nov, 2020

Analyst firm Gartner predicts that, by 2025, three quarters of enterprise-generated data will be created and processed at the edge – that is, outside of a traditional data centre or cloud and closer to end users.

The Linux Foundation defines edge computing as the delivery of computing capabilities to the logical extremes of a network in order to improve the performance, security, operating cost and reliability of applications and services. “By shortening the distance between devices and the cloud resources that serve them, edge computing mitigates the latency and bandwidth constraints of today’s internet, ushering in new classes of applications,” the foundation explains in its Open Glossary of Edge Computing.

Edge computing has been riding a wave of hype for several years now, and many consider it “the Wild West”: there’s a high volume of chaotic activity in the area, resulting in duplicated effort as technologists all vie to find the best solutions.

“It’s early doors,” says Brian Partridge, research director at 451 Research. “Vendors and service providers are throwing stuff at the wall to see what sticks. Enterprises are experimenting, investors are making large bets. In short, the market is thrashing, crowded and there’s a lot of confusion.” 

A synergy between cloud native and the edge 

Edge computing opens up many possibilities for organisations looking to scale their infrastructure and support more latency-sensitive applications. As cloud native infrastructures were created to improve flexibility, scalability and reliability, many developers are looking to replicate these benefits close to the data’s source, at the edge. 

“Cloud native can help organisations fully leverage edge computing by providing the same operational consistency at the edge as it does in the cloud,” notes Priyanka Sharma, general manager of the Cloud Native Computing Foundation (CNCF). 

“It offers high levels of interoperability and compatibility through the use of open standards and serves as a launchpad for innovation based on the flexible nature of its container orchestration engine. It also enables remote DevOps teams to work faster and more efficiently,” she points out. 

Benefits of using cloud native at the edge

Benefits of using cloud native at the edge include faster rollbacks, meaning edge deployments that break or contain bugs can be rapidly returned to a working state, says William Fellows, co-founder and research director of 451 Research. 

“We’re also seeing more granular, layered container support whereby updates are portioned into smaller chunks or targeted at limited environments and thus don’t require an entire container image update. Cloud native microservices provide an immensely flexible way of developing and delivering fine-grain service and control,” he adds.
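As a concrete, hedged illustration of the rollback point – assuming a Kubernetes-managed edge cluster reachable via kubectl, with a hypothetical deployment called edge-gateway – an automated ‘return to last known good’ step can be as small as this:

```python
# Minimal sketch of an automated rollback for a Kubernetes-managed edge deployment.
# Assumes kubectl is installed and pointed at the edge cluster; the deployment
# name "edge-gateway" and namespace "edge" are hypothetical examples.
import subprocess

def rollout_failed(deployment: str, namespace: str, timeout: str = "120s") -> bool:
    """Return True if the latest rollout does not become ready within the timeout."""
    result = subprocess.run(
        ["kubectl", "rollout", "status", f"deployment/{deployment}",
         "-n", namespace, f"--timeout={timeout}"],
        capture_output=True, text=True,
    )
    return result.returncode != 0

def rollback(deployment: str, namespace: str) -> None:
    """Revert the deployment to its previous working revision."""
    subprocess.run(
        ["kubectl", "rollout", "undo", f"deployment/{deployment}", "-n", namespace],
        check=True,
    )

if __name__ == "__main__":
    if rollout_failed("edge-gateway", "edge"):
        rollback("edge-gateway", "edge")
```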

There are also financial benefits to taking the cloud native path: the reduced bandwidth use and streamlined data handling that cloud native provides can cut costs, making it a highly efficient approach for businesses. 

“It can also allow a consumption-based pricing approach to edge computing without a large upfront CapEx spend,” notes Andrew Buss, IDC research director for European enterprise infrastructure.

However, it wouldn’t be the “Wild West” out there right now if cloud native were the perfect solution. There are still several challenges to work on, including security concerns. 

“Containers are very appealing due to them being lightweight but they’re actually very bad at ‘containing’,” points out Ildikó Váncsa, ecosystem technical lead at the Open Infrastructure Foundation (formerly the OpenStack Foundation).

“This means they don’t provide the same level of isolation as virtual machines, which can lead to every container running on the same kernel being compromised. That’s unacceptable from a security perspective. We should see this as a challenge that we still need to work on, not a downside to applying cloud native principles to edge computing,” she explains. 

There’s also the complexity of dealing with highly modular systems, so those interested in moving towards cloud native edge computing need to prepare by investing the time and resources necessary to implement it effectively. 

What should businesses be thinking about when embarking on cloud native edge computing? 

Cloud native edge solutions are still relatively rare; IDC’s European Enterprise Infrastructure and Multicloud survey from May 2020 showed that the biggest edge investments are still on-premises. 

“However, we expect this to shift in the coming years as cloud native edge solutions become more widely available and mature and we have more use cases that take advantage of cloud as part of their design,” says Gabriele Roberti, Research Manager for IDC’s European Vertical Markets, Customer Insights and Analysis team, and Lead for IDC’s European Edge Computing Launchpad.

For those businesses eager to take the leap, Partridge recommends starting with the application vision, requirements and expected outcomes. After targeting edge use cases that can support a desired business objective – such as lowering operations costs – you can then turn your attention to the system required. 

Laura Foster, programme manager for tech and innovation at techUK, reiterates that it’s important to build a use case that works for your business needs. 

“There’s an exciting ecosystem of service providers, innovators and collaboration networks that can help build the right path for you, but the journey towards cloud native edge computing also needs to go hand in hand with cultural change,” she points out. 

“Emerging technologies, including edge computing, will pioneer innovation, but only if businesses push for change. Retraining and reskilling workforces is a fundamental part of an innovation journey and can often be the key to getting it right,” she concludes.

Remote working is here to stay, says Ginni Rometty


Bobby Hellard

11 Nov, 2020

Remote working and digital transformation plans won’t go back to normal following the release of a COVID-19 vaccine, according to IBM’s former CEO Ginni Rometty.

Speaking to CNBC, IBM’s executive chair said that remote work was “here to stay” as part of a hybrid model of working. 

The comments come just days after reports of a potential coronavirus vaccine with a 90% success rate in late-stage trials. Shares in services that had been heavily used during the pandemic, such as Zoom, Amazon and Netflix, all slumped after the announcement, suggesting a return to the old ways once the vaccine is ready. 

However, Rometty doesn’t believe that will be the case, saying it would be hard to go back to the old ways now that the world has seen what is possible. 

“I actually don’t think these technology trends are going to reverse themselves,” Rometty told CNBC. She said that a vaccine “allows us to return to perhaps a bit of a more new normal. But a number of these things in the hybrid way of working I believe will remain, and the digital acceleration will continue because people have now seen what is possible.”

A number of businesses began welcoming employees back into the workplace after the first lockdown, with the government also urging a return to work to help boost the economy. However, a second national spike in coronavirus cases has led to a second autumn lockdown.

The promise of a vaccine is seen as the safest way out of the pandemic, but many want a more balanced home-work setup. Earlier in the year, reports suggested the UK public wanted to move to two or three days in the office, with many keen not to go back at all.

Not everyone agrees, however. Microsoft CEO Satya Nadella warned that switching fully to remote working could have negative effects on wellbeing, learning and collaboration – despite his own company ushering in more varied remote working policies. 

“Learning, reskilling, onboarding is going to become a huge issue and we need to be able to incorporate the learning content into a workflow that is natural,” he said.

Adobe buys marketing workflow startup Workfront for $1.5 billion


Bobby Hellard

10 Nov, 2020

Adobe has announced its intent to acquire Workfront for $1.5 billion (£1.1 billion) as it looks to add collaboration tools to its marketing business. 

The deal is expected to close during the first quarter of Adobe’s 2021 fiscal year, subject to regulatory approval, Adobe said.

Utah-based Workfront develops project management software for enterprise customers. It will add some 3,000 corporate customers and around one million users to Adobe, with Bloomberg suggesting the takeover could be completed within a week. 

Whenever the deal goes through, it will be the first major action by executive vice president Anil Chakravarthy, who joined Adobe in January.

“Adobe is the undisputed leader in content creation, management, delivery, and measurement and a trusted partner to digital leaders around the globe,” said Chakravarthy. 

“The combination of Adobe and Workfront will further accelerate Adobe’s leadership in customer experience management, providing a pioneering solution that spans the entire lifecycle of digital experiences, from ideation to activation.”

The deal is seen as Adobe’s biggest effort to boost its Experience Cloud division, which includes its marketing, advertising and analytics services. The idea is to make its creative and business applications more collaborative, enabling teams to work more closely on projects as they continue to operate remotely.

Workfront has seen growing interest in its services during the pandemic as marketing firms have sought more insight into their operations. The company’s CEO, Alex Shootman, called the Adobe takeover an “awesome outcome”.

“When you have an opportunity to work with the company that CMOs rely on to run their business, which is Adobe, and you’re a company like Workfront, that takes us forward years in terms of what we would have been able to accomplish on our own,” he said.

Zoom settles with the FTC over ‘deceptive’ encryption claims


Bobby Hellard

10 Nov, 2020

The Federal Trade Commission (FTC) has reached a settlement with Zoom after accusing the company of “unfair and deceptive security practices”. 

Following the FTC’s announcement, shares in the video conferencing service tumbled. Zoom closed 17.4% lower on Monday, with news of positive coronavirus vaccine data also having an impact, according to CNBC.

The vaccine news also affected a number of stocks for services that had been boosted by the pandemic, such as Netflix (down 8.6%) and Amazon (down 5.1%), but Zoom appeared to be the worst hit.

With the onset of the pandemic, and the sudden need to stay home, Zoom saw a massive spike in users, becoming a household name almost overnight. This brought greater scrutiny on the firm and highlighted a number of security issues, one of which was a lack of end-to-end encryption.

In the complaint that led to the settlement, the company was accused of collecting user data during recorded conferences. Zoom had initially said it provided “end-to-end, 256-bit encryption”, but in fact it “provided a lower level of security,” the FTC said. 

“During the pandemic, practically everyone – families, schools, social groups, businesses – is using video conferencing to communicate, making the security of these platforms more critical than ever,” said Andrew Smith, director of the FTC’s Bureau of Consumer Protection.

“Zoom’s security practices didn’t line up with its promises, and this action will help to make sure that Zoom meetings and data about Zoom users are protected.”

As part of the settlement, the FTC has called on Zoom to “implement a robust information security programme”, and the deal also includes a “prohibition on privacy and security misrepresentations”.

Zoom is arguably one of the biggest success stories of the year, reportedly recording a 355% revenue increase in the second quarter of 2020. The figures are even more impressive considering the number of privacy issues it has also had to deal with, such as ‘Zoom bombing’, where uninvited actors invade meetings. 

The service has been used by both remote workers and those stuck at home to keep in touch with friends and family, but the announcement from Pfizer and BioNTech of a vaccine candidate with a 90% success rate in late-stage trials has suggested Zoom might not enjoy continued success beyond the pandemic. 

Standard Chartered is taking 167 years of banking into the cloud


Adam Shepherd

10 Nov, 2020

The world of finance is changing fast. Where previously the industry was dominated by a clutch of major players that had been around for decades, or even centuries, the advent of cloud technology and mobile apps has allowed a swathe of new digital-only ‘challenger banks’ to spring up and start taking on the big established firms.

In order to compete in this new landscape, incumbents must modernise their offerings, and for many of them this also involves modernising their IT architecture, shifting away from on-premises data centres and monolithic applications to more agile development processes and cloud-based infrastructure.

Standard Chartered is one such incumbent; established in 1853 as the Chartered Bank of India, Australia, and China, the UK-based multinational handles a great deal of corporate and consumer banking across the APAC and EMEA regions, although it lacks a UK retail banking presence. The company is looking to shift its applications and services into the cloud in order to take advantage of the agility that this offers, and Standard Chartered’s CTO of cloud transformation Bhupen Warathe is the man in charge of making it happen.

The bank is implementing a multi-cloud, multi-region strategy, which Warathe says provides better resilience and reliability, as well as mitigating risks around where workloads are running from a geographic perspective. The company is also planning to make use of each platform’s different strengths and capabilities, and one of Standard Chartered’s two cloud providers is Microsoft Azure, chosen in part for its strong SaaS, AI and security competencies.

Around 10% of the bank’s employees have already been moved onto Microsoft 365, Warathe says, with the migration process expected to be complete by the end of next year. The company is also planning to use Azure’s AI and data analytics capabilities to offer richer insights to both staff and clients.

“They have a whole lot of great services including Power BI, and some of the big data products. We want to utilise that for better client insight,” Warathe says. “We want to generate better insights for our frontline staff and also provide much more rich analytics to our clients, both in corporate institutional banking, as well as to our retail and private banking clients.”

Some workloads will also be migrating to Azure, with the bank’s trade finance portfolio earmarked to go first. However, Standard Chartered’s multi-cloud strategy is based around balancing its workloads between Azure and a second cloud provider, the identity of which has not yet been publicly disclosed. 

“Trade finance [will] be moved to Azure; there will be other applications that will be going to the second cloud provider,” he says. “So the examples are, our payment systems will be going to the second cloud provider, and also our digital bank – or what we call virtual bank – capabilities will also be going [there]. So, in summary, we will be balancing the compute load between the two cloud providers, and that helps us.”

There will also be geographical considerations when determining which providers are used. “We are treating Hong Kong and Singapore as a pair, and if things go wrong in Hong Kong, we will switch to Singapore,” Warathe explains. “Similarly, London and Ireland is another pair in the West and we will be using cross-regional resiliency for a specific service provider. In the longer run, we would like to have switching between the cloud providers for specific workloads, but that’s not the immediate plan.”

In addition to resiliency, the multi-region strategy also addresses the bank’s data residency requirements, and Warathe cites Azure’s strong support for regional data hosting as a key feature for enabling Standard Chartered to meet its regulatory and compliance requirements. The company has 45 booking locations across 60 markets, and now that regulators are starting to open up to the use of cloud within the financial services market, Standard Chartered has begun engaging with these regulators to support its cloud rollout.

As with many cloud migrations, the bank is aiming to improve the scalability of its services as part of the project. In some of its larger Asian markets, Warathe says, Standard Chartered has seen huge growth in the volume of payments and transactions that it is processing. In particular, the coronavirus pandemic has driven a huge shift towards digital buying behaviour and e-commerce activities.

“All of those capabilities, buying behaviour from the corporate clients as well as retail clients, is going pretty much digital. And that’s where we have scalability requirements. So in some markets, we have 10 times more volume on a particular day, as compared to another day where the volume is kind of not there. And that’s where the whole scalability aspect also fits some of our needs to become a true digital bank.”

This isn’t the company’s first experience with the cloud, however. In fact, Standard Chartered has been using public cloud for the past three years, with six applications already migrated to its second unnamed provider as pilot tests.

“We already have experience and hence we’re very comfortable going big with a couple of cloud providers,” Warathe says. “We have a financial market business, which has many deployments where we need to do grid computing for risk analysis and portfolio level computations. And at peak, we have to use 10,000 vCPUs – so it’s that kind of load, that kind of compute.”

While this isn’t the first time Standard Chartered has worked with the cloud, it’s no less of a mammoth undertaking for the company, and skills are a firm priority for Warathe. The company has two main development centres in London and Singapore, and Warathe is focusing on making sure that his staff are fully trained on all of the cloud systems that the new infrastructure will need.

“We have close to 10,000 IT professionals in the bank. Cloud needs a different kind of skill set [and] we have established a very good upskilling programme with both the cloud providers. We have already trained more than a thousand people last year,” he says. “This year, we’re training another thousand individuals on cloud technologies. Things like how Kubernetes services or container services work, how some of the PaaS services and managed services are much better than what we can get from the traditional software in production.”

For Warathe, Kubernetes is pivotal to this strategy. The company’s new payment systems are going to be fully deployed using Kubernetes, he says, as will the company’s trade finance systems. There is also a lot of replatforming going on in preparation for the move, with many of the company’s core banking systems being worked on. At the same time, many newer applications, such as its digital banking products, are being developed as cloud-native applications from the word go.

“Kubernetes gives us the best scalability the industry has ever seen. It also gives us the best portability of moving the workloads between the two cloud service providers,” he says, “and that’s where wherever we have volumes, wherever we need scalability, those are the applications we are targeting for Kubernetes and container-based services.”
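As a rough illustration of the scalability and portability Warathe describes – a hedged sketch using the official Kubernetes Python client, with a hypothetical payments deployment rather than Standard Chartered’s actual systems – scaling a containerised workload up for a peak day is a single API call, and the same call works against whichever provider’s cluster the current kubeconfig points at:

```python
# Minimal sketch: scaling a Kubernetes deployment for a peak day using the
# official Python client. The deployment, namespace and replica counts are
# hypothetical; the call is identical regardless of which cloud hosts the cluster.
from kubernetes import client, config

config.load_kube_config()  # targets whichever cluster the current kubeconfig selects
apps = client.AppsV1Api()

apps.patch_namespaced_deployment_scale(
    name="payments-api",
    namespace="payments",
    body={"spec": {"replicas": 50}},  # e.g. 10x a quiet-day baseline of 5 replicas
)
```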

As you’d expect, Standard Chartered is also taking its own finances into consideration, and Warathe notes that the OpEx-based model of cloud computing offers a very attractive way for the bank to minimise its infrastructure costs, compared to making large capital investments in on-premises hardware.

“I think that [CapEx] model was quite good when we had predictable volumes, [but] when you have massive peaks and troughs, then the CapEx model doesn’t work that well. If you have a predictable volume, you can go for a 16-CPU box and maximise your dollar for five or 10 years,” he says. “But when you have a really dynamic throughput and a very varying degree of volume, then I think the CapEx model doesn’t work as well.”

“With cloud, we don’t have to buy hardware and network and switches and everything else to really put into our books and capitalise it for the next four to five years; that’s one of the biggest advantages on the financial management side as well. And there are industry results, which show pretty good savings once we achieve a critical mass in terms of migration of workloads… initially you’ll have a bit of double bubble that means a bit of extra cost on one side, but eventually it gives you benefits on the OpEx side as well.”

UK gov urged to help SMBs with digital adoption incentives


Bobby Hellard

9 Nov, 2020

An advocate for tech startups is urging the UK government to find ways of incentivising tech adoption for SMBs to boost the country’s productivity. 

The Coalition for a Digital Economy (Coadec) has called for a digital adoption fund that provides tax relief to SMBs, as well as greater collaboration from the UK’s startup ecosystem. 

In its report, ‘Hidden Figures’, Coadec refers to the UK as “the sick man of Europe” due to its stagnant productivity. The organisation suggests that unless more is done to make digital adoption easier and more attractive to the country’s small and medium-sized businesses, the UK will fall further behind others in Europe. 

“Although most nations have experienced slow productivity growth since 2008, the situation in the UK is by far the worst amongst our peers and one of the worst performances in UK history,” the report states.

“However, the British economy hasn’t always been characterised by slow productivity growth, but performance has fluctuated compared to that of our peers across the last 60 years. In 1960, the UK had the highest level of productivity in Europe before suffering a slowdown in the 1960s and 1970s which led to the UK becoming known as the ‘sick man of Europe’.”

To get the country back to health, Coadec recommends the government put in place four incentives for SMBs. The first is a fund that works like tax credits for companies that want to adopt new technology – helping to reduce the cost. 

For those that are perhaps not confident about what tech their business needs, or unsure what value it would offer, the report recommends creating a sector-by-sector ‘tech matrix’, a list of approved products and services. This also ties into its recommendation that tech startups offer SMBs support on what tech to adopt and how.

Finally, the report urges the government to create a post-COVID-19 tech adoption strategy, so businesses can be more agile in the face of future challenges.