Siemens and Google Cloud join forces on factory automation


Zach Marzouk

19 Apr, 2021

Google Cloud and Siemens have announced a new partnership that will see AI and machine learning brought to factory floors.

Siemens is planning to integrate Google Cloud’s data cloud and artificial intelligence (AI) and machine learning (ML) technologies with its factory automation tools.

With this partnership, the companies hope manufacturers will be able to harmonise factory data, run cloud-based AI/ML models from that data, and deploy algorithms at the network edge. This could produce applications that visually inspect products, for example, or predict the wear-and-tear of machines on the assembly line.
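Neither company has published implementation details, but a deployment like this typically boils down to a model trained in the cloud being evaluated at the edge against live sensor data. A minimal, purely hypothetical sketch in Python, with an invented wear-score model and threshold:

```python
# Hypothetical sketch of an edge predictive-maintenance loop (not Siemens'
# or Google Cloud's actual stack). A model trained in the cloud scores
# local sensor readings; the feature and threshold are invented for
# illustration.
import random
import statistics

WEAR_THRESHOLD = 0.8  # assumed alert threshold for the illustrative model

def wear_score(vibration_readings):
    """Toy stand-in for a cloud-trained model running at the edge:
    maps recent vibration variance to a 0-1 wear estimate."""
    variance = statistics.pvariance(vibration_readings)
    return min(1.0, variance / 5.0)  # illustrative scaling only

def poll_sensor():
    """Simulated sensor batch; a real deployment would read from the line."""
    return [random.gauss(0.0, random.uniform(0.5, 3.0)) for _ in range(50)]

for machine_id in ("press-01", "press-02", "lathe-07"):
    score = wear_score(poll_sensor())
    status = "schedule maintenance" if score >= WEAR_THRESHOLD else "OK"
    print(f"{machine_id}: wear score {score:.2f} -> {status}")
```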

The ultimate goal, the companies said, is to make it easier to deploy AI in connection with Siemens’ Industrial Edge platform. They hope this will empower employees on the plant floor, automate mundane tasks and improve overall quality.

“The potential for artificial intelligence to radically transform the plant floor is far from being exhausted. Many manufacturers are still stuck in AI ‘pilot projects’ today – we want to change that,” said Axel Lorenz, VP of Control at Factory Automation of Siemens Digital Industries.

“Combining AI/ML technology from Google Cloud with Siemens’ solutions for Industrial Edge and industrial operation will be a game changer for the manufacturing industry.”

Google Cloud has forged a number of similar partnerships, including one with Intel focused on developing integrated solutions that help network providers deliver 5G innovations across various platforms. That collaboration highlighted Google Cloud’s ambitions in the 5G world, as well as Intel’s goal of building out 5G with software-defined infrastructure.

A month after that announcement, Google Cloud hired Uri Frank, an Intel engineering veteran, to lead its new server chip design efforts as part of its increasing investment in custom silicon. Frank was VP of Platform and Silicon Engineering at Intel and had been appointed corporate VP of Intel’s Design Engineering Group before choosing to leave.

Ocado invests £10m into autonomous vehicle startup Oxbotica


Bobby Hellard

16 Apr, 2021

Ocado has invested £10 million in Oxford-based self-driving car startup Oxbotica, in a deal that includes a partnership to develop autonomous vehicles for kerbside deliveries.

The investment came as part of a funding round for Oxbotica and forms the basis of a multi-year collaboration that ultimately aims to reduce costs for Ocado.

The deal is an extension of an existing partnership between the two companies and will focus specifically on developing new hardware and software interfaces for autonomous vehicles that will be used in and around Ocado’s Customer Fulfilment Centre (CFC). This includes a range of logistical drones for use across its factories and loading areas.

Both firms are also interested in “last mile” delivery drones that take goods from vans to front doors.

Data sharing agreements have also been signed as part of the deal, which includes the fitting of “data capture capabilities” inside Ocado delivery vans that will be used by Oxbotica to train and test its technologies. The idea is that the data will highlight which Oxbotica technologies will suit Ocado’s needs.

“We are excited about the opportunity to work with Oxbotica to develop a wide range of autonomous solutions that truly have the potential to transform both our and our partners’ CFC and service delivery operations, while also giving all end customers the widest range of options and flexibility,” said Ocado’s chief of advanced technology, Alex Harvey.

The partnership could also lead to new jobs with Ocado, which is creating new engineering teams to work specifically with Oxbotica. While no figures have been provided, we do know these roles will be within Ocado’s Advanced Technology division, which is already separate from the team that develops the Ocado Smart Platform.

Logistical costs make up a large part of Ocado’s overall expenditure, the firm said. Approximately 1.5% of sales in the UK are lost due to the cost of moving finished orders from the fulfilment centre to delivery vans, while 10% of sales are lost when delivering goods from the van to the door. Labour also represents 50% of these costs, according to Ocado.
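On those numbers, the back-of-the-envelope arithmetic (ours, not Ocado’s) per £100 of sales looks like this:

```python
# Our own illustration of Ocado's quoted figures, per £100 of UK sales.
sales = 100.00
cfc_to_van = sales * 0.015        # 1.5% lost moving orders from CFC to vans
van_to_door = sales * 0.10        # 10% lost delivering from van to door
total_logistics = cfc_to_van + van_to_door
labour = total_logistics * 0.50   # labour is roughly half of these costs

print(f"CFC to van:  £{cfc_to_van:.2f}")
print(f"Van to door: £{van_to_door:.2f}")
print(f"Total:       £{total_logistics:.2f}, of which labour £{labour:.2f}")
```

On that basis, automating even part of the final leg attacks the largest single component of the cost.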

The grocery firm expects to see the first prototypes of some early use cases for autonomous vehicles within two years.

Google’s Project Zero trials 120 day disclosure window for new software flaws


Keumars Afifi-Sabet

16 Apr, 2021

Google’s Project Zero team has updated its vulnerability disclosure policies to introduce a 30-day cushion for businesses to apply patches to the flaws it discloses before revealing any precise exploit mechanisms.

Currently, the security research team adheres to a 90-day disclosure window, running from the point a vulnerability is reported to a vendor to the point Project Zero makes it public, in order to give software vendors enough time to develop a patch behind the scenes.

Project Zero’s new trial, however, will see the team tack on an additional 30 days to the original window before publishing any technical details, including details behind zero-day vulnerabilities. This will be cut to a period of seven days for bugs that hackers are actively exploiting.

Project Zero is making these changes to encourage faster patch development, to ensure that each fix is correct and comprehensive, and to shorten the time between a patch being released and users installing it.
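In practice, the trial reduces to a simple date calculation. The sketch below reflects our reading of the policy as described above, not any Google tooling:

```python
# Our reading of the trial policy as a date calculation (not a Google tool):
# 90 days to patch (7 if actively exploited), plus a 30-day cushion before
# technical details are published.
from datetime import date, timedelta

def disclosure_dates(reported, in_the_wild=False):
    """Return (patch deadline, earliest date for technical details)."""
    patch_window = 7 if in_the_wild else 90
    patch_deadline = reported + timedelta(days=patch_window)
    details_date = patch_deadline + timedelta(days=30)
    return patch_deadline, details_date

deadline, details = disclosure_dates(date(2021, 4, 16))
print(f"Patch deadline: {deadline}; technical details no earlier than {details}")
```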

The team also wants to reduce the risk of opportunistic attacks launched immediately after technical details are revealed. Flaws in F5 Networks’ BIG-IP software suite serve as a recent example of this phenomenon: hackers began scanning for vulnerable deployments shortly after technical details of a handful of critically-rated flaws were published.

The trial is significant as many security research teams across the industry seek to mould their own disclosure policies around those adopted by Project Zero. The success of this trial, therefore, could pave the way for industry-wide changes.

For example, when Project Zero first introduced an automatic 90-day disclosure window in January 2020, a host of other teams shortly followed suit, including Facebook’s internal researchers in September that year.

“Much of the debate around vulnerability disclosure is caught up on the issue of whether rapidly releasing technical details benefits attackers or defenders more,” said Project Zero’s senior security engineering manager, Tim Willis.

“From our time in the defensive community, we’ve seen firsthand how the open and timely sharing of technical details helps protect users across the Internet. But we also have listened to the concerns from others around the much more visible ‘opportunistic’ attacks that may come from quickly releasing technical details.”

He added that while Project Zero continues to believe the benefits of quick disclosure outweigh the risks, it was willing to incorporate feedback into its policies. “Heated discussions” about the risks and benefits of releasing technical details, or proof-of-concept exploits, have also been a significant roadblock to cooperation between researchers and vendors.

Project Zero will, in future, explore reducing the initial 90-day disclosure window in order to encourage vendors to develop patches far quicker than they currently do, with the aim of one day adopting something closer to a 60+30 policy. Based on its data, the team is likely to reduce the disclosure window in 2022 from 90+30 to 84+28.

Although vendors often do release patches in a timely manner, one of the biggest challenges in cyber security is encouraging customers to actually apply these updates to protect themselves against potential exploitation.

There are countless examples of patched vulnerabilities that are still being actively exploited because organisations have failed to apply the relevant updates.

The Cybersecurity and Infrastructure Security Agency (CISA), for instance, revealed in 2020 that many of the top ten most commonly exploited flaws were those for which patches had existed for years. As of December 2019, hackers were even exploiting a vulnerability in Windows common controls that Microsoft fixed in April 2012.

As the trial unfolds in the coming months, Project Zero has encouraged businesses keen to understand more about the vulnerabilities being disclosed to approach their vendors or suppliers for technical details.

The team won’t reveal any proofs-of-concept or technical details prior to the 30-day window elapsing unless there’s a mutual agreement between Project Zero and the vendor.

Assessing your cloud strategy after COVID


David Howell

16 Apr, 2021

According to research from Virtana, 72% of enterprises have moved one or more applications from the public cloud back on-premises. 

The top reasons cited were that the applications should never have been moved to a public cloud in the first place (41%), technical issues with public cloud provisioning (36%), performance degradation (29%), and unexpected cloud costs (20%).

As the cloud has become a vital component of almost every business’ IT infrastructure – especially since COVID-19 took hold – many enterprises that rushed their expansion of hosted applications are now re-evaluating how they create, manage and deploy cloud services.

Speaking to IT Pro on the publication of the report, Kash Shaikh, President and CEO of Virtana, explains how a cloud deployment should be handled. “Critical applications should never be rushed to the cloud,” he says. “There is really no reason to do it when there are partners and platforms that can help ensure applications will run smoothly in the public cloud and for the right cost.”

Getting this right is critical: businesses need a multi-cloud environment that can meet the challenges they face today, with IT infrastructures able to support their processes and remote staff in a post-COVID-19 landscape.

Taking a step back and evaluating how cloud services are bought and integrated into a business should be a priority for all CTOs. After weathering the initial COVID storm, now is the time to closely audit how cloud services may have proliferated and how they can be rationalised moving forward.

Repatriating data

How applications and the data they rely on are used is now very different from how it was before mass remote working became the norm. In its 2021 Hybrid Cloud Report, NTT encapsulates the current drivers behind the hybrid cloud, stating: “Business continuity, resilience, and agility are the priority business objectives.” In practical terms, the pandemic exposed, in some cases, an underestimation of the network infrastructure needed to support the rapid changes enterprises had to cope with.

The security of data at rest and in motion has never been more critical. Cloud services, by their nature, offer consumption-based, flexible environments, yet the visibility of these services can be opaque in some instances. This lack of transparency can lead to a loss of control and low levels of security. As digital transformation continues, it is essential to increase visibility to ensure data security is robust.

How a given business uses its data can be a practical guide when deciding whether to adopt more public cloud services. If large quantities of data will be in motion across several network endpoints, there is a case for moving it out of the public cloud.
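The arithmetic behind that case is simple: public cloud providers typically bill per gigabyte for data leaving their networks, so sustained data movement compounds quickly. An illustrative estimate with an assumed egress rate (not any provider’s actual pricing):

```python
# Illustrative egress cost estimate; the per-GB rate is an assumption,
# not any provider's published price list.
EGRESS_PER_GB = 0.08          # assumed $/GB for data leaving the cloud
monthly_egress_gb = 50_000    # e.g. 50 TB in motion across endpoints

monthly_cost = monthly_egress_gb * EGRESS_PER_GB
print(f"Estimated egress: ${monthly_cost:,.2f}/month, "
      f"${monthly_cost * 12:,.2f}/year")
```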

As the quantity of data businesses must manage expands thanks to IoT, for instance, the unique requirements that data carries in order to deliver value efficiently will be the core guide to whether on-prem, public or hybrid cloud deployments are required. It’s no surprise that this re-evaluation of cloud deployment has given rise to new services such as HPE GreenLake.

Tracy Woo, a senior analyst with Forrester, believes the HPE model could offer the secure, flexible cloud services all businesses will need. “Most datacentre purchases are heterogeneous or under one brand; [products] like Greenlake offer folks a ‘capacity on demand’ model that is primarily a financing and contractual vehicle to enable incremental purchases. As hyperconverged systems gain traction, full-stack infrastructure solutions for compute, storage, and network become inseparable and are subject to subscription- or consumption-based pricing, paving a future for businesses like Greenlake and also Dell’s Project Apex.”

A hybrid future?

Post-pandemic, it’s clear more distributed resources will be used to support remote working, and, critically, those resources must be secured. The rush to expand cloud services to keep vital networks operating often meant securing these services was not a priority; that must now change.

NTT found that 93% of organisations agree cloud is critical to meeting their immediate business needs, while 88% agree it’s also essential to meeting their future business needs. The hybrid cloud has become the foundation onto which these new services will be built. But the hybrid cloud must be thoroughly evaluated to ensure this structure can meet the needs of the workers using these systems.

As work has changed, so must the cloud services that support the networks businesses use today and will need in the near future. In the face of an increased desire to use data-intensive technologies such as machine learning, and continuing and expanding cyber threats, it’s not surprising many IT professionals are evaluating whether the public cloud can deliver the performance and visibility they need at an affordable cost.

Some of the core reasons for data repatriation identified in IDC’s multi-cloud survey include security (25% of respondents) and performance (22%). Additionally, 12% of European organisations stated that their migration of business applications to the cloud was unsuccessful.

Carla Arend, senior program director for Cloud Research Europe at IDC, comments: “Cloud strategy and data strategy need to converge, and organisations need to have a good understanding of their data estate when crafting their cloud strategy. Data classification is critical to making sure which data can be moved to the cloud and which data should stay on-premises, for example for regulatory compliance purposes.”

No one is arguing for data and applications to move wholesale back on-prem. What is being highlighted, especially now that businesses have gained some perspective on their cloud deployments over the last year, is the need for a more strategic approach to the cloud.

Shaikh believes a more nuanced and integrated approach is needed. “Although IT applications can be strategic, business owners look for SaaS, PaaS, or even IaaS as options to reduce capital expense, labour, utilities, and accounting. They are in the business of selling products, not running data centres,” he says. “The reason why on-premises data centres will linger around is the time and investment to the critical infrastructure that runs the critical applications that are very difficult to uproot.”

No CTO can ignore the cloud. However, with a working landscape in flux and an expanding need to further embrace hosted applications, a new strategic approach is needed to ensure the cloud services created are fit for purpose. 

The hybrid approach is still the option most businesses will choose. However, re-evaluating their application, data access and network needs might just reveal new approaches to hosted services that could offer the cost savings, security and efficiency gains needed for their business to thrive post-COVID-19.

Salesforce will reopen its offices in May


Bobby Hellard

14 Apr, 2021

Salesforce will begin welcoming employees back into its US offices, starting with vaccinated members of staff in the middle of May. 

The tech giant’s San Francisco headquarters, Salesforce Tower, along with its Palo Alto and Irvine offices will allow cohorts of 100 people or fewer, according to The San Francisco Chronicle.   

The tech firm, which is one of the city’s largest private employers, is requiring proof of vaccination before allowing workers back in. All employees who return will do so voluntarily, as Salesforce has begun to implement a hybrid strategy that enables staff to work from home on a permanent basis.

The company is said to be eliminating designated desks and expanding ‘collaboration’ spaces to align with its future workforce plans.

“It’s really a catalyst to create the best employee experience,” Brent Hyder, Salesforce’s chief people officer, told The Chronicle. “We have an opportunity to create an even better workplace for everyone.”

The cloud company appears to be the first major firm in San Francisco’s Bay Area to opt for proof of vaccination. Firms like Facebook and Google are also welcoming employees back into offices, but don’t require any kind of vaccination ID. 

It highlights the degree to which Salesforce has fully embraced its hybrid work strategy, whereas the likes of Google have seemed more cautious.

Despite previously warning that a hybrid model could affect its culture and finances, Google has made changes to its remote working policies. The firm recently said employees can work from home overseas for more than 14 days a year, pending an application to do so. The company’s current work-from-home arrangements are in place until 1 September, after which it will allow people to voluntarily return to the office.

Before the pandemic, around 18% of Salesforce employees were fully remote. Hyder expects that number to eventually sit at around 20%, as most will choose a mix of home and office working. Although productivity was higher, he added that employees were “growing weary” because they “want to see each other”.

Union urges ministers to give remote workers a ‘right to disconnect’


Bobby Hellard

13 Apr, 2021

UK ministers are being urged to include a ‘right to disconnect’ policy in the forthcoming Employment Bill to address the boundaries between work and home life.

Tech workers union Prospect wants a legal requirement put in place to force companies to discuss when they can contact their employees while working from home.

Around two-thirds of UK workers want to see a ‘right to disconnect’ policy put in place, according to polling carried out by Opinium on behalf of Prospect. The pollster interviewed 4,005 UK nationals over the first week of April and found that 66% would support the policy if it were brought in. Support was strong across all age groups and political affiliations, according to Prospect, with 53% of Conservative voters in favour.

Including a right to disconnect in the Employment Bill would be a big step in redefining blurred boundaries and would show that the government is serious about tackling the “dark side” of remote working, according to Prospect’s research director Andrew Pakes.

“People’s experience of working from home during the pandemic has varied wildly depending on their jobs, their home circumstances, and crucially the behaviour of their employers,” Pakes said.  

“It is clear that for millions of us, working from home has felt more like sleeping in the office, with remote technology meaning it is harder to fully switch off, contributing to poor mental health. Remote working is here to stay, but it can be much better than it has been in recent months.”  

Mental health problems also featured in the survey, with 35% of participants saying their work-related mental health had worsened during the pandemic. Some 42% said this was partly due to an inability to switch off from their jobs, while 30% stated they were working more unpaid hours than before the pandemic – 18% put this at four or more hours of additional work.

A number of other countries, such as France and Ireland, have some form of the right to disconnect enshrined in law, and the policy is also supported by the European Parliament.

Nvidia takes aim at Intel with first data centre CPU


Zach Marzouk

13 Apr, 2021

Nvidia has unveiled Grace, an Arm-based data centre CPU designed for giant-scale artificial intelligence (AI) and high-performance computing (HPC) applications. 

This new processor combines Arm CPU cores with a low-power memory subsystem to help it analyse enormous datasets requiring both ultra-fast compute performance and massive memory.

Nvidia Grace, named after US programming pioneer Grace Hopper, is a highly specialised processor that will target workloads such as training next-generation NLP models that have over 1 trillion parameters, according to the company. 

Furthermore, Nvidia claims that a Grace CPU-based system will deliver 10x faster performance than the current Nvidia DGX-based systems that run on x86 CPUs. Nvidia expects this new processor to service a niche segment of computing.

“Leading-edge AI and data science are pushing today’s computer architecture beyond its limits – processing unthinkable amounts of data,” said Jensen Huang, founder and CEO of Nvidia.

“Using licensed Arm IP, Nvidia has designed Grace as a CPU specifically for giant-scale AI and HPC. Coupled with the GPU and DPU, Grace gives us the third foundational technology for computing, and the ability to re-architect the data centre to advance AI. Nvidia is now a three-chip company.”

The Swiss National Supercomputing Centre (CSCS) and the US Department of Energy’s Los Alamos National Laboratory have already announced plans to build Grace-powered supercomputers.

This move could spell trouble for chipmakers that already have a strong presence in the data centre market, such as AMD and Intel, the latter of which currently dominates with a 90% share. The promise of a 10x increase in processing performance may cause some customers to take note of Grace. This was reflected in the markets, where Intel and AMD shares were both down several percentage points following Nvidia’s announcement.

Nvidia also announced eight Nvidia Ampere architecture GPUs for next-generation laptops, desktops and servers. 

The new Nvidia RTX A5000 and RTX A4000 GPUs will accelerate AI, graphics and real-time rendering in desktops by up to 2x over the previous generation. In laptops, the new RTX A2000, RTX A3000, RTX A4000 and RTX A5000 GPUs deliver accelerated performance without compromising mobility.

For data centres, the new Nvidia A10 GPU provides up to 2.5x the virtual workstation performance of the previous generation while the A16 GPU provides up to 2x user density with lower total cost of ownership and an enhanced virtual desktop infrastructure experience over the previous generation.

In February, it emerged that Nvidia had turned to some of its older graphics cards to meet demand for GPUs during a global shortage of PC components and chipsets. Nvidia planned to re-release older chips, such as the GTX 1050 Ti, which was meant to have been phased out two years ago, as well as the GeForce RTX 2060.

IBM’s infrastructure services spin-off to be named Kyndryl


Sabina Weston

13 Apr, 2021

IBM has unveiled the name of its Managed Infrastructure Services business which is to become a fully-fledged public company by the end of this year.

The company previously referred to the spin-off as simply ‘NewCo’, but it has now been announced that the new infrastructure services company is to be named Kyndryl – a combination of the words ‘kinship’ and ‘tendril’.

The news comes six months after IBM declared that it would be splitting its business into two separate entities, bringing an end to a strategy that saw it attempt to shift towards cloud growth while maintaining a foothold in its legacy business.

Martin Schroeter, who was appointed Kyndryl CEO at the beginning of this year, said that the name “evokes the spirit of true partnership and growth”.

“Customers around the world will come to know Kyndryl as a brand that runs the vital systems at the heart of progress, and an independent company with the best global talent in the industry,” he added.

According to Kyndryl chief marketing officer Maria Bartolome Winans, “creating a name is just the start of our journey as a brand”.

“It will help identify us and support recognition, but the meaning of the name will be built and enhanced over time from our behaviours, aspirations and actions, and what we enable our customers to do. Our vision is to be the leading company that designs, runs and modernises the critical technology infrastructure of the world’s most important businesses and institutions, ultimately powering human progress,” said Winans.

Kyndryl’s headquarters are to be located in New York City, which Schroeter described as “one of the world’s most vibrant and global urban centres”, adding that the decision “underscores [Kyndryl’s] commitment to the economic health of cities”. IBM’s headquarters are to remain nearby, in Armonk, NY.

Despite being a newly formed company, Kyndryl is uniquely positioned as a well-established business thanks to its ties with IBM, bringing with it a global base of 4,600 customers.

The tax-free separation of Kyndryl from IBM is expected to be finalised by the end of 2021, with the latter set to focus entirely on its AI capabilities and the hybrid cloud.

Microsoft in ‘advanced talks’ to buy AI firm Nuance Communications


Bobby Hellard

12 Apr, 2021

Microsoft is reportedly set to acquire artificial intelligence (AI) speech recognition firm Nuance Communications for an estimated $16 billion (£11.6 billion).

The two companies are currently in “advanced talks”, according to Bloomberg sources, and Nuance could be valued at around $56 per share if current negotiations hold.

Microsoft is yet to comment on the matter but reports suggest that an official announcement will be made sometime this week, possibly late on Monday. 

Nuance is an American AI firm based just outside of Boston that sells audio recognition and transcription tools. The company was founded in 1992, has over 7,000 employees and reported $346 million in fourth-quarter revenues. It’s thought that Microsoft is attempting to expand via acquisitions, with an interest in Nuance’s work in healthcare, customer services and voicemail. 

Microsoft and Nuance already have a professional relationship, collaborating on technologies that allow doctors to capture voice conversations and enter them into electronic medical records. AI and voice software are areas in which Microsoft already has extensive expertise, with developer tools for transcription and functions to incorporate speech recognition into its products. Both Teams and the Bing search engine have some form of speech recognition and audio transcription, for example.

The tech giant is seemingly in takeover mode, with a $7.5 billion deal for video game maker ZeniMax completed last month and widespread reports that it recently considered buying social media app TikTok. The firm was also said to be chasing an acquisition of gaming chat app Discord for around $10 billion.

Both the Discord and TikTok deals were thought to be more about the large volumes of user data Microsoft would gain access to if the acquisitions were successful. If the details around Nuance are correct, it would be Microsoft’s biggest acquisition since buying LinkedIn in 2016. The firm used data from the recruitment site to power functions on platforms like Outlook and Teams.

NHS to digitise coronavirus testing with new Scandit deal


Sabina Weston

8 Apr, 2021

NHS Digital has signed a deal with augmented reality (AR) solutions provider Scandit in an effort to digitise the UK’s Covid testing process.

The Swiss software company will provide the NHS with a data capture service, which will be available for free to any hospital or NHS organisation involved in Covid test tracking, PPE tracking, or patient care, until 30th November 2021.

Specifically, the company will be deploying barcode scanning technology to make it easier and faster for healthcare workers and volunteers to track and identify test samples. Test site staff will be able to identify patients from a safe distance, avoiding the risk of contagion, by scanning a barcode on a booking form with a smartphone.

After the samples are taken, they will be stored in vials with barcodes attached, which staff can quickly scan to ensure each test sample matches its booking ID and so minimise mistakes.
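The matching step itself is easy to picture: the scanned vial barcode must resolve to the same record as the booking. Scandit’s SDK handles the scanning; the logic below is our own illustrative sketch with invented identifiers:

```python
# Illustrative vial-to-booking check (our sketch, with invented IDs;
# Scandit's SDK performs the actual barcode scanning).
bookings = {"BKG-1001": "VIAL-77301", "BKG-1002": "VIAL-77302"}

def sample_matches_booking(booking_barcode, vial_barcode):
    """True only if the scanned vial is the one registered to the booking."""
    return bookings.get(booking_barcode) == vial_barcode

assert sample_matches_booking("BKG-1001", "VIAL-77301")
assert not sample_matches_booking("BKG-1001", "VIAL-77302")  # mismatch caught
print("Vial-to-booking checks passed")
```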

The announcement of the deal comes days after the government proposed twice-weekly rapid Covid-testing which would be made available to everyone in England starting 9 April.

Rapid testing has so far been available to those considered most at risk, such as people over 60 or those with underlying health conditions, as well as those unable to work from home and frontline NHS staff. However, the government is now encouraging everyone to take regular tests in order to prevent outbreaks and return to “a more normal way of life”.

Health secretary Matt Hancock said that, due to one in three Covid carriers being asymptomatic, “regular rapid testing is going to be fundamental in helping us quickly spot positive cases and squash any outbreaks”.

“The vaccine programme has been a shot in the arm for the whole country, but reclaiming our lost freedoms and getting back to normal hinges on us all getting tested regularly,” he added.

Due to the more regular Covid testing pitched by the government, the NHS is facing a significant increase in workload over the coming weeks. The process is to be facilitated with the help of Scandit-supplied technology, at no extra cost.

Commenting on the deal, Scandit CEO Samuel Mueller said that the Swiss software company’s tech “ensures that tests can be tracked quickly and easily”.

“It also integrates easily with smartphones, meaning that the NHS has been able to scale the number of testing sites and make it easy to deploy home-testing effectively. There is no acceptable margin of error. Our clinical-quality barcode scanning technology delivers a highly accurate read rate whether the scan is happening through a car window at a drive-in mobile test site or by someone who is self-testing at home,” he added.

Mueller said that Scandit has been “helping NHS Digital digitise their nationwide testing process since the start of the pandemic”.

“We have taken steps to support the NHS at every step of the way to integrate our technology into the complex NHS IT infrastructure seamlessly. We look forward to continuing this partnership and helping the NHS make it possible to deliver quick Covid-19 tests for every UK resident that needs it,” Mueller added.