How AI is bringing a new dimension to software testing

Software testing teams analyse and correct thousands of lines of code on a daily basis to ensure the final product is free of errors. However, on-demand customers expect software that is comprehensive in functionality and delivered with precision and speed. Current software testing procedures are not scalable enough to meet these needs, nor are they cost- or time-efficient in the digital economy.

As products become more complex to create, the code becomes more challenging to test accurately. Manual testing exposes development teams to many challenges—code changes causing errors elsewhere in the product, the considerable length of regression testing cycles, resourcing constraints of hiring skilled software testers to meet demand, and more.

While the current practices of agile and DevOps increase the pace of software development, meeting near-future market needs requires the power of predictive technologies to enhance traditional software testing solutions.

Artificial intelligence (AI) and machine learning (ML) provide a dynamic framework for predicting and resolving coding errors before they appear. The more data patterns ML analyses, the more processes it can run and self-adjustments it can make based on those learned patterns. This continuous delivery of insights increases in value with the “intelligence” of the technology. AI has enormous potential to reshape software development. When properly leveraged, AI solutions drive efficiency, optimise processes, and enhance experiences.

Let’s take a closer look at some of the key advantages of implementing AI/ML for testing software during the development process:

Automating and accelerating the testing process

Deploying AI/ML in the software testing process is not about replacing human testers, but about technology working in collaboration with humans to make the software development lifecycle more efficient and productive. Software companies utilise the skills of AI/ML experts to apply technology solutions that operate in conjunction with, and complement, the traditional software testing processes and solutions already in place.

AI can automate and reduce the number of routine tasks in development and testing, going beyond the limitations of traditional test automation tools. Software companies can train AI algorithms to instantly recognise, capture, and analyse large data sets, expediting a process in which speed, cost, and efficiency are vital.

For example, a traditional software test tool runs tests without discernment, executing every available test. AI can add significant value and efficiency by reviewing the current test status, recent code changes, and other code markers, deciding which tests to run, and executing them. This allows for scalable and efficient decision-making, freeing software engineers to spend time on more complex and strategic tasks.
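The selection idea described above can be sketched in a few lines. This is an illustrative toy, not any vendor's algorithm: it assumes a hypothetical coverage map linking each test to the files it exercises, and picks only the tests affected by a change.

```python
# Sketch of change-aware test selection (illustrative; all names are hypothetical).
# Instead of running every test, select only those whose covered files
# intersect the files changed in the latest commit.

def select_tests(changed_files, coverage_map):
    """Return tests whose coverage overlaps the changed files.

    coverage_map: {test_name: set of files the test exercises}
    """
    changed = set(changed_files)
    return sorted(
        test for test, files in coverage_map.items()
        if files & changed
    )

coverage_map = {
    "test_login": {"auth.py", "session.py"},
    "test_checkout": {"cart.py", "payment.py"},
    "test_profile": {"auth.py", "profile.py"},
}

# A change to auth.py should trigger only the two auth-related tests.
print(select_tests(["auth.py"], coverage_map))  # ['test_login', 'test_profile']
```

Real AI-assisted tools go further, weighting tests by historical failure rates and predicted risk, but the core win is the same: fewer, better-targeted test runs.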

Removing bugs

Bugs naturally occur during software development, posing a major pain point for software testing teams. Software companies can use AI and ML algorithms across the company’s code library to flag coding mistakes and discover bugs before developers include them in the code. This application of AI algorithms can help development teams save a significant amount of time and resources by removing the need to manually find and address bugs. Ultimately, AI can also help software companies decide whether further coding changes are required to prevent program errors.
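A minimal sketch of this flagging idea follows. Real ML-based tools infer defect patterns from large code corpora; here, purely for illustration, the "learned" patterns are hard-coded as regular expressions, so the example is runnable but hypothetical.

```python
# Illustrative sketch: flagging likely bugs by matching patterns "learned"
# from past defects. The patterns below are hard-coded stand-ins for what
# an ML model would infer from a company's code library.
import re

BUG_PATTERNS = [
    (re.compile(r"==\s*None"), "use 'is None' instead of '== None'"),
    (re.compile(r"except\s*:"), "bare 'except:' swallows all errors"),
    (re.compile(r"open\([^)]*\)(?!\s*as)"), "file opened without context manager"),
]

def flag_bugs(source):
    """Return (line_number, warning) pairs for suspicious lines."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, warning in BUG_PATTERNS:
            if pattern.search(line):
                findings.append((lineno, warning))
    return findings

code = "if user == None:\n    data = open('f.txt')\n"
for lineno, warning in flag_bugs(code):
    print(f"line {lineno}: {warning}")
```

The value over a conventional linter is that an ML system keeps discovering new patterns from the defects a team actually ships, rather than relying on a fixed rule set.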


ThousandEyes launches live Global Internet Outages Map


Sabina Weston

24 Mar, 2020

Network intelligence company ThousandEyes has launched Internet Outages Map, a tool that provides real-time visualisation of the current state of global internet health, based on ThousandEyes’ Internet Insights.

ThousandEyes’ map provides consumers, businesses, industry analysts and other parties with an insight into ongoing and recent internet outages that may be affecting the experiences of their customers or employees.

The tool uses data from the company’s “vantage points” situated around the world, which perform billions of measurements each day to detect when traffic flows are disrupted within ISPs, public cloud networks and other service providers.

“Over the past couple of weeks, we’ve been inundated with requests from businesses, industry analysts and other various parties wanting to get a better understanding of global internet health during these trying times,” said Mohit Lad, co-founder and CEO of ThousandEyes.

“Today, we’re thrilled to release the Global Internet Outages Map to give businesses and consumers alike a reliable source based on actual internet telemetry instead of public rumor to help them understand what’s happening on the Internet at any point in time.”

The service was launched following a mass migration of employees from the office to working from home, on the recommendations of governments and employers aiming to reduce the deadly impact of the coronavirus outbreak.

This has caused an unprecedented level of internet usage, with Vodafone sustaining a 50% surge in demand for data services in some markets, and popular streaming sites such as YouTube and Netflix announcing that they will lower their streaming quality in order to prevent a much-feared internet-speed bottleneck.

“Despite massive traffic increases — particularly across consumer last-mile networks — we have not seen a significant corresponding spike in Internet outages, which can occur when traffic levels strain network capacity,” wrote product marketing director Angelique Medina in ThousandEyes’ blog post. “However, there has been an upward trend line in outages over the last three weeks compared with the previous three-week baseline.”

The company says that the Global Internet Outages Map is available at www.thousandeyes.com/outages to anyone at any time, and also provides information in the form of a historical timeline depicting outage volume over the last 24 hours.

How to accelerate your hyperconverged healthcare cloud: A guide

In the race for digital transformation, even in healthcare, there is a need to implement the right systems and solutions to maximise uptime, increase operational performance and to reduce downtime. However, it isn’t as simple as going out to a supermarket to buy a loaf of bread. There are so many potential solutions and systems on the market claiming that they do the job – but do they?

That’s the question that should arise whenever anyone is trying to resolve anything from latency to data storage. However, it’s not an easy question to answer, as the healthcare IT landscape is ever-changing.

Martin Bradburn, CEO of Peasoup Hosting, therefore comments: “For the healthcare sector, cloud infrastructure with its security and scalability can, and in some instances is already accelerating the development of clinical applications. Digital transformation is also changing the way healthcare operates, connecting remote clinical specialist resources directly to the patients for improved diagnosis on a worldwide scale.”

He adds that the use of cloud technology, more precisely "its ability to efficiently process and deliver data in a collaborative manner, analysing data into meaningful information… can relieve the current healthcare challenges. Equally, by using cloud IT infrastructure instead of an on-premise one, healthcare organisations pay more efficiently for what they use."

Hyperconverged benefits

In a January article for HealthTech magazine, ‘The Benefits of Hyperconvergence in Healthcare’, freelance journalist Tommy Peterson writes: “With expansions, mergers and an increased reliance on new technologies, the healthcare landscape is changing at dizzying speed.”

That’s because IT teams are being pushed to keep pace with digital transformation, to “navigate complex logistics and help to cut costs and deliver better patient care”, he says, while citing David Lehr, CIO of Luminis Health, a new regional healthcare system in Maryland, comprising Anne Arundel Medical Center and Doctors Community Medical Center.

Lehr adds: “We need to be good stewards of the investments our communities make in us. Driving up unnecessarily high costs on complicated IT infrastructure that takes an army of people to manage isn’t a great way to live up to that expectation.” In essence, then, hyperconvergence is seen as the answer: it consolidates infrastructure to save money and enhance reliability, uses virtualisation to streamline applications, and, he argues, its simplification makes data more useful, secure and accessible by making it easier to move on-premise applications to the cloud.

David Trossell, CEO and CTO of Bridgeworks, argues hyperconverged systems have many benefits for smaller healthcare providers. "By consolidating multiple technologies from multiple providers for all the separate equipment that forms a modern data centre, along with all the differing support contracts and possible interoperability issues that can occur, hyperconvergence brings this all down to one or two providers," says Trossell.

“Whilst hyperconvergence solves many day-to-day issues in the data centre, the biggest threats these days emanate from outside of it in the form of cyber-attacks, and these attacks are getting more and more sophisticated in their approach," Trossell adds. "Where once healthcare companies could revert to their backups and reload, cyber criminals are no longer content with just attacking online data; they have now even started to attack the backup software and backup data. This forces healthcare companies to seek new levels of data security operating on multiple levels, with redundancy across multiple locations.”

Technology management

Hyperconvergence can also permit healthcare organisations to know when they need to expand or upgrade their technology. When everything has been tested, and whenever there is a need to speak to a supplier’s technical support, they will only need to make one call.

Bradburn adds: “Hyperconvergence is becoming the new standard in infrastructure; there are many commercial offerings that simplify the management and provide greater control of the local infrastructure, ensuring higher availability. The physical infrastructure sizes are reduced, with lower power and cooling requirements and with less management complexity, which has obvious cost-saving benefits in the healthcare sector and reduces the risks of failure.

“A cloud service takes this reduction in risk and complexity to the next level by removing all the infrastructure management. This complements the healthcare environment, removing the traditional budget limitations of over- or under-provisioning. It also makes it easier to extend the infrastructure into the cloud to provide elastic growth, ensuring the infrastructure is always the right size to meet the demands of the users and applications.”

Team collaboration

With applications becoming more mobile and web-based, Bradburn notes there is an increasing team collaboration trend in real-time product development, analysis and reporting. With this in mind, he argues that the cloud environment is “perfect for big data sets – archiving, pulling out and data manipulation is fast and effortless. This is vital for all services when the response is urgent.”

He adds that a cloud can provide an air gap and be bundled with cloud back-up and disaster recovery services. These services 'minimalise and mitigate the risk of cyber-attacks such as hacking or ransomware', as he puts it.

“It's a great solution for all organisations seeking to leverage the cloud while keeping governance and privacy their highest priority. Cloud offers to healthcare organisations a cost-effective way to ensure complete availability of the IT infrastructure whilst limiting vulnerabilities.”

Transporting data

“Using the cloud as an offsite data protection facility has many advantages in cost and, if done correctly, provides an air-gapped repository,” says Trossell, who believes there is a need for new thinking about how data is transported. He says it’s imperative to move off-site data to a HIPAA-compliant cloud, or to multiple cloud providers.

Placing your data in a single data centre or cloud is highly risky. Healthcare companies therefore need to recognise not only that their data needs to be backed up in several locations, but also “that until the last byte of backup data has been received by the cloud, there is no backup.”

Trossell explains that the ability to move data in a timely fashion to the cloud is governed by three factors: latency, packet loss and the bandwidth of the WAN link. He adds: “In many cases we see the assumption that if you want to improve the way in which data moves across the WAN, the answer is to throw bandwidth at the problem.” The trouble is that the existing bandwidth may be completely adequate, and so it may just be latency and packet loss that are affecting WAN performance.
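The claim that bandwidth is often not the bottleneck can be made concrete with the well-known Mathis approximation for TCP throughput, which bounds a single stream by segment size, round-trip latency and packet loss, regardless of link capacity. A quick sketch (illustrative figures only):

```python
# Why more bandwidth may not help: a single TCP stream's throughput is
# bounded by latency and loss. The Mathis approximation gives
# throughput <= (MSS / RTT) * (1 / sqrt(loss_rate)).
import math

def tcp_max_throughput_mbps(mss_bytes, rtt_ms, loss_rate):
    """Upper bound on single-stream TCP throughput in Mbit/s (Mathis approximation)."""
    rtt_s = rtt_ms / 1000.0
    bytes_per_s = (mss_bytes / rtt_s) / math.sqrt(loss_rate)
    return bytes_per_s * 8 / 1_000_000

# Illustrative long-haul link: 80 ms RTT, 0.1% packet loss, 1460-byte segments.
# The ceiling works out to roughly 4.6 Mbit/s, far below a 1 Gbit/s link's capacity.
print(round(tcp_max_throughput_mbps(1460, 80, 0.001), 1))
```

On these numbers, upgrading the link from 1 Gbit/s to 10 Gbit/s changes nothing for that stream; only reducing latency and loss, or working around them, does.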

WAN acceleration needed

“The norm is to employ WAN optimisation to resolve the issue,” reveals Trossell. The trouble is that this commonly has little effect on WAN performance – particularly when data must be encrypted. He adds: “The other go-to technology in the WAN armoury is SD-WAN. Whilst this is great technology, it doesn’t solve the WAN latency and packet loss issues.” The answer to mitigating the effects of latency and packet loss is WAN acceleration, also referred to as WAN data acceleration.

Trossell adds: “WAN acceleration approaches the transport of data in a new way. Rather than trying to squash the data down, it uses parallelisation techniques controlled by artificial intelligence (AI) to manage the flow of data across the WAN, whilst mitigating the effects of latency and packet loss.”
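The intuition behind the parallelisation Trossell describes can be sketched simply: if each stream is capped by latency and loss, running many streams side by side raises the aggregate rate until the physical link itself becomes the limit. The numbers below are illustrative, not measurements from any product.

```python
# Sketch of the parallelisation idea: split a transfer across several
# concurrent streams so the per-stream latency/loss ceiling no longer caps
# the aggregate rate, which is then capped by the physical link instead.
# Figures are illustrative; real WAN accelerators tune stream counts dynamically.

def aggregate_throughput_mbps(per_stream_mbps, num_streams, link_mbps):
    """Aggregate rate of N parallel streams, capped by the link capacity."""
    return min(per_stream_mbps * num_streams, link_mbps)

per_stream = 4.6   # assumed latency/loss-bound single-stream ceiling (Mbit/s)
link = 1000.0      # assumed 1 Gbit/s WAN link

for streams in (1, 8, 64, 256):
    print(streams, aggregate_throughput_mbps(per_stream, streams, link))
```

With enough streams, utilisation approaches the link's capacity without touching the data itself, which is consistent with the high utilisation figures quoted below.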

What’s great about it, in his view, is that the solution doesn’t change the data in any way, and it can be used in conjunction with existing back-up technologies and SD-WANs. This in turn can drive utilisation of the WAN bandwidth up to 95%. He therefore notes: “That bandwidth you currently have may just be enough for your needs.”

Reducing network congestion

“Due to the nature of this sector, healthcare organisations are often multi-located with other organisations and third-party suppliers in different countries,” says Bradburn. “To improve data transfer across the entire network, including the cloud and even for mobile healthcare workers, organisations can implement WAN data accelerators.” He emphasises that the technology reduces network congestion while optimising the traffic traversing the WAN.

“This improves the performance and acceleration of applications across the WAN considerably, enabling the real-time collaboration required for effective patient healthcare,” he explains, commenting that the healthcare sector has made a big shift in recent years to the public cloud. Machine learning and artificial intelligence are “pushing the cloud adoption further.” Many of these technologies, clouds and cloud services depend on WANs, and so they can be affected by the spectres of latency and packet loss.

Hyperconverged cloud benefits

However, Trossell believes that many of these technologies offer significant benefits to healthcare organisations. “Properly deployed and managed, cloud-based solutions offer healthcare organisations unprecedented opportunities to innovate, develop and deploy applications while still maintaining privacy, security and compliance.”

As for hyperconverged infrastructure, he says: “A hyperconverged infrastructure, by combining and virtualising the components of networking, security, storage and compute into a single box or clusters of multiple boxes, removes the complexity of a separated traditional model.”

Trossell summarises the key benefits as being:

  • The physical size of the infrastructure is reduced, saving power and cooling costs
  • The management of the infrastructure is simplified, reducing the overhead of staff management time and the different skill sets needed for each component
  • Resilience and performance are higher, as there are fewer interconnecting components that can cause bottlenecks or failures

New standard: Hyperconvergence

Hyperconvergence, Trossell finds, is rapidly becoming the new standard in performance and reliability. It also benefits healthcare organisations without impacting on management costs, maintenance and hosting.  Yet, there is no getting away from the importance of connectivity – particularly at a juncture when the uptake of cloud services is increasing.

“In some instances, the distance between on-premise servers and the cloud cause data transfer latency, which becomes the limiting factor, rather than the size of the bandwidth, especially when transferring large medical imagery," says Trossell.

“More and more healthcare organisations implement WAN accelerators to mitigate the above issues. By adding WAN data accelerators, the data can be moved at a high speed across lower bandwidth connections over substantial distances, providing faster access to information and better patient care.” 

Healthcare IT tips

In summary Bradburn and Trossell offer their 5 top tips for consolidating, simplifying, increasing the performance and reducing the cost of healthcare IT:

  • Research your cloud supplier – smaller cloud providers often offer predictable and simple pricing and unlimited bandwidth, which is preferable in the private sector. Always check the small print and annexes. Some cloud providers offer a low cost per unit but levy additional charges for ingress and egress, which can make the whole solution twice as expensive
     
  • Deploy green technologies – we know that new technologies perform better and have a lower carbon footprint. Offsetting carbon emissions by planting more trees is one way to deal with climate change, but there are also cloud suppliers that utilise ecological data centres. Some of them use liquid cooling for their IT infrastructure, which offers a much higher performance level, ready for big data transfers. This is especially helpful in specific applications such as X-ray imaging or video-assisted surgery
     
  • Consider data sovereignty – GDPR compliance, for example, is one of the most important safeguards for any company. Due to the nature of the healthcare sector, it’s critical that personal data is stored securely and not replicated overseas. To stay competitive, some cloud providers use foreign data centres for data replication, so don’t take the risk: check before you commit
     
  • Mitigate latency and packet loss – Use WAN Acceleration to improve WAN performance by mitigating latency and packet loss. Even SD-WANs will benefit from a WAN performance overlay
     
  • Consider the benefits of hyperconvergence and how it will enable your healthcare organisation’s own operational performance  

So, where next for the hyperconverged healthcare cloud? Bradburn concludes that the next steps are centred around compliance and standards. He thinks there is a need to provide a truly global healthcare service, with access to specialists across the globe and the ability to call on their expertise for remote diagnosis. For example, this could mean clinicians sharing best practice, data and techniques to prevent a coronavirus pandemic, or working together collaboratively from afar to find a cure for cancer.

This requires healthcare organisations to build networks that enable efficient data transfer; “storage facilities and security are the key challenges, whilst also defining the standards to ensure data formats from clinical systems and personal health IoT devices are universally understood.”

By undertaking these steps, and with the support of WAN acceleration, Bradburn believes the true power of the cloud can be utilised in the healthcare services of the future.


What’s next for e-commerce?


David Howell

24 Mar, 2020

e-commerce continues to boom. The latest figures from the Office for National Statistics (ONS) show online sales accounted for 18.6% of all retail transactions in 2019. More startling is the growth of mobile channels: WorldPay predicts m-commerce spending will overtake e-commerce by 2023.

For retailers, evolving their business is the key to maintaining market share and nurturing customer loyalty. Research from Mintel published in 2019 is telling: 86% of Brits shop on Amazon, with nearly three-quarters (70%) of them buying something from the site at least once a month. 26% of them also have Prime accounts, giving them access to more and faster delivery options.

Speaking to Cloud Pro, Martin Willetts, technology consulting partner at Deloitte says: “The rate of evolution with consumer behaviours has been difficult for some brands to keep pace with. Those e-commerce organisations that are investing in the flexibility of systems, tools, and processes, are the ones turning the challenge into a real opportunity. 

“The biggest trend is how e-commerce organisations are transforming their capabilities, so they can adapt to support consumer needs while also scaling operations as digital shopping habits continue to proliferate,” Willetts continues. “Also, consumers are increasingly expecting full transparency from brands with regards to both the sustainability and ethical sourcing of products.”

Mobile retail channels will increasingly become the focus for consumers, as they consolidate their use of smartphones. The use of digital kiosks, self-service and cashless checkouts are expanding – all using smartphones as their payment mechanism.

According to a new State of Mobility in Retail Report from enterprise mobility management firm SOTI, 67% of consumers perceive mobile technology as the most effective way to provide a faster shopping experience. Additionally, more than three-quarters (76%) of consumers want in-store staff to use mobile devices to provide a better experience, and nearly one-third (32%) are unwilling to sacrifice personal data security to improve their in-store experience, revealing bold new insights about consumers in the modern retail landscape.

e-commerce is now a multi-channel, multi-touchpoint experience for consumers. They don’t see separate channels, but simply want convenience and speed when they are ready to buy goods and services.

Channel agnostic

The question of whether e-commerce businesses should become ‘mobile-first’ is now moot. Omnichannel has cemented itself as the model that both e-commerce and m-commerce are based upon. Indeed, the distinctions are disappearing.

Mobile channels will become increasingly important as a commercial space. Estimates from GSMA suggest that, over the short term, 1.4 billion people will start using the mobile internet for the first time, bringing the total number of mobile internet subscribers globally to 5 billion. By 2025, mobile internet users are estimated to make up over 60% of the global population.

“Looking at both commerce channels separately seems counter-intuitive to me,” says Ennis Al-Saiegh, CEO of Smarter Click. “The rise of mobile traffic and ultimately the improved conversion journeys that have had to be improved for this medium, has resulted in improved efficiencies if you treat both channels as a pure ‘commerce’ channel. Treating them as one commerce channel means your brand values, user experience efforts and marketing teams are all aligned to deliver an enhanced experience.”

Deloitte’s Willetts adds: “A ‘Unified Commerce’ approach across physical and digital channels that is modular, flexible and data-driven will be key to ensuring memorable shopping experiences. There is huge potential for retailers to achieve greater value from their marketing spend, seeking to drive customers to the business from any channel to any channel. Again, with a consistent marketing experience regardless of the medium.”

How businesses construct their e-commerce storefronts is also changing. Headless commerce separates the design of a business’ e-commerce enabled website from the IT infrastructure that supports it. The practical advantage is that an e-commerce deployment can easily be multi-channel.

Speaking to Cloud Pro, Chris Adriaensen, senior manager of solutions engineering at Auth0, explains: “Headless content is effectively the removal of the end-user interface from backend systems, providing e-commerce solutions as a set of flexible APIs. Front-end developers can take user data and content to create far more dynamic and personalised experiences across different devices and touchpoints. 

“Headless e-commerce allows for more than ‘if you liked these teabags, buy this coffee’ and means content can be delivered in more effective ways, suited to the identity of the user. Power users might get a different experience than first-time buyers, for example. With new tech-driven experiences, such as Amazon Go-style smart stores and AR-powered dressing rooms for clothing still in their early stages, a ‘headless’ approach also facilitates a faster time to market allowing retailers to adapt to new channels, technologies and use cases.”
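The separation Adriaensen describes can be illustrated with a toy sketch: one backend data layer, multiple front ends rendering the same content per channel. All names and data here are hypothetical; real headless platforms expose the backend as HTTP/JSON APIs rather than in-process calls.

```python
# Illustrative sketch of the 'headless' split: the backend returns raw
# product data with no presentation attached, and each channel's front end
# renders it however suits that channel. All names/data are hypothetical.

PRODUCTS = {  # stands in for the backend commerce API's data store
    "sku-42": {"name": "Earl Grey teabags", "price_gbp": 3.50},
}

def get_product(sku):
    """Backend 'API': returns raw product data, no presentation."""
    return PRODUCTS[sku]

def render_web(sku):
    """Web front end: HTML for a browser."""
    p = get_product(sku)
    return f"<h1>{p['name']}</h1><p>£{p['price_gbp']:.2f}</p>"

def render_voice(sku):
    """Voice front end: a spoken prompt for a smart speaker."""
    p = get_product(sku)
    return f"{p['name']} costs {p['price_gbp']:.2f} pounds. Add to basket?"

print(render_web("sku-42"))
print(render_voice("sku-42"))
```

Adding a new channel (a kiosk, an AR dressing room) then means writing only another renderer against the same backend, which is the faster time-to-market Adriaensen points to.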

Consumers want integrated, cross-channel shopping experiences, with their purchase journey beginning on one channel and often ending on another – increasingly, their smartphones. Supporting this level of integration is a significant challenge. Here, automation can help, and it will expand its influence as new services come onstream to help multi-channel retailers deliver their goods to diverse audiences.

New opportunities

According to Statista, 35% of American households have a smart speaker. What’s more, over a quarter (26%) made a purchase using their speaker last year. This rapid expansion of voice commerce is set to continue.

Data has also become a vital component of successful e-commerce. With masses of information collected about the shopping preferences of individuals, the future of commerce is personal. Brands and retailers that can make personal connections with their customers and deliver new innovative experiences will maintain and expand their market share.

Finding patterns and value in the data being collected will become the province of AI. “Though most e-commerce companies aren’t using AI in earnest today, we expect this to change rapidly over the next couple of years,” explains Michael Scharff, CEO of AI-powered conversion platform Evolv. “We see AI as a means to an end, not an end in itself. e-commerce companies should look to adopt AI to differentiate themselves from the competition, create exceptional omnichannel experiences, and enhance their customer experiences.”

Gartner predicts that, in 2020, 85% of customer interactions will be managed without human involvement. It’s a trend that has moved at rapid speed. Combined with mobile devices and personal assistants, AI makes a powerful tool for shoppers: finding products by visual search or ordering via voice assistants becomes hassle-free.

And no retail business can now ignore social media as a commercial channel. According to the most recent “Reimagining Commerce” report from Episerver: “On average, one-fifth of consumers have made purchases directly because of a social media influencer’s product post. Among younger consumers, that number is even higher, with 50% of Gen Z shoppers and 48% of millennials having purchased products either directly by clicking on a post or later on as a result of the influencer’s endorsement.”

Deloitte’s Willetts concludes: “A key technology trend, in response to evolving consumer behaviours, is the need for a flexible and adaptable Commerce platform – with ‘cloud’, ‘headless’ and ‘microservice-based architecture’ often being near the top of business’s requirements list.

“These trends have led both to a resurgence in custom-built e-commerce platforms using smaller ‘lighter’ SaaS vendor’s offerings – as well as some of the larger e-commerce platform vendors re-architecting their platforms regularly to avoid being left behind.”

How consumers connect with the stores they want to buy from will also be transformed as the Internet of Things (IoT) and 5G expand and mature. Using the smartphone as the conduit to reach individuals when environments become smart and connection speeds have low latency delivers massive opportunities to all businesses no matter their size.

e-commerce is changing once again. Customer-facing websites and portals will increasingly adopt the headless approach, with the smartphone now the key battleground for consumer spending. Integration is the key to successfully navigating the next stage of e-commerce evolution.

Blog: How cloud companies are reacting to Covid-19 and services offered: AWS, Alibaba, and more

LIVE As the Covid-19 pandemic continues, with citizens across many countries urged to work from home where possible, it has posed a unique challenge for both frontend applications and the backend technologies underpinning them.

A lot of attention has, understandably, focused on the former. Zoom, which appears to be the videoconferencing tool du jour for many businesses, has held up well thus far, although at the time of print (March 23) some downtime issues in the UK have been detected. Similarly, outside of work, Netflix is lowering its video quality to keep up with demand. Yet underneath it all, cloud infrastructure providers are aiming to keep their systems online throughout the pandemic.

Whether it is cloud software or infrastructure, many of the world’s leading companies are making their tools available for certain users – primarily healthcare organisations or researchers working on Covid-19.

CloudTech is putting together a list of offerings from vendors reacting to the Covid-19 crisis, which can be found below. If your organisation is not on this list and is making products available, let us know at editorial@techforge.pub.

Hyperscaler highlights

Alibaba Cloud said on March 23 that the Alibaba Foundation and Jack Ma Foundation had recently launched the Global MediXchange for Combating Covid-19. The project, with the support of Alibaba Cloud Intelligence and Alibaba Health, was established to ‘facilitate continued communication and collaboration across borders, as well as to provide the necessary computing capabilities and data intelligence to empower pivotal research efforts’, the company said. You can find out more about the initiative here.

In a previous Canalys report, Alibaba Cloud had been praised for offering credits to organisations enabling them to buy its Elastic Compute Service, as well as cybersecurity services. The company also made its AI-powered platform freely available to research institutions working on treating and preventing coronavirus. You can find out more about these services here.

Amazon Web Services (AWS) announced on March 20 that it was committing $20 million for customers working on diagnostics solutions. The AWS Diagnostic Development Initiative is open to accredited research institutions and private entities using AWS to support research-oriented workloads for the development of Covid-19 testing and diagnostics.

The initiative is being put together alongside 35 global research institutions, startups, and other businesses, and is being aided by an outside technical advisory group of leading scientists and global health policy experts. You can find out more about the project here.

Google Cloud announced on March 3 that it was rolling out free access to advanced Hangouts Meet videoconferencing capabilities to all G Suite and G Suite for Education customers globally, including larger meetings – up to 250 participants per call – as well as live streaming up to 100,000 viewers, and the ability to record meetings and save them to Google Drive.

The company has already taken other steps. On March 17, Google postponed its Cloud Next event, having previously made the decision to take its April 6-8 gathering virtual-only.

IBM said on March 22 that it was collaborating with the White House and the US Department of Energy among others to launch the Covid-19 High Performance Computing Consortium. The company said it would pool an ‘unprecedented’ amount of computing power – 16 systems with more than 330 petaflops, 775,000 CPU cores, 34,000 GPUs and more – to ‘help researchers everywhere better understand Covid-19, its treatments and potential cures.’

The next step, IBM added, is to work with consortium partners to ‘evaluate proposals from researchers around the world and provide access to this supercomputing capacity for the projects that can have the most immediate impact.’ According to reports, citing President Trump, Amazon, Google, and Microsoft are also part of the consortium.

Microsoft announced on March 19 that National Health Service (NHS) staff in the UK can use collaboration tool Microsoft Teams for free. NHS Digital rolled out Teams across all NHSmail users between March 16 and March 20.

In the US, Microsoft has helped design a ‘coronavirus self-checker’ in a project alongside the US Centers for Disease Control and Prevention. As reported on March 23, the bot, called Clara, aims to help people make decisions about what to do if they have potential Covid-19 symptoms.

Timeline

March 23: Banyan Security, a San Francisco-based vendor, said it would offer free access to its Zero Trust security offering ‘for a limited time’ in the wake of the coronavirus pandemic.

March 23: Cisco said it would commit $225 million to coronavirus response to ‘support healthcare and education, government response and critical technology.’ The funds, $8m in cash and $210m in product, will in part go to the United Nations Foundation’s Covid-19 Solidarity Response Fund.

March 20: Huawei said it had worked with Huazhong University of Science & Technology and Lanwon Technology on an AI project to help ease the burden on imaging doctors by helping them diagnose and quantitatively analyse Covid-19.

March 17: ServiceNow announced it was making certain apps available to any public agency in the world to help deal with the coronavirus pandemic. The primary app, an emergency response operations app, was built by Washington State on the ServiceNow platform.

March 17: Okta is offering free single sign-on (SSO) and multi-factor authentication (MFA) for secure remote working. “Any organisation that would find value in leveraging the Okta Identity Cloud for remote work during an emergency situation should be able to do so at no cost,” the company wrote.

March 17: Dropbox said it was ‘proud’ to offer free Dropbox Business and HelloSign Enterprise subscriptions for a three-month period to non-profits and NGOs focused on fighting Covid-19. Eligible organisations are encouraged to apply here.

March 16: Box CEO Aaron Levie said via Twitter that anybody working on Covid-19 research or response efforts could email rapid-response@box.com to set up free secure file sharing and storage.

March 16: Stewart Butterfield, CEO of Slack, said people working on Covid-19 research, response or mitigation were entitled to free upgrades to paid plans, setting up consultation for remote collaboration best practices among others. Users are asked to email covid@slack.com

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Microsoft prioritises extra Azure capacity for ‘key customers’


Keumars Afifi-Sabet

23 Mar, 2020

Microsoft has outlined provisions for adding capacity to its Azure servers for key public and emergency services across the world as teams fight to contain the escalating COVID-19 crisis.

The company says it has been monitoring its services and usage trends 24/7 to ensure customers are able to stay online as businesses adjust to a sharp rise in remote working.

However, Microsoft has said a cohort of key customers will be prioritised should there be capacity constraints, which includes retaining priority over new Azure cloud capacity.

With demand rising, higher priority will be afforded to first responders, health and emergency services, and critical government infrastructure, as well as to ensuring remote workers are up and running with the core functionality of Teams.

“Over the past several weeks, all of us have come together to battle the global health pandemic,” the company said in a blog post. “We are working closely with first responder organizations and critical government agencies to ensure we are prioritizing their unique needs and providing them our fullest support.

“We are also partnering with governments around the globe to ensure our local datacenters have on-site staffing and all functions are running properly,” added Microsoft.

The tech giant has stressed there aren’t any cloud constraints at the moment, but concerns remain that the stress on the wider internet will increase as more of the global population moves online.

A host of streaming providers, including Netflix and YouTube, have in the last few days reduced streaming quality as a direct response to these concerns.

Microsoft’s brief warning about potential usage constraints with Azure cloud services may cause concern for businesses currently struggling to grapple with masses of employees adopting flexible and remote working patterns.

It’s currently unclear what these restrictions mean for businesses and organisations not deemed to be a priority, should Azure servers be faced with capacity constraints in any of its regions – it’s likely that some businesses could struggle to secure extra capacity if demand continues to increase.

The industry giant has said, however, that it would communicate any updates as soon as possible through its online resources and blogs.

Google bins Chrome 82 development amid coronavirus delays


Bobby Hellard

23 Mar, 2020

Google has cancelled development of version 82 of Chrome and will instead skip ahead to Chrome 83.

Chrome 81, which was due for release on 17 March, is currently still in a beta channel and will stay there until 83 is ready to be promoted.

On Friday, the tech giant announced it had paused all work on new Chrome and Chrome OS releases as work schedules were delayed by the outbreak of COVID-19. On Monday, Jason Kersey, the director of technical program management at Google, announced that the schedule for 82 had to be dropped altogether to maintain stability.

“This is an update on our earlier decision to pause our branch and release schedule,” he wrote in a Chrome discussion group for developers. “As we adapt our future milestone schedules to the current change in schedule, we have decided to skip the M82 release to ensure we keep users safe and focus all efforts on maintaining stability.”

As a result, Chrome 82 is effectively dead, as Google will not be pushing its release to developers. It will no longer be tested or merged into branches or even be placed in beta. Instead, all efforts will go to moving the development channel onto Chrome 83.

Kersey added that there will be a further update later in the week with more information about future changes, which could be a regular update as the coronavirus continues to cause disruption.

Similar disruption has been felt at Microsoft, which has also paused the release of new versions of its Edge browser to remain consistent with Chrome. Version 81 was due to be released to developers on Tuesday, but that has also been paused due to COVID-19.

Both Google and Microsoft have had to force staff to work remotely, which may have limited their abilities to respond to bugs in new versions.

Google has said that security updates have been unaffected by the disruption and will still go through for Chrome OS as planned.

How cloud providers are changing the outlook for IoT data and analytics management

CIOs and CTOs are exploring new ways to extract insights from their enterprise data assets with analytics tools. While they continue to invest in on-premises solutions, they're also looking to public cloud service providers.

As cloud computing providers grow their footprint in the Internet of Things (IoT) value chain, their investments in data and analytics services are accelerating.

Based on the review of cloud service provider offerings, recent acquisitions, and the competitive outlook, ABI Research now forecasts that cloud suppliers will grow their share of IoT data and analytics management revenues from $6 billion in 2019 to $56 billion in 2026.

IoT data and analytics market development

While the growth is impressive, cloud vendor services today are focused on data management complemented by a generic analytics toolset. As such, cloud computing vendor revenues come primarily from streaming, storage, and the orchestration of data.

In contrast, most analytics service offerings across cloud vendors are less differentiated, as reflected in pre-built templates — such as AWS Sagemaker and Microsoft Azure Notebooks — which leverage the Project Jupyter open-source software, standards and services initiative.

Considering that many cloud vendors are in the early stages of their analytics investment, they are relying on their specialized channel partners for addressing more specific 'advanced analytics' and vertical market needs.

"The overall approach shown by cloud suppliers in their analytics services reflects the dilemma they face in the complex IoT partnership ecosystem," says Kateryna Dubrova, analyst at ABI Research. "Effectively, do they rely on partners for analytics services, or do they build analytics services that compete with them?"

Interestingly, streaming is the one analytics technology that all cloud vendors are building into their solution portfolios to blend data management with near-real-time analytics on streamed IoT data.

Companies such as AWS, Microsoft, Google, IBM, and Oracle, for example, are promoting their proprietary streaming solutions to differentiate, accelerate time-to-market, and win over customers.

According to the ABI assessment, companies including Cloudera, Teradata, and C3.ai are introducing streaming analytics services that are reliant upon open-source technology, such as Spark and Flink.

However, by choosing to focus on data management and streaming technologies, cloud vendors are ceding the advanced analytics market to other suppliers. That emerging market is an example of the 'coopetition' in the IoT ecosystem, where cloud vendors partner with advanced analytics experts.

This vendor coopetition enables them to promote an end-to-end IoT technology stack. For example, Azure and AWS have partnered with Seeq to leverage its advanced analytics capabilities. Other vendors, such as Oracle, Cisco, and Huawei, are pushing intelligence and analytics closer to the devices, expanding their edge computing portfolios.

Outlook for cloud-based analytics applications growth

Such divergent analytics strategies represent the reality and challenges for serving a very diverse IoT ecosystem with IoT analytics services.

"Ultimately, businesses are moving to an analytics-driven business model which will require both infrastructure and services for continuous intelligence. Cloud vendor strategies need to align with this reality to take advantage of analytics value and revenues that will transition to predictive and prescriptive solutions," Dubrova concludes.

There is a significant upside opportunity for vendors that are exploring the numerous applications for IoT analytics services. Solutions will include both on-premises IT infrastructure and cloud offerings.


Build your own cloud infrastructure with Nextcloud and Collabora


Andy Webb
K.G. Orphanides

30 Mar, 2020

Cloud services have revolutionised the way we work, providing easy paths for collaboration over a distance and business-critical features such as automatic off-site backups and version control. 

However, potential issues range from not wanting to entrust data to the uncanny tentacles of global megacorporations, to very limited options for intranet-only deployments and the cost of per-user licensing for bundled services that not all users need.

Setting up your own cloud services can – in some cases – provide financial savings, but it will certainly provide greater control over the data you’re responsible for and the way your users can access it. 

That can be a distinct advantage when it comes to data protection and financial services regulation. Note, though, that you will be responsible for securing and updating the software that runs your cloud, rather than being able to leave that to a third party.

We’ll guide you through setting up open source cloud storage suite Nextcloud and Collabora Online, which adds online collaborative document editing to Nextcloud, as well as a few more common business cloud features. For brevity and convenience, we’ll be using containerised versions of the software distributed using Snap and Docker.

The latest version, Nextcloud 18 Hub, includes support for an integrated OnlyOffice editing environment on the same server. This is bleeding edge stuff, so for both this reason and because the OnlyOffice Community Edition it uses supports just 20 simultaneous open documents, we’ve opted not to use this approach for our tutorial.

Instead, we’ll guide you through setting up the current stable version 16 snap release of Nextcloud and the more fully-featured Collabora document editing environment on a dedicated server, as this is more appropriately scalable to the needs of most businesses.

In our example deployment, we’ve given Nextcloud and Collabora’s servers each a dedicated VM. The required spec will vary depending on how many users you have, how much they need to store and how frequently they’ll access storage and edit documents.

A very basic setup – suitable for a small business or department of three to ten people – works smoothly with a single core and 1GB RAM for the Nextcloud server, and two cores and 2GB RAM for Collabora. The extra memory is particularly important here if you expect multiple users to work on their documents at the same time.

Unless you have very high storage capacity requirements, we suggest using an SSD-based system to improve responsiveness. This tutorial was written using virtual servers hosted on Vultr and that service’s default Ubuntu 18.04 image, but applies to any comparable virtual or hardware server configuration.

Nextcloud installation

Set up an Ubuntu 18.04 server. If your install image doesn’t prompt you to do so, create a new user, add them to the sudoers group and update the server. You’ll be logging in as that new user, rather than as root, whenever you need command line access to the Nextcloud server.

adduser username
adduser username sudo
su username
sudo apt update
sudo apt dist-upgrade

Now, we’re ready to install the Nextcloud snap package, which packs in all required dependencies.

sudo snap install nextcloud

To configure your Nextcloud, connect to your server’s IP address in a web browser and follow the prompts to create an admin account. Congratulations, you now have a basic cloud storage server.

To make it easily accessible and appropriately professional-looking, we’ll want a domain name for it – either a new domain or a subdomain of your existing web address will work well.  

With an appropriate domain name registered or subdomain selected, create an A record in your registrar’s DNS management portal pointing at your new Nextcloud server’s IP address.

Now we’ll have to tell your Nextcloud instance what its domain name is. Log in to the server at the command line.

sudo nano /var/snap/nextcloud/current/nextcloud/config/config.php

Add your new domain name under trusted_domains, save changes and you should now be able to immediately access your Nextcloud from that URL.
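For reference, trusted_domains in config.php is a PHP array; after the edit it should look something like this (cloud.example.com is a placeholder for your own domain):

```php
// excerpt from /var/snap/nextcloud/current/nextcloud/config/config.php
'trusted_domains' =>
array (
  0 => 'localhost',
  1 => 'cloud.example.com',   // your new domain name
),
```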

With that done, it’s time to run through Nextcloud’s recommended server security tweaks, most importantly HTTPS support.
 
The Nextcloud snap comes with built-in support for generating a Let’s Encrypt certificate, so just run:

sudo nextcloud.enable-https lets-encrypt

Then follow through the certificate creation process for your domain name. The Nextcloud Snap includes an integrated auto-renewal routine, but you can also renew your certificates at any point by re-running the creation command above.

Configure Nextcloud

Nextcloud needs to be able to communicate with your users for everything from registration emails to editing invitations, so you’ll need an SMTP server that it can send outbound emails through.

In this example, we’re integrating Nextcloud with a business that uses G-Suite, so we’ll use Gmail as our SMTP server. However, third-party SMTP providers of this kind may require some extra configuration on their end to work. In this instance, we had to reduce security to allow access. If users aren’t allowed to manage their own less secure apps, you’ll have to grant them this permission in the G-Suite admin panel’s Advanced Security Settings.

If you’re testing Nextcloud and using a standard Gmail account for SMTP, you’ll find the same setting in your personal Google Account Security options.

If you run your own mail server, you’ll want to create a user for Nextcloud and point it at that. 
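Whichever provider you use, the resulting settings live in config.php under the mail_* keys. Here is a sketch of a Gmail-style SMTP configuration; all values are placeholders to adjust for your own setup, and the same keys can also be set from the admin web interface:

```php
// config.php excerpt (placeholder values)
'mail_smtpmode' => 'smtp',
'mail_smtphost' => 'smtp.gmail.com',
'mail_smtpport' => 587,
'mail_smtpsecure' => 'tls',
'mail_smtpauth' => true,
'mail_smtpname' => 'nextcloud@example.com',
'mail_smtppassword' => 'your-app-password',
'mail_from_address' => 'nextcloud',
'mail_domain' => 'example.com',
```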

At this point, you should add a recovery email address and ideally enable two-factor authentication for your admin account. Once you roll Nextcloud out to your users, you should strongly encourage them to do the same.

If all you need is online storage, you’re ready to invite users, but if you want to provide more advanced cloud services and apps, such as document editing, you’ll want to add a few more features.

Click on your profile icon and select Apps. Here, you’ll see all the default features of Nextcloud, such as its gallery display for images, plain text editor and PDF viewer, as well as any pending updates for them. 

In the pane on the left, a category list lets you view a full range of official and third-party Nextcloud apps. There’s a lot here, so you’ll want to take a look through everything to see what your users are likely to need.

Nextcloud’s app library includes Google Drive, Microsoft OneDrive and Dropbox integrations that can help users transfer files from third-party cloud services to Nextcloud, multimedia file playback and conversion, single sign-on and additional two-factor authentication support, web form creation, WebRTC-based video and voice conferencing, end-to-end encryption and real-time tracking of associated mobile devices, as well as more traditional office suite functionality.

For this tutorial, we’re going to add a calendar, task list, and contact management. Go to Office & text and select Download and enable on Calendar, Contacts and Tasks. You may be prompted to enter your password. Once you’ve added these and returned to the main Nextcloud interface, you’ll be able to access these via extra buttons that’ll appear on the interface’s top bar.

Nextcloud includes a simple integrated text editor by default, but if you need proper online document creation and editing, the Nextcloud Collabora Online app is an elegant solution. To use it, however, you’ll need to set up a Collabora Online server. 

Based on LibreOffice, Collabora’s features include full version control, commenting, collaborative document editing, and it allows you to create word processor documents, spreadsheets and presentations. Documents are saved in standard Open Document formats, and the synced versions that’ll be saved on users’ devices can be opened in any compatible word processor, although you only get access to collaborative editing via the web interface.

Collabora is available as a Docker image. As it can become rather memory-hungry if you’ve got lots of users editing documents at the same time, we recommend giving it its own server, which also makes life a little easier when it comes to setup and configuration.

Collabora installation

Spin up a fresh Ubuntu 18.04 server and update it. We’ll be expanding on Nextcloud’s official Collabora deployment instructions for this section and working on the assumption that Collabora will only need to serve a single Nextcloud instance.

While some previous iterations of Docker liked to run as root, which is reflected in the Collabora setup instructions linked above, you can and should use a normal user in the sudoers group. So, if your installation image doesn’t do this for you by default:

adduser username
adduser username sudo
su username
sudo apt update
sudo apt dist-upgrade

sudo apt install docker.io
sudo docker pull collabora/code
sudo docker run -t -d -p 127.0.0.1:9980:9980 -e 'domain=subdomain\\.yournextclouddomain\\.tld' -e 'dictionaries=en en-gb' --restart always --cap-add MKNOD collabora/code

These parameters include British, as well as US, English dictionaries – you can add others as needed. The path specified in the docker run command above must match the URL that your users will be using to connect to Nextcloud.
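The domain value passed to docker run above is treated as a regular expression by Collabora, which is why the dots are escaped in its setup instructions. As a quick sanity check of that escaping, using a hypothetical domain, you can test the pattern at the shell:

```shell
# Collabora's 'domain=' value is a regex, so literal dots must be escaped.
# Hypothetical domain for illustration; substitute your own.
pattern='cloud\.example\.com'
echo 'cloud.example.com' | grep -Eq "^${pattern}$" && echo 'match'   # prints "match"
```

An unescaped dot would also match strings like cloudXexample.com, which is why the escaping matters.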

You’ll also need Apache to act as a forward proxy for the Docker images. 

sudo apt-get install apache2 
sudo a2enmod proxy 
sudo a2enmod proxy_wstunnel 
sudo a2enmod proxy_http 
sudo a2enmod ssl 

Before you can configure it properly, though, we’ll need to set up TLS certificates for the subdomain it’ll be using. We’re again using Let’s Encrypt certificates in this tutorial.

In this particular configuration, the easiest option is to stop Apache before using the Let’s Encrypt certbot’s certificate-only generation mode.

sudo service apache2 stop
sudo apt install certbot
sudo certbot certonly --standalone

Enter the domain name you want to use for the Collabora server –  we suggest using a subdomain on the same domain you’re using for your Nextcloud server. Remember to create an A record in your DNS settings to point the subdomain at your new Collabora server before you try to generate the certificate.

Certbot automatically sets up a cron job to handle the required three-monthly renewals of Let’s Encrypt certificates, but we’ll have to make a couple of modifications to make sure it stops and restarts Apache properly. First, test renewal, as it’ll have to stop your server.

sudo certbot renew --dry-run --pre-hook "service apache2 stop" --post-hook "service apache2 start"

If that runs without any errors, we need to create scripts for those pre and post hooks in the appropriate directories.

sudo nano /etc/letsencrypt/renewal-hooks/pre/stop_apache
#!/bin/bash
service apache2 stop

sudo nano /etc/letsencrypt/renewal-hooks/post/start_apache
#!/bin/bash
service apache2 start

sudo chmod u+x /etc/letsencrypt/renewal-hooks/post/start_apache
sudo chmod u+x /etc/letsencrypt/renewal-hooks/pre/stop_apache

See if these are working by running
sudo certbot renew --dry-run

You can also confirm that there’s a systemd timer in place for certbot thus:

systemctl list-timers

We’re using Apache as a proxy here, so create a new virtual host configuration (for example, a file under /etc/apache2/sites-available/ that you then enable with a2ensite). In it, you’ll need to enter the URL of the Collabora Online server – the one you just got a certificate for – and the path to the certificates we created earlier.

<VirtualHost *:443>
ServerName your.collabora.subdomain:443

# SSL configuration, you may want to take the easy route instead and use Lets Encrypt!
SSLEngine on
SSLCertificateFile /etc/letsencrypt/live/certificate.domain.here/cert.pem
SSLCertificateChainFile /etc/letsencrypt/live/certificate.domain.here/chain.pem
SSLCertificateKeyFile /etc/letsencrypt/live/certificate.domain.here/privkey.pem
SSLProtocol             all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-G$
SSLHonorCipherOrder     on

# Encoded slashes need to be allowed
AllowEncodedSlashes NoDecode

# Container uses a unique non-signed certificate
SSLProxyEngine On
SSLProxyVerify None
SSLProxyCheckPeerCN Off
SSLProxyCheckPeerName Off

# keep the host
ProxyPreserveHost On

# static html, js, images, etc. served from loolwsd
# loleaflet is the client part of LibreOffice Online
ProxyPass           /loleaflet https://127.0.0.1:9980/loleaflet retry=0
ProxyPassReverse    /loleaflet https://127.0.0.1:9980/loleaflet

# WOPI discovery URL
ProxyPass           /hosting/discovery https://127.0.0.1:9980/hosting/discovery retry=0
ProxyPassReverse    /hosting/discovery https://127.0.0.1:9980/hosting/discovery

# Main websocket
ProxyPassMatch "/lool/(.*)/ws$" wss://127.0.0.1:9980/lool/$1/ws nocanon

# Admin Console websocket
ProxyPass   /lool/adminws wss://127.0.0.1:9980/lool/adminws

# Download as, Fullscreen presentation and Image upload operations
ProxyPass           /lool https://127.0.0.1:9980/lool
ProxyPassReverse    /lool https://127.0.0.1:9980/lool

# Endpoint with information about availability of various features
ProxyPass           /hosting/capabilities https://127.0.0.1:9980/hosting/capabilities retry=0
ProxyPassReverse    /hosting/capabilities https://127.0.0.1:9980/hosting/capabilities
 </VirtualHost>

sudo service apache2 restart

Your Collabora server should now be good to go. Log into your Nextcloud web interface as an admin, open the Settings screen and scroll down the left-hand pane until you get to Collabora Online Development Edition.

Click on that, enter the URL and port of your Collabora subdomain – your.collabora.subdomain:443 – and click Apply.
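Alternatively, the same setting can be applied from the command line with the snap’s bundled occ tool; richdocuments is the app ID Nextcloud uses for its Collabora Online integration, and the domain below is a placeholder for your own:

```shell
# Run on the Nextcloud server: point the Collabora (WOPI) integration
# at your Collabora host.
sudo nextcloud.occ config:app:set richdocuments wopi_url --value="https://your.collabora.subdomain:443"
```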

User experience

With all your apps up and running, it’s finally time to invite your users. In the Nextcloud web interface, click on your profile icon on the top right of the interface and select Users. 

Create new users by clicking the +New user button in the left-hand pane, then filling in the account settings you want to give them in the entry that appears at the top of the user list on the right. 

You can set a password for them, which they should change after first logging in, and inform them of it. Alternatively, you can leave this password field blank and have them use the password reset feature to create their own password.

When they first log in, users should set their language and locale preferences in the Personal info section of the settings screen, again accessible by clicking on the user icon at top right. Locale determines the first day of the week for the calendar, which is set to the US Sunday-first system by default, and the language in which days are named.

As well as the website, client applications are available for Windows, Linux, macOS, Android and iOS. Users will be prompted to download these when they first connect, and they’re always available via the Mobile & desktop entry in the settings screen, accessible by selecting settings from the menu that appears when you click on your user icon at top right. 

Users can also search for Nextcloud clients in mobile app stores and link them by manually entering your cloud server’s URL. This section also includes links to information for syncing calendar and contact data.
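For clients that ask for a server address manually, Nextcloud serves calendars and contacts over standard CalDAV/CardDAV from its DAV endpoint, so the base URL to supply (shown here with a hypothetical domain) is simply:

```text
https://cloud.example.com/remote.php/dav/
```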

If you want your Android users to be able to edit documents from the Nextcloud mobile app, you should use your device management system to roll out the apk file that can be downloaded directly from Nextcloud or have them install the app from the F-Droid store, as the Google Play version is, at time of writing, lagging behind when it comes to support for Collabora Office.

 

The mobile apps’ Auto upload options allow you to select specific directories on your phone to be automatically backed up to Nextcloud, whether that’s your photo gallery or critical document folder.

Assuming you’ve enabled the contacts app for Nextcloud, the mobile clients will be able to automatically back up your contacts to the service every day, and you can use Bitfire’s DAVx5 app for real-time calendar and contact syncing – once added, Nextcloud calendars and contacts can be accessed via your preferred app. 

Tasklist management for the Tasks feature is supported in DAVx5 via OpenTasks for Android, which users are prompted to install. Sync schedules can be customised as needed, with features including the option of only syncing over Wi-Fi, and everything works seamlessly in the background.

 

The desktop clients let you configure file syncing and create a default Nextcloud folder, whose contents will be automatically kept in sync with your Nextcloud server. You’ll want to set it to launch on system startup. You can also apply bandwidth limits and throttling, which may be helpful to those working from home or on the road.

Nextcloud and its broad range of apps and connectivity tools have great potential for any business that wishes to either switch away from or supplement third-party cloud services. 

For a more customised installation or to support large numbers of users, you may wish to build from source once you’ve familiarised yourself with Nextcloud’s systems, but the containerised versions of Nextcloud and Collabora are regularly updated and meet the core requirements for a small self-managed business cloud.

YouTube fails to clarify whether reduced streaming quality will impact live events


Sabina Weston

20 Mar, 2020

YouTube is yet to say whether the lowered streaming quality imposed across Europe will also carry over to its Live service, something which many businesses may come to rely on during increased demand for virtual conferencing.

The company announced today that it will lower the standard of its streaming quality in the UK and EU in order to prevent a much-feared internet-speed bottleneck, as thousands are confined to their homes due to the coronavirus outbreak.

However, when we asked whether the decision also applies to YouTube Live, parent company Google failed to provide an answer. 

“While we have seen only a few usage peaks, we have measures in place to automatically adjust our system to use less network capacity,” said a spokesperson for Google. “Following the meeting between Google’s CEO, Sundar Pichai, YouTube’s CEO, Susan Wojcicki, and Commissioner Breton we are making a commitment to temporarily default all traffic in the EU to Standard Definition. We will continue working with member state governments and network operators to minimize stress on the system, while also delivering a good user experience.”

YouTube Live seemed like a viable option for many companies looking to stream conferences and events that otherwise face cancellation or postponement, given that governments across the globe continue to advise against mass public gatherings.

The company added that the reduced quality of streaming in the EU and UK would last for around a month and was an action taken in cooperation with governments.

European commissioner for internal market and services Thierry Breton praised Pichai and Wojcicki for the move:

“Millions of Europeans are adapting to social distancing measures thanks to digital platforms, helping them to telework, e-learn and entertain themselves,” he said. “I warmly welcome the initiative that Google has taken to preserve the smooth functioning of the Internet during the COVID19 crisis by having YouTube switch all EU traffic to Standard Definition by default.

“I appreciate the strong responsibility that Mr Pichai and Mrs Wojcicki have demonstrated. We will closely follow the evolution of the situation together.”

YouTube is the second streaming service to announce that it is reducing its streaming quality due to the coronavirus outbreak, following Netflix’s decision on Thursday.