Microsoft wraps up multibillion-dollar deal with AT&T for Azure migration


Keumars Afifi-Sabet

18 Jul, 2019

US telecoms giant AT&T Communications has signed what’s thought to be a multi-billion dollar cloud partnership agreement with Microsoft to aid the firm’s ‘public cloud-first’ strategy.

Microsoft will embark on a non-exclusive multi-year alliance with AT&T Communications to build on progress made in cloud computing, artificial intelligence (AI) and 5G networking. Azure will also be the preferred provider for non-networking applications, the cloud company announced.

The industry giant will also tap into AT&T’s geographically dispersed 5G network to design and build edge-computing capabilities, as well as Internet of Things (IoT) devices.

The deal will also see AT&T employees migrate to Microsoft’s Office 365 suite of apps and services, with the company also planning to move most of its non-networking workloads to the public cloud by 2024.

This deal comes just a day after AT&T announced a similar partnership with IBM and its recently-acquired subsidiary Red Hat. On the surface, this second multi-billion-dollar agreement is similar to Microsoft’s deal, but centres on AT&T’s business side.

“Today’s agreement is another major step forward in delivering flexibility to AT&T Business so it can provide IBM and its customers with innovative services at a faster pace than ever before,” said IBM’s senior vice president for cloud and cognitive software Arvind Krishna.

“We are proud to collaborate with AT&T Business, provide the scale and performance of our global footprint of cloud data centers, and deliver a common environment on which they can build once and deploy in any one of the appropriate footprints to be faster and more agile.”

The deal will allow AT&T to host business applications on IBM Cloud, and the networking firm will also use Red Hat’s open-source platform to manage workloads and applications. AT&T Business will have greater access to the Red Hat Enterprise Linux and OpenShift platforms as part of the deal.

IBM will be the main developer and provider for AT&T Business, the telecoms giant’s enterprise arm. Meanwhile, IBM will help to manage the IT infrastructure of the wider organisation both on and off-premise, as well as on public, private and hybrid cloud.

Just as with AT&T and Microsoft, the two companies will also collaborate on edge computing platforms to allow enterprise clients to take advantage of 5G networking speeds as well as IoT devices. The wider aim is to reduce latency and dramatically improve bandwidth for data transfers between multiple clouds and edge devices.

Trump reportedly concerned over JEDI cloud contract


Bobby Hellard

18 Jul, 2019

US President Donald Trump has requested more information on how the Pentagon is developing its JEDI cloud computing contract following concerns from Republicans that other providers were unfairly excluded.

Senate Homeland Security Committee Chair Ron Johnson said in an interview that he had discussed the contract with Trump aboard Air Force One, adding that the president “wanted to understand what the issues were, what our concerns were,” according to a Bloomberg report.

There are also reports of a letter sent from Senator Marco Rubio to national security advisor John Bolton requesting the contract be delayed. A spokesperson for Rubio said he and Trump had discussed the letter.

“I respectfully request that you direct the delay of an award until all efforts are concluded in addition to evaluating all bids in a fair and open process in order to provide the competition necessary to obtain the best cost and best technology for its cloud computing needs,” Rubio reportedly wrote.

According to Bloomberg’s unnamed source, who allegedly heard the call, Trump sounded as though he was considering cancelling the contract outright.

The highly sought-after Joint Enterprise Defense Infrastructure (JEDI) contract is worth $10 billion and is part of a broad modernisation of Pentagon information technology systems that could take up to 10 years. Some have argued that the single-award nature of the contract favours Amazon Web Services, which, along with Microsoft, is one of the two remaining contenders.

A number of cloud providers and experts would agree with Trump if the reports are true. Both IBM and Oracle have condemned the decision to award the contract to a single provider, and both have taken legal action over it.

A decision to review the contract process would also be seen as yet another clash between Trump and Amazon CEO Jeff Bezos, following a string of public snubs.

“So sorry to hear the news about Jeff Bozo being taken down by a competitor whose reporting, I understand, is far more accurate than the reporting in his lobbyist newspaper, the Amazon Washington Post. Hopefully the paper will soon be placed in better & more responsible hands!” he posted in January, in relation to Bezos’ divorce from his now ex-wife MacKenzie Bezos.

IBM and AT&T combine for major ‘multi-year’ cloud, edge and IoT deal

IBM and AT&T have announced a multi-year 'strategic alliance' whereby the telco will utilise IBM's cloud, as well as the Red Hat platform.

The Armonk firm will also make AT&T Business its primary provider of software-defined networking (SDN), building on the companies’ existing relationship, under which AT&T serves as IBM’s strategic global networking provider.

Alongside the moves to preferred providers, the companies will also collaborate on various initiatives, including edge computing platforms. AT&T and IBM see benefits in both directions here: using 5G speeds at the edge of the network, enterprises will be able to gain greater insights and efficiencies. AT&T naturally wants to play a part in the former, with IBM focused on the latter.

The two companies have been long-time partners; at InterConnect in 2017, AT&T CEO Randall Stephenson joined IBM chief exec Ginni Rometty on stage to discuss the increasing 'enterprise strong' element of IBM's offering. "I don't believe we're more than three or four years away from being indistinguishable from the 'data cloud' to the 'network cloud'," said Stephenson at the time.

"Building on IBM's 20-year relationship with AT&T, today's agreement is another major step forward in delivering flexibility to AT&T Business so it can provide IBM and its customers with innovative services at a faster pace than ever before," said Arvind Krishna, IBM SVP cloud and cognitive software in a statement. "We are proud to collaborate with AT&T Business, provide the scale and performance of our global footprint of cloud data centres, and deliver a common environment on which they can build once and deploy in any one of the appropriate footprints to be faster and more agile."

This makes for an interesting comparison with Verizon, which moved to Amazon Web Services (AWS) as its preferred public cloud provider in May last year. The operator said it was migrating more than 1,000 business-critical applications and backend systems as part of the process.

In 2017, AT&T signed an agreement with Oracle whereby the telco moved 'thousands' of its large scale internal databases to Oracle's IaaS and PaaS offerings.

CloudTech has reached out to AT&T and will update the story in due course.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Facebook confirms Workplace price hike is on the horizon


Dale Walker

17 Jul, 2019

Facebook is set to remodel its pricing structure for its collaboration service Workplace in what equates to a price hike for users currently paying for the service.

Since its launch in 2016, the two million paying users the company has attracted have been on a $3 per user per month plan, the only ‘premium’ tier available, with a basic version given to users for free.

As of September, Workplace will offer three pricing tiers instead, including a rebranded premium tier – Workplace Advanced – that raises the cost of the platform to $4 per user per month. The basic package has also had a name change, now called “Essential”, and a new tier known as “Enterprise” will run at $8 per user per month and will offer priority support services and early access to new features.

The “per user per month” format is also a change for the company, having previously charged based on how many users were currently active on a company’s account. Charging a flat fee will generally result in higher overall costs, but this will also mean customers will pay the same predictable rate each month.
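To see why the flat per-seat fee tends to cost more than active-user billing, a quick back-of-the-envelope comparison helps. The headcounts below are hypothetical, not Facebook's figures; only the $3 and $4 rates come from the article:

```python
def monthly_cost_active(active_users: int, rate: float = 3.00) -> float:
    """Old model: billed only for users active that month, at $3 per user."""
    return active_users * rate

def monthly_cost_flat(provisioned_users: int, rate: float = 4.00) -> float:
    """New Workplace Advanced model: flat $4 per provisioned user per month."""
    return provisioned_users * rate

# Hypothetical company: 1,000 provisioned seats, 700 of them active this month.
old_bill = monthly_cost_active(700)    # 700 * $3  = $2,100
new_bill = monthly_cost_flat(1000)     # 1,000 * $4 = $4,000
print(f"old: ${old_bill:,.0f}, new: ${new_bill:,.0f}")
```

The gap between the two bills grows with the share of provisioned seats that sit idle, which is why the change amounts to a hike for most customers even before the $3-to-$4 rate increase.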

Facebook has confirmed that existing paying users will continue to pay $3 per user per month until 30 September 2020, after which time the new pricing structure will come into effect.

Interestingly, Facebook is also introducing a new add-on package called Workplace Frontline, specifically designed to cater for those frontline workers who engage directly with the general public, such as a cashier or those on a shop floor, who may sometimes feel disconnected from the rest of the business.

Organisations on the Advanced or Enterprise plans can bolt these users onto their price plan for an additional $1.50 per user per month, regardless of whether they are active or not.

According to Facebook, workers are classed as frontline if they spend less than 50% of their time at a desk, are paid hourly, or do not have an email address. Nurses, doctors, facility workers, those in public services, couriers, warehouse staff, and those in the hospitality industry would all qualify for this status.
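Facebook's stated criteria are simple enough to express as a predicate. The function below is purely illustrative, not a real Workplace API:

```python
def is_frontline(desk_time_fraction: float, paid_hourly: bool, has_email: bool) -> bool:
    """Facebook's stated rule: a worker counts as frontline if they spend less
    than 50% of their time at a desk, are paid hourly, OR have no email address.
    Meeting any one criterion is enough."""
    return desk_time_fraction < 0.5 or paid_hourly or not has_email

# A salaried nurse with an email address who spends 30% of her time at a desk:
print(is_frontline(0.3, paid_hourly=False, has_email=True))  # True: desk time under 50%
```

Note that the criteria are disjunctive, so a salaried worker with an email address still qualifies if most of their day is away from a desk.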

Aside from these changes, the individual features assigned to the tiers will remain the same.

The platform currently boasts over two million paid users, having attracted big brands like Walmart, Nestlé and Telefónica, as well as the likes of Spotify, Grab, WWF and Save the Children.

While a price hike is never a welcome change, the restructure is the first in the platform’s short history and somewhat necessary given the fierce competition in the market. It needs to mature in the face of the rapidly growing Microsoft Teams, which recently passed Slack in terms of subscribers and offers similar services targeting both backend and frontline workers.

Slack, for its part, is also garnering a great deal of attention following its decision to become a publicly traded company, particularly as stocks sold far higher than expected. It, too, will be looking to compete against its rivals, but doubts remain as to whether it has the business model and infrastructure clout to remain competitive.

Why the path to digital transformation starts with your data strategy

More and more businesses are waking up to the significant opportunities that embracing artificial intelligence (AI) could offer – from improved customer service to increased productivity. But simply not knowing where to start on their digital transformation journey can often hold them back.

A major benefit of AI is the potential to use new technology to gather fresh insights from data. If that sounds like something your company could benefit from, the best place to embark on your digital transformation journey is by creating the right data strategy. A strategy that will harness all the information typically gathered and processed by today’s businesses in a clear and useful manner.

But where do you start when creating a robust data strategy?

Think about storage

Identify where your data is stored – and how. Think about all potential formats, from emails and documents to databases. Without a clear view of your data, you could be missing out on all kinds of opportunities to utilise it, from coordinating customer offers to ordering stock in response to expected demand at different times of year.

Once you’ve finished your data audit, your next aim should be to keep all data within a centralised repository that’s easily accessible. Cloud-based services are ideal places to start, as they have a much lower capital outlay than attempting to store data using an on-site solution.
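As a minimal sketch of the audit step described above (Python standard library only; the directory path is a placeholder), a short script can walk a file share and tally the data formats it finds:

```python
from collections import Counter
from pathlib import Path

def audit_data_formats(root: str) -> Counter:
    """Walk a directory tree and tally file extensions: a first pass at
    discovering where data lives and which formats it is held in."""
    counts = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file():
            counts[path.suffix.lower() or "(no extension)"] += 1
    return counts

# Example: summarise everything under a shared drive (path is a placeholder).
for ext, count in audit_data_formats("/mnt/shared").most_common():
    print(f"{ext}: {count} files")
```

A tally like this won't find data inside databases or SaaS apps, of course, but it gives a quick first view of the file-based sprawl the audit needs to cover.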

Ideally, organisations of all sizes should choose a platform that’s easy to scale – in both capacity and performance – and think about which format they need their data to be available in. Using a translation tool such as Kafka will make it easier for companies to manage data from different apps.

However, with so much important information held together in one place, the issue of security becomes more important than ever. A solid backup and recovery plan is therefore a vital element of any data strategy.

Time for a clean-up

In the day-to-day running of a business, it can be easy to forget to dispose of data that’s no longer used or of value. Not only is this very important when meeting GDPR regulations, it’s also a waste of time and money to store decades-old data that no longer has any benefit.

Finding this data and destroying it appropriately is critical to being able to manage all your relevant data effectively.

What do you need from your data?

Once you’ve gathered, cleansed and stored your organisation’s data in one secure place, the next step is to think about what you want from your data. What are the specific questions you need to ask? This is essentially about understanding and looking for the insights that will make the difference to your business.

Test those questions with a smaller data set to reveal whether you’re truly asking the right questions – and whether the right algorithms are being used to answer them.
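That sanity-check step can be sketched in a few lines (hypothetical data and question; Python standard library only):

```python
import random

def sample_check(records, metric, sample_frac=0.1, seed=42):
    """Run a metric on a random sample and on the full data set, so you can
    see whether the question behaves sensibly before scaling it up."""
    rng = random.Random(seed)
    k = max(1, int(len(records) * sample_frac))
    sample = rng.sample(records, k)
    return metric(sample), metric(records)

# Hypothetical question: what share of orders exceed £100?
orders = [{"value": v} for v in range(1, 201)]  # orders worth £1 .. £200
share_over_100 = lambda rows: sum(r["value"] > 100 for r in rows) / len(rows)

on_sample, on_full = sample_check(orders, share_over_100)
print(f"sample: {on_sample:.2f}, full: {on_full:.2f}")
```

If the sample answer diverges wildly from expectations, either the question is badly posed or the metric is wrong, and it is far cheaper to discover that on 10% of the data.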

The ultimate aim of any data strategy is to create a single point of truth. Achieve this and you are well on your way to embracing AI and all the potential benefits it has to offer your business.


Using hybrid cloud to power your business: A guide

In this modern world, organisations face great pressure to adapt rapidly and innovate to keep up with their competitors. Businesses are being forced to move faster and faster, with the only constant being change: changing infrastructure, changing strategies and changing technologies. Transforming into a ‘digital business’ by implementing cloud services and platforms is no longer an option but an absolute necessity; failing to do so will lead to an organisation’s failure.

It is clear that cloud has become a key enabler for strategic success—but not everyone’s cloud journey looks the same. Different businesses have different ambitions, and their journeys are rarely as straightforward as just deploying servers and virtual machines into the cloud. It can be difficult for a number of reasons, including the choices of platforms and decisions over where the data should sit.

One option organisations should look at is the hybrid cloud model: Gartner experts have stated that by 2020, 90% of organisations will have adopted some type of hybrid cloud. It is therefore important to understand what it is and how enterprises benefit from it.

What is hybrid cloud?

Forrester research describes hybrid cloud as: “one or more public clouds connected to something in my data center. That thing could be a private cloud; that thing could just be traditional data center infrastructure.” To put it simply, a hybrid cloud is a mash-up of on-premises and off-premises IT resources.

To expand on that, hybrid cloud is a cloud-computing environment that connects a mix of public cloud, private cloud and on-premises infrastructure. A key advantage to this model is that it allows workloads and data to travel between private and public clouds as demands and costs change, providing businesses with the flexibility they need.

There is not a single hybrid cloud model that works for every organisation and every model should fit the unique needs of each company. By allowing multiple deployment models in a hybrid cloud, organisations are able to benefit from a seamless and secure environment which enables productivity and flexibility.

Why choose hybrid cloud?

It is not usually feasible for businesses to go all in and move completely into the cloud straight away, unless they happen to be cloud-native organisations. That doesn’t mean enterprises with legacy systems are unable to make headway with the cloud: they can adopt a mixture of public and private clouds, combined with hosted, colocated and on-premise infrastructure where necessary.

Hybrid cloud allows organisations to experience the advantages of both types of cloud. By spreading workloads and data across both sets of resources, it is possible to optimise the environment whilst keeping everyday functions as streamlined as possible. Enterprises can make up their own minds about which data should be stored in a public cloud, whilst keeping any sensitive data in the private cloud. Access to the benefits of private and public clouds is ideal for organisations wanting to grow at the speed they need.
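The public-versus-private decision described above can be sketched as a toy placement rule. The flags and labels here are illustrative only, not any vendor's schema:

```python
def placement(dataset: dict) -> str:
    """Toy placement rule in the spirit of the article: sensitive data stays in
    the private cloud; everything else can use public-cloud scale. The
    'sensitive' and 'bursty' flags are illustrative, not a real schema."""
    if dataset.get("sensitive"):
        return "private cloud"
    if dataset.get("bursty"):
        return "public cloud (elastic capacity)"
    return "public cloud"

print(placement({"name": "customer PII", "sensitive": True}))  # private cloud
print(placement({"name": "web analytics logs", "bursty": True}))
```

A real policy would weigh cost, latency and regulatory constraints too, but even this simple rule captures the control the article describes: the organisation, not the provider, decides where each workload lives.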

Hybrid solutions grant businesses the key element that they need: control. Control to optimise their IT investment by selecting the best-fit infrastructure for different categories of workload. Control to choose where their most critical data should reside. Control to spread their workloads across multiple platforms and avoid the risk of vendor lock-in from a single-platform strategy.

What next?

Hybrid cloud is set to evolve in the years to come, with technology playing a key role in this. We will see the incorporation of automation, machine learning and artificial intelligence into cloud platforms and this will impact the way the cloud environment is managed and maintained.

Before choosing a hybrid cloud model, an organisation needs to understand exactly why it is doing so, the impact the move will have upon the business, and how it will carry out the transformation. Moving to the cloud is not just a technology upgrade but a complete change of mindset that affects the entire business: from technology and processes to employees and skills. It is vital to choose the right partner to help navigate this journey and ensure cloud investment enables the organisation to achieve its objectives.


Microsoft to shift Cortana focus for enterprise


Bobby Hellard

16 Jul, 2019

Microsoft has a new vision for Cortana that involves a shift in focus towards enterprise customers and further integrations with Amazon’s Alexa.

The changes will involve conversations and interactions that are part of the software and services the company offers to businesses, and Cortana will become one of multiple digital assistants available on Windows 10 in the future.

This is because Microsoft is opening Windows 10 even further to third-party digital assistants. In the next update to Windows 10, due in September, Microsoft will allow voice assistants like Alexa to activate on the lock screen. The change will allow third-party assistants to be activated by their wake words while the PC is locked, so calling out ‘Alexa’ will wake your laptop as well as your Amazon Echo device.

These digital assistant changes will be in the September update, codenamed 19H2. Unlike previous Fall updates, this one will be a lot smaller in size with fewer new features and changes.

Although the changes are minor, it does show that Microsoft is willing to work with its rivals, particularly as its own voice assistant, Cortana, hasn’t been as popular as the likes of the Google Assistant.

Earlier in the year, Amazon enabled its Alexa wake word on the Windows 10 app, and Microsoft also made changes, moving Cortana into a separate app in the Microsoft Store, away from the built-in search experience in the operating system, which was arguably one of the more annoying features of Windows 10.

For Amazon, Alexa has been widely adopted in the home and in business. Most recently, the digital voice assistant was announced as an asset for the NHS, where it will be used both within the service and as a replacement for calling your GP.

10 charts that will change your perspective of AI in marketing

  • Top-performing companies are more than twice as likely to be using AI for marketing (28% vs. 12%) according to Adobe’s latest Digital Intelligence Briefing.
  • Retailers are investing $5.9B this year in AI-based marketing and customer service solutions to improve shoppers’ buying experiences according to IDC.
  • Financial Services marketers lead all other industries in AI application adoption, with 37% currently using them today.
  • Sales and Marketing teams most often collaborate using Configure-Price-Quote (CPQ) and Marketing Automation AI-based applications, with sales leaders predicting AI adoption will increase 155% across sales teams in two years.

Artificial Intelligence enables marketers to understand sales cycles better, correlating their strategies and spending to sales results. AI-driven insights are also helping to break down data silos so marketing and sales can collaborate more on deals. Marketing is more analytics and quant-driven than ever before with the best CMOs knowing which metrics and KPIs to track and why they fluctuate.

The bottom line is that machine learning and AI are the technologies CMOs and their teams need to excel today. The best CMOs balance the quant-intensive nature of running marketing with qualitative factors that make a company’s brand and customer experience unique. With greater insight into how prospects make decisions on when, where, and how to buy, CMOs are bringing a new level of intensity into driving outcomes. An example of this can be seen from the recent Forbes Insights and Quantcast research, Lessons of 21st-Century Brands Modern Brands & AI Report (17 pp., PDF, free, opt-in). The study found that AI enables marketers to increase sales (52%), increase customer retention (51%), and succeed at new product launches (49%). AI is making solid contributions to improving lead quality, persona development, segmentation, pricing, and service.

The following ten charts provide insights into how AI is transforming marketing:

21% of sales leaders rely on AI-based applications today, with the majority sharing these applications with marketing teams

Sales leaders predict that their use of AI will increase 155% in the next two years. Sales leaders predict AI will reach critical mass by 2020 when 54% expect to be using these technologies. Marketing and sales are relying on AI-based marketing automation, configure-price-quote (CPQ), and intelligent selling systems to increase revenue and profit growth significantly in the next two years. Source: Salesforce Research, State of Sales, 3rd edition. (58 pp., PDF, free, opt-in).

AI sees the most significant adoption by marketers working in $500m to $1bn companies, with conversational AI for customer service the most dominant

Businesses with revenues between $500m and $1bn lead all other revenue categories in the number and depth of AI adoption use cases. Just over 52% of small businesses with sales of $25m or less are using AI for predictive analytics for customer insights. It’s interesting to note that small companies are the leaders in AI spending, at 38.1%, to improve marketing ROI by optimising marketing content and timing. Source: The CMO Survey: Highlights and Insights Report, February 2019. Duke University, Deloitte and American Marketing Association. (71 pp., PDF, free, no opt-in).

22% of marketers are currently using AI-based applications, with an additional 57% planning to use them in the next two years

There are nine dominant use cases marketers are concentrating on today, ranging from personalised channel experiences to programmatic advertising and media buying to predictive customer journeys and real-time next best offers. Source: Salesforce’s State of Marketing Study, 5th edition

Content personalisation and predictive analytics from customer insights are the two areas CMOs most prioritise AI spending today

The CMO study found that B2B service companies are the top user of AI for content personalisation (62.2%) and B2B product companies use AI for augmented and virtual reality, facial recognition and visual search more than any other business types. Source: CMOs’ Top Uses For AI: Personalisation and Predictive Analytics. Marketing Charts. March 14, 2019

45% of retailers are either planning to or have already implemented AI to improve multichannel customer engagement as a core part of their marketing mix

Reflecting how dependent retailers are on supply chains, 37% of retailers are investing in AI today to improve supply chain logistics, supply chain management, and forecasting. Source: AI and Machine Learning use cases in the retail industry worldwide as of 2019, Statista.

Personalising the overall customer journey and driving next-best offers in real-time are the two most common ways marketing leaders are using AI today, according to Salesforce

Improving customer segmentation, improving advertising and media buying, and personalising channel experiences are the next fastest-growing areas of AI adoption in marketing today. Source: Salesforce’s State of Marketing Study, 5th edition

82% of marketing leaders say improving customer experience is the leading factor in their decision to adopt AI

The timing and delivery of content, offers, and contextually relevant experiences are second (67%), and improving performance metrics is third at 57%. Source: Leading reasons to use artificial intelligence (AI) for marketing personalisation according to industry professionals worldwide in 2018, Statista.

81% of marketers are either planning to or are using AI in audience targeting this year

80% are currently using or planning to use AI for audience segmentation. EConsultancy’s study found marketers are enthusiastic about AI’s potential to increase marketing effectiveness and track progress. 88% of marketers interviewed say AI will enable them to be more effective in getting to their goals. Source: Dream vs. Reality: The State of Consumer First and Omnichannel Marketing. EConsultancy (36 pp., PDF, free, no opt-in).

Over 41% of marketers say AI is enabling them to generate higher revenues from email marketing

They also see a more than 13% improvement in click-through rates and a 7.64% improvement in open rates. Source: 4 Positive Effects of AI Use in Email Marketing, Statista (infographic), March 1, 2019.

Marketers and agencies are most comfortable with AI-enabled bid optimisation for media buying, followed by fraud mitigation

Marketers and their agencies differ on ad inventory selection and optimisation, with marketing teams often opting to use their analytics and reporting instead of relying on agency AI methods. Source: Share of marketing and agency professionals who are comfortable with AI-enabled technology automated handling of their campaigns in the United States as of June 2018, Statista.



UK’s first green data centre for AI launches in Cornwall


Bobby Hellard

15 Jul, 2019

A satellite station has opened the UK’s first green high-performance computing platform for artificial intelligence and machine learning on demand.

The Cornwall-based Goonhilly Earth station is one of the first organisations in the UK to deploy a liquid immersion cooling system to mitigate the power demands of high-performance computing.

Its green platform consists of an onsite array of solar panels that can support the data centre’s full power requirement of 500kW, with local wind power to be added in the near future, according to the company. The system has been designed particularly to target the data-intensive needs of the automotive, life sciences and aerospace markets.

“There are people working on some clever algorithms to save our planet from climate change,” said Chris Roberts, head of data centre and cloud at Goonhilly. “The irony is that these models require heavy processing power.

“Fortunately, new technology is helping, such as immersion cooling which is 45-50% more efficient than air cooling, cuts electricity demand in half, and also allows us to use the exhaust heat elsewhere.”

According to Goonhilly, the platform delivers high-performance GPU-based compute and storage for decentralised and centralised AI and machine learning applications. By provisioning compute, AI and machine learning resources on demand, customers can reduce the cost of deployment and accelerate the launch of products.

Goonhilly has also joined the Nvidia Inception programme for businesses, furthering its AI work and granting it access to Nvidia’s DGX-1, the world’s first supercomputer purpose-built for enterprise AI and deep learning.

To mark the opening of the data centre, and also celebrate the 50th anniversary of the moon landing, Goonhilly is hosting an event on-site on Thursday 18 July for space industry partners, academia, customers and prospects. It includes a panel discussion on trends in AI, cloud and edge computing.

“Through our strong partnerships with industry and academia, we pride ourselves on being at the forefront of innovation. Our new green data centre is no exception. It is satisfying to open our doors to the many businesses and organisations with data-intensive applications who can benefit from this facility and the community we are creating,” said Ian Jones, the CEO of Goonhilly.

Europe’s Galileo satellite system crippled by days-long outage


Keumars Afifi-Sabet

15 Jul, 2019

The European Union’s satellite navigation infrastructure, used by businesses and government agencies across the continent, has been offline for more than 100 hours following a network-wide outage.

The Global Navigation Satellite Systems Agency (GSA), which runs the £8 billion Galileo programme, confirmed this weekend that the satellite system has been hit by a “technical incident related to its ground infrastructure”.

As a result, all 24 satellites in orbit are non-operational.

“Experts are working to restore the situation as soon as possible,” the GSA said in a statement.

“An Anomaly Review Board has been immediately set up to analyse the exact root cause and to implement recovery actions.”

Galileo is used by government agencies, academics and tech companies for a wide range of applications, from smartphone navigation to search-and-rescue missions.

The programme offers several services including a free Open Service for positioning, navigation and timing, and an encrypted Public Regulated Service (PRS) for government-authorised users like customs officers and the police.

Its business applications span multiple sectors: fishing vessels, for example, use it to provide data to fishery authorities, while tractors use it for navigation guidance. According to the GSA, 7.5 billion Galileo-friendly apps are expected by the end of 2019.

However, the satellite system, developed so European organisations aren’t entirely reliant on GPS, has been offline since 1am UTC on Thursday 11 July.

The GSA said at the time that users may experience “service degradation” on all Galileo satellites. A further update then issued two days later claimed users would be experiencing a total service outage until further notice. Neither update offered a concrete explanation for the mysterious outage, which has persisted at the time of writing.

The root cause, however, may lie with a ground station based in Italy, known as the Precise Timing Facility (PTF), according to Inside GNSS. This facility generates the Galileo System Time, which is beamed up to the satellites to enable user localisation. It is also often used as an accurate time reference.

In June, GPS services were also hit by a similar outage, which affected a host of Middle Eastern countries. According to Israeli media, that outage was linked to state-sponsored attacks from Russia.

The government and UK businesses have played an integral role in helping to develop Galileo since its pilot launch in 2016. The continental service is expected to be fully operational by 2020, with 30 satellites in total.

But the UK’s withdrawal from the EU has threatened to fully cut off access by British agencies and companies, should no deal be agreed.

The government has already set aside £92 million to develop an independent satellite system, although it’s unclear how long this would take to implement.