All posts by cloudtech

Marriott reported another data breach: Why cyber risk assessment is important

Marriott International — the multinational hospitality company behind the third-largest hotel brand in the world — reported a major data breach on March 31, 2020, marking its second major data breach in the last two years. The breach is believed to have exposed the information of 5.2 million guests worldwide.

“Marriott said Tuesday approximately 5.2 million guests worldwide may have been affected. The information taken may have included names, addresses, phone numbers, birthdays, loyalty information for linked companies like airlines and room preferences. Marriott said it’s still investigating but it doesn’t believe credit card information, passport numbers or driver’s license information was accessed,” reported ABC News. At the end of February, Marriott found that a massive amount of guest information was being accessed using the login credentials of two of its employees.

After an initial investigation, Marriott believes the data breach probably started in mid-January. It blocked the compromised login credentials and is now assessing the situation and assisting the relevant authorities in investigating the breach. Though Marriott is doing everything it can to contain the problem, suffering two major data breaches in less than two years is far from good news.

In November 2018, Marriott reported its first major data breach, which leaked the personal information of 383 million people; the two breaches together have exposed 388.2 million records. After the first breach, Marriott was expected to harden its cybersecurity infrastructure, train its security teams, and upgrade its systems. The latest breach, however, raises questions about those efforts.

This brings us to the question: how does an organisation check and validate its security infrastructure? The answer: cybersecurity risk assessment.

Let’s look at what it involves and how it helps organisations test their security posture.

Cybersecurity risk assessment is the assessment of the risks posed by cyber or digital threats. It has become increasingly important because nearly every organisation now implements and relies on information technology and systems to run its business. With such heavy reliance on digital systems, even a small breach, hack, or malfunction can pose serious risks.

Just as general risk assessments help every organisation understand and prepare for unexpected issues such as industrial malfunctions or manufacturing defects, cybersecurity risk assessments are critical for anticipating and preparing for unexpected cyber threats. The list of threats includes, but is not limited to, data breaches and insider or online attacks.

“Risk assessments are used to identify, estimate, and prioritise risk to organisational operations (such as mission, functions, image, and reputation), organisational assets, individuals, other organisations, and the Nation, resulting from the operation and use of information systems. The purpose of risk assessments is to inform decision makers and support risk responses by identifying: (i) relevant threats to organisations or threats directed through organisations against other organisations; (ii) vulnerabilities both internal and external to organisations; (iii) impact (i.e., harm) to organisations that may occur given the potential for threats exploiting vulnerabilities; and (iv) likelihood that harm will occur,” according to NIST’s Guide for Conducting Risk Assessments.

Similarly, cyber risk assessment is the process of assessing the cyber or digital risks facing your business or organisation. Its primary goal is to help board members and decision-makers understand the organisation’s cybersecurity infrastructure and to install and support the best risk mitigation processes for fighting off — or at least reducing the cyber risks of — both online and offline threats.
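To make NIST’s four factors concrete, here is a minimal sketch of how a risk register might score and prioritise threats as likelihood multiplied by impact. The threat names, vulnerabilities, and 1-5 scales below are illustrative assumptions, not part of any standard:

```python
# Minimal risk-register sketch: score = likelihood x impact (both 1-5).
# Threat names and scores are illustrative assumptions, not real assessments.

risks = [
    # (threat, vulnerability it exploits, likelihood 1-5, impact 1-5)
    ("credential theft", "no anomaly detection on logins", 4, 5),
    ("ransomware", "unpatched servers", 3, 5),
    ("insider data export", "over-broad database access", 2, 4),
]

def score(likelihood: int, impact: int) -> int:
    """Qualitative scoring: higher scores should be addressed first."""
    return likelihood * impact

# Prioritise: highest combined score first, so decision-makers can see
# which risk responses to fund before the others.
for threat, vuln, lik, imp in sorted(risks, key=lambda r: -score(r[2], r[3])):
    print(f"{score(lik, imp):>2}  {threat:<22} via {vuln}")
```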

There are numerous examples and reasons that prove the importance of cyber risk assessments, and the data breaches reported by Marriott International are among them. If Marriott’s security infrastructure had been attack-proof, it might not have suffered a breach — at least not the second one. Every guest who made a reservation at Marriott after the first breach in November 2018 will have trusted its promise to harden its security infrastructure. However, it failed to keep that promise. Though the investigation into the second breach is still in progress, Marriott probably had a gap in its security posture that allowed the breach to happen. What could have been done?

Even if the two employees — whose login credentials were used in the second data breach — were themselves involved, Marriott’s security systems should have detected and reported the massive volume of data requests coming from a single location or origin. Had that been detected and reported, its security teams could have investigated the issue and, ideally, identified the data breach far earlier. Evidently, they did not detect the massive breach until recently.
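A basic control of this kind does not require exotic tooling. The sketch below — a hypothetical illustration, not a description of Marriott’s systems — flags any account whose record-access volume far exceeds its own historical baseline:

```python
# Hypothetical sketch: flag accounts whose record-access volume far exceeds
# their own historical baseline -- the kind of signal that could have
# surfaced the misused credentials much earlier.
from collections import Counter

def flag_anomalies(access_log, baseline, multiplier=10):
    """access_log: iterable of (user, record_id) access events.
    baseline: dict of user -> typical daily record count.
    Returns users requesting more than `multiplier` x their baseline."""
    counts = Counter(user for user, _ in access_log)
    return {
        user: count
        for user, count in counts.items()
        if count > multiplier * baseline.get(user, 1)
    }

# Example: one front-desk account suddenly pulls thousands of guest records.
log = [("frontdesk_07", i) for i in range(5000)] + \
      [("frontdesk_02", i) for i in range(40)]
print(flag_anomalies(log, baseline={"frontdesk_07": 50, "frontdesk_02": 50}))
# -> {'frontdesk_07': 5000}
```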

That said, every organisation should perform cybersecurity risk assessments regularly. Doing so helps the organisation identify its security weaknesses, inform security teams and decision-makers, and install or harden the cybersecurity processes and products needed to improve overall security. It also reduces long-term costs, builds awareness of the installed processes and systems, helps avoid data breaches and security incidents, and helps meet legal and regulatory cybersecurity requirements. These, in turn, help strengthen your brand, avoid unnecessary costs and risks, and build trust with present and future customers.

Picture credit: "Marriott Hotel", by José Carlos Cortizo Pérez, used under CC BY 2.0

IDC finds organisations investing in cloud-based quantum computing seek to gain competitive edge

IDC, in its recent study titled ‘Quantum Computing Adoption Trends: 2020 Survey Findings’, has found that organisations currently investing in cloud-based quantum computing technologies expect to see improved AI capabilities, accelerated business intelligence, and increased productivity and efficiency.

The study indicated that although cloud-based quantum computing is still at a nascent stage and the funding allotted to such initiatives is limited – somewhere between 0% and 2% – end users are confident that early investment will give them a competitive advantage. At present, the manufacturing, finance, and security industries are at the forefront, experimenting with potential use cases, developing advanced prototypes, and implementing the technology.

Limited skills, a lack of available resources, complex technology, and cost are among the factors discouraging some organisations from investing in quantum computing. However, these constraints, combined with broad interdisciplinary interest, have compelled quantum computing vendors to develop technology that addresses multiple end-user needs and skill levels.

Last year, researchers at Google claimed that their quantum computer had solved in minutes a problem that would take even the very best conventional machine thousands of years to crack. They termed this milestone ‘quantum supremacy’, a long-anticipated demonstration of the immense potential of quantum computers. IBM, on the other hand, criticised the claim, stating that its classical machine could solve the same problem in 2.5 days with sophisticated programming, and arguing that Google had not actually reached the milestone.


Half of Indian enterprises will operate hybrid multicloud environments by 2021, predicts IDC

Half of the enterprises in India will be operating in a hybrid multicloud environment by 2021, according to predictions from analyst firm IDC.

The finding appears as part of the company’s India-contextualised Worldwide Cloud Predictions for 2020 and beyond. Alongside this, IDC predicted that by 2021, 30% of Indian enterprises will deploy unified VM, Kubernetes, and multicloud management processes and tools to support robust multicloud management across on-premises and public clouds.

Rishu Sharma, the principal analyst, cloud and artificial intelligence, IDC India, said: "Enterprises in India are looking at cloud as a key enabler to meet their business priorities. As per IDC's Cloud Pulse 2Q19, 75% of organisations in India have plans to invest in the cloud-based infrastructure and applications to meet their business goals.

“Cloud will become the enabler for all things digital but will bring in challenges associated with the management of multiple clouds and traditional systems," Sharma added.

In January, Google Cloud partnered with Bharti Airtel, an Indian telecommunications company. The deal will see Airtel offer the collaboration tool G Suite to small and medium-sized businesses as part of its integrated B2B connectivity solutions. In August 2019, fellow Indian telco Reliance Jio announced a 10-year partnership with Microsoft to utilise and promote Azure and ‘accelerate the digital transformation of the Indian economy and society’. As per the agreement, Jio will move its non-network applications to Azure and set up data centres across India in which Azure will be housed.


Google Cloud partners with Palo Alto, McAfee, and others to bolster security

Aiming to strengthen its security offering and attract more enterprise customers to its cloud platform and services, Google Cloud has announced partnerships with Palo Alto Networks, Qualys, McAfee, Fortinet, and ForgeRock.

Google Cloud and Palo Alto Networks will jointly develop a new multi-cloud security framework for Anthos, Google Cloud’s hybrid platform, and for multi-cloud Kubernetes deployments. According to the companies, the framework will use Palo Alto Networks’ Prisma Cloud security platform and its VM-Series virtual firewalls to help Google Cloud customers deploy a common compliance and runtime security posture across all of their workloads.

Alongside the new security framework, Google Cloud and Palo Alto Networks have also announced a threat intelligence integration that will merge Google Cloud’s Event Threat Detection product with Palo Alto Networks’ AutoFocus threat intelligence service. The companies said that combining signals from Google’s own internal sources with additional visibility from Palo Alto Networks’ footprint of network, endpoint, and cloud intelligence sources will help joint customers proactively identify and stop threats. They plan to launch both the security framework and the threat intelligence integration in the first half of 2020.

Google Cloud’s new partnership with McAfee will bring that vendor’s endpoint security technology for Linux and Windows workloads, along with its Mvision Cloud platform for container security, to Google Cloud infrastructure.

In another extended integration with Google Cloud, Fortinet announced a reference architecture for customers connecting distributed branches to Google Cloud Platform with Fortinet’s SD-WAN. According to Fortinet, its FortiCWP product will soon be integrated with GCP’s Cloud Security Command Center to offer additional workload protection and visibility.

Google Cloud’s partnership with Qualys will make the latter’s cloud-based security and compliance products available via the Google Cloud Marketplace. The integration will include the Qualys Cloud Agent — a lightweight scanner that, according to the vendor, enables two-second global visibility. With Qualys on Google Cloud, vulnerability findings are natively available in the GCP Security Command Center. The same findings are also available centrally in the Qualys Cloud Platform, allowing security teams to track and report across the entire enterprise.

ForgeRock has also joined the Google Cloud Partner Advantage Program, saying it is the first premier-level identity management vendor in the program. ForgeRock announced the launch of its cloud platform-as-a-service, built on GCP, which includes software-as-a-service capabilities for embedding modern identity features into apps.


IDC picks Trend Micro as the top vendor in SDC workload protection

Cybersecurity solutions provider Trend Micro has been named the number one vendor in software-defined compute (SDC) workload protection in IDC’s latest report.

Trend Micro achieved a market share lead of 35.5% in 2018. Steve Quane, Trend Micro’s executive VP of network defence and hybrid cloud security, said: “We predicted a decade ago that organisations would need multi-layered security to protect their cloud environments and software-defined data centres.”

Frank Dickson, IDC’s security and trust program vice president, said: “For years, Trend Micro has steadily built out its SDC workload protection capabilities for virtual, public cloud and container environments, offering tight integration with AWS, Azure and Google Cloud Platform.”

"Although the future has not been written, Trend Micro is the dominant player in this market," he added.

Over this time, Trend Micro has embedded real-time security into running applications and focused on security-as-code and automation to build protection seamlessly into DevOps pipelines, including pre-runtime scanning of container images. This continued with the launch of XDR in August.

XDR plays an important role in correlating data across network, server, email, endpoint, and cloud workloads to identify malicious activity that might otherwise go unnoticed.
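As a rough illustration of the idea — not Trend Micro’s implementation — cross-layer correlation can be as simple as grouping alerts from different telemetry sources by the host and time window they share, since one host firing across several layers is far more suspicious than any single alert:

```python
# Rough illustration of XDR-style correlation (not Trend Micro's product):
# group alerts from different layers by host and time window, and flag
# hosts seen by three or more independent telemetry sources at once.
from collections import defaultdict

WINDOW = 600  # seconds

def correlate(alerts):
    """alerts: list of dicts with 'source', 'host', 'ts' (epoch seconds)."""
    buckets = defaultdict(set)
    for a in alerts:
        buckets[(a["host"], a["ts"] // WINDOW)].add(a["source"])
    return [(host, sources)
            for (host, _), sources in buckets.items()
            if len(sources) >= 3]

alerts = [
    {"source": "email",    "host": "wks-114", "ts": 100},   # phishing attachment
    {"source": "endpoint", "host": "wks-114", "ts": 300},   # suspicious process
    {"source": "network",  "host": "wks-114", "ts": 500},   # beaconing traffic
    {"source": "endpoint", "host": "wks-090", "ts": 1300},  # isolated event
]
print(correlate(alerts))  # flags wks-114, seen by all three layers
```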

In October, Trend Micro further built on these capabilities by acquiring Cloud Conformity, a leader in cloud security posture management. The firm also announced the launch of its cloud security services program, Trend Micro Cloud One, to address the security challenges customers face around storage, data centres, IaaS, serverless architectures, and containers.


Why businesses fail to maximise the value of data visualisation

Data visualisation has become one of the hottest tools in data-driven business management over the past few years. As business intelligence software becomes a more central part of companies’ toolkits and data practices, visualisations have become steadily more precise and versatile.

Even so, not every case of a business implementing BI software and data visualisation is a success. Although they are meant to streamline data analysis and comprehension, they can sometimes produce the opposite effect.

A recent survey by Ascend2 revealed that despite their best intentions, many companies fumble their data visualisation implementations and end up doing more harm than good. While this has not necessarily affected the popularity of BI and data visualisation, it does raise some interesting questions about what companies can do right.

The survey shows that while many have had success with their data visualisation and data dashboard strategies, a majority have only been somewhat successful, or worse, unsuccessful. 

Regardless, dashboards and visualisation confer significant benefits for organisations, so they are not likely to go anywhere.

Why some visualisations are less successful

The survey responses indicate that while data dashboards are still being used and developed, the number of companies that are experiencing strong success with them has dropped. When asked about the overall effectiveness of their data dashboard strategies, only 43% of those surveyed described it as very successful. Meanwhile, 54% called it somewhat successful, while 3% were unsuccessful in deploying data visualisations and dashboards.

One of the biggest challenges is that fewer respondents believed they had consistent access to the data they required. A major benefit of dashboards is that they provide only the data relevant to each user and present it in an easily digestible manner. However, dashboard design can sometimes go awry, becoming either too cluttered or too sparse and obscuring important information in the process.

Indeed, the number of respondents who claimed they frequently or always had the right data to make business decisions fell from 44% in 2017 to 43% in 2018.


Nevertheless, it does appear that visualisations and dashboards are gaining popularity. The survey found that a total of 84% of respondents planned to increase their overall budgets for data dashboards and visualisations to some extent, although most only plan on increasing it moderately.

This is because despite the challenge of successfully implementing a data visualisation strategy, visual language has been proven to improve productivity and efficiency in the workplace.

Why companies will keep investing in visualisations

One big reason many companies’ implementations are less than optimal is that they do not have an effective answer to the question, “What is data visualisation?” For many, the definition is as simple as charts made from spreadsheets and basic diagrams. However, today’s business intelligence tools offer a significant variety of visuals that can make almost any data easier to comprehend and act on.
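As a trivial illustration of moving beyond a single default spreadsheet chart, the sketch below uses the widely available pandas and matplotlib libraries with invented sales data; it pairs a trend view with a composition view so the same numbers answer two different questions at once:

```python
# Illustrative only: the same invented sales data shown two ways.
# A line chart answers "how is it trending?"; a bar chart of the same
# figures grouped by region answers "where is it coming from?"
import pandas as pd
import matplotlib.pyplot as plt

df = pd.DataFrame({
    "month":  ["Jan", "Feb", "Mar", "Apr"] * 2,
    "region": ["North"] * 4 + ["South"] * 4,
    "sales":  [120, 135, 128, 150, 90, 110, 140, 160],
})

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3))
for region, grp in df.groupby("region"):
    ax1.plot(grp["month"], grp["sales"], marker="o", label=region)
ax1.set_title("Trend over time")
ax1.legend()

df.groupby("region")["sales"].sum().plot.bar(ax=ax2)
ax2.set_title("Total by region")

plt.tight_layout()
plt.show()
```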

A report by the American Management Association found that visualisation tends to improve several aspects of companies’ decision-making. According to the AMA, 64% of participants made decisions faster when using a visualisation tool, while a further study found that visual language can shorten work meetings by up to 24%.

More importantly, the AMA report cites additional third-party studies demonstrating that visual language helps problem-solving, improving efficiency by 19% while producing 22% better results in 13% less time.

With that in mind, however, the report by Ascend2 may be cause for concern, or at least a call to action, for many companies employing data dashboards. The importance of design and precision cannot be overstated when planning a data visualisation strategy.

In some cases, a focus on a specific type of visualisation can misrepresent data or make it harder to understand. Other times, a strong focus on one type of data—such as structured data—can exclude up to 80% of a company’s full data stream.

Having a clear deployment strategy grounded in an organisation’s specific needs and objectives can also make the process easier. The Ascend2 study found that focusing on the objectives that matter most—instead of those that are more challenging but less critical—helps organisations increase their success with data dashboards and visualisations.

Plotting the right course

Data visualisation will continue to be a central part of organisations’ data practices. The improvements it offers in decision-making, consensus-building, problem-solving, and more make it a key part of business success. Still, companies should focus their efforts on building data visualisation strategies and dashboards that give their teams the information they need, and deliver it consistently.

Editor's note: This article was written in association with StudioWorks.

Puppet’s 2019 State of DevOps report: How security needs to fit into continuous delivery

You’ve got the processes in place for a revamped software delivery cycle in your organisation. The foundation has been built, the stack is in place and the culture is going in the right direction. What are you missing?

Security in DevOps is an ‘unrealised ideal’ and a key step in moving from DevOps good practice to best practice. That’s according to the latest Puppet State of DevOps Report, published earlier today. 

The report, the eighth in the series, explored how far along organisations are in integrating security. Security, alas, is not a competitive differentiator – getting good product out there is – so the report sympathised with organisations facing the struggle: the road ahead may be paved with good intentions, but good intentions don’t change habits – or pay the bills.

In all, around a fifth (21%) of the 3,000 respondents polled with the highest level of security integration – whereby security is present across all phases of the software delivery cycle – say they have reached an advanced stage of DevOps maturity. Only 6% of those with no security integration say the same.

What’s more, firms with the highest level of security integration are more likely to be able to deploy to production on demand quickly, cited by 61% of such firms. Of those with no security integration, less than half (49%) can deploy on demand. Security-conscious firms are also more than twice as likely to be able to stop a push to production over a moderate security vulnerability, meaning insecure code is far less likely to reach their customers.
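In practice, ‘stopping a push to production’ is often just a gate in the delivery pipeline. Here is a minimal sketch, assuming scan results arrive as JSON with a severity field; the file name and result format are illustrative, not any specific tool’s output:

```python
# Minimal CI gate sketch: fail the pipeline if the security scan reports
# any finding at or above a chosen severity. The scan-results format and
# file name are illustrative assumptions, not a specific tool's output.
import json
import sys

SEVERITY_RANK = {"low": 1, "moderate": 2, "high": 3, "critical": 4}
THRESHOLD = "moderate"  # block the release at moderate severity and above

def gate(path: str) -> int:
    with open(path) as fh:
        findings = json.load(fh)  # e.g. [{"id": "...", "severity": "moderate"}, ...]
    blockers = [item for item in findings
                if SEVERITY_RANK.get(item["severity"], 0) >= SEVERITY_RANK[THRESHOLD]]
    for item in blockers:
        print(f"BLOCKING: {item['id']} ({item['severity']})")
    return 1 if blockers else 0  # a non-zero exit code fails the CI job

if __name__ == "__main__":
    sys.exit(gate(sys.argv[1] if len(sys.argv) > 1 else "scan-results.json"))
```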

The most marked change was with regards to overall security posture. More than four in five (82%) of those polled with the highest levels of security integration said their security practices ‘significantly’ improved their posture, compared with only 38% of those with no integration.

In some respects, the gap between the haves and the have-nots is not as wide as it may seem. This may be of particular interest given the harsh journey involved: seamless security integration is a multi-layered problem. As the report puts it: “You see the underlying complexity that’s been masked over by years of duct tape and glue. You tackle the roadblock, but as you resolve it, new obstacles appear. You resolve one roadblock after another, and it gets frustrating, but after a while, you see that your team can overcome issues as they arise.”

Last year, the key takeaway concerned getting each step right. The 2018 Puppet report argued that reaching the zenith, where dev and ops integrate seamlessly and in perfect harmony, meant a slow evolution. Only one in 10 organisations polled were outliers at either end, with 79% of companies somewhere in the middle.

With regard to security, those at the more advanced end of DevOps implementation are automating security policy configurations, and those at the very sharp end are exploring automated incident response. The report describes them thus: “They had cultivated a powerful blend of high-trust environments, autonomous teams, and a high degree of automation and cross-functional collaboration between application teams, operations and security teams.

“The result? Security becomes a shared responsibility across delivery teams that are empowered to make security improvements.”
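What ‘automating security policy configurations’ might look like in code can be sketched generically: declare a baseline once, then continuously diff every host’s live settings against it. The policy names below are invented illustrations, not Puppet functionality:

```python
# Generic illustration of automated security policy checking (invented
# policy names; not Puppet's product). Declare the baseline once, then
# diff every host's live settings against it on a schedule.
BASELINE = {
    "ssh_password_auth": False,
    "disk_encryption": True,
    "min_tls_version": "1.2",
}

def drift(live_config: dict) -> dict:
    """Return the settings that deviate from the security baseline."""
    return {
        key: {"expected": want, "actual": live_config.get(key)}
        for key, want in BASELINE.items()
        if live_config.get(key) != want
    }

host = {"ssh_password_auth": True, "disk_encryption": True, "min_tls_version": "1.0"}
for key, detail in drift(host).items():
    # In a real pipeline this would open a ticket or trigger remediation.
    print(f"DRIFT {key}: expected {detail['expected']}, got {detail['actual']}")
```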

Ultimately, it is a long road, but a profitable one if all stakeholders care enough, which is rather like security as a whole. “The DevOps principles that drive positive outcomes for software development – culture, automation, measurement and sharing – are the same principles that drive positive security outcomes,” said Alanna Brown, senior director of developer relations at Puppet and report author. “Organisations that are serious about improving their security practices and posture should start by adopting DevOps practices.”


IDC: Global spending on public cloud services and infrastructure to reach £398b in 2023

The latest IDC study suggests that global spending on public cloud services and infrastructure will more than double by 2023, driven mainly by digital transformation deployments. According to the market researcher, spending will grow at a compound annual rate of 22.3 per cent, from $229b (£182b) in 2019 to nearly $500b (£398b) in 2023.
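Those figures are consistent with each other: compounding 22.3 per cent annually over the four years from 2019 to 2023 slightly more than doubles the base, as this quick check shows:

```python
# Quick sanity check on IDC's figures: $229b compounding at 22.3% a year
# over the four years from 2019 to 2023.
base, cagr, years = 229, 0.223, 4
forecast = base * (1 + cagr) ** years
print(f"${forecast:.0f}b")  # -> $512b, i.e. "nearly $500b" and more than double
```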

Eileen Smith, programme director at IDC, said: “Adoption of public (shared) cloud services continues to grow rapidly. Enterprises, especially in professional services, telecommunications, and retail, continue to shift from traditional application software to software-as-a-service (SaaS) and from traditional infrastructure to infrastructure-as-a-service (IaaS) to empower customer experience and operational-led digital transformation initiatives.”

SaaS will hold more than half of all public cloud spending during the forecast period, says IDC, adding that the market segment comprising applications and system infrastructure software (SIS) will be dominated by applications purchases. The report said: “The leading SaaS applications will be customer relationship management (CRM) and enterprise resource management (ERM). SIS spending will be led by purchases of security software and system and service management software.”

It said that IaaS would be the second largest category of public cloud spending throughout the forecast period, followed by platform-as-a-service (PaaS). IaaS spending, spanning servers and storage devices, will also be the fastest-growing category of cloud spending, with a growth rate of 32 per cent. The market research firm said: “PaaS spending will grow nearly as fast – 29.9 per cent – led by purchases of data management software, application platforms, and integration and orchestration middleware.”

Three industries, namely professional services, discrete manufacturing, and banking, will be responsible for more than one-third of all public cloud services spending during the forecast period.

The report further said: “While SaaS will be the leading category of investment for all industries, IaaS will see its share of spending increase significantly for industries that are building data and compute-intensive services. For example, IaaS spending will represent more than 40 per cent of public cloud services spending by the professional services industry in 2023 compared to less than 30 per cent for most other industries. Professional services will also see the fastest growth in public cloud spending with a five-year CAGR of 25.6 per cent.”


AWS announces availability of Amazon Managed Blockchain service

Amazon Web Services (AWS) has announced the market availability of its Amazon Managed Blockchain (AMB) service, which is designed to help companies develop and manage scalable blockchain networks.

The platform supports thousands of applications and millions of transactions via open source frameworks such as Ethereum and Hyperledger Fabric. Organisations that want to let multiple parties perform transactions and maintain a cryptographically verifiable record of them, without the need for a trusted central authority, can set up a blockchain network spanning multiple AWS accounts through the AWS Management Console.
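For teams that prefer scripting over the console, the same setup can be automated. Below is a minimal sketch using the boto3 SDK’s managedblockchain client; the network and member names, the admin credentials, and the voting-policy values are all illustrative placeholders:

```python
# Minimal sketch of creating a Hyperledger Fabric network with Amazon
# Managed Blockchain via boto3. All names, credentials, and policy values
# below are illustrative placeholders, not recommendations.
import boto3

client = boto3.client("managedblockchain", region_name="us-east-1")

response = client.create_network(
    Name="example-network",
    Framework="HYPERLEDGER_FABRIC",
    FrameworkVersion="1.2",
    FrameworkConfiguration={"Fabric": {"Edition": "STARTER"}},
    # How members vote on proposals (e.g. inviting a new member).
    VotingPolicy={
        "ApprovalThresholdPolicy": {
            "ThresholdPercentage": 50,
            "ProposalDurationInHours": 24,
            "ThresholdComparator": "GREATER_THAN",
        }
    },
    # The first member of the network (your own organisation).
    MemberConfiguration={
        "Name": "example-member",
        "FrameworkConfiguration": {
            "Fabric": {"AdminUsername": "admin", "AdminPassword": "REPLACE_ME_123!"}
        },
    },
)
print(response["NetworkId"], response["MemberId"])
```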

Rahul Pathak, general manager, Amazon Managed Blockchain at AWS, said: “Customers want to use blockchain frameworks like Hyperledger Fabric and Ethereum to create blockchain networks so they can conduct business quickly, with an immutable record of transactions, but without the need for a centralised authority. However, they find these frameworks difficult to install, configure, and manage.

"Amazon Managed Blockchain takes care of provisioning nodes, setting up the network, managing certificates and security, and scaling the network," Pathak added. "Customers can now get a functioning blockchain network set up quickly and easily, so they can focus on application development instead of keeping a blockchain network up and running.”

Last month, VMware announced an integration with Digital Asset, which develops the open source DAML language for constructing smart contracts. As part of the collaboration, VMware is integrating DAML with its VMware Blockchain platform. In 2018, VMware introduced its first blockchain project, Project Concord, at the VMworld event.


Exploring the links between AI, 5G and IoT – and how cloud computing underpins them all

Sponsored: If the agenda and sessions at MWC 2019 don’t explicitly mention cloud computing, there’s a good reason: the emerging technologies the show will explore, from artificial intelligence (AI) to 5G to the Internet of Things, all need cloud to underpin them. The show runs, as ever, in Barcelona from February 25-28.

Don’t make the mistake of thinking there won’t be any presence on that front, though. The vendor literature is naturally full of advocacy for a holistic, tech-utopian landscape. Mavenir describes itself on its page as “the industry’s only 100% software, end-to-end, cloud-native network software provider.” Gemalto lists as a key strength “trusted data exchange from edge devices up to the cloud”, while Huawei’s invitation notes how “new technologies like 5G, AI, IoT and cloud computing are more important than ever.” Exhibitors at this year’s event include three of the biggest cloud vendors in the space: Google Cloud, Alibaba Cloud, and IBM.

Regular readers of this publication will be more than aware that cloud is the glue holding these emerging technologies together. Speaking last year, Pat Gelsinger, the CEO of VMware, summed it up nicely: cloud enables mobile connectivity; mobile connectivity creates more data; more data makes artificial intelligence better; AI enables more edge use cases; and more edge needs more cloud for storage and compute.

Indeed, the links between cloud and AI go deeper. Writing in November, Dr. Wanli Min, chief machine intelligence scientist at Alibaba Cloud, noted that while AI “seems to mean all things to all people”, the evidence suggests a gradual path.

“Crucially, cloud computing using AI isn’t a radical or revolutionary change. In many respects it’s an evolutionary one,” he wrote. “For many organisations, it has been a seamless integration from existing systems, with AI investment gathering pace quickly. Over the next few years we can expect to see the industry continue to boom, with AI driving cloud computing to new heights, while the cloud industry helps bring the benefits of AI to the mainstream.”

If you do look hard enough you will see more concrete references, including an interesting session on February 27 around the concept of ‘cloud XR’ (extended reality) – ‘the spectrum of technologies [combining] generated virtual elements into the real environment.’

Yet this coming together has seen MWC’s message change. Indeed, as is shown in its 2019 theme of ‘intelligent connectivity’, the industry has gone beyond the original ethos of mobile. “We are rapidly moving to a world where mobile will connect everyone and everything, but at the same time, we are expanding our reach beyond ‘just’ mobile,” a blog post explains.

“The theme of this year’s event is ‘intelligent connectivity’ – the term we use to describe the powerful combination of flexible, high-speed 5G networks, the Internet of Things, artificial intelligence and big data. Intelligent connectivity marks the beginning of a new era defined by highly contextualised and personalised experiences, delivered as and when you want them. This is the future of our industry and our world.”

Read the full article here and find out more about MWC19 by visiting here.

Editor’s note: This article is brought to you alongside MWC19.