What to expect from AWS Re:Invent 2019


Bobby Hellard

2 Dec, 2019

The card tables of Las Vegas will have to make way for cloud computing this week, as AWS is in town for Re:Invent 2019.

Last year the conference took place across eight venues, with some 50,000 cloud computing enthusiasts descending on the city of sin. There was a lot to take in, too: from new blockchain and machine learning services to satellite data programmes, the announcements came thick and fast.

AWS is the leading provider of public cloud services, so naturally its annual conference is gargantuan. It can’t afford to rest on its laurels, though, as its rivals are building up their own offerings and gaining on Amazon fast. In just the last year, Microsoft has invested heavily in Azure with a string of acquisitions of migration specialists, while IBM has changed focus with its massive Red Hat deal and Google is ploughing so much into its cloud that it’s hurting Alphabet’s earnings.

This is without mentioning the biggest cloud computing deal of the last two years, the Pentagon’s JEDI contract, which was awarded to Microsoft despite AWS being the clear favourite for much of the bidding.

Bearing all this in mind, I do expect to see a slew of new products and services unveiled throughout the week.

Expansive keynotes

AWS tends to do marathon keynotes that run on for three hours and overflow with announcements. Last year, CEO Andy Jassy fired through new products and special guest customers with the same stamina that recently saw Eliud Kipchoge run 26.2 miles in under two hours.

Jassy is, of course, back again this year, for not one but two keynotes. On Tuesday he will deliver his main opening day presentation, while on Wednesday he will join the head of worldwide channels and alliances, Doug Yeum, for a fireside chat.

CTO Werner Vogels will be on stage on Thursday with an in-depth explainer on all the new products. This two-hour deep dive is definitely one for the diehard fans of cloud architecture, with all the technical underpinning you crave. Get there early, too: the first 1,000 guests in the keynote line will get a special piece of “swag”, according to the website.

Machine Learning

Last year, machine learning and the AWS Marketplace took precedence, and 2019’s event should hold more of the same. Recently, the company announced the launch of the AWS Data Exchange, a new hub where partners can share large datasets that customers can use for their machine learning and analytics programmes.
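To give a flavour of how customers actually consume the exchange, here is a minimal, hypothetical sketch using boto3’s AWS Data Exchange client to list the datasets an account is subscribed to; the region and printed fields are illustrative assumptions rather than anything lifted from AWS’s announcement:

    import boto3

    # AWS Data Exchange client; the service is only available in a handful
    # of regions, us-east-1 among them (illustrative assumption).
    dx = boto3.client("dataexchange", region_name="us-east-1")

    # List the data sets this account is entitled to through its subscriptions.
    paginator = dx.get_paginator("list_data_sets")
    for page in paginator.paginate(Origin="ENTITLED"):
        for data_set in page["DataSets"]:
            print(data_set["Name"], "-", data_set["Arn"])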

The customer element is key for AWS, as it often integrates and shares these innovations. Last year, Formula 1’s Ross Brawn joined Jassy on stage during the keynote and showcased what his sport had done with Amazon SageMaker and other machine learning services. Interestingly, the basic idea for the prediction models they used came from a London-based startup called Vantage Power, which developed the technology to predict the lifespan of electric batteries in buses.

Doubtless there will be some kind of machine learning update, but what it is could depend on what AWS customers have innovated. Last year the company announced a partnership with NFL app Next Gen Stats, the automation of NASCAR’s video library and multiple services with US-based ride-hailing firm Lyft. Vegas is all about gambling, but it’s a safe bet that at least one of these companies will be in attendance to talk through case studies.

AWS goes all-in on quantum computing


Bobby Hellard

3 Dec, 2019

AWS unveiled plans to aid and accelerate quantum computing research at its Re:Invent conference on Monday.

The cloud giant announced three new services for testing, researching and experimenting with the technology.

The first of these is Amazon Braket, a service that lets scientists, researchers and developers begin experimenting with computers from multiple quantum hardware providers in a single place.

Alongside Braket is the AWS Center for Quantum Computing, which brings together quantum computing experts from Amazon, the California Institute of Technology (Caltech) and other academic research institutions to collaborate on the research and development of new quantum computing technologies.

Finally, the Amazon Quantum Solutions Lab is a programme that connects customers with quantum computing experts and consulting partners to develop internal expertise and identify practical uses of quantum computing. The aim is to accelerate the development of quantum applications with meaningful impact.

There has been significant progress in quantum computing this year, particularly from IBM and Google, with both announcing large investments in the technology. The two made headlines in October after IBM disputed claims made by Google that its 53-qubit Sycamore processor had achieved “quantum supremacy”.

Quantum computing refers to machines that exploit the principles of quantum mechanics to tackle certain problems far faster than conventional computers. For example, Google suggested its processor was able to complete a complex calculation in 200 seconds, while the world’s most powerful supercomputer would need 10,000 years to do the same.

Despite the work of IBM, Google and now AWS, quantum computing is not quite a mainstream technology just yet, but according to AWS evangelist Jeff Barr, that time is coming.

“I suspect that within 40 or 50 years, many applications will be powered in part using services that run on quantum computers,” he wrote in a blog post. “As such, it is best to think of them like a GPU or a math coprocessor. They will not be used in isolation, but will be an important part of a hybrid classical/quantum solution.”
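For a sense of what that hybrid model looks like from a developer’s seat, the sketch below builds and samples a two-qubit Bell circuit using the Amazon Braket Python SDK’s local simulator; treat it as an illustrative assumption about how the SDK is used, not code shown at Re:Invent:

    from braket.circuits import Circuit
    from braket.devices import LocalSimulator

    # Build a two-qubit Bell state: Hadamard on qubit 0, then CNOT 0 -> 1.
    bell = Circuit().h(0).cnot(0, 1)

    # Run on the SDK's bundled local simulator rather than real quantum hardware.
    device = LocalSimulator()
    result = device.run(bell, shots=1000).result()

    # Expect roughly half the shots to read '00' and half '11'.
    print(result.measurement_counts)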

Facebook lets users port photos and videos to Google


Nicole Kobie

2 Dec, 2019

Facebook is letting users move uploaded photos and videos to Google Photos as part of a project enabling data portability. 

The new tool lets Facebook users bulk export all of their photos and videos to Google’s photo hosting service. So far, the tool is only available in Ireland, but is set to be rolled out more widely in the first half of next year. 

“At Facebook, we believe that if you share data with one service, you should be able to move it to another,” said Steve Satterfield, Director of Privacy and Public Policy at Facebook, in a blog post. “That’s the principle of data portability, which gives people control and choice while also encouraging innovation.”

Data portability is required under laws such as GDPR and the California Consumer Privacy Act; the data portability rules in the latter come into play next year, just as this tool arrives more widely. 

Transferring the data to Google Photos does not appear to delete it from Facebook, but users can move the images over to the rival service and then delete their account. It’s worth noting that Facebook has long allowed users to download everything from their account, photos and videos included, which can then be uploaded to the host of their choice, Google Photos or otherwise.

Facebook said the photo transfer tool is just the first step, and its release is designed to be assessed by policymakers, academics and regulators, in order to help decide what data should be portable and how to keep it private and secure.

“We’ve learned from our conversations with policymakers, regulators, academics, advocates and others that real-world use cases and tools will help drive policy discussions forward,” said Satterfield. 

He added: “We are currently testing this tool, so we will continue refining it based on feedback from people using it as well as from our conversations with stakeholders.”

The photo tool is based on code developed at the Data Transfer Project, an effort launched in 2018 whose members include leading tech companies such as Microsoft, Twitter, Google and Apple. The aim is to develop an open-source data portability platform to make it easier for individuals using these companies’ products to shift to a new provider if desired.

The tool will eventually be available via the settings section of “Your Facebook Information.” “We’ve kept privacy and security as top priorities, so all data transferred will be encrypted and people will be asked to enter their password before a transfer is initiated,” said Satterfield. 

Satterfield said Facebook hoped to “advance conversations” on the privacy questions identified in its data portability white paper, which included the need to make users aware of the privacy terms at the destination service and the types of data being transferred, and to ensure the data is encrypted so it cannot be diverted by hackers. For example, should contact list data be portable, given that it is the private information of other people? Satterfield called on more companies to join the Data Transfer Project to further such efforts – a welcome move given that, after a string of security and privacy scandals, Facebook might not be the most trusted service on such issues.

How to excel at secured cloud migrations through shared responsibility: A guide

  • 60% of security and IT professionals state that security is the leading challenge with cloud migrations, despite not being clear about who is responsible for securing cloud environments
  • 71% understand that controlling privileged access to cloud service administrative accounts is a critical concern, yet only 53% cite secure access to cloud workloads as a key objective of their cloud privileged access management (PAM) strategies

These and many other fascinating insights are from the recent Centrify survey, Reducing Risk in Cloud Migrations: Controlling Privileged Access to Hybrid and Multi-Cloud Environments, downloadable here. The study is based on a survey of over 700 respondents from the United States, Canada and the UK, spanning over 50 vertical markets, with technology (21%), finance (14%), education (10%), government (10%) and healthcare (9%) the top five. For additional details on the methodology, please see page 14 of the study.

What makes this study noteworthy is its candid, honest assessment of how enterprises can make cloud migrations more secure through a better understanding of who is responsible for securing privileged access to cloud administrative accounts and workloads.

Key insights from the study include the following:

Improved speed of IT services delivery (65%) and lowered total cost of ownership (54%) are the two top factors driving cloud migrations today

Additional factors include greater flexibility in responding to market changes (40%), outsourcing IT functions that don’t create competitive differentiation (22%), and increased competitiveness (17%). Reducing time-to-market for new systems and applications is one of the primary catalysts driving cloud migrations today, making it imperative for every organisation to build security policies and systems into their cloud initiatives.

Security is the greatest challenge to cloud migration by a wide margin

60% of organisations define security as the most significant challenge they face with cloud migrations today. Roughly one in three cites the cost of migration (35%) and lack of expertise (30%) as the second and third greatest impediments to cloud migration projects succeeding. Organisations are facing constant financial and time constraints to achieve cloud migrations on schedule to support time-to-market initiatives, and no organisation can afford the lost time and expense of an attempted or successful breach impeding that progress.

71% of organisations are implementing privileged access controls to manage their cloud services

However, as privilege becomes more task-, role- or access-specific, interest in securing these levels of privileged access diminishes: only 53% of organisations are securing access to the workloads and containers they have moved to the cloud.

An alarmingly high 60% of organisations incorrectly view the cloud provider as being responsible for securing privileged access to cloud workloads

It’s shocking how many customers of AWS and other public cloud providers are falling for the myth that cloud service providers can completely protect their customised, highly individualised cloud instances.

The native identity and access management (IAM) capabilities offered by AWS, Microsoft Azure, Google Cloud and others provide enough functionality to help an organisation get up and running and control access within each provider’s own environment. However, they often lack the scale to adequately address the more challenging, complex areas of IAM and privileged access management (PAM) in hybrid or multi-cloud environments. For an expanded discussion of the Shared Responsibility Model, please see The Truth About Privileged Access Security On AWS and Other Public Clouds; the survey also reproduces Amazon Web Services’ own interpretation of the model.

Implementing a common security model in the cloud, on-premises, and in hybrid environments is the most proven approach to making cloud migrations more secure

Migrating cloud instances securely needs to start with multi-factor authentication (MFA), deploying a common privileged access security model across on-premises and cloud systems, and utilising enterprise directory accounts for privileged access.

These three initial steps set the foundation for implementing least privilege access. It’s been a major challenge for organisations to do this, particularly in cloud environments, as 68% are not eliminating local privileged accounts in favour of federated access controls and are still using root accounts outside of “break glass” scenarios.

Even more concerning, 57% are not implementing least privilege access to limit lateral movement and enforce just-enough, just-in-time access.
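As a concrete, if simplified, illustration of auditing the basics described above, the hypothetical boto3 snippet below lists IAM users with no MFA device attached; it is a rough sketch for AWS environments, not part of the Centrify study:

    import boto3

    iam = boto3.client("iam")

    # Walk every IAM user and flag those with no MFA device registered.
    paginator = iam.get_paginator("list_users")
    for page in paginator.paginate():
        for user in page["Users"]:
            name = user["UserName"]
            devices = iam.list_mfa_devices(UserName=name)["MFADevices"]
            if not devices:
                print(f"MFA not enabled for user: {name}")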

When it comes to securing access to cloud environments, organisations don’t have to reinvent the wheel

Best practices from securing on-premises data centres and workloads can often be successful in securing privileged access in cloud and hybrid environments as well.

Conclusion

The study provides four key takeaways for anyone working to make cloud migrations more secure. First, all organisations need to understand that securing privileged access to cloud environments is their responsibility, not their cloud provider’s. Second, adopt a modern approach to privileged access management that enforces least privilege, prioritising “just enough, just-in-time” access. Third, employ a common security model across on-premises, cloud and hybrid environments. Fourth and most important, modernise your security approach by considering how cloud-based PAM systems can help make cloud migrations more secure.

Dead Netflix accounts reactivated by hackers


Bobby Hellard

29 Nov, 2019

Hackers have exploited Netflix’s data retention policies to reactivate cancelled customer subscriptions and steal their accounts.

Former subscribers say they noticed their accounts had been reinstated when they were charged a monthly fee, months after cancellation.

The hackers can log in to dormant accounts and reactivate them without knowing users’ bank details, according to the BBC.

This is because the streaming service stores customer data, including billing information, for ten months after cancellation, in order to enable speedy account recovery should a user wish to rejoin.

However, this also benefits hackers, who need just an email address and password to reactivate an account.

Radio 4’s You and Yours programme spoke to Emily Keen, who said she cancelled her subscription in April 2019 but was charged £11.99 by Netflix in September. She tried to log in to the account but found that her email and password were no longer recognised, as the hackers had changed her details and signed her up to a more expensive subscription tier.

Keen contacted Netflix and was assured her card would be blocked and she would receive a full refund, but the streaming service went on to take two further payments in October and November.

Other users that have had their accounts mysteriously reactivated have hit out at the company on Twitter.

“Super disappointed with my @netflix customer service experience,” one user posted on the social media site. “Our account was hacked, supposed to have been deactivated, was reactivated by hacker, and continued to use our credit card. We were told to file chargeback and @netflix would not offer refund.”

Stolen Netflix login details have reportedly been found on sites like eBay, sold as “lifetime” accounts for as little as £3. The same issue was reported for Disney+ accounts just hours after the service launched in the US, with login details surfacing on hacking forums.

Cloud Pro has approached Netflix for comment.

Alibaba Cloud releases Alink machine learning algorithm to GitHub

Alibaba Cloud has announced it has made the ‘core codes’ of its machine learning algorithm Alink available on GitHub.

The company notes it is one of the top 10 contributors to the GitHub ecosystem, with approximately 20,000 contributors. Alink was built as a self-developed platform to aid batch and stream processing, with applications for machine learning tasks such as online product recommendation and intelligent customer services.

Not surprisingly, Alibaba is targeting data analysts and software developers who want to build their own software for statistical analysis, real-time prediction and personalised recommendation.

“As a platform that consists of various algorithms combining learning in various data processing patterns, Alink can be a valuable option for developers looking for robust big data and advanced machine learning tools,” said Jia Yangqing, Alibaba Cloud president and senior fellow of its data platform. “As one of the top 10 contributors to GitHub, we are committed to connecting with the open source community as early as possible in our software development cycles.

“Sharing Alink on GitHub underlines this long-held commitment,” Jia added.

With the US enjoying a well-earned holiday rest, and the majority of the world hunting out Black Friday deals, Alibaba had a chance to steal a march on the opposition with Singles Day earlier this month. The numbers put out by the company did not disappoint: zero downtime was claimed, with $1 billion of gross merchandise volume achieved within 68 seconds of launch.

A recent report from ThousandEyes aimed to benchmark the performance of the hyperscalers, noting that Alibaba, alongside Amazon Web Services (AWS), relied more heavily on the public internet than Microsoft and Google, which generally prefer private backbone networks. The report also noted that, contrary to popular opinion, Alibaba traffic still suffered packet loss when crossing China’s Great Firewall.

You can take a look at the Alibaba Cloud Alink GitHub repository here.

GitGuardian, the security startup hunting down online secrets to keep companies safe from hackers


Victoria Woollaston

28 Nov, 2019

When the login details of an Uber engineer were exposed in 2016 – signalling one of the most high-profile breaches of recent years – the names and addresses of 57 million riders and drivers were left at the mercy of hackers. 

None of Uber’s corporate systems had been directly breached, though. Its security infrastructure was working as it should. Instead, the credentials were found buried within the code of an Uber developer’s personal GitHub account. This account and its repositories were hacked, reportedly due to poor password hygiene, and the stolen credentials were used to access Uber’s vast datastore. This breach, which Uber sat on for a year, resulted in a then-record-breaking $148 million fine.

Yet despite this public lesson in how not to handle private credentials, so-called company secret leakage is an everyday occurrence.

The rise of secret leakage

Research from North Carolina State University found that in just six months between October 2017 and April 2018, more than half a million secrets were uploaded to GitHub repositories, including sensitive login details, access keys, auth tokens and private files. A 2019 SANS Institute survey found that half of company data breaches in the past 12 months were a result of credential hacking – higher than any other attack method among firms using cloud-based services. 

This is where GitGuardian comes in. 

Founded in 2017 by Jérémy Thomas and Eric Fourrier – a pair of applied mathematics graduates and software engineers specialising in data science, machine learning and AI – the Paris-based cybersecurity startup uses a combination of algorithms, including pattern matching and machine learning, to hunt for signs of company secrets in online code. According to the company’s figures, a staggering 3,000-plus secrets make their way online every day.

“The idea for GitGuardian came when Eric and I spotted a vulnerability buried in a GitHub repository,” CEO and co-founder Thomas tells Cloud Pro. “This vulnerability involved sensitive credentials relating to a major company being leaked online that had the potential to cost the firm tens of millions of dollars if they had got into the wrong hands. We alerted the company to the vulnerability and it was able to nullify it in less than a week.” 

“We then built an algorithm and real-time monitoring platform that automated and significantly built-upon the manual steps we took when we made that initial detection, and this platform attracted interest from GitHub’s own Scott Chacon as well as Solomon Hykes from Docker and Renaud Visage from EventBrite.” 

How the cloud is fuelling secret leakage

The problem of sensitive data leakage stems in part from the increasing reliance of software developers on third-party services. To integrate such services, developers often juggle hundreds of credentials with varying sensitivity, from API keys used to provide mapping features on websites to Amazon Web Services login details, and private cryptographic keys for servers. Not to mention the many secrets designed to protect data, surrounding payment systems, intellectual property and more. 

In the process of handling these integrations, more than 40 million developers and almost 3 million businesses and organisations globally use GitHub, the public platform that lets developers share code and collaboratively work on projects. Either by accident (in the majority of cases) or occasionally knowingly, their uploads can have company secrets buried within them alongside the code that’s being developed. As was seen with the Uber breach, hackers can theoretically scour this code, steal credentials and hack company accounts, all without the developer or their employer being any the wiser.

How GitGuardian plugs these leaks

GitGuardian’s technology works by first linking developers registered on GitHub to their respective companies. This already gives each company greater insight into who its developers are on GitHub and the levels of public activity they’re involved in. This is especially important for developers’ personal repositories, because they’re completely out of their companies’ control yet too often contain corporate credentials.

Once linked, GitGuardian’s algorithms scrutinise any and all code changes, known as commits, made by these developers in real time, looking for signs of company secrets. Such signs within these commits range from code patterns to file types that have previously been found to contain credentials.
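As a rough illustration of what pattern-based detection can look like – a toy sketch, not GitGuardian’s actual engine – the snippet below flags AWS-style access key IDs and long, high-entropy string literals in a commit diff:

    import math
    import re
    from collections import Counter

    # Toy detectors: a fixed pattern for AWS access key IDs, plus a Shannon
    # entropy check for long, random-looking string literals.
    AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")
    STRING_LITERAL = re.compile(r"""["']([A-Za-z0-9+/=_\-]{20,})["']""")

    def shannon_entropy(value: str) -> float:
        counts = Counter(value)
        return -sum((n / len(value)) * math.log2(n / len(value)) for n in counts.values())

    def scan_commit(diff_text: str):
        findings = [("aws_key_id", m.group(0)) for m in AWS_KEY_ID.finditer(diff_text)]
        for match in STRING_LITERAL.finditer(diff_text):
            candidate = match.group(1)
            if shannon_entropy(candidate) > 4.0:  # long and random-looking
                findings.append(("high_entropy_string", candidate))
        return findings

    # The value below is AWS's documented dummy example key, not a real credential.
    print(scan_commit('aws_secret = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"'))

Real scanners layer many more signals, and feedback from the people alerted, on top of crude checks like these, which is precisely the loop described next.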

“Our algorithms scan the content of more than 2.5 million commits a day, covering over 300 types of secrets from keys to database connection strings, SSL certificates, usernames and passwords,” Thomas continues.

Once a leak occurs, it takes four seconds for GitGuardian to detect it and send an alert to the developer and their security team. On average, the information is removed within 25 minutes and the credential is revoked within the hour. For every alert, GitGuardian seeks feedback from the developers and security teams involved, who rate the accuracy of the detection: were company secrets actually exposed or was it a false positive? Consequently, the algorithm is constantly evolving in response to new secrets and how they are leaked.

This seems like a simple premise, even if the technology behind it is far from simple. But what’s to stop a hacker building a similar algorithm to intercept the secrets before GitGuardian’s platform spots it? 

“GitGuardian is indeed competing with individual black hat hackers, as well as organised criminal groups,” Thomas explains. “We constantly improve our algorithms to be quicker and smarter than they are, and to be able to detect a wider scope of vulnerabilities, which requires a dedicated, highly skilled team.

“We’re helped in this by our users and customers who give us feedback – at scale – that we reinject into our algorithms. Our white hat approach allows us to collect feedback and this gives us a tremendous edge over black hats. You can see this as the unfair advantage you get by doing good.”

GitGuardian has already supported global government organisations, more than 100 Fortune 500 companies and 400,000 individual developers. It’s now setting its sights on adding even more developers and companies to its platform to further improve its algorithm, and extend this technology for use on private sites. 

“We started GitGuardian by tackling secrets in source code and private sites,” concludes Thomas. “Our ambition really is to be developers’ and cybersecurity professionals’ best friend when it comes to securing the vulnerability area that is emerging due to modern software development techniques [and] we’re on the road to doing this.”

McAfee notes the gap between cloud-first and cloud-only – yet optimism reigns on success

Two in five large UK organisations expect their operations to be cloud-only by 2021, according to a new report – but the gap between the haves and the have-nots is evident.

The findings appear in a new report from McAfee. The security vendor polled more than 2,000 respondents – 1,310 senior IT staff and 750 employees – across large businesses in the UK, France and Germany to assess cloud readiness.

40% of large UK businesses expect to be cloud-only by 2021, yet only 5% of those surveyed already consider themselves to be at this stage, the research found. 86% of UK-based senior IT staff saw their business as cloud-first today, comparable to France (90%) and Germany (92%). Optimism reigned over eventually becoming cloud-only when no firm date was attached: 70% of UK respondents agreed this would occur, albeit a lower share than their French (75%) and German (86%) counterparts.

The benefits are clear among respondents. 88% of senior IT staff polled in the UK said moving to the cloud had increased productivity among end users. 84% said the move had improved security, while supplying more varied services (85%) and increased innovation (84%) were also cited.

The question of responsibility is an interesting one, and shows where the waters begin to muddy. Setting aside the issue of vendor versus customer, there is little consensus within senior leadership: the largest share believe responsibility lies with the head of IT (34%), compared with the CIO (19%), CEO (14%) or CISO (5%). One in five (19%) employees surveyed admitted to using apps which had not been approved by IT.

“The key to security in a cloud-first environment is knowing where and how data is being used, shared and stored by employees, contractors and other third parties,” said Nigel Hawthorn, director of McAfee’s EMEA cloud security business. “When sensitive corporate data is under the IT team’s control – whether in collaboration tools or SaaS and IaaS applications – organisations can ensure the right policies and safeguards are in place to protect data from device to cloud, detect malicious activity and correct any threats quickly as soon as they arise.”

Those wondering ‘whither McAfee?’ with regards to cloud security research will notice the company’s long-standing pivot to this arena. The abovementioned ‘device to cloud’ reference is taken direct from McAfee’s branding as the company looks to gather expertise as a cloud access security broker (CASB).

This is not without success: McAfee was named a leader in Gartner’s CASB Magic Quadrant for a second year last month, alongside Bitglass, Netskope and Symantec. Last year, Gartner noted that, with the acquisition of Skyhigh Networks, McAfee had built expertise in raising awareness of shadow IT. 2019’s Quadrant sees one new face in the winners’ enclosure in the shape of Microsoft.

In April, McAfee released a special edition of its Cloud and Risk Adoption Report. Of the 1,000 enterprise organisations polled, more than half (52%) said they found security better in the cloud than on-premises, while organisations that adopt a CASB were more than 35% likelier to launch new products and gain quicker time to market.

AT&T and Microsoft launch edge computing network


Bobby Hellard

27 Nov, 2019

Microsoft and AT&T have integrated 5G with Azure to launch an edge computing service for enterprise customers.

The two companies signed a $2 billion deal in July, which involved the migration of AT&T data and workflows to Azure, and introduced plans to accelerate work on 5G and cloud computing.

The first joint announcement to come out of the deal, made on 26 November, is the pilot launch of an edge computing service called Network Edge Compute, a virtualised 5G core that can deploy Azure services.

It’s available to certain customers, initially in Dallas, but will roll out to some in Los Angeles and Atlanta over the next year.

“With our 5G and edge computing, AT&T is collaborating uniquely with Microsoft to marry their cloud capabilities with our network to create lower latency between the device and the cloud that will unlock new, future scenarios for consumers and businesses,” said Mo Katibeh, EVP and chief marketing officer, AT&T Business.

“We’ve said all year developers and businesses will be the early 5G adopters and this puts both at the forefront of this revolution.”

The collaboration will see AT&T become a “public-cloud first” business, according to Microsoft. The telecoms giant’s migration is well underway and is set to be completed by 2024.

“We are helping AT&T light up a wide range of unique solutions powered by Microsoft’s cloud, both for its business and our mutual customers in a secure and trusted way,” said Corey Sanders, corporate VP of Microsoft Solutions.

“The collaboration reaches across AT&T, bringing the hyper-scale of Microsoft Azure together with AT&T’s network to innovate with 5G and edge computing across every industry.”

It’s also another big deal for Microsoft, which has made its public cloud strategy clear with a number of acquisitions of migration specialists. Most recently the tech giant snapped up Mover, which swiftly followed a deal to buy the similarly named Movere.

Microsoft and AT&T expand upon partnership to deliver Azure services on 5G core

Microsoft and AT&T have beefed up their strategic partnership, announcing a new offering where AT&T’s growing 5G network will be able to run Azure services.

The companies will be opening select preview availability for network edge compute (NEC) technology. The technology ‘weaves Microsoft Azure cloud services into AT&T network edge locations closer to customers,’ as the companies put it.

Microsoft and AT&T first came together earlier this year, with the former somewhat stealing the thunder of IBM, which had announced a similar agreement with AT&T the day before.

While the operator will be using Microsoft's technology to a certain extent – the press materials noted it was 'preferred' for 'non-network applications' – the collaborative roadmap for edge computing, 5G and other technologies was the more interesting part of the story. The duo noted various opportunities that would be presented through 5G and edge. Mobile gaming is on the priority list, as is utilising drones for augmented and virtual reality.

Regarding AT&T's own cloud journey, the commitment to migrate most non-network workloads to the public cloud by 2024 was noted, while the operator's pledge to become 'public-cloud first' was reaffirmed.

“We are helping AT&T light up a wide range of unique solutions powered by Microsoft’s cloud, both for its business and our mutual customers in a secure and trusted way,” said Corey Sanders, Microsoft corporate vice president in a statement. “The collaboration reaches across AT&T, bringing the hyperscale of Microsoft Azure together with AT&T’s network to innovate with 5G and edge computing across every industry.”

After many false starts – remember Verizon’s ill-fated public cloud product offering? – telco is finding a much surer footing in the cloud ecosystem. As VMware CEO Pat Gelsinger put it in August: “Telcos will play a bigger role in the cloud universe than ever before. The shift from hardware to software is a great opportunity for US industry to step in and play a great role in the development of 5G.”

You can read the full Microsoft and AT&T update here.
