Intel Presentation at @CloudEXPO New York | @Intel @IntelSoftware @ZhannaGrinko #CIO #DataCenter #DigitalTransformation

On-premises or off, you have powerful tools available to maximize the value of your infrastructure, and you demand more visibility and operational control. Fortunately, data center management tools keep watch over memory contention, power and thermal consumption, server health, and utilization, allowing better control no matter your cloud's shape. In this session, learn how Intel software tools enable real-time monitoring and precise management to lower operational costs and optimize infrastructure for today even as you're forecasting for tomorrow.
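As a rough, generic illustration of the kind of telemetry such tools watch (this is not Intel's software; it is a minimal sketch using the open-source psutil library, with an arbitrary alert threshold), a polling loop on a single server might look like this:

```python
import time

import psutil  # generic cross-platform telemetry library, not an Intel tool

UTILISATION_ALERT = 90.0  # illustrative threshold, in percent


def sample_telemetry():
    """Collect one snapshot of basic server health metrics."""
    snapshot = {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }
    # Temperature sensors are only exposed on some platforms.
    if hasattr(psutil, "sensors_temperatures"):
        snapshot["temperatures"] = {
            name: [entry.current for entry in entries]
            for name, entries in psutil.sensors_temperatures().items()
        }
    return snapshot


if __name__ == "__main__":
    while True:
        metrics = sample_telemetry()
        if metrics["cpu_percent"] > UTILISATION_ALERT:
            print("ALERT: high CPU utilisation:", metrics["cpu_percent"])
        print(metrics)
        time.sleep(60)  # poll once a minute
```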


Google invests $700 million in carbon-neutral Danish data centre


Connor Jones

21 Nov, 2018

Google has announced plans to build a cutting-edge, carbon neutral data centre in Fredericia, Denmark, matching any energy consumed with 100% carbon-free energy.

Google will be investing almost $700 million in the new site, the location of which was chosen due to the country’s high-quality digital infrastructure and support for renewable energy. Nordic countries are renowned for their use of cheap renewable energy sources including hydropower and wind, so it’s no surprise more tech giants are setting up shop in Scandinavia.

Google's European data centres typically use one-third less energy than its sites elsewhere in the world, but the tech giant is still on a quest to use less. It said it will also be seeking out new investment opportunities in Danish renewable energy projects through power purchase agreements (PPAs).

"At Google, we aim to support the communities that surround our facilities, and in the last few years we've invested almost 3.4 million euro in grants to initiatives that build the local skills base – like curriculum and coding programs, as well as educational support through teaching collaborations at area colleges. We'll also introduce initiatives like these in Fredericia," said Joe Kava, vice president of data centres, in yesterday's announcement.

The new data centre in Fredericia will be one of the most advanced and energy efficient in the company’s arsenal, implementing advanced machine learning to ensure every watt of power is used effectively and efficiently.

"In a dynamic environment like a data center, it can be difficult for humans to see how all of the variables – IT load, outside air temperature, etc. – interact with each other. One thing computers are good at is seeing the underlying story in the data," Joe Kava said in a blog post. Using finely tuned models designed by Google's engineers, the company is able to optimise PUE (Power Usage Effectiveness).
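For context, the post does not define PUE: it is the ratio of the total energy a facility draws to the energy actually delivered to IT equipment, so values closer to 1.0 mean less is lost to cooling and other overhead.

PUE = total facility energy / energy delivered to IT equipment

Lowering PUE towards 1.0, rather than raising it, is the goal of the tuning Kava describes.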

This isn’t the first time in recent months that Google has hit the headlines for its work in renewable energy. Back in May 2018, the company announced a partnership with Eon to provide a new service which aimed to help UK homeowners save money by switching to solar panels.

The service uses machine learning to assess data points including roof area and angle to determine a house’s solar potential.

Even more recently, and continuing the Nordic theme, Google signed a 10-year deal back in September 2018 to buy renewable energy from three new wind farms in Finland, which it will use to power one of its data centres. It was the first time the company agreed to buy power from wind farms built without any government subsidies.

Microsoft buys FSLogix to boost Office 365 virtualisation performance


Clare Hopping

21 Nov, 2018

Microsoft has acquired app provisioning firm FSLogix to improve the speed of user profile loading in applications including Outlook and OneDrive.

The solution has been engineered for multi-cloud environments and will significantly improve the user experience, according to Brad Anderson, corporate vice president for Microsoft 365, and Julia White, corporate vice president for Microsoft Azure.

“Through customer engagement, we know that Microsoft Office applications are some of the most highly used and most commonly virtualized applications in any business,” they said in a joint statement.

“Office 365 ProPlus is currently the best Office experience, and, with FSLogix enabling faster load times for user profiles in Outlook and OneDrive, Office 365 ProPlus will become even more performant in multi-user virtual environments (including Windows Virtual Desktop).”

The Atlanta, Georgia-based company is a significant investment for Microsoft. Although the terms of the deal haven't been revealed, FSLogix had raised investment to the tune of $10.3 million before the acquisition was announced.

“When we launched FSLogix in 2012, our goal was to build software that helped customers reduce the amount of resources, time, and labor required to support virtual desktops,” FSLogix cofounder and CTO Randy Cook said.

"Our first two products, FSLogix Apps and FSLogix Profile Container, focused on addressing critical needs that have existed from the dawn of desktop virtualisation… our most recent product, Office 365 Container, is designed to enhance the Microsoft Office 365 experience in those virtual desktop environments."

FSLogix will join Microsoft “soon” and the two teams will continue to work together to integrate its platform into Microsoft’s productivity apps and services.

SAP buys automation specialist Contextor to boost S/4HANA


Clare Hopping

21 Nov, 2018

SAP has announced the acquisition of robotic process automation (RPA) innovator Contextor SAS, which the company says will help with the development of its SAP Leonardo Machine Learning portfolio.

Contextor's technology makes it easier for businesses to automate repetitive tasks, freeing up resources for employees to develop new products and services instead. It is based on bots: software distributed across key business-as-usual (BAU) tasks and applications, ensuring innovation remains at the heart of everything they do.

To date, Contextor customers have deployed more than 100,000 bots to save their businesses valuable resources, time and money.

“With intelligent RPA accelerated by Contextor, businesses will be able to achieve the high automation level necessary to become intelligent enterprises,” said Markus Noga, head of machine learning at SAP.

“The acquisition is a big step towards orchestrating process automation and will help SAP inject RPA capabilities into our applications, first and foremost into SAP S/4HANA.”

SAP expects Contextor’s technology to be deployed to customers in the first half of 2019. It’ll first be integrated into the company’s SAP S/4HANA, with other platforms following later in the year. SAP hopes up to half of all its customers’ processes deployed on SAP ERP software will be automated over the next three years.

SAP is investing heavily in AI and machine learning at the moment, making it easier for businesses to gain insights from customers, reduce the manpower needed to stay ahead of the curve and boost productivity. Last week it acquired experience management firm Qualtrics for $8 billion, and back in January it bought chatbot business Recast.ai. That's on top of its other purchases, Coresystems and Callidus Cloud.

UK Cloud Awards 2019 now open for business


Cloud Pro

21 Nov, 2018

The Cloud Industry Forum (CIF) and Cloud Pro are pleased to announce that the UK Cloud Awards are now open for entries.

Has your business creatively used cloud services to drive a new cloud-based project? Or are you an innovative cloud vendor? Have you played a key part in a successful digital transformation project powered by the cloud? Do you see yourself as a cloud leader, entrepreneur or visionary?

If you answered yes to any of these questions, we want to hear from you as the UK Cloud Awards 2019 are now open for business.

Now in their sixth year, the awards, which are designed to showcase and celebrate the leading vendors, customers and individuals who are setting the benchmark in the UK cloud industry and beyond, will take place on 16 May 2019 at the prestigious County Hall in London.

Since their launch, the Awards have constantly evolved to keep pace with the changing tech landscape. This year is no exception, with new categories introduced focused on the impact of AI/ML, DevOps, the increasing diversity of talent, and the formation of ecosystems.

The award categories are as follows:

BEST-IN-CLASS

  • Most Innovative Enterprise Product
  • Most Innovative SMB Product
  • Best Cloud Platform Solution
  • Cyber or Security Product or Service
  • Best FinTech Product or Service
  • Best Data Management Product or Service
  • Best AI/ML Enabled Product or Service
  • Best Cloud Enabled End User Experience

BEST DIGITAL TRANSFORMATION PROJECTS

  • Public Sector Project/3rd Sector Project
  • Private Sector Enterprise Project
  • Private Sector SMB Project
  • Best DevOps & Function as a Service Implementation

BEST-IN-CLASS CLOUD SERVICE PROVIDER

  • Best Cloud Service Provider
  • Best Cloud Managed Service Provider
  • Cloud Migration Partner/Technical Collaboration Project

ACHIEVEMENT AWARDS

  • Best Newcomer of the Year
  • Cloud Visionary of the Year

Entries will be scrutinised by an expert panel of judges, headed up by cloud expert Jez Back.

"The UK Cloud Awards have rightly earned their spot as one of the most credible and innovative events in the technology awards calendar, so I am delighted to assume the mantle as head judge this year," Back said.

"To ensure that we can keep pace with the industry we have included new categories focused on next-generation technologies, such as AI, and emerging techniques such as DevOps. We also wish to look to the future, so have introduced a new individual category to showcase the diversity and emerging talent of our future leaders by creating Best Cloud Newcomer."

Alex Hilton, CEO of the Cloud Industry Forum, added: “The UK Cloud Awards celebrate all the innovation this industry can offer and the whole event, from the number of attendees to the number of nominations, grows year-on-year. The Awards’ popularity owes much to our stringent and entirely independent judging process, designed to ensure that we can really identify the best of the best. 2018 was a record year for the UK Cloud Awards, and I have no doubt that we can raise the bar again this year.”

Entries to the awards can now be submitted via the website and submissions must be completed before close of business on 22 February 2019 to be eligible.

AWS launches new security offering which mitigates S3 misconfigurations – if customers get it right

Amazon Web Services (AWS) has announced extra steps to ensure customers’ S3 buckets don’t become misconfigured – but don’t assume responsibility has been taken away from the customer.

The new service, Amazon S3 Block Public Access, works at the account level and on individual buckets, including buckets created in the future. Users can block existing public access or ensure that public access is not granted to newly created items.

The move can be seen as an extension of the access controls users already have on S3 buckets, through either access control lists (ACLs) or identity and access management (IAM) bucket policies. There is no charge for the feature beyond the usual prices for requests made to the S3 API.
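As a minimal sketch of how the new setting can be applied programmatically (the bucket name here is a placeholder, and this is an illustration rather than AWS's own documentation), the boto3 SDK exposes the four Block Public Access switches like so:

```python
import boto3

s3 = boto3.client("s3")

# Apply all four Block Public Access settings to a single bucket.
# An equivalent account-wide call is available via the s3control client.
s3.put_public_access_block(
    Bucket="example-bucket",  # placeholder bucket name
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,        # reject requests that add public ACLs
        "IgnorePublicAcls": True,       # ignore any existing public ACLs
        "BlockPublicPolicy": True,      # reject public bucket policies
        "RestrictPublicBuckets": True,  # restrict access to authorised principals only
    },
)
```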

As Jeff Barr, chief evangelist for Amazon Web Services, put it in a blog post explaining the new system: “We want to make sure that you use public buckets and objects as needed, while giving you tools to make sure that you don’t make them publicly accessible due to a simple mistake or misunderstanding.”

This has been a long-term problem for both AWS and its customers. The model of shared responsibility states that the provider is liable for security ‘of’ the cloud, such as infrastructure, while the customer is responsible for security ‘in’ the cloud – in other words, ensuring data is properly configured.

A series of high-profile breaches, affecting the likes of Verizon, Accenture and Booz Allen Hamilton, has highlighted the issue. Last month, research from cloud access security broker (CASB) Netskope argued that the majority of Center for Internet Security (CIS) benchmark violations found in AWS environments fell under the IAM remit.

AWS has taken steps previously to make the issue more visible – literally. This time last year the company revamped its design to give bright orange warning indicators as to which buckets were public. Yet the message of personal and organisational responsibility still needs to be hammered home.

In April, CloudTech published two articles exploring S3 security as part of its monthly topic focusing on the subject. Doug Hazelman, vice president of technical marketing at backup service provider CloudBerry, argued there were no excuses for errors of this nature.

“By virtue of having a service readable and writeable from anywhere in the world, this sort of [attack] is bound to happen, one might say. But that is not true: even the lowest functionality devices, such as sensors, can be configured to authenticate via a put request to an S3 bucket,” Hazelman wrote.

“Put simply: this shouldn’t happen. There is no reason to have a world-readable and world-writeable S3 bucket,” he added. “Preventing this type of lift of private data requires making sure one simple setting is configured as is the default when setting up a new Amazon S3 instance.

“To be honest, it is beyond me why projects make it into production with this setting at anything but its secure default, but too many breaches – and it’s a stretch to call them breaches because accessing the data is essentially as simple as browsing to a public website – have shown that for whatever reason, companies are not being careful enough in their S3 configurations.”
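To illustrate Hazelman's point about low-functionality devices authenticating their uploads (the bucket and key names below are hypothetical), a time-limited pre-signed URL lets a sensor PUT data into a private bucket without that bucket ever being world-writeable:

```python
import boto3
import requests  # the device side only needs plain HTTPS

s3 = boto3.client("s3")

# Generated server-side under proper IAM credentials.
upload_url = s3.generate_presigned_url(
    "put_object",
    Params={"Bucket": "example-sensor-data", "Key": "readings/device-42.json"},
    ExpiresIn=3600,  # URL is valid for one hour
)

# On the device: a single authenticated PUT request, no public bucket required.
requests.put(upload_url, data=b'{"temperature": 21.5}')
```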

Micah Montgomery, cloud services architect at cybersecurity firm Mosaic451, cited a lack of understanding of the cloud's complexity as a concern.

"The ease of using AWS or other cloud environments can make it easy to forget just how complex the cloud is," he wrote. "This complexity is why the cloud is so valuable, but it also decreases visibility. In many cases, AWS breaches happen because organisations have non-IT personnel, or IT personnel who do not fully understand the cloud, configuring their AWS buckets.

“In a general IT environment, there is a management console for every area and tool,” Montgomery added. “Once you add a cloud environment, you add another management console. There are already hundreds of ways to screw things up in an on-premises data environment. The cloud adds yet another layer of complexity, and organisations must understand how it will impact their overall cybersecurity.”

With this latest update, AWS is giving customers even more ways to get it right – but bear in mind it cannot hold their hands every step of the way. The full details are in Barr's blog post.


Microsoft Office 365 and Azure users locked out of accounts due to MFA issues


Keumars Afifi-Sabet

20 Nov, 2018

Azure and Office 365 users were unable to log in to their accounts yesterday due to issues with Microsoft's multi-factor authentication (MFA) service.

From 4.39am on Monday until later that evening, users in the UK and Western Europe, as well as pockets around the world, were unable to access their Office 365 accounts.

Azure services such as Azure Active Directory were also closed off to users whose organisations enforced mandatory MFA.

Although Microsoft says its services are now operating as normal, this incident has angered organisations trying to convince their employees of MFA’s benefits, as well as those who have had to contend with similar outages in recent months.

The cause, according to Azure's status history, lay with requests from MFA servers to a Europe-based database reaching their operational threshold, which in turn caused latency and timeouts.

Attempts to reroute traffic through North America ended in failure and caused a secondary issue, as servers became unhealthy and traffic was throttled to handle the increased demand.

"Engineers deployed a hotfix which eliminated the connection between Azure Identity Multi-Factor Authentication Service and a backend service. Secondly, engineers cycled impacted servers which allowed authentication requests to succeed," Microsoft wrote.

"Engineers will continue to investigate to establish the full root cause and prevent future occurrences."

The firm says it will publish a full analysis of the outage within the next couple of days.

Image: error messages that users received upon trying to access their Office 365 and Azure accounts.

Monday’s issues are the latest in a string of prominent Microsoft Azure and Office 365 outages customers have had to suffer in recent months, with the previous incident occurring just three weeks ago.

The days-long outage, which struck in late October, left predominantly UK-based users unable to log in to Office 365 due to additional login prompts appearing after user credentials had already been entered.

Another global outage in September affected Azure and Office 365 users across the world after a "severe weather event" knocked one of Microsoft's San Antonio data centres offline.

"With less than a month between disruptions, incidents like today's Azure multi-factor authentication issue pose serious productivity risks for those sticking to a software-as-a-service monoculture," said Mimecast's cyber resilience expert Pete Banham.

"With huge operational dependency on the Microsoft environment, no organisation should trust a single cloud supplier without an independent cyber resilience and continuity plan to keep connected and productive during unplanned, and planned, email outages.

"Every minute of an email outage could cost businesses hundreds of thousands of pounds. Without the ability to securely log in, knowledge worker employees are unable to do their jobs."

Cloud Pro approached Microsoft for comment.

Pure Storage adds new AWS integrations to support hybrid cloud


Adam Shepherd

19 Nov, 2018

Pure Storage is embracing the public cloud, announcing a new suite of cloud-based services designed to support the hybrid cloud operating models the company’s customers are demanding.

The new capabilities have been collectively dubbed Pure Storage Cloud Data Services, and comprise three new features built around AWS’ public cloud platform.

The first, Cloud Block Store for AWS, is billed as "industrial-strength block storage" for mission-critical apps hosted natively on AWS. It runs the same Purity storage software layer as the company's on-premise flash hardware and is managed by its Pure1 cloud management service. The goal is to make it as easy as possible to move data between on-premise storage and AWS environments, using the same APIs, plug-ins and automation tools across both.

Pure is touting benefits for efficiency, reliability and performance thanks to features like thin provisioning and asynchronous replication to AWS. "We really elevate cloud storage up to a product that is suitable for tier one mission-critical applications that are reliant on having consistent, performant access to data," said Pure Storage field CTO Patrick Smith.

The company is also adding a new feature to its FlashArray all-flash data centre storage products. CloudSnap for AWS allows FlashArray products to send portable snapshots to AWS S3 as a target, and also allows those snapshots to be restored quickly, either on-prem or in an AWS environment via Cloud Block Store.

"Where we're differentiated here is that FlashArray will back up to FlashBlade, which is our all-flash object storage platform. It not only backs up incredibly quickly, but it also provides fast recovery – 10x faster, in most cases, than our competition," Smith said.

The final feature being announced is StorReduce for AWS, an object storage deduplication engine that Pure picked up when it acquired deduplication specialist StorReduce earlier this year. The new deduplication features will allow Pure's customers to do away with tape backups for long-term storage, the company says, and embrace a more flexible flash-based architecture.

"It allows us to change the economics of FlashBlade; not just make it a gamechanger in terms of rapid recovery, but also allow us to do that at a price point that means it's not just for those troublesome applications that you can't restore in time," Smith told Cloud Pro. "This now makes FlashBlade suitable for all our customers' on-prem backup requirements."

CloudSnap is available for FlashArray customers now, while Cloud Block Store and StorReduce are entering a limited public beta, with full public availability planned for both by mid-2019.

The company also told Cloud Pro that while AWS is the only public cloud provider supported at launch, adding other major providers is «a priority» post-launch.

"AWS is the start. We needed to start somewhere and AWS is a good partner with us," Smith said, "and so they were a logical start – but we will absolutely have plans to add the other large cloud providers."

Smith also predicted big benefits for Pure Storage’s partner ecosystem, on which Pure depends for its route to market.

"I think in the same way that this opens up new opportunities for Pure Storage, it also opens up new opportunities for our channel partners," Smith told Cloud Pro. "I think the impact of us supporting the public cloud allows them to benefit as well as us."

"We are absolutely committed to the channel; it is our go-to-market, and so our Cloud Data Services will all route through our channel partners. We are a partner company."

Diane Greene to step down as Google Cloud CEO: Analysing the past three years – and the future

Analysis: Diane Greene is to step down as Google Cloud's CEO after three years, the company has announced – with former Oracle executive Thomas Kurian to take over the reins.

Kurian, whose more than 20-year stint at Oracle culminated in leading product development, resigned from the Redwood Shores giant last month, having taken a period of leave. In a blog post announcing the change, Greene said she would remain CEO until January and stay on the board of Alphabet, with Kurian joining the company on November 26.

“The Google Cloud team has accomplished amazing things over the last three years, and I’m proud to have been a part of this transformative work,” Greene wrote. “We have moved Google Cloud from having only two significant customers and a collection of startups to having major Fortune 1000 enterprises betting their future on Google Cloud, something we should accept as a great compliment as well as a huge responsibility.”

Greene's stint certainly exceeded her own expectations, at least in terms of timeframe: she had expected to be in the role for only two years. Having joined Google after the company acquired her enterprise development platform startup Bebop, Greene committed the proceeds to philanthropic efforts and is now looking to invest in female founders and CEOs. "I want to encourage every woman engineer and scientist to think in terms of building their own company someday," Greene added.

So how will the past three years be remembered and analysed? Ultimately, while the company still trails Amazon Web Services (AWS) and Microsoft Azure, market share alone doesn't tell the full story – and the leadership and investment Greene has put in will enable Google Cloud to compete strongly on different fronts in future.

Missing the enterprise boat

As various sections of the tech media have pointed out, Google remains well behind AWS and Azure in the cloud infrastructure market. According to the most recent figures from Synergy Research, AWS holds 34% market share, with Microsoft at 14% and Google at 7%, alongside IBM.

Yet change has been strong at Google since 2015, both in terms of customers acquired, as Greene noted in her valediction, and technologies invested in. In May, Google was named for the first time as a leader in Gartner's Magic Quadrant for infrastructure as a service; a considerable feat considering the AWS/Azure duopoly. As this publication put it in March last year, when Google unveiled its latest roster of enterprise customers – three of whom were solidly in the Fortune 500 – "this was the week in which Google's enterprise cloud offering came of age."

Some, however, thought Google had been too late to make its move. Amir Hermelin, formerly product management lead at Google Cloud, took to Medium to argue the company dallied when it came to big-ticket clients. "Seeing success with Snapchat and the likes, and lacking enough familiarity with the enterprise space, it was easy to focus away from 'large orgs'," Hermelin wrote. "This included insufficient investments in marketing, sales, support, and solutions engineering, resulting in… being inferior compared to the competitors."

Nick McQuire, VP of enterprise at analyst firm CCS Insight, said that while Google was late to the party on enterprise, there is still a long road ahead. "In fairness, they've had to shift a little bit of their approach, particularly over the last 12-18 months, around open, multi-cloud and hybrid cloud," McQuire told CloudTech. "Google is also making the point that it is super early in this progression.

“There’s still in my opinion not only a lot to play for, but we’re going to see a lot of meanders across the road in the industry in cloud over the next couple of years as well.”

Key to this open approach, however, and one area where significant investment has been made, is artificial intelligence (AI) and machine learning (ML). Hermelin cited this area as one of three 'strong pillars' for Google, alongside security and Kubernetes – the latter also being open and having gained significant momentum in the industry over the past 12 months.

In AI and ML, Google can be seen as having an advantage, with the launch of pre-packaged AI services in August being of particular interest. Much of this leadership has been down to the work of Fei-Fei Li, enlisted as chief scientist at Google Cloud AI before returning to academia in September. This is an area which is only going to accelerate; a prediction from analyst firm CCS Insight forecast that by 2020, cloud service providers would expand from general purpose AI to business-specific applications.

Trust us

Another bet CCS – and McQuire personally – has made is around trust. Next year, the big cloud vendors won't be using compute, storage, or even emerging technologies as their primary currency. It will all be about trust; in particular, easing the long-standing fear of vendor lock-in through the continued feasibility of multi-cloud workloads.

Netflix could be seen as a classic example of this. The streaming giant has espoused and evangelised all things AWS for several years, but uses Google for certain disaster recovery workloads. Expect this to continue, driven by transparency around trust and compliance. “Companies are trying to figure that through: what is the advantage of going all-in versus a multi-cloud strategy?” said McQuire.

“Google at the moment is getting some good traction with companies that are pursuing a multi-cloud,” he added. “They’re going to Google for AI and machine learning and they’re starting there; maybe over time they’ll migrate more. It’s that approach that Google’s being open and honest about, which I think is beneficial to their strategy.”

With regard to financials, Google, like Microsoft, does not give specific numbers. For Q3 2018, Google's 'other revenues' bucket, where Google Cloud is housed, came in at $4.64 billion, up almost 30% from the previous year. Compared with Q4 2015, the nearest quarter to when Greene arrived, other revenues were less than half that ($2.1bn).

Google chief exec Sundar Pichai said earlier this year the company’s cloud arm was clocking more than $1bn in revenues per quarter. Replying to an analyst question for Google’s most recent financials, Pichai said the company was “definitely seeing strong indicators that the investment in product [was] clearly beginning to work”, as well as adding they were “thoughtfully looking” at hybrid cloud options.

The in-tray for the new boss

McQuire argued that for the incoming Thomas Kurian, a few important items will go straight onto the former Oracle exec's to-do list.

“The main thing he’s going to want to focus on is the sales and go-to-market piece around Google,” said McQuire. “It’s part of a market education process that Google still needs to push. They have done better here, but they need to start to bring in more evangelists and business-oriented salespeople who can articulate Google’s business value proposition around cloud.

“More importantly, [Kurian needs] to help educate on the trustworthiness of Google’s strategy and its cloud overall as well,” McQuire added. “Google has been investing a lot – and I think the market doesn’t always appreciate that and fully understand that – but it’s that element I think that needs to be improved a little bit.

“Diane Greene has laid some pretty good foundations for Kurian to come in. We’ll see where they go from here.”

Picture credit: Google Cloud Platform/Screenshot


Microsoft launches Azure-based blockchain development kit


Connor Jones

19 Nov, 2018

Microsoft has released a blockchain development kit for its Azure platform, designed with smooth integration between blockchain and third-party SaaS applications in mind.

The company said the tools are widely used by businesses taking their first steps towards reinventing the way they do business. The technology and tools have already been used to create projects ranging from democratising supply chain financing in Nigeria to tracking British crops from farm to fork.

This iteration of the SDK will focus on three key areas: creating connections between blockchain and other interfaces involved in the business process, such as mobile clients and IoT devices; integrating with data, software, and media that live "off chain", such as office documents and CAD files; and deploying smart contracts for implementation with business networks.

"This kit extends the capabilities of our blockchain developer templates and Azure Blockchain Workbench, which incorporates Azure services for key management, off-chain identity and data, monitoring, and messaging APIs into a reference architecture that can be used to rapidly build blockchain-based applications," said Marc Mercuri, principal program manager at Microsoft's Blockchain Engineering division, in a blog post.

The kit is designed to streamline processes and lower the barrier to entry for developers wanting to create end-to-end blockchain applications.

"The Azure Blockchain Development Kit is the next step in our journey to make developing end-to-end blockchain applications accessible, fast, and affordable to anyone with an idea," said Mercuri. "It is built atop our investments in blockchain and connects to the compute, data, messaging, and integration services available in both Azure and the broader Microsoft Cloud to provide a robust palette for a developer to realize their vision."

This announcement follows plans for another blockchain project announced earlier this year – to provide decentralised IDs (DIDs) via an Authenticator app. After thinking about how users grant consent to a myriad of apps and services, Microsoft concluded that users should have something that allows them to easily control access to their digital identity, said Ankur Patel, principal programme manager at Microsoft's Identity Division. Microsoft said it has explored a range of different decentralised storage systems, but found blockchains provided the most robust protocols for enabling DIDs.