Microsoft Cloud for Financial Services will launch next month

Zach Marzouk

6 Oct, 2021

Microsoft Cloud for Financial Services is set to launch on 1 November to help financial institutions use the Microsoft Cloud.

The new initiative integrates cloud services across Microsoft Azure, Microsoft 365, Microsoft Dynamics 365, and Microsoft Power Platform, with new capabilities and customisations unique to financial services. It also has been designed to address the control frameworks and regulatory requirements facing the industry.

The platform provides financial institutions with a unified customer profile, easier customer onboarding, help with personalising customer interactions, and easier automation and collaboration across the front and back office. It also helps to identify and prevent fraud, protects the merchant services arms of financial institutions, carries out risk assurance and support, and helps organisations manage compliance requirements.

Microsoft hopes this will help financial institutions and industry partners to unlock value by optimising business processes through integrated collaboration and omnichannel communication capabilities, and enhance the customer experience through comprehensive customer insights and personalised interactions. It also helps accelerate products to market and removes data silos.

“This industry-specific cloud introduces new capabilities that unlock the power of the Microsoft Cloud to help innovate for responsible and sustainable growth,” said Bill Borden, corporate vice president of Worldwide Financial Services at Microsoft.

“Our industry cloud has a foundation of privacy, security, and regulatory compliance across Microsoft and our partner ecosystem, and it is built on an industry data model that enables interoperability and innovation.”

In September, Atos and IBM teamed up to create a new centre of excellence to help banks and insurance companies improve security and regulatory compliance when moving to the cloud. The platform aims to provide technical and financial services advice and expertise with Atos professionals trained on the IBM Cloud providing local language assistance. Atos will also offer automation services like Robotic Process Automation, AI-driven intelligent workflows, and business processes reengineering.

Windows 11 rollout begins as industry predicts slow business uptake

Sabina Weston

5 Oct, 2021

Microsoft has officially launched Windows 11, with the operating system’s phased rollout kicking off on 5 October. 

The release has been long anticipated by consumers and tech industry professionals alike, with the update bringing a number of new features such as a redesigned Start menu, Microsoft Teams integration, and the promise of faster future updates. 

Windows chief product officer Panos Panay, who was promoted to executive vice president in August, announced the launch on Windows Blogs and thanked Microsoft’s partners for their support.

“We are grateful to our entire ecosystem of partners who have played important roles in helping us prepare to get Windows 11 into the hands of our customers around the world. From OEM and app partners, to silicon, to retail, to our Windows Insiders, a launch of this global scale could not be achieved without them,” he said.

“On behalf of the entire team, we are pumped to bring you Windows 11, the Windows that brings you closer to what you love. We look forward to seeing the dreams and ideas you bring to life with Windows 11. This is just the beginning,” he added.

The tech industry was quick to share its thoughts on the launch, with many believing Windows 11 will fail to make a significant impact and that business uptake is likely to be slow.

Gartner senior research director Ranjit Atwal, for example, told IT Pro that he is not expecting the launch to create “significant change” in the wider PC market. Many businesses will likely wait until next year to upgrade to Windows 11, he added, due to uncertainty over the availability and compatibility of different apps.

Scott Riley, director of Cloud Nexus, a security provider and gold Microsoft partner, also told IT Pro that he believes business uptake of the new operating system will be slow. When asked whether users should upgrade today, he said: “The answer is no. Windows 10 is still fully supported by Microsoft until October 2025, so there is no urgency to make the leap.”

Riley added that the operating system “feels like a facelift rather than a complete change to Windows 10”, and noted that Microsoft’s stringent system requirements could be another factor in users’ reluctance to upgrade immediately. 

“There are a lot of changes under the hood, and the minimum requirements have increased to focus on security for home and business devices,” he said. “Windows 11 now requires a processor which supports security features that were only introduced into Intel and AMD chips following the Spectre and Meltdown attacks in 2018.”

“As such this means that an awful lot of computers produced in 2018 and earlier will not be supported on Windows 11.”

However, Mahadeva Bisappa, principal architect at SPR, a Microsoft partner and digital transformation consultancy, told IT Pro that the operating system has clearly been designed for the distributed workforce.

“Windows 11 comes out at a time when distributed remote work has become a norm,” he said, adding that its features are tailored to meeting “those remote working needs”.

“This includes all the new user interface improvements, Microsoft Teams for integrated communication and collaboration via text, audio and video modes across devices, and being able to use Windows 11 from any device or operating system,” he said. 

Bisappa also highlighted Windows 11’s security features, saying that Microsoft has been “doing a tremendous job of updating the Windows operating system regularly to address security issues and help users be more productive and secure”.

If you’re ready to make the move, a guide on how to install Windows 11 is available here.

Facebook blames faulty configuration change for hours-long outage

Bobby Hellard

5 Oct, 2021

A faulty configuration change has been blamed for taking Facebook, WhatsApp and Instagram offline for more than six hours on Monday night. 

The social network’s engineering team said that the changes affected the routers that coordinate the platform’s network traffic between its data centres. This, they said, caused a “cascading effect” on the way its data centres communicate, bringing all of the company’s services to a halt. 

“Our services are now back online and we’re actively working to fully return them to regular operations,” the company said in a blog post. “We want to make clear at this time we believe the root cause of this outage was a faulty configuration change. We also have no evidence that user data was compromised as a result of this downtime.”

In order to remedy the issue, Facebook sent engineers to one of its main data centres in California, according to The New York Times, suggesting it couldn’t be fixed remotely. It was also reported that the outage prevented staff from accessing company buildings and conference rooms with their badges.

The incident caught the attention of internet giant Cloudflare, which initially assumed something was wrong with its own DNS servers. However, after an investigation, engineers realised something more serious was happening, and reported in a blog that “social media quickly burst into flames.”

“Facebook and its affiliated services WhatsApp and Instagram were, in fact, all down,” Cloudflare said. “Their DNS names stopped resolving, and their infrastructure IPs were unreachable. It was as if someone had ‘pulled the cables’ from their data centres all at once and disconnected them from the Internet.”

The issues were down to BGP – the Border Gateway Protocol – the mechanism that exchanges routing information between the autonomous systems that make up the internet. The large routers that make the internet work maintain huge, constantly updated lists of the possible routes that can be used to deliver traffic, according to Cloudflare. 

“The Internet is literally a network of networks, and it’s bound together by BGP,” the firm said in its blog. “BGP allows one network (say Facebook) to advertise its presence to other networks that form the Internet. As we write Facebook is not advertising its presence, ISPs and other networks can’t find Facebook’s network and so it is unavailable.”
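The symptom Cloudflare describes – DNS names that simply stop resolving – is easy to observe from the outside. As a rough illustration (not Cloudflare’s actual tooling), a short Python sketch can check whether a hostname still resolves:

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the hostname currently resolves to at least one address."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        return False

# During the outage, checks like this failed for all of Facebook's domains:
for host in ("facebook.com", "whatsapp.com", "instagram.com"):
    status = "resolves" if resolves(host) else "does NOT resolve"
    print(f"{host}: {status}")
```

Note that this only observes the downstream effect: with Facebook’s BGP routes withdrawn, its authoritative DNS servers became unreachable, so lookups failed across the internet.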

IT Pro 20/20: Using technology to create a better future

Dale Walker

5 Oct, 2021

Welcome to issue 21 of IT Pro 20/20, a digital magazine from our sister title IT Pro that distils the most important themes of the previous month into a simple, easy-to-read package.

This month we look at the newest innovations and projects helping to shape how we interact with the world around us.

Whether that be new approaches to tackling climate change, quirky ideas on what the office should look like post-pandemic, or ambitious plans to build a world-leading smart city, each story celebrates technology that’s helping to turn the age-old into the cutting-edge.

The next IT Pro 20/20 will be available on 29 October – previous issues can be found here. If you would like to receive each issue in your inbox as they release, you can subscribe to our mailing list here.

Supreme Court denies Oracle appeal over JEDI contract

Danny Bradbury

5 Oct, 2021

The US Supreme Court has denied Oracle’s petition against the Pentagon’s vendor selection for the Joint Enterprise Defense Infrastructure (JEDI) contract. 

The petition, filed in January 2021, followed the failure of Oracle’s legal appeal in federal court. After Microsoft won the JEDI contract, Oracle argued the awarding of the contract to a single source was unlawful according to Congressional restrictions on single-source awards. 

The company also accused federal circuit courts of taking a hands-off approach when evaluating the complaint and said several Pentagon officials had conflicts of interest concerning Amazon, which also bid on the project. 

“Federal contracting is rife with potential corruption, and nowhere is that truer than in defense procurements,” its petition concluded. “Each year, billions of dollars of governmental contracts are tainted by the misconduct of agency personnel.” 

The rejection was a foregone conclusion given that the Pentagon had scrapped the $10bn project following another protracted legal fight. Amazon challenged the Microsoft win twice, alleging political interference by then-president Donald Trump, who had a long-standing grudge against Amazon’s CEO, Jeff Bezos. The contract was effectively frozen after AWS secured a court injunction halting work on it. 

The Department of Defense decided to divide the work on future cloud computing systems between multiple bidders. Changing technical needs played a large part in the decision to scrap the project, said Pentagon officials in July, citing new initiatives like the Joint All-Domain Command and Control (JADC2), which will be a single network connecting sensors from all the military services. 

JEDI’s successor is the Joint Warfighter Cloud Capability (JWCC), which will involve multiple cloud service providers. The Pentagon will consider both AWS and Microsoft. It said these were the only two providers that could meet its requirements. 

The federal circuit court had said that the original decision to award JEDI to a single vendor had not affected Oracle, which would not have been considered under a multi-vendor award. 

Google Cloud confirms Intel Ice Lake processor support for N2 VMs

Bobby Hellard

30 Sep, 2021

Google Cloud has announced that its Compute Engine N2 virtual machines (VMs) will be available with Intel’s 10 nanometer Ice Lake Xeon processors.

There’s no specific date for the release, but Google now joins a list of companies that includes Amazon, Microsoft, and Oracle, which are set to use the latest generation of Xeon Scalable chips for their cloud services.

Google claims that using the 10nm chips in the N2 VMs will offer a 30% boost in price-performance compared to the previous generation of Xeons. The current version of N2 uses Intel’s 14nm second-generation processors, known as Cascade Lake.

The new N2 VMs will be offered at the same price as the existing Cascade Lake N2, and their usage can be discounted using existing N2 committed use discounts, according to Google.

The news comes just two weeks before Google Cloud Next, the cloud giant’s annual conference, where more details of the announcement will likely be shared. This is somewhat behind the rest of the industry, however, with Amazon, Microsoft, and Oracle all confirming support shortly after Intel officially revealed Ice Lake in April.

Google’s N2 will be available in preview early in the fourth quarter of 2021 in the US, Europe, and Southeast Asia, while availability in additional Google Cloud regions, in line with current N2 machine family regions, is planned for “the coming months”, the tech giant said.

The Ice Lake-N2 VMs have already been used by select customers, such as e-commerce firm Shopify, which used them to increase performance and reduce response times for its applications.

“With Google Cloud’s new N2-Ice Lake VMs, we were able to achieve improvements on all these areas,” said Justin Reid, senior staff engineer at Shopify. “We were able to achieve over 10% performance improvements for one of our compute-intensive workloads by running on the new N2 Ice Lake VMs and also achieve lower request latency for our users as compared to previous generation N2 Cascade Lake VMs.”

The rise of cloud misconfiguration threats and how to avoid them

Keri Allan

5 Oct, 2021

With cloud adoption accelerating, the growing scale of cloud environments is outpacing the capacity for businesses to keep them secure. This is why many organisations feel vulnerable to data breaches that might arise as a result of cloud configuration errors. 

More than 80% of the 300 cloud engineering and security professionals questioned by Sonatype and Fugue in their latest cloud security report said they felt their organisations were at risk. Factors include teams struggling with an expanding IT ‘surface area’, an increasingly complex threat landscape, and recruitment challenges coupled with a widening skills gap. 

A major security threat 

Misconfiguration is a major problem because cloud environments can be enormously complicated, and mistakes can be very hard to detect and manually remediate. According to Gartner, the vast majority of publicly disclosed cloud-related security breaches are directly caused by preventable misconfiguration mistakes made by users, highlighting how great a security threat they truly are.

“Often companies use default configurations, which are insecure for many use cases, and unfortunately there’s still a significant skills gap,” says Kevin Curran, professor of cyber security at Ulster University. “The cloud industry is relatively new, so there’s a noticeable deficit in knowledgeable cloud architects and engineers.”

He claims there are numerous scanning services constantly seeking out vulnerabilities to exploit and, because flaws can be abused within minutes of their creation, this has led to an urgent race between attackers and defenders.

“An attacker can typically detect a cloud misconfiguration vulnerability within ten minutes of deployment, but cloud teams are slower in detecting their own misconfigurations,” he adds. “In fact, only 10% are matching the speed of hackers.”

Misconfiguration can happen for many reasons, such as organisations prioritising legacy apps over cloud security, Ben Matthews, a partner at consultancy firm Altman Solon, points out. “Even with the significant growth in cloud adoption in recent years,” he adds, “the current and likely enduring prevalence of mixed and hybrid environments mean that this problem isn’t going away anytime soon.”

There are several other common causes of cloud misconfiguration, too. Those questioned as part of Sonatype and Fugue’s study cited too many APIs and interfaces to govern, a lack of controls, oversight and policy, and even simple negligence, as among the main reasons. 

A fifth (20%) noted their businesses haven’t been adequately monitoring their cloud environments for misconfiguration, while 21% reported not checking infrastructure as code (IaC) prior to deployment. IaC is a process for managing and provisioning IT infrastructure through code instead of manual processes. 
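To make the IaC idea concrete, here’s a minimal, hypothetical sketch (in Python rather than a dedicated tool such as Terraform): the desired infrastructure is declared as data, and a program computes the actions needed to reconcile the current state with that declaration:

```python
# Hypothetical IaC illustration: infrastructure described declaratively as
# data, with a function that plans the changes needed to reach that state.
desired_state = {
    "vm-web-1": {"size": "small", "region": "europe-west"},
    "vm-db-1": {"size": "large", "region": "europe-west"},
}

def apply(current: dict, desired: dict) -> list[str]:
    """Return the actions needed to move `current` to `desired` (a plan)."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(f"create {name} ({spec['size']}, {spec['region']})")
        elif current[name] != spec:
            actions.append(f"update {name} -> {spec}")
    for name in current:
        if name not in desired:
            actions.append(f"delete {name}")
    return actions
```

Because the same declaration can be versioned, reviewed and re-applied, misconfigurations can in principle be caught by inspecting the code before anything is deployed – which is exactly the pre-deployment checking that a fifth of respondents are skipping.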

It’s a people problem

Experts agree that cloud misconfiguration is, first and foremost, a people problem, with traditional security challenges such as alert fatigue, the complexity of managing applications and workloads, and human error playing a significant role. 

“Laziness, a lack of knowledge or oversight, simple mistakes, cutting corners, rushing a project – all these things play into misconfigurations,” points out Andras Cser, vice president and principal analyst at Forrester. 

Organisations also find the demand for cloud security expertise is outstripping supply, making it harder than ever to retain staff with the knowledge required to guarantee cloud security. Often, there’s also confusion within businesses as to who’s responsible for checking for vulnerabilities, and, if any are found, ensuring they’re removed.

“Secure configuration of cloud resources is the responsibility of cloud users and not the cloud service providers,” clarifies Gartner’s senior director analyst, Tom Croll. “Often, misconfigurations arise due to confusion within organisations about who’s responsible for detecting, preventing and remediating insecure cloud assets. Application teams create workloads, often outside the visibility of security departments and security teams often lack the resources, cooperation or tools to ensure workloads are protected from misconfiguration mistakes.”

Curran continues by highlighting that different teams are responsible at different stages of any cloud project. For instance, cloud developers using IaC to develop and deploy cloud infrastructure should be aware of the major security parameters included in the software development cycle. The security team, on the other hand, is generally responsible for monitoring and the compliance team for audits. To make things more complicated, Sonatype and Fugue’s report suggests cloud security requires more cross-team collaboration than in the data centre. More than a third (38%) of those surveyed, however, cited friction existing between teams over cloud security roles.

Avoiding cloud configuration errors

Wherever possible, organisations will want to prevent cloud misconfiguration problems from arising in the first place. This can be achieved by using tools such as IaC scanning during the development phase, and the adoption of policy as code (PaC), which, according to Curran, has revolutionised how IT policy is implemented. 

Rather than following written rules and checklists, in PaC, policies are expressed “as code” and can be used to automatically assess the compliance posture of IaC and the cloud environments organisations are actively running. 

“Using PaC for cloud security is significantly more efficient and cost-effective as it’s repeatable, shareable, scalable and consistent,” he explains, adding: “It also greatly reduces security risks due to human error.” Of course, mistakes can be missed and, therefore, continuous 24/7 monitoring should be core to a business’ cloud security operation in order to maximise the chances of finding potential vulnerabilities.
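The mechanics of PaC can be sketched in a few lines of Python (a hypothetical illustration, not a real PaC tool such as Open Policy Agent): each policy is an executable rule, so the same checks can run identically against IaC templates and live environments:

```python
# Hypothetical policy-as-code sketch: each policy is a plain function that
# inspects a resource definition and reports a violation, or None if compliant.
def no_public_buckets(resource):
    if resource.get("type") == "bucket" and resource.get("public", False):
        return "bucket must not be publicly readable"

def encryption_required(resource):
    if resource.get("type") == "bucket" and not resource.get("encrypted", False):
        return "bucket must have encryption enabled"

POLICIES = [no_public_buckets, encryption_required]

def evaluate(resources):
    """Run every policy against every resource; return all violations found."""
    return [
        (r["name"], violation)
        for r in resources
        for policy in POLICIES
        if (violation := policy(r))
    ]
```

Because the rules are code, they are repeatable and shareable in exactly the sense Curran describes: the same `evaluate` step can run in a CI pipeline before deployment and again, continuously, against the running environment.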

Experts advise businesses to use automated security services, such as cloud security posture management (CSPM), which are designed to identify misconfiguration issues and compliance risks in the cloud. This particular tool automates the process of finding and fixing threats across all kinds of cloud environments. 

“These allow cloud platform admins to create a good baseline of cloud configuration artefacts, then detect any drifts from it,” Forrester’s Cser continues. “It also takes advantage of best-practice templates that will flag issues around S3 buckets or overprivileged instances, for example. Automated CSPM visibility, detection and remediation should be continuous.”

SolarWinds hackers are targeting Microsoft AD servers

Sabina Weston

29 Sep, 2021

Nobelium, the hacking group responsible for last year’s cyber attack on SolarWinds, is now stealing data from Active Directory Federation Services (AD FS) servers.

That’s according to Microsoft’s Threat Intelligence Center (MSTIC), which has issued a warning about Nobelium’s latest actions on its blog.

The Russian state-backed hacking group was found to be using a post-exploitation backdoor dubbed FoggyWeb in order to remotely exfiltrate sensitive data as well as maintain persistence on victims’ networks, warned MSTIC researcher Ramin Nafisi.

In order to steal the data, Nobelium hackers first gain admin privileges to AD FS servers by employing “multiple tactics to pursue credential theft”. Once they manage to compromise the server, they then deploy FoggyWeb “to remotely exfiltrate the configuration database of compromised AD FS servers, decrypted token-signing certificates and token-decryption certificates”, wrote Nafisi.

The “passive and highly targeted” FoggyWeb backdoor “has been observed in the wild as early as April 2021”, he added.

Microsoft stated that it had notified all customers believed to be targeted by Nobelium. However, it didn’t rule out that some organisations might still be at risk. It recommends that potential victims audit their on-premises and cloud infrastructure, “remove user and app access”, strengthen their passwords, as well as “use a hardware security module (HSM) in securing AD FS servers to prevent the exfiltration of secrets by FoggyWeb”.

The tech giant also advised organisations to “harden and secure AD FS deployments” by taking additional measures, including limiting on-network access via host firewall and requiring all cloud admins to use multi-factor authentication.

The warning comes three months after Nobelium was found to have engaged in “password spray and brute-force attacks” on Microsoft’s customers, with around 10% of the targets being based in the UK.

The hackers implanted “information-stealing malware” on a device belonging to a Microsoft customer support agent, through which they obtained “basic account information for a small number of [Microsoft’s] customers”, according to the tech giant.

Prior to this, Nobelium launched a wave of attacks on more than 150 government agencies, think tanks, consultants, and NGOs from 24 countries, targeting an estimated 3,000 email accounts.

A third of businesses set to spend $1 million on AI by 2023

Bobby Hellard

29 Sep, 2021

A third of organisations with plans to adopt artificial intelligence (AI) have said they will invest $1 million or more into the technology over the next two years. 

That’s according to Gartner’s annual Emerging Technology Product Leaders survey, where the majority of respondents (87%) predict industry-wide funding for AI increasing at a “moderate to fast pace” throughout 2022.

The survey was conducted between April and June of this year with 268 respondents from China, Hong Kong, Israel, Japan, Singapore, the UK and the US. Respondents were required to be involved in their organisation’s portfolio decisions when it comes to emerging technology and to work at an organisation in the high-tech industry with enterprise-wide revenue for fiscal year 2020 of $10 million or more.

AI seems to be the priority for most, with an average planned investment of $679,000 in computer vision over the next two years. Compared with other emerging technology areas, such as cloud and IoT, AI technologies had the second-highest reported ‘mean funding’ allocation. 

“Rapidly evolving, diverse AI technologies will impact every industry,” said Errol Rasit, managing vice president at Gartner.

“Technology organisations are increasing investments in AI as they recognise its potential to not only assess critical data and improve business efficiency, but also to create new products and services, expand their customer base and generate new revenue. These are serious investments that will help to dispel AI hype.”

Just over half of the respondents reported significant customer adoption of their AI-enabled products and services. A further 41% of respondents cited AI emerging technologies as still being in development or at early adoption stages, suggesting there is a wave of potential adoption as new or augmented AI products and services enter general availability.

The report is in contrast to another Gartner report from earlier in September, which highlighted the lack of talent the UK is currently facing and the barriers it could create for businesses adopting emerging technology. 

The perceived lack of talent was cited as the leading factor inhibiting adoption across six technology domains: compute infrastructure and platform services; network; security; digital workplace; IT automation; and storage and database.

Cloudflare takes aim at “exorbitant” AWS fees with R2 storage service

Bobby Hellard

29 Sep, 2021

Internet giant Cloudflare has made a bold pitch for enterprise customers with its new R2 object storage service. 

Cloudflare claims the selling point of R2 is that it comes with no “outrageous” charges for migrating data to external services, pitting it directly against Amazon’s dominant S3 service. 

R2 Storage is designed for the edge, according to Cloudflare, and offers customers the ability to store large amounts of data and extract it for no additional cost. 

In order to build websites and applications, developers need to store photos, videos, and graphics in easily accessible places, but that can become an expensive problem over time. AWS S3 is well known for its “egress” charges that can result in hefty bills over time, and Microsoft Azure and Google Cloud also implement similar fees for data migration.
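To see why egress adds up, a back-of-the-envelope calculation helps (the per-GB rate below is a hypothetical placeholder, not any provider’s published price):

```python
# Hypothetical egress cost sketch; the rate is illustrative, not a real price.
def egress_cost(gb_transferred: float, rate_per_gb: float) -> float:
    """Cost of moving data out of a cloud provider at a flat per-GB rate."""
    return gb_transferred * rate_per_gb

# An app serving 50 TB (50,000 GB) of media a month at a notional $0.09/GB:
monthly = egress_cost(50_000, 0.09)   # roughly $4,500 a month
yearly = 12 * monthly                 # roughly $54,000 a year
# With a zero-egress service, the same traffic incurs no transfer charge.
print(f"monthly ≈ ${monthly:,.0f}, yearly ≈ ${yearly:,.0f}")
```

The key point is that the charge scales with traffic rather than with stored data, so a popular application pays the fee again every time the same object is served or migrated out.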

However, both Azure and Google Cloud offer substantial discounts for their mutual Cloudflare customers, according to a Cloudflare blog from July.

Increasingly egregious bandwidth pricing has made cloud storage an expensive headache for some developers and eventually leads to vendor lock-in, according to Cloudflare. As such, the company says its mission is to help build a better internet by making it faster, safer, and more affordable for everyone.

“Since AWS launched S3, cloud storage has attracted, and then locked in, developers with exorbitant egress fees,” said Matthew Prince, co-founder and CEO of Cloudflare. “We want developers to keep developing, not worrying about their storage bill. 

“Our aim is to make R2 Storage the least expensive, most reliable option for storing data, with no egress charges. I’m constantly amazed by what developers are building on our platform, and look forward to continued innovation as we expand the tools they have access to.”

As well as entering the enterprise storage business, Cloudflare this week also announced its first foray into the email security industry.