All entries by Keumars Afifi-Sabet

DocuSign acquires ‘smart agreements’ startup Clause


Keumars Afifi-Sabet

28 May, 2021

Electronic signature provider DocuSign has acquired one of its key partners, Clause, alongside its intellectual property assets and staff, in order to integrate its technology into a broader cloud-based smart contracts platform.

Clause, which was founded five years ago, develops systems that support digital contractual agreements, from user verification to industry-specific services such as real-time, data-driven insurance contracts.

DocuSign, which has previously collaborated with Clause on developing digital contract technologies, will integrate the startup’s broader technology portfolio into its own Agreement Cloud platform. This system aims to elevate digital contracts from photos of paper documents into ‘living documents’ with interactivity and digital functionality.

“It is a compelling and exciting frontier of technology, and it’s an important enabler of making our Agreement Cloud smart,” said DocuSign CTO Kamal Hathi. “It’s against that backdrop that DocuSign has entered into a definitive agreement to acquire the IP rights and hire the team from one of the industry’s smart agreement pioneers, Clause.

“Its products already integrate tightly with DocuSign eSignature, and we’re exploring deeper connections to contract lifecycle management (CLM) too.”

The company is also keen on integrating Clause’s services for various industries, including financial services, health care, and insurance companies, into its Agreement Cloud.

Among the features included in the latest release of Agreement Cloud are eSignature compatibility with Microsoft Teams and an eWitness feature that allows contract signers to include up to two witnesses per signer in the signing process.

Clause has been working closely with DocuSign to develop “groundbreaking capabilities” in contracting technology for the past two years, its founder, Peter Hunn, said. This led to the conclusion that the scale and distribution of DocuSign would complement the innovations developed by Clause, with the two companies being a perfect fit for one another.

“The opportunity in front of us is to deliver Smart Agreements to the world, leveraging best-in-class eSignature and CLM products, as part of one of the largest tech companies,” Hunn said.

“The Clause team will continue our work within DocuSign to deliver on our shared vision for smart agreements, a development that will fundamentally change the future of contracts, just like word processing and eSignature.”

DocuSign has also been a keen investor in the startup, having led Clause’s Series A funding round of $5.5 million in 2019. The financial details of this acquisition haven’t been disclosed publicly, however.

Microsoft and the Linux Foundation launch green software initiative


Keumars Afifi-Sabet

26 May, 2021

Several major players in the tech industry have banded together to form a non-profit organisation directed at creating a trusted ecosystem of engineers, standards, tools and best practices for building green software.

The Green Software Foundation sees Microsoft collaborating with the Linux Foundation, Joint Development Foundation Projects, Accenture, ThoughtWorks and GitHub to devise ways for making software development more sustainable.

The foundation aims to help the wider software industry contribute to the tech sector’s ambitions to reduce greenhouse gas emissions by 45% by 2030, in line with the Paris Climate Agreement.

“As we think about the future of the software industry, we believe we have a responsibility to help build a better future – a more sustainable future – both internally at our organisations and in partnership with industry leaders around the globe,” said Microsoft’s corporate vice president for developer relations, Jeff Sandquist.

“With data centres around the world accounting for 1% of global electricity demand, and projections to consume 3-8% in the next decade, it’s imperative we address this as an industry.”

The Green Software Foundation will focus on three key pillars of standards, innovation and community. More specifically, the organisation will agree on standards and best practices for building green software, nurture the creation of trusted open source and open data projects, and enable the growth of a diverse international community of developers.

The members will also endeavour to drive awareness about ways to build greener applications, and encourage the adoption of green software across the industry through ambassador programmes. 

The foundation will also encourage voluntary adoption and help guide government policy towards those standards for a consistent approach for measuring and reporting green software emissions.
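As a back-of-the-envelope illustration of what measuring and reporting green software emissions can involve, a workload’s operational carbon is commonly estimated as the energy it consumed multiplied by the carbon intensity of the grid that supplied it. The function below is a hypothetical sketch of that model, not a standard published by the foundation:

```python
def software_emissions_gco2(energy_kwh: float, grid_intensity_gco2_per_kwh: float) -> float:
    """Estimate operational carbon: energy used by a workload times the
    average carbon intensity of the electricity grid that powered it."""
    return energy_kwh * grid_intensity_gco2_per_kwh

# A workload that consumed 120 kWh on a grid averaging 400 gCO2/kWh:
print(software_emissions_gco2(120, 400))  # -> 48000, i.e. 48 kg of CO2
```

Real-world reporting also has to account for embodied carbon in hardware and for marginal rather than average grid intensity, which is where agreed standards come in.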

Alongside the founding members, Goldman Sachs, Leaders for Climate Action, the Green Web Foundation and WattTime will join the organisation as general members. The Linux Foundation will manage these collaborative efforts, and other organisations are invited to apply to join as a general member.

VMware urges vCenter customers to immediately patch their systems


Keumars Afifi-Sabet

26 May, 2021

VMware is urging its customers to update vCenter Server versions 6.5, 6.7 and 7.0 immediately after fixing two vulnerabilities that could allow attackers to launch remote code execution attacks. 

The most severe bug, tracked as CVE-2021-21985, lies in the vSphere Client. The flaw stems from a lack of input validation in the Virtual SAN (vSAN) Health Check plugin, which is enabled by default.

The vSAN system is a software-defined storage platform that pools local server storage, eliminating the need for additional storage appliances. The health check plugin enhances customer support and user experience by allowing customers to manage their virtual deployments, including dozens of automated health checks.

The vulnerability is rated 9.8 on the CVSS threat severity scale and could allow hackers with network access to port 443 to execute commands with unrestricted privileges on the operating system that hosts vCenter Server. The high base score suggests the effects are particularly devastating, and the vulnerability is relatively easy to exploit.
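Because both flaws hinge on network access to port 443 on vCenter Server, a quick first step in assessing exposure is checking whether that port is reachable from untrusted network segments. A minimal sketch (the hostname is a placeholder; this checks reachability only, not whether the patch is installed):

```python
import socket

def port_reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_reachable("vcenter.example.internal")
# True from an untrusted segment means the appliance is exposed from there.
```

VMware’s actual guidance is to patch, and failing that to apply its published workaround, rather than rely on network position alone.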

The second vulnerability, tracked as CVE-2021-21986, is less severe, but nonetheless would allow attackers with network access to port 443 on vCenter Server to perform actions allowed by the impacted plugins without authentication. 

This vulnerability concerns a vSphere authentication mechanism for the Virtual SAN Health Check, Site Recovery, vSphere Lifecycle Manager and VMware Cloud Director Availability plugins in the vSphere Client. 

The bugs are extremely serious, VMware has warned, and customers are being advised to patch immediately. 

“With the threat of ransomware looming nowadays the safest stance is to assume that an attacker may already have control of a desktop and a user account through the use of techniques like phishing or spearphishing, and act accordingly,” the firm says in its FAQs.

“This means the attacker may already be able to reach vCenter Server from inside a corporate firewall, and time is of the essence.”

The issue affects all vCenter Server customers, not just those who use vSAN, because this plugin is shipped with all systems and is enabled by default. The company doesn’t advise disabling the vSAN plugin, because manageability and monitoring will not be possible, and customers using vSAN should only disable the plugin for short periods of time. 

Warning of the dangers, VMware said in its FAQs that customers without perimeter security controls on their virtualisation infrastructure may be in jeopardy. Ransomware gangs, particularly, have demonstrated they can compromise corporate networks and subsequently wait for new vulnerabilities in order to attack from inside a network.

The fear is very real given that ransomware operators had previously exploited critical ESXi and vSphere Client flaws, with Carbon Spider and Sprite Spider gangs exploiting the flaws to encrypt virtual machines (VMs).

Microsoft to retire Internet Explorer 11 in 2022


Keumars Afifi-Sabet

20 May, 2021

Microsoft’s once widely-used Internet Explorer browser will reach end-of-life status from June 2022, with the firm no longer supporting the desktop application.

The legacy browser will also be absorbed into Microsoft Edge through an in-browser Internet Explorer mode, so organisations still reliant on the out-of-date service can continue to run critical applications in an emulated environment.

The number of users still relying on Internet Explorer is minimal compared with historic standards, with the soon-to-be legacy browser holding a 0.71% market share as of April 2021. Chrome, by contrast, holds a 64.47% market share, according to Stat Counter.

Microsoft launched the next generation of its web browser, Microsoft Edge, in 2015 as a replacement for Internet Explorer 11.

The company then launched the second iteration of its flagship browser in 2020, powered by the open source Chromium engine, while announcing plans to retire the ‘legacy’ Edge version. Chromium-based Edge is now the default browser for Windows 10, with the 2015 version removed as of last month.

“With Microsoft Edge, we provide a path to the web’s future while still respecting the web’s past,” said Microsoft developer Sean Lyndersay. “Change was necessary, but we didn’t want to leave reliable, still-functioning websites and applications behind.

“We’re here to help you transition to the more comprehensive browsing experience of Microsoft Edge and tell you a bit more about why we think it will address your needs, both at home and at work.”

Microsoft is encouraging its users to transition to Edge by promoting the wide range of benefits it offers over the Internet Explorer user experience. Its dual-engine design, for example, supports both legacy and modern sites, while the Internet Explorer mode will allow users to continue using sites and apps that are only compatible with Internet Explorer.

The Edge browser is also more secure than Internet Explorer, offering a host of features including Microsoft Defender SmartScreen to block phishing attacks and malware infection attempts. While Internet Explorer 11 packaged security updates monthly, Edge can issue security patches for flaws within days.

Organisations using Internet Explorer are being encouraged to move to Microsoft Edge immediately, and continue to use their legacy applications through the dedicated Internet Explorer mode, which Microsoft will continue to support until 2029.

Internet Explorer was first released in 1995 as part of the add-on package Plus! for Windows 95. The project was started in 1994 by developer Thomas Reardon, who used source code from Spyglass’ Mosaic web browser.

The web browser underwent several transformations and redesigns through the years, before its final version, Internet Explorer 11, was released in 2013 alongside Windows 8.1. Development on the project was suspended in 2016 when all work was shifted over to Microsoft Edge, which had launched the year before.

Microsoft announced in February that Internet Explorer would no longer be compatible with Microsoft 365 apps from August 2021. This follows Microsoft Teams dropping support for the browser in November last year.

Users have until June 2022 before the Internet Explorer desktop app will no longer be supported, or available to download.

Microsoft rolls out Windows 10X-inspired May 2021 Update


Keumars Afifi-Sabet

19 May, 2021

Businesses are able to update their Windows 10 systems with version 21H1, dubbed the May 2021 Update, which is designed to deliver features that improve security, remote access and the quality of experience.

The update, which will take a staggered and measured approach to rollout, introduces several new security-oriented features, namely multi-camera support for Windows Hello, Windows Defender Application Guard, and a group policy service. 

These are inspired by work Microsoft has done on its Windows 10X operating system, which was initially designed for dual-screen foldable devices but has since evolved to be a more general-purpose system.

“In the current environment, we know that you continue to rely on your PCs more than ever. As a result, we are initially taking a measured seeker-based approach to the rollout of the May 2021 Update,” said Microsoft’s vice president for program management, Windows servicing and delivery, John Cable. 

“We are throttling availability up over the coming weeks to ensure a reliable download experience for all, so the update may not be offered to you right away. Additionally, some devices might have a compatibility issue for which a safeguard hold is in place. In these cases, we will not offer the update until we are confident that you will have a good update experience.”

Among the features included in 21H1 is Windows Hello multi-camera support, which will set the default as the external camera when both external and internal Windows Hello cameras are connected to a device. 

Application Guard, meanwhile, is designed to help prevent old and emerging attacks by isolating untrusted entities in a Hyper-V-powered container. The Windows Management Instrumentation (WMI) Group Policy Service (GPSVC), finally, will receive a performance update to better support remote work scenarios.

All editions of the May 2021 Update will receive 18 months of servicing and support, with commercial organisations recommended to begin targeted deployments to ensure services and infrastructure work as expected.

The update also integrates elements of the once completely separate Windows 10X platform that had been in development. Instead of launching Windows 10X as a standalone product, Cable said, Microsoft is applying lessons from the development process to integrate its core elements into other parts of Windows and alternative services.

The new app container technology at the heart of Microsoft Defender Application Guard, for example, is derived from Windows 10X, as is an enhanced voice typing experience, and a modernised touch keyboard. 

HPE launches key framework for EU’s Gaia-X project


Keumars Afifi-Sabet

18 May, 2021

HPE has announced a set of capabilities to equip organisations with the tools required to monetise data by tapping into the EU’s in-development Gaia-X federated data infrastructure.

Companies, service providers and public organisations can use HPE’s Solution Framework for Gaia-X to gear up to be compatible with the data platform when it launches in the near future. The system supports all functionality required to provide and consume data and services in a decentralised, federated environment. 

By buying into HPE’s framework, organisations can tap into huge distributed data pools, strengthen data sovereignty and create value from data in ways that weren’t possible before Gaia-X.

This framework is based on a reference architecture comprising key components of HPE’s software portfolio, third-party software, and the Cloud28+ network, a marketplace for monetising data and services. Everything will also be bundled in an ‘as a service’ HPE Greenlake model, meaning it’s more accessible to customers and partners. 

“Gaia-X is not about US versus Europe, but about the key question of the next wave of digital transformation and how to create network effects without centralisation in order to unlock the value of distributed data, while at the same time reserving sovereignty of every participant,” said Johannes Koch, HPE’s senior vice president for Germany, Austria and Switzerland, and MD for Germany. 

“Gaia-X is the focal point of this endeavour, and as such is also a catalyst to create the future architecture of the digital world. In essence, it’s about restoring the original freedom of the internet and about creating an open, decentralised cloud.” 

The EU proposed Gaia-X as a next-gen continental-wide system in order to reduce the reliance on, and domination of, large US tech companies with regards to data, the cloud, and digital transformation. 

The platform connects a host of cloud service suppliers through an interoperable data exchange platform that serves as a warehouse for several industries and data sources. It also acts as a data repository for businesses to pick specific services, such as IoT, big data and machine learning.

HPE joined the non-profit organisation managing and contributing to Gaia-X on day one, and has contributed to its architecture, standards and certification since. 

The message HPE was keen to stress is that businesses cannot reap the benefits of Gaia-X unless their infrastructures and data operations are configured in such a way that they’re compatible with the platform. This is where the firm’s HPE Solution Framework for Gaia-X steps in as a means of getting businesses ready to be a part of the Gaia-X project.

The firm says its own strategy is perfectly aligned with the approach Gaia-X is taking, and the problems that it’s trying to solve, with HPE’s software portfolio and business model pivoted to it. 

A key component of the HPE Solution Framework for Gaia-X is a reference architecture that defines the foundation of the components needed to decentralise workloads, and this also includes a central governance structure.

The HPE Ezmeral Software Platform, which provides tools such as access to distributed data and unified control of distributed Kubernetes clusters, serves as the technological foundation of its framework. 

Its Secure Production Identity Framework for Everyone (SPIFFE) and the SPIFFE Runtime Environment (SPIRE) offer open source standards for securely authenticating software services.
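SPIFFE’s core primitive is the SPIFFE ID, a URI of the form spiffe://&lt;trust-domain&gt;/&lt;workload-path&gt; that names a workload independently of its host or IP address. As a loose illustration of the shape (the real specification imposes stricter rules, for instance forbidding ports, userinfo, query strings and fragments):

```python
from urllib.parse import urlparse

def looks_like_spiffe_id(uri: str) -> bool:
    """Loose shape check for a SPIFFE ID: spiffe://<trust-domain>/<path>.
    Illustrative only; the SPIFFE specification is considerably stricter."""
    parsed = urlparse(uri)
    return (
        parsed.scheme == "spiffe"
        and bool(parsed.netloc)      # the trust domain must be present
        and not parsed.query
        and not parsed.fragment
    )

print(looks_like_spiffe_id("spiffe://example.org/billing/payments"))  # -> True
print(looks_like_spiffe_id("https://example.org/billing"))            # -> False
```

In practice a SPIRE server issues short-lived, verifiable identity documents bound to these IDs, which is what lets distributed services authenticate each other without shared secrets.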

Finally, Cloud28+ allows customers to monetise their data and services through the marketplace that this platform offers, and the partners associated with the community. 

Google Cloud and SpaceX partner on Starlink internet service


Keumars Afifi-Sabet

14 May, 2021

SpaceX and Google Cloud Platform (GCP) have struck a partnership that’ll see the two companies deliver data management, cloud services, and applications to enterprise customers across the world.

The agreement will combine SpaceX’s flagship Starlink low-orbit satellite system with Google Cloud’s data centres to provide high-speed broadband to customers on the network edge.

Starlink, a low latency broadband system comprising roughly 1,500 satellites, will base its ground stations within Google’s data centres, with GCP’s high capacity private network supporting the delivery of the global satellite internet service.

The aim is to connect businesses and consumers to the internet and to cloud computing services regardless of where they’re based, and with the highest possible levels of connectivity.

“Applications and services running in the cloud can be transformative for organisations, whether they’re operating in a highly networked or remote environment,” said senior vice president for infrastructure at Google Cloud, Urs Hölzle.

“We are delighted to partner with SpaceX to ensure that organizations with distributed footprints have seamless, secure, and fast access to the critical applications and services they need to keep their teams up and running.”

Combining Starlink’s broadband system with Google’s infrastructure will offer organisations across the world networking availability and speeds that they should expect in the modern age, SpaceX president and COO Gwynne Shotwell added.

SpaceX began developing Starlink in 2015, and the system has undergone deployment tests over the last few years. The objective has been to deploy roughly 1,500 satellites by 2021 in order to launch the networking service for enterprise customers, which SpaceX has almost achieved.

The US Federal Communications Commission (FCC) also submitted filings in 2019 for approval of up to 30,000 additional satellites to complement the 12,000 Starlink satellites that the FCC had already approved, according to Space News.

SpaceX previously struck a partnership with Microsoft in October 2020 to allow the computing giant to launch a fleet of satellites to host its Azure Space platform. This services the space industry’s mission needs while also claiming to offer high networking speeds with low latency for public and private organisations.

The networking service, powered by GCP, will be available from the second half of this year.

Can IBM buy its way to cloud success?


Keumars Afifi-Sabet

14 May, 2021

IBM has been a fixture of the computing industry almost since its inception, defining various eras with products such as the Model 5150 or Watson, the AI-powered suite of business services. One of the secrets to its longevity has been a powerful ability to reinvent itself when market shifts threaten the viability of its business model. As a result, the company is just as relevant today as it was when founded in 1911. 

While we may not readily associate IBM with cloud computing, this is where the company sees its future, alongside the twin pillars of AI and quantum computing. As such, the firm has launched itself into a radical shift in pursuit of a revenue model reliant on expanding its hybrid cloud business. This is a strategy that’s seen IBM plot to cleave off its managed services business as well as make ten acquisitions within the space of a year, comprising one of the computing giant’s most comprehensive reinventions yet. It’s a process, however, that its executives feel is essential to IBM’s long-term survival.

The ‘$1 trillion hybrid cloud opportunity’

IBM’s leadership has often referenced the “$1 trillion hybrid cloud opportunity” as a key driver for the strategy, and for good reason. The market has shown a long-term move towards cloud services, Gartner VP analyst Craig Lowery tells IT Pro, with many businesses changing their strategies to help their clients achieve their cloud objectives. “Customers have been making their requirements known for many years,” Lowery says. IBM has, like many other companies, eventually had to respond to that, he adds, saying that its leadership “has taken the appropriate actions, as they see it, to align with customer needs”.

This explosive cloud growth coincides with the continued success of businesses such as AWS, Google Cloud and Alibaba, with a wave of digital transformation projects triggering an acceleration in cloud adoption. “Overall, these trends have maintained growth in cloud spending,” says Blake Murray, research analyst at Canalys. “However, increased spending is now happening across almost all industries, with the need for digitalisation, app modernisation, content streaming, collaboration software, online learning and gaming. This is likely to continue, as an increasingly digital world becomes a ‘new normal’.”

State of decline

Just as the fortunes of major cloud giants have surged, the financial power of IBM as a wider entity has dwindled over the previous decade.

Delving into specific business units, we can see that performance declined on all fronts between 2011 and 2016, but especially in the Systems and Technology segment. Like-for-like comparisons beyond this point are difficult, as IBM underwent two internal restructures, once in 2015 and again in 2018, but these moves failed to stem the long-term trend, and revenues continued to decline. At the same time, IBM’s cloud operations – spread across all divisions – began to spark into life, mirroring wider industry trends.

Today, cloud computing is one of IBM’s most important revenue streams and will continue to grow in significance. The rising value of the firm’s cloud business is clear, and a key reason why its leadership sees cloud computing as a future moneymaker.

Sparking an internal revolution

In October, IBM announced it would carve away its managed services business into a separate entity by the end of 2021. This is a key part of the overall strategy, the company’s vice-president for Hybrid Cloud EMEA, Agnieszka Bruyère, tells IT Pro, with its AI, quantum computing and cloud operations being recast as the three main pillars of IBM’s operations. 

The origins of this strategy stretch back two or three years, she adds, when the company first pinpointed the key role cloud computing would play in its clients’ digital transformation journeys. At that stage, however, 80% of its customers’ workloads were still residing in the data centre. This is partially why IBM is pursuing hybrid cloud. The firm, Bruyère explains, doesn’t consider the public cloud alone to be a viable long-term solution for helping its customers modernise. “It cannot be only a purely public cloud transformation,” she says. “It does not meet the companies’ reality in terms of security, compliance, business model, whatever, and really the best way to respond to companies’ challenges is a hybrid cloud strategy.”

The foundational step on this path was IBM’s record $34 billion acquisition of Red Hat, with the open-source giant brought in to bolster the company’s technology portfolio. Playing a key role in driving this deal forward was Arvind Krishna, who at the time was VP for hybrid cloud but was named CEO in April 2020. His promotion coincided with the recruitment of Bank of America veteran Howard Boville as his replacement. Since then, Bruyère tells IT Pro, IBM has adopted much-needed “clarity” on its hybrid cloud strategy, with the business taking more aggressive steps since.

The pair have played a key role in making a set of strategic acquisitions while paving the way for the divestiture of its entire managed services business. This follows a long history of divestments, Krishna recently commented, with IBM divesting networking in the 90s, PCs back in the 2000s and semiconductors about five years ago.

“We want to make sure we are focusing our investment in this space, and we really want to do it only in this space – hybrid cloud and AI,” Bruyère says. “Another new aspect is about the industrial offerings with the new management, and this is really important because it’s not only about building technical capabilities, but also bringing the regulation layer; the specifics for every industry.” 

The key difference since the leadership reshuffle is a strategic focus on the logistics around hybrid cloud, rather than the technology. The company has made efforts to apply its technology to the needs and requirements of particular industries, taking into account unique security, data protection and regulatory requirements, among other considerations. This was signalled with the launch of IBM Cloud for Financial Services, with specific sector-based services set to follow. 

IBM’s cloud computing ‘shopping spree’ 

The changed approach has also been expressed in the nature of IBM’s ten acquisitions since the Red Hat deal closed in 2019, one of the most recent being Taos Mountain, a cloud consultancy firm. IBM is hoping the services of each business, largely small enterprises, can give its wider cloud offering an added edge. 

Reflecting Bruyère’s assessment of IBM’s new strategic direction, Lowery highlights the importance of professional services in making cloud adoption work as the reason the company has focused on acquiring consultancies. Indeed, five of the ten acquisitions are consultancy businesses. “The expertise about how to build in the cloud, how to build across clouds, how to build from cloud to your on-premises data centre – which is hybrid – most of that requires skills and expertise that are not readily available for hire, except through a professional services company,” he says.

Red Hat, meanwhile, fits into the equation perfectly thanks to its technology for containers and container orchestration, as well as its OpenShift family of software products. “That technology is well-suited to building hybrid and multi-cloud solutions where you have one standard way for building applications,” Lowery adds. “It’s not the only way to solve hybrid and multi-cloud scenarios, but it is a valid way, and Red Hat brings IBM the technology to solve that particular set of problems in that way.” 

The rocky road to cloud success

Although the opportunity for IBM is undeniable, so too is the need for urgency. While the size of the cloud market has certainly grown in recent years, the grip of the biggest cloud companies has also tightened; as time passes it becomes increasingly difficult for a challenger to make serious inroads. 

Looking at how prospective customers plan to spend in the coming year, we can also see that IBM faces more of an uphill struggle for business than any other player in this space. 

Turning the tide commercially will be IBM’s most pressing challenge, although we can start to see these efforts pay off with a turnaround in IBM’s financial results for the first quarter of 2021. As far as Murray is concerned, the company is certainly on the right track with the actions it’s taking, especially the decision to spin off its managed services business into an entity named Kyndryl.

“It allows IBM to become much more nimble and responsive,” he explains, “increasing its relevance in a multi-cloud, hybrid world, and reducing competition with the largest systems integrators that will be critical partners for its hybrid cloud and AI offerings. The most important move it has made recently is establishing a new, simplified global sales structure and go-to-market model, giving partners ownership of all but its largest enterprise customers and removing compensation for IBM sales selling into any other accounts.”

Success will very much depend on IBM’s commitment to its new ecosystem and channel model, with a need to reduce complexity and refresh its rules of engagement, he adds. “In the past, IBM has made similar promises but failed to follow through. It now has an opportunity to establish itself as a vendor of partner choice.”

For Gartner’s Craig Lowery, the first sign of green shoots would be when his clients begin showing more interest. “We know when a company is making an impact,” he explains, “when Gartner clients start asking about them and are getting the message in the market that the company has made a significant change and that the change has some substance to it.”

Given the long-term nature of this transition, Lowery advises IBM’s executives to remain consistent in their approach, but also not to shy away from the need to make tweaks as and when required. The fact IBM is making these structural changes, he notes, shows its executives understand the shift that’s required to stay relevant in the future. “It’s clear to me that IBM knows these changes are necessary and that it is willing to do the hard work to make it happen.”

Microsoft launches open source tool Counterfit to prevent AI hacking


Keumars Afifi-Sabet

4 May, 2021

Microsoft has launched an open source tool to help developers assess the security of their machine learning systems.

The Counterfit project, now available on GitHub, comprises a command-line tool and generic automation layer to allow developers to simulate cyber attacks against AI systems.

Microsoft’s red team has used Counterfit to test the company’s own AI models, while the wider business is also exploring using the tool in AI development.

Anyone can download the tool and deploy it through Azure Shell, to run in-browser, or locally in an Anaconda Python environment.

It can assess AI models hosted in various cloud environments, on-premises, or at the edge. Microsoft also promoted its flexibility by highlighting the fact that it’s agnostic to AI models and also supports a variety of data types, including text, images, or generic input.

“Our tool makes published attack algorithms accessible to the security community and helps to provide an extensible interface from which to build, manage, and launch attacks on AI models,” Microsoft said.

“This tool is part of broader efforts at Microsoft to empower engineers to securely develop and deploy AI systems.”

The three key ways that security professionals can deploy Counterfit are by pen testing and red teaming AI systems, scanning AI systems for vulnerabilities, and logging attacks against AI models.

The tool comes preloaded with attack algorithms, while security professionals can also use the built-in cmd2 scripting engine to hook into Counterfit from existing offensive tools for testing purposes.
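Counterfit’s preloaded attacks are drawn from the published adversarial machine learning literature. As a rough illustration of the kind of evasion attack such tooling automates — this is a generic sketch of the fast gradient sign method against a toy linear classifier, not Counterfit’s own code or API — a small, targeted perturbation to an input can flip a model’s prediction:

```python
import numpy as np

# Toy linear "model": predicts class 1 when w . x + b > 0.
w = np.array([1.0, -2.0, 0.5])
b = 0.1

def predict(x):
    return int(w @ x + b > 0)

def fgsm_perturb(x, epsilon=1.0):
    # For this linear model the gradient of the score with respect to
    # the input is simply w, so an FGSM-style step against the current
    # prediction moves along -sign(w) (or +sign(w) if the class is 0).
    direction = -np.sign(w) if predict(x) == 1 else np.sign(w)
    return x + epsilon * direction

x = np.array([2.0, 0.1, 0.0])
x_adv = fgsm_perturb(x)
print(predict(x), predict(x_adv))  # the perturbation flips the class: 1 0
```

Real attacks of this family operate on image or text inputs to large models, but the principle is the same: follow the model’s loss gradient to craft inputs that are misclassified.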

Optionally, businesses can scan AI systems with relevant attacks any number of times to create baselines, with continuous runs as vulnerabilities are addressed, helping to measure ongoing progress.
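The baseline workflow described above can be as simple as diffing successive scan results. A minimal sketch, assuming a hypothetical result format (attack name mapped to whether it fooled the model — this structure is invented for illustration, not Counterfit’s actual output):

```python
# Hypothetical scan results: attack name -> whether the model was fooled.
baseline = {"hop_skip_jump": True, "boundary": True, "zoo": False}
latest   = {"hop_skip_jump": False, "boundary": True, "zoo": False}

def progress(baseline, latest):
    """Return attacks fixed since the baseline, and those still succeeding."""
    fixed = [a for a, hit in baseline.items() if hit and not latest.get(a, False)]
    outstanding = [a for a, hit in latest.items() if hit]
    return fixed, outstanding

fixed, outstanding = progress(baseline, latest)
print("fixed:", fixed)              # attacks mitigated since the baseline
print("outstanding:", outstanding)  # attacks that still succeed
```

Re-running such a comparison after each round of hardening gives the ongoing progress measure the paragraph describes.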

Microsoft developed the tool out of a need to assess its own systems for vulnerabilities. Counterfit began life as a handful of attack scripts written to target individual AI models, and gradually evolved into an automation tool to attack multiple systems at scale.

The company claims it’s engaged with a variety of its partners, customers, and government entities in testing the tool against machine learning models in their own environments.

Red Hat launches OpenShift Platform Plus alongside new managed cloud services


Keumars Afifi-Sabet

28 Apr, 2021

Red Hat has launched an advanced tier of its OpenShift container application platform, with added tools designed to offer a complete Kubernetes stack out-of-the-box. This is in addition to launching three new managed cloud services. 

Red Hat’s OpenShift Kubernetes Engine is the foundational layer of OpenShift, allowing customers to run containers across hybrid cloud deployments on the Red Hat Enterprise Linux (RHEL) OS. The OpenShift Container Platform adds developer and operations services, as well as advanced features for app development and modernisation. 

The third tier, OpenShift Platform Plus, builds on the OpenShift Container Platform to provide advanced security features, ‘day two’ management capabilities and a global container registry. It brings together all the aspects needed to build, deploy and run any application where OpenShift software runs, Red Hat claims.

Its launch has come alongside a set of managed cloud services tightly integrated with the Red Hat OpenShift platform to help organisations build, deploy and manage cloud-native apps across hybrid configurations. 

Red Hat OpenShift Streams for Apache Kafka, Red Hat OpenShift Data Science and OpenShift API Management are being launched to ease the complexities of modern IT environments, while not compromising on productivity. 

OpenShift Streams for Apache Kafka is designed to make it easier for customers to create, discover and connect to real-time data streams regardless of where they’re based.

OpenShift Data Science also offers organisations a way to develop, train and test machine learning models and export in a container-ready format.

OpenShift API management, meanwhile, reduces the operational cost of delivering API-first, microservices-based apps.

“To take full advantage of the open hybrid cloud, IT leaders need to be able to use the technologies that they need in whatever IT footprint makes sense for them,” said Red Hat’s executive vice president for products and technologies, Matt Hicks, at Red Hat Summit 2021. 

“Red Hat managed cloud services effectively drops many barriers that have kept organisations from harnessing the full potential of the hybrid cloud. We believe eliminating the traditional overhead of managing cloud-scale infrastructure will spark a genesis moment for customers and open up a future of possibility where those barriers once stood.”

Red Hat OpenShift Platform Plus adds Advanced Cluster Security for Kubernetes, a standalone product developed from the firm’s recent acquisition of StackRox. This offers built-in Kubernetes-native security tools to safeguard infrastructure and workloads throughout an app’s development cycle. This is in addition to Advanced Cluster Management for Kubernetes and Red Hat Quay. The former brings end-to-end visibility and control of clusters, while the latter provides a secure registry for a consistent build pipeline.

“We believe this version addresses the need for a hybrid cloud solution that we hear from our customers, and we’ll be leading with customer-managed OpenShift across the data centre, public and private cloud,” said senior vice president for cloud platforms at Red Hat, Ashesh Badani.

“This version also becomes a landing point for additional capabilities, and we have worked hard to reduce costs compared to purchasing any of these capabilities a la carte, and we will continue to offer all three versions so customers can best decide what’s appropriate for their use case, and subscribe to the best available version.”

One of the key appeals is it grants businesses system-level data collection and analysis, as well as more than 60 security policies out-of-the-box that can be enforced from the time apps are built to when they’re deployed. 

Red Hat OpenShift Platform Plus also lets organisations take a DevSecOps approach to security by integrating declarative security into developer tooling and workflows.

The three managed services, being launched in the coming months, build on Red Hat’s existing suite of OpenShift apps, allowing customers and partners to build an open Kubernetes-based hybrid cloud strategy.

Based on the open source Apache Kafka project, OpenShift Streams for Apache Kafka allows dev teams to more easily incorporate streaming data into their apps. Real-time data is critical to these apps, providing more immediate digital experiences wherever a service is delivered.

OpenShift Data Science builds on Red Hat’s Open Data Hub project and provides faster development, training and testing of machine learning models without the expected infrastructure demands. 

Finally, the OpenShift API Management managed cloud service offers full API management for Red Hat OpenShift Dedicated, as well as OpenShift on AWS. This combines managed operations with native OpenShift integration to let organisations focus on innovation rather than infrastructure.

Red Hat OpenShift API Management also enables customers to build their own API management program, with the capabilities to control access, monitor usage, share common APIs and evolve their overall application landscape through a single DevOps pipeline.