All posts by Dale Walker

Box overhauls its Relay workflow tool


Dale Walker

22 May, 2019

Box has launched what it describes as an “all-new” version of its Box Relay workflow management tool, featuring a more powerful workflow engine, a simplified UI, and improved tools for manipulating data.

The company first introduced the platform back in 2016 in a bid to make it easier for multiple departments, both inside and outside an organisation, to collaborate on projects from within the Box app, while automating much of the configuration side. It’s designed to make repeated processes, such as the onboarding of a new employee to the company, easier to automate.

The platform has since received a number of updates and developments, including the launch of an API in July 2018, which allowed Relay to be integrated into other business systems, such as CRM and ERP tools.

The latest version brings improvements to the core engine, which now builds workflows using ‘if this, then that’ (IFTTT) triggers to support processes that require a larger number of intricate steps. The platform also supports routing content based on metadata attributes, such as date, dropdown, multi-select and open text fields.
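In IFTTT terms, each workflow pairs a trigger with an action and can branch on metadata values. A minimal, purely illustrative sketch of that routing model (the field names and destinations below are hypothetical, not Box’s actual API):

```python
# Illustrative 'if this, then that' routing on metadata attributes.
# All names here are hypothetical; Box Relay's real API differs.

def route_document(doc):
    """Return a destination folder for a document based on its metadata."""
    # Trigger: a document arrives. Action: route by metadata fields.
    region = doc.get("region")       # dropdown-style field
    tags = doc.get("tags", [])       # multi-select field
    if region == "EMEA" and "contract" in tags:
        return "Legal/EMEA Contracts"
    if region == "EMEA":
        return "EMEA Review"
    return "General Inbox"

print(route_document({"region": "EMEA", "tags": ["contract"]}))
```

The point is that routing decisions become declarative rules over metadata rather than manual triage.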

More immediately noticeable changes can be found in the updated visuals, including a new UI that’s been redesigned to allow non-IT staff to build their own processes without the need for additional technical support. The main dashboard has also been given a fresh look, which will now display real-time metrics for workflow history, details on who created, updated or deleted workflows, and the option to export the audit history.

“Enterprise workflows built around content like document reviews and approvals and employee on-boarding and off-boarding need to be reimagined,” said Jeetu Patel, chief product officer at Box. “They’re disconnected from the apps teams use every day, locked behind IT, and don’t support external collaboration.”

“The new Box Relay brings powerful automation to improve these critical business processes, whether it’s creating sales proposals and marketing assets, or driving budget sign-offs and contract renewals, and more. Enterprises now have one platform for secure content management, workflow, and collaboration that’s built for how we work today.”

Relay has also been more tightly integrated into the Box portfolio. Specifically, users can call upon all the tools found in Box Cloud Content Management, including the security and compliance features, as well as the same integrations, such as Office 365 and DocuSign.

The new Box Relay is currently in private beta but will become generally available in “late June 2019”. The platform will release with both a paid version and a free ‘Lite’ version.

Alongside the Relay update, Box said it is also working on a new single-view UI as part of Box Tasks, designed to make it easier for users to see all their tasks at once and supported by mobile push notifications. This addition is currently in public beta and will be added for all users for free once it launches generally.

Red Hat Enterprise Linux 8 launches with simplified multicloud tools


Dale Walker

7 May, 2019

Open source giant Red Hat has made the latest generation of its Enterprise Linux operating system generally available, bringing with it a host of new features that it hopes will support the growing adoption of hybrid and multicloud deployments.

By upgrading to the new Red Hat Enterprise Linux (RHEL) 8, customers will be able to benefit from a more streamlined process for updating developer tools and frameworks, better security support, compatibility with some of Red Hat’s newest services out of the box, and a more user-friendly GUI that aims to reduce the barrier to entry for Linux beginners.

Red Hat is hoping to capitalise on a growing multicloud industry, with 70% of customers having adopted multiple clouds, according to recent IDC data. What’s more, 64% of applications in an average company’s IT portfolio are based in either public or private cloud.

With Linux poised to play a part in around $10 trillion in global business revenue this year, Red Hat believes that the software layer should be keeping pace in the multicloud era, particularly when companies are also adopting other disruptive strategies, like AI and DevOps.

Speaking at Red Hat Summit this week, Stefanie Chiras, vice president and general manager of Red Hat Enterprise Linux, explained that the company wants to serve as a “junction” between innovative products and applying these to a business ecosystem.

“Innovation and Linux are inseparable – from building the Internet’s backbone to forming the first neurons of AI, Linux drives IT’s present and future,” said Chiras. “We want to redefine the value of an operating system in this new era of IT. We want to really show to the industry that Red Hat is an enterprise software portfolio company, not a product company.”

As an example, she highlighted the addition of Insights to RHEL 8 by default, effectively a support service that gives customers access to expertise on their Linux deployments. It monitors a business’ deployment and automatically flags any security vulnerabilities or stability issues to admins. Red Hat was keen to characterise this as a “coaching” approach to support that helps customers become more familiar with the platform, effectively replacing the idea of support tickets.

“We’ve pulled Insights directly into the RHEL subscription – that allows customers to use a software-as-a-service offering, [and] leverage all the tools that we offer,” Chiras explained.

Another major addition is the Application Streams feature, a tool that aims to improve the platform’s process of updating languages and frameworks, something that has traditionally been difficult to do without creating instability. With Application Streams, these languages and frameworks can be updated far more frequently without straining core resources.

It’s clear that making things easier is the theme of this iteration, something that’s certainly resonated with Red Hat’s customers. Today, RHEL enjoys over 50,000 deployments, and version 8 recorded 8,000 downloads during its beta phase – compared to just 400 for RHEL 7.

Specifically, it has hidden many of the more granular system tasks behind the updated RHEL GUI, which now also makes it easier to upgrade instances from version 7 to 8. It also draws upon Ansible, Red Hat’s automation platform, to power new system roles and allow the creation of workflows for more complex management tasks. This should make it far easier for new system admins, for example, to adopt new protocols, and reduce the possibility of human error.

RHEL is still Red Hat’s flagship product, and will certainly serve as a springboard for the rest of its portfolio. In fact, Red Hat also plans to introduce Enterprise Linux CoreOS, a lightweight version of RHEL 8 designed for customers using OpenShift 4.

The release marks the final iteration of RHEL before the completion of the $34 billion acquisition by IBM, which was approved by the US Department of Justice this week.

HPE hails ‘major leap in storage’ with memory-driven flash


Dale Walker

28 Nov, 2018

HPE has announced a host of new enhancements to its storage portfolio, including the introduction of memory-driven flash and an expansion to the coverage of its multi-cloud service.

The new additions come at a time when the company has all but given up on keeping pace with market leader Dell, and is instead seeking to build out its Intelligent Infrastructure range with new capabilities.

Chief among these is the introduction of Memory-Driven Flash to its 3PAR family of data centre flash storage modules and its Nimble Storage range, something that HPE described as the biggest leap in storage in 25 years. It essentially combines HPE software with storage class memory (SCM) and NVMe-based storage built on Intel’s Optane hardware.

The result is a new class of storage that’s designed to halve latency, and is billed as being 50% faster than all-flash arrays using NVMe SSDs. This is particularly important for latency-sensitive workloads that rely on real-time processing, or those that use AI at scale.

“Most applications can benefit from adding memory, but memory is very expensive,” said Milan Shetti, GM of HPE Storage, speaking at HPE Discover in Madrid this week. “You can also have intelligent storage, but one of the key attributes of this is you need to have memory.

“[This] is the industry’s first enterprise storage operating system, which will support storage-class memory,” he added. “This is something we’ve been working on for a while. With [Memory-Driven] operating system, at the speed of memory and at the cost of flash, you’re getting an entirely new way of building computing, storage and data management.”

This new architecture will be available in December 2018 for 3PAR as a straightforward upgrade, and sometime in 2019 for Nimble Storage.

Another major announcement was the introduction of new tools to its InfoSight product, a management platform that is designed to predict and prevent errors across an organisation’s infrastructure without user involvement, as well as predict bandwidth, capacity and performance needs.

Now the platform can also utilise more machine learning-based controls, including an enhanced recommendation engine that replaces its basic predictive analytics with an AI-based system. This drastically improves optimisation guidance across HPE Nimble Storage as a result, the company explained.

Also announced was the release of machine learning tools for its HPE 3PAR storage range, which allows for self-diagnosis of performance bottlenecks and means InfoSight can be extended to purely on-premise deployments. HPE explained this addresses the issue of being unable to provide InfoSight analytics to data centres which may have limited access to the cloud.

The company also revealed that its Cloud Volumes, a hybrid cloud storage-as-a-service platform, is now expanding into the UK and Ireland regions in 2019. Currently available for HPE Nimble Storage, the pay-as-you-use service allows customers to move their on-prem applications to the AWS or Azure cloud, only with enterprise-grade storage instead of the default storage offered by those clouds.

This platform also now includes containers-as-a-service for the provisioning of applications hosted by HPE, as well as compliance certifications, including SOC Type 1, HIPAA for healthcare, and GDPR.

HPE builds on its composable vision with new Hybrid Cloud Platform


Dale Walker

27 Nov, 2018

HPE has launched a host of new additions to its composable cloud portfolio, chief among them being a hybrid cloud software stack that aims to bring the flexibility and fluidity of the public cloud, as well as a range of AI-driven storage systems, to the on-premise environment.

The company claims this represents the industry’s first hybrid cloud platform built on composability and flexible deployments, something that will help those businesses struggling to move to the cloud due to a lack of skilled stack specialists.

HPE follows the likes of Microsoft, Google, and IBM – leading technology companies that have shifted their focus over the past year towards a customer base that’s increasingly adopting multiple cloud providers or hybrid setups over a single, all-encompassing service.

It also builds on HPE’s launch of a hybrid cloud-as-a-service model, which sits inside its GreenLake financial services brand, allowing customers to pay monthly in exchange for hybrid services deployed and managed entirely by HPE. 

This new Composable Hybrid Cloud product extends that by allowing customers to essentially upgrade their traditional data centre setup to operate in the same way as a public cloud. This means that typical public cloud features, such as the fluid provisioning of resources, can now be built into a server stack, which the company claims will drastically improve the efficiency of workloads and allow for better communication between deployments.

“Our customers want to innovate faster, with greater automation and intelligence,” said Phil Davis, president of Hybrid IT at HPE. “With our new open hybrid cloud platform, enterprises of all sizes can now manage, provision and deliver workloads and apps instantly and continuously to accelerate innovation.”

The new software includes HPE’s Composable Fabric, first introduced as part of its Synergy platform, which is placed on top of a server setup built with ProLiant or Synergy servers. This works like a mesh that’s able to automatically scale up and scale down network resources based on the needs of workloads, as well as allow those processes to communicate with others on the network.

This simplifies the operations of the network by allowing it to effectively act like a hyperscale cloud provider, providing a means to adjust workload balancing on the fly, the company says. Once deployed, this is said to reduce the over-provisioning of resources by up to 70%.
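The scale-up/scale-down behaviour described here is, at heart, an autoscaling control loop: size each workload’s allocation to observed demand plus headroom rather than provisioning statically for the peak. A generic, purely illustrative sketch (not HPE’s actual implementation; the headroom figure is an assumption):

```python
# Generic autoscaling step: size allocations to tracked demand plus headroom,
# avoiding the over-provisioning that comes from static worst-case allocation.
# Illustrative only; Composable Fabric's real logic is not shown here.

HEADROOM = 1.2  # assumed policy: keep 20% spare capacity above observed demand

def rebalance(workloads):
    """Return a per-workload bandwidth allocation sized to demand plus headroom."""
    return {name: round(demand * HEADROOM, 1) for name, demand in workloads.items()}

demand = {"db": 10.0, "web": 4.0, "batch": 0.5}   # observed Gbit/s per workload
print(rebalance(demand))
```

Re-running a loop like this as demand shifts is what lets a fabric release capacity that static allocation would have left idle.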

InfoSight, HPE’s AI-powered analytics service, is also integrated into the software stack, which automatically predicts and prevents points of failure in the workload, reducing the amount of manual work required by operators.

“Organisations are demonstrating an ever-growing appetite for automation, scalability and openness to aggressively accelerate development and operations,” said Thomas Meyer, group VP of IDC. “With composable cloud, HPE aims to deliver a foundational pillar with those attributes in mind to help customers accelerate their digital transformation.”

Importantly, HPE’s hybrid cloud platform is open to third parties. Currently, the platform supports ProLiant server workloads built for Red Hat’s OpenShift and VMware. HPE’s Synergy infrastructure platform will also support workloads from SAP and Oracle, and its Image Streamer capability means it can support DevOps tools from the likes of Chef, Ansible, Puppet, and VMware.

HPE’s Composable Cloud platform on ProLiant rack servers will roll out to customers in the US, UK, Ireland, France, Germany and Australia starting in early 2019.

Cloudera and Hortonworks announce $5.2bn merger


Dale Walker

4 Oct, 2018

Cloudera and Hortonworks, two of the largest providers of open source enterprise Hadoop products, have announced a merger of equals deal that’s said to place their joint value at $5.2 billion.

The deal is an all-stock merger, with Cloudera stockholders taking ownership of approximately 60% of the combined company and Hortonworks stockholders taking the remaining 40%. The combined companies will boast more than 2,500 customers, around $720 million in revenue, and over $500 million in debt-free cash.

The Big Data companies, once fierce rivals, have built their businesses on providing simpler, packaged IT services for companies that want to exploit the data processing power of the Hadoop platform but are unable to build systems from scratch.

Where Hortonworks has largely relied on charging for support services as a purely open source provider, Cloudera based much of its business on subscription services. The deal will likely see many Hortonworks customers transition over to a subscription fee, providing an early spike in revenue for the joint company.

The deal places the companies in a far better position to take on newer cloud solutions by providing a platform that covers multiple cloud types, as well as on-premise and Edge. It also gives the two companies a far better chance at maintaining profitability in the heavily crowded Big Data space.

“This compelling merger will create value for our respective stockholders and allow customers, partners, employees and the open source community to benefit from the enhanced offerings, larger scale and improved cost competitiveness inherent in this combination,” said Rob Bearden, chief executive officer of Hortonworks.

“Together, we are well positioned to continue growing and competing in the streaming and IoT, data management, data warehousing, machine learning/AI and hybrid cloud markets. Importantly, we will be able to offer a broader set of offerings that will enable our customers to capitalize on the value of their data.”

Cloudera CEO Tom Reilly will serve as CEO of the combined firm, while Hortonworks’ COO and CFO Scott Davidson will stay on as COO.

Current Hortonworks CEO Rob Bearden will join the board of directors, with current Cloudera board member Marty Cole moving up to chairman.

Shares in both firms spiked following the news, with Cloudera stock surging 26% in after-hours trading and Hortonworks rising by 27%.

Salesforce plans to pump £1.9 billion into UK initiatives


Dale Walker

13 Jun, 2018

Prime minister Theresa May is set today to announce a slew of new commitments aimed at boosting the UK’s tech industry, which include access to a £2.5 billion government investment pot.

The announcements coincide with London Tech Week, which will be the setting for a number of government-led roundtable events with leading technology firms looking to invest in the industry.

CRM giant Salesforce is expected to commit $2.5 billion (£1.87 billion) to the UK market over the next five years, which includes the building of a second data centre.

A further £300 million will come from the UAE-based Mubadala Investment Company in the form of a European investment fund, and an additional £41 million from Japanese firm NTT Data as part of its expansion into the UK.

The government’s own £2.5 billion commitment will be in the form of a British Patient Capital programme that aims to support businesses with high growth potential with access to long-term funding. This fund is also expected to be supported by a further £5 billion in private investment.

“The measures we are announcing today will allow innovative British startups to invest in their future – and in the UK – by hiring more skilled people, expanding their business and exporting their expertise across the world,” said May.

“It’s a great time to be in tech in the UK, and our modern Industrial Strategy will drive continued investment, ensuring the nation flourishes in the industries of the future and creating more high-paying jobs.”

In an effort to make it easier for businesses to source overseas talent, the prime minister will also scrap the current graduate visa programme and replace it with a ‘startup visa’, a streamlined route that will be available to foreign entrepreneurs once it launches in spring 2019.

“Britain is a digital dynamo with the government and tech sector working together to help make this country the best place in the world to start and grow a digital business,” said culture secretary Matt Hancock. “We’re encouraging the best and brightest tech talent to come to the UK and creating the right conditions for our high growth digital businesses to thrive.”

The government has also committed to the building of two new tech hubs in Brazil and South Africa, designed to encourage the development of digital skills in the regions and foster greater relationships with UK businesses.

NTT Data UK CEO Simon Williams said today’s announcement “shines an important light on the UK technology sector and the incredible talent emerging across the industry”.

Commenting on the company’s £41 million investment, he added: “NTT Data has a proud history of investment and innovation in the UK, which is one of the most competitive markets in the world.

“Companies like NTT Data recognise that by investing and succeeding in the UK, we are in a very strong position to succeed in other markets around the world.”

The UK technology industry attracted $7.8 billion in funding last year, almost double that of 2016 and $1.8 billion more than France and Germany, according to government figures.

Antony Walker, deputy CEO of trade industry body techUK, said: “This is another vote of confidence in the UK tech sector. The billions of pounds of investment and thousands of new jobs shows that the UK remains a global hub for tech.

“The government is clearly determined not to abandon the playing field to France and others when it comes to presenting a strong offering to tech entrepreneurs and investors. The Industrial Strategy has been very positive for tech. The challenge is to build on these strong foundations. We need to digitise our economy, grow our domestic digital market and identify new export opportunities.”

Box Zones now supports multi-zone storage for regulatory peace of mind


Dale Walker

24 May, 2018

Box customers can now store and access data across multiple cloud zones from inside a single instance of its Box Zones tool.

The enterprise content management firm has expanded on the capabilities first introduced to Box Zones when it launched in 2016, which allowed customers to store data within a specific region of their choosing – the UK, Germany, Ireland, Japan, Singapore, Australia, the US or Canada.

With today’s announcement, customers can now pick and choose where they keep their data and adhere to the necessary data protection requirements of those regions, such as GDPR, which is due to change the way EU residents’ data is stored and managed from tomorrow.

“Business has never been more fast paced, and at the same time regulatory changes like GDPR and the ever-changing security landscape are adding complexity, making it increasingly difficult to create a digital workplace that provides employees with the information they need to be successful,” said Box’s chief product officer, Jeetu Patel.

“Multizone support for Box Zones gives enterprises the best of both worlds,” added Patel. “Not only will they be able to make granular decisions about how to govern and store their data across the globe, but users will get the same great collaborative experience they have with Box today – no matter where they, or their collaborators, are located.”

The multizone feature in Box Zones works via a drop-down menu, allowing customers to change their data residency settings at will. Organisations will also have the option of assigning users to specific cloud zones, with data migrated automatically when a user’s zone changes.

Box also overhauled its admin console within Box Zones, allowing admins to gain oversight of what data is being stored where, regardless of the number of zones being used.

It also confirmed that companies will be able to apply their own security policies on top of the tool, blocking data transfers between certain regions if required.

What hasn’t changed is the number of zones that are available, however. Speaking at Box’s World Tour event in London today, CEO Aaron Levie confirmed the company plans to add to its existing zones over time.

“From a technology standpoint we can support more and more regions over time, wherever our technology partners have data centres and infrastructure,” said Levie. “So we’re evolving the regional support based on the customer demand we are seeing.”

Box Zones currently supports data residency across the eight aforementioned regions. Organisations are able to assign a default zone when first setting up, and then assign users via an API or the admin console.
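The assignment model described above — a default zone, per-user overrides, and automatic migration on change — can be sketched as a toy in-memory directory (purely illustrative; this is not Box’s actual admin API):

```python
# Toy model of per-user zone assignment with automatic migration on change.
# Illustrative only; Box's real admin API and migration process are not shown.

ZONES = {"UK", "Germany", "Ireland", "Japan", "Singapore", "Australia", "US", "Canada"}

class ZoneDirectory:
    def __init__(self, default_zone):
        assert default_zone in ZONES
        self.default_zone = default_zone
        self.user_zones = {}
        self.migrations = []  # log of (user, old_zone, new_zone)

    def zone_for(self, user):
        # Users without an explicit assignment fall back to the default zone.
        return self.user_zones.get(user, self.default_zone)

    def assign(self, user, zone):
        if zone not in ZONES:
            raise ValueError(f"unsupported zone: {zone}")
        old = self.zone_for(user)
        if zone != old:
            # Data is migrated automatically when a user's zone changes.
            self.migrations.append((user, old, zone))
        self.user_zones[user] = zone

d = ZoneDirectory(default_zone="US")
d.assign("alice", "Germany")
print(d.zone_for("alice"))   # Germany
```

Keeping a migration log like this is also roughly what lets an admin console show what data is stored where.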

View from the airport: Red Hat Summit 2018


Dale Walker

11 May, 2018

Red Hat’s annual meet is a chance for the company and its customers to celebrate all things open source; yet this year’s summit was about something more important.

With the weight of 25 years behind it, the pressure was on the company to make a bold stand, proving to the industry that it can ward against the likes of Amazon and other goliaths looking to wrestle control of the highly lucrative hybrid cloud market.

The past few days have proved one thing. Red Hat is poised to become the dominant force in hybrid cloud and open source.

Setting the tone of the conference, Red Hat immediately announced that IBM would be fully containerising its entire WebSphere application portfolio and moving it over to Red Hat’s OpenShift platform – a direct rival moving its services over to Red Hat technology. It’s precisely the sort of knockout announcement the company needed.

If that wasn’t enough to persuade its customers of its value in the market, Red Hat also announced a partnership with Microsoft on the creation of the industry’s first jointly-managed container platform – essentially OpenShift running on Microsoft Azure, including both its on-premise and hybrid cloud platforms.

With those two major announcements aside, Red Hat took the time to detail its plans for the integration of CoreOS into its wider portfolio, something that customers have been waiting for since the acquisition was announced in January.

The integration, alongside new offerings through IBM and Microsoft, means that over the next few months the company will have access to a far greater number of potential customers, with a greater portfolio of products to offer them.

It’s no wonder then that Red Hat is confident it will extend its earnings winning streak, having already marked 64 consecutive quarters of revenue growth – an unprecedented feat. While the company is far from the monetary value of tech’s biggest players, it’s fast becoming the most important company in the open source space.

Red Hat considers itself to be the Switzerland of the tech industry, and while that message may be nauseating, it’s difficult to argue against the notion of removing vendor lock-in for customers, nor doubt Red Hat’s ability to sway traditional technology giants over to open source.

While executives expressed a willingness to engage with trends they see as being future growth opportunities – namely serverless computing and automation – it will be another Summit or two before we see any meaningful action on those fronts. For the time being, hybrid remains Red Hat’s focus.

Red Hat will be one to watch over the next year. With IBM and Microsoft already on side, it’s anyone’s guess which technology giant will be lured next.

Q&A: Red Hat’s Werner Knoblich talks hybrid, talent, and the serverless future


Dale Walker

10 May, 2018

“You don’t get lucky 64 times in a row,” says Werner Knoblich, Red Hat’s SVP and GM of EMEA, pointing to the open source company’s unprecedented streak of quarterly revenue growth.

Red Hat posted $772 million in revenue for the fourth quarter of the 2018 fiscal year, 23% higher than over the same period in 2017. The company is riding high on a burgeoning market that’s seen a shift towards containerisation and open source in recent years.

Knoblich argues that Red Hat’s core philosophy of open source, and its bet on Kubernetes and hybrid cloud just over five years ago, are helping to place the company at the forefront of application development, particularly as others have historically relied on vendor lock-in.

The Switzerland of hybrid cloud

“We often talk about ourselves as being Switzerland,” says Knoblich. “We’re becoming the abstraction layer to ensure people aren’t locked in. It’s one of the reasons we are still an independent company.

“It’s one of our key value propositions – being a neutral company. If years ago HP or IBM had acquired us, we would have lost our neutrality. The ecosystem’s other players know this as well, as open source is an innovation engine.”

Red Hat’s 2018 Summit will be a particularly memorable one in years to come as it marked the year that IBM, a rival platform provider, announced it would be shifting its WebSphere applications over to Red Hat’s OpenShift.

“There’s endorsement there,” says Knoblich. “IBM is now fully containerising all of its WebSphere applications, making them available on OpenShift. It’s their preferred platform. We see WebSphere as a direct competitor to us in terms of JBoss (Red Hat’s application server), and they’re fully switching over to our technology. It’s a gigantic endorsement that we’ve done the right things with OpenShift.”

Its record growth is being fuelled by an increasing willingness to embrace open source services among markets that have traditionally been locked to Windows-based systems.

“We struggled in the early days when we were mainly a pure Linux company,” says Knoblich. “Our sweet spot was countries and industries where there was heavy Unix usage and not just pure Microsoft – then we did Unix replacements, not necessarily Microsoft migrations.

“There’s not really a struggle anymore, because even a Microsoft customer is becoming a very good target for us – they also need to containerise their applications and automate their environment,” adds Knoblich.

“Our portfolio has grown so much that we have so many more possibilities. Even if a customer says ‘I’m staying completely on Windows, I don’t want to switch to Linux’, even for this customer we have offerings that can provide value where a couple of years back we had nothing.”

The lure of open source

Red Hat also attributes a great deal of its success to its ability to outmanoeuvre its rivals when it comes to talent acquisition.

“Every single company is fighting for the same talent,” says Knoblich. “Google tries to hire the best developers in the world; so do Deutsche Bank, Barclays and Volkswagen. But if you’re the best developer, you can choose. Companies need to be attractive to these developers, otherwise they won’t get them.”

He argues that if employers don’t start offering developers a means to work within open source communities, they risk alienating budding talent looking to build experience.

“As a developer, where do you build your CV? It’s on Github. They want to make a name for themselves, not just put on a CV that they worked at Barclays or whoever.”

“If companies don’t allow them to do this, the developer will say ‘this company is limiting my career path, so I won’t join them’. That’s what accelerated the whole [open source] movement.”

A shift of this kind can require a cultural change, something that Red Hat, as a veteran in the open source space, has been fostering among other companies in the role of a consultant.

“Often the legal department is the big issue – they say ‘we can’t have our employees making submissions to open source communities with the domain email address of the employer’.

“‘What’s the legal exposure if I as an employee make a commit and something goes wrong with the code? Is there a liability?’”

Finding the next industry standard

Having placed an early stake on hybrid, Red Hat is hoping to once again position itself ahead of the competition. Part of that shift will be to invest in what it sees as the next big thing: serverless.

“It was kind of introduced by Amazon, but with that, again, they’re trying to lock their customers to their platform,” says Knoblich, commenting on AWS’ shift towards offering a dynamic cloud service that handles the allocation of machine resources – normally handled by a server.

He explains that Red Hat is looking at making a serverless offering as a function of OpenShift in the near future, competing directly with Amazon.

However, possibly the biggest focus for the company will be in the untapped multi-cloud management space, explains Knoblich, as customers look for products that make it easier to handle large numbers of deployments.

“Because the world is hybrid, customers will have different environments – on-premise, a VMware virtualised environment, OpenStack, private cloud, Azure. But they somehow still need to manage it all through a single pane of glass – they don’t want to have to use all the individual dedicated tools.”

“That’s obviously a big play. The environment is not becoming simpler, it’s becoming, to a certain extent, more complicated,” says Knoblich. “Those management tools that bring that all together… that’s also something I think we will be focusing on.”


Red Hat sets out roadmap for CoreOS integration


Dale Walker

9 May, 2018

Red Hat has released the first details of its roadmap for the integration of the newly acquired CoreOS tools into its existing suite of container-based services.

The open source giant snapped up CoreOS, a highly successful cloud-native startup, for $250 million back in January, a move considered to be Red Hat’s biggest acquisition since its shift in focus towards providing Kubernetes services.

Since then, Red Hat has been silent on which tools the company would formally embrace. However, it has now confirmed that CoreOS Tectonic, Quay and Container Linux will all be integrated into Red Hat’s OpenShift container platform.

Tectonic was originally developed to solve problems associated with managing Kubernetes deployments at scale by introducing automation, supported by much-lauded ‘over-the-air’ updates. Integrated into OpenShift as ‘automated operations’, the feature should make it easier for IT admins to roll out automatic upgrades across their clusters and hosts.
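The ‘automated operations’ idea — keeping every cluster converged on a desired version without manual rollouts — follows the familiar Kubernetes reconciliation pattern. A schematic sketch of that loop (not Tectonic’s or OpenShift’s actual code; cluster names and versions are made up):

```python
# Schematic reconciliation loop: converge each cluster on a desired version.
# Illustrative only; real operators watch the Kubernetes API and act on diffs.

def reconcile(clusters, desired_version):
    """Upgrade any cluster not yet on the desired version; return actions taken."""
    actions = []
    for name, state in clusters.items():
        if state["version"] != desired_version:
            # A real operator would drain, update and health-check nodes here.
            state["version"] = desired_version
            actions.append(f"upgraded {name} to {desired_version}")
    return actions

fleet = {"east": {"version": "3.9"}, "west": {"version": "3.10"}}
print(reconcile(fleet, "3.10"))   # only 'east' needs an upgrade
```

Running the loop repeatedly is what makes upgrades ‘automatic’: the second pass finds nothing to do and takes no action.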

Also making its way into OpenShift is Container Linux, a lightweight operating system providing immutable infrastructure for container deployments, which also benefits from over-the-air updates.

“Our number one principle is that no customer is left behind,” said Ashesh Badani, VP and general manager of OpenShift, speaking at Red Hat Summit. “We want to make sure that all the community interests, all the customers, around Container Linux are supported. We move that forward injecting Red Hat content into that.

“Tectonic was a pretty popular distribution of Kubernetes – customers really liked the fact Tectonic was focused on over the air upgrades, technologies around monitoring and metering. We’re taking all of that and converging that into the OpenShift platform, available over the next six months.”

Quay, a service that acts as a registry for managing container images, will be a standalone product within the OpenShift portfolio, the company confirmed.

Red Hat Quay will be available as an on-premise deployment or through a hosted service as Red Hat Quay.io, and will feature the same tools that made the service popular, including geographic replication, security scanning, and image time machine features.

Badani added that the integration roadmap would be fully delivered to customers by the end of the year, and that incremental progress updates would be provided, the next being at some point over the summer.
