All posts by Keumars Afifi-Sabet

View from the airport: Epicor Insights 2019

Keumars Afifi-Sabet

25 Apr, 2019

Manufacturing, lumber, distribution: these kinds of industries have been around for hundreds of years, and those who run companies in these sectors are used to doing things a certain way. Change is slow and, if business is good, can be seen as more of a risk than an opportunity.

This is the challenge that enterprise resource planning (ERP) firm Epicor aimed to tackle as it entered into its annual Insights conference, hosted this year at Mandalay Bay in Las Vegas. And, if the company’s executives are to be believed, the migration of its flagship product line-up to the public cloud is certainly the future.

This could open up new technological possibilities, so the message goes, with the biggest announcements last week centring on enhanced ERP functionality underpinned by a stronger partnership with Microsoft’s Azure platform.

The most eye-catching of these was the Epicor Virtual Agent (EVA), an AI-powered digital assistant that the company’s chief product and technology officer Himanshu Palsule insisted wasn’t “just a chatbot”. Epicor’s big bet on the cloud was also met with a commitment to something it calls the ‘connected enterprise’, or in other words letting Internet of Things (IoT) devices loose across the factory floor. But this direction of travel hasn’t been embraced warmly by a significant chunk of Epicor customers.

Just a few years ago the company was in rocky waters under its previous leadership, adopting a defensive mentality with regards to its customers, and almost apologising when they integrated any new tech into their products. But now the company is banging the drum for digital transformation and, in particular, public cloud migration.

Epicor has embarked on a decisive drive to communicate these benefits to a sceptical customer base, a small proportion of which have no interest in pursuing cloud technology whatsoever.

The software giant puts the number of its customers still using Epicor ERP 9, released almost a decade ago, or similarly outdated software at just under 500 of the 3,000 in the manufacturing space. Moreover, questions asked throughout the conference often ran along the lines of “do I need to be on cloud to use this feature or can I get it with on-prem as well?”.

One of the deepest concerns among an older generation of customers, as Epicor sees it, is that they’re being forced into using the cloud against their wishes. They are, in fact, quite comfortable with how their infrastructure sits right now. The biggest worry for Epicor is that they’ll sit on whatever system they have for as long as possible, and then look at rival options – instead of upgrading – when the software is no longer supported.

Meanwhile, although Epicor’s leadership would say they’re ahead of the curve tech-wise, or at least on par with their weightier rivals (the likes of Oracle and SAP), they’d also concede this conflict has held them back. Painful upgrades and software flaws, among other issues, have clouded the company’s past.

But the overall tone at Insights 2019 was one of optimism, as Epicor seeks to finally hit the elusive billion-dollar-company mark within the next 12 months. The defensive mentality of old has been discarded, with a more proactive one picked up in its place.

But reckoning with the pace of growth has posed another headache. Epicor is a much larger firm than it was several years ago, but many of its internal processes are still those of a small business. Internal change remains a learning process, and executives are finding they can’t whip up their conference speech a day before their keynote anymore.

These next 12 months will be especially interesting for a software firm that has put its chips on the table for public cloud. Epicor’s main predicament in future will centre on wrestling with legacy attitudes among a significant minority of customers while trying to keep its tech as fresh as the biggest players in ERP.

Insights 2019: Epicor extends Microsoft Azure partnership to power fresh ERP enhancements

Keumars Afifi-Sabet

17 Apr, 2019

Epicor has built on its partnership with Microsoft Azure to roll out major artificial intelligence (AI) and Internet of Things (IoT) enhancements to its suite of enterprise resource planning (ERP) tools.

Manufacturers and distributors will benefit from upgrades to the company’s Epicor ERP and Prophet 21 platforms respectively, powered by closer integration with Microsoft’s Azure cloud platform almost a year after this partnership was first announced.

Details of both releases were outlined by Epicor’s chief product and technology officer Himanshu Palsule at the company’s annual Insights customer conference, hosted this year at Mandalay Bay, Las Vegas.

In particular, this Azure base layer will power the company’s digital assistant Epicor Virtual Agent (EVA), as well as enable better connectivity between people and smart machines in the firm’s Epicor ERP platform for manufacturers.

“We’ve made great progress in moving many of you onto the Azure platform,” Palsule said during his keynote address. “And as you saw with EVA and the connected enterprise, we have now started using the platform elements of Azure.

“We’ve started using the service levels, service fabric and the IoT Hub and other parts of that. And that’s always been our strategy.”

Improvements to the manufacturer-centric Epicor ERP platform see a host of tools, such as AI and analytics, integrated with the system to advance the company’s vision for a ‘connected enterprise’.

The Epicor IoT module, in particular, will connect smart machines across the manufacturing floor to Microsoft’s Azure IoT Hub, with data pulled directly from sensors and visualised on the ERP home page.

Meanwhile, Epicor is encouraging customers of Prophet 21, its web-based app targeted at distributors, to embrace the public cloud to transform their business processes.

This app was recently ported onto the public cloud in its entirety, with software engineers telling Cloud Pro the most exciting part of the shift is customers’ newfound ability to use the product on any device, from tablets to iMacs.

The latest releases of Epicor’s ERP tools for manufacturers and distributors take steps towards realising the ambitions set out when the strategic partnership was first outlined last year.

As to where the company aims to take this in the future, Palsule told the press at a Q&A following the keynote that customers whose own platforms are tied to Azure will in future be able to draw additional benefits.

He also outlined how the partnership would work in practice and addressed concerns around security, saying the firm resisted any temptation to try to solve it itself.

“The way it works is you get the machines communicate with the IoT Hub, the IoT Hub has a certain restriction on security protocols. Then that data then goes into ERP and communicates back,” said Palsule.

“Everything that we’re doing follows the standard Microsoft protocols so we are relying heavily on this partnership to give us security.”

Microsoft, meanwhile, sees this partnership as the perfect marriage between its underlying cloud infrastructure and the niche expertise that a partner in the mould of Epicor can offer.

“We got our engineering teams together we sat down, and ultimately worked out all the architectural designs,” Microsoft’s partner director for global ISV alliances and business development Don Woods told Cloud Pro.

“We put the ERP into Azure, and then started going to customers and said ‘look, you want to get into the cloud? Epicor’s moving into the cloud. It’s going to SaaS. You want to benefit from all this? Join us in that movement’.

“And Epicor wanted to make sure that the customers had that capability, because that’s where it’s going.”

Google Plus back from the dead as ‘Google Currents’ enterprise workspace app

Keumars Afifi-Sabet

12 Apr, 2019

The now-defunct social media platform Google Plus has been unexpectedly resurrected as an enterprise application to rival the likes of Slack and Facebook Workplace.

After the service sustained a data leak of half a million accounts in October 2018, Google launched a security review and decided to shut down the platform permanently.

A second data leak in December – this time exposing the private data for 52.5 million users – led Google to accelerate its demise from August to April 2019. The platform then officially closed just 10 days ago.

This has also coincided with the launch of Google Currents, which touts itself as a like-for-like replacement for Google Plus for G Suite, the version of the app that had been available only to enterprise users.

“Currents is a G Suite app that enables people to have meaningful discussions and interactions across your organization, helping keep everyone in the know and giving leaders the opportunity to connect with their employees,” the company announced.

“Currents makes it easy to have meaningful discussions by enabling leaders and employees to exchange ideas across the organization and gather valuable feedback and input from others – without flooding inboxes.”

Among the features in the new platform are analytical tools that let users track how widely their posts are seen across the network, and priority given to posts from leadership teams.

Tags and streams, including a ‘home stream’, are designed to show individuals the most relevant posts for them at any time.

Organisations can register with the Google Currents beta now, and all content for existing Google Plus users will automatically transfer upon enrolment. The app is available in all versions of G Suite.

Incidentally, the name ‘Google Currents’ is itself a resurrection: it once belonged to a social magazine app, Google’s answer to Apple News, which launched in 2011 and was replaced two years later by Google Play Newsstand.

Before retiring just days ago, Google Plus endured a torrid life playing second fiddle to more widely used platforms such as Facebook, Instagram and Twitter. The October data leak would likely have proved the final nail in the coffin, had the exposure of 52.5 million users not followed soon after.

But Google hopes to keep its social media platform alive in some form, with the core code of its consumer-focused app now ported to a business-oriented workplace service.

This also follows the launch of several new G Suite updates at Google Cloud Next 2019, with Sheets, Hangouts, Calendar and Gmail touted for the near future.

AWS makes double swoop for Volkswagen and Standard Bank

Keumars Afifi-Sabet

27 Mar, 2019

Amazon’s cloud arm has struck separate agreements with Volkswagen (VW) and Standard Bank to boost the two companies’ cloud platform and customer-facing applications respectively.

VW’s deal with Amazon Web Services (AWS) aims to pave the way for a transformation of the car maker’s manufacturing and logistics processes across its 122 plants, including improving the effectiveness of assembly equipment and tracking parts and vehicles.

Together, the two companies will develop a new platform, dubbed the Volkswagen Industrial Cloud, which will deploy technologies such as the Internet of Things (IoT) and machine learning to realise this wider ambition.

AWS’ IoT services will be deployed in full across VW’s new platform in order to detect and collect data from the floor of each plant, then organise and conduct sophisticated analytics on the information to gain operational insights.

Moreover, VW will feed all the information into a data lake built on an Amazon S3 bucket, on which data analytics will be conducted. This, the two companies hope, will improve forecasting and insight into operational trends, streamline the manufacturing process, and help identify gaps in production and waste management.
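As a rough illustration of the kind of analysis such a data lake enables, the sketch below scans timestamped sensor readings for silences on the production line. The function name, threshold and sample readings are invented for the example; in practice this logic would run over data pulled from S3 with AWS’ analytics services rather than in plain Python:

```python
from datetime import datetime, timedelta

def find_production_gaps(timestamps, threshold=timedelta(minutes=10)):
    """Return (start, end) pairs where consecutive sensor readings are
    further apart than `threshold` - i.e. possible production gaps."""
    ordered = sorted(timestamps)
    gaps = []
    for earlier, later in zip(ordered, ordered[1:]):
        if later - earlier > threshold:
            gaps.append((earlier, later))
    return gaps

# Invented sample: a machine that went quiet for 45 minutes mid-morning
readings = [
    datetime(2019, 3, 27, 9, 0),
    datetime(2019, 3, 27, 9, 5),
    datetime(2019, 3, 27, 9, 50),  # long silence before this reading
    datetime(2019, 3, 27, 9, 55),
]
print(find_production_gaps(readings))
```

The same pass over weeks of readings would surface the operational trends the companies describe, with the threshold tuned per machine type.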

“We will continue to strengthen production as a key competitive factor for the Volkswagen Group. Our strategic collaboration with AWS will lay the foundation,” said the chairman of the Porsche AG executive board Oliver Blume.

“The Volkswagen Group, with its global expertise in automobile production, and AWS, with its technological know-how, complement each other extraordinarily well. With our global industry platform, we want to create a growing industrial ecosystem with transparency and efficiency bringing benefits to all concerned.”

AWS has also announced that South Africa’s Standard Bank has decided to use its services to migrate production workloads onto the public cloud provider’s systems. These include many customer-facing platforms and banking apps.

Subject to regulations, the migration will ideally take place across all banking departments including personal banking and corporate investment banking. The firm will also adopt AWS’ data analytics and machine learning tools, to automate financial operations and generally improve web and mobile apps used by its customers.

“Standard Bank Group has been a trusted financial institution for more than 150 years. We look forward to working closely with them as they become Africa’s first bank in the cloud, leveraging AWS to innovate new services at a faster clip, maintain operational excellence, and provide secure banking services to customers around the world,” said Andy Jassy, CEO of AWS.

An AWS cloud centre of excellence will also be created within the bank, featuring a team dedicated exclusively to the public cloud migration. The centre will build training and certification programmes within the firm to boost employees’ digital skills, and will go one step further with an educational and digital skills programme to be launched across South Africa.

View from the airport: DataWorks Summit 2019

Keumars Afifi-Sabet

22 Mar, 2019

Are you from the Hortonworks side or the Cloudera side? It’s a question I found myself asking a lot at this year’s DataWorks Summit, the first major event since the two companies completed their $5.2 billion merger just months ago. Naturally, a marriage of this scale throws up a tidal wave of questions. Unfortunately, there were no answers to be found in Barcelona.

It’s difficult to put my finger on the mood in the air, but it was closest to uncertainty. Given that DataWorks Summit has conventionally been a Hortonworks event, having the ‘new’ Cloudera spearhead it was jarring. Not just for the press, but the comms team. The reason? The May-time Washington DataWorks Summit, as well as Cloudera’s two Strata Data Conferences had already been planned and organised months before the merger was tied up. So the company has to effectively go through the motions with its 2019 events.

But it was especially confusing given the Hortonworks branding appears to have been discarded entirely. Instead, the two companies, now operating under the Cloudera umbrella, have undergone a complete image refresh, with a newly-designed logo and several buzzy slogans to boot.

A new image is always something to get excited about. But the fact Cloudera was handing out metal pins emblazoned with the company’s old logo summed up the feeling quite effectively. Its Twitter page, too, is still displaying the company’s old logo at the time of writing.

Meanwhile, the event was fronted by the firm’s chief marketing officer (CMO) Mick Hollison. This underpinned the company’s almost singular focus on ‘image’ this week, which on one level made sense. Earnings day the week before made for grim reading. Revenue grew, sure. But so did expenditure, by quite a lot. This doubled losses to more than $85 million. Yet Cloudera is setting itself a target of becoming a billion dollar company before the end of the year, and reinforced its ambitions to target only the largest companies.

But it didn’t seem appropriate that a significant portion of the top brass was left at home. Anybody who could give serious answers about Cloudera’s financial performance, or specific details about the merger, was not available to chat. It hit me during the main keynote, when it became clear CMO Hollison would be the only Cloudera voice addressing the press, analysts and delegates that morning. At Cloudera’s first major public event since the merger, it raised the question: where was the CEO?

It’s not fair to say that everybody with prominence was left at home. Hilary Mason, Cloudera’s resident data scientist and the lead on its research division, dazzled on the evolving nature of AI. Meanwhile, there were some interesting insights to gain on data warehousing, open source, and GDPR. The thematic substance of DataWorks Summit 2019 was actually quite positive despite the company’s considered efforts to push its new marketing slogans, namely ‘from the edge to AI’ and ‘the enterprise data cloud’.

But the merger, undoubtedly, was at the forefront of everyone’s minds, with many questions lingering. Though now that it has mostly been completed, it was interesting to hear discussions with Hortonworks were actually underway for three-and-a-half years before the two firms tied the knot.

Yet we still don’t fully know what its flagship service, named the Cloudera Data Platform (CDP), will look like. We do, however, know it’s a mash-up of Hortonworks and Cloudera’s legacy systems, Cloudera Distribution Including Apache Hadoop (CDH) and Hortonworks Data Platform (HDP).

Neither do we know when this will launch, with Cloudera officially saying it will come within the next two quarters. But one customer, Swiss insurance firm Zurich, told Cloud Pro it was coming in June. And while customers are allowed to keep these legacy platforms until around 2022, for Zurich, currently in the process of migrating from HDP 2.0 to 3.0, does this mean a second big transition in quick succession? The aim is, of course, to transition all customers to CDP eventually.

The future is uncertain. So much so that nobody really knows if the DataWorks Summits held in 2019 will be the last ever. Nevertheless, this presented a fantastic opportunity for Cloudera to address the world post-merger, and take on its major challenges head-on.

But this was an opportunity missed. The fact its most senior staff were left at home spoke volumes, even though the substance of the conference was for the most part engaging. It became clear over the course of the event that there hasn’t been, and probably won’t be, a honeymoon period for the ‘new’ Cloudera as it begins to find its feet in a turbulent market.

DataWorks Summit 2019: Cloudera allays post-merger fears with ‘100% open-source’ commitment

Keumars Afifi-Sabet

20 Mar, 2019

The ‘new’ Cloudera has committed to becoming a fully open-source company, having followed an open-core model prior to its $5.2 billion merger with former rival Hortonworks.

All 32 of the current open source projects found between both Hortonworks and Cloudera’s legacy platforms will remain available as cloud-based services on its new jointly-developed Cloudera Data Platform (CDP).

There were fears Cloudera’s influence could undermine the “100% open source” principles that underpinned Hortonworks, given the former had previously been just an ‘open-core’ company. This amounted to a business model in which limited versions of Cloudera projects were offered in line with open source principles, with additional features available at a cost.

Cloudera first made reassurances over its commitment to open source on a conference call with journalists last week. This call was made to explain the firm’s dismal Q4 2018 financial results which saw the company’s net losses double post-merger to $85.5m.

The commitment, which Cloudera elaborated on at the company’s DataWorks Summit 2019, hosted in Barcelona this week, has also coincided with a complete rebranding of the company logo and further detail on its vision for an ‘enterprise data cloud’.

This, according to the firm’s chief marketing officer Mick Hollison, includes multi-faceted data analytics and support for every conceivable cloud model, from multiple public clouds to hybrid cloud to container platforms such as Kubernetes.

It would also be underpinned with a common compliance and data governance regime, and would retain a commitment to “100% open source”, with Hollison insisting several times to journalists at a press briefing the term “isn’t just marketing fluff”.

Cloudera’s vice president for product management Fred Koopmans told journalists at the same press briefing that both companies’ existing customers valued the principles of ‘openness’ – which starts with open APIs.

“They don’t view that there is one vendor that’s going to serve all of their needs today and in the future,” Koopmans said. “Therefore it’s critical for them to have open APIs so they can bring in other software development companies that can extend it and enhance the platform.

“What open source provides them is no dead-ends; if they’re trying to develop something and there’s a particular feature they need, they always have the option of going and adding a feature with their own development team. So this is a huge driver for a lot of our larger customers in particular.”

Cloudera also used the DataWorks Summit to outline its intentions to exclusively chase the biggest enterprise customers, insisting the firm is only interested in tackling big data problems for large companies.

CDP, the embodiment of the new vision, is due to reach customers later this year as a public cloud-only platform, with a private cloud iteration to follow in late 2019 or early 2020. The platform is a mashing-together of Cloudera’s Cloudera Distribution Including Apache Hadoop (CDH) and Hortonworks’ Hortonworks Data Platform (HDP).

Microsoft open-sources Azure compression technology

Keumars Afifi-Sabet

15 Mar, 2019

Microsoft hopes that open sourcing the compression technology embedded in its Azure cloud servers will pave the way for the technology’s adoption into a range of other devices.

The company is making the algorithms, hardware design specifications and the source code behind its compression tech, dubbed Project Zipline, available for manufacturers and engineers to integrate into silicon components.

Microsoft announced this move to mark the start of the Open Compute Project’s (OCP) annual summit. Microsoft is a prominent member of the programme, which was started by Facebook in 2011 and includes the likes of IBM, Intel, and Google.

Project Zipline is being released to the OCP to combat the challenges posed by an exploding volume of data that exists in the ‘global datasphere’, in both private and public realms, the company said. Businesses are also increasingly finding themselves burdened with mountains of internal data that should be better managed and utilised.

“The enterprise is fast becoming the world’s data steward once again,” said Microsoft’s general manager for Azure hardware infrastructure Kushagra Vaid.

“In the recent past, consumers were responsible for much of their own data, but their reliance on and trust of today’s cloud services, especially from connectivity, performance, and convenience perspectives, continues to increase and the desire to store and manage data locally continues to decrease.

“We are open sourcing Project Zipline compression algorithms, hardware design specifications, and Verilog source code for register transfer language (RTL) with initial content available today and more coming soon.

“This contribution will provide collateral for integration into a variety of silicon components (e.g. edge devices, networking, offload accelerators etc.) across the industry for this new high-performance compression standard.”

According to the firm, the compression algorithm yields compression ratios up to twice as high as the widely used Zlib-L4 64KB compression model. Contributing RTL at this level of detail, Vaid added, sets a new precedent for frictionless collaboration and can open the door for hardware innovation at the silicon level.
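Project Zipline itself ships as hardware designs and RTL rather than anything runnable here, but the metric behind that claim is straightforward. A minimal sketch using Python’s standard-library zlib (a software cousin of the Zlib model Microsoft benchmarks against, not Zipline itself) shows how a compression ratio is measured; the sample data and level are illustrative assumptions:

```python
import os
import zlib

def compression_ratio(data: bytes, level: int = 6) -> float:
    """Uncompressed size divided by compressed size; higher is better."""
    compressed = zlib.compress(data, level)
    return len(data) / len(compressed)

# Telemetry-like data with heavy repetition compresses very well...
repetitive = b'{"sensor": "press-01", "status": "ok"}\n' * 1000
print(round(compression_ratio(repetitive), 1))

# ...while already-random data barely compresses at all.
random_blob = os.urandom(40_000)
print(round(compression_ratio(random_blob), 2))
```

The ratio depends heavily on the input: repetitive telemetry can shrink many times over, while random data can even grow slightly, which is why vendors quote ratios against a named baseline and window size, as Microsoft does with Zlib-L4 64KB.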

Members of the OCP will be able to run their own Project Zipline trials and contribute to the further development of the algorithm, and its hardware specifications.

Microsoft hopes its technology will in future be integrated into a variety of silicon components and devices, ranging from smart SSDs and archival systems to cloud appliances, as well as IoT and edge devices.

Making its compression technology available represents Microsoft’s latest contribution to OCP, more than five years after the company first began contributing to the open source project. Incremental contributions have been made ever since, with the company, for instance, delivering its Open CloudServer specs to the project in October 2014.

Google fixes ‘highly severe’ zero-day Chrome exploit

Keumars Afifi-Sabet

7 Mar, 2019

Google has confirmed that a Chrome browser patch released last week was a fix for a critical flaw that was being exploited by criminals to inject malware onto a user’s device.

The company is urging Chrome users to immediately update their web browsers to the latest version, released last week, in light of the discovery of a zero-day vulnerability rated ‘highly severe’.

The flaw, tracked as CVE-2019-5786, is a memory mismanagement bug in Chrome’s FileReader, an API included in all major web browsers that allows web apps to read files stored on a user’s device.

It is a ‘use-after-free’ error, meaning the browser attempts to access memory after it has been freed and returned to Chrome’s allocator; through this mechanism, the flaw could lead to the execution of malicious code.
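Python, being memory-safe, cannot reproduce a genuine use-after-free, but a rough analogy with a stale reference to a closed stream shows the shape of the bug (the `io.StringIO` object here is purely illustrative, not Chrome’s code). In C++, Chrome’s language, the equivalent stale pointer would not raise an error; it would silently read whatever now occupies the freed memory, which an attacker can arrange to control:

```python
import io

# Analogy only: Python fails loudly where C++ would fail silently.
f = io.StringIO("file contents")
stale_read = f.read   # keep a second reference to the reader
f.close()             # the object's resources are released ('freed')

try:
    stale_read()      # a use-after-free shaped access
except ValueError as err:
    print("caught:", err)
```

In a memory-unsafe language there is no exception to catch: the stale access just returns attacker-influenced bytes, which is the first step towards the remote code execution described below.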

“According to the official release notes, this vulnerability involves a memory mismanagement bug in a part of Chrome called FileReader,” said Sophos’ security proselytiser Paul Ducklin.

“That’s a programming tool that makes it easy for web developers to pop up menus and dialogues asking you to choose from a list of local files, for example when you want to pick a file to upload or an attachment to add to your webmail.”

“When we heard that the vulnerability was connected to FileReader, we assumed that the bug would involve reading from files you weren’t supposed to. Ironically, however, it looks as though attackers can take much more general control, allowing them to pull off what’s called Remote Code Execution.”

This breed of attack means cyber criminals could inject malware onto unsuspecting users’ machines without any warning, or seize full control of a device.

The vulnerability was discovered by Clement Lecigne of Google’s threat analysis group on 27 February. Google’s technical program manager Abdul Syed said that the company has become aware of active exploits in the wild, but provided no further information as to the nature of these or who had been targeted.

Google initially released the fix on Friday 1 March, but updated its original announcement to provide further details around the flaw.

Exclusive: European Microsoft 365 outage sent Department for Education’s IT into “meltdown”

Keumars Afifi-Sabet

6 Feb, 2019

The Department for Education (DfE) endured a 12-hour IT nightmare as a result of last month’s European-wide Microsoft 365 outage.

The government department’s IT systems were paralysed on 24 January, with more than 6,000 of its employees locked out of their cloud-based Microsoft and email accounts, according to a DfE source, Cloud Pro has learnt.

Crisis meetings were held throughout the day as officials grappled with the consequences of a department-wide outage that was entirely out of their hands, and unexplained at the time.

The civil servant, who requested not to be named, also confirmed that colleagues were forced to share confidential documents using Skype’s instant messaging.

“It beggars belief that we were locked out of email for an entire day, the whole department was in meltdown,” the source said.

A DfE spokesperson confirmed the department’s systems were partially disrupted by the European-wide Microsoft outage on 24 January, and that contingency plans were put in place to mitigate these effects.

“The Department for Education was one of many organisations impacted by Microsoft’s Outlook issues on Thursday 24 January,” the spokesperson told Cloud Pro. “The impact of disruption to email services was managed and services resumed within 24 hours.”

Staff used “smarter working technology” to continue delivering services as smoothly as possible, while “normal business continuity arrangements” were deployed to minimise the impact of disruption to mail services. The spokesperson would not confirm whether the ‘business continuity arrangements’ existed prior to 24 January. The department’s video conferencing and shared documents services were unaffected.

The Microsoft 365 outage struck organisations from 9:30am on 24 January, with firms across the continent experiencing severe IT difficulties. Microsoft acknowledged that it was experiencing problems with its services, and its engineers restored services at around 8pm the same evening.

“This incident underlines the very real risk authentication delays can have on critical email systems, disrupting government business and preventing officials from sharing confidential information securely,” said Centrify’s vice president John Andrews.

“With rising levels of cyber attacks, it’s vital that all departments ensure privileged access to confidential data is a major priority, so that systems are protected from outsider threats at all times.”

The incident demonstrates just how dependent massive organisations, including critical government services, are on third-party cloud vendors to provide an undisrupted service, at the risk of organisational paralysis when that service fails.

Microsoft also suffered a global authentication-related outage four days later, with users in nations across the world, including the US and Japan, unable to log in to critical cloud-based services.

Global Microsoft outage leaves users unable to log in

Keumars Afifi-Sabet

30 Jan, 2019

A host of Microsoft’s cloud services including Azure Government Cloud and LinkedIn sustained a global authentication outage just a few days after users were blocked from accessing Office 365 in Europe.

Users in parts of Europe, the US, as well as Australia and Japan were blocked from logging into their services between 9pm GMT yesterday and the early hours of this morning due to authentication issues.

A host of Microsoft Cloud services including Dynamics 365 and Office 365, as well as US Government cloud resources, were out of action for a few hours due to problems with its authentication infrastructure.

According to the outage detection service Downdetector, the issue may have affected a wide range of services including Skype, OneDrive and Office 365, which all experienced spikes in problem reports at roughly the same time. Users also complained across social media about difficulties logging into these platforms.

The issue, which has now been resolved, affected users attempting to log into new sessions, with the Azure status page attributing it, after an investigation, to an external DNS provider. Microsoft says that engineers mitigated the outages by failing over CenturyLink DNS services to an alternative provider.

These issues were resolved shortly after midnight this morning, but lasted at least a few hours, predominantly affecting users in the Eastern hemisphere who were getting into the crux of their working days.

The global outage arose just five days after Microsoft customers were unable to access their Office 365 accounts for a full working day in Europe.

The company confirmed on Thursday, after initially maintaining that services were running smoothly, that its cloud-powered productivity suite was experiencing difficulties, with the continental outage lasting around nine hours in total.

This rocky start to the new year follows a series of outages that Microsoft sustained with its cloud services in the last few months of 2018, as the Windows maker struggled to provide 100% reliability.