All posts by Keumars Afifi-Sabet

DataWorks Summit 2019: Cloudera allays post-merger fears with ‘100% open-source’ commitment


Keumars Afifi-Sabet

20 Mar, 2019

The ‘new’ Cloudera has committed to becoming a fully open-source company, having followed an open-core model prior to its $5.2 billion merger with former rival Hortonworks.

All 32 of the open source projects currently spread across Hortonworks' and Cloudera's legacy platforms will remain available as cloud-based services on the jointly developed Cloudera Data Platform (CDP).

There were fears Cloudera’s influence could undermine the “100% open source” principles that underpinned Hortonworks, given the former had previously been just an ‘open-core’ company. This amounted to a business model in which limited versions of Cloudera projects were offered in line with open source principles, with additional features available at a cost.

Cloudera first made reassurances over its commitment to open source on a conference call with journalists last week. The call was held to explain the firm's dismal Q4 2018 financial results, which saw the company's net losses double post-merger to $85.5m.

The commitment, which Cloudera expanded on at its DataWorks Summit 2019 in Barcelona this week, has coincided with a complete rebrand of the company logo and further detail on its vision for an 'enterprise data cloud'.

This, according to the firm's chief marketing officer Mick Hollison, includes multi-faceted data analytics and support for every conceivable cloud model, from multiple public clouds to hybrid cloud to container platforms such as Kubernetes.

It would also be underpinned by a common compliance and data governance regime, and would retain a commitment to “100% open source”, with Hollison insisting several times to journalists at a press briefing that the term “isn't just marketing fluff”.

Cloudera's vice president for product management Fred Koopmans told journalists at the same press briefing that both companies' existing customers valued the principles of 'openness', which starts with open APIs.

“They don’t view that there is one vendor that’s going to serve all of their needs today and in the future,” Koopmans said. “Therefore it’s critical for them to have open APIs so they can bring in other software development companies that can extend it and enhance the platform.

“What open source provides them is no dead-ends. If they're trying to develop something and there's a particular feature they need, they always have the option of going and adding a feature with their own development team. So this is a huge driver for a lot of our larger customers in particular.”

Cloudera also used the DataWorks Summit to outline its intentions to exclusively chase the biggest enterprise customers, insisting the firm is only interested in tackling big data problems for large companies.

CDP, the embodiment of the new vision, is due to reach customers later this year as a public cloud platform only, with a private cloud iteration to follow in late 2019 or early 2020. The platform combines Cloudera's Distribution Including Apache Hadoop (CDH) with the Hortonworks Data Platform (HDP).

Microsoft open-sources Azure compression technology


Keumars Afifi-Sabet

15 Mar, 2019

Microsoft hopes that open sourcing the compression technology embedded in its Azure cloud servers will pave the way for the technology’s adoption into a range of other devices.

The company is making the algorithms, hardware design specifications and the source code behind its compression tech, dubbed Project Zipline, available for manufacturers and engineers to integrate into silicon components.

Microsoft announced this move to mark the start of the Open Compute Project’s (OCP) annual summit. Microsoft is a prominent member of the programme, which was started by Facebook in 2011 and includes the likes of IBM, Intel, and Google.

Project Zipline is being released to the OCP to combat the challenges posed by an exploding volume of data that exists in the ‘global datasphere’, in both private and public realms, the company said. Businesses are also increasingly finding themselves burdened with mountains of internal data that should be better managed and utilised.

“The enterprise is fast becoming the world’s data steward once again,” said Microsoft’s general manager for Azure hardware infrastructure Kushagra Vaid.

“In the recent past, consumers were responsible for much of their own data, but their reliance on and trust of today’s cloud services, especially from connectivity, performance, and convenience perspectives, continues to increase and the desire to store and manage data locally continues to decrease.

“We are open sourcing Project Zipline compression algorithms, hardware design specifications, and Verilog source code for register transfer language (RTL) with initial content available today and more coming soon.

“This contribution will provide collateral for integration into a variety of silicon components (e.g. edge devices, networking, offload accelerators etc.) across the industry for this new high-performance compression standard.”

According to the firm, the compression algorithm yields compression ratios up to twice as high as the widely used Zlib-L4 64KB compression model. Contributing RTL at this level of detail, Vaid added, sets a new precedent for frictionless collaboration and can open the door for hardware innovation at the silicon level.
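For a rough sense of how such a comparison is made, the sketch below measures a compression ratio against a zlib level-4 software baseline using Node.js. It is illustrative only: Project Zipline itself is a hardware design rather than a library, the sample file name is a placeholder, and the 64KB block size of the cited baseline is not reproduced here.

// Rough sketch: measuring a compression ratio against a zlib level-4 baseline (TypeScript on Node.js).
// Project Zipline is not available as a software library, so only the baseline side is shown.
import { deflateSync } from 'zlib';
import { readFileSync } from 'fs';

const original = readFileSync('sample.bin');             // placeholder sample data
const compressed = deflateSync(original, { level: 4 });  // zlib at compression level 4

const ratio = original.length / compressed.length;
console.log(`zlib level-4 compression ratio: ${ratio.toFixed(2)}:1`);

Whatever ratio this baseline produces on a given data set, Microsoft's claim is that Zipline can roughly double it on the company's own workloads.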

Members of the OCP will be able to run their own Project Zipline trials and contribute to the further development of the algorithm, and its hardware specifications.

Microsoft hopes that its technology will be integrated into a variety of silicon components and devices in the future. These could range from smart SSDs and archival systems to cloud appliances, as well as IoT and edge devices.

Making its compression technology available represents Microsoft’s latest contribution to OCP, more than five years after the company first began contributing to the open source project. Incremental contributions have been made ever since, with the company, for instance, delivering its Open CloudServer specs to the project in October 2014.

Google fixes ‘highly severe’ zero-day Chrome exploit


Keumars Afifi-Sabet

7 Mar, 2019

Google has confirmed that a Chrome browser patch released last week was a fix for a critical flaw that was being exploited by criminals to inject malware onto a user’s device.

The company is urging Chrome users to immediately update their web browsers to the latest version, released last week, in light of the discovery of a zero-day vulnerability rated ‘highly severe’.

The flaw, tracked as CVE-2019-5786, is a memory mismanagement bug in Chrome's FileReader, an API included in all web browsers that allows web apps to read files stored on a user's device or PC.

As a 'use-after-free' error, the bug causes Chrome to access memory after it has been freed from the browser's allocated pool, a mechanism that could lead to the execution of malicious code.
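For context, the sketch below shows how a web page ordinarily uses the FileReader API to read a file the user has selected; it illustrates the API in which the flaw sits, not the exploit itself.

// Minimal sketch of typical FileReader usage (TypeScript running in a browser page).
const input = document.querySelector<HTMLInputElement>('input[type="file"]');

input?.addEventListener('change', () => {
  const file = input?.files?.[0];
  if (!file) return;

  const reader = new FileReader();

  // Runs once the file's contents have been read into memory.
  reader.onload = () => {
    const contents = reader.result as ArrayBuffer;
    console.log(`Read ${contents.byteLength} bytes from ${file.name}`);
  };
  reader.onerror = () => console.error('Failed to read file', reader.error);

  // Asynchronously read the chosen file into an ArrayBuffer.
  reader.readAsArrayBuffer(file);
});

In a use-after-free, it is the browser's internal bookkeeping for objects like this reader, rather than the page's own code, that ends up touching memory after it has been released.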

“According to the official release notes, this vulnerability involves a memory mismanagement bug in a part of Chrome called FileReader,” said Sophos’ security proselytiser Paul Ducklin.

“That’s a programming tool that makes it easy for web developers to pop up menus and dialogues asking you to choose from a list of local files, for example when you want to pick a file to upload or an attachment to add to your webmail.”

“When we heard that the vulnerability was connected to FileReader, we assumed that the bug would involve reading from files you weren’t supposed to. Ironically, however, it looks as though attackers can take much more general control, allowing them to pull off what’s called Remote Code Execution.”

This breed of attack means cyber criminals could inject malware onto unsuspecting users’ machines without any warning, or seize full control of a device.

The vulnerability was discovered by Clement Lecigne of Google's Threat Analysis Group on 27 February. Google's technical program manager Abdul Syed said the company is aware of active exploits in the wild, but provided no further information as to their nature or who had been targeted.

Google initially released the fix on Friday 1 March, but updated its original announcement to provide further details around the flaw.

Exclusive: European Microsoft 365 outage sent Department for Education’s IT into “meltdown”


Keumars Afifi-Sabet

6 Feb, 2019

The Department for Education (DfE) endured a 12-hour IT nightmare as a result of last month’s European-wide Microsoft 365 outage.

The government department’s IT systems were paralysed on 24 January, with more than 6,000 of its employees locked out of their cloud-based Microsoft and email accounts, according to a DfE source, Cloud Pro has learnt.

Crisis meetings were held throughout the day as officials scrambled to deal with the consequences of a department-wide outage that was entirely out of their hands, and unexplained at the time.

The civil servant, who requested not to be named, also confirmed that colleagues were forced to share confidential documents using Skype’s instant messaging.

“It beggars belief that we were locked out of email for an entire day, the whole department was in meltdown,” the source said.

A DfE spokesperson confirmed the department’s systems were partially disrupted by the European-wide Microsoft outage on 24 January, and that contingency plans were put in place to mitigate these effects.

“The Department for Education was one of many organisations impacted by Microsoft’s Outlook issues on Thursday 24 January,” the spokesperson told Cloud Pro. “The impact of disruption to email services was managed and services resumed within 24 hours.”

Staff used “smarter working technology” to continue delivering services as smoothly as possible, while “normal business continuity arrangements” were deployed to minimise the impact of disruption to mail services. The spokesperson would not confirm whether these business continuity arrangements existed prior to 24 January. The department's video conferencing and shared documents services were unaffected.

The Microsoft 365 outage struck organisations from 9:30am on 24 January, with firms across the continent experiencing severe IT difficulties. Microsoft acknowledged that it was experiencing problems with its services, and engineers restored them at around 8pm the same evening.

“This incident underlines the very real risk authentication delays can have on critical email systems, disrupting government business and preventing officials from sharing confidential information securely,” said Centrify’s vice president John Andrews.

“With rising levels of cyber attacks, it’s vital that all departments ensure privileged access to confidential data is a major priority, so that systems are protected from outsider threats at all times.”

The incident demonstrates just how dependent massive organisations, including critical government services, are on third-party cloud vendors to provide an uninterrupted service, and the organisational paralysis they risk when that service fails.

Microsoft also suffered a global authentication-related outage five days later, with users in nations across the world, including the US and Japan, unable to log in to critical cloud-based services.

Global Microsoft outage leaves users unable to log in


Keumars Afifi-Sabet

30 Jan, 2019

A host of Microsoft’s cloud services including Azure Government Cloud and LinkedIn sustained a global authentication outage just a few days after users were blocked from accessing Office 365 in Europe.

Users in parts of Europe, the US, as well as Australia and Japan were blocked from logging into their services between 9pm GMT yesterday and the early hours of this morning due to authentication issues.

A host of Microsoft Cloud services including Dynamics 365 and Office 365, as well as US Government cloud resources, were out of action for a few hours due to problems with its authentication infrastructure.

According to the outage detection service Downdetector, the issue may have affected a wide range of services including Skype, OneDrive, Office 365, and Outlook.com, which all experienced spikes in problem reports at roughly the same time. Users also complained across social media about difficulties logging into these platforms.

The issue, which has now been resolved, affected users attempting to log into new sessions, with the Azure status page indicating after an investigation that it concerned an external DNS provider, Level 3 (a CenturyLink service). Microsoft says that engineers mitigated the outages by failing over the affected DNS services to an alternative provider.

These issues were resolved shortly after midnight this morning, but lasted at least a few hours, predominantly affecting users in the Eastern hemisphere who were just getting into their working days.

The global outage arose just five days after Microsoft customers were unable to access their Office 365 accounts for a full working day in Europe.

The company confirmed on Thursday, after initially maintaining that services were running smoothly, that its cloud-powered productivity suite was experiencing difficulties, with the continental outage lasting around nine hours in total. 

This rocky start to the new year follows a series of outages that Microsoft sustained with its cloud services in the last few months of 2018, as the Windows maker struggled to provide 100% reliability.

AWS launches DocumentDB in a blow to open source


Keumars Afifi-Sabet

10 Jan, 2019

Amazon Web Services (AWS) has launched a managed document database service fully compatible with the widely-used open source software MongoDB.

Amazon DocumentDB, touted as a fast and scalable document database designed to be compatible with existing MongoDB apps and tools, has been built by AWS from the ground up but mirrors the interface of the technology developed by the $4.4 billion open source company.

The move is seen as a kick in the teeth for open source, coming after MongoDB recently adopted a new licence, the Server Side Public License, to govern third-party commercial use of its software. The change aimed to put a stop to large vendors exploiting the firm's freely available technology.

AWS' managed database will offer high performance and bring newfound scalability to managed databases, AWS chief evangelist Jeff Barr announced in a blog post, with capacity climbing from a base of 10GB up to 64TB in 10GB increments.

“To meet developers’ needs, we looked at multiple different approaches to supporting MongoDB workloads,” said AWS vice president for non-relational databases Shawn Bice. “We concluded that the best way to improve the customer experience was to build a new purpose-built document database from the ground up, while supporting the same MongoDB APIs that our customers currently use and like.

“This effort took more than two years of development, and we’re excited to make this available to our customers today.”

AWS says its latest product offers users the capacity to build “performant, highly available applications that can quickly scale to multiple terabytes and hundreds of thousands of reads and writes-per-second”.

The firm added that customers have found using MongoDB inconvenient due to the complexities that came with setting up and managing MongoDB clusters at scale.

DocumentDB uses a purpose-built SSD-based storage layer, with a six-way replication across three availability zones. The storage layer is distributed and self-healing, giving it the qualities needed to run production-scale workloads, Barr added.

AWS’ newly-announced service will fully support MongoDB workloads on version 3.6, with customers also able to migrate their MongoDB datasets to DocumentDB, after which they’ll pay a fee for the capacity they use.

Amazon DocumentDB essentially implements the Apache 2.0 open source MongoDB 3.6 application programming interface (API) by emulating the responses that a MongoDB client would expect from a MongoDB server.
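The sketch below illustrates that compatibility model: an unmodified MongoDB client library pointed at a DocumentDB cluster endpoint. The endpoint, credentials and database names are placeholders rather than real values, and a production connection would also need to supply Amazon's CA certificate bundle.

// Minimal sketch: an existing MongoDB client library talking to a DocumentDB endpoint (TypeScript on Node.js).
// All connection details below are hypothetical placeholders.
import { MongoClient } from 'mongodb';

const uri = 'mongodb://exampleuser:examplepass@example-cluster.docdb.amazonaws.com:27017/?tls=true';

async function main(): Promise<void> {
  const client = new MongoClient(uri);
  await client.connect();

  // Ordinary MongoDB API calls; DocumentDB emulates the responses
  // a MongoDB 3.6 server would return.
  const orders = client.db('shop').collection('orders');
  await orders.insertOne({ item: 'widget', quantity: 3 });
  console.log(await orders.findOne({ item: 'widget' }));

  await client.close();
}

main().catch(console.error);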

DocumentDB's six-way storage replication will also ensure data can fail over to a healthy copy within 30 seconds of a fault being detected. Meanwhile, it'll give customers the option to encrypt their active data, snapshots, and replicas, with authentication enabled by default.

Version 3.6 of MongoDB is now well over a year old, having been released in November 2017, while the latest release, MongoDB 4.0.5, arrived in December 2018 with several new features and faster performance.

The two companies previously clashed in April 2017, when AWS extended its Database Migration Service (DMS) to cover the migration of MongoDB NoSQL databases. At the time, DynamoDB only worked with AWS, whereas MongoDB's own service retained compatibility with a plethora of cloud providers.

Mozilla planning revamped Thunderbird for 2019


Keumars Afifi-Sabet

3 Jan, 2019

Mozilla has announced that its open source email client Thunderbird will benefit from a redesigned user interface (UI) and better Gmail support within the next year.

As part of its roadmap for 2019, the firm will grow its team by half a dozen members, from eight to 14 engineers, in order to make the service faster, more secure, and improve the user experience (UX).

Announcing the plans in a blog post, the Firefox developer said it will build on the progress made with the release of Thunderbird 60 in August, which saw major upgrades to its core code and improvements to security and stability.

“We heard from users who upgraded and loved the improvements, and we heard from users who encountered issues with legacy add-ons or other changes that hurt their workflow,” said Thunderbird community manager Ryan Sipes.

“We listened, and will continue to listen. We’re going to build upon what made Thunderbird 60 a success, and work to address the concerns of those users who experienced issues with the update.

“Hiring more staff will go a long way to having the manpower needed to build even better releases going forward.”

Mozilla will prioritise Thunderbird’s design and UX improvements, after receiving “considerable feedback” and complaints, with a primary focus on improving compatibility with Google’s Gmail.

This, specifically, will see better support for Gmail’s labels, a way to categorise messages, and improvements to how Gmail-specific features translate to the Thunderbird client.

Among the project's engineering priorities for the new year will be looking into methods for measuring slowness, and developing fixes for specific bugs that degrade the user experience.

The new staff members will also be put to work re-writing parts of the core code and “working toward a multi-process Thunderbird”.

The client’s notifications and encryption settings will also benefit from an overhaul, the firm confirmed.

Thunderbird will seek to integrate its own notifications with a user’s operating system, while Mozilla will allow users to more easily secure their communications after an engineer was recently hired with a specific remit over security.

Mozilla hasn’t yet completely determined its roadmap, and wouldn’t guarantee that all changes outlined, including the UI redesign, would be available in the next Thunderbird release.

Microsoft 365 offers new bundles for cyber security and GDPR compliance


Keumars Afifi-Sabet

3 Jan, 2019

Microsoft has added two new security-centric subscription bundles to its Microsoft 365 enterprise suite, to be made available from 1 February 2019.

The new 'Identity & Threat Protection' and 'Information Protection & Compliance' packages are aimed at enterprise customers unable to commit to the most high-end Microsoft 365 subscriptions, but who still want to benefit from the firm's security and compliance tools.

The former will include Microsoft Threat Protection, which comprises Azure Advanced Threat Protection (ATP), Windows Defender ATP and Office 365 ATP including Threat Intelligence. Priced at $12 per user per month, it will also offer Microsoft Cloud App Security, and Azure Active Directory.

The latter package, which is more geared towards assisting customers with compliance needs such as handling requests under the European Union’s General Data Protection Regulation (GDPR), will be priced at $10 per user per month.

This package combines Office 365 Advanced Compliance and Azure Information Protection, and is designed to aid chief compliance officers with ongoing risk assessments within their organisations. The package will also automatically classify and protect sensitive data and use AI to respond to regulatory requests.

“A big driver of customer adoption of Microsoft 365 is the need for security and compliance solutions in an age of increasingly sophisticated cybersecurity threats, as well as complex information protection needs due to regulations like the GDPR,” said corporate vice president for Microsoft 365 Ron Markezich.

“As we speak to customers about the future of work, we know security and compliance are some of the highest organizational priorities and we hope these new offerings will help them achieve their security and compliance goals.”

These products already exist as part of the Microsoft 365 E5 suite, the most high-end subscription bundle available, but are being made available to existing and prospective customers on a standalone basis.

Microsoft confirmed that neither the price, nor composition, of its existing packages will change.

Matt Hancock says every GP practice must offer Skype appointments by 2024


Keumars Afifi-Sabet

2 Jan, 2019

The health secretary, Matt Hancock, has called on every GP practice in England to offer digital appointments within five years as part of plans to shake-up an ‘outdated’ IT setup.

With the IT market dominated by just two providers, a new GP IT Futures framework will open up the NHS to investment and encourage competition, Hancock said.

By 2023 to 2024, every patient should be able to access GP services digitally, with practices offering online and video consultations as standard, through services such as Skype or Google Hangouts. This will continue alongside clinicians seeing patients in person.

Executing the plans will free up staff time, and reduce delays by hastening the flow of data between GP practices, hospitals and social care institutions, according to the Department of Health and Social Care (DHSC).

The framework will also examine how patient data can be migrated to cloud services, and how both patients and clinicians can access this data securely in real-time.

“Too often the IT used by GPs in the NHS – like other NHS technology – is out of date. It frustrates staff and patients alike, and doesn’t work well with other NHS systems. This must change,” said Matt Hancock.

“I love the NHS and want to build it to be the most advanced health and care system in the world – so we have to develop a culture of enterprise in the health service to allow the best technology to flourish.

“I want to empower the country’s best minds to develop new solutions to make things better for patients, make things better for staff, and make our NHS the very best it can be.”

New standards, to be developed by NHS Digital, will introduce minimum technical requirements for any systems implemented, so they are able to communicate easily and securely, and be upgradeable.

The DHSC says it will seek to end existing contracts with providers which do not adhere to the minimum technical requirements.

“The next generation of IT services for primary care must give more patients easy access to all key aspects of their medical record and provide the highest quality technology for use by GPs,” said NHS Digital’s chief executive Sarah Wilkinson.

“They must also comply with our technology standards to ensure that we can integrate patient records across primary care, secondary care and social care.

“In addition, we intend to strengthen quality controls and service standards, and dramatically improve the ease with which GPs can migrate from one supplier to another.”

Hancock has dedicated a significant portion of his six-month tenure as health secretary to talking up the role technology can play in transforming the NHS, marking his start with a £500 million digital transformation pledge.

He also called for the NHS to vastly expand its library of approved apps, shortly after taking up his new role, to support both patients and clinicians in their roles.

UK cloud adoption outpacing the EU average


Keumars Afifi-Sabet

17 Dec, 2018

The UK is the sixth largest cloud user among European Union (EU) countries, up one place on four years ago, according to research.

British enterprises boast a relatively high rate of cloud adoption, with 41.9% of organisations adopting some form of cloud service, against an EU average of 26.2%. Cloud services were mostly used for hosting email systems, and storing electronic files.

The UK is only beaten by a handful of predominantly Scandinavian nations, including Denmark in third place with 55.6%, Sweden in second place with 57.2%, and Finland leading the pack with 65.3%.

Statistics published by Eurostat show that UK organisations are outpacing the rest of Europe, on average, with UK cloud adoption up 17.9 percentage points on 2014, against a relatively modest EU-wide average increase of 7.2 points.

Meanwhile, public cloud usage among both large organisations and SMBs in the EU-28 generally overshadows private cloud usage: 40% versus 31% for large enterprises, and 17% versus 11% for SMBs. But these statistics also point to overall cloud usage among larger businesses dwarfing cloud adoption among SMBs.

“Cloud computing is one of the strategic digital technologies considered important enablers for productivity and better services,” said authors Magdalena Kaminska and Maria Smihily.

“Enterprises use cloud computing to optimise resource utilisation and build business models and market strategies that will enable them to grow, innovate and become more competitive.

“Growth remains a condition for businesses' survival and innovation remains necessary for competitiveness. In fact, the European Commission, in the wider context of modernisation of the EU industry, develops policies that help speed up the broad commercialisation of innovation.”

Surprisingly, the rate of cloud adoption in countries such as France and Germany was considerably below average, 19% and 22% respectively, and a far cry from the host of Scandinavian leaders.

The specific reasons for adopting cloud computing technology among businesses also varied to some extent, with nearly seven out of ten enterprises using the cloud for storing files in electronic form and for email (68% and 69% respectively).

Moreover, just 23% of European businesses use cloud computing power to run their own business software, and just 29% of firms use cloud-based customer relationship management (CRM) tools and apps.

UK-based organisations' cloud usage, which also covers office software and financial or accounting apps, was generally higher than average across the board, with 77% of British organisations using the cloud for file storage, for example.

Unsurprisingly, the highest proportion of enterprises using cloud computing services was in the information and communication sector, at 64%.

This was followed by businesses in 'professional, scientific and technical activities', at 44%, while the rate of adoption in almost all other sectors languished between 20% and 33%.