‘Software glitch’ to blame for global Cloudflare outage


Keumars Afifi-Sabet

3 Jul, 2019

Cloudflare has resolved an issue that caused websites serviced by the networking and internet security firm to show 502 ‘Bad Gateway’ errors en masse for half an hour yesterday.

From 2:42pm BST the networking giant suffered a massive spike in CPU utilisation across its network, which Cloudflare blamed on a bad software deployment. The problem affected websites hosted in territories across the entire world.

Ironically, even Downdetector was knocked offline during the outage

Once this faulty deployment was rolled back, its CTO John Graham-Cumming explained, service was returned to normal operation and all domains using Cloudflare returned to normal traffic levels.

“This was not an attack (as some have speculated) and we are incredibly sorry that this incident occurred,” Graham-Cumming said.

“Internal teams are meeting as I write performing a full post-mortem to understand how this occurred and how we prevent this from ever occurring again.”

The incident affected several massive industries, including cryptocurrency markets, with users unable to properly access services such as CoinMarketCap and Coinbase.

Cloudflare issued an update last night suggesting the global outage was caused by the deployment of just one misconfigured rule within the Cloudflare Web Application Firewall (WAF) during a routine deployment. The company had aimed to improve the blocking of inline JavaScript used in cyber attacks.

One of the rules it deployed caused CPU usage to spike to 100% on its machines worldwide, which subsequently led to the 502 errors seen on sites across the world. Web traffic dropped by 82% at the worst point of the outage.
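Cloudflare's fuller post-mortem later attributed the CPU exhaustion to a WAF rule containing a regular expression prone to catastrophic backtracking. A minimal illustration of that failure mode (the pattern below is purely illustrative, not Cloudflare's actual rule):

```python
import re
import time

# A rule with nested quantifiers: on an input that can never match, the
# backtracking engine tries exponentially many ways to split the run of
# 'a's before giving up, pinning a CPU core the whole time.
# (Illustrative pattern only, not the actual Cloudflare rule.)
pathological = re.compile(r"(a+)+$")

text = "a" * 20 + "b"  # only 21 characters, yet costly to reject

start = time.perf_counter()
result = pathological.match(text)
elapsed = time.perf_counter() - start

print(f"match={result}, rejected {len(text)} chars in {elapsed:.3f}s")
# Each extra 'a' roughly doubles the work, so slightly longer inputs
# become effectively impossible to reject in reasonable time.
```

Because the work grows exponentially with input length, a single such rule evaluated against ordinary web traffic is enough to saturate every core it runs on, which matches the global 100% CPU picture described above.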

“We were seeing an unprecedented CPU exhaustion event, which was novel for us as we had not experienced global CPU exhaustion before,” Graham-Cumming continued.

“We make software deployments constantly across the network and have automated systems to run test suites and a procedure for deploying progressively to prevent incidents.

“Unfortunately, these WAF rules were deployed globally in one go and caused today’s outage.”

At 3:02pm BST the company realised what was going on and issued a global kill on the WAF Managed Rulesets which dropped CPU back to normal levels and restored traffic, before fixing the issue and re-enabling the Rulesets approximately an hour later.

Many on social media were speculating during the outage that the 502 Bad Gateway errors may be the result of a distributed denial-of-service (DDoS) attack. However, these suggestions were fairly quickly quashed and confirmed to be untrue by the firm.

Why CEOs crave digital transformation results – and the greater impact on business growth

Digital transformation fuels upside market opportunities, and related growth goals are now the CEO's top business priority, according to the latest worldwide study by Gartner. Moreover, a growing number of CEOs will focus more on financial priorities – especially profitability improvement.

The annual survey of CEO and senior business executives in the fourth quarter of 2018 examined their business issues, as well as some areas of technology agenda impact. In total, 473 business leaders of companies with $50 million or more – and 60 percent with $1 billion or more – in annual revenue were qualified and surveyed.

Digital business market development 

"After a significant fall last year, mentions of growth increased this year to 53 percent, up from 40 percent in 2018," said Mark Raskino, vice president at Gartner. "This suggests that CEOs have switched their focus back to tactical performance as clouds gather on the horizon."

The survey results showed that a popular solution is to look to other geographic locations for growth. Responses mentioned other cities, states, countries and regions, and mentions of 'new markets' often implied geographic reach too, although a new market can also be industry-related or virtual.

Twenty-three percent of CEOs see significant impacts arising from recent developments in tariffs, quotas and other forms of trade controls. Another 58 percent of CEOs have general concerns about this issue, suggesting that more CEOs anticipate it might impact their businesses in the future.

Another way that CEOs seem to be confronting softening growth prospects and weakening margins is to seek diversification — which increasingly means the application of 'digital business' to offer new products and revenue-producing channels.

Eighty-two percent of Gartner's survey respondents agreed that they had a management initiative or transformation program underway to make their companies more digital — that's up from 62 percent in 2018.

Cost management has risen in CEO priorities. When asked about their cost-control methods, 27 percent of respondents cited technology enablement, securing the third spot after measures around people and organisation, such as bonuses and expense or budget management.

However, when asked to consider productivity and efficiency actions, CEOs were much more inclined to think of digital business technology as a tool. Forty-seven percent of respondents mentioned technology as one of their top two ways to improve productivity.

According to the Gartner assessment, digital business planning must include the whole executive committee. However, the survey results showed that CEOs are concerned that some of the executive roles do not possess strong or even sufficient digital skills to face the future.

On average, CEOs believe that sales, risk, supply chain and human resource officers are most in need of more digital expertise. And, once all executive leaders are more comfortable with the digital sphere, new capabilities to execute on their business strategies will need to be developed.

Outlook for digital transformation skills development

When asked which organisational competencies their company needs to develop the most, 18 percent of CEOs named talent management, closely followed by technology enablement and digitalisation (17 percent) and data centricity or data management (15 percent).

"Datacentric decision-making is a key culture and capability change in a management system that hopes to thrive in the digital age. Executive leaders must be a role model to encourage and foster data centricity and data literacy in their business units and the organisation as a whole," Mr. Raskino concluded.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Equinix ploughs $1bn into building xScale data centres across Europe


Keumars Afifi-Sabet

2 Jul, 2019

Data centre firm Equinix will invest $1 billion (approximately £793 million) into building six data centres across Europe to support some of the biggest cloud players like Microsoft Azure and Google Cloud Platform (GCP).

Backed by Singapore’s sovereign wealth fund GIC, Equinix will establish new xScale facilities in key locations around Europe, including London and Paris. They will be based near, or on, the firm’s International Business Exchange (IBX) campuses and provide companies with heightened connectivity and edge computing capabilities.

Equinix is initially targeting only hyperscale companies. These, in addition to Azure and GCP, include Alibaba Cloud, Amazon Web Services (AWS) and Oracle Cloud Infrastructure, whose unique workloads the facilities are designed to support.

“It has been a long journey to reach this point, but we are tremendously excited to announce the formation of our first xScale data centers joint venture,” said Equinix president and CEO Charles Meyers.

“Partnering with a world-class investment partner like GIC will provide the opportunity to make significant capital investments in order to capture targeted large-footprint deployments while continuing to optimize our capital structure.

“The JV [joint-venture] structure will enable us to extend our cloud leadership while providing significant value to a critical set of hyperscale customers.”

Six xScale data centres, of which two are to be based in London, will allow customers to add core deployments to their existing access points so they can expand on a single platform. The facilities are also specifically engineered to meet the technical and operational requirements of hyperscale companies’ workloads.

The infrastructure will be managed and staffed by Equinix while being connected to the Equinix global platform in order to provide a non-disrupted experience for hyperscale firms.

“As hyperscale companies expand around the world, they will increasingly look to partners to provide both broad global scale and deep local knowledge when deploying data center space,” said Kelly Morgan, vice president for data centre infrastructure and services at 451 Research.

“By increasing the number of hyperscale facilities in the EMEA region, the joint venture between Equinix and GIC aims to accelerate the adoption of hybrid and multicloud as the IT architecture of choice by companies throughout the region.”

Equinix will sell its London LD10 and Paris PA8 IBX data centre facilities to the fund that manages this $1 billion joint-venture, with new xScale data centres expected to come to fruition on these sites.

New data centres will also be built in Amsterdam, London and on two sites in Frankfurt.

Gartner notes the inexorable shift of the database market to the cloud

The database market continues to shift to the cloud – and according to Gartner, three quarters of all databases will be deployed or migrated to a cloud platform by 2022.

The finding, which appears in the analyst firm’s latest report, ‘The Future of the DBMS Market is Cloud’, revealed how artificial intelligence is influencing the need for greater data usage. Only 5% of databases will even be considered for ‘repatriation’ to on-premises environments by 2022, Gartner added.

Last year, global database management system (DBMS) revenue grew 18.4% to $46 billion (£36.4bn), according to Gartner’s figures. Growth in on-premises systems, the company added, was not down to long-term strategy; instead, it came from price increases and forced upgrades undertaken to mitigate risk.

“According to inquiries with Gartner clients, organisations are developing and deploying new applications in the cloud and moving existing assets at an increasing rate, and we believe this will continue to increase,” said Donald Feinberg, Gartner distinguished research vice president. “We also believe this begins with systems for data management solutions for analytics (DMSA) use cases – such as data warehousing, data lakes, and other use cases where data is used for analytics, artificial intelligence and machine learning.

“Increasingly, operational systems are also moving to the cloud, especially with conversion to the SaaS application model,” Feinberg added.

Of total DBMS revenue growth, cloud database management systems accounted for more than two thirds (68%). Microsoft and Amazon Web Services (AWS), Gartner added, accounted for more than three quarters of that. This points to another trend: cloud service provider infrastructure is becoming the de facto data management platform.

This is again linked back to multi-cloud, and organisations having to reinforce their strategies due to complex implementations. Gartner recently found that, of respondents who were on the public cloud, more than four in five (81%) were using more than one service provider. “Ultimately what this shows is that the prominence of the CSP infrastructure, its native offerings, and the third-party offerings that run on them is assured,” added Feinberg.


AI and data analytics tech served up at Wimbledon


Bobby Hellard

1 Jul, 2019

On 1 July, the Wimbledon Championships will kick off a fortnight of grand slam tennis at the All England Lawn Tennis Club (AELTC), one of the world’s most traditional sporting events. But the competition isn’t strictly all heritage and old-fashioned values; it’s also one of the most technologically advanced sporting spectacles around.

In its 49th year, the Championship is using more artificial intelligence technology than ever before to capture the best bits. Powered by sophisticated cloud computing, the tournament now has a raft of data-powered and AI services for organisers, fans and even players to use. From automated highlights packages to performance analytics, Wimbledon is a hive of cutting-edge technology.

Data Points

For 30 years, IBM has been in partnership with Wimbledon, and since 1990 the tech giant has collected 62.8 million data points from the Championships. There are 18 courts at the AELTC, each hosting an average of four matches per day (weather permitting), and scanning through this mass of data would take humans far longer than the two weeks of the competition. Instead, IBM’s cloud platform runs in the background, capturing the data and spinning it into everything from player insights to fan-based applications.

“It’s about trying to uncover those stories,” said Sam Seddon, IBM’s client executive. “It’s about trying to help fans, in the moment, when they’re consuming sport, to really know what it takes. Which are the most exciting moments, what are they thinking, and being able to provide those insights to fans wherever they’re watching it.”

AI-powered returns

While Roger Federer may be the most successful Wimbledon player of all time, IBM’s Watson AI is the most efficient. Launched last year, the AI-powered service uses cameras and sensors to track play and create an automated highlights reel. Each bit of action is ranked by its statistical importance within the match and by the volume of cheer it receives from the crowd; even the sound of the racket on the ball is measured. At the end of each match, Watson creates a highlights package within two minutes.

For this year’s competition, players can expect to have their tense facial expressions or celebratory body language examined by Watson, which will use visual image recognition technology to capture their reactions in order to add these to the automated highlights reel. Watson can pick up anything from an agonising grimace to a celebratory roar, which will prompt the machine to automatically clip that point in the match.

“This allows us to clip the highlights package to be really tight, so it knows exactly when play is happening,” Seddon told the Telegraph. “We asked ourselves how do we create video content that’s available really quickly? What are the most exciting moments in a match? You can sit there as a digital editor in a match and make that decision yourself, or you can turn that question over to an AI system.

“Then we had to define what exciting is – well, let’s listen to how excited the crowd are, let’s look at how animated the players are, let’s analyse the data and see whether this is a turning point in the match, and use that to generate highlights.”
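IBM has not published the details of Watson's actual model, but the signals described above (crowd noise, player animation, the statistical weight of the point) suggest each point is scored and ranked. A toy sketch of that idea, with every field name and weight entirely hypothetical:

```python
from dataclasses import dataclass

@dataclass
class PointSignals:
    crowd_noise: float      # normalised 0-1 cheer volume
    player_reaction: float  # 0-1 score from visual recognition
    match_swing: float      # 0-1 statistical importance of the point

def excitement(p: PointSignals) -> float:
    """Combine the signals into a single ranking score.

    The weights here are invented for illustration; a real system
    would learn or tune them.
    """
    return 0.4 * p.crowd_noise + 0.3 * p.player_reaction + 0.3 * p.match_swing

# Rank the points of a match and keep the best for the highlights reel.
points = [
    PointSignals(0.9, 0.8, 0.7),  # break point, big roar
    PointSignals(0.2, 0.1, 0.1),  # routine rally
    PointSignals(0.6, 0.9, 0.8),  # set point, animated celebration
]
reel = sorted(points, key=excitement, reverse=True)[:2]
```

The appeal of this shape is that adding a new signal (such as the racket-strike audio the article mentions) only means adding a field and a weight, without changing the ranking machinery.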

Player-coach data sets

All these data points are not just for TV, however: IBM also provides analytics to players and coaches to help them gain insights into their performance. After each match, personal analysis is available within 20 minutes of it finishing.

All this data is monitored by 48 IBM-recruited and trained tennis experts, who capture match statistics courtside and report back with sub-second response times. IBM says that using tennis players trained on its systems to capture data ensures they read the game faster and provide data more accurately.

Former champion Andy Murray is set to make a comeback in the doubles following a serious hip injury, and thanks to similar tech provided by Catapult, Murray has been analysing his previous form and comparing it to his post-operation technique. Jozef Baker, a product specialist at Catapult, has worked closely with Murray and his strength and conditioning coach Matt Little since 2017.

“The credit for his return to play belongs entirely with Andy and his team, but we look to provide the best tools and hardware and software for Matt to make the best decisions for Andy and his health,” he told Forbes.

“We have been able to use the technology to identify tennis strokes so we have utilised some of that work and we have detected serves from historical data on Andy. We have managed to compare that to what he is doing at the moment during his recovery and from there we took conclusions from a pre-injury Murray and compared them to help Matt in building that workload.”

As such, Wimbledon may be the key event in the tennis calendar, but it also stands as a quiet showcase of some of the most cutting-edge AI and data analysis technology currently available, demonstrating that if the tech can stand up in a high-speed, intense environment, it’s poised to reap rewards for the businesses that adopt it.

Cloud database management set to soar in coming years


Connor Jones

1 Jul, 2019

The trend of analytics databases being consumed under the ever-popular software as a service (SaaS) model means 75% of all databases will be deployed or migrated to a cloud platform by 2022, according to Gartner’s latest predictions.

The IT analyst house also said that just 5% of these will ever be considered by owners to be taken back into on-premise infrastructure as businesses continue to realise the benefits of widespread cloud adoption.

“According to inquiries with Gartner clients, organisations are developing and deploying new applications in the cloud and moving existing assets at an increasing rate, and we believe this will continue to increase,” said Donald Feinberg, distinguished research vice president at Gartner.

“We also believe this begins with systems for data management solutions for analytics (DMSA) use cases — such as data warehousing, data lakes and other use cases where data is used for analytics, artificial intelligence (AI) and machine learning (ML).

“Increasingly, operational systems are also moving to the cloud, especially with conversion to the SaaS application model.”

Research from Gartner shows that worldwide revenue from database management systems rose a significant 18.4% last year to $46 billion, with cloud database management systems accounting for 68% of that growth.

The company also notes that Microsoft and AWS account for more than 75% of the total market growth, indicating a trend towards cloud service providers becoming the new data management platform.

On-premises infrastructure rarely offers built-in capabilities to support cloud integration, which is why its growth isn’t as vibrant as that of its cloud counterparts. The on-premises market is still growing, but at a much slower rate, driven not by new deployments but by price increases and forced upgrades.

“Ultimately what this shows is that the prominence of the CSP infrastructure, its native offerings, and the third-party offerings that run on them is assured,” said Feinberg. “A recent Gartner cloud adoption survey showed that of those on the public cloud, 81% were using more than one CSP.

“The cloud ecosystem is expanding beyond the scope of a single CSP — to multiple CSPs — for most cloud consumers,” he added.

The UK is adopting the cloud more than others in the EU, according to figures from Eurostat published late last year.

A sixth-place ranking among EU countries for cloud adoption is primarily due to the high rate of British enterprises using some form of cloud service.

British businesses beat the average EU country in this regard by a significant margin, with 41.9% using at least one cloud service compared to the average of 26.2% – a figure beaten only by a handful of Nordic nations, Denmark, Sweden and Finland among them.

Microsoft bids for behind-the-scenes access to Linux flaws


Keumars Afifi-Sabet

1 Jul, 2019

Microsoft has applied to join two security groups in which representatives of Linux distributions discuss and coordinate responses to vulnerabilities and security issues.

The linux-distros mailing list is used as a private channel where developers can discuss flaws in Linux systems and co-ordinate fixes for issues that have not yet reached the public domain. The oss-security group is used to discuss vulnerabilities that are already known.

Microsoft’s ‘Linux Kernel Hacker’ Sasha Levin sent an application to join the lists last week, which could see the Windows developer become party to behind-closed-doors conversations on ongoing security issues.

Members of this community include Chrome OS, Red Hat, Oracle, SUSE and Amazon Linux AMI.

There are several criteria that organisations must meet to join the linux-distros group. One is actively maintaining a Unix-like operating system distribution with open source components; Levin cited Azure Sphere and Windows Subsystem for Linux v2 as examples.

Successful applicants must also have a userbase that isn’t limited to their own organisation, a criterion Microsoft said it meets through the millions of cores its customers run on systems such as those mentioned above.

Organisations must also be able to demonstrate at least a year-long track record of fixing vulnerabilities, including some on Linux distros, and releasing fixes for known issues within 10 days or fewer.

Applicants must also gain the recommendation of an individual who has been active on oss-security for years but is not affiliated with the applying organisation. Levin copied in renowned Linux developer Greg Kroah-Hartman, who replied separately in the email chain to vouch for Microsoft’s submission.

“I can vouch for Sasha,” Kroah-Hartman said. “He is a long-time kernel developer and has been helping with the stable kernel releases for a few years now, with full write permissions to the stable kernel trees.

“I also suggested that Microsoft join linux-distros a year or so ago when it became evident that they were becoming a Linux distro, and it is good to see that they are now doing so.”

Microsoft has shifted towards embracing Linux technology and open source principles over the last few years, and increasingly under CEO Satya Nadella’s leadership. This is after its former CEO Steve Ballmer infamously referred to Linux as a “malignant cancer” and “communism” almost 20 years ago.

A significant change happened a decade ago when Microsoft released 20,000 lines of code to the Linux open source community. This led the executive director of the Linux Foundation Jim Zemlin to declare at the time that “hell has frozen over”.

To demonstrate how much Linux’s popularity has surged in recent years, Levin added in a further message to the email chain that usage of Linux on Microsoft’s Azure cloud services has now surpassed that of Windows. This comes just two years after Microsoft said that 40% of virtual machines in Azure were running Linux.

As a result of this increased usage, Microsoft’s security centre has started receiving a higher volume of reports of security issues in Linux code, both from users and from vendors.

How DevOps enables organisations to deliver true value to customers: A guide

Gone are the days when teams could work on a project for months (or even years) before releasing it to production. Now even two-week cycles are too long due to ever more demanding customer expectations and an always-growing field of competitors.

Organisations today expect to deliver new features to production on a weekly, daily or even hourly basis. This accelerated timeline lets organisations adapt to market shifts and technological changes. It keeps companies on the same pace as their competitors. Most importantly, it enables businesses to continuously deliver value to customers.

There is no question that being better, faster and more reliable is beneficial. But how can businesses achieve that? The short answer is through DevOps.

DevOps saves time by speeding delivery and eliminating idle time

Which phases of the software development life cycle take the most time? The answer, perhaps surprisingly, is none of them. The most time is wasted not in any one phase, but in between phases. Developers wait for business analysts to give them requirements. Testers wait for developers to finish their code, and developers wait for testers to tell them whether the code works. Both developers and testers wait for system administrators to deploy new releases to different environments. And sysadmins wait for everyone else to tell them why deployments failed.

The sad truth is that in many companies a significant effort is spent on tasks that do not bring value to the company. Waiting does not bring value, yet many organisations have processes in place with time built in for waiting for another team or colleague to do something. Repeating the same set of manual tests over and over again does not bring value, yet this is a normal practice for many businesses. Handing over deployments to a different department when the team in charge of an application could do it themselves by executing a single command does not bring value, yet there are often separate departments for deployment.
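The "single command" described above is typically a scripted pipeline rather than a handover to another department. A minimal, hypothetical sketch of such a self-service deploy step (the stage names and commands are placeholders, not a real toolchain):

```python
import subprocess

# Hypothetical single-command deploy pipeline. Each stage would, in
# practice, invoke the team's real build, test and rollout tooling;
# here plain echo commands stand in as placeholders.
PIPELINE = [
    ("build",   ["echo", "building image"]),
    ("test",    ["echo", "running test suite"]),
    ("release", ["echo", "rolling out to production"]),
]

def deploy() -> bool:
    """Run each stage in order; stop at the first failure so a bad
    build never reaches production."""
    for name, cmd in PIPELINE:
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode != 0:
            print(f"stage '{name}' failed")
            return False
    return True

if __name__ == "__main__":
    print("deployed" if deploy() else "deployment aborted")
```

The point is not the script itself but who runs it: when the team that owns the application can execute the whole sequence themselves, the waiting between phases disappears.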

DevOps enables seamless collaboration between teams

Who is in charge of a product? The answer is usually "no one in particular". Everyone is in charge of a particular aspect or phase of the development life cycle, but no one is in charge of a product from beginning to end. And to make things even more complicated, employees often focus solely on their assigned roles.

Developers don’t always consider how their work affects testing and deployment, since it’s not their responsibility. They may not even know how their applications are tested or deployed. The same can be said for all other departments. Sysadmins likely do not consider how much time others spend opening Jira tickets in order to deploy something. No one team considers the complete life cycle of an application, because responsibility is split among too many silos. There is a lack of empathy. Communication between teams is limited and one team does not understand how their actions affect another.

DevOps addresses the disconnect between teams by building empathy between all those involved in an application’s life cycle. The practice unites distinct teams into a single team that responds to a single product owner. It dismantles silos by creating self-sufficient teams fully in charge of everything related to their application from start to finish. Such teams are in charge of requirements, development, testing, deployment to production, and even monitoring and pager duty. They are in full control of what's happening with their application.

As a result, there is nothing to be handed over to other teams or silos. There is no need for inefficient handovers in the form of Jira tickets, emails or other administrative hurdles. DevOps is all about cultural changes aimed at creating autonomous and self-sufficient teams in charge of the entire life cycle of one or more applications. Therefore, creation of “DevOps departments” and employment of “DevOps engineers” completely misunderstands what DevOps tries to accomplish.

DevOps is not about creating more silos, nor is it about renaming existing departments as “DevOps departments”. Instead, it’s about people working together to accomplish a common goal: the successful release of new features to production.

When everyone works as a single team focused on a single product, communication improves, the need for administrative overhead decreases and ownership is established. Working together and understanding how one department’s actions affect others create empathy. As a result, productivity and quality increase, and costs and time to market decrease. 

Eliminating idle time and automating repetitive processes inevitably increases the time teams have at their disposal. The time saved, combined with better cooperation between those involved in application life cycles, allows teams to focus entirely on actions that bring value, dedicating precious time and effort to solving problems and innovating.

For real innovation to happen, everyone in the company needs to be involved. Elimination of wasted time and resources, improved collaboration and decentralised innovation are the three key ingredients that allow organisations to focus on what really matters. They allow businesses to move from trying to catch up with competitors to innovatively tackling today’s and tomorrow’s challenges.
