IBM appoints former Bank of America CTO as head of cloud business


Carly Page

7 Apr, 2020

IBM has hired former Bank of America chief technology officer (CTO) Howard Boville to head up its cloud business.

Boville, who will join the company as senior vice president of Cloud Platform, replaces Arvind Krishna, who was promoted to CEO earlier this year after Ginni Rometty stepped down from the role.

In a LinkedIn post shared with employees, Krishna wrote that Boville, who spent eight years at Bank of America designing and running the company’s cloud services, “is a proven strategist and expert in the realm of cloud and has played a critical role in developing the financial services ready public cloud with IBM.”

Boville’s appointment comes after IBM last year announced it had designed a financial services-ready public cloud by collaborating with Bank of America. IBM said it would enable independent software vendors and Software as a Service (SaaS) providers to focus on deploying their core services to financial institutions with the controls for the platform already put in place.

In his LinkedIn post, shared on his first day as CEO, Krishna also told IBM staff that in light of the global COVID-19 pandemic, “now is the time to build a fourth platform in hybrid cloud.”

“An essential, ubiquitous hybrid cloud platform our clients will rely on to do their most critical work in this century,” he said. “A platform that can last even longer than the others.

“The fundamentals are already in place. Our approach to hybrid cloud is the most flexible and the most cost effective for our clients in the long term. Coupled with our deep expertise, IBM has unique capabilities to help our clients realize the potential of a hybrid cloud business model.”

He added that IBM has “to win the architectural battle in cloud”, noting that “there’s a unique window of opportunity for IBM and Red Hat to establish Linux, containers and Kubernetes as the new standard.

“We can make Red Hat OpenShift the default choice for hybrid cloud in the same way that Red Hat Enterprise Linux is the default choice for the operating system.”

Krishna also announced other leadership changes. Jim Whitehurst, in his new role as president, will head IBM Strategy as well as the Cloud and Cognitive Software unit, and Paul Cormier will lead IBM’s open-source software company Red Hat as its new president and CEO.

Giving your business an edge


David Howell

7 Apr, 2020

A seismic shift is coming to how businesses use communications technologies. As the roll-out of 5G accelerates, the internet and the networks it supports will transform as centralised data moves to the edge of the network, creating flexible communications systems with low latency.

The burgeoning Internet of Things (IoT) space will also influence how data processing takes place on the edge of networks. Already, ‘edge datacentres’ are being built, delivering efficient services to end-users who are distant from the centralised datacentres. Data processing on the edge of networks will expand to support developing sectors such as autonomous vehicles, for example, that need low-latency communications to ensure they function safely.

Data networks in their current form are highly hierarchical. The expansion of 5G not only delivers new hardware infrastructure, but also a shift to large-scale network function virtualisation (NFV). This is coupled with software-defined networks (SDN) that can be used with ‘network slicing’, enabling businesses to divide the 5G network into specialised channels for specific use cases.

Communications services industry body TM Forum explains: “Architecturally, the evolution to 5G may not be as dramatic as some of the other transformations, but when it comes to opportunity, 5G promises to be revolutionary if communications service providers, suppliers, application developers and enterprises can figure out how to co-create, manage and participate in digital ecosystems.”

Speaking to Cloud Pro, Martin Garner, chief operations officer at CCS Insight, says: “The key benefit of 5G is speed and latency. By doing the initial processing of a lot of the data at the edge, decision making can be done more quickly, and at a lower cost. For example, in a processing plant, there are huge benefits from automatically detecting anomalies in real-time.” 

He continues: “An additional benefit comes from the option to process relevant data locally and never allow it to leave the premises. This is true for many healthcare uses, where the information is extremely sensitive for personal reasons, and also for businesses that, for example, do not want to let commercially sensitive production data leave the factory. Some industrial sites even have an ‘air gap’, with no external data connection to the machinery, to ensure that local data stays local.” 

Edge networks will become the foundation onto which all intensive data processing services are built.

Computing on the edge

While edge computing offers innumerable benefits, it needs a solid mobile communications infrastructure to support it.

“A fundamental goal of 5G is to enable virtual network slicing, as fully scalable, programmable and flexible networks are the future with 5G,” says Colin Bryce, director of mobile network engineering at CommScope.

“Dividing the infrastructure into independent virtual networks enables operators to create an independent standardised layer above the control plane, from which they can deliver proprietary value-added services. Network slicing will be especially important for 5G network success, as mobile network operators seek to find effective solutions to manage spectrum while reducing costs. Virtual networks will allow for tremendous network efficiencies.”

In its 2017 “Introduction to Network Slicing”, the GSMA said that slicing could be determined by function or behaviour. For example, in automotive one slice could be dedicated to a high bandwidth connection for delivering infotainment, while another “ultra-reliable” slice would be used for assisted driving.

John Vickery, principal technology partner at BT, explains that the edge could also be closer to home.

“In an enterprise context where business customers have large quantities of data to process such as HD video, video analytics or machine vision, these requirements can be met by deploying an ‘on-premise’ edge capability at their business location,” he tells Cloud Pro.

“This allows them to benefit from reduced backhaul costs (since they won’t need to send all their data to the core) and they also get the added benefits of low latency, reliability and data sovereignty. So, any investment in a network edge capability will also need to be balanced against growth in demand for ‘on-premises’ edge.”

How businesses slice up the available 5G network to fuel their needs, and how edge computing will factor into their infrastructures, remains to be seen. What is clear for all enterprises is that the edge offers massive potential to finally use data processing and communications to deliver world-class services.

New data ecosystems

While edge computing using 5G may be something of an emerging technology trend, there are already a number of practical applications and use cases to consider.

In a recent research paper, Julian Bright, senior analyst at Ovum, said: “The incentive for edge computing to be deployed in 5G networks appears to be growing, and it is becoming increasingly important to the 5G business case. The first commercial deployments appear likely to start in 2020 in markets such as South Korea and China with others following soon after. Above all, MEC [Multi-access edge computing] can become a reality, particularly as more commercial 5G deployments begin to appear, services start to bed in, and yet more use cases start to emerge.”

As the infrastructure begins to develop as 5G rolls out, all eyes will turn to the costs of deployment and ongoing development. It’s clear that there is a data and communications imperative no business can ignore. 

In 2019, Barclays Bank conducted a study looking at the benefits and impacts 5G could have on businesses. It found that currently 59% of companies operate across disparate locations and need to communicate in real-time, 49% need to communicate with customers and fill customer orders online, 48% have to connect multiple machines together to run their business, and 43% say that customers expect it.

The report’s authors said that in light of these findings, “it’s obvious that 5G’s projected increase in speed and reliability will greatly benefit British firms. However, factor in the forecasts for future demand for mobile internet, particularly in the context of IoT, and issues such as connecting multiple devices and customer expectations leap to the fore”.

This multi-channel approach is how 5G and edge computing services will be built. Once the new communications ecosystem begins to come into focus, more use cases will develop. It’s a heady time, as businesses can see the first signs of the shackles coming off their communications services. The edge could finally deliver the quantum leap in network design and management they have been waiting for.

Zoom admits it made security “missteps” amid remote working surge


Bobby Hellard

6 Apr, 2020

Zoom’s founder and CEO has admitted his company made “missteps” that should have been fixed before the service became so popular during the coronavirus pandemic.  

Eric Yuan told CNN that the company had “moved too fast” and should have done more to enforce password and meeting room security. 

The service is currently seeing a spike in usage as more and more people are using video conferencing to connect to work colleagues, family and friends. Recent reports have suggested that Zoom is now more popular in the US than Microsoft Teams, with its user base surging from 10 million to 200 million in recent weeks. 

However, this has resulted in more scrutiny of the service as numerous security issues have come to the fore. From “Zoom-bombing” to confusion over its level of encryption, Zoom has been dogged by security concerns, forcing its CEO to make public apologies.

“During this COVID-19 crisis, we moved too fast,” he said. “Our intention was to serve the end-users, but we had some missteps. We should have done something to enforce passwords and meeting rooms and double-checked everything. We should have taken actions to fix those missteps.

“New use cases are very different from our traditional customer base, where they have an IT team to support them. We’ve learned our lessons and we’ve taken a step back to focus on privacy and security.”

Yuan was tougher on himself in an earlier interview with The Wall Street Journal, saying that he “really messed up as CEO” and that he felt an obligation to win back user trust. 

Zoom’s internal criticism follows a troubling few weeks in which a number of problems have plagued the videoconferencing platform. Most recently, it’s been the target of a hack known as ‘Zoom-bombing’, where unwanted guests invade a meeting.

Questions have also been asked about the level of encryption the service offers, as it was recently revealed Zoom didn’t have end-to-end encryption for calls, despite saying so in its privacy policy.

The issues have seen a number of companies and organisations drop the service, such as the FBI and Elon Musk’s SpaceX. Going forward, Yuan promised to make Zoom a “privacy and security-first company”.

Paul Cormier appointed CEO of Red Hat


Bobby Hellard

6 Apr, 2020

Red Hat, the open source technology giant, has appointed Paul Cormier as its chief executive officer (CEO). 

Cormier previously served as the company’s president of products and technology and will now replace Jim Whitehurst who left to become president of IBM following Ginni Rometty’s retirement in January. 

Since joining Red Hat in 2001, Cormier has been involved in more than 25 acquisitions, pushing the company beyond its roots with Linux.

He is credited with pioneering the subscription model that helped the company transform from an open source disruptor into an enterprise technology mainstay. Cormier was also “instrumental” in helping the company combine with IBM following its $34 billion acquisition.

“When I joined Red Hat, it would have been impossible to predict how Linux and open source would change our world, but they are truly everywhere,” Cormier said in a statement. 

“The transformations I see happening in our industry are exciting, as they present new challenges and opportunities. The opportunity for Red Hat has never been bigger than it is today and I am honoured to lead the company to help our customers solve their challenges and to keep Red Hat at the forefront of innovation.”

Having worked with him at Red Hat for more than a decade, Whitehurst said that Cormier was the “natural choice” to lead the company. The IBM president called Cormier the driving force behind its product strategy and explained that he understands how to help its customers and partners make the most out of their cloud strategies. 
 
“He is a proven leader and his commitment to open source principles and ways of working will enable Red Hat not only to keep pace with the demands of enterprise IT, but also lead the way as emerging technologies break into the mainstream,” said Whitehurst. 
 
“It was my honour and privilege to lead a company filled with many of our industry’s best and brightest and I am excited to see what Red Hatters accomplish under Paul’s leadership.”

Microsoft’s Edge now more popular than Firefox for the first time


Keumars Afifi-Sabet

3 Apr, 2020

Fundamental changes to the Edge platform have seen Microsoft’s flagship browser swell in popularity to the extent it’s overtaken Mozilla’s Firefox as the second most widely-used browser.

Microsoft Edge crept up from a 7.38% market share in February to 7.59% during March 2020, while Firefox slipped from 7.57% to 7.19% over the same period, according to NetMarketShare.

A steady rise in popularity for Microsoft Edge against the steady fall of Firefox’s market share over the last couple of years has seen a crossover moment occur for the first time.

The milestone follows a period of change for Edge, which comes pre-packaged with the Windows operating system. Among these changes are a repositioning towards business users, and an overhaul of its codebase to the extent it’s now based on the open source Chromium browser.

One new feature, known as Collections, allows workers in procurement to drag and drop items from search results into a list that can be shared with others, complete with image and metadata for all items.

The Chromium-powered Edge has also seen a brand redesign to distinguish itself from the previous iteration of Edge, which has languished for years, as well as Internet Explorer, which has sustained an organic month-by-month decline.

Although Chrome enjoys a near-monopolistic market share of desktop browsers, often hitting between 60% and 70% in market share over the last few years, the tussle for second has been closely fought between Firefox, Edge and Internet Explorer.

Firefox has, itself, undergone a series of key changes focused almost exclusively on protecting user privacy. The most recent step forward in its development, which typifies this trend, involves the launch of a paid-for virtual private network (VPN) that encrypts users’ connections across apps and devices.

Unfortunately for Mozilla, these efforts haven’t paid off in the way the developer may have hoped, given its market share has continued to fall over time, from 9.27% in March 2019, for example, to just above 7% last month. Comparatively, Edge held just 5.2% market share the same time last year.

The rise of Microsoft Edge has also coincided with the fall of Internet Explorer, which held a market share above 12% during 2018. Internet Explorer’s lingering share is largely down to the fact that many businesses still rely on the browser to run business-critical applications.

The fact the new Edge is powered by Chromium is also sure to attract a swathe of users simply curious as to how it compares against previous iterations, and whether this cleaner codebase leads to smarter functionality.

Cloud IT infrastructure spending stormed back in Q4 to secure modest yearly growth, says IDC

Total end user spending on IT infrastructure products for cloud environments recovered significantly in the most recent quarter, bringing overall 2019 levels into positive growth territory, according to IDC.

Following two consecutive quarters of declining spending on servers, enterprise storage and Ethernet switches for cloud, Q419 saw $19.4 billion in spending, up 12.4% year-over-year. Total spending for the year, at $66.8bn, represented 2.1% annual growth.

IDC noted that, aside from occasional swings, the IT infrastructure industry is now at a point where spending on cloud IT infrastructure 'consistently surpasses' spending on non-cloud IT. Non-cloud IT infrastructure spend dropped 4.6% in the most recent quarter to $18.7bn, with a 4.1% decline across the year.

The public cloud arena continues to drive the cloud IT infrastructure market, but with only modest upswings: 14.5% year-on-year growth for the most recent quarter but – thanks to a weak middle part of the year – only 0.1% growth overall, at $45.2bn. Private cloud grew 6.6% in 2019 to $21.6bn.

When it came to specific companies, Dell Technologies was at the top of the pile for Q419, capturing 14.5% of market share. HPE and the New H3C Group shared second place with an 11.6% share, ahead of Cisco and Inspur, both holding 5.9% of the market.

Kuba Stolarski, research director for infrastructure systems, platforms and technologies at IDC, said that the ongoing Covid-19 pandemic would lead to greater adoption of public cloud services once the crisis was passed. "As enterprise IT budgets tighten through the year, public cloud will see an increase in demand for services," said Stolarski. "This increase will come in part from the surge of work from home employees using online collaboration tools, but also from workload migration to public cloud as enterprises seek ways to save money for the current year.

"Once the coast is clear of coronavirus, IDC expects some of this new cloud service demand to remain sticky going forward," Stolarski added.


Elon Musk’s SpaceX bans Zoom over security fears


Sabina Weston

2 Apr, 2020

Elon Musk’s SpaceX has banned its employees from using video conferencing app Zoom, citing concerns over the app’s ability to keep users secure.

A similar ban has been issued by one of SpaceX’s biggest customers, NASA, coming into force after the FBI issued a warning about “Zoom-bombing” – hackers disrupting video conferences with threatening language, hate speech and pornographic images.

Zoom has previously admitted that it is “currently not possible” to enable end-to-end encryption for its video meetings.

In an email seen by Reuters, SpaceX informed its staff that all access to Zoom had been disabled with immediate effect.

“We understand that many of us were using this tool for conferences and meeting support,” the message read. “Please use email, text or phone as alternate means of communication.”

It’s not known whether the ban also extends to Tesla, which is also led by SpaceX CEO Elon Musk. The company did not respond to requests for comment at the time of publication.

Zoom has surged to more than 200 million daily meeting participants after employees around the world were asked to work from home to slow the pace of the fast-spreading coronavirus pandemic. On 5 March, Slack, Facebook, and Microsoft had all advised their employees to work remotely following the spread of the virus to the west coast of the United States. Over the course of the month, most tech giants had closed their offices, shifting all employee communications online.

Despite numerous security warnings, prime minister Boris Johnson is known to be using Zoom to hold cabinet video conferences. He was recently criticised for sharing a picture on Twitter in which a Zoom meeting ID was clearly visible.

A government spokesperson defended the PM, telling BBC News on Wednesday that “in the current unprecedented circumstances, the need for effective channels of communication is vital”.

The news comes days after a class-action lawsuit was filed against Zoom, the complaint alleging the service illegally shared user data with Facebook. Zoom’s founder and CEO, Eric S. Yuan, announced in a blog post that over the next three months, the company will work to “better identify, address, and fix issues proactively”.

How Ubisoft’s i3d.net onboarded Opengear to avert networking disasters


Keumars Afifi-Sabet

2 Apr, 2020

Downtime can prove a fiasco for any organisation, as can a sudden surge in demand, and it’s particularly true for companies wired into the heart of the online gaming scene. From EA Sports’ FIFA to the renowned Call of Duty franchise, millions of gamers across the globe have come to expect 24/7 network availability. 

The growing demand for always-on services is akin to the way organisations reliant on cloud-powered applications expect flawless and reliable connections on which to run their operations. Just look at the escalating COVID-19 pandemic that’s taken the cloud computing world by storm – with a surge in demand for data services, Wi-Fi networks and workplace platforms like Microsoft Teams. The staggering work that goes into maintaining these networks as userbases swell, whether in the business or gaming worlds, is routinely overlooked; the crucial elements are often only noticed when things go wrong.

At games publisher Ubisoft, subsidiary i3D.net runs and maintains the networks that power widely-played AAA multiplayer games, like Tom Clancy’s The Division. While it had been successful managing with just 70 staff and servers based in 35 sites spanning 15 countries, in the mid-2010s, it became clear extra muscle was needed to continue to service a rapidly-swelling user base.

“The big thing in game hosting is the fact you need to be really flexible and very responsive to the fast-changing market,” i3D.net COO, Rick Sloot, tells Cloud Pro. “A game can be popular, or it can be a real flop. But as soon as the game is popular, and a lot of people are playing it, or maybe even more people are going to play it than you’re expecting, you need extra capacity within hours, or maybe, at most, in a matter of days.”

The pressures of an always-on world

In the past, i3D.net would factor networking issues in as a business cost, but these started to become too frequent to sustain. The infrastructure was built to incorporate redundancy, though if any routers, switches or other equipment went down, i3D.net would be pressed to resolve these issues as soon as possible while game sessions across the world were put on hold. Once these problems became impossible to tolerate, the firm sought to onboard a third-party network monitoring company in 2015 to bolster network resilience. The need was especially pressing given how limited staffing levels were, combined with exponentially growing demand. Network management firm Opengear was recruited shortly before Ubisoft released its hotly-anticipated Tom Clancy’s The Division 2, to improve resilience and failover options should things get hairy.

“The way the 24/7 world is working currently, and everybody wants to be online 24/7, [network failure] was not an acceptable risk anymore,” Sloot continues. “Because the company, and everybody in the world, is demanding a 24/7 service, we needed to look for other solutions, and other ways of maintaining the flexibility but without adding a lot of overhead on us.”

The potential for demand to surge at any one time, and in any location across the world, was impractical given i3D.net would rely on its own network engineers to fly out to these sites should work need doing. Remote hands would be used where possible, but it would take crucial minutes or hours to establish a connection while networks were offline. Expansion at existing locations, or establishing new sites, also posed issues when demand for a game went “sky high”.

Going mobile

Opengear already formed a part of i3D.net’s infrastructure, but on a much smaller scale, Sloot says. The implementation phase, which spanned a year, involved heavily ramping up the company’s involvement, which, thanks to the existing relationship, was more straightforward than it could have been. The equipment was shipped to i3D.net, and its engineers spent the following year flying from location to location to install the infrastructure. As i3D.net harbours sufficient technical expertise, it primarily leant on Opengear for enhancements. Automatic failover to alternative networks, for example, would ensure games continued running when problems struck. This operated through the installation of cellular failover, with communication running via 4G networks instead of traditional backup lines.

“Before, we would always try to have a backup line; for example, buy a backup line from a data centre and then connect that one. So this was a very good additional feature for us, which brought the service to a higher level,” Sloot continues. The implementation of cellular failover, however, brought its own challenges.

“Maybe sometimes for us, from our side, it’s tricky because for cellular failover you need good quality of signal … which is always a challenge in a data centre, which is always a highly secure facility.”

As for how he’d advise other businesses to handle their networking infrastructure as they look to scale, he reiterated that the most crucial elements powering your networks behind the scenes only get noticed when things go horribly wrong.

“I always say to my guys here, what could be the worst that can happen?” he explains. “If you look at all those steps that could happen – what can you prevent, and if you can prevent them, what’s the best solution for it? 

“If there’s a solution, what are the costs versus the risks? Looking at this particular solution of Opengear, the costs of not having a network is, like, tens of thousands of Euros per hour. Buying the product is a small fraction of that, so, it’s a rather small investment for achieving high availability.”

Canonical launches Managed Apps for enterprise DevOps teams


Daniel Todd

1 Apr, 2020

Canonical has announced the arrival of Managed Apps, a service that will allow enterprises to have mission-critical apps deployed and operated by the firm as a fully managed service.

Covering ten of the world’s most widely used open source apps, Managed Apps removes the need for enterprises to contract with multiple vendors, Canonical explained, while customers also benefit from support for their underlying infrastructure.

The service will cover ten widely used cloud-native database and LMA (logging, monitoring and alerting) apps on multi-cloud Kubernetes and also on virtual machines across bare-metal, public and private cloud.

At launch, those managed databases include MySQL, InfluxDB, PostgreSQL and MongoDB, as well as Elasticsearch, Open Source MANO and Kafka. The service will also cover demand-based scaling, high availability for fault tolerance, security patching and updates.

Managed Apps are also backed by SLAs for uptime and round-the-clock break/fix response, while businesses can monitor app health through an integrated LMA stack and dashboard, which includes Grafana, Prometheus and Graylog.

Ultimately, the initiative will allow DevOps teams to focus on delivering business value instead of typical time-consuming tasks thanks to the streamlined approach to infrastructure maintenance, the firm explained.

“As organisations increasingly move to a cloud-native approach, they can be slowed down by spending too much time on the underlying management of their cloud and its applications,” commented Stephan Fabel, director of Product at Canonical.

“Our Managed Apps give them the freedom to focus on business priorities, with the confidence that their apps are reliably maintained, secure and can scale to production needs.”

Managed Apps also offers full lifecycle management that includes resource-scaling based on changes in demand, as well as offering high availability by default. Canonical’s managed services also have MSPAlliance CloudVerify certification, the firm said, which is equivalent to SOC 2 Type 2, ISO 27001/ISO 27002 and GDPR compliance.

The software firm also revealed that it plans to expand the number of open source apps covered by Managed Apps to further improve performance accountability.

UK government to launch coronavirus ‘contact tracking’ app


Sabina Weston

1 Apr, 2020

The UK government is reportedly preparing to launch an app that will warn users if they are in close proximity to someone who has tested positive for coronavirus.

The contact-tracking app will be released just before the lockdown is lifted or in its immediate aftermath, Sky News has reported, and will use short-range Bluetooth signals to detect other phones in close vicinity and then store a record of those contacts on the device.

If somebody tests positive for COVID-19, they will be able to upload those contacts, who can then be alerted via the app.

This means that the data will not be regularly shared with a central authority, potentially easing concerns around privacy violations.

If people with the app later test positive for coronavirus, they could allow all the people they’ve been near to be informed, so those people could self-isolate.

Jim Killock, executive director of the Open Rights Group, told IT Pro that the new app, and similar developments, might “prove to be important tools in the fight against COVID-19”. However, he also raised concerns about the privacy of users.

“Nevertheless, we are concerned that [the] government needs to put more effort into helping people understand their approach to privacy more generally, and improve their communications vastly. Building a project like this at speed carries privacy, security and delivery risks, so the more information that is given out the better,” he said.

NHSX, the innovation arm of the UK’s National Health Service, will reportedly appoint an ethics board to oversee the development of the app, with its board members set to be announced over the coming weeks.

“It is good that they are thinking about the privacy of users – this is essential to build trust and confidence so people use it,” said Killock.

Questions might arise over the effectiveness of the app, as large numbers of people will be required to use it in order for it to work efficiently. The NHS is reportedly counting on the app being downloaded by more than 50% of the population.

“NHSX is looking at whether app-based solutions might be helpful in tracking and managing coronavirus, and we have assembled expertise from inside and outside the organisation to do this as rapidly as possible,” an NHSX spokesperson said.

Only last week, NHSX and Matt Hancock MP were urged to follow steps that would guarantee that new technologies used to tackle the coronavirus outbreak abide by data protection ethics.

In an open letter signed by numerous “responsible technologists”, they were asked to take urgent steps to ensure that the public’s trust in the NHS is not undermined by possible data breaches.