Cisco seeks Webex enhancements with Slido acquisition


Keumars Afifi-Sabet

14 Dec, 2020

Cisco has acquired audience interaction company Slido in an effort to enhance the Webex video conferencing user experience and stay relevant as the likes of Zoom, Microsoft Teams and Google Meet enjoy a surge in popularity.

The firm is hoping to integrate Slido’s audience interaction and engagement features, such as polls and Q&As, into the Webex platform to improve the quality of the product and make it more appealing for users. 

The acquisition will pave the way for meeting owners to create engaging content such as infographics, gain real-time insights and obtain feedback. This is in addition to Slido’s inbuilt functionality for supporting virtual conferences and large-scale events.

“Slido technology enables higher levels of user engagement – before, during and after meetings and events,” said Abhay Kulkarni, Cisco’s VP and GM for Webex Meetings. “The Slido technology will be part of the Cisco Webex platform and enhance Cisco’s ability to offer new levels of inclusive audience engagement across both in-person and virtual experiences.

“In the massive shift to ‘virtual everything’, remote meetings and events have become the lifeblood for connecting people in all aspects of their lives – from friends to family to work colleagues.

“Slido has over seven million participants monthly and provides its customers with an inclusive audience engagement platform that enables real-time feedback and insight before, during and after any meeting or event via dynamic polls, Q&A, quizzes, word clouds, surveys and more.”

Bundling such features into the meetings experience is something Cisco hopes can keep Webex relevant at a time when industry rivals such as Microsoft Teams and Zoom are enjoying rampant success.

This isn’t to say, however, that Cisco’s enterprise collaboration platform hasn’t enjoyed a surge in popularity itself, recording 590 million meeting participants in September, for example, according to Reuters. Zoom, by comparison, boasted a staggering 300 million daily meeting participants at the height of the pandemic in April.

The company describes its goal as delivering experiences that are 10x better than in-person interactions, something the integration of Slido’s audience engagement tools will contribute to. Cisco will also hope to bring further insights into the broader Webex platform, with a view to raising productivity while workers are still based remotely.

This isn’t the first recent acquisition Cisco has made with a view to enhancing the Webex experience, having previously bought BabbleLabs earlier this year. That deal saw the firm seek to integrate AI processing technology into meetings in order to suppress background noise and enhance speech clarity.

AWS to offer free cloud training to 29 million people


Bobby Hellard

11 Dec, 2020

Amazon Web Services (AWS) has announced an ambitious plan to help 29 million people around the world gain digital skills with free cloud computing training. 

The announcement came on Thursday at re:Invent 2020, the cloud giant’s annual conference. 

The training will include more than 500 free courses, interactive labs, and virtual day-long training sessions. AWS will also continue to invest in its free training courses to help participants earn certification, and will expand its AWS re:Start programme that looks to reach underrepresented communities to help them find work in the tech industry. 

What’s more, AWS will pilot new training programs, such as a two-day AWS Fibre Optic Splicing Certification and its Machine Learning University, a free course designed to teach people ML concepts for business. 

This is just a snapshot of the work Amazon is doing to help individuals around the world, according to Teresa Carlson, VP of worldwide public sector at AWS. 

“As part of our efforts to continue supporting the future workforce, we are investing hundreds of millions of dollars to provide free cloud computing skills training to people from all walks of life and all levels of knowledge, in more than 200 countries and territories,” Carlson wrote.

“We will provide training opportunities through existing AWS-designed programs, as well as develop new courses to meet a wide variety of schedules and learning goals. The training ranges from self-paced online courses – designed to help individuals update their technical skills – to intensive upskilling programs that can lead to new jobs in the technology industry.”

The announcement will be welcomed by many around the world, particularly those in industries that have been displaced by the coronavirus, which will likely have a lasting impact on many job roles. Digital and tech roles, such as those with cloud computing specialties, are thought to be in high demand for the post-coronavirus world.

Google Cloud buys UK data analytics firm Dataform


Bobby Hellard

10 Dec, 2020

Google Cloud has acquired a London-based startup called Dataform that builds tools to manage data flows for enterprise customers.

The terms of the deal haven’t been released, but TechCrunch understands that it is an ‘acquihire’ with Google keen to take on the company’s talent.

The company is described as an “operating system” for data warehouses and some of its co-founders are ex-Google employees. Its platform aims to help data-rich businesses draw insights by mining data stored in warehouses.

This is something that usually requires a team of engineers and analysts, but the Dataform system is about making the process simpler and cheaper for organisations.

This is a growing area of data analytics, with companies such as Snowflake recently undergoing a successful IPO. Dataform was close to a Series A funding round, but has instead chosen to continue its growth under Google.

Under the terms of the deal, Dataform will continue to operate under its current management and focus on BigQuery. Dataform Web will also be made free for all new users from now on, with existing customers transitioned to the free plan immediately.

“After several conversations with the Google Cloud team it became clear that we are deeply aligned on the importance of serving analysts with the right tools and technology in order to fill what we all perceive as a missed opportunity in existing solutions,” co-founder and CTO Guillaume-Henri Huon wrote on Dataform’s website.

“At the same time, as a team of just seven, in a complex, competitive and rapidly changing market, we had more ideas than we had people or resources to accomplish. There has always been so much more we wanted to do each quarter than we could achieve.

“With the support of the BigQuery and Cloud Analytics teams and our combined thought leadership and efforts, we felt that together we could achieve something bigger than we could separately”.

HPE launches HPC as a service through HPE GreenLake


Daniel Todd

10 Dec, 2020

HPE has announced it is offering its high-performance computing (HPC) solutions as a service through HPE GreenLake, a move that includes a range of fully managed, pre-bundled HPC cloud services.

These new HPE GreenLake cloud services will allow customers to combine the power of an agile, elastic, pay-per-use cloud experience with proven, market-leading HPC systems, the tech firm said. 

Deployable on-premises or in a colocation facility, the as-a-service platform has been designed to tackle demanding compute and data-intensive workloads, power AI and ML initiatives, speed time to insight, and help create new products and experiences.

“We are transforming the market by delivering industry-leading HPC solutions in simplified, pre-configured services that control costs and improve governance, scalability and agility through HPE GreenLake,” commented Peter Ungaro, senior vice president and general manager, HPC and Mission Critical Solutions (MCS), at HPE.  

According to Intersect360 Research, the HPC market will grow by more than 40%, reaching almost $55 billion by 2024. The technology is designed to support ongoing data growth, efficiently processing and analysing data from emerging applications and endpoints such as AI training models and edge devices.

HPE said its HPC as a service offering will dramatically simplify the experience by speeding up the deployment of HPC projects by up to 75% and reducing capital expenditures by up to 40%.

Enterprises can deploy the fully managed services in any data centre environment, the firm added, allowing them to pay for only what they use and focus on running projects that speed time to insight and accelerate innovation.

HPE will initially offer an HPC service based on HPE Apollo Systems, combined with storage and networking technologies, which are purpose-built for running modelling and simulation workloads. The firm then plans to expand the rest of its HPC portfolio to as-a-service offerings in future. 

GreenLake for HPC is available in small, medium or large options that can be ordered via a self-service portal, with the service then ready in less than 14 days. 

As part of the offering, customers will also gain access to HPE GreenLake Central, the HPE self-service dashboard, HPE Consumption Analytics, as well as HPC, AI & App Services.

“These HPC cloud services enable any enterprise to access the most powerful HPC and AI capabilities and unlock greater insights that will power their ability to advance critical research and achieve bold customer outcomes,” Ungaro added.

AWS CISO urges companies to adopt a zero-trust security approach


Keumars Afifi-Sabet

9 Dec, 2020

Organisations should embrace the philosophy and principles of zero-trust security to keep up to date with modern demands and security threats, AWS’ chief information security officer (CISO) Steve Schmidt has urged.

Adopting the core tenets of a zero-trust philosophy, including accessibility and usability, and focusing on the fundamentals of security will help businesses eliminate needless risks in their IT estates.

Doing so, however, isn’t as straightforward as businesses may hope, according to Schmidt. This is because the term ‘zero-trust’ can mean different things in different contexts, an ambiguity that stems from the diversity of use cases to which it applies.

“Zero-trust is, to me, a set of mechanisms that focus on providing security controls around digital access and assets while not solely depending on traditional network controls or network perimeters,” he explained, speaking at AWS re:Invent 2020. 

“In other words, we aren’t going to trust a user based only on their location within a traditional network. Instead, we want to augment network-centric models with additional techniques, which we would describe as identity-centric controls.”

An example of one such use case that he provided was human-to-application security, which is particularly relevant given the surge in people working from home in 2020. Traditionally, applications sat behind a virtual private network (VPN) front door, but this approach isn’t compatible with the diversity of devices workers now use to access work-related services. Applying zero-trust principles means making the locks on individual applications effective enough that the VPN-based front door can be eliminated altogether.
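To make that identity-centric model concrete, the sketch below shows, in Python, the shape of an access decision that never consults the caller’s network location: the application verifies a signed identity token and a device-posture flag on every request. It is a minimal illustration only – the token format, claim names and policy are assumptions made for the example, not AWS’s or any vendor’s actual implementation.

```python
# Minimal, illustrative sketch of an identity-centric access check.
# The token format, claim names and policy below are hypothetical; a real
# deployment would use standards such as OIDC/JWT plus a device-posture service.
import base64
import hashlib
import hmac
import json
import time
from typing import Optional

SIGNING_KEY = b"replace-with-a-real-secret"  # placeholder only


def verify_token(token: str) -> Optional[dict]:
    """Verify an HMAC-signed token of the form base64(payload).base64(signature)."""
    try:
        payload_b64, sig_b64 = token.split(".")
        payload = base64.urlsafe_b64decode(payload_b64)
        signature = base64.urlsafe_b64decode(sig_b64)
    except (ValueError, TypeError):
        return None
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(signature, expected):
        return None
    claims = json.loads(payload)
    if claims.get("exp", 0) < time.time():
        return None  # token has expired
    return claims


def authorise(token: str, device_compliant: bool, resource: str) -> bool:
    """Grant access based on identity and device state, never on source network."""
    claims = verify_token(token)
    if claims is None or not device_compliant:
        return False
    # Hypothetical policy: the token must list the resource among its allowed apps.
    return resource in claims.get("allowed_apps", [])
```

The point of the sketch is the decision logic itself: the request’s source address never appears in it, which is what allows the VPN ‘front door’ to be retired.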

Zero-trust principles have become far more popular across the industry of late, with a number of companies quick to adopt and promote this philosophy either as part of their own strategies or in their products. 

BlackBerry, for example, announced Persona Desktop in October, a security platform that uses artificial intelligence (AI) and machine learning to detect user and entity behaviour abnormalities. Persona Desktop works at the endpoint, eliminating the need to send data back to the cloud before the system acts, and aims to protect against stolen credentials, insider threats, and physical compromise.

Google, too, launched a zero-trust remote access service known as BeyondCorp Remote Access earlier this year that’s designed to give remote teams access to their internal applications without the need for a VPN.

As part of Schmidt’s outline of AWS’ security strategy, he also proposed a set of questions that businesses and IT administrators should ask about their organisation’s security configuration. Elements such as where the perimeter is, and how large it is, as well as how easy it might be to monitor and audit, should be considered. 

Schmidt also, by way of example, suggested that while VPNs are fine to use for network isolation, it would be best to make the implementation dynamic and hidden from the user experience. This might lead to users not even noticing that network boundaries are being created and torn down as required.

Microsoft to offer top-secret cloud platform for classified data


Praharsha Anand

9 Dec, 2020

Microsoft has announced the launch of its newest cloud offering: Azure Government Top Secret.

The new cloud service expands Microsoft’s tactical edge portfolio for the US government, including Azure (public cloud), Azure Government, and Azure Government Secret. Microsoft has tailored Azure Government Top Secret for its US government customers that work with classified information.

“Azure Government Top Secret provides the same capabilities as the commercial version of Azure, Azure Government and Azure Government Secret to enable a continuum of compute from mission cloud to tactical edge,” said Tom Keane, corporate vice president of Azure global.

“The broad range of services will meet the demand for greater agility in the classified space, including the need to gain deeper insights from data sourced from any location as well as the need to enable the rapid expansion of remote work.”

According to reports, Microsoft is working with the US government to secure accreditation for its new cloud. In the meantime, it has already completed the build-out of the Azure Government Top Secret regions.

This announcement comes amid ongoing court battles over the Department of Defense’s (DOD) $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud contract. The DOD awarded the whole contract to Microsoft, bypassing Amazon Web Services (AWS) and spurring the latter to launch a lawsuit.

Microsoft also announced enhancements to its Azure Government Secret service, which is authorised and actively used by the US Department of Defense, law enforcement, and other agencies.

According to Microsoft, Azure Government Secret will now include Azure Kubernetes Service (AKS) and Azure Container Instances. The additions aim to help application developers deploy and manage containerized applications more easily.

Intelligent security analytics services Azure Sentinel and Azure Security Center are also now available in Azure Government Secret, enabling unified security across digital estates and facilitating proactive threat management.

“The consistency between Azure (commercial), Azure Government, and Azure Government Secret is also starting to change the game as software development may happen from anywhere, while the code itself can be promoted to enclaves with higher classification levels. There it can interact with data of higher classification levels. At the end of the day, this means doing more for the mission at a lower overall cost,” said Carroll Moon, CTO of CloudFit Software.

Salesforce: Customer service teams have accelerated digital plans in 2020


Bobby Hellard

8 Dec, 2020

The pandemic has exposed a number of technology gaps in customer service, according to 88% of service professionals, as customers switched from physical to virtual locations during 2020.

This caused many customer service leaders to accelerate digital transformation strategies, according to a report from Salesforce.

The ‘State of Service’ report provides a “snapshot” of the priorities, challenges and trajectories of global customer service teams. The findings are based on a survey of more than 7,000 customer service agents, decision-makers, mobile workers, and dispatchers across 33 countries.

In response to the pandemic, 85% of service teams said they had changed their policies to provide more flexibility to customers, with 60% adding that they had invested in new service technology.

“Leaders are taking this time to rethink the value of experiences and reimagine engagement with customers and employees alike,” said Brian Solis, global innovation evangelist at Salesforce.

“It’s not just about technology. Sometimes technology is at its best when invisible. We’re going to see significantly more agile, innovative, and relevant organisations emerge from this crisis that provides modern and sought-after experiences that change the game for everyone.”

According to the report, 88% of service professionals said the pandemic exposed technology gaps, and 86% said the same for service channel gaps as customers flocked away from physical locations and towards digital methods of engagement.

Teams also found shortcomings that went beyond the obvious, with 87% realising that their existing policies and protocols – such as cancellation fees for events that were prohibited by public health measures – were not suited for current circumstances.

In the face of these challenges, service teams were forced to make digital transformations that will endure beyond the pandemic. Some 78% said they had invested in new technology because of the pandemic, with 32% suggesting they had ramped up their adoption of artificial intelligence systems.

Russian hackers are exploiting critical VMware flaws


Keumars Afifi-Sabet

8 Dec, 2020

State-backed Russian cyber criminals are actively exploiting a recently-patched vulnerability in a series of VMware products in order to access sensitive corporate data.

VMware had warned its customers in late November about a critical command injection flaw in a number of its products, including Workspace One Access and Identity Manager. Although the bug was considered severe, with a rating of 9.1 on the CVSS threat severity scale, a patch wasn’t available at the time and was only released on 3 December.

Hackers operating on behalf of the Russian state, however, have been actively exploiting the vulnerability to access data on targeted systems, according to an advisory issued by the US National Security Agency (NSA).

“The exploitation via command injection led to installation of a web shell and follow-on malicious activity where credentials in the form of SAML authentication assertions were generated and sent to Microsoft Active Directory Federation Services, which in turn granted the actors access to protected data,” the advisory said.

“It is critical when running products that perform authentication that the server and all the services that depend on it are properly configured for secure operation and integration. Otherwise, SAML assertions could be forged, granting access to numerous resources.”

Beyond the wider business community, the NSA has stressed the need for organisations involved in national defence and security to apply VMware’s patch as soon as possible, or implement workarounds until updates are feasible. The advisory also suggests that organisations review and harden their configurations as well as the monitoring of federated authentication providers.

Beyond Workspace One Access and Identity Manager, the products affected include Access Connector and Identity Manager Connector, with specific product versions outlined in VMware’s original security advisory.

The vulnerability, tagged CVE-2020-4006, essentially allows hackers to seize control of vulnerable machines. They would first need to be armed with network access to the administrative configurator on port 8443, as well as a valid password to the admin account.

As such, the NSA has recommended that network administrators limit the accessibility of the management interface on servers to only a small set of known systems, and block it from direct internet access. Critical portions of this activity can also be blocked by disabling the configurator service itself.
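As a rough, hypothetical illustration of that guidance, the short Python sketch below simply tests whether a host’s port 8443 – the administrative configurator port cited in the advisory – is reachable from wherever the script is run. The hostname is a placeholder, and a reachability check like this is only a quick way to spot unintended exposure; it is no substitute for properly firewalling the management interface.

```python
# Quick, illustrative exposure check: can this machine reach a host's
# administrative configurator port (8443, per the NSA advisory)?
# The hostname below is a placeholder, not a real appliance address.
import socket


def port_reachable(host: str, port: int = 8443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Run this from a network segment that should *not* have management access;
    # a True result suggests the interface is more exposed than intended.
    host = "workspace-one.example.internal"
    print(f"{host}:8443 reachable -> {port_reachable(host)}")
```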

Zero-click ‘wormable’ RCE flaw uncovered in Microsoft Teams


Keumars Afifi-Sabet

8 Dec, 2020

Hackers were able to exploit a serious vulnerability in Microsoft Teams desktop apps to execute arbitrary code remotely and spread infection across a company network by simply sending a specially-crafted message.

The zero-click flaw, which is wormable, can be triggered by cross-site scripting (XSS) injection in Teams, with hackers able to transmit a malicious message which will execute code without user interaction.

This remote code execution (RCE) flaw was first reported to Microsoft in August, with the company fixing the bugs in October 2020. However, security researcher Oskars Vegaris, who discovered the flaw, has complained that the firm didn’t take his report as seriously as it should have, with Microsoft not even assigning the bug a CVE tag.

Microsoft rated the Teams vulnerability as ‘important’, although it described its impact as ‘spoofing’ in its bug bounty programme. As for the CVE element, Microsoft doesn’t issue CVE tags for products that automatically update without user interaction.

“This report contains a new XSS vector and a novel RCE payload which are used together,” Vegaris wrote on GitHub. “It affects the chatting system within Microsoft Teams and can be used in e.g. direct messages, channels.”

In a technical breakdown of the vulnerability, the researcher highlighted how RCE can be achieved by chaining two flaws, including stored XSS in Teams chat functionality and a cross-platform JavaScript exploit for the Teams desktop client. 

The impact is seemingly alarming, as the exploit’s wormable nature means the payload can spread to other users, channels and companies without any interaction. The malicious code can also execute without any user interaction, as victims need only view the specially-crafted message.

The consequences of infection range from complete loss of confidentiality and integrity for victims, to access to private communications, internal networks, private keys as well as personal data outside of Microsoft Teams.

Hackers can also gain access to single sign-on (SSO) tokens for other services, including Microsoft services such as Outlook and Microsoft 365. Victims could additionally be exposed to phishing attacks, as well as keylogging via specially-crafted payloads, according to Vegaris.

IT Pro approached Microsoft for comment.

Is one cloud enough?


David Howell

The cloud now forms an integral part of every business’s IT infrastructure. Increasingly, however, the growth of the cloud market and the incredible range of products and platforms on offer have led to multi-cloud fatigue.

The often-haphazard deployment of cloud services from multiple vendors has also created management challenges that have been exacerbated thanks to the coronavirus pandemic. Are we now in a situation where CTOs need to take action to rationalise their cloud deployments?

The latest cloud study from IBM reveals that 64% of executives plan to migrate more mission-critical workloads to the cloud in the next two years. If not done carefully, however, this could easily lead to more cloud bloat and a consequent aggravation of management and service support issues.

While it can be tempting to just throw money and resources at the problem, this can often result in expanding existing cloud deployments or buying new services without the due diligence needed to ensure they can be integrated efficiently and securely with legacy installations. Information technology service management (ITSM), when coupled with cloud management platforms (CMPs), has, in some cases, exposed weak links in existing cloud infrastructures.

Management of large multi-cloud deployments is also difficult to do effectively. According to the State of the Cloud 2020 report from Flexera, applications are often siloed across the cloud architecture, with a third (33%) of respondents to this year’s survey using multi-cloud management tools. 

“Businesses often adopt a multi-cloud strategy to deliver specific applications or services, avoid cloud vendor lock-in, reduce costs, enable flexibility and increase scalability,” explains Paul Stapley, practice director at Logicalis. “However, multi-cloud adoption presents challenges for people and processes, as the more platforms you have, the more challenging it becomes to manage them. CTOs can face security challenges, connectivity reliability (problems), performance issues, and inconsistent service offerings, making it challenging to utilise and operate multi-cloud deployments efficiently.”

Reducing the cloud architecture’s complexity would enable a far more cost-effective, secure, and efficient cloud service to be constructed and then managed. The danger of businesses’ reaction to COVID-19, which saw a rush to implement ever more cloud services, is an unstructured and unplanned expansion with consequent weak security and lack of management oversight.

Is one cloud enough?

CTOs struggling to manage multi-cloud deployments and ‘cloud bloat’ may wonder if moving everything back to a single cloud is the way forward, but that’s not necessarily the best answer, either.

“With a clear strategy and approach for using multiple clouds, businesses can avoid the issue of ‘cloud bloat’,” Maynard Williams, MD of Accenture Technology UK tells IT Pro. “This strategy must cover both the specific use cases for cloud service provider offerings and how transactions work end to end when they may span multiple clouds. For example, a business might put all of a particular type of workload, such as analytics, onto a specific cloud platform, while ensuring that an issue can be traced across multiple stacks. It also needs to consider hybrid options and transitional states as applications migrate to the cloud.”

Reducing the footprint of a business’s cloud deployments can also deliver much tighter security. Concerns about public cloud safety remain high, with Cavirin reporting that nine out of ten cybersecurity professionals (91%) are extremely to moderately concerned about public cloud security. The most prominent challenge organisations face in their security operations is visibility into infrastructure security (44%), followed by setting consistent security policies across cloud and on-premises environments and maintaining compliance, which tied at 42% each.

Speaking to IT Pro, Anne Hardy, CISO at Talend says: “Cloud security covers various aspects, the most important being governance, network, logical access control, data protection, security logging and monitoring, security incident response and disaster recovery. Every one of these aspects cannot be managed in the same way with AWS, Azure or GCP (Google Cloud Platform). This variation across multi-cloud deployments means a business needs a team of the right people who can understand all of these areas and the security needs of each.”

Whether and how businesses rationalise their cloud infrastructure will depend on their medium to long-term planning. As Logicalis’ Paul Stapley says, each company will react differently. “We know the correct cloud brings many benefits to the correct workloads. The reasons for both adopting and moving workloads need to be right. By no means is it a one-size-fits-all approach, or a set-in-stone process. As technology advances, and business needs change, the cloud is built to adjust accordingly to best manage those variations.”

New normal IT

Is a single cloud deployment the future of IT infrastructure? The trinity of AWS, Google Cloud Platform (GCP) and Azure offers all businesses – no matter their size – a cloud deployment platform that can be lean and efficient. The one-cloud approach seems distant at this point, however, and the propensity to use multiple cloud deployments, often from different vendors, shows little sign of slowing. Indeed, the pandemic has accelerated the expansion of public clouds to cope with the demands of mass remote working, with much of this new deployment being fragmented.

“Most enterprises will choose to work with at least one of the public cloud hyperscalers,” Accenture’s Maynard Williams explains. “And there’s good reason for this: There’s a great deal of competition in the market, so they’re investing heavily in areas like streamlining migration, adapting services for private clouds and pushing out to the edge. In addition, they are investing in a variety of industry-specific cloud solutions – for example, GE Healthcare is running its Health Cloud on Amazon AWS. A bespoke cloud deployment strategy can allow a business to get far more out of these investments than they could realistically achieve alone.”

Williams continues: “For most organisations, the optimal way forward is to select a primary hyperscaler for the majority of mission-critical workloads, and then work with one or more secondary providers dictated by the specific needs of the business. This might depend on regulations, industry, concentration of risk or specialised workloads. This enables the organisation to build core skills and experience on one platform but take advantage of specialised solutions where it makes the most sense.”

All enterprises, large and small, understand that in the post-COVID era, flexibility will be crucial to their long-term sustainability in their marketplaces. The cloud will continue to be a foundation all businesses use to deliver the IT services they need.

Cloud bloat, and the management and security issues it brings, will need to be addressed. However, the cloud is a flexible space that can expand and contract as needed. Can one cloud serve all these requirements? It’s an unlikely scenario. But more streamlining and the use of fewer vendors look set to become the norm.