IBM job ad calls for 12 years of experience with six-year-old Kubernetes


Bobby Hellard

13 Jul, 2020

IBM has put out a job advert calling for a candidate with over 12 years of experience in Kubernetes administration and management.

It looks like a fairly straightforward ad, except for the fact that Kubernetes has only existed for the last six years.

The advertisement, which is still live, calls for a “minimum” of 12 years’ experience in Kubernetes, including “hands-on” experience setting up Kubernetes platforms, deploying microservices and other web applications, and managing secure secrets and container orchestration.

That would require someone to have gained at least six years’ experience before the project first appeared on GitHub on 7 June 2014.

As the Twitter account ‘Really Bad Job Ads’ shows, it’s very common to make typos or strange syntactical errors in job ads, but nothing on its feed comes close to a giant tech company getting in a muddle over new technology.

In this regard, IBM is not alone, as developer Sebastián Ramírez pointed out on Twitter. He spotted a role that asked for over four years of experience with FastAPI, but Ramírez knew all too well that, at the time, no one could have more than a year and a half of experience with it, because he created it.

This also goes the other way, with job seekers sometimes getting it wrong. Replying to Ramírez, researcher Lynn Boyden recalled a 2012 applicant who claimed over 17 years of experience with web design.

“We interviewed a 28-year-old designer in 2012 who told us he had 17 years’ experience designing websites. I said, ‘Tim Berners-Lee doesn’t have 17 years’ experience designing websites’. ‘Who’s Tim Berners-Lee?’ he asked. So yeah.”

And it swings back the other way too: further down the replies, app designer Jens Ravens explained that he was once told during an interview that he didn’t have enough experience with a certain iOS library, despite the fact that he developed it.

Nokia begins major data centre networking gambit


Keumars Afifi-Sabet

10 Jul, 2020

Nokia has launched a set of tools, equipment and an operating system for data centre networking to help large companies manage growing traffic in light of increased 5G and machine learning adoption.

Working in collaboration with Apple to build the technology, Nokia has launched a data centre Network Operating System (NOS) and an accompanying toolkit for intent-based automation and operations, alongside new routers and switches.

The company’s data centre venture is based on the idea that the data centre will overlap with cloud and telecoms networks, with technologies like 5G and the Internet of Things (IoT) causing demand for data movements to rise.

Altogether, Nokia’s foray will allow what it describes as ‘cloud builders’ – webscale firms, service providers and large enterprises – to scale up and adapt their data centre environments in light of the surging traffic.

“With decades of experience serving the world’s telecom operators, we understand the engineering challenges of building and operating business and mission-critical IP networks on a global scale,” said Nokia’s president of IP and optical networks, Basil Alwan.

“However, today’s massive data centers have their own unique operational challenges. The SR Linux project was the proverbial ’clean-sheet’ rethink, drawing from our partnership with Apple and others. The resulting design is impressive in its depth and strikes the needed balance for the future.”

Nokia describes its Service Router Linux as the first fully modern microservices-based network operating system. It’s built on technology used in more than a million IP network routers, and runs standard Linux. This can be combined with the Nokia Service Router Linux NetOps development kit, which allows customers to take advantage of a rich set of programming capabilities.

Majority of UK firms say cyber threats are outpacing cloud security


Sabina Weston

10 Jul, 2020

New research into cloud security management has found that 83% of UK organisations believe threats to cloud systems are outpacing their ability to effectively deploy countermeasures.

This places the UK behind the global average of 71%. By contrast, only 53% of German enterprises believe the same.

Cyber security company Palo Alto Networks has published its findings on the practices, tools, and technologies that companies around the world use to manage security for cloud-native architecture. The firm interviewed 3,000 professionals in cloud architecture, information security, DevOps, and application development across the UK, Germany, the US, Singapore, and Australia.

The State of Cloud Native Security report shows that UK organisations today host 42% of their workloads in the cloud and expect this to increase to 65% in the next two years.

A significant majority (93%) of UK businesses admitted to using more than one cloud platform, while more than half (57%) said they use between two and five. The trend was reflected on a global scale, with 94% and 60% of global organisations respectively admitting to the same.

However, the report found that the growing reliance on cloud infrastructure has not translated into increased confidence in cloud security. In fact, 84% of UK respondents admitted that their organisation struggles to draw a clear line between its own responsibility for cloud security and that of its cloud service provider.

Low confidence in cloud security, and undefined responsibility for it, coincides with a surge in the number of attacks on cloud accounts, up by 630% between January and April of this year, according to McAfee. A majority of these external attacks were large-scale attempts to access cloud accounts with stolen credentials and usually targeted collaboration services like Microsoft 365.

The research also found that, while overall enterprise use of cloud services increased by 50%, access to the cloud using unmanaged, personal devices doubled, contributing to the risk of company data being stolen.

Docker partners with AWS to help take the complexity out of containerisation


Bobby Hellard

10 Jul, 2020

Docker is expanding its collaboration with AWS to make its Compose and Desktop tools easier to use in container ecosystems.

The Compose and Desktop developer tools will be integrated with Amazon Elastic Container Service (ECS) and ECS on AWS Fargate.

The container ecosystem has become complex, according to Docker. What used to be a simple case of service A talking to service B, and B talking to a database, has expanded with the arrival of new managed container services. That complexity is helpful for operational teams that want more control, but it makes things tougher for developers.
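As a rough sketch of that ‘simple case’, a minimal Docker Compose file describing two services and a database might look something like the following. The service names and images are purely illustrative and are not taken from Docker’s or AWS’s announcement.

# docker-compose.yml – a hypothetical "A talks to B, B talks to a database" stack
version: "3.8"
services:
  service-a:                        # front-end service "A"
    image: example/service-a:latest
    ports:
      - "8080:80"                   # expose A outside the stack
    depends_on:
      - service-b
  service-b:                        # back-end service "B"
    image: example/service-b:latest
    environment:
      DATABASE_URL: postgres://app:app@db:5432/appdb
    depends_on:
      - db
  db:                               # the database that B talks to
    image: postgres:12
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: app
      POSTGRES_DB: appdb

The point of the new integration is that a file like this, which previously described only a local Docker environment, can be pointed at an Amazon ECS context and deployed there instead.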

“With a large number of containers being built using Docker, we’re very excited to work with Docker to simplify the developer’s experience of building and deploying containerised applications to AWS,” said Deepak Singh, the VP for compute services at AWS.

“Now customers can easily deploy their containerised applications from their local Docker environment straight to Amazon ECS. This accelerated path to modern application development and deployment allows customers to focus more effort on the unique value of their applications, and less time on figuring out how to deploy to the cloud.”

A similar integration with Microsoft Azure was announced in May, shortening the path from local development to Azure Container Instances and highlighting Docker’s more developer-focused strategy. The company surprised many when it sold its enterprise business to Mirantis in 2019, but its stated reasoning was to focus solely on cloud-native development.

“Going forward, in partnership with the community and ecosystem, we will expand Docker Desktop and Docker Hub’s roles in the developer workflow for modern apps,” said CEO Scott Johnston.

“Specifically, we are investing in expanding our cloud services to enable developers to quickly discover technologies for use when building applications, to easily share these apps with teammates and the community, and to run apps frictionlessly on any Kubernetes endpoint, whether locally or in the cloud.”

Oracle to put its own hardware in customer data centres


Keumars Afifi-Sabet

9 Jul, 2020

Oracle has announced a package for enterprise customers to give them the full benefits of the company’s public cloud services while retaining all their data on-premise.

Dubbed Oracle Dedicated Region Cloud@Customer, the service is touted as the industry’s first fully managed cloud region, bringing more than 50 cloud services into customers’ own data centres.

With packages starting at $500,000 per month, the service lets enterprise customers with strict security and regulatory commitments install Oracle hardware in their own data centres and benefit from cloud-based software without migrating their data.

Previously, Oracle customers adopting hybrid cloud configurations weren’t necessarily able to use all of the company’s cloud-based services due to incompatibility with their own hardware.

This new service will allow customers to port the entirety of Oracle’s software stack to their own data centres by installing Oracle hardware onsite.

“Enterprise customers have told us that they want the full experience of a public cloud on-premises, including access to all of Oracle’s cloud services, to run their most important workloads,” said executive vice president of engineering for Oracle Cloud Infrastructure, Clay Magouyrk.

“With Oracle Dedicated Region Cloud@Customer, enterprises get all of our second-generation cloud services, including Autonomous Database, in their datacenters. Our major competitors can’t offer customers a comparable dedicated cloud region running on-premises.”

However, the service will likely draw attention from those who campaign against practices that create vendor lock-in, as the installation of Oracle’s own hardware may make it more difficult for enterprise customers to transition to other providers should they wish.

Taking the fight to AWS

The move also represents an attempt to bring the fight to Amazon Web Services (AWS), as part of a long-running feud between the two companies, with Oracle taking on the highly similar AWS Outposts service.

AWS Outposts is Amazon’s fully managed and configurable compute and storage rack service built with AWS-designed hardware. The service allows AWS customers to run on-premise computing while connected to AWS services in the cloud.

Compared with Oracle Dedicated Region Cloud@Customer’s 50 cloud services, AWS Outposts only offers four, Oracle’s Larry Ellison claimed during an online event, according to Tech Radar.

AWS, incidentally, offers six services on Outposts: Amazon EC2, Amazon EBS, Amazon Elastic Container Service (ECS), Amazon Elastic Kubernetes Service (EKS), Amazon Relational Database Service (RDS), and Amazon Elastic MapReduce (EMR).

Ellison threw additional barbs at AWS while making the announcement, claiming Oracle’s compliance certifications and management are the same across the public cloud and dedicated region, unlike AWS Outposts.

He also highlighted AWS’ role – as he sees it – in last year’s infamous Capital One data breach, in which the personal information of some 100 million customers was exposed through a misconfigured web application firewall.

Ellison claimed the breach happened because Amazon cloud databases require complex and manual provisioning, configuration, encryption, backup and security, suggesting that such complexity invites human error and, with it, data loss.

“With Oracle, it is 100% automated and users cannot make mistakes on 100% automated processes. It is the only database where a person who runs the database has no access to users’ data,” he added.

Despite the war of words, however, AWS still dominates the cloud market, leading the industry in terms of market share, followed by Microsoft Azure and Google Cloud Platform.

Microsoft Teams update will put users in a virtual theatre


Bobby Hellard

9 Jul, 2020

Microsoft has revealed a slew of changes to Teams that have been in development since the start of the coronavirus pandemic and subsequent lockdown.

The changes are designed to improve user experience on the collaboration platform and drive greater engagement.

It starts with “Together Mode”, an artificial intelligence-based feature that places a digital cut-out of each user in a shared virtual space. It essentially takes each participant’s face and shoulders and seats them, alongside the other members of the call, in a virtual theatre. This is being rolled out now, with other virtual spaces set to be added in the coming months.

The thinking behind the changes is to make users feel more “connected”, according to Microsoft. It wants the service to be more inclusive by making people more visible when they join conference calls. To do this, it has added a “Dynamic View” which allows users to share content side by side with other participants.

In addition to Dynamic View, users can also share live reactions with their teammates – essentially emoji reactions that appear in the intended recipient’s box as they talk. Teams will also get video filters that let users adjust lighting levels or soften the focus of their webcam feed while on a call.

Later in the year, Teams will also support live transcription, with a translation feature for meetings held in other languages. The service will get suggested replies for its chat features, similar to Gmail’s suggested replies, offering auto-fill text based on the context of the previous message in the channel. Microsoft is also adding ‘chat bubbles’ so that messages can be seen more clearly by other participants in a call.
 
Finally, Teams will integrate Cortana, Microsoft’s voice assistant, into the mobile version of the service. Cortana will be able to make calls, join meetings, share files with colleagues and even send chat messages.

Google abandons controversial cloud project in China


Sabina Weston

9 Jul, 2020

Google has decided to abandon the development of a controversial cloud computing project named “Isolated Region”, which catered to various governments’ desires to control data within their borders.

The tech giant scrapped the initiative in May, partly due to the coronavirus pandemic but also due to the rising geopolitical tensions between the US and China, where Isolated Region was being developed, according to two anonymous Google employees speaking to Bloomberg.

The project, which was launched in early 2018, sought to comply with Chinese regulations which require Western companies to form a joint venture with a Chinese partner company when they provide data or networking services. However, the development was paused in January 2019, reportedly due to Google choosing to focus on potential customers in Europe, the Middle East and Africa.

According to one source, however, the geopolitical issues placed demands on Isolated Region that Google was not capable of meeting.

A Google spokeswoman disputed the claims made by the employees, telling Bloomberg that Isolated Region wasn’t shut down for either of the given reasons and that the company “does not offer and has not offered cloud platform services inside China”.

Google has said that the cloud initiative was cancelled because “other approaches we were actively pursuing offered better outcomes”, although it has yet to elaborate on the specifics of these approaches.

“We have a comprehensive approach to addressing these requirements that covers the governance of data, operational practices and survivability of software,” the spokeswoman said. “Isolated Region was just one of the paths we explored to address these requirements.”

“What we learned from customer conversations and input from government stakeholders in Europe and elsewhere is that other approaches we were actively pursuing offered better outcomes.”

The news comes just days after Google, alongside Microsoft, Facebook and Twitter, made the decision to suspend the processing of user data requests from the Hong Kong government, following the implementation of a new security law that criminalises protests. The Hong Kong government reportedly requested data from Google users 105 times in 2019 alone.

IBM buys RPA company WDG Automation


Sabina Weston

8 Jul, 2020

IBM has announced its acquisition of WDG Automation, a Brazilian software provider specialising in robotic process automation (RPA).

Financial details of the acquisition weren’t disclosed to the public, but the companies said they expect the deal to close in the third quarter. 

IBM’s decision to acquire WDG Automation sees the tech giant continuing its expansion into the AI-infused automation market, as it looks to provide its customers with the ability to “quickly identify more granular opportunities for automation (…) as well as help ensure consistent and accurate data is being used across all tools and business functions, including customer service, IT, finance, HR, and supply chain”. 

RPA technologies can automate repetitive tasks, removing them from human employees’ workloads and thereby boosting productivity, as well as general welfare and wellbeing.

WDG Automation, based in São José do Rio Preto, Brazil, is a provider of RPA, Intelligent Automation (IA), Interactive Voice Response (IVR) and chatbots. It markets its products to business users looking to create automations using a desktop recorder, without the need for IT involvement.

WDG’s software robots are able to run on-demand or by using an automated scheduler, depending on the customer’s needs.

The company’s founder and CEO Robson Felix called automation “crucial in the digital era, as businesses need to perform several repetitive or routine tasks so that employees are able to focus on innovation”.

“I’m incredibly proud of the role WDG Automation has played in the RPA market with a unified and integrated platform to help companies in Brazil increase their productivity,” he added.

WDG co-founder Kleber Rodrigues Junior said that “joining forces with IBM will scale our capabilities to a wider audience, helping companies around the world accelerate their growth on their business transformation journeys”.

Denis Kennelly, general manager of Cloud Integration at IBM Cloud and Cognitive Software, added: “IBM already automates how companies apply AI to business processes and IT operations so they can detect opportunities and problems and recommend next steps and solutions”. 

“With today’s announcement, IBM is taking that a step further and helping clients accelerate automation to more parts of the organization, not just to routine, but more complex tasks so employees can focus on higher-value work.”

The acquisition might signal a shift of priorities for IBM, which recently decided to “sunset” its general-purpose facial recognition and analysis software suite over ethical concerns following a fortnight of Black Lives Matter protests.

Last week, the company unveiled an AI-powered inventory control system to help businesses optimise their decision-making and build resilient supply chains more effectively.

SUSE acquires Kubernetes startup Rancher Labs


Carly Page

8 Jul, 2020

Germany-based Linux distribution company SUSE has announced plans to acquire Kubernetes management outfit Rancher Labs.

The price of the acquisition was not disclosed, but two people familiar with the deal told CNBC that SUSE is paying between $600 million and $700 million to buy the Cupertino-based startup.

Rancher Labs, which was founded in 2014 and currently boasts more than 200 employees, provides open-source software that enables organisations to deploy and manage Kubernetes at scale.

The startup, which has raised more than $95 million in funding, claims Rancher is the “most widely used enterprise Kubernetes platform” with more than 30,000 active users. Its big-name customers include the likes of American Express, Comcast, Deutsche Bahn and Viasat.

The deal, SUSE and Rancher Labs claim, will enable computing everywhere, combining the latest AI with seamless deployment of containerised workloads from the edge to the core to the cloud.

“This is an incredible moment for our industry, as two open source leaders are joining forces. The merger of a leader in Enterprise Linux, Edge Computing and AI with a leader in Enterprise Kubernetes Management will disrupt the market to help customers accelerate their digital transformation journeys,” said Melissa Di Donato, SUSE CEO.

“Only the combination of SUSE and Rancher will have the depth of a globally supported and 100% true open source portfolio, including cloud-native technologies, to help our customers seamlessly innovate across their business from the edge to the core to the cloud.”

Sheng Liang, Rancher CEO, added that the merger of the two companies will “help organisations control their cloud native futures.”

“Our leading Kubernetes platform with SUSE’s broad open source software solutions creates a powerful combination, enabling IT and Operations leaders worldwide to best meet the needs of their customers wherever they are on their digital transformation journey from the data center to cloud to edge,” he added.

The acquisition is expected to close before the end of October 2020, following regulatory approvals. 

News of the takeover comes just weeks after Rancher Labs announced the launch of Rancher Academy, a new certification program that the open source software firm says will address the growing Kubernetes skills gap.

Royal Marsden powers virtual COVID-19 agent with IBM Watson


Keumars Afifi-Sabet

8 Jul, 2020

The Royal Marsden NHS Foundation Trust has partnered with IBM to launch an AI-powered virtual agent that will provide staff with up-to-date HR and workplace information as the UK emerges from lockdown.

Ask Maisie, powered by IBM Watson, will give the Royal Marsden’s hospitals in London and Surrey the capacity to manage their workforce by serving as an information hub accessed through the intranet.

Common questions can be answered through automation and AI, freeing the HR department to deal with more complex and sensitive matters. Ask Maisie combines IBM Watson Assistant with natural language processing (NLP) capabilities, delivered through the IBM public cloud.

“As the pandemic evolves so have the long term implications on healthcare which include a growing expectation for immediate and remote access to trusted information,” said director for healthcare and life sciences with IBM UK and Ireland, Andreas Haimböck-Tichy. 

“This has led to many healthcare providers accelerating digital transformation plans to give clinicians time to focus on patients alongside helping to manage the physical and mental health of their key workers. Digital transformation in healthcare is not just limited to the clinical environment.

“Modern technology has an incredible potential to change the way a hospital operates for the better and help revolutionise the care patients receive.”

Topics staff can access include advice for high-risk workers, how to self-isolate, and what happens when staff receive official shielding letters. All information will be drawn from trusted sources, including the hospitals’ own policy handbooks as well as official bodies such as NHS England.

The COVID-19 pandemic has been a highly disruptive force, but for many public sector organisations, it’s given development and engineering teams an opportunity to implement digital systems to help deliver services.

For the Royal Marsden, the crisis has created a need for technology to help manage its staff, with the trust claiming that the right investments in technology can help organisations build resilience and prepare for any future turmoil.

Now that Ask Maisie has launched, it can continue to enhance its knowledge base and learn from the interactions it has.