Exploring the commercial advantages of blockchain technologies – and what CIOs need to do about it

The initial commercial interest in cryptocurrency infrastructure lay in its potential to enable an alternative to government-backed fiat currencies. Now, however, most of the forward-looking focus is on blockchain, the distributed ledger technology that underpins those applications.

Although deployments are still very much in the realm of the early adopter, blockchain has proven advantages across several vertical industries: it is secure, decentralised and transparent, and it can reduce intermediary costs.
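
To make the tamper-evidence claim concrete, here is a minimal sketch of the hash-chaining idea at the heart of a distributed ledger – a toy illustration only, not any production blockchain. Each block embeds a hash of its predecessor, so silently rewriting history invalidates every later block:

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, record):
    """Append a record, linking it to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev})

def verify(chain):
    """True only while every block still matches its successor's link."""
    return all(chain[i + 1]["prev_hash"] == block_hash(chain[i])
               for i in range(len(chain) - 1))

chain = []
add_block(chain, {"from": "Bank A", "to": "Bank B", "amount": 100})
add_block(chain, {"from": "Bank B", "to": "Bank C", "amount": 40})
print(verify(chain))                # True
chain[0]["record"]["amount"] = 999  # tamper with history...
print(verify(chain))                # ...and verification fails: False
```

In a real distributed ledger, copies of this chain are replicated and agreed across many parties, which is what removes the need for a single trusted intermediary.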

Blockchain use case market development

While many CIOs and CTOs believe that blockchain likely has a way to go before becoming a mainstream technology within their sector, five compelling use cases across asset tracking, financial services and digital identity are already in production.

They offer valuable business process improvements to the pioneering organisations that have already deployed a blockchain — whether in terms of increased efficiency, reduced fees and fraud, or full transparency across the whole network.

According to the latest worldwide market study by Juniper Research, the total value of B2B cross-border payments immutably stored on a blockchain will exceed $4.4 trillion by 2024 — that's up from $171 billion in 2019.

Blockchain enables real-time clearing and settlement for B2B transactions, while offering increased transparency and reduced costs. These practical applications can deliver other significant benefits.

The new research revealed that financial institutions will save $7 billion by 2024 through the automation of ‘Know Your Customer’ checks, combined with the use of blockchain-based self-sovereign identity to verify users.

Juniper Research assessed 15 leading blockchain vendors, scoring them on sector experience, marketing efforts and customer deployments, alongside their blockchain solutions. It identified the five leading vendors as IBM, Infosys Finacle, Guardtime, R3 and Ripple.

The research scored IBM highly for its diverse blockchain solutions in production and its strong client base across many vertical industries. Infosys Finacle, meanwhile, has established itself as a leading blockchain provider for financial institutions, with global partners and popular solutions.

"The implementation of blockchain is part of a wider strategy for financial institutions to digitally transform operations," said Dr Morgane Kimmich, research analyst at Juniper Research. "Blockchain will enable stakeholders to reduce operational costs in a competitive market that is becoming increasingly commoditised."

The research found that Ripple, Visa and IBM are driving blockchain innovation in cross-border payments. Ripple has led the market since 2012, capitalising on its early mover advantage to grow to over 200 financial institution partners in 2019.

Outlook for blockchain applications innovation

However, Ripple is facing increased competition from Visa B2B Connect and IBM Blockchain World Wire, which have already grown their presence in 60 countries and have high-profile partners in the financial services ecosystem.

Moreover, Juniper anticipates that both companies will continue to exploit their global presence, trusted brand names and established business partner networks to scale their solutions. These market leaders are experienced in market development, moving new products and services beyond the early adopter segment, and further deployment growth is likely to follow their lead.

Interested in hearing more in person? Find out more at the Blockchain Expo World Series, Global, Europe and North America. 

IBM touts first financial services-specific public cloud after Bank of America collaboration

IBM is targeting financial services customers with the launch of what it claims is the world’s first financial services-ready public cloud – in association with Bank of America.

In some respects, this can be seen as a glorified customer update: Bank of America will be a ‘committed collaborator’ on the platform, built on IBM’s public cloud, and will host key applications there to support its 66 million banking customers.

Yet the companies have collaborated extensively on the product, naturally designed to stringent security practices. The duo is working with Promontory, an IBM business unit focused on financial services regulatory compliance consulting, while strict compliance will be enforced among ISVs or SaaS providers who wish to participate.

Red Hat OpenShift will be deployed as the product’s primary Kubernetes environment for managing containerised software, with more than 190 API-driven services available for building new cloud-native applications.
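
Since OpenShift is a certified Kubernetes distribution, workloads on the platform can be driven through the standard Kubernetes API. As a minimal sketch – using the official kubernetes Python client, with a hypothetical image, namespace and service name – deploying one containerised banking service might look like this:

```python
from kubernetes import client, config

config.load_kube_config()  # use the cluster credentials in ~/.kube/config

# Describe a small, replicated deployment of one containerised service.
container = client.V1Container(
    name="payments-api",  # hypothetical service name
    image="registry.example.com/bank/payments-api:1.0",  # hypothetical image
    ports=[client.V1ContainerPort(container_port=8080)],
)
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="payments-api"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "payments-api"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "payments-api"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

# Submit it to the cluster; the platform schedules and manages the containers.
client.AppsV1Api().create_namespaced_deployment(
    namespace="banking", body=deployment)  # hypothetical namespace
```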

Financial services is an important battleground for the leading clouds. Amazon Web Services (AWS) cites three primary customers in this industry: Liberty Mutual, Starling Bank and Capital One. The latter hit the headlines for the wrong reasons after a data breach was confirmed in July, although the company subsequently noted that its cloud operating model helped it resolve the issue at greater speed. For Microsoft Azure, MetLife, German savings bank Provinzial and South African Nedbank are among its key clients.

This is not the entire story, however, as many financial services firms are looking to hybrid cloud to ensure sufficient digital adoption. According to a report from Nutanix issued in April, more than one in five financial organisations polled said they were deploying a hybrid cloud model, with the vast majority (91%) saying hybrid was their ‘ideal’ IT model.

“This is one of the most important collaborations in the financial services industry cloud space,” said Cathy Bessant, Bank of America chief operations and technology officer, in a statement. “This industry-first platform will allow Bank of America to use the public cloud, putting data security, resiliency, privacy and customer information safety needs at the forefront of decision making.

“By setting a standard that addresses the concern of hosting highly confidential information, we aim to drive the public cloud to a safety level that is unmatched,” Bessant added.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

With Azure Arc, Microsoft aims to go beyond traditional hybrid cloud – with Anthos and Outposts for company

Keep your friends close, but your enemies closer. Microsoft’s most eye-catching announcement at Ignite earlier this week was Azure Arc which, similar to AWS Outposts and Google Anthos, allows Azure services to run on – and manage resources in – the infrastructure of its hyperscale rivals.

The rationale, as the company put it in its press materials, was to go beyond the usual definitions of hybrid cloud in an area which is rapidly becoming more than table stakes for the hyperscalers. Mark Russinovich, Azure CTO, told ZDNet that it is what the company considers ‘hybrid 2.0’.

“Enterprises rely on a hybrid technology approach to take advantage of their on-premises investment and, at the same time, utilise cloud innovation,” Julia White, Azure corporate vice president, wrote in a blog post. “As more business operations and applications expand to include edge devices and multiple clouds, hybrid capabilities must enable apps to run seamlessly across on-premises, multi-cloud and edge devices.

“Without coherence across these environments, cost and complexity grow exponentially,” added White. “Today, we take a significant leap forward to enable customers to move from just hybrid cloud to truly deliver innovation anywhere with Azure.”

First up on the list of services to go into Arc are Kubernetes management and Azure data services. Customers will now ‘have the flexibility to deploy Azure SQL Database and Azure Database for PostgreSQL Hyperscale where they need it on any Kubernetes cluster’, as the company puts it.
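
As a rough sketch of the workflow, attaching an existing cluster to Arc goes through an Azure CLI extension; the cluster and resource group names below are hypothetical, and the exact tooling may have differed in the preview available at announcement time:

```python
import subprocess

def az(*args):
    """Run an Azure CLI command, echoing it and failing loudly on error."""
    cmd = ["az", *args]
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Install the Arc-enabled Kubernetes CLI extension.
az("extension", "add", "--name", "connectedk8s")

# Register the cluster kubectl currently points at as an Azure resource,
# after which Azure management, policy and data services can target it.
az("connectedk8s", "connect",
   "--name", "on-prem-cluster",        # hypothetical cluster name
   "--resource-group", "arc-demo-rg")  # hypothetical resource group
```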

While the three hyperscalers now have options where services can be deployed on different clouds, don’t imagine for a moment that they are all arm-in-arm walking into the sunset singing Kumbaya. At the time of VMworld in August – VMware having extensive partnerships with the three biggest clouds – Pivot3 CMO Bruce Milne noted the ‘tension’ in the air.

“There’s an obvious strategic tension in VMware’s collaboration with the hyperscale cloud providers, but for now it appears they’ve agreed to a collaborative détente,” said Milne. “Watch this space because that friction is sure to generate sparks eventually.”

Nick McQuire, VP enterprise at CCS Insight, noted the changes Microsoft had made, but with a caveat.

“Over 60% of enterprises use multiple clouds and a mix of on-premises and public cloud computing in their businesses, so providing a single control plane, consistent management and security across this multi-dimensional environment is now becoming the new rules of engagement in the cloud wars,” said McQuire. “It means that Microsoft is becoming more attentive to customer needs, but it is also an indication that the battle lines of competition in cloud are shifting towards managing the control plane.

“With the arrival of multi-cloud management in Azure, we are now seeing perhaps the biggest shift yet in Azure’s strategic evolution.”

Perhaps the most comprehensive analysis was from regular Forbes contributor Janakiram MSV. Alongside noting the changes to the control plane, Janakiram noted where Microsoft is looking in terms of customer focus. “With Azure Arc, Microsoft is enabling enterprises with legacy infrastructure to join the hybrid cloud bandwagon,” he wrote. “Microsoft is not alienating customers running legacy hardware and VMs from the hybrid cloud. VMs are treated as first-class citizens in the world of Azure Arc.”

“Microsoft’s hybrid strategy based on Azure Arc and Azure Stack looks compelling and convincing,” Janakiram added. “Azure Arc’s key differentiation lies in the balance between traditional, VM-based workloads and modern containerised workloads that operate in the same context of the hybrid and multi-cloud environments.”

You can find out more about Azure Arc on Microsoft’s website.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

CircleCI aims to further break down the ‘hornet’s nest’ of continuous delivery with EMEA expansion

Continuous integration and delivery (CI/CD) software provider CircleCI has been acting on its expansion plans following the $56 million (£44.8m) secured in series D funding in July. Now, the company is ready for business in London – and has hired a new head of EMEA to push things along.

Sharp observers looking at the almost 250 faces which comprise the CircleCI team would have noticed a recent addition at the foot of the list. Nick Mills joined the company in September having previously held leading sales roles at Stripe and Facebook, among others, invariably concerned with international expansion.

At CircleCI, Mills will be responsible for EMEA – which the company says represents almost a quarter of its overall business – in everything classified as non-engineering. “There’s a huge amount of expansion opportunity,” Mills tells CloudTech. “I’ve already had some interesting conversations in the first few weeks here with companies in fintech, mobility and on-demand services. They really see CircleCI and CI/CD as a fundamental, critical enabler that can help their teams increase productivity.”

The company certainly appears to be seeing gains from this bet. Big tech names on the customer roster include Facebook, Spotify and Docker, while investor Scale Venture Partners described the company earlier this year as the ‘DevOps standard for companies looking to accelerate their delivery pipeline while increasing quality.’

For CEO Jim Rose, who has been in London this week for the launch, it is the expansion of a journey which began for him in 2014, first as COO before moving up to the chief executive role a year later.

“When I first got to the company, there were about 30 individual logos in the CI/CD market, and that’s been whittled way down,” Rose tells CloudTech. “Now there is, really, ourselves, a couple of smaller, standalone, very focused CI/CD players, and then you’ve got some of the larger platforms that are trying to go end-to-end.”

Rose cites the ‘peanut butter manifesto’, the now-infamous document from Yahoo which used the foodstuff as a metaphor for businesses spreading themselves too thinly across multiple offerings, as evidence for why the larger platforms will struggle.

“We have really gone for the opposite of that strategy,” he explains. “For the vast majority of large customers, you can only move certain systems one at a time. Customers ask us all the time… how do we build that CI/CD system but also the most flexible system so that regardless of what you have in place inside of your overall enterprise or team, it’s really easy and seamless?”

There are various aspects which pull together the company’s strategy. Back in the mid-2000s, if a company built a new application it would hire a bunch of developers, flesh out the spec, write every line of custom code, and then package and ship the resultant product. As Rose puts it, any custom code written today instead takes on the mantle of orchestrating all the pieces together – the plethora of open source libraries and third-party services.

Continuous delivery is a hornet’s nest – it’s very easy to get to version one, but then the complexity comes as your developers start pushing a lot faster and harder

“What we’re helping customers do is, across all of these hundreds and thousands and millions of projects, start to take a heartbeat of all those different common components and use that to help people build better software,” says Rose. “If you have a version that’s bad or insecure, if you’re trying to pull a library from a certain registry that has stability problems, if you have certain services that are just unavailable… these are all new challenges to software development teams.

“Using the wisdom of the crowd and the wisdom of the platform overall, we’re starting to harness that and use that on behalf of our customers so they can make their build process more stable, more secure, and higher performing.

“Honestly, continuous delivery is a hornet’s nest,” adds Rose. “It’s really complicated to run into one of these systems at scale. It’s very easy to get to version one, but then the complexity comes as you bring it out to more teams, as you add more projects, as your developers start pushing a lot faster and a lot harder.”
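
The ‘heartbeat’ and ‘wisdom of the platform’ Rose describes amount to aggregate telemetry over shared components. A toy sketch of the idea – not CircleCI’s actual implementation – could correlate dependency versions with build outcomes across projects and flag the versions that fail unusually often:

```python
from collections import Counter, defaultdict

# Hypothetical build records gathered across many customers' projects.
builds = [
    {"project": "api",  "deps": {"requests": "2.22.0"}, "passed": True},
    {"project": "web",  "deps": {"requests": "2.18.0"}, "passed": False},
    {"project": "jobs", "deps": {"requests": "2.18.0"}, "passed": False},
    {"project": "cli",  "deps": {"requests": "2.22.0"}, "passed": True},
]

# Tally total and failed builds per (dependency, version) pair.
stats = defaultdict(Counter)
for build in builds:
    for dep, version in build["deps"].items():
        stats[(dep, version)]["total"] += 1
        stats[(dep, version)]["failed"] += not build["passed"]

# Flag versions whose observed failure rate is suspiciously high.
for (dep, version), c in stats.items():
    rate = c["failed"] / c["total"]
    if rate > 0.5:
        print(f"warn: {dep}=={version} failed in {rate:.0%} of builds")
```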

For a large part of the company’s history, the individual developer or team of developers was the route in for sales; almost in an infiltrative ‘shadow IT’ context, whether it was the CTO of a small startup or a team lead at a larger company. While this can still be the case at enterprise-level organisations, CircleCI realised it needed more of a top-down, hybrid sales approach.

“One of the biggest changes in our space – not just CI/CD, but the developer space more generally – is developers historically have not been conditioned to pay for things,” says Rose. “If you needed a new tool, a new script, the developers would either go out and create it on their own or they use an open source service.

“What’s changed over the last two or three years is now developers, because their time is so valuable, have the budget and the expectation that they have the opportunity to pay for services that help you move faster. A lot of what we do from a sales perspective is help development teams understand how to procure technology. What’s necessary? What do you think about what you look at? How do we help you through that commercial process?”

Mills will be looking to guide EMEA customers through this process, with the stakes high and the motivation to assist leading tech companies strong. “A lot of companies are successful in and of themselves and can build their businesses, but the space we’re in really has the potential to enable the most successful tech companies today and of the future,” Mills explains.

“Ultimately, the creation they can generate as companies can obviously help them move quickly, increase the scale and pace of product delivery,” he adds. “To me, that feels like incredibly high-level work to be doing and high value.”

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Five ways the cloud can benefit HR departments


Esther Kezia Thorpe

6 Nov, 2019

Increasingly, on-premise solutions are holding back innovation, with processes often disconnected and spread across multiple systems. The right cloud solution can solve many of these issues, and with many such solutions now as secure as on-premise tools, the reasons not to consider cloud are diminishing.

Many IT departments understand the benefits of cloud and have introduced cloud services in a variety of different ways to drive the business forward. 

But there are a number of ways HR departments – which are at risk of being left behind when it comes to digital transformation – can benefit from the cloud as well.

With that in mind, if you’re exploring how the cloud could work for your business, here are five ways it could benefit your HR department.


1 – Enabling flexibility

Whether it be a reorganisation, a merger, or changes in compliance, having a flexible system that can grow and change with the business is essential in today’s fast-paced world. Having HR services in the cloud allows much greater flexibility and means departments are not completely reliant on IT to adapt to new business requirements.

The cloud also offers an element of future-proofing. Whether it be artificial intelligence, machine learning or automation, these “tools of the future” are maturing rapidly, with a growing number of businesses finding solid use-cases for them in their organisation. 

Although you may not be ready to dive straight into automation, having HR systems running in the cloud will ensure that they are prepared for integration with cutting edge technologies when the time is right.

2 – Increasing efficiency

Recruiting, onboarding and processing employees are key functions of an HR department. If these processes rely on paper-based workflows or outdated technology, they can really slow things down and even cause the loss of in-demand candidates.

Many cloud-based HR solutions offer tracking for applicants and a workflow for the whole recruitment process, from applying and uploading CVs to scheduling an interview and sorting out the job offer documentation.

The advantage of the cloud in particular is having everything unified. Legacy systems can offer end-to-end workflows, but are often disconnected from the rest of the business. The cloud, however, offers HR, payroll, finance and other departments a unified experience, which in turn will give better visibility across the company.

3 – An improved user experience

Cloud HR software offers a modern, intuitive user experience which can be accessed from any device, rather than relying on software installed on particular machines. This in turn means that flexible working can be an option for members of the department – an increasingly important requirement for employees.

For other business staff, a cloud-based HR system can allow them access to information about their salary, holidays and benefits any time they want, from wherever they need it. Having modern, accessible systems can have the added benefit of increasing staff retention.

4 – Ready for data

The cloud offers huge potential for integrating high-quality analytics into HR functions. Data and analytics can be used for a vast range of tasks related to HR, from measuring employee satisfaction, to pay benchmarking, retention trends and business efficiency.

Data analysis can be done at the touch of a button through dashboards and reports, or even using big data and predictive analytics.

In turn, the processes that require a lot of data can be automated. Whether this is timesheet submission, performance reviews or holiday requests, a cloud system can automate the update process for specific tasks related to employees.


5 – Constant compliance

Business regulations are being constantly updated or added, whether it be privacy requirements or timesheet compliance. HR and finance departments in particular have strict compliance procedures which can have heavy consequences if not followed properly. 

Most cloud-based solutions have an advantage over traditional on-site applications as updates are pushed out automatically, which means HR can be reassured that they’re constantly compliant with the latest legislation and processes.

Looking to the ‘HyPE’ of cloud storage: How HPE aims to help with hybrid cloud

Analysis: Cloud storage is old hat, right? It’s the simpler part of cloud and is, after all, just storage. So how can cloud storage be more interesting to explore, and deliver greater value to the customer?

Having recently received an expert and exclusive briefing from inside HPE, from my position as an industry cloud influencer, I can say there is a powerful and relevant story to tell – one that forms a base of the HPE cloud strategy, and one that those in cloud should be cognisant of for the opportunity it presents.

We live in a time of cloud, and hybrid cloud is rapidly becoming the norm, if it is not already. As we approach 2020, an average of about a third of companies’ IT budgets is reported to be going towards cloud services, with, according to Forbes, around 83% of enterprise workloads expected to be in the cloud by the end of 2020. These will, of course, be spread across varying cloud form factors and offerings encompassing SaaS, PaaS and IaaS, and private, public and hybrid clouds.

Hybrid cloud is where the greatest challenge appears to lie. Whether by accident or by strategy, most firms are unable to align with a single cloud provider and platform to meet all the needs of their business, users and customers. Much as businesses once mixed Unix, NetWare, LAN Manager and other operating systems to support the applications they required, today the result is a hybrid cloud environment.

Industry trends validate this: Gartner reports that 81% of public cloud users choose between two or more providers, while Canalys takes this deeper, citing that Amazon, Microsoft and Google combined accounted for 57% of the global cloud computing market in 2018. The average business runs 38% of workloads in public clouds and 41% in private clouds, with hybrid cloud at a 58% adoption rate, according to RightScale industry figures.

This growth of hybrid is driving an increasing challenge for the CTO/CxO: that of data portability. “How do I maintain resiliency and security across hybrid and multi-cloud environments, and get the benefits of cloud with the on-premises values I enjoyed?” “How do I have storage in the cloud behave in a way I am used to from on-premises?” The desire for consistency of data services, and for the ability to move data back out of the cloud if and when wanted, is also a key driver.

We have seen cloud make it easy to spin up and speed forwards, with Agile and DevOps as attractive rewards. However, as customers’ cloud usage has rapidly matured, the pressures of mobility and portability have placed greater demands on storage flexibility.

The customer focus in moving applications to the cloud has revolved mostly around selection of the compute platform – the lift and shift – leaving storage issues to rear their heads later, with many experiencing a subsequent shock over costs and lock-in. We have also seen customers’ maturing use of, and demands on, cloud platforms drive innovation in peripheral cloud services, as evidenced here in the area of storage.

So, what do you do when public cloud storage does not meet all your storage needs? Let’s start from the offer of a true commercial utility-based model, aligned with a focus on performance and storage needs. HPE allows you to abstract your data store from the public cloud in a storage-as-a-service offering that frees you from ties to any single public cloud. Put your data in, then decide which public cloud(s) you want to access the data set from, knowing you can move data in and out as you wish. The key is that the storage becomes decoupled from the compute – a positive step towards true portability across the major public cloud compute offerings.
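
To make that decoupling concrete, here is a purely hypothetical sketch of the pattern – none of these class or method names are HPE’s actual Cloud Volumes API:

```python
class PortableVolume:
    """A data set that lives outside any single public cloud (hypothetical)."""

    def __init__(self, name, size_gb):
        self.name, self.size_gb = name, size_gb
        self.attached_to = None

    def attach(self, provider, instance_id):
        # The same data set is presented to compute in whichever cloud the
        # customer chooses; nothing is copied out, hence no egress charge.
        self.attached_to = (provider, instance_id)
        print(f"{self.name} now serving {provider}:{instance_id}")

    def detach(self):
        self.attached_to = None

vol = PortableVolume("analytics-data", size_gb=4096)
vol.attach("aws", "i-0abc12345")     # run the workload on AWS today...
vol.detach()
vol.attach("azure", "vm-eastus-07")  # ...and on Azure tomorrow, data intact
```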

Imagine combining public cloud compute, with its high compute SLA, with a data set stored at a 99.9999% SLA, and having the ability to switch compute providers easily, if and when you choose, while leaving the data set intact. Moving compute more easily between AWS, Azure and Google Compute is the panacea for many. In fact, in Turbonomic’s 2019 State of Multicloud report, 83% of respondents said they expected workloads to eventually move freely between clouds. We are seeing here the first steps towards that expectation becoming a reality.

The clever part that will prove attractive here is the commercial offering: one flat, clear billing model across all clouds, with no egress charges. Both technically and commercially, HPE Cloud Volumes is setting out to make the complex and critical needs of storage simple and increasingly affordable, flexible and, importantly, portable. Through this, HPE is setting out its stall to be a key cloud transformation partner for business.

HPE is stepping up its game through acquired technologies to service, support and supplement the needs of high-growth public cloud consumption. The offering will not be right for every customer in every public cloud, but for its specific use case it is a valuable option. As would be expected, the offering covers block rather than object storage, but this still addresses a large segment of the cloud workload storage requirements of most corporate entities.

The promise is portable cloud storage across compute platforms, with on-the-fly commercial transparency. This removes the tie-in to any one public cloud offering such as AWS, Azure or Google Compute. You do, of course, tie your storage to HPE Cloud Volumes (although without the egress charges), but by making your storage cloud-agnostic you gain greater flexibility to mix, match and change between the major cloud platforms – something lacking for many today.

Are we going to see the question change from ‘where is my data?’ to ‘where do you want it to be?’ The HPE offering is one of portability and operability, bringing on-premises flexibility, security and portability to cloud storage.

Separating storage from compute workloads is an enabling factor in the flexibility of moving between cloud compute offerings, whether for DR, for testing, or simply when a switch is wanted. To deliver a solution without introducing latency, HPE has had to align its locations with those of the mainstream public cloud providers. As would be expected, both Docker and Kubernetes are inherently supported – key to making the offering fit the increasingly open DevOps world of public cloud.

This abstraction of storage is a smart presentation of value from HPE to the exploding cloud market and to customers’ needs for greater flexibility and portability. We should not forget that one of the drivers for cloud adoption is the ability to access data easily from anywhere, at any time; indeed, according to a SysGroup study, “providing access to data from anywhere is the main reason for cloud adoption”.

We also heard about InfoSight – the hidden gem in the HPE kingdom. In simple terms, this is an offering that uses AI on telemetry data to advise customers of a forthcoming issue, and what to do about it, before it has an impact. Apply this to Cloud Volumes and you have compounding value: maximising your storage when and where you need it, with maximum reliability and predictability.
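
As a toy illustration of that predict-before-impact idea – emphatically not InfoSight’s actual models – a monitor might track a moving baseline for a telemetry metric and warn when readings drift well above it:

```python
def watch(readings, alpha=0.2, threshold=1.5):
    """Warn when a metric climbs well above its learned moving baseline."""
    baseline = readings[0]
    for t, value in enumerate(readings):
        if value > baseline * threshold:
            print(f"t={t}: latency {value}ms is {value / baseline:.1f}x "
                  "baseline - investigate before users feel it")
        # An exponentially weighted moving average keeps the baseline current.
        baseline = alpha * value + (1 - alpha) * baseline

# Hypothetical volume latency samples (ms), creeping up before an incident.
watch([5, 5, 6, 5, 7, 9, 12, 18, 25])
```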

Customers are seeking increased data mobility and portability – the panacea promised by cloud solutions – and the ability to move to and from compute offerings from varying vendors quickly and easily. Excitingly, HPE has set a strategy for everything it sells to be available ‘as a service’ by 2022. Do we see a new ‘HPEaaS’ ahead? This will form a strong foundation for HPE to make a big noise alongside the explosive growth of the public cloud space, and it positions a much-needed new offering at the centre of the public cloud battle as it continues.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Here is every major Microsoft Teams update from Ignite 2019


Dale Walker

6 Nov, 2019

Microsoft Ignite has delivered an enormous number of updates across the company’s product portfolio and in particular its collaboration platform, Teams.

To make life a little easier for readers we’ve rounded up the most important changes, some of which are available now, with others being accessible through Microsoft’s preview programme.

Simple sign-in for Microsoft 365

Microsoft has added a bunch of authentication features to its 365 platform that will also filter down to Teams. Most of these are aimed at what the company calls “firstline workers”, defined as those employees who act as the first point of contact for a business, typically retail staff.

Firstly, SMS sign-in is coming to Teams, allowing users to log in to the service with their phone number and an SMS authentication code, effectively removing the need to remember passwords. Likewise, a Global Sign-out feature is on its way for Android devices, letting workers sign out of all their 365 apps, including Teams, with one click.

Firstline managers will also soon be able to manage user credentials directly. The goal here is to reduce the strain on IT support by allowing managers to, for example, help employees reset passwords without having to turn to support teams.

All of these authentication features are currently in preview but are expected to become generally available before the end of the year.

Content management and Yammer integrations

There are also a bunch of new features designed to make it easier to access and manage files and tasks across the Microsoft 365 portfolio from inside the Teams app.

Aside from a few minor tweaks to permission options, the bigger changes include the ability to preview files of 320 different file types, support for relevant People cards and recent files displayed alongside chat windows, and the ability to sync Teams files to a PC or Mac.

Customers will also be able to migrate conversations they’ve had in Outlook into the Teams app before the end of this year, and in the first quarter of 2020 the company plans to add a new UI for tasks created in Teams, Planner or To Do.

Finally, Yammer, the enterprise communications app that at one time looked like it was going to be scrapped, will be integrated into Teams in a preview state by the end of the year, before rolling out generally in 2020.

The app has received a complete redesign, becoming the first to be built from the ground up using Fluent Design System, Microsoft’s latest design language. For Teams, this means the app can be accessed directly from the left rail, although the new version is unavailable until its private preview in December, with general availability at some point in 2020.

Emergency calling

Speaking of communications, US customers are now able to use a feature called Dynamic Emergency Calling, which provides the user’s current location to emergency services. Currently the feature supports those on Microsoft’s Calling Plan, although Direct Routing users will be supported before the end of the year.

There were also a number of smaller announcements for the calling function. Music on Hold is a fairly self-explanatory new feature, offering the option to play music for any caller placed on hold or put into a queue. Location Based Routing allows customers to control the routing of calls between VoIP and PSTN endpoints. And finally, larger organisations with a global footprint can now choose the nearest Microsoft media processor to handle their calls, which should improve overall performance.

Upgrade to Microsoft Teams Rooms

Microsoft Teams Rooms (MTR), the successor to Skype Room Systems, also received a handful of updates, the biggest of which is compatibility with Cisco WebEx and Zoom, allowing Teams users to connect to these services directly. This should be available starting in early 2020, beginning with WebEx.

The second big announcement is the launch of a new managed service called Managed Meeting Rooms. This will provide cloud-based IT management and security monitoring for meetings, as well as support onsite services through a partner network. This service is available now in private preview and is expected to launch generally during spring 2020.

Enhanced security

There have also been a tonne of security and compliance updates for IT admins.

Firstly, Microsoft’s Advanced Threat Protection has been extended to messages within Teams, offering real-time protection against malware and malicious scripts. Similarly, the option to apply policies that block groups from sharing files with each other now also covers files stored within SharePoint.

Admins have also been given more options concerning the duration of data retention policies, including the option to enforce restrictions for as little as a single day, as well as new PowerShell functions designed to make it easier to assign policies across larger Teams groups. It will also soon be possible to manage device usage and deployments through a single management portal.

Private chat and customised conversations

The option to chat privately inside a Team is now generally available to all users, allowing customers to create separate chat windows that can be viewed and accessed by a select few of the team’s members.

The multiwindow feature is also coming to Teams from the first quarter of 2020, allowing users to pop out chats and meetings into a separate window. Users will also be able to pin specific channels to the left rail, allowing them to keep track of most-commonly used conversations.

Finally, the Teams client is also heading to Linux, with a public preview being made available before the end of 2019.

Virtual consultations

A new feature called Virtual Consults is available in private preview, designed to make it easier for organisations that rely on sensitive consultations with their customers, such as healthcare professionals or customer service agencies, to arrange calls. The feature brings with it built-in meeting transcription, cloud recording and the option to blur backgrounds if participants are in a location that would otherwise distract from the meeting.

Developer tools

A healthy chunk of upgrades were also reserved for the Teams developer community.

Firstly, Microsoft has made the tools available as part of its Power Platform more accessible in the Teams environment. Power Apps developers can now publish their creations as Teams apps directly to the company’s app library, making it easier for users to find them. The whole process of adding apps has also been made easier. As for users, they will eventually be able to pin these custom apps to the left rail, once this capability arrives before the end of 2019.

Power BI is also receiving some updates that will translate to Teams next year, including the ability to create interactive cards within Teams conversations for users to engage with. The Power BI tab in Teams will also be given an upgrade, making it easier to select the right reports.

IGEL partners with Citrix and Ingram Micro for simplified access to Azure workspaces


Daniel Todd

6 Nov, 2019

Cloud software provider IGEL has teamed up with Citrix and Ingram Micro to launch a new software bundle that will simplify user access to Azure-delivered cloud workspaces.

The package includes “best-in-breed” products from both IGEL and Citrix that will simplify the delivery of high performance end user computing with “anywhere access” in the cloud, IGEL said.

Ideal for businesses needing to address ageing Windows 7 endpoints before support ends on 14 January 2020, the unified solution simplifies the migration of Windows desktops to Azure — allowing organisations to realise the benefits of Windows 10 without the usual migration pain points.

“With our combined solution, IGEL, Citrix and Ingram Micro are making it easy to streamline end user computing in Azure to power cloud-based Windows Virtual Desktops that will simplify endpoint management, improve security, lower costs and keep workers productive,” said Jed Ayres, president and CEO of IGEL North America.

“And for those who have already moved to Windows 10, they too can easily migrate those desktops to the cloud, with this combined offer virtually eliminating all the headaches associated with managing and maintaining hundreds or thousands of endpoints running full-blown Windows.”

With the new bundle, organisations can leverage public cloud desktop-as-a-service (DaaS) workspaces from the Azure cloud in the form of Windows Virtual Desktops (WVDs).

Businesses will have access to the Citrix Workspace platform for managing and automating activities across all locations and devices, as well as IGEL’s Workspace Edition software which combines the IGEL OS and its Universal Management Suite (UMS) for endpoint control.

The combined solution is available now, exclusively via distributor Ingram Micro, delivered as a single offering to further streamline the adoption of WVD DaaS workspaces.

“This new offer from Ingram Micro combines the unique strengths of both Citrix and IGEL to enable organisations to realise the full benefits of Windows 10 without the typical pain of migration,” commented Nabeel Youakim, vice president of Product and Solutions Architecture at Citrix. “In particular, the new Windows 10 multi-session entitlements of Windows Virtual Desktops offer easy access to Windows 10 along with great economy for those looking to move to the Azure cloud.

“With Ingram Micro’s new offer, Citrix and IGEL are playing a key role in making Windows 10 from the cloud the new reality.”

IBM to develop public cloud banking platform


Bobby Hellard

6 Nov, 2019

IBM is working with Bank of America to develop what it claims is the world’s first financial services-ready public cloud.

IBM said the platform, named “the financial services-ready public cloud”, could enable independent software vendors and software-as-a-service (SaaS) providers to focus on deploying their core services to financial institutions, with the controls for the platform already put in place.

The aim is to give financial institutions an opportunity to efficiently assess the security, resiliency and compliance of technology vendors, through the platform’s security validation. Only independent software vendors or SaaS providers that demonstrate compliance with the platform’s policies will be eligible to deliver services through it.

The financial services-ready public cloud is expected to run on the tech giant’s public cloud and will be built with Red Hat’s OpenShift. It will include more than 190 API-driven, cloud-native platform services with which users will be able to create applications.

The company said the project has been developed with the aid of financial services experts in IBM’s networks, including some of the largest financial institutions in the world.

According to Bank of America’s CTO Cathy Bessant, it’s one of the most important collaborations in the financial services industry cloud space.

“This industry-first platform will allow Bank of America to use the public cloud, putting data security, resiliency, privacy and customer information safety needs at the forefront of decision making,” said Bessant. “By setting a standard that addresses the concern of hosting highly-confidential information, we aim to drive the public cloud to a safety level that is unmatched.”

How Johnson & Johnson boosted its performance by lifting Teradata to AWS


Lindsay Clark

6 Nov, 2019

Data has become the engine that drives modern business, and collating and analysing that data is a crucial component of many IT departments’ duties. Most turn to Enterprise Data Warehouse (EDW) technologies, which offer platforms that allow business to centralise their data for easier analysis and processing.

Teradata is among the most well-known EDW platforms on the market, having spent the last 40 years building its reputation providing on-premise EDW hardware and software for customers including General Motors, P&G, eBay and Boeing. It has since transitioned to a cloud-first model and is now available on all three major public cloud providers, following the addition of Google Cloud Platform support on 22 October 2019.

Back in 2017, however, the company’s cloud credentials were not so well established. That’s why, when healthcare and pharmaceuticals giant Johnson & Johnson (J&J) decided to move its data stores to a Teradata-powered cloud infrastructure, the plan was met with surprise and scepticism. In the years leading up to the project, says J&J’s senior manager for data and analytics Irfan Siddiqui, the company became aware its on-premise platform would not support its burgeoning data analytics requirements at an affordable price for very much longer.

“We [had] been experiencing some challenges and thinking about how we transform the traditional data warehouse into a more modern service, particularly around the flexibility, scalability and cost, and we were searching for a solution,” he told a Teradata conference in Denver, Colorado earlier this year.

And so, in 2017 it started to look at migrating its enterprise data warehouse (EDW) system to the cloud, eventually landing on Teradata as the most promising solution provider for its problems.

At that time, the offer of Teradata on AWS was not widely considered mature enough for an enterprise environment, Siddiqui tells Cloud Pro.

Six lessons from Johnson & Johnson’s EDW cloud migration

Identify all the stakeholders involved and begin discussions to identify potential challenges

Start with a small proof of concept to test all aspects of the potential solution

Understand as early as possible the network bandwidth and latency between your on-premise and cloud solutions

Expect some things to go wrong the first time you try them

Engage a strong project manager, who is good with timelines and risk, to be the single point of contact for communicating progress

Practise processes over and over again, including failure scenarios

“When Teradata released its first machine on AWS, and I said I wanted to do a proof of concept for Teradata in the cloud, people who knew Teradata, their first reaction was, ‘What? Why? Really?’.”

However, the commitment from Teradata to show its systems could work in the cloud was strong enough that Siddiqui found the confidence to go ahead with a proof of concept. Initial trials showed promise.

The 80-terabyte a-ha moment

“Most of us know doing a capacity expansion or migration to new hardware takes in the order of six months but [with AWS] we were able to spin up a formal system with 80TB of data in just 20 minutes. That was one of the ‘a-ha moments’ for us which became the driving force for us to take another step,” he says.

J&J set itself five goals in lifting Teradata to the cloud, Siddiqui says: to migrate three data environments and all its applications by the halfway point of 2019; to offer the same or improved performance compared with the on-premise system; and to increase flexibility and scalability while reducing cost.

This posed a sizeable challenge for Siddiqui’s team, which aimed to support about 300TB of storage, 50 business applications and 2,500 analytics users on to a system capable of handling more than 200 million queries per month.

It also raised some significant questions.

“How are our applications going to perform? How do we migrate? What happens with downtime, and stability and security?” he says. “We had to address these questions, not just for our leadership team, but all the stakeholders across J&J. We had to show how it would benefit each one of us.”

Most applications stay on-prem

Although all the data warehouse workloads would be in the cloud, most of the related analytics applications and data visualisation tools, including Qlik, Talend, Informatica, and Tibco, remained on-premise.

Some applications were split between the cloud and on-premise servers. For example, J&J wanted to spin up application development environments in the cloud when they were required and only pay when using them. “That is the flexibility we did not have with our own servers,” Siddiqui says.
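
As a rough sketch of that pay-as-you-go pattern – using AWS’s boto3 SDK, with a hypothetical machine image – a team might create a development environment only when it is needed and terminate it when done, so billing stops with the workload:

```python
import boto3  # AWS SDK for Python

ec2 = boto3.resource("ec2", region_name="us-east-1")

# Spin up a development environment on demand...
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical dev-environment image
    InstanceType="m5.xlarge",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "purpose", "Value": "edw-dev"}],
    }],
)

# ...and terminate it once the work is finished, ending the charges.
for instance in instances:
    instance.terminate()
```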

Given the migration had to follow an upgrade to the data warehouse production environment, deadlines became tight, and the team worked more or less continuously for three months. But by the end of June 2019, it was able to decommission the on-premise data warehouse hardware systems.

The hard work has paid off for Siddiqui and his team. Extract-transform-load jobs now take half the time compared to the on-premise system. Large Tableau workload performance has improved by 60% and another application’s data loading was cut from more than three hours to 50 minutes.

Beware the desktop data hoarders

Claudia Imhoff, industry analyst and president of Intelligence Solutions, says it makes sense to put enterprise data warehousing in the cloud in terms of scalability and performance, but there are caveats.

“It’s a wonderful place if you have all the data in there. But, unless you’re a greenfield company, nobody has all of their data in the cloud. Even if most operational systems are in the cloud, there are so many little spreadsheets that are worth gold to the company, and they’re on somebody’s desktop,” she says.

“There are arguments for bringing the data into the cloud. It is this amorphous thing, and you don’t even know where the data is being stored. And you don’t care, as long as you get access to it. Some of it’s in Azure, some of it’s in AWS, and some of it is in fill-in-the-blank cloud. And, by the way, some of it is still on-premise. Can you bring the data together virtually and analyse it? Good luck with that,” she adds.

To succeed in getting data warehousing and analytics into the cloud, IT must convince those hoarding data on desktop systems that it is in their interest to share their data. The cloud has to do something for them, she says.

Despite the challenges, enterprise IT managers can expect to see more data warehouse deployments in the cloud. In April, IDC found the market for analytics tools and EDW software hosted on the public cloud would grow by 32% annually to represent more than 44% of the total market in 2022. These organisations will have plenty to learn from J&J’s data warehouse journey.