Salesforce augments its Marketing Cloud suite with new automation tools


Sabina Weston

17 Jul, 2020

Salesforce has announced its plans to roll out three new Marketing Cloud product updates which aim to facilitate working from home for enterprise marketing teams.

The updates comprise three AI-focused innovations to its Interaction Studio, four new features in the enterprise edition of Pardot, and a new Datorama integration with Tableau.

Salesforce’s Interaction Studio, which uses artificial intelligence (AI) and machine learning to manage customer interactions, is to be improved with technology integrated from the company’s Evergage acquisition, which was finalised earlier this year.

Users will be able to build, manage, test, and implement AI-powered recommendation strategies with the new Einstein Personalization Recipes, use an “advanced, continuous learning algorithm” to choose the most suitable offer or experience through Einstein Personalization Decisions, and improve personalisation campaigns and customer engagement efforts using A/B/n Testing.
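Salesforce hasn’t published how its A/B/n assignment works under the hood, but the underlying idea is simple: each user is deterministically bucketed into one of n experience variants so their outcomes can be compared. A minimal sketch of that bucketing logic in Python, with hypothetical variant names:

```python
import hashlib

def assign_variant(user_id: str, variants: list[str]) -> str:
    """Deterministically bucket a user into one of n test variants."""
    # Hash the user ID so the same user always sees the same variant,
    # and users are spread evenly across the variant list.
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical variants for a personalisation campaign
print(assign_variant("user-42", ["control", "offer_a", "offer_b"]))
```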

Salesforce also announced that it would be integrating its Datorama analytics platform with Tableau, which was acquired by the company last year for $15.3 billion (£12 billion). The integration is to aid marketers in optimising their budget and data using Automated Marketing Data Integration.

Lastly, the software company will roll out four new features to the enterprise edition of Pardot, Salesforce’s marketing automation product. With B2B Marketing Analytics Plus, users will be able to use AI to sift through and understand data, as well as to determine the best future marketing strategies.

Pardot will also be equipped with Einstein Attribution, which produces reports and dashboards for optimised revenue credit assignment, and Pardot Business Units, which facilitates campaign management across brands, geographies and segments. Admins and developers will also be able to test, audit and configure IT processes with the new Developer Sandboxes for Pardot.

While the Datorama integration with Tableau is already available, Pardot Premium is to roll out later this month. Salesforce said the new Interaction Studio features will become generally available in the third quarter of 2020.

HPE brings new HPE GreenLake cloud experience to Europe


Daniel Todd

17 Jul, 2020

HPE has teamed up with colocation service provider Interxion to bring GreenLake services to the European market, in a move it says will offer a managed cloud experience without the complexity of managing data centres.

Hosted at Interxion’s colocation data centres, the HPE GreenLake cloud services initiative has been designed to allow customers to maintain ownership of their data and workloads without the need to manage an on-site data centre.

Announcing the move, the tech giant said the setup will bring a host of benefits – such as greater visibility of resource utilisation across co-located and public cloud-based workloads, as well as governance across enterprise applications and data. It will also enable enterprises to interconnect with connectivity providers, public clouds and each other, the firm said.

To begin with, HPE is introducing a pilot program in Ireland – but a successful trial will see the scheme quickly expand to the UK, Germany, France and the Netherlands.

“As the cloud gateway to Europe, Ireland is the perfect market to trial this new hosted HPE GreenLake cloud offer,” commented Maeve Culloty, Managing Director of HPE Ireland. “By running workloads as a service on dedicated hardware at a colocation data center, customers are getting the best of both worlds: the convenience of cloud and the security and compliance associated with a traditional on-premise infrastructure.”

The announcement follows the launch of HPE GreenLake’s new cloud services last month, which opened up access to a host of cloud services that can be deployed via HPE GreenLake Central.

Now, thanks to the Interxion collaboration, customers can leverage further key benefits – such as eliminating the cost of on-premise data centres and gaining access to public cloud providers, the company claims. Businesses will also have visibility over usage and spend and keep control of data and processes, all while running their workloads in data centres powered by 100% renewable energy.

Initially, HPE and Interxion said they are offering HPE GreenLake cloud services for private cloud with containers or virtual machines, and data centre infrastructure – which will be available via HPE and HPE channel partners.

“Our agreement to launch HPE GreenLake cloud services hosted in our data centers improves speed and agility by increasing customers’ connectivity to public clouds while staying in control of cost, security and compliance without the need to invest in an on-premise data center, as they can deploy HPE GreenLake solutions in Interxion’s colocation data centers,” commented Séamus Dunne, managing director of Interxion Ireland.

“Interxion will help businesses safeguard mission-critical data, while also taking into account their security needs and operational reliability.”

Gmail overhaul aims to reduce worker reliance on Slack and Zoom


Bobby Hellard

16 Jul, 2020

Google has made changes to Gmail that aim to turn the app into more of an all-in-one hub for remote workers, reducing their dependency on supporting services such as Slack and Zoom.

The company is incorporating more applications into its business version of Gmail, taking away the need to use multiple tabs or switch to other apps.

Following on from the integration of Google Meet and Chat into Gmail, the tech giant is increasing the capabilities of both within the email service. The collaboration features in Chat rooms now include shared files and tasks functions.

This also includes the ability to create chat rooms with people outside of your company, such as contractors or consultants, enabling cross-organisational collaboration similar to the ‘Connect’ feature Slack introduced in June.

The idea is to give G Suite customers less reason to use services like Slack and make Gmail a one-stop collaboration tool. While the email service has over 1.5 billion active users, G Suite has only six million business customers, fewer than Microsoft’s more enterprise-focused Office suite.
 
Within a room, users can open and co-edit a document with the rest of their team without leaving Gmail, similar to the functions in Google Docs. Users can also join video calls from Chat, forward a chat message to an inbox and create a task directly from a chat message.

“Virtual meetings, remote collaboration, flexible hours: it’s becoming clear that these new ways of working are here to stay,” said Javier Soltero, VP and GM of G Suite.
 
“Remote working has significantly increased the demands we’re getting from many directions – in both our professional and our personal lives. People tell us they feel overloaded with too much information and too many tasks across too many different tools. Instead of learning another tool, we need the tools we already use to be even more helpful, and work together, in an integrated, intuitive way.”

There are also updates for Meet and Chat security. In the coming weeks, Google Meet hosts will gain more control over meetings, with the ability to decide who can join or collaborate within them. Gmail’s phishing protections are also being built into Google Chat: if a user clicks on a link within the service, it will be scanned in real time and flagged if it’s found to contain malicious content.

Mine: The startup that can track down your data


Bobby Hellard

GDPR defines personal data as an ‘asset’, yet despite this modern valuation, most of us have unwittingly – or unthinkingly – given it away in exchange for online services. As such, the average digital footprint is spread far and wide.

If you can remember all the companies that hold bits of your digital info, you can begin approaching each one individually and demand they delete it – but you may be surprised by just how many organisations that turns out to be. While the mind may immediately leap to Facebook, Google and the other tech giants, plenty of obscure entities hold data too. That time you bought a hat at Disneyland, for example, the shop collected more than just a payment.

How do you start tracing companies you don’t remember engaging with? The answer is in your inbox and involves a little AI knowhow. This is the basic premise of Mine, an Israeli startup that uses machine learning to make the GDPR’s ‘right to be forgotten’ serviceable.

Gold Mine

Mine was founded by CEO Gal Ringel and CTO Gal Golan, who met in the cyber security unit of the Israeli army, and CPO Kobi Nissan, who previously worked for Candy Crush developer King. While many businesses saw GDPR as a hindrance when it came into force in 2018, for Mine it was an integral part of the company’s inception.

“When we started to research the right to be forgotten, we quickly realised that we couldn’t find one tool that makes the GDPR accessible for the average person,” Ringel tells IT Pro. “Regulations are complex and difficult for the average user to understand. With that goal in mind, we quickly realised that we needed to come up with a really simple app that uncovers what companies have your personal data, to make your digital footprint tangible, for the first time, so you can almost touch it.” 

Ringel estimates that around 350 companies are waiting to be found in the average person’s email. For work accounts it’s less than half of that, falling somewhere between 80 and 100. A staggering 90% of the companies that have your data can be found in your inbox, spam filter or even your trash folder. What’s more, the key to finding out who has your personal details isn’t in the email itself, but rather the subject line.

With Google Cloud’s AI platform, Mine has built machine learning models trained on two datasets. The first is trained on emails that have been tagged as different types of interactions – specifically learning about subject lines. This process has been repeated in 12 different languages so that the service works for users in other parts of the world, not just Israel, and can spot traces of companies from Germany, France, Italy, Israel, Spain and many more.

“We search for these traces and then reflect it back to you,” Ringel explains. “So basically the AI understands what the interaction you had with a company was just based on the email’s subject. So for example, ‘Welcome to Airbnb’, that interaction is a sign up, and ‘Your purchase from Amazon’ means you’ve bought something.

“We worked really hard for almost a year to develop machine learning that is non-intrusive, but basically scans your inbox by only looking at the email subject. So it never actually reads the context of the email, because we don’t want to see the process of how they collect the personal data and we also don’t want to be collecting any email data.”
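Mine hasn’t published its models, but the approach Ringel describes – inferring the interaction type from the subject line alone – maps onto a standard text-classification pipeline. A minimal sketch with toy data, assuming scikit-learn rather than whatever Mine actually runs on Google Cloud’s AI platform:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: subject lines hand-tagged by interaction type
subjects = [
    "Welcome to Airbnb",
    "Your Netflix sign-up is complete",
    "Your purchase from Amazon",
    "Receipt for your payment to Spotify",
]
labels = ["signup", "signup", "purchase", "purchase"]

# The model only ever sees the subject line -- never the email body
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(subjects, labels)

print(model.predict(["Welcome to Disney+"]))  # likely ['signup']
```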

How Mine understands what data you’ve given to that company is down to the second dataset, which has been trained on thousands of privacy policies. Under the rules of the GDPR, companies have to be transparent about the ‘what’ and ‘why’ of data collection. Airbnb, for example, collects your data for two reasons: signing up to its service and payments. So it will have your name, address, email, mobile phone number and a copy of your passport, plus payment details if you’ve ever used its service.

Sign up

Naturally, to find all this out, you need to sign up to one more service: Mine’s. It requires your email address to perform its basic function, plus your first and last name so that it can contact each company on your behalf. Upon registering your email, the company says, the machine learning models get to work and within 30 seconds you’ll be presented with 40 or so companies that have your data – this expands to hundreds after roughly 48 hours, and the service continues to notify you as and when you sign up to more.

All the usual suspects will be there – Netflix, LinkedIn, Amazon – along with an assortment of unknown and forgettable one-time services. Underneath each will be the data you’ve shared and a button to take action. Click on this and Mine sends a deletion request on your behalf. Some companies, however, will be listed as “action unavailable”.

“The reason you see action unavailable can be for two reasons,” Ringel explains. “First, these are companies that we still haven’t got round to analysing their privacy policies and learned their data structure. And the second could be that we didn’t find any contact information within their privacy policy. So we don’t know who to approach. When you want to reclaim from a company, we automatically shoot an email to its data protection officer or its privacy officer.” 

As a company turning the GDPR into a service, Mine will come under more scrutiny than most when it comes to compliance. The company’s own privacy policy has no margin for error.  

“The only data we do store is your email address, which you signed up with, and a list of the companies’ names we identified in your footprint,” Ringel confirms. “You can easily request a copy of the data we hold about you to see exactly what we keep. We are fully transparent on everything we are doing and, of course, in line with GDPR regulations.”

Main image copyright: Mine.

HSBC agrees multi-year cloud partnership with AWS


Bobby Hellard

15 Jul, 2020

HSBC Holdings has selected Amazon Web Services (AWS) as its long-term cloud provider for its planned digital transformation.

The multi-year agreement will make AWS technology available across the company’s business, starting with customer-facing applications and modernisation of its Global Wealth & Personal Banking arm.

HSBC Holdings, the parent company of HSBC, is headquartered in London and serves customers around the world out of offices in 64 countries across Europe, Asia, North America, Latin America, and the Middle East and North Africa. 

With the migration, HSBC will have access to AWS’s extensive portfolio of cloud services – including compute, containers, storage, database, analytics, machine learning, and security – to develop new digital products and support security and compliance standards for millions of its personal banking customers around the world.

The bank plans to use AWS serverless and analytics services, including Amazon Kinesis, to create a more personalised and customer-centric banking experience, it said. 
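HSBC hasn’t shared implementation details, but a Kinesis-based personalisation pipeline typically starts by streaming customer events into a data stream for downstream analytics. A minimal sketch using boto3, with a hypothetical stream name and event shape:

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-2")

# Hypothetical customer event; real schemas would be far richer
event = {"customer_id": "c-123", "action": "viewed_product", "product": "isa"}

kinesis.put_record(
    StreamName="banking-events",        # hypothetical stream name
    Data=json.dumps(event).encode(),    # payload must be bytes
    PartitionKey=event["customer_id"],  # keeps one customer's events ordered
)
```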

“Our work with AWS is an example of how HSBC continues to invest in secure and advanced technologies to make our digital banking experience even better for customers,” said Dinesh Keswani, CTO and CIO for digital at HSBC. 

“Our ambition is to make it easy, safe, and reliable for customers to bank with us, whenever and wherever they are. HSBC’s collaboration with AWS helps us to deliver innovative banking solutions to customers at a faster rate, starting with our Wealth & Personal Banking business.”

According to AWS, HSBC is continuing to expand its use of the provider’s cloud services to deliver innovative financial services that help customers grow their wealth in “new and more personalised ways”.

“We look forward to our continued collaboration with HSBC as they leverage AWS’s proven capabilities, reliability, and security to drive efficiency across their business and become a more agile organisation in the cloud,” said Frank Fallon, VP of financial services at AWS.

Microsoft and Citrix expand partnership for the new normal


Bobby Hellard

14 Jul, 2020

Microsoft and Citrix have announced an expansion of their partnership to help organisations deal with the move to remote and agile business models.

The multi-year deal between Citrix and Microsoft aims to help businesses accelerate the move to the cloud and speed up adoption of digital workspaces and virtual desktops.

Citrix and its workspace portal will become Microsoft’s “preferred” digital workspace, while Citrix has chosen Azure as its preferred cloud platform.

The thinking behind the deal is to help businesses address the “workplace of the future” with the pandemic drastically changing the way companies operate. Citrix believes that more organisations will make remote work a permanent part of their cost and workforce management strategies, which will mean that office structures will change. 

Virtual desktops have been an area of heavy investment for Microsoft, with the pandemic forcing many businesses to shift to working from home. As a result, Windows Virtual Desktop usage has almost trebled in recent months and the company’s CEO, Satya Nadella, has prioritised work on both Microsoft Teams and Windows Virtual Desktop.
 
The two companies will provide joint tools and services to simplify and speed up the transition of on-premises Citrix customers to Azure. They will also devise a connected roadmap to enable a consistent and optimal flexible work experience, including joint services comprising Citrix Workspace, Citrix SD-WAN, Microsoft Azure and Microsoft 365.

“The COVID-19 pandemic has forced businesses around the world to change the way that employees work, while still meeting the speed and security requirements that today’s uncertain business environment demands,” said David Henshall, president and CEO of Citrix.

“Looking forward, hybrid-work models will become the standard for many customers, requiring a flexible infrastructure to support, secure and empower their teams.
 
“Together, Citrix and Microsoft can deliver a powerful digital workspace in a trusted and secure public cloud where employees can access everything they need to engage and be productive whether they are at home, in the office or on the road.”

Google launches Confidential VMs for sensitive data processing


Dale Walker

14 Jul, 2020

Confidential VMs will be the first product in Google Cloud’s new confidential computing portfolio, the company has revealed, allowing companies to process sensitive data while keeping it encrypted in memory.

The announcement aims to capitalise on growing interest in confidential computing, a field that promises to revolutionise cloud computing by, in effect, keeping data encrypted at all times.

Until now, like many cloud providers, Google offered encryption on data at rest and in transit, requiring that data be decrypted before it could be processed. With Confidential VMs, Google customers can keep data encrypted while it is being processed inside a virtual machine.

Google’s new feature is an evolution of its Shielded VMs, a tool launched in 2018 that companies could deploy to strip out most of the potentially vulnerable startup processes that trigger when attempting to create a new environment. This is in addition to a few layers of extra protection against external attacks, and monitoring systems that check for unexpected changes to data.

These added layers of security were required given that data is normally decrypted in order to be processed inside the VM – something that not only creates added risk from external attacks, but also forces companies to deploy strict access controls to ensure only the right employees handle the data.

The Confidential VMs feature, available as a beta today, attempts to solve these issues by allowing customers to encrypt their data in memory, meaning encryption can be maintained while it is being used, indexed, queried, or trained on.

This promises to have profound implications for those industries that process highly sensitive or heavily regulated data, such as those in finance and health, or government agencies. Companies in these sectors, which are usually forced to keep most of their data processing in their own private networks, now have a public cloud option, Google claims.

“These companies want to adopt the latest cloud technologies, but strict requirements for data privacy or compliance are often barriers,” said Sunil Potti, general manager and VP of Security at Google Cloud. “Confidential VMs… will help us better serve customers in these industries, so they can securely take advantage of the innovation of the cloud while also simplifying security operations.”

Providing confidential computing is largely a question of hardware, something that many vendors have grappled with over the past few years. In this case, Google has turned to AMD and its second-generation EPYC CPUs – these now support a ‘Secure Encrypted Virtualisation (SEV)’ feature, which allows a VM to run with encrypted memory using a unique, non-exportable, key.

“Our deep partnership with Google Cloud on its Confidential VMs solution helps ensure that customers can secure their data and achieve performance when adopting this transformational technology,” said Dan McNamara, senior vice president and general manager of AMD’s Server Business Unit.

“Confidential VMs offer high performance for the most demanding computational tasks all while keeping VM memory encrypted with a dedicated per-VM instance key that is generated and managed by our hardware.”

The company has also confirmed that any customers already running workloads in a VM on Google Cloud Platform will be able to shift these over to a Confidential VM using a checkbox.
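Under the hood, that checkbox corresponds to a single setting on the instance. As a rough sketch of what creating a Confidential VM looks like through the beta Compute Engine API – the project, zone, instance name and image here are hypothetical placeholders:

```python
from googleapiclient import discovery

compute = discovery.build("compute", "beta")

body = {
    "name": "confidential-demo",
    # Confidential VMs require AMD EPYC-based N2D machine types
    "machineType": "zones/us-central1-a/machineTypes/n2d-standard-2",
    "confidentialInstanceConfig": {"enableConfidentialCompute": True},
    # Live migration isn't supported with SEV, so terminate on maintenance
    "scheduling": {"onHostMaintenance": "TERMINATE"},
    "disks": [{
        "boot": True,
        "initializeParams": {
            "sourceImage": "projects/debian-cloud/global/images/family/debian-10",
        },
    }],
    "networkInterfaces": [{"network": "global/networks/default"}],
}

compute.instances().insert(
    project="my-project", zone="us-central1-a", body=body
).execute()
```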

Google has also said that VM memory encryption will not interfere with workload output, promising that the performance of Confidential VMs will be on par with that of non-confidential VMs.

BigQuery Omni pulls Google, AWS, and Azure analytics into one UI


Dale Walker

14 Jul, 2020

Google has launched an early version of BigQuery Omni, a new analytics tool that lets users access and view data across Google Cloud and Amazon Web Services without leaving the BigQuery UI.

Powered by Google Anthos, the company’s vendor-neutral app development platform, the tool lets users work with SQL and the standard BigQuery APIs to manipulate data silos sourced from multiple platforms, without having to manage the underlying infrastructure.

Although the initial alpha launch of the service is restricted to Google Cloud and AWS, Google has also confirmed that Microsoft Azure will eventually be supported.

The tool has been designed to target those customers who rely on multiple cloud service providers and are forced to juggle and consolidate a number of analytics tools in order to get a view of their data.

This is made possible by the decoupling of storage and compute, according to the firm. The compute side has always been regarded as ‘stateless’ but, until now, BigQuery required data to be stored in Google Cloud – a restriction that has now been scrapped, allowing customers to store their data in any supported public cloud.

This single view means that customers can use BigQuery Omni to run SQL queries on clusters in whichever region the data resides. For example, it will be possible to query Google Analytics 360 Ads data stored in Google Cloud while simultaneously querying logs data from any apps stored in AWS S3. This can then be used to build a dashboard to get a complete view of audience behaviour alongside ad spend.
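Because the BigQuery APIs are unchanged, a cross-cloud query looks like any other from the client’s point of view. A minimal sketch using the google-cloud-bigquery Python client – the project, dataset and table names are hypothetical, but the dataset is assumed to hold data living in an AWS region:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Standard SQL against a dataset whose data sits in AWS S3;
# BigQuery Omni runs the query in the region where the data resides
sql = """
    SELECT logs.user_id, COUNT(*) AS sessions
    FROM `my-project.aws_app_logs.events` AS logs
    GROUP BY logs.user_id
    ORDER BY sessions DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(row.user_id, row.sessions)
```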

This means customers can avoid any costs associated with moving or copying data between cloud platforms in order to get a full view, Google claims.

“85% of respondents to 451 Research’s Voice of the Enterprise Data & Analytics, Data Platforms 1H20 survey agreed that the ability to run the same database on multiple cloud/data centre environments is an important consideration when selecting a new data platform,” said Matt Aslett, research director, Data, AI and Analytics, 451 Research.

“As hybrid and multi-cloud adoption has become the norm, enterprises are increasingly looking for data products that provide a consistent experience and lower complexity of using multiple clouds, while enabling the ongoing use of existing infrastructure investments,” he added.

The new system is built using Anthos, an app development platform launched last year to appease customers that wanted a single programming model allowing data to be moved between their various cloud providers without charge or code changes.

The underlying infrastructure is run entirely by Google, including any communication between cloud providers, on the familiar BigQuery UI, so there will be little operational change from the customers’ perspective, the company claims.

BigQuery Omni’s engine will run on Anthos clusters inside the BigQuery managed service, and will source data from the various data silos across a customer’s public cloud services, provided the customer has granted authorisation. In order to run queries, data is temporarily moved from the cloud provider’s data storage to the BigQuery cluster running on Anthos.

For now, BigQuery Omni is only available in private alpha, so interested customers will need to apply to Google to use it. It also only supports AWS S3 at present, with Azure support coming soon.

There is currently no general release date available.

SAP patches critical flaw that lets hackers seize control of servers


Keumars Afifi-Sabet

14 Jul, 2020

Software company SAP has patched a critical vulnerability that can be exploited by an unauthenticated hacker to take control of systems and applications.

The flaw, assigned CVE-2020-6287, affects the LM Configuration Wizard component of the NetWeaver Application Server (AS) Java platform and potentially impacts 40,000 customers, according to Onapsis, which discovered the vulnerability.

Alarmingly, the flaw has been rated 10 out of 10 on the CVSS scale and has spurred the United States Computer Emergency Readiness Team (US-CERT) into issuing an alert encouraging organisations to patch their systems immediately.

“Due to the criticality of this vulnerability, the attack surface this vulnerability represents, and the importance of SAP’s business applications, the Cybersecurity and Infrastructure Security Agency (CISA) strongly recommends organizations immediately apply patches,” the alert said. 

“CISA recommends organizations prioritize patching internet-facing systems, and then internal systems.”

Those unable to patch their systems should mitigate the vulnerability by disabling the LM Configuration Wizard service. Should this step be impossible, or take more than 24 hours to complete, CISA has recommended closely monitoring SAP NetWeaver AS for any suspicious or anomalous activity. 

The flaw is a result of the lack of authentication in a web component of the SAP NetWeaver AS for Java which allows for several high-privileged activities on the SAP system. 

Successful exploitation would see a remote hacker obtain unrestricted access to SAP systems by creating high-privileged users and executing arbitrary OS commands with high privileges. Hackers would then retain unrestricted access to the SAP database and could perform application maintenance activities.

The flaw, in essence, entirely undermines confidentiality, integrity and availability of data and processes hosted by the SAP application. 

The vulnerability is present by default in SAP applications running over SAP NetWeaver AS Java 7.3, and any newer versions up to SAP NetWeaver 7.5, affecting a handful of applications. These include SAP Enterprise Resource Planning (ERP), SAP Product Lifecycle Management, SAP Customer Relationship Management (CRM), and around a dozen more.

Flaws rated 10/10 on the CVSS scale are rarely encountered, and ordinarily mean the vulnerability is highly exploitable, easy to trigger, and requires little or no additional privileges or user interaction. Nevertheless, the SAP flaw is the second 10-rated vulnerability disclosed within a couple of weeks, after Palo Alto Networks patched a flaw in its networking services centred on its SAML-based authentication mechanism.

Both the SAP and Palo Alto flaws were highlighted by US government agencies, the former flagged by US-CERT and the latter by US Cyber Command.

What is serverless computing?


Steve Cassidy

14 Jul, 2020

If you’re looking to move away from hybrid cloud and pack up your on-premises servers altogether, but are worried about how your applications will run in the cloud, serverless computing could be the right strategy for you.

Serverless computing? As in running everything on desktops?

Ah, no – serverless computing means building your server functions in the cloud, rather than on a local, physical machine. In this way, they can benefit from demand-driven management, spinning up as required then closing down again when, for example, the entire human race decides to stay at home for several months. Ideally, functions should be fully portable, eschewing platform-specific services and tricks, so they’ll run in any data centre.
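To make that concrete, here’s what a serverless function typically looks like – a minimal AWS Lambda-style Python handler (the platform and event shape are illustrative; Google and Azure equivalents differ mainly in signature):

```python
import json

def handler(event, context):
    """Invoked on demand by the platform; no server process to manage.

    Instances spin up when requests arrive and are torn down when
    traffic stops, so you pay only for execution time.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```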

So we can go serverless and retire our old servers?

It’s unlikely that you’d be able to do a straightforward lift-and-shift of your old, badly behaved suite of IT resources up into the cloud. Any function that depends on some older technology (say, for example, a Windows dialog box) will have to be rebuilt with modern tools that embrace scalability and portability. Indeed, even once you’ve moved, it might make sense to keep your older servers running in parallel for some time, as a fallback in case of unforeseen hiccups.

Could we at least streamline our local admin team?

If that’s your plan, make sure they’re ready to come back on a consultancy basis: you’re going to need their knowledge more than ever while the development is in progress, and likely for some time afterwards. Only the very simplest of businesses can make a consequence-free shift, and they’re still likely to need some techie oversight to ensure everything is scaling and behaving like it should.

Surely moving our everyday line-of-business functions off-site is going to slow things down?

If you have a big on-site compute load then it might, but for outward-facing services – that is, ones used by your customers rather than your employees – moving to a scalable architecture could speed things up. What’s more, a serverless approach easily allows for multiple instances so you can, for example, create different versions of your site for different users and markets.

Is it wise to put our critical functions in the hands of a third party?

Part of the beauty of the serverless model is that you’re not tied to any single provider. If there’s a problem with one host, you can just pop a serverless image onto a flash drive and fire it up somewhere else. Running instances here and there might not be cheap, but it’s a much more resilient position than one where yanking out a 13A lead will scuttle your whole operation.

Are there other benefits?

Most popular business apps are now very old: histories stretching back 20 years or more are not uncommon. That means you’re working with two decades of accumulated bug fixes, function changes and bloat. The process of moving to a serverless model gives you a chance to take stock, assess which parts of your code portfolio could work better in the cloud, and to re-engineer any broken or backward functions. 

So when will our everyday apps go serverless?

Basic, network-shared apps aren’t going to magically transform into serverless versions: the cost of moving outweighs the advantages. However, it may be that service providers (like your card payment processor) migrate you to serverless because you’re only using one specific part of their offering, so it makes sense for them to only fire up the code you’re using. That move will probably be entirely invisible to you, though – which is just as it should be.