All posts by louiscolumbus

How does privileged access security work on AWS and other public clouds?

Bottom line: Amazon’s Identity and Access Management (IAM) centralises identity roles, policies, and Config Rules, yet doesn’t go far enough to provide the Zero Trust-based approach to Privileged Access Management (PAM) that enterprises need today.

AWS provides a baseline level of support for Identity and Access Management at no charge as part of their AWS instances, as do other public cloud providers. Designed to provide customers with the essentials to support IAM, the free version often doesn’t go far enough to support PAM at the enterprise level. To AWS’s credit, they continue to invest in IAM features while fine-tuning how Config Rules in their IAM can create alerts using AWS Lambda. AWS’s native IAM can also integrate at the API level with HR systems and corporate directories, and suspend users who violate access privileges.
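As an illustration of the kind of least-privilege hygiene check enterprises typically layer on top of native IAM, the sketch below scans an IAM-style JSON policy for wildcard grants. The function and the sample policy are illustrative assumptions, not part of any AWS SDK:

```python
import json

def find_wildcard_statements(policy_doc: str):
    """Return 'Allow' statements in an IAM-style policy that grant overly
    broad ('*') actions or resources — a simple least-privilege check."""
    policy = json.loads(policy_doc)
    statements = policy.get("Statement", [])
    if isinstance(statements, dict):  # a single statement may not be wrapped in a list
        statements = [statements]
    flagged = []
    for stmt in statements:
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        resources = stmt.get("Resource", [])
        if isinstance(resources, str):
            resources = [resources]
        if "*" in actions or "*" in resources:
            flagged.append(stmt)
    return flagged

# Example: a policy mixing one admin-level grant with one scoped grant
policy = '''{
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow", "Action": "*", "Resource": "*"},
    {"Effect": "Allow", "Action": "s3:GetObject", "Resource": "arn:aws:s3:::reports/*"}
  ]
}'''
print(len(find_wildcard_statements(policy)))  # → 1: only the wildcard grant is flagged
```

In practice a check like this would run against policies pulled via the AWS APIs or flagged by Config Rules; the point is that wildcard “Allow” statements are exactly the over-privileged grants a PAM programme works to eliminate.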

In short, the native IAM capabilities offered by AWS, Microsoft Azure, Google Cloud, and others provide enough functionality to help an organisation get up and running to control access in their respective homogeneous cloud environments. However, they often lack the scale to fully address the more challenging, complex areas of IAM and PAM in hybrid or multi-cloud environments.

The truth about privileged access security on cloud providers like AWS

The essence of the Shared Responsibility Model is that AWS is responsible for the security of the cloud itself (the infrastructure, hardware, software, and facilities), while customers are responsible for securing operating systems, platforms, and data. The AWS version of the Shared Responsibility Model, shown below, illustrates how Amazon has defined securing the data itself, management of the platform, applications and how they’re accessed, and various configurations as the customers’ responsibility:

AWS provides basic IAM support that protects its customers against privileged credential abuse in a homogenous AWS-only environment. Forrester estimates that 80% of data breaches involve compromised privileged credentials, and a recent survey by Centrify found that 74% of all breaches involved privileged access abuse.

The following are the four truths about privileged access security on AWS (and, generally, other public cloud providers):

Customers of AWS and other public cloud providers should not fall for the myth that cloud service providers can completely protect their customised and highly individualised cloud instances

As the Shared Responsibility Model above illustrates, AWS secures the core areas of their cloud platform, including infrastructure and hosting services. AWS customers are responsible for securing operating systems, platforms, data and, most importantly, privileged access credentials.

Organisations need to consider the Shared Responsibility Model the starting point for creating an enterprise-wide security strategy, with a Zero Trust Security framework being the long-term goal. AWS’s IAM is an interim solution to the long-term challenge of achieving Zero Trust Privilege across an enterprise ecosystem that is going to become more hybrid or multi-cloud as time goes on.

Despite what many AWS integrators say, adopting a new cloud platform doesn’t require a new Privileged Access Security model

Many organisations who have adopted AWS and other cloud platforms are using the same Privileged Access Security Model they have in place for their existing on-premises systems. The truth is the same Privileged Access Security Model can be used for on-premises and IaaS implementations.

Even AWS itself has stated that conventional security and compliance concepts still apply in the cloud. For an overview of the most valuable best practices for securing AWS instances, please see my previous post, 6 Best Practices For Increasing Security In AWS In A Zero Trust World.

Hybrid cloud architectures that include AWS instances don’t need an entirely new identity infrastructure and can rely on advanced technologies, including Multi-Directory Brokering

Creating duplicate identities increases cost, risk, and overhead, and adds the burden of requiring additional licenses. Existing directories (such as Active Directory) can be extended through various deployment options, each with their strengths and weaknesses. Centrify, for example, offers Multi-Directory Brokering to use whatever preferred directory already exists in an organisation to authenticate users in hybrid and multi-cloud environments.

And while AWS provides key pairs for access to Amazon Elastic Compute Cloud (Amazon EC2) instances, their security best practices recommend that a holistic approach be used across on-premises and multi-cloud environments, including Active Directory or LDAP in the security architecture.

It’s possible to scale existing Privileged Access Management systems in use for on-premises systems today to hybrid cloud platforms that include AWS, Google Cloud, Microsoft Azure, and other platforms

There’s a tendency on the part of system integrators specialising in cloud security to oversell cloud service providers’ native IAM and PAM capabilities, saying that a hybrid cloud strategy requires separate systems. Look for system integrators and experienced security solutions providers who can use a common security model already in place to move workloads to new AWS instances.

Conclusion

The truth is that Identity and Access Management solutions built into public cloud offerings such as AWS, Microsoft Azure, and Google Cloud are stop-gap solutions to a long-term security challenge many organisations are facing today. Instead of relying only on a public cloud provider’s IAM and security solutions, every organisation’s cloud security goals need to include a holistic approach to identity and access management and not create silos for each cloud environment they are using.

While AWS continues to invest in their IAM solution, organisations need to prioritise protecting their privileged access credentials – the “keys to the kingdom” – that, if ever compromised, would allow hackers to walk in the front door of the most valuable systems an organisation has. The four truths defined in this article are essential for building a Zero Trust roadmap for any organisation, one that will scale with them as they grow.

By taking a “never trust, always verify, enforce least privilege” strategy when it comes to their hybrid- and multi-cloud strategies, organisations can alleviate costly breaches that harm the long-term operations of any business.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

How AWS certifications are increasing tech salaries by up to $12k per year

  • AWS and Google certifications are among the most lucrative in North America, paying average salaries of $129,868 and $147,357 respectively
  • Cross-certifying on AWS is providing a $12K salary bump to IT professionals who already have Citrix and Red Hat/Linux certifications today
  • Globally, four of the five top-paying certifications are in cloud computing

These and many other insights into which certifications provide the highest salaries by region of the world are from the recently published Global Knowledge 2019 IT Skills and Salary Report. The report is downloadable here (27 pp., PDF, free, opt-in). The methodology is based on 12,271 interviews across non-management IT staff (29% of interviews), mid-level professionals including managers and team leads (43%), and senior-level and executive roles (28%) across four global regions. For additional details regarding the study’s methodology, please see page 24 of the report.

Key insights from the report include the following:

Cross-certifying on AWS is providing a $12K salary bump to IT professionals who already have Citrix and Red Hat/Linux certifications

Citrix certifications pay an average salary of $109,546 and those earning an AWS certification see a $12,339 salary bump on average. Red Hat/Linux certification-based jobs pay an average of $113,165 and are seeing an average salary bump of $12,553.  Cisco-certified IT professionals who gain AWS certification increase their salaries on average from $101,533 to $111,869, gaining a 10.2% increase. The following chart compares the salary bump AWS certifications are providing to IT professionals with seven of the more popular certifications (please click on the graphic to expand for easier reading).
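The percentage figures above can be sanity-checked directly from the report’s salary numbers; here is the Cisco example recomputed:

```python
# Recomputing the salary-bump percentage quoted above from the report's figures
def pct_increase(before: float, after: float) -> float:
    """Percentage increase from `before` to `after`, rounded to one decimal."""
    return round((after - before) / before * 100, 1)

cisco_before, cisco_after = 101_533, 111_869
print(pct_increase(cisco_before, cisco_after))  # → 10.2, matching the 10.2% cited
```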

AWS and Google certifications are among the most lucrative in North America, paying average salaries of $129,868 and $147,357 while the most popular are cybersecurity, governance, compliance, and policy

27% of all respondents to Global Knowledge’s survey have at least one certification in this category. Nearly 18% are ITIL certified. In North America, the most popular certification categories beyond cybersecurity are CompTIA, Microsoft, and Cisco. The following table from the report provides an overview of salary by certification category (please click on the graphic to expand for easier reading).

AWS Certified Solutions Architect – Associate is the most popular AWS certification today, with 72% of respondents having achieved its requirements

Certified Solutions Architect – Associate leads the top five most commonly held AWS certifications today according to the survey. AWS Certified Developer – Associate (33%), AWS Certified SysOps Administrator – Associate (24%), AWS Certified Solutions Architect – Professional (16%) and AWS Certified Cloud Practitioner round out the top five most common AWS certifications across the 12,271 global respondents to the Global Knowledge survey.


Capgemini report shows why AI is the future of cybersecurity

These and many other insights are from Capgemini’s Reinventing Cybersecurity with Artificial Intelligence Report published this week. You can download the report here (28 pp., PDF, free, no opt-in). Capgemini Research Institute surveyed 850 senior executives from seven industries, including consumer products, retail, banking, insurance, automotive, utilities, and telecom. 20% of the executive respondents are CIOs, and 10% are CISOs. Enterprises headquartered in France, Germany, the UK, the US, Australia, the Netherlands, India, Italy, Spain, and Sweden are included in the report. Please see page 21 of the report for a description of the methodology.

Capgemini found that as digital businesses grow, their risk of cyberattacks exponentially increases. 21% said their organization experienced a cybersecurity breach leading to unauthorized access in 2018.

Enterprises are paying a heavy price for cybersecurity breaches: 20% report losses of more than $50 million. Centrify’s most recent survey, Privileged Access Management in the Modern Threatscape, found that 74% of all breaches involved access to a privileged account. Privileged access credentials are hackers’ most popular technique for initiating a breach to exfiltrate valuable data from enterprise systems and sell it on the Dark Web.

Key insights include the following:

69% of enterprises believe AI will be necessary to respond to cyberattacks

The majority of telecom companies (80%) say they are counting on AI to help identify threats and thwart attacks. Capgemini found the telecom industry has the highest reported incidence of losses exceeding $50M, making AI a priority for thwarting costly breaches in that industry.

It’s understandable that Consumer Products (78%) and Banking (75%) are second and third, given each industry’s growing reliance on digitally based business models. U.S.-based enterprises are placing the highest priority on AI-based cybersecurity applications and platforms, 15% higher than the global average when measured on a country basis.

73% of enterprises are testing use cases for AI for cybersecurity across their organisations today with network security leading all categories

Endpoint security is the third-highest priority for investing in AI-based cybersecurity solutions given the proliferation of endpoint devices, which are expected to increase to over 25B by 2021. Internet of Things (IoT) and Industrial Internet of Things (IIoT) sensors and the systems they enable are exponentially increasing the number of endpoints and threat surfaces an enterprise needs to protect.

The old “trust but verify” approach to enterprise security can’t keep up with the pace and scale of threatscape growth today. Identities are the new security perimeter, and they require a Zero Trust Security framework to be secure. Be sure to follow Chase Cunningham of Forrester, Principal Analyst, and the leading authority on Zero Trust Security to keep current on this rapidly changing area. You can find his blog here.

51% of executives are making extensive use of AI for cyber threat detection, outpacing prediction and response by a wide margin

Enterprise executives are concentrating their budgets and time on detecting cyber threats using AI above predicting and responding. As enterprises mature in their use and adoption of AI as part of their cybersecurity efforts, prediction and response will correspondingly increase. “AI tools are also getting better at drawing on data sets of wildly different types, allowing the “bigger picture” to be put together from, say, static configuration data, historic local logs, global threat landscapes, and contemporaneous event streams,” said Nicko van Someren, Chief Technology Officer at Absolute Software.

64% say that AI lowers the cost to detect and respond to breaches and reduces the overall time taken to detect threats and breaches by up to 12%

The reduction in cost for a majority of enterprises ranges from 1% – 15% (with an average of 12%). With AI, the overall time taken to detect threats and breaches is reduced by up to 12%. Dwell time – the amount of time threat actors remain undetected – drops by 11% with the use of AI. This time reduction is achieved by continuously scanning for known or unknown anomalies that show threat patterns. PetSmart, a US-based specialty retailer, was able to save up to $12M by using AI in fraud detection from Kount. By partnering with Kount, PetSmart was able to implement an AI/Machine Learning technology that aggregates millions of transactions and their outcomes.

The technology determines the legitimacy of each transaction by comparing it against all other transactions received. As fraudulent orders were identified, they were canceled, saving the company money and avoiding damage to the brand. The top 9 ways Artificial Intelligence prevents fraud provides insights into how Kount’s approach to unsupervised and supervised machine learning stops fraud.
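Kount’s models are proprietary, but the core idea of judging each transaction against the population of all others can be illustrated with a toy unsupervised outlier check (the data and threshold here are invented for illustration):

```python
import statistics

def flag_outliers(amounts, threshold=2.5):
    """Flag transaction amounts more than `threshold` population standard
    deviations from the mean — a toy stand-in for unsupervised anomaly
    detection over a batch of transactions."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:  # all amounts identical: nothing stands out
        return []
    return [a for a in amounts if abs(a - mean) / stdev > threshold]

# Mostly routine order values, with one extreme amount that stands apart
orders = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0, 44.0, 58.0, 49.0, 5000.0]
print(flag_outliers(orders))  # → [5000.0]
```

A production fraud system would score far richer features than the raw amount, but the principle is the same: legitimacy is judged by comparison against all other observed transactions rather than by fixed rules.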

Fraud detection, malware detection, intrusion detection, scoring risk in a network, and user/machine behavioral analysis are the five highest AI use cases for improving cybersecurity

Capgemini analyzed 20 use cases across information technology (IT), operational technology (OT) and the Internet of Things (IoT) and ranked them according to their implementation complexity and resultant benefits (in terms of time reduction).

Based on their analysis, Capgemini recommends a shortlist of five high-potential use cases that have low complexity and high benefits. 54% of enterprises have already implemented these five high-impact use cases. The following graphic compares the recommended use cases by the level of benefit and relative complexity.

56% of senior execs say their cybersecurity analysts are overwhelmed and close to a quarter (23%) are not able to successfully investigate all identified incidents

Capgemini found that hacking organizations are successfully using algorithms to send ‘spear phishing’ tweets (personalized tweets sent to targeted users to trick them into sharing sensitive information). AI can send the tweets six times faster than a human and with twice the success. “It’s no surprise that Capgemini’s data shows that security analysts are overwhelmed. The cybersecurity skills shortage has been growing for some time, and so have the number and complexity of attacks; using machine learning to augment the few available skilled people can help ease this. What’s exciting about the state of the industry right now is that recent advances in Machine Learning methods are poised to make their way into deployable products,” said van Someren.

Conclusion

AI and machine learning are redefining every aspect of cybersecurity today. From improving organizations’ ability to anticipate and thwart breaches and protecting the proliferating number of threat surfaces with Zero Trust Security frameworks, to making passwords obsolete, AI and machine learning are essential to securing the perimeters of any business.

One of the most vulnerable and fastest-growing threat surfaces is mobile phones. Two recent research reports from MobileIron, Say Goodbye to Passwords (4 pp., PDF, opt-in), produced in collaboration with IDG, and Passwordless Authentication: Bridging the Gap Between High-Security and Low-Friction Identity Management (34 pp., PDF, opt-in) by Enterprise Management Associates (EMA), provide fascinating insights into the passwordless future. They reflect and quantify how ready enterprises are to abandon passwords for more proven authentication techniques, including biometrics and mobile-centric Zero Trust Security platforms.


10 charts that will change your perspective of AI in marketing

  • Top-performing companies are more than twice as likely to be using AI for marketing (28% vs. 12%) according to Adobe’s latest Digital Intelligence Briefing.
  • Retailers are investing $5.9B this year in AI-based marketing and customer service solutions to improve shoppers’ buying experiences according to IDC.
  • Financial Services marketers lead all other industries in AI application adoption, with 37% currently using them today.
  • Sales and Marketing teams most often collaborate using Configure-Price-Quote (CPQ) and Marketing Automation AI-based applications, with sales leaders predicting AI adoption will increase 155% across sales teams in two years.

Artificial Intelligence enables marketers to understand sales cycles better, correlating their strategies and spending to sales results. AI-driven insights are also helping to break down data silos so marketing and sales can collaborate more on deals. Marketing is more analytics and quant-driven than ever before with the best CMOs knowing which metrics and KPIs to track and why they fluctuate.

The bottom line is that machine learning and AI are the technologies CMOs and their teams need to excel today. The best CMOs balance the quant-intensive nature of running marketing with the qualitative factors that make a company’s brand and customer experience unique. With greater insight into how prospects make decisions about when, where, and how to buy, CMOs are bringing a new level of intensity into driving outcomes. An example of this can be seen in the recent Forbes Insights and Quantcast research, Lessons of 21st-Century Brands Modern Brands & AI Report (17 pp., PDF, free, opt-in). The study found that AI enables marketers to increase sales (52%), increase customer retention (51%), and succeed at new product launches (49%). AI is making solid contributions to improving lead quality, persona development, segmentation, pricing, and service.

The following ten charts provide insights into how AI is transforming marketing:

21% of sales leaders rely on AI-based applications today, with the majority sharing these applications with marketing teams

Sales leaders predict that their use of AI will increase 155% in the next two years. Sales leaders predict AI will reach critical mass by 2020 when 54% expect to be using these technologies. Marketing and sales are relying on AI-based marketing automation, configure-price-quote (CPQ), and intelligent selling systems to increase revenue and profit growth significantly in the next two years. Source: Salesforce Research, State of Sales, 3rd edition. (58 pp., PDF, free, opt-in).

AI sees the most significant adoption by marketers working in $500m to $1bn companies, with conversational AI for customer service the most dominant

Businesses with revenues between $500M and $1B lead all other revenue categories in the number and depth of AI adoption use cases. Just over 52% of small businesses with sales of $25M or less are using AI for predictive analytics for customer insights. It’s interesting to note that small companies are the leaders in AI spending, at 38.1%, to improve marketing ROI by optimising marketing content and timing. Source: The CMO Survey: Highlights and Insights Report, February 2019. Duke University, Deloitte and American Marketing Association. (71 pp., PDF, free, no opt-in).

22% of marketers are currently using AI-based applications, with an additional 57% planning to adopt them in the next two years

There are nine dominant use cases marketers are concentrating on today, ranging from personalised channel experiences to programmatic advertising and media buying to predictive customer journeys and real-time next best offers. Source: Salesforce’s State of Marketing Study, 5th edition

Content personalisation and predictive analytics from customer insights are the two areas CMOs most prioritise AI spending today

The CMO study found that B2B service companies are the top users of AI for content personalisation (62.2%), while B2B product companies use AI for augmented and virtual reality, facial recognition, and visual search more than any other business type. Source: CMOs’ Top Uses For AI: Personalisation and Predictive Analytics. Marketing Charts. March 14, 2019

45% of retailers are either planning to or have already implemented AI to improve multichannel customer engagement as a core part of their marketing mix

Reflecting how dependent retailers are on supply chains, 37% of retailers are investing in AI today to improve supply chain logistics, supply chain management, and forecasting. Source: AI and Machine Learning use cases in the retail industry worldwide as of 2019, Statista.

Personalising the overall customer journey and driving next-best offers in real-time are the two most common ways marketing leaders are using AI today, according to Salesforce

Improving customer segmentation, improving advertising and media buying, and personalising channel experiences are the next fastest-growing areas of AI adoption in marketing today. Source: Salesforce’s State of Marketing Study, 5th edition

82% of marketing leaders say improving customer experience is the leading factor in their decision to adopt AI

The timing and delivery of content, offers, and contextually relevant experiences are second (67%), and improving performance metrics is third at 57%. Source: Leading reasons to use artificial intelligence (AI) for marketing personalisation according to industry professionals worldwide in 2018, Statista.

81% of marketers are either planning to or are using AI in audience targeting this year

80% are currently using or planning to use AI for audience segmentation. EConsultancy’s study found marketers are enthusiastic about AI’s potential to increase marketing effectiveness and track progress. 88% of marketers interviewed say AI will enable them to be more effective in reaching their goals. Source: Dream vs. Reality: The State of Consumer First and Omnichannel Marketing. EConsultancy (36 pp., PDF, free, no opt-in).

Over 41% of marketers say AI is enabling them to generate higher revenues from email marketing

They also see an over 13% improvement in click-through rates and 7.64% improvement in open rates. Source: 4 Positive Effects of AI Use in Email Marketing, Statista (infographic), March 1, 2019.

Marketers and agencies are most comfortable with AI-enabled bid optimisation for media buying, followed by fraud mitigation

Marketers and their agencies differ on ad inventory selection and optimisation, with marketing teams often opting to use their analytics and reporting instead of relying on agency AI methods. Source: Share of marketing and agency professionals who are comfortable with AI-enabled technology automated handling of their campaigns in the United States as of June 2018, Statista.

Additional data sources on AI’s use in marketing:


How CRM remains the fastest growing enterprise software market – and how Salesforce still dominates

  • Salesforce dominated the worldwide CRM market with a 19.5% market share in 2018, over double its nearest rival, SAP, at 8.3% share
  • Worldwide spending on customer experience and relationship management (CRM) software grew 15.6% to reach $48.2B in 2018
  • 72.9% of CRM spending was on software as a service (SaaS) in 2018, which is expected to grow to 75% of total CRM software spending in 2019
  • Worldwide enterprise application software revenue totalled more than $193.6B in 2018, a 12.5% increase from 2017 revenue of $172.1B. CRM made up nearly 25% of the entire enterprise software revenue market

CRM remains the largest and fastest-growing enterprise software category today, according to the latest market sizing and market share research Gartner published this week. Gartner defines CRM as providing the functionality to companies across the four segments of customer service and support, digital commerce, marketing, and sales.

All four subsegments of the CRM market grew by more than 13.7%, with marketing emerging as the fastest-growing segment, increasing by 18.8% and representing more than 25% of the entire CRM market. Customer service and support retains its No. 1 position, contributing 35.7% of CRM market revenue and attaining $17.1B in revenue in 2018.

Key insights include the following:

With 19.5% market share, Salesforce has over 2X the CRM sales of SAP and over 3X those of Oracle

Salesforce continues to dominate CRM globally, increasing its market share from 18.3% in 2017 to 19.5% in 2018. Adobe is the only other vendor to grow its market share in 2018. Microsoft and SAP successfully held onto market share while Oracle lost share.

Adobe and Salesforce grew faster than the overall market, increasing CRM revenues 21.7% and 23.2% respectively

Adobe’s CRM sales jumped from $2B in 2017 to $2.4B in 2018. Salesforce CRM revenues increased from $7.6B in 2017 to $9.4B in 2018, growing the fastest of all competitors in this market. SAP grew 15.5% between 2017 and 2018, just below the overall market growth of 15.6%. Microsoft (15%) and Oracle (7.1%) grew slower than the market. The following graphic compares growth rates between 2017 and 2018.

Adobe dominates the marketing subsegment of CRM with 19% market share in 2018

Salesforce has 11.7% of the marketing subsegment, followed by IBM (5.7%), SAP (4%), Oracle (3.6%) and HubSpot (3.4%). Gartner estimates the marketing subsegment was a $12.2B market in 2018, increasing from $10.3B in 2017, achieving 18.8% growth in just a year.
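Those subsegment figures are easy to sanity-check against the $48.2B worldwide CRM total cited at the top of this piece:

```python
# Sanity-checking the marketing subsegment's share of the CRM market
crm_total_2018 = 48.2   # $B, worldwide CRM spending in 2018 (Gartner)
marketing_2018 = 12.2   # $B, marketing subsegment in 2018

share_pct = round(marketing_2018 / crm_total_2018 * 100, 1)
print(share_pct)  # → 25.3, consistent with "more than 25% of the entire CRM market"
```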

Eastern and Western Europe were the fastest growing regions at 19.7% and 17.5% respectively

North America and Western Europe were the largest two regions with North America growing at 15.2% to reach $28.1B in revenue.

Sources:

Gartner Says Worldwide Customer Experience and Relationship Management Software Market Grew 15.6% in 2018

Market Share: Customer Experience and Relationship Management, Worldwide, 2018 (client access required)


The top 10 cybersecurity companies to watch in 2019 – and the key trends to explore

Today’s threatscape has made “trust but verify” obsolete 

The threatscape every business operates in today is proving the old model of “trust but verify” obsolete and in need of a complete overhaul. To compete and grow in the increasingly complex and lethal threatscape of today, businesses need more adaptive, contextually intelligent security solutions based on the Zero Trust Security framework.

Zero Trust takes a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network. John Kindervag was the first to see how urgent the need was for enterprises to change their approach to cybersecurity, so he created the Zero Trust Security framework in 2010 while at Forrester. Chase Cunningham, Principal Analyst at Forrester, is a mentor to many worldwide wanting to expand their knowledge of Zero Trust and frequently speaks and writes on the topic. If you are interested in cybersecurity in general and Zero Trust specifically, be sure to follow his blog.

AI and machine learning applied to cybersecurity’s most significant challenges are creating a proliferation of commercially successful, innovative platforms. The size and scale of deals in cybersecurity continue to accelerate, with BlackBerry’s acquisition of Cylance for $1.4B in cash, which closed in February of this year, being the largest. TD Ameritrade’s annual survey of registered investment advisors (RIAs) showed nearly a 6X jump in cybersecurity investments this year compared to 2018.

The top ten cybersecurity companies reflect the speed and scale of innovation happening today that is driving the highest levels of investment this industry has ever seen. The following are the top ten cybersecurity companies to watch in 2019:

Absolute (ABT.TO) 

One of the world’s leading commercial enterprise security solutions, serving as the industry benchmark for endpoint resilience, visibility, and control. The company enables more than 12,000 customers with self-healing endpoint security, always-connected visibility into their devices, data, users, and applications whether endpoints are on or off the network, and the ultimate level of control and confidence required for the modern enterprise. Embedded in over one billion endpoint devices, Absolute delivers intelligence and real-time remediation capabilities that equip enterprises to stop data breaches at the source.

To thwart attackers, organisations continue to layer on security controls; Gartner estimates that more than $124B will be spent on security in 2019 alone. Absolute’s 2019 Endpoint Security Trends Report finds that much of that spend is in vain, however, revealing that 70% of all breaches still originate on the endpoint. The problem is complexity at the endpoint: it causes security agents to fail invariably, reliably, and predictably.

Absolute’s research found that 42% of all endpoints are unprotected at any given time, and 100% of endpoint security tools eventually fail. As a result, IT leaders see a negative ROI on their security spend. What makes Absolute one of the top 10 security companies to watch in 2019 is their purpose-driven design to mitigate this universal law of security decay.

Enterprises rely on Absolute to cut through the complexity to identify failures, model control options, and refocus security intent. Rather than perpetuating organisations’ false sense of security, Absolute enables uncompromised endpoint persistence, builds resilience and delivers the intelligence needed to ensure security agents, applications, and controls continue functioning and deliver value as intended. Absolute has proven very effective in validating safeguards, fortifying endpoints, and stopping data security compliance failures. The following is an example of the Absolute platform at work:

BlackBerry Artificial Intelligence and Predictive Security

BlackBerry is noteworthy for how quickly it is reinventing itself into an enterprise-ready cybersecurity company independent of the Cylance acquisition. Paying $1.4 billion in cash for Cylance brings much-needed AI and machine learning expertise to their platform portfolio, an acquisition that BlackBerry is moving quickly to integrate into their product and service strategies.

BlackBerry Cylance uses AI and machine learning to protect the entire attack surface of an enterprise with automated threat prevention, detection, and response capabilities. Cylance was also the first company to apply artificial intelligence, algorithmic science, and machine learning to cybersecurity, improving the way companies, governments, and end users proactively solve the world’s most challenging security problems.

Using a breakthrough mathematical process, BlackBerry Cylance quickly and accurately identifies what is safe and what is a threat, not just what is in a blacklist or whitelist. By coupling sophisticated math and machine learning with a unique understanding of a hacker’s mentality, BlackBerry Cylance provides the technology and services to be truly predictive and preventive against advanced threats.
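The distinction between blacklist matching and model-based classification can be sketched in a few lines of Python. This is purely illustrative and not BlackBerry Cylance’s actual technology; the file features, weights, and threshold below are invented for demonstration:

```python
import math

# A blacklist only catches samples it has already seen (hashes are made up).
KNOWN_BAD_HASHES = {"9f86d081", "e3b0c442"}

def blacklist_verdict(file_hash):
    return "threat" if file_hash in KNOWN_BAD_HASHES else "safe"

def entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; packed or encrypted payloads score near 8."""
    if not data:
        return 0.0
    counts = {b: data.count(b) for b in set(data)}
    return -sum(c / len(data) * math.log2(c / len(data)) for c in counts.values())

def model_verdict(data: bytes, threshold=0.5):
    # Stand-in for a trained classifier: score a file by its features,
    # so never-before-seen samples can still be flagged.
    score = 0.0
    if entropy(data) > 7.0:              # high entropy suggests packing/encryption
        score += 0.4
    if data[:2] == b"MZ":                # Windows executable header
        score += 0.2
    if b"CreateRemoteThread" in data:    # API name associated with code injection
        score += 0.3
    return "threat" if score >= threshold else "safe"
```

A blacklist can only flag hashes it has already seen, while the feature-based scorer can flag a never-before-seen sample whose characteristics look malicious, which is the essence of the predictive approach described above.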

The following screen from CylancePROTECT provides an executive summary of CylancePROTECT usage, from the number of zones and devices to the percentage of devices covered by auto-quarantine and memory protection, threat events, memory violations, agent versions, and offline days for devices.

Centrify

Centrify is redefining the legacy approach to Privileged Access Management by delivering cloud-ready Zero Trust Privilege to secure modern enterprise attack surfaces. Centrify Zero Trust Privilege helps customers grant least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment.

Industry research firm Gartner predicted Privileged Access Management (PAM) to be the second-fastest growing segment for information security and risk management spending worldwide in 2019 in their recent Forecast Analysis: Information Security and Risk Management, Worldwide, 3Q18 Update (client access required). By implementing least privilege access, Centrify minimises the attack surface, improves audit and compliance visibility, and reduces risk, complexity, and costs for the modern, hybrid enterprise.

Over half of the Fortune 100, along with the world’s largest financial institutions, intelligence agencies, and critical infrastructure companies, trust Centrify to stop the leading cause of breaches – privileged credential abuse. PAM was also named a Top 10 security project for 2019 in Gartner’s Top 10 Security Projects for 2019 (client access required).

Cloudflare

Cloudflare is a web performance and security company that provides online services to protect and accelerate websites. Its online platform includes:

  • Cloudflare CDN, which distributes content around the world to speed up websites
  • Cloudflare Optimizer, which enables web pages with ad servers and third-party widgets to download Snappy software on mobiles and computers
  • Cloudflare Security, which protects websites from a range of online threats including spam, SQL injection, and DDoS
  • Cloudflare Analytics, which gives insight into a website’s traffic, including threats and search engine crawlers
  • Keyless SSL, which allows organisations to keep secure sockets layer (SSL) keys private
  • Cloudflare applications, which help users install web applications on their websites

CrowdStrike

Applying machine learning to endpoint detection of IT network threats is how CrowdStrike is differentiating itself in the rapidly growing cybersecurity market today. It’s also one of the top 25 machine learning startups to watch in 2019.

CrowdStrike is credited with uncovering Russian hackers inside the servers of the US Democratic National Committee. The company’s IPO was last Tuesday night, with an initial price of $34 per share. The IPO raised $610M, at a valuation that at one point reached nearly $7B. Their Falcon platform stops breaches by detecting all attack types, even malware-free intrusions, providing five-second visibility across all current and past endpoint activity while reducing cost and complexity for customers.

CrowdStrike’s Threat Graph provides real-time analysis of data from endpoint events across the global crowdsourcing community, allowing detection and prevention of attacks based on patented behavioral pattern recognition technology.

Hunters.AI

Hunters.AI excels at autonomous threat hunting, capitalising on a system that connects to multiple channels within an organisation and detects the signs of potential cyber-attacks. They are one of the top 25 machine learning startups to watch in 2019.

What makes this startup one of the top ten cybersecurity companies to watch in 2019 is its innovative approach to creating AI- and machine learning-based algorithms that continually learn from an enterprise’s existing security data. Hunters.AI generates and delivers visualised attack stories allowing organisations to more quickly and effectively identify, understand, and respond to attacks.

Early customers include Snowflake Computing, whose VP of security recently said, “Hunters.AI identified the attack in minutes. In my 20 years in security, I have not seen anything as effective, fast, and with high fidelity as what Hunters can do.”  The following is a graphic overview of how the system works:

Idaptive

Idaptive is noteworthy for the Zero Trust approach it is taking to protecting organisations across every threat surface they rely on to operate their businesses daily. Idaptive secures access to applications and endpoints by verifying every user, validating their devices, and intelligently limiting their access. Their product and services strategy reflects a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network.

The Idaptive Next-Gen Access platform combines single sign-on (SSO), adaptive multifactor authentication (MFA), enterprise mobility management (EMM) and user behaviour analytics (UBA). They have over 2,000 organisations using their platform today. Idaptive was spun out from Centrify on January 1 this year.

Kount

Kount has successfully differentiated itself in an increasingly crowded cybersecurity marketplace by providing fraud management, identity verification and online authentication technologies that enable digital businesses, online merchants and payment service providers to identify and thwart a wide spectrum of threats in real-time. Kount has been able to show through customer references that their customers can approve more orders, uncover new revenue streams, and dramatically improve their bottom line all while minimising fraud management cost and losses.

Through Kount’s global network and proprietary technologies in AI and machine learning, combined with policy and rules management, its customers thwart online criminals and bad actors driving them away from their site, their marketplace and off their network. Kount’s continuously adaptive platform learns of new threats and continuously updates risk scores to further thwart breach and fraud attempts.

Kount’s advances span both proprietary techniques and patented technology, including:

  • Superior mobile fraud detection
  • Advanced artificial intelligence
  • Multi-layer device fingerprinting
  • IP proxy detection and geo-location
  • Transaction and custom scoring
  • Global order linking
  • Business intelligence reporting
  • Comprehensive order management
  • Professional and managed services

Kount protects over 6,500 brands today.

MobileIron

The acknowledged leader in mobile device management (MDM) software, MobileIron’s latest series of developments makes it noteworthy and one of the top 10 cybersecurity companies to watch in 2019.

MobileIron was the first to deliver key innovations such as multi-OS mobile device management, mobile application management (MAM), and BYOD privacy controls. Last month MobileIron introduced zero sign-on (ZSO), built on the company’s unified endpoint management (UEM) platform and powered by the MobileIron Access solution. “By making mobile devices your identity, we create a world free from the constant pains of password recovery and the threat of data breaches due to easily compromised credentials,” wrote Simon Biddiscombe, MobileIron’s president and chief executive officer in his recent blog post, Single sign-on is still one sign-on too many.

Biddiscombe’s latest post, MobileIron: We’re making history by making passwords history, provides the company’s vision going forward with ZSO. Zero sign-on eliminates passwords as the primary method for user authentication, unlike single sign-on, which still requires at least one username and password. MobileIron paved the way for a zero sign-on enterprise with its Access product in 2017, which enabled zero sign-on to cloud services on managed devices. Enterprise security teams no longer have to trade off security for better user experience, thanks to the MobileIron Zero Sign-On.

Sumo Logic

Sumo Logic is a fascinating cybersecurity company to track because it shows the ability to take on large-scale enterprise security challenges and turn them into a competitive advantage. An example of this is how quickly the company achieved FedRAMP Ready Designation, getting listed in the FedRAMP Marketplace.

Sumo Logic is a secure, cloud-native, machine data analytics service, delivering real-time, continuous intelligence from structured, semi-structured, and unstructured data across the entire application lifecycle and stack. More than 2,000 customers around the globe rely on Sumo Logic for the analytics and insights to build, run, and secure their modern applications and cloud infrastructures. With Sumo Logic, customers gain a multi-tenant, service-model advantage to accelerate their shift to continuous innovation, increasing competitive advantage, business value, and growth.

Founded in 2010, Sumo Logic is a privately held company based in Redwood City, Calif. and is backed by Accel Partners, Battery Ventures, DFJ, Franklin Templeton, Greylock Partners, IVP, Sapphire Ventures, Sequoia Capital, Sutter Hill Ventures and Tiger Global Management.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

What matters most in business intelligence 2019: Key enterprise use cases

  • Improving revenues using BI is now the most popular objective enterprises are pursuing in 2019
  • Reporting, dashboards, data integration, advanced visualisation, and end-user self-service are the most strategic BI initiatives underway in enterprises today
  • Operations, executive management, finance, and sales are primarily driving business intelligence (BI) adoption throughout enterprises today
  • Tech companies’ operations and sales teams are the most effective at driving BI adoption across industries surveyed, with advertising driving BI adoption across marketing

These and many other fascinating insights are from Dresner Advisory Associates’ 10th edition of its popular Wisdom of Crowds® Business Intelligence Market Study. The study is noteworthy in that it provides insights into how enterprises are expanding their adoption of Business Intelligence (BI) from centralised strategies to tactical ones that seek to improve daily operations. The Dresner research team’s broad assessment of the BI market makes this report unique, including its use of visualisations that provide a strategic view of market trends. The study is based on interviews with respondents from the firm’s research community of over 5,000 organisations, as well as vendors’ customers and qualified crowdsourced respondents recruited over social media. Please see pages 13 – 16 for the methodology.

Key insights from the study include the following:

Operations, executive management, finance, and sales are primarily driving business intelligence (BI) adoption throughout their enterprises today

More than half of the enterprises surveyed see these four departments as the primary initiators or drivers of BI initiatives. Over the last seven years, Operations departments have most increased their influence over BI adoption, more than any other department included in the current and previous survey. Marketing and Strategic Planning are also the most likely to be sponsoring BI pilots and looking for new ways to introduce BI applications and platforms into use daily.

Tech companies’ operations and sales teams are the most effective at driving BI adoption across industries surveyed, with advertising driving BI adoption across marketing

Retail/Wholesale and Tech companies’ sales leadership is primarily driving BI adoption in their respective industries. It’s not surprising to see that the leading influencer among Healthcare respondents is resource-intensive HR. The study found that Executive Management is most likely to drive business intelligence in consulting practices.

Reporting, dashboards, data integration, advanced visualisation, and end-user self-service are the most strategic BI initiatives underway in enterprises today

Second-tier initiatives include data discovery, data warehousing, data mining/advanced algorithms, and data storytelling. Comparing the last four years of survey data, Dresner’s research team found reporting retains all-time high scores as the top priority, and data storytelling, governance, and data catalog hold momentum. Please click on the graphic to expand for easier reading.

BI software providers most commonly rely on executive-level personas to design their applications and add new features

Dresner’s research team found all vertical industries except Business Services target business executives first in their product design and messaging. Given the customer-centric nature of advertising and consulting services business models, it is understandable why the primary focus BI vendors rely on in selling to them is customer personas. The following graphic compares targeted users for BI by industry.

Improving revenues using BI is now the most popular objective in 2019, despite BI initially being positioned as a solution for compliance and risk management

Executive Management, Marketing/Sales, and Operations are driving the focus on improving revenues this year. Nearly 50% of enterprises now expect BI to deliver better decision making, making reporting and dashboards must-have features. Interestingly, enterprises aren’t looking to BI as much for improving operational efficiencies and cost reductions or competitive advantages.

Over the last 12 to 18 months, more tech manufacturing companies have initiated new business models that require their operations teams to support a shift from products to services revenues. An example of this shift is the introduction of smart, connected products that provide real-time data that serves as the foundation for future services strategies. Please click on the graphic to expand for easier reading.

In aggregate, BI is achieving its highest levels of adoption in R&D, executive management, and operations departments today

The growing complexity of products and business models in tech companies, and the increasing reliance on analytics and BI in retail/wholesale to streamline supply chains and improve buying experiences, are contributing factors to the increasing levels of BI adoption in these three departments. The following graphic compares BI’s level of adoption by function today.

Enterprises with the largest BI budgets this year are investing more heavily into dashboards, reporting, and data integration

Conversely, those with smaller budgets are placing a higher priority on open source-based big data projects, end-user data preparation, collaborative support for group-based decision-making, and enterprise planning. The following graphic provides insights into technologies and initiatives strategic to BI at an enterprise level by budget plans.

Marketing/sales and operations are using the greatest variety of BI tools today

The survey shows how conversant Operations professionals are with the BI tools in use throughout their departments; they are the most likely to know how many, and which types of, BI tools are deployed. Across all industries, Research & Development (R&D), Business Intelligence Competency Center (BICC), and IT respondents are most likely to report they have multiple tools in use.

How to get your data scientist career up and running: A guide

Note: The most common request from this blog’s readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I’ve invited Alyssa Columbus, a data scientist at Pacific Life, to share her insights and lessons learned on breaking into the field of data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first, isn’t easy given the surplus of analytics job-seekers relative to analytics jobs.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I’ve attained my current data science position at Pacific Life. I’ve referred them to many different resources, including discussions I’ve had on the Dataquest.io blog and the Scatter Podcast. In the interest of providing job seekers with a comprehensive view of what I’ve learned that works, I’ve put together the five most valuable lessons learned. I’ve written this article to make your data science job hunt easier and as efficient as possible.

Continuously build your statistical literacy and programming skills

Currently, there are 24,697 open data scientist positions on LinkedIn in the United States alone. The following list of the top 10 data science skills was created by using data mining techniques to analyse all of those open positions.

As of April 14, the top 3 most common skills requested in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and TensorFlow. The following graphic provides a prioritised list of the most in-demand data science skills mentioned in LinkedIn job postings today. Please click on the graphic to expand for easier viewing.
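A first pass at the kind of job-posting analysis described above takes only a few lines of Python. The postings below are invented samples standing in for the thousands of real descriptions you would collect via an API or scraper:

```python
from collections import Counter

SKILLS = ["python", "r", "sql", "jupyter", "aws", "tensorflow"]

def rank_skills(postings, skills=SKILLS):
    """Count how many postings mention each skill and rank skills by frequency."""
    counts = Counter()
    for text in postings:
        # Tokenise crudely; a real pipeline would normalise punctuation properly.
        words = set(text.lower().replace(",", " ").split())
        for skill in skills:
            if skill in words:
                counts[skill] += 1
    return counts.most_common()

# Invented sample postings standing in for scraped job descriptions.
postings = [
    "Requires Python, SQL and AWS experience",
    "Statistical modelling in R and Python",
    "Deep learning with TensorFlow and Python",
]
```

On these three samples, `rank_skills(postings)` puts Python first with three mentions, illustrating the counting behind a ranking like the one above.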

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn’s job postings prioritise. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and not applying what you’ve learned to real problems. Your applied experience is just as important as your academic experience, and taking statistics and computer science classes helps to translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for what the big questions to ask of your dataset are. Statistical literacy, or “how” to find the answers to your questions, comes with education and practice. Strengthening your intellectual curiosity or insight into asking the right questions comes through experience.

Continually be creating your own, unique portfolio of analytics and machine learning projects

Having a good portfolio is essential to be hired as a data scientist, especially if you don’t come from a quantitative background or have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist with both the passion and skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you’re most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you’ve completed on your own. Some skills I’d recommend you highlight in your portfolio include:

  • Your programming language of choice (e.g., Python, R, Julia, etc.).
  • The ability to interact with databases (e.g., your ability to use SQL).
  • Visualisation of data (static or interactive).
  • Storytelling with data. This is a critical skill. In essence, can someone with no background in whatever area your project is in look at your project and gain some new understandings from it?
  • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard).
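As a quick illustration of the last bullet, a trained model can sit behind a small REST endpoint using only Python’s standard library. Everything here is a hypothetical sketch: `predict` is a hand-written scorer standing in for whatever model you actually trained, and the field names `x1`/`x2` are invented:

```python
import json

# Hypothetical stand-in for a trained model: a hand-written linear scorer.
# In a real portfolio project this would load a serialised model instead.
def predict(features):
    return 0.5 * features["x1"] + 0.25 * features["x2"]

def app(environ, start_response):
    """A minimal WSGI app that reads a JSON body and returns a JSON prediction."""
    size = int(environ.get("CONTENT_LENGTH") or 0)
    payload = json.loads(environ["wsgi.input"].read(size) or b"{}")
    body = json.dumps({"prediction": predict(payload)}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]

# To serve locally:
#   from wsgiref.simple_server import make_server
#   make_server("", 8000, app).serve_forever()
```

POSTing `{"x1": 2, "x2": 4}` to the running server returns `{"prediction": 2.0}`. Even a small end-to-end piece like this demonstrates the deployment skill the bullet describes.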

Julia Silge and Amber Thomas both have excellent examples of portfolios that you can be inspired by. Julia’s portfolio is shown below.

Get (or git!) yourself a website

If you want to stand out, along with a portfolio, create and continually build a strong online presence in the form of a website.  Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time, and best of all it’s free to do. A strong online presence will not only help you in applying for jobs, but organisations may also reach out to you with freelance projects, interviews, and other opportunities.

Be confident in your skills and apply for any job you’re interested in, starting with opportunities available in your network

If you don’t meet all of a job’s requirements, apply anyway. You don’t have to know every skill (e.g., programming languages) on a job description, especially if there are more than ten listed. If you’re a great fit for the main requirements of the job’s description, you need to apply. A good general rule is that if you have at least half of the skills requested on a job posting, go for it. When you’re hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I’ve found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specialising in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:

  • Recruiters
  • Friends, family, and colleagues
  • Career fairs and recruiting events
  • General job boards
  • Company websites
  • Tech job boards

Bring the same level of intensity to improving your communication skills as you do to your quantitative skills – as data scientists need to also excel at storytelling

One of the most important skills for data scientists to have is the ability to communicate results to different audiences and stakeholders so others can understand and act on their insights. Since data projects are collaborative across many teams and results are often incorporated into larger projects, the true impact of a data scientist’s work depends on how well others can understand their insights to take further action and make informed decisions.

Alyssa Columbus is a Data Scientist at Pacific Life and a member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher at the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.

How to improve privileged users’ security experiences with machine learning

Bottom line: One of the primary factors motivating employees to sacrifice security for speed is the many frustrations they face when attempting to re-authenticate who they are so they can get more work done and achieve greater productivity.

How bad security experiences lead to a breach

Every business is facing the paradox of hardening security without sacrificing users’ login and system access experiences. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment across every threat surface an organisation has.

Centrify’s recent survey Privileged Access Management In The Modern Threatscape found that 74% of data breaches start with privileged credential abuse. Forrester estimates that 80% of data breaches have a connection to compromised privileged credentials, such as passwords, tokens, keys, and certificates. On the Dark Web, privileged access credentials are a best-seller because they provide the intruder with “the keys to the kingdom.” By leveraging a “trusted” identity, a hacker can operate undetected and exfiltrate sensitive data sets without raising any red flags.

Frustrated with wasting time responding to the many account lock-outs, re-authentication procedures, and login errors outmoded Privileged Access Management (PAM) systems require, IT Help Desk teams, IT administrators, and admin users freely share privileged credentials, often resulting in them eventually being offered for sale on the Dark Web.

The keys to the kingdom are in high demand

18% of healthcare employees are willing to sell confidential data to unauthorised parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey. State-sponsored and organised crime organisations offer to pay bounties in bitcoin for privileged credentials for many of the world’s largest financial institutions on the Dark Web. And with the typical U.S.-based enterprise losing on average $7.91M from a breach, more than double the global average of $3.86M according to IBM’s 2018 Data Breach Study, it’s clear that improving admin user experiences to reduce the incidence of privileged credential sharing needs to happen now.

How machine learning improves admin user experiences and thwarts breaches

Machine learning is making every aspect of security experiences more adaptive, taking into account the risk context of every privileged access attempt across any threat surface, anytime. Machine learning algorithms can continuously learn and generate contextual intelligence that is used to streamline verified privileged users’ access while thwarting many potential threats ― the most common of which is compromised credentials.
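Conceptually, the risk scoring described above reduces to comparing each access attempt against a profile of what is normal for that user. A minimal sketch, with invented feature names and hand-set weights where a production system would use models learned from behavioral data:

```python
def risk_score(attempt, profile):
    """Score an access attempt in [0, 1] against a user's profile.
    Higher means riskier; weights here are illustrative, not learned."""
    score = 0.0
    if attempt["device_id"] not in profile["known_devices"]:
        score += 0.35                                    # unrecognised device
    if attempt["country"] != profile["usual_country"]:
        score += 0.30                                    # unusual geography
    if not (profile["work_hours"][0] <= attempt["hour"] <= profile["work_hours"][1]):
        score += 0.20                                    # outside normal hours
    score += min(attempt["failed_logins"], 3) * 0.05     # recent failed attempts
    return min(score, 1.0)

# A hypothetical learned profile for one admin user.
profile = {"known_devices": {"laptop-1"}, "usual_country": "US", "work_hours": (8, 18)}
```

A familiar attempt scores near zero and passes through with no friction, while an unfamiliar device in an unusual country at 3 a.m. scores high, which is the millisecond-scale decision the paragraph above describes.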

The following are a few of the many ways machine learning is improving privileged users’ experiences when they need to log in to secure critical infrastructure resources:

  • Machine learning is making it possible to provide adaptive, personalised login experiences at scale using risk-scoring of every access attempt in real-time, all contributing to improved user experiences: Machine learning is making it possible to implement security strategies that flex or adapt to risk contexts in real-time, assessing every access attempt across every threat surface, and generating a risk score in milliseconds.

    Being able to respond in milliseconds, or in real time, is essential for delivering excellent admin user experiences. The “never trust, always verify, enforce least privilege” approach to security is how many enterprises from a broad base of industries including leading financial services and insurance companies are protecting every threat surface from privileged access abuse.

    CIOs at these companies say taking a Zero Trust approach with a strong focus on Zero Trust Privilege corporate-wide is redefining the legacy approach to Privileged Access Management by delivering cloud-architected Zero Trust Privilege to secure access to infrastructure, DevOps, cloud, containers, big data, and other modern enterprise use cases. Taking a Zero Trust approach to security enables their departments to roll out new services across every threat surface their customers prefer to use without having to customise security strategies for each.
     

  • Quantify, track and analyse every potential security threat and attempted breach and apply threat analytics to the aggregated data sets in real-time, thwarting data exfiltration attempts before they begin. One of the tenets or cornerstones of Zero Trust Privilege is adaptive control. Machine learning algorithms continually “learn” by continuously analysing and looking for anomalies in users’ behavior across every threat surface, device, and login attempt.

    When any users’ behavior appears to be outside the threshold of constraints defined for threat analytics and risk scoring, additional authentication is immediately requested, and access denied to requested resources until an identity can be verified. Machine learning makes adaptive preventative controls possible.
     

  • When every identity is a new security perimeter, machine learning’s ability to provide personalisation at scale for every access attempt on every threat surface is essential for enabling a company to keep growing. Businesses that are growing the fastest often face the greatest challenges when it comes to improving their privileged users’ experiences.

    Getting new employees productive quickly needs to be based on four foundational elements. These include verifying the identity of every admin user, knowing the context of their access request, ensuring it’s coming from a clean source, and limiting access as well as privilege. Taken together, these pillars form the foundation of a Zero Trust Privilege.
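The adaptive control the bullets describe can be reduced to a small policy function that maps a risk score to an action. The thresholds here are arbitrary placeholders, not recommendations:

```python
def access_decision(risk_score, mfa_passed=False):
    """Map a risk score to an adaptive control: allow, step up, or deny.
    Thresholds are illustrative; real systems tune them per resource."""
    if risk_score < 0.3:
        return "allow"                    # low risk: frictionless access
    if risk_score < 0.7:
        # medium risk: request additional authentication before granting access
        return "allow" if mfa_passed else "step_up_mfa"
    return "deny"                         # high risk: block and alert
```

This captures the behavior described above: most verified users sail through, additional authentication is requested only when behavior falls outside the expected thresholds, and clearly anomalous attempts are denied outright.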

Conclusion

Organisations don't have to sacrifice security for speed when they rely on machine learning-based approaches to improving the privileged user experience. Today, many IT Help Desk teams, IT administrators, and admin users freely share privileged credentials to be more productive, which often leads to breaches based on privileged access abuse. By taking a machine learning-based approach to validating every access request, the context of the request, and the risk of the access environment, roadblocks to greater privileged user productivity disappear and privileged credential abuse is greatly minimised.
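The four foundational checks — verify the identity, know the context, ensure a clean source, and limit access and privilege — amount to a single gate that every privileged request must pass. The sketch below is illustrative only; the field names and checks are assumptions, not a real PAM product's API.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    mfa_passed: bool      # 1. identity verified, e.g. MFA succeeded
    usual_context: bool   # 2. request matches the user's normal behaviour
    device_healthy: bool  # 3. source device passes posture/health checks
    scope: set            # privileges this request asks for
    role_scope: set       # privileges the user's role actually needs

def grant_privileged_access(r: AccessRequest) -> bool:
    """All four pillars must hold; any single failure denies access."""
    least_privilege = r.scope <= r.role_scope  # 4. no excess privilege granted
    return (r.mfa_passed and r.usual_context
            and r.device_healthy and least_privilege)
```

Because the pillars are conjunctive, a request with valid credentials but an unhealthy device, or one asking for more privilege than the role requires, is refused rather than trusted by default.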

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

How to improve supply chains with machine learning: 10 proven ways

Bottom line: Enterprises are attaining double-digit improvements in forecast error rates, demand planning productivity, cost reductions and on-time shipments using machine learning today, revolutionising supply chain management in the process.

Machine learning algorithms and the models they’re based on excel at finding anomalies, patterns and predictive insights in large data sets. Many supply chain challenges are time, cost and resource constraint-based, making machine learning an ideal technology to solve them.

From Amazon's Kiva robotics relying on machine learning to improve accuracy, speed and scale, to DHL relying on AI and machine learning to power their Predictive Network Management system that analyses 58 different parameters of internal data to identify the top factors influencing shipment delays, machine learning is defining the next generation of supply chain management. Gartner predicts that by 2020, 95% of Supply Chain Planning (SCP) vendors will be relying on supervised and unsupervised machine learning in their solutions. Gartner also predicts that by 2023, intelligent algorithms and AI techniques will be an embedded or augmented component across 25% of all supply chain technology solutions.
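A delay-prediction model of the kind DHL describes can be thought of as a weighted combination of risk factors squashed into a probability. The sketch below is purely illustrative: DHL's 58 parameters are not public, so the feature names, weights, and bias here are all assumptions demonstrating the logistic-model shape, not the real system.

```python
import math

# Illustrative subset of shipment features with hand-set weights; a real model
# learns these from historical on-time-delivery data.
FEATURE_WEIGHTS = {
    "customs_backlog_hours": 0.08,
    "port_congestion_index": 0.50,
    "carrier_late_rate":     2.00,  # carrier's historical miss rate, 0-1
    "weather_severity":      0.30,
}
BIAS = -3.0  # assumed base log-odds of a delay

def delay_probability(shipment: dict) -> float:
    """Logistic model: weighted feature sum mapped to a 0-1 delay probability."""
    z = BIAS + sum(w * shipment.get(f, 0.0) for f, w in FEATURE_WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))
```

Scoring every in-flight shipment this way lets planners rank which ones to expedite or re-route before a delay actually materialises.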

The ten ways that machine learning is revolutionising supply chain management include:

Machine learning-based algorithms are the foundation of the next generation of logistics technologies, with the most significant gains being made with advanced resource scheduling systems

Machine learning and AI-based techniques are the foundation of a broad spectrum of next-generation logistics and supply chain technologies now under development. The most significant gains are being made where machine learning can contribute to solving the complex constraint, cost and delivery problems companies face today. McKinsey predicts machine learning's most significant contributions will be in giving supply chain operators greater insight into how performance can be improved and in anticipating anomalies in logistics costs and performance before they occur. Machine learning is also providing insights into where automation can deliver the greatest scale advantages. Source: McKinsey & Company, Automation in logistics: Big opportunity, bigger uncertainty, April 2019. By Ashutosh Dekhne, Greg Hastings, John Murnane, and Florian Neuhaus

The wide variation in data sets generated from Internet of Things (IoT) sensors, telematics, intelligent transport systems, and traffic data has the potential to deliver the most value for improving supply chains through machine learning

Applying machine learning algorithms and techniques to improve supply chains starts with data sets that have the greatest variety and variability in them. The most challenging issues supply chains face are often found in optimising logistics, so materials needed to complete a production run arrive on time. Source: KPMG, Supply Chain Big Data Series Part 1

Machine learning shows the potential to reduce logistics costs by finding patterns in track-and-trace data captured using IoT-enabled sensors, contributing to $6M in annual savings

BCG recently looked at how a decentralised supply chain using track-and-trace applications could improve performance and reduce costs. They found that in a 30-node configuration, when blockchain is used to share data in real time across a supplier network, combined with better analytics insight, cost savings of $6M a year are achievable. Source: Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra

Reducing forecast errors up to 50% is achievable using machine learning-based techniques

Lost sales due to products not being available are being reduced by up to 65% through the use of machine learning-based planning and optimisation techniques. Inventory reductions of 20% to 50% are also being achieved today when machine learning-based supply chain management systems are used. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).
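Forecast error in demand planning is commonly measured with the mean absolute percentage error (MAPE), so a claim like "forecast errors cut by 50%" means halving this number. A minimal sketch, with illustrative demand figures rather than real data:

```python
def mape(actual: list[float], forecast: list[float]) -> float:
    """Mean absolute percentage error, the usual demand-planning error metric."""
    return sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual) * 100

# Hypothetical monthly demand vs. two forecasts for comparison.
actual   = [100, 200, 150]
baseline = [80, 240, 120]   # rule-of-thumb forecast: 20% MAPE
ml_model = [90, 220, 135]   # ML-based forecast: 10% MAPE, error halved
```

Tracking MAPE (or a cousin such as weighted MAPE) before and after introducing a machine learning forecast is how planning teams quantify the improvements cited here.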

DHL Research is finding that machine learning enables logistics and supply chain operations to optimise capacity utilisation, improve customer experience, reduce risk, and create new business models

DHL’s research team continually tracks and evaluates the impact of emerging technologies on logistics and supply chain performance. They’re also predicting that AI will enable back-office automation, predictive operations, intelligent logistics assets, and new customer experience models. Source: DHL Trend Research, Logistics Trend Radar, Version 2018/2019 (PDF, 55 pp., no opt-in)

Detecting and acting on inconsistent supplier quality levels and deliveries using machine learning-based applications is an area manufacturers are investing in today

Based on conversations with North American-based mid-tier manufacturers, the second most significant growth barrier they face today is suppliers' lack of consistent quality and delivery performance; the greatest growth barrier is the lack of skilled labor available. Using machine learning and advanced analytics, manufacturers can quickly discover who their best and worst suppliers are, and which production centers are most accurate in catching errors. Manufacturers are using dashboards much like the one below for applying machine learning to supplier quality, delivery and consistency challenges. Source: Microsoft, Supplier Quality Analysis sample for Power BI: Take a tour, 2018

Reducing risk and the potential for fraud while improving product and process quality, based on insights gained from machine learning, is driving an inflection point in inspections across supply chains today

When inspections are automated using mobile technologies and results are uploaded in real-time to a secure cloud-based platform, machine learning algorithms can deliver insights that immediately reduce risks and the potential for fraud. Inspectorio is a machine learning startup to watch in this area. They’re tackling the many problems that a lack of inspection and supply chain visibility creates, focusing on how they can solve them immediately for brands and retailers. The graphic below explains their platform. Source: Forbes, How Machine Learning Improves Manufacturing Inspections, Product Quality & Supply Chain Visibility, January 23, 2019

Machine learning is making rapid gains in end-to-end supply chain visibility possible, providing predictive and prescriptive insights that are helping companies react faster than before

Combining multi-enterprise commerce networks for global trade and supply chain management with AI and machine learning platforms are revolutionising supply chain end-to-end visibility.

One of the early leaders in this area is Infor's Control Center. Control Center combines data from the Infor GT Nexus Commerce Network, acquired by the company in September 2015, with Infor's Coleman Artificial Intelligence (AI) platform. Infor chose to name its AI platform after the inspiring physicist and mathematician Katherine Coleman Johnson, whose trail-blazing work helped NASA land on the moon. Be sure to pick up a copy of the book and see the movie Hidden Figures if you haven't already to appreciate her and many other brilliant women mathematicians' contributions to space exploration. ChainLink Research provides an overview of Control Center in their article, How Infor is Helping to Realise Human Potential, and two screens from Control Center are shown below.

Machine learning is proving to be foundational for thwarting privileged credential abuse which is the leading cause of security breaches across global supply chains

By taking a least privilege access approach, organisations can minimise attack surfaces, improve audit and compliance visibility, and reduce the risk, complexity, and cost of operating a modern, hybrid enterprise. CIOs are solving the paradox of privileged credential abuse in their supply chains by recognising that even when a privileged user enters the right credentials, a request that arrives with risky context calls for stronger verification before access is permitted.

Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment. Centrify is a leader in this area, with globally recognised suppliers including Cisco, Intel, Microsoft, and Salesforce as current customers. Source: Forbes, High-Tech’s Greatest Challenge Will Be Securing Supply Chains In 2019, November 28, 2018.

Capitalising on machine learning to predict preventative maintenance for freight and logistics machinery based on IoT data is improving asset utilisation and reducing operating costs

McKinsey found that predictive maintenance enhanced by machine learning allows for better prediction and avoidance of machine failure by combining data from advanced Internet of Things (IoT) sensors and maintenance logs as well as external sources. Asset productivity increases of up to 20% are possible, and overall maintenance costs may be reduced by up to 10%. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).
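At its simplest, predictive maintenance combines a sensor-derived signal with the maintenance log, flagging an asset when either source says to intervene. The sketch below is a hand-set illustration of that combination; the vibration limit and service interval are assumed values, whereas production systems learn these thresholds per asset class from labelled failure histories.

```python
from statistics import mean

# Assumed thresholds for illustration only; real models are trained on each
# asset class's failure history rather than hand-set like this.
VIBRATION_LIMIT_MM_S = 7.1    # e.g. an ISO 10816-style alarm boundary
HOURS_BETWEEN_SERVICE = 2000  # maximum interval from the maintenance log

def needs_maintenance(vibration_window_mm_s: list[float],
                      hours_since_service: int) -> bool:
    """Flag an asset when the IoT sensor trend or the service log says so."""
    sensor_alarm = mean(vibration_window_mm_s) > VIBRATION_LIMIT_MM_S
    overdue = hours_since_service > HOURS_BETWEEN_SERVICE
    return sensor_alarm or overdue
```

Scheduling work orders from this kind of flag, instead of a fixed calendar, is what lets operators lift asset utilisation while cutting unplanned downtime.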
