All posts by louiscolumbus

10 charts that will change your perspective of AI in marketing

  • Top-performing companies are more than twice as likely to be using AI for marketing (28% vs. 12%) according to Adobe’s latest Digital Intelligence Briefing.
  • Retailers are investing $5.9B this year in AI-based marketing and customer service solutions to improve shoppers’ buying experiences according to IDC.
  • Financial Services marketers lead all other industries in AI application adoption, with 37% currently using them today.
  • Sales and Marketing teams most often collaborate using Configure-Price-Quote (CPQ) and Marketing Automation AI-based applications, with sales leaders predicting AI adoption will increase 155% across sales teams in two years.

Artificial Intelligence enables marketers to understand sales cycles better, correlating their strategies and spending to sales results. AI-driven insights are also helping to break down data silos so marketing and sales can collaborate more on deals. Marketing is more analytics and quant-driven than ever before with the best CMOs knowing which metrics and KPIs to track and why they fluctuate.

The bottom line is that machine learning and AI are the technologies CMOs and their teams need to excel today. The best CMOs balance the quant-intensive nature of running marketing with the qualitative factors that make a company’s brand and customer experience unique. With greater insight into when, where, and how prospects decide to buy, CMOs are bringing a new level of intensity into driving outcomes. An example of this can be seen in the recent Forbes Insights and Quantcast research, Lessons of 21st-Century Brands Modern Brands & AI Report (17 pp., PDF, free, opt-in). The study found that AI enables marketers to increase sales (52%), improve customer retention (51%), and succeed at new product launches (49%). AI is making solid contributions to improving lead quality, persona development, segmentation, pricing, and service.

The following ten charts provide insights into how AI is transforming marketing:

21% of sales leaders rely on AI-based applications today, with the majority sharing these applications with marketing teams

Sales leaders predict that their use of AI will increase 155% in the next two years, and that AI will reach critical mass by 2020, when 54% expect to be using these technologies. Marketing and sales are relying on AI-based marketing automation, configure-price-quote (CPQ), and intelligent selling systems to increase revenue and profit growth significantly in the next two years. Source: Salesforce Research, State of Sales, 3rd edition. (58 pp., PDF, free, opt-in).

AI sees the most significant adoption by marketers working in $500m to $1bn companies, with conversational AI for customer service the most dominant

Businesses with between $500M and $1B in revenue lead all other revenue categories in the number and depth of AI adoption use cases. Just over 52% of small businesses with sales of $25M or less are using AI for predictive analytics for customer insights. It’s interesting to note that small companies are the leaders in AI spending, at 38.1%, to improve marketing ROI by optimising marketing content and timing. Source: The CMO Survey: Highlights and Insights Report, February 2019. Duke University, Deloitte and American Marketing Association. (71 pp., PDF, free, no opt-in).

22% of marketers are currently using AI-based applications, with an additional 57% planning to use them in the next two years

There are nine dominant use cases marketers are concentrating on today, ranging from personalised channel experiences to programmatic advertising and media buying to predictive customer journeys and real-time next best offers. Source: Salesforce’s State of Marketing Study, 5th edition

Content personalisation and predictive analytics from customer insights are the two areas CMOs most prioritise AI spending today

The CMO study found that B2B service companies are the top users of AI for content personalisation (62.2%), and B2B product companies use AI for augmented and virtual reality, facial recognition, and visual search more than any other business type. Source: CMOs’ Top Uses For AI: Personalisation and Predictive Analytics. Marketing Charts. March 14, 2019

45% of retailers are either planning to or have already implemented AI to improve multichannel customer engagement as a core part of their marketing mix

Reflecting how dependent retailers are on supply chains, 37% of retailers are investing in AI today to improve supply chain logistics, supply chain management, and forecasting. Source: AI and Machine Learning use cases in the retail industry worldwide as of 2019, Statista.

Personalising the overall customer journey and driving next-best offers in real-time are the two most common ways marketing leaders are using AI today, according to Salesforce

Improving customer segmentation, improving advertising and media buying, and personalising channel experiences are the next fastest-growing areas of AI adoption in marketing today. Source: Salesforce’s State of Marketing Study, 5th edition

82% of marketing leaders say improving customer experience is the leading factor in their decision to adopt AI

The timing and delivery of content, offers, and contextually relevant experiences are second (67%), and improving performance metrics is third at 57%. Source: Leading reasons to use artificial intelligence (AI) for marketing personalisation according to industry professionals worldwide in 2018, Statista.

81% of marketers are either planning to or are using AI in audience targeting this year

80% are currently using or planning to use AI for audience segmentation. EConsultancy’s study found marketers are enthusiastic about AI’s potential to increase marketing effectiveness and track progress. 88% of marketers interviewed say AI will enable them to be more effective in reaching their goals. Source: Dream vs. Reality: The State of Consumer First and Omnichannel Marketing. EConsultancy (36 pp., PDF, free, no opt-in).

Over 41% of marketers say AI is enabling them to generate higher revenues from email marketing

They also see an over 13% improvement in click-through rates and 7.64% improvement in open rates. Source: 4 Positive Effects of AI Use in Email Marketing, Statista (infographic), March 1, 2019.

Marketers and agencies are most comfortable with AI-enabled bid optimisation for media buying, followed by fraud mitigation

Marketers and their agencies differ on ad inventory selection and optimisation, with marketing teams often opting to use their analytics and reporting instead of relying on agency AI methods. Source: Share of marketing and agency professionals who are comfortable with AI-enabled technology automated handling of their campaigns in the United States as of June 2018, Statista.

Additional data sources on AI’s use in marketing:

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

How CRM remains the fastest growing enterprise software market – and how Salesforce still dominates

  • Salesforce dominated the worldwide CRM market with a 19.5% market share in 2018, over double its nearest rival, SAP, at 8.3% share
  • Worldwide spending on customer experience and relationship management (CRM) software grew 15.6% to reach $48.2B in 2018
  • 72.9% of CRM spending was on software as a service (SaaS) in 2018, which is expected to grow to 75% of total CRM software spending in 2019
  • Worldwide enterprise application software revenue totalled more than $193.6B in 2018, a 12.5% increase from 2017 revenue of $172.1B. CRM made up nearly 25% of the entire enterprise software revenue market

CRM remains the largest and fastest growing enterprise software category today, according to the latest market sizing and market share research Gartner published this week. Gartner defines CRM as providing the functionality to companies across the four segments of customer service and support, digital commerce, marketing, and sales.

All four subsegments of the CRM market grew by more than 13.7%, with marketing emerging as the fastest growing segment, increasing by 18.8% and representing more than 25% of the entire CRM market. Customer service and support retains its No. 1 position, contributing 35.7% of CRM market revenue and attaining $17.1B in revenues in 2018.

Key insights include the following:

With 19.5% market share, Salesforce has over 2X the CRM sales of SAP and over 3X those of Oracle

Salesforce continues to dominate CRM globally, increasing its market share from 18.3% in 2017 to 19.5% in 2018. Adobe is the only other vendor to grow its market share in 2018. Microsoft and SAP successfully held onto their market share while Oracle lost share.

Adobe and Salesforce grew faster than the overall market, increasing CRM revenues 21.7% and 23.2% respectively

Adobe’s CRM sales jumped from $2B in 2017 to $2.4B in 2018. Salesforce CRM revenues increased from $7.6B in 2017 to $9.4B in 2018, growing the fastest of all competitors in this market. SAP grew 15.5% between 2017 and 2018, just below the overall market growth of 15.6%. Microsoft (15%) and Oracle (7.1%) grew slower than the market. The following graphic compares growth rates between 2017 and 2018.
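As a rough check, the year-over-year growth rates above can be recomputed directly from the rounded revenue figures cited in this paragraph (a simple sketch; small differences from Gartner’s published percentages come from rounding in the revenue numbers):

```python
# Year-over-year CRM revenue growth, computed from the rounded
# 2017 and 2018 revenue figures ($B) cited above.
revenues = {
    "Adobe": (2.0, 2.4),
    "Salesforce": (7.6, 9.4),
}

def yoy_growth(prior: float, current: float) -> float:
    """Return year-over-year growth as a percentage."""
    return (current - prior) / prior * 100

for vendor, (y2017, y2018) in revenues.items():
    print(f"{vendor}: {yoy_growth(y2017, y2018):.1f}% growth, 2017 to 2018")
```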

Adobe dominates the marketing subsegment of CRM with 19% market share in 2018

Salesforce has 11.7% of the marketing subsegment, followed by IBM (5.7%), SAP (4%), Oracle (3.6%) and HubSpot (3.4%). Gartner estimates the marketing subsegment was a $12.2B market in 2018, increasing from $10.3B in 2017, achieving 18.8% growth in just a year.

Eastern and Western Europe were the fastest growing regions at 19.7% and 17.5% respectively

North America and Western Europe were the largest two regions with North America growing at 15.2% to reach $28.1B in revenue.

Sources:

Gartner Says Worldwide Customer Experience and Relationship Management Software Market Grew 15.6% in 2018

Market Share: Customer Experience and Relationship Management, Worldwide, 2018 (client access required)


The top 10 cybersecurity companies to watch in 2019 – and the key trends to explore

Today’s threatscape has made “trust but verify” obsolete 

The threatscape every business operates in today is proving the old model of “trust but verify” obsolete and in need of a complete overhaul. To compete and grow in the increasingly complex and lethal threatscape of today, businesses need more adaptive, contextually intelligent security solutions based on the Zero Trust Security framework.

Zero Trust takes a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network. John Kindervag was the first to see how urgent the need was for enterprises to change their approach to cybersecurity, so he created the Zero Trust Security framework in 2010 while at Forrester. Chase Cunningham, Principal Analyst at Forrester, is a mentor to many worldwide wanting to expand their knowledge of Zero Trust and frequently speaks and writes on the topic. If you are interested in cybersecurity in general and Zero Trust specifically, be sure to follow his blog.
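As an illustration only (not any vendor’s implementation), the “never trust, always verify, enforce least privilege” rule can be sketched as an access-decision function that re-verifies the user and device on every request and grants nothing beyond the minimum role a task requires; the roles, tasks, and risk threshold below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_verified: bool   # e.g. the user passed MFA for this session
    device_trusted: bool  # device posture check passed
    risk_score: float     # 0.0 (low) to 1.0 (high), derived from context
    requested_role: str

# Hypothetical least-privilege mapping: the minimum role each task needs.
MINIMUM_ROLE = {"read_report": "viewer", "rotate_keys": "admin"}

def authorize(req: AccessRequest, task: str, risk_threshold: float = 0.5) -> str:
    """Never trust: every request is verified; only least privilege is granted."""
    if not (req.user_verified and req.device_trusted):
        return "deny"  # always verify, whether inside or outside the network
    if req.risk_score > risk_threshold:
        return "step-up-auth"  # risky context demands extra verification
    needed = MINIMUM_ROLE.get(task)
    if needed is None or req.requested_role != needed:
        return "deny"  # requesting more than the minimum is refused
    return "grant"
```

For example, a verified user on a trusted device asking for the `viewer` role to read a report is granted access, while the same user asking for `admin` to do so is denied.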

AI and machine learning applied to cybersecurity’s most significant challenges are creating a proliferation of commercially successful, innovative platforms. The size and scale of deals in cybersecurity continue to accelerate, the largest being BlackBerry’s $1.4B all-cash acquisition of Cylance, which closed in February of this year. TD Ameritrade’s annual survey of registered investment advisors (RIA) showed nearly a 6X jump in cybersecurity investments this year compared to 2018.

The top ten cybersecurity companies reflect the speed and scale of innovation happening today, which is driving the highest levels of investment this industry has ever seen. The following are the top ten cybersecurity companies to watch in 2019:

Absolute (ABT.TO) 

Absolute is one of the world’s leading commercial enterprise security companies, serving as the industry benchmark for endpoint resilience, visibility, and control. The company enables more than 12,000 customers with self-healing endpoint security; always-connected visibility into their devices, data, users, and applications, whether endpoints are on or off the network; and the ultimate level of control and confidence required for the modern enterprise. Embedded in over one billion endpoint devices, Absolute delivers intelligence and real-time remediation capabilities that equip enterprises to stop data breaches at the source.

To thwart attackers, organisations continue to layer on security controls — Gartner estimates that more than $124B will be spent on security in 2019 alone. Absolute’s 2019 Endpoint Security Trends Report finds that much of that spend is in vain, however, revealing that 70% of all breaches still originate on the endpoint. The problem is complexity at the endpoint – it causes security agents to fail invariably, reliably, and predictably.

Absolute’s research found that 42% of all endpoints are unprotected at any given time, and 100% of endpoint security tools eventually fail. As a result, IT leaders see a negative ROI on their security spend. What makes Absolute one of the top 10 security companies to watch in 2019 is their purpose-driven design to mitigate this universal law of security decay.

Enterprises rely on Absolute to cut through the complexity to identify failures, model control options, and refocus security intent. Rather than perpetuating organisations’ false sense of security, Absolute enables uncompromised endpoint persistence, builds resilience and delivers the intelligence needed to ensure security agents, applications, and controls continue functioning and deliver value as intended. Absolute has proven very effective in validating safeguards, fortifying endpoints, and stopping data security compliance failures. The following is an example of the Absolute platform at work:

BlackBerry Artificial Intelligence and Predictive Security

BlackBerry is noteworthy for how quickly it is reinventing itself into an enterprise-ready cybersecurity company independent of the Cylance acquisition. Paying $1.4 billion in cash for Cylance brings much-needed AI and machine learning expertise to their platform portfolio, an acquisition that BlackBerry is moving quickly to integrate into their product and service strategies.

BlackBerry Cylance uses AI and machine learning to protect the entire attack surface of an enterprise with automated threat prevention, detection, and response capabilities. Cylance is also the first company to apply artificial intelligence, algorithmic science, and machine learning to cybersecurity and improve the way companies, governments, and end users proactively solve the world’s most challenging security problems.

Using a breakthrough mathematical process, BlackBerry Cylance quickly and accurately identifies what is safe and what is a threat, not just what is in a blacklist or whitelist. By coupling sophisticated math and machine learning with a unique understanding of a hacker’s mentality, BlackBerry Cylance provides the technology and services to be truly predictive and preventive against advanced threats.

The following screen from CylancePROTECT provides an executive summary of CylancePROTECT usage, from the number of zones and devices to the percentage of devices covered by auto-quarantine and memory protection, threat events, memory violations, agent versions, and offline days for devices.

Centrify

Centrify is redefining the legacy approach to Privileged Access Management by delivering cloud-ready Zero Trust Privilege to secure modern enterprise attack surfaces. Centrify Zero Trust Privilege helps customers grant least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment.

Industry research firm Gartner predicted Privileged Access Management (PAM) to be the second-fastest growing segment for information security and risk management spending worldwide in 2019 in their recent Forecast Analysis: Information Security and Risk Management, Worldwide, 3Q18 Update (client access required). By implementing least privilege access, Centrify minimises the attack surface, improves audit and compliance visibility, and reduces risk, complexity, and costs for the modern, hybrid enterprise.

Over half of the Fortune 100, the world’s largest financial institutions, intelligence agencies, and critical infrastructure companies, all trust Centrify to stop the leading cause of breaches – privileged credential abuse. PAM was also named a Top 10 security project for 2019 in Gartner’s Top 10 Security Projects for 2019 (client access required).

Cloudflare

Cloudflare is a web performance and security company that provides online services to protect and accelerate websites. Its platform includes:

  • Cloudflare CDN, which distributes content around the world to speed up websites
  • Cloudflare Optimizer, which enables web pages with ad servers and third-party widgets to load quickly on mobile devices and computers
  • Cloudflare Security, which protects websites from a range of online threats including spam, SQL injection, and DDoS
  • Cloudflare Analytics, which gives insight into a website’s traffic, including threats and search engine crawlers
  • Keyless SSL, which allows organisations to keep secure sockets layer (SSL) keys private
  • Cloudflare applications, which help users install web applications on their websites

CrowdStrike

Applying machine learning to endpoint detection of IT network threats is how CrowdStrike is differentiating itself in the rapidly growing cybersecurity market today. It’s also one of the top 25 machine learning startups to watch in 2019.

CrowdStrike is credited with uncovering Russian hackers inside the servers of the US Democratic National Committee. The company’s IPO was last Tuesday night, with an initial price of $34 per share. The IPO generated $610M at a valuation that at one point reached nearly $7B. Their Falcon platform stops breaches by detecting all attack types, even malware-free intrusions, providing five-second visibility across all current and past endpoint activity while reducing cost and complexity for customers.

CrowdStrike’s Threat Graph provides real-time analysis of data from endpoint events across the global crowdsourcing community, allowing detection and prevention of attacks based on patented behavioral pattern recognition technology.

Hunters.AI

Hunters.AI excels at autonomous threat hunting by capitalising on its autonomous system that connects to multiple channels within an organisation and detects the signs of potential cyber-attacks. They are one of the top 25 machine learning startups to watch in 2019.

What makes this startup one of the top ten cybersecurity companies to watch in 2019 is its innovative approach to creating AI- and machine learning-based algorithms that continually learn from an enterprise’s existing security data. Hunters.AI generates and delivers visualised attack stories allowing organisations to more quickly and effectively identify, understand, and respond to attacks.

Early customers include Snowflake Computing, whose VP of security recently said, “Hunters.AI identified the attack in minutes. In my 20 years in security, I have not seen anything as effective, fast, and with high fidelity as what Hunters can do.”  The following is a graphic overview of how the system works:

Idaptive

Idaptive is noteworthy for the Zero Trust approach it is taking to protecting organisations across every threat surface they rely on to operate their businesses daily. Idaptive secures access to applications and endpoints by verifying every user, validating their devices, and intelligently limiting their access. Their product and services strategy reflects a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network.

The Idaptive Next-Gen Access platform combines single sign-on (SSO), adaptive multifactor authentication (MFA), enterprise mobility management (EMM) and user behaviour analytics (UBA). They have over 2,000 organisations using their platform today. Idaptive was spun out from Centrify on January 1 this year.

Kount

Kount has successfully differentiated itself in an increasingly crowded cybersecurity marketplace by providing fraud management, identity verification and online authentication technologies that enable digital businesses, online merchants and payment service providers to identify and thwart a wide spectrum of threats in real-time. Kount has been able to show through customer references that their customers can approve more orders, uncover new revenue streams, and dramatically improve their bottom line all while minimising fraud management cost and losses.

Through Kount’s global network and proprietary technologies in AI and machine learning, combined with policy and rules management, its customers thwart online criminals and bad actors, driving them away from their sites, their marketplaces, and their networks. Kount’s continuously adaptive platform learns of new threats and continuously updates risk scores to further thwart breach and fraud attempts.

Kount’s advances in both proprietary techniques and patented technology include:

  • Superior mobile fraud detection
  • Advanced artificial intelligence
  • Multi-layer device fingerprinting
  • IP proxy detection and geo-location
  • Transaction and custom scoring
  • Global order linking
  • Business intelligence reporting
  • Comprehensive order management
  • Professional and managed services

Kount protects over 6,500 brands today.

MobileIron

The acknowledged leader in mobile device management (MDM) software, MobileIron’s latest series of developments makes it noteworthy and one of the top 10 cybersecurity companies to watch in 2019.

MobileIron was the first to deliver key innovations such as multi-OS mobile device management, mobile application management (MAM), and BYOD privacy controls. Last month MobileIron introduced zero sign-on (ZSO), built on the company’s unified endpoint management (UEM) platform and powered by the MobileIron Access solution. “By making mobile devices your identity, we create a world free from the constant pains of password recovery and the threat of data breaches due to easily compromised credentials,” wrote Simon Biddiscombe, MobileIron’s president and chief executive officer in his recent blog post, Single sign-on is still one sign-on too many.

Biddiscombe’s latest post, MobileIron: We’re making history by making passwords history, provides the company’s vision going forward with ZSO. Zero sign-on eliminates passwords as the primary method for user authentication, unlike single sign-on, which still requires at least one username and password. MobileIron paved the way for a zero sign-on enterprise with its Access product in 2017, which enabled zero sign-on to cloud services on managed devices. Enterprise security teams no longer have to trade off security for better user experience, thanks to the MobileIron Zero Sign-On.

Sumo Logic

Sumo Logic is a fascinating cybersecurity company to track because it shows the ability to take on large-scale enterprise security challenges and turn them into a competitive advantage. An example of this is how quickly the company achieved FedRAMP Ready Designation, getting listed in the FedRAMP Marketplace.

Sumo Logic is a secure, cloud-native, machine data analytics service, delivering real-time, continuous intelligence from structured, semi-structured, and unstructured data across the entire application lifecycle and stack. More than 2,000 customers around the globe rely on Sumo Logic for the analytics and insights to build, run, and secure their modern applications and cloud infrastructures. With Sumo Logic, customers gain a multi-tenant, service-model advantage to accelerate their shift to continuous innovation, increasing competitive advantage, business value, and growth.

Founded in 2010, Sumo Logic is a privately held company based in Redwood City, Calif. and is backed by Accel Partners, Battery Ventures, DFJ, Franklin Templeton, Greylock Partners, IVP, Sapphire Ventures, Sequoia Capital, Sutter Hill Ventures and Tiger Global Management.


What matters most in business intelligence 2019: Key enterprise use cases

  • Improving revenues using BI is now the most popular objective enterprises are pursuing in 2019
  • Reporting, dashboards, data integration, advanced visualisation, and end-user self-service are the most strategic BI initiatives underway in enterprises today
  • Operations, executive management, finance, and sales are primarily driving business intelligence (BI) adoption throughout enterprises today
  • Tech companies’ operations and sales teams are the most effective at driving BI adoption across industries surveyed, with advertising driving BI adoption across marketing

These and many other fascinating insights are from Dresner Advisory Services’ 10th edition of its popular Wisdom of Crowds® Business Intelligence Market Study. The study is noteworthy in that it provides insights into how enterprises are expanding their adoption of Business Intelligence (BI) from centralized strategies to tactical ones that seek to improve daily operations. The Dresner research team’s broad assessment of the BI market makes this report unique, including its use of visualizations that provide a strategic view of market trends. The study is based on interviews with respondents from the firm’s research community of over 5,000 organizations, as well as vendors’ customers and qualified crowdsourced respondents recruited over social media. Please see pages 13–16 for the methodology.

Key insights from the study include the following:

Operations, executive management, finance, and sales are primarily driving business intelligence (BI) adoption throughout their enterprises today

More than half of the enterprises surveyed see these four departments as the primary initiators or drivers of BI initiatives. Over the last seven years, Operations departments have increased their influence over BI adoption more than any other department included in the current and previous surveys. Marketing and Strategic Planning are also the most likely to be sponsoring BI pilots and looking for new ways to introduce BI applications and platforms into daily use.

Tech companies’ operations and sales teams are the most effective at driving BI adoption across industries surveyed, with advertising driving BI adoption across marketing

Retail/Wholesale and Tech companies’ sales leadership is primarily driving BI adoption in their respective industries. It’s not surprising that the leading influencer among Healthcare respondents is resource-intensive HR. The study found that Executive Management is most likely to drive business intelligence adoption in consulting practices.

Reporting, dashboards, data integration, advanced visualisation, and end-user self-service are the most strategic BI initiatives underway in enterprises today

Second-tier initiatives include data discovery, data warehousing, data mining/advanced algorithms, and data storytelling. Comparing the last four years of survey data, Dresner’s research team found reporting retains all-time high scores as the top priority, while data storytelling, governance, and data catalog hold momentum. Please click on the graphic to expand for easier reading.

BI software providers most commonly rely on executive-level personas to design their applications and add new features

Dresner’s research team found all vertical industries except Business Services target business executives first in their product design and messaging. Given the customer-centric nature of advertising and consulting services business models, it is understandable why BI vendors selling to them focus primarily on customer personas. The following graphic compares targeted users for BI by industry.

Improving revenues using BI is now the most popular objective in 2019, despite BI initially being positioned as a solution for compliance and risk management

Executive Management, Marketing/Sales, and Operations are driving the focus on improving revenues this year. Nearly 50% of enterprises now expect BI to deliver better decision making, making reporting and dashboards must-have features. Interestingly, enterprises aren’t looking to BI as much for improving operational efficiencies, cost reductions, or competitive advantage.

Over the last 12 to 18 months, more tech manufacturing companies have initiated new business models that require their operations teams to support a shift from products to services revenues. An example of this shift is the introduction of smart, connected products that provide real-time data that serves as the foundation for future services strategies. Please click on the graphic to expand for easier reading.

In aggregate, BI is achieving its highest levels of adoption in R&D, executive management, and operations departments today

The growing complexity of products and business models in tech companies, and the increasing reliance on analytics and BI in retail/wholesale to streamline supply chains and improve buying experiences, are contributing factors to the increasing levels of BI adoption in these three departments. The following graphic compares BI’s level of adoption by function today.

Enterprises with the largest BI budgets this year are investing more heavily into dashboards, reporting, and data integration

Conversely, those with smaller budgets are placing a higher priority on open source-based big data projects, end-user data preparation, collaborative support for group-based decision-making, and enterprise planning. The following graphic provides insights into technologies and initiatives strategic to BI at an enterprise level by budget plans.

Marketing/sales and operations are using the greatest variety of BI tools today

The survey shows how conversant Operations professionals are with the BI tools in use throughout their departments; every one of them knows how many, and most likely which types of, BI tools are deployed. Across all industries, Research & Development (R&D), Business Intelligence Competency Center (BICC), and IT respondents are most likely to report they have multiple tools in use.


How to get your data scientist career up and running: A guide

Note: The most common request from this blog’s readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I’ve invited Alyssa Columbus, a data scientist at Pacific Life, to share her insights and lessons learned on breaking into the field of data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first job in data science, isn’t easy given the surplus of analytics job-seekers relative to open analytics positions.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I’ve attained my current data science position at Pacific Life. I’ve referred them to many different resources, including discussions I’ve had on the Dataquest.io blog and the Scatter Podcast. In the interest of providing job seekers with a comprehensive view of what I’ve learned that works, I’ve put together the five most valuable lessons learned. I’ve written this article to make your data science job hunt easier and as efficient as possible.

Continuously build your statistical literacy and programming skills

Currently, there are 24,697 open data scientist positions on LinkedIn in the United States alone. The following list of the top 10 data science skills was created by using data mining techniques to analyse all of those open positions.

As of April 14, the top 3 most common skills requested in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and Tensorflow. The following graphic provides a prioritised list of the most in-demand data science skills mentioned in LinkedIn job postings today. Please click on the graphic to expand for easier viewing.
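
A skills ranking like the one described above can be approximated with a simple frequency count over posting text. A minimal sketch (the postings and skill list are invented for illustration, not actual LinkedIn data):

```python
from collections import Counter

# Illustrative sample of job-posting texts (not real LinkedIn data)
postings = [
    "Seeking data scientist with Python, SQL and TensorFlow experience",
    "Analyst role: R, SQL, Tableau; Python a plus",
    "ML engineer: Python, AWS, Jupyter notebooks, SQL",
]

skills = ["python", "r", "sql", "aws", "tensorflow", "tableau", "jupyter"]

def rank_skills(postings, skills):
    """Count how many postings mention each skill (case-insensitive)."""
    counts = Counter()
    for text in postings:
        # Normalise punctuation so skill names split cleanly into words
        words = set(
            text.lower().replace(",", " ").replace(";", " ").replace(":", " ").split()
        )
        for skill in skills:
            if skill in words:
                counts[skill] += 1
    return counts.most_common()

print(rank_skills(postings, skills))
```

A production version would pull posting text via an API or scraper and handle multi-word skills (e.g., “Unix Shell/Awk”), but the ranking step itself stays this simple.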

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn’s job postings prioritise. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and never applying what you’ve learned to real problems. Your applied experience is just as important as your academic experience, and taking statistics and computer science classes helps translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for which big questions to ask of your dataset. Statistical literacy, or “how” to find the answers to your questions, comes with education and practice. Strengthening your intellectual curiosity, or insight into asking the right questions, comes through experience.

Continually be creating your own, unique portfolio of analytics and machine learning projects

Having a good portfolio is essential to being hired as a data scientist, especially if you don’t come from a quantitative background or have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist, with both the passion and the skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you’re most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you’ve completed on your own. Some skills I’d recommend you highlight in your portfolio include:

  • Your programming language of choice (e.g., Python, R, Julia, etc.).
  • The ability to interact with databases (e.g., your ability to use SQL).
  • Visualisation of data (static or interactive).
  • Storytelling with data. This is a critical skill. In essence, can someone with no background in whatever area your project is in look at your project and gain some new understandings from it?
  • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard).
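
As one way to demonstrate the database-interaction skill listed above, Python’s built-in sqlite3 module lets a portfolio project show SQL fluency without any external services; a minimal sketch with invented data:

```python
import sqlite3

# In-memory database so the example is fully self-contained
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("West", 120.0), ("East", 80.0), ("West", 60.0)],
)

# An aggregate query of the kind interviewers often ask about
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()

print(rows)  # → [('West', 180.0), ('East', 80.0)]
conn.close()
```

The same queries transfer directly to PostgreSQL or MySQL in a real project; the point for a portfolio is showing you can express questions in SQL, not which engine you used.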

Julia Silge and Amber Thomas both have excellent examples of portfolios that you can be inspired by. Julia’s portfolio is shown below.

Get (or git!) yourself a website

If you want to stand out, along with a portfolio, create and continually build a strong online presence in the form of a website.  Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time, and best of all it’s free to do. A strong online presence will not only help you in applying for jobs, but organisations may also reach out to you with freelance projects, interviews, and other opportunities.

Be confident in your skills and apply for any job you’re interested in, starting with opportunities available in your network

If you don’t meet all of a job’s requirements, apply anyway. You don’t have to know every skill (e.g., programming languages) on a job description, especially if there are more than ten listed. If you’re a great fit for the main requirements of the job’s description, you need to apply. A good general rule is that if you have at least half of the skills requested on a job posting, go for it. When you’re hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I’ve found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specialising in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:

  • Recruiters
  • Friends, family, and colleagues
  • Career fairs and recruiting events
  • General job boards
  • Company websites
  • Tech job boards

Bring the same level of intensity to improving your communication skills as you do to your quantitative skills – as data scientists need to also excel at storytelling

One of the most important skills for data scientists to have is the ability to communicate results to different audiences and stakeholders so others can understand and act on their insights. Since data projects are collaborative across many teams and results are often incorporated into larger projects, the true impact of a data scientist’s work depends on how well others can understand their insights to take further action and make informed decisions.

Alyssa Columbus is a Data Scientist at Pacific Life and member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher at the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.


How to improve privileged users’ security experiences with machine learning

Bottom line: One of the primary factors motivating employees to sacrifice security for speed is the frustration they face when repeatedly re-authenticating who they are so they can get more work done and achieve greater productivity.

How bad security experiences lead to a breach

Every business is facing the paradox of hardening security without sacrificing users’ login and system access experiences. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment across every threat surface an organisation has.

Centrify’s recent survey Privileged Access Management In The Modern Threatscape found that 74% of data breaches start with privileged credential abuse. Forrester estimates that 80% of data breaches have a connection to compromised privileged credentials, such as passwords, tokens, keys, and certificates. On the Dark Web, privileged access credentials are a best-seller because they provide the intruder with “the keys to the kingdom.” By leveraging a “trusted” identity, a hacker can operate undetected and exfiltrate sensitive data sets without raising any red flags.

Frustrated with wasting time responding to the many account lock-outs, re-authentication procedures, and login errors outmoded Privileged Access Management (PAM) systems require, IT Help Desk teams, IT administrators, and admin users freely share privileged credentials, often resulting in them eventually being offered for sale on the Dark Web.

The keys to the kingdom are in high demand

18% of healthcare employees are willing to sell confidential data to unauthorised parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey. State-sponsored and organised crime organisations offer to pay bounties in bitcoin for privileged credentials for many of the world’s largest financial institutions on the Dark Web. And with the typical U.S.-based enterprise losing on average $7.91M from a breach, more than double the global average of $3.86M according to IBM’s 2018 Data Breach Study, it’s clear that improving admin user experiences to reduce the incidence of privileged credential sharing needs to happen now.

How machine learning improves admin user experiences and thwarts breaches

Machine learning is making every aspect of security experiences more adaptive, taking into account the risk context of every privileged access attempt across any threat surface, anytime. Machine learning algorithms can continuously learn and generate contextual intelligence that is used to streamline verified privileged users’ access while thwarting many potential threats ― the most common of which is compromised credentials.

The following are a few of the many ways machine learning is improving privileged users’ experiences when they need to log in to secure critical infrastructure resources:

  • Machine learning is making it possible to provide adaptive, personalised login experiences at scale using risk-scoring of every access attempt in real-time, all contributing to improved user experiences: Machine learning is making it possible to implement security strategies that flex or adapt to risk contexts in real-time, assessing every access attempt across every threat surface, and generating a risk score in milliseconds.

    Being able to respond in milliseconds, in real time, is essential for delivering excellent admin user experiences. The “never trust, always verify, enforce least privilege” approach to security is how many enterprises from a broad base of industries, including leading financial services and insurance companies, are protecting every threat surface from privileged access abuse.

    CIOs at these companies say taking a Zero Trust approach with a strong focus on Zero Trust Privilege corporate-wide is redefining the legacy approach to Privileged Access Management by delivering cloud-architected Zero Trust Privilege to secure access to infrastructure, DevOps, cloud, containers, big data, and other modern enterprise use cases. Taking a Zero Trust approach to security enables their departments to roll out new services across every threat surface their customers prefer to use without having to customise security strategies for each.
     

  • Quantify, track and analyse every potential security threat and attempted breach, and apply threat analytics to the aggregated data sets in real-time, thwarting data exfiltration attempts before they begin. One of the cornerstones of Zero Trust Privilege is adaptive control. Machine learning algorithms continually “learn” by continuously analysing and looking for anomalies in users’ behavior across every threat surface, device, and login attempt.

    When any user’s behavior appears to be outside the threshold of constraints defined for threat analytics and risk scoring, additional authentication is immediately requested, and access is denied to requested resources until the identity can be verified. Machine learning makes adaptive preventative controls possible.
     

  • When every identity is a new security perimeter, machine learning’s ability to provide personalisation at scale for every access attempt on every threat surface is essential for enabling a company to keep growing. Businesses that are growing the fastest often face the greatest challenges when it comes to improving their privileged users’ experiences.

    Getting new employees productive quickly needs to be based on four foundational elements. These include verifying the identity of every admin user, knowing the context of their access request, ensuring it’s coming from a clean source, and limiting access as well as privilege. Taken together, these pillars form the foundation of a Zero Trust Privilege.
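
The adaptive-control flow described in the points above can be sketched as a simple decision rule: compute a risk score from the context of each access attempt, then grant access, require step-up authentication, or deny. The factors, weights, and thresholds below are illustrative assumptions, not taken from any vendor’s product:

```python
def risk_score(attempt):
    """Toy risk score for a privileged access attempt (0.0 = low, 1.0 = high).

    The contextual factors and weights are illustrative only.
    """
    score = 0.0
    if attempt.get("new_device"):
        score += 0.4
    if attempt.get("unusual_location"):
        score += 0.3
    if attempt.get("off_hours"):
        score += 0.2
    if attempt.get("failed_logins", 0) > 2:
        score += 0.3
    return min(score, 1.0)

def access_decision(attempt, allow_below=0.3, deny_above=0.7):
    """Map a risk score to grant / step-up / deny, per the adaptive-control idea."""
    score = risk_score(attempt)
    if score < allow_below:
        return "grant"
    if score > deny_above:
        return "deny"
    return "step_up_authentication"

print(access_decision({"off_hours": True}))                   # low-risk context
print(access_decision({"new_device": True,
                       "unusual_location": True,
                       "failed_logins": 5}))                  # high-risk context
```

A real system would replace the hand-set weights with a model trained on historical access logs, which is exactly where the continuous “learning” described above comes in.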

Conclusion

Organisations don’t have to sacrifice security for speed when they’re relying on machine learning-based approaches for improving the privileged user experience. Today, a majority of IT Help Desk teams, IT administrators, and admin users are freely sharing privileged credentials to be more productive, which often leads to breaches based on privileged access abuse. By taking a machine learning-based approach to validate every access request, the context of the request, and the risk of the access environment, roadblocks in the way of greater privileged user productivity disappear. Privileged credential abuse is greatly minimised.


How to improve supply chains with machine learning: 10 proven ways

Bottom line: Enterprises are attaining double-digit improvements in forecast error rates, demand planning productivity, cost reductions and on-time shipments using machine learning today, revolutionising supply chain management in the process.

Machine learning algorithms and the models they’re based on excel at finding anomalies, patterns and predictive insights in large data sets. Many supply chain challenges are time, cost and resource constraint-based, making machine learning an ideal technology to solve them.

From Amazon’s Kiva robotics relying on machine learning to improve accuracy, speed and scale, to DHL relying on AI and machine learning to power their Predictive Network Management system that analyses 58 different parameters of internal data to identify the top factors influencing shipment delays, machine learning is defining the next generation of supply chain management. Gartner predicts that by 2020, 95% of Supply Chain Planning (SCP) vendors will be relying on supervised and unsupervised machine learning in their solutions. Gartner also predicts that by 2023, intelligent algorithms and AI techniques will be embedded or augmented components across 25% of all supply chain technology solutions.

The ten ways that machine learning is revolutionising supply chain management include:

Machine learning-based algorithms are the foundation of the next generation of logistics technologies, with the most significant gains being made with advanced resource scheduling systems

Machine learning and AI-based techniques are the foundation of a broad spectrum of next-generation logistics and supply chain technologies now under development. The most significant gains are being made where machine learning can contribute to solving complex constraint, cost and delivery problems companies face today. McKinsey predicts machine learning’s most significant contributions will be in providing supply chain operators with more significant insights into how supply chain performance can be improved, anticipating anomalies in logistics costs and performance before they occur. Machine learning is also providing insights into where automation can deliver the most significant scale advantages. Source: McKinsey & Company, Automation in logistics: Big opportunity, bigger uncertainty, April 2019. By Ashutosh Dekhne, Greg Hastings, John Murnane, and Florian Neuhaus

The wide variation in data sets generated from the Internet of Things (IoT) sensors, telematics, intelligent transport systems, and traffic data have the potential to deliver the most value to improving supply chains by using machine learning

Applying machine learning algorithms and techniques to improve supply chains starts with data sets that have the greatest variety and variability in them. The most challenging issues supply chains face are often found in optimising logistics, so materials needed to complete a production run arrive on time. Source: KPMG, Supply Chain Big Data Series Part 1

Machine learning shows the potential to reduce logistics costs by finding patterns in track-and-trace data captured using IoT-enabled sensors, contributing to $6M in annual savings

BCG recently looked at how a decentralised supply chain using track-and-trace applications could improve performance and reduce costs. They found that in a 30-node configuration, when blockchain is used to share data in real-time across a supplier network, combined with better analytics insight, cost savings of $6M a year are achievable. Source: Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra

Reducing forecast errors up to 50% is achievable using machine learning-based techniques

Lost sales due to products not being available are being reduced up to 65% through the use of machine learning-based planning and optimisation techniques. Inventory reductions of 20 to 50% are also being achieved today when machine learning-based supply chain management systems are used. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).
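
As a rough illustration of how forecast error is measured when claims like these are evaluated, the sketch below computes MAPE (mean absolute percentage error) for a naive last-value forecast versus a three-month moving average on an invented noisy demand series; the data and the gap between the two methods are illustrative only:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

# Illustrative noisy monthly demand (not real data)
demand = [100, 130, 90, 120, 95, 125, 100, 130]

# Naive forecast: next month equals this month
naive = demand[:-1]
actual = demand[1:]

# 3-month moving-average forecast, aligned to the months it predicts
moving_avg = [sum(demand[i - 3:i]) / 3 for i in range(3, len(demand))]
actual_ma = demand[3:]

print(f"naive MAPE: {mape(actual, naive):.1f}%")            # ≈ 27.3%
print(f"moving-average MAPE: {mape(actual_ma, moving_avg):.1f}%")  # ≈ 16.1%
```

On this noisy series even simple smoothing roughly halves the error versus the naive forecast; the ML-based planning systems cited above go much further by learning seasonality, promotions, and external signals.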

DHL Research is finding that machine learning enables logistics and supply chain operations to optimise capacity utilisation, improve customer experience, reduce risk, and create new business models

DHL’s research team continually tracks and evaluates the impact of emerging technologies on logistics and supply chain performance. They’re also predicting that AI will enable back-office automation, predictive operations, intelligent logistics assets, and new customer experience models. Source: DHL Trend Research, Logistics Trend Radar, Version 2018/2019 (PDF, 55 pp., no opt-in)

Detecting and acting on inconsistent supplier quality levels and deliveries using machine learning-based applications is an area manufacturers are investing in today

Based on conversations with North American-based mid-tier manufacturers, the second most significant growth barrier they’re facing today is suppliers’ lack of consistent quality and delivery performance. The greatest growth barrier is the lack of skilled labor available. Using machine learning and advanced analytics manufacturers can discover quickly who their best and worst suppliers are, and which production centers are most accurate in catching errors. Manufacturers are using dashboards much like the one below for applying machine learning to supplier quality, delivery and consistency challenges. Source: Microsoft, Supplier Quality Analysis sample for Power BI: Take a tour, 2018
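
The supplier-ranking step described above amounts to a simple aggregation of inspection results. A minimal sketch (the suppliers and inspection records are invented for illustration):

```python
from collections import defaultdict

# Illustrative inspection records: (supplier, units_inspected, defects)
inspections = [
    ("Acme", 500, 5),
    ("Beta", 300, 12),
    ("Acme", 400, 3),
    ("Gamma", 600, 6),
    ("Beta", 200, 9),
]

def defect_rates(records):
    """Aggregate the defect rate per supplier, worst first."""
    units = defaultdict(int)
    defects = defaultdict(int)
    for supplier, inspected, bad in records:
        units[supplier] += inspected
        defects[supplier] += bad
    rates = {s: defects[s] / units[s] for s in units}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

for supplier, rate in defect_rates(inspections):
    print(f"{supplier}: {rate:.2%}")
```

Dashboards like the Power BI example the article cites layer visuals and drill-downs over exactly this kind of aggregation, often adding delivery timeliness alongside defect rates.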

Reducing risk and the potential for fraud, while improving the product and process quality based on insights gained from machine learning is forcing inspection’s inflection point across supply chains today

When inspections are automated using mobile technologies and results are uploaded in real-time to a secure cloud-based platform, machine learning algorithms can deliver insights that immediately reduce risks and the potential for fraud. Inspectorio is a machine learning startup to watch in this area. They’re tackling the many problems that a lack of inspection and supply chain visibility creates, focusing on how they can solve them immediately for brands and retailers. The graphic below explains their platform. Source: Forbes, How Machine Learning Improves Manufacturing Inspections, Product Quality & Supply Chain Visibility, January 23, 2019

Machine learning is making rapid gains in end-to-end supply chain visibility possible, providing predictive and prescriptive insights that are helping companies react faster than before

Combining multi-enterprise commerce networks for global trade and supply chain management with AI and machine learning platforms are revolutionising supply chain end-to-end visibility.

One of the early leaders in this area is Infor’s Control Center. Control Center combines data from the Infor GT Nexus Commerce Network, acquired by the company in September 2015, with Infor’s Coleman Artificial Intelligence (AI) platform. Infor chose to name their AI platform after the inspiring physicist and mathematician Katherine Coleman Johnson, whose trail-blazing work helped NASA land on the moon. Be sure to pick up a copy of the book and see the movie Hidden Figures if you haven’t already to appreciate her and many other brilliant women mathematicians’ contributions to space exploration. ChainLink Research provides an overview of Control Center in their article, How Infor is Helping to Realise Human Potential, and two screens from Control Center are shown below.

Machine learning is proving to be foundational for thwarting privileged credential abuse which is the leading cause of security breaches across global supply chains

By taking a least privilege access approach, organisations can minimise attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and the costs of operating a modern, hybrid enterprise. CIOs are solving the paradox of privileged credential abuse in their supply chains by recognising that even if a privileged user has entered the right credentials, a request that comes in with risky context requires stronger verification before access is permitted.

Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment.  Centrify is a leader in this area, with globally-recognised suppliers including Cisco, Intel, Microsoft, and Salesforce being current customers.  Source: Forbes, High-Tech’s Greatest Challenge Will Be Securing Supply Chains In 2019, November 28, 2018.

Capitalising on machine learning to predict preventative maintenance for freight and logistics machinery based on IoT data is improving asset utilisation and reducing operating costs

McKinsey found that predictive maintenance enhanced by machine learning allows for better prediction and avoidance of machine failure by combining data from the advanced Internet of Things (IoT) sensors and maintenance logs as well as external sources. Asset productivity increases of up to 20% are possible and overall maintenance costs may be reduced by up to 10%. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).
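
A greatly simplified version of such a predictive-maintenance rule flags an asset when a smoothed sensor reading drifts past a threshold; the vibration readings, window size, and threshold below are illustrative assumptions, not from any real deployment:

```python
def rolling_mean(values, window):
    """Simple rolling mean; returns one value per full window."""
    return [sum(values[i - window:i]) / window for i in range(window, len(values) + 1)]

def maintenance_alerts(readings, window=3, threshold=80.0):
    """Return the reading indices where the smoothed signal exceeds the threshold."""
    smoothed = rolling_mean(readings, window)
    return [i + window - 1 for i, v in enumerate(smoothed) if v > threshold]

# Illustrative vibration readings from an IoT sensor on one machine
vibration = [60, 62, 61, 65, 70, 78, 85, 90, 95]

alerts = maintenance_alerts(vibration)
print(alerts)  # → [7, 8]
```

Production systems replace the fixed threshold with models trained on failure history and maintenance logs, which is what enables the failure prediction (rather than mere detection) McKinsey describes.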


The state of cloud business intelligence 2019: Digging down on Dresner’s analysis

  • An all-time high 48% of organisations say cloud BI is either “critical” or “very important” to their operations in 2019
  • Marketing and sales place the greatest importance on cloud BI in 2019
  • Small organisations of 100 employees or fewer are the most enthusiastic, perennial adopters and supporters of cloud BI
  • The most preferred cloud BI providers are Amazon Web Services and Microsoft Azure.

These and other insights are from Dresner Advisory Services’ 2019 Cloud Computing and Business Intelligence Market Study. The 8th annual report focuses on end-user deployment trends and attitudes toward cloud computing and business intelligence (BI), defined as the technologies, tools, and solutions that rely on one or more cloud deployment models. What makes the study noteworthy is the depth of focus around the perceived benefits and barriers for cloud BI, the importance of cloud BI, and current and planned usage.

“We began tracking and analysing the cloud BI market dynamic in 2012 when adoption was nascent. Since that time, deployments of public cloud BI applications are increasing, with organisations citing substantial benefits versus traditional on-premises implementations,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. Please see page 10 of the study for specifics on the methodology.

Key insights gained from the report include the following:

An all-time high 48% of organisations say cloud BI is either “critical” or “very important” to their operations in 2019

Organisations have more confidence in cloud BI than ever before, according to the study’s results. 2019 is seeing a sharp upturn in cloud BI’s importance, driven by the trust and credibility organisations have for accessing, analysing and storing sensitive company data on cloud platforms running BI applications.

Marketing and sales place the greatest importance on cloud BI in 2019

Business intelligence competency centres (BICC) and IT departments have an above-average interest in cloud BI as well, with their combined critical and very important scores being over 50%.

Dresner’s research team found that operations had the greatest duality of scores, with critical and not important being reported at comparable levels for this functional area. Dresner’s analysis indicates operations departments often rely on cloud BI to benchmark and improve existing processes while re-engineering legacy process areas.

Small organisations – of 100 employees or fewer – are the most enthusiastic, perennial adopters and supporters of cloud BI

As has been the case in previous years’ studies, small organisations are leading all others in adopting cloud BI systems and platforms.  Perceived importance declines only slightly in mid-sized organisations (101-1,000 employees) and some large organisations (1,001-5,000 employees), where minimum scores of important offset declines in critical.

The retail/wholesale industry considers cloud BI the most important, followed by technology and advertising industries

Organisations competing in the retail/wholesale industry see the greatest value in adopting cloud BI to gain insights into improving their customer experiences and streamlining supply chains. Technology and advertising industries are industries that also see cloud BI as very important to their operations. Just over 30% of respondents in the education industry see cloud BI as very important.

R&D departments are the most prolific users of cloud BI systems today, followed by marketing and sales

The study highlights that R&D leading all other departments in existing cloud BI use reflects broader potential use cases being evaluated in 2019. Marketing & Sales is the next most prolific department using cloud BI systems.

Finance leads all others in their adoption of private cloud BI platforms, rivaling IT in their lack of adoption for public clouds

R&D departments are the next most likely to be relying on private clouds currently. Marketing and sales are the most likely to take a balanced approach to private and public cloud adoption, equally adopting private and public cloud BI.

Advanced visualisation, support for ad-hoc queries, personalised dashboards, and data integration/data quality tools/ETL tools are the four most popular cloud BI requirements in 2019

Dresner’s research team found the lowest-ranked cloud BI feature priorities in 2019 are social media analysis, complex event processing, big data, text analytics, and natural language analytics. This year’s analysis of the most and least popular cloud BI requirements closely mirrors traditional BI feature requirements.

Marketing and sales have the greatest interest in several of the most-required features including personalised dashboards, data discovery, data catalog, collaborative support, and natural language analytics

Marketing and sales also have the highest level of interest in the ability to write to transactional applications. R&D leads interest in ad-hoc query, big data, text analytics, and social media analytics.

The retail/wholesale industry leads interest in several features including ad-hoc query, dashboards, data integration, data discovery, production reporting, search interface, data catalog, and ability to write to transactional systems

Technology organisations give the highest score to advanced visualisation and end-user self-service. Healthcare respondents prioritise data mining, end-user data blending, and location analytics, the latter likely for asset tracking purposes. In-memory support scores highest with Financial Services respondent organisations.

Marketing and sales rely on a broader base of third party data connectors to get greater value from their cloud BI systems than their peers

The greater the scale, scope and depth of third-party connectors and integrations, the more valuable marketing and sales data becomes. Relying on connectors for greater insights into sales productivity and performance, social media, online marketing, online data storage, and simple productivity improvements is common in marketing and sales. Finance requiring integration to Salesforce reflects the CRM application’s success in transcending customer relationships into advanced accounting and financial reporting.

Subscription models are now the most preferred licensing strategy for cloud BI and have progressed over the last several years due to lower risk, lower entry costs, and lower carrying costs

Dresner’s research team found that subscription license and free trial (including trial and buy, which may also lead to subscription) are the two licensing strategies most preferred by cloud BI customers in 2019. Dresner Advisory Services predicts new engagements will be earned using subscription models, which are now seen as at least important by approximately 90% of respondents.

60% of organisations adopting cloud BI rank Amazon Web Services first, and 85% rank AWS first or second

43% choose Microsoft Azure first and 69% pick Azure first or second. Google Cloud closely trails Azure as the first choice among users but trails more widely after that. IBM Bluemix is the first choice of 12% of organisations responding in 2019.


Uncovering the insight behind Gartner’s $331 billion public cloud forecast

Gartner is predicting the worldwide public cloud services market will grow from $182.4bn in 2018 to $214.3bn in 2019, a 17.5% jump in just a year.

  • Gartner predicts the worldwide public cloud service market will grow from $182.4bn in 2018 to $331.2bn in 2022, attaining a compound annual growth rate (CAGR) of 12.6%
  • Spending on infrastructure as a service (IaaS) is predicted to increase from $30.5bn in 2018 to $38.9bn in 2019, growing 27.5% in a year
  • Platform as a service (PaaS) spending is predicted to grow from $15.6bn in 2018 to $19bn in 2019, growing 21.8% in a year
  • Business intelligence, supply chain management, project and portfolio management and enterprise resource planning (ERP) will see the fastest growth in end-user spending on SaaS applications through 2022
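The one-year growth rates cited above follow directly from the dollar figures. A quick arithmetic check (my own sketch, not Gartner's):

```python
# Year-over-year growth implied by the Gartner dollar figures above ($bn).
def yoy_growth(start: float, end: float) -> float:
    """Percentage growth from start to end over one year."""
    return (end / start - 1) * 100

print(f"Total public cloud: {yoy_growth(182.4, 214.3):.1f}%")  # 17.5%
print(f"IaaS:               {yoy_growth(30.5, 38.9):.1f}%")    # 27.5%
print(f"PaaS:               {yoy_growth(15.6, 19.0):.1f}%")    # 21.8%
```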

Gartner’s annual forecast of worldwide public cloud service revenue was published last week, and it includes many interesting insights into how the research firm sees the current and future landscape of public cloud computing.

By the end of 2019, more than 30% of technology providers’ new software investments will shift from cloud-first to cloud-only, further reducing license-based software spending and increasing subscription-based cloud revenue.

The following graphic compares worldwide public cloud service revenue by segment from 2018 to 2022.

Comparing compound annual growth rates (CAGRs) of worldwide public cloud service revenue segments from 2018 to 2022 reflects IaaS’ anticipated rapid growth.

Gartner provided the following data table this week as part of their announcement:

BI, supply chain management, project and portfolio management and ERP will see the fastest growth in end-user spending on SaaS applications through 2022

Gartner is predicting that between 2017 and 2022, end-user spending on business intelligence SaaS applications will grow by 23.3%, SaaS-based supply chain management applications by 21.2%, project and portfolio management SaaS-based applications by 20.9%, and SaaS ERP systems by 19.2%.
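Compounding those rates over the five-year window shows how sharply spending multiplies. A small illustrative sketch (the multiples are my own arithmetic, not Gartner figures):

```python
# A CAGR of r percent sustained over n years compounds to a multiple of (1 + r/100) ** n.
def cagr_multiple(cagr_pct: float, years: int) -> float:
    """Cumulative growth multiple implied by a compound annual growth rate."""
    return (1 + cagr_pct / 100) ** years

# The 2017-2022 SaaS growth rates cited above span five years:
for segment, rate in [("BI", 23.3), ("SCM", 21.2), ("PPM", 20.9), ("ERP", 19.2)]:
    print(f"{segment}: {cagr_multiple(rate, 5):.2f}x spending by 2022")
```

A 23.3% CAGR, for example, implies BI SaaS spending nearly triples (about 2.85x) over the period.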

Sources: Gartner Forecasts Worldwide Public Cloud Revenue to Grow 17.5 Percent in 2019 and Forecast: Public Cloud Services, Worldwide, 2016-2022, 4Q18 Update (Gartner client access)


The five key things every executive needs to know about identity and access management

  • For new digital business models to succeed, customers’ privacy preferences need to be secure, and that begins by treating every identity as a new security perimeter.
  • Organisations need to recognise that perimeter-based security, which focuses on securing endpoints, firewalls, and networks, provides no protection against identity and credential-based threats. Until they start implementing identity-centric security measures, account compromise attacks will continue to provide a perfect camouflage for data breaches.
  • 74% of data breaches start with privileged credential abuse that could have been averted if the organisations had adopted a privileged access management (PAM) strategy, according to a recent Centrify survey.
  • Just 48% of organisations have a password vault, and only 21% have multi-factor authentication (MFA) implemented for privileged administrative access.

New digital business models are redefining organisations’ growth trajectories and enabling startups to thrive, all driven by customer trust. Gaining and strengthening customer trust starts with a security strategy that can scale quickly to secure every identity and threat surface a new business model creates. 

Centrify’s recent survey, Privileged Access Management in the Modern Threatscape, found 74% of data breaches begin with privileged credential abuse. The survey also found that the most important areas of IT infrastructure that new digital business models rely on to succeed — including big data repositories, cloud platform access, containers, and DevOps — are among the most vulnerable. The most urgent challenges executives are facing include protecting their business, securing customer data, and finding new ways to add value to their business’ operations.

Why executives need to know about identity and access management now  

Executives have a strong sense of urgency to improve identity and access management (IAM) today to ensure the right individuals access the right resources at the right times and for the right reasons.

IAM components like access management, single sign-on, customer identity and access management (CIAM), advanced authentication, identity governance and administration (IGA), IoT-driven IAM, and privileged access management address the need to ensure appropriate access to resources across an organisation’s entire attack surface and to meet compliance requirements.

Considering that privileged access abuse is the leading cause of today’s breaches, executives are especially prioritising privileged account management as part of their broader cybersecurity strategies to secure the “keys to their kingdom.” Gartner supports this view, placing a high priority on privileged account management and including it in its Top 10 Security Projects for both 2018 and 2019.

During a recent conversation with insurance and financial services executives, I learned why privileged access management is such an urgent, high priority today. Privileged access abuse is their leading attack vector, where they see the majority of attempts to breach the company’s most sensitive systems and data. It’s also where they can improve customer data security while making employees more productive by giving them access to systems and platforms faster. All of them know of instances of hackers and state-sponsored hacking groups offering bitcoin payments in exchange for administrative-level logins and passwords to their financial systems.

Several of the executives I spoke with are also evaluating Zero Trust as the foundation for their cybersecurity strategy. As their new digital business models grow, all of them are focused on discarding the outdated, “trust, but verify” mindset and replacing it with Zero Trust, which mandates a “never trust, always verify” approach. They’re also using a least privilege access approach to minimise each attack surface and improve audit and compliance visibility while reducing risk, complexity, and costs.
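The “never trust, always verify” approach can be sketched as a per-request, least-privilege access check. The names and policy shape below are hypothetical, my own illustration rather than any vendor’s actual API:

```python
# Minimal sketch of a Zero Trust access decision: every request is verified
# against explicit least-privilege grants; nothing is trusted by default.
from dataclasses import dataclass, field

@dataclass
class AccessRequest:
    identity: str
    resource: str
    action: str
    mfa_verified: bool = False  # verified per request, not just at login

@dataclass
class Policy:
    # Explicit least-privilege grants: identity -> {(resource, action), ...}
    grants: dict = field(default_factory=dict)

    def authorize(self, req: AccessRequest) -> bool:
        # Zero Trust: re-verify the identity on every single request.
        if not req.mfa_verified:
            return False
        return (req.resource, req.action) in self.grants.get(req.identity, set())

policy = Policy(grants={"alice": {("billing-db", "read")}})
print(policy.authorize(AccessRequest("alice", "billing-db", "read", mfa_verified=True)))   # True
print(policy.authorize(AccessRequest("alice", "billing-db", "write", mfa_verified=True)))  # False: not granted
```

The design choice worth noting is that denial is the default: an identity can do only what is explicitly granted, which keeps each attack surface as small as possible.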

The following are the five things every executive needs to know about identity and access management to address a reality every company and consumer must recognise today: attackers no longer “hack” in; they log in.

Designing in the ability to manage access rights and all digital identities of privileged users requires that privileged access management (PAM) and identity governance and administration (IGA) systems be integrated as part of an IAM strategy

For digital business initiatives’ security strategies to scale, they need to support access requests, entitlement management, and user credential attestation for governance purposes. With identities being the new security perimeter, provisioning least privileged access to suppliers, distributors, and service organisations is also a must-have to scale any new business model. Because IGA natively deals only with end users, not privileged users, integration with PAM systems is required to bring in privileged user data and gain a holistic view of access entitlements.

IAM is a proven approach to securing valuable intellectual property (IP) and patents and to attaining regulatory compliance, including GDPR

The fascinating digital businesses emerging today also function as patent and IP foundries. A byproduct of their operations is entirely new business, product, and process ideas. The executives I spoke with are prioritising how they secure intellectual property and patents using an identity and access management strategy.

Knowing with confidence the identity of every user is what makes every aspect of an IAM strategy work

Enabling multi-factor authentication (MFA) for every access session and threat surface is one of the main practices that make an IAM strategy succeed. It’s a best practice to reinforce Zero Trust principles by enforcing multi-factor authentication on each computer in a way that cannot be circumvented or bypassed by malware.
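Time-based one-time passwords (TOTP, RFC 6238) are one common way MFA is enforced per session. A minimal standard-library sketch, illustrative only and not a substitute for a hardened MFA product:

```python
# Minimal RFC 6238 TOTP: HMAC-SHA1 over the 30-second time-step counter,
# dynamically truncated to a short numeric code.
import base64
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Return the TOTP code for a base32 shared secret at a given Unix time."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59, 8 digits.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59, digits=8))  # 94287082
```

Because the code is derived from the current time step, a stolen password alone is useless without the device holding the shared secret.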

Designing in transaction verification now for future eCommerce digital business models is worth it

Think of your IAM initiative as a platform for creating ongoing customer trust. As all digital business initiatives rely on multi-channel selling, designing in transaction verification as part of an IAM strategy is essential. Organisations are combining transaction verification and MFA to thwart breaches and credential access abuse.

In defining any IAM strategy, focus on how privileged access management (PAM) needs to be tailored to your specific business needs

PAM is the foundational element that turns the investments made in security into business value. It’s a catalyst for ensuring customer trust turns into revenue. Many organisations equate PAM with a password vault.

But in a modern threatscape where humans, machines, applications, and services dynamically require access to a broadening range of attack surfaces such as cloud, IoT, big data, and containers, that outdated legacy approach won’t effectively secure the leading attack vector: privileged access abuse. Vendors such as Centrify and others are looking beyond the vault and offering Zero Trust solutions for PAM that address these modern access requestors and attack surfaces.

Conclusion

Insurance and financial services executives realise, and even predict, that there’s going to be an increase in the number and intensity of efforts to break into their systems using compromised credentials. Prioritising privileged access management as part of the IAM toolkit is proving to be an effective cybersecurity strategy for protecting their businesses and customers’ data while also making a valuable contribution to their growth.

The bottom line is that identity and access management is the cornerstone of any effective Zero Trust-based strategy, and taking an aggressive, pre-emptive approach to privileged access management is the new normal for organisations’ cybersecurity strategies.
