All posts by louiscolumbus

Why Google needs to make machine learning its growth fuel

  • In 2017 Google outspent Microsoft, Apple, and Facebook on R&D, with the majority of that spending going to AI and machine learning.
  • Google needs new AI- and machine learning-driven businesses that have lower Traffic Acquisition Costs (TAC) to offset the rising acquisition costs of their ad and search businesses.
  • One of the company’s initial forays into AI and machine learning was its $600M acquisition of AI startup DeepMind in January 2014.
  • Google has launched two funds dedicated solely to AI: Gradient Ventures and the Google Assistant Investment Program, both of which are accepting pitches from AI and machine learning startups today.
  • On its Q4’17 earnings call, the company announced that its cloud business is now bringing in $1B per quarter. The number of cloud deals worth $1M+ that Google has sold more than tripled between 2016 and 2017.
  • Google’s M&A strategy is concentrating on strengthening their cloud business to better compete against Amazon AWS and Microsoft Azure.

These and many other fascinating insights are from CB Insights’ report, Google Strategy Teardown (PDF, 49 pp., opt-in). The report explores how Alphabet, Google’s parent company, is relying on Artificial Intelligence (AI) and machine learning to capture new streams of revenue in enterprise cloud computing and services. Also, the report looks at how Alphabet can combine search, AI, and machine learning to revolutionise logistics, healthcare, and transportation. It’s a thorough teardown of Google’s potential acquisitions, strategic investments, and partnerships needed to maintain search dominance while driving revenue from new markets.

Key takeaways from the report include the following:

Google needs new AI- and machine learning-driven businesses that have lower traffic acquisition costs (TAC) to offset the rising acquisition costs of their ad and search businesses

CB Insights found Google is experiencing rising TAC in their core ad and search businesses. With the strategic shift to mobile, Google will see TAC escalate even further. Their greatest potential for growth is infusing greater contextual intelligence and knowledge across the entire series of companies that comprise Alphabet, shown in the graphic below.

Google has launched two funds dedicated solely to AI: Gradient Ventures and the Google Assistant Investment Program, both of which are accepting pitches from AI and machine learning startups today

Gradient Ventures is an ROI fund focused on supporting the most talented founders building AI-powered companies. Former tech founders are leading Gradient Ventures, assisting in turning ideas into companies. Gradient Venture’s portfolio is shown below:

In 2017 Google outspent Microsoft, Apple, and Facebook on R&D, with the majority of that spending going to AI and machine learning

Amazon dominated R&D spending among the top five tech companies in 2017, investing $22.6B. Facebook leads in the percentage of total sales invested in R&D, at 19.1%.

Google AI led the development of Google’s highly popular open source machine learning software library and framework TensorFlow and is home to the Google Brain team

Google’s approach to primary research in the fields of AI, machine learning, and deep learning is leading to a prolific amount of research being produced and published. Here’s the search engine for their publication database, which includes many fascinating studies for review. Part of Google Brain’s role is to work with other Alphabet subsidiaries to support and lead their AI and machine learning product initiatives. An example CB Insights mentions in the report is how Google Brain collaborated with the autonomous driving division Waymo, helping apply deep neural nets to vehicles’ pedestrian detection. The team has also been successful in increasing the number of AI and machine learning patents, as CB Insights’ analysis below shows:

Mentions of AI and machine learning are soaring on Google quarterly earnings calls, signaling senior management’s prioritising these areas as growth fuel

CB Insights has an Insights Trends tool designed to analyse unstructured text and find linguistics-based associations, models, and statistical insights. Analysing Google earnings call transcripts with it shows that mentions of AI and machine learning soared on the most recent call.
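
CB Insights’ tool itself is proprietary, but the underlying idea of tracking keyword mentions across transcripts is easy to illustrate. The minimal Python sketch below, which assumes earnings-call transcripts saved as plain-text files in a local directory, simply counts AI and machine learning mentions per transcript; the keyword list and file layout are illustrative, not CB Insights’ actual method.

```python
import re
from pathlib import Path

# Illustrative keyword patterns; CB Insights' actual linguistic models are proprietary.
PATTERNS = [r"\bartificial intelligence\b", r"\bmachine learning\b", r"\bdeep learning\b"]

def count_mentions(text: str) -> int:
    """Count AI/ML keyword occurrences in one transcript."""
    hits = sum(len(re.findall(p, text, flags=re.IGNORECASE)) for p in PATTERNS)
    return hits + len(re.findall(r"\bAI\b", text))  # count "AI" only as an uppercase acronym

# Hypothetical directory of earnings-call transcripts, one plain-text file per quarter.
for path in sorted(Path("transcripts").glob("*.txt")):
    print(f"{path.stem}: {count_mentions(path.read_text(encoding='utf-8'))} AI/ML mentions")
```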

Google’s M&A strategy is concentrating on strengthening their cloud business to better compete against Amazon AWS and Microsoft Azure

Google acquired Xively in Q1 of this year, followed by Cask Data and Velostrata in Q2. Google needs to continue acquiring cloud-based companies that can accelerate more customer wins in the enterprise and mid-tier, two areas where Amazon AWS and Microsoft Azure have strong momentum today.

Glassdoor’s 10 highest paying tech jobs of 2018: Why it remains a software-defined world

  • Software engineering manager is the highest paying position, with an average salary of $163,500 and 31,621 open positions on Glassdoor today.
  • Over 368,000 open positions are available across the 10 highest paying jobs on Glassdoor today.
  • $147,000 is the average salary of the top 10 tech jobs on Glassdoor today.
  • 12.7% of all open positions are for software engineers, making this job the most in-demand in tech today.

Glassdoor is best known for its candid, honest reviews of employers written anonymously by employees. It is now common practice, and a good idea, for anyone considering a position with a new employer to check them out on Glassdoor first. With nearly 40 million reviews of more than 770,000 companies, Glassdoor is now the second most popular job site professionals rely on in the U.S., attracting approximately 59 million job seekers a month. The Chief Human Resources Officer of one of the largest and best-known cloud-based enterprise software companies told me recently she gets 2X more applications from Glassdoor for any given position than from any other recruiting site or channel.

Earlier this month Glassdoor Economic Research published the results of research completed on how base pay compares between tech and non-tech jobs.  The research team gathered a sample of tech companies with at least 100 job postings on Glassdoor as of June 26, 2018. Glassdoor defined tech roles as those positions requiring knowledge of code, software or data. The study found the following to be the 10 highest paying tech jobs today:

Walmart eCommerce, Microsoft, Intel, Amazon, and Google have the highest concentration of tech jobs as a percentage of all positions open

Workday, Salesforce, Verizon, and IBM have the highest concentration of non-tech positions available today.

Source: Glassdoor Economic Research Blog, Landing a Non-Tech Job in Tech: Who’s Hiring Today? July 19, 2018

The global state of enterprise analytics 2018: How cloud, big data and AI are key to the future

  • 71% of enterprises globally predict their investments in data and analytics will accelerate in the next three years and beyond
  • 57% of enterprises globally have a Chief Data Officer, a leadership role that is pivotal in helping to democratise data and analytics across any organisation
  • 52% of enterprises are leveraging advanced and predictive analytics today to provide greater insights and contextual intelligence into operations
  • 41% of all enterprises are considering a move to cloud-based analytics in the next year
  • Cloud computing (24%), big data (20%), and AI/machine learning (18%) are the three technologies predicted to have the greatest impact on analytics over the next five years
  • Just 16% of enterprises have enabled at least 75% of their employees to have access to company data and analytics

These and many other fascinating insights are from MicroStrategy’s latest research study, 2018 Global State of Enterprise Analytics Report. You can download a copy here (PDF, 44 pp., opt-in). The study is based on surveys completed in April 2018 with 500 globally-based enterprise analytics and business intelligence professionals on the state of their organisations’ analytics initiatives across 20 industries. Participants represented organisations with 250 to 20,000 employees worldwide from five nations: Brazil, Germany, Japan, the United Kingdom and the United States. For additional details on the methodology, please see the study here. The study’s results underscore how enterprises need a unified data strategy that reflects their growth strategies and new business models’ information needs.

Key takeaways from the study include the following:

Driving greater process and cost efficiencies (60%), strategy and change (57%) and monitoring and improving financial performance (52%) are the top three ways enterprises globally are using data and analytics today

The study found that enterprises are also relying on data and analytics to gain greater insights into how current products and services are used (51%), manage risk (50%) and attain customer growth and retention (49%). Across the five nations surveyed, Japan leads the world in the use of data and analytics to drive process and cost efficiencies (65%). UK-based enterprises lead all nations in their use of data and analytics to analyse how current products and services are being used. The report provides graphical comparisons of the five nations’ results.

Cloud computing, big data, and AI/machine learning are the three technologies predicted to have the greatest global impact on analytics over the next five years

Japanese enterprises predict cloud computing will have the greatest impact on the future of analytics (28%) across the five nations’ enterprises interviewed. Enterprises in the U.K. rank AI/machine learning as having the greatest future impact on analytics (26%), while those in Germany rank big data highest (29%). Please see the study for country-specific prioritisation of technologies.

52% of enterprises are leveraging advanced and predictive analytics today to provide greater insights and contextual intelligence into operations

Additional leverage areas include distribution of analytics via e-mail and collaboration tools (49%), analytics embedded in other apps including Salesforce (44%) and mobile productivity apps (39%). Japanese enterprises lead the world in their adoption of advanced and predictive analytics (60%). German enterprises lead the world in the adoption of analytics for collaboration via e-mail and more real-time data and knowledge-sharing methods (50%).

59% of enterprises are using big data analytics, leading all categories of intelligence applications 

Enterprise reporting (47%), data discovery (47%), mobile productivity apps (44%) and embedded apps (42%) round out the top five intelligence applications in use globally by enterprises today. Big Data’s dominance in the survey results can be attributed to the top five industries in the sampling frame being among the most prolific in data generation and use. Manufacturing (15%) is the most data-prolific industry on the planet. Additional industries that generate massive amounts of data dominate the survey’s demographics, including software technology-based businesses (14%), banking (13%), retail (11%), and financial services/business services (6%).

27% of global enterprises prioritise security over any other factor when evaluating a new analytics vendor

The three core attributes of a scalable, comprehensive platform, ease of use, and a vendor’s products having an excellent reputation are all essential. Enterprises in four of the five nations also prioritise security as the most critical factor when evaluating potential analytics vendors. Enterprise scalability matters most in the U.S., with 26% of enterprises interviewed saying it is the most important priority in evaluating a new analytics vendor.

Data privacy and security concerns (49%) are the most formidable barrier enterprises face in gaining more effective use of their data and analytics

Enterprises from four of the five nations say data privacy and security are the most significant barrier they face in getting more value from analytics. In Japan, the greatest barrier is access limited to data across the organisation (40%).

41% of all enterprises globally are considering a move to the cloud in the next year

64% of U.S.-based enterprises are considering moving to a cloud-based analytics platform or solution in the next year. The U.S. leads enterprises from all five nations in planned cloud-based analytics adoption, as the graphic below illustrates.

10 ways to improve cloud ERP with AI and machine learning

Capitalising on new digital business models and the growth opportunities they provide is forcing companies to re-evaluate ERP’s role. Made inflexible by years of customisation, legacy ERP systems aren’t delivering what digital business models need today to scale and grow.

Legacy ERP systems were purpose-built to excel at production consistency first at the expense of flexibility and responsiveness to customers’ changing requirements. By taking a business case-based approach to integrating Artificial Intelligence (AI) and machine learning into their platforms, Cloud ERP providers can fill the gap legacy ERP systems can’t.

Closing legacy ERP gaps with greater intelligence and insight

Companies need to be able to respond quickly to unexpected, unfamiliar and unforeseen dilemmas with smart decisions fast for new digital business models to succeed. That’s not possible today with legacy ERP systems. Legacy IT technology stacks and the ERP systems they are built on aren’t designed to deliver the data needed most.

That’s all changing fast. A clear, compelling business model and successful execution of its related strategies are what all successful Cloud ERP implementations share. Cloud ERP platforms and apps provide organisations the flexibility they need to prioritise growth plans over IT constraints. And many have taken an Application Programming Interface (API) approach to integrate with legacy ERP systems to gain the incremental data these systems provide. In today’s era of Cloud ERP, rip-and-replace isn’t as commonplace as reorganising entire IT architectures for greater speed, scale, and customer transparency using cloud-first platforms.

New business models thrive when an ERP system is constantly learning. That’s one of the greatest gaps between Cloud ERP platforms’ potential and where their legacy counterparts are today. Cloud platforms provide greater integration options and more flexibility to customise applications and improve usability, addressing one of the biggest drawbacks of legacy ERP systems. Designed to deliver results by providing AI- and machine learning-based insights, Cloud ERP platforms and apps can rejuvenate ERP systems and their contributions to business growth.

The following are the 10 ways to improve cloud ERP with AI and machine learning, bridging the information gap with legacy ERP systems:

Cloud ERP platforms need to create and strengthen a self-learning knowledge system that orchestrates AI and machine learning from the shop floor to the top floor and across supplier networks

Having a cloud-based infrastructure that integrates core ERP web services, apps, and real-time monitoring to deliver a steady stream of data to AI and machine learning algorithms accelerates how quickly the entire system learns. The cloud ERP platform integration roadmap needs to include APIs and web services to connect with the many suppliers and buyer systems outside the walls of a manufacturer while integrating with legacy ERP systems to aggregate and analyse the decades of data they have generated.

Virtual agents have the potential to redefine many areas of manufacturing operations, from pick-by-voice systems to advanced diagnostics

Apple’s Siri, Amazon’s Alexa, Google Voice, and Microsoft Cortana have the potential to be modified to streamline operations tasks and processes, bringing contextual guidance and direction to complex tasks. An example of one task virtual agents are being used for today is guiding production workers to select from the correct product bin as required by the Bill of Materials. Machinery manufacturers are piloting voice agents that can provide detailed work instructions that streamline configure-to-order and engineer-to-order production. Amazon has successfully partnered with automotive manufacturers and has the most design wins as of today. They could easily replicate this success with machinery manufacturers.

Design in the Internet of Things (IoT) support at the data structure level to realise quick wins as data collection pilots go live and scale

Cloud ERP platforms have the potential to capitalise on the massive data stream IoT devices are generating today by designing in support at the data structure level first. Providing IoT-based data to AI and machine learning apps continually will bridge the intelligence gap many companies face today as they pursue new business models. Capgemini has provided an analysis of IoT use cases shown below, highlighting how production asset maintenance and asset tracking are quick wins waiting to happen. Cloud ERP platforms can accelerate them by designing in IoT support.
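
To make “designing in IoT support at the data structure level” concrete, here is a minimal Python sketch of a typed sensor-reading record that a cloud ERP platform could ingest continuously; the field names are assumptions for illustration, not any vendor’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorReading:
    """One machine-level IoT reading; field names are illustrative only."""
    machine_id: str        # production asset emitting the reading
    sensor: str            # e.g. "vibration", "temperature"
    value: float           # measured value in the sensor's native unit
    unit: str              # unit of measure, e.g. "mm/s", "degC"
    recorded_at: datetime  # timestamp captured at the edge device

# A single illustrative reading as it might arrive from the shop floor.
reading = SensorReading("CNC-017", "vibration", 4.2, "mm/s", datetime.utcnow())
print(reading)
```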

Reducing equipment breakdowns and increasing asset utilisation by analysing machine-level data to determine when a given part needs to be replaced

It’s possible to capture a steady stream of data on each machine’s health using sensors equipped with an IP address. Cloud ERP providers have a great opportunity to capture machine-level data and use machine learning techniques to find patterns in production performance across a production floor’s entire data set. This is especially important in process industries, where machinery breakdowns lead to lost sales. Oil refineries are using machine learning models comprising more than 1,000 variables related to material inputs, outputs and process parameters, including weather conditions, to estimate equipment failures.
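
As a hedged illustration of the predictive-maintenance idea, the sketch below assumes historical machine-level readings have been labelled with whether the part failed shortly afterward, and trains a simple classifier to estimate failure probability for new readings. The feature set, values, threshold, and model choice are illustrative, not a refinery’s production system.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative training data: [vibration mm/s, temperature degC, hours since last service]
X_train = np.array([[2.1, 61.0, 120], [6.8, 92.5, 900], [3.0, 70.2, 300],
                    [7.5, 95.1, 1100], [2.6, 64.3, 200], [6.1, 88.7, 850]])
y_train = np.array([0, 1, 0, 1, 0, 1])  # 1 = the part failed soon after these readings

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Score the latest readings from a machine and flag it for maintenance if risky.
latest = np.array([[5.9, 90.0, 780]])
failure_prob = model.predict_proba(latest)[0, 1]
print(f"Estimated failure probability: {failure_prob:.2f}")
if failure_prob > 0.5:
    print("Schedule part replacement before the next production run.")
```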

Designing machine learning algorithms into track-and-traceability to predict which lots from which suppliers are most likely to be of the highest or lowest quality

Machine learning algorithms excel at finding patterns in diverse data sets by continually applying constraint-based algorithms. Suppliers vary widely in their quality and delivery schedule performance levels. Using machine learning, it’s possible to create a track-and-trace application that could indicate which lot from which supplier is the riskiest and those that are of exceptional quality as well.
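
One plausible way such a track-and-trace model could rank incoming lots is sketched below: a logistic regression trained on past incoming-inspection outcomes scores newly received lots by predicted defect risk. The supplier attributes, lot identifiers, and values are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative lot history: [supplier on-time rate, past defect rate %, days in transit]
X = np.array([[0.98, 0.5, 3], [0.80, 4.2, 9], [0.95, 1.0, 4],
              [0.70, 6.5, 12], [0.92, 1.8, 5], [0.75, 5.0, 10]])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = lot failed incoming quality inspection

model = LogisticRegression().fit(X, y)

# Score two newly received lots and rank them by predicted inspection risk.
new_lots = {"LOT-A1042": [0.97, 0.8, 4], "LOT-B2231": [0.78, 5.5, 11]}
risk = {lot: model.predict_proba([feats])[0, 1] for lot, feats in new_lots.items()}
for lot, p in sorted(risk.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{lot}: predicted inspection-failure risk {p:.2f}")
```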

AI and machine learning can provide insights into how Overall Equipment Effectiveness (OEE) can be improved that aren’t apparent today

Manufacturers will welcome the opportunity to have greater insights into how they can stabilise then normalise OEE performance across their shop floors. When a cloud ERP platform serves as an always-learning knowledge system, real-time monitoring data from machinery and production assets provide much-needed insights into areas for improvement and what’s going well on the shop floor.

Cloud ERP providers need to pay attention to how they can help close the configuration gap that exists between PLM, CAD, ERP and CRM systems by using AI and machine learning

The most successful product configuration strategies rely on a single, lifecycle-based view of product configurations. They’re able to alleviate the conflicts between how engineering designs a product with CAD and PLM, how sales & marketing sell it with CRM, and how manufacturing builds it with an ERP system. AI and machine learning can enable configuration lifecycle management and avert lost time and sales, streamlining CPQ and product configuration strategies in the process.

Improving demand forecasting accuracy and enabling better collaboration with suppliers based on insights from machine learning-based predictive models is attainable with higher quality data

By creating a self-learning knowledge system, cloud ERP providers can vastly improve data latency rates that lead to higher forecast accuracy. Factoring in sales, marketing, and promotional programs further fine-tunes forecast accuracy.
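
A minimal sketch of the forecasting idea, assuming monthly demand history plus a promotion flag as the extra signal; a simple linear model stands in here for the richer machine learning models described above, and all numbers are illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative 12 months of unit demand, with a flag for promotion-heavy months.
months = np.arange(1, 13).reshape(-1, 1)
promo = np.array([0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1]).reshape(-1, 1)
demand = np.array([980, 1010, 1250, 1060, 1090, 1340, 1120, 1150, 1410, 1180, 1210, 1480])

X = np.hstack([months, promo])
model = LinearRegression().fit(X, demand)

# Forecast month 13 with and without a planned promotion.
for flag in (0, 1):
    forecast = model.predict([[13, flag]])[0]
    print(f"Month 13 forecast ({'promo' if flag else 'no promo'}): {forecast:.0f} units")
```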

Implementing self-learning algorithms that use production incident reports to predict production problems on assembly lines needs to happen in cloud ERP platforms

A local aircraft manufacturer is doing this today by using predictive modeling and machine learning to compare past incident reports. With legacy ERP systems these problems would have gone undetected and turned into production slowdowns or worse, the line having to stop.
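
The article doesn’t describe the manufacturer’s implementation, but one plausible building block is comparing a new incident report against historical ones with text similarity, as in the illustrative sketch below; the report texts are invented for the example.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative past incident reports from assembly-line quality logs.
past_reports = [
    "Torque wrench calibration drift caused fastener failures on wing assembly",
    "Sealant cure time too short, leading to rework on fuselage panels",
    "Fastener supplier lot rejected after incoming inspection found cracks",
]
new_report = "Fastener torque out of spec on wing station, several units reworked"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(past_reports + [new_report])

# Compare the new incident against history and surface the closest precedents.
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
for report, score in sorted(zip(past_reports, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {report}")
```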

Improving product quality by having machine learning algorithms aggregate, analyse and continually learn from supplier inspection, quality control, Return Material Authorisation (RMA) and product failure data

Cloud ERP platforms are in a unique position of being able to scale across the entire lifecycle of a product and capture quality data from the supplier to the customer. With legacy ERP systems, manufacturers most often rely on an analysis of scrap materials by type or cause, followed by RMAs. It’s time to get to the truth about why products fail, and machine learning can deliver the insights to get there.

IBM’s 2018 data breach study shows why we’re in a Zero Trust world now

  • Digital businesses that lost less than 1% of their customers due to a data breach incurred a cost of $2.8M, and if 4% or more were lost the cost soared to $6M.
  • U.S. based breaches are the most expensive globally, costing on average $7.91M with the highest global notification cost as well, $740,000.
  • A typical data breach costs a company $3.86M, up 6.4% from $3.62M last year.
  • Digital businesses that have security automation can minimize the costs of breaches by $1.55M compared with those that do not ($2.88M versus $4.43M).
  • 48% of all breaches are initiated by malicious or criminal attacks.
  • Mean-time-to-identify (MTTI) a breach is 197 days, and the mean-time-to-contain (MTTC) is 69 days.

These and many other insights into the escalating costs of security breaches are from the 2018 Cost of a Data Breach Study sponsored by IBM Security with research independently conducted by Ponemon Institute LLC. The report is downloadable here (PDF, 47 pp. no opt-in).

The study is based on interviews with more than 2,200 compliance, data protection and IT professionals from 477 companies located in 15 countries and regions globally who have experienced a data breach in the last 12 months. This is the first year the use of Internet of Things (IoT) technologies and security automation are included in the study. The study also defines mega breaches as those involving over 1 million records and costing $40M or more. Please see pages 5, 6 and 7 of the study for specifics on the methodology.

The report is a quick read and the data provided is fascinating. One can’t help but reflect on how legacy security technologies, designed to protect digital businesses decades ago, aren’t keeping up with the scale, speed and sophistication of today’s breach attempts. The most common threat surface attacked is compromised privileged credential access. 81% of all breaches exploit identity, according to an excellent study from Centrify and Dow Jones Customer Intelligence, CEO Disconnect is Weakening Cybersecurity (31 pp, PDF, opt-in).

The bottom line from the IBM, Centrify and many other studies is that we’re in a Zero Trust Security (ZTS) world now and the sooner a digital business can excel at it, the more protected they will be from security threats. ZTS begins with Next-Gen Access (NGA) by recognizing that every employee’s identity is the new security perimeter for any digital business.

Key takeaways from the study include the following:

US-based breaches are the most expensive globally, costing on average $7.91m, more than double the global average of $3.86m

Nations in the Middle East have the second-most expensive breaches globally, averaging $5.31M, followed by Canada, where the average breach costs a digital business $4.74M. Globally a breach costs a digital business $3.86M this year, up from $3.62M last year. With the costs of breaches escalating so quickly and the cost of a breach in the U.S. leading all nations and outdistancing the global average 2X, it’s time for more digital businesses to consider a Zero Trust Security strategy. See Forrester Principal Analyst Chase Cunningham’s recent blog post What ZTX Means For Vendors And Users, from the Forrester Research blog for where to get started.

The number of breached records is soaring in the US, the third leading nation in breached records, with 6,850 records above the global average

The Ponemon Institute found that the average size of a data breach increased 2.2% this year, with the U.S. leading all nations in breached records. It now takes an average of 266 days to identify and contain a breach (Mean-time-to-identify (MTTI) a breach is 197 days and the mean-time-to-contain (MTTC) is 69 days), so more digital businesses in the Middle East, India, and the U.S. should consider reorienting their security strategies to a Zero Trust Security Model.

French and US digital businesses pay a heavy price in customer churn when a breach happens, among the highest in the world 

The following graphic compares abnormally high customer churn rates, the size of the data breach, average total cost, and per capita costs by country.

US companies lead the world in lost business caused by a security breach with $4.2m lost per incident, over $2m more than digital businesses from the Middle East

Ponemon found that U.S. digitally-based businesses pay an exceptionally high cost for customer churn caused by a data breach. Factors contributing to the high cost of lost business include abnormally high customer turnover, the high costs of acquiring new customers in the U.S., and loss of brand reputation and goodwill. U.S. customers also have a myriad of competitive options, and their loyalty is more difficult to preserve. The study finds that, thanks to current notification laws, customers have a greater awareness of data breaches and higher expectations regarding how the companies they are loyal to will protect customer records and data.

Conclusion

The IBM study foreshadows an increasing level of speed, scale, and sophistication when it comes to how breaches are orchestrated. With the average breach globally costing $3.86M and breach costs and lost customer revenue soaring in the U.S., it’s clear we’re living in a world where Zero Trust should be the new mandate.

Zero Trust Security starts with Next-Gen Access to secure every endpoint and attack surface a digital business relies on for daily operations, and limit access and privilege to protect the “keys to the kingdom,” which gives hackers the most leverage. Security software providers including Centrify are applying advanced analytics and machine learning to thwart breaches and many other forms of attacks that seek to exploit weak credentials and too much privilege. Zero Trust is a proven way to stay at parity or ahead of escalating threats.

Why enterprises feel more susceptible to threats than ever before

  • Identities, not systems, are the new security perimeter for any digital business, with 81% of breaches involving weak, default or stolen passwords.
  • 53% of enterprises feel they are more susceptible to threats since 2015.
  • 51% of enterprises suffered at least one breach in the past 12 months and malicious insider incidents increased 11% year-over-year.

These and many other fascinating insights are from SecurIT: the Zero Trust Summit for CIOs and CISOs held last month in San Francisco, CA. CIO and CSO produced the event that included informative discussions and panels on how enterprises are adopting Next-Gen Access (NGA) and enabling Zero Trust Security (ZTS). What made the event noteworthy were the insights gained from presentations and panels where senior IT executives from Akamai, Centrify, Cisco, Cylance, EdgeWise, Fortinet, Intel, Live Nation Entertainment and YapStone shared their key insights and lessons learned from implementing Zero Trust Security.

Zero Trust is a recognized framework developed by Forrester Research in collaboration with the National Institute of Standards and Technology (NIST) and also promoted by Google as BeyondCorp. Zero Trust Security is predicated on the concept that an organization doesn’t trust anything inside or outside its boundaries and instead verifies anything and everything before granting access. The approach works because today’s leading attack vector is weak or compromised credentials according to Verizon’s 2018 Data Breach Investigations Report.

Key takeaways from the Zero Trust Summit include the following:

Identities, not systems, are the new security perimeter for any digital business, with 81% of breaches involving weak, default or stolen passwords

Tom Kemp, Co-Founder and CEO, Centrify, provided key insights into the current state of enterprise IT security and how existing methods aren’t scaling completely enough to protect every application, endpoint, and infrastructure of any digital business. He illustrated how $86B was spent on cybersecurity, yet a stunning 66% of companies were still breached. Companies targeted for breaches had already experienced an average of five or more separate breaches. The following graphic underscores how identities are the new enterprise perimeter, making NGA and ZTS a must-have for any digital business.

53% of enterprises feel they are more susceptible to threats since 2015

Chase Cunningham’s presentation, Zero Trust and Why Does It Matter, provided insights into the threat landscape and a thorough definition of ZTX, which is the application of a Zero Trust framework to an enterprise. Dr. Cunningham is a Principal Analyst at Forrester Research serving security and risk professionals. Forrester found the percentage of enterprises who feel they are more susceptible to threats nearly doubled in two years, jumping from 28% in 2015 to 53% in 2017. Dr. Cunningham provided examples of how breaches have immediate financial implications on the market value of any business with specific focus on the Equifax breach.

Presented by Dr. Cunningham during SecurIT: the Zero Trust Summit for CIOs and CISOs

51% of enterprises suffered at least one breach in the past 12 months and malicious insider incidents increased 11% year-over-year

43% of confirmed breaches in the last 12 months came from an external attack, 24% from internal attacks, 17% from third-party incidents and 16% from lost or stolen assets. Consistent with Verizon’s 2018 Data Breach Investigations Report, compromised privileged credential access is a leading cause of breaches today.

Presented by Dr. Cunningham during SecurIT: the Zero Trust Summit for CIOs and CISOs

One of Zero Trust Security’s innate strengths is the ability to flex and protect the perimeter of any growing digital business at the individual level, encompassing workforce, customers, and distributors

Akamai, Cisco, EdgeWise, Fortinet, Intel, Live Nation Entertainment and YapStone each provided examples of how their organizations are relying on NGA to enable ZTS enterprise-wide. Every speaker provided examples of how ZTS delivers several key benefits including the following: First, ZTS reduces the time to breach detection and improves visibility throughout a network. Second, organizations provided examples of how ZTS is reducing capital and operational expenses for security, in addition to reducing the scope and cost of compliance initiatives. All companies presenting at the conference provided examples of how ZTS is enabling greater data awareness and insight, eliminating inter-silo finger-pointing over security responsibilities and for several, enabling digital business transformation. Every organization is also seeing ZTS thwart the exfiltration and destruction of their data.

Conclusion

The SecurIT: the Zero Trust Summit for CIOs and CISOs event encapsulated the latest advances in how NGA is enabling ZTS by having enterprises who are adopting the framework share their insights and lessons learned. It’s fascinating to see how Akamai, Cisco, Intel, Live Nation Entertainment, YapStone, and others are tailoring ZTS to their specific customer-driven goals. Each also shared their plans for growth and how security in general and NGA and ZTS specifically are protecting customer and company data to ensure growth continues, uninterrupted.

Here’s where business intelligence is truly delivering value in 2018

  • Executive management, operations, and sales are the three primary roles driving Business Intelligence (BI) adoption in 2018.
  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018.
  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018.
  • Organizations successful with analytics and BI apps define success in business results, while unsuccessful organizations concentrate on adoption rate first.
  • 50% of vendors offer perpetual on-premises licensing in 2018, a notable decline over 2017. The number of vendors offering subscription licensing continues to grow for both on-premises and public cloud models.
  • Fewer than 15% of respondent organizations have a Chief Data Officer, and only about 10% have a Chief Analytics Officer today.

These and many other fascinating insights are from Dresner Advisory Services’ 2018 Wisdom of Crowds® Business Intelligence Market Study. In its ninth annual edition, the study provides a broad assessment of the business intelligence (BI) market and a comprehensive look at key user trends, attitudes, and intentions. The latest edition of the study adds Information Technology (IT) analytics, sales planning, and GDPR, bringing the total to 36 topics under study.

“The Wisdom of Crowds BI Market Study is the cornerstone of our annual research agenda, providing the most in-depth and data-rich portrait of the state of the BI market,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “Drawn from the first-person perspective of users throughout all industries, geographies, and organization sizes, who are involved in varying aspects of BI projects, our report provides a unique look at the drivers of and success with BI.” Survey respondents include IT (28%), followed by Executive Management (22%), Finance (19%), Sales/Marketing (8%), and the Business Intelligence Competency Center (BICC) (7%). Please see page 15 of the study for specifics on the methodology.

Key takeaways from the study include the following:

Executive management, operations, and sales are the three primary roles driving business intelligence (BI) adoption in 2018

Executive management teams are taking more of an active ownership role in BI initiatives in 2018, as this group replaced Operations as the leading department driving BI adoption this year. The study found that the functional areas with the greatest percentage change in driving BI adoption include Human Resources (7.3%), Marketing (5.9%), BICC (5.1%) and Sales (5%).

Making better decisions, improving operational efficiencies, growing revenues and increased competitive advantage are the top four BI objectives organizations have today

Additional goals include enhancing customer service and attaining greater degrees of compliance and risk management. The graph below rank orders the importance of BI objectives in 2018 compared to the percent change in BI objectives between 2017 and 2018. Enhanced customer service is the fastest growing objective enterprises adopt BI to accomplish, followed by growth in revenue (5.4%).

Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018

The study found that second-tier initiatives including data discovery, data mining/advanced algorithms, data storytelling, integration with operational processes, and enterprise and sales planning are also critical or very important to enterprises participating in the survey. Technology areas being hyped heavily today, including the Internet of Things, cognitive BI, and in-memory analysis, rank relatively low as of today, yet are growing; edge computing, for example, increased 32% as a priority between 2017 and 2018. The results indicate that the core goals of using BI to drive better business decisions and more revenue still dominate the priorities of most businesses today.

Sales & marketing, business intelligence competency center (BICC) and executive management have the highest level of interest in dashboards and advanced visualization

Finance has the greatest interest in enterprise planning and budgeting. Operations (including manufacturing, supply chain management, and services) leads interest in data mining, data storytelling, integration with operational processes, mobile device support, data catalog and several other technologies and initiatives. It’s understandable that BICC leaders most advocate end-user self-service and attach high importance to many other categories, as they are internal service bureaus to all departments in an enterprise. It’s been my experience that BICCs are always looking for ways to scale BI adoption and enable every department to gain greater value from analytics and BI apps. BICCs in the best-run companies are knowledge hubs that encourage and educate all departments on how to excel with analytics and BI.

Insurance companies most prioritize dashboards, reporting, end-user self-service, data warehousing, data discovery and data mining

Business Services lead the adoption of advanced visualization, data storytelling, and embedded BI. Manufacturing most prioritizes sales planning and enterprise planning but trails in other high-ranking priorities. Technology prioritizes Software-as-a-Service (SaaS) given its scale and speed advantages. The retail & wholesale industry is going through an analytics and customer experience revolution today. Retailers and wholesalers lead all others in data catalog adoption and mobile device support.

Insurance, technology and business services vertical industries have the highest rate of BI adoption today

The Insurance industry leads all others in BI adoption, followed by the Technology industry with 40% of organizations having 41% or greater adoption or penetration. Industries whose BI adoption is above average include Business Services and Retail & Wholesale. The following graphic illustrates penetration or adoption of Business Intelligence solutions today by industry.

Dashboards, reporting, advanced visualization, and data warehousing are the highest priority investment areas for companies whose budgets increased from 2017 to 2018

Additional high priority areas of investment include advanced visualization and data warehousing. The study found that less well-funded organizations are most likely to lead all others by investing in open source software to reduce costs.

Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018

Factors contributing to the high adoption rate for BI in small businesses include business models that need advanced analytics to function and scale, new hires with the latest analytics and BI skills brought on to scale high-growth businesses, and fewer barriers to adoption compared to larger enterprises. BI adoption tends to be more pervasive in small businesses as a greater percentage of employees are using analytics and BI apps daily.

Executive management is most familiar with the type and number of BI tools in use across the organization

The majority of executive management respondents say their teams are using one or two BI tools today. Business Intelligence Competency Centers (BICC) consistently report a higher number of BI tools in use than other functional areas, given their heavy involvement in all phases of analytics and BI project execution. IT, Sales & Marketing and Finance are likely to have more BI tools in use than Operations.

Enterprises rate BI application usability and product quality & reliability at an all-time high in 2018

Other areas of major improvement on the part of vendors include ease of implementation, online training, forums and documentation, and completeness of functionality. Dresner’s research team found that between 2017 and 2018, integration of components within the product dropped, as did scalability. The study concludes the drop in integration expertise is due to an increasing number of software company acquisitions aggregating dissimilar products from different platforms.

10 charts that will change your perspective of big data’s growth

  • Worldwide big data market revenues for software and services are projected to increase from $42bn in 2018 to $103bn in 2027, attaining a Compound Annual Growth Rate (CAGR) of 10.48% according to Wikibon
  • Forrester predicts the global big data software market will be worth $31bn this year, growing 14% from the previous year. The entire global software market is forecast to be worth $628bn in revenue, with $302bn from applications
  • According to an Accenture study, 79% of enterprise executives agree that companies that do not embrace big data will lose their competitive position and could face extinction. Even more, 83%, have pursued big data projects to seize a competitive edge
  • 59% of executives say big data at their company would be improved through the use of AI according to PwC

Sales and marketing, research & development (R&D), supply chain management (SCM) including distribution, workplace management and operations are where advanced analytics including big data are making the greatest contributions to revenue growth today. McKinsey Analytics’ study Analytics Comes of Age, published in January 2018 (PDF, 100 pp., no opt-in) is a comprehensive overview of how analytics technologies and big data are enabling entirely new ecosystems, serving as a foundational technology for artificial intelligence (AI).

McKinsey finds that analytics and big data are making the most valuable contributions in the basic materials and high tech industries. The first chart in the following series of ten is from the McKinsey Analytics study, highlighting how analytics and big data are revolutionizing many of the foundational business processes of sales and marketing.

The following ten charts provide insights into big data’s growth:

Nearly 50% of respondents to a recent McKinsey Analytics survey say analytics and Big Data have fundamentally changed business practices in their sales and marketing functions

Also, more than 30% say the same about R&D across industries, with respondents in High Tech and Basic Materials & Energy reporting the greatest number of functions being transformed by analytics and big data. Source: Analytics Comes of Age, published in January 2018 (PDF, 100 pp., no opt-in).

Worldwide big data market revenues for software and services are projected to increase from $42bn in 2018 to $103bn in 2027, attaining a Compound Annual Growth Rate (CAGR) of 10.48%

As part of this forecast, Wikibon estimates the worldwide big data market is growing at an 11.4% CAGR between 2017 and 2027, from $35bn to $103bn. Source: Wikibon and reported by Statista.
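
For readers who want to verify the arithmetic behind these forecasts, the compound annual growth rate is straightforward to compute; the short sketch below reproduces the two Wikibon figures cited above.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Wikibon forecasts cited above: $42bn (2018) to $103bn (2027), and $35bn (2017) to $103bn (2027).
print(f"2018-2027: {cagr(42, 103, 9):.2%}")   # ~10.5% CAGR
print(f"2017-2027: {cagr(35, 103, 10):.2%}")  # ~11.4% CAGR
```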

According to NewVantage Venture Partners, big data is delivering the most value to enterprises by decreasing expenses (49.2%) and creating new avenues for innovation and disruption (44.3%)

Discovering new opportunities to reduce costs by combining advanced analytics and big data delivers the most measurable results, further leading to this category being the most prevalent in the study. 69.4% have started using big data to create a data-driven culture, with 27.9% reporting results. Source: NewVantage Venture Partners, Big Data Executive Survey 2017 (PDF, 16 pp.)

The Hadoop and big data markets are projected to grow from $17.1bn in 2017 to $99.31bn in 2022 attaining a 28.5% CAGR

The greatest period of projected growth is in 2021 and 2022 when the market is projected to jump $30bn in value in one year. Source: StrategyMRC and reported by Statista.

Big data applications and analytics is projected to grow from $5.3bn in 2018 to $19.4bn in 2026, attaining a CAGR of 15.49%

The worldwide big data market for professional services is projected to grow from $16.5bn in 2018 to $21.3bn in 2026. Source: Wikibon and reported by Statista.

Comparing the worldwide demand for advanced analytics and big data-related hardware, services and software, the latter category’s dominance becomes clear

The software segment is projected to increase the fastest of all categories, growing from $14bn in 2018 to $46bn in 2027, a CAGR of 12.6%. Sources: Wikibon, SiliconANGLE, and Statista estimates, as reported by Statista.

Advanced analytics and big data revenue in China are projected to be worth ¥57.8bn ($9bn) by 2020

The Chinese market is predicted to be one of the fastest growing globally, growing at a CAGR of 31.72% in the forecast period. Sources: Social Sciences Academic Press (China) and Statista.

Non-relational analytic data stores are projected to be the fastest growing technology category in big data, growing at a CAGR of 38.6% between 2015 and 2020

Cognitive software platforms (23.3% CAGR) and Content Analytics (17.3%) round out the top three fastest growing technologies between 2015 and 2020. Source: Statista.

A decentralized general-merchandise retailer that used big data to create performance group clusters saw sales grow 3% to 4%

Big data is the catalyst of a retailing industry makeover, bringing greater precision to localization than has been possible before. Big data is being used today to increase the ROI of endcap promotions, optimize planograms, help to improve upsell and cross-sell sales performance and optimize prices on items that drive the greatest amount of foot traffic. Source: Use Big Data to Give Local Shoppers What They Want, Boston Consulting Group, February 8, 2018.

84% of enterprises have launched advanced analytics and big data initiatives to bring greater accuracy and accelerate their decision-making

Big data initiatives focused on this area also have the greatest success rate (69%) according to the most recent NewVantage Venture Partners Survey. Over a third of enterprises, 36%, say this area is their top priority for advanced analytics and Big Data investment. Sources: NewVantage Venture Partners Survey and Statista.

Additional big data information sources

4 Pain Points of Big Data and how to solve them, Digital McKinsey via Medium, November 10, 2017

53% Of Companies Are Adopting Big Data Analytics, Forbes, December 24, 2017

6 Predictions For The $203 Billion Big Data Analytics Market, Forbes, Gil Press, January 20, 2017

Analytics Comes of Age, McKinsey Analytics, January 2018 (PDF, 100 pp.)

Big Data & Analytics Is The Most Wanted Expertise By 75% Of IoT Providers, Forbes, August 21, 2017

Big Data 2017 – Market Statistics, Use Cases, and Trends, Calsoft (36 pp., PDF)

Big Data and Business Analytics Revenues Forecast to Reach $150.8 Billion This Year, Led by Banking and Manufacturing Investments, According to IDC, March 14, 2017

Big Data Executive Survey 2018, Data and Innovation – How Big Data and AI are Driving Business Innovation, NewVantage Venture Partners, January 2018 (PDF, 18 pp.)

Big Data Tech Hadoop and Spark Get Slow Start in Enterprise, Information Week, March 20, 2018

Big Success With Big Data, Accenture  (PDF, 12 pp.)

Gartner Survey Shows Organizations Are Slow to Advance in Data and Analytics, Gartner, February 5, 2018

How Big Data and AI Are Driving Business Innovation in 2018, MIT Sloan Management Review, February 5, 2018

IDC forecasts big growth for Big Data, Analytics Magazine, April 2018

IDC Worldwide Big Data Technology and Services 2012 – 2015 Forecast, Courtesy of EC Europa (PDF, 34 pp.)

Midyear Global Tech Market Outlook For 2017 To 2018, Forrester, September 25, 2017 (client access reqd.)

Oracle Industry Analyst Reports – Data-rich website of industry analyst reports

Ten Ways Big Data Is Revolutionizing Marketing And Sales, Forbes, May 9, 2016

The Big Data Payoff: Turning Big Data into Business Value, CAP Gemini & Informatica Study, (PDF, 12 pp.)

The Forrester Wave™: Enterprise BI Platforms With Majority Cloud Deployments, Q3 2017 courtesy of Oracle

Three ways machine learning is revolutionising zero trust security

Bottom line: Zero Trust Security (ZTS) starts with Next-Gen Access (NGA). Capitalizing on machine learning technology to enable NGA is essential in achieving user adoption, scalability, and agility in securing applications, devices, endpoints, and infrastructure.

How next-gen access and machine learning enable zero trust security

Zero Trust Security provides digital businesses with the security strategy they need to keep growing by scaling across each new perimeter and endpoint created as a result of growth. ZTS in the context of Next-Gen Access is built on four main pillars: (1) verify the user, (2) validate their device, (3) limit access and privilege, and (4) learn and adapt. The fourth pillar relies heavily on machine learning to discover risky user behavior and apply conditional access, without impacting user experience, by looking for contextual and behavioral patterns in access data.

As ZTS assumes that untrusted users or actors already exist both inside and outside the network, machine learning provides NGA with the capability to assess data about users, their devices, and their behavior to allow access, block access, or enforce additional authentication. With machine learning, policies and user profiles can be adjusted automatically and in real time. Beyond delivering dashboards and alerts, NGA enabled by machine learning responds to security threats in real time based on risk scores, which is very effective in thwarting breaches before they start.
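
As a hedged illustration of how a risk score might gate access decisions, the sketch below combines a few contextual signals into a score and maps it to allow, step-up authentication, or block. The signals, weights, and thresholds are invented for illustration; this is not Centrify’s or any vendor’s actual model.

```python
def risk_score(signals: dict) -> float:
    """Combine contextual access signals into a 0-1 risk score (illustrative weights only)."""
    weights = {
        "new_location": 0.30,             # access attempt from a location not seen before
        "unmanaged_device": 0.25,         # endpoint not enrolled or failing posture checks
        "off_hours": 0.15,                # request outside the user's typical working hours
        "recent_privilege_change": 0.30,  # unusual recent elevation of privileges
    }
    return sum(weight for name, weight in weights.items() if signals.get(name))

def access_decision(score: float) -> str:
    """Map the risk score to an access outcome."""
    if score < 0.3:
        return "allow"
    if score < 0.6:
        return "step-up authentication (MFA)"
    return "block and alert"

request = {"new_location": True, "unmanaged_device": False,
           "off_hours": True, "recent_privilege_change": False}
score = risk_score(request)
print(f"risk score {score:.2f} -> {access_decision(score)}")
```

In practice the score would be learned from behavioral data rather than hand-weighted, which is precisely where the machine learning described in this article comes in.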

Building NGA apps based on machine learning technology yields the benefits of being non-intrusive, supporting the productivity of workforce and business partners, and ultimately allowing digital businesses to grow without interruption. For example, Centrify’s rapid advances in machine learning and Next-Gen Access to enable ZTS strategies makes this company one of the most interesting to watch in enterprise security.

The following are three ways machine learning is revolutionizing Zero Trust Security:

  • Machine learning enables enterprises to adopt a risk-based security strategy that can flex with their business as it grows. Many digital businesses have realized that “risk is security’s new compliance,” and therefore are implementing a risk-driven rather than a compliance-driven approach. Relying on machine learning technology to assess user, device, and behavioral data for each access request derives a real-time risk score. This risk score can then be used to determine whether to allow access, block access, or step up authentication. In evaluating each access request, machine learning engines process multiple factors, including the location of the access attempt, browser type, operating system, endpoint device status, user attributes, time of day, and unusual recent privilege change. Machine learning algorithms are also scaling to take into account unusual command runs, unusual resource access histories, and any unusual accounts used, unusual privileges requested and used, and more. This approach helps thwart compromised credential attacks, which make up 81% of all hacking-related data breaches, according to Verizon.
  • Machine learning makes it possible to accomplish security policy alignment at scale. To keep pace with a growing digital business’ need to flex and scale to support new business models, machine learning also assists in automatically adjusting user profiles and access policies based on behavioral patterns. By doing so, the need for IT staffers to review and adjust policies vanishes, freeing them up to focus on things that will grow the business faster and more profitably. On the other hand, end users are not burdened with step-up authentication once a prior abnormal behavior is identified as now-typical behavior and both the user profile and access policies are updated accordingly.
  • Machine learning brings greater contextual intelligence into authentication, streamlining the experience and increasing user adoption. Ultimately, the best security is transparent and non-intrusive. That’s where the use of risk-based authentication and machine learning technology comes into play. The main impediment to adoption for multi-factor authentication has been the perceived impact on the productivity and agility of end users. A recent study by Dow Jones Customer Intelligence and Centrify revealed that 62% of CEOs state that multi-factor authentication (MFA) is difficult to manage and is not user-friendly, while only 41% of technical officers (CIOs, CTOs, and CISOs) agree with this assessment. For example, having to manually type in a code that has been transmitted via SMS in addition to the already supplied username and password is often seen as cumbersome. Technology advancements are removing some of these objections by offering a more user-friendly experience, like eliminating the need to manually enter a one-time password on the endpoint, by enabling the user to simply click a button on their smartphone. Nonetheless, some users still express frustration with this additional step, even if it is relatively quick and simple. To overcome these remaining barriers to adoption, machine learning technology contributes to minimizing the exposure to step up authentication over time, as the engine learns and adapts to the behavioral patterns.

Conclusion

Zero Trust Security through the power of Next-Gen Access is allowing digital businesses to continue on their path of growth while safeguarding their patented ideas and intellectual property. Relying on machine learning technology for Next-Gen Access results in real-time security, making it possible to identify high-risk events and ultimately greatly reducing the effort required to identify threats across today’s hybrid IT environment.

The best big data companies and CEOs to work for in 2018

Forbes readers’ most common requests centre on who the best companies are to work for in analytics, big data, data management, data science and machine learning. The latest Computer Reseller News‘ 2018 Big Data 100 list of companies is used to complete the analysis as it is an impartial, independent list aggregated based on CRN’s analysis and perspectives of the market. Using the CRN list as a foundation, the following analysis captures the best companies in their respective areas today.

Using the 2018 Big Data 100 CRN list as a baseline, the following analysis compares each company’s Glassdoor scores for the percentage of employees who would recommend the company to a friend and the percentage of employees who approve of the CEO. 25 companies on the list have very few (fewer than 15) or no Glassdoor reviews, so they are excluded from the rankings. Based on analysis of Glassdoor score patterns over the last four years, the fewer reviews a company has, the more likely it is to show 100% scores for referrals and CEO approval. These companies, however, are included in the full data set available here. If the image below is not visible in your browser, you can view the rankings here.
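
The filtering and ranking described above is simple to reproduce; the sketch below uses pandas with hypothetical company names and column labels to show the approach of excluding thinly-reviewed companies and sorting by CEO approval.

```python
import pandas as pd

# Illustrative subset of the data described above; names and values are hypothetical.
df = pd.DataFrame({
    "company": ["Company A", "Company B", "Company C"],
    "recommend_to_friend_pct": [98, 95, 100],
    "ceo_approval_pct": [100, 96, 100],
    "review_count": [120, 4300, 8],
})

# Exclude companies with fewer than 15 Glassdoor reviews, then rank by CEO approval.
ranked = (df[df["review_count"] >= 15]
          .sort_values(["ceo_approval_pct", "recommend_to_friend_pct"], ascending=False))
print(ranked.to_string(index=False))
```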

The highest rated CEOs on Glassdoor, as of May 11, 2018, include the following:

  • Dataiku – Florian Douetteau – 100%
  • StreamSets – Girish Pancha – 100%
  • MemSQL – Nikita Shamgunov – 100%
  • 1010 Data – Greg Munves – 99%
  • Salesforce – Marc Benioff – 98%
  • Attivio – Stephen Baker – 98%
  • SAP – Bill McDermott – 97%
  • Qubole – Ashish Thusoo – 97%
  • Trifacta – Adam Wilson – 97%
  • Zaloni – Ben Sharma – 97%
  • Reltio – Manish Sood – 96%
  • Microsoft – Satya Nadella – 96%
  • Cloudera – Thomas J. Reilly – 96%
  • Sumo Logic – Ramin Sayar – 96%
  • Google – Sundar Pichai – 95%
  • Looker – Frank Bien – 93%
  • MongoDB – Dev Ittycheria – 92%
  • Snowflake Computing – Bob Muglia – 92%
  • Talend – Mike Tuchen – 92%
  • Databricks – Ali Ghodsi – 90%
  • Informatica – Anil Chakravarthy – 90%