Category Archives: AI

What did we learn from EMC’s data protection report?

EMC has recently released its Global Data Protection Index 2016, in which it claims only 2% of the world's organizations would be considered ‘leaders’ in protecting their own assets, reports Telecoms.com.

Data has dominated the headlines in recent months as breaches have made customers question how well enterprise organizations can manage and protect data. Combined with transatlantic disagreements in the form of Safe Harbour and law agencies’ access to personal data, the ability to remain secure and credible is now more of a priority for decision makers.

“Our customers are facing a rapidly evolving data protection landscape on a number of fronts, whether it’s to protect modern cloud computing environments or to shield against devastating cyber-attacks,” said David Goulden, CEO of EMC Information Infrastructure. “Our research shows that many businesses are unaware of the potential impact and are failing to plan for them, which is a threat in itself.”

EMC’s report outlined a number of challenges and statistics suggesting the majority of the industry is not where it should be with regard to data protection. While only 2% of the industry would be considered leaders in the data protection category, 52% are still evaluating the options available to them. Overall, 13% more businesses suffered data loss in the last twelve months compared to the preceding period.

But what are the over-arching lessons we learned from the report?

Vendors: Less is more

A fair assumption for most people would be that the more protection you take on, the more protected you are. This just seems logical. However, the study shows that the more vendors you count in your stable, the more data you stand to lose.

The average data loss incident costs a company 2.36TB of data, which would be considered substantial, though it could be worse. The study showed organizations who used one vendor lost on average 0.83TB per incident, two vendors 2.04TB and three vendors 2.58TB. For those who used four or more vendors, an average of 5.47TB of data was lost per incident.

Common sense would dictate that the more layers of security you have, the more secure you will be, however this is only the case if the systems are compatible with each other. It should be highlighted that those who lost the larger data sets are likely to be the larger companies, with more data to lose, though the study does seem to suggest there needs to be a more co-ordinated approach to data protection.

And they are expensive…

Using the same breakdown as before, the average cost of lost data was $900,000. For those with one vendor, the cost was $636,361; for those with two, $789,193; and for those with three vendors the cost was just above the average at $911,030. When companies bring in four or more vendors, the average cost of data loss rises to $1.767 million.

China and Mexico are the best

While it may be surprising, considering many of the latest breakthroughs in the data world have come from Silicon Valley or Israel, China and Mexico are the two countries which would be considered furthest ahead of the trend for data protection.

EMC graded each country on how effectively it was implementing the right technologies and culture to prevent data loss within organizations. 17 countries featured ahead of the curve, including the usual suspects of the UK (13.5% ahead of the curve), the US (8%), Japan (1%) and South Korea (9%), however China and Mexico led the charge at 20% and 17% ahead respectively.

While it may not be considered that unusual for China to have a strong handle on data within its own borders, Mexico is a little more surprising (at least to us at Telecoms.com). The country has gone through something of a technology revolution, growing in the last 20 years from a country where only 10% of people had a mobile phone to 68% this year, 70% of which are smartphones. Mexico is now the 11th largest economy in terms of purchasing power, with millennials the largest demographic. With the population becoming more affluent, and no longer constrained by the faults of the pre-internet world, the trend should continue. Keep up the good work Mexico.

Human error is still a talking point

When looking at the causes of data loss, the results were widespread, though the causes which cannot be controlled were at the top of the list. Hardware failure, power loss and software failure accounted for 45%, 35% and 34% respectively.

That said, the industry does now appear to be taking responsibility for the data itself. The study showed only 10% of data loss incidents were blamed on the vendor. A couple of weeks ago we spoke to Intel CTO Raj Samani, who highlighted to us that attitudes towards security (not just data protection) need to shift, as there are no means to outsource risk. Minimizing risk is achievable, but irrespective of what agreements are undertaken with vendors, the risk still remains with you. As fewer people are blaming the vendors, it would appear this responsibility is being realized.

Human error is another area which remains high on the agenda, as the study showed it accounts for 20% of all instances of data loss. While some of these instances can be blamed on leaving a laptop in the pub or losing a phone on the train, there are examples where simple mistakes in the workplace are to blame. These will never be removed entirely, as numerous day-to-day decisions are made on intuition and gut-feel, which remain a necessity for certain aspects of the business.

An area which could be seen as a potential danger is artificial intelligence. As AI advances, machines will become more human-like, and thus more capable of making decisions based on intuition. If this is the ambition, surely an intuitive decision-making machine would present a security weakness in the same way a human would. Admittedly the risk per decision would be substantially smaller, but then again, the machine would be making many times more decisions than the human.

All-in-all the report raises more questions than it answers. While security has been pushed to the top of the agenda for numerous organizations, receiving additional investment and attention, it does not appear these same organizations are getting any better at protecting themselves. The fact that 13% more organizations have suffered data loss in the last 12 months suggests it could be getting worse.

To finish, the study asked whether individuals felt their organization was well enough protected. Only 18% believe they are.

Twitter acquires machine learning start-up Magic Pony

Twitter has stepped up its efforts in the machine learning arena after announcing the acquisition of visual processing technology company Magic Pony.

While the company claims machine learning is central to the brand’s capabilities, it has been relatively quiet in the market segment compared to industry heavyweights such as IBM, Google and Microsoft. This is the third acquisition the team has made in this area, following the purchase of Whetlab last year and Mad Bits in 2014, and is reported to be in the range of $150 million. Google, by comparison, acquired Jetpac, Dark Blue Labs and Vision Factory, as well as spending a reported $500 million on DeepMind, all in 2014.

“Machine learning is increasingly at the core of everything we build at Twitter,” said Jack Dorsey, Twitter CEO. “Magic Pony’s machine learning technology will help us build strength into our deep learning teams with world-class talent, so Twitter can continue to be the best place to see what’s happening and why it matters, first. We value deep learning research to help make our world better, and we will keep doing our part to share our work and learnings with the community.”

The acquisition follows Twitter’s announcement last week advertisers will now be able to utilize emoji keyword targeting for Twitter Ads. Although a simple proposition in the first instance, the new features did open up the opportunity for machine learning enhanced advertising solutions.

Magic Pony, which was founded in 2014 and currently has 11 employees, was acquired to bolster the visual experiences that are delivered across Twitter apps. The team will link up with Twitter Cortex, the in-house machine learning department, to improve image processing expertise.

The technology itself makes use of convolutional neural networks to scale up an image. Taking the information in a picture, the technology constructs a larger and more detailed image by inferring the detail it cannot see. Much in the same way a human can imagine the rest of a car from seeing the door, the technology learns lessons from previous examples and applies them to new images.
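
The super-resolution idea described above can be sketched in a few lines, assuming nothing about Magic Pony's actual architecture: upscale with simple interpolation, then apply a convolution to restore detail. In this sketch a hand-picked sharpening kernel stands in for the weights a real convolutional network would learn from training images.

```python
import numpy as np

def nearest_upscale(img, factor=2):
    """Naive upscaling: each pixel becomes a factor x factor block."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def conv2d(img, kernel):
    """2D convolution with edge padding so the output keeps the input size."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# Hand-picked sharpening kernel: a stand-in for learned CNN weights.
SHARPEN = np.array([[0.0, -0.25, 0.0],
                    [-0.25, 2.0, -0.25],
                    [0.0, -0.25, 0.0]])

def super_resolve(img, factor=2):
    """Upscale, then 'hallucinate' detail with the (here: fixed) filter."""
    return conv2d(nearest_upscale(img, factor), SHARPEN)

img = np.array([[0.0, 1.0], [1.0, 0.0]])
big = super_resolve(img)
print(big.shape)  # (4, 4): twice the resolution in each dimension
```

In a trained network the single fixed kernel would be replaced by several learned convolutional layers, which is where the "imagining" of plausible detail comes from.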

Magic Pony was initially backed by Octopus Ventures, which has seemingly developed a specialty in finding promising AI start-ups. Prior to Magic Pony being acquired by Twitter, Octopus Ventures invested in Evi, which was acquired by Amazon in 2012, and SwiftKey, which was acquired by Microsoft this year.

“Today marks a great day for the Magic Pony team,” said Luke Hakes, Investment Director at Octopus Ventures. “We’re proud to have believed in the concept early on and to then have had the privilege of joining their journey. The technology Magic Pony has developed is revolutionary and pushes the boundaries of what is possible with AI in the video space.

“The UK continues to grow as the ‘go-to’ place for companies looking to build best in breed AI technology – Octopus has been fortunate to work with the founders of three companies in this space that have gone on to be acquired, with Evi and Amazon, SwiftKey and Microsoft, and now Magic Pony and Twitter. We are excited for the Magic Pony team, but also to take what we have learnt on the last three journeys and help the next generation of entrepreneurs lead the way in the on-going AI revolution.”

Machine learning front and centre of R&D for Microsoft and Google

Microsoft and Google have announced plans to expand their machine learning capabilities, through acquisition and new research offices respectively, reports Telecoms.com.

Building on the ‘Conversation-as-a-Platform’ proposition put forward by CEO Satya Nadella at Build 2016, the Microsoft team has announced plans to acquire Wand Labs. The purchase will add weight to the ‘Conversation-as-a-Platform’ strategy, as well as supporting innovation ambitions for Bing intelligence.

“Wand Labs’ technology and talent will strengthen our position in the emerging era of conversational intelligence, where we bring together the power of human language with advanced machine intelligence,” said David Ku, Corporate Vice President of the Information Platform Group on the company’s official blog. “It builds on and extends the power of the Bing, Microsoft Azure, Office 365 and Windows platforms to empower developers everywhere.”

More specifically, Wand Labs adds expertise in semantic ontologies, services mapping, third-party developer integration and conversational interfaces to the Microsoft engineering team. The ambition of the overarching project is to make the customer experience more seamless by harnessing human language in an artificial environment.

Microsoft’s move into the world of artificial intelligence and machine learning has not been a smooth ride to date, though this has not seemed to hinder investment. Back in March, the company’s AI-inspired Twitter account Tay went into meltdown mode, though the team pushed forward, updating its Cortana Intelligence Suite and releasing its Skype Bot Platform. Nadella has repeatedly highlighted that artificial intelligence and machine learning are the future for the company, stating at Build 2016:

“As an industry, we are on the cusp of a new frontier that pairs the power of natural human language with advanced machine intelligence. At Microsoft, we call this Conversation-as-a-Platform, and it builds on and extends the power of the Microsoft Azure, Office 365 and Windows platforms to empower developers everywhere.”

Google’s efforts in the machine learning world were also pushed forward this week, as the team announced on its blog a dedicated machine learning research group based in its Zurich offices. The team will focus on three areas specifically: machine intelligence, natural language processing & understanding, and machine perception.

Like Microsoft, Google has prioritized artificial intelligence and machine learning, though both companies will be playing catch-up with the likes of IBM and AWS, whose AI propositions have been in the market for some time. Back in April, Google CEO Sundar Pichai said in the company’s earnings call “overall, I do think in the long run, I think we will evolve in computing from a mobile first to an AI first world,” outlining the ambitions of the team.

Google already has a number of machine learning capabilities incorporated in its product portfolio, though these could be considered relatively rudimentary. Translate, Photo Search and SmartReply for Inbox already contain aspects of machine learning, though the team is targeting more complex and accurate competencies.

Elsewhere, Twitter has announced on its blog that advertisers will now be able to utilize emoji keyword targeting for Twitter Ads. The new feature uses emoji activity as a signal of a person’s mood or mindset, allowing advertisers to communicate marketing messages more effectively while minimizing the potential for backlash from disgruntled Twitter users. Although the blog does not mention machine learning competencies, it does leave the opportunity for future innovation in the area.
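
As a toy illustration of the emoji-as-mood-signal idea, here is a minimal sketch of how such targeting might work. The emoji-to-mood mapping and the ad inventory below are invented for illustration; this is not Twitter's actual Ads API.

```python
# Toy sketch: treat emoji in a tweet as a mood signal and match ads to it.
# Both lookup tables are hypothetical examples, not real targeting data.
EMOJI_MOOD = {
    "😀": "happy", "🎉": "happy",
    "😢": "sad",
    "🍕": "hungry", "🍔": "hungry",
}

ADS_BY_MOOD = {
    "happy": "concert tickets",
    "hungry": "pizza delivery",
}

def target_ad(tweet: str):
    """Return the first ad whose mood matches an emoji in the tweet, else None."""
    for ch in tweet:
        mood = EMOJI_MOOD.get(ch)
        if mood in ADS_BY_MOOD:
            return ADS_BY_MOOD[mood]
    return None

print(target_ad("craving a slice 🍕"))  # pizza delivery
print(target_ad("rough day 😢"))        # None: no ad booked against 'sad'
```

A production system would presumably layer machine learning on top of this, inferring mood from the full tweet rather than a fixed emoji lookup.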

IBM launches weather predictor Deep Thunder for The Weather Company

IBM’s Weather Company has announced the launch of Deep Thunder to help companies predict the actual impact of various weather conditions.

By combining hyper-local, short-term custom forecasts developed by IBM Research with The Weather Company’s global forecast model, the team hopes to improve the accuracy of weather forecasting. Deep Thunder will lean on the capabilities of IBM’s machine learning technologies to aggregate a variety of historical data sets and future forecasts, providing fresh guidance every three hours.

“The Weather Company has relentlessly focused on mapping the atmosphere, while IBM Research has pioneered the development of techniques to capture very small scale features to boost accuracy at the hyper local level for critical decision making,” said Mary Glackin, Head of Science & Forecast Operations for The Weather Company. “The new combined forecasting model we are introducing today will provide an ideal platform to advance our signature services – understanding the impacts of weather and identifying recommended actions for all kinds of businesses and industry applications.”

The platform will combine more than 100 terabytes of third-party data daily, as well as data collected from the company’s 195,000 personal weather stations. The offering can be customized to suit the location of various businesses, with IBM execs claiming hyper-local forecasts can be refined to a resolution of between 0.2 and 1.2 miles, while also taking into account local factors such as vegetation and soil conditions.
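
The general principle of combining a coarse global model with a hyper-local one can be sketched as a per-cell weighted average over a forecast grid. The grids and confidence weights below are invented for illustration and do not reflect IBM's actual method.

```python
import numpy as np

# Two forecast grids for the same area (e.g. rainfall in mm): a coarse
# global model and a hyper-local model. All values are invented.
global_model = np.array([[2.0, 2.0], [2.0, 2.0]])
local_model  = np.array([[1.0, 3.0], [2.5, 0.5]])

# Per-cell confidence in the local model, e.g. reflecting nearby
# personal-weather-station density (again, made-up numbers).
local_confidence = np.array([[0.9, 0.2], [0.5, 0.8]])

# Blend: trust the local model where confidence is high,
# fall back to the global model elsewhere.
blended = local_confidence * local_model + (1 - local_confidence) * global_model
print(blended)
```

Each blended cell sits between the two model values, weighted towards whichever source is more trustworthy there; a real system would learn those weights from historical forecast error rather than fix them by hand.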

Applications for the new proposition vary from agriculture to city planning & maintenance to validating insurance claims, however IBM has also stated consumer influences can be programmed into the platform, meaning retailers could use the insight to manage their supply chains and understand what should be stocked on shelves.

Facebook launches 30 made-for-VR games at E3

Facebook, Bethesda Softworks and Sony are among the names to have announced new made-for-VR games at E3, reports Telecoms.com.

Facebook has launched 30 made-for-VR games for the Oculus Touch as it continues efforts to diversify its portfolio. Aside from those being released in the coming months, the Oculus team has also stated it has ‘hundreds’ more titles in the pipeline, though it hasn’t established when the Touch motion controllers might ship. The announcement also included the launch of Oculus Ready PCs, made by Alienware, Lenovo, and HP.

Bethesda Softworks also claims its Fallout 4 will become the first big open-world game to get an official, studio-released virtual reality mode, while Sony announced its Resident Evil title will receive the ‘full VR experience’.

While the shift towards VR and AR offers healthy potential for brands and gaming companies alike, it could present the same challenges for network players as the rise of mobile did. VR could place similar stress on the network as smartphone mass-adoption and the subsequent reduction in the price of data. Deloitte estimates 2.5 million VR headsets and 10 million game copies could be sold in 2016 alone.

From a VR perspective, the gaming industry represents a healthy opportunity for brands such as Oculus. Research from intelligence firm Newzoo estimates gamers worldwide could generate a total of $99.6 billion in revenues in 2016, up 8.5% compared to 2015. Mobile will account for $36.9 billion, exceeding PC revenues for the first time, and growth is expected to continue at a healthy 6.6% CAGR through to 2019, potentially reaching $118.6 billion in total.

One of the main challenges for the VR industry currently is the level of adoption and normalization of the technology itself. The hardware is generally perceived as a luxury item, and VR revenues will remain marginal in the short- to mid-term until uptake has moved into the mainstream market. Newzoo expects the majority of revenues to be generated by hardware sales, spectator content, and live viewing formats, though in the long run VR is likely to be a platform where consumers communicate with each other and interact with content.

Elsewhere in the industry, Sony has confirmed its first steps into the world of high-end VR by announcing the release of PlayStation VR. The headset will be available later this year, on October 13th, and will be priced at $499 when bundled with the camera and Move controllers it needs to be fully functional.

While Sony is slightly later to the market than the Oculus Rift and HTC Vive, should the team be able to capitalise on strong performance in recent months the move could prove to be a successful venture. During the final quarter of 2015, Sony’s gaming division reported a 10.5% year-on-year increase in revenue, brought on by strong PlayStation hardware and software sales totalling $4.89 billion. Operating income for the gaming unit was 45.5% higher, owing partly to the fact the company has sold more than 35 million PlayStation 4 consoles.

IBM launches interactive ads on Watson

IBM has announced the launch of Watson Ads to harness the AI potential of its cognitive computing platform and create interactive ads, personalized to individual customers. The first offerings of the initiative will be made available through The Weather Company sub-brand.

Personalized advertising has proved to be big business in recent months as brands aim to move away from the blanket marketing approach and towards a proposition where one-to-one communications are the norm. IBM believes Watson’s ability to understand and comprehend natural language will enable advertisers to interact with customers on a more personal level, and on a wide scale.

“The dawn of cognitive advertising is truly a watershed moment. Now as part of IBM, we have even more tools and technologies at our disposal to inspire innovations within advertising, artificial intelligence and storytelling,” said Domenic Venuto, GM of Consumer products at The Weather Company. “This is a huge opportunity to expose consumers to all of the surprising and delightful experiences that Watson has in store for them – and to make advertising a truly valuable interaction for both our fans and our marketing partners, which is always our goal.”

IBM claims the new proposition will aid advertisers in numerous ways, including a better understanding of brand perception and customer favourability, helping customers make more informed decisions, improving overall experience, optimizing creative and advertising strategies, and helping marketers use data more effectively.

As part of the initiative, the team will also create the Watson Ads Council, a collection of marketers from various verticals, who will act as a sounding board for the latest innovations leveraging Watson Ads and cognitive advancements in advertising.

“Transforming ourselves and industries is part of The Weather Company DNA,” said Jeremy Steinberg, Global Head of Sales at The Weather Company. “We’ve embraced big data and leveraged it to improve every aspect of our business, from forecast accuracy to ad targeting. Now we’ve set our sights on cognition. We believe human interaction is the new ‘search,’ and that cognitive advertising is the next frontier in marketing – and we’re leading the charge to make it a reality.”

Watson Ads will launch first exclusively across The Weather Company properties, but this is expected to have broad implications for other marketing channels, including out of home, television, connected cars and social media platforms.

Dell targets SMBs in China with launch of new company

Dell has prioritized growing its presence within the Chinese market, targeting SMBs and public sector organizations, according to China Daily.

Speaking at the China Big Data Industry Summit in Guiyang, Dell CEO Michael Dell announced the launch of a new company, alongside its local partner, to gain traction within the lucrative market. Guizhou YottaCloud Technology will now act as a means for Dell to access the local market, prioritizing small and medium-sized enterprises and local governments in the first instance.

“China will play an increasingly important role in the big data era and the United States-based tech giant will speed up efforts to develop new products for the market,” said Dell at the conference.

Dell is one of a number of organizations which have prioritized local partnerships in the Chinese market, as locals tend to favour Chinese businesses and technologies over foreign counterparts, citing security as the main driver. The country is a big draw for Dell as a business, representing its second largest market worldwide, behind only the US. The company also highlighted in September that it plans to invest $125 billion in the Chinese market over the next five years, with cloud computing the focal point.

Last year Dell launched its ‘In China for China’ strategy, which not only included the above investments, but also a drive from its Venture Capital arm in China to encourage entrepreneurialism, an expansion of its R&D function in the country, and the establishment of an artificial intelligence and advanced computing joint lab with the Chinese Academy of Sciences. The AI research will focus on the areas of cognitive function simulation, deep learning and brain computer simulation.

“The Internet is the new engine for China’s future economic growth and has unlimited potential,” said Dell in September. “Being an innovative and efficient technology company, Dell will embrace the principle of ‘In China, for China’ and closely integrate Dell China strategies with national policies in order to support Chinese technological innovation, economic development and industrial transformation.”

Marc Benioff backs AI as Salesforce reports 28% growth

Salesforce reported healthy results over the course of Q1, growing 28%, as CEO Marc Benioff backed AI as the next major growth driver, during the company’s quarterly earnings call.

While social and mobile have facilitated Salesforce’s growth in recent years, the team is backing artificial intelligence as the next major trend to take the company through its targeted $10 billion in annual revenue. Benioff highlighted that in the same way the company is now known for being a social and mobility brand, the ambition is for Salesforce to be perceived as “an AI first company”.

“When I look at kind of the next major trend for Salesforce and our industry that will drive tremendous growth is got to be artificial intelligence,” said Benioff. “And as we look out into the future and we start to look at extreme improvement and advances in artificial intelligence whether it’s machine learning, whether it’s deep learning, whether it’s machine intelligence itself, I think that those kind of capabilities appearing inside our applications that is going to be a major growth capability going forward.”

One of the newest product launches for the company, Salesforce Inbox, uses these AI and machine intelligence capabilities to give companies a perspective on how they can be more efficient in their sales, service, and marketing processes. SalesforceIQ is another offering which uses the same capabilities, as it has an artificial intelligence front end, while Benioff also highlighted that Sales Cloud has a machine learning front end.

While others in the industry have been very vocal about their progress within the AI field, Salesforce has seemingly been sneaking in under the radar with additional acquisitions including Tempo AI and PredictionIO. SalesforceIQ, an AI-driven calendar app which can prioritize work schedules for sales employees, was incorporated into the product portfolio following the $390 million acquisition of RelateIQ in 2014. These acquisitions, as well as organic development, are aiding the company in adapting to what Benioff described as “an AI first world”.

Salesforce’s new efforts will focus on the new, digitally enabled customers and consumers, who could be seen to be driving the transformation worldwide. This new generation is defined by technology and speed; as Benioff highlighted, they want services faster and easier than ever before, and are ever more reliant on social and mobile technologies. Companies which do not adapt to this new proposition but remain in a more traditional model will struggle to remain competitive.

“We’re in the midst of a massive generational shift; a new generation of customers and consumers is clearly emerging,” said Benioff. “We have been calling them here at Salesforce C generation customers. I mean this is really part of a huge shift that’s happening in computing. We’ve gone from the first generation of computing which was very much about systems of record to the second generation which was systems of engagement we talked about that on these calls many times over the last 10 years.

“And we are clearly moving into this incredible world that the system of intelligence that’s all yielding these incredible systems of customers or C generation customers that are — that our customers are connecting to. And that’s we’re so excited about.”

In terms of financials, revenues for Q1 grew to nearly $2 billion, up 28% in constant currency. Sales Cloud demonstrated 15% year-over-year growth, Service Cloud grew 32%, Marketing Cloud grew 29%, while App Cloud and other business units grew 45%. Growth in Sales Cloud was the highest recorded in the five previous quarters, which Benioff attributed to a number of new innovations, including its Lightning platform, where the team recently released an updated government edition, as well as Pardot and SteelBrick capabilities.

The team is also raising 2017 revenue guidance to between $8.16 billion and $8.2 billion, and expects revenues of between $2.005 billion and $2.015 billion in Q2.

“I’m also thrilled to announce we’re raising full-year revenue guidance $80 million raising the guidance we feel really excited about that, $8.2 billion is the high-end of our range and our current outlook puts us on its square path, look we are going to see now that we’re going to realize very shortly our $10 billion dream,” said Benioff. “This is amazing I think that one of the reasons that we are doing so well is because Oracle and SAP are doing so poorly in the cloud”

Sony leans on AI to give technological advantage

Sony has announced its latest investment into Cogitai, taking the company’s interests into the world of artificial intelligence.

Artificial intelligence has been claiming column inches in recent months, as numerous technology companies including Facebook and Google aim to gain traction in a potentially profitable marketplace. Sony has some experience in the AI space, having incorporated a number of face and speech recognition capabilities into previous products, though it has not specifically stated where Cogitai’s technology will fit into the mix. Financials of the agreement have not been released to date.

“We believe that AI will be incorporated into numerous products and will eventually become commonplace,” said Hiroaki Kitano, CEO of Sony Computer Science Laboratories. Kitano’s division is responsible for future innovation in the business, where the team is currently investigating the role of AI in enhancing the music experience for customers, as well as how the company can improve its own internal manufacturing processes.

“As this evolution happens, the most important thing to focus on is the benefit the technology brings to consumers. Because of this, the choice of domains, value propositions, and how one can align technologies to enable them to work together will be crucial. From this perspective the collaboration between Cogitai and Sony is a major milestone for the next wave of AI.”

The company’s first venture into the AI market centred around the launch of robotic dog AIBO in 1999 and humanoid robot QRIO in 2003. While these launches received a healthy amount of attention at the time, the last products were produced in 2006 as the company needed to concentrate on fighting off competition in its core consumer electronics business. Having restructured that business, the team could be using the integration of AI to provide technological advantage in the market segment.

Sony’s current AI activities are centred within the System R&D Group, which is based at Sony Headquarters and is also responsible for the development of augmented reality and other emerging technology areas. The team has implemented various AI capabilities in a number of current products, including Xperia Agent, a voice-activated robot which provides information in a similar manner to Siri, and Project N, a wearable device, though the capabilities don’t appear to be as advanced as others in the market.

Accenture and IPsoft team up to launch AI initiative

Accenture has expanded its partnership with IPsoft to accelerate the adoption and implementation of artificial intelligence technologies.

As part of the relationship, the team will launch the Accenture Amelia Practice, a new consulting arm for Accenture which will develop go-to-market strategies using IPsoft’s product offering to build virtual agent technology for customers. In the first instance, the team will target the banking, insurance and travel industries.

“Artificial intelligence is maturing rapidly and offers great potential to reshape the way that organisations conduct business and interact with their customers and employees,” said Paul Daugherty, Accenture’s CTO. “At the same time, executives are overwhelmed by the plethora of technologies and many products that are advertising AI or Cognitive capabilities.”

“With our new Accenture Amelia practice, we are taking an important step forward in advancing the business potential of artificial intelligence by combining IPsoft’s world-class virtual agent platform with Accenture’s broad technology capabilities and industry experience to help clients transform their business and operations.”

The extended partnership will focus on creating practical implementations for AI within the current business world, using automation at scale to increase organizational efficiencies. The IPsoft team have implemented the same concept with a number of customers including programs to answer invoicing queries from suppliers and front-line customer service bots.

Artificial intelligence is seemingly one of a number of new areas being prioritized by the Accenture team, as the industry continues to trend towards a more digitally enabled ecosystem. Recent research highlighted that the digital economy accounts for roughly 22% of the world’s total economy, a figure predicted to rise to 25% in the coming years; it was as low as 15% in 2005. The same research also predicts growth of new technology will continue on an upward scale, as 28% of respondents believe the pace of change will increase “at an unprecedented rate”.

While Accenture’s business has predominantly been focused around traditional IT to date, the team’s future business will shift slightly towards disruptive technologies, building on its new business mantra ‘Every Business is a Digital Business’. AI is one of those prioritized disruptions, as it described artificial intelligence and intelligent automation as the “essential new co-worker for the digital age”.

It would appear Accenture is betting heavily on these new technologies, as it claims 70% of executives are making significantly more investments in artificial intelligence technologies than they did in 2013, and 55% state that they plan on using machine learning and embedded AI solutions (like Amelia) extensively.