Category Archives: AI

AI is getting there but still confusing…

Research from Narrative Science claims confusion over the definition of artificial intelligence is holding it back, although 62% of enterprise respondents believe it will be in place by 2018, reports Telecoms.com.

Although this is an encouraging statistic, the report also highlights confusion over the definition of the technology itself. 62% of those who contributed to the survey said they were not currently using AI, however later in the survey 88% of the same respondents were found to be using products or solutions which are underpinned by AI technology. 20% of respondents said AI wouldn’t be implemented in their organization until there was more clarity on what the technology is, where it fits into the IT function and what the benefits are.

These statistics highlight, more than anything else, confusion over and ignorance of the artificial intelligence technology already present in day-to-day life. AI isn’t new, in science fiction movies or in real life. From Siri on Apple devices to Amazon’s recommended purchases or Facebook’s content recommendations, AI has been drip-fed into the real world of technology with few people realizing its impact. The functions mentioned are AI in one of its simplest forms, though IBM has been making progress with its Watson offering, moving into more complex arenas such as medical diagnosis, building management and weather modelling systems.

But what is the real potential of artificial intelligence? According to the report, predictive analytics is the most prominent use-case. 38% of the respondents believe prediction on activity relating to machines, customers or business health is the most relevant use-case. This is one of the more obvious use-cases as there is a direct link to the bottom line, recouping the investment made in the technologies. Whether this is repairs on leased equipment, understanding which customers are most likely to churn or understanding external factors which may impact the supply/demand dynamic, these are all use-cases which impact the bottom line.

These use-cases can also be linked back to the growth of big data and the desire to become more competitive by being more intelligent. The more information a company has access to, the better informed its decisions become and the lower the risk undertaken. Depending on who you speak to, the industry is either very good or very bad at using data. The truth is almost certainly somewhere in the middle, as there are only so many man-hours which can be devoted to analysing this data, and data scientists are in high demand.

With the introduction of IoT, increased efficiency in collection and more effective real-time solutions, the tidal wave of information available to an organization will continue to grow. For the investment in data collection, storage and management to be realized, an artificially intelligent solution which can comprehend the information and turn it into insight is needed, as no human could work long enough to do the same. To ensure ROI and avoid drowning in the swell of information, artificial intelligence could be critical.

Another area which received attention in the report was automation. This would appear to be low on the agenda currently, though 25% of respondents felt it was the most important use-case moving forward. One of the myths which has been swirling around artificial intelligence since the release of Terminator is the idea that AI will eventually remove the requirement for humans. It’s all very doom and gloom, however AI offers companies the opportunity to take the more mundane, simplistic and repetitive tasks away from employees, so they can focus more time on work considered more valuable and critical to the success of the business.

While there still needs to be a focus on what artificial intelligence actually is and what can be achieved through the implementation of such next-gen technologies, progress is beginning to be seen. Should cloud computing and 5G be the driving forces behind IoT, assistance from AI-driven solutions would appear crucial to ensuring the time and investment are not wasted. An AI solution will not (at least in the near future) make business-critical decisions, though the promise of big data is to provide a suitable level of information to ensure businesses are making informed decisions. AI could be the link between information and insight.

Intel digs deep into wallet to buy its way into AI game

Virtual reality may well have been capturing the imagination of the industry in recent months, but Intel’s $400 million acquisition of AI start-up Nervana highlights it’s not all fun and games, reports Telecoms.com.

Having established its position as a leader in the data centre market and then largely missed out on the smartphone revolution, it would appear Intel is determined not to miss the burgeoning IoT segment, with the Nervana purchase adding more firepower to the company’s efforts. The acquisition also highlights the importance of artificial intelligence to the development of the technology industry.

“Intel is a company that powers the cloud and billions of smart, connected computing devices,” said Diane Bryant, GM of the Data Center Group at Intel. “Thanks to the pervasive reach of cloud computing, the ever decreasing cost of compute enabled by Moore’s Law, and the increasing availability of connectivity, these connected devices are generating millions of terabytes of data every single day. The ability to analyse and derive value from that data is one of the most exciting opportunities for us all. Central to that opportunity is artificial intelligence.”

The IoT revolution is coming whether we like it or not, and with it will come vast amounts of data. Due to the sheer volume, it will be beyond human capacity to develop insight from the information. Current data analytics tools and processes could be described (at best) as adequate, and this is before the surge in connected devices. Statista estimates the number of connected devices will grow from 18.2 billion in 2015 to 50.2 billion in 2020. The devices themselves will also improve, increasing the amount of information each can collect individually, which will lead to a tidal wave of data to be analysed.
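As a quick sanity check on that forecast, the implied growth rate can be worked out directly from the Statista figures quoted above:

```python
# Compound annual growth rate implied by the Statista forecast:
# 18.2 billion connected devices in 2015 growing to 50.2 billion in 2020.
start, end, years = 18.2, 50.2, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 22.5% per year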

If it is immensely difficult, or more likely impossible, for humans to analyse this data and turn it into actionable insight, what is the point in collecting it in the first place? This is the justification for artificial intelligence. Using such technologies to make the more rudimentary decisions arising from data analysis, or to present insight on the more complex decisions to business leaders, is where the value of artificial intelligence will be felt. If cloud computing enables the IoT revolution, artificial intelligence will make sure it’s not a waste of time or money.

For a notable proportion of the population, AI is likened to Terminator or other such doomsday stories. But as Bryant notes below, the applications of AI will stretch throughout the life of a consumer, but perhaps more importantly, the business, manufacturing and services world.

“While artificial intelligence is often equated with great science fiction, it isn’t relegated to novels and movies,” said Bryant. “AI is all around us, from the commonplace (talk-to-text, photo tagging, and fraud detection) to the cutting edge (precision medicine, injury prediction, autonomous cars). Encompassing compute methods like advanced data analytics, computer vision, natural language processing and machine learning, artificial intelligence is transforming the way businesses operate and how people engage with the world.”

The acquisition does answer a question raised by Telecoms.com a couple of weeks ago. In early July, Intel announced a new initiative with BMW and Mobileye to drive forward the development of autonomous vehicles. The initiative showed potential, though should BMW supply the cars, Intel the chips and Mobileye the detection capabilities, the partnership would have the body, the muscles and the eyes, but not the brain to bring it all together. The Nervana acquisition in theory completes the circle and provides the intelligence aspect of the car.

Artificial intelligence has the potential to shape the technology industry moving forward, and it would appear this is a view shared by the major players. Google has acquired nine AI firms, including DeepMind for $625 million, Twitter has made four major acquisitions, most recently Magic Pony for $150 million, Salesforce has acquired two AI start-ups already this year and Apple reportedly bought Turi for $200 million. The money being spent to gain the upper hand in this sub-sector is beginning to rival the early days of cloud computing.

Apple gives Siri an AI facelift

Apple has continued its journey into the world of artificial intelligence through the $200 million acquisition of machine learning start-up Turi, according to Geekwire.

The deal has not been explicitly confirmed by the team at Apple, though it does back up claims from CEO Tim Cook that the company is extending its footprint into the growing sub-sector. Although Apple has not been the most prominent in the industry in terms of grabbing headlines (Google and IBM have been particularly vocal), a number of its products are built on the basic principles of artificial intelligence. Siri is a prime example, and expanding its potential through the implementation of more advanced technologies offers the chance to improve the user experience.

Turi offers tools which enable developers to embed machine learning into applications, and which automatically scale and tune. Use cases for the technology include product recommendations, sentiment analysis, churn prediction and lead scoring for trial customers.
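Turi’s own API is not detailed here, so as an illustrative sketch of one of the listed use cases, churn prediction, the example below uses scikit-learn on entirely synthetic data; every feature, threshold and figure is hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: monthly spend and number of support tickets raised.
X = np.column_stack([rng.normal(50, 15, n), rng.poisson(2, n)])
# Synthetic label: low-spend, high-ticket customers are marked as churned.
y = ((X[:, 0] < 45) & (X[:, 1] > 2)).astype(int)

model = LogisticRegression().fit(X, y)
# Estimated churn probability for a low-spend, high-ticket trial customer.
prob = model.predict_proba([[30.0, 5.0]])[0, 1]
print(f"churn probability: {prob:.2f}")
```

Tools like Turi’s aim to automate exactly the feature handling, scaling and tuning steps done by hand here.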

The long-term plan for the business is not clear at the moment. Whether the tools will be made available to the Apple developer community or remain in-house, or even whether the company will remain in Seattle, is unknown as the acquisition remains officially unconfirmed.

“These experiences become more powerful and intuitive as we continue our long history of enriching our products through advanced artificial intelligence,” said Cook on the company’s earnings call last month. “We have focused our AI efforts on the features that best enhance the customer experience.”

During the briefing, Cook highlighted the potential for Siri not only to understand the user’s words, but also to identify sentiment. The acquisition of Turi could help Siri evolve from the relatively simplistic function it is today into one which can more effectively predict what the consumer wants and better refine search results.

“We’re also using machine learning in many other ways across our products and services, including recommending songs, apps, and news,” said Cook. “Machine learning is improving facial and image recognition in photos, predicting word choice while typing in messages and mail, and providing context awareness in maps for better directions.

“Deep learning within our products even enables them to recognize usage patterns and improve their own battery life. And most importantly, we deliver these intelligent services while protecting users’ privacy. Most of the AI processing takes place on the device rather than being sent to the cloud.”

Although less vocal than other industry players, Apple has been expanding its capabilities through various acquisitions. Since the turn of 2015 the company has acquired 15 organizations, not including Turi for the moment, a number of which bring machine learning competencies. VocalIQ, a UK speech tech firm, and Perceptio, an image recognition company, were both bought in September last year, as well as facial recognition business Emotient in January.

The sluggish smartphone market has been causing challenges for manufacturers, driving the need for more differentiation. Hardware has provided little opportunity for brands to differentiate products, and operating systems offer even less variance, meaning manufacturers have had to invest more in software solutions. Siri is already one of the more recognizable personal assistant features on the market, and the inclusion of an in-phone AI offering could bring about much-needed differentiation.

Google grows (again) but ‘Other Bets’ cost the giant $1bn

Google has reported its Q2 numbers, continuing a strong run of performances within the technology industry, though efforts to diversify its overall business are not paying off just yet, reports Telecoms.com.

The Alphabet brand was announced last year, with the aim of allowing the team to invest in other projects more freely without being impeded by the advertising business. It would appear the management team is not afraid to throw R&D money at its innovation efforts as it searches for another billion-dollar business, as the ‘Other Bets’ segment, which includes Google Fiber and the autonomous car project, accounted for an operating loss of $859 million. Revenues did grow to $185 million, up 150% on the same quarter in 2015, though this number was made almost insignificant by the $19 billion generated by the advertising business.

The technology industry on the whole has been reporting strong numbers over the last couple of weeks, though there has been a question as to whether two advertising giants can co-exist. Facebook reported significant growth yesterday, with advertising revenues across the period increasing 63% year-on-year to $6.2 billion, yet these numbers were dwarfed by Google’s, perhaps demonstrating there is potential for both organizations to share advertising revenues, even as the value of individual ads declines, and still grow healthily.

With regard to the dwindling value of individual ads, Google would appear to be combatting this with volume. CFO Ruth Porat highlighted mobile search as the primary driver behind the year-on-year growth, though desktop and tablet search also grew.

Numbers such as these will grab headlines, meaning it can be easy to forget about the Google cloud business, one of the top priorities for the Alphabet business moving forward.

On the same day that AWS reported revenues of $2.9 billion for the quarter, Google’s cloud business also demonstrated solid growth. Although the numbers are not broken out specifically, the ‘Other’ revenues segment, which includes the cloud business and other services such as Google Play, accounted for $2.1 billion over the three-month period, an increase of 33% on Q2 2015.

“Many tremendous digital experiences are being built in the cloud today, and businesses are working to take advantage of the cloud as part of their digital transformation,” said Google CEO Sundar Pichai. “We’ve been integrating our cloud and apps products to create more unified solutions for companies large and small, and these efforts are paying off.”

Following on from Pichai’s previous comments on the role of artificial intelligence in the Google cloud platform, and the wider Google business, its importance has been reiterated once again. Machine learning is being prioritized as Google’s differentiator in a competitive technology market, and only last week the team introduced two cloud machine learning APIs, for speech and natural language, to help enterprise customers convert audio to text and easily understand the structure and sentiment of text in a variety of languages.
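As a sketch of what calling such an API involves, the snippet below builds the JSON body for a sentiment-analysis request in the shape Google’s Cloud Natural Language REST endpoint expects; the endpoint path and the need for an API key are assumptions on our part, so treat it as illustrative rather than authoritative.

```python
import json

# Hypothetical request to the Cloud Natural Language sentiment endpoint.
url = "https://language.googleapis.com/v1/documents:analyzeSentiment"
payload = {
    "document": {
        "type": "PLAIN_TEXT",
        "content": "The new data centre launch went brilliantly.",
    },
    "encodingType": "UTF8",
}
body = json.dumps(payload)
# An actual call would add credentials, e.g.:
#   requests.post(url, params={"key": API_KEY}, data=body)
print(body)
```

The response would carry a sentiment score and magnitude for the document, which is the “structure and sentiment” capability described above.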

In terms of footprint, the team are not done growing yet. At the end of last month, Google and friends completed work on a new trans-Pacific submarine cable system, which will help the team launch a new Google Cloud Platform East Asia region in Tokyo. Back in March, the team confirmed it would be investing heavily in expansion of its cloud footprint with 12 new data centres around the world by the end of 2017.

AWS has previously stated it intends to break the $10 billion barrier in cloud revenues during 2016, though Google may not be that far behind. With its history of not being afraid to invest, and the growth numbers which have been witnessed over the last few quarters, Google could be set to accelerate.

Microsoft continues cloud transformation with 100% Azure growth

Microsoft has reported 5% growth to $22.6 billion as the Intelligent Cloud business unit led the charge, with the Azure public cloud offering more than doubling in revenues and compute usage, reports Telecoms.com.

The Intelligent Cloud unit, which includes server products and cloud services, Azure and enterprise mobility offerings, grew 7% to $6.7 billion, while Productivity and Business Processes, which includes the Office commercial and consumer product lines as well as the Dynamics suite, grew 5% to $7 billion. Despite revenues in More Personal Computing declining 4% to $8.9 billion, Xbox Live monthly active users grew 33% year-over-year to 49 million and search advertising revenue grew 16% over the period.

“We delivered $22.6 billion in revenue this quarter, an increase of 5% for the quarter in constant currency,” said Satya Nadella, CEO at Microsoft. “This past year was pivotal in both our own transformation and in partnering with our customers who are navigating their own digital transformations. The Microsoft Cloud is seeing significant customer momentum and we’re well positioned to reach new opportunities in the year ahead.”

Cloud computing has once again brought Microsoft to the forefront of the technology industry following a challenging couple of years. It would appear the transition from software brand to cloud computing brand is being successfully navigated, though there were a few missteps along the way, most notably the team’s foray into mobile. Microsoft is moving towards the position of ‘mega-vendor’, infiltrating almost all aspects of an organization (cloud, hardware, social, databases etc.) to make itself an indispensable part of a CIO’s roster.

The Intelligent Cloud unit continues as the focal point of the company’s growth strategy, as Nadella claims nearly 60% of the Fortune 500 companies use at least three of the company’s cloud offerings, generating more than $12 billion in Commercial Cloud annualized revenue run rate.

“Companies looking to digitally transform need a trusted cloud partner and turn to Microsoft,” said Nadella. “As a result, Azure revenue and usage again grew by more than 100% this quarter. We see customers choose Microsoft for three reasons. They want a cloud provider that offers solutions that reflect the realities of today’s world and their enterprise-grade needs. They want higher level services to drive digital transformation, and they want a cloud open to developers of all types.”

AI has previously been positioned as one of the cornerstones of growth for the company, and this was reinforced during the earnings call as Nadella highlighted its place within the Intelligent Cloud business unit. The Cortana Intelligence Suite, formerly known as the Cortana Analytics Suite, is built on the company’s ongoing research into big data, machine learning, perception, analytics and intelligent bots. The offering allows developers to build apps and bots which interact with customers in a personalized way, but also react to real-world developments in real time.
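Microsoft’s bot tooling is not shown in the source, but the basic idea of a bot that tailors replies to a customer can be sketched with a few hand-written rules; the intents, prices and responses below are entirely hypothetical and stand in for what real offerings do with machine-learned intent classification.

```python
# Minimal rule-based bot sketch: map keywords in a message to a
# personalized reply. Production bots replace these rules with
# trained intent and sentiment models.
def reply(message: str, customer_name: str) -> str:
    text = message.lower()
    if "price" in text or "cost" in text:
        return f"{customer_name}, our plans start at $10/month."
    if "cancel" in text:
        return f"Sorry to hear that, {customer_name}. Let me connect you to an agent."
    return f"Thanks for your message, {customer_name}. How can I help?"

print(reply("How much does it cost?", "Ana"))  # prints "Ana, our plans start at $10/month."
```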

“Just yesterday, we announced Boeing will use Azure, our IoT suite, and Cortana Intelligence to drive digital transformation in commercial aviation, with connected airline systems optimization, predictive maintenance, and much more,” said Nadella. “This builds on great momentum in IoT. This is great progress, but our ambitions are set even higher. Our Intelligent Cloud also enables cognitive services. Cortana Intelligence Suite offers machine learning capabilities and advanced predictive analytics.

“Central to our Intelligent Cloud ambition is providing developers with the tools and capabilities they need to build apps and services for the platforms and devices of their choice. The new Azure Container service as well as .NET Core 1.0 for open source and our ongoing work with companies such as Red Hat, Docker, and Mesosphere reflects significant progress on this front. We continue to see traction from open source, with nearly a third of customer virtual machines on Azure running Linux.”

The company exceeded analyst expectations for the quarter, which was reflected in pre-market trading as shares in the giant grew 4%. In terms of outlook for the next quarter, most business units are expected to be down a fraction on the Q2 reported figures, unsurprising considering the summer period. Intelligent Cloud is expected to bring in $6.1-6.3 billion, Productivity and Business Processes $6.4-6.6 billion, and More Personal Computing $8.7-9 billion.

IBM makes cloud progress but reports another quarterly decline

IBM revenues fell for a 17th consecutive quarter despite beating analyst expectations and demonstrating healthy growth in its cloud and data business units, reports Telecoms.com.

The company reported a drop in Q2 revenues of 2.8% to $20.24 billion, though this was an improvement on analyst expectations of $20.03 billion, encouraging shares to rise 2.6% to $164 after hours. The business units the company deems strategic imperatives (cloud, analytics and engagement) grew 12% year-on-year, though this wasn’t enough to counter the impact of legacy technologies on earnings, which fell to $2.5 billion from $3.45 billion in 2015. Overall, revenues are now roughly 25% lower than the numbers reported in 2011.

“We continued to deliver double-digit revenue growth in our strategic imperatives,” said CFO Martin Schroeter on the company’s earnings call this week. “Over the last 12 months, strategic imperatives delivered $31 billion in revenue, and now represent 38% of IBM.

“Growth was led by cloud, where our revenue was up 30% to $3.4 billion in the quarter, and over $11.5 billion over the last year so good progress in cloud. Looking at revenue from a segment perspective, the strongest growth came from cognitive solutions led by our analytics and cognitive capabilities and security.”

Schroeter was keen to emphasise the impact Watson is having on the business, as the team continues its journey to redefine Big Blue in the age of cloud computing. Numerous customer wins were listed for IBM in the cognitive computing sector, as the company continues to champion Watson as a platform to bring together the digital business with digital intelligence, improving decision-making and adding intelligence to products and processes. Watson will remain the jewel in the crown of Big Blue as the company moves towards the new digital era.

Despite revenues continuing to fall, the team made a number of positive launches throughout the quarter. Quantum computing is now available on the IBM cloud, the team launched a new partnership with Box to counter the impact of EU-US Privacy Shield on its international business, and an expanded partnership with VMware broadened the reach of its security portfolio.

In terms of the specific segments, revenues in the cognitive business rose 4%, though this is down from 9% growth in the previous quarter; solutions software revenue was up 6% for the quarter; SaaS was another area which recorded triple-digit growth; and Schroeter claims IBM’s security business outperformed the market by three times. The IBM Interactive Experience unit also demonstrated healthy growth, as the team continues its journey into an entirely new market for Big Blue.

“We have opened over 30 digital studios around the globe including new studios in Singapore and Seoul,” said Schroeter. “We also completed the acquisition of Aperto, a digital agency in Berlin with over 300 employees and a roster of enterprise clients such as Airbus and Siemens.”

One area which has caught the headlines in recent weeks is the impact of Brexit on the fortunes of the technology sector. Despite concerns from various corners of the industry, it would not appear to have had a significant impact on the long-term vision of IBM.

“I don’t think that Brexit coming at the end of the quarter helped us at all, but we obviously finished kind of right where we expected to finish,” said Schroeter. “And when we look at our full view of the year, we don’t see an impact, if you will, that has any real materiality on us.

“What I typically observe in these kinds of instances is that our discussions with our clients have to go through a process of reprioritization. So as they reprioritize, the length of time that takes depends a lot on how much uncertainty they’re faced with. And obviously, the political leadership in Europe and the UK can help reduce that uncertainty, but we didn’t see – again, we don’t think it helped but it didn’t cause us to change our guidance.”

While revenues have continued to fall, the tech giant would appear to be heading in the right direction. The strategic imperatives business units now account for a larger proportion of the overall figures, 38%, indicating the tide may be turning for IBM. Schroeter also highlighted the team is not content relying solely on the progress of Watson: IBM has acquired 20 companies in the last twelve months, which are now beginning to contribute in a more significant manner.

Although progress is starting to be seen, it has not been an entirely smooth ride for IBM. There have been numerous new product launches and advances into new market segments, though this has come at the cost of more than 70,000 redundancies over recent months. And while there has been a slight increase in share price following the announcement, previous performance has taken its toll: shares in Big Blue have dropped 17% since CEO Virginia Rometty took over in January 2012, while the S&P 500 index rose 70% over the same period.

Google adds image recognition to growing AI portfolio

Google has continued its charge on the artificial intelligence market by purchasing French image recognition startup Moodstocks, reports Telecoms.com.

Moodstocks, founded in 2008, develops machine learning-based image recognition technology for smartphones, which has been described by developers as the ‘Shazam for images’. Financials of the agreement have not been confirmed to date.
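Moodstocks’ actual technology is proprietary, but the ‘Shazam for images’ idea (fingerprint an image, then match the fingerprint against a catalogue) can be illustrated with a toy average-hash over tiny grayscale thumbnails; real systems use far more robust local features, and the catalogue below is invented.

```python
# Toy image matching: 1 bit per pixel relative to the mean brightness,
# then nearest catalogue entry by Hamming distance.
def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255)."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Hypothetical 2x2 thumbnails of catalogue images.
catalogue = {
    "logo": average_hash([200, 190, 30, 40]),
    "beach": average_hash([220, 210, 180, 60]),
}
query = average_hash([205, 185, 35, 45])  # a slightly different shot of the logo
best = min(catalogue, key=lambda name: hamming(catalogue[name], query))
print(best)  # prints "logo"
```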

“Ever since we started Moodstocks, our dream has been to give eyes to machines by turning cameras into smart sensors able to make sense of their surroundings,” Moodstocks said on its website. “Today, we’re thrilled to announce that we’ve reached an agreement to join forces with Google in order to deploy our work at scale. We expect the acquisition to be completed in the next few weeks.”

Artificial intelligence is one of the focal points of the Google strategy moving forward, which was confirmed by Google CEO Sundar Pichai during the company’s recent earnings call, though the focus can be dated back to the $625 million DeepMind acquisition in 2014. Although DeepMind is arguably the most advanced AI system in the industry, and Telecoms.com readers recently named Google the leader in the AI segment in a poll, the company has seemingly been playing catch-up with the likes of Watson and AWS, whose offerings have been in the public eye for a substantially longer period of time.

The recognition tools are most likely to be incorporated into the Android operating system, though Moodstocks customers will be able to continue using the service until the end of their subscriptions. Moodstocks will be folded into Google’s R&D centre in France, where the team will work alongside engineers focusing on the development of YouTube and Chrome, two offerings where there could be a link to the Moodstocks technology.

“Many Google services use machine learning to make them simpler and more useful in everyday life, such as Google Translate, Smart Reply in Inbox, or the Google app,” said Vincent Simonet, head of the R&D centre of Google’s French unit. “We have made great strides in terms of visual recognition: now you can search in Google Photos for terms such as ‘party’ or ‘beach’ and the application will offer you relevant pictures, without you ever having needed to categorize them manually.”

Last month, Google also announced it was expanding its machine learning research team by opening a dedicated office in Zurich. The team will focus on three areas specifically: machine intelligence, natural language processing and understanding, and machine perception.

Elsewhere in the industry, Twitter completed the acquisition of Magic Pony last month, reportedly for $150 million. Magic Pony, which offers visual processing technology, was one of the more public moves made by the social media network, whose relative quiet could be seen as surprising given how well the platform lends itself to the implementation of AI. Microsoft also announced the purchase of Wand Labs, building on the ‘Conversation-as-a-Platform’ proposition put forward by CEO Satya Nadella at Build 2016.

Image recognition startup joins Google in France

Googlers having funGoogle has continued its charge on the artificial intelligence market through purchasing French image recognition startup Moodstocks, reports Telecoms.com.

Moodstocks, founded in 2008, develops machine-learning based image recognition technology for smartphones, which has been described by developers as the ‘Shazam for images’. Financials of the agreement have not been confirmed to date.

“Ever since we started Moodstocks, our dream has been to give eyes to machines by turning cameras into smart sensors able to make sense of their surroundings,” Moodstock said on its website. “Today, we’re thrilled to announce that we’ve reached an agreement to join forces with Google in order to deploy our work at scale. We expect the acquisition to be completed in the next few weeks.”

Artificial intelligence is one of the focal points of the Google strategy moving forward, which was confirmed by Google CEO Sundar Pichai during the company’s recent earnings call, though the focus can be dated back to the $625 million DeepMind acquisition in 2014. Although DeepMind is arguably the most advanced AI system in the industry, Telecoms.com readers recently confirmed in a poll Google was the leader in the AI segment, it has seemingly been playing catch up with the likes of Watson and AWS whose offerings have been in the public eye for a substantially longer period of time.

The recognition tools are most likely to be incorporated into the Android operating system, though Moodstocks customers will be able to continue to use the service until the end of their subscription. Moodstocks will be incorporated into Google’s R&D centre in France, where the team will work alongside engineers who are focusing on the development of Youtube and Chrome, two offerings where there could be a link to the Moodstocks technology.

“Many Google services use machine learning (or machine learning) to make them simpler and more useful in everyday life – such as Google Translate, Smart Reply Inbox, or the Google app,” said Vincent Simonet, Head of R&D centre of Google’s French unit. “We have made great strides in terms of visual recognition: now you can search in Google Pictures such as ‘party’ or ‘beach’ and the application will offer you good pictures without you and have never needed to categorize them manually.”

Last month, Google also announced it was expanding its machine research team by opening a dedicated office in Zurich. The team will focus on three areas specifically, machine intelligence, natural language processing & understanding, as well as machine perception.

Elsewhere in the industry, Twitter completed the acquisition of Magic Pony last month, reportedly for $150 million. Magic Pony, which offers visual processing technology, represented one of the more public AI moves made by the social media network, which has been notably quiet in the space – unusual, given the platform lends itself well to the implementation of AI. Microsoft also announced the purchase of Wand Labs, building on the ‘Conversation-as-a-Platform’ proposition put forward by CEO Satya Nadella at Build 2016.

What did we learn from EMC’s data protection report?

EMC has recently released its Global Data Protection Index 2016, in which it claims only 2% of the world would be considered ‘leaders’ in protecting their own assets, reports Telecoms.com.

Data has dominated the headlines in recent months as breaches have made customers question how well enterprise organizations can manage and protect data. Combined with transatlantic disagreements in the form of Safe Harbour and law agencies’ access to personal data, the ability to remain secure and credible is now more of a priority for decision makers.

“Our customers are facing a rapidly evolving data protection landscape on a number of fronts, whether it’s to protect modern cloud computing environments or to shield against devastating cyber-attacks,” said David Goulden, CEO of EMC Information Infrastructure. “Our research shows that many businesses are unaware of the potential impact and are failing to plan for them, which is a threat in itself.”

EMC’s report outlined a number of challenges and statistics which suggest the majority of the industry is not where it should be with regard to data protection. While only 2% of the industry would be considered leaders in the data protection category, 52% are still evaluating the options available to them. Overall, 13% more businesses suffered data loss in the last twelve months, compared to the same period prior to that.

But what are the over-arching lessons we learned from the report?

Vendors: Less is more

A fair assumption for most people would be the more protection you take on, the more protected you are. This just seems logical. However, the study shows the more vendors you count in your stable, the more data you will leak.

In the average data loss incident a company loses 2.36TB of data, which would be considered substantial; however, it could be worse. The study showed organizations which used one vendor lost on average 0.83TB per incident, those with two vendors 2.04TB and those with three 2.58TB. For those who used four or more vendors, an average of 5.47TB of data was lost per incident.

Common sense would dictate the more layers of security you have, the more secure you will be; however, this is only the case if the systems are compatible with each other. It should be highlighted that those who lost the larger data sets are likely to be the larger companies, with more data to lose, though the study does seem to suggest there needs to be a more co-ordinated approach to data protection.

And they are expensive…

Using the same breakdown as before, the average cost of lost data was $900,000. For those with one vendor the cost was $636,361, for those with two it was $789,193, and for those with three vendors the cost was just above the average at $911,030. When companies bring in four or more vendors, the average cost of data loss rises to $1.767 million.

China and Mexico are the best

While it may be surprising, considering many of the latest breakthroughs in the data world have come from Silicon Valley or Israel, China and Mexico are the two countries which would be considered furthest ahead of the trend for data protection.

EMC graded each country on how effective it was at implementing the right technologies and culture to prevent data loss within organizations themselves. 17 countries featured ahead of the curve, including the usual suspects of the UK (13.5% ahead of the curve), the US (8%), Japan (1%) and South Korea (9%); however, China and Mexico led the charge, at 20% and 17% ahead respectively.

While it may not be considered that unusual for China to have a strong handle on data within its own borders, Mexico is a little more surprising (at least to us at Telecoms.com). The country has gone through somewhat of a technology revolution in recent years, with mobile penetration growing over the last 20 years from 10% of the population to 68% this year, 70% of which are smartphone users. Mexico is now the 11th largest economy in terms of purchasing power, with millennials the largest demographic. With the population becoming more affluent, and no longer constrained by the faults of the pre-internet world, the trend should continue. Keep up the good work Mexico.

Human error is still a talking point

When looking at the causes of data loss, the results were wide-ranging, though causes which cannot be controlled were at the top of the list: hardware failure, power loss and software failure accounted for 45%, 35% and 34% respectively.

That said, the industry does now appear to be taking responsibility for the data itself. The study showed only 10% of data loss incidents were blamed on the vendor. A couple of weeks ago we spoke to Intel CTO Raj Samani, who highlighted to us that the attitude towards security (not just data protection) needs to shift, as there are no means to outsource risk. Minimizing risk is achievable, but irrespective of what agreements are undertaken with vendors, the risk still remains with you. As fewer people are blaming the vendors, it would appear this responsibility is being realized.

Human error is another area which remains high on the agenda, as the study showed it accounts for 20% of all instances of data loss. While some of these instances can be blamed on leaving a laptop in the pub or losing a phone on the train, there are examples where simple mistakes in the workplace are to blame. These will never be removed entirely, as numerous day-to-day decisions are made on intuition and gut-feel, which remain a necessity for certain aspects of the business.

An area which could be seen as a potential danger is artificial intelligence. As AI advances as a concept, machines will become more human-like, and thus more capable of making decisions based on intuition. If this is the ambition, surely an intuitive decision-making machine would present a security weakness in the same way a human does. Admittedly the risk per decision would be substantially smaller, but on the other hand, the machine would be making many times more decisions than the human.

All in all, the report raises more questions than it answers. While security has been pushed to the top of the agenda for numerous organizations, receiving additional investment and attention, it does not appear the same organizations are getting any better at protecting themselves. The fact that 13% more organizations suffered data loss in the last 12 months suggests it could be getting worse.

To finish, the study asked whether respondents felt their organization was well enough protected; only 18% believe they are.

Twitter acquires machine learning start-up Magic Pony

Twitter has stepped up its efforts in the machine learning arena after announcing the acquisition of visual processing technology company Magic Pony.

While the company claims machine learning is central to the brand’s capabilities, it has been relatively quiet in the market segment in comparison to industry heavyweights such as IBM, Google and Microsoft. This is the third acquisition the team has made in this area – reportedly in the range of $150 million – following the purchase of Whetlab last year and Mad Bits in 2014. Google, by comparison, acquired Jetpac, Dark Blue Labs and Vision Factory, and spent $500 million on DeepMind, all in 2014.

“Machine learning is increasingly at the core of everything we build at Twitter,” said Jack Dorsey, Twitter CEO. “Magic Pony’s machine learning technology will help us build strength into our deep learning teams with world-class talent, so Twitter can continue to be the best place to see what’s happening and why it matters, first. We value deep learning research to help make our world better, and we will keep doing our part to share our work and learnings with the community.”

The acquisition follows Twitter’s announcement last week that advertisers will now be able to utilize emoji keyword targeting for Twitter Ads. Although a simple proposition in the first instance, the new feature does open up the opportunity for machine learning enhanced advertising solutions.

Magic Pony, which was founded in 2014 and currently has 11 employees, was acquired to bolster the visual experiences that are delivered across Twitter apps. The team will link up with Twitter Cortex, the in-house machine learning department, to improve image processing expertise.

The technology itself makes use of convolutional neural networks to upscale an image. Taking the information in a picture, the technology imagines a larger and more detailed image by filling in the detail it expects to see. Much in the same way a human can imagine the rest of a car by seeing the door, the technology learns lessons from previous experience and applies logical decisions moving forward.
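In rough terms, this class of super-resolution works by enlarging an image and then refining it with learned convolutional filters. A toy sketch of that pipeline in plain NumPy, for a single-channel image, with a hand-picked smoothing kernel standing in for the learned weights (Magic Pony’s actual models are not public):

```python
import numpy as np

def upscale_nearest(img, factor=2):
    # Naive enlargement: repeat each pixel along both axes.
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

def conv2d_same(img, kernel):
    # Cross-correlation with zero padding ('same' output size),
    # which is what deep-learning "convolution" layers compute.
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def super_resolve(img, kernel, factor=2):
    # Crude single-layer "network": enlarge, then refine with the filter.
    return conv2d_same(upscale_nearest(img, factor), kernel)

# 3x3 averaging kernel as a stand-in for learned filter weights.
kernel = np.full((3, 3), 1.0 / 9.0)
lowres = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
hires = super_resolve(lowres, kernel)
print(hires.shape)  # → (4, 4)
```

A real system stacks many such convolutions and learns the kernel weights from pairs of low- and high-resolution images; the sketch only shows the enlarge-then-filter structure the article describes.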

Magic Pony itself was initially supported by investment from Octopus Ventures, which has seemingly found a specialty in spotting promising AI start-ups. Prior to Magic Pony being acquired by Twitter, Octopus Ventures invested in Evi, which was acquired by Amazon in 2012, and SwiftKey, which was acquired by Microsoft this year.

“Today marks a great day for the Magic Pony team,” said Luke Hakes, Investment Director at Octopus Ventures. “We’re proud to have believed in the concept early on and to then have had the privilege of joining their journey. The technology Magic Pony has developed is revolutionary and pushes the boundaries of what is possible with AI in the video space.

“The UK continues to grow as the ‘go-to’ place for companies looking to build best in breed AI technology – Octopus has been fortunate to work with the founders of three companies in this space that have gone on to be acquired, with Evi and Amazon, SwiftKey and Microsoft, and now Magic Pony and Twitter. We are excited for the Magic Pony team, but also to take what we have learnt on the last three journeys and help the next generation of entrepreneurs lead the way in the on-going AI revolution.”