Exploring the journey from cloud to AI – with a few big data bumps along the way

The potential of cloud computing and artificial intelligence (AI) is irresistible. Cloud represents the backbone for any data initiative, and AI technologies can then be used to derive key insights for both greater business intelligence and top-line revenue. Yet AI is only as good as the data strategy upon which it sits.

At the AI & Big Data Expo in Amsterdam today, delegates saw that the proof of the pudding was in the eating through NetApp's cloud and data fabric initiatives, with DreamWorks Animation cited as a key customer that has been able to transform its operations.

For the cloud and AI melting pot, however, other steps need to be taken. Patrick Slavenburg, a member of the IoT Council, opened the session with an exploration of how edge computing is taking things further. As Moore's Law finally begins to run out of steam, Slavenburg noted, there are up to 70 startups working solely on new microprocessors today.

Noting how technology history tends to repeat itself, he added that today is a heyday for microprocessor architecture for the first time since the 1970s. The key aspect for edge computing is being able to perform deep learning at that architectural level, using more lightweight algorithms.

Florian Feldhaus, enterprise solutions architect at NetApp, emphasised that data is the key to running AI. According to IDC, by 2020 90% of corporate strategies will explicitly mention data as a critical enterprise asset and analytics as an essential competency. "Wherever you store your data, however you manage it, that's the really important piece to get the benefits of AI," he explained.

The industry continues to insist that it is a multi-cloud, hybrid cloud world today. The choice is no longer simply between Amazon Web Services (AWS), Microsoft Azure or Google Cloud Platform (GCP), but a matter of assessing which workloads fit which cloud. This is also the case for what your company's data scientists are doing, added Feldhaus. Data scientists need to be able to use data wherever they want, he said – in every cloud, with the data moved around to make it available to them.
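Feldhaus did not go into implementation detail, but as a purely illustrative sketch of what "moving the data around" between clouds can look like in practice, the snippet below copies an object from an AWS S3 bucket into a Google Cloud Storage bucket using the standard Python SDKs. The bucket and object names are hypothetical placeholders, and credentials are assumed to be configured locally.

```python
# Hypothetical sketch: move a dataset object from AWS S3 to Google Cloud Storage
# so it can be consumed by tooling running in either cloud.
import tempfile

import boto3                      # AWS SDK for Python
from google.cloud import storage  # Google Cloud Storage client


def copy_s3_object_to_gcs(s3_bucket: str, key: str, gcs_bucket: str) -> None:
    """Download an object from S3 and re-upload it to GCS under the same key."""
    s3 = boto3.client("s3")
    gcs = storage.Client()

    with tempfile.NamedTemporaryFile() as tmp:
        s3.download_fileobj(s3_bucket, key, tmp)  # stream the object out of S3
        tmp.flush()
        gcs.bucket(gcs_bucket).blob(key).upload_from_filename(tmp.name)


if __name__ == "__main__":
    # Placeholder bucket and object names for illustration only
    copy_s3_object_to_gcs("training-data-aws", "datasets/images.tar", "training-data-gcp")
```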

"You have to fuel data-driven innovation on the world's biggest clouds," said Feldhaus. "There is no way around the cloud." With AI services available in seconds, this was a key point in terms of getting to market. It is also the key metric for data scientists, he added.

NetApp has been gradually moving away from its storage heritage to focus on its 'data fabric' offering – an architecture that offers access to data across multiple endpoints and cloud environments, as well as on-premises. The company yesterday announced an update to its data fabric, with greater integration with Google's cloud as well as support for Kubernetes.
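The Kubernetes support was not detailed in the session, but conceptually it means cluster workloads can request fabric-backed storage the same way they request any other volume. As a hedged sketch using the Kubernetes Python client, the example below creates a persistent volume claim against a storage class assumed to be backed by NetApp; the class name "netapp-cloud-volumes" is a placeholder, not a documented default.

```python
# Illustrative sketch only: request a persistent volume claim in Kubernetes
# against a storage class assumed (not confirmed) to be provisioned by NetApp.
from kubernetes import client, config


def request_volume(name: str, size: str, storage_class: str, namespace: str = "default") -> None:
    """Create a PersistentVolumeClaim so pods can mount cloud-backed storage."""
    config.load_kube_config()  # use the local kubeconfig for cluster access
    pvc = client.V1PersistentVolumeClaim(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1PersistentVolumeClaimSpec(
            access_modes=["ReadWriteMany"],
            storage_class_name=storage_class,  # placeholder class name
            resources=client.V1ResourceRequirements(requests={"storage": size}),
        ),
    )
    client.CoreV1Api().create_namespaced_persistent_volume_claim(namespace=namespace, body=pvc)


if __name__ == "__main__":
    request_volume("training-data", "500Gi", "netapp-cloud-volumes")
```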

Feldhaus noted the strategy was based on NetApp 'wanting to move to the next step'. DreamWorks was one customer looking at this future, with various big data pipelines and a need to process data in a short amount of time.

Ultimately, if organisations want to make the most of the AI opportunity – and time is running out for laggards – then they need their data strategy sorted out. Yes, not everything can be moved to the cloud, and some legacy applications need a lot of care and attention, but a more streamlined process is possible. Feldhaus said NetApp's data fabric had four key constituents: discovering the data, activating it, automating it, and finally optimising it.
