Exploring the journey from cloud to AI – with a few big data bumps along the way

The potential of cloud computing and artificial intelligence (AI) is irresistible. Cloud represents the backbone for any data initiative, and then AI technologies can be used to derive key insights for both greater business intelligence and topline revenue. Yet AI is only as good as the data strategy upon which it sits.

At the AI & Big Data Expo in Amsterdam today, delegates got to see the proof of the pudding through NetApp's cloud and data fabric initiatives, with DreamWorks Animation cited as a key client that has transformed its operations.

For the cloud and AI melting pot, however, there are other steps which need to be taken. Patrick Slavenburg, a member of the IoT Council, opened the session with an exploration of how edge computing is taking things further. As Moore's Law finally begins to run out of steam, Slavenburg noted, there are up to 70 startups working solely on new microprocessors today.

Noting how technology history tends to repeat itself, he added that today is a heyday for microprocessor architecture for the first time since the 1970s. The key for edge computing is being able to perform deep learning at that architectural level, using more lightweight algorithms.

Florian Feldhaus, enterprise solutions architect at NetApp, stressed that data is the key to running AI. According to IDC, by 2020 90% of corporate strategies will explicitly mention data as a critical enterprise asset and analytics as an essential competency. "Wherever you store your data, however you manage it, that's the really important piece to get the benefits of AI," he explained.

The industry continues to insist that it is a multi-cloud, hybrid cloud world today. It is no longer a choice between Amazon Web Services (AWS), Microsoft Azure or Google Cloud Platform (GCP), but a question of which workloads fit which cloud. The same applies to what your company's data scientists are doing, added Feldhaus: they need to be able to use data wherever they want – in every cloud – with the data moved around to make it available to them.

"You have to fuel data-driven innovation on the world's biggest clouds," said Feldhaus. "There is no way around the cloud." With AI services available in seconds, this was a key point in terms of getting to market. It is also the key metric for data scientists, he added.

NetApp has been gradually moving away from its storage heritage to focus on its 'data fabric' offering – an architecture which offers access to data across multiple endpoints and cloud environments, as well as on-premises. The company announced yesterday an update to its data fabric, with greater integration across Google's cloud as well as support for Kubernetes.

Feldhaus noted the strategy was based on NetApp 'wanting to move to the next step'. DreamWorks was one customer looking at this future, with various big data pipelines allied to the need to process data in a short amount of time.

Ultimately, if organisations want to make the most of the AI opportunity – and time is running out for laggards – then they need their data strategy sorted out. Yes, not everything can be moved to the cloud, and some legacy applications need a lot of care and attention, but a more streamlined process is possible. Feldhaus said NetApp's data fabric had four key constituents: discovering the data, activating it, automating it, and finally optimising it.


HPE banks on AI and analytics for cloud success


Jane McCallion

18 Jun, 2019

HPE has beefed up its cloud offerings at its annual Discover conference this week with a suite of updates to its cloud infrastructure and management portfolio.

The company’s flagship cloud-based network management product, Aruba Central, now sports AI-powered analytics technology integrated from Aruba NetInsight and User Experience Insight. These, the company says, will help IT professionals resolve intermittent network issues quickly while also identifying ways to optimise customers’ infrastructure for a better overall experience.

Aruba Central also has enhancements in the software-defined arena. The new SD-WAN Orchestrator has an eye on edge computing, allowing administrators to easily deploy flexible and secure topologies in large-scale edge environments. According to HPE, this will allow organisations to connect thousands of branches to multiple data centres.

There’s also a new SaaS prioritisation feature that enhances the performance of SaaS applications for end users while also giving visibility and feedback of this experience to administrators.

Finally in the Aruba business segment, Virtual Gateways are now available for AWS and Azure.

It’s not just Aruba Central that’s had a spruce up, though.

HPE has also announced extensions to two of its most recent partnerships: Google Cloud and Equinix.

Organisations running containerised workloads on ProLiant servers or Nimble Storage on-premises can now use HPE Cloud Volumes with Google Cloud Anthos for hybrid disaster recovery, hybrid continuous integration/continuous development (CI/CD), or similar.

This offering is also available through HPE GreenLake, the company’s consultancy and pay-per-use service.

With Equinix, businesses using the company’s colocation services that are looking for cloud-based disaster recovery, backup and recovery, or test and development environments can now get Data as a Service based on HPE Cloud Volumes through the Equinix Marketplace.

All of these services are available immediately.

Affordable Larger Server Memory for Private and Public Cloud | @CloudEXPO @ScaleMP #Cloud

ScaleMP is presenting at CloudEXPO 2019, held June 24-26 in Santa Clara, and we’d love to see you there. At the conference, we’ll demonstrate how ScaleMP is solving one of the most vexing challenges for cloud — memory cost and limit of scale — and how our innovative vSMP MemoryONE solution provides affordable larger server memory for the private and public cloud. Please visit us at Booth No. 519 to connect with our experts and learn more about vSMP MemoryONE and how it is already serving some of the world’s largest data centers. Click here to schedule a meeting with our experts and executives.


What matters most in business intelligence 2019: Key enterprise use cases

  • Improving revenues using BI is now the most popular objective enterprises are pursuing in 2019
  • Reporting, dashboards, data integration, advanced visualisation, and end-user self-service are the most strategic BI initiatives underway in enterprises today
  • Operations, executive management, finance, and sales are primarily driving business intelligence (BI) adoption throughout enterprises today
  • Tech companies’ operations and sales teams are the most effective at driving BI adoption across industries surveyed, with advertising driving BI adoption across marketing

These and many other fascinating insights are from Dresner Advisory Associates' 10th edition of its popular Wisdom of Crowds® Business Intelligence Market Study. The study is noteworthy in that it provides insights into how enterprises are expanding their adoption of Business Intelligence (BI) from centralised strategies to tactical ones that seek to improve daily operations. The Dresner research team's broad assessment of the BI market makes this report unique, including its use of visualisations that provide a strategic view of market trends. The study is based on interviews with respondents from the firm's research community of over 5,000 organisations as well as vendors' customers and qualified crowdsourced respondents recruited over social media. Please see pages 13 – 16 for the methodology.

Key insights from the study include the following:

Operations, executive management, finance, and sales are primarily driving business intelligence (BI) adoption throughout their enterprises today

More than half of the enterprises surveyed see these four departments as the primary initiators or drivers of BI initiatives. Over the last seven years, Operations departments have increased their influence over BI adoption more than any other department included in the current and previous surveys. Marketing and Strategic Planning are also the most likely to be sponsoring BI pilots and looking for new ways to introduce BI applications and platforms into daily use.

Tech companies’ operations and sales teams are the most effective at driving BI adoption across industries surveyed, with advertising driving BI adoption across marketing

Retail/Wholesale and Tech companies' sales leadership is primarily driving BI adoption in their respective industries. It's not surprising to see that the leading influencer among Healthcare respondents is resource-intensive HR. The study found that Executive Management is most likely to drive business intelligence in consulting practices.

Reporting, dashboards, data integration, advanced visualisation, and end-user self-service are the most strategic BI initiatives underway in enterprises today

Second-tier initiatives include data discovery, data warehousing, data mining/advanced algorithms, and data storytelling. Comparing the last four years of survey data, Dresner's research team found reporting retains its all-time-high score as the top priority, while data storytelling, governance, and data catalog hold momentum.

BI software providers most commonly rely on executive-level personas to design their applications and add new features

Dresner's research team found all vertical industries except Business Services target business executives first in their product design and messaging. Given the customer-centric nature of advertising and consulting services business models, it is understandable why BI vendors focus on customer personas when selling to them. The following graphic compares targeted users for BI by industry.

Improving revenues using BI is now the most popular objective in 2019, despite BI initially being positioned as a solution for compliance and risk management

Executive Management, Marketing/Sales, and Operations are driving the focus on improving revenues this year. Nearly 50% of enterprises now expect BI to deliver better decision making, making reporting and dashboards must-have features. Interestingly, enterprises aren't looking to BI as much for improving operational efficiencies, cost reductions or competitive advantage.

Over the last 12 to 18 months, more tech manufacturing companies have initiated new business models that require their operations teams to support a shift from products to services revenues. An example of this shift is the introduction of smart, connected products that provide real-time data that serves as the foundation for future services strategies.

In aggregate, BI is achieving its highest levels of adoption in R&D, executive management, and operations departments today

The growing complexity of products and business models in tech companies, and the increasing reliance on analytics and BI in retail/wholesale to streamline supply chains and improve buying experiences, are contributing factors to the increasing levels of BI adoption in these three departments. The following graphic compares BI's level of adoption by function today.

Enterprises with the largest BI budgets this year are investing more heavily into dashboards, reporting, and data integration

Conversely, those with smaller budgets are placing a higher priority on open source-based big data projects, end-user data preparation, collaborative support for group-based decision-making, and enterprise planning. The following graphic provides insights into technologies and initiatives strategic to BI at an enterprise level by budget plans.

Marketing/sales and operations are using the greatest variety of BI tools today

The survey shows how conversant Operations professionals are with the BI tools in use throughout their departments: every one of them knows how many, and most likely which types of, BI tools are deployed. Across all industries, Research & Development (R&D), Business Intelligence Competency Center (BICC), and IT respondents are most likely to report they have multiple tools in use.


What to expect from HPE Discover 2019


Jane McCallion

17 Jun, 2019

As the low-pressure vortex continues to bring traditional summertime weather to the UK, I’ve hopped on a plane to head to this month’s hottest Las Vegas gathering: HPE Discover 2019 (and, with an expected high of 42 degrees Celsius on Wednesday, it will be hot, literally).

Once again, Hewlett Packard Enterprise (HPE) is gathering customers, partners, executives and a smattering of journalists in the Sands Expo Center to tell us about its successes of the past year, provide more information on new products and services it’s ready to launch, and its wider plans for the future.

In previous years, it's been a bit easier to speculate on what might come up during the main keynote. We've had former CEO Meg Whitman saying 'goodbye', current CEO Antonio Neri saying 'hello' and, before all that, the split of HP into HPE and HP Inc.

Things have been a bit more sedate over the past 12 months though, which does make my job of predicting what will happen this week a little more difficult. That’s not to say there’s nothing to talk about, though.

The company recently bought Cray, adding yet another string to its high performance computing (HPC) bow. I don't expect a high level of detail on what we can expect from this acquisition in the long term (although I would assume that, rather like SGI, it will be absorbed into the general HPE IP pool), but I'd be surprised if it isn't at least mentioned.

It's also interesting to note that Keerti Melkote, HPE's president of intelligent edge and co-founder of Aruba (now HPE's networking business), will be joining Neri during his keynote on Tuesday. Something tells me we may be in for some big edge computing and/or networking news.

Other perennial topics for HPE include AI and machine learning, hybrid and as-a-service IT – including its own GreenLake offering – and high-performance enterprise-grade storage.

I suspect we will also hear about the company's recently returned 'space computer' – two Apollo servers that were sent to the International Space Station (ISS) in 2017 to see how this kind of technology performs in orbit, and which landed back on terra firma a couple of weeks ago. While I'm not expecting a big song and dance, it would be a missed opportunity for the company not to at least celebrate the achievement in some way.

Something I’m expecting to hear less about, however, is The Machine. Originally billed as a completely new form of architecture, it then morphed into a research project and seems perhaps to have been gently absorbed into HPE Labs. Not really something that warrants top billing.

HPE Discover kicks off today with the channel partner conference, with the main event getting underway tomorrow. Stay tuned to Channel Pro, Cloud Pro and IT Pro for all the latest news and analysis from the show.

BMC Software Named “Gold Sponsor” of @CloudEXPO Silicon Valley | @BMCSoftware #HybridCloud #AI #AIOps #AWS #DevOps #Serverless #Kubernetes

BMC has unmatched experience in IT management, supporting 92 of the Forbes Global 100, and earning recognition as an ITSM Gartner Magic Quadrant Leader for five years running. Our solutions offer speed, agility, and efficiency to tackle business challenges in the areas of service management, automation, operations, and the mainframe.


How shared responsibility means CIOs and CFOs need to be close partners

In today’s complex business ecosystem, the relationship between the CIO and the CFO has to be closely aligned, which goes beyond just an agreement on the budget. The CIO and the CFO have to be as one, working together as two strong pillars that ensure the organisation meets every demand the regulators, shareholders, customers, partners and employees place on it. The days of tension between the CIO and CFO are long gone.

Whether your day-to-day responsibility is financial or technological, there is a binding commonality between you both: as CIO and CFO, you share the responsibility for risk and compliance in the organisation. That shared responsibility extends to a complete understanding of each other's roles and challenges. As CIO, you should be well versed in the financial processes and demands, and a transformational CFO is well briefed and often passionate about technology.

With technology underpinning every aspect of an organisation and becoming increasingly important, the CIO's responsibilities have evolved beyond managing a set of assets and services that are a cost centre to the organisation. Technology is now a foundation of the business, no matter its vertical market. If we accept that technology underpins our organisations, it is important to realise that this brings increased risk. With this risk comes greater demand for compliance: the last decade has seen a wealth of regulations, all of which bring a technology compliance element to your organisation.

It is, therefore, vital for the CIO to be able to articulate risk to the CFO in financial terms, whether it be data or cyber security threats. This is critical because any problems that cause risk have a technical and operational impact and solution. For example, investing in new infrastructure or balancing the organisation's concerns about intellectual property protection when using the public cloud are technology questions. But they are clearly business questions too. If the organisation opts to delay infrastructure investment, it could impact business continuity. That in turn impacts the customer experience.

These may sound like decisions that are the sole domain of the CIO, but that is not the case. If technology is truly at the heart of the organisation, then the responsibility for these decisions has to be shared with your peers in the C-suite.

The CIO has to understand, extremely well, the financial system of the organisation, and be able to calculate the risks in a way that the CFO finds accurate and helpful. Together, you need to be able to take these assessments to the audit or risk committee and on to the board. This entire discussion is not technical; it is a financial discussion. In fact, you should not have a C in your job title until you realise that part of your job entails being able to hold your own in financial discussions.

Honorific title

There are a lot of people who have grown up through the technology ranks and eventually get a CIO title. And, in my experience, there are many who are CIO in name only; the title is somewhat honorific.

Am I being disloyal to my CIO peers? No. You can tell the difference between CIOs that are fluent in financial terms and ideas and those that aren’t.

IT leaders are new to the C-suite. Many CIOs have come up from an operational background, which has required and benefited from our detail orientation. Important as detail is in the C-suite, so too is a collaborative approach, and this is where CIOs need to develop their skill set. Because, in truth, the C-suite is a lifeboat with just one packet of biscuits to survive on. The team that shares the biscuits of responsibility is the one that will sail into a safe harbour. Any group of executives at the C-level are all working hard to figure out how to direct the company, and they all share the authority.

Lastly, a good relationship with the CFO – and everyone else at the C-level – in no way detracts from the relationship a CIO must have with the CEO. Buy-in and participation from the CEO is vital. Everyone in the C-level needs the CEO’s support and understanding. That CEO air cover is most effective when it is shared between the CFO and CIO. 


VMware to acquire cloud startup Avi Networks


Bobby Hellard

14 Jun, 2019

VMware has signed a “definitive agreement” to acquire Avi Networks, a startup that helps companies deliver cloud-based applications.

The software virtualisation company said it plans to introduce the Avi platform to VMware customers and partners to help enterprises adopt software-defined application delivery across data centres and clouds.

Once the deal is closed, the VMware and Avi Networks teams will work together to advance VMware's Virtual Cloud Network vision, build out full-stack L2-7 services and deliver the public cloud experience for on-prem environments.

The VMware vision, according to Tom Gillis, SVP and general manager of the network and security business unit, is to deliver the "public cloud experience" to developers regardless of what underlying infrastructure they are running, meaning agility – the ability to quickly deploy new workloads, try new ideas and "iterate".

This seems like a key driver behind the deal as VMware said that modern infrastructure needs to provide agility wherever it executes, either on premises, in hybrid cloud deployments, or in native public clouds.

"Application Delivery Controllers (ADCs) are a critical pillar of a software-defined data centre," he said. "Many workloads cannot be deployed without one. For many customers, this means writing their application to bespoke and proprietary APIs that are tied to expensive hardware appliances.

“The Avi Networks team saw this problem and solved it in the right way. They built a software architecture that is truly scale-out, with a centralised controller. This controller manages not just the configuration of the individual load balancers but also manages their state. This architecture mirrors the approach of our groundbreaking software-defined networking solution VMware NSX.”

VMware called Avi Networks a “pioneering” startup, due to its software-defined ADC architecture that is fully distributed, auto scalable, and intrinsically more secure. The Avi Platform enables elastic load balancing, application acceleration and security services combined with centralised management and orchestration for consistent policies and operations.

Unlike traditional ADCs, it does not require custom appliances and can be consumed on-prem, in public clouds, or as a service, enabling new flexibility and faster time to value at lower costs.

How to get your data scientist career up and running: A guide

Note: The most common request from this blog's readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I've invited Alyssa Columbus, a data scientist at Pacific Life, to share her insights and lessons learned on breaking into the field of data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first, isn't easy, not least because analytics job-seekers outnumber analytics jobs.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I attained my current data science position at Pacific Life. I've referred them to many different resources, including discussions I've had on the Dataquest.io blog and the Scatter Podcast. In the interest of providing job seekers with a comprehensive view of what I've learned that works, I've put together the five most valuable lessons, and written this article to make your data science job hunt as easy and efficient as possible.

Continuously build your statistical literacy and programming skills

Currently, there are 24,697 open data scientist positions on LinkedIn in the United States alone. Using data mining techniques to analyse all of those open positions, the following list of the top 10 data science skills was created.

As of April 14, the top three most common skills requested in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and TensorFlow. The following graphic provides a prioritised list of the most in-demand data science skills mentioned in LinkedIn job postings today.
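
To make the mining step concrete, here is a minimal sketch of how such a skills ranking could be produced – an illustration only, not the methodology behind the figures above. The skill list and sample postings are invented for the example.

```python
from collections import Counter

# Candidate skills to rank, matched as whole words in posting text.
SKILLS = ["python", "r", "sql", "jupyter", "awk", "aws", "tensorflow"]

# In practice this would be thousands of scraped posting descriptions.
postings = [
    "Seeking a data scientist fluent in Python, SQL and AWS.",
    "Must know R, SQL and Jupyter notebooks.",
    "Experience with Python and TensorFlow preferred.",
]

counts = Counter()
for text in postings:
    words = set(text.lower().replace(",", " ").replace(".", " ").split())
    counts.update(skill for skill in SKILLS if skill in words)

for skill, n in counts.most_common():
    print(f"{skill}: mentioned in {n} of {len(postings)} postings")
```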

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn's job postings prioritise. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and not applying what you've learned to real problems. Your applied experience is just as important as your academic experience: taking statistics and computer science classes helps to translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for what the big questions to ask of your dataset are. Statistical literacy – "how" to find the answers to your questions – comes with education and practice. Strengthening your intellectual curiosity, or insight into asking the right questions, comes through experience.
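
As a sketch of what getting your hands dirty looks like in practice, the snippet below asks the first questions worth asking of any new dataset. The file and column names are hypothetical – substitute any CSV you want to practise on.

```python
import pandas as pd

# Hypothetical dataset -- swap in any CSV file you want to explore.
df = pd.read_csv("house_prices.csv")

# First questions for any new dataset: size, types, and missing values.
print(df.shape)
print(df.dtypes)
print(df.isna().sum().sort_values(ascending=False).head())

# Summary statistics and a simple grouped view start to build intuition
# about which bigger questions the data can actually answer.
print(df.describe())
print(df.groupby("neighbourhood")["price"].median().sort_values())
```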

Continually build your own unique portfolio of analytics and machine learning projects

Having a good portfolio is essential to being hired as a data scientist, especially if you don't come from a quantitative background or have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist, with both the passion and skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you're most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you've completed on your own. Some skills I'd recommend you highlight in your portfolio include:

  • Your programming language of choice (e.g., Python, R, Julia, etc.).
  • The ability to interact with databases (e.g., your ability to use SQL).
  • Visualisation of data (static or interactive).
  • Storytelling with data. This is a critical skill. In essence, can someone with no background in whatever area your project is in look at your project and gain some new understandings from it?
  • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard); see the sketch after this list.
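
For the deployment bullet, here is a minimal sketch of a portfolio-grade REST API, assuming FastAPI and scikit-learn; the model and field names are chosen for the example, and a real project would serve a model you trained yourself.

```python
# Minimal model-serving API sketch. Save as app.py and run with:
#   uvicorn app:app --reload
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a small stand-in model at startup so the example is self-contained;
# a real portfolio project would load a model you trained and saved.
iris = load_iris()
model = LogisticRegression(max_iter=1000).fit(iris.data, iris.target)

app = FastAPI()

class Measurements(BaseModel):
    sepal_length: float
    sepal_width: float
    petal_length: float
    petal_width: float

@app.post("/predict")
def predict(m: Measurements):
    row = [[m.sepal_length, m.sepal_width, m.petal_length, m.petal_width]]
    return {"species": str(iris.target_names[model.predict(row)[0]])}
```

A project like this shows an employer you can take a model beyond a notebook and expose it to other systems, which is exactly the engineering-plus-storytelling mix the list above describes.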

Julia Silge and Amber Thomas both have excellent examples of portfolios that you can be inspired by. Julia’s portfolio is shown below.

Get (or git!) yourself a website

If you want to stand out, along with a portfolio, create and continually build a strong online presence in the form of a website. Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time and, best of all, it's free to do. A strong online presence will not only help you in applying for jobs; organisations may also reach out to you with freelance projects, interviews, and other opportunities.

Be confident in your skills and apply for any job you’re interested in, starting with opportunities available in your network

If you don't meet all of a job's requirements, apply anyway. You don't have to know every skill (e.g., programming languages) on a job description, especially if there are more than ten listed. If you're a great fit for the main requirements of the job description, you should apply. A good general rule is that if you have at least half of the skills requested on a job posting, go for it. When you're hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I've found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specialising in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:


  • Recruiters
  • Friends, family, and colleagues
  • Career fairs and recruiting events
  • General job boards
  • Company websites
  • Tech job boards

Bring the same level of intensity to improving your communication skills as you do to your quantitative skills – as data scientists need to also excel at storytelling

One of the most important skills for data scientists to have is the ability to communicate results to different audiences and stakeholders so others can understand and act on their insights. Since data projects are collaborative across many teams and results are often incorporated into larger projects, the true impact of a data scientist's work depends on how well others can understand their insights to take further action and make informed decisions.

Alyssa Columbus is a Data Scientist at Pacific Life and a member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher at the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics, and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.


Welcome to the age of the platform


Adam Shepherd

17 Jun, 2019

It’s hard to argue against the idea that Dropbox has been a hugely influential company. It’s synonymous with consumer cloud storage and was one of the first companies to popularise the concept. In many ways, it was instrumental to the growth of cloud computing as a mainstream technology.

Now, however, it’s changing tack and reinventing itself. No longer content to merely be the place where users and teams store their documents, the company wants to become the connective tissue that links all the digital elements of your working life. As founder and CEO Drew Houston puts it, he wants Dropbox to move from the filing cabinet to the boardroom.

In Houston’s view, work revolves around content; whether it’s contracts, proposals, timesheets, web pages or blueprints, files are the one constant within every business. With that in mind, your filesystem should be at the heart of your workflow. This is fundamentally what the newly-redesigned Dropbox is all about: using your files as a jumping-off point for collaboration within the organisation.

“We see no shortage of opportunity to help kind of build this workspace that organises itself, that lets you use any of the tools you want to use. But instead of being organised around the concept of messaging, we think the starting point is really around the content,” Houston tells me.

"What are people talking about when they're in Slack? It's usually content that lives in Dropbox. Or like Salesforce; the salesperson's day revolves around content that lives in Dropbox, because they're getting a proposal together, or they're getting contracts signed. They're round-tripping with Dropbox all day. And if, instead of having to jump back and forth, we can smooth that over, that's really valuable."

The app now boarding at Platform One

It’s fundamentally a platform play. Houston wants Dropbox to be the first thing workers open when they get to their desk and the one they come back to most regularly throughout the day, maximising the amount of time spent in the app and minimising time spent outside it. In order to do that, the company is pursuing a number of integrations with other services such as Zoom, Slack and Jira, which will allow users to take actions within those services without actually leaving the Dropbox app.

“A lot of this fragmentation in the end user experience, IT is also experiencing,” Houston says. “We see a big opportunity to help them kind of wrangle all the different tools that their employees are bringing in and help them get some semblance of control, and visibility back.”

“We want to occupy a little bit of different space and people’s minds, from just being a passive content repository to being the living workspace where you get things done. So moving from the filing cabinet to the conference room, where the difference is, you can still have stuff and content, but then you see people and you can have conversations and you can be up on the whiteboard.”

Dropbox is far from the only company doing this; virtually every other business cloud company is pursuing a similar strategy. Salesforce, Box and even Slack itself have been opening up app stores and bolting on integrations left and right in an effort to make their applications as much of a one-stop-shop for their customers’ needs as possible.

Collaboration has also been a key focus for these companies, incorporating messaging, sharing and other communication functionalities into their products. For some, like Microsoft and Google, this takes the form of integrating their storage platforms with full-blown unified comms solutions or collaboration suites like Teams and Hangouts. For others (including Dropbox), it’s comments and activity tracking.

All this is a clear indication that 'just' doing cloud-based file storage really isn't enough of a differentiator any more. Businesses now require a level of added service: whether it's curating and cataloguing files or helping them seamlessly integrate those files into broader workflows, customers expect their storage platform to take some extra hassle out of modern business.

One big happy family

The benefit of this shift is a potential increase in customer choice and flexibility; you can use Office 365 for its excellent productivity software, but you can also use Dropbox to store your content without being locked into OneDrive, and use Slack for messaging without having to stick to Teams. As these platforms grow in maturity and their integrations grow deeper and more plentiful, businesses will end up with the ability to combine them in whatever way works for them.

On the other hand, this proliferation also runs the risk of increased fragmentation and creep. Without careful management, organisations could easily find themselves with multiple different collaboration and filesharing platforms, all performing nearly identical roles and with content scattered between them. Cost models and licensing also have to be carefully monitored, lest organisations find themselves paying for multiple platforms that fulfil the same role.

These problems can be solved, though. The answer is deep and genuine collaboration between tech companies, such as using machine learning to detect when a file is shared in a Slack channel and automatically uploading it to the relevant Dropbox folder, or Microsoft and Dropbox jointly offering co-branded packages of Office 365 software and Dropbox storage. These ideas might seem unrealistic, but the growing trend towards cooperation and partnership within the tech industry means they’re not as outlandish as they might once have been.
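
As a rough sketch of the first idea – mirroring files shared in Slack into an appropriate Dropbox folder – the snippet below wires the two public APIs together. It assumes a Slack Bolt app with the relevant tokens and event scopes plus a Dropbox access token, and it reduces the 'machine learning' routing step to a keyword stub; all names here are illustrative, not a shipping integration.

```python
import os

import dropbox
import requests
from slack_bolt import App

slack_app = App(token=os.environ["SLACK_BOT_TOKEN"],
                signing_secret=os.environ["SLACK_SIGNING_SECRET"])
dbx = dropbox.Dropbox(os.environ["DROPBOX_TOKEN"])

def route_folder(filename: str) -> str:
    # Stand-in for the ML classifier that would pick the right folder.
    return "/Contracts" if "contract" in filename.lower() else "/Inbox"

@slack_app.event("file_shared")
def mirror_to_dropbox(event, client):
    # Look up the shared file's metadata, download it from Slack, then
    # upload the bytes to whichever folder the (stub) classifier chooses.
    info = client.files_info(file=event["file_id"])["file"]
    data = requests.get(
        info["url_private"],
        headers={"Authorization": f"Bearer {os.environ['SLACK_BOT_TOKEN']}"},
    ).content
    dbx.files_upload(data, f"{route_folder(info['name'])}/{info['name']}")

if __name__ == "__main__":
    slack_app.start(port=3000)
```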

“We’re thinking excitedly about these opportunities,” Dropbox’s group product manager John Hrvatin tells me. “When you’re talking about productivity tools, that data is housed in these silos, and that’s why you end up having to have 10 search boxes. But we’re going through the effort of creating these partnerships with these key partners, because we want to break down those silos. Just the user experience of joining a meeting, or sharing a file, or managing a Trello board [from Dropbox]; just that is user benefit. But search is the other thing that we can fix with these partnerships. And there really isn’t a good way of doing that – unless you build [them].”

Once tech companies figure out a way not just to co-exist but to genuinely collaborate, a world of possibilities opens up. Imagine a system where every single tool or piece of software you use was genuinely and deeply connected; where one search bar allowed you to simultaneously search your cloud storage, emails, hard drive, Slack workspace, GitHub repos and every other tool besides. Imagine if machine learning and OCR could work together to automatically title, tag and organise files – no more vague titles like 'proposalV3.doc' or searching through 18 sub-folders because someone's mis-filed a crucial document.

These sound like utopian pipe dreams, but it’s all technically feasible. All tech companies have to do is stop trying to beat each other to the finish line and start helping each other – and if Dropbox’s latest pivot is anything to go by, that might happen sooner than we think.
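
To see why the search half, at least, is technically feasible, here is a toy sketch of a federated search layer. The connector functions are stubs standing in for real Dropbox, Slack and GitHub API calls; nothing here reflects an actual product.

```python
from typing import Callable

# Stub connectors -- real ones would call each service's search API.
def search_dropbox(q: str) -> list[str]:
    return [f"dropbox: file matching '{q}'"]

def search_slack(q: str) -> list[str]:
    return [f"slack: message mentioning '{q}'"]

def search_github(q: str) -> list[str]:
    return [f"github: repo containing '{q}'"]

CONNECTORS: list[Callable[[str], list[str]]] = [
    search_dropbox, search_slack, search_github,
]

def federated_search(q: str) -> list[str]:
    # One search box, fanned out across every silo, merged into one list.
    hits: list[str] = []
    for connector in CONNECTORS:
        hits.extend(connector(q))
    return hits

print(federated_search("Q3 proposal"))
```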

"We're really at the beginning of that journey," Dropbox CTO Quentin Clarke says. "But we're actually uniquely positioned here, because of our neutrality, because we're so committed to being a system player, we're going to have access to information – because of these integrations with Zoom, Slack, etc – that no other players will even have. So our ability to make something good of that using machine learning is actually super high. And that's something I think you should keep an eye on over the coming months and years as we continue to roll out on top of this foundation."