Spotting the elephant in the room: Why cloud will not burst colo’s bubble just yet

When it comes to the future demand for data centre colocation services, it would be easy to assume there’s an elephant in the room – in the shape of a vast cloud ready to consume all before it.

From what we are seeing, however, alongside our cloud provider hosting services and in line with market forecasts, this is far from the reality. The signs are that colocation can look forward to a vibrant long-term market. CBRE, for example, recently reported that 2019 was another record year of colocation market growth in the so-called FLAP (Frankfurt, London, Amsterdam, Paris) markets. There’s also a growing choice of high-quality colocation facilities thriving in regional UK locations.

Perhaps more telling, however, amid all the excitement and market growth statistics surrounding cloud, some analysts are already predicting that only about half of enterprise workloads will ultimately go into it: best practice and business pressures will see most of the remaining share gradually moving from on-premise to colo – with only a minority remaining on-premise in the long term.

The case for colo

This is because a public cloud platform, while great for scalability, flexibility and ease of access, probably won’t totally satisfy all enterprise application and workload needs. Some will demand extremely high performance while others just need low-cost storage. And unless your own in-house data centre or hosting provider is directly connected to the cloud provider’s network infrastructure, latency is a consideration. This will impact on user experience as well as become a potential security risk. Then of course there’s the governance and security concerns around control of company data.

At the same time, there are serious engineering challenges and costs involved in running private cloud solutions on-premise. The initial set-up is one thing, but there’s also the ongoing support and maintenance involved. For critical services, providing 24-hour technical support can be a challenge.

Sooner or later, therefore, enterprises will inevitably have to address the implications and risks of continuing to run servers in-house for storing and processing large volumes of data and applications. Faced with the rising costs, complexities and security issues involved, many will turn to quality colocation facilities capable of supporting their considerable requirements – from housing servers for running day-to-day applications, legacy IT systems and, in some cases, mission-critical systems, to hosting private or hybrid clouds.

On the hunt

So where’s the elephant? Right now, it is most likely residing in the board rooms of many enterprise businesses. However, the real-life issues and challenges associated with a ‘cloud or nothing’ approach will increasingly come to light and the novelty of instant ‘cloudification’ will wear off. CIOs will once again be able to see the wood for the trees. Many will identify numerous workloads that don’t belong in the cloud, or where the effort or cost of moving them is a barrier.

This journey and eventual outcome is natural – an evolution rather than a sudden and dramatic revolution. It’s a logical process that enterprise organisations and CIOs need to go through, to finally achieve their optimum balance for highly effective, cost-efficient, secure, resilient, flexible and future-proofed computing. 

Nevertheless, CIOs shouldn’t assume that colocation will always be available immediately, exactly where they need it and at low cost. As the decade wears on, some colocation providers will probably need to close or completely upgrade smaller or power-strapped facilities. Others will build totally new ones from the ground up. Only larger ones, especially those located in lower-cost areas where real estate is significantly cheaper, may be capable of the economies of scale necessary for delivering affordable and future-proofed solutions for larger workload requirements. Time is therefore of the essence for commencing the evaluation of potential colocation facilities.

In summary, the cloud is not going to consume colocation’s lunch. More likely, together, they will evolve as the most compelling proposition for managing almost all enterprise data processing, storage and applications requirements. They are complementary solutions rather than head to head competitors.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Four ways the cloud facilitates workplace collaboration


Zach Cooper

10 Feb, 2020

Collaboration is an increasingly powerful tool, with Forbes recently proclaiming the transition from the information age to the collaboration age.

In modern businesses, collaboration refers to bringing employees and business partners closer together, strengthening their working relationships. It cultivates an environment wherein expertise and knowledge are shared, creating a hotbed for innovation and growth.

It’s also critical in other ways. Employees expect to be equipped with tools that allow them to collaborate without friction, no matter where they are located, using the devices which are most useful to them. Neglecting this business requirement could make it difficult to retain and attract talent. 

Cloud technology is a great collaboration facilitator. By acting as the foundation of the modern, digital workspace, the cloud allows users to connect and work more cohesively, both within the business and with external partners.

Employee communication

As speed and agility are critical to business demands, the ease with which internal employees can communicate is very important. Cloud technology allows team communication to be instant and robust, with chat-based workspaces displacing archaic email systems and enabling employees to collaborate on team projects.

Better communication between employees at all levels allows ideas to be easily shared, resulting in higher participation levels across projects. All members of the team have an equal opportunity to contribute, empowering employees and making for a healthy working environment.

File sharing

No matter how quickly employees can digitally communicate, if the IT infrastructure isn’t in place to support workflows, all that will get done is lots of chatting and no actioning.

In business, file transfers are the process that gets things done. The daily transfer of files and documents between internal systems and external partners has become so prominent that it’s now classified as a core business process – one that, without the right technology in place, can be laborious and unreliable.

Cloud-based file sharing enables instant access to files, no matter where the team is located. It’s also able to manage larger files, such as audio and video that email servers don’t have capacity to accommodate, as well as eliminating the need for physical files. 

The fact that 80% of IT professionals reportedly receive file-sharing requests via the cloud shows how organisations are already using the cloud to ease collaboration in this way.

Remote working

Digital workspaces created by the cloud are allowing employees to connect and work in new ways. Boundaries are being deconstructed and focal points decentralised, making for a more balanced enterprise. Even employees’ locations are redistributed.

This is the remote working phenomenon. With cloud computing, employees can work anywhere, at any time, permitting collaboration between colleagues without the restriction of having to be in the same physical space. 

A wider pool of talent is created for the business to select from, as geography is no longer a factor; for employees, remote working goes some way to improving their welfare. Both feed positively into productivity levels.

Employees are also empowered to utilise the devices they hold a preference for. Whether they need to operate on a tablet, a desktop, or on a phone on-the-go, cloud computing provides the flexibility to work with what suits them and their specific tasks the best.

Improved customer service

All initiatives within an enterprise are to some degree steered towards improving the customer experience. Cloud technology can store information on clients, ongoing sales, business intelligence, and so on, all with the aim of making it easier for employees to access shared information and communicate with one another.

Investment into a CRM system which integrates smoothly with existing applications permits further collaboration across departments within the sales and marketing process, creating ease which feeds back to the customer. This level of collaboration can power a more agile business, within which are responsive, better-informed teams capable of meeting changing customer demands.

National Portrait Gallery hit by 350,000 email attacks in three months


Bobby Hellard

10 Feb, 2020

The National Portrait Gallery was targeted by 347,602 emails containing spam, phishing and malware attacks in the last quarter of 2019, a freedom of information (FOI) request has revealed.

Over half of the emails, 194,620, were identified as being directory harvest attacks (DHA), a technique used to harvest valid email addresses belonging to employees and associates of the gallery, according to data collected by think tank Parliament Street.
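A directory harvest attack works by firing guessed recipient addresses at a mail server and noting which ones are not rejected. A mail gateway can flag the pattern by counting failed recipient attempts per source. The sketch below is purely illustrative – the addresses, thresholds and function names are invented for the example and say nothing about how the gallery’s own filtering works:

```python
from collections import defaultdict

# Hypothetical log of RCPT TO attempts: (sender_ip, recipient, accepted)
attempts = [
    ("203.0.113.7", "alice@gallery.example", True),
    ("203.0.113.7", "aaron@gallery.example", False),
    ("203.0.113.7", "abbie@gallery.example", False),
    ("203.0.113.7", "adam@gallery.example", False),
    ("198.51.100.2", "press@gallery.example", True),
]

def flag_harvesters(attempts, max_unknown=2):
    """Flag source IPs whose failed-recipient count suggests address guessing."""
    failures = defaultdict(int)
    for ip, _recipient, accepted in attempts:
        if not accepted:
            failures[ip] += 1
    return {ip for ip, n in failures.items() if n > max_unknown}

print(flag_harvesters(attempts))  # flags the IP probing nonexistent addresses
```

A production gateway would typically go further, rate-limiting or tarpitting flagged senders rather than simply recording them.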

The gallery also blocked 61,710 emails from senders flagged as belonging to a “threat intelligence blacklist”. A further 85,793 emails were intercepted as they were believed to have contained spam content – which is anything from unsolicited marketing to serious phishing and malware. According to the figures, 418 of the emails contained a virus of some kind.

“These figures paint a worrying picture of the volume of malicious email attacks designed to trick unsuspecting staffers into handing over confidential data such as passwords and log-in credentials,” said Andy Heather, VP of security firm Centrify.

“The National Portrait Gallery is an incredibly popular destination for tourists, attracting millions of visitors and members every year, which unfortunately makes it a top target for hackers and cyber criminals seeking to use legitimate, often stolen, credentials to gain access without fear of detection.”

Stolen employee credentials are a global problem for all businesses. Last year, figures from Google’s Password Checkup report suggested that 1.5% of all sign-in attempts were being made using details compromised during a data breach.
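Checks like Password Checkup hinge on a privacy trick: the client must learn whether a credential is breached without revealing it to the server. Google’s production protocol is more elaborate (it uses hardened, blinded hashes), but the simpler k-anonymity prefix idea popularised by breach-checking services can be sketched locally. The tiny corpus and all names here are illustrative only:

```python
import hashlib

# Tiny stand-in for a breached-credentials corpus (real services hold billions)
BREACHED = {"password123", "letmein", "hunter2"}
BREACHED_HASHES = {hashlib.sha1(p.encode()).hexdigest().upper() for p in BREACHED}

def check_password(password, prefix_len=5):
    """k-anonymity style lookup: only a short hash prefix would leave the
    client; the server returns every suffix sharing that prefix, and the
    final match happens locally, so the server never sees the full hash."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = digest[:prefix_len], digest[prefix_len:]
    # "Server side": all stored suffixes whose hash starts with the prefix
    candidates = {h[prefix_len:] for h in BREACHED_HASHES if h.startswith(prefix)}
    return suffix in candidates

print(check_password("hunter2"))                       # True – in the breached set
print(check_password("correct-horse-battery-staple"))  # False
```

SHA-1 is used here only because it is the convention of existing public breach corpora; it is not a recommendation for password storage.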

“Addressing this threat means ensuring a zero-trust approach to employee communication, ensuring suspicious emails are spotted and full checks are made so that managers can be sure all staffers are who they say they are,” Heather added.

In 2017, London art dealers were defrauded out of hundreds of thousands of pounds after hackers successfully breached company email accounts to monitor correspondence between clients. The incident resulted in fresh cyber security guidance being issued by the Society of London Art Dealers, as well as tips for avoiding email fraud.

Snowflake secures $479m in latest funding round alongside key Salesforce partnership

Cloud data warehousing provider Snowflake has announced the closure of a $479 million (£370.6m) funding round, valuing the company at $12.4 billion.

The funding, which represents the eighth round since 2012, was led by two new investors, in the shape of Dragoneer Investment Group and Salesforce Ventures. The latter is especially pertinent, as a strategic partnership with Salesforce was also announced. More details of the partnership will be announced at an event in June, but for now Frank Slootman, CEO of Snowflake, said the company was ‘looking forward to the positive impact our technologies and services will deliver to our customers and the broader market.’

Snowflake, which received two rounds of funding totalling $713m in 2018, has long been a darling of the privately held cloud space. The company polled second, behind only Stripe, in Forbes’ most recent Cloud 100 list, with this publication noting at the time how B2B use cases were progressing significantly ahead of the consumer-based SaaS companies previously en vogue.

Salesforce Ventures, which alongside Bessemer Venture Partners consults with Forbes on its Cloud 100, therefore has a very interesting role to play here. Salesforce’s acquisition of MuleSoft in 2018 showed the company’s strategy of helping its customers tie up data across various clouds. Many how-to guides exist on how to connect Snowflake and Salesforce, as well as AWS, Azure and other clouds where Salesforce users will host data.

Speaking to the San Francisco Business Times, Slootman noted this round of funding was not about the money, but to ‘advance content strategy’ as part of the Salesforce partnership. “This is not a traditional fundraise. It is part of a strategic alliance with Salesforce that we initiated,” he said.

Other investors, alongside Salesforce and Dragoneer, were Altimeter Capital, ICONIQ Capital, Madrona Venture Group, Redpoint Ventures, Sequoia, and Sutter Hill Ventures. “Snowflake’s rapid growth and ability to unlock real value for customers have been impressive,” said Marc Stad, founder and managing partner of Dragoneer in a statement. “We are confident Snowflake’s innovative and evolving technology, and its customer-first approach, will continue to drive sustainable momentum over the long-term.”

While Snowflake secured the silver medal in Forbes’ 2019 Cloud 100 list, don’t bet on it staying there this coming autumn. The company’s next step would be to move for an IPO, according to TechCrunch, although Slootman did not outline specific plans. Alternatively, it does not take a huge leap of faith to imagine that, depending on the fruits of their combined labours, Snowflake’s name may be on a shopping list somewhere at Salesforce towers. The proposed $12.4bn valuation would be almost double the $6.5bn Salesforce paid for MuleSoft, the company’s largest acquisition to date.


5G, the edge, and the disruption of the cloud: Why now is the time for change

Sponsored: If one were to put together a linguistic analysis of all the conversations held at MWC 2020, later this month in Barcelona, there is a fair chance that the most spoken word would be 5G.

Not surprisingly, the term will be everywhere this year – much of course as it was last year and the year before. Yet whether it is smartphone vendors looking to showcase their latest, speediest wares, or thought leaders looking to where the enterprise needs to focus, things have turned up a notch over the past 12 months.

Take, for instance, what Bejoy Pankajakshan, executive vice president of Mavenir, had to say for sister publication Telecoms last month. The need for discussion is vital for future strategy, Pankajakshan affirms, as the options are legion. “A 5G network is envisaged as the most open, powerful, flexible, and advanced network the telecoms world has ever seen,” he wrote. “At its heart, [it] is a software network and its development and deployment requires a new approach and a new way of thinking.

“If a 5G network isn’t built the right way, users may not come to the telco, and the OTTs could win again.”

Getting everything moving, across various stakeholders, will be no easy task. Charting a blueprint for 5G in concert with other technologies requires detailed planning. Take a session from Accenture set to take place on February 25 around unlocking the power of the cloud. The rise of edge computing, setting the stage for network transformation, will bring huge long-term benefits, but immediate challenges.

“Network cloudification and the disaggregation of hardware and software become necessary – with CSPs now embarking on the critical journey to move their networks to be fully in the cloud, built around cloud at the edge and mobile edge compute, with the ability to cater for all the new applications and use cases unlocked by 5G,” the session materials read.

For some, 5G will inevitably disrupt cloud computing as we know it today. Writing for this publication in August, Marty Puranik, founder, president and CEO of Atlantic.Net, noted how 5G would effectively kill latency – and in one fell swoop, potentially eradicate the need for cloud solutions.

“One of the main reasons the cloud is so beneficial is for numerous devices – either in an organisation for a private cloud or any user with an Internet connection for a public cloud – to connect to and transmit data with a central machine or hard drive located on the cloud,” Puranik wrote. “For an employee to share a large video file with a colleague who’s working from home that day, the cloud made it simple. But why go through all that if your device can connect with your colleague’s device with only a millisecond of latency and a minimum connection speed of 20 Gbps down and 10 Gbps up?”
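Puranik’s comparison can be made concrete with some back-of-the-envelope arithmetic. The 20 Gbps figure is the article’s headline 5G number and the 100 Mbps office uplink is an assumed baseline; both ignore protocol overhead, so treat the results as idealised lower bounds:

```python
def transfer_seconds(size_bytes, link_bits_per_sec):
    """Idealised transfer time: payload size over raw link rate
    (ignores protocol overhead, congestion and real-world radio conditions)."""
    return size_bytes * 8 / link_bits_per_sec

video = 2 * 10**9  # a 2 GB video file, as in the large-file-sharing scenario

print(transfer_seconds(video, 100 * 10**6))  # ~160 s on a 100 Mbps office uplink
print(transfer_seconds(video, 20 * 10**9))   # ~0.8 s at the quoted 20 Gbps peak
```

Even allowing for generous overheads, the gap of two orders of magnitude is what underpins the argument that direct device-to-device transfer could displace cloud relays for some workloads.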

Edge computing, essentially the older, more streetwise sister of cloud computing, is expected to receive a lot of attention in Barcelona. Microsoft, for instance, from whom the taglines of intelligent cloud and intelligent edge are never far away, is expected to unveil edge computing services at MWC.

A report from Omdia, which looks to preview 5G developments at MWC20, noted that 5G and AI technologies could utilise edge computing, to the detriment of the cloud. “By 2025, two of three smartphones will include built-in AI capabilities, and global revenue for AI smartphones will increase to $378 billion,” the report notes. “To alleviate consumers’ privacy concerns, smartphone and smart speaker manufacturers will introduce 5G products which perform visual AI processing tasks on edge servers and appliances, bypassing the privacy risks involved in sending data to the cloud.”

The current technological landscape feels like the calm before the storm. Organisations need to fully research the terrain and find out the best use cases for edge, 5G and AI to ensure smooth sailing ahead.

Editor’s note: This article is brought to you by MWC20.

LinkedIn’s CEO steps aside after 11 years in charge


Keumars Afifi-Sabet

7 Feb, 2020

LinkedIn CEO Jeff Weiner is stepping down from his leadership role later this year, with the firm’s senior vice president of product Ryan Roslansky taking up the post from 1 June.

Weiner will step aside after 11 years in charge of the workplace social networking firm, after successfully growing the company and overseeing its acquisition by Microsoft. 

He will shift over to become the company’s executive chairman, a role that will involve supporting the leadership team more broadly.

His replacement, Roslansky, will report into Microsoft’s CEO Satya Nadella, just as Weiner has done since the firm was bought out in 2016 for $26 billion. The senior vice president of product, who has been at the firm for more than ten years, has overseen building LinkedIn’s influencer programme and publishing platform.

“While I’ve been thinking about the timing of this transition for some time, over the last year or so, several factors converged that led me to conclude now is the right time to make this change,” Weiner wrote in a LinkedIn post.

“For starters, our business has never been better, our culture has never been stronger, and our future has never been clearer. Additionally, my passion for initiatives beyond my day-to-day role as CEO has continued to grow. 

“Most importantly, after working with Ryan for nearly two decades, spanning two companies and countless roles, it’s become clear to me that going forward, his vision, drive and passion are exactly what the role requires.”

Weiner said that his role as executive chairman will allow him to hold a more top-level influence position and contribute to business strategy, product vision, and general advice, much in the same way that LinkedIn founder Reid Hoffman performed the role.

Tomer Cohen, meanwhile, will take over from Roslansky’s current role as the company’s product lead, having helped to shape LinkedIn’s product ecosystem strategy.

“I’m taking on this new role of CEO because I believe deeply in what we are doing, how we are manifesting our vision, and where this company can go,” Roslansky wrote in his own LinkedIn post. “And the best part about it is that I truly believe we’re just getting started.

“Over my 10+ years at LinkedIn, I’ve been fortunate to work on every part of our product ecosystem – spanning the largest global professional social network and the six distinct businesses built on top of it. 

“To say I am humbled to lead this company into the future would be an understatement.”

LinkedIn has undergone several changes in the last few years, most recently migrating to Microsoft’s Azure platform in July 2019. The move involves shifting all workloads from LinkedIn’s own data centres to Microsoft’s public cloud platform, and will take multiple years to complete.

Microsoft has also previously outlined plans to integrate the social networking platform with its other products, namely Office and Outlook.

Coronavirus starts to take its toll on the tech industry


Sabina Weston

7 Feb, 2020

The spread of coronavirus is affecting technology companies including Tesla, Qualcomm, and Hon Hai – which makes iPhones for Apple, as well as products for HP Inc. and Sony.

Tesla’s stock fell by 17% on Wednesday, following VP Tao Lin’s announcement, via Weibo, that the coronavirus outbreak in China would delay deliveries of its Model 3 cars.

The Model 3 vehicles are produced in Tesla’s Shanghai Gigafactory, which last month was ordered by the Chinese government to shut down due to fears of the virus spreading within it. At the time, Tesla finance chief Zach Kirkhorn said that the closure would “slightly” impact the company’s profitability in the first quarter of the year.

Factories are scheduled to reopen on 9 February, yet with the death toll reaching 563, it is uncertain whether the Chinese government will extend the shutdown.

The phone industry may also be negatively affected by the virus outbreak in China.

Qualcomm’s chief financial officer Akash Palkhiwala stated on Wednesday that the government-imposed shutdown caused by the coronavirus might threaten manufacturing and sales.

During a conference call with investors, Palkhiwala said that Qualcomm expects “significant uncertainty around the impact from the coronavirus on handset demand and supply chain”.

Hon Hai had more precise estimates about the coronavirus’ impact on its business. The outbreak forced the company to revise its expected sales growth to 1-3%, rather than a previously projected 3-5%.

So far, the effects of the virus outbreak on tech companies have been largely due to China being brought to a standstill by mandatory ‘self-quarantines’. Companies have not reported any cases of coronavirus among their workers.

However, LG has decided to prioritise its employees’ wellbeing by withdrawing from this month’s upcoming Mobile World Congress (MWC), in order to protect its staff from the virus outbreak.

MWC is scheduled to go ahead, yet organisers are likely to introduce a ‘no handshake’ policy to contain the possibility of the virus spreading among attendees.

As such, the coronavirus is having a direct impact on the technology industry, from the manufacturing side through to technology showcases in countries far beyond China.

Netskope secures $340m in funding at $3bn valuation to further cloud security mission

Cloud security provider Netskope has announced the closure of a $340 million (£263m) investment on a valuation of almost $3 billion.

The move represents the seventh funding round for the Santa Clara-based company, taking its total funding to more than $740m. Netskope’s most recent funding was a series F round in November 2018 which raised $168.7m.

Netskope’s primary offering is a cloud access security broker (CASB) product, alongside a secure web gateway (SWG) and public cloud security. The company has a private access Zero Trust solution, part of the Netskope Security Cloud, which is currently in beta.

As cloud usage continues to rise in the enterprise, security threats have risen with it. Plenty of venture capital money is being placed in the market; last month CloudKnox, a provider of identity authorisation for hybrid and multi-cloud environments, raised $12 million. While the biggest cloud providers are taking steps to remediate issues – Google in December announced a partnership with various security firms, including Palo Alto Networks and McAfee for instance – responsibility still rests with the customer for security in the cloud. In other words, what data sits where.

“We look forward to using this investment to continue to accelerate global demand for the Netskope Security Cloud Platform and continue to push the envelope on our vision of a cloud-native platform that secures data against all threats across all of an enterprise’s traffic – whether destined for the web, the cloud, or private apps,” wrote Netskope CEO Sanjay Beri in a blog post.

Sequoia Capital Global Equities led the round as a new investor, with additional funding being provided by all existing investors, including Accel, Base Partners, Geodesic Capital, ICONIQ Capital, Lightspeed Venture Partners, Sapphire Ventures, and Social Capital. Patrick Fu, managing partner at Sequoia, said in a statement that Netskope was “raising the bar for game changers successfully pushing beyond the limitations of existing technology to reshape a market.”


Best open source cloud-storage services


Steve Clark

6 Feb, 2020

When it comes to cloud computing, it’s easy to fall back on the same big name players like Amazon Web Services (AWS), Microsoft, and Google. But, as in so many areas of computing, open source can offer a low or no cost alternative that businesses should be investigating.

Cozy

What we liked:

If you’re planning to take the plunge by moving away from traditional cloud platforms run by the big tech companies, mobility and security should be high on your list of considerations. You need to be able to access your files any time, anywhere, and they should be safe from nefarious net-dwellers snooping on personal data. 

Cozy offers exactly that. You can operate within your browser or download the app for your computer. Refreshingly, the company also “prohibits any use, for its benefit or for the benefit of anyone, of all or part of your ‘private’ data”. 

Thanks to a clean layout and large fonts, Cozy’s design is effortlessly user-friendly. A simple sidebar ensures navigation is slick and familiar, while adding your files to the cloud is achieved with one click of the Upload button. Clicking the three-dot menu reveals options to create new folders and select multiple items and, once you’ve added files, you can then manage items through the second three-dot menu that appears beside it, renaming and moving files, and sharing them with others. If you’re a convert from the Google and Microsoft services, you’ll have no trouble mastering Cozy.  

During our tests, we found uploading to Cozy was almost instant. As with other cloud services, a progress bar in the bottom-right corner provides a visual indication of everything you’re uploading to your Cozy cloud drive. 

You’ll find another handy shortcut by clicking the company logo in the top-left corner, which lets you quickly jump between the standard cloud service, your Photos folder and the Cozy Bank service (if you sign up for it). 

How it can be improved:

Cozy doesn’t offer the best deals when it comes to storage. Free accounts are capped at 5GB, which is fine for basic use, but you’ll need to shell out around £3 a month to increase that to a more respectable 50GB. While it’s a simple process to create new folders and the like, we’d love to have a Google Drive-style context menu to manage files with a simple right-click of the mouse.

XOR Drive

What we liked:

Security lies at the heart of XOR Drive. Proudly proclaiming itself as “encrypted cloud storage”, the platform abandons centralised systems in favour of a decentralised blockchain-based service. So, rather than letting a single company keep all your data, you’re completely in control of it, making XOR Drive the choice for the truly privacy conscious. 

To get started, you need to create an account with Blockstack, which powers the service. You can then access your cloud storage and, once you’re in, things start to look a little more familiar. In fact, if you didn’t know about XOR’s emphasis on security, privacy and data ownership, you’d think this was just another cloud-storage platform – and that’s no bad thing. 

Uploading to the service is speedy. Unlike other contenders, you can also drag and drop files to upload them. Sharing, too, offers the option to share publicly or to send files direct to other Blockstack users, keeping them private. 

How it can be improved:

If you’re a heavy user of cloud storage, you’ll notice everything slows down when uploading files over 10MB to the free account. Likewise, upload and download times are penalised if you’re using more than 10GB of storage. Occasionally, when attempting to see files, we received the notification that XOR ‘Failed to load preview’.

Bloom

What we liked:

Bloom – currently in beta – is very similar to Cozy. This browser-based cloud-storage service is big, bright and easy to navigate, and includes an Android app.

The interface’s mobile-style icons take you to different areas – Drive, Photos, Music, Contacts and so on. And while you may not have any games saved, visit Arcade anyway for a free version of the popular puzzler 2048. You can also access Bitflow, for downloading files saved as torrents.

Once you get into the actual drives themselves, it begins to feel less mobile-inspired and more like a traditional cloud platform. Looking like a mash-up between OneDrive and Google Drive, navigation is standard and uploads are fairly quick, though by no means the fastest. Bloom’s storage is certainly generous – sign up for free and you’ll immediately get 30GB to use how you choose. 

How it can be improved:

Because it’s not yet had a full release, Bloom is lacking in certain areas. There’s no iOS app (though there are hints that it’s under consideration) and no dedicated desktop installation – all that cloud magic happens in your browser or in the Android app. That may be a deal-breaker, particularly for iPhone and iPad owners. 

A guide to computational storage: Boosting performance for SSD storage arrays

With the proliferation of IoT devices and 5G fast wireless appearing on the horizon, enterprises are moving towards edge-based infrastructure deployment. Computational storage is a new storage technology powering the storage and compute part of edge infrastructure. With computational storage, it will be possible for enterprises as well as telecom service providers to support a massive amount of data processing, mainly at edge nodes.

A few companies have started offering computational storage solutions to businesses and organisations. Let’s look at the computational storage concept in depth, and at how it is backed by standards bodies and tech vendors.

The need for computational storage

Currently, most development in the technology domain focuses on delivering intelligent, real-time digital user experiences. This calls for the data centre or infrastructure stack to perform at the highest level, equipped with the latest hardware resources and computational processing techniques. Artificial intelligence/machine learning and analytics workloads are moving into the data centre to make digital devices intelligent.

As a result, we have seen the evolution of many new technologies to boost data centre performance: legacy HDDs being replaced by flash-based SSD arrays; NVMe and FPGAs used to speed up data access in storage devices; GPUs deployed in hyper-scale data centres; and so on. Overall, we are witnessing the emergence of High-Performance Computing (HPC) systems that support the processing of huge amounts of data.

This leads to two gradual demands as we move forward into digital transformation. First, AI/ML and analytics applications need faster access to data than traditional storage systems currently provide.

Secondly, data processing demands will keep growing in step with IoT and edge computing. Moreover, the huge volumes of data generated over 5G networks will drive exponential growth in IoT and edge use cases.

Although most data centres are now equipped with all-flash storage arrays, organisations still face bottlenecks in supporting the ever-growing processing demands of AI/ML and big data applications.

This is where computational storage comes in.

What is computational storage and why do we need it?

Computational storage is a technique for moving at least some processing closer to, or into, storage devices. It is also referred to as ‘in-situ’ or ‘in-storage’ processing.

Generally, data has to move between the CPU and the storage layer, which delays the response to queries. Computational storage addresses the real-time processing requirements of AI/ML and analytics applications: such high-performance workloads can be hosted within the storage itself, reducing resource consumption and costs while achieving higher throughput for latency-sensitive applications. It also cuts the power consumed by data centre resources.
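The data-movement saving is easy to illustrate. In the sketch below – purely illustrative, with made-up record counts and sizes rather than figures from any real product – a host-side query pulls the whole dataset across the storage link, while an in-situ version runs the same filter on the drive and ships only the matching records to the host:

```python
# Illustrative only: compare bytes crossing the storage link for a filter
# query executed host-side versus in-situ on the storage device.

records = [{"id": i, "temp": 20 + (i % 50)} for i in range(100_000)]
RECORD_SIZE = 64  # assumed bytes per record on the wire

def host_side_filter(data):
    # Conventional path: every record moves to the CPU, then gets filtered.
    bytes_moved = len(data) * RECORD_SIZE
    matches = [r for r in data if r["temp"] > 60]
    return matches, bytes_moved

def in_situ_filter(data):
    # Computational storage path: the drive filters locally, and only
    # the matching records cross the link to the host.
    matches = [r for r in data if r["temp"] > 60]
    bytes_moved = len(matches) * RECORD_SIZE
    return matches, bytes_moved

m_host, b_host = host_side_filter(records)
m_situ, b_situ = in_situ_filter(records)
assert m_host == m_situ  # same query result either way
print(f"host-side: {b_host:,} bytes moved; in-situ: {b_situ:,} bytes moved")
```

The result is identical in both cases; only the traffic over the constrained storage link differs, which is exactly the resource computational storage conserves.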

The core advantage computational storage offers data centres stems from the mismatch between aggregate storage bandwidth and the bandwidth of the host’s PCIe links to the CPU. To understand how this mismatch arises in a hyper-scale data centre, take the server architecture proposed by Azure and Facebook at the Open Compute Project.

In this proposed server, 64 SSDs are attached to one CPU host through a 16-lane PCIe link. Each SSD has 16 flash channels for data access, giving it an internal flash bandwidth of 8.5 GB/s; across all 64 SSDs, the aggregate internal bandwidth comes to 544 GB/s. The PCIe link, however, is limited to 16 GB/s. That is a huge mismatch on the data path to the host CPU. In such cases, in-situ processing can be applied so that the most performance-critical applications move onto the SSDs.
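The mismatch above can be checked with a quick back-of-the-envelope calculation, using only the figures quoted for the example server:

```python
# Back-of-the-envelope bandwidth mismatch for the example server above.
NUM_SSDS = 64
PER_SSD_BANDWIDTH_GBPS = 8.5     # internal flash bandwidth per SSD (16 channels)
PCIE_LINK_BANDWIDTH_GBPS = 16.0  # host-facing PCIe bandwidth (16 lanes)

total_internal = NUM_SSDS * PER_SSD_BANDWIDTH_GBPS    # aggregate flash bandwidth
mismatch = total_internal / PCIE_LINK_BANDWIDTH_GBPS  # how oversubscribed the link is

print(f"Aggregate internal flash bandwidth: {total_internal} GB/s")   # 544.0 GB/s
print(f"Host PCIe bandwidth: {PCIE_LINK_BANDWIDTH_GBPS} GB/s")
print(f"Mismatch factor: {mismatch:.0f}x")                            # 34x
```

In other words, the drives can collectively read data around 34 times faster than the host link can carry it – headroom that in-situ processing can exploit.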

SNIA standards and market development

SNIA, the global storage industry association, has formed a Computational Storage Technical Work Group (TWG) to promote the interoperability of computational storage devices, and to define interface standards for system deployment, provisioning, management, and security. The TWG includes storage product companies such as Arm, Eideticom, Inspur, Lenovo, Micron Technology, NetApp, NGD Systems, Nyriad, Samsung Electronics, ScaleFlux, SK Hynix, Western Digital Corporation, and Xilinx.

SNIA has defined the following three device classes for implementing computational storage in any type of server, from a small or medium-scale enterprise data centre to a hyperscale one.

Computational Storage Drive (CSD): A component that provides persistent data storage and computational services

Computational Storage Processor (CSP): A component that provides computational services to a storage system without providing persistent storage

Computational Storage Array (CSA): A collection of computational storage drives, computational storage processors and/or storage devices, combined with a body of control software

Several R&D efforts are under way, with researchers building proofs of concept to test the SNIA-defined standards on high-performance computing applications. For example, a CSD has been demonstrated in the Catalina project.

What's more, some of the core members of SNIA’s computational storage TWG – NGD Systems, Samsung, ScaleFlux, Eideticom, and Nyriad – have already started offering solutions.

Conclusion

Computational storage standards will be a welcome addition given the growing demand to process data through high-performance computing applications. This kind of in-storage embedded processing will arrive in different forms and approaches, and can be offered with NVMe-based architectures to boost SSD-stacked servers.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.