How cloud computing is changing the laboratory ecosystem

The Human Genome Project, declared complete in 2003, took more than a decade to run and cost billions of dollars. Using today’s software, the same data analysis can be accomplished in just 26 hours.

Research has thrived on the rapid growth in computational power. With it comes increased pressure on labs: as big data becomes an integral part of research, storage facilities must house exponentially increasing quantities of large data sets. This poses a crucial problem for labs, which may need an entire room dedicated to on-site storage, along with the maintenance it demands and hefty up-front IT infrastructure costs.

Cloud computing has helped to alleviate this burden by removing the need for companies to maintain their own information silos. Instead, research data can be stored in the cloud, external to their own facilities. For laboratories, cloud computing centralises data, ensuring security and data sovereignty whilst facilitating collaboration.

Centralising data

Cloud computing allows labs to run intensive computing workloads without the cost and complexity of operating on-site server rooms. Switching from an on-site solution to the cloud removes much of the IT infrastructure cost, lowering the cost of entry into the industry and levelling the playing field for smaller laboratories.

Moreover, cloud computing allows data to be extracted from laboratory devices and placed directly in the cloud. Device integration between lab equipment and cloud services lets live experimental data be collated in a cloud system, as sketched below. One of the most popular products on the market is Cubuslab, a plug-and-play solution that serves as a laboratory execution system, collecting instrument data in real time and managing devices remotely.
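To make the idea concrete, here is a minimal sketch of instrument-to-cloud integration: a script reads values from a device over a serial port and posts them to a cloud endpoint. The endpoint URL, API key, and instrument details are assumptions for illustration, not Cubuslab’s actual API.

```python
# Minimal sketch: stream instrument readings to a cloud endpoint.
# The endpoint, API key, and serial settings below are hypothetical.
import time

import requests
import serial  # pyserial


ENDPOINT = "https://lab.example.com/api/readings"  # hypothetical URL
API_KEY = "replace-with-your-key"                  # hypothetical key


def stream_readings(port: str = "/dev/ttyUSB0") -> None:
    """Read lines from a serial-connected instrument and upload each one."""
    with serial.Serial(port, baudrate=9600, timeout=5) as instrument:
        while True:
            raw = instrument.readline().decode("ascii", errors="ignore").strip()
            if not raw:
                continue  # timeout or empty line; keep polling
            try:
                value = float(raw)  # real instruments need proper parsing
            except ValueError:
                continue
            payload = {
                "instrument": "balance-01",
                "value": value,
                "unit": "g",
                "timestamp": time.time(),
            }
            requests.post(
                ENDPOINT,
                json=payload,
                headers={"Authorization": f"Bearer {API_KEY}"},
                timeout=10,
            )


if __name__ == "__main__":
    stream_readings()
```

A production integration would add buffering for network outages and provider-managed authentication, but the core loop of read, parse and upload is the same.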

This growing volume of data requires a centralised system that integrates the scientists’ protocols and experimental annotations. The electronic lab notebook is becoming a common tool in research, allowing users to organise all their different data inputs and retrieve them at any point. It also allows large R&D projects to keep effective control of their data as they scale.
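As an illustration, an electronic lab notebook entry essentially ties a protocol, free-form annotations and instrument data together in one retrievable record. The schema below is an assumption made for the sake of example, not the data model of any particular ELN product.

```python
# Illustrative sketch of an electronic lab notebook entry: one record
# linking a protocol, annotations, and instrument readings.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class NotebookEntry:
    experiment_id: str
    protocol: str  # reference to the written protocol
    annotations: list = field(default_factory=list)
    instrument_readings: list = field(default_factory=list)
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def annotate(self, note: str) -> None:
        """Attach a timestamped annotation to the entry."""
        stamp = datetime.now(timezone.utc).isoformat()
        self.annotations.append(f"{stamp} {note}")


entry = NotebookEntry(experiment_id="EXP-042", protocol="PCR protocol v3")
entry.annotate("Sample 2 showed unexpected turbidity.")
entry.instrument_readings.append({"instrument": "balance-01", "value": 12.4, "unit": "g"})
```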

Data sovereignty

As the pressure on laboratory practices swells, it is more crucial than ever to adapt the way labs function. The need for cloud computing in laboratory environments has become increasingly pressing as regulatory requirements grow ever more demanding.

Compliant research relies on audit trails and other measures to verify that data is authentic and unaltered, and good experimental practice makes research better placed to receive patents. Moreover, cloud computing supports custom access control, so access to certain information can be allocated depending on role; the QA/QM department of the lab can then govern which members have access to which levels of data, as the sketch below illustrates.
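The snippet below illustrates the idea of role-based access control in a lab setting. The roles and clearance levels are assumptions chosen for the example, not a standard scheme.

```python
# Minimal sketch of role-based access control for lab data.
# Roles and clearance levels are illustrative assumptions.
ROLE_CLEARANCE = {
    "qa_manager": 3,   # full access, including audit trails
    "researcher": 2,   # raw and processed experiment data
    "technician": 1,   # instrument logs only
}

DATASET_CLEARANCE = {
    "instrument_logs": 1,
    "experiment_data": 2,
    "audit_trail": 3,
}


def can_access(role: str, dataset: str) -> bool:
    """Grant access only if the role's clearance meets the dataset's level."""
    required = DATASET_CLEARANCE.get(dataset, float("inf"))  # unknown dataset: deny
    return ROLE_CLEARANCE.get(role, 0) >= required


assert can_access("qa_manager", "audit_trail")
assert can_access("researcher", "experiment_data")
assert not can_access("technician", "experiment_data")
```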

As for the security fears that surround cloud computing, they are not well founded. Cloud providers have security protocols and specialised staff that, in most cases, provide better security than on-site solutions.

However, since the GDPR came into force last year, the compliance obligations on cloud providers have changed significantly. Some cloud providers use hybrid or multi-cloud approaches, which complicates matters for the lab: verifying GDPR compliance means checking the data protection policies of every provider involved, and a compliance breach at any one of them would undermine the data protection of the whole system.

Data safety

Much of the current aversion to cloud computing stems from fear of losing data. It is easy to assume that lab data is most likely to be lost to outages or cyber-attacks; in reality, it is more likely to be lost to fires near on-site data storage facilities. Indeed, according to John Inglis, former Deputy Director of the U.S. National Security Agency, paralysis of the U.S. electrical grid is caused more often by squirrels than by cyber-attacks. Deploying systems in the cloud also sidesteps the data loss from outages that would otherwise handicap smaller laboratories. Even if a cascading failure were to occur in the cloud, data would not be lost, since most files are stored in more than three locations.
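A quick back-of-the-envelope calculation shows why replication makes loss so unlikely: if copies are held at independent sites, every copy must fail for the data to disappear, so the loss probability shrinks exponentially with the number of copies. The per-site failure probability below is an assumed placeholder, not a provider’s published figure.

```python
# Back-of-the-envelope durability estimate, assuming an illustrative
# per-site failure probability and independent sites.
p_site_loss = 1e-3  # assumed annual chance of losing any single site

for copies in (1, 2, 3, 4):
    p_data_loss = p_site_loss ** copies  # all copies must fail at once
    print(f"{copies} independent copies: annual loss probability ~ {p_data_loss:.0e}")
```

Under these assumptions, three copies already push the estimated annual loss probability down to about one in a billion, which is why providers replicate files across multiple locations.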

What seems to hold many labs back is the fear of losing data by switching to cloud computing. In reality, the cloud is much safer and more reliable than on-site data storage.

Conclusion

Cloud computing has created a more balanced playing field in research. It is a cost-effective way for smaller laboratories to get a leg up in the industry, as long as they adhere to compliance regulations. The advantage lies in the ability to scale up without maintaining excessive data storage facilities, which matters all the more as data sets grow ever larger.
