All posts by ricknotsodelgado

The cloud develops alongside ingenuity: How cloud is essential to underpin new technologies

The Internet needed a means to maximise its potential – one that was highly adaptable and open to a wide variety of technology. Cloud computing has filled that role, evolving alongside the information age.

The evolution of applications for cloud computing: A history

The Allied Telecom Group describes cloud computing as an interconnected network of remote servers; specifically, Internet-hosted servers that process and manage information, as well as a place to store data. Cloud computing was initially seen as a backup solution for hard drives. However, it soon became much more than this.

Cloud computing advanced quickly over a very short time. In 2013, mobile banking was already active in the cloud. By 2015, about half of the US government’s agencies were using the cloud, amounting to $2 billion in spending.

The cloud was no longer just for tech companies. Businesses big and small began to recognise its advantages.

The many uses of cloud computing

As the cloud branched out of the technology space, it started to show its true potential. It quickly became water-cooler jargon as companies across the US, and then throughout the world, adopted it. The business world may have been the most influential in bringing cloud computing to the masses; certainly, IT staffing benefitted from users’ lack of understanding of cloud implementation.

People knew they wanted the cloud before they knew what it was. Unfortunately, it remained an elusive concept to grasp. Logically it should not be; it is in essence a network. However, most users do not see the command-line transactions that IT personnel do. They use the cloud behind layers of interfaces. It is not uncommon for an entire organisation to weigh its cloud options without a systematic plan.

It's important to understand as well that the cloud can be extremely dynamic – and it helps to visualise how ingrained the cloud has become in the modern Internet.

Take New Generation Applications and w3schools as examples. New Gen Apps lists several applications for the cloud from a business perspective: scalable usage; chatbots and other communication; productivity; business processes; backup and recovery; application development; test and development; big data analytics; and social networking. Compare this with w3schools’ analysis from the viewpoint of Internet programmers and coders: file storage; photo editing; digital video; Twitter applications; anti-virus; word processing, spreadsheets and presentation software; maps; and eCommerce.

All the potential uses for the cloud are unknown

One could say that the potential applications for the cloud are limited only by human imagination. Most networks are more capable than initially thought: by pooling resources, they bring about great achievements.

Cloud computing harnesses networking and strives to perfect it; one can think of it as making communication clearer. For instance, it can unify word processing applications with multi-lingual chatboxes, so scientists from around the world can collaborate on a book in real time. It allows people from every nation to play online games and compete within fractions of a second.

The full range of potential uses for cloud computing is unknown. What can be predicted, however, is that as new innovations arise, the cloud will be a tool and platform essential to their practical use. As it happens, it is also a means by which old technologies and ideas can interface with new ones, reducing the need for new versions. Like a neutral zone from which all parties can draw resources, the cloud is a medium like no other.

Cloud services resemble a Swiss Army knife

After a while, the cloud begins to look like the Swiss Army knife for Internet applications. There are numerous programming languages, and yet the cloud can benefit them all. Networking is as important to business as it is to education or any congregation of people.

The cloud is a network, as well as a tool that facilitates the flow of information across networks. It is not just a chat box or library that everyone can access: it hosts services on top of the main application. Services can draw from other applications to produce a new product quickly; this was a profound achievement that led to the plethora of mobile applications available today. In addition to its adaptability, the cloud also solved a lot of download issues by allowing devices to run applications on distant servers – another function that helped build a lot of applications.

Cloud computing is one of those rare occurrences in which ingenuity gains a partner.

Opinion: Why ‘robotics as a service’ is on its way

It feels like everything is a cloud service these days, and it turns out that includes robotics. Robotics as a Service (RaaS) is quickly growing into a multi-billion dollar industry. An International Data Corporation (IDC) report put global robotics spending at $71 billion in 2015.

The report also stated that worldwide spending on robotics and related services is expected to hit $135.4 billion by 2019. According to research manager John Santagate at IDC Manufacturing Insights, “robotic capabilities continue to expand while increasing investment in robot development is driving competition and helping to bring down the costs associated with robots.” The industrial use of robots not only cuts costs but also drives a great transformation in customer experience.

In the healthcare sector, institutions are using robots that talk to each other to combine data spread across multiple databases. Rather than build new integrations across the existing databases or replace them, institutions can instead create a ‘digital nurse’ that gathers relevant information from each database – a simpler and lower-risk option. A robot installed in each back-end system consolidates and displays the needed information on a mobile device in real time.
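
A ‘digital nurse’ of this kind is, at its core, an aggregator that fans a patient query out to each back-end system and merges the answers into one view. The sketch below is a hypothetical illustration: the system names and record fields are invented, and a real deployment would sit behind the institution’s authentication, consent and audit layers.

```python
# Hypothetical back-end fetchers; in practice each would wrap a legacy
# database or API owned by a different department.
def fetch_lab_results(patient_id: str) -> dict:
    return {"source": "lab-system", "hemoglobin": 13.1}

def fetch_pharmacy(patient_id: str) -> dict:
    return {"source": "pharmacy-system", "active_prescriptions": ["metformin"]}

def fetch_admissions(patient_id: str) -> dict:
    return {"source": "admissions-system", "last_admission": "2017-02-11"}

BACKENDS = [fetch_lab_results, fetch_pharmacy, fetch_admissions]

def digital_nurse(patient_id: str) -> dict:
    """Consolidate one patient's records from every back end into a single view."""
    summary: dict = {"patient_id": patient_id}
    for fetch in BACKENDS:
        record = fetch(patient_id)
        summary[record.pop("source")] = record  # file each answer under its system
    return summary

# The merged view is what the mobile device would display in real time.
print(digital_nurse("P-42"))
```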

In the consumer world, many startups are building chatbots for customer service. Chatbots can work within existing chat technology such as WhatsApp or Google Messenger.

Overall, service robotics offers a major advantage in taking over industry tasks that are tough, risky or simply mundane. Ordinary daily tasks that require little effort can quickly be handed to robots that provide a high degree of accuracy. At the moment, the manufacturing sector purchases the most robots and related services, and it’s no wonder: factories and assembly lines are among the best areas in which to implement robotics. Process manufacturers that develop products based on formulas or recipes, such as drugs or sodas, have benefitted greatly from the exceptional accuracy that robotics offers. Not far behind manufacturing, however, is the healthcare sector, with spending projected to double by 2019.

The global service robotics market is projected to surpass 18 million units by 2020, with an expected CAGR of 23.7 percent from 2014 to 2020. The main drivers are the demand to reduce labour costs in developed countries and the growing prevalence of assisted living. More and more companies have entered the industry hoping to evolve and refine automation techniques and user-end customer services.

RaaS can also leverage the cloud, making it possible to connect devices to the web and to cloud computing environments. An obvious use for cloud-connected robots is in stores, warehouses and distribution centres, allowing businesses never to be “sold out” of hot items. Data captured by robots through video analytics or RFID tags, such as inventory levels and customer preferences, can all be stored on a hybrid cloud-based system or all-flash storage.
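
As a rough sketch of that inventory scenario, each robot scan could be posted to a cloud endpoint so stock levels stay current across every store. The endpoint URL and payload fields below are placeholders, not any RaaS provider’s actual API.

```python
import json
import urllib.request

# Placeholder endpoint; a real deployment would use the provider's SDK
# with authenticated requests.
INVENTORY_API = "https://example.com/api/inventory-events"

def report_scan(store_id: str, rfid_tag: str, shelf_count: int) -> None:
    """Push one robot RFID scan to the cloud-side inventory store."""
    event = {"store": store_id, "tag": rfid_tag, "count": shelf_count}
    req = urllib.request.Request(
        INVENTORY_API,
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # restock alerts fire server-side
        resp.read()

# report_scan("store-7", "RFID-00123", shelf_count=2)  # near sold-out: reorder
```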

RaaS providers can handle maintenance along with integration of the robots and the databases used within enterprises. Not only does this cut costs, but it also makes management and scaling easier while offering greater flexibility. As technologies expand, robots will become ever more integrated with cloud servers and intelligent digital environments designed to create smarter business networks. Rather than pegging robots as merely a product, robots as a service can create new and better business models, easing budgets and improving end products.

How the cloud wars are beginning to enter a new phase

Opinion: Imagine a business where the product, such as storage space or video quality, keeps getting better and better while prices fall steadily. It’s happening in cloud computing – and this ‘race to zero’ has people in the industry deeply worried.

The current leader, Amazon Web Services (AWS), has slashed its prices an astounding 44 times in the past six years. Amazon’s strategy, it seems, is a low-cost, high-volume model. The strategy appears to be working: research company Gartner reported that AWS stores twice as much customer data as the next seven leading public cloud companies. AWS profits in 2016 grew over 120% and sales grew over 60%, dispelling any assumption that AWS was a loss leader for Amazon as a whole.

Amazon is betting that, at such low cost, enterprises will treat cloud services the way shoppers treat products at Walmart: with prices that low, you keep adding more to your cart. The other leading cloud providers, like Microsoft and Google, are willing to keep up and match Amazon’s prices, with Gartner analyst Raj Bala describing the cuts as aggressive, even ‘punitive’.

Aaron Levie, CEO of Box, has predicted that storage will one day be free and unlimited. If the current race to lower prices continues, it looks as if his prediction will soon come to pass. This means, however, that cloud companies have to build in functionality beyond storage alone. From security improvements to converged storage, where data storage is combined with data computing, cloud companies are looking for related, integrated services they can charge for.

In the race to differentiate, Microsoft rolled out a hybrid cloud in April 2016 that integrates with its subscription-based Office 365 program. Satya Nadella, Microsoft’s CEO, explained how the integration works. “If a certain calculation needs massive resources, the transaction can be handed off to its cloud products to provide a sort of turbo charge,” he said. This strategy could prove very successful: many people already use Office at work, and the ease of integration might be the incentive for companies to choose Azure over AWS.

Google’s Cloud Platform has strengths in big data analytics, but also some constraints: it is difficult to integrate with existing data platforms, which makes it harder for late adopters to convert to the service. As Kurt Marko explains, it can be a ‘poor choice for cloud laggards and organisations looking for a place to offload legacy virtual infrastructure and applications’.

In March this year, IBM announced it was rolling out more cloud offerings in an effort to compete with AWS and to make the point that storage is not the aspect of the cloud which enterprises really need. One study predicts 85% of enterprise businesses will move to multi-cloud architectures by 2018. IBM’s new products utilise Watson to help clients manage and leverage data stored between multiple clouds.

One analyst, Rodney Nelson at Morningstar Inc., questioned how effective IBM’s rollout would be, noting that “the major cloud vendors are flooding the market with new features on a consistent basis…all have moderate to major leads on IBM in that regard.” IBM’s most competitive idea may be shifting its focus from offering cloud services to offering services that integrate with all cloud providers.

The intensely competitive cloud computing environment is volatile, with new companies such as China’s Alibaba throwing their proverbial hats in the ring. It is also pushing out current competitors, with telecom companies like Verizon and AT&T losing market share for lack of the expensive yet impactful infrastructure investment and innovation the market demands.

There are so many facets that will affect the final outcome of the cloud computing industry: businesses racing to implement and then harness the cloud’s potential, CIOs actually choosing the cloud providers, business people learning how to use big data and cloud computing for business intelligence, and vendors fighting for the biggest piece of the pie while investing millions to stay competitive. It’s impossible to predict what the final outcome will be.

How machine learning could prevent money laundering

Machine learning is being put to use in all sorts of areas today. From smart cars and homes and beyond, artificial intelligence (AI) and machine learning are becoming a larger part of how many companies conduct business. As more businesses are hit with cyber crime rather than physical crime, there has been a necessary shift from commercial surveillance systems towards cyber security systems that protect confidential data. More recently, we’ve seen machine learning sink its teeth into anti-money laundering (AML), with big potential impact.

Most current AML systems are founded on an extensive list of rules. Banks and institutions are required to comply with the Bank Secrecy Act and implement certain AML rules. These regulations are meant to help detect and report any suspicious activity that could indicate money laundering or terrorist financing. As the regulations have become more demanding, traditional rules-based systems have become more and more complex, with hundreds of rules driving know your customer (KYC) activity and Suspicious Activity Report (SAR) filing. As financial institutions monitor billions of transactions a day, the data mined from each creates silos of information that any person would find overwhelming to sift through. More and more cases are flagged for investigation, but more and more of them are false positives.
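
To make the rules-based approach concrete, here is a minimal sketch of the kind of threshold checks these systems chain together. The rules, fields and thresholds are invented for illustration; real rule sets run to hundreds of entries, which is precisely why flagged cases and false positives multiply.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    account_id: str
    amount: float            # transaction value in USD
    country: str             # counterparty country code
    daily_cash_total: float  # running cash total for the account today

# Invented rule parameters; real rule sets run to hundreds of entries.
HIGH_RISK_COUNTRIES = {"XX", "YY"}  # placeholder codes
REPORTING_THRESHOLD = 10_000

def flag_transaction(tx: Transaction) -> list[str]:
    """Return the name of every rule the transaction trips."""
    alerts = []
    if tx.amount > REPORTING_THRESHOLD:
        alerts.append("large-single-transaction")
    if REPORTING_THRESHOLD * 0.9 <= tx.daily_cash_total < REPORTING_THRESHOLD:
        alerts.append("possible-structuring")  # deposits just under the threshold
    if tx.country in HIGH_RISK_COUNTRIES:
        alerts.append("high-risk-jurisdiction")
    return alerts

tx = Transaction("A-1", amount=9_500.0, country="XX", daily_cash_total=9_500.0)
print(flag_transaction(tx))  # ['possible-structuring', 'high-risk-jurisdiction']
```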

Along with the increase in false positive rates, another challenge in AML is that money laundering hardly ever manifests as the activity of a single transaction, account, business or person. Detection therefore cannot focus on single instances; it requires analysis of behavioural patterns across transactions occurring over time, which makes it nearly impossible for personnel to investigate every case in a timely manner.

Additionally, there is very little historical data on money laundering, making it difficult to pinpoint its exact tactics and methods. From trusts to black market currency exchanges or loan-back schemes, there is, unfortunately, no typical laundering case. This also leaves data sets without reliable labels, which in the past has meant manual analysis. Despite the growing difficulty of monitoring, more financial institutions are turning to technological tools such as AI, machine learning and big data analytics to detect laundering cases.

When combined, these systems can merge a massive spectrum of data sources and dig through mountains of data. With machine learning or AI in place, these hybrid systems can translate unlabelled data points into signals of behavioural anomaly and intent. They establish “normal behaviour” patterns and then flag behaviour that deviates from them, weeding out false positives from true laundering cases and sharply reducing both false negatives and false positives.
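
As a rough illustration of the “normal behaviour” idea, an anomaly detector can be fit on historical per-account activity and then asked to score new activity against it. This sketch uses scikit-learn’s IsolationForest on invented features; the feature set and parameters are assumptions for the example, not any vendor’s production model.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Per account-day features: [transaction count, total value, distinct counterparties]
normal = rng.normal(loc=[20, 5_000, 5], scale=[5, 1_000, 2], size=(1_000, 3))

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal)  # learn what "normal behaviour" looks like

suspicious = np.array([[90, 48_000, 40]])  # sudden burst of high-value activity
print(model.predict(suspicious))            # [-1] -> anomaly, queue for review
print(model.decision_function(suspicious))  # more negative = more anomalous
```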

Moreover, FICO has created an AML Threat Score that helps prioritise investigation queues for SARs, using behavioural analytics from its Falcon Fraud Manager. FICO employs transaction profiling technology, self-calibrating models and customer behaviour lists, all of which can adapt to the constantly changing dynamics within a financial institution.

Another machine learning approach that has seen progress is unsupervised learning, a form of machine learning that draws inferences from data sets lacking labelled responses. Given the gap in historical data on laundering cases, there is a pressing need for systems that can analyse and gain insight from data without prior knowledge of what to look for. Unsupervised machine learning learns from that unlabelled data, distinguishes the relevant from the irrelevant and divides it into usable clusters. This is achieved through link analysis, temporal clustering, associative learning and other techniques that allow financial institutions to track entity interactions, behavioural changes and transaction volatility.
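
And a minimal sketch of the unsupervised clustering step just described: grouping unlabelled transaction records into clusters an investigator can triage. The KMeans algorithm and toy features below stand in for the much richer link-analysis and temporal-clustering techniques named above; they are illustrative assumptions, not a production pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Unlabelled per-entity features: [average transaction size, transactions per day]
data = np.vstack([
    rng.normal([200, 3], [50, 1], size=(500, 2)),         # retail-like behaviour
    rng.normal([15_000, 1], [3_000, 0.3], size=(50, 2)),  # large, infrequent transfers
    rng.normal([9_500, 12], [300, 2], size=(10, 2)),      # rapid near-threshold activity
])

X = StandardScaler().fit_transform(data)  # put features on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Small, tightly grouped clusters of unusual behaviour go to the front of
# the investigation queue.
for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} entities")
```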

The benefit of using machine learning to prevent money laundering shows when labelled and unlabelled data from this slew of sources is ingested into a system flexible enough to accept a multitude of data points across a myriad of sources while also analysing each for its laundering potential. As the AML regulations required of banks become more intense and the fines for non-compliance grow, you can be sure to see machine learning implemented on top of the traditional AML systems already in place today.

The biggest obstacles holding back big data success – and how to overcome them

While big data is undoubtedly high on the list of invaluable tools a business needs today – and has been for some time – many companies are still struggling to use it. In fact, according to Square Root’s Data Chasers research, while 92% of companies wholeheartedly believe that big data will revolutionise their success, only 40% are actually taking advantage of it as designed.

There are obstacles that companies have been stumbling over for years, seemingly unable to overcome – but progress can be made with the right insights and perspectives. Here are the top four roadblocks businesses are facing with big data, and how they could finally beat them.

It’s too little, too late to make the change

Big data isn’t some shiny new accessory to speed up and improve your productivity. On the contrary, you can instead think of it as the foundation and structural beams of your company’s infrastructure – something that’s clearly not easy to replace on short notice. While younger businesses are better able to embrace the change, with new startups figuring it into their original construction, many businesses who’ve been in the industry for years are struggling to remake themselves as big data compatible.

It’s a steep uphill climb, but the key for established businesses trying to make the switch is to take meaningful but measured steps. Rome wasn’t built in a day, and you’re not going to rebuild your business in one either; instead, evaluate which parts of your business could benefit most from big data, and which practices could make real changes to your productivity and audience interactions now. Apply those small but crucial changes and work steadily through the rest. It won’t happen all at once, but it will give you valuable results where it counts.

The experts are in short supply – or aren’t the right kinds

Your current data experts aren’t to blame; they are skilled in their chosen profession and you hired them for a reason. The issue lies in the data world moving out from under them. New practices, tools, and developments in big data have made previously invaluable skills irrelevant, and they call for a new crop of data experts fluent in the modern lingo and tactics. The obvious answer is to hire these professionals, right?

However, the universities are having trouble keeping up. Students are graduating as quickly as they can and older professionals are taking new courses to bring themselves up to date, but the shortage remains. Businesses that want to leverage big data to its full benefit will have to accept that the right experts come at a high price and competition for them is tough – but they’re necessary.

They’re not sure what they need big data for

Unfortunately, many companies are approaching big data with the mindset of “if they have one, I want one too!” It’s undoubtedly a tool every company needs, but for different reasons, and if you acquire big data without knowing the problems you want solved or the insights you’re looking for, it’ll be useless.

While it’s tempting to build up big data as quickly as you can, it’s more important to put on the brakes and have your company take a long look at what actually needs to be accomplished, from developing converged systems to ironing out operational hiccups. If there are gaps in your information, big data can help there as well. Once you have a solid view of your goals, you’ll know how to refine the tool to work for you.

They take too much too fast

Think of big data like a massive haystack; the data you benefit from is also hay, but a specific kind of hay. Using big data properly is asking for the right type of hay, from the right haystack, and extracting it with the right tool. Unfortunately, many businesses fail to realise this and believe that all the hay is valuable – and the more haystacks, the merrier. In other words, companies often use too many data sources, too many data collection methods, and put in too many data requests, giving them plenty of results but none that are precise or actionable. This leads to confusion and false starts that hinder rather than help.

Instead, companies need to refine the way they use big data – and not get too excited. It’s about the right answers, not all the answers.

The key aspects to consider when executing a smooth move to the cloud

As the benefits of cloud computing become more pronounced, more businesses are migrating to the cloud. Greater scalability, flexibility and financial security often come as a result of making the shift to cloud computing – and those are just a few of the advantages. The allure of the cloud is well known. However, the fine details of cloud migration and implementation are often overlooked.

Migrating to the cloud is more complicated than many companies anticipate. Too many businesses are pulling the trigger on cloud migration with only the first few steps in mind: the cost of the service and the logistics of physically transferring the data itself. Moving data is a lot like moving to a new house or apartment. If you have never done it before, you may be thinking, “I’ll just move all my stuff and pay the rent or mortgage.” Anyone who has moved in the past few months can tell you that it’s often more complicated than that.

For starters, you have to select a place of residence. As you do this, you must consider the needs of your family. Think of the features and amenities in a home that will be of most value to you. Moving into a public space such as an apartment is often cost effective. Still, apartments have their drawbacks. Houses offer the advantage of greater privacy and circumstantial control. If you need something in between, a townhome could serve as a sort of hybrid that offers the best of both worlds. Other considerations: What level of upkeep will the property require? Is there a big yard? Will the house require renovation in order to suit the needs of your family? You could always just build your own home – although this could become very complicated if you have no experience with homebuilding.

As you can see, there are numerous unseen variables involved in moving to a new house or apartment. Believe it or not, all of these examples are directly comparable to considerations that should be made when migrating data to a cloud. If you didn’t already make this connection, take a minute to reread the previous paragraph with the following comparisons in mind: family = company; home/property = cloud platform; apartment = public cloud space; house = private cloud space; townhouse = hybrid cloud arrangement.

These are just a few of the factors that home movers or data migrants should take into account. With this analogy as a backdrop, consider a few tips for avoiding problems when migrating to the cloud.

Start simple

Cloud computing is a powerful tool. This technology has created so many options and opportunities to improve the internal mechanism of a company. Still, let’s not get hasty. Start by doing some research and assessing your company’s cloud computing needs.

Understand the pros and cons of public, private and hybrid cloud computing. Once you have an idea of what you are looking for, consider cloud computing service options. If you don’t know much about the market, there are a few providers that are well suited to companies beginning their cloud computing journey. According to Logicworks CTO Jason McKay, “One cloud does not fit all, but if you pick a major IaaS cloud provider like AWS or Azure, one cloud certainly fits most.” You could also attempt to build your own cloud computing platform; however, this is not recommended if you or your IT staff have little or no cloud computing experience. The same can be said of hybrid cloud configurations.

The point is, keep it simple. Begin with a simple, singular cloud computing configuration. Experts say that most successful complex cloud computing configurations are outgrowths of an initially simple setup.

Plan ahead

A survey conducted by IDC revealed that out of over 6,000 executives, only 3% would characterise their cloud strategy as “optimised,” while 47% described theirs as “opportunistic or ad hoc.” For cloud computing to provide maximum benefit, companies must have a plan for cloud migration. The following are a couple of suggestions to keep in mind as you prepare.

  • Have a plan for maintenance and data management. Some platforms include tools that will help you manage your cloud data, at least at a general level. Beyond this, IT personnel should have a firm grasp of the company’s data needs before cloud computing is implemented. This way, they can anticipate cloud management needs and proactively solve problems right from the start.
  • Have a plan for account controls. If you’ve already predetermined your security preferences, authorised access preferences, finance and resource management preferences and data preferences before cloud implementation, you will find cloud computing to be a more effective and hassle-free tool. What’s more, if you have a clearly defined cloud management rhythm established from the get-go, it will be easier to grow when the time comes. A minimal sketch of what such a written plan might look like follows this list.
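
As one hedged illustration of writing those controls down before migration, the sketch below encodes a hypothetical account-controls plan as plain data. The role names, limits and fields are invented placeholders; a real plan would map onto the provider’s native policy system, such as IAM on AWS or role-based access control on Azure.

```python
# Hypothetical pre-migration account-controls plan, written down before any
# cloud resources exist so that provisioning can follow it mechanically.
ACCOUNT_CONTROLS = {
    "roles": {
        "admin":     {"members": ["ops-lead"],   "can": ["create", "delete", "billing"]},
        "developer": {"members": ["dev-team"],   "can": ["create", "read", "write"]},
        "auditor":   {"members": ["compliance"], "can": ["read"]},
    },
    "budgets": {"monthly_limit_usd": 5_000, "alert_at_percent": 80},
    "data": {"encryption_at_rest": True, "backup_retention_days": 30},
}

def is_allowed(role: str, action: str) -> bool:
    """Check a proposed action against the written plan."""
    return action in ACCOUNT_CONTROLS["roles"].get(role, {}).get("can", [])

print(is_allowed("auditor", "delete"))  # False
```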

Get to know your cloud storage bill: How to choose the best options

When it comes to using a public cloud, there are incredible advantages – for a price. But what are you really paying for?

While the flat rate you see advertised may be appealing, there are several details that can raise or lower your cloud storage bill. Could you be paying for something you don’t need, and how can you lower your expenses by factoring in certain specifics before you choose a provider? Get a closer look at your cloud storage bill:

Itemising your bill

1. Price per GB: Most cloud providers price based on the number of gigabytes used. Providers like Amazon reduce their per-GB rate if you require a massive amount of storage, while others keep it steady regardless of data volume. In either case, the rate is affected by how redundant or active your data is – in other words, whether it just sits there or is often interacted with. The idea is to reward businesses that use their cloud as the primary point of storage rather than branching out over several.

2. Storage actions: Put simply, storage actions are all the changes, adjustments, and deletions of the data within your cloud storage. Moving something to a new folder, deleting it entirely, or uploading it are all actions your provider may track, tally up, and charge for. Not all providers charge for storage actions, which can catch businesses off guard when they move to one that does.

3. Transfer costs: Working within the cloud, whether public or hybrid, is typically free. However, some providers charge a fee for removing data from their storage. While most will allow companies to transfer data in at no cost, migrating to a separate cloud, removing data for edits and then replacing it, or sharing data across multiple clouds can all incur significant expense. (A back-of-the-envelope calculator combining all three line items appears below.)
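
Putting the three line items together, a back-of-the-envelope monthly estimate might look like the sketch below. All the rates are invented placeholders, not any provider’s published pricing; substitute your own quotes.

```python
def monthly_storage_bill(
    stored_gb: float,
    actions: int,
    egress_gb: float,
    price_per_gb: float = 0.023,          # placeholder $/GB-month
    price_per_1k_actions: float = 0.005,  # placeholder $ per 1,000 actions
    price_per_egress_gb: float = 0.09,    # placeholder $/GB transferred out
) -> float:
    """Estimate one month's bill from the three itemised charges."""
    storage = stored_gb * price_per_gb
    action_fees = (actions / 1_000) * price_per_1k_actions
    transfer = egress_gb * price_per_egress_gb  # inbound transfer assumed free
    return round(storage + action_fees + transfer, 2)

# A 2 TB mostly-redundant archive versus an active working set of the same size:
print(monthly_storage_bill(2_048, actions=1_000, egress_gb=5))      # ~47.56
print(monthly_storage_bill(2_048, actions=500_000, egress_gb=400))  # ~85.60
```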

How to choose the best option: Consider employee fluency

Ultimately, it’s your employees who will be interacting with the cloud. How easy that interaction is will affect the rates you have to pay them, the amount of tech support required to help them navigate the new platform, and how efficiently the data is used – which affects your profits.

If your employees aren’t fluent with the cloud, they may resort to trial and error when managing data, driving up your storage-action charges. That makes an option with low action fees – or none at all – the best choice. On the flip side, while a cheaper option may look more cost-effective, shelling out for a platform with particular features could help your employees complete work more efficiently, boosting profits in the end.

How redundant or active is your storage?

For businesses that don’t work mainly online, data storage can be a way to safeguard data that’s not used often; this makes it redundant data. For online companies or larger corporations, data sharing and online collaboration are a chief part of the work; this makes it active data. A cloud option that offers cost-effective deals based on how often the data is interacted with is key to cutting out extra expenses you might otherwise incur without realising it. Consider how much interaction your data will get on a regular basis, then check for options that accommodate your active or redundant data.

How important is storage transfer?

If you’re not sure you’ll stick with your current cloud option, need to transfer data in and out regularly, or like to spread your data across many clouds for better accessibility, choosing an option with the lowest transfer fee – or none at all – is crucial to shaving off expenses. However, if you intend to stay put and work within the cloud for the long term, taking advantage of a provider’s added features can save you money over other options.

The cloud provider you choose and the actions you take with your storage all come down to fine details few businesses know about their real cloud bill. To save money and improve your data storage, keep them in mind.

Why the cloud could hold the cure to diseases

We constantly hear about programs such as Race for the Cure, Breast Cancer Awareness Month, The Ice Bucket Challenge, and other fundraising or awareness initiatives for diseases. However, we almost never hear that a disease has been cured. With billions of dollars spent researching diseases around the world, many people started looking for reasons why more progress hasn’t been made. Researchers re-examined their processes and realised two things. First, research methods have remained largely unchanged in many disease-fighting fields: foundations, doctors and researchers would conduct studies independently of any other group studying the same disease and draw conclusions from their limited data sets.

One example of this was Parkinson’s disease, where individual doctors instinctively measured the progression of symptoms during well visits. “Nearly 200 years after Parkinson’s disease was first described by Dr. James Parkinson in 1817, we are still subjectively measuring Parkinson’s disease largely the same way doctors did then,” said Todd Sherer, Ph.D., CEO of The Michael J. Fox Foundation. With few data points and poor collection of that data, Parkinson’s researchers weren’t able to see trends or determine which treatments were having a positive effect.

The second realisation was that cloud technology was the perfect vehicle to share patient data with other researchers. Big data has been called the “next big tech disrupter” and many companies were already using big data to identify customer trends. Similarly, the scientific community started implementing the cloud to collect data and discover trends in patient and genetic data. Today, the Michael J. Fox Foundation is working on collecting the “world’s largest collection of data about life with Parkinson’s” via smart watches that upload patient data directly to the cloud.

Many disease-fighting organisations are working to implement the cloud as a data sharing vehicle. Nancy Brown, CEO of the American Heart Association, explains why the cloud is a game changer when it comes to curing disease. “To push new novel discoveries, we need the ability to allow scientists and researchers to have access to multiple data sets,” Brown said. “There’s a lot of data out there — data from clinical trials, data from hospitals and their electronic health records, data from the Framingham Heart study. Traditionally, all of that has been kept by individual companies or data owners.”

One beauty of the cloud is hybrid cloud computing: the ability to share data without compromising intellectual property. Individual entities can share data sets with the public cloud while simultaneously maintaining a private cloud to store their proprietary findings. Everyone then has access to the large data sets and can download, manipulate and store the data within their own private cloud as they do research.

The importance of the cloud is highlighted by the National Cancer Institute’s program, The Cancer Moonshot, headed by former vice president Joe Biden. The program is designed to double the rate of progress in cancer prevention, diagnosis and treatment, and to do in five years what might otherwise take a decade. Of the ten “transformative research recommendations” created to achieve the aggressive Cancer Moonshot goal, building a national cancer data ecosystem using the cloud is one.

Beyond the data sharing capabilities of the cloud are its computing capabilities. Where desktop computers aren’t able to handle and analyse the massive streams of data being collected, cloud computing can. Mark Kaganovich, founder of SolveBio and a doctoral candidate in genetics, explained that the challenge companies and researchers are actively working on is building tools to sift through the “data tornado” and take advantage of the “huge opportunity to use statistical learning for medicine.”

One real-world cloud application is sharing how patients with certain genomes react to certain drug treatments. Eric Dishman, Director of Proactive Health Research at Intel, shared that when he had a rare form of kidney cancer, doctors tried a variety of treatments without success. It wasn’t until his genome was sequenced that his doctors were able to treat him effectively, knowing which drugs were likely to work best.

Currently, cancer organisations are working on sharing data about how cancer patients with similar genomic patterns respond to their treatments, enabling doctors to choose effective treatments for future patients. As Clay Christensen explains in his book on health care disruption, the cloud has the ability to take our current system of intuitive medicine and transform it into precision medicine.

How the cloud is improving healthcare in remote populations

From improved diagnosing to enhanced treatment methodologies for a multitude of illnesses and diseases, the healthcare industry has benefited tremendously over the past decade thanks to advancements in technology. One of the most notable improvements has come about as the result of cloud technology – an increased ability to provide healthcare to remote populations.

In India, approximately 70% of the population lives in villages, many of which have limited access to healthcare providers, if any. Thankfully, health technology specialists are tapping into the $125bn healthcare market and creating cloud-based equipment, such as all-flash storage arrays, and solutions that provide point-of-care diagnostic capabilities along with telemedicine. These digital health breakthroughs are empowering more than 8,000 healthcare technicians to serve as proctors for physicians in rural areas.

When data is shared through cloud-based health programs, it enables health workers, including physicians, nurses, and health technician specialists, to link their data through shared networks located at central medical facilities. The data can be integrated into today’s most accurate diagnostic systems, allowing extremely effective medical objectives and treatment methods to be developed and implemented.

A cloud-based healthcare program is beneficial because it operates on a large pool of easily accessible, virtualised resources. These resources can be dynamically reconfigured to adjust to variable scale or load, allowing users of the program to make optimum use of the resources stored in the cloud pool.

As with any type of healthcare solution, there are challenges that must be appropriately addressed. Cloud-based computing, especially within the healthcare field, requires lots of maintenance and training. Physicians and healthcare technicians must not only know how to utilise cloud-based software programs and equipment, but owners of healthcare clinics and medical centres must also address all legal issues that coincide with using cloud-based technology.

One cloud-based program extending healthcare to remote populations is ReMeDi. The program provides video and audio links between physicians and patients in rural areas, allowing real-time consultations to take place, which can be critical and even life-saving in some situations.

Take, for example, a patient who arrives at a village centre. A health technician takes vitals and performs basic diagnostic testing, then adds the data to an electronic health record. The information is shared via the appropriate cloud-based healthcare program and viewed by an offsite physician, who can quickly create a treatment plan and prescribe any medications needed, for example to fight off a deadly infection.
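
At bottom, the workflow just described is a shared record passing between two roles through a cloud store. Below is a minimal sketch, with invented field names standing in for whatever a program such as ReMeDi actually uses.

```python
from dataclasses import dataclass, field

@dataclass
class HealthRecord:
    patient_id: str
    vitals: dict                # captured by the village technician
    notes: list = field(default_factory=list)

CLOUD_RECORDS: dict = {}  # stands in for the cloud-hosted record store

def technician_submit(patient_id: str, vitals: dict) -> None:
    """Village centre: capture vitals and upload the record."""
    CLOUD_RECORDS[patient_id] = HealthRecord(patient_id, vitals)

def physician_review(patient_id: str, plan: str) -> None:
    """Offsite physician: read the shared record and attach a treatment plan."""
    CLOUD_RECORDS[patient_id].notes.append(plan)

technician_submit("P-17", {"temp_c": 39.2, "pulse": 110})
physician_review("P-17", "Start antibiotics; re-check vitals in 24 hours")
print(CLOUD_RECORDS["P-17"])
```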

Systematic innovation has always been a major factor in the development of advanced healthcare. Innovation drives cost-effectiveness as well as efficiency and high-quality resolutions to today’s healthcare concerns. Cloud-based technology is proving to be a breakthrough in modern healthcare tactics, allowing research outcomes to be greatly improved and changing the face of IT. Data handling problems, coupled with complex, sometimes unavailable or expensive computational methods, have long caused research complications, especially in the biomedical field. Cloud-based programs, though, are showing a lot of potential for overcoming these hurdles.

Read more: Why the healthcare industry’s move to cloud computing is accelerating