
How the cloud is improving healthcare in remote populations


From improved diagnostics to enhanced treatment methods for a multitude of illnesses and diseases, the healthcare industry has benefited tremendously over the past decade thanks to advancements in technology. One of the most notable improvements has come as a result of cloud technology: an increased ability to provide healthcare to remote populations.

In India, approximately 70% of the population lives in villages, many of which have limited access to healthcare providers, if any. Thankfully, health technology specialists are tapping into the country's $125bn healthcare market and creating cloud-based equipment, such as all-flash storage arrays, and solutions that provide point-of-care diagnostics along with telemedicine services. These digital health breakthroughs are empowering more than 8,000 healthcare technicians to act as on-site proxies for physicians in rural areas.

When data is shared through cloud-based health programs, health workers, including physicians, nurses, and health technicians, can link their data through shared networks located at central medical facilities. The data can then be fed into today's most accurate diagnostic systems, allowing effective treatment plans to be developed and implemented.

A cloud-based healthcare program is beneficial because it operates within a large pool of easily accessible, virtualised resources. These resources can be dynamically reconfigured to adjust to a variable scale, or load, which allows users of the program to make optimal use of the resources in the cloud pool.
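
To make the idea of a dynamically reconfigured pool concrete, here is a minimal sketch of the elasticity being described, in Python. The function name, per-worker capacity, and limits are illustrative assumptions rather than any particular healthcare platform's API.

```python
# A minimal sketch of the elasticity described above: resources in a shared pool
# are provisioned up or down to match the current load. The capacity figures and
# limits are illustrative assumptions, not a real platform's parameters.
import math

def target_worker_count(current_load, capacity_per_worker=100,
                        min_workers=1, max_workers=50):
    """Return how many pooled workers should be running for the given load."""
    needed = math.ceil(current_load / capacity_per_worker)
    return max(min_workers, min(max_workers, needed))

# Example: a clinic network uploading 1,240 diagnostic records per minute
print(target_worker_count(1240))   # -> 13 workers drawn from the cloud pool
```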

As with any type of healthcare solution, there are challenges that must be appropriately addressed. Cloud computing, especially within the healthcare field, requires significant maintenance and training. Physicians and healthcare technicians must know how to use cloud-based software and equipment, and owners of healthcare clinics and medical centres must also address the legal issues that come with using cloud-based technology.

One cloud-based program that is expanding healthcare for remote populations is ReMeDi. It provides video and audio links between physicians and patients located in rural areas. These video and audio features allow real-time consultations to take place, which can be critical and even life-saving in some situations.

Take, for example, a patient who arrives at a village centre. A health technician takes vitals and performs basic diagnostic testing, then adds the data to an electronic health record. The information is shared via the cloud-based healthcare program and viewed by an offsite physician, who can quickly create a treatment plan and prescribe any medications that may be needed to fight off a deadly infection.
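
As a purely hypothetical illustration of that workflow, the sketch below shows a vitals record being packaged and pushed to a shared cloud endpoint for an offsite physician to review. The endpoint URL, field names, and helper function are invented for illustration and do not describe ReMeDi's or any vendor's actual API.

```python
# Hypothetical sketch: a health technician records vitals in an electronic health
# record and pushes it to a cloud endpoint where an offsite physician can review
# it. The endpoint URL and record fields are invented for illustration only.
import json
import urllib.request

EHR_ENDPOINT = "https://ehr.example-cloud.org/api/records"   # placeholder URL

def share_record(record: dict) -> int:
    """POST one patient record to the shared cloud EHR and return the HTTP status."""
    req = urllib.request.Request(
        EHR_ENDPOINT,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # A real deployment would authenticate the technician and encrypt end to end.
    with urllib.request.urlopen(req) as resp:
        return resp.status

record = {
    "patient_id": "village-042-0187",
    "vitals": {"temp_c": 39.2, "pulse_bpm": 112, "bp": "100/70"},
    "notes": "Suspected infection; rapid test positive.",
}
# share_record(record)  # the offsite physician's dashboard would then show the new entry
```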

Systematic innovation has always been a major factor in the development of advanced healthcare. Innovation drives cost-effectiveness, efficiency, and high-quality answers to today's healthcare concerns. Cloud-based technology is proving to be a breakthrough in modern healthcare, greatly improving research outcomes and changing the face of IT. Data handling problems, coupled with complex, expensive, or simply unavailable computational methods, have long caused research complications, especially in the biomedical field. Cloud-based programs, though, are showing considerable potential to overcome these hurdles.

Read more: Why the healthcare industry’s move to cloud computing is accelerating

How small businesses can succeed with their big data projects


It’s no secret that businesses are excited about the possibilities they can gain from big data; large corporations have been tapping big data analytics for years as a way to increase revenue and improve efficiency.

Small businesses, on the other hand, may feel a bit left out of the trend. Big data, after all, sounds like a complex strategy that uses the latest technology to discover important insights; in other words, something that feels out of reach. But writing it off would be a mistake. Small businesses now have access to big data solutions thanks to decreasing costs and a greater understanding of its uses.

The big question small businesses have is how best to use it. With so many possibilities ripe for the taking, what strategy can they pursue to ensure their big data projects are successful? While no strategy is foolproof, there are a number of guidelines small businesses can adopt to increase their chances of success.

Adopting a big data strategy will certainly be a change of pace for a small business. Big data has the potential to completely change how a company operates and what it focuses on. For that reason, it's important that small businesses trying out big data for the first time don't aim too big. Keep expectations manageable. Big data can be collected from a vast array of sources, but if a small business narrows its focus, it will be able to pinpoint specific ways in which it can use big data.

One good place to start is with the knowledge and data a small business already has. There's no need to install Internet of Things sensors to collect data for a company to analyse. Any small business should already have plenty of data to examine, from sales figures to social media interactions. Mining that information for helpful insights is a good first step before growing a big data strategy.

Specificity is something that applies to all big data projects. Small businesses need to be able to clearly define their goals while also specifying the results they want to achieve. Without clear direction, companies will spend a lot of time collecting and analysing data with no real purpose. Organisations must declare the business cases they’re going after with their big data projects. With this in mind, they’ll be able to proceed with confidence.

Even with these guidelines, small businesses may still struggle to implement big data solutions. In many cases, they'll have little knowledge of the tools at their disposal, let alone how to use them in the right way. The answer may be found in the cloud. Just as with many other services, big data analytics is offered by many cloud service providers. Some of the options are free, while others are available for modest monthly fees, but they all grant small businesses capabilities they've never had before.

With big data available through the cloud, small businesses don't have to worry about investing large sums of money in new software and hardware. Cloud services with a modular data centre can also help smaller companies digitise their existing data, putting it in a form where it can finally be analysed for big data insights.

It all comes down to finding and adopting the right tools. There are a lot of tools available out there, so small businesses need to find one that is user friendly and features a short learning curve. Chances are a small business doesn’t have much in the way of big data experts, so picking an easy-to-use analytics dashboard goes a long way toward promoting a successful big data project. The right tool will help small businesses unify their sources, aiding them in getting the most out of the data already in their hands.

With something as complex as big data, no single solution will provide the pathway to immediate success, but big data projects can be more effective with the above tips in mind. Every small business will have different goals and objectives, but the one unifying aspect is that big data can help them all. Small businesses don’t have to feel like big data is out of reach. If anything, there has been no better time to give it a shot than right now.

Read more: Survey warns about security of small businesses using free cloud storage

What will 2017 bring for the storage industry?


In a field as competitive as the storage market, knowing what's going to happen before it actually happens can be extremely valuable. And right now, the storage market is going through some big transitions, some of which are upending the industry.

This is to be expected when dealing with a business that incorporates the latest technological advances, but even so, the rapid evolution underway at the moment is impressive to watch. The storage market as we know it today will look quite different in a decade. But let's not get too far ahead of ourselves; we'll keep our projections focused and take a brief but insightful look into the not-too-distant future to see what the experts are saying about the storage market for 2017.

If one thing is consistent across the range of predictions being made, it’s that the role of cloud storage will be integral. More businesses are turning to the cloud for a wide variety of services with storage being one of the primary choices.

In a study released earlier this year, 451 Research found that the cloud storage market was expected to double by next year. That's only the start of what could be enormous growth for cloud storage. Research conducted by MarketsandMarkets similarly found huge rates of growth for the market, only its study went further. The organisation predicts that after generating nearly $19 billion in 2015, the cloud storage market will grow to more than $65 billion by 2020. These findings indicate that businesses are quickly moving many of their storage needs to the cloud.

That's not the only transition many enterprises are making regarding their storage options. For many years, companies relied on hard disk drives for their storage needs, but several years ago they began to move into the realm of solid state drives (SSDs). SSDs offer better performance, are much faster than hard disk drives, and are better able to handle the workloads businesses run these days. The one downside was the price: SSDs were simply the more expensive option. But that's not necessarily the case anymore. The price of solid state drives is declining, and many experts predict they'll reach the same price range as hard disk drives in 2017. That means SSDs will not only become more popular among enterprises, they'll increasingly be used as a replacement for hard disk drives. Hard disks may still be used for archived data that needs to be kept for the long term, but the vast majority of storage is likely to be solid state.

While many companies head over to cloud storage, we can also expect more intense competition. The usual companies will be out in full force, including Google, Microsoft Azure, and of course AWS, but that won't be the only source of competition. According to Mark Lewis, CEO of Formation (and the former CTO and GM of EMC), the legacy players will find new rivals in regional cloud service providers. These providers will fulfil niche company needs, offering specialised services that bring more value to their customers.

This marks a big change from the much larger cloud service providers that usually offer features meant to cover a wide range of clients. Whether a company is trying to set up a modular data centre or working to analyse years of sales data, a specialised provider will have a specific service designed to help them.

Lewis also predicts an increase in adoption of software defined storage. Instead of being used for smaller projects, software defined storage will become a major factor in big data initiatives, an especially important role considering how vital big data is becoming for enterprises. Part of the reason businesses will move to software defined storage is how easy it has become to migrate legacy applications to it, turning its advantages into key strengths every business can use.

Staying on top of these changes in the storage market can help a business prepare for the future. It's all about keeping technology up to date and using the latest advances for the benefits they provide. While these predictions are ultimately educated guesses, they can provide a helpful guide for a company's future plans.

Overcoming the Achilles heel of flash storage: Making flash last longer


More corporations than ever are taking advantage of flash storage, and you use it every day, be it in your computer, phone, wearable device, or the USB drive you back your data up on. It's faster, more durable, and more reliable than a spinning disk, but it still has one weakness: it deteriorates over time.

In theory, all of flash's advantages eventually go out the window as its lifespan wanes, and since it's more expensive than a spinning disk, experts are trying to find new ways to make it live longer.

How flash storage deteriorates

To put it simply, flash storage is made up of a group of cells. Each of these cells is individually written with data, and then holds that data for you to reference later. Your pictures, documents, and videos are all broken up into many tiny pieces and individually written into these cells, similar to writing something on paper with a pencil.

Like pencil and paper, when you erase what a cell holds (removing a photo or document) and replace it with something else (a new photo or document), you force the cell to rewrite itself, much like erasing pencil marks from paper and writing over them. Use the eraser too often and the paper thins out and eventually rips. That is the Achilles heel of flash storage.
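
To make the analogy concrete, here is a toy model of a single cell whose program/erase cycles are counted against a fixed endurance limit. The endurance figure is an illustrative assumption; real NAND endurance varies widely by flash type.

```python
# Toy model of a single flash cell: each erase/rewrite cycle consumes a little of
# the cell's endurance, and past a limit the cell can no longer hold data reliably.
# The endurance figure is illustrative only; real NAND varies widely by type.
class FlashCell:
    def __init__(self, endurance=3000):
        self.endurance = endurance      # rated program/erase cycles
        self.erase_count = 0
        self.value = None

    def write(self, value):
        if self.value is not None:      # rewriting means erasing first
            self.erase_count += 1
        if self.erase_count >= self.endurance:
            raise IOError("cell worn out: exceeded program/erase endurance")
        self.value = value

cell = FlashCell(endurance=3)
for v in ["photo_v1", "photo_v2", "photo_v3", "photo_v4"]:
    cell.write(v)                       # the final write raises: the 'paper' has ripped
```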

How experts are making flash last longer

1. Machine learning adjusting the load to make flash cells last longer: There are high points and low points in everything's lifespan, and flash is no different. The kind of load it can take early in its life is different from what it can take later on, with a few ups and downs in between.

Experts are developing and testing machine learning to automatically identify when these high and low points occur, and to deliberately lighten or redistribute the strain that rewriting puts on flash storage cells so they aren't overly taxed and worn out sooner. This process is so specific, so fast, and so difficult to detect that it's impossible for human beings to do it manually, which is exactly why the computer itself is set to the task.
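
As a rough, hypothetical sketch of the idea, the snippet below stands in for the learned model with a simple moving average of recent write traffic: once the smoothed rate exceeds an assumed wear budget, the controller would spread or throttle writes. The thresholds and the notion of a daily "write budget" are illustrative assumptions, not a description of any shipping product.

```python
# A highly simplified stand-in for the idea above: instead of an actual learned
# model, an exponentially weighted moving average of recent write traffic decides
# whether the controller should throttle or spread writes to protect worn cells.
# The thresholds and the 'wear budget' notion are illustrative assumptions.
class WearAwareScheduler:
    def __init__(self, daily_write_budget_gb=40.0, alpha=0.2):
        self.budget = daily_write_budget_gb   # writes/day the drive can sustain for its rated life
        self.alpha = alpha                    # smoothing factor for the moving average
        self.estimated_rate = 0.0             # smoothed GB/day actually observed

    def observe(self, gb_written_today):
        self.estimated_rate = (self.alpha * gb_written_today
                               + (1 - self.alpha) * self.estimated_rate)

    def should_redistribute(self):
        # When the smoothed rate exceeds the budget, shift load to less-worn
        # cells (or another device) rather than keep hammering the same ones.
        return self.estimated_rate > self.budget

sched = WearAwareScheduler()
for day_gb in [30, 35, 80, 95, 120]:      # a sudden spike in write traffic
    sched.observe(day_gb)
print(sched.should_redistribute())        # True once sustained load exceeds the budget
```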

2. Removing static data from wear-levelled flash storage: There is static data and there is active data, both of which live up to their names.

Static data, once written to a cell, only needs to be written once, so those cells can take a break; their job is done. Active data, however, such as a program that is written and rewritten often, makes its cells work constantly. To keep any one cell from wearing out too fast, wear levelling is a built-in flash feature that spreads that load across multiple cells.

The problem is that static data makes it impossible to spread that wear evenly; it isn't active, so it can't be moved. This causes the active data cells to wear out considerably faster, and when they wear out, the whole drive wears out, dramatically reducing its lifespan. That's why experts have created wear-levelled and non-wear-levelled flash storage, so active data and static data can each have their own storage and avoid overburdening the flash.
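
A minimal sketch of that separation, assuming a toy flash layout: static data is parked in its own region, while active writes always go to the least-worn block in the wear-levelled region. This is not a real flash translation layer, just the principle in a few lines.

```python
# A minimal sketch of the separation described above: static data goes to its own
# region so the wear-levelled region is free to rotate active data across the
# least-worn blocks. Block counts and the API are illustrative, not a real FTL.
class TwoRegionFlash:
    def __init__(self, active_blocks=8):
        self.active_wear = [0] * active_blocks   # erase counts for the wear-levelled region
        self.static_region = {}                  # write-once data, no levelling needed

    def write_static(self, key, data):
        self.static_region[key] = data           # written once, then left alone

    def write_active(self, data):
        # Pick the least-worn block so erase cycles stay evenly spread.
        block = min(range(len(self.active_wear)), key=lambda i: self.active_wear[i])
        self.active_wear[block] += 1
        return block

flash = TwoRegionFlash()
flash.write_static("firmware", b"...")
for _ in range(20):
    flash.write_active(b"log entry")
print(flash.active_wear)   # wear spread evenly: [3, 3, 3, 3, 2, 2, 2, 2]
```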

3. Digital signal processing to make bit errors more readable: When cells are written and rewritten too often, they can misread the finer details of what they're supposed to hold, which essentially makes them mistranslate the data. This is called a bit error, and the cells have to work harder to decipher what the data really means, putting a strain on them and shortening their lifespan.

However, experts have developed storage systems that use digital signal processing (DSP) to take on half of that burden, splitting up the strain so that bit errors can be avoided or handled faster and putting less strain on the flash itself. This is, however, a temporary solution that extends flash life only to a certain degree.
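
Real controllers rely on strong error-correcting codes (such as BCH or LDPC) alongside DSP techniques; the toy example below uses a simple three-way repetition code with majority voting purely to illustrate the principle that redundancy lets an occasional flipped bit be recovered.

```python
# Toy illustration of recovering from bit errors with redundancy. Real controllers
# use far stronger codes (BCH/LDPC) plus signal-processing tricks; a 3x repetition
# code with majority voting just shows the principle in a few lines.
def encode(bits):
    return [b for b in bits for _ in range(3)]          # store every bit three times

def decode(stored):
    out = []
    for i in range(0, len(stored), 3):
        triple = stored[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)        # majority vote per triple
    return out

data = [1, 0, 1, 1]
stored = encode(data)
stored[4] ^= 1                                          # a single worn cell flips one bit
assert decode(stored) == data                           # the flip is corrected transparently
```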

Conclusion

Flash storage is like all things: it's mortal. It's impossible, for now, to create flash that doesn't wear out, but in the meantime experts are tailoring certain storage to spread out the work and bear the additional weight, and even teaching flash storage itself how to better allocate its efforts in converged infrastructure to keep it fresh. The longer the storage lives, the more cost-effective it is, which in turn drives more funding toward improving its lifespan.

Why preparation is key to securing your cloud migration


The benefits of big data are real. And with so many businesses looking to migrate their data to the cloud, they want to make sure everything arrives safely and intact. After all, much of this data contains sensitive and proprietary information, and the prospect of moving it from the safety of the corporate firewall to a cloud environment is cause for concern.

Still, as data volumes continue their exponential growth, moving vast sets of structured and unstructured data from the restrictive confines of an on-premises Hadoop deployment to a cloud-based solution will be an inescapable choice for companies looking to stay competitive.

Fortunately, proper preparation is the key to ensuring a smooth and secure transition to the cloud. With that goal in mind, here are some steps your business can take on the preparation side to secure your cloud migration.

Pick your cloud vendor carefully

Data migration to the cloud necessitates a cloud host, and there are a variety of modular cloud solutions to choose from. The key to choosing the right cloud vendor for your organisation lies in understanding your big data needs. While price is certainly a consideration, other criteria, such as data security and how well the vendor is equipped to carry out the big data storage and analytics tasks you need, are critical.

If data security is your main concern, then vet your vendors accordingly. If you need a vendor that excels at app hosting, make sure that the hosts you are considering excel in that area. If rapid data analytics and reduced time-to-insight are top of mind criteria, then a versatile cloud solution such as Spark as a Service would be worth your consideration. In all cases, make sure that the cloud vendor’s platform conforms to industry and internal compliance standards before entering into a cloud service agreement.

Take baby steps

When it comes to adopting a new and promising technology, the tendency for many companies is to want to jump in with both feet. But adoption typically comes with a learning curve, and cloud adoption is no exception. By nature, data migration to the cloud can often cause some downtime, which could potentially impact the business. The smart approach to mitigate the risk of business disruption is to take small steps, beginning with the migration of apps and data that aren’t classified as sensitive or mission critical. Once the security and reliability of the cloud host have been assessed, the next bigger step of loading more sensitive data into the cloud can be taken.

Get clear on security

When it comes to data security, you need to be clear on which security protocols your cloud vendor uses and the degree to which those protocols can ensure that sensitive information remains private. However, if your organisation is like most, you won't be transferring all of your data to the cloud. Some data will remain on your own servers. This means you now have data in two different environments, not to mention cloud-hosted apps that come with their own security systems.

Multiple data environments can lead to confusion for your IT team, the kind of confusion that wastes valuable time and reduces the productivity of your big data initiative. To solve this data security dilemma, you'll need to implement a broad, coordinated security policy with technology and policy management protocols that cover apps in the data centre and apps in the cloud.

Be strict with BYOD

Migrating data to the cloud enables employees to collaborate like never before. And with the proliferation of mobile devices such as smartphones and tablets, more and more businesses are bringing the practice of bring your own device (BYOD) into the workplace to make workforce collaboration even more convenient. 

However, granting employees access to potentially sensitive data and applications in the cloud poses a number of security risks, especially since mobile devices are fast becoming favoured targets for skilled hackers. Organisations looking to leverage BYOD need to implement and enforce strict protocols for how data may be accessed and used, along with guidelines that clearly spell out which employees have permission to access sensitive data and cloud-based applications on mobile devices, and which do not. Like all security technology and protocols, BYOD safeguards should be managed solely by the IT department to ensure consistent security assessment across the organisation.

As technology advances and data volumes grow ever larger, the rush to the cloud will only intensify. That said, data migration to the cloud cannot be rushed. By following these and other guidelines, and by exercising careful planning and preparation to ensure a successful and secure migration, organisations stand to reap the many bottom-line benefits that a cloud solution offers.

AWS and Azure get the highest federal security rating: What happens from here?


Cloud services have been storing customers' data for many years now, but for several vendors the number of prospective clients has recently increased dramatically.

Back in late June, the announcement was made that three vendors had received special certifications from the federal government, allowing them to store sensitive data that the government had on hand. Two of those providers are among the most popular within the cloud market, Amazon Web Services (AWS) and Microsoft Azure, while the third is CSRA’s ARC-P IaaS, a vendor that might not be as universally known as the others but still carries enough weight for those in the know. The news was certainly noteworthy for those providers, but it also has tremendous implications for federal agencies as well as the cloud market as a whole.

The federal government is no stranger to storing data with cloud vendors. The majority of cloud providers knew this early on and provided special services tailored specifically to government needs. Back in 2011, for example, Amazon launched AWS GovCloud, and Microsoft started a similar service. The idea was for government agencies to use the cloud to store data of various types. Thousands of government customers quickly hopped on board, but as helpful as the service was, the most sensitive information the government held still couldn't be placed in the cloud, at least not until rigorous security standards were met.

A newly created list of requirements, called the High Baseline, was established under the Federal Risk and Authorization Management Program (FedRAMP). It contains more than 400 standards set by the National Institute of Standards and Technology that cloud providers must meet in order to receive certification allowing them to store sensitive government data. That is the goal providers like Amazon and Microsoft have been working toward: qualifying for what would be a new influx of government customers. Those providers could already compete for roughly half of the federal government's IT spending on data, but the other half could be opened to them only if their security standards were raised. Based on the announcement, the government says the three named vendors have cleared the bar.

So what types of data will the government be able to store in the cloud? The announcement states that the data in question is considered highly sensitive but unclassified. This includes, but is not limited to, financial data from agencies, patient medical records (likely from healthcare programs run by the government), and even law enforcement data. Data of this sort is considered highly sensitive because its leak or theft would be a great detriment to an agency's operations, clients, resources, or employees.

The announcement is certainly big news for government agencies, as being able to store government data securely in the cloud will make their jobs much easier. But that's not the only area that will benefit. The cloud market itself stands to gain quite a bit from the news. For many years, as the cloud has picked up steam and grown in popularity, organisations that have been reluctant to transition to cloud services have often cited concerns over cloud security.

It's not easy to hand over potentially sensitive information to a third party, and many executives wanted to know their data would be in good hands. Cloud providers have answered many of those concerns by placing greater attention on security improvements, but doubts persisted among businesses of all types. The news that the federal government is willing to trust some cloud providers with highly sensitive data shows a great degree of confidence in vendors' ability to protect information. This can lead to more companies turning to the cloud for their flash storage needs.

Security standards from the government have now been met by some of the most popular cloud providers, allowing them to house sensitive government data. With the standards established, we may see other providers work hard to improve their own security so that government agencies can consider them for data storage purposes. Considering how competitive the cloud market is, it would be a wise move for providers to ensure they can work with as many clients as possible.

Read more: AWS and Microsoft get FedRAMP approval for sensitive cloud data

Six key benefits of cloud computing in the healthcare industry


The cloud has many benefits for businesses – but it is also making advances in the medical industry, becoming a vital tool for healthcare professionals everywhere. What makes it so valuable?

Better collaboration

Collaboration is vital to the healthcare industry, which makes the cloud a perfect companion in the field. By allowing professionals to store and access data remotely, healthcare professionals around the world can access patient data immediately and apply the necessary care without delay. In addition, remote conferencing and up-to-the-second updates on healthcare developments and patient conditions are allowing doctors to save those precious, life-saving minutes.

Greater reach, especially during times of disaster

When disaster strikes, getting the right professionals to the places they need to be, or giving the doctors already present the information they need, is a difficult task. Being able to consult with one another, send requests for additional resources or manpower, or simply keep each other updated on a disaster victim's condition can be the difference between a life lost and a life saved.

An on-site doctor with very little experience in surgery can now receive real-time guidance from an expert to perform a field surgery, for example, with all of the medical equipment present transmitting real-time information from one source to the next to ensure the best work is done.

Better storage – lower cost

The cloud makes it possible not only to hold more information but to do so at a lower cost, much like working with software defined storage. This gives even smaller hospitals access to the kind of information they need to offer the best care, without a price tag that could either put them under or force them to make cuts where cuts shouldn't be made.

Better use of big data to treat patients

When you hear “big data,” you likely think of businesses mining data for marketing or production strategies. For the medical field, it's much the same, except instead of narrowing down the tastes of consumers, big data allows doctors to narrow down the conditions of patients, comparing them to others to deliver a more focused and accurate assessment of their ailments.

Additionally, instead of figuring out how to make a product more appealing, medical professionals can tailor their care or gain better insight into the conditions themselves, so there's much less room for error in treatment. Big data, however, is far too big for any one server to manage; with the cloud, doors are opening for the healthcare industry to take advantage of it first-hand.

Improved medical research

Much in the way big data is making it possible for doctors to treat their patients better, the cloud's ability to store and share data is speeding up the research process. By gathering outside data from multiple fields, data analysts can use the cloud to pool this data and condense it into better results, giving medical professionals a clearer and more advanced picture of the subjects they're researching. These are the sorts of advances that cure diseases and improve the care being given.

Remote patient care

We’ve all heard about how the Internet of Things (IoT) is making it easier to drive your car or even make coffee, but with the cloud stepping into the medical field, could the IoT save your life? The answer is yes.

With new mobile devices being made to monitor a patient's condition, and even smartphone applications making it possible to keep your doctor up to date on your condition or get a remote consultation, the cloud is making it possible to receive high-quality care without ever stepping into a hospital. If you're unable to get to the hospital, don't want to spend the money, or dislike visiting doctors altogether, you can use new cloud-powered devices to transmit your condition or ask for advice from a doctor on standby, allowing both patients and medical professionals to catch dangers early.

How Trulia is using Hadoop and big data to power one of the largest global real estate sites


In one form or another, businesses have always employed data analysis to inform decision making. But today’s state of the art big data analytics platforms like Hadoop have taken things to a whole new level.

This new level of business functionality, called converged infrastructure, optimises business processes across an entire enterprise by grouping multiple IT components into a single computing package.

For example, converged infrastructure may include components like servers, storage, networking equipment, and software for IT infrastructure management, automation, and orchestration. That’s a lot of moving parts—a lot of data storage and processing power. And data analytics platforms like Hadoop route all that power and data through a central user interface allowing massive amounts of data to be crunched faster, easier, and cheaper. That may sound pretty incredible, but that neat little description falls short of demonstrating how big a deal converged infrastructure really is. So let’s take a look at how the online real estate giant, Trulia, is doing it.

One of the largest online residential real estate marketplaces around, Trulia claims more than 55 million unique site visits per month. Its service model centres on providing relevant and unique insights about properties, neighbourhoods, commute times, and school districts. Trulia's insights are available at every level of the real estate industry, including home buyers, sellers, and renters. But Trulia's core strength is data.

Back in 2012, Trulia unveiled its new service to help real estate agents identify the best leads by tapping a trove of data the company had started analysing the year before, according to Reuters. Trulia Insight, as the service is called, shows real estate agents which potential buyers have pre-qualified for a mortgage and whether they are looking to buy a home in the next six months. And with the help of Hadoop, Trulia updates its website with new insights every night. A recent article on CIO.com explains that “every night, Trulia crunches more than a terabyte of new data and cross-references it with about two petabytes of existing data to deliver the most up-to-date real estate information to its users.”

In that same CIO.com article, Zane Williamson, one of Trulia’s senior DevOps engineers explains how their daily, terabyte data processing includes information from public records, real estate listings, and user activity. “We process this data across multiple Hadoop clusters and use the information to send out email and push notifications to our users,” Zane said. “That’s the lead driver to get users back to the site and interacting. It’s very important that it gets done in a daily fashion. Reliability and uptime for the workflows is essential.”
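
As a rough illustration of what such a nightly batch job can look like, here is a minimal PySpark sketch that joins a day's new records against a historical dataset and writes out candidate notifications. It is not Trulia's actual pipeline; the paths, column names, and matching rule are invented for the example.

```python
# A minimal, hypothetical sketch of a nightly batch job in PySpark (Spark running
# on a Hadoop cluster). This is NOT Trulia's actual pipeline; the paths, field
# names, and matching logic are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("nightly-refresh").getOrCreate()

# Tonight's newly arrived records (listings, public records, user activity),
# assumed to carry property_id, user_id, and new_price columns.
new_events = spark.read.parquet("hdfs:///data/incoming/today/")

# The existing historical dataset (petabyte scale in practice),
# assumed to carry property_id and last_known_price columns.
history = spark.read.parquet("hdfs:///data/warehouse/listings/")

# Cross-reference tonight's events with known properties to find changes worth
# notifying users about (price drops, in this toy example).
changes = (new_events
           .join(history, "property_id")
           .where(F.col("new_price") < F.col("last_known_price"))
           .select("property_id", "user_id", "new_price"))

# Write the result where a downstream email/push-notification job can pick it up.
changes.write.mode("overwrite").parquet("hdfs:///data/notifications/today/")
```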

Two years after Trulia converged its IT infrastructure on the Hadoop platform, the company was acquired by its largest competitor, Zillow, for $3.5 billion. By the fourth quarter of 2015, Zillow announced that it had finished integrating the Trulia platform. In the wake of the merger, Geekwire.com reported that Zillow's 2015 earnings increased by seven cents per share, after analysts had projected a three-cent loss.

Now Trulia and Zillow are moving forward with the same Hadoop-based data analytics platform, and according to Trulia's vice president of data engineering, Deep Varma, they're moving toward providing real-time data to users while enhancing the emotional aspects of their user experience. At the same time, according to siliconangle.com, Trulia and Zillow are working toward leveraging California's Open Data Movement to further enhance their real estate insights with crime scores and public transit data.

Why Hong Kong is playing “cloud catch up” – but moving quickly


It may be strange to hear, but Hong Kong has fallen behind much of the rest of the world when it comes to adopting cloud computing.

Hong Kong is well known as an enormous financial hub for south-east Asia and one of the busiest centres of economic activity in the world, so to think that it's lagging on the technological front would seem out of character. And yet, that's exactly the case.

In many ways, Hong Kong is playing a version of “cloud catch up” with the rest of the world, with reports and surveys showing that there’s a general reluctance among businesses to make the cloud part of their operations. Despite this, there are other factors within Hong Kong that are pushing more cloud solutions, to the point where we may only be a few years away from Hong Kong truly catching up with what others are doing.

The hesitation shown by businesses in Hong Kong is actually reminiscent of the worries many business leaders expressed about the cloud several years ago. While not necessarily a new technology, the cloud was quickly gaining steam, and organisations wanted to know if it was the right fit for them. The usual issues were brought up, most of them pertaining to security.

Once businesses became more familiar with the cloud, and once cloud providers addressed the worries, adoption skyrocketed, to the point where it’s hard to imagine doing business without cloud computing at all. Businesses in Hong Kong are facing the same challenges. Many of the top executives indicate they are worried about cloud security, while others say it’s more a question about the legal restrictions dealing with the cloud.

Other factors also play a role in the lack of cloud adoption, factors that aren't as prevalent in other parts of the world. As mentioned above, the finance industry in Hong Kong is especially large and influential, and by their very nature financial institutions don't like taking on risk. For many years, the cloud was seen as a risky manoeuvre, so many financial companies chose to stay away from it. That thinking still holds sway in Hong Kong, though it is changing by degrees. Hong Kong CIOs have also expressed less interest in the cloud: a recent survey from Gartner shows that CIOs there rank cloud computing as a low priority, especially compared with nearby regions like India, Southeast Asia, and Australia. Again, overcoming these obstacles requires a change in mindset, one that is certainly happening, though it lags behind other parts of the globe.

Another reason for poor cloud adoption is the lack of availability of cloud services. Amazon Web Services (AWS) is the leading public cloud provider in the world, but for the longest time it hadn’t spread to Hong Kong. AWS now considers Hong Kong to be what is referred to as an “edge location”, which is a definite improvement, but more work needs to be done. Luckily, it appears that provider proliferation is happening in the area. Public cloud providers like Amazon are turning their attention to the international market, with major in-roads being made in India, China, and Southeast Asia. Microsoft has partnered up with Hong Kong tech companies to provide the cloud version of its Office software. Google Cloud Platform is making strategic moves in the area as well. That’s not to mention Alibaba’s recent opening of its first data centre in Hong Kong.

It’s clear that cloud providers see the value of moving into Hong Kong and are implementing plans that will see it grow over the next few years. This is aided by legal reforms and the cooperation of Hong Kong companies.

The future of cloud computing in Hong Kong is certainly bright. According to research from IDC, spending on cloud services in Hong Kong is expected to reach nearly $700 million by 2017. Companies see the opportunity and they're getting on board with the idea. In a few years, it may even seem strange that we ever thought of Hong Kong as being so far behind in the cloud. Whatever happens, it's almost a sure thing that the cloud will prosper in the region, and with it, the companies that take advantage of the solutions it provides.

The race to zero: On the path to no-cost cloud computing


It seems everyone is in a race, and nowhere is the pace of modern business more evident than in cloud computing. While prices for just about everything else are going up, cloud storage prices are nosediving. Competition between leaders such as Google, Amazon, and Box is driving prices down, and providers across the industry are being pressured to join the "race to zero".

This race has been accelerating. Microsoft began providing unlimited storage in October 2014 for its OneDrive service included with Office 365, in response to Google Apps for Work, which provided numerous features that threatened to outdo Microsoft's well-known business products. InformationWeek provided insight into this story. Also mentioned was a natural check on the race: while vendors are providing ever-cheaper and even free storage, managing the physical resources behind it remains expensive.

For example, Amazon made 47 price reductions in six years by late 2014, while its revenue growth slowed from about 80% to below 39% between 2011 and 2014.

Storage will soon be free, many experts are not shy to admit. However, there are many ways around this, and Office 365 is an example. The resources in this Microsoft product are not free and require an annual subscription, so technically users must still pay for what they get; storage is essentially bundled in as part of that subscription. This represents another trend in the race to zero.

Cloud services worth paying for

Cloud services company Box is also on the bandwagon. Offering affordable storage, it has introduced services people and companies don’t hesitate to pay for. Security is one of them. Nobody is willing to skimp on security, as data is more vulnerable than ever. Box is also developing apps for document and project management as well as collaboration. Dropbox is another example: it is pushing Dropbox for Business with a range of security and administrative functions.

In 2014, IBM announced it was investing $1.2 billion in expanding its data centres. Microsoft has said it is spending $750 million to improve its Wyoming data centre alone. The trend has been to offer services that customers pay for not as a one-off purchase but throughout the year, and the investment required to deliver them means only a few companies can handle the pressure.

Memory prices plummet

The falling price of memory is supporting the race to zero. According to Business Insider, a hard drive with a gigabyte of capacity cost $9,000 in 1993; in 2013, the same amount of storage cost about $0.04. Providers have also passed on the savings in processing power. A $3 million IBM mainframe had 65 kilobytes of memory, operated at 0.043MHz, and supported a lunar landing mission in 1969. The iPhone 6 packs 16GB of storage and vastly more processing power, and costs about $200.

Isn’t it really a race to the top?

It seems the race to zero is being driven in part by competition, despite the diminutive cost of memory today. Dallas Salazar, writing for Seeking Alpha, discusses this very topic. There's a logical notion that cloud storage gets people's attention. Amazon is now offering mobile apps for Cloud Drive, a product available for $60 per year under a plan it calls Unlimited Everything. The Unlimited Photos plan is even cheaper, and its photo storage has no limits at all. Free storage is a pretty good selling point for larger-scale paid services, as companies gain greater knowledge of cloud computing.

Whether free storage is a strategy to draw customers, or simply a passing of dirt cheap costs from provider to customer, there is no doubt the race to zero is well and truly on.