All posts by pratikdholakiya

Assessing just how safe you are in the cloud – and three tips to secure your business data

Each day, Internet users generate an average of 2.5 quintillion bytes of data, according to recent research from Domo. Per minute, The Weather Channel receives more than 18 million forecast requests, Netflix users stream almost 70,000 hours of video, and Google conducts 3.5 million searches.

90% of all data available today was created in the last two years. As a result, an ever-increasing amount of enterprise data is being stored in the public cloud, both exclusively and in hybrid configurations. But not everything in this paradise is as smooth as we would like. Various cloud horror stories have forced us to ask: how safe are we in the cloud?

Remember when hundreds of companies exposed PII as well as private emails (including confidential business emails) to the world through Google Groups back in 2017? A small settings error was to blame for the massive data leak, and companies such as Fusion Media Group, IBM's Weather Company, Freshworks, and SpotX were affected by the issue.

In another incident, Stanford University suffered not one but three separate data breaches. These breaches, caused by “misconfigured permissions”, exposed not only personal employee information (including salary details and Social Security numbers) but also student sexual assault reports and confidential financial aid reports. In fact, there are many such cloud data security horror stories, enough to give you nightmares.

The cloud is dangerous

It is true that the cloud offers flexibility, availability, and low costs. But at the same time, when it comes to keeping information private, data storage in the cloud is an increasing concern for anyone who uses file storage and sharing tools like Google Drive, Dropbox, Microsoft OneDrive, Amazon Drive, and the like.

Data in the cloud is stored in encrypted form, meaning it is encoded with a specific encryption key; without the key, the stored files look like gibberish. A hacker needs to obtain or crack these keys to read the information. The most important question, therefore, is: who holds the key? The answer to that question lies behind many data security breaches.

In most cases, the commercial cloud storage systems keep the key themselves so that their systems can see and process user data. Moreover, these systems can access the key as a user logs in using his/her password. While this is perhaps the most convenient way for cloud storage systems, it is also less secure. Any flaw in the service provider’s security practice can leave the users’ data vulnerable. Dropbox, for example, has been severely criticized for its security and privacy controversies.

Again, there are some cloud services that allow users to upload and download files only through service-specific client applications, which also include encryption functions. These service providers allow users to keep their encryption keys themselves and are therefore a bit more secure than the others; however, they aren’t perfect and there are chances that their own apps might be hacked and compromised, allowing the intruder to access your files.

How to protect yourself and your data

While there is no way to guarantee that your information is safe in the cloud, there are protective measures you can take to deal with the issue of cloud privacy. Here are three data protection tips to reduce the risks of your cloud experience.

1. Encrypt your data and use encrypted cloud services

As mentioned earlier, cloud storage services that offer local encryption and decryption of data alongside storage and backup are safer options than those that keep the encryption keys themselves. SpiderOak, for example, has a “zero-knowledge” privacy policy, meaning neither the service provider nor its server administrators have access to your files. Similarly, use virtual private networks that keep your information encrypted and hidden from intruders and that do not store or track your data when you use their services. This ExpressVPN review, for instance, explains how the company has a strict “No Logging” policy to ensure an optimal user experience.

In addition, to add an extra layer of security, you can encrypt your data before uploading it to the cloud. Many software tools let you encrypt your files and password-protect them before moving them to the cloud.
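As a sketch of this idea, the snippet below encrypts data locally before it ever reaches a cloud folder, using the third-party Python `cryptography` package; the key handling and file contents here are illustrative assumptions, not the workflow of any particular provider:

```python
# Sketch: encrypt data on your own machine before uploading it to any cloud
# service, so the provider only ever sees ciphertext.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_bytes(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt data with a symmetric key that never leaves your machine."""
    return Fernet(key).encrypt(plaintext)

def decrypt_bytes(key: bytes, token: bytes) -> bytes:
    """Recover the plaintext; only the key holder can do this."""
    return Fernet(key).decrypt(token)

key = Fernet.generate_key()              # store this securely, offline
secret = b"confidential business email"
ciphertext = encrypt_bytes(key, secret)  # this is what gets uploaded
assert decrypt_bytes(key, ciphertext) == secret
```

Because you generate and keep the key yourself, even a breach at the storage provider exposes only unreadable ciphertext.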

2. Backup your data locally

Always keep backups of your data so you can access it even if the copy in the cloud gets lost or corrupted. You can keep the backup on an external storage device or in another cloud storage service. The former is perhaps the better option, as you can access it even without Internet connectivity.

In addition, avoid keeping sensitive information such as passwords, Social Security numbers, credit/debit card details, banking information, or intellectual property such as patents and copyrights in the cloud. This kind of information, if compromised, can result in serious data leakage.

3. Have strong passwords

Although you may have heard this before, making your passwords stronger is one of the best ways to safeguard files stored in the cloud. Even the U.S. government has revamped its password recommendations. The days of picking your favorite phrase as a password and replacing a few characters with symbols are practically over. Also, stop reusing one password across multiple services.

Instead, choose a long, random string of words as your password and add a combination of special characters, capital letters, or numbers to make it stronger. Most data leaks happen because of easy-to-guess passwords. If needed, test your password's strength from the safety of your own computer. Another good practice is to change your password every 90 days or less. This will also help keep internal intruders away, avoiding workplace breaches.
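A minimal sketch of this advice, using Python's standard-library `secrets` module; the short word list is an illustrative stand-in for a proper diceware-style list of thousands of words:

```python
# Sketch: build a long passphrase from random words plus digits and a symbol,
# using the cryptographically secure `secrets` module (not `random`).
import secrets
import string

def make_passphrase(words, n_words=4, n_digits=2):
    """Pick random words, capitalise one, and append random digits and a symbol."""
    chosen = [secrets.choice(words) for _ in range(n_words)]
    idx = secrets.randbelow(n_words)
    chosen[idx] = chosen[idx].capitalize()
    digits = "".join(secrets.choice(string.digits) for _ in range(n_digits))
    symbol = secrets.choice("!@#$%^&*")
    return "-".join(chosen) + digits + symbol

# Illustrative word list; a real one should have thousands of entries.
wordlist = ["cloud", "otter", "granite", "violet", "kettle", "mosaic"]
pw = make_passphrase(wordlist)
print(pw)  # e.g. something like "violet-Kettle-mosaic-cloud73!"
```

The length and randomness of the word sequence, not the sprinkling of symbols, is what makes such a passphrase hard to guess.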

Conclusion

Keeping your data safe in the cloud is all about remaining secure, vigilant, and resilient. Put a multi-layered data security system in place and keep monitoring to ensure your systems remain secure. If you still feel your data is at risk of a breach, take control and address the issue quickly so you can recover before it causes havoc. Don't rely solely on your cloud service provider's security assurances; always have your own security measures in place from the beginning. After all, it is better to be safe than sorry.

Five tips for creating successful company-wide data security training

Creating a safe online environment for your business is a major concern for leaders today. With the number of data breaches rising steadily and consumer trust in data management declining, it’s no wonder that improving the security of IT systems is the number one priority for 55% of companies.

Employees can either be your greatest strength or your greatest weakness when it comes to data security. Unfortunately, one of the leading causes of data breaches is internal negligence due to poor training. However, when the staff is educated and instructed on the proper practices, the risk of cyberattacks or data leaks can be reduced by up to 70%.

If your business is ready to enact a company-wide data security training plan, here are some tips that can improve results and ensure you are properly prepared for anything that comes your way.

Make sure everyone understands the entire business process

Businesses in today’s world rely on vast collections of complex datasets. First and foremost, in order to keep this valuable information secure, everyone must understand the processes and how their work fits into the big picture. Whether it's managing customer profiles or translating marketing data into the main CRM system, there cannot be any gray areas.

Explaining to employees how the pieces of the data puzzle fit together will make it much easier to implement security procedures and new systems. For example, one common cloud-based system that businesses rely on these days is Salesforce. However, 43% of teams report that they use less than half of its CRM features. This can result in poor data management and reduced returns. By using a proper Salesforce training system that explains how datasets can be used throughout the company, you can fill in the gaps and help your team better understand the data lifecycle.

The need for data security education is huge among businesses. In order for information to be utilized properly, there must be a set system in place for its storage and organization. Be sure that your onboarding strategies cover all of the bases so everyone is on the same page.

Assess the needs of each department

Of course, each facet of your business has different needs and priorities, especially in terms of data collection and access. For instance, the accounting department will need higher security for sensitive financial information from clients while marketing teams will require consumer behavior data points to guide their strategies.


Rather than thinking of a data security system as a one-size-fits-all blanket, you must take each department’s needs into consideration and be sure that your approach covers their priorities. Talk to the heads of each department to determine how and where data security can best be implemented to accommodate their day-to-day.

Determine when and how training will be conducted

Introducing a company-wide security training program is by no means a small task. Every single organization is made up of people, and each person learns differently. Therefore, in order to be sure that everyone is on the same page, there must be some careful planning about the way that training will be conducted.

Make sure that your training courses cover the most important topics for the best results. Keep in mind, not all your employees are data security experts; try not to get too technical and keep it user-friendly. Stick to the main points and offer clear and easy solutions.

It’s interesting to note that businesses that hold a single training program every year have lower retention rates than those that offer monthly refreshers. If possible, it may be in your best interest to offer regular classes throughout the year to make sure employees stay up to speed.


Develop a system to test the effectiveness of training

According to Dashlane’s report, 90% of businesses fall prey to attacks due to internal threats and mistakes made by employees. The most common culprits are phishing attempts, weak passwords, and accidentally sharing private information. Therefore, part of your training must address these top issues, as well as the solutions to combat them.

In addition to providing educational information, your training must have a system in place to check that everyone understands what they have learned. Since much of the information related to data security is highly technical, not every employee will get it the first time around. A short test at the end of training will show what your team learned on paper, but simulations and test runs will give you a better idea of how they will actually apply this knowledge in the real world.

There are a series of tests that you can run to check your employees’ security savviness. For example, you can send simulated phishing tests via email or even password enumeration tests to check the effectiveness of your employees’ security habits.
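One minimal, offline version of such a password check might compare candidate passwords against a banned list of common choices; the word list, length threshold, and example passwords below are illustrative assumptions, not part of any vendor's testing tool:

```python
# Sketch: flag weak employee passwords offline, in the spirit of the
# enumeration tests mentioned above. The banned list here is a tiny example;
# real checks use large breached-password corpora.
import hashlib

COMMON = {"123456", "password", "qwerty", "letmein", "iloveyou"}
# Store only hashes of the banned list, so the check itself leaks nothing.
BANNED_HASHES = {hashlib.sha256(p.encode()).hexdigest() for p in COMMON}

def is_weak(password: str) -> bool:
    """A password is weak if it is short or appears on the banned list."""
    too_short = len(password) < 12
    banned = hashlib.sha256(password.encode()).hexdigest() in BANNED_HASHES
    return too_short or banned

assert is_weak("qwerty")                          # on the banned list
assert is_weak("short1!")                         # too short
assert not is_weak("granite-Otter-kettle-mosaic42!")  # long passphrase passes
```

Running a check like this during training, rather than after a breach, turns an abstract policy into immediate, personal feedback for each employee.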

Stay up to date on all big data news and trends

The world of data security changes seemingly by the minute. Every day, there are new threats along with new technology to make systems safer and more secure.

In order to truly protect your company from cyberattacks, cyber security managers must stay sharp on any developments in this area. Make sure everyone stays informed about new data systems and technologies by keeping up with the latest industry news. Furthermore, encourage continued education or participation in cyber security seminars and meetings.

Conclusion

Thankfully, the issue of data security is not without solutions. Whether your business decides to institute stricter data governance for added security or prefers a multi-cloud infrastructure for increased safety, the only way to ensure these strategies perform effectively is to train your team properly and make sure they know the processes from A to Z.

Three virtualisation best practices for small businesses

The use of virtualisation has soared in the past few years. From a test lab technology, it has become a mainstream component of the IT industry. According to the 2016 Spiceworks State of IT report, more than 76 percent of organisations are already using virtualisation. Moreover, x86 server virtualisation grew 5.7 percent, to about $5.6 billion, in 2016 over the previous year, according to Gartner Inc.

The high-level penetration of virtualisation is due to the many benefits this technology has to offer. From reduced overhead costs and quicker back-up and recovery to better business continuity and more efficient IT operations, the benefits are many. Virtualisation can be especially beneficial for SMEs since, with fewer physical hardware units to maintain, maintenance costs go down significantly.

However, to make virtualisation successful, it must be integrated with certain other tools, especially project management tools. Organisations can’t afford to shut down servers or hardware components to implement virtualisation, even for one day, because these systems are the heart and soul of IT projects and shutting them down can have a cascading effect on business processes. Therefore, project management teams must take an active role in the virtualisation process and time the technology implementations efficiently. Project management tools such as Workzone and Scoro will help your project teams determine the status of IT projects and plan the virtualisation roll-out to ensure minimal disruption to business processes.

If you are a small business owner and want to implement virtualisation, here are three best practices to follow:

Virtualisation audit

Instead of randomly virtualising servers, desktops, and other hardware units, small business owners must first identify the business goals that virtualisation will help them achieve. A full IT infrastructure audit is necessary to decide the extent of virtualisation to be undertaken. The audit also includes understanding how virtualisation should be implemented, setting the right expectations, and calculating the ROI. It is best to start by virtualising small applications that are easy to handle if things go wrong, and then move on to mission-critical projects.

For instance, if you are planning server virtualisation for your organisation, then as a part of the IT audit you need to consider the following:

  • Workload capacity: Servers are often virtualised because they are underutilised; virtualisation allows more than one application to run on the same server. Since utilisation increases, you need to consider workload capacity to ensure efficiency and high performance. Moreover, not all applications can run on virtual servers, so you need to decide which are suitable.
  • Systems architecture: In this phase, the business owner needs to factor in current and future business needs. Businesses need to scale their IT components depending on demand and workload; in doing so, consider CPU usage, bandwidth requirements, and the related costs.
  • Security: Security needs particular attention, since each layer of virtualisation introduces potential vulnerabilities.
  • Operational processes: Virtualisation changes the way jobs are done within an organisation, so you need to decide how the virtual servers will be monitored and maintained, what training employees will need to adapt to the new environment, and so on.

A similar in-depth audit needs to be done for all the hardware that needs to be virtualised. Scalability is one of the biggest strengths of a virtual environment, so it is easy to be persuaded into creating endless virtual resources. But this may overburden the physical units and make management very complicated. So understand your needs before embarking on virtualisation.

Backup must be your top priority

Back-up is important for physical and virtual systems alike. You can back up either the entire virtual machine or the data contained in it. Backing up data is a lot easier and allows quicker recovery than backing up the entire system. Thankfully, a number of software packages make physical and virtual machine back-up easier for small businesses.

However, traditional back-up processes might not work for virtual systems. Back-up of virtual systems can instead be performed in the following ways:

  • Manual back-up. Copy the files stored in the virtual machine manually.
  • Cloning the virtual machine. Give descriptive names to the clones and save them in a destination dedicated for back-ups.
  • Including virtual machine files in the automatic Time Machine backups. This can be done by editing the General settings in the virtual machine.
  • Using third-party back-up software.

Data or machine back-ups must be done regularly to make sure important data and information remain available even if disaster strikes.
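The cloning approach above can be sketched in a few lines of Python; the VM folder layout and file names below are hypothetical stand-ins, and real hypervisors (VMware, Parallels, Hyper-V) have their own snapshot tooling that should be preferred where available:

```python
# Sketch: clone a virtual machine's files into a dedicated backup destination
# under a descriptive, timestamped name, as suggested above.
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

def backup_vm(vm_dir, backup_root) -> Path:
    """Copy the whole VM directory to backup_root/<name>-<timestamp>."""
    src = Path(vm_dir)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / f"{src.name}-{stamp}"
    shutil.copytree(src, dest)  # fails loudly if dest already exists
    return dest

# Demo on a throwaway directory standing in for a real VM folder:
vm = Path(tempfile.mkdtemp()) / "web-server"
vm.mkdir()
(vm / "disk.vmdk").write_text("fake disk image")
clone = backup_vm(vm, tempfile.mkdtemp())
```

Only clone a machine while it is shut down or snapshotted; copying the disk files of a running VM can produce an inconsistent backup.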

Know the limitations of virtualisation

Before you virtualise all your old physical systems and deploy a number of virtual servers within your IT infrastructure, you must understand the limitations of virtualisation. They are few compared to the advantages, but they should be taken into consideration before implementing virtualisation within your organisation.

The main limitations are associated with server virtualisation. Because virtualisation divides processing power, storing large, transaction-heavy databases on virtual servers can slow down all other processes, reducing efficiency. Moreover, if the physical server unit fails for some reason, all the virtual components on it become unavailable. A probable solution is to use two physical servers (with the same configuration) and centralised storage for all the virtualised components, but this is cost-intensive and might not be feasible for small businesses with tight budgets.

Finally, setting up and managing a virtual environment is not as easy as it might sound. You may also need to employ skilled staff to look after the virtual set-up. Therefore, start with small-scale virtualisation and move on to more complex set-ups once you are familiar with the new system.

Conclusion

Virtualisation is set to become an even more significant IT component in the near future, so businesses both small and big should leverage this technology. Transitioning to a virtual environment can be very beneficial for your organisation, but you need to know the virtualisation best practices to reap the full benefits.

Read more: How virtualisation is a vital stepping stone to the cloud

Five key cloud trends to look forward to in 2017: Containers, AI, and more


There’s no denying that cloud computing has changed the way both large enterprises and small businesses operate.

The latest market analysis from Cisco states that global cloud IP traffic will almost quadruple (3.7-fold) over the next 5 years. It will grow at a CAGR (Compound Annual Growth Rate) of 30% from 2015 to 2020.
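As a quick sanity check, compounding 30% annual growth over the five years from 2015 to 2020 does indeed reproduce Cisco's roughly 3.7-fold figure:

```python
# Verify that a 30% CAGR over five years matches the "almost quadruple
# (3.7-fold)" growth cited in Cisco's forecast.
growth = (1 + 0.30) ** 5
print(round(growth, 2))  # 3.71
```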

With the increased acceptance of cloud computing, we can look forward to better and greater tapping of cloud resources this year. A few trends to watch out for in 2017 are:

Migration will continue to accelerate

For most business enterprises, the only question that remains is when to move their services to the cloud, as cloud providers are constantly updating their offerings. It is not only small businesses that are shifting their services to the cloud; large commercial organisations are doing so as well.

A new report from McKinsey’s Silicon Valley group has found that cloud migration will gain maximum acceleration among large enterprises that have, until now, been reluctant to accept the changing technology. While nearly 77% of companies relied on traditional IT infrastructure in 2015, this number is likely to drop to 43% by 2018 as large businesses migrate to cloud-based infrastructure, the report concludes.

Cloud security will take precedence

Though an increasing number of commercial enterprises are moving their services to the cloud, a few companies still have reservations in using this technology. Security remains the biggest concern for them. As an increasing number of organisations in manufacturing and healthcare industries are migrating to the cloud, data security will need to stay one step ahead.

Though there is no iron-clad consensus about cloud security capabilities, several IT experts, including Scott Chasin, believe that cloud computing can offer robust security in the coming years. “There’s a growing call for security to be treated as a fundamentally basic utility where safety can be assumed. The cloud is the key to enabling this, with benefits like storage options, scalability, and ease of deployment,” says Chasin.

The continued rise of containers

The cloud computing industry has witnessed a surge in the use of containers, as they make operations more portable and efficient. Flexibility and lower costs are the driving forces behind this trend. Containers are a good means of developing and deploying microservices, especially for cloud-based applications.

According to Anand Krishnan, EVP and GM of cloud at Canonical, the company behind Ubuntu, “Containers will be used for deploying solutions to solve real-world business problems. Companies will use them to provide new services that are secure, efficient, elastic, and scalable.”

Machine learning and artificial intelligence will gain prominence

Machine learning and artificial intelligence could be the next big waves in the cloud computing industry. The four biggest players in this industry (Google, Microsoft, Amazon Web Services, and IBM) have already introduced machine learning and artificial intelligence-based cloud services.

Microsoft offers over 20 cognitive services, while IBM launched its first cloud-based platform for AI-powered decision-making in September last year. The platform, called Project DataWorks, automates the intelligent deployment of data products on the IBM Cloud using machine learning and Apache Spark.

Google too unveiled a set of cloud computing services last year. Though the use of machine learning and artificial intelligence in the industry is still in its infancy, it is going to shape the future of cloud computing platforms.

Hybrid cloud continues to proliferate

Though small and medium-sized businesses can shift their services to the cloud, it may not be a financially viable solution for large enterprises, at least not immediately. Since they have usually made significant investments in on-premises IT infrastructure, these companies will move only part of their operations to the cloud.

This is not the only factor that will drive the development of hybrid clouds. It’s about having the best of both. A hybrid approach will enable businesses to take advantage of the scalability offered by cloud computing, without exposing critical data to third-party vulnerabilities.

Hybrid cloud development, however, faces several challenges including integration and application incompatibilities. The lack of management tools and common APIs is also a top concern for public cloud providers. But according to Ed Anderson, research vice president at Gartner, these top concerns also highlight some of the top opportunities for cloud providers. “We know that both public and private cloud services (of various types) will become more widely used. Therefore, providers must focus on the top hybrid cloud challenges to be successful in meeting the growing demand for hybrid cloud solutions,” says Anderson.

Conclusion

Just a few years ago, enterprises moved to the cloud to gain an advantage over their competitors; now, using cloud computing services has become a necessity. And why not? The sheer flexibility, scalability, and cost-friendliness of cloud computing have started to attract customers from a wide range of industries, including aviation, healthcare, and automotive. With the largest commercial organisations increasingly tapping into cloud services, cloud adoption is expected to increase even further in 2017.
