
Why it’s time to adopt new strategies for beating ransomware


The sad fact of ransomware is that no one is immune: attacks are hitting hospitals, schools, government and law enforcement agencies, and businesses of all sizes. The increased frequency – and scale – of attacks has organisations thinking differently about their approach to ransomware. According to the FBI, ransomware attacks have increased 35-fold in 2016, resulting in an estimated $209 million paid out every quarter.

In addition, there has recently been a string of very public web services hacking events that have raised questions about the security of storing data in the public cloud. More worryingly, we only know about the instances of such hacks that have been publicly reported.

In 2012, Dropbox was compromised by an internal phishing attack targeted at a Dropbox administrator. The breach took four years to come to light, when the entire dataset – with hashed and salted passwords – appeared for sale on the dark web in 2016. The company then forced a password reset for users who had not changed their passwords since mid-2012, saying afterwards that the move had protected all affected users.

In 2014, Yahoo! was breached by state-sponsored hackers who gained access to 500 million user credentials. In this case, two years passed before the breach became public knowledge, and only then because the credentials were offered for sale in 2016.

These events underscore the value of target-rich environments that attract the efforts of the world’s cyber-criminal and state-sponsored espionage community. User credentials sell for fractions of a penny, so commercial hackers must focus their energies on the world’s largest websites and cloud storage repositories in order to be successful. What’s worse, the increasing frequency of these hacks is shifting the conversation around SaaS security from if to when.

The problem has reached pandemic proportions, but of further concern are the delays between the initial breach and public notification. These delays raise the question: how long will it take to find out about the hacks that are happening right now?

What we do know is that all of the major cloud storage SaaS companies share some aspect of data management and security management with their customers. Not one of them can claim to give its customers exclusive ownership of their data, their metadata, their encryption keys and their access credentials. For a certain class of security-conscious enterprises, this is fundamentally unacceptable. Gartner agrees; in its 2016 IT Market Clock for the Digital Workplace it advised: “Organisations with strong requirements for data protection, or those with strict regulations about data location and residency or complex data manipulation requirements, should focus on private cloud or on-premises EFSS deployments.”

How to safeguard your organisation

There are several countermeasures organisations can implement to fight back against crypto-malware:

Step one: Secure the perimeter to minimise the chance of a breach. Patch your operating systems and keep them up to date; this is imperative. Then educate employees about the threat of ransomware and the role they can play in protecting the organisation’s data, disable macro scripts in office files transmitted over email, and limit access to critical and rapidly-changing datasets to need-to-know users only.

Step two: Back up all files and systems to avoid paying a ransom to recover from crypto events. Back up your endpoints and your file servers, and implement lightweight, optimised data protection tools that minimise recovery points.

Using very granular file sync and backup procedures, affected organisations with innovative safeguards in place have reduced their recovery points to as little as five minutes, versus 24 hours or more with alternative measures. With the right data protection tools, organisations can avoid paying hundreds of thousands of dollars in ransom and minimise the period of business outage, while protecting their corporate reputations.

For the last 20 years, the market has been conditioned to daily backups. Whether we’re talking about server or endpoint backup, file storage systems have been built around relatively lax backup intervals: backups have been expensive, requiring lots of CPU, lots of storage and too much time, and until recently organisations haven’t had to deal with an explosion of file-locking malware attacks.

The use of legacy backup software becomes a major issue for organisations where knowledge workers are continuously storing data on PCs and file shares. Consider an organisation of 1,000 knowledge workers whose file shares are also accessed by power users and IT teams: all of those file shares are vulnerable. Daily backup using legacy tools leaves up to 24 hours of work unprotected, which across the whole workforce equates to roughly 2.73 years of cumulative lost productivity.
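A quick back-of-the-envelope calculation shows where a figure of that magnitude comes from, assuming it counts the unprotected person-hours across all 1,000 users against the hours in a calendar year (the article does not spell out the exact basis). The sketch below also shows how quickly the exposure shrinks as the backup interval tightens:

```python
# Back-of-the-envelope: work at risk between backups.
# Assumption (not stated in the article): 1,000 users x 24 unprotected hours,
# counted as person-hours and expressed against the hours in a calendar year.

USERS = 1_000
HOURS_PER_YEAR = 24 * 366  # ~8,784 hours

def years_at_risk(backup_interval_hours: float, users: int = USERS) -> float:
    """Worst-case person-hours of unprotected work, expressed in years."""
    return users * backup_interval_hours / HOURS_PER_YEAR

print(f"Daily backup:    {years_at_risk(24):.2f} years of work at risk")      # ~2.73
print(f"4-hour backup:   {years_at_risk(4):.2f} years of work at risk")       # ~0.46
print(f"5-minute backup: {years_at_risk(5 / 60):.3f} years of work at risk")  # ~0.009
```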

That demonstrates how legacy backup tools can have real costs for organisations that are routinely faced with crypto-ransomware. Modern backup solutions, including CTERA’s, enable organisations to achieve a finer degree of backup interval granularity through the use of global, source-based deduplication, incremental-forever versioning and the ability to track file changes without running full system scans. That said, default settings for even the most efficient tools are anywhere from four to eight hours, which is nearly a full business day, so essentially the same problem persists.
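To make those source-side techniques concrete, here is a minimal, hypothetical Python sketch of the idea: keep a manifest of what has already been protected, re-read only files whose size or modification time has changed, and skip content whose hash has been seen before (source-based deduplication). It is an illustration of the concept only, not CTERA’s implementation; production tools typically rely on filesystem change journals rather than the directory walk used here.

```python
"""Illustrative sketch of incremental-forever backup with source-side dedup.
Not any vendor's actual implementation; a simplified model of the concept."""
import hashlib
import json
from pathlib import Path

MANIFEST = Path("backup_manifest.json")  # records what has already been protected

def load_manifest() -> dict:
    return json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}

def backup_increment(root: str) -> None:
    manifest = load_manifest()
    seen_hashes = {entry["sha256"] for entry in manifest.values()}

    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        key = str(path)
        stat = path.stat()
        size, mtime = stat.st_size, int(stat.st_mtime)

        prior = manifest.get(key)
        if prior and prior["size"] == size and prior["mtime"] == mtime:
            continue  # unchanged since the last run: nothing to read or send

        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in seen_hashes:
            print(f"dedup  {key} (identical content already stored)")
        else:
            # A real tool would ship this data to versioned, immutable storage.
            print(f"upload {key} ({size} bytes)")
            seen_hashes.add(digest)

        manifest[key] = {"size": size, "mtime": mtime, "sha256": digest}

    MANIFEST.write_text(json.dumps(manifest, indent=2))

if __name__ == "__main__":
    backup_increment(".")  # run as often as your recovery-point objective demands
```

Run every few minutes, a scheme like this keeps the recovery point small because each pass touches only what has changed since the previous one.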

The only way we can put an end to this ransomware pandemic is by building the right safeguards: safeguards that eliminate enterprise vulnerability and end the need to pay cyber-criminals to access our data and our systems. Whether you choose CTERA tools or any number of other approaches to safeguarding your organisation, take steps now so you’re prepared, because it is now a case of when, not if, an attack will happen.

Organisations start to address the subtleties of securing ‘as a service’ propositions


According to a recent Verizon report, 94% of companies expect more than a quarter of their workloads to be in the cloud within two years. The enterprise is moving to a cloud model of IT service consumption and delivery for a variety of reasons: IT organisations can become more responsive to business requirements by scaling up on demand; operations are refocused on the company’s core competence, which is typically not building data centres; and long-term costs can be more than 65% lower than with traditional IT models.

As we celebrate the tenth anniversary of cloud titan Amazon Web Services, it’s important to reflect on what the market has learned about security in the first decade of cloud computing and what enterprises must consider while beginning their journey to the cloud.

When considering a move to the cloud, it’s vital to remember that adopting shared software as a service (SaaS) offerings involves a fundamentally different security model from infrastructure as a service (IaaS). While people tend to lump these two classes of services together as public cloud, data ownership, encryption management and service access differ wildly between them.

SaaS services are typically easier to consume than IaaS services because you don’t have to manage the application layer, but they are also less secure than the more DIY-oriented approach you take with IaaS deployments. These architectural differences are fundamental to the way third parties may try to access your data, and some very famous cases show that cloud security is not one-size-fits-all. Several areas need to be addressed.

Encryption

The now-famous FBI vs. Apple case, which provoked a global debate around the challenges of end-to-end security – both good and bad – is a useful lesson in source-based encryption. The case highlighted how user-generated security keys can create significant barriers to data access, while simultaneously launching one of the most public government-sponsored hacking campaigns of all time.

Data ownership

In the above case, Apple fully complied with the government’s request for any of the San Bernardino shooter’s data that was managed by its iCloud SaaS service – and it could do so because it also owned the customer’s data. Similar cases have surfaced around the world where national interests are not as aligned, such as Microsoft vs. the US Government. What is becoming increasingly clear is that unless you are the service and data owner, a third-party SaaS provider can be leaned upon in ways you may not have prepared for and are less able to protect yourself from.

Threat radius

Aggregating the information of, or about, many organisations within one system has always been a critical shortcoming of public and private cloud IT systems. There is a direct correlation between the number of users of a public cloud SaaS service and its appeal to hackers: the largest services can often deliver the biggest data payload. Over the last two years there have been countless examples of high-value hacking events, from the US Office of Personnel Management to Target, Sony and more.

In reaction to this surge in security breaches, security managers and CISOs are advising organisations to segregate their data sets, minimise the volume of any single high-value target asset, and create secure, encrypted tenants around users at the greatest level of granularity possible. Businesses such as private equity firm The Carlyle Group have been working with CTERA to adopt best practices and ensure security is approached correctly; the firm has encrypted each of its offices with a unique key and explains its approach in a video.

Here are some best practices as you consider moving your data to the cloud.

Own your keys, and generate your keys: third-party encryption key generation is an additional level of vulnerability that isn’t appropriate for today’s security-conscious organisations (see the sketch after this list).

Make your cloud private: virtual private clouds (VPCs) have achieved the level of security that private data centres have always been able to deliver. For this reason, organisations can now go all-in on their cloud agenda without compromising on application or data security.

Compartmentalise your users, divisions and data to minimise the threat radius of any security compromise: scalable systems that create unique encryption keys and data isolation for each cloud tenant can ensure that, in the event of a breach, the damage is contained to that user, application or segmented data set.
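To illustrate the first and third points – keys generated and held by the customer, and a distinct key per tenant – here is a minimal, hypothetical Python sketch using the third-party cryptography package. It is a sketch of the principle only, not a description of any particular product, and a real deployment would add key escrow, rotation and hardware-backed key storage.

```python
# Minimal sketch: customer-generated, per-tenant keys with client-side encryption.
# Requires the third-party "cryptography" package (pip install cryptography).
# Illustrative only; names and structure are hypothetical.
from cryptography.fernet import Fernet


class TenantVault:
    """Each tenant (office, division or user group) gets its own key.

    Compromise of one tenant's key or ciphertext exposes only that tenant's
    data, which is the threat-radius containment described above.
    """

    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}

    def add_tenant(self, tenant_id: str) -> None:
        # The key is generated on the customer side and never handed to the provider.
        self._keys[tenant_id] = Fernet.generate_key()

    def encrypt(self, tenant_id: str, plaintext: bytes) -> bytes:
        return Fernet(self._keys[tenant_id]).encrypt(plaintext)

    def decrypt(self, tenant_id: str, ciphertext: bytes) -> bytes:
        return Fernet(self._keys[tenant_id]).decrypt(ciphertext)


vault = TenantVault()
vault.add_tenant("london-office")
vault.add_tenant("new-york-office")

# Only ciphertext ever leaves the organisation; the provider never sees a key.
blob = vault.encrypt("london-office", b"quarterly forecast")
print(vault.decrypt("london-office", blob))  # b'quarterly forecast'
```

Because the keys never leave the organisation, a breach or legal demand at the provider yields only ciphertext; the trade-off is that losing a key means losing the data, so internal key management becomes critical.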

By taking these best practices on board, you can avoid the issues that have been uncovered in the high-profile cases and address the cloud security challenge.