Category Archives: High Tech Security

Cyberoam Provides Critical Insight for Virtual Datacenter Administrators

Guest Post by Natalie Lehrer, a senior contributor for CloudWedge.

Organizations must provide reliable technical resources in order to keep a business running efficiently. Network security is one of the chief concerns of all companies, regardless of size. Although corporations are often pressed to earn profits, protecting all company-related data, at any cost, should be a top priority.

Virtual datacenters can be susceptible to a variety of threats, including hyperjacking, DoS attacks and more. Keeping up to date on the latest server patches and security bulletins, and staying aware of the latest malware threats, is more important than ever. Therefore, it is critical that all incoming network traffic is properly scanned for viruses and malicious code that could corrupt the virtual datacenter or cause it to malfunction.

What is the Solution?

Network appliances such as Cyberoam can act as a unified threat management suite. In addition, Cyberoam scans all incoming and outgoing traffic while producing detailed reports for system administrators. These granular reports list all virtual datacenter activity while providing logs that give forensic computer scientists direction on where to focus their investigations. Since any activities performed on virtual servers can be retained using Cyberoam, the audit process can provide a clear trail that leads you to the culprit in the event of a data breach. Cyberoam is not merely a reactive solution: it proactively scans all incoming and outgoing data in case viruses and other harmful programs try to compromise and corrupt your virtual datacenter.

Security intricacies include intrusion prevention services, specialized auditing applications and robust firewall features. Firewalls play an important role in keeping harmful material from compromising virtual servers. Firewalls essentially block intruders while simultaneously allowing legitimate TCP or UDP packets to enter your system. Cyberoam gives administrators the ability to easily construct firewall rules that keep internal data safe and secure.
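
To make the allow/deny idea concrete, here is a minimal, hypothetical sketch of how a stateless rule set decides whether a TCP or UDP packet is admitted. The `Rule` class, the sample ports and the default-deny fallback are illustrative assumptions only; this is not Cyberoam's actual rule syntax or engine.

```python
# Illustrative only: a toy model of stateless firewall rule matching,
# not Cyberoam's actual rule engine or syntax.
from dataclasses import dataclass

@dataclass
class Rule:
    action: str        # "allow" or "deny"
    protocol: str      # "tcp" or "udp"
    dest_port: int     # destination port the rule applies to

# Hypothetical rule set: block Telnet, allow HTTPS and DNS.
RULES = [
    Rule("deny",  "tcp", 23),
    Rule("allow", "tcp", 443),
    Rule("allow", "udp", 53),
]

def packet_allowed(protocol: str, dest_port: int) -> bool:
    """Return True if the first matching rule allows the packet.

    Falls through to a default-deny posture when no rule matches.
    """
    for rule in RULES:
        if rule.protocol == protocol and rule.dest_port == dest_port:
            return rule.action == "allow"
    return False  # default deny

print(packet_allowed("tcp", 443))  # True
print(packet_allowed("tcp", 23))   # False
```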

When you set up your virtual datacenter, it is important to utilize all of the features at your disposal. Sometimes the most obscure features are the most valuable. The best way to keep your virtual datacenter safe is to stay on top of the latest knowledge. There have been reports that many IT professionals who find themselves intimidated by new technology simply have not taken the initiative to learn about the latest datacenter hardware and software available to them today. If you are trying to stay one step ahead of the game, your best bet is to learn all about the tools on the market and make your decision accordingly. Be sure to scrutinize any appliance you decide to utilize inside of your datacenter before adding it to your arsenal of IT weaponry.

Natalie Lehrer is a senior contributor for CloudWedge.

In her spare time, Natalie enjoys exploring all things cloud and is a music enthusiast.

Follow Natalie’s daily posts on Twitter: @Cloudwedge, or on Facebook.

SmartRules® DLP Thwarts Email Distribution of Confidential Info

New Zealand-owned cloud email security and hosting company SMX has released SmartRules DLP, designed to safeguard confidential information against unauthorized email distribution.

SmartRules DLP (Data Loss Prevention) is one of a number of new service improvements currently being rolled out by SMX, following research and development support from Callaghan Innovation.

SMX’s co-founder and chief technology officer, Thom Hooker, says the R&D funding has enabled SMX to accelerate software development in several key areas. He says SmartRules® DLP has been given urgent priority, following the recent security breaches experienced by Government organizations.

“SMX is the leading cloud email security solution used by Government organizations with around 60 Government sector customers,” Thom Hooker says. “SmartRules® DLP meets the most stringent compliance requirements with easy-to-use rule building and related compliance processes.

“Email makes it very easy for employees to accidentally – or intentionally – send sensitive documents to recipients outside the organization,” Hooker says. “By deploying SMX’s SmartRules® DLP, customers can define rules to block and report on employees attempting to send sensitive documents externally. SmartRules® DLP can be configured to detect visible data as well as scanning for hidden metadata. The use of hidden metadata tags inside documents makes it harder for users to subvert DLP rules looking for visible text – that is, by changing the document name.”

Hooker says SMX’s SmartRules® DLP can also detect sensitive content embedded in archives – such as .zip, .rar, .tar, .gz, and so on – and can be configured to block emails containing archives that cannot be opened – for example, password protected or unknown document types.
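
As a rough illustration of the kinds of checks described above, the sketch below flags content by visible text patterns or hidden metadata tags, and blocks archives whose members cannot be inspected. The pattern list, the metadata tag and the helper names are hypothetical; this is not SMX's SmartRules DLP implementation or API.

```python
# Illustrative only: a simplified take on the checks a DLP rule performs,
# not SMX's SmartRules DLP implementation or API.
import re
import zipfile

SENSITIVE_PATTERNS = [
    re.compile(r"\bCONFIDENTIAL\b", re.IGNORECASE),  # visible text marker
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),            # e.g. SSN-like numbers
]
METADATA_TAGS = {"classification:restricted"}         # hypothetical hidden tag

def text_is_sensitive(text: str, metadata: set[str]) -> bool:
    """Flag a message body or attachment by visible content or hidden metadata."""
    if any(p.search(text) for p in SENSITIVE_PATTERNS):
        return True
    return bool(metadata & METADATA_TAGS)

def archive_is_blockable(path: str) -> bool:
    """Block archives that cannot be inspected (e.g. encrypted members)."""
    try:
        with zipfile.ZipFile(path) as zf:
            # Bit 0 of flag_bits is set for encrypted (password-protected) members.
            return any(info.flag_bits & 0x1 for info in zf.infolist())
    except (zipfile.BadZipFile, OSError):
        return True  # unknown or unreadable archive: block by policy
```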

Another significant new enhancement to the SMX Cloud Email Security Suite, Hooker says, will be beefing up the SMX email hosting platform with enterprise-grade security, reliability and new features. SMX will offer 100 percent availability, as well as enterprise-ready tools such as shared calendars, online data storage similar to Dropbox, global address books and support for ActiveSync to sync contacts, emails and calendars with mobile devices.

Let’s Hope Not: Least Favorite 2013 Prediction is “Hacking-as-a-Service”

Among all the pundit predictions for the coming year in cloud computing, the one that caught my eye was this one by BusinessInsider's Julie Bort in an article entitled "5 Totally Odd Tech Predictions That Will Probably Come True Next Year":

1. Bad guys start offering “hacking as a service”

Security company McAfee says that criminal hackers have begun to create invitation-only forums requiring registration fees. Next up, these forums could become some sort of black-market software-as-a-service. Pay a monthly fee and your malware is automatically updated to the latest attack. Don’t pay, and it would be a shame if something happened to your beautiful website …

HaaS? Let’s hope not.

Four Things You Need to Know About PCI Compliance in the Cloud

By Andrew Hay, Chief Evangelist, CloudPassage

Andrew Hay is the Chief Evangelist at CloudPassage, Inc., where he is the lead advocate for its SaaS server security product portfolio. Prior to joining CloudPassage, Andrew was a Senior Security Analyst for 451 Research, where he provided technology vendors, private equity firms, venture capitalists and end users with strategic advisory services.

Anyone who’s done it will tell you that implementing controls that will pass a PCI audit is challenging enough in a traditional data center where everything is under your complete control. Cloud-based application and server hosting makes this even more complex. Cloud teams often hit a wall when it’s time to select and deploy PCI security controls for cloud server environments. Quite simply, the approaches we’ve come to rely on just don’t work in highly dynamic, less-controlled cloud environments. Things were much easier when all computing resources were behind the firewall with layers of network-deployed security controls between critical internal resources and the bad guys on the outside.

Addressing PCI DSS in cloud environments isn't an insurmountable challenge. Luckily, there are ways to address some of the key challenges of operating a PCI DSS in-scope server in a cloud environment. The first step towards embracing cloud computing, however, is admitting (or in some cases learning) that your existing tools might not be capable of getting the job done.

Traditional security strategies were created at a time when cloud infrastructures did not exist and the only use of public, multi-tenant infrastructure was data communications via the Internet. Multi-tenant (and even some single-tenant) cloud hosting environments introduce many nuances, such as dynamic IP addressing of servers, cloud bursting, rapid deployment and equally rapid server decommissioning, that the vast majority of security tools cannot handle.

First Takeaway: The tools that you have relied upon for addressing PCI-related concerns might not be built to handle the nuances of cloud environments.

The technical nature of cloud-hosting environments makes them more difficult to secure. A technique sometimes called “cloud-bursting” can be used to increase available compute power extremely rapidly by cloning virtual servers, typically within seconds to minutes. That’s certainly not enough time for manual security configuration or review.

Second Takeaway: Ensure that your chosen tools can be built into your cloud instance images to ensure security is part of the provisioning process.
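
One way to act on this takeaway, assuming an AWS-style environment, is to pass a bootstrap script at launch so every burst or cloned instance installs and registers its security agent on first boot. The AMI ID, region and agent install URL below are placeholders, not real values.

```python
# Illustrative only: baking a security agent install into instance provisioning
# so every cloned or burst server comes up with the same controls.
# The AMI ID, region and install URL are placeholders.
import boto3

USER_DATA = """#!/bin/bash
# Hypothetical bootstrap: install and register a host security agent at boot.
curl -s https://example.com/install-agent.sh | bash
"""

ec2 = boto3.client("ec2", region_name="us-east-1")
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder hardened image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    UserData=USER_DATA,                 # runs on first boot of every clone
)
```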

While highly beneficial, high-speed scalability also means high-speed growth of vulnerabilities and attackable surface area. Using poorly secured images for cloud-bursting or failing to automate security in the stack means a growing threat of server compromise and nasty compliance problems during audits.

Third Takeaway: Vulnerabilities should be addressed prior to bursting or cloning your cloud servers and changes should be closely monitored to limit the expansion of your attackable surface area.
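
A lightweight way to approach the change-monitoring part of this takeaway is to record a file-integrity baseline on the golden image and re-check it on running clones. The watched paths, helper names and storage of the baseline are illustrative assumptions, not a specific product's behavior.

```python
# Illustrative only: a minimal file-integrity baseline taken on a "golden"
# image before cloning, then compared against on running clones.
import hashlib
import json
from pathlib import Path

WATCHED = ["/etc/ssh/sshd_config", "/etc/passwd"]  # example paths to watch

def baseline(paths: list[str]) -> dict[str, str]:
    """Map each watched file to a SHA-256 digest of its contents."""
    return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def drifted(paths: list[str], saved: dict[str, str]) -> list[str]:
    """Return the files whose digests no longer match the saved baseline."""
    current = baseline(paths)
    return [p for p in paths if current.get(p) != saved.get(p)]

# Typical flow: snapshot the golden image, persist the result,
# then re-check on each clone and alert on any drift.
snapshot = baseline(WATCHED)
print(json.dumps(snapshot, indent=2))
```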

Traditional firewall technologies present another challenge in cloud environments. Network address assignment is far more dynamic in clouds, especially in public clouds. There is rarely a guarantee that your server will spin up with the same IP address every time. Current host-based firewalls can usually handle changes of this nature, but what about firewall policies defined with specific source and destination IP addresses? How will you accurately keep track of cloud server assets or administer network access controls when IP addresses can change to an arbitrary address within a massive IP address space?

Fourth Takeaway: Ensure that your chosen tools can handle the dynamic nature of cloud environments without disrupting operations or administrative access.
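
One pragmatic pattern for this takeaway, again assuming an AWS-style API, is to resolve access-control targets from instance tags at runtime rather than hard-coding IP addresses that may change between launches. The tag key, role value and region below are placeholders.

```python
# Illustrative only: resolving firewall/access-control targets from instance
# tags at runtime instead of hard-coding IP addresses that may change.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def current_ips_for_role(role: str) -> list[str]:
    """Look up the private IPs of all running instances tagged with a role."""
    resp = ec2.describe_instances(
        Filters=[
            {"Name": "tag:role", "Values": [role]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    ips = []
    for reservation in resp["Reservations"]:
        for instance in reservation["Instances"]:
            ip = instance.get("PrivateIpAddress")
            if ip:
                ips.append(ip)
    return ips

# A policy engine would regenerate its allow list from these IPs on a schedule,
# rather than relying on addresses captured at deploy time.
print(current_ips_for_role("pci-app"))
```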

The auditing and assessment of deployed servers is an addressable challenge presented by cloud architectures. Deploying tools purpose-built for dynamic public, private and hybrid cloud environments will also ensure that your security scales alongside your cloud server deployments. Also, if you think of cloud servers as semi-static entities deployed on a dynamic architecture, you will be better prepared to help educate internal stakeholders, partners and assessors on the aforementioned cloud nuances – and how your organization has implemented safeguards to ensure adherence to PCI-DSS.

Woz is Worried About “Everything Going to the Cloud” — the Real Issue is Giving Up Control

Guest Post By Nati Shalom, CTO and Founder of GigaSpaces

In a recent article, Steve Wozniak, who co-founded Apple with the late Steve Jobs, predicted “horrible problems” in the coming years as cloud-based computing takes hold. 

“I really worry about everything going to the cloud,” he said. “I think it’s going to be horrendous. I think there are going to be a lot of horrible problems in the next five years. … With the cloud, you don’t own anything. You already signed it away.”

When I first read the title, I thought Wozniak sounded like Larry Ellison two years ago, when he pitched that the cloud was hype, before making a 180-degree turn to acknowledge that Oracle wished to be a cloud vendor too.

Reading it more carefully, I realized the framing of the topic is just misleading. Wozniak actually touches on something that I hear more often as the cloud hype cycle moves from the Peak of Inflated Expectations into the Trough of Disillusionment.

Wozniak echoes an important lesson that, IMO, is a major part of the reason many of the companies that moved to the cloud have experienced so many outages during the past months. I addressed several of these aspects in a recent blog post: Lessons from the Heroku/Amazon Outage.

When we move our operations to the cloud, we often assume that we're outsourcing our data center operations completely, including our disaster recovery procedures. The truth is that when we move to the cloud, we're only outsourcing the infrastructure, not our operations, and the responsibility for how to use this infrastructure remains ours.

Choosing better tradeoffs between productivity and control

For companies today, the main reason we chose to move to the cloud in the first place was to gain better agility and productivity. But in starting this cloud journey, we found that we had to give up some measure of control to achieve that agility and productivity.

The good news is that as the industry matures, there are more choices that provide better tradeoffs between productivity and control:

  • Open source clouds such as OpenStack and CloudStack
  • Private cloud offerings
  • DevOps and automation tools such as Chef and Puppet
  • Open source PaaS such as Cloudify, OpenShift and CloudFoundry
  • DevOps and PaaS combined, such as Cloudify

As businesses look at cloud strategy today, there isn't a need to give up control to gain productivity. With technologies like Cloudify, businesses can get the best of both worlds.
