Category Archives: Standards

ISO 27018 and protecting personal information in the cloud: a first year scorecard

ISO 27018 has been around for a year – but is it effective?

A year after it was published, ISO 27018 – the first international standard focusing on the protection of personal data in the public cloud – continues, unobtrusively and out of the spotlight, to move centre stage as the battle for cloud pre-eminence heats up.

At the highest level, this is a competitive field for those with the longest investment horizons and the deepest pockets – think million-square-foot data centres with 100,000+ servers using enough energy to power a city. According to research firm Synergy, the cloud infrastructure services market – Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and private and hybrid cloud – was worth $16bn in 2014, up 50 per cent on 2013, and is predicted to grow 30 per cent to over $21bn in 2015. Synergy estimated that the four largest players accounted for 50 per cent of this market, with Amazon at 28 per cent, Microsoft at 11 per cent, IBM at 7 per cent and Google at 5 per cent. Of these, Microsoft’s 2014 revenues almost doubled over 2013, whilst Amazon’s and IBM’s were each up by around half.

Significantly, the proportion of computing sourced from the cloud compared to on-premise is set to rise steeply: enterprise applications in the cloud accounted for one fifth of the total in 2014 and this is predicted to increase to one third by 2018.

This growth represents a huge year-on-year increase in the amount of personal data (PII, or personally identifiable information) going into the cloud and in the number of cloud customers contracting for the various and growing types of cloud services on offer. But as the cloud continues to grow at these startling rates, the biggest inhibitor to cloud services growth – trust in the security of personal data in the cloud – continues to hog the headlines.

Under data protection law, the Cloud Service Customer (CSC) retains responsibility for ensuring that its PII processing complies with the applicable rules.  In the language of the EU Data Protection Directive, the CSC is the data controller.  In the language of ISO 27018, the CSC is either a PII principal (processing her own data) or a PII controller (processing other PII principals’ data).

Where a CSC contracts with a Cloud Service Provider (CSP), Article 17 of the EU Data Protection Directive sets out how the relationship is to be governed. The CSC must have a written agreement with the CSP; must select a CSP providing ‘sufficient guarantees’ over the technical security measures and organizational measures governing PII in the cloud service concerned; must ensure compliance with those measures; and must ensure that the CSP acts only on the CSC’s instructions.

As the pace of migration to the cloud quickens, the world of data protection law remains fragmented – 100 countries have their own laws – and continues to move at a pace driven by the need to mediate between competing interests rather than by the pace of market developments.

In this world of burgeoning cloud uptake, ISO 27018 is proving effective at bridging the gap between the dizzying pace of cloud market development and the slow and uncertain rate of legislative change by providing CSCs with a workable degree of assurance in meeting their data protection law responsibilities. Almost a year on from publication of the standard, Microsoft has become the first major CSP (in February 2015) to achieve ISO 27018 certification for its Microsoft Azure (IaaS/PaaS), Office 365 (PaaS/SaaS) and Dynamics CRM Online (SaaS) services (verified by BSI, the British Standards Institution) and its Microsoft Intune SaaS services (verified by Bureau Veritas).

In the context of privacy and cloud services, ISO 27018 builds on other information security standards within the ISO 27000 family. This layered, interlocking approach is proving supple enough in practice to deal with the increasingly wide array of cloud services. For example, it is not tied to any particular kind of cloud service and, as Microsoft’s certifications show, applies to IaaS (Azure), PaaS (Azure and Office 365) and SaaS (Office 365 and Intune). If, as shown in the graphic below, you consider computing services as a stack of layered elements ranging from networking (at the bottom of the stack) up through equipment and software to data (at the top), and that each of these elements can be carried out on premise or from the cloud (from left to right), then ISO 27018 is flexible enough to cater for all situations across the continuum.

Software as a Licence to Software as a Service: the cloud continuum

Indeed, the standard specifically states at Paragraph 5.1.1:

“Contractual agreements should clearly allocate responsibilities between the public cloud PII processor [i.e. the CSP], its sub-contractors and the cloud service customer, taking into account the type of cloud service in question (e.g. a service of an IaaS, PaaS or SaaS category of the cloud computing reference architecture).  For example, the allocation of responsibility for application layer controls may differ depending on whether the public cloud PII processor is providing a SaaS service or rather is providing a PaaS or IaaS service upon which the cloud service customer can build or layer its own applications.”

Equally, CSPs will generally not know whether their CSCs are sending PII to the cloud and, even if they do, they are unlikely to know whether or not particular data is PII. Here, another strength of ISO 27018 is that it applies regardless of whether particular data is, or is not, PII: certification simply assures the CSC that the service the CSP is providing is suitable for processing PII in relation to the performance by the CSP of its PII legal obligations.

Perhaps the biggest practical boon to the CSC, however, is the contractual certainty that ISO 27018 certification provides. As more work migrates to the cloud, particularly in the enterprise space, the IT procurement functions of large customers will be following structured processes in order to meet the requirements of their business and, in certain cases, their regulators. In their requests for information, proposals and quotations from prospective CSPs, CSCs now have a range of interlocking standards, including ISO 27018, to choose from in their statements of requirements for a particular cloud procurement. As well as short-circuiting the need for CSCs to spend time writing up detailed specifications of their own requirements, verified compliance with these standards for the first time provides meaningful assurance and protection from risk around most aspects of cloud service provision. Organisations running competitive tenders can benchmark bidding CSPs against each other on their responses to these requirements, and then include the obligations to meet those standards as binding commitments in the contract when it is let.

In the cloud contract lifecycle, the flexibility provided by ISO 27018 certification, alongside the contract and the CSP’s policy statements, goes further still: it gives the CSC a framework for discussing with the CSP, on an ongoing basis, the PII protection measures taken in the cloud and their adequacy.

In its first year, it is emerging that complying, and being seen to comply, with ISO 27018 is providing genuine assurance for CSCs in managing their data protection legal obligations.  This reassurance operates across the continuum of cloud services and through the procurement and contract lifecycle, regardless of whether or not any particular data is PII.  In customarily unobtrusive style, ISO 27018 is likely to go on being a ‘win’ for the standards world, cloud providers and their customers, and data protection regulators and policy makers around the world.

 

6 Cloud Computing Standards to Watch Out For

Of the numerous platforms available, cloud computing is fast becoming the next big wave to hit industries and computing professionals around the globe, following Android applications. The cloud platform is one of the few ways in which companies can reach new levels within their industry. One of the growing trends worldwide is the rise of open-source cloud computing. Although handy and readily available, there are factors to consider before implementing it across a company. Below, we discuss the main compliance issues associated with cloud computing.

Plugging the holes in the cloud while you can

Open-source cloud has rapidly grown as a mode of communication and storage for companies around the world. Yet, precisely because these platforms are open source, certain regulatory factors come into play. Although open-source cloud computing is a viable and often attractive option compared with existing facilities, there are several factors that should be taken care of while on the cloud.


  1. How secure is your cloud? One of the primary organisations ensuring that security compliance is met is the Cloud Security Alliance (CSA), a global coalition representing businesses as well as industry and subject matter experts. Its guidance is how many companies across the world ensure they achieve best practices within their cloud.
  2. Is the cloud compliant? Before placing workloads on the cloud, make sure you have conducted the relevant risk assessments. Cloud security compliance standards, once implemented, also help deal with virtualisation issues.
  3. Does it have a license? Per-user, per-device and enterprise licensing models for the cloud all affect companies. Licensing issues are also present in open-source cloud models and need to be addressed at the outset, including proprietary licenses and other traditional licenses.
  4. Is it interoperable? Portability should be one of the reasons you stick with the cloud: transferring data from one cloud to another is part of the convenience you selected it for. This brings other important factors into view, including standards such as those laid down by the Institute of Electrical and Electronics Engineers (IEEE).
  5. How scalable is your cloud? The faster you can connect and transfer data to your server, the faster it can take on workloads and store other data. Ensure that your cloud is scalable and lets you upload heavy workloads without changing too much in the service contract.
  6. Evaluate the performance: Your SLA with the cloud provider should cover business continuity and disaster recovery, which will help you measure the performance of the cloud in those critical moments.

It’s vital to have some level of compliance in any technological advancement to enhance your business prospects. HCL Technologies is one of the technology giants that adheres to cloud computing standards, which is why it is at the forefront of delivering innovative SAP Solutions for its clients, be it on the cloud, on premise, or through a hybrid approach.

To know more about cloud computing standards and services please visit HCL Technologies.

Six Degrees Group Achieves PCI DSS Compliance

Six Degrees Group, a provider of integrated managed data services, today announces that following an official audit its datacentres and security systems are now fully compliant with the Payment Card Industry Data Security Standard (PCI DSS).

The confirmation of PCI DSS compliance complements Six Degrees Group’s ISO27001:2005 certification for information security, which emphasises the Group’s commitment to protecting and securing clients’ data.

PCI DSS is a set of comprehensive standards for ensuring the security of financial payment data that was developed by the founding payment brands of the PCI Security Standards Council including Visa Inc., American Express and MasterCard Worldwide. As a result of this certification, Six Degrees is now on the approved global Visa Merchant register.

Mike Ing, group business operations director of Six Degrees Group, stated: “These standards globally govern all organisations that store, process or transmit cardholder data. Achieving this compliance provides our customers and prospects with the reassurance that Six Degrees Group is committed to the security and confidentiality of sensitive data by meeting the physical security requirements of the PCI standard.”

Five IT Security Predictions for 2013

Guest Post by Rick Dakin, CEO and co-founder of Coalfire, an independent IT GRC auditor

Last year was a very active year in the cybersecurity world. The Secretary of Defense announced that the threat level has escalated to the point where protection of cyber assets used for critical infrastructure is vital. Banks and payment processors came under direct and targeted attack for both denial of service as well as next-generation worms.

What might 2013 have in store? Some predictions:

1. The migration to mobile computing will accelerate, and the features of mobile operating systems will come to be recognized as vulnerabilities by the IT security industry.

Look out for Windows 95 level security on iOS, Android 4 and even Windows 8 as we continue to connect to our bank and investment accounts – as well as other important personal and professional data – on smartphones and tablets.

As of today, there is no way to secure an unsecured mobile operating system (OS). Some risks can be mitigated, but many vulnerabilities remain. This lack of mobile device and mobile network security will drive protection to the data level. Expect to see a wide range of data and communication encryption solutions before you see a secure mobile OS.

The lack of security, combined with the ever-growing adoption of smartphones and tablets for increasingly sensitive data access, will result in a systemic loss for some unlucky merchant, bank or service provider in 2013. Coalfire predicts more than 1 million users will be impacted and the loss will be more than $10 million.

2. Government will lead the way in the enterprise migration to “secure” cloud computing.

No entity has more to gain by migrating to the inherent efficiencies of cloud computing than our federal government. Since many agencies are still operating in 1990s-era infrastructure, the payback for adopting shared applications in shared hosting facilities with shared services will be too compelling to delay any longer, especially with ever-increasing pressure to reduce spending.

As a result, Coalfire believes the fledgling FedRAMP program will continue to gain momentum and we will see more than 50 enterprise applications hosted in secure federal clouds by the end of 2013. Additionally, commercial cloud adoption will have to play catch-up to the new benchmark that the government is setting for cloud security and compliance. It is expected that more cloud consumers will want increased visibility into the security and compliance posture of commercially available clouds.

3. Lawyers will discover a new revenue source – suing negligent companies over data breaches.

Plaintiff attorneys will drive companies to separate the cozy compliance and security connection. It will no longer be acceptable to obtain an IT audit or assessment from the same company that is managing an organization’s security programs. The risk of being found negligent or legally liable in any area of digital security will drive the need for independent assessment.

The definition of cyber negligence will expand, and the range of monetary damages will become clearer, as class action lawsuits are filed against organizations that experience data breaches.

4. Critical Infrastructure Protection (CIP) will replace the Payment Card Industry (PCI) standard as the white-hot tip of the compliance security sword.

Banks, payment processors and other financial institutions are becoming much more mature in their ability to protect critical systems and sensitive data.  However, critical infrastructure organizations like electric utilities, water distribution and transportation remain softer targets for international terrorists.

As the front lines of terrorist activities shift to the virtual world, national security analysts are already seeing a dramatic uptick in surveillance on those systems. Expect a serious cyber attack on critical infrastructure in 2013 that will dramatically change the national debate from one of avoidance of cyber controls to one of significantly increased regulatory oversight.

5. Security technology will start to streamline compliance management.

Finally, the cost of IT compliance will start to drop for the more mature industries such as healthcare, banking, payment processing and government. Continuous monitoring and reporting systems will be deployed to more efficiently collect compliance evidence and auditors will be able to more thoroughly and effectively complete an assessment with reduced time on site and less time organizing evidence to validate controls.

Since the cost of noncompliance will increase, organizations will demand and get more routine methods to validate compliance between annual assessment reports.

Rick Dakin is CEO and co-founder of Coalfire, an independent information technology Governance, Risk and Compliance (IT GRC) firm that provides IT audit, risk assessment and compliance management solutions. Founded in 2001, Coalfire has offices in Dallas, Denver, Los Angeles, New York, San Francisco, Seattle and Washington D.C. and completes thousands of projects annually in retail, financial services, healthcare, government and utilities. Coalfire’s solutions are adapted to requirements under emerging data privacy legislation, the PCI DSS, GLBA, FFIEC, HIPAA/HITECH, HITRUST, NERC CIP, Sarbanes-Oxley, FISMA and FedRAMP.

Rackspace Launches High Performance OpenStack Cloud Block Storage

Today, Rackspace Hosting announced the unlimited availability of Cloud Block Storage, powered by OpenStack®. This solution provides a superior approach to attached storage in the Cloud by addressing customer demand for consistent and affordable performance for file systems, databases and other input/output (I/O) intensive applications. Rackspace Cloud Block Storage offers a standard volume option for everyday storage with performance that has been tested to be at least 30 percent less variable than that of alternatives [1]. The new product’s Solid State Drive (SSD) volume option has also been tested to deliver even higher performance, 5x to 6x faster than competing solutions [1]. Both options feature a transparent, flat pricing structure with no charge for I/O, and are now available for Cloud Servers powered by OpenStack.

“The Rackspace Cloud Block Storage solution is a crucial piece of our product portfolio,” said John Engates, CTO of Rackspace. “The explosion of data over the past few years has placed greater demands on our customers, presenting them with a variety of new storage related challenges. We developed Cloud Block Storage to deliver consistent performance in the cloud, with a very simple pricing model that gives customers the flexibility they require to meet their unique business needs.”

With Rackspace Cloud Block Storage, customers get:

A Full-Featured Attachable Storage Solution

  • Attach multiple volumes of up to 1 Terabyte each of block storage to
    Cloud Servers
  • Detach and re-attach storage between compute nodes in seconds
  • Choice of Standard Performance or SSD-based High-Performance storage

Enhanced Performance

  • SSD-based solution is more than 10 times faster than Standard drive
    performance [1].
  • Rackspace’s Cloud Block Storage Standard drive delivers consistent
    performance with less variability than standard drive solutions
    offered by leading competitors [1].
  • High performance can be achieved without the need to RAID0 (stripe)
    volumes together, providing significant savings in cost and complexity.
  • There is no cap on I/O and users do not have to specify IOPS numbers,
    as they do with competing solutions.

A Simple Pricing Model

  • Standard – $0.15 per gigabyte per month; SSD – $0.70 per gigabyte per
    month
  • $0.10 per gigabyte per month for snapshot data stored
  • Competitive pricing structure also features I/O at no additional
    charge, no additional per-instance fee, no minimum instance size, and
    consistent pricing in all U.S. regions
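
To illustrate how that flat pricing works out in practice, here is a small back-of-the-envelope calculation using only the per-gigabyte rates listed above (an illustrative Python sketch; it covers block storage only, and since I/O carries no charge there is nothing else to add):

    # Published per-gigabyte monthly rates from the pricing list above.
    STANDARD_PER_GB = 0.15   # $/GB-month, standard volumes
    SSD_PER_GB = 0.70        # $/GB-month, SSD volumes
    SNAPSHOT_PER_GB = 0.10   # $/GB-month, stored snapshot data

    def monthly_cost(volume_gb, snapshot_gb=0, ssd=False):
        """Estimated monthly block-storage bill; I/O is free, so nothing else is added."""
        rate = SSD_PER_GB if ssd else STANDARD_PER_GB
        return volume_gb * rate + snapshot_gb * SNAPSHOT_PER_GB

    print(monthly_cost(500, snapshot_gb=200))   # 500 GB standard + 200 GB snapshots = $95.00
    print(monthly_cost(500, ssd=True))          # 500 GB SSD volume = $350.00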

No Vendor Lock-In

  • Using the OpenStack Cinder APIs will allow customers to avoid
    proprietary implementation
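
As a rough sketch of what working against those Cinder APIs looks like from a client’s point of view, the snippet below uses the present-day openstacksdk Python library to create a volume and attach it to a server; the cloud profile name, volume size and server name are placeholders, and the exact calls available will depend on your SDK version and provider.

    import openstack

    # Credentials and region come from clouds.yaml; "mycloud" is a placeholder profile name.
    conn = openstack.connect(cloud="mycloud")

    # Create a 100 GB volume via the Cinder (block storage) API and wait for it to be usable.
    volume = conn.block_storage.create_volume(size=100, name="app-data")
    conn.block_storage.wait_for_status(volume, status="available")

    # Attach it to an existing Cloud Server; the guest OS can then format and mount it.
    server = conn.compute.find_server("app-server-1")
    conn.compute.create_volume_attachment(server, volume_id=volume.id)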

Standard volumes are aimed at customers that typically require large amounts of everyday storage. These customers can leverage the product for a broad range of applications, including those that require standard performance or those needing to scale storage without scaling compute nodes. In addition, the product provides dependable storage for archiving solutions, companies that access large quantities of large files, and small to medium size websites.

Rackspace Cloud Block Storage SSD volumes are ideally suited for customers that require even higher levels of performance than what is normally experienced with standard drives. With a faster and more reliable SSD-based storage solution, customers can be better equipped to use applications that are crucial to their business, such as self-managed MySQL databases, MongoDB, Cassandra, and Web caching and indexing, among others.

“Based on our internal benchmarks, we’ve been impressed with the ability of Rackspace Cloud Block Storage to steadily perform at a high level,” said Greg Arnette, CTO at Sonian Inc. “For our customers, the capacity to effectively archive large amounts of email data is critical to their business. As a result, we look for storage solutions that give us maximum agility, scalability and enterprise readiness. We are excited that Rackspace is now providing a new block storage alternative service for running our large scale email archiving deployments.”

Cloud Block Storage joins Cloud Databases as a key solution in Rackspace’s expanding portfolio of storage products. Rackspace Cloud Block Storage is now available in the U.S. and UK. For more information, visit http://www.rackspace.com/cloud/public/blockstorage/

[1] The data provided results from performance benchmarking tests that were commissioned by Rackspace. More information is available at: http://www.rackspace.com/blog/cloud-block-storage/.


London City Lifeline Colo Gets ISO27001 Security Certification

City Lifeline, the central London colocation data centre, has today been awarded ISO27001 Information Security Management Certification. This accreditation confirms that City Lifeline’s security systems and processes meet the highest recognised international standards for physical security and information security.

Security, both of equipment operation and data integrity, is critical for all companies and organisations. When asked, organisations using data centre and colocation services consistently rate security as their number one priority. The internationally administered and recognised ISO27001 certification gives customers confidence that a data centre operates at the highest level of security and that it consistently delivers what it claims.

Commenting on the achievement, Roger Keenan, managing director at City Lifeline said: “We are thrilled to have been awarded the prestigious ISO27001 accreditation. Achieving ISO27001 took us over a year of hard work. All of our existing processes and procedures were reviewed and overhauled where needed and comprehensively documented. City Lifeline has always been strong on security and this new certification confirms that companies and organisations can trust and rely on us to keep their equipment and data 100 per cent secure.”

ISO27001 is an internationally recognized certification that sets out specific physical and information security standards, which must be continuously maintained by those to whom it is awarded.


DMTF Releases Specification for Simplifying Cloud Infrastructure Management

The Distributed Management Task Force (DMTF), the organization bringing the IT industry together to collaborate on systems management standards development, validation, promotion and adoption, today announced the release of the new Cloud Infrastructure Management Interface (CIMI) specification. The new specification standardizes interactions between cloud environments to achieve interoperable cloud infrastructure management between service providers and their consumers and developers, enabling users to manage their cloud infrastructure use easily and without complexity.

Cloud computing allows customers to improve the efficiency, availability and flexibility of their IT systems over time. As companies have adopted cloud computing, vendors have embraced the need to provide interoperability between enterprise computing and cloud services. DMTF developed CIMI as a self-service interface for infrastructure clouds, allowing users to dynamically provision, configure and administer their cloud usage with a high-level interface that greatly simplifies cloud systems management.

“The CIMI standard is a critical piece for cloud infrastructure management because it alleviates complexity while improving flexibility, portability and security,” said Winston Bumpus, Chairman of the Board, DMTF. “With the release of the CIMI v1.0 specification, DMTF offers a well-rounded, industry-wide solution for simplifying cloud infrastructure management.”

Today’s release includes two components:

  • Cloud Infrastructure Management Interface (CIMI) Model and REST
    Interface over HTTP Specification
  • Cloud Infrastructure Management Interface (CIMI) Primer
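
To give a feel for the REST-over-HTTP style the first document describes, here is a minimal Python sketch of a client fetching a provider’s cloud entry point and following its machines collection; the URL, lack of authentication and attribute names are illustrative assumptions rather than details taken from any particular implementation.

    import requests

    # A CIMI provider publishes its own entry-point URL; this one is made up.
    entry_url = "https://cloud.example.com/cimi/cloudEntryPoint"
    headers = {"Accept": "application/json"}

    # The entry point advertises the resource collections the provider supports.
    entry = requests.get(entry_url, headers=headers, timeout=10).json()

    machines_href = entry.get("machines", {}).get("href")
    if machines_href:
        machines = requests.get(machines_href, headers=headers, timeout=10).json()
        print("machines reported by the provider:", machines.get("count"))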

The CIMI specification is the centerpiece of DMTF’s Cloud Management Initiative, and is the first standard created by the Cloud Management Working Group (CMWG). DMTF’s Cloud Management Initiative includes contributions from additional working groups including the Cloud Auditing Data Federation Working Group (CADF WG), the Network Services Management Working Group (NSM WG), the Software License Management (SLM) Incubator and the System Virtualization, Partitioning, and Clustering Working Group (SVPC WG). Additional announcements are expected from DMTF cloud-related working groups early next year.

DMTF working groups and incubators collaborate with a number of industry organizations in an effort to unify their cloud management initiatives. These organizations include the Cloud Security Alliance (CSA), the China Communications Standards Association (CCSA), the China Electronics Standardization Institute (CESI), the Open Data Center Alliance (ODCA), the Storage Networking Industry Association (SNIA), the Open Grid Forum (OGF), the Object Management Group (OMG), The Open Group (TOG), the Metro Ethernet Forum (MEF), the Global Inter-Cloud Technology Forum (GICTF) and the TeleManagement Forum (TMF).

For additional information on DMTF’s cloud efforts, including specifications, whitepapers and charters, visit www.dmtf.org/cloud.

 


Coalfire Opens VMware Compliance Lab

Coalfire Systems, Inc. today announced that it has established the VMware Compliance Lab, a center of excellence that designs, tests and promotes IT security best practices and audit guidelines for virtualized computing environments.

The VMware Compliance Lab, housed in Coalfire’s Seattle office, provides partners and end users with the information and tools they need to expedite the audit process and ensure compliance with major IT security standards, including PCI DSS, HIPAA/HITECH, GLBA, FISMA and FedRAMP. As a fully independent IT Governance, Risk and Compliance firm, Coalfire gathers reference architecture and controls data from VMware, tests those controls in both the lab and the field, and issues guidance documents that security professionals can use to manage risk and compliance. In addition to VMware products, the Lab also houses and tests controls information from other products built on the VMware reference architecture, including solutions from EMC, RSA, HP, Symantec, McAfee and LogRhythm.

“Coalfire is partnering with VMware and other industry leaders to promote security and compliance in virtualized environments,” said Rick Dakin, CEO, co-founder and senior strategist at Coalfire. “Our lab provides a clearinghouse of unbiased, tested and proven best practices, and as those best practices are adopted in the field, end users will be able to streamline risk and compliance efforts.”

“Coalfire’s thought leadership and IT audit expertise enables our partners and customers to confidently virtualize highly regulated workloads and meet their regulatory requirements. The guidance provided by Coalfire coupled with VMware’s proven leadership and ecosystem enables enterprises to use their virtualization investment as they move business critical applications to the cloud,” said Parag Patel, vice president, Global Strategic Alliances.


Four Things You Need to Know About PCI Compliance in the Cloud

By Andrew Hay, Chief Evangelist, CloudPassage

Andrew Hay is the Chief Evangelist at CloudPassage, Inc., where he is lead advocate for its SaaS server security product portfolio. Prior to joining CloudPassage, Andrew was a Senior Security Analyst for 451 Research, where he provided technology vendors, private equity firms, venture capitalists and end users with strategic advisory services.

Anyone who’s done it will tell you that implementing controls that will pass a PCI audit is challenging enough in a traditional data center where everything is under your complete control. Cloud-based application and server hosting makes this even more complex. Cloud teams often hit a wall when it’s time to select and deploy PCI security controls for cloud server environments. Quite simply, the approaches we’ve come to rely on just don’t work in highly dynamic, less-controlled cloud environments. Things were much easier when all computing resources were behind the firewall with layers of network-deployed security controls between critical internal resources and the bad guys on the outside.

Addressing PCI DSS in cloud environments isn’t an insurmountable challenge. Luckily, there are ways to tackle the key difficulties of operating a PCI-DSS in-scope server in a cloud environment. The first step towards embracing cloud computing, however, is admitting (or in some cases learning) that your existing tools might not be capable of getting the job done.

Traditional security strategies were created at a time when cloud infrastructures did not exist and the only use of public, multi-tenant infrastructure was data communications via the Internet. Multi-tenant (and even some single-tenant) cloud hosting environments introduce many nuances, such as dynamic IP addressing of servers, cloud bursting, rapid deployment and equally rapid server decommissioning, that the vast majority of security tools cannot handle.

First Takeaway: The tools that you have relied upon for addressing PCI related concerns might not be built to handle the nuances of cloud environments.

The technical nature of cloud-hosting environments makes them more difficult to secure. A technique sometimes called “cloud-bursting” can be used to increase available compute power extremely rapidly by cloning virtual servers, typically within seconds to minutes. That’s certainly not enough time for manual security configuration or review.

Second Takeaway: Ensure that your chosen tools can be built into your cloud instance images to ensure security is part of the provisioning process.
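
One common way to act on that takeaway is to bake agent installation into the launch itself, so every clone comes up with security tooling already in place. The sketch below uses boto3 against EC2 purely as an example of the pattern; the AMI ID, instance type and agent install command are hypothetical stand-ins, not anything prescribed by the article.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # User data runs at first boot, so the security/compliance agent is installed
    # before the server takes any traffic. The install command is a placeholder.
    user_data = """#!/bin/bash
    curl -sSL https://example.com/agent/install.sh | bash -s -- --registration-key REPLACE_ME
    """

    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # a hardened "golden" image (placeholder ID)
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        UserData=user_data,
    )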

While highly beneficial, high-speed scalability also means high-speed growth of vulnerabilities and attackable surface area. Using poorly secured images for cloud-bursting or failing to automate security in the stack means a growing threat of server compromise and nasty compliance problems during audits.

Third Takeaway: Vulnerabilities should be addressed prior to bursting or cloning your cloud servers and changes should be closely monitored to limit the expansion of your attackable surface area.

Traditional firewall technologies present another challenge in cloud environments. Network address assignment is far more dynamic in clouds, especially in public clouds. There is rarely a guarantee that your server will spin up with the same IP address every time. Current host-based firewalls can usually handle changes of this nature but what about firewall policies defined with specific source and destination IP addresses? How will you accurately keep track of cloud server assets or administer network access controls when IP addresses can change to an arbitrary address within a massive IP address space?

Fourth Takeaway: Ensure that your chosen tools can handle the dynamic nature of cloud environments without disrupting operations or administrative access.
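
In practice this usually means writing rules in terms of something more stable than an IP address. As one hedged illustration, the boto3 call below allows database traffic from whatever instances belong to an application-tier security group, so the rule survives servers re-spawning with new addresses; both group IDs are placeholders.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Reference the source security group instead of a CIDR, so the rule keeps
    # working no matter which IP addresses the app-tier servers come up with.
    ec2.authorize_security_group_ingress(
        GroupId="sg-0dbtier0000000000",    # database-tier group (placeholder)
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 3306,
            "ToPort": 3306,
            "UserIdGroupPairs": [{"GroupId": "sg-0apptier000000000"}],  # app-tier group (placeholder)
        }],
    )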

The auditing and assessment of deployed servers is an addressable challenge presented by cloud architectures. Deploying tools purpose-built for dynamic public, private and hybrid cloud environments will also ensure that your security scales alongside your cloud server deployments. Also, if you think of cloud servers as semi-static entities deployed on a dynamic architecture, you will be better prepared to help educate internal stakeholders, partners and assessors on the aforementioned cloud nuances – and how your organization has implemented safeguards to ensure adherence to PCI-DSS.

 


Rackspace Launches OpenStack-based Private Cloud Software

Today, Rackspace Hosting announced the release of Rackspace Private Cloud Software, powered by OpenStack – making it simple and easy for companies to install, test and run a multi-node, OpenStack-based private cloud environment. The software, code-named “Alamo,” uses the same OpenStack compute platform, Nova, used to run Rackspace clouds and is available as a free download from the Rackspace website. This software is based upon Rackspace’s experience in deploying and operating OpenStack-based public and private clouds in a variety of environments, including Rackspace’s own datacenters as well as external datacenters. The Rackspace Private Cloud is backed by an optional support offering.

The Rackspace Private Cloud Software combines the capabilities of public cloud with the customization, reliability and control advantages of a dedicated environment. Customers now have a simple way to install an OpenStack-based private cloud in their own datacenter, at Rackspace, or in a colocation facility.

“We believe that the majority of our customers and cloud users will be running hybrid cloud environments for a long time,” said Jim Curry, general manager of Private Cloud business at Rackspace. “Today’s announcement allows businesses to utilize their existing investment in their own datacenter resources to run an open cloud solution for additional control and customization and also take advantage of Rackspace’s datacenter options.”

Key Benefits of Rackspace Private Cloud, Powered by OpenStack include:

  • Deploy in minutes – To download, customers can go to www.rackspace.com/cloud/private
    and run a simple installer to deploy OpenStack components and
    configuration for private clouds.
  • Integrated and tested configuration – Based on customer
    feedback, Rackspace selected a proven configuration, which initially
    includes Ubuntu 12.04 LTS host operating system and KVM hypervisor. It
    is 100% open source OpenStack Essex with Compute, Image Service,
    Identity Service and Dashboard. Rackspace is working with partners
    like Red Hat and others to offer its customers choice of host
    operating systems and OpenStack distributions in the future.
  • Backed by Rackspace Fanatical Support® – Organizations running
    the software can utilize free support forums or can purchase
    Escalation Support services from Rackspace. Escalation Support
    includes 24x7x365 ticket and phone support for Rackspace Private Cloud
    powered by OpenStack from the experts at Rackspace.
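
For anyone trying the download, a quick post-install sanity check along these lines can confirm that the components named above (Identity Service, Compute, Image Service and Dashboard) are answering; the controller address is a placeholder and the ports are the conventional OpenStack defaults rather than anything specified by Rackspace, so adjust them to your own deployment.

    import requests

    controller = "http://192.0.2.10"   # placeholder controller address
    endpoints = {
        "Identity Service (Keystone)": controller + ":5000/v2.0/",
        "Compute (Nova)": controller + ":8774/",
        "Image Service (Glance)": controller + ":9292/",
        "Dashboard (Horizon)": controller + "/",
    }

    # Any HTTP response at all (even 401 or a redirect) shows the service is up and listening.
    for name, url in endpoints.items():
        try:
            status = requests.get(url, timeout=5).status_code
            print(name + ": HTTP " + str(status))
        except requests.RequestException as exc:
            print(name + ": unreachable (" + str(exc) + ")")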

“Since the founding of OpenStack, we have had requests from the marketplace and our customers for a private cloud software offering based on OpenStack that makes it easy to get up and running. We are making that solution available through Rackspace’s Private Cloud Software allowing organizations of any size to take advantage of open cloud technology that conforms 100% to the open source code base. Rackspace is making it easy for every IT decision maker, IT pro, and system administrator to install, test and run OpenStack clouds anywhere within minutes – you don’t need to be an OpenStack expert,” said Lew Moorman, president at Rackspace. “This is built, packaged and tested by the OpenStack experts at Rackspace, providing customers access to a proven configuration and to Rackspace’s expert Fanatical Support team.”

This software is the newest addition to the Rackspace Private Cloud suite along with OpenStack Training and Support services. For more information go to: www.rackspace.com/cloud/private