Tag archive: security

Why visibility and control are critical for container security

The steady flow of reported security vulnerabilities in open source components, such as Heartbleed, Shellshock and Poodle, is making organisations focus increasingly on making the software they build more secure. As organisations increasingly turn to containers to improve application delivery and agility, the security ramifications of the containers and their contents are coming under increased scrutiny.

An overview of today’s container security initiatives 

Container providers such as Docker and Red Hat are moving aggressively to reassure the marketplace about container security. Primarily, they are focusing on cryptographic signing to verify the code and software versions running in Docker users' software infrastructure, protecting users from malicious backdoors included in shared application images and other potential security threats.

However, this approach is coming under scrutiny because it covers only one aspect of container security: it says nothing about whether software stacks and application portfolios are free of known, exploitable versions of open source code.

Without open source hygiene, Docker Content Trust will only ever ensure that Docker images contain the exact same bits that developers originally put there, including any vulnerabilities present in the open source components. It therefore amounts to only a partial solution.

A more holistic approach to container security

Knowing that the container is free of vulnerabilities at the time of initial build and deployment is necessary, but far from sufficient. New vulnerabilities are constantly being discovered, and these often affect older versions of open source components. What's needed, therefore, is an informed approach to open source: careful selection of components at build time and continued vigilance over them once deployed.

Moreover, the security risk posed by a container also depends on the sensitivity of the data accessed via it, as well as the location of where the container is deployed. For example, whether the container is deployed on the internal network behind a firewall or if it’s internet-facing will affect the level of risk.

In this context, an internet-facing deployment exposes containers to a range of threats, including cross-site scripting, SQL injection and denial-of-service attacks, that containers deployed on an internal network behind a firewall wouldn't face.

For this reason, having visibility into the code inside containers is a critical element of container security, even aside from the issue of security of the containers themselves.

It's critical to develop robust processes for determining: what open source software resides in or is deployed along with an application; where this open source software is located in build trees and system architectures; whether the code exhibits known security vulnerabilities; and whether an accurate open source risk profile exists.
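As a rough illustration of the kind of automated check such a process implies, the Python sketch below compares a dependency manifest against a list of known-vulnerable versions. The manifest, the feed and the component versions are invented for the example; real tooling would draw on sources such as the National Vulnerability Database.

# Minimal sketch: flag open source components whose pinned versions appear
# in a known-vulnerability feed. All data below is hypothetical.

# Hypothetical manifest: component name -> version shipped with the application
manifest = {
    "openssl": "1.0.1f",
    "bash": "4.3",
    "libxml2": "2.9.4",
}

# Hypothetical feed: (component, affected version) -> advisory identifier
known_vulnerabilities = {
    ("openssl", "1.0.1f"): "CVE-2014-0160",  # Heartbleed
    ("bash", "4.3"): "CVE-2014-6271",        # Shellshock
}

def audit(components, feed):
    """Return (component, version, advisory) for every risky entry found."""
    findings = []
    for component, version in components.items():
        advisory = feed.get((component, version))
        if advisory:
            findings.append((component, version, advisory))
    return findings

for component, version, advisory in audit(manifest, known_vulnerabilities):
    print(f"{component} {version} is affected by {advisory}")

Running such a check on every build, not just at initial deployment, is what turns point-in-time assurance into the ongoing vigilance described above.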

Will security concerns slow container adoption? – The industry analysts’ perspective

Enterprise organisations today are embracing containers because of their proven benefits: improved application scalability, fewer deployment errors, faster time to market and simplified application management. However, just as organisations have moved over the years from viewing open source as a curiosity to understanding its business necessity, containers seem to have reached a similar tipping point. The question now is shifting to whether security concerns about containers will inhibit further adoption. Industry analysts differ in their assessment of this.

Drawing a parallel with the rapid adoption of virtualisation technologies before security requirements had been established, Dave Bartoletti, Principal Analyst at Forrester Research, believes security concerns won't significantly slow container adoption. "With virtualization, people deployed anyway, even when security and compliance hadn't caught up yet, and I think we'll see a lot of the same with Docker," according to Bartoletti.

Meanwhile, Adrian Sanabria, Senior Security Analyst at 451 Research, believes enterprises will give containers a wide berth until security standards are identified and established. "The reality is that security is still a barrier today, and some companies won't go near containers until there are certain standards in place," he explains.

To overcome these concerns, organisations are best served to take advantage of the automated tools available to gain control over all the elements of their software infrastructure, including containers.

Ultimately, the presence of vulnerabilities in all types of software is inevitable, and open source is no exception. Detection and remediation of those vulnerabilities are increasingly seen as a security imperative and a key part of a strong application security strategy.

 

Written by Bill Ledingham, EVP of Engineering and Chief Technology Officer, Black Duck Software.

EC/US have three months to find a new Safe Harbour

The European Commission (EC) and the US are under pressure to come up with a replacement for the recently invalidated Safe Harbour agreement.

A statement from the EU advisory body the Article 29 Working Party on the Protection of Individuals has given those affected by the ruling three months to devise a new system.

However, the US and the EC have previously worked for two years, without success, to reform the Safe Harbour agreement. The reforms were made necessary after US government surveillance programmes were revealed by National Security Agency (NSA) whistleblower Edward Snowden. Despite that co-operation, progress stalled because the US couldn't guarantee limits on access to personal data.

“If by the end of January 2016, no appropriate solution is found with the US authorities and depending on the assessment of the transfer tools by the Working Party, EU data protection authorities are committed to take all necessary and appropriate actions, which may include coordinated enforcement actions,” said the statement issued.

Following the Court of Justice of the European Union (CJEU) ruling on October 6th, many companies risk being prosecuted by European privacy regulators if they transfer EU citizens' data to the US without a demonstrable set of privacy safeguards.

The 4,000 firms that transfer their clients' personal data to the United States currently have no means of demonstrating compliance with EU privacy regulations. As the legal situation currently stands, EU data protection law says companies cannot transfer EU citizens' personal data to countries outside the EU that have insufficient privacy safeguards.

EU data protection authorities, meeting in Brussels to assess the implications of the ruling, said in a statement that they would assess the impact of the judgment on other data transfer systems, such as binding corporate rules and model clauses between companies.

The regulators said in their statement the EU and the United States should negotiate an “intergovernmental agreement” providing stronger privacy guarantees to EU citizens, including oversight on government access to data and legal redress mechanisms.

Multinationals can still set up internal privacy rules for US data transfers, to be approved by regulators, but these so-called 'binding corporate rules' are used by only 70 companies. All alternative data transfer systems could now also be at risk of a legal challenge, say lawyers. "The good news is that the European data protection authorities have agreed on a kind of grace period until the end of January," said Monika Kuschewsky, a lawyer at Covington & Burling.

Bracket Computing wins $45 million to secure cloud with encapsulated data cells

Security start-up Bracket Computing has been awarded $45m in a Series C investment round to develop its system for making content safe in the cloud.

Bracket's Computing Cell technology works by encapsulating content in a cell in order to secure it. The enveloped data and applications can then travel safely across multiple cloud environments, according to its inventors. The Cell technology simplifies the increasingly complex issue of cloud management by consolidating security, networking and data management into a single construct.

The cell can run across multiple public clouds and in a customer’s own data centre. The cell structure also brings consistency to the cloud, as it protects client apps from the performance changes that can occur in cloud computing.

Customers hold the digital keys to their data, which is encrypted. Bracket runs a service that reserves hardware at cloud providers when necessary and distributes the data across multiple machines to smooth performance and improve speed.
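Bracket has not published implementation details, but the pattern described here, data encrypted under a key the customer retains and the ciphertext spread across several locations, can be sketched in Python as below. The cryptography package, the in-memory 'stores' and the chunking scheme are assumptions made for illustration, not Bracket's technology.

# Illustrative sketch only: client-side encryption with a customer-held key,
# with the ciphertext split round-robin across multiple (in-memory) stores.
from cryptography.fernet import Fernet

customer_key = Fernet.generate_key()   # stays under the customer's control
cipher = Fernet(customer_key)

def encrypt_and_distribute(data, stores, chunk_size=1024):
    """Encrypt data, then spread the ciphertext chunks across the stores."""
    ciphertext = cipher.encrypt(data)
    chunks = [ciphertext[i:i + chunk_size]
              for i in range(0, len(ciphertext), chunk_size)]
    for index, chunk in enumerate(chunks):
        stores[index % len(stores)].append((index, chunk))  # round-robin placement

def gather_and_decrypt(stores):
    """Reassemble the chunks in order and decrypt with the customer's key."""
    ordered = sorted(chunk for store in stores for chunk in store)
    return cipher.decrypt(b"".join(chunk for _, chunk in ordered))

stores = [[], [], []]                  # stand-ins for three cloud locations
encrypt_and_distribute(b"sensitive records", stores)
assert gather_and_decrypt(stores) == b"sensitive records"

Because only ciphertext leaves the customer's environment and the key never does, no single cloud location holds readable data on its own, which is the general property the Computing Cell claims to provide.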

The founders, Tom Gillis and Jason Lango, have a pedigree in internet security, having created Ironport Systems' anti-spam hardware range, which was bought by Cisco Systems in 2007 for $830 million. In 2011 they founded Bracket to solve the new security problems created by the cloud.

"Imagine if you could encapsulate your most sensitive applications, data and services and run them securely across hyperscale public clouds and your private cloud, while ensuring consistent security controls and data management," said Lango. "This is what a Bracket Computing Cell allows. It enables an enterprise without boundaries, without sacrificing security and control."

The funds will finance a global roll-out said Bracket CEO Tom Gillis. The data centres of the finance sector are an immediate target, but the technology applies to all large corporations, said Gillis. “Financial firms need to remain technology leaders. We’re working with some of the very largest as we define the blueprint for the data centre of the future.”

Bracket Computing Raises $45 Million

Bracket Computing, a company that strives to deliver enterprise computing driven by business needs instead of hardware limitations, has raised upwards of $45 million in a Series C funding round, bringing its total funding to an estimated $130 million or more. Two new investors, Fidelity Management and Research Company and Goldman Sachs, are joined by Bracket's previous investors Allegis Capital, Andreessen Horowitz, ARTIS Ventures, Columbus Nova Technology Partners, Norwest Venture Partners, and Sutter Hill Ventures. Bracket plans to use the new capital to develop the Bracket Computing Cell and finance the company's global expansion.

The Bracket Computing Cell allows enterprise applications and data, along with their associated security, networking, and data management infrastructure, to reside in a single software construct. The Computing Cell has been designed to function across a multitude of public cloud providers in addition to a company's on-premise data center, resulting in a consistent, enterprise-grade virtual infrastructure.

Tom Gillis, CEO and co-founder of Bracket, has commented, “Bracket is fundamentally redefining enterprise computing. Financial firms need to remain technology leaders, and we’re working with some of the very largest as we define the blueprint for the data center of the future. Our vision is to provide a secure, advanced, virtual infrastructure that spans multiple clouds, both private and public, with one consistent set of capabilities. Having investors of this quality bolsters our efforts to build this ambitious technology.”


Jason Lango, CTO and co-founder of Bracket, has added, "Imagine if you could encapsulate your most sensitive applications, data, and services and have them run securely across leading hyperscale public clouds and your private cloud, all the while ensuring consistent security controls and data management capabilities. This is what a Bracket Computing Cell allows. It enables an enterprise without boundaries, without sacrificing security and control." Within the Computing Cell is encryption technology that enhances security by creating a secure fabric, extending the user's trust across multiple hyperscale clouds that the user doesn't necessarily control. This approach allows the user to span multiple public clouds while still maintaining a high level of security.


Azure Cloud Security Enhancement

One of the most anticipated user management and security features for Microsoft Azure has officially been launched. According to Alex Simons, director of program management at Microsoft's Identity Division, Azure Role-Based Access Control (RBAC) is now generally available. RBAC has been requested by customers that have evaluated Azure as the foundation of their own enterprise cloud environments. Azure Role-Based Access Control permits administrators to selectively grant access to both cloud services and production workloads, adding a level of security.

As Dushyant Gill, a Microsoft Azure Active Directory program manager, explained, "Until now, to give people the ability to manage Azure you had to give them full control of an entire Azure subscription. Now, using RBAC, you can grant people only the amount of access that they need to perform their jobs." RBAC interfaces with Azure Active Directory (AD), Microsoft's cloud-based identity management platform, to show users their assigned Azure resources. Once an organization extends its Active Directory to the cloud using Azure AD, employees can purchase and manage Azure subscriptions using their existing work identity, and those subscriptions automatically connect to Azure AD for single sign-on and access management.


If an Active Directory account becomes disabled, access to all Azure subscriptions is cut off, enhancing the security of the Azure environment. Role-Based Access Control can also give departments a certain level of independence while still keeping them compliant with the organization's IT policies. Gill explained, "Using Azure RBAC, you can enable self-service management of cloud resources for your project teams while retaining central control over security sensitive infrastructure. For example, a common setup is to allow project teams to create and manage their own virtual machines and storage accounts, but only allow them to connect to networks managed by a central team."

RBAC is currently available with a multitude of preset roles; however, as Gill notes, "if none of the built-in RBAC roles addresses your specific access need, you will be able to create a custom RBAC role composing the exact operations to which you wish to grant access."
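The Python sketch below is a conceptual model of the idea Gill describes, roles composed of specific operations and assigned to users at a particular scope. It is not the Azure API; the role names, operations and scope strings are invented for the example.

# Conceptual RBAC sketch (not the Azure API): a role is a set of operations,
# an assignment binds a user to a role at a scope, and a check walks the list.

roles = {
    "Reader": {"read"},
    "VM Operator": {"read", "vm/start", "vm/stop"},
}

# A "custom role" simply composes the exact operations to be granted.
roles["Storage Key Auditor"] = {"read", "storage/listKeys"}

assignments = [
    # (user, role, scope) -- scopes are path-like strings for illustration
    ("alice", "VM Operator", "/subscriptions/sub1/resourceGroups/projectA"),
    ("bob", "Storage Key Auditor", "/subscriptions/sub1"),
]

def is_allowed(user, operation, resource):
    """Allow the operation if any assignment covers the user, scope and operation."""
    for assignee, role, scope in assignments:
        if assignee == user and resource.startswith(scope) and operation in roles[role]:
            return True
    return False

print(is_allowed("alice", "vm/start", "/subscriptions/sub1/resourceGroups/projectA/vm7"))  # True
print(is_allowed("alice", "vm/start", "/subscriptions/sub1/resourceGroups/projectB/vm1"))  # False

The key point the model captures is that access is granted per role and per scope, so a project team can manage its own resources without receiving control of the whole subscription.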


Kii and KDDI say their joint platform will make IoT safe on cloud

Japanese telco KDDI is working with Internet of Things (IoT) cloud platform provider Kii to create a risk-averse system in which enterprises can develop mobile apps.

The KDDI cloud platform service (KCPS) is described as a mobile backend as a service (mBaaS) offering that uses Kii's software to create mobile and IoT apps on a private network. The two companies have worked together on ways to apply cloud disciplines for efficient sharing of resources, contained within the confines of an intranet environment. The object of the collaboration is to allow companies to develop machine-to-machine systems without exposing them to the public cloud while they are in development.

According to KDDI, the KCPS uses the telco's Wide Area Virtual Switch to integrate a number of different virtual network layers with Kii's software, together creating fast connections across the intranet. KCPS also provides a service environment for intranet-focused customers who need high standards of security and enterprise functions without resorting to the public internet.

KDDI claims this is the first instance in which both Intranet and Internet services can work seamlessly with any mobile application developed on the KCPS platform.

KDDI's application development support will allow developers to build better-quality, lower-priced applications in a short period of time, it claims. The platform is designed to help developers manage application development, devices and data, while providing essential features such as push notifications and geolocation. KCPS should be compatible with mobile apps on Android and iOS, according to KDDI.

"As the IoT gains mass acceptance, we see tremendous value helping mobile app developers get more IoT devices into the hands of consumers," said Kii CEO Masanari Arai. "Our collaboration will use the cloud to build the backend support of these apps in Japan."

Cloud industry shaken by European Safe Harbour ruling

The Court of Justice of the European Union has ruled that the Safe Harbour agreement between Europe and the US, which provided blanket permission for data transfer between the two, is invalid.

Companies looking to move data from Europe to the US will now need to negotiate specific rules of engagement with each country, which is likely to have a significant impact on all businesses, but especially those heavily reliant on the cloud.

The ruling came about after Austrian privacy campaigner Max Schrems asked to find out what data Facebook was passing on to US intelligence agencies in the wake of the Snowden revelations. When his request was declined on the grounds that the Safe Harbour agreement guaranteed his protection, he contested the decision and it was referred to the Court of Justice.

This decision had been anticipated, and on top of any legal contingencies already made, large players such as Facebook, Google and Amazon are offered some protection by the fact that they have datacentres within Europe. However, the legal and logistical strain will be felt by all, especially smaller companies that rely on US-based cloud players.

“The ability to transfer data easily and securely between Europe and the US is critical for businesses in our modern data-driven digital economy,” said Matthew Fell, CBI Director for Competitive Markets. “Businesses will want to see clarity on the immediate implications of the ECJ’s decision, together with fast action from the Commission to agree a new framework. Getting this right will be important to the future of Europe’s digital agenda, as well as doing business with our largest trading partner.”

“The ruling invalidating Safe Harbour is seismic,” said Andy Hardy, EMEA MD at Code42, which recently secured $85 million in Series B funding. “This decision will affect big businesses as well as small ones. But it need not be the end of business as we know it, in terms of data handling. What businesses need to do now is safeguard data. They need to find solutions that keep their, and their customer’s, data private – even when backed up into public cloud.”

“Symantec respects the decision of the EU Court of Justice,” said Ilias Chantzos, Senior Director of Government Affairs EMEA at Symantec. “However, we encourage further discussion in order to create a strengthened agreement with the safeguards expected by the EU Court of Justice. We believe that the recent ruling will create considerable disruption and uncertainty for those companies that have relied solely on Safe Harbour as a means of transferring data to the United States.”

“The issues are highly complex, and there are real tensions between the need for international trade, and ensuring European citizen data is treated safely and in accordance with data protection law,” said Nicky Stewart, commercial director of Skyscape Cloud Services. “We would urge potential cloud consumers not to use this ruling as a reason not to adopt cloud. There are very many European cloud providers which operate solely within the bounds of the European Union, or even within a single jurisdiction within Europe, therefore the complex challenges of the Safe Harbor agreement simply don’t apply.”

These were just some of the views offered to BCN as soon as the ruling was announced, and the public hand-wringing is likely to continue for some time. From a business cloud perspective, one man's problem is another's opportunity, and companies will be queuing up to offer localised cloud services, encryption solutions and the like. In announcing a couple of new European datacentres today, NetSuite was already making reference to the ruling. This seems like a positive step for privacy, but only time will tell what it means for the cloud industry.

Rackspace launches managed security and compliance service for enterprise cloud clients

Rackspace has announced new managed security and compliance assistance services to protect businesses and mitigate the risk of cyber threats. These services will give Rackspace clients 'holistic' coverage across complex, multi-cloud environments, it claims.

The service will provide consultation and tailored security using Rackspace's in-house expertise. It can improve security while cutting the cost of vigilance, Rackspace claimed.

The Rackspace Managed Security offering, which launches in October, will be backed by round-the-clock support from the Customer Security Operations Center (CSOC) at Rackspace headquarters. The service comprises host and network protection, security analytics, vulnerability management, threat intelligence and compliance assistance.

Host and Network Protection will guard against zero-day and non-malware attacks as well as traditional compromise tactics. Security Analytics uses a security information and event management (SIEM) system paired with big data analytics to collect and analyse security data from the customer's environment. As part of its Vulnerability Management service, Rackspace will scan its clients' environments and tailor its responses to the threats it estimates. Meanwhile, its Threat Intelligence will fuse information from 20 feeds with Rackspace's own internal data to constantly redraw the changing threat landscape.
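Rackspace hasn't described how that fusion works internally; as a rough illustration of the general idea, merging indicators from multiple feeds into one de-duplicated set and matching them against events from a customer environment, a sketch might look like the following. The feed contents and event format are invented for the example.

# Illustrative sketch: merge threat-intelligence feeds and flag matching events.
# Feed contents and the event format below are hypothetical.

feeds = [
    {"203.0.113.7", "198.51.100.23"},      # feed 1: known-bad IP addresses
    {"198.51.100.23", "192.0.2.99"},       # feed 2: overlaps with feed 1
]

def fuse(feeds):
    """Union the feeds into a single de-duplicated set of indicators."""
    indicators = set()
    for feed in feeds:
        indicators |= feed
    return indicators

def flag_events(events, indicators):
    """Yield events whose source address matches a known indicator."""
    for event in events:
        if event["src_ip"] in indicators:
            yield event

events = [
    {"src_ip": "192.0.2.99", "action": "login_failed"},
    {"src_ip": "203.0.113.250", "action": "login_ok"},
]

for hit in flag_events(events, fuse(feeds)):
    print("alert:", hit)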

All this information will help clients meet their governance objectives, as part of Rackspace’s Compliance Assistance service, which offers detailed proof of configuration hardening and monitoring, patch monitoring and user observance, the service provider said.

This information, in tandem with detail about file integrity, will help cloud service managers and CIOs to keep on top of their mounting compliance challenge, claimed Brian Kelly, chief security officer at Rackspace.

"Cyber-attacks are the new normal for companies," said Kelly, who argued it will be a lot cheaper and quicker to use Rackspace to manage cloud services. "We have 16 years of first-hand knowledge managing IT infrastructure and direct experience with today's complex threats."

Hitachi Data Systems unveils new automated IoT policing system

A new IoT system can predict crime by reading social media and analysing the public's movements, claims Hitachi Data Systems (HDS).

Hybrid cloud systems designed by HDS are to offer new automated policing systems, including predictive crime analytics and video management systems. The new public safety technologies were unveiled yesterday by HDS at the ASIS International Annual Seminar and Exhibits in Anaheim, California.

The new Hitachi Visualization Suite (HVS) (version 4.5) now includes Predictive Crime Analytics (PCA) and version 2.0 of the Video Management Platform (VMP).

The PCA predicts crime by analysing live social media and internet data feeds to gather insights that enable users of the system to make 'highly accurate crime predictions', claims HDS. Social media and video camera data will be analysed both to study historical crime and to predict potential incidents.

The HVS is a hybrid cloud-based platform that integrates disparate data and video assets from public safety systems, such as computer-aided emergency services dispatch, number plate readers and gunshot sensors. The real-time information is then presented geospatially on monitors at law enforcement agencies in order to improve intelligence, support investigations and make policing more efficient, says HDS. The geospatial visualizations will also provide better historical crime data by presenting information on crime in several forms, including heat maps.

Blending real-time event data from public safety systems with historical and contextual crime data allows agencies to conduct more thorough analysis, using spatial and temporal prediction algorithms that could help solve many hitherto unsolvable crimes. It could also reveal underlying risk factors that generate or mitigate crime, says HDS.

The system uses natural language processing for topic intensity modelling across social media networks, which, HDS claims, will deliver highly accurate crime predictions.
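HDS hasn't disclosed its models, but the basic notion of 'topic intensity', how strongly a topic's vocabulary shows up in a stream of posts per time window, can be illustrated with a very simple count. The keywords, posts and time buckets below are invented for the example; a real system would use far richer language models.

# Illustrative sketch only: crude topic-intensity count per hourly window.
from collections import Counter

TOPIC_KEYWORDS = {"protest", "fight", "fire"}   # hypothetical topic vocabulary

posts = [
    # (hour_of_day, text) -- invented sample data
    (18, "big protest forming downtown"),
    (18, "saw a fight outside the station"),
    (19, "quiet evening by the river"),
]

def topic_intensity(posts):
    """Count topic-keyword mentions per hour window."""
    intensity = Counter()
    for hour, text in posts:
        words = set(text.lower().split())
        intensity[hour] += len(words & TOPIC_KEYWORDS)
    return intensity

for hour, score in sorted(topic_intensity(posts).items()):
    print(f"{hour}:00  intensity={score}")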

The systems will ultimately enable faster police response times when situations develop, according to Mark Jules, HDS's VP of Public Safety and Data Visualization. "Today, we are empowering them with the ability to take a proactive approach to crime and terrorism," said Jules. "Public safety is a fundamental pillar of our vision for smart cities and societies."

Imperva Inc.’s New Senior Vice President of Cloud Services

Imperva Inc., a company dedicated to protecting critical data and applications in the cloud, has announced that Meg Bear will become the company's Senior Vice President of Cloud Services, responsible for expanding the company's range of cloud services internationally. Bear is well qualified for the position, with over twenty years of experience across many aspects of the software business and eleven patents for innovations in data management, social business, recruitment and talent management, and she has been a pivotal figure in the transition to cloud-based business models. Bear was previously Group Vice President for Social Cloud at Oracle, responsible for delivering an integrated global social suite, and held many other leadership roles at Oracle, including Vice President of Human Capital Management (HCM) Development and Senior Director of Development for Oracle PeopleSoft.


“Hiring Meg is one more example of our continued investment in our rapidly growing cloud business, and we are thrilled to bring her on board,” said Anthony Bettencourt. “With the 98% year-over-year increase in subscription revenue reported in our Q2 earnings, we are well positioned with Meg’s experience and leadership to maintain momentum and take advantage of mounting cloud opportunities, adoption rates, and technological innovations.”

“Customers today need more than traditional endpoint and network security solutions that clearly don’t go far enough, given recent high-profile data breaches,” said Bear. “The cloud is an engine of innovation for many companies, and that requires new views on security. I look forward to working with the world class Imperva cloud teams to address those challenges, all with the aim of helping customers protect their business critical data and apps.”

In addition, she is an advisory board member at Unitive and Brand Amper, previously served as an advisory board member at Storyvite, and has held advisory roles with many other organizations, such as Watermark. Bear graduated with a degree in Economics and Entrepreneurship from the University of Arizona. Clearly, Bear has a lot to offer Imperva, a leading provider of cyber security solutions for business.
