Archivo de la etiqueta: security

Blue Coat Systems Acquires Elastica

Blue Coat Systems has recently announced its agreement to acquire Elastica, Inc. for $280 million.

Cloud applications have been adopted at an unprecedented rate, creating an equally unprecedented need for security and protection around them. Because enterprises now run a mix of cloud and on-premise applications, managing security efficiently has become difficult. With the addition of Elastica, Blue Coat will be able to offer a global security platform with across-the-board, data-level security, making it the only company positioned to meet the security requirements of the post-infrastructure era. By combining the talents of Blue Coat with Elastica’s Cloud Access Security and Analytics, Blue Coat will provide a solution to the problems associated with cloud security requirements. Elastica’s CloudSOC provides tools such as threat scoring powered by machine learning, user and endpoint behavior modeling, natural language-based cloud DLP, and analysis with remediation in a cloud application SOC. Elastica delivers these tools through its CASB gateway and API controls for cloud security management and enforcement.


Greg Clark, Blue Coat CEO, has commented: “This acquisition gives Blue Coat customers access to Elastica’s CloudSOC, which brings an unprecedented level of elegance and innovation to something that is rapidly becoming a complex challenge for organizations to solve. As we evaluated many CASB players, it was clear that Elastica’s technologies represent the future of the CASB space. Segmented CASB players have survived through their dependency upon existing on-premise infrastructure. As the industry’s leading web security platform, it is natural for Blue Coat to be the first to deliver an extended spectrum of CASB capabilities while also delivering them with our cloud protection solutions.”

Mike Fey, Blue Coat president and COO, also added: “Our customers cannot tolerate a world where the performance and security of cloud applications are spread across a tangled web of solutions leaving them powerless to manage the threat and deliver the SLA which their users have come to expect. Corporations are facing a dissolving perimeter. The traditional infrastructure-centric way of protecting users cannot support the cloud age. We have made it our mission to solve this challenge by delivering an entire solution from the cloud, specifically built for the cloud.”

The post Blue Coat Systems Acquires Elastica appeared first on Cloud News Daily.

Software market frustrating for enterprise users says Gemalto research

Software licensing is still causing enterprises grief, according to new research by security firm Gemalto. The biggest pain points and causes of frustration are the inflexibility of licensing arrangements and the unhelpful delivery options.

According to the State of Software Monetization report, software vendors must change if they’re to satisfy enterprise user demand. This means delivering software as a service and making it accessible across multiple devices, it concludes.

The disparity between customer demand and vendor supply has been created by a shift in tastes among enterprise software customers. This is partly a function of the ‘bring your own device’ (BYOD) phenomenon, driven by intelligent device manufacturers and mobile phone makers. However, despite this demand for greater flexibility, software vendors have not followed suit with correspondingly flexible and adaptable licensing and packaging models, the report says.

The most frequently voiced complaint, from 87% of the survey sample, was about the cost of renewing and managing licenses. Almost as many (83%) complained about time needlessly wasted on unfriendly processes for renewing and managing licenses, while 82% cited time and costs lost to non-product-related development. Most of the survey sample (68%) said they had little idea of how the products they buy are being used in the enterprise.

Four out of five respondents believe that software needs to be future-proofed to be successful.

The report was compiled from feedback from 600 enterprise software users and 180 independent software vendors (ISVs), in relation to headaches related to software licensing and packaging.

Software consumption is changing and customers only want to pay for what they use, according to Shlomo Weiss, Senior VP for Software Monetization at Gemalto. “Delivering software, in ways that customers want to consume it, is critical for creating a user experience that sells,” said Weiss.

Ovum Cloud Security

Tim Jennings, an Ovum analyst, has declared that although there are many fears surrounding the security of the cloud, the increasing number of data breaches is more likely to accelerate enterprise transition to the cloud. This trend exemplifies the increasing maturity of the cloud environment. Jennings commented in a blog: “Given that data security and privacy concerns have been an inhibitor during the early stages of cloud adoption, it is somewhat ironic that the continued spate of high-profile customer data breaches is likely to push more enterprises toward cloud services. One can envisage, therefore, pointed conversations within boardrooms as CIOs and chief security officers are questioned about the likelihood of their organizations being the next to suffer reputational damage through the exposure of customer data. Many organizations will conclude that using the expertise of a third party is a more reliable approach than depending on in-house resources.”

Some degree of vulnerability will always exist. Jennings added, “Many have been like rabbits caught in the headlights, seemingly having little insight into the root cause of the failure, the extent of the consequences, or the actions required for remediation.”

Outsourcing to modern cloud providers therefore appears a logical move. Cloud providers have invested heavily in security, covering everything from the physical security of their data centres to encryption of customer data and advanced security intelligence.

While it is unrealistic for most large companies to replicate these sophisticated environments, that does not mean the public cloud is the only option. “It may be that enterprises prefer to use either an on premise or virtual private cloud, while still taking advantage of a specialist provider’s management and security capabilities. Nor does it mean that the responsibility for security and customer data passes away from the enterprise—even though the delivery of these capabilities is in the hands of the third party, governance and control must be retained in-house.”

The post Ovum Cloud Security appeared first on Cloud News Daily.

Bringing the enterprise out of the shadows

Ian McEwan, VP and General Manager, EMEA at Egnyte, discusses why IT departments must provide employees with secure, adaptive cloud-based file sync and share services or run the risk of ‘shadow IT’, which invites major security vulnerabilities and compliance issues within organisations.

The advent of cloud technology has brought a wide range of benefits to businesses of all sizes, improving processes by offering on-demand, distributed access to the information and applications that employees rely on. This change has not only made IT easier for businesses, it is also fueling new business models and leading to increased revenues for those making best use of the emerging technology.

The cloud arguably offers a business the greatest benefit when used for file sync and share services, allowing users to collaborate on projects in real time, at any time, on any device, from any location. File sync and share makes email attachments redundant, reducing the time employees spend on email each day as well as the chances of files being lost, leaked or overwritten. Used correctly, these services give IT departments a comprehensive overview of all the files and activity on the system, enabling considerably better file management and organisation.

Employees ahead of the corporate crowd

Unfortunately, business adoption of file sharing services often lags behind where employees would like it to be, and staff are turning to ‘shadow IT’: unsanctioned, consumer-grade file sharing solutions. These services undermine the security and centralised control of IT departments. Businesses lose visibility over who has access to certain files and where they are being stored, which can lead to serious security and compliance problems.

CIOs need to protect their companies from the negative impact of unsanctioned cloud applications by implementing a secure solution that monitors all file activity across their business.

Secure cloud-based file sharing

To satisfy both the individual user and business as a whole, IT departments need to identify file sharing services that deliver the agility that comes with storing files in the cloud. It starts with ensuring that a five-pronged security strategy is in place that can apply consistent, effective control and protection over the corporate information throughout its lifecycle. This strategy should cover:

  • User Security – controlling who can access which files, what they can do with them and how long their access will last.
  • Device Security – protecting corporate information at the point of consumption on end user devices.
  • Network Security – protecting data in transit (over encrypted channels) to prevent eavesdropping and tampering.
  • Data Centre Security – providing a choice of deployment model that offers storage options both on premises and in the cloud and total control over where the data is stored.
  • Content Security – attaching policies to the content itself to ensure it can’t leave the company’s controlled environment even when downloaded to a device.

A solution that addresses these security areas will allow efficient collaboration without sacrificing security, compliance and control.
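To make the first of these prongs concrete, user security combines three checks: who the user is, what actions they have been granted, and how long the grant lasts. The sketch below is a hypothetical model written for illustration only; the `ShareLink` class, its fields, and the permission names are invented, not any vendor’s API.

```python
import time

class ShareLink:
    """Toy model of a permissioned, expiring file share (illustrative only)."""

    def __init__(self, path, user, permissions, ttl_seconds):
        self.path = path
        self.user = user
        self.permissions = set(permissions)        # e.g. {"view", "download"}
        self.expires_at = time.time() + ttl_seconds

    def allows(self, user, action):
        # Access requires the right user, a granted action, and an unexpired link.
        return (
            user == self.user
            and action in self.permissions
            and time.time() < self.expires_at
        )

link = ShareLink("/finance/q3-report.xlsx", "alice", {"view"}, ttl_seconds=3600)
print(link.allows("alice", "view"))      # True: correct user, permitted action
print(link.allows("alice", "download"))  # False: action was never granted
print(link.allows("bob", "view"))        # False: wrong user
```

A production service would layer the other four prongs on top of this check, for example enforcing device posture before the link is honoured and serving the file only over an encrypted channel.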

A user friendly, business ready solution

Furthermore, the selected solution and strategy will need to keep up with business demands and industry regulations. Flexibility can be achieved if businesses consider adaptive file sharing services that give them access to files regardless of where they are stored – in the cloud, on premises or a hybrid approach. This enables a business to adapt the service to its own changing preferences, as well as to industry standards that can dictate where data is stored and how it is shared. Recent changes to the US-EU Safe Harbour regulations, which determine how businesses from the US and EU must share and keep track of data, highlight the necessity for businesses to have an adaptive file sharing solution in place to meet the demands of new regulations, or else risk heavy fines and reputational damage.

The final hurdle towards successful implementation of a cloud-based file sharing service is ensuring user adoption through simple functionality. If a service isn’t easy to use, staff may fall back on shadow IT services out of convenience. It is important, therefore, that IT seeks solutions that can be accessed across all devices and integrated with other popular applications already in use within an organisation.

The integrity and privacy of a business’ information requires a secure, adaptive cloud-based file sharing solution that gives organisations comprehensive visibility and control across the lifecycle of its data. Overlooking the security implications of shadow IT services can result in a company incurring significant costs – not just in financial terms, but for a company’s brand, reputation and growth potential. It’s time for IT departments to act now and adopt cloud services that enable efficient collaboration, mitigate any chances of risk and lift the shadow from corporate data.

Druva’s data protection service now available on Azure

Converged data protection firm Druva has allied itself with Microsoft Azure in a bid to expand its cloud presence to a wider public cloud and infrastructure market.

The new relationship gives Druva customers more global options for their data storage, privacy and security needs, and a more impressive infrastructure vendor for companies with sensitive compliance and legal issues. As more companies realise that on-premise storage is becoming unsustainable, partnering with Azure helps Druva settle any regional data privacy issues that might otherwise dissuade customers from using its services, according to Druva.

Druva’s new Azure relationship, it says, gives customers a wider set of choices as they try to keep up with data growth, security and regionally specific regulatory requirements.

Azure will help Druva meet international and industry-specific compliance standards, such as ISO 27001, HIPAA, FedRAMP, SOC 1 and SOC 2. Among the country standards it meets are the Australia IRAP, UK G-Cloud and Singapore MTCS. Microsoft was also the first to adopt the uniform international code of practice for cloud privacy, ISO/IEC 27018, which governs the processing of personal information by cloud service providers. Microsoft’s data centre locations will give Druva 21 storage regions around the globe, including Canada and China which will help Druva meet data residency needs increasingly specified by clients, it claims.

Customers need stronger data protection and security in the cloud now they’re running sensitive workloads, according to Druva CEO Jaspreet Singh. Microsoft will broaden Druva’s cloud-related options and give customers additional choice for deploying in the cloud securely and conveniently. “Druva has quickly grown to become the de facto standard for data protection workloads in the public cloud,” said Singh.

Azure will extend the data storage footprint of Druva inSync, the company’s endpoint and cloud service data protection system. Druva inSync plans will begin at $6 per user per month. Azure support will be generally available in 45 days.

Why visibility and control are critical for container security

The steady flow of reported vulnerabilities in open source components, such as Heartbleed, Shellshock and POODLE, is pushing organisations to focus increasingly on making the software they build more secure. As organisations also turn to containers to improve application delivery and agility, the security ramifications of the containers and their contents are coming under increased scrutiny.

An overview of today’s container security initiatives 

Container providers such as Docker and Red Hat are moving aggressively to reassure the marketplace about container security. They are focusing on the use of encryption and image signing to secure the code and software versions running in Docker users’ software infrastructure, protecting users from malicious backdoors included in shared application images and other potential security threats.

However, this method is slowly being put under scrutiny as it covers only one aspect of container security, excluding whether software stacks and application portfolios are free of known, exploitable versions of open source code.

Without open source hygiene, Docker Content Trust will only ever ensure that Docker images contain the exact same bits that developers originally put there, including any vulnerabilities present in the open source components. Therefore, they only amount to a partial solution.

A more holistic approach to container security

Knowing that a container is free of vulnerabilities at the time of initial build and deployment is necessary, but far from sufficient. New vulnerabilities are constantly being discovered, and these often impact older versions of open source components. What’s needed, therefore, is an informed approach to selecting open source components, combined with ongoing vigilance as new vulnerabilities are disclosed.

Moreover, the security risk posed by a container also depends on the sensitivity of the data accessed via it, as well as the location of where the container is deployed. For example, whether the container is deployed on the internal network behind a firewall or if it’s internet-facing will affect the level of risk.

In this context, an internet-facing deployment exposes containers to a range of threats, including cross-site scripting, SQL injection and denial-of-service attacks, which containers deployed on an internal network behind a firewall wouldn’t face.

For this reason, having visibility into the code inside containers is a critical element of container security, even aside from the issue of security of the containers themselves.

It’s critical to develop robust processes for determining:

  • what open source software resides in, or is deployed along with, an application;
  • where this open source software is located in build trees and system architectures;
  • whether the code exhibits known security vulnerabilities; and
  • whether an accurate open source risk profile exists.
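The vulnerability-matching step of such a process can be automated. The sketch below uses made-up manifest data to show the core operation a scanner performs: comparing installed component versions against a feed of known-vulnerable versions. Real tools also handle version ranges, transitive dependencies and ecosystem-specific metadata; this is only the exact-match kernel of the idea.

```python
# Hypothetical data: a dependency manifest and a small known-vulnerability feed.
manifest = {"openssl": "1.0.1f", "bash": "4.4", "nginx": "1.19.0"}

known_vulns = [
    # Heartbleed affected OpenSSL 1.0.1 through 1.0.1f (fixed in 1.0.1g).
    {"package": "openssl", "versions": {"1.0.1e", "1.0.1f"}, "id": "CVE-2014-0160"},
    # Shellshock: only the vulnerable version is listed in this toy feed.
    {"package": "bash", "versions": {"4.3"}, "id": "CVE-2014-6271"},
]

def audit(manifest, known_vulns):
    """Return the vulnerability IDs whose affected versions exactly match the manifest."""
    findings = []
    for vuln in known_vulns:
        installed = manifest.get(vuln["package"])
        if installed in vuln["versions"]:
            findings.append(vuln["id"])
    return findings

print(audit(manifest, known_vulns))  # ['CVE-2014-0160']
```

Running this audit on every build, and re-running it whenever the vulnerability feed updates, addresses the “vigilance” half of the problem: an image that was clean at build time can become vulnerable the day a new CVE is published.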

Will security concerns slow container adoption? – The industry analysts’ perspective

Enterprise organisations today are embracing containers because of their proven benefits: improved application scalability, fewer deployment errors, faster time to market and simplified application management. However, just as organisations have moved over the years from viewing open source as a curiosity to understanding its business necessity, containers seem to have reached a similar tipping point. The question now seems to be shifting towards whether security concerns about containers will inhibit further adoption. Industry analysts differ in their assessment of this.

Drawing a parallel with the rapid adoption of virtualisation technologies before security requirements were established, Dave Bartoletti, Principal Analyst at Forrester Research, believes security concerns won’t significantly slow container adoption. “With virtualization, people deployed anyway, even when security and compliance hadn’t caught up yet, and I think we’ll see a lot of the same with Docker,” according to Bartoletti.

Meanwhile, Adrian Sanabria, Senior Security Analyst at 451 Research, believes enterprises will give containers a wide berth until security standards are identified and established. “The reality is that security is still a barrier today, and some companies won’t go near containers until there are certain standards in place,” he explains.

To overcome these concerns, organisations are best served to take advantage of the automated tools available to gain control over all the elements of their software infrastructure, including containers.

Ultimately, the presence of vulnerabilities in all types of software is inevitable, and open source is no exception. Detection and remediation of vulnerabilities are increasingly seen as a security imperative and a key part of a strong application security strategy.

 

Written by Bill Ledingham, EVP of Engineering and Chief Technology Officer, Black Duck Software.

EC/US have three months to find a new Safe Harbour

The European Commission (EC) and the US are under pressure to come up with a new replacement system for the recently invalidated Safe Harbour agreement.

A statement from the EU advisory body, the Article 29 Working Party on the Protection of Individuals, has given those affected by the ruling three months to devise a new system.

However, the US and the EC have previously worked for two years, without success, to reform the Safe Harbour agreement. The reforms became necessary after US government surveillance programmes were revealed by National Security Agency (NSA) whistleblower Edward Snowden, but despite the co-operation, progress stalled because the US couldn’t guarantee limits on access to personal data.

“If by the end of January 2016, no appropriate solution is found with the US authorities and depending on the assessment of the transfer tools by the Working Party, EU data protection authorities are committed to take all necessary and appropriate actions, which may include coordinated enforcement actions,” said the statement issued.

Following the Court of Justice of the European Union (CJEU) ruling on October 6th, many companies risk prosecution by European privacy regulators if they transfer the data of EU citizens to the US without a demonstrable set of privacy safeguards.

The 4,000 firms that transfer their clients’ personal data to the United States currently have no means of demonstrating compliance with EU privacy regulations. As the legal situation currently stands, EU data protection law says companies cannot transfer EU citizens’ personal data to countries outside the EU that have insufficient privacy safeguards.

EU data protection authorities, meeting in Brussels to assess the implications of the ruling, said in a statement that they would assess the impact of the judgment on other data transfer systems, such as binding corporate rules and model clauses between companies.

The regulators said in their statement the EU and the United States should negotiate an “intergovernmental agreement” providing stronger privacy guarantees to EU citizens, including oversight on government access to data and legal redress mechanisms.

Multinationals can still set up internal privacy rules for US data transfers, to be approved by regulators, but these so-called ‘binding corporate rules’ are used by only 70 companies. All alternative data transfer systems could now also be at risk of legal challenge, say lawyers. “The good news is that the European data protection authorities have agreed on a kind of grace period until the end of January,” said Monika Kuschewsky, a lawyer at Covington & Burling.

Bracket Computing wins $45 million to secure cloud with encapsulated data cells

Security start-up Bracket Computing has been awarded $45m in a Series C investment round to develop its system for making content safe on the cloud.

Bracket’s Computing Cell technology works by encapsulating content in a cell in order to secure it. The enveloped data and applications can then travel safely across multiple cloud environments, according to its inventors. The Cell technology simplifies the increasingly complex issue of cloud management by consolidating security, networking and data management into a single construct.

The cell can run across multiple public clouds and in a customer’s own data centre. The cell structure also brings consistency to the cloud, as it protects client apps from the performance changes that can occur in cloud computing.

Customers hold the digital keys to their data, which is encrypted. Bracket runs a service that reserves hardware at cloud providers when necessary and distributes the data across multiple machines to smooth performance and improve speed.

The founders, Tom Gillis and Jason Lango, have a pedigree in internet security, having created Ironport Systems’ anti-spam hardware range, which was bought by Cisco Systems in 2007 for $830 million. In 2011 they founded Bracket to solve the new security problems created by the cloud.

“Imagine if you could encapsulate your most sensitive applications, data and services and run them securely across hyperscale public clouds and your private cloud, while ensuring consistent security controls and data management,” said Lango. “This is what a Bracket Computing Cell allows. It enables an enterprise without boundaries, without sacrificing security and control.”

The funds will finance a global roll-out said Bracket CEO Tom Gillis. The data centres of the finance sector are an immediate target, but the technology applies to all large corporations, said Gillis. “Financial firms need to remain technology leaders. We’re working with some of the very largest as we define the blueprint for the data centre of the future.”

Bracket Computing Raises $45 Million

Bracket Computing, a company that strives to deliver enterprise computing driven by business needs instead of hardware limitations, has raised upwards of $45 million in a Series C funding round, bringing its total funding to an estimated $130 million-plus. The round adds two new investors, Fidelity Management and Research Company and Goldman Sachs, who join Bracket’s previous investors Allegis Capital, Andreessen Horowitz, ARTIS Ventures, Columbus Nova Technology Partners, Norwest Venture Partners, and Sutter Hill Ventures. Bracket plans to use the new capital to develop the Bracket Computing Cell and finance the company’s global expansion. The Bracket Computing Cell allows enterprise applications and data, together with their associated security, networking, and data management infrastructure, to reside in a single software construct. The Computing Cell is designed to function across a multitude of public cloud providers in addition to a company’s on-premise data center, resulting in a consistent, virtual, enterprise-grade infrastructure.

Tom Gillis, CEO and co-founder of Bracket, has commented, “Bracket is fundamentally redefining enterprise computing. Financial firms need to remain technology leaders, and we’re working with some of the very largest as we define the blueprint for the data center of the future. Our vision is to provide a secure, advanced, virtual infrastructure that spans multiple clouds, both private and public, with one consistent set of capabilities. Having investors of this quality bolsters our efforts to build this ambitious technology.”


Jason Lango, CTO and co-founder of Bracket, has added: “Imagine if you could encapsulate your most sensitive applications, data, and services and have them run securely across leading hyperscale public clouds and your private cloud, all the while ensuring consistent security controls and data management capabilities. This is what a Bracket Computing Cell allows. It enables an enterprise without boundaries, without sacrificing security and control.” Within the Computing Cell, encryption technology enhances security by creating a secure fabric that extends the user’s trust across multiple hyperscale clouds that it doesn’t necessarily control. Such an approach allows the user to span multiple public clouds while still maintaining a high level of security.

The post Bracket Computing Raises $45 Million appeared first on Cloud News Daily.

Azure Cloud Security Enhancement

One of the most anticipated user management and security features for Microsoft Azure has officially launched. According to Alex Simons, director of program management at Microsoft’s Identity Division, Azure Role-Based Access Control, or RBAC, is now generally available. RBAC has been requested by customers that have evaluated Azure as the foundation of their own enterprise cloud environments. It permits administrators to selectively grant access to both cloud services and production workloads, adding an extra level of security.

As Dushyant Gill, a Microsoft Azure Active Directory program manager, explained: “Until now, to give people the ability to manage Azure you had to give them full control of an entire Azure subscription. Now, using RBAC, you can grant people only the amount of access that they need to perform their jobs.” RBAC interfaces with Azure Active Directory (AD), Microsoft’s cloud-based identity management platform, to show users their assigned Azure resources. Once an organization extends its Active Directory to the cloud using Azure AD, employees can purchase and manage Azure subscriptions using their existing work identity, and those subscriptions automatically connect to Azure AD for single sign-on and access management.


If an Active Directory account is disabled, access to all associated Azure subscriptions is cut off, enhancing the security of the Azure platform. Role-Based Access Control can also give departments a certain level of independence while remaining compliant with the organization’s IT policies. Gill explained: “Using Azure RBAC, you can enable self-service management of cloud resources for your project teams while retaining central control over security sensitive infrastructure. For example, a common setup is to allow project teams to create and manage their own virtual machines and storage accounts, but only allow them to connect to networks managed by a central team.”
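Gill’s self-service example hinges on RBAC’s two key ideas: roles bundle permitted actions, and assignments grant a role at a scope that child resources inherit. The sketch below is a toy model of that behaviour, not Azure’s implementation; the role names echo Azure’s built-in roles, but the action names, principals and scope strings are invented for illustration.

```python
# Toy model of scoped role assignments (illustrative; not the Azure API).
ROLE_ACTIONS = {
    "Reader": {"read"},
    "Virtual Machine Contributor": {"read", "create_vm", "manage_vm"},
    "Owner": {"read", "create_vm", "manage_vm", "manage_network", "assign_roles"},
}

# Each assignment grants a role at a scope; access is inherited by child scopes.
assignments = [
    ("project-team", "Virtual Machine Contributor",
     "/subscriptions/s1/resourceGroups/project-rg"),
    ("central-it", "Owner", "/subscriptions/s1"),
]

def is_allowed(principal, action, scope):
    """Allow the action if any assignment covers this principal, action and scope."""
    for who, role, granted_scope in assignments:
        if who == principal and scope.startswith(granted_scope) \
                and action in ROLE_ACTIONS[role]:
            return True
    return False

rg = "/subscriptions/s1/resourceGroups/project-rg"
print(is_allowed("project-team", "create_vm", rg))       # True: granted in their group
print(is_allowed("project-team", "manage_network", rg))  # False: networks stay central
print(is_allowed("central-it", "manage_network", rg))    # True: inherited from subscription
```

The scope-prefix check is what lets the central team’s subscription-level grant cover every resource group beneath it, while the project team’s grant stops at its own resource group, which is exactly the division of control Gill describes.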

RBAC is currently available with a multitude of preset roles; however, “if none of the built-in RBAC roles addresses your specific access need, you will be able to create a custom RBAC role composing the exact operations to which you wish to grant access” (Gill).

The post Azure Cloud Security Enhancement appeared first on Cloud News Daily.