Bring Your Own Encryption: The case for standards

BYOE is the new black

Being free to choose the most suitable encryption for your business seems like a good idea. But it will only work in a context of recognised standards across encryption systems and providers’ security platforms. Since the start of the 21st century, security has emerged from scare-story status to become one of IT users’ biggest issues – as survey after survey confirms. Along the way a number of uncomfortable lessons are still being learned.

The first lesson is that security technology must always be considered in a human context. No one still believes in a technological fix that will put an end to all security problems, because time and again we hear news of new types of cyber attack that bypass sophisticated and secure technology by targeting human nature – from alarming e-mails ostensibly from official sources, to friendly social invitations to share a funny download; from a harmless-looking USB stick ‘accidentally’ dropped by the office entrance, to the fake policeman demanding a few personal details to verify that you are not criminally liable.

And that explains the article’s heading: a balance must be struck between achieving the desired level of protection and keeping all protection procedures quick and simple. Every minute spent making things secure is a minute lost to productivity – so the heading could equally have said “balancing security with efficiency”.

The second lesson still being learned is never to trust instinct fully in security matters. It is instinctive to obey instructions that appear to come from an authoritative source, or to respond in an open, friendly manner to a friendly approach – and those are just the sorts of instincts that IT scams exploit. Instincts can open us to attack, and they can also evoke inappropriate caution.

In the first years of major cloud uptake there was the oft-repeated advice to business that the sensible course would be to use public cloud services to simplify mundane operations, but that critical or high priority data should not be trusted to a public cloud service but kept under control in a private cloud. Instinctively this made sense: you should not allow your secrets to float about in a cloud where you have no idea where they are stored or who is in charge of them.

The irony is that the cloud – being so obviously vulnerable and inviting to attackers – is constantly being reinforced with the most sophisticated security measures: so data in the cloud is probably far better protected than any SME could afford to secure its own data internally. It is like air travel: because flying is instinctively scary, so much has been spent to make it safe that you are less likely to die on a flight than you are driving the same journey in the “safety” of your own car. The biggest risk in air travel is in the journey to the airport, just as the biggest risk in cloud computing lies in the data’s passage to the cloud – hence the importance of a secure line to a cloud service.

So let us look at encryption in the light of those two lessons. Instinctively it makes sense to keep full control of your own encryption and keys, rather than let them get into any stranger’s hands – so how far do we trust that instinct, bearing in mind the need also to balance security against efficiency?

BYOK

Hot on the heels of BYOD – or “Bring Your Own Device” to the workplace – comes the acronym for Bring Your Own Key (BYOK).

The idea of encryption is as old as the concept of written language: if a message might fall into enemy hands, then it is important to ensure that they will not be able to read it. We have recently been told that US forces used Native American communicators in WW2 because the chances of anyone in Japan understanding their language were near zero. More typically, encryption relies on some sort of “key” to unlock and make sense of the message it contains, and that transfers the problem of security to a new level: now that the message is secure, the focus shifts to protecting the key.

In the case of access to cloud services: if we are encrypting data because we are worried about its security in an unknown cloud, why then should we trust the same cloud to hold the encryption keys?

Microsoft, for instance, recently announced a solution to this dilemma using HSMs (Hardware Security Modules) within its Windows Azure cloud: an enterprise customer uses its own internal HSM to produce a master key, which is then transmitted to the HSM within the Windows Azure cloud. This provides secure encryption in the cloud, but it also means that not even Microsoft itself can read the data, because it does not hold the master key hidden in the enterprise HSM.
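
At its heart this is a key-wrapping (envelope encryption) arrangement: a customer-held master key protects the data keys that actually encrypt content in the cloud. The sketch below, in plain Java using the standard javax.crypto API, illustrates only that underlying idea – it is not Microsoft’s HSM or Azure interface, and the in-memory AES master key merely stands in for a key that would in practice never leave the customer’s HSM.

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.util.Base64;

// Minimal sketch of the key-wrapping idea behind BYOK (not Microsoft's API):
// a customer-held master key (here an in-memory AES key standing in for an
// HSM-protected key) wraps the data key that the cloud service actually uses.
public class ByokSketch {
    public static void main(String[] args) throws Exception {
        // 1. Customer side: the master key, which in practice never leaves the HSM.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey masterKey = kg.generateKey();

        // 2. A data key used to encrypt the actual content in the cloud.
        SecretKey dataKey = kg.generateKey();

        // 3. Wrap the data key under the master key; only the wrapped form
        //    is ever stored alongside the encrypted data.
        Cipher wrapper = Cipher.getInstance("AESWrap");
        wrapper.init(Cipher.WRAP_MODE, masterKey);
        byte[] wrappedDataKey = wrapper.wrap(dataKey);
        System.out.println("Wrapped data key: "
                + Base64.getEncoder().encodeToString(wrappedDataKey));

        // 4. Unwrapping requires the master key again; without it the
        //    provider holds only opaque ciphertext.
        Cipher unwrapper = Cipher.getInstance("AESWrap");
        unwrapper.init(Cipher.UNWRAP_MODE, masterKey);
        SecretKey recovered = (SecretKey) unwrapper.unwrap(
                wrappedDataKey, "AES", Cipher.SECRET_KEY);
        System.out.println("Recovered key matches: "
                + java.util.Arrays.equals(recovered.getEncoded(), dataKey.getEncoded()));
    }
}

The provider stores only the wrapped data key alongside the ciphertext; without the master key it holds nothing it can decrypt.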

It is not so much that the enterprise cannot trust Microsoft to protect its data from attack; it is more to do with growing legal complexities. In the wake of the Snowden revelations, it has become clear that even the most well-protected data might be at risk from a government or legal subpoena demanding that its content be revealed. Under this BYOK system, however, Microsoft cannot be forced to reveal the enterprise’s secrets because it cannot access them itself, and the responsibility lies solely with the owner.

This is increasingly important because of other legal pressures that insist on restricting access to certain types of data. A government can, for example, forbid anyone from allowing data of national importance to leave the country – not a simple matter in a globally connected IP network. There are also increasing legal pressures on holders of personal data to guarantee levels of privacy.

Instinctively it feels a lot more secure to manage your own key and use BYOK instead of leaving it to the cloud provider. As long as that instinct is backed by a suitably strict in-house, HSM-based security policy, it can be trusted.

BYOE

BYOK makes the best of the cloud provider’s encryption offering, by giving the customer ultimate control over its key. But is the customer happy with the encryption provided?

Bearing in mind that balance between security and efficiency, you might prefer a higher level of encryption than that used by the cloud provider’s security system, or you might find the encryption mechanism is adding latency or inconvenience and would rather accept lighter encryption in exchange for greater nimbleness. In this case you could go a step further and employ your own encryption algorithms or processes. Welcome to the domain of BYOE (Bring Your Own Encryption).

Again, we must balance security against efficiency. Take the example of an enterprise using the cloud for deep mining of its sensitive customer data. This requires so much computing power that only a cloud provider can do the job, and that means trusting private data to be processed in a cloud service. This could infringe regulations unless the data is protected by suitable encryption. But how can the data be processed if the provider cannot read it?

Taking the WW2 example above: if a Japanese wireless operator was asked to edit the Native American message so a shortened version could be sent to HQ for cryptanalysis, any attempt to edit an unknown language would create gobbledygook, because translation is not a “homomorphic mapping”.

Homomorphic encryption means that certain processes can be performed on the encrypted data, with the same effect as performing them on the source data, without any need to decrypt it. This usually implies arithmetical processes: the data mining software can do its mining on the data file while it remains encrypted, and the output, when decrypted, will be the same as if the data had been processed without any intervening encryption.

It is like operating one of those automatic coffee vendors that grinds the beans, heats the water and adds milk and sugar according to which button was pressed: you do not know what type of coffee bean is used, whether the water is tap, filtered or spring, or whether the milk is whole cream, skimmed or soya. All you know is that what comes out will be a cappuccino with no sugar. In the data mining example: what comes out might be a neat spreadsheet summary of customers’ average buying habits based on millions of past transactions, without a single personal transaction detail being visible to the cloud provider.
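
To make the arithmetic concrete, here is a toy illustration of a homomorphic property using unpadded (“textbook”) RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields the encryption of the product of the plaintexts. This is only a sketch of the principle – real homomorphic schemes such as Paillier or fully homomorphic encryption are far richer, and the tiny key below is purely illustrative.

import java.math.BigInteger;

// Toy illustration of a homomorphic property, using unpadded ("textbook") RSA,
// which is multiplicatively homomorphic: E(a) * E(b) mod n = E(a * b mod n).
// Not a usable scheme in practice; the tiny key is purely for illustration.
public class HomomorphicDemo {
    public static void main(String[] args) {
        BigInteger n = BigInteger.valueOf(3233);   // n = 61 * 53
        BigInteger e = BigInteger.valueOf(17);     // public exponent
        BigInteger d = BigInteger.valueOf(2753);   // private exponent

        BigInteger a = BigInteger.valueOf(7);
        BigInteger b = BigInteger.valueOf(3);

        BigInteger encA = a.modPow(e, n);          // E(a)
        BigInteger encB = b.modPow(e, n);          // E(b)

        // The "cloud" multiplies the ciphertexts without ever seeing a or b.
        BigInteger encProduct = encA.multiply(encB).mod(n);

        // Only the key holder decrypts, recovering a * b = 21.
        BigInteger product = encProduct.modPow(d, n);
        System.out.println("Decrypted product: " + product); // prints 21
    }
}

The point is that the party doing the multiplication never sees the values 7 and 3; only the key holder who decrypts the result learns that the answer is 21.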

The problem with the cloud provider allowing users to choose their own encryption is that the provider’s security platform has to be able to support the chosen encryption system. As an interim measure, the provider might offer a choice from a range of encryption offerings that have been tested for compatibility with the cloud offering, but that still requires trusting another party’s choice of encryption algorithms. A fully homomorphic offering might be vital for one operation, but a waste of money and effort for a whole lot of other processes.

The call for standards

So what is needed for BYOE to become a practical solution is a globally standard cloud security platform with which any encryption offering can be registered for support. The customer chooses a cloud offering for its services and for its certified “XYZ standard” security platform, then goes shopping for an “XYZ certified” encryption system that matches its particular balance between security and practicality.

Just as in the BYOD revolution, this decision need not be made at an enterprise level, or even by the IT department. BYOE, if sufficiently standardised, could become the responsibility of the department, team or individual user: just as you can bring your own device to the office, you could ultimately take personal responsibility for your own data security.

What if you prefer to use your very own implementation of your own encryption algorithms? All the more reason to want a standard interface! This approach is not so new for those of us who remember the Java J2EE Crypto library – as long as we complied with the published interfaces, anyone could use their own crypto functions. This “the network is the computer” ideology becomes all the more relevant in the cloud age. As the computer industry has learned over the past 40 years, commonly accepted standards and architecture (for example the von Neumann model or J2EE Crypto) play a key role in enabling progress.
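
That pluggable model survives today as the Java Cryptography Architecture: application code asks for an algorithm by name through a standard interface, and any registered provider – including one you write yourself – can supply the implementation. A minimal sketch using only standard JDK classes (the MyProvider mentioned in the final comment is hypothetical):

import javax.crypto.Cipher;
import java.security.Provider;
import java.security.Security;

// Sketch of the "standard interface" idea: the application requests an
// algorithm by name, and any provider registered with the JVM may supply it.
public class StandardInterfaceSketch {
    public static void main(String[] args) throws Exception {
        // List the providers currently registered on this JVM.
        for (Provider p : Security.getProviders()) {
            System.out.println(p.getName() + ": " + p.getInfo());
        }

        // Application code depends only on the transformation name; it neither
        // knows nor cares which registered provider implements it.
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        System.out.println("AES/GCM supplied by: " + cipher.getProvider().getName());

        // A custom or third-party implementation could be registered with
        // Security.addProvider(new MyProvider()) without changing this code.
    }
}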

BYOE could prove every bit as disruptive as BYOD – unless the industry can ensure that users choose their encryption from a set of globally sanctioned and standardised encryption systems or processes. If business is to reap the full benefits promised by cloud services, it must have the foundation of such an open cloud environment.

Written by Dr. Hongwen Zhang, chair security working group, CloudEthernet Forum.

Nearly half of Brits find wearables in the enterprise intrusive – study

How will wearables impact privacy in the enterprise?

A recently published study by UK mobile app developer Apadmi suggests UK workers are deeply concerned about the privacy implications of wearable IP-connected technology in the workplace.

The study, which surveyed 500 adults living and working in the UK, found that 42 per cent of people in the UK thought that wearable technology posed a risk to their privacy, with only 18 per cent of respondents saying they didn’t feel it was a danger.

But there seemed to be a significant portion of respondents (40 per cent) that did not know whether wearable tech would pose a threat to their privacy.

“It’s obvious from our investigations that privacy is a very real issue for the wearable technology industry, although it’s by no means insurmountable,” said Nick Black, co-founder and director at Apadmi.

“A lot of commentators are flagging up the potential privacy implications of devices that can record and relay so much data about an individual. And consumers appear to be taking note, with quite a few admitting that these concerns weigh on their mind when considering whether or not to buy wearable technology.”

Wearables have started to gain favour with some larger enterprises in the US and UK, particularly when it comes to tracking health and fitness. Some private health insurers for instance monitor fitness data as a way to incentivise fitness activity, which reduces the risk of health issues and can lead to lower premiums.

But opinion on the privacy implications of mandating wearables in the workplace seems to be quite strong. When asked how they would feel if their employer required them to use wearable technology as part of their role, 25 per cent of respondents said they would consider changing jobs, and a further 24 per cent replied they would be happy to do so.

“We also need to draw attention to the fact that a huge number of people still don’t have a firm grasp of how wearable technology might impact upon privacy in the first place, as demonstrated by the significant number of ‘don’t know’ respondents in our survey. People are naturally apprehensive about what they don’t understand. But it’s interesting that those who go on to purchase a device are overwhelmingly happy with their decision and the benefits it has brought to their lives,” Black explained.

“With this in mind, wearable tech businesses and app developers need to educate prospective customers around privacy concerns to alleviate these fears. Many people still don’t fully understand the privacy issues around wearable technology or appreciate its potential to dramatically improve lives in areas such as health and social care.”

Despite the potential privacy implications, many believe use of wearables in the enterprise will rapidly increase over the next few years. Salesforce for instance claims use of wearables in the enterprise will more than triple in the next two years, with smartwatches emerging as a popular candidate to deliver sales and customer service improvements.

The company’s own survey of over 1,400 working adults shows 79 per cent of adopters agree wearables will be strategic to their company’s future success; 76 per cent report improvements in business performance since deploying wearables in the enterprise; and 86 per cent of adopters’ organisations plan to increase their wearables spend over the next 12 months.

EMC World 2015: Event Recap

After EMC World 2015, I’m languishing in airports today in post-conference burnout – an ideal time to deliver a report on the news, announcements and my prognostications on what this means to our business.

The big announcements were delivered in General Sessions on Monday (EMC Information Infrastructure & VCE) and on Tuesday (Federation: VMware & Pivotal). The Federation announcements are more developer and futures oriented, although important strategically, so I’ll pass on that for now.

EMC and VCE have updated their converged and Hyperconverged products pretty dramatically. Yes, VSPEX Blue is Hyperconverged, however unfortunate the name is in linking an EVO:RAIL solution to a reference architecture solution.

The products can be aligned as:

  1. Block
  2. Rack
  3. Appliances

The VCE Vblock product line adheres to its core value proposition closely.

  1. Time from order to completely deployed on the data center floor in 45 days. (GreenPages will provide the Deploy & Implementation services. We have three D&I engineers on staff now.)
  2. Cross-component unified upgrade through a Release Candidate Matrix – every single bit of hardware is tested in major and minor upgrades to ensure compatibility: storage, switch, blade, add-ons (RecoverPoint, Avamar, VPLEX).
  3. Unified support – one call to VCE, not to all the vendors in the build

However, VCE is adding options and variety to make the product less monolithic.

  1. VXblock – this is the XtremIO version, intended for large VDI or mission critical transactional deployments (trading, insurance, national healthcare claims processing). The Beast is a Vblock of eight 40 TB Xbrick nodes, 320 TB before dedupe and compression, or nearly 2 PB with realistic data reduction. Yes, that is Two Petabytes of All Flash Array. Remote replication is now totally supported with RecoverPoint.
  2. VXRack – this is a Vblock without an array, but it isn’t VSAN either. It is… ScaleIO, a software storage solution that pools server storage into a shared pool. The minimum configuration is 100 compute nodes, which can be dense performance (4-node form factor in a 2U chassis) or capacity. The nodes can be bare metal or run a hypervisor of any sort. This can scale to 328 Petabytes. Yes, Petabytes. This is web-scale, but they call it “Rack Scale” computing (first generation). More on that later…
  3. Vscale – Networking! This is Leaf and Spine networking in a rack to tie a VXrack or Vblock deployment together, at scale. “One Ring to Rule Them All”. This is big, literally. Imagine ordering a petabyte installation of VXblock, VXrack and Vscale, and rolling it onto the floor in less than two months.

So, that is Block and Rack. What about Appliance?

Enter VSPEX Blue, the EMC implementation of EVO:RAIL. This has definite value in…

  • Pricing
  • Unified management & support
  • The “app store” with
    • integrated backup (VDPA)
    • replication (vRPA)
    • Cloud Array integration (TwinStrata lives!), a virtual iSCSI controller that will present cloud storage to the system as a backup target or a capacity tier.

This post from Mike Colson provides a good explanation.

Future apps will include virus scanning, links to Public IaaS and others.

I set one up in the lab in 15 minutes, as advertised, although I had to wait for the configuration wizard to churn away after I initialized it and input all the networking. Professional Services will be required, as EMC is requiring PS to implement. Our team is and will be prepared to deploy this. We can discuss how this compares to other Hyperconverged appliances. Contact us for more information.

There are other announcements, some in sheer scale and some in desirable new features.

Data Domain Beast: DD9500, 58.7 TB/hr. and 1.7 PB of capacity. This is rated at 1.5x the performance and 4x the scalability of the nearest competitor.

VPLEX News: The VPLEX Witness can now be deployed in the public Cloud (naturally EMC recommends the EMC Hybrid Cloud or vCloud Air). The Witness has to be outside the fault domains of any protected site, so where better than the Cloud? It is a very lightweight VM.

CloudArray (TwinStrata’s Cloud Array Controller) is integrated with VPLEX. You can have a distributed volume spanning on premise and cloud storage. I’m still trying to grasp the significance of this. The local cache for the CloudArray controller can be very fast, so this isn’t limited to low latency applications. The things you could do…

VPLEX is now available in a Virtual Edition (VPLEX/VE). This will obviously come with some caveats and restrictions, but this also is a fantastic new option for smaller organizations looking for the high availability that VPLEX provides, as well as data mobility and federation of workloads across metro distances.

VVOL: Chuck Hollis (@chuckhollis) led an entertaining and informative ‘Birds of a Feather’ session for VVOLs. Takeaway – this is NOT commonly deployed yet. Only a handful of people have even set it up, and mostly for test. This was in a room with at least 150 people, so high interest, but low deployment. Everyone sees the potential and is looking forward to real world policy based deployments on industry standard storage. This is an emerging technology that will be watched closely.

VNX/VNXe: I didn’t see or hear many striking features or upgrades in this product line, but an all flash VNXe was trumpeted. I’ll be looking at the performance and design specifications of this more closely to see how it might fit targeted use cases or general purpose storage for SMB and commercial level customers. There is talk around the virtualization of the VNX array, as well as Isilon, so pretty soon nearly every controller or device in the EMC portfolio will be available as a virtual appliance. This leads me to…

ViPR Controller and ViPR SRM: Software Defined Storage

ViPR Controller is definitely a real product with real usefulness. This is the automation and provisioning tool for a wide variety of infrastructure elements, allowing for creation of virtual arrays with policy based provisioning, leveraging every data service imaginable: dedupe, replication, snapshots, file services, block services and so on.

ViPR SRM is the capacity reporting and monitoring tool that provides the management of capacity that is needed in an SDS environment. This is a much improved product with a very nice GUI and more intuitive approach to counters and metrics.

I’d recommend a Storage Transformation Workshop for people interested in exploring how SDS can change the way (and the cost at which) you manage your information infrastructure.

More on EVO:RAIL/VSPEX Blue

I met with Mike McDonough, the mastermind behind EVO:RAIL. He is indeed a mastermind. The story of the rise of EVO:RAIL as a separate business unit is interesting enough (300 business cases submitted, 3 approved, and he won’t say what the other mystery products are), but the implementation, strategy and vision are what matter to us. The big factor here was boiling down the support cases to identify the 370 most common reasons for support calls, all around configuration, management and hardware. The first version of EVO:RAIL addressed 240 of those issues. Think of this as a safety rail around a vSphere appliance that prevents these common and easily avoidable issues without restricting flexibility too much. The next version will most likely incorporate NSX; security and inspection are the emphases for the next iteration.

Partners and distributors were chosen carefully. GreenPages is one of only 9 national partners chosen for this, based on our long history as a strategic partner and our thought leadership! The tightly controlled hardware compatibility list is a strength, as future regression tests for software and other upgrades will keep the permutations to a minimum. (By the way, the EMC server platform is Intel – for VxRack, VSPEX Blue and, I think, for all of their compute modules across all their products.)

The competitive implication is that appliance vendors buying white-box hardware on commodity contracts – which allow flexibility in drives, memory and CPU – will have an exponentially more difficult task maintaining the increasing permutations of hardware versions over time.

Final Blue Sky note:

Rack Scale is an Intel initiative that promises an interesting future of increased hardware awareness for hypervisors, but it is a very forward-looking project. Read Scott Lowe’s thoughts on this.

 

As always, contact us for more details and in-depth conversations about how we can help you build the data center of the future, today.

 

By Randy Weis, Practice Manager, Information Infrastructure

Microsoft to improve transparency, control over cloud data

Microsoft wants to improve the security of its offerings

Microsoft has announced a series of measures to give customers more control over their cloud-based data, a move it claims will improve transparency around how data is treated as well as the security of that data.

The company announced enhanced activity logs of user, admin and policy-related actions, which customers and partners can tap into through a new Office 365 Management Activity API to use for compliance and security reporting.

Microsoft said by the end of this year it plans to introduce a Customer Lockbox for Office 365, which will give Office users the ability to approve or reject a Microsoft engineer’s request to log into the Office 365 service.

“Over the past few years, we have seen the security environment change and evolve. Cyber threats are reaching new levels, involving the destruction of property, and governments now act both as protectors and exploiters of technology. In this changing environment, two themes have emerged when I talk with our customers – 1) they want more transparency from their providers and more control of their data, and 2) they are looking for companies to protect their data through leading edge security features,” explained Scott Charney, corporate vice president, trustworthy computing at Microsoft.

“In addition to greater control of their data, companies also need their technology to adhere to the compliance standards for the industries and geographic markets in which they operate.”

The company is also upping its game on security and encryption. Office 365 already encrypts data in transit, but in the coming months Charney said the company plans to introduce content-level encryption, and by 2016 plans to enable the ability for customers to require Microsoft to use customer-generated and customer-controlled encryption keys to encrypt their content at rest.

It also plans to bolster network security through Azure-focused partnerships with the likes of Barracuda, Check Point, Fortinet, Websense, Palo Alto Networks, F5 and Alert Logic, and broaden the security capabilities of its enterprise mobility management suite.

Microsoft has over the past couple of years evolved into a strong proponent of and active participant in discussions around data security and data protection, including legislative change affecting these areas in the US. It is also among a number of US cloud providers convinced that many enterprises still lack trust in the cloud from a security standpoint, hampering providers’ ability to make inroads into the cloud market – which gives Microsoft an added incentive to double down on securing its own offerings.

Cisco, Elastica join forces on cloud security monitoring

Cisco will resell Elastica’s cloud service monitoring technology

Networking giant Cisco is teaming up with Elastica, a cloud security startup, in a move that will see the two firms combine their threat intelligence and cloud service monitoring technologies.

The partnership will also see Cisco resell Elastica’s cloud application security and monitoring solution (CloudSOC) to its customers.

“The combination of Cisco’s threat-centric security portfolio and Elastica’s innovation in cloud application security provides a unique opportunity. Our global customers gain additional levels of visibility and control for cloud applications and it enhances our portfolio of advanced cloud-delivered security offerings,” said Scott Harrell, vice president of product management, Cisco Security Business Group.

“We are excited to partner with Elastica to deliver an even richer portfolio of on–premises and cloud application security to protect businesses across the attack continuum – before, during, and after an attack,” Harrell said.

The move is a big win for Elastica, a startup that emerged from stealth in early 2014 and just last month secured $30m in funding. Cisco will provide the security startup with a large and varied channel that spans both the enterprise and scale-out markets, while Elastica helps Cisco plug a gap in its burgeoning cloud-centric portfolio (that said, it’s possible the move is a precursor to an acquisition).

“CIOs want to empower employees with advanced cloud apps that help enterprises stay agile, productive and competitive in the marketplace. The power of these cloud apps – information sharing and built-in collaboration capabilities – also require a completely new approach to security,” said Rehan Jalil, president and chief executive of Elastica.

“Elastica’s cloud app security technology, together with Cisco’s broad security portfolio and footprint, will help us catalyze the safe and compliant use of cloud apps so that our customers can continue to securely make their businesses more agile and productive,” Jalil said.

IBM makes cyber threat data available as a cloud security service

IBM is launching a cybersecurity cloud service

IBM has unveiled a cloud-based cybersecurity service which includes hundreds of terabytes of raw aggregated threat intelligence data, which can be expanded upon by users that sign up to use the service.

At about 700TB, IBM’s X-Force Exchange service is being pitched by the firm as one of the largest and most complete catalogues of cybersecurity vulnerability data in the world.

The threat information is based on over 25 billion web pages and images collected from a network of over 270 million endpoints, and will also include real-time data provided by others on the service (so effectively, the more people join, the more robust the service gets).

“The IBM X-Force Exchange platform will foster collaboration on a scale necessary to counter the rapidly rising and sophisticated threats that companies are facing from cybercriminals,” said Brendan Hannigan, general manager, IBM Security.

“We’re taking the lead by opening up our own deep and global network of cyberthreat research, customers, technologies and experts. By inviting the industry to join our efforts and share their own intelligence, we’re aiming to accelerate the formation of the networks and relationships we need to fight hackers,” Hannigan said.

Last year IBM made a number of acquisitions to bolster endpoint and cloud security (CrossIdeas, Lighthouse), and adding cyber threat detection to the mix creates a nicely rounded security portfolio. But the move also puts it in direct competition with a wide range of managed security service providers that have been playing in this space for years and going after the same verticals (oil & gas, financial services, retail, media, etc.), so it will be interesting to see how IBM differentiates itself.

Cloud security vendor Adallom secures $30m in series C led by HP

Adallom secured $30m in new funding this week from HP Ventures among others

Cloud security service provider Adallom announced this week it has secured $30m in a series C funding round led by Hewlett Packard Ventures, which the company said it would put towards research and development.

Adallom, which was founded by cybersecurity veterans Assaf Rappaport, Ami Luttwak and Roy Reznik in 2012, offers a security service that integrates with the authentication chain of a range of SaaS applications and lets IT administrators monitor usage for every user on each device.

The software works in conjunction with endpoint and network security solutions and has a built-in, self-learning engine that analyses user activity on SaaS applications and assesses the riskiness of each transaction in real time, alerting administrators when activity becomes too risky for an organisation given its security policies.

The company said the latest funding round, which brings the total amount secured by the firm since its founding three years ago to just under $50m, speaks to the rapid growth of the SaaS market, and the need for more flexible security solutions.

“The market’s embrace of our approach to cloud security and our investors’ continued confidence in our products, team and results to date is a strong endorsement of Adallom. It also serves as encouragement to continue to execute on our mission to deliver the best platform for protecting data in the cloud,” said Rappaport, Adallom’s chief executive. “We’re determined to exceed the expectations of our customers and investors, and continue our innovation in this market.”

The company said the investment will be used to double down on development and improve support for more services; it claims the security service already supports over 13,000 cloud apps.

Adallom’s funding round caps off a successful month for a number of cloud security vendors, with Palerra, ProtectWise and Elastica all securing millions in investment.

Cloud security vendor Palerra scores $17m

Palerra is among a number of cloud security startups combining predictive analytics and machine learning algorithms to bolster cloud security

Cloud security vendor Palerra has secured $17m in series B funding, a move the company said would help accelerate sales and marketing efforts around its predictive analytics and threat detection services.

Palerra’s flagship service, Loric, combines threat detection and predictive analytics in order to provide automatic incident response and remediation for malicious traffic flowing to a range of cloud services and platforms.

Over the past few years we’ve seen a flurry of cloud security startups emerge, all deploying analytics and machine learning algorithms to detect perceived and actual threats and respond in real time, so it would seem enterprises are becoming spoilt for choice.

The $17m round was led by August Capital, with participation from current investors Norwest Venture Partners (NVP), Wing Venture Capital and Engineering Capital, and brings the total amount secured by the firm to $25m.

The funds will be used to bolster sales and marketing efforts at the firm.

“The dramatic rise in adoption of cloud services by today’s enterprises against the backdrop of our generation’s most potent cyber threats has necessitated a new approach. LORIC was designed to meet these threats head on and this new round underscores our commitment to deliver the most powerful cloud security solution in the industry,” said Rohit Gupta, founder and chief executive officer of Palerra.

“As the perimeter disintegrates into a set of federated cloud-based and on-premises infrastructures, effective monitoring becomes almost impossible, unless security controls are embedded in these heterogeneous environments. This will require enterprises to reconsider and possibly redesign their security architecture and corresponding security controls by placing those controls in the cloud,” Gupta added.

Singtel buys Trustwave in managed security play

Singtel has acquired Trustwave, a cloud and managed security services provider

Singtel is to acquire IT security firm Trustwave in a move that will see the latter operate as the cybersecurity division of the Singaporean telecoms incumbent.

The deal will see Singtel acquire a 98 per cent stake in the American security services firm, which has an $850m equity value. Singtel said it paid around $810m for the company.

Following the acquisition more than 1,200 Trustwave employees will join Singtel to form a standalone cybersecurity services business unit.

Trustwave said it had three million business subscribers pre-acquisition and five security operations centres (in the US and Poland).

In canned remarks Trustwave chairman, chief executive and president Robert McCullen said: “This strategic partnership creates an unparalleled opportunity to combine Singtel’s robust information and communications solutions with Trustwave’s industry-leading security technologies and managed services platform to deliver cutting-edge solutions that will enhance our customer experience.”

“Singtel is the perfect partner for us as we continue to help businesses fight cybercrime, protect data and reduce security risk, and the Trustwave team is thrilled to become a part of such a prestigious and innovative organization,” McCullen said.

Singtel said the move will allow it to build a stronger presence in the American and European cloud services markets, complementing the enterprise IT assets it already leverages in the Asia Pacific region.

Chua Sock Koong, Singtel Group chief executive, said: “We aspire to be a global player in cyber security. We have established a strong security business in the region, both organically and through strategic partnerships with global technology leaders.”

“Our extensive customer reach and strong suite of ICT services, together with Trustwave’s deep cyber security capabilities, will create a powerful combination and allow Singtel to capture global opportunities in the cyber security space,” Chua said.

The acquisition will see Singtel move into an area that seems to be constantly on the up: cyberattacks like DDoS and man-in-the-middle are becoming more frequent and cheaper to procure on the black market according to nearly every report out there, and other IT-focused telcos (e.g. Verizon) are moving to broaden their enterprise services to include cloud security and managed security services. According to Gartner the managed security industry is estimated to generate approximately $24bn by 2018, up almost 75 per cent from $14bn in 2014.

Salesforce buys mobile authentication startup

MFA is becoming more prominent among enterprises

Salesforce has acquired Toopher, a Texas-based mobile authentication startup, for an undisclosed sum.

The company, which offers multifactor authentication (MFA) for mobile platforms, was acquired by the CRM giant less than a month after it secured $200k in new investment.

“Today it is with great excitement that we can unveil our ability to super-charge our superpower—because we are being acquired by Salesforce,” the company’s founders Josh Alexander and Evan Grim wrote in a statement on the Toopher website.

“While we will no longer sell our current products, we are thrilled to join Salesforce, where we’ll work on delivering the Toopher vision on a much larger scale as part of the world’s #1 Cloud Platform. We can’t imagine a better team, technology and set of values with which to align.”

Toopher said it will continue to support existing customers.

With the acquisition Salesforce joins a number of enterprise IT vendors, including Microsoft, Ping Identity and RSA, which have over the past few years moved to acquire MFA vendors in order to bolster the security posture of their offerings.

Given the rise in MFA adoption among enterprises (a recent SafeNet survey suggests 37 per cent of organisations used MFA in 2014, up from 30 per cent the previous year), the performance improvements associated with tight technical integration between MFA and the services it protects, and the fact that enterprises are becoming more and more mobile, it’s not surprising to see some vendors swoop in to acquire the technology outright.