Risk management has been around for a long time. Financial managers run risk assessments for nearly all business models, and the term “risk” carries nearly as many definitions as there are sources on the Internet. However, for IT managers and IT professionals, risk management still frequently takes a far lower priority than other operations and support activities.
Monthly archive: January 2015
Release Management vs Release Engineering By @Plutora | @DevOpsSummit [#DevOps]
A common pattern in the past few years is the creation of centralized Release Engineering groups. These professionals apply common patterns for automating releases, they have a solid understanding of build and testing frameworks, and they help a large organization standardize software delivery. These centralized groups have become more popular as enterprises support an increasing number of applications and a faster release cadence.
In an ideal enterprise, all of your applications are packaged, delivered, and installed according to a standard, and these teams serve to apply and enforce enterprise IT policy. For example, the Release Engineering team may be responsible for a common Continuous Integration server, a standard approach to branching and tagging in source control, and the establishment of a common playbook for production releases.
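As a small illustration of what that standardization can look like in practice, a central team might publish a helper that every release pipeline calls to cut a tag, so the naming convention is enforced in code rather than in a wiki page. The sketch below is a minimal example, not any particular team’s tooling; the release-&lt;version&gt; tag pattern is an assumed convention.

```python
# Minimal sketch of a release-tagging helper a central Release Engineering
# team might standardize on. The "release-<version>" pattern is an assumed
# convention for illustration, not a published standard.
import re
import subprocess
import sys

TAG_PATTERN = re.compile(r"^release-\d+\.\d+\.\d+$")

def tag_release(version: str) -> None:
    tag = f"release-{version}"
    if not TAG_PATTERN.match(tag):
        sys.exit(f"Tag {tag!r} violates the tagging standard")
    # Annotated tags keep an audit trail of who cut the release and when.
    subprocess.run(["git", "tag", "-a", tag, "-m", f"Release {version}"], check=True)
    subprocess.run(["git", "push", "origin", tag], check=True)

if __name__ == "__main__":
    tag_release(sys.argv[1])  # e.g. python tag_release.py 1.4.2
```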
Frequency vs. size of cloud data breaches: Which is worse?
Let’s face it, 2014 was a busy year for hacks and data breaches. There were the high-profile Sony hacks, the record-breaking fines handed out as a result of ePHI (electronic Protected Health Information) healthcare data breaches in the US, and services such as Gmail and eBay were also targeted by hackers.
The potential for breaches to occur more frequently as businesses collect increasing amounts of consumer data was discussed at the CES conference earlier this year, and was highlighted as a concern by Barack Obama during his State of the Union address on January 20th.
There are a number of factors to consider when weighing up whether frequent or large data breaches are worse. After all, not all breaches are created equal.
Assessing the severity of a breach
Firstly, consider the type of breach – was it a malicious attack, or did it occur as a result of human error?
Secondly, the type of information that was breached will determine the severity. Even a small breach can cause a significant amount of damage if personal or financial information is involved.
The high-profile Sony hack was not its first major hacking incident; in fact, Sony has fallen victim to five breaches since 2013. While the most recent attack was widely reported as retaliation for the release of The Interview, and led to the early leak of five movies, it also exposed the US Social Security numbers of more than 47,000 celebrities, freelancers, and current and former Sony employees, as well as medical records, salaries, and other sensitive personal information that can be used for identity theft. Nearly all of this information was stored in Excel spreadsheets with no password protection whatsoever.
Other notable breaches in 2014 included JP Morgan Chase, which saw 83 million records breached for use in identity theft; Home Depot, which had 109 million records breached to gain access to financial information; and eBay, which had 145 million records breached, again for identity theft.
Social networking sites often fall victim to hackers too; Twitter has experienced 11 hacking incidents since 2013, although the majority of these are considered minor and in most instances affected single accounts. Most of these breaches proved to be a nuisance rather than genuinely harmful.
Snapchat experienced two breaches in 2014, and both times a significant amount of personal data was leaked online. The first saw 4.6 million usernames and phone numbers exposed; later, in October 2014, almost 98,000 stolen files were posted to The Pirate Bay. Snapchat blamed third-party applications for the breach, although it didn’t name the culprit. The hack occurred despite an August 2013 data security report warning that Snapchat’s data was vulnerable to attack.
The largest fine in HIPAA history was also handed out in 2014, following a data breach that saw the electronic PHI of 6,800 patients exposed on Google. New York Presbyterian Hospital and Columbia University Medical Center together agreed to hand over a whopping $4.8 million to settle the alleged HIPAA violations.
When determining which is worse – the frequency or the size of a breach – the simple answer is that any breach can be devastating. It is therefore essential that organisations, large and small, put security and compliance at the top of their agenda this year.
Preventing data breaches from happening
Organisations may encounter numerous attempted cyber attacks every day without even knowing it. Here is a list of steps and actions every organisation should be taking in order to avoid potential breaches:
- Perform regular risk assessments to identify where valuable data is stored, and how it is transmitted internally and externally.
- Perform vulnerability scanning on a regular basis, followed by penetration testing for the most critical assets to identify and remediate security weaknesses.
- Deploy technologies to protect against all attack vectors, ensuring security and authentication software is installed on all devices.
- Partner with a third-party team of experts if your organisation lacks the manpower and skillsets in-house to deal with cyber attacks, and to make sure those technologies are installed, optimised and working continuously.
- Use adequate authentication and encryption on all devices, especially around the storage and exchange of documents and data (a minimal encryption sketch follows this list).
- Create and regularly practice an incident response plan as part of the organisation’s business continuity planning so that if a breach occurs, the business knows what steps to take to contain it and minimise the damage.
- Educate employees about appropriate handling and protection of sensitive data. Lost, stolen and discarded devices containing critical information illustrate that corporate policy designed to safeguard portable data only works when employees follow the rules; this is especially important given the increase in the number of mobile devices used by organisations over the last few years.
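To make the encryption step above concrete, here is a minimal sketch using the Fernet recipe from the Python cryptography package to protect a document before it is stored or exchanged. The sample data and the in-memory key are assumptions for illustration; in practice the key would come from a proper key-management service.

```python
# Minimal sketch of encrypting a document at rest with the Fernet recipe
# from the Python "cryptography" package (symmetric, authenticated).
# Generating and holding the key in memory is for illustration only; key
# storage and rotation are the hard parts in practice.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # would come from a key-management service in production
cipher = Fernet(key)

plaintext = b"payroll-2014.xlsx contents"   # sample sensitive data (assumed)
token = cipher.encrypt(plaintext)           # safe to write to disk or transmit
assert cipher.decrypt(token) == plaintext   # round-trips with the same key
```

Because Fernet is authenticated encryption, any tampering with the stored token is detected at decryption time rather than silently returning garbage.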
IBM launches cloud-based technology Identity Mixer to protect personal data
IBM has announced the launch of Identity Mixer, a cloud-based technology which aims to protect personal data by only revealing the bare minimum for transactions, on its BlueMix developer platform.
The release, timed not coincidentally for Data Privacy Day, uses a cryptographic algorithm to reveal only selected pieces of a user’s information, such as date of birth, home address, or credit card number, to third parties.
An example of how this service works is a video streaming service for age-restricted films. Instead of the user submitting their full date of birth and address so the service can assess whether they are old enough and in the right region to view the film, Identity Mixer tells the streaming service only that the user is old enough and in the right region.
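Conceptually, the service answers yes/no predicates about a user’s attributes instead of disclosing the attributes themselves. The plain-Python sketch below only illustrates that interface; the attribute values are invented, and Identity Mixer itself proves such statements cryptographically with anonymous credentials rather than by trusting code that holds the raw data.

```python
# Illustration of predicate-only disclosure: the verifier learns a yes/no
# answer, never the underlying date of birth or region. Identity Mixer
# achieves this with cryptographic anonymous credentials; this stand-in
# mimics only the interface, not the security properties.
from datetime import date

USER_ATTRIBUTES = {"date_of_birth": date(1990, 5, 1), "region": "EU"}  # held by the user

def prove_old_enough(min_age: int, today: date = date(2015, 1, 28)) -> bool:
    dob = USER_ATTRIBUTES["date_of_birth"]
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= min_age

def prove_in_region(allowed: set) -> bool:
    return USER_ATTRIBUTES["region"] in allowed

# The streaming service receives only two booleans:
print(prove_old_enough(18), prove_in_region({"EU", "UK"}))
```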
The completed product is the result of over a decade of research, and is available for developers to test in their own applications and web services. Developers can choose the type of data they wish to secure, and BlueMix will provide the code.
“Identity Mixer enables users to choose precisely which data to share, and with whom,” said Christina Peters, IBM chief privacy officer. “Now web service providers can improve their risk profile and enhance trust with customers, and it’s all in the cloud making it easy for developers to program.”
This innovation is hardly surprising: IBM has won the most US patents for 21 years in a row, and that culture of invention is deeply embedded in its staff. When this publication spoke to UK&I cloud leader Doug Clark back in March, he explained that even he had a patent, despite being a “salesy” person. That said, IBM’s most recent financial results left analysts unconvinced, even though the company hit its target of $7bn in yearly cloud revenue.
Take a look at the video below which explains Identity Mixer in more detail:
Apache Drill Provides Competitive Benefits By @MapR | @CloudExpo [#BigData]
Apache Drill is an open source, schema-free SQL query engine that enables a business to extract the most value possible from data stored in Hadoop. Apache Drill does this by breaking down many of the technical barriers that restricted the ways analysts could work with data and limited the intelligence they could glean from it. With Apache Drill, data scientists are no longer constrained by the limitations of last-generation tools, and they are free to work with data in a myriad of productive ways without draining IT resources.
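For example, Drill can query a raw JSON file in place, with no schema registered up front. The short sketch below submits such a query through Drill’s REST API; the host, file path, and field names are placeholder assumptions (8047 is Drill’s default web port).

```python
# Minimal sketch: querying a raw JSON file through Apache Drill's REST API.
# No table definition or schema registration is needed; Drill infers the
# structure at read time. The file path and field names are placeholders.
import requests

payload = {
    "queryType": "SQL",
    "query": "SELECT name, sales FROM dfs.`/data/stores.json` LIMIT 10",
}
resp = requests.post("http://localhost:8047/query.json", json=payload)
resp.raise_for_status()
for row in resp.json()["rows"]:
    print(row)
```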
Amazon Is Seriously Challenged By Microsoft | @CloudExpo [#Cloud]
AWS held the premier position, having had a head start of more than four years in the cloud business. But Microsoft’s price cuts, a new CEO, and a heavier bet on the cloud have all driven enterprise adoption of Azure, helped along by better tooling, a growing variety of applications and services, and a shift away from a Windows-only focus.
Microsoft has also introduced a series of enhancements that should extend the ground it has gained. SSD-backed storage options and the rapid, multifaceted updates to its Azure cloud computing platform are all very attractive to enterprises.
Networking in 2015 – SDN and IPv6 By @Entuity | @CloudExpo [#Cloud #SDN]
After reading Sean Michael Kerner’s article in Enterprise Networking Planet earlier this month, I thought I’d add a few thoughts of my own to what he wrote. He discusses SDN and IPv6. Here’s my take on Software Defined Networking (SDN) in 2015:
Given the ongoing hype and the range of benefits its proponents extol, I expect to see continued growth in interest in and evaluation of SDN. However, several factors make wide-scale enterprise deployments unlikely in 2015: the ongoing battle between the major SDN camps (OpenFlow, Cisco ACI, VMware NSX, Juniper Contrail, etc.) and the fear of ‘backing the wrong horse’; the lack of proven, mature enterprise deployments; and the management and monitoring difficulties associated with SDN (most network management systems do not yet adequately support it). Whilst the promise of easier management and implementation of distributed security policies, access control and QoS is undeniably appealing, this lack of maturity will hold back wide-scale deployment for some time to come.
The Non-Tangibles Become Tangible By @Entuity | @CloudExpo [#Cloud]
We recently changed over the entire phone system in our London office. After four years, the equipment failed, and it was time to update. As we did so, we also changed vendors. Why?
It occurred to us that, for four years, we had put up with horrible support – we were the victims of fraud at one point, and the vendor did nothing to support us with the main telecom provider. The phone system could go down for a day before they came out to repair it. No big deal, right? What high-tech software company needs to communicate with the outside world?
So it was time to put in a new phone system. Though our original vendor offered a lower price than its competitor (note that it submitted its bid 48 hours later than promised, and while our phone system was STILL down!), we decided to pay a bit more for a vendor who seemed more eager for, and supportive of, our business.
GE’s DevOps Lifecycle By @MMurray | @DevOpsSummit [#DevOps]
The speed of product development has increased massively in the past 10 years. At the same time our formal secure development and SDL methodologies have fallen behind. This forces product developers to choose between rapid release times and security.
In his session at DevOps Summit, Michael Murray, Director of Cyber Security Consulting and Assessment at GE Healthcare, examined the problems and presented some solutions for moving security into the DevOps lifecycle to ensure that releases are both fast AND secure.
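One common shape such a solution takes – purely as an illustration, not necessarily what was presented in the session – is a security gate wired into the build pipeline, so scans run on every commit rather than as a late, manual SDL phase. The scanner command, its JSON output shape, and the severity policy below are all hypothetical.

```python
# Illustrative CI security gate: run a scanner on every build and fail
# fast on high-severity findings, so security keeps pace with release
# speed. "security-scanner", its JSON output, and the 0-10 severity
# scale are hypothetical placeholders, not a real tool's interface.
import json
import subprocess
import sys

MAX_ALLOWED_SEVERITY = 6  # block the build on "high" (7+) findings

result = subprocess.run(
    ["security-scanner", "--format", "json", "."],  # placeholder command
    capture_output=True, text=True, check=True,
)
findings = json.loads(result.stdout)
worst = max((f["severity"] for f in findings), default=0)
if worst > MAX_ALLOWED_SEVERITY:
    sys.exit(f"Build blocked: finding with severity {worst} exceeds policy")
print(f"Security gate passed ({len(findings)} low-severity findings)")
```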
Avoid Application Outages and Stalls By @HannahHaleyC | @DevOpsSummit [#DevOps]
Last week I was able to catch up with Chris Tranter, Technical Lead at Hallmark UK.
Hallmark’s engineering team runs on tight deadlines and, like all popular websites, places a huge importance on customer experience. As you can imagine, their site experiences unusually heavy and inconsistent load, especially through the holiday season. Ensuring a smooth, seamless experience without stalls, outages, or crashes is extremely important.
It was great to see how AppDynamics APM can become an essential part of a developer’s toolkit.