How Santander embraced cloud storage for greater productivity and data security

(c)iStock.com/tupungato

If you were to think of an industry that would be naturally reluctant to embrace technologies such as cloud and mobility for reasons of sensitive data privacy, which one would you think of first? The two most common answers are usually healthcare and banking – so it is interesting to note how Santander has gone all in on enterprise file sync and share (EFSS) with cloud storage provider CTERA Networks.

According to the bank, the opportunity to move organisational capabilities to the cloud arose as far back as 2011 as a way to increase productivity – yet there was a huge task ahead in finding a solution that would pass all of Santander’s security tests. Not unreasonably, the proposed EFSS solution would need to prove ‘dramatically’ more cost effective, be hardware-agnostic, and meet high security standards when deployed in Santander data centres, including multi-user encryption keys, data redundancy, and file versioning. The solution has since been rolled out to more than 60,000 employees across Spain and Latin America.

The benefits, according to Pablo Ruiz Correa, head of open innovation at Santander, are tangible. “We realised the need to provide a solution for simple, anytime anywhere file access and collaboration, but we also needed to meet requirements for total data privacy and security,” he explained, adding: “CTERA has allowed us to transform how users sync, collaborate on and protect their files without requiring any of the security compromises that are common to public cloud SaaS offerings.”

For CTERA, it represents an impressive customer story, and one which fits its enterprise-secure cloud storage philosophy. Research carried out by the vendor in September last year found that more than three quarters of survey respondents were at least somewhat concerned about consumer-grade file sync and share solutions. The company’s view has been consistent: even though the likes of Dropbox and Box – the latter traditionally an enterprise player anyway – have been beefing up their security, it is still not enough against vendors who know the B2B game inside out, a view shared by fellow enterprise storage firm Egnyte.

Speaking to CloudTech in July last year Rani Osnat, CTERA VP strategic marketing, explained: “When you talk to larger enterprises, especially the ones that are in regulated industries like banking or insurance, or healthcare, they definitely have a strong preference for private cloud solutions. They have the know-how, and they have strict security and compliance requirements that are not entirely satisfied by solutions that utilise a public cloud infrastructure.”

An interesting comparison of how banks are moving forward with technology was offered by Spain’s market leader, CaixaBank. Pere Nebot, the bank’s CIO, told delegates at this year’s Mobile World Congress that 80% of all contracts signed are now digital, with only 8% of transactions undertaken in a physical branch.

It is clearly a case of keeping up with the Joneses, and banks would do well not to be left behind. “Santander is now not only more productive and agile, but we’ve managed to increase our data security, control and cost savings in our journey to the cloud,” added Correa.

You can find the full case study here.

Infoblox bolsters off-premise security capabilities

Infoblox has released its DNS Firewall as a service, extending its services to roaming devices off-premise, which will be available towards the end of 2016.

The new service will offer protection to customers roaming outside the corporate perimeter, as well as within, by offering a single pane of glass for protection from malware and cyberattacks. The cloud-service works through providing actionable network intelligence to customers to strengthen their operational and security postures. It also delivers unified reporting and single-policy configuration, which Infoblox claims are capabilities not available through purely cloud-based DNS services.

“Enterprise networks do not have the luxury of being walled gardens any more, not with employees bringing their own devices and accessing data from everywhere,” said Scott Fulton, EVP of Products at Infoblox. “Infoblox DNS Firewall as a service helps our customers by providing the same industry leading protection for on- and off-premise devices, helping organisations to build enterprise networks that are more available, secure, and smart.”

The offering capitalises on the threat intelligence technology which Infoblox acquired through buying IID in February 2016. IID was acquired for approximately $45 million to strengthen Infoblox’s threat detection capabilities and to differentiate the company from other DDI vendors.

IID’s cloud-based platform for threat intelligence federation allows customers to share threat intelligence, which has been highlighted as another potential growth area for Infoblox, though this is a competitive marketplace already. Companies such as iSight already have a healthy presence in the threat intelligence market segment, though Infoblox does have a number of partnerships with these vendors, inherited through recent acquisitions, which the team does not expect to change moving forward.

mod_nss: Installation and configuration

On some distributions mod_ssl does not support TLS 1.2, but we can instead install mod_nss, which does. Let’s look at how to use mod_nss.

First we need to load the module and define some global settings, for example in /etc/httpd/conf.d/nss.conf:

LoadModule nss_module modules/libmodnss.so

AddType application/x-x509-ca-cert .crt
AddType application/x-pkcs7-crl    .crl

NSSPassPhraseDialog  builtin

NSSPassPhraseHelper /usr/libexec/nss_pcache

NSSSessionCacheSize 10000
NSSSessionCacheTimeout 100
NSSSession3CacheTimeout 86400


NSSRandomSeed startup builtin

NSSRenegotiation off

NSSRequireSafeNegotiation off

NSSCipherSuite +rsa_rc4_128_md5,+rsa_rc4_128_sha,+rsa_3des_sha,-rsa_des_sha,-rsa_rc4_40_md5,-rsa_rc2_40_md5,-rsa_null_md5,-rsa_null_sha,+fips_3des_sha,-fips_des_sha,-fortezza,-fortezza_rc4_128_sha,-fortezza_null,-rsa_des_56_sha,-rsa_rc4_56_sha,+rsa_aes_128_sha,+rsa_aes_256_sha

NSSProtocol TLSv1.0,TLSv1.1,TLSv1.2
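Worth noting (our comment, not part of the original walkthrough): the suite above still enables RC4- and MD5-based ciphers. If legacy clients are not a concern, a more conservative variant – sketched using only cipher names that already appear in the line above – would be:

```apache
# Sketch: keep only the AES suites and explicitly disable the weaker entries
NSSCipherSuite +rsa_aes_128_sha,+rsa_aes_256_sha,-rsa_rc4_128_md5,-rsa_rc4_128_sha,-rsa_3des_sha,-fips_3des_sha
```

The same string could then be reused in the NSSCipherSuite line of the SSL virtual host shown later.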

Next, using certutil, we create the database that will hold the certificates:

echo "ejemplopassword" > /etc/httpd/alias/pwdfile.txt
echo "internal:ejemplopassword" > /etc/httpd/alias/pin.txt
certutil -N -d /etc/httpd/alias -f /etc/httpd/alias/pwdfile.txt
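One caveat worth adding (our note, not in the original article): pwdfile.txt holds the database password in clear text, so it is sensible to create both files under a restrictive umask. A minimal sketch, using a scratch directory instead of the real /etc/httpd/alias path:

```shell
# Create the password files so only the owner can read them (mode 600);
# in a real deployment the files would live under /etc/httpd/alias.
dir=$(mktemp -d)
umask 077
echo "ejemplopassword" > "$dir/pwdfile.txt"
echo "internal:ejemplopassword" > "$dir/pin.txt"
stat -c '%a' "$dir/pwdfile.txt"
```
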

In the case of CentOS, installing mod_nss generates a database with sample self-signed certificates in /etc/httpd/alias.

Using certutil, we then generate the private key and the CSR we need in order to get the certificate signed, for example:

# certutil -R -s 'CN=systemadmin.es, O=systemadmin, OU=modnss, L=Barcelona, ST=Barcelona, C=RC' -o /etc/httpd/ssl/systemadmin.csr -a -g 2048 -d /etc/httpd/alias -f /etc/httpd/alias/pwdfile.txt

We can see the generated private key with certutil -K:

# certutil -K -d /etc/httpd/alias/
certutil: Checking token "NSS Certificate DB" in slot "NSS User Private Key and Certificate Services"
< 0> rsa      37d35426e3a54d45c360be5727cc0f93be4dbeb4   NSS Certificate DB:alpha
< 1> rsa      c2fb4ee7ebeedc5a8f0c0cb8d6d2d51581b9ef57   NSS Certificate DB:cacert
< 2> rsa      6c18f8803eb18ad6ad1930c3b4650eb3e8dc5b72   NSS Certificate DB:Server-Cert
< 3> rsa      67c0de3a88a738ffaaf3508d370b528b7976ab0e   NSS Certificate DB:sudosueu
< 4> rsa      7b7276980ef037e4b6b37652a95e16376ea95e29   SelfSignedSP
< 5> rsa      da7524dee9662362db91ff0b95e77c078e2c4ed5   (orphan)

Once the certificate authority returns the signed certificate, we must first import the intermediate certificate, if there is one. For example, to import the certificate at /etc/httpd/ssl/systemadmin_intermediate.crt under the nickname GeoTrustGlobalCA we would run:

# certutil -A -n 'GeoTrustGlobalCA' -t 'CT,,' -d /etc/httpd/alias -f /etc/httpd/alias/pwdfile.txt -a -i /etc/httpd/ssl/systemadmin_intermediate.crt

Finally, we import the signed certificate with the following command. Here we assume the certificate is at /etc/httpd/ssl/systemadmin_cert.crt and we want to import it under the nickname systemadmin:

# certutil -A -n 'systemadmin' -t 'P,,' -d /etc/httpd/alias -f /etc/httpd/alias/pwdfile.txt -a -i /etc/httpd/ssl/systemadmin_cert.crt

We can verify the chain with certutil -O:

# certutil -O -n systemadmin -d .
"GeoTrustGlobalCA" [CN=GeoTrust DV SSL CA - G3,OU=Domain Validated SSL,O=GeoTrust Inc.,C=US]

  "systemadmin" [CN=www.systemadmin.es]

If we list the private keys again, we will see that the key is no longer orphaned:

# certutil -K -d .
certutil: Checking token "NSS Certificate DB" in slot "NSS User Private Key and Certificate Services"
< 0> rsa      37d35426e3a54d45c360be5727cc0f93be4dbeb4   NSS Certificate DB:alpha
< 1> rsa      c2fb4ee7ebeedc5a8f0c0cb8d6d2d51581b9ef57   NSS Certificate DB:cacert
< 2> rsa      6c18f8803eb18ad6ad1930c3b4650eb3e8dc5b72   NSS Certificate DB:Server-Cert
< 3> rsa      67c0de3a88a738ffaaf3508d370b528b7976ab0e   NSS Certificate DB:sudosueu
< 4> rsa      7b7276980ef037e4b6b37652a95e16376ea95e29   SelfSignedSP
< 5> rsa      da7524dee9662362db91ff0b95e77c078e2c4ed5   systemadmin

To enable the SSL virtual host with mod_nss, we need to add the following options:

<VirtualHost *:443>
(...)
  NSSEngine on

  NSSCipherSuite +rsa_rc4_128_md5,+rsa_rc4_128_sha,+rsa_3des_sha,-rsa_des_sha,-rsa_rc4_40_md5,-rsa_rc2_40_md5,-rsa_null_md5,-rsa_null_sha,+fips_3des_sha,-fips_des_sha,-fortezza,-fortezza_rc4_128_sha,-fortezza_null,-rsa_des_56_sha,-rsa_rc4_56_sha,+rsa_aes_128_sha,+rsa_aes_256_sha

  NSSProtocol TLSv1.0,TLSv1.1,TLSv1.2

  NSSNickname systemadmin

  NSSCertificateDatabase /etc/httpd/alias
(...)
</VirtualHost>

We simply specify the nickname of the certificate to use via NSSNickname; in this example it would be systemadmin.


London’s Virtus Data Centres doubles annual revenues

London-based Virtus Data Centres has announced it has doubled its revenues over the last twelve months, though the team hasn’t released any specific numbers to substantiate the claim.

The company has recorded a healthy number of new customers throughout the period, including T-Systems, which runs its private and public cloud operations from the London2 location in Hayes as part of a five-year transition project to close its private data centre in Feltham. Virtus has 40MW of capacity across its three locations, having acquired the London4 site in Slough from Infinity SDC during the latter stages of 2015.

“Our aim is to combine cutting edge design and technology with transparent and agile commercials to offer the very best tailored solutions and service for our customers,” said Neil Cresswell, CEO at Virtus Data Centres. “This unique approach to data centre service delivery is the reason we see continued growth across all business lines with the likes of T-Systems and Symantec collocating in our leading facilities. It’s been a fantastic start to the year, and one which we seek to improve upon.”

The company, which has been in operation since 2008, offers traditional retail and wholesale colocation models through three locations in the London area (Enfield, Hayes and Slough), with a fourth set to open early next year. Virtus also claims to have recorded the highest total colocation MW sales of any operator in the London market throughout 2015, according to findings from CBRE, and is one of only four data centre operators in London to have been awarded Tier III design certification from the Uptime Institute. Virtus has also been expanding its credentials and capabilities in recent months, achieving supplier status with the Crown Commercial Service as part of the G-Cloud 7 initiative.

Recent expansion initiatives have been driven by investment from ST Telemedia, announced in June last year. As part of the agreement, ST Telemedia will make what it claims is a ‘significant investment’ in Virtus, committing to a 49% stake via a joint venture with Virtus’ existing owner Brockton Capital. ST Telemedia has a healthy track record with data centre companies, having launched i-STT in 2000, which was later merged into Equinix (a stake it has since divested), as well as investments in Level 3 Communications and GDS Services.

Why savvy digital visionaries see beyond the nearby clouds

(c)iStock.com/oztasbc

While decisive CEOs have a digital transformation agenda, the companies actually executing plans to re-imagine their business models are in the minority. And yet the early adopters now using cloud computing are able to respond quickly to changing market conditions. In contrast, the laggards are undecided and risk falling further behind.

This is a global phenomenon, in which industry and local market leaders are able to enact their transition with limited interference or threats from more traditional competitors. Just consider the current status quo within the United Kingdom as an example.

The transformation of UK businesses is still relatively immature. Many organisations are aware of the potential of open hybrid cloud adoption, but they fail to actively address the technical debt that defines their legacy IT environments. Moreover, they tend to focus narrowly on a small snapshot of the bigger picture.

While some British leaders have progressive transformational goals under consideration, much more work is needed if they are to reach their full digital potential. This is the key finding from the latest market study by the UK-based Cloud Industry Forum (CIF).

The study was conducted in the fourth quarter of 2015, polling 250 senior IT and business decision-makers from both the public and private sectors. What it uncovered was not encouraging: just 16% of organisations have a digital transformation strategy in place. However, 72% of those polled say they will be better prepared in two years’ time. What are they waiting for before they act? Perhaps they need a little guidance to show them the way forward.

Maybe they seek someone who can describe how to distinguish between the herd of unimaginative me-too cloud service providers, and thereby offer an alternative point of view – one that can demonstrate they’ve navigated boldly across a complex and disruptive digital business transformation landscape.

“Cloud computing is the agent of digital disruption, and we can see that there are significant benefits to be had by businesses that pursue both digital transformation and cloud computing strategies in tandem,” stated Alex Hilton, CEO of CIF.

Hilton believes that cloud computing and digital transformation go hand in hand. In fact, 85% of UK businesses with a digital transformation plan have already benefited from a tangible competitive advantage.

Cloud services form the foundation of digital transformation and can facilitate rapid business change. That much is apparent; it’s also clear from the research that transformation strategies serve to enhance the effectiveness and benefits of cloud computing implementations.

That’s why the more progressive organisations have focused on developing multifaceted talent – beyond basic technical-centricity. The notion of engaging a Digital Polymath is compelling: harnessing the wisdom of a worldly, open-minded individual who acknowledges the near-term challenges and opportunities, yet also has the vision to anticipate the broader future.

Other key findings from the study include:

  • 13% of organisations that have implemented, or are planning to implement, a digital transformation strategy say that cloud is critical to it, and a further 80% say that cloud is important.
  • Implementing a digital transformation strategy benefits cloud users, and those that have are statistically more likely to report experiencing greater benefits from their cloud deployments.
  • 38% of cloud users with a digital transformation strategy say that cloud has given their organisations a significant competitive advantage, compared with just 5% of those without such a strategy.
  • The cost savings of cloud users also increases if the organisation has implemented a digital transformation strategy (26% average saving) compared to those who have not (9% average saving).
  • The CIO is the most likely to be the driving force behind digital transformation, and by some margin at 60%. The next most likely is the CEO in 18% of cases.
  • 59% of organisations that currently have, or are in the process of implementing, a digital transformation strategy say it will steer the use of technology over the next decade.
  • 43% of survey respondents report that the intention is to achieve better use of data and analytics, and 30% report it is to improve innovation abilities.

The UK market study results are somewhat consistent with the findings of similar surveys of business leaders in the North America region. Striving to merely reach parity with the more progressive market leaders in your industry is likely a blueprint for a myopic plan of action. When you eventually arrive at that destination, you discover that the goal posts have already moved.

The alternative perspective – think ahead; imagine what’s next; define that future state; design a distinctive digital agenda that’s very difficult for competitors to simply replicate; execute in the now.

CD Automation | @DevOpsSummit #DevOps #IoT #ContinuousDelivery

Earlier this week, we hosted a Continuous Discussion (#c9d9) on Continuous Delivery (CD) automation and orchestration, featuring expert panelists Dondee Tan, Test Architect at Alaska Air, Taco Bakker, a LEAN Six Sigma black belt focusing on CD, and our own Sam Fell and Anders Wallgren.
During this episode, we discussed the differences between CD automation and orchestration, their challenges with setting up CD pipelines and some of the common chokepoints, as well as some best practices and tips for implementing CD.


Mobile Stress Testing | @DevOpsSummit #DevOps #ContinuousDelivery

Stress Testing: A testing process designed to push an application’s environment to its breaking point so that QA teams can gain an understanding of the upper limits of capacity within the system. Its purpose: Stress testing exposes issues that may not appear under normal or even expected conditions. It allows testers to determine the software’s robustness and ensure that the system fails and recovers in an acceptable manner.
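As an illustrative sketch only (the batch sizes and the use of `sleep` as a stand-in workload are our assumptions, not from the article), the ramp-up idea behind stress testing can be expressed as running ever-larger concurrent batches until the system degrades:

```shell
# Run increasingly large batches of a placeholder workload in parallel;
# a real stress test would replace `sleep` with requests to the system
# under test and record errors and latency for each batch.
for batch in 10 20 40 80; do
  for i in $(seq "$batch"); do sleep 0.01 & done
  wait
  echo "batch of $batch completed"
done
```
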


Agile Requirements | @DevOpsSummit #DevOps #IoT #ContinuousDelivery

Agile teams report the lowest rate of measuring non-functional requirements. What does this mean for the evolution of quality in this era of Continuous Everything?
To explore how the rise of SDLC acceleration trends such as Agile, DevOps, and Continuous Delivery are impacting software quality, Parasoft conducted a survey about measuring and monitoring non-functional requirements (NFRs). Here’s a glimpse at what we discovered and what it means for the evolution of quality in this era of Continuous Everything…


Managed Cloud Storage – What’s the hold up?

Organisations operating in today’s highly competitive and lightning-speed world are constantly looking for new ways to deliver services to customers at reduced cost. Cloud technologies in particular are now not only being explored but are becoming widely adopted, with new Cloud Industry Forum statistics showing that 80% of UK companies are adopting cloud technology as a key part of their overall IT and business strategy.

That said, the cloud is yet to be widely accepted as the safe storage location the industry says it is. There is still a great deal of apprehension, particularly among larger organisations, about entrusting large volumes of data to the cloud. Indeed, for the last 20 years storage has been defined by closed, proprietary and in many cases monolithic hardware-centric architectures, built for single applications, local network access, limited redundancy and highly manual operations.

Storage demands are changing

The continuous surge of data in modern society, however, now requires systems with massive scalability, local and remote accessibility, continuous uptime and great automation, with fewer resources having to manage greater capacity. The cloud is the obvious answer but there is still hesitancy.

Let’s face it though, anyone who is starting out today is unlikely to go out and buy a whole bunch of servers to deploy locally. They are much more likely to sign up for cloud-based managed services for functions like accounting, HR and expenses, and have a laptop with a big hard drive to store and share files using Gmail, Dropbox and so on. It is true to say that smaller businesses are increasingly using storage inside cloud apps, but for larger businesses, this option is not quite so simple or attractive. Many enterprises are turning to the cloud to host more and more apps but they still tend to keep the bulk of their static data on their own servers, to not only ensure safety and security but also to conduct faster analytics.

The cloud storage door is only slightly ajar

With increasing data volumes and accelerated demand for scalability, you would expect many businesses to be using cloud-based managed storage already. However, the fact remains that there are still many businesses burying their heads in the sand when it comes to cloud storage. As a result, there is quite a bit of fatigue amongst the storage vendors who have been promoting cloud for some time, but not seeing the anticipated take-up. In fact, I would go so far as to say that the door the industry is pushing against is only slightly ajar.

As with most things, there are clouds and there are clouds. At the end of the day, cloud-based storage can be anything an organisation wants it to be – the devil is in the architecture. If you wanted to specify storage that incorporates encryption, a local appliance, secure high-bandwidth internet connectivity, instant access, replication, green and economical storage media – a managed cloud storage service can actually ‘do’ all of these things and indeed, is doing so for many organisations. There is take-up, just not quite as much as many storage vendors would like.

It’s all about the data

Nowadays, for most organisations it is about achieving much more than just the safe storage of data. It’s more and more common to bolt-on a range of integrated products and services to achieve a wide range of specialist goals, and it’s becoming rare that anyone wants to just store their data (they want it to work for them). Most organisations want their data to be discoverable and accessible, as well as have integrity guarantees to ensure the data will be usable in the future, automated data storage workflows and so on. Organisations want to, and need to, realise the value of their data, and are now looking at ways to capitalise on it rather than simply store it away safely.

Some organisations, though, can’t use managed cloud storage for a whole raft of corporate, regulatory and geographical reasons. The on-premise alternative to a cloud solution, however, doesn’t have to be a burden on your IT, with remote management of an on-site storage deployment now a very real option. Storage capabilities that are specific to an industry or to an application are now complex; add some additional integrated functionality and it’s not something that local IT can, or wants to, deal with, manage or maintain. And who can blame them? Specialist services require a specialist managed services provider, and that is where outsourcing, even if you can’t use the cloud, can add real value to your business.

What do you want to do with your data?

At the end of the day, the nature of the data you have, what you want to do with it and how you want it managed will drive your storage direction. This includes questions around whether your data is static or subject to change, whether your storage needs to be on-premise or can be in the cloud, whether you want to back up or archive your data, whether you want an accessible archive or a deep archive, and whether you need it to be integrity-guaranteed, long term or short term. Cloud won’t always be the answer; there are trade-offs to be made and priorities to set. Critically, the storage solution you choose needs to be flexible enough to deal with these issues (and how they will shift over time), and that is the difficulty when trying to manage long-term data storage. Everything is available and you can get what you want, but you need to make sure that you are moving to a managed cloud service for the right reasons.

Ever-increasing organisational data volumes will continue to relentlessly drive the data storage industry and today’s storage models need to reflect the changing nature of the way in which businesses operate. Managed storage capabilities need to be designed from the ground up to facilitate organisations in maximising the value they can get from their data and reflect how those same organisations want to access and use it both today, and more importantly, for years to come.

Written by Nik Stanbridge, VP Marketing at Arkivum