Research argues greater confidence in public cloud security from IT pros

(c)iStock.com/scanrail

IT professionals are becoming more confident in the security of the public cloud compared to corporate data centres, according to the latest study from IT consulting firm SADA Systems.

The poll, which quizzed more than 200 IT managers around their use of public cloud services, found that 84% of respondents were using some form of public cloud infrastructure today.

49% of respondents said they used Google Cloud Platform, compared with 48% for Microsoft and 42% for AWS – although it’s worth noting that SADA’s work comes primarily through Google and Microsoft reselling.

Half (50%) of those polled said they are likely to increase their public cloud usage by at least 25% over the next three years, with a further 25% saying they would increase their usage by more than half. 45% of firms polled said it took them three to six months to migrate to public cloud, with 23% saying it took three months.

In terms of issues with cloud adoption, more than half (51%) of respondents said concerns around data security prevented them from quicker adoption, while long-term viability of cloud (40%) and escalating costs (33%) were also highly cited.

“All signs point to public cloud adoption growing and enterprise IT becoming more comfortable with the prospect of running their most sensitive data on public cloud infrastructure,” said Tony Safoian, SADA president and CEO in a statement. “Security and reliability will always be primary concerns – as they should – and companies should lean on expert consultants and integrators to guide them in addressing these issues.

“The convenience of public cloud, coupled with easy access to proven resources for managing these environments, make the option of moving to public cloud too compelling to ignore,” Safoian added.

IDEs Are Moving to the Cloud | @CloudExpo #API #Cloud #DevOps

During the early days of my career, I used a variety of development environments – the hugely popular Turbo C, along with FoxPro, PowerBuilder, and Delphi – before finally settling down with Microsoft Visual Studio. The first line of code I ever wrote was in QBasic running on MS-DOS. The editor had useful features like adding line numbers and automatically converting keywords to uppercase. Hitting F5 would run the program instantly, without having to switch to the command prompt.

read more

The Zombie Apocalypse and Other Cloud Infrastructure Concerns | @CloudExpo #IaaS #Cloud #OpenStack

Last month, I returned from an overseas trip to a friend’s wedding to find the zombie apocalypse happening in my neighborhood near downtown San Jose. People of all ages and demographics united together in what looked like strange cult gatherings in public parks, stumbling around the streets of the South Bay with eyes glued to their phones and occasionally yelling something strange like, “OMG I hatched a Nidoking!” Cars would come to abrupt halts in front of me at unexplained times—I half expected a scene from the movie Zombieland to unfold where Woody Harrelson would leap out from an SUV ready to give me the nickname Minnesota and help navigate safely back to my parents’ farm.

Pokémon Go was released in the US on July 6, 2016. Unless you’ve been living off the grid in a yurt (in which case you won’t be reading this blog post), you’ve probably observed that this was much more than a game release and more like the dawn of a new way of life for some people. While there have been well-publicized challenges as the game has scaled to reach millions, it’s an amazing…

read more

Examining a new approach to data centre security

(c)iStock.com/4x-Image

Changing with the times is frequently overlooked when it comes to data security. Technology is becoming increasingly dynamic, but most data centres still rely on archaic security measures to protect their networks – measures that don’t stand a chance against today’s sophisticated attacks.

Recent efforts to upgrade these massive security systems are still falling short. Data centres house a huge amount of data and there shouldn’t be any shortcuts when implementing security to protect that data. The focus remains on providing protection only at the perimeter to keep threats outside. However, implementing perimeter-centric security leaves the insides of the data centre vulnerable, where the actual data resides.

Cybercriminals understand this, and are constantly utilising advanced threats and techniques to breach external protections and move inside the data centre. Without strong internal security protections, hackers have visibility and access to steal data and disrupt business processes before they are even detected.

Businesses face security challenges as traffic behaviour and patterns shift. There are more applications in the data centre, and these applications are integrated with one another. The growing number of applications causes east-west traffic within the data centre to increase drastically, and because perimeter defences are blind to this traffic, lateral movement becomes possible. The rising number of applications also gives hackers a broader choice of targets. Another challenge is that manual processes for managing security are too slow: newly created applications evolve and change frequently, and static security controls cannot keep pace.

To address these challenges, a new security approach is needed – one that requires bringing security inside the data centre to protect against advanced threats. Enter micro-segmentation.

Micro-segmentation with advanced threat prevention is emerging as the new way to improve data centre security. Micro-segmentation works by grouping resources within the data centre and applying specific security policies to the communication between those groups. The data centre is essentially divided up into smaller, protected sections (segments) so that any intrusion discovered can be contained.

However, despite the separation, applications still need to communicate across micro-segments. Lateral movement therefore remains possible, which is why it is vital for threat prevention to inspect the traffic crossing micro-segments in order to detect and block it.

To keep data centre security agile enough to cope with rapid change, the security layer in the software-defined data centre learns the role, scale, and location of each new application as it is added. This allows the correct security policies to be enforced automatically, removing the need for a manual process.
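As a rough illustration, the grouping-plus-policy model described above can be sketched as a lookup keyed on source and destination segments, with deny as the default and an inspection flag for flows that must pass through threat prevention. The segment names and policies below are hypothetical, not any vendor's actual schema:

```python
# Hypothetical sketch of tag-based micro-segmentation policy evaluation.
from dataclasses import dataclass

@dataclass(frozen=True)
class Workload:
    name: str
    segment: str  # e.g. "web", "app", "db"

# Policies keyed by (source segment, destination segment).
# "inspect" marks flows that must pass through threat prevention;
# anything not listed is denied by default.
POLICIES = {
    ("web", "app"): {"action": "allow", "inspect": True},
    ("app", "db"):  {"action": "allow", "inspect": True},
}

def evaluate(src: Workload, dst: Workload) -> dict:
    """Return the policy for traffic from src to dst (default deny)."""
    return POLICIES.get((src.segment, dst.segment), {"action": "deny", "inspect": False})

web = Workload("web-01", "web")
db = Workload("db-01", "db")
# Direct web->db traffic is denied: lateral movement across segments is blocked.
print(evaluate(web, db))
```

The default-deny lookup captures the containment idea: a compromised web workload cannot reach the database tier directly, and even allowed flows are flagged for inspection.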

Strengthening the perimeter offers little help if there is no additional security within the data centre. With micro-segmentation, advanced security and threat prevention services can be deployed wherever they are needed in the environment. Implementing solutions such as Check Point’s vSEC for VMware NSX will provide multi-layered defences to protect east-west traffic within the data centre, and automatically quarantine infected machines for remediation. This puts required protection inside the organisation’s data centre, securing their company assets and valuable data from attacks.

By deploying advanced security solutions, businesses can better protect their data centres from undetected breaches and sophisticated threats. 

UPS Modernizes and Streamlines Procure-to-Pay Processes | @CloudExpo #API #Cloud #BusinessIntelligence

The next BriefingsDirect business innovation for procurement case study examines how UPS modernizes and streamlines its procure-to-pay processes.
Learn how UPS – across billions of dollars of supplier spend per year – automates supply-chain management and leverages new technologies to provide greater insight into procurement networks.

read more

Feature-Rich DEMS Available from @DoubleHorn | @CloudExpo #API #Cloud #Security #FedRAMP

DoubleHorn is now offering its feature-rich and cost-effective Evidence Management System to law enforcement agencies interested in developing body-worn camera programs across Texas.
DoubleHorn’s Evidence Management System (DEMS) is a unified system for capturing any digital evidence (videos, images, and files) and physical evidence end-to-end. It is a secure, scalable, cost-effective, and compliant solution for law enforcement authorities to manage any type of digital evidence. The features of DEMS include collection and storage of physical and digital evidence, chain of custody, management of convenience copies, collection and analysis of metadata, and reporting. DEMS also offers cloud storage options that meet all the major security and compliance requirements, including CJIS, FedRAMP, FISMA, HIPAA, FIPS, ITAR, and FERPA.
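A chain-of-custody record of the kind listed above can be sketched as an append-only log tied to a content hash, so any tampering with a stored copy is detectable. The class and field names below are illustrative, not DoubleHorn's actual design:

```python
# Illustrative sketch of a hash-anchored chain-of-custody record.
import hashlib
from datetime import datetime, timezone

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class EvidenceItem:
    def __init__(self, case_id: str, kind: str, payload: bytes):
        self.case_id = case_id
        self.kind = kind                   # "video", "image", or "file"
        self.digest = sha256_hex(payload)  # fixes the content at intake
        self.custody = []                  # append-only chain of custody
        self.log("collected", officer="unit-12")

    def log(self, event: str, **meta):
        self.custody.append({
            "event": event,
            "at": datetime.now(timezone.utc).isoformat(),
            "digest": self.digest,
            **meta,
        })

    def verify(self, payload: bytes) -> bool:
        # Any change to the stored copy changes the hash.
        return sha256_hex(payload) == self.digest

item = EvidenceItem("case-001", "video", b"\x00fake-bodycam-bytes")
item.log("transferred", to="evidence-locker")
print(item.verify(b"\x00fake-bodycam-bytes"))  # True: copy matches intake hash
```

Anchoring each custody entry to the intake hash is one simple way to make the chain of custody auditable end-to-end.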

read more

IDC: Global revenues from public cloud will hit almost $200bn by 2020

(c)iStock.com/RapidEye

Global revenues from public cloud services will hit more than $195 billion (£150.2bn) by 2020, according to the latest forecast from analyst house IDC.

The new prediction, which arrives in the firm’s semi-annual public cloud services spending guide, argues that 2020 revenues will be more than double 2016’s expected $96.5bn, representing a compound annual growth rate (CAGR) of 20.4% between 2015 and 2020.
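As a quick sanity check on the arithmetic (IDC's quoted 20.4% CAGR spans 2015–2020, while the revenue figures here are for 2016 and 2020, so the implied four-year rate comes out slightly lower; all figures are rounded as reported):

```python
# Check the forecast arithmetic against the reported figures.
rev_2016 = 96.5   # $bn, IDC's expected 2016 public cloud revenue
rev_2020 = 195.0  # $bn, forecast for 2020

ratio = rev_2020 / rev_2016                      # ~2.02x, i.e. "more than double"
implied_cagr = ratio ** (1 / 4) - 1              # four years, 2016-2020
print(f"2020 is {ratio:.2f}x 2016 revenue")
print(f"implied 2016-2020 CAGR: {implied_cagr:.1%}")  # ~19.2%
```

The ~19.2% implied four-year rate is consistent with the stated 20.4% five-year CAGR given rounding and the different base year.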

“Cloud software will significantly outpace traditional software product delivery over the next five years, growing nearly three times faster than the software market as a whole and becoming the significant growth driver to all functional software markets,” said Benjamin McGrath, IDC senior research analyst for SaaS and business models. “By 2020, about half of all business software purchases will be of service-enabled software, and cloud software will constitute more than a quarter of all software sold.”

According to the IDC analysis, manufacturing, banking, and professional services represent almost a third of public cloud services revenues in 2016. The US is expected to be the largest market for public cloud with almost two thirds of revenues, followed by Western Europe and the snappily-titled Asia/Pacific excluding Japan (APeJ).

“Cloud computing is breaking down traditional technology barriers as line of business leaders and their IT organisations rely on cloud to flexibly deliver IT resources at the lower cost and faster speed that businesses require,” said Eileen Smith, program director of customer insights and analysis. “Organisations across all industries are now free to adapt to market changes quicker and take more risks, as they are no longer bound by legacy IT constraints.”

Previous IDC research focused on EMEA cloud IT infrastructure, which grew by 17% to $1.3bn in the first quarter of this year. The second quarter figures are yet to be disclosed, but the analyst house warned of the potential impact the Brexit EU referendum vote could have.

How Amazon is disrupting a $34bn database market

(c)iStock.com/Prykhodov

When Amazon launched Aurora in 2014, it was presented as a clear challenge to giants in the $34 billion database market. Today it is Amazon’s fastest-growing product and has already surpassed the growth of Amazon Redshift, which is saying something. Customers of the service are among Amazon’s largest and loudest advocates. Since the start of 2016, roughly 7,000 databases have been migrated to AWS Aurora — and the rate of adoption has tripled since March 2016.

Why is Aurora gaining popularity? As we have come to expect from Amazon, Aurora provides enterprises with the performance and reliability of a commercial product at a fraction of the cost of Oracle or IBM. Amazon has also made consistent efforts to reduce the effort of database migration with services like AWS Database Migration Service.
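Part of why migration effort is low is that Aurora exposes a MySQL-compatible endpoint, so an application can often be repointed by swapping its database connection URL rather than rewriting queries. A minimal sketch (the cluster hostname below is made up for illustration):

```python
# Sketch: repointing an app from an on-prem MySQL server to an Aurora
# cluster endpoint is often just a connection-string change.
def mysql_url(user: str, host: str, db: str, port: int = 3306) -> str:
    """Build a SQLAlchemy-style MySQL connection URL."""
    return f"mysql+pymysql://{user}@{host}:{port}/{db}"

on_prem = mysql_url("app", "db01.corp.internal", "orders")
aurora = mysql_url(
    "app",
    "mycluster.cluster-abc123.us-east-1.rds.amazonaws.com",  # hypothetical endpoint
    "orders",
)
print(aurora)
```

Schema conversion and data transfer still take work (that is what AWS Database Migration Service targets), but wire-level compatibility keeps application changes small.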

First and foremost, migration to Aurora is about cost: companies want to get out of expensive database licenses. But the popularity of Aurora is also a sign of rising interest in Amazon’s fully-managed tools — overcoming fears that using native AWS tools equates to vendor lock-in.

Traditionally, vendor lock-in worries would cause a company to use only “basic” services in order to make Amazon easy to leave. But it appears that these fears are being eclipsed by a desire to reduce IT management. In other words, the value of a managed or automated approach far outweighs the potential effort of migrating out of that service for a (hypothetical) future transition.

Zynga began with this “tentative” approach to adopting AWS, but now realizes that the value of AWS is not cheap compute but reduced infrastructure maintenance. Zynga famously migrated to AWS, then decided to move back to its own private cloud, then returned to AWS in 2015. This time around, Zynga decided its goal was not just to reduce bottom-line costs, but to be smarter about putting engineering resources towards applications, not infrastructure.

“As we migrated from our own private cloud to AWS in 2015, one of the main objectives was to reduce the operational burden on our engineers by embracing the many managed services AWS offered,” said Chris Broglie of Zynga on the AWS blog. “Before Aurora we would have had to either get a DBA online to manually provision, replicate, and failover to a larger instance, or try to ship a code hotfix to reduce the load on the database. Manual changes are always slower and riskier, so Aurora’s automation is a great addition to our ops toolbox.”

The adoption rates of Aurora and Redshift seem to indicate that Zynga is not the only company willing to purchase higher-level service offerings from Amazon. Anecdotally, the team at Logicworks has also seen growing interest in Aurora and other services like Redshift and RDS.

Changing your database schema has traditionally been difficult and expensive. Early adopters of the cloud usually just want to get their databases running on Amazon EC2 — choosing speed and ease of migration over long-term licensing cost savings. As cloud adoption matures, expect more companies to make a (slow) migration over to cloud-native systems. Because in the end, it is not just about licensing costs. It is about removing management burden from IT — and choosing to focus engineering talent on what really matters.

The post appeared first on Logicworks Gathering Clouds.

How to Create a More Engaging Consumer App | @CloudExpo #Cloud #Mobile

The development of your app isn’t something that happens overnight, but rather it is something that takes place over time and through multiple iterations. While you may have developed the framework for your app already, improving engagement and innovating to provide a better user experience is a completely different process. There are plenty of ways that you can improve user engagement, but these five strategies will help you improve your app while also reducing the risk that your “innovative” updates actually decrease user engagement.

read more

Blockchain Identity | @CloudExpo #Blockchain #DigitalTransformation

As good Venn diagrams always show, it’s the intersection areas that offer the sweet spot for trends and markets, and an especially potent one is the intersection of cloud computing with blockchain and digital identity, particularly when considered within contexts such as Digital Democracy.

Given the diversity of each field alone, there are multiple perspectives on this; what follows is more the start of a conversation than a definitive design.

read more