How cloud storage became a target for hackers – and what can be done about it

(c)iStock.com/4774344sean

The recent revelation that Yahoo! suffered a hack in 2014 in which the accounts of around 500 million users were compromised brings back into focus the importance of businesses ensuring their customers’ data is always protected.

More and more businesses are now using the cloud to store their data. As with all new technologies though, hackers will look to exploit any security vulnerabilities they can find. While it is not yet clear whether the attack on Yahoo! was on a cloud-based system or was due to vulnerabilities present in a third-party application on the Yahoo! website, other high profile attacks have taken place against cloud storage systems in recent years.

In this article, David Midgley, head of operations at Total Processing, examines what has made cloud storage vulnerable to attack and how to make cloud storage more secure going forward.

Consumers are now increasingly comfortable making online financial transactions – I’d even argue that consumers have now come to expect the ease and convenience of making financial transactions in this way. More and more businesses have entered the eCommerce marketplace in order to keep pace with their rivals or seize upon the opportunities that eCommerce presents. However, the public should be wary of handing over their financial details so easily.

Businesses have increasingly begun to embrace cloud storage options in recent years to store their data; among other reasons, cloud storage solutions have meant they no longer need to incur the numerous costs associated with storing all their information in physical data centres. However, some businesses don’t seem to understand the potential hazards of using such a method for storing customer data.

While the cloud has opened up new frontiers, it’s also opened up a whole new world of security issues, as hackers now have another way to try and access people’s personal and financial information. Therefore, it is vitally important that businesses processing and storing customer information do their utmost to ensure it is secure and safe from those with sinister motives.

Unfortunately, this is not always the case. The last two years in particular have seen a number of high profile attacks against cloud storage systems. For instance, the attack on Apple’s iCloud platform that resulted in the release of the personal photographs of many high profile figures was a big talking point in the summer of 2014. On that occasion, as with the recent attack on Yahoo!, the hacker was able to access highly sensitive and confidential information following a single breach.

As someone who works in the fintech sector, where we process large amounts of financial data on a daily basis, I find this very worrying, particularly given that more and more people now make online transactions as eCommerce becomes ever more a part of our everyday lives.

It may be unfair though to only shine the spotlight on cloud storage solutions, as hackers will attack wherever they can find a weak spot in a company’s security. This certainly appears to be the case with the attack on Yahoo!, as there is still uncertainty as to whether the hackers gained access via cloud storage or by exploiting a vulnerability they had found in a third-party application that had access to the website. It may simply be that the high profile attack on Apple put the spotlight on cloud storage systems in my mind, and the scale of the attack on Yahoo! that has come to light in the last month has reaffirmed that. Either way, I think hackers’ attention will now increasingly turn to cloud storage.

Given these most recent revelations, I’m sure we can all agree that online security needs to be a top priority. It really is not difficult either – common sense practices will go a long way towards keeping your business and the information you’re holding secure. Keep all your security software up to date and implement two-factor authentication. Even keeping the security settings on your email systems strict will do a lot to keep you protected from external threats.
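For illustration only, here is a minimal Python sketch of one common two-factor authentication scheme – time-based one-time passwords (TOTP, RFC 6238) – using only the standard library. The secret is a placeholder and this is not a substitute for a vetted authentication library; it simply shows the code a server would compare against what the user types in.

import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6):
    """Generate a time-based one-time password (RFC 6238) from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # 30-second time step
    msg = struct.pack(">Q", counter)                  # counter as an 8-byte big-endian value
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Placeholder secret for illustration only; real secrets are generated per user.
print(totp("JBSWY3DPEHPK3PXP"))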

Watson Crowdsources Cloud Computing | @CloudExpo #Cloud #CognitiveComputing #DigitalTransformation

Recently I’ve been doing quite a bit of analysis work using the IBM Watson cognitive business platform. The really exciting thing about this opportunity is the way data can seem to have a conversation with you. This got me wondering if social media data could carry on a conversation as well. Given my almost unhealthy interest in cloud computing, we ran a one week experiment to “crowdsource the internet” in order to see if it held any interesting cloud computing insights.


[session] Legacy with #WebRTC | @ThingsExpo #IoT #M2M #RTC #UCaaS

Fact is, enterprises have significant legacy voice infrastructure that’s costly to replace with pure IP solutions. How can we bring this analog infrastructure into our shiny new cloud applications? There are proven methods to bind both legacy voice applications and traditional PSTN audio into cloud-based applications and services at a carrier scale. Some of the most successful implementations leverage WebRTC, WebSockets, SIP and other open source technologies.
In his session at @ThingsExpo, Dan Cunningham, CTO of ReadyTalk, will cover real world examples of how enterprises and vendors alike can modernize infrastructure without ditching existing investments.


VMworld EU – Live from Barcelona Day 2

Hello All! Another day at VMworld Europe and more information on what’s new in the VMware ecosystem. As its recent press release has made obvious, VMware has gone all in on making its ecosystem more developer friendly: REST-based APIs, container support, and expanded PowerCLI, to name a few.


vSphere 6.5

VMware vCenter Server® Appliance – The new vCSA brings many significant improvements, from the migration path to built-in high availability to the new scale it can manage. There are now fewer and fewer reasons to stay with the Windows version of vCenter, though there are still some use cases.

REST APIs – give developers, DevOps, and operations teams greater flexibility and automation options (a brief example appears below, after this list).

VMware vSphere Client – Goodbye Flash-based web client, hello HTML5! VMware listened to customer feedback and created an interface that is more responsive and (hopefully) more resilient.

VMware vSphere Integrated Containers™ – by supporting containerized applications in a way that is consumable with existing VMware infrastructure and skills, VMware is making it easier for infrastructure and development teams to work together more cohesively.
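As a rough illustration of what those REST APIs enable, here is an untested Python sketch written against the publicly documented vSphere 6.5 endpoints using the requests library; the hostname and credentials are placeholders, and certificate verification is disabled purely for brevity.

import requests

VCENTER = "https://vcsa.example.local"                     # placeholder vCSA hostname
AUTH = ("administrator@vsphere.local", "placeholder")      # placeholder credentials

# Create an API session and capture the session token.
login = requests.post(f"{VCENTER}/rest/com/vmware/cis/session", auth=AUTH, verify=False)
login.raise_for_status()
token = login.json()["value"]

# List virtual machines using the session token.
vms = requests.get(f"{VCENTER}/rest/vcenter/vm",
                   headers={"vmware-api-session-id": token}, verify=False)
vms.raise_for_status()
for vm in vms.json()["value"]:
    print(vm["name"], vm["power_state"])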


VSAN 6.5

iSCSI Support – will enable Virtual SAN storage to be presented as an iSCSI target for external physical workloads including clustered applications such as Microsoft SQL Server with Failover Clustering on a limited number of physical servers.

Containers Support – Virtual SAN will provide a persistent data layer for containerized applications via VMware vSphere Integrated Containers.

Two-node Direct Connect – having an option for remote office/branch office (ROBO) sites gives customers more flexibility and options.

Expanded PowerCLI – now that PowerShell is available on Linux and Mac, the new PowerCLI integrations will be even more usable.


vRealize Automation 7.2

AWS, Microsoft Azure, and container support will give IT and DevOps teams greater flexibility in deploying cross-cloud, multi-tier applications. In addition, Log Insight 4.0 and vRealize Operations 6.4 were announced and will integrate with all of the aforementioned technologies.


In addition, the vBrownBag Tech Talks that I’ve been helping to create have proved to be a great way to get exposure to new technical topics. We’ve published the VMworld Europe vBrownBag tech talks here. We recorded 21 videos! Next week I’ll be doing the OpenStack Summit Tech Talks, so expect many more videos in the coming week!

More to come! In the meantime, register for our upcoming webinar. CTO Chris Ward will be doing a full deep-dive into all of the biggest VMworld announcements.

Chris Williams – GreenPages Enterprise Consultant


Google announces low latency cold storage Coldline in “major refresh”

(c)iStock.com/Bernhard_Staehli

Google has announced a ‘major refresh’ of its cloud storage options, including Coldline, a new storage class aimed at long-term archival and disaster recovery.

The revamped storage classes are: multi-regional, for the highest availability of frequently accessed data, such as video and business continuity; regional, for data accessed frequently within a region, such as data analytics and general compute; Nearline, for infrequent access; and Coldline.

“Whether a business needs to store and stream multimedia to their users, store data for machine learning and analytics or restore a critical archive without waiting for hours or days, Cloud Storage now offers a broad range of storage options to meet those needs,” wrote Kirill Tropin, product manager, in a blog post.

Picture credit: Google

The move is particularly interesting given that Nearline, which launched in March last year, was already positioned as the lower cost yet higher latency option. In a graph outlining Nearline, the ‘time to first byte’ was measured in seconds, compared with ‘hours’ for more traditional cold storage.

“(Near) online data at an offline price” was the tag line. Yet in August Google announced a revamp of Nearline, promising lower latency and improved performance. Now, all four classes in the latest move promise millisecond access, with Google hoping it “changes the way that companies think about storing and accessing their cold data.”

As you’d expect, Google also rolled out a customer to showcase its new offerings; this time, it’s video hosting provider Vimeo, which is using the multi-regional storage class for high availability and Nearline to minimise overall costs in tandem with Fastly, a content delivery network service. Other companies announced today in the expanded Google cloud storage partner ecosystem include hyperconverged provider Cohesity, which said it was “excited” by the collaboration in a statement, Cloudian, and Veritas.

Back in September, Google announced eight new locations for its Cloud Platform service across five continents. The company cited its machine learning potential as a key aspect for potential new customers, with note-taking app Evernote a case in point when it recently confirmed it was moving onto Google’s platform.

Coldline is priced at $0.007 per gigabyte per month, which Google claims is the most economical storage for data accessed less than once per year.
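Taking that quoted price at face value, a quick back-of-the-envelope calculation in Python (storage cost only; retrieval, operation and minimum-storage-duration charges are not considered here, and the archive size is invented) looks like this:

# Rough monthly storage cost for Coldline at the quoted $0.007 per GB per month.
# Retrieval, operation and minimum-storage-duration charges are not included.
price_per_gb_month = 0.007
archive_gb = 10 * 1024                      # a hypothetical 10 TB archive

monthly_cost = archive_gb * price_per_gb_month
print(f"{archive_gb} GB costs about ${monthly_cost:.2f} per month")   # roughly $71.68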

Ops, APIs and Compression | @DevOpsSummit #API #APM #DevOps

I’ve been reading up on APIs cause, coolness. And in particular I really enjoyed reading Best Practices for Designing a Pragmatic RESTful API because it had a lot of really good information and advice.
And then I got to the part about compressing your APIs.
Before we go too far let me first say I’m not saying you shouldn’t compress your API or app responses. You probably should. What I am saying is that where you compress data and when are important considerations.
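To make the ‘where and when’ point concrete, here is a minimal Python sketch, standard library only, that compresses a JSON response body only when the client advertises gzip support and the payload is large enough to be worth it; the size threshold is an arbitrary illustration rather than a recommendation.

import gzip
import json

MIN_COMPRESS_BYTES = 1024   # arbitrary threshold; tiny payloads aren't worth the CPU

def build_response(payload, accept_encoding=""):
    """Serialize a payload, gzipping it only if the client supports it and it is large enough."""
    body = json.dumps(payload).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if "gzip" in accept_encoding and len(body) >= MIN_COMPRESS_BYTES:
        body = gzip.compress(body)
        headers["Content-Encoding"] = "gzip"
    headers["Content-Length"] = str(len(body))
    return body, headers

# A large payload and a client that accepts gzip: the response comes back compressed.
body, headers = build_response({"items": list(range(5000))}, accept_encoding="gzip")
print(headers)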


Government Can Influence Cloud Interoperability | @CloudExpo #API #SaaS #Cloud #FedRAMP

A BriefingsDirect thought leadership panel discussion explores how public-sector organizations can gain economic benefits from cloud interoperability and standardization.
Our panel comes to you in conjunction with The Open Group Paris Event and Member Meeting October 24 through 27, 2016 in France, with a focus on the latest developments in eGovernment.

As government agencies move to the public cloud computing model, the use of more than one public cloud provider can offer economic benefits through competition and choice. But are the public clouds standardized efficiently for true interoperability, and can the large government contracts in the offing for cloud providers have an impact on the level of maturity around standardization?


Microsoft Rolls Out Custom Versions of Azure for the US DoD

Microsoft is known to provide customized products for government agencies at the federal, state, and local levels, to ensure that its products meet the necessary requirements and certifications laid down for these agencies. To add a feather to its cap, Microsoft announced on Tuesday that it would create custom versions of its Azure cloud platform and Office 365 to meet the Impact Level 5 requirements laid down by the US Department of Defense (DoD). This product is expected to be available by the end of 2016, according to a press release.

In this version, Azure and Office 365 will be physically isolated in two new regions dedicated to them. According to the company, these regions will be located in Arizona and Texas, though the exact cities were not made public. To connect to these two centers, other DoD systems can use Microsoft ExpressRoute – Microsoft’s private connection that offers higher levels of security and lower levels of latency. Such a setup is expected to give an extra layer of security for data transmission, especially when accessing information considered critical to national security.

With this setup, Microsoft can meet the next level of security requirements, namely the Impact Level 5 controls laid down by the DoD. This new addition is significant for Microsoft, as it means that Azure cloud products will be an integral part of National Security System data and other mission-critical information. In fact, Microsoft will be the only cloud provider to offer a cloud that meets these stringent requirements, which gives it an edge over its competitors in a crowded cloud market. Currently, Amazon’s AWS is Level 4 compliant, whereas there are no such known certifications for Google.

Earlier, Microsoft’s cloud was certified to handle up to DoD Impact Level 4, which covers controlled but unclassified data such as privacy information and protected health information. Though Impact Level 5 also covers unclassified data, it includes data that is critical to national security.

With this new addition, the total number of regions for Azure Government services will go up to six, and this includes Virginia, Iowa, and two unnamed data centers, apart from the new ones. Microsoft claims that its Azure services are being used by more than 70,000 customers in the government sector, and six million end users are accessing its various cloud products.


Wearable Medical Devices | @ThingsExpo #IoT #M2M #API #WearableTech

The global wireless health market by product encompasses ECG monitors, insulin monitors, and neuromonitors such as EEG and EMG devices.
Recent estimates show that an astounding 1.8 million people across the world will be using wireless remote monitoring devices by 2017. Many of these users will be those suffering from chronic diseases. To put that into perspective, consider this: about 75% of all healthcare expenditure worldwide is spent on patients suffering from chronic diseases such as diabetes, asthma, Alzheimer’s, and cardiovascular problems. Grappling with these issues, the healthcare industry has attempted to increase the integration of technology into medical services, and the wireless health market is expected to benefit from this. Transparency Market Research’s report, titled ‘Wireless Health Market – Global Industry Analysis, Size, Share, Growth, Trends and Forecast 2015–2023,’ says that wearable medical devices will show the fastest growth within this market.


Cutting through the noise and sorting out the hybrid and multi-cloud imbroglio

(c)iStock.com/BrianAJackson

The hybrid cloud: it’s a buzzword which refuses to go away.

This month is a case in point. Amazon Web Services (AWS) and Microsoft, arguably the two biggest and most influential cloud computing organisations, both made announcements on the theme.

The general availability of Windows Server 2016 and System Center 2016 was seen as “another big step in hybrid cloud”, Microsoft argued, with a “cloud-ready OS” which “inherently enables hybrid cloud.” AWS, on the other hand, announced a previously rumoured hybrid cloud partnership with VMware, with the two companies pointing to their respective leadership in public and private cloud.

Demand is certainly there. According to IDC, more than 70% of IT organisations in western Europe will commit to hybrid cloud architectures by 2017. IBM, which announced its own object storage for hybrid clouds last week, released a study which backed up those figures; 78% of C-suite executives polled said their cloud initiatives were coordinated or fully integrated, up from 34% in 2012.

As Sebastian Krause, IBM Cloud Europe general manager wrote for this publication at the time: “Executives expect hybrid cloud adoption (in particular) to support their organisation’s growth in three main ways: by facilitating innovation, lowering total cost of ownership and by enhancing operational efficiencies and enabling them to more readily meet customer expectations.”

Sounds great, right? But there may be a catch.

Charles Crouchman is CTO at enterprise cloud and virtualisation software provider Turbonomic. He argues that plenty of organisations eulogising over hybrid cloud instead have ‘multi-cloud’. “While we hear a lot of discussion about hybrid cloud, the truth is that the majority of organisations aren’t there yet,” he tells CloudTech.

“A truly hybrid cloud would allow workloads to switch seamlessly between cloud environments, whether public or private, depending on the needs of the organisation,” Crouchman adds. “Instead, what the vast majority of organisations have now is multi-cloud; while they have a number of environments, both private and public, and workloads may fluctuate in size and demand, barring exceptional circumstances those workloads will stay where they were first placed.”

Essentially, using multiple clouds does not automatically make your setup hybrid. Back in August, Turbonomic polled almost 2,000 IT decision makers, and the verdict which came back was that many organisations did not have the skills needed to manage a multi-cloud environment. This was backed up by a study from HyTrust in September which found that while 60% of VMworld attendees polled said they plan to move to a multi-cloud model, data encryption and security remain serious stumbling blocks. Interestingly, almost a third said that, if they were using two or more clouds, they preferred a combination of AWS and Azure.

“This is quite believable when one considers that by their very nature multi-cloud environments aren’t managed from a single user interface,” explains Crouchman. “An organisation’s private and public clouds have, in most cases, come from separate vendors, each working differently, with their own management tools and interfaces.

“This adds extra complexity on top of an already potent mix of challenges, such as deciding which workloads should be placed in which cloud; balancing cost and performance across all environments; while meeting service level agreements and quality of service obligations.”

For Crouchman, this is indicative of a stepping stone approach; if companies are still experiencing trouble with deploying multi-cloud environments, then a truly hybrid approach is a long way away. So what would in his opinion be the perfect scenario? “In an ideal future, organisations would use autonomic, economic-based intelligence to manage multi-cloud environments and ultimately deliver hybrid cloud, with workloads always in the best possible place to suit business needs and provide peak performance,” he says.

“In such a system, the multi-cloud environment would be treated as an economy in its own right; with resources bought and traded depending on their applications’ – or end users’ – real-time needs, while obeying the organisation’s own budgetary rules.”
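Purely as a toy illustration of that economic idea – and not a description of Turbonomic’s actual engine – the Python sketch below places each workload on whichever hypothetical cloud offers the lowest cost while still meeting its latency requirement, and defers any placement that would exceed the budget; every name and number is invented.

# Toy illustration of economic, constraint-aware workload placement.
# Cloud names, prices, latencies and the budget are invented for the example.
clouds = {
    "private":  {"price_per_hour": 0.12, "latency_ms": 5},
    "public_a": {"price_per_hour": 0.09, "latency_ms": 20},
    "public_b": {"price_per_hour": 0.07, "latency_ms": 45},
}

workloads = [
    {"name": "web-frontend", "max_latency_ms": 25, "hours": 720},
    {"name": "batch-report", "max_latency_ms": 100, "hours": 80},
]

budget = 100.0   # total monthly spend allowed across all placements
spent = 0.0

for wl in workloads:
    # Candidates are the clouds that satisfy the workload's latency requirement.
    candidates = [
        (c["price_per_hour"] * wl["hours"], name)
        for name, c in clouds.items()
        if c["latency_ms"] <= wl["max_latency_ms"]
    ]
    cost, choice = min(candidates)   # cheapest cloud that meets the constraint
    if spent + cost > budget:
        print(f"{wl['name']}: deferred, placement would exceed the budget")
        continue
    spent += cost
    print(f"{wl['name']}: placed on {choice} for ${cost:.2f}")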