“Opsani helps the enterprise adopt containers, helping them move their infrastructure into this modern world of DevOps, accelerating the delivery of new features into production, and really getting them going on the container path,” explained Ross Schibler, CEO of Opsani, and Peter Nickolov, CTO of Opsani, in this SYS-CON.tv interview at DevOps Summit at 21st Cloud Expo, held Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
Five tips for better AWS S3 bucket security
Barely a day goes by without news of yet another breach of an AWS S3 bucket. But these breaches are preventable.
Misconfigured Amazon Simple Storage Service (S3) buckets have led to an epidemic of breaches, many of them involving major companies or their business associates, including Verizon, a data analytics firm hired by the Republican National Committee, FedEx and even a national defense contractor. The problem is so pervasive that last summer, Amazon itself sent out emails to customers with publicly accessible S3 buckets, warning them to check their security settings.
AWS is a powerful and highly secure cloud environment, but it must be configured and maintained properly. Here are five tips for keeping your AWS environment safe from hackers – and your company out of the news.
Know what it is you’re doing
The default privacy setting for AWS S3 buckets is owner-only. Many AWS breaches involve organisations granting access to the “Authenticated Users” group when expanding access to their buckets, not realising that this group includes every authenticated user of Amazon Web Services, not just the users of their own account. This means that anyone with an AWS account can access the bucket with whatever permissions are granted at that level; it’s a free-for-all.
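As a quick sanity check, a short script can flag any bucket whose ACL grants access to those global grantee groups. Below is a minimal sketch, assuming boto3 and AWS credentials that are allowed to list buckets and read their ACLs.

```python
# Minimal sketch: flag any grant to the global "AllUsers" (everyone on the
# internet) or "AuthenticatedUsers" (every AWS account) groups.
import boto3

RISKY_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    acl = s3.get_bucket_acl(Bucket=bucket["Name"])
    for grant in acl["Grants"]:
        uri = grant["Grantee"].get("URI")
        if uri in RISKY_GRANTEES:
            group = uri.rsplit("/", 1)[-1]
            print(f"{bucket['Name']}: {grant['Permission']} granted to {group}")
```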
Understand what level of access you’re granting to your data and who you are granting it to. A good rule of thumb is, if you’re not sure, don’t do it! Get help before you end up exposing your data to the world.
Know what data you have
Data governance is one of the pillars of cloud security. You cannot secure your data if you don’t know what you have. Is your data important? Is it unique? Does it have value?
Keep Mitnick’s law in mind and don’t waste money on worthless data; you shouldn’t spend more money protecting your data than the data is worth. At the same time, you shouldn’t skimp on security when dealing with sensitive data that’s worth enormous amounts of money. If a $500 tool can stop a multi-million-dollar breach, it’s well worth the investment.
Many organisations fall prey to the “camel’s nose under the tent” problem. They find that cloud computing is easy, so easy that they start migrating all manner of data into the cloud without evaluating it and considering whether it belongs there. Eventually, really sensitive data ends up being stored in the cloud. Even worse, the IT people may not know this data exists, and it becomes a shadow IT problem. Always identify your data and run an assessment before putting it into the cloud. If you have only two levels of classification, private and public, treat everything as private until proven otherwise.
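One lightweight way to enforce that default is to tag everything as private until it has been explicitly reviewed. The sketch below is illustrative: the “classification” tag key and its values are our own convention, not an AWS standard.

```python
# Illustrative sketch: default every untagged bucket to "private" so nothing
# sits in the cloud unclassified.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        tag_set = s3.get_bucket_tagging(Bucket=name)["TagSet"]
    except ClientError:  # raised (NoSuchTagSet) when a bucket has no tags
        tag_set = []
    tags = {t["Key"]: t["Value"] for t in tag_set}
    if "classification" not in tags:
        tags["classification"] = "private"  # private until proven public
        s3.put_bucket_tagging(
            Bucket=name,
            Tagging={"TagSet": [{"Key": k, "Value": v} for k, v in tags.items()]},
        )
        print(f"{name}: tagged classification=private")
```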
Take advantage of the tools that are available to secure your AWS environment
Many tools are available that are specifically designed to help you secure your AWS environment. However, these tools are only as good as the people who run them; they work only if organisations actually use them and read the reports they generate. Those reports tend to be voluminous, which means they often end up unread, yet they contain nuggets of critical information about the security of your cloud environment.
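Even a small script can help surface those nuggets. The sketch below assumes a hypothetical findings.json exported by a scanning tool and filters it down to the findings worth acting on; the field names are invented for illustration, so adapt them to whatever your tool actually emits.

```python
# Hypothetical sketch: reduce a voluminous scanner report to the findings
# that matter. The findings.json layout (severity, resource, title fields)
# is invented for illustration.
import json

with open("findings.json") as f:
    findings = json.load(f)

critical = [x for x in findings if x.get("severity") in ("CRITICAL", "HIGH")]
for item in sorted(critical, key=lambda x: x["severity"]):  # CRITICAL sorts first
    print(f"[{item['severity']}] {item['resource']}: {item['title']}")
print(f"{len(critical)} of {len(findings)} findings need attention")
```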
Beware of the complexity of AWS
The ease of using AWS or other cloud environments can make it easy to forget just how complex the cloud is. Most clouds offer thousands of different options. This complexity is why the cloud is so flexible, but it also decreases visibility. Properly configuring and managing a cloud environment is like assembling a puzzle where all of the pieces are black; when everything looks the same, it’s difficult to find the right knob to turn.
In a general IT environment, there is a management console for every area and tool. Routers, switches, firewalls, servers, and data storage all have their own, different tools, and each tool has its own management console. Once you add a cloud environment, you add another management console. There are already hundreds of ways to screw things up in an on-premises data environment. The cloud adds yet another layer of complexity, and organisations must understand how it will impact their overall cyber security.
Don’t be afraid to seek help
When configured and managed properly, AWS is highly secure, but cloud security is quite different from on-premises security. In many cases, AWS breaches happen because organisations have non-IT personnel, or IT personnel who do not fully understand the cloud, configuring their AWS buckets.
Most organisations do not have the in-house resources to properly configure and maintain a secure AWS environment. Don’t be afraid to seek the help of a professional managed security services provider (MSSP) with experience in both cloud and on-premises data security. A reputable MSSP can help you every step of the way, from evaluating and assessing your data to keeping your cloud secure moving forward.
Even if you’ve already been running an AWS environment for some time, in light of the epidemic of AWS breaches, hiring an MSSP to perform a cloud assessment is a wise move. If you do have security issues, wouldn’t you rather find out about them during an assessment instead of after your data is put up for sale on the Dark Net? Seeking professional help could make the difference between your organisation having a secure AWS environment and it being the next “major AWS breach” headline.
How should businesses respond to the security challenges of multi-cloud?
Hybrid multi-cloud is increasingly accepted as the IT architecture of the future.
Only multi-cloud offers the flexibility, the scalability and the security benefits that modern businesses demand. But a decision to embrace a multi-cloud-first strategy is only the beginning – companies need to spend serious time on careful preparations to optimise this approach. For instance, there are security challenges that inevitably follow when data is spread across multiple cloud environments, and companies need to be certain that they are protected from cyber-attacks.
But how can companies be sure when their data is spread across distant networks? How can they keep their data fully under their control? How best can they make encryption work for them?
Nothing is more important to the operations and reputation of an enterprise than protecting its data. But that can be complex and costly when that data is globally dispersed, extremely mobile and, at times, out of its control. According to Equinix’s Global Interconnection Index, over the next three years enterprises will require 50% more traffic capacity. The growth of IoT and all those connections with clouds, IT providers and third-party network destinations eats up a lot of bandwidth, and this explosion of data is certain to make companies nervous about the degree of visibility and control they have over their data.
As data grows, fear often grows. Security challenges are one of the main barriers to cloud adoption for businesses, with McAfee reporting that 49% of businesses are delaying cloud deployment due to a cybersecurity skills gap. Businesses need to reimagine how they manage data and decide on operational models that best fit their business needs, inside or outside of data silos. With so much on the line, the best defence against cyberattacks is to take a “trust nothing” approach. As at an international airport, companies must erect security checkpoints in all the places where their data is exchanged, whether inside or outside their organisations.
Given the cybersecurity risks of today's distributed application, cloud and data environments, customers are turning to encryption as one of the most effective controls to protect their critical information.
In response to the increasing adoption of hybrid and multi-cloud environments by many organisations and the need for private, secure, globally available key management, Equinix has developed a software-as-a-service encryption and key management service called SmartKey. Companies use SmartKey to securely host encryption keys separate from, but in close proximity to, the data located across networks and hybrid and multi-cloud environments. This protects enterprise data wherever it resides and keeps it under the enterprise’s control, while the separation of keys from data improves both security and compliance. In doing so, SmartKey solves a variety of data sovereignty issues and puts the enterprise, not an external service provider, in control of both its data and its keys, making a hybrid multi-cloud future a lot more secure and a lot less complicated. Leveraging data encryption and identity key management platforms for multiple clouds through services such as SmartKey is vital to an enterprise’s data protection plans.
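To make the keys-separate-from-data principle concrete, here is a generic envelope-encryption sketch: a fresh data key encrypts each object, and a master key held by a separate key management service wraps the data key. This is an illustration only, not SmartKey’s actual API, and it uses the Python cryptography library.

```python
# Generic envelope-encryption sketch (not SmartKey's actual API). A per-object
# data key encrypts the data; a master key held in a separate key management
# service "wraps" the data key, so data and keys never live in the same place.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def envelope_encrypt(plaintext: bytes, master_key: bytes):
    data_key = AESGCM.generate_key(bit_length=256)  # per-object key
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    # In practice this wrap happens inside the external KMS, and the raw
    # master key never leaves it.
    wrap_nonce = os.urandom(12)
    wrapped_key = AESGCM(master_key).encrypt(wrap_nonce, data_key, None)
    return ciphertext, nonce, wrapped_key, wrap_nonce

master = AESGCM.generate_key(bit_length=256)  # stands in for the KMS-held key
ct, nonce, wrapped, wrap_nonce = envelope_encrypt(b"customer record", master)
data_key = AESGCM(master).decrypt(wrap_nonce, wrapped, None)  # unwrap at KMS
print(AESGCM(data_key).decrypt(nonce, ct, None))  # b'customer record'
```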
You will remember that 2017 brought painful security lessons of unprecedented scale to numerous businesses. Billions of online accounts were hacked, and sensitive personal data for hundreds of millions of people was exposed. It’s vital for companies to always remember that any online technology is subject to compromise, so establishing a global infrastructure of manageable security control points that can scale quickly and securely is key. Services such as SmartKey help ensure that enterprises enjoy both the flexibility and scalability of multiple cloud environments, whilst getting the control they need over dispersed data to mitigate potential security challenges and achieve peace of mind.
RTC Testing Evolution with #WebRTC | @ExpoDX #RTC #Telecom #IoT #AI
It is of utmost importance for the future success of WebRTC to ensure interoperability between web browsers and any WebRTC-compliant client. To guarantee that it is operational and effective, interoperability must be tested extensively by establishing WebRTC data and media connections between different web browsers running on different devices and operating systems.
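To give a flavour of what such a test looks like in code, here is a minimal sketch using aiortc, a Python WebRTC stack: two in-process peers negotiate an offer/answer and exchange one data-channel message. Real interoperability testing drives actual browsers on real devices, but the negotiation steps are the same.

```python
# Minimal sketch of an automated WebRTC data-channel test using aiortc.
import asyncio
from aiortc import RTCPeerConnection

async def main():
    caller, callee = RTCPeerConnection(), RTCPeerConnection()
    channel = caller.createDataChannel("test")
    done = asyncio.Event()

    @callee.on("datachannel")
    def on_datachannel(ch):
        @ch.on("message")
        def on_message(msg):
            print("callee received:", msg)
            done.set()

    @channel.on("open")
    def on_open():
        channel.send("ping")

    # Signalling normally crosses a network; here the SDP is copied directly.
    await caller.setLocalDescription(await caller.createOffer())
    await callee.setRemoteDescription(caller.localDescription)
    await callee.setLocalDescription(await callee.createAnswer())
    await caller.setRemoteDescription(callee.localDescription)

    await done.wait()
    await caller.close()
    await callee.close()

asyncio.run(main())
```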
In his session at WebRTC Summit at @ThingsExpo, Dr. Alex Gouaillard, CEO and Founder of CoSMo Software, presented a comprehensive view of the numerous testing challenges researchers have faced before arriving at the first release candidate of the WebRTC specifications.
WebRTC and Edge Computing | @ExpoDX @NTTCom #IoT #RTC #WebRTC
WebRTC has recently been attracting a lot of attention from the market. Its use cases are expanding: video chat, online education, online healthcare and more. Beyond human-to-human communication, IoT use cases such as machine-to-human communication are also emerging. One typical use case is remote camera monitoring; with WebRTC, people get interoperability and flexibility when deploying a monitoring service.
However, the benefit of WebRTC for IoT is not only its convenience and interoperability. Being based on P2P technology, it has a lot of potential to address current issues around IoT, such as security and connectivity. In his view, it will become a key component, especially in edge computing use cases.
Whose Data Is It Anyway? | @ExpoDX @EFeatherston #GDPR #Cloud #AI #DataCenter #DigitalTransformation
Last year was another banner year for security breaches, sad but true. At times it felt as though a week did not go by without the announcement of another large corporate security breach (one that usually happened sometime in the past), putting the personal information of millions of people potentially at risk. One breach in particular stands out to me: the Equifax breach that occurred last September, in which approximately 145 million individuals’ personal information had potentially been acquired by the hackers.
Blockchain Use Case for 2018 | @ExpoDX @EFeatherston #FinTech #Blockchain #AI #IoT #DigitalTransformation
Supply Chain. Logistics. Provenance. Hot topics around the dinner table with the family I am sure. Okay, maybe not so much. Even so, these all impact our lives daily, while we may be blissfully unaware of it. Take the dinner table for example. That savory grilled salmon filet on your plate may have come all the way from China. Those carrots and asparagus? Possibly shipped in from Peru. The fresh (out of season) fruit? Chile or Mexico could have been the origination point. What gets all those items from their source to your dinner table, while they are still fresh, is all about the supply chain.
Nexenta brings software-defined storage into the cloud
Nexenta is extending its open source-driven software-defined storage (OpenSDS) portfolio with NexentaCloud, a range of cloud-connected storage for businesses wanting to store their data off-premises.
The first product launching in the portfolio will be Nexenta’s Amazon Web Services (AWS) implementation, available from AWS Marketplace. NexentaCloud for AWS allows businesses to take advantage of AWS’s public cloud service to store their applications securely in the cloud.
It supports snapshots, cloning, thin provisioning and data compression, with preconfigured AWS instances, helping businesses make use of powerful business continuity and self-service test/development environments, without the laborious set-up process.
“By further extending our enterprise OpenSDS technology to the cloud, NexentaCloud for AWS lets organisations, especially those who have historically utilised on-premises NAS and SAN server platforms, continue their journey to a hybrid cloud environment,” said Tarkan Maner, chairman and CEO at Nexenta.
“NexentaCloud is the next step in our vision to disrupt the status quo. It enables customers to expand their data storage assets from on-premises to the cloud, and deliver improved manageability, reliability, and scalability with predictive intelligence.”
With NexentaCloud’s analytics, businesses can keep on top of their data centre and cloud environments, the company claimed, making intelligent decisions based on usage.
Nexenta raised $20 million of funding at the tail-end of last year to boost its cloud offerings, and it seems as though this latest product is the first innovation that injection of cash has helped produce. The next products expected to materialise as part of this cloud drive are NexentaStor CloudNAS and NexentaFusion CloudManagement.
Sean Mack Joins @DevOpsSummit Faculty | @xOps #CloudNative #Serverless #DevOps #Agile #DigitalTransformation
For far too long, technology teams have lived in silos: not only physical silos, but cultural silos driven by competing objectives. This includes informational silos, where business users require one set of data and tech teams require different data. DevOps intends to bridge these gaps to make tech-driven operations more aligned and efficient.
Michael Wood Joins @CloudEXPO Faculty | @VMware @VeloCloud @WoodCast #MPLS #SDWAN #SDN #SDDC
We are seeing a major migration of enterprise applications to the cloud. As cloud and business use of real-time applications accelerate, legacy networks are no longer able to architecturally support cloud adoption and deliver the performance and security required by highly distributed enterprises. These outdated solutions have become more costly and complicated to implement, install, manage and maintain. SD-WAN opens up access to the benefits of the cloud and Internet, helping enterprises take advantage of the exploding landscape of cloud applications and services thanks to its unique capability to support all things cloud-related.