Cloud Native Computing Foundation Receives Grant from Google Cloud | @DevOpsSUMMIT #DevOps #Kubernetes

VANCOUVER, Canada, Aug. 29, 2018 /PRNewswire/ — Open Source Summit North America – The Cloud Native Computing Foundation® (CNCF®), which sustains and integrates open source technologies like Kubernetes® and Prometheus™, today announced that Google Cloud has begun transferring ownership and management of the Kubernetes project’s cloud resources to CNCF community contributors. Google Cloud will help fund this move with a $9 million grant of Google Cloud Platform credits, divided over three years, to cover the infrastructure costs associated with Kubernetes development and distribution, such as running the continuous integration and continuous delivery (CI/CD) pipelines and providing the container image download repository.


Sponsorship Opportunities at @KubeSUMMIT Silicon Valley | #CloudNative #Serverless #AWS #Docker #Kubernetes #Microservices

As you know, enterprise IT conversations over the past year have often centered on the open-source Kubernetes container orchestration system. In fact, Kubernetes has emerged as the key technology — and even primary platform — of cloud migrations for a wide variety of organizations. Kubernetes is critical to forward-looking enterprises that continue to push their IT infrastructures toward maximum functionality, scalability, and flexibility. As they do so, IT professionals are also embracing the reality of Serverless architectures, which are critical to developing and operating real-time applications and services. Serverless is particularly important as enterprises of all sizes develop and deploy Internet of Things (IoT) initiatives.


CloudEXPO Introduces Rockstar @KubeSUMMIT Faculty | #CloudNative #Serverless #DataCenter #Monitoring #Containers #DevOps #Docker #Kubernetes

IT professionals are also embracing the reality of Serverless architectures, which are critical to developing and operating real-time applications and services. Serverless is particularly important as enterprises of all sizes develop and deploy Internet of Things (IoT) initiatives.

Serverless and Kubernetes are great examples of the continuous, rapid pace of change in enterprise IT. They also raise a number of critical issues and questions about employee training, development processes, and operational metrics.

There’s a real need for serious conversations about Serverless and Kubernetes among the people who are doing this work and managing it.

So we are very pleased today to announce the ServerlessSUMMIT at CloudEXPO.


StackRox to Highlight Kubernetes Security | @KubeSUMMIT @StackRox #CloudNative #Serverless #DevOps #Docker #Kubernetes #Security

StackRox helps enterprises secure their containerized and Kubernetes environments at scale. The StackRox Container Security Platform enables security and DevOps teams to enforce their compliance and security policies across the entire container life cycle, from build to deploy to runtime. StackRox integrates with existing DevOps and security tools, enabling teams to quickly operationalize container and Kubernetes security. StackRox customers span cloud-native startups, Global 2000 enterprises, and government agencies. StackRox is privately held and headquartered in Mountain View, California. To learn more, visit www.stackrox.com and follow us on Facebook, LinkedIn and Twitter.


MapR Amplifies Power of Kubernetes | @KubeSUMMIT @MapR #CloudNative #Serverless #DevOps #Docker #Kubernetes

Implementation of the Container Storage Interface (CSI) for Kubernetes delivers persistent storage for compute running in Kubernetes-managed containers, future-proofing Kubernetes-plus-storage deployments. Unlike the Kubernetes FlexVolume-based volume plugin, storage is no longer tightly coupled to, or dependent on, Kubernetes releases. Because the storage interface is decoupled entirely from critical Kubernetes components, the result is greater stability and a separation of privileges: CSI components do not need the full privileges of core Kubernetes components. With CSI implemented, the persistent data layer for Kubernetes and for other Container Orchestration (CO) tools, such as Mesos and Docker Swarm, is now future-proofed.
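To make that decoupling concrete, here is a minimal sketch using the official Kubernetes Python client: the application requests storage through a StorageClass that names an external CSI driver, and no core Kubernetes component has to be rebuilt or upgraded to support the storage vendor. The class name `maprfs-csi` is a hypothetical placeholder, not MapR's actual StorageClass.

```python
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# The claim references a StorageClass whose provisioner is an external
# CSI driver. Swapping storage vendors means swapping this class name,
# not patching or upgrading Kubernetes itself.
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="maprfs-csi",  # hypothetical CSI-backed class
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)
core.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
```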


Persistent Storage for Kubernetes | @KubeSUMMIT @Elastifile #CloudNative #Containers #Serverless #DevOps #Docker #Kubernetes

With container technologies widely recognized as the cloud-era standard for workload scaling and application mobility, organizations are increasingly seeking to support container-based workflows. In particular, the desire to containerize a diverse spectrum of enterprise applications has highlighted the need for reliable, container-friendly, persistent storage. However, to effectively complement today’s cloud-centric container orchestration platforms, persistent storage solutions must blend reliability and scalability with a simple, cloud-native user experience. The introduction of Elastifile’s CSI driver addresses these needs by augmenting containerized workflows with highly available, scalable NFS file storage delivered via the Elastifile Cloud File System, with no complex, manual storage provisioning required.
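As a rough illustration of the "no manual provisioning" point, a pod simply mounts a claim and the CSI driver provisions the backing volume on demand. The claim name `app-data` and the ReadWriteMany shared-file access mode below are illustrative placeholders, not Elastifile-specific identifiers.

```python
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# Mount a ReadWriteMany claim (NFS-style shared file storage) into a pod.
# No administrator pre-creates the volume; the CSI driver provisions it.
pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="worker"),
    spec=client.V1PodSpec(
        containers=[client.V1Container(
            name="worker",
            image="busybox:1.36",
            command=["sh", "-c", "ls /data && sleep 3600"],
            volume_mounts=[client.V1VolumeMount(name="shared", mount_path="/data")],
        )],
        volumes=[client.V1Volume(
            name="shared",
            persistent_volume_claim=client.V1PersistentVolumeClaimVolumeSource(
                claim_name="app-data",  # hypothetical claim created beforehand
            ),
        )],
    ),
)
core.create_namespaced_pod(namespace="default", body=pod)
```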


Redis Labs further changes licensing terms – to make developers happy and keep big cloud vendors at bay

Open source database provider Redis Labs has announced the Redis Source Available License (RSAL), a modification of the previous licensing terms for its modules intended to clear up confusion among the open source and developer communities.

The company had in August changed its terms to Apache 2.0 modified with Commons Clause, with more than one eye on the biggest cloud providers, who were packaging Redis technology into proprietary offerings and pocketing the resulting profits.

The move was followed towards the end of last year by similar steps from companies such as MongoDB and Confluent. Writing at the time of the latter’s $2.5 billion valuation, which followed a $125m series D funding round in January, as this publication reported, Confluent co-founder Jay Kreps outlined his company’s position.

“The major cloud providers all differ in how they approach open source,” Kreps wrote in a blog post back in December. “Some of these companies partner with the open source companies that offer hosted versions of their system as a service. Others take the open source code, bake it into the cloud offering and put all their own investments into differentiated proprietary offerings.

“The point is not to moralise about this behaviour; these companies are simply following their commercial interests and acting within the bounds of what the license of the software allows,” Kreps added. “But we think the right way to build fundamental infrastructure layers is with open code.”

Hence the need to tighten things up. Yet the problem Redis Labs found was that the previous terms for its modules – the Redis database project itself remains unchanged – were too open to interpretation, or simply too confusing. Previously, under Apache 2.0 modified with Commons Clause, the rule was that users were not allowed to sell a product or service ‘whose value derives entirely, or substantially, from the functionality of the software.’ But as Redis subsequently noted, how substantial is ‘substantially’, exactly?

The new solution under RSAL is to communicate more clearly that developers can use the software, modify the source code, integrate it with an application, and use, distribute or sell that application. The only restriction is that the application cannot be a database, a caching engine, a stream processing engine, a search engine, an indexing engine, or a machine learning, deep learning, or artificial intelligence-serving engine.

“We are very open to our community,” Ofer Bengal, Redis Labs CEO, told CloudTech. “We got a lot of feedback and responses regarding Commons Clause which made us think there may be a better definition of license for our case.

“When we said [users were] not allowed to sell a product or service… this created concerns with some developers providing services around open source projects, like consulting services and support services,” Bengal added. “In order to get adoption you need to satisfy the needs of developers, and once we heard after we released Commons Clause that some developers weren’t happy – not with the concept but with the way it was presented and copyrighted, the language of the license – that was the point where we thought that we should correct it.

“We hope that once doing that developers would be happier and more receptive to using software under these licenses.”

For some users, however, that ship may have already sailed. In the aftermath of Redis’ original licensing changes, offshoot groups developed, in particular GoodFORM (Free and Open Redis Modules). Led by developers at Debian and Fedora, GoodFORM set out to fork Redis’ code ‘committed to making [it] available under an open source license permanently’ amid fears they were unable to ship Redis’ versions of affected modules to their users.

Bengal’s response to these projects was unequivocal. “With all due respect, they should wake up and smell the coffee,” he said. “They don’t realise that the world has changed and the exact concept of open source is challenging in today’s environment.

“What they have done is just to counter what we have done. They forked the Redis modules that we had at the time, but this means nothing because they have done nothing with it, and I suspect that they cannot do anything with it,” Bengal added. “You must realise that developing a database is a very complex thing, it’s not a small piece of software that someone can develop from his parents’ home garage. There are tons of nuances and complexities, and if you do not devote yourself 24/7 for years to develop a database there is no way you can really contribute to it.”

It has been a busy few days all told for Redis, with a $60m series E funding round confirmed earlier this week. The round, led by new investor Francisco Partners and also featuring existing investors Goldman Sachs Private Capital Investing, Bain Capital Ventures, Viola Ventures and Dell Technologies Capital, is a particularly important one according to Bengal.

“We are now at the stage where we’re seeing that our opportunity is huge,” he said. “The race over market share as the market matures becomes fiercer and fiercer, and in order to have foothold and market share you need to move very quickly and aggressively.

“Compared to our peers, we decided that in order to move faster and accelerate our growth we need to be more aggressive on the sales side, marketing side, and even on the product development side,” Bengal added.

With regards to the cloud behemoths, there may be some light at the end of the tunnel. In a blog post explaining Redis’ latest modules license changes, co-founder and CTO Yiftach Shoolman noted that the company was “seeing some cloud providers think differently about how they can collaborate with open source vendors.” Bengal added that, Amazon Web Services (AWS) aside, ‘the mood is trying to change’, implying that partnerships between some cloud providers and companies behind open source projects may not be too far away.

You can read the full explanation of Redis Source Available License (RSAL) here.

Read more: Confluent's $2.5 billion valuation may provide affirmation amid open source turbulence


Popular password managers found to have serious flaws


Clare Hopping

21 Feb, 2019

Security researchers have revealed that some of the most popular password managers around are also among the most vulnerable, allowing hackers to break in and steal information as easily as if it were stored in a plain text file.

Independent Security Evaluators (ISE) tested a range of password managers – both those embedded in browsers and paid-for software that claims to stop people from stealing passwords. It found that every single tool could be broken into, and so failed to sufficiently protect information as claimed.

“Although password managers provide some utility for storing login/passwords and limit password reuse, these applications are a vulnerable target for the mass collection of this data through malicious hacking campaigns,” ISE chief executive Stephen Bono said.

The company looked in detail at 1Password, Dashlane, KeePass, and LastPass to see how robust they were at securing users from having their credentials stolen. They all work in the same way – “securely” storing passwords so users are able to keep track of their different credentials across services from one place.

However, every single application had “serious” vulnerabilities, including the relative ease with which the master password – the one used to protect all the others from prying eyes – could be stolen. Access to the master password means every other stored password can be easily obtained, making these platforms pretty useless in terms of their core purpose.

All four password managers can be attacked while running in the background and locked by the master password, which is the most common state in which the applications are used. However, the most recent versions of 1Password and Dashlane could be broken into, and all passwords leaked, in both the locked and unlocked states. All four password managers could also be intercepted using keylogger malware.

“People believe using password managers makes their data safer and more secure on their computer,” added ISE executive partner Ted Harrington. “Our research provides a public service to vendors of these widely-adopted products, who must now mitigate against attacks based on the discovered security issues, as well as alert consumers who have a false sense of security about their effectiveness.”

ISE recommends that users properly shut down their password managers when they’re not in use.

“Password managers are an important and increasingly necessary part of our lives. In our opinion, users should expect that their secrets are safeguarded according to a minimum set of standards that we outlined as ‘security guarantees’. Initially our assumption and expectation were that password managers are designed to safeguard secrets in a ‘non-running state’, which we identified as true. However, we were surprised by the inconsistency in secrets sanitisation and retention in memory in a running unlocked state and, more importantly, when placed into a locked state,” ISE concluded in its research.

“If password managers fail to sanitise secrets in a locked running state, then this will be the low-hanging fruit that provides the path of least resistance to successful compromise of a password manager running on a user’s workstation.

“Once the minimum set of ‘security guarantees’ is met, then password managers should be re-evaluated to discover new attack vectors that adversaries may use to compromise password managers, and to examine possible mitigations for them.”
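The sanitisation idea ISE describes can be sketched in a few lines of Python; this is a simplified illustration of the principle, not how any of the tested products is implemented. The point is to keep the master password in a mutable buffer and overwrite it the moment the vault locks, so the plaintext does not linger in process memory.

```python
import getpass

def zeroize(buf: bytearray) -> None:
    """Overwrite every byte so the plaintext no longer sits in process
    memory once the vault enters ISE's 'locked running state'."""
    for i in range(len(buf)):
        buf[i] = 0

# Copy the password into a mutable buffer; immutable str/bytes objects
# cannot be scrubbed in place and linger until the allocator reuses them.
# (Even this is best-effort in Python, since getpass's own temporaries
# are beyond our control -- one reason native implementations matter.)
secret = bytearray(getpass.getpass("Master password: "), "utf-8")
try:
    ...  # derive the vault key from `secret` while unlocked
finally:
    zeroize(secret)  # sanitise on lock, not just on process exit
```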

The cloud in 2020: Enterprise compatibility with edge computing, containers and serverless

In the future, as we speed down the motorway in our self-driving vehicles, historians will mark the 2010s as the decade of the cloud. Some would argue that the tenets of cloud computing were established in the 1960s, when U.S. government scientist J.C.R. Licklider planned an “intergalactic computer network”. In 2006 the cloud experienced a seminal moment, when Amazon entered the space with EC2. However, it is in the 2010s that cloud forged its role as the transformational technology of its generation.

The majority of leading brands embraced the cloud, and even the U.S. Central Intelligence Agency made the move in 2013. Some of the largest digital businesses in the world – Facebook, Netflix, Amazon – are not only cloud native, but achieved huge success because they chose that model.

In 2010 Amazon Web Services, Microsoft and Google – three leading lights of the cloud – had all launched their cloud businesses. The same year OpenStack, the leading open-source software platform for cloud, started as well. Worldwide spending on the public cloud started the decade at $77 billion, according to Statista, and was predicted to conclude it at $411 billion – a fivefold increase. Remarkable momentum, given that the cloud was still in its infancy. In the coming years, the development of business and consumer applications will accelerate from cloud enabled to cloud native – as exciting new cloud technologies flourish. Even the definition of what cloud means is changing, with the addition of edge and hybrid environments.

The world shifts too rapidly to make consistent predictions about the future, but commanding trends are at work, moulding the cloud as we head towards 2020.

Enterprises finalise transition to public cloud

Despite the hype around the cloud, not every business has made the jump. According to Forrester: “Cloud’s impact has been global, yet fewer than half of all enterprises use a public cloud platform.” Yet recent research from 451 Research has shown that it is the financial services industry that is leading in terms of adopting cloud technologies. Faced with the agility of cloud-native disruptors and increased competition, 60% of financial services companies surveyed said they expect to use various cloud platforms in combination with one another – slightly higher than the figure for other businesses (58%).

Edge is not the end of cloud computing – but a natural evolution that will see telcos, manufacturers and more employing it as the new decade dawns

Indeed, as a McKinsey survey noted, even many companies that have adopted the cloud are far from complete operation in it: “While almost all respondents are continuing to build sophisticated cloud programs, there is a clear gap between the leaders (those who have migrated more than 50% of their processing workloads) and the laggards (those who have moved less than 5%)”.

A common concern putting businesses off an enterprise cloud computing strategy is security. Two-thirds of IT professionals state that security is their greatest concern in this respect, according to a study by LogicMonitor. As the decade draws to a close, the industry will seek to strengthen cloud security, and solutions that address compliance and data control needs will trigger adoption from those companies that are still holding out. Indeed, the ability to answer questions around data, as opposed to compute needs, will dictate who provides the most compelling offerings for the enterprise.

The responsibility for security rests mostly with the customer, though, and increasing numbers of them are deploying cloud visibility and control tools to reduce security failures, according to the analyst firm. Strides in machine learning, predictive analysis and artificial intelligence will accelerate the number of large-scale, highly distributed deployments as they become more feasible and secure to manage. Foolproof security does not exist in any computing environment. Nevertheless, increasing numbers of businesses will feel safer working with the cloud, triggering higher adoption rates by 2020 and preparing the path for near-total adoption in the following decade.

The cloud reimagined by edge computing

Mention cloud computing and centralised data centres running thousands of physical servers typically come to mind. However, this vision misses one of the greatest new opportunities for the cloud: distributed cloud infrastructure. As businesses find themselves requiring near-instant access to data and compute resources to serve customers, they are increasingly looking to edge computing.

Edge computing directs certain compute processes away from centralised data centres to points in the network nearer to users, devices and sensors. IDC describes it as a “mesh network of micro data centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet”.

This environment is very valuable for the Internet of Things (IoT), with its requirement to collect and process vast amounts of data in near-real-time with very low latency. Edge computing can lower connectivity costs by sending only the most important information, as opposed to raw streams of sensor data. For example, a utility with sensors on field equipment can analyse and filter the data locally before sending it on, rather than taxing network and computing resources with the full stream.
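A minimal Python sketch of that filtering pattern follows; the ingest URL and anomaly threshold are hypothetical, purely for illustration. Routine readings are summarised at the edge, and only the summary plus outliers travel over the network.

```python
import json
import statistics
from urllib import request

CENTRAL_ENDPOINT = "https://example.com/ingest"  # hypothetical cloud API

def summarise(readings, threshold=1.5):
    """Keep readings more than `threshold` standard deviations from the
    mean; everything routine is reduced to two summary statistics."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings) or 1.0  # avoid division by zero
    anomalies = [r for r in readings if abs(r - mean) / stdev > threshold]
    return {"mean": mean, "stdev": stdev, "anomalies": anomalies}

def push_to_cloud(summary):
    # One small JSON payload instead of the raw sensor stream.
    body = json.dumps(summary).encode("utf-8")
    req = request.Request(CENTRAL_ENDPOINT, data=body,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)

# e.g. a minute of voltage samples from a piece of field equipment
samples = [229.9, 230.1, 230.0, 241.7, 229.8]
push_to_cloud(summarise(samples))
```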

Edge is not the end of cloud computing, but rather a natural evolution that will see telcos, manufacturers and many organisations employing it as the new decade dawns.

Containerisation continues

Containers, which enable developers to manage and easily migrate software code, have become very popular, and that is not going to change over the coming decade. Forrester estimates that a third of enterprises are testing containers for use in production, while 451 Research forecasts that the application containers market will grow 40% annually to $2.7 billion in 2020. Some 53% of organisations are either investigating or using containers in development or in production, according to a Cloud Foundry report.

The majority of businesses are leveraging containers to enable portability between cloud services from AWS, Microsoft Azure and Google Cloud as they firm up their DevOps strategies for more rapid software production.

Kubernetes is making waves in container deployment by building on operating-system-level virtualisation rather than hardware virtualisation. Vendors delivering pragmatic answers without getting caught up in the craze will achieve meaningful market penetration, and the hype around containerisation is going to translate into widespread adoption as the decade turns.

Serverless computing grows in popularity

For some time, organisations have developed applications and deployed them on servers they provision themselves. With serverless computing, a cloud provider manages the code execution, runs the code only when required, and charges only for the time the code is running. With this model, businesses no longer have to worry about provisioning and maintaining servers when putting code into production. (“Serverless” is a somewhat misleading term, as applications still run on servers.)
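In practice, the unit of deployment shrinks to a single function in the provider's required shape. Here is a minimal sketch in the style of an AWS Lambda Python handler; the `name` event field is illustrative.

```python
import json

def handler(event, context):
    """Entry point the platform invokes on demand: there is no server
    to provision, and billing covers only the execution time."""
    name = event.get("name", "world")  # illustrative event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```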

Serverless computing is not going to be an overnight sensation, but more of a natural route taken when usage increases over time

Serverless computing came into being back in 2014 at the AWS re:Invent conference, with Amazon Web Services’ announcement of Lambda, and has recently gained further traction with the open source project Firecracker. Serverless computing is, potentially, a very big development, with one caveat: not everyone is going to be ready for it. Going serverless requires an overhaul of the traditional development and production paradigm. In effect, it means outsourcing entire pieces of infrastructure: everything apart from the application itself.

This will mean that serverless computing is not going to be an overnight sensation, but more of a route taken as usage increases over time. While existing solutions usually lock customers into a specific cloud provider, the arrival of open source alternatives in this space will accelerate adoption and broaden the range of serverless implementations across the industry.

Open source continues its reign

Open source enterprise software has never been more popular. An increasing number of organisations are introducing open source software into their processes and even building entire businesses around it. Black Duck Software’s 2017 survey of executives and IT professionals found that 60% of respondents reported their company’s use of open source had increased over the previous year, and two-thirds of the businesses surveyed contribute to open source projects. The cloud has helped the open source ecosystem thrive: application delivery in the cloud relies on a wide range of open source DevOps tools, aggressive use of build automation, and infrastructure platforms such as OpenStack and Kubernetes.

As cloud adoption increases, open source technologies will carry on boosting innovation for the rest of the 2010s and beyond. Cloud domination has been a staple of this decade, and with so many exciting trends shaping it, the best days of cloud computing certainly seem yet to come.


Support for Multi-Cluster Kubernetes Applications | @KubeSUMMIT @Rancher_Labs #CloudNative #Serverless #DevOps #AWS #Kubernetes

Applications with high availability requirements must be deployed to multiple clusters to ensure reliability. Historically, this has been approximated by pulling nodes from other availability zones into the same cluster; however, if that cluster failed, the application would still become unavailable. Rancher’s support for multi-cluster applications is a significant step forward: users select the application and the target clusters, provide cluster-specific data, and Rancher then initiates deployment to those clusters.
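The underlying pattern can be sketched generically with the Kubernetes Python client. This is not Rancher's API, and the kubeconfig context names are hypothetical; it simply illustrates building one application definition, varying the cluster-specific data, and applying it to each target cluster in turn.

```python
from kubernetes import client, config

TARGET_CONTEXTS = ["us-east", "eu-west"]  # hypothetical kubeconfig contexts

def make_deployment(replicas: int) -> client.V1Deployment:
    """One application definition; cluster-specific data (here, the
    replica count) can vary per target cluster."""
    container = client.V1Container(name="web", image="nginx:1.25")
    return client.V1Deployment(
        metadata=client.V1ObjectMeta(name="web"),
        spec=client.V1DeploymentSpec(
            replicas=replicas,
            selector=client.V1LabelSelector(match_labels={"app": "web"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "web"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

for ctx in TARGET_CONTEXTS:
    # Point the client at each target cluster in turn and deploy, so the
    # application survives the loss of any single cluster.
    config.load_kube_config(context=ctx)
    apps = client.AppsV1Api()
    apps.create_namespaced_deployment(namespace="default",
                                      body=make_deployment(replicas=3))
```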
