Serverless & Kubernetes Summit | @KubeSUMMIT @CloudEXPO #CloudNative #Serverless #DevOps #DataCenter #Monitoring #Docker #Kubernetes

Kubernetes as a container platform is becoming the de facto standard for every enterprise. In my interactions with enterprises adopting a container platform, I come across common questions: How does application security work on this platform? What do I need to secure? How do I implement security in pipelines? What about vulnerabilities discovered at a later point in time? What do newer technologies like Istio Service Mesh bring to the table? In this session, I will address these commonly asked questions, which every enterprise trying to adopt an enterprise Kubernetes platform needs answered in order to make informed decisions.


Microsoft buys Citus Data to boost open source database in the cloud


Clare Hopping

28 Jan, 2019

Microsoft has acquired database business Citus Data in the hope it’ll be able to offer customers access to scalable databases on PostgreSQL.

The company said it wants to offer developers more scalable and flexible systems that can handle larger data volumes faster, and that integrating Citus Data’s software into its platform will allow for more advanced data-led insights.

Citus Data’s core product, Citus, is an open source extension to PostgreSQL that transforms the technology into a distributed database. It’s available as a fully-managed database as a service, as enterprise software, and as a free open source download, making it flexible enough to integrate into a business’s existing infrastructure.

“Together, Microsoft and Citus Data will further unlock the power of data, enabling customers to scale complex multi-tenant SaaS applications and accelerate the time to insight with real-time analytics over billions of rows, all with the familiar PostgreSQL tools developers know and love,” said Rohan Kumar, corporate vice president for Azure Data.

Although PostgreSQL is already available as a managed service on Azure (as well as on AWS), Microsoft will integrate Citus Data so that it’s available for all workloads, whether in the cloud or in on-premises environments.

“We created Citus to transform PostgreSQL into a distributed database—giving developers game-changing performance improvements and delivering queries that are magnitudes faster than proprietary implementations of Postgres,” said Umur Cubukcu, co-founder and CEO of Citus.

Microsoft has been ramping up its engagement with open source businesses, most notably with last year’s acquisition of GitHub, possibly the biggest open source community in the world.

AWS releases Neo-AI code to the open-source world


Clare Hopping

28 Jan, 2019

AWS has released its Neo-AI code as an open source project, encouraging developers and other AI experts to contribute to the platform.

The company explained that usually, ensuring a machine learning model works across a variety of hardware platforms (especially those running on edge networks) is difficult because there are so many factors and limitations to consider.

Even on less complicated devices, there are so many software variations that it can be tricky to make sure machine learning works across all of them. As a result, manufacturers and vendors are limited in which companies they can work with to provide the machine learning tools they require.

With AWS’s Neo-AI, machine learning models built with TensorFlow, MXNet, PyTorch, ONNX, and XGBoost are automatically optimised and converted into a common format so they can run on a wider variety of devices. The models also run faster because Neo-AI uses a compact runtime, limiting the resources a framework would typically consume.

Even when edge devices are constrained by resources, Neo-AI shrinks the footprint a model needs to run. It currently supports platforms from Intel, NVIDIA, and ARM, with support for Xilinx, Cadence, and Qualcomm arriving later in the year.

“To derive value from AI, we must ensure that deep learning models can be deployed just as easily in the data center and in the cloud as on devices at the edge,” said Naveen Rao, general manager of the artificial intelligence products group at Intel.

“Intel is pleased to expand the initiative that it started with nGraph by contributing those efforts to Neo-AI. Using Neo, device makers and system vendors can get better performance for models developed in almost any framework on platforms based on all Intel compute platforms.”

Announcing @IsomorphicHQ to Exhibit at @CloudEXPO Silicon Valley | #Cloud #AI #CIO #IoT #SmartClient #DevOps #ArtificialIntelligence

Isomorphic Software is the global leader in high-end, web-based business applications. We develop, market, and support the SmartClient & Smart GWT HTML5/Ajax platform, combining the productivity and performance of traditional desktop software with the simplicity and reach of the open web.

With staff in 10 time zones, Isomorphic provides a global network of services related to our technology, with offerings ranging from turnkey application development to SLA-backed enterprise support.

Leading global enterprises use Isomorphic technology to reduce costs and improve productivity, developing & deploying sophisticated business applications with unprecedented ease and simplicity.


SUSE to Present OpenStack and Kubernetes Track at @CloudEXPO | @SUSE @CSeader #OpenStack #Serverless #Kubernetes

Take advantage of autoscaling and high availability for Kubernetes without worrying about infrastructure. Be the rock star and avoid all the hurdles of deploying Kubernetes. Why not use Heat to automate the setup of your Kubernetes cluster? Why not give project owners a Heat stack to deploy Kubernetes whenever they want to?

In this session, I hope to share how anyone can use Heat to deploy Kubernetes on OpenStack and customise it to their liking.

This is a tried and true method that I’ve used on my OpenStack clusters and I will share the benefits, bumps along the way and the lessons learned.
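A Heat stack for this can be sketched in a few lines. The template below is illustrative only: the image, flavor, and network names are placeholders, and the single-node k3s install is an assumption for the sketch, not a detail from the talk.

```yaml
heat_template_version: 2018-08-31

description: Minimal sketch of a Heat stack that boots one Kubernetes node

parameters:
  image:
    type: string
    default: ubuntu-20.04   # placeholder image name
  flavor:
    type: string
    default: m1.large       # placeholder flavor
  network:
    type: string
    default: private        # placeholder tenant network

resources:
  kube_node:
    type: OS::Nova::Server
    properties:
      image: { get_param: image }
      flavor: { get_param: flavor }
      networks:
        - network: { get_param: network }
      user_data_format: RAW
      user_data: |
        #cloud-config
        runcmd:
          # single-node cluster via k3s; swap in kubeadm or similar as preferred
          - curl -sfL https://get.k3s.io | sh -
```

Project owners could then launch their own cluster with `openstack stack create -t kube.yaml my-cluster`.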


Monitoring with AI Presentation Slides | @CloudEXPO @Dynatrace @AloisReitbauer #DevOps #Monitoring #AI #ArtificialIntelligence

Today we can collect lots and lots of performance data. We build beautiful dashboards and even have fancy query languages to access and transform the data. Still, performance data is a secret language only a handful of people understand. The more business becomes digital, the more stakeholders are interested in this data, including how it relates to the business. Some of these people have never used a monitoring tool before. They have a question on their mind like “How is my application doing?” but no idea how to get a proper answer.


Lori MacVittie @DevOpsSUMMIT Presentation | @LMacVittie @F5Networks #DevOps #Monitoring #Microservices

Cloud, containers, and their second cousin, DevOps, are disrupting the data center. Not just because applications are leaving, but because of the demands and expectations of dev and ops on IT. Modernization – not just migration – is critical for the data center to add value to the business. This session will explore these demands and expectations and provide both architectural and operational guidance to making the changes necessary to maintain relevance in a cloudy, containerized, and DevOps-driven world. If you’re good with that, let’s run with it.


Dion Hinchcliffe @CloudEXPO Keynote | @DHinchcliffe #CIO #IoT #IIoT #FinTech #SmartCities #DigitalTransformation

Most organizations are awash today in data and IT systems, yet they’re still struggling mightily to use these invaluable assets to meet the rising demand for new digital solutions and customer experiences that drive innovation and growth. What’s lacking are potent and effective ways to rapidly combine on-premises IT and the numerous commercial clouds that the average organization has in place today into effective new business solutions. New research shows that delivering multicloud experience creation sustainably and cost-effectively at scale is the single most important way to meet this existential challenge, creating rapid business value and sustaining relevance in the market. Yet the majority of organizations have been slow to put the needed delivery capabilities in place.


Hacking Into IoT Session at @CloudEXPO Silicon Valley | @DonMalloy #Cloud #CIO #IoT #IIoT #SmartCities #DigitalTransformation

This presentation will cover the history of how we got here and which IoT devices are most vulnerable, demonstrating where hacks are most successful – through hardware, software, firmware or the radio connected to the network – and explaining the hacking of IoT devices and systems in six basic steps.

On the other side, protecting devices continues to be a challenge. Product vendors, developers and customers are all responsible for improving IoT device security.

The top 10 vulnerabilities will be presented and discussed.


Benefits of AI and machine learning for cloud security


Grace Halverson

25 Jan, 2019

It takes a year and almost £3 million to contain the average data breach, according to a 2018 study by the Ponemon Institute. And despite growing cloud adoption, many IT professionals still highlight the cloud as the primary area of vulnerability within their business.

To combat this and lower their chances of experiencing a breach, some companies are turning to AI and machine learning to enhance their cloud security.

AI, or artificial intelligence, is software that can solve problems and think by itself in a way that’s similar to humans. Machine learning is a subset of AI that uses algorithms to learn from data. The more data patterns it analyses, the more it processes and self-adjusts based on those patterns, and the more valuable its insights become.

While not a silver bullet, this approach shifts practices from prevention alone to real-time threat detection, putting companies and cloud service providers a step ahead of cyber attackers. Here are some of the benefits.




Big Data Processing

Cybersecurity systems produce massive amounts of data—more than any human team could ever sift through and analyse. Machine learning technologies use all of this data to detect threat events. The more data processed, the more patterns it detects and learns, which it then uses to spot changes in the normal pattern flow. These changes could be cyber threats.

For example, machine learning takes note of what’s considered normal, such as when and where employees log into their systems, what they access regularly, and other traffic patterns and user activities. Deviations from these norms, such as logging in during the early hours of the morning, get flagged, meaning potential threats can be highlighted and dealt with more quickly.
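As a toy illustration of that idea, the sketch below flags a login hour that deviates sharply from a user’s history. The data and the threshold are invented for the example, not taken from any real product.

```python
from statistics import mean, stdev

# Hypothetical login hours (24h clock) observed for one employee
# over recent sessions -- illustrative data only.
login_hours = [9, 9, 10, 8, 9, 10, 9, 8, 10, 9]

def is_anomalous(hour, history, threshold=3.0):
    """Flag a login whose hour deviates strongly from the user's norm."""
    mu = mean(history)
    sigma = stdev(history)
    return abs(hour - mu) > threshold * sigma

print(is_anomalous(3, login_hours))   # 3 a.m. login -> True (flagged)
print(is_anomalous(9, login_hours))   # typical hour -> False
```

Real systems learn far richer baselines (location, device, access patterns), but the flag-on-deviation principle is the same.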

Event Detection and Blocking

When AI and machine learning technologies process the data generated by the systems and find anomalies, they can either alert a human or respond by shutting a specific user out, among other options.

By taking these steps, events are often detected and blocked within hours, shutting down the flow of potentially dangerous code into the network and preventing a data leak. Examining and correlating data across geographies in real time can give businesses days of warning and time to act ahead of security events.




Delegating to Automation

When security teams let AI and machine learning technologies handle routine tasks and first-level security analysis, they are free to focus on more critical or complex threats.

This does not mean these technologies can replace human analysts, as cyber attacks often originate from both human and machine efforts and therefore require responses from both humans and machines as well. However, it does allow analysts to prioritise their workload and get their tasks done more efficiently.