Inmarsat and Microsoft team up on network-sharing deal


Clare Hopping

26 Feb, 2019

Inmarsat and Microsoft have announced a collaboration that will see the former’s customers transfer data collected via their IoT solutions to the latter’s Azure IoT platform.

Businesses in the agriculture, mining, transportation, and logistics sectors will be able to connect the data they’ve collected with cloud-based apps to gain more insight into their customers, their supply chain and the market in general.

Microsoft customers will also be able to access Inmarsat’s global satellite communications network to transfer their data to the cloud.
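As an illustration of what connecting a remote device to Azure IoT involves, devices typically authenticate to Azure IoT Hub with a shared access signature (SAS) token. The following is a minimal, standard-library-only sketch of generating one; the hub hostname, device ID, and key are invented placeholders, not anything from the partnership announcement:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(uri: str, key: str, expiry_seconds: int = 3600) -> str:
    """Build an Azure IoT Hub shared access signature for a resource URI.

    The string to sign is the URL-encoded resource URI, a newline, and the
    Unix expiry time, signed with HMAC-SHA256 using the base64-decoded key.
    """
    expiry = int(time.time()) + expiry_seconds
    encoded_uri = urllib.parse.quote(uri, safe="")
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = hmac.new(base64.b64decode(key), to_sign, hashlib.sha256).digest()
    sig = urllib.parse.quote(base64.b64encode(signature), safe="")
    return f"SharedAccessSignature sr={encoded_uri}&sig={sig}&se={expiry}"

# Hypothetical hub and device; the key here is just base64-encoded test data.
token = generate_sas_token(
    "myhub.azure-devices.net/devices/sensor-01",
    base64.b64encode(b"not-a-real-key").decode(),
)
```

The resulting token is passed in the MQTT or HTTPS `Authorization` header when the device (here, one on a satellite backhaul) sends telemetry to the hub.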

“Our relationship with Microsoft Azure will provide customers with the reliable global connectivity and cloud services they need to take advantage of the Internet of Things wherever they are,” said Paul Gudonis, president of Inmarsat Enterprise.

“Industrial IoT solutions have the potential to bring transparency and intelligence to the global supply chain and by partnering with Microsoft we are making it easier and faster than ever for businesses from all sectors to exploit the technology and achieve competitive advantage.”

Inmarsat and Microsoft plan to extend the partnership to include other verticals in the future, but feel these key areas will be of most benefit at present.

“Microsoft Azure is being built as the world’s computer; extending the reach of our cloud through IoT and intelligent edge services,” added Sam George, director of Azure IoT.

“Our goal is to enable customers to take advantage of connected IoT solutions no matter where they are in their journey. With Inmarsat, customers across industries from agriculture to mining and logistics sectors, will benefit from the power of the intelligent cloud and intelligent edge with global satellite connectivity in the most remote parts of the globe.”

Alibaba Cloud debuts new data intelligence tools


Clare Hopping

26 Feb, 2019

Alibaba Cloud has introduced a set of tools it claims will make it easier for businesses to take advantage of data intelligence and gain greater insight into the running of their business.

Businesses in the retail, fintech, logistics, media and entertainment, digital branding, and marketing sectors are expected to be the primary beneficiaries of the new tech.

Alibaba’s Realtime Compute processes streaming events for use cases such as fraud detection, social analytics, and QoS monitoring of telco networks, and presents the results as business insights that can be acted on based on the behaviour observed.
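Alibaba doesn’t detail Realtime Compute’s internals, but the core pattern behind this kind of stream-processing alert is windowed event aggregation, which can be sketched in a few lines. The fraud threshold and account IDs below are invented purely for illustration:

```python
from collections import deque

class SlidingWindowCounter:
    """Count events per key within a rolling time window - the basic
    building block behind stream-processing alerts such as fraud detection."""

    def __init__(self, window_seconds: int):
        self.window = window_seconds
        self.events = {}  # key -> deque of event timestamps

    def record(self, key: str, ts: float) -> int:
        q = self.events.setdefault(key, deque())
        q.append(ts)
        # Evict events that have fallen out of the window.
        while q and q[0] <= ts - self.window:
            q.popleft()
        return len(q)

# Flag a (hypothetical) account making more than 3 payments in 60 seconds.
counter = SlidingWindowCounter(window_seconds=60)
stream = [("a", 0), ("a", 10), ("b", 20), ("a", 30), ("a", 40)]
alerts = [acct for acct, ts in stream if counter.record(acct, ts) > 3]
```

A production engine like Realtime Compute adds distribution, fault tolerance and SQL-style windows on top, but the per-key rolling count is the same idea.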

For businesses that need to manage and analyse large swathes of data, Alibaba has created DataWorks and MaxCompute 2.0, which together can process up to 100 petabytes of data a day.

For businesses with unstructured data sets, Data Lake Analytics can be used to query massive data lakes while paying only for the data scanned.

“Businesses around the world are increasingly relying on data intelligence to drive innovation, digitalise operations, and delight customers,” said Henry Zhang, senior staff product manager at Alibaba Cloud International.

“We work with customers from many industries along this digital transformation journey. We are keen to turn our proven in-house technology into broadly applicable services and pass the benefits on to customers globally so they can quickly build applications on top, such as for 5G, edge computing, and IoT, and shorten the time-to-market.”

Alibaba used this week’s Mobile World Congress (MWC) event in Barcelona to unveil a number of other tools, including ApsaraDB for MariaDB TX, which allows businesses to deploy enterprise, database-centric applications, and SQL Server Enterprise Always On, which lets organisations take advantage of high availability and disaster recovery plans.

Cloud Parallel File System helps businesses deploying high-performance computing workloads achieve the highest levels of efficiency, while Elastic Container Instance makes it easier for Alibaba customers to run containers.

Why the future of application deployment is not a binary choice

Last year, an interesting headline appeared on Forbes.com: ‘On-premise is dead. Long live on-premise’. The author explained that although public cloud is pervasive, on-prem is certainly not dead. Pundits who declared on-prem computing dead and buried several years ago started to backtrack in 2018.

Industry analysts seem to agree. Now it looks like even public cloud vendors recognise the market is not satisfied with only public cloud; therefore, they are pursuing on-prem opportunities. With the announcement of AWS Outposts, AWS made the tacit admission that not everything is moving to the public cloud.

Choosing a cloud experience along a continuum of options

Yet, is embracing these same two options a flawed view moving forward? Historically, technology limited organisations to a binary choice: Go to the public cloud or stay on premises. In the future, businesses large and small will have a broad continuum of options of locations and consumption models. And the lines separating all of those options will blur as IT embraces the cloud experience.

As organisations evaluate their best deployment and consumption options, they are finding that cloud is no longer a destination. Instead, it is a new way of doing business that focuses on speed, scalability, simplicity, and economics. This type of business model allows IT to distribute data and apps across a broad continuum of options. In the past, IT routinely optimised infrastructure to positively impact applications. In the future, IT will manage applications to deliver the best experience wherever the infrastructure is located.

Start with data and applications and then choose options

Data and applications should be placed where each performs best – amidst changing requirements of service delivery, operational simplicity, guaranteed security, and optimisation of costs. For example, the General Data Protection Regulation dramatically changed how many global businesses secure customer data, which often leads to changes in deployments of certain applications. And since change is constant, IT will need to be agile when choosing deployment options.

Smart organisations will deploy applications and data on the continuum — between the two ends of the public cloud and on-prem spectrum. Some organisations may choose colocation (colo) because it provides the infrastructure and security of a dedicated data center without the costs of maintaining a facility. Other workloads are best served in a private cloud using traditional on-prem with consumption-based pricing, which could save the business from an unplanned capital expenditure. Another application may demand high security and control, yet flexibility and scalability, which would make on-prem private cloud the best alternative.

Clearly, having choices provides the opportunity to select the best deployment and consumption model for each individual application. And as needs change, deployment and consumption models will also change. The beauty of having numerous choices is that it gives organisations more flexibility to manage costs, security, and technical needs.

Complexity requires expertise and a trusted advisor

Keep in mind that more choices can mean more complexity. In addition, the lines between all of these options are blurring, which can confuse things further. For example, certain models give customers the flexibility to pay for what they use but still manage the infrastructure themselves, which fits neither the traditional on-prem nor the public cloud model.

It’s often difficult for an organisation to develop a new mindset, one that adapts to IT changes quickly. Too many times, they are constrained by legacy thinking, infrastructure, and tools. Better training and tools from industry experts can solve these issues.

To adjust to this often complex yet agile environment, many businesses will need help and should seek out knowledgeable partners that provide the richness of options along the continuum. A successful organisation will need both professional services and tools to help support as many options as possible.

A continuum of choices opens the door to new opportunities

It’s time to stop limiting choices to only on-prem versus public cloud. Instead, consider all the options available for new opportunities and long-term success. Compliance, cost, performance, control, complexity of migration – all of these factors will determine the right mix for deployment of data and applications. To help with this determination, businesses should seek help from partners who can give them expertise in as many choices as possible.


Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Microsoft unveils HoloLens 2 with an enterprise laser focus


Bobby Hellard

25 Feb, 2019

Microsoft announced HoloLens 2 at MWC on Sunday, the follow-up to its original mixed reality headset.

The announcement was made by Alex Kipman, a Microsoft Technical Fellow and the creator of the first HoloLens, who said that customers had been consistently asking Microsoft for three things to improve the HoloLens.

First was more immersion: people wanted to see more of the holographic landscape the headset presents. Second was comfort, so users could stay immersed for longer periods of time. And third was industry-leading value right out of the box.

Microsoft took this on board and improved the headset’s field of view to more than double that of the first-generation HoloLens, delivering roughly 2,000 pixels per eye while still maintaining the original’s pixel density. Ten-point touch interaction with holograms, complete with the ability to sense hand movements with greater fidelity, has also been added, along with a new UI model that allows users to interact with virtual buttons and lets holograms follow the user.

Microsoft sees HoloLens 2 as an enterprise device for front-line workers, touting the main changes, such as the expanded field of view, built-in AI tools and direct manipulation of holograms, as what will enable the headset to achieve this. The $3,500 device will be the front end for apps such as Dynamics 365 Remote Assist, Dynamics 365 Layout and Dynamics 365 Guides. More importantly, Microsoft has said it will take an open approach to third-party developers.

While the news that the new device will allow more immersion and comfort will go down well with all, the integration with Azure and Dynamics will be the bigger plus for the enterprise arena.

“Microsoft’s new HoloLens device will bring the enterprise mixed reality market to its next level of adoption,” said J.P. Gownder, VP and principal analyst at Forrester. “The device itself solves many problems associated with the first model – from a vastly expanded field of view to better hand gestures. But it’s the integration with Azure and Dynamics that will empower developers to create powerful mixed reality experiences more quickly and cheaply.

“For the old HoloLens, reducing the number of polygons forming a hologram to a manageable number was challenging and expensive. With cloud tools, developers can render an object in, say, 50,000 polygons on the device but also can render it in the Azure cloud with one million polygons visible to the user. This turbocharges the opportunity for creating holograms that workers can interact with during their daily jobs, helping them solve previously intractable problems.”

CoreOS Eases Adoption of Kubernetes | @DevOpsSUMMIT @RedHat @CoreOS #DevOps #CloudNative #Serverless #Docker #Kubernetes

CoreOS extends CoreOS Tectonic, the enterprise Kubernetes solution, from AWS and bare metal to more environments, including preview availability for Microsoft Azure and OpenStack. CoreOS has also extended its container image registry, Quay, so that it can manage and store complete Kubernetes applications, which are composed of images along with configuration files. Quay now delivers a first-of-its-kind Kubernetes Application Registry that with this release is also integrated with Kubernetes Helm so that deployment of an application can be completely automated.


Fully Managed Apache Kafka and Apache Flink | @KubeSUMMIT @EventadorLabs #Kafka #Apache #Serverless #Kubernetes

As Apache Kafka has become increasingly ubiquitous in enterprise environments, it has become the de facto backbone of real-time data infrastructures. But as streaming clusters grow, integrating with various internal and external data sources has become increasingly challenging. Inspection, routing, aggregation, data capture, and management have all become time-consuming, expensive, poorly performing, or all of the above. Eventador Elements erases this burden by allowing customers to easily deploy fully managed discrete plug-ins that make streaming infrastructures a true hub for democratizing data across the enterprise. Eventador Elements provides unprecedented simplicity with the ability to eliminate any worry about the underlying infrastructure, configuration or management: it is all handled and managed by Eventador in a true cloud-native fashion.


Building & Structuring Teams for Successful DevOps Adoption | @DevOpsSUMMIT @MThreeC @ConorDevOps #DevOps #Monitoring

Conor Delanbanque has been involved with building & scaling teams in the DevOps space globally. He is the Head of DevOps Practice at MThree Consulting, a global technology consultancy. Conor founded the Future of DevOps Thought Leaders Debate. He regularly supports and sponsors Meetup groups such as DevOpsNYC and DockerNYC.


Check Point exposes yet more shared responsibility misunderstandings for cloud security

Almost one in five organisations polled by cybersecurity solutions provider Check Point Software say they have fallen victim to a cloud security incident over the past year, while more than a quarter still believe security is the responsibility of the cloud provider.

These and other worrying findings appear in Check Point’s latest study, the 2019 Security Report, of which this is the third instalment. The report, which combines Check Point’s own data with survey responses from IT professionals and C-level executives, also found that more than half (59%) of IT respondents polled did not use mobile threat defences.

The report pulls no punches in its analysis. The first section, titled ‘cloud is your weakest link’, explores how cloud services are vulnerable across three main attack vectors: account hijacking, malware delivery, and data leaks. Citing a study from Dome9 – acquired by Check Point last year – which found that 91% of organisations were concerned about cloud security, the report notes how exposure and default security settings remain an issue.

“65% of IT professionals still underestimate the damage they can cause,” the report explained. “The obvious concern is that organisations are not taking cloud security seriously enough. The breach of sensitive data held in the cloud is a huge risk for an organisation, and threat actors know it. The rate of cyber attacks against cloud-based targets is growing, and with little sign it will slow down.”

The statistic causing the most concern is that three in 10 respondents said security was primarily the responsibility of the cloud service provider. This, as the report noted, ‘negates recommendations’ over shared, or mutual, responsibility.

This is a viewpoint that persists even though cloud providers have tried to shoulder some of the burden themselves. In November, Amazon Web Services (AWS) launched Amazon S3 Block Public Access, which lets users block public access at the account level, on individual buckets, and on buckets created in future.

The move was to ensure users handled public buckets and objects ‘as needed while giving tools to make sure [users] don’t make them publicly accessible due to a simple mistake or misunderstanding’, in the words of AWS chief evangelist Jeff Barr at the time. Previously, AWS had revamped its design to include bright orange warning indicators to signify which buckets were public.
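For reference, Block Public Access boils down to four account- or bucket-level settings. Below is a sketch of the configuration dict that boto3’s `put_public_access_block` call accepts; it is built locally, with no AWS call made:

```python
# The settings behind S3 Block Public Access. With boto3 this dict would be
# passed as PublicAccessBlockConfiguration to s3.put_public_access_block(...);
# here it is only constructed, never sent.
public_access_block = {
    "BlockPublicAcls": True,        # reject new public ACLs on buckets/objects
    "IgnorePublicAcls": True,       # treat any existing public ACLs as private
    "BlockPublicPolicy": True,      # reject bucket policies granting public access
    "RestrictPublicBuckets": True,  # limit access to public-policy buckets
}
```

Applying all four at the account level is the blanket "no public buckets, ever" posture the feature was designed to make easy.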

“As nearly 20% of organisations have experienced a cloud incident in the past year, it’s clear that criminals are looking to exploit these security gaps,” said Zohar Alon, head of the cloud product line at Check Point. “By reviewing and highlighting these developments in the report, organisations can get a better understanding of the threats they face, and how they prevent them impacting on their business.”

You can read the full report here (email required).


Check Point reveals cloud and mobile security threats are growing


Clare Hopping

22 Feb, 2019

Criminals are increasingly targeting the cloud and mobile environments of businesses because they’re the least protected infrastructure, Check Point has revealed.

The insights are backed up by the cyber security company’s 2019 Security Report, which found that almost a fifth of businesses experienced a security incident over the last 12 months, including data leaks, breaches and malware.

“The third installment of our 2019 Security Report shows just how vulnerable organizations are to attacks targeting their cloud and mobile estates, because there is a lack of awareness of the threats they face and how to mitigate them,” said Zohar Alon, head of cloud product line at Check Point Software Technologies. “As nearly 20% of organizations have experienced a cloud incident in the past year, it’s clear that criminals are looking to exploit these security gaps.”

So many businesses are being targeted because 30% believe it is the responsibility of their cloud provider to protect them against threats, which means they are not sufficiently protecting their cloud infrastructure themselves. However, cloud providers widely recommend that the duty to protect a cloud environment is shared between the provider and the customer.

The most prevalent cloud threat is misconfiguration of cloud platforms, highlighted by 62% of businesses, with unauthorised access to cloud platforms cited as a problem by 55% of IT professionals. Those questioned also said insecure interfaces and APIs were a big problem in their organisations.

Mobile environments are also more at risk because businesses are failing to use mobile defences to protect their infrastructure, whether that’s by implementing malware detection or monitoring the usage of devices for system vulnerabilities.

In fact, less than 10% of IT professionals thought mobile threats presented a significant risk to their business, failing to recognise that malware can very easily propagate between mobile devices on a central network.

“By reviewing and highlighting these developments in the Report, organizations can get a better understanding of the threats they face, and how they prevent them impacting on their business,” Alon said.

Google introduces hybrid cloud platform for flexible environments


Clare Hopping

22 Feb, 2019

Google has launched itself into the hybrid cloud arena with Cloud Services Platform (CSP), designed to help businesses take advantage of mixed on-premise and cloud services at once.

CSP is built upon Google Kubernetes Engine (GKE) and integrates GKE On-Prem, which allows Google’s services to be pushed through to a business’s own datacentres if it doesn’t want to, or its sensitive data doesn’t allow it to, operate solely in the cloud.

It will especially be useful for highly-regulated businesses that need to keep ownership of their data, but also companies that need to run legacy applications in their own datacentre, rather than in the cloud.

Alongside the ability to run applications in the cloud or on-premise, CSP also introduces other services that will help developers, security professionals and the IT team operate more efficiently.

For example, CSP Config Management enables the IT department to create multi-cluster policies out of the box to make sure that every cluster has the correct access controls and resource quotas set up. Security teams can monitor usage of the CSP environment, tracking any changes and being alerted to anything suspicious.
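As an illustration of the kind of per-cluster policy this refers to, a standard Kubernetes ResourceQuota object caps resource usage within a namespace; the namespace name and limits below are invented for the example:

```yaml
apiVersion: v1
kind: ResourceQuota
metadata:
  name: team-a-quota
  namespace: team-a        # hypothetical team namespace
spec:
  hard:
    requests.cpu: "10"     # total CPU requested across the namespace
    requests.memory: 20Gi  # total memory requested across the namespace
    pods: "50"             # maximum number of pods
```

A config-management tool like CSP’s then syncs objects like this from a central repository so every cluster enforces the same policy.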

Stackdriver Monitoring offers a single management console for teams to keep tabs on both environments at once, and compatibility with the GCP Marketplace offers open source and commercial Kubernetes applications with templates to make deploying applications faster.

“Built on open APIs, CSP is a less disruptive and more compliant approach than competing hybrid offerings,” Eyal Manor, vice president of Google Cloud said. “CSP gives you the freedom to modernize your applications at your own pace, innovate faster, and improve operational security and governance.

“Now that our customers have started to modernize their applications in their own data centers with CSP, we believe it will be the enterprise application deployment platform of choice for many years to come.”