Using hybrid cloud to power your business: A guide

Organisations are under increasing pressure to adapt rapidly and innovate in order to keep pace with their competitors. Businesses are being forced to move faster and faster, with the only constant being change: changing infrastructure, changing strategies and changing technologies. Transforming into a ‘digital business’ by adopting cloud services and platforms is no longer optional; failing to do so puts an organisation’s survival at risk.

It is clear that cloud has become a key enabler of strategic success, but not everyone’s cloud journey looks the same. Different businesses have different ambitions, and their journeys are rarely as straightforward as simply deploying servers and virtual machines into the cloud. The move can be difficult for a number of reasons, including the choice of platforms and decisions over where data should sit.

One option organisations should consider is the hybrid cloud model: Gartner has predicted that by 2020, 90% of organisations will have adopted some form of hybrid cloud. It is therefore important to understand what hybrid cloud is and how enterprises benefit from it.

What is hybrid cloud?

Forrester research describes hybrid cloud as: “one or more public clouds connected to something in my data center. That thing could be a private cloud; that thing could just be traditional data center infrastructure.” To put it simply, a hybrid cloud is a mash-up of on-premises and off-premises IT resources.

To expand on that, hybrid cloud is a cloud computing environment that connects a mix of public cloud, private cloud and on-premises infrastructure. A key advantage of this model is that it allows workloads and data to move between private and public clouds as demands and costs change, giving businesses the flexibility they need.
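
To make this concrete, here is a minimal, illustrative Python sketch of the kind of placement logic a hybrid model enables; the Workload class and choose_platform rules are hypothetical, not a description of any particular product or policy engine.

```python
# Minimal sketch of a hybrid-cloud placement policy (illustrative only;
# the Workload class and choose_platform rules are hypothetical).
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    contains_sensitive_data: bool   # e.g. regulated or customer data
    is_bursty: bool                 # demand spikes that suit public cloud elasticity

def choose_platform(w: Workload) -> str:
    """Return a target platform for a workload based on simple rules."""
    if w.contains_sensitive_data:
        return "private-cloud"      # keep sensitive data on private infrastructure
    if w.is_bursty:
        return "public-cloud"       # scale elastic workloads in the public cloud
    return "on-premises"            # steady, non-sensitive workloads can stay put

if __name__ == "__main__":
    for w in [Workload("payments-db", True, False),
              Workload("marketing-site", False, True),
              Workload("batch-reporting", False, False)]:
        print(f"{w.name} -> {choose_platform(w)}")
```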

There is no single hybrid cloud model that works for every organisation; each model should fit the unique needs of the company adopting it. By allowing multiple deployment models, a hybrid cloud gives organisations a seamless and secure environment that enables productivity and flexibility.

Why choose hybrid cloud?

It is not usually feasible for businesses to go all in and move completely to the cloud straight away, unless they happen to be cloud-native organisations. That does not mean enterprises with legacy systems cannot make headway with the cloud: they can adopt a mixture of public and private clouds, combined with hosted, colocated and on-premises infrastructure where necessary.

Hybrid cloud allows organisations to enjoy the advantages of both types of cloud. By spreading workloads and data across both sets of resources, they can optimise the environment while keeping everyday functions as streamlined as possible. Enterprises can decide for themselves which data should be stored in the public cloud, while keeping sensitive data in the private cloud. Access to the benefits of both private and public clouds is ideal for organisations wanting to grow at the speed they need.

Hybrid solutions grant businesses the key element that they need: control. Control to optimise their IT investment by selecting the best-fit infrastructure for different categories of workload. Control to choose where their most critical data should reside. Control to spread their workloads across multiple platforms and avoid the risk of vendor lock-in that comes with a single-platform strategy.

What next?

Hybrid cloud is set to evolve in the years to come. Automation, machine learning and artificial intelligence will increasingly be incorporated into cloud platforms, and this will change the way cloud environments are managed and maintained.

Before choosing a hybrid cloud model, an organisation needs to understand exactly why it is doing so, the impact it will have on the business and how it will carry out the transformation. Moving to the cloud is not just a technology upgrade but a complete change of mindset that affects the entire business, from technology and processes to employees and skills. It is vital to choose the right partner to help navigate this journey and to ensure cloud investment enables the organisation to achieve its objectives.

Microsoft to shift Cortana focus for enterprise


Bobby Hellard

16 Jul, 2019

Microsoft has a new vision for Cortana that involves a shift in focus towards enterprise customers and further integrations with Amazon’s Alexa.

The changes will see Cortana focus on conversations and interactions within the software and services the company offers to businesses, with the assistant becoming one of multiple digital assistants available on Windows 10 in the future.

This is because Microsoft is opening Windows 10 even further to third-party digital assistants. In the next update to Windows 10, due in September, voice assistants like Alexa will be able to activate from their wake words on the lock screen, so calling out ‘Alexa’ will wake your locked laptop as well as your Amazon Echo device.

These digital assistant changes will be in the September update, codenamed 19H2. Unlike previous Fall updates, this one will be a lot smaller in size with fewer new features and changes.

Although the changes are minor, it does show that Microsoft is willing to work with its rivals, particularly as its own voice assistant, Cortana, hasn’t been as popular as the likes of the Google Assistant.

Earlier in the year, Amazon enabled the Alexa wake word in its Windows 10 app. Microsoft made changes of its own too, moving Cortana into a separate app in the Microsoft Store and away from the operating system’s built-in search experience, which was arguably one of the more annoying features of Windows 10.

For Amazon, Alexa has been widely adopted in both the home and business. Most recently, the digital voice assistant was announced as an asset for the NHS, where it will be used within the service and as an alternative to calling your GP.

10 charts that will change your perspective of AI in marketing

  • Top-performing companies are more than twice as likely to be using AI for marketing (28% vs. 12%) according to Adobe’s latest Digital Intelligence Briefing.
  • Retailers are investing $5.9B this year in AI-based marketing and customer service solutions to improve shoppers’ buying experiences according to IDC.
  • Financial Services marketers lead all other industries in AI application adoption, with 37% using these applications today.
  • Sales and Marketing teams most often collaborate using Configure-Price-Quote (CPQ) and Marketing Automation AI-based applications, with sales leaders predicting AI adoption will increase 155% across sales teams in two years.

Artificial Intelligence enables marketers to understand sales cycles better, correlating their strategies and spending with sales results. AI-driven insights are also helping to break down data silos so marketing and sales can collaborate more closely on deals. Marketing is more analytics- and quant-driven than ever before, with the best CMOs knowing which metrics and KPIs to track and why they fluctuate.

The bottom line is that machine learning and AI are the technologies CMOs and their teams need to excel today. The best CMOs balance the quant-intensive nature of running marketing with the qualitative factors that make a company’s brand and customer experience unique. With greater insight into how prospects decide when, where, and how to buy, CMOs are bringing a new level of intensity to driving outcomes. An example of this can be seen in the recent Forbes Insights and Quantcast research, Lessons of 21st-Century Brands: Modern Brands & AI Report (17 pp., PDF, free, opt-in). The study found that AI enables marketers to increase sales (52%), increase customer retention (51%), and succeed at new product launches (49%). AI is making solid contributions to improving lead quality, persona development, segmentation, pricing, and service.

The following ten charts provide insights into how AI is transforming marketing:

21% of sales leaders rely on AI-based applications today, with the majority sharing these applications with their marketing teams

Sales leaders predict that their use of AI will increase 155% in the next two years, and that AI will reach critical mass by 2020, when 54% expect to be using these technologies. Marketing and sales are relying on AI-based marketing automation, configure-price-quote (CPQ), and intelligent selling systems to increase revenue and profit growth significantly in the next two years. Source: Salesforce Research, State of Sales, 3rd edition (58 pp., PDF, free, opt-in).

AI sees the most significant adoption by marketers working in $500m to $1bn companies, with conversational AI for customer service the most dominant

Businesses with revenues between $500M and $1B lead all other revenue categories in the number and depth of AI adoption use cases. Just over 52% of small businesses with sales of $25M or less are using AI for predictive analytics for customer insights. It’s interesting to note that small companies are also the leaders in AI spending (38.1%) to improve marketing ROI by optimising marketing content and timing. Source: The CMO Survey: Highlights and Insights Report, February 2019. Duke University, Deloitte and American Marketing Association (71 pp., PDF, free, no opt-in).

22% of marketers are currently using AI-based applications, with an additional 57% planning to adopt them in the next two years

There are nine dominant use cases marketers are concentrating on today, ranging from personalised channel experiences to programmatic advertising and media buying to predictive customer journeys and real-time next best offers. Source: Salesforce’s State of Marketing Study, 5th edition

Content personalisation and predictive analytics from customer insights are the two areas CMOs most prioritise AI spending today

The CMO study found that B2B service companies are the top users of AI for content personalisation (62.2%), while B2B product companies use AI for augmented and virtual reality, facial recognition and visual search more than any other business type. Source: CMOs’ Top Uses For AI: Personalisation and Predictive Analytics. Marketing Charts. March 14, 2019

45% of retailers are either planning to or have already implemented AI to improve multichannel customer engagement as a core part of their marketing mix

Reflecting how dependent retailers are on supply chains, 37% of retailers are investing in AI today to improve supply chain logistics, supply chain management, and forecasting. Source: AI and Machine Learning use cases in the retail industry worldwide as of 2019, Statista.

Personalising the overall customer journey and driving next-best offers in real-time are the two most common ways marketing leaders are using AI today, according to Salesforce

Improving customer segmentation, improving advertising and media buying, and personalising channel experiences are the next fastest-growing areas of AI adoption in marketing today. Source: Salesforce’s State of Marketing Study, 5th edition

82% of marketing leaders say improving customer experience is the leading factor in their decision to adopt AI

The timing and delivery of content, offers, and contextually relevant experiences are second (67%), and improving performance metrics is third at 57%. Source: Leading reasons to use artificial intelligence (AI) for marketing personalisation according to industry professionals worldwide in 2018, Statista.

81% of marketers are either planning to or are using AI in audience targeting this year

80% are currently using or planning to use AI for audience segmentation. EConsultancy’s study found marketers are enthusiastic about AI’s potential to increase marketing effectiveness and track progress: 88% of marketers interviewed say AI will enable them to be more effective in reaching their goals. Source: Dream vs. Reality: The State of Consumer First and Omnichannel Marketing. EConsultancy (36 pp., PDF, free, no opt-in).

Over 41% of marketers say AI is enabling them to generate higher revenues from email marketing

They also see a more than 13% improvement in click-through rates and a 7.64% improvement in open rates. Source: 4 Positive Effects of AI Use in Email Marketing, Statista (infographic), March 1, 2019.

Marketers and agencies are most comfortable with AI-enabled bid optimisation for media buying, followed by fraud mitigation

Marketers and their agencies differ on ad inventory selection and optimisation, with marketing teams often opting to use their analytics and reporting instead of relying on agency AI methods. Source: Share of marketing and agency professionals who are comfortable with AI-enabled technology automated handling of their campaigns in the United States as of June 2018, Statista.

UK’s first green data centre for AI launches in Cornwall


Bobby Hellard

15 Jul, 2019

A satellite station has opened the UK’s first green high-performance computing platform for artificial intelligence and machine learning on demand.

The Cornwall-based Goonhilly Earth Station is one of the first organisations in the UK to deploy a liquid immersion cooling system to mitigate the power demands of high-performance computing.

Its green platform consists of an onsite array of solar panels that can support the data centre’s full power requirement of 500kW, with local wind power to be added in the near future, according to the company. The system has been designed to target the data-intensive needs of the automotive, life sciences and aerospace markets in particular.

“There are people working on some clever algorithms to save our planet from climate change,” said Chris Roberts, head of data centre and cloud at Goonhilly. “The irony is that these models require heavy processing power.

“Fortunately, new technology is helping, such as immersion cooling which is 45-50% more efficient than air cooling, cuts electricity demand in half, and also allows us to use the exhaust heat elsewhere.”

According to Goonhilly, the platform delivers high-performance GPU-based compute and storage for decentralised and centralised AI and machine learning applications. By provisioning compute, AI and machine learning resources on demand, customers can reduce the cost of deployment and accelerate the launch of products.

Goonhilly has also joined the Nvidia Inception programme for businesses, furthering its AI work and granting it access to Nvidia’s DGX-1, the world’s first supercomputer purpose-built for enterprise AI and deep learning.

To mark the opening of the data centre, and also celebrate the 50th anniversary of the moon landing, Goonhilly is hosting an event on-site on Thursday 18 July for space industry partners, academia, customers and prospects. It includes a panel discussion on trends in AI, cloud and edge computing.

“Through our strong partnerships with industry and academia, we pride ourselves on being at the forefront of innovation. Our new green data centre is no exception. It is satisfying to open our doors to the many businesses and organisations with data-intensive applications who can benefit from this facility and the community we are creating,” said Ian Jones, the CEO of Goonhilly.

Europe’s Galileo satellite system crippled by days-long outage


Keumars Afifi-Sabet

15 Jul, 2019

The European Union’s satellite navigation infrastructure, used by businesses and government agencies across the continent, has been offline for more than 100 hours following a network-wide outage.

The Global Navigation Satellite Systems Agency (GSA), which runs the £8 billion Galileo programme, confirmed this weekend the satellite system has been struck with a “technical incident related to its ground infrastructure”.

As a result, all 24 satellites in orbit are non-operational.

“Experts are working to restore the situation as soon as possible,” the GSA said in a statement.

“An Anomaly Review Board has been immediately set up to analyse the exact root cause and to implement recovery actions.”

Galileo is used by government agencies, academics and tech companies for a wide range of applications, from smartphone navigation to search-and-rescue missions.

The programme offers several services including a free Open Service for positioning, navigation and timing, and an encrypted Public Regulated Service (PRS) for government-authorised users like customs officers and the police.

Its business applications span multiple sectors: it is used by fishing vessels, for example, to provide data to fishery authorities, and by tractors for guided navigation. According to the GSA, 7.5 billion Galileo-friendly apps are expected by the end of 2019.

However, the satellite system, developed so that European organisations are not wholly reliant on GPS, has been offline since 1am UTC on Thursday 11 July.

The GSA said at the time that users may experience “service degradation” on all Galileo satellites. A further update issued two days later said users would experience a total service outage until further notice. Neither update offered a concrete explanation for the mysterious outage, which has persisted at the time of writing.

The root cause, however, may lie with a ground station based in Italy, known as the Precise Timing Facility (PTF), according to Inside GNSS. This facility generates the Galileo System Time, which is beamed up to the satellites to enable user localisation. It is also often used as an accurate time reference.
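
To see why a fault in a timing facility can take down positioning, it helps to remember that a receiver measures its distance to each satellite from the signal’s travel time, so any clock error becomes a ranging error. The short Python back-of-envelope below illustrates the scale involved; it is a generic GNSS calculation, not a description of Galileo’s actual error budget.

```python
# Back-of-envelope: how a clock error translates into a ranging error.
# A GNSS receiver estimates range as c * (signal travel time), so a timing
# error of dt seconds produces a distance error of roughly c * dt metres.
C = 299_792_458  # speed of light in m/s

for clock_error_s in (1e-9, 1e-6, 1e-3):  # 1 ns, 1 microsecond, 1 ms
    range_error_m = C * clock_error_s
    print(f"clock error {clock_error_s:.0e} s -> ~{range_error_m:,.0f} m range error")

# Even a one-microsecond error corresponds to roughly 300 m of range error,
# which is why a precise, shared system time is essential for localisation.
```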

In June, GPS services were also hit by a similar outage, which affected a host of Middle Eastern countries. According to Israeli media, that outage was linked to state-sponsored attacks from Russia.

The UK government and British businesses have played an integral role in helping to develop Galileo since its pilot launch in 2016. The continental service is expected to be fully operational by 2020, with 30 satellites in total.

But the UK’s withdrawal from the EU has threatened to fully cut off access by British agencies and companies, should no deal be agreed.

The government has already set aside £92 million to develop an independent satellite system, although it’s unclear how long this would take to implement.

Microsoft Teams now ‘bigger than Slack’


Keumars Afifi-Sabet

12 Jul, 2019

The number of individuals using Microsoft’s flagship workplace hub has soared in the last few months to leave its key competitor – Slack – in the dust, figures released by the firm show.

Two years after Microsoft launched its Teams platform, which is part of the firm’s Office 365 suite of apps and services, the company is boasting the digital workspace has more than 13 million active daily users.

This is roughly one-third more than Slack’s 10 million daily users, according to the latest figures that company has disclosed. Active weekly users for Microsoft’s service, meanwhile, stand at 19 million.

The feat is all the more staggering considering that Microsoft’s platform was still lagging behind its rival as recently as April this year, according to a chart the company produced.

Microsoft Teams owes its recent success to its automatic inclusion in the Office 365 suite of apps and services

The pace of growth has been sharp but will not come as a surprise considering the number of organisations that are reliant on Microsoft Office 365, of which Teams is an integral component.

Distributing Microsoft Teams to its pre-existing customer base has likely been a huge factor in its growth since it was first launched two years ago.

The company says Teams now boasts a user base of 500,000 organisations. Slack, meanwhile, has more than 85,000 paying organisations, according to its latest figures, but the total number of businesses signed up to the workplace hub has not been disclosed.

Microsoft has also used this opportunity to introduce a raft of additional features for the workplace app, including priority notifications and read receipts for private chats.

Announcements can allow team members to flag important news in a channel, while cross-channel posting saves time on copy-and-pasting the same message to different audiences.

IT administrators are also being helped to deploy the Teams client and manage policies for every member within an organisation. Pre-defined policies, in areas like messaging and meetings, can be applied to employees based on the needs of their individual roles.

Slack itself recently announced a number of updates to its functionality and user interface. These span shared channels with customers and vendors, as well as added integration between email and calendars.

Commenting on Slack’s IPO a few weeks ago, vice president and principal analyst with Forrester, Michael Facemire, said Slack’s success will be determined by how well it can penetrate enterprises.

“Can Slack prove to the enterprise buyer that it is more than a chat app, more than a collaboration tool, but instead an enterprise collaboration platform? If Slack can do this, expanding out of a tech-savvy user base and into all parts of the business becomes much easier, as it starts to do work for everyone.

“The next challenge is selling its service into the enterprise. Many companies have multiple instances of free Slack in use. But this group of users face their first hurdle when these free accounts need enterprise governance (single sign-on, message retention rules, etc).

“Will Slack be able to prove the value of both paying the fee and doing the work to integrate with existing systems? This question will also signal how quickly it can succeed in an enterprise market.”

Cloud Pro approached Slack for comment and an update on its active daily user count, but hadn’t received a response at the time of publication.

Thousands of sites fall to Magecart ‘spray and pray’ attack


Connor Jones

12 Jul, 2019

More than 17,000 domains have been compromised in an attack launched by the prolific hacking group Magecart, according to attack surface management firm RiskIQ.

The attack preys upon websites with leaky Amazon S3 buckets, an attack method seen all too often despite buckets now being protected by default. The researchers said that anyone with an AWS account could read or write files in the affected buckets.

The attackers scanned the web for misconfigured buckets containing JavaScript files they could download, append their skimming code to and re-upload, overwriting the original script in the bucket.

Magecart was trying to run scripts on websites to glean and make off with payment information that can then be sold on for profit. It wasn’t just smaller websites that were affected by the attack: some of the 17,000+ compromised websites fell within the top 2,000 of the Alexa rankings.

The problem with the attackers’ methodology is that this type of skimming attack rarely works on the payment pages of websites, which makes the chance of a successful attack low compared with a more considered, targeted approach.

But the Magecart group could still enjoy “a substantial return on investment” due to the range of the attack. “The ease of compromise that comes from finding public S3 buckets means that even if only a fraction of their skimmer injections returns payment data, it will be worth it,” said Yonathan Klijnsma, threat researcher at RiskIQ, in a blog post.

“Perhaps most importantly, the widespread nature of this attack illustrates just how easy it is to compromise a vast quantity of websites at once with scripts stored in misconfigured S3 buckets,” he added. “Without greater awareness and an increased effort to implement the security controls needed to protect the content stored in these buckets from theft or alteration by malicious attackers, there will be more – and more impactful – attacks using techniques similar to the ones outlined in this blog.”

Exploiting misconfigured Amazon S3 buckets is a common attack method used time and again by opportunistic cyber criminals.

Earlier in the year, Facebook apps Cultura Colectiva and At the Pool became victims of a similar attack, with the cyber criminals making off with 540 million records, including users’ names, IDs and comments made through Facebook’s social integration.

“Like any other security procedure, security policies are a good mechanism for protecting the access to your S3 Bucket, but it needs to be used the right way,” said Boris Cipot, senior security engineer at Synopsys. “It has to be understood, and the user needs to know what they are doing when applying those policies to their buckets.

“Unfortunately, misconfigured policies then can lead to examples like those where the attacker can identify buckets with those misconfigured policies and modify the content on them,” he added. “Every user should have a good understanding of what they’re doing, but if this is not possible, leave it to professionals that know how to handle security.

“On the other hand it would be nice to see if Amazon could make a policy screening functionality where they could identify such misconfigured policies and warn the user – or in some cases even forbid the usage of loose policies.”
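
Pending any such built-in screening, organisations can approximate it themselves. The minimal Python sketch below, which assumes boto3 is installed and AWS credentials with read-only S3 permissions are available, loops over an account’s buckets and flags any that lack a public access block or whose policy AWS evaluates as public; it is a starting point for an audit, not a complete one.

```python
# Minimal sketch: flag S3 buckets that look publicly accessible.
# Assumes boto3 and credentials with s3:ListAllMyBuckets / s3:GetBucket* permissions.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    issues = []

    # 1. Is a public access block configured and fully enabled on the bucket?
    try:
        cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(cfg.values()):
            issues.append("public access block not fully enabled")
    except ClientError:
        issues.append("no public access block configured")

    # 2. Does AWS itself evaluate the bucket policy as public?
    try:
        if s3.get_bucket_policy_status(Bucket=name)["PolicyStatus"]["IsPublic"]:
            issues.append("bucket policy is public")
    except ClientError:
        pass  # no bucket policy attached

    if issues:
        print(f"[WARN] {name}: {', '.join(issues)}")
```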

Other notable examples of devastating attacks made possible by leaky buckets include the leak of data belonging to 120 million American households by Experian. The NSA, WWE and Accenture also suffered similar attacks.

The future looks bright, however. According to reports, since Amazon enabled encryption for buckets by default, the number of exposed files has plummeted to fewer than 2,000, down from around 16 million beforehand.

How public cloud continues to drive demand for cybersecurity solutions

Ongoing investment in cybersecurity solutions continues to grow. According to the latest worldwide market study by Canalys, cybersecurity solutions for public cloud and 'as a service' accelerated in the first quarter of 2019. Those deployment models collectively grew 46 percent year-on-year.

These types of solutions accounted for 17.6 percent of the total cybersecurity market value — that's up from 13.8 percent in the same period a year ago. Virtual security appliance and agent solutions also grew significantly, up by 18.2 percent on an annual basis.

Cybersecurity solutions market development

Traditional security hardware and software deployments still dominate, representing almost 75 percent of the total market. Both of these deployment models continued to grow, but at a slower rate of just over 8 percent. The contrast highlights the ongoing transition in cybersecurity solutions, as organisations look to protect more data assets and workloads located in the public cloud.

Moreover, IT vendors have introduced new ways of doing business with channels and enterprise customers in terms of purchasing, consumption and servicing — as well as helping simplify security operations within increasingly complex IT environments.

The worldwide cybersecurity market reached $9.7 billion in terms of shipments in the latest quarter — that's up 14.2 percent from $8.5 billion in Q1 2018.

According to the Canalys assessment, enterprise IT investment in cybersecurity shows no sign of slowing down. "The security industry will be immune to the increasingly challenging macro-economic and political environment," said Matthew Ball, principal analyst at Canalys.

There's a troubling trend that has raised awareness about the ultimate cost of an inadequate defense to counter online criminal activity. Recent high-profile ransomware attacks have resulted in some organisations paying large sums to regain access to critical IT systems and their related data.

Strengthening security strategies across devices, infrastructure, perimeters and applications will continue to be critical. Increasing employee training and gaining more comprehensive cybersecurity insurance will also be important.

As new cyber threats appear in the online arena, more security software startups will likely emerge, adding to an already crowded market. Product differentiation will be key, but offering customers a choice of deployment models and simplified licensing will also be vital.

Outlook for cybersecurity solutions growth

The challenge for enterprise organisations in both the public and private sectors is to maintain pace with the evolving and diverse range of online security threats. Many think they're too small or not high-profile enough to be targeted, but online hackers will seek to exploit any IT vulnerabilities.

This threat landscape is creating opportunities for IT channel partners to expand their capabilities and provide more holistic cybersecurity offerings that assess, recommend, deploy, integrate and manage multi-vendor solutions and services across several deployment models.

Overall, the channel represented 92.3 percent of the cybersecurity solutions shipment value in the first quarter of 2019.

Does the rise of edge computing mean a security nightmare?

What do we mean by edge computing? In a nutshell, with edge computing you are processing data near the edge of your network, where the data is being generated, instead of relying on the cloud – or, more specifically, a collection of data centres.

As a relatively new methodology, computing at the edge invites new security challenges because you are dealing with new setups and new architectures. Some say that you have to rely on vendors to secure your environment when you start computing at the edge. Those who champion edge computing claim it is safer because data is not travelling over a network, while others see edge computing as less secure because, for example, IoT devices are easily hacked.

And there are many ways to think about edge computing, including smartphones. After all, consider the security and privacy features of a smartphone: by encrypting and storing biometric information on the phone itself, you effectively take those security concerns away from the cloud and place them ‘next’ to the user, on their device.

With edge computing, you are effectively running your code at the edge. That brings specific security challenges, because the code is not within your stack or your security environment – and even though it runs at the edge, it may still need to make queries to the back end, to the application. This is the main security concern when running a serverless environment and, more generally, when running code at the edge. Where IoT devices are concerned, some of the code runs on the device itself (your mobile or IoT device), and that code also needs to be secured.
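
One common way to address that back-end dependency is to make the edge code authenticate every call it sends to the origin. The hypothetical Python sketch below signs each forwarded request with an HMAC shared secret; the handler shape, environment variables and header name are illustrative, not any particular edge platform’s API.

```python
# Hypothetical edge handler that signs the queries it makes to the back end.
# BACKEND_URL, EDGE_SHARED_SECRET and the X-Edge-Signature header are made up
# for illustration; a real platform would supply its own conventions.
import hashlib
import hmac
import os
import time
import urllib.request

BACKEND_URL = os.environ.get("BACKEND_URL", "https://origin.example.com/lookup")
# In practice the secret must be provisioned securely, never hard-coded.
SHARED_SECRET = os.environ.get("EDGE_SHARED_SECRET", "change-me").encode()

def sign(body: bytes) -> str:
    """Return 'timestamp.hmac' computed over the timestamp and request body."""
    ts = str(int(time.time())).encode()
    mac = hmac.new(SHARED_SECRET, ts + b"." + body, hashlib.sha256).hexdigest()
    return f"{ts.decode()}.{mac}"

def handle(request_body: bytes) -> bytes:
    """Edge entry point: forward a query to the back end with a signature."""
    req = urllib.request.Request(
        BACKEND_URL,
        data=request_body,
        headers={"X-Edge-Signature": sign(request_body)},
        method="POST",
    )
    # TLS (https) protects the payload in transit; the signature lets the
    # back end verify the request really came from the edge code.
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.read()
```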

The massive proliferation of end-user endpoint devices could turn out to be an edge computing headache for many organisations. A single user might have multiple devices connected to the network simultaneously, and the same user will undoubtedly mix personal and professional data (and applications and profiles) on a single device. In most scenarios, endpoint security tends to be less than robust, meaning this user could unwittingly expose the organisation to serious risk, accompanying losses or malicious viruses. Many of these devices are not only very insecure, but cannot even be updated or patched – a perfect target for hackers.

And 5G will certainly cement the era of edge computing. In general, 5G should be a wonderful thing because it will accelerate the use and development of real-time applications. But when more data passes through a device, you need more control over that data, and you will need tools that allow an organisation to control it from a security perspective.

The IoT and 5G relationship will see huge numbers of IoT devices feeding a huge amount of data to the edge. Currently, however, none of the security protocols for IoT are standardised, which highlights the biggest security risk of 5G: your smart fridge in the kitchen has no standard for how it secures and authenticates itself with other smart devices. Base-level security controls are therefore required to mitigate such risks.

In the wider business world there will be a massive shift of computing function to the edge. As organisations rely less and less on data centres, with compute ending up virtually ‘next’ to the workforce, securing the endpoint edge means encrypting communications and ensuring that security devices are able to inspect that encrypted data at network speed. Devices also need to be automatically identified at the moment of access, with appropriate policies and segmentation rules applied without human intervention. They also need to be continuously monitored, while their access policies need to be automatically distributed to security devices deployed across the extended network.

Organisations ultimately want to protect their data and their production. When you are computing at the edge you are working with data at the edge, not inside your core workload, so from a security point of view you need to secure the data both in transit and at rest. This security challenge is currently handled largely by vendors and, ultimately, by the security protocols underwritten by the big cloud providers such as AWS.
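
As a concrete illustration of the ‘in transit and at rest’ point, the minimal Python sketch below encrypts a sensor reading before it is written to local edge storage; it assumes the third-party cryptography package, and in practice the key would come from a secrets manager or hardware module rather than being generated inline. Transport security would normally be handled separately by TLS.

```python
# Minimal sketch: encrypt edge data at rest before writing it to local storage.
# Assumes the third-party 'cryptography' package; key handling is simplified
# for illustration and is not production practice.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in reality, fetch this from a secrets manager
cipher = Fernet(key)

reading = b'{"sensor_id": "edge-042", "temperature_c": 21.7}'

token = cipher.encrypt(reading)           # ciphertext is safe to store locally
with open("reading.enc", "wb") as f:
    f.write(token)

# Later, an authorised process holding the same key can recover the reading.
with open("reading.enc", "rb") as f:
    assert cipher.decrypt(f.read()) == reading
```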

However, it is a mistake to believe that edge technology inherits the same security controls and processes that are found with the likes of AWS or the public cloud. Computing at the edge can cover all kinds of environments which are often remotely managed and monitored; this might not offer the same security or reliability that organisations are used to seeing with the private cloud. Ultimately it is the responsibility of the customer to properly vet potential vendors to fully understand their security architectures and practices.

Google Cloud to acquire storage provider Elastifile, further reinforces enterprise ambitions

Google Cloud’s focus on the enterprise continues apace, with the company announcing its intent to acquire Santa Clara-based cloud storage provider Elastifile.

Elastifile offers cloud-native file storage which promises an enterprise-grade distributed file system, as well as ‘intelligent’ object tiering. While the company had been somewhat under the radar – its most recent funding round, of $16 million in 2017, was a little lower than average for a cloud storage Series C – its product set is bang on target for enterprise pain points. The company argues its feature set means organisations do not need to refactor their apps when migrating to the cloud.

Google plans to integrate Elastifile with Google Cloud Filestore – and in a blog post confirming the news, Google Cloud CEO Thomas Kurian noted the synergies between the two companies.

“The combination of Elastifile and Google Cloud will support bringing traditional workloads into [Google Cloud Platform] faster and simplify the management and scaling of data and compute intensive workloads,” wrote Kurian. “Furthermore, we believe this combination will empower businesses to build industry-specific, high performance applications that need petabyte-scale file storage more quickly and easily.

“This is critical for industries like media and entertainment, where collaborative artists need shared file storage and the ability to burst compute for image rendering; and life sciences, where genomics processing and machine learning training need speed and consistency; and manufacturing, where jobs like semiconductor design verification can be accelerated by parallelising the simulation models,” added Kurian.

From the Elastifile side, CEO Erwan Menard outlined the rationale behind the move. “As we join the Google Cloud team, we are eager to build further upon our joint success, providing even more value to our customers,” Menard wrote. “Together, we are absolutely convinced that joining Google Cloud will enable us to serve the market with the best file storage service in any cloud, developed in a stimulating environment where our team members will continue to thrive.”

In June, Google Cloud welcomed business intelligence platform Looker into its ranks in what was reported as a $2.6bn (£2.05bn) all-cash transaction. The move was seen as a further indication of Google’s move into multi-cloud, in particular bringing together data from various SaaS applications – another important element for enterprises to consider.

Since Kurian took the top job at Google Cloud, plenty has been discussed around the company’s ongoing ambitions, both for the enterprise and for multi-cloud. During his first public speaking engagement in February, Kurian noted his plan to hire more sales staff and focus more aggressively on larger, traditional companies. At Google Next in April, the multi-cloud push went a step further, with Google announcing that its cloud services platform Anthos would accommodate AWS and Microsoft Azure.

Financial terms of the acquisition were not disclosed.
