How to catch hackers in the act


Steve Clark

18 Dec, 2018

No matter how well protected your computer is, any system can be compromised. In many cases this happens right under our noses, and it’s almost impossible to catch a hacker red-handed as they try to access your files or take control of your programs.

Fortunately, there are early-warning tools available that can immediately alert you to any system breach, whether that’s someone trying to take over your webcam or trying to make changes to your files. The great thing is that the majority of these tools are free, although they don’t always come with the most user-friendly of setups.

We’ve put together a list of potential hacking scenarios and the tools you should consider trying out in order to combat them.

When new devices connect to your network

Worried that someone’s leeching off your internet? An easy way to find out for sure is to use Nirsoft’s Wireless Network Watcher, which lists connected devices on your network. Wringing every last function out of this feature-rich program would take days, but even at its most basic, it reveals plenty about network activity.

Find out when devices are connecting to your network

Run Wireless Network Watcher, then click the column marked Last Detected On until it displays a down-arrow. That way, the newest device to connect always shows at the top. This helps you recognise it at a glance.

To set up alerts for potentially rogue devices, enter Options and turn on Beep On New Device and Beep On Disconnected Device. Remember to turn up your volume. Now, head into Device Options. Here, you can set your own connection and disconnection alerts if you don’t like the defaults – maybe an air raid warning siren? Finally, create a log by selecting View and choosing HTML Report – All Items.
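
If you’re curious what such a tool does under the bonnet, the basic check can be approximated in a few lines of Python. This is a rough sketch rather than a replacement for Wireless Network Watcher – it assumes the third-party scapy library, a 192.168.1.0/24 home network and administrator rights (scapy needs raw sockets):

# Rough sketch: ARP-scan the subnet and flag devices not seen before.
# Assumes the third-party scapy library (pip install scapy), a
# 192.168.1.0/24 network and administrator/root privileges.
from scapy.all import ARP, Ether, srp

known_macs = set()  # in practice, persist this between runs

def scan(subnet="192.168.1.0/24"):
    # Broadcast an ARP "who-has" request to every address in the subnet
    answered, _ = srp(Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=subnet),
                      timeout=2, verbose=False)
    for _, reply in answered:
        if reply.hwsrc not in known_macs:
            known_macs.add(reply.hwsrc)
            print(f"New device: {reply.psrc} ({reply.hwsrc})")

scan()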

When changes are made to your system

If you’re concerned about ransomware, or just don’t want anyone toying with your rig, download WinPatrol – your digital guard dog. Features include controlling startup programs and delaying chosen programs from running at boot-up, which speeds up your computer.

After recoiling at how badly designed the user interface is – it’s visually busy, crammed with 15 tabs and offers absolutely no directions for use – you can start monitoring changes to your computer’s files and folders.

WinPatrol helps protect you from malware and worse

WinPatrol can be used as a purely passive program. Leave it chugging away in the background and whenever a new program runs, or attempts to install, it’ll give you a heads-up in the form of a visual and audio alert. You’re then able to accept or reject any changes. Don’t recognise the program? Reckon it might be malware or worse? Block it from running or ruining your PC.

It’s worth noting that this only halts programs – you’ll need to track them down in WinPatrol to begin the process of removing the threat.
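
WinPatrol’s basic trick – noticing newly launched programs – can be approximated in Python with the third-party psutil library, if you want to see the mechanics for yourself. A rough sketch (it only reports new processes; unlike WinPatrol, it can’t block them):

# Rough sketch: poll the process list and report anything new.
# Assumes the third-party psutil package (pip install psutil).
import time
import psutil

seen = {p.pid for p in psutil.process_iter()}

while True:
    for proc in psutil.process_iter(attrs=["pid", "name", "exe"]):
        if proc.pid not in seen:
            seen.add(proc.pid)
            print(f"New process: {proc.info['name']} ({proc.info['exe']})")
    time.sleep(2)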

When someone accesses your shared files

Sharing is caring. It’s also usually secure when doing so over your own network (just be sure to change your default WiFi password). Don’t be lulled into a false sense of security, though. Download Net Share Monitor (bit.ly/net463) to keep a close eye on all shared folders.

It’s a pleasantly minimalist tool. Just three tabs sit in the window: Active Sessions logs remote users, Accessed Files displays which folders and files remote users are currently connected to, while Shared Files lists the network’s shared folders.

Track every change when sharing over your network

Microsoft offers a decent guide to sharing folders on its support site. Once your shares are set up, let the tool go to work: Net Share Monitor automatically finds and scans shared folders, with no additional setup required. The instant changes are detected, it’ll beep and flash, while logging the details to a separate file.
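
Tools like this lean on information Windows already exposes: the built-in net session command lists the remote computers currently connected to your shares. As a rough illustration, this Python sketch polls it and rings the terminal bell when the list changes (run it from an administrator prompt, since net session requires elevation):

# Rough sketch: poll Windows' built-in "net session" command and
# sound the terminal bell when the list of remote sessions changes.
import subprocess
import time

last_output = ""

while True:
    result = subprocess.run(["net", "session"], capture_output=True, text=True)
    if result.stdout != last_output:
        print("\a" + "Shared-folder sessions changed:")
        print(result.stdout)
        last_output = result.stdout
    time.sleep(5)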

For a cloud storage alternative, OneDrive sends emails and push notifications to your phone whenever users open shared files and make alterations.

When changes are made to your folders

FolderMonitor is the local equivalent of Net Share Monitor. After you download it, the program’s likely to be lurking in your system tray, so double-click the icon and force it open.

Place folders under surveillance with FolderMonitor

Right-click anywhere in the window, then select ‘Add folder’. Find the folder you want to observe, and click OK to add it to the watch-list. Alternatively, ‘Add path’ lets you cast a wider net (for instance, placing every folder in This PC under surveillance).

When a folder is added to FolderMonitor, right-click it to open FolderMonitor Options. Choose the events you want to check for – Created, Changed, Renamed and Deleted. Now, if someone unscrupulous takes remote control and tampers with a watched folder, an unmissable alarm warns you of the changes.
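
The same surveillance can be reproduced in Python using the third-party watchdog package, should you want alerts piped into your own scripts. A minimal sketch – the watched path below is just a placeholder:

# Minimal sketch: watch a folder for created/changed/renamed/deleted events.
# Assumes the third-party watchdog package (pip install watchdog);
# the path is a placeholder.
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class AlertHandler(FileSystemEventHandler):
    def on_any_event(self, event):
        # event_type is one of: created, modified, moved, deleted
        print(f"{event.event_type}: {event.src_path}")

observer = Observer()
observer.schedule(AlertHandler(), path=r"C:\Users\Public\Documents", recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()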

When someone logs into your PC

Need to keep your computer under lock and key while you’re away? Use Microsoft’s Task Scheduler to send an email every time someone logs onto your machine.

Ironically, in order to run this, you’ll need to turn on ‘Less secure apps’ for your Google account – as such, we suggest creating a dummy Gmail address specifically for these notifications.

Set up an automatic alert when someone logs onto your computer

First, download and extract SendEmail, then set it aside for a moment while we build the trigger. Open Windows Task Scheduler. On the Actions panel, click Create Basic Task, and give it a name and description. In the Trigger step, tick ‘When a specific event is logged’. Hit Next and, under Log, use the drop-down to select Security; under Source, choose ‘Microsoft Windows security auditing’. In the ID box, type 4624 (the event ID for a successful logon). Click Next.

On the Action page, click ‘Start a program’ and choose SendEmail. In the ‘Add arguments’ box, type the following on one line, replacing everything in brackets with your own details:

-f [fromemail]@gmail.com -t [toemail]@gmail.com -u [Subject line, e.g. Did you log on?] -m [Message, e.g. Someone’s logged onto your computer] -s smtp.gmail.com:587 -xu [fromemail]@gmail.com -xp [fromemail password] -o tls=yes

Once that’s done, test the automated alert by locking and unlocking your computer. You should receive an email warning you of the login.
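
If you’d rather skip the third-party SendEmail tool, a short Python script can send the same alert and be called from Task Scheduler in its place. A minimal sketch, assuming a dummy Gmail account with ‘Less secure apps’ enabled – the addresses and password are placeholders:

# Minimal sketch: email a login alert via Gmail's SMTP server, as an
# alternative to SendEmail. Assumes a dummy Gmail account with 'Less
# secure apps' enabled; addresses and password are placeholders.
import smtplib
from email.message import EmailMessage

FROM = "fromemail@gmail.com"   # placeholder
TO = "toemail@gmail.com"       # placeholder
PASSWORD = "dummy-password"    # placeholder

msg = EmailMessage()
msg["Subject"] = "Did you log on?"
msg["From"] = FROM
msg["To"] = TO
msg.set_content("Someone's logged onto your computer.")

with smtplib.SMTP("smtp.gmail.com", 587) as server:
    server.starttls()  # the same TLS upgrade as SendEmail's -o tls=yes
    server.login(FROM, PASSWORD)
    server.send_message(msg)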

When an app tries to spy on you

Have you ever downloaded a seemingly innocuous app, and wondered ‘Why would it need to access my microphone and camera?’ Well, it could be that the devs are spying on you. Even legitimate apps like Facebook have come under fire for supposedly listening in (for targeted advertising reasons, of course).

To take control, check every app’s permissions in Settings – then revoke access to sensitive tools such as the microphone, camera, Bluetooth, Wi-Fi and location (or ditch the app entirely if you don’t trust it).

D-Vasive Anti Spy monitors apps that may be watching you

For extra protection, you can download D-Vasive, created by self-proclaimed ‘cybersecurity legend’ John McAfee. Note that it costs £4.49 – and while there are free alternatives, we’ve yet to find one that fully replicates D-Vasive’s protection. The app sends an alert every time another app activates smartphone hardware that could record, track or otherwise spy on you; whenever a warning flashes, you can instantly shut the offender down.

Get alerts when your accounts are breached

Is it time to change your username and password – again? It seems like every week brings a fresh hack or cyber-attack to worry about.

Have you been pwned?

Haveibeenpwned.com cross-references your email address against accounts that have been hacked, stolen and sold on the dark web. As well as searching its database, you can protect yourself against future breaches by setting up email notifications via bit.ly/pwnme463, so you’re informed as soon as the attack happens and can change your login pronto.
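
Have I Been Pwned also exposes its database through a public API, so you can script your own checks. A hedged sketch using Python’s requests library – it assumes the current (v3) version of the API, which requires an API key, and the key and email address below are placeholders:

# Hedged sketch: ask the Have I Been Pwned API which breaches include
# an address. Assumes v3 of the API, which requires an API key; the
# key and email address are placeholders.
import requests

API_KEY = "your-api-key"   # placeholder
email = "you@example.com"  # placeholder

resp = requests.get(
    f"https://haveibeenpwned.com/api/v3/breachedaccount/{email}",
    headers={"hibp-api-key": API_KEY, "user-agent": "breach-checker"},
)
if resp.status_code == 404:
    print("Good news - no known breaches.")
else:
    resp.raise_for_status()
    for breach in resp.json():
        print("Pwned in:", breach["Name"])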

For extra security, install the HackNotice extension for Chrome. This handy browser tool tells you when you’re visiting a site that’s been hacked. You can even set up a watchlist for sites you regularly visit, or those that store personal data.

Airlines bracing for exponential growth: How the cloud will be critical for success

The International Air Transport Association (IATA) highlighted that by the end of 2018, global air traffic passenger demand is projected to grow 7% year over year. IATA also forecasts passenger numbers will continue increasing, with an expected 8.2 billion people traveling by 2037, more than doubling the number of air traffic passengers in 2018.

Airlines should expect passenger traffic to double over the next few decades – and while this is something to look forward to, concerns over rising fuel prices, labour shortages, capacity constraints, load factors, security and growing competition are very real as well. To withstand such growth, airlines are embarking on a digital transformation journey into the era of Business 4.0, adopting new and emerging technology to meet the substantial increase in consumer demand, rising expectations of customer satisfaction and the need for optimised operational efficiency.

While a combination of technology services will be needed to succeed in such an environment, the integration of cloud is proving to be the most valuable to those making this transformational journey forward.

Cloud is no longer a luxury

Organisations originally viewed cloud storage as a place to cut costs. Discrete data centers were too expensive to maintain and cloud adoption served as a cost-efficient solution to that one specific problem. While data storage was revolutionised, the core systems were still operating in traditional ways that today seem prehistoric. The cloud is no longer only a buzzword used to make companies seem more digital while minimising some cost—it’s now an essential business driver. Airlines are finding that they need to make this foundational shift and rebuild their IT infrastructure with the cloud in mind, or else their operational efficiency and customer service efforts will not be able to scale alongside the impending growth.

The traditional on-premise enterprise infrastructures that airlines have used to-date are complex, stagnant, expensive, and inflexible to immediate, spur-of-the-moment changes in services needed to deliver exceptional customer service. By switching to the cloud, airlines are able to leverage the cloud ecosystem to allow seamless connectivity between their front, middle, and back office operations.

Not only is operational efficiency increased, but total cost of ownership is reduced significantly. Airlines can take advantage of the pay-per-use model that comes with cloud computing, avoid capital expenditure on IT infrastructure and the associated maintenance costs, and even cut the cost of the highly skilled manpower needed to operate and maintain complex airline data centers.

Increased reliability and minimised downtime

Cloud isn’t only the answer for large, global airlines. Small and medium-sized airlines can now afford to establish safeguards in their IT infrastructure as well – avoiding operational crises that may result in system downtime.

Historically, airlines have been required to overhaul and replace infrastructure every three to five years. Cloud adoption can help airlines avoid these costly infrastructure rebuilds by putting safeguards and backup processes in their place. By minimising downtime and optimising operational efficiency through the cloud, airlines can be prepared for inevitable risk and promise improved reliability and customer satisfaction – all while significantly cutting costs.

Improving customer satisfaction and individual personalisation

Companies in every industry are facing a new reality: customer experience is king. Consumers no longer just want a high-quality product at a fair price – they want an enjoyable experience throughout the entire transaction, from point of sale through product and service delivery and beyond. The same is true for airline passengers, and cloud integration allows airlines to offer unique, individualised experiences for each passenger based on a wide range of variables.

By having all data across an entire global enterprise stored in a single source, airlines are able to cater to the differing demands of consumers and markets across the world. Through real-time availability across multiple cloud instances, customers have access to services that meet their personalised needs.

Cloud adoption improves customer experience in several other ways, too. By integrating booking systems and customers’ immigration and security clearance details into the cloud, boarding times and security queues can be streamlined through self-boarding processes and optimised security pre-checks. Airlines benefit from cloud-based passenger data as well, gaining consumer insights that can help shape other investments in the in-airport and in-flight experience.

Airlines are bracing for previously unimaginable growth over the next few decades. Leveraging the transformative power of the cloud will not only allow them to scale successfully alongside that growth, but also help them cut costs and optimise operational efficiency across the board.


Amazon will improve its cloud efficiency using ARM-based processor

Amazon is deploying an ARM-based Graviton processor to improve its cloud computing services. According to the Seattle-based company, this will lead to cost savings of up to 45% for "scale-out" services.

Amazon became the world’s biggest player in cloud computing via Amazon Web Services (AWS), the company’s $27 billion cloud business. AWS provides on-demand cloud computing platforms to individuals, companies and governments on a paid subscription basis.

The e-commerce giant is changing the technology behind its cloud services to deliver faster performance and to save costs. The new system is expected to provide the company with a performance-per-dollar advantage.

The Graviton processor contains 64-bit Neoverse cores and is built on ARM’s 16nm Cosmos platform, ARM senior vice president Drew Henry highlighted.

The Israeli-designed Graviton is built around the Cortex-A72 64-bit core, which runs at clock frequencies of up to 2.3GHz; AWS’s existing servers run on Intel and AMD processors.

The system will assist Amazon with scale-out workloads, where users of the service share the load across a group of smaller instances, such as containerised microservices, web servers, development environments and caching fleets.

There are other advantages to Amazon from the new technology, centred around being more independent in relation to technology providers.

Amazon will now have the ability to license Arm blueprints via its Annapurna subsidiary, to customise and tweak those designs, and to take them to contract manufacturers such as TSMC and GlobalFoundries to have competitive chips made.

AWS is also building a custom ASIC for AI inference called Inferentia, which could be capable of scaling from hundreds to thousands of trillions of operations per second and further reduce the cost of cloud-based services.


Packet and Wasabi team up to offer better cloud services than AWS

Cloud and edge computing infrastructure provider Packet and hot cloud storage firm Wasabi have partnered to integrate their respective platforms, offering customers cloud computing and storage services for less than Amazon Web Services (AWS).

David Friend, CEO of Wasabi, said: “Amazon has 100-some-odd cloud services. They do everything, but they don’t do anything particularly well. They’ve got one big integrated environment. But if you want the best content delivery network, Amazon doesn’t have it. If you want the best storage, Amazon doesn’t have it.”

At the moment, Packet and Wasabi’s offering is very limited in scope compared with AWS’ multiple services. Unlike AWS’ be-everything-to-everybody approach, the companies are focusing solely on cloud storage and cloud computing.

According to Friend, Wasabi’s cloud storage is 80% cheaper and six times faster than Amazon S3 storage.

Zac Smith, CEO of Packet, said: “How can we create an experience for enterprise buyers that gives the best of both worlds: the best, low-cost storage option and the best compute, but at the same time not with a lower experience for the developer? We’re not trying to solve this from a technology standpoint. We’re trying to solve this from an operations and business standpoint.”

Packet claims that its bare-metal cloud supports more than 60,000 installations every month and is available in more than 18 countries. Its cloud automation platform enables bare metal installations in less than 60 seconds.

Both companies will also offer joint services via their individual infrastructure-as-a-service (IaaS) management consoles and APIs, likely to be available in Q1 2019. The integrated console will let Packet compute customers use Wasabi storage, and Wasabi customers use Packet compute resources.

These joint cloud services will be connected over a high-capacity, low-latency fibre network with no transfer fees between compute and storage elements.


Vue Entertainment and Urban Airship to deliver UK cinema tickets through Google Pay


Clare Hopping

10 Dec, 2018

Vue Cinemas and customer engagement platform Urban Airship have teamed up to offer customers the opportunity to jump the queue with Google Pay.

Cinema-goers can now purchase tickets in advance, in person or via Vue’s website, so that when they attend a film showing they can walk straight into the screening without queuing at the box office.

“We know that more and more customers are using mobile wallets and we always move quickly to adopt technology that will improve customer experience,” said Dan Green, head of digital at Vue Entertainment.

“Our commitment to launching broad distribution for Google Pay movie tickets is a great example of understanding customer behaviour and reacting quickly to give them what they want. It will also enable us to offer enhanced personalisation which we know customers value.”

Apparently, this will save customers some of the 52 hours a year they spend queuing for various services.

When a customer purchases the ticket using their Google Pay account, the tickets are saved to their digital wallet and added to their lock screen ahead of the showing, so they can flash their phone to the attendants rather than searching through their phone for a confirmation.

As an add-on to the integration, Urban Airship has implemented a personalised mobile wallet movie experience, offering customers tailored recommendations and discounts based on their purchases.

“Bringing physical and digital experiences together is more important than ever to streamline both customer interactions and business operations,” said Brett Caine, CEO of Urban Airship.

“Through our close work with Google Pay and Vue, we’re bringing mobile wallet movie tickets to everyone’s smartphone, eliminating the hassle of standing in queues, searching through emails for confirmations or having to first download another app. It’s all about getting guests to the best seats, concessions and the big show more quickly.”

The Vue Google Pay integration will debut in Vue’s paperless venues before being rolled out to other sites across the country.

Mind the backup gap: Protecting born in the cloud data in Office 365

Applications such as Exchange, SharePoint and OneDrive are the oil that keeps the wheels of commerce turning. Adoption of Office 365 is growing at such a rate that even Microsoft has been taken by surprise: the company estimates that during FY2019 it will reach the point where two-thirds of its Office business customers have migrated to the software-as-a-service platform, which it says is about a year ahead of expectations.

Businesses are understandably looking for the agility and scalability of cloud-based applications, but in the rush to migrate, the balance of responsibility for security and backup is shifting, and that shift requires close examination. While organisations can now rely on Microsoft to protect and guarantee availability for these mission-critical applications and the underlying infrastructure, responsibility for protecting the sensitive company data that resides in those systems remains firmly in-house. Businesses therefore need to ensure their data is fully backed up against common threats to security and productivity, and that any backup gaps resulting from the hand-off between themselves and the platform provider are closed.

Given the rapid penetration of Office 365, we’re seeing more businesses looking closely into their backup situation as they strive to balance productivity, data protection, security and compliance. It’s therefore worth examining some of the key reasons that additional backup for Office 365 is essential.

Office 365 offers backup – to a point

Unsurprisingly, Microsoft knows its users pretty well, and Office 365 does have a number of backup safety nets built in to spare users’ blushes. Accidentally deleted mailboxes in Exchange can be recovered, and files in OneDrive that have been deleted, encrypted by ransomware or inadvertently overwritten can be restored to a point in time prior to the incident. However, in both these cases, data recovery has a time limit, and 30 days is the magic number. If the user doesn’t notice the error for a month, then those emails and files are gone for good.

Fixing the issue for users who’ve owned up to genuine mistakes in time is one thing, but what about users who don’t have the business’ best interests at heart? According to the Verizon Data Breach Investigations Report, the second most common cause of cybersecurity breaches is privilege misuse – the insider threat. A disgruntled employee who decides to delete mission-critical files and data won’t be publicising the fact, and if 30 days pass before the crime is discovered, there’ll be no way of restoring those files unless the data is protected elsewhere.

A further issue lies around standard events, such as an employee leaving the company. Office 365 will keep their emails for 30 days, but after that, all the valuable historical intelligence left behind by that employee will be lost.

Potentially, the most compelling argument for creating independent backups is compliance. Companies that are subject to regulations requiring them to retain deleted data for extended time periods will not be able to comply if that data resides only in Office 365.

Ultimately, the data managed, shared and stored via Office 365 is mission-critical, so it is common sense to back that data up to the same level that you back up all your systems, rather than risk a gap that could result in damaging data loss.

Protecting born-in-the-cloud data – modifying the 3-2-1 rule  

Once organisations have identified the need to back up Office 365 data, they face the decision of how best to tackle it.

I mentioned at the beginning that the balance of responsibility for security and backup is shifting, and that’s also true of the best-practice approach to backups. Previously, the accepted 3-2-1 rule was that organisations should retain three copies of their data on two different media, with one copy stored off-site. The cloud has changed all that. With data that’s born in the cloud, it no longer makes sense to keep a backup copy on-premise, for two key reasons. First, bandwidth is at a premium, and streaming backup data to your own data centre causes unnecessary congestion. Second, restore times could prove unacceptably long. Instead, it’s logical to create backup copies in two different cloud locations, with each copy stored in a different geographic region as proof against regional disasters.

An Office 365 backup solution needs to overcome the shortcomings of the native backup features. Unlimited storage and retention, point-in-time recoverability for all data (including email), and full visibility for ease of management, audit and compliance are all critical features. Plus, it goes without saying that, should the worst happen, you must be able to find and restore the data you need quickly and easily.

Since businesses first started looking for Office 365 backups, there has been an issue around finding a single solution that comprehensively covers Exchange, SharePoint Online and OneDrive for Business. It’s therefore important, though seemingly obvious, to check that the solutions you’re evaluating cover all the applications to the same degree.

At iland, we’re seeing growing numbers of organisations looking to close the gaps in Office 365 backup with a single solution – one that gives peace of mind that the data keeping their business in action is backed up to the same high standards they apply to other systems and data.

The balance of responsibility for security and backup may have changed, but the importance of protecting mission-critical data is as high, if not higher, than ever before.


CIO strategies for moving to a cloud-first business


Mark Samuels

11 Dec, 2018

The cloud is now established as a business-as-usual activity. Almost three-quarters of businesses have between one and five years of experience with cloud technologies, according to research by IT industry association CompTIA.

So, what will be the key trends for the cloud through 2019 and beyond, and how can CIOs continue to make the most of the cloud?

Alex von Schirmeister, chief digital, technology and innovation officer at retail specialist RS Components, is using the cloud to provide a platform for digital transformation. He is working to create a cloud-first approach, yet the required balance of business benefit and technological risk leads von Schirmeister to conclude that there will never be a 100% migration to the cloud.

On-prem will always have a role

“There will always be room for some sort of hybrid solution, where certain aspects of your infrastructure – because of issues around criticality, confidentiality or risk – will continue being held on-premise,” he explains. “There are still many legacy companies, like ours, that are still at an early stage regarding the journey of discovery and the cloud.”

“We’re still experimenting in many ways. The clear majority of our main infrastructure still sits on a physical data warehouse and it will continue to do so for several years. But we’re certainly looking at how we migrate in the future.”

That stance resonates with Gregor Petri, research vice president at Gartner, who says most businesses are currently focused on creating a cloud strategy, but concerns remain around existing application portfolios. While some IT leaders might consider lifting and shifting a small minority of services, the majority of CIOs will use the cloud in the future as a platform to deliver innovation.

“That approach means most things will eventually run in the cloud, but not because CIOs pick up existing applications and choose to run them somewhere else,” says Petri. “Lifting and shifting doesn’t give you the benefits of the cloud. But what is true is that the next version of the software you’re running today, and intend to run in the future, will run in the cloud.”

The need for reliable governance

This piecemeal approach is a strategy that chimes with Richard Corbridge, chief digital and information officer at Leeds Teaching Hospitals NHS Trust, whose organisation is aiming to move to a cloud-first policy over the next four years. Corbridge, who joined the Trust late last year, says the transition to on-demand services is taking longer because of investments in internal infrastructure made before he arrived.

Now, however, cloud increasingly represents the sensible choice for organisations in his sector. “Healthcare organisations are pushing hard to show that cloud is not just possible but almost mandated as the approach of choice, due to its added security and functionality,” says Corbridge. “There’s no point in every Trust in the UK investing in their own infrastructure.”

He says it makes good business sense for Leeds to use the cloud, especially if that approach helps support better investment in security through the big-budget approach of vendor partners. Corbridge recognises governance is still a concern, suggesting this need to keep a tight grip on data location will help sponsor the use of mixed approaches to the cloud.

“There’s probably still a need for a bit of a hybrid solution because of the nature of some of the systems, particularly the legacy technology that some Trusts have,” he says, adding that the likely direction of travel for European public sector bodies is towards on-demand services. “As we modernise, I think we’ll ultimately all move more towards the public cloud.”

“Cloud is still too expensive”

Like Corbridge, Richard Gifford, CIO at logistics firm Wincanton, says his organisation must deal with legacy concerns. At the moment, his business has a high, fixed-cost legacy installation base. “I’m not able to breathe in easily if circumstances change,” says Gifford.

“Equally, we sometimes want to be able to expand rapidly, so that we can serve our customers in a more agile way. That means we’re looking to move from a fixed-cost base to a scalable operation.”

Gifford says the form of that transition, including choice of provider, is up for debate. The organisation will go to the public cloud for some areas, such as testing and development, and for certain enterprise applications. But where the firm is running 24/7 operations, such as in the case of ecommerce, the cloud is currently too expensive.

So, rather than the commonly-cited security implications of going on demand, Gifford says cost concerns are a bigger impediment to a cloud-first strategy. “We’re embracing on-demand IT, but we’ll be making a tentative move to the cloud because of commercial reasons,” he says.

“In the public cloud, I’m going to have to stand up services to run 24/7, so I’m actually paying for that computing power even if we’re not using it. In a private cloud, that situation is different because I’m only paying for the service when I need it.”

The future of computing is at the edge

It is reasonable to suggest, therefore, that the approach to the cloud varies between sectors and organisations – and that variation will remain a common theme going forward. What is certain, however, is that the cloud will be used to help deliver innovation and new services.

Richard Orme, CTO of Photobox Group, expects the next year or two to be huge in terms of businesses using the cloud to deliver benefits to customers. “People are going to see much more rapid iterations on their apps, software and operating systems and far fewer big, one-off updates,” he says.

Further down the line, as cloud becomes the norm, Orme believes there is going to be much greater use of edge computing. This involves the proliferation of smaller, localised data centres that are focused on machine-to-machine interactions, such as those that could be used to power artificial intelligence (AI) initiatives.

“AI requires vast numbers of interactions between two machines – for example, your phone and the data centre which holds the AI engine – so being physically close makes a big difference in how quickly these machines can interact and, therefore, respond to an end user. As the complexity of AI grows, edge computing and local data centres that are used specifically for super-fast machine-to-machine interactions will become key,” says Orme.

Plenty of investment is taking place in machine learning – but the skills gap isn’t going away

If you thought the cloud skills gap was bad, then it’s only going to get worse as more emerging technologies mature.

According to a new report from Cloudera, more than half of the 200 European IT managers surveyed said they were hesitant to adopt machine learning technologies because they did not have enough skills and knowledge in the area.

As might be expected, plenty of investment is going to be taking place. Almost nine in 10 (87%) of those polled said they have already implemented machine learning technology, or plan to do so. While a similar number (89%) said they had a ‘basic’ understanding of ML’s benefits, 60% admitted they lacked the skills to implement the solutions fully.

When it came to cost, however, there was an overall positive reaction. Three quarters (74%) of respondents said machine learning is a cost that will eventually pay for itself, while a third of companies said they were already seeing a return on investment from their initiatives. 84% of those polled said machine learning provided a competitive advantage, with the key benefits being improved operational efficiency and greater data insights.

Regular readers of this publication will be more than aware of the difficulties organisations face in terms of closing the skills gap, seemingly regardless of technology. Cloud computing professionals continue to be at a premium; a study earlier this week from OpsRamp found 94% of the more than 100 respondents were having a ‘somewhat difficult’ time finding candidates with the right tech and business skills to drive digital innovation.

Problems can go in both directions too. Let’s put it this way: if organisations are going to hire someone to take charge of their ML initiatives, they are going to receive a driven, talented, intelligent professional. If this is the case, make sure you have enough work for them to get on with.

Writing on Medium in March, Jonny Brooks-Bartlett, a data scientist at Deliveroo, noted the disparity between companies that don’t know what they’re doing and young, hungry data scientists. “The data scientist likely came in to write smart ML algorithms to drive insight, but can’t do this because their first job is to sort out the data infrastructure and/or create analytics reports,” he wrote. “In contrast, the company only wanted a chart they could present in their board meeting each day.”

Ultimately, it is this uncertainty which needs to be eradicated if organisations want to get serious about machine learning. “Although most IT buyers understand the benefits of machine learning, many are still unsure about how to implement and how it will impact their businesses,” said Stephen Line, VP EMEA at Cloudera. “These are barriers that can be overcome through upskilling staff, recruiting new data talent, working with the right partners who can complement existing teams, and through leveraging external technology.”


Google Cloud’s Security Command Centre enters beta phase


Connor Jones

6 Dec, 2018

Google Cloud has announced its Cloud Security Command Centre (SCC), previously revealed back in March, is now available in beta to Google Cloud Platform (GCP) customers.

The Cloud SCC, according to Google Cloud, is the first service of its kind from a major cloud provider, offering organisation-level visibility into assets, vulnerabilities and threats. Essentially, the new service provides a user-friendly hub through which all levels of a business can access and assess data security events from across its network.

Data can be accessed through a simple dashboard that allows for fast detection of security risks and possible vulnerabilities. These include overly permissive firewall rules and alerts relating to machines possibly compromised for coin mining.

The Cloud SCC gives users a comprehensive overview of all cloud assets across GCP services, allowing them to view resources across the whole GCP organisation or just specific projects. It also lets users make changes, such as setting up automatic notifications when a policy change is made to a network firewall that then needs to be reverted at a later date.
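
For teams that would rather consume this data programmatically, Google also publishes client libraries for the Cloud SCC API. As a rough sketch – assuming the google-cloud-securitycenter Python package, suitable GCP credentials and a placeholder organisation ID – listing an organisation’s assets looks something like this:

# Rough sketch: list the assets Cloud SCC can see for an organisation.
# Assumes the google-cloud-securitycenter package and suitable GCP
# credentials; the organisation ID is a placeholder.
from google.cloud import securitycenter

client = securitycenter.SecurityCenterClient()
org_name = "organizations/123456789"  # placeholder

for result in client.list_assets(request={"parent": org_name}):
    print(result.asset.security_center_properties.resource_name)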

Another interesting feature of the Cloud SCC is that it provides an overview not just of Google Cloud security services such as Forseti and Cloud Security Scanner, but of third-party services too, if the business has those implemented alongside Google Cloud services.

The features also work to streamline the experience of detecting security risks in the business by having all assets feed information into one dashboard, without having to visit separate consoles or cloud environments. Third-party tools can also be directly accessed through Cloud SCC to help speed remediation efforts.

The SCC will also provide coverage across Cloud Datastore, Cloud DNS, Cloud Load Balancing, Cloud Spanner, Container Registry, Kubernetes Engine, and Virtual Private Cloud, the company confirmed.

The tool is similar in function to the Shield platform currently being developed by Box, announced in August. Box is betting on machine learning-based security as a major selling point of the platform, which will also give admins a detailed overview of a company’s security portfolio.

Set to be released in 2019, Box Shield will allow security analysts to check to see what content is being accessed, who is accessing it, and whether sensitive data is being downloaded.

Salesforce adds IoT insights into cloud-powered Field Service Lightning mobile app


Clare Hopping

6 Dec, 2018

Salesforce’s IoT Insights platform is now available as part of its Field Service Lightning offering, allowing businesses to gain a better understanding of the IoT devices operating within their systems.

The addition of IoT Insights means workers out in the field are able to identify and diagnose problems with equipment remotely using the Field Service mobile app, allowing them to quickly isolate a problem and deploy the right engineer to fix any issues.

The new integration will now mean that engineers can access far more detailed information on the devices being used by the company on the edge of their network. Specifically, workers can now identify when a device is going to fail and how to fix the issue ahead of time, so that they arrive at a site equipped with the tools they need.

Because all this data is kept within Salesforce, businesses can stay on top of all the admin associated with field workers and operations. The tool features automated work order processes that are triggered by signals coming from the IoT devices, essentially cutting down the amount of admin work required to support repairs.

For example, as soon as a device starts malfunctioning, the system can autonomously identify what’s wrong and deploy an engineer, rather than the case having to go through a call centre.

“IoT-enabled products give organizations an opportunity to take a proactive approach to customer service, and the potential is limitless,” said Paolo Bergamo, SVP and GM of Salesforce Field Service Lightning.

“Examples include smart homes that notify service teams when an oven or air conditioning unit is about to fail, so they can fix the issue before the machine breaks; or industrial-scale machines that automatically send performance signals to field technicians in advance of routine maintenance. The promise is a world with zero down-time where everything just works.”

Gartner predicts that there may be as many as 20 billion connected devices across the globe by 2020.

In May, Salesforce announced it was extending its data centre footprint in the UK with the opening of its second facility, one that runs entirely on renewable energy.