WikiLeaks claims to publish confidential AWS data centre location information

WikiLeaks has published what it claims is a 'highly confidential' document outlining the addresses and operational details of Amazon Web Services (AWS) data centres.

The whistle-blowing organisation says it published the document, which dates from late 2015, in an attempt to shed light on the 'largely hidden' nature of cloud infrastructure locations.

"While one of the benefits of the cloud is the potential to increase reliability through geographic distribution of computing resources, cloud infrastructure is remarkably centralised in terms of legal control," the company wrote in a statement. "Until now, this cloud infrastructure controlled by Amazon was largely hidden, with only the general geographic regions of the data centres publicised."

AWS' global infrastructure page outlines geographical locations in terms of 'regions'; for instance, US East has six availability zones in Northern Virginia and three in Ohio, while Europe has a presence in Frankfurt, Ireland, London and Paris – with three zones, or data centres, each.
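
For a sense of how much of this is publicly visible, the level of detail AWS exposes programmatically stops at region and availability zone names; no street addresses are returned. A minimal sketch, assuming the boto3 package and configured AWS credentials:

```python
# List each AWS region visible to the account and its availability zones.
# Note the API reports zone names (e.g. us-east-1a), not physical addresses.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
regions = [r["RegionName"] for r in ec2.describe_regions()["Regions"]]

for region in sorted(regions):
    resp = boto3.client("ec2", region_name=region).describe_availability_zones()
    zones = [z["ZoneName"] for z in resp["AvailabilityZones"]]
    print(f"{region}: {len(zones)} zones ({', '.join(zones)})")
```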

This is common practice – and compared with some other providers, it is more information than usual. Oracle, for instance, only put together a public-facing map of its cloud regions late last year; when this publication asked for a list of its cloud data centre regions in mid-2017, the reply was that none was available.

WikiLeaks claims the document reveals elements of obfuscation. On page seven, regarding the IAD77 data centre unit in Virginia, the document states that Amazon 'is known as Vandalay Industries on badges and all correspondence with building manager'; the name does not appear to exist outside its use as a fictional company in the US sitcom Seinfeld. WikiLeaks has also issued an updated map of AWS' regions with addresses, notes and contact numbers.

The timing of the disclosure also appears pointed, given the upcoming $10 billion cloud contract for the US Department of Defense. As this publication reported in March when the tender was opened, the government's search for a 'coordinated enterprise-level approach to cloud infrastructure' meant it was looking for a single vendor – arguing multi-cloud was too complex – to fulfil the work. Earlier this week, it was reported that Google had dropped out of the race, while Microsoft employees had protested about the ethical complications of winning the contract.

AWS and WikiLeaks have locked horns previously. In 2010, the former kicked the latter off its platform, having previously been a customer, saying WikiLeaks did not own or otherwise control the rights to the classified content it was disclosing.

"[We] have hundreds of thousands of customers storing all kinds of data on AWS. Some of this data is controversial, and that's perfectly fine," the company said in a statement at the time. "But when companies or people go about securing and storing large quantities of data that isn't rightfully theirs, and publishing this data without ensuring it won't injure others, it's a violation of our terms of service, and folks need to go operate elsewhere."

You can take a look at the WikiLeaks document here.

IBM Multicloud Manager aims to simplify working across multiple cloud environments


Bobby Hellard

15 Oct, 2018

IBM has launched a service that lets organisations integrate hybrid cloud services and on-premise business systems through a single, simplified interface.

Called ‘Multicloud Manager’ and operated via the IBM Cloud, the service aims to help businesses manage and integrate workloads from other public and private cloud providers, such as Amazon or Microsoft, via an operations console.

Multicloud Manager also makes use of the Kubernetes container orchestration technology to make it easier and cheaper to move and manage workloads across both cloud and on-premise environments. An integrated compliance and rules engine helps ensure those applications remain compliant with security standards.
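
IBM has not published the console's internals, but the underlying pattern – one view over workloads running on several Kubernetes clusters – can be sketched with the official Kubernetes Python client. The context names below ('aws-prod', 'azure-prod', 'onprem') are placeholders for entries in a kubeconfig file:

```python
# Hypothetical multi-cluster inventory: iterate over kubeconfig contexts and
# report deployment health in each cluster from a single script.
from kubernetes import client, config

for context in ["aws-prod", "azure-prod", "onprem"]:  # placeholder contexts
    apps = client.AppsV1Api(
        api_client=config.new_client_from_config(context=context))
    for dep in apps.list_deployment_for_all_namespaces().items:
        ready = dep.status.ready_replicas or 0
        print(f"[{context}] {dep.metadata.namespace}/{dep.metadata.name}: "
              f"{ready}/{dep.spec.replicas} replicas ready")
```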

Prompting the new service, a report from IBM’s Institute for Business Value found that 85% of companies use more than one cloud environment, and that the hybrid cloud/on-premise IT model has been a popular choice for many organisations. A separate report from 451 Research suggested that UK firms spent more on the cloud in the past three years than they did on data centres.

"With its open source approach to managing data and apps across multiple clouds, the IBM Multicloud Manager will enable companies to scale their many cloud investments and unleash the full business value of the cloud," said Arvind Krishna, SVP, IBM Hybrid Cloud.

"In doing so, they will move beyond the productivity economics of renting computing power, to fully leveraging the cloud to invent new business processes and enter new markets."

IBM expects its new service to be a game-changer for modernising businesses around the world by integrating multiple cloud services via one simple-to-use interface.

As an example, if a car rental company uses one cloud for AI services, a separate cloud for its booking systems, and runs financial processes on-premise, Multicloud Manager can work across the different computing infrastructures to let customers book a car quickly via the company’s mobile app.

"The old idea that everything would move to the public cloud never happened," said Stephen Elliot, program vice president, IDC. "Instead, the cloud market has evolved to meet the needs of clients who want to maintain on-premises systems while tapping a multitude of cloud platforms and vendors. The challenge for this approach is integration. Many IT companies have been talking about multi-cloud, but to date, the user experience has been fragmented."

Should you stop using CCleaner?


Jane Hoskyn

16 Oct, 2018

CCleaner – still developed by Piriform, but now owned by Avast – was one of the most highly recommended free software tools around, until this summer, when the great junk-remover became such a ghastly junk offender that even its own parent company withdrew its latest version.

Can our old favourite ever be trusted again? Here we answer your questions and look at the best free CCleaner alternatives.

What did CCleaner do wrong?

The trouble began in May, when CCleaner 5.43 added two pre-ticked boxes: ‘Allow usage data to be shared with 3rd parties for analytics purposes’ and ‘Show offers for our other products’. You couldn’t untick either of them unless you paid for an upgrade. CCleaner’s June release (5.44) duly spammed users with pop-up adverts for a ‘Summer Sale’.

CCleaner 5.43 displayed tick boxes for data-collection and ads that you couldn’t untick

Then came the infamous July release (5.45), which removed both Privacy tick boxes but continued to opt you into data-gathering and adverts. What’s more, CCleaner now kept running after you closed the program window, and its Active Monitoring process had become impossible to switch off.

CCleaner 5.45 removed tick boxes for ads and data collection, but opted you in anyway

Users speculated that Active Monitoring, which claims to look out for temporary files, was being used to track you. Why else would CCleaner be so reluctant to let you close it?

"Somebody over at Piriform REALLY REALLY wants you to enable monitoring whether or not you like it," said one of many furious users on the Piriform Community Forum.

Didn’t GDPR ban that kind of thing?

Indeed it did. According to the General Data Protection Regulation (GDPR), which came into effect on 25 May, consent is not valid if: "There was no genuine free choice over whether to opt in; you use pre-ticked opt-in boxes or other methods of default consent (or) people cannot easily withdraw consent" (see the ICO’s website for more information).

CCleaner 5.44 still tries hard to stop you switching off Active Monitoring – click Yes to ignore it

So by greying out its Privacy tick boxes, and then removing them completely, it appears CCleaner failed on at least three counts to meet required standards for consent.

The one reassurance is that CCleaner’s free edition doesn’t update automatically, so you may be using an older version that does let you opt out. Sticking with outdated software isn’t usually the safest policy, but this mess shows it can pay to wait for any problems to emerge before jumping into a new version.

What did CCleaner have to say for itself?

Avast, which bought CCleaner’s developer Piriform last year, spent the summer unleashing defensive drivel that ranged from empty clichés ("Your privacy is very important to us") to patronising filibuster ("In order to answer that question") via oodles of self-important jargon about analytics, aggregation, anonymisation and "underlying mechanisms". Here’s 400-odd words of it.

That statement, released by Avast on 6 August, admits ("as part of our ongoing mission", sigh) that version 5.45 "introduced some features… aimed at providing us with more accurate data". So they’re tracking your moves more closely than ever. And, as we know, the data is then shared.

The statement goes on to insist data-gathering is "a separate function to Active Monitoring", but doesn’t say how it’s carried out. Next, Piriform says it’s working on a new version of CCleaner, in which data-gathering and Active Monitoring will be separate. Hang on, didn’t they just say these were already separate? Avast seriously underestimates its users’ intelligence.

Hours after the statement appeared, it emerged that Avast was ditching version 5.45 and rolling back to 5.44 until the next version is ready.

Meanwhile, Avast has defended its prying by saying the info it shares is "essentially anonymous" (look for ‘Laurence Piriform’ on the forums). But anonymity does not make spying OK. If someone’s snooping on your home but they can only see your silhouette and don’t know your name, they’re still snooping.

Is CCleaner OK to use again?

At the time of writing the official version is 5.47. It lets you switch off Active Monitoring and close the program easily, and you can also now untick the ‘Allow usage data’ box.

So if you really want to stick with CCleaner, install 5.47 from www.ccleaner.com, then tweak your privacy settings immediately. Go to Options, Privacy, and untick ‘Allow usage data to be shared…’. Then go to Options, Monitoring, untick ‘Enable system monitoring’ and untick ‘Enable Active Monitoring’. Click Yes in the pop-up that tries to talk you out of it (see screenshot).

Whether that makes it OK to use again, we’re not so sure. If you share your doubts, it may be time you tried an alternative.

What should I use instead?

Your easiest option is Windows’ built-in Disk Clean-up tool, which can remove gigabytes of temporary files, caches and old Windows updates. You can also set Windows 10 to remove temporary files automatically when space is low: go to Settings, System, Storage and switch on ‘Storage sense’.
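
At heart, what these tools automate is simple housekeeping. As a purely illustrative sketch in Python (not how Disk Clean-up itself works), the core job is deleting temp files nobody has touched in a while:

```python
# Delete files in the user's temp directory untouched for 30+ days.
# Illustrative only - this really deletes files, so run with care.
import os
import tempfile
import time

CUTOFF = time.time() - 30 * 24 * 3600  # anything older than 30 days

for root, _dirs, files in os.walk(tempfile.gettempdir()):
    for name in files:
        path = os.path.join(root, name)
        try:
            if os.path.getmtime(path) < CUTOFF:
                os.remove(path)
                print("removed", path)
        except OSError:
            pass  # locked or already gone - skip it
```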

Away from Windows, open-source tool BleachBit is the most powerful free alternative. There’s no fuss or flash; just tick what you want to clean, then follow BleachBit’s advice (it warns you if some files are slow to clean, and if others are worth keeping – browser passwords, for example). You can add many more programs to the list for really deep cleaning. The installable version adds a ‘Shred with BleachBit’ option for obliterating sensitive files, while the portable version runs on Windows XP and later.

Other alternatives include System Ninja, whose free version includes a duplicate finder, and ATF (All Temp File) Cleaner, a free, portable program designed for Windows XP, Vista and 7 – and which still works.

To get previous versions of CCleaner click ‘Download Now’ at OldVersion.com

If you pine for the days when CCleaner got to work without rifling through your drawers, you can install editions going right back to 2004 from OldVersion.com.

Google to offer G Suite cloud identity tool separately to developers


Clare Hopping

15 Oct, 2018

Google has split its G Suite cloud identity tool from the rest of its enterprise services so that developers can integrate it into their own products and services.

The company’s Google Identity, which was built on the BeyondCorp framework, was previously only available as part of the wider G Suite ecosystem. But there is apparently plenty of demand for it to work on its own, so Google will now launch a beta of Cloud Identity for Customers and Partners (CICP) on its Cloud Platform.

The product will allow developers to integrate identity and access management for apps and services without having to move away from the GCP environment or enlist the help of a third party.
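
Full CICP documentation was not yet public at the time of writing, so as a hedged illustration of the kind of integration involved, here is the server-side half of a standard Google sign-in flow using the google-auth Python package; the client ID is a placeholder:

```python
# Verify a Google-issued ID token sent by a client app and extract its claims.
from google.auth.transport import requests
from google.oauth2 import id_token

CLIENT_ID = "1234567890.apps.googleusercontent.com"  # placeholder

def verify(token: str) -> dict:
    """Return the decoded claims if the token is valid; raise ValueError if not."""
    claims = id_token.verify_oauth2_token(token, requests.Request(), CLIENT_ID)
    return claims  # includes 'sub' (a stable user id) and usually 'email'
```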

"We’ve had a lot of success internally with the model and we’ve received good feedback from customers, but they wanted to use it (Cloud Identity and BeyondCorp) throughout the organization and as a standalone product," said Karthik Lakshminarayanan, product management director at Google Cloud Platform.

It’s an authentication service with integrated automated threat detection, built on scalable infrastructure, which makes it a natural fit for businesses already using GCP to develop their apps and services.

So why has Google only just decided to split its cloud identity tool away from the main G Suite set of tools?

“Expectations have changed,” Jayachandran told VentureBeat. “Users expect agile, mobile work environments across multiple devices, and it’s reshaping how we think about security, access, and control. Admins want to give them this modern, forward-thinking experience, but they don’t want security to be compromised. The perimeter has disappeared.”

Snowflake secures $450m funding for expansion and further multi-cloud exploration

The money just keeps rolling in for Snowflake Computing. The San Mateo-based data warehousing provider has announced $450 million (£341m) in additional growth funding to help grow its organisation and explore new strategies.

Snowflake offers a complete data warehouse – a repository of integrated data from one or more sources – in the cloud, with the ability to upload and analyse data for business intelligence. The company’s secret sauce is its patented architecture, which offers benefits such as near-linear scale-out.
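
To make that upload-and-analyse loop concrete, here is a minimal sketch using the snowflake-connector-python package; the credentials and SALES table are placeholders, not anything from Snowflake's announcement:

```python
# Stage a local CSV into a table, load it, and run an aggregate query.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder account identifier
    user="analyst", password="********",
    warehouse="ANALYTICS_WH", database="DEMO", schema="PUBLIC",
)
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS SALES (region STRING, amount NUMBER)")
cur.execute("PUT file:///tmp/sales.csv @%SALES")  # upload to the table stage
cur.execute("COPY INTO SALES FROM @%SALES FILE_FORMAT = (TYPE = CSV)")
for region, total in cur.execute(
        "SELECT region, SUM(amount) FROM SALES GROUP BY region"):
    print(region, total)
conn.close()
```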

Funding for this round was led by Sequoia Capital – whose other investments have included Cohesity, Docker and Skyhigh Networks – with participation from Altimeter Capital, Capital One Growth Ventures, ICONIQ Capital, Madrona Venture Group, Meritech Capital, Redpoint Ventures, Sutter Hill Ventures and Wing Ventures. Aside from Meritech, all were previous investors.

The company’s plans for the funding include growing its sales and engineering teams across the US and globally, as well as expanding its multi-cloud strategy. Snowflake had been available on Amazon Web Services (AWS) since its inception, but compatibility with Microsoft Azure was announced back in July. At the time, the company said the move came about because of customer demand – and with the company’s first value being to ‘always put the customer first’, it will be interesting to see how this progresses from here.

This is not Snowflake’s first funding round of the year; as this publication reported in January, the company secured $263 million in growth funding at a pre-money valuation of $1.5 billion. This time round, the company’s valuation stands at $3.5bn.

“Learning to be data-driven is an imperative for every organisation today, and a data-driven organisation must be in control of its data,” said Bob Muglia, Snowflake CEO in a statement. “Snowflake is the most powerful data warehouse in the world for analytics solutions. That power delivers the security, control and business answers needed to enable data-driven organisations.

“This is driving spectacular growth for our company, and this latest funding round will provide Snowflake with the resources we need to serve our rapidly growing set of new and existing customers around the world,” added Muglia.

Companies still hitting cloud roadblocks despite extensive preparation, research finds

Organisations are recognising the benefits of the cloud and making extensive preparation – but they are still experiencing various problems with implementation, according to a new study.

The study, conducted by IT provider Softchoice, and which polled 250 IT decision makers across North America, found preparation for cloud initiatives was, on the whole, exemplary. 83% of those polled said they assessed existing applications to determine if they were ready for the cloud, 82% modernised their data centres in preparation, while just under three quarters (72%) communicated the business impact of a cloud strategy internally.

Once companies take the plunge however, the issues begin. 57% of those polled admitted they had exceeded their cloud budgets at some point, while more than two in five (43%) said they had trouble in knowing how to create an effective cloud management strategy.

The larger the organisation, the greater the struggle. Almost half (48%) of IT leaders at mid-sized firms strongly believed moving to the cloud had helped them achieve their business goals, compared with just 36% at enterprises. Just over a third (36%) of all respondents strongly agreed they were confident about their cloud security policies.

The report also provided one of the strongest assertions yet that the skills gap is alive and well in cloud computing: 96% of those polled said there was a skills gap in their organisations. This is a long-term bone of contention, as regular readers of this publication will recognise. A study from F5 Networks and Foresight Factory last month argued the importance of management in this context; with technologies such as containers and APIs, as well as multiple cloud services, coming to the fore, the issue looks set to persist.

“The journey to the cloud, no matter the organisation, isn’t without its challenges,” said Craig McQueen, senior director of innovation at Softchoice. “Organisations are doing the necessary prep work, but there are still opportunities to adjust their strategies for long-term success.

“When IT leaders prepare for the unpredictability in cloud costs, and bring in the right outside partners, organisations can become more efficient and effective in the cloud,” McQueen added.

You can find out more about the report here (email required).

Announcing @Nutanix "Platinum Sponsor" of @CloudEXPO NY | #Nutanix #Agile #DevOps #Serverless #CloudNative

Nutanix has been named "Platinum Sponsor" of CloudEXPO | DevOpsSUMMIT | DXWorldEXPO New York, which will take place November 12-13, 2018 in New York City. Nutanix makes infrastructure invisible, elevating IT to focus on the applications and services that power their business. The Nutanix Enterprise Cloud Platform blends web-scale engineering and consumer-grade design to natively converge server, storage, virtualization and networking into a resilient, software-defined solution with rich machine intelligence.


Google Cloud CEO Diane Greene: On becoming a ‘major enterprise player’ – with AI as the heartbeat

Google Cloud CEO Diane Greene has told an audience in London of her pride at how quickly the company has become a ‘major enterprise player’ – with artificial intelligence (AI) and security the key differentiators to their value proposition.

Speaking at the latest leg of Google Cloud Next’s world tour, Greene explained the increasing importance of AI to the cloud equation – a topic covered in chapter and verse by this publication in recent times.

“Cloud is becoming just a better way to run your IT, and it’s turning out to almost provide a working structure for how you can effect change in your company,” said Greene. “Data tends to be in silos across your data centres or in lots of different databases – then people move to the cloud and you put it all in one data lake, and all of a sudden everybody in the company can easily ask questions, subject to access controls.

“Then once you have that data you can start running analytics, doing AI, getting to know your customers so much better, making predictions about what they want and training on your own data – and everything changes for the better,” added Greene.

Regular readers of this publication will be aware of what Google is doing around pre-packaged AI – in August services were launched around contact centre and talent acquisition. But these industry-focused cloud toolkits don’t stop there. Google is working in the realms of financial services for anti-money laundering and fraud detection, retail for inventory predictions, and healthcare for managing and anonymising records, as well as diagnostics.

This is all AI-flavoured, of course; Greene noted that healthcare was one of the first industries targeted because it is so data-intensive, lending itself well to building machine learning models. This is interesting when compared with the verdict from analyst firm CCS Insight earlier this month; the company predicted that by 2020 cloud service providers would expand from general purpose AI to business-specific applications. While Google appears to be leading the way in this field right now, expect efforts here to ramp up considerably in the coming 18 months.

Regarding security, Greene put forward a couple of eye-opening statistics: Google filters out 7,000 bad URLs per minute, as well as 10 million spam and phishing attacks. Those who read earlier this week about Google+ being shut down after a software bug exposed user data may raise eyebrows rather than open eyes at the following statement, but Greene emphasised the security side. "Our security is just built into every layer of the system – our assumption is that anything on the network is a risk," said Greene. "There’s just no more secure setup than taking a Chromebook, adding hardware-based two-factor authentication and running it discless with G Suite."

G Suite – with AI completely infused, as Greene put it – is a key component of the Google Cloud mix. Last year Google revealed that Verizon had rolled out the collaboration tool to more than 150,000 of its employees, while more recently Airbus had gone all-in, with 130,000 employees. Other customers noted here were retailer Carrefour, using both Google Cloud Platform and AI capabilities, long-time partner SAP, and the Zoological Society of London (ZSL), which is trusting Google with image APIs for wildlife surveillance.

This comes amidst recent comments from Amir Hermelin, formerly product management lead at Google Cloud, who took to Medium to lament mistakes in taking too long to realise the value of the enterprise market. “Seeing success with Snapchat and the likes, and lacking enough familiarity with the enterprise space, it was easy to focus away from ‘large orgs,’” wrote Hermelin. “This included insufficient investments in marketing, sales, support, and solutions engineering.”

With this in mind, Greene appears to be doing her best to make up for lost time here.

Picture credit: Google Cloud/Screenshot


Simplifying complex public sector environments through cloud: A guide

Government and public sector organisations continue to seek ways to improve services while mitigating the risk of migrating mission-critical applications to the cloud. Many organisations, focused on improving the citizen experience, have already set their sights on the cloud. Flexible, agile and affordable, managed cloud can accelerate the agency mission. Still, CIOs are cautious about which applications to prioritise and what steps they must take to ensure the reality fulfils the promise.

In the US, since the White House released the Cloud First policy in 2011, organisations have prioritised the evaluation of cloud computing before making any new investments in IT infrastructure or software. Likewise in the UK, the government set out its plan to adopt a 'Cloud First' policy for public sector IT back in 2013. As a result, organisations worldwide have cautiously deployed sub-sets of applications in the cloud, such as email, and an increasing number have undertaken projects to develop cloud-native applications. Successful deployments and more cloud choices are increasing public sector confidence in cloud.

The benefits of cloud are real

'Mission-critical', a term that originated in government, is now used by all kinds of organisations to describe applications on which their operations crucially depend; in the public sector, that dependency extends to the business of governing itself. Delivering mission-critical applications more efficiently saves taxpayers’ money, strengthens security and enables government and public sector organisations to do more with constrained budgets. Migrating legacy workloads and applications to the cloud frees up budget to tackle transformative new projects and programmes.

The financial incentives for migrating traditional applications to the cloud are real. Analysts estimate that in 2018, 70% of the $95.7 billion US federal IT budget – roughly $67 billion – will be spent on operating and maintaining existing IT investments, and the percentage is likely to be similar in the UK.

Migration speedbump: Reliance on core enterprise applications

While modern web-scale applications were designed to run in the cloud, most organisations still rely on a broad mix of existing mission-critical applications, many of which were not intended to operate in a cloud computing environment and some of which were not designed to scale up or down on demand. Further complicating migration planning, applications may require stringent security or compliance controls to ensure that regulatory guidelines are met.

While organisations are increasingly eager to move to the cloud, many soon realise that it takes considerable time and resources to plan, re-architect and manage the new solution. The majority of today’s mission-critical workloads do not fall into the category that makes it easy for public sector organisations to migrate and gain authority to operate (ATO) quickly and cost-effectively. Most public cloud offerings do not allow these organisations to realise true IT transformation since the self-service nature of public cloud still requires organisations to perform most of the management activities themselves in order to support the workload and the end-user base.

Cloud – the public face of transformation

Public sector organisations need a better path to the cloud with secure and cost-effective managed cloud solutions that allow them to run the complex, mission-critical applications of today. Government and public sector organisations require a cloud that was built to handle mission-critical applications for the most complex, highly secure IT landscapes in the world.

Accelerate time to value

While public sector IT budgets remain stagnant or even decrease, expectations for new technology are continually rising. Public sector and government organisations need to increase the performance and reliability of their most expensive and time-consuming mission-critical applications while simultaneously managing and containing costs. Organisations must be able to step into the cloud today by deploying existing applications with minimal changes and gaining ATO sooner, rather than taking a giant leap and re-architecting every application from scratch to be cloud-native.

Security and compliance

Different applications have different use-case characteristics, and organisations must understand and address each application's specific requirements – including stringent security and compliance controls – before migrating their mission-critical applications. They also need an approach to cloud which ensures continuity of operation and shortens time to value without compromising those essentials.

Manage costs

Organisations need to maximise operational efficiency while increasing cost transparency and gaining the ability to allocate cost to specific programmes, giving them the financial flexibility to choose the right services and resources for each workload. A true consumption-based pricing model allows government IT departments to pay only for what they use – improving economics beyond basic virtualisation and freeing up budget to be allocated to other areas.
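
As a toy illustration of that argument (the hourly rate and usage pattern are invented, not Virtustream pricing), compare a flat provisioned charge with consumption-based billing for a workload that is only busy during working hours:

```python
# Invented numbers: fixed provisioning bills every hour in the month,
# consumption-based billing only meters the hours actually used.
HOURS_IN_MONTH = 730
BUSY_HOURS = 8 * 22        # 8 hours a day, 22 working days
RATE = 0.40                # dollars per instance-hour (hypothetical)

provisioned = HOURS_IN_MONTH * RATE   # pay for idle capacity too
consumption = BUSY_HOURS * RATE       # pay only for metered use

print(f"provisioned: ${provisioned:.2f}/month")  # $292.00
print(f"consumption: ${consumption:.2f}/month")  # $70.40
```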

Offload management burden

Cloud platform infrastructure is only part of the mission-critical workload story. Organisations can experience true IT transformation through comprehensive IT management solutions that raise the bar on current operations and ensure that the most value and performance is gained from information infrastructures. Industry-leading methods and innovative tools now exist, providing real-time metrics to detect, diagnose, repair, and report on issues in the most complex business IT environments.

Focus on the mission

Government agencies can now effectively navigate the complexities of digital transformation and modernisation. Here at Virtustream, our Federal Cloud capabilities allow public sector and government organisations to deliver mission-critical applications more efficiently and strengthen security, while simultaneously enabling them to tackle transformative new projects and programmes.


London’s Zoological Society fights species extinction with Google Cloud


Bobby Hellard

11 Oct, 2018

The rainforest is a wonder of nature that’s fundamental to the preservation of life on the planet, yet in the last 40 years the Earth has lost 58% of its wildlife as biodiversity has declined catastrophically across the world.

Hoping to slow this horrific trend, the Zoological Society of London (ZSL) has been working with Google Cloud, a partnership that has seen the cloud giant provide the hardware and software needed to monitor species around the world.

Speaking at Google Cloud Next, ZSL’s conservation technology lead, Sophie Maxwell, shed light on a project known as InstantWild, which the two organisations first launched together in 2017.

ZSL’s Sophie Maxwell on stage at Google Next 2018

In the beautiful southeast Asian rainforests of Borneo – home to the orangutan, the pygmy elephant and the clouded leopard – ZSL and Google set up networks of camera traps to collect images of the island’s diverse inhabitants.

Borneo is one of the oldest rainforests on the planet. The small digital cameras deployed in the field have a passive infrared sensor that detects motion and snaps a picture of wildlife as it walks past. They’re typically deployed in grids of around 30 cameras to survey wildlife populations in the areas where ZSL works.

Male orangutans photographed by camera traps in Borneo

But technology can fail or be problematic; in the case of camera traps, they take large numbers of empty images and snaps of common species that happen to be walking past, generating far more data than researchers can handle.

"Currently, experts have to label images one by one, which can take months," said Maxwell. "With over 100,000 images coming from each survey and only 5% of them containing the target species, this is really slowing us down.

"And these surveys are happening all over the world. It’s like looking for a needle in a haystack and it’s causing a bottleneck. We are wasting our time when we could be out there saving wildlife."

To tackle this challenge, ZSL used Google Cloud and machine learning to create its own image recognition models to help identify species in camera trap images for greater and faster insights.

"They tell us about the state of nature," said Maxwell. "Whether the population numbers are going up or whether they are going down, where species are going and what the threats are that they face. We urgently need this information to help us focus and also take conservation action and inform policy change.

"We created our own models without any machine learning expertise, using 35,000 expert-labelled images, to create our own advanced model, which we applied to a challenging dataset from Borneo."

A Sun Bear photographed by a camera trap in Borneo

The models produced surprising results, with 91% classification accuracy across 34 species; according to Maxwell, automated machine learning sped up ZSL’s conservation work from months to just days.
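
ZSL built its models with Google's tooling rather than by hand, but the general technique – transfer learning from a pretrained image network onto expert-labelled camera-trap photos – can be sketched as below. The directory layout and model choice are illustrative assumptions, not ZSL's actual pipeline:

```python
# Fine-tune a pretrained network on camera-trap images arranged as
# data/train/<species>/*.jpg (a placeholder layout).
import tensorflow as tf

train = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(224, 224), batch_size=32)
num_species = len(train.class_names)

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, pooling="avg")
base.trainable = False  # keep the pretrained features, train only the head

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.Dense(num_species, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train, epochs=5)
```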

From this promising start, the wildlife conservation charity is hoping to further its work with Google Cloud and create a new platform powered by machine learning that will help conservationists create their own models for niche species and habitats.

This, Maxwell said, will be free and open source, so that researchers around the world can share their models and refine them collectively as a community to process their camera trap data and reveal new insights.

"Longer-term, we see machine learning playing a key role in giving a real-time view of the pulse of wildlife across the planet to speed up our conservation efforts, because time is running out," added Maxwell.

Images courtesy of ZSL