All posts by James

Check Point Software acquires Dome9 to beef up multi-cloud options

Check Point Software, an Israel-based cyber security firm, has announced the acquisition of compatriot Dome9 – with multi-cloud capability once again proving key.

Dome9, which like Check Point is based in Tel Aviv, offers a SaaS platform which aims to visualise organisations’ security postures in the public cloud. Companies can have verifiable infrastructure security for every public cloud, including the behemoths of AWS, Azure and Google Cloud Platform.

Check Point Software – which may be best known for its security research, including uncovering attacks such as the recent Android-centric ‘Black Rose Lucy’ botnet – is looking to enhance Infinity, which it claims is the only fully consolidated cyber security architecture, as well as its general cloud security offering, through multi-cloud protection capabilities.

“Dome9 and Check Point’s CloudGuard together provide the best cloud security solution in the industry,” said Gil Shwed, CEO of Check Point, in a statement. “Dome9’s platform will add rich cloud management and active policy enforcement capabilities to Check Point’s Infinity architecture, particularly complementing the CloudGuard security product family and make our broad solution even more differentiated in the rapidly moving cyber security environment.

“As fifth generation cyberattacks increasingly target enterprise cloud environments, so our Gen V cyber security solution must effectively protect this vector,” added Shwed. “This acquisition will enhance our ability to deliver the benefits of cloud with the critical security that must extend from the networks, endpoints and data centres to the cloud and mobile enterprise-wide.”

Dome9 had secured almost $30 million in funding during its history, with this publication reporting in April last year about a $16.5m round led by SoftBank.

You can find out more about the acquisition here.

Oracle unveils next generation cloud vision at OpenWorld – autonomous and rearchitected

“A semi-autonomous database is like a semi-self-driving car,” Oracle co-founder and CTO Larry Ellison told attendees at the company’s OpenWorld event. “You get in, you drive, and you die.”

Why the comparison? Ellison was responding to an article he had read – provenance unknown – about rumours that Amazon was building a ‘semi-autonomous’ database. It’s not much of a surprise. Oracle’s autonomous database is certainly Ellison’s favourite topic right now – but bashing the biggest player in cloud infrastructure must rank a close second.

The first keynote of Oracle OpenWorld, in San Francisco this week, focused on the next generation of cloud computing. The company’s vision is based around the autonomous database, but now added to this is a rearchitected infrastructure – what it calls the second-generation cloud.

“I’m not talking about a few software changes here and there,” said Ellison. “I’m talking about a completely new hardware configuration for the cloud. We had to add a new network of dedicated independent computers to surround the perimeter of our cloud – these are computers you don’t find in other clouds.”

Key to this revamped infrastructure, Oracle added, was the use of separate machines for customer data and for the vendor’s control code. It’s a two-way street, Ellison said: you don’t have to trust us, and we can’t trust all of you. Oracle can’t see customer data, but bad actors won’t be able to look at or modify Oracle’s control code either.

Naturally, the Oracle CTO had a comparison at hand. “If you look at the AWS cloud – in that machine can be one customer, could be multiple customers, but in that machine is the AWS cloud control code sharing the computer with customer code,” said Ellison. “That means you’d better trust your customers. You’d better trust all your customers. It’s a fundamental problem with the architecture of the cloud.”

Many of the other selling points of the autonomous database, from purported better performance to price cuts, had been covered previously when announcing capabilities for transaction processing and data warehousing, although it didn’t hurt to remind the audience.

Yet the primary focus was around security – providing ‘impenetrable barriers’ to block threats from getting to the cloud, and robots to find threats and eliminate them. All autonomous, of course. “It’s easy to say, but very hard to do to build a secure cloud,” said Ellison. “If it was easy to do, someone would have already done it.”

The focus on automation and the rise of machine learning is clear across the C-suite. Mark Hurd, Oracle CEO, told attendees at his keynote session that his company saw AI as a ‘core feature that will get embedded into virtually every application’, rather than an independent solution. “I don’t think there’s much of a debate the cloud market is accelerating – it’s moving quicker than expected,” said Hurd. “You’re going to start seeing the next generation of cloud technology capabilities – driven by AI.”

Plenty more news was announced at OpenWorld. Of most interest was Oracle’s burgeoning cloud region roadmap. By the end of next year, the company said, it would open additional regions across four geographies: Europe, North America, LATAM, and – with a particular focus – Asia Pacific, including Australia, India, Japan and South Korea. Elsewhere, Oracle announced business-ready blockchain applications – a technology the company has certainly had an interest in – while on the AI theme, Oracle Digital Assistant was launched to automate routine tasks and support various applications, from ERP to CRM.

As much as AI has prompted conversation around what it will mean for those with more mundane jobs, Ellison noted that Oracle’s autonomous database won’t mean companies’ admins will be out of the door anytime soon. “There is an incredible shortage of skilled IT professionals, and it’s good if we take out some of the mundane drudgery of running a database,” he said. “Your developers, your administrators, are now working on tasks that are higher value for the business.”

Ellison told attendees that it had taken a long time for Oracle to get here – the company had previously said it was a ‘major milestone’ – so this did feel like the point of no return. Oracle’s public cloud services will be sold solely for its generation two cloud from now on. “It required a fundamental rearchitecture of our cloud,” he said. “We did that and, as a result, we have these two key technologies that protect the cloud and protect your data.”

Picture credit: Oracle/Screenshot

Exploring the benefits and challenges of hyperconverged and software-defined storage

The verdict on software-defined, hyperconverged and cloud storage from DataCore is in: hyperconverged is making inroads but organisations are struggling with it, while software-defined storage is seeing a greater number of use cases.

This analysis may not come as a huge surprise given DataCore’s primary business is software-defined storage. Yet the study, which polled 400 IT professionals currently using or evaluating software-defined, hyperconverged and cloud storage, still has plenty of interesting statistics to consider.

Three in five respondents (60%) said that automating frequent or complex storage operations was a key business driver for implementing these storage technologies overall, while simplifying storage management (56%) and extending the life of existing storage assets (56%) were also highly cited.

For hyperconverged, performance was the key driver, while the primary attribute of software-defined storage was automation and reduced complexity. Public cloud, meanwhile, fared notably worse when it came to delivering higher performance, with more than half of respondents saying they weren’t considering it at all.

Issues such as business continuity and data protection saw consistent figures across the board, however. Almost three quarters (74%) of those polled said it was the primary capability they wanted from their storage infrastructure – the most popular choice.

More participants said they had standardised on software-defined storage (37%) than other technologies. All-flash array (29%) was the next most popular, ahead of HCI (21%), hybrid (18%), public (17%), and containers (10%). All-flash however got plenty of votes when it came to future deployment – one in three respondents said they were strongly considering it but had yet to deploy. 42% of those polled said they had no interest in public clouds and containers respectively.

The dismissal of containers may be something of a surprise given other research promoting the technology. Yet, of those who had taken the plunge, 19% said there was a lack of sufficient storage tools or data management services, 18% cited a slowdown in application response time, and the same number noted a lack of persistent storage for key applications.

The biggest concern, however, was vendor lock-in within storage, cited by 42% of those polled as their biggest problem. Again, software-defined storage was considered a useful tool in this regard. For those struggling with hyperconverged – the key reasons given were lack of integration, lack of scale and price – the study recommends what it calls ‘hybrid-converged’ technology, which amounts to being able to deploy various storage options from a unified management plane.

Alibaba Cloud launches London data centres with promise for further expansion

Alibaba Cloud has opened the doors on its UK data centres, adding to its global footprint and promising 24/7 support as well as real-time monitoring.

The launch confirms the Chinese provider’s UK expansion, with the company now operating 52 availability zones across 19 regions worldwide. Outside Asia Pacific, Alibaba has two regions in the US and one each in Frankfurt and Dubai, while outside China – but within Asia Pacific – it has a presence in Singapore, Sydney, Kuala Lumpur, Jakarta, Mumbai and Tokyo.

This publication broke the news last month that Alibaba was launching operations in the UK after spotting a new landing page with ‘London is calling’ as its headline. As per September’s specifications, the new facility is set to have 99.99% availability, cooling configured with N+1 redundancy, and dual availability zones for stronger disaster recovery.
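
As a rough illustration of what a 99.99% availability target means in practice, the arithmetic below converts the percentage into an allowable downtime budget. This is a generic back-of-the-envelope sketch, not Alibaba's published SLA terms, and the 30-day month is an assumption for the monthly figure.

```python
# Rough downtime budget implied by an availability target.
# Plain arithmetic for illustration only, not an Alibaba SLA definition.

MINUTES_PER_YEAR = 365 * 24 * 60    # 525,600
MINUTES_PER_MONTH = 30 * 24 * 60    # 43,200 (assumed 30-day month)

def downtime_budget(availability: float) -> tuple[float, float]:
    """Return (minutes per year, minutes per 30-day month) of allowed downtime."""
    unavailable = 1 - availability
    return unavailable * MINUTES_PER_YEAR, unavailable * MINUTES_PER_MONTH

yearly, monthly = downtime_budget(0.9999)
print(f"99.99% availability allows ~{yearly:.0f} min/year (~{monthly:.1f} min/month) of downtime")
# -> roughly 52.6 minutes a year, or about 4.3 minutes in a 30-day month
```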

Alibaba also brought out a customer for the grand launch, with Sean Harley, CIO at London-headquartered business media firm Ascential, saying working with Alibaba was key to its success on a global scale, particularly through the complex market of China.

“At Alibaba Cloud, we are – and always have been – committed to our customers,” said Yeming Wang, Alibaba Cloud general manager EMEA. “Our expansion into the United Kingdom, and by extension into Europe, is in direct response to the rapidly increasing demands we have seen for local facilities within the region.

“Using AI-powered and data-driven technology, our latest data centres will offer customers complete access to our wide range of cloud services from machine learning capabilities to predictive data analytics – ensuring that we continue to offer an unparalleled level of service,” added Wang. “We are incredibly proud to take this latest step in our continued investment in EMEA.”

According to a recent note from GlobalData, Alibaba is gaining significantly in Asia Pacific outside its Chinese heartland, betting big on emerging markets such as India, Malaysia and Indonesia, while competing with others in the developed markets of Hong Kong, Japan, Singapore and Australia.

China is an anomaly in the cloud infrastructure market. According to figures from Synergy Research, the top 10 vendors in the country are all local players. As a result of Alibaba’s dominance, the company ranks second behind AWS in the Asia Pacific region. Every other region has an AWS-Microsoft-Google leadership.

SAP breaks €5bn in quarterly cloud and software revenue – but profits see slight dip

SAP has claimed it is the ‘fastest growing cloud company at scale’ in enterprise software applications after breaking the €5 billion barrier for quarterly cloud and software revenues.

The figure of €5.01bn (£4.4bn) represents an increase of 7.5% on this time last year, and an increase of 1.3% on the previous quarter. Total revenues for the company were €6.02bn, meaning cloud and software accounted for 83% of all revenue for the quarter.

Speaking to analysts following the announcement, SAP CEO Bill McDermott said the company was in the position it needed to be. “With 41% cloud revenue growth in Q3, SAP has the fastest cloud growth of any peer at scale in the enterprise applications software industry,” said McDermott.

“Three years ago we said that cloud revenue would overtake license revenue in 2018. Today the fast adoption of our cloud solutions and business networks has accelerated this positive development,” McDermott added. “The resilience of our license business remains ever steady even as we grow the cloud beyond expectations.

“Cloud has a higher lifetime value, drive[s] faster consumption of innovation and has higher predictability going forward. We planned for this transition – we guided for it and we are delivering it.”

This publication duly reported SAP’s prediction – made back in January 2015 – that cloud revenues would overtake software license revenues by 2018, and the company appears to be on track on that front. That said, it is never fully clear cut; apart from Amazon, which puts AWS revenues in a clear segment, virtually every other primary vendor obfuscates its figures to some degree. Microsoft, for instance, reveals a percentage point increase for Azure but hides its overall numbers in two buckets. For SAP, describing those revenues as ‘cloud and software’ makes sense in this context.

SAP raised its 2018 outlook for the third time this year – but despite this, profits were slightly down, with operating profit declining 6%, from €1.3bn to €1.2bn.

Chief financial officer Luka Mucic reiterated that cloud was ‘where the long-term value was’ and that SAP’s ‘intelligent enterprise strategy’ was ‘resonating broadly and propelling strong adoption of [their] suite in the cloud.’ Responding to an analyst question around license performance and support revenue, Mucic added that “the market needs to expect a gentle decline in growth rates of support revenues.”

You can read SAP’s full financial results here.

Picture credit: SAP

DevOps skills demand continues to soar – with salaries going up with it

Two pieces of research have hit CloudTech’s inbox which show that if you have the right DevOps skills you can go just about anywhere – and name your price while you’re at it.

According to a new report issued by O’Reilly Media, the global median pay for DevOps professionals is currently at $90,000 a year.

The report was based on responses from more than 1,300 IT professionals, and noted that the headline figure is down from $100,000 the year before. Yet this is nothing to be perturbed about: the drop in the median is down to a greater number of respondents as well as a wider geographic dispersion into traditionally lower-income areas.

One area which definitely needs improvement, however, is gender imbalance, with only 6% of respondents identifying as female. Their salaries are, on average, $6,000 lower than those of their male counterparts to boot.

Salaries also correlate with how much time is spent coding rather than in meetings – and those who code more earn less. According to survey respondents, those who code only between one and three hours per week bring home on average $94,000, while those who spend at least 20 hours a week at the coalface earn on average $82,000. Naturally, greater responsibility means more time away from the desk, the report notes, while organisations with a lot of coders would also have entry-level employees and interns whose salaries would lower the median value.

It will not come as a major surprise either to note how seniority holds sway. Those with less than five years’ industry experience can expect to earn $58,000, while those with more than 20 years earn $123,000 on average.

It’s worth noting at this point that it’s technically impossible to have 20 years’ experience in DevOps, given the term only came into usage around a decade ago. Any consultant who notes they were talking a good DevOps game in the 1990s, therefore, should be treated with suspicion. Yet if they discuss precursors such as agile software development, then you’re on a much surer footing.

In terms of programming languages, two thirds (66%) of respondents said they used Bash, with 63% using Python and 42% JavaScript. Comparing this with salary, average pay for the Python professionals surveyed is $86,300. Some languages perform even better – PHP and Go offer median salaries of $90,000 and $102,000 respectively – but with far fewer professionals regularly using them.

Meanwhile, new data released by cloud service provider Akamai shows that demand for DevOps skills has risen by more than two thirds over the past two years. Echoing the split between coding and management in the O’Reilly research, the Akamai study noted that demand has grown significantly whether the role is for DevOps managers, senior staff or engineers.

The Akamai study also looked at pay: DevOps salaries are on average 24% above the median for similar processes and methodologies, such as agile, scrum and test automation.

“Ensuring that businesses have the right talent is key to the success of DevOps, and when hiring and retaining this talent organisations need to ensure they have the best tools available,” said Ian Florey, Akamai solutions engineering manager. “From cloud platforms which allow automated product updates, to real-time monitoring which helps understand customer habits, DevOps experts expect to have the essential tools to make the most of their skill set.”

Read more: Putting the ‘ops’ back in DevOps: Keeping relevant and providing value for IT

IBM’s quarterly growth rebound comes to an end – yet cloud revenues continue to tick over

So much for that run of quarterly growth: IBM’s latest financials show a 2% decline in revenues, yet cloud revenues continue to perform solidly.

Overall revenue was $18.8 billion (£14.3bn) for the quarter, while cloud revenues reached $19bn across the trailing 12 months – ticking over nicely compared with the previous quarter’s trailing figure of $18.5bn.

The company’s message, as this publication reported earlier this week, was focused on helping customers in the ‘emerging, high value’ segments of the IT industry – with cloud the integral part knitting those segments together.

“Our performance this quarter was driven by the offerings in hybrid cloud, in security, in digital, and in analytics and AI – a testament to our ability to deliver differentiated value to our clients through innovative technologies with the skills and expertise to implement these technologies,” said Jim Kavanaugh, IBM chief financial officer in an earnings call. “We see the results in our strategic imperatives revenue growth of 13% over the last 12 months.

“We also see this playing out in higher operating margin over the last few quarters, which supports both our long-term investment and return to shareholders,” Kavanaugh added. “With our success in these higher value areas and our focus on delivering consistent operational performance, we remain on track to our full-year expectations of earnings per share and free cash flow.”

Hybrid cloud was certainly the term of choice in the earnings call. Kavanaugh repeated the statistic the company cited alongside its recently released Multicloud Manager product – more of which shortly – that enterprises are only 10% to 20% of the way through their cloud journeys.

“Progress [is] slowed by the lack of interoperability across cloud environments and concerns about the ability to manage data privacy and security in multiple cloud environments,” added Kavanaugh. “So clients need a cloud partner that can offer a hybrid cloud for workloads that cut across public, private and traditional, a secure cloud for mission-critical workloads and highly sensitive data and an open cloud to run complex, multi-cloud environments.”

Multicloud Manager, launched to fanfare earlier this week, aims not to provide ‘scale for the sake of scale’, but to help customers launch new business services or enter new markets at pace. As John Considine, IBM general manager for cloud infrastructure services, put it when speaking to CloudTech in February, the company is positioning itself as ‘the enterprise cloud’ – putting emphasis on helping organisations grow through emerging technologies. Indeed, with Multicloud Manager enabling workloads on AWS, Azure and more, this publication surmised that IBM had accepted it was not going to overtake the big cloud infrastructure leaders.

This was by no means the only piece of news IBM announced this quarter. The company also had various initiatives in AI, with the launch of AI OpenScale technology to manage the lifecycle of all forms of AI applications and models, while on the business side IBM partnered up with CenturyLink in August to solidify enterprise connectivity in emerging markets.

You can read the full IBM investor release here (pdf).

Read more: IBM launches multi-cloud management tool, continues to emphasise open, AI-driven future

Cloud Academy launches new tools aiming to close the cloud skills gap

The skills gap has been a thorn in cloud computing’s side for longer than many in the industry would care to remember. Cloud Academy hopes to create a change in mindset with its latest release.

The training company has announced the launch of Cloud Roster and Cloud Catalog, two products which aim to provide a fuller picture of the cloud jobs and skills landscape.

Cloud Roster is a job roles matrix which analyses tens of thousands of public job postings per week to identify the top trending technology skills as they develop. Cloud Catalog, meanwhile, focuses on the technologies themselves, providing a stack ranking of technologies by popularity and geography based on data from various developer community platforms.
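
To illustrate the general idea behind a job-postings-driven skills ranking, the sketch below counts how often each tracked skill appears across a batch of postings and orders them by frequency. The postings, skill list and matching logic are all invented for illustration; Cloud Academy has not published Cloud Roster's actual methodology, and this is not it.

```python
from collections import Counter

# Hypothetical inputs: a handful of job-posting snippets and the skills to track.
# A real product would analyse tens of thousands of postings per week.
postings = [
    "Seeking DevOps engineer with AWS, Terraform and Kubernetes experience",
    "Cloud architect: Azure, Kubernetes, Python scripting required",
    "Site reliability engineer - AWS, Python, monitoring",
]
skills = ["AWS", "Azure", "Kubernetes", "Terraform", "Python"]

def rank_skills(postings: list[str], skills: list[str]) -> Counter:
    """Count how many postings mention each skill (case-insensitive substring match)."""
    counts = Counter()
    for text in postings:
        lowered = text.lower()
        for skill in skills:
            if skill.lower() in lowered:
                counts[skill] += 1
    return counts

for skill, count in rank_skills(postings, skills).most_common():
    print(f"{skill}: mentioned in {count} posting(s)")
```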

The company cited figures from IDC, which predicted cloud investment would grow at a 22% compound annual growth rate through to 2021. As a result, the importance of closing the skills gap – or at least ensuring it does not get any wider – is key.

“We talk a lot about the cloud skills gap with our customers, and the fact that there’s a need for technical talent is well documented,” wrote Alex Brower, Cloud Academy VP of marketing, in a blog post announcing the news. “We wanted to dive a layer deeper and use data to qualify and quantify the nature of the technical skills gap in a way that’s meaningful and objective.”

In a blog post from the start of this year, Cloud Academy noted the importance of partnerships and patience in putting a successful cloud migration plan together. “Do not be excessively aggressive,” the company wrote. “While you may be tempted to go for the quick win, realise that cutting corners will almost certainly guarantee failure for your cloud migration project. Baby steps, logical steps, are very important.”

According to a study from IT provider Softchoice earlier this month, organisations are still hitting roadblocks with their cloud implementations despite extensive preparation. More than two in five (43%) of the 250 respondents admitted they had difficulty in knowing how to create an effective cloud management strategy.

IBM launches multi-cloud management tool, continues to emphasise open, AI-driven future

IBM is making a big bet on the multi-cloud game with the launch of a new management tool aimed at integrating workloads from different providers.

The Multicloud Manager tool runs on IBM’s Kubernetes-based Cloud Private platform and, while optimised for IBM’s cloud, will allow organisations to manage and integrate workloads from other cloud vendors including Amazon, Microsoft, and Red Hat.
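
IBM has not detailed how Multicloud Manager works under the hood, but the general pattern of driving several Kubernetes clusters from a single control point can be sketched with standard tooling. The snippet below simply applies one manifest to several kubectl contexts; the context names and manifest path are placeholders, and this is an illustrative sketch rather than anything resembling IBM's actual implementation.

```python
import subprocess

# Hypothetical kubectl contexts, one per managed Kubernetes cluster on each cloud.
CONTEXTS = ["ibm-cloud-private", "aws-eks-prod", "azure-aks-prod"]
MANIFEST = "app-deployment.yaml"  # placeholder manifest path

def deploy_everywhere(contexts: list[str], manifest: str) -> None:
    """Apply the same Kubernetes manifest to each configured cluster context."""
    for ctx in contexts:
        print(f"Applying {manifest} to context {ctx}...")
        subprocess.run(
            ["kubectl", "--context", ctx, "apply", "-f", manifest],
            check=True,
        )

if __name__ == "__main__":
    deploy_everywhere(CONTEXTS, MANIFEST)
```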

The launch is being supported with findings from a research report, conducted by IBM and Ovum, which found that 85% of companies surveyed were using more than one cloud environment. Despite this, four in five mission-critical workloads were still being run on-premises due to performance and regulatory issues.

Arvind Krishna, senior vice president of IBM hybrid cloud, said that companies were only ’10% to 20%’ into their cloud journeys, moving beyond low-end infrastructure as a service to higher business value.

“Like the rest of IBM’s portfolio, we have always been focused on delivering business value with the cloud,” Krishna said in a published Q&A. “It is not about scale for the sake of scale, but helping clients leverage the cloud to optimise their business, launch new business services rapidly, or enter new markets.

“To accomplish this, companies today want to deploy across multiple cloud environments, regardless of vendor, and today’s announcement is a major breakthrough in accomplishing this,” added Krishna. “It will help unlock the next 80% of business cloud innovation.”

This makes for a particularly interesting announcement when it comes to IBM’s positioning in the long-held ‘cloud wars’ saga. 2018 has seen an upturn in the company’s fortunes; three consecutive quarters of growth following 22 straight quarters of declining revenue is testament to that. Yet IBM seems to have accepted its lot – much as VMware did with its AWS partnership – with regards to cloud infrastructure leadership. In other words: enterprises are going to predominantly be on AWS and Azure, so how can they be helped otherwise?

Speaking to CloudTech back in February, John Considine, IBM general manager for cloud infrastructure services, agreed with this publication’s position that a lot of the ‘cloud wars’ narrative was bluff, bluster and obfuscation. For IBM, Considine explained, the focus was on openness and investment in emerging technologies – indeed, in a recent prediction, analyst firm CCS Insight forecast IBM to be the clear leader in quantum computing.

“Here’s the trick: for us, we are the enterprise cloud,” said Considine at the time. “This investment we’ve made in the technologies, in the underlying infrastructure… [there’s] huge growth this year in our infrastructure both in geographic and capacity expansion, but as well as new features and the rate of delivering new features. These are focused in really providing solutions for enterprises, and then helping them make this transformation.”

This open approach was emphasised again by Krishna. “By taking an open approach to cloud, IBM gives our clients the ability to mix and match their cloud environments based on what they need to accomplish,” he said. “It’s no secret that openness and interoperability are built into the very fabric of IBM’s long history, and we see these capabilities continuing to be a critical factor as we enter into this next phase of cloud and AI.”

So what does this mean for competing vendors in cloud management? As regular readers of this publication will be aware, the space is consolidating. CloudHealth Technologies was snapped up by VMware earlier this year, having also extended its partnership with fellow management player ParkMyCloud. One company in the crosshairs could be Nutanix. Writing for The Motley Fool, Timothy Green noted IBM’s desire for brand agnosticism going back to the 1990s, as well as Nutanix’s vision to ‘meld private, public and distributed cloud operating environments [to] provide a single point of control.’

You can find out more about IBM Multicloud Manager here.

WikiLeaks claims to publish confidential AWS data centre location information

WikiLeaks has published what it claims is a 'highly confidential' document outlining the addresses and operational details of Amazon Web Services (AWS) data centres.

The whistle-blowing organisation says it published the document, which originates from late 2015, in an attempt to shed light on the 'largely hidden' nature of cloud infrastructure locations.

"While one of the benefits of the cloud is the potential to increase reliability through geographic distribution of computing resources, cloud infrastructure is remarkably centralised in terms of legal control," the company wrote in a statement. "Until now, this cloud infrastructure controlled by Amazon was largely hidden, with only the general geographic regions of the data centres publicised."

AWS' global infrastructure page outlines geographical locations in terms of 'regions'; for instance, US East has six availability zones in North Virginia and three in Ohio, while Europe has a presence in Frankfurt, Ireland, London and Paris – with three zones, or data centres, each.
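
For readers who want to see that region-and-zone structure programmatically rather than from the infrastructure page, the AWS API exposes it directly, with no physical addresses involved. A minimal sketch using boto3, assuming AWS credentials with EC2 describe permissions are already configured locally:

```python
import boto3

# List AWS regions and the availability zones inside each one.
# Requires locally configured AWS credentials with EC2 describe permissions.
ec2 = boto3.client("ec2", region_name="us-east-1")
regions = [r["RegionName"] for r in ec2.describe_regions()["Regions"]]

for region in regions:
    regional_ec2 = boto3.client("ec2", region_name=region)
    zones = regional_ec2.describe_availability_zones()["AvailabilityZones"]
    zone_names = ", ".join(z["ZoneName"] for z in zones)
    print(f"{region}: {len(zones)} zones ({zone_names})")
```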

This is common practice – and, compared with some other providers, can be more information than usual. For instance, Oracle only put together a public-facing map of its cloud regions late last year; when this publication asked for a list of its cloud data centre regions in mid-2017, the reply came that there wasn't one available.

WikiLeaks claims there are elements of obfuscation revealed in the document. On page seven, regarding the IAD77 data centre unit in Virginia, the document states that Amazon 'is known as Vandalay Industries on badges and all correspondence with building manager'; the latter does not appear to exist outside of reference as a fictional company in the US sitcom Seinfeld. WikiLeaks has also issued an updated map of AWS' regions with addresses, notes and contact numbers.

The timing of the disclosure also appears linked to the upcoming $10 billion cloud contract for the US Department of Defense. As this publication reported in March when the tender was opened up, the government's search for a 'coordinated enterprise-level approach to cloud infrastructure' meant it was looking for a single vendor – arguing multi-cloud was too complex – to fulfil the work. Earlier this week, it was reported that Google had dropped out of the race, while Microsoft employees had protested about the ethical complications of winning the contract.

AWS and WikiLeaks have locked horns previously. In 2010, the former kicked the latter – previously a customer – off its platform, saying WikiLeaks did not own or otherwise control the rights to the classified content it was disclosing.

"[We] have hundreds of thousands of customers storing all kinds of data on AWS. Some of this data is controversial, and that's perfectly fine," the company said in a statement at the time. "But when companies or people go about securing and storing large quantities of data that isn't rightfully theirs, and publishing this data without ensuring it won't injure others, it's a violation of our terms of service, and folks need to go operate elsewhere."

You can take a look at the WikiLeaks document here.