Smart cities: building the metropolis of the future


Cloud Pro

8 Aug, 2019

Cities of the world are buckling. The UN estimates that 55% of the planet’s 7 billion-plus people live in urban areas, and it’s believed a million more join them every day. Infrastructure is feeling the strain, unrest is growing, congestion is polluting our lungs and crime is prevalent. What’s more, if this rise continues as expected, cities will be home to some 6.1 billion people by 2050. Something has to give.

The answer lies in making cities work smarter. Of course, the promise of smart cities isn’t new: the concept has been celebrated on screen for decades and has seemingly sat on the periphery for years. Today, though, we’re finally on the verge of achieving this truly connected utopia. Companies across the globe are using technology and analytics to make city life a breeze: preventing traffic jams, solving crimes, boosting tourism and more.

Speed and safety

In the UK, the economy lost £8 billion to traffic congestion last year, with the average driver spending 178 hours stuck in jams, according to research by Inrix. As cities become saturated, authorities and industry are turning to big data and technology to ease the load. In London, for instance, Transport for London’s (TfL) Open Data project provides more than 80 data feeds through a free API, sharing details on air quality, tube times and delays, the number of passengers flowing through the network, and live traffic disruptions. Some 600 third-party apps are now powered by these feeds, used by 42% of Londoners, and the project is reported to be putting £130 million a year back into the capital’s economy. On a wider scale, it’s estimated that using open data effectively could save 629 million hours of waiting time on the EU’s roads and cut energy consumption by 16%.
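
TfL’s feeds are served over a single REST API, so tapping into them takes only a few lines of code. Below is a minimal sketch in Python, assuming the public /Line/Mode/tube/Status endpoint and its documented JSON shape; endpoints, field names and rate limits should be checked against TfL’s own API documentation, and an app key is recommended for anything beyond light use.

```python
# Minimal sketch: pull live tube status from TfL's open data API.
# Assumes the public /Line/Mode/tube/Status endpoint and its JSON shape;
# check TfL's API documentation for current endpoints and registration.
import requests

def tube_status(app_key=None):
    """Return (line name, status description) pairs for every tube line."""
    params = {"app_key": app_key} if app_key else {}
    resp = requests.get(
        "https://api.tfl.gov.uk/Line/Mode/tube/Status",
        params=params,
        timeout=10,
    )
    resp.raise_for_status()
    return [
        (line["name"], line["lineStatuses"][0]["statusSeverityDescription"])
        for line in resp.json()
    ]

if __name__ == "__main__":
    for name, status in tube_status():
        print(f"{name}: {status}")
```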

“Open data is changing our everyday lives and how organisations like TfL work,” said Jeni Tennison, CEO at the Open Data Institute. “Data is becoming as important as other types of infrastructure, such as roads and electricity, which means building strong data infrastructure is vital to economic growth and wellbeing.”

Beyond roads, tracking pedestrians is key to keeping a city moving. In Glasgow and London, mobile phone data can be used to track passenger numbers on public transport, while sensors in lampposts can count footfall. The Netherlands has even begun trialling smart traffic lights that give the elderly extra crossing time or change automatically when they detect an approaching cyclist.

In China and Singapore, authorities are taking things a step further. Using IoT devices and sensors alongside advanced 4G data networks and AI, they are not only monitoring and improving traffic flow but also using the data to track road violations and even predict crime. Singapore, for instance, combines data from RFID-equipped travel cards, CCTV and anonymised mobile phones to identify problems before congestion can take hold. Its AI can spot patterns and run algorithms that highlight issues some 10 or 20 steps down the line.
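
The article doesn’t describe Singapore’s models, but the idea of flagging issues several steps ahead can be illustrated with a toy forecast: fit a simple lag-based linear model to recent passenger counts and project it forward. The data, lag count and model below are illustrative assumptions only, not the city-state’s actual system.

```python
# Toy illustration of "predicting congestion a few steps ahead":
# fit a lag-based linear model to a passenger-count series and project forward.
# The numbers and model are illustrative only, not Singapore's actual system.
import numpy as np

def forecast(counts, lags=3, steps=10):
    """Least-squares autoregression on `counts`, projected `steps` ahead."""
    y = np.asarray(counts, dtype=float)
    # Build a lagged design matrix: each row is [1, y[t-1], ..., y[t-lags]]
    rows = [np.r_[1.0, y[t - lags:t][::-1]] for t in range(lags, len(y))]
    X, target = np.array(rows), y[lags:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)

    history = list(y)
    preds = []
    for _ in range(steps):
        x = np.r_[1.0, history[-lags:][::-1]]
        nxt = float(x @ coeffs)
        preds.append(nxt)
        history.append(nxt)
    return preds

# Example: rising platform counts sampled every five minutes (made-up numbers).
recent = [410, 432, 455, 480, 510, 545, 585, 630]
print(forecast(recent, lags=3, steps=5))
```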

Elsewhere, the Chinese province of Zhejiang is using 1,000 sensors to capture more than a terabyte of data every month. Stored on Intel servers built around Intel Xeon E5-series processors and holding an incredible 198TB, the data can be accessed and analysed by large numbers of users at once: a licence plate can be found among 2.4 billion records in less than a second. In particular, products such as those developed as part of Intel’s Vision Accelerator Design use deep neural networks to analyse such video footage quickly and accurately.

It’s not just traffic violations being caught using next-level technology and analytics. CCTV video link-ups, licence plate scanning, smart mapping and even real-time facial recognition are also helping to save lives, cut down on vandalism and prevent robberies. The London Mayor’s Office for Policing and Crime (MOPAC) recently partnered with the Greater London Authority’s Strategic Crime Analysis team to launch SafeStats. By feeding more than 20 million crime and safety records from the police, ambulance service, fire brigade and transport authorities into advanced AI software, emergency services can detect patterns and identify crime hotspots. The AI can even suggest responses and guide authorities on policy: cross-referenced with records from 25 hospitals, for example, the data can be turned into heat maps that help steer local policing strategies and funding.
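
SafeStats’ tooling isn’t described in detail, but the core of hotspot mapping is straightforward spatial aggregation: bin incident coordinates into grid cells and rank the densest ones. A minimal sketch follows, with made-up coordinates and a hypothetical data shape.

```python
# Minimal sketch of hotspot detection: bin incident coordinates into a grid
# and rank the busiest cells. Coordinates and fields are hypothetical;
# SafeStats' real pipeline and models are not described in the article.
from collections import Counter

def hotspots(incidents, cell_size=0.005, top_n=5):
    """Group (lat, lon) incidents into cells of ~cell_size degrees and rank them."""
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size))
        for lat, lon in incidents
    )
    return [
        ((cell[0] * cell_size, cell[1] * cell_size), n)
        for cell, n in counts.most_common(top_n)
    ]

# Hypothetical incident coordinates (lat, lon) pulled from combined feeds.
records = [(51.5074, -0.1278), (51.5080, -0.1270), (51.5303, -0.1224),
           (51.5076, -0.1282), (51.5101, -0.1337)]
for (lat, lon), n in hotspots(records, top_n=3):
    print(f"cell centred near ({lat:.3f}, {lon:.3f}): {n} incidents")
```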

The connected city

Once crime is being tackled and transport delays are managed, cities become more attractive to tourists, another area in which big data, AI and analytics are playing a significant role. In Manchester, the Beacons for Science app lets tourists use virtual and augmented reality to unlock experiences at landmarks across the city. In London, Mastercard has been hired to produce a series of smart city initiatives, including the Visit London Official City Guide app, which taps into real-time data feeds to help tourists navigate the city, using geolocation to flag nearby places of interest and transport routes.

Elsewhere, the West of England Combined Authority was recently awarded £5 million in funding to trial a 5G network at tourist destinations in Bristol and Bath. This network complements the Bristol Is Open smart city scheme, a city-wide private network testbed powered by an Intel® Xeon®-equipped Blue Crystal II supercomputer, on which companies and organisations can test smart city solutions.

“The vision behind Bristol Is Open was to see how we could make the city smarter and quicker than any other,” explained Julie Snell, managing director of Bristol Is Open. “We can offer a test network that’s run on gigabit fibre. It’s got everything from Wi-Fi to 2G, 3G, 4G, massive MIMO (multiple-input, multiple-output), LTE and even some 5G. We also have 1,500 Wi-Fi meshed network lamp posts, allowing us to bounce signals around the city without us needing constant fibre connections.”

This network, consisting of hundreds of Internet of Things connections, can help people in areas of poor connectivity get online easily and cheaply. Data from it can be fed into a 4K, 180-degree ‘data dome’ and used to track Met Office weather patterns in the region, monitor mobile usage and record air pollution levels as part of a feasibility study by the University of Bristol. This could see the city become the first to let people identify their individual exposure to pollution, and it’s a similar setup to the Sensing London project, which used Intel Galileo-based end-to-end Internet of Things infrastructure to measure local air quality and human activity. Beyond the sensors, Intel and Bosch recently teamed up to develop the Air Quality Micro Climate Monitoring System (MCMS), which takes the data from such sensors and uses software to measure air quality, providing councils with meaningful insight.
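
Measuring ‘individual exposure to pollution’ boils down to weighting pollutant readings by the time a person spends near each sensor. The sketch below is a toy, time-weighted calculation with made-up numbers; the Bristol study’s actual methodology isn’t described in the article.

```python
# Toy calculation of individual pollution exposure: a time-weighted average of
# pollutant concentrations recorded near the places a person spends time.
# Readings and durations are made up for illustration only.

def personal_exposure(stops):
    """stops: list of (minutes_spent, pm25_ug_m3) pairs -> mean exposure in ug/m3."""
    total_minutes = sum(minutes for minutes, _ in stops)
    weighted = sum(minutes * pm25 for minutes, pm25 in stops)
    return weighted / total_minutes

# Example day: commute along a busy road, office near a quieter sensor, walk home.
day = [(40, 22.0), (480, 9.5), (45, 18.0)]
print(f"Time-weighted PM2.5 exposure: {personal_exposure(day):.1f} ug/m3")
```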

Looking to the future, the global rollout of 5G is expected to accelerate the adoption of smart city technology and expand its capabilities, exponentially increasing the number of sensors, the strength of connections and the speed at which data can be sent and analysed. Combine this with ongoing advances in data collection and analysis, and the expansion of the IoT, and it looks like we’re at a critical juncture in the pursuit of a truly connected utopia.


Microsoft launches dedicated host service alongside licensing changes


Keumars Afifi-Sabet

6 Aug, 2019

Microsoft is previewing an ‘Azure Dedicated Host’ service for enterprises looking to run their Linux and Windows virtual machines (VMs) on their own physical servers, alongside a set of changes to licensing costs.

The dedicated host service will target enterprise customers that prioritise the security benefits of physical hosting over shared cloud hosting, as well as the isolation of their sensitive information.

These servers will not be shared with any other customer, and businesses which opt for one will retain full control over how services run on the machine.

The Azure Dedicated Host is available in two iterations. The first is based on the 2.3GHz Intel Xeon E5-2673 v4 processor and offers up to 64 virtual CPUs. It can be chosen in 256GiB or 448GiB RAM configurations, priced at $4.055 and $4.492 per hour respectively.

The second version, meanwhile, is based on the Intel Xeon Platinum 8168 processor with 72 virtual CPUs available and is priced at $4.039 per hour in a 144GiB configuration.
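
For a sense of scale, the hourly prices above translate into rough monthly figures. The quick sketch below assumes roughly 730 hours in a month and ignores OS licensing, reservation discounts and regional variation, so it is an estimate rather than Microsoft’s published monthly pricing.

```python
# Rough monthly cost estimate for the listed Azure Dedicated Host configurations,
# using the per-hour prices quoted above. Assumes ~730 hours per month and
# ignores OS licensing, reservation discounts and regional price differences.
HOURS_PER_MONTH = 730

hosts = {
    "Xeon E5-2673 v4, 64 vCPU, 256GiB": 4.055,
    "Xeon E5-2673 v4, 64 vCPU, 448GiB": 4.492,
    "Xeon Platinum 8168, 72 vCPU, 144GiB": 4.039,
}

for name, hourly in hosts.items():
    monthly = hourly * HOURS_PER_MONTH
    print(f"{name}: ${hourly}/hr -> ~${monthly:,.0f}/month")
```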

Moreover, several hosts can be grouped together into larger host groups within a particular region, so businesses can build clusters of physical servers.

The dedicated hosts will be subject to automatic maintenance by default, although administrators can defer maintenance operations and apply them within a 35-day window, retaining full control over when server maintenance happens.

This has been announced in conjunction with a set of key changes to the pricing of software licences, which separate on-premises outsourcing services from cloud services. From 1 October this year, customers will need additional ‘software assurance’ to run Microsoft software on public cloud services.

Businesses using rival cloud providers, like Amazon Web Services (AWS) or Google Cloud Platform (GCP) should, therefore, expect the cost of running Microsoft software to increase.

Alongside the introduction of Azure Dedicated Host, Microsoft has also rolled out an Azure Hybrid Benefit licensing option, which allows customers to use the software without the need for ‘software assurance’.

Both Google and Amazon have launched similar dedicated physical services in recent years, with Azure the latest major cloud provider to follow suit.

Google, for instance, launched sole-tenant nodes in its Compute Engine last June, which allowed businesses to run instances on their own dedicated architecture as opposed to sharing hosting with other customers. These are similar to AWS’ EC2 dedicated hosts.

Elsewhere, Microsoft has increased its bug bounty rewards as part of a big security push that has also seen the launch of the Azure Security Lab.

The highest bounty will be doubled to $40,000, while those with access to the lab can attempt a set of scenario-based challenges with a maximum award of $300,000.

The new lab itself is a set of dedicated cloud hosts offering security researchers a secure space in which to test attacks against Infrastructure as a Service (IaaS) scenarios.

Organisations are invited to apply to join the new security-focused community by requesting a Windows or Linux VM, with successful applicants given access to campaigns for targeted scenarios and added incentives.

Why HPE has swallowed MapR’s assets


Jane McCallion

6 Aug, 2019

News broke overnight that HPE has made yet another acquisition – its second this year – in the form of MapR’s assets (but not, it seems, MapR itself).

MapR, in case you’re not familiar with the company, is a big data and analytics specialist with a focus on artificial intelligence (AI). Founded 10 years ago, it has some impressive credentials – for example, in 2013 it broke the MinuteSort record on Google Compute Engine. However, it recently found itself in financial trouble, announcing in May 2019 that it would have to close if it couldn’t find additional funding by 3 July.

With HPE acquiring all its assets (existing technology, intellectual property, and expertise in AI and data management), MapR has for all intents and purposes ceased to exist, despite not being acquired as an entity. Given MapR’s money troubles, this isn’t really a surprising move on either part: HPE doesn’t take on any of MapR’s financial baggage and it’s a lot quicker to complete than a full acquisition, which would have taken many months securing approval from the various regulatory authorities around the world.

From a strategic point of view, buying MapR’s assets makes a great deal of sense for HPE. The company is putting huge emphasis on its AI and analytics credentials, as evidenced by the launch of its Primera storage appliance at its annual Discover conference in June this year. It’s also been working with Purdue University to try and solve the problem of world hunger.

In a statement announcing the acquisition, HPE said MapR’s assets will accelerate its Intelligent Data Platform capabilities.

“At HPE, we are working to simplify our customers’ and partners’ adoption of artificial intelligence and machine learning,” said Phil Davis, president of Hybrid IT at HPE.

“MapR’s enterprise-grade file system and cloud-native storage services complement HPE’s BlueData container platform strategy and will allow us to provide a unique value proposition for customers.”

In case the name doesn’t ring a bell, BlueData was a “Big Data-as-a-service” business that was acquired by HPE in November 2018. It, too, focused on analytics and machine learning, albeit in containers, rather than computing clusters as MapR’s technology does.

Speaking of computing clusters and other recent HPE acquisitions, there’s perhaps something going unsaid in last night’s announcement.

A computing cluster can be quite small – small enough for use by SMBs, for example. But the term is more frequently associated with large data centres and, specifically, supercomputers (aka high-performance computing or HPC).

It’s worth noting, then, that while MapR’s being rolled into the Intelligent Data Platform unit and treated very much as a software play, HPE’s most recent acquisition prior to this was Cray – the venerable supercomputing firm. This followed the summer 2016 acquisition of SGI, another big name in HPC.

It’s yet to become completely clear what HPE’s supercomputing strategy is, but it would seem remiss if these two units don’t end up working closely together.

Is this the last purchase of 2019 for HPE? We’re more than halfway through the year, but there are plenty of AI prospects, in particular, to go around, so we may yet see a mid-autumn spending spree.

The majority of Chrome extension installs are split across these 13 apps


Connor Jones

5 Aug, 2019

Google’s Chrome extension store is said to be dominated by just a handful of popular applications, with the majority of its application selection having fewer than 1,000 installs, according to a new study.

Figures released by Extension Monitor show that although Chrome now boasts over 1 billion extension installs, only 13 apps have more than 10 million installs each.

Of the 188,000 extensions that make up the store, as many as 87% are believed to have fewer than 1,000 installs, including 24% with either one or zero installs. The figures also show that around half of all extensions have been installed fewer than 16 times.

Security was a common theme identified when looking at the most downloaded extensions – adblockers, antivirus applications, password managers and VPNs dominated the list of most popular extensions. Other prominent categories included communications and shopping.

Well-known apps such as Grammarly, Adblock, Honey, Avast Online Security, Skype and Google Translate dominated the top spots. LastPass and Google Hangouts were among the apps just shy of the 10 million mark.

The 10 million club:

  • Cisco Webex Extension
  • Google Translate
  • Avast Online Security
  • Adobe Acrobat
  • Grammarly for Chrome
  • Adblock Plus – free ad blocker
  • Pinterest Save Button
  • Skype
  • AdBlock
  • Avast SafePrice
  • uBlock Origin
  • Honey
  • Tampermonkey

Even though a large proportion of extensions have a comparatively low install base, it’s often the extensions in this bracket that are the most malicious, and collectively they can still target a large number of users. Last month we reported that some Google Chrome extensions harvest user data as part of a “murky data economy” and then sell that data on to Fortune 500 companies.

The scheme was thought to have affected up to 4 million users across the various extensions, most of which had thousands of installs each, although some exceeded one million. The sensitive data was then accessible by anyone who was willing to pay a fee as small as $49.

In response, Google pointed users to its policy changes made in June 2019 and how it plans to make the Chrome Web Store more secure, a policy that’s since been slammed by the Electronic Frontier Foundation (EFF).

The organisation said that the changes would do nothing to secure the Web Store as they don’t address the APIs used by extensions to aggregate and sell data. Instead, the EFF claims Google should simply enforce existing policy properly.

“Ultimately, users need to have the autonomy to install the extensions of their choice to shape their browsing experience, and the ability to make informed decisions about the risks of using a particular extension,” said the EFF. “Better review of extensions in Chrome Web Store would promote informed choice far better than limiting the capabilities of powerful, legitimate extensions.”

Skybox and Zscaler team up for stronger cloud firewall integration

If there is one thing safer than a cloud security provider, it is two cloud security providers – in theory, at least. Zscaler and Skybox Security are coming together to connect two of their products for greater end-to-end protection.

The two companies will combine Zscaler’s Cloud Firewall product with the Skybox Security Suite, which encompasses visibility, vulnerability control, as well as firewall and network assurance for enterprise use cases. Zscaler policy information will feed directly into the firewall and network assurance modules. The data will help inform Skybox’s visual model for hybrid networks.

Customers of Zscaler Cloud Firewall will be able to ensure they are adhering to policies more easily, as well as automatically flagging violations.  

“Organisations need a seamless way to deliver a consistent and compliant policy on or off network,” said Punit Minocha, SVP of business and corporate development at Zscaler, in a statement. “Zscaler cloud platform’s fast and secure policy-based access connects the right user to the right service or application.

“Combined with the Skybox security policy management solution, we simplify management and allow customers to transition their access policies to a modern cloud architecture,” added Minocha.

Last week Skybox issued its mid-year report on vulnerability and threat trends, which argued, among other things, that cloud container vulnerabilities were on a steady upturn. Vulnerabilities in container software had increased by 46% in the first half of 2019 compared with the same period a year earlier, the company noted.

The overriding theme was of good news and bad news. While it was good that only a ‘small fraction’ of vulnerabilities published will have an exploit, increasing network complexity makes it much tougher to understand what goes where.

“It’s critical that customers have a way to spot vulnerabilities even as their environment may be changing frequently,” said Amrit Williams, Skybox VP products at the time. “They also need to assess those vulnerabilities’ exploitability and exposure within the hybrid network and prioritise them alongside vulnerabilities from the rest of the environment – on-prem, virtual networks and other clouds.”

Getting a greater handle on whether vulnerabilities are infiltrating the enterprise network is naturally key – and it is a cornerstone of the Zscaler and Skybox partnership. You can find out more about the collaboration here.


Office 365 ban in German schools ‘temporarily’ lifted


Dale Walker

2 Aug, 2019

A ban on the use of Office 365 products in German schools has been temporarily lifted following a series of talks between Microsoft and the Hessian Data Protection Commissioner, according to an updated statement released today.

The German state of Hesse imposed restrictions on the use of Microsoft software in July after ruling Office 365 exposed information on students and teachers to potential access from US officials, and was therefore in breach of the EU’s General Data Protection Regulation.

The decision followed several years of debate around whether German public authorities should use such cloud software at all, given that a large chunk of data is funnelled back to the US.

Office 365 was largely tolerated so long as Microsoft continued to invest in a local German cloud service, removing the need to send data back to the US. However, last August the company decided to shutter this service, leading officials to eventually conclude last month that Office 365 use no longer complied with data laws.

Yet, in another twist, the Hessian Data Protection Commissioner, Professor Michael Ronellenfitsch has now decided to “provisionally tolerate” the use of Office 365 in German schools, provided a series of conditions are met.

“The legality of using Office 365 is not yet fully understood,” said Ronellenfitsch in a statement. “In my opinion dated 09.07.2019, I drew the conclusion and explained that, according to the state of the checks, the use of Office 365 in Hessian schools cannot be tolerated.

“Since then, there have been intensive discussions with Microsoft about the privacy compliance of Office 365’s use in the school, which has led to a privacy-related assessment and has invalidated a significant proportion of the concerns.”

The ruling allows schools that have already purchased version 1904 of Office 365 and its various apps to continue using the software “until further notice”.

However, those schools are also required to block the transmission of any kind of diagnostic data themselves, although Microsoft is required to provide support with this until the data protection authorities are able to provide a more permanent solution.

The Hessian data authority has also promised to conduct an audit of the current arrangement over the next few months, and will deliver a more permanent data protection assessment for the school sector, the statement added.

The decision appears to be something of an attempt to limit any potential disruption an outright Office 365 ban might have. However, it’s likely that the ban will return unless Microsoft comes up with a way of preventing diagnostic data from leaving the country.

David Friend, Wasabi CEO: Cloud storage will be a commodity – and clever vendors can make the most of it

Boston-based Wasabi Technologies has a clear business strategy. “We only do storage,” CEO David Friend tells CloudTech. “We don’t do compute, we’re not going to buy supermarkets, or make movies or TV shows – we just do storage and we’re very good at it.”

This will be the first, but not the last, veiled reference to a certain Seattle-based cloud storage provider. In some ways it is a legitimate concern. Amazon Web Services (AWS) has the largest share across infrastructure by a distance – between a third and half, depending on who you believe – but it also has plenty of offerings around software.

In fact, Amazon has just about everything. Across its 23 categories, there are a grand total of 183 products under the AWS banner, with management and governance, as well as machine learning, top of the tree with 19 each. Wasabi, by contrast, has just one. It’s a complexity thing, Friend argues.

“Our vision is that cloud storage will just become a commodity,” says Friend. “We don’t believe in having all these goofy tiers, because Wasabi is faster than [Amazon] S3 and cheaper than Glacier. So you don’t need six tiers of storage in between with all the complexity that implies and the consultants you have to hire in order to figure out what data to go in what tier.”

Whereas Glacier is cold storage, cheaper for workloads accessed once in a blue moon, Wasabi focuses on the opposite. The clue is in the name, with the company having been called BlueArchive in a previous life. Wasabi’s pricing model is simple: $5.99 per terabyte per month, translating to $0.0059 per gigabyte, with no additional charges for egress or API requests.
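
Wasabi’s commodity pitch rests on it exposing an S3-compatible API, so an existing S3 client can simply be pointed at a Wasabi endpoint. Below is a minimal sketch using boto3; the credentials and bucket name are placeholders, the endpoint shown is Wasabi’s default, and region-specific endpoints apply for sites such as the Amsterdam data centre.

```python
# Minimal sketch: because Wasabi speaks the S3 API, a standard S3 client can be
# pointed at a Wasabi endpoint instead of AWS. The credentials and bucket name
# below are placeholders to replace with your own account details.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",   # region-specific endpoints also exist
    aws_access_key_id="WASABI_ACCESS_KEY",
    aws_secret_access_key="WASABI_SECRET_KEY",
)

s3.put_object(Bucket="example-bucket", Key="backups/db.dump", Body=b"...")

# At the quoted flat rate, pricing is simple: no egress or API request fees.
price_per_tb_month = 5.99
print(f"100TB for a year: ~${100 * 12 * price_per_tb_month:,.0f}")
```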

Maslow’s hierarchy of needs is often crudely bastardised to add ‘Wi-Fi’ or ‘internet’ to the bottom tier; with commoditisation, perhaps storage needs to be added beneath it. Things have come a long way from when Steve Jobs rebuked Dropbox as being merely a ‘feature’, after Drew Houston turned down the Apple chief’s acquisition offer.

Friend, alongside Thomas Koulopoulos, penned an eBook published earlier this year around the concept of the ‘bottomless cloud’, and spoke on the topic at the recent Cyber Security & Cloud Expo event in Amsterdam.

“We were talking about changing the mindset from thinking of data as sort of a scarcity to more a mindset of data abundance,” says Friend. “The idea that data storage gets to be so cheap that it’s not worth deleting anything.

“The people who are throwing away their data because they don’t see any immediate need for it… five, 10 years from now they’re going to look back and say ‘wow, with all the analytical tools we have today, I wish we had saved that data because we could be using it to gain competitive advantage, gain insights into what our customers are doing and what they want,’” adds Friend.

“That’s the mindset that we have to change. We have to think about data as something which has probably got future value that’s in excess of what we think it might have today; we need to think of cloud storage the same way we think of electricity or bandwidth.”

Naturally, if you’re going to take on AWS, the battle cannot be won alone. Wasabi has emboldened its approach in recent months with an expansion into Europe and a channel strategy; the former saw a data centre open in Amsterdam in March, while the latter has included partnerships with Veeam and Pax8, among others.

“In Europe particularly, we’re going all-channel for practical purposes,” explains Friend, albeit adding users could still sign up for storage directly. “When you look at Veeam’s new cloud-enabled backup product, there’s a dropdown menu. You can pick Amazon storage, Google storage, Microsoft, IBM, and Wasabi. So we’ve broken into the ranks of the top storage vendors now, and that’s clearly where we want to be.”

As far as managed service providers (MSPs) are concerned more generally, Friend notes that the move to the cloud – and the recurring revenue which results – reaps its own rewards. “We’re teaching the MSPs how to make money in this,” he says. “Instead of getting a one-time pop for a box, you get a revenue stream that goes on and on year after year. If you’re the person selling this cloud storage, every year they’re paying for what they’re using, and it just grows.

“It’s an opportunity for the MSPs to really get on the bandwagon and get in front of the cloud migration curve.”

Even though the hyperscalers are hogging public cloud infrastructure, plenty of innovation can still be found. Friend cites Stackpath, a content delivery network (CDN) and edge computing provider, and fellow CDN-er Limelight as companies ‘flourishing off picking pieces’ of the cloud.

“What the MSPs can do is learn how to put these things together and make money,” Friend adds. “That’s the part that we’re playing. Right now we’re the big guys in independent cloud storage, and we can provide the MSP with a great revenue stream.”



VMware extends Google Cloud deal, positions as hybrid partner of choice for the hyperscalers

VMware and Google Cloud have extended their partnership with the launch of a new offering which enables organisations to run their VMware workloads in Google Cloud Platform (GCP).

The product, the snappily titled Google Cloud VMware Solution by CloudSimple – the latter being a verified VMware cloud partner – will give customers the opportunity to run VMware workloads on-prem, in the cloud, or as part of a hybrid architecture.

Both sides naturally came across as agreeable in the soundbites. “With VMware on Google Cloud Platform, customers will be able to leverage all of the familiarity and investment protection of VMware tools and training as they execute on their cloud strategies, and rapidly bring new services to market and operate them seamlessly and more securely across a hybrid cloud environment,” said VMware COO Sanjay Poonen.

Google Cloud CEO Thomas Kurian noted similarly in a blog post confirming the news. “Customers have asked us to provide broad support for VMware, and now with Google Cloud VMware Solution by CloudSimple, our customers will be able to run VMware vSphere-based workloads in GCP,” wrote Kurian. “This brings customers a wide breadth of choices for how to run their VMware workloads in a hybrid deployment, from modern containerised applications with Anthos to VM-based applications with VMware in GCP.”

The move, as Poonen noted, meant VMware now supported the five largest clouds: AWS, Azure, Alibaba and IBM, alongside Google. VMware’s dealings with AWS are better known; the launch of AWS Outposts last November all but brought the house down at re:Invent when VMware CEO Pat Gelsinger took to the stage. Outposts enables organisations to deliver a ‘truly consistent hybrid experience’, in the company’s words, either AWS-native or running VMware Cloud on AWS.

VMware’s positioning as a partner has been a long time coming and is now coming to a head. In February 2015, this publication ran an op-ed titled ‘Here’s why VMware hasn’t left it too late with its hybrid cloud push’. Back then, VMware was a relative latecomer to the cloud, its long-standing heritage being in virtualisation. Many companies made aborted attempts to move into the public cloud (CenturyLink and Verizon, to name two in the telco space), yet VMware evidently saw two trends.

Enterprises were going to concentrate their public cloud infrastructure on a handful of primary players, going multi-cloud rather than committing to a single vendor, and they were not going to give up certain on-prem assets either.

You can read the full Google Cloud blog here.

JEDI contract put on hold after intense lobbying efforts


Connor Jones

2 Aug, 2019

The $10 billion JEDI contract to supply cloud computing services to the Pentagon has been halted after an aggressive lobbying campaign from rival tech companies.

According to CNN, which first reported the story, an inside campaign was allegedly carried out to dissuade President Trump from choosing Amazon’s AWS as the winner of the contract.

Amazon and Microsoft are currently the only two companies in the race after Oracle and IBM were knocked out of the running months ago. However, a one-page document handed to Trump appears to chart what it portrays as Amazon’s ten-year plan to monopolise the cloud.

The document is identical to one created by Oracle’s top Washington lobbyist, Kenneth Glueck, an executive vice president with the company, who confirmed as much to CNN.

CNN noted that the document delivered to Trump, which may have been the deciding factor in delaying the JEDI contract due to be announced this month, was designed to play on the feud between Trump and Amazon CEO Jeff Bezos.

“So sorry to hear the news about Jeff Bozo being taken down by a competitor whose reporting, I understand, is far more accurate than the reporting in his lobbyist newspaper, the Amazon Washington Post,” tweeted Trump in relation to Bezos’ divorce at the time. “Hopefully the paper will soon be placed in better & more responsible hands!”

Defence Secretary Mark Esper is currently investigating allegations of unfairness in the awarding of the contract, according to Pentagon spokeswoman Elissa Smith.

“Keeping his promise to Members of Congress and the American public, Secretary Esper is looking at the Joint Enterprise Defense Infrastructure (JEDI) program,” Smith said in a statement on Thursday to Reuters. “No decision will be made on the program until he has completed his examination.”

Speculation surrounding the treatment of AWS in the contract’s bidding process has raged for months; some have argued that the nature of the contract itself favours AWS and the services it offers.

Reports also suggest that Senator Marco Rubio penned a letter to national security advisor John Bolton requesting that the contract be delayed.

“I respectfully request that you direct the delay of an award until all efforts are concluded in addition to evaluating all bids in a fair and open process in order to provide the competition necessary to obtain the best cost and best technology for its cloud computing needs,” Rubio reportedly wrote.

The Joint Enterprise Defense Infrastructure (JEDI) contract is worth $10 billion, and the project to overhaul the Pentagon’s IT infrastructure into a contemporary cloud-based setup could span 10 years.