All posts by Connor Jones

Google founders Larry Page and Sergey Brin step down from Alphabet management


Connor Jones

4 Dec, 2019

Google founders Larry Page and Sergey Brin are stepping down from their leadership roles at Google’s parent company Alphabet, bringing to an end an unprecedented reign at the helm of one of the most influential companies in history.

The iconic Silicon Valley duo will remain as board members, shareholders and overall “proud parents” of the companies they have founded and led since starting the search engine giant from a California garage in 1998.

Page and Brin said they wanted to simplify the management structure of the tech giant, adding there was no need to have two CEOs and a president of the same company.

Google’s Sundar Pichai, who joined the company in 2004 and was appointed CEO in 2015, will assume the CEO role of both Google and Alphabet.

“I will continue to be very focused on Google and the deep work we’re doing to push the boundaries of computing and build a more helpful Google for everyone,” said Pichai. “At the same time, I’m excited about Alphabet and its long term focus on tackling big challenges through technology.”

“While it has been a tremendous privilege to be deeply involved in the day-to-day management of the company for so long, we believe it’s time to assume the role of proud parents – offering advice and love, but not daily nagging,” said Page and Brin in a joint letter.

Alphabet, the multinational conglomerate holding company that houses Google and a number of its other ventures, including DeepMind, was created in 2015 and replaced Google as the publicly traded company.

Shortly after, Pichai replaced Page as CEO of Google following months of rumours that he would be the next man for the job. Pichai had held positions at Google such as product chief and head of Android prior to assuming the top job at the company.

Among other notable successes, Pichai’s reign at Google has seen the company invest heavily in green energy. Google Cloud said in 2018 it runs entirely on green energy and that the company has invested billions in building a variety of green datacentre facilities across the world, including locations in Finland and Denmark.

Google has also been embroiled in controversy throughout 2019. Most recently, the EU announced it plans to launch an investigation into the company’s data collection practices. The UK’s competition watchdog also announced it will be investigating Google’s £2 billion acquisition of data analysis firm Looker.

Google to axe Cloud Print by 2021


Connor Jones

22 Nov, 2019

Google Cloud has revealed it will be terminating its Cloud Print service on 1 January 2021, warning businesses they will need to find an alternative solution.

The move will affect all devices across all operating systems, according to a Google support note, prompting the need for businesses to “execute a migration strategy” over the next year.

The termination comes as the company shifts towards a native approach to its printing services. Chrome OS received CUPS printing in 2017, a native print management service built into Google’s operating system that will replace Cloud Print after the cut-off date. However, Windows, Mac, Linux and any other users will need alternative solutions.
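
For admins weighing up native alternatives, the sketch below shows roughly what printing through a locally configured CUPS queue looks like from a script. It is a minimal illustration only, assuming a Linux workstation with the pycups bindings installed; the printer name and file path are placeholders, and none of this is part of Google’s migration tooling.

    # Minimal sketch: submit a job to a natively configured CUPS queue
    # as one possible stand-in for Cloud Print on Linux workstations.
    # Assumes the pycups bindings are installed; names are placeholders.
    import cups

    conn = cups.Connection()             # connect to the local CUPS daemon
    printers = conn.getPrinters()        # dict of locally configured queues

    if not printers:
        raise SystemExit("No CUPS printers configured on this machine")

    printer_name = next(iter(printers))  # pick the first queue for the demo
    job_id = conn.printFile(
        printer_name,
        "/tmp/example.pdf",              # placeholder document
        "Cloud Print migration test",    # job title shown in the queue
        {},                              # no extra IPP options
    )
    print(f"Submitted job {job_id} to {printer_name}")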

The announcement is more likely to hurt small businesses that rely on a single vendor for their IT. For example, if an SMB ran only Windows machines and leaned on Cloud Print for all of its printing, having that service taken away means it would have to go out and pay for a new one.

This is especially relevant to businesses that equip their workforce with workstations running different operating systems, or even operate a BYOD policy. In this scenario, a printing service that can manage all these different devices at once will be both necessary and costly.

Larger businesses have time to adapt to the change and can even work with other printing vendors to create a bespoke cloud-based printing product for their business.

Reaction has been mixed among customers, with many taking to Twitter to voice their concerns.

One user said “Google should consider open sourcing the entire Cloud Print project,” while others said the move was indicative of the modern workplace’s shift towards the paperless office.

“It’s the only way I’ve printed ever since it came out. Can print from Android, make it act as a default printer in windows. Used to be able to do the same on a Mac,” said another user. “Could always just go to the website and print any document you could upload from anywhere.”

Incidentally, IT Pro’s own office relies on Google Cloud Print.

“I think it might just be a case of adding the printer to our print server and working around a way to either send out a handout explaining to users on what to do to add the printer or setting times to physically add it,” our own IT department explained.

Cloud Print joins a growing list of Google products axed in 2019. Dedicated email app Inbox was terminated in June, Hire was ended in August and, most notably, Google+ was finally shuttered in April 2019 following a series of security flaws impacting millions of its users.

Can Google Stadia finally bring success to cloud gaming?


Connor Jones

18 Nov, 2019

It’s not usually part of our remit but, despite it being a gaming-geared announcement, there’s something about the new Google Stadia service that interests us: specifically, its infrastructure, which raises questions we can’t quite answer yet.

Codenamed Project Stream before its launch, Google’s game streaming service is far from the first to grace the market, but it might be the most well-timed attempt of them all. It aims to bring console-quality games directly to virtually any screen with no need for a physical console.

Games will be controlled by a new Stadia controller which connects directly to the platform via Wi-Fi. Not the screen, not the Chromecast – directly to the game client residing in the cloud.

It’s important to note this isn’t the first time a cloud streaming service of this kind has been attempted. OnLive and PlayStation Now, to name but two, promised so much but delivered very little.

OnLive ran into money issues, with the cost of running its infrastructure far exceeding the income it made. The service is reported to have cost millions of dollars to run every month, yet at launch, and for a few weeks after, the company took in only single-digit daily income because of its “try before you buy” policy on games.

PlayStation Now is actually still alive and kicking. The premium service hasn’t been adopted nearly as widely as first thought, probably due to a combination of its reported heavy input lag and poor variety of supported games.

So Stadia must overcome some significant challenges to breathe new life into the concept. With edge computing, it theoretically has an advantage over OnLive, and, being Google, it has already managed to secure a decent selection of launch titles to help ensure day-one success.

What caught our eye is the cloud and edge computing aspects of the service’s infrastructure and streaming strategy. When OnLive attempted cloud gaming, the supporting infrastructure of the time was unfit for purpose and proved the main obstacle to building a system that actually worked. The streaming speed and quality were passable as a proof of concept and just about playable with some games, but launching Stadia eight years after the failure of OnLive, Google must do much better.

So many questions

Google has said it will be using a combination of its highly advanced data centres and edge infrastructure to deliver gaming at low latencies, something that, especially in the online multiplayer space, is of vital importance.

Phil Harrison, the Google VP and GM leading the Stadia project, said that the measurable latency issues seen in Project Stream are “solved and mitigated”.

“There are some investments in the datacentre that will create a much higher experience for more people, and there are some fundamental advances in compression algorithms,” he told Eurogamer.

“One important thing to keep in mind is that we are building our infrastructure at the edge. It’s not just in our central, massive datacentres. We’re building infrastructure as close to the end user, the gamer, as possible – so that helps mitigate some of the historical challenges”.

The interaction between the datacentre and the edge is unclear, specifically to what extent each will affect the overall processing and transmission of game data. What’s been said so far seems somewhat confusing and at times contradictory. For example, Harrison spoke about microsecond ping speeds for gamers but 20ms edge-to-datacentre speeds.

Google’s massive number of datacentres, according to Harrison, will be pivotal in delivering the Stadia experience the tech giant has imagined. Harrison said Google’s datacentres offer the theoretically unlimited compute capacity needed for a cloud-based streaming service to thrive. Gaming developments have, in years gone by, been limited because of hardware and people’s reluctance to upgrade for a few years, or until the next console life cycle starts. In the datacentre, CPU and GPU capacity is as powerful as the developer needs it to be to run its game.

Chris Gardner, senior analyst at Forrester, is optimistic about the capability of Google’s infrastructure. “The network configuration is somewhat of a mystery, but clearly Google nailed this because benchmarks have shown perceived input latency to be extremely fast,” he tells Cloud Pro. “Google has experience with network optimisation (all the way down to designing its own protocols) so the performance is not a stretch.”

Take the specified hardware announced by Google and put it into one of Google’s many datacentres and you arguably have a recipe for success, he adds.

Trouble in the network

However, the promises around network speeds proved to be a point of contention for us. Firstly, Harrison told the BBC that 4K gaming can be achieved on download speeds of 25Mbps; for reference, the average UK household gets just 18.5Mbps from its internet connection, and far less in rural areas. The VP said Google expects average network speeds to improve, but it definitely wasn’t a promise.
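
To put that requirement in context, a quick back-of-the-envelope calculation based on Google’s quoted figure shows how much data a sustained 4K session would move; the sustained-bitrate assumption is ours, and real usage will vary with compression and scene complexity.

    # Rough data usage for Stadia's stated 4K requirement.
    # Assumes a sustained 25Mbps stream for the whole session.
    bitrate_mbps = 25      # Google's quoted 4K figure
    session_hours = 1

    gigabytes = bitrate_mbps / 8 * 3600 * session_hours / 1000
    print(f"~{gigabytes:.1f} GB per hour of 4K play")  # roughly 11 GB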

Although Google seems confident that its back-end equipment is up to the task, it’s likely to face the problem of internet service provider (ISP) throttling – world-class servers or not. Harrison has confirmed that Google already has relationships within the wider industry, but it’s possible the company could run into the same problems Netflix faced during its expansion, when users’ streams were throttled until the company agreed to pay ISPs for better interconnection.

It’s a very real possibility that ISPs would throttle bandwidth as popularity grows and network demands are greater. “[Netflix] had to negotiate with the major players to ensure the customer experience wasn’t dreadful,” says Gardner. “I expect the same experience for game streaming providers, although much more so because now it’s not just a bandwidth negotiation, it’s latency as well”.

On the topic of latency, Gardner cited this as his biggest concern about the whole project. “What I expect to see is streaming to be initially successful with casual games, platformers and roleplaying games,” he said. “However, multiplayer games demand low latency and low input lag to stay competitive and enjoyable. This is my biggest concern,” he added. “Shooters, MOBAs and other types of super competitive games – I honestly don’t expect gamers to tolerate the latency.”

Competition is just around the corner

There are only three companies in the world right now positioned well enough to feasibly deliver a cloud-based product like this: Google is one of them; AWS and Microsoft are the others. We just wouldn’t expect any of these to pump so much time and money into something the world isn’t ready for yet.

Google’s main competitor in this area is Microsoft, which is working on Project xCloud, its own game streaming service currently in beta. The company behind the Xbox is certainly lagging compared to Google as its product is currently still in the development stage, but it arguably presents the best case to make this idea of game streaming work. Reports from those selected to test the beta version of xCloud also seem to be unanimously positive.

Couple Microsoft’s prowess in the cloud market with its strong presence in the console sector spanning nearly two decades and you have an impressive backing for what could potentially be a better product than Google’s. It’s possible Microsoft could let Google launch Stadia, learn from its inevitable mistakes, and then blow it out of the water with a far superior service.

Regardless of how all of this plays out, it’s difficult to get excited about something that has failed in so many previous attempts. With so little information about the project disclosed – the kind of information we need to make educated guesses about its viability as a service – we can’t help but look on with scepticism.

US Supreme Court agrees to end Google and Oracle’s ten-year copyright battle


Connor Jones

18 Nov, 2019

The US Supreme Court has agreed to hear a copyright lawsuit between tech giants Google and Oracle that has spanned nearly a decade.

The case was originally brought against Google following Oracle’s 2010 acquisition of Sun Microsystems, the company responsible for developing the Java language. Oracle alleged that Google stole code from Java to build its Android platform, a claim Google has repeatedly denied.

What followed was a series of court hearings and resulting appeals, and although a number of lower courts sided with Google, Oracle has successfully challenged those rulings on appeal.

The case’s most recent ruling came in March 2018, when the Federal Circuit sided with Oracle’s copyright claim, leaving Google facing a damages bill of up to $9 billion (£7 billion).

No damages have yet been paid, as Google petitioned the US Supreme Court in January 2019, asking it to overturn what it described as “a devastating one-two punch at the software industry”.

No date has been set by the Supreme Court, but a one-hour window has been allotted to hear the companies’ arguments. As the US’ highest court, its ruling is likely to be the last word on the lengthy case, which could have lasting effects on the software development industry – namely, on whether application programming interface (API) packages can be copyrighted.

Permitting these vital components of software interoperation to be copyrighted could potentially stifle the software industry, making it difficult for new apps to work with other apps and software platforms.

The Supreme Court previously refused to hear the case following a 2014 Federal Circuit ruling but agreed this time around following support from the likes of Microsoft and Mozilla. The Electronic Frontier Foundation (EFF) has also sided with Google, calling the case a “mess”.

“We welcome the Supreme Court’s decision to review the case and we hope that the Court reaffirms the importance of software interoperability in American competitiveness,” said Kent Walker, Google’s SVP of Global Affairs, speaking to Cloud Pro. “Developers should be able to create applications across platforms and not be locked into one company’s software.”

Cloud Pro contacted Oracle for a statement but had not received a reply by the time of publication.

“Bulletproof” dark web data centre seized by German police


Connor Jones

30 Sep, 2019

German authorities scuppered a pervasive dark web operation on Friday, saying it was being run out of a former NATO bunker.

Seven individuals have been arrested on suspicion of being associated with organised crime and of acting as accessories to hundreds of thousands of crimes committed through the dark web platforms they hosted, such as the Wall Street Market and Cannabis Road.

The outfit is believed to be spearheaded by a 59-year-old Dutchman who, authorities understand, acquired the bunker located in the small town of Traben-Trarbach in 2013.

After buying the bunker, the man, who is yet to be named by authorities, is alleged to have transformed it into a large and highly secure data centre, designed “exclusively for illegal purposes”, according to prosecutor Juergen Bauer, as reported by the Associated Press.

Dark web marketplaces are infamous for being cornucopias of crime where people can buy drugs, weapons, credit card information, forged documents and more.

As individuals linked to the operation of such a site, all 13 suspects, aged between 20 and 59, can be charged as accessories to every crime and transaction that took place on their hosted sites.

“I think it’s a huge success… that we were able at all to get police forces into the bunker complex, which is still secured at the highest military level,” said regional criminal police chief Johannes Kunz. “We had to overcome not only real, or analogue, protections; we also cracked the digital protections of the data centre.”

Authorities described the facility as a “bulletproof hoster”, designed specifically to conceal the activity from law enforcement.

Policing the unknown

The dark web has proven to be a reliable sanctuary for cyber criminals due to its decentralised and anonymous nature. Websites are accessed through The Onion Router (Tor) browser and a user’s connection is redirected through multiple locations around the world, which makes identifying an online criminal nigh-on impossible.

The proliferation of cryptocurrencies has also contributed to criminals’ anonymity as, like their web traffic, payments made using cryptocurrencies are routed through multiple addresses, making them difficult to track.

It started with bitcoin, but since then other cryptocurrencies have gained popularity, and newer, more anonymous coins have been devised. Monero is one such coin favoured by criminals, as it conceals the sender’s and recipient’s addresses more comprehensively than others.

Cryptocurrency tumblers are another tool that hampers policing efforts. They offer a service that’s the cryptocurrency equivalent of money laundering; users send their coins to a tumbling service, pay a fee and get completely different coins in return, further complicating tracking efforts made by authorities.

While authorities have famously been able to clamp down on certain marketplace operations, their success, in some cases, hasn’t been attributed to sophisticated web tracking techniques – the fatal clues have sometimes been found through the criminals’ poor web hygiene.

For example, perhaps the best-known dark web market, Silk Road, was eventually seized after authorities found posts made by its owner, Ross Ulbricht, advertising the marketplace on a ‘clear net’ bitcoin forum, along with his personal email address in a separate post.

The network is difficult to crack, but as the FBI demonstrated with the seizure of Playpen, sites can be taken down if investigators compromise the endpoint. Authorities deployed malware on the abuse-distribution platform that revealed the IP address of any user who clicked on illegal images, leading to the arrest of the site’s operator.

IT Pro contacted the National Cyber Security Agency for comment but had not received a reply at the time of publication.

Dedicated global taskforces

As the dark web becomes a more widespread issue, dedicated dark web security organisations have been formed around the world to help tackle the issue.

The seizure of the AlphaBay and Hansa marketplaces in 2017 was a globally coordinated effort named Operation Bayonet, led by Europol with help from law enforcement authorities in Thailand, the Netherlands, Lithuania, Canada, the United Kingdom and France.

The huge effort required in Bayonet provided the catalyst for the formation of Europol’s own dedicated dark web team, and the US followed suit six months later with its Joint Criminal Opioid Darknet Enforcement (J-CODE) team.

“Criminals think that they are safe on the darknet, but they are in for a rude awakening,” said then-Attorney General Jeff Sessions at the launch of J-CODE. “We have already infiltrated their networks, and we are determined to bring them to justice.

“In the midst of the deadliest drug crisis in American history, the FBI and the Department of Justice are stepping up our investment in fighting opioid-related crimes. The J-CODE team will help us continue to shut down the online marketplaces that drug traffickers use and ultimately that will help us reduce addiction and overdoses across the nation.”

Google invests $3 billion in European data centre expansion


Connor Jones

20 Sep, 2019

Google’s CEO Sundar Pichai announced today that the company will be investing a further three billion euros (around £2.6 billion) into European data centres over the next two years.

This additional investment brings Google’s total investment in European digital infrastructure to 15 billion euros (around £13.2 billion) since 2007 – an endeavour which has supported 13,000 jobs, according to a Copenhagen Economics study.

In addition, a further 600 million euros (around £530 million) will be pumped into the expansion of its data centre operations in Hamina, Finland, a site it originally bought in 2009 and transformed from an old paper mill into a high-tech facility that supports 4,300 jobs.

“The Nordic countries are great examples of how the internet can help drive economic growth,” said Pichai. “Our Hamina data centre is a significant driver of economic growth and opportunity. It also serves as a model of sustainability and energy efficiency for all of our data centres.”

The Hamina facility is situated near to the Russian border and uses seawater taken from the Gulf of Finland to reduce the amount of energy required to cool the hardware.

Google announced yesterday that it has followed through on its commitment to using as much green energy as possible by completing the largest corporate purchase of renewable energy in history.

“These deals will increase our worldwide portfolio of wind and solar agreements by more than 40 percent, to 5,500 MW – equivalent to the capacity of a million solar rooftops,” said Pichai. “Once all these projects come online, our carbon-free energy portfolio will produce more electricity than places like Washington D.C. or entire countries like Lithuania or Uruguay use each year.”

Currently, Google’s other European data centres are located in the Netherlands, Ireland and Belgium, but last year it announced plans to build an entirely carbon-neutral data centre in Denmark, adding to its European data centre portfolio and bolstering its green energy drive.

The tech giant plans to invest $700 million (around £617 million) into the new green site in Fredericia, Denmark and use machine learning to ensure every watt is used effectively.

Europe is somewhat of a hotbed for data centres, particularly Scandinavia, where Google’s facilities can operate with better energy efficiency than in other locations around the world.

Microsoft revs up connected car cloud service with TomTom


Connor Jones

9 Sep, 2019

Satellite navigation giant TomTom has partnered with Microsoft to integrate its technology into the Redmond company’s cloud-based Microsoft Connected Vehicle Platform (MCVP).

Navigation usage data will be collected and sent back to the platform, which works in tandem with Microsoft Azure, allowing car manufacturers to make better-informed decisions about tailored services thanks to being able to tap into the compute power of a large cloud platform.

Diagnostic data will also be fed back to the platform, allowing car makers to make data-driven decisions about engineering and design changes.

TomTom’s location intelligence, which includes traffic information and map services, will also be made available to cars’ navigation apps, in addition to aiding autonomous driving.

MCVP aims to unify connected cars and the data they collect with the Azure platform so customers can create improved in-vehicle services, such as traffic alerts, and better understand the needs of those with connected cars.

It extends beyond consumers, too: commercial and industrial vehicles are all compatible with the platform, so businesses can harness the data from their fleets of trucks, ships, drones and cranes to help create more efficient processes.
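
Microsoft hasn’t detailed MCVP’s actual ingestion APIs here, but the kind of vehicle-to-cloud telemetry flow described above can be sketched with Azure’s general-purpose IoT Hub SDK for Python. The connection string, device identity and payload fields below are placeholders for illustration, not part of MCVP or TomTom’s integration.

    # Rough sketch of a connected vehicle sending navigation and
    # diagnostic telemetry to an Azure IoT Hub via the azure-iot-device
    # SDK. Connection string and payload fields are placeholders;
    # MCVP's real ingestion path is not described in the article.
    import json
    from azure.iot.device import IoTHubDeviceClient, Message

    CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<vehicle>;SharedAccessKey=<key>"

    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()

    telemetry = {
        "vehicle_id": "demo-vehicle-001",   # placeholder identity
        "speed_kph": 62,                    # example navigation datum
        "engine_temp_c": 88,                # example diagnostic datum
    }
    client.send_message(Message(json.dumps(telemetry)))
    client.disconnect()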

Microsoft has already attracted prominent vehicle manufacturers to its platform; Volkswagen agreed last year to build its automotive cloud platform on Azure.

“Our integration with the Microsoft Connected Vehicle Platform means that automakers can get access to precise and reliable navigation and driving behaviour data easily, while of course adhering to privacy principles,” said Cees van Dok, chief product officer at TomTom.

“This data could, for instance, be used to predict the range of an electric vehicle based on driving behaviour and planned route more accurately; or to work out, based on navigation behaviour, what connectivity package for online navigation would be best suited for a driver. This is a game-changer for OEMs.” 

This TomTom-Microsoft partnership is an extension of an existing relationship, which was bolstered in February after the navigation specialist was selected by Microsoft as the sole location data provider for its mapping services. TomTom’s data is used across a variety of Microsoft products, including Azure Maps, Bing, Cortana and Windows, and will also be used in future releases.

“With Microsoft Connected Vehicle Platform serving as the digital chassis of the car, telematics, infotainment, and data from sensors are all connected to the cloud in the same way, effectively solving the pain point of managing different systems for scale, security, and reliability,” said Tara Prakriya, partner group program manager of Microsoft Connected Vehicle Platform and mobility at Microsoft. “We’re delighted to add navigation intelligence data from TomTom to MCVP.” 

The pair’s partnership hasn’t always been so fruitful, though. Back in 2009, they both sued each other within a month, alleging patent infringements on both sides. The case was later settled with both sides having to pay the other an undisclosed sum.

Microsoft launches bug bounty programme for Chromium-based Edge


Connor Jones

21 Aug, 2019

Microsoft has launched a fresh bug bounty programme specifically for its Chromium-based Edge browser, offering rewards double the value of those for the previous HTML-based version of Edge.

The maximum reward for hunters finding significant flaws in the latest version of its flagship browser has increased to $30,000 for the most critical vulnerabilities.

Other issues will be judged by their significance, depending on how impactful the flaw is to future versions of Edge, with hunters being rewarded from $1,000 upwards.

The launch of the latest bug bounty programme coincides with the launch of the beta preview of the next Edge version and will work hand-in-hand with Microsoft’s Researcher Recognition Program.

The initiative acts somewhat like a loyalty card for bug hunters who follow Microsoft’s vulnerability disclosure process: points are awarded for every bug they report and these points can be multiplied depending on the product in which they’re found.

A bug found in Azure or Windows Defender, for example, is eligible for a 3x points multiplier whereas Edge on Chromium gets a mere 2x multiplier – GitHub and LinkedIn receive none.
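
As a rough illustration of how those multipliers scale a researcher’s recognition points, here is a small sketch using the figures quoted above; the base point value and the scoring model itself are our own simplification, not Microsoft’s published formula.

    # Illustrative only: scales a report's base points by the product
    # multipliers mentioned above. The base points and scoring model
    # are simplified assumptions, not Microsoft's actual scheme.
    PRODUCT_MULTIPLIERS = {
        "Azure": 3.0,
        "Windows Defender": 3.0,
        "Edge on Chromium": 2.0,
        "GitHub": 1.0,     # no multiplier
        "LinkedIn": 1.0,   # no multiplier
    }

    def recognition_points(base_points: int, product: str) -> float:
        return base_points * PRODUCT_MULTIPLIERS.get(product, 1.0)

    for product in ("Azure", "Edge on Chromium", "GitHub"):
        print(product, recognition_points(10, product))  # 30.0, 20.0, 10.0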

Once a hunter accrues enough points, they “may be recognised in our public leaderboard and rankings, annual Most Valuable MSRC Security Researcher list, and invited to participate in exclusive events and programs,” said Microsoft.

The programme will also run alongside the pre-existing bug bounty for the HTML version of Edge, which offers rewards of between $500 and $15,000.

“Vulnerabilities that reproduce in the latest, fully patched version of Windows (including Windows 10, Windows 7 SP1 or Windows 8.1) or MacOS may be eligible for the Microsoft Edge Insider bounty program,” said Microsoft. “Windows Insider Preview is not required.”

Since the browser is built on Chromium, the new bug bounty programme will complement the Chrome Vulnerability Reward Program, “so any report that reproduces on the latest version of Microsoft Edge but not Chrome will be reviewed for bounty eligibility based on severity, impact, and report quality,” it added.

The Chrome Vulnerability Reward Program currently offers rewards ranging from $500 to $150,000, with the greatest rewards likely to be issued for bugs found in Chrome OS.

Apple also announced the expansion of its bug bounty programme at Black Hat 2019 in August, making it the most lucrative bounty programme in tech.

In addition to dishing out special iPhones to select bug hunters, making it easier for them to investigate the flagship Apple device, it announced a maximum reward for bugs of up to $1.5 million.

Back in March, an Argentinian teenage bug hunter became the first in the world to earn $1 million from lawfully finding and disclosing bugs in bounty programs. He reported more than 1,600 bugs – notable inclusions were major issues with Twitter’s and Verizon’s products.

The majority of Chrome extension installs are split across these 13 apps


Connor Jones

5 Aug, 2019

Google’s Chrome extension store is said to be dominated by just a handful of popular applications, with the majority of its application selection having fewer than 1,000 installs, according to a new study.

Figures released by Extension Monitor show that although Chrome now boasts over 1 billion extension installs, only 13 apps have over 10 million installs each.

Of the 188,000 extensions that make up the store, it’s believed as many as 87% have fewer than 1,000 installs, including 24% that have either one or zero installs. The figures also show that around half of all extensions have been installed fewer than 16 times.

Security was a common theme identified when looking at the most downloaded extensions – adblockers, antivirus applications, password managers and VPNs dominated the list of most popular extensions. Other prominent categories included communications and shopping.

Well-known apps such as Grammarly, Adblock, Honey, Avast Online Security, Skype and Google Translate dominated the top spots. LastPass and Google Hangouts were among the apps just shy of the 10 million mark.

The 10 million club:

  • Cisco Webex Extension
  • Google Translate
  • Avast Online Security
  • Adobe Acrobat
  • Grammarly for Chrome
  • Adblock Plus – free ad blocker
  • Pinterest Save Button
  • Skype
  • AdBlock
  • Avast SafePrice
  • uBlock Origin
  • Honey
  • Tampermonkey

Even though a large proportion of extensions have a comparatively low install base, it’s often the extensions in this bracket that are the most malicious, and collectively they can still target a large number of users. Last month we reported that some Google Chrome extensions harvest user data as part of a “murky data economy” and then sell that data on to Fortune 500 companies.

The scheme was thought to have affected up to 4 million users across the various extensions, most of which had thousands of installs each, although some exceeded one million. The sensitive data was then accessible by anyone who was willing to pay a fee as small as $49.

In response, Google pointed users to its policy changes made in June 2019 and how it plans to make the Chrome Web Store more secure, a policy that’s since been slammed by the Electronic Frontier Foundation (EFF).

The organisation said that the changes would do nothing to secure the Web Store as they don’t address the APIs used by extensions to aggregate and sell data. Instead, the EFF claims Google should simply enforce existing policy properly.

“Ultimately, users need to have the autonomy to install the extensions of their choice to shape their browsing experience, and the ability to make informed decisions about the risks of using a particular extension,” said the EFF. “Better review of extensions in Chrome Web Store would promote informed choice far better than limiting the capabilities of powerful, legitimate extensions.”

JEDI contract put on hold after intense lobbying efforts


Connor Jones

2 Aug, 2019

The $10 billion JEDI contract to supply cloud computing services to the Pentagon has been halted after an aggressive lobbying campaign from rival tech companies.

According to CNN, which first reported the story, an inside campaign was allegedly carried out to dissuade President Trump from choosing Amazon’s AWS as the winner of the contract.

Amazon and Microsoft are currently the only two companies in the race after Oracle and IBM were knocked out of the running months ago. However, a one-page document handed to Trump appears to visually outline Amazon’s supposed ten-year plan for cloud monopolisation.

The document is identical to one created by Oracle’s top Washington lobbyist, Kenneth Glueck, an executive vice president with the company, Glueck told CNN.

CNN remarked that the document delivered to Trump, which may have been the deciding factor in delaying the JEDI contract that was due to be awarded this month, was designed to play on the feud between Trump and Amazon CEO Jeff Bezos.

“So sorry to hear the news about Jeff Bozo being taken down by a competitor whose reporting, I understand, is far more accurate than the reporting in his lobbyist newspaper, the Amazon Washington Post,” tweeted Trump in relation to Bezos’ divorce at the time. “Hopefully the paper will soon be placed in better & more responsible hands!”

Defence Secretary Mark Esper is currently investigating allegations of unfairness in the awarding of the contract, according to Pentagon spokeswoman Elissa Smith.

“Keeping his promise to Members of Congress and the American public, Secretary Esper is looking at the Joint Enterprise Defense Infrastructure (JEDI) program,” Smith said in a statement on Thursday to Reuters. “No decision will be made on the program until he has completed his examination.”

Speculation surrounding the treatment of AWS in the contract’s bidding process has raged on for months; some have argued that the nature of the contract itself favours AWS and the services it offers.

Reports also suggest that Senator Marco Rubio penned a letter to national security advisor John Bolton requesting the contract be delayed.

“I respectfully request that you direct the delay of an award until all efforts are concluded in addition to evaluating all bids in a fair and open process in order to provide the competition necessary to obtain the best cost and best technology for its cloud computing needs,” Rubio reportedly wrote.

The Joint Enterprise Defense Infrastructure (JEDI) contract is worth $10 billion, and the project to overhaul the Pentagon’s IT infrastructure into a contemporary cloud-based system could span 10 years.