Pentagon opens winner-takes-all cloud defence contract – as REAN tender pared back

Microsoft said earlier this week that government IT had reached a tipping point towards moving to the cloud – and now another large US contract is up for grabs.

During an Industry Day yesterday, the Pentagon revealed plans for the Joint Enterprise Defence Infrastructure (JEDI) cloud contract, which will be put out to tender to a single cloud provider.

The key is in the word ‘single’: as reported by Bloomberg, defence staff said that using multiple clouds would ‘exponentially increase the complexity’ of the project, so the plan is to go with one cloud vendor only.

As a draft statement of objectives explains: “The Department of Defence’s lack of a coordinated enterprise-level approach to cloud infrastructure makes it virtually impossible for our warfighters and leaders to make critical data-driven decisions at ‘mission speed’, negatively affecting outcomes.

“A fragmented and largely on-premise computing and storage solution forces the warfighter into tedious data and application management processes, compromising their ability to rapidly access, manipulate, and analyse data at the home front and tactical edge,” the document adds. “Most important, current environments are not optimised to support large, cross domain analysis using advanced capabilities such as machine learning and artificial intelligence to meet current and future warfighting needs and requirements.”

This comes amidst something of an issue with a recent tender. In February, the Pentagon awarded a contract for up to $950 million to REAN Cloud, a Virginia-based managed services provider, for automated pricing and procurement. Earlier this week, it was revealed that this contract had been cut to as low as $65m. Oracle had protested against the original decision according to a filing on February 20. In a statement, REAN said it was honoured to be performing work for the US military, and that it was unaware of the reasons as to why the contract was trimmed.

“Based on the threat of legal action and protest by the old guard, the only winners in this delay are those large companies that stand to lose money if DoD proceeds with innovation,” said Sekhar Puli, REAN Cloud managing partner. “In the meantime, the cost of maintaining antiquated government infrastructure has not subsided.”

Microsoft recently expanded its government cloud offerings to include government-specific editions of Microsoft 365 and Azure Stack. At the time Julia White, corporate vice president of Azure, wrote that “evidence we are at a tipping point for government to modernise IT with the cloud is coming from agencies across every level and branch of government.”

Long-time readers of this publication will remember the imbroglio between Amazon Web Services (AWS) and IBM in 2013 for a $600 million CIA cloud computing contract eventually won, after legal rulings, by AWS. If this, and the most recent contract, are anything to go by, expect a battle to be fought to the very end by the main cloud players.

Suppliers welcome the advent of G-Cloud 10, but concerns remain


Joe Curtis

8 Mar, 2018

Public sector suppliers have welcomed a government turnaround that will now see G-Cloud 10 go live in June 2018.

Vendors will be able to apply to list their services on the public sector cloud procurement framework from April, much sooner than expected after Whitehall had initially mooted extending the previous iteration of G-Cloud into 2019.

Crown Commercial Services’ (CCS’s) decision to effectively reverse its earlier position means a new glut of small cloud suppliers can bid for government work, while companies already listed on G-Cloud can update the services they offer.

Oliver Dowden, minister for implementation, said: “I’m pleased to confirm that we will re-let the G-Cloud framework, which provides opportunities to many small businesses in the digital sector.

“This will provide innovative online solutions to government, supporting the delivery of efficient, effective public services. Small businesses are the backbone of our economy, so it’s crucial that we listen to them when shaping policy, as we have done today.”

Suppliers can typically expect details of a new version of G-Cloud to circulate around six months before it’s due to launch, but by November last year they had yet to receive any information from CCS, which subsequently extended G-Cloud 9 until May 2019.

But rumours of a turnaround have been rumbling for some time, and CCS cancelled a webinar back in January, saying it was “overwhelmed by the feedback we have received for G10”.

Suppliers can finally update their offerings

News this week that G-Cloud 10 is once again on the horizon was therefore welcomed by suppliers that rely on the framework for large portions of their businesses.

Nicky Stewart, commercial director of IaaS, PaaS and email provider UKCloud, told Cloud Pro: “I’m absolutely delighted that they have brought it forward for all sorts of reasons. First but not foremost it shows CCS is capable of listening because there was a bit of an outcry when they did delay G10.

“For UKCloud, we have a host of exciting new products and services and it’s great now we have a near-term vehicle to reach public sector buyers with those, but above all it shows that government is committed to G-Cloud, that it’s committed to the SMB community and the rapid benefits G-Cloud is bringing government.”

Harry Metcalfe, MD of digital design supplier dxw, added: “We have lots of new services we were working on for G-Cloud so we were irritated it was delayed, so this is really great news.

“Frankly I think that speaks quite highly of CCS [that they brought G10 forward]. CCS puts itself out there as an organisation that listens to its suppliers and here they have demonstrated that’s really truly the case.”

The current G-Cloud 9 framework has 2,856 suppliers, over 90% of which are SMBs, and the arrival of version 10 means even more suppliers will be able to join it.

Industry trade body TechUK’s head of public sector, Rob Driver, said: “The announcement of the G-Cloud 10 framework should be welcomed as it allows new innovative providers to work with government, enables new services to be provided and is an opportunity to engage with the wider public sector to make use of the framework.”

But challenges for suppliers remain with the Digital Marketplace

Since its creation in 2012, G-Cloud has seen public sector buyers spend more than £2.8 billion with private sector companies, with nearly half of that going to small and medium businesses.

But this spending data is old, dating from the end of 2017, and this is just one of the issues suppliers still have with G-Cloud, and the Digital Marketplace within which the framework sits.

“With G-Cloud there’s no visibility of the tenders and opportunities,” said UKCloud’s Stewart. “For a very transparent framework one of the areas that could be improved is you have no idea as a supplier if you’re on any buyers’ lists or in the running for an opportunity.”

Only shortlisted suppliers get that insight, making it harder to understand what opportunities are in the pipeline, what competitors you’re up against or what buyers are searching for, she said.

“The only way to extrapolate that kind of data is to get the G-Cloud spend data, but … we have only got data up to the end of 2017. That means it’s quite difficult to understand the market.”

Another issue Stewart highlighted was the fact that suppliers cannot alter prices on a framework, and are being forced to wait until the next iteration comes along to make revisions. This is an issue for vendors that face increased third-party costs, she said.

“There has to be some consideration of how do you deal with third-party price rises that are out of suppliers’ control,” she argued. “SMBs are not as well placed to absorb those increases as bigger suppliers.”

Unspecified delays to the next versions of both the Digital Outcomes and Specialists (DOS) framework to draft in digital specialists, and the Cyber Security Services framework, have become a bugbear for dxw’s Metcalfe, but he said he’s hopeful they won’t be delayed for much longer.

Proxy Models in Container Environments | @DevOpsSummit #CloudNative #Serverless #DevOps

Today, containers use some of the same terminology as earlier infrastructure, but are introducing new terms of their own. That’s an opportunity for me to extemporaneously expound on my favorite of all topics: the proxy. One of the primary drivers of cloud (once we all got past the pipedream of cost containment) has been scalability. Scale has vied with agility (and sometimes won) in various surveys over the past five years as the number one benefit organizations seek by deploying apps in cloud computing environments.


Ocado improves fraud detection with machine learning


Bobby Hellard

8 Mar, 2018

Ocado Technology is fighting fraud with machine learning, and has improved its detection rate 15-fold since automating the process.

The technology division of the online supermarket has built its own machine learning algorithm with Google’s TensorFlow to help it detect fraud among the petabytes of data it stores in Google Cloud.

The system is part of the Ocado Smart Platform (OSP), and attempts to differentiate between shoppers who accidentally input the wrong personal details or who use an expired debit card, and those who have a malicious intent.

For Ocado, part of the danger of fraud is that leaving it unchecked could result in the fraudulent information being moved to other systems and divisions in the business, eventually impacting customer service.

This has led Ocado to develop a mechanism for predicting and recognising these incidents among millions of other normal events using data collected from past orders, as well as cases of fraud.

To do this, engineers have built a deep neural network using TensorFlow and uploaded the whole fraud detection system to the cloud.

“From the data we had collected from past orders, including cases of fraud, we created a list of features which included the number of past deliveries, the cost of baskets, and other information,” said Ocado’s Holly Godwin. “The more features we included in the training data, the more reliable the model could be, so we made sure that we were providing our model with as much information as possible.

“After collating our data, we then had to decide upon an algorithm capable of learning from the information. Eventually we implemented a deep neural network on TensorFlow, as it was precise and easy to deploy into production. Using TensorFlow was a natural choice as we had already made the move over to Google Cloud for data analytics so using TensorFlow alongside our data stored on the Google Cloud Platform worked well.”
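Godwin’s description maps onto a fairly standard supervised-learning setup: engineer features per order, then train a classifier to separate fraud from normal activity. As a rough illustration only (not Ocado’s actual code, which uses a deep neural network in TensorFlow), the sketch below trains a tiny logistic-regression model in plain Python on invented order features of the kind the article mentions, such as past-delivery counts and basket cost:

```python
import math

# Hypothetical per-order features, loosely following the article:
# [number of past deliveries, basket cost in GBP]. Label 1 = fraud.
# These records are invented purely for illustration.
training_data = [
    ([12, 45.0], 0), ([30, 60.0], 0), ([25, 30.0], 0), ([18, 80.0], 0),
    ([0, 400.0], 1), ([1, 350.0], 1), ([0, 500.0], 1), ([2, 300.0], 1),
]

def normalise(x):
    # Crude feature scaling so gradient descent behaves.
    return [x[0] / 30.0, x[1] / 500.0]

def predict(weights, bias, x):
    # Sigmoid of a weighted sum: probability that the order is fraudulent.
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.5):
    # Stochastic gradient descent on the log loss.
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for raw, label in data:
            x = normalise(raw)
            err = predict(weights, bias, x) - label
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
            bias -= lr * err
    return weights, bias

weights, bias = train(training_data)

# A first-time customer with a very expensive basket should score as riskier
# than a long-standing customer with a typical basket.
risky = predict(weights, bias, normalise([0, 450.0]))
safe = predict(weights, bias, normalise([20, 50.0]))
print(risky > safe)
```

In production the same idea scales up: more features, far more data, and a deep network in place of the single sigmoid unit.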

The algorithm had to be as accurate as possible, because confirmed fraud cases are rare – typically one in every thousand orders (0.1%) – and a machine learning model that is only 99.9% accurate could still miss several instances of fraud.
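That class-imbalance point is worth making concrete: with fraud this rare, a useless model that flags nothing at all still scores 99.9% accuracy while catching zero fraud, which is why metrics like precision and recall matter more than raw accuracy here. A quick back-of-the-envelope check with synthetic numbers (not Ocado’s data):

```python
# 1,000,000 hypothetical orders with a 1-in-1,000 fraud rate.
total_orders = 1_000_000
fraud_orders = total_orders // 1000  # 1,000 fraudulent orders

# A "model" that simply predicts 'not fraud' for every single order:
true_negatives = total_orders - fraud_orders
accuracy = true_negatives / total_orders
recall = 0 / fraud_orders  # it catches none of the fraud

print(f"accuracy = {accuracy:.1%}")  # 99.9% accurate...
print(f"recall   = {recall:.1%}")    # ...yet 0.0% of fraud caught
```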

“The motivation behind using machine learning for fraud detection was twofold: speed and adaptability,” Godwin added. “Machines are fundamentally more capable of quickly detecting patterns compared to humans. Also, as fraudsters change their tactics, machines can learn the new patterns much quicker.”

Ocado’s push to include more features in the training data has created a more reliable model which has already achieved significant success, improving Ocado’s precision in detecting fraud by a factor of 15.

Why cloud-native virtual network functions are important for NFV

Virtual network functions (VNFs) are software implementations of network equipment, packaged in virtual machines and run on commercial off-the-shelf hardware in an NFV infrastructure. VNFs are a core part of NFV: the premise of NFV is to virtualise network functions in software to reduce cost and give operators full control over network operations, with added agility and flexibility. Most NFV work is therefore focused on how VNFs can be served within the NFV infrastructure to introduce new services for consumers, and we can expect most future developments to relate to VNFs.

VNFs are distinct from NFV in that VNFs are supplied by external vendors or open source communities to the service providers transitioning their infrastructure to NFV. Several VNFs may combine to form a single NFV service. This adds complexity and works against NFV’s goal of agility, since VNFs from different vendors, each with a different operational model, need to be deployed in the same NFV infrastructure.

VNFs developed by different vendors have different methodologies for complete deployment in existing NFV environments. Onboarding VNFs remains a challenge due to a lack of standard processes for complete management from development to deployment and monitoring.

At a basic level, traditional VNFs come with limitations such as:

  • They consume large amounts of hardware in order to be highly available
  • They are developed, configured and tested to run on specific NFV hardware infrastructure
  • They need manual installation, configuration and deployment on the NFV infrastructure (NFVi)
  • They provide no API to enable automated scaling or reconfiguration in response to sudden spikes in demand
  • They do not support multi-tenancy, so VNFs cannot easily be shared or reused within the infrastructure

Building cloud-native VNFs is the solution for vendors: applying the cloud-native approach to software development gives VNFs the full set of cloud-native characteristics. Cloud-native VNFs can be expected to be containerised, microservices-based, dynamically managed and specifically designed for orchestration. The major differentiators of cloud-native VNFs over traditional VNFs are self-management capability and scalability.

Building cloud-native VNFs overcomes the limitations of traditional VNFs. Cloud-native VNFs expose APIs that enable:

  • Automated installation and configuration
  • Automated scaling in response to dynamic network demand
  • Self-healing and fault tolerance
  • Automated monitoring and analysis of VNFs for errors, capacity management and performance
  • Automated upgrading and patching of VNFs to apply new releases
  • Standard, simplified management, which lowers power consumption and reduces unnecessarily allocated resources
  • Reusability: processes within VNFs can be shared, and VNFs can easily be shared within an NFV environment
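To make the automated-scaling point above concrete, here is a minimal sketch of the kind of control loop an orchestrator might run against a cloud-native VNF’s scaling API. The instance-group model, names and thresholds are all hypothetical – invented for illustration, not taken from any real NFV platform:

```python
import math
from dataclasses import dataclass

@dataclass
class VNFInstanceGroup:
    """Hypothetical handle onto a cloud-native VNF managed via an API."""
    name: str
    instances: int
    min_instances: int = 1
    max_instances: int = 10

def desired_instances(group, cpu_utilisation, target=0.6):
    """Simple proportional autoscaling rule: size the group so average
    utilisation heads back towards the target. A real orchestrator would
    apply the result by calling the VNF's scaling API rather than by hand."""
    wanted = math.ceil(group.instances * cpu_utilisation / target)
    return max(group.min_instances, min(group.max_instances, wanted))

firewall = VNFInstanceGroup(name="vFirewall", instances=2)

# A sudden spike in traffic pushes average utilisation to 90%:
print(desired_instances(firewall, 0.90))  # scale out to 3 instances
# Load falls away overnight to 15%:
print(desired_instances(firewall, 0.15))  # scale in, floored at 1 instance
```

The point is not the arithmetic but the contract: because the VNF exposes a scaling API, this decision can be made and applied by software with no manual intervention – exactly what traditional VNFs lack.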

NFV is a key technology in the development of 5G networks, but it is still maturing: NFV solution providers are resolving challenges such as automated deployment and VNF onboarding. Developing a VNF and deploying it into NFV infrastructure sounds simple, but scaling, configuring or updating VNFs raises difficult questions. Today, almost any VNF-related task needs manual intervention, which slows service providers down when launching or updating services.

Delivering on NFV’s promise of agility in 5G requires exceptional automation at every level of NFV deployment. Building cloud-native VNFs appears to be the solution – but it is at a very early stage.

The post Importance of Cloud-Native VNFs for NFV Success appeared first on Sagar Nangare.

VMware and AWS bring hybrid partnership to Europe


Clare Hopping

8 Mar, 2018

VMware Cloud is now available on Amazon Web Services (AWS) in Europe, meaning businesses can take advantage of locally-hosted services to migrate their cloud services and hybrid cloud deployments.

The two companies explained this will speed up the process of migrating and deploying workloads, as well as having obvious benefits for industries requiring locally-hosted infrastructure.

The first European zone to launch will be the AWS EU (London) region, with Frankfurt due to launch later this year, alongside Asia Pacific.

“Since launching VMware Cloud on AWS just six months ago, we’ve seen tremendous interest from our global customer base and multi-national enterprises,” Mark Lohmeyer, vice president and general manager of VMware’s cloud platform business unit, said.

“Today marks an essential starting point for our global expansion to deliver unparalleled hybrid cloud services in major geographies around the world.”

AWS said the launch was a result of customer demand, with businesses in the healthcare, transportation, financial services, manufacturing, oil & gas, government, education, professional services, and technology sectors already using the cloud service to store their VMware tools.

“Working together, VMware and AWS are delivering deeper AWS integration so that customers won’t have to manage their own storage and database services,” said Matt Garman, vice president of AWS Compute Services.

As well as announcing VMware Cloud on AWS launching in Europe, VMware also revealed new services in its cloud portfolio, including VMware Hybrid Cloud Extension Service for Private Cloud to help businesses migrate and deploy their apps across different vSphere versions, on-premises and in the cloud.

Expanded Wavefront is VMware’s effort to provide a metrics monitoring and analytics platform that supports a range of public and private cloud infrastructure, including VMware Cloud on AWS, with 45 new integrations that can be monitored by Wavefront.

VMware’s Log Intelligence Service offers operational insights into VMware-based data centres and VMware Cloud on AWS, helping businesses troubleshoot and log activity on cloud platforms using machine learning and dashboards, while Expanded VMware Cost Insight Service calculates the cost and capacity demands of running VMware Cloud on AWS workloads in private or public clouds.

“The need to support a complex set of new and existing applications is driving cloud adoption, and the needs of the applications are driving cloud decisions,” said Raghu Raghuram, chief operating officer of products and cloud services at VMware.

“VMware Cloud gives customers unprecedented flexibility to develop any type of application, deploy these apps to any cloud, and deliver them to any device while leveraging a consistent infrastructure across clouds and a consistent set of operations across any cloud.”

Failing at Customer Success? Add Recursion to the Value Chain

In B2B situations, customer outcomes are far more likely to correspond to quantifiable key performance indicators (KPIs) like customer profitability, productivity, cost savings, etc. Customer delight might very well be on this list, but it is far more difficult to measure. We can thus come to the reasonable conclusion that customer success initiatives will only be successful if the people responsible for it take their focus off of inward-facing metrics like churn, upsells, and cross-sells, and instead give their full attention to such customer outcomes.

Just one problem: in the B2B scenario, simply focusing on business KPIs as business outcomes still falls short as a true metric for customer success.


Shankar Kalyana Joins @CloudEXPO NY Faculty | @IBMcloud @SKalster #DigitalTransformation

Cloud-enabled transformation has evolved from cost saving measure to business innovation strategy — one that combines the cloud with cognitive capabilities to drive market disruption. Learn how you can achieve the insight and agility you need to gain a competitive advantage. Industry-acclaimed CTO and cloud expert, Shankar Kalyana presents. Only the most exceptional IBMers are appointed with the rare distinction of IBM Fellow, the highest technical honor in the company. Shankar has also received the prestigious Outstanding Technical Achievement Award three times – an accomplishment befitting only the most innovative thinkers. Shankar Kalyana is among the most respected strategists in the global technology industry. As CTO, with over 32 years of IT experience, Mr. Kalyana has architected, designed, developed, and implemented custom and packaged software solutions across a vast spectrum of environments and platforms. His current area of expertise includes hybrid, multi-cloud as-a-service strategies that drive digital and cognitive enterprises to operational excellence.


Compliance and Security | @CloudEXPO #Cybersecurity #GDPR #ArtificialIntelligence

In 2018, the shifting emphasis to IoT, Artificial Intelligence (AI), virtual reality (VR) and automation seems to overshadow cloud; yet, I believe it is just the opposite. A recently published industry survey shows that by 2020, the use of public cloud will grow dramatically. Business goals related to actively adopting AI, IoT and machine learning strategies are prompting IT teams to consider outsourced cloud and cloud experts to move faster than competitors. The formats and pilots incorporating these technologies can be seen across multiple markets and segments including government, retail, and industrial bases. The use of AI, VR, and IoT is also driving the technology, compliance and cybersecurity markets necessary to support these innovations. For example, nowadays there are a number of automotive, entertainment and digital marketing companies with dedicated cyber teams.


VMware Cloud on AWS now available in Europe – with more regions promised

The joint cloud offering from VMware and Amazon Web Services (AWS) is now available in Europe, the former has announced, with London first on the list and Frankfurt to join in due course.

VMware Cloud on AWS, which offers benefits to joint customers including workload portability between private and public clouds, had previously been available in AWS’ US West and US East regions.

The service was initially made available in August, with AWS chief executive Andy Jassy joining VMware counterpart Pat Gelsinger on stage at VMworld. Among the earliest customers using the joint offering were Moody’s, Symantec, and Western Digital. Joining them, according to VMware’s press materials, is Brink’s, a secure logistics provider, which said the flexibility of the new product is a ‘key driver in Brink’s technology and business transformation’.

“Bringing together the best of VMware and AWS, VMware Cloud on AWS provides customers an operationally consistent and familiar way to run, manage and secure applications in a hybrid cloud with access to a broad range of innovative and comprehensive AWS services and robust disaster protection,” wrote Sai Gopalan, VMware senior product marketing manager in a blog post confirming the news.

“This release initiates the global expansion of the service by making it available to our European and multi-national customers.”

Gopalan added there were future plans to move to additional European and Asia Pacific regions.