All posts by Connor Jones

NASCAR revs up its video business with AWS


Connor Jones

5 Jun, 2019

The National Association for Stock Car Auto Racing (NASCAR) has partnered with AWS to use the cloud giant’s artificial intelligence and machine learning tools to automate the categorisation of 70 years’ worth of video.

In the run-up to the airing of its online series ‘This Moment in NASCAR History’, the sport that packs deafening stadiums has 18 petabytes of video to migrate to an AWS archive, where the processing will take place.

“Speed and efficiency are key in racing and business which is why we chose AWS – the cloud with unmatched performance, the most comprehensive set of services, and the fastest pace of innovation – to accelerate our migration to the cloud,” said Craig Neeb, executive vice president of innovation and development, NASCAR.

“Leveraging AWS to power our new video series gives our highly engaged fans a historical look at our sport while providing a sneak peek at the initial results of this exciting collaboration,” he added.

Using Amazon Rekognition, the platform’s AI-driven image and video analysis tool, NASCAR hopes to automate the tagging of video metadata for its huge catalogue of multimedia to save time searching for specific clips.

Metadata is attached to stored multimedia files to make them easier to find in a database. For example, the metadata attached to a given video might include the race date, the competition, the drivers involved, the location and other information that differentiates it from other clips.

Without that tagging, making a series that joins together race clips from across the years would mean manually searching through petabytes of video, which would take a very long time.
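For a sense of how that automation works in practice, the sketch below uses Amazon Rekognition’s asynchronous video label detection via boto3. The bucket and file names are hypothetical examples, not NASCAR’s actual setup.

```python
import time

import boto3

rekognition = boto3.client("rekognition")

# Start an asynchronous label-detection job on an archived clip
# (the bucket and key are hypothetical examples).
job = rekognition.start_label_detection(
    Video={"S3Object": {"Bucket": "nascar-archive", "Name": "races/1987/winston-500.mp4"}}
)

# Poll until the job finishes, then collect the detected labels and
# their timestamps to store as searchable metadata tags.
while True:
    result = rekognition.get_label_detection(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(15)

for detection in result.get("Labels", []):
    print(detection["Timestamp"], detection["Label"]["Name"])
```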

“By using AWS’s services, NASCAR expects to save thousands of hours of manual search time each year, and will be able to easily surface flashbacks like Dale Earnhardt Sr.’s 1987 ‘Pass in the Grass’ or Denny Hamlin’s 2016 Daytona 500 photo finish, and quickly deliver these to fans via video clips on NASCAR.com and social media channels,” read an AWS statement.

NASCAR also plans to use Amazon SageMaker to train deep learning models against its footage spanning decades to enhance the metadata tagging and video analytics capabilities.

The sport will also use Amazon Transcribe, an automatic speech recognition service, to caption and timestamp every word of speech in the archived videos, making them easier still to search.
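Purely as an illustration, a transcription job of that sort might be started like this with boto3; the job name and S3 URI are made up for the example.

```python
import boto3

transcribe = boto3.client("transcribe")

# Kick off an asynchronous transcription job on an archived broadcast
# (the job name and media URI are hypothetical examples).
transcribe.start_transcription_job(
    TranscriptionJobName="daytona-500-2016",
    Media={"MediaFileUri": "s3://nascar-archive/races/2016/daytona-500.mp4"},
    MediaFormat="mp4",
    LanguageCode="en-US",
)

# The finished job writes a transcript JSON in which every recognised
# word carries start and end timestamps, ready to index for search.
job = transcribe.get_transcription_job(TranscriptionJobName="daytona-500-2016")
print(job["TranscriptionJob"]["TranscriptionJobStatus"])
```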

“AWS’s unmatched portfolio of cloud services gives NASCAR the most flexible and powerful tools to bring new elements of the sport to live broadcasts of races,” said Mike Clayville, vice president, worldwide commercial sales at AWS.

BT partners with Juniper on unified cloud network platform


Connor Jones

3 Jun, 2019

BT has partnered with Juniper Networks, which will supply the core infrastructure underpinning the rollout of its upcoming unified cloud networking platform.

The platform will unify BT’s networks, including 5G, Wi-Fi and fixed-line, into one virtualised service, enabling more efficient infrastructure management and deployment.

The new unified platform will supposedly allow BT to “create new and exciting converged services bringing mobile, Wi-Fi, and fixed network services together”.

The platform’s infrastructure will be built to a common framework, allowing it to be shared across BT’s offices both nationally and globally.

The platform will be used across a range of BT’s arms, including voice, mobile core, radio/access, ISP, TV and IT services; deploying it company-wide will cut costs and streamline operations.

“This move to a single cloud-driven network infrastructure will enable BT to offer a wider range of services, faster and more efficiently to customers in the UK and around the world,” said Neil McRae, chief architect, BT. “We chose Juniper to be our trusted partner to underpin this Network Cloud infrastructure based on the ability to deliver a proven solution immediately, so we can hit the ground running.”

“Being able to integrate seamlessly with other partners and solutions and aligning with our roadmap to an automated and programmable network is also important,” he added.

We’re told the project will facilitate new applications and workloads for the telecoms giant and evolve existing ones, including converged fixed and mobile services and faster time-to-market for internet access delivery.

“By leveraging the ‘beach-front property’ it has in central offices around the globe, BT can optimise the business value that 5G’s bandwidth and connectivity brings,” said Bikash Koley, chief technology officer, Juniper Networks.

“The move to an integrated telco cloud platform brings always-on reliability, along with enhanced automation capabilities, to help improve business continuity and increase time-to-market while doing so in a cost-effective manner,” he added.

BT has undergone a change in leadership this year and faces challenges in almost all areas of its business, according to its annual financial overview.

EE’s business has been carrying the telco; it’s the only arm of the company posting profits in an “unfavourable telecoms market”. BT’s revenue slip for the year has been attributed to the decline in traditional landline calls amid a seemingly unrelenting shift to voice over IP.

In order to capitalise on new business areas such as IoT, cloud and SD-WAN, BT admits greater investment is needed. This will most likely hinder its short-term revenue targets, but could pay off in the long term.

“Our aim is to deliver the best converged network and be the leader in fixed ultrafast and mobile 5G networks,” said BT chief executive Philip Jansen. “We are increasingly confident in the environment for investment in the UK.”

EE launched its 5G network last week, becoming the first telecoms company in the UK to do so. It’s available in six major cities and speeds of 1Gbps are promised “for some users”.

Google Cloud scores FA digital transformation partnership


Connor Jones

31 May, 2019

The English Football Association (The FA) has partnered with Google Cloud to digitally transform its St. George’s Park national training centre used by 28 national teams.

Google Cloud is now the FA’s official cloud and data analytics partner and, over the multi-year partnership, aims to put G Suite at the heart of everything, shifting coaches across all the teams from siloed working to a more collaborative approach.

“The first step in our transformation at St. George’s Park was to unify the way our coaches train and develop our 28 national teams to increase productivity,” says Craig Donald, CIO at the FA. “We needed the ability to collaborate and share across the coaches and team managers. G Suite allowed us to do that and was the first part of our Google Cloud partnership.”

The FA has terabytes of data collected from tracking player activity stored in Google Cloud, and its analysis team will use Google Cloud Platform tools, such as smart analytics, machine learning, AI and BigQuery, to unearth new insights from that data.

The organisation’s next step will be to build out its Player Profile System (PPS), a proprietary tool built on the platform, to measure performance, fitness, training and form of players at all levels.

The goal is to automate near real-time data analysis, giving pitchside coaches a better indication of how players are performing in training, which could influence decisions such as player selection for matches.
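As a purely illustrative sketch of the kind of query that could surface such signals, the snippet below uses the google-cloud-bigquery client; the dataset, table and column names are hypothetical, not the FA’s actual schema.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical query over tracked training data: average high-speed
# running distance per player across the last four weeks of sessions.
query = """
    SELECT player_id, AVG(high_speed_distance_m) AS avg_hsd
    FROM `fa_performance.training_sessions`
    WHERE session_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
    GROUP BY player_id
    ORDER BY avg_hsd DESC
"""

for row in client.query(query).result():
    print(row.player_id, round(row.avg_hsd, 1))
```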

The PPS will be further enhanced by Google Cloud smart analytics, data management systems and machine learning capabilities to analyse even more player data signals.

“Smart analytics and data management play a critical part in our PPS,” said Nick Sewell, the FA’s head of application development. “Everything we do at St George’s Park for this workload is built on Google Cloud.”

Over the multi-year partnership The FA aims to tackle three key areas:

  • Success: Preparing both men’s and women’s senior teams for the next World Cups.
  • Diversity: Doubling female participation in the game.
  • Inclusivity: Making football more inclusive and open to all.

“We believe technology is a key area of potential competitive advantage for our 28 teams and everything we do at St George’s Park,” said Dave Reddin, The FA’s head of team strategy and performance.

“We have progressively built a systematic approach to developing winning England teams and through the support of Google Cloud technology we wish to accelerate our ability to translate insight and learning into performance improvements.”

AWS launches Textract tool capable of reading millions of files in a few hours


Connor Jones

30 May, 2019

AWS has said that its Textract tool, designed to extract text and data from documents, is now generally available to all customers.

The tool, which is a machine learning-driven feature of its cloud platform, lets customers autonomously extract data from documents and accurately convert it into a usable format, such as exporting contractual data into database forms.

The fully-managed tool requires no machine learning knowledge to use and works with virtually any document. Industries that handle specialised file types, such as financial services, insurance and healthcare, will also be able to plug these into the tool.

Textract aims to expedite the laborious data entry process, which is also often inaccurate when done with third-party software. Amazon claims it can accurately analyse millions of documents in “just a few hours”.

“Many companies extract text and data from files such as contracts, expense reports, mortgage guarantees, fund prospectuses, tax documents, hospital claims, and patient forms through manual data entry or simple OCR software,” the company said.

“This is a time-consuming and often inaccurate process that produces an output requiring extensive post-processing before it can be put in a format that is usable by other applications,” it added.

Textract takes scanned files stored in Amazon S3 buckets, reads them and returns the data as JSON annotated with the page number, section, form labels, and data types.
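For a single scanned form, a minimal boto3 sketch might look like the following; the bucket and document names are hypothetical, and large batches would use the asynchronous start_document_analysis API instead.

```python
import boto3

textract = boto3.client("textract")

# Analyse a scanned form stored in S3, requesting form key-value data
# (the bucket and document names are hypothetical examples).
response = textract.analyze_document(
    Document={"S3Object": {"Bucket": "claims-archive", "Name": "forms/claim-0001.png"}},
    FeatureTypes=["FORMS"],
)

# The response is a list of JSON 'blocks'; each detected line of text
# carries its page number alongside the raw text.
for block in response["Blocks"]:
    if block["BlockType"] == "LINE":
        print(block.get("Page", 1), block["Text"])
```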

PwC is already using the tool for its pharmaceutical clients, an industry whose processes commonly involve Food and Drug Administration (FDA) forms that would otherwise require hours to complete, according to Siddhartha Bhattacharya, director lead, healthcare AI at PwC.

“Previously, people would manually review, edit, and process these forms, each one taking hours,” he said. “Amazon Textract has proven to be the most efficient and accurate OCR solution available for these forms, extracting all of the relevant information for review and processing, and reducing time spent from hours down to minutes.”

The Met Office is another organisation that plans to implement Textract, making use of old weather records.

“We hope to use Amazon Textract to digitise millions of historical weather observations from document archives,” said Philip Brohan, climate scientist at the Met Office. “Making these observations available to science will improve our understanding of climate variability and change.”

Uncrackable passwords introduced to Microsoft Azure


Connor Jones

17 May, 2019

Microsoft Azure has increased the character limit for passwords in Azure Active Directory from 16 to a massive 256 characters, making brute force hack attempts much more difficult.

It has been a hot topic for Azure customers, who have been reminding Microsoft of its seemingly unsatisfactorily small password limit.

“Many of you have been reminding us that we still have a 16-character password limit for accounts created in Azure AD,” said Microsoft’s Alex Simons. “While our on-premises Windows AD allows longer passwords and passphrases, we previously didn’t have support for this for cloud user accounts in Azure AD.”

“Today, I am pleased to announce that we have changed this limit, allowing you to set a password with up to 256 characters, including spaces,” he added.

Passwords must still meet three out of the four essential criteria as set out in Microsoft’s policy documentation.

  • Lowercase characters
  • Uppercase characters
  • Numbers (0-9)
  • Symbols (@ # $ % ^ & * - _ ! + = [ ] { } | \ : ' , . ? / ` ~ " ( ) ;)

While account and password security are of paramount importance to IT users, Microsoft still won’t force you to create an iron-clad password, keeping the minimum allowance at a mere eight characters.

The difference between an eight-character password and a 256-character one is huge, according to howsecureismypassword.net, a website that estimates how long it would take to brute-force a password.

We took three different passwords of varying lengths to see how long it would take to crack each of them. First up was ‘Jazzily1’, an eight-character password meeting the minimum requirement and adhering to three of Azure’s four essential criteria. This would take just one month to crack, according to the website.

A middle ground 137-character password would take 29,511,750,324 octogintillion years (quite a lot) to crack, and the 253-character password we used at the upper limit of Azure’s allowance would take ‘forever’.
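The arithmetic behind those estimates is simple exponentiation: the search space is the character-set size raised to the password length, divided by a guess rate. A back-of-the-envelope sketch, assuming roughly 80 usable characters and a generous ten billion guesses per second:

```python
import math

# Rough brute-force cost: search space = charset_size ** length.
# 80 usable characters and ten billion guesses per second are
# illustrative assumptions, not measured figures.
CHARSET = 80
GUESSES_PER_SECOND = 1e10
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for length in (8, 137, 256):
    # Work in log space, since 80 ** 256 overflows any float.
    log10_years = (length * math.log10(CHARSET)
                   - math.log10(GUESSES_PER_SECOND)
                   - math.log10(SECONDS_PER_YEAR))
    print(f"{length:>3} chars: ~10^{log10_years:.0f} years to exhaust")
```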

Another way to look at hyper-secure passwords is Professor Bill Buchanan’s take on 128-bit AES keys: he has said that cracking a single key would take the energy required to boil every one of Earth’s oceans 16,384 times.

In related news, Microsoft recently gained FIDO certification for Windows Hello, its Windows 10 authenticator, in the upcoming May 2019 update, seemingly a first step towards a passwordless Windows.

Windows Hello will use facial recognition, fingerprint scanning and a secure PIN on more than 800 million Windows 10 devices starting next month – a service that is cross-compatible with other Microsoft services such as Office 365, OneDrive and more.

“Our work with FIDO Alliance, W3C and contributions to FIDO2 standards have been a critical piece of Microsoft’s commitment to a world without passwords,” said Yogesh Mehta, principal group program manager at Microsoft.

“No one likes passwords (except hackers),” he added. “People don’t like passwords because we have to remember them. As a result, we often create passwords that are easy to guess – which makes them the first target for hackers trying to access your computer or network at work.”

In the same May update, Microsoft will also stop enforcing its password expiration policies which prompt users to change their passwords every few months.

The company’s logic is that if users are made to change passwords frequently, they will be more inclined to make only small changes or even start writing passwords down – a big security no-no.

Security flaw found in Google’s “most secure” account authenticator


Connor Jones

16 May, 2019

A misconfigured Bluetooth pairing protocol in Google’s Titan security keys could allow attackers to bypass encryption and hijack user accounts, the firm has revealed.

Google has said it will start offering replacements of what it once called the “strongest, most phishing resistant method of two-step verification (2SV) on the market today”, following the discovery of the flaw which exposes account information to those within Bluetooth range.

The company has assured customers that the keys, the technology for which was first launched in 2017, would still do their job and provide multi-factor authentication built to a FIDO-standard that’s stronger than regular 2SV, but that the $50 cost would be waived if they wanted a replacement unit.

“This bug affects Bluetooth pairing only, so non-Bluetooth security keys are not affected,” said Christiaan Brand, product manager, Google Cloud. “Current users of Bluetooth Titan Security Keys should continue to use their existing keys while waiting for a replacement, since security keys provide the strongest protection against phishing.”

When attempting an account sign-in, a Titan user is required to press a button on the Bluetooth key to authenticate the log-in attempt. It was discovered that immediately after this button press, attackers have a narrow window to connect their own device to the security key, which could result in the attacker logging into the user’s account from their device, provided they already had said user’s email and password.

Titan keys work by acting as another authentication step and are linked with a user’s device, such as a phone or laptop, via a Bluetooth connection. A flaw in this connection means that an attacker could trick the phone or laptop into thinking the attacker’s own device is the security key. If this is achieved, the attacker could bypass the authentication process and start to make changes to the user’s device by mimicking an external keyboard and mouse.

It could be argued that a situation where an attacker has your account credentials, knows you use a Titan key and is within 30 feet of your location is unlikely to occur, but the flaw is still serious enough to have prompted Google into replacing all affected keys. Others are less sceptical of the threat, though.

“The fact you must be within 30 feet of the security key isn’t an issue, especially when you consider how fast compiled and scripted software can run,” said Mark Miller, director of enterprise security support at Venafi. “In addition, lots of people conduct business in public places like coffee shops and airports, so connecting a dongle to a device isn’t that farfetched.”

“From a technology perspective, these keys are amazing; they make security a lot easier to consume”, he added. “However, there is no such thing as perfect technology, so I’m glad Google is taking the initiative and recalling these keys.”

Most recently, Google announced that a new form of its Titan Security keys would be made available to all Android phones running Android 7.0 or later, with its line of Pixel phones getting a slightly more secure version too.

The phone as a security key (PaaSK) standard was announced at Google Cloud Next 2019 and, instead of requiring an external Titan Security Key to hand, all that’s needed is to unlock your Google account-linked Android device and press a button to approve the log-in in real time.

The Titan key was originally introduced to combat phishing attempts that exploited vulnerable 2SV methods such as confirmation codes delivered by texts – a method of communication that can be hijacked with relative ease.

In other Google news, a privacy flaw was found in Google Pay’s settings on Wednesday. Optional settings governing a user’s ability to share their creditworthiness, personal information or Google Pay account information were hidden behind a special URL rather than being accessible directly from the Google Pay account settings page.

Google has since attributed the error to a fault left over from an update and has now fixed it, so the three privacy settings appear as normal.

NHS Digital cuts costs by using VMware cloud on AWS


Connor Jones

10 May, 2019

NHS Digital is migrating VMware vSphere workloads to the AWS cloud to reduce costs and improve the operational efficiency of digitally-enabled healthcare.

The move is part of NHS Digital’s long-term intention to migrate the majority of its services from its current on-premises infrastructure to its AWS and Azure multi-cloud environment, building on its cloud-first approach to delivering healthcare.

NHS Digital has worked with VMware Cloud on AWS to create a new commercial model, which provides better economies of scale and should streamline the cloud migrations NHS Digital still needs to make in the future.

“The uptake of digital services in the NHS is accelerating so the NHS and social care’s IT backbone must be up to the job,” said Rob Shaw, deputy chief executive at NHS Digital. “With VMware Cloud on AWS, we’re providing a resilient platform to support digitally-enabled care today and in the future.”

“We now have a commercial framework in place to enable NHS and public-sector organizations to confidently use the cloud,” he added. “Together we can benefit from the economies of scale and cost efficiencies of this model.”

NHS Digital is the IT arm of the UK’s health service and is currently undertaking one of the most complex digital transformation projects in recent times.

The organisation is notorious for operating on legacy infrastructure and outdated technology, which is why Health Secretary Matt Hancock announced a £487 million fund dedicated to digital projects in his inaugural speech last July.

The organisation coordinates and maintains the mission-critical IT infrastructure that underpins the NHS and social care, facilitating the needs of 1.4 million NHS staff and 1.5 million social care staff.

“We choose the right cloud for each workload, and VMware Cloud on AWS is the absolute best option for running our vSphere-based environments in the cloud,” said Michael Flintoft, associate director of platforms and infrastructure at NHS Digital. “It’s easy to move solutions across the different environments and it’s easy to run and manage.”

“We built a virtual data centre in the AWS cloud in less than three hours,” he added. “That speed and agility is just what we need to harness innovation and make the best digital services available for the NHS and social care sector.”

Its cloud-first initiative forms just part of the organisation’s overall digital transformation. AI and robotics are also expected to be harnessed in the near future, and the current thinking is that within 20 years, 90% of jobs in the NHS will require digital skills.

NHS Digital has been entangled in its fair share of controversy over the past few years. It was slammed heavily for agreeing to share patient health records with Google-owned DeepMind and its data sharing practices again came under fire during the care.data fiasco.

Microsoft announces Azure updates for IoT, databases and more


Connor Jones

7 May, 2019

Microsoft has kicked off its annual developer conference Build in Seattle with CEO Satya Nadella using his pre-event keynote to announce a slew of updates to its Azure services, including IoT, databases, analytics and storage.

Internet of Things (IoT)

The first update comes to Azure IoT Edge, Microsoft’s cloud-based IoT monitoring and deployment platform used by businesses to coordinate their IoT devices. Nearly two years after its release, IoT Edge now supports Kubernetes integration, in preview.

Microsoft’s fully managed IoT SaaS, Azure IoT Central, will now sport enhanced features around the rules it follows to process data and send it to analytics services, as well as benefiting from cosmetic upgrades to dashboards and data visualisation.

Databases

Microsoft has released an update to Azure SQL Database serverless which will appeal to those who would rather not be billed for unused compute. The update introduces a new compute tier for databases with intermittent usage, with the goal of improving the service’s cost-efficiency: compute scales according to workload and pauses when the database is inactive. It’s now available in preview for single databases.
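As a rough sketch of what provisioning such a database might look like with the Azure SDK for Python, see below; all resource names are placeholders, and method and model names vary between azure-mgmt-sql versions, so treat this as an approximation rather than a definitive recipe.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import Database, Sku

# All resource names here are placeholders; treat this as a sketch.
client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.databases.begin_create_or_update(
    "my-resource-group",
    "my-sql-server",
    "intermittent-db",
    Database(
        location="westeurope",
        # Serverless general-purpose tier: compute scales with load...
        sku=Sku(name="GP_S_Gen5", tier="GeneralPurpose", family="Gen5", capacity=2),
        # ...and pauses after an hour of inactivity, halting compute billing.
        auto_pause_delay=60,
        min_capacity=0.5,
    ),
)
print(poller.result().status)
```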

Developers will also now be able to scale compute, storage, and memory resources in a new Hyperscale service tier in Azure Database for PostgreSQL. It’s now in public preview as Hyperscale (Citus) for PostgreSQL and in general availability as Azure SQL Database Hyperscale.

Storage

Azure Managed Disks will now support direct uploads at sizes from 32GB to 32TiB – an update now in public preview in all regions. Azure Managed Disks are virtual hard disks that act like a physical disk in an on-premises server.

Analytics

Azure Data Factory’s two new components announced earlier this year will enter preview this week. Mapping Data Flows allows cloud users to transform data at scale by creating data transformation jobs without requiring any knowledge of coding. Also on the codeless theme, Wrangling Data Flows allows users to ‘explore and wrangle data at scale’ – visualising data en masse and making it easier to understand. The components are in public preview and private preview respectively.

Azure Data Warehouse (ADW) also gets support for semi-structured data, available in preview. ADW can now analyse both structured and semi-structured data, such as JSON, directly inside the service, yielding faster insights.



In other news

Microsoft treated us to many of the announcements prior to the event’s official opening, with AI and blockchain services announced last week as appetisers for this week’s event.

Azure Cognitive Services got smarter with Decision and Personaliser, two new components designed to provide users with specific recommendations for better decision-making.

AI has been added to Azure Search, which will interact with Cognitive Services to unearth better insights from structured and unstructured content. Machine learning services have also been tweaked to enable codeless model creation and a new deployment service with drag and drop capabilities.

Slack’s new integrations signal an end to its war on email


Connor Jones

25 Apr, 2019

Slack has added new features to its collaboration platform that aim to embrace the power of email, the very tool it set out to kill off over five years ago.

Rather than having the two services operate in isolation from one another, those in your organisation who aren’t on Slack, or who have just started and haven’t yet received credentials, can now still benefit from its collaboration features.

Directly addressing an individual in Slack via its ‘@-mention’ feature can now be used with email, with notifications appearing in the employee’s inbox if they are not on the platform or logged in.

Replies sent from an employee’s inbox will beam straight back to the relevant channel, as if the interaction were taking place on a single platform.

Admins will need to tweak their company’s account to allow outside users to communicate with those inside the organisation in this way, but it’s a step closer to being a more unified collaboration tool.

This builds on Slack’s existing Outlook and Gmail functionality, which allows users to forward emails into a channel where members can view and discuss the content and plan responses from inside Slack.

Another interesting announcement, made at the company’s Frontiers conference in San Francisco, relates to its ‘Workflow Builder’ tool, which will enable any user within Slack to build apps for routine functions without coding knowledge.

The tool, which is launching later this year, will be capable of automating functions, such as completing and filing a benefits request form to HR or sending messages to help new starters find the right channels to join, saving other workers from sacrificing time to give a platform tutorial.

If this sounds familiar, you’d be right: Slack announced two new toolkits back in February that would also allow non-coders to build apps within Slack. Workflow Builder, however, appears to be geared towards routine automation rather than the platform’s more technical backend functions.

Slack’s integration with Outlook and Google Calendar is also becoming stronger: any status you set within your calendar, such as being out of the office for an event or in a meeting, will be automatically synced to Slack.

As many business meetings tend to be virtual nowadays, integration with calendars will allow other users to see who your meeting is with and provide joining options directly within Slack, thanks to partnerships with Hangouts, Zoom and Webex.

There is also a change coming to Slack’s search function which, although fast and expansive, isn’t always the most intuitive or organised. Slack aims to address this by adding new features that make it quicker to view unread messages, allow faster navigation between channels to find the relevant person, and improve sifting through channel archives. These features will be available in the coming weeks.

Slack’s five-year slog of a battle with email has proved fruitless; email still exists and seems to be here to stay, with Google recently investing in it more heavily than ever, even as Slack’s adoption spreads on the virality of its freemium model.

View from the airport: Google Cloud Next 2019


Connor Jones

12 Apr, 2019

Google made a raft of announcements at this year’s Next event in San Francisco this week, the most noteworthy of which was Anthos. Formerly Cloud Services Platform, it becomes the company’s first multi-cloud platform, one that will appeal heavily to its enterprise customers and marks a shift away from the developer focus of recent years towards business leaders.

It’s a step in the right direction for Google, as it’s the C-suite it needs to target to accelerate the platform’s adoption, and most will probably agree that the focus has been on developers for too long. With Anthos, Google has set its sights on the future, heeding the advice of analysts who say that 88% of businesses will undergo a multi-cloud transformation in the next few years. Google’s new multi-cloud platform, “simply put, is the future of cloud”, according to Urs Hölzle, Google’s senior vice president of technical infrastructure.

But if you look past the marketing spiel, you stop seeing the innovative new platform and notice that, by releasing Anthos as its flagship product, Google has essentially taken a step down and conceded that it’s the second cloud provider in the market. If you can’t beat them, use them to help you scale, perhaps?

“They’ve taken on a very interesting approach,” said Sid Nag, research director at Gartner. “They want to be the second cloud which is kind of interesting because they don’t want to compete with the 40-50 pound gorilla [AWS] so they’re basically saying it’s a multi-cloud world and they’re pushing the multi-cloud narrative so they can come in as the second cloud… and then land and expand so I think that’s a pretty smart strategy”.

One area where Google seems to be leading the charge is security. Some 30 new products and services were announced at this year’s event, taking the total to more than 100 in the past year alone.

It’s clear the company is taking security seriously – as it should – and is keen to show customers that everything it offers has security baked in from the start. The Cloud Security Command Centre looks like a nice piece of kit for any cloud platform admin to use, with the added benefit of Google’s industry-leading machine learning (ML) capabilities. This, I’m sure, will prove an attractive selling point, as it helps users detect malicious activity ahead of an attack.

Building on the AI theme, the automated functions of all the new features – from AutoML advancements to intelligent event threat detection – continue to provide Google with a serious USP to draw in customers. Google has made it easier than ever to run an advanced cloud environment in a secure and intuitive way. It wants to leave the coding and app creation to the developers and let the admins do their job: driving the business forward.

For example, using ML-driven Connected Sheets, businesses could oversee their distribution channels and see where operations were being held up. It would be easy to detect that warehouse stock wasn’t leaving Brazil on time because major road works were taking place, so the business could simply re-route the drivers’ navigation systems to get things back on track.

The challenge Google will face in the next year is one of scale. The company has only just started to win over the hearts of business leaders; Anthos garnered the biggest roar I heard all week, but Google needs to build on that and prove to its customers that it’s really serious about enterprise.

Google also faces the challenge of partnering with smaller software vendors. Over the week, Google announced partnerships with the biggest bulls in the pen: Cisco, Salesforce, HSBC – I could go on. But what it must do now, given it has committed to this containerised and stateless approach with Anthos, is show that it can work with the smaller ISVs.

“The interesting part will be seeing how it can start working with smaller ISVs and convert those apps into Google containers, as that will be the challenge – that’s how it will grow its business,” said Nag. Smaller apps are the ones that will be modernised in the future, so monetising them will be key to Google’s success years down the line.

It seems as though new CEO Thomas Kurian is continuing in the hugely successful footsteps of his predecessor Diane Greene, the woman who drove the company to be enterprise-ready in just two years instead of the forecast ten. Conceding second place to Amazon might be a good move for the company, filling the gap in the market that enterprise customers so desperately needed filled.

Hybrid cloud is something many organisations wrestle with, and they finally have an answer to the headache they’ve faced for years. We’re excited to see how the company scales over the next year – the first under Kurian’s reign – and whether it can tackle the challenges it faces.