All posts by James

Google Cloud and AWS launch new services on machine learning and containers respectively

Another day, another product launch in the land of the hyperscalers – and for Google Cloud and Amazon Web Services (AWS), their new services are focusing on machine learning (ML) and containers respectively.

Google’s launch of Cloud AI Platform Pipelines, in beta, aims to provide a way to deploy ‘robust, repeatable machine learning pipelines… and delivers an enterprise-ready, easy to install, secure execution environment for ML workflows.’

This can be seen, for Google Cloud’s customers, as a potential maturation of their machine learning initiatives. “When you’re just prototyping a machine learning model in a notebook, it can seem fairly straightforward,” the company notes, in a blog post authored by product manager Anusha Ramesh and developer advocate Amy Unruh. “But when you need to start paying attention to the other pieces required to make a ML workflow sustainable and scalable, things become more complex.

“A machine learning workflow can involve many steps with dependencies on each other, from data preparation and analysis, to training, to evaluation, to deployment, and more,” they added. “It’s hard to compose and track these processes in an ad-hoc manner – for example, in a set of notebooks or scripts – and things like auditing and reproducibility become increasingly problematic.”
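For a sense of what composing those steps explicitly looks like, here is a minimal sketch using the open source Kubeflow Pipelines SDK (kfp), on which Cloud AI Platform Pipelines is built; the container images and arguments are placeholders for illustration, not anything taken from Google’s announcement.

```python
# Minimal sketch of a multi-step ML pipeline with explicit dependencies,
# using the open source Kubeflow Pipelines SDK (kfp v1-style API). All image
# names and arguments are placeholders for illustration only.
import kfp
from kfp import dsl


@dsl.pipeline(
    name="example-training-pipeline",
    description="Prepare data, train a model, then evaluate it.",
)
def training_pipeline(data_path: str = "gs://example-bucket/raw-data"):
    # Each step runs in its own container; ordering is declared explicitly.
    prepare = dsl.ContainerOp(
        name="prepare-data",
        image="gcr.io/example-project/prepare:latest",   # placeholder image
        arguments=["--input", data_path],
    )

    train = dsl.ContainerOp(
        name="train-model",
        image="gcr.io/example-project/train:latest",     # placeholder image
        arguments=["--data", data_path],
    )
    train.after(prepare)      # train only after data preparation has finished

    evaluate = dsl.ContainerOp(
        name="evaluate-model",
        image="gcr.io/example-project/evaluate:latest",  # placeholder image
    )
    evaluate.after(train)


if __name__ == "__main__":
    # Compile to a package that can be uploaded to a Pipelines installation,
    # where runs become repeatable and auditable rather than ad hoc.
    kfp.compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")
```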

The solution will integrate seamlessly with Google Cloud’s various managed services, such as BigQuery, stream and batch processing service Dataflow, and serverless platform Cloud Functions, the company promises. The move comes at an interesting time given Google’s ranking in Gartner’s most recent Magic Quadrant for cloud AI developer services: placed as a leader alongside IBM, Microsoft and Amazon Web Services (AWS), but just behind the latter two, with AWS on top.

AWS, meanwhile, has launched Bottlerocket, an open source operating system designed and optimised specifically for hosting containers. The company notes the importance of containers to package and scale applications for its customers, with chief evangelist Jeff Barr noting in a blog post that more than four in five cloud-based containers are running on Amazon’s cloud.

Bottlerocket aims to solve some of the challenges around container rollouts, using an image-based model instead of a package update system to enable quick rollbacks and potentially avoid breakages. As with other aspects of cloud security, surveys have shown that container security snafus are frequently caused by human error; a recent StackRox report described misconfigured containers as an ‘alarmingly common’ root cause.
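The rollback guarantee comes from treating the operating system as a single image rather than a pile of packages. The sketch below is a conceptual illustration of that two-slot, image-based update pattern in general, not Bottlerocket’s actual tooling or API, written in Python for brevity.

```python
# Conceptual sketch of an image-based, two-slot update model (not Bottlerocket's
# actual tooling): the new OS image is written whole to the inactive slot, and a
# failed health check simply flips the active slot back, with no per-package
# state to unwind.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Host:
    slots: Dict[str, str]   # slot name -> OS image version
    active: str             # slot the host currently boots from


def apply_update(host: Host, new_image: str, healthy: Callable[[Host], bool]) -> Host:
    previous = host.active
    inactive = "b" if previous == "a" else "a"
    host.slots[inactive] = new_image   # write the complete image, nothing incremental
    host.active = inactive             # boot into the new image
    if not healthy(host):
        host.active = previous         # instant rollback: point back at the old slot
    return host


host = Host(slots={"a": "v1.0.0", "b": "v1.0.0"}, active="a")
apply_update(host, "v1.1.0", healthy=lambda h: True)
print(host.active, host.slots)   # running v1.1.0 from slot "b"; slot "a" kept for rollback
```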

Barr noted that security – in this case, the risk of extra installed packages increasing the attack surface – was one problem Bottlerocket aims to remediate, alongside cumbersome updates, growing overheads, and inconsistent configurations.

“Bottlerocket reflects much of what we have learned over the years,” wrote Barr. “It includes only the packages that are needed to make it a great container host, and integrates with existing container orchestrators.”

Cloud complexity and ‘terrifying’ IoT means organisations’ asset visibility is worsening – report

As organisations battle to close the gap on hackers who stay one step ahead, a new report from cybersecurity asset management provider Axonius argues that the complexity of cloud infrastructure means companies are ‘rapidly’ losing sight of their asset landscape.

The study, put together by Enterprise Strategy Group (ESG) and polling 200 North America-based IT and cybersecurity professionals, found that across respondents more than half (52%) of VMs now reside in the cloud, running in multiple environments.

The report describes cloud visibility as ‘hazy at best’, with more than two thirds (69%) of those polled admitting they have a cloud visibility gap. Three quarters said they had experienced several serious cloud VM security incidents. Adding to the mix is a rise in container usage, with plenty of previous research warning of dire consequences if the growth is not adequately secured. Axonius describes container uptake as ‘mainstream’ today.

Internet of Things (IoT) projects are gaining steam, yet an even wider visibility gap remains – 77% of respondents report a disparity. The report describes this trend as ‘inevitable or terrifying’; four in five (81%) say IoT devices will outnumber all other devices within three years, while more than half (58%) admit device diversity is their biggest management headache.

Bring your own device (BYOD) is still a sticking point for many companies, even if the hype and coverage have since died down. Almost half (49%) of organisations polled said they prohibit BYOD for work-related activities, while three in five (61%) of those with policies in place are worried about violations. “BYOD looks to be here to stay… even if security suspects that policies are being circumvented,” the report notes.

Part of the solution is also part of the problem. Organisations are using on average more than 100 separate security tools, making the already-complicated task of IT asset management even more fiendish. A new approach is needed, the report warns: IT asset inventories currently demand the involvement of multiple teams, and take more than two weeks of effort.

“When we speak with customers from the midmarket up to the Fortune 100, we hear the same challenges: teams are faced with too many assets, a patchwork of security tools, and maddeningly manual processes to understand what is there and whether those assets are secure,” said Dean Sysman, CEO and co-founder at Axonius. “This survey uncovers the depth and breadth of the asset management challenges we see today and what’s on the horizon.”

Lloyds Banking Group signs up to Google Cloud in five-year partnership

Google Cloud continues to secure the big-ticket clients, with the company announcing that Lloyds Banking Group is set to embark on its ‘cloud transformation’ journey with Google.

The bank will invest a total of £3 billion ($3.9bn) in a five-year deal which will see Lloyds deploy various Google Cloud services focused on the customer experience. Google Cloud will also ensure that Lloyds engineers are trained to ‘enhance disciplines… all in an effort to boost efficiency and offer innovative new services to the bank’s retail and commercial customers.’

“The size of our digital transformation is huge and Google Cloud’s capabilities will help drive this forward, increasing the pace of innovation, as well as bringing new services to our customers quickly and at scale,” said Zaka Mian, group transformation director at Lloyds in a statement. “This collaboration gives us a strategic advantage to continue as a leader in banking technology.”

Alongside retail and healthcare, financial services is one of the three primary customer target areas for Google. HSBC is arguably the best-known financial customer, with the company speaking at Google Next as far back as 2017. In September Srinivas Vaddadi, delivery head for data services engineering at HSBC, elaborated on the bank’s ongoing process of moving its legacy data warehouse onto BigQuery. Other Google Cloud financial services customers include PayPal, Ravelin, and Charles Schwab.

Recent customer wins include Major League Baseball, which is discontinuing its relationship with Amazon Web Services as a result, and Hedera Hashgraph.

“Banking customers today expect secure access to their funds, without downtime, and delivered through the modern experiences they receive in other aspects of their lives,” said Google Cloud CEO Thomas Kurian. “We are proud to work with such a storied institution as Lloyds, which helped to create – and continues to redefine – the next generation of financial services.”

Picture credit: Lloyds Branch Manchester Exterior, by Money Bright, used under CC BY 2.0

Gartner’s cloud AI developer services Magic Quadrant sees AWS edge out Microsoft and Google

The hyperscalers continue to lead where others – or at least those without the cavernous pockets of the biggest cloud players – fear to tread. Gartner has gazed into its crystal ball and ranked the main vendors for cloud AI developer services, and it’s a 1-2-3 for Amazon Web Services (AWS), Microsoft and Google.

In its latest Magic Quadrant, Gartner defines ‘cloud AI developer services’ as “cloud-hosted services/models that allow development teams to leverage AI models via APIs without requiring deep data science expertise…deliver[ing] services with capabilities in language, vision, and automated machine learning.” In other words, companies that can provide a one-stop shop for applications and models across the stack are likely to fare well here.
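To make the definition concrete, the sketch below shows what consuming such a service typically looks like, using Amazon Comprehend via the boto3 SDK as one example; the credentials, region and choice of service are assumptions of the sketch, not details from the Gartner report.

```python
# A minimal sketch of consuming a cloud AI developer service: a pre-trained
# language model exposed as an API call, with no model training or data
# science work on the caller's side. Assumes boto3 is installed and AWS
# credentials/region are configured in the environment.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

response = comprehend.detect_sentiment(
    Text="The migration to the new data pipeline went better than expected.",
    LanguageCode="en",
)

print(response["Sentiment"])        # e.g. "POSITIVE"
print(response["SentimentScore"])   # per-label confidence scores
```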

AWS nudged out Microsoft and Google in terms of ability to execute and completeness of vision, but as the ranking shows, the three leaders are tightly bunched. IBM also placed in the leaders’ ranking, with 10 vendors assessed in total.

At last count – or rather, what CEO Andy Jassy told re:Invent attendees in December – AWS had more than 175 services in total available to customers. Machine learning (ML), principally through the SageMaker portfolio, comprises more than one in nine of these, and was an area where AWS’ view ‘continued to evolve’, Jassy said.

This breadth of portfolio, according to the Gartner analysis, is both a gift and a curse. While AWS’ offerings cater to myriad industries and to customers at every stage of the ML journey, with or without in-house ML skills, Gartner noted the complexity ‘poses some challenges’ both for individual developers and enterprise application leaders.

Google, meanwhile, has previously touted itself as the cloud leader in machine learning, with a lot of initial advantage being built up through the leadership of Fei-Fei Li, who left in late 2018 to continue her work at Stanford University. Like AWS, many of its customer wins – Sainsbury’s being a good example in the hotly-contested retail market – cite ML front and centre as the key differentiator. As regular readers of this publication will be aware, Google Cloud has significantly increased its press output, with new customers, initiatives and partnerships appearing on an almost daily basis.

Gartner summed up Google’s offering by praising the depth of languages available and its AutoML and image recognition tools, while tying its drawbacks to a lesser standing in cloud infrastructure compared with AWS and Azure. Thomas Kurian’s leadership ‘attracted positive feedback… but the organisation is still undergoing substantial change, the full impact of which will not be apparent for some time’, Gartner wrote.

Microsoft, sitting between AWS and Google, received a similar analysis, having gone about its business relatively quietly. The company’s cloud AI developer services were ‘among the most comprehensive on the market and all highly competitive’, according to Gartner, but the analyst firm cautioned that its branding strategy was ‘confusing’ because it spans multiple business units, Azure, Cortana, and more.

The visionaries section at bottom right, where vision is praised but scale is not quite there yet, is often an interesting marker for future stars. Aible, a business intelligence (BI) platform provider which promises to automate the AutoML lifecycle, noted Gartner’s verdict that visionaries were ‘likely to excel’ in AutoML, a segment viewed as the most important for application leaders and development organisations.

Ultimately, however, Gartner’s cloud AI developer services Magic Quadrant has a somewhat similar feel to its cloud IaaS ranking – at least in the top right, anyway. The full list of vendors analysed was Aible, Amazon Web Services, Google, H2O.ai, IBM, Microsoft, Prevision.io, Salesforce, SAP, and Tencent.

Disclosure: CloudTech procured the Quadrant through the AWS landing page – you can do so too by visiting here (email required).

Major League Baseball makes Google Cloud official cloud partner as AWS strikes out

Major League Baseball (MLB) has selected Google Cloud as its official cloud partner, appearing to bring to an end the company’s long-standing relationship with Amazon Web Services (AWS).

In what both companies described as a ‘powerful multi-year collaboration’, MLB will migrate its cloud and on-premise systems to Google Cloud, as well as bringing Google in to power its Statcast tracking technology. Machine learning – which was cited when MLB extended its AWS deal in 2018 – is also a factor in this move, with the league also citing application management and video storage capabilities.

MLB will continue to use Google’s Ad Manager and Dynamic Ad Insertion to power its advertising arm for the third season running – a relationship the league noted as a factor in this change.

“MLB has enjoyed a strong partnership with Google based on Google Ad Manager’s live ad delivery with MLB.tv as well as YouTube’s strong fan engagement during exclusive live games,” said Jason Gaedtke, MLB chief technology officer in a statement. “We are excited to strengthen this partnership by consolidating MLB infrastructure on Google Cloud and incorporating Google’s world-class machine learning technology to provide personalised and immersive fan experiences.”

Two months into 2020, Google continues to be by far the noisiest of the main cloud providers in terms of updates and announcements. Among the other customers it has acquired this year are Lowe’s and Wayfair, announced during the NRF retail extravaganza in January, and decentralised network Hedera Hashgraph.

What makes this move interesting is AWS’ established expertise in the sporting arena, with several marquee brands on board. Formula 1 and NASCAR are the best-known governing bodies among them, with the Bundesliga signing up earlier this year. Google Cloud’s best-known sporting customer to date is the Golden State Warriors, in a deal announced this time last year.

CloudTech has reached out to Google Cloud to confirm whether this is a single cloud provider deal, but it is worth noting the MLB logo no longer appears on AWS’ dedicated sports client page. As of February 29, it was still there (screenshot: CloudTech).

You can read the full announcement here.

Photo by Jose Morales on Unsplash

Cloud computing accelerating climate change is a misconception, scientists find

Data centre workloads, powered by the rise in cloud computing, may not be the threat to the climate many have feared, according to a new report.

The study, published in the journal Science last week, argued that while global data centre energy use has increased over the past decade, this growth is negligible compared with the rise in workloads over that time.

According to the research, global data centre energy usage in 2018 stood at 205 terawatt-hours (TWh), comprising around 1% of global electricity consumption. This represents only a 6% uptick compared with 2010 figures, yet global data centre compute instances rose by 550% over the same period. Put as energy use per compute instance, the energy intensity of global data centres has decreased by about 20% annually since 2010.
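Those figures hang together: the sketch below works through the arithmetic, treating a 550% rise as a 6.5x multiplier and spreading the intensity drop over the eight years from 2010 to 2018.

```python
# Quick check of the figures quoted above: energy use up ~6% between 2010 and
# 2018 while compute instances rose ~550% (i.e. 6.5x as many instances).
energy_growth = 1.06          # 2018 energy use relative to 2010
instance_growth = 1 + 5.50    # a 550% rise means 6.5x as many compute instances
years = 8                     # 2010 -> 2018

# Energy used per compute instance, 2018 relative to 2010
intensity_ratio = energy_growth / instance_growth        # ~0.16, an ~84% total drop

# Equivalent annual rate of decline in energy intensity
annual_decline = 1 - intensity_ratio ** (1 / years)      # ~0.20, i.e. ~20% per year

print(f"Energy per instance, 2018 vs 2010: {intensity_ratio:.2f}")
print(f"Implied annual decline in intensity: {annual_decline:.0%}")
```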

The paper cites various improvements as key to this change. Greater server virtualisation has meant a sixfold increase in compute instances with only a 25% rise in server energy use, according to the research. More energy-efficient port technologies, the report notes, have enabled a 10-fold increase in data centre IP traffic with only ‘modest’ increases in network device energy usage.

What’s more, the rise of the hyperscalers has helped. The move away from more traditional, smaller data centres – which comprised almost four in five compute instances in 2010 – has brought improved power usage effectiveness (PUE), thanks to more efficient power supply and cooling systems. Hyperscale facilities, as part of larger, more energy-efficient cloud data centres, made up an estimated 89% of compute instances in 2018, the report says.

Average data centre PUE has also improved markedly. When this publication attended the opening of Rackspace’s UK data centre campus in 2015, its PUE of 1.15 was noted at the time as ‘almost unheard of in commercially available multi-tenant data centres.’

Plenty of initiatives are taking place which show how the industry is looking to harness the planet’s natural cooling systems to create a more sustainable future. In September, SIMEC Atlantis Energy announced plans to build an ocean-powered data centre in Caithness, off the Scottish coast. The company, which according to reports is in the process of arranging commercial deals for the site, is following in the footsteps of Microsoft, which in 2018 experimented with placing a data centre underwater off Orkney.

The naturally cooler temperatures of countries in the northern hemisphere, most notably in Scandinavia, have long been seen as advantageous. In what was seen as a landmark ruling in 2016, the Swedish government confirmed data centre operators would be subject to a reduction in electricity taxation, putting the industry on a similar footing to manufacturing, among other sectors.

In terms of the hyperscale cloud providers, Google touts itself as leading the way, saying as far back as April 2018 that it had become the first public cloud provider to run all of its cloud operations on renewable energy. The company says its PUE across all data centres for 2019 was 1.1, favourably citing an industry average of 1.67.
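For context, PUE is total facility energy divided by the energy that actually reaches the IT equipment, so a PUE of 1.0 would mean zero overhead for cooling and power distribution. The short sketch below translates the two figures quoted above into overhead shares.

```python
# What the quoted PUE figures mean in practice: the share of total facility
# energy that goes to overhead (cooling, power distribution, etc.) rather
# than to the IT equipment itself.
def overhead_share(pue: float) -> float:
    """Fraction of total facility energy spent on non-IT overhead."""
    return (pue - 1.0) / pue

for label, pue in [("Google fleet-wide (2019, as reported)", 1.10),
                   ("Industry average (as cited)", 1.67)]:
    print(f"{label}: PUE {pue} -> {overhead_share(pue):.0%} overhead")
```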

Following the release of the Science report, Urs Holzle, SVP for technical infrastructure at Google Cloud, said the findings ‘validated’ the company’s efforts, which included utilising machine learning to automatically optimise cooling, and smart sensors for temperature control. “We’ll continue to deploy new technologies and share the lessons we learn in the process, design the most efficient data centres possible, and disclose data on our progress,” wrote Holzle.

Amazon Web Services (AWS), the leader in cloud infrastructure, says that as of 2018 it exceeded 50% renewable energy usage and has ‘made a lot of progress’ on its commitment to 100% renewable usage. The company has previously received criticism, with a report from Greenpeace this time last year saying AWS ‘appears to have abandoned its commitment to renewable energy’. Last month, Amazon CEO Jeff Bezos said he would commit $10 billion to address climate change.

CloudTech contacted AWS for comment and was pointed in the direction of a 451 Research report from November which found that AWS’ infrastructure was 3.6 times more energy efficient than the median of enterprise data centres surveyed.

One potential future area of concern with regard to computational power is that of Bitcoin. The energy required to mine the cryptocurrency has led to various headlines, with the University of Cambridge arguing that Bitcoin’s energy usage, based on TWh per year, equalled that of Switzerland. Pat Gelsinger, the CEO of VMware, previously said when exploring the concept of ‘green blockchain’ that the energy required to process it was ‘almost criminal.’

Michel Rauchs, who worked on the Cambridge project, is speaking at Blockchain Expo later this month on whether Bitcoin is ‘boiling the oceans.’ His argument is that the question is more nuanced than many believe – not helped by the extremist opinions on both sides.

“The way that Bitcoin is being valued for different people right now is completely subjective,” Rauchs tells CloudTech. “For some people it’s really an essential; for other people it’s some sort of gimmick, and it’s definitely not worth the electricity it consumes.

“There is no easy answer,” he adds. “The only thing that we can say today is that Bitcoin right now is at least not directly contributing to climate change, though the level of energy consumption is really high. You need to look at the energy mix – what sources of energy are going into producing that electricity.”

The report concludes that, despite the good news, the IT industry, data centre operators and policy makers cannot ‘rest on their laurels’. If Moore’s Law is anything to go by – albeit a long-standing dictum which may itself be reaching the end of its natural life – demand will continue to proliferate, with the next doubling of global data centre compute instances predicted to occur within the next four years.

You can read the full article here (preview only, client access required).

Salesforce acquires Vlocity in $1.33bn all-cash deal amid executive shuffle

Salesforce is to acquire Vlocity, a provider of industry-specific cloud and CRM apps, for $1.33 billion (£1.04bn) in an all-cash deal – amid an action-packed end to the company’s fiscal year.

The news was tucked away at the very end of Salesforce’s fourth quarter update. The company’s financial results were described as ‘phenomenal’ by CEO Marc Benioff – not co-CEO, more on which shortly – with Q4 revenue up 35% year over year to $4.8bn (£3.7bn), and total fiscal 2020 revenue at $17.1bn on a 29% increase.

That Salesforce should be the company acquiring Vlocity comes as little surprise. The Vlocity solution is built natively on the Salesforce platform, and Salesforce Ventures co-led its most recent funding, a $60 million series C round in March.

The six primary industries targeted by Vlocity are communications, energy and utilities, government, health, insurance, and media and entertainment, with key customers including T-Mobile, Telus, and Sky. The company placed in the top 25 of the most recent Forbes Cloud 100 ranking, as well as just outside the top 50 on the Inc. 5000 Series earlier this month, representing the fastest growing private companies in California.

Writing to customers following the Salesforce acquisition news, Vlocity CEO David Schmaier noted the growth the company had undertaken to this point. “Every organisation, including the world’s largest customer-centric corporations and industries, must digitally transform,” Schmaier wrote. “It is more important than ever for our customers to have products that speak the language of their industries.

“The best customer experiences are industry-specific,” Schmaier added. “Together, our customers, our partners and our employees have accomplished so much. I am thrilled about our future with Salesforce.”

Speaking to investors following the earnings report, Benioff noted that Vlocity was a ‘relatively small transaction’ – not idle words when you’ve previously spent $15.7bn on Tableau – but that the time was right to invest. “In our relationship with Vlocity and the way that we originally invested in the company, it created a situation for acquisition that we needed to take advantage of and which is why we have acquired it at a very attractive price, because we have been partners with them from the very beginning,” he said.

Alongside this, it was announced that Keith Block has stepped down as co-CEO of Salesforce, with Benioff carrying on alone. Block will remain an advisor to the CEO, with Benioff leading the tributes to an executive who ‘helped position [Salesforce] as a global leader and… the envy of the industry.’ Gavin Patterson, former BT chief, is joining the company as president and CEO of Salesforce International, based in London.

The Vlocity transaction is expected to close during the second quarter.

Google Cloud bolsters security offerings at RSA – as Thales report warns of more breaches

Google Cloud has beefed up its security offerings to include greater threat detection, response integration, and online fraud prevention.

The news, announced at the RSA Conference in San Francisco, focused predominantly on enterprise security product Chronicle, which was ‘acquired’ by Google Cloud last year, having previously been a bet of Alphabet’s ‘moonshot’ X R&D division.

Users will be able to target threats through YARA-L, a new rules language built specifically for modern attack behaviours, with Google Cloud promising ‘massively scalable, real-time and retroactive rule execution.’

One strand of Chronicle’s development, ‘intelligent data fusion’, is also being advanced as part of this rollout, allowing companies to automatically link multiple events into a single timeline. The rollout also involves security partners, with Palo Alto Networks – announced as a collaborator for hybrid cloud platform Anthos in December – first on the list.

In terms of more general security defences, Google Cloud is also introducing an enterprise-strength reCAPTCHA product, as well as Web Risk API, both available for separate purchase. The former has recently been fortified with various bot defence systems, while the latter enables client applications to check URLs against unsafe web resources.
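To illustrate the latter, the sketch below shows one way a client application might query the Web Risk API over REST; the uris:search endpoint, parameters and API-key authentication are assumptions to be checked against current Google Cloud documentation rather than details from the announcement.

```python
# A minimal sketch of checking a URL against the Web Risk API. The endpoint,
# parameters and API-key authentication shown here are assumptions for
# illustration; consult current Google Cloud docs before relying on them.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
url_to_check = "http://example.com/suspicious-page"

resp = requests.get(
    "https://webrisk.googleapis.com/v1/uris:search",
    params={
        "key": API_KEY,
        "uri": url_to_check,
        "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],  # repeated query params
    },
)
resp.raise_for_status()

result = resp.json()
if result.get("threat"):
    print("Unsafe:", result["threat"]["threatTypes"])
else:
    print("No known threats recorded for this URL")
```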

For Google Cloud, whose updates and iterations are mainly focused on the enterprise – the company is certainly the ‘noisiest’ of the biggest cloud providers – this represents another step in the right direction.

“When it comes to security, our work will never be finished,” wrote Sunil Potti, Google Cloud VP security in a blog post. “In addition to the capabilities announced today, we’ll continue to empower our customers with products that help organisations modernise their security capabilities in the cloud or in-place.”

Alongside this, security provider Thales has reported in its latest global Data Threat study that while around half of all data is now stored in cloud environments, a similar proportion of organisations across various industries have suffered a data breach.

The study, conducted alongside IDC and which polled more than 1,700 executives, found that 47% of organisations experienced a breach or failed a compliance audit over the past year. Financial services firms, at 54%, suffered the most, ahead of government (52%) and retail (49%). Yet government respondents said they were the most advanced in their digital transformation strategies.

IDC noted four key strategies based on the report’s findings, with encryption at the heart of security across big data, IoT, and containers. Organisations should invest in hybrid and multi-cloud data security tools; increase focus on data discovery and centralisation of key management; focus on threat vectors within their control; and consider a Zero Trust model.

Yet the latter came with a caveat. “Zero Trust is a fantastic initiative to authenticate and validate the users and devices accessing applications and networks, but does little to protect sensitive data should those measures fail,” said Frank Dickson, IDC program vice president for cybersecurity products. “Employing robust data discovery, hardening, data loss prevention, and encryption solutions provide an appropriate foundation for data security, completing the objective of pervasive cyber protection.”

Don’t expect AI software companies to gobble up revenues SaaS-style, warns a16z

The potential of artificial intelligence (AI) to revolutionise software is evident, from creating new data models to the improved insights gleaned from the data provided. If you’re a business intelligence (BI) vendor and you are not exploring AI or machine learning in some capacity, for instance, then there is a real danger of missing the boat.

But if you’re expecting AI software companies to eat up recurring revenues like their cloudy, SaaS-y predecessors, then think again.

Martin Casado and Matt Bornstein, of venture capital firm Andreessen Horowitz (a16z), argue that lower gross margins, scaling challenges and weaker defensive capabilities mean AI businesses are not going to resemble traditional software companies going forward. What’s more, the mix means AI companies will more closely resemble services-oriented businesses.

One key differentiator is cloud infrastructure cost. While SaaS providers are among the many businesses likely to use a hyperscaler or other cloud provider, companies focusing on AI software have much more complex – and consequently more expensive – demands, from model training, to inference, to the rich media being processed.

The field’s nascency is also a concern, a16z notes. “We’ve had AI companies tell us that cloud operations can be more complex and costly than traditional approaches, particularly because there aren’t good tools to scale AI models globally,” Casado and Bornstein wrote. “As a result, some AI companies have to routinely transfer trained models across cloud regions – racking up big ingress and egress costs – to improve reliability, latency, and compliance.”

As a result, this chunk of cost could put an immediate stop to thoughts of gross margins in the 60-80% range, lowering sights to 30-50%, with scaling ‘linear at best’ rather than supersized. And there are other factors to consider.
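To see why those cost lines bite, the sketch below runs purely illustrative numbers – invented for this example, not taken from the a16z post – through a simple gross margin calculation.

```python
# Purely illustrative arithmetic for the gross margin point above; every
# figure here is invented for the example and not drawn from the a16z post.
revenue = 100.0

# A typical SaaS cost profile: hosting is a modest slice of revenue.
saas_cogs = 25.0
saas_gross_margin = (revenue - saas_cogs) / revenue    # 0.75 -> in the 60-80% band

# An AI-heavy cost profile: hosting plus model training/inference, cross-region
# data transfer and human review all land in cost of goods sold.
ai_cogs = 25.0 + 20.0 + 10.0 + 5.0
ai_gross_margin = (revenue - ai_cogs) / revenue        # 0.40 -> in the 30-50% band

print(f"SaaS-style gross margin: {saas_gross_margin:.0%}")
print(f"AI-style gross margin:   {ai_gross_margin:.0%}")
```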

The human factor is an important point to consider, a16z noted. Take the example of the largest social media companies hiring thousands of human moderators to support their AI-based systems. This is not to mention the continually iterative process of training models, securing new training data, and feeding the results back into the systems. The need for human intervention will decline going forward, but Casado and Bornstein wrote that ‘it’s unlikely that humans will be cut out of the loop entirely.’

Putting together a strong defensive moat, as a16z calls it, is more difficult than it seems. SaaS companies can do so by owning the intellectual property generated by their work, such as the source code. Open source providers have, as the past year has shown, run into problems keeping their approach differentiated. For AI companies, meanwhile, reference implementations are available from open source libraries, and much of the groundwork is conducted in academia. Ultimately, it comes down to who owns the data – and in this instance it is the customer, or the data sits in the public domain.

The future of enterprise software is a fascinating one with the emergence of various technologies. In 2018, venture capital fund Work-Bench explored how AI was being incorporated, again noting the roadblock between the academic work being undertaken and the business models being plugged into it. “Despite hopeful promise, startups racing to democratise AI are finding themselves stuck between open source and a cloud place,” the report noted.

While stressing caution, Casado and Bornstein are overall optimistic, so long as AI companies heed the warning signs. “Things like variable costs, scaling dynamics, and defensive moats are ultimately determined by markets – not individual companies,” they wrote. “The fact that we’re seeing unfamiliar patterns in the data suggests AI companies are truly something new.

“There are already a number of great companies who have built products with consistently strong performance.”

You can read the full a16z analysis here.

Editor's note: You can read more news on artificial intelligence, machine learning, deep learning and more at our industry-specific sister publication, AI News.

Human error and misconfigurations primary source of Kubernetes security snafus, report says

StackRox, a provider of cloud-native container and Kubernetes security, warned in its previous report that the security implications of Kubernetes were beginning to spill over into adoption – and the release of its updated winter study has proved the company right.

The paper, the winter edition of its State of Container and Kubernetes Security Report, was put together alongside 451 Research and polled more than 500 industry professionals.

Some 94% of those polled said they had experienced security incidents in their container environments during the previous 12 months. As is frequently the case with other cloud security snafus, human error – in this case misconfigured containers – is a root cause, a trend which StackRox said was ‘alarmingly common.’

More than two thirds (69%) of those polled said they had experienced a misconfiguration incident; just over a quarter (27%) found a security incident during runtime, with a similar number (24%) having a major vulnerability to remediate.
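As a flavour of what catching such misconfigurations can look like in practice, the sketch below uses the official Kubernetes Python client to flag privileged containers; it is one illustrative check, not the methodology used in the StackRox report.

```python
# Illustrative sketch of scanning a cluster for one common misconfiguration
# (privileged containers) with the official Kubernetes Python client. This is
# a single example check, not the methodology behind the report above.
from kubernetes import client, config

config.load_kube_config()   # or config.load_incluster_config() when run in-cluster
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces().items:
    for container in pod.spec.containers:
        sc = container.security_context
        if sc and sc.privileged:
            print(f"Privileged container: {pod.metadata.namespace}/"
                  f"{pod.metadata.name}/{container.name}")
```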

86% of respondents said they were running containerised applications in Kubernetes – the same number as in the spring survey. However, the way Kubernetes is being used is changing rapidly, as more organisations put trust in the hyperscalers managing their workloads. Just over a third (35%) of respondents said they manage Kubernetes directly today – down from 44% six months ago – with more respondents (37%) using Amazon EKS. More than one in five (21%) say they use Azure AKS and Google GKE, with both representing a significant increase from spring.

In a similar vein, cloud-only environments are maturing. While hybrid deployments remain more popular – 46% compared with 40% for cloud-only – that figure represents a big drop from the 53% who cited hybrid six months ago. Among cloud-only organisations, most still trust a single cloud, although multi-cloud deployments are becoming more popular.

The previous report, issued in July, gave more of a general warning on container security. Six months prior, two in three organisations said they had more than 10% of their applications containerised – yet two in five were concerned their container strategy did not sufficiently invest in security. This time around, only 28% of organisations polled said they had fewer than 10% of their containers running in production – down from 39% last time.

“One of the most consistent results we get on our own surveys of DevOps and cloud-native security technologies is how important security is for these environments,” said Fernando Montenegro, principal analyst at 451 Research. “It is interesting to see how this observation fits well with the StackRox study, highlighting the need for both engineering and security professionals to have visibility and properly deploy security controls and practices for container and Kubernetes environments.”

You can read the full report here (email required).
