AWS re:Invent: Andy Jassy announces ML Marketplace, Blockchain and more


Bobby Hellard

29 Nov, 2018

At AWS re:Invent on Wednesday, Andy Jassy said that he had “a few things to share”. But over the course of his two-hour keynote, the CEO announced a barrage of new services and capabilities, from blockchain to machine learning.

The boss of the world’s biggest cloud computing company has had a busy few days at the annual event in Las Vegas. From making announcements to meeting many of the developers and partners that have flocked to Sin City, Jassy has put himself about and offered plenty of information on everything he’s revealed.

And there were a ridiculous number of them…

Machine Learning Marketplace

Available now, the AWS Marketplace for Machine Learning includes over 150 algorithms and models, with more said to be coming every day, that can be deployed directly to Amazon SageMaker. It’s a giant algorithm hub where developers can find and offer machine learning models for the benefit of all.

For Gavin Jackson, the managing director of AWS UK and Ireland, this was the biggest news of the day and also a very good example of an underlying theme of this year’s re:Invent. It’s about catering to both those who can and those who can’t.

“The big announcement, I thought, was the Machine Learning Marketplace,” said Jackson. “Because while SageMaker is a good use of existing training models that you can just plug straight into your application, customers who are building their own training models and algorithms for applications can just look at a much wider set of use cases that are available in the marketplace and then just plug them in so they don’t have to build them for themselves.

“At the same time, those that do have data scientists and are building their own algorithms and training models can plug them into the marketplace and monetise it. It’s kind of a marketplace for those that can and those that can’t and everybody wins in the end. It just accelerates the progress of machine learning and artificial intelligence over time.”
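To make that concrete, here is a minimal sketch of what consuming a Marketplace model looks like with the standard boto3 SageMaker client. The model package ARN, role and resource names below are hypothetical placeholders; in practice the ARN comes from your Marketplace subscription.

import boto3

# Deploy a subscribed Marketplace model package to a SageMaker endpoint.
# All ARNs and names here are hypothetical placeholders.
sm = boto3.client("sagemaker", region_name="us-east-1")

model_package_arn = "arn:aws:sagemaker:us-east-1:123456789012:model-package/example-algo"

# Wrap the Marketplace package as a SageMaker model.
sm.create_model(
    ModelName="marketplace-example-model",
    PrimaryContainer={"ModelPackageName": model_package_arn},
    ExecutionRoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    EnableNetworkIsolation=True,  # Marketplace models typically run network-isolated
)

# Stand up a real-time endpoint behind the model.
sm.create_endpoint_config(
    EndpointConfigName="marketplace-example-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "marketplace-example-model",
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)
sm.create_endpoint(
    EndpointName="marketplace-example-endpoint",
    EndpointConfigName="marketplace-example-config",
)

Once the endpoint is in service, invoking it is an ordinary invoke_endpoint call; the point is that the model itself never had to be built or trained in-house.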

Blockchain

Unexpectedly, the CEO announced two new services to help companies manage business transactions for blockchain, starting with Amazon Managed Blockchain. Jassy said that this new service makes it easy to create and manage scalable blockchain networks using the popular, open source Ethereum and Hyperledger Fabric frameworks.

It’s run from the AWS Management Console, where customers can set up a blockchain network that can span multiple AWS accounts and scale to support thousands of applications and millions of transactions.

The second blockchain offering, Amazon QLDB, is a transparent, immutable, and cryptographically verifiable ledger for applications that need a central, trusted authority to provide a permanent and complete record of transactions, in areas such as supply chain, finance, manufacturing, insurance, and HR. Managed Blockchain, by contrast, is the option for customers who want to build applications where multiple parties can execute transactions without the need for a trusted, central authority.
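As a rough illustration of how lightweight the ledger side is compared with running a full blockchain framework, here is a minimal sketch using the boto3 QLDB client; the ledger name is a hypothetical placeholder, and ALLOW_ALL was the launch-time permissions mode.

import boto3

# Create a QLDB ledger; the name is a hypothetical placeholder.
qldb = boto3.client("qldb", region_name="us-east-1")

qldb.create_ledger(
    Name="supply-chain-ledger",
    PermissionsMode="ALLOW_ALL",   # launch-time mode: access governed by ledger-level IAM
    DeletionProtection=True,       # guard the immutable journal against accidental deletion
)

# Poll until the ledger is ACTIVE before connecting a driver to it.
print(qldb.describe_ledger(Name="supply-chain-ledger")["State"])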

According to Jassy, the company was asked why it had not shown any previous interest in blockchain, despite many of its customers and partners using the technology.

“We just hadn’t seen that many blockchain examples in production or that couldn’t easily be solved by a database,” said Jassy. “People just assumed that meant we didn’t think blockchain was important or that we wouldn’t build a blockchain service. We just didn’t understand what the real customer need was.”

Data

Also announced during the keynote were new services for automating data applications and detailed guidance to help customers build faster on AWS services.

AWS Control Tower is a cloud interface that allows users to govern multiple AWS accounts, aimed particularly at companies migrating to the cloud. Jassy said it offers pre-packaged governance rules for security, operations, and compliance, which customers can apply enterprise-wide or to groups of accounts to enforce policies or detect violations.

Jumping on the data lake bandwagon, the company is now offering AWS Lake Formation, which will run on Amazon S3. Data lakes are storage systems that pull data from multiple sources and store it for use by technologies like machine learning. The AWS version is said to automate and optimise the ingested data, reducing the data management burden for customers.
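For a sense of what that automation replaces, the sketch below shows the two core moves – registering an S3 location and granting a principal access – using the boto3 lakeformation client that shipped once the service left preview. The bucket, database, table and user names are hypothetical placeholders.

import boto3

# Point Lake Formation at the S3 location backing the data lake and grant
# an analyst access to one catalog table. All names are hypothetical.
lf = boto3.client("lakeformation", region_name="us-east-1")

lf.register_resource(
    ResourceArn="arn:aws:s3:::example-data-lake-bucket",
    UseServiceLinkedRole=True,
)

lf.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:user/analyst"},
    Resource={"Table": {"DatabaseName": "sales_db", "Name": "orders"}},
    Permissions=["SELECT"],
)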

Hybrid

There was some noise before the event that AWS would address hybrid cloud systems, and it has confirmed AWS Outposts, a fully managed service of configurable compute and storage racks built with AWS-designed hardware. The service allows customers to run computing and storage on-premise while connecting to other AWS services in the cloud.

These Outposts come in two variants: first, an extension of the VMware Cloud on AWS service that runs on AWS Outposts; and second, AWS Outposts that let customers run on-premise computing and storage using the same native AWS APIs used in the AWS cloud.

AWS looks to redefine hybrid cloud at re:Invent 2018 – plus make big moves in blockchain

Andy Jassy, CEO of Amazon Web Services, noted during his keynote at re:Invent today that on a recent business trip, a senior AWS executive found themselves sat next to an exec from an unnamed competitor. That exec pulled up a presentation deck which – Jassy paraphrasing – described the company’s product strategy as ‘look at what AWS launches, and then move as fast as possible to get something vaguely in that area.’

So with that story in mind, customers and partners, media and analysts, and perhaps a few cloud vendors as well, sat down in Las Vegas to absorb AWS’ updates. Database, blockchain and machine learning all ended up getting a significant airing – but the last announcement, on hybrid cloud, stole the show.

Compute and storage

Jassy started out by putting down the various numbers which asserted AWS’ dominance in the cloud market, both in terms of market share and absolute growth. Regular readers of this publication will be more than aware of these numbers; Synergy Research, for instance, said at last count that AWS led across all geographies and was ‘in a league of its own.’ It was here that Jassy threw in his obligatory Oracle dig. “There are some providers who don’t have enough revenue to show up here,” he said of rivals who only appear when they try to grab attention for themselves. You can guess the rest.

Yet with so much to get through, competitor-bashing was relatively brief. The first segment underlined the breadth of the AWS portfolio. Take containers as one example. Customers can use ECS for a container service most tightly integrated with AWS, EKS to use Kubernetes in a managed service, or Fargate for a more ad hoc approach.
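The Fargate option is the easiest to show in a few lines: a minimal sketch, using the boto3 ECS client, of running a container task with no instances to manage. The cluster, task definition and subnet below are hypothetical placeholders.

import boto3

# Run a one-off container task on Fargate; identifiers are hypothetical.
ecs = boto3.client("ecs", region_name="us-east-1")

ecs.run_task(
    cluster="example-cluster",
    launchType="FARGATE",              # no EC2 instances to provision or patch
    taskDefinition="example-task:1",   # a previously registered task definition
    count=1,
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "assignPublicIp": "ENABLED",
        }
    },
)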

With that in mind, a new storage class was unveiled. Glacier Deep Archive is billed as the lowest-cost storage available in the cloud, at less than one tenth of a cent per gigabyte per month – or $1 per terabyte per month. Naturally, it is aimed at the coldest possible usage: enterprises still managing data on ancient tape. “You have to be out of your mind to manage your own tape moving forward, and this will be here for you in 2019,” said Jassy.
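The arithmetic behind those two figures is worth a quick check, since they are the same price expressed two ways:

# Sanity-check the quoted Glacier Deep Archive pricing.
price_per_gb_month = 0.001   # dollars: one tenth of a cent per GB-month
gb_per_tb = 1000             # decimal terabytes, matching the "$1/TB" framing

archive_tb = 500             # hypothetical tape archive being retired
monthly_cost = archive_tb * gb_per_tb * price_per_gb_month
print(f"{archive_tb} TB costs about ${monthly_cost:,.2f} per month")  # ~$500.00

At that rate, even a half-petabyte tape archive comes in around $500 a month, which is exactly the comparison Jassy was inviting.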

One of the key themes throughout the presentation was that developers and technology builders need the right tools to get the job done. Indeed, practically every announcement AWS made was in response to specific customer pain points. In terms of file systems, Amazon already had Linux workloads covered, but its new FSx family adds managed file systems for Windows Server and Lustre. While the former is an evident attempt to lure Windows users, the latter is of particular interest, focusing on high performance computing (HPC). Both products support the HIPAA, ISO and PCI-DSS security standards out of the box.
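A minimal sketch of what provisioning the two new file systems looks like through the boto3 FSx client; the subnet, capacity and directory values are hypothetical, and launch-time minimum sizes may differ from current ones.

import boto3

# Create the two new FSx file system types; all identifiers are hypothetical.
fsx = boto3.client("fsx", region_name="us-east-1")

# FSx for Lustre, aimed at HPC scratch workloads.
fsx.create_file_system(
    FileSystemType="LUSTRE",
    StorageCapacity=3600,                    # GiB
    SubnetIds=["subnet-0123456789abcdef0"],
)

# FSx for Windows File Server, exposing SMB shares to Windows workloads.
fsx.create_file_system(
    FileSystemType="WINDOWS",
    StorageCapacity=300,                     # GiB
    SubnetIds=["subnet-0123456789abcdef0"],
    WindowsConfiguration={
        "ThroughputCapacity": 8,             # MB/s
        "ActiveDirectoryId": "d-0123456789", # hypothetical AWS Managed AD
    },
)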

The second generation of builders

To some extent, the second stanza contradicted the first. With AWS’ huge depth in its products, complexity can of course be seen as an issue. Indeed, the cloud management space, with vendors such as CloudHealthTech – acquired by VMware, more on whom later – is testament to that. Jassy noted how a second type of builder was emerging; mostly found in enterprise organisations who wanted more of a guiding hand in how to set up products. AWS Control Tower and AWS Security Hub – the former enabling customers to set up a landing zone or environment easily, the latter centrally managing security and compliance across an AWS environment – were launched with that in mind.

One of the more eye-opening statistics was that more than 10,000 data lakes were being built on top of AWS S3. As Jassy noted, the data lake may be the in-vogue concept for 2018. “People realise there is significant value in moving all that disparate data and making it much easier by consolidating it into a data lake to enable you to run analytics and machine learning,” he said. “But if you’ve tried to build a data lake, it’s hard.”

AWS Lake Formation, therefore, was launched in order for organisations to take their data out of silos in days, rather than months, with Amazon offering to do the heavy lifting, from cleaning to partitioning, to indexing and cataloguing. “This is a step level change in how easy it’s going to be for all of you to have data lakes,” said Jassy.

“It’s obvious what The Beatles were singing about,” Jassy joked as the strains of Blackbird came to a close. “Database freedom!” Indeed, they ‘were only waiting for this moment to be free’, and this was where some of the harshest criticism came in – reserved for the legacy, relational database players. “People are sick of it, and now they have choice,” he said.

Again, the breadth of the portfolio was noted with three customer examples. For simpler iterations there is DynamoDB; Lyft uses it to coordinate passenger information and GPS coordinates. Airbnb uses ElastiCache for its single sign-on (SSO) to fire with microsecond latency. Nike used Neptune to build an app in which athletes, their followers, and all their interests are correlated. But if you have tables that fluctuate due to seasonality or spikiness, knowing how to scale is a matter of guesswork. The snappily-named DynamoDB Read/Write Capacity On Demand aims to take care of that.
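What that looks like in practice: a minimal sketch with the boto3 DynamoDB client, creating a table in the new on-demand mode so no capacity has to be guessed up front. The table and key names are hypothetical.

import boto3

# Create a table billed per request rather than by provisioned capacity.
dynamodb = boto3.client("dynamodb", region_name="us-east-1")

dynamodb.create_table(
    TableName="ride-events",
    AttributeDefinitions=[{"AttributeName": "ride_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "ride_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",   # on-demand: scales with traffic, pay per read/write
)

# An existing provisioned table can be switched over in place:
# dynamodb.update_table(TableName="ride-events", BillingMode="PAY_PER_REQUEST")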

Perhaps the biggest cheer of the keynote came when Amazon Timestream was announced. The database can process trillions of events per day at one tenth of the cost of relational databases, and is focused on IoT and edge computing.

Blockchain, machine learning, and old friends

When it came to blockchain and machine learning, both saw leaps forward. Amazon Quantum Ledger Database (QLDB) – cited by Amazon CTO Werner Vogels as one of his favourite announcements – aims to solve the problem of providing a ledger with a trusted entity without having to surf through complicated functionality provided by blockchain frameworks. The second, Amazon Managed Blockchain, does what it says on the tin, supporting both Hyperledger Fabric and Ethereum. The company has certainly come a long way from this time last year when it said it wasn’t especially interested in the technology.

As far as machine learning went, Amazon SageMaker Ground Truth, which aims to help label data more accurately, and AWS Inferentia, a high-performance machine learning inference chip, stood out. Yet in one of the few nods to previous business, Ross Brawn, managing director of motor sports at Formula 1, took to the stage to expand on AWS’ partnership with the sporting giant first announced in July.

ML had been promised as a cornerstone at the time, and Brawn duly delivered. ‘F1 Insights Powered By AWS’ had been launched to some extent this season, providing more data as well as predictions on what may happen. This is being extended next season by further integrating telemetry data to predict performance and race strategy, as well as by using HPC to simulate car designs whose slipstreams take less aerodynamic performance away from the following car, leading to closer racing. “These are insights the teams have always had – but we’re going to bring them out to the fans to show them what’s happening,” said Brawn.

Mindful perhaps that looking back rather than forward may have turned into a habit, the next segment was also the last. Pat Gelsinger, CEO of VMware, went on stage – much as Jassy had done during VMworld keynotes – to help launch AWS Outposts. In some ways, the best – or perhaps most shocking – announcement had been left till last. AWS claims Outposts delivers a ‘truly consistent hybrid experience’ by bringing AWS services, infrastructure and operating models to ‘virtually any’ on-premises facility. This can be achieved either as AWS-native or by running VMware Cloud on AWS.

“The breadth and scale of the AWS platform now, combined with the sheer velocity of new feature releases means that few firms on the planet are moving faster,” said Nick McQuire, VP enterprise at CCS Insight. “It bodes ominously for Microsoft and Google in the high stakes cloud wars.”

Picture credits: AWS/Screenshot


HPE hails ‘major leap in storage’ with memory-driven flash


Dale Walker

28 Nov, 2018

HPE has announced a host of new enhancements to its storage portfolio, including the introduction of memory-driven flash and an expansion to the coverage of its multi-cloud service.

The new additions come at a time when the company has all but given up on keeping pace with market leader Dell, and is instead seeking to build out its Intelligent Infrastructure range with new capabilities.

Chief among these is the introduction of Memory-Driven Flash to its 3PAR family of data centre flash storage modules and its Nimble Storage range, something that HPE described as the biggest leap in storage in 25 years. It essentially combines HPE software with storage class memory (SCM) and non-volatile memory express (NVMe) technology, based on Intel’s Optane hardware.

The result is a new class of storage that’s designed to halve latency and is billed as being 50% faster than all-flash arrays using non-volatile memory SSDs. This is particularly important for latency-sensitive workloads that rely on real-time processing, or those that use AI at scale.

“Most applications can benefit from adding memory, but memory is very expensive,” said Milan Shetti, GM of HPE Storage, speaking at HPE Discover in Madrid this week. “You can also have intelligent storage, but one of the key attributes of this is you need to have memory.

“[This] is the industry’s first enterprise storage operating system, which will support storage-class memory,” he added. “This is something we’ve been working on for a while. With [the Memory-Driven] operating system, at the speed of memory and at the cost of flash, you’re getting an entirely new way of building computing, storage and data management.”

This new architecture will be available in December 2018 for 3PAR as a straightforward upgrade, and sometime in 2019 for Nimble Storage.

Another major announcement was the introduction of new tools to its InfoSight product, a management platform that is designed to predict and prevent errors across an organisation’s infrastructure without user involvement, as well as predict bandwidth, capacity and performance needs.

Now the platform can also utilise more machine learning-based controls, including an enhanced recommendation engine that replaces its basic predictive analytics with an AI-based system. This drastically improves optimisation guidance across HPE Nimble Storage as a result, the company explained.

Also announced was the release of machine learning tools for its HPE 3PAR storage range, which allows for self-diagnosis of performance bottlenecks and means InfoSight can be extended to purely on-premise deployments. HPE explained this addresses the issue of being unable to provide InfoSight analytics to data centres which may have limited access to the cloud.

The company also revealed that its Cloud Volumes, a hybrid cloud storage-as-a-service platform, is now expanding into the UK and Ireland regions in 2019. Currently available for HPE Nimble Storage, the pay-as-you-use service allows customers to move their on-prem applications to the AWS or Azure cloud, only with enterprise-grade storage instead of the default storage offered by those clouds.

This platform also now includes containers-as-a-service for the provisioning of applications hosted by HPE, plus compliance certifications including SOC Type 1, HIPAA for healthcare, and GDPR.

Three quarters of businesses missing out on low-tech integration


Clare Hopping

28 Nov, 2018

Three-quarters of small to medium-sized enterprises are missing out because they haven’t deployed basic mobile working tools to their staff.

That’s according to a report commissioned by Crown Workspace and carried out by an independent research firm. Crown Workspace argues that by failing to integrate technologies such as cloud-based apps and services and BYOD policies, SMEs are significantly limiting their success.

The company noted that deploying basic tools such as the cloud, allowing employees to work from home and use their own devices for work, and offering flexible working conditions can have a significant positive impact on innovation and productivity.

“Modern technology has created a new set of rules for the workplace,” Simon Gammell, director at Crown Workspace said.

“Tech such as WiFi, remote storage and mobile are what employees expect, and that’s what SME owners should consider first when designing a workspace to ensure that their people can work and communicate effectively. Design factors such as layout, equipment and furniture are also massively important too but should not come at the detriment of technology.”

Some of the areas the company highlighted as lacking include voice technologies, which are being adopted by only one in five organisations. Only 25% of the businesses questioned feel prepared for mobile working, and although technologies such as Li-Fi and heating are becoming buzz technologies for SMEs, few are taking up the opportunity to modernise the workplace.

This is having a knock-on impact on employee productivity, but small businesses are struggling to justify spending on new tech in their offices.

“Landlords are conscious that occupiers need faster broadband speeds and greater access to strong wireless connections, amongst other technological advances,” Hugh Prissick, project manager and owner of Storey added. “Future proofing buildings is difficult but landlords and developers are placing technology at the heart of the design of new buildings.”

HPE to acquire BlueData to help with AI efforts


Clare Hopping

28 Nov, 2018

HPE has announced it has acquired software provider BlueData to help it boost its AI and analytics efforts.

HPE wants to roll out more software to capture enterprise demand for big data systems, and by purchasing BlueData it hopes it can bring the tools that businesses need to market a whole lot faster than it could on its own.

BlueData’s software is available in the cloud, on-premise and on hybrid infrastructure, making it a promising addition to HPE’s acquisition portfolio.

“BlueData has developed an innovative and effective solution to address the pain points all companies face when contemplating, implementing, and deploying AI/ML and big data analytics,” Milan Shetti, SVP and GM, Storage and Big Data Global Business Unit at HPE said.

“Adding BlueData’s complementary software platform to HPE’s market-leading Apollo Systems and professional services is consistent with HPE’s data-first strategy and enables our customers to extract insights from data – whether on-premises, in the cloud, or in a hybrid architecture.”

HPE expects the acquisition to close in January 2019, although it isn’t clear whether the entire BlueData team will join HPE to help with the integration or if HPE’s staff will continue the development of the platform when it is integrated with its Apollo range of machine learning tools and services.

“Growth in the volume and the types of data in the market continues to accelerate, as does the demand for a fast, easy, and unified consumption experience for AI and big data analytics,” said Kumar Sreekanti, Co-founder and CEO of BlueData.

“From our perspective, data is fuel, and BlueData’s software is the engine that helps businesses consume their data and deliver insights in the most effective and efficient way. We’ve had tremendous customer success by providing a turn-key solution that delivers an as-a-service experience for AI and big data, and are excited to reach even more customers as part of HPE.”

AWS Ground Stations link satellites to the cloud


Bobby Hellard

28 Nov, 2018

Amazon Web Services (AWS) announced AWS Ground Station on Tuesday at its re:Invent conference in Las Vegas.

It’s a service that aims to feed satellite data straight into AWS cloud infrastructure faster, more easily and at an affordable price for its customers.

There will be 12 of these ground stations located around the world; they are effectively antennas that link up with satellites as they pass overhead. This addresses a problem AWS customers have identified, according to the company: satellites are only within range of a given ground antenna briefly, making uploading and downloading data difficult.

The announcement was made by AWS CEO Andy Jassy, who cited customers as the inspiration and called the service “the first fully managed global ground station service”.

“Customers said, ‘Look, there’s so much data in space and so many applications that want to use that data to completely change the way in which we interact with our planet and world. Why don’t we just make this easier?’” he said.

There is nothing easy about dealing with satellites, particularly for transferring data.

As they’re only in range for limited periods, linking up is very challenging, and the data itself needs some kind of infrastructure in which to be stored, processed and utilised. From top to bottom, it’s an operation that requires a lot of resources, such as land and hardware, which is extremely expensive. Now the world’s biggest cloud provider has stepped in to offer a solution.

So, how does it work? According to AWS, customers can work out which ground station they want to interact with and identify the satellite they want to connect to. Then, they can schedule a contact time: the exact window in which they want the satellite to interact with the chosen ground station as it passes by.
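As a rough sketch of that scheduling flow, here is what reserving a pass looks like with the boto3 groundstation client that shipped alongside the service; every ARN and the station name below are hypothetical placeholders.

import boto3
from datetime import datetime, timedelta, timezone

# Reserve a contact window while the satellite is in range of a station.
gs = boto3.client("groundstation", region_name="us-east-2")

start = datetime.now(timezone.utc) + timedelta(hours=6)

gs.reserve_contact(
    missionProfileArn="arn:aws:groundstation:us-east-2:123456789012:mission-profile/example",
    satelliteArn="arn:aws:groundstation::123456789012:satellite/example",
    groundStation="Ohio 1",                 # hypothetical station name
    startTime=start,
    endTime=start + timedelta(minutes=10),  # the pass window
)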

Each AWS Ground Station will be fitted with multiple antennas to simultaneously download and upload data through an Amazon Virtual Private Cloud, feeding it directly into the customer’s AWS infrastructure.

“Instead of the old norm where it took hours, or sometimes days, to get data to the infrastructure to process it,” added Jassy, “it’s right there in the region in seconds. A real game changer for linking with satellites.”

While AWS provides the cloud computing, the antennas themselves come from its partnership with Lockheed Martin, which has developed a network of antennas called Verge. Where AWS offers the power to process and store, Verge promises a resilient link for the data to travel over.

“Our collaboration with AWS allows us to deliver robust ground communications that will unlock new benefits for environmental research, scientific studies, security operations, and real-time news media,” said Ric Ambrose, executive VP of Lockheed Martin.

“In time, with satellites built to take full advantage of the distributed Verge network, AWS and Lockheed Martin expect to see customers develop surprising new capabilities using the service.”

Predicting the future of digital marketplaces: AI, personalisation, and the latest cloud platforms

  • The U.S. B2B eCommerce market is predicted to be worth $1.2T by 2022, according to Forrester.
  • 75% of marketing executives say that reaching customers where they prefer to buy is the leading benefit a company gains from selling through an e-commerce marketplace, according to Statista.
  • 67% strongly agree that B2B e-commerce is critical to their business’s advantage and results in their industry.

Digital marketplaces are flourishing today thanks to the advances made in artificial intelligence (AI), machine learning, real-time personalisation and the scale and speed of the latest generation of cloud platforms including the Google Cloud Platform. Today’s digital marketplaces are capitalising on these technologies to create trusted, virtual trading platforms and environments buyers and sellers rely on for a wide variety of tasks every day.

Unlike the B2B exchanges and communities of the 90s, which often had high transaction costs, proprietary messaging protocols, and limited functionality, today’s marketplaces are proving that secure, trusted scalability is achievable on standard cloud platforms. Kahuna recently partnered with Brian Solis of The Altimeter Group to produce a fascinating research study, The State (and Future) of Digital Marketplaces. The report is downloadable here (PDF, 14 pp., opt-in). A summary of the results is presented below.

Kahuna digitally transforms marketplaces with personalisation

The essence of any successful digital transformation strategy is personalisation, and the extent to which an organisation can redefine every system, process, and product around that goal is the extent to which it will grow. Digital marketplaces are giving long-established businesses and startups a platform to accelerate their digital transformation efforts by delivering personalisation at scale.

Kahuna’s approach to solving personalisation at scale across buyers and sellers, while creating trust in every transaction, reflects the future of digital marketplaces. It has successfully integrated AI, machine learning, advanced query techniques and a cloud platform that scales dynamically to handle unplanned 5x spikes in global traffic. Kahuna built its marketplace platform on Google App Engine, Google BigQuery, and other Google Cloud Platform (GCP) services.

Kahuna’s architecture on GCP has been able to scale to onboard more than 80 million users a day without any DevOps support, a feat not possible with the exchange and community platforms of the 90s. By integrating its machine learning algorithms, designed to enhance customers’ ability to personalise marketing messages, with Google machine learning APIs driving TensorFlow, Kahuna has been able to deliver fast response times to customer inquiries. Its latest product, Kahuna Subject Line Optimisation, analyses the billions of emails its customers send to see what has and hasn’t worked in the past. Marketplace customers receive real-time recommendations in the email editor as they compose a subject line: Kahuna scores the likely success of subject lines in appealing to target audiences so that marketers can make adjustments on the fly.

The state (and future) of digital marketplaces

Digital marketplaces are rapidly transforming from transaction engines to platforms that deliver unique, memorable and trusted personal experiences.

Anyone who has ever used OpenTable to get a last-minute reservation with friends at a popular, crowded restaurant has seen the power of digitally enabled marketplace experiences in action. Brian Solis, the noted futurist, author, and analyst with The Altimeter Group, based his recent report, The State (and Future) of Digital Marketplaces, on 100 interviews with North American marketing executives across eight market segments.

Key insights and lessons learned from the study include the following:

Altimeter found that 67% of marketplaces are generating more than $50m annually and 32% are generating more than $100m annually, with the majority of marketplaces reporting a Gross Merchandise Volume (GMV) of between $500m and $999m.

When the size of participating companies is taken into account, it’s clear digital marketplaces are one of the new digital business models larger organisations are adopting, piloting and beginning to standardise on. It can be inferred from the data that fast-growing, forward-thinking smaller organisations are looking to digital marketplaces to help augment their business models. Gross merchandise volume (GMV) is the total value of merchandise sold to customers through a marketplace.

59% of marketing executives say new product/service launches are their most important marketplace objective for 2019

As marketplaces provide an opportunity to create an entirely new business model, marketing executives are focused on getting first product launches delivering revenue fast. Revenue growth (55%), customer acquisition (54%) and margin improvement (46%) follow in priority, all consistent with organisations’ strategies of relying on digital marketplaces as new business models.

Competitive differentiation, buyer retention, buyer acquisition, and social media engagement are the four most common customer-facing challenges marketplaces face today

39% of marketing execs say that differentiating from competitors is the greatest challenge, followed by buyer retention (32%), buyer acquisition (29%) and effective social media campaigns (29%). Further validation that today’s digital marketplaces are enabling greater digital transformation through personalisation is found in the fact that just 22% of respondents said customer experience is a challenge.

Marketplaces need to scale and provide a broader base of services that enable “growth as a service” to keep sellers engaged

Marketplaces need to continually provide new services and add value for buyers and sellers, fuelling growth-as-a-service. The three main reasons sellers leave a marketplace are insufficient competitive differentiation (46%), insufficient sales (33%) and marketplace service fees (31%). Additionally, sellers claim that marketing costs (28%) and the lack of buyers (26%) are critical business issues.

Lack of sellers who meet their needs (53%) is the single biggest reason buyers leave marketplaces

Buyers also abandon marketplaces due to logistical challenges including shipping costs and fees added by sellers (49%) and large geographic distances between buyers and sellers (39%). These findings underscore why marketplaces need to be very adept at creating and launching new value-added services and experiences that keep buyers active and loyal. Equally important is a robust roadmap of seller services that continually enables greater sales effectiveness and revenue potential.

HPE builds on its composable vision with new Hybrid Cloud Platform


Dale Walker

27 Nov, 2018

HPE has launched a host of new additions to its composable cloud portfolio, chief among them a hybrid cloud software stack that aims to bring the flexibility and fluidity of the public cloud, as well as a host of AI-driven storage systems, to the on-premise environment.

The company claims this represents the industry’s first hybrid cloud platform built on composability and flexible deployments, something that will help those businesses struggling to move to the cloud due to a lack of skilled stack specialists.

HPE follows the likes of Microsoft, Google, and IBM, leading technology companies that have shifted their focus over the past year towards a customer base that’s increasingly adopting multiple cloud providers or hybrid setups over a single, all-encompassing service.

It also builds on HPE’s launch of a hybrid cloud-as-a-service model, which sits inside its GreenLake financial services brand, allowing customers to pay monthly in exchange for hybrid services deployed and managed entirely by HPE. 

This new Composable Hybrid Cloud product extends that by allowing customers to essentially upgrade their traditional data centre setup to operate in the same way as a public cloud. This means that typical public cloud features, such as the fluid provisioning of resources, can now be built into a server stack, which the company claims will drastically improve the efficiency of workloads and allow for better communication between deployments.

“Our customers want to innovate faster, with greater automation and intelligence,” said Phil Davis, president of Hybrid IT at HPE. “With our new open hybrid cloud platform, enterprises of all sizes can now manage, provision and deliver workloads and apps instantly and continuously to accelerate innovation.”

The new software includes HPE’s Composable Fabric, first introduced as part of its Synergy platform, which is placed on top of a server setup built with ProLiant or Synergy servers. This works like a mesh that’s able to automatically scale up and scale down network resources based on the needs of workloads, as well as allow those processes to communicate with others on the network.

This simplifies the operations of the network by allowing it to effectively act like a hyperscale cloud provider, providing a means to adjust workload balancing on the fly, the company says. Once deployed, this is said to reduce the over-provisioning of resources by up to 70%.

InfoSight, HPE’s AI-powered analytics service, is also integrated into the software stack, which automatically predicts and prevents points of failure in the workload, reducing the amount of manual work required by operators.

«Organisations are demonstrating an ever-growing appetite for automation, scalability and openness to aggressively accelerate development and operations,» said Thomas Meyer, group VP of IDC. «With composable cloud, HPE aims to deliver a foundational pillar with those attributes in mind to help customers accelerate their digital transformation.»

Importantly, HPE’s hybrid cloud platform is open to third parties. Currently, the platform supports ProLiant server workloads built for Red Hat’s OpenShift and VMware. HPE’s Synergy infrastructure platform will also support workloads from SAP and Oracle, and its Image Streamer capability means it can support DevOps tools from the likes of Chef, Ansible, Puppet, and VMware.

HPE’s Composable Cloud platform on ProLiant rack servers will roll out to customers in the US, UK, Ireland, France, Germany and Australia starting in early 2019.

One in three CISOs view cloud as a security risk


Adam Shepherd

27 Nov, 2018

The cloud may be powering a great deal of business transformation, but many security leaders aren’t entirely happy about it, as new research reveals that one-third of CISOs view the cloud as their biggest security risk.

According to a study of 250 global CISOs and security leaders conducted by Kaspersky Lab, 30% of survey participants ranked cloud computing as the security risk they were most worried about. This outranks both legacy IT and insider threats, which were listed as the top concern by 12% and 10% of CISOs respectively.

To be more specific, it’s not just cloud computing in general that was identified as a potential danger, but cloud computing and “uncontrolled cloud expansion” by different departments and lines of business within the organisation.

This could imply that CISOs are concerned about the potential security risks introduced by HR, finance and other departments procuring their own IT on an as-a-service model, without any oversight from the security team – although only 5% of respondents specifically identified shadow IT as a risk.

The majority (86%) of CISOs believe that security breaches are inevitable, according to the research. That’s a belief that coincides with almost half of the respondents reporting that CISOs have become risk management professionals over the past few years.

“My role actually consists of one very simple paradigm: minimizing cybersecurity risks for the group,” the CISO of a Swiss construction firm told Kaspersky.

“Furthermore, when it comes to the more ‘human’ part of my role, I’m a manager of very talented cybersecurity specialists, who are targets of multiple head hunters at the moment.”

Despite this focus on risk, however, only around one third of CISOs said that assessing and managing security risks was the most important part of their job, with the majority reporting that it was the implementation and management of security solutions.

HPE moves further into the hybrid cloud space, ramps up ‘innovative enterprise’ strategy

There is a major vendor cloud conference taking place this week. No, not that one: Hewlett Packard Enterprise (HPE) has moved lock, stock and barrel to Spain this week for its Discover 2018 Madrid Conference, and taken the opportunity to update its product roadmap and strategy with it.

First up, the products. HPE has launched Composable Cloud, aimed squarely at workloads for both private and hybrid cloud environments. The overall focus was to unveil ‘the next phase of HPE’s composable strategy’ – in other words, putting together on-premise hardware, software and cloud into a single server platform.

The composable cloud has two strands: one for customers of ProLiant DL, HPE’s rack-based servers, and one for customers of its Synergy infrastructure platform. The ‘open hybrid cloud platform’ – open as in being able to provision more than one cloud provider – promises greater scalability, 90% faster deployment of new configurations, and a 97% decrease in time for lifecycle operations. As has to be the way today, Composable Cloud for ProLiant DL can also make use of HPE’s InfoSight software – ‘AI for the data centre’ as the company puts it – to offer predictive analytics and continuous learning.

Speaking of InfoSight, HPE also unveiled new machine learning capabilities for the tool, including an AI-driven resource planner, as well as the ability to self-diagnose performance issues. This has been extended to purely on-prem environments, or those with restricted access to the cloud.

Other announcements as part of a more general product tune-up included the expansion of multi-cloud storage service HPE Cloud Volumes to the UK and Ireland, HPE Memory-Driven Flash, an enterprise storage offering, and a renewal of vows with hyperconverged storage provider Cohesity, focused around hybrid cloud consolidation.

HPE’s strong position when it comes to both cloud and traditional IT infrastructure equipment – servers, routers, switches and so forth – remains, with the company competing at the top end with Cisco, Dell EMC, and Microsoft. It is upon these foundations that the company is targeting a more holistic approach looking at public, private, and hybrid cloud.

Alongside other vendors in its position, HPE is focusing on what CEO Antonio Neri calls the ‘innovative enterprise.’ Essentially, it is just a question of semantics: SAP, for instance, prefers to call it the ‘intelligent enterprise.’ Yet the concept is the same, focusing on the full enterprise digital transformation journey; connecting and making the most of data garnered from edge to cloud, sprinkled with a smattering of AI and machine learning, among other emerging technologies.

The company has spent the past 15 months beefing up its cloudy knowledge. HPE’s consultancy unit, Pointnext, has welcomed both Cloud Technology Partners, an AWS specialist, and RedPixie, an Azure house, to its bosom. In an editorial post outlining the company’s strategy to be a partner for the ‘edge-centric, cloud-enabled, data-driven enterprise of the future’, Neri noted that 90% of cloud projects which don’t involve advisory and professional services fail.

In June, HPE outlined a $4 billion investment over four years to define the ‘intelligent edge’, a term also favoured by Microsoft. While specific details were thin on the ground at the time, the announcement represented a statement of sorts, and Neri used the editorial to add more context.

“Today, 75% of data is created at the edge, but 94% of that data is untapped or lost,” Neri wrote. “This data waste is a missed opportunity and lost value. HPE knows the value of data and the insight it can give us, and that’s why we have announced our commitment to invest $4bn in the intelligent edge over the next four years.

“We’ve started at the edge because it is increasingly here that business – and life – takes place,” Neri added. “The edge is all about using technology and data to redefine the experiences and business outcomes you deliver to your end customers, making them more satisfied and loyal. For your employees, the edge helps them to be more productive and engaged, whether they are in an office, oil rig, or university.”

Ultimately, despite the different buzzwords and the eye-catching investments, vendors are all pulling in the same direction – in some cases after dragging their feet – and recognising a data-driven future, as well as the tough road ahead for enterprise digital transformation. “We believe that the cloud experience should be open and seamless across all your clouds, public, private, on-premise and off-premise,” added Neri. “Further, we believe the best cloud partner is one who is unbiased and without an agenda.

“At HPE, the cloud is an experience, not a destination. It is by engaging edge to cloud that new opportunities can be created.”

You can find out more about HPE’s Discover announcements here.
