UK Cloud Awards 2019 winners announced


Cloud Pro

17 May, 2019

County Hall in London last night played host to the sixth annual UK Cloud Awards, where accolades were awarded in a number of categories to showcase and reward the innovation and success of this country’s cloud industry.

The awards, which are put on by the Cloud Industry Forum (CIF) and Cloud Pro and supported by Platinum sponsors ScienceLogic and CDW, acknowledged and rewarded the best cloud providers, products and projects of the past year.

“From the standard and sheer number of entries we received for the awards, it’s clear that it’s an incredibly exciting time to be in the cloud industry,” said Alex Hilton, CIF’s CEO.

“Having served as a judge every year since we founded the awards, I can say that deciding the winners this year was the toughest yet, and that making the shortlist is an achievement in itself. However, there can of course only be one winner in each category, and a big congratulations is deserved by everyone who took home prizes on the night.”

The expert panel of independent judges was led by head judge Jez Back, and despite the overwhelming number of submissions, just 17 entrants were lucky enough to take awards home on the night.

“The UK Cloud Awards received yet another record number of entries this year, with a depth and quality which proves the health and strength of the UK cloud industry,” Back said.

“The judges and I were looking for examples of unrelenting focus on helping to transform the way organisations operate and delivering meaningful outcomes for their clients. I’m pleased to say that our winners delivered on both counts and I’d like to congratulate the finalists and ultimate winners on their achievements and successes.”

The winners are as follows:

Best in Class

Most Innovative Enterprise Product

Cloe from Densify

Most Innovative SMB Product

GoSimple Tax

Best Cloud Platform Solution

SuiteCloud from Oracle NetSuite

Best Cyber Security Product or Service

OnDMARC from Red Sift

Best Fintech Product or Service

Float

Best AI / ML Enabled Product or Service

Data Science from SnapLogic

Best Data Management Product or Service

Cloud Data Services from Pure Storage

Best Cloud Enabled End User Experience

Dropbox Business from Dropbox

Best Digital Transformation Projects

Public Sector / Third Sector Project

Financial Conduct Authority in partnership with Sopra Steria

Private Sector Enterprise Project

Manchester United in partnership with HCL

Private Sector SMB Project

Media Matters in partnership with Chalkline

DevOps and Functions as a Service Implementation

HeleCloud

Best in Class, Service Providers

Best Cloud Service Provider (CSP)

Workiva

Best Managed Service Provider (MSP)

Unify Communications Ltd

Cloud Migration Partner / Technical Collaboration Project

Ensono at Guinness World Records

Personal Achievement Awards

Cloud Newcomer of the Year

Myles Clarke, Product Manager at Cloud Gateway

Cloud Visionary of the Year

Chris Dunning at TechQuarters

A raffle was held on the night for the UK Cloud Awards’ official charity partner, The Sick Children’s Trust, with prizes donated by Sennheiser, Corona/3 Monkeys Zeno and Dennis Publishing.

Attendees dug deep in their pockets to raise money to support the charity’s efforts in keeping sick children and their families together.

Paul Franklin, group publisher of Channel Pro, Cloud Pro and IT Pro, added: “This year’s UK Cloud Awards was our biggest and best event yet so a huge thanks is owed to our Platinum sponsors CDW and ScienceLogic, and our Award sponsors, Fujitsu, Navisite, TechData and Veritas.

“We established the UK Cloud Awards to champion excellence and highlight the cloud innovation and leadership taking place in the UK, but they wouldn’t be possible without the backing of the industry. We fully expect to repeat the success of this year’s awards in 2020, so watch this space!”

VMware acquires Bitnami to accelerate multi-cloud and enterprise deployments

VMware has announced its intent to acquire Bitnami, a provider of application packaging and delivery, to help accelerate app delivery for multi-cloud use cases.

Bitnami, based in San Francisco, lets organisations deploy a wide range of applications from its libraries, either natively or across a host of cloud providers. The company is a graduate of Y Combinator and was notable for its reluctance to take outside funding, accepting only a seed round in May 2013.

A need for greater enterprise presence, and the funding required to build it, ultimately precipitated the move, the company said.

“This was actually an easy decision and has to do with our shared vision for the future,” wrote co-founders Daniel Lopez and Erica Brescia in a blog post. “Our mission at Bitnami is to make awesome software available to everyone, everywhere… to make that software accessible to the largest number of users and developers possible.

“Over the past years, we expanded our focus to help enterprises use Bitnami in production, often as part of a migration of their application to the cloud or their adoption of Kubernetes,” Lopez and Brescia added. “We realised that if we wanted to continue to grow, we would have to raise money, as building an enterprise salesforce is not easy to do when you are bootstrapped.

“As part of the conversations we had during this process, we realised that VMware would be the ideal partner for us. We both believe in a Kubernetes and multi-cloud future. We both share large enterprise customers, including cloud service providers. We both are building products and services to help companies navigate this multi-platform, multi-vendor world with a focus on enterprises.”

From VMware’s side, the company said it was ‘committed to maintaining the deep partnerships that Bitnami currently has with major cloud providers’. Bitnami’s website carries the logos of Amazon Web Services, Microsoft Azure, Google Cloud Platform and Oracle, alongside VMware.

“Upon close, Bitnami will enable our customers to easily deploy application packages on any cloud – public or hybrid – and in the most optimal format – virtual machine (VM), containers and Kubernetes helm charts,” wrote Milin Desai and Paul Fazzone in a blog post. “Further, Bitnami will be able to augment our existing efforts to deliver a curated marketplace to VMware customers that offers a rich set of applications and development environments in addition to infrastructure software.”


Security flaw found in Google’s “most secure” account authenticator


Connor Jones

16 May, 2019

A misconfigured Bluetooth pairing protocol in Google’s Titan security keys could allow attackers to bypass encryption and hijack user accounts, the firm has revealed.

Google has said it will start offering replacements for what it once called the “strongest, most phishing-resistant method of two-step verification (2SV) on the market today”, following the discovery of a flaw that exposes account information to anyone within Bluetooth range.

The company has assured customers that the keys, the technology for which was first launched in 2017, would still do their job, providing multi-factor authentication built to FIDO standards that is stronger than regular 2SV, but said the $50 cost would be waived for those who want a replacement unit.

“This bug affects Bluetooth pairing only, so non-Bluetooth security keys are not affected,” said Christiaan Brand, product manager, Google Cloud. “Current users of Bluetooth Titan Security Keys should continue to use their existing keys while waiting for a replacement, since security keys provide the strongest protection against phishing.”

When attempting an account sign-in, a Titan user is required to press a button on the Bluetooth key to authenticate the log-in attempt. It was discovered that immediately after this button press, attackers have a narrow window to connect their own device to the security key, which could result in the attacker logging into the user’s account from their device, provided they already had said user’s email and password.

Titan keys work by acting as another authentication step and are linked with a user’s device, such as a phone or laptop, via a Bluetooth connection. A flaw in this connection means that an attacker could trick the phone or laptop into thinking the attacker’s own device is the security key. If this is achieved, the attacker could bypass the authentication process and start to make changes to the user’s device by mimicking an external keyboard and mouse.

It could be argued that a situation where an attacker has your account credentials, knows you use a Titan key and is within 30 feet of your location would be unlikely to occur, but it’s still serious enough to have prompted Google into replacing all affected keys. Others, though, see the threat as rather more credible.

“The fact you must be within 30 feet of the security key isn’t an issue, especially when you consider how fast compiled and scripted software can run,” said Mark Miller, director of enterprise security support at Venafi. “In addition, lots of people conduct business in public places like coffee shops and airports, so connecting a dongle to a device isn’t that farfetched.”

“From a technology perspective, these keys are amazing; they make security a lot easier to consume,” he added. “However, there is no such thing as perfect technology, so I’m glad Google is taking the initiative and recalling these keys.”

Most recently, Google announced that a new form of its Titan Security Keys would be made available to all phones running Android 7.0 or later, with its line of Pixel phones getting a slightly more secure version too.

The phone as a security key (PaaSK) standard was announced at Google Cloud Next 2019; instead of keeping an external Titan Security Key to hand, all that is required is to unlock your Google account-linked Android device and press a button to approve the log-in in real time.

The Titan key was originally introduced to combat phishing attempts that exploited vulnerable 2SV methods such as confirmation codes delivered by texts – a method of communication that can be hijacked with relative ease.

In other Google news, a privacy flaw was found in Google Pay’s settings on Wednesday. Optional settings governing a user’s ability to share their creditworthiness, personal information or Google Pay account information were hidden behind a special URL rather than being accessible directly through the Google Pay account settings page.

Google has since attributed the error to a fault left over from an update and has fixed it, so the three privacy settings now appear as normal.

PencilDATA to Exhibit at @CloudEXPO | @Pencil_DATA @Chainkit #AI #CIO #FinTech #Blockchain #Chainkit #ArtificialIntelligence

PencilDATA helps organizations operate in a trusted environment by making it easy to take advantage of advanced blockchain technologies. Chainkit is a cloud-based API from PencilDATA that enables organizations to improve the data integrity in their existing systems and business processes using the immutable strength of blockchain.
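Press-release language aside, the core idea is simple: fingerprint a record with a cryptographic hash and anchor that hash somewhere immutable, so any later change to the record is detectable. The following Python sketch is a generic illustration of that principle only – it is not Chainkit’s actual API, and every name in it is hypothetical.

    import hashlib
    import json

    def fingerprint(record: dict) -> str:
        """Compute a deterministic SHA-256 fingerprint of a record.
        Anchoring this hash in an immutable external store (such as a
        blockchain ledger) makes later tampering detectable."""
        canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

    def verify(record: dict, anchored_hash: str) -> bool:
        """Re-hash the record and compare it against the anchored value."""
        return fingerprint(record) == anchored_hash

    # Anchor at write time, verify at read time.
    invoice = {"id": 42, "amount": 199.99, "currency": "GBP"}
    anchored = fingerprint(invoice)       # store this hash immutably
    assert verify(invoice, anchored)      # intact record passes
    invoice["amount"] = 1.99              # tamper with the record...
    assert not verify(invoice, anchored)  # ...and verification fails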

The PencilDATA founding team is made up of industry veterans, with deep experience in enterprise software, security, data, and cloud, and is supported by a world-class team of advisors and investors.


Google Cloud launches Osaka data centre region, cites strategic importance of Japan

Google Cloud has announced the official launch of its Osaka cloud data centre region, making it the second facility in Japan and the 20th overall.

The Osaka region, as with all Google Cloud facilities aside from Iowa, has three availability zones and comes with the standard product set, including Compute Engine, App Engine, Google Kubernetes Engine, Cloud Bigtable, Cloud Spanner, and BigQuery.

The move shows the company is pressing ahead at a steady clip following the opening of the Zurich data centre region in March. According to Google Cloud’s map, Seoul, Jakarta, and Salt Lake City are next on the list, pencilled in for the first half of 2020.

Writing in a blog post confirming the launch, Google Cloud CEO Thomas Kurian noted the strategic importance of Japan – and that with two cloud regions in the country, customers will benefit from greater business continuity, infrastructure, and more partners.

“We’ve invested more than $47 billion in our global infrastructure in the last three years, and we continue to invest to fuel the digital transformation journey of our customers across more than 150 countries,” wrote Kurian. “These key elements, combined with our global partner ecosystem and expanded go-to-market organisation, enable Google Cloud to be a strategic partner to the world’s leading enterprises.”

As ever with these things, a satisfied customer was rolled out, in the shape of beverage manufacturer Asahi Breweries. “We’re looking forward to the Osaka cloud region,” said Tatsuhito Chiku, corporate officer and general manager of IT at Asahi Group Holdings. “Using Google Cloud Platform services like BigQuery has enabled us to build a system with low latency and high resiliency, and the Osaka cloud region will further improve our system availability and achieve business continuity.”

Google was also placed as a leader for the first time in Gartner’s most recent infrastructure as a service (IaaS) Magic Quadrant focusing specifically on Japan. This makes for an interesting analysis; when looking at the leaders, it remains an AWS-Microsoft-Google 1-2-3, with IBM as the only ‘visionary’ and a host of local providers, from Fujitsu to IIJ (Internet Initiative Japan) as niche players.



Why legacy IT vendors are seeking cloud niche viability

Leading hyperscale cloud service providers continue to disrupt the traditional IT infrastructure vendor landscape, as more enterprise CIOs and CTOs expand their adoption of multi-cloud strategies that marginalise the remaining use cases for on-premises data centres.

Legacy IT vendors that were reluctant to evolve their business models will now seek niche cloud market segments where they can differentiate their offerings. There's no viable growth path based on hardware or software status-quo assumptions; distinctive professional services, however, remain a source of new opportunities.

Cloud computing market development

The worldwide public cloud services market is projected to grow 17.5 percent in 2019 to total $214.3 billion — that's up from $182.4 billion in 2018, according to the latest global market study by Gartner.

The fastest-growing market segment will be cloud system infrastructure services, or infrastructure as a service (IaaS), which is forecast to grow 27.5 percent in 2019 to reach $38.9 billion — that's up from $30.5 billion in 2018.

The second-highest growth rate of 21.8 percent will be achieved by cloud application infrastructure services, or platform as a service (PaaS).

"Cloud services are definitely shaking up the industry," said Sid Nag, vice president at Gartner. "At Gartner, we know of no vendor or service provider today whose business model offerings and revenue growth are not influenced by the increasing adoption of cloud-first strategies in organisations. What we see now is only the beginning, though."

Through 2022, Gartner projects the market size and growth of the cloud services industry at nearly three times the growth of overall IT services.

According to recent Gartner surveys, more than a third of organisations see cloud investments as a top-three investment priority, which is impacting market offerings. Gartner expects that by the end of 2019, more than 30 percent of technology providers’ new software investments will shift from cloud-first to cloud-only.

This means that license-based software consumption will further plummet, while SaaS and subscription-based cloud consumption models continue their rise.

"Organisations need cloud-related services to get on-boarded onto public clouds and to transform their operations as they adopt public cloud services," said Mr. Nag.

Currently, almost 19 percent of cloud budgets are spent on cloud-related services, such as cloud consulting, implementation, migration and managed services, and Gartner expects that this rate will increase to 28 percent by 2022.

Outlook for cloud computing application growth

According to the Gartner assessment, as cloud computing continues to become mainstream within most organisations, technology product managers for cloud-related service offerings will need to focus on delivering solutions that combine experience and execution with hyperscale providers’ offerings.

This complementary approach will drive both the transformation and optimisation of an organisation’s IT infrastructure and operations. This vendor coexistence model is the multi-cloud market reality.


Unravel Data secures $35 million series C funding to help combine APM with big data muscle

Meet Unravel Data. The California-based big data software provider has secured $35 million (£27.1m) in a series C funding round – with the aim of revamping the application performance management (APM) space.

The round was led by Point72 Ventures, with participation from Harmony Partners, as well as existing investors Menlo Ventures, GGV Capital, and M12, formerly Microsoft Ventures.

The company’s vision rests on the thesis that the current leaders in the APM and log management market – AppDynamics, New Relic, Splunk et al – cannot keep up with modern, data-driven, AI-enabled IT architectures. Unravel offers full-stack observability for Azure and AWS, as well as the big data platforms Cloudera, Hortonworks and MapR; its customers include Deutsche Bank, Neustar and, perhaps not altogether surprisingly, Microsoft.

“CIOs in our network told us story after story of traditional application monitoring tools failing in a big data context because those tools were designed for the world of the past,” said Matthew Grande, chief market intelligence officer at Point72. “This new architecture requires a different product, one built from the ground up to focus on the unique challenges posed by big data applications.

“Unravel is poised to capture this emerging big data APM market,” added Grande.

The funding comes alongside various indicators of the company’s success. In what PR types call ‘momentum announcements’ – because there is no hard news to speak of – the company cites annual recurring revenue (ARR) growth of 500% and the launch of Microsoft Azure and AWS cloud partner ecosystems as particular highlights.

Ultimately, there is major currency in helping organisations make the most of their deluge of data. For some vendors, it is all about helping customers filter the signal from the noise and glean insights from their data. For others, such as Unravel, it is about stable performance.

“Every business is becoming a data business, and companies are relying on their data applications such as machine learning, IoT, and customer analytics, for better business outcomes using technologies such as Spark, Kafka, and NoSQL,” said Kunal Agarwal, Unravel Data CEO. “We are making sure that these technologies are easy to operate and are high performing so that businesses can depend on them. We partner with our customers through their data journey and help them successfully run data apps on various systems whether on-premises or in the cloud.”

Total funding for the company now stands at $57.2 million.


Adobe sends out a legal warning to Creative Cloud users


Bobby Hellard

14 May, 2019

Adobe has sent Creative Cloud users a legal warning that they could be at risk of potential claims of infringement by third parties if they continue using outdated versions of its apps.

Adobe Creative Cloud is a set of applications and services that gives subscribers access to graphic design, video editing, web development and photography software, such as Photoshop and Lightroom. It also includes a set of mobile applications and some optional cloud services, all under a monthly or annual subscription.

Last week, Adobe said that older versions of Creative Cloud apps would no longer be available to subscribers, which it has explained in a blog post.

“We recommend all customers use the latest release of Creative Cloud for optimal performance and benefits,” it said. “Please note that going forward, Creative Cloud customers will only have direct download access (from the Creative Cloud Desktop app and Adobe.com) to the two most recent major versions of Creative Cloud desktop applications.”

Now, however, the company is warning users of potential legal claims from third parties and saying that they are no longer licensed to use its older apps if they choose to do so.

“Adobe recently discontinued certain older versions of Creative Cloud applications. Customers using those versions have been notified that they are no longer licensed to use them and were provided guidance on how to upgrade to the latest authorised versions,” it said in a statement to Gizmodo.

“Unfortunately, customers who continue to use or deploy older, unauthorised versions of Creative Cloud may face potential claims of infringement by third parties. We cannot comment on claims of third-party infringement, as it concerns ongoing litigation.”

While most might just shrug their shoulders and think “why not just update?”, it is important to note the many reasons users choose to stick with legacy software. Some might be using older hardware and as such might not have the specs to run increasingly bloated software.

There are also those who like what they have and harbour some scepticism about updates – such as the people still using Windows 7. While cloud-based services definitely have their benefits, the episode highlights the issue that you essentially do not own the software you’re paying for.

Data visibility: The biggest problem with public clouds


Keri Allan

14 May, 2019

Use of public cloud continues to grow. In fact, 84% of businesses had placed additional workloads into the public cloud in 2018, according to a recent report by Dimension Research. Almost a quarter of those (21%) reported that their increase in public cloud workloads was significant.

However, while respondents were almost unanimous (99%) in their belief that cloud visibility is vital to operational control, only 20% of respondents said they were able to access the data they need to monitor public clouds accurately.

“If there’s any part of your business – including your network – which you can’t see, then you can’t determine how it’s performing or if it is exposing your business to risks such as poor user experience or security compromise,” points out Scott Register, vice president, product management at Ixia, the commissioner of the report.

This sounds like a major issue and yet surprisingly, it’s nothing new. Tony Lock, distinguished analyst and director of engagement at Freeform Dynamics, has been reporting on visibility issues for over five years, and not just regarding public cloud.

“Believe it or not, despite people having had IT monitoring technology almost since IT began, we still don’t have good visibility into a lot of systems,” he tells us. “Now we’re getting so much more data thrown at us, visibility is even more of a challenge – just trying to work out what’s important through all of the noise.”

He adds that for many years public cloud providers were slow to improve their services and make it easier for organisations to see what’s happening, largely because the providers handled it all for their customers.

“To a degree, you can understand why [providers] didn’t focus on monitoring to begin with, as they’ve got their own internal monitoring systems and they were looking after everything. But if a customer is going to use them for years and years then they want to see what’s in there, how it’s being used and if it’s secure.”

The cost of zero visibility

A lack of visibility in the public cloud is a business risk in terms of security, compliance and governance, but it can also affect business costs. For example, companies may be unaware that they’re paying for idle virtual machines unnecessarily.

Then there’s performance. Almost half of those that responded to Ixia’s survey stated that a lack of visibility has led to application performance issues. These blind spots hide clues key to identifying the root cause of a performance issue, and can also lead to inaccurate fixes.

Another issue relates to legal requirements and data protection. With a lack of visibility, some businesses may not be aware that they have customer information in the public cloud, which is a problem when “the local regulations and laws state it should not be stored outside of a company’s domain”, highlights Lock.

Then there are the complexities around data protection and where the liability sits should a data breach occur.

“Often a daisy chain of different companies is involved in cloud storage, with standard terms and conditions of business, which exclude liability,” explains BCS Internet Specialist Group committee member Peter Lewinton. “This can leave the organisation that collected the data [being] liable for the failure of a key supplier somewhere in the chain – sometimes without understanding that this is the position. This applies to all forms of cloud storage, but there’s less control with the public cloud.”

Understandably, security continues to be a big concern for enterprises. The majority (87%) of those questioned by Ixia said they’re concerned that their lack of visibility obscures security threats, but it’s also worth noting that general security concerns regarding public cloud still apply.

What’s the solution?

Lock believes that things are changing and vendors are beginning to listen to the concerns of customers. Vendors have started to make more APIs available and several third-party vendors are also creating software that can run inside virtualised environments to feed back more information to customers. “This move is partly down to customer pressure and partly down to market maturity,” he notes.

Ixia’s Scott Register recommends either a physical or virtual network tap that effectively mirrors traffic on a network segment or physical interface to a downstream device for monitoring.

“These taps are often interconnected with specialised monitoring gear such as network packet brokers, which can provide advanced processing, such as aggregation, decryption, filtering and granular access controls. Once the relevant packets are available, hundreds of vendors offer specialised tools that use the packet data for application or network performance monitoring as well as security analytics.”
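To make that pipeline concrete, here is a minimal Python sketch of one downstream consumer of mirrored traffic, using scapy to tally packets per TCP port as a crude performance signal. The interface name is hypothetical, and real monitoring gear does far more than this.

    # Requires scapy (pip install scapy) and root privileges; "tap0" is
    # a hypothetical interface receiving the mirrored traffic.
    from collections import Counter

    from scapy.all import TCP, sniff

    port_counts = Counter()

    def tally(packet):
        # Count traffic per destination port (e.g. 443 for HTTPS).
        if packet.haslayer(TCP):
            port_counts[packet[TCP].dport] += 1

    # Capture 1,000 mirrored packets, then report the busiest ports.
    sniff(iface="tap0", prn=tally, count=1000, store=False)
    print(port_counts.most_common(5))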

Are vendors really to blame?

Although many businesses suffer with poor public cloud visibility, Owain Williams, technical lead at Vouchercloud, believes customers are too quick to blame the provider. He argues that there are many reliable vendors already providing the necessary access tools and that a lack of visibility is often down to the customer.

“This is my experience; as such, it’s often entirely solvable from within the business. The main providers already give you the tools you need. Businesses can log every single byte going in and out if they wish – new folders, permission changes, alerts; all the bells and whistles. If the tools themselves are inefficient, then businesses need to re-evaluate their cloud provider.”

Instead, he believes that many of the visibility problems that businesses encounter can be traced back to those managing infrastructure – employees that may be in need of extra training and support.

“Better education for people – those charged with provisioning the infrastructure – is a strong first port of call,” he argues. “It’s about ensuring the businesses and individuals have the right training and experience to make the most of their public cloud service. The tools already exist to assure visibility is as robust as possible – it’s provided by these large public cloud organisations. Invariably, it’s a case of properly identifying and utilising these tools.”
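As one concrete illustration of that point, assuming AWS and the boto3 library: server access logging for an S3 bucket is a built-in feature that simply needs switching on. The bucket names below are hypothetical.

    import boto3

    s3 = boto3.client("s3")

    # Record every request made against "my-data-bucket" into
    # "my-audit-bucket" under the access-logs/ prefix.
    s3.put_bucket_logging(
        Bucket="my-data-bucket",
        BucketLoggingStatus={
            "LoggingEnabled": {
                "TargetBucket": "my-audit-bucket",
                "TargetPrefix": "access-logs/",
            }
        },
    )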

Four key ways to optimise your cloud spending in 2019: A guide

The last six years have seen a rapid acceleration in the use of cloud computing. Enterprises of all sizes and across industries are now turning to cloud infrastructure and pursuing cloud-first strategies. As IT departments increasingly shift workloads into public clouds while they simultaneously convert existing on-premises virtualised environments to support cloud-like capabilities, optimising cloud spending has become an intense focus for many organisations.

In the RightScale 2019 State of the Cloud Report from Flexera, cloud users who were surveyed on their adoption of cloud identified two top priorities that, not surprisingly, correlate: moving more workloads to public cloud and optimising cloud costs.

As the use of public cloud has grown, so has the amount of spend. Public cloud spending is quickly becoming a significant new line item in IT budgets, especially among larger companies that typically run more workloads in cloud. More than 50 percent of enterprises are spending in excess of $1.2 million per year on public cloud in 2019.

Total annual revenue for the top public cloud providers Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform exceeded $30B last year (the exact number is unknown because only Amazon breaks out revenue for its cloud infrastructure unit separately from its other cloud services). Much of that revenue comes at the expense of cloud users not understanding how they can prevent wasted cloud costs.

Cloud users can do more to optimise cloud spend

Even though managing cloud costs is a top initiative for cloud users in 2019, they continue to underestimate the amount of wasted cloud spend. Respondents estimated 27 percent waste, while Flexera has measured actual waste at 35 percent.

One reason there is so much waste in public cloud is the complexity of cloud pricing and billing: cloud providers offer hundreds of thousands of price points, and bills often have millions of line items. Add to that the constant change in pricing, the steady introduction of new features and the decentralisation of cloud use, and it becomes apparent why it’s difficult to get a handle on cloud costs.
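At that volume, tooling beats eyeballing. As a minimal sketch, assuming a billing export in CSV form with hypothetical column names (real exports vary by provider), a few lines of Python with pandas roll millions of line items up into one figure per service:

    import pandas as pd

    bill = pd.read_csv("billing_export.csv")  # hypothetical export file

    # Aggregate line items into a single cost per service, largest first.
    by_service = (
        bill.groupby("service_name")["unblended_cost"]
            .sum()
            .sort_values(ascending=False)
    )
    print(by_service.head(10))  # the ten costliest services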

There are four key ways to address the problem of overspending on cloud:

Identify wasted cloud spend

Common areas where companies waste cloud spend include:

  • Running instances 24×7: Rather than leaving instances running when they are not needed, implement auto-scaling for production workloads as well as automated scheduling so that development environments run only during working hours or when needed
  • Overprovisioning instances: Typically about 40 percent of instance spend is on VMs running at below 40 percent of CPU and memory utilisation, with the majority of those running under 20 percent utilisation. By downsizing, you can significantly reduce costs (see the sketch after this list)
  • Neglecting to clean up old storage data: Deleting storage volumes that are no longer attached to instances, as well as deleting old snapshots that are beyond your snapshot retention policy, can save you money
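As a minimal sketch of spotting the second kind of waste, assuming AWS, boto3 and CloudWatch metrics, the following flags running instances whose average CPU sat below 20 percent over the past week. The threshold and window are illustrative.

    from datetime import datetime, timedelta, timezone

    import boto3

    ec2 = boto3.client("ec2")
    cloudwatch = boto3.client("cloudwatch")
    now = datetime.now(timezone.utc)

    reservations = ec2.describe_instances(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )["Reservations"]

    for reservation in reservations:
        for instance in reservation["Instances"]:
            instance_id = instance["InstanceId"]
            # Pull one daily average CPU datapoint for the past week.
            stats = cloudwatch.get_metric_statistics(
                Namespace="AWS/EC2",
                MetricName="CPUUtilization",
                Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
                StartTime=now - timedelta(days=7),
                EndTime=now,
                Period=86400,
                Statistics=["Average"],
            )
            datapoints = stats["Datapoints"]
            if not datapoints:
                continue
            avg_cpu = sum(d["Average"] for d in datapoints) / len(datapoints)
            if avg_cpu < 20.0:
                print(f"{instance_id}: {avg_cpu:.1f}% average CPU - downsizing candidate")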

Take advantage of discounts

Another significant route to cost savings is to leverage discounts from the cloud providers by committing to instance usage. Each cloud provider has different rules and approaches to its discounts, but the basic principles are the same and the savings can be significant.
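The arithmetic behind commitment discounts is worth working through once. The prices below are invented for illustration – they are not actual provider rates – but the break-even logic is the same everywhere: commit only where expected usage clears the break-even point.

    # Hypothetical prices for one instance type.
    on_demand_hourly = 0.10    # $/hour, pay as you go
    committed_hourly = 0.062   # $/hour with a one-year commitment
    hours_per_year = 8760

    on_demand_annual = on_demand_hourly * hours_per_year   # $876
    committed_annual = committed_hourly * hours_per_year   # ~$543
    saving = on_demand_annual - committed_annual           # ~$333 (38%)

    # Break-even: the commitment pays off once you would have bought
    # at least this many on-demand hours anyway.
    break_even_hours = committed_annual / on_demand_hourly  # ~5,431 hours
    utilisation_needed = break_even_hours / hours_per_year  # ~62%
    print(f"Commit if expected utilisation exceeds {utilisation_needed:.0%}")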

Collaborate to save more

Because centralised cloud governance teams aren’t the owners of the cloud resources, the biggest challenge they face is taking action on areas of waste. Once they have uncovered possible savings opportunities, they need to collaborate with the resource owners to take action. A typical collaboration process would include several steps by various stakeholders:

  • Cloud governance team identifies recommendations for savings and shares recommendations with resource owners
  • Resource owners can then identify recommendations that should be ignored (a recommendation to downsize an underutilised instance may not be appropriate if it is used for a disaster recovery scenario, for example), take action on recommendations, and share recommendations with other team members
  • Cloud governance team then reviews the results of the actions and reports on savings

Leverage automated policies for cost control

The agility offered by cloud results in resources often being provisioned with no delays for approval processes, so costs can quickly escalate if there are no governance processes in place to uncover waste and ensure efficiency. If your enterprise, like many, is manually managing your cloud costs rather than using automated policies, you could be missing opportunities to significantly reduce wasted spend.

The key is to remember that cloud cost optimisation is not a once-and-done exercise. By using automated policies to continually optimise – covering both quick fixes, such as identifying idle and underutilised instances, and more complex scenarios, such as Reserved Instance planning – you can reduce wasted spend over the long term.

Optimising cloud costs delivers instant savings

Enterprises in virtually every industry are adopting cloud to transform their IT organisations to help them accelerate innovation, expand market reach, and drive IT costs down — all while unlocking investment capital that was previously tied up in costly data centres.

The variable cost model of cloud with monthly billing cycles enables you to immediately realise savings the minute that you turn off a resource or scale it down. But ongoing management of cloud costs must be more than a one-time event or once-a-quarter focus. Companies that become proficient in continuous, automated processes to monitor and optimise cloud spend will save the most over time.

For more information, please download the RightScale 2019 State of the Cloud Report from Flexera.
