
A guide to enterprise cloud cost management – understanding and reducing costs

For the enterprise, managing cloud costs has become a huge problem. Public cloud continues to grow in popularity, and top providers such as Amazon Web Services, Microsoft Azure and Google Cloud offer competitive prices to attract enterprises. But your search for savings shouldn't stop there. There are many factors – some of which IT teams initially overlook – that can increase a public cloud bill. Fortunately, organisations can avoid unwanted billing surprises with a smart cloud cost management strategy.

Enterprises progressing through their cloud adoption need to ensure that they have cost management strategies in place to control their spend as they continue to migrate services to cloud providers. Let’s examine some cloud cost management strategies that you can use to reduce your cloud costs immediately.

The challenges of managing cloud costs

Cloud infrastructure offers many benefits for organisations, but it also presents a variety of challenges. The benefits are easily seen – scalability, control, security and so on – but it's also important to understand how moving to the cloud impacts your organisation. A major factor that contributes to the challenge of cloud cost management is the difficulty organisations have in tracking and forecasting usage. Unpredictable costs can be one of the biggest cloud management pain points.

The ability to scale up and down on demand has allowed resource procurement to move from the sole ownership of the finance or procurement team to stakeholders across IT, DevOps and other functions. This democratisation of procurement has created an ever-growing group of cost-conscious stakeholders who are now responsible for understanding, managing and optimising costs.

Before you move your infrastructure to the cloud, it is important to evaluate how much the public cloud will cost. Like any IT service, the public cloud can introduce unexpected charges.

The first step of a cloud cost management strategy is to look at the public cloud providers' billing models. Take note of how much storage, CPU and memory your applications require, and which cloud instances would meet those requirements. Then, estimate how much those applications will cost in the cloud. Compare your estimates to how much it currently costs to run those apps on premises. Some workloads are more cost-effective when run in-house due to data location and other factors.
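To make that comparison concrete, here is a minimal sketch of the kind of back-of-the-envelope estimate involved, assuming illustrative (not real) prices and resource figures; substitute your provider's published rates and your own on-premises numbers.

```python
# Rough monthly cost comparison for a single workload.
# All prices and resource figures below are illustrative placeholders,
# not real quotes -- substitute your provider's published rates.

HOURS_PER_MONTH = 730

def monthly_cloud_cost(instance_hourly_rate, storage_gb, storage_rate_per_gb,
                       egress_gb, egress_rate_per_gb):
    """Estimate the monthly cost of running one workload in the cloud."""
    compute = instance_hourly_rate * HOURS_PER_MONTH
    storage = storage_gb * storage_rate_per_gb
    egress = egress_gb * egress_rate_per_gb
    return compute + storage + egress

def monthly_onprem_cost(hardware_cost, amortisation_months, power_and_cooling,
                        admin_overhead):
    """Amortise hardware and add recurring operating costs."""
    return hardware_cost / amortisation_months + power_and_cooling + admin_overhead

cloud = monthly_cloud_cost(0.096, 500, 0.10, 200, 0.09)   # e.g. one mid-size instance
onprem = monthly_onprem_cost(12000, 36, 150, 400)
print(f"Cloud: ${cloud:,.2f}/month  On-prem: ${onprem:,.2f}/month")
```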

When using multiple public cloud providers, integration and other factors can lead to unexpected fees. Think ahead and plan application deployments to see where you might incur additional costs. Also, look at your cloud bill and see what you are charged for access, CPU and storage. The ability to track spending across more than one cloud is invaluable.
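As one example of tracking spend programmatically, the sketch below uses the AWS Cost Explorer API (via boto3) to break down last month's charges by service. It assumes credentials with the relevant billing permission; other providers expose comparable billing and export APIs for the same purpose.

```python
# Sketch: break down last month's AWS spend by service with Cost Explorer.
# Assumes boto3 credentials with ce:GetCostAndUsage permission.
import boto3
from datetime import date, timedelta

ce = boto3.client("ce")

end = date.today().replace(day=1)                 # first day of this month
start = (end - timedelta(days=1)).replace(day=1)  # first day of last month

response = ce.get_cost_and_usage(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for period in response["ResultsByTime"]:
    for group in period["Groups"]:
        service = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        if amount > 0:
            print(f"{service}: ${amount:,.2f}")
```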

Before you commit to a cloud vendor, you have to understand your business requirements and examine what a certain vendor is offering. At first glance, most vendors have similar packages and prices, but when you examine them in detail, you might discover, for example, that one vendor has a dramatically lower price for certain types of workloads.

Organisations should also avoid vendor lock-in. Moving workloads from one cloud vendor to another can sometimes be difficult. Organisations sometimes end up paying higher prices than necessary because they didn't do their homework upfront and it is subsequently too difficult to migrate applications or workloads after they are in production.

Key areas where you can cut your cloud costs

To reduce your cloud costs, you must first identify waste by uncovering inefficient use of cloud resources. Cloud cost management is not a one-and-done process, but you can immediately start saving money on your cloud infrastructure costs if you address key areas that account for the majority of wasted cloud spend and budget overruns.

Ensure teams have the direct ability to see what they are spending. It's easy to get carried away spinning up services if you don't know exactly what you are already spending. Identify what you have, and who owns it. Tag resources with user ownership, cost centre information and creation time to give you a better handle on where the spend originates. This information can then be used to track usage through detailed billing reports.
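As a rough illustration, the following sketch tags EC2 instances with owner, cost centre and creation-time information using boto3; the instance ID and tag values are hypothetical placeholders for your own naming convention.

```python
# Sketch: tag EC2 instances with owner, cost centre and created time so that
# detailed billing reports can attribute spend. Tag keys/values here are
# illustrative -- use your organisation's own tagging convention.
import boto3
from datetime import datetime, timezone

ec2 = boto3.client("ec2")

instance_ids = ["i-0123456789abcdef0"]   # hypothetical instance ID

ec2.create_tags(
    Resources=instance_ids,
    Tags=[
        {"Key": "Owner", "Value": "data-platform-team"},
        {"Key": "CostCentre", "Value": "CC-1042"},
        {"Key": "CreatedTime", "Value": datetime.now(timezone.utc).isoformat()},
    ],
)
```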

Once you have a handle on what your spend is, set budgets per account. Doing this after establishing a baseline ensures that you are setting practical, realistic budgets based on actual usage. Consider whitelisting instance types (for RDS and EC2) so that only instances of specific types (e.g. t2.medium), families (e.g. t2.*) or sizes (e.g. *.micro, *.small, *.medium) are allowed.
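One possible way to enforce such a whitelist on AWS, sketched below under the assumption of a customer-managed IAM policy, is to deny ec2:RunInstances for any instance type outside an allowed list; the policy name and the allowed types are illustrative.

```python
# Sketch: an IAM policy that denies launching EC2 instances outside a
# whitelist of instance types, created as a customer-managed policy.
# The policy name and the allowed types are assumptions for illustration.
import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Action": "ec2:RunInstances",
            "Resource": "arn:aws:ec2:*:*:instance/*",
            "Condition": {
                "StringNotEquals": {
                    "ec2:InstanceType": ["t2.micro", "t2.small", "t2.medium"]
                }
            },
        }
    ],
}

iam.create_policy(
    PolicyName="RestrictEC2InstanceTypes",
    PolicyDocument=json.dumps(policy_document),
)
```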

Prevent staff from provisioning unapproved virtual instances from the marketplace that include software licence costs, or from using specific OS or DB engines from vendors with whom you do not have enterprise agreements in place, or that are too costly to run at scale. Review in which regions you have services running. The cost of services per region can vary by as much as 60%, so you need to balance the need to run services in a given region against the cost of doing so. You can use instance scheduling to start and stop instances on a planned schedule. Shutting down environments on nights and weekends can save up to 70% of runtime costs. Determine which environments need 24×7 availability, and schedule the rest.
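A minimal scheduling sketch, assuming instances carry a hypothetical Schedule=office-hours tag and that the function runs on a scheduled trigger each evening, might look like this:

```python
# Sketch: stop running EC2 instances tagged Schedule=office-hours outside
# working hours. The tag name/value are assumptions; in practice this would
# run on a cron-style trigger (e.g. a scheduled Lambda) each evening.
import boto3

ec2 = boto3.client("ec2")

def stop_office_hours_instances():
    paginator = ec2.get_paginator("describe_instances")
    to_stop = []
    for page in paginator.paginate(
        Filters=[
            {"Name": "tag:Schedule", "Values": ["office-hours"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    ):
        for reservation in page["Reservations"]:
            to_stop.extend(i["InstanceId"] for i in reservation["Instances"])
    if to_stop:
        ec2.stop_instances(InstanceIds=to_stop)
    return to_stop

if __name__ == "__main__":
    print("Stopped:", stop_office_hours_instances())
```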

Manage your storage lifecycle by ensuring that you are rotating logs and snapshots regularly, and by backing up and removing any storage volumes that are no longer in use. Ensure that you are using only one CloudTrail configuration and add additional trails only when absolutely necessary. Also, ensure that sandbox or trial accounts are only used for exploration purposes and for the duration committed.
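As an illustrative aid for the storage clean-up above, the sketch below reports unattached EBS volumes and snapshots older than an assumed 90-day window; the deletion calls are left commented out so findings can be reviewed first.

```python
# Sketch: report unattached EBS volumes and snapshots older than a cut-off
# so they can be reviewed and removed. The 90-day retention window is an
# assumption; deletion is deliberately left commented out.
import boto3
from datetime import datetime, timedelta, timezone

ec2 = boto3.client("ec2")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

# Volumes with status "available" are not attached to any instance.
orphaned = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]
for vol in orphaned:
    print(f"Unattached volume {vol['VolumeId']} ({vol['Size']} GiB)")
    # ec2.delete_volume(VolumeId=vol["VolumeId"])

old_snapshots = [
    s for s in ec2.describe_snapshots(OwnerIds=["self"])["Snapshots"]
    if s["StartTime"] < cutoff
]
for snap in old_snapshots:
    print(f"Old snapshot {snap['SnapshotId']} from {snap['StartTime']:%Y-%m-%d}")
    # ec2.delete_snapshot(SnapshotId=snap["SnapshotId"])
```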

Another technological solution that can help to reduce operating expenses is the use of containers. Often used by IT teams taking DevOps approaches, containers package applications together with all their dependencies, making them easier to deploy, manage and/or migrate from one environment to another.

Last, but not least, use a cloud cost management vendor. Many organisations decide that tackling these cost optimisation chores on their own takes too much time and skill. Instead, they leverage services from a reputable cloud cost management vendor. Cloud cost management remains one of the major pain points organisations face when migrating to the cloud, and cloud costs can be difficult to estimate due to the complexity of cloud infrastructure.


Does the rise of edge computing mean a security nightmare?

What do we mean by edge computing? In a nutshell, with edge computing you are processing data near the edge of your network, where the data is being generated, instead of relying on the cloud – or, more specifically, a collection of data centres.

As a relatively new methodology, computing at the edge invites new security challenges because you are looking at new setups and new architectures. Some say that you have to rely on vendors to secure your environment when you start computing at the edge. Those who champion edge computing claim that it is safer because data is not travelling over a network, but others see edge computing as less secure because, for example, IoT devices are easily hacked.

And there are many ways to think about edge computing, including smartphones. After all, consider the security and privacy features of a smartphone: by encrypting and storing biometric information on the phone itself, you effectively take those security concerns away from the cloud and place them ‘next’ to the user, on their device.

With edge computing, you are effectively running your code on the edge, and that brings specific security challenges because the code is not within your stack or within your security environment. Even though it is running on the edge, it may still sometimes require queries from the back end, from the application. This is the main security concern when running a serverless environment and, in general, when running code on the edge. Where IoT devices are concerned, you run some of the code on the device itself (your mobile device or your IoT device), and you need to secure this too.
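One common pattern for securing those queries between edge code and the back end, sketched here with simplified assumptions about key management, replay protection and header names, is to sign each request with a shared secret and verify it on the other side:

```python
# Sketch: sign each request body exchanged between code at the edge and a
# back-end application with a shared secret (HMAC) and verify it server-side.
# Key handling, replay protection and header names are simplified assumptions.
import hashlib
import hmac
import time

SHARED_SECRET = b"replace-with-a-secret-from-a-key-management-service"

def sign_request(body: bytes) -> dict:
    """Edge side: produce headers carrying a timestamp and signature."""
    timestamp = str(int(time.time())).encode()
    signature = hmac.new(SHARED_SECRET, timestamp + body, hashlib.sha256).hexdigest()
    return {"X-Timestamp": timestamp.decode(), "X-Signature": signature}

def verify_request(body: bytes, headers: dict, max_age_seconds: int = 300) -> bool:
    """Back-end side: reject stale or tampered requests."""
    timestamp = headers["X-Timestamp"].encode()
    if abs(time.time() - int(timestamp)) > max_age_seconds:
        return False
    expected = hmac.new(SHARED_SECRET, timestamp + body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers["X-Signature"])

payload = b'{"sensor": "edge-42", "reading": 21.7}'
headers = sign_request(payload)
assert verify_request(payload, headers)
```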

The massive proliferation of end-user endpoint devices could turn out to be an edge computing headache for many organisations. A single user might have multiple devices connected to the network simultaneously, and the same user will undoubtedly mix personal and professional data (and applications/profiles) on a single device. In most scenarios, endpoint security tends to be less than robust, meaning this user could unwittingly expose the organisation to serious risk, accompanying losses or malware. Many of these devices are not only very insecure, but they can’t even be updated or patched – a perfect target for hackers.

And 5G will certainly cement the era of edge computing. In general, 5G should be a wonderful thing because it will accelerate the use and development of real-time applications. But when more data is going through a device, you need more control over that data, and you will need tools that allow an organisation to manage it from a security perspective.

The IoT and 5G relationship will see huge numbers of IoT devices feeding a huge amount of data to the edge. Currently, however, none of the security protocols for IoT are standardised, which highlights the biggest security risk of 5G. That is to say, your smart fridge in the kitchen currently has no standard for how it secures and authenticates itself with other smart devices. Base-level security controls are therefore required to mitigate such risks.

In the wider business world there will be a massive shift of computing function to the edge. As organisations rely less and less on data centres and compute ends up virtually ‘next’ to the workforce, securing the endpoint edge means encrypting communications and ensuring that security devices are able to inspect that encrypted data at network speeds. Devices also need to be automatically identified at the moment of access, with appropriate policies and segmentation rules applied without human intervention. They also need to be continuously monitored, while their access policies need to be automatically distributed to security devices deployed across the extended network.

Organisations ultimately want to protect their data and they want to protect their production. When you are computing at the edge you are working with data at the edge and not in your workload. From a security point of view, therefore, you need to secure the data both in transit and at rest. This security challenge is currently handled largely by vendors and, ultimately, by the security protocols underwritten by the big cloud providers, such as AWS.
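For data at rest on an edge device, one illustrative approach is symmetric encryption with a managed key; the sketch below uses the cryptography package's Fernet primitive, with a locally generated key standing in for one supplied by a KMS or hardware secure element (data in transit would typically rely on TLS):

```python
# Sketch: encrypting data at rest on an edge device with symmetric encryption
# (Fernet from the "cryptography" package). In practice the key would come
# from a hardware secure element or a cloud KMS rather than being generated locally.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # assumption: stand-in for a managed key
fernet = Fernet(key)

reading = b'{"device": "edge-42", "temperature": 21.7}'
stored = fernet.encrypt(reading)       # what gets written to local storage
recovered = fernet.decrypt(stored)     # what the application reads back

assert recovered == reading
```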

However, it is a mistake to believe that edge technology inherits the same security controls and processes that are found with the likes of AWS or the public cloud. Computing at the edge can cover all kinds of environments which are often remotely managed and monitored; this might not offer the same security or reliability that organisations are used to seeing with the private cloud. Ultimately it is the responsibility of the customer to properly vet potential vendors to fully understand their security architectures and practices.
