Cloud pricing war moves from VMs to object storage, 451 Research argues

The battlefield for cloud pricing will shift from virtual machines to object storage, with databases and other services expected to come under similar pressure over the next 18 months, according to the latest note from 451 Research.

The trend of vendors undercutting each other on VMs has long been apparent; take January last year as just one example, when Microsoft announced price reductions of up to 17% on its Azure D-series Dv2 VMs shortly after Amazon Web Services announced its latest cuts.

According to the figures, obtained via 451 Research’s Cloud Price Index, object storage prices are declining across every region, with a drop of 14% over the past year.

451 puts price cuts beyond compute down to an ever-maturing market and an increase in cloud-native development, while adding that the market ‘is not highly price-sensitive at this time, although naturally, end users want to make sure they are paying a reasonable price’.

“The big cloud providers appear to be playing an aggressive game of tit for tat, cutting object storage prices to avoid standing out as expensive,” said Jean Atelsek, digital economics unit analyst at 451 Research, in a statement. “This is the first time there has been a big price war outside compute, and it reflects object storage’s move into the mainstream.

“While price cuts are good news for cloud buyers, they are now faced with a new level of complexity when comparing providers,” Atelsek added.

One analyst firm which attempts to untangle the imbroglio of cloud pricing for customers is Cloud Spectator, which issues a yearly report ranking vendors on both price and performance. The company’s analysis puts 1&1 at the top of the charts, giving it the benchmark score of 100 out of 100, with AWS (24), Azure (27) and Google (48) struggling by comparison, although it adds that VM performance was less variable among the major players.
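To see why a single “value” number helps buyers cut through that complexity, here is a hypothetical sketch of a price-performance score normalised to the best-scoring vendor. This is an illustrative assumption, not Cloud Spectator’s actual methodology, and the vendor figures are made up.

```python
# Hypothetical price-performance "value" score, normalised so the best
# vendor gets 100 (illustrative only; not Cloud Spectator's methodology).
vendors = {
    "Vendor A": {"perf": 820, "monthly_price": 95.0},
    "Vendor B": {"perf": 900, "monthly_price": 310.0},
    "Vendor C": {"perf": 760, "monthly_price": 240.0},
}

# Raw value = performance per dollar; normalise against the best vendor.
value = {name: v["perf"] / v["monthly_price"] for name, v in vendors.items()}
best = max(value.values())
for name, v in sorted(value.items(), key=lambda kv: -kv[1]):
    print(f"{name}: score {100 * v / best:.0f}/100")
```

The point of normalising is that a cheap-but-slow vendor and an expensive-but-fast one can be compared on a single axis, which is exactly the comparison buyers struggle to make from raw price lists.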

Back in May, 451 Research argued there was a ‘very limited relationship’ between price and market share and that the ‘race to the bottom’ was something of a misnomer, with the supply of higher-value services being key to long-term growth for vendors.

You can find out more about the Cloud Price Index here.

IBM Delivers Mixed Results

On April 18, IBM released its first-quarter results for 2017, and it’s not a happy picture. Overall revenue declined by 2.8 percent, making this the 20th straight quarter of year-over-year revenue declines for the company.

IBM reported total revenue of $18.16 billion this quarter, 2.8 percent less than the $18.68 billion it generated during the same period in 2016. It also fell short of analysts’ expectations of around $18.39 billion.

Net income for the period was $1.75 billion, well short of the $2.01 billion it reported a year ago. GAAP earnings per share were $1.86, down from $2.09 last year. On these results, IBM’s stock price fell by five percent in after-hours trading.

Though these results look bleak, they’re not entirely so. Over the last few years, IBM has been making strategic changes to its businesses, and it’s the newer initiatives like analytics, cloud computing and artificial intelligence that have generated positive results for the company. These new initiatives represent $7.8 billion of the company’s total revenue, a 12 percent increase over the first quarter of last year.

In fact, cloud could be the real savior for IBM here. The company reported cloud revenue of $3.5 billion this quarter, a 33 percent increase year-over-year. This goes to show that there is a lot of change happening in IBM’s operations, and that it’s the company’s legacy businesses that are slowing it down.

In many ways, this result highlights some important trends unfolding in the industry right now. There is a clear shift towards cloud technologies, and any company that doesn’t ride this trend is sure to be left behind.

It also reflects the growing demand for technologies like cloud and cognitive computing, and the changing nature of the IT industry as a whole. Gone are the days when we focused on packaged software products. Today, we want everything as a service, available on demand.

For example, a decade ago, using a tool like Adobe Photoshop meant buying a CD and installing it on your system. Today, you simply pay a subscription, go online, use the tool and save your files to the cloud. This goes to show how much we have evolved as a society and how well we have adapted to these changes.

Going forward, IBM should focus more on these emerging technologies, as they are likely to be its main revenue drivers over the next decade.

In short, IBM’s results were a mixed bag: the ghosts of its past businesses are pulling it down, but the newer business areas are doing exceptionally well. This simply reflects the changing nature of the business and how IBM is adapting to that change.


How machine learning could prevent money laundering

Machine learning is being put to use in all sorts of areas today. From smart cars to smart homes and beyond, artificial intelligence (AI) and machine learning are becoming a larger part of how many companies conduct business. As more and more businesses are hit with cyber crime rather than physical crime, there has been a necessary shift from commercial surveillance systems towards cyber security systems that protect confidential data. More recently, we’ve seen machine learning sink its teeth into anti-money laundering (AML), with potentially big impacts there.

Most current AML systems are founded on an extensive list of rules. Banks and institutions are required to comply with the Bank Secrecy Act and implement certain AML rules. These regulations are meant to help detect and report any suspicious activity that could indicate money laundering or terrorist financing. As the regulations have become more demanding, traditional rules-based systems have become ever more complex, with hundreds of rules driving know your customer (KYC) activity and Suspicious Activity Report (SAR) filing. Financial institutions monitor billions of transactions a day, and the data mined from each creates silos of information that any person would find overwhelming to sift through. More and more cases are flagged for investigation, but more and more false positives pop up with them.
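To make the shape of such systems concrete, here is a minimal sketch of a rules-based filter. The rules, thresholds, and country codes are hypothetical stand-ins for the hundreds of rules a real system encodes, and it hints at why fixed thresholds generate so many false positives: plenty of legitimate transactions trip them too.

```python
# Minimal sketch of a rules-based AML filter (hypothetical rules and
# thresholds for illustration; real systems encode hundreds of such rules).
from dataclasses import dataclass

@dataclass
class Transaction:
    account: str
    amount: float
    country: str
    daily_count: int  # transactions on this account today

HIGH_RISK_COUNTRIES = {"XX", "YY"}  # placeholder country codes

def flag_for_review(tx: Transaction) -> list[str]:
    """Return the names of any rules this transaction trips."""
    reasons = []
    if tx.amount >= 10_000:                  # reporting-style threshold
        reasons.append("large-cash-amount")
    if 9_000 <= tx.amount < 10_000:          # possible structuring
        reasons.append("just-under-threshold")
    if tx.country in HIGH_RISK_COUNTRIES:
        reasons.append("high-risk-jurisdiction")
    if tx.daily_count > 20:                  # unusual velocity
        reasons.append("high-velocity")
    return reasons

print(flag_for_review(Transaction("A-1", 9_500.0, "XX", 3)))
# -> ['just-under-threshold', 'high-risk-jurisdiction']
```

Every flagged transaction lands in an investigation queue, whether or not anything is actually wrong, which is exactly how the false-positive pile grows.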

Along with the increase in false positive rates, another challenge in AML is that money laundering hardly ever shows up as the activity of just one transaction, account, business or person. Because of this, detection cannot focus on single instances but instead requires analysis of behavioural patterns in transactions occurring over time. This makes it nearly impossible for personnel to investigate all cases in a timely manner.

Additionally, there is very little historical data on money laundering, making it difficult to pinpoint its exact tactics and methods. From trusts to black market currency exchanges to loan-back schemes, there is, unfortunately, no typical money laundering case. This also leaves data labels and sets poorly defined, which has required manual analysis in the past. Despite the growing difficulty of monitoring for money laundering, more financial institutions are turning to technological tools such as AI, machine learning and big data analytics to detect it.

When combined, these systems can merge massive spectrums of data sources and dig through boundless mountains of data. Where machine learning or AI has been implemented, these hybrid systems are able to translate unlabelled points of data into signals that detect behavioural anomalies and intent. Such systems can establish “normal behaviour” patterns and then identify anomalous behaviour against them, thereby weeding out the false positives from true money laundering cases. The result is a massive reduction in both false negatives and false positives.
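As a rough illustration of establishing a “normal behaviour” baseline and flagging departures from it, the sketch below fits an unsupervised anomaly detector to synthetic transaction features. The features, data, and contamination rate are assumptions chosen for illustration; production systems model far richer behaviour over time.

```python
# Sketch: learn "normal behaviour" from unlabelled transaction features,
# then flag outliers with scikit-learn's IsolationForest. All data and
# parameters here are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Columns: [amount, transactions_per_day, distinct_counterparties]
normal = rng.normal(loc=[80, 3, 2], scale=[30, 1, 1], size=(5000, 3))
odd = np.array([[9500, 40, 25], [8800, 35, 30]])  # structuring-like bursts
X = np.vstack([normal, odd])

model = IsolationForest(contamination=0.001, random_state=0).fit(X)
scores = model.predict(X)   # -1 = anomaly, 1 = normal
print(X[scores == -1])      # the rows flagged for investigation
```

Unlike the fixed-threshold rules above, the detector learns what typical activity looks like from the data itself, so the “rules” move as customer behaviour moves.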

FICO, for example, has created an AML Threat Score that helps prioritize investigation queues for SARs, using behavioural analytics from its Falcon Fraud Manager. FICO uses transaction profiling technology, self-calibrating models and customer behaviour lists, all of which can adapt to the constantly changing dynamics within a financial institution. Another area of progress has been “unsupervised learning”, a form of machine learning that uses algorithms to draw inferences from data sets lacking labelled responses. Given the large gap in historical data on money laundering cases, there is a weighty need for machine learning that can analyse and gain insight from data without prior knowledge of what to look for. Unsupervised machine learning learns from unlabelled data, distinguishes the relevant from the irrelevant, and divides the unlabelled data into usable clusters. This is achieved through link analysis, temporal clustering, associative learning and other techniques that allow financial institutions to track entity interactions, behavioural changes and transaction volatility.
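As a taste of how unsupervised learning divides unlabelled data into usable clusters, the sketch below groups synthetic account profiles with k-means. The features, group sizes, and cluster count are illustrative assumptions, not any vendor’s method; real systems layer link analysis and temporal features on top of this kind of grouping.

```python
# Sketch: cluster unlabelled account profiles so investigators can triage
# behavioural groups rather than single transactions. Synthetic data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Columns: [avg_amount, tx_per_week, pct_cross_border]
profiles = np.vstack([
    rng.normal([60, 5, 0.02], [20, 2, 0.01], size=(400, 3)),    # retail-like
    rng.normal([4000, 2, 0.10], [800, 1, 0.05], size=(80, 3)),  # business-like
    rng.normal([9200, 25, 0.90], [300, 5, 0.05], size=(10, 3)), # unusual group
])

X = StandardScaler().fit_transform(profiles)  # put features on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} accounts")
```

No labels were supplied, yet the small high-volume, cross-border group separates out on its own, which is precisely the kind of structure an investigator wants surfaced.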

The benefits of using machine learning to prevent money laundering become clear when labelled and unlabelled data from this slew of sources can be ingested into a system flexible enough to accept a multitude of data points across a myriad of sources while also analysing their potential as a money laundering case. As the AML regulations required of banks become more demanding and fines for non-compliance grow, you can be sure to see machine learning built on top of the traditional AML systems already in place today.

Happy Earth Day!

Jump-start your Earth Day efforts by properly recycling your electronics, AKA e-cycling! It’s incredibly easy to think environmentally responsibly when it comes to your electronics. We’re happy to encourage and inform others about how to dispose of your electronics in an environmentally safe fashion. Here is your Earth Day how-to: Most important rule of e-cycling: Do […]


Top @Docker Metrics | @DevOpsSummit #DevOps #APM #AI #CD #Monitoring

Monitoring Docker environments is challenging. Why? Because each container typically runs a single process, has its own environment, utilizes virtual networks, and has its own way of managing storage. Traditional monitoring solutions take metrics from each server and the applications they run; those servers and applications are typically static, with very long uptimes. Docker deployments are different: a set of containers may run many applications, all sharing the resources of one or more underlying hosts. It’s not uncommon for Docker servers to run thousands of short-lived containers (e.g., for batch jobs) while a set of permanent services runs in parallel. Traditional monitoring tools built for static environments are not suited to such deployments, while some modern monitoring solutions (e.g., SPM from Sematext) were built with dynamic systems in mind and even have out-of-the-box reporting for Docker monitoring.

Moreover, container resource sharing calls for stricter enforcement of resource usage limits, an additional issue you must watch carefully. To make appropriate adjustments to resource quotas you need good visibility into any limits containers have reached or errors they have caused. We recommend alerting on defined limits; this way you can adjust limits or resource usage even before errors start happening.
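As a small sketch of the limit-based alerting described above, the snippet below polls each running container’s memory usage against its limit via the Docker SDK for Python. It assumes the `docker` package and a reachable Docker daemon, and the 90% threshold is an arbitrary choice; this is not SPM’s implementation, just the general idea.

```python
# Sketch: alert before a container hits its memory limit.
# Assumes the Docker SDK for Python ("docker" package) and a local daemon.
import docker

ALERT_THRESHOLD = 0.90  # arbitrary: warn at 90% of the memory limit

client = docker.from_env()
for container in client.containers.list():
    stats = container.stats(stream=False)      # one-shot stats snapshot
    mem = stats.get("memory_stats", {})
    usage, limit = mem.get("usage"), mem.get("limit")
    if usage and limit:
        ratio = usage / limit
        if ratio >= ALERT_THRESHOLD:
            print(f"ALERT {container.name}: memory at {ratio:.0%} of limit")
```

Run on a schedule (or fed from the streaming stats API), this kind of check is what lets you raise quotas before the kernel starts killing processes.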


In-Stream Processing Cures the Batch Processing Blues | @CloudExpo #BI #Cloud #BigData

Most of us have moved our web and e-commerce operations to the cloud, but we are still getting sales reports and other information we need to run our business long after the fact. We sell a hamburger on Tuesday, you might say, but don’t know whether we made money selling it until Friday. That’s because we still rely on batch processing, where we generate orders, reports, and other management data when it’s most convenient for the IT department to process them, rather than in real time. That was fine when horse-drawn wagons made our deliveries, but it is far too slow for today’s world, where stock prices and other bits of information circle the world (literally) at the speed of light. It’s time to move to in-stream processing. You can’t – and shouldn’t – keep putting it off.
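The difference is easy to see in miniature. The sketch below computes the same sales margin twice: once batch-style, after all the events have been collected, and once in-stream, updating the figure as each event arrives. The event feed is simulated; in a real deployment it would come from a message queue or a stream processor.

```python
# Sketch contrasting batch and in-stream processing of sale events.
# The event source is simulated for illustration.
from datetime import datetime

def sale_events():
    """Simulated feed of (item, price, cost) sale events."""
    yield ("hamburger", 4.99, 3.10)
    yield ("hamburger", 4.99, 3.55)   # costs drifted up mid-day
    yield ("fries", 1.99, 0.60)

# Batch style: accumulate everything, learn the answer days later.
events = list(sale_events())
batch_margin = sum(price - cost for _, price, cost in events)
print(f"batch report (Friday): margin ${batch_margin:.2f}")

# In-stream style: update the figure the moment each sale happens.
running_margin = 0.0
for item, price, cost in sale_events():
    running_margin += price - cost
    print(f"{datetime.now():%H:%M:%S} sold {item}: "
          f"running margin ${running_margin:.2f}")
```

The arithmetic is identical; the business difference is that the in-stream version would have shown the margin eroding on Tuesday, while the batch report only reveals it on Friday.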


Cloud Deployment: Slow and Thoughtful Wins the Race | @CloudExpo #Cloud #APM #Monitoring

This recent research on cloud computing from The Register delves a little deeper than many of the “We’re all adopting cloud!” surveys we’ve seen. It found that meaningful cloud adoption and the idea of the cloud-first enterprise are still not reality for many businesses. The Register’s stats also show a gradual cloud deployment trend over the past five years, not any sort of explosion. One important takeaway is that coherence across internal and external clouds is essential for IT right now. That translates into tasks like planning better cloud management and integrating applications and platforms.


[slides] Multi-Cloud #DevOps | @CloudExpo @Valb00 @NetApp @SolidFire #AI

All clouds are not equal. To succeed in a DevOps context, organizations should plan to develop/deploy apps across a choice of on-premise and public clouds simultaneously depending on the business needs. This is where the concept of the Lean Cloud comes in – resting on the idea that you often need to relocate your app modules over their life cycles for both innovation and operational efficiency in the cloud.
In his session at @DevOpsSummit at 19th Cloud Expo, Valentin (Val) Bercovici, CTO of SolidFire, discussed how to leverage this concept to seize the creativity and business agility needed to make it real.


The Disruptor That Needs Disrupting | @CloudExpo #IoT #M2M #Cloud #DigitalTransformation

Many people mistakenly credit Al Gore with inventing the Internet, but the figure who actually transformed it was Tim Berners-Lee. He created URIs, HTTP, HTML, and the first web browser: all critical building blocks of the World Wide Web, the ubiquitous, decentralized system for sharing information that we take for granted today. As a result of his contributions to society, it was recently announced that Tim Berners-Lee has been awarded the prestigious Turing Award.
