FICO reinforces market position with product updates

Global analytics firm FICO has launched a number of new and updated offerings to enable businesses to develop prescriptive analytics and decision management applications and improve business decision agility.

The Decision Management Suite 2.0 product – an updated version of the suite of the same name – enables customers to develop analytic applications in the cloud and improve automated business decision agility. The Decision Central offering manages, audits, reports on and updates decision logic and predictive models, allowing customers to record and store automated decisions so they can be reused, modified and improved. Finally, the Strategy Director tool helps users structure the decision flow. The tools are available through Amazon Web Services or as a private cloud or on-premises deployment.

“Many Big Data deployments have failed to deliver competitive advantage because their approach is completely backwards,” said Stuart Wells, CTO at FICO. “We focus on decisions-first, as opposed to data-first. That gives our customers the fastest route to real value, and the agility to change course faster than the competition. It means being able to innovate like start-ups. Some of our Decision Management Suite customers have reduced the time to deploy an analytic application from months to days, and the time to model a decision and act on it from weeks to minutes.”

While the concept of Big Data has been around for some time, many businesses struggle to comprehend the vast amounts of data they collect, or to utilize it within the organization in any meaningful manner. The FICO product suite is one of a number of new products in the industry that aim to bring all this data into one concise system, and ultimately drive decision-making capability through the insight uncovered.

“The original launch of the FICO Decision Management Suite in 2013 represented a dramatic change in decision logic authoring and application development,” Wells said. “Now, with version 2 of the Decision Management Suite, FICO’s customers have the chance to pull even further ahead of their competitors. This product suite represents the future of prescriptive analytics and decision management, and it’s available now.”

The cloud bloat: Counting calories to prevent unnecessary waste


Ponder this for a minute: our love affair with cloud technology and services, and the consequent struggle to keep cloud spending in check, is a lot like our daily eating habits.

Spinning up cloud services is almost as easy as grabbing food to go at your favourite burger joint. We’ve made fast food so convenient, affordable, and packed with calories that it’s become a national problem: our consumption doesn’t match our nutritional needs or activity levels. Now imagine how much worse it would be if we had no ingredient lists, no easy way to count calories, and all of our weight gain for the month showed up in a single day.

It’s not a pretty picture, is it? Thank goodness for food group charts, nutrition labels, Weight Watchers, personal trainers, and fitness apps. Yet we find ourselves in a similar conundrum with cloud consumption.

We love the convenience and ability to spin up resources at the drop of a credit card, and the way we can easily scale resources up and down at will to match demand. On the other hand, it’s easy to have too much of a good thing. That’s how you end up with ‘cloud bloat.’

Cloud bloat happens when you consume resources you don’t really need. But instead of expanding your waistline, you’re expanding your “waste line” – in fact, research shows that up to 30% of cloud spend can be wasted.

So why do we have all this waste? Well, cloud makes it easy to spend without governance and control. Per-unit costs seem so low and are priced by the hour – how could they possibly add up to tens of thousands of dollars each month? Quite easily: a hundred VMs at $0.20 per hour, left running around the clock, come to roughly $14,600 a month.

How do you prevent cloud waste?

Most businesses rely on the monthly bill from their cloud provider or providers, which can pose problems. Cloud bills are written in technical terms, rather than terms that make sense to your business. They also arrive after the fact, long past the point of enabling you to react to a problem in any timely fashion, and they lack the historical context required to pinpoint problems or plan future spend. While most cloud providers offer some sort of tagging capability that enables customers to categorise resources, compliance and consistency are difficult to enforce, making it hard to understand who used which resources and at what cost.
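To make tagging compliance concrete, here is a minimal sketch in Python of auditing and back-filling cost-allocation tags on AWS EC2 instances using boto3. The required tag keys (“CostCenter”, “Owner”, “Project”) and the “unallocated” fallback are illustrative assumptions rather than a prescription, and the same pattern applies to any provider’s tagging API.

    # A minimal sketch of enforcing cost-allocation tags on EC2 instances
    # with boto3. The tag taxonomy below is an assumption for illustration.
    import boto3

    REQUIRED_TAGS = {"CostCenter", "Owner", "Project"}

    ec2 = boto3.client("ec2", region_name="us-east-1")

    def find_untagged_instances():
        """Return IDs of running instances missing any required tag."""
        untagged = []
        paginator = ec2.get_paginator("describe_instances")
        for page in paginator.paginate(
            Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
        ):
            for reservation in page["Reservations"]:
                for instance in reservation["Instances"]:
                    keys = {t["Key"] for t in instance.get("Tags", [])}
                    if not REQUIRED_TAGS.issubset(keys):
                        untagged.append(instance["InstanceId"])
        return untagged

    # Tag stragglers with a default cost centre so they at least appear in
    # reporting rather than disappearing into an unallocated black hole.
    stragglers = find_untagged_instances()
    if stragglers:
        ec2.create_tags(
            Resources=stragglers,
            Tags=[{"Key": "CostCenter", "Value": "unallocated"}],
        )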

There are many different sources of waste, but some of the most common include purchasing oversized virtual machines (VMs), leaving VMs turned on when they are no longer needed, using expensive storage for infrequently accessed data, or just spinning up unauthorised resources, also known as shadow IT. In order to manage these resources, you first have to measure them – you need a “smart meter” for your cloud that will allow you to match usage with users and put costs against it, so you can see who is using how much of what, and what it is costing you. The next step is to use this information to ensure that you are using your cloud resources efficiently and effectively.
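As a rough sketch of that “smart meter” idea, the following Python reads a daily usage export (assuming a hypothetical schema of owner, resource ID, resource type, hours running, average CPU and cost), rolls spend up by owner, and flags always-on VMs with negligible CPU as likely idle:

    # A minimal "smart meter" sketch over a hypothetical daily usage export.
    import csv
    from collections import defaultdict

    IDLE_CPU_THRESHOLD = 5.0   # % average CPU below which a VM looks idle
    ALWAYS_ON_HOURS = 24.0     # a full day of runtime

    def analyse(path):
        spend_by_owner = defaultdict(float)
        idle_suspects = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                spend_by_owner[row["owner"]] += float(row["cost_usd"])
                if (row["resource_type"] == "vm"
                        and float(row["hours_running"]) >= ALWAYS_ON_HOURS
                        and float(row["cpu_avg_pct"]) < IDLE_CPU_THRESHOLD):
                    idle_suspects.append(row["resource_id"])
        return spend_by_owner, idle_suspects

    spend, idle = analyse("usage_2016-05-01.csv")
    for owner, cost in sorted(spend.items(), key=lambda kv: -kv[1]):
        print(f"{owner}: ${cost:,.2f}")
    print("Possible idle VMs:", idle)

Reviewing a report like this daily, rather than waiting for the monthly bill, is what turns the raw meter reading into something you can act on.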

As you adopt more cloud services and spend increases, managing cloud costs becomes a much larger and more critical task. You need to consider how the data is to be used, how often, and by whom, in addition to the following factors:

  • Which cloud platform(s) are used or planned?
  • Public, private, multi- or hybrid cloud?
  • What are our reporting requirements?
  • Showback, chargeback, billing?
  • Do we have customisable reports, dashboards for business users?
  • How much data do we have?
  • How accurate does the data need to be?
  • Do we need “real-time” results?
  • What is our budget and how does it compare to labour costs?

At Cloud Cruiser, we believe that there are four steps required to transform your data into meaningful intelligence, otherwise known as cloud analytics:  normalisation, enrichment, relevancy and visualisation.

Normalising data into common buckets and units of measure across your different cloud platforms provides more holistic and useful analytics that ultimately drive better decisions. With enrichment, the key step is mapping your cloud data to organisational and financial information, providing the business context you need to drive effective decision-making.
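A brief sketch of what those first two steps could look like in code follows. Both mapping tables are assumptions for illustration; in practice they would be derived from provider price lists and from your HR and finance systems.

    # Normalisation: map provider-specific SKUs onto common buckets/units.
    # Both tables below are illustrative assumptions.
    SKU_MAP = {
        ("aws", "m4.large"):      {"bucket": "compute", "vcpus": 2},
        ("azure", "Standard_D2"): {"bucket": "compute", "vcpus": 2},
        ("aws", "gp2"):           {"bucket": "block-storage", "unit": "GB-month"},
    }

    # Enrichment: map tag values onto organisational and financial context.
    ORG_MAP = {
        "team-checkout": {"department": "E-commerce", "cost_center": "CC-1042"},
        "team-data":     {"department": "Analytics",  "cost_center": "CC-2210"},
    }

    def transform(record):
        """Normalise one raw billing record, then enrich it with business context."""
        out = dict(record)
        out.update(SKU_MAP.get((record["provider"], record["sku"]), {"bucket": "other"}))
        out.update(ORG_MAP.get(record.get("team", ""), {"department": "Unallocated"}))
        return out

    raw = {"provider": "aws", "sku": "m4.large", "team": "team-checkout", "cost_usd": 97.33}
    print(transform(raw))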

In order to trim the cloud fat, your cloud consumers need fast access to timely, relevant information, so they can see where they’re spending and trim where necessary. Role-based permissions allow you to apply filters that cut down on extraneous information “noise”, and also protect company data by restricting access to confidential information. Similarly, raw data or data in spreadsheets lacks the context and the visual ‘ah-hah!’ you need to optimise your cloud spend and make better business decisions. Different roles require different levels or kinds of information: the finance folks are typically concerned with costs, while DevOps may be more focused on units – VMs, network throughput, storage volumes, and the like.
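A minimal sketch of such role-based views, with the role names and field lists as assumptions: finance sees costs by department, while DevOps sees units.

    # Each role sees only the records and fields relevant to it.
    RECORDS = [
        {"department": "E-commerce", "resource": "vm-17", "units": "2 vCPU", "cost_usd": 97.33},
        {"department": "Analytics",  "resource": "vol-9", "units": "500 GB", "cost_usd": 51.20},
    ]

    ROLE_FIELDS = {
        "finance": ["department", "cost_usd"],           # costs, not machinery
        "devops":  ["department", "resource", "units"],  # units, not dollars
    }

    def view_for(role, department=None):
        """Filter rows by department and project only the fields this role may see."""
        fields = ROLE_FIELDS[role]
        rows = (r for r in RECORDS if department is None or r["department"] == department)
        return [{k: r[k] for k in fields} for r in rows]

    print(view_for("finance"))
    print(view_for("devops", department="Analytics"))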

A picture is worth a thousand words. Or in this case, hundreds of thousands of rows of data when we’re talking about the cloud. The amount of data is growing exponentially, with more than 2.5 quintillion bytes of data being produced every day. It’s no wonder the BI and analytics market is predicted to reach $16.9B this year. Our poor, limited brains do much better when looking at pretty pictures than trying to find meaning in column after column of numbers.

Formulating a plan of attack

Every organisation is different, so it’s best to start out small and build from there. Start with the highest cost resources or the resources that have the greatest impact on your business. If CPU performance is mission-critical for your customer-facing apps, then target VM utilisation first. If keeping costs under budget for a particular project is your main objective, then target costs by project.

To help your organisation develop a better plan of attack, here are 12 tips to help manage cloud spend:

  • If you have more than $20,000 of cloud spend per month, eliminate the spreadsheet and automate
  • Collect cloud usage and spend on a daily basis
  • Normalise data across cloud platforms, service bundles, and units of measure to see everything in a single, holistic manner
  • Think of the end results you want first, and work backwards to identify users, visualisations and data requirements
  • Add both organisational and financial context to your cloud data to give business meaning
  • Eliminate any IT bottleneck – give end users self-service reporting access. Save time and money
  • Develop use cases to ensure your tagging strategy considers both financial and operational goals
  • People and processes are not enough to enforce consistent tagging – technology must be used to add, correct, or transform tags
  • Data filtering should be enforced to ensure both relevancy and confidentiality
  • Use the right visualisation chart for the story you’re trying to tell with your data
  • Implement multiple levels of control to ensure usage and spending stay within pre-set thresholds (see the sketch after this list)
  • Focus first on the areas of greatest spend and/or the areas of greatest impact to the business
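Picking up the thresholds tip above, here is a minimal sketch of multi-level spend controls, assuming hypothetical per-department monthly budgets and a placeholder notify() hook: warn at 80% of budget, alert at 100%, escalate at 120%.

    # Multi-level spend controls; budgets and notify() are placeholders.
    BUDGETS = {"E-commerce": 20_000.00, "Analytics": 8_000.00}  # monthly, USD
    LEVELS = [(1.2, "ESCALATE"), (1.0, "ALERT"), (0.8, "WARN")]

    def notify(level, department, spent, budget):
        # Placeholder: wire this to email, Slack, or a ticketing system.
        print(f"[{level}] {department}: ${spent:,.0f} of ${budget:,.0f} budget")

    def check_thresholds(month_to_date):
        for department, spent in month_to_date.items():
            budget = BUDGETS.get(department)
            if budget is None:
                notify("WARN", department, spent, 0.0)  # spend with no budget at all
                continue
            for ratio, level in LEVELS:  # check the highest threshold first
                if spent >= budget * ratio:
                    notify(level, department, spent, budget)
                    break

    check_thresholds({"E-commerce": 21_500.00, "Analytics": 6_900.00})

Running a check like this daily against fresh usage data catches overspend mid-month, while there is still time to do something about it.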

Count your cloud calories

Don’t let your consumption get out of hand. Make it easy to count those “cloud calories”: empower your users with easy-to-read, easy-to-access data on cloud usage. Keep IT out of the way, or users will resort to shadow IT (just like that secret stash of candy bars in your drawer). With relevant data presented visually and in context, users will do a better job of matching consumption to business activity. Encourage them to weigh in daily in order to spot spikes in usage or spending before they balloon into bloat.

We love the cloud. It’s dynamic and agile, it scales, and it drives business value. However, the ease of access to resources can lead to waste, inefficiency, and out-of-control spending. Fortunately, with the right mix of people, process, and technology, you can defeat cloud bloat. A trim “waste line” and a healthy, sustainable approach to growth and technology use will keep your organisation agile and flexible.

Microsoft shifts focus to Chinese cloud market

Microsoft has announced a successful year in the Chinese market, as well as intentions to step up its expansion plans in the region, according to China Daily.

The company claims it now has more than 65,000 corporate clients, and appetite for its Azure offering among Chinese enterprise organizations is steadily increasing. As part of the expansion plans, Microsoft lowered its prices for Chinese customers earlier this month, seemingly in an effort to undercut its global competitor AWS, as well as local powerhouses such as Alibaba and Tencent.

“Though the GDP growth is slowing down, Chinese companies still need to focus on three points to remain relevant and competitive: innovation, productivity and the return of investments,” said Ralph Haupter, CEO of Microsoft in China. “And cloud computing can help in all of the above three aspects. We will focus on manufacturing, retail, automotive, media and other industries to further expand market share.”

While China has proved to be one of the top priorities for the majority of cloud players in recent years, a recent report from BSA highlighted the country as one of the poorest performers in the global IT community. Measuring each country on its cloud policies and legislation, as well as the readiness of its enterprises, the report ranked China 23rd out of the top 24 IT nations worldwide, mainly due to poor performance in the data privacy, cybercrime, promotion of free trade and security categories, though it scored poorly across every category.

Despite concerns from the BSA, Ji Yanhang, an analyst at Analysys International, believes the market has strong potential, stating “China’s national strategies, such as boosting high-end manufacturing, will increase demand for cloud services in the coming years.”

The announcement follows last week’s quarterly earnings call, where CEO Satya Nadella reported that Office commercial products and cloud services revenue grew 7%, Office consumer products and cloud services revenue grew 6%, and Dynamics products and cloud services revenue grew 9%. Azure revenue grew 120% over the period, though this is down from 140% growth in the previous quarter.

IBM expands flash storage portfolio in continued transition to cloud positioning

IBM has announced the expansion of its flash storage portfolio to bolster its position in the cognitive computing and hybrid cloud market segments.

The FlashSystem arrays combine its FlashCore technology with scale-out architecture, in the company’s continued efforts to consolidate its position as a vendor to power cloud data centres which utilize cognitive computing technologies. Cognitive computing, and more specifically Watson, has seemingly formed the central pillar of IBM’s current marketing and PR campaigns, as it continues its journey to transform Big Blue into a major cloud player.

“The drastic increase in volume, velocity and variety of information is requiring businesses to rethink their approach to addressing storage needs, and they need a solution that is as fast as it is easy, if they want to be ready for the Cognitive Era,” said Greg Lotko, GM of IBM’s Storage and Software Defined Infrastructure business. “IBM’s flash portfolio enables businesses on their cognitive journey to derive greater value from more data in more varieties, whether on premises or in a hybrid cloud deployment.”

The company claims the new offering will provide an onramp to flash storage for IT service providers, reducing the cost of implementing an all-flash environment, as well as scalable storage for cloud service providers. Another feature built into the proposition will enable customers to deal with ‘noisy neighbour’ challenges and other network performance issues that can be present in a multi-tenant cloud environment.

“The workloads our department manages include CAD files for land mapping, geographic information system (GIS) applications and satellite imagery for the over 9.2 million acres of State Trust lands we’re responsible to oversee. The data we manage is tied directly to our goal to make this information available and to increase its analytical capabilities,” said William Reed, CTO at the Arizona State Land Department, one of IBM’s customers. “After exhaustive, comparative proof of concept testing we chose IBM’s FlashSystem, which has helped to increase our client productivity by 7 times while reducing our virtual machine boot times by over 85 percent.”

Intel prioritizes cloud, IoT and 5G in new business strategy

Intel has outlined a new business strategy to capitalize on new trends within the industry, including cloud technology, IoT and 5G.

Writing on the company’s blog, CEO Brian Krzanich outlined the organization’s new strategy, which is split into five sections: cloud technology, IoT, memory and programmable solutions, 5G, and developing new technologies under the concept of Moore’s Law.

“Our strategy itself is about transforming Intel from a PC company to a company that powers the cloud and billions of smart, connected computing devices,” said Krzanich. “But what does that future look like? I want to outline how I see the future unfolding and how Intel will continue to lead and win as we power the next generation of technologies.

“There is a clear virtuous cycle here – the cloud and data centre, the Internet of Things, memory and FPGAs are all bound together by connectivity and enhanced by the economics of Moore’s Law. This virtuous cycle fuels our business, and we are aligning every segment of our business to it.”

Krzanich believes virtualization and software trends, which are apparently redefining the concept of the data centre, align well with the Intel business model and future proposition, given the company’s position in the high-performance computing food chain. Through continued investment in analytics, big data and machine learning technologies, the company aims to drive more of the data centre footprint onto Intel architecture.

The company’s play for the potentially lucrative IoT market will be built on the idea of being ‘connected to the cloud’. Intel has highlighted that it will focus on autonomous vehicles, industrial and retail as the primary growth drivers of the Internet of Things, combining these with its capabilities within the cloud ecosystem to drive growth within IoT.

While a number of buzzwords and trends were highlighted throughout Krzanich’s post, Moore’s Law appeared to receive particular attention. While generally considered a plausible theory, Moore’s Law would appear to be increasingly underplayed within the industry, a point with which Krzanich did not seem to agree.

“In my 34 years in the semiconductor industry, I have witnessed the advertised death of Moore’s Law no less than four times,” said Krzanich. “As we progress from fourteen nanometer technology to ten nanometer and plan for seven nanometer and five nanometer and even beyond, our plans are proof that Moore’s Law is alive and well. Intel’s industry leadership of Moore’s Law remains intact, and you will see continued investment in capacity and R&D to ensure so.”

Krzanich’s comments provide more clarity on last week’s announcement of how the company would restructure the business to accelerate its transformation project, as well as its quarterly earnings. The data centre and Internet of Things (IoT) businesses would appear to be Intel’s primary growth engines, delivering $2.2 billion in revenue growth last year and accounting for roughly 40% of revenue across the period.

The transformation project itself is part of a long-term ambition of the business, as it aims to move the perception of the company away from client computing (PCs and mobile devices) and towards IoT and the cloud. The announcements over the last week have had mixed results in the market; following the quarterly results the share price rose slightly, though it has declined over the subsequent days.

Everything You Need To Know About RDS CALs

Remote Desktop Services is a powerful role available with Windows Server. It enables businesses to centrally host resources and securely publish them to remote clients. However, there are different types of licenses that need to be purchased before setting up an RDS environment. It is important for companies to understand RDS Client Access Licenses (CALs) […]

The post Everything You Need To Know About RDS CALs appeared first on Parallels Blog.

Announcing @SoftLayer Named “Gold Sponsor” of @CloudExpo New York | #Cloud

SYS-CON Events announced today that SoftLayer, an IBM Company, has been named “Gold Sponsor” of SYS-CON’s 18th Cloud Expo, which will take place on June 7-9, 2016 at the Javits Center in New York, New York.
SoftLayer, an IBM Company, provides cloud infrastructure as a service from a growing number of data centers and network points of presence around the world. SoftLayer’s customers range from Web startups to global enterprises.


Meeting the demands of an aging population through open data healthcare

Speaking at Ovum’s Smart to Future City Forum, Ian Jones, Smart City Lead at the City of Leeds, highlighted the city’s ambition to create a citizen- and data-driven healthcare programme for its aging population.

Using a strategy based on digital innovation and open data, the team are in the process of bridging the £600 million budget gap involved in meeting the demands of an aging population. The city’s ambition is to create a programme which enables digital thinking in a health system that could otherwise be seen as bulky, unresponsive and limited.

“Open data gives us a view on how the city operates,” said Jones. “It allows customers to see data, understand the situation and raise questions, and allows us to use the data to encourage innovators to help us solve the city’s problems. How we use the data is driven entirely from the community. This is where the value is driven from.”

The city’s first challenge is to bring together its five trusts on one public services network, to increase collaboration and integration and achieve what the city is describing as citizen-driven health. Ultimately the team are driving towards the concept of citizens managing their own health through a digital model and open data infrastructure.

The concept itself is fundamentally built out of citizens’ own needs. Following an initial consultation process with the citizens themselves, the team have driven a number of different initiatives, ranging from transportation challenges for an aging population and poor air quality within the city to diabetes management.

Through the deployment of various IoT devices throughout the city, the Leeds Data Mill acts as an open data hub that enables citizens themselves to drive innovation in the city. Using this concept, the team aim to add value for the overall population by taking ideas from the citizens, as opposed to dictating what is good for them. This in itself is the concept of citizen-driven health.

Western Australia redefines itself through cloud and advanced data analytics adoption

John Atkins

Government of Western Australia’s Agent General to Europe John Atkins at Smart to Future Cities Forum

Speaking at Ovum’s Smart to Future Cities 2016 event, Government of Western Australia’s Agent General to Europe John Atkins put forward a convincing case for Western Australia as one of the world’s most innovative regions.

Bringing together cloud technologies, smart cities concepts, data analytics, robotics, autonomous vehicles and artificial intelligence, the region is aiming to transform its economy, which has traditionally relied on natural resources. The region aims to create a new ecosystem, with its hub based in Perth, built on the back of future technologies and a redefinition of the basis of Western Australia.

“Perhaps the most exciting project is the Square Kilometre Array,” said Atkins. “It’s combining scientists and engineers from more than 20 countries and now we can explore the universe 20 times faster than any telescope around the world today. More than 4 petabytes of data has been produced by the project since 2013.

“We’re redefining our role in the community by embracing technology”

The project itself aims to utilize the largest radio telescope ever seen on Earth, and will be the world’s largest public science data project upon completion. The overall aim of the project is to answer fundamental questions of science and about the laws of nature, such as: how did the Universe, and the stars and galaxies contained in it, form and evolve?

Aside from answering questions which have puzzled scientists for generations, the project is also drawing attention simply because of the scale at which it operates. Once completed, it will generate data at a rate more than 10 times today’s global Internet traffic, presenting a unique data collection, analysis and action challenge.

From a transport perspective, the region has taken lessons learned from Transport for London, building an enhanced passenger experience through citizen engagement on its app, building network intelligence through data analysis, and managing the day-to-day challenges of congestion through IoT deployments throughout the city. Investing in advanced data analytics tools and processes, the team are setting themselves the challenge of taking the region into the 21st century and beyond.

Western Australia has chosen to diversify its economy, reducing its reliance on natural resources, by embracing collaboration and encouraging the adoption of disruptive technologies. Contrary to the traditional policy of government undertaking time-consuming reviews, the Government of Western Australia has put its ambitious foot forward, driving innovation in its agricultural, scientific, transportation and natural resources industries through cloud and data analytics technologies.