Category Archives: Big Data

Accenture outlines future of cloud and data analytics in sport

Although the digital age has opened up a wealth of opportunities for organizations to create new revenue streams and attract new audiences, maintaining the engagement of these customers is becoming an increasingly difficult job, according to Accenture’s Nick Millman.

The availability and ease of access to information in the 21st century has created a new dynamic in which consumers are increasingly adept at multi-tasking across several devices, making the task of keeping a viewer’s attention throughout a sporting event more challenging. Millman, who leads Big Data & Analytics Delivery at Accenture, is using this dynamic to create new engagement opportunities for the Six Nations.

“There will be a number of people who will watch the entirety of a match, however there will be others who will be playing with their tablet or phone and enjoying the multi-screen experience,” said Millman. “To keep the level of engagement, sports need to become more digital themselves, providing more insight and data to fans who are watching the game. Ideally you want them to be on their phone looking at something which is relevant to the game as opposed to Facebook or what their friends are doing.”

Accenture first teamed up with the Six Nations as a technology partner four years ago, where the initial partnership focused on demonstrating the company’s mobility capabilities through creating the official app. What started as a basic app now acts as a delivery platform where Accenture can showcase their data analytics capabilities, processing more than 2 million rows of data per game and creating visuals in (near) real-time to tell a different story behind the sport itself.
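To illustrate the kind of pipeline involved, the sketch below shows how per-player statistics could be rolled up from a stream of match-event rows to feed a near-real-time visual. It is a minimal illustration only, not Accenture's implementation; the event fields and actions ("player", "carry", "metres") are assumptions made for the example.

```python
# Hypothetical sketch: rolling aggregation of match-event rows into
# per-player stats that a near-real-time dashboard could visualise.
from collections import defaultdict

def aggregate_events(event_stream):
    """Consume match-event rows and yield an updated stats snapshot after each one."""
    stats = defaultdict(lambda: {"carries": 0, "metres": 0, "tackles": 0})
    for event in event_stream:
        player = stats[event["player"]]
        if event["action"] == "carry":
            player["carries"] += 1
            player["metres"] += event.get("metres", 0)
        elif event["action"] == "tackle":
            player["tackles"] += 1
        yield dict(stats)  # snapshot after every row, e.g. pushed to the app

# Example usage with a handful of fabricated rows
sample = [
    {"player": "Hooker A", "action": "carry", "metres": 4},
    {"player": "Hooker A", "action": "carry", "metres": 7},
    {"player": "Hooker B", "action": "tackle"},
]
snapshot = None
for snapshot in aggregate_events(sample):
    pass
print(snapshot["Hooker A"])  # {'carries': 2, 'metres': 11, 'tackles': 0}
```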

The data itself is not necessarily of the greatest use to the fans, so Accenture has brought in rugby experts year-on-year to help explain the nuances of the information. This year Nick Mallet, Ben Kay and David Flatman helped the team tell the story. The same is true in the business world: data analysts may not be able to make the right decisions when it comes to the application of the data, as they do not understand the market in the same way as a Managing Director who has been in the industry for 30 years. The application of data in sport and in business will only be effective when it is merged with expertise and experience to provide context.

“One of the interesting things which we saw is that there is now an interesting dynamic between data driven decisions and gut feel,” Millman highlighted. “In some cases when you are watching the game you may think that one player would be considered the best on the park, but the data tells a different story. Seeing one hooker for example hit every line out perfectly might make him look like the most effective, but the data might suggest the opposition hooker who produced several small gains when carrying the ball had a greater impact on the game.

“This can translate into the business world also, as a marketing team may have a better feel about a product which it wants to push out to the market, but the data team have evidence which shows resource should be focused on a different area of the business,” said Millman. “I don’t think there is a right answer to what is better, data driven decision making or intuition, but it’s an interesting dynamic. The successful businesses will be the ones who are effective at blending the data and the skills to come to the right outcome.”

While the role of analytics is becoming more prominent in sport and the business world, there is still some education to be done before the concepts could be considered mainstream. Analytics may be big business in the enterprise segments, but there are still a large proportion of SMBs who do not understand the power of data analytics for their own business. The ability to cross sell, develop a stronger back story of your customer, maintain engagement or even implement artificial intelligence programs is only available once the core competencies of big data and analytics are embraced within the organization.

For Accenture, wearables and IoT are next on the horizon, with virtual reality potentially to follow. This year the app was available on the Apple Watch, and Millman is starting to see trends which could shift the consumption of data once again.

“It’s still early days, but some of the consumption of data is likely to shift from tablets and smartphones,” said Millman. “Like it shifted from desktops to laptops to smartphones and tablets, it may shift to wearable devices in the future.

“Also this year we built a prototype using virtual reality to immerse people in the rugby experience. I’m not sure VR will become mainstream in a sporting context in the next 12-18 months, but I think VR and AR (augmented reality) will increasingly become a part of the sports viewing experience.”

Organizations struggling to capitalize on benefits of big data

Big data is now considered one of the more significant priorities for businesses through 2016, however research from DNV GL has highlighted that only 23% of organizations have a defined strategy moving forward.

According to research from the business assurance arm of DNV GL, while the majority (52%) of companies have outlined the importance of big data for future operations, only around a quarter have the capabilities to fulfil the promise and capitalize fully on the benefits. The interest increases significantly for larger organizations, those with 1,000 or more employees, as 70% highlighted it as a priority.

“Big data is changing the game in a number of industries, representing new opportunities and challenges,” says Luca Crisciotti, CEO of DNV GL – Business Assurance. “I believe that companies that recognize and implement strategies and plans to leverage the information in their data pools have increased opportunities to become more efficient and meet their market and stakeholders better.”

One of the larger concerns for big data voiced at conferences and in articles in recent months is an organization’s ability to act upon the potential of the information now available. The volume of data is growing at a notable rate, yet few organizations have the technological capabilities or adequately trained employees to realize that potential.

BCN has been told in numerous conversations that many organizations can currently analyse only a small proportion of the data they collect, between 5% and 25% depending on who you speak to. Until organizations are capable of analysing and actioning larger proportions of that data, the potential of big data and the promised ROI will not be achieved. DNV GL claims 16% of organizations cite better business decision making as the aim of big data and 11% cite financial savings, while 16% have prioritised an improved user experience.

“The ability to use data to obtain actionable knowledge and insights is inevitable for companies that want to keep growing and profiting,” said Crisciotti. “The data analyst or scientist will be crucial in most organizations in the near future.”

DNV GL believes more has to be done to enable and prepare organizations to utilize big data to its full extent. The team claims only 28% have improved their information management and 25% have implemented new technologies and methods. From an internal perspective, only 16% have addressed the company culture and 15% the company’s business model.

Big data has been championed as a means to drive efficiency within organizations, but also as an opportunity to create a more personalized experience for customers in the digital era. It is also seen as a prelude to artificial intelligence, another area which has dominated headlines in recent months. Neither will be achievable until investments are made in the technology and personnel needed to increase the proportion of data which can be understood and actioned.

SAP updates BusinessObjects offering at SAPPHIRE NOW conference

SAP has announced a number of new updates for its analytics solutions portfolio at the 28th annual SAPPHIRE NOW conference.

The company’s business intelligence portfolio, BusinessObjects, will continue to offer solutions on premise and in the cloud, as well as incorporating a number of new features for visualizations and storytelling, data wrangling and blending, geospatial, trend analysis, custom filters, linked stories, notifications and chat.

“SAP is enabling companies to lead in the digital economy by significantly simplifying the platform, providing best-in-class analytics and a superior user experience,” said Stefan Sigg, SVP for SAP Analytics. “SAP BusinessObjects remains the most relevant analytics in the industry — and we offer the best end-to-end capabilities both on premise and in the cloud in the market today.”

One enhancement has focused more on the integration and collaboration efforts of the business, as the offering can now connect and blend existing data sources such as the SAP ERP, SAP SuccessFactors solutions, Salesforce, and Google Drive (amongst others), on a single platform without having to move data into the cloud environment. The offering now also includes predictive analytics capabilities leveraging powerful built-in algorithmic models, to enhance data-driven decision making capabilities.
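As a loose illustration of what blending data from several systems looks like in practice (and not SAP's own API), the sketch below joins hypothetical extracts from an ERP system and a CRM on a shared customer key before analysis.

```python
# Illustrative only: a generic example of "blending" records from two
# systems on a shared key so they can be analysed together.
import pandas as pd

# Hypothetical extracts: one from an ERP system, one from a CRM.
erp_orders = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "order_value": [2500, 400, 1200],
})
crm_accounts = pd.DataFrame({
    "customer_id": [101, 102, 104],
    "segment": ["enterprise", "smb", "smb"],
})

# Blend the two extracts: keep every order, attach the matching CRM segment.
blended = erp_orders.merge(crm_accounts, on="customer_id", how="left")
print(blended)
```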

SAP also updated its BusinessObjects Enterprise offering, which has been mainly designed for on premise analytics. Enterprise organizations have a choice of premium, professional and standard editions, which offer a variety of services including enhancements which make the platform Internet of Things–ready.

The company also launched one of its newest cloud offerings, the Digital Boardroom (see below), which has been built on the BusinessObjects platform. The Digital Boardroom is a real-time business intelligence and ad hoc analysis portal, providing executives with information sourced from all SAP S/4HANA Lines of Business data to create a “single source of truth for the company”.

Digital Boardroom

SAP’s HANA launches on Huawei’s FusionSphere cloud platform

Huawei and SAP have announced the general availability of the SAP HANA platform on Huawei’s OpenStack cloud platform FusionSphere 5.1.

The announcement builds on a long-standing partnership dating back to 2012, when Huawei became an SAP global technology partner. Last year the two companies opened a co-innovation centre at Huawei’s Shenzhen campus, tasked with advancing their capabilities in the cloud computing and big data market segments.

“SAP is the world’s largest provider of enterprise application software, and SAP HANA is leading enterprise software innovation right now,” said Zhipeng Ren, President of the Huawei IT Cloud Computing Product Line. “Huawei’s FusionCloud solution support for SAP HANA is widely accepted in the market. With the open cloud computing strategy, Huawei builds a win-win cloud ecosystem through an open, enterprise-class cloud platform.

“Based on OpenStack open source architecture, Huawei FusionSphere has made thousands of enterprise-class enhancements, and is an ideal cloud infrastructure platform for SAP HANA and critical enterprise applications. In the meantime, our joint initiatives with SAP are intended to create more value for customers to achieve their goals.”

Over the course of the relationship, SAP’s HANA offering has been made available on a number of Huawei platforms including FusionCube, FusionServer RH2288H V2/V3, FusionServer RH5885H V3 and FusionServer RH8100 V3. Huawei claims that since FusionSphere can run business applications that have traditionally been run on premise, the platform will create a number of new opportunities for mass processing of big data on the cloud.

Can your analytics tools meet the demands of the big data era?

Speaking at Telco Cloud, Actian’s CTO Michael Hoskins outlined the impact big data is having on the business world, and the challenges faced by those who are not keeping up with the explosion of data now available to decision makers.

The growth of IoT and the subsequent increase in data has been widely reported. Last year, Gartner predicted the number of connected ‘things’ would exceed 6.4 billion by the end of 2016 (an increase of 22% from 2015), and continue to grow to beyond 20.8 billion by 2020. While IoT is a lucrative industry, businesses are now facing the task of not only managing the data, but gaining insight from such a vast pool of unstructured information.

“Getting a greater understanding of your business is the promise of big data,” said Hoskins. “You can see things which you never were able to before, and it’s taking business opportunities to the next generation. The cloud is really changing the way in which we think about business models – it enables not only for you to understand what you are doing within your business, but the industry on the whole. You gain insight into areas which you never perceived before.”

Actian is one of a number of companies who are seemingly capitalizing on not only the growth of IoT and big data, but also the fact it has been rationalized by decision makers within enterprise as a means to develop new opportunities. The company has been building its presence in the big data arena for five years, and has invested more than $300m in growing organically, as well as acquiring new technology capabilities and expertise externally. As Hoskins highlighted to the audience, big data is big business for Actian.


Actian’s CTO Michael Hoskins

But what are the challenges which the industry is now facing? According to Hoskins, the majority of us don’t have the right tools to fully realize the potential of big data as a business influencer.

“The data explosion which is hitting us is so violent, it’s disrupting the industry. It’s like two continents splitting apart,” said Hoskins. “On one continent we have the traditional tools, and on the other we have the new breed of advanced analytics software. The new tools are drifting away from the traditional, and the companies who are using the traditional are being left behind.”

Data analytics as a business practice is by no means a new concept, but the sheer volume, variety and speed at which data is being collected means the traditional technologies used to analyse this data are being made redundant. Hoskins highlighted that they’re too slow (they can’t keep up with the velocity of collection), they’re too rigid (they can’t comprehend the variety of data sets), and they’re too cumbersome (they can’t manage the sheer volume of data). In short, these tools are straining under the swell.

The next challenge is scaling current technologies to meet these demands, which in most cases is a very difficult proposition. It is often too short-term and too expensive, and the skills are not abundant enough. Hoskins believes the time-cost-value proposition simply does not make sense.

“The journey of modernization goes from traditional, linear tools, through to business intelligence and discovery, this is where we are now, through to decision science,” said Hoskins. “Traditional tools enable us to look back at what we’ve done and make reactive decisions, but businesses now want to have a forward looking analytics model, drawing out new insights to inform decision making. But this cannot be done with traditional tools.

“This is the promise of advanced analytics. The final stage is where we can use data analytics to inform business decisions; this is where data becomes intelligence.”

New HP Tech Venture Group may lead to HPE overlap

HP has announced the launch of HP Tech Ventures, the new corporate venture arm of the business, which will invest in IoT and artificial intelligence start-ups that could end up competing with HPE.

The team will aim to develop partnerships and identify potential acquisitions within the new era of disruptive technologies. HP Tech Ventures, which will be based out of offices in Palo Alto and Tel Aviv, will be led by Chief Disrupter Andrew Bolwell, targeting new technologies in 3D transformation, immersive computing, hyper-mobility, Internet of Things, artificial intelligence, and smart machines in the first instance.

Following the split of Hewlett-Packard into two separate organizations, HP took the PC and printer assets, while HPE is now focused on enterprise-orientated technologies. Over the last several months, HPE has made numerous product launches and investments in cloud, machine-learning and IoT technologies, and HP Tech Ventures’ targeted technologies (IoT, AI, smart machines etc.) could potentially make the once combined companies competitors. HPE also has its own venture arm, which has invested in various cloud, big data and security start-ups.

“The next technology revolution is shifting towards strategic markets that speak to HP’s strengths,” said Shane Wall, HP Chief Technology Officer and head of HP Labs. “With our global brand and broad reach into consumer and commercial markets worldwide, HP can help start-ups bring product to market, build their business and scale in the global marketplace as they grow.”

The company has claimed it will be able to offer rapid scale to innovative start-ups through its technology network, as well as its channel and distribution partners. The launch would appear to be one of HP’s strategies to counter the negative impact which declining PC sales are having on its traditional business, entering new markets through potential acquisitions as opposed to organic growth.

ATP teams up with Infosys to launch big data driven ranking system

The Association of Tennis Professionals (ATP) has partnered with Infosys to launch a new statistical way to measure the best performing ATP World Tour players.

The new ATP Stats Leaderboards make use of Infosys’ data analytics capabilities to bring together recorded stats from professionals on the tour today and rank them in three categories (Serving, Returning and Under Pressure), and even allow users to compare current players with greats from the past. The three categories can be broken down by surface, by year, by the past 52 weeks or by career.

“These new statistics offer players, fans and media interesting new insights into how our athletes are rating in three key areas against their peers on the ATP World Tour,” said Chris Kermode, ATP Executive Chairman. “There is huge potential to understand our sport better through the development of new statistics, and we look forward to further advances coming soon in this area through our partnership with Infosys.”

The project uses the Infosys Information Platform, an open source data analytics platform, and brings together the vast amount of data collected by the ATP over the years to give fans a concise rating of players on the tour today. The rankings are determined through various big data models combining several metrics, including the number of double faults during a game, the number of aces, the percentage of points won on an opponent’s serve and the number of successfully converted break points, to give a measure of how players are performing currently and in comparison to previous parts of the season.
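As a rough illustration of how such metrics could be combined, the sketch below computes simple Serving, Returning and Under Pressure scores from the statistics named above. The formulas and example figures are assumptions made purely for illustration, not the models used by Infosys or the ATP.

```python
# A minimal sketch of the idea, not Infosys' actual model: the formulas
# and normalisation below are assumptions for illustration only.
def serving_score(aces, double_faults, service_points):
    """Higher is better: reward aces, penalise double faults."""
    return 100.0 * (aces - double_faults) / max(service_points, 1)

def returning_score(return_points_won, return_points_played):
    """Percentage of points won on the opponent's serve."""
    return 100.0 * return_points_won / max(return_points_played, 1)

def pressure_score(break_points_converted, break_points_earned):
    """Share of break-point chances actually converted."""
    return 100.0 * break_points_converted / max(break_points_earned, 1)

# Example: a fabricated season line for one player
print(serving_score(aces=480, double_faults=150, service_points=5200))
print(returning_score(return_points_won=1900, return_points_played=4800))
print(pressure_score(break_points_converted=210, break_points_earned=480))
```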

“The uniqueness of our partnership with the ATP World Tour lies in being able to challenge the traditional models, and experiment and embrace technology to create a compelling experience for fans across the globe,” said U B Pravin Rao, Chief Operating Officer at Infosys. “We firmly believe that technology can amplify our ability to create this unique differentiation and we will continue to find newer avenues to elevate the fan experience.”

While this would be considered a novel concept for the game of tennis, the use of big data and advanced analytics tools is not new for the world of sports entertainment. Accenture Digital has been using its data analytics capabilities to predict the outcome of the Six Nations and the recent Rugby World Cup.

The company has been a technology partner of the Six Nations for five years now, and this year introduced an Oculus Rift beta virtual reality headset and development kit as part of its ongoing marketing strategy to demonstrate its capabilities. The company claims to process more than 1.9 million rows of data during every match, and has also developed parameters for 1,800 algorithms to bring the data, dating back to 2006, to life. After each match, approximately 180,000 on-field actions are added to the growing data store to refine the decision-making capabilities.

Big Data looks inwards to transform network management and application delivery

We’ve all heard of the business applications touted by big data advocates – data-driven purchasing decisions, enhanced market insights and actionable customer feedback. These are undoubtedly of great value to businesses, yet organisations only have to look inwards to find further untapped potential. Here Manish Sablok, Head of Field Marketing NWE at ALE, explains the two major internal IT processes that can benefit greatly from embracing big data: network management and application delivery.

SNS Research estimated Big Data investments reached $40 billion worldwide this year. Industry awareness and reception are equally impressive – ‘89% of business leaders believe big data will revolutionise business operations in the same way the Internet did.’ But big data is no longer simply large volumes of unstructured data, or just a tool for refining external business practices – the applications continue to evolve. The advent of big data analytics has paved the way for smarter network and application management. Big data can ultimately be leveraged internally to deliver cost-saving efficiencies and the optimisation of network management and application delivery.

What’s trending on your network?

Achieving complete network visibility has been a primary concern of CIOs in recent years – and now the arrival of tools to exploit big data provides a lifeline. Predictive analytics techniques enable a transition from a reactive to proactive approach to network management. By allowing IT departments visibility of devices – and crucially applications – across the network, the rise of the Bring Your Own Device (BYOD) trend can be safely controlled.

The newest generation of switch technology has advanced to the stage where application visibility capability can now be directly embedded within the most advanced switches. These switches, such as the Alcatel-Lucent Enterprise OmniSwitch 6860, are capable of providing an advanced degree of predictive analytics. The benefits of these predictive analytics are varied – IT departments can establish patterns of routine daily traffic in order to swiftly identify anomalies hindering the network. Put simply, the ability to detect what is ‘trending’ – be it backup activities, heavy bandwidth usage or popular application deployment – has now arrived.
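As a simple illustration of the idea (not ALE's implementation), the sketch below builds a rolling baseline from routine traffic readings and flags measurements that deviate sharply from it.

```python
# A minimal sketch: flag traffic readings that deviate sharply from a
# rolling baseline of "routine" usage. Thresholds are illustrative only.
import statistics

def find_anomalies(readings_mbps, window=24, threshold=3.0):
    """Return indices of readings more than `threshold` standard deviations
    away from the mean of the preceding `window` readings."""
    anomalies = []
    for i in range(window, len(readings_mbps)):
        baseline = readings_mbps[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1.0  # avoid divide-by-zero
        if abs(readings_mbps[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# Example: steady hourly traffic with one sudden spike (e.g. a rogue backup)
traffic = [100, 102, 98, 101, 99, 103, 100, 97, 101, 102,
           99, 100, 98, 102, 101, 100, 99, 103, 98, 101,
           100, 102, 99, 101, 640, 100]
print(find_anomalies(traffic))  # [24] -- the spike stands out
```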

More tasks can be automated than ever before, with a dynamic response to network and user needs becoming standard practice. High-priority users, such as internal teams requiring continued collaboration, can be allocated the necessary network capacity in real time.

Effectively deploy, monitor and manage applications

Effective application management has its own challenges, such as the struggle to enforce flexible but secure user and device policies. Big data provides the business intelligence necessary to closely manage application deployment by analysing data streams, including application performance and user feedback. Insight into how employees or partners are using applications allows IT departments to identify redundant features or little used devices and to scale back or increase support and development accordingly.

As a result of the increasing traffic from voice, video and data applications, new network management tools have evolved alongside the hardware. The need to reduce the operational costs of network management, while at the same time providing increased availability, security and multimedia support has led to the development of unified management tools that offer a single, simple window into applications usage. Centralised management can help IT departments predict network trends, potential usage issues and manage users and devices – providing a simple tool to aid business decisions around complex processes.

Through the effective deployment of resources based on big data insight, ROI can be maximised. Smarter targeting of resources makes for a leaner IT deployment, and reduces the need for investment in further costly hardware and applications.

Networks converging on the future

Big data gathering, processing and analytics will all continue to advance and develop as more businesses embrace the concept and the market grows. But while the existing infrastructure in many businesses is capable of using big data to a limited degree, a converged network infrastructure, by providing a simplified and flexible architecture, will maximise the benefits and at the same time reduce Total Cost of Ownership – and meet corporate ROI requirements.

By introducing this robust network infrastructure, businesses can ensure a future-proof big data operation is secure. The advent of big data has brought with it the ability for IT departments to truly develop their ‘smart network’. Now it is up to businesses to seize the opportunity.

Written by Manish Sablok, Head of Field Marketing NWE at Alcatel Lucent Enterprise

FICO reinforces market position with product updates

Global analytics firm FICO has launched a number of new and updated offerings to enable businesses to develop prescriptive analytics and decision management applications and improve business decision agility.

The Decision Management Suite 2.0 product – an updated version of the suite of the same name – enables customers to develop analytic applications in the cloud and improve automated business decision agility. The Decision Central offering manages, audits, reports on and updates decision logic and predictive models, so customers can record and store automated decisions to be reused, modified and improved. Finally, the Strategy Director tool helps users structure the decision flow. The tools are available through Amazon Web Services or as a private cloud or on-premises deployment.
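To give a flavour of what decision logic expressed as manageable, auditable assets might look like (this is a generic illustration, not FICO's API), the sketch below encodes a handful of hypothetical rules as data and returns an audit trail alongside each decision.

```python
# Not FICO's API: a generic illustration of decision logic expressed as
# data, so individual rules can be stored, audited and modified over time.
RULES = [
    # (name, predicate, outcome) -- thresholds here are purely illustrative
    ("high_risk_score", lambda a: a["risk_score"] > 700, "decline"),
    ("large_exposure",  lambda a: a["requested_credit"] > 50_000, "refer"),
    ("default",         lambda a: True, "approve"),
]

def decide(applicant):
    """Return the first matching rule's outcome plus an audit trail."""
    for name, predicate, outcome in RULES:
        if predicate(applicant):
            return {"decision": outcome, "rule": name, "input": applicant}

print(decide({"risk_score": 620, "requested_credit": 75_000}))
# {'decision': 'refer', 'rule': 'large_exposure', ...}
```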

“Many Big Data deployments have failed to deliver competitive advantage because their approach is completely backwards,” said Stuart Wells, CTO at FICO. “We focus on decisions-first, as opposed to data-first. That gives our customers the fastest route to real value, and the agility to change course faster than the competition. It means being able to innovate like start-ups. Some of our Decision Management Suite customers have reduced the time to deploy an analytic application from months to days, and the time to model a decision and act on it from weeks to minutes.”

While the concept of Big Data has been around for some time, many businesses struggle to comprehend the vast amount of data collated, or to utilize it within the organization in any meaningful manner. The FICO product suite is one of a number of new products in the industry aiming to bring all this data into one concise system and ultimately drive decision-making capability through the insight uncovered.

“The original launch of the FICO Decision Management Suite in 2013 represented a dramatic change in decision logic authoring and application development,” Wells said. “Now, with version 2 of the Decision Management Suite, FICO’s customers have the chance to pull even further ahead of their competitors. This product suite represents the future of prescriptive analytics and decision management, and it’s available now.”

Meeting the demands of an aging population through open data healthcare

Speaking at Ovum’s Smart to Future City Forum, Ian Jones, Smart City Lead at the City of Leeds, highlighted the city’s ambition to create a citizen- and data-driven healthcare programme for its aging population.

Using a strategy based on digital innovation and open data, the team are in the process of bridging the £600 million gap in budgets to meet the demands of an aging population. The ambition of the city is to create a programme which enables digital thinking in a health system which could be seen as bulky, unresponsive and limited.

“Open data gives us a view on how the city operates,” said Jones. “It allows customers to see data, understand the situation, raise questions and allows us to use the data to encourage innovators to help us solve the city’s problems. How we use the data is driven entirely from the community. This is where the value is driven from.”

The city’s first challenge is to bring together its five trusts on one public services network, to increase collaboration and integration and achieve what the city is describing as citizen-driven health. Ultimately the team is driving towards the concept of citizens managing their own health through a digital model and open data infrastructure.

The concept itself is fundamentally built out of the citizens’ own needs. After an initial consultation process, the team has driven a number of different initiatives, ranging from transportation challenges for an aging population and poor air quality within the city to diabetes management.

Through the deployment of various IoT devices throughout the city, the Leeds Data Mill acts as an open data hub to enable citizens to drive innovation in the city. Using this concept, the team aims to add value for the overall population by taking ideas from the citizens themselves, as opposed to dictating what is good for them. This is the essence of citizen-driven health.