ATP teams up with Infosys to launch big data driven ranking system

The Association of Tennis Professionals (ATP) has partnered with Infosys to launch a new statistical approach to measuring the best-performing ATP World Tour players.

The new ATP Stats Leaderboards make use of Infosys’ data analytics capabilities to bring together recorded stats from professionals on the tour today and rank them in three categories: Serving, Returning and Under Pressure. Users can also compare current players with greats from the past, and each category can be broken down by surface, by year, by the past 52 weeks or by career.

“These new statistics offer players, fans and media interesting new insights into how our athletes are rating in three key areas against their peers on the ATP World Tour,” said Chris Kermode, ATP Executive Chairman. “There is huge potential to understand our sport better through the development of new statistics, and we look forward to further advances coming soon in this area through our partnership with Infosys.”

The project uses the Infosys Information Platform, an open-source data analytics platform, and brings together the vast amount of data collected by the ATP over the years to give fans a concise rating of players on the tour today. The rankings are determined through big data models combining several metrics, including the number of double faults during a game, the number of aces, the percentage of points won on an opponent’s serve and the number of successfully converted break points, to measure how players are performing now and in comparison to earlier parts of the season.
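The actual ATP/Infosys models are proprietary, so the metric names, weights and normalisation below are assumptions, but a minimal sketch of combining per-player serve metrics into a single leaderboard score might look like this:

```python
# Illustrative sketch only: the real ATP Stats Leaderboards models are
# proprietary. The metrics, weights and 0-100 scale here are invented.

def serve_rating(stats, weights=None):
    """Combine per-player serve metrics into a single 0-100 score."""
    weights = weights or {
        "ace_pct": 0.35,            # aces per service point: higher is better
        "first_serve_won_pct": 0.40,
        "double_fault_pct": -0.25,  # penalised: lower is better
    }
    raw = sum(stats[k] * w for k, w in weights.items())
    # Clamp to a 0-100 leaderboard scale.
    return max(0.0, min(100.0, raw * 100))

players = {
    "Player A": {"ace_pct": 0.12, "first_serve_won_pct": 0.78, "double_fault_pct": 0.04},
    "Player B": {"ace_pct": 0.07, "first_serve_won_pct": 0.71, "double_fault_pct": 0.06},
}

# Rank players by their serve rating, best first.
leaderboard = sorted(players, key=lambda p: serve_rating(players[p]), reverse=True)
```

Breaking the ratings down by surface or by season, as the leaderboards do, would simply mean feeding a filtered subset of matches into the same scoring function.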

“The uniqueness of our partnership with the ATP World Tour lies in being able to challenge the traditional models, and experiment and embrace technology to create a compelling experience for fans across the globe,” said U B Pravin Rao, Chief Operating Officer at Infosys. “We firmly believe that technology can amplify our ability to create this unique differentiation and we will continue to find newer avenues to elevate the fan experience.”

While this is a novel concept for the game of tennis, the use of big data and advanced analytics tools is not new to the world of sports entertainment. Accenture Digital has been using its data analytics capabilities to predict the outcomes of the Six Nations and the recent Rugby World Cup.

The company has been a technology partner of the Six Nations for five years now, and this year introduced an Oculus Rift beta virtual reality headset and development kit as part of its ongoing marketing strategy to demonstrate its capabilities. The company claims to process more than 1.9 million rows of data during every match, and has developed parameters for 1,800 algorithms to bring the data, dating back to 2006, to life. After each match, approximately 180,000 on-field actions are added to the growing data store to refine its decision-making capabilities.

[session] Hybrid Security Monitoring By @AlertLogic | @CloudExpo #Cloud

The demand for organizations to expand their infrastructure to multiple IT environments like the cloud, on-premise, mobile, bring your own device (BYOD) and the Internet of Things (IoT) continues to grow. As this hybrid infrastructure increases, the challenge to monitor the security of these systems increases in volume and complexity.
In his session at 18th Cloud Expo, Stephen Coty, Chief Security Evangelist at Alert Logic, will show how properly configured and managed security architecture can defend against the cyber kill chain. The audience will learn about the “10 Security Best Practices” for effective security solutions and proven strategies to prevent harmful cyber activity.


IBM launches quantum computing platform for public on cloud

IBM has announced that its quantum computing platform, Quantum Experience, will be available to the public through its cloud platform, allowing users to access and run experiments on the company’s quantum processor.

The platform, which can be delivered to any desktop or mobile device, will drive IBM’s efforts to redefine its perception in the industry. The company believes quantum computing is the future of computing, with the potential to solve certain problems that are impossible for today’s supercomputers.

“Quantum computers are very different from today’s computers, not only in what they look like and are made of, but more importantly in what they can do. Quantum computing is becoming a reality and it will extend computation far beyond what is imaginable with today’s computers,” said Arvind Krishna, SVP at IBM Research. “This moment represents the birth of quantum cloud computing.

“By giving hands-on access to IBM’s experimental quantum systems, the IBM Quantum Experience will make it easier for researchers and the scientific community to accelerate innovations in the quantum field, and help discover new applications for this technology.”

IBM believes the momentum driven by Moore’s law is ‘running out of steam’, and that quantum computing will be the next catalyst for innovation in the cloud computing era. The power available through quantum computing has the potential to take technologies such as artificial intelligence to the next level, as well as to increase the long-term potential of IBM’s Watson.

Quantum computing is by no means a new idea; Richard Feynman proposed building computers based on the laws of quantum mechanics in 1981, but it is only now becoming a reality within the industry. A classical computer uses bits to process information, where each bit represents either a one or a zero. In contrast, a qubit can represent a one, a zero, or both at once, a property known as superposition. The outcome could be a platform that processes certain calculations dramatically faster than classical computers.
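The superposition idea above can be sketched in a few lines of plain Python: a qubit is a pair of complex amplitudes (alpha, beta) with |alpha|² + |beta|² = 1, and measuring it collapses the state to 0 with probability |alpha|² or 1 with probability |beta|². This is a toy classical simulation of the arithmetic, not how IBM’s hardware works.

```python
import math
import random

# Toy simulation of a single qubit measurement. A real quantum processor
# does not compute probabilities like this; we are only illustrating the
# |alpha|^2 / |beta|^2 rule described in the text.

def measure(alpha, beta, rng=random.random):
    """Collapse state (alpha, beta) to a classical 0 or 1."""
    p0 = abs(alpha) ** 2
    assert abs(p0 + abs(beta) ** 2 - 1.0) < 1e-9, "state must be normalised"
    return 0 if rng() < p0 else 1

# Equal superposition: a 50/50 chance of reading 0 or 1.
alpha = beta = 1 / math.sqrt(2)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
# counts[0] and counts[1] each land near 5,000
```

Because a register of n qubits holds amplitudes for all 2ⁿ basis states at once, simulating even 50 qubits this way becomes infeasible on classical hardware, which is exactly the scale IBM is targeting.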

Some corners of the industry see IBM as one of the organizations left behind in the pre-cloud era, though this announcement, and the work done by the team to progress Watson, is seemingly creating a new market for the business. Rather than simply playing catch-up in the traditional cloud markets, IBM appears to be looking further afield to redefine its perception.

IBM announces availability of quantum computing for testing on IBM cloud


IBM is making quantum computing available to the public, allowing people to access and run experiments on IBM’s quantum processor.

Scientists at the company have manufactured a quantum processor that can be accessed via a quantum computing platform delivered through the IBM Cloud to any desktop or mobile device.

The cloud-enabled quantum computing platform, called IBM Quantum Experience, facilitates the running of algorithms and experiments on IBM’s quantum processor. Users can work with individual quantum bits (qubits) while going through tutorials and simulations that highlight the strengths of quantum computing.

The quantum processor comprises five superconducting qubits and is based at the IBM T.J. Watson Research Center in New York.

The five-qubit processor represents the latest advancement in IBM’s quantum architecture, which can scale to larger quantum systems. The processor is also reportedly a leading step towards manufacturing a universal quantum computer.

A universal quantum computer can be programmed to perform any computing task and will be exponentially faster than classical computers for a number of important applications for science and business.

Though this kind of computer does not exist today, IBM foresees medium-sized quantum processors of 50-100 qubits within the next 10 years. A 50-qubit quantum computer would surpass a TOP500 supercomputer by a long way.

MS Terminal Server and Parallels Remote Application Server- a Winning Combination!

MS Terminal Server is a server program in Windows NT 4.0 and above operating systems that delivers graphical user interface (GUI) instances of a Windows desktop to a remote client device. It has the ability to host multiple client sessions and effectively publish desktops to thin clients, Windows-based and non-Windows-based. Using MS Terminal Server, businesses […]

The post MS Terminal Server and Parallels Remote Application Server- a Winning Combination! appeared first on Parallels Blog.

Node.js in Retail | @DevOpsSummit #DevOps #Docker #Microservices

In a crowded world of popular computer languages, platforms and ecosystems, Node.js is one of the hottest. According to w3techs.com, Node.js usage has gone up 241 percent in the last year alone. Retailers have taken notice and are implementing it on many levels. I am going to share the basics of Node.js, and discuss why retailers are using it to reduce page load times and improve server efficiency. I’ll talk about similar developments such as Docker and microservices, and look at several companies implementing these technologies. I’ll also discuss how mobile computing is changing buyer behavior and expectations.


Microsoft adds more software capabilities to Azure IoT suite

Microsoft has announced the acquisition of Solair, an Italian IoT software company which currently operates in the manufacturing, retail, food & beverage and transportation industries.

Solair’s software, which runs on the Microsoft Azure platform, focuses on helping customers improve the efficiency and profitability of their IoT initiatives. The acquisition furthers Microsoft’s ambitions in the IoT market segment, delivering a more complete solution rather than the Azure IoT platform alone.

“The integration of Solair’s technology into the Microsoft Azure IoT Suite will continue to enhance our complete IoT offering for the enterprise,” said Sam George, Partner Director, Azure IoT at Microsoft. “We’ll have more specifics to share about how Solair is helping us build the intelligent cloud in the future. In the meantime, I’d like to reiterate my welcome to the Solair team.”

Solair has been in operation for five years now, and boasts a healthy number of customers including Rancilio Group, where it enabled a connected coffee machine maintenance strategy, and Minerva Omega where it created a remote maintenance and service strategy for the food processing group. Financial details of the acquisition were not released, and Microsoft did not release any specific details of how the business will be integrated into the overall Azure IoT suite.

“From the very start, our mission has been to help customers quickly and easily gain access to the huge benefits of the Internet of Things (IoT),” said Tom Davis, CEO at Solair. “By building our solutions based on real customer requirements that allow them to gain real value, I’m confident that Solair’s technology and talent will be able to make an important contribution to Microsoft’s Azure IoT Suite and Microsoft’s broader IoT ambitions.”

Microsoft has seemingly been on a mission to bolster its position in the IoT market, both organically and through acquisitions. At Build 2016, the team launched Azure Service Fabric and new IoT starter kits, as well as previews of Azure Functions, a serverless compute service for event-driven solutions, and Power BI Embedded, which allows developers to embed reports and visualizations in any application.

EMC outlines ‘Technical Debt’ challenges for data greedy enterprises

Jeremy and Guy on stage day 2

President of Core Technologies Division Guy Churchward (Left) and Jeremy Burton, President of Products and Marketing (Right) at EMC World 2016

Speaking at EMC World, President of Core Technologies Division at EMC Guy Churchward joined Jeremy Burton, President of Products and Marketing, to outline one of the industry’s primary challenges, technical debt.

The idea of technical debt is being felt by the new wave of IT professionals. This generation is under pressure from most areas of the business to innovate and create an agile, digitally enabled business, but still has commitments to the traditional IT systems on which the business currently operates. This commitment to legacy technologies, which can represent a significant proportion of a company’s IT budget and prevent future innovation, is what Churchward describes as technical debt.

“They know their business is transforming fast,” said Churchward. “Business has to use IT to make their organization a force to be reckoned with and remain competitive in the market, but all the money is taken up by the current IT systems. This is what we call technical debt. A lot of people have to do more with what they have and create innovation with a very limited budget. This is the first challenge for every organization.”

This technical debt is described by Churchward as the first challenge every IT department will face when driving towards the modern data centre. It makes the business clunky and ineffective, but it is a necessity to keep the organization operating until the infrastructure can be upgraded to something modern. Finding the budget without compromising current operations can be a tricky proposition.

“When you live in an older house, where the layout doesn’t really work for the way you live your life and there aren’t enough closets to satiate your wife’s shoe fetish, maybe it’s time to modernize,” said Churchward on his blog. “But do you knock the whole house down and start again? Maybe it’s tempting but, what about the investment that you’ve already made in your home? It’s similar when you want to modernize your IT infrastructure. You have money sunk into your existing technology and you don’t want to face the disruption of completely starting again.

“For many companies, this debt includes a strategy for data storage that takes advantage of a shrinking per-gig cost of storage that enables them to keep everything. And that data is probably stored primarily on spinning disk with some high-availability workloads on flash in their primary data centre. The old way of doing things was to see volumes of data growing and address that on a point basis with more spinning disk. Data centres are bursting at the seams and it’s now time to modernize – but how?”

Churchward highlighted that the first step is to remove duplicate data sets – EMC launched its Enterprise Copy Data Management tool at EMC World this week – to reduce unnecessary spend within the data centre. While there are a number of reasons to duplicate and keep old data sets for a defined period of time, Churchward commented that this data is often forgotten and becomes an unnecessary expense. Although identifying and removing this data might seem a simple way to erase a portion of the technical debt, Churchward believes it could be a $50 billion business problem by 2018.

The Enterprise Copy Data Management software helps customers discover, automate and optimize copy data to reduce costs and streamline operations. The tool automatically identifies duplicate data sets across data centres and, using data-driven decision-making software, optimizes storage plans and, where necessary, deletes the duplicates.
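EMC’s tool is proprietary, but the core idea behind discovering duplicate data sets, fingerprinting each set by content hash and grouping identical copies, can be sketched as follows; the data set names and contents are invented for illustration:

```python
import hashlib

# Hypothetical sketch of copy-data discovery: fingerprint each data set by a
# content hash, then report any hash shared by more than one data set. This
# is not EMC's actual Enterprise Copy Data Management implementation.

def fingerprint(data: bytes) -> str:
    """Content-based fingerprint of a data set."""
    return hashlib.sha256(data).hexdigest()

def find_duplicates(datasets: dict) -> dict:
    """Map content hash -> names of data sets sharing identical content."""
    groups = {}
    for name, blob in datasets.items():
        groups.setdefault(fingerprint(blob), []).append(name)
    # Only groups with more than one member are duplicates.
    return {h: names for h, names in groups.items() if len(names) > 1}

datasets = {
    "prod/backup-jan": b"customer records v1",
    "dev/test-copy":   b"customer records v1",  # forgotten duplicate
    "prod/backup-feb": b"customer records v2",
}
dupes = find_duplicates(datasets)
# One group of two identical copies is reported
```

In practice such tools hash at the block or object level rather than whole data sets, so near-duplicates and partial copies can be found too, but the grouping principle is the same.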

This is just one example of how the challenge of technical debt can be managed, though the team at EMC believes this challenge, the first in a series encountered when transforming into a modern business, can be one of the largest. Whether this is one of the reasons cloud adoption within the mainstream market could be slower than anticipated remains to be seen, though the removal of redundant and/or duplicated data could provide some breathing room, and budget, for the journey towards the modern data centre.

Artificial Intelligence and IoT | @ThingsExpo #AI #IoT #DigitalTransformation

Artificial Intelligence has the potential to massively disrupt IoT.
In his session at 18th Cloud Expo, AJ Abdallat, CEO of Beyond AI, will discuss what the five main drivers are in Artificial Intelligence that could shape the future of the Internet of Things.
AJ Abdallat is CEO of Beyond AI. He has over 20 years of management experience in the fields of artificial intelligence, sensors, instruments, devices and software for telecommunications, life sciences, environmental monitoring, process control, homeland security, energy and finance.


IoT and Fog Computing | @ThingsExpo @Metavine #IoT #InternetOfThings

The IoT will challenge the status quo of how IT and development organizations operate. Or will it? Certainly the fog layer of IoT requires special insight into data ontology, security and transactional integrity. But the developmental challenges are the same: People, Process and Platform.
In his session at @ThingsExpo, Craig Sproule, CEO of Metavine, will demonstrate how to move beyond today’s coding paradigm and share the must-have mindsets for removing complexity from the development process, accelerating application delivery times, and ensuring that developers will become heroes (not bottlenecks) in the IoT revolution.
