Category Archives: data analytics

Teradata VantageCloud integrated with Microsoft Azure Machine Learning

Teradata has integrated Teradata VantageCloud, a cloud analytics and data platform, with Microsoft Azure Machine Learning (Azure ML). VantageCloud’s scalability, openness and analytics – ClearScape Analytics – combined with Azure ML’s ability to simplify and accelerate the ML lifecycle could help customers unlock the full value of their data, even in the most complex and…

Confluent acquires Immerok to develop cloud native Apache Flink offering

Confluent, a data streaming specialist, has signed a definitive agreement to acquire Immerok, a contributor to Apache Flink – a powerful technology for building stream processing applications and one of the most popular Apache open source projects. Immerok has developed a cloud-native, fully managed Flink service for customers looking to process data streams at a…
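
For readers unfamiliar with Flink, its programming model centres on transformations applied to unbounded streams of events. The sketch below is a minimal, illustrative pipeline using the PyFlink DataStream API; the data and job name are invented for the example, and a production pipeline would read from a source such as Kafka rather than an in-memory collection.

    from pyflink.datastream import StreamExecutionEnvironment

    # Minimal PyFlink pipeline sketch. Data and job name are
    # illustrative; real deployments would consume a live stream.
    env = StreamExecutionEnvironment.get_execution_environment()

    readings = env.from_collection([("sensor-1", 21.5), ("sensor-2", 19.8)])

    # Apply a simple per-event transformation: Celsius to Fahrenheit.
    readings.map(lambda r: (r[0], round(r[1] * 9 / 5 + 32, 1))).print()

    env.execute("celsius-to-fahrenheit")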

Dynatrace extends Grail to power business analytics

Software intelligence company Dynatrace has extended its Grail causational data lakehouse to power business analytics. As a result, the Dynatrace platform can instantly capture business data from first- and third-party applications at massive scale without requiring engineering resources or code changes. It prioritises business data separately from observability data and stores, processes, and analyzes…

HPE give IoT portfolio an edgy feel

HPE has unveiled new capabilities and partnerships to bring real-time data analytics and IoT insight to the network edge, reports Telecoms.com.

The team claims its new offerings, Edgeline EL1000 and Edgeline EL4000, are the first converged systems for the Internet of Things, capable of integrating data capture, analysis and storage at the source of collection. Transport and storage of data for analytics are becoming prohibitively expensive, the company says, so the new products offer decision-making insight at the network edge to reduce costs and complexities.

HPE claims the new offerings are capable of delivering heavy-duty data analytics and insights, graphically intensive data visualization, and real-time response at the edge. Until recently, the technology to drive edge analytics had not been available, meaning data had to be transferred to the network core to acquire insight. The team have also announced the launch of the Vertica Analytics Platform, which offers in-database machine learning algorithms and closed-loop analytics at the network edge.

“Organizations that take advantage of the vast amount of data and run deep analytics at the edge can become digital disrupters within their industries,” said Mark Potter, CTO of the Enterprise Group at HPE. “HPE has built machine learning and real time analytics into its IoT platforms, and provides services that help customers understand how data can best be leveraged, enabling them to optimize maintenance management, improve operations efficiency and ultimately, drive significant cost savings.”

The news follows an announcement from IBM and Cisco last week which also focused on IoT at the edge. Alongside the product launches from HPE, the team also announced a partnership with GE Digital to create more relevant propositions for industry. The partnership focuses on combining HPE technical know-how with GE’s industrial expertise and its Predix platform to create IoT-optimized hardware and software. GE’s Predix platform will be a preferred software solution for HPE’s industrial-related use cases and customers.

While the promise of IoT has given the industry plenty to get excited about in recent years, the full potential has been difficult to realize due to the vast amount of data which needs to be transported to the network core to be processed into insight. Although it would seem logical to process the data at the source of collection, technical capabilities have not been at the point where this was possible. Recent advances from the IBM/Cisco and HPE/GE partnerships are removing the need to transfer information, and with it the risk of bottlenecks, points of failure and storage expenses in the IoT process.

“In order to fully take advantage of the Industrial IoT, customers need data-centre-grade computing power, both at the edge – where the action is – and in the cloud,” said Potter. “With our advanced technologies, customers are able to access data centre-level compute at every point in the Industrial IoT, delivering insight and control when and where needed.”

Applications for the edge-analytics proposition could be quite wide, ranging from production lines in Eastern Europe to oil rigs in the North Sea to smart energy grids in Copenhagen. It would appear the team are not only targeting industrial segments, where IoT could ensure faster and more accurate decision making in the manufacturing process for instance, but also those assets which do not have reliable or consistent connectivity.

What is the promise of big data? Computers will be better than humans

Big data as a concept has in fact been around longer than computer technology, which would surprise a number of people.

Back in 1944, Wesleyan University librarian Fremont Rider wrote a paper estimating that American university libraries were doubling in size every sixteen years, meaning the Yale Library in 2040 would occupy over 6,000 miles of shelves. This is not big data as most people would know it, but the vast and rapid increase in the quantity and variety of information in the Yale library follows the same principle.
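
Rider’s projection is simple exponential growth: 1944 to 2040 spans 96 years, or six sixteen-year doublings, a 64-fold increase. A quick sketch makes the arithmetic concrete (the 1944 baseline is a hypothetical figure chosen for illustration; Rider’s actual shelf counts are not given here):

    # Rider's 1944 projection: university library holdings double
    # every sixteen years. The baseline below is assumed purely so
    # the 2040 result lands near the article's 6,000-mile figure.
    DOUBLING_PERIOD = 16   # years
    BASELINE_MILES = 94    # hypothetical shelf length in 1944

    for year in range(1944, 2041, DOUBLING_PERIOD):
        doublings = (year - 1944) // DOUBLING_PERIOD
        miles = BASELINE_MILES * 2 ** doublings
        print(f"{year}: ~{miles:,} miles of shelves")

    # 1944 -> 2040 is six doublings: 2**6 = 64, so ~94 miles
    # becomes ~6,016 miles, in line with the 6,000-mile claim.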

The concept was not known as big data back then, but technologists today face the same challenge: how to handle such a vast amount of information. Not necessarily how to store it, but how to make use of it. The promise of big data, and data analytics more generally, is to provide intelligence, insight and predictability, but only now are we getting to a stage where technology is advanced enough to capitalise on the vast amount of information available to us.

Back in 2003 and 2004, Google published papers on its Google File System and MapReduce, which are generally credited as the origin of the Apache Hadoop platform. At that point, few could have anticipated the explosion of technology we’ve since witnessed. Cloudera Chairman and CSO Mike Olson is one of those few, and he leads a company regularly cited as one of the go-to organizations for the Apache Hadoop platform.
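
The programming model those papers introduced is easiest to see in the canonical word-count example. The sketch below is plain, single-process Python rather than the Hadoop API, purely to illustrate the map, shuffle and reduce phases that the framework distributes across a cluster:

    from collections import defaultdict

    def map_phase(documents):
        """Emit (word, 1) pairs -- the 'map' step."""
        for doc in documents:
            for word in doc.split():
                yield (word.lower(), 1)

    def shuffle(pairs):
        """Group values by key -- handled by the framework in Hadoop."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Sum the counts for each word -- the 'reduce' step."""
        return {word: sum(counts) for word, counts in groups.items()}

    docs = ["big data is big", "data drives decisions"]
    print(reduce_phase(shuffle(map_phase(docs))))
    # {'big': 2, 'data': 2, 'is': 1, 'drives': 1, 'decisions': 1}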

“We’re seeing innovation in CPUs, in optical networking all the way to the chip, in solid state, highly affordable, high performance memory systems, we’re seeing dramatic changes in storage capabilities generally. Those changes are going to force us to adapt the software and change the way it operates,” said Olson, speaking at the Strata + Hadoop event in London. “Apache Hadoop has come a long way in 10 years; the road in front of it is exciting but is going to require an awful lot of work.”

Analytics was previously seen as an opportunity for companies to look back at their performance over a defined period and develop lessons for employees on how future performance could be improved. Today, advanced analytics is applied to improve performance in real time. A company can react in real time to shift the focus of a marketing campaign, or alter a production line to improve the outcome. The promise of big data and IoT is predictability and data-defined decision making, which can shift a business from a reactive position to a predictive one. Understanding trends can create proactive business models which advise decision makers on how to steer a company. But what comes next?

Cloudera Chairman and CSO Mike Olson

For Olson, machine learning and artificial intelligence are where the industry is heading. We’re at a stage where big data and analytics can be used to automate processes and replace humans for simple tasks. In a short period of time, we’ve seen some significant advances in the applications of the technology, most notably Google’s AlphaGo beating world Go champion Lee Se-dol and Facebook’s use of AI in picture recognition.

Although computers taking on humans in games of strategy is not a new PR stunt (IBM’s Deep Blue defeated chess world champion Garry Kasparov in 1997), this is a very different proposition. While chess is a game which relies on strategy, Go is another beast. Due to the vast number of permutations available, strategies within the game rely on intuition and feel, a complex task for the Google team to model. The fact AlphaGo won the match demonstrates how far researchers have progressed in making machine learning and artificial intelligence a reality.

“In narrow but very interesting domains, computers have become better than humans at vision and we’re going to see that piece of innovation absolutely continue,” said Olson. “Big Data is going to drive innovation here.”

This may be difficult for a number of people to comprehend, but big data has entered the business world; true AI and automated, data-driven decision making may not be far behind. Data is driving the direction of businesses through a better understanding of the customer, increased organizational security, and a clearer view of the risk associated with any business decision. Big data is no longer a theory, but an accomplished business strategy.

Olson is not saying computers will replace humans, but the number and variety of processes which can be handed over to machines is certainly growing, and growing faster every day.

IBM makes software defined infrastructure smarter

IBM has expanded its portfolio of software-defined infrastructure solutions adding cognitive features to speed up analysis of data, integrate Apache Spark and help accelerate research and design, the company claims.

The new offering, called IBM Spectrum Computing, is designed to aid companies in extracting full value from their data by adding scheduling capabilities to the infrastructure layer. The product offers workload and resource management features to research scientists for high-performance research, design and simulation applications. The new proposition focuses on three areas.

Firstly, Spectrum Computing works with cloud applications and open source frameworks to assist in sharing resources between the programmes to speed up analysis. Secondly, the company believes it makes the adoption of Apache Spark simpler. And finally, the ability to share resources will accelerate research and design by up to 150 times, IBM claims.
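
For context, the kind of workload such a scheduler allocates resources to might look like the following minimal Apache Spark job. This is PySpark with invented data and application name, illustrating Spark itself rather than any Spectrum Computing API:

    from pyspark.sql import SparkSession

    # Minimal PySpark job of the sort a cluster resource manager
    # schedules; the data and app name are illustrative only.
    spark = SparkSession.builder.appName("sensor-averages").getOrCreate()

    readings = spark.createDataFrame(
        [("rig-1", 98.6), ("rig-1", 101.2), ("rig-2", 87.4)],
        ["asset", "temperature"],
    )

    # Average temperature per asset, computed across the cluster.
    readings.groupBy("asset").avg("temperature").show()

    spark.stop()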

By incorporating the cognitive computing capabilities into the software-defined infrastructure products, IBM believes the concept on the whole will become more ‘intelligent’. The scheduling competencies of the software will increase compute resource utilization and predictability across multiple workloads.

The software-defined data centre market has been growing steadily and is forecast to continue that healthy growth over the coming years. Research has suggested the market could be worth in the region of $77.18 billion by 2020, growing at a CAGR of 28.8% from 2015 to 2020. Adoption is primarily driven by the attractions of simplified scalability and interoperability. North America and Asia are expected to hold the biggest market shares worldwide, though Europe is expected to grow at a faster rate.

“Data is being generated at tremendous rates unlike ever before, and its explosive growth is outstripping human capacity to understand it, and mine it for business insights,” said Bernard Spang, VP for IBM Software Defined Infrastructure. “At the core of the cognitive infrastructure is the need for high performance analytics of both structured and unstructured data. IBM Spectrum Computing is helping organizations more rapidly adopt new technologies and achieve greater, more predictable performance.”

Wipro open sources big data offering

Wipro has announced it has open sourced its big data solution Big Data Ready Enterprise (BDRE), partnering with California-based Hortonworks to push the initiative forward.

The company claims the BDRE offering addresses the complete lifecycle of managing data across enterprise data lakes, allowing customers to ingest, organize, enrich, process, analyse, govern and extract data at a faster pace. BDRE is released under the Apache Public License v2.0 and hosted on GitHub. Teaming up with Hortonworks will also give the company additional clout in the market, as Hortonworks is generally considered one of the top three Hadoop distribution vendors.

“Wipro takes pride in being a significant contributor to the open source community, and the release of BDRE reinforces our commitment towards this ecosystem,” said Bhanumurthy BM, COO at Wipro. “BDRE will not only make big data technology adoption simpler and effective, it will also open opportunities across industry verticals that organizations can successfully leverage. Being at the forefront of innovation in big data, we are able to guide organizations that seek to benefit from the strategic, financial, organizational and technological benefits of adopting open source technologies.”

Companies open sourcing their own technologies has become something of a trend in recent months, as product owners appear to be moving towards a service model as opposed to the traditional vendor model. According to ‘The Open Source Era’, an Oxford Economics study commissioned by Wipro, 64% of respondents believe that open source will drive Big Data efforts in the next three years.

The report also claims open source has become a foundation stone of the technology roadmap of a number of businesses: 75% of respondents believe integration between legacy and open source systems is one of the main challenges, and 52% said open source is already supporting the development of new products and services.

IBM and Cisco combine to deliver IoT insight on the network edge

IBM and Cisco have extended a long-standing partnership to enable real-time IoT analytics and insight at the point of data collection.

The partnership will focus on combining the cognitive computing capabilities of IBM’s Watson with Cisco’s analytics competencies to support data action and insight at the point of collection. The team are targeting companies who operate in remote environments or on the network edge, for example oil rigs, where time is of the essence but access to the network can be limited or disrupted.

The long-standing promise of IoT has been to increase the amount of data organizations can collect, which, once analysed, can be used to gain a greater understanding of a customer, environment or asset. Cloud computing offers organizations an opportunity to realize the potential of real-time insight, but for those with remote assets where access to high-bandwidth connectivity is not a given, the promise has always been out of reach.

“The way we experience and interact with the physical world is being transformed by the power of cloud computing and the Internet of Things,” said Harriet Green, GM for IBM Watson IoT Commerce & Education. “For an oil rig in a remote location or a factory where critical decisions have to be taken immediately, uploading all data to the cloud is not always the best option.

“By coming together, IBM and Cisco are taking these powerful IoT technologies the last mile, extending Watson IoT from the cloud to the edge of computer networks, helping to make these strong analytics capabilities available virtually everywhere, always.”

IoT insight at the point of collection has been an area of interest to enterprise for a number of reasons. Firstly, by decreasing the quantity of data which has to be moved, transmission costs and latency are reduced and the quality of service is improved. Secondly, the bottleneck of traffic at the network core can potentially be removed, reducing the likelihood of failure. And finally, the ability to virtualize on the network edge can extend the scalability of an organization.

ABI Research has estimated that 90% of the data collected by IoT-connected devices is stored or processed locally, where it is inaccessible to real-time analytics and must be transferred to another location for analysis. As the number of these devices increases, the quantity of data which must be transferred, stored and analysed also increases. The cost of data transmission and storage could soon prohibit some organizations from achieving the goals of IoT. The new team are hoping the combination of Cisco’s edge analytics capabilities and the Watson cognitive solutions will enable real-time analysis at the scene, removing a number of these challenges.

“Together, Cisco and IBM are positioned to help organizations make real-time informed decisions based on business-critical data that was often previously undetected and overlooked,” said Mala Anand, SVP of the Cisco Data & Analytics Platforms Group. “With the vast amount of data being created at the edge of the network, using existing Cisco infrastructure to perform streaming analytics is the perfect way to cost-effectively obtain real-time insights. Our powerful technology provides customers with the flexibility to combine this edge processing with the cognitive computing power of the IBM Watson IoT Platform.”

What did we learn from PwC’s local government survey?

PwC has recently released findings from its annual survey, The Local State We’re In, which assesses the challenges facing local government and its responses to them, as well as looking at public opinion on these organizations’ capabilities.

Here, we’ve pulled out four of the lessons we learnt from the report:

Data Analytics is top of the agenda for CEOs and Local Government Leaders

A healthy 91% of the Chief Execs surveyed confirmed Data Analytics was an area in which they were well equipped. This was in fact the most popular answer for this specific question, as other areas such as business intelligence (59%), supply chain management (55%) and information governance & records management (40%) fared less well.

While it is encouraging that the leaders are confident in their teams’ ability to perform in the data analytics world, the research also stated local government’s use of structured and unstructured data varies quite considerably. 71% of the Chief Execs agreed they were using structured data (e.g. information in government-controlled databases), whereas this number drops to 33% when unstructured data (e.g. social media and data generated through search engines) is the focal point of the question.

As the consumer continues its drive towards digital and the connected world, the level of insight which can be derived through unstructured data, social media in particular, will continue to increase. Back in 1998 Merrill Lynch said 80-90% of all potentially usable business information may originate in unstructured form. This rule of thumb is not based on primary or any quantitative research, but is still accepted by some in the industry. Even if this number has dropped, there is a vast amount of information and insight which is being missed by the local government.

But data-driven decision making isn’t

Throughout the industry, data-driven decision making has been seen as one of the fastest-growing trends, and also as the prelude to the introduction of artificial intelligence.

Despite the media attention such ideas are receiving, it would appear these trends are not translating through to local government. Only 41% of the respondents said their organization is using data analytics to inform decision making and strategy. It would appear local government is quite effective (or at least confident) at managing data, but not so much at using it for insight.

Public is not confident in local government’s ability to embrace digital

Although leaders within the local authorities themselves are happy with the manner in which their organizations have embraced digital, this confidence is not shared by the general public.

76% of Chief Execs who participated in the research are confident in their own digital strategies; however, only 23% of the general public are confident in the council’s ability to manage the transition through to digital. This is down from 28% in the same survey during 2015 and 29% in 2014. The findings could demonstrate the rigidity of government bodies, especially at a local level, as it would appear the evolution of emerging technologies is outstripping local government’s ability to incorporate these new ideas and tools.

There is also quite a significant difference in how the public and the Chief Execs view cyber security. While only 17% of the Chief Execs believe their organization is at risk from cyber threats, 70% of the general public are not confident local government will be able to manage and share their personal information appropriately. 2016 has already seen a number of high-profile data breaches which could have an impact on the opinions of the general public. If tech-savvy enterprise organizations such as TalkTalk cannot defend themselves, the public may perceive that public sector organizations are even less likely to do so.

However, local government does have the backing from the public to invest in digital

The general public would not appear to have great confidence in local government’s current ability to embrace the digital age, but they have seemingly given their blessing for local government to continue its investments.

39% of the general public who completed the survey said their preferred means of engagement with local government would be a digital platform, as opposed to the 24% who would prefer the telephone and 28% who would rather engage in person. Unfortunately, while digital is the most popular option for engaging, only 37% were satisfied with current digital access to local government, down from 38% in last year’s research.

Salesforce SMB business leader talks data analytics, AI and the age of entrepreneurship

Sanj Bhayro, SVP EMEA Commercial at Salesforce

While the business world has traditionally favoured the biggest and the richest, cloud as a technology is seen as the great equalizer. Through the transition to the cloud, SMBs are being empowered to take on their enterprise nemeses, with the number of wins growing year-on-year.

This, according to Salesforce’s Sanj Bhayro, is one of the most exciting trends we’re now witnessing in business throughout the world. Bhayro currently leads the EMEA SMB business at Salesforce and for almost 11 years has been part of the team which has seen the power of intelligent CRM systems grow backroom businesses into industry giants. Just look at the growth and influence of companies such as Uber and Airbnb for justification of his claims.

“The SMB business in Salesforce is one of the most exciting, because we get to work with really innovative companies,” said Bhayro. “All the innovation in the industry is coming from these small to medium sized businesses. They are disrupting the traditional market which is in turn forcing the traditional players to transform their own business models.

“Something which is interesting from our perspective at Salesforce is that when we started 17 years ago the internet wasn’t that prevalent, the cloud wasn’t a word that was used that often, and it was the SMB companies who adopted our technology. The cloud offered them the operational efficiency, the scale and the reach to take on these traditional players. These smaller organizations are looking more and more towards technology as the enabler for innovation.”

The majority of SMBs could be considered too small to drive innovation in-house; for the most part, the IT department is small and responsible for ‘keeping the lights on’. Working through the cloud has enabled innovation and created opportunities for these organizations, and the ability to innovate is often much more prominent in smaller organizations.

The fail-fast business model is one which has captured the imagination of numerous enterprise organizations around the world. Amazon CEO Jeffrey Bezos recently claimed the fail-fast model was the catalyst for recent growth within the AWS business, though the majority are seemingly struggling to implement the right culture which encourages learning and innovating through failing. For the majority, failure is simply failure, not part of the journey to success.

But this in itself is one of the ways in which the smaller, more agile organizations are innovating and catching up with enterprise-scale businesses. The implementation of cloud platforms speeds up the failures and lessens their negative impact on the business, further driving the journey to innovation.

“For start-ups and early stage companies, failing is an accepted mentality. How many companies are actually the same as when they started? They failed, learned and then progressed. As businesses become bigger and bigger it becomes a lot more difficult. Certainly for larger companies there is a lot more friction around the fail-fast model. Smaller companies are culturally set up to allow them to pivot and try new things, whereas larger ones, purely because of their size, are constrained.”

Outside of the SMB team, Salesforce engineers have been prioritizing the use of artificial intelligence for future product launches and updates. This was reinforced during the company’s quarterly earnings call in recent weeks as CEO Marc Benioff backed AI as the next major growth driver. While there is potential for AI in the SMB market place, for the moment it is only for those who are ahead of the curve.

For the most part, data analytics is starting to trickle down into smaller organizations, though there is still a substantial amount of data which is not being utilized. For Bhayro, as the concept of the cloud is now ubiquitous, the opportunities are almost limitless, but only once these organizations have got on top of managing their own data and broken down the silos within the business.

“AI translates well into the SMB business model and it will be the SMBs who drive where AI goes,” said Bhayro. “There are generally two camps when it comes to the SMB market: those who are cloud-native, capitalizing on the sharing economy, and those who are more traditional organizations. The shift that the traditional business has to make to break down the silos, and to move towards a cloud back-end, is far more difficult than for a company like Deliveroo who started in the cloud and can scale. Nevertheless that shift has to be made.”

“So much data is being created and there’s so much that you can do with it. The problem is that so many companies are not doing enough with their data. Recent reports stated that most companies can only analyse 1% of their data. Even before we start moving towards AI technologies, the way we service intelligence is through insight. We need to provide the right tools to make data available and malleable to everybody in your business. These data analytics tools are the first steps and then we can look forward to AI technologies.”

The UK government has made numerous schemes available to SMBs to encourage the growth of this subsector in recent years, and Bhayro believes these efforts have been paying off in the international markets.

“I’m delighted to say that the UK takes a leadership position (in relation to SMB growth and innovation in comparison to the rest of Europe),” said Bhayro. “Something in the region of 95-96% of the companies in the UK are SMBs, and the government is currently doing the right things to encourage and propel entrepreneurs. I think we’re in the time of entrepreneurship, and this is the time for people to have the vision and grow. These companies are having wonderful ideas, and they are moving into the growth period, but it’s the customer experience which really differentiates them from the competition. Not many of these companies are set up to achieve customer experience objectives, but this is where we (Salesforce) come in.”