Category archive: IoT

What did we learn from EMC’s data protection report?

EMC has recently released its Global Data Protection Index 2016, in which it claims only 2% of organizations worldwide would be considered ‘leaders’ in protecting their own assets, reports Telecoms.com.

Data has dominated the headlines in recent months as breaches have made customers question how well enterprise organizations can manage and protect data. Combined with transatlantic disagreements in the form of Safe Harbour and law enforcement agencies’ access to personal data, the ability to remain secure and credible is now more of a priority for decision makers.

“Our customers are facing a rapidly evolving data protection landscape on a number of fronts, whether it’s to protect modern cloud computing environments or to shield against devastating cyber-attacks,” said David Goulden, CEO of EMC Information Infrastructure. “Our research shows that many businesses are unaware of the potential impact and are failing to plan for them, which is a threat in itself.”

EMC’s report outlined a number of challenges and statistics which suggest the majority of the industry is not where it should be with regard to data protection. While only 2% of the industry would be considered leaders in the data protection category, 52% are still evaluating the options available to them. Overall, 13% more businesses suffered data loss in the last twelve months compared with the preceding twelve months.

But what are the over-arching lessons we learned from the report?

Vendors: Less is more

A fair assumption for most people would be that the more protection you take on, the more protected you are. This just seems logical. However, the study shows the more vendors you count in your stable, the more data you will lose.

The average data loss incident costs a company 2.36TB of data, which would be considered substantial, though it could be worse. The study showed organizations that used one vendor lost on average 0.83TB per incident, two vendors 2.04TB and three vendors 2.58TB. For those who used four or more vendors, an average of 5.47TB of data was lost per incident.

Common sense would dictate the more layers of security you have, the more secure you will be, however this is only the case if the systems are compatible with each other. It should be noted that those who lost the larger data sets are likely to be the larger companies, with more data to lose, though the study does seem to suggest there needs to be a more co-ordinated approach to data protection.

And they are expensive…

Using the same concept as before, the average cost of lost data was $900,000. For those with one vendor, the cost was $636,361; for those with two, $789,193; and for those with three vendors the cost was just above the average at $911,030. When companies bring in four or more vendors, the average cost of data loss rises to $1.767 million.

China and Mexico are the best

While it may be surprising, considering many of the latest breakthroughs in the data world have come from Silicon Valley or Israel, China and Mexico are the two countries which would be considered furthest ahead of the trend for data protection.

EMC graded each country on how effectively organizations there were implementing the right technologies and culture to prevent data loss. 17 countries featured ahead of the curve, including the usual suspects of the UK (13.5% ahead of the curve), the US (8%), Japan (1%) and South Korea (9%), however China and Mexico led the charge at 20% and 17% ahead respectively.

While it may not be considered that unusual for China to have a strong handle on data within its own borders, Mexico is a little more surprising (at least to us at Telecoms.com). The country has gone through something of a technology revolution in recent years, growing over the last 20 years from a country where only 10% of people had a mobile phone to 68% this year, 70% of which are smartphones. Mexico is now the 11th largest economy in terms of purchasing power, with millennials being the largest demographic. With the population becoming more affluent, and no longer constrained by the faults of the pre-internet world, the trend should continue. Keep up the good work Mexico.

Human error is still a talking point

When looking at the causes of data loss, the results were widespread, though the causes which cannot be controlled were at the top of the list. Hardware failure, power loss and software failure accounted for 45%, 35% and 34% respectively.

That said, the industry does now appear to be taking responsibility for the data itself. The study showed only 10% of data loss incidents were blamed on the vendor. A couple of weeks ago we spoke to Intel CTO Raj Samani, who highlighted that the attitude towards security (not just data protection) needs to shift, as there are no means to outsource risk. Minimizing risk is achievable, but irrespective of what agreements are undertaken with vendors, the risk still remains with you. As fewer people are blaming the vendors, it would appear this responsibility is being realized.

Human error is another area which remains high on the agenda, as the study showed it accounts for 20% of all instances of data loss. While some of these instances can be blamed on leaving a laptop in the pub or losing a phone on the train, there are examples where simple mistakes in the workplace are to blame. These will never be removed entirely, as numerous day-to-day decisions are made on intuition and gut feel, which remain a necessity for certain aspects of the business.

An area which could be seen as a potential danger is artificial intelligence. As AI advances as a concept, the more human-like it will become, and thus the more capable of making decisions based on intuition. If this is the ambition, surely an intuitive decision-making machine would present a security weakness in the same way a human would. Admittedly the risk per decision would be substantially smaller, but then again, the machine would be making many times more decisions than a human.

All in all, the report raises more questions than it answers. While security has been pushed to the top of the agenda for numerous organizations, receiving additional investment and attention, it does not appear those same organizations are getting any better at protecting themselves. The fact that 13% more organizations suffered data loss in the last 12 months suggests it could be getting worse.

To finish, the study asked whether an individual felt their organization was well enough protected. Only 18% believe they are.

Telstra adds IoT and big data offering to Network and Services biz unit

Australian telco Telstra has continued efforts to bolster its Network Applications and Services (NAS) business unit through acquiring Readify, reports Telecoms.com.

The company has been vocal about its aims for the NAS business unit as it has sought to expand through numerous acquisitions in recent years. Aside from the Readify deal, the company has also incorporated O2 Networks, Bridge Point Communications, Kloud and North Shore Connections, as well as numerous partnerships including with cloud security start-up vArmour.

“This arm of the business (NAS) has been a strong growth area for Telstra, achieving double-digit growth in revenue driven by business momentum in Asia, as well as advances in technology in the cloud computing space,” said a statement on the company website. “We are well equipped to continue to capitalise on this growth and ensure our focus on NAS continues to drive revenue.”

Readify, which currently offers enterprise cloud application solutions as well as Big Data and IoT, will provide an additional platform for Telstra to drive digital transformation for its enterprise customers in domestic and global markets. The offering builds on the January acquisition of Kloud, which offers cloud migration services, as well as the purchases of unified communications and contact centre provider North Shore Connections in 2013, network integration services provider O2 Networks in 2014 and security, networking and data management provider Bridgepoint, also in 2014.

“Readify will provide application development and data analytics services, nicely complementing Kloud’s existing services,” said Telstra Executive Director Global Enterprise and Services, Michelle Bendschneider. “It will enable Telstra to add incremental value to customers in enterprise cloud applications, API-based customisation and extensions as well as business technology advisory services.”

Back in April, the company announced a business multi-cloud connecting solution, which supports numerous hybrid cloud offerings including Azure, AWS, VMware and IBM. The one-to-many “gateway” model will enable Australian customers to connect to Microsoft Azure, Office365, AWS, IBM SoftLayer and VMware vCloud Air, while international customers can only connect to AWS and IBM SoftLayer for the moment.

The cloud and enterprise services market has been a long-held ambition of the company, though it got off to a slow start. Back in 2014, its national rival Optus Business stole a march on Telstra by acquiring Ensyst, winner of Australian Country Partner of the Year at the Microsoft Worldwide Partner Awards that same year, as it looked to grow its own cloud proposition. It would appear Telstra is making up for lost time through an accelerated program of product releases and acquisitions.

Ericsson claims a world first with transcontinental 5G trial

Ericsson, Deutsche Telekom and SK Telecom have announced a partnership to deploy the world’s first transcontinental 5G trial network, reports Telecoms.com.

The objective of the agreement is to deliver optimized end-user experiences by providing consistent quality of service and roaming for advanced 5G use cases with enhanced global reach. Ericsson will act as the sole supplier to the project, which will include technologies such as NFV, software-defined infrastructure, distributed cloud, and network slicing.

Last October, Ericsson and SK Telecom conducted a successful demonstration of network slicing technology, which featured the creation of virtual network slices optimized for services including super multi-view and augmented reality/virtual reality, Internet of Things offerings and enterprise solutions.

“5G is very different from its predecessors in that the system is built as a platform to provide tailored services optimized for individual customer’s needs, at a global scale,” said Alex Jinsung Choi, CTO at SK Telecom. “Through this three-party collaboration, we will be able to better understand and build a 5G system that can provide consistent and enhanced user experience across the globe.”

Alongside the announcement, Ericsson and SK Telecom also successfully completed a demonstration of 5G software-defined telecommunications infrastructure, using the vendor’s Hyperscale Datacenter System (HDS) 8000 solution. The pair claims this is a world first and will enable dynamic composition of network components to meet the scale requirements of 5G services.

Software-defined telecommunications infrastructure is one of the enablers of network slicing, which will allow operators to create individual virtualized environments which are optimized for specific users. The demonstration itself focused on two use cases; ultra-micro-network end-to-end (E2E) slicing for personalized services, and ultra-large-network E2E slicing for high-capacity processing.

“SDTI is an innovative technology that enhances network efficiency by flexibly constructing hardware components to satisfy the infrastructure performance requirements of diverse 5G services,” said Park Jin-hyo, Head of Network Technology R&D Center at SK Telecom.

Finally, Ericsson has announced another partnership with Japanese telco KDDI with the ambition of delivering IoT on a global scale and providing enhanced connectivity services to KDDI customers.

The partnership will focus on Ericsson’s cloud-based IoT platform to deliver services such as IoT connectivity management, subscription management, network connectivity administration and flexible billing services. The pair claims the new proposition will enable KDDI’s customers to deploy, manage and scale IoT connected devices and applications globally.

IoT represents a significant opportunity for enterprise customers and operators alike, as it significantly increases the amount of data available as well as the access points to customers worldwide. Research firm Statista estimates the number of connected devices worldwide could exceed 50 billion, though definitions of what counts as a connected device or an IoT device vary.

“KDDI has for a long time been committed to building the communication environment to connect with world operators in order to support the global businesses of our customers,” said Keiichi Mori, GM of KDDI’s IoT Business Development Division. “We believe that by adopting DCP, we will be able to leverage Ericsson’s connection with world carriers and furthermore promote our unified service deployment globally to customers as they start worldwide IoT deployments.”

What does Clinton have in store for the tech industry?

Hillary Clinton has recently released her campaign promises for the technology sector should she be elected as President Obama’s successor in November, reports Telecoms.com.

The technology agenda focused on a vast and varied number of issues within the technology industry, including the digital job-front, universal high-speed internet for the US, data transmission across jurisdictions, technological innovation and the adoption of technology in government. Although the statement does indicate a strong stance on moving technology to the top of the political agenda, there does seem to be an element of ‘buzzword chasing’ to gain the support of the country’s tech giants.

“Today’s dynamic and competitive global economy demands an ambitious national commitment to technology, innovation and entrepreneurship,” the statement read. “America led the world in the internet revolution, and, today, technology and the internet are transforming nearly every sector of our economy—from manufacturing and transportation, to energy and healthcare.”

But what did we learn about America’s technology future?

Focus on 5G and new technologies

One of the more prominent buzzwords through the beginning of 2016 has been 5G, which is seemingly the go-to phrase for the majority of new product launches and marketing campaigns. The Clinton camp has aligned itself with the buzz by committing to deploying 5G networks (no timeframe given), as well as opening up opportunities for a variety of next-gen technologies.

“Widely deployed 5G networks, and new unlicensed and shared spectrum technologies, are essential platforms that will support the Internet of Things, smart factories, driverless cars, and much more—developments with enormous potential to create jobs and improve people’s lives,” the statement said.

The deployment of 5G has been split into two separate areas. Firstly, use of spectrum will be reviewed with the intention of identifying underutilized bands, including those reserved for the government, and reallocating them to improve the speed of deployment. Secondly, government research grants will be awarded to various vendors to advance wireless and data technologies directed towards social priorities including healthcare, the environment, public safety and social welfare.

A recent report from Ovum highlighted that the US is on the right track for the deployment of 5G, as its analysts believe it will be one of the leading countries for the technology. Ovum predicts there will be at least 24 million 5G subscribers by the end of 2021, of which 40% will be located in North America.

Data Transmission between US and EU

From a data transmission perspective, the Clinton team seemingly takes issue with the European Court of Justice’s decision to strike down Safe Harbour, and the varied reception for the EU-US Privacy Shield. It would appear the Clinton team is under the assumption the deal between the EU and US was struck down for economic reasons, as opposed to data protection concerns.

“The power of the internet is in part its global nature. Yet increasing numbers of countries have closed off their digital borders or are insisting on “data localization” to attempt to maintain control or unfairly advantage their own companies,” the statement said. “When Hillary was Secretary of State, the United States led the world in safeguarding the free flow of information including through the adoption by the OECD countries of the first Internet Policymaking Principles.

“Hillary supports efforts such as the U.S.-EU Privacy Shield to find alignment in national data privacy laws and protect data movement across borders. And she will promote the free flow of information in international fora.”

While it could be considered encouraging that the mission of the Clinton team is to open up the channels between the two regions again, it does seem to have missed the point of why the agreement was shot down in the first place. The statement seemingly implies EU countries refused the agreement on the grounds of promoting the interests of their own companies, as opposed to privacy concerns and the US attitude towards government agencies’ access to personal information.

Safe Harbour, the initial transatlantic agreement, was struck down last October, though its proposed successor has come under similar criticism. Only last month, the European Data Protection Supervisor, Giovanni Buttarelli, outlined concerns over whether the proposed agreement will provide adequate protection against indiscriminate surveillance, as well as obligations on oversight, transparency, redress and data protection rights.

“I appreciate the efforts made to develop a solution to replace Safe Harbour but the Privacy Shield as it stands is not robust enough to withstand future legal scrutiny before the Court,” said Buttarelli. “Significant improvements are needed should the European Commission wish to adopt an adequacy decision, to respect the essence of key data protection principles with particular regard to necessity, proportionality and redress mechanisms. Moreover, it’s time to develop a longer term solution in the transatlantic dialogue.”

The Clinton team can continue to discuss changes to transatlantic data transmission policy should it choose, however it is highly unlikely any positive moves will be made until it gets to grips with the basic concerns of EU policymakers.

Access to Government data

Currently there are certain offices and data sets which are accessible to the general public, though this is an area which would be expanded under a Clinton regime. The concept is a sound one; giving entrepreneurs and businesses access to the data could provide insight into how money could be saved or used more efficiently, or how new technologies could be implemented to improve the effectiveness of government, though there could be a downside.

“The Obama Administration broke new ground in making open and machine-readable the default for new government information, launching Data.gov and charging each agency with maintaining data as a valuable asset,” the statement said. “Hillary will continue and accelerate the Administration’s open data initiatives, including in areas such as health care, education, and criminal justice.”

The downside has the potential to ruin any politician. The program is opening the door for criticism from all sides, and will offer ammunition to any opposition.

Connecting American Citizens

One of the document’s main points of focus was the country’s commitment to ensuring each household and business has the opportunity to be connected to high-speed broadband. While this could be considered an effective sound-bite for the party, it is not a new idea by any means. A recent report highlighted that a surprising number of Americans do not currently have access to broadband. Although it may be expected that those in rural communities would struggle at times, the report indicated 27% of New York and 25% of Los Angeles respectively would be classed in the “Urban Broadband Unconnected” category, which could be considered more unusual.

The Connect America Fund, Rural Utilities Service Program and Broadband Technology Opportunities Program are all well-established operations (the Rural Utilities Service Program has been around since 1935) which previous presidents have also banged the drum for. Clinton has said very little new here and made little fresh commitment to the initiatives.

The team has, however, committed to a $25 billion Infrastructure Bank which will enable local authorities to apply for grants to make improvements. This is a new concept which Clinton plans to introduce, though the details of how it will be funded, what the criteria for applications will be, or whether there are any stipulations on which vendors the money can be spent with, are not spelled out.

What did we learn at Cloud & DevOps World?

The newly branded Cloud & DevOps World kicked off yesterday with one theme prominent throughout the various theatres; cloud is no longer a disruption, but what can be achieved through the cloud is still puzzling decision makers, reports Telecoms.com.

One word which was heard more than any other was maturity, as there would appear to be a general consensus that cloud computing had matured as a concept, process and business model. Although finding the greatest value from the cloud is still a challenge, there is a general feeling those in the IT world are becoming more adventurous and more willing to experiment.

Speaking in the Business Transformation theatre, Hotels.com CIO Thierry Bedos opened the conference with a look into future trends in cloud computing. Maturity was the main driver of the talk, as Bedos pointed out AWS’ dominant position as market leader and innovator is starting to loosen. While it would generally be considered strange to call tech giants such as Google and Microsoft challenger brands, it would be fair in the context of public cloud. But not for much longer, as the gap is narrowing. For Bedos, this competition is a clear indication of a maturing market.

Alongside Bedos, Oracle’s Neil Sholay gave us insight into the world of data analytics, machine learning and AI in the Oracle Labs. Bill Gates famously said “Content is King”, and while this remains true, Sholay believes we can now go further and live by the rule “Corpus is King”. Content is still of value, though the technologies and business practices used to deliver content have dated the phrase. The value of content is now in mastering its delivery through effective analytics to ensure automation, context and insight. A content campaign is only as good as the information you feed it to provide value to the consumer.

The Cyber & Cloud Security theatre told a slightly different story, but maturity was still a strong theme. ETSI & GSMA Security Working Group Chairperson Charles Brookson commented to us that while there is still a lot of work to do to ensure security, decision makers are maturing in the sense that they have accepted 100% security is unachievable, and that remaining as secure as possible for as long as possible is the new objective.

For a number of the delegates and speakers this is a new mind-set which has been embraced, however there are still some technical drawbacks. Futuristic advances such as biometric security are set to become a possibility in the near future, and Birmingham University’s David Deighton showed his team has made solid progress in the area. Failure rates are still at 2%, which was generally regarded as too high, but this has been reduced from 15% in a matter of months. The team would appear to be heading in the right direction, at a healthy pace.

Once again the concept of failure was addressed in the IoT & Data Analytics theatre, as conference Chairperson Emil Berthelsen (Machine Research) told us the important lesson from the day was to set the right expectations. Some projects will succeed and some will not, but there is no such thing as failure. The concept of IoT is now beginning to gain traction in the enterprise world, once again showing maturity, but for Berthelsen, the importance of scalability, security and data in IoT solutions was most evident throughout the day.

Day 1 showed us one thing above all else; we’re making progress, but we’re not quite there yet.

Connected home will be operated by Apple and Google

Research from Gartner has claimed 25% of households in developed economies will utilise the services of digital assistants, such as Apple’s Siri or Google Assistant, on smartphones as the primary means of interacting with the connected home.

The user experience is an area which has been prioritized by numerous tech giants, including those in the consumer world, as the process of normalizing the connected world moves forward. Although IoT as a concept has been generally accepted by industry, efforts to take the technology into the wider consumer ecosystem are underway.

Connecting all IoT applications under a digital assistant could be a means of removing the complexity of managing the connected home, playing on the consumer drive for simplicity and efficiency. The digital assistant also presents an entry point for artificial intelligence, as appliances and systems in the home can be optimized alongside information available over the internet. Energy consumption, for example, could potentially be reduced as the digital assistant optimizes a thermostat’s settings depending on current weather conditions.
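To make the thermostat example concrete, here is a minimal illustrative sketch (in Python) of the kind of rule a digital assistant might apply. The class names, thresholds and weather reading are hypothetical assumptions for illustration only, not any vendor’s actual API.

```python
# Illustrative sketch: an assistant rule that nudges a connected thermostat
# based on outdoor conditions. All names and thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class WeatherReading:
    outdoor_temp_c: float   # current outdoor temperature
    sunny: bool             # rough proxy for solar heat gain


class Thermostat:
    """Stands in for a connected thermostat the assistant can control."""

    def __init__(self, setpoint_c: float = 21.0):
        self.setpoint_c = setpoint_c

    def set_target(self, target_c: float) -> None:
        self.setpoint_c = target_c
        print(f"Thermostat target set to {target_c:.1f}C")


def adjust_for_weather(thermostat: Thermostat, weather: WeatherReading,
                       comfort_c: float = 21.0) -> None:
    """Lower the heating target on warm or sunny days to save energy."""
    target = comfort_c
    if weather.outdoor_temp_c >= 18.0 or weather.sunny:
        target -= 1.5   # rely on ambient/solar heat instead of heating
    thermostat.set_target(target)


if __name__ == "__main__":
    adjust_for_weather(Thermostat(), WeatherReading(outdoor_temp_c=19.0, sunny=True))
```

The point of the sketch is simply that a single assistant-level rule can act across devices and external data sources without the user opening a dedicated app.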

“In the not-too-distant future, users will no longer have to contend with multiple apps; instead, they will literally talk to digital personal assistants such as Apple’s Siri, Amazon’s Alexa or Google Assistant,” said Mark O’Neill, Research Director at Gartner. “Some of these personal assistants are cloud-based and already beginning to leverage smart machine technology.”

The process of normalizing IoT in the consumer world will ultimately create a number of new opportunities for the tech giants, as the technology could offer a gateway into the home for a number of other verticals. Banks and insurance companies for example, could offer advice to customers on how they could save money on bills, should they have access to the data which is generated in the connected home.

“APIs are the key to interoperating with new digital interfaces and a well-managed API program is a key success factor for organizations that are interested in reaching consumers in their connected homes,” said O’Neill. “In the emerging programmable home, it is no longer best to spend time and money on developing individual apps. Instead, divert resources to APIs, which are the way to embrace the postapp world.”
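As a rough illustration of the API-first approach O’Neill describes, the sketch below shows one uniform device API that both a smartphone app and a voice assistant could call, rather than each device shipping its own standalone app. Every name here is hypothetical and purely illustrative, not Gartner’s or any vendor’s interface.

```python
# Illustrative sketch: a tiny, uniform connected-home API that different
# front ends (an app, a voice assistant) can call. Hypothetical names only.

from typing import Dict


class Device:
    """Common interface every connected-home device exposes."""

    def __init__(self, name: str):
        self.name = name
        self.state: Dict[str, object] = {}

    def set(self, attribute: str, value: object) -> None:
        self.state[attribute] = value
        print(f"{self.name}: {attribute} -> {value}")

    def get(self, attribute: str) -> object:
        return self.state.get(attribute)


class Home:
    """Registry the assistant or an app talks to through one API."""

    def __init__(self):
        self.devices: Dict[str, Device] = {}

    def register(self, device: Device) -> None:
        self.devices[device.name] = device

    def command(self, device_name: str, attribute: str, value: object) -> None:
        self.devices[device_name].set(attribute, value)


if __name__ == "__main__":
    home = Home()
    home.register(Device("thermostat"))
    home.register(Device("lights"))

    # The same call works whether it originates from an app or a spoken request.
    home.command("thermostat", "target_c", 20.5)
    home.command("lights", "on", False)
```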

Samsung acquires containers-cloud company Joyent

Samsung has agreed to buy San Francisco-based cloud provider Joyent in an effort to diversify its product offering in declining markets, reports Telecoms.com.

Financials for the deal have not been disclosed, however the team stated the acquisition will build Samsung’s capabilities in the mobile and Internet of Things arenas, as well as in cloud-based software and services markets. The company’s traditional means of differentiating its products have been increased marketing efforts and effective distribution channels, though the new expertise will add another string to its bow.

“Samsung evaluated a wide range of potential companies in the public and private cloud infrastructure space with a focus on leading-edge scalable technology and talent,” said Injong Rhee, CTO of the Mobile Communications business at Samsung. “In Joyent, we saw an experienced management team with deep domain expertise and a robust cloud technology validated by some of the largest Fortune 500 customers.”

Joyent itself offers a relatively unique proposition in the cloud market as it runs its platform on containers, as opposed to the traditional VMs the majority of other cloud platforms run on. The team reckons that by using containers, efficiency is notably improved, a claim which is generally supported by the industry. A recent poll run on Business Cloud News found 89% of readers considered container-run cloud platforms more attractive than those on VMs.

While smartphones would now be considered the norm in western societies, the industry has taken a slight dip in recent months. Using data collected from public announcements and analyst firm Strategy Analytics, estimates showed the number of smartphones shipped in Q1 2016 fell to 334.6 million units from 345 million during the same period in 2015. The slowdown has been attributed to lucrative markets such as China becoming increasingly mature, as well as a pessimistic outlook from consumers on the global economy.

As a means to differentiate the brand and tackle a challenging market, Samsung has been looking to software and services offerings, as creating a unique offering from a hardware or platform perspective has become next to impossible. In terms of hardware, the latest release of every smartphone contains pretty much the same features (high-performance camera, lighter than ever before, etc.), and on the platform side, the majority of the smartphone market operates on Android. Software and services have become the new battleground for product differentiation.

Last month, the team launched its Artik Cloud Platform, an open data exchange platform designed to connect any data set from any connected device or cloud service. IoT is a market which has been targeted by numerous organizations and is seemingly the focus of a healthy proportion of product announcements. The launch of Artik Cloud puts Samsung in direct competition with the likes of Microsoft Azure and IBM Bluemix, as industry giants jostle for the lead in an IoT race whose winner has yet to emerge. The inclusion of Joyent’s technology and engineers will give Samsung extra weight in the developing contest.

The purchase also offers Samsung the opportunity to scale its own cloud infrastructure. The Samsung team says it is one of the world’s largest consumers of public cloud data and storage, and the inclusion of Joyent could allow it to move data in-house and decrease its dependency on third-party cloud providers such as AWS.

As part of the agreement, CEO Scott Hammond, CTO Bryan Cantrill, and VP of Product Bill Fine, will join Samsung to work on company-wide initiatives. “We are excited to join the Samsung family,” said Hammond. “Samsung brings us the scale we need to grow our cloud and software business, an anchor tenant for our industry leading Triton container-as-a-service platform and Manta object storage technologies, and a partner for innovation in the emerging and fast growing areas of mobile and IoT, including smart homes and connected cars.”

IBM takes Watson to Asia

IBM has opened a new research centre in Singapore as it aims to expand its cognitive computing offering Watson into the Asian markets.

The Watson Centre, located in IBM’s current office at Marina Bay Financial Centre, will help commercialize the company’s cognitive, blockchain and design capabilities by partnering with local organizations and co-creating new business solutions. The company claims the new centre will act as a hub for almost 5,000 IBM cognitive solutions professionals in the Asia Pacific region.

Although countries like Japan and China would be considered more mature in their adoption of cloud and next-generation technologies, numerous others are in the early stages of adoption. Countries like India and Indonesia have economies demonstrating healthy GDP growth at 7.3% and 4.7% respectively, as well as being the second and fourth most populous countries worldwide. Cloud adoption is beginning to accelerate in countries such as these, representing a lucrative opportunity for companies such as IBM.

“Watson and blockchain are two technologies that will rapidly change the way we live and work, and our clients in Asia Pacific are eager to lead the way in envisioning and creating that future,” said Randy Walker, CEO IBM Asia Pacific. “Here they can leverage the latest in customer experience design, use cognitive technology to draw insight from vast quantities of data, and draw on IBM’s huge investments in research and development. In partnership with our clients we are nurturing local talent and building an ecosystem to accelerate the development of cognitive solutions and blockchain platforms.”

It would appear the IBM team will be focusing on the financial services, healthcare and tourism industries in the first instance, and the team already has a number of wins in place including Parkway Pantai, DBS Bank and ZUMATA Technologies. The Asian markets have seemingly been a target for Big Blue, and are one of the areas where the company has been seeing positive results in recent months. Despite reporting its 16th consecutive quarterly revenue decline in April, the Asian markets were one of the few areas where the team saw growth.

Watson has seemingly been the focal point of the company’s efforts to redefine its market position, as the team aims to position itself firmly in the cloud space. Last month the team announced it would teach Watson Korean in an effort to increase the usage and adoption of cloud computing within the region, and acquisitions over recent months have been geared more towards the IoT business unit.

“So where are we in the transformation?” said Martin Schroeter, CFO at IBM during the quarterly earnings call. “It is continued focus on shifting our investments into those strategic imperatives, it is making sure that the space we’re moving to is higher margin and higher profit opportunity for us and then making sure we’re investing aggressively to keep those businesses growing.”

HPE gives IoT portfolio an edgy feel

HPE has unveiled new capabilities and partnerships to bring real-time data analytics and IoT insight to the network edge, reports Telecoms.com.

The team claims its new offerings, Edgeline EL1000 and Edgeline EL4000, are the first converged systems for the Internet of Things, capable of integrating data capture, analysis and storage at the source of collection. Transport and storage of data for analytics are becoming prohibitively expensive, the company claims, so the new products offer decision making insight at the network edge to reduce costs and complexities.

HPE claims the new offerings are capable of delivering heavy-duty data analytics and insights, graphically intense data visualization, and real-time response at the edge. Until recently, the technology to drive edge analytics has not been available, meaning data has had to be transferred to the network core to acquire insight. The team has also announced the launch of the Vertica Analytics Platform, which offers in-database machine learning algorithms and closed-loop analytics at the network edge.

“Organizations that take advantage of the vast amount of data and run deep analytics at the edge can become digital disrupters within their industries,” said Mark Potter, CTO of the Enterprise Group at HPE. “HPE has built machine learning and real time analytics into its IoT platforms, and provides services that help customers understand how data can best be leveraged, enabling them to optimize maintenance management, improve operations efficiency and ultimately, drive significant cost savings.”

The news follows an announcement from IBM and Cisco last week which also focused on IoT at the edge. Alongside the product launches from HPE, the team also announced a partnership with GE Digital to create more relevant propositions for industry. The partnership focuses on combining HPE technical know-how with GE’s industrial expertise and its Predix platform to create IoT-optimized hardware and software. GE’s Predix platform will be a preferred software solution for HPE’s industrial-related use cases and customers.

While the promise of IoT has given the industry plenty to get excited about in recent years, the full potential has been difficult to realize due to the vast amount of data which needs to be transported to the network core to process and drive insight from. Although it would seem logical to process the data at the source of collection, technical capabilities have not been at the point where this is possible. Recent advances from the IBM/Cisco and HPE/GE partnerships are removing the need to transfer information, and with it the risk of bottlenecks, points of failure and storage expenses in the IoT process.
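The paragraph above is the crux of the edge-analytics argument, and a minimal sketch helps show why it cuts transport costs: readings are aggregated where they are collected, and only a small summary plus any anomalies are forwarded to the core. This is an illustrative assumption of the general pattern, not HPE’s, GE’s or IBM/Cisco’s actual software; send_to_core(), the window size and the threshold are all hypothetical.

```python
# Illustrative sketch: summarise sensor readings at the edge and forward only
# aggregates and anomalies, rather than shipping every raw sample to the core.
# All names, thresholds and the window size are hypothetical.

import random
import statistics
from typing import List


def send_to_core(payload: dict) -> None:
    """Stand-in for the (expensive) uplink to the central data centre."""
    print("-> core:", payload)


def process_window(readings: List[float], anomaly_threshold: float = 3.0) -> None:
    """Aggregate one window of readings locally and forward only what matters."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings) or 1.0
    anomalies = [r for r in readings if abs(r - mean) / stdev > anomaly_threshold]

    # One small summary replaces the whole raw window on the wire.
    send_to_core({"samples": len(readings), "mean": round(mean, 2),
                  "anomalies": anomalies})


if __name__ == "__main__":
    window = [random.gauss(70.0, 1.5) for _ in range(600)]  # e.g. pump temperature
    window[300] = 95.0  # inject a fault to show anomaly forwarding
    process_window(window)
```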

“In order to fully take advantage of the Industrial IoT, customers need data-centre-grade computing power, both at the edge – where the action is – and in the cloud,” said Potter. “With our advanced technologies, customers are able to access data centre-level compute at every point in the Industrial IoT, delivering insight and control when and where needed.”

Applications for the edge-analytics proposition could be quite wide-ranging, from production lines in Eastern Europe to oil rigs in the North Sea to smart energy grids in Copenhagen. It would appear the team is not only targeting industrial segments, where IoT could ensure faster and more accurate decision making in the manufacturing process for instance, but also those assets which do not have reliable or consistent connectivity.

44% of consumers have issues with wearables functionality

Findings from Ericsson ConsumerLab claim consumer enthusiasm for wearables technology is still growing but vendors are not meeting price or functionality expectations, reports Telecoms.com.

The research focused on opinions from 5,000 smartphone users in Brazil, China, South Korea, the UK and the US, though it’s worth noting 50% of respondents were current owners of wearable technology, a much higher proportion than in the general public. While the statistics demonstrated there is still an appetite for wearable technologies outside of fitness applications, price could be a barrier to entry, and customer expectations on functionality generally exceed what vendors are currently able to offer.

32% of respondents said they would be interested in or willing to buy a panic/SOS button, and 25% said the same for an identity authentication device. Smart watches were still of interest as 28% said they would have an interest in purchasing such a device, but this statistic contradicts recent reports that the segment has been declining; Strategy Analytics forecast a 12% decline in Apple Watch sales this year after a strong launch. A third of non-users stated the cost of keeping digital devices connected is a key reason why they haven’t invested in wearable technology to date.

While the SA report could indicate a slight hiccup in the adoption of wearables, this is backed up to a degree by the Ericsson report, which states 10% of wearable users have abandoned the technology, mainly due to the capabilities on offer. A common cause of dissatisfaction is that customers feel tethered to their smartphone, as the wearable device does not have standalone features. This could also be tied into the overall value/price proposition of the devices, as they could be seen as products of convenience as opposed to smartphone replacements.

In terms of the reasoning for abandoning wearables, over half of respondents said the devices did not meet expectations. 21% highlighted limited functionality and uses, 23% stated the device not being standalone or lacking inbuilt connectivity was the reason, whereas 9% cited inaccurate data and information. Despite the concerns over functionality, 83% of respondents said they expect wearables to have some form of standalone connectivity in the near future. Should this be the case, 43% believe wearables will ultimately replace smartphones.

“Although consumers show greatest interest in devices related to safety, we also see openness to wearable technology further away from today’s generation,” said Jasmeet Singh Sethi, Consumer Insight Expert, Ericsson ConsumerLab. “In five years’ time, walking around with an ingestible sensor, which tracks your body temperature and adjusts the thermostat setting automatically once you arrive home, may be a reality.” Other use cases included a smart water purifier, gesture communicator, virtual reality sports attire, emotion sensing tattoos and a wearable camera.

The survey does demonstrate long-term viability for wearable technology, though there would have to be increased functionality before it could be considered mainstream. It would appear standalone connectivity is the bare minimum required, as the current offering seemingly does not provide enough value to customers if they have to continue to carry a smartphone as well as the wearable device.