
India and Brazil biggest countries for app growth

Research from App Annie suggests emerging markets are set to account for 45% of global app revenue by 2020, with revenue growth expected to be double that of western economies.

Revenues are estimated to reach $102.5 billion and global mobile app store downloads approximately 288 billion in 2020. The US, Japan and Western European nations can expect growth of around 12% CAGR, though this figure rises to 29% for developing nations.
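As a back-of-the-envelope illustration of how those CAGR figures combine into the $102.5 billion headline number, the sketch below back-solves hypothetical 2015 base revenues from the article's own 2020 projections; the base figures are derived purely for illustration and are not App Annie's published data.

```python
# Compound annual growth: value_n = value_0 * (1 + rate) ** years.
# Base-year revenues below are back-solved from the article's 2020 figures
# (a 45% emerging-market share of $102.5bn), purely for illustration.
def project(base_bn, cagr, years=5):
    """Project a revenue figure forward at a compound annual growth rate."""
    return base_bn * (1 + cagr) ** years

mature_2015 = 32.0     # hypothetical $bn, growing at 12% CAGR
emerging_2015 = 12.9   # hypothetical $bn, growing at 29% CAGR

mature_2020 = project(mature_2015, 0.12)      # ~$56.4bn
emerging_2020 = project(emerging_2015, 0.29)  # ~$46.1bn
total = mature_2020 + emerging_2020           # ~$102.5bn
print(f"2020 total: ${total:.1f}bn, emerging share: {emerging_2020 / total:.0%}")
```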

As of 2015, the APAC market accounted for approximately 52% of global downloads and 55% of revenues, though these numbers are expected to increase to 57% and 58% respectively by 2020. The majority of this growth is driven by the emerging markets: between 2015 and 2020, their contribution to global downloads is forecast to increase from 66% to 75%, and their contribution to revenues from 30% to 49%.

India and Brazil are two countries listed where growth will be particularly strong, mainly due to large populations, strong-performing economies and high smartphone uptake. Estimates suggest there are approximately 200 million smartphone users in India currently, a number expected to increase to 317 million by 2019. Over the same period, India's GDP, currently around $2.09 trillion, is estimated to rise to $3.1 trillion by 2020. Mexico, Indonesia and China also showed good potential.

First and foremost, games will drive the expansion in the emerging markets, though the more mature regions are beginning to witness more of a subscription-based model. Whether the same will hold long-term in the emerging markets remains to be seen, as there is little data to suggest similar patterns. India, for example, would still be considered to be in the hyper-growth stage of development, offering lucrative opportunities for developers who gain traction early.

The report does shed light on some interesting statistics and could offer encouragement to app developers who have traditionally found monetization a difficult task, though it comes as no surprise that growth is expected to be stronger in emerging markets than in mature ones.

AWS release statement to explain Aussie outage

AWS has identified a power failure caused by adverse weather conditions as the primary cause of the outage Australian customers experienced this weekend.

A statement on the company’s website said its utility provider suffered a failure at a regional substation, which resulted in the total loss of utility power to multiple AWS facilities. At one of these facilities, the power redundancy didn’t work as designed and the company lost power to a large number of instances in the availability zone.

The storm this weekend was one of the worst experienced by Sydney in recent years, bringing 150mm of rain over the period, with 93mm falling on Sunday June 5 alone, and wind speeds reaching as high as 96 km/h. The storm resulted in AWS customers losing services for up to six hours, between 11.30pm and 4.30am (PST) on June 4/5. The company claims over 80% of the impacted customer instances and volumes were back online and operational by 1am, though a latent bug in the instance management software led to a slower than expected recovery for some of the services.

While adverse weather conditions cannot be avoided, the outage is unlikely to ease concerns over public cloud propositions. Although the concept of cloud may now be considered mainstream, there are still numerous decision makers who are hesitant to place mission-critical workloads in such an environment, as it can be seen as handing control of a company’s assets to another organization. Such outages will do nothing to bolster the confidence of those who are already pessimistic.

“Normally, when utility power fails, electrical load is maintained by multiple layers of power redundancy,” the statement said. “Every instance is served by two independent power delivery line-ups, each providing access to utility power, uninterruptable power supplies (UPSs), and back-up power from generators. If either of these independent power line-ups provides power, the instance will maintain availability. During this weekend’s event, the instances that lost power lost access to both their primary and secondary power as several of our power delivery line-ups failed to transfer load to their generators.”

In efforts to avoid similar episodes in the future, the team have stated additional breakers will be added to break connections to degraded utility power more quickly, allowing generators to activate before uninterruptable power supply (UPS) systems are depleted. The team have also prioritized reviewing and redesigning the power configuration process in their facilities to prevent similar power sags from affecting performance in the future.

“We are never satisfied with operational performance that is anything less than perfect, and we will do everything we can to learn from this event and use it to drive improvement across our services,” the company said.

HPE give IoT portfolio an edgy feel

HPE has unveiled new capabilities and partnerships to bring real-time data analytics and IoT insight to the network edge, reports Telecoms.com.

The team claims its new offerings, the Edgeline EL1000 and Edgeline EL4000, are the first converged systems for the Internet of Things, capable of integrating data capture, analysis and storage at the source of collection. Transport and storage of data for analytics are becoming prohibitively expensive, the company claims, so the new products offer decision-making insight at the network edge to reduce costs and complexities.

HPE claims the new offerings are capable of delivering heavy-duty data analytics and insights, graphically intense data visualization, and real-time response at the edge. Until recently, the technology to drive edge analytics has not been available, meaning data has had to be transferred to the network core to acquire insight. The team have also announced the launch of the Vertica Analytics Platform, which offers in-database machine learning algorithms and closed-loop analytics at the network edge.

“Organizations that take advantage of the vast amount of data and run deep analytics at the edge can become digital disrupters within their industries,” said Mark Potter, CTO of the Enterprise Group at HPE. “HPE has built machine learning and real time analytics into its IoT platforms, and provides services that help customers understand how data can best be leveraged, enabling them to optimize maintenance management, improve operations efficiency and ultimately, drive significant cost savings.”

The news follows an announcement from IBM and Cisco last week which also focused on IoT at the edge. Alongside the product launches from HPE, the team also announced a partnership with GE Digital to create more relevant propositions for industry. The partnership focuses on combining HPE technical know-how with GE’s industrial expertise and its Predix platform to create IoT-optimized hardware and software. GE’s Predix platform will be a preferred software solution for HPE’s industrial-related use cases and customers.

While the promise of IoT has given the industry plenty to get excited about in recent years, its full potential has been difficult to realize due to the vast amount of data which needs to be transported to the network core to be processed into insight. Although it would seem logical to process the data at the source of collection, technical capabilities have not been at the point where this has been possible. Recent advances from the IBM/Cisco and HPE/GE partnerships remove the need to transfer that data, and with it the risk of bottlenecks, points of failure and storage expenses in the IoT process.
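As a simple illustration of the principle at work (a sketch of edge aggregation in general, not HPE's implementation), the snippet below summarizes hypothetical sensor readings at the edge so that only a compact digest, rather than every raw reading, has to travel to the core:

```python
import random
import statistics

def read_sensor():
    """Stand-in for a real sensor read; values are hypothetical."""
    return random.gauss(75.0, 5.0)  # e.g. equipment temperature in Celsius

def edge_window_summary(window_size=1000, alarm_threshold=95.0):
    """Aggregate a window of raw readings at the edge, returning only the
    compact summary that needs to travel to the network core."""
    readings = [read_sensor() for _ in range(window_size)]
    return {
        "mean": statistics.mean(readings),
        "max": max(readings),
        "alarms": sum(r > alarm_threshold for r in readings),
    }

# 1,000 raw readings are reduced to a three-field summary before transmission,
# cutting bandwidth and storage costs while keeping the decision-relevant signal.
print(edge_window_summary())
```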

“In order to fully take advantage of the Industrial IoT, customers need data-centre-grade computing power, both at the edge – where the action is – and in the cloud,” said Potter. “With our advanced technologies, customers are able to access data centre-level compute at every point in the Industrial IoT, delivering insight and control when and where needed.”

Applications for the edge-analytics proposition could be quite wide, ranging from production lines in Eastern Europe to oil rigs in the North Sea to smart energy grids in Copenhagen. It would appear the team are not only targeting industrial segments, where IoT could ensure faster and more accurate decision making in the manufacturing process for instance, but also those assets which do not have reliable or consistent connectivity.

UK Government passes spy bill with strong majority

The House of Commons has voted in favour of the Investigatory Powers Bill, which gives UK intelligence agencies greater powers to examine browsing histories and hack phones, reports Telecoms.com.

The bill, which now passes through to the House of Lords, has been under scrutiny since last year, with the latest version being reviewed since March. The original version of the bill, dubbed the ‘Snooper’s Charter’ by critics, came up against strong opposition from a host of technology companies who registered privacy concerns. The bill itself will require technology companies to collect and store data on customers, while also allowing intelligence agencies to remotely access smartphones and other devices.

“The Bill provides a clear and transparent basis for powers already in use by the security and intelligence services, but there need to be further safeguards,” said Harriet Harman, MP for Camberwell and Peckham and Chair of the Joint Committee on Human Rights. “Protection for MP communications from unjustified interference is vital, as it is for confidential communications between lawyers and clients, and for journalists’ sources. The Bill must provide tougher safeguards to ensure that the Government cannot abuse its powers to undermine Parliament’s ability to hold the Government to account.”

Although proposed by the Conservative party, the bill was strongly supported by the Labour party as well as the majority of the Commons, with opposition primarily coming from the Scottish National Party. Despite privacy and civil rights concerns from the SNP, the bill passed by 444 votes to 69. The vote in the House of Lords is expected to take place in the next couple of months, with the bill expected to pass into law in January 2017.

The bill was deemed a high priority for intelligence agencies within the UK, though it has been under scrutiny from the Joint Committee on Human Rights after concerns it could potentially violate privacy and civil rights. As part of the review, extended protection will also be granted to lawyers and journalists.

“The Joint Committee heard from 59 witnesses in 22 public panels,” said Victoria Atkins, MP for Louth and Horncastle, speaking on behalf of the Joint Committee on Human Rights and the Bill Committee. “We received 148 written submissions, amounting to 1,500 pages of evidence. We visited the Metropolitan police and GCHQ, and we made 87 recommendations, more than two thirds of which have been accepted by the Home Office.”

One of the initial concerns was a permanently open backdoor which could be accessed by intelligence agencies without oversight, which has seemingly been addressed. Intelligence agencies will have to request access, which will be granted should it not be too complicated or expensive. A definition of ‘complicated’ or ‘expensive’ has not been given, however it does appear to end concerns of a government ‘all-access-pass’. Whether this is enough of a concession for the technology companies remains to be seen.

Microsoft, HPE and Cisco take top spots for infrastructure vendors

Microsoft, HPE and Cisco have been named as three of the leading names in the cloud industry by Synergy Research, as the firm wraps up the winners and losers for the first quarter.

While the cloud infrastructure market has been growing consistently at an average rate of 20% year-on-year, 2016 Q1 was estimated at 13%, though this was to be expected following peak sales during the latter stages of 2015. Microsoft led the way for cloud infrastructure software, whereas HPE led the private cloud hardware market segment, and Cisco led the public cloud hardware segment.

“With spend on cloud services growing by over 50% per year and spend on SaaS growing by over 30%, there is little surprise that cloud operator capex continues to drive strong growth in public cloud infrastructure,” said Jeremy Duke, Synergy Research Group’s Chief Analyst. “But on the enterprise data centre side too we continue to see a big swing towards spend on private cloud infrastructure as companies seek to benefit from more flexible and agile IT technology. The transition to cloud still has a long way to go.”

For the last eight quarters, total spend on data centre infrastructure has been running at an average of $29 billion per quarter, with HPE controlling the largest share of cloud infrastructure hardware and software over the course of 2015. Cloud deployments, or shipments of systems that are cloud enabled, now account for well over half of the total data centre infrastructure market.

[Chart: cloud leaders]

44% of consumers have issues with wearables functionality

Findings from Ericsson ConsumerLab claim consumer enthusiasm for wearable technology is still growing, but vendors are not meeting price or functionality expectations, reports Telecoms.com.

The research focused on opinions from 5,000 smartphone users in Brazil, China, South Korea, the UK and the US, though it’s worth noting 50% of respondents were current owners of wearable technology, a much higher proportion than in the general public. While the statistics demonstrated there is still an appetite for wearable technologies beyond fitness applications, the price of entry could be a barrier, as could customer expectations on functionality generally exceeding what vendors are currently able to offer.

32% of respondents said they would be interested in or willing to buy a panic/SOS button, and 25% said the same for an identity authentication device. Smartwatches were still of interest, with 28% saying they would consider purchasing such a device, though this statistic contradicts recent reports that the segment has been declining; Strategy Analytics forecast a 12% decline in Apple Watch sales this year after a strong launch. A third of non-users stated the cost of keeping digital devices connected is a key reason why they haven’t invested in wearable technology to date.

While the Strategy Analytics report could indicate a slight hiccup in the adoption of wearables, this is backed up to a degree by the Ericsson report, which states 10% of wearable users have abandoned the technology, mainly due to the capabilities on offer. A common cause of dissatisfaction is that customers feel tethered to their smartphone, as the wearable device does not have standalone features. This could also be tied into the overall value/price proposition of the devices, as they could be seen as products of convenience as opposed to smartphone replacements.

In terms of the reasoning for abandoning wearables, over half of respondents said the devices did not meet expectations. 21% highlighted limited functionality and uses, 23% cited the fact the device was not standalone or didn’t have inbuilt connectivity, while 9% blamed inaccurate data and information. Despite the concerns over functionality, 83% of respondents said they expect wearables to have some form of standalone connectivity in the near future. Should this be the case, 43% believe wearables will ultimately replace smartphones.

“Although consumers show greatest interest in devices related to safety, we also see openness to wearable technology further away from today’s generation,” said Jasmeet Singh Sethi, Consumer Insight Expert, Ericsson ConsumerLab. “In five years’ time, walking around with an ingestible sensor, which tracks your body temperature and adjusts the thermostat setting automatically once you arrive home, may be a reality.” Other use cases included a smart water purifier, gesture communicator, virtual reality sports attire, emotion sensing tattoos and a wearable camera.

The survey does demonstrate long-term viability for wearable technology, though functionality would have to increase before it could be considered mainstream. It would appear standalone connectivity is the bare minimum required, as the current offering seemingly does not deliver enough value if customers have to carry a smartphone as well as the wearable device.

Consumer buying decisions still based on price – Nokia

Research from Nokia has highlighted that consumer buying decisions for smartphones and post-paid contracts are still based on financial drivers as opposed to value-add propositions, reports Telecoms.com.

With the worldwide number of smartphone users and total mobile phone users estimated to exceed 2.6 billion and 5 billion respectively by 2019, the race is now on for operators to capture the biggest share of this lucrative market. Nokia’s research addressed a number of factors surrounding churn rate and customer acquisition, as well as wider trends, though the dominance of financial drivers in purchasing decisions could raise concerns that operators are being pushed into a similar arena to utility companies.

Operators’ efforts in recent years have aimed to shift consumer focus away from price and move purchasing decisions towards value and performance. T-Mobile US announced a further prong to its ‘Un-carrier’ strategy this week, as it will reward customers with stock, seemingly for nothing in return in the first instance, though additional shares can be acquired by referring new customers to the brand. There have been similar efforts from operators around the world, though the statistics do not suggest these have had a significant impact.

Comparing 2014 and 2016, cost and billing remained the factor most often cited as influencing attitudes on retention, though it dropped from 45% to 40% of respondents. In terms of the reasons for choosing a new operator, 45% stated this would be based on price, with value adds, mobile technology and choice of devices accounting for only 17%, 14% and 11% respectively. The quality of a network is also a consideration, though the drivers behind choosing a new operator or staying with an existing one are still predominantly price driven.

While price is still the number one consideration for customers, the statistics do highlight that value added services are having more of an impact on customer retention than on acquisition. In terms of definitions, core operator offerings such as SMS, data and minutes were not included in the research; however, value added services increased the likelihood of a customer staying with an operator by 11%, the perception of a network’s quality was up 55%, and the number of customers using more than one gigabyte of data per month was also up 15%.

While operators are generally perceived as trying to avoid competing for new customers solely on price, the research does seem to indicate this would be the most effective route. While retention can seemingly be influenced by value adds, a utility model may be difficult to avoid for customer acquisition.

“We can see the marketing battles to acquire mobile subscribers are fierce,” said Bhaskar Gorti, Applications & Analytics president at Nokia. “What we don’t see as well is the work operators do every day to retain customers. Our study shows how important that work is – and also how challenging it is as customers, attached to their phones, demand higher levels of service.”

In line with industry expectations, 4G usage is on the increase, with 38% of new subscribers over the last 12 months choosing 4G networks. The uptake is mainly witnessed in the mature markets, with Japan and the US showing the highest levels of adoption, though respondents highlighted there are still barriers to adoption. For those not currently using 4G, the main reasons were a device which doesn’t support 4G or the price being too high.

How to turn the cloud into a competitive advantage with a scorecard approach to migration

We have seen enterprise cloud evolve a lot in recent years, going from specific workloads running in the cloud to businesses looking at a cloud-first approach for many applications and processes. This rise was also reflected in the Verizon State of the Market: Enterprise Cloud 2016 report, which found that 84% of enterprises have seen their use of cloud increase in the past year, with 87% of these now using cloud for at least one mission-critical workload. Furthermore, 69% of businesses say that cloud has enabled them to significantly reengineer one or more business processes, giving a clear sign of the fundamental impact that cloud is having on the way we do business.

These findings indicate that whilst companies will continue to leverage the cloud for niche applications, enterprises are now looking to put more business-centric applications in the cloud. This approach requires designing cloud-based applications that specifically fit each workload — taking into account geography, security, networking, service management expectations and the ability to quickly deploy the solution to meet rapidly changing business requirements. As a result, a core focus for 2016 will be the creation of individual cloud spaces that correspond to the individual needs of a given workload.

The key to cloud is collaboration

This focused alignment has led to the role of enterprise IT evolving to that of a cloud broker that must collaborate with lines of business to ensure overall success of the organisation. By using an actionable, scorecard approach for aligning cloud solutions with the needs of each workload, enterprises can make more informed assessments on how best to support applications in the cloud.

Three practical steps are as follows:

  1. Consult the Business and Assess User Requirements: IT professionals should build a relationship with their organisation’s lines of business to accurately identify critical application requirements to create the right cloud solution. Some questions to ask include:
  • What are all the barriers for successful application migration?
  • What is the importance of the application’s availability and what is the cost of downtime?
  • What regulations does the application and data need to comply with?
  • How often will IT need to upgrade the application to maintain competitive advantage?
  2. Score Applications and Build a Risk Profile: The careful assessment of the technical requirements of applications can mean the difference between a successful cloud migration and a failed one. A checklist can guide IT departments away from major pitfalls, for example:
  • Determine the load on the network
  • Factor in time to prepare the application
  • Carefully consider the costs of moving

In addition to assessing the technical requirements, IT professionals must evaluate the applications’ risk profile. Using data discovery tools to look at the data flow is instrumental to detecting breaches and mitigating any impact.

  3. Match Requirements to the Right Cloud Service Model: Choosing the best cloud model for enterprise IT requires a thorough comprehension of technical specifications and workload requirements. The following are key considerations to help IT directors partner with their business unit colleagues to define enterprise needs and determine the right cloud model.
  • Does the application’s risk profile allow it to run on shared infrastructure?
  • What proportion of the application and its data are currently based on your premises, and how much is based with a provider?
  • How much of the management of the cloud can you take on?

Cloud is empowering IT professionals to gain a greater role in effectively impacting business results. Working in the right cloud environment allows for operational efficiency, increased performance, stringent security measures and robust network connectivity.
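To make the scorecard idea concrete, here is a minimal sketch of how the three steps above might be reduced to a weighted score per application; the criteria, weights and ratings are illustrative assumptions, not Verizon's published methodology:

```python
# A minimal scorecard sketch: weighted criteria per application, producing a
# single migration-readiness score. The criteria, weights and 0-5 ratings are
# illustrative assumptions, not Verizon's published methodology.
WEIGHTS = {
    "availability_need": -2.0,   # high cost of downtime weighs against migration
    "regulatory_burden": -3.0,   # heavier compliance load weighs against migration
    "network_load":      -1.5,   # chatty apps strain the network path to the cloud
    "change_frequency":   2.0,   # frequently upgraded apps benefit from cloud agility
    "shared_infra_ok":    3.0,   # tolerance for shared infrastructure favours cloud
}

def score_application(profile):
    """profile maps each criterion to a 0-5 rating from the business consultation."""
    return sum(weight * profile.get(criterion, 0)
               for criterion, weight in WEIGHTS.items())

# Hypothetical CRM workload, rated with the lines of business in step 1.
crm = {"availability_need": 3, "regulatory_burden": 1, "network_load": 2,
       "change_frequency": 4, "shared_infra_ok": 5}
print(f"CRM migration score: {score_application(crm):.1f}")  # higher = better cloud fit
```

Scoring every candidate application this way gives IT a ranked migration backlog and a documented rationale to share with business unit colleagues.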

What’s on the horizon for cloud?

In the coming months and years, we will see an increased focus on the fundamental technology elements that enable the Internet of Things: networking, cloud, security and infrastructure. Networking and cloud computing are at the heart of IoT, comprising half of the key ingredients that make IoT possible; security and infrastructure are the other two. This is not surprising considering IoT needs reliable, flexible network connections (both wireless and wireline) to move all the collected data and information from devices back to a central processing hub, without the need for human intervention. Similarly, cloud computing provides the flexibility, scale and security to host applications and store data.

Going forward, success will not be measured by merely moving to the cloud. Success will be measured by combining favourable financials and user impact with enhanced collaboration and information sharing across a business’ entire ecosystem. Those IT departments that embrace the cloud through the creation and implementation of a comprehensive strategy — that includes strong and measurable metrics and a strong focus on managing business outcomes — will be the ones we talk about as pioneers in the years to come.

Written by Gavan Egan, Managing Director of Cloud Services at Verizon Enterprise Solutions

SolarWinds acquires LogicNow to form new MSP business unit

SolarWinds has completed the acquisition of LogicNow, which it plans to merge with the N-able business unit to create SolarWinds MSP. The new company will serve 18,000 managed service providers worldwide, managing more than five million endpoints and one million mailboxes.

For LogicNow’s General Manager Alistair Forbes, combining his company’s expertise with the capabilities of SolarWinds was an opportunity to take the business to the next level.

“This acquisition is the culmination of a journey which we’ve been on for the last 12 years,” said Forbes. “We saw the opportunity to combine with SolarWinds and the N-able division, and really shift gears into the next phase of our business. Since N-able was acquired by SolarWinds we’ve really seen them become a much more prominent player in the market.

“If you have a look at opportunity SolarWinds gave N-able, we see this as the best way we can accelerate the growth of the LogicNow business and take it to the next level.”

It is worth noting that LogicNow’s growth had not hit a glass ceiling; Forbes highlighted the business has grown 40% over the last twelve months. The association with SolarWinds, however, can open up new doors for the team. While LogicNow is itself a respected brand in the industry, SolarWinds has made its name as a specialist for enterprise-scale organizations. By leaning on the SolarWinds brand, Forbes believes opportunities will be created which would have been significantly harder to reach by taking the organic route.

The SolarWinds MSP business will now focus on a number of areas including remote monitoring and management, security including anti-malware, multi-vendor patch management and web access control, backup and disaster recovery, data analytics and risk and vulnerability assessment, amongst other areas.

First and foremost, the new brand will focus on understanding the technology, expertise and assets which are now available to both business units. “The immediate focus of the business will be to take the combined assets and see what we can create,” said Forbes. “There will be some areas of overlap and also a few redundancies, but nothing massive. This acquisition is all about putting two and two together to make something bigger.”

For the moment, the LogicNow and SolarWinds N-able brands will continue, though this will be phased out over time. Internally, both units are working to shift the culture from the separate businesses to the SolarWinds MSP mentality, though it is thought the restructuring and integration process will be a relatively simple one. For the most part, there is little overlap, and although certain functions will require redundancies, there are only a couple of offices which would be deemed to clash. Boulder, Colorado is one of those offices, and there will be a requirement to merge into one physical location, though the headcount reduction will be minimized overall, Forbes claims.

SAP and Yandex pair up to launch predictive analytics platform

SAP and Yandex Data Factory have announced a new strategic partnership to develop cloud-based predictive analytics services for the retail, e-commerce, banking and telecommunications sectors.

Using the machine learning and data analytics capabilities of Yandex Data Factory, the team will offer services such as personalised offers of goods and services, churn prediction and recommendations for customer retention, all based on the SAP HANA Enterprise Cloud platform.

While the offering could seemingly be extended to other verticals, the team are focusing on more consumer-orientated areas due to the volume of data these companies have already collected. There does not seem to be any reason why it could not be extended beyond the industries mentioned, though a suitable amount of information would have to be collected to realize the full potential of the offering.

“While the advantages of big data analysis are well-understood, many in the retail, e-commerce, banking and telecommunications sectors will have concerns over integrating data analysis technology with their existing systems,” said Alexander Khaytin, chief operating officer at Yandex Data Factory. “We wanted to remove this obstacle and therefore deliberately partnered with SAP – one of the leading providers of data and business automation tools in these sectors – so that we can offer our clients advanced predictive and prescriptive analytics, without additional integration costs.”

The Yandex big data algorithms are based on matching a customer’s profile with that of another customer who has demonstrated the same purchasing tendencies. In theory, should two customers have similar profiles, predictions about what one will do can be based on the actions of the other. Yandex claims it can reduce the costs of acquiring new customers and retaining current ones by up to 10%.
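One common way to implement this kind of profile matching is user-based collaborative filtering. The toy sketch below (an illustration of the general technique, not Yandex's actual algorithm) finds a customer's nearest neighbour by cosine similarity over purchase histories and recommends whatever the neighbour bought that the customer hasn't:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two purchase-history vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Rows: customers; columns: products (1 = purchased). Toy data only.
history = {
    "alice": [1, 1, 0, 1, 0],
    "bob":   [1, 1, 0, 0, 0],
    "carol": [0, 0, 1, 0, 1],
}

# Recommend to Bob whatever his most similar customer bought that he hasn't.
target = "bob"
neighbour = max((c for c in history if c != target),
                key=lambda c: cosine_similarity(history[c], history[target]))
recommendations = [i for i, (theirs, mine)
                   in enumerate(zip(history[neighbour], history[target]))
                   if theirs and not mine]
print(f"nearest neighbour of {target}: {neighbour}; recommend products {recommendations}")
```

At production scale the same idea runs over millions of sparse profiles with approximate nearest-neighbour search, but the matching principle is the one described above.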