All entries by Jamie Davies

India and Brazil biggest countries for app growth

Research from App Annie highlighted that emerging markets are set to account for 45% of global app revenue by 2020, with revenue growth expected to be more than double that seen in western economies.

Revenues are estimated to reach $102.5 billion and global mobile app store downloads approximately 288 billion in 2020. The US, Japan and Western European nations can expect growth of around 12% CAGR, though this figure rises to 29% when looking at the developing nations.
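As a rough illustration of what those compound growth rates imply, the sketch below compares how 12% and 29% CAGR multiply a revenue base over the five-year period; the $10 billion starting figure is an invented number for demonstration, not a figure from the report.

```python
# Illustrative only: how 12% vs 29% CAGR compound over the 2015-2020 window.
# The $10bn starting revenue is a hypothetical figure, not taken from App Annie.

def project(start_value, cagr, years):
    """Compound start_value at an annual growth rate (cagr) for a number of years."""
    return start_value * (1 + cagr) ** years

start = 10.0  # hypothetical starting revenue, in $bn
for label, cagr in [("mature markets", 0.12), ("emerging markets", 0.29)]:
    end = project(start, cagr, years=5)
    print(f"{label}: ${start:.1f}bn -> ${end:.1f}bn ({(1 + cagr) ** 5:.2f}x)")
```

Run over five years, a 12% CAGR multiplies the base roughly 1.76x, while 29% multiplies it roughly 3.57x, which is why the emerging-market share of revenue climbs so quickly.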

As of 2015, the APAC market accounted for approximately 52% of global downloads and 55% of revenues, and these numbers are expected to increase to 57% and 58% respectively by 2020. The majority of this growth appears to be driven by the emerging markets: between 2015 and 2020, their contribution to global downloads will increase from 66% to 75%, and for revenues from 30% to 49%.

India and Brazil are two countries listed where growth will be particularly strong, mainly due to large populations, strong-performing economies and strong smartphone uptake. Estimates highlight there are approximately 200 million smartphone users in India currently, a number expected to increase to 317 million by 2019. Over the same period, India’s GDP, currently around $2.09 trillion, is estimated to rise to $3.1 trillion in 2020. Mexico, Indonesia and China also showed good potential.

First and foremost, games will drive the expansion in the emerging markets, though the more mature regions are beginning to shift towards a subscription-based model. Whether the same will hold long-term in the emerging markets remains to be seen, as there is little data to suggest similar patterns. India, for example, would still be considered in the hyper-growth stage of development, offering lucrative opportunities for developers who gain traction in the early days.

The report does shed light on some interesting statistics and could offer encouragement to app developers who have traditionally found monetization a difficult task, but it comes as no surprise to hear the report states there is likely to be more growth in emerging markets than mature ones.

AWS release statement to explain Aussie outage

AWS has cited a power failure caused by adverse weather conditions as the primary cause of the outage Australian customers experienced this weekend.

A statement on the company’s website stated its utility provider suffered a failure at the regional substation, which resulted in the total loss of utility power to multiple AWS facilities. At one of these facilities, the power redundancy didn’t work as designed and the company lost power to a large number of instances in the availability zone.

The storm this weekend was one of the worst experienced by Sydney in recent years, recording 150mm of rain over the period, with 93mm falling on Sunday 5th alone, and wind speeds reaching as high as 96km/h. The storm resulted in AWS customers losing services for up to six hours, between 11.30pm and 4.30am (PST) on June 4/5. The company claims over 80% of the impacted customer instances and volumes were back online and operational by 1am, though a latent bug in the instance management software led to a slower than expected recovery for some of the services.

While adverse weather conditions cannot be avoided, the outage is unlikely to ease concerns over public cloud propositions. Although the concept of cloud may now be considered mainstream, there are still numerous decision makers who are hesitant over placing mission critical workloads in such an environment, as it has been considered as handing control of a company’s assets to another organization. Such outages will not bolster confidence in those who are already pessimistic.

“Normally, when utility power fails, electrical load is maintained by multiple layers of power redundancy,” the statement said. “Every instance is served by two independent power delivery line-ups, each providing access to utility power, uninterruptable power supplies (UPSs), and back-up power from generators. If either of these independent power line-ups provides power, the instance will maintain availability. During this weekend’s event, the instances that lost power lost access to both their primary and secondary power as several of our power delivery line-ups failed to transfer load to their generators.”
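As a purely illustrative toy model of the redundancy scheme described in that statement (not AWS’s actual control systems), the sketch below shows why an instance only goes dark when both independent line-ups fail to deliver power from any of their sources, which is what happened when the generators did not pick up the load.

```python
# Toy model of the dual power line-up redundancy AWS describes; illustrative only.

def lineup_has_power(utility_ok, ups_charged, generator_on_load):
    """A line-up delivers power if utility, the UPS (until depleted), or its generator does."""
    return utility_ok or ups_charged or generator_on_load

def instance_available(lineup_a, lineup_b):
    """An instance stays up if either independent line-up is delivering power."""
    return lineup_has_power(*lineup_a) or lineup_has_power(*lineup_b)

# Expected utility failure: generators take the load on both line-ups -> instance stays up.
print(instance_available((False, True, True), (False, True, True)))      # True

# The weekend event: utility lost, generators failed to take the load, UPSs depleted.
print(instance_available((False, False, False), (False, False, False)))  # False
```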

In efforts to avoid similar episodes in the future, the team have stated additional breakers will be added to break connections to degraded utility power more quickly, allowing generators to activate before uninterruptable power supply systems are depleted. The team have also prioritized reviewing and redesigning the power configuration process in their facilities to prevent similar power sags from affecting performance in the future.

“We are never satisfied with operational performance that is anything less than perfect, and we will do everything we can to learn from this event and use it to drive improvement across our services,” the company said.

HPE give IoT portfolio an edgy feel

HPE has unveiled new capabilities and partnerships to bring real-time data analytics and IoT insight to the network edge, reports Telecoms.com.

The team claims its new offerings, Edgeline EL1000 and Edgeline EL4000, are the first converged systems for the Internet of Things, capable of integrating data capture, analysis and storage at the source of collection. Transport and storage of data for analytics are becoming prohibitively expensive, the company claims, so the new products offer decision making insight at the network edge to reduce costs and complexities.

HPE claims the new offerings are capable of delivering heavy-duty data analytics and insights, graphically intense data visualization, and real-time response at the edge. Until recently, the technology to drive edge analytics has not been available, meaning data has had to be transferred to the network core to acquire insight. The team have also announced the launch of Vertica Analytics Platform which offers in-database machine learning algorithms and closed-loop analytics at the network edge.

“Organizations that take advantage of the vast amount of data and run deep analytics at the edge can become digital disrupters within their industries,” said Mark Potter, CTO of the Enterprise Group at HPE. “HPE has built machine learning and real time analytics into its IoT platforms, and provides services that help customers understand how data can best be leveraged, enabling them to optimize maintenance management, improve operations efficiency and ultimately, drive significant cost savings.”

The news follows an announcement from IBM and Cisco last week which also focused on IoT at the edge. Alongside the product launches from HPE, the team also announced a partnership with GE Digital to create more relevant propositions for industry. The partnership focuses on combining HPE technical know-how with GE’s industrial expertise and its Predix platform to create IoT-optimized hardware and software. GE’s Predix platform will be a preferred software solution for HPE’s industrial-related use cases and customers.

While the promise of IoT has given the industry plenty to get excited about in recent years, the full potential has been difficult to realize due to the vast amount of data which needs to be transported to the network core before insight can be drawn from it. Although it would seem logical to process the data at the source of collection, technical capabilities have not been at the point where this has been possible. Recent advances from the IBM/Cisco and HPE/GE partnerships are removing the need to transfer information, and with it the risk of bottlenecks, points of failure and storage expenses from the IoT process.
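To make the data-transport argument concrete, the sketch below shows edge-side aggregation in its simplest form: summarise locally, forward only the summary. It is a generic illustration, not the HPE Edgeline or Vertica APIs, and the sensor readings are invented.

```python
# Generic edge-analytics illustration: reduce raw sensor readings to a compact summary
# at the point of collection, so only the summary travels to the network core.
from statistics import mean

def summarise_at_edge(readings, alarm_threshold):
    """Collapse a window of raw readings into a small payload, flagging anomalies locally."""
    peak = max(readings)
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": peak,
        "alarm": peak > alarm_threshold,
    }

raw_window = [71.2, 70.8, 72.5, 95.3, 71.9]  # e.g. temperature samples from a rig sensor
summary = summarise_at_edge(raw_window, alarm_threshold=90.0)
print(summary)  # this small dictionary, not the raw stream, is what gets sent upstream
```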

“In order to fully take advantage of the Industrial IoT, customers need data-centre-grade computing power, both at the edge – where the action is – and in the cloud,” said Potter. “With our advanced technologies, customers are able to access data centre-level compute at every point in the Industrial IoT, delivering insight and control when and where needed.”

Applications for the edge-analytics proposition could be quite wide, ranging from production lines in Eastern Europe to oil rigs in the North Sea to smart energy grids in Copenhagen. It would appear the team are not only targeting industrial segments, where IoT could ensure faster and more accurate decision making in the manufacturing process for instance, but also those assets which do not have reliable or consistent connectivity.

UK Government passes spy bill with strong majority

The House of Commons has voted in favour of the Investigatory Powers Bill which gives UK intelligence agencies greater power to examine browsing histories and hack phones, reports Telecoms.com.

The bill, which now passes through to the House of Lords, has been under scrutiny since last year, with the latest version being reviewed since March. The original version of the bill, known as the ‘Snooper’s Charter’ by critics, came up against strong opposition from a host of technology companies who have registered privacy concerns. The bill itself will require technology companies to collect and store data on customers, while also allowing intelligence agencies to remotely access smartphones and other devices.

“The Bill provides a clear and transparent basis for powers already in use by the security and intelligence services, but there need to be further safeguards,” said Harriet Harman, MP for Camberwell and Peckham and Chair of the Joint Committee on Human Rights. “Protection for MP communications from unjustified interference is vital, as it is for confidential communications between lawyers and clients, and for journalists’ sources. The Bill must provide tougher safeguards to ensure that the Government cannot abuse its powers to undermine Parliament’s ability to hold the Government to account.”

Although proposed by the Conservative party, the bill was strongly supported by the Labour party as well as the majority of the commons, with opposition primarily coming from the Scottish National Party. Despite privacy and civil rights concerns from the SNP, the bill passed with a vote of 444 to 69. The vote in the House of Lords is expected to take place in the next couple of months with the bill being passed to law in January 2017.

The bill was deemed a high priority for intelligence agencies within the UK, though it has been under scrutiny from the Joint Committee on Human Rights after concerns it could potentially violate privacy and civil rights. As part of the review, extended protection will also be granted to lawyers and journalists.

“The Joint Committee heard from 59 witnesses in 22 public panels,” said Victoria Atkins, MP for Louth and Horncastle, speaking on behalf of the Joint Committee on Human Rights and the Bill Committee. “We received 148 written submissions, amounting to 1,500 pages of evidence. We visited the Metropolitan police and GCHQ, and we made 87 recommendations, more than two thirds of which have been accepted by the Home Office.”

One of the initial concerns was a permanently open backdoor which could be accessed by intelligence agencies without oversight, which has seemingly been addressed. Intelligence agencies will have to request access, which will be granted should it not be too complicated or expensive. A definition of ‘complicated’ or ‘expensive’ has not been given, however it does appear to end concerns of a government ‘all-access-pass’. Whether this is enough of a concession for the technology companies remains to be seen.

Microsoft, HPE and Cisco take top spot for infrastructure vendors

Microsoft, HPE and Cisco have been named as three of the leading names in the cloud industry by Synergy Research as the firm wraps up the winners and losers for the first quarter.

While the cloud infrastructure market has been growing consistently at an average rate of 20% year-on-year, 2016 Q1 was estimated at 13%, though this was to be expected following peak sales during the latter stages of 2015. Microsoft led the way for cloud infrastructure software, whereas HPE led the private cloud hardware market segment, and Cisco led the public cloud hardware segment.

“With spend on cloud services growing by over 50% per year and spend on SaaS growing by over 30%, there is little surprise that cloud operator capex continues to drive strong growth in public cloud infrastructure,” said Jeremy Duke, Synergy Research Group’s Chief Analyst. “But on the enterprise data centre side too we continue to see a big swing towards spend on private cloud infrastructure as companies seek to benefit from more flexible and agile IT technology. The transition to cloud still has a long way to go.”

For the last eight quarters total spend on data centre infrastructure has been running at an average of $29 billion, with HPE controlling the largest share of cloud infrastructure hardware and software over the course of 2015. Cloud deployments or shipments of systems that are cloud enabled now account for well over half of the total data centre infrastructure market.

44% of consumers have issues with wearables functionality

Findings from Ericsson ConsumerLab claim consumer enthusiasm for wearables technology is still growing but vendors are not meeting price or functionality expectations, reports Telecoms.com.

The research focused on opinions from 5,000 smartphone users from Brazil, China, South Korea, the UK and the US, though it’s worth noting 50% of respondents were current owners of wearables technology, a much higher proportion than in the general public. While the statistics demonstrated there is still an appetite for wearable technologies outside of fitness applications, price could be a barrier to entry, and customer expectations on functionality generally exceed what vendors are currently able to offer.

32% of respondents said they would be interested in or willing to buy a Panic/SOS button, and 25% said the same for an identity authentication device. Smartwatches were still of interest, as 28% said they would consider purchasing such a device, though this statistic contradicts recent reports that the segment has been declining; Strategy Analytics forecasted a 12% decline in Apple Watch sales this year after a strong launch. A third of non-users stated the cost of keeping digital devices connected is a key reason why they haven’t invested in wearable technology to date.

While the SA report could indicate a slight hiccup in the adoption of wearables, this is backed up to a degree by the Ericsson report, which states 10% of wearable users abandoned the technology, mainly due to the capabilities on offer. A common cause of dissatisfaction is that customers feel tethered to their smartphone, as the wearable device does not have standalone features. This could also be tied to the overall value/price proposition of the devices, as they could be seen as products of convenience as opposed to smartphone replacements.

In terms of the reasoning for abandoning wearables, over half of respondents said the devices did not meet expectations. 21% highlighted limited functionality and uses, 23% stated the device not being standalone or lacking inbuilt connectivity was the reason, whereas 9% cited inaccurate data and information. Despite the concerns over functionality, 83% of respondents said they expect wearables to have some form of standalone connectivity in the near future. Should this be the case, 43% believe wearables will ultimately replace smartphones.

“Although consumers show greatest interest in devices related to safety, we also see openness to wearable technology further away from today’s generation,” said Jasmeet Singh Sethi, Consumer Insight Expert, Ericsson ConsumerLab. “In five years’ time, walking around with an ingestible sensor, which tracks your body temperature and adjusts the thermostat setting automatically once you arrive home, may be a reality.” Other use cases included a smart water purifier, gesture communicator, virtual reality sports attire, emotion sensing tattoos and a wearable camera.

The survey does demonstrate long-term viability for wearable technology, though there would have to be increased functionality before it could be considered mainstream. It would appear standalone connectivity would be the bare minimum required, as the current offering seemingly does not provide enough value to customers if they have to continue to carry a smartphone as well as the wearable device.

Consumer buying decisions still based on price – Nokia

Research from Nokia has highlighted consumer buying decisions for smartphones and post-paid contracts are still based on financial drivers as opposed to value add propositions, reports Telecoms.com.

With the worldwide number of smartphone users and the total number of mobile phone users estimated to exceed 2.6 billion and 5 billion respectively by 2019, the race is now on for operators to capture the biggest share of this lucrative market. Nokia’s research addressed a number of factors surrounding churn rate and customer acquisition, as well as wider trends, though concerns could be raised that financial drivers for purchasing decisions place operators in a similar arena to utility companies.

In recent years operators have tried to shift the focus of the consumer away from price, and move purchasing decisions towards value and performance. T-Mobile US announced a further prong to its ‘Un-carrier’ strategy this week, as it will reward customers with stock, seemingly for nothing in return in the first instance, though additional shares can be acquired by referring new customers to the brand. There have been similar efforts from operators around the world, though the statistics do not suggest these have had a significant impact.

Comparing 2014 and 2016, cost and billing remained the factor most commonly cited as influencing attitudes on retention, though the proportion of respondents citing it dropped from 45% to 40%. In terms of the reasons for choosing a new operator, 45% stated this would be based on price, with value adds, mobile technology and choice of devices only accounting for 17%, 14% and 11% respectively. The quality of a network is also a concern, though the drivers behind choosing a new operator or staying with an existing one are still predominantly price driven.

While price is still the number one consideration for customers, the statistics do highlight that value added services are having more of an impact on customer retention than acquisition. In terms of definitions, core operator offerings such as SMS, data and minutes were not included in the research; value added services increased the likelihood of a customer staying with an operator by 11%, the perception of a network’s quality was up 55%, and the number of customers using more than one gigabyte of data per month was also up 15%.

While operators are generally perceived as trying to avoid competing for new customers solely on price, the research does seem to indicate this would be the most effective route. While retention can seemingly be influenced by value adds, a utility model may be difficult to avoid for customer acquisition.

“We can see the marketing battles to acquire mobile subscribers are fierce,” said Bhaskar Gorti, Applications & Analytics president at Nokia. “What we don’t see as well is the work operators do every day to retain customers. Our study shows how important that work is – and also how challenging it is as customers, attached to their phones, demand higher levels of service.”

In line with industry expectations, 4G usage is on the increase, with 38% of new subscribers over the last 12 months choosing 4G networks. The uptake is mainly witnessed in the mature markets, with Japan and the US showing the highest levels of adoption, though respondents to the survey highlighted there are still barriers to adoption. For those who are not using 4G currently, a device which doesn’t support 4G or the price being too high were the main reasons given.

SolarWinds acquires LogicNow to form new MSP business unit

SolarWinds has completed the acquisition of LogicNow, which it plans to merge with the N-able business unit to create SolarWinds MSP. The new company will now serve 18,000 managed service providers worldwide, managing more than five million end-points and one million mailboxes.

For LogicNow General Manager Alistair Forbes, combining his company’s expertise with the capabilities of SolarWinds was an opportunity to take the business to the next level.

“This acquisition is the culmination of a journey which we’ve been on for the last 12 years,” said Forbes. “We saw the opportunity to combine with SolarWinds and the N-able division, and really shift gears into the next phase of our business. Since N-able was acquired by SolarWinds we’ve really seen them become a much more prominent player in the market.

“If you have a look at the opportunity SolarWinds gave N-able, we see this as the best way we can accelerate the growth of the LogicNow business and take it to the next level.”

It is worth noting that the growth of LogicNow has not hit a glass ceiling; Forbes highlighted the business has grown 40% over the last twelve months. However, the association with SolarWinds can open new doors for the team. While LogicNow is in itself a respected brand in the industry, SolarWinds has made its name as a specialist for enterprise scale organizations. By leaning on the SolarWinds brand, Forbes believes opportunities will be created which would have been significantly harder to realize by taking the organic route.

The SolarWinds MSP business will now focus on a number of areas including remote monitoring and management, security including anti-malware, multi-vendor patch management and web access control, backup and disaster recovery, data analytics and risk and vulnerability assessment, amongst other areas.

First and foremost, the new brand will focus on understanding the technology, expertise and assets which are now available, to both business units. “The immediate focus of the business will be to take the combined assets and see what we can create,” said Forbes. “There will be some areas of overlap and also a few redundancies, but nothing massive. This acquisition is all about putting two and two together to make something bigger.”

For the moment, the LogicNow and SolarWinds N-able brands will continue, though this will be phased out over time. Internally, both units are working to shift the culture from the separate businesses to the SolarWinds MSP mentality, though it is thought the restructuring and integration process will be a relatively simple one. For the most part, there is little overlap, and although certain functions will require redundancies, there are only a couple of offices which would be deemed to clash. Boulder, Colorado is one of those offices, and there will be a requirement to merge into one physical location, though the headcount reduction will be minimized overall, Forbes claims.

Microsoft launches Spark for Azure HDInsight and pushes into consumer AI

Almost 12 months after releasing the public preview of Spark for Azure HDInsight, Microsoft has announced general availability of the proposition to the industry, as well as extending Cortana’s offering to the Xbox.

Making the announcement on the Azure blog, Oliver Chiu, Product Marketing Manager for Hadoop/Big Data and Data Warehousing, outlined the improvements made on the offering as well as the company’s efforts to make big data easy and more approachable. The company claims the Hadoop and Spark cloud service is now an enterprise-ready solution which is fully managed, secured, and highly available.

“Since we announced the public preview, Spark for HDInsight has gained rapid adoption and is now 50% of all new HDInsight clusters deployed,” said Chiu. “With GA (General Availability), we are revealing improvements we’ve made to the service to make Spark hardened for the enterprise and easy for your users. This includes improvements to the availability, scalability, and productivity of our managed Spark service.”

Features of the new service include additions to the YARN resource manager in the form of an open source, Apache-licensed REST web service, integration between Spark and Azure Data Lake Store to increase scalability and security, and support for Jupyter (iPython) notebooks and Power BI to build interactive visualizations over data of any size.
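For context, the kind of workload such a cluster runs might look like the minimal PySpark sketch below; the storage path and column names are hypothetical placeholders, and the snippet sticks to standard Spark APIs rather than anything HDInsight-specific.

```python
# Minimal PySpark sketch of a typical aggregation job on a Spark cluster.
# The wasb:// path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-aggregation").getOrCreate()

events = spark.read.csv("wasb:///example/data/events.csv",
                        header=True, inferSchema=True)

# Count events per day; the resulting small table is easy to chart from a
# Jupyter notebook or to pull into a Power BI visualization.
daily = (events.groupBy("event_date")
               .agg(F.count("*").alias("event_count"))
               .orderBy("event_date"))

daily.show()
spark.stop()
```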

Alongside the Spark announcement, the Microsoft team has also ventured into the consumer AI market as Cortana will now be available on Xbox One, Xbox Live and Windows Stores. Starting with a limited Xbox Preview audience for Xbox One, the offering will be more widely released to the Xbox app (beta) on Windows 10 over the course of the summer.

Starting out in the US, UK, France, Italy, Germany and Spain, Cortana voice commands on Xbox One will work with both headsets and Kinect. At first, users will be able to find new games, see what their friends are up to, start a party and accomplish common tasks, though new features will be added over time.

Despite the company suffering a very public set-back in the AI world with the malfunction of Tay, the team have pressed forward, seemingly prioritizing AI for new features throughout the portfolio. CEO Satya Nadella stated at the Microsoft Build 2016 event AI would feature heavily in future investments, as the team target a Conversation-as-a-Platform proposition. The Cortana Intelligence Suite, which was launched at the event, allows developers to build apps and bots which interact with customers in a personalized way, but also react to real-world developments in real-time.

“As an industry, we are on the cusp of a new frontier that pairs the power of natural human language with advanced machine intelligence,” said Nadella. “At Microsoft, we call this Conversation-as-a-Platform, and it builds on and extends the power of the Microsoft Azure, Office 365 and Windows platforms to empower developers everywhere.”

IBM launches interactive ads on Watson

IBM has announced the launch of Watson Ads to harness the AI potential of its cognitive computing platform and create interactive ads, personalized to individual customers. The first offerings of the initiative will be made available through The Weather Company sub-brand.

Personalized advertising has proved to be big business in recent months as brands aim to move away from the blanket marketing approach and towards a proposition where one-to-one communications are the norm. IBM believe Watson’s ability to understand and comprehend natural language will enable advertisers to interact with customers on a more personal level, and also on a wide scale.
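As a conceptual illustration of that interaction pattern, the toy example below routes a consumer’s free-text question to a tailored response with a simple keyword matcher; it is a stand-in for the idea, not the Watson natural language services themselves, and the canned responses are invented.

```python
# Conceptual toy example of an interactive ad answering a free-text question.
# A real deployment would rely on Watson's language services; this keyword
# matcher is only a stand-in to show the request/response shape.

RESPONSES = {
    "rain": "Light showers are expected this afternoon - a compact umbrella travels well.",
    "recipe": "Here is a quick weeknight recipe using the ingredients on offer.",
    "price": "The advertised bundle starts at $19.99 this week.",
}

def answer(question):
    """Return the first canned response whose keyword appears in the question."""
    text = question.lower()
    for keyword, reply in RESPONSES.items():
        if keyword in text:
            return reply
    return "Tell me a little more and I can point you to the right product."

print(answer("Will it rain during my commute tomorrow?"))
```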

“The dawn of cognitive advertising is truly a watershed moment. Now as part of IBM, we have even more tools and technologies at our disposal to inspire innovations within advertising, artificial intelligence and storytelling,” said Domenic Venuto, GM of Consumer products at The Weather Company. “This is a huge opportunity to expose consumers to all of the surprising and delightful experiences that Watson has in store for them – and to make advertising a truly valuable interaction for both our fans and our marketing partners, which is always our goal.”

IBM claim the new proposition will aid advertisers in numerous ways, including a better understanding of brand perception and customer favourability, helping customers make more informed decisions, improving overall experience, optimizing creative and advertising strategies, and helping marketers use data more effectively.

As part of the initiative, the team will also create the Watson Ads Council, a collection of marketers from various verticals, who will act as a sounding board for the latest innovations leveraging Watson Ads and cognitive advancements in advertising.

“Transforming ourselves and industries is part of The Weather Company DNA,” said Jeremy Steinberg, Global Head of Sales at The Weather Company. “We’ve embraced big data and leveraged it to improve every aspect of our business, from forecast accuracy to ad targeting. Now we’ve set our sights on cognition. We believe human interaction is the new ‘search,’ and that cognitive advertising is the next frontier in marketing – and we’re leading the charge to make it a reality.”

Watson Ads will launch first exclusively across The Weather Company properties, but this is expected to have broad implications for other marketing channels, including out of home, television, connected cars and social media platforms.