Netskope study argues majority of cloud apps are not ready for EU GDPR


Three quarters of cloud apps used by businesses are not equipped for the upcoming EU General Data Protection Regulation (GDPR), according to the latest study from cloud security services provider Netskope.

The report comes hot on the heels of a survey conducted by the firm back in February, which found businesses were unsure whether they would be able to keep up with the legislation. The GDPR comes into force in two years’ time and includes provisions covering the right to be forgotten, as well as the user’s right to know when their data has been breached.

According to the research, employees used on average 777 cloud apps in a given organisation – a figure which was a slight increase from previous years. Netskope argues that 75% of the more than 22,000 apps tracked did not stand up to upcoming EU data privacy scrutiny. The majority of these violations (73.6%), perhaps not surprisingly, came from cloud storage apps. Almost 95% of the apps analysed were also not deemed to be enterprise-grade.

Yet this may not be the worst news to come out of the report. Netskope also found that 11% of enterprises surveyed were using sanctioned – in other words, IT-approved – apps laced with malware, with more than a quarter (26.2%) of malware in these apps shared with users, either internally, externally, or publicly.

“The shift to the cloud presents an increasing complexity and volume of security challenges for enterprises, including regulations like the EU GDPR,” said Netskope CEO and founder Sanjay Beri. “With the deadline for compliance looming, complete visibility into and real-time control over app usage and activity in a centralised, consistent way that works across all apps is paramount for organisations to understand how they use and protect their customers’ personal data.”

[session] Why Organizations Benefit from Cloud Bursting By @AvereSystems | @CloudExpo #Cloud

When it comes to cloud computing, the ability to turn massive amounts of compute cores on and off on demand sounds attractive to IT staff, who need to manage peaks and valleys in user activity. With cloud bursting, the majority of the data can stay on premises while tapping into compute from public cloud providers, reducing risk and minimizing the need to move large files.
In his session at 18th Cloud Expo, Scott Jeschonek, Director of Product Management at Avere Systems, will discuss the IT and business benefits that cloud bursting provides, including increased compute capacity, lower IT investment, financial agility, and, ultimately, faster time-to-market.
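For readers unfamiliar with the mechanics, the decision logic behind bursting is simple to illustrate. The Python sketch below is purely illustrative and is not Avere's implementation: jobs run on-premises while local cores are available and spill over to public cloud compute only once capacity is exhausted, with the source data staying on premises. The function names and thresholds are hypothetical.

```python
# Minimal illustration of a cloud-bursting decision, not Avere's product.
# submit_local, submit_cloud and LOCAL_CORE_LIMIT are hypothetical placeholders.

LOCAL_CORE_LIMIT = 512          # cores available on premises

def submit_local(job_id, cores):
    print(f"running {job_id} on {cores} on-prem cores")

def submit_cloud(job_id, cores):
    # Only compute bursts to the public cloud; source data stays on premises
    # and is read remotely, so large files are not copied up front.
    print(f"bursting {job_id} to {cores} cloud cores")

def schedule(jobs, local_in_use=0):
    for job_id, cores in jobs:
        if local_in_use + cores <= LOCAL_CORE_LIMIT:
            submit_local(job_id, cores)
            local_in_use += cores
        else:
            submit_cloud(job_id, cores)

schedule([("render-001", 128), ("render-002", 256), ("render-003", 300)])
```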


AWS release statement to explain Aussie outage

AWS has cited a power failure caused by adverse weather conditions as the primary cause of the outage Australian customers experienced this weekend.

A statement on the company’s website said its utility provider suffered a failure at the regional substation, which resulted in the total loss of utility power to multiple AWS facilities. At one of these facilities the power redundancy didn’t work as designed, and the company lost power to a large number of instances in the affected availability zone.

The storm this weekend was one of the worst experienced by Sydney in recent years, recording 150mm of rain over the period, with 93mm falling on Sunday 5th alone, and wind speeds reaching as high as 96km/h. The storm resulted in AWS customers losing services for up to six hours, between 11.30pm and 4.30am (PST) on June 4/5. The company claims over 80% of the impacted customer instances and volumes were back online and operational by 1am, though a latent bug in the instance management software led to a slower than expected recovery for some of the services.

While adverse weather conditions cannot be avoided, the outage is unlikely to ease concerns over public cloud propositions. Although the concept of cloud may now be considered mainstream, there are still numerous decision makers who are hesitant about placing mission critical workloads in such an environment, as it can be seen as handing control of a company’s assets to another organization. Such outages will not bolster confidence in those who are already pessimistic.

“Normally, when utility power fails, electrical load is maintained by multiple layers of power redundancy,” the statement said. “Every instance is served by two independent power delivery line-ups, each providing access to utility power, uninterruptable power supplies (UPSs), and back-up power from generators. If either of these independent power line-ups provides power, the instance will maintain availability. During this weekend’s event, the instances that lost power lost access to both their primary and secondary power as several of our power delivery line-ups failed to transfer load to their generators.”

In efforts to avoid similar episodes in the future, the team have stated that additional breakers will be added to break connections to degraded utility power more quickly, allowing generators to activate before uninterruptible power supply systems are depleted. The team have also prioritized reviewing and redesigning the power configuration process in their facilities to prevent similar power sags from affecting performance in the future.

“We are never satisfied with operational performance that is anything less than perfect, and we will do everything we can to learn from this event and use it to drive improvement across our services,” the company said.

HPE give IoT portfolio an edgy feel

HPE has unveiled new capabilities and partnerships to bring real-time data analytics and IoT insight to the network edge, reports Telecoms.com.

The team claims its new offerings, Edgeline EL1000 and Edgeline EL4000, are the first converged systems for the Internet of Things, capable of integrating data capture, analysis and storage at the source of collection. Transport and storage of data for analytics are becoming prohibitively expensive, the company claims, so the new products offer decision making insight at the network edge to reduce costs and complexities.

HPE claims the new offerings are capable of delivering heavy-duty data analytics and insights, graphically intense data visualization, and real-time response at the edge. Until recently, the technology to drive edge analytics has not been available, meaning data has had to be transferred to the network core to acquire insight. The team have also announced the launch of the Vertica Analytics Platform, which offers in-database machine learning algorithms and closed-loop analytics at the network edge.

“Organizations that take advantage of the vast amount of data and run deep analytics at the edge can become digital disrupters within their industries,” said Mark Potter, CTO of the Enterprise Group at HPE. “HPE has built machine learning and real time analytics into its IoT platforms, and provides services that help customers understand how data can best be leveraged, enabling them to optimize maintenance management, improve operations efficiency and ultimately, drive significant cost savings.”

The news follows an announcement from IBM and Cisco last week which also focused on IoT at the edge. Alongside the product launches from HPE, the team also announced a partnership with GE Digital to create more relevant propositions for industry. The partnership focuses on combining HPE technical know-how with GE’s industrial expertise and its Predix platform to create IoT-optimized hardware and software. GE’s Predix platform will be a preferred software solution for HPE’s industrial-related use cases and customers.

While the promise of IoT has given the industry plenty to get excited about in recent years, its full potential has been difficult to realize due to the vast amount of data which needs to be transported to the network core before insight can be drawn from it. Although it would seem logical to process the data at the source of collection, technical capabilities have not been at the point where this has been possible. Recent advances from the IBM/Cisco and HPE/GE partnerships remove the need to transfer that information, and with it the risk of bottlenecks, points of failure and storage expenses from the IoT process.
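To make the data-reduction argument concrete, the short Python sketch below shows one way edge-side processing can work in principle: raw sensor readings are summarised at the point of collection and only a small summary record is forwarded to the network core. It is an illustrative example only, not HPE Edgeline or Vertica code; the send_to_core call, the sensor format and the threshold are hypothetical.

```python
# Hedged sketch of edge-side aggregation: raw readings are summarised locally
# and only the summary is forwarded to the network core, cutting transport costs.

from statistics import mean

def send_to_core(payload):
    # Stand-in for whatever uplink the deployment actually uses.
    print("forwarding to core:", payload)

def summarise_at_edge(readings, alarm_threshold=90.0):
    """Collapse a batch of raw readings into one small record."""
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alarms": sum(1 for r in readings if r > alarm_threshold),
    }
    send_to_core(summary)        # a few bytes instead of the full stream
    return summary

# e.g. one minute of temperature samples from an oil-rig sensor
summarise_at_edge([71.2, 72.0, 95.3, 70.8, 69.9])
```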

“In order to fully take advantage of the Industrial IoT, customers need data-centre-grade computing power, both at the edge – where the action is – and in the cloud,” said Potter. “With our advanced technologies, customers are able to access data centre-level compute at every point in the Industrial IoT, delivering insight and control when and where needed.”

Applications for the edge-analytics proposition could be quite wide, ranging from production lines in Eastern Europe to oil rigs in the North Sea to smart energy grids in Copenhagen. It would appear the team are not only targeting industrial segments, where IoT could ensure faster and more accurate decision making in the manufacturing process for instance, but also those assets which do not have reliable or consistent connectivity.

How IoT relates to the cloud and PaaS: Pace and flexibility


By Alex Vilner, managing director, SaM Solutions

With all things tech switching to cloud-based platforms, the ‘as a service’ industry is exploding. The one that started it all, SaaS (software as a service), is now joined by IaaS and PaaS.

The commonality for these ‘as a service’ offerings is that they are all based in the cloud, solely managed by the provider and available to clients via a membership platform.

Connected devices are changing the landscape of technology. We don’t purchase anything – we subscribe to it. The pace of technology growth has led to a culture of obsolescence: barely had companies adopted and incorporated a new piece of software or infrastructure into their technology stack when a new version was released.

Membership-based or ‘as a service’ offerings do two things: first, they ensure that companies always have access to the latest features and services without the cost and headaches associated with incorporating new releases. Second, there is an economy-of-scale dynamic at work.

Developers of these services work on the offering every day. Building a feature or improving an existing one might take two or three dedicated developers working for weeks. The ROI for this investment, if the services are maintained in-house, simply isn’t there for most companies; however, service providers push this new feature to all of their customers meaning that the benefits are experienced across the board.
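As a purely hypothetical back-of-the-envelope illustration of that economy of scale (the figures below are invented for the sake of the arithmetic, not taken from the article): the cost of building one feature barely changes, but the number of customers it is amortised across does.

```python
# Invented figures for illustration only.
developers = 3
weeks = 4
weekly_cost = 2_000            # fully loaded cost per developer per week

feature_cost = developers * weeks * weekly_cost   # 24,000 to build once

in_house_beneficiaries = 1     # one company benefits if built in-house
saas_customers = 500           # same feature amortised across a provider's base

print(feature_cost / in_house_beneficiaries)   # 24000.0 per beneficiary in-house
print(feature_cost / saas_customers)           # 48.0 per beneficiary as a service
```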

IoT’s reliance on the cloud and the benefits of the ‘as a service’ model have made companies re-examine their available tools in search of solutions that are as flexible, scalable and economical as their own products.

PaaS operates in the area between SaaS and IaaS, giving full control of the application and the data to the developer. The infrastructure (middleware, operating systems, virtualisation, storage and networking) is unlikely to change during the development of an IoT application, which is why it makes sense to leave those aspects to be managed by someone else (the IaaS provider).

Why does PaaS fit with IoT?

Data, data, data: So much of the IoT is about data. Data collection. Data storage. Data analysis. Much of the testing of IoT applications requires different configurations of data, and most IoT applications are developed for multiple use case scenarios. PaaS allows developers complete control over collected data without the burden of managing storage systems.
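As a rough illustration of that division of responsibility, the Python sketch below models a hypothetical PaaS-managed datastore binding: the developer defines the records and queries for IoT telemetry, while the platform is assumed to handle the underlying storage, scaling and backups. PlatformStore and ingest are invented names, not any particular vendor’s API.

```python
# Illustrative only: a hypothetical PaaS storage binding for IoT telemetry.
# The developer owns the data model and queries; the platform owns the storage.

import json, time

class PlatformStore:
    """Stand-in for a PaaS-managed datastore binding."""
    def __init__(self):
        self._rows = []
    def insert(self, record):
        self._rows.append(record)
    def query(self, **filters):
        return [r for r in self._rows
                if all(r.get(k) == v for k, v in filters.items())]

store = PlatformStore()

def ingest(device_id, payload):
    # Developer-defined record shape; storage details are the platform's problem.
    store.insert({
        "device": device_id,
        "ts": time.time(),
        **json.loads(payload),
    })

ingest("sensor-17", '{"temp": 21.4, "humidity": 40}')
print(store.query(device="sensor-17"))
```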

Future flexibility: Data analysis requirements will change as the application sees wider adoption and developers learn more about how users are deploying the product in real life. Smart developers plan for this eventuality by building products that include flexible components. PaaS allows developers to customise data analysis both now and in the future.

Pace and competition: With predicted billions up for grabs, rest assured that competition in the IoT space is only going to continue to grow. Releasing products months and sometimes years ahead of pre-IoT project estimates requires development teams to reconsider which aspects of the stack need to be built in-house and which can be managed by someone else.

Workflows: For many adopters, the primary draw to IoT solutions is automation. Instead of relying on the inspector to manually bring up the inspection requirements for a component, a device scans a barcode on the part and ensures that all of the requirements and data recording areas are automatically displayed to the inspector when the component arrives. Developing new workflows and optimising existing ones is a driving force behind new releases of IoT applications. PaaS provides just the access needed to continue making these improvements.
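A minimal sketch of that inspection workflow, assuming a hypothetical parts catalogue and display routine rather than any specific IoT product, might look like this:

```python
# Hypothetical inspection-workflow handler; part numbers and fields are invented.

INSPECTION_SPECS = {
    "PN-4471": {
        "requirements": ["check weld seam", "verify torque 45 Nm"],
        "record_fields": ["seam_ok", "torque_measured"],
    },
}

def display_to_inspector(part_number, spec):
    print(f"Part {part_number}")
    for req in spec["requirements"]:
        print("  requirement:", req)
    print("  record:", ", ".join(spec["record_fields"]))

def on_barcode_scanned(barcode):
    # Triggered automatically when the component arrives at the station.
    spec = INSPECTION_SPECS.get(barcode)
    if spec is None:
        print(f"Unknown part {barcode}; flag for manual review")
        return
    display_to_inspector(barcode, spec)

on_barcode_scanned("PN-4471")
```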

Virtual everything: Not only have our tools gone to the cloud, our teams have too. Most development teams include members from various facility locations, both on- and off-site, as well as help from outsourcing firms. Teams are rarely co-located, making cloud-based tools essential to efficient team operations.

According to Fabrizio Biscotti, research director at Gartner, “the PaaS segment showed impressive growth, not just in the AIM (application infrastructure and middleware) market but across the entire enterprise software market” in 2015. This growth points to the increased demand for IoT-supporting development tools and the continued growth expected in this area.

UK Government passes spy bill with strong majority

The House of Commons has voted in favour of the Investigatory Powers Bill, which gives UK intelligence agencies greater power to examine browsing histories and hack phones, reports Telecoms.com.

The bill, which now passes through to the House of Lords, has been under scrutiny since last year, with the latest version being reviewed since March. The original version of the bill, known as the ‘Snooper’s Charter’ by critics, came up against strong opposition from a host of technology companies who have registered privacy concerns. The bill itself will require technology companies to collect and store data on customers, while also allowing intelligence agencies to remotely access smartphones and other devices.

“The Bill provides a clear and transparent basis for powers already in use by the security and intelligence services, but there need to be further safeguards,” said Harriet Harman, MP for Camberwell and Peckham and Chair of the Joint Committee on Human Rights. “Protection for MP communications from unjustified interference is vital, as it is for confidential communications between lawyers and clients, and for journalists’ sources. The Bill must provide tougher safeguards to ensure that the Government cannot abuse its powers to undermine Parliament’s ability to hold the Government to account.”

Although proposed by the Conservative party, the bill was strongly supported by the Labour party as well as the majority of the Commons, with opposition primarily coming from the Scottish National Party. Despite privacy and civil rights concerns from the SNP, the bill passed with a vote of 444 to 69. The vote in the House of Lords is expected to take place in the next couple of months, with the bill expected to pass into law in January 2017.

The bill was deemed a high priority for intelligence agencies within the UK, though it has been under scrutiny from the Joint Committee on Human Rights after concerns it could potentially violate privacy and civil rights. As part of the review, extended protection will also be granted to lawyers and journalists.

“The Joint Committee heard from 59 witnesses in 22 public panels,” said Victoria Atkins, MP for Louth and Horncastle, speaking on behalf of the Joint Committee on Human Rights and the Bill Committee. “We received 148 written submissions, amounting to 1,500 pages of evidence. We visited the Metropolitan police and GCHQ, and we made 87 recommendations, more than two thirds of which have been accepted by the Home Office.”

One of the initial concerns was a permanently open backdoor which could be accessed by intelligence agencies without oversight, and this has seemingly been addressed. Intelligence agencies will have to request access, which will be granted should it not be too complicated or expensive. The definition of complicated or expensive has not been given, but it does appear to end concerns of a government ‘all-access pass’. Whether this is enough of a concession for the technology companies remains to be seen.

Microsoft, HPE and Cisco take top spots for infrastructure vendors

Microsoft, HPE and Cisco have been named as three of the leaders in the cloud industry by Synergy Research as the firm wraps up the winners and losers for the first quarter.

While the cloud infrastructure market has been growing consistently at an average rate of 20% year-on-year, 2016 Q1 was estimated at 13%, though this was to be expected following peak sales during the latter stages of 2015. Microsoft led the way for cloud infrastructure software, whereas HPE led the private cloud hardware market segment, and Cisco led the public cloud hardware segment.

“With spend on cloud services growing by over 50% per year and spend on SaaS growing by over 30%, there is little surprise that cloud operator capex continues to drive strong growth in public cloud infrastructure,” said Jeremy Duke, Synergy Research Group’s Chief Analyst. “But on the enterprise data centre side too we continue to see a big swing towards spend on private cloud infrastructure as companies seek to benefit from more flexible and agile IT technology. The transition to cloud still has a long way to go.”

For the last eight quarters total spend on data centre infrastructure has been running at an average of $29 billion, with HPE controlling the largest share of cloud infrastructure hardware and software over the course of 2015. Cloud deployments or shipments of systems that are cloud enabled now account for well over half of the total data centre infrastructure market.


44% of consumers have issues with wearables functionality

Findings from Ericsson ConsumerLab claim consumer enthusiasm for wearables technology is still growing but vendors are not meeting price or functionality expectations, reports Telecoms.com.

The research focused on opinions from 5,000 smartphone users from Brazil, China, South Korea, the UK and the US, though it’s worth noting 50% of respondents were current owners of wearable technology, a much higher proportion than in the general public. While the statistics demonstrated there is still an appetite for wearable technologies outside of fitness applications, the price of entry could be a barrier, as could customer expectations on functionality generally exceeding what vendors are currently able to offer.

32% of respondents said they would be interested in or willing to buy a Panic/SOS button, and 25% said the same for an identity authentication device. Smartwatches were still of interest, as 28% said they would have an interest in purchasing such a device, though this statistic contradicts recent reports that the segment has been declining: Strategy Analytics forecast a 12% decline in Apple Watch sales this year after a strong launch. A third of non-users stated the cost of keeping digital devices connected is a key reason why they haven’t invested in wearable technology to date.

While the Strategy Analytics report could indicate a slight hiccup in the adoption of wearables, this is backed up to a degree by the Ericsson report, which states 10% of wearable users have abandoned the technology, mainly due to the capabilities on offer. A common cause of dissatisfaction is that customers feel tethered to their smartphone, as the wearable device does not have standalone features. This could also be tied to the overall value/price proposition of the devices, as they could be seen as products of convenience as opposed to smartphone replacements.

In terms of the reasons for abandoning wearables, over half of respondents said the devices did not meet expectations. 21% highlighted limited functionality and uses, 23% cited the fact the device was not standalone or didn’t have inbuilt connectivity, whereas 9% said inaccurate data and information. Despite the concerns over functionality, 83% of respondents said they expect wearables to have some form of standalone connectivity in the near future. Should this be the case, 43% believe wearables will ultimately replace smartphones.

“Although consumers show greatest interest in devices related to safety, we also see openness to wearable technology further away from today’s generation,” said Jasmeet Singh Sethi, Consumer Insight Expert, Ericsson ConsumerLab. “In five years’ time, walking around with an ingestible sensor, which tracks your body temperature and adjusts the thermostat setting automatically once you arrive home, may be a reality.” Other use cases included a smart water purifier, gesture communicator, virtual reality sports attire, emotion sensing tattoos and a wearable camera.

The survey does demonstrate long-term viability for wearable technology, though there would have to be increased functionality before it could be considered mainstream. It would appear standalone connectivity would be the bare minimum required, as the current offering seemingly does not deliver enough value to customers if they have to continue to carry a smartphone as well as the wearable device.

Consumer buying decisions still based on price – Nokia

Research from Nokia has highlighted consumer buying decisions for smartphones and post-paid contracts are still based on financial drivers as opposed to value add propositions, reports Telecoms.com.

With the worldwide number of smartphone users and the total number of mobile phone users estimated to exceed 2.6 billion and 5 billion respectively by 2019, the race is now on for operators to capture the biggest share of this lucrative market. Nokia’s research addressed a number of factors surrounding churn rate and customer acquisition, as well as wider trends, though concerns could be raised that financial drivers for purchasing decisions place operators in a similar arena to utility companies.

Efforts in recent years by the operators have been to shift the focus of the consumer away from price, and move purchasing decisions towards value and performance. T-Mobile US announced a further prong to its ‘Un-carrier’ strategy this week, as it will reward customers with stock, seemingly for nothing in return in the first instance, though additional shares can be acquired by referring new customers to the brand. There have been similar efforts from operators around the world, though the statistics do not suggest there has been a significant impact.

Comparing 2014 and 2016, cost and billing remained the factor most frequently cited as influencing attitudes on retention, though the proportion of respondents citing it dropped from 45% to 40%. In terms of the reasons for choosing a new operator, 45% stated this would be based on price, with value adds, mobile technology and choice of devices accounting for only 17%, 14% and 11% respectively. The quality of a network is also a concern, though the drivers behind choosing a new operator or staying with an existing one are still predominantly price driven.

While price is still the number one consideration for customers, the statistics do highlight that value added services are having more of an impact on customer retention than on acquisition. In terms of definitions, core operator offerings such as SMS, data and minutes were not included in the research; however, value added services increased the likelihood of a customer staying with an operator by 11%, while the perception of a network’s quality was up 55% and the number of customers using more than one gigabyte of data per month was up 15%.

While operators are generally perceived as trying to avoid competing for new customers solely on price, the research does seem to indicate this would be the most effective route. While retention can seemingly be influenced by value adds, a utility model may be difficult to avoid for customer acquisition.

“We can see the marketing battles to acquire mobile subscribers are fierce,” said Bhaskar Gorti, Applications & Analytics president at Nokia. “What we don’t see as well is the work operators do every day to retain customers. Our study shows how important that work is – and also how challenging it is as customers, attached to their phones, demand higher levels of service.”

In line with industry expectations, 4G usage is on the increase, with 38% of new subscribers over the last 12 months choosing 4G networks. The uptake is mainly witnessed in mature markets, with Japan and the US showing the highest levels of adoption, though survey respondents highlighted there are still barriers to adoption. For those not currently using 4G, a device which doesn’t support 4G or the price being too high were the main reasons given.

HPE edging out Cisco in cloud infrastructure space, argues research


Hewlett Packard Enterprise (HPE) and Cisco continue to battle it out for supremacy in the cloud infrastructure market with HPE having the slight advantage, according to the latest note from Synergy Research.

The note, which assesses the most recent Q1 data, sees Cisco just ahead of HPE and Dell in public cloud hardware, but HPE more dominant in the private cloud hardware space, capturing more than 20% of the overall market. In cloud software, Microsoft has more than 40% share and is streets ahead of nearest competitor VMware, but the relatively nascent size of that market leaves the Redmond giant in third place in the overall rankings.

Even though both HPE and Cisco gained market share in the previous quarter, HPE slightly widened its advantage, according to Synergy. Overall, the global cloud infrastructure market grew by 13% in Q1 – a drop-off from the usual 20% run rate which the analyst house describes as a ‘typically soft’ quarter following the usual Q4 peak.

Recent research on cloud infrastructure suggests that the move to an entirely public cloud is not quite within organisations’ grasp.

A study from VMTurbo found more than half of organisations did not have a multi-cloud strategy in place, while companies were in some cases reluctant to adopt a public cloud-first data strategy, with customer requests and HIPAA compliance standing in the way.

Jeremy Duke, Synergy Research Group founder and chief analyst, argues that with spend on cloud services and software as a service growing by 50% and 30% per year respectively, the cloud figures are not surprising – but on premise IT systems are going to be entrenched for some time.

“There is little surprise that cloud operator capex continues to drive strong growth in public cloud infrastructure,” said Duke. “But on the enterprise data centre side too, we continue to see a big swing towards spend on private cloud infrastructure as companies seek to benefit from more flexible and agile IT technology.

“The transition to cloud still has a long way to go,” he added.