UK retailer Boots deputizes in-store app to capitalize on mobility trends

UK retailer Boots has launched a new app, Sales Assist, designed to make it easier for customers to get hold of the products they need.

The app responds to the growing trend of customers using mobile devices to get better value for their pounds as they shop on the high street. By putting iPads in a number of shops throughout the UK, the app supports the retailer’s vision of using mobility to change the way customers shop.

“At Boots UK we’re investing in innovative new technology to further improve the retail experience for our customers, and mobility is at the forefront of this transformation,” said Robin Phillips, Director of Omnichannel and Development at Boots UK. “By developing Sales Assist, in collaboration with IBM and Apple, and launching it on the 3,700 iPads in our stores, we’re integrating our digital and in-store presence to deliver an even better shopping environment for customers.

“The unique tool allows our colleagues to quickly show product information, ratings and reviews, look up inventory online and make recommendations based on online analytics, all from the shop floor. It will help even our smallest stores feel like a flagship shop, with access to the entire Boots range at their fingertips.”

Boots is using Bluemix, IBM’s cloud platform, to link Sales Assist with the company’s applications and data. The app connects to the boots.com database, allowing shop assistants to locate items and use the power of analytics to drive recommendations and impulse buys. The team has not said how the app will evolve, though there is potential for artificial intelligence to be incorporated to drive additional sales in and out of the store.
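
Boots has not published Sales Assist’s integration details, but a minimal sketch of the pattern described above – a shop-floor client querying a product database and a recommendations service through a cloud API – might look like the following. The endpoint, paths and field names are all hypothetical, not Boots’ or IBM’s real API.

```python
import requests

API_BASE = "https://api.example-retailer.com/v1"  # hypothetical endpoint, not the real boots.com API


def check_stock(product_id: str, store_id: str) -> dict:
    """Look up online and in-store availability for a product (illustrative only)."""
    resp = requests.get(
        f"{API_BASE}/inventory/{product_id}",
        params={"store": store_id},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"in_store": 3, "online": true, "nearby_stores": [...]}


def recommend(product_id: str) -> list:
    """Fetch analytics-driven 'customers also bought' suggestions (illustrative only)."""
    resp = requests.get(f"{API_BASE}/recommendations/{product_id}", timeout=5)
    resp.raise_for_status()
    return resp.json().get("items", [])


if __name__ == "__main__":
    print(check_stock("12345", store_id="LDN-001"))
    print(recommend("12345"))
```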

What did we learn at Cloud & DevOps World?

The newly branded Cloud & DevOps World kicked off yesterday with one theme prominent throughout the various theatres: cloud is no longer a disruption, but what can be achieved through the cloud is still puzzling decision makers, reports Telecoms.com.

One word was heard more than any other: maturity. There appears to be a general consensus that cloud computing has matured as a concept, a process and a business model. Although finding the greatest value in the cloud is still a challenge, there is a general feeling that those in the IT world are becoming more adventurous and more willing to experiment.

Speaking in the Business Transformation theatre, Hotels.com CIO Thierry Bedos opened the conference with a look at future trends in cloud computing. Maturity was the main thread here too, as Bedos pointed out that AWS’ dominant position as market leader and innovator is starting to loosen. While it would generally be considered strange to call tech giants such as Google and Microsoft challenger brands, it is fair in the context of public cloud – but perhaps not for much longer, as the gap is narrowing. For Bedos, this competition is a clear indication of a maturing market.

Alongside Bedos, Oracle’s Neil Sholay gave us insight into the world of data analytics, machine learning and AI at Oracle Labs. Bill Gates famously said “Content is King”, and while this remains true, Sholay believes we can now go further and live by the rule “Corpus is King”. Content is still of value, though the technologies and business practices used to deliver it have dated the phrase. The value of content now lies in mastering its delivery through effective analytics to ensure automation, context and insight. A content campaign is only as good as the data you feed it to provide value to the consumer.

The Cyber & Cloud Security theatre told a slightly different story, but maturity was still a strong theme. ETSI & GSMA Security Working Group Chairperson Charles Brookson commented to us that while there is still a lot of work to do to ensure security, decision makers are maturing in the sense that they have accepted 100% security is unachievable; remaining as secure as possible for as long as possible is the new objective.

For a number of the delegates and speakers this is a new mindset which has been embraced, though there are still some technical drawbacks. Advances such as biometric security are set to become a possibility in the near future, and Birmingham University’s David Deighton showed his team had made solid progress in the area. Failure rates are still at 2%, which was generally regarded as too high, but this is down from 15% in a matter of months. The team would appear to be heading in the right direction, at a healthy pace.

Once again the concept of failure was addressed in the IoT & Data Analytics theatre, where conference Chairperson Emil Berthelsen (Machine Research) told us the important lesson of the day was to set the right expectations. Some projects will succeed and some will not, but there is no such thing as failure. The concept of IoT is now beginning to gain traction in the enterprise world, showing (once again) maturity, but for Berthelsen the importance of scalability, security and data in IoT solutions was most evident throughout the day.

Day 1 showed us one thing above all else: we’re making progress, but we’re not quite there yet.

Ransomware may be a big culprit for data loss – but it’s the wrong fall guy


With researchers seeing a 3,500% increase in the criminal use of net infrastructure to run ransomware campaigns, it’s not surprising that ransomware has been making big headlines.

The media laments the growing rings of cyber criminals that launch ransomware threats, but there’s another culprit that tends to slip under the radar: people like you and me. Sure, we’re not instigating the campaign – that’s on the hacker – but employees often let the bad actor through the front door, so to speak. Employees access an insecure web page, download infected software or click a phishing link in an email. In fact, of all the data breaches reported in the UK during Q1 2016, ICO data reveals that 62% were caused by human error.

Worse, ransomware and other incidents related to human error are putting businesses at a greater risk of data loss. In a Foursys survey of 400 UK-based IT managers, 11% of those that had reported security breaches caused by threats such as ransomware said they had experienced data loss as a result. According to research by the University of Portsmouth, fraud and human error are costing UK organisations £98.6 billion a year. Unfortunately, that number is likely even larger, as it doesn’t include instances that have gone undiscovered or unreported.

And while some might think that storing data in the cloud puts it out of reach of ransomware, they’re wrong. Ransomware has the ability to encrypt files on hardware and cloud services alike. And, of course, data in the cloud is always susceptible to human error.

If, despite your best efforts, an employee or vendor deletes your data, having current backups is the key to restoring the files without a severe impact on your business. If your systems are taken hostage by ransomware, data backups are the key to recovering access to your files without paying the ransom (which is never recommended, as it only encourages hackers).
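
As a minimal illustration of the point above – a sketch, not any particular vendor’s backup tooling – a scheduled job as simple as the following keeps dated archives, so an infection discovered today can be rolled back to yesterday’s copy. The paths are hypothetical, and in practice the archives should be shipped to storage the source machine cannot overwrite, otherwise ransomware can encrypt the backups too.

```python
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path("/data/critical")   # hypothetical directory to protect
DEST = Path("/mnt/backup")        # ideally offline or write-once storage


def make_dated_backup() -> Path:
    """Create a timestamped zip archive of SOURCE under DEST."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = shutil.make_archive(str(DEST / f"critical-{stamp}"), "zip", str(SOURCE))
    return Path(archive)


if __name__ == "__main__":
    print(f"Backup written to {make_dated_backup()}")
```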

This two-part series will discuss some of the common ways human error can lead to data loss or ransomware infections and address how your business can prepare for these threats.

Cloud provider risks

Under the EU’s General Data Protection Regulation (GDPR), all organisations handling personal data are responsible for ensuring that information is protected, and are accountable for breaches of it. This responsibility extends to third-party cloud providers, which is why vendor due diligence is critical.

Non-compliance can result in fines of as much as 4% of annual worldwide turnover or €20 million, whichever is greater. With such high stakes, it’s important to ensure vendors have proper policies and procedures in place to guarantee the availability and security of any data they process.
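
As a rough illustration of how that cap works, the maximum fine is simply the larger of the two values. The turnover figure below is invented for the example:

```python
def max_gdpr_fine(annual_worldwide_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine: 4% of turnover or EUR 20m, whichever is greater."""
    return max(0.04 * annual_worldwide_turnover_eur, 20_000_000)


# Hypothetical company with EUR 2bn annual worldwide turnover:
print(f"EUR {max_gdpr_fine(2_000_000_000):,.0f}")  # EUR 80,000,000 - the 4% figure dominates
```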

You might find that the vendor’s terms of service meet your needs, but be aware that terms of service can change without notice. That’s what happened to one man, a distinguished lecturer for a content network, who woke up one day to discover that his cloud vendor had deleted more than five years of archives for 15 retired machines. After lengthy back-and-forth discussions with the vendor’s tech support, he discovered that a change in the corporation’s retention policy – of which he’d been unaware – had allowed the backups to be deleted. They were eventually restored, but if he hadn’t been vigilant, he very well could have lost his backups permanently.

Shadow IT

Human error and ransomware alone are enough of a risk to put businesses on high alert, but shadow IT exacerbates the threat. Research from Cisco reveals that CIOs estimate their organisation has 51 public cloud applications in use, when the actual number is more like 730. What happens if employees upload restricted data to an unauthorised cloud application – such as Google Drive, Dropbox or Evernote – and that application experiences a breach, or the proper encryption is not used?

If your employees are uploading files to an unauthorised cloud or using software as a service (SaaS), that not only increases your security risk; it also increases your risk of data loss, as that data isn’t being backed up.

SaaS, in fact, is one of the most prevalent threats to data loss in the cloud. A recent study found that almost 80% of respondents had lost data in their organisations’ SaaS deployments. The top causes were accidental deletion (41%), migration errors (31%) and accidental overwrites (26%).

Lack of internal awareness of security best practices

One of the major culprits behind human error is sheer carelessness or ignorance of how data should be handled. In the ICO data mentioned above, the majority of incidents attributable to human error involved security gaffes such as posting, emailing or faxing data to the wrong recipient. Additionally, a disturbing number of employees are falling victim to phishing attempts. According to research from Verizon, people opened 30% of phishing messages – seven percentage points more than the previous year – and of those, 13% also opened the attachment, introducing the malware to the network.

Many instances of cloud data loss and ransomware infections can be classified into one of the above human error-related categories. But simply being aware of these threats isn’t enough.

This is the first of a two-part series: the second piece, next week, will examine mitigating cloud vendor risks, shadow IT and lack of cybersecurity awareness.

The four key storage drivers for SaaS success


Software as a service (SaaS) and cloud delivery models are disrupting the world of traditional enterprise software. As IT organisations embrace agile infrastructure strategies to keep pace with the rapid change of the businesses they support, SaaS-delivered applications have a clear advantage over traditional on-premises enterprise apps. As the forces of digital transformation reshape enterprise IT for the on-demand world, SaaS applications will be a driving force in how modern businesses function.

Choosing SaaS

SaaS is a more efficient way to deliver powerful functionality and positive experiences to business users, and it radically simplifies deployment. Users of SaaS solutions don’t have to install and maintain servers or databases, or scale data center infrastructure to add more storage or compute power – the SaaS vendor handles all of this for them. The SaaS model collapses the entire application delivery infrastructure into the software being delivered, which is why SaaS is so appealing to CIOs and line-of-business executives.

Some SaaS providers turn to public cloud vendors, while others choose to build their own cloud infrastructure, making application delivery part of their core competence. Either way, providers want to optimise that infrastructure to ensure seamless delivery on customer experience, real-time analytics and privacy safeguards. For SaaS vendors, solid, flexible infrastructure can be a competitive advantage in delivering mixed workloads.

The four keys to success

SaaS vendors’ infrastructure strategies differ from enterprise IT’s because infrastructure directly shapes the effectiveness of the service – the application “experience” they offer to end users. Vendors should consider these four SaaS business drivers when making application infrastructure decisions:

  • Customer retention is as important as new customer acquisition – Because SaaS users aren’t invested in infrastructure and perpetual license outlays, the “switching cost” is low. As a result, SaaS vendors must provide a consistently excellent customer experience in every area, including uptime, application performance and general account management.
  • SaaS vendors must grow to succeed – Vendors’ business models are based on low barriers to entry (e.g. free trials) and network effects that deliver more value as the user base grows. This means that SaaS companies need infrastructure that scales easily as they add customers, users, devices and data.
  • Deliver real-time analytics and decision support – SaaS solutions today do far more than process transactions; they need to handle real-time analytics. The era when online analytical processing (OLAP) was a batch function performed by specialists with niche data warehouse tools is giving way to SaaS solutions that embed personalisation, intelligence and other analytics as part of the user experience. This requires infrastructure that delivers consistently high performance under dynamic workloads.
  • Ability to scale cost-effectively – The cost of application delivery infrastructure is effectively the cost of goods sold (COGS) for a SaaS company. As a result, the ability to scale infrastructure cost-efficiently is vital to profitability.

These four key business drivers map back to the capabilities needed from cloud infrastructure. Scalability is table stakes – any SaaS vendor that chooses to build out its own infrastructure must ensure it can scale quickly to meet its application and performance needs, even as its workload profile evolves to include more rapid analytics and personalisation.

The infrastructure stack for any vendor building its own cloud has multiple layers – from virtualisation to security and disaster recovery capabilities – but at the end of the day, its foundation is storage. As a result, SaaS providers must consider these infrastructure requirements when it comes to choosing a storage technology approach.

The shift to all-flash storage?

To power the high-performance needs of the modern SaaS business, all-flash storage is the natural choice. Beyond choosing all-flash, however, SaaS providers also need agile storage solutions that can seamlessly manage growth in data and users with a cost model that supports their business. Furthermore, storage solutions must evolve with the constantly shifting requirements of the applications they support.

The biggest concern is a storage platform’s ability to handle SaaS solutions with real-time analytics. The platform needs an architecture that performs at the highest level for both transactional and analytics workloads; not every storage array can deliver consistently high performance under both. To deliver a consistently strong user experience in a cost-effective manner, it’s essential that the entire infrastructure stack – from the storage layer up – is tuned for real-time analysis as well as transaction processing.

Conclusion

While much has been made of the SaaS business model – subscription-based solutions that offload infrastructure concerns from users – what will set one SaaS solution ahead of its competitors is likely to be how well it delivers differentiated functionality such as advanced analytics. Most SaaS providers have adopted agile development practices and incorporated DevOps into their thinking to stay responsive to customer needs and maintain a competitive advantage. At the same time, an agile infrastructure strategy that incorporates flexible storage solutions is essential, so the infrastructure can evolve with the changing demands of the application and the market.

SaaS is already eating into the traditional software delivery model, and to succeed, vendors must focus on embedding real-time analytics to deliver a strong user experience. At the end of the day, this means SaaS infrastructure not only has to scale; it must also handle complex workloads in a way that supports the most competitive end-user experience. The winners in the age of SaaS will make strategic infrastructure choices that best support their fundamental offering and business model.

Part 2: Networking the Cloud for the IoT – Stressing the Cloud | @CloudExpo #IoT #Cloud

Karen Field, director of Penton Communications’ IoT Institute, postulated in her article “Start Small to Gain Big” that an oil drilling platform with 30,000 sensors would generate about 1 terabyte of data per day. She also stressed that only 1% of that data would likely be used. From a systems engineering point of view, this data flow is multiplied across the trillions of other IoT sensors in the cloud, introducing unprecedented data processing and data transport stress. Industries, and competing companies within those industries, will also be forced to weigh the economic impact of paying for this transport and processing.
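
To put those numbers in perspective, a quick back-of-the-envelope calculation using the figures quoted above shows what that single platform implies per sensor and on the wire:

```python
# Back-of-the-envelope numbers for the oil platform example quoted above.
sensors = 30_000
daily_bytes = 1e12        # ~1 TB/day for the whole platform
useful_fraction = 0.01    # only ~1% of the data is likely to be used

per_sensor_per_day = daily_bytes / sensors    # ~33 MB per sensor per day
sustained_bandwidth = daily_bytes / 86_400    # ~11.6 MB/s platform-wide
useful_bytes = daily_bytes * useful_fraction  # ~10 GB/day actually used

print(f"{per_sensor_per_day / 1e6:.1f} MB per sensor per day")
print(f"{sustained_bandwidth / 1e6:.1f} MB/s sustained uplink")
print(f"{useful_bytes / 1e9:.0f} GB/day of 'useful' data")
```

Filtering at the edge so that only the useful fraction ever crosses the network is one obvious response to the transport and processing costs the article raises.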


Surviving the Zombie Apocalypse | @CloudExpo #Cloud #Security

Security is one of the most controversial topics in the software industry. How do you measure security? Is your favorite software fundamentally insecure? Are Docker containers secure?
Dan Walsh, SELinux architect, wrote: “Some people make the mistake of thinking of containers as a better and faster way of running virtual machines. From a security point of view, containers are much weaker.” Meanwhile, James Bottomley, Linux Maintainer and former Parallels CTO, wrote: “There’s contentions all over the place that containers are not actually as secure as hypervisors. This is not really true. Parallels and Virtuozo, we’ve been running secure containers for at least 10 years.” To add to the mix, Theo de Raadt, OpenBSD project lead, wrote back in 2007: “You are absolutely deluded, if not stupid, if you think that a worldwide collection of software engineers who can’t write operating systems or applications without security holes, can then turn around and suddenly write virtualization layers without security holes.”


The rest of the world is catching up with AWS – Hotels.com CIO

Speaking at Cloud & DevOps World, Hotels.com CIO Thierry Bedos outlined some of the cloud industry’s growing trends, including the erosion of AWS’ dominant position, reports Telecoms.com.

The growth of the cloud industry has been well documented in recent months, as studies and surveys claiming adoption rates are accelerating dominate web searches. While it is still debatable whether cloud has penetrated the mainstream market, according to Bedos what is clear is that the industry is heading in that direction; there’s no turning around now.

“The world is becoming fluffier and fluffier,” said Bedos. “There are countless studies and surveys on the internet which show the cloud is becoming more popular and widely used, which is only good for the industry. AWS is still the number one player in the market, but the rest are starting to catch up now. This is one of the most interesting trends which we are seeing.”

As with the acceptance and adoption of any new technology, there are bound to be a number of underlying trends. For Bedos, one of the more interesting is the acceptance that there are credible alternatives to AWS.

While AWS is still considered the leader in the industry, controlling notably more market share than any other cloud provider, its lead is narrowing. Microsoft and Google have both been prominent over the last 18 months in bolstering their cloud capabilities, and this has not gone unnoticed by the industry. Although cloud adoption rates are increasing, AWS is getting a smaller and smaller slice of the pie as customers take alternatives into consideration.

This should not come as a major surprise, as the same trend has been witnessed in the growth of other technology sub-sectors. Back in the early 2000s, Netscape’s web browser was dominant in terms of usage share, but lost most of that share to Internet Explorer during the so-called first ‘Browser War’. Bedos highlighted that Netscape was first to market and enjoyed that position for some time, until the proposition became normalized and competition grew. AWS is now experiencing the same trend.

“I’m not saying AWS will disappear in the same way Netscape did, but we’re going to see other players chip away at their market share,” said Bedos. That said, the increased competition and drive to acquire new customers could see the balance of power shift towards the consumer.

On top of the increased competition, Bedos also commented on the USPs of the individual cloud providers. Buyers generally buy for a specific reason, and these USPs in providers’ offerings are starting to fuel the trend of multi-cloud environments in enterprise business. Why choose when you can have the best of multiple cloud worlds? For Bedos, this is driving the trend towards interoperability. Before too long, moving workloads and data sets between different cloud environments will be a simple task, as vendors appreciate that a lock-in situation will negatively impact their own business. Co-operation could be the new battleground.

AWS will endure: it continues to innovate and has the backing of one of the world’s most recognizable brands. However, increased competition, as well as buyers’ growing preference for multi-cloud propositions, will make for a more even playing field, with the bargaining power in these deals potentially shifting towards the consumer.

UK citizens trust EU countries with data more than the UK

With the countdown to the Brexit vote in its final days, research from Blue Coat has highlighted that British respondents would be more trusting if their data were stored in an EU country rather than in the UK.

The margin is slim: 40% of respondents believe the EU is a safer bet for data storage, whereas only 38% chose the UK. Germany was perceived as the most trustworthy state – perhaps unsurprising, as the country is generally viewed as having the most stringent data protection laws. France ranked second, with the UK third.

While the true impact of Brexit will only be known after the vote, the role of the UK in the technology world could be affected by the decision. The research showed a notable preference for storing data in countries which are part of the EU and under the influence of the European Commission’s General Data Protection Regulation. Looking across the Atlantic, the UK has more trust in the US than the rest of Europe does, though it is still very low: 13% of UK respondents said they would trust the US with their data, a figure that drops to 3% in France and Germany.

“The EU regulatory landscape is set to radically change with the introduction of the GDPR legislation, and this research highlights the level of distrust in countries outside the EU,” said Robert Arandjelovic, Director of Product Marketing EMEA at Blue Coat Systems. “Respondents prefer to keep their data within the EU, supporting new European data protection legislation.

“More concerning is the fact that almost half of respondents would trust any country to store their data, indicating too many employees simply don’t pay enough attention to where their work data is held. This presents a risk to enterprises, especially when employees treat where their data is hosted with so little interest.”

While the impact of the Brexit vote is entirely theoretical at the moment, leaving the union could spell difficult times for the UK, as respondents favour countries inside the EU. What is apparent from the statistics is that the US still has substantial work to do to counter the ill effects of the Safe Harbour agreement, which was struck down last October. The survey indicates the replacement policy, the EU-US Privacy Shield, has not yet won over EU citizens, as trust in the US remains low.

Dell sells software business for $2bn to fund EMC deal

Dell has announced Francisco Partners and Elliott Management have agreed to purchase its software business unit as the company moves towards deadline day for the EMC merger, reports Telecoms.com.

The deal, initially reported by Reuters, is said to include the Quest Software and SonicWALL assets for just over $2 billion. Both were acquired by Dell in recent years for a combined total of $3.6 billion, and while this could be seen as a big loss for the company, details of what the transaction will include and what will remain with Dell have not been confirmed.

The sale reflects two growing trends within the industry. Firstly, private equity firms have been making notable moves in recent weeks, possibly indicating that confidence in backing cloud companies has returned. Vista Equity Partners bought Marketo for $1.8 billion last month, then followed up with a $600 million deal for Ping Identity. Thoma Bravo also bought Qlik for $3 billion, and Providence Strategic Growth recently invested $130 million in LogicMonitor.

Secondly, Dell is starting to peel back layers of its business. For the most part, this shouldn’t come as a surprise: an acquisition the size of the one Dell is currently undertaking requires funding, and there is also likely to be a certain level of crossover between the two business units. It could be tempting to characterise the sale of Quest Software and SonicWALL, as well as that of Dell Services in March, as panic selling, though it can equally be seen as logical.

Dell’s buy-out of EMC, initially announced in October last year at $67 billion, was billed as one of the largest acquisitions in the history of the technology industry. At EMC World this year, the team took the chance to launch the new brand, Dell Technologies, and to outline the integration strategy for the two tech giants. Dell’s Chief Integration Officer Rory Read and Howard Elias, EMC’s COO of the Global Enterprise Services business unit, highlighted that while reductions in headcount and sales would be limited, they would not be entirely avoidable; two companies as large as Dell and EMC are naturally going to have crossover.

The sales to Francisco Partners and Elliott Management could be seen as a means to raise capital for the acquisition, this is hardly surprising as it was highly unlikely $67 billion was going to be found down the back of the sofa. The team have not commented on the specifics of the agreement to date, however one thing it does highlight is sales are a necessity to funding one of the largest deals in the history of the technology industry.