How the cloud super-providers see the changing landscape

Earlier this week, analyst firm Cloud Spectator published its latest report on price-performance in the cloud. In the main, its results mirrored previous studies: the Amazons, Microsofts and Googles of this world may not be the best option for some organisations compared with higher-performing, lower-priced specialists.

So what of the behemoths, and how are their strategies changing? An illuminating session at Cloud Expo Europe today, featuring IBM, OVH – which bucked the trend by ranking second in the Cloud Spectator report – and Rackspace, answered these questions and more.

For Rene Bostic, technical vice president of innovation and new technologies at IBM Cloud, the transformation in organisational awareness during the past two years has been marked. IBM started by helping clients and customers understand the landscape; now there is a greater sense of nuance. Take integration with Watson as an example: the AI is used by IBM's customer service clients to help identify the tone of a customer, irate or otherwise, and tailor the response accordingly.

"Our customers understand what cloud is, they understand the use cases," Bostic told the audience. "The focus now is on business innovation. How can you monetise the cloud environments that you have – how can you make sure your startup companies and other companies won't take your market share away fast?"

The realisation around open source was shared among all the panellists. Rackspace is a given – the company co-founded OpenStack, after all – while Russell Reeder, president and CEO of OVH US, set out his position emphatically. The key, Reeder argued, is avoiding vendor lock-in at all costs. Customers have been there before, and never want to go through it again. "As a customer, people should be really afraid of the cloud," he explained, "and choosing the wrong cloud provider and being locked in."

Reeder insisted that despite the maturation, we were still only at the beginning of what cloud can do. "We're at the most mature we've ever been," Reeder explained. Alex Hilton, CEO of the Cloud Industry Forum and moderator of the session, jokingly offered that the cloud was 'adolescent'. With maturation comes customer success – and innovative stories to go with it.

Bostic offered the example of an insurance provider integrating with IBM's cloud to help reduce the cost of claims. The solution was to integrate with The Weather Company – a firm IBM bought at the start of 2016, which certainly raised an eyebrow from this reporter at the time – and plug into its API. Based on real-time conditions, the system sends a notification to affected users: 'A hailstorm is coming. You may want to move your car inside.' The upshot is at least one fewer claim – and on the backend, users are none the wiser about which cloud environment they are using.
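The plumbing behind an alert like that is easy to sketch. Below is a minimal, hypothetical Python example – the endpoint, response fields and notification helper are illustrative assumptions, not The Weather Company's actual API:

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical endpoint -- The Weather Company's real API differs.
WEATHER_ALERTS_URL = "https://api.example-weather.test/v1/alerts"

def send_push_notification(message: str) -> None:
    # Stand-in for a real push service (e.g. the insurer's mobile app backend).
    print(f"NOTIFY: {message}")

def check_hail_and_notify(lat: float, lon: float, api_key: str) -> None:
    """Poll current conditions and warn affected users about incoming hail."""
    resp = requests.get(
        WEATHER_ALERTS_URL,
        params={"lat": lat, "lon": lon, "apiKey": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    for alert in resp.json().get("alerts", []):  # assumed response shape
        if alert.get("type") == "hail":
            send_push_notification(
                "A hailstorm is coming. You may want to move your car inside."
            )
```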

It's certainly a multi-cloud landscape out there; plenty of research confirms it, and it's all part of the growing-up process. Rackspace's recent strategy of providing managed services for the most popular public cloud providers certainly fits into this and should be well known – although the fact the company secured the plum advertising spot outside the show's entrance may suggest it's not quite well known enough yet.

"For us, it's not about saying 'how do you justify the support on top of AWS?' – we expect all of you to be in a multi-cloud world, so how do we broker that for you? How do we optimise that for you?" said Lee James, Rackspace chief technology officer.

The company's much-vaunted fanatical support goes to the extent that if a customer changes provider, Rackspace will swap them over at no charge. James offered a couple of examples of innovative customer success: first a furniture supplier, whose usage can be scaled both up and back, and second a food manufacturer that runs completely on AWS with Rackspace keeping it cost-optimised.

Ultimately, however, it is not all rosy in the garden. The dreaded skills gap, a common scourge according to much market research, refuses to go away. How do the big cloud providers feel about it? OVH is building tools so customers can run the migration process themselves if they are able, while Rackspace has opened a 'university' to train employees and IBM is focusing more on cloud-native apps.

Save the Children: How cloud helps in disaster zones


Rene Millman

21 Mar, 2018

The cloud can greatly benefit charities trying to help people around the world in humanitarian disasters, according to Save the Children’s head of IT.

For the charity’s teams deployed in disaster zones, time is of the essence, and they need to do whatever it takes to save children’s lives by delivering life-saving food, water, healthcare, protection and education, said Gerry Waterfield, head of global IT services at Save the Children International, speaking at Cloud Expo Europe in London today.

Using different cloud services, the charity can mobilise quickly and securely without having to deploy preconfigured devices with its line-of-business suite of applications, Waterfield said. Being able to deploy this so quickly can mean the difference between life and death.

“The work we do is in very difficult locations, so we have to think about connectivity, it is one of the biggest issues we face before we use the cloud,” he said. “The other issue is having power; if there is no power, there is no connectivity, hence no internet.”

The charity works in more than 120 countries around the world and helped 22 million children in 2016. Waterfield said that bandwidth is frequently at a premium and the charity is heavily reliant on costly satellite communications, so the use of lightweight web apps is important.

In order to get power, and thus connectivity, Save The Children has looked at using solar power because of the amount of sunlight available in a lot of areas where it works.

With so many refugees fleeing war across the Mediterranean Sea, having connectivity at sea means the charity can access real-time weather data from the cloud, as well as data on the numbers making the dangerous crossing, so that it is better placed to be in the right location to offer assistance.

To that end, Waterfield said that the charity has used Office 365, as it can be rolled out everywhere to any device. It has also used a cloud-based HR system from Oracle. Waterfield said this has been helpful in emergency situations where volunteers have to be assembled quickly and onboarded as well as in helping select the right people for the right roles on the ground.

Save the Children has also used Kobo Toolbox, which lets workers in emergency situations create ad-hoc reports, and Facebook’s Workplace as an enterprise social network, allowing staff to exchange information about situations and projects more quickly.

Going forward, Waterfield said that he would like to see the charity be able to use more technology in the field as this would help more people in crisis situations.

How to Set Up Your Mac to Run Multiple IE Versions Simultaneously

A blog reader asked me how I set everything up to run multiple browsers simultaneously. Here’s how I configured my Mac® to use five different versions of Internet Explorer at the same time. Step 1: Your Mac. To store all the VMs, you will need about 60–90 GB of free space on your Mac. A Mac […]


Salesforce acquires MuleSoft for $6.5 billion

Salesforce has announced the acquisition of application network platform provider MuleSoft for $6.5 billion, making it the largest acquisition in the company's history.

MuleSoft – which went public last year – offers a platform which connects SaaS and enterprise applications whether on-premise or in the cloud.

The company's role within Salesforce will be to 'power the new Salesforce Integration Cloud, which will enable all enterprises to surface any data – regardless of where it resides – to drive deep and intelligent customer experiences throughout a personalised 1:1 journey', in the words of the company.

MuleSoft's more than 1,200 customers include Coca-Cola, Barclays and Unilever. Naturally, the companies cited in the press materials are also key Salesforce clients: Unilever announced a tie-up with Salesforce and Accenture in July and is a long-term customer of Salesforce Marketing Cloud, while Coca-Cola has used Salesforce to build custom apps on the Salesforce1 platform and Barclays utilises Salesforce Communities.

"Together, Salesforce and MuleSoft will enable customers to connect all of the information throughout their enterprise across all public and private clouds and data sources – radically enhancing innovation," said Marc Benioff, Salesforce CEO. "I am thrilled to welcome MuleSoft to the Salesforce Ohana." Greg Schott, MuleSoft CEO, added: "Together, Salesforce and MuleSoft will accelerate our customers' digital transformations enabling them to unlock their data across any application or endpoint."

Salesforce's largest previous acquisition where monetary details were disclosed was ExactTarget, bought in 2013 for $2.5bn. Shares in MuleSoft rose 27.2% at the close.

Andrew Keys Joins @CloudEXPO NY Faculty | @ConsenSysAndrew #FinTech #Blockchain #Bitcoin #Ethereum

Andrew Keys is Co-Founder of ConsenSys Enterprise. He comes to ConsenSys Enterprise with capital markets, technology and entrepreneurial experience. Previously, he worked for UBS investment bank in equities analysis. Later, he was responsible for the creation and distribution of life settlement products to hedge funds and investment banks. After that, he co-founded a revenue cycle management company, where he learned about Bitcoin and eventually Ethereum. Andrew’s role at ConsenSys Enterprise spans strategy and enterprise business development. Andrew graduated from Loyola University in Maryland and the University of Auckland with degrees in economics and international finance.


How multi-cloud is forcing organisations to take a more sophisticated cloud approach

Organisations are taking a more sophisticated approach to cloud vendor selection and management with multi-cloud at the heart of this change, according to industry research firm Cloud Spectator.

The company has released its latest price-performance analysis for public cloud infrastructure as a service (IaaS) providers in North America, and found again that behemoths Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform struggle in the rankings against smaller players.

Hosting provider 1and1 topped the list with a benchmark score of 100, with French provider OVH (75) and CenturyLink (74) picking up the silver and bronze medals respectively. 1and1 came first thanks to its especially strong performance in VMs – on which the figures are calculated, alongside block storage performance – and the cheapest pricing of the firms studied.

In contrast, Google achieved an overall score of only 37, though that still put it ahead of AWS (31) and Azure (30). DimensionData, with a score of 20, finished bottom of the pile.

The report sets its stall out straight away with its hypothesis. “A lack of transparency in the public cloud IaaS marketplace for cloud services performance often leads to misinformation or false assumptions,” the report notes. “Users and potential users may be led to view cloud computing as a commodity, differentiated mostly by variety of services. In reality, cloud performance is impacted by a variety of factors from provider to provider, involving everything from the physical hardware to the cost of the virtualised resources.

“By evaluating cloud services based on performance rather than solely price or VM configurations, users are able to maximise value in the cloud.”

This is by no means the first study from Cloud Spectator which comes to this conclusion. As far back as January 2016, this publication reported the ascent of 1and1 as the best value IaaS provider as a ‘surprising’ finding. Today, however, the news comes as less of a surprise – and led by multi-cloud implementations, organisations are becoming savvier in terms of cloudy value for money.

“Various research inputs show that a multi-cloud approach is the preferred strategy,” Ken Balazs, Cloud Spectator CEO, tells CloudTech. “This approach helps organisations create competition, avoid vendor lock-in, provide service alternatives for applications, and ensure functionally equivalent services for pricing optimisation.”

Balazs added that the majority of clients Cloud Spectator works with have multi-cloud and hybrid cloud initiatives. “We almost always see AWS, Microsoft and Google under consideration, but I don’t think that cloud is a one size fits all proposition,” says Balazs. “There are a number of excellent providers, so vendor selection needs to start with identifying the client’s needs, aligning them to the services and offerings of providers, and matching these to budgets.”

You can read the full report here (registration required).

Oracle cloud revenue continues strong growth with more autonomous services promised

It’s the same old story for Oracle as its Q318 financial results were announced: strong cloud revenues, but what about the rest of the portfolio?

Oracle announced total revenues of $9.8 billion (£7bn) over the past three months, with total cloud revenues contributing 16% of the overall figure at $1.6bn. Cloud revenues have gone up 32% year over year; cloud software as a service, the much larger bucket, saw revenues of $1.15bn at a yearly increase of 33%, while cloud platform as a service and infrastructure as a service went up 28% year on year to $415m.

Total on-premise software revenues went up 4% to $6.4bn, while new software licenses, hardware revenues and services revenues all saw a small decline.

The majority of the company’s recent news came at CloudWorld in New York back in February. This included the launch of 12 new data centre regions across Asia, Europe and the Americas, as well as updates to Oracle’s Internet of Things Cloud and to enterprise service level agreements.

The big kahuna, however, was the set of new features for Oracle’s autonomous database cloud, the company’s big bet. Making all Oracle Cloud Platform services ‘self-driving, self-securing and self-repairing’, the company explained, would involve features such as automated code generation, self-learning chatbots, and security remediation for application development.

Speaking to analysts after the announcement was made, CTO Larry Ellison promised more of the same. “Over the next few months, we expect to deliver autonomous analytics, autonomous mobility, autonomous application development and autonomous integration services,” he said, as transcribed by Seeking Alpha.

“Oracle’s new suite of autonomous PaaS services delivers an unprecedented level of automation and cost savings to our customers,” Ellison added. “Our highly automated suite of autonomous PaaS services reduces cost by reducing human labour and improves reliability and security by reducing human error.

“No other cloud provider has anything like it.”

You can read the full earnings report here.

Security is the biggest driver and obstacle in hybrid cloud migrations


Joe Curtis

20 Mar, 2018

Just 16% of enterprises use a single cloud, with two-thirds having a strategy in place for a hybrid approach, according to a new report.

Companies in the early stages of cloud adoption are likely to be using one cloud while they assess the operational challenges of migrating, or because they’re only pushing a few workloads into the cloud while keeping sensitive data on-premise, an investigation by 451 Research has revealed.

But the vast majority of firms – 84% – are using multiple clouds for improved speed and agility, the analyst house’s survey of 1,500 CIOs and IT managers at large enterprises, conducted in association with NTT Com and Dell EMC, found.

“Over 80% of the respondents to this study currently use multiple cloud environments, with varying amounts of integration, migration and interaction between them,” said Liam Eagle, 451’s research manager for cloud, hosting and managed services.

“Perhaps most significant is that approximately a quarter of companies already use some form of hybrid cloud – using the definition of seamless delivery of a single business function across multiple environments.”

For those firms looking at hybrid cloud, security is the biggest driver, but 451 warned that traditional measures like firewalls and access controls will need to be re-tooled for hybrid environments, which require integrated tools designed for the cloud.

However, security is also the biggest barrier to adopting hybrid cloud, giving IT security teams a headache in having to track and monitor different workloads in different states, and protect them in transit and at rest. With multiple environments – both on-premise and cloud – access control is also more difficult, 451 said.

Innovation is a lesser driver pushing companies down a hybrid route, and the report said that hybrid cloud only indirectly leads to this “as part of a wider business strategy”.

“Innovation is not inherently linked to operational efficiency per se, unless it involves for example energy savings, the realization of new products or services or significant transformation,” the report read.

Another minor reason to move to hybrid was to avoid vendor lock-in with a single cloud, with large enterprises used to dealing with a wide variety of suppliers looking to replicate this approach in the cloud.

Other barriers include the operational complexity of managing different environments so that cloud acts as a seamless extension of on-premise infrastructure, and the difficulty of migrating workloads such as applications and databases to the cloud.

Where these “cannot be migrated, there is inherent cost in re-development and delay in implementation”, 451 warned.

Your data backup could be a disaster waiting to happen


Nik Rawlinson

22 Mar, 2018

Does backup get your back up? It shouldn’t. A robust backup routine ought to be fuss-free and transparent – because if it isn’t, you’re far less likely to keep your archives updated. That means that when things go wrong and you need to recover a lost file, the crucial documents you’ve been working on may not have been backed up.

Even if you’re trying to do everything right, your chosen backup method might not be as comprehensive or bulletproof as you thought. And that should be a big concern: small businesses can’t afford to have things go wrong when it comes to backup, and home users stand to lose irreplaceable documents, photos and videos if their backup system lets them down.

Sync versus backup

Let’s clear one thing up right away: synchronisation isn’t backup. Cloud-syncing services are an easy, effective way to keep vital files updated across several machines. But if you’re relying on a service like this to save your skin in the case of an IT emergency, you’re running a serious risk.

Take Dropbox as an example. Not only does Dropbox duplicate your files onto every computer you own, it also keeps its own set of backups – so you can roll back to an earlier version of a file, or bring deleted items back from oblivion. This feature can be a real life-saver: to recover a deleted file, you can just log in through the browser, click Files, then click Deleted files in the sidebar. Find the file you want to resurrect and click restore.
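The same version history is exposed programmatically through Dropbox’s official Python SDK, so recovery can be scripted. Here’s a minimal sketch, assuming you’ve created a Dropbox app and generated an access token (the file path is illustrative):

```python
import dropbox  # official SDK: pip install dropbox

dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")  # token from the Dropbox App Console

PATH = "/reports/q1-figures.xlsx"  # illustrative path

# List the stored revisions of the file -- Dropbox keeps revisions
# even for files that have since been deleted.
revisions = dbx.files_list_revisions(PATH, limit=10)
for entry in revisions.entries:
    print(entry.rev, entry.server_modified)

# Restore the file to its most recent stored revision.
dbx.files_restore(PATH, revisions.entries[0].rev)
```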

Dropbox keeps copies of deleted files for 30 days as standard and 120 days in the Professional version

The catch is that changes and deleted files are only stored for 30 days, after which they’re purged. So while Dropbox can rescue you from short-term problems, it’s no use when you need to restore a document that was changed or deleted a few months ago. You can extend the window to 120 days by upgrading to a Dropbox Professional account, but it’s expensive: it costs £199 annually, or £19.99 per month.

It also still doesn’t count as a proper backup solution. A dedicated backup service should allow you to recover files that were deleted years ago, or step back through a complete history of changes to a document, from its original creation to the present day. Not only is this essential for data security, it also provides a helpful audit trail so you can track the development of your projects. Some backup services even offer an authentication service that can be used to prove that a certain file was created or edited on a certain date.

Although we’ve picked on Dropbox here, it’s by no means an outlier. Similar issues apply to Google Drive, iCloud, OneDrive and so forth. Syncing services should be used for just that – syncing – and backup left to tools designed with that task in mind.

The 3-2-1 strategy

When it comes to backup, the standard advice is that you should keep three copies of anything that matters, in two different formats, with at least one of them off-site – an approach known as the 3-2-1 strategy. The last point is particularly important: no matter how diligently you back up your system, if you store your media right next to your PC then it will be equally susceptible to fire, flood and theft – another reason why your backups might not be as safe as you’d hoped.

The good news is that offsite backup is easy: there are plenty of cloud-based backup providers who will, for a modest subscription, handle everything for you. However, this is normally on a “best-effort” basis; for safety, speed and convenience it’s a good idea to keep local backups as well.
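To make the arithmetic concrete, here’s a minimal Python sketch of the replication step – the source and destination paths are assumptions (a second internal drive, plus a mounted NAS share standing in for the off-site copy that a cloud backup service would normally handle):

```python
import shutil
from pathlib import Path

SOURCE = Path.home() / "Documents"
DESTINATIONS = [
    Path("D:/Backups/Documents"),     # copy 2: second physical drive (different medium)
    Path("//nas/backups/Documents"),  # copy 3: network share standing in for off-site
]

def replicate(source: Path, destinations: list[Path]) -> None:
    """Original + two copies = the three copies the 3-2-1 rule asks for."""
    for dest in destinations:
        # dirs_exist_ok needs Python 3.8+; re-running simply refreshes the copies.
        shutil.copytree(source, dest, dirs_exist_ok=True)

replicate(SOURCE, DESTINATIONS)
```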

At least one of your backup destinations should be off-site

Ideally, you want your local backups to be updated in real time, so that every time you update a file the backup gets updated too. You can get close to this using Windows’ File History feature, as we discuss below (or Time Machine on macOS), with a NAS or a USB drive as your destination.

If you’re serious about backups then ideally you should also keep a second regularly updated set, to provide an extra layer of robustness against glitches and disasters. Ideally, this would be on a different medium to your primary backups: using a pair of hard drives is much safer than using two folders on one drive.

This is another place where it’s tempting to rely on cloud services, but here’s a cautionary tale: Apify founder Jan Čurn lost 8,000 photos after he uploaded them to Dropbox, then tried to remove them from his local hard disk, to free up space. In theory, he should have been fine. He used Dropbox’s Smart Sync feature (only available on Professional accounts), which is supposed to store your files in the cloud, and download them on demand.

However, writing on Medium, Čurn recalled that the Dropbox client crashed during the initial sync operation, so he unsynced his photo folders by hand. “Everything worked well, the directories disappeared from the local hard drive, but they were still available on Dropbox’s website. All good,” he wrote.

But all wasn’t good. Two months later, Čurn discovered that the photo folders were empty on the server, too. “[It] seems that the Dropbox client first deletes files locally before it informs the server about the new selective sync settings,” he noted. “Consequently, if the client crashes or is killed before the server is contacted, the files remain deleted without any trace. After the client restarts again, it only sees there are some files missing and syncs this new state with the server.”

Dropbox’s engineering team managed to recover 1,463 of Čurn’s files, but the rest were lost for good. It’s a reminder of another important principle: a backup is a copy. If you only have one copy of something, it’s not a backup.

Scheduling your backup

If your backup routine relies on you remembering to update your archives then it’s liable to fail; dedicated backup tools either run continuously, or update your backups at regular intervals. Most backup tools take an incremental approach, so only new and updated files are stored, which saves time, and keeps storage demands down. It can also save you money, by postponing the day when you need to invest in a larger repository for your backups.
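The incremental idea itself fits in a few lines of Python – compare modification times and copy only what’s new or changed. A minimal sketch with illustrative paths (a real tool would also keep old versions rather than overwrite them):

```python
import shutil
from pathlib import Path

def incremental_backup(source: Path, dest: Path) -> None:
    """Copy only files that are missing from, or newer than, the backup."""
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        target = dest / src_file.relative_to(source)
        if not target.exists() or src_file.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src_file, target)  # copy2 preserves timestamps

# Run hourly via Task Scheduler (Windows) or cron to keep the backup fresh.
incremental_backup(Path.home() / "Documents", Path("D:/Backups/Documents"))
```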

Set the smallest practical interval for each incremental backup. Hourly is by no means too often: ask yourself whether you could afford to lose a morning’s work if a lunchtime power cut corrupted your drive and wiped out several hours of productivity. However, if you’re working with a capped broadband connection, it makes sense to limit your cloud backups to run during unmetered hours (typically overnight), so long as you also have local backups running throughout the day.

Don’t rely exclusively on incremental backups, though. Taking periodic full backups allows you to quickly and easily restore your complete system to a recent state; mixing incremental and full backups is just as important as storing them in several locations.

Using Windows’ built-in backup tools

Windows’ built-in backup tools can archive your data on either an external drive or a NAS location

Windows’ integrated tools make it very easy to maintain local backups. Start by enabling the File History tool, which uses a connected drive or NAS as a repository for key files, including your Libraries, Desktop, contacts and favourites. To find it, open Settings’ Update & Security pane and click Backup in the sidebar (or just search Cortana for Backup). Click "+" beside 'Add a drive' and select a connected storage device. This search only finds USB drives; if you want to use a NAS, wait for the search to come up empty, then click Show all network connections and select the volume you want to use.

You can manually specify what folders Windows backs up, how often it backs them up and how long it keeps them

It may look like nothing has happened, but click out and back into Backup and you’ll see that the Add drive button has been replaced by a switch, toggled to On to activate the backup. Click More options below this to specify what’s included in the backup set, how frequently you want it to back up (between every 10 minutes, and daily) and how long the backup set should be kept. You can also invoke an immediate backup.

With this done, Windows will start quietly and continuously backing up your modified files. If you need to recover a file or folder, simply navigate to it in Explorer and click the History button in the ribbon to view and restore old versions and deleted files.

Back up your cloud files

As we’ve mentioned above, entrusting your files to a cloud-syncing service doesn’t guarantee their safety – so make sure your local sync folders are included in your backup sets, so that files stored on services like Dropbox, OneDrive and iCloud are backed up automatically. Simply keep their client apps running the whole time your PC is active to keep the copies updated.

With Google Drive, things are a bit more complicated. Google prefers that you work through the browser, and the documents it stores on your local machine are only web links that launch each file in a web app. This means that the files on the server are your only copy – which is, of course, a dangerous situation to be in.

The solution is a tool called InSync, which downloads the files themselves, not just the links – including files others have shared with you – and converts them to Microsoft Office or OpenDocument formats so you can open them locally. The synchronisation and translation works both ways, too, so any edits you make on your PC will be sent back to the server, effectively giving Google Drive the same offline features as Office 365 enjoys through its association with the offline Office apps. It’s not free, but a lifetime licence can be had for a very reasonable $30.

Back up your website

If you keep a blog, or use a CMS to manage your website, it’s important to think about backing that up too. Even if you’re using managed hosting or a shared server, it’s asking for trouble to keep all your data in one place: hosts can – and do – go bankrupt, disappear or suffer DDoS and malware attacks.

Automattic’s VaultPress is a comprehensive backup tool for WordPress blogs, which backs up not only your database, but your themes, settings, system files and uploads, too. It starts at $39 per year for daily backups with a 30-day archive, uptime monitoring, and protection against brute force attacks, comment and pingback spam.

If you don’t need something quite so heavyweight, check out the free BackWPup WordPress extension, which can back up your site to Dropbox, S3 or an FTP server. To install it, hover over Plugins in your WordPress Dashboard, click Add New, and type BackWPup into the search box at the top of the following screen.

With the right tool, such as VaultPress, you can even back up a WordPress installation

Similar backup tools are available for other CMS platforms, and many hosting control panels feature backup tools for flat-file or non-managed sites. Parallels Plesk lets you schedule both incremental and full backups of your data and configuration; by default the destination is a folder on the same server, which isn’t ideal, but you can send it to a separate FTP server and password-protect the resulting archive.
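If your host only offers shell access, the same job can be scripted by hand. Here’s a minimal Python sketch, assuming the standard mysqldump and tar utilities are available, and using placeholder credentials and paths:

```python
import subprocess
from datetime import date

stamp = date.today().isoformat()
db_dump = f"db-{stamp}.sql"

# Dump the site's database (user, password and database name are placeholders).
with open(db_dump, "w") as out:
    subprocess.run(
        ["mysqldump", "-u", "wp_user", "-pPASSWORD", "wordpress_db"],
        stdout=out, check=True,
    )

# Bundle the dump with the site files into one dated archive.
subprocess.run(
    ["tar", "-czf", f"site-{stamp}.tar.gz", "/var/www/html/wp-content", db_dump],
    check=True,
)

# Ship the archive off the server (scp, rclone or similar) to keep a copy off-site.
```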

Backup isn’t exactly an exciting topic, but we’ve seen how easy it can be using Windows’ built-in tools, and user-friendly software like O&O DiskImage Professional and Paragon Backup & Recovery. Throw in a dedicated off-site backup service like Backblaze or Carbonite and you’ve easily satisfied the requirements of a dependable 3-2-1 backup strategy.

There’s just one more thing to say: once you’ve set up your system, make sure you thoroughly test your ability to restore files, before you need to rely on it for real. A subtle configuration error could mean you’re not backing up all the files you thought you were, or you could discover that your connection to the cloud storage server is too slow to bring you back online in an acceptable time frame. If you do find any issues, you’ll be glad you ironed them out while the going is good.

Image: Shutterstock

Revit for Mac with Parallels Desktop

Need to run AutoDesk Revit but have a Mac® computer? Architects, structural engineers, designers, and contractors alike utilize Revit, a powerful computer-aided design (CAD) software for building information modeling. CAD software enables users to visualize design, create photorealistic drawings, and even future-proof models for environmental factors. According to AutoDesk’s knowledge network, the system requirements to […]
