Demand for Kubernetes skills soars eight-fold in two years


Clare Hopping

22 Jun, 2018

More businesses are seeking out qualified Kubernetes developers and engineers as they turn to the tech for building their DevOps environments, a report by security software company CyberArk has found.

The company’s analysis of IT Jobs Watch figures revealed that demand for Kubernetes skills has grown by 752% over the last two years, making it one of the fastest-growing IT skills – climbing 729 places into the top 250 most in-demand roles in IT.

“Kubernetes has become a massive money word, and these figures show that DevOps teams are seeking more skills to help them manage and deploy applications at scale,” said Josh Kirkwood, DevOps Security Lead at CyberArk.

“There is a very real danger that the rush to achieve IT and business advantages will outpace awareness of the security risks. If privileged accounts in Kubernetes are left unmanaged, and attackers get inside the control plane, they could gain control of an organisation’s entire IT infrastructure,” Kirkwood said.

However, he warned that if businesses rush to onboard inexperienced Kubernetes staff, they risk opening up their organisation to attack. This is demonstrated in another CyberArk report, which revealed that many of the DevOps professionals being employed by organisations have security knowledge gaps – particularly around privileged accounts, secrets and container environments.

“Many organisations simply task the same DevOps hires – often with no security experience – to protect these new Kubernetes environments, in addition to the numerous other responsibilities they have to deliver,” Kirkwood added. “That’s no longer sufficient, and security teams need to get more closely involved to support the platform.”

Kirkwood advised businesses to take advantage of cross-team collaboration to recruit the people needed to fill these roles, while securing their existing infrastructure well enough to create a safe, secure and effective DevOps environment.
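
To make the secrets problem Kirkwood describes more tangible, a security team could start by auditing which roles in a cluster are able to read Kubernetes Secrets. The sketch below is a minimal illustration only, assuming the official kubernetes Python client and a configured kubeconfig; it is not drawn from the CyberArk report.

```python
# Illustrative sketch (not from the CyberArk report): list ClusterRoles that can
# read Kubernetes Secrets, a quick way to spot over-privileged access.
# Assumes the official `kubernetes` Python client and a working kubeconfig.
from kubernetes import client, config


def roles_that_can_read_secrets():
    config.load_kube_config()  # use local kubeconfig credentials
    rbac = client.RbacAuthorizationV1Api()
    risky = []
    for role in rbac.list_cluster_role().items:
        for rule in role.rules or []:
            resources = rule.resources or []
            verbs = rule.verbs or []
            # '*' wildcards grant everything, including access to secrets
            if ("secrets" in resources or "*" in resources) and any(
                v in verbs for v in ("get", "list", "watch", "*")
            ):
                risky.append(role.metadata.name)
                break
    return risky


if __name__ == "__main__":
    for name in roles_that_can_read_secrets():
        print(name)
```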

Try macOS Mojave with Parallels Desktop for Mac

by Guest Blog Author Alex Sursiakov, Program Manager at Parallels

On June 4, at the WWDC 2018 keynote, Apple® announced major updates to all of its software platforms. One of them is macOS® Mojave, the new version of the operating system for your Mac®. macOS Mojave will be available to Mac users this fall. But what […]


Questions you still need to ask SAP about indirect access


Joe Curtis

18 Jun, 2018

SAP’s licensing has struggled to keep up with how customers have used the vendor’s technology in recent years.

Its per-user pricing model doesn’t account for the world of IoT, in which devices, applications and bots, not human workers, access the German software maker’s ERP systems, and do so much more often than staff would.

But customers had previously felt comfortable setting up such access: SAP personnel were aware of these connections and raised no issues, and no customers fell foul of licensing audits, according to the SAP User Group Executive Network (SUGEN), which represents customers.

However, after SAP sued two high-profile customers, brewer Anheuser-Busch InBev and British firm Diageo, over indirect access, seeking multi-million-dollar damages from each, customers understandably grew worried.

Their cases typified indirect access: Diageo enabled access to SAP’s ERP system via third-party software, Salesforce. The accusations against Anheuser are less clear, but SAP alleged that it accessed its systems directly and indirectly without appropriate licenses. Anheuser settled out of court with SAP in March.

SAP moved to address customer fears over indirect licensing by introducing a new policy in April to cover third-party access to its ERP and S/4 HANA systems (on-premise or in the cloud), as well as taking steps to ensure the threat of audits isn’t used in sales negotiations.

The new pricing model for indirect access to SAP’s ERP applications, and its S/4 HANA suite of ERP tools, aims to bring SAP’s licensing into step with modern uses of technology – essentially, machine-to-machine interactions replacing a lot of human access to ERP systems.

Having introduced other new licensing models in the preceding months, in April SAP detailed a new licensing policy to account for IoT use.

Instead of following its per-user pricing model, this one differentiates between human access via SAP software and device or bot access to core ERP systems.

This new model charges for access on a per-transaction level (SAP calls it a ‘document’), where that transaction might be a POS transaction, an approved invoice or something else that requires accessing SAP’s ERP.
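
As a purely illustrative sketch of how the counting changes (the prices and volumes below are hypothetical placeholders, not SAP’s actual rates), per-user licensing scales with headcount, while per-document licensing scales with the number of transactions reaching the ERP system:

```python
# Purely illustrative: how per-user and per-document licensing scale differently.
# All prices and volumes are hypothetical placeholders, not SAP's actual rates.


def per_user_cost(named_users: int, price_per_user: float) -> float:
    """Traditional model: cost follows headcount."""
    return named_users * price_per_user


def per_document_cost(documents_per_year: int, price_per_document: float) -> float:
    """Indirect-access model: cost follows the 'documents' processed."""
    return documents_per_year * price_per_document


if __name__ == "__main__":
    # Hypothetical scenario: few human users, heavy machine-to-machine traffic.
    print(per_user_cost(200, price_per_user=1_000.0))             # 200000.0
    print(per_document_cost(1_500_000, price_per_document=0.25))  # 375000.0
```

Which model works out cheaper for a real workload depends entirely on the rates quoted and the volumes involved, which is precisely the open question user groups raise below.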

So, what’s changed?

SAP’s per-user licensing model remains unchanged, and organisations can also use SAP’s other two models, procure-to-pay and order-to-cash – both order-based pricing models – choosing between them to best suit how they use SAP’s applications.

This new per-document pricing model addresses the biggest fear among SAP’s customers, though: indirect access.

It will price access based on the number of transactions or ‘documents’ (an item like an invoice that needs to be run through the ERP system) accessing S/4 HANA or SAP’s ERP suite.

You can decide whether your usage suits the document-based pricing model

Customers can choose which model suits their needs best, and can move to the document-based pricing model or remain on their existing model.

SAP has promised “conversion offerings” for those considering a switch. However, SUGEN warns that it’s not clear whether users will save money under the new model.

SAP promises to separate software audits from sales negotiations

With customers scared that SAP will use the threat of software audits in sales negotiations in light of SAP’s lawsuits against Anheuser and Diageo, the German tech firm has promised to ensure the two remain separate.

As SAP put it, somewhat mildly, audits coinciding with talks around new software purchases “can sometimes cause frustration” for customers.

It’s promised to change this to ensure audits aren’t co-opted as a nasty sales tactic, and will introduce self-service auditing features for customers to check how their usage measures up to their licensing agreements.

So, what questions should you still have about SAP’s new licensing direction?

Will you save money under the document-based pricing model?

The biggest question is whether you’ll actually save money by switching to the new licensing model. SUGEN believes it’s too early to tell.

“It is difficult to know if existing customers will pay more under the new model, as the measurement and auditing tools required aren’t currently available,” SUGEN core leadership team member Philip Adams says.

“It won’t be until Q1 2019 that customers will be able to assess any potential cost impact. However, we have urged SAP to publicly promise that customers would be able to adopt the new model without incurring further costs if the business value or scope of their usage has not changed.”

Will the new per-document model be priced reasonably?

The lack of certainty above is partly because the catch-all term ‘document’ will mean different things in different industries – it could be an invoice, a retail transaction, an oil production contract, or anything else. Clearly some will be easier to process than others, or require fewer touchpoints with an ERP system.

“Generally I think it’s a good move by SAP but there are many details that are still unclear,” says Duncan Jones, principal analyst with Forrester. “For instance, will the price for the new per order be reasonable and will SAP sales teams discount it appropriately?

“My general advice is to prepare a solid negotiation strategy to embrace the change but get sufficient safeguards and compensation in return.”

Will organisations using non-SAP software pay more to integrate these apps into their ERP platform?

This is something else that’s too early to answer, and may play into your decision as to whether you want to change your pricing model at all.

“What is clear is that if you move to the new model, ‘indirect’ transactions from non-SAP systems to SAP systems would be counted and charged for in the new way,” explains SUGEN’s Adams, “hence customers with existing contracts need to look at and understand whether they are licensed for these types of transactions under their current contracts.”

Diageo’s court case came about because of its use of Salesforce, and SAP will be keen to encourage customers to use its newly-launched C/4 HANA suite of CRM apps over rival offerings.

Should you trust SAP?

This one’s easy to answer – wait and see.

SUGEN points out that SAP has known about situations where customers connect third-party systems to its ERP platform for at least six years, but only started punishing companies for doing so recently, leaving its customer base panicked and confused about what they could and could not do.

“Expecting customers to talk to and trust account managers in an environment where SAP has admitted to having lost customer trust is asking a lot,” says chairman Gianmaria Perancin.

“If SAP publicly provided reassurances that customers won’t be asked to pay more for use cases and implementations that were undertaken in good faith, this would go a long way to encourage customers to engage with SAP proactively.”

However, Adams adds: “Without these reassurances, customers will find themselves in a state of paralysis, unable to move forward as they do not yet know what the new licensing model will mean for them. Over the coming months, we will be surveying customer organisations to see if their licensing positions are clearer, and what this means for their future plans and investments with SAP.”

Image: Shutterstock

AWS teams up with NOVA for a degree in the cloud


Bobby Hellard

21 Jun, 2018

Amazon Web Services (AWS) has announced a new cloud computing specialisation degree created in collaboration with Northern Virginia Community College (NOVA).

The specialisation will be one of the first cloud computing degrees in the US offered by a community college and will form part of NOVA’s Information Systems Technology (IST) Associate of Applied Science degree, starting towards the end of 2018.

AWS said the two-year degree program is built to address the high concentration of tech employers in the Northern Virginia region and the demand for employees with cloud computing skills.

“A key part of the new Virginia economy is building up our talent pipeline to match our education system, and aligning our training programs around the skills needed, such as cloud computing, for 21st-century jobs,” said Ralph Northam, Governor of Virginia.

“Community colleges like NOVA are important engines for workforce development, and this collaboration with Amazon Web Services marks an exciting first step in a broader plan to bring cloud computing education to students across the Commonwealth of Virginia.”

The 63-credit associate degree program is mapped to skills and competency-based credentials required by AWS and other employers who leverage cloud-based services. All students will receive membership in the AWS Educate program and gain hands-on experience with leading cloud technology and tools.

This degree program is the first step in a much broader plan by AWS to bring cloud computing education to students throughout the state of Virginia and potentially to other educational institutes around the world.

“We’re thrilled to collaborate with NOVA on this degree program, as they break new ground to open up opportunities to careers in cloud computing for students in the state of Virginia and around the globe,” said Teresa Carlson, vice president of worldwide public sector at AWS.

“We believe that this degree offering, and our collaboration with community and vocational programs around the world, can fundamentally alter the role that these institutions play in helping to build and diversify the pipeline of new, exceptional talent in the tech community.”

Picture: Shutterstock

A CDO’s guide to data warehouse automation: Why it is needed and how to succeed with data

The four “Vs” of data are well known – volume, velocity, variety and veracity. However, data warehousing infrastructure in many organisations is no longer equipped to handle them. The fifth “V” – value – is even more elusive. Meeting these challenges at the scale of data that modern organisations now hold requires a new approach – and automation is the bedrock.

For CDOs, it’s all about finding methods of using data for value creation and revenue generation, which occupies 45% of their time. This means harnessing the growing beast that is data in a way that is practical, manageable and useful.  That’s where the data warehouse comes in, providing a centralised space for enterprise data that business users, including the CDO, can use to derive insights.

Creating a successful data warehouse is critical for CDOs to succeed in monetising data within their organisation. However, the traditional waterfall approach to data warehousing, first introduced in the 1970s, delivers only a fraction of the value that it could potentially offer. Instead, the approach needs to evolve to become more responsive as organisational needs change, addressing new data sources and adapting to business demand.

Practical steps for the successful implementation of an automated data warehouse

As IT departments are expected to do much more with much less, processes need to change. Data warehouses can no longer be created “artisanally” – IT teams need to focus on producing an adaptable decision support infrastructure. Here are four steps for CDOs to help their company achieve this:

Understand the desired outcomes: Before making any decisions as to the future of your data warehouse infrastructure, CDOs need to ensure they understand the specific challenges the business teams are facing where data could help. In essence, the data warehouse automation and modernisation program needs to be built around enabling decision-making that will lead to differentiation in the market place.

Re-alignment with business objectives is the top reason for data warehouse modernisation, selected by 39% of respondents to a TDWI survey. By enabling collaboration between business teams and IT teams, the CDO helps chart the course for how business goals and technology meet. In turn, this will lead to overall business transformation, accelerated through the new data warehouse’s approach to data-driven decisions.

Understand what you have already: Most organisations already have sophisticated data management tools deployed as part of their infrastructure – however, these may not be working to their full potential. Organisations already using SQL Server, Oracle, or Teradata, for example, have a range of data management and data movement tools within their IT real estate which can be automated and leveraged more effectively as part of a data warehouse automation push.

However, in that inventorying process, CDOs should be ensuring they have considered the capacity requirements of their data warehouse. Data will continue growing exponentially, so while the data warehouse may be fit for purpose today, it’s important that the automation processes, storage requirements and general infrastructure are capable of handling this in the future too.

As part of this, data warehouse automation needs to integrate with the business as it is, rather than the business as the IT teams wish it might be. CDOs need to encourage their teams to understand the data that is available, and the automated analytics and evaluation processes which can be used to meet specific business priorities. The data warehouse automation strategy needs to be designed not just for an ideal set up of data, expertly managed and curated, but for the realistic “messiness” of the business data landscape.

Automate efficiently: Data warehouse automation, as with any other large-scale transformation project, requires resources – and these are often scarce due to strict budgets and competing priorities. This means that CDOs need to think hard about what actually should be automated in order to free up man-hours in the future. In particular, these should be systematic, repeatable processes, where data warehouse automation can either eliminate the need for human involvement or dramatically accelerate the process (a brief illustration follows these steps).

Embrace change: CDOs should look at data warehouse modernisation and automation as an avenue of constant, on-going development. As business needs change and new data sources emerge, CDOs need to be able to re-strategise different parts of the infrastructure to match. Similarly, to minimise disruption and ease the transition for business users, CDOs should take a staged approach to the initial automation and modernisation process, with a set schedule of when different requirements will be met. Post-production change is inevitable due to evolving business needs, new technologies used and continuous improvement desired. Change needs to be planned for.

At the same time CDOs need to prepare for the human change that automation will create. In business teams, users can be re-deployed on analysing business intelligence and translating insight into business value. In the IT teams, automation provides new capacity to plan for the future – looking at new analytics tools, or planning for smarter, better ways to deliver on business priorities further down the line.
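
As a brief illustration of the “automate the systematic work” point in the third step, the sketch below generates staging-table DDL from source metadata rather than hand-writing it. The tables and columns are hypothetical, and real data warehouse automation tools go much further (loading, lineage, scheduling, documentation), but the principle of removing repetitive human effort is the same.

```python
# Illustrative only: auto-generate staging-table DDL from source metadata, the
# kind of repetitive, systematic task the "Automate efficiently" step targets.
# The metadata below is hypothetical; in practice it would be read from the
# source system's catalogue (e.g. INFORMATION_SCHEMA).

SOURCE_TABLES = {
    "customer": [("customer_id", "INTEGER"), ("name", "VARCHAR(200)"),
                 ("country", "VARCHAR(2)")],
    "invoice": [("invoice_id", "INTEGER"), ("customer_id", "INTEGER"),
                ("amount", "DECIMAL(12,2)"), ("issued_at", "TIMESTAMP")],
}


def staging_ddl(table, columns):
    """Build a CREATE TABLE statement for a staging copy, adding audit columns."""
    cols = [f"    {name} {dtype}" for name, dtype in columns]
    cols.append("    _loaded_at TIMESTAMP")        # audit column: load time
    cols.append("    _source_system VARCHAR(50)")  # audit column: origin
    body = ",\n".join(cols)
    return f"CREATE TABLE stg_{table} (\n{body}\n);"


if __name__ == "__main__":
    for table, columns in SOURCE_TABLES.items():
        print(staging_ddl(table, columns))
```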

A data warehouse automation mentality

Data warehouse automation is not solely software you buy. It’s a philosophy and culture you implement. Tools and technologies form the bedrock of the processes, but a data warehouse strategy requires strong leadership, a transparent process, and an unrelenting focus on the business’s end goals in order to succeed.

Without robust data warehouse automation, businesses will struggle to capitalise on the potential of data and its associated technologies. As the strategic leads for data-driven transformation and the change agents across both business and IT teams, CDOs carry that responsibility. Professionals in this role need to understand, strategise, and execute on the way that large-scale data usage will influence future business decisions. The adaptability of the supporting data infrastructure can be either a CDO’s greatest weakness or greatest asset. Use the four steps above to ensure it is the latter, and to achieve the ultimate goal of any business investment – value.

Why for ultimate data centre security, technology alone is not the answer

The security of data – and in particular people’s personal data – has been a hot topic in recent months. The EU’s rollout of the GDPR, the Cambridge Analytica scandal and the seemingly weekly revelations of financial institutions or consumer service providers that have had their databases hacked are all examples most of us will be aware of.

Less often discussed, but just as important as the security of our data itself, is the security of the data centres that house it. And at first glance, identifying, reviewing and prioritising all the elements that a data centre must contain in terms of security would appear to be a very complex subject, depending on myriad variables: facility size, organisation type, service commitments, system complexity, customer requirements, the list goes on…

However, independent of the variables mentioned above, in my view data centre security can be boiled down to just two areas – physical security and operational security.  And while both of these clearly depend to a great extent on technology, the single most important element is the establishment of appropriate policies, processes and operating procedures – and critically, of course, actually following them.

Unfortunately, over the years I have seen many examples of security – both physical and operational – being seriously compromised through the lack of clear and well-defined security processes and procedures. And ironically, I have seen this most often in data centre facilities that had state-of-the art security equipment installed.

For example, implementing the latest and most sophisticated biometric access systems does not, by itself, ensure that supposedly secure areas are actually secure and that access is fully controlled. On the contrary, I have witnessed unauthorised and unsupervised personnel wander in and out of secure areas at will. The failure here not being due to any fault with the access control equipment itself but to appropriate security protocols not being implemented or maintained.

As for operational security, a standard requirement for any modern data centre is to have redundancy capabilities fully integrated in order to ensure continuous operation even if disaster strikes. And for many data centre operators’ customers, this is non-negotiable, given their dependence on the often mission-critical systems the data centres house.

However, just as with ensuring physical security, implementing systems for fully redundant facility operation is not simply a matter of installing more of the latest equipment. Ensuring data centre redundancy is a hugely complex undertaking. Initial design is clearly important, as is the correct installation and interlinking of redundant systems, whether for power, cooling, monitoring, or communications. But most important of all, once again, are the protocols and procedures that must be implemented and followed in order to ensure that redundant gear actually kicks into action if and when it needs to.

Regardless of whether the data centre in question is hyperscale or a relatively small edge facility, having the right processes in place and the right people following them are typically what makes the difference between, on the one hand, a data centre’s security being fully maintained and on the other, a catastrophic failure.

So when securing even the most technical of environments, technology is only part of the answer. Without the disciplined application of associated policies and processes, success cannot be guaranteed. After all, the best tools in the tool box are of little value without the appropriate knowledge and experience to use them.

HPE puts $4 billion aside to invest in the intelligent edge

Hewlett Packard Enterprise (HPE) has seen the future – and it’s all about the intelligent edge.

The company has announced a $4 billion (£3.04bn) investment over four years in technologies and services to deliver personalised user experiences, seamless interactions, and artificial intelligence (AI) and machine learning to improve customer experiences and adapt in real time.

HPE cited Gartner figures which argue that by 2022 three quarters of enterprise-generated data will be created and processed outside of the traditional data centre or cloud, up significantly from 10% this year. It’s a race against time for companies to get proper processes and actionable insights from their data wherever it lies – and HPE feels as though it has the solutions to those problems.

“Data is the new intellectual property, and companies that can distil intelligence from their data – whether in a smart hospital or an autonomous car – will be the ones to lead,” said Antonio Neri, HPE president and CEO. “HPE has been at the forefront of developing technologies and services for the intelligent edge, and with this investment, we are accelerating our ability to drive this growing category for the future.”

Details are a little scant on where this money will go – however HPE did note that it will ‘invest in research and development to advance and innovate new products’ as well as ‘continue to invest in open standards and open source technologies, cultivate communities of software, AI and network engineers, and further develop its ecosystem through new and expanded partnerships.’

The $4bn HPE is putting aside for this investment is not quite the $5bn Microsoft announced back in April for the Internet of Things. Microsoft also favours the term ‘intelligent edge’ when discussing the future of technology. In February, during the company’s Q2 financial report, CEO Satya Nadella told analysts that the ‘intelligent cloud and intelligent edge platform [was] fast becoming a reality.’

A data centre with no centre: Why the cloud of the future will live in our homes

When we talk about the home of the future, we might think of technologies that will bring convenience to our lives. Refrigerators that know when we’ve run out of milk and order more for us, perhaps, or 3D printers that will make any shape of pasta you can imagine. But while this Jetsons-like vision might initially hold some appeal to the average person on the street, such innovations are likely to come and go in a flash.

In truth, we struggle to adapt to technologies that are designed to help us around the home, often reverting to old ways of doing things. Take Amazon’s Alexa for example. The vast majority of voice skills that the AI assistant is capable of are largely unused by consumers.

It’s time we shifted focus from gimmicks to changes that will offer genuine value to the household and the wider community. Because in the home of the future, the most interesting things will be happening behind the scenes.

Imagine a home that is also a data centre. While on the surface this might seem like a far-fetched idea, it would actually bring many benefits. A vast amount of computational power currently goes unused in homes, with computers, games consoles, set-top boxes and smart televisions under-utilised and in many cases in standby mode for most of their life. This untapped power could be used to drastically reduce reliance on existing data centres.

And then there’s speed. The UK’s average broadband speeds of 46.2Mbps downstream and 6.2Mbps upstream might not sound like much compared to a tier-one data centre, but by linking homes together in a decentralised network the potential is enormous: a million connected homes, for example, would offer several terabits per second of aggregate upstream capacity.

Of course, any such plan would need to be incentivised – by using blockchain technology for example, to create a tokenised system of reward for contributing to this decentralised network.

Imagine having the cost of your broadband bill covered by being part of such a network, or buying a wireless speaker that pays for itself over time. Those that contribute the most earn the most, and by using a decentralised system the rewards would be distributed in a completely transparent and verifiable way, in contrast to traditional cloud platforms that centralise control and network revenues.

These networks would provide real, community-owned alternatives to the services provided by Amazon, Google and the like, without the massive environmental impact. Data centres are of course incredibly costly to run, mostly due to the enormous amount of energy required to keep the servers cool. While Microsoft has recently embarked on a curious experiment with Project Natick, sinking a data centre into the sea off the Orkney Islands in an attempt to boost energy efficiency, it’s hard to believe that this is a realistic option for the future.

Being surrounded by seawater might keep the temperature of the hardware under control without requiring the specialist cooling systems used in conventional server farms, but it also makes servicing a faulty node pretty much impossible, and a lot of energy has to go into making the thing in the first place. Surely it makes much more sense to maximise the potential of the devices we already have at our disposal, which would otherwise be idle for around three-quarters of their lifetime.

In the UK, we’re already starting to see solar panels built into many new homes, as well as installed in existing ones – with owners able to sell any excess electricity that they generate back to the National Grid. And you can certainly imagine a near-future where this shift paves the way for individual homes becoming nodes of vast, decentralised networks.

So, the concept of cloud might not be sinking without a trace – but it is certainly time for an upgrade. And this year, you can expect the first wave of decentralised, faster, cheaper networks to arrive – bringing a much needed working use case along for the ride.

Best desktop email clients 2018


Jonathan Parkyn

21 Jun, 2018

Web-based email has never been so popular, yet there are plenty of headaches associated with having to be online to read your messages. Even the best cloud services out there struggle to replicate the ease of use that desktop-based clients bring, whether that’s easily backing up emails, accessing attachments offline, or simply offering the same flexibility when it comes to storage capacity.

We’ve tested some of the most popular email clients to see which offer the most well-rounded experience, assessing software performance, feature set and ease of use.

eM Client

www.emclient.com

Price: Free

With a smart-looking, modern interface and plenty of advanced features, eM Client is easily the best email software for Windows PCs.

Calendar, contacts and tasks are all integrated and there’s even built-in support for chat (via Facebook, Google or Jabber). Setting up accounts is very straightforward – most popular email services are automatically recognised and configured without you having to faff around with SMTP server settings and suchlike. If you’re switching from another email program, eM Client will helpfully offer to import data from your old application, and if you’re using an Outlook.com or Gmail account, your calendar and contacts will be automatically synced, too.

The client can be switched to a stylish Dark theme

eM Client’s default interface should feel instantly familiar – it uses the tried-and-tested, three-column (folder list, message list, preview pane) layout. But the program doesn’t look old-fashioned and you can customise its layout to suit your tastes by clicking Menu, Tools, Settings, Appearance – we prefer the stylish Dark theme. It’s also possible to switch Conversation view off, if you prefer.

Notable features include a super-fast search, advanced filtering tools, templates, signatures, tags and the ability to categorise mail using colour-coding. There’s even a built-in translator, which uses Bing’s translation engine.

The latest version of eM Client (7.1) adds a number of useful new features, including an improved backup tool that can automatically back up your data in the background, and support for PGP encryption.

How it can be improved:

The free version of eM Client only supports two mail accounts. If you need more than that, you’ll have to pay for the Pro version, which costs £36 (or £72 if you want lifetime upgrades to future versions). After the 30-day trial, you’ll need to apply to eM Client for a free licence to continue using it, which seems like an unnecessary step. There’s currently no integration with Windows 10’s Action Centre – instead you’re alerted to new mail via eM Client’s own Notification area icon pop-up.

Verdict

Quibbles aside, eM Client easily beats the competition. It offers a great balance of simplicity and adaptability, while its familiarity makes it a great replacement for older tools, such as Outlook Express and Windows Live Mail.

Features: 5

Performance: 5

Ease of use: 5

Overall: 5

Mozilla Thunderbird

www.thunderbird.net

Price: Free

Thunderbird is a resolutely old-school email program that offers support for multiple POP and IMAP accounts, and provides easy set-up for popular services, such as Gmail and Outlook.com.

You can configure it so that it looks and works how you want it to, and there are loads of features, including powerful filtering tools, an RSS reader and instant messaging. Like Mozilla’s more famous web browser, Thunderbird’s abilities can be expanded further by installing add-ons – anything from alternative themes to mail merge tools, password managers and more – though many ‘legacy’ extensions are being phased out. The once-optional Lightning add-on is now integrated into the program, meaning that calendar and tasks features are now built in.

How it can be improved:

Thunderbird lacks native Exchange support, meaning some accounts (including Outlook.com ones) don’t get the full range of features. There’s no support for Windows 10’s Action Centre, either – so there are no native Windows 10 notifications.

Mozilla has made no secret of the fact that it’s struggling to justify Thunderbird’s ongoing development. Last year, Mozilla found a way to keep Thunderbird alive by separating it from its core business, and new features have been promised, but its future is far from guaranteed.

Verdict

With plenty of built-in features and many more available through add-ons, Thunderbird is highly versatile, though its ageing interface and lack of support for some newer standards are disappointing – and may never be fixed.

Features: 4

Performance: 4

Ease of use: 4

Overall: 4

Microsoft Mail app

www.microsoft.com

Windows 10’s built-in apps tend to come in for a bit of a knocking, but Mail is actually pretty good. It has a nice, clean interface, supports most account types (including POP and IMAP) and is refreshingly simple to set up and use.

Its close integration into the OS has a number of benefits, including a live tile in the Start menu and a cross-app relationship with the People (contacts) and Calendar apps. Microsoft keeps improving the Mail app, too. Last year it added a Focused Inbox feature for Gmail users, for example (click Settings, Reading to toggle this on or off). On touch-screen devices, the app’s intuitive Swipe Action controls are an added bonus.

How it can be improved:

Some of Mail’s tools are a little too simplistic – there’s no filtering, for example, and it only supports plain text signatures (though you can hack it by adding in your own HTML code). Also, since Mail is tied so closely to the OS, its reliability can be affected by Windows 10 updates. We’ve experienced problems like these first-hand and many user reviews on the app’s Windows Store page would suggest that we’re not alone.

Verdict

If simplicity is what you’re after, look no further – Mail’s already installed on your PC and is a piece of cake to set up. That said, you may find its lack of features frustrating.

Features: 3

Performance: 4

Ease of use: 5

Overall: 4

Best of the rest

Microsoft Outlook (www.microsoft.com)

If you subscribe to Office 365 (from £5.99 per month or £60 per year), you get Outlook with it, which is an email client, calendar tool, contacts manager and to-do list all in one. Outlook offers a lot of advanced tools, including a powerful Rules function, fast search and built-in archiving tools. But, these days, Outlook feels like overkill – there are simpler tools available for free.

Mailbird Lite (www.getmailbird.com)

Mailbird Lite feels a little like Microsoft’s Mail app on steroids. It looks great and has the ability to connect to popular apps and services, including WhatsApp, Slack and Facebook, as well as the usual email accounts. The free version of Mailbird restricts you to a single email account and lacks some of the full program’s better tools, such as attachment previewing and email snoozing. Adding these will cost you £19.50.

Postbox (www.postbox-inc.com)

Postbox’s interface is uncluttered and should feel familiar. RSS feeds and newsgroups are supported but, strangely, there’s no built-in calendar. Some of the program’s more innovative tools include automated responses and placeholders, which can save time if you find yourself frequently sending similar replies. The big drawback is that Postbox isn’t free. Beyond the 30-day free trial you’ll need to pay – currently it’s $40 (£29) for the full version.

Image: Shutterstock

Edge or cloud? The five factors that determine where to put workloads

Should you send your data to computers or bring your computing assets to the data?

This is a major question in IoT. A few years ago you might have said “everything goes to the cloud”, but sheer size and scope often make a smart edge all but inevitable. IDC estimates that 40% of IoT data will be captured, processed and stored pretty much where it was born, while Gartner estimates the amount of data outside the cloud or enterprise data centres will grow from 10% today to 55% by 2022.

So how do you figure out what goes where?

Who needs it?

IoT will generate absurd quantities of data across all industries. Manufacturers and utilities already track millions of data streams and generate terabytes a day. Machine data can come at blazingly fast speeds, with vibration systems churning out over 100,000 signals a second, delivered in a crazy number of formats.

Machines, however, aren’t good conversationalists. They often just provide status reports on temperature, pressure, speed, pH, etc. It’s like watching an EKG machine; companies want the data, and in many cases need to keep it by law, but only a few need to see the whole portfolio.

The best bet: look at the use case first. Chances are, every workload will require both cloud and edge technologies, but the size of the edge might be larger than anticipated.

How urgently do they need it?

We’ve all become accustomed to the Netflix wheel that tells you your movie is only 17% loaded. But imagine if your lights were stuck at 17% brightness when you came home. Utilities, manufacturers and other industrial companies operate in real-time – any amount of network latency can constitute an urgent problem.

Peak Reliability, for instance, manages the western U.S. grid. It serves 80 million people spread over 1.8 million square miles. It also has to monitor over 440,000 live data streams. During the 2017 total solar eclipse, it was getting updates every ten seconds.

Rule of thumb: if interruptions can’t be shrugged off, stay on the edge or a self-contained network.

Is anyone’s life on the line?

When IT managers think about security, they think firewalls and viruses. Engineers on factory floors and other “OT” employees – who will be some of the biggest users of IoT – think about security as fires, explosions and razor wire. The risk of a communications disruption on an offshore drilling rig, for example, far outweighs the cost benefits of putting all of the necessary computing assets on the platform itself. Make a risk-reward assessment.

What are the costs?

So if the data isn’t urgent, won’t impact safety, and more than a local group of engineers will need it, do you send it to the cloud? It depends on the cost. Too many companies have responded to the cloud like a teenager in 2003 given their first smartphone. Everything seems okay, until the bill comes.

In the physical world, no one sends shipments from L.A. to San Francisco via New York, unless there is a good reason to go through New York. Distance means money. Sending data to the cloud that could just as effectively be stored or analyzed on the edge is the digital equivalent. Getting the right balance of edge and cloud is the key to managing the overall total cost of ownership (TCO).

How complex is the problem?

This is the most important and challenging factor. Are you examining a few data streams to solve an immediate problem such as optimizing a conveyor belt, or comparing thousands of lines across multiple facilities? Are you looking at a patient’s vital signs to determine a course of treatment, or studying millions of protein folds to develop a new drug?

Companies often use the cloud to crack a problem, and then repeat it locally at the edge. Projects resulting in millions in savings aren’t being produced by a magical algorithm in the cloud – instead, people look at a few data streams and figure it out on their own.

Another way to think about it: the cloud is the R and the edge is the D in R&D.
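
To pull the five factors together, the sketch below encodes them as a rough scoring heuristic. It is entirely illustrative (the weights and threshold are arbitrary placeholders, not a published methodology), but it shows how the questions above can be turned into a repeatable placement decision.

```python
# Entirely illustrative: turn the article's five factors into a rough edge/cloud
# score. Weights and threshold are arbitrary placeholders, not a methodology.
from dataclasses import dataclass


@dataclass
class Workload:
    broad_audience: bool      # who needs it? many teams/sites vs a local group
    latency_sensitive: bool   # how urgently? real-time control vs batch reporting
    safety_critical: bool     # is anyone's life on the line?
    high_transfer_cost: bool  # what are the costs of shipping the data out?
    complex_analytics: bool   # how complex? fleet-wide modelling vs local tuning


def place(w: Workload) -> str:
    """Return 'edge' or 'cloud' from a simple weighted vote of the five factors."""
    score = 0
    score += 2 if w.latency_sensitive else 0   # urgency favours the edge
    score += 2 if w.safety_critical else 0     # safety favours local control
    score += 1 if w.high_transfer_cost else 0  # backhaul cost favours the edge
    score -= 1 if w.broad_audience else 0      # a wide audience favours the cloud
    score -= 2 if w.complex_analytics else 0   # heavy analytics favours the cloud
    return "edge" if score > 0 else "cloud"


if __name__ == "__main__":
    conveyor = Workload(False, True, True, True, False)         # local, real-time
    drug_discovery = Workload(True, False, False, False, True)  # fleet-wide R&D
    print(place(conveyor))        # edge
    print(place(drug_discovery))  # cloud
```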