Google gets serious about cloud with sole-tenant nodes


Clare Hopping

11 Jun, 2018

Google has announced the beta rollout of its sole-tenant nodes in Google Compute Engine, allowing businesses to run instances on their own dedicated architecture rather than having to share the host with others.

However, a node being used by only one business doesn’t mean it’s harder for that firm to set up and manage: Google’s sole-tenant nodes use an algorithm to automatically find the best location to launch each instance. They’re also better at dealing with outages during maintenance, with instances migrated automatically to avoid downtime, according to the cloud giant.

Google claimed the pricing structure is also highly competitive because businesses only need to pay for what they use. Access is charged on a per-second basis, with a one-minute minimum charge.
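To make the arithmetic concrete, here is a minimal sketch of that billing rule in Python; the per-second rate is a placeholder, not Google’s published price:

```python
def billed_seconds(runtime_seconds: int, minimum_seconds: int = 60) -> int:
    """Per-second billing with a one-minute minimum charge."""
    return max(runtime_seconds, minimum_seconds)

def node_charge(runtime_seconds: int, rate_per_second: float) -> float:
    """Charge for a sole-tenant node run; rate_per_second is a placeholder."""
    return billed_seconds(runtime_seconds) * rate_per_second

# A 45-second run bills as 60 seconds; a 10-minute run bills as-is.
assert billed_seconds(45) == 60
assert billed_seconds(600) == 600
```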

The cloud giant said its sole-tenant nodes will be particularly attractive for highly regulated industries that need to separate their compute resources in the cloud for compliance reasons. They can be used hand-in-hand with virtual machines to offer a flexible, but secure service.

Businesses that aren’t so highly regulated can also use these nodes to choose where their instances run, either through user-defined labels or by letting Google select the locations instead.
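In practice, that placement choice is expressed as node affinity labels in the instance’s scheduling policy. Below is a hedged sketch of the relevant fragment of a Compute Engine instance definition; the label key and values are hypothetical examples, and the exact fields may differ by API version:

```python
# Fragment of a Compute Engine instance definition (as a Python dict)
# pinning a VM to a sole-tenant node group via node affinity labels.
instance_body = {
    "name": "regulated-workload-1",  # hypothetical instance name
    "scheduling": {
        "nodeAffinities": [
            {
                "key": "workload",         # user-defined label key (example)
                "operator": "IN",          # "NOT_IN" would repel instead
                "values": ["compliance"],  # example label value
            }
        ]
    },
}
```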

Google’s sole-tenant nodes are pretty similar to Amazon Web Services’ EC2 Dedicated Hosts, which also give users keys to their own virtual kingdom, letting customers create multiple instances on dedicated physical hardware if they wish. The offering also appears similar to Microsoft’s single-tenanted infrastructure, suggesting Google is a little behind its rivals in launching it.

Picture: Bigstock

dHosting Named “Technology Sponsor” of CloudEXPO NY

Having been in the web hosting industry since 2002, dHosting has gained a great deal of experience while working on a wide range of projects. This experience has enabled the company to develop its new product, which it is now excited to present. Among dHosting’s greatest achievements are the development of its own hosting panel, the building of its fully redundant server system, and the creation of its unique product, Dynamic Edge.


Why digital business talent is a top priority for CEOs

Who is leading digital transformation at the most forward-thinking organizations across the globe, and what are the most significant roadblocks to their progress? That depends on who you ask. Let's consider the most common digital business challenges today, and how savvy leaders overcome them.

Digital growth tops the list of CEO business priorities in 2018 and 2019, according to the latest worldwide market study by Gartner. However, as growth becomes harder to achieve, CEOs are concentrating on changing and upgrading the structure of their companies — including digital business investments.

Exploring digital business transformation culture

"Although growth remains a CEO's biggest priority, there was a significant fall in simple mentions of it this year, from 58 percent in 2017 to just 40 percent in 2018. This doesn't mean CEOs are less focused on growth; instead, it shows that they're shifting perspective on how to obtain it," said Mark Raskino, vice president at Gartner.

The Gartner survey of CEOs and senior business executives, conducted in the fourth quarter of 2017, examined their business issues as well as some areas of technology agenda impact. In total, 460 business leaders at organizations with more than $50 million in annual revenue were qualified and surveyed.

IT remains a high priority, coming in third, with CEOs mentioning digital transformation in particular. Workforce has risen rapidly this year to become the fourth-biggest priority, up from seventh in 2017: the number of CEOs mentioning workforce in their top three priorities rose from 16 percent to 28 percent.

However, when asked about the most significant internal constraints to growth, employee knowledge and talent issues were at the very top of the list. CEOs said a lack of skilled talent and workforce capability is, by far, the biggest inhibitor of digital business development progress.

Culture change is a key aspect of digital transformation. Gartner found that CIOs agreed it was a very high priority, but only 37 percent of CEOs said a significant or deep culture change is needed by 2020. However, among companies that have a digital growth initiative, the proportion recognizing a need for culture change rises to 42 percent. Enough said.

"These survey results show that if a company has a digital initiative, then the recognized need for culture change is higher," said Mr. Raskino. "The most important types of change that CEOs intend to make include making the culture more proactive, collaborative, innovative, empowered and customer-centric. They also highly rate a move to a more digital and tech-centric culture."

Survey respondents were asked whether they have a management initiative or transformation program to make their business more digital. Sixty-two percent said they did. Of those organizations, 54 percent said that their digital business objective is transformational while 46 percent said the objective of the initiative is optimization.

In the background, CEOs' use of the word 'digital' has been steadily rising. When asked to describe their top five business priorities, the number of respondents mentioning the word 'digital' at least once has risen from 2.1 percent in the 2012 survey to 13.4 percent in 2018.

This attitude toward digital business development is backed up by CEOs' continuing intent to invest in IT infrastructure. Sixty-one percent of respondents intend to increase spending on IT in 2018, while 32 percent plan to make no changes to spending and only seven percent foresee spending cuts.

Ongoing digital culture development challenge

The Gartner survey showed that the percentage of survey respondents who think their company is an 'innovation pioneer' has reached a high of 41 percent — that's up from 27 percent in 2013 — with fast followers not far behind at 37 percent.

"CIOs should leverage this bullish sentiment by encouraging their business leaders into making commitments to digital business change," said Mr. Raskino. "However, superficial digital change can be a dangerous form of self-deceit. The CEO's commitment must be grounded in deep fundamentals, such as genuine customer value, a real business model concept and disciplined economics."

Google Cloud launches sole-tenant nodes for improved compliance and utilisation

Google Cloud has announced the launch of sole-tenant nodes on Google Compute Engine – helping customers in various industries around compliance in the process.

The new service, which is currently in beta availability, gives customers ownership of all VMs, hypervisor and host hardware, going against the traditional cloud use case of multi-tenant architecture and shared resources.

“Normally, VM instances run on physical hosts that may be shared by many customers,” explained Google’s Manish Dalwadi and Bryan Nairn in a blog post confirming the news. “With sole-tenant nodes, you have the host all to yourself.”

This will potentially be good news for companies in finance and healthcare, along with other firms that adopt an all-data-is-equal-but-some-data-is-more-equal-than-others mindset. Organisations with strict compliance and regulatory requirements can use sole-tenant nodes to ensure physical separation of compute resources in the cloud. Google also noted that companies can achieve higher levels of utilisation if they are creative with their instance placements and the machine types launched on sole-tenant nodes.

The move puts Google in line with Amazon Web Services (AWS) and Microsoft. The former, for example, offers EC2 Dedicated Hosts, a physical server with EC2 instance capacity dedicated to the user, as well as Dedicated Instances. An AWS document outlines the differences between the two; apart from the straightforward difference in terms of per-host and per-instance billing, Dedicated Hosts offers visibility on sockets and physical cores, targeted instance placement and bring your own license (BYOL).

This is just one of various initiatives Google has put into place this year to beef up its cloudy operations. Last month, this investment was validated in the form of Gartner’s Magic Quadrant for cloud infrastructure as a service. Google made the leaders’ section, which for five years had been the sole domain of AWS and Microsoft, for the first time.

Pricing for Google’s sole-tenant nodes is on a per-second basis with a minimum charge of one minute.

Main picture credit: Google

Five tips for creating successful company-wide data security training

Creating a safe online environment for your business is a major concern for leaders today. With the number of data breaches increasing steadily and consumer trust in data management declining, it’s no wonder that improving the security of IT systems is the number one priority for 55% of companies.

Employees can either be your greatest strength or your greatest weakness when it comes to data security. Unfortunately, one of the leading causes of data breaches is internal negligence due to poor training. However, when the staff is educated and instructed on the proper practices, the risk of cyberattacks or data leaks can be reduced by up to 70%.

If your business is ready to enact a company-wide data security training plan, here are some tips that can improve results and ensure you are properly prepared for anything that comes your way.

Make sure everyone understands the entire business process

Businesses in today’s world rely on vast collections of complex datasets. First and foremost, in order to make sure this valuable information stays secure, everyone must understand the processes and how their work fits into the big picture. Whether it's managing customer profiles, translating marketing data into the main CRM system, or similar tasks, there cannot be any gray areas.

Explaining to employees how pieces of the data puzzle fit together will make it much easier to implement security procedures and new systems. For example, one common cloud-based CRM system that businesses rely on these days is Salesforce. However, 43% of teams report that they use less than half of its features. This could result in poor data management and reduced returns. By using a proper Salesforce training system that explains how datasets can be used throughout the company, you can fill in the gaps and help your team better understand the data lifecycle.

The need for data security education is huge among businesses. In order for information to be utilized properly, there must be a set system in place for its storage and organization. Be sure that your onboarding strategies cover all of the bases so everyone is on the same page.

Assess the needs of each department

Of course, each facet of your business has different needs and priorities, especially in terms of data collection and access. For instance, the accounting department will need higher security for sensitive financial information from clients while marketing teams will require consumer behavior data points to guide their strategies.


Rather than thinking of a data security system as a one-size-fits-all blanket, you must take each department’s needs into consideration and be sure that your approach covers their priorities. Talk to the heads of each department to determine how and where data security can best be implemented to accommodate their day-to-day.

Determine when and how training will be conducted

Introducing a company-wide security training program is by no means a small task. Every single organization is made up of people, and each person learns differently. Therefore, in order to be sure that everyone is on the same page, there must be some careful planning about the way that training will be conducted.

Make sure that your training courses cover the most important topics for the best results. Keep in mind, not all your employees are data security experts; try not to get too technical and keep it user-friendly. Stick to the main points and offer clear and easy solutions.

It’s interesting to note that businesses that hold a single training program every year have lower retention rates than ones that offer monthly refreshers. If possible, it may be in your best interests to offer regular classes throughout the year to make sure employees stay up to speed.


Develop a system to test the effectiveness of training

According to Dashlane’s report, 90% of businesses fall prey to attacks due to internal threats and mistakes made by employees. The most common culprits are phishing attempts, weak passwords, and accidentally sharing private information. Therefore, part of your training must address these top issues, as well as the solutions to combat the most prevalent problems.

In addition to providing educational information, your training must have a system in place to check that everyone understands what they have learned. Since much of the information related to data security is highly technical, not every employee will get it the first time around. A short test at the end of training will show what your team learned on paper, but simulations and test runs will give you a better idea of how they will actually apply this knowledge in the real world.

There is a range of tests you can run to check your employees’ security savviness. For example, you can send simulated phishing emails, or even run password enumeration tests, to check the effectiveness of your employees’ security habits.
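As an illustration, a simulated phishing email can be as simple as the sketch below: every link points at an internal tracking endpoint, so clicks identify who needs refresher training. The host names and tracking URL are hypothetical, and any real exercise should be cleared with HR and legal first:

```python
import smtplib
from email.message import EmailMessage

def send_phishing_test(recipient: str) -> None:
    """Send one simulated phishing email; clicks hit a tracking endpoint."""
    msg = EmailMessage()
    msg["From"] = "it-support@example.com"         # spoofed-looking sender
    msg["To"] = recipient
    msg["Subject"] = "Action required: password expiry"
    msg.set_content(
        "Your password expires today. Review your account here:\n"
        f"https://training.example.com/track?user={recipient}\n"  # tracker
    )
    with smtplib.SMTP("mail.example.com") as smtp:  # internal relay (assumed)
        smtp.send_message(msg)
```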

Stay up to date on all big data news and trends

The world of data security changes seemingly by the minute. Every day, there are new threats along with new technology to make systems safer and more secure.

In order to truly protect your company from cyberattacks, cyber security managers must stay sharp on any developments in this area. Make sure everyone stays informed about new data systems and technologies by keeping up with the latest industry news. Furthermore, encourage continued education or participation in cyber security seminars and meetings.

Conclusion

Thankfully, the issue of data security is not without solutions. Whether your business decides to instill stricter data governance for added security, or prefers a multi-cloud infrastructure for increased safety, the only way to ensure that these strategies perform effectively is to train your team properly and make sure they know the processes from A to Z.

As employee use of cloud apps explodes – can CASBs help?

Rapid adoption of software as a service (SaaS) has changed the security paradigm for enterprise applications. Provisioning is no longer an activity performed solely by IT; instead, business managers are independently purchasing cloud apps and skipping security practices. This leaves enterprise data exposed, forcing IT/security teams into reactive mode as they try to manage risks with their existing security tools using ineffective “whack-a-mole” approaches.

Ultimately, as enterprises shift more workloads to the cloud, securing this environment becomes one of the most critical challenges facing security and IT decision makers. Shadow IT has further complicated enterprises’ ability to gain visibility and control of the applications employees use every day.

For that reason, enterprises are increasingly looking to cloud access security broker (CASB) vendors to fill the void. CASBs are designed to work with any and all SaaS applications by delivering add-on security capabilities and analytics that enable detection and response for cloud apps.

Last year for its CASB Magic Quadrant report, Gartner predicted that by 2020, 60 percent of large enterprises will use a CASB to govern cloud services, up from less than 10 percent today. The analyst firm also predicted that SaaS and IaaS will drive end-user CASB spending, from $150.7 million in 2015 to $713 million in 2020.

Magic Quadrant co-author Steve Riley noted: “CASBs are becoming as important to cloud as firewalls became to data centers. With your firewall, the whole purpose was to protect your data on your systems. In cloud it’s still your data, but it isn’t your system anymore. CASBs are the thing that helps you protect your data on somebody else’s systems.”

Gartner defines four equally important functional pillars that a CASB solution must deliver: visibility, compliance, data security, and threat protection. These four capabilities are mandatory for addressing a tenant’s security responsibilities and are essential for cloud security success. The following four considerations should be taken into account when evaluating and deploying CASB solutions.

SaaS discovery and reputation

SaaS discovery represents one of the most important capabilities in the CASB toolbox because it catalogs all SaaS usage by employees. It’s quite common to find hundreds of applications on the list and, depending on each application’s purpose, this implies a likely widespread loss of visibility and control over sensitive data uploaded by users. That’s enough to keep any CISO awake at night, and it certainly justifies CASB investment.
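A toy version of that discovery step, assuming outbound proxy logs of the form “user URL” and a hand-maintained SaaS domain list (a real CASB catalog is far larger):

```python
from collections import Counter
from urllib.parse import urlparse

# Hand-maintained sample of known SaaS domains (illustrative only).
KNOWN_SAAS = {"dropbox.com", "salesforce.com", "slack.com", "box.com"}

def discover_saas(log_lines):
    """Tally hits against known SaaS domains from proxy log lines."""
    usage = Counter()
    for line in log_lines:
        user, url = line.split(maxsplit=1)
        domain = urlparse(url).hostname or ""
        root = ".".join(domain.rsplit(".", 2)[-2:])  # crude eTLD+1
        if root in KNOWN_SAAS:
            usage[root] += 1
    return usage

print(discover_saas(["alice https://www.dropbox.com/home",
                     "bob https://app.slack.com/client"]))
# Counter({'dropbox.com': 1, 'slack.com': 1})
```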

Assessing a SaaS vendor’s security capabilities is inherently opaque and requires significant investigation and due diligence. Hence, security certifications for SaaS vendors carry a lot of weight, and reputation is also critical. As part of their service, CASB vendors should include a reputation service that evaluates the security maturity of an entire catalog of SaaS vendors. Risk scores must be provided so that IT security teams can make informed decisions about whether or not certain cloud apps should be trusted and used by employees.

However, before you go exercising control, remember that consumerization trends have shown that heavy-handed approaches to security often backfire, so the most effective strategy is to “coach” employees to use more secure options. But if the risks of dubious cloud apps are ultimately unacceptable and user practices don’t adjust accordingly, application blocking can and should be used.

Identity and two-factor authentication

As soon as the cloud application count goes above one or two, having employees manage their own identities and passwords quickly becomes a tangle of security risks and poor user experience. Integration with Identity and Access Management (IAM) is therefore mandatory for managing risk and optimizing user experience. Better yet, an IAM integrated with the CASB solution will accelerate deployment and increase the CASB’s value for organizations that have yet to roll out IAM. Of course, if you already have IAM, the CASB solution must be able to support multiple identity vendors. Risk-based authentication is effective at balancing user experience with security control: when risky behavior or activity is detected, the user should be asked to re-authenticate with a second factor.
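The decision logic behind risk-based authentication can be sketched in a few lines; the signals, weights and thresholds below are illustrative assumptions rather than any vendor’s actual model:

```python
def risk_score(event: dict) -> int:
    """Score a login attempt from simple contextual signals (illustrative)."""
    score = 0
    if event.get("new_device"):
        score += 2
    if event.get("country") != event.get("usual_country"):
        score += 3
    if event.get("failed_attempts", 0) > 2:
        score += 2
    return score

def auth_decision(event: dict) -> str:
    """Allow, demand a second factor, or deny, based on the risk score."""
    score = risk_score(event)
    if score >= 5:
        return "deny"
    if score >= 2:
        return "step-up"  # re-authenticate with a second factor
    return "allow"

print(auth_decision({"new_device": True, "country": "FR",
                     "usual_country": "GB"}))  # "deny" (score 5)
```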

Data visibility, control, and loss prevention

Visibility of data flowing in and out of cloud applications is best enabled with Data Loss Prevention (DLP) practices and tools. DLP is not a new capability, so enterprises with existing deployments should be able to extend their existing policies via the Internet Content Adaptation Protocol (ICAP) into the CASB for additional enforcement and protection of both structured and unstructured data within cloud apps.

For many mid-sized organizations that don’t yet have DLP, an integrated CASB DLP option for configuring and enforcing policy is a cost-effective choice. Appropriate DLP controls are needed to enforce policies that prevent the most sensitive data from entering a cloud app. Similar controls should also prevent, or alert on, attempts to offload sensitive data, particularly onto unmanaged devices.

The latter is the riskiest, and having an integrated Digital Rights Management (DRM) capability means that when a third party user or an internal user on an unmanaged device needs to view data, it can be done simply through web browser scripts that prevent saving, offloading and cutting and pasting of data. Finally, for the most security-conscious organizations, being able to enforce data-at-rest encryption using their own unique keys ensures that no other party, including the SaaS provider itself, can access cloud data.
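A toy illustration of the content-inspection side of DLP described above: flag outbound text containing strings that look like payment card numbers (digit pattern plus Luhn checksum). Real DLP policies cover far more identifier types; this is only a sketch:

```python
import re

# 13-16 digits, optionally separated by spaces or hyphens.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(digits: str) -> bool:
    """Luhn checksum; weeds out most random digit strings."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def contains_card_number(text: str) -> bool:
    """True if text contains a plausible 13-16 digit card number."""
    for match in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_ok(digits):
            return True
    return False

print(contains_card_number("Card: 4111 1111 1111 1111"))  # True (test number)
```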

Threat detection and response

Credential theft by attackers and malicious insider activity are two major threats facing cloud applications. Two-factor authentication goes a long way to mitigate stolen credentials, but it’s not always used, nor is it foolproof. Advanced analytics, or more specifically User and Entity Behavior Analytics (UEBA), is a critical capability that identifies these types of potentially malicious activity so that immediate responses can be taken, such as locking the account or requesting step-up authentication.
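For intuition, here is a minimal behavioural-baseline check of the kind UEBA systems generalise: flag a user whose daily activity count deviates sharply from their own recent history. The threshold and signal are illustrative only:

```python
from statistics import mean, stdev

def is_anomalous(history, today, z_threshold=3.0):
    """Flag today's count if it sits far above the user's own baseline."""
    if len(history) < 5:
        return False  # not enough baseline to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return (today - mu) / sigma > z_threshold

baseline = [12, 9, 14, 11, 10, 13, 12]   # documents downloaded per day
print(is_anomalous(baseline, 250))        # True: lock account or step up auth
```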

Decision makers recognize that they really can’t prevent employees from accessing cloud applications, but CASB allows businesses to get their arms around Shadow IT as well as sanctioned cloud application usage by extending on-premises security policies to the cloud. Rather than fighting SaaS, CASB tools embrace SaaS in a way that doesn’t impair the user experience, but instead enhances it while allowing the enterprise to maintain a comprehensive and consistent security policy across all software environments.

Why enterprises are creating a self-induced skills gap despite strong cloud appetite

Enterprises have a serious appetite to move their resources to the cloud at one level – but a lack of skills and resistance to change from some quarters is holding organisations back.

That’s the key finding from new research from global cloud provider Skytap. The study, conducted in conjunction with 451 Research, may have many rolling their eyes in a manner suggesting they have seen it all before – yet it still shows companies are not getting to grips with the change.

More than two thirds (67%) of the 450 C-level and director-level technology leaders polled said they planned to migrate or modernise at least half of their on-premises applications in the next 12-24 months. Yet half (49%) said they wanted to go about migrating to the cloud through refactoring or rewriting applications – the strategies that require the highest degree of IT skill. As the report puts it, organisations are ‘their own worst enemy.’

Part of this is down to the lure of hyperscale cloud providers. Two in three respondents say they use one or more of Amazon Web Services (AWS), Azure, Google or IBM Cloud. Yet the report argues this form of cloud modernisation – focusing predominantly on the front end – neglects the engine room, the enterprise data centre, where gnarled, complex ERP and CRM apps live. They’re critical to the business but, crucially, ill-suited to cloud environments.

This may end up explaining a certain amount of apathy among organisations polled. More than half (55%) of respondents said their most critical recruiting need was ‘people capable of migrating existing applications to the cloud’, while a similar number (54%) said ‘internal resistance to change’ was key to holding their firm back from modernisation.

All things considered, the key point of the research is a simple one: don’t believe anyone who tells you that cloud is done, at least in the enterprise. Enterprise approaches continue to be haphazard – a ‘myriad of difficult choices exacerbated by urgent skills needs and the significant challenges created by traditional but mission-critical applications left in the data centre’, as the company puts it.

“Cloud is often overhyped and simplified, while modernisation and digital transformation can be even more vague,” said Brad Schick, Skytap CTO. “Our study cuts through to clarify the fact that technological change is hard and is being further aggravated by cookie-cutter approaches to cloud adoption.

“We want to be part of a conversation that gives enterprises clear choices to manage change and progressively modernise without burning everything to the ground,” added Schick.

You can find out more and read the report here.

Experts urge caution over NHS promises to secure its data in the cloud


Keumars Afifi-Sabet

7 Jun, 2018

Migrating from legacy infrastructure to the cloud is a mammoth task for any organisation, yet it’s particularly daunting for the National Health Service (NHS) – one of the UK’s most critical public services and its largest employer.

NHS Digital, the organisation underpinning the health service’s digital transformation, issued guidance in January outlining how Trusts should approach migration – marking the first time an authority has greenlit the use of public cloud in the health service.

Previously, cloud providers would approach Trusts to offer their services only to get knocked back, with many Trusts claiming NHS Digital prohibited the use of public cloud services. NHS Digital would then have to clarify to vendors individually, when they subsequently complained, that there was no explicit ban on the use of cloud.

Because of this, NHS Digital commissioned a working group on the safe use of cloud with the National Cyber Security Centre (NCSC) in 2016, with policy at the time dictating any data stored overseas couldn’t contain sensitive information. The working group was set up to investigate how the policy could be changed to make use of cloud benefits, with a framework eventually agreed with ministers the following year.

“What the guidance is providing is a mechanism for organisations to do their own risk assessment as to whether or not cloud services can be used for their service or requirements,” Michael Flintoff, NHS Digital associate director for platforms and infrastructure, tells Cloud Pro.

“The main thing for me is clarity,” he adds, as Trusts have indicated a desire for guidance around data risk modelling and accessing services, something which traditionally hasn’t been there.

Sam Smith, coordinator at campaign group medConfidential, agrees the guidance is needed in principle, explaining “what it does – and the reason for the guidance – is it means hospitals can no longer say [to vendors] ‘NHS Digital told us not to’,” also claiming NHS Digital grew weary of having to repeatedly clear up the misunderstanding on an individual basis.

But Smith does not believe the guidance will make a difference, and, despite the clarity it provides, sees this as NHS Digital having “dumped [responsibility] back on the hospitals”, while criticising the organisation for failing to introduce a set of minimum standards for the products available.

NHS ‘can be, and will be attacked’

Cloud transformation has been central to the digital transformation strategy of many organisations, including swathes of the public sector, with the government issuing its own cloud-first guidance last year.

NHS Digital says the benefits extend beyond cost-saving to being able to develop, test and deploy services quicker, without large initial capital expenditure, as well as a better scope for data interoperability.

Flintoff said a cloud-first strategy has led to greater efficiencies in his own “heavily technical, development-orientated” organisation, which is undergoing transformation on a service-by-service basis and adopting pay-as-you-go services, such as SQL-as-a-service, to ensure best value.

But for an organisation as large and fragmented as the NHS, an IT project of this scale poses huge challenges, with the health service keen to forget a string of failed efforts in the past; the most notable example being the care.data scheme abandoned in 2016.

Security, meanwhile, is an equally pressing concern. NHS Digital assured Trusts they could safely locate health data, including patient records, in the public cloud, but a string of reports have underlined security risks. For instance, 100GB of secret NSA data was found exposed on a misconfigured Amazon Web Services’ (AWS) S3 bucket in late 2017, among a host of other high-profile leaks including those at FedEx and Accenture.
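Exposures like the S3 example are detectable with a short audit script. Below is a minimal sketch using boto3, assuming AWS credentials are already configured; a production audit would also check bucket policies and public access block settings:

```python
import boto3

# ACL grantee URIs that mean "the public" or "any AWS account".
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_buckets():
    """Return (bucket, permission) pairs whose ACLs grant public access."""
    s3 = boto3.client("s3")
    exposed = []
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        for grant in acl["Grants"]:
            if grant["Grantee"].get("URI") in PUBLIC_GROUPS:
                exposed.append((bucket["Name"], grant["Permission"]))
    return exposed

for name, perm in public_buckets():
    print(f"PUBLIC: {name} grants {perm}")
```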

“The days of blithely assuming that an IT system can be made totally secure are gone,” says Dr Paul Miller, senior analyst at Forrester.

“It is far more realistic to assume that any IT system can be attacked and will be attacked, and the emphasis should, therefore, be on detecting those attacks, defending against the vast majority of those attacks, and mitigating the impact of any attack that does get past the initial set of defences.”

medConfidential’s Smith adds that while the chances of suffering a breach aren’t necessarily higher in the cloud, the consequences of any breaches are likely to be far more severe in nature if they occur on public cloud infrastructure as opposed to within the NHS firewall – particularly if Trusts simply opt for the cheapest option.

In light of these “additional challenges”, the NHS is deploying tools like privileged access management and two-factor authentication to bolster security.

“The fundamentals of what we’ve done with IT for the last 20 years haven’t changed,” says NHS Digital’s Flintoff. “They don’t change because we go to the cloud, we might just have to approach them differently because we do that”.

Third-party access to patient data

For others, meanwhile, one of the biggest concerns centres on privacy, and whether third-party organisations, such as large tech companies, may have access to sensitive patient data once it’s put into the cloud.

“These people are here to make money, if not today, tomorrow, and you need to understand very clearly what their business model is if you are going to be part of that,” said Javier Ruiz Diaz, a director at Open Rights Group. “For the NHS it would be highly irresponsible not to be asking those questions right now”.

He cites arrangements with DeepMind in which hospitals, most notably the Royal Free in London, were slammed for granting Google access to patient data without consent.

Provided additional concerns – such as ensuring Trusts do not become locked in to one provider and retaining ownership of public data – are dealt with, he believes the benefits could be substantial.

But these benefits, which include “better ICT, more innovative, and faster development”, can only be reaped on the condition it’s “done properly and it’s not done with short-sightedness” or as a money saving exercise.

“If you say you need to do cloud just because you’re saving money – that in itself is very, very short-sighted, because there are costs,” adds Ruiz. “You are bringing risks, so to just say you are going to save money is a false economy, because in the long term you are going to lose a lot more.”

‘It makes no sense to train a new army of experts’

McAfee’s research also found a quarter of organisations cited a lack of staff with skills to manage security for cloud applications as a key challenge, with only 24% reporting they suffered no skills shortage. Significantly, 40% of IT leaders reported a skills shortage was slowing their organisation’s cloud adoption.

The NHS is suffering a staffing shortage in clinical areas, let alone in the technical skills required to maintain an IT project of this scale. NHS Digital itself only has “18 to 20 deeply technically skilled people”, according to a House of Commons report into the WannaCry attack that crippled the health system last year.

The report highlighted the struggle faced by NHS organisations trying to recruit and retain skilled cyber security staff in the midst of a national shortage, in a landscape where the private sector can pay far more than the health service to attract talent.

“Teams within the NHS already have skills and experience in the relevant areas, but there aren’t enough of them,” says Dr Miller. “It doesn’t make sense for the NHS to train or hire a new army of cloud experts for the migration: there are systems integrators, consultancies, and other partners ready and willing to provide those services to the NHS. But, equally, the NHS should not outsource the whole problem to a third party.”

He called for an increase in the size of, and funding for, internal IT and digital capabilities, given that the NHS needs people with the skills and expertise to understand the migration, and to continue asking the right questions about what its partners may be proposing. Above all, this transition must be designed, shaped, and led by NHS employees.

“The work must also be done within the broader umbrella of a digital strategy for the NHS. This isn’t just about moving from one server to another, or upgrading one application to its cloud-based equivalent,” he adds.

“The real opportunity here is to think about what a digital NHS should look like, how a digital NHS can help NHS staff be more efficient, informed, and empowered, and how a digital NHS can improve interaction with and care for patients.”

Image: Shutterstock

Is it time to dump Microsoft Office?


Stuart Andrews

12 Jun, 2018

No-one likes a stealth tax, yet that’s exactly what some people accuse Microsoft of implementing. While we wouldn’t want to go back to the days of paying £400 for a single-user licence of Office desktop software, the monthly drip of £6 to £13 per month starts to add up. So is Microsoft delivering real value for money or are we all mugs?

Make no mistake, Office 365 is a crucial part of Microsoft’s drive towards a cloud-first future. And with it, profitability: the company made a $21 billion profit in 2017, compared to $12 billion in 2015, the year Office 2016 was released. Through aggressive pricing, marketing and a steady depreciation of the standalone versions, Redmond has done everything in its power to make Office 365 the standard way to buy Office.

Sadly for Microsoft, not everyone is keen to jump on board.

The sales pitch for Office 365 has always focused on shifting away from the big releases to a continually evolving office suite, with new features rolling out on an almost monthly basis. Yet, you don’t have to be a cynic to suggest that Office 365 still looks and feels an awful lot like the version that launched three years ago, which closely resembled the one that emerged back in 2012.

Are we paying a subscription for software that’s constantly improving, its incremental improvements being overlooked because we’ve forgotten what we started with a few years ago? Or are we merely paying a monthly retainer for the same old Office? To find out, we’re going to delve into Office’s features, focusing on those that have rolled out since July 2015, when Office 2016 hit the shelves.

What’s changed in Office?

If you’re a casual Office user making light use of basic features, or even an old-school power user with an established way of working, you might agree that Office hasn’t changed noticeably in the past three years. In terms of the basic look and feel of its core features, Office 2016 wasn’t a huge leap forwards from Office 2013, with much of the focus on collaborative editing and teamwork tools, alongside closer integration with OneDrive and Skype.

Moreover, many of the post-2015 enhancements centre on current Microsoft preoccupations, which may or may not interest you. Many focus on the pen and ink tools being pushed on the company’s Surface devices, or on support for the 3D content tools that came with the Windows 10 Creators Update.

For instance, a new customisable, portable pen set can be used across all Office apps on all devices, while new ink and pencil effects give you fresh options for annotations, notes and plans. You can use a pen to select and change objects in Word, PowerPoint and Excel, or sketch out rough squares, circles and blobs before converting them into shapes in Word. These features might be game changers if you’ve embraced the stylus, but for those of us working on a normal desktop or laptop, they’re almost irrelevant.

As for 3D content, it’s interesting that all the major applications now support 3D models, allowing you to pull one into a document then resize and rotate it to your heart’s content. But for many business users, the lack of relevant content – or resources and desire to create their own – is a real sticking point. There’s good news for 2D artwork: the Remove Background feature gives you Photoshop-style tools that can remove a plain background in a matter of seconds.

Other changes simply ensure that features apply more consistently throughout the suite. For example, real-time collaboration features, where you can see a document updating, character by character, as another editor works on it, were restricted to Word in the initial release. Now they’re there in Excel and PowerPoint too, though you need to share the files through OneDrive or SharePoint to benefit.

Word

Let’s start with Word. One of Office 2016’s biggest strengths has become its use of Microsoft’s AI and machine learning to get you started on common Office tasks or enhance the quality of your work.

Take Word’s Editor pane, for example, or the new Researcher tool. Call the latter up from the References tab, type in a subject, and Bing will go away and search for sources. From these you can pull out notes and even quotes, with Researcher tracking citations and adding them automatically to the document’s bibliography.

Word’s Editor tool is far more than a replacement spellchecker – used wisely, it can genuinely improve your use of words

You may prefer working independently through your browser, or you might trust Google to deliver stronger sources, but Researcher can be great for getting a head start on a topic or searching for a relevant snippet of info to support a key point. It’s arguably of most use to students or journalists, but if you spend any time trying to pull notes together for a report or meeting, having a built-in tool that tracks sources and citations can be a real time-saver.

Right next to Researcher, you’ll find the new Smart Lookup, again powered by Bing. Apply it to a word or phrase and you’re presented with not only a definition, but more in-depth explanations from a range of different sources, along with more general web search results. Smart Lookup isn’t always all that smart, however. I looked up “fiesta”, in the context of festivities, and was shown content around Ford’s small car and a US grocery chain.

Type in a subject and Researcher will bring up a list of sources, making it easy to get to grips with a topic

Other new features might not set the world alight. You can page through longer documents like a book instead of continuously scrolling through them – great on a big desktop screen, if almost useless on a laptop.

You can also add a character count to the status bar, or view and restore changes in shared documents without leaving Word. Smart Quotes have been improved to work more accurately around punctuation.

Meanwhile, the new Translator for Office 365 feature is basically a replacement for the old Mini Translator window. On the plus side, it’s a more effective tool, handling longer passages and producing reasonable working translations that make almost perfect sense.

PowerPoint

PowerPoint hasn’t sat still for the past few years, either, with the most useful addition being Designer. While no replacement for a real designer or a strong set of corporate templates, its Design Ideas can transform a deck of slides into something that looks professional.

If you’re pressed for time, the new QuickStarter template could be tempting. Just type in a topic and this intelligent tool goes to work with the help of Bing, asking you to pick from a selection of visual treatments before coming back with a suggested structure, relevant facts to get you started and even ideas for other areas or points for further research. It’s a feature that should play well with Office 365 Home and Personal subscribers, but is it useful in a business context? Possibly not. It’s one of those tools that hints at a future where intelligent assistants dig out ideas and insights to improve your productivity, but at the moment it’s more suited to the classroom than the boardroom.

Smart Lookup provides useful reference material – and the ability to import 3D models

Other additions are smaller, but potentially more useful. The new Morph transition allows you to duplicate a slide and move or add elements, then morph between the original and the copy with all the elements shifting simultaneously. It’s a classic maximum impact, minimum effort effect. The same is true of the new Zoom feature. Insert a zoom into a slide, select the slides or sections you want included and PowerPoint will flick from one to the next with a sweeping zoom in, zoom out animation.

Excel

If Word and PowerPoint boast eye-catching new features, Excel’s enhancements are less immediate. Few of us thrill to the sound of faster opening of complex documents, improved autocomplete or a more flexible copy feature, but all improve basic usability, albeit in ways that you might not notice.

Excel’s map chart tool allows you to compare data – for example, population density – using maps gleaned from Bing

Other improvements make more of a difference when you’re dealing with large or complex datasets in research or enterprise. Over the past two years, Microsoft has steadily drip-fed out additions to the Query Editor, such as new transformations for Adding Columns by Example or splitting and grouping columns to manipulate their data. Again, these features rely on Microsoft’s algorithms to get Excel to handle the grunt work, leaving you to dig further or refine. When you do come up with something interesting, closer integration with Power BI makes it easier to share queries or insights with colleagues.

Of course, not every Excel user ever touches the Query Editor, let alone uses Power BI, but if you do then the experience should have improved.

Outlook

You could argue that, with Outlook, Microsoft is stealing Google’s tricks. The new Focused Inbox view is one example, borrowing from Google’s Inbox, but Outlook has also pinched Gmail’s idea of sucking information out of your incoming emails and using it to create reminders or events.

Gmail users will know that this is particularly useful for meetings and travel arrangements, and Outlook does a reasonable job of putting appointments, flights and hotel reservations on your schedule where they’ll be more accessible, though Microsoft’s assistant isn’t quite as smart as Google’s when it comes to spotting and capturing the vital info.

Outlook’s new, Google-style Focused Inbox view makes it easy to quickly see your most important emails via a tab

Vanilla Outlook 2016 introduced a move away from sending files via email to leaving those files in your OneDrive cloud and sending permissions to share, view, download and edit. More recent changes have made this more straightforward, by allowing you to drag and drop cloud-stored attachments as if they were attached to the email. You can also set permissions on these files, ensuring that people can’t edit and reuse them if you only want them to view.

Finally, sales teams or smaller businesses shouldn’t underestimate the Office 365-exclusive Outlook Customer Manager add-in. This enables you to set up companies, contacts, events and deals within a customer-centric view, so that you can view touchpoints, conversations, meetings and opportunities company by company, with all the relevant data close to hand. It transforms Outlook into something a little more like a customer relationship management tool; one that’s tracking your email conversations and calendar events to give you a bigger-picture view.

Just be aware, however, that it takes a while to set up and start using, and even more time before it starts getting to grips with your data and throwing up useful information.

The verdict

So, are you really benefiting from that Office 365 subscription?

Based on the incremental rollout of new features, it depends. None of the new features are what you might call game-changers. None seriously impact everyday Office workflows or are likely to transform the way you do your job. In fact, if you use Office the way many of us do, working with established tools and templates, you might not even be aware that these new features exist.

In fairness, some are genuinely useful. Word’s Editor is an improvement on the old Spelling and Grammar check when you’re giving work a read-through. PowerPoint’s Designer can help non-designers produce better-looking slides.

Office 365’s great strength is its range of bundled services, but do you use them?

Outlook’s Focused Inbox is a better way to organise your mail. And while Excel’s enhancements are more specialist, they provide similar shortcuts for data analysis. Most of all, these tools point to Microsoft’s wider vision of an Office where intelligence and automation make it easier to get real work done.

What’s there right now might not be enough for those paying monthly or annual subscriptions and wondering why Office isn’t evolving faster, in which case the question is how much value you’re getting from the other benefits: the Outlook.com email, the OneDrive storage or SharePoint, Teams and Yammer in the Business Premium version. If you or your company use them, then it’s hard to argue with the package on value for money. If not, Microsoft still has a way to go before it convinces us that the “as a service” model is really paying off.

Let’s be clear: as a bundle of services Office 365 is an excellent deal, but if you’re only paying to use the core Office apps of Word, Excel and PowerPoint, it might be time for a rethink.

Image: Shutterstock

IBM and CA Technologies team up to help give mainframe a cloudy edge

They are two companies who have spent more time espousing the benefits of the mainframe than most – and now IBM and CA Technologies have forged a new partnership designed to make the technology more accessible in a cloud and DevOps landscape.

The collaboration will see new services launched to help organisations develop, test, and monitor applications on the mainframe. The services will be available on IBM’s Cloud Managed Services on zSystems, otherwise known as zCloud.

Clients using zCloud can now take advantage of four new services: CA Brightside, which aids in developing applications for the mainframe using existing open source tools and frameworks; CA Service Virtualization, which focuses on rapidly testing and modifying applications already in place; CA Mainframe Operational Intelligence, for monitoring applications in the cloud; and CA Data Content Discovery for data protection to assist with compliance regulations.

According to a blog post from the two companies, almost 80% of all enterprise data is managed on the mainframe, while more than 90% of the world’s top 100 banks house data there. While many column inches over the years have opined on the death of the mainframe, many others have issued a rebuttal. The IBM and CA partnership can certainly be put in the latter category.

“Through this strategic relationship, the companies are architecting new and innovative ways to enable clients to develop, run and manage mainframe applications in the cloud,” wrote Phil Guido, IBM general manager of GTS infrastructure services and Greg Lotko, CA general manager of mainframe systems. “Our clients can now quickly access development, testing, application management and regulatory compliance services to deliver operational resiliency, efficiencies, and workforce agility that their businesses require.”

As is expected with such announcements, a customer or two gets rolled out to reveal the benefits of the collaboration. In this instance, Anthem, a US healthcare provider, was chosen, with Tim Skeen, Anthem CIO, saying the new service ‘demonstrates a real possibility for us to deliver new innovations and features faster and securely through a true cloud experience for the mainframe.’

Writing for this publication in 2016, Christopher O’Malley, CEO of Compuware – admittedly no friend of CA or IBM – discussed the importance of breaking down barriers in order to integrate the mainframe into mainstream DevOps and multi-platform IT environments.

“In order to encourage their acceptance of it, agile developers working in modern multi-platform environments need to realise that DevOps teams are only as strong as their weakest link,” wrote O’Malley. “Since mainframe code provides a huge chunk of the digital DNA that defines how the business runs, it’s impossible to turn DevOps into a true competitive advantage if that one mainframe element sits in isolation.

“It’s therefore in everyone’s best interests to bring the mainframe into the fold of mainstream IT.”

Picture credit: IBM