Dell Virtustream gains certified cloud provider status for Australia


Clare Hopping

1 Jun, 2018

The Australian government has presented Dell Virtustream with a place on the Australian Signals Directorate’s (ASD) Certified Cloud Services List (CCSL), meaning it has now been granted permission to host Unclassified Dissemination Limiting Marker (DLM) government information on its cloud service.

The cloud business joins a number of other tech providers already on the list, including AWS, IBM, Salesforce, ServiceNow, Sliced Tech and Vault Systems.

However, Dell Virtustream’s Unclassified DLM classification is the second level on the list. Only Dimension Data, Macquarie Government, Microsoft’s Azure and Office 365, Sliced Tech and Vault Systems have “Protected” status, the highest level of accreditation available.

Part of the specification for gaining protected-level status is that data stored within the cloud services is only available to employees in Australia. However, Microsoft’s Azure Cloud service does allow for the transfer of information to other countries.

However, Australia’s Cyber Coordinator Alastair MacGibbon has since reassured doubters that the service does tick all the boxes and that, as per the rules, no data will be made available outside the country (although he was non-committal when asked whether the ASD requires providers to be based in Australia to be CCSL approved).

“Data can reside anywhere in the world, you can demand data stay in Australia but it doesn’t always make it more secure that it’s in a particular geography,” he said.

“It’s good that we hold data in Australia, that means that data comes under Australian law, that means that agencies and others have more access to it and other countries’ agencies theoretically don’t have access to that data.”

Three ways machine learning is revolutionising zero trust security

Bottom line: Zero Trust Security (ZTS) starts with Next-Gen Access (NGA). Capitalizing on machine learning technology to enable NGA is essential in achieving user adoption, scalability, and agility in securing applications, devices, endpoints, and infrastructure.

How next-gen access and machine learning enable zero trust security

Zero Trust Security provides digital businesses with the security strategy they need to keep growing by scaling across each new perimeter and endpoint created as a result of growth. ZTS in the context of Next-Gen Access is built on four main pillars: (1) verify the user, (2) validate their device, (3) limit access and privilege, and (4) learn and adapt. The fourth pillar relies heavily on machine learning to discover risky user behavior and apply conditional access without impacting user experience, by looking for contextual and behavioral patterns in access data.

As ZTS assumes that untrusted users or actors already exist both inside and outside the network, machine learning provides NGA with the capability to assess data about users, their devices, and behavior to allow access, block access, or enforce additional authentication. With machine learning, policies and user profiles can be adjusted automatically and in real time. Beyond delivering dashboards and alerts, NGA enabled by machine learning can respond to threats in real time based on risk scores, which is very effective in thwarting breaches before they start.
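To make the allow/block/step-up decision concrete, below is a minimal sketch of a risk-score-driven access check. It is illustrative only: the `AccessRequest` fields, weights and thresholds are invented placeholders, where a real NGA engine would learn them from access data rather than hard-code them.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- a real system would tune these from data.
BLOCK_THRESHOLD = 0.8
MFA_THRESHOLD = 0.5

@dataclass
class AccessRequest:
    new_device: bool        # endpoint not previously seen for this user
    unusual_location: bool  # geolocation outside the user's normal pattern
    unusual_hour: bool      # access outside typical working hours
    privilege_change: bool  # recent, unusual privilege change

def risk_score(req: AccessRequest) -> float:
    """Combine contextual signals into a score in [0, 1].

    The fixed weights are placeholders; in practice a trained model
    would produce the score.
    """
    weights = {
        "new_device": 0.3,
        "unusual_location": 0.3,
        "unusual_hour": 0.2,
        "privilege_change": 0.2,
    }
    signals = vars(req)
    return sum(w for name, w in weights.items() if signals[name])

def decide(req: AccessRequest) -> str:
    """Map the score to the three outcomes described above."""
    score = risk_score(req)
    if score >= BLOCK_THRESHOLD:
        return "block"
    if score >= MFA_THRESHOLD:
        return "step-up-auth"  # e.g. prompt for a second factor
    return "allow"

# Example: known device, but a new location at an odd hour.
print(decide(AccessRequest(False, True, True, False)))  # step-up-auth
```

Lowering the MFA threshold trades convenience for caution; the point of the machine learning layer is to tune that trade-off automatically rather than by hand.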

Building NGA apps on machine learning technology yields the benefits of being non-intrusive, supporting the productivity of workforce and business partners, and ultimately allowing digital businesses to grow without interruption. For example, Centrify’s rapid advances in machine learning and Next-Gen Access to enable ZTS strategies make this company one of the most interesting to watch in enterprise security.

The following are three ways machine learning is revolutionizing Zero Trust Security:

  • Machine learning enables enterprises to adopt a risk-based security strategy that can flex with their business as it grows. Many digital businesses have realized that “risk is security’s new compliance,” and therefore are implementing a risk-driven rather than a compliance-driven approach. Machine learning technology assesses user, device, and behavioral data for each access request to derive a real-time risk score. This risk score can then be used to determine whether to allow access, block access, or step up authentication. In evaluating each access request, machine learning engines process multiple factors, including the location of the access attempt, browser type, operating system, endpoint device status, user attributes, time of day, and unusual recent privilege changes. Machine learning algorithms are also scaling to take into account unusual commands run, unusual resource access histories, unusual accounts used, unusual privileges requested and used, and more. This approach helps thwart compromised credential attacks, which make up 81% of all hacking-related data breaches, according to Verizon.
  • Machine learning makes it possible to accomplish security policy alignment at scale. To keep pace with a growing digital business’ need to flex and scale to support new business models, machine learning also assists in automatically adjusting user profiles and access policies based on behavioral patterns. By doing so, the need for IT staffers to review and adjust policies vanishes, freeing them up to focus on things that will grow the business faster and more profitably. End users, meanwhile, are no longer burdened with step-up authentication once previously abnormal behavior is recognized as now-typical behavior and both the user profile and policies are updated accordingly (a toy sketch of this kind of baseline adaptation follows this list).
  • Machine learning brings greater contextual intelligence into authentication, streamlining the experience and increasing user adoption. Ultimately, the best security is transparent and non-intrusive, which is where risk-based authentication and machine learning technology come into play. The main impediment to adoption of multi-factor authentication (MFA) has been the perceived impact on the productivity and agility of end users. A recent study by Dow Jones Customer Intelligence and Centrify revealed that 62% of CEOs state that MFA is difficult to manage and is not user-friendly, while only 41% of technical officers (CIOs, CTOs, and CISOs) agree with this assessment. For example, having to manually type in a code that has been transmitted via SMS, in addition to the already supplied username and password, is often seen as cumbersome. Technology advancements are removing some of these objections by offering a more user-friendly experience, such as eliminating the need to manually enter a one-time password on the endpoint by enabling the user to simply click a button on their smartphone. Nonetheless, some users still express frustration with this additional step, even if it is relatively quick and simple. To overcome these remaining barriers to adoption, machine learning technology helps minimize exposure to step-up authentication over time, as the engine learns and adapts to behavioral patterns.
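As a toy illustration of the “learn and adapt” behaviour mentioned in the second point above – a profile that updates itself so once-abnormal behaviour eventually stops triggering step-up prompts – the sketch below keeps a running baseline of a single signal, login hour, and flags outliers. The statistics and the z-score threshold are assumptions made for illustration, not any vendor’s actual model.

```python
import math

class LoginHourProfile:
    """Toy behavioural baseline: running mean/std of login hours.

    Real NGA engines model many signals at once; this tracks one signal
    to show how a profile can adapt without manual policy review.
    """

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations (Welford's algorithm)

    def observe(self, hour: float) -> None:
        """Fold a new observation into the baseline (learn and adapt)."""
        self.n += 1
        delta = hour - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (hour - self.mean)

    def is_anomalous(self, hour: float, z_threshold: float = 2.0) -> bool:
        if self.n < 5:  # too little history to judge: don't flag
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        if std == 0:
            return hour != self.mean
        return abs(hour - self.mean) / std > z_threshold

profile = LoginHourProfile()
for h in [9, 9.5, 10, 9, 8.5, 9, 10]:  # typical office-hours logins
    profile.observe(h)

print(profile.is_anomalous(3))  # True: a 3am login is far from baseline
print(profile.is_anomalous(9))  # False: nothing unusual
# If 3am logins kept recurring and were observed, the baseline would
# shift and the step-up prompts would fade away on their own.
```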

Conclusion

Zero Trust Security through the power of Next-Gen Access is allowing digital businesses to continue on their path of growth while safeguarding their patented ideas and intellectual property. Relying on machine learning technology for Next-Gen Access results in real-time security, making it possible to identify high-risk events and ultimately greatly reducing the effort required to identify threats across today’s hybrid IT environment.

Microsoft is now more valuable than Google


Vaughn Highfield

31 May, 2018

Microsoft is now more valuable than Google and its listed parent company, Alphabet. For the first time since 2015, the Redmond-based technology company overtook the behemoth that is Google to become the third most valuable company in the world.

Valued at $753 billion (£566 billion), Microsoft sits just ahead of Alphabet’s $739 billion (£556 billion) valuation. Microsoft and Google have been trading places in the rankings since Google first surpassed the company in 2012, but this decisive gain shows that Microsoft CEO Satya Nadella has really managed to change the company’s image and turn its fortunes around.

Since taking over four years ago, Nadella has helped more than double Microsoft’s stock price, and the business has gone from being a 40-plus-year-old company to a modern tech icon. By focusing on product categories like AI and cloud computing, while simultaneously axing failing divisions like Windows Phone, Microsoft has successfully modernised. The latest ranking shift goes to show that these rather drastic moves away from Windows as its core product have clearly worked.

Microsoft is also hot on the heels of Amazon, the second largest company in the world, which sits at $782 billion (£588 billion). At the top of the pack is Apple – which, with a market valuation of $923 billion (£694 billion), isn’t going anywhere soon. Interestingly, though, Microsoft arguably has a broader portfolio than Apple, and that could well work to its advantage. As The Verge points out, Google generates around 90% of its revenue directly from advertising, while 60% of Apple’s entire revenue is attributable to iPhone sales – which could see billions wiped from its valuation if sales slow.

Microsoft, however, has grown much of its business through hardware, software and services. In its latest earnings report, Microsoft’s Windows, Surface and Xbox divisions chalked up around 35% of its revenue, with cloud services clocking in at around 30% and Office and productivity solutions at another 30%.

So where next for Microsoft? Morgan Stanley believes that Microsoft will be one of the first companies to hit a $1 trillion valuation, within the space of a year, thanks to the growth of cloud services. If this is the case, Apple certainly has something to be worried about.

Picture: Bigstock

How to Switch Between Mac and Windows on Parallels Desktop

Let’s be honest: to the average person, the idea of running two different operating systems at the same time on one computer is pretty weird. This idea naturally leads to questions like these: How do I know which one I’m using at any one moment? How do I switch between them? Which applications do I […]


Why it’s time for manufacturers to take security in the cloud seriously

Manufacturers deal with sensitive data every day. This includes test and quality data, warranty information, device history records and, above all, the highly confidential engineering specifications for a product. Trusting that data to a cloud-based application or cloud services provider is a major step, and manufacturers need to fully educate themselves about the security risks and advantages of cloud-based software.

As we prepare to enter the second half of 2018, consider the following three questions as your guide when discussing application infrastructure and operations with cloud providers.

Question #1: How do you keep my data safe?

The answer should be long and multi-faceted. Because no single tool will defend against every kind of attack in any network, cloud providers must deploy multiple layers of defense using: internal systems; protection provided by tier 1 cloud platforms; and security service providers. All of these elements come together to provide complete protection.

Below are some examples of these layers:

  • Physical defence: Cloud platform providers can and should exercise tight control over access to the physical devices on which the software systems reside. In best-case scenarios, independent auditors attest to the safety of this access. This control and documentation must be reviewed on a regular basis.
  • Barriers to entry: Firewalls built into the cloud service can limit access to the ports managed by the application; unneeded ports should be blocked so that they cannot be accessed.
  • Application password protection: The best-designed cloud applications allow your organisation’s identity management system to provide authentication and password management, limiting access to your data and following your internal security policies. This should also support two-factor authentication if your internal policies require it. Some of the more advanced systems can also provide an identity management service as an alternative to your internal solutions, if required.
  • Application firewalls: Most enterprise-class application designs will include a Web Application Firewall service that uses the latest technology to defend against such things as denial of service attacks and other types of malicious access.
  • Activity monitoring: State-of-the-art cloud platform providers continuously monitor for suspicious activity that could be the result of hacking or malware. Again, in best case scenarios, warnings are sent automatically and steps taken to protect the data and the integrity of the platform.
  • Malware monitoring: Both the application provider and the hosting platform provider must run active checks for malicious code to ensure each piece of code that is executed matches the published signature for that code. Be warned: this is a step many providers have not yet taken.
  • Code standards: Good security starts with good code. Security standards must be included in the system development life cycle, governing every aspect of the system. Be sure to review the code standards of the application developer.
  • Third party code scanning: The most advanced application providers use a third-party firm to scan code looking for opportunities to improve security and look for known vulnerabilities with each new version of the application. Ask for details about this, as there are many different levels of scanning available; a once-a-year scan is obviously not as valuable as regularly scheduled scans before each new release of software.
  • Data encryption: Generally accepted practices for data encryption provide different options for data in different modes: data in transit (being communicated within the system or between the database and your user interface) and data at rest (data that resides within the database and is not currently being accessed).

Data in transit can be encrypted using industry-standard encryption through the browser. Additionally, APIs that access the data should encrypt their payloads and include encrypted tokens to strengthen access control.
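As a minimal sketch of that pattern, the snippet below creates a short-lived signed token (a signed JWT standing in for the “encrypted token” above) and presents it over HTTPS; the `requests` library verifies the server’s TLS certificate by default, so payload and token travel over an encrypted channel. The endpoint URL, claims and signing key are hypothetical.

```python
import time

import jwt       # PyJWT
import requests

SIGNING_KEY = "replace-with-a-real-secret"             # hypothetical secret
API_URL = "https://api.example.com/v1/device-history"  # hypothetical endpoint

# Short-lived token; the server checks the signature and expiry.
token = jwt.encode(
    {"sub": "factory-42", "exp": int(time.time()) + 300},
    SIGNING_KEY,
    algorithm="HS256",
)

# TLS certificate verification is on by default in requests, so the
# request and the bearer token are encrypted in transit.
response = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"})
response.raise_for_status()
print(response.json())
```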

Encryption of data at rest protects against data being accessed outside the application’s control. As physical access to the system is protected and the data sits in password-protected databases, at-rest encryption may not be essential for every customer – but the question is still worth asking.
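For illustration, here is one way application-level encryption at rest can look, using the Fernet recipe from Python’s `cryptography` library. Key management – in practice a cloud KMS or an HSM – is deliberately glossed over, and the sample record is invented.

```python
from cryptography.fernet import Fernet

# In production the key comes from a KMS or HSM, never from source code.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive field before it is written to the database...
spec = b"engineering spec rev 7: tolerances +/- 0.01mm"
stored_value = fernet.encrypt(spec)

# ...and decrypt it only inside the application, after access control.
assert fernet.decrypt(stored_value) == spec
print(stored_value[:20])  # the ciphertext is all an intruder would see
```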

Question #2: How do I know that my data can’t be accessed by other customers?

There are many ways to ask this question:

  • Do you mix my data with other companies’ data?
  • Can other people see my data?
  • What’s your data structure for each customer?

The answer to each of these is data separation. The system architecture should ensure separation of customer data by customer organization, usually by individual factory or site. This allows even customer administrative tasks, such as assigning roles, to be limited in scope. While many applications are multi-tenant (meaning the application is shared across multiple customers), transactional data should still be separated by customer factory, so there is no commingling of customer data: your data is kept apart from every other customer’s.
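A minimal sketch of what that separation can look like at the query layer: every read goes through a helper that filters on the caller’s own factory ID, so no unscoped query path exists. The table and column names here are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test_records (factory_id TEXT, result TEXT)")
conn.executemany(
    "INSERT INTO test_records VALUES (?, ?)",
    [("factory-a", "pass"), ("factory-a", "fail"), ("factory-b", "pass")],
)

def records_for(factory_id: str) -> list:
    """All reads are scoped to a single tenant's factory."""
    cur = conn.execute(
        "SELECT result FROM test_records WHERE factory_id = ?",
        (factory_id,),
    )
    return [row[0] for row in cur]

# Factory A's administrator never sees Factory B's data, and vice versa.
print(records_for("factory-a"))  # ['pass', 'fail']
print(records_for("factory-b"))  # ['pass']
```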

Question #3: How does cloud security compare to on-premise security?

There is a common misperception that a set of servers running on-premise at a corporate office is more secure than a cloud-based application. Owning the hardware and software often gives a false sense of security; most on-premise systems fall far short of the security that the best cloud providers have deployed.

For example, the cloud storage system utilized by my company was designed for 99.999999999% durability and up to 99.99% availability of objects over a given year. That design and those numbers are virtually impossible to duplicate with an on-premise solution. In addition, the comprehensive access control described above is nearly impossible to duplicate on-premise. To deploy tools like these in an on-premise environment would require not only large investments in infrastructure, but large teams to manage them too.

Ask yourself:

  • How big is your security team?
  • How much is your budget for security around your manufacturing data?

Then remember, the best application providers and data centers have large, dedicated security teams who have implemented automated threat monitoring systems that operate 24×7. In the end, the best cloud software companies have dedicated more time, resources and budget to securing our systems than most organizations are able to provide themselves.

Office 365 usage goes up and up leaving G Suite behind, says research

Microsoft Office 365 usage continues to accelerate significantly across organisations of all sizes while Google’s G Suite languishes in comparison, according to new figures from Bitglass.

The cloud access security broker (CASB), in its 2018 Cloud Adoption Report, analysed software usage of more than 135,000 companies globally and found Office 365 continues to rule the roost.

Of those companies, 56.3% were Office 365 users, compared with only a quarter (24.8%) for G Suite. The latter has actually decreased – admittedly from 24.9% – compared with two years ago, while Office 365 uptake in 2016 was at 34.3%.

In terms of Office 365 and G Suite shops by size, the trend for larger firms to go with Microsoft remains apparent. Regular readers of this publication may remember a 2015 survey from BetterCloud which found that surveyed companies running Office 365 had IT teams on average five times the size of their Google counterparts.

Bitglass comes to a similar conclusion. Just under half (49.6%) of companies assessed with fewer than 500 employees say they are Office 365 customers, compared with 73.4% for 500-1000 and 73.7% for more than 1000. For G Suite – 24.1%, 24.3% and 25.8% respectively – the figures show little deviation.

Looking across all organisations, 13.8% of companies worldwide are using AWS, with technology (21.5%), education (19.7%) and media (15.3%) firms ahead of the global trend. For larger firms the figures are even more stark; 22.1% of companies with more than 1000 employees use AWS in some capacity, compared with 15.8% for organisations at the 500-1000 range, and 10.8% for those smaller.

For other apps analysed, the general trend is of gradually greater adoption the larger the organisation. More than half of organisations with at least 500 employees are Slack users, compared with 37.8% of smaller businesses. Box is used by 28% of the largest organisations polled compared with 12.7% of companies with fewer than 500 employees, while Salesforce (18.3% and 8.8% respectively) shows the same trend.

The dominance of Office 365 is evidently something those at Google have been trying to address. Last month, the company announced the launch of Google One, a premium tier cloud storage offering focused on replacing paid consumer Google Drive plans.

The most interesting aspect, however, was the change in price for 2TB of storage – half of what competitors such as Dropbox and Microsoft charge. Even so, because Microsoft bundles storage in with Office 365 subscriptions, this remains a hurdle for Google, albeit not an insurmountable one.

Rich Campagna, chief marketing officer at Bitglass, said it was ‘no surprise’ that overall cloud adoption continues to skyrocket. “Organisations worldwide have come to trust platforms like Office 365 and AWS as vendors continue to bolster security and feature sets,” he said.

“Competition between major public cloud players such as Amazon, Google and Microsoft will only increase as they fight to grow market share,” Campagna added. “It remains to be seen which emerging apps will join them to become staples in the enterprise.”

You can find out more and download the report here (email required).

What Google knows about you


Jonathan Parkyn

5 Jun, 2018

It’s much more than just a search engine these days, but the data Google gleans from its users’ search history and other activities is still central to the company’s continued success.

Collecting and using other people’s information is Google’s bread and butter, providing it with the ultimate advertising commodity – the ability to target specific people – and effectively funding the many ‘free’ services the company offers.

In its privacy policy, Google says “we use the information we collect from all of our services to provide, maintain, protect and improve them, to develop new ones and to protect Google and our users”.

But that’s only half the story. It’s also using your data to boost the effectiveness of its own business. The more user data it has, the more accurately it can target adverts – and the more powerful it becomes.

Google has tried to address privacy concerns by providing more transparency on how and why it uses people’s data, and by inviting users to view and control this information. The trouble is, there are dozens of different settings scattered around various web pages and devices. In this section, we’ll point you straight at the Google settings you need to change.

Your search and web activity

You probably won’t be too shocked by the fact that Google stores and uses data from your web searches. Even so, viewing everything the company has recorded about your web activities (by signing in with your Google account) is quite an eye-opener.

Your activity is shown as a vertical timeline and, depending on how many Google-related tools, services and devices you use, you could be presented with a list of not only every Google search you’ve performed, but also every site you’ve visited in Chrome, every route you’ve planned in Google Maps, every Android app you’ve ever used and more besides, all stretching back years.

Erase your Google search history by choosing the date range, then clicking delete

Thankfully, My Activity lets you control how much of this data Google stores. You can search for specific items or scroll back through your history to find something you want to delete. Click an item for more details, then click the three vertical dots button in the pop-up window and click Delete, then Delete again to remove the item from your history.

Alternatively, you can delete data in bulk by clicking the three vertical dots button in the top right-hand corner of the main page and selecting ‘Delete activity by’. Here, you can select the date range and the Google service (Search, for example) you want to delete data for from the drop-down menus, and click Delete, then Delete again. Or, if you want to get rid of the whole lot, select ‘All time’ and ‘All products’ from the menus.

To stop Google tracking your searches (and your browsing activity if you use Chrome), head here, turn off the blue slider, then click Pause.

Your location history

Google keeps track of your movements in the real world, as well as the online one. It follows you whenever you’re signed into your Google account and carrying your phone or tablet.

You can view your location history by signing into your Timeline. Select a date from the drop-down menus in the top left corner.

To stop Google tracking your location, click the Pause Location History button at the bottom of the page, then click Pause in the window that pops up. Keep in mind that this won’t turn off the built-in location-tracking abilities of any devices you use.

For example, if you use an Android phone, you may also wish to tap Settings, ‘Security & location’, then Location and either completely switch off ‘location tracking’, or tap ‘App-level permissions’ and disable it for individual apps.

Tick this box, then click ‘Delete Location History’ to permanently erase it

Disabling location tracking won’t delete any previous location activity that Google has recorded. To do this, click the small dustbin icon to the lower right of the Timeline page’s main map image. In the window that appears, tick the box next to ‘I understand and want to delete all Location History’, then click Delete Location History. This will permanently delete your location history – neither you nor Google will be able to get it back.

Bear in mind that disabling location tracking and deleting your location history may affect the functions of some Google services. For example, Google Now, which answers your spoken queries, will no longer be able to provide you with information or suggestions based on your location.

Your personal interests

Rather creepily, Google builds a list of things you like (and don’t like), based on your search and YouTube activity. It uses this to create a profile that lets advertisers target you. Google claims this is to “make the ads that you see more useful to you”.

Disable personalised adverts by clicking the blue slider, then selecting Turn Off

To see what Google thinks your interests are, head here and sign in with your Google account. Scroll down to the ‘Topics you like’ and ‘Topics you don’t like’ headings. You may find these are eerily accurate. You can click the X to delete individual likes and dislikes, and add new ones (click ‘+New Topic’), should you wish.

Alternatively, you can completely turn off targeted advertising by clicking the blue slider to the right of the Ads Personalisation heading, then clicking Turn Off.

Your gender and birthday

Like many sites and services, Google asks for sensitive details – including your age and gender – when you sign up for an account. The difference is that Google may share some of this information openly, unless you tell it not to.

Head here and look under ‘Gender, date of birth and more’. If you see a green globe icon next to any of the information shown here, that means it’s shared publicly – anyone can see it when they look at your Google profile.

Select ‘Private’ to stop people seeing personal info on your Google profile

To change this, click the globe icon and select Private. Even then, Google will continue to use your gender information to “provide more relevant, tailored content you might be interested in, like ads” unless you change yet another setting.

Go to this page and scroll down to ‘Your profile’. Click the pencil icon next to Gender and select ‘Rather not say’. Be aware that choosing this will also stop Google tools and services from referring to you as either male or female.

Your voice

If you use the Google Now assistant, or any of the company’s Home smart speaker products, then recordings of your voice may also be among the data stored about you on Google’s giant servers.

Like most voice-controlled assistants, Google Now and Google Home work by learning and accessing all kinds of personal data, so you should avoid using them if you want to avoid Google’s tentacles. Deleting or blocking access to your data effectively renders them next to useless.

To check for and delete any existing voice recordings Google might have, head here and click in the Search box at the top. Make sure ‘All time’ is selected under ‘Filter by date’, then untick ‘All products’, tick ‘Voice & Audio’ and click the Search (magnifying glass) icon. Click the three dots button in the Search box, then click ‘Delete results’, then Delete.

Your devices

Remove devices from Google’s history so it no longer knows what you use

As well as tracking you and your activities, Google likes to keep a record of the devices you’ve used to access its services – not just Android devices but Windows PCs, iPhones and more. It might be less invasive than some of the other data the company keeps on you, but you may still wish to delete devices you no longer use – if you’ve lost your phone and you want to block access to your Google account from it, for example.

To do so, sign in here, then click ‘Device activity & security events’ on the left. Now click Review Devices, click the device you want to delete and click the Remove button.

Image: Shutterstock

Equinix keeps Digital Realty at arm’s length in colocation market – with global expansion key

Equinix is moving further ahead of Digital Realty and NTT in the colocation market, helped by two acquisitions in the most recent quarter, according to the latest figures from Synergy Research.

The company completed its acquisition of Australian data centre provider Metronode last month – a deal first announced in December – alongside announcing the acquisition of the Infomart building in Dallas for $800 million.

According to the Synergy figures, Equinix and Digital Realty are growing far quicker than the overall market, with Equinix having 13% of total share – all in retail colocation – and Digital Realty at just over 8%. Digital Realty continues to dominate the wholesale colo market at 28%, while Equinix had a 17% share of retail colocation.

Of the smaller players, NTT is clear in third place with just over 6% of the overall colocation market, with KDDI/Telehouse and China Telecom rounding off the top five.

While a lot of importance continues to be placed on the North America and EMEA heartlands – in January Digital Realty announced a deal with Oracle to add direct access for its US cloud infrastructure – Synergy argues that looking globally is key to future operations. Equinix ranked as a leader in EMEA and Latin America, second in North America, and third in APAC.

“When it comes to operating data centres and colocation services, scale and geographic reach are important. Enterprises are pushing more of their data centre operations into colocation facilities and are also aggressively driving more workloads onto the public cloud, where cloud providers themselves use a lot of colocation facilities,” said John Dinsdale, a chief analyst and research director at Synergy. “Satisfying the needs of those enterprises and cloud providers often requires a large and widely distributed data centre footprint.

“In order to help achieve that scale there needs to be constant investment in existing data centres… in addition to which we’ve seen $42 billion in data centre M&A deals over the last 36 months, with Equinix or Digital Realty alone accounting for half of the total,” added Dinsdale. “There are good reasons why those two are the leading players in the colocation market.”

According to figures published by the analyst firm in January, 2017 was a record-breaking year for data centre M&A activity, with 48 transactions at $20 billion overall.

Gartner’s 2018 IaaS Magic Quadrant: Google joins leaders’ zone as only six vendors make cut

Google has clambered into the leaders’ section of Gartner’s latest infrastructure as a service (IaaS) Magic Quadrant, while the wheat has been separated from the chaff.

The annual report concluded that the cloud IaaS market is now a three-horse race in the top right box, with the leaders’ zone not being an Amazon Web Services (AWS) and Microsoft-only area for the first time since 2013.

Indeed, Gartner hacked away many of the fringe players for the latest Quadrant. Only six companies make this year’s list, down from 14 this time last year. In effect, Google moved up while the other combatants in last year’s ‘visionaries’ section – Alibaba Cloud, IBM and Oracle – all moved left.

Regarding the two primary leaders, Gartner’s analysis probably won’t surprise those who have consistently followed the market. AWS’ dominance was duly noted – one point of interest being that many enterprise customers spend more than $5 million annually, with some spending more than $100m – but securing optimal use of the company’s extensive portfolio can be challenging for even expert IT organisations. For Microsoft, the company’s increased openness and sustained high growth rate were noted, alongside concerns over larger-scale implementations.

Google, however, carried a few interesting notes. Gartner said the company had a ‘well-implemented, reliable and performant core of fundamental IaaS and PaaS capabilities – including an increasing number of unique and innovative capabilities.’ In terms of cautions, Google fell down on not having a large number of MSP partners, although Gartner noted the improvement the company had made in that area.

It is certainly fair to say that the past 12 months has seen serious improvements from Google’s cloud arm – and placement at the top table from Gartner can be seen as important validation of this shift. At the start of this year, Google outlined its infrastructure expansion plans, focusing on five new data centre regions – with more having since been announced – and three subsea cables. Last month, CEO Sundar Pichai acknowledged the company was striking ‘significantly larger, more strategic deals’ for cloud.

As this publication recently reported, it can also be seen as a case of keeping up with the Joneses. Capex spend from the ‘hyperscaler’ cloud vendors hit record levels in the most recent quarter, according to figures from Synergy Research. In a note published after financial results were disclosed, Synergy said cloud growth over the past two quarters had been ‘quite exceptional’.

Concluding the report, Gartner said the cloud IaaS market was ‘consolidating rapidly’, with the reduction in vendors reflecting heightened customer expectations: services around hardware and software infrastructure, management and governance, and pre-integrated value-added solutions are all necessary. The analyst firm added that, of the six companies which made the cut, some already have this capability while others simply have the ambition to achieve it.

You can find out more about the report and download a reprint here (Microsoft landing page).

Postscript: As mentioned, Google’s inclusion in the leaders’ section means the five-year run of Amazon Web Services (AWS) and Microsoft alone being at the top table has come to an end. But what of 2013? Well, AWS was there, as one would expect, but Microsoft was a little behind. One other company was in the leaders’ zone: CSC, which of course merged with HP Enterprise Services last year to create DXC Technology. If you remembered that, give yourself a pat on the back.

What new trends teach us about the future of collaborative tech


Sandra Vogel

31 May, 2018

Every business, no matter how small, relies on collaboration to get the job done. Ideas need to be generated, honed and perfected. Projects need to be defined, scoped, managed and evaluated. Clients – and staff teams – need to be listened to and worked with.

There are two key imperatives for effective collaboration: supporting ‘many-to-many’ communications (like a real conversation), and bypassing hierarchies and old ideas about who can and can’t participate.

Combining the two gives an organisation the best chance of surfacing the best ideas, while encouraging everyone to contribute means people feel valued rather than sidelined from important decisions.

New lines of communication

Technology is very good at both of these, but it has to be implemented well to achieve them – that means picking the right product for your needs. For example, Casual Dining Group has ditched email for its staff communications, and instead uses Workplace by Facebook, a platform that supports collaboration in businesses that don’t necessarily have a traditional infrastructure in place.

“Crucially, this has allowed us to connect all of our workers, regardless of their job title or location,” Celia Pronto, Chief Customer and Digital Officer at Casual Dining Group, tells Cloud Pro. “For a restaurant business, where many employees don’t have an email address, this is critical.”

This has created benefits for the business – it has “fostered a healthy sense of competition, created a space to share best practice and garnered an openness and awareness of wider business aims,” says Pronto.

Similarly, Tinypulse, an anonymous platform for interacting with employees through questions, cheers, suggestions and direct messages, has helped PR agency Neo PR engage better with its staff.

Neo PR’s director, Gemma Spinks, told us: “Since implementing this tool we have seen increased collaboration between teams on subjects that may not have otherwise been addressed. Staff morale is also greatly improved, as people feel they have an official forum to raise, discuss and share concerns they, or other team members, may have.”

She adds that using these types of collaborative tools “allows everyone in the team, not just the line managers, to recognise and highlight good work, successes or general pleasantness in the office.”

The workspace is evolving

In a traditional office environment, using collaborative technology often goes hand in hand with a physical reconfiguration of the workspace.

One approach that’s growing in popularity is ‘huddle rooms’ – in many ways a reaction to open plan offices which typically lack places to sit and chat. Huddle rooms have comfy seating and tables to work at, and, most importantly, are loaded with tech – big screens for video calls, interactive whiteboards, conference call setups. This makes them ideal for including remote workers who can join by voice, video and screen share.

This approach isn’t just for businesses. Guy’s and St Thomas’ NHS Foundation Trust’s state-of-the-art Cancer Centre now includes a huddle room equipped with screens and communications equipment. This allows cases to be discussed in a secure environment, medical documents and files to be viewed, and conversations to be had with colleagues who may be at other hospitals.

Professional services company PwC has taken the concept of the huddle room one step further with its new ‘Delta Room’. Located at its office in Paris, it’s fitted with multiple large format, gesture-controlled screens that can capture information in real-time, as well as wireless audio and a mix of desk and sofa style seating. It’s an open-plan meeting space that allows people to move around freely while collaborating – and of course, it can include remote participants.

Not constrained by time and space

Collaborative technologies really come into their own when they allow people to defy time and space, and come together to pursue projects wherever they happen to be. Achieving this doesn’t necessarily require lots of futuristic looking equipment or fancy features. Often it is just about having access to shared working space.

For example, when taxi booking service mytaxi rebranded to incorporate Hailo, it needed to establish a new brand identity with a small team spread between London, Dublin, Hamburg, and Barcelona – all within four months. It needed its team to be able to work together, yet remain flexible to meet tight deadlines.

The team worked in Dropbox Paper, a shared document development system, for all elements of the project, from creating strategy documents to mock-ups with feedback and wireframes. The task was so big that Gary Bramall, Chief Marketing Officer at mytaxi, describes it as “…the marketing equivalent of raising the Titanic with a tiny team”.

Pushing at an open door

What all of these examples show is that collaborative technology can be futuristic (like gesture-responsive screens) or more traditional (like Dropbox Paper), but it can’t be exclusive or restricting. Closed-door meetings are dwindling in favour of open collaboration, and the idea of having to be present in person to join a discussion is a thing of the past.

The cutting edge, like PwC’s Delta Room, is all about democratisation and inclusion. Whatever we see next in collaborative tech, it’s likely to push further at that already open door.

Image: Shutterstock