HPE to Buy Cloud Cruiser

Hewlett Packard Enterprise (HPE) is on an acquisition streak, having agreed to acquire a cloud metering and billing company called Cloud Cruiser. The financial terms of the deal have not been disclosed, and the acquisition is expected to be completed by the end of this year. This is the second acquisition HPE has made in a week, following its purchase of a company called SimpliVity for a whopping $650 million.

Founded in 2010, Cloud Cruiser provides analytics and software that help vendors measure the cloud services used by businesses and their IT departments. In other words, the company provides metering software to track the use of public and private cloud services. The idea behind the software is to offer an accurate picture of the cloud services a company currently uses, along with its current and future needs. With this information, service providers can identify the gap between a client's current usage and its needs, and help fill it with the right services.

This Northern California company has so far raised around $20 million in funding, and has an impressive list of clients that includes Accenture, Microsoft, TD Bank, and Ford. HPE is, in fact, Cloud Cruiser's largest client, accounting for a major chunk of its revenue. Cloud Cruiser's metering service is used by HPE's Flexible Capacity line of business, which allows customers to manage their own data centers with a pay-as-you-go model. However, the rising cost of Cloud Cruiser's service was becoming expensive for HPE, so it is little surprise that HPE is acquiring the company.

Looked at closely, HPE stands to gain much from this acquisition. First, HPE no longer has to pay for cloud metering services, which is a straightforward way to save money. Second, the acquisition fits perfectly with HPE's Flexible Capacity business, letting HPE extend that service more effectively to its customers. HPE already offers a pay-as-you-go model, so Cloud Cruiser's service can help it meter and bill more efficiently, not to mention the analytical insight that can be gained along the way. Third, HPE is already pursuing a hybrid cloud strategy, as its acquisition of SimpliVity makes clear, and this acquisition is expected to give that strategy a further boost.
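To make the metering-and-billing idea concrete, here is a minimal pay-as-you-go billing sketch. It is purely illustrative: the metric names and per-unit rates are invented for this example and have nothing to do with Cloud Cruiser's actual rate model.

```python
from dataclasses import dataclass

# Hypothetical per-unit rates for illustration only; a real metering
# product would use provider-specific rate cards, not these numbers.
RATES = {"cpu_hours": 0.05, "gb_stored": 0.02, "gb_transferred": 0.09}

@dataclass
class UsageRecord:
    customer: str
    metric: str      # e.g. "cpu_hours"
    quantity: float

def bill(records):
    """Aggregate metered usage into a per-customer pay-as-you-go bill."""
    totals = {}
    for r in records:
        totals[r.customer] = totals.get(r.customer, 0.0) + RATES[r.metric] * r.quantity
    return {c: round(amount, 2) for c, amount in totals.items()}

usage = [
    UsageRecord("acme", "cpu_hours", 120),
    UsageRecord("acme", "gb_stored", 500),
    UsageRecord("globex", "gb_transferred", 40),
]
print(bill(usage))  # {'acme': 16.0, 'globex': 3.6}
```

The point of the sketch is the shape of the problem: continuous usage events are collected per customer and per metric, then rolled up against a rate card, so customers pay only for what they consume.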

On completion of the acquisition, Cloud Cruiser will become part of HPE's Data Center Care portfolio in its Technology Services Support division. Cloud Cruiser's co-founder and CEO, David Zabrowski, will join HPE and report to Scott Weller, the SVP of Technology Services Support. Coincidentally, Zabrowski served as a VP and General Manager at HP from 1997 to 2002, so this will be a reunion of sorts for the CEO too.

Overall, this acquisition looks like a good one for HPE, furthering its growth strategy while helping it save costs. For Cloud Cruiser, it is a good outcome as well.

The post HPE to Buy Cloud Cruiser appeared first on Cloud News Daily.

[slides] Connected Cars | @ThingsExpo @Cisco_Jasper #IoT #IIoT #M2M #AI

IoT is fundamentally transforming the auto industry, turning the vehicle into a hub for connected services, including safety, infotainment and usage-based insurance. Auto manufacturers – and businesses across all verticals – have built an entire ecosystem around the Connected Car, creating new customer touch points and revenue streams. In his session at @ThingsExpo, Sanjay Khatri, Head of Platform Product Marketing at Cisco Jasper, shared real-world examples of how IoT transforms the car from a static product into a dynamic hub, drawing from experience at Cisco Jasper working with the world’s top 23 automakers. He discussed how enterprises can leverage IoT to cultivate meaningful customer relationships and lucrative business opportunities.

CIOs fear outdated IT budgeting is slowing down cloud adoption

(c)iStock.com/Rainer von Brandis

If the goal for CIOs is to go to the cloud, one thing is holding them back, according to a new report from Trustmarque: outdated budgeting models.

The report, titled ‘Highlighting operational and financial barriers to cloud’, argues that more than half (55%) of the UK CIOs polled believe out-of-date capex models are slowing their adoption of cloud services. An overwhelming 87% say existing software licensing agreements are another cause of delay – a higher figure than in last year’s survey, indicating the problem is getting worse – while 59% cite the inflexibility of fixed-term and fixed-user licensing agreements.

If that wasn’t bad enough, more than three quarters (77%) of CIOs say they find it difficult to establish which cloud services are suitable for their organisation, while a similar number (72%) say differing payment methods make things more complicated. Half of respondents say cloud is only partly delivering on its promised benefits.

So what can be done? Naturally, Trustmarque has its own solution – a product called Cloud-ESP, which aims to provide an online portal for procurement and management of cloud services – but more widely, the need for new skills and potentially restructuring IT operations needs to be on organisations’ radar.

“The on-demand nature of cloud means unmanaged cloud can play havoc with long term financial plans,” said James Butler, Trustmarque CTO. “CIOs must ensure they retain full visibility and control over their IT estate, across SaaS, IaaS and traditionally licensed solutions, to minimise the unplanned spend that poorly managed cloud infrastructure and services can result in.

“Transitioning to the cloud, or becoming a ‘cloud-first’ business, is a sizeable task for many organisations,” Butler added. “It has taken a short period of time for cloud to become such a disruptive force, and it is likely that effect will continue into the next five years.

“The CIO of 2017 must be capable of embracing cloud while minimising unintended consequences, by succeeding in overcoming the existing barriers to cloud adoption.”

You can find out more and read the full report here.

2017 #IIoT Predictions | @ThingsExpo #IoT #M2M #API #AI #FogComputing

With cybersecurity on everyone’s mind, 2017 will see the emergence of true fog computing and programmable, intelligent edge devices with the strongest security measures to date. As 2017 kicks into full gear and a particularly interesting 2016 fades into the rearview mirror, we took a look around the IIoT landscape to see what this year might have in store. We will be unveiling five IIoT-related predictions throughout this week and into next, so stay tuned and let us know what you think!

What is Sage 50c?

Sage 50c is cloud-based accounting software from Sage, a leading provider of accounting and payroll software. Most of Sage’s products are geared towards small and medium-sized businesses, as they are the ones that need a good amount of automation within limited budgets. Sage 50c, likewise, is designed with these businesses in mind.

Sage 50c is a cloud version of Sage 50, which was earlier known as Peachtree Accounting. With this new software, users can run Sage 50 in the cloud instead of on their local machines, saving them time and resources. They no longer have to worry about whether their systems are compatible with the software, as the system requirements are handled by the cloud provider.

Also, since the software is cloud-based, it can be accessed from any device and location. For example, a user can start work at home on a laptop and finish it from a mobile device while commuting to the office; such is the accessibility of the new software. It gives greater flexibility to employees, especially the millennial generation, who tend to value work-life balance more than other generations.

In many ways, Sage 50c is like Office 365: both are cloud versions of their parent software, which is probably why Sage 50c comes bundled with the Office 365 Business Premium edition. Sage announced this integration last July at its summit conference in Chicago, and it is expected to be available to small and medium-sized businesses across different global markets this year. Sage plans to start with the UK and Ireland, where it will be available by the end of this month. It’s not clear why Sage chose the UK as the first market for Sage 50c, though some reports speculate that it wants to establish a market before Britain completes the formalities of its exit from the European Union (EU). By spring, Sage 50c is expected to be available in the US, Canada, Germany, and France.

This partnership between Sage and Microsoft extends the benefits of this accounting software greatly. In fact, the combination of Sage’s accounting software with Microsoft’s Excel is expected to help millions of small and medium businesses around the world to better manage their finances and payroll.

Besides the Office 365 integration, other new features in Sage 50c include Sage Contact, a tool that syncs with Microsoft Outlook to give users instant access to their contacts’ details within seconds. In addition, Sage 50c comes with a feature called Mobile Invoicing that helps businesses record their expenses and generate any necessary invoices remotely. They can even photograph receipts and capital expenses and upload them to their account, to better track their finances and to reduce the workload during an audit. Businesses can also choose to tie Sage 50c to their bank accounts for an instant view of their financial status at any time.

Sage 50c comes with some Business Intelligence reporting tools as well to give businesses an insight into their operations and performance. In all, it’s going to be an exciting year for Sage and its customers.

The post What is Sage 50c? appeared first on Cloud News Daily.

Opinion: When big data and Brexit collide

(Image Credit: iStockPhoto/Maxiphoto)

During the UK referendum campaign, the Leave camp spoke ardently about the importance of protecting our sovereignty and making our own laws. Now that we’re coming out of the EU and the single market, sovereignty is top of the agenda again, for the opposite reason: rather than securing our right to sovereignty, Brexit threatens to destabilise it.

Right now, data holders are worried about the sovereignty of their information and the onus of complying with international laws that are not our own. For instance, company directors are wondering what their obligations will be if their organisation’s data is stored abroad and subject to the laws of the country in which it resides. How do they comply with that country’s privacy regulations and keep foreign authorities from being able to subpoena their data?

In the autumn of 2016, 4D surveyed 200 UK decision-makers at small-to-medium-sized businesses. We discovered that 72% of respondents are under pressure to demonstrate data protection compliance for customer data, and 63% say Brexit has intensified their concerns about data location and sovereignty even further – suggesting matters of sovereignty may not have been the best reason for exiting the EU.

Brexit’s impact on the General Data Protection Regulation (GDPR) is a case in point. The UK authorities played a significant role in developing and refining the new EU framework, which comes into force on 25th May 2018.

Contrary to wanting to shake off European-enforced legislation, 69% of businesses want to keep GDPR. Nearly half (46%) of these businesses are fully prepared to absorb the additional costs incurred through direct marketing – which the Information Commissioner’s Office (ICO) estimates will come to an extra £76,000 a year. Just 23% would like to scrap GDPR to avoid additional operating costs, while the majority (59%) think GDPR should be compulsory for all large businesses.

This doesn’t necessarily mean that businesses are happy to embrace all European legislation. Data protection is a minefield and proper governance is desperately needed. For many, protecting one’s data is a major factor in a company’s decision-making. One in two businesses in the UK decide where data is stored based on matters of data security alone.

However, on the flip side, this means the other half aren’t thinking about data residency issues. We also know that only 28% think about data sovereignty in terms of how local laws will affect the way they store their data, and 87% of IT decision-makers confess to not having looked at data location and sovereignty issues post-referendum.

This lack of consideration could be a ticking time-bomb. If the UK’s data flows become pawns in a messy divorce, with Theresa May reiterating recently that the government is pushing a hard Brexit as opposed to soft, businesses will need to get to grips with where their data lies, what laws their data is subject to, and who owns the data centers in which their data resides. As amorphous as cloud computing sounds, company data hosted in the cloud is not an ethereal mass of zeros and ones. It has a home and this home may become a bone of contention.

Until now, a single European data center could serve both UK and European customers. In just over two years’ time, companies may need a European data center for European customers and a UK data center for UK customers. If this comes to fruition, expect multinational companies serving a European population to move the bulk of their servers from London to a European data center (e.g. in Dublin, Paris, or Frankfurt). This would represent a mass exodus of investment.

However, it also stands to reason that UK SMEs that don’t intend to trade with the EU would gain far more certainty and simplicity by placing their physical servers in a UK-owned and UK-located data center on a colocation basis. This is reflected in the 64% of respondents who believe that, in the current climate, the assurance of colocation combined with the flexibility of cloud infrastructure strikes a good balance.

We also have to consider the European Commission’s Free Flow of Data Initiative (FFDI) communication, published on 10th January. Until then, the Commission’s position was that member states (with the exception of certain specific classes of data) should not require, by law, that data be located within nation-state boundaries; companies would have the right to choose where to locate their data within the EU. To add to the confusion, the Commission is also proposing to introduce new legal concepts and policy measures targeted at business-to-business transactions.

The only silver lining to this is that these proposals are still in the consultation phase and there may be opportunities for trade associations such as TechUK to push for reform.

So where does this leave software, cloud and hosting companies that want to enter the UK market over the next couple of years? Until very recently, data sovereignty has been a bit of a misnomer in the US and Europe as we’ve all become used to storing and transferring private citizen data across borders without much fuss. The only certainty emerging from all this uncertainty is that if you are looking to expand into the UK market, the safest long-term bet is to put your servers and data into British-based data centers. By doing so, you will automatically be aligning the data security needs of your British clients with current and future UK data protection legislation – whatever that may be. Britain is also likely to adhere to the very strict data privacy rules it (ironically) helped craft in the upcoming General Data Protection Regulation (GDPR) in 2018. If the data center or hosting provider happens to be British owned, even better, as it won’t be subject to outside meddling from US agencies, as Microsoft has found out with some of its Irish-based data centers.

Taking a home-grown approach would certainly insulate SMEs from the negotiations’ changing winds. This awareness is starting to dawn: almost one-third of companies using an international public cloud for company data intend to stop doing so within two years, following Brexit, while the proportion of companies using a UK public cloud for company data is expected to increase by almost a third over the same period, in the wake of the UK’s exit from the European Union.

While the wholesale movement of company data would be premature at this stage, the thinking certainly needs to be done over the next 12 months – about the implications of a business’s current cloud mix and the ins and outs of transitioning to a UK-based data center.

The sovereignty of their data will only be one small piece of the jigsaw but it’s an important one. In the digital era, data is a company’s crown jewels and the way businesses treat and protect their data will govern their reputations.

Are you concerned about Brexit disruption? Share your thoughts in the comments.

Happy 33rd Birthday, Macintosh!

Thirty-three years ago today, Steve Jobs introduced the Macintosh 128K, a launch that helped define home computing. Flash forward to 2017: Apple holds numerous records in top technology and computer sales, and “Apple” is more than just a household name; it’s a way of life. Re-live the nostalgic unveiling below: Credit: macessentials YouTube […]

The post Happy 33rd Birthday, Macintosh! appeared first on Parallels Blog.

Blockchain beyond Bitcoin: Assessing the enterprise use cases

(c)iStock.com/Radachynskyi

Updated Jan 25: Blockchain has serious potential to disrupt a multitude of industries, but a lot of barriers – not least confusion – still need to be ironed out.

That’s one of the verdicts from a recent research report from analyst firm Tractica on how the database technology, defined as a ‘distributed data verification technology wherein financial and operational transactions are recorded and validated across a network, rather than through a central authority’, can move away from its Bitcoin roots.

The report, titled ‘Blockchain for Enterprise Applications’, details that while Bitcoin was blockchain’s first ‘killer app’, the technology is “blossoming beyond cryptocurrency and the transfer of money, to an architecture that can support many types of transactions, from logging an event, to signing a document, to allocating energy between parties, and far beyond.”
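The “distributed data verification” idea behind all of these use cases can be sketched with a minimal hash-chained ledger. This is a deliberately simplified toy, not how any production blockchain is implemented (it omits consensus, mining, and networking entirely), but it shows how records of arbitrary transactions — an event, a signed document, an energy allocation — become tamper-evident once appended:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents (excluding its own hash)."""
    payload = json.dumps(
        {k: block[k] for k in ("index", "prev_hash", "transactions")},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

def append_block(chain, transactions):
    """Append a block whose hash commits to the previous block's hash."""
    block = {
        "index": len(chain),
        "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
        "transactions": transactions,  # any record: payment, document, event...
    }
    block["hash"] = block_hash(block)
    chain.append(block)
    return chain

def is_valid(chain):
    """Validation any participant can run: each block's hash must match
    its contents and point at the previous block's hash."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, [{"from": "a", "to": "b", "event": "document signed"}])
append_block(chain, [{"from": "b", "to": "c", "event": "energy allocated"}])
assert is_valid(chain)
chain[0]["transactions"][0]["to"] = "mallory"  # tampering is detectable
assert not is_valid(chain)
```

Because each block’s hash covers the previous block’s hash, altering any historical record invalidates every block after it — which is what lets a network of peers verify the ledger without a central authority.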

Jessica Groopman, a principal analyst at Tractica, is the author of the report. Having covered the Internet of Things (IoT) for years before turning more seriously to blockchain several months ago, she notes that “blockchain makes IoT look like a toy store” in terms of the technology’s complexity. More importantly from her employer’s perspective, however, this represents the first time, to her knowledge, that a technology analyst, rather than a pure consulting or investment firm, has explored this specific topic.

Blockchain makes IoT look like a toy store

“The real light switch for us was thinking about this as a new way to really unite, or, if nothing else, shorten the time between financial transactions and operational execution,” she tells CloudTech. “Those two processes – that is, paying for something and the event you are paying for, or the operation that you’re paying for happening, are somewhat disconnected.

“That’s the way I try and explain it to people,” Groopman adds. “It helps bridge that gap between a process occurring and payment for transaction or currency exchange for that process – although it’s not always currency.”

With this background, Groopman discusses the “painful hours” spent – jokingly, of course – assessing how far beyond currency blockchain can go. With a bit of lateral thinking, the possibilities are vast and varied. Digital and music rights can be transformed, for instance; in September a startup called Revelator raised $2.5 million to tackle this very task, with the theory being that the public, secure nature of the ledger and the immediacy of the record-keeping could enable much more efficient payments.

Groopman argues that while there are plenty of “interesting, eyebrow-raising opportunities”, the industries where the biggest impact will be found are healthcare, government, logistics, and energy, for two reasons: one, they have the most dollars riding on them; and two, they will cause a domino effect for the more ‘tangential’ industries to follow.

“If this becomes a pervasive part of the transportation industry, smart trucks or smart cars, that’s going to ripple outwards into media, into telecoms, potentially even hospitality,” says Groopman. “If your car or truck is beginning to act and transact as an autonomous agent, that’s going to run on the blockchain most securely, and so you can see how one industry picking this up would begin to open up that opportunity to other outlets.”

It may be the most secure option long term, but there are still some gremlins that can get into the system. At the Blockchain Expo conference in London, panellists discussed the risks alongside the rewards. Susan Furnell of Furnell Consult, who was moderating the session, told delegates of the issues that arise when transferring from the blockchain to a traditional server architecture, such as moving Bitcoin onto another currency exchange. Aldo Lo Castro, head of research and development at AliasLab, noted the importance of who can see the data on a blockchain – “once it’s there, it stays there” – while Dimitri De Jonghe, core developer at BigchainDB, warned of potential problems in a consensus system, such as bad actors duplicating their identities to gain more votes.

At the heart of blockchain technology however, and the development of it, lies a paradox, and it goes a step further to explaining how it can be used in enterprise services. While the original development of Bitcoin as a public blockchain was, as Groopman puts it, ‘anarchic’ in nature, hoping to subvert centralisation altogether, there are many stakeholders in this particular game. “The corporations that are developing and investing in this technology don’t necessarily feel that way,” she says, understatedly.

You can see how one industry picking blockchain up would begin to open up the opportunity to other outlets

Among the players who need to be singing from the same page are developers, startups, and miners, not to mention consortia, banks, governments, and regulators. Groopman argues that this is where the ‘philosophical difference’ is playing out in the development of the technology. “Getting everybody on the same page is a much more complicated, tricky task in the private enterprise blockchain space than in any other technology space I’ve ever looked at,” she admits.

This complexity is borne out in ‘blockchain as a service’ (BaaS) solutions – Microsoft, which was also at the Expo, has an offering in this space alongside IBM – which aim to make things easier from an infrastructure and development perspective. As for whether there is room for both public and private blockchains, Groopman says that a hybrid approach, combining public blockchains such as Bitcoin and Ethereum with private ones, may be the best long-term play. “There’s definitely a role for private blockchain, because this is not something that we’re going to be able to turn on a switch overnight and completely change the dominant organisational structure of business and government,” she says, laughing. “That’s not going to happen.”

Ultimately, while the potential is vast, the report sounds a cautionary note, saying the space today ‘desperately’ needs definition. “The only in-production example of blockchain at scale today is Bitcoin,” says Groopman. “We’re still extremely early. There’s a tremendous amount of hype, and lots of frenetic energy, but when you get down to it it’s extremely early, and there are many, many barriers in place between this technology blooming into the mainstream and where we are today.”

You can find out more about the report here.

Blockchain and IoT will be explored in more detail at Blockchain Expo in Berlin and Santa Clara in 2017. Find out more at www.blockchain-expo.com