Salesforce SMB’s business leader talks data analytics, AI and the age of entrepreneurship


Sanj Bhayro, SVP EMEA Commercial at Salesforce

While the business world has traditionally favoured the biggest and the richest, the cloud is seen as the great equalizer. Through the transition to the cloud, SMBs are being empowered to take on enterprise incumbents, with the number of wins growing year on year.

This, according to Salesforce’s Sanj Bhayro, is one of the most exciting trends now unfolding in business throughout the world. Bhayro currently leads the EMEA SMB business at Salesforce and for almost 11 years has been part of the team which has seen the power of intelligent CRM systems grow backroom businesses into industry giants. Just look at the growth and influence of companies such as Uber and Airbnb for justification of his claims.

“The SMB business in Salesforce is one of the most exciting, because we get to work with really innovative companies,” said Bhayro. “All the innovation in the industry is coming from these small to medium sized businesses. They are disrupting the traditional market which is in turn forcing the traditional players to transform their own business models.

“Something which is interesting from our perspective at Salesforce is that when we started 17 years ago the internet wasn’t that prevalent, the cloud wasn’t a word that was used that often, and it was the SMB companies who adopted our technology. The cloud offered them the operational efficiency, the scale and the reach to take on these traditional players. These smaller organizations are looking more and more towards technology as the enabler for innovation.”

The majority of SMBs could be considered too small to drive innovation in-house. For the most part, the IT department is small and responsible for ‘keeping the lights on’; working through the cloud has enabled innovation and created opportunities for these organizations. Indeed, the ability to be innovative is often much more prominent in smaller organizations.


The fail-fast business model is one which has captured the imagination of numerous enterprise organizations around the world. Amazon CEO Jeffrey Bezos recently claimed the fail-fast model was the catalyst for recent growth within the AWS business, though the majority are seemingly struggling to implement the right culture which encourages learning and innovating through failing. For the majority, failure is simply failure, not part of the journey to success.

But this in itself is one of the ways in which the smaller, more agile organizations are innovating and catching enterprise scale businesses. The implementation of cloud platforms speeds up the failures and lessens negative impacts on the business, to further drive the journey to innovation.

“For start-ups and early stage companies, failing is an accepted mentality. How many companies are actually the same as when they started? They failed, learned and then progressed. As businesses become bigger and bigger it becomes a lot more difficult. Certainly for larger companies there is a lot more friction around the fail-fast model. Smaller companies are culturally set up to allow them to pivot and try new things, whereas larger ones, purely because of their size, are constrained.”

Outside of the SMB team, Salesforce engineers have been prioritizing the use of artificial intelligence for future product launches and updates. This was reinforced during the company’s quarterly earnings call in recent weeks as CEO Marc Benioff backed AI as the next major growth driver. While there is potential for AI in the SMB market place, for the moment it is only for those who are ahead of the curve.

For the most part, data analytics is starting to drip down into smaller organizations, though there is still a substantial amount of data which is not being utilized. For Bhayro, as the concept of the cloud is now ubiquitous, the opportunities are almost limitless. But only once these organizations have got on top of managing their own data, breaking down the silos within the business.

“AI translates well into the SMB business model and it will be the SMBs who drive where AI goes,” said Bhayro. “There are generally two camps when it comes to the SMB market: those who are cloud-native and capitalizing on the sharing economy, and those who are more traditional organizations. The shift that the traditional business has to make to break down the silos, and to move towards a cloud back-end, is far more difficult than for a company like Deliveroo, which started in the cloud and can scale. Nevertheless, that shift has to be made.”

“So much data is being created and there’s so much that you can do with it. The problem is that so many companies are not doing enough with their data. Recent reports stated that most companies can only analyse 1% of their data. Even before we start moving towards AI technologies, the way we service intelligence is through insight. We need to provide the right tools to make data available and malleable, to everybody in your business. These data analytics tools are the first steps and then we can look forward to AI technologies.”

The UK government has made numerous schemes available to SMBs to encourage the growth of this subsector in recent years, and Bhayro believes these efforts have been paying off in the international markets.

“I’m delighted to say that the UK takes a leadership position (in relation to SMB growth and innovation in comparison to the rest of Europe),” said Bhayro. “Something in the region of 95-96% of the companies in the UK are SMBs, and the government is currently doing the right things to encourage and propel entrepreneurs. I think we’re in the time of entrepreneurship, and this is the time for people to have the vision and grow. These companies are having wonderful ideas, and they are moving into the growth period, but it’s the customer experience which really differentiates them from the competition. Not many of these companies are set up to achieve customer experience objectives, but this is where we (Salesforce) come in.”

Data sovereignty: How to shield your business from changing regulations

(c)iStock.com/Tempura

International data privacy regulations can be complex and ever-changing. The invalidation of the Safe Harbour agreement by the European Court of Justice, and the ongoing negotiations about its replacement, means that many companies are now forced to rethink their approach to information governance, without the certainty of knowing exactly which regulations they will need to comply with.

Separating information that is subject to data privacy regulations from information that is not can be a difficult task, one that requires businesses to implement a sound information governance strategy. When done properly – by classifying content with the right metadata – complying with any change in the shifting regulatory landscape becomes a lot easier. Yet businesses attempting to implement the correct information governance platform can face difficulties. So what do organisations need to consider?

  • The whole picture: Looking at just one system, such as emails or file shares, does not work. Organisations must consider all the information entering the enterprise and every touch point where data enters or leaves
  • Co-operation: Business stakeholders and the IT department must co-operate on the implementation of the strategy to ensure it addresses new regulations without blocking business productivity
  • Prepare to be flexible: Laws and regulations constantly change so organisations must prepare an information governance strategy in which both the policies and the data model itself can be flexible. If the data model is not flexible, businesses may be able to tweak policy but could be faced with the herculean task of reclassifying terabytes of existing data for new regulations
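The metadata-driven approach described above can be sketched in a few lines. This is only an illustration: the regulation names, metadata fields and rules below are hypothetical, and a real governance platform would manage these centrally.

```python
# Minimal sketch of metadata-based classification for information governance.
# The regulation tags, metadata fields and rules are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Document:
    doc_id: str
    metadata: dict = field(default_factory=dict)  # e.g. {"contains_pii": True}

# Policy rules map metadata conditions to regulatory tags. Because the
# classification lives in metadata, a new regulation only needs a new rule
# here, not a reclassification of terabytes of underlying data.
POLICY_RULES = {
    "eu_data_privacy": lambda m: m.get("contains_pii") and m.get("subject_region") == "EU",
    "retention_7y":    lambda m: m.get("record_type") == "financial",
}

def classify(doc: Document) -> set:
    """Return the set of regulatory tags that apply to a document."""
    return {tag for tag, rule in POLICY_RULES.items() if rule(doc.metadata)}

doc = Document("d1", {"contains_pii": True, "subject_region": "EU"})
print(classify(doc))  # {'eu_data_privacy'}
```

The key design point is the third bullet above: policies change in one place, while the documents themselves keep their descriptive metadata.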

Beyond these considerations, businesses must also take the continued growth of cloud-based services into account when preparing to become compliant with data privacy laws. Choosing a cloud provider is no longer just about functionality, features and price. Data privacy is a key consideration, particularly as widespread adoption of consumer-grade software in the workplace, especially cloud-based services, has exploded over the last couple of years. Employees are using these consumer-grade services – such as Google Drive, Dropbox and Evernote – to boost productivity in the workplace but they can make it very difficult for companies to address data privacy regulations.

When choosing a cloud provider, IT and business professionals should ensure that the company can address data privacy requirements. With the invalidation of the “Safe Harbour” agreement – and the on-going negotiations between US and EU regulators over its replacement – the current regulatory environment in Europe is unclear. For this reason alone, organisations must ensure their cloud provider consistently maintains security procedures and protocols to protect all customer data and ensure compliance with current and future EU data privacy requirements. Many businesses may prefer to use a provider which can offer them a dedicated European data zone, with complete network segregation of all hardware and access levels. Whatever the end decision, a secure, global information management strategy must be a cornerstone of any corporate cloud adoption.

Whatever method businesses choose to solve the issue, action must be taken sooner rather than later. By avoiding the problem or tackling it in many different stages, organisations are opening themselves up to potentially huge penalties and fines. The reality is that most regulations do not require organisations to implement any dramatically new features in order to become compliant. They just require businesses to ensure a comprehensive information governance strategy is in place to monitor data and guarantee compliance with the shifting regulatory landscape.

Citrix and Microsoft team up to tackle enterprise mobility

Citrix has expanded its partnership with Microsoft as the two companies aim to capitalize on flexible working and enterprise mobility trends.

Speaking at Citrix Synergy in Las Vegas, CEO Kirill Tatarinov outlined objectives to meet the needs of the modern workforce with application and desktop virtualisation in the cloud, network delivery and enterprise mobility management. Citrix has selected Azure as the preferred and strategic cloud for its future roadmap and the team will work to develop new integrations between Citrix XenMobile, NetScaler and the Microsoft Enterprise Mobility Suite, to improve efficiency and data security.

“Companies of all sizes across all industries around the world have an amazing opportunity to embrace digital transformation and empower their people to work productively from anywhere at any time,” said Kirill Tatarinov, CEO of Citrix. “Our customers are asking Citrix and Microsoft to work closer together to help them fully leverage innovations like Windows 10, Office 365 and Azure. This enhanced partnership ensures we can be more agile in responding to our customers’ needs and help them accelerate the move to digital business.”

As part of the partnership, the pair will aim to accelerate the deployment of Windows 10 Enterprise within their customers’ organisations. Citrix customers can use AppDNA to aid migration to Windows 10, the team claims, as it provides application lifecycle management tools to discover and resolve application compatibility issues.


Citrix NetScaler will integrate with EMS to provide virtual private network capabilities for more secure, identity-based access to on-premises applications on Microsoft Intune-managed devices. Citrix will also offer customers who have purchased Windows Software Assurance on a per-user basis the option to host their Windows 10 Enterprise Current Branch for Business images on Azure through its XenDesktop VDI solution, which the team claim is a first in the industry.

“Our relationship with Citrix has always been founded on the commitment to making our mutual customers successful by empowering their people to be more productive,” said Scott Guthrie, EVP of Cloud and Enterprise at Microsoft. “By selecting Azure as its preferred and strategic cloud, Citrix is helping companies mobilise their workforces to succeed in today’s highly competitive, disruptive and global business environment.”

Employees not taking advantage of mobility initiatives – survey

Despite mobility being one of the top priorities for organizations throughout the world, research from IDC has shown only 13% of those who are given the option actually work from home.

Enterprise mobility has proved to be one of the more prominent trends emerging out of the evolution to cloud-based platforms, as employees aim to create a working environment which encourages innovation and creativity. However, the study shows this flexibility is not being taken advantage of. One statistic which could be seen as an obstacle to adoption: two in five line managers admit they do not want their employees to work from home.

Numerous organizations have highlighted mobility strategies as a priority for coming months, as they aim to utilize the power and freedom of cloud-based applications to increase the productivity of employees. Findings from 451 Research claim 40% of enterprise organizations are prioritizing mobilization of general business apps over the next two years, as opposed to focusing solely on field services and sales teams. The trend towards mobility is also confirmed when assessing the M&A market. In the mobile device management and mobile middleware segment, 28% of the total deals (21 of 74) and 77% of their total value ($3bn of $3.9bn) over the past decade have occurred over the past two years alone.

Although other research has suggested organizations are shifting to a mobility mind-set, IDC’s study outlines that the drive towards mobility is still in the early adopter stages, despite numerous organizations claiming its importance. Leadership attitudes were a particular sticking point, as only 43% of employees are confident leadership is fully behind mobility as a concept. Of those who do have the opportunity to work from home, only 14% spend more than half their time outside the office.

From a leadership perspective, new EU regulations regarding the protection, residence and transition of data could have an impact on their attitudes towards mobility, as penalties for non-compliance will be to the tune of €20 million or 4% of the organization’s annual turnover, whichever is greater.

While vendors are striving to improve the efficiency of mobility solutions, as well as championing efforts to make the technologies on the whole more secure, unless the adoption of the mobility culture is increased from the end-user side, there are unlikely to be any changes in the near future. If the statistics remain true, mobility initiatives will not achieve the required ROI, which could have a negative long-term impact on the investments made into the mobility segment on the whole.

WANdisco collaborates with Bridgeworks for rapid cloud migration

(c)iStock.com/baona

WANdisco and Bridgeworks have announced a new partnership, under which WANdisco Fusion’s patented cloud transactional data technology will feature Bridgeworks’ PORTrockIT software.

David Richards, WANdisco CEO and co-founder, said: “WANdisco’s continuous availability and guaranteed data consistency can now be deployed with software from Bridgeworks that virtually removes the effects of network latency, making it possible for on-premise and cloud environments to operate as one.”

PORTrockIT’s patented technology allows businesses to rapidly move large volumes of data and increases performance by up to 100 times. The increase in speed improves critical enterprise applications such as backup, replication, disaster recovery and cloud migration. The integrated Bridgeworks and WANdisco technology will ensure that large volumes of live production data can move to and from the cloud without any business disruption.

David Trossell, CEO of Bridgeworks, said: “WANdisco software is already known for its performance and it has revolutionised the global enterprise requirement to migrate large amounts of transactional data to the Cloud. When combined with Bridgeworks PORTrockIT technology, the resulting increase of up to 100 times in transfer speed has raised the bar for business.”

The integrated technology supports hybrid cloud requirements for on-demand, burst-out processing, and offsite disaster recovery, without downtime or data loss. The integrated solution removes the requirement for cloud vendor storage appliances to be sent back and forth from customer data centers in a process that involves days of downtime and is only suitable for one-time movement of cold, less critical data.

CSC announces HPE enterprise services merger to create $26bn business


HPE CEO Meg Whitman

CSC has announced it will be merging with the enterprise services segment of HPE, as the latter reported its fourth consecutive quarter of year-over-year revenue growth.

Revenues for 2016 Q2 were reported at $12.7 billion, up more than 1%, as the team attributed the success to its servers, storage, networking and converged infrastructure business units. The enterprise services unit also saw a healthy performance, and will now be spun out and merged with CSC to create a $26 billion organization.

“The transaction is currently targeted to be completed by March 31, 2017,” said HPE CEO Meg Whitman on the company’s earnings call. “For the combined CSC and Enterprise Services, this will create a new company that will be a pure-play global IT services leader. For customers, this means global access to world class offerings in cloud, mobility, application development and modernization, business process services, IT services, big data and analytics, and security.”

The move comes six months after CSC underwent a split similar to that of HP and HPE. CSC serves commercial and government clients globally, whereas CSRA targets public sector clients in the United States. Following the completion of the transaction next year, CSC’s current president and CEO Mike Lawrie will continue to head up the new company, though the new brand has not been announced as of yet. Both companies have seemingly benefited from their respective splits in recent months, demonstrating healthy growth since the two separations.

Since the separation, CSC has been aggressively reinforcing its position in the market with various acquisitions and joint ventures. It created CeleritiFinTech, a joint venture with HCL, to strengthen its position in the banking sector; acquired UXC to increase its footprint in the Australia-New Zealand region; and bought Xchanging to bolster its insurance solutions.

“Our proposed merger with HPE Enterprise Services is a logical next step in CSC’s transformation,” Lawrie said. “As a more powerful and versatile global technology services business, the new company will be well positioned to innovate, compete and serve clients in a rapidly changing marketplace. We are excited by the great potential this merger brings to our people, clients, partners and investors, and by the opportunity to strengthen our relationship and collaboration with HPE.”

In terms of HPE moving forward, Whitman highlighted next-generation software-defined infrastructure as a priority for the business, focused on servers, storage, networking, converged infrastructure, hyper-converged, and Helion. The company has stated it will remain open to future acquisitions, though it would appear there aren’t any major targets in the pipeline, as Whitman seemed ‘standoffish’ on the topic during the earnings call.

NTT Data partners with Privitar to make customers GDPR compliant

NTT Data UK has announced a partnership agreement with Privitar to provide data protection solutions built on new requirements set out by the EU General Data Protection Regulation.

The GDPR requires companies to process and use the personal data of any European customers in a justifiable and ethical manner, whilst also giving increased control of the data back to the customers themselves. As the role of data increases within the business world, customers have become increasingly interested in how their personal information is stored and used. Insight delivered from this data can be used to drive additional revenues for a business, though once the GDPR comes into force in 2018, there will be strict guidance on how the data is used.

NTT Data believes this dynamic will create complications for various organizations, and claims that combining NTT Data’s data and process capabilities with Privitar’s privacy software will create a proposition which complies with all GDPR data requirements.

“By combining NTT DATA’s sector-specific domain knowledge with Privitar’s software we can now deliver programmes that make our clients champions of both privacy and innovation,” said Steve Mitchener, CEO of NTT Data UK. “I’m excited that this partnership will allow our clients to fully utilise their data assets without fear of reputational and financial damage, or regulatory action.”

Capacity, redundancy, and transparency: The three big cloud question marks

(c)iStock.com/anilakkus

From my perch in the cloud computing ecosystem, I see three concerns on the horizon for cloud computing: capacity, redundancy, and transparency.

In certain ways, these topics are inter-related with security. Certainly, from the customer’s viewpoint, they all touch on perceptions of value and trust. And possible solutions, at least to my mind, tend to reflect old-school logic around risk management principles.

I know from my conversations with other professionals in the cloud computing industry that my concerns are widely shared and that a variety of solutions are actively being sought. Because these are broad-based concerns that affect everyone in this domain, however, they’re worth sharing to generate further discussion of potential solutions.  

Capacity: are we ‘there’ yet?

Our capacity to store data is being challenged by big data – the vast amounts of data that exist and that are being generated afresh each day in an increasingly digital world. Currently, it may appear as if there’s an infinite amount of data storage and processing capacity available in the cloud, but that’s not the case. Industry players tell me they’re concerned that, at some point, we may “fall off a cliff.” Although many believe that the proverbial cliff will be prompted by the advent of the Internet of Things (IoT), it may come sooner. (These issues are being addressed by the IEEE Big Data Initiative, a tremendous resource for CloudTech readers.)

Embedded systems generate enormous amounts of unstructured data and companies with legacy embedded systems are starting to analyze it for insights into how their products have been used. This need may challenge us before IoT’s exponentially vast data generation becomes an issue.

In IoT’s case, however, a number of strategies suggest themselves, including processing at the network’s edge to reduce the amount of data flowing upstream to the cloud. Other strategies include variations on current practices such as representative sampling of data and logical partitioning of data sets. Intercloud services could divvy up the work, but these remain a work-in-progress.   
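The edge-processing strategy mentioned above can be sketched simply: forward only a representative sample of routine readings, plus anything anomalous, so far less data flows upstream to the cloud. The sampling rate, anomaly threshold and reading format below are hypothetical illustrations.

```python
# Sketch of edge-side data reduction before upstream transfer to the cloud:
# keep a small representative sample of normal readings plus every anomaly.
# The threshold, sample rate and data format are hypothetical.
import random

def reduce_at_edge(readings, sample_rate=0.01, anomaly_threshold=100.0, rng=None):
    """Forward ~1% of normal readings, but forward every anomalous one."""
    rng = rng or random.Random(0)  # fixed seed keeps the sketch reproducible
    forwarded = []
    for value in readings:
        if value > anomaly_threshold:        # anomalies always go upstream
            forwarded.append(value)
        elif rng.random() < sample_rate:     # representative sample of the rest
            forwarded.append(value)
    return forwarded

readings = [20.0] * 10_000 + [150.0]         # mostly routine, one anomaly
kept = reduce_at_edge(readings)
print(len(kept), "of", len(readings), "readings sent upstream")
```

Logical partitioning, the other strategy mentioned, would then decide *where* each forwarded reading is stored, rather than *whether* it is sent at all.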

Redundancy: déjà vu all over again?

Redundancy to my mind means that cloud-based data repositories should be retrievable or self-regenerating, if they are lost or purposefully or inadvertently discarded by the cloud service. This is a form of security. Our IEEE Cloud Computing Initiative’s Facebook page, for example, inexplicably vanished recently – and, with it, a great deal of valuable, time-consuming work.

The issue is that many companies are turning to the cloud to provide redundancy for critical business continuity data and processes. If you rely on the cloud for storage and redundancy, what are the redundancy capabilities and policies of your cloud service? After all, the cloud’s key selling point is that it eliminates the time and cost of maintaining one’s own databases.

Transparency: it isn’t clear

Transparency is a quality that might be applied to both the capacity and the redundancy issues – what is a cloud provider’s policy regarding its own capacity to store and process vast amounts of data and to back it up, should it be discarded or lost? For competitive reasons, the answer is likely not forthcoming from your cloud service provider.

More important to current and future users of the cloud (and the Internet), however, is the vast amounts of data being gathered – and, increasingly, applied in various ways – on a user’s identity, location, browsing habits, topics of interest, social media activities and purchasing patterns, to name a few.

How many of us really realise the degree to which that data, in ever more granular form, is being collected in the first place? And wouldn’t we all like to know exactly what is being collected and how that data is being used? Certainly we should all have some understanding of what personally identifiable information is gathered and how it is used. It might also be valuable to know how and when our activities are tracked and analysed, even if and when that data is anonymised.

Trust and value

Becoming smarter about the cloud vendors we work with is a critical piece of the solution to all the issues I’ve just raised. In a recent issue of IEEE Internet Computing magazine, “Managing Risk in a Cloud Ecosystem,” the authors offer the following, logical reassurance: “Due to economies of scale, cutting-edge technology advancements and higher concentration of expertise, cloud providers have the potential to offer state-of-the-art cloud ecosystems that are resilient, self-regenerating and secure – far more secure than the environments of consumers who manage their own systems.”

Yet the article’s ultimate conclusion is that cloud consumers must follow a rigorous, established process to determine their own approach to risk management and identify their security requirements. That’s the basis for understanding how cloud providers manage risk and the degree to which they can meet a customer’s security requirements.

I urge you to read the article, which is full of practical guidance on these issues. And I invite you to join the community that has coalesced around IEEE Cloud Computing to broaden your knowledge of current challenges and participate in discovering solutions. 

Let the countdown to GDPR begin

The road to data protection has been a long and confusing one. Despite being one of the biggest concerns of consumers and corporates throughout the world, progress has hardly been moving at breakneck speed, but as of today (May 25th), companies now have exactly two years to ensure they are compliant with the EU’s General Data Protection Regulation.

The general objectives of the GDPR are to give citizens back control of their personal data and to simplify the regulatory environment for international business by unifying regulation within the EU. Data protection is a complicated business throughout the EU, mainly due to slight differences from country to country, layered on top of overarching EU regulations and directives which have never made it into regulation.

Conversations surrounding the new regulations have been ongoing since 2012, though companies now have until 25th May 2018 to ensure they are fully compliant. This would seem an adequate amount of time; however, a recent YouGov and Netskope survey highlighted that only one in five are confident they will be compliant in this period. For Eduard Meelhuysen, VP at Netskope, decision makers need to take a step back to get a better understanding of the current state of their data before concentrating on any one cloud app.

“If they are to comply, IT teams will need to make the most of the two-year grace period which means that both cloud-consuming organisations and cloud vendors will need to take active measures now,” said Meelhuysen. “As a starting point, organisations should take a hard look at how their data are shared and stored, focusing in particular on any cloud apps in use across the organisation.

“The GDPR makes specific provisions for unstructured data of the type created by many cloud apps, data which are typically harder to manage and control. That means organisations need to manage employees’ interactions with the cloud carefully as a key tenet of GDPR compliance.”

“As cloud app use continues to increase within businesses, data will become harder to track and control. But with the GDPR instigating a maximum possible fine of €20 million or 4% of global turnover (whichever is higher) in certain cases, there is now more incentive than ever for companies to focus on data protection. Getting a handle on cloud app use will be a crucial part of ensuring compliance for any organisation, and IT teams will need to start work now to meet the May 2018 compliance deadline.”

One area which has been given particular attention within the GDPR is data residency. The new regulations require that organizations do not store data in, or transfer data through, countries outside the European Economic Area which do not have equivalently strong data protection standards. The list of countries that meet these standards is short, at just 11, with one notable absentee, the United States of America, which could pose problems for numerous organizations.
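In practice, the residency requirement amounts to checking every jurisdiction that data touches against an allow-list of approved countries. The sketch below illustrates the idea only: the country codes shown are an illustrative subset, not the official adequacy list.

```python
# Sketch of a data-residency check against an allow-list of jurisdictions
# deemed to have adequate data protection. The set below is an illustrative
# subset of EEA and adequacy-listed countries, not the official list.
APPROVED_JURISDICTIONS = {"DE", "FR", "IE", "NL", "CH", "NZ", "CA"}

def transfer_allowed(route):
    """Return True only if every hop in the route is an approved jurisdiction."""
    return all(country in APPROVED_JURISDICTIONS for country in route)

print(transfer_allowed(["DE", "IE"]))        # True
print(transfer_allowed(["DE", "US", "IE"]))  # False: US is not on the list
```

As Allen's comments below suggest, the hard part is knowing the route at all: data crossing the open Internet may transit jurisdictions the sender never chose.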

While this may be considered one of the headline areas for the GDPR and one which will likely be heavily scrutinized, for Dave Allen, General Counsel at Dyn, concentrating too much on this area could lull companies into a false sense of security.

“As the EU GDPR comes into effect, businesses will need to take a hard look at their current methods of sharing and storing data,” said Allen. “While some Internet companies have begun to address new challenges at the fixed locations where data is stored – this alone will not necessarily be enough to ensure compliance.

“Those companies focusing solely on data residency may well fall victim to a false sense of confidence that sufficient steps have been taken to address these myriad regulations outlined in the GDPR. As the GDPR will hold businesses accountable for their data practices, businesses must recognise that the actual paths data travels are also a key factor to consider. In many ways, the constraints which come with the cross-border routing of data across several sovereign states mean these paths pose a more complex problem to solve.

“Although no silver bullet exists for compliance with the emerging regulations which govern data flows, businesses which rely on the global Internet to serve their customers should be seriously considering visibility into routing paths along both the open Internet and private networks. As we enter an era of emerging geographic restrictions, businesses with access to traffic patterns in real time, in addition to geo-location information, will find themselves in a much stronger position to tackle the challenges posed by the GDPR.”

Overall, the GDPR will ensure companies take a greater level of responsibility to safeguard the personal data they hold from attacks. Recent months have seen a number of highly publicised attacks significantly impact the reputation of well-known and respected brands, making consumers nervous about which of their personal information is being held. Previously, attacks on such organizations would not have been thought possible; surely they have the budgets to ensure these breaches wouldn’t happen?

Another headline proposition from the GDPR is the consumer’s right to access data which is stored on them, and also the right to have this data ‘forgotten’. For Jon Geater, CTO at Thales e-Security, this will create numerous challenges and changes to the way in which data is stored and accessed.

“The new rules also make clear another important factor that we should already have known: that you can outsource your risk, but you can’t outsource your responsibility,” said Geater. “If organisations use a third party provider to store and manage data – such as a cloud provider, for example – they are still responsible for its protection and must demonstrate exactly how the data is protected in the remote system. Therefore, formal privacy-by-design techniques need to make their way down the supply chain if companies are to avoid penalties or nightmarish discovery and analysis tasks.

“In addition, organisations will now have to provide citizens with online access to any of their own personal data they store. While the Data Protection Act traditionally allowed anyone to request access to this data, with GDPR in effect organisations must make this available for download ‘where possible’ and ‘without undue delay’.

“This is a very significant change and securing this access will represent a significant challenge to many organisations – especially while still complying with the new tighter rules – and will require robust cybersecurity technology across the board.”

What is clear is there will be complications. This shouldn’t be considered a massive surprise as any new regulations are fraught with complications on how to remain or become compliant, but the European Commission isn’t messing around this time. With fines of €20 million or 4% of global turnover (whichever is greater), the stick is a hefty one, and the carrot is yet to be seen.

Middleware Solutions | @CloudExpo #DataCenter #BigData #M2M #IoT #API

While there has been much ado about interoperability, there are still no real solutions, the same as last year and the year before that. The large EHR vendors who continue to dominate the market maintain that interoperability is all but solved, yet still can’t connect EHRs across the care continuum, causing frustration for providers and a disservice to patients. The ONC pays lip service to the problem, but that is about it. It is time for the healthcare industry to consider alternatives like middleware, which has been proven in other industries such as finance, retail and hospitality.
