All posts by Guest Author

Five key enterprise PaaS trends to look out for this year

PaaS will see a big shakeup this year according to Rene Hermes, general manager EMEA, Apprenda

The last year has shown that a growing number of enterprises are now choosing Platform as a Service (PaaS) ahead of Infrastructure as a Service (IaaS) as the cornerstone of their private/hybrid cloud strategy. While the enterprise cloud market has obviously experienced a substantial amount of change over the last year, the one thing that’s certain is that this will keep on accelerating over the coming months.

Here are five specific enterprise cloud trends that we believe will prove significant throughout the rest of 2015 and beyond.

The PaaS standard will increasingly be to containerise – While we’ve always committed to the concept of a container-based PaaS, we’re now seeing Docker popularise the concept. The broader enterprise world is now successfully vetting the viability of a container-based architecture, and we’re seeing enterprises move from just asking about containers as a roadmap item to now asking for implementation details. This year won’t necessarily see broad-based customer adoption, but we’re anticipating a major shift as PaaS becomes synonymous with the use of containers.

Practical microservices capabilities will win out over empty posturing – It’s probably fair to say that most of the microservices ‘advice’ offered by enterprise PaaS vendors to date has been questionable at best. Too many vendors have simply repackaged the Service-Oriented Architecture conversation and represented it as their microservices positioning. That may be understandable, but it hasn’t helped customers, because vendors have avoided being held accountable for microservices at both a feature and an execution level. This isn’t sustainable, and PaaS and cloud vendors will need to deliver practical guidance driven by core enterprise PaaS features if they are to be taken seriously.

Internet of Things will be a key driver for PaaS implementations – For PaaS implementations to be successful, they need to support core business use cases. However, too many PaaS implementations are deployed just to simplify the IT model so that developers can quickly build cloud-enabled applications. That approach simply isn’t going to withstand the pressure created by the increased take-up of innovations such as the Internet of Things, which will require web-service back-ends that are easy to manage, highly available and massively scalable.

Containerising OpenStack services set to create confusion – The move towards OpenStack being deployed within containers is interesting, but we believe adoption will prove slow. With many now expecting container control and management to sit within the PaaS layer, moves such as containerised OpenStack are likely just to cause confusion. Given that PaaS is becoming the dominant form of cloud assembly, containerised IaaS will stall as it conflicts directly with the continued growth in enterprises deploying private/hybrid PaaS – regardless of whether they’ve built IaaS already.

PaaS buyers to dismiss infrastructure-prescriptive solutions – Many PaaS vendors do a lot of marketing around being portable, but in reality many organisations find that such solutions can increase IT risk and drive lock-in by deliberately creating stack dependencies. We’re finding PaaS buyers much keener to challenge vendors on their infrastructure portability as early as the proof of concept phase. That’s because customers want an enterprise PaaS that doesn’t favour one infrastructure over another. To ensure this outcome, customers are now using their RFPs and proofs of concept to insist that PaaS vendors demonstrate that their solutions are portable across multiple infrastructure solutions.

By Rene Hermes, general manager EMEA, Apprenda

Bring Your Own Encryption: The case for standards

BYOE is the new black

Being free to choose the most suitable encryption for your business seems like a good idea. But it will only work in a context of recognised standards across encryption systems and providers’ security platforms. Since the start of the 21st century, security has emerged from scare-story status to become one of IT users’ biggest issues – as survey after survey confirms. Along the way a number of uncomfortable lessons are still being learned.

The first lesson is that security technology must always be considered in a human context. No one still believes in a technological fix that will put an end to all security problems, because time and again we hear news of new types of cyber attack that bypass sophisticated and secure technology by targeting human nature – from alarming e-mails ostensibly from official sources, to friendly social invitations to share a funny download; from a harmless-looking USB stick ‘accidentally’ dropped by the office entrance, to the fake policeman demanding a few personal details to verify that you are not criminally liable.

And that explains the article’s heading: a balance must be struck between achieving the desired level of protection and keeping all protection procedures quick and simple. Every minute spent making things secure is a minute lost to productivity – so the heading could equally have said “balancing security with efficiency”.

The second lesson still being learned is never to fully trust to instinct in security matters. It is instinctive to obey instructions that appear to come from an authoritative source, or to respond in an open, friendly manner to a friendly approach – and those are just the sort of instincts that are exploited by IT scams. Instincts can open us to attack, and they can also evoke inappropriate caution.

In the first years of major cloud uptake there was the oft-repeated advice to business that the sensible course would be to use public cloud services to simplify mundane operations, but that critical or high priority data should not be trusted to a public cloud service but kept under control in a private cloud. Instinctively this made sense: you should not allow your secrets to float about in a cloud where you have no idea where they are stored or who is in charge of them.

The irony is that the cloud – being so obviously vulnerable and inviting to attackers – is constantly being reinforced with the most sophisticated security measures: so data in the cloud is probably far better protected than any SME could afford to secure its own data internally. It is like air travel: because flying is instinctively scary, so much has been spent to make it safe that you are less likely to die on a flight than you are driving the same journey in the “safety” of your own car. The biggest risk in air travel is in the journey to the airport, just as the biggest risk in cloud computing lies in the data’s passage to the cloud – hence the importance of a secure line to a cloud service.

So let us look at encryption in the light of those two lessons. Instinctively it makes sense to keep full control of your own encryption and keys, rather than let them get into any stranger’s hands – so how far do we trust that instinct, bearing in mind the need also to balance security against efficiency?

BYOK

Hot on the heels of BYOD – or “Bring Your Own Device” to the workplace – comes the acronym for Bring Your Own Key (BYOK).

The idea of encryption is as old as the concept of written language: if a message might fall into enemy hands, then it is important to ensure that they will not be able to read it. We have recently been told that US forces used Native American communicators in WW2 because the chances of anyone in Japan understanding their language were near zero. More typically, encryption relies on some sort of “key” to unlock and make sense of the message it contains, and that transfers the problem of security to a new level: now that the message is secure, the focus shifts to protecting the key.

In the case of access to cloud services: if we are encrypting data because we are worried about its security in an unknown cloud, why then should we trust the same cloud to hold the encryption keys?

Microsoft, for instance, recently announced a new solution to this dilemma using HSMs (Hardware Security Modules) within its Windows Azure cloud – so that an enterprise customer can use its own internal HSM to produce a master key that is then transmitted to the HSM within the Windows Azure cloud. This provides secure encryption in the cloud, but it also means that not even Microsoft itself can read the data, because it does not have the master key hidden in the enterprise HSM.
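
To see why this arrangement keeps the provider blind, it helps to think of it as envelope encryption: a data key protects the data, and a master key that never leaves the customer protects the data key. The following Python sketch is purely illustrative – the key names and flow are assumptions for demonstration, not Microsoft’s actual Azure/HSM implementation – and it relies on the third-party cryptography package.

```python
# A minimal, illustrative sketch of the BYOK idea as envelope encryption.
# NOT Microsoft's HSM implementation; names and flow are hypothetical.
from cryptography.fernet import Fernet

# 1. The customer generates a master key inside its own environment
#    (in practice this would live in an on-premise HSM and never leave it).
master_key = Fernet.generate_key()
master = Fernet(master_key)

# 2. A data-encryption key is generated per document or workload...
data_key = Fernet.generate_key()

# 3. ...and the data is encrypted with it before it goes to the cloud.
ciphertext = Fernet(data_key).encrypt(b"commercially sensitive record")

# 4. The data key itself is wrapped (encrypted) with the customer's master key.
wrapped_data_key = master.encrypt(data_key)

# The cloud provider stores only `ciphertext` and `wrapped_data_key`.
# Without the master key, which never leaves the customer, neither can be read.
recovered_key = master.decrypt(wrapped_data_key)   # only possible on-premise
assert Fernet(recovered_key).decrypt(ciphertext) == b"commercially sensitive record"
```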

It is not so much that the enterprise cannot trust Microsoft to protect its data from attack; it is more to do with growing legal complexities. In the wake of the Snowden revelations, it has become clear that even the most well-protected data might be at risk from a government or legal subpoena demanding that its content be revealed. Under this BYOK system, however, Microsoft cannot be forced to reveal the enterprise’s secrets because it cannot access them itself, and the responsibility lies only with the owner.

This is increasingly important because of other legal pressures that insist on restricting access to certain types of data. A government can, for example, forbid anyone from allowing data of national importance to leave the country – not a simple matter in a globally connected IP network. There are also increasing legal pressures on holders of personal data to guarantee levels of privacy.

Instinctively it feels a lot more secure to manage your own key and use BYOK instead of leaving it to the cloud provider. As long as that instinct is backed by a suitable and strict in-house HSM-based security policy, it can be trusted.

BYOE

BYOK makes the best of the cloud provider’s encryption offering, by giving the customer ultimate control over its key. But is the customer happy with the encryption provided?

Bearing in mind that balance between security and efficiency, you might prefer a higher level of encryption than that used by the cloud provider’s security system, or you might find the encryption mechanism is adding latency or inconvenience and would rather opt for greater nimbleness at the cost of lighter encryption. In this case you could go a step further and employ your own encryption algorithms or processes. Welcome to the domain of BYOE (Bring Your Own Encryption).

Again, we must balance security against efficiency. Take the example of an enterprise using the cloud for deep mining its sensitive customer data. This requires so much computing power that only a cloud provider can do the job, and that means trusting private data to be processed in a cloud service. This could infringe regulations, unless the data is protected by suitable encryption. But how can the data be processed if the provider cannot read it?

Taking the WW2 example above: if a Japanese wireless operator was asked to edit the Native American message so a shortened version could be sent to HQ for cryptanalysis, any attempt to edit an unknown language would create gobbledygook, because translation is not a “homomorphic mapping”.

Homomorphic encryption means that certain processes can be performed on the encrypted data and yield the same result as performing them on the source data, without any need to decrypt the encrypted data. This usually implies arithmetical processes: the data mining software can do its mining on the encrypted data file while it remains encrypted, and the output data, when decrypted, will be the same output as if the data had been processed without any intervening encryption.

It is like operating one of those automatic coffee vendors that grinds the beans, heats the water and adds milk and sugar according to which button was pressed: you do not know what type of coffee bean is used, whether tap, filtered or spring water or whether the milk is whole cream, skimmed or soya. All you know is that what comes out will be a cappuccino with no sugar. In the data mining example: what comes out might be a neat spread-sheet summary of customers average buying habits based on millions of past transactions, without a single personal transaction detail being visible to the cloud’s provider.
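
To make the principle concrete, here is a minimal, deliberately insecure Python sketch using “textbook” RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts and then decrypting yields the product of the two plaintexts. The tiny numbers are hypothetical, and real homomorphic schemes suitable for the data-mining scenario above are far more elaborate.

```python
# Toy illustration (not production crypto): unpadded "textbook" RSA is
# multiplicatively homomorphic, so the "cloud" can combine ciphertexts
# it cannot read, and only the data owner can decrypt the result.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

m1, m2 = 7, 6                        # two secret values
c1, c2 = encrypt(m1), encrypt(m2)

# The provider multiplies the ciphertexts without ever seeing m1 or m2...
c_product = (c1 * c2) % n

# ...and the data owner decrypts the combined result: 7 * 6 = 42.
assert decrypt(c_product) == (m1 * m2) % n
print(decrypt(c_product))            # -> 42
```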

The problem with the cloud provider allowing users to choose their own encryption is that the provider’s security platform has to be able to support the chosen encryption system. As an interim measure, the provider might offer a choice from a range of encryption offerings that have been tested for compatibility with the cloud offering, but that still requires one to trust another’s choice of encryption algorithms. A full homomorphic offering might be vital for one operation, but a waste of money and effort for a whole lot of other processes.

The call for standards

So what is needed for BYOE to become a practical solution is a globally standardised cloud security platform with which any encryption offering can be registered for support. The customer chooses a cloud offering for its services and for its certified “XYZ standard” security platform, then goes shopping for an “XYZ certified” encryption system that matches its particular balance between security and practicality.

Just as in the BYOD revolution, this decision need not be made at an enterprise level, or even by the IT department. BYOE, if sufficiently standardised, could become the responsibility of the department, team or individual user: just as you can bring your own device to the office, you could ultimately take personal responsibility for your own data security.

What if you prefer to use your very own implementation of your own encryption algorithms? All the more reason to want a standard interface! This approach is not so new for those of us who remember the Java J2EE Crypto library – as long as we complied with the published interfaces, anyone could use their own crypto functions. This “the network is the computer” ideology becomes all the more relevant in the cloud age. As the computer industry has learned over the past 40 years, commonly accepted standards and architecture (for example the Von Neumann model or J2EE Crypto) play a key role in enabling progress.
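
A rough sketch of what such a published interface might look like is shown below. The class and method names are hypothetical and the XOR “cipher” is deliberately trivial; the point is only that any implementation honouring the agreed contract becomes interchangeable behind the platform.

```python
# Sketch of the kind of published interface a standard security platform
# could certify against; names are hypothetical, the cipher is NOT secure.
from abc import ABC, abstractmethod

class EncryptionProvider(ABC):
    """Contract the platform certifies; implementations are swappable."""

    @abstractmethod
    def encrypt(self, plaintext: bytes, key: bytes) -> bytes: ...

    @abstractmethod
    def decrypt(self, ciphertext: bytes, key: bytes) -> bytes: ...

class XorDemoProvider(EncryptionProvider):
    """A bring-your-own implementation (deliberately trivial)."""

    def encrypt(self, plaintext: bytes, key: bytes) -> bytes:
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

    def decrypt(self, ciphertext: bytes, key: bytes) -> bytes:
        return self.encrypt(ciphertext, key)   # XOR is its own inverse

def store_in_cloud(provider: EncryptionProvider, data: bytes, key: bytes) -> bytes:
    # The platform only ever talks to the interface, never the implementation.
    return provider.encrypt(data, key)

blob = store_in_cloud(XorDemoProvider(), b"customer record", b"secret-key")
assert XorDemoProvider().decrypt(blob, b"secret-key") == b"customer record"
```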

BYOE could prove every bit as disruptive as BYOD – unless the industry can ensure that users choose their encryption from a set of globally sanctioned and standardised encryption systems or processes. If business is to reap the full benefits promised by cloud services, it must have the foundation of such an open cloud environment.

Written by Dr. Hongwen Zhang, chair security working group, CloudEthernet Forum.

The channel must embrace cloud to build for the future

The channel needs to embrace cloud services in order to succeed in IT today

With cloud acceptance growing, more and more businesses are dipping their toes in the water and trying out cloud-based services and applications in a bid to work smarter and lower IT expenditure. But with recent research suggesting that four in ten ICT decision-makers feel their deployment fails to live up to the hype, more needs to be done to ensure cloud migration is a success.

This is where the channel has a vital role to play and can bridge the knowledge gap and help end-users reap the benefits that cloud technology can provide.

With the cloud becoming a mainstream solution for businesses and an integral part of an organisation’s IT strategy, the channel is presented with a huge opportunity. Offering cloud services to the market has the potential to yield high revenues, so it’s vital that the channel takes a realistic approach to adopting cloud within its portfolio, and becomes a trusted advisor to the end user.

We have identified three key reasons why resellers shy away from broadening their offering to encompass cloud for new and existing customers. A common barrier is a simple lack of understanding of the cloud and its benefits. However, if a business is keen to adopt this technology, it is vital that its reseller is able to offer advice and guidance to prevent them looking elsewhere.

Research by Opal back in 2010 found that 40 per cent of resellers admit a sense of ‘fear and confusion’ around cloud computing, and that this apprehension about embracing the technology also extends to end users, with 57 per cent of resellers reporting uncertainty among their customer bases. This lack of education means they are missing out on huge opportunities for their business. A collaborative approach between the reseller and cloud vendor will help to ensure a seamless knowledge transfer followed by successful partnership and delivery.

The sheer upheaval caused by offering the cloud will see some resellers needing to re-evaluate their own business models and strategies to fulfil the need. Those that are unaccustomed to a service-oriented business model may find that becoming a cloud reseller presents strategic challenges as they rely on out-dated business plans and models that don’t enable this new technology. However, failing to evolve business models could leave resellers behind in the adoption curve, whilst their competitors are getting ahead. Working with an already established partner will help resellers re-evaluate their existing business plans to ensure they can offer cloud solutions to their customers.

Resellers are finding it challenging to provide their customers with quick, scalable cloud solutions due to the fact that moving existing technology services into cloud services can be time consuming, and staff will be focused on working to integrate these within the enterprise. However, this issue can easily be resolved by choosing a trusted cloud provider, and in turn building a successful partnership.

Although resellers will come across barriers when looking at providing their customers with cloud services, these shouldn’t get in the way of progression. In order to enter a successful partnership with a cloud provider, there are some important factors resellers should consider before taking the plunge.

Scalability

Before choosing a prospective partner, resellers need to ensure the provider has the scalability and technology innovation to provide a simple integration of current IT services into the cloud. Recent research suggests that deploying cloud services from three or more suppliers can damage a company’s business agility, and UK businesses state a preference for procuring cloud services from a single supplier for ease of management. It’s important to make sure the chosen provider has the ability to provide one fully encompassing cloud service that can offer everything their customers require.

Brand reputation

Choosing a partner that offers not only a best-of-breed private, public and hybrid cloud solution, but also the ability to provide the reseller with a branded platform, will give an extra layer of credibility to the business for not only existing customers, but future ones as well. Resellers are more likely to choose a cloud provider that gives them control over the appearance of the cloud platform, as well as support for and access to its infrastructure.

Industry experience

It’s vital to ensure the cloud provider has extensive industry experience and knowledge, with a proven track record, in order to meet the required criteria of scalability and performance. The partner must have the knowledge to educate and offer advice to the reseller, who can then pass this knowledge on to their own customers.

By not offering the cloud, resellers will miss out on vast opportunities and in turn, lose potential revenue as well as new and existing customers. The channel must now embrace the cloud and take advantage of the partnerships available in order to succeed.

Written by Matthew Munson, chief technology officer, Cube52

ISO 27018 and protecting personal information in the cloud: a first year scorecard

ISO 27018 has been around for a year – but is it effective?

A year after it was published, ISO 27018 – the first international standard focusing on the protection of personal data in the public cloud – continues, unobtrusively and out of the spotlight, to move centre stage as the battle for cloud pre-eminence heats up.

At the highest level, this is a competitive field for those with the longest investment horizons and the deepest pockets – think million square foot data centres with 100,000+ servers using enough energy to power a city. According to research firm Synergy, the cloud infrastructure services market – Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and private and hybrid cloud – was worth $16bn in 2014, up 50 per cent on 2013, and is predicted to grow 30 per cent to over $21bn in 2015. Synergy estimated that the four largest players accounted for 50 per cent of this market, with Amazon at 28 per cent, Microsoft at 11 per cent, IBM at 7 per cent and Google at 5 per cent. Of these, Microsoft’s 2014 revenues almost doubled over 2013, whilst Amazon’s and IBM’s were each up by around half.

Significantly, the proportion of computing sourced from the cloud compared to on-premise is set to rise steeply: enterprise applications in the cloud accounted for one fifth of the total in 2014 and this is predicted to increase to one third by 2018.

This growth represents a huge year-on-year increase in the amount of personal data (PII, or personally identifiable information) going into the cloud and in the number of cloud customers contracting for the various and growing types of cloud services on offer. But as the cloud continues to grow at these startling rates, the biggest inhibitor to cloud services growth – trust about security of personal data in the cloud – continues to hog the headlines.

Under data protection law, the Cloud Service Customer (CSC) retains responsibility for ensuring that its PII processing complies with the applicable rules.  In the language of the EU Data Protection Directive, the CSC is the data controller.  In the language of ISO 27018, the CSC is either a PII principal (processing her own data) or a PII controller (processing other PII principals’ data).

Where a CSC contracts with a Cloud Service Provider (CSP), Article 17 of the EU Data Protection Directive sets out how the relationship is to be governed. The CSC must have a written agreement with the CSP; must select a CSP providing ‘sufficient guarantees’ over the technical security measures and organizational measures governing PII in the cloud service concerned; must ensure compliance with those measures; and must ensure that the CSP acts only on the CSC’s instructions.

As the pace of migration to the cloud quickens, the world of data protection law continues both to be fragmented – 100 countries have their own laws – and to move at a pace driven by the need to mediate all competing interests rather than the pace of market developments.

In this world of burgeoning cloud uptake, ISO 27018 is proving effective at bridging the gap between the dizzying pace of cloud market development and the slow and uncertain rate of legislative change by providing CSCs with a workable degree of assurance in meeting their data protection law responsibilities. Almost a year on from publication of the standard, Microsoft has become the first major CSP (in February 2015) to achieve ISO 27018 certification for its Microsoft Azure (IaaS/PaaS), Office 365 (PaaS/SaaS) and Dynamics CRM Online (SaaS) services (verified by BSI, the British Standards Institution) and its Microsoft Intune SaaS services (verified by Bureau Veritas).

In the context of privacy and cloud services, ISO 27018 builds on other information security standards within the ISO 27000 family. This layered, interlocking approach is proving supple enough in practice to deal with the increasingly wide array of cloud services. For example, it is not tied to any particular kind of cloud service and, as Microsoft’s certifications show, applies to IaaS (Azure), PaaS (Azure and Office 365) and SaaS (Office 365 and Intune). If, as shown in the graphic below, you consider computing services as a stack of layered elements ranging from networking (at the bottom of the stack) up through equipment and software to data (at the top), and that each of these elements can be carried out on premise or from the cloud (from left to right), then ISO 27018 is flexible enough to cater for all situations across the continuum.

Software as a Licence to Software as a Service: the cloud continuum

Indeed, the standard specifically states at Paragraph 5.1.1:

“Contractual agreements should clearly allocate responsibilities between the public cloud PII processor [i.e. the CSP], its sub-contractors and the cloud service customer, taking into account the type of cloud service in question (e.g. a service of an IaaS, PaaS or SaaS category of the cloud computing reference architecture).  For example, the allocation of responsibility for application layer controls may differ depending on whether the public cloud PII processor is providing a SaaS service or rather is providing a PaaS or IaaS service upon which the cloud service customer can build or layer its own applications.”

Equally, CSPs will generally not know whether their CSCs are sending PII to the cloud and, even if they do, they are unlikely to know whether or not particular data is PII. Here, another strength of ISO 27018 is that it applies regardless of whether particular data is, or is not, PII: certification simply assures the CSC that the service the CSP is providing is suitable for processing PII in relation to the performance by the CSP of its PII legal obligations.

Perhaps the biggest practical boon to the CSC however is the contractual certainty that ISO 27018 certification provides.  As more work migrates to the cloud, particularly in the enterprise space, the IT procurement functions of large customers will be following structured processes in order to meet the requirements of their business and, in certain cases, their regulators. In their requests for information, proposals and quotations from prospective CSPs, CSCs now have a range of interlocking standards including ISO 27018 to choose from in their statements of requirements for a particular Cloud procurement.  As well as short-circuiting the need for CSCs to spend time in writing up detailed specifications of their own requirements, verified compliance with these standards for the first time provides meaningful assurance and protection from risk around most aspects of cloud service provision. Organisations running competitive tenders can benchmark bidding CSPs against each other on their responses to these requirements, and then include as binding commitments the obligations to meet the requirements of the standards concerned in the contract when it is let.

In the cloud contract lifecycle, the flexibility provided by ISO 27018 certification, along with the contract and the CSP’s policy statements, goes beyond this to provide the CSC with a framework for discussing with the CSP, on an ongoing basis, the cloud PII measures taken and their adequacy.

In its first year, it is emerging that complying, and being seen to comply, with ISO 27018 is providing genuine assurance for CSCs in managing their data protection legal obligations.  This reassurance operates across the continuum of cloud services and through the procurement and contract lifecycle, regardless of whether or not any particular data is PII.  In customarily unobtrusive style, ISO 27018 is likely to go on being a ‘win’ for the standards world, cloud providers and their customers, and data protection regulators and policy makers around the world.


Giving employees the cloud they want

Businesses are taking the wrong approach to their cloud policies

There is an old joke about the politician who is so convinced she is right when she goes against public opinion, that she states, “It’s not that we have the wrong policies, it’s that we have the wrong type of voters!” The foolishness of such an attitude is obvious and yet, when it comes to mandating business cloud usage, some companies are still trying to live by a similar motto despite large amounts of research to the contrary.

Cloud usage has grown rapidly in the UK, with adoption rates shooting up over 60% in the last four years, according to the latest figures from Vanson Bourne. This reflects the increasing digitalisation of business and society and the role cloud has in delivering that.  Yet, there is an ongoing problem with a lack of clarity and understanding around cloud policies and decision making within enterprises at all levels. This is only natural, as there is bound to be confusion when the IT department and the rest of the company have differing conceptions about what the cloud policy is and what it should be. Unfortunately, this confusion can create serious security issues, leaving IT departments stuck between a rock and a hard place.

Who is right? The answer is, unsurprisingly, both!  Increasingly non-IT decision makers and end-users are best placed to determine the value of new services to the business; but IT departments have long experience and expertise in the challenges of technology adoption and the implications for corporate data security and risk.

Cloud policy? What cloud policy?

Recent research from Trustmarque found that more than half (56 per cent) of office workers said their organisation didn’t have a cloud usage policy, while a further 28 per cent didn’t even know if one was in operation. Despite not knowing their employer’s cloud policy, nearly 1 in 2 office workers (46 per cent) said they still used cloud applications at work. Furthermore, 1 in 5 cloud users admitted to uploading sensitive company information to file sharing and personal cloud storage applications.

When employees aren’t sure how to behave in the cloud and companies don’t know what information employees are disseminating online, the question of a security breach becomes one of when, not if. Moreover, with 40 per cent of cloud users admitting to knowingly using cloud applications that haven’t been sanctioned or provided by IT, it is equally clear that employee behaviour isn’t about to change. Therefore, company policies must change instead – which often is easier said than done. On the one hand, cloud applications are helping increase productivity for many enterprises, and on the other, the behaviour of some staff is unquestionably risky. The challenge is maintaining an IT environment that supports employees’ changing working practices, but at the same time is highly secure.

By ignoring cloud policies, employees are also contributing to cloud sprawl. More than one quarter of cloud users (27 per cent) said they had downloaded cloud applications they no longer use. The sheer number and variety of cloud applications being used by employees means costs can quickly spiral out of control. This provides another catch-22 situation for CIOs seeking balance, as they look to keep costs down, ensure information security and empower employees to use the applications needed to work productively.

The road to bad security is paved with good intentions

The critical finding from the research is that employees know what they are doing is not sanctioned by their organisation and still engage in that behaviour. However, it’s important to recognise that this is generally not due to malicious intent, but rather because they see the potential benefits for themselves or their organisation and security restrictions mean their productivity is hampered – so employees look for a way around those barriers.

It is not in the interest of any business to constrain the impulse of employees to try to be more efficient. Instead, businesses should be looking for the best way to channel that instinct while improving security. There is a real opportunity for those businesses that can marry employees’ desire to use the cloud productively with the appropriate security precautions, and so get the very best out of cloud for the enterprise.

Stop restricting and start empowering

The ideal solution for companies is to move towards an integrated cloud adoption/security lifecycle that links measurement, risk/benefit assessment and policy creation, policy enforcement, education and app promotion, so that there is a positive feedback loop reinforcing both cloud adoption and good security practices. This means an organisation will gain visibility into employees’ activity in the cloud so that it can allow their favourite applications to be used while blocking specific risky activity. This is far more effective than a blanket ban as it doesn’t compromise the productive instincts of employees, but instead encourages good behaviour and promotes risk-aware adoption. In order for this change to be effected, IT departments need to alter their mindset and become brokers of services such as cloud, rather than builders of constricting systems. If organisations can empower their users by, for example, providing cloud-enabled self-service, single sign-on and improved identity lifecycle management, they can simultaneously simplify adoption and reduce risk.

Ignorance of cloud policies among staff significantly raises the possibility of data loss, account hijacking and other cloud-related security threats. Yet since the motivation is, by and large, the desire to be productive rather than malicious, companies need to find a way to blend productivity and security instead of having them square off against each other. It is only through gaining visibility into cloud usage behaviour that companies can get the best of both worlds.

Written by James Butler, chief technology officer, Trustmarque

The Internet of Things: Where hope tends to triumph over common sense

The Internet of Things is coming. But not anytime soon.

The excitement around the Internet of Things (IoT) continues to grow, and even more bullish predictions and lavish promises will be made about and on behalf of it in the coming months. 2015 will see us reach “peak oil” in the form of increasingly outlandish predictions and plenty of over-enthusiastic venture capital investments.

But the IoT will not change the world in 2015. It will take at least 10 years for the IoT to become pervasive enough to transform the way we live and work, and in the meantime it’s up to us to decode the hype and figure out how the IoT will evolve, who will benefit, and what it takes to build an IoT network.

Let’s look at the predictions that have been made for the number of connected devices. The figure of 1 trillion has been used several times by a range of incumbents and can only have been arrived at using a very, very relaxed definition of what a “connected thing” is. Of course, if you’re willing to include RFID tags in your definition this number is relatively easy to achieve, but it doesn’t do much to help us understand how the IoT will evolve. At Ovum, we’re working on the basis of a window of between 30 billion and 50 billion connected devices by 2020. The reason for the large range is that there are simply too many factors at play to be any more precise.

Another domain where enthusiasm appears to be comfortably ahead of common sense is in discussions about the volume of data that the IoT will generate. Talk of an avalanche of data is nonsense. There will be no avalanche; instead we’ll see a steadily rising tide of data that will take time to become useful. When building IoT networks the “data question” is one of the things architects spend a lot of time thinking and worrying about. In truth, the creators of IoT networks are far more likely to be disappointed that their network is taking far longer than expected to reach the scale of deployment necessary to produce the volumes of data they had boasted about to their backers.

Even the question of who will make money out of the IoT, and where they will make it, is being influenced too much by hope and not enough by common sense. The future of the IoT does not lie in the connected home or in bracelets that count your steps and measure your heartbeat. The vast majority of IoT devices will not beautify our homes or help us with our personal training regime. Instead they will be put to work performing very mundane tasks like monitoring the location of shipping containers, parcels, and people. The “Industrial IoT”, which spans manufacturing, utilities, distribution and logistics, will make up by far the greatest share of the IoT market. These devices will largely remain unseen by us, most will be of an industrial grey colour, and only a very small number of them will produce data that is of any interest whatsoever outside a very specific and limited context.

Indeed, the “connected home” is going to be one of the biggest disappointments of the Internet of Things, as its promoters learn that the ability to change the colour of your living room lights while away on business doesn’t actually amount to a “life changing experience”. That isn’t to say that our homes won’t be increasingly instrumented and connected – they will. But the really transformational aspects of the IoT lie beyond the home.

There are two other domains where IoT will deliver transformation, but over a much longer timescale than enthusiasts predict. In the world of automotive, cars will become increasingly connected and increasingly smart. But it will take over a decade before the majority of cars in use can boast the levels of connectivity and intelligence we are now seeing in experimental form. The other domain that will be transformed over the long-term is healthcare, where IoT will provide us with the ability to monitor and diagnose conditions remotely, and enable us to deliver increasingly sophisticated healthcare services well beyond the boundaries of the hospital or the doctor’s surgery.

But again, we are in the earliest stages of research and experimentation and proving some of the ideas are practical, safe and beneficial enough to merit broader roll-out will take years and not months. The Internet of Things will transform the way we understand our environment as well as the people and things that exist within it, but that transformation will barely have begun by 2020.

Gary Barnett is Chief Analyst, Software with Ovum and also serves as the CTO for a non-profit organisation that is currently deploying what it hopes will become the world’s biggest urban air quality monitoring network.

How to achieve success in the cloud

To cloud or not to cloud? With the right strategy, it need not be the question.

There are two sides to the cloud coin: one positive, the other negative, and too many people focus on one at the expense of the other for a variety of reasons ranging from ignorance to wilful misdirection. But ultimately, success resides in embracing both sides and pulling together the capabilities of both enterprises and their suppliers to make the most of the positive and limit the negative.

Cloud services can either alleviate or compound the business challenges identified by Ovum’s annual ICT Enterprise Insights program, based on interviews with 6,500 senior IT executives. On the positive side both public and private clouds, and everything in between, help:

Boost ROI at various levels: From squeezing more utilization from the underlying infrastructure to making it easier to launch new projects with the extra resources exposed as a result.

Deal with the trauma of major organisational/ structural changes as they can adapt to the ups and downs of requirements evolution.

Improve customer/citizen experience, and therefore satisfaction: This has been one of the top drivers for cloud adoption. Cloud computing is at its heart user experience-centric. Unfortunately many forget this, preferring instead to approach cloud computing from a technical perspective.

Deal with security, security compliance, and regulatory compliance: An increasing number of companies acknowledge that public cloud security and compliance credentials are at least as good as, if not better than, their own, particularly in a world where security and compliance challenges are evolving so rapidly. Similarly, private clouds require security to shift from reactive and static to proactive and dynamic, whereby workloads and data need to be secured as they move in and out of internal IT’s boundaries.

On the other hand, cloud services have the potential to compound business challenges. For instance, the rise of public cloud adoption contributes to challenges related to increasing levels of outsourcing. It is all about relationship management, and therefore relates to another business challenge: improving supplier relationships.

In addition to having to adapt to new public cloud offerings (rather than the other way round), once the right contract is signed (another challenging task), enterprises need to proactively manage not only their use of the service but also their relationships with the service provider, if only to be able to keep up with their fast-evolving offerings.

Similarly, cloud computing adds to the age-old challenge of aligning business and IT at two levels: cloud-enabling IT, and cloud-centric business transformation.

From a cloud-enabling IT perspective, the challenge is to understand, manage, and bridge a variety of internal divides and convergences, including consumer versus enterprise IT, developers versus IT operations, and virtualisation ops people versus network and storage ops. As the pace of software delivery accelerates, developers and administrators need not only to learn from and collaborate with one another, but also to deliver the right user experience – not just the right business outcomes. Virtualisation ops people tend to be much more in favour of software-defined datacentre, storage, and networking (SDDC, SDS, SDN) than network and storage ops people, with a view to increasingly taking control of datacentre and network resources. The storage and network ops people, however, are not so keen on letting the virtualisation people in.

When it comes to cloud-centric business transformation, IT is increasingly defined in terms of business outcomes within the context of its evolution from application siloes to standardised, shared, and metered IT resources, from a push to a pull provisioning model, and more importantly, from a cost centre to an innovation engine.

The challenge, then, is to understand, manage, and bridge a variety of internal divides and convergences including:

Outside-in (public clouds for green-field application development) versus inside-out (private cloud for legacy application modernization) perspectives. Supporters of the two approaches can be found on both the business and IT sides of the enterprise.

Line-of-business executives (CFO, CMO, CSO) versus CIOs regarding cloud-related roles, budgets, and strategies: The up-and-coming role of chief digital officer (CDO) exemplifies the convergence between technology and business C-level executives. All CxOs can potentially fulfil this role, with CDOs increasingly regarded as “CEOs in waiting”. In this context, there is a tendency to describe the role as the object of a war between CIOs and other CxOs. But what digital enterprises need is not CxOs battling each other, but coordinating their IT investments and strategies. That is easier said than done since, beyond the usual political struggles, there is a disparity between all sides in terms of knowledge, priorities, and concerns.

Top executives versus middle management: Top executives who are broadly in favour of cloud computing in all its guises, versus middle management who are much less eager to take it on board, but need to be won over since they are critical to cloud strategy execution.

Shadow IT versus Official IT: Official IT acknowledges the benefits of shadow IT (it makes an organisation more responsive and capable of delivering products and services that IT cannot currently support) as well as its shortcomings (in terms of costs, security, and lack of coordination, for example). However, too much focus on control at the expense of user experience and empowerment perpetuates shadow IT.

Only by understanding and bridging these divides will your organisation manage to balance both sides of the cloud coin.

Laurent Lachal leads Ovum Software Group’s cloud computing research. Besides Ovum, where he has spent most of his 20-year career as an analyst, Laurent has also been European software market group manager at Gartner Ltd.

Cyberoam Provides Critical Insight for Virtual Datacenter Administrators

Guest Post by Natalie Lehrer, a senior contributor for CloudWedge.

Organizations must provide reliable technical resources in order to keep a business running in an efficient manner. Network security is one of the chief concerns of all companies regardless of size. Although corporations are often pressed to earn profits, the need to protect all company-related data at any cost should be a top priority.

Virtual datacenters can be susceptible to a variety of threats including hyperjacking, DoS attacks and more. Keeping up to date with the latest server patches and security bulletins, and being aware of the latest malware threats, is more important than ever. Therefore, it is critical that all incoming network traffic is properly scanned for viruses and malicious code that could possibly corrupt or cause the malfunction of the virtual datacenter.

What is the Solution?

Network appliances such as Cyberoam can act as a unified threat management suite. In addition, Cyberoam scans all incoming and outgoing traffic while producing detailed reports for system administrators. These granular reports list all virtual datacenter activity while providing logs that give forensic computer scientists direction on where to focus their investigations. Since any activities performed on virtual servers can be retained using Cyberoam, the audit process can provide a clear trail that will lead you to the culprit in case of a data breach. Cyberoam is not merely a reactive solution: it proactively scans all incoming and outgoing data in case viruses and other harmful programs try to compromise and corrupt your entire virtual datacenter.

Security intricacies include intrusion protection services, specialized auditing applications and robust firewall features. Firewalls play an important role in keeping harmful material from compromising virtual servers: they essentially block intruders while simultaneously allowing legitimate TCP or UDP packets to enter your system. Cyberoam gives administrators the ability to easily construct firewall rules that keep internal data safe and secure.
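
Conceptually, a firewall evaluates ordered rules against each packet and denies anything not explicitly allowed. The Python sketch below is generic pseudologic to illustrate that idea – it is not Cyberoam’s rule syntax or engine, and the example ports and rules are assumptions.

```python
# Conceptual sketch of firewall rule evaluation: allow explicitly
# whitelisted TCP/UDP traffic, deny everything else by default.
# Generic illustration only - not Cyberoam's actual rule format.
from dataclasses import dataclass

@dataclass
class Rule:
    protocol: str        # "tcp" or "udp"
    dest_port: int
    action: str          # "allow" or "deny"

RULES = [
    Rule("tcp", 443, "allow"),   # HTTPS to the virtual datacenter
    Rule("tcp", 22,  "allow"),   # SSH for administrators
    Rule("udp", 53,  "allow"),   # DNS
]

def evaluate(protocol: str, dest_port: int) -> str:
    """First matching rule wins; anything unmatched is denied by default."""
    for rule in RULES:
        if rule.protocol == protocol and rule.dest_port == dest_port:
            return rule.action
    return "deny"

assert evaluate("tcp", 443) == "allow"
assert evaluate("tcp", 3389) == "deny"   # unsolicited RDP is dropped
```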

When you set up your virtual datacenter, it is important to utilize all of the features at your disposal. Sometimes the most obscure features are the most valuable. The best way to keep your virtual datacenter safe is to stay on top of the latest knowledge. There have been reports that many IT professionals find themselves intimidated by new technology simply because they have not taken the initiative to learn about the latest datacenter hardware and software available to them today. If you are trying to stay one step ahead of the game, your best bet is to learn all about the tools on the market and make your decision accordingly. Be sure to scrutinize any appliance you decide to utilize inside of your datacenter before adding it into your arsenal of IT weaponry.

Natalie Lehrer is a senior contributor for CloudWedge.

In her spare time, Natalie enjoys exploring all things cloud and is a music enthusiast.

Follow Natalie’s daily posts on Twitter: @Cloudwedge, or on Facebook.

Think Office 365 is a Maintenance-Free Environment? Not So Fast …

Guest Post by Chris Pyle, Champion Solutions Group

So you’ve made the move to Office 365. Great!

You think you’ve gone from worrying about procuring Exchange hardware and storage capacity, being concerned about email recovery plans, and having to keep up with the constant maintenance of your Exchange server farm and the backing up of your data, to relying on Office 365 to provide virtually anywhere-access to Microsoft tools.

Sounds pretty good, and we won’t blame you if you’re thinking that your move to the cloud has just afforded you a maintenance-free environment, but not so fast.

While the cost savings and convenience may make it seem like a no-brainer, what many administrators often forget is that the cloud itself doesn’t make email management any easier – there are still a ton of tasks that need to be done to ensure usability and security.

Indeed while moving mailboxes to the cloud may be efficient and provide cost savings, it doesn’t mean administration ends there. Not by any means.

Not to worry: for starters, Office 365 admins looking for a faster and easier way to handle mail administration tasks have a number of tools at their disposal, such as our 365 Command by MessageOps. 365Command replaces the command line interface of Windows® PowerShell with a rich, HTML5 graphical user interface that is easy to navigate and makes quick work of changing mailbox settings, monitoring usage and reporting (and did we say you don’t need to know PowerShell?).

From our users, who manage about 1 million mailboxes, we see that the most effective 365 administrators break down maintenance and tasks into daily, weekly, monthly, and quarterly buckets. Breaking down tasks this way simplifies work-flow, and the best part is that this can be easily implemented into your routine and should heighten the value and success of utilizing Office 365.

Here are best practices for getting started:

Daily: Mailbox administrators are constantly responding to addition, change, and removal requests for their Office 365 accounts. The most common are daily tasks that are quickly resolved, for example “forgot my password”, “need access to folder X”, “executive Y is on maternity leave, can you forward her files”, and so on:

  1. Modifying Passwords

  2. Modifying Folder Permissions

  3. Mailbox Forwarding

  4. Creating Single and Shared Mailboxes

Weekly: Weekly task groupings are geared toward helping administrators keep a watchful eye on growth and scalability, security, speed and access. For example, checking for new devices that have been added to mailboxes, comparing them with previous weeks, and verifying that the user did indeed add the new device and that it does not represent a potential risk of theft or fraud:

  1. Review Top Mailbox Growth by Size

  2. Review Office 365 Audit Logs

  3. Review Mobile Security

  4. Review Shared Mailbox Growth (shared mailboxes only have a 10GB limit!)

  5. Review the exact location of their servers and their mailboxes within the Microsoft data centers

Monthly: OK, now you’re cooking with gasoline — with those annoying daily tasks and cumbersome weekly tasks out of the way, top-level administrators turn their full attention to security and access, where we can never have a lapse in attention:

  1. They run reports and lists of all users last login date. They are checking for people who may no longer be employed with the company, thus eliminating the need for that mailbox and its associated cost from Microsoft. Or if there is limited use, they could move the end user to a less expensive Office 365 SKU, again reducing their overall O365 costs.

  2. From a security standpoint, they are running reports to see who is forwarding their mailboxes to external mailboxes, such as sending their email to their home email account (Gmail/Yahoo/ Hotmail, etc.)

  3. Review password strength and the passwords that are set to expire on a monthly basis, ensuring their mailboxes are safe and secure.

  4. Review mailbox permissions, and review who has Send As privileges in their organization. They are confirming with the end user that they allowed these people to have the ability to send email as them.

  5. Review which employees have Full Mailbox access privileges. They confirm with the end user that they do want those additional users to have full access to their mail and calendar.

Quarterly: See how easy this is now? You’ve cleared out the clutter, and made sure every box on the system is secure. You’ve taken the steps to keep the system running fast and true, with consistent access and performance across the enterprise. Now kick back, light a fat stogie and do some light clean up and maintenance:

  1. Group clean up: review all email groups to ensure they have active members, check which groups contain people who are no longer employed or contractors who are no longer involved, identify which groups aren’t being utilized, etc.

  2. Review the Edit Permissions list.

  3. Review Non Password changes in 90 days.

Conclusion

Just because you’ve moved to the cloud doesn’t mean management and maintenance of your mailboxes stops there. Many of these best practices would require knowledge of PowerShell, but who wants to deal with that? Save yourself lots of trouble and find a tool that will manage these activities, streamline your work-flow and jump-start your productivity.

Christopher Pyle is President & CEO for Champion Solutions Group. He is also an active member of Vistage International, an executive leadership organization, and is a Distinguished Guest Lecturer at Florida Atlantic University’s Executive Forum Lecture Series.

Google, Amazon Outages a Real Threat For Those Who Rely On Cloud Storage

Guest Post by Simon Bain, CEO of SearchYourCloud.

It was only for a few minutes, but Google was down. This follows hot on the heels of the other major cloud provider, Amazon, being down for a couple of hours earlier in August. Even a relatively short outage like this could be a real problem for organizations that rely on these services to store their enterprise information. I am not a great lover of multi-device synchronization – I mean, all those versions kicking around your systems! However, if done well, it could be one of the technologies that help save ‘Cloud Stores’ from the idiosyncrasies of the Internet and a connected life.

We seem to be currently in the silly season of outages, with Amazon, Microsoft and Google all stating that their problems were caused by a switch being replaced or an update going wrong.

These outages may seem small for the supplier. But they are massive for the customer, who is unable to access sales data or invoices for a few hours.

This however, should not stop people using these services. But it should make them shop around, and look at what is really on offer. A service that does not have synchronization may well sound great. But if you do not have a local copy of your document on the device that you are actually working on, and your connection goes down, for whatever reason, then your work will stop.

SearchYourCloud Inc. has recently launched SearchYourCloud, a new application that enables people to securely find and access information stored in Dropbox, Box, GDrive, Microsoft Exchange, SharePoint or Outlook.com with a single search, using either a Windows PC or any iOS device. SearchYourCloud will also be available for other clouds later in the year.

SearchYourCloud enables users to not only find what they are searching for, but also protects their data and privacy in the cloud.

Simon Bain is Chief Architect and CEO of SearchYourCloud, and also serves on the Board of the Sun Microsystems User Group.