
How to enable the ISV community with cloud computing


Access to computing resources has been democratised through low-cost cloud platforms. Network access is now high-speed, low-cost and almost ubiquitous. Together, this means that the cloud provides a perfect platform for the development and scalable delivery of applications, and the ISV community has a significant opportunity to take advantage.

Delivering value for customers

The SaaS model creates new opportunities for both ISVs and their customers. Consumption-based charging models enable both a low cost of entry and a low cost of software, so clients can experiment with applications that optimise business processes and drive higher efficiency, productivity and growth.

Such cloud-based models allow ISVs to focus on their core goals: developing and delivering applications, and improving the customer experience. Tasks like capacity management, infrastructure budget management and platform availability can all be offloaded to a cloud partner, and, importantly, the costs of such operations can be married to usage and revenue for the ISV.

Other tasks can potentially be offloaded too: ISVs working with a Managed Service Provider can hand over patching, replication, redundancy and security. With the right partner, an ISV can bring agility to the DevOps cycle and then rely on the service provider to implement change control, security and compliance enhancements and business continuity, and to deliver a robust availability and performance SLA for production applications.

Enabling customisation

ISVs can take a pragmatic approach to a move to SaaS, establishing whether it is a good fit for their client base, their financial model and their software offering. However, they might have use cases that are better served by a partner that can deliver hybrid solutions.

Applications delivered to end-users in heavily regulated industries may not be suited to multi-tenanted platforms, but may still sit well on discrete, dedicated infrastructure with appropriate security and compliance controls. Cloud platforms where resources are rented by the hour may not offer the best value for applications with predictable workloads, or for those where the end-user signs fixed-term contracts.

The combination of opportunities presented by IaaS and SaaS models has expanded the options available to ISVs for software development and delivery, and in turn given end-users greater choice and better-value solutions.

The cloud is reducing barriers to entry for new software businesses and allowing existing ISVs to be more agile, responsive to customers and innovative. Both the customers of these solutions and the ISVs themselves stand to gain considerable benefits by taking advantage of cloud infrastructure and managed services, as long as due diligence is undertaken in the transition.

Choosing a cloud provider

Developers are now leveraging Infrastructure-as-a-Service (IaaS) and other cloud platforms to spin up cloud instances and tear them down when they are done, without any manual intervention and all from one portal. However, there is a wide variety of service providers offering an equally wide variety of cloud services, so software businesses need to consider their specific needs carefully when selecting a provider.

Key considerations should include:

  • Uptime commitment from the service provider, particularly how this relates to the critical needs of the relevant application
  • Whether a transparent billing mechanism is built into the cloud offering
  • API functionality to allow your software to drive the infrastructure (a minimal sketch follows this list)
  • Tools or native platform capabilities that allow rapid cloning and deployment of whole environments, enabling faster and more reliable DevOps cycles
  • Whether that partner can deliver a robust, managed service wrap that enhances the user experience and delivers security to client data
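
As an illustration of the API point above, here is a minimal sketch of driving infrastructure from code: creating an instance for a short-lived test environment and tearing it down when finished. It assumes an AWS-style EC2 endpoint accessed through the boto3 library; the region, image ID and instance type are placeholders, and any given provider's own API will differ.

```python
# Minimal sketch: creating and destroying a cloud instance via an
# IaaS API. Assumes an AWS-style EC2 endpoint driven with boto3;
# the image ID and instance type below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Spin up a single instance for a short-lived test environment.
response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",     # placeholder image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]

# ... deploy and test the application here ...

# Tear the instance down when done, so billing stops with usage.
ec2.terminate_instances(InstanceIds=[instance_id])
```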

The challenges ahead

Despite the opportunities it offers, the cloud is not a panacea for ISVs and challenges still remain in the delivery of secure, reliable software solutions to end-users. However, by partnering with a Managed Service Provider that can offer agile development platforms and robust delivery services, many software developers will be able to reap the benefits of increased focus on core business areas, improved user experience and faster, more agile DevOps cycles.

Enterprise IT in 2016: Why it is no longer enough to just be the best in your industry


Over the course of 2015, we've seen many of the trends we predicted last year come to fruition. As cloud adoption in the UK soared past 84% in May, it was evident that we had been right: the technology has become even more mainstream. With enterprises aiming to become more mobile, 2015 has also seen the normalisation of "anytime, anywhere working" and an increased dependency on IT to help drive business transformation.

With over two-thirds of IT departments set to increase their operational IT budgets in 2016, it does not look like the importance of IT in driving business goals will diminish anytime soon – the question really lies in how. Although we don't have a crystal ball that lets us predict anything with certainty, there are some emerging trends that we see shaping enterprise IT in the year ahead:

Businesses will have to use IT to evolve

It is no longer sufficient "just" to be the best in your industry. Businesses across all industries will need to examine their untapped data assets to drive the next wave of innovation and competition. Disruptive technologies have changed the potential of businesses of all sizes, and within each industry there will be those that effectively leverage data assets to drive unprecedented levels of competition in 2016.

The nature of security will continue to change rapidly

As we’ve seen over the past 18 months, the nature of the threat has changed in fundamental ways.  No longer is perimeter based security sufficient – if it ever was in the first place. More than ever, a deep, granular, distributed security model will be needed.  Advances in software defined networking combined with other non-traditional approaches (think beyond IP and port ACLs!) will be what enables IT to keep pace with the evolving threat.

Understanding of how to map application portfolios to the many cloud models will grow

Just as mainframes still exist (indeed, they are a growth area for some), so will on-premise IT, private cloud, boutique cloud and hyperscale cloud: all will continue to remain relevant. Much of the new development – so-called "born in cloud" applications – is likely to align with the hyperscale cloud, while the vast majority of existing enterprise applications may not.

The value proposition of hyperscale cloud will be stemmed by a shortage of truly able developers

There is already a shortage of developers who can truly capitalise on the value of hyperscale cloud. Indeed, many "born in cloud" applications are really just traditional designs deployed to the cloud. Applications, even newly developed ones, often rely on the infrastructure layer to provide resilience. The next generation of applications engineered to provide resilience at the application layer – i.e. those that can tolerate infrastructure failure – will suffer until this developer shortage is addressed. Unlikely to end in 2016, this is a long-term problem that will require one or more higher-education cycles to fully resolve.

There will be a resurgence of the private cloud…but not for long

Early adopters of public cloud will re-evaluate the commercial fit of private cloud – and late adopters may move directly to private cloud due to regulatory and compliance needs. Cloud economics are compelling for a wide variety of use cases, but a CAPEX investment supporting a stable, long-term application base often makes sense. In addition, many regulatory bodies lag behind innovation, and private cloud often addresses compliance obligations while still providing many of the benefits of public cloud. However, this resurgence is likely to be short-lived as regulatory bodies catch up, applications evolve, and more flexible pricing models for public cloud prevail.

As we move into 2016, it’s clear that organisations will continue to look to their IT teams to remain competitive – both for developing new business solutions and meeting existing challenges. As such, it’s important that they are prepared to tackle the biggest hurdles and continue to take advantage of the opportunities that IT presents to the enterprise.

Securing your data centre from human error with a multi-step security approach


Ten years ago, few could have predicted that the world would generate data on today's colossal scale, that a new social media environment would emerge, or that the Internet of Things (IoT) would integrate devices with intelligent IT. These changes have affected the way we access data, and the way that businesses manage and store it.

Recently we have seen an evolution in infrastructure and storage to support these new trends, both for the business community and for consumers, which has driven innovation in how data can and should be protected. Companies and individuals are responsible for securing and protecting all this data, and whilst great strides have been made to ensure that information is protected from external threats, it is often humans who remain the weakest link in the security chain.

Whether through malicious intent or inadvertent carelessness, even the most sophisticated technology can be rendered useless if sensitive information gets into the wrong hands due to human error; therefore it is vital that data centres have a multi-step security approach in place.

Securing against external threats

If you are looking to a third-party provider to host your data, it is essential to seek absolute clarity on what security measures are in place at both the logical and physical levels. World-class data centres have a number of sophisticated controls to keep systems protected, including physical security controls like cameras and biometric access systems; they may then offer managed services to deliver logical controls at the network level, such as firewalls, intrusion detection or DoS mitigation.

Operating systems have become more secure and more sophisticated anti-virus software is now available, whilst threats at the application level can be mitigated in a number of ways; for example, intelligent web application firewalls can be implemented. These firewalls are clever enough to understand the normal traffic patterns for an application, and if they encounter traffic outside the defined "normal" parameters, they can automatically block the problem traffic, averting an incident before it happens.
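
To make that idea concrete, the toy sketch below applies the kind of baseline-and-threshold check just described, blocking clients whose request rate falls outside an assumed "normal" envelope. Real web application firewalls model far richer signals than request rate, so treat this purely as an illustration.

```python
# Illustrative only: a toy version of "block traffic outside normal
# parameters". The baseline here is a fixed request-rate ceiling;
# real WAFs learn per-application traffic models.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
NORMAL_MAX_REQUESTS = 100   # assumed baseline for "normal" traffic

recent = defaultdict(deque)  # client IP -> timestamps of recent requests

def allow_request(client_ip, now=None):
    """Return False (block) when a client exceeds the baseline rate."""
    now = time.time() if now is None else now
    window = recent[client_ip]
    # Discard timestamps that have fallen out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    window.append(now)
    return len(window) <= NORMAL_MAX_REQUESTS
```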

Sitting on top of these tools and systems are defined processes and best practices, including industry-specific compliance standards such as PCI, HIPAA and FISMA, as well as broader frameworks for protecting data, like ISO, SSAE 16 and ISMS. But despite developments in tools, systems and processes, new threats continue to emerge, and organisations need to stay alert to remain one step ahead of those external threats.

Securing against internal threats

Much of the focus on the human link in the data centre security chain is on protecting networks from outsiders, but the insider threat continues to pose a significant risk. "Rogue insiders" already have access to systems and can often avoid tripping the alarms that might otherwise signal an attack. In fact, the 2015 Information Security Breaches Survey found that 75% of organisations suffered staff-related security breaches, with 50% citing human error as the cause of the worst breaches of the year. Recognising the sources of these threats is one thing, but dealing with them is quite another. However, there are several practical steps data centre managers can take.

Many data centre providers take advantage of increasingly sophisticated encryption algorithms, which provide another layer of protection should outsiders gain access to data. As well as encrypting data for both storage and transmission, it is important to capture all the information about data access attempts – both authorised and unauthorised. This allows privileged users to do their jobs in a climate of transparency, whilst acting as a deterrent to unauthorised access.
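
As a simple illustration of pairing encryption at rest with that kind of audit trail, the sketch below uses the Python cryptography package's Fernet construction and the standard logging module. Key handling is deliberately simplified; a production system would fetch keys from a vault and write to tamper-evident logs.

```python
# Sketch: encrypt a record at rest and log every access attempt,
# successful or not. Uses the `cryptography` package's Fernet
# (authenticated symmetric encryption); key management simplified.
import logging
from cryptography.fernet import Fernet, InvalidToken

logging.basicConfig(filename="access.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

key = Fernet.generate_key()   # in practice: fetched from a key vault
stored = Fernet(key).encrypt(b"customer record: account 1234")

def read_record(user, candidate_key, token=stored):
    """Decrypt a record, logging authorised and unauthorised attempts.

    `candidate_key` is assumed to be a well-formed Fernet key; a wrong
    key fails authentication and raises InvalidToken on decrypt.
    """
    try:
        data = Fernet(candidate_key).decrypt(token)
        logging.info("ACCESS GRANTED user=%s", user)
        return data
    except InvalidToken:
        logging.warning("ACCESS DENIED user=%s (wrong key)", user)
        return None
```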

Multi-factor authentication, where multiple checks take place – for example, keys used in conjunction with passwords, combined with biometrics like fingerprint or retina scans – can be incorporated as an additional measure.
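
A minimal two-factor check might combine a stored password hash with a time-based one-time password (TOTP) from an authenticator app. The sketch below uses the pyotp library and is purely illustrative: a production system would use a dedicated password-hashing scheme such as bcrypt, and biometrics would add a third factor.

```python
# Sketch of two-factor authentication: something you know (password)
# plus something you have (a TOTP code). Uses the `pyotp` package.
# Illustrative only: use a real password hash (e.g. bcrypt) in practice.
import hashlib
import hmac
import pyotp

# Enrolment: store a salted password hash and a per-user TOTP secret
# (the secret is shared once with the user's authenticator app).
SALT = b"per-user-salt"
password_hash = hashlib.sha256(SALT + b"correct horse").digest()
totp_secret = pyotp.random_base32()

def authenticate(password, otp_code):
    """Grant access only if BOTH factors check out."""
    supplied = hashlib.sha256(SALT + password.encode()).digest()
    password_ok = hmac.compare_digest(supplied, password_hash)
    otp_ok = pyotp.TOTP(totp_secret).verify(otp_code)
    return password_ok and otp_ok
```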

Ultimately, a multi-step approach to security must be taken to close the weak links within a data centre. The goal of this approach is to meet compliance and specific legal requirements, as well as to stay one step ahead of the risk posed by rogue employees. Using a multi-step security approach, we can create numerous opportunities to proactively detect, deter, and effectively deal with both insider and external threats.

How EU legislation impacts data processing in the cloud


Last week, the European Union agreed on proposed Data Protection Regulations that potentially impact all organisations that either use or process the personal data of EU citizens. There will now be further consultations before these become statute, but for the first time these will be regulations rather than directives, which means that individual EU member states will have little room for interpretation in how they are applied.

This has implications for IT service providers, SaaS providers and cloud providers, and for their customers. Under the current directives, third-party organisations that store data on behalf of others have limited responsibilities as "processors" rather than "controllers" of data. But under the new proposals, individuals will be able to seek legal redress against any organisation they believe has misused their data and against any third party that processed that data. In addition, the EU may be able to fine those who breach the regulations, with a maximum potential fine of two per cent of global turnover.

In practice, it will mean that safeguarding personal data becomes even more important, and that organisations will have to extend their due diligence to investigating the controls and processes deployed by any third party they trust to process data on their behalf. Businesses must now implement "privacy by design"; how this will work in practice is still being debated, but with increasing amounts of sensitive data available online, companies will be expected to be more aware of privacy and better able to build it into their IT platforms and into any outsourcing relationships.

Larger processors of data will need to appoint a Data Protection Officer, and they will need to evidence transparent processes that deal with:

  • Controls to mitigate risks
  • Data security breaches and communication around such incidents
  • Impact and risk assessment around the use of personal data
  • Staff training and awareness
  • The deletion of personal data or “Right to be Forgotten”

In turn this means that businesses engaging with service providers should ascertain that these partners have:

  • Appropriate tools to ensure the physical and logical security of their data; ranging from secure data centres with appropriate access controls, through to logical controls like firewalls, web application firewalls, intrusion detection or file integrity monitoring
  • Processes that control access to and management of data; for example secured logical access to networks or devices, or best practices around server image hardening and patching
  • Processes and tools that facilitate audit and investigation; for example the review and storage of device logging data; transparent monitoring and reporting; or the willingness to allow a 3rd party audit of systems and processes
  • Processes and tools for the identification and erasure of records, including secure destruction of storage and backup media (a minimal erasure sketch follows this list)
  • A demonstrable commitment to staff training and a culture of data security
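
To illustrate the erasure point above, here is a hypothetical sketch of a "right to be forgotten" routine: deleting a data subject's records across stores in a single transaction while recording the erasure itself for audit. The table names and schema are invented for the example.

```python
# Hypothetical sketch of a "right to be forgotten" erasure routine.
# Table names and schema are invented for illustration; real systems
# must also cover backups and replicated copies.
import sqlite3
from datetime import datetime, timezone

PERSONAL_DATA_TABLES = ["customers", "orders", "support_tickets"]

def erase_subject(db: sqlite3.Connection, subject_id: int) -> None:
    """Delete all personal records for a subject, then log the erasure."""
    with db:  # one transaction: either everything is erased or nothing is
        for table in PERSONAL_DATA_TABLES:
            db.execute(f"DELETE FROM {table} WHERE subject_id = ?",
                       (subject_id,))
        db.execute(
            "INSERT INTO erasure_audit (subject_id, erased_at) VALUES (?, ?)",
            (subject_id, datetime.now(timezone.utc).isoformat()),
        )
```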

Why hybrid cloud is the way forward for IT leaders


As enterprises increasingly recognise the significant value and opportunity that cloud computing presents, they continue to invest in and grow their cloud strategies. According to Gartner, the use of cloud computing is growing, and by 2016 cloud will account for the bulk of new IT spend.

In spite of this, Gartner estimated that, by 2020, on-premise cloud will still account for 70 per cent of the total market, and VMware CEO Pat Gelsinger stated that over 92 per cent of cloud deployments are currently still on-premise or private. Indeed, in NaviSite's own recent survey of over 250 UK and US IT professionals, over 89 per cent of UK respondents stated that deploying some sort of private cloud and hybrid infrastructure is a priority within the next 12 months.

There are many good reasons why organisations still opt to run applications on hardware owned and managed in-house. Most organisations still have large investments in technology, people and processes that cannot simply be written off; certain workloads still do not suit virtualised or multi-tenanted platforms; renting resources is not always cheaper or better than owning them; and there are valid security and compliance reasons for keeping certain data on-premise.

In spite of these concerns, however, the public cloud continues to grow at a ferocious rate, validating the benefits that this infrastructure delivery model offers. The fact that certain data and workloads are better suited to private cloud infrastructure or to a physical hosted platform is therefore the caveat that opens the door to hybrid solutions. Although many UK businesses have migrated certain applications to the cloud, over three-quarters of respondents in NaviSite's recent survey had migrated less than fifty per cent of their infrastructure.

A hybrid solution gives organisations the option of scaling resources for specific workloads and running applications on the most appropriate platform for a given task. A highly dynamic application with varying spikes may be best supported in the public cloud, a performance-intensive application may be better suited to the private cloud, and a dataset with high regulatory requirements may need to remain on a physical hosted platform. A hybrid solution also allows an organisation to place its data where regulatory or security requirements dictate. This is significant, as 59 per cent of UK IT professionals surveyed by NaviSite still cite security as their main concern with cloud migration.
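
That placement logic can be thought of as a simple policy function. The sketch below encodes just the three example rules from this paragraph; the categories and their precedence are simplified assumptions, not a real placement engine.

```python
# Illustrative hybrid placement policy, encoding the three example
# rules above. Categories and precedence are simplified assumptions.
from dataclasses import dataclass

@dataclass
class Workload:
    spiky_demand: bool           # highly dynamic, variable load
    performance_intensive: bool
    regulated_data: bool         # strict regulatory/security needs

def place(w: Workload) -> str:
    if w.regulated_data:
        return "physical hosted platform"  # compliance dictates location
    if w.performance_intensive:
        return "private cloud"
    if w.spiky_demand:
        return "public cloud"              # elastic scale, pay per use
    return "private cloud"                 # stable, predictable workloads

# Example: a bursty, unregulated web front end lands in the public cloud.
print(place(Workload(spiky_demand=True,
                     performance_intensive=False,
                     regulated_data=False)))
```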

Hybrid continues to grow because it offers organisations the best of both worlds. For IT leaders, a hybrid strategy that pragmatically embraces the new, whilst making best use of the current estate, is essential. By going hybrid, today's IT leaders can pick the best-fit strategy for the current demands of their business, within a flexible framework that will enable them to manage future change.