All posts by Guest Author

Three major run-time performance hurdles to avoid

Every investment in major IT transformation arrives with its own set of structural and institutional challenges, and migrating to a virtualised infrastructure is no exception. Get it wrong from the outset and you will be faced with the mother of all headaches. If performance is not addressed at inception, your data centre and its services – although able to run – will be increasingly plagued by service problems, poor user satisfaction and inadequate return on investment. So, companies looking to transform their virtualised infrastructure should look before they leap and consider three major hurdles.

#1: The heavy lifting of remedial actions

Shift happens. Seasonal traffic spikes, buggy drivers, fluctuating head counts, updated applications, worn-out drives and a thousand other factors all contribute to the constantly shifting foundation upon which your infrastructure must stand and thrive. But in any dynamic environment, unforeseen throttling will inevitably occur and issues will need fixing.

End-user issue remediation is exceedingly important, so when a VDI (Virtual Desktop Infrastructure) user calls or files a trouble ticket, IT must have the tools to translate complaints into troubleshooting. Pinpointing problem sources manually can take hours, sometimes even days or weeks, because virtual infrastructure tools tend to be designed for high-level observation, not granular examination of what exactly is, or isn't, going on. If multiple problems crop up from different sources, the burden of remediation magnifies, potentially risking an organisation's business continuity.

#2: Isolated end-user experience metrics

Amazon famously publicised that each tenth of a second of added site latency cost it 1% in sales. Nothing will put a worker off a cloud-based software tool, particularly on a new rollout, quite like staring at a login hourglass. Of course, management feels the same pain: Aberdeen Research found that poor application performance can slice up to 9% from corporate revenue.
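As a rough illustration of how quickly those percentages add up, here is a trivial worked example in Python (the revenue and latency figures are hypothetical; only the 1%-per-100ms rule of thumb comes from the Amazon figure above):

```python
# Hypothetical worked example: revenue lost to added site latency,
# using the oft-quoted ~1% of sales per 100 ms of delay.
annual_revenue = 10_000_000        # assumed: £10m/year in online sales
added_latency_ms = 300             # assumed: extra latency after a rollout
loss_rate_per_100ms = 0.01         # the Amazon rule of thumb

lost = annual_revenue * loss_rate_per_100ms * (added_latency_ms / 100)
print(f"Estimated annual revenue at risk: £{lost:,.0f}")  # £300,000
```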

Citrix proposes that five key, measurable metrics account for the lion's share of poor user experiences within virtualised environments (a minimal tracking sketch follows the list):

  • Launch time
  • Logon time
  • Load time
  • Latency
  • Operations such as printing, screen updates, etc.
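To make these measurable in practice, here is a minimal sketch of per-session tracking against thresholds. The metric names mirror the list above, but the `SessionMetrics` structure and the threshold values are illustrative assumptions, not a Citrix API:

```python
from dataclasses import dataclass

@dataclass
class SessionMetrics:
    """Illustrative per-session measurements (seconds; latency in ms)."""
    launch_time_s: float
    logon_time_s: float
    load_time_s: float
    latency_ms: float
    ops_time_s: float   # printing, screen updates, etc.

# Assumed ceilings; real baselines should come from your own telemetry.
THRESHOLDS = {
    "launch_time_s": 10.0,
    "logon_time_s": 30.0,
    "load_time_s": 5.0,
    "latency_ms": 150.0,
    "ops_time_s": 2.0,
}

def flag_degraded(m: SessionMetrics) -> list[str]:
    """Return the metrics that exceed their ceiling for this session."""
    return [name for name, limit in THRESHOLDS.items()
            if getattr(m, name) > limit]

# A session with a slow logon and high latency gets flagged on both counts.
print(flag_degraded(SessionMetrics(8.0, 42.0, 3.0, 210.0, 1.0)))
# -> ['logon_time_s', 'latency_ms']
```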

Clearly, most issues will stem from obstructions in resource flow, but determining the root cause or causes of any decline in these metrics can be arduous.

#3: Inefficient allocation of IT resources

Virtual infrastructure resource utilisation wastes mountains of money when it runs too cold and throws up bottlenecks and failures when it runs too hot. Efficiency is the relationship between performance and resource utilisation, and maximum efficiency does not mean maximum utilisation. The physical underpinnings of your virtual infrastructure are a mesh of CPU, memory, storage, network and – particularly in VDI and HPC deployments – GPU (Graphics Processing Unit) assets. The key is to gauge not only utilisation levels but overall efficiency within each resource type, each network region and the organisation as a whole.
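One way to make "maximum efficiency is not maximum utilisation" concrete is a score that penalises a resource for running too cold (wasted spend) as well as too hot (bottleneck risk). The target band and formula below are illustrative assumptions:

```python
# Assumed target utilisation band; efficiency peaks inside it and
# falls off towards 0 at full idleness or full saturation.
TARGET_LOW, TARGET_HIGH = 0.50, 0.75

def efficiency(utilisation: float) -> float:
    """Score in [0, 1]: 1.0 inside the target band, lower outside it."""
    if utilisation < TARGET_LOW:                 # too cold: money wasted
        return utilisation / TARGET_LOW
    if utilisation > TARGET_HIGH:                # too hot: contention risk
        return max(0.0, (1.0 - utilisation) / (1.0 - TARGET_HIGH))
    return 1.0

for name, util in {"cpu": 0.62, "memory": 0.91,
                   "storage": 0.18, "gpu": 0.70}.items():
    print(f"{name:8s} utilisation={util:.0%}  efficiency={efficiency(util):.2f}")
```

On this scoring, the 91%-utilised memory and the 18%-utilised storage are both inefficient, for opposite reasons.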

The solution for companies looking to clear these hurdles is to invest in predictive, analytics-driven tools. These enable the reduction, or even elimination, of persistent IT problems. To avoid the heavy lifting of remedial actions, the tool needs to prevent problems in real time and, where prevention is not enough, step in with automated remediation. Either way, the result is massive savings in resolution time and a much improved chance for IT, and thus the whole organisation, to meet its SLAs (Service-Level Agreements). To avoid isolated end-user experience metrics, the tool must also help pinpoint a problem's source and, wherever possible, provide cross-silo insights for balancing load and alleviating bottlenecks.

In order to combat inefficient allocation of IT resources, you must be able to monitor the real-time characteristics of these physical resources and how end-points and applications are using them across the network. This clarity of vision is a strategic lever, giving visibility not only into utilisation levels but into overall efficiency within each resource type, each network region and the organisation as a whole.

Written by Atchison Frazer, CMO at Xangati

SaaS for SMBs: The main meal, not the side salad

We know that telcos are well-placed to sell cloud services to small and medium-sized businesses. Even if they don't always know it yet, the business customers they provide telecoms services to need to embrace digital services to succeed, and telecoms providers are trusted to deliver technology that meets their stated needs. Cloud is very much at the forefront of most telcos' thinking, in terms of how it can be used to serve their own operational requirements, but also as a delivery mechanism and opportunity for providing business customers with a range of additional services. They are increasingly offering a catalogue of cloud applications that can provide business customers with efficient tools to help run their organisations without the cost and complexity of buying, running and maintaining software from myriad, discrete providers.

But while this is good in theory, the practice is somewhat different. In the last two to three years, there has been a lot of talk about the opportunity that exists, but for most this opportunity has not resulted in a significant new revenue stream. And building new revenue streams is vital for operators.

Voice and data revenues are declining so portfolio diversity is necessary if operators are to remain profitable. Providing cloud services remains one of the most obvious routes to diversity – which makes its current stagnation a bit of a worry.

Through a number of interviews and workshops, BCSG has identified some of the key factors that have stopped telcos from making the most of the opportunity.

Getting it wrong

Unfortunately, selling cloud services as "just another add-on" doesn't work as it might with other services for small businesses. The telco may be a natural provider of these services, but to reach the mass market, customers need to be educated on how the solutions can bring value and why the telco should be considered as the vendor – it is, after all, a service of the type that the business may have previously acquired from elsewhere.

Many telcos are failing to make it clear why businesses should buy from them – the value of these joined-up digital solutions (alongside existing services, for example a device, a 4G connection and the ability to access business files on the move) is not being communicated to the customer.

Additionally, many telcos are not communicating in an effective way, preferring a ‘big bang, product push’ approach to marketing – all products, all channels, all customers, all at once. This untargeted approach is not winning business.

Getting it right

To be successful, telcos need to understand their own customers better. What would they most benefit from? How well do they understand their own needs? How can they get a busy business owner to stop and take time to consider how to become more efficient with a new application? How could the business grow as a result of a new cloud marketing solution?

Without understanding how to take a customer on the journey, knowing what education they will need, and having a clear idea of the barriers they will face, customers won’t adopt the services or reach the point where they are getting good value, a must for any SaaS product. Developing this customer understanding enables providers to deliver a targeted approach that personalises the engagement. Once that baseline of customer understanding is established, telcos can begin the education process that underpins a path to purchase and ongoing use. At this point, targeted marketing must take customers on a journey that builds understanding of, comfort with, and desire for the services available. Telcos could help businesses understand, for example, how a mobile device can be combined with cloud-based software that creates online forms to save time completing admin outside of working hours or how a tablet and email marketing app can be used together to create the next campaign to find new customers in dead time waiting for a flight in an airport.

Working with the independent software vendors (ISVs) to provide insight about customers and their products is critical – they are, after all, the experts on their product. ISVs already sell their products through a number of channels and so should have the resources and expertise on hand to help with the sales process. There is no need to build something from scratch. While it might seem obvious, our analysis shows these key elements of the customer journey and marketing process are rarely followed.

Finally, once the customer, the education required, the journey and the value proposition are understood, telcos need to execute effectively. This means using the right marketing tools in the right way to reach the right customers at the right time. Taking a measured, staged approach to rolling out new services, using each opportunity to test, learn and scale, reduces risk and helps to avoid a big bang launch followed by a re-launch six months later because the first approach hasn't worked.

A lot of the steps and processes needed to create the right customer journeys can easily be automated using the right tools, such as marketing automation. These tools are also essential for the measurement and analysis of performance that helps to foster further learning about the customer.

Telcos have been told that "the time is now" for cloud services for several years, but so far, with some exceptions, cloud services have remained a lacklustre business stream for telcos. Making cloud services the centre of attention, rather than an afterthought tacked on at the end of a sales call, will help telcos capture that all-important cloud opportunity.

Written by Alan Marsh, Product and Marketing Director at BCSG

Demystifying the three myths of cloud database

The cloud is here and it's here to stay. The cost savings, flexibility and added agility alone mean that cloud is a force to be reckoned with.

However, many businesses are struggling to figure out exactly how to get the most out of the cloud; particularly when choosing what infrastructure elements to leave on-premises and which to migrate to the cloud. A recent SolarWinds survey found that only 42 per cent of businesses will have half or more of their organisations’ total IT infrastructure in the cloud within the next three to five years. Furthermore, seven per cent say their organisation has not yet migrated any infrastructure at all to the cloud, though many of these plan to once they have considered what to transfer and how to do it.

One of the more controversial moves when it comes to migrating infrastructure to the cloud is the database. Hesitancy in making the shift to the cloud is clear, with nearly three quarters (73%) of organisations stating they have yet to do so – but why is this?

The database is often seen as the most critical piece of IT infrastructure when it comes to performance, and lies at the heart of most applications, meaning changes are perceived as risky. If moving the database, or changing the way it operates, has a negative effect, a ripple effect could impact the entire business – for example, through the loss of important data.

While on some level this fear is justifiable, there are certainly a few reasons which could be defined as myths, or misconceptions, rather than reality:

Myth 1: Need high performance and availability? The cloud is not a suitable fit.

Several years ago, during the early days of the cloud, the 'one size fits all' approach may have been fact. However, with the natural maturation of the technology, we're at a point where databases in the cloud can meet the needs of even the most demanding applications.

The reality of today's cloud storage systems is that there are very powerful database services available in the cloud, many based on SSD drives offering up to 48,000 IOPS and 800MBps throughput per instance. And while outages in the cloud were a common annoyance two to three years ago, the availability offered by today's cloud providers often exceeds what most on-premises systems can deliver: provider SLAs, combined with the ease of setting up replicas and standby systems and the durability of the stored data, often produce better results.

This is not to say that the database administrator (DBA) is free of responsibility. While the cloud provider will take care of some of the heavy lifting involved in configuration and administration tasks, the DBA is still responsible for overall performance. The DBA therefore still needs to pay close attention to resource contention, bottlenecks, query tuning, execution plans, etc. – some of which may mean new performance analysis tools are needed.
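A minimal sketch of that continued attention: wrapping queries in a timer and reporting the slow ones. SQLite is used here purely to keep the example self-contained, and the 100 ms threshold is an arbitrary assumption:

```python
import sqlite3
import time
from contextlib import contextmanager

SLOW_THRESHOLD_S = 0.1   # assumed: report anything slower than 100 ms

@contextmanager
def timed_query(label: str):
    """Time a block of database work and report it if it is slow."""
    start = time.perf_counter()
    yield
    elapsed = time.perf_counter() - start
    if elapsed > SLOW_THRESHOLD_S:
        print(f"SLOW ({elapsed:.3f}s): {label}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO orders (total) VALUES (?)",
                 [(i * 1.5,) for i in range(100_000)])

with timed_query("sum of order totals"):
    conn.execute("SELECT SUM(total) FROM orders").fetchone()
```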

Myth 2: The cloud is not secure.

Even though security should always be a concern, just because you can stroll into a server room and physically see the server racks doesn't necessarily mean they are more secure than the cloud. In fact, there have been many more high-profile security breaches involving on-premises systems than public cloud ones.

The truth is the cloud can be extremely secure – you just need a plan. When using a cloud provider, security is not entirely their responsibility; instead it needs to be thought of as a shared job – they provide reasonably secure systems, and you are responsible for secure architecture and processes.

You need to be very clear about the risks, the corporate security regulations which need to be abided by and the compliance certifications that must be achieved. Also, by developing a thorough understanding of your cloud provider's security model, you will be able to implement proper encryption, key management, access control, patching, log analysis, etc. to complement what the cloud provider offers and take advantage of their security capabilities. With this collaborative approach to security, you can ensure that your data is as safe as, if not safer than, it would be on physical server racks down the hall.
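One concrete piece of that shared responsibility is encrypting sensitive data client-side, so the provider only ever stores ciphertext. The sketch below uses the third-party `cryptography` package's Fernet recipe; key storage and rotation, the genuinely hard part, are out of scope here:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# In production the key would live in a key-management service,
# never alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"customer_id=42,card_last4=1234"
ciphertext = fernet.encrypt(record)   # safe to hand to the cloud provider

# Only a holder of the key can recover the plaintext.
assert fernet.decrypt(ciphertext) == record
```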

Myth 3: If I use cloud I will have no control of my database.

This is another half-truth. Although migrating your database to the cloud does hand over some of the day-to-day maintenance control to your provider, when it comes to performance your control won’t and shouldn’t be any less.

As mentioned above, an essential step to ensure that you remain in control of your database is to understand your cloud provider's service details. You need to understand their SLAs, review their recommended architecture, stay on top of new services and capabilities and be very aware of scheduled maintenance which may impact your jobs. It's also important to take into account data transfer and latency for backups and for keeping all your databases in sync, especially if database-dependent applications need to integrate with one another and are not in the same cloud deployment.

Finally, keep a copy of your data with a different vendor who is in a different location. If you take an active role in managing backup and recovery, you will be less likely to lose important data in the unlikely event of vendor failure or outage. The truth is that most cloud providers offer plenty of options, giving you the level of control you need for each workload.

Conclusion

The decision to migrate a database to the cloud is not an easy one, nor should it be. Many things need to be taken into account and the benefits and drawbacks need to be weighed up. However, given the tools available and the maturity of the cloud market today, deciding not to explore cloud as an option for your database could be short-sighted.

Written by Gerardo Dada, Head Geek at SolarWinds

How to turn the cloud into a competitive advantage with a scorecard approach to migration

We have seen enterprise cloud evolve a lot in recent years, going from specific workloads running in the cloud to businesses looking at a cloud-first approach for many applications and processes. This rise was also reflected in the Verizon State of the Market: Enterprise Cloud 2016 report, which found that 84% of enterprises have seen their use of cloud increase in the past year, with 87% of these now using cloud for at least one mission-critical workload. Furthermore, 69% of businesses say that cloud has enabled them to significantly reengineer one or more business processes, giving a clear sign of the fundamental impact that cloud is having on the way we do business.

These findings show that whilst companies will continue to leverage the cloud for niche applications, enterprises are now looking to put more business-centric applications in the cloud. This approach requires designing cloud-based applications that specifically fit each workload — taking into account geography, security, networking, service management expectations and the ability to quickly deploy the solution to meet rapidly changing business requirements. As a result, a core focus for 2016 will be the creation of individual cloud spaces that correspond to the individual needs of a given workload.

The key to cloud is collaboration

This focused alignment has led to the role of enterprise IT evolving to that of a cloud broker that must collaborate with lines of business to ensure overall success of the organisation. By using an actionable, scorecard approach for aligning cloud solutions with the needs of each workload, enterprises can make more informed assessments on how best to support applications in the cloud.

Three practical steps are as follows:

  1. Consult the Business and Assess User Requirements: IT professionals should build a relationship with their organisation’s lines of business to accurately identify critical application requirements to create the right cloud solution. Some questions to ask include:
  • What are all the barriers for successful application migration?
  • What is the importance of the application’s availability and what is the cost of downtime?
  • What regulations does the application and data need to comply with?
  • How often will IT need to upgrade the application to maintain competitive advantage?
  2. Score Applications and Build a Risk Profile: The careful assessment of applications' technical requirements can mean the difference between a successful cloud migration and a failed one. A checklist to guide IT departments away from major pitfalls is important (a worked scorecard sketch follows this section), and should cover points such as:
  • Determine the load on the network
  • Factor in time to prepare the application
  • Carefully consider the costs of moving

In addition to assessing the technical requirements, IT professionals must evaluate the applications’ risk profile. Using data discovery tools to look at the data flow is instrumental to detecting breaches and mitigating any impact.

  3. Match Requirements to the Right Cloud Service Model: Choosing the best cloud model for enterprise IT requires a thorough comprehension of technical specifications and workload requirements. The following are key considerations to help IT directors partner with their business unit colleagues to define enterprise needs and determine the right cloud model.
  • Does the application’s risk profile allow it to run on shared infrastructure?
  • What proportion of the application and its data are currently based on your premises, and how much is based with a provider?
  • How much of the management of the cloud can you take on?
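As a minimal sketch of how the three steps above might come together in an actual scorecard: each criterion is scored 1 (poor cloud fit) to 5 (good fit) during the business consultation, then weighted and totalled. The criteria, weights and cut-off below are illustrative assumptions, not Verizon's methodology:

```python
# Illustrative weights per criterion; they must sum to 1.0.
WEIGHTS = {
    "availability_need_met": 0.25,  # cost of downtime vs. provider SLA
    "regulatory_fit": 0.25,         # residency and compliance requirements
    "network_load_ok": 0.20,        # load the app will place on the network
    "migration_effort_low": 0.15,   # time and cost to prepare and move it
    "shared_infra_ok": 0.15,        # risk profile tolerates shared infrastructure
}

def migration_score(scores: dict[str, int]) -> float:
    """Weighted score in [1, 5]; higher means a stronger cloud candidate."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

crm_app = {"availability_need_met": 4, "regulatory_fit": 5,
           "network_load_ok": 3, "migration_effort_low": 4,
           "shared_infra_ok": 5}

score = migration_score(crm_app)                      # 4.20 for this example
print(f"score={score:.2f} ->",
      "migrate" if score >= 3.5 else "keep on-premises")
```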

Cloud is empowering IT professionals to gain a greater role in effectively impacting business results. Working in the right cloud environment allows for operational efficiency, increased performance, stringent security measures and robust network connectivity.

What’s on the horizon for cloud?

In the coming months and years, we will see an increased focus on the fundamental technology elements that enable the Internet of Things: cloud, networking, security and infrastructure. Networking and cloud computing are at the heart of IoT, comprising half of the key ingredients that make it possible (security and infrastructure are the other two). This is not surprising considering IoT needs reliable, flexible network connections (both wireless and wireline) to move all the collected data and information from devices back to a central processing hub, without the need for human intervention. Similarly, cloud computing provides the flexibility, scale and security to host applications and store data.

Going forward, success will not be measured by merely moving to the cloud. Success will be measured by combining favourable financials and user impact with enhanced collaboration and information sharing across a business’ entire ecosystem. Those IT departments that embrace the cloud through the creation and implementation of a comprehensive strategy — that includes strong and measurable metrics and a strong focus on managing business outcomes — will be the ones we talk about as pioneers in the years to come.

Written by Gavan Egan, Managing Director of Cloud Services at Verizon Enterprise Solutions

Bridging the Gap: Hybrid Cloud – blurring the lines between public and private cloud capabilities

Hybrid CloudThe public cloud is often seen as something that sits outside the enterprise. But its capabilities can be brought in-house.

The benefits of cloud are now widely known: a faster time to market, streamlined processes and flexible infrastructure costs. The public cloud model also provides rapid access to business applications or shared physical infrastructure owned and operated by a third party. It is quick to provision, and the utility billing model employed in a public cloud means companies only pay for the services they use. And, with expenses spread across a number of users, costs are kept under control.

This works especially well for certain business applications that support HR, Sales, Marketing and Support.  It is also ideal for training, development and testing – where there are sporadic bursts of activity.

Private cloud, on the other hand, offers a bespoke infrastructure dedicated to an individual business, either run on-premises or hosted within a data center run by the cloud provider. This provides most of the benefits of the cloud – an agile, scalable and efficient cloud infrastructure – but with greater levels of control and security for the business than is offered by the public cloud, and as a result, often has a slightly higher level of cost.

A private cloud is often perceived to offer the best option for mission critical applications, or those that demand a higher level of customisation – something that can be more difficult to achieve in a public cloud environment. It can also reduce latency issues, as services are accessed over internal networks rather than the internet.

Bearing these factors in mind, a private cloud tends to work well for large and complex applications or organisations and those with stricter obligations around data and regulation.

Historically, customers have been faced with the dilemma of which model to use – public or private. They've had to make that decision one application at a time, mainly because public and private clouds have had very different setups. It has not been possible to seamlessly pick up workloads and move them back and forth between the private and public cloud. Each 'burst' or 'cross-over' from on-premise to on-cloud (or vice versa) requires different provisioning code, security profiles, network configurations, testing and automation tools. It's just too difficult!

Fortunately, when considering the move to cloud, it doesn’t have to be an either/or decision anymore: hybrid cloud enables companies to utilise a mixture of both, and is giving organizations new strategic options. It is about providing the exact same infrastructure, security policies and toolsets, and, at the very last stage, choosing a deployment option – either on-premise or on-cloud.

One of the key benefits of operating a hybrid cloud is that it enables users to move applications and workloads between environments depending on demand, business needs and other variables. This mixed approach means businesses can rapidly respond to operational developments — for example using public cloud to quickly and cost-effectively develop and test new applications, before moving them back behind the firewall as they go into production.

It also means more (if not all) of a company’s applications are now ready to take advantage of the benefits of being deployed on a cloud – even if it’s the private cloud to start with.

This is now possible thanks to an evolution in the cloud computing space — the Public Cloud Machine — which uses the same software and hardware as the public cloud to bring the capabilities on-premise, meaning businesses can exploit the power of public cloud infrastructure while having the extra control that in-house data centers provide.

Essentially, it means organizations can address specific business or regulatory requirements, as well as those around data control and data location, while being able to tap into the perceived benefits of the public cloud: agility and pay-as-you go billing.

The hybrid cloud is set to become a business-as-usual expectation from companies.  Oracle is leading with the Public Cloud Machine, getting customers ahead of the curve.

By being able to blur the lines between where one cloud begins and another ends, companies can gain the ultimate flexibility of cloud, become more agile than their competitors and be in a better position to rapidly respond to changing needs and an increasingly competitive environment.

Written by Neil Sholay, Head of Oracle Digital, EMEA at ‎Oracle

Should Public Cloud be Synonymous with Outsourcing?

I caught an internet meme the other day that said, "The Cloud is just a computer somewhere else." But is that true? Is the cloud really all about outsourcing your infrastructure to somewhere or someone else?

Popular opinion seems to indicate that’s the case.  But I would argue otherwise.

The cloud is a way of thinking.  Consider the ease with which you can swipe your credit card and walk away with a virtual infrastructure in the cloud.  Pay for what you need now, and scale out to meet your growing demands as your business or projects expand.  Who could say no to that?

In my experience as an IT leader and solutions architect, this is what the cloud is really all about.  Self-service provisioning; elastic, pay-as-you-grow infrastructure; and a service-driven operating model with all-inclusive, per-VM pricing.

If we take that perspective, we see that the cloud is not just about outsourcing.  In fact, all IT leaders should aspire to deliver the same agility, elasticity, and efficiency of the cloud model – whether their infrastructure runs on-premises or “in the cloud.”

With that said, this has not always been feasible or easy. Traditional IT infrastructure is costly, complex, and rigid. It simply doesn't provide the same level of efficiency and agility as public cloud providers can deliver. And that's no surprise: early in their history, pioneering service providers and technology giants like Google, Amazon, and Facebook discarded the old IT model and built their own infrastructure based on the key design principles of software-defined, scale-out, x86 commodity hardware.

Until now, visionary IT leaders who sought to deliver a cloud operating model on-site had little at their disposal.  But that is changing.  Breakthroughs in on-premises infrastructure like hyperconvergence are making it possible to bring the benefits of the cloud on-site, avoiding the tradeoffs of outsourcing their infrastructure and core business applications to the cloud.

In many ways, hyperconverged infrastructure delivers the same efficiency and agility as the cloud. It's based on the same design principles noted above – x86 commodity building blocks, software-defined, and linear scalability. However, hyperconverged infrastructure also provides the performance, protection, and resiliency enterprises require – all while reducing complexity and costs.

In fact, in a recent independent study, focusing on the cost-effectiveness and three-year total cost of ownership (TCO) savings of hyperconvergence and the public cloud, hyperconvergence vendor SimpliVity was compared to public cloud vendor Amazon Web Services. The study found that SimpliVity’s hyperconverged infrastructure solution offers a TCO savings of 22% to 49% when compared to Amazon Web Services. This shows that cost is no longer a barrier to creating a private cloud. Enterprises can choose what best suits their workloads, public or private.
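To put that 22–49% range in cash terms, here is a trivial worked example (the baseline TCO figure is hypothetical; only the percentage range comes from the study cited above):

```python
# Assumed three-year public-cloud TCO for a given set of workloads.
public_cloud_tco = 1_200_000   # hypothetical baseline, in dollars

for saving in (0.22, 0.49):    # the study's reported range
    hci_tco = public_cloud_tco * (1 - saving)
    print(f"at {saving:.0%} saving: ${hci_tco:,.0f} "
          f"(${public_cloud_tco - hci_tco:,.0f} saved over three years)")
```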

Overall, with hyperconvergence, enterprises can now outsource to the public cloud or decide to stay on-premises, all the while maintaining the agility, elasticity, and cost-effectiveness of the public cloud.

Written by Rich Kucharski, Vice President of Solutions Architecture at SimpliVity

What happens to EU General Data Protection Regulation if the UK votes for a Brexit?

Businesses warned not to give up on data reforms just because UK could quit Europe

As the UK prepares to vote on whether to leave the European Union, businesses are being warned not to give up on data reforms inspired by the forthcoming EU General Data Protection Regulation (GDPR).

Businesses across the country have been studying the implications of the new Regulation, due to come into force in May 2018, which aims to create a 'one-stop shop' for data protection across the European Union.

Some of the key aspects of the Regulation include huge fines for data breaches, new rules around the collection of personal data and new rights for European citizens to ask for data to be deleted or edited. Many businesses will also be required to appoint a Data Protection Officer.

However, the Brexit vote opens up the possibility that the UK could be out of the EU by the time it comes into force.

John Culkin, Director of Information Management at Crown Records Management, said: “It would be tempting for businesses to think that if the UK leaves the EU this regulation would not apply. In fact, that isn’t the case. Although an independent Britain would not be a signatory of the Regulation, in reality it would still be impossible to avoid its implications.

“The Regulation governs the personal data of all European citizens, providing them with greater control and more rights over information held about them. So any company holding identifiable information of an EU citizen, no matter where it is based, needs to be aware. With millions of EU citizens living in the UK, too, it’s hard to imagine that many businesses here would be unaffected.

“The same applies to data breaches involving the personal data of European citizens. So it will still be vital to have a watertight information management system in place which allows businesses to know what information they have, where it is, how it can be edited and who is responsible for it.”

Even if the UK votes to leave the EU, data in Great Britain & Northern Ireland will continue to be regulated by the current Data Protection Act, which was passed in 1998.

A spokesperson for the Information Commissioner's Office (ICO), an independent body set up to uphold information rights, said: "Although derived from an EU Directive, the Data Protection Act was passed by the UK Parliament and will remain in place after any exit, until Parliament decides to introduce a new law or amend it.

“The UK has a history of providing legal protection to consumers around their personal data. Our data protection laws precede EU legislation by more than a decade, and go beyond the current requirements set out by the EU, for instance with the power given to the ICO to issue fines.

“Having clear laws with safeguards in place is more important than ever given the growing digital economy, and is also central to the sharing of data that international trade relies on. The UK will continue to need clear and effective data protection laws, whether or not the country remains part of the EU.”

Culkin believes there is a real danger that UK businesses will defer crucial reforms of their information management systems – just in case the Brexit vote in June changes the agenda. But he warns it is a big risk.

He said: “Businesses should be thinking about the benefits of good information governance rather than hesitating because of what could happen in the future.

“There is no point putting in place systems that ignore privacy by design, for instance, when that is good procedure – no matter what happens in Europe in June. The same is true of measures to protect a business from data breaches, which have reputational as well as financial implications – no matter who imposes the fine.

"As for personal data, citizens in the UK are only going to be more demanding about how their data is collected, stored and edited in future – the genie is out of the bottle and it's not sensible to think that leaving the EU will change that. Preparing for a modern data world is not only about the GDPR."

This is a view shared by the ICO, which will continue to ensure organisations meet their information rights obligations no matter how the UK votes.

A spokesperson said: “Ultimately, this is a decision for organisations based on their own particular circumstances. Revisiting and reassessing your data protection practices will serve you well whatever the outcome of the referendum. Investing in GDPR compliance will ensure an organisation has a high standard of data protection compliance that will enable the building of consumer trust.”

Hybrid IT skillsets – Does your IT team have what it takes?

IT runs the world; without it, the well-behaved, functional technological ecosystem would come to a screeching halt. But the agile, dynamic nature of the tech world provides businesses with many services to consume to meet bottom-line goals, which means that IT needs to constantly keep up with the latest trends while balancing the people, process and budget equations. Cutting costs, encouraging work flexibility and improving efficiency are the objectives of an investment in technology, and it's part of IT's responsibility to maximise the ROI of that investment.

For many, the biggest technological transformation for businesses to date has been the cloud. It's fair to say cloud adoption is nearly ubiquitous, with just 7% of IT pros across the UK saying their organisation has not migrated any infrastructure to the cloud. However, despite its popularity, it's become clear cloud adoption isn't suitable for all workloads and, even if it were, 65% of organisations state that it's unlikely that all of their infrastructure will ever be migrated to the cloud. Thus the market has developed and created a new trend – hybrid IT.

The evolution of hybrid IT

An annual IT trends report from SolarWinds, which surveyed UK IT practitioners, managers and directors, highlighted this new trend and found the vast majority of businesses have shifted away from on-premises infrastructure to hybrid IT environments.

Hybrid IT is where businesses migrate some infrastructure services to the cloud, while continuing to maintain some critical IT services onsite. Hybrid IT benefits businesses by reducing costs while increasing the agility and scalability of infrastructure, and by relieving internal IT personnel of some day-to-day responsibilities.

However, it has also put the IT pros implementing hybrid IT under a lot of pressure. They now face a dual mandate: increase efficiency through cloud services while also ensuring critical systems, databases and applications are secure, lean, and agile. This is a huge challenge, and many IT pros simply don't have the skillset to cope with a task that involves a constantly moving target of continuous integration and continuous delivery. It's therefore imperative that businesses provide sufficient resources to support the IT team, and encourage IT pros to gain the skills and tools required to properly manage hybrid IT environments.

Addressing IT pros’ concerns

Despite a vital need for new skills, tools and resources, less than half of IT pros (48%) feel they have the support needed from leadership, and the organisation as a whole, to develop or improve the skills needed to better manage hybrid IT environments. Nearly three quarters (72%) also feel further disadvantaged as they are uncertain whether their IT organisation currently has adequate resources to manage a hybrid IT environment.

Here are a few suggestions and tips that could help IT pros enhance their skillset for managing a hybrid IT environment:

Think about DevOps – IT pros would benefit from leveraging the principles of DevOps when managing a hybrid IT environment. This would help to achieve faster decision-making, greater agility and organisational efficiency. IT pros will then be able to update and make any necessary changes to infrastructure, making IT services, whether on-premises or in the cloud, more agile, lean and scalable.

Management and monitoring toolsets – Leveraging a management and monitoring toolset which can quickly surface a single point of truth across all the relevant platforms is hugely beneficial when working with both on-premises and cloud resources. This creates a more efficient path to remediation, troubleshooting and optimisation, thanks to the normalisation of key performance metrics, alerts and other collected data from applications and workloads, regardless of their location or service provider.
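A minimal sketch of what that normalisation means: two platforms report the same signals under different names and units, and a thin mapping layer reduces both to one schema so a single alert rule covers them. The field names on both sides are invented for illustration, not taken from any particular product:

```python
# Invented raw samples from two different monitoring sources.
onprem = {"host": "db01", "cpu_pct": 87.0, "mem_used_mb": 58_000}
cloud = {"instance_id": "i-0abc", "cpu_utilization_pct": 91.0,
         "mem_used_bytes": 61_000_000_000}

def normalise_onprem(s: dict) -> dict:
    return {"source": "on-prem", "node": s["host"],
            "cpu": s["cpu_pct"] / 100, "mem_gb": s["mem_used_mb"] / 1024}

def normalise_cloud(s: dict) -> dict:
    return {"source": "cloud", "node": s["instance_id"],
            "cpu": s["cpu_utilization_pct"] / 100,
            "mem_gb": s["mem_used_bytes"] / 1024**3}

# One schema, one alert rule, regardless of where the workload runs.
for sample in (normalise_onprem(onprem), normalise_cloud(cloud)):
    if sample["cpu"] > 0.85:
        print(f"ALERT {sample['source']}/{sample['node']}: "
              f"cpu at {sample['cpu']:.0%}")
```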

Educate yourself – As more IT services become available from vendors, IT pros must improve in three areas: business savvy for contract negotiation, the technical expertise to understand and use the available cloud services, and project management. These will all require the ability to effectively manage project budgets, workflows and deadlines; dissect terms and conditions; and understand service-level agreements.

A syllabus for IT pros – In order to be successful in the hybrid IT world, IT pros need to have a versatile portfolio of skills. Areas which need to be focused on are: service-oriented architectures, automation, vendor management, application migration, distributed architectures, API and hybrid IT monitoring and management tools and metrics.

Monitoring as a discipline is a necessity – Monitoring as a discipline needs to be viewed as a core IT function. Once it is established in practice, businesses will begin to reap the rewards of streamlined infrastructure and application performance, cost and security, all of which feed into a successful IT management strategy.

Written by Kong Yang, Head Geek at SolarWinds

Managed Cloud Storage – What’s the hold up?

Organisations operating in today's highly competitive and lightning-speed world are constantly looking for new ways to deliver services to customers at reduced cost. Cloud technologies in particular are now not only being explored but are becoming widely adopted, with new Cloud Industry Forum statistics showing that 80% of UK companies are adopting cloud technology as a key part of their overall IT and business strategy.

That said, the cloud is yet to be widely accepted as the safe storage location that the industry says it is. There is still a great deal of apprehension, in particular from larger organisations, about entrusting large volumes of data to the cloud. Indeed, for the last 20 years, storage has been defined by closed, proprietary and in many cases monolithic hardware-centric architectures, built for single applications, local network access, limited redundancy and highly manual operations.

Storage demands are changing

The continuous surge of data in modern society, however, now requires systems with massive scalability, local and remote accessibility, continuous uptime and greater automation, with fewer resources having to manage greater capacity. The cloud is the obvious answer, but there is still hesitancy.

Let’s face it though, anyone who is starting out today is unlikely to go out and buy a whole bunch of servers to deploy locally. They are much more likely to sign up for cloud-based managed services for functions like accounting, HR and expenses, and have a laptop with a big hard drive to store and share files using Gmail, Dropbox and so on. It is true to say that smaller businesses are increasingly using storage inside cloud apps, but for larger businesses, this option is not quite so simple or attractive. Many enterprises are turning to the cloud to host more and more apps but they still tend to keep the bulk of their static data on their own servers, to not only ensure safety and security but also to conduct faster analytics.

The cloud storage door is only slightly ajar

With increasing data volumes and accelerated demand for scalability, you would expect many businesses to be using cloud-based managed storage already. However, the fact remains that there are still many businesses burying their heads in the sand when it comes to cloud storage. As a result, there is quite a bit of fatigue amongst the storage vendors who have been promoting cloud for some time, but not seeing the anticipated take-up. In fact, I would go so far as to say that the door the industry is pushing against is only slightly ajar.

As with most things, there are clouds and there are clouds. At the end of the day, cloud-based storage can be anything an organisation wants it to be – the devil is in the architecture. If you wanted to specify storage that incorporates encryption, a local appliance, secure high-bandwidth internet connectivity, instant access, replication, green and economical storage media – a managed cloud storage service can actually ‘do’ all of these things and indeed, is doing so for many organisations. There is take-up, just not quite as much as many storage vendors would like.

It’s all about the data

Nowadays, for most organisations it is about achieving much more than just the safe storage of data. It’s more and more common to bolt-on a range of integrated products and services to achieve a wide range of specialist goals, and it’s becoming rare that anyone wants to just store their data (they want it to work for them). Most organisations want their data to be discoverable and accessible, as well as have integrity guarantees to ensure the data will be usable in the future, automated data storage workflows and so on. Organisations want to, and need to, realise the value of their data, and are now looking at ways to capitalise on it rather than simply store it away safely.

Some organisations, though, can't use managed cloud storage for a whole raft of corporate, regulatory and geographical reasons. The on-premise alternative to a cloud solution, however, doesn't have to be a burden on your IT, with remote management of an on-site storage deployment now a very real option. This acknowledges that storage capabilities specific to an industry or to an application are now complex. Add on some additional integrated functionality and it's not something that local IT can, or wants to, deal with, manage or maintain. And who can blame them? Specialist services require a specialist managed services provider, and that is where outsourcing, even if you can't use the cloud, can add real value to your business.

What do you want to do with your data?

At the end of the day, the nature of the data you have, what you want to do with it and how you want it managed will drive your storage direction. This includes questions around whether you have static data or data that's subject to change, whether your storage needs to be on-premise or can be in the cloud, whether you want to back up or archive your data, whether you want an accessible archive or a deep archive, and whether you need it to be integrity-guaranteed or something else, long or short term. Cloud won't always necessarily be the answer; there are trade-offs to be made and priorities to set. Critically, the storage solution you choose needs to be flexible enough to deal with these issues (and how they will shift over time), and that is the difficulty when trying to manage long-term data storage. Everything is available and you can get what you want, but you need to make sure that you are moving to a managed cloud service for the right reasons.
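Those questions lend themselves to a simple decision aid. The rules below are an illustrative reduction of the list above into code, not a vendor's methodology:

```python
def storage_direction(static_data: bool, cloud_permitted: bool,
                      deep_archive: bool, integrity_guaranteed: bool) -> str:
    """Map a few of the key questions to a coarse storage direction."""
    if not cloud_permitted:
        return "remotely managed on-premise storage"
    if static_data and deep_archive:
        rec = "managed cloud deep archive"
    elif static_data:
        rec = "managed cloud accessible archive"
    else:
        rec = "managed cloud primary storage with backup"
    if integrity_guaranteed:
        rec += " + integrity guarantees (checksums, periodic audits)"
    return rec

# Example: static research data, cloud allowed, rarely accessed, must stay verifiable.
print(storage_direction(static_data=True, cloud_permitted=True,
                        deep_archive=True, integrity_guaranteed=True))
```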

Ever-increasing organisational data volumes will continue to relentlessly drive the data storage industry and today’s storage models need to reflect the changing nature of the way in which businesses operate. Managed storage capabilities need to be designed from the ground up to facilitate organisations in maximising the value they can get from their data and reflect how those same organisations want to access and use it both today, and more importantly, for years to come.

Written by Nik Stanbridge, VP Marketing at Arkivum

Cloud is growing, but will it be your organisation’s downfall?

The reality is that most enterprise applications are well on their way to being cloud based. We've seen it with simple workloads such as HR and payroll, travel and expense management, and in the last decade we've seen the cloud as the new normal for customer relationship management (CRM) deployments. According to Gartner[1], "Spending on public-cloud-based, vertical-specific applications is expected to significantly increase through 2017, further highlighting the growing confidence in their use for mission-critical systems."

Upgrading your enterprise resource planning (ERP) system to the cloud means retiring your old approach to business management applications and no longer having to procure, install, maintain, and manage the infrastructure. And perhaps most compelling is to leverage the cloud to redefine your business processes and take advantage of a new era of service delivery and flexibility to enable your organisation to grow.

So what are the benefits of cloud-based ERP solutions? Below are the top five reasons why moving your ERP system to the cloud will benefit your business and support business growth.

  1. Freedom of Choice

Put quite simply, not all cloud ERP systems are created equal. Specifically, very few ERP vendors respect your right to choose the deployment model that is most appropriate for you, and revise that decision down the road as your business grows or technical needs change. Your right to transition between on-premises, multi-tenant, and single tenant is an important one. It recognises that the “best” deployment model for you today might not be the best model in a few years, or even a few months. By providing the choice of Multi-Tenant (with its compelling economics and seamless upgrades) or Single Tenant (allowing more administrative control and administrative ownership), you can choose the model that works best for you.

  2. Compelling Cloud Economics

Despite the cloud having proven its value beyond just good financial sense, there is no doubt that for companies of all sizes the economics of cloud deployment are undeniably compelling, moving from capital to operational expenditure. Some of the more hidden economic benefits of the cloud include:

  • It is not as capital intensive as an on-premises deployment, thanks to the subscription-based pricing model.
  • Scalability is better and more immediate, allowing clients to add (and sometimes remove) users on demand and saving them from having to invest in hardware and software at the "high water mark".
  • It avoids the direct and indirect costs of your infrastructure, from server and database systems to the actual hardware and its replacement cycle.
  • It removes the hidden costs of maintaining the servers yourself.
  • Deployment times are reduced (with correspondingly improved ROI), as the necessary infrastructure is already in place.
  3. Better IT Resource Utilisation

Moving to the cloud means that your IT department will be able to deliver higher-value activities that are better aligned with your mission, and they will be able to spend less time "patching the servers and keeping the lights blinking." At the end of the day, most IT departments are stretched pretty thin, and find themselves spending too much time on low-value (but admittedly critical) activities such as verifying backups, applying security updates, and upgrading the infrastructure upon which your critical systems run. There is tremendous business benefit to assigning those tasks back to your ERP vendor as part of a cloud deployment, freeing up your IT department's time to work on more strategic business projects such as creating executive dashboards, deploying mobile devices, and crafting helpful management reports.

  4. The Cloud is More Secure

Today, it’s hard to imagine a client who could possibly create a more secure operating environment than leading cloud providers. Indeed, Gartner reports[2] that “Multi-tenant services are not only highly resistant to attack, but are also a more secure starting point than most traditional in-house implementations.”

Where security once implied locking the server room door and forcing people to use long passwords, today it means hardened electronic operating environments. You can't claim to be secure unless you have systems and people protecting your infrastructure 24 hours a day, 365 days a year, and verifying that security updates from all vendors are thoughtfully tested, then applied.

Security today is a comprehensive, end-to-end mindset that has to be built across every layer of the ERP environment from the physical network interface cards to the user passwords. It means a holistic approach to anticipating and minimising possible natural, human, and technical disruptions to your system to ensure uptime and peace of mind.

  5. Mobile and Collaborative

The modern ERP deployment landscape is full of mobile professionals, including sales and service staff operating outside the four walls of your office, who expect access to the ERP system from their handheld devices. You may also have mobile onsite staff such as shop floor operators and logistics staff that need to access your ERP from tablets and similar devices. Moving to a cloud-based system gives everyone the real-time system access they require as a routine part of their jobs while driving out the inefficiency of paper-based processes and the burden and security risk of figuring out how to deliver this yourself.

Opening up your ERP system by virtue of cloud deployment allows you to retire the poorly defined ad-hoc “integration by Excel file” workflows that might have cropped up across your organisation. In their place, you can deploy real-time integration processes that link your employees, suppliers, partners, and customers.
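A minimal sketch of what replacing an "integration by Excel file" hand-off looks like: the system that creates an order posts it straight to the ERP's API instead of exporting a spreadsheet. The endpoint, token and payload shape are hypothetical placeholders; real ERP APIs vary by vendor:

```python
import json
import urllib.request

# Hypothetical cloud ERP endpoint and token -- placeholders only.
ERP_URL = "https://erp.example.com/api/v1/sales-orders"
API_TOKEN = "replace-with-a-real-token"

order = {"customer": "ACME-042", "lines": [{"sku": "WIDGET-9", "qty": 12}]}

request = urllib.request.Request(
    ERP_URL,
    data=json.dumps(order).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {API_TOKEN}"},
    method="POST",
)

# The order lands in the ERP the moment it is created -- no spreadsheet
# export, no manual re-keying, no stale copy of the truth.
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```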

Cloud deployment brings the opportunity to redefine many of your legacy business processes and workflows in a way that leverages these more open, connected, instantaneous integration paths.

ERP solutions aren’t just software. They are tools that can be used to help grow your business profitably, offering flexible solutions that provide more accurate information in real-time, driving smarter, faster decision-making, and enabling customers to quickly meet changing market demands to stay ahead of their competition. The cloud increases the business benefits that ERP offers and can accompany your business on the road to successful growth.

Written by Martin Hill, Vice President Marketing at Epicor International