Archivo de la categoría: Enterprise IT

Businesses are ready for cloud – but lack of transparency is limiting its usefulness

Despite common perceptions, cutting costs isn’t the primary reason businesses are choosing cloud these days. The other major advantages are the agility and scalability cloud brings, enabling organisations to quickly respond to business demand. The combination of benefits is driving both IT and lines of business to rely on cloud to serve as a foundation for innovation and enablement.

But the advantages of cloud cannot be fully harnessed if transparency into the environments is compromised. Clouds that limit visibility result in significant operational and financial issues, including performance problems or outages, challenges reporting to management, and unexpected bills. In fact, challenges with transparency restrict 63% of organizations from growing their cloud usage. That’s according to a recent global survey conducted by Forrester Consulting that we commissioned. The survey sought insights from 275 IT executives and decision makers who are experienced cloud customers.

When it comes to data about cloud environments, what are organisations looking for from their providers? Clearly security and compliance information is important. Worryingly, 39% of those surveyed said they lacked security data and 47% said they lacked compliance data. Not surprisingly, the majority said they needed on-demand access to necessary reports to make compliance and audit processes easier.

That said, on-demand reporting technology only goes so far, and many respondents wanted suggestions and/or support from experts on staff at the cloud provider. In light of evolving security risks and corporate compliance concerns – especially as lines of business adopt cloud without IT involvement – cloud providers need to simplify the process for ensuring advanced security and compliance in the cloud, not get in the way.

Beyond security and compliance, performance information, historical information and clear details about costs and upcoming bills are also key. Without this, businesses find it hard to plan for or meet the needs of their end users. It also makes it extremely difficult to budget properly.

Just like with their own servers, organisations need to understand the performance of a cloud service to get the most from it, whether that means making sure resources are running properly, anticipating potential issues or preventing wasteful “zombie virtual machines.” Due to a lack of transparency from their cloud providers, more than a third of the respondents in the survey ended up with bills they hadn’t expected and 39% found they were paying for resources they weren’t actually using.
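The “zombie VM” waste described above is straightforward to detect once a provider exposes utilisation data. A minimal sketch, assuming hypothetical per-VM fields (`name`, `cpu_pct`, `days_idle`) rather than any real provider’s API:

```python
# Sketch: flag idle "zombie" VMs from utilisation samples.
# Field names (name, cpu_pct, days_idle) are illustrative, not a provider API.

def find_zombies(vms, cpu_threshold=2.0, min_days_idle=30):
    """Return names of VMs whose average CPU stayed under cpu_threshold
    for at least min_days_idle days -- candidates for reclamation."""
    return [vm["name"] for vm in vms
            if vm["cpu_pct"] < cpu_threshold and vm["days_idle"] >= min_days_idle]

vms = [
    {"name": "web-01",      "cpu_pct": 41.0, "days_idle": 0},
    {"name": "test-legacy", "cpu_pct": 0.4,  "days_idle": 92},
    {"name": "batch-07",    "cpu_pct": 1.1,  "days_idle": 45},
]
print(find_zombies(vms))  # ['test-legacy', 'batch-07']
```

The point is less the code than the precondition: without transparent utilisation data from the provider, even a filter this simple is impossible to run.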

Cloud customers can use data to make better purchasing decisions. Clear information from a cloud provider will help companies discover where they need more resources, or even where they can regain capacity and maximise their spend.

Once again though, beyond the on-demand data, customers require solid support to ensure they are getting what they need from cloud. In the survey, 60% of respondents said that problems with support were restricting their plans to increase their usage of cloud. Issues like slow response times, lack of human support, lack of expertise of the support personnel and higher-than-expected support costs started with the onboarding process and only continued. Aside from preventing customers from reaping the benefits of cloud, these issues leave businesses feeling that they’re seen more as a source of revenue than as a valued cloud customer.

When it comes down to it, cloud customers should not settle for cloud services that limit visibility into their cloud environments. Compromises in transparency mean sacrifices to the very agility, scalability and cost benefits that drive organisations to cloud in the first place. And beyond transparency, customers should not underestimate the human element of cloud. A cloud provider’s customer support plays a huge role in speeding return on cloud investment, and ultimately, in determining the success or failure of a cloud initiative.

As the Forrester study states, “Whether you are a first-time cloud user or looking to grow your cloud portfolio, our research shows that your chances of success are greater with a trusted cloud provider at your side — one that gives you the technology and experts to solve your challenges.”

You can read more about the survey findings in the study, “Is Your Cloud Provider Keeping Secrets? Demand Data Transparency, Compliance Expertise, and Human Support From Your Global Cloud Providers.”

Written by Dante Orsini, senior vice president, iland

Thames Tideway Tunnel taps Accenture in NetSuite deal

Accenture claims this is the first implementation of a multi-tenant cloud-based ERP system at a regulated utility in the UK


Thames Tideway Tunnel, the project company set up to manage London’s “super-sewer” overflow reduction project, has deployed NetSuite’s cloud-based ERP platform in a bid to reduce costs and drive flexibility in its financial and project planning operations.

The company, which is due to start construction on a super sewer system to tackle sewage overflowing into the River Thames, said it required a flexible, low-cost IT systems implementation to support its core financial and project planning operations.

It enlisted Accenture to help deploy NetSuite OneWorld across the organisation.

“An agile and intuitive back-office IT system is critical for effective management and delivery of large-scale infrastructure projects,” said Robin Johns, head of Information Systems at Thames Tideway Tunnel.

“We selected Accenture to help us with this implementation based on its extensive experience with NetSuite cloud ERP technology and complex system integrations. We also chose Accenture for its ability to offer practical solutions to deliver an IT platform that will help facilitate financing and construction of the super sewer, while keeping costs down for customers,” Johns said.

Maureen Costello, managing director of Accenture’s utilities practice in the UK and Ireland, said this is the first implementation of a multi-tenant cloud-based ERP system at a regulated utility in the UK.

It “demonstrates the company’s innovative approach and commitment to efficiently manage the delivery of this capital project,” Costello said.

Netflix to retire on-prem datacentres by summer’s end

Netflix is making big changes to how it architects its service


Netflix said it plans to move its last remaining on-prem systems to the cloud in a move aimed at streamlining its datacentre strategy.

According to a recent report in The Wall Street Journal’s CIO Journal, while its entire customer-facing business already runs on AWS, Netflix is planning to completely retire its own datacentres later this summer.

While most of its internal applications also run in the public cloud, the company still uses its own infrastructure to store backups of its video collection, and for persistent failover.

It is clear Netflix has until very recently continued to invest in that infrastructure. Earlier this year the video streaming giant swapped 16 existing storage systems for three XIV systems to reduce datacentre floor space used by about 80 per cent and boost its database transactions-per-minute.

It was also testing IBM’s recently announced Spectrum Storage software, which is designed to optimise storage and ease management within hybrid cloud environments.

Moving all of its systems and applications to the cloud will complement a massive architectural overhaul announced earlier this year.

The company said rising demand for its service, which is mostly deployed on AWS infrastructure from multiple locations (initially just in the US) has prompted an effort to simplify its architecture so that it can scale more rapidly and reduce outages.

“Over the past 7 years, Netflix streaming has expanded from thousands of members watching occasionally to millions of members watching over two billion hours every month.  Each time a member starts to watch a movie or TV episode, a “view” is created in our data systems and a collection of events describing that view is gathered.  Given that viewing is what members spend most of their time doing on Netflix, having a robust and scalable architecture to manage and process this data is critical to the success of our business,” the company said at the time.

A tale of two ITs


Werner Knoblich, senior vp and gm of Red Hat in EMEA

Gartner calls it ‘bimodal IT’; Ovum calls it ‘multimodal IT’; IDC calls it the ‘third platform’. Whatever you choose to call it, these are all names for the same evolution in IT: a shift towards deploying more user-centric, mobile-friendly software and services that are more scalable, flexible and easily integrated than the previous generation of IT services. And while the cloud has evolved as an essential delivery mechanism for the next generation of services, it’s also prompting big changes in IT, says Werner Knoblich, senior vice president and general manager of Red Hat in EMEA.

“The challenge with cloud isn’t really a technology one,” Knoblich explains, “but the requirements of how IT needs to change in order to support these technologies and services. All of the goals, key metrics, ways of doing business with vendors and service providers have changed.”

Much of what Knoblich is saying will resonate with any large organisation managing a big legacy estate while trying to adopt more mobile and cloud services; the gap between the ‘two ITs’ can be quite jarring.

The chief goal used to be reliability; now it’s agility. In the traditional world of IT the focus was on price for performance; now it’s about customer experience. In traditional IT the most common approach to development was the classic ‘waterfall’ approach – requirements, design, implementation, verification, maintenance; now it’s all about agile and continuous delivery.

Most assets requiring management were once physical; now they’re all virtualised machines and microservices. The applications being adopted today aren’t monolithic beasts as they were traditionally, but modular, cloud-native apps running in Linux containers or platforms like OpenStack (or both).

Not just the suppliers – but also the way they are sourced – has changed. In the traditional world long-term, large-scale multifaceted deals were the norm; now, there are lots of young, small suppliers, contracted in short terms or on a pay-as-you-go basis.

“You really need a different kind of IT, and people who are very good in the traditional mode aren’t necessarily the ones that will be good in this new hybrid world,” he says. “It’s not just hybrid cloud but hybrid IT.”

The challenges are cultural, organisational, and technical. According to the 2015 BCN Annual Industry Survey, which polled over 700 senior IT decision makers, over 67 per cent of enterprises plan to implement multiple cloud services over the next 18 months, but close to 70 per cent were worried about how those services would integrate with other cloud services and 90 per cent were concerned about how they will integrate those cloud services with their legacy or on-premise services.

That said, open source technologies that also make use of open standards play a massive role in ensuring cloud-to-cloud and cloud-to-legacy integrations are achievable and, where possible, seamless – one of the main reasons why Linux containers are gaining so much traction and mind share today (workload portability). And open source technology is something Red Hat knows a thing or two about.

Beyond its long history in server and desktop OSs (Red Hat Enterprise Linux) and middleware (JBoss), the company is a big sponsor and early backer of OpenStack, the increasingly popular cloud-building software built on a Linux foundation. It helped create an open source platform as a service, OpenShift. The company is also working on Atomic Host, a slimmed-down version of RHEL designed to host Linux containers, with support for other open source container technologies including Kubernetes and Docker, the darlings of the container community.

“Our legacy in open source is extremely important and even more important in cloud than the traditional IT world,” Knoblich says.

“All of the innovation happening today in cloud is open source – think of Docker, OpenStack, Cloud Foundry, Kubernetes, and you can’t really think of one pure proprietary offering that can match these in terms of the pace of innovation and the rate at which new features are being added,” he explains.

But many companies, mostly the large supertankers, don’t yet see themselves as ready to embrace these new technologies and platforms – not just because they don’t have the type or volume of workloads to migrate, but because the move requires a huge cultural and organisational shift. And cultural as well as organisational shifts are typically rife with political struggles, resentment, and budgetary wrestling.

“You can’t just install OpenStack or Dockerise your applications and ‘boom’, you’re ready for cloud – it just doesn’t work that way. Many of the companies that are successfully embracing these platforms and digitising their organisations set up a second IT department that operates in parallel to the traditional one, and can only seed out the processes and practices – and technologies – they’ve embraced when critical mass is reached. Unless that happens, they risk getting stuck back in the traditional IT mentality.”

An effective open hybrid approach ultimately means not only embracing the open source solutions and technologies, but recognising that some large, monolithic applications – say, Cobol-based mainframe apps – won’t make it into this new world; neither will the processes needed to maintain those systems.

“For some industries, like insurance for instance, there isn’t a recognised need to ditch those systems and processes. But for others, particularly those being heavily disrupted, that’s not the case. Look at Volkswagen. They don’t just see Mercedes, BMW and Tesla as competitors – they see Google and Apple as competitors too because the car becomes a technology platform for services.”

“No industry is secure from disruption, particularly from players that scarcely existed a few years ago, which is why IT will be multi-modal for many, many years to come,” he concludes.

This interview was developed in partnership with Red Hat

Sixth-sensors: The future of the Internet of Things and the connected business

IT departments will soon have to worry about IoT


An IT admin walks into his cabin and instantly knows something is wrong. He does not even have to look at his dashboard to identify the problem. Instead, he heads straight to the server room to fix the server, which is overheating because of a failed fan.

The IT admin does not have a sixth-sense. He is alerted to the problem by an internet-enabled thermostat in the server room which sensed the rise in temperature and automatically changed the lighting to alert the admin, through an internet-enabled lightbulb and his smart watch.

This is not the plot of a futuristic Sci-Fi movie. It is 2015 and just one example of how the Internet of Things (IoT) is already at work in business.
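The scenario above is, at bottom, a simple event rule: a sensor reading crosses a threshold and triggers notification actions. A minimal sketch, with invented device names and notification channels standing in for real smart-bulb and smartwatch APIs:

```python
# Sketch of the thermostat -> light/watch alert rule described above.
# Device names and notification channels are hypothetical stand-ins.

ALERT_TEMP_C = 35.0  # illustrative threshold for a server room

def on_temperature_reading(room, temp_c, actions):
    """Append alert actions when a room's temperature crosses the threshold."""
    if temp_c >= ALERT_TEMP_C:
        actions.append(f"set_light:{room}:red")            # internet-enabled bulb
        actions.append(f"notify_watch:{room} at {temp_c}C")  # smart watch alert
    return actions

actions = on_temperature_reading("server-room", 41.5, [])
print(actions)
```

In a real deployment this rule would live in an IoT hub or automation service rather than a script, but the sense-decide-notify shape is the same.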

Smart living

Every few years, IT communities become awash with new buzzwords and trends that early adopters declare as the next big thing and sceptics decry as impractical and over-hyped. Over time, some fizzle out because of low industry acceptance, while others go on to really disrupt the industry.

From smart cars to watches and even homes, connected technologies are already changing consumer lives, fuelling growing expectations and apprehensions. Last year, the government demonstrated its belief in the future potential of the technology when it pledged to spend £45m to develop the IoT, more than doubling the funds available to UK technology firms developing everyday devices that can communicate over the internet.

In the consumer market, IoT technology is already being lapped up. Within just a few months of its launch, Apple claimed 75% of the smartwatch market. Self-driving cars have yet to take to Britain’s roads. However, with prototypes already being trialled and app developers racing to create everything from connected entertainment to automated piloting using GPS, once the infrastructure required to make smart cities a reality is sanctioned by local councils and city mayors, IoT could literally find itself in the driving seat.

Smart workplaces

Outside of very early prototype projects, IoT does not currently rank highly on the enterprise agenda, which is typically a few years behind the general technology adoption cycle. However, in the not-too-distant future, smart devices will be the norm – IDC estimates the market will be worth $8.9 trillion by 2020, with 212 billion connected devices.

With the promise of enhanced business processes and intelligence, IoT is increasingly being touted as a holy amalgamation of big data, mobility and cloud technology. Despite this, in the short term at least, businesses will be reluctant to let sensitive data flow through such internet-enabled devices due to obvious security concerns. The exception is large businesses that have already explored the potential of machine-to-machine connectivity in their industries, such as automotive and insurance.

Where smart devices are catching on in day-to-day business is in an entirely different function of operations – facilities. What if your management decides to install internet-enabled LED bulbs and thermostats? Will the IoT bring additional responsibilities on to the service desk? A definite yes.

Facilities need to be managed – and that requires a tool to manage them. That’s just the start. For example, each bulb in a smart, IoT-connected environment must be monitored and checked to confirm it is working.

Assuming there are over 100 such appliances in an office environment, consider all the IP addresses that will need to be allocated. Likewise, a mesh network may also be required to manage that allocation, with each connected device joining and extending an ad-hoc network.
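The address-planning point can be made concrete with Python’s standard `ipaddress` module: given a device count, work out the smallest IPv4 subnet that can hold them all (the `10.20.0.0` range is an arbitrary example):

```python
import ipaddress

def smallest_prefix(hosts_needed):
    """Smallest IPv4 prefix length whose subnet holds hosts_needed usable
    addresses (network and broadcast addresses excluded)."""
    for prefix in range(30, 0, -1):
        usable = 2 ** (32 - prefix) - 2
        if usable >= hosts_needed:
            return prefix
    raise ValueError("too many hosts for IPv4")

# A /25 gives 126 usable addresses -- enough for 100+ smart devices.
prefix = smallest_prefix(100)
net = ipaddress.ip_network(f"10.20.0.0/{prefix}")
print(prefix, net.num_addresses - 2)  # 25 126
```

Putting those devices on their own subnet like this also lines up with the dedicated-network approach to security discussed below.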

As previously non-IT facilities start to be connected to the internet, it will be the job of the IT team to make sure they’re working well. As the volume of devices connected to the network grows, securing it will be even more challenging.

Of course, organisations can get around the security challenge by having a local network dedicated only for these devices, but the management of this expanded estate would nonetheless require a dedicated management tool.

Where large organisations have already invested in machine-to-machine (M2M) interactions and deployed connected devices in their facilities, the purpose has typically been to achieve automation and gather more intelligence.

As yet, smaller businesses do not have to worry about automation and logistics at such large scales, and it’s clear that the IoT is definitely not going to transform their business operations overnight. However, before long, IoT will be something all IT departments should learn to manage – especially the new generation of IoT-connected devices which would traditionally have been classed and managed as non-IT assets.

Written by Pradyut Roy, product consultant, ManageEngine

HMRC embraces PaaS, moves tax platform to the cloud

HMRC is building out a PaaS and cloud strategy to support its digital services agenda


In a bid to offer a set of new and redesigned digital services HMRC is moving its tax platform to the cloud and rolling out automated infrastructure to support its internal platform as a service (PaaS).

“In HMRC we are scaling up our cloud infrastructure as we prepare to deliver more new and redesigned digital services. These services sit on the “Tax Platform”, our internal platform as a service,” explained Kalbir Sohi, an Infrastructure Digital Service Manager at HMRC.

“Over the last two years developing the Tax Platform we’ve been automating the creation of infrastructure to ensure consistency and quality in our infrastructure by defining it in code and decreasing the amount of time that people in our team spend doing repetitive manual tasks like provisioning and configuring servers.”

In a bid to ease vendor lock-in the organisation has been using a range of open source tools including Puppet, git and VCloud Tools to build and scale the infrastructure over the past two years; it’s also contributing code back to the codebases where relevant.
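HMRC’s actual tooling is Puppet, but the underlying “infrastructure defined in code” idea, declaring a desired state and converging towards it idempotently, can be sketched in illustrative Python (package names and states here are invented, not HMRC’s configuration):

```python
# Sketch of idempotent, declarative provisioning: apply a desired state,
# changing only what differs. Package names and states are illustrative.

desired = {"nginx": "installed", "ntp": "installed", "telnet": "absent"}

def apply_state(current, desired):
    """Return the actions needed to converge current -> desired.
    Running it a second time yields no actions (idempotence)."""
    actions = []
    for pkg, state in desired.items():
        if current.get(pkg, "absent") != state:
            actions.append(("install" if state == "installed" else "remove", pkg))
            current[pkg] = state
    return actions

current = {"telnet": "installed"}
first = apply_state(current, desired)
second = apply_state(current, desired)  # no-op: already converged
print(first, second)
```

It is this converge-to-declared-state property that delivers the consistency Sohi describes: the code, kept in version control, becomes the single source of truth for what a server should look like.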

“We are committed to both using open source products and contributing back to the community to improve them based on what we are doing. This should help us to avoid being tied to one specific supplier or technology but will also allow us to contribute to some of the interesting and novel cloud tools that are emerging — hopefully making these tools more useable for organisations like HMRC,” Sohi explained.

The Tax Platform is designed to make building, testing, deploying and running microservice-based web applications very easy, and is intended to help HMRC embrace a cloud service brokerage model it said would help ease digital service delivery internally and externally.

The organisation’s next step is to focus on scripting networking, storage and compute automation into the platform, and selecting the open source tools to help make its cloud brokerage ecosystem more robust for those maintaining the platform.

“Having access to fast, repeatable, efficient infrastructure will change the way that teams approach building and running the HMRC services that do not fit the platform as a service model. We are changing much more than this too. We are taking a new approach to infrastructure which will shape how we organise ourselves to deliver services in the future. At the heart of this is designing for, and testing with our users.”

UK SME cloud adoption swells on flexible working growth

UK SMEs are turning to the cloud to support their flexible working needs


UK SMEs are upping their use of cloud services in a bid to cater to more flexible working practices, recently released research from BT Business and the British Chambers of Commerce (BCC) suggests.

According to a survey of over 300 decision makers working in small and medium-sized businesses in the UK, nine out of ten (91 per cent) companies have at least one member of staff working from home, and a fifth of businesses (19 per cent) said more than half of their workforce works away from their main office location.

The BCC said the results are directly linked to growing cloud service use. About 69 per cent of businesses use cloud-based applications, and more than half (53 per cent) say they are critical to effective remote working.

As one might have guessed, internet connectivity was also rated quite highly on the list of core elements required to effectively facilitate flexible working (63 per cent), and smartphones are seen as the technology that has made the biggest difference to businesses in the last 12 months (according to 68 per cent of respondents).

“It is vital to ensure that UK businesses have access to world-class digital infrastructure if they are to maintain their competitiveness in a global marketplace,” said Adam Marshall, executive director of policy and external affairs, BCC.

“Cloud and mobile technologies are becoming increasingly important as firms expand into new markets and explore new ways of working – especially overseas. It is encouraging to see that so many British firms are adapting their working practices to take advantage of these developments,” he added.

Legislation that came into effect last summer means employees in the UK with over 26 weeks’ service are eligible to request flexible working hours, allowing more employees to set up home offices and work remotely. Research from the Office for National Statistics found that in the first three months of 2014, 4.2 million staff across the country worked from home, equating to 13.9 per cent of the workforce – a figure that is only set to grow since the law’s passing.

Office enlists private cloud for global expansion

Office is in the middle of a significant global expansion


Shoe retailer Office is using a private cloud and managed virtualisation services to handle spikes in online ordering ahead of one of its busiest periods.

Office has over 150 stores across Europe and the US and began rolling out its international e-commerce site earlier this year in a bid to expand its presence globally.

The company enlisted e-commerce specialist Envoy Digital to help with its broader digitisation efforts. It is using Rackspace’s private cloud platform to host the e-commerce site, which is built using the hybris platform, and VMware-based managed virtualisation in combination with load balancers to manage and distribute workloads and traffic.

“When working with any cloud provider, it’s critical that they can ensure only a minimal amount of our time is spent overseeing the IT infrastructure so that it operates smoothly,” said Robin Worthington, multichannel director, Office. “This allows us to focus on what we’re best at – helping customers find the right shoes.”

The company said it wanted to migrate its international platform to the cloud and improve the reliability of its multichannel infrastructure in advance of the summer season, which is one of the busiest for the shoe retailer.

IoT security and the world of US medicine

IoT in healthcare faces its fair share of challenges


Internet of Things security is anything but a homogeneous concept. It is, rather, extremely dependent on the type of products being developed and – in many cases – the sort of regulatory restrictions they are subject to.

Of all the sectors where IoT is proliferating, however, it is arguably the medical sector that is the most fraught. In medical IT, developers have to operate in a minefield of intense regulation, life-and-death safety issues, and an unusually high (and of course very much unwelcome) degree of scrutiny from hackers.

The hacking of medical data is a popular criminal enterprise, particularly in the US, where just last week UCLA Health hospitals said hackers may have accessed personal information and medical records of as many as 4.5 million patients.

However, while no-one would be overjoyed at the thought of something as intimate as their medical records falling into the hands of digital crooks, it is arguably the patient who has the least to worry about here. The main targets of medical data theft are US insurance companies and the institutions that administer Medicare. In the US, patients usually collect medication and leave it to pharmacists to bill the insurance companies.

A single refill for five months’ medication can easily add up to a few thousand dollars, so the rewards for effective fraud – with hackers posing as pharmacists – are large. Insurance companies, of course, foot the bill, while for those impersonated the results can cost time, stress, and in worst case scenarios a potentially dangerous delay in securing their medication.

It’s just one example of why security around medical data – medical IoT’s bread and butter – has to be so tight.

Someone extremely familiar with the territory is Sridhar Iyengar, one of the founders of AgaMatrix. At AgaMatrix, Iyengar helped develop the first iPhone-connected medical device, a glucose monitor called iBGStar, then a revolutionary innovation for diabetes sufferers.

Nowadays Iyengar’s focus is on Misfit, a wearables company focusing on fitness rather than illness, but he is still deeply involved with issues surrounding IoT, health, and security. In September, he will attend the Internet of Things Security conference in Boston as a keynote speaker, where he will draw on his expertise in diabetes to illustrate the wider challenges confronted by developers in the realm of medical IoT.

“The Holy Grail in this world of diabetes is what they call an artificial pancreas,” he says, “meaning that, if you can sense how much glucose is in your blood, you can pump in the right amount of insulin to automatically regulate it. Nobody has made a commercial version of that. Partly because the folks who make a glucose sensor are different to the folks that make the pumps and it has been difficult for the two to cooperate due to trade secrets and the complexities of sharing the liability of devices from different manufacturers that must work in unison. The patients are left to suffer.”
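The loop Iyengar describes, sense glucose, dose insulin, repeat, is classic closed-loop feedback control. A toy proportional-control sketch makes the idea concrete; every constant and the one-line “body model” here are invented for illustration and bear no resemblance to a clinical dosing algorithm:

```python
# Toy closed-loop sketch of "sense glucose -> dose insulin".
# All constants and the body model are invented for illustration only.

TARGET = 100.0   # mg/dL, illustrative setpoint
GAIN = 0.05      # insulin units per mg/dL above target (invented)

def control_step(glucose):
    """Proportional controller: dose only when glucose is above target."""
    return max(0.0, GAIN * (glucose - TARGET))

def simulate(glucose, steps=20, drop_per_unit=10.0):
    """Crude stand-in for the body: each insulin unit lowers glucose a bit."""
    for _ in range(steps):
        glucose -= control_step(glucose) * drop_per_unit
    return glucose

print(round(simulate(180.0), 1))  # converges towards the 100.0 setpoint
```

The engineering difficulty Iyengar points to is precisely that, in a real artificial pancreas, the sensor and the pump in this loop come from different manufacturers who must share liability for it.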

In one famous incident, this frustrating discontinuity was first overcome by a “citizen scientist,” a father who hacked his diabetic child’s separate devices and was able to link the two together. While this was never marketed, it signalled that the race for a commercially viable artificial pancreas was very much on. However, while no-one would resent such intrepid ingenuity on the part of the “citizen scientist,” Iyengar points out that it also demonstrates the devices in question were very much hackable.

“If somebody hacks into an insulin pump you could kill someone,” he says. “They overdose, they go into a coma, they die. None of these insulin pump manufacturers are going to open source anything: they can’t, because of the deadly consequences of someone hacking it.”

Ultimately, it will prove an interesting challenge for future regulators to establish precisely where to draw the line on issues such as this. Still, the capacity for others to easily take control of (for instance) a connected pacemaker is bound to generate a degree of concern.

Many of these issues are complicated by existing regulations. The US Health Insurance Portability and Accountability Act (HIPAA) requirements state that medical data can only be shared after it has been completely anonymised, which presents something of a paradox to medical IoT, and frequently requires complex architectures and dual databases, with pointers enabling healthcare professionals to blend the two together and actually make sense of them.
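The dual-database pattern that HIPAA pushes developers towards can be sketched simply: anonymised clinical data lives in one store, identities in a separate restricted store, and an opaque token is the pointer that lets authorised staff re-join them. All field names below are illustrative:

```python
import secrets

# Sketch of the dual-database pattern: anonymised records in one store,
# identity mapping in a separate restricted store. Fields are illustrative.

identity_store = {}   # restricted access: token -> patient identity
clinical_store = {}   # shareable: token -> de-identified clinical data

def admit(name, glucose_reading):
    token = secrets.token_hex(8)          # opaque pointer joining the stores
    identity_store[token] = {"name": name}
    clinical_store[token] = {"glucose": glucose_reading}
    return token

token = admit("Jane Doe", 142)
assert "name" not in clinical_store[token]   # clinical data alone is anonymous
# Staff authorised for both stores can re-join the record via the token:
print(identity_store[token]["name"], clinical_store[token]["glucose"])
```

A production system would add access control, auditing and encryption around each store; the sketch only shows why two databases plus pointers are needed at all.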

Issues like this mean developers can’t rely on industry standard architectures.

“You can’t rely on this network immune system that exists in the consumer software space where many different parties are vigilant in monitoring breaches and bugs because multiple vendors’ code is used by a product,” says Sridhar, picking an apt metaphor. “If you want to develop security related features you kind of have to do it yourself.”  In turn this means that, if there are breaches, you have to address them yourself. “It raises this interesting dilemma,” he says. “On the one hand the way that software’s written in the medical field, it’s supposed to be more safe. But in some situations it may backfire and the entire industry suffers.”

84% of UK CIOs say cloud reduces overall IT control – survey

UK CIOs are concerned cloud adoption is reducing their control over IT


A recent survey of 100 UK CIOs suggests close to nine in ten believe unsanctioned use of cloud services has created long term security risks for their organisations, and about 84 per cent believe cloud adoption reduces their organisation’s control over IT more broadly.

The survey, commissioned by Fruition Partners, looks specifically at IT service management (ITSM) trends in large UK companies (organisations with more than 1,000 employees).

The results suggest CIOs are still very concerned a lack of maturity around cloud service management and application support within enterprises is driving more ‘Shadow IT’ in their organisations.

About 60 per cent of respondents said there is an increasing culture of ‘Shadow IT’ in their organisations, and 79 per cent believe there are cloud services in use that IT does not know about.

Over three quarters (78 per cent) of CIOs stated that the rest of the business frequently does not seek their advice when it comes to the procurement of public cloud services, and about one in two CIOs believe their employees are side-stepping their own IT departments and going directly to cloud service providers for application support.

“CIOs need to remember that while the availability of public cloud services may mean they need to provide fewer IT services themselves, it doesn’t reduce the need for the management of those services. In fact, it’s arguable that the need for rigorous management actually increases. Of course you should expect public cloud services to work faultlessly, however you’d be crazy to blindly trust that they will, without managing and monitoring how those services are delivered to the business,” said Paul Cash, managing director of Fruition Partners UK.

Cash explained that regardless of the type of cloud service IT departments should still be managing them internally rather than “handing over all responsibility to cloud providers.”

“CIOs must make it easier for employees in other lines of business to work with the IT department to source the cloud services they want,” he said. “There are simple initial steps they can take to do this, such as creating and publishing a comprehensive service catalogue which is exposed to the entire business. A service catalogue that lists sanctioned public cloud services will reduce the impact of shadow IT and make it far easier for employees throughout the organisation to buy cloud services from the IT department – while ensuring that IT can control and manage the services that are implemented.”