Category Archives: Enterprise IT

CIF: Enterprises still struggling with cloud migration

UK enterprises are still struggling with their cloud migrations, the CIF research shows

The latest research published by the Cloud Industry Forum (CIF) suggests over a third of UK enterprise IT decision-makers believe cloud service providers could have better supported their migration to the cloud.

The CIF, which polled 250 senior IT decision-makers in the UK earlier this year to better understand where cloud services fit into their overall IT strategies, said it is clear UK businesses are generally satisfied with their cloud services and plan to use more of them. But 35 per cent of those polled also said their companies still struggle with migration.

“The transition to cloud services has, for many, not been as straightforward as expected. Our latest research indicates that the complexity of migration is a challenge for a significant proportion of cloud users, resulting in unplanned disruption to the business,” said Alex Hilton, chief executive of the Cloud Industry Forum.

“There may be a case that cloud service providers need to be better at either setting end user expectations or easing the pain of migration to their services. But equally, it’s important that end users equip themselves with enough knowledge about cloud to be able to manage it and ensure that the cloud-based services rolled out can support business objectives, not hinder them.”

Piers Linney, co-chief executive of Outsourcery said the research highlights the need for providers to develop a “strong integrated stack of partners.”

“IT leaders looking for a provider should first assess their existing in-house skills and experience to understand how reliant they will be on the supplier to ensure a smooth transition. Equally, cloud suppliers need to be more sensitive to their customers’ requirements and tailor their service to the level of support needed for successful cloud adoption,” he said.

“The most critical factor is for IT leaders to really get under the bonnet of their potential cloud provider, make sure that they have a strong and highly integrated stack of partners and a proven track record of delivery for other customers with needs similar to their own.”

IDC: Cloud high on the list for utilities sector but skills shortage pervades

The utilities sector is struggling with an ageing workforce and lacks critical cloud skills

About three quarters of utilities see moving their on-premise apps and workloads into the public cloud as a dominant component in their IT strategies, according to a recent IDC survey. But Gaia Gallotti, research manager at IDC Energy Insights told BCN the firms need to first overcome a pretty significant skills gap and an ageing workforce if they are to modernise their systems.

According to the survey, which polled 38 international senior utility executives, the vast majority of respondents are sold on the benefits cloud could bring to their IT strategies.  About 87 per cent said cloud services provide better business continuity and disaster recovery than traditional technology, and 74 per cent said public cloud migration will be dominant within their broader IT strategy.

Interestingly, while 76 per cent of respondents believe cloud providers can offer better security and data privacy controls than their own IT organisation,  63 per cent said ceding control to a cloud provider is a barrier to their organisation’s adoption of cloud services.

“The utilities industry can no longer afford to deny the advantages of ‘going into the cloud.’ As security concerns are further debunked, utilities expect to see a significant push in their cloud strategy maturity in the next 24 months, so much so that they expect to make up lost ground and even supersede other industries,” Gallotti said.

But most also believe internal IT skillsets are not up to speed with new cloud standards, methodologies and topologies; 74 per cent said they will need a third-party professional services firm to help develop a public cloud strategy.

“This is a huge problem the industry is facing, but not exclusively for cloud services. Utilities are struggling to attract talent in all IT domains, especially for the ‘third platform’, as they compete with companies in the IT world that attract ‘Generation Y’ talent more easily,” Gallotti explained.

“The utilities industry also has an issue with an ageing workforce outside of IT and across its other business units. In the short term, we expect utilities to rely more on their service providers to fill skills gaps that emerge, in the hope of more easily attracting the right talent as the industry transforms and becomes more appealing to Gen Y.”

Philips health cloud lead: ‘Privacy, compliance, upgradability shaping IoT architecture’

Ad Dijkhoff says the company’s healthcare cloud ingests petabytes of data, experiencing 140 million device calls on its servers each day

Data privacy, compliance and upgradeability are having a deep impact on the architectures being developed for the Internet of Things, according to Ad Dijkhoff, platform manager of the HealthSuite Device Cloud at Philips.

Dijkhoff, who formerly helped manage the electronics giant’s infrastructure as the company’s datacentre programme manager, helped develop and now manages the company’s HealthSuite device cloud, which links over 7 million healthcare devices and sensors in conjunction with social media and electronic medical health record data to a range of backend data stores and front-end applications for disease prevention and social healthcare provision.

It collects all of the data for analysis and to help generate algorithms to improve the quality of the medical advice that can be generated from it; it also opens those datastores to developers, which can tap into the cloud service using purpose-built APIs.
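The developer-facing side described here, purpose-built APIs over the datastores, typically amounts to paginated, read-only endpoints. A minimal sketch of consuming such an API follows; the cursor scheme, endpoint shape and record fields are assumptions for illustration, not the actual HealthSuite API.

```python
# Minimal sketch of consuming a paginated device-data API. The cursor
# scheme, endpoint and record shape are assumptions for illustration,
# not the actual HealthSuite API.

def fetch_page(cursor):
    """Stand-in for an HTTP GET against a hypothetical
    /measurements?cursor=N endpoint; returns (records, next_cursor)."""
    pages = {
        0: ([{"device": "scale-1", "kg": 71.2}], 1),
        1: ([{"device": "scale-1", "kg": 70.9}], None),
    }
    return pages[cursor]

def iter_measurements():
    # Walk pages until the API reports no further cursor.
    cursor = 0
    while cursor is not None:
        records, cursor = fetch_page(cursor)
        yield from records

readings = list(iter_measurements())
```

In a real client, `fetch_page` would be an authenticated HTTP call; the generator pattern keeps consumers independent of how the pages are fetched.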

“People transform from being consumers to being patients, and then back to being consumers. This is a tricky problem – because how do you deal with privacy? How do you deal with identity? How do you manage all of the service providers?” Dijkhoff said.

On the infrastructure side for its healthcare cloud service Philips is working with Rackspace and Alibaba’s cloud computing unit; it started in London and the company also has small instances deployed in Chicago, Houston and Hong Kong. It started with a private cloud, in part because the technologies used meant the most straightforward transition from its hosting provider at the time, and because it was the most feasible way to adapt the company’s existing security and data privacy policies.

“These devices are all different but they all share similar challenges. They all need to be identified and authenticated, first of all. Another issue is firmware downloadability – what we saw with consumer devices and what we’re seeing in professional spaces is that these devices will be updated a number of times during a lifetime, so you need that process to be cheap and easy.”

“Data collection is the most important service of them all. It’s around getting the behaviour of the device, or sensor behavior, or the blood pressure reading or heart rate reading into a back end, but doing it in a safe and secure way.”
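Getting a reading into a back end “in a safe and secure way” usually means each device authenticates itself and signs what it sends. A hedged sketch of one common pattern, HMAC-signing a reading before upload, is below; the endpoint, header names and key handling are assumptions, not Philips’ actual scheme.

```python
import hashlib
import hmac
import json

# Hedged sketch: endpoint, header names and signing scheme are
# assumptions for illustration, not Philips' actual HealthSuite API.
API_BASE = "https://device-cloud.example.com/v1"

def build_signed_request(device_id, payload, secret):
    """Build an HMAC-signed envelope for a device reading, so the back
    end can verify both the sender's identity and the body's integrity."""
    body = json.dumps(payload, sort_keys=True)
    signature = hmac.new(secret, body.encode(), hashlib.sha256).hexdigest()
    return {
        "url": f"{API_BASE}/devices/{device_id}/measurements",
        "headers": {
            "Content-Type": "application/json",
            "X-Device-Id": device_id,   # identification
            "X-Signature": signature,   # authentication + integrity
        },
        "body": body,
    }

request = build_signed_request(
    "bp-monitor-0042",
    {"type": "blood_pressure", "systolic": 120, "diastolic": 80},
    secret=b"per-device-provisioning-key",
)
```

The back end recomputes the same HMAC over the received body and compares digests, which covers both of the challenges quoted above: knowing which device is calling and that the reading was not tampered with in transit.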

Dijkhoff told BCN that these issues had a deep influence architecturally, and explained that the company had to adopt a more modular approach to how it deployed each component so that it could drop in cloud services where feasible – or use on-premise alternatives where necessary.

“Having to deal with legislation in different countries on data collection, how it can be handled, stored and processed, had to be built into the architecture from the very beginning, which created some pretty big challenges, and it’s probably going to be a big challenge for others moving forward with their own IoT plans,” he said. “How do you create something architecturally modular enough for that? We effectively treat data like a post office treats letters, but sometimes the addresses change and we have to account for that quickly.”

Real-time cloud monitoring too challenging for most providers, TFL tech lead says

Reed says TFL wants to encourage greater use of its data

Getting solid data on what’s happening in your application in real-time seems to be a fairly big challenge for most cloud service providers out there, explains Simon Reed, head of bus systems & technology at Transport for London (TFL).

TFL, the executive agency responsible for transport planning and delivery for the city of London, manages a slew of technologies designed to support over 10 million passenger journeys each day. These include back office ERP, routing and planning systems, and mammoth databases tapped into line-of-business applications as well as customer-facing apps (i.e. real-time travel planning apps and the journey planner website), along with all the vehicle telematics, monitoring and tracking technologies.

A few years ago TFL moved its customer facing platforms – the journey planner, the TFL website, and the travel journey databases – over to a scalable cloud-based platform in a bid to ensure it could deal with massive spikes in demand. The key was to get much of that work completed before the Olympics, including a massive data syndication project so that app developers could more easily tap into all of TFL’s journey data.

“Around the Olympics you have this massive spike in traffic hitting our databases and our website, which required highly scalable front and back-ends,” Reed said. “Typically when we have industrial action or a snowstorm we end up with 10 to 20 times the normal use, often triggered in less than half an hour.”

Simon Reed is speaking at the Cloud World Forum in London June 24-25.

The organisation processes bus arrival predictions for all 19,000 bus stops in London, and the data is constantly dumped into the cloud in a leaky-tap model; a simple cloud application allows subscribers to download the data in a number of formats, and APIs let developers build access to that data directly into applications. “As long as developers aren’t asking for predictions nanoseconds apart, the service doesn’t really break down – so it’s about designing that out and setting strict parameters on how the data can be accessed and at what frequency.”
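Those “strict parameters on how the data can be accessed and at what frequency” can be honoured client-side as well as enforced server-side. Below is a small sketch of a polling client that enforces a minimum request interval; the stop ID format and the 30-second interval are illustrative assumptions, not TFL’s actual feed terms.

```python
import time

# Illustrative sketch: the stop ID format and 30-second minimum polling
# interval are assumptions, not TFL's actual feed terms.
class RateLimitedClient:
    """Wraps a fetch function and enforces a minimum interval between
    calls, so a subscriber cannot ask for predictions too frequently."""

    def __init__(self, min_interval_s):
        self.min_interval_s = min_interval_s
        self._last_call = None

    def fetch_predictions(self, stop_id, fetcher):
        now = time.monotonic()
        if self._last_call is not None:
            wait = self.min_interval_s - (now - self._last_call)
            if wait > 0:
                time.sleep(wait)  # honour the feed's minimum interval
        self._last_call = time.monotonic()
        return fetcher(stop_id)

# Usage: poll a hypothetical predictions feed at most every 30 seconds.
client = RateLimitedClient(min_interval_s=30.0)
```

Passing the fetch function in keeps the throttling logic independent of the transport, so the same wrapper works whether the data comes over HTTP, a message queue, or a test stub.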

But Reed said gaining visibility into the performance of a cloud service out of the box seems to be a surprisingly difficult thing to do.

“I’m always stunned about how little information there is out of the box though when it comes to monitoring in the cloud. You can always add something in, but really, should I have to? Surely everyone else is in the same position where monitoring actual usage in real-time is fairly important. The way you often have to do this is to specify what you want and then script it, which is a difficult approach to scale,” he said. “You can’t help but think surely this was a ‘must-have’ when people had UNIX systems.”
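The “specify what you want and then script it” approach Reed describes often reduces to sampling a usage metric yourself and flagging anomalies. A minimal sketch follows, with the spike heuristic and thresholds chosen purely for illustration.

```python
import statistics

# Hedged sketch of scripted usage monitoring: flag when the latest
# sample jumps well above the recent baseline (cf. the 10-20x surges
# TFL sees during disruption). The heuristic is illustrative only.
def detect_spike(samples, factor=10.0):
    """Return True when the newest sample is at least `factor` times
    the median of the preceding samples."""
    if len(samples) < 2:
        return False
    baseline = statistics.median(samples[:-1])
    return baseline > 0 and samples[-1] >= factor * baseline
```

The median baseline makes the check robust to the odd noisy sample, but as Reed’s complaint suggests, even this much has to be written, scheduled and maintained by the customer when the provider offers nothing out of the box.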

Monitoring (and analytics) will be important for Reed’s team as they expand their use of the cloud, particularly within the context of the journey data TFL publishes. Reed said those systems, while in a strong position currently, will likely see much more action as TFL pursues a strategy of encouraging use of the data outside the traditional transport or journey planning app context.

“What else can we do to that data? How can we turn it around in other ways? How can other partners do the same? For us it’s a question of exploiting the data capability we have and moving it into new areas,” he said.

“I’m still not convinced of the need to come out of whatever app you’re in – if you’re looking at cinema times you should be able to get the transportation route that gets you to the cinema on time, and not have to come out of the cinema listings app. I shouldn’t have to match the result I get in both apps in order to plan that event – it should all happen in one place. It’s that kind of thinking we’re currently trying to promote, to think more broadly than single purpose apps, which is where the market is currently.”

Telstra, Komatsu sign $23m cloud, IoT deal

Komatsu is tapping Telstra for its cloud, comms and M2M strategy

Telstra inked a deal with mining and construction equipment provider Komatsu to help the company deploy and manage its ICT and IoT strategy over the next three years.

The deal, valued at $23m, is an extension of a move in 2010 to on-board Komatsu’s applications onto Telstra’s cloud platform.

As part of the latest agreement Telstra will provide the core telecoms (voice, data and mobile) and ICT services (IoT and cloud infrastructure services) to Komatsu. The company said it is building on a recent M2M trial which it said enabled a ‘zero touch’ remote download of performance diagnostic data from more than 700 pieces of its equipment on mine sites.

The company said being able to access and analyse the data from inSite Centre in Sydney in real-time removed the need to take the equipment out of the field to download data, resulting in improved fleet and production efficiency.

“From the beginning of our cloud journey with Telstra, the focus has been on giving more time back to the business so we can innovate and adapt, and not worry about IT. This new agreement will be an extension of our collaborative relationship and will ensure we continue to lead our category within the mining sector,” said Ian Harvison, chief information officer at Komatsu.

“Telstra understands where we’ve come from and more importantly where we want to take our business, and we feel very confident that our technology and business is future proofed to allow us to compete in a continually evolving and competitive landscape.”

Sean Taylor, Komatsu Australia’s managing director and chief executive said: “We’re committed to business innovation and staying one step ahead of our customer’s needs – and it’s only through relationships with key partners like Telstra that this is possible. We’re excited about the next phase in our ICT strategy and look forward to many more years of innovation.”

Woodside to deploy IBM Watson to improve oil & gas operations

Woodside will use Watson to improve employee training and oil & gas operations

Australian oil and gas firm Woodside will deploy IBM’s Watson-as-a-Service in order to improve operations and employee training, the companies announced this week.

The energy firm plans to use the cloud-based cognitive compute service to help train engineers and operations specialists on fabricating and managing oil and gas infrastructure and facilities.

The company said the cognitive advisory service it plans to use, ‘Lessons Learned’, will help improve operational processes and outcomes and include over thirty years of collective knowledge and experience operating oil and gas assets.

Woodside senior vice president of strategy, science and technology Shaun Gregory said the move is part of a broader strategy to use data more intelligently at the firm.

“We are bringing a new toolkit to the company in the form of evidence based predictive data science that will bring down costs and increase efficiencies across our organization,” Gregory said.

“Data science, underpinned by an exponentially increasing volume and variety of data and the rapidly decreasing cost of computing, is likely to be a major disruptive technology in our industry over the next decade. Our plan is to turn all of this data into a predictive tool where every part of our organisation will be able to make decisions based on informed and accurate insights.”

Kerry Purcell, managing director, IBM Australia and New Zealand said: “Here in Australia IBM Watson is transforming how banks, universities, government and now oil and gas companies capitalise on data, helping them discover completely new insights and deliver new value.”

CIF cloud code of practice gains European Commission backing

The Cloud Industry Forum’s COP gained the EC’s seal of approval for cloud certification this week

The Cloud Industry Forum’s (CIF) code of practice for cloud service providers has been added to the European Commission’s growing list of cloud certification schemes. The move means it passes the EC’s benchmark for service security and reliability.

The Commission’s Cloud Certification Schemes List was set up as part of the European Cloud Strategy and developed by the European Union Agency for Network and Information Security (ENISA); it gives an overview of different existing certification schemes for cloud services in the region.

The scheme is effectively the Commission’s way of recognising a certification’s claim to ensuring cloud contracts guarantee a certain level of security or reliability, which it hopes will assure European customers of a provider’s claims and help stimulate spending on cloud services.

“This is a major milestone for the Cloud Industry Forum and the broader cloud community.  There are no dedicated cloud standards in the market, making it difficult for small business customers to identify trusted advisors,” said Alex Hilton, chief executive officer of the Cloud Industry Forum.

“We hope this recognition will encourage more users of cloud services to actively seek providers that are CIF-certified, and likewise more CSPs to seek certification. We have taken important steps in providing a foundation in what is a fast changing and, to many, a new technology sector,” Hilton said.

Other certification schemes on the list include the Cloud Security Alliance’s attestation, certification and self-assessment, EuroCloud’s Star Audit, ISO 27001 and PCI v3.

Richard Pharro, chief executive of APM Group, the Cloud Industry Forum’s certification partner, added: “The Code of Practice was first established with the aim of driving levels of accountability, capability and transparency in the Cloud industry, which are all critical to the Cloud service contract. With the adoption of Cloud within businesses progressing at an incredibly fast rate, those key tenets of Cloud delivery are as important as ever.”

“CSPs need to ensure they operate their businesses and services in a fully open and transparent manner where it is clear to their customers – existing and new – that they are trustworthy and capable of offering the services they claim to be able to offer. The CIF CoP is one of very few schemes which offers this much needed reassurance to end users regarding the organisations they choose to work with,” he added.

Five key enterprise PaaS trends to look out for this year

PaaS will see a big shakeup this year according to Rene Hermes, general manager EMEA, Apprenda

The last year has shown that a growing number of enterprises are now choosing Platform as a Service (PaaS) ahead of Infrastructure as a Service (IaaS) as the cornerstone of their private/hybrid cloud strategy. While the enterprise cloud market has obviously experienced a substantial amount of change over the last year, the one thing that’s certain is that this will keep on accelerating over the coming months.

Here are five specific enterprise cloud trends that we believe will prove significant throughout the rest of 2015 and beyond.

The PaaS standard will increasingly be to containerise – While we’ve always committed to the concept of a container-based PaaS, we’re now seeing Docker popularise the concept. The broader enterprise world is now successfully vetting the viability of a container-based architecture, and we’re seeing enterprises move from just asking about containers as a roadmap item to now asking for implementation details. This year won’t necessarily see broad-based customer adoption, but we’re anticipating a major shift as PaaS becomes synonymous with the use of containers.

Practical microservices capabilities will win out over empty posturing – It’s probably fair to say that most of the microservices ‘advice’ offered by enterprise PaaS vendors to date has been questionable at best. Too many vendors have simply repackaged the Service-Oriented Architecture conversation and represented it as their microservices positioning. That’s fine, but it hasn’t helped customers at all as vendors have avoided being held accountable to microservices at both a feature and execution level. This isn’t sustainable, and PaaS and cloud vendors will need to deliver practical guidance driven by core enterprise PaaS features if they are to be taken seriously.

Internet of Things will be a key driver for PaaS implementations – For PaaS implementations to be successful they need to support core business use cases. However, too many are deployed just to simplify the IT model so that developers can quickly build cloud-enabled applications. That approach simply isn’t going to withstand the pressure caused by the increased take-up of innovations such as the Internet of Things, which will require web-service back-ends that are easy to manage, highly available and massively scalable.

Containerising OpenStack services set to create confusion – The move towards OpenStack being deployed within containers is interesting, but we believe adoption will prove slow. With many now expecting container control and management to sit within the PaaS layer, moves such as containerised OpenStack are likely just to cause confusion. Given that PaaS is becoming the dominant form of cloud assembly, containerised IaaS will stall as it conflicts directly with the continued growth in enterprises deploying private/hybrid PaaS – regardless of whether they’ve built IaaS already.

PaaS buyers to dismiss infrastructure prescriptive solutions – Many PaaS vendors do a lot of marketing around being portable, but in reality many organisations find that this can increase IT risk and drive lock-in by deliberately creating stack dependencies. We’re finding PaaS buyers much keener to challenge vendors on their infrastructure portability as early as the proof of concept phase. That’s because customers want an enterprise PaaS that doesn’t favour one infrastructure over another. To ensure this outcome, customers are now using their RFPs and proofs of concept to insist that PaaS vendors demonstrate that their solutions are portable across multiple infrastructure solutions.

By Rene Hermes, general manager EMEA, Apprenda

BMJ CTO: ‘Consumerisation of IT brings massive risks’

Sharon Cooper, CTO of BMJ

As we approach Cloud World Forum in London this June, BCN had the opportunity to catch up with one of the conference speakers, Sharon Cooper, chief technology officer of BMJ, to discuss her views on the risks brought about by the consumerisation of IT.

What do you see as the most disruptive trend in enterprise IT today?

For me it is the consumerisation of IT, but not because I’m worried that the IT department is being put out of business, or because business users don’t know what tools they need to run their business. My concern about the disruption is that it carries hidden risks and potentially massive costs, because many of today’s applications and tools are so deceptively simple to use that business users are not aware of things that might be critical to them, in part because the IT department always controlled everything and hid much of the complexity from them.

Tools are so easy to use that someone just signs up with their email address, uploads a large spreadsheet full of personal customer data, and then leaves, forgetting to tell anyone that they have that account; it might even be under their personal email address. The company then has no idea where its corporate assets are being stored, and when a customer asks to be removed from the company’s databases, nobody knows that the customer’s details are hidden away in locally used Google Drives, Dropboxes or other applications.

If nobody in the company has a view over what tools are used, by whom and what’s in them, is the company, or are the individual employees using those tools, even aware of the risk? Business users are reasonably savvy people, but they probably won’t check the T&Cs or remember that extremely boring mandatory information governance training module they had to complete last year.

I really encourage people in my organisation to find good tools: SaaS, cloud-based services, apps. But I ask them to ensure that my team knows what they are, has given them a quick review to see whether they are genuine and not some sort of route for activists, and has checked over the T&Cs. I remind them that they are now totally responsible for any personal customer data or sensitive corporate information in those applications, and that they will be the ones impacted if the ICO comes calling.
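The process Cooper describes, where the IT team keeps track of which SaaS tools are in use and whether their terms have been reviewed, can be as simple as a lightweight register. Here is a sketch; the field names and review criteria are invented for illustration, not BMJ’s actual process.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a lightweight SaaS register; field names are
# invented for illustration, not BMJ's actual process.
@dataclass
class SaasTool:
    name: str
    owner: str                  # the business user responsible
    holds_personal_data: bool
    tcs_reviewed: bool = False  # has IT checked the T&Cs?

@dataclass
class ToolRegistry:
    tools: dict = field(default_factory=dict)

    def register(self, tool):
        self.tools[tool.name] = tool

    def unreviewed_with_personal_data(self):
        """The risky set: tools holding customer data whose terms
        nobody has checked."""
        return [t.name for t in self.tools.values()
                if t.holds_personal_data and not t.tcs_reviewed]
```

Even a register this simple answers the question posed above: who is using what, what is in it, and which tools would be a problem if the ICO came calling.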

What do you think the industry needs to work on in terms of cloud service evolution?

Trying to get legislation to catch up with the tech, or even be in the same century.

What does BMJ’s IT estate look like? What are the major services needing support?

We have a bit of everything, like most companies, although I believe we have made fairly significant moves into cloud and SaaS/managed services.

Our desktop IT, which is provided by our parent company, is very much traditional/on-premise, although we have migrated our part of the business to Google Apps for Business, which has dramatically transformed staff’s ability to work anywhere. We’re migrating legacy bespoke CRM systems to cloud-based solutions, and use a number of industry-specific managed services to provide the back office systems that we use directly, rather than via our parent.

Our business is in digital publishing and the tools that we use to create the IP and the products that drive our revenue are predominantly open source, cloud-based, and moving increasingly that way. Our current datacentre estate includes private cloud, with some public cloud, and we believe we will move more towards public over the next 2-3 years.

Can you describe some of the unique IT constraints or features particular to your company or the publishing sector? How are you addressing these?

Our parent company is in effect a UK trade union, and its needs are very, very different from ours; we were originally its publishing department and are now an international publisher with the majority of our revenues coming from outside the UK. There is some overlap but it is diminishing over time.

Our market is relatively slow to change in some ways, so our products are not always driven as fast by changes in technology or in the consumer IT markets.

Traditionally academic publishing is not seen as a huge target for attack, but the nature of what we publish, which can be considered by some to be dangerous, has the potential to increase our risks above those of some of our peers. For example, there are controversies over the accuracy of medical treatments: we were the journal that produced the evidence that Andrew Wakefield’s research into MMR was wrong, and he pursued us through the courts for years. If that story had broken today, would we have been a target of trolling or even hacktivists? We also sell products into the Middle East that contain information on alcohol-related diseases, and we have been asked to remove them because there is supposedly no alcohol-related disease in those countries (we have not bowed to this government pressure).

As the use of knowledge at the point of care becomes ever more available via devices that can be used by anyone, anywhere, so grows the additional burden of medical device regulation and other challenges which, coming from a print publishing background, were never relevant before.

Are there any big IT initiatives on the horizon at BMJ? What are the main drivers of those?

We have probably under-invested in many applications over the last several years; a policy of really sweating an IT asset was in place for years. We have a range of systems we will be replacing and consolidating over time: for example, we have five different e-commerce systems, and revenue is processed in more than three applications.

As with most companies a focus on data and analytics in all of its guises will be critical as we move forward.

Why do you think it’s important to attend Cloud World Forum?

It’s always good to see what vendors are offering and to hear what others have done to solve problems in their industries, which might have relevance to yours. Quite often it means you don’t feel quite so bad about your own situation when you hear other people’s tales.

Leeds Building Society targets customer engagement with HP deal

Leeds Building Society shifted its savings and lending business onto a hosted HP platform in 2013

Leeds Building Society is to revamp its customer engagement tools through a ten-year deal with HP Enterprise Services, which will encompass a number of independent software vendors working on different parts of the business. The deal builds on the earlier deal between the two firms in 2013, which focused on moving the building society’s core banking platform to the cloud.

Under the 10-year agreement, HP Application Transformation Services will work with independent software vendors TIBCO, Numéro and Infor to provide Leeds Building Society with customer engagement capabilities hosted in an HP Helion managed virtual private cloud environment. This will help the society streamline its mortgage and savings processes, making it easier to grow market share and penetrate new market segments.

The deal has several parts. Omni-channel customer experience management specialist Numéro will provide contact management capability for new customer communication channels. The idea is to ensure the building society can offer support across any communications channel, without the customer having to start the process again. Infor’s multi-channel, interactive campaign management solution, Infor Epiphany, will help the building society to offer customers personalised communications, allowing the society to strengthen individual customer relationships. HP Exstream will provide customer communication (such as statements, notices and renewals) through customers’ preferred channel. TIBCO ActiveMatrix BPM software will digitise its business processes, systems and applications.

“Like all financial institutions, our future is dependent upon delivering the right services for current and future customers,” said Tom Clark, chief information officer, at Leeds Building Society. “ICE represents the cornerstone of our long-term strategy to deliver significant productivity and customer communication channel improvements while reducing costs and meeting regulatory requirements. HP already hosts our core application for mortgages and savings and, with a proven track record of delivering large-scale hosted services and innovative technology, can help us to achieve our business objectives.”

Leeds Building Society joined the shared services alliance founded by HP Enterprise Services and the Yorkshire Building Society in September 2013, in a deal that saw the society move its core application for mortgages and savings to the cloud. The deal also marked a growing recognition among the UK’s mid-tier institutions of the power of cloud to help them move with the times.

HP’s original deal with the Yorkshire Building Society involved shifting the building society’s core mortgage and savings application to the cloud. That in turn enabled the Yorkshire to effectively offer its automated mortgage sales, lending and savings account processing product as a white labelled solution to other financial institutions (which it had been doing for years), through HP.

The Leeds Building Society is the fifth largest of its kind in the UK, with assets of £10 billion. Founded in 1875, the society has approximately 703,000 customers and 65 branches in the UK, with 29 in Yorkshire and a branch each in Dublin and Gibraltar.