Help Desk Basics: How Effective Is Your Help Desk, Really?

An Effective Help Desk

Before you can determine how effective your help desk is, you need to determine what your help desk does. What would you say your help desk’s most important purpose is?

I’ve had the opportunity to work on help desks of many different sizes. I’ve found that, in almost every case, help desk technicians are unable to answer the basic question of why they work for the company. Some will say “We fix PCs!” or “We answer phones!”

Those may be good examples of their duties, but they do not directly answer why they have a job.

For internal IT help desks, the answer is simple, but rarely communicated or discussed:

The role of all help desk employees is to reduce employee downtime and maximize employee financial productivity.

If you or your team do not have a clear understanding of your primary goal, you may be working on solving problems that do not impact your core objectives. You will also never be able to truly determine how effective your help desk is.

Many metrics are available to help desk managers and employees: Average Talk Time, Call Abandon Rate, First Level Resolution, Client Satisfaction Scores, and more. If financial productivity is your primary focus, that focus narrows your key metrics down to a much smaller set.

To be clear, all metrics are worth understanding and measuring, but decisions and direction should be driven primarily by your key drivers. The #1 most important metric for identifying effectiveness:

First Call Resolution 

How often does your help desk have the tools, knowledge, and access to resolve reported issues on the FIRST call, without delay or escalation? The only faster way to resolve an issue is by avoiding the issue altogether—a topic we will cover in a future blog post.
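As a concrete illustration, the metric itself is simple to compute once tickets record whether the first contact resolved them. The sketch below is a minimal, hypothetical example; the field names ("resolved_on_first_contact", "escalated") are illustrative, not from any particular help desk system.

```python
def first_call_resolution_rate(tickets):
    """Percentage of tickets resolved on the first contact, without escalation."""
    if not tickets:
        return 0.0
    resolved_first = sum(
        1 for t in tickets
        if t["resolved_on_first_contact"] and not t["escalated"]
    )
    return 100.0 * resolved_first / len(tickets)

# Hypothetical sample data: four tickets, two genuinely closed on first contact.
tickets = [
    {"resolved_on_first_contact": True,  "escalated": False},
    {"resolved_on_first_contact": True,  "escalated": True},   # escalated later
    {"resolved_on_first_contact": False, "escalated": False},
    {"resolved_on_first_contact": True,  "escalated": False},
]
print(first_call_resolution_rate(tickets))  # 50.0
```

Note that a ticket escalated after an apparent first-contact fix is counted as a miss here; how you treat such edge cases is a policy choice each help desk must make explicitly.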

Do you agree?  Do you disagree?  I would love to hear from you!

By Steven White, Director of Customer Service, Managed Services

[video] Enterprise #DigitalTransformation | @ExpoDX #AI #FinTech #SmartCities

Digital Transformation (DX) is not a “one-size-fits all” strategy. Each organization needs to develop its own unique, long-term DX plan. It must do so by realizing that we now live in a data-driven age, and that technologies such as Cloud Computing, Big Data, the IoT, Cognitive Computing, and Blockchain are only tools. In her general session at 21st Cloud Expo, Rebecca Wanta explained how the strategy must focus on DX and include a commitment from top management to create great IT jobs, monitor progress, and never forget that their enterprise is in a day-to-day battle for survival.


How analytics will revolutionise supply chains in 2018

  • While 94% of supply chain leaders say that digital transformation will fundamentally change supply chains in 2018, only 44% have a strategy ready.
  • 66% of supply chain leaders say advanced supply chain analytics are critically important to their supply chain operations in the next 2 to 3 years.
  • Forecast accuracy, demand patterns, product tracking and traceability, transportation performance, and analysis of product returns are use cases where analytics can close knowledge gaps.

These and other insights are from The Hackett Group study, Analytics: Laying the Foundation for Supply Chain Digital Transformation (10 pp., PDF, no opt-in). The study provides insightful data on the increasing importance of analytics in driving improved supply chain performance, illustrates how analytics is enabling business objectives across a range of industries, and lays out the key points to consider in creating a roadmap for implementing advanced supply chain analytics on the way to digital transformation. It’s an interesting, insightful read on how analytics are revolutionizing supply chains in 2018 and beyond.

Key takeaways from the study include the following:

66% of supply chain leaders say advanced supply chain analytics are critically important to their supply chain operations in the next two to three years

The Hackett Group found that the majority of supply chain leaders feel a sense of urgency about getting advanced supply chain analytics implemented and contributing to current and future operations. The majority also see the value of advanced analytics that can scale across their entire supplier network.

Improving forecast accuracy, optimizing transportation performance, improving product tracking & traceability and analyzing product returns are the use cases providing the greatest potential for analytics growth

Each of these use cases, along with the ones shown in the graphic below, has information and knowledge gaps that advanced supply chain analytics can fill. Of these top use cases, product tracking and traceability is one of the fastest growing, due to the stringent quality standards defined by the US Food & Drug Administration in 21 CFR Sec. 820.65 for medical device manufacturers. The greater the complexity and cost of compliance with federally mandated reporting and quality standards, the greater the potential for advanced analytics to revolutionize supply chain performance.
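To make the forecast-accuracy use case concrete: one common way to quantify it is mean absolute percentage error (MAPE). The study does not prescribe a specific metric, so this is an assumption; the demand figures below are invented for illustration.

```python
def mape(actuals, forecasts):
    """Mean absolute percentage error across matched periods.

    Assumes every actual value is nonzero (MAPE is undefined otherwise).
    """
    errors = [abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical monthly demand vs. forecast, in units shipped.
actual_demand   = [100, 120, 90, 110]
forecast_demand = [110, 115, 95, 100]
print(round(mape(actual_demand, forecast_demand), 1))  # 7.2
```

Lowering a number like this is exactly the kind of knowledge-gap closure the study describes: better inputs (real-time data, external signals) feed better forecasts, which feed lower landed costs and leaner inventory.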

Optimizing production and sourcing to reduce total landed costs (56%) is the most important use case of advanced supply chain analytics in the next two to three years

The Hackett Group aggregated use cases across the four categories of reducing costs, improving quality, improving service and improving working capital (optimizing inventory). Respondents rank improving working capital (optimizing inventory) with the highest aggregated critical importance score of 39%, followed by reducing costs (29.5%), improving service (28.6%) and improving quality (25.75%).

44% of supply chain leaders are enhancing their enterprise resource planning (ERP) systems’ functionality and integration to gain greater enterprise and supply chain-wide visibility

Respondents are relying on legacy ERP systems as their main systems of record for managing supply chain operations, and integrating advanced supply chain analytics to gain end-to-end supply network visibility. 94% of respondents consider virtual collaboration platforms for internal & external use the highest priority technology initiative they can accomplish in the next 2 to 3 years.

The majority of companies are operating at stages 1 and 2 of The Hackett Group’s supply chain analytics maturity model

A small percentage are at the stage 3 level of maturity according to the study’s results. Supply chain operations and performance scale up the model as processes and workflows are put in place to improve data quality, provide consistent real-time data and rely on a stable system of record that can deliver end-to-end supply chain analytics visibility. Integrating with external data becomes critically important as supply networks proliferate globally, as does the need to drive greater predictive analytics accuracy.

Organically DevOps | @DevOpsSummit #DevOps #Serverless #AI #Monitoring

Some people are directors, managers, and administrators. Others are disrupters. Eddie Webb (@edwardawebb) is an IT Disrupter for Software Development Platforms at Liberty Mutual and was a presenter at the 2016 All Day DevOps conference. His talk, Organically DevOps: Building Quality and Security into the Software Supply Chain at Liberty Mutual, looked at Liberty Mutual’s transformation to Continuous Integration, Continuous Delivery, and DevOps. For a large company in a heavily regulated industry, this task is not only daunting but viewed by many as impossible.


The Reality of AR and VR | @ExpoDX #AI #DX #IoT #DigitalTransformation

Augmented reality (AR) and virtual reality (VR) have been the subject of much discourse in the last several years. They were widely anticipated in the gaming and entertainment industries, but in marketing and corporate settings, the benefits were murky at best. Today, these technologies are becoming more of a reality in all areas of business.
For example, the new IKEA® shopping app leverages augmented reality to help shoppers determine how certain popular items will look in their homes – no trip to a crowded megastore necessary. Marriott® deployed virtual reality for customers to see its hotel rooms around the world without having to actually travel, and KFC® and Walmart® are using virtual reality to train employees before they begin working.


Microsoft posts more solid cloud growth with $28.9bn in overall revenue for Q218

Microsoft continues to cite the cloud as the key to its success; the company has posted revenues of $28.9 billion and operating income of $8.7bn for Q218, increases of 12% and 10% respectively.

While the company does not give specific numbers for Azure, its ‘intelligent cloud’ segment, which focuses on Azure and server products, saw revenues of $7.8bn, an increase of 15% year over year, while the ‘productivity and business processes’ segment, which focuses more on Office 365, went up 24% to $8.95bn.

Despite the ever-improving cloudy outlook, the quarter did see an overall loss of $6.3bn, which Microsoft attributed to a ‘net charge of $13.8bn related to the Tax Cuts and Jobs Act.’

Satya Nadella, Microsoft CEO, told analysts the ‘intelligent cloud and intelligent edge platform [was] fast becoming a reality.’

“As intelligent cloud, intelligent edge becomes more predominant, our architectural advantage is increasingly clear to our customers,” said Nadella, as transcribed by Seeking Alpha. “You see this reflected in the latest CIO surveys, as well as in our 98% Azure revenue growth this quarter. Only Microsoft delivers hybrid consistency, developer productivity, AI capabilities, and trusted security and compliance.”

Highlights for Microsoft in the most recent quarter include a partnership with SAP to ‘provide enterprise customers with a clear roadmap to confidently drive more business innovation in the cloud’, as well as launching Azure Migrate, a free service to help organisations move over from their VMware environments.

The latter initially did not go down well with VMware, with the company saying at the time it ‘cannot endorse an unsupported and non-engineered solution that isn’t optimised for the VMware stack’. In other words, if customers took the plunge and something went wrong, they would be on their own. However, in December Microsoft posted a missive saying it was working with VMware and its cloud provider program partners on an offering to be made generally available later this year. VMware then updated its blog post to remove the more antagonistic language, saying it was ‘in the process of engaging with the partner to ensure compliance and that the appropriate support model is in place.’

You can read the full financial report here. Watch this space, however, as Amazon Web Services (AWS) posts its financials later today.

The Top 10 IoT Trends For 2018 | @ExpoDX #IoT #IIoT #SmartCities #AI #DX

The IoT Will Grow: In what might be the most obvious prediction of the decade, the IoT will continue to expand next year, with more and more devices coming online every single day. What isn’t so obvious about this prediction: where that growth will occur. The retail, healthcare, and industrial/supply chain industries will likely see the greatest growth. Forrester Research has predicted the IoT will become “the backbone” of customer value as it continues to grow. It is no surprise that retail is jumping aboard, hoping to harness the power of the IoT to connect with customers, grow their brands, and improve the customer journey in deeply personal ways. But industries like healthcare and supply chain are not far behind. They’re using the technology to connect with patients via wearable devices, and to track products from factory to floor. In many ways, the full potential of the IoT is still being realized; we’ll likely see more of that in 2018.


Why standard web hosting is not suitable for protected health information

Web and mobile applications are a key differentiator between healthcare providers in a highly competitive industry — customers want to be able to access their healthcare records, prescriptions, and associated data in the way that is most convenient to them. Healthcare providers who aren’t online are falling behind.

In the rush to build online services, it may be tempting to take shortcuts where protected health information (PHI) hosting is concerned. After all, a standard web hosting provider is perfectly capable of storing and serving healthcare data.

But most web hosting and server hosting providers aren’t capable of offering services that comply with the Health Insurance Portability and Accountability Act (HIPAA) — in fact, they aren’t even capable of providing a platform on which a skilled developer could build HIPAA-compliant applications and services.

The risk of being non-compliant with HIPAA should not be taken lightly. Non-compliance fines are large enough to damage the viability of healthcare businesses, and for the worst breaches healthcare professionals and administrators might be jailed.

There are seven fundamental requirements for HIPAA compliance:

  • Transport encryption
  • Storage encryption
  • Backups
  • Authorization
  • Data integrity
  • Data disposal
  • A HIPAA Business Associate Agreement with third-party vendors

A HIPAA-compliant hosting provider doesn’t automatically make an application or website compliant, but it does provide a solid foundation on which to build HIPAA-compliant services.

The authorization requirement of HIPAA stipulates that only authorized individuals are able to access PHI and that access is via audited access controls. That means no one should be able to access the data without permission, either over the network or from inside the data center. Most hosting providers do not have sufficient network or physical security to be compliant, and, without the necessary security, there is no way for an application that stores or processes PHI to be compliant.
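The pairing of permission checks with audit logging can be sketched in a few lines. This is a hypothetical illustration of the principle, not a real product's API; the role names and in-memory access list are invented, and a production system would use a database-backed identity provider and tamper-evident log storage.

```python
import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("phi_audit")

# Illustrative access-control list: which users hold the "read_phi" permission.
ACL = {"dr_smith": {"read_phi"}, "billing_clerk": set()}

def access_phi(user: str, record_id: str) -> bool:
    """Check authorization for a PHI record and audit the attempt either way."""
    allowed = "read_phi" in ACL.get(user, set())
    # Every attempt is logged, including denials: that is the "audited
    # access controls" part of the requirement.
    audit_log.info("user=%s record=%s allowed=%s", user, record_id, allowed)
    return allowed

print(access_phi("dr_smith", "rec-42"))       # True
print(access_phi("billing_clerk", "rec-42"))  # False
```

The key design point is that the denial is recorded, not just the success; an audit trail that only shows granted access cannot demonstrate that unauthorized attempts were caught.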

Healthcare data must be stored in such a way that guarantees it hasn’t been tampered with or altered. The best way to protect data from tampering is to encrypt it, and encryption at rest isn’t a service the majority of web hosts are able to provide. Encryption can be managed at the application layer, but whole disk encryption is safer, reduces complexity, and should be a standard part of any HIPAA-compliant hosting plan.
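The tamper-detection half of that requirement can also be enforced at the application layer with a keyed MAC, complementing (not replacing) disk-level encryption. The sketch below uses Python's standard hmac module; the key, record format, and field contents are illustrative only, and a real deployment would fetch the key from a secrets manager.

```python
import hashlib
import hmac

SECRET_KEY = b"example-key-from-a-secrets-manager"  # hypothetical placeholder

def sign_record(record: bytes) -> str:
    """Return a hex MAC that reveals any alteration of the stored record."""
    return hmac.new(SECRET_KEY, record, hashlib.sha256).hexdigest()

def verify_record(record: bytes, mac: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign_record(record), mac)

record = b"patient_id=123;rx=amoxicillin"
mac = sign_record(record)
print(verify_record(record, mac))                # True
print(verify_record(b"altered" + record, mac))   # False
```

Because the MAC depends on a secret key, an attacker who can modify stored data but does not hold the key cannot forge a matching tag, which is exactly the integrity guarantee HIPAA asks for.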

When a company stores or processes protected health information, it has two options: build and manage a HIPAA-compliant hosting platform, or use HIPAA-compliant hosting provided by a third-party vendor like Liquid Web. The first option is expensive, complex, and not a core competence of most healthcare businesses.

The more economical, efficient, and secure option is to use a third-party host with HIPAA expertise. Only an expert and experienced HIPAA-compliant hosting provider is able to provide a trustworthy HIPAA Business Associate Agreement — and without such an agreement, a healthcare provider’s websites and applications can’t be HIPAA-compliant.

HIPAA-compliant hosting may look superficially similar to standard web hosting, but under the hood a huge amount of work goes into building a secure, reliable platform that empowers healthcare professionals to build the HIPAA-compliant services their customers expect.

The Surprising Truth About Cloud Security | @CloudExpo #DigitalTransformation

Another day, another breach. No wonder security is tied for the top barrier to cloud adoption, according to 2017 research from RightScale, with 25 percent of survey respondents naming it, alongside expertise and expense, as their greatest challenge.

In the face of security concerns, IT executives have mistakenly found comfort in private clouds over public clouds. The RightScale survey found that enterprises run about 75 percent of workloads in the cloud, with 43 percent done in a private cloud and 32 percent handled in a public cloud.


Internet of Blockchains | @ExpoDX #FinTech #Blockchain #Hyperledger #Bitcoin #Ethereum

A few years ago – in the early days of Blockchain – a lot of people were taken with the idea of a multifunctional chain on which all transactions could be handled. After Ethereum was launched in 2014, its advocates were talking themselves hoarse about the transformative opportunities the platform introduced. Decentralized applications, they predicted, along with all sorts of value transfers would be executed exclusively on Ethereum from that point on, and no other networks would ever be needed.
