How to conduct an ITSM assessment that actually means something

ITIL (Information Technology Infrastructure Library), a standard framework for managing the lifecycle of IT services, is sweeping the U.S.  Based on a 2011 analysis of 23 ITIL studies, Rob England concluded that ITIL adoption had grown at a compound annual rate of roughly 20% and that ITIL training attendance had grown at a compound annual rate of 30% over the preceding ten years.  Despite this apparent surge of adoption, enterprises continue to struggle with ITIL’s daunting framework.

Recognizing the confusion inherent in ITIL alignment, numerous vendors have created “ITSM assessments” with varying degrees of complexity and debatable value.  These assessments draw upon frameworks such as ITIL, CMMI-SVC, COBIT and, occasionally, BiSL or more specific constructs such as KCS and IAITAM.  Where does one begin?  What is most important?  Where will improvement deliver the best payback?  How can one ensure that all phases of implementation share a common and scalable foundation?

Fundamental Assessment Approach
Figure 1: Fundamental Assessment Approach

All assessments follow a pretty basic formula:

  1. Determine and document the current state of ITSM in the organization.
  2. Determine and document the desired state of ITSM in the organization.
  3. Establish a practical path from current to desired state (roadmap).

Simply stated, the objective is to successfully execute the ITSM roadmap, thereby achieving a heightened level of service that meets the needs of the business.  But don’t let those vendors through the door just yet because this is where ITSM initiatives go sideways.

Current state, desired state and roadmap mean nothing without first establishing scope and methodology.  How comprehensive should the assessment be?  Does it need to be repeatable?  Which processes and functions should be targeted?  Should it be survey-based?  Who should participate?

Rather than seeking input from the ever so eager and friendly salespeople, one can follow a simple three-step exercise to determine scope and methodology.  These steps, described in the following sections, may save you millions of dollars.  I have seen dozens of large enterprises fail to take these steps with an estimated average loss of $1.25M.  For smaller enterprises ($500M – $1B in revenue), the waste is closer to $450,000.  The bulk of this amount is the cost of failed projects.  In some instances those losses exceeded $10M (usually involving CMDB implementations).

Three Steps to a Meaningful ITSM Assessment

Though these steps are simple, they are by no means easy.  For best results, one should solicit the participation of both IT and business stakeholders.  If the answer comes easily, keep asking the question because easy answers are almost always wrong.  Consider using a professional facilitator, preferably someone with deep, practical knowledge of ITIL and a solid foundation in COBIT and CMMI-SVC.

So, the three steps are really three questions:

  1. Why do you need an ITSM Assessment?
  2. What do you need to know?
  3. How do you gain that knowledge?

Step 1:  WHY Do You Need an ITSM Assessment?

IT Service Management aligns the delivery of IT services with the needs of the enterprise.  Thus, any examination of ITSM is in the context of the business.  If one needs an ITSM assessment, the business must be experiencing pain related to service delivery.

  1. Identify service delivery pain points.
  2. Map each pain point to one or more business services.
  3. Assign a broad business value to the resolution of each pain point (e.g. High, Medium, Low).  Divide these values into hard savings (dollars, staff optimization), soft savings (efficiency, effectiveness), and compliance (regulatory, audit, etc.).
  4. Map each pain point to a process or process area.

There should now be a list of processes with associated pain points.  How well can the business bear the pain over the next few years?  With this preliminary analysis, one should be able to create a prioritized list of processes that require attention.
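
The pain-point mapping described in steps 1–4 can be sketched in a few lines of code. This is a minimal illustration only; the record fields, the value categories, and the weights assigned to them are assumptions for the example, not part of any formal method.

```python
from collections import defaultdict

# Hypothetical weights for the broad business values (High/Medium/Low).
VALUE_WEIGHT = {"High": 3, "Medium": 2, "Low": 1}

# Illustrative pain-point records: each maps to a service and a process.
pain_points = [
    {"pain": "Slow incident resolution", "service": "Email",
     "process": "Incident Management", "value": "High"},
    {"pain": "Outages after releases", "service": "CRM",
     "process": "Release Management", "value": "High"},
    {"pain": "Unknown asset inventory", "service": "Desktop",
     "process": "Asset Management", "value": "Medium"},
    {"pain": "Recurring printer faults", "service": "Print",
     "process": "Problem Management", "value": "Low"},
]

def prioritize(points):
    """Sum the business value mapped to each process and rank descending."""
    scores = defaultdict(int)
    for p in points:
        scores[p["process"]] += VALUE_WEIGHT[p["value"]]
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for process, score in prioritize(pain_points):
    print(f"{process}: {score}")
```

The output is exactly the artifact this step calls for: a prioritized list of processes with an associated sense of how much pain each one carries.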

For now, there is no need to worry about process dependencies.  For instance, someone may suggest that a CMDB is required for further improvements to Event Management.  Leave those types of issues for the assessment itself.

Step 2: WHAT Do You Need to Know?

Four Assessment Needs
Figure 2: Four Assessment Needs

Now that the organization understands why an assessment is required (or if an assessment is required), it can identify, at least in broad terms, the information required for such an assessment.

Referring to the chart in Figure 2, IT management need only ask four questions to determine the needs of an assessment.

Is ISO/IEC 20000 Certification Required?

If the organization requires ISO/IEC 20000 certification, a Registered Certification Body (four listed in the U.S.) must provide a standardized audit, process improvement recommendations, and certification.  For most enterprises, this is a major investment spanning considerable time.

Does Repeated Benchmarking Provide Value?

Does the organization really need a score for each ITIL process?  Will the assessment be repeated on a frequent and regular basis?  Will these scores affect performance awards?  Will the results be prescriptive or actionable and will those prescribed actions significantly benefit the business?

The sales pitch for an ITSM assessment usually includes an ITIL axiom like, “You can’t manage what you don’t measure” (a meme often incorrectly attributed to Deming or Drucker).  One must ask: are scores the best measure of a process?  To what extent do process maturity scores drive improvements?  Not much.  Each process has its own set of Critical Success Factors, Key Performance Indicators and metrics.  These are far more detailed and effective data points than an assessment score.  Ah, but what about the big picture?  Again, ITIL and COBIT provide far more effective metrics for governance and improvement on a macro level.

That said, there are some pretty impressive assessments available, some with administrative functions and audience differentiation baked into the interface.  However, one should build a business case and measure, through CSFs and KPIs, the value of such assessments to the business.

Do you need an ITSM Strategy and Framework?

Does the organization already have an intelligent strategy for its ITSM framework?  Is there a frequently refreshed roadmap for ITSM improvement?  For most enterprises, the honest answer to this is no.  Numerous Fortune 500 enterprises have implemented and “optimized” processes without strategy, roadmap, or framework.  The good news is that they keep consultants like me busy.

To build an ITSM strategy, an organization needs enough information on each process to prioritize those processes as pieces of the overall service workflow.

To gauge the priority of each process, we focus on three factors:

  • Business value of the process – the extent to which the process enables the business to generate revenue.
  • Maturity gap between current and desired state – small, medium or large gap (scores not really required).
  • Order of precedence – is the process a prerequisite for improvement of another process?
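
These three factors can be combined into a rough ranking. The sketch below is an illustrative assumption, not a prescribed scoring model: the process names, gap labels, and the idea of simply adding the three factors together are all hypothetical.

```python
# Illustrative maturity-gap weights; "scores not really required" per the
# text, so coarse small/medium/large buckets are enough.
GAP = {"small": 1, "medium": 2, "large": 3}

processes = [
    {"name": "Incident Management", "business_value": 3,
     "gap": "medium", "prerequisite_for": ["Problem Management"]},
    {"name": "Change Management", "business_value": 2,
     "gap": "large", "prerequisite_for": ["Release Management"]},
    {"name": "Problem Management", "business_value": 2,
     "gap": "small", "prerequisite_for": []},
]

def priority(p):
    """Higher business value, a bigger maturity gap, and being a
    prerequisite for other processes all raise roadmap priority."""
    return p["business_value"] + GAP[p["gap"]] + len(p["prerequisite_for"])

roadmap = sorted(processes, key=priority, reverse=True)
print([p["name"] for p in roadmap])
```

The point is not the arithmetic but the shape of the exercise: each process gets a coarse value on each factor, and the ordering that falls out becomes the first draft of the roadmap.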

To complete the strategic roadmap, one will also need high-level information on ITSM-related tools, integration architecture, service catalog, project schedule, service desk, asset management, discovery, organizational model, business objectives, and perceived pain points.

Are You Targeting Specific Processes?

To some extent, everything up to this point is preparation and planning.  When we improve a process, we do that in the context of the lifecycle.  This task requires deep and detailed data on process flows, forms, stakeholders, taxonomy, inputs, outputs, KPIs, governance, tools, and pain points.

As this assessment will be the most prescriptive, it will require the most input from stakeholders.

Step 3:  HOW Do You Gain that Knowledge?

Finally, the organization identifies the assessment parameters based on the data required.  Similar to the previous step, we divide assessments into four types.

ISO/IEC 20000 Certification

The only standardized ITSM assessment is the audit associated with the ISO/IEC 20000 certification (created by itSMF and currently owned and operated by APM Group Ltd.).  The journey to ISO 20k is non-trivial.  As of this writing, 586 organizations have acquired this certification.  The process is basically measure, improve, measure, improve, … measure, certify.  Because the purpose of improvement is certification, this is not the best approach to prescriptive process optimization.

Vendor-Supplied ITSM Assessment

The administration, content, and output of ITSM assessments vary wildly between vendors.  In most cases, the ITSM assessment generates revenue not from the cost of the assessment but from the services required to deliver the recommended improvements.

Rule #1:  “If you don’t know where you’re going, you’ll probably end up somewhere else” (Laurence J. Peter).   Without a strategy and roadmap, assessments will lead you to a place you would rather not be.

Rule #2:  The assessment matters far less than the assessor.  When seeking guidance on ITSM optimization, one needs wisdom more than data.  A skilled assessor understands this workflow in the context of a broader lifecycle and can expand the analysis to identify bottlenecks that are not obvious from an assessment score.  An example is Release Management.  The Service Desk may complain that release packages are poorly documented and buggy.  Is that the fault of the Release Manager or is it a flaw with the upstream processes that generate the Service Design Package?

Rule #3:  Scores are only useful as benchmarks and benchmarks are only useful when contextually accurate (e.g. relative performance within a market segment).  Despite the appeal of a spider diagram, avoid scored assessments unless compelled for business reasons.  Resources are better spent analyzing and implementing.

Rule #4:  An assessment without implementation is a knick-knack.  Validate the partner’s implementation experience and capability before signing up for any assessments and be prepared to act.

Rule #5:  A free assessment is a sales pitch.

Rule #6:  A survey-based assessment using a continuous sliding scale of respondent perception is a measure of process, attitude, and mood.   So is a two-year-old child.

Rule #7:  In ITSM assessments, simpler is better.  Once a vendor decides that the assessment needs to produce a repeatable score, the usefulness of that tool will decline rapidly.  If you doubt this, just look under the covers of any assessment tool for the scoring methodology or examine the questions and response choices for adherence to survey best practices.

Strategy and Roadmap Workshops

Enterprise Service Management strategies save money because not having them wastes money.  Without guiding principles, clear ownership, executive sponsorship, and a modular, prioritized roadmap, the ITSM journey falters almost immediately. Service Catalogs and CMDBs make a strategy mandatory.  For those who lack an actionable Service Strategy and Roadmap, this is the first assessment to consider.

An enterprise needs an experienced ITSM facilitator for strategy workshops.  Typically, the assessment team will perform a high-level process assessment, relevant tool analysis, framework architecture integration study, and a handful of half-day workshops where the gathered information is molded into a plan for staged implementation.

Targeted Process Assessments

Organizations know where the pain points are and have a pretty good sense of the underlying factors.  The assessor finds this knowledge scattered across SMEs, Service Desk personnel, business line managers, development teams, project office, and many other areas.  The assessor’s value is in putting these puzzle pieces together to form a picture of the broader flows and critical bottlenecks.  Through the inherited authority of the project sponsor, the assessor dissolves the organizational boundaries that stymie process optimization and, with an understanding of the broader flow, assists in correctly identifying areas where investment would yield the highest return.

For these assessments, look for a consultant who has insightful experience with the targeted process.  An assessment of IT Asset Management, a process poorly covered in ITIL (a footnote in the SACM process), requires a different skill set than an assessment of Release and Deployment Management or Event Management.

The output from a Targeted Process Assessment should be specific, actionable, and detailed.  Expect more than a list of recommendations.  Each recommendation should tie to a gap and have an associated value to the business.  Essentially, IT management should be able to construct an initial business case for each recommended improvement without a lot of extra effort.

Summary

Organizations are investing tens of millions in ITSM assessments.  I have seen stacks of them sitting on the shelves of executives or tucked away in some dark and dusty corner of a cubicle.  Whether these assessments were incompetent or comprehensive, as dust collectors, they have zero value.

How prevalent is the lunacy of useless ITSM assessments?  From my own experience and from conversations with others in the field, vendors are selling a lot of dust collectors.  Nobody wants to be the person who sponsored or managed a high-profile boondoggle.

So the advice is this.

  • Don’t waste time on scores because there are better ways to sell ITSM to the board than a spider diagram.
  • Develop and maintain an ITSM Strategy and Roadmap.  As Yogi Berra once said, “If you don’t know where you’re going, you’ll wind up somewhere else”.
  • Assessing and implementing need to be in close proximity to each other.
  • Get an assessor with wisdom who can facilitate a room full of people.
  • Finally, follow the three steps before you let the vendors into your office.

The journey may have many waypoints but let’s just make it once.

Liam McGlynn is a Managing Consultant at Fruition Partners, a leading cloud systems integrator for service management and a Preferred Partner of ServiceNow.  

Product Reviews Roadmap

Swit Swoo! Ros Satar looking glam at Edgbaston Cricket Stadium in Birmingham for the SDI awards dinner.

We have the following technology reviews scheduled for publication before the end of 2013 on The ITSM Review:

Service Catalogue

Knowledge Management

Integrations

  • Tools and technology that facilitate ITSM processes – By Ros Satar
  • Invites being sent now, contact us to participate

Managed Service Providers

  • By Barclay Rae
  • Scheduled for later in 2013

Change, Config and Release

  • by Ros Satar, pencilled in for early 2014
  • If you have any questions or suggestions about who should participate in these reviews please give us a shout.

Futures

We’re currently planning our review calendar for 2014 and very open to your suggestions. Current ideas include Game mechanics on the service desk, Pure play SaaS ITSM, use of ITSM tech for non-IT, Social IT & codeless ITSM.

Don’t be shy, let us know what you want to see! 🙂

Thanks, Martin

Image Credit

Why the CIO won't go the same way as the VP of Electricity

CIO, Dead as a… ?

Commoditisation is, without doubt, a massive and revolutionary trend in IT. In just a handful of years, a huge range of industrialised, cost-effective solutions have created rapid change, so much so that some commentators now predict the end of the corporate IT department altogether.

Info-Tech Research Group’s June 2013 article highlights a comparison made by some, between today’s CIO, and the “VP of Electricity” role apparently ubiquitous in large organisations at the turn of the last century.

As electricity supply standardised and industrialised, the argument goes, the unsuspecting VP of Electricity (and presumably their whole department) found themselves anachronistic and redundant.

It’s a very flawed comparison. There can be no doubt that consumerisation is a reality, but business IT and electricity provision are very different things.

IT Service is not a light switch

Utility electricity is what it is: we may have a choice of billing suppliers, and perhaps several different options at the point of connection (such as smart metering), but the end user sees very little difference: it’s the same product from the same wires. I flip the light switch, and the light comes on. There is very little scope to vary the service, at the point of consumption, to gain significant competitive edge.

Hence, from the end-user’s point of view, electricity is an absolute service: it’s either there, or it isn’t (and usually, it’s there: Britain’s national grid, for instance, works to – and achieves – a 99.9999% reliability target. By comparison, the granddaddy of commoditised IT infrastructure, Amazon AWS, offers 99.95% – a figure that permits 500 times as much downtime).
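
The factor-of-500 claim is easy to verify: convert each availability percentage into the annual downtime it allows and take the ratio. The figures below are the ones quoted above.

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(availability_pct):
    """Annual downtime (in minutes) implied by an availability percentage."""
    return MINUTES_PER_YEAR * (100 - availability_pct) / 100

grid = downtime_minutes(99.9999)  # national grid target: ~0.5 min/year
aws = downtime_minutes(99.95)     # AWS SLA figure: ~263 min/year
print(round(aws / grid))          # 500
```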

By contrast, our customers experience IT as much more than a simple on/off service. It is far more complex, multi-faceted, and variable. We could frequently change our electricity supplier, and our customers and end-users would never be aware. However, change one of the many elements of IT with which they interact, and we can make a significant difference to their working day.

For many end-users, in fact, the “light switch” experience seems a long way off. A Forrester study of 900 end users and 900 IT professionals, in January 2013, found, for example, that 84% of business users experienced a severe or moderate impact on their ability to be productive on a monthly basis, as a direct result of IT issues. 14% experience difficulties at least once per day. The study also revealed that there’s a large gap between how the business thinks about IT and how IT thinks about itself. The difference varies regionally, between 13 and 16 percentage points.

If it’s done well, however, IT is an immensely powerful and proven asset to the business. MIT Sloan’s seminal 2007 study, “Avoiding the Alignment Trap in Information Technology”, identified a best-of-breed group of organisations delivering true “IT-enabled growth”. This group, representing the top 7% of their large sample, was found to be achieving an average of 35% compound annual growth rate, while spending 15% less than the study average on IT.

Commoditised tools

In our IT Service Management functions, commoditisation is just as much a reality as in any other facet of IT. As Gartner Research Director Jeff Brooks put it, “In the ITSM space, and more specifically the use of ITSM tools for IT Service Desk, we continue to see vendors provide commoditised tools”.  Does that mean, then, all of those tools, and the functions which use them, are as good as they will ever be? Clearly not: in January 2012, Brooks’s organisation observed that the average maturity level in IT Infrastructure and Operations was 2.35 on their 1-to-5 scale: a “disheartening conclusion”. As Brooks added: “Opportunities for differentiation exist, but the vendors have yet to capitalise on those prospects”.

He’s right. Even as the service desk, or any other ITSM function, becomes ubiquitous enough to be considered by many to have become commoditised, our customers’ expectations evolve and change. The age of the smartphone has increased mobility, and put leading-edge, location-aware, always-on technologies into homes and pockets. New-look frontline services like Apple’s Genius Bar have created demand for new ways to get support. On-demand commodity services, though convenient, create new management challenges around costs, control, and alignment. We should never consider ourselves finished.

Growth is driven by effective alignment of technology and processes. Henry Ford did not change the car industry simply by switching to standardised services. He created differentiation by aligning great processes with great technology. Some of those technologies, such as his electricity supply, would likely have been the same commodity service consumed by all of Ford’s rivals, but he made the difference with the things he created around them.  Fundamentally, as long as technology and innovation can give one business an edge over another, the role of the technologist – including the CIO and their business unit – will be relevant.

After all, if all companies were to standardise on a single set of commoditised IT offerings today, by tomorrow some of them would have found significant advantages in breaking out of the pattern.

Image Credit

Do you clog your social media channels with useless crap?

True value or ego massage? ‘64% of the people sharing information from others to others did it to get attention, show friendship, show they have inside information, show humour…’

Do you care what you share or clog your social media channels with useless crap?

In this article Tobias Nyberg explores why people share at all.

Sharing and caring

Is there a way to tell if what I’m sharing actually has any value to others? Do my followers, readers and community peers have any use for the information I share or am I guilty of clogging the social media channels with useless crap?

Out of the 239 Twitter followers, 158 Facebook friends, 93 Google+ circlers, and 343 LinkedIn connections I have, a very small percentage interact with me frequently when I share stuff with them. How can I tell whether they, and the ones that are silent, find value in my contributions?

Is sharing caring or is it a way to feed my ego?

I asked myself, the Google+ Back2ITSM community and friends of mine this question some time ago since I wanted to try to understand if I bring any value to the community in these areas or if I should just stop spreading worthless information.

The answers were, of course, not simple, nor entirely what I expected. And just to set some prerequisites straight, I wasn’t necessarily looking for hard-fact metrics on value (even if that would be nice); a good feeling about the value takes me close enough.

There are some basic telltale signs of whether you bring value through the social channels. If people follow you on Twitter, have you in circles on Google+, friend or follow you on Facebook, or connect with you on LinkedIn, they at least think that you at one point or another brought them value. The problem, of course, is that most people don’t un-follow, un-friend or un-circle you if you no longer bring any value. They’ll either ignore you or mute you from their streams.

Another signal is whether your followers re-share your contributions: you would expect them to find your information valuable, and in some cases it probably is. But as it turns out, the main reason people share stuff from others is to either look smart themselves or otherwise boost their own image. (See also ‘Suffering with consumption‘)

An old study I found on how word-of-mouth advertising works can probably be applied to social media as well, at least for the sake of argument here. That study shows that 64% of the people sharing information from others to others did it to get attention, show friendship, show they have inside information, show humour, etc.

There is of course value in that, maybe just not the kind of value I was hoping to bring to the community.

Writing and sharing for your own sake

Some of the people I’ve spoken to about this say that they aren’t that interested in what value they contribute to others. They write, share and interact for their own sake. And I guess that’s a perfectly fine standpoint as well. It can be a way to collect and sort out thoughts and ideas and put them into a structure for use, now or later on. And if someone happens to read it and find it valuable, well, good for them. But will they continue to create and share content if no one ever uses it or finds value in it?

Some people believe that value from what you contribute will come out eventually, for someone, and you probably won’t even know it. So their strategy is to keep sharing what comes to mind (and perhaps what makes them look smarter) and then let the information be valuable, or not. I guess that would be sharing without caring.

I’ve also been told that it’s impossible to know if you bring any value if you don’t know what your followers want and find value in. And that is a bit tricky to say the least when you don’t know them at all, or even know who they are besides a screen name.

One method that I’ve found to be more used than others is a pragmatic approach of loosely collecting vibes from the channels on what kind of value you bring. Most of us probably do that, but some even have methods of sorts to create a perception or understanding of which social media channels to use because they bring more value (as well as gain more value, as it turns out) to their followers. Some people write it down to track changes and to see their “vibe trends” over time.

In the end, it seems to be hard to measure the value of what you share on social media, and hard even to form a perception of the value of your contributions to others. I think it’s safe to say that much of what many people share is valuable for ego boosting, though, may it be my ego or yours.

When I share things with the community I would like to think I care about what I share and what value the information brings. But to be frank, sometimes I share value and sometimes I share crap. But even more importantly, sometimes, quite often, I don’t share at all. Because I care what I share.

Quick Guide to Knowledge Management Tool Selection

“Tomatoes don’t go in fruit salad”

In this extended article, Barclay Rae provides an independent guide to Knowledge Management and in particular Knowledge Management tool selection.

Knowledge Management can be many things – from simple useable checklists to complex context-sensitive and case-based toolsets.

Some of the most effective knowledge solutions can be very basic, like lists of contact details, account numbers or simple spreadsheets.

The key to success is in getting people to use these sources and continuing to use them (and find them useful).

Good practical design is key to building tools that provide information and knowledge quickly, intuitively and appropriately – and that are regularly and continuously used.

For IT Support in particular this means:

  1. Getting the right level of information – accurate, up to date, relevant, useable
  2. To the right person – being aware of the support model and the levels of knowledge held at different support levels
  3. In a language and format that is appropriate for them – technical or not, plus summary or detailed, as required for the relevant support level and skillset
  4. Quickly and when and where they need it – Without need for long searches or trawling through long lists of options, delivered at the point of service or action as required.
  5. Context is everything – too technical or not technical enough, out of date, inappropriate, complex, slow – tools must be able to understand and deliver within a clear context, otherwise the knowledge is useless or even dangerous.

So, what is Knowledge Management?

This is the process or discipline that ensures that teams have relevant information to hand, to assist in having a clear understanding of a situation. Knowledge Management is the process that manages the capability to provide that information, based on accurate and relevant data. If the information is available at the right place and time, then those people accessing it can make more informed decisions and also speed up the support and resolution process – i.e. by reducing the need to escalate.

What Does Knowledge Management mean in ITSM?

Knowledge management is not just about getting information fast when trying to solve incidents, although this is a good practical starting point for many organisations. Data gathering, solution design, process design, knowledge transfer are all key elements – across all of IT and beyond. Knowledge should be able to be applied at all parts of the service ‘supply chain’ to ensure that this is built in a robust, complete and effective way. ‘Knowledge Management’ can be Data, Information, Knowledge or Wisdom (see list below) – all differing levels of content or applied and documented understanding that provides value in terms of improvements in service quality and efficiency.

  • DATA – Ten tomatoes
  • INFORMATION – He bought ten tomatoes
  • KNOWLEDGE – A tomato is a fruit
  • WISDOM – Tomatoes don’t go in fruit salad

Tools capture, store and make that information available, and relevant.  Getting the right information to the right person – at the right level, when they need it – is the goal.

The easiest elements to identify and apply ‘knowledge articles’ to are Incidents, Problems and Service Requests.  This should also be extended to Changes, Releases, CIs, Services, offerings, processes and workflows – all aspects of service delivery where information and knowledge are needed.

Key capabilities for tools are the ability to easily create, approve, review, update, store and make available knowledge articles – i.e. secure curation.  In addition, the integration of these knowledge functions with other areas of ITSM should be seamless.  Integration and alignment with other internal and external sources of knowledge is also useful, as is any formal approach or verification around approved techniques for KM – e.g. KCS (Knowledge Centred Support).

Like many aspects of ITSM technology and practice (and software in general), the value and success of this rests as much with the approach and focus around implementation, culture and governance as it does with functionality.  Vendors therefore need to possess understanding, skills and expertise in implementing these solutions and be geared up to pass on these skills to clients for successful implementation.

Knowledge Management Functionality

Knowledge Creation – systems should have the facility to easily create ‘knowledge articles’ (KAs).  These can be original records (i.e. specific work instructions or content), and/or packages of content including documents.

Linking – content can be intelligently and seamlessly linked to external sources – tech manuals, wikis, etc.

Knowledge Curation – there should be definable process workflows to control the lifecycle of KAs as follows:

  1. Creation of record – ad hoc or as part of a defined process (e.g. release, change)
  2. Approval of record – functional escalation to pre-defined approver or approver group
  3. Publishing/Release of record
  4. Presentation of record – use of KA as designed and required
  5. Review/update of record
  6. Removal / archiving of record
  7. Tracking and assessment of use of record
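
The seven-step lifecycle above is naturally modelled as a state machine. The sketch below is a minimal illustration, assuming hypothetical state and event names; real ITSM tools will use their own workflow engines and terminology.

```python
# Allowed transitions: state -> {event: next_state}. The names are
# illustrative mappings of the lifecycle steps, not a vendor schema.
TRANSITIONS = {
    "draft": {"submit": "awaiting_approval"},                     # step 1
    "awaiting_approval": {"approve": "published",                 # step 2
                          "reject": "draft"},
    "published": {"review": "under_review",                       # steps 3-4
                  "archive": "archived"},                         # step 6
    "under_review": {"update": "awaiting_approval",               # step 5
                     "confirm": "published"},
    "archived": {},
}

class KnowledgeArticle:
    def __init__(self, title):
        self.title = title
        self.state = "draft"
        self.uses = 0  # step 7: track and assess use of the record

    def apply(self, event):
        """Move to the next state, rejecting transitions the workflow
        does not define (e.g. publishing without approval)."""
        try:
            self.state = TRANSITIONS[self.state][event]
        except KeyError:
            raise ValueError(f"'{event}' not allowed from '{self.state}'")
        return self.state

ka = KnowledgeArticle("Reset VPN token")
ka.apply("submit")
ka.apply("approve")
print(ka.state)  # published
```

Encoding the workflow this way makes the curation controls explicit: an article cannot skip approval, and an archived article accepts no further events.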

Knowledge Sharing – promotion of process and information across systems and channels as required. Presentation of KAs:

  1. To multiple staff levels by login
  2. Presentation from searches (queries/predictive) on key classifications – type, impact, product, service, symptom, error message etc.
  3. Presentation of options based on case-based search criteria and probability
  4. Presentation as integral components of ITSM processes:
    • Incident Management – issue resolution, triage
    • Service Desk – work instructions, manuals, fault fixing
    • Problem Management – known error records
    • Change Management – procedures and guidance
    • Configuration Management – procedures and guidance
    • Services and service offerings – procedures and guidance
    • Request Fulfilment – procedures and guidance
    • Release and Deployment Management – procedures and guidance
    • Transition – Testing & Verification – testing notes and guidance
    • Service Introduction – support notes and guidance
  5. Vendors should show innovation through integration and interaction with new products and areas of technology – e.g. integration with knowledge lockers such as Evernote, OneNote, etc.
  6. Self-help access to users via self-service portals – providing user friendly versions of internal KAs
  7. Crowdsourcing – links to Incident and Problem Management processes for access to outstanding issues and inputs to create known error records
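
The case-based search in item 3 above can be sketched as a simple scoring exercise: rank articles by how many of the caller's classification criteria they match. The field names and records below are hypothetical, purely to illustrate the idea:

```python
# Illustrative KA records with classification fields (names are assumptions).
articles = [
    {"id": "KA001", "type": "incident", "service": "email", "symptom": "timeout"},
    {"id": "KA002", "type": "incident", "service": "vpn",   "symptom": "timeout"},
    {"id": "KA003", "type": "request",  "service": "email", "symptom": None},
]

def rank_articles(criteria, articles):
    """Score each article by how many search criteria it matches,
    and return the matching article ids, most probable first."""
    scored = []
    for ka in articles:
        score = sum(1 for key, value in criteria.items() if ka.get(key) == value)
        if score:
            scored.append((score, ka["id"]))
    return [ka_id for score, ka_id in sorted(scored, reverse=True)]

result = rank_articles({"type": "incident", "service": "email"}, articles)
# KA001 matches both criteria; KA002 and KA003 each match one
```

A real tool would weight fields (an exact error-message match is worth more than a matching type) and fold in usage statistics, but the principle – present options ordered by probability of relevance – is the same.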

Knowledge Development – ability to update and improve knowledge articles, and to assess the value of their usage as input to predicting new records or record types.

Intelligence – systems should show innovation by learning from existing records – types, content and usage – and prompting the creation of new KAs.

Vendor Approach

  1. Vendors should demonstrate a clear understanding of how to approach Knowledge integration within their (and with other) products
  2. Innovation in approach and delivery are a differentiator – e.g. beyond simple functional KA creation and management
  3. Project management and tool implementation should include guidance, training, workshops etc. on strategic and technical aspects of Knowledge Management
  4. KCS accreditation and proof of capability desirable

General Knowledge Management Requirements

  1. User-configurable forms, tables, workflows
  2. Should be able to create user-defined rules for creation (e.g. mandatory fields) and lifecycle management (e.g. who, how and when records are revised and updated).
  3. Lifecycle activity should trigger escalation processes (e.g. automated emails/texts to approvers, reminders etc.)
  4. Role-based security access – to allow control of access and level of information by login
  5. Ability to provide multiple levels and formats of information in KAs – i.e. bullet points for senior technical levels, scripted specific details for junior / non-technical staff.
  6. Vendors should provide expertise and guidance in the implementation of the tool and relevant processes and project requirements around Knowledge Management – e.g. with workshops and training as well as implementation consultancy.
  7. Open system for real-time integration with external ITSM tools and data sources.
  8. Vendors should have established, proven links with other ITSM tools and modules – Incident, Problem and Change Management, CMDB and Service Catalogue.
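
Requirements 4 and 5 above – role-based access by login and multiple content levels within one KA – can be sketched together. The roles, ranks and record fields here are assumptions for illustration, not features of any particular product:

```python
# Hypothetical KA holding two views of the same knowledge:
# a terse summary for experts, scripted steps for junior staff.
KA = {
    "id": "KA042",
    "summary": "Restart the print spooler service",
    "script": [
        "1. Open Services (services.msc).",
        "2. Find 'Print Spooler' and click Restart.",
        "3. Confirm the status shows 'Running'.",
    ],
    "min_role": "service_desk",  # requirement 4: minimum login role
}

# Illustrative role hierarchy, lowest privilege first.
ROLE_RANK = {"end_user": 0, "service_desk": 1, "senior_tech": 2}

def present(ka, role):
    """Return the view of a KA appropriate to the caller's role,
    or None if the role is below the article's access level."""
    if ROLE_RANK[role] < ROLE_RANK[ka["min_role"]]:
        return None                    # access denied by login role
    if role == "senior_tech":
        return ka["summary"]           # bullet-point view for experts
    return "\n".join(ka["script"])     # scripted detail for junior staff
```

In a self-service portal (requirement 6 of the presentation list), the same mechanism would serve an end-user-friendly rendering of the internal article rather than denying access outright.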

Barclay will soon begin a competitive review of Knowledge Management technology. If you offer technology in this area and would like to participate in our next review please contact us.

Reader Census – The results so far…

We began a reader census a couple of weeks back. See the original article here: ‘Reader Census – Your opinion counts’.

Thank you to everyone who has taken the time to complete the census. It will remain open, so please take a few minutes to share your opinion.

  • So far the average completion time is less than 90 seconds
  • It only has 6 simple questions on one page
  • As you can see we are acting on the data you provide us

Click here to complete the Reader Census.

I thought I would share some of the results so far.

General Outlook – Hungry for Knowledge

Readers like to read a balance of opinions from different people, from different geographies in different circumstances. You like the fact we’re not ITIL zealots and try to provide a balance of opinions. You like the fact we’re vendor neutral and not stuffing sales messages in your face.

Overall, I would say respondents in the census are hungry for knowledge – you want to know what others are doing, how they are doing it, the challenges they are facing and how they are addressing them. You want to keep an eye on the market and stay current with practices.

Keep it Real

In terms of future direction and content you would like to see – the message we have received is to keep it real. You want pragmatic examples and use cases.

“Show me examples and experiences of people that are doing it well in real life, rather than theorists who can tell us how it should be done but are yet to do it.”

There is also a desire to hear from organisations of different sizes:

“If possible it would be nice to have content which feel less america centric. I would been keen to get insights on how itsm is done in “second world” places, this may be a more accessable option for small shops.”

Content Types

In terms of content types or content vehicles, your preferences in order of priority are:

  1. Industry news
  2. Product Reviews
  3. How to / Instructional guides
  4. Opinion

All options are shown in the graphic below (‘Aggregate score’ is a weighted score based on strength of opinion).

“Q. Please rate each site feature according to importance:”

Thanks again to all those that participated – please keep sending us your feedback.

Click here to complete the Reader Census.