Wednesday, January 1, 2014

Give them what they want - or what they need?

I've been catching up with discussions in LinkedIn groups today, and I keep running into this quote from Andrew Bredenkamp:

Content Strategy should be about who is my target audience and what content do I need to give them to win them (and keep them) as happy customers.

It's a good quote - no argument here - but it makes me think that there is more to meeting customer needs than making them happy. Documentation, like usability in general, is a background function: the foreground function is that the user can use the product.

So you might produce beautiful documentation that garners raves from readers but doesn't actually do what it's supposed to do. For example, readers might not realize that you've omitted key information, or that they could have learned what they needed in far fewer words; or the people who find the documentation think it's brilliant, but most people can't find it at all. Of course there's still value in wowing readers, but that is secondary to the main purpose.

Don't get me wrong: writers need to be close to the business. They need to consciously meet business needs. But business needs might be a bit more subtle than "happy customers". And metrics that focus on the easy targets might miss the mark.

Friday, September 14, 2012

The messy side of metrics

As I have said before, I'm a big fan of diving into the weeds and figuring out data. Sometimes this is a lot of hard work with very little to show for it. Every once in a while it has spectacular results.

I once worked at a company that had an unsearchable knowledge base. I won't go into the technical details, but it was difficult to collate the data in it. Consequently there had been very little reporting of customer problems.

I decided to investigate the knowledge base. I was able to output a massive text file of all support calls logged over the previous 18 months. I went through it, cutting and pasting issues into other documents, which was about the crudest way imaginable to collate electronic data. It seemed to take forever, but in reality it took about 24 hours of heads-down concentration.

I discovered a number of things, but the most dramatic was that a single error message was responsible for a high percentage of support calls. It was relatively easy to write documentation to help users who got the error so that they didn't need to call customer support. (I had to write a spec for a compiler to attach the doc to the error message, but that is another story.) Afterwards the number of customer support calls fell dramatically.
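
For what it's worth, the tallying part is easy to script these days. Here's a minimal sketch in Python, with an invented file name (calls.txt) and an invented ERR-#### error-code pattern standing in for whatever the real export looked like:

    # Count which error codes show up most often in a plain-text export of support calls.
    # The file name and the ERR-#### pattern are placeholders, not the real format.
    import re
    from collections import Counter

    with open("calls.txt", encoding="utf-8") as f:
        text = f.read()

    codes = re.findall(r"ERR-\d+", text)    # anything that looks like an error code
    counts = Counter(codes)

    for code, n in counts.most_common(10):  # the ten most frequent codes
        print(f"{code}: {n} calls")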

It's sort of embarrassing to admit to the lack of technical savvy in my approach, but I think that's what makes the story worth telling. There isn't always a nifty technical solution for data analysis. Saying "It can't be done" often just means, "There's no easy way to do it." Also, when you handle data manually you notice things that wouldn't be apparent in an automatically generated report.

Thursday, September 13, 2012

Case Study: Metrics (feedback button data)

I once conducted a project to analyse the user feedback that my then-employer received from the feedback button on help pages. We had all been using the feedback to determine which topics needed attention and to understand doc usage. But I quickly saw that there were some pretty serious problems with the data.

Cautionary Tales: Metrics (performance measurement)

(I'm a big fan of IT people. And nobody likes measuring things more than I do. But interpretation is everything. Or rather, a little knowledge is a dangerous thing.)

I once worked in a company where an IT team of three people provided technical support to about 1,000 employees. One of the IT guys was great and the other two were really bad. By bad, I mean that when an employee asked them to fix a problem on their PC, these two guys typically couldn't fix it and often created new problems in the process.

Then one day the good IT guy got fired. We learned later that the company had instituted metrics to measure performance of the IT department, and had determined that this fellow was too slow: metrics showed that it took him two or three times longer, on average, to close a case.

What everyone in the company knew (except, apparently, IT management) was that the good IT guy took all the complex and difficult problems, while the other two dealt with the sort of mundane issues they could handle. Difficult problems take longer.
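
To put invented numbers on it: say the good guy closed 40 hard cases at an average of six hours each, while the other two closed 120 easy cases at an average of two hours each. A report on average time-to-close makes him look three times slower, even though he was handling the work the other two couldn't touch.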

I might have just passed this off as incredible stupidity, but the next company I went to also had three people in the IT department. They were extremely overworked (causing delays that affected productivity all over the company), but one day one of them got laid off. We learned later that the company had instituted metrics to measure performance of the IT department, and the metrics showed that there wasn't enough work for three. The metrics system only covered issues that were logged through a web site, but employees mostly just called IT without logging an issue.