Saturday, August 2, 2014

Usability - Talking the talk and...

Two years ago at the Fluxible conference in Kitchener, I attended a talk by James Wu, lead tablet designer at Kobo, called "Rethinking the Tablet UX". I was really taken with the talk. I even wrote a blog post about it, here. It made me think that usability at Kobo was pretty darned advanced.

Yesterday I bought my first ereader, a Kobo Aura HD. In many ways it's a great device, and I'm not blaming James Wu, but OH MY GOD THE USABILITY SUCKS. It boggles my mind. The device is not unusable, and I'm sure I'll get used to it, but there are so many little things that are egregiously bad practice. The incompetence is breathtaking. Take, for instance:

You can't read your Kobo while it is recharging over USB. Before unplugging the cord between your computer and Kobo, you have to click an Eject icon in the desktop app. This is a terrible constraint from a usability standpoint. Despite being reminded approximately eight million times, I still forgot to click it.

During setup I repeatedly landed on pages with only one button: Eject, even though my device was still charging and still being configured.

On the device, there's a warning saying "Please eject your eReader before unplugging your USB cable", but it doesn't say how to eject it. You can't do it from the device. You have to open the desktop app and click Eject.

The repeated warning about ejecting made me think I'd damage my ereader if I pulled the plug without ejecting, but eventually I found, on a web page, that the reason for ejecting is that otherwise you might lose data. But there is no help whatsoever on what data you'll lose. Your last download? All downloads in the session? Does it matter if all you did was recharge, or if you use WiFi? Is there an auto-save? Can I recover by resyncing? For the love of god, give us a hand, Kobo!

Only some of the icons in the Kobo desktop app have tooltips. For others, you just have to hope you won't do something unintended... then click and try to figure out what it did.

The Kobo desktop app has no help button or link to help.

The official user guide (which I found through a Google search) is in PDF format only. The table of contents entries aren't hyperlinks. The pages have no page numbers. The headings aren't tagged as headings, so you can't open a bookmarks pane on the left. There is no index. When you copy text out of the guide, spaces between words become either tabs or carriage returns. (I found the user guide so unnavigable that I tried to create my own subset of the topics I needed, but the formatting issues prevented me.) The content is very sparse and mostly describes the UI.

The getting started guide is printed in eight languages simultaneously, and contains exactly 54 words in English. It's useless.

To get started, I downloaded a few free ebooks from the Kobo site, but when I tried to open them I got an error that they were Adobe Digital Editions and required special software. The user guide mentions Adobe Digital Editions and provides a link to Adobe but the link is broken and there is no information about how to set it up on my Kobo. Eventually I found an article on the Kobo site that explained what to do. I'm still looking for information on how to install Acrobat Reader on my Kobo. Why is installing this software not [an optional] part of configuring the device???

When you buy an iPad (I have heard), they start setup in the store and the whole thing is ready to go by the time you get home. When you buy a Kobo, it comes with a dead battery, completely unconfigured, with a difficult and poorly documented setup process. I know there's a limit to the amount that can be done without a dedicated retail outlet (although I bought mine at Chapters), but Kobo is too far at the other end of the spectrum.

I could go on and on. Really. This product has an awful out-of-box experience. The documentation is a nightmare. When I look up Kobo on LinkedIn, I see a ton of people with UX in their titles, and lots of descriptions of user research. What in god's name are they doing?

Saturday, July 12, 2014

Case Study: Disruption

Back in the 1980s I worked for a computer timesharing company. We had a large mainframe computer with a proprietary operating system and a corporate internet with nodes all over the world. Our customers used dumb terminals (a monitor with a connection to a mainframe) to do their computing work, and they paid by the minute. We had the world's largest (or second largest) corporate internet, and a large customer base of companies such as insurance companies who needed more processing power than was available elsewhere.

My division, Data Services, provided databases that users could access, and they paid based on usage (so much per data point). Data Services provided statistical software that ran on our mainframe and could be used for the timesharing fee. We provided enormous economic and financial databases, but really shone in the areas of energy and aviation.

My colleagues were extremely bright and forward-thinking and we were in a constant state of innovation, but in many ways we were always a few years behind the rest of the world. When I started there in the mid-80s PCs were already widely used in business, but my terminal had no monitor, just a wide paper feed: all input and output was recorded on paper. After a while we upgraded to dumb terminals and then to PCs. Our customers used acoustic couplers to connect (a kind of modem where you put the handset of your phone into connectors), and I worked on helping them move to faster modems. I attended endless meetings where we grappled with alternative ways to deliver data to customers, such as floppies, CDs, and quarterly downloads. We fretted over ways to modernize our pricing, and one of my jobs was to create pricing models to estimate the effect on revenue from alternatives such as subscriptions, or unlimited and report-based pricing. It was a bitch teaching our workforce the basics of PCs, and transitioning our developers from APL to C++.

As data storage and processing power exploded, the basic business of the company, computer timesharing, became anachronistic. Eventually the company was purchased, cherry-picked, and mostly shut down. But in a lot of ways the company was ahead of its time. We had software that changed traffic lights for emergency vehicles decades before others produced anything as sophisticated. My division, Data Services, was a leader in value-added data and statistical analysis software. We were achieving good profits and growth right to the end.

When I was eventually laid off I was traumatized. I had been so invested in my work that it was like slamming into a brick wall at 70 mph. That was 25 years ago but in some ways I haven’t recovered. I worked at RIM when it became clear it was failing, back in 2012, and couldn't stand the idea of suffering through another slow death: I jumped ship as quickly as I could, but for a year I maintained an almost obsessive watch on the company, checking the stock price and reading the analysts and pundits daily.

My experience with a failing company led me to pick up Clayton Christensen's The Innovator's Dilemma some years ago. It seemed that we fit his thesis (PCs disrupted the timesharing business). It was (and is) important to me to understand why bright people who realized their predicament, had the will, and had the resources to change still couldn't make it. Christensen's contention that you can't make substantive change from within, but need to start a new subsidiary, is pretty compelling.

Despite its strengths, the book didn't pass the smell test for me. It is too simplistic, too dogmatic, too anecdotal, and inappropriately universal. Christensen seemed most intent on proving that we have to throw out old values and instincts, that we have to adopt a new way of doing business that is even more ruthless and cut-throat than before. It seemed more like propaganda than theory: another phase in the right-wing attack on civilization, a new libertarianism for the private sector. The smartypants premise is that the best and the brightest must admit defeat to the small and the weak. The innovator’s dilemma is that “doing the right thing is the wrong thing.”

I don’t have a copy of the book and I don’t want to critique it, but I am energized by Jill Lepore's fantastic piece in the New Yorker (link). This is the first in a planned series of articles about what Lepore calls the disruption machine.

Friday, July 4, 2014

On indexes

I love indexes. I love using a good index and I love creating a good index. All this dates back to my childhood and cookbooks. The Joy of Cooking, in those days, had an index that was truly a joy. The person who created it (perhaps Mrs. Rombauer herself) knew cooks well enough to know what they would be looking for, and gave us a long, redundant, gloriously usable index.

Mastering the Art of French Cooking, on the other hand, had an index created by a moron. Although Mastering the Art was published in two volumes, Volume II had a combined, color-coded index that always threw me. Worse, there was a confusing use of French and English terms. Look up mustard and you found some entries; look up moutarde and you found others. I bet that Julia Child passed off the indexing responsibility to a flunky.

Every couple of years Mrs. Rombauer published a new edition of The Joy, revising the recipe selection, the recipes, and the index. It was a true process of continuous excellence. Mastering the Art, on the other hand, has not been revised, except to update for new equipment (such as food processors) and changes to ingredients.

It is not uncommon for doc managers to task junior writers with indexing the works of senior writers. That's a mistake. It's not just that you need to understand content to do a good job indexing it; it's also that indexing is a good exercise for a writer to do as part of their content creation. It's a second-level check of your content organization. It's also a way to address reader issues you might be aware of. For example, if you're documenting MS SQL Server and you know that some readers will be more familiar with Oracle, you can index some Oracle terms, such as System ID for Database Name.

There is a growing movement in the tech writing biz holding that indexes are unnecessary. I see the point for online help; numerous studies (including my own) have shown that readers prefer to use Search (and in fact, they often prefer to search in Google rather than within the doc). But if you publish PDFs that your readers will print, then you need an index. And if you're going to have an index, you should make a good one.

At a conference years ago I attended a lecture on indexes given by an academic. His theory was that indexes should train the reader to use correct (his idea of correct) terminology. The example he gave was of a home first aid guide he had once worked on. He declared proudly that he had changed the index entry for "collar bone" to "collar bone: see clavicle". I don't know what he said after that because I walked out.

Sunday, March 23, 2014

DITA in times of contraction

When Pubs managers decide to move their doc content to DITA, all they see is the savings. It's ROI, ROI, ROI. "You have to spend to save," they argue, and they often start spending hundreds of thousands of dollars on software purchases, training, and non-writing personnel. All that's fine when a company is growing and has loads of cash, but what risks are Pubs managers exposing themselves to if the company hits bad times?

As I have argued before, in many cases DITA doesn't so much save money as redistribute it. Where before you spent the lion's share of your doc budget on salaries for writers, now you're spending the most money on tools developers, information architects, editors, and software.

I'll give you an example: I once worked in a DITA shop where a team of 11 writers was overseen by a manager, three team leads, three editors, and two information architects; and it was supported by nearly a dozen tools developers. There were almost twice as many non-writers working on the documentation as writers (and yet writers had to fill out complicated forms for the editors, as well as project-manage the localization of their docs). The CMS was enormously expensive, and then the CMS vendor end-of-lifed our database so we had to spend a pant-load on a new one, including two years of research, planning, tools redevelopment, migrating, and tweaking the migration.

In a DITA shop, teams become complexly interdependent. Much effort is expended on assimilating writers so that they give up their holistic approach to writing, and accept their role in a DITA production line that starts with information developers; relegates writers to the role of filling in content in designated, pre-typed topics; and ends with editors. As it was explained to me, the writer must learn to pass the baton. DITA proponents argue that writers who can't assimilate should be fired.

The CMS and publishing tools are enormously complex so that nothing can be published without the help of a large team of tools developers. In addition, the complex new processes and corresponding bureaucracy require training (and hence trainers) before new writers can become productive.

Now imagine that the company has a profit dip and needs to cut costs. Who and what is expendable?

Before, you had a team of writers, and if the company got into trouble you could lay some off with minimal impact. But now, if you have to contract your Pubs department, you're in a pickle. The information typing process relies on so many non-writers that it seems inevitable that when companies are in decline, a DITA shop is going to have to give up more writers than a non-DITA shop.

That fragile CMS doesn't run itself, and keeping it going requires expert skills: you're going to have to keep most of your tools developers unless you want to give up publishing documentation altogether. It's probably not possible to give up the expensive maintenance plan for the CMS, either.

Your complex processes are going to continue to require the trainers, team leads, information architects, and editors.

In short, you're left with an expensive behemoth that can't be easily dismantled... unless you decide to ditch DITA altogether and migrate to a simpler solution.

The risk of DITA is worth taking when there is real justification for adopting it: when there is a real need for reuse, when translation savings can't be garnered by a simpler alternative like DocBook XML or MadCap Flare, and when you absolutely need to enforce strict information typing on writers. The problem is that nearly all outfits adopting DITA have no such justification. They're wasting money on DITA, and that could get them into trouble when the cash stops flowing.

Helping developers write better API references

Several years ago I gave a presentation to developers at my company about how to write better API references. The goal was to help developers who produced public APIs to write code comments that would be more useful to the mobile app devs who used the APIs.

I started with a quote from senior management about the importance of the API references to the success of our products. At this point and throughout the presentation, I wanted it to be clear that I wasn't just some tech writer spouting off about how to do things: this initiative was important to development management. I constantly threw in quotes from development managers (using their names) to support what I was saying. (I couldn't go so far as to say there was an integrated initiative because there wasn't.)

Next, I sketched the current Pubs team initiatives to improve documentation for the mobile app devs: new web sites for doc delivery that made content easier to find and use; new feedback mechanisms; research into what the app devs wanted; and developer outreach such as the presentation I was giving them.

On the subject of research, I described some recent work I had done. I had gone to a developer conference and conducted a focus group, as well as six round-table discussions; collected questionnaires; and done dozens of usability tests with developers. I had attended a hackathon where I interacted with participants, learning their frustrations and successes. I had also attended a Developer's Day where I talked to prospective app devs about their backgrounds and their interest in using our APIs.

My main take-away from that research, I explained, was that mobile app devs want development to be easy. One of my bullet points was, "They are looking for information that is complete, simple, easy to understand, quick to use, and easy to find." This may seem obvious, but most developers seem to assume that app devs don't want to be told too much - that they can figure it out. In my experience, all too many technical writers in development documentation make the same false assumption. What app devs say is: Spell everything out! Tell me how to do it! As I explained in my presentation, three-quarters of the app devs I talked to were working on multiple platforms, not just ours.

Next I shared some personas of mobile app devs. These were not personas that I had made up, but that I had got from senior development managers. (As I have argued before, I firmly believe that personas should be prescriptive rather than descriptive.)

Finally I got to the point where I could define what a public API reference is, and show some examples from our company. I criticized our current API refs: it was too difficult to create mobile apps with our APIs; app devs were complaining; we had acquired a reputation of not being developer-friendly; the API references were a major source of info for technical writers, so the rest of the documentation was suffering; we were getting too many support calls.

I showed an example of a really bad page from one of our API references and explained what was wrong with it. Then I showed them an example of a really good page and talked about why it was so useful to readers. That led into an interactive session of about 15 minutes where we looked at API references and discussed how to improve them. Although there were about 100 developers in the room, the discussion was lively and very positive.

The rest of the talk was guidelines to improve our processes going forward. I defined responsibilities:
  • API developers: Document all the elements of the API as comments in source code.
  • Tech writers: Review the API reference to help with wording, fix typos and grammar, and note missing content.
  • Both: Tech writers collaborate with developers to add content and examples.
I told them that when they started to write code comments for public APIs, they should ask themselves: What is it? When would this be useful or necessary? Why would app devs want to use it? How do they use it? I said the API reference should explain every class, method, and parameter; side effects (what this code will affect, and vice versa); and assumptions. Always think of the customer: imagine someone who is busy, doesn't want to spend a lot of time figuring things out, hasn't immersed themselves in our environment, and just wants a quick answer.
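Those four questions map naturally onto the structure of a doc comment. Here's a rough sketch of what that looks like in practice; the function, its names, and its behavior are hypothetical, invented for illustration, and are not from the talk:

```python
def delay_for_attempt(attempt: int) -> int:
    """Return the delay, in milliseconds, to wait before a retry attempt.

    What it is: a helper that computes an exponential backoff delay,
    doubling with each attempt and capped at 30 seconds.

    When it's useful: call it between retries of a failed network
    request so repeated failures don't hammer the server.

    How to use it: pass the zero-based retry count, then sleep for the
    returned number of milliseconds before retrying.

    Side effects: none; this is a pure function.

    Raises:
        ValueError: if attempt is negative.
    """
    if attempt < 0:
        raise ValueError("attempt must be >= 0")
    # 100 ms, 200 ms, 400 ms, ... capped at 30,000 ms
    return min(100 * 2 ** attempt, 30_000)
```

The point isn't this particular docstring convention; it's that each of the four questions gets an explicit answer, so a busy app dev never has to guess.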

After that I described some of the reasons our company had ended up with poor API references, providing a solution for each issue. For example, one solution was, "Ensure that the API reference is a deliverable. Make the API reference a product requirement. Add the API reference to the Definition of Done."

In the question period at the end, many of the developers provided even more ideas for how to make things better going forward. I ended with a quote: “Poorly documented code? Chances are the problem lies not in your programmers, but in your process.”

Wednesday, January 1, 2014

Give them what they want - or what they need?

I've been catching up with discussions in LinkedIn groups today, and I keep running into this quote by Andrew Bredenkamp:

Content Strategy should be about who is my target audience and what content do I need to give them to win them (and keep them) as happy customers.

It's a good quote (no dissent here), but it makes me think that there is more to meeting customer needs than making them happy. Documentation, like usability in general, is a background function: the foreground function is that the user can use the product.

So you might produce beautiful documentation that garners raves from readers but doesn't actually do what it's supposed to do. For example, readers might not realize that you've omitted key information; or they could have learned what they needed in many fewer words; or the people who find the documentation think it's brilliant, but most people can't find it at all. Of course there's still value in wowing readers, but that is secondary to the main purpose.

Don't get me wrong: writers need to be close to the business. They need to consciously meet business needs. But business needs might be a bit more subtle than "happy customers". And metrics that focus on the easy targets might miss the mark.