Subject: RE. Quality metrics for documentation?
From: "Hart, Geoff" <Geoff-H -at- MTL -dot- FERIC -dot- CA>
To: "Techwr-L (E-mail)" <TECHWR-L -at- lists -dot- raycomm -dot- com>
Date: Fri, 3 Mar 2000 13:32:27 -0500

Susan Peradze is interested in <<... procuring a tool (preferably software)
that can be used to measure the quality of
technical documentation. We need to produce "objective" data to demonstrate
to upper management that the documentation produced by the tech writers is
far superior to that produced by the application developers.>>

Sounds like you guys are worried about being "rightsized" out of a job, and
that suggests that if you make it through the current dangerous period,
you'll need to plan an ongoing process of keeping management aware of your
value.
One of my favorite, time-tested rules of thumb is that you'll eventually see
improvements in whatever you set out to measure via a metric; techwhirlers
and engineers, being smarter than the average bear, quickly learn how to
analyze a metric and take appropriate measures so that we/they seem to be
improving without actually improving much. "Pages per hour" is one of the
better examples: just write very verbose text and suddenly your productivity
goes way up. The users suffer, but at least you're being productive.
Speaking of which:

<<We are looking for a tool that is quick and easy to use and that will
present the type of data (e.g., spelling errors per page, run-on sentences
per page) that is understandable to non-tech writers. (We would prefer to
enlighten management on the best methods for measuring both productivity and
quality, but... Thus far, management has measured productivity
only and has favored the pages-per-hour measurement.)>>

It sounds like your management badly needs to be educated about the real
role of writers: it's not to churn out pages, but rather to churn out usable
information. Speaking as an editor, I hate coming across typos, but speaking
as a reader, typos are basically irrelevant; I can usually figure out what's
meant if the rest of the writing is good. If there are _any_ typos in your
documentation, you need to institute a formal policy of always using a
spellchecker before printing any document. You can catch run-on sentences
with most grammar checkers, so the same rule applies. But neither of these
is really important in the larger scheme of things: inaccurate or missing
information, information that can't be found, incomprehensible prose, and
significant inconsistencies are far more important, and you need an editor
(or a peer review) to find these, because no automatic tool yet exists that
can do the job.
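That said, if management insists on numbers like "spelling errors per page,"
they're easy enough to approximate yourself. Here's a minimal sketch of how
such a per-page metric might be computed; the tiny word list, the
250-words-per-page figure, and the 30-word "run-on" threshold are all
assumptions for illustration (a real check would use a full dictionary or a
spellchecker's output):

```python
import re

# Illustrative assumptions -- not real thresholds:
KNOWN_WORDS = {"the", "user", "clicks", "button", "to", "save", "a", "file"}
WORDS_PER_PAGE = 250


def errors_per_page(text: str) -> float:
    """Count words not in the dictionary, normalized per page."""
    words = re.findall(r"[a-z']+", text.lower())
    misspelled = [w for w in words if w not in KNOWN_WORDS]
    pages = max(len(words) / WORDS_PER_PAGE, 1.0)  # short texts count as one page
    return len(misspelled) / pages


def long_sentences(text: str, limit: int = 30) -> list:
    """Flag sentences over `limit` words as possible run-ons."""
    sentences = re.split(r"[.!?]+", text)
    return [s.strip() for s in sentences if len(s.split()) > limit]
```

The catch, of course, is exactly the one above: these numbers are easy to
game and say nothing about whether the information is accurate or findable.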

The only really useful quality metric is one that reflects how successful
your audience is at finding and using information. Sadly, that's hard to do
objectively, though there's a huge body of literature on usability testing
that will help. If you want to persuade your managers that you techwhirlers
do a better job than the developers, you're going to have to spend the time
and effort to field-test the two types of documentation (yours and
developer-generated docs). Those results should speak for themselves, though
I bet you'll be surprised by some of the things your audience has to say
about both kinds of documentation. Three important metrics you can use as a
starting point: time to find the portion of the docs they're looking for,
time to complete a task using the docs once they've found the relevant
information, and number of errors made en route to completing the task.
Elaborate on these as necessary, but they're probably the most important
ones from a quality standpoint. They're also easy to quantify and track,
which lets you begin improving your quality.
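Tracking those three metrics doesn't require anything fancy. Here's a sketch
of how you might tabulate field-test results for the two documentation sets;
the data structure and the numbers are invented for illustration:

```python
from statistics import mean

# Hypothetical usability-test results: one dict per participant, recording
# seconds to find the relevant section, seconds to complete the task once
# found, and errors made en route. All values below are made up.
results = {
    "tech_writer_docs": [
        {"find_s": 45, "task_s": 120, "errors": 0},
        {"find_s": 60, "task_s": 150, "errors": 1},
    ],
    "developer_docs": [
        {"find_s": 110, "task_s": 240, "errors": 3},
        {"find_s": 95, "task_s": 210, "errors": 2},
    ],
}


def summarize(sessions):
    """Average each metric across all test participants."""
    return {k: mean(s[k] for s in sessions) for k in ("find_s", "task_s", "errors")}


for doc_set, sessions in results.items():
    print(doc_set, summarize(sessions))
```

Averages like these, tracked release over release, give management the
"objective" comparison it's asking for, grounded in what readers actually do.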

--Geoff Hart, FERIC, Pointe-Claire, Quebec
geoff-h -at- mtl -dot- feric -dot- ca

Hofstadter's Law: The time and effort required to complete a project are
always more than you expect, even when you take into account Hofstadter's
Law.