TechWhirl (TECHWR-L) is a resource for technical writing and technical communications professionals of all experience levels and in all industries to share their experiences and acquire information.
For two decades, technical communicators have turned to TechWhirl to ask and answer questions about the always-changing world of technical communications, such as tools, skills, career paths, methodologies, and emerging industries. The TechWhirl Archives and magazine, created for, by and about technical writers, offer a wealth of knowledge to everyone with an interest in any aspect of technical communications.
Subject: Re: Who Checks My Little Dog Checkers?
From: Robert Plamondon <robert -at- PLAMONDON -dot- COM>
Date: Thu, 9 Jan 1997 09:26:39 PST
The question on the floor is, "Should we add intentional errors in order
to trap inattentive reviewers?"
Maybe it's different with other people, but adding intentional errors
to my work would be like stocking lifeboats with barrels of salt water.
The environment I work in tends to have only a tenuous commitment on
the part of the client to review the work. My clients are all extremely
busy, their phones ring constantly, and they have far too much work to
take home every night. I do not need to invent traps to determine
whether they are reviewing the documents with total thoroughness. I know
that they aren't.
This has led to various embarrassing errors making it into print. Most
of the errors have been mine, but I can't produce an error-free document
without some kind of thorough review. Normally, this is supposed to come
from the client, but I'm keen to find alternatives that, if not higher
in quality, are more available to me.
I just now came up with an interesting idea with regard to reviews. In
a review involving more than one or two reviewers, I attach a review sheet
spelling out what I want each reviewer to do (some review only a page or
two), and giving the due date. There's a place for each reviewer to sign,
indicating that they did, indeed, review the document.
Many reviewers will sign and return an unreviewed document, or a barely
skimmed document. I'm toying with the idea of having them also indicate
how thoroughly they reviewed the document. For example:
I reviewed the document as follows:
Pages: __________ were compared item by item with data known to be correct.
Pages: __________ were reviewed carefully, with each statement considered for correctness.
Pages: __________ were read.
Pages: __________ were skimmed.
The instructions to the reviewers should request their action in the
same terms, so they know what's expected of them.
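The sign-off sheet above is simple enough to generate mechanically for each reviewer. Here is a minimal sketch in Python; the coverage levels and wording come from the sheet itself, while the function name, signature fields, and layout are illustrative assumptions, not an existing tool.

```python
# Coverage levels, strictest first, taken from the review sheet in the post.
COVERAGE_LEVELS = [
    "were compared item by item with data known to be correct.",
    "were reviewed carefully, with each statement considered for correctness.",
    "were read.",
    "were skimmed.",
]

def review_sheet(reviewer: str, due_date: str) -> str:
    """Build a plain-text review sign-off sheet for one reviewer.

    The reviewer fills in page ranges next to each coverage level and
    signs at the bottom; the layout here is a hypothetical example.
    """
    lines = [
        f"Reviewer: {reviewer}    Due: {due_date}",
        "",
        "I reviewed the document as follows:",
    ]
    for level in COVERAGE_LEVELS:
        lines.append(f"  Pages: __________ {level}")
    lines.append("")
    lines.append("Signature: ______________________")
    return "\n".join(lines)

print(review_sheet("J. Smith", "1997-01-20"))
```

One sheet per reviewer makes it easy to assign each person only the page or two they are responsible for, and the signature line records who agreed to what level of coverage.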
In planning the review, the goal should be to achieve an agreed-upon
level of coverage by agreed-upon reviewers. Managers tend to have a
horror of errors, so if you give them the chance to dial in the acceptable
error level through choosing the minimum review level, they tend to
volunteer a reasonable amount of resources. And, if not, they know what
they're in for.
Robert Plamondon, High-Tech Technical Writing, Inc.
36475 Norton Creek Road * Blodgett * Oregon * 97326
robert -at- plamondon -dot- com * (541) 453-5841 * Fax: (541) 453-4139