
Subject: Surveys/Audience analysis - Summary2
From: Damien Braniff <Damien_Braniff -at- PAC -dot- CO -dot- UK>
Date: Fri, 20 Mar 1998 14:27:23 +0000

That is the way that I used to see my audience -- fairly homogeneous. Then
I performed an audience analysis (for a class I am taking) for a
scanner/software package, and I discovered that, for a product like that in
the commercial market, it is nearly impossible to nail down a single user
type. I had a housewife with practically no computer experience who wanted
to scan recipes into a recipe application, a business owner with expert
computer knowledge who wanted to use it with a document management system,
and a graphic artist with little computer knowledge but expert
graphics-application knowledge who was going to use it for scanning.

I was at a loss; I did not know at what level to write the manual. I
eventually settled on a Getting Started section that covered installation
and basic tasks (scanning text, scanning graphics, cropping graphics,
saving graphics, etc.) for the neophytes, and a more comprehensive
reference section for the more advanced users. How effective that was, I
don't know. I guess I'll see.

A couple of thoughts for you to ponder.
The company I work for places a lot of emphasis on listening to the "Voice
of the Customer." In fact, they made the effort last winter to train a
wide variety of marketing, development, and documentation people in
focus-group interviewing techniques, then sent us in groups of 2 or 3 to
customer sites throughout the country to talk to "real users" of the
products we sell. It was a very interesting experience.

It probably isn't possible for you to develop and implement that complete a
program yourself right off, but you might be able to arrange with one of
your installers to do a "ride along" where you just observe what the
installer does, listen with your own ears for the kinds of questions the
customers ask, and perhaps ask the customers a few of your own questions.
We found that customers are flattered to be asked their opinions and more
than willing to talk.

Many years ago, when the idea of "user friendly" manuals first developed,
those of us who thought writing in the second person would be a good idea
ran into a lot of opposition from technical writers who had spent the last
15 or 20 years writing in the third person. They felt the less formal
style was too much like the first reading textbooks children have in
school (See Dick run. Run, run, run.) and that audiences would be offended
and feel talked down to. Of course, they were exaggerating the writing
style, but nevertheless we felt they deserved an answer.

A search of the literature then available turned up a study that had been
conducted by IBM (a competitor to the company I was working for at the
time) that indicated that highly educated people (those who read at the
15th grade level or higher) are not put off by text written at the 8th
grade level--they simply read it faster or skim over information they
already know. Unfortunately, I don't have a copy of that study any more
(almost everyone now accepts a second person writing style) but my point in
bringing it up to you is to assure you that targeting your writing for the
"lowest common denominator" is a valid strategy that won't harm you with
more advanced users.
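As an aside, "grade level" in studies like that one usually refers to a
readability formula. A minimal sketch of the Flesch-Kincaid grade-level
metric (the syllable heuristic here is my own rough approximation, not
anything taken from the IBM study) looks like this:

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable count: runs of vowels, minus a trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade = 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)
```

Short sentences of short words ("See Dick run.") score far below 8th grade,
while a single long sentence of polysyllabic jargon scores well above it --
which is roughly what the study's claim about skimming relies on.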

Close, but no cigar. The analysis is intended to define the characteristics
of the group that you're writing for. If it's a heterogeneous group--the
most common case--you may be lucky and discover that it's made up of a
small number of relatively homogeneous groups. So the trick is not to
define a single group that you're writing for, but rather to see how many
of the component groups you can write for reasonably well. An example will
make this clearer.

Let's assume you've identified three main groups: propeller-heads
(experts), competent users, and neophytes. Simplistically, the experts
might want nothing more than a reference manual, the competents might want
a task-based manual, and the neophytes might want a tutorial. So your
solution would be to publish three manuals. Alternatively, you might be
able to combine the expert and competent information in a single manual, as
follows:
1. This is step 1, with no details (for experts, who already know how to do
this step). [Insert a paragraph here providing details for competents]
2. This is step 2, with no details (for experts, who already know how to do
this step). [Insert a paragraph here providing details for competents]

And so on. The trick is to identify the needs of each component of your
audience and see what approach most effectively addresses those needs. In
your specific case, you've got experts you've named "installers", and you
can produce documentation specific to their needs: this will be fairly
advanced material, with little hand-holding, because your installers don't
need it. In addition, you've got a wide range of end users (after the
installation is complete), so you'll probably have to do something like I
proposed in my example: provide high-level information in numbered steps,
with details (for those who need them) clearly distinguished from the steps
so that experts can skip past them easily. That's a start, anyway. You'll
have to tweak it from there.


I am currently doing research to produce an in-house document addressing
precisely this issue. I have come across a few helpful books ("User and
Task Analysis for Interface Design" by JoAnn T. Hackos and Janice Redish;
"Standards for Online Documentation" by JoAnn T. Hackos and Dawn M.
Stevens; and "Dynamics in Document Design" by Karen A. Schriver).
As you develop a client/user analysis process, you may notice a strong
resemblance to the traditional software development process. There are a
lot of similarities in the approaches to both. Keep that in mind. There is
a lot of information on the web if you look hard enough. A couple of
Keywords/phrases to use are: "Questioning Techniques"; "User(Client)
Requirements Analysis". There are more, but they elude me at this moment.
Develop the "product" with the user's involvement. This needs to be an
ongoing, iterative process. If you are revising documentation, get as much
input from the current "product" users as possible. Their input will be an
invaluable asset to you in assessing the strengths and weaknesses of the
existing documentation. This is a very good starting point. As for gauging
documentation effectiveness, you're correct in that it is very difficult to
assess. You can use an iterative form of "prototyping", with the end users
(of course), to judge if you're on the right track to developing
more effective documentation.
