Subject: Re: John Galt: Lover of Standards
From: "Eric J. Ray" <ejray -at- raycomm -dot- com>
To: TECHWR-L <techwr-l -at- lists -dot- raycomm -dot- com>
Date: Mon, 25 Oct 1999 07:28:24 -0600

Who is John Galt? (Sorry, couldn't resist.)

Andrew Plato wrote:
> There is value in the standard because a profound majority accept
> it as truth - not because it is objectively good.

Agreed. Keep in mind, however, that the "profound majority" also,
in most cases, includes your readers, so arbitrarily picking a
different standard messes with the assumptions that your readers
make, and can lead to cognitive dissonance or actual usability
problems. For example, there's a good case to be made, from the
usability standpoint, for putting punctuation in computer documentation
outside of parentheses (so there's no question about whether the . or , should
be typed). That said, if you're writing how-to documentation for
English teachers, you should go with the standard that they accept
as _truth_ rather than hindering the acceptance of the documentation
through what will likely be seen as an egregious error.

> When a scientist engages in a new project, he/she does not sit down and graph
> out all the "rules." The first thing he/she does is develop a hypothesis - the
> academic version of the mission statement. Then as the project progresses, the
> scientists leverages his work off similar work, learning, adapting, and
> evolving until a solution is found. Often, the solution found is not what was
> expected - this leads to more research and the cycle starts again. 99999999
> times out of 1000000000, there are no methodologies. These are built slowly
> over the course of the project.

er...but the starting point for the cycle (the hypothesis) itself
requires certain assumptions or ground rules. Where do we get the
ground rules that allow us to develop a hypothesis? From
* standards
* academic research
* experience
* other people's opinions
* guesses

The scientific method _requires_ that you start with assumptions (or givens,
or whatever you want to call them). They're just not always explicitly called
out. For example, when scientists are preparing for a space shuttle launch,
they start with "assuming Pythagoras was right and the square of the
hypotenuse really is equal to the sum of the squares of the other sides,
and assuming that Copernicus was right about how the solar system works,
and assuming that Einstein was right about e=mc^2", then we can do this.

Similarly, there _are_ methodologies, and they focus primarily on testing the
hypothesis _while_holding_other_factors_constant_. The implementation
varies, but that's always the goal.

I'll grant (and this is probably your beef with standards in general)
that many people take standards as being definitive, as opposed to being
merely easily referenceable codifications of working assumptions, but
it's clear that the _intent_ of standards is to be the latter. How do I
know that? If standards were definitive, forever-and-always, till-death-do-
us-part, they wouldn't use version numbers and the Chicago Manual of Style
wouldn't be in its 14th (15th?) edition. It'd still be number 1.

> Where I think many writers go astray on the whole standards issue is that they
> try to define the rules before they ever play the game. They try to set up
> perfect little universes where they can work in peace and quiet, rather than
> fight informational chaos. They place their faith in structure before knowing
> what it is that structure is defining. Since this structure was documented in

I'm with you here, mostly. You're exactly right that standards fight
informational chaos...but that's often good. If I'm on a deadline (and
stalling by arguing on a discussion list), I can use that to my advantage.
Say I choose to follow the Sun style guide...I've just saved myself hours
of making picky and trivial decisions and deliberately let that little book
fight informational chaos. If some of the information in the book doesn't
work--that is, if it doesn't serve my readers' needs--I'll ignore it,
consider it an exception, discard that part of the book as an invalid
assumption, or otherwise do what I think will work better. That said, if I
do that arbitrarily too often, I've reintroduced informational chaos.
(Interestingly, the more experienced and theoretically more capable I get,
the more I want to defer to a style guide.)

You, Andrew, do the same thing, but your perfect universe is within your head,
following whatever working assumptions you use. Your universe calls for
using either one space or two (or three) after a period, but I'm willing to
bet that your universe is consistent in using one _or_ two _or_ three, and
doesn't mix them up.

> some book, and it worked somewhere else - it goes to follow that it will work
> for you, right? Wrong.

But, assuming that you have some reason to have chosen "some book", that's a
good place to start. "Best practices" as defined by Sun's style guide or
Microsoft's style guide or Eric's style guide are a fine _working_
hypothesis. As soon as you have some reason to think that they don't work,
then you _change_ your working hypothesis.

> The technological standards for HTML, OOP, SQL, etc. did not arise one day out
> of a learned person's mind. They were habits that melted together until
> someone wrote them down. In other words - HTML existed and worked long before
> the standard was developed. HTML is actually an outgrowth of research projects
> and nerdy tinkering. The standard was not established until years after it was
> invented and already in use.

First, the "habits", if you will, weren't arbitrary--they were what someone
knew to work in a specific case. "Habits", in this case, are pretty severely
self-selecting...the "habit" of writing {make this big and bold, please}
rather than writing H1 dies out quickly, as it's not reinforced.
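(A quick, hypothetical illustration of what I mean--the curly-brace "tag"
below is made up, of course, just a stand-in for any markup habit no browser
ever honored:)

```html
<!-- A pre-standard "habit" no browser supported, so it died out
     unreinforced: -->
<!-- {make this big and bold, please} -->

<!-- The habit that survived, because browsers actually rendered it
     big and bold: -->
<h1>Installation Instructions</h1>
```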

Note that the HTML standard was based on assumptions about what people/Web
browsers/etc. would do, and as the assumptions were found to be wrong, the
standard was changed. Pending the standard changing, people often labeled
their Web pages as (in effect) adhering to the Microsoft style guide ("Best
viewed with Internet Explorer") or the Netscape style guide. Those people who
followed no standards or really lousy standards (the school of BLINK) were
mocked or ignored.

> As any good scientist and engineer will tell you, the only way you can truly
> tell if something works is to take it out and fly it (drive it, execute it, put
> it on a web server etc.) I think the same rule applies to documentation. The
> only way to tell if documentation works is to sit your ass down and write it
> and then let people read it. Nobody in the universe (besides another tech
> writer) cares one tiny bit if a document was written using a good standard if
> the document (help file, CBT, whatever) contains useless information. And just
> because you follow a well thought out process does not mean you will, always
> and reliably produce good documentation.

True...and if you follow a well-thought-out process and produce crappy
documentation, you need to change your process. But if you wing it and
produce crappy documentation, how do you know what to change? Now, I'm
not saying that the style guide (or methodology, or whatever standard
you care to name) will make it crappy or not, but if you know what you
did, you have a starting point for making changes. (Hmmm...I compiled
this with the -cksjf option, and it reformatted my hard drive...perhaps
I should try a different option next time.) But if you have no process,
you're doomed to forever winging it. If you have a process that only you know,
it's simply not yet written down, but it's just as much a standard for
Andrew's writing as something else might be for mine.

Note that process, as I'm using it here, applies to everything from
information gathering to writing to testing to reviewing. Anything
and EVERYTHING that goes into your documentation product follows
a process/standard/methodology. The only question is whether you know
what it is.

> Therefore, while I may seem like I am advocating chaos and anarchy all the
> time, in reality I merely advocating some common sense when it comes to
> standards. Use them wisely and treat them as friends not mandates from the
> Gods. Good standards will last, bad ones should evolve. But having a standard
> just because you should is nonsense.

I agree completely.

> Good writers figure out what it is they are writing about FIRST. SECOND they
> write the document. Then THIRD they make sure it conforms to appropriate
> standards or bend the existing ones to fit the current situation. Applying a
> standard to a document about a product (science, concept, idea, design, etc.)
> you do not understand is like putting a bow on Pandora's Box. Sure it looks
> pretty, but you still have no idea what's inside the box.

I disagree. SECOND, they set up their working assumptions (computer literate
users, expecting API documentation, don't care about spaces after a period or
punctuation within quotes, using this process or style guide or whatever).
THIRD, they write the document in accordance with the assumptions they've made
(or, to return to fighting words, they write the document in accordance
with the standards they use). If you do it this way, you don't waste time
on trivial decisions, don't have to rework the document again, and don't
document stuff that doesn't need to be documented or miss stuff that does
need to be documented. If I write "documentation" about product A without
clearly specifying my standards/audience/methodology/style, etc., I'm
potentially wasting my time and my readers' time. If I clearly understand
the assumptions and audience and standards (API docs, not user docs,
for example), I can more efficiently write good documentation.

Eric
ejray -at- raycomm -dot- com



