TechWhirl (TECHWR-L) is a resource for technical writing and technical communications professionals of all experience levels and in all industries to share their experiences and acquire information.
For two decades, technical communicators have turned to TechWhirl to ask and answer questions about the always-changing world of technical communications, such as tools, skills, career paths, methodologies, and emerging industries. The TechWhirl Archives and magazine, created for, by and about technical writers, offer a wealth of knowledge to everyone with an interest in any aspect of technical communications.
Subject: Re: dpi and pixels
From: Tim Altom <taltom -at- SIMPLYWRITTEN -dot- COM>
Date: Mon, 21 Jun 1999 11:21:33 -0500
DPI stands for "dots per inch": a linear measure of how many "dots"
(the smallest units a printer can produce) fit into one inch. This isn't
the same as screen resolution, though the two are widely confused.
"DPI" is meaningless on a screen, which doesn't have "dots" as printed
materials do. Screens (monitors, actually) have picture elements, or
"pixels", which are the smallest units of light the monitor can produce.
Screens are often specified as 72 pixels, meaning that you can fit 72 of
them into one inch. Some do better.
Print resolution is typically far higher than 72. 300 DPI is
low-end laser output. It climbs into the thousands of dots per inch (or
"spots," as printers like to call them) for high-end printing.
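The pixel count a scan produces follows directly from the page size and the scanning resolution: pixels = inches x DPI. A minimal Python sketch, using the 8x10-inch page and 300 DPI figures as example values (the function name is mine, not anything from a real scanning API):

```python
def scan_dimensions(width_in, height_in, dpi):
    """Pixels produced by scanning a page of the given size at a given resolution."""
    return int(width_in * dpi), int(height_in * dpi)

# An 8x10-inch page scanned at 300 DPI:
print(scan_dimensions(8, 10, 300))  # (2400, 3000)
```

So a full letter-size scan at even modest print resolution already yields millions of pixels, which is why the scanner's own maximum resolution matters.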
Moving from one resolution to another is always a problem. Either way, you
often lose clarity. When you scan a high-resolution graphic, you're not
scanning a TIF; you're scanning a page. The resolution of the printed
graphic is only a starting point. The scanner now has its OWN highest
possible resolution, which probably isn't all that high unless you have a
really good one. Then you have to factor in the highest resolution you can
see and store in the computer.
The storage size of a graphic depends largely on its color depth and pixel
dimensions. Every pixel in a graphic file has to be stored, whether it's white
or some wild custom color. Uncompressed, a 24-bit representation of an 8X10
graphic at high resolution can consume tens of megabytes; even a 2-color
black-and-white version of an 8X10 is almost exactly one meg. So your boss is
right to be concerned.
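The arithmetic behind those figures is just pixels times bits per pixel. A sketch of the uncompressed estimate (real TIF files vary with compression and headers, so treat this as a floor, not the on-disk size):

```python
def raw_size_bytes(width_px, height_px, bits_per_pixel):
    """Uncompressed image storage: total pixels times bits per pixel, in bytes."""
    return width_px * height_px * bits_per_pixel // 8

# An 8x10-inch graphic scanned at 300 DPI is 2400x3000 pixels:
print(raw_size_bytes(2400, 3000, 24))  # 21600000 bytes -- over 20 MB in 24-bit color
print(raw_size_bytes(2400, 3000, 1))   # 900000 bytes -- "almost exactly one meg" in 1-bit B&W
```

Note how the 24-bit version is exactly 24 times the 1-bit version: the pixel count is identical, only the bits spent per pixel change.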
In essence, DPI is how it prints; pixels are how it looks onscreen. These two
things interact. A high-resolution output may look pretty ragged onscreen,
but the output driver will take care of that.
Get a good book on this subject if you're venturing into it regularly. A lot
of our more tedious work is in this area, translating graphics from one
platform, resolution, or format to another.
Simply Written, Inc.
Featuring FrameMaker and the Clustar Method(TM)
"Better communication is a service to mankind."
----- Original Message -----
From: Sylvia Braunstein <sbraun -at- RUGGED -dot- COM>
To: <TECHWR-L -at- LISTSERV -dot- OKSTATE -dot- EDU>
Sent: Monday, June 21, 1999 11:26 AM
Subject: dpi and pixels
>I am confused. I was requested to scan a high resolution picture of
>1000x1000 TIF format. I assumed it was pixels.
>My boss said that 300 dpi (dot per inch) were enough otherwise the file
>would be way too big.
>Now, I am confused between the two of them.
>Can anybody help me understand the difference and how you measure the
> Sylvia Braunstein
> E-mail: sbraun -at- rugged -dot- com