Back when all you saw on desktop computer screens were plain letters and numbers, life was easy. The computer had to keep track of just 2,000 locations: 80 characters on each of 25 lines. Early on, some Apple IIs and IBMs didn't even show 80 letters across a line, just 40! Easy really meant dumb. All these screens could show were rudimentary letters and numbers and very crude drawings.

Modern Macintoshes and IBM compatibles keep track of more like 480,000 screen locations. Each is so tiny you'd call it a dot. Computer engineers call it a pixel. The engineers now light the screen pixel by pixel. You can see Times Roman, Helvetica, Tiffany or any other type style you like. The same screen can show many styles of drawing, and colors, too. For proof, see some of today's computer games.

All your computer needs, to be that fancy, is the right graphics adapter circuit card (or board, the two words being interchangeable in techy talk). Some newer computers come with great cards. Most older computers didn't.

If your computer is showing its age, a new graphics adapter card can give it new life. You can install the card yourself in all but the cheapest or oldest IBM compatible computers.

A graphics card is a go-between. The computer's main brain (CPU in jargon) tells the card what's to be drawn, whether a letter, a number or a horse. The card sections the screen (like graph paper) into pixels and tells the right pixels to light up. The more pixels, the less fuzzy the screen.
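That pixel-by-pixel idea is easy to sketch in code. Here's a toy version in Python (our own illustration, not anything inside a real card), where the "card" keeps a grid of pixels and switches on the ones that make up a shape:

```python
# A toy frame buffer: a grid of pixels the "card" can switch on or off.
WIDTH, HEIGHT = 16, 8
screen = [[0] * WIDTH for _ in range(HEIGHT)]

def light_pixel(x, y):
    screen[y][x] = 1  # 1 = lit, 0 = dark

# Draw a horizontal line, the way a card might trace a letter's stroke.
for x in range(4, 12):
    light_pixel(x, 3)

print(sum(row.count(1) for row in screen))  # 8 pixels lit
```

A real card does the same bookkeeping for hundreds of thousands of pixels, dozens of times a second.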

To create colors, graphics boards assign each different color a different number. In dividing the screen, graphics cards assign each pixel a certain number of bits.

Bit is short for binary digit, "binary" because one and zero are the only things computer chips understand. A pixel assigned two bits can take four values (00, 01, 10 and 11).

Therefore, a board with two bits per pixel can show four colors onscreen. A card with three bits per pixel shows up to eight shades at one time.
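That doubling rule is just powers of two. A few lines of Python (our own illustration, not part of any card's software) spell it out:

```python
# Colors a card can show at once: 2 raised to the bits assigned per pixel.
def colors_at_once(bits_per_pixel):
    return 2 ** bits_per_pixel

for bits in (1, 2, 3, 4, 8):
    print(bits, "bits per pixel ->", colors_at_once(bits), "colors at once")
```

Two bits give the four combinations above; eight bits, as on VGA boards, give 256.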

In reading ads, be sure to find out how many colors a card can display at one time. Many can display over 100 colors, but only eight at a time. Also don't be confused by ads for eight-bit and 16-bit cards. That number tells how many bits the card receives from the computer's CPU, not how many pixels you get onscreen.

The Macintosh had extremely good resolution and color selection almost from the word go. Not so IBM compatibles. CGA (Color Graphics Adapter) boards for early IBM compatibles had a low 320 by 200 pixel resolution. Why? Mostly because early IBM world engineers didn't think people needed better than that.

CGA was followed by EGA (Enhanced Graphics Adapter), a higher resolution "standard" that's becoming obsolete. The newest, VGA (Video Graphics Array), gives potentially as much resolution and colorization as the Macintosh by assigning each pixel eight bits.

All VGA boards have at least two settings. In low resolution mode, they can show up to 256 shades of color. When set for highest resolution (into which some boards squeeze over 1,000 by 600 pixels), color range dives to as low as 16 shades.
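Why the dive? A board carries a fixed amount of memory, and pixels times bits per pixel has to fit inside it. The sketch below shows the squeeze in Python (the 256-kilobyte figure is typical of plain VGA boards and is our assumption, not a number from this column):

```python
# Video memory needed, in bytes, for a given screen mode.
def mode_memory_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

# A plain VGA board typically carries 256 kilobytes of video memory.
VGA_MEMORY = 256 * 1024

# 320 by 200 at 8 bits per pixel (256 colors) fits easily...
print(mode_memory_bytes(320, 200, 8))                 # 64000 bytes
# ...but 640 by 480 at 8 bits would not fit, so in high
# resolution the board drops to 4 bits per pixel (16 colors).
print(mode_memory_bytes(640, 480, 8) <= VGA_MEMORY)   # False
print(mode_memory_bytes(640, 480, 4) <= VGA_MEMORY)   # True
```

Boards that squeeze on more pixels at more colors simply carry more memory, which is part of what you pay for.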

Each kind of graphics card (CGA, EGA or VGA) needs different instructions from software programmers. Most video cards support the earlier standards. So VGA cards can run CGA and EGA programs.

But none of them work the other way around. So unless your software has programming in it for high resolution or extensive color selection, you're just as well off - for those programs - with cheaper EGA or even, for some programs, CGA. (You may want to consider moving to newer or revised software.) To get VGA resolution and colorization, you need a VGA monitor. A few graphics cards work only with special monitors. Before you buy one, make sure your monitor can use it.

(Watch for our column explaining how monitors work, which brands work best, and how to select one for yourself.)

Macintosh users are spared most graphics card decisions. Even the cheapest Macs match the IBM world's VGA resolution. Mac owners who want more kicks can buy 24-bit color adapters and monitors. They paint the screen with so many dots and color subtleties that images look almost photographic.

IBM compatible owners can also get 24-bit components. Whether for Mac or IBM, we're talking big bucks here.

Back to ordinary cards for IBM compatibles: We've always had good luck with ATI and AST's CGA and EGA adapters - quick installation without a lot of switch setting or manual reading. Paradise and Everex cards follow not far behind. We also tested a lot of Brand-X models and, with most, found only minor quibbles.

When it comes to VGA cards, you do get what you pay for. El Cheapo boards work well for ordinary VGA signals (although one we tested often put extraneous lines onscreen). But if you're looking to squeeze on 1,000 by 600 dots or more, buy a brand we named above. They cost $100 to $500.

For heavy-duty users, we recommend NEC's MultiSync Graphics Engine. This circuit card has the power of ordinary VGA cards plus processor chips that do work usually done by the CPU chip. It works only with multisync monitors and starts at $1,000. But it paints elaborate pictures and graphs very quickly.

NEC's Model MGE-AT-16 Graphics Engine tested at nearly double ordinary VGA speed. Model MGE-AT-256 shows 256 shades of color at up to 1024 by 768 pixels of resolution.

You can read back issues of this twice-weekly column at the electronic library, NewsNet, reachable via computer plus modem over phone lines. For NewsNet information, call (800) 345-1301. Copyright 1990 P/K Associates Inc. 3006 Gregory Street, Madison, WI 53711-1847.