Getting into the new TV series "Terminator: The Sarah Connor Chronicles" (which isn't bad, by the way) prompted me to watch the "Terminator" movies again.

And while doing so, I noticed that the trilogy charts an amazingly rapid advance in movie magic over two decades, providing a timeline of the inroads made by the still-growing use of computer graphics.

In 1984, James Cameron's "The Terminator" was a low-budget hit, offering a clever twist on sci-fi movie staples up to that point — the man-vs.-machine theme that dates back to "Metropolis," the bleak future of "Mad Max," the grungy technology of "Alien," the cynicism of "Blade Runner" and, of course, that overused, always-confused fantasy plot device, time travel.

What "The Terminator" did not have, however, was the advanced visual look of bigger-budget pictures. In my 1984 "Terminator" review, I wrote: "The special effects aren't 'Star Wars' level, but they serve the purpose."

That's especially true of the film's climactic use of Ray Harryhausen-style herky-jerky stop-motion animation, as Arnold Schwarzenegger's skin melts away and the skeletal robot stalks his prey.

When I was a kid, Harryhausen's "The 7th Voyage of Sinbad" (1958) was highlighted by a sword fight with a skeleton that knocked my socks off. Then he upped the ante five years later with an entire army of skeletons in "Jason and the Argonauts." Great stuff.

It's fair to say that "The Terminator" marked the end of that era. Fantasy flicks would be required to up their game if they were going to satisfy the hungry youth audience looking for the next gee-whiz thrill.

Like George Lucas, Cameron was ever on the lookout for "wow" effects, and after experimenting with fluid creatures in his underwater epic "The Abyss," he came up with the eye-popping "liquid metal" of "Terminator 2: Judgment Day" (1991).

There would be no going back. The bar had been raised; computer graphics had set the standard. And not just in live-action films, but also in animated features — from experimental sequences in Disney's "Beauty and the Beast" (1991) and "The Lion King" (1994) to Pixar's groundbreaking "Toy Story" (1995).

The evolution of digital graphics over a relatively short period of time is mind-blowing (which is the idea, of course). The most awe-inspiring leaps and bounds have been made by Lucas, especially with the "Star Wars" prequels from 1999 to 2005, and by Pixar, with eight animated blockbusters in a row since 1995, an astonishing record.

By the time "Terminator 3: Rise of the Machines" came along in 2003, the shape-shifting "Terminatrix" was an astonishing creation, morphing from one persona to another, her hands and arms becoming all kinds of mechanical devices and weaponry.

Today, computer graphics have virtually (no pun intended) taken over the industry.

But the best visuals in the world don't matter without effective stories and characters.

Here's something I wrote in 1991 about the fantasy-comedy "Death Becomes Her": "Movies today are as technically adept as they can be. But if as much attention was paid to story and character as is given to technical proficiency, the films would be much more enjoyable and fulfilling."

Seventeen years later, that remains a problem. From "300" to "Beowulf," too many live-action films are as cartoony as "Shrek," with characters of no more depth.

Special effects need to supplement the drama, not displace it. It's a lesson that doesn't seem to have been part of the learning curve.