What's the big deal?
By Ralph B. Davis
Associate Editor
Perhaps you're a computer novice or don't even own a
computer. Maybe you're a seasoned computer professional who
just doesn't get all the hype. Or perhaps you've heard all
about Y2K and can't understand why there's such a fuss about
the change from 1999 to 2000. In any case, you're probably
asking yourself the same question millions of others
are:
What's the big deal anyway?
To understand the millennium bug, you have to travel back
in time about 30 years to when computers first came into
vogue.
Of course, back then, there was no such thing as a
personal computer. Instead, businesses and government
agencies were turning to mainframe computers to simplify
their jobs.
The problem was that, unlike today, when memory upgrades
and hard-drive replacements are as cheap and easy as buying
a VCR, everything associated with computers was extremely
expensive. A single megabyte of memory could cost thousands
of dollars.
Even as late as 1984, when Apple Computer unveiled its
first Macintosh with one megabyte of memory and no hard
drive, the price tag was about $2,000.
So, in the late 1960s and 1970s, programmers were under
orders to conserve as much as possible. One cost-saving
measure they chose to implement was to use shorthand when
expressing a date's year, using "76" instead of "1976," for
example.
It wasn't that the programmers were necessarily
short-sighted. At the time, many realized that the two-digit
notation could cause problems later on, but figured that as
computers became cheaper to produce, they would be replaced
with more up-to-date ones.
That, however, didn't quite pan out. Instead of replacing
their old mainframes, most businesses and government
agencies simply upgraded them or added on, leaving their old
computers in place as the base for their entire systems.
Now, as the century draws to a close, many businesses and
government agencies are finding that their computer systems
are out of date -- that when the year 2000 rolls around,
their mainframes will roll over from "99" to "00," sending
dates 100 years into the past.
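The flaw can be sketched in a few lines. This is a modern illustration only -- actual mainframe programs of the era were typically written in COBOL, and the function name here is hypothetical:

```python
# Hypothetical sketch of why two-digit years break date arithmetic.
# Not actual mainframe code; modern Python used for illustration.

def years_elapsed(start_yy, end_yy):
    """Elapsed years computed from two-digit year fields,
    the way many legacy programs implicitly did."""
    return end_yy - start_yy

# An account opened in 1976 ("76"), checked in 1999 ("99"): correct.
print(years_elapsed(76, 99))   # 23

# The same check after the rollover to 2000 ("00"): the account
# suddenly appears to have been opened 76 years in the future.
print(years_elapsed(76, 0))    # -76
```

The subtraction itself is not wrong; the stored data simply no longer carries enough information to tell 1900 from 2000.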
The story of Y2K, then, is how one shortcut taken decades
earlier became a global problem, costing companies billions
of dollars to fix ahead of time -- and no one knows how much
more it will cost to repair after January 1.
Perhaps you're still wondering why the millennium bug is a
problem. So what if computers think the year is 00?
Imagine you run a business on one of these computers. All
of your financial information -- accounts receivable,
accounts payable, inventory, payroll, etc. -- is contained
on this obsolete system. Suddenly, your billing program
thinks none of your customers owe anything for another 95
years.
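That billing failure follows directly from the same two-digit comparison. A minimal sketch, assuming a hypothetical "is this bill overdue?" check rather than any real billing system:

```python
# Hypothetical sketch of a legacy billing check using two-digit years.

def is_overdue(due_yy, current_yy):
    # Legacy logic: a bill is overdue once the current year
    # passes the two-digit year it was due.
    return current_yy > due_yy

# A bill due in 1998 ("98"), checked in 1999 ("99"): correctly overdue.
print(is_overdue(98, 99))   # True

# The same bill checked in 2000 ("00"): the program now believes
# the due date is still nearly a century away.
print(is_overdue(98, 0))    # False
```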
The problem grows wider. If your bank depends on such a
computer, your savings account could be erased. If your
electric company's power plants aren't caught up, the
computers that operate them could suddenly shut down
completely, unable to resolve the inconsistency.
Even the computer in your own home might not be safe. If
you own an IBM-compatible PC running Windows 98, you're not
necessarily in the clear unless you've downloaded the
appropriate update. If you use anything prior to Windows 98,
you are definitely not Y2K-ready.
Apple's Macintosh computers have been Y2K-compliant ever
since the first one rolled off the assembly line (in fact,
they're good until the year 29,940).
But even if you own a Macintosh or PC which has a
millennium-proof operating system, that doesn't mean you're
necessarily out of the woods. While your computer may
continue to function properly, individual programs on it --
word processors, spreadsheets, graphics programs, even games
-- might not.
Starting to see the problem?