At midnight on Dec. 31, 1999, as the world turns into a new century and a new millennium, lights will go out all over the world. Indeed, this catastrophe may, in some cases, begin earlier, perhaps at the end of 1998.

Is this the paranoid prediction of some millenarian nut who forecasts the expiration of the laws of thermodynamics? No. It represents a sober warning by experts who have analyzed the consequences of decades of carefree shortsightedness.

Programmers and users who maintain applications have become accustomed to two-digit representations of the "year" portion of dates. Thus, a baby delivered in 2000 might be assumed by a Social Security program to be 100 years old at birth. The consequences could be equally serious for financial, commercial, medical, and legal records, prison sentences, and other date-sensitive data of all kinds. Some may be attracted by the prospect that records kept against a possible tax audit might vanish into a memory hole, but the tax collectors are not likely to be amused.

Furthermore, there has been an unknown amount of date-space kleptomania. Programmers looking for an inconspicuous spot to hide some unusual option (like automatic cancellation of parking tickets issued to the mayor's mother-in-law) have inserted a command like "If year = 00 [or 99] then ... etc." Such occurrences are no doubt rare, but the end-of-century problem is genuine and widespread.

How could this happen? Back in the 1960s and 1970s, memory space was costly. Programmers did not anticipate that their work would endure so long and might, in fact, become an honored characteristic of legacy code.

NO GLOBAL FIX

Many computer users (including this reporter) may have heard of the Year 2000 problem but have assumed casually that IBM or Microsoft or someone else in charge would come up with a global fix.
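The Social Security example above comes down to a few lines of arithmetic. The sketch below is purely illustrative (no real agency's code is quoted here, and the function name is invented): a program that stores only two digits must assume a century, and the usual assumption was 19xx.

```python
# Hypothetical sketch of the two-digit-year bug described above; the
# function and its logic are illustrative, not any real system's code.
def age_from_two_digit_year(birth_yy: int, current_year: int) -> int:
    """Legacy assumption: a two-digit stored year always means 19xx."""
    return current_year - (1900 + birth_yy)

# A baby born in 2000 is recorded with year "00" and comes out a centenarian:
print(age_from_two_digit_year(0, 2000))  # prints 100
```

The same assumption works fine for most of the century (a 1995 record of a 1960 birth yields 35), which is exactly why the bug sat undetected for decades.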
We dreamed that all you would have to do would be to buy some simple, inexpensive chip, or download a one-size-fits-all repair program from the Internet, and that would be that. One could then forget about fussy procedural matters like this and go on to cope with real problems.

No. IBM has indeed looked into the matter. It offers a path toward a solution but warns that the journey will be dusty and fatiguing. It is not a quick, simple fix but a 180-page document titled "Year 2000 and 2 Digit Dates: A Guide for Planning and Implementation." This document is now available on the World Wide Web at the IBM Software Group Home Page at <http://www.software.ibm.com>.

IBM's main theme: every application must be reviewed and repaired one by one. There is no universal get-well pill. The guide is a wakeup call for users and programmers combined with step-by-step advice for checking existing programs and making them chronology-friendly forever.

What is the ultimate solution? Open-ended four-digit year entries that, like the regular calendar, go on and on. After all, 2100 is not so far away. And 3001, and so on. Wait! After 9999, the standard must be changed again to accommodate five-digit numbers. An IBM committee is probably working on that issue. Where would we lost children be without our Great Blue Nanny to look after us?

----------------------------
Norris Parker Smith is a journalist who specializes in HPC and high-bandwidth communications. Reader comments are welcome.
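The four-digit expansion described above can be sketched in a few lines. Where a stored field could not be widened, many shops instead applied a "windowing" pivot like the one below; note that the pivot value and function name here are assumptions chosen for illustration, not taken from IBM's guide.

```python
# Illustrative sketch, not from IBM's guide: interpreting a stored
# two-digit year as a full four-digit year. The pivot (a "windowing"
# heuristic) is an assumed value chosen for this example.
def widen_year(yy: int, pivot: int = 30) -> int:
    """Interpret 00..29 as 2000-2029 and 30..99 as 1930-1999."""
    return (2000 if yy < pivot else 1900) + yy

print(widen_year(95))  # prints 1995
print(widen_year(0))   # prints 2000
```

Windowing only defers the ambiguity, which is why the guide's ultimate answer remains genuine four-digit storage.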
IBM SAYS YEAR 2000 COULD BE MASSIVE PAIN IN THE MOUSE
November 3, 1995