Roman numerals were used to set up the system of numbering years in the Christian era. There is no Roman numeral for zero, and the year 1 A.D. was the year following 1 B.C. The first millennium began on January 1 of year 1 and, if a millennium is defined to be 1,000 years, it ended at the end of year 1000. The second millennium will not close until the end of year 2000.
Y2K is not truly a "millennium bug." The third millennium will not begin until January 1, 2001, unless we claim that either the first or second millennium covered only 999 years.
Year 2000 is just the year after 1999. This progression of years is natural to us. Unfortunately, it does not exist in many computer applications. For years, well-meaning programmers have been storing years in the two-digit format YY. As the years progress, these computer systems see the years as 97, 98, 99, 00, 01, and so on. At first glance the use of two-digit years doesn't seem to give us any cause for alarm. We know that 00 should be 2000 and that it follows 1999. However, computer applications that use two digits for year storage do not know the century. The years 1900, 2000, and 2100 will all be stored as 00. When a computer application sorts by year, the year 00 will come before 99, so the year 00 is treated as if it were 1900. Computer applications will also perform date calculations incorrectly when using two-digit years. To illustrate the problem, here is how computers may calculate an age in the year 2000 for someone who was born in 1945:

2000 - 1945 = 55
00 - 45 = -45 in a signed two-digit integer field
00 - 45 = 45 in an unsigned two-digit integer field

As you can see from the example, this person's age in the year 2000 may be calculated as 45 instead of 55. If you were born before 1950, you'll be able to say that you are younger than you really are!
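The two failure modes above, bad sorting and wrapped arithmetic, can be sketched in a few lines of Python. The function names are illustrative, not from any real system, and "unsigned" is modeled here as simply dropping the sign, matching the 45 in the example:

```python
# Sketch of the two-digit-year problems described above.
# Function names are illustrative, not from any real system.

def age_four_digit(current_year: int, birth_year: int) -> int:
    """Correct age calculation with full four-digit years."""
    return current_year - birth_year

def age_two_digit_signed(current_yy: int, birth_yy: int) -> int:
    """Naive two-digit subtraction in a signed field."""
    return current_yy - birth_yy

def age_two_digit_unsigned(current_yy: int, birth_yy: int) -> int:
    """Same subtraction with the sign dropped, as an unsigned field might hold it."""
    return abs(current_yy - birth_yy)

print(age_four_digit(2000, 1945))     # 55  -- correct
print(age_two_digit_signed(0, 45))    # -45 -- signed two-digit field
print(age_two_digit_unsigned(0, 45))  # 45  -- unsigned two-digit field

# Sorting two-digit years puts "00" first, as if 2000 were 1900.
print(sorted(["97", "98", "99", "00"]))  # ['00', '97', '98', '99']
```

Note that the unsigned result of 45 looks plausible, which makes this bug harder to spot than an obviously negative age.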
Just imagine how inventory items with an expiration date of 00 or beyond will be handled by automatic inventory systems. Will an inventory system in the '90s flag the new item as old and automatically create an order for another item? That could be very expensive.
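As a rough sketch of that scenario (hypothetical function, assuming the system pins every two-digit year to the 1900s):

```python
from datetime import date

def is_expired(expiry_yy: int, today: date) -> bool:
    """Hypothetical legacy check that assumes every YY year means 19YY."""
    assumed_expiry_year = 1900 + expiry_yy  # the faulty century assumption
    return assumed_expiry_year < today.year

today_in_the_90s = date(1998, 6, 1)
print(is_expired(99, today_in_the_90s))  # False -- 1999 is still in the future
print(is_expired(0, today_in_the_90s))   # True  -- 2000 misread as 1900
```

An item stamped to expire in 2000 looks 98 years past its date, so the system could discard fresh stock and reorder.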
Another problem with the year 00 is that an application may reject the year outright. That may sound goofy, but many systems have been written so that if the year is zero, the user is prompted for the year again.
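A minimal sketch of that kind of validation (hypothetical check, not from any specific system):

```python
def year_is_valid(yy: int) -> bool:
    # Legacy check that treats a zero year as "no year entered."
    # In 2000, every legitimate entry of 00 fails this test.
    return yy != 0

print(year_is_valid(99))  # True
print(year_is_valid(0))   # False -- the year 2000, entered as 00, is rejected
```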
You may have heard that this is a mainframe problem only. That's not true! This problem exists in any computer system, computer application, or computerized hardware that uses two digits for the year. I wonder if the security system in our building will let me in on January 1, 2000.
You may also be thinking that you don't have to worry about this problem yet. Wrong! If you do any kind of date projections with your computer systems, such as calculating an expected date of graduation, then you will have year 2000 problems before the year 2000 arrives.
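For example, a projection made in 1997 already crosses the boundary. This is an illustrative sketch with a hypothetical function name, not code from any real registration system:

```python
def projected_graduation_yy(entry_yy: int, program_years: int) -> int:
    """Projected graduation year in a two-digit field, which wraps past 99."""
    return (entry_yy + program_years) % 100

# A student entering in 1997 for a four-year program:
yy = projected_graduation_yy(97, 4)
print(yy)       # 1 -- stored as "01"
print(yy < 97)  # True -- the projected graduation appears to be in the past
```

So a system doing this projection in 1997 would already be producing dates that sort before the current year.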
You may also be thinking that you'll just buy the year 2000 compliant version of your software. If so, you should contact your vendor now and ask whether their software is year 2000 compliant. Hopefully they have plans underway to become compliant.
Do you receive data from sources outside your area? Does this information contain dates? Do you know how those sources are planning to implement changes for the year 2000? Will their changes affect your system?
Don't wait. Find out how your systems process dates.
CIS Year 2000 Team Leader
Computing and Information Services
Texas A&M University
Thanks to The Internet Tourbus for providing the following links -
More Y2K information:
The Cassandra Project
Understand Y2K in 5 Steps
The Y2K Watch
Y2K MEDIA WATCH