The Year 2000 Problem arises from the simple fact that the two-digit representation for the year 2000 is "00." A great deal of computer logic uses only two digits rather than four to designate the calendar year (e.g., DD/MM/YY). The century portion of the date was frequently left off, and the logic simply assumed that the century was 19. Computer programs adopted this two-digit notation to save valuable storage space and data entry time. The practice was common, but it leads to incorrect results whenever computer software performs arithmetic operations, comparisons, or data-field sorting involving years later than 1999.
Although humans can properly interpret this representation,
most computer programs will believe that "00" is less than "99." Many will
assume that "00" simply means "1900." This causes a serious problem because,
without the
leading century digits, a program has no certain
way to differentiate between the years 1900 and 2000. Suppose, for example,
that an employee was born in 1935 and is scheduled to receive retirement
benefits in 2000. A simple computer calculation of that employee's age
will provide the wrong result ("00" minus "35" = -35). Another problem
originates from the fact that many systems were programmed to interpret
the date 12-31-99 as an infinite maximum date. Currently, for example,
active employee assignments are given a default termination date of 12-31-99.
Obviously, this system will need to be corrected to prevent all of
our employees from being automatically terminated at the end of 1999. A
third problem is that the year 2000 is a leap year. Some date-processing
routines may not recognize this fact. Century years are considered leap
years only when divisible by 400.
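As a rough illustration (the snippet below is only a sketch in Python, not the language the original systems were written in, and the values are taken from the retirement example above), the faulty arithmetic and the correct leap-year rule look like this:

    # Two-digit year arithmetic: the retirement-age calculation goes wrong.
    birth_year = 35          # born in 1935, stored as "35"
    current_year = 0         # the year 2000, stored as "00"
    print(current_year - birth_year)   # -35 instead of the correct 65

    # The leap-year rule: century years are leap years only when divisible
    # by 400, so 2000 is a leap year while 1900 was not.
    def is_leap(year):
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    print(is_leap(2000), is_leap(1900))   # True False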
WHO IS AFFECTED BY THE YEAR 2000?
The Year 2000 problem is a significant challenge across the computer industry and for any business, agency, institution, or person using computers. Any system or program, including desktop software, could be affected if two digits are used to represent the year.
The Y2K problem can have a serious impact upon numerous business functions: payroll, student loans, retirement benefits, a five-year auto loan beginning in 1996, mortgage calculations, and so on. Any computer calculation that involves a date could yield incorrect answers.
Some errors might be obvious, causing programs to end abnormally; others might quietly corrupt files with inaccurate data. A number
of economic sectors could face electronic disruptions in January 2000,
but the extent of the potential problem remains largely unknown because
industry groups are just beginning to collect data on the problem's scope.
The Y2K problem is real, and it affects every one of us. Just pick up any newspaper or magazine: they are full of vivid stories about the impending disaster coming January 1, 2000, or possibly Monday, January 3, 2000. Banks will not open, automatic teller machines will not disburse cash, traffic lights will lose their timing, airplanes will not take off, power plants will shut down, hospitals will be unable to function, and government benefit checks will be delayed. There are predictions of empty store shelves in big cities, riots in the streets, and hundreds of businesses going bankrupt.
Surely, there's a lot of exaggeration in the press
about the possible consequences of Y2K-related failures. The ultimate
consequences of such errors are, at best, unknown. However, there is also
some truth in many
of these predictions: banking, security, health
care, distribution, energy, telecommunications, transportation, and other
critical industries will suffer at least some disruptions; investors, management,
and workers will end
up in court over unresolved Y2K issues; and there
is a fair chance both the US economy and the world economy will suffer
while problems not solved in time by January 1, 2000 are being fixed.
WHY HAS THE YEAR 2000 PROBLEM HAPPENED?
Several business and technological problems have combined to create "The Millennium Time Bomb".
1. Computing Resource Constraints
Main memory and disk space were at a premium in the early days of computing. Until recently, computer memory and storage were expensive, and performance could be adversely affected by the manipulation of "unnecessary" data.
It made sense to save several characters in every date entry in a database, especially in databases containing millions of records. Writing programs that minimized the amount of memory and storage used was desirable. So programmers, usually working in COBOL, hit upon the idea of using just two digits to record years in dates, so that 59 stood for 1959.
This convention was carried forward even when
both memory and storage costs
began to plummet in the 1980s.
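A small sketch (the sample dates below are invented for illustration) shows how this space-saving convention also breaks simple sorting once records cross the century boundary:

    # Dates stored as two-digit-year strings, as was common in old records.
    dates_yymmdd = ["991231", "000101"]   # Dec 31, 1999 and Jan 1, 2000

    # A plain sort puts the year-2000 record first, because "00" < "99".
    print(sorted(dates_yymmdd))           # ['000101', '991231']

    # With the full century stored, the order comes out right.
    dates_yyyymmdd = ["19991231", "20000101"]
    print(sorted(dates_yyyymmdd))         # ['19991231', '20000101']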
2. Applications Have Lasted Longer Than Expected
Programmers assumed that the applications they were writing would be replaced long before the turn of the century. Remarkably, many of those old programs are still in use.
Besides, once an application is working, there is reluctance to significantly change or replace it until forced to do so. "If it ain't broke, don't fix it" has long been the unofficial business motto of many organizations. This makes short-term sense and keeps the budgets looking good, but it sacrifices the longer-term good of the company. With the year 2000 approaching, that long term is suddenly coming home to roost.
3. Lack Of Widely Accepted Date Standards
No standard representation of dates has been internationally
accepted and implemented. For example, to an American, 1/4/96 means January
4th, 1996. To an Englishman, 1/4/96 means the first of April, 1996 - more
colloquially known as "April Fools' Day".
At least the year is common between these two. But even this has not been true around the world - China, for example, did not standardize on the Gregorian calendar until 1948. Attempts have been made since the mid-1970s to set such standards, and as recently as the late 1980s the ISO-8601 standard advocated the YYYYMMDD format for date storage. But the use of these standards was never internationally mandated, and they never achieved critical mass in commercial acceptance.
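A short sketch in Python (the date string and format patterns here are assumptions for illustration only) shows the day/month ambiguity and what the ISO-8601 YYYYMMDD ordering looks like:

    from datetime import datetime

    raw = "1/4/96"                            # the ambiguous date from the example above

    us = datetime.strptime(raw, "%m/%d/%y")   # American convention: month/day/year
    uk = datetime.strptime(raw, "%d/%m/%y")   # British convention: day/month/year

    print(us.strftime("%Y%m%d"))              # 19960104 - January 4th, 1996
    print(uk.strftime("%Y%m%d"))              # 19960401 - April 1st, 1996

    # The ISO-8601 YYYYMMDD ordering removes both the day/month ambiguity
    # and the missing century digits.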
4. User Demand
Computers are designed to make life easier for
users and to relate to users in a manner consistent with their known world.
Nobody wants to enter the century digits when he or she is not used to having to do so. "The customer is always right." There is little maneuvering room when the user is dictating the requirements for a new system. Historically, users have not thought to specify Year 2000 compliance.
5. Backwards Compatibility
The market has demanded backwards compatibility,
basically because users have been reluctant to retire working applications
in order to replace them with the latest models. Why should I have to rewrite
my nice spreadsheet
just because a new release of Lotus 1-2-3 came
out?
Every new application written is expected to maintain some form of compatibility with previous systems. Today, Windows 98 is faced with the burden of supporting not only DOS programs but also Windows 3.1, Windows 95, Windows for Workgroups, and so on. All this has meant that out-of-date date algorithms have had to be supported by leading-edge replacement systems.
6. Code Re-Use
It has always made sound economic sense not to reinvent the wheel. Virtually all new applications incorporate algorithms and even code from previous systems. This speeds up development and usually results in more reliable systems. But the re-use of algorithms with hidden date processing is what makes the year 2000 problem so huge, and why some people have likened it to an immense virus. As the algorithms are used and reused, their deadly payload spreads through more and more systems.
7. Historical Data
Data accumulates over the years, and successive applications are built on this asset for analysis purposes. Changing the data means changing the applications that access it. And this has ramifications outside the realm of the IT department. With the PC revolution, development and control of many applications have typically passed out of the hands of MIS professionals and into the hands of end users. So changing the core corporate data means immense inconvenience and cost to all those end users.
8. Procrastination
By the early 1990s, date problems had begun to appear. Boeing was the first company to report them: date calculations suddenly became problematic as calculated time periods began to fall into 2000.
In 1995, computers at Unum Life Insurance of America automatically deleted 700 records from a database that tracks the licensing status of brokers, because some programs interpreted the expiration date as 1900. Some people talked about tackling the problem, but little happened. After all, the general impression was that it was an easy fix: you just had to put a 19 or 20 in front of those two-digit years to make things work. Why bother with a feature that nobody really wanted to implement anyway? A significant proportion of MIS departments have been putting the Y2K problem off. There are always other, more urgent, things to do, they reasoned.
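The "easy fix" alluded to above usually meant a windowing rule that guesses the century from the two-digit year. The sketch below shows the idea; the pivot value of 50 is an assumption chosen only for illustration, and each application had to pick a window suited to its own data:

    PIVOT = 50   # assumed pivot year, for illustration only

    def expand_year(yy):
        # Guess the century for a two-digit year using a fixed window.
        if yy >= PIVOT:
            return 1900 + yy    # 59 -> 1959
        return 2000 + yy        # 00 -> 2000

    print(expand_year(59), expand_year(0))   # 1959 2000

Note that no single window can cover, say, both a 1935 birth year and a post-2000 loan maturity, which is one reason the fix turned out not to be so easy after all.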
9. Other Business Priorities
With the increasing pace of change in technology,
the urgent has taken priority over the essential, in order to meet the
user community's demands. There is little incentive to devote resources to proactively looking for new problems.
10. Business Process Re-Engineering
Many MIS departments have been downsized to simply
provide the required business functions of the company. The "slack time"
that could have been devoted to addressing the year 2000 issue before it
became urgent has
been deliberately cut out, in the search for
leaner and meaner business processes. Furthermore, because they now lack the internal resources to handle the year 2000 problem, companies that have been through Business Re-engineering downsizing will be forced to outsource the Year 2000 fix project and to compete for increasingly in-demand programming resources.
Baltimore, MD. Sun, 11 Oct 1998
VACETS Column