What’s the Y2K bug?

The millennium bug was a programming issue caused by storing dates in a six-character format, resulting in incorrect sorting after 1999. Programmers had to decide whether to rewrite programs or fix pre-existing ones. Over $300 billion was spent fixing the issue.

The millennium bug was a computer problem that threatened the operations of businesses, utility companies, the financial industry, government agencies, and even science. The fear was that, at the stroke of midnight between December 31, 1999 and January 1, 2000, computers everywhere might shut down. The millennium bug is also known as the Year 2000 problem, the Y2K problem, or the Y2K error, and is most commonly referred to simply as Y2K.

The millennium bug was specifically a programming issue. It was the result of a combination of space constraints and a lack of foresight on the part of programmers in the 1960s and 1970s. In the early days of computer programming, memory and other storage space were scarce and expensive, so saving every character of storage was a priority.

Programmers wrote business application code in COBOL (Common Business Oriented Language) and RPG (Report Program Generator) to run on mainframes. They stored dates in the form yymmdd, a total of six characters that sorted automatically in ascending order. Each of those characters occupied one whole byte of computer memory, so saving two bytes per date was significant when you consider the number of date fields stored on cards, tapes, or disks across all records in all files on all computers.
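
For illustration, here is a minimal sketch, in Python rather than the original COBOL, of why six-character yymmdd dates stop sorting correctly once the year 2000 is reached:

```python
# Dates stored as two-digit-year strings: 1999-12-31, 2000-01-01, 1985-06-15
dates = ["991231", "000101", "850615"]

# Plain text sorting compares the two-digit year first, so "00" (2000)
# sorts before "85" (1985) and "99" (1999).
print(sorted(dates))
# ['000101', '850615', '991231']  -- the year-2000 date wrongly comes first
```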

In the 1980s and 1990s programs were modified for changing business needs, so programmers maintained, tweaked, and added new requirements to old applications, rather than rewriting them from scratch. The updates and changes were enough to keep the original systems running.

During the mid-1990s, programmers began to realize that dates would not be sorted correctly by the year 2000. Within the computing community it started to become an issue that needed to be fixed. Then, in 1997, the situation became public knowledge.

A decision had to be made: rewrite programs from scratch, or fix the pre-existing programs and their stored dates. The second option presented its own challenge, because some of the source code had been lost.

Many companies sprang up to solve these problems. One option was to simply add the century to each pre-existing date, which added two more bytes for every date stored anywhere in the disk files. Others opted to rewrite their software and take advantage of new object-oriented and networking technologies as they moved their critical applications off mainframes.
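
As a rough illustration of that first option, the sketch below expands a six-character yymmdd date to eight characters (ccyymmdd) by prepending the century. Deciding which century to prepend requires some rule; the cutoff year of 70 used here is an illustrative assumption, not something specified in the article.

```python
def expand_date(yymmdd: str, pivot: int = 70) -> str:
    """Prepend a century so expanded dates sort correctly as plain text.

    The pivot is a hypothetical cutoff: two-digit years at or above it are
    treated as 19xx, those below it as 20xx.
    """
    year = int(yymmdd[:2])
    century = "19" if year >= pivot else "20"
    return century + yymmdd

print(sorted(expand_date(d) for d in ["991231", "000101", "850615"]))
# ['19850615', '19991231', '20000101']  -- correct chronological order
```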

Over 300 billion US dollars (USD) was spent fixing the millennium bug. Beyond the software work itself, countless survival-related businesses sprang up and profited from a concerned and proactive public.
