question for programmer types : LUSENET : TimeBomb 2000 (Y2000) : One Thread

What is "windowing" in Y2K remediation, and does it have anything to do with embedded chips?

-- zoobie (, May 22, 1999


Windowing is yet another shortcut designed to buy more time. Instead of just fixing the code to handle four digits for the year (full expansion), windowing treats some range of years as starting with 19 and others as starting with 20.

For example, a company might say "any year from 00 through 20 will start with 20 (2000-2020) and any year from 21-99 will start with 19 (1921-1999)." This way, during the Y2K rollover everything will work, until at least 2020 that is. This does not affect embedded chips, but it does raise a whole other set of problems. For example, if you exchange data with another company and they have a different "window" than you, things are gonna get all screwed up, even if both systems are deemed "Y2K compliant".
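That window rule can be sketched in a couple of lines (Python here purely for illustration; the 00-20/21-99 split is the example window above):

```python
def expand_year(yy):
    """Expand a 2-digit year using the fixed 1921-2020 window
    described above: 00-20 becomes 20xx, 21-99 becomes 19xx."""
    if not 0 <= yy <= 99:
        raise ValueError("2-digit year must be in 0-99")
    return 2000 + yy if yy <= 20 else 1900 + yy

print(expand_year(5))   # 2005
print(expand_year(75))  # 1975
print(expand_year(99))  # 1999
```

The fix is cheap because only the expansion rule changes; the stored data stays two digits wide.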


-- TECH32 (TECH32@NOMAIL.COM), May 22, 1999.

Most Real Time Clock chips shipped today, and until a couple years ago almost ALL such chips, only supported a 2-digit year. In those (relatively uncommon, except in PCs) cases where the year is needed and used, it is up to the software (or firmware) to window that year. In a PC, the window is typically 1980-2079.

-- Flint (, May 22, 1999.



Just to clarify, then: in current BIOS or CMOS chips, they have windowed the years. Is that what you mean? So PCs will work and understand that the century is 19** or 20**.

Fascinating stuff.


-- Thomas G. Hale (, May 22, 1999.

* * * 19990522 Saturday

Ref. Flint...

The "typical window" is whatever the vendor/user decides it is. There _is_ NO SUCH THING AS A _typical window_ range.

Cases in point: Each of the four (4) Year 2000 projects I was involved in ( 1996-1998 ) utilized different date ranges in their windowing methods.

In the PC/Network environment it can get even trickier, with differing windowing methods used by many software applications on the same platform; furthermore, applications from the SAME VENDOR ( e.g., Microsoft ) will implement distinct windowing methods to confuse things just a little more!

Ain't it fun, guys?!!

Regards, Bob Mangus * * *

-- Robert Mangus (, May 22, 1999.


You're quite right. The PC window I was referring to is the window protocol used by the BIOS. No BIOS author I've ever heard of uses any other window. Other windows (for example, those used by Microsoft products) are ultimately remapped from a known, fixed BIOS window.

In any case, the window can't exceed 100 years using 2-digit years.

-- Flint (, May 22, 1999.


Here's another definition, and examples to show aspects of windowing:

Windowing is a method for translating two-digit year numbers into four-digit year numbers.

The name comes from imagining that all the four-digit year numbers are laid out in a line, and one is looking at them through a moveable window that allows one to see only 100 of those numbers at a time. One can shift the window left or right to see different 100-year-number stretches -- say, 1921-2020 or 1980-2079, as mentioned above by TECH32 and Flint -- but not more than 100 of the year numbers at any one window position.

As long as the window is *fixed* at one position, one can unambiguously translate year numbers viewed through it, from two-digit to four-digit or from four-digit to two-digit. E.g. if the window position is fixed at 1921-2020, then two-digit year "37" unambiguously corresponds to four-digit year "1937".

If different viewers are using different window positions, they may not get the same results when translating between two-digit and four-digit year numbers.

E.g., if viewer A's window position is 1921-2020 and viewer B's window position is 1980-2079, then viewer A will translate two-digit year "37" to four-digit year "1937", but viewer B will translate that same two-digit year number "37" to four-digit year "2037".

Viewer A and viewer B will identically translate two-digit year numbers "80"-"99" to four-digit years "1980"-"1999", and "00"-"20" to four-digit years "2000"-"2020", but they will get different results whenever they translate two-digit year numbers in the range "21"-"79" to four-digit years.
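The disagreement between viewers A and B can be demonstrated with a small sketch (Python for illustration; `window_start`, the first four-digit year visible in the window, is a name introduced here):

```python
def expand(yy, window_start):
    """Translate a 2-digit year into the 100-year window
    beginning at four-digit year window_start."""
    century = window_start - window_start % 100
    year = century + yy
    if year < window_start:
        year += 100
    return year

# Viewer A's window is 1921-2020; viewer B's is 1980-2079.
print(expand(37, 1921))  # 1937
print(expand(37, 1980))  # 2037
print(expand(85, 1921))  # 1985 -- both agree on "80"-"99"
print(expand(85, 1980))  # 1985
```

Only the two-digit years in the non-overlapping part of the two windows ("21"-"79" here) translate differently.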

So, as pointed out above by TECH32 and Bob, use of different window positions by different viewers can cause trouble or confusion.

One way of defining the Y2K problem is that for decades many programmers have habitually used windowing with a window position of 1900-1999 in their programs ("00" = "1900", "01" = "1901", ... "98" = "1998", "99" = "1999"), but without consciously recognizing it and considering the implications of its limitations as the calendar dates being processed approached and exceeded 1999.

Now we're having to scramble to change these billions and billions of implicit windowing decisions. In some cases, we're just moving the window positions (this may require more complicated programming -- instead of just sticking "19" in front of any two-digit year, one has to add decisions about where to break the two-digit year numbers between "19yy" and "20yy"). In other cases, we're eliminating the windowing by converting to all-four-digit year numbers.

(Ahem -- it should be noted that (A) our use of four-digit year numbers is often just a form of windowing in which the window is 10,000 numbers wide, which may cause trouble about 8000 years from now, and (B) this discussion assumes that the calendar in use is the Gregorian one commonly used by the Western countries in recent centuries, rather than any of the numerous other calendars in which the current year's number is *not* 1999.)

Embedded -- Because windowing is simply a method of translating between two-digit and four-digit year numbers, it can be used wherever there is any programming involving year numbers. This _could_ include embedded systems. In fact, I personally have worked with embedded systems whose programs used windowing. (Note: I write "embedded system" instead of "embedded chip" here because there are embedded systems that consist of more than one chip. Some of the embedded systems I worked with were essentially stripped-down PCs inside ATMs.)

-- No Spam Please (, May 22, 1999.


Flint's posting should _NOT_ be interpreted as meaning that "PC's will work and understand that the century is 19** or 20**".

The situation is more complicated than that. I'll let Flint explain.

-- No Spam Please (, May 22, 1999.

Note that some pre-existing windowing schemes prevent the implementation of the quickest of Y2K fixes -- setting the date back to 1972.

-- a (a@a.a), May 22, 1999.

(1) Windowing doesn't always work, depending on what checks the system makes. For example, I'm involved with a scheduling system that checks to ensure that the date-year is in a valid range; 00 will check out as invalid without additional programming checks, even if the system knows that 00 is 2000. (2) Running the system with a "date" of 1972 fails for the same reason. (3) Windowing may just defer the time bomb for another 30-50 years, onto an unsuspecting programmer then. Since code I wrote 30+ years ago is still running, I can't be sure that today's systems won't be embedded somewhere...

-- Mad Monk (, May 23, 1999.

Actually, it's quite true that "PC's will work and understand that the century is 19** or 20**" -- except for the exceptions! There are four classes of exceptions:

(1) Early PC's (the IBM PC and XT and clones) had no RTC at all. It was necessary for the user to set the time and date every time the unit was booted. The DOS default is to prompt for time and date if there is no autoexec.bat file, as a holdover from those early times.

(2) Improper or missing windowing. Until fairly recently (the last 5 years or so), most BIOSs didn't really window at all; they hardcoded a 19 as the century. Depending on how they did this, if you have one of these you either need to reset the date one time after rollover, or you need to reset it every boot.

Improper windowing: The RTC is battery-backed (if the battery is still good). When the century changes, the PC is either off or on. If off, the BIOS should check the century during the power-up sequence and window it; not all of them do this. If the unit is on, no BIOS will notice the century change until the next call to the BIOS to get or set the time or date. Some BIOSs handle a subset of these four calls (get time, set time, get date, set date), and some don't handle any of them. And if an application attempts to read the date directly from the RTC between rollover and that first BIOS call that corrects it, then that application must also have date windowing logic. That isn't necessarily the case, although to my knowledge nobody has found an application that makes this error.

Recent BIOSs from the major vendors (AMI, Award and Phoenix) do the windowing correctly, although of course applications that don't use the BIOS won't know this and must themselves handle the RTC correctly.

(3) Operating system. Most PC operating systems get the date one time, during OS initialization. After that, they use a different clock to keep track of the time. Applications that get the date from the OS do NOT cause the OS to access the RTC directly or indirectly (through the BIOS) ever again. So the OS must also handle rollover correctly, and handle the date properly for all OS functions. This isn't always done right either. Windows 3.x, for example, won't sort the directory by date correctly if files have creation dates after rollover.

(4) Applications. This is the Big One. If the application uses 2-digit years, it doesn't really matter what the BIOS or OS does; the application is responsible for windowing. The obvious culprits are databases and spreadsheets (for example, Lotus Notes up to version 4 wouldn't let you create a 4-digit year if you wanted to; no such data type was defined). More subtle problems exist in, for example, 'make' utilities (got bit by that one), and virus checkers (which check for out-of-date virus databases).

Applications can also get the leap year confused, even though the RTC handles it correctly.
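One common form of that leap-year confusion (an illustrative assumption here, not a specific bug Flint names) is omitting the every-400-years exception from the Gregorian rule, so a program treats 2000 as a non-leap year even though it is one:

```python
def is_leap_naive(year):
    # A common shortcut: leap every 4 years, except century years.
    # This rule gets 2000 wrong.
    return year % 4 == 0 and year % 100 != 0

def is_leap_correct(year):
    # Gregorian rule: every 4 years, except century years,
    # except every 400 years.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_naive(2000))    # False -- the bug
print(is_leap_correct(2000))  # True
print(is_leap_correct(1900))  # False -- 1900 really wasn't a leap year
```

An application with the naive rule will reject or mishandle Feb. 29, 2000 regardless of what the RTC reports.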

In general, to be sure your 'PC' is compliant, you must check *every* application you use. Some applications use dates internally and you wouldn't know it except they screw up after rollover in undefined ways. You wouldn't even know how to test except to set the clock forward (preferably after backing up *everything* or using an identical PC with your entire hard disk cloned). Most of these cases will come out in the wash later, and hopefully any damage can be repaired (but hope may not be sufficient).

Of course, even if you test a shrinkwrapped application and find date bugs, you can't fix them. You'll need a compliant upgrade, which may not be available. If you rely on that application and there *is* no upgrade, well, life is tough.

-- Flint (, May 23, 1999.

Windowing is usually used to avoid expanding a file layout. If you expand a file, ALL programs that use that file must be adjusted, even if they don't reference the expanded field. And the file must be physically updated.

Another problem with windowing that I don't think was mentioned is the sorting problem. Since the field still contains "00" for 2000, it will sort before all other records, i.e. as though it were 1900.
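A quick sketch of that sort problem (Python for illustration; the pivot of 30 in the fix is a hypothetical value):

```python
# Two-digit years stored as text sort "00" (meaning 2000)
# ahead of everything else, as if it were 1900.
records = ["97", "99", "00", "98"]
print(sorted(records))  # ['00', '97', '98', '99'] -- 2000 sorts first

# Applying the window before sorting restores the true order.
pivot = 30  # assumed: 00-29 -> 20xx, 30-99 -> 19xx
fixed = sorted(records,
               key=lambda yy: (2000 if int(yy) < pivot else 1900) + int(yy))
print(fixed)  # ['97', '98', '99', '00']
```

So windowed data is fine for arithmetic and comparisons, but every sort over the raw two-digit field has to go through the window logic too.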

I'm not a big fan of windowing, but it does work, kind of, in some situations. I'll take any solution, if it works, that can save some time. Time is the real problem here... <:)=

-- Sysman (, May 23, 1999.

"Windowing" is something that does not get discussed much anymore, so for those who are late to the Y2K debate, I will go into some detail. Also, a full understanding of the windowing concept will go a long way to explain why the bump in the road scenario is far more likely than TEOTWAWKI.

But before getting into windowing it is important to correct a couple of basic misunderstandings about Y2K in order to provide some context for this programming technique.

You know that preamble in most media stories, where they say that the cause of the Y2K problem is that back in the old days, when memory was expensive, programmers dropped the two century digits to save space? Well, it's a neat idea, but it happens to be wrong. It's a canard, like that old story about Einstein flunking math, which has simply been repeated so many times that it has become widely believed.

You see, anyone who is the least bit interested in minimizing the space required to store a date will not store a date as 6 or 8 characters of TEXT, but will store it as an INTEGER. The century digits are really dropped for the same reason we do it in normal conversation. If you tell me that you saw the '92 World Series, I don't have to ask you, was that 1892 or 1992? In most cases it is perfectly safe to make certain assumptions about the century that a year belongs to.

So the real reason for the programming shortcut is simply a matter of convenience. If I pull out my Visa card I see that the expiry date is 09/00. When I buy my "Phantom Menace" ticket over the phone, I can just as easily say "Zero Nine Zero Zero" as "Zero Nine Two Thousand", and for the operator on the other end, it is a couple less keystrokes of data to enter. So why didn't Visa make my expiry 09/2000 when they were doing their Y2K remediation? Because it is perfectly obvious that the year must be 2000. It couldn't be 1900, credit cards did not exist then, and certainly not 2100, nobody's credit is THAT good.

This in a nutshell is the concept of windowing: making an assumption about the century based on a set of rules. Another myth is that windowing is a new technique created for Y2K remediation. Ridiculous. I have personally been using this method for over 20 years, and it was old when I started. There are exceptions of course, where expansion, i.e. the use of explicit 4 digit years is desirable, but they are rare. An example would be date of birth, some people are over 100 years old. But for most situations, it is unlikely that transactions will span 50 years, let alone 100.

So for applications that have properly implemented windowing, Y2K is rarely an issue. When Y2K remediation efforts got into full swing in 1997, there was a big debate between the "windowers" and the "expansionists". The expansionists wanted to move to full 4 digit years so that the problem would be fixed "once and for all". Many of these people also wanted to take the opportunity to finally come up with a true standardized date notation, to finally do away with the many, and often confusing, date formats that are commonly used worldwide.

It was from the arguments put forward by these expansionists that some of the wildly exaggerated notions about the effort required for Y2K remediation got started. That is because if the date format on every single screen display and report had to be changed, Y2K would become a problem of nightmarish proportions. Fortunately, windowers won this debate hands down, but there are some companies that are running late and over budget as they did try to go the expansionist route.

Yet another myth is that it is critical for companies exchanging data to all use the identical "window", i.e. use exactly the same coded set of rules. Again, if you think about this for a moment, you will realize that in almost all cases it is totally irrelevant.

Example: suppose 4 companies are exchanging data regarding inventory, orders, billing, etc.

Company A uses rule: year > 30 then century is 19 else century is 20.

Company B uses rule: year > 38 then century is 19 else century is 20.

Company C uses rule: year > 35 then century is 19 else century is 20.

Company D uses rule: year > 41 then century is 19 else century is 20.

Okay, so we have four different companies passing transactions back and forth with four different date windows. If a transaction comes through with a date of 361231, companies A and C (whose rules put 36 above the pivot) will take it to mean Dec. 31, 1936, while companies B and D will assume it is Dec. 31, 2036. But wait a minute, how likely is it for a transaction to have such a date? Know any companies that have their records computerized back to 1936? Deal with anybody that wants their order shipped in 2036?

In real life, the vast majority of business transactions don't go back or ahead more than 20 years, so if my window and your window are off by a few years, it doesn't make an iota of difference, because it is extremely unlikely we will ever have a transaction with a date that would screw up. As long as each participant uses a window that is in the same ballpark, and is reasonable, it will work fine.
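Those four rules can be checked mechanically; a sketch (Python purely for illustration, using the pivots from the example above):

```python
# The four companies' rules: "year > pivot then century is 19
# else century is 20".
pivots = {"A": 30, "B": 38, "C": 35, "D": 41}

def expand(yy, pivot):
    return 1900 + yy if yy > pivot else 2000 + yy

# A year of 36 falls inside the disagreement band:
print({co: expand(36, p) for co, p in pivots.items()})
# A and C read it as 1936; B and D read it as 2036.

# But every realistic transaction year (roughly 1980-2020 here)
# expands identically under all four rules.
for yy in list(range(80, 100)) + list(range(0, 21)):
    results = {expand(yy, p) for p in pivots.values()}
    assert len(results) == 1  # all four companies agree
```

The disagreement band is exactly the span between the smallest and largest pivot, and no real business data lives there.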

Ah, but won't we all have to change our windows in about 30 years? Yes, but so what? Assuming that the software application is calling a subroutine to handle the date formatting on input, only a few lines of code need be changed. If this is handled by a call to the OS, the application need not change whatsoever. All of the places that use dates will use that routine and hence the new rule. Isn't that worth being able to say '92 World Series instead of 1992 World Series? Why should we make computers less convenient just because we are in a new century?
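As a sketch of that design (Python for illustration; the routine name and pivot value are hypothetical):

```python
# A single shared routine through which all 2-digit year input
# passes. Moving the window later is a one-constant change.
PIVOT = 30  # assumed: 00-29 -> 20xx, 30-99 -> 19xx

def expand_year(yy):
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(expand_year(5))   # 2005
print(expand_year(75))  # 1975
# Thirty years from now, bumping PIVOT (say, to 60) re-centers
# the window; no calling code needs to be touched.
```

This is why centralized windowing is cheap to maintain, while windowing logic scattered through every program is the expensive case.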

As for "embedded systems", the windowing concept remains the same. The only difference is that frequently there is no century to drop in the first place, i.e. they only ever use 2 digits for the year. In such cases the often repeated phrase about computers "assuming the century to be 1900 instead of 2000" does not apply. The only century assumptions are going on in the heads of observers or downstream applications. This is typically the situation with systems that do real-time data acquisition - as far as they are concerned it really is the year 99 and Trajan is head honcho of the Roman Empire ;-)

When all is said and done, next year a lot of people will be trying to figure out how the Y2K apocalypse was averted. Windowing will be responsible for a big chunk of that explanation.

-- Computer Pro (, May 24, 1999.
