1999 and date standards - EVERYONE READ !!

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

I was wondering if someone could explain something to me? It is acknowledged we WILL have problems with April 1, July 1, April 6, April 9, August 22 (GPS rollover), October 1, etc. BUT IF there is NO agreed-upon date standard, HOW will the computer know 01/04/99 from 04/01/99 (I used dd/mm/yy in the FIRST example) if a computer has to interface with a system that uses mm/dd/yy? WHICH date will be used? And if BOTH systems use dd/mm/yy instead of mm/dd/yy, what is to stop the computer from thinking it is in fiscal 99-00 IN JANUARY INSTEAD OF APRIL? Or assuming that July 7 is actually JANUARY and switching a mainframe over to fiscal 99-00? I think this is a silent killer that I haven't heard discussed. I would like an answer ASAP, because if what I THINK is possible, we may NOT have a YEAR to the year 2000. Our event may be IN JANUARY.
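The ambiguity the question raises can be sketched directly (illustrative modern Python; the two formats come from the question above, everything else is invented for demonstration):

```python
# The same six digits parse to two different dates depending on which
# convention the reader assumes -- nothing in the data itself says which.
from datetime import datetime

raw = "01/04/99"
as_dd_mm_yy = datetime.strptime(raw, "%d/%m/%y")  # 1 April 1999
as_mm_dd_yy = datetime.strptime(raw, "%m/%d/%y")  # 4 January 1999
```

Both parses succeed without complaint, which is exactly the problem: the error surfaces only later, in whatever logic consumes the wrong date.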

Sincerely,

Sean Henry

-- sean henry (venturer@interlog.com), December 28, 1998

Answers

Sean,

You are quite correct about big-time failures happening in 99. YES, they begin in January. See an article written by experts in Britain listed on garynorth.com under the category 'Too Late'. Look for a recent posting, "April 1999 - a foretaste of y2k."

-- BBrown (peace2u@bellatlantic.net), December 28, 1998.


You agree on the format beforehand. Otherwise it doesn't work. I don't really think this is an issue - we have been exchanging data for a long time now - no one is just going to declare their own 'standard'. At least not if they are sane.

-- Paul Davis (davisp1953@yahoo.com), December 28, 1998.

Sean, the existing computer systems and networks are in fact using many date formats, which have been "negotiated" over time. System A may exchange date data with System B in the form "yy-mm-dd", but to exchange data with System C, the format used is perhaps "mm/dd/yy". (And note that these "systems" could actually be different applications running on the same computer ... or completely separate systems that are nodes on a network.) The many different formats, however they got there over the years, nevertheless work, because any two systems that need to exchange data understand the format needed. (And in many instances, this is accomplished by special-purpose "translator" computer programs written for the sole purpose of converting one format to another [e.g., "yy-mm-dd" to "mm/dd/yy" and vice versa].)
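A minimal sketch of such a "translator" (illustrative Python; the two formats come from the example above, the function names are invented):

```python
# Bridge between System A's "yy-mm-dd" records and System C's "mm/dd/yy".
def a_to_c(date_a):
    """'99-12-31' (yy-mm-dd) -> '12/31/99' (mm/dd/yy)."""
    yy, mm, dd = date_a.split("-")
    return f"{mm}/{dd}/{yy}"

def c_to_a(date_c):
    """'12/31/99' (mm/dd/yy) -> '99-12-31' (yy-mm-dd)."""
    mm, dd, yy = date_c.split("/")
    return f"{yy}-{mm}-{dd}"
```

Every such translator is one more place where a Y2K field-width change has to be hunted down and kept in sync with both endpoints.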

Unfortunately, this adds yet another huge wrinkle to the Y2K problem, since in addition to having to go over all the software that has been doing all this, it is not completely clear-cut what the ramifications are of trying to "fix" the year to account for 2000. Simply using "yyyy-mm-dd" and "mm/dd/yyyy" may not cut it, because the addition of the two extra digits for the year may "bump" other data over by two places, causing problems that need to be looked for and accounted for. Attempting to stick with 2 digits to represent the year, but now using some other scheme so that it is "understood" that "99" means 1999 but "00" means 2000, gets messy, because you have to make sure that all the systems involved understand: 1) that this new scheme is now in effect; and 2) where the "pivot point" -- the year dividing the 20th and 21st centuries -- is, which can often vary from system to system. (And then there are all those "special flag" dates that some systems may have problems with, because the date field -- regardless of its format -- has been misused to indicate that some special action needs to be performed [e.g., the classic end-of-file condition].)
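The two-digit "windowing" scheme described above can be sketched as follows (the pivot of 50 is an arbitrary choice for illustration; the paragraph's point is exactly that different systems may pick different pivots):

```python
PIVOT = 50  # illustrative pivot year, not from any particular system

def window_year(yy):
    """Expand a two-digit year: below the pivot -> 2000s, else 1900s."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

# window_year(99) -> 1999, window_year(0) -> 2000
```

The scheme works only as long as every system touching the data agrees on the pivot; a record written under one pivot and read under another silently lands in the wrong century.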

Hope this helps....

-- Jack (jsprat@eld.net), December 28, 1998.

# # # 19981228

Paul Davis:

Fact: Everyone _is_ declaring their own "standard" for windowing! That's _why_ that 83-year-old fell through the (case in point) payroll system application. That "error" was caught. That's NOT an "IF"; it's a when and where!
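The "83-year-old" failure reads like a pivot mismatch, which can be illustrated with invented numbers (nothing here comes from the actual payroll system in question):

```python
# The same two-digit birth year expands differently depending on which
# windowing pivot a given system happened to pick.
def window_year(yy, pivot):
    """Expand a two-digit year yy: below the pivot -> 2000s, else 1900s."""
    return 2000 + yy if yy < pivot else 1900 + yy

birth = 15                              # employee born in 1915
year_a = window_year(birth, pivot=10)   # 1915: age 83, correct
year_b = window_year(birth, pivot=30)   # 2015: "not born yet" -- record fails
```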

How many of these cases need to go [ _undetected/_detected too late ] before a system goes into useless mode?

_That, my friend, is _the only UNKNOWN_ before society.

... "Get It?!" ... The, "Ah ha!" factor ...

Regards, Bob Mangus # # #

-- Robert Mangus (rmangus@mail.netquest.com), December 28, 1998.


RM - you're changing the subject on me. Data transfers between systems rely on agreeing on what the data format is beforehand. Now, whether or not the receiving system can use the data is determined by the software running on that system - which is another problem entirely. As the Pharaoh said to the harem master - one thing at a time.

-- Paul Davis (davisp1953@yahoo.com), December 28, 1998.


To add to the "conventional wisdom" stew of conflicting assumptions - the continental European convention reverses the mm/dd/yy and dd/mm/yy orderings - so even when people agree on a standard between computers, the reports generated from an updated year 2000 program sometimes need to be reverse-translated back into a different form.

It's not difficult - just tedious and time-consuming, and time is exactly what isn't available now. So errors between the current program and the "translators" are very likely as each program gets upgraded to "integrate" it back into an existing commercial transaction cycle.

-- Robert A. Cook, P.E. (Kennesaw GA) (cook.r@csaatl.com), December 28, 1998.


I have posted several times that the ANSI standard is YYYY-MM-DD (and, by extension, YYYY Month Day). But most of you still write things like 1/1/00 and 12/31/99. If you can't get your act together and use an unambiguous standard, how do you expect people in Europe, Asia, and elsewhere to start using one?
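For what it's worth, the YYYY-MM-DD form has a practical side benefit worth sketching (illustrative Python; this demonstrates a property of the format itself, not any system discussed in the thread): with a four-digit year first, plain-text ordering agrees with chronological ordering.

```python
from datetime import date

d1 = date(1999, 12, 31).isoformat()   # '1999-12-31'
d2 = date(2000, 1, 1).isoformat()     # '2000-01-01'
assert d1 < d2  # lexical string order matches chronological order
```

The two-digit forms fail this test at the century boundary: as strings, "00" sorts before "99" even though 2000 comes after 1999.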

-- ToldYouSo (beating@myhead.com), December 28, 1998.

ToldYouSo: If it's any consolation, I expect that Y2K will essentially create a complete and total end of the world as we know it, removing the need for any computeresque standards for mucho years to come. Hope this helps....

-- Jack (jsprat@eld.net), December 28, 1998.

Your question opens up the world of interfaces -- the way one company's computers talk to each other. I used to code bank transactions -- ATM-to-switch-to-VISA-to-switch-to-ATM types of communications. I can tell you that considerable sweat goes into making sure that the outgoing date fields line up with what's expected on the other side. Otherwise, you can bring the other side's machine to a halt.

Now you understand the problem about "contamination". Compare the following field strings ('/' is just for visual separation, not part of the fields):

MM/DD/YY/$56.31/Mr.George.Stephanopoulos/...
MM/DD/YYYY/$56.31/Mr.George.Stephanopoulos/...

If "we" send a four-digit year field and "they" are not ready, it doesn't matter whether our date fields are arranged properly: they'll take the first two digits of our year as their whole year field (so they'll read "1999" as year "19") and treat the last two digits as the start of their next field. Likewise, if they send us two digits and we're looking for four, we think the "$5" is part of the date. Either way, the data will shut the transaction down (best case), shut the machine down (next-worst case), or be stored in a database (very, very bad). Enough bad DB entries and you've got a serious problem.
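The field shift described above can be sketched with an invented fixed-width layout (the field widths and helper names here are hypothetical, chosen only to reproduce the misalignment, not taken from any real bank format):

```python
def make_record(mm, dd, year, amount):
    """Sender side: fixed-width MMDDYYYY date plus an 8-char amount field."""
    return f"{mm:02d}{dd:02d}{year:04d}{amount:>8}"

def parse_record_old(record):
    """Receiver side: still slices assuming a 2-digit year (MMDDYY)."""
    return {
        "month": record[0:2],
        "day": record[2:4],
        "year": record[4:6],     # reads only the "19" of "1999"...
        "amount": record[6:14],  # ...so the stray "99" leads the amount
    }

parsed = parse_record_old(make_record(12, 31, 1999, "$56.31"))
# parsed["year"] == "19"; parsed["amount"] starts with "99"
```

Nothing raises an error: the receiver gets a syntactically valid record with garbage in every field after the date, which is why this kind of corruption can reach a database before anyone notices.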

Long answer to short problem (-:

-- D.G. (wagoods@aol.com), December 29, 1998.


I RECENTLY SENT THE FOLLOWING MESSAGE

Subject: Y2k international DATE FORMAT standard Date: Thu, 14 Jan 1999 16:39:07 -0500

TO A NUMBER OF PEOPLE.

SINCE I CANNOT "SEND PAGE" HERE IS THE ADDRESS

http://www.iso.ch/markete/moreend.htm

-- Steven Belsky (balstarr@idt.net), January 15, 1999.



"Your question opens up the world of interfaces -- the way one company's computers talk to each other. I used to code bank transactions -- ATM-to-switch-to-VISA-to-switch-to-ATM types of communications. I can tell you that considerable sweat goes into making sure that the outgoing date fields line up with what's expected on the other side. Otherwise, you can bring the other side's machine to a halt."

D.G. - I used to work for VISA and am very conversant with how the data interfaces work. VISA and MC claim that they are compliant. As you know, VISA interfaces (switches) with over 21,000 banks worldwide, with many, many more endpoints. It has long been my belief that VISA cannot claim compliance unless they have set up a system to weed out bad or corrupt data from non-compliant banks. Otherwise VISA will just be perpetuating the problem that you have described, leading to systemic cross-contamination of bad data in the financial community.

I realise that I am greatly simplifying the problem. What is your take on this - i.e., firewalls, bank isolation, etc.?

Do you have any experience in this field or know what banks are planning to do to avoid this calamity?

Thanks,

Andy

Two digits. One mechanism. The smallest mistake.

"The conveniences and comforts of humanity in general will be linked up by one mechanism, which will produce comforts and conveniences beyond human imagination. But the smallest mistake will bring the whole mechanism to a certain collapse. In this way the end of the world will be brought about."

Pir-o-Murshid Inayat Khan, 1922 (Sufi Prophet)

-- Andy (2000EOD@prodigy.net), January 15, 1999.

