DON'T Miss This Post!!! Best Programmer Explanation Yet : LUSENET : TimeBomb 2000 (Y2000) : One Thread

I don't know the person who originally posted this; I don't even know if they are really a programmer. I do know that any of you who are "at the end of your rope" will find this comforting: that you are doing the right thing. Some insights from a programmer: "Thirty-three years ago, I was a young woman right out of high school. I had a newborn baby to support, and no husband. I could not go to college because of the child, and much to my utter shock and surprise, I had just learned that secretaries make only slightly more money than babysitters. I had to find a good job, so I could support myself and my child. I was seventeen years old.

At that time, the profession of "computer programmer" was a very new thing. It was so new, no one had thought to sex-stereotype it yet. Programmers could be anyone, as long as the person had a high math aptitude and an ability to hold the mind focused on the excruciating logic of assembly language programming. I had a high math aptitude, and I found that I was able to do assembly language programming. More importantly, I found I could get paid a decent wage by doing assembly language programming, and so software became my career for the next thirty years. It was one woman's alternative to becoming a welfare mother.

Now "assembler", as it was called, was one step above machine language programming. Computers are just machines, after all, and all they do is read the logic of a pattern of off and on spots in an array of electric circuits. In the mid-1960's, we programmed for the most part in "assembler", which was a language that grouped ones and zeroes, so it would be easier for us, as people, to understand.

The language was primitive. It had just a few commands. GET. TAB. GOTO. MOVE. OPEN. That kind of thing. If you were programming a computer to make a pot of coffee in Assembler, you might say something like:

IF coffee = Maxwell House AND Maxwell House = Yes OPEN coffee file. ELSE GOTO End. GET coffee pot. MOVE lid eighteen spaces right. GOTO lift arm file. Etc. Etc.

It would take a couple hundred lines of code before you got a pot of coffee out of it.

Now obviously, this was too tedious a language and a procedure to do the kind of programming tasks we came to ask of our programmers, and so "Higher Order Languages" like Fortran and Cobol and Basic and APL and PL1 were developed. These Higher Order Languages did the same thing to Assembler that Assembler did to Machine Language. That is, they grouped together lines of code so that people would be better able to get their heads around what was going on.

So in the Higher Order Language, to make a pot of coffee, you might just say:

OPEN Maxwell House / Folgers / Starbucks MAKE coffee.

And two lines of High Level code would replace two hundred lines of Assembler code, which replaced two thousand lines of Machine Language code.

This was a great advance for the programmers, and allowed people with SAT scores under 700 in Math to become programmers. By the late 1970's, programming had become a job that many people could do, not just a few Nerd Blossoms with Slide Rules.

The market economy went Bananas. Now computers and programming could be used in every industry, in every function, in every business. All the tedious, repetitive tasks could be done by machines, who would do them exactly the same way, over and over again.

All the manual tasks -- LIKE RUNNING THE NATIONAL ELECTRIC POWER GRID -- were taken over by automated, embedded computer chips.

These chips operated on the same logic that encompassed the hundreds of lines of assembler code. In just thirty-three years, computers took over all the manual functions of every aspect of the industrial society.

But there wasn't one day when any one company said: Oh, now I'm going to automate my whole operation. No, no. It was a more subtle thing. It was a piece by piece implementation. Today I'll put the Accounts Receivable up. Next year I'll automate the Customer Service department. The year after that I'll put the batch manufacturing on the network. Then when that's done I'm going to add in all the Inventory Control functions and the Supplier Network.

Nobody's system got done all at one time. Each system had some old code and some new code, all mixed in together, like a plate of spaghetti. Programmers came and programmers went. And when a programmer went, the knowledge of what he was thinking in that twenty-two year old Noodle of his, went with him.

The languages and the technology progressed and evolved so quickly --- EVOLVED. That's an interesting choice of a word. Yes. I would have to say the computer networks EVOLVED.

They didn't spring from the womb fully formed.

And because of this, they didn't exactly have a Single Creator.

They were Products of the Committee Process.

And in this Committee-driven Evolution, each new group of young, enthusiastic programmers picked up where the last group left off. Programming languages obsoleted. The programmers who knew them were only a few people. The programmers who followed with the new languages were many more people. The new programmers used the new languages -- which were conglomerations of code that had been built in the old languages. Rollups of the hundreds of lines of assembler language. Soon none of the new programmers knew the underlying code of the old languages, and the logic of the old languages became embedded inside the instruction set and the microchips of the new languages.

Where nobody knew what it was.

And nobody knew how it would act, if it was ever taken out of context.

Thirty-three years ago, when all the new programmers were in their twenties, and computers were a new and exciting toy -- none of us ever believed the year Two Thousand was ever going to come. What twenty-two year old ever imagines she'll one day be fifty five? Bit space was at a premium. Nobody thought they were writing code that would live thirty three years. We knew our solution would not survive the year 2000, but we never imagined it would need to. We thought someone would come up with a better answer before then. I remember the day we sat around the coffee machine laughing at what a stupid solution it was. But who could imagine at that time that computers would take away our ability to live in concert with the rules of Nature? Who would imagine that the national power grid would be run by it, and there would be no manual backup? Who would envision that the people would forget how to grow their food from the land and extract their seeds from the vegetables? Who would believe that our food would be hybrid and we would depend on new supplies in a complex, computer-driven distribution system that could break with the SNAP of two digits?

And who can believe that the Universe will allow its Humanity to survive with that attitude?

We thought somebody else would come up with a better solution, and our code would be obsoleted. In time. We never thought it would live thirty three years.

And indeed, the code didn't live thirty three years -- but the follow-on languages and microchips INCORPORATED the logic of the original assembler. It had to. Computers are driven in their guts by an Array Table. Array Tables are driven by clocks. As the Universe is driven by the clock of the wheel of the year, so also computers are driven by their clock.

The computers needed a means to calculate position. They needed a measure of linear progress. We were young. We were brilliant. We knew the DATE would make a fine progress array table. So we built logic that divides by it.

Clock logic runs computers.

Embedded somewhere in the millions of lines of code in all the computers all over the world, running all the critical functions of our civilized infrastructure -- is system architecture logic code, hidden UNDER the language instructions -- that divides by the date. This is not something just I did -- this is how it was done. We had very little to work with, very few tools. But one of our most powerful tools was a CLOCK. We used Date Calculations to drive the array tables. We subtracted dates, we added dates, we divided by dates. The application doesn't need to be date dependent to have this problem. When you look at your applications, and you say, "Oh, we don't use the date in that function," that is very nice. But it's not the CORE issue. The operating system, in underlying system code, is calling the date for logic arrays. Particularly when you think of Control Systems, you should think of CLOCK as the driving array.
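[Editor's sketch: a concrete, hedged illustration of what "dividing by dates" can do at the rollover, assuming two-digit year storage. This is a modern-language toy, not the author's assembler; the function, its name, and its numbers are invented for illustration only.]

```python
def average_per_year(total, start_yy, current_yy):
    """Average some quantity over the years elapsed between two
    two-digit years, the way space-starved 1960s code stored them."""
    years = current_yy - start_yy      # the "date calculation"
    return total / years               # the division the author means

# Through 1999 the arithmetic behaves:
print(average_per_year(300, 97, 99))   # 150.0 (two years elapsed)

# In 2000, the year is stored as 00. A record opened in 1999 now
# shows -99 years elapsed; one opened in "00" itself divides by zero.
print(average_per_year(300, 99, 0))    # negative nonsense
try:
    average_per_year(300, 0, 0)
except ZeroDivisionError:
    print("divide by zero")
```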

The power generators driven from mainframe technology were built this way. By design, the computers cannot go farther than 99.

CLOCKS are THE critical component of Computer Logic. Just as they are the driver of the Universe, and the Wheel of the Year. And computer logic cannot be written any other way.

And because of the evolution of computer languages and logic, no one knows where the offending code in a massive multi-million line system is, or what it is doing. In fact, few people could ever find it. It has become like salt, dispersed throughout the system, impossible to identify. More billions of dollars have been spent by corporate America trying to fix this problem than was spent on the Vietnam War.

But the problem is systemic, a part of the inherent logic of calculation, and on December Thirty-First, Nineteen Ninety Nine plus one second -- millions of computers are going to take the one action forbidden to computers for all time.

They are going to divide by Zero.

Which will make them go unstable. "Does not compute", said DATA, and the Captain could only unplug him and shake his head.

And even after they fail, few will be able to find, identify, and correct the failure. New systems will have to be put together. And you know what? When you put together the new systems with four characters in for the date -- you will cause this same problem to show up in the year 10,000. For to use the date as a system driver is a requirement for basic logic. When? When? When? When? The computer has to know. Time is Position. Space is Time. WHEN must be known.

Believe me, the year 10,000 is no farther away in our minds today than the year 2,000 was in the minds of the young system programmers of 1968. We were ALL under thirty years old. We were ALMOST all under twenty-five. Younger than Monica. We thought the Sun would burn out before the year 2,000 came. And who in Corporate Management would take the time to understand the minute details of our programming solutions?

It is not a problem that lends itself to be corrected before it happens. But we have become so dependent, in all aspects of our society, on the computing technology, that a system failure like this will destroy us. It is not just that parts of the nation will be out of electricity for a while.

It is that the entire national power grid will come down.

Our grocery stores only carry a few days worth of food.

Our water supply operates from computer systems.

Our businesses will close.

Our banks will shut down.

Our people will panic, and loot, and turn ugly.

You know, it only took four hours for a boat to come to save the people who fell into the water when the Titanic sank. But once the people fall into the water,

And in the end, the people will demand a solution from a government that has no ability to solve the problem. "This is a function to be handled by private industry", the government said before the disaster. Then the disaster came, and the government did what a government does.

It presided over the holocaust, and kept the people contained.

A year before the event, simulated tests that turned the computer date to 1/1/00 in a water supply plant caused the system to inject toxic levels of fluoride into the drinking water. These types of things needed to be looked into by the U.S. Senate. But our leaders chose instead to hold a trial about the President's Sex Life. (Don'tsayitsaboutsexitsnotaboutsex) The disaster was one our leaders should have gotten us ready for. But you know they did not, just as you know the story I'm telling here is Truth. Are the people who Fiddled while Rome burned in the Impeachment trial the ones you expect to turn to when no water comes out of your tap and the lights go out?

In fact, all we can do is follow the bridegroom's advice, and get some extra oil for our lamps.

It wasn't good that we let our whole society become dependent on technology this way. The electric company doesn't know HOW to go back to manual methods now. All the people who knew how to operate the plants manually have lost the knowledge. The equipment has been dismantled. There is no option to go back. We shouldn't have lost our knowledge of how to do things without computers. It was not a good thing for Humanity. It's something that needed to be changed. And if we couldn't see it as a problem, perhaps the Universe received its command to fix it for us."

-- Desertj98 (, June 24, 1999.



Actually, I understand and agree with much (not all) of this essay. A GLARING error: "The year 10,000 is just as far away as the year 2000 was for us", or something to that effect.

The computer age started in (roughly) 1947, so 2000 is 53 years later. 10,000 is, of course, 8,000 YEARS later. Our "computerized world" has gone through about 4 generations of languages in 53 years.

Will we even HAVE electronic computers in 10,000? Will these machines be as LITERAL as today's? We can't answer those questions. But it's a pretty sure bet that COBOL will be dead, along with ANY vestiges of our age.... (Ancient Egypt wasn't 8,000 years ago, was it?)

-- Dennis (, June 24, 1999.

Alright, now where are the Pollys to tell us that IT CAN be fixed, that THERE IS time and that computer logic IS NOT dependent on date?

Where are the Pollys to tell us that the old languages were thrown out and they don't exist anymore. That we now have built compliant languages?

Where are they to tell us that THERE ARE NO embedded chips with problems described by the poster?

Where are they to explain to us that the power grid CANNOT FAIL! It has been built by some of the most intelligent people on Earth; it CANNOT be compromised...

Where are they to calm my fears that this is just another DOOMER trying to look educated? That whoever wrote this just doesn't GET IT about computers.

I'm waiting...

-- (Pollys@come.forward), June 24, 1999.


I'd quibble with the assembly language example. What she shows is OBVIOUSLY designed for non-techie consumption - which is ok in this context. Actual Assembly language would be something like:

PUSH.B 0x43A0
MOVE.B 0x43A0,0x015A
TST.B 0x015A,"M"
BRNZ error23
TST.B 0x015B,"A"
BRNZ error23
TST.B 0x015C,"X"

etc etc.

I don't know about you, but that makes me *more* nervous!

Jolly used to do ASM, too.

PS: (The above ASM is based on Motorola 68000 - the mainframe crap is *much* less sophisticated!)

-- Jollyprez (, June 24, 1999.

Ok. I remember taking BAL (Basic Assembler Language) in community college (AKA Tic-Tac-Tech). I should say, I remember being in the class. I sure can't code in BAL. And yes, computers have to have clocks to meter their processes. But the clocks do not have to be date-aware. They just generate what we call "ticks". A typical clock tick is 1/100th of a second. Doesn't have to be. So this clock sends out a tick, and the processor performs whatever task it was to perform and checks its status so it knows what to do at the beginning of the next tick. The details (mostly left out, as you can see) depend on the OS/firmware. The linchpin of this whole post seems to be that every computer has a clock, so they will wave bye-bye at New Year's. I believe that unless your chip/OS/application is date-aware, December 31st, 1999 + 1 second is just another 100 ticks to the computer's clock. In case you haven't figured this out by reading the post, I am no expert in BAL or any other assembler language. But this is the perception I have from all the reading I've done in the IT industry. That's 13 or so years in midrange computers as a UNIX system administrator. Read the internals book. Worked with Host Heads.
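[Editor's sketch: the tick mechanism described above, in a few lines of modern code. This is an illustrative toy, not any real OS's scheduler; the class and all names are invented.]

```python
class TickClock:
    """A clock that knows nothing about dates: it only counts ticks."""
    def __init__(self, ticks_per_second=100):   # 1 tick = 1/100th second
        self.ticks_per_second = ticks_per_second
        self.ticks = 0

    def tick(self):
        self.ticks += 1
        return self.ticks

clock = TickClock()
for _ in range(100 * 60):    # one simulated minute at 100 ticks/second
    clock.tick()

# Midnight, December 31st 1999 is not special to this counter;
# the "+1 second" is just another 100 ticks.
print(clock.ticks)           # 6000
```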


1) IMHO the above does not accurately describe the code end of the problem.

2) We are too dependent as a society on the marvelous web of technology.

3) Buy a year supply of food and seeds and hand tools and a whole load of books on how to use it all. Prepare for the worst. Whatever it happens to be.

And keep your...

-- eyes_open (, June 24, 1999.


This discussion ties in precisely with what Bruce Beach was trying to say about the internal clock problem. He said that there are hidden clocks buried in the chips that are not even known to the later programmers who adopt these systems into their own programs. Efforts were made to discredit his analysis, but he was right. This is also a primary reason why the Internal Revenue Service is having such a hard time updating their system, with multiple failures before Y2K even entered the picture. How many other major software systems will have similar problems? Programmers should review Mr. Beach's analysis again before things shut down, so they will have a better idea of the basic problems that are causing the system to shut down. Some programmers who started, say, 15 years ago may not even be aware of what is in the old code that they included in their programs.

-- Tom (Tom@notstupid.gom), June 24, 1999.

Desertj98, congratulations for your excellent post.

I know perfectly well how bad Y2K could get to be, so don't get me wrong. Y2K will definitely mean a very important change in our lives, maybe a drastic change. Possibly a terrible change, if we survive.

Now then, I have a technical question to ask:

When you say that on January 1, 2000, computers and computer-like systems "will divide by zero", exactly what do you mean?

Because it's understandable that any algorithm which contains a date calculation in the denominator could eventually turn to zero or a negative amount, thus producing erroneous results. But my question arises from different comments I've heard from experienced programmers such as yourself who have indicated that it's impossible for that to happen because "any knowledgeable pr

-- George (, June 24, 1999.

(OT: for some reason, I find it insanely funny when a post gets truncated for no apparent reason. It's like, what, did the poster just fall over dead, shot or something?)

-- lisa (, June 24, 1999.

Lisa, it's those ravenous cybergnomes! They haven't been fed enough xxxxxxxxs recently and so are foraging amongst diverse posts.

To outwit the computer guillotine fiends we decided to feed those cybergnomes sacrificial xxxxs so our message wouldn't get chomped and sentences gobbled mid-stream.

Sometimes they eat lots, sometimes just a few xxxs. Who knows the mysteries of the cybergnomes?

Original thread where Desertj98 made that startling debut:

At the end of my rope

xxxxxxxxxxx xxxxxxxxxxx xxxxxxxx

-- Ashton & Leska in Cascadia (, June 24, 1999.

Okay...I'll bite on this one.

I began my career in programming using IBM Assembler, and it looked NOTHING like the code mentioned here. I'm also finding it very hard to find a reason why a date would be involved in division. I haven't seen that in BAL, Fortran, Assembler, or any other language I've used.

The poster also seems to assume that none of this code has been remediated. I've remediated BAL programs and converted BAL programs to COBOL II while remediating dates.

Contrary to the author's opinion, there were many of us snot-nosed kids who pointed out that programs would fail in 2000. There simply wasn't money allocated to remediate them at the time. The emphasis was on making money NOW, not looking ahead 40 years.

Also contrary to the author's opinion, there are still lots of folks around who still know these lower-level languages. I taught Assembler Language and Advanced Assembler Language at a local college for several years, so even if *I* die, those kids have the knowledge to continue. By the time they finished the advanced class, they could go back and forth between machine language and assembler language. Besides those kids, each time I've seen a contract position open up for an Assembler language programmer (say in the past 5 years), I was told that the position had already been filled.

Okay...on to the next point. NONE of these old programs are hard to figure out if you know the language. Even spaghetti-code can easily be unraveled. They're good for a laugh on occasion, as in "What WERE you smoking when you wrote this?", but were easy enough to decipher despite the multiple hands and different styles of folks who worked on them in the past.

Dennis: I don't see COBOL dying anytime soon (at least not in MY lifetime.) Conversions from COBOL to COBOL II or IBM assembler to COBOL II (where reference modification can easily simulate BAL) are always being done. Prophecies of COBOL dying out have been around for at least 10 years now. I remember when CICS was considered dead, yet it bounced back before I even heard the rumor.

So can someone provide an example wherein one would divide dates?


-- Anita (, June 24, 1999.

I saw the above posted on the following: Sorry, can't make my linker link these days.

-- smfdoc (, June 24, 1999.

[On the off-chance that these questions arise from sincere misunderstanding rather than deliberate distortion, I'll take a crack at giving reasonable answers:]

Alright, now where are the Pollys to tell us that IT CAN be fixed, that THERE IS time and that computer logic IS NOT dependent on date?

[Let's ignore the belligerent tone of voice, and deal with computer logic. That logic is dependent on rules built into the hardware, not on dates per se. However, dates as data can control the logical flow of a program, and can appear as output or input data. Indeed, almost any data can be used to control logical flow. This means the computer makes decisions based on these data (if date A is less than date B, do X, otherwise do Y). This is the crux of the y2k problem.]
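[Editor's sketch: a minimal illustration of that decision rule, assuming two-digit year storage. The function name and the example values are invented, not taken from any real system.]

```python
def is_earlier(date_a_yy, date_b_yy):
    """'If date A is less than date B, do X, otherwise do Y' --
    with two-digit years, the comparison itself is the weak point."""
    return date_a_yy < date_b_yy

print(is_earlier(97, 99))   # True: 1997 is before 1999, as intended
print(is_earlier(99, 0))    # False: but 1999 is NOT before "00" (2000)
```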

Where are the Pollys to tell us that the old languages were thrown out and they don't exist anymore. That we now have built compliant languages?

[Some older languages never became popular and were rarely used, but doubtless some code still exists (and is used) that was originally written in every language ever invented. But this affects the y2k issue only insofar as these programs mishandle date calculations. In any case, computers *must* make decisions based on data to perform effectively (that's what computers do). ANY language that can represent a date in any way (which is all of them) can mishandle it. The notion of a compliant 'language' is nonsense. You can even write love poems in German!]

Where are they to tell us that THERE ARE NO embedded chips with problems described by the poster?

[To my knowledge, nobody has ever said there are no embedded chips with problems. Nobody has even said that this number is particularly small. However, after a great deal of detailed investigation, we've learned that both the absolute number of such embedded errors, and their likely impacts, are (fortunately) far less than was originally feared. Nonetheless, the probability that we'll suffer serious problems as a result of the embedded errors we miss approaches unity. The *number* of such problems might be fairly small, but that won't help you if you're in the wrong place at the wrong time. It's like the risk of being killed in an airplane crash -- very small, but very real (this *does* happen). Few (if any) real risks can ever be reduced to zero.]

Where are they to explain to us that the power grid CANNOT FAIL! It has been built by some of the most intelligent people on Earth; it CANNOT be compromised...

[Of course the grid can fail. The question isn't whether it *can* fail, but whether it *will* fail. And again, investigation and remediation have indicated that the probability of such a failure is very small. It can never be zero. The probability of local y2k-induced outages seems fairly high (which is something quite different from a failed grid, of course).]

Where are they to calm my fears that this is just another DOOMER trying to look educated? That whoever wrote this just doesn't GET IT about computers.

[Try to draw the distinction between possibly and likely. Yes, you can draw to an inside straight and you *might* get the card you're looking for. But what are the odds? Simply pointing out that you might draw the right card is a long way indeed from guaranteeing that you'll draw the right card. And spending all your time dwelling on what *can* happen gives a very distorted vision of what is most likely to happen. Very few things that can happen actually do happen.]

I'm waiting...

[Not anymore]

-- Flint (, June 24, 1999.

Thanks, smfdoc. Here's your link:

herbal lamp oil

I'd sure hate to think this whole thing was simply to encourage folks to buy a particular brand of lamp oil.


-- Anita (, June 24, 1999.


I wouldn't take your advice if you wrote it down on the back of a thousand dollar bill. Why? Because you've been shown to be pathetically asleep over and over.

Quote: "[Of course the grid can fail. The question isn't whether it *can* fail, but whether it *will* fail. And again, investigation and remediation have indicated that the probability of such a failure is very small. It can never be zero. The probability of local y2k-induced outages seems fairly high (which is something quite different from a failed grid, of course).]"

That is the answer I was rhetorically asking for. Of course you can find the light at the end of any tunnel and the silver lining on any gray cloud, which is what makes you so pathetically inept at this topic. "Can fail,... will fail"...what baloney. What a pleasant way of saying you'll never admit to being someone who can't see the future.

You really didn't have to answer my post, in fact I'd rather you wouldn't have done so. There was supposed to be a heavy dose of sarcasm on it and obviously you were too soundly asleep to realize that.

-- (pollys@need not.apply again), June 24, 1999.


As I said in my introductory sentence, I was taking the chance that you were simply misinformed, rather than a belligerent idiot looking stupider than usual. OK, I guessed wrong. You can go back to ranting now, and I won't interfere with your fantasies any more.

-- Flint (, June 24, 1999.

Again (not wanting to start an argument), you missed a critical part of my post while you were napping. I asked you not to reply because it's not necessary. I already know how you feel about people like me; I don't need you to remind me, or to show how dedicated you've become to proving the world wrong, that one man in fact CAN be an island. Thanks for the tip, Sir Snoozalot, but it wasn't required. I already got your point the first thousand times you said something like... "go back to your fantasies".

-- (pollys@have a . short memory), June 24, 1999.

As bad as I hate to,(grin) I've got to go with Anita on this one. That's obviously a different brand of Assembler than I'm using currently. This post just isn't very accurate.

-- RB (R@AR.ST), June 24, 1999.

Agree with Anita and RB

This post is essentially rubbish; the purported Assembler code is not Assembler, and not even anything like Assembler, which I have written for OS's ranging from Apple II's through OS/390. It looks more like some bastardized BASIC.

On the clock issue, strangely enough, IBM mainframe clocks don't know anything about times or dates; they are more like EXTREMELY accurate metronomes. They simply tick, once every 244 picoseconds.

(A picosecond is a trillionth of a second (1 * 10 raised to the power minus 12); this 244-picosecond interval is called a clock unit.)

Every time they tick 1 is added to the low order bit of a 64 bit value which represents the number of clock units which have elapsed since 00:00 am, 1 January 1900, UTC (Universal Time, Coordinated).

Therefore, in terms of the mainframe clock, the rollover to 21st century just represents a 64 bit value 1 greater than the previous value, there's no magic year value in it that suddenly goes to zero.

This is not to say that there are no year 2000 problems of course, but they are caused by the way we interpret the clock within application programs, they are not an artifact of the clock itself.
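[Editor's sketch: a rough modern-language rendering of the arithmetic Ron describes. This simplifies the real TOD clock -- it ignores leap seconds and implementation subtleties -- and the helper name is invented for illustration.]

```python
from datetime import datetime, timezone

EPOCH = datetime(1900, 1, 1, tzinfo=timezone.utc)

def tod_word(when):
    """64-bit TOD-style value: with the high-order bit as bit 0,
    bit 51 increments once per microsecond, so a microsecond count
    shifted left 12 bits fills bits 0-51 (bits 52-63 stay zero)."""
    micros = int((when - EPOCH).total_seconds()) * 1_000_000
    return micros << 12

before = tod_word(datetime(1999, 12, 31, 23, 59, 59, tzinfo=timezone.utc))
after  = tod_word(datetime(2000, 1, 1, 0, 0, 0, tzinfo=timezone.utc))

# The century rollover is just one more second's worth of clock units;
# no field in the value suddenly goes to zero.
print(after - before == 1_000_000 << 12)   # True
```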

I could go on and on about this, but I really don't want people to think that this really is "the best programmer explanation yet". It isn't!


-- Ron Davis (, June 24, 1999.

Ohhhh, sorry. I believe they're all over at Rick Cowles' forum for a little 'pow-wow' with Dan the Power Man. Engineer has some serious questions he wants answered from the Bar Association. Some of those jokers had better start giving some serious discussion to the Nuclear Power Plant aspect of this problem! I'm feeling my blood pressure rising again.......where did that person who's in love with their air-conditioner go?

-- Will continue (, June 24, 1999.

My last post was in reply to where some of our Pollys had gone...forgot to hit the refresh button before responding.......zooom! I would really like to expand on the embedded systems issue.....I'm concerned we're being mooned in that department.

-- Will continue (, June 24, 1999.

To All: I said I didn't know the original poster. I said I didn't know if they were even a programmer. And I put quotation marks around the quote. This was a clue that I was quoting someone, not making an original statement. I did, however, say that this post would console (read: justify) those who have prepared and were "at the end of their rope"; to this end the post was successful. The whole point of discussion groups is to put up controversial opinions and explore them. You folks are doing that with a minimum of flaming, thank you. Dennis, your objections are valid but a little picky. Tom, you have the right idea; don't let the trees spoil your view of the forest. George, the original author was unknown to me. I am a non-geek; to me her progression made sense even if you can blow away parts of her analysis. Anita, you certainly know more about programming than me, but wasn't her point that it could have been fixed but wasn't, and now probably can't be? Flint, I'm enclosing the last CSY2K poll; these programmers have serious concerns, read for yourself:

>Below are the previous results followed by the survey itself. For those new to the survey, there is no cutoff of the voting at 5 on the Edwards survey. You may vote higher than 5, as the questions are purposely subjective. Commentary is welcome. Non-geeks are welcome to participate.
>I would appreciate it if everyone could keep the format the same:
>Name, programming experience in years, Edwards vote, Eastabrook vote, Infomagic vote.
>I will cut off the voting as of noon (US-CDT) Sunday, 6-27. I hope to have the results posted Sunday. Thanks for your participation.
>ET
>
>Previous results:
>
>Edwards -
>Period   Geeks  Exper  Non-G  Overall
>2Q-1997  3.96   17.6
>3Q-1997  4.20   19.7   5.75   4.30
>4Q-1997  4.09   18.1   4.29   4.11
>1Q-1998  4.18   19.7   4.29   4.20
>2Q-1998  4.21   19.5   4.93   4.33
>3Q-1998  4.23   19.6   4.77   4.36
>4Q-1998  4.33   20.5   4.74   4.48
>1Q-1999  3.86   19.7   4.62   4.03
>
>Eastabrook -
>2Q-1998  3.07   3.89   3.21
>3Q-1998  3.40   3.33   3.39
>4Q-1998  3.17   3.53   3.30
>1Q-1999  3.16   3.86   3.32
>
>Infomagic -
>4Q-1998  .46    .59    .51
>1Q-1999  .39    .60    .44
>**************
>Edwards' survey -
>1 - it ain't gonna happen
>3 - bump in the road, 80-hour weeks for all in 1999-2000
>5 - probable collapse of economy, start hoarding now
>
>Eastabrook survey -
>5 - We're a Paul Milne, ready and (appropriately) armed for anything. (Guns for the US, bows for England.)
>4 - Small town (1,000-5,000), nearest large town 50 miles or more. 3 months food, heat, water, whatever. Some prepared lands, seeds.
>3 - Medium town (5,000-100,000), 1 month food and heat, water access. Fair knowledge of plants, vegetable growing. First-aider.
>2 - Larger town (up to 1 or maybe 2 million depending on surrounds). 2 weeks food plus some basics. Computer geek with other skills.
>1 - New York, Washington, London. No pantry, takeaways! A private y2k compliant helicopter brings this down a few points.
>
>Infomagic survey -
>So, in this survey, I would like to suggest a very simple score in the range of zero to one, as a decimal, to any precision you please. Zero means no significant impact (bump in the road) and one means the extinction of the human race. .5 means you think it is halfway between the extremes.

-- desertj98 (, June 24, 1999.

Embedded chips are a wild card. No one knows exactly what will become impaired or inoperable, or where - not even the supposedly informed author of the article. I appreciate the fact that she may have once been a programmer, but her warning is drama - I repeat, drama. Perhaps she is angry at the world. I have a question, for anyone to field, inspired by brief observation of this site:

What is the ultimate goal of articles like this?

I agree that the Y2K rollover may create problems. The degree to which those problems will affect our lives is debatable. Should the rollover create an unstable society, how does it benefit that society to be frightened ahead of time? I think we can be reasonably prepared and inform others of the possible danger without crying, "The sky is falling!" Should the rollover create an unstable society, it won't matter if you have a diesel generator, a sub-machine gun, three years worth of dehydrated food, or five thousand bottles of Evian; eventually the have-nots will locate and overpower the haves.

-- Klar (, June 25, 1999.

I think it's time for "Jack" to make another one of his infamous appearances!! Where IS Beach?

-- Will continue (, June 25, 1999.

My God, Klar. What a defeatist attitude. Several of us have prepared for just THAT. It will take a platoon to get into my root-cellar. You need to finish up a few things in the time you have left. It will give you a whole new attitude.

-- Will continue (, June 25, 1999.

Desert said:

" These types of things needed to be looked into by the U.S. Senate. But our leaders chose instead to hold a trial about the President's Sex Life. "

Amen brother. Amen.

-- a (a@a.a), June 25, 1999.


Just an update to my earlier response, prompted by an email I received.

Although the low order bit of the clock on an ESA/390 mainframe does represent approximately 244 picoseconds, not all implementations of the architecture operate at that level of precision. More common is to increment bit 51 (where the high order bit is referred to as bit 0), which corresponds to exactly one microsecond. In other words, a 1 is added at bit 51 each microsecond. Bits 52 to 63 are left as zeroes. This microsecond accuracy is sufficient for most commercial applications.

For further proof that the author of the original article is ill-informed try this:

...two lines of High Level code would replace two hundred lines of Assembler code, which replaced two thousand lines of Machine Language code...

Incorrect! There is an exact one-to-one equivalence between a line of assembler code and a machine language instruction (macro instructions excepted). Assembler is simply an easy way to write machine code without the need to remember the exact format of the instruction. Consider the instruction to add one packed decimal number to another, the AP instruction (Add Packed). It's a lot easier to remember AP than it is to remember 11111010, which is the actual machine op code. All the Assembler does is translate my AP to that 11111010. It also looks after translating my operands to the "D1(L1,B1), D2(L2,B2)" format which the instruction requires, but that's another story!

If you really want to try to understand this stuff, or just want to see how complex it can get, then you should read ESA/390 Principles of Operation, universally known as POP!

When you can understand POP then you can call yourself a real programmer (and make lots of cash!).

Masochists or insomniacs can find POP online at:


-- Ron Davis (, June 25, 1999.

Yup. But let's face it...that was only one of the many glaring acts of stupidity. The first one that stands out in MY mind, has got to be the '92 election.

-- Will continue (, June 25, 1999.

I will have to side with Anita on this one - with only one exception:

When PL/1 was first introduced, it was said that it would replace COBOL within 5 years. That was circa 1970-71 as I recall. Has anybody coded in PL/1 lately? Another was ADA. It was said (circa 1975) that the Fed Government was going to require that ADA be used in all government applications and by all contractors to the Fed Government. Has anybody coded in ADA lately? Enter "C" Language - It was said that "C" would replace all COBOL code and that COBOL was dead - That was 1984 if I remember correctly. Yes many of us still code in "C" (I am multi-lingual) but C++ is a better tool for PC based applications. "C" never came close to replacing COBOL.

Fact of the matter is, COBOL is the MOST PORTABLE ANSI STANDARDIZED LANGUAGE THERE IS - PERIOD. It has gone through many changes over the years and most of them have improved the ability of the language to perform right up there, if not beyond, with the latest and greatest languages for specific platforms.

I have to disagree with one thing Mr. Davis pointed out - there is not a 1:1 relationship between BAL instructions and machine code. It is a 1:many relationship. ZAP (Zero and Add Packed) is just the first of dozens of instructions that entered my mind when I read that. I wholeheartedly agree with Mr. Davis et al. on the clock ticks / cycles, as they are not inherently "chronologically related" to the calendar in many cases. Also, I would venture to say that although I have not coded 68000 ASM, I have coded 80xx and many others as well as IBM mainframe - and I probably have at least 4 times as many General Purpose Registers to play with on a mainframe.

On the whole, the article that began this thread was well thought out and put into lay terms - but it was presented with a distinct twist which cannot go unchallenged. What the author left out were the following facts:

1) Operating systems have not just evolved - they have been redeveloped from the ground up over the years. *VS* (Virtual System - more commonly known across all computer-dom as Virtual Memory) was not even a dream 30 years ago. Memory was all REAL and limited to the amount you had installed in the machine. Memory handling is *extremely* clock sensitive for CAS and RAS (refresh) etc.

2) Layers - handlers like the Master Scheduler, Initiator / Terminator, Reader / Interpreter, Input / Output and many others, like operating systems, have not just evolved, but have been redeveloped from the ground up. Since most of these have to work with memory and literally all machines now use Virtual Memory, I am sure you can understand why they had to be redeveloped.

3) Applications - I cannot tell you how many thousands of different applications I have worked on or developed from scratch over the years. This is not in line with Darwin's principles of Evolution here; these are computer programs. I will grant the author that some, indeed many, applications have evolved and have "grandfathered" code embedded within - but typically, this code is well encapsulated, as a macro would be, and quite easy to examine and modify.

In closing, I would like to offer that the article that inspired this thread did, indeed, bring me back a few years. I can't say that all of my memories from the early days were fond, but they are the foundation of my career. Many of us did, in vain, warn our managers that the code would break upon the turn of the century. These warnings were generally met with either laughter or scorn. "What makes you think that YOUR program will still be around 30 years from now?" Etc. The decisions were not ours, the programmers'; they were made by management. Don't bother looking for them (60's-70's mgmt) as they are either retired or already dead from drinking too much coffee/booze/insert your favorite beverage - or dead from stress-related heart attacks etc. We, the programmers of the 50's, 60's and 70's, are generally now occupying management roles and trying to clean this mess up. We have it pretty well in hand now here in the USA. We are not in as bad shape as the author of the article would have you believe.

Yours in COBOL... Dino!

-- (, June 25, 1999.

Cobol Dinosaur:

I never found the exception that you mentioned, but I'll take exception to your statement that the author's presentation was well thought out. Whether or not "Still Water Woman" (see the link provided for more details on her thoughts) was trying to bring assembler language down to something that non-technicals could understand, her basic premises were STILL totally incorrect.

Where were these arrays of dates and clocks that drove things? What's this about dividing dates by zero? The only time dates are divided by ANYTHING is to determine if the year is a leap year, and they certainly aren't using zero as a denominator.

Nah.. my personal opinion is that Still Water Woman is either still living or desiring the "back to nature" slogans of the 60's. Most of us outgrew those marijuana-smoking, acid-dropping, herbal EVERYTHING, share in the commune, Timothy O'Leary days and realized that for the sake of our children, we needed to move on to become responsible citizens and parents. It seems she's dropped out of society already, as indicated by her website.

desertj98: It was never my intent to imply that YOUR thoughts WERE those of the author. She didn't say that it MAY be too late. She said it WAS too late. She'd like you to believe that, as would so many others. She used spin and lies to encourage folks to believe this. While most can spot spin and lies by the "happy-faced pollyannas", perhaps only those with technical experience can spot the spin and lies by the other side.


-- Anita (, June 25, 1999.


more pedantry (last resort, because I can't argue with a single word you or CD have said).

You were either thinking of . .

Timothy Leary - the 60's acid guru and generally out-there guy (most probably)


David O'Leary - former defender for Arsenal FC and The Republic of Ireland (less likely . . but POSSIBLE)



-- W0lv3r1n3 (, June 25, 1999.

Hey folks, I've got a (my usual) verbose take on this thread, so I started another thread, "A Review of the thread 'Don't miss this Post!!! The Best Programmer Explanation Yet.'"

Can someone do a link to it for the newbies' sake?



-- William J. Schenker, MD (, June 25, 1999.


Sorry: Timothy Leary...NO O'.


-- Anita (, June 25, 1999.

lemme try . . .

hmmm . . . (pokes tongue out of corner of mouth) . .


-- W0lv3r1n3 (, June 25, 1999.

woohoooooo !!!!


-- W0lv3r1n3 (, June 25, 1999.

Anita and George were the closest to my question. Unfortunately, George was cut off. And Anita was sidetracked by her flashbacks to those bad acid trips with Mr. Owsley himself, no doubt.

The writer talked about "system architecture logic code, hidden UNDER the language instructions -- that divides by the date. This is not something just I did -- this is how it was done....We used Date Calculations to drive the array tables. We subtracted dates, we added dates, we divided by dates."

Anita misunderstood and referred to "dividing the dates." She also referred to code examples that were from a later post, not from the article in question. Hopefully, she'll probe the dividing *by* dates question for us, too.

I guess my remaining question is whether anyone can think of any systems in which there are such "array tables" or any instances where one would divide by the integer 79 in 1979 and 99 in 1999, and thus 0 in 2000?

Not to generate a random number as a unique identifier, probably. Maybe as a two-digit date stamp within a string, but then why divide on unpacking it? An "array table" could contain just about anything, but again, why the division?

Programmers have done every fuzzy little thing they could think of individually, but this one seems pretty far out as the way "it was done."

-- jor-el (jor-el@krypton.uni), June 25, 1999.

RonD & Dino: keep in mind also that when you go from an object module to an executable you're linking in all the app library and OS system calls, so overall you really are talking a one-to-many increase.

Anita: the article was written at a very high level. Most people know that "divide by zero" means an error condition. I think it was used simplistically to convey that the code would encounter a problem. A display of the assembly code for tm_year depicting the actual mechanics of the problem would have been more precise, but it would have been over the heads of 99.9% of the audience.

-- a (a@a.a), June 25, 1999.

While it is true that "divide by 0" is a no-no on almost all computers, I don't think I've ever seen a case where the year was used as a divisor in my 31 years of programming. I have seen "year arrays" where the year is multiplied by the element size to find the right "slot." It's OK to multiply by 0, the answer is 0 and can be used to index to the first item in an array. So while anything is possible, I think this problem will be rare. Any other pros out there ever see a year used as a divisor?

I gotta run for now. I'll have more to say on this article later. <:)=

-- Sysman (, June 25, 1999.

See thread...

A Review of the thread: "Don't Miss This Post!!! The Best Programmer Explanation Yet."

-- Diane J. Squire (, June 25, 1999.

what ?

my link not good enough for you then eh ?


Im taking my bat and ball home now.


-- W0lv3r1n3 (, June 25, 1999.


First of all, I have NO idea who Mr. Owsley is.

Secondly, I understand NOT AT ALL what you are saying here:

"The writer talked about "system architecture logic code, hidden UNDER the language instructions -- that divides by the date. This is not something just I did -- this is how it was done....We used Date Calculations to drive the array tables. We subtracted dates, we added dates, we divided by dates."

Anita misunderstood and referred to "dividing the dates." She also referred to code examples that were from a later post, not from the article in question. Hopefully, she'll probe the dividing *by* dates question for us, too."

I've not seen this done in my entire career. Why would anyone divide by dates? What point is there in dividing 1979 by 1999? Are you suggesting that dates were used in hashing algorithms? *I*'ve NOT seen that. If you've seen dates used as the denominator, I BEG you to show me WHERE and WHEN. I didn't pick up the division theorem from a later post. It was what the original author posted.

Divide-by examples:

4 divided by 2 = 2
1979 divided by 1999 = .9899949
A clearer example: 79/99 = .7979797

WHAT array tables, Jor-el? WHICH ones exactly? I spent 7 years as a systems programmer also, Jor-el, and I NEVER saw code that divided by dates. Please elucidate and (if you're older than I am) provide an example of this that is still in existence.



-- Anita (, June 25, 1999.

I'm a professional programmer too (well, really I'm an engineer, but I've spent a large part of my career coding ATE and embedded chips and such) and I've NEVER seen anything get divided by a date. The only situation in which I can even imagine this is in using the date as a seed for a random number generator. Most likely this code would never be executed unless the application is using random numbers. The only business application that I can think of at the moment that uses random numbers would be a Monte Carlo simulation, as used in engineering modeling programs and probably in business planning/forecasting programs. Also flight simulators or similar programs. I can't think of any embedded chip applications that would.

Just out of curiosity, can anyone think of any other apps that would use them? I mean besides games or graphics programs.

-- y2kbiker (, June 25, 1999.

Regarding PL/1 (or PL/I, more correctly): it was first released in 1965 by IBM, running under OS/360 in an incomplete implementation of the language. That's when the first statements started about replacing COBOL within 5 years. In fact, COBOL was still in its youth at the time.

I recall at the time (when I worked for IBM) writing a paper on why my customer, Sears Roebuck, should switch from Assembler (and its predecessors like Autocoder on the 1401 and 1410) to COBOL - then learning that IBM was pushing PL/1 rather than COBOL - so I went through my whole paper replacing the word "COBOL" with "PL/1". This was before the days of global replacement in word processors. I think we still had carbon paper.

-- Dan Hunt (, June 25, 1999.

I do not know which is the "best programmer explanation yet", but this is not it. :-)


-- Jerry B (, June 25, 1999.

Hi Dino,

Regarding your point on the one-to-many relationship between Assembler and machine or object code: whilst some instructions "appear" to exhibit a one-to-many relationship, such as ZAP (Zero and Add Packed), in fact only a single machine instruction is generated. Indeed, in that instruction no zeroing takes place; the second operand is simply moved to the first operand location, which has exactly the same effect as zeroing the first location and adding the value at the second location to it.

If you mean to imply that a multiplicity of actions take place deep within the machine, beyond the control of the programmer, such as fetching the data at the second location, checking it for valid numerics and sign half byte, setting conditions etc then I agree, many things happen as the result of coding that op code but they're essentially "hard wired" in the instruction itself.

Regarding PL/I, my absolute favourite language, well I wrote some yesterday! (I wrote some COBOL too but it's a bit restrictive for my taste).

To all: I'm not a Polly, I'm prepping substantially; I just don't like disinformation.

RonD, who just finished checking 344 Assembler modules, some dating back to 1970, for Y2K compliance. Surprisingly they nearly all were OK!

-- Ron Davis (, June 25, 1999.

OK, a few more comments.

First, on the syntax: I think the writer was attempting to put it on a level that a non-tech would understand. The word MOVE is easier to understand than MVC, and GOTO is easier than B. Words like OPEN and GET, while not "machine instructions," are MACROs available to the assembler programmer, provided as part of the operating system.

Speaking of MACROs - all a MACRO is is a collection of instructions that can be called by a name. Some are simple. For example:


may "generate" something like:

GETONE   L     1,=A(FILE)
         L     0,=A(AREA)
         BAL   14,8(1)

So while the programmer writes ONE "instruction", the assembler generates 3. Some MACROs, like DCB (MVS) or DTF (VSE) may generate hundreds of instructions. The MACRO processor on IBM mainframes is very "smart" and can do things like generate different instructions based on the type of a parameter, for example. Sorry, I'm getting off track...

The point is that only a few lines of "assembly" source code can generate many lines of "instruction" code. MACROs are very common. In fact, virtually every mainframe assembly program contains many, for things like talking to the file system. Still, I think a 10 to 1 ratio is way too high - unless you do consider "linking in all the app lib and OS system calls," as "a" pointed out, in which case it's just about right.

As for the mainframe (370+) hardware clock, yes, it is just a long-term counter. Its "rollover" has nothing to do with the Y2K rollover. In fact, the 360 clock was only 32 bits, and used to roll over every 17 (?) hours. The OS timer/date routines deal with this. What happens is that when the machine is IPLed (booted), the operator enters the date and time. The OS fetches the real-time clock, and starts counting from the just-entered date/time. The OS does the rollover from seconds to minutes to hours to days to months to years. A non-compliant OS stops at 99. The programmer depends on the OS to supply date and time information.

So I think the article does have some merit on these and other points. However, as I stated above, I don't think the year resulting in a divide-by-zero error will be that big of a deal.

Later. <:)=

-- Sysman (, June 25, 1999.
