Andreas Kohlbach
2022-07-17 20:32:31 UTC
> Signed 8 bit for years before 1900? Makes no sense. Unsigned 8 bit
> gets you pretty far. That doesn't sound like anything that would pass a
> design review.
> I worked with computers from 1964 until a few years ago.
> I can't recall anyone abusing dates to save space.

Isn't this exactly what the Y2K problem was all about? Storing the
last two digits as characters seems just as arbitrary as using a
single 8-bit binary value. True, with just one more byte overflow
isn't a problem, but if everyone had used just two more bytes and
kept all four characters of a year, Y2K wouldn't have been a problem.
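
To put numbers on it, here is a minimal C sketch of the two layouts
(the values are only illustrative): one unsigned byte as an offset
from 1900 reaches 2155, while two stored digits wrap after 99.

    #include <stdio.h>

    int main(void)
    {
        /* Scheme 1: one unsigned byte as an offset from 1900 (1900..2155). */
        unsigned char off = 122;                   /* 1900 + 122 = 2022 */
        printf("offset scheme: %d\n", 1900 + off);

        /* Scheme 2: two ASCII digits, which wrap after 99. */
        char yy[] = "99";
        int n = (yy[0] - '0') * 10 + (yy[1] - '0');
        printf("two digits: %02d rolls over to %02d\n", n, (n + 1) % 100);
        return 0;
    }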

> some piece of software is going to have to figure out whether to add
> '19' or '20'.
> Store two digits: possible Y2K impact.
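
That guess is usually made with a sliding window: pick a pivot, and
two-digit years below it become 20xx, the rest 19xx. A minimal sketch
in C; the pivot of 70 is my own arbitrary pick, every shop chose its
own:

    #include <stdio.h>

    /* Expand a two-digit year against a pivot: values below the
       pivot are taken as 20xx, values at or above it as 19xx. */
    static int expand_year(int yy, int pivot)
    {
        return (yy < pivot) ? 2000 + yy : 1900 + yy;
    }

    int main(void)
    {
        printf("%d %d\n", expand_year(85, 70), expand_year(22, 70));
        /* prints: 1985 2022 */
        return 0;
    }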

> When you want to enter a credit card expiration year, no one has the
> patience to enter 20xx for the next 100 years.

I see that in web forms pulldown menus are used, offering the current
and maybe the next 10 years (like 2030) to just click.

> I did a lot of Y2K remediation. Rarely was the correct solution to ask
> the user to enter a 4 digit year. I can't recall that ever happening.

Or it was static. So the "19" was fixed (and not saved) and the user
added the rest. So for 1985 he just typed "85" and the machine would
likely also only store "85".
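
In other words something like this; only the two typed digits ever
reach storage, and the century is pasted on for display (a sketch,
not any particular system's code):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char stored[3];
        strcpy(stored, "85");        /* the user typed "85"; that is all we keep */
        printf("19%s\n", stored);    /* the fixed "19" appears only on output */
        return 0;
    }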

I'm moving this into the folklore group, knowing many of you read
there too.
--
Andreas