Posted on 09/23/2020 5:16:07 AM PDT by FRinCanada2
https://www.timeanddate.com/countdown/to?iso=20360118T231407&p0=263&msg=Oz.++++++%28%28+32+bit+overflow+Day+%29%29++Oz&ud=1&font=cursive
Instead of clicking on an unknown blog, how about this?
https://www.quora.com/What-is-the-Unix-32-bit-time-overflow
It's a problem that affects UNIX systems with a 32-bit time_t data type.
The data type time_t is defined in <sys/types.h> as an integer type; the width of this type is unspecified, but it is not specified as unsigned, and in practice it is signed.
If your integral-typed time_t (char, short, int, long, and long long are all integral types) is 32 bits wide, your computer has what's called the Year 2038 problem (see the Wikipedia article of that name).
The UNIX epoch is 1 January 1970. Time in time_t is measured in +/- seconds relative to that date.
This means that the 32-bit signed integer value will roll over to a negative number on 19 January 2038 at 3:14:07 AM GMT, which is exactly 2,147,483,647 seconds after the epoch, and suddenly it will be 13 December 1901.
It's not a problem on most modern UNIX systems, since time_t in modern systems is defined as a 64-bit integral type.
I personally fixed this in Mac OS X in 2004, while employed as a kernel engineer at Apple; in doing the 64-bit kernel work, I made the arbitrary decision to change the type of the field from a 32-bit int to a 64-bit long, specifically because of this problem.
It was arbitrary, because, as a kernel change, there was no 64 bit user space at the time.
Since it's unlikely that 32-bit systems will exist in large numbers in 2038 anyway, most other systems will probably have avoided the problem as well.
Note that this issue only affects you if you have a 32-bit time_t and you use it as a data type that gets serialized to storage.
Since best practice for storage types has been a text representation for dates since, well, forever, only programmers who ignored that practice at their peril will have written programs that actually suffer from this problem.
If there were a lot of these programs, the problems would probably have been hitting us already, for programs that had future dates that started out overflowed. One example might be a 30-year mortgage with an inception date after 2008, whose maturity date would be pushed into negative territory (note that I had fixed Mac OS X four years prior to that date).
Like Y2K, Y2038 is likely to be a non-event, other than a bunch of UNIX geeks having End Of The World parties the weekend before that Tuesday rolls around.
I suspect there may also be some inside-joke T-shirts sold as the day approaches:
The World Ends On A Tuesday
Pass It On
The link I posted was just a countdown clock a friend of mine sent me. I thought it was a cool vanity topic and a bit of humor to inject at a time of great focus on November 3rd, 2020. Looking ahead 18 years seems like an eternity, but it's an interesting topic nonetheless.
Ah, I see. :-)
“non-event”
Exactly. We’ll take care of it beforehand, and that’ll be the end of it.
John Titor (time traveler from the year 2036) mentioned this in 2001 on Art Bell’s show
I am starting to believe this guy...
Lol. The issue has been a topic of discussion in tech since 1970! But I just brought it up to put it in perspective with all the countdown clocks set to 3 November 2020.
Unix systems have used a timer based on clock "ticks," where each tick is one second.
The starting (0) tick was set at January 1, 1970.
Unix Time is represented by a 32-bit whole number (an integer) that can be positive or negative (signed). Unix was originally developed in the 60s and 70s, so the "start" of Unix Time was set to January 1st 1970 at midnight GMT (Greenwich Mean Time) - this date/time was assigned the Unix Time value of 0. This is what is known as the Unix Epoch.
The 32-bit signed number will overflow in the year 2038, wrapping to a large negative value rather than resetting to 0. (The 2036 date belongs to a different counter: NTP's unsigned 32-bit timestamp rolls over in February 2036.)
Ping
I remember in college (1980's) wondering what would happen when the clock ticks went over the biggest number that could fit in 32 bits.
I calculated it was 2036 and that seemed SO FAR away at the time...
I kind of want to see what Christmas 1901 looks like. Should probably settle for old pictures.
Luckily, most of the people who make these statements now will either be retired ...or paid handsomely to fix the issue.
It's not that the old systems will be plentiful, it's that the ones that are left will have been kept in place for some very specific reasons, usually because a replacement is considered cost prohibitive for some reason that could have been avoided decades ago.
A disturbing number of computer systems we rely on daily (health insurance, social security, taxes, banking) still use COBOL code originally written 40+ years ago... and will keep doing so as long as they can still find chips that process the instructions. It's not the old code that's the problem though, it's the 40+ years of business rule changes, workarounds, edge cases, and incorporating code from acquisitions that makes the code hard to work with. I wouldn't bet much money that those systems will get fully replaced by 2038, but I will bet that many of the guys in their late 60's and 70's who already retired once will be willing to come in to work in another decade and a half to fix those remaining systems like they're doing now.
I suspect that most if not all email applications will still run in 32-bit in 2038. Nothing is more reliant on accurate date/time data than a simple email message.
Do you Agree?
32 bit processing itself isn't the problem, I'll go so far as to predict that by 2038, there will probably be billions of sensors and thingamabobbers floating around the environment using 32 bit processing, probably just as many things using 8 and 16 bit.
What is going to be a problem is the reliance of a 32 bit operating system on the UNIX epoch date in places where the time being recorded accurately is essential for the OS or a mission critical process to function correctly. Stuff like SWIFT, or a state's unemployment benefit processing system, or some hydroelectric dam in a former Soviet republic, or a network switch embedded in an undersea cable that carries diplomatic traffic between two nations that could go to war if a comms link is broken.
For Y2K, there was a scramble, and a few old coders came out of retirement for a quick buck, but the fix was comparatively simple, and people were still available who knew enough about the systems they worked on to fix them (though some people employed a cheap fix where they split the four-digit year into two two-digit numbers and then told the software to ignore the "20", which caused some issues at the beginning of this year in a few places. It wasn't widely reported on account of everything else going wrong this year, but it happened.).
In another decade and a half, those old programmers aren't going to be alive anymore, and even the young programmers who supported those systems a generation or two after them are going to be long past retirement. Even if they weren't, this would be a kernel-level issue, and the number of people who could address that quickly is already quite small, and even fewer of those people want to delve into antiquated and hard-to-pin-down embedded systems in old Soviet hydroelectric dams.