I have long been partly aware of the 2038 problem, which is similar in nature to the Y2K problem – one which, despite being at the ripe old age of 8 when it “happened”, I don’t really recall ever actually happening. The key difference is that in 2038 there is genuine potential for some critical systems to face issues, in particular 32-bit systems that store dates, i.e. every single 32-bit system.

## So what exactly is the problem and what exactly is expected to happen?

In an n-bit system, dates are typically stored as a signed n-bit number counting seconds from the Unix Epoch, 1970-01-01 00:00:00 UTC, meaning they can go up to 2^{n-1}-1 seconds after that, which in 32-bit systems means 2038-01-19 03:14:07 UTC. We’re already long past the relevant points for 8-bit and 16-bit systems. In fact, I’m not even sure how 8-bit systems would have handled it, given that 2^{7}-1 = 127 seconds is a whole 2 minutes and 7 seconds. The same goes for 16-bit systems, which would at least get you past 9 AM on 1970-01-01 – but not by much, 09:06:07 AM to be precise – so at least you’d get 6 working minutes out of it.
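These cut-off dates are easy to check for yourself. Here’s a minimal Python sketch (the loop and names are my own, nothing standard) that adds 2^{n-1}-1 seconds to the epoch for each width:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Last second representable by a signed n-bit Unix timestamp: 2^(n-1) - 1
for bits in (8, 16, 32, 64):
    max_seconds = 2 ** (bits - 1) - 1
    try:
        print(f"{bits}-bit: {EPOCH + timedelta(seconds=max_seconds)}")
    except OverflowError:
        # datetime can't represent dates hundreds of billions of years out
        print(f"{bits}-bit: beyond what datetime can even represent")
```

The 8-, 16- and 32-bit cases print the 00:02:07, 09:06:07 and 2038-01-19 03:14:07 cut-offs respectively; the 64-bit case overflows `datetime` itself, which rather makes the point.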

So given that times stored in 8- and 16-bit systems wouldn’t even get you a full day, getting 68 years out of a 32-bit system isn’t bad at all, right? No, it isn’t bad – but it’s still not enough.

## If I don’t do anything and still store my timestamps as signed 32-bit numbers, what will happen?

Here is where the distinction between signed and unsigned comes in. Let’s use a 4-bit example just to keep things easy.

An unsigned 4-bit number can go all the way from 0000 = 0 to 1111 = 15 ( = 2^{4} – 1), whereas a signed 4-bit number goes from 1000 = -8 to 0111 = 7 ( = 2^{3} – 1). This is because the first bit is used as the sign: 1 is negative and 0 is positive (or zero).
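Sticking with the 4-bit example, here’s a small sketch (the helper names are my own) that interprets the same bit patterns both ways – the signed reading uses two’s complement, which is how real machines store signed integers:

```python
def as_unsigned(bits: str) -> int:
    # Plain binary: every bit carries a positive weight
    return int(bits, 2)

def as_signed(bits: str) -> int:
    # Two's complement: the leading bit carries a weight of -2^(n-1)
    n = len(bits)
    value = int(bits, 2)
    return value - 2 ** n if bits[0] == "1" else value

for pattern in ("0000", "0111", "1000", "1111"):
    print(f"{pattern}: unsigned {as_unsigned(pattern):3}, signed {as_signed(pattern):3}")
```

So the same pattern 1000 reads as 8 unsigned but -8 signed, and 1111 reads as 15 unsigned but -1 signed.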

There is still a need, of course, to store dates before the Unix Epoch. This is typically handled by storing them as negative numbers, e.g. 1969-12-31 23:59:00 would be stored as -60. Bit-wise (in two’s complement), the pattern 100…0 represents the most negative value, -2^{n-1}, and counting up from there as normal takes you through the negative numbers until 111…1, which is -1: one second before the epoch.
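Python’s `datetime` handles negative offsets from the epoch in exactly this spirit, e.g. for the -60 example above:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# A stored timestamp of -60 means 60 seconds before the epoch
print(EPOCH + timedelta(seconds=-60))  # 1969-12-31 23:59:00+00:00
```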

This is why dates are stored as signed numbers rather than unsigned: dates existed before 1970. If we were to switch to unsigned integers then we would get an extra 68 years of breathing space, taking us to 2106-02-07 06:28:15 UTC. There would, however, be two main problems with this:

- The first one is hopefully obvious, we would completely lose the ability to work with any dates and times before 1970.
- Even that date falls within the possible lifespan of people alive today. Someone born on the day I am writing this (21st October 2022) will be 83 when that date comes, and I wouldn’t consider 83 to be an unreasonably long lifespan.

Now as to what will actually happen: time will keep moving, and the stored binary number 01111111111111111111111111111111 will turn into 10000000000000000000000000000000, which will now be interpreted as -2^{31}, i.e. 2^{31} seconds **before** the Unix epoch. This will take us all back to 1901-12-13 20:45:52 UTC. Naturally, this will cause chaos, especially if you believe time travel is possible and we’ve cracked it by then! I may go and buy a DeLorean just in case.
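You can simulate the rollover by forcing the arithmetic back into a signed 32-bit integer – a sketch using Python’s `ctypes` (which does no overflow checking, so it wraps just like a 32-bit C `time_t` would):

```python
import ctypes
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

last_second = 2 ** 31 - 1                        # 0x7FFFFFFF, the final valid timestamp
wrapped = ctypes.c_int32(last_second + 1).value  # one tick later, reinterpreted as signed 32-bit
print(wrapped)                             # -2147483648
print(EPOCH + timedelta(seconds=wrapped))  # 1901-12-13 20:45:52+00:00
```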

So, what is the solution?

## Enter 64-bit

Most computers and processors you can buy today run on 64 bits, so it is incredibly unlikely that this whole problem will be a problem for end-user devices by 2038, and who is to say that by then that won’t become 128 bits, or even 256? The trouble comes when using architecture still running on 32 bits.

How long can you get out of a 64-bit system?

When doubling the number of bits, you essentially (almost) square the length of time (in seconds) that can be handled in an unsigned integer of n bits. The maths of squaring doesn’t quite apply to signed integers. So how many years could we get out of 64 bits? A Century? A Millennium? A Decamillennium (that’s 10 millennia, or 10,000 years)? Nope, you’d get 584.9 **billion** years. There are many comparisons you could make, but the main one in my mind is that it is roughly 42 times the age of the universe. This of course becomes 292.5 billion years either side of the Unix Epoch when you consider signed rather than unsigned numbers.
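The back-of-the-envelope maths, using 365-day years (which is what gets you the 584.9 figure; counting leap years shaves a little off):

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365  # ignoring leap years for a rough figure

unsigned_years = 2 ** 64 / SECONDS_PER_YEAR
signed_years = 2 ** 63 / SECONDS_PER_YEAR
AGE_OF_UNIVERSE = 13.8e9  # years, roughly

print(f"unsigned 64-bit: {unsigned_years / 1e9:.1f} billion years")  # 584.9
print(f"signed 64-bit: +/- {signed_years / 1e9:.1f} billion years")  # 292.5
print(f"about {unsigned_years / AGE_OF_UNIVERSE:.0f}x the age of the universe")  # 42
```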

Call it morbid but the human race will be long gone by the time the year 292,500,001,970 comes around.

This said, however, with the advent of 64-bit numbers, could it be time to reconsider what we use as the Epoch? Is there really a need to arbitrarily start dates from 1st January 1970 any more? I’d like to propose two new possibilities for the Epoch:

| Point in time | Pros | Cons |
| --- | --- | --- |
| 0000-00-00 00:00:00 UTC | We can calculate exactly when it was | Things still happened before it, meaning numbers would have to be signed. It is also based on religion, which not everyone subscribes to, even if most countries recognise it as year 0 and this current year as the 2022nd one after it. |
| The Big Bang | Nothing happened before it (that we know of at least; I happen to believe otherwise), so there would be no need to use a signed integer | We cannot calculate exactly when it was; even the official age of the universe is listed with ±0.02 billion years, which is 20 million! |