I am using a 64-bit Ubuntu system.
I’m currently working on a project that incorporates MariaDB. I’m planning to store timestamps in the project so that people in different time zones receive the correct time.
I’ve heard and read some articles about the Year 2038 problem for timestamps. Many articles suggest using a 64-bit system to buy a “bit” more time.
How much time is this “bit” referring to? Is it long enough to keep web applications running for their entire useful life? If not, is it only an extension of a couple of years, so that when 2040 comes around we’ll have applications that no longer work properly?
Well, if there's an option to literally buy a "bit", i.e. switch from a signed 32-bit integer to an unsigned 32-bit integer, things keep working until 2106: the counter overflows 2^32 seconds after the 1970 epoch, which lands on 2106-02-07.
Moving to 64-bit is "somewhat better". You get hundreds of billions of years of range; a signed 64-bit second counter doesn't overflow until roughly 292 billion years after the epoch.
And Ubuntu does this: on 64-bit Ubuntu, `time_t` is a signed 64-bit integer.
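You can confirm this on your own machine with a minimal C sketch (the filename in the compile command below is just an example):

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* On 64-bit Ubuntu this prints 8: time_t is a signed
       64-bit integer, so the OS clock is fine long past 2038. */
    printf("sizeof(time_t) = %zu bytes\n", sizeof(time_t));
    return 0;
}
```

Compile and run with `gcc check_time.c -o check_time && ./check_time`.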
However, that's at the OS level. Just because Ubuntu uses a 64-bit integer for its times doesn't mean that MySQL/MariaDB will use it to store its timestamps. If dates past 2038 are important to you now, start testing immediately.
Actually, I can save you some time: it's still broken. This bug was reported over a decade ago, yet its main test still fails even with a 64-bit `time_t`.
This isn't even a storage issue; the failure shows up in plain timestamp conversion. It's slightly pathetic.
(And yes, that was run on MariaDB, version 10.1)
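If you want to check your own server, here's a minimal sketch using the MariaDB Connector/C API. The host, user, password, and database names are placeholders, and the documented behaviour that out-of-range arguments make `FROM_UNIXTIME()` return NULL is what the check relies on:

```c
#include <stdio.h>
#include <mysql.h>  /* from libmariadb-dev / MariaDB Connector/C */

int main(void) {
    MYSQL *conn = mysql_init(NULL);
    /* Placeholder credentials -- substitute your own. */
    if (!mysql_real_connect(conn, "localhost", "user", "password",
                            "test", 0, NULL, 0)) {
        fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }
    /* 2147483648 == 2^31, i.e. 2038-01-19 03:14:08 UTC, one second
       past what a signed 32-bit time_t can hold. */
    if (mysql_query(conn, "SELECT FROM_UNIXTIME(2147483648)") != 0) {
        fprintf(stderr, "query failed: %s\n", mysql_error(conn));
        mysql_close(conn);
        return 1;
    }
    MYSQL_RES *res = mysql_store_result(conn);
    MYSQL_ROW row = mysql_fetch_row(res);
    /* On affected versions the result is NULL instead of a date. */
    printf("FROM_UNIXTIME(2147483648) => %s\n",
           row[0] ? row[0] : "NULL (still broken)");
    mysql_free_result(res);
    mysql_close(conn);
    return 0;
}
```

Compile with `gcc y2038_check.c $(mariadb_config --include --libs)` (or `mysql_config` on older setups).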
Don't store it as a raw integer at all. Store it as an ISO 8601 formatted date string (e.g. `2100-01-01T00:00:00Z`); this is the standard date format used across the Internet.
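Producing such a string from a 64-bit `time_t` is straightforward with `strftime`; a minimal sketch (the chosen timestamp is just an example of a post-2038 moment):

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* 4102444800 seconds after the epoch is 2100-01-01 00:00:00 UTC,
       comfortably past 2038 with a 64-bit time_t. */
    time_t t = (time_t)4102444800;
    struct tm tm_utc;
    gmtime_r(&t, &tm_utc);

    /* ISO 8601 in UTC, marked with the "Z" suffix. */
    char buf[sizeof "2100-01-01T00:00:00Z"];
    strftime(buf, sizeof buf, "%Y-%m-%dT%H:%M:%SZ", &tm_utc);
    printf("%s\n", buf);  /* prints 2100-01-01T00:00:00Z */
    return 0;
}
```

A nice property of this format: zero-padded ISO 8601 strings in the same timezone sort lexicographically in chronological order, so plain string comparison keeps them ordered.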