Unless, of course, by then we have increased to 64 bits.
Computation will be so easy by that point that nobody will be representing numbers as finite-bit integers except deep, deep inside the architecture where the ramifications are understood by the hardware geeks. The only thing is that for people’s convenience, abbreviations will probably STILL be used in representations of the date, leading to general consternation in 2099.
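The consternation would be a rerun of Y2K: a two-digit year forces the parser to guess the century. As a minimal sketch (assuming the POSIX pivot that Python's `strptime` uses, where `69`–`99` mean 19xx and `00`–`68` mean 20xx), "99" would still be read as 1999 even if someone wrote it meaning 2099:

```python
from datetime import datetime

# Two-digit-year pitfall: %y makes strptime guess the century.
# Python follows the POSIX convention: 69-99 -> 19xx, 00-68 -> 20xx.
d = datetime.strptime("01/01/99", "%m/%d/%y")
print(d.year)  # -> 1999, even if the writer meant 2099
```

So however wide the underlying integer gets, the abbreviation itself stays ambiguous; the fix is convention (a pivot year), not more bits.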