A question came up on the #curl IRC channel: will there ever be a need to raise addressing or offsets from 64-bit to something larger, such as 128-bit?

I argue there is no need. 64 bits can already address a very large amount of data. For example, many operating systems and filesystems cap file sizes at 2**64 bytes. But it is difficult to wrap your head around a number that size: how much data can such a file really hold?

Some estimates (*) say that there's going to be around 181 ZB (zettabytes) of data in the world by the end of 2025.

That is only about 9812 files, if each file holds 2**64 bytes.
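The arithmetic behind that number is easy to check yourself (assuming 1 ZB = 10**21 bytes):

```python
# How many maximally-sized 64-bit files would all the
# world's data (estimated 181 zettabytes) fit into?
world_data = 181 * 10**21   # 181 ZB, in bytes
max_file = 2**64            # largest size a 64-bit offset can express

print(world_data // max_file)   # → 9812
```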

*) rivery.io/blog/big-data-statis…

#curl