There's a famous computer joke that goes something like "we needed 4K of RAM to send people to the Moon, and now we need 4GB to keep a grocery shopping list".
I think it's a fine illustration of the Jevons paradox in computing, and of one of the "computing Murphy laws" known as Parkinson's Law of Data: "Data expands to fill the space available for storage". I also find it intriguing to point out a similar phenomenon with compilers, especially in the context of #permacomputing
Yesterday I read a book on a minimalist compiler written in the 2000s, with a remarkably small footprint of just 424 KB of RAM.
And then I thought about Turbo Pascal for CP/M, which ran in 64KB of RAM. And then about the various compilers that worked on micros with 16KB or less.
And then I read about things like the ALGO compiler, an ALGOL clone for the Bendix G-15, a first-generation vacuum-tube computer (yes, the one Usagi Electric has): 2160 words of 29-bit memory, under 8 KB in total, at no more than 370 op/s.