Also, while doing my monthly "look at the things to make sure the servers stay healthy", I realized my #iceshrimp media cache had grown to like 500 gigs. I aggressively cut the max file size and max days to cache remote media, because it's full of a bunch of giant mp4 videos from someone. That should fix it. Though I wish I could kick off a MediaCleanupTask now, instead of waiting for midnight. Oh well. We aren't going to run out of space in the next 24 hours or anything.
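For anyone running their own instance who wants to do the same: the knobs live in the instance config. I'm writing the section and key names from memory here, so they're illustrative, not gospel; check your own config file before copying anything:

```ini
; Illustrative only -- section and key names may differ by Iceshrimp version.
[Storage]
; Don't cache remote files bigger than this (so no more giant mp4s).
MaxCacheFileSizeMiB = 20
; Expire cached remote media after this many days.
MediaRetentionDays = 7
```

The originals still live on the remote instance either way; the cache just re-fetches (or proxies) on demand.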
in reply to 🇨🇦Samuel Proulx🇨🇦

Oh, I can! I always forget about swagger and the #iceshrimp API.
curl -X 'POST' \
'https://fed.interfree.ca/api/iceshrimp/admin/drive/prune-expired-media' \
-H 'accept: */*' \
-H 'Authorization: Bearer redacted' \
-d ''
And then it dies because I just asked it to delete hundreds of gigs of files and the poor queue is stuffed. But, like, it's doing the thing. Even if it can't return a result.
in reply to 🇨🇦Samuel Proulx🇨🇦

Because I know how much you all care about this ongoing situation: My media cache is now down to a much more reasonable 38 gigs. The entire world might be on fire, but at least you're now aware that a random Canadian guy you don't know fixed a problem that doesn't affect you on his single-person instance running software you don't use! Let the rejoicing commence.
in reply to James H

@quanin Ah, I think Mastodon's default is 30 megs or something. github.com/mastodon/mastodon/issues/20490
in reply to 🇨🇦Samuel Proulx🇨🇦

@quanin Seriously though I'm just waiting for the IPFS toolchain to mature a bit. Then I'll set up IPFS across the servers, store everything on IPFS, and serve via IPFS to people that support it, and run a restricted gateway for those who don't. That way I get deduplication and replication for free, and adding pinning services is easy if I want to be more globally distributed.
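The restricted-gateway half of that plan is at least already easy: kubo (the main IPFS daemon) has a real `Gateway.NoFetch` config option that stops the gateway from fetching anything that isn't already pinned locally, which is exactly what you want for "serve my own media, don't become a public proxy". The rest of the plan is still hand-waving, but this bit would look like:

```json
{
  "Gateway": {
    "NoFetch": true
  }
}
```

Or equivalently `ipfs config --json Gateway.NoFetch true`; after that the gateway only answers for content already in the local repo.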
in reply to James H

@quanin Right, and videos are only 40 megs. I couldn't find the setting for audio. But it kind of doesn't matter anyway because Mastodon is evil and recompresses everything. Even if it was already compressed in the exact format Mastodon already wants. Because compression artifacts just give everything that classic social media sound!
in reply to 🇨🇦Samuel Proulx🇨🇦

In a way I can kind of see why, though. If your users are allowed to upload a 10 GB file (what the file is doesn't matter), then every instance you're federated with needs to download that file. That assumes the instance has the bandwidth and the disk space to do it. Given how many instances are basically running on a virtual server without S3, or a physical server in some guy's basement, that has the potential to not end well.
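Back-of-envelope, with completely made-up numbers (one big upload, a thousand federated instances that all cache media):

```shell
#!/bin/sh
# Hypothetical numbers: one 10 GB upload, federated to 1000 instances.
FILE_GB=10
INSTANCES=1000
# Every federated instance that caches remote media pulls its own copy,
# so one upload multiplies into this much combined downstream transfer:
echo "$((FILE_GB * INSTANCES)) GB of combined transfer for one file"
```

And that's before anyone boosts it into view of instances you weren't even federated with yet.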