We've just released Matrix spec v1.17 (hopefully our last release before v2.0!) including a few new MSCs and some exciting usability improvements for the spec website!
Read about it in the blog post: matrix.org/blog/2025/12/18/mat…
Matrix v1.17 specification released
Matrix, the open protocol for secure decentralised communications
Richard van der Hoff (matrix.org)
The vitamin-C-rich pulp tastes nicely fruity.
miki
in reply to Beb0p
Local AI wastes more power than AI that runs in a datacenter.
For LLMs to be as economical and power-efficient as possible, you need to keep your GPUs as highly utilized as you can, preferably during all hours of the day. For big models, you also want as many of those GPUs as possible in one cluster, within limits of course.
The thing about LLMs is that, unlike, say, Netflix, they require very little bandwidth and are relatively tolerant of latency. This means there's little disadvantage (beyond the political implications) to running your model at the other end of the world. Unless you're a global organization with offices everywhere, if you buy your own GPUs, those GPUs will likely sit idle outside 9-to-5. That won't be the case if you rent capacity from a global provider, since the provider can rent the same GPUs to somebody in a different timezone while you're not using them.
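A rough back-of-the-envelope sketch of that utilization argument, in Python. Every number below (load power, idle power, throughput, active hours) is an illustrative assumption, not a measurement from any real deployment.

```python
# Compare energy per served request for a GPU that is only busy during
# office hours vs. one a provider keeps busy across timezones.
# All figures are made-up assumptions for illustration.

ACTIVE_W = 700                    # assumed draw of a datacenter-class GPU under load
IDLE_W = 80                       # assumed draw while powered on but idle
REQUESTS_PER_ACTIVE_HOUR = 2000   # assumed serving throughput when busy

def wh_per_request(active_hours: float, powered_hours: float = 24.0) -> float:
    """Energy per served request for a GPU that stays powered all day."""
    idle_hours = powered_hours - active_hours
    daily_wh = ACTIVE_W * active_hours + IDLE_W * idle_hours
    daily_requests = REQUESTS_PER_ACTIVE_HOUR * active_hours
    return daily_wh / daily_requests

local_9_to_5 = wh_per_request(active_hours=8)    # owned GPUs, one timezone
pooled_global = wh_per_request(active_hours=22)  # rented out around the clock

print(f"9-to-5 GPU : {local_9_to_5:.2f} Wh/request")
print(f"pooled GPU : {pooled_global:.2f} Wh/request")
```

With these made-up numbers the idle-power overhead per request shrinks by more than an order of magnitude as utilization rises; amortizing the hardware itself over useful hours would widen the gap further.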
Beb0p
in reply to miki
miki
in reply to Beb0p
You don't realize how little of a cost bandwidth is when it comes to AI. One Netflix movie (despite all the layers of CDNs) will probably offset your ChatGPT use for a year (napkin math; a rough sketch of the numbers follows below).
I think you have a point re: community impact, but because bandwidth is such a non-concern, we have a really great opportunity to build AI datacenters very close to green energy sources (or build green energy sources closer to AI datacenters). This includes things like geothermal. If everybody is running their own AI, they'll use whatever power they happen to have.
This is one of very few industries that a) uses insane amounts of energy, and b) can use that energy at the other end of the world from the person using the product, with zero loss to the user experience.
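The napkin math referenced above, sketched out with assumed (not measured) figures for stream size, chat payload, and daily usage:

```python
# Napkin math for "one movie vs. a year of chat".
# All figures are rough assumptions for illustration only.

GB = 10**9

# A ~2 h HD stream at ~5 Mbit/s transfers roughly 4-5 GB.
movie_bytes = 4.5 * GB

# A chat exchange is just text: assume ~4 KB of request + response payload,
# ~20 exchanges a day, every day of the year.
bytes_per_exchange = 4 * 1024
yearly_chat_bytes = bytes_per_exchange * 20 * 365

print(f"one HD movie   : {movie_bytes / GB:.1f} GB")
print(f"a year of chat : {yearly_chat_bytes / GB:.3f} GB")
print(f"ratio          : {movie_bytes / yearly_chat_bytes:.0f}x")
```

Even if the per-exchange payload or the daily usage here is off by an order of magnitude, the single movie still dominates, which is the point: inference traffic is tiny and latency-tolerant, so the datacenter can sit wherever the power is.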