Holy _shit_ this paper, and the insight behind it.
You know how every receiver is also a transmitter, _well_: every text predictor is also a text compressor, and vice versa.
You can outperform massive neural networks with millions of parameters, using a few lines of Python and a novel application of _gzip_.
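The paper's recipe is roughly: compute the normalized compression distance (NCD) between a test document and each training document using gzip, then classify with k-nearest neighbors. A minimal sketch of that idea (function names and the toy data are my own, not from the paper):

```python
import gzip
from collections import Counter

def ncd(x: str, y: str) -> float:
    """Normalized compression distance via gzip: if x and y share
    structure, compressing their concatenation costs little extra."""
    cx = len(gzip.compress(x.encode()))
    cy = len(gzip.compress(y.encode()))
    cxy = len(gzip.compress((x + " " + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)

def classify(test_text: str, train_set: list[tuple[str, str]], k: int = 3) -> str:
    """k-NN over NCD: predict the majority label among the k
    training texts closest to test_text."""
    nearest = sorted(train_set, key=lambda pair: ncd(test_text, pair[0]))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

train = [
    ("football match goal score football match goal score", "sports"),
    ("tennis serve match point tennis serve match point", "sports"),
    ("stock market shares trading stock market shares trading", "finance"),
    ("bank interest rate loans bank interest rate loans", "finance"),
]
print(classify("goal score football match win", train, k=1))
```

No training, no parameters to fit: the only "model" is the compressor itself.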
aclanthology.org/2023.findings…
“Low-Resource” Text Classification: A Parameter-Free Classification Method with Compressors
Zhiying Jiang, Matthew Yang, Mikhail Tsirlin, Raphael Tang, Yiqin Dai, Jimmy Lin. Findings of the Association for Computational Linguistics: ACL 2023.