People store large quantities of data in their electronic devices and transfer some of this data to others, whether for professional or personal reasons. Data compression methods are thus of the utmost importance, as they can boost the efficiency of devices and communications, making users less reliant on cloud data services and external storage devices.
Where I work, we’ve been looking into data compression that’s optimized by an ML system. We have a shit-ton of parameters; the ML algorithm compares the number of sig figs in each parameter to its byte size and truncates wherever that doesn’t cause any loss of fidelity. So far it looks promising, with a really good compression factor, but we still need to do more work on de-skilling the decompression at the receiving end. A rough sketch of the principle is below.
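To give a feel for the idea (this is just a toy NumPy sketch, not our actual pipeline; the function names are made up, and a lossless float64-to-float32 downcast stands in for the real sig-figs-vs-byte-size comparison):

```python
import numpy as np

def compress_params(params: np.ndarray) -> dict:
    """Split a float64 parameter vector into a 4-byte block (where the
    float32 round trip is exact, i.e. no loss of fidelity) and an
    8-byte block (where it isn't)."""
    as32 = params.astype(np.float32)
    lossless = as32.astype(np.float64) == params   # exact round trip only
    return {
        "n": params.size,
        "mask": np.packbits(lossless),             # 1 bit per parameter
        "f32": as32[lossless],
        "f64": params[~lossless],
    }

def decompress_params(blob: dict) -> np.ndarray:
    """Rebuild the original float64 vector from the two blocks."""
    mask = np.unpackbits(blob["mask"], count=blob["n"]).astype(bool)
    out = np.empty(blob["n"], dtype=np.float64)
    out[mask] = blob["f32"]
    out[~mask] = blob["f64"]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "model": half the parameters originated as float32 (so they
    # downcast losslessly), half need the full float64 precision.
    low_precision = rng.normal(size=500).astype(np.float32).astype(np.float64)
    full_precision = rng.normal(size=500)
    params = np.concatenate([low_precision, full_precision])

    blob = compress_params(params)
    assert np.array_equal(decompress_params(blob), params)
    packed = blob["mask"].nbytes + blob["f32"].nbytes + blob["f64"].nbytes
    print(f"{params.nbytes} bytes -> {packed} bytes")
```

The point of the sketch is only that per-parameter precision can be trimmed without losing information when the stored precision exceeds what the value actually uses; the real system makes that decision per parameter rather than with a single dtype cast.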
I wouldn’t have thought an LLM was the right technology to use for something like this.