• 0 Posts
  • 7 Comments
Joined 8 days ago
Cake day: February 14th, 2025

  • Sorry chief, you might have embarrassed yourself a little here. No big thing. We’ve all done it (especially me).

    Check out Hugging Face.

    There are heaps of models you can run locally. Some are hundreds of GB in size but can still be run on desktop-level hardware without issue.

    I have no idea how LLMs really work, so this is supposition: I suppose they need to review a gargantuan amount of text in order to compile a statistical model that can look up the likelihood of whatever word appears next in a sentence.

    So if you read the sentence “a b c d” 12 times, you don’t need to store it 12 times to know that “d” is the most likely word to follow “a b c”.
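    For what it’s worth, the idea in that paragraph can be sketched as a toy next-word frequency model. This is not how a real LLM works (they use neural networks, not lookup tables), just an illustration of the counting-instead-of-storing point; the corpus and function names here are made up for the example.

    ```python
    from collections import Counter, defaultdict

    # Toy corpus: "a b c d" seen 12 times, "a b c e" seen 3 times.
    # We never store 15 copies -- only counts per 3-word context.
    corpus = ["a b c d"] * 12 + ["a b c e"] * 3

    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for i in range(len(words) - 3):
            context = tuple(words[i:i + 3])
            counts[context][words[i + 3]] += 1

    def most_likely_next(context):
        # Look up the most frequent continuation for a 3-word context.
        return counts[tuple(context)].most_common(1)[0][0]

    print(most_likely_next(["a", "b", "c"]))  # "d" (seen 12 times vs 3)
    ```

    The storage scales with the number of distinct contexts, not with how many times each sentence was read.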

    I suspect I might regret engaging in this supposition because I’m probably about to be inundated with techbros telling me how wrong I am. Whatever. Have at me, edge lords.