That tldr needs a tldr…
But also you absolutely can learn & build small versions of LLMs on a regular laptop. I built one on my old 2017 Dell XPS and trained it on the complete works of Shakespeare. It learnt to write almost-passable Shakespeare hallucinations in a couple of hours. There’s a good tutorial online if you search for it.
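For a rough sense of what that kind of tutorial involves, here's a minimal sketch of the idea: a character-level bigram model trained on a local Shakespeare text file. It assumes PyTorch; the file name, hyperparameters, and step count are my own placeholders, and a real build like the one described would use a small Transformer and far more training.

```python
# Minimal sketch: character-level bigram language model on shakespeare.txt.
# Placeholder hyperparameters; not the exact tutorial being referred to.
import torch
import torch.nn as nn
import torch.nn.functional as F

text = open("shakespeare.txt", encoding="utf-8").read()
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

class BigramLM(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        # Each character directly predicts a distribution over the next character.
        self.table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx):
        return self.table(idx)  # (batch, time, vocab) logits

    @torch.no_grad()
    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits = self(idx)[:, -1, :]
            probs = F.softmax(logits, dim=-1)
            idx = torch.cat([idx, torch.multinomial(probs, 1)], dim=1)
        return idx

model = BigramLM(len(chars))
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
block, batch = 8, 32

for step in range(5000):  # a couple of hours on a laptop buys far more steps than this
    ix = torch.randint(len(data) - block - 1, (batch,))
    x = torch.stack([data[i:i + block] for i in ix])
    y = torch.stack([data[i + 1:i + block + 1] for i in ix])
    logits = model(x)
    loss = F.cross_entropy(logits.view(-1, logits.size(-1)), y.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

start = torch.zeros((1, 1), dtype=torch.long)  # seed generation with character index 0
print("".join(itos[i] for i in model.generate(start, 500)[0].tolist()))
```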
Well shit.
I’ve only figured out how to run one locally on a Steam Deck, not build and train one.
Still though, even for this more primitive one you built, I’m guessing its overall file footprint was orders of magnitude larger than a simpler, megabyte-sized chatbot that runs in a local terminal, right?
Oh sure. The training data is about 5 MB, so easily manageable on a relatively modern machine, but not on the kind of thing that was used for ELIZA.
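For contrast, the megabyte-scale terminal chatbot mentioned above needs no training data at all: just a handful of hard-coded pattern/response rules. A toy ELIZA-style sketch (the rules here are illustrative, not the original ELIZA script) might look like:

```python
# Toy ELIZA-style chatbot: regex keyword rules, no model, a few kilobytes on disk.
import random
import re

RULES = [
    (r"\bi feel (.+)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"\bi am (.+)", ["Why do you say you are {0}?", "Do you enjoy being {0}?"]),
    (r"\bbecause\b", ["Is that the real reason?", "What other reasons come to mind?"]),
    (r".*", ["Please, go on.", "Tell me more.", "How does that make you feel?"]),
]

def respond(line: str) -> str:
    # First matching rule wins; the catch-all at the end always matches.
    for pattern, responses in RULES:
        match = re.search(pattern, line.lower())
        if match:
            return random.choice(responses).format(*match.groups())
    return "Please, go on."

if __name__ == "__main__":
    print("Hello. What is on your mind?")
    while True:
        try:
            print(respond(input("> ")))
        except (EOFError, KeyboardInterrupt):
            break
```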