The Picard Maneuver@piefed.world to Lemmy Shitpost@lemmy.world · 15 hours ago
Seems legit (image, media.piefed.world)
DarkCloud@lemmy.world · 15 hours ago
You can get offline versions of LLMs.
criss_cross@lemmy.world · 13 hours ago
And gpt-oss is an offline version of ChatGPT.
sp3ctr4l@lemmy.dbzer0.com · 11 hours ago
I've been toying with Qwen3 on my Steam Deck. The 8 billion parameter model runs stably, and it's open source too! Alpaca is a neat little flatpak that containerizes everything and makes running local models so easy that I can literally do it without a mouse or keyboard.
utopianfiat@lemmy.world · 14 hours ago
Indeed: https://huggingface.co/openai-community
Ghostalmedia@lemmy.world · 10 hours ago
I mean, most people have a local LLM in their pocket right now.
linkinkampf19 🖤🩶🤍💜🇺🇦@lemmy.world · 14 hours ago
First thing that came to mind: GPT4All
https://ollama.org/
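For anyone curious what running a local model actually looks like, here is a minimal sketch using Ollama's Python client. It assumes the `ollama` package is installed, a local Ollama server is running, and a Qwen3 8B model has already been pulled under a tag such as `qwen3:8b` (the exact tag name is an assumption, not something from this thread):

```python
# Minimal sketch: chat with a locally hosted model through the Ollama
# Python client. Assumes `pip install ollama`, a running local Ollama
# server, and that something like `ollama pull qwen3:8b` was run first
# (model tag is an assumption for illustration).
import ollama

response = ollama.chat(
    model="qwen3:8b",  # any locally pulled model tag works here
    messages=[
        {"role": "user", "content": "Explain in one sentence what an offline LLM is."}
    ],
)

# The request only ever hits the local server; nothing leaves the machine.
print(response["message"]["content"])
```

GUI front-ends like Alpaca and GPT4All wrap this same kind of local inference behind a point-and-click interface, so no code is strictly required.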