Self-Host Weekly (30 January 2026) (selfh.st)
jogai_san@lemmy.world to Selfhosted@lemmy.world, English · 8 hours ago · 19 comments
SanPe_@lemmy.world · 5 hours ago
The “else-hosted” LLM AI is really not my thing, and self-hosted even less so…
irmadlad@lemmy.world · 3 hours ago
If I had the proper equipment, I’d run AI, provided it were self-contained and not pinging out to another LLM.
David J. Atkinson@c.im · 3 hours ago
@irmadlad @selfhosted That is precisely the challenge. I’m not sure it is possible.
irmadlad@lemmy.world · 12 minutes ago
I mean, I can run a few of the private AI stacks, but they are so excruciatingly slow that it isn’t worth the time. I would want something pretty responsive.
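For anyone wanting the fully self-contained setup described above, here is a minimal sketch, assuming a local Ollama install with a small model already pulled (the model name below is just an example; pick whatever fits your hardware). Once the daemon is running, every request stays on localhost and nothing pings out to an external LLM:

```python
import json
import urllib.request

# Query a locally running Ollama server (default port 11434).
# Assumes `ollama pull llama3.2` was run beforehand; the model name
# is an example -- swap in whatever your hardware can handle.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3.2",
    "prompt": "Summarize why self-hosting matters in one sentence.",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Everything stays on localhost; no external API is contacted.
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())

print(result["response"])
```

On the responsiveness complaint: smaller quantized models are usually the difference between “excruciatingly slow” and usable on consumer hardware, at the cost of output quality.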