The Picard Maneuver@piefed.world to Lemmy Shitpost@lemmy.world · English · 15 hours ago
Seems legit (image: media.piefed.world)
Uriel238 [all pronouns]@lemmy.blahaj.zone · English · 7 hours ago
Offline LLMs exist but tend to have a few terabytes of base data just to get started (e.g. before LoRAs).
nomorebillboards@lemmy.world · 1 hour ago
I thought it was more like 10-20 GB to start out with a usable (but somewhat stupid) model. Are you confusing the size of the dataset with the size of the model?
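For a rough sense of scale, here's a back-of-the-envelope sketch (the parameter counts and quantization levels are illustrative assumptions, not figures from the thread): a model checkpoint is roughly its parameter count times the bytes per parameter, which puts typical local models in the single-digit to tens of GB, whereas multi-terabyte figures are more characteristic of the training dataset than of the weights themselves.

```python
# Rough model-size estimate: parameter count × bytes per parameter.
# The parameter counts and quantization levels below are illustrative
# assumptions, not figures for any specific model release.

def model_size_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate on-disk size in GB of a dense LLM checkpoint."""
    total_bytes = params_billions * 1e9 * (bits_per_param / 8)
    return total_bytes / 1e9

for params in (7, 13, 70):  # common local-model scales, in billions of parameters
    for label, bits in (("FP16", 16), ("8-bit", 8), ("4-bit", 4)):
        print(f"{params}B @ {label}: ~{model_size_gb(params, bits):.1f} GB")
```

Running it gives about 3.5 GB for a 7B model at 4-bit and about 14 GB at FP16, which lines up with the 10-20 GB ballpark for a small but usable local model.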