• 2 Posts
  • 173 Comments
Joined 2 years ago
Cake day: July 2nd, 2023



  • I used a Cloudflare tunnel for streaming music in Jellyfin. I didn't do much else with it and it worked pretty well. For anything high bandwidth you should use something else, but for stuff that doesn't consume a ton of bandwidth, like music streaming in my case, it worked fine, at least when I used it a few years back.


  • Bluefruit@lemmy.world to Selfhosted@lemmy.world · Ideas · 1 month ago

    Cost, privacy, and control.

    No matter what happens to stuff outside my network, I have full control over my data and hardware, without paying someone for theirs.

    I still haven't set up my self-hosting stuff yet, since I'm still moving in with my girlfriend and unpacking, but I'll be using my mini PCs for Home Assistant, Nextcloud, Immich, and Jellyfin to start with. I may set up some arr services as well, but I kinda like to just pay for things to own them if I can.


  • Yea, I'm aware, but I appreciate the insight :) So far my local AI experience has been lackluster, so I'm hoping that training and RAG will make up for the context size at least a little. If it can answer accurately in the first place, it may not need as big of a context window.

    If you haven't tried using RAG in some form, I would recommend giving it a go. It's pretty cool stuff; it helps make models answer more accurately based on the documentation you give them (there's a rough sketch of the idea at the end of this comment), though in my case I've had limited success. Tbh, ChatGPT has become my last resort when I just wanna get something done, but I don't like using it due to the privacy concerns, not to mention the ethical issues I have with AI training in general from big tech.

    How is SearXNG, BTW? Would you say it's good to host, or do you use a normal search engine more often? Or do you just use it for the AI search plugin?

    I've actually been thinking about using it rather than DuckDuckGo, but was also hopeful that the search index they're working on would be enough to satisfy my needs, or that a self-hosted AI-enabled search engine would work well enough when I need it.
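
    A minimal sketch of the RAG idea mentioned above, assuming a local model served behind an OpenAI-compatible chat endpoint (koboldcpp and similar local servers can expose one). The endpoint URL, model name, and example documents are placeholders, and the retrieval here is plain TF-IDF rather than proper embeddings:

    ```python
    # RAG sketch: retrieve the most relevant doc chunks, then ask a local model.
    # Assumes a local server with an OpenAI-compatible /v1/chat/completions endpoint;
    # the URL, port, and model name below are placeholders, not a specific setup.
    import requests
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    DOCS = [  # in practice: chunks of your own documentation
        "Home Assistant keeps its main settings in configuration.yaml.",
        "Jellyfin serves media over HTTP on port 8096 by default.",
        "Immich needs PostgreSQL and Redis to run.",
    ]

    def retrieve(question: str, k: int = 2) -> list[str]:
        """Return the k doc chunks most similar to the question (TF-IDF, no embeddings)."""
        vec = TfidfVectorizer().fit(DOCS + [question])
        scores = cosine_similarity(vec.transform([question]), vec.transform(DOCS))[0]
        return [DOCS[i] for i in scores.argsort()[::-1][:k]]

    def ask(question: str) -> str:
        """Send the question plus retrieved context to the local model."""
        context = "\n".join(retrieve(question))
        resp = requests.post(
            "http://localhost:5001/v1/chat/completions",  # placeholder local endpoint
            json={
                "model": "local-model",  # placeholder name
                "messages": [
                    {"role": "system", "content": f"Answer using only this context:\n{context}"},
                    {"role": "user", "content": question},
                ],
            },
            timeout=120,
        )
        return resp.json()["choices"][0]["message"]["content"]

    print(ask("What port does Jellyfin listen on?"))
    ```

    The point of the retrieval step is that only the few relevant chunks go into the prompt, which is what keeps the required context window small.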



  • That's why I was considering training my own model if possible. I've been toying around with koboldcpp and GPT4All, which both have RAG implementations.

    My idea is to essentially chat with documentation and, as a separate use case, have it potentially be an AI search engine, but locally hosted (something like the sketch at the end of this comment). I do still prefer to search myself, but fuck man, searches have gotten so bad, and the koboldcpp web lookup feature was pretty neat IMO.

    So yea, you're not wrong. I'm just hoping that if I train it and/or give it documentation it can reference when answering, it will be suitable. Mostly AI has been good for me as kind of a rubber ducky when troubleshooting, and for helping me search for things when I have some specific question and don't want "top 5 things vaguely related to your question" results.
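
    A rough sketch of the "locally hosted AI search" idea above: pull results from a self-hosted SearXNG instance and have a local model summarize them. This assumes SearXNG's JSON output format is enabled in its settings; the URLs and model name are placeholders:

    ```python
    # Local "AI search" sketch: fetch SearXNG results, then summarize with a local LLM.
    # Assumes a SearXNG instance with JSON output enabled and a local model behind an
    # OpenAI-compatible endpoint; both URLs and the model name are placeholders.
    import requests

    SEARXNG_URL = "http://localhost:8080/search"           # placeholder SearXNG instance
    LLM_URL = "http://localhost:5001/v1/chat/completions"  # placeholder local LLM endpoint

    def web_context(query: str, k: int = 5) -> str:
        """Grab the top-k SearXNG results and flatten them into a text block."""
        resp = requests.get(SEARXNG_URL, params={"q": query, "format": "json"}, timeout=30)
        results = resp.json().get("results", [])[:k]
        return "\n".join(
            f"{r.get('title', '')}: {r.get('content', '')} ({r.get('url', '')})"
            for r in results
        )

    def answer(query: str) -> str:
        """Ask the local model to answer using only the fetched search snippets."""
        resp = requests.post(LLM_URL, json={
            "model": "local-model",  # placeholder name
            "messages": [
                {"role": "system",
                 "content": f"Answer the question using only these search results:\n{web_context(query)}"},
                {"role": "user", "content": query},
            ],
        }, timeout=120)
        return resp.json()["choices"][0]["message"]["content"]

    print(answer("how do I add a new media library in jellyfin?"))
    ```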


  • Getting ready to move out of the woods and back to civilization with my partner.

    Not looking forward to having neighbors above or below me, but I'm very excited to have internet that doesn't fucking suck.

    Once we're moved and a bit more settled, I'm gonna start really digging into selfhosting things. I have the hardware: a couple of HP mini PCs that will run Home Assistant, and probably a server for various Docker things. Nextcloud and Immich seem to be the things I wanna use so far. I already have a NAS set up, but was having an issue with it not booting if a monitor isn't plugged in. I bought a dummy plug for it but haven't tried it out yet.

    Will also be setting up an AI server for local LLM use. I hope to train one to fit my needs once I pull the trigger on a 3060 12GB card, but I need to figure out what other parts I'll use. Might upgrade my main rig and use the parts from that, or maybe I'll buy an old Dell and fix it up. Not sure yet.

    Lots of ideas, so little time lol.