The Picard Maneuver@piefed.world to Lemmy Shitpost@lemmy.world · English · 15 hours ago

Seems legit

[image: media.piefed.world]

443 points · 44 comments

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 hours ago

    Offline LLMs exist, but they tend to have a few terabytes of base data just to get started (i.e., before LoRAs).

    • nomorebillboards@lemmy.world · 1 hour ago

      I thought it was more like 10-20 GB to start out with a usable (but somewhat stupid) model.

      Are you confusing the size of the dataset with the size of the model?

  • khepri@lemmy.world · 7 hours ago (edited)

    Could you crunch an LLM into 700 MB that was still functional? 'Cause this looks like a fun thing to actually do as a joke.

    Edit: I bet I could get https://huggingface.co/distilbert/distilgpt2 to run off a CD. How many tps am I gonna get, guys 🤣
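
    A minimal sketch of the experiment with Hugging Face's transformers library, assuming the model files were first copied onto the disc with save_pretrained() (the mount point below is hypothetical):

    ```python
    # Run distilgpt2 from a read-only path such as a mounted CD.
    # Assumes `pip install transformers torch`, and that
    # AutoModelForCausalLM.from_pretrained("distilbert/distilgpt2").save_pretrained(...)
    # was used to put the files on the disc beforehand.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_PATH = "/mnt/cdrom/distilgpt2"  # hypothetical mount point

    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
    model = AutoModelForCausalLM.from_pretrained(MODEL_PATH)

    inputs = tokenizer("Seems legit:", return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=40, do_sample=True)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
    ```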

    • yellow [she/her]@lemmy.blahaj.zone · 5 hours ago

      Qwen3-0.6B is about 400 MB at Q4 and is surprisingly coherent for what it is.
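
      A minimal sketch of running it with llama-cpp-python, assuming a locally downloaded Q4 GGUF (the filename below is illustrative):

      ```python
      # Load a ~400 MB Q4 GGUF of Qwen3-0.6B and generate a few tokens.
      # Assumes `pip install llama-cpp-python` and a downloaded GGUF file.
      from llama_cpp import Llama

      llm = Llama(model_path="qwen3-0.6b-q4_k_m.gguf", n_ctx=2048)  # illustrative filename
      out = llm("Explain what a CD-ROM is in one sentence.", max_tokens=64)
      print(out["choices"][0]["text"])
      ```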

      • khepri@lemmy.world · 3 hours ago

        Wow, just popped it onto my very slow desktop and this little model rips, haha. I really think tiny LLMs with a good LoRA on top are going to be a huge deal going forward.
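
        A sketch of the tiny-model-plus-LoRA idea with Hugging Face's peft library, assuming the Qwen3-0.6B base; the adapter id below is hypothetical:

        ```python
        # Stack a LoRA adapter on a small base model with peft.
        # Assumes `pip install transformers peft torch`.
        from transformers import AutoModelForCausalLM, AutoTokenizer
        from peft import PeftModel

        base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-0.6B")
        tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-0.6B")
        model = PeftModel.from_pretrained(base, "your-org/your-lora-adapter")  # hypothetical adapter

        inputs = tokenizer("Hello!", return_tensors="pt")
        output = model.generate(**inputs, max_new_tokens=32)
        print(tokenizer.decode(output[0], skip_special_tokens=True))
        ```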

      • khepri@lemmy.world · 4 hours ago

        That's so crazy that an LLM capable of doing anything at all can be that small! That leaves room for like an entire .avi episode of Family Guy at DVD resolution on there, which is the natural choice for the remaining space, of course.

    • lime!@feddit.nu · 6 hours ago (edited)

      there's also tinyllama, which is somewhere around 600 MB. it's hilariously inept. it's like someone jpeg-compressed a robot.

      also you're only gonna load off of that CD once, so it'll perform fine.

  • tomiant@piefed.social · 13 hours ago

    FCKGW-RHQQ2-YXRKT-8TG6W-2B7Q8

    • Eager Eagle@lemmy.world · 8 hours ago (edited)

      make sure to disconnect the internet first

    • Ghostalmedia@lemmy.world · 10 hours ago

      CrAcKeD

  • DarkCloud@lemmy.world · 15 hours ago

    You can get offline versions of LLMs.

    • criss_cross@lemmy.world · 13 hours ago

      And gpt-oss is an offline version of ChatGPT.

    • sp3ctr4l@lemmy.dbzer0.com · 11 hours ago (edited)

      I've been toying with Qwen3.

      On my Steam Deck.

      The 8 billion parameter model runs stably.

      It's open source too!

      Alpaca is a neat little Flatpak that containerizes everything and makes running local models so easy that I can literally do it without a mouse or keyboard.

    • SubArcticTundra@lemmy.ml · 9 hours ago

      https://ollama.org/
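
      Ollama exposes a local REST API once a model is pulled; a minimal sketch, assuming `ollama pull llama3.1` has been run:

      ```python
      # Query a locally running Ollama server over its REST API.
      import requests

      resp = requests.post(
          "http://localhost:11434/api/generate",
          json={"model": "llama3.1", "prompt": "Why is the sky blue?", "stream": False},
      )
      print(resp.json()["response"])
      ```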

    • utopianfiat@lemmy.world · 14 hours ago

      Indeed https://huggingface.co/openai-community

    • Ghostalmedia@lemmy.world · 10 hours ago

      I mean, most people have a local LLM in their pocket right now.

    • linkinkampf19 🖤🩶🤍💜🇺🇦@lemmy.world · 14 hours ago

      First thing that came to mind: GPT4All

  • SubArcticTundra@lemmy.ml · 9 hours ago

    Does anyone know of any OSS LLMs that can search the web the way ChatGPT can?

    • yellow [she/her]@lemmy.blahaj.zone · 4 hours ago (edited)

      It's not the LLM that does the web searching, but the software stack around it. On its own, an LLM is just a text completer. What you'd need is a frontend like OpenWebUI or Perplexica that would ask the LLM for, say, five internet search queries that could return useful information for the prompt, throw those queries into SearxNG, and then pipe the results into the LLM's context for it to use.
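
      A minimal sketch of that loop, assuming a SearxNG instance with its JSON API enabled and a local OpenAI-compatible LLM server (both URLs and the model name below are placeholders):

      ```python
      # Search-augmented loop: LLM proposes a query, SearxNG runs it,
      # and the results are piped back into the LLM's context.
      import requests

      SEARX = "http://localhost:8888/search"             # assumed SearxNG instance
      LLM = "http://localhost:8080/v1/chat/completions"  # assumed local LLM server

      def web_search(query: str, k: int = 5) -> list[str]:
          # SearxNG's JSON API: /search?q=...&format=json (must be enabled in settings)
          r = requests.get(SEARX, params={"q": query, "format": "json"})
          return [hit.get("content", "") for hit in r.json()["results"][:k]]

      def ask(prompt: str) -> str:
          r = requests.post(LLM, json={
              "model": "local-model",  # placeholder name
              "messages": [{"role": "user", "content": prompt}],
          })
          return r.json()["choices"][0]["message"]["content"]

      question = "What's new in the latest Lemmy release?"
      query = ask(f"Write one short web search query for: {question}")
      context = "\n".join(web_search(query))
      print(ask(f"Search results:\n{context}\n\nUsing them, answer: {question}"))
      ```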

      As for the models themselves, any decently-sized one that was released fairly recently would work. If you're looking specifically for open-source rather than open-weight models (meaning that the training data and methodologies were also released rather than just the model weights), GPT-OSS 20B/120B and the OLMo models are recent standouts there. If not, the Qwen3 series are pretty good. (There are other good models out there; this is just what I remember off the top of my head.)

      • SubArcticTundra@lemmy.ml · 13 minutes ago

        Thank you

    • MonkderVierte@lemmy.zip · 9 hours ago

      Depends. Does ChatGPT ignore robots.txt too?

  • Björn@swg-empire.de · 14 hours ago

    It's just audio of French cats farting.

    • Lemmyoutofhere@lemmy.ca · 13 hours ago

      Le pfffft.

  • SSUPII@sopuli.xyz · 14 hours ago (edited)

    If we assume a CD, you can probably fit a 256M-parameter model on it. But it will LOAD.

    • MacN'Cheezus@lemmy.today · 12 hours ago

      DVDs exist. They can fit approx. 7B params, enough to be somewhat productive.
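
      The back-of-the-envelope arithmetic, assuming roughly 4-bit quantization (~0.5 bytes per parameter) and ignoring file-format overhead:

      ```python
      # How many quantized parameters fit on each disc at ~0.5 bytes/param (Q4)?
      BYTES_PER_PARAM = 0.5  # assumed 4-bit quantization

      for name, capacity_bytes in {"CD": 700e6, "DVD": 4.7e9, "Blu-ray": 25e9}.items():
          print(f"{name}: ~{capacity_bytes / BYTES_PER_PARAM / 1e9:.1f}B parameters")
      # CD: ~1.4B, DVD: ~9.4B, Blu-ray: ~50.0B (overhead pushes real numbers lower)
      ```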

  • NullPointerException@lemmy.ca · 15 hours ago

    That's just Dr. Sbaitso.

  • faizalr@piefed.social · 14 hours ago

    It reminds me of the Encyclopædia Britannica on CD.

    • KyuubiNoKitsune@lemmy.blahaj.zone · 8 hours ago

      Encarta 95

  • MidsizedSedan@lemmy.world · 15 hours ago

    Isn't it possible to download all of Wikipedia, and isn't the file size surprisingly small? Could it fit on a CD?

    • AmbiguousProps@lemmy.today · 14 hours ago

      It could fit on a BDXL disc.

      • masterspace@lemmy.ca · 13 hours ago

        You can fit text-only Wikipedia on a normal Blu-ray, as it's only about 24 GB. You can also easily fit Llama 3.1 or any of the other open, offline-capable AI models, as they're only about 4 GB.

      • gustofwind@lemmy.world · 14 hours ago

        Could also store it on a flash drive or microSD card.

    • Axolotl@feddit.it · 15 hours ago

      No, you really can't; the text-only version is like 43 GB.

      • puppycat (she/her)@lemmy.blahaj.zone · 1 hour ago

        Yes, you really can; it's like 20-25 GB depending on how recent a copy you have. I've been seeding Wikipedia for almost a year and it barely takes any space on my computer.

      • BanMe@lemmy.world · 14 hours ago

        So gonna need like 2 CDs then

    • SSUPII@sopuli.xyz · 15 hours ago

      No

      (English) 24.05 GB without media. Adding media adds 428.36 TB.
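
      For scale, the disc arithmetic on that 24.05 GB text-only figure:

      ```python
      # Discs needed for a 24.05 GB text-only dump (capacities in GB).
      import math

      dump_gb = 24.05
      for disc, gb in {"CD": 0.7, "DVD": 4.7, "Blu-ray": 25, "BDXL": 100}.items():
          print(f"{disc}: {math.ceil(dump_gb / gb)} disc(s)")
      # CD: 35, DVD: 6, Blu-ray: 1, BDXL: 1
      ```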

      • Axolotl@feddit.it · 15 hours ago (edited)

        Can you give me a link to the text-only version? I only found a version that is like 43 GB.

        • SSUPII@sopuli.xyz · 15 hours ago

          The sizes I mentioned are from around 2023-2024, from https://en.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia

          https://dumps.wikimedia.org/enwiki/ (https://en.wikipedia.org/wiki/Wikipedia:Database_download)

        • ZkhqrD5o@lemmy.world · 14 hours ago

          I suggest the happy medium called Kiwix; directly from the programme, you can download all of Wikipedia with medium-sized pictures for a hundred gigabytes or so.

        • J_on_Lemmy@lemmy.ml · 14 hours ago

          Kiwix on mobile gives a 111.1 GB Wikipedia download. It also has a bunch of different categories if you don't want the super large one.

      • GregorGizeh@lemmy.zip · 12 hours ago (edited)

        500 TB is still surprisingly reasonable for what is essentially a library of human (surface-level) knowledge.

        It would be interesting to know how large the file would be including all text-form references (I'd imagine anything else, such as videos, would completely blow the proportions).

    • Admiral Patrick@dubvee.org · 11 hours ago (edited)

      The full 2025-04 English-only ZIM dump is about 120 GB. That includes reduced-size images as well as all articles. I think the text-only version is in the 40-60 GB range.

      There are smaller ZIM versions in the ~4 GB range that would fit on a DVD, but they’re only a subset for specific topics or for a list of the most popular topics.

    • Rain World: Slugcat Game@lemmy.world · 13 hours ago

      kiwix? that’s compressed (afaik), and when i tried, it took up half of my disk space and needed ethernet

  • SanctimoniousApe@lemmings.world · 14 hours ago

    Maybe they meant GTA?
