• Prove_your_argument@piefed.social · 2 days ago

You misunderstand. I’m not saying AI chat answers are better than a quality article. I’m saying the search results they surface are often better.

Don’t look at what they SAY. Look at the links they provide as sources. Many are bad, but I find I get much better info from them. If I try Google, I might go through 10+ links before I find one that really says what I want. If I ask a chatbot, I typically get a relevant link within one or two clicks.

There is no shortcut for intelligence… but AI “SEO” has not been perfected yet.

    • Honse@lemmy.dbzer0.com · 1 day ago

      Interesting. LLMs have no ability to directly do anything but output text, so the tooling around the LLM is what’s actually searching. They probably use some API from Bing or something. Have you compared results with those from Bing? I’d be interested to see how similar they are, and how much extra tooling is used for search. I can’t imagine they want to spend a lot of cycles generating only about three search queries per request, unless they have a smaller dedicated model for that. I’d be interested to see the architecture behind it and what’s different from normal search engines.
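
      My guess at the plumbing looks something like the sketch below. Everything in it is assumed for illustration: the endpoints, the payloads, and the idea of a smaller query-writing model are not anything the vendors have documented.

      ```typescript
      // Hypothetical sketch of LLM-plus-search tooling. All URLs and field
      // names below are made up for illustration.

      interface SearchResult {
        title: string;
        url: string;
        snippet: string;
      }

      // Step 1: have a model (possibly a smaller, cheaper one) turn the
      // user's question into a few web search queries.
      async function generateQueries(question: string): Promise<string[]> {
        const res = await fetch("https://llm.example.com/v1/complete", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            prompt: `Write up to 3 web search queries for: ${question}`,
          }),
        });
        const { text } = await res.json();
        return text.split("\n").filter((q: string) => q.trim().length > 0);
      }

      // Step 2: hand each query to an ordinary search API (Bing or similar).
      async function webSearch(query: string): Promise<SearchResult[]> {
        const res = await fetch(
          `https://search.example.com/api?q=${encodeURIComponent(query)}`
        );
        return (await res.json()).results;
      }

      // Step 3: collect the results so they can be stuffed back into the
      // prompt, which is where the citations come from.
      async function gatherSources(question: string): Promise<SearchResult[]> {
        const queries = await generateQueries(question);
        const perQuery = await Promise.all(queries.map(webSearch));
        return perQuery.flat();
      }
      ```

      If it really is just a thin wrapper like this, the interesting differences from a normal search engine would be in step 1 (how the queries are written) and in whatever re-ranking happens before the results reach the prompt.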

    • gustofwind@lemmy.world · 2 days ago

      Yep, this has happened to me too.

      I used to always get the results I was looking for; now it’s just pure garbage, but Gemini will have all the expected results as sources.

      Obviously deliberate to force us to use Gemini and make free searches useless. Maybe it hasn’t been rolled out to everyone yet, but it’s certainly got us.

      • AmbitiousProcess (they/them)@piefed.social · 2 days ago

        I’m honestly not even sure it’s deliberate.

        If you give a probability-guessing machine like an LLM the ability to review content, it’s probably just gonna rank things closer to what you expect for your specific search than an algorithm built to pull the most relevant links extremely quickly, based only on some of each page’s keywords, with no understanding of how the context of your search relates to that page.

        The downside is, of course, that LLMs use way more energy than regular search algorithms, take longer to provide all their citations, etc.
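
        Very roughly, the contrast is something like the sketch below. It’s purely illustrative TypeScript, and the LLM endpoint is invented; no search engine or model vendor publishes code like this.

        ```typescript
        // Illustrative contrast between keyword ranking and LLM-based ranking.
        // The endpoint below is made up.

        interface Page {
          url: string;
          text: string;
        }

        // Classic-style scoring: fast keyword overlap, no sense of context.
        function keywordScore(query: string, page: Page): number {
          const terms = query.toLowerCase().split(/\s+/);
          const body = page.text.toLowerCase();
          return terms.filter((t) => body.includes(t)).length;
        }

        // LLM-style scoring: ask a model how well the page answers the query.
        // Context-aware, but far slower and more expensive per page.
        async function llmScore(query: string, page: Page): Promise<number> {
          const res = await fetch("https://llm.example.com/v1/complete", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({
              prompt:
                `On a scale of 0 to 10, how well does this page answer "${query}"?\n\n` +
                page.text.slice(0, 2000),
            }),
          });
          const { text } = await res.json();
          return Number.parseFloat(text) || 0;
        }
        ```

        That per-page model call is exactly where the extra energy and latency come from.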

        • Prove_your_argument@piefed.social · 1 day ago

          A ton of factors have increased energy costs on the web over the years. Each one is insignificant per person, but bandwidth use is vastly higher because all websites ship miles of crud formatting code nowadays, memory usage is out of control, and the transmission and storage of all the metadata your web browser reports in realtime as you move your mouse around a page is far beyond anything we had in the early days of the web.
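
          For a sense of what that realtime metadata can look like, plenty of pages run something like the snippet below (the analytics endpoint is made up, and real trackers usually batch events, but the per-user cost still adds up):

          ```typescript
          // Hypothetical mouse-tracking telemetry; the endpoint is invented.
          document.addEventListener("mousemove", (event: MouseEvent) => {
            navigator.sendBeacon(
              "https://analytics.example.com/collect",
              JSON.stringify({ x: event.clientX, y: event.clientY, t: Date.now() })
            );
          });
          ```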

          The energy cost of ML will come down as chips progress, but I think financial reality will come crashing down on the AI industry sooner rather than later and keep it out of reach for most people anyway due to cost. I don’t see much of an ROI for AI. Right now it’s treated as a capital investment that helps inflate a company’s worth while it lasts, but after a few years the investment is worthless, and if it’s actually used it’s a giant money sink from energy costs.