• Auth@lemmy.world
    link
    fedilink
    English
    arrow-up
    16
    ·
    10 hours ago

    More Larian Aura farming. Please take a break; you’re already fully maxed out on community respect. It’s actually getting unfair to other game developers.

  • trslim@pawb.social
    link
    fedilink
    English
    arrow-up
    22
    ·
    14 hours ago

    AI is just a marketing term; there’s nothing intelligent about it. It’s simply large language models: statistical systems that predict what should come next. It’s like asking the prediction bar on your keyboard to write a story while you’re typing.
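The “prediction bar” comparison above can be sketched as a toy next-word model (an illustration only: the corpus is invented, and real LLMs are large neural networks rather than lookup tables, though their training objective is likewise next-token prediction):

```python
from collections import Counter, defaultdict

# Count which word follows which in a tiny invented corpus, then
# "predict" by picking the most frequent follower -- the same idea as a
# phone keyboard's suggestion bar, at a vastly smaller scale.
corpus = "the cat sat on the mat and the cat ran".split()

following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict(word):
    """Return the word most often seen after `word`, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict("the"))  # "cat": it followed "the" twice, "mat" only once
```

Asking such a model to “write a story” just chains these most-likely picks together, which is the commenter’s point.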

    • thelasttoot@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      3
      ·
      5 hours ago

      It isn’t, though. It’s a thing that makes shit, and it makes shit well enough that execs seriously consider it. Even though it can only make a commercial 60% as good as humans can, that’s good enough, because AI can make 10 commercials in the same time traditional creators can make 1. It doesn’t matter how bad the AI commercials are, because they can overwhelm any competition in sheer abundance. AI ads will drown out traditional ads: they’re easier to make and infinitely customisable. I can make 10 new ads a day for less than it would cost to make a single traditional ad. There really is no comparison.

      • ThirdConsul@lemmy.zip
        link
        fedilink
        English
        arrow-up
        4
        ·
        4 hours ago

        Because AI can make 10 commercials in the same time traditional creators can make 1

        Famously, the Coca-Cola Christmas commercial took more time and money to create than a traditional ad would have?

  • hperrin@lemmy.ca
    link
    fedilink
    English
    arrow-up
    42
    arrow-down
    1
    ·
    17 hours ago

    I think the reason so many AI bros are conservative is that conservatives have historically had really bad taste in art/media, so they see the drivel AI creates and think, “oh wow, it looks just like what the artists make,” not realizing that they don’t have the eye to see what it’s missing.

  • kibiz0r@midwest.social
    link
    fedilink
    English
    arrow-up
    109
    ·
    20 hours ago

    I like the way Ted Chiang puts it:

    Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.

    There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.

    I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.

    • MoonMelon@lemmy.ml
      link
      fedilink
      English
      arrow-up
      60
      ·
      20 hours ago

      The dialog pushing AI media seems to start from this assumption that I consume media just to have colors and words and sounds enter my face holes. In fact, I consume art and media because I like hearing, seeing, and reading about how other humans experience the same world I do. It’s a form of communication. I like the product but also the process of people trying to capture the bonkers, ineffable experience we all seem to be sharing in ways I would never think of, but can instantly verify.

      What’s funny is, due to the nature of media, it’s kind of impossible not to communicate something, even if the artwork itself is empty. When I see AI media, I see the communication of a mind that doesn’t know or give a shit about any of this. So in their attempt to make filler, they are in fact making art about how inarticulate they are. It’s unintentional, corporate dadaism.

      • hikaru755@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        52 minutes ago

        While I fully agree with you regarding my reasons to consume media, I’m not so sure if that’s actually a majority opinion. I get the impression that for a lot of people, the point of consuming media is actually just entertainment, something to take your mind off other things, and they don’t care about the communication and connection aspect.

      • Coelacanth@feddit.nu
        link
        fedilink
        English
        arrow-up
        20
        ·
        20 hours ago

        Yes, this is it right here. The whole point of art is communication and connection with another human being.

        • Holytimes@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          1 hour ago

          Eh, not always. Hell, I wouldn’t even say it is most of the time.

          Art for OTHERS is for communicating.

          Art for yourself is everything from glorified shit to slop to trash to works of art and soul searching.

          Art for marketing is better low-effort and shotgunned more often than not, and that has actual science to back it up.

          Art in the grand scheme of things is goddamn fucking anything. Meaning or no meaning. Soul or no soul.

          Heaven knows I’ve spent days on clay works that I just toss because they existed for nothing more than something to do. I’ve made paintings that I’ve proudly hung on my wall, while others I could barely give two shits about, and frankly a random AI-generated image would have more artistic merit in its prompt than the slop I shat onto the canvas.

          Trying to dictate what is and isn’t art always makes you wrong, without exception. Everything is art, no matter what. The only value in art is what each person says it has to them, nothing more or less.

          That’s the neat part: AI art has value, if nothing more than purely utilitarian value. But only if you give it value.

          Just as most people don’t value modern art or a fucking banana nailed to a wall, most won’t give AI art value. And that’s fine.

          The problem is everyone fucking trying to say it has value, just as much as everyone saying it doesn’t.

          You CAN’T dictate what art has value to others. Period.

          I personally don’t see value in AI art. It’s just a thing that exists. Fuck copyright; really, the only argument anyone ever fucking has over AI art is copyright-related. So who the fuck cares? Copyright should be abolished anyway; it’s goddamn fucking stupid.

        • apotheotic (she/her)@beehaw.org
          link
          fedilink
          English
          arrow-up
          11
          ·
          18 hours ago

          Not even necessarily a human being! I’d appreciate the fuck out of art if any species made it. But there must be more than uncaring, unfeeling, probabilistic interpretation of input data.

      • queermunist she/her@lemmy.ml
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        3
        ·
        15 hours ago

        The people pushing AI don’t like hearing, seeing, and reading about how other humans experience the world. They actually do just want flashing colors and sounds poured into their face holes. They’re basically incapable of understanding art.

    • voracitude@lemmy.world
      link
      fedilink
      English
      arrow-up
      15
      ·
      20 hours ago

      Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly

      I like this a lot. I’m going to thieve it.

      • CheeseNoodle@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        15 hours ago

        Tangentially related: the easiest way to come up with a unique and cool idea is to come up with a unique and dumb idea (which is way easier) and then work on it until it becomes cool. (Think how dumb some popular franchises’ concepts are if you take the raw idea out of context.)

  • BaraCoded@literature.cafe
    link
    fedilink
    English
    arrow-up
    33
    arrow-down
    1
    ·
    22 hours ago

    Writer who has toyed with AI here: yeah, AI writing sucks. It’s conventional and bland, never goes into unexpected territory, and often completely fails to understand human nature.

    So we’d better stop calling AI “intelligence”. It’s text-prediction machine learning on steroids, nothing more, and the fact that we’re still calling it “intelligence” shows how gullible we all are.

    It’s just another speculative bubble from the tech bros, as crypto was, except this time the tech bros have made their nazi coming out.

    • RobotsLeftHand@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      ·
      edit-2
      17 hours ago

      I remember reading a longer post on Lemmy. The person was describing their slow realization that the political beliefs they were raised with were leading down a dark path. It was a process that took many years, and the story was full of little moments where cracks in their worldview widened and the seed of doubt grew.

      And someone who was bored or overwhelmed with having to read a post over three sentences long fed the story into AI to make a short summary. They then posted that summary as a “fixed your post, bro” moment: basically, all the humanity removed. It reminds me of the famous “If the Gettysburg Address were a PowerPoint”: https://norvig.com/Gettysburg/

      • NotMyOldRedditName@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        14 hours ago

        That’s really sad.

        I’ve used AI to help clean up my sentence structure for copy, but if I am not super explicit with it to not rewrite what I wrote, it will do as you said and take the human element out of it.

  • rafoix@lemmy.zip
    link
    fedilink
    English
    arrow-up
    49
    arrow-down
    2
    ·
    23 hours ago

    Tons of shit games are going to have lots of dialogue written by AI. It’s very likely that those games would have had shit dialogue anyway.

  • BlackLaZoR@fedia.io
    link
    fedilink
    arrow-up
    6
    ·
    21 hours ago

    You can give it stylized dialogue, but that’s just surface mannerisms. Underneath, it’s still the same bland AI.

      • saltesc@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        22 hours ago

        Implying people are happy to buy the shit, which isn’t likely, especially in a competitive environment.

        • Leon@pawb.social
          link
          fedilink
          English
          arrow-up
          9
          arrow-down
          1
          ·
          22 hours ago

          People buy AAA games all the time. Look at Starfield. Garbage game, still sold well.

          • Mister_Feeny@fedia.io
            link
            fedilink
            arrow-up
            4
            ·
            19 hours ago

            Starfield is estimated to have sold 3 million copies; Baldur’s Gate 3, 15 million. Microsoft/Bethesda marketing budgets make a difference, but not being garbage makes a much bigger difference.

            • fruitycoder@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              1
              ·
              4 hours ago

              Not to mention inertia from studio rep. At some points I’d have bought Bethesda hot-dog water for those studios. Now they could promise me head from a talented hooker with a clean record and I’d still probably pass, because I’d probably just end up 39 hours in: sore, blueballed, and being asked for the next payment.

            • Leon@pawb.social
              link
              fedilink
              English
              arrow-up
              2
              ·
              18 hours ago

              Well, yeah, I’m not going to argue that a well-made game that respects the player isn’t going to do well. But that doesn’t matter to the publishers and their shareholders when they can pump out AI slop garbage year after year and still have people drink it up. Just look at the yearly shooters and sports games; they sell enough.

              Besides, what happens when this sort of slop has been normalised? Look at the mobile market, no one bats an eye at the intensely predatory microtransactions, and you’ll even find people defending things like gacha games.

              There was a time when people scoffed at the notion of paying ~$2 for some shitty cosmetics, but now people don’t even blink at the idea. Hell, it’s downright cheap in some cases. The AAA industry just has to slop things up for long enough for people to stop caring, because they will stop caring, and then they’ll continue to shell out for the dubious privilege of guzzling mediocre, uninspiring garbage.

  • Lemming6969@lemmy.world
    link
    fedilink
    English
    arrow-up
    4
    arrow-down
    9
    ·
    19 hours ago

    I can see how it could be useful, or even mandatory, in future RPGs. It can generate a framework for a real writer, with extremely large amounts of logical branching, a billion times faster. Then you go over the top of it and use the framework as concepts to adopt or revise. This streamlines the process, unifies the creative vision, and allows a game so large, without procedural generation, that it would have taken a team 10 years (or never been made at all) to instead be done in 2.

    • xthexder@l.sw0.com
      link
      fedilink
      English
      arrow-up
      10
      ·
      edit-2
      18 hours ago

      This is based on the assumption that the AI output is any good, but the actual game devs and writers are saying otherwise.

      If the game is too big for writers to finish on their own, they’re not going to have time to read and fix everything wrong with the AI output either. This is how you get an empty, soulless game, not Baldur’s Gate 3.

      • Lemming6969@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        1
        ·
        18 hours ago

        It assumes the AI output isn’t very good. It assumes the AI can create a framework that still needs the actual writers, but now they don’t have to come up with 100% of the framework and can instead work on the actual content. Storyboarding and frameworking is a hodgepodge of nonsense with humans anyway. The goal is to achieve non-linear scaling, not to replace quality writers or have the final product AI-written.

        • xthexder@l.sw0.com
          link
          fedilink
          English
          arrow-up
          5
          ·
          18 hours ago

          This sounds like it takes away a huge amount of creative freedom from the writers if the AI is specifying the framework. It’d be like letting the AI write the plot, but then having real writers fill in details along the way, which sounds like a good way to have the story go nowhere interesting.

          I’m not a writer, but if I was to apply this strategy to programming, which I am familiar with, it’d be like letting the AI decide what all the features are, and then I’d have to go and build them. Considering more than half my job is stuff other than actually writing code, this seems overly reductive, and underestimates how much human experience matters in deciding a framework and direction.

          • Lemming6969@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            ·
            edit-2
            15 hours ago

            Even in programming there are common feature frameworks. Having a system enumerate them based on a unified design vision from a single source architect rather than 50 different design ideas duct taped together could help a lot. I’ve seen some horrendous systems where you can tell a bunch of totally separate visions were frankenstein’d together, and the same happens in games where you can tell different groups wrote different sections.

            • xthexder@l.sw0.com
              link
              fedilink
              English
              arrow-up
              2
              ·
              11 hours ago

              I’ve seen some horrendous systems where you can tell a bunch of totally separate visions were frankenstein’d together

              My experience has been that using AI only accelerates this process, because the AI has no concept of what good architecture is or how to reduce entropy. Unless you can one-shot the entire architecture, it’s going to immediately go off the rails. And if the architecture was that simple to begin with, there really wasn’t much value in the AI in the first place.

    • Shanmugha@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      1
      ·
      15 hours ago

      Aaand you end up with… ta-da, the same old things, just rebranded. Very creative (not).

  • Maiznieks@lemmy.world
    link
    fedilink
    English
    arrow-up
    6
    arrow-down
    11
    ·
    20 hours ago

    Nah, can’t agree. I had postponed a few ideas for years, and I was able to vibe them out in a week of evenings; now I have something usable. 70% of it was vibed; I just had to fix stupid stuff that was partially down to my own queries.

      • Maiznieks@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        5 hours ago

        Wow, look, a professional right here! You must have high job insecurity to care about “machines took our jooobs”. Grow up and realise a POC solution is better than no solution; it’s not like products don’t ever get rewritten, lol.

  • melsaskca@lemmy.ca
    link
    fedilink
    English
    arrow-up
    6
    arrow-down
    20
    ·
    21 hours ago

    That means in about 6 months or so the AI content quality will be about an 8/10. The processors spread machine “learning” incredibly fast. Some might even say exponentially fast. Pretty soon it’ll be like that old song “If you wonder why your letters never get a reply, when you tell me that you love me, I want to see you write it”. “Letters” is an old version of one-on-one tweeting, but with no character limit.

    • NotASharkInAManSuit@lemmy.world
      link
      fedilink
      English
      arrow-up
      23
      arrow-down
      2
      ·
      20 hours ago

      I love how you idiots think this tech hasn’t already hit its ceiling. It’s been functionally stagnant for some time now.

    • xthexder@l.sw0.com
      link
      fedilink
      English
      arrow-up
      11
      ·
      edit-2
      18 hours ago

      What improvements have there been in the previous 6 months? From what I’ve seen, the AI is still spewing the same 3/10 slop it has produced since 2021, with maybe one or two improvements bringing it up from 2/10. I’ve heard several people say some newer/bigger models actually got worse at certain tasks, and clean training data has pretty much dried up for training more models.

      I just don’t see any world where scaling up the compute and power usage is going to suddenly improve the quality by orders of magnitude. By design, LLMs output the most statistically likely response, which almost by definition is going to be the most average, bland response possible.
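The “most statistically likely response” point can be shown with a toy decoding step (the next-word probabilities here are invented for illustration; real models pick from huge vocabularies, but greedy decoding works the same way):

```python
import random

# Invented next-word distribution after some prompt. Greedy decoding
# always takes the argmax, so the safe, generic word wins every time;
# rarer but more interesting words only ever appear if you sample.
next_word_probs = {
    "nice": 0.40,      # safest, most generic continuation
    "pleasant": 0.25,
    "stormy": 0.20,
    "luminous": 0.15,  # choices greedy decoding will never pick
}

def greedy(probs):
    """Pick the single most likely word (the 'most average' choice)."""
    return max(probs, key=probs.get)

def sample(probs, rng):
    """Pick a word in proportion to its probability."""
    words, weights = zip(*probs.items())
    return rng.choices(words, weights=weights, k=1)[0]

print(greedy(next_word_probs))  # always "nice"
```

Sampling with a temperature recovers some variety, but the tendency toward the average is built into the objective.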

    • Skua@kbin.earth
      link
      fedilink
      arrow-up
      14
      ·
      21 hours ago

      Only if you assume that its performance will continue improving indefinitely and (at least) linearly. The companies are really struggling to give their models more compute or more training data now, and frankly it doesn’t seem like there have been any big strides for a while.

      • kibiz0r@midwest.social
        link
        fedilink
        English
        arrow-up
        12
        ·
        20 hours ago

        Yeah… Linear increases in performance appear to require exponentially more data, hardware, and energy.

        Meanwhile, the big companies are passing around the same $100bn IOU, amortizing GPUs on 6-year schedules but burning them out in months, using those same GPUs as collateral on massive loans, and spending based on an ever-accelerating number of data centers which are not guaranteed to get built or receive sufficient power.

    • Arkthos@pawb.social
      link
      fedilink
      English
      arrow-up
      4
      ·
      16 hours ago

      I doubt that. A lot of the poor writing quality comes down to choice. All the most powerful models are inherently trained to be bland, seek harmony with the user, and generally come across as kind of slimy in a typically corporate sort of way. This bleeds into the writing style pretty heavily.

      A model trained specifically for creative writing without such a focus would probably do better. We’ll see.

      • Holytimes@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 hour ago

          I mean, look at a hobby project like Neuro-sama vs ChatGPT.

          It’s a rather night-and-day difference in terms of responses and humanity.

          While Neuro and her sister both come across as autistic 7-year-olds, they still come across as mostly human autistic 7-year-olds. They have their moments where they just lose it, which every LLM has.

          But comparing them, it’s really obvious how many of the problems with the inhumanity and blandness are a choice by large companies to have the LLMs be marketable and corpo-friendly.

          In a world where these models could be trained and allowed to actually have human-ish responses and focus on being “normal” instead of sterile robots, they would at least be way more fun.

          Not much more reliable, mind you. But at least they would be fun.