• ShadowRam@fedia.io · 1 year ago

      Ok,

      Show me a PCIe board that can do inference as fast as a 3090 but costs less than a 3090.

    • RandomlyRight@sh.itjust.worksOP · 1 year ago

      Yeah, show me a phone with 48GB of RAM. That's a big factor to consider. Actually, some people recommend a Mac Studio, because you can get it with 128GB of RAM or more, and that memory is shared with the AI/GPU accelerator. Very energy efficient, but it sucks as soon as you want to do literally anything other than inference.
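
      To put numbers on why 48GB vs. 128GB matters, here is a rough back-of-the-envelope sketch (editor's illustration, not from either commenter): weight memory is just parameter count times bits per weight, divided by 8. The 70B model size is an illustrative assumption; the 3090's 24GB is its actual VRAM capacity.

      ```python
      # Rough LLM weight-memory sizing: params * bits / 8.
      # The 70B model size is an illustrative assumption, not a benchmark.

      def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
          """GB needed for the weights alone (KV cache and activations are extra)."""
          return params_billion * 1e9 * bits_per_weight / 8 / 1e9

      for bits in (16, 8, 4):
          print(f"70B @ {bits}-bit: ~{weight_memory_gb(70, bits):.0f} GB")

      # 16-bit: ~140 GB -> too big even for a 128GB Mac Studio
      #  8-bit:  ~70 GB -> fits in 128GB unified memory, not a 24GB 3090
      #  4-bit:  ~35 GB -> fits in 48GB, still over a single 3090's 24GB
      ```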

      • Fuzzypyro@lemmy.world · 1 year ago

        I wouldn’t say it particularly sucks. It could be used as a powerhouse hosting server, and Docker makes that easy to set up regardless of the OS these days. Really, though, I’d say its competition is more along the lines of Ampere systems in terms of power to performance. It even beats Ampere’s 128-core ARM CPU on power-to-performance ratio, which is extremely impressive in the server/enterprise world. Not that you’re going to see them in data centers, because price to performance matters there too. I just feel like it fits right into the niche it was designed for.
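
        For what "power to performance" means concretely, a minimal sketch (editor's illustration with made-up placeholder figures, not measurements of any real Mac Studio or Ampere system): it is just throughput divided by sustained wattage.

        ```python
        # Power-to-performance expressed as throughput per watt.
        # All figures below are hypothetical placeholders, not measurements.

        from dataclasses import dataclass

        @dataclass
        class System:
            name: str
            throughput: float  # e.g. requests/sec on some fixed workload
            watts: float       # sustained power draw under that load

            def perf_per_watt(self) -> float:
                return self.throughput / self.watts

        systems = [
            System("Mac Studio (placeholder)", 1000.0, 150.0),
            System("128-core Ampere (placeholder)", 3000.0, 500.0),
        ]

        # Rank systems by efficiency rather than raw throughput.
        for s in sorted(systems, key=System.perf_per_watt, reverse=True):
            print(f"{s.name}: {s.perf_per_watt():.2f} units/W")
        ```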