I assume they all crib from the same training sets, but surely one of the billion dollar companies behind them can make their own?

  • Hackworth@piefed.ca · 2 days ago

    DeepMind keeps trying to build a model architecture that can keep learning after training, first with the Titans paper and most recently with Nested Learning. It’s promising research, but they have yet to scale HOPE, the proof-of-concept model from the Nested Learning paper, to larger sizes. And with as much incentive as there is to hype this stuff, I’ll believe it when I see it.
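
    For context on what “keep learning after training” means mechanically, here is a minimal sketch assuming a Titans-style setup: a small neural memory module keeps taking gradient steps on a “surprise” loss during inference, so new associations get written into its weights rather than a context window. Everything here (the class name, the reconstruction loss, plain SGD) is an illustrative assumption, not the papers’ actual code.

    ```python
    import torch
    import torch.nn as nn

    class NeuralMemory(nn.Module):
        """Tiny MLP whose weights act as the memory store."""
        def __init__(self, dim: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim, dim), nn.SiLU(), nn.Linear(dim, dim)
            )

        def forward(self, k: torch.Tensor) -> torch.Tensor:
            return self.net(k)

    def test_time_update(memory: NeuralMemory, k: torch.Tensor,
                         v: torch.Tensor, lr: float = 1e-2) -> float:
        # "Surprise" = how badly the memory predicts v from k; one
        # gradient step on it writes the association into the weights,
        # so the model adapts during inference, not just training.
        loss = ((memory(k) - v) ** 2).mean()
        grads = torch.autograd.grad(loss, list(memory.parameters()))
        with torch.no_grad():
            for p, g in zip(memory.parameters(), grads):
                p -= lr * g  # plain SGD; Titans adds momentum + forgetting
        return loss.item()

    # Toy stream: loss falls as the memory learns the mapping online.
    torch.manual_seed(0)
    mem = NeuralMemory(16)
    W = torch.randn(16, 16)  # fixed key->value mapping to be learned
    for step in range(200):
        k = torch.randn(8, 16)
        test_time_update(mem, k, k @ W)
    ```

    The hard part, and the thing that hasn’t been demonstrated yet, is making updates like this stable and cheap when the memory is embedded in a large model serving real traffic.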