That’s one of the fun things about AI model collapse. The AIs will start polluting their own training data (already have, actually), and the more prolific and capable AI gets, the stupider its training will become. It will never get better again. It will eventually reach a steady state: one stupid AI creating training data just barely non-stupid enough to be believable to another stupid AI deciding whether it’s valid training data, which makes them both slightly stupider, until neither can produce passable training data anymore. At that point data quality will start to improve marginally, thanks to the increased proportion of human effort in the mix, and then the cycle will repeat, endlessly. There is no way out of an AI-polluted training set except adding more real human data.

Arguably we’ve already hit peak AI because of this. This is where it’s plateaued and where it will likely stay once the bubble pops, with only slight incremental progress from then on. It’s probably not going to be taking over the world anytime soon. It’s a reflection of our own collective creativity and effort: a confusing, byzantine, hall-of-mirrors reflection, sometimes a funny-shaped reflection, sometimes a scary one, but always ultimately a reflection. It’s not intelligence. It’s just ourselves. There’s nobody on the other side of the mirror but ourselves.
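The feedback loop described above can be sketched as a toy simulation (my own illustrative construction, not a measurement of any real model): one "model" fits a distribution to its training data, generates synthetic samples from the fit, a second "believability filter" keeps only the samples that look typical, and the next generation trains on whatever survives. Generation by generation, the spread of the data collapses as the tails of the original human distribution are forgotten.

```python
import random
import statistics

def fit(samples):
    # "Train": estimate the mean and spread of the training data.
    return statistics.mean(samples), statistics.pstdev(samples)

def generate(mu, sigma, n, rng):
    # "Generate": sample synthetic data from the fitted model.
    return [rng.gauss(mu, sigma) for _ in range(n)]

def curate(samples, mu, sigma):
    # The "believability filter": a second model keeps only samples that
    # look typical, i.e. within one standard deviation of the mean.
    return [x for x in samples if abs(x - mu) < sigma]

rng = random.Random(0)
data = generate(0.0, 1.0, 4000, rng)  # generation 0: real "human" data

spreads = []
for gen in range(6):
    mu, sigma = fit(data)
    spreads.append(sigma)
    # The next generation trains only on the previous model's curated output.
    synthetic = generate(mu, sigma, 4000, rng)
    data = curate(synthetic, mu, sigma)

# Each generation's estimated spread shrinks: the models progressively
# forget the tails of the original distribution.
print([round(s, 3) for s in spreads])
```

This is the same qualitative effect reported in the model-collapse literature: training on filtered synthetic output systematically narrows the distribution, and only injecting fresh real data restores the lost variance.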