A simple thought, but a hard issue to tackle:

If AI ends up generating most internet content (a claim lacking peer review, but one that sounds highly likely according to this), it will inevitably run out of new human material to learn from. Our written culture would effectively freeze in time, turning into a remix of old ideas “embellished” with machine-made interconnections. And if these AI systems start training on what they themselves generate, we will soon drift away from what we consider human culture; researchers have started calling this feedback loop “model collapse”. Once AI learns from its own output, things will quickly get wild and unpredictable (imagine layers of disinformation built on top of each other). This seems inevitable unless we promptly find a way to catalog AI-generated content and exclude it from training pipelines. But how much time do we still have?
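To make the “catalog and exclude” idea concrete, here is a minimal Python sketch assuming a hypothetical registry of content fingerprints known (or declared) to be AI-generated; the registry name, the helper functions, and the sample documents are all illustrative, not an existing system. In practice such a catalog would have to be fed by watermarking, provenance metadata, or self-reporting from AI providers, none of which exists at internet scale today.

```python
import hashlib

# Hypothetical registry: SHA-256 fingerprints of documents cataloged as
# AI-generated. How it gets populated is the hard, unsolved part.
AI_CONTENT_REGISTRY = {
    hashlib.sha256(b"Example of a cataloged AI-generated document.").hexdigest(),
}

def fingerprint(text: str) -> str:
    """Stable content fingerprint used as the catalog key."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def filter_training_corpus(documents: list[str]) -> list[str]:
    """Keep only documents whose fingerprint is not in the AI-content catalog."""
    return [doc for doc in documents if fingerprint(doc) not in AI_CONTENT_REGISTRY]

corpus = [
    "Example of a cataloged AI-generated document.",
    "A human-written blog post about model training.",
]
print(filter_training_corpus(corpus))  # only the second document survives
```

Even this toy version exposes the weakness: an exact hash breaks on a one-character edit, so any workable catalog would need fuzzy matching or embedded watermarks rather than simple fingerprints.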