Confronting Model Collapse: A Call for Transparency and Responsibility in AI
AI systems are at risk of degrading themselves: as AI-generated content loops back into new training datasets, models increasingly learn from their own outputs, a silent crisis known as model collapse. We call for transparency, rigorous data curation, and public participation to preserve model integrity. Without proactive strategies, the digital ecosystem risks becoming saturated with self-referential noise, eroding trust, accuracy, and the very foundation of generative AI.