Course/Event Essentials
Event/Course Start
Event/Course End
Event/Course Format
Mixed
Live (synchronous)
Primary Event/Course URL
Training Content and Scope
Scientific Domain
Technical Domain
Level of Instruction
Intermediate
Advanced
Sector of the Target Audience
Research and Academia
Industry
Public Sector
HPC Profile of Target Audience
Application Users
Data Scientists
Language of Instruction
Other Information
Organiser
Supporting Project(s)
EuroCC2/CASTIEL2
Event/Course Description
Abstract: The lecture discusses the phenomenon in which large language models degrade in quality when trained on their own generated content, also known as "model collapse." As models repeatedly learn from their own outputs, errors accumulate and diversity diminishes, leading to less coherent and less original content. As more online content is generated by these models, the risk of training newer models on poorer-quality data increases. This trend emphasises the growing importance of genuine human creativity, which provides the originality and diversity necessary to sustain quality content. Ultimately, human insight will remain crucial in maintaining and guiding the evolution of AI.
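The degradation described above can be illustrated with a small numerical experiment. The sketch below is not part of the course material; it is a hypothetical Python toy in which each "generation" of a simple Gaussian model is fitted only to samples produced by the previous generation. With a finite sample size, estimation error compounds and the fitted spread tends to drift and shrink over generations, a minimal analogue of the loss of diversity the lecture describes.

```python
# Toy illustration (assumed example, not from the lecture): recursive training
# of a one-dimensional Gaussian on its own generated data.
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0      # generation 0: the "real" data distribution
n_samples = 100           # finite training set per generation (assumed value)
n_generations = 20

for gen in range(1, n_generations + 1):
    # "Train" the next model only on data produced by the previous model.
    data = rng.normal(mu, sigma, n_samples)
    mu, sigma = data.mean(), data.std()
    print(f"generation {gen:2d}: mean={mu:+.3f}  std={sigma:.3f}")
```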