Failures / Standard term
Mode collapse
When a model loses output diversity and starts producing the same patterns, phrasings, and structures regardless of what you ask. Ask for ten different email openings and you get ten variations of the same sentence.
The term originated in image generation, where a model might produce the same face over and over. In language models, mode collapse shows up as repetitive sentence structures, identical hedging phrases ("It's important to note that..."), and the same organizational template applied to every response. Heavy preference tuning during training is a common cause: when the model is optimized too aggressively for one "preferred" style, it narrows its range and loses the ability to vary its voice.
Builder example
For creative and writing tools, mode collapse is the moment users start saying "everything sounds like AI." Once users recognize the pattern, they stop trusting the tool's output and start rewriting everything manually, defeating the purpose of the product. Variety is a core quality metric for any generative application.
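One simple way to track that variety metric is a distinct-n score: the ratio of unique n-grams to total n-grams across a batch of generations. This is a minimal sketch, not a full evaluation harness; the function name and thresholds are illustrative. Values near 1.0 indicate varied outputs, while low values suggest the model is collapsing onto the same phrasing.

```python
def distinct_n(outputs: list[str], n: int = 2) -> float:
    """Ratio of unique n-grams to total n-grams across a batch of outputs.

    Near 1.0 = diverse; near 0.0 = repetitive (a mode-collapse warning sign).
    """
    ngrams = []
    for text in outputs:
        tokens = text.lower().split()
        ngrams.extend(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    if not ngrams:
        return 0.0
    return len(set(ngrams)) / len(ngrams)

# Hypothetical batches of "email opening" generations:
varied = [
    "Hope your week is off to a great start.",
    "Quick question about the launch timeline.",
    "Following up on our chat from Tuesday.",
]
collapsed = [
    "I hope this email finds you well.",
    "I hope this email finds you well!",
    "I hope this email finds you well today.",
]

print(distinct_n(varied))     # high: the openings share almost no bigrams
print(distinct_n(collapsed))  # low: the openings share most bigrams
```

A score that drifts downward across model or prompt versions is an early signal that outputs are converging on one template, before users start saying "everything sounds like AI."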
Common confusion: Mode collapse (one model producing repetitive outputs) is a different problem from model collapse (future models degrading because they trained on AI-generated data). The names are confusingly similar, but the causes and solutions are distinct.