Out-of-distribution (OOD)

When a model receives input that falls outside the kind of data it was trained on. For a model trained on English customer service emails, a technical question in Mandarin is out-of-distribution.

Every model has a comfort zone defined by its training data. Inside that zone, predictions are reliable. Outside it, the model's learned patterns no longer apply and outputs become unpredictable. Most models do not signal when they are out of their depth; they keep generating confident-sounding responses. A customer-facing chatbot trained on retail questions will still produce an answer when asked a complex legal question, and that answer will likely be wrong.
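One rough way to surface this silent failure, assuming a classifier that exposes raw logits, is to treat a low maximum softmax probability as an OOD warning sign. This is only a baseline sketch, not a definitive detector: overconfident models can still assign high probability to OOD inputs, and the threshold below is illustrative, not a recommendation.

```python
import numpy as np

def softmax(logits):
    """Convert raw model scores into probabilities."""
    shifted = np.asarray(logits, dtype=float) - np.max(logits)
    exp = np.exp(shifted)
    return exp / exp.sum()

def looks_out_of_distribution(logits, threshold=0.7):
    """Flag inputs where the top class probability is suspiciously low.

    In-distribution inputs usually produce one dominant probability;
    OOD inputs tend to spread probability mass more evenly. The 0.7
    threshold is illustrative and should be tuned on held-out data.
    """
    return softmax(logits).max() < threshold

# A peaked distribution looks in-distribution; a flat one does not.
print(looks_out_of_distribution([4.1, 0.3, -1.2]))  # False
print(looks_out_of_distribution([0.9, 0.7, 0.8]))   # True
```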

Builder example

Out-of-distribution (OOD) inputs are inevitable in production because users do not know or care what your model was trained on. Every unhandled OOD case is a potential confidence-destroying moment where a user trusts a wrong answer. Detecting these cases early and routing them to a fallback or human handoff protects both the user and your product's reputation.
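In practice, that routing can be a thin wrapper around the model call. The sketch below assumes hypothetical `model` and `ood_detector` callables supplied by the application; the point is the shape of the handoff, not any particular detection technique. The key design choice is that the detector runs before the model's answer ever reaches the user, so a fluent-but-wrong response is never shown.

```python
from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    source: str  # "model" or "human_handoff"

def answer_with_fallback(question, model, ood_detector):
    """Answer via the model, or hand off when the input looks OOD.

    `model` and `ood_detector` are hypothetical callables: the detector
    returns True when the input likely falls outside the model's
    training distribution.
    """
    if ood_detector(question):
        # Refusing to guess protects the user from a confident wrong answer.
        return Answer(
            text="I can't answer that reliably; let me connect you to a person.",
            source="human_handoff",
        )
    return Answer(text=model(question), source="model")
```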

Common confusion: OOD does not mean rare or unusual. A perfectly common request can be out-of-distribution if the model's training data never covered that domain. Frequency in the real world and coverage in the training set are separate things.