
Botshit

AI-generated content that has no grounding in truth, passed along by a person who never verified it. The term combines the model's indifference to accuracy with the human's failure to check.

The term builds on philosopher Harry Frankfurt's analysis of bullshit: speech produced with no concern for whether it is true (distinct from lying, which requires knowing the truth and choosing to hide it). AI models generate text with no concept of truth at all. Botshit happens at the handoff: a consultant asks ChatGPT to write a market analysis, skims the polished paragraphs, and emails them to a client as if the claims were researched. The fabricated statistics and confident-sounding sources now sit in a decision-making document because nobody checked them at any point in the chain.

Builder example

The risk scales with two factors: how much truth matters in the domain, and how difficult verification is. An AI drafting social media captions operates in a low-stakes zone. An AI generating legal citations, medical dosage information, or financial projections operates where a single unverified claim can cause real harm. Builders need to match their verification investment to the stakes.
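The two-factor framing above can be sketched as a trivial scoring rule. Everything here is a hypothetical illustration (the domain names and 1–5 scales are assumptions, not from any standard): the point is only that verification effort should rise with both stakes and checking difficulty.

```python
# Hypothetical sketch of the two-factor risk framing: invest more in
# verification when truth matters more AND when checking is harder.
# Scales (1-5) and example domains are illustrative assumptions.
STAKES = {
    "social_caption": 1,   # low harm if a claim is wrong
    "market_analysis": 3,
    "legal_citation": 5,   # a single fabricated cite can cause real harm
}

def verification_priority(stakes: int, check_difficulty: int) -> int:
    """Higher score means more verification effort is warranted."""
    return stakes * check_difficulty

# A hard-to-check legal citation outranks an easy-to-check caption.
print(verification_priority(STAKES["legal_citation"], 4))   # high priority
print(verification_priority(STAKES["social_caption"], 1))   # low priority
```

The multiplication is arbitrary; any monotone combination of the two factors captures the same idea.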

An assistant invents a plausible study title with an author name and year. The user pastes it into a client memo without verifying.

Use source lookup and link validation for claims users might copy. Make checking easy and skipping hard.
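One way to make checking easy and skipping hard is a pre-send gate: scan generated text for citation-like strings and refuse to mark a draft ready until each one has been explicitly verified. A minimal sketch, assuming a simple `(Author, Year)` citation pattern — the regex and function names are hypothetical, not from any real tool:

```python
import re

# Matches citation-like strings such as "(Smith et al., 2023)" or "(Lee, 2022)".
# Hypothetical pattern for illustration; real citation formats vary widely.
CITATION_PATTERN = re.compile(r"\([A-Z][a-zA-Z]+(?: et al\.)?,? \d{4}\)")

def find_unverified_claims(text, verified=()):
    """Return citation-like strings not yet marked as human-verified."""
    return [c for c in CITATION_PATTERN.findall(text) if c not in set(verified)]

def ready_to_send(text, verified=()):
    """A draft is ready only when every detected citation was verified."""
    return not find_unverified_claims(text, verified)

draft = ("Adoption grew 40% last year (Smith et al., 2023) "
         "and margins doubled (Lee, 2022).")
print(find_unverified_claims(draft))
```

The gate inverts the default: passing an unchecked claim along now requires deliberately overriding a blocked state rather than simply doing nothing.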

Common confusion: Botshit is not the same as hallucination. A hallucination is the model inventing something false. Botshit is what happens when a person takes that hallucination, never checks it, and passes it into a real workflow where others rely on it.