
Vibe work / Practitioner slang

Prompt-and-pray

An AI workflow with no safety net: someone writes a prompt, runs it, and hopes the output is correct. There is no retrieval system, no tests, no fallback, and no logging.

Prompt-and-pray looks great in demos because the easy, common cases produce fluent output. The fragility shows up in production: an edge case arrives, the model hallucinates, the output format breaks, the source data shifts, and nobody can diagnose what went wrong because nothing was logged or tested. A team might build a customer support bot by writing one long prompt, demoing it on ten friendly questions, and shipping it. The first ambiguous, angry, or out-of-scope message from a real customer reveals that the entire system was a single prompt with no structure underneath.

Builder example

Many AI features fail because the first impressive prompt becomes the permanent architecture. Prompt-and-pray is a natural and useful prototyping stage: you discover whether the AI can do the task at all. The mistake is stopping there. Every prompt that matters to a product needs source grounding, output validation, failure handling, and a review mechanism before it touches real users.
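The safeguards above can be sketched in a few lines. This is a minimal illustration, not a production design: `call_model` is a hypothetical stand-in for a real LLM API call, and the expected keys are invented for the example. The point is the structure around the prompt, which is exactly what prompt-and-pray omits: the output is validated, failures fall back to a safe default, and every call is logged.

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("support-bot")

REQUIRED_KEYS = {"intent", "reply"}  # illustrative schema for this sketch

def call_model(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM API call.
    return json.dumps({"intent": "billing", "reply": "Let me check that."})

def validate(raw: str):
    # Output validation: reject anything that is not JSON with the expected keys.
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if not REQUIRED_KEYS.issubset(data):
        return None
    return data

def answer(question: str) -> dict:
    raw = call_model(f"Classify and answer: {question}")
    log.info("model raw output: %s", raw)  # logging: every call leaves a trace
    parsed = validate(raw)
    if parsed is None:
        # Failure handling: fall back to a safe default and flag for review
        # instead of shipping malformed output to the user.
        log.warning("validation failed; routing to human review")
        return {"intent": "unknown", "reply": "Routing you to a human agent."}
    return parsed
```

Even this toy version changes the failure mode: when the model returns something unparseable, the user gets a controlled fallback and the team gets a log entry to diagnose, rather than a silent, untraceable break.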

Common confusion: A long, detailed prompt can feel robust because of its length. Length adds specificity, but it does not add reliability. A 2,000-word prompt with no tests, no retrieval, and no fallback still fails unpredictably on inputs the author did not anticipate.