

Prompt engineering

The skill of writing clear instructions, examples, constraints, and output formats that make a model behave more predictably and usefully.

Prompt engineering uses concrete techniques: assigning the model a role ("You are a legal reviewer for consumer contracts"), providing examples of good output so it understands the pattern, setting constraints ("Answer in three sentences using only information from the attached document"), and specifying output format ("Return a JSON object with these fields"). Each technique narrows the range of possible outputs toward the one you want. A well-engineered prompt for summarizing customer emails might include two sample summaries, a length cap, and instructions to flag any email mentioning a refund.
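The email-summary prompt described above can be sketched in code. This is a minimal illustration, not any library's API; the helper name, example emails, and the [REFUND] tag are all invented for the sketch.

```python
# A sketch of a well-engineered prompt combining the techniques above:
# a role, two sample summaries, a length cap, and a refund flag.
# All names and example content here are illustrative.

def build_summary_prompt(email_text: str) -> str:
    """Assemble a prompt for summarizing a customer email."""
    role = "You are a support analyst who summarizes customer emails."
    examples = (
        "Example 1:\n"
        "Email: My order #1123 arrived with a cracked screen.\n"
        "Summary: Damaged item on order #1123; customer wants a replacement.\n\n"
        "Example 2:\n"
        "Email: I was charged twice for my March subscription.\n"
        "Summary: Duplicate charge in March; customer requests a refund. [REFUND]\n"
    )
    constraints = (
        "Summarize the email below in at most two sentences. "
        "Append the tag [REFUND] if the email mentions a refund."
    )
    return f"{role}\n\n{examples}\n{constraints}\n\nEmail: {email_text}\nSummary:"

prompt = build_summary_prompt("Please refund my duplicate charge from last week.")
```

Each component narrows the output: the role sets the voice, the examples set the pattern, and the constraints set length and the flagging behavior.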

Builder example

Even with advanced retrieval and tool-use pipelines, the prompt shapes how the model interprets and presents its answer. A retrieval system might surface the right product specs, but if the prompt says "be helpful" instead of "compare the two specs the user asked about and flag differences," the output will be vague. Prompt engineering remains essential at every layer of a production system.
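The contrast above can be made concrete. In this sketch the retrieved specs are hypothetical data, and both prompt strings are invented to show how the same retrieval result yields very different instructions.

```python
# Hypothetical retrieval result: two product spec sheets.
retrieved_specs = {
    "Model A": {"battery_hours": 10, "weight_kg": 1.2},
    "Model B": {"battery_hours": 14, "weight_kg": 1.5},
}

# Vague instruction: the model must guess what "helpful" means here.
vague_prompt = f"Be helpful.\n\nSpecs: {retrieved_specs}"

# Specific instruction: the task, the fields, and the output shape are pinned down.
specific_prompt = (
    "Compare the two specs the user asked about and flag differences.\n"
    "For each field, state both values and say whether they differ.\n\n"
    f"Specs: {retrieved_specs}"
)
```

The retrieval layer did its job identically in both cases; only the prompt determines whether the answer is a vague blurb or a field-by-field comparison.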

You ask an AI to "write a summary" and get a generic paragraph that could describe any project.

Specify the audience, the key question, the format, and one example of what a good summary looks like. Clarity in, clarity out.
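The fix can be sketched as before-and-after prompt strings. The audience, question, and bullet content below are invented details standing in for a real project.

```python
# The generic request that produces a generic paragraph.
vague = "Write a summary of the project."

# The same request with audience, key question, format, and one example.
specific = (
    "Audience: the VP of engineering, who has not followed the daily work.\n"
    "Key question: is the migration on track for the Q3 deadline?\n"
    "Format: three bullet points, each under 20 words.\n"
    "Example of a good summary:\n"
    "- Database migration 80% complete; on track for Q3.\n"
    "- Two blockers resolved this week; none outstanding.\n"
    "- Next milestone: staging cutover scheduled.\n\n"
    "Now summarize the attached status report in this style."
)
```

Every line in the specific version rules out a family of unwanted outputs, which is what "clarity in, clarity out" means in practice.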

Common confusion: Prompt engineering has not become obsolete as models have gotten smarter. Better models still produce better results with clearer instructions. The techniques have matured, but the need for them persists.