Lesson 2/5 · AI · 10 min read

Prompt engineering: why instructions matter

AI responds to instructions.

Vague instructions get vague results.

Clear, specific instructions get useful results.

The difference is not the technology; it is the input.

Deep dive theory

Why this matters

Consider asking someone for directions. "How do I get there?" is too vague. Get where? From where? By car or walking? The question cannot be answered because essential information is missing.

"How do I drive from downtown to the airport, avoiding highways, during rush hour?" is specific enough to get a useful answer.

AI works the same way. This is what prompt engineering means — getting better results by asking better questions.

The asymmetry: A few extra minutes spent on a clear prompt can save hours of dealing with unusable output. Most people underinvest in instruction quality and overinvest in fixing bad results.


1. What makes instructions work

Specificity reduces guessing

Every detail left out is a decision AI makes on its own. Sometimes these default decisions are fine. Often they are not.

Consider two requests:

Vague: "Write about our product."

Specific: "Write a 200-word product description for small business owners who are not technical. Focus on ease of setup. Use conversational language, no jargon."

The specific version constrains the output. AI has less room to go in wrong directions because the boundaries are clear.

Context fills gaps

AI knows nothing about the specific situation unless told. The company history, the audience, the purpose, the constraints — all invisible unless provided.

"Write a follow-up email" is abstract. "Write a follow-up email to a customer who attended our webinar last week but did not book a call. They work in healthcare and mentioned compliance concerns" gives AI the context needed to be relevant.

More relevant context usually means more relevant output. Irrelevant details can confuse the model or push important instructions out of its attention. The information investment pays off, but only when the context matches the task.

Examples anchor expectations

Describing a style is abstract. Showing a style is concrete.

Instead of saying "write in a punchy, active style," pasting three paragraphs that demonstrate that style gives AI something specific to match. The example shows what "punchy and active" actually means in practice.

This works for format too. Showing a template produces more consistent results than describing what the template should look like.
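Anchoring with examples can be sketched as plain string assembly. A minimal sketch, assuming nothing beyond the standard library; the helper name and sample texts are made up for illustration:

```python
# Minimal sketch: embed concrete style examples in a prompt so the
# model matches them instead of interpreting an abstract adjective.
# The helper name and sample texts are illustrative, not a library API.

def with_examples(instruction, examples):
    """Prepend style examples to an instruction."""
    parts = ["Match the style of these examples:"]
    parts += [f"Example {i + 1}: {text}" for i, text in enumerate(examples)]
    parts.append(instruction)
    return "\n\n".join(parts)

prompt = with_examples(
    "Write a 100-word product description in this style.",
    [
        "Short sentences. Active verbs. No filler.",
        "One idea per line. Cut every word that earns nothing.",
    ],
)
print(prompt)
```

The same pattern works for format: replace the style snippets with a filled-in template you want the output to imitate.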


2. The anatomy of a prompt

Role setting

Telling AI who to be changes how it responds. "Act as an experienced copywriter" pulls from different patterns than a generic request.

This often works as a focusing mechanism. The AI has learned from many sources, and the role narrows which patterns it draws from.

Different roles produce genuinely different outputs on the same question. A "skeptical editor" reviewing copy produces different feedback than an "enthusiastic marketing manager."

Task specification

Clear definition of what the output should be. Not just the topic, but the format, length, and requirements.

"Write an email" is open-ended. "Write a three-paragraph follow-up email, maximum 150 words, with a single clear call-to-action asking for a 15-minute call" is specific.

Specificity reduces iterations.

Constraints and boundaries

What should not appear matters as much as what should.

"Do not use the word innovative. Do not include bullet points. Do not exceed 100 words." These boundaries prevent common defaults that might not fit the situation.

Without constraints, AI fills space with whatever patterns are most common. Those defaults may not match the specific need.

Output format

If a specific format is needed, specifying it explicitly helps. "Return the answer as a numbered list." "Format as a table with two columns." "Use markdown headers."

AI follows format instructions reasonably well. But it must be told — otherwise it chooses a format that may or may not work.
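The four parts above can be assembled mechanically. A minimal sketch, assuming invented labels and phrasing (there is no required prompt syntax):

```python
# Sketch: assemble a prompt from role, task, constraints, and output
# format. The labels and wording are illustrative assumptions, not a
# standard format.

def build_prompt(role, task, constraints, output_format):
    """Combine the four prompt parts into one instruction block."""
    lines = [f"Act as {role}.", task]
    lines += [f"Constraint: {c}" for c in constraints]
    lines.append(f"Output format: {output_format}")
    return "\n".join(lines)

prompt = build_prompt(
    role="an experienced copywriter",
    task="Write a three-paragraph follow-up email, maximum 150 words.",
    constraints=[
        "Do not use the word 'innovative'.",
        "Do not include bullet points.",
    ],
    output_format="Plain text with a single clear call-to-action.",
)
print(prompt)
```

The value of structuring it this way is that each part can be varied independently: swap the role, tighten a constraint, change the format, and compare results.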


3. Common patterns that cause problems

Too brief to be useful

A one-line prompt gives AI almost nothing to work with. Without constraints, the output is a guess — and editing a guess takes longer than writing a clear prompt.

Asking for everything at once

Complex outputs are hard to nail in one attempt. Asking for a complete business plan in one prompt is unlikely to produce something good.

Breaking tasks into steps works better. "List the sections a business plan should have. Now draft an executive summary. Now draft the market analysis."

Each step can be reviewed and adjusted before moving to the next. The final output is better controlled.
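That step-by-step flow can be sketched as a loop where each result is carried forward into the next prompt. Here `send` is a stand-in for whatever model call you use, not a real API; a toy function fills in so the sketch runs:

```python
# Sketch of task decomposition: run prompts in order, feeding each
# step's output into the next. send() is a placeholder for an actual
# model call.

steps = [
    "List the sections a business plan should have.",
    "Draft the executive summary.",
    "Draft the market analysis.",
]

def run_steps(steps, send):
    """Run prompts in sequence; later steps see earlier output."""
    context = ""
    results = []
    for step in steps:
        output = send(context + step)
        results.append(output)
        context += output + "\n"  # carry results forward as context
    return results

# toy stand-in "model" that just echoes the last line in upper case
outputs = run_steps(steps, send=lambda p: p.splitlines()[-1].upper())
```

In real use, the review happens between iterations: inspect each result, adjust it, and only then let it become context for the next step.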

Assuming AI remembers everything

In long conversations, AI's attention to earlier parts of the conversation weakens. Instructions given at the start may fade from influence.

For important constraints, repeating them periodically helps. Or starting fresh conversations for new tasks rather than continuing indefinitely.
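One mechanical way to keep constraints from fading is to re-inject them every few turns. A toy sketch, assuming an arbitrary interval and invented wording:

```python
# Sketch: prepend important constraints every few turns so they stay
# in the model's attention. The interval of 3 is an arbitrary
# illustration, not a recommended value.

CONSTRAINTS = "Keep answers under 100 words. Avoid jargon."

def with_reminder(message, constraints, turn, every=3):
    """Prepend constraints on turns 0, 3, 6, ...; otherwise pass through."""
    if turn % every == 0:
        return constraints + "\n\n" + message
    return message

print(with_reminder("Summarize the meeting notes.", CONSTRAINTS, turn=0))
```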

Treating output as final

AI output is a draft, not a finished product.

The more realistic expectation: AI produces something 60-80% of the way there. Human editing, verification, and refinement completes it.


4. Where prompting reaches its limits

Better prompts help, but they cannot solve every problem.

Missing information cannot be prompted into existence

If AI lacks knowledge about something — your specific company, recent events, proprietary processes — no prompt can create that knowledge. It must be provided as context.

For novel or specific situations, the prompt must include the relevant information.

Judgment cannot be fully specified

"Make this email strike the right tone" depends on context that is hard to express. What the recipient knows, the relationship history, the cultural expectations — these shape what "right" means.

For tasks heavily dependent on judgment, generating options to choose from often works better than trying to specify the perfect outcome in advance.

AI has a style it defaults to

Despite careful prompting, AI outputs have a certain texture. A slight verbosity, certain phrase tendencies, a particular rhythm. This is the average of its training data showing through.

Heavy editing can reshape this. But completely eliminating the AI-like quality is difficult.

Skill at prompting is not skill at the task

Being able to prompt for a financial model is not the same as understanding finance. A well-crafted prompt produces a professional-looking spreadsheet, but only someone with financial expertise can tell whether the assumptions behind the numbers are sound.

AI can produce outputs in domains where the user lacks knowledge. But evaluating whether those outputs are correct requires domain expertise.


Think

What would you do in these scenarios?


The Wikipedia pitch

An architecture firm asks AI to 'write about our latest project.' The output reads like an encyclopedia article — generic, impersonal, unusable for the client presentation. Why does the output read like Wikipedia instead of a client pitch?


Practice

Test yourself and review key terms

Knowledge check


Why is the prompt 'write about our product' considered bad compared to a specific request?

Concepts

Question

What analogy does the lesson use to explain why vague prompts fail?


Answer

Asking someone for directions without specifying where you are, where you are going, or how you are traveling.


Do

Your action steps for today

Action plan: what to do today

  • The prompt audit: Take a prompt that produced mediocre results recently. Add three specific constraints and an example of good output. Compare the new result.
  • The template build: For a task done repeatedly, create a template prompt with placeholders for the parts that change. Test it on three different instances.
  • The friction map: Notice where AI output requires the most editing. Those areas often indicate where the prompt needed more specificity.
Note

Some examples and details may be simplified to better convey the core idea. Every business is different — adapt these ideas to your specific context and situation.