AutoClaygent
Lesson 3 of 8 · 38% complete
10 min read

Prompt Anatomy

The 5-part structure AutoClaygent generates for every prompt

The 3-Task Rule

Every Claygent prompt should have at most 3 distinct tasks. More than that, and quality drops significantly.

🚨
Anti-Pattern

"Find the platforms they use, the owner's name, the business model, their tech stack, and whether they serve B2B or B2C" — This is 5 tasks. Split it into 2-3 Claygents instead.

How AutoClaygent Handles This

AutoClaygent automatically enforces the 3-task rule. When you describe what you need, it will recommend splitting complex requests into multiple focused Claygents—and explain why.
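The splitting step can be sketched as a simple heuristic. This is an illustrative assumption, not AutoClaygent's actual logic: the function name and the naive splitting on commas and "and" are made up for the example.

```python
# Hypothetical sketch of the 3-task rule: break a multi-task request into
# groups of at most 3 tasks, one group per Claygent. The splitting heuristic
# is illustrative, not AutoClaygent's real parser.
import re

def split_into_claygents(request: str, max_tasks: int = 3) -> list[list[str]]:
    """Split a comma/'and'-separated request into groups of at most max_tasks."""
    tasks = [t.strip() for t in re.split(r",\s*(?:and\s+)?|\s+and\s+", request) if t.strip()]
    return [tasks[i:i + max_tasks] for i in range(0, len(tasks), max_tasks)]

groups = split_into_claygents(
    "Find the platforms they use, the owner's name, the business model, "
    "their tech stack, and whether they serve B2B or B2C"
)
# The 5-task anti-pattern above becomes two Claygents: one with 3 tasks, one with 2.
```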

Anatomy of a Good Prompt

Every production-ready Claygent prompt has these 5 parts:

1. Input Context

Tell Claygent what data it's working with. Use Clay's variable syntax:

Given the company domain: {{domain}}
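Clay resolves `{{domain}}` from your table's columns before the prompt reaches the model. A minimal re-implementation of that substitution, purely for illustration (the `render` helper is an assumption, not Clay's API):

```python
# Illustrative sketch of Clay-style {{variable}} substitution.
# Clay performs this server-side; this helper only demonstrates the idea.
import re

def render(template: str, row: dict) -> str:
    """Replace {{name}} placeholders with values from a row of table data."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(row.get(m.group(1), "")), template)

print(render("Given the company domain: {{domain}}", {"domain": "acme.com"}))
# → Given the company domain: acme.com
```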

2. Clear Goal

One sentence describing what you want to find:

Goal: Detect what SaaS platforms this company uses by analyzing their portal URLs.

3. Step-by-Step Instructions with Decision Logic

Numbered steps with specific sources and explicit decision trees:

Research steps:
1. Look for customer-facing portals:
   - Check {{domain}}/login, {{domain}}/portal, {{domain}}/app
   - Look for "Client Login", "Patient Portal" in footer
2. When you find a portal link, analyze the URL pattern:
   **Subdomain Patterns (highest confidence):**
   - {company}.salesforce.com → Salesforce
   - {company}.hubspot.com → HubSpot
   - {company}.janeapp.com → Jane App
💡
Why Decision Trees Matter

Notice the explicit URL pattern mapping. This is what separates 70% accuracy prompts from 95% accuracy prompts. Don't say "find what CRM they use" — tell the model exactly how to detect it.

4. Fallback Instructions

What to do when the primary approach fails:

If no portal found on website:
- Check if login redirects to a third-party domain
- Look for embedded widgets (chat, booking, support)

5. JSON Output Specification with Evidence

Explicit structure with field descriptions and evidence requirements:

Output as JSON:
{
  "platforms_detected": [
    {
      "platform_name": "Name of platform",
      "platform_category": "CRM | Support | Scheduling | EHR",
      "detection_method": "subdomain | redirect | widget",
      "evidence_url": "URL that revealed this"
    }
  ],
  "portal_url": "Customer portal URL if found" | null,
  "confidence": "high" | "medium" | "low"
}
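Note that the pipe notation above (`"high" | "medium" | "low"`) is shorthand for an enum of allowed values, not literal JSON. You can enforce the spec downstream with a small validator; the field names come from the spec, but the `validate` helper itself is an illustrative assumption:

```python
# Sketch: validate a Claygent response against the output spec above.
# The allowed-value sets mirror the spec's pipe notation; the validator
# is illustrative, not part of Clay or AutoClaygent.
import json

ALLOWED_CONFIDENCE = {"high", "medium", "low"}
ALLOWED_METHODS = {"subdomain", "redirect", "widget"}

def validate(raw: str) -> list[str]:
    errors = []
    data = json.loads(raw)
    if data.get("confidence") not in ALLOWED_CONFIDENCE:
        errors.append("confidence must be high, medium, or low")
    for p in data.get("platforms_detected", []):
        if p.get("detection_method") not in ALLOWED_METHODS:
            errors.append(f"bad detection_method: {p.get('detection_method')}")
        if not p.get("evidence_url"):
            errors.append("each platform needs an evidence_url")
    return errors

sample = '{"platforms_detected": [], "portal_url": null, "confidence": "high"}'
print(validate(sample))  # → []
```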

Bad vs. Good: A Comparison

Bad Prompt (Score: ~4.0)

Find what CRM or booking platform this company uses.

Problems: No context, no detection method, no output format, no fallback, no confidence levels.

Good Prompt (Score: ~8.5)

Given the company domain: {{domain}}

Goal: Detect what SaaS platforms this company uses by analyzing their portal URLs.

Research steps:
1. Look for customer-facing portals:
   - Check {{domain}}/login, {{domain}}/portal, {{domain}}/app
   - Look for "Client Login", "Patient Portal", "Customer Portal" in footer
   - Check navigation menu for portal links
2. When you find a portal link, analyze the URL pattern:
   **Subdomain Patterns (highest confidence):**
   - {company}.salesforce.com → Salesforce
   - {company}.hubspot.com → HubSpot
   - {company}.zendesk.com → Zendesk
   - {company}.janeapp.com → Jane App
   - {company}.mindbody.io → Mindbody
3. Check for scheduling/booking tools:
   - Look for "Book Now", "Schedule", "Appointments" buttons
   - These often reveal Calendly, Acuity, Cal.com, etc.

If no portal found on website:
- Check if login redirects to a third-party domain
- Look for embedded widgets (chat, booking, support)

Output as JSON:
{
  "platforms_detected": [
    {
      "platform_name": "Name of platform",
      "platform_category": "CRM | Support | Scheduling | EHR",
      "detection_method": "subdomain | redirect | widget",
      "evidence_url": "URL that revealed this"
    }
  ],
  "portal_url": "Customer portal URL if found" | null,
  "confidence": "high" | "medium" | "low"
}

IMPORTANT:
- URL patterns are 95%+ accurate — trust subdomains over text
- If no platform detected, return empty array (don't guess)
- "high" confidence = subdomain pattern match
- "medium" confidence = redirect or widget detection
- "low" confidence = text mentions only
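The 5-part anatomy lends itself to template assembly: each section is a separate string, joined in a fixed order. The `build_prompt` helper below is a sketch under that assumption, not AutoClaygent's actual generator.

```python
# Sketch: assemble the 5-part structure into one prompt string.
# Section labels mirror the anatomy above; the function is illustrative.
def build_prompt(context: str, goal: str, steps: str, fallback: str, output_spec: str) -> str:
    return "\n\n".join([
        context,                                         # 1. Input Context
        f"Goal: {goal}",                                 # 2. Clear Goal
        "Research steps:\n" + steps,                     # 3. Steps with decision logic
        "If the primary approach fails:\n" + fallback,   # 4. Fallback instructions
        "Output as JSON:\n" + output_spec,               # 5. Output specification
    ])

prompt = build_prompt(
    context="Given the company domain: {{domain}}",
    goal="Detect what SaaS platforms this company uses.",
    steps="1. Check {{domain}}/login for portal links",
    fallback="- Check if login redirects to a third-party domain",
    output_spec='{"platforms_detected": [], "confidence": "low"}',
)
```

Keeping the sections separate also makes it easy to iterate on one part (say, the decision tree) without touching the rest.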

7 Anti-Patterns to Avoid

  1. "Search the web for..." — Be specific about sources (URL patterns, navigation, footer)
  2. More than 3 tasks — Split into multiple Claygents
  3. No detection method — Explain HOW to find it (subdomain patterns, redirects)
  4. No JSON output format — Specify exact structure with required fields
  5. No evidence requirement — Always ask for the URL/source that proved the finding
  6. Asking for guesses — Instruct to return null or empty when uncertain
  7. No confidence gradients — Define what makes high vs medium vs low confidence
💡
Pro Tip

The best prompts include the detection method accuracy. "URL subdomain patterns are 95%+ accurate" tells the model to prioritize that signal over text scraping (70% accurate).
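Several of the anti-patterns above are mechanically checkable. A few of them expressed as simple lint rules, purely as a sketch (the keyword checks are crude assumptions, not a real scoring system):

```python
# Sketch: keyword-based lint rules for a handful of the anti-patterns above.
# The checks are deliberately naive and illustrative only.
def lint_prompt(prompt: str) -> list[str]:
    text = prompt.lower()
    warnings = []
    if "search the web" in text:
        warnings.append("vague source: name URL patterns or page sections instead")
    if "json" not in text:
        warnings.append("no JSON output format specified")
    if "confidence" not in text:
        warnings.append("no confidence gradient defined")
    if "evidence" not in text and "url" not in text:
        warnings.append("no evidence requirement")
    return warnings

# The "bad prompt" from the comparison above trips three of the four rules:
print(lint_prompt("Find what CRM or booking platform this company uses."))
```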

Try It Yourself

Type a natural language goal and see how it transforms into a structured prompt:

AutoClaygent generates prompts like these automatically based on your goals.

How AutoClaygent Handles This

AutoClaygent generates all 5 parts automatically—Input Context, Goal, Steps with Decision Trees, Fallbacks, and JSON Output—based on your plain English description. It also generates a valid JSON schema and iteratively improves the prompt until it scores 8.0+.

Key Takeaways

  • Limit each Claygent to 3 or fewer distinct tasks
  • Every prompt needs: Input, Goal, Steps with Decision Logic, Fallback, JSON Output
  • Include explicit detection methods — don't just say "find X"
  • Require evidence URLs for every finding
  • Define confidence levels with specific criteria
  • Instruct to return null/empty when data can't be verified

Ready to build Claygents that actually work?

Get the complete course with interactive playground and all 9 prompt patterns.