You’ve built tools that LLMs can call and resources that clients can load. Now you’ll learn about prompts - reusable templates that guide users in how to interact with your MCP server. Unlike tools and resources, prompts are user-controlled - users explicitly choose when to use them through a UI menu or slash command.
Creating Prompts with FastMCP
FastMCP provides the @mcp.prompt() decorator to create prompts:
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Movies GraphRAG Server")


@mcp.prompt()
def movie_recommendation() -> str:
    """Get movie recommendations based on your preferences."""
    return """I'd like to discover new movies to watch.
Please ask me about:
1. What genres I enjoy
2. Any specific movies I've loved
3. My preferred movie era or style
Then recommend 5 movies I might enjoy and explain why each one would be a good fit."""
```

Prompts with Parameters
Prompts can accept parameters to customize the template:
```python
@mcp.prompt()
def similar_movies(movie_title: str, count: int = 5) -> str:
    """Find movies similar to one you enjoyed."""
    return f"""I really enjoyed the movie "{movie_title}".
Can you recommend {count} similar movies and explain why each one is similar?
Consider factors like:
- Genre and themes
- Director or actors
- Mood and tone
- Era and style"""
```

Users provide the parameters when invoking the prompt, and the template is filled in.
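On the client side, invoking a parameterized prompt looks roughly like the sketch below. It uses the Python SDK's ClientSession and assumes the connection setup already exists; the prompt and argument names mirror the similar_movies example above.

```python
from mcp import ClientSession


async def pick_similar_movies(session: ClientSession) -> None:
    """Sketch: list available prompts, then invoke one with arguments."""
    # Discover which prompts the server exposes
    available = await session.list_prompts()
    print([prompt.name for prompt in available.prompts])

    # Invoke the prompt; per the MCP spec, argument values are sent as strings
    result = await session.get_prompt(
        "similar_movies",
        arguments={"movie_title": "Inception", "count": "3"},
    )
    for message in result.messages:
        print(message.role, message.content)
```

In an interactive client, this flow is typically triggered by a slash command or a prompt picker rather than written by hand.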
Multi-Message Prompts
Prompts can return multiple messages to create a conversation flow:
```python
from mcp.server.fastmcp.prompts import base


@mcp.prompt()
def analyze_preferences(favorite_movies: str) -> list[base.Message]:
    """Analyze your movie preferences and recommend genres."""
    return [
        base.UserMessage(
            content=f"Here are my favorite movies: {favorite_movies}"
        ),
        base.AssistantMessage(
            content="I'll analyze your movie preferences. Let me look at the genres and themes..."
        ),
        base.UserMessage(
            content="Based on my favorites, what genres should I explore next?"
        ),
    ]
```

This creates a conversation starter that includes both user and assistant messages.
Prompt Best Practices
1. Be specific and actionable:
```python
# Good - specific guidance
@mcp.prompt()
def movie_night_planner() -> str:
    """Plan a themed movie night."""
    return """Help me plan a movie night with these details:
1. Theme or genre I want to explore
2. Number of movies (2-4 recommended)
3. Any constraints (runtime, rating, era)
Create a curated list with:
- Movie titles and years
- Why they fit the theme
- Suggested watching order
- Total runtime"""


# Avoid - too vague
@mcp.prompt()
def movies() -> str:
    """Movies."""
    return "Tell me about movies"
```

2. Provide structure:
```python
@mcp.prompt()
def movie_review_template(movie_title: str) -> str:
    """Write a structured movie review."""
    return f"""Write a review of "{movie_title}" covering:
**Plot Summary** (no spoilers)
- Brief overview in 2-3 sentences
**Strengths**
- What worked well?
- Standout performances?
**Weaknesses**
- What could be improved?
**Overall Verdict**
- Rating out of 10
- Who would enjoy this movie?"""
```

3. Use clear parameters:
```python
@mcp.prompt()
def discovery_prompt(
    genre: str,
    decade: str = "any",
    mood: str = "any"
) -> str:
    """Discover hidden gems in a specific genre."""
    filters = []
    if decade != "any":
        filters.append(f"from the {decade}s")
    if mood != "any":
        filters.append(f"with a {mood} mood")
    filter_text = " ".join(filters) if filters else "from any era"
    return f"""Help me discover lesser-known {genre} movies {filter_text}.
Find me 5 hidden gems that:
- Have high ratings but are under-appreciated
- Represent the genre well
- Offer something unique
For each movie, explain:
- Why it's worth watching
- What makes it special
- Who would enjoy it"""
```

When to Use Prompts
Prompts are ideal for common workflows and complex requests. In our movie server, they help with tasks like:
- **Movie recommendations** - "Find movies based on my favorites"
- **Themed planning** - "Plan a movie marathon with specific criteria"
- **Guided discovery** - "Help me explore new genres with structured questions"
- **Analysis templates** - "Compare two movies using standard criteria" (see the sketch after this list)
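As an example of the last item, an analysis template can bake standard comparison criteria straight into the prompt text. This is a sketch - the compare_movies name and rubric are illustrative, not part of the server built earlier:

```python
@mcp.prompt()
def compare_movies(movie_a: str, movie_b: str) -> str:
    """Compare two movies using standard criteria."""
    return f"""Compare "{movie_a}" and "{movie_b}" using these criteria:
**Story and Themes**
- How do their plots and central ideas differ?
**Craft**
- Direction, performances, cinematography, and score
**Audience Fit**
- Who would prefer each movie, and why?
**Verdict**
- Which one should I watch first, and why?"""
```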
Prompts vs Tools
While tools are functions that execute code when the LLM needs them, prompts are pre-written templates that users explicitly select. They work together - a prompt might guide the user to ask questions that lead the LLM to call specific tools. For example, a movie recommendation prompt could guide the conversation that leads to calling the search_movies_by_genre() tool.
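To make that division of labor concrete, here is a sketch of the two side by side, assuming the mcp server object created earlier in this lesson. The search_movies_by_genre() tool body is stubbed with placeholder data (the real tool would query the movie database), and the genre_explorer prompt name is illustrative:

```python
@mcp.tool()
def search_movies_by_genre(genre: str, limit: int = 5) -> list[dict]:
    """Find movies in a genre (stubbed here for illustration)."""
    return [
        {"title": f"Placeholder {genre} movie {i + 1}", "genre": genre}
        for i in range(limit)
    ]


@mcp.prompt()
def genre_explorer(genre: str) -> str:
    """Guide a conversation that naturally leads to a genre search."""
    return f"""I want to explore {genre} movies.
Please look up well-regarded {genre} movies, then ask me which ones sound
interesting so we can narrow the list down to something I'd enjoy."""
```

The prompt never calls the tool itself - it only shapes the user's request so that the LLM is likely to reach for the tool on its own.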
Adding Prompts to Your Server
Prompts are simple to add - just use the decorator:
```python
@mcp.prompt()
def movie_discovery(genre: str = "any") -> str:
    """Discover new movies in a genre."""
    if genre == "any":
        return """Help me discover new movies! What genres do I enjoy?
What recent movies have I loved? Do I prefer classics or new releases?"""
    return f"""Recommend 5 diverse {genre} movies that span different
styles and eras. Explain why each is a great example of the genre."""
```

Summary
In this lesson, you learned about MCP prompts:
- **User-controlled templates** - Users explicitly invoke prompts
- **@mcp.prompt() decorator** - Create prompts with optional parameters
- **Multi-message prompts** - Build conversation flows
- **Best practices** - Be specific, provide structure, use clear parameters
- **Use cases** - Common workflows, guided interactions, templates
- **Prompts vs Tools** - Templates vs executable functions
Prompts make your server more user-friendly by providing pre-written templates for common tasks.
In the next module, you’ll learn how to integrate MCP tools into your development workflows.