Short-Term Memory

LLMs are stateless — without memory management, every message is a fresh start. This module covers the conversation memory layer: how to store messages as a linked graph, how to retrieve and search them, and how adding a message automatically seeds the long-term memory layer through entity extraction.
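To make the linked-graph idea concrete, here is a minimal, hypothetical sketch: each message node keeps a pointer to the message before it, so a session is a linked list whose tail is the most recent message. The names `Message`, `Session`, and `recent` are illustrative assumptions, not the module's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Message:
    role: str                         # "user" or "assistant"
    content: str
    prev: Optional["Message"] = None  # link to the previous message

@dataclass
class Session:
    session_id: str
    last: Optional[Message] = None    # tail of the linked list

    def add_message(self, role: str, content: str) -> Message:
        # Appending is O(1): the new node points back at the old tail.
        msg = Message(role, content, prev=self.last)
        self.last = msg
        return msg

    def recent(self, n: int) -> list[Message]:
        # Walk backwards from the tail, then reverse into chronological order.
        out, node = [], self.last
        while node is not None and len(out) < n:
            out.append(node)
            node = node.prev
        return list(reversed(out))
```

Because the links run newest-to-oldest, fetching the last *n* messages never scans the whole history, which is exactly the access pattern a chat loop needs.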

By the end of this module, you will:

  • Explain why a linked-list graph structure is more useful for conversation history than a flat table

  • Build a conversation memory layer using add_session(), add_message(), and get_recent_messages()

  • Search message history semantically using vector embeddings

  • Understand how entity extraction automatically connects short-term and long-term memory
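The three calls above can be sketched end to end. This is a hypothetical in-memory stand-in, not the module's implementation: a real layer would back these calls with a graph database and a real embedding model, and `add_message()` would also run entity extraction to seed long-term memory. Here a bag-of-words vector plays the role of an embedding so the example runs on its own.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a vector embedding (assumption for this sketch).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryLayer:
    def __init__(self) -> None:
        self.sessions: dict[str, list[dict]] = {}

    def add_session(self, session_id: str) -> None:
        self.sessions.setdefault(session_id, [])

    def add_message(self, session_id: str, role: str, content: str) -> None:
        # In the full system this is also where entity extraction would
        # fire, connecting short-term and long-term memory.
        self.sessions[session_id].append(
            {"role": role, "content": content, "vec": embed(content)}
        )

    def get_recent_messages(self, session_id: str, n: int = 5) -> list[dict]:
        return self.sessions[session_id][-n:]

    def search(self, session_id: str, query: str, top_k: int = 3) -> list[dict]:
        # Semantic search: rank stored messages by similarity to the query.
        q = embed(query)
        msgs = self.sessions[session_id]
        return sorted(msgs, key=lambda m: cosine(m["vec"], q), reverse=True)[:top_k]
```

Usage follows the lifecycle the module teaches: `add_session()` once per conversation, `add_message()` per turn, then `get_recent_messages()` for context and `search()` when the relevant turn is too old to fit in the recency window.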

Ready? Let’s go →
