
Conversation Memory and Multi-Turn Context

Understand how to manage conversation memory in LangGraph by passing accumulated dialogue history between invocations to create context-aware AI responses. Explore short-term and long-term memory strategies, state schema design, and how to build effective multi-turn conversational agents with memory reset capabilities.

Every call to app.invoke(...) is independent. LangGraph does not store anything between calls by default. If we invoke the graph three times with three different messages, each call receives only the state we pass it. Without explicit history management, the assistant treats each message as if it is the first one it has ever seen.

This causes the kind of behavior that makes AI assistants frustrating to use. A user asks about pricing, gets a helpful answer, follows up with “what about the enterprise tier?” and receives a response that has no idea what “the enterprise tier” refers to. The context from one turn is simply gone.

The fix is to make conversation history part of the state schema. Each turn appends its exchange to a history list. When we invoke the graph for the next turn, we pass the accumulated history forward as part of the initial state. The drafting node reads this history and incorporates it into the prompt, giving the model the full conversation context it needs to respond coherently.

This lesson builds that pattern. The key insight is that the caller is responsible for passing history forward between invocations. LangGraph provides the state structure. We provide the continuity.
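The caller-managed loop can be sketched in plain Python, independent of any LangGraph wiring. The drafting step below is a stub standing in for a model call; the names (`drafting_node`, the `history` key) are illustrative, not a fixed API.

```python
# A minimal sketch of caller-managed conversation history: each turn
# appends its exchange, and the caller passes the list forward.
from typing import TypedDict

class State(TypedDict):
    history: list[str]   # accumulated "role: text" lines from prior turns
    message: str         # the current user turn
    reply: str           # filled in by the drafting step

def drafting_node(state: State) -> State:
    # In a real node, the accumulated history is folded into the prompt
    # so the model sees the full conversation so far.
    prompt = "\n".join(state["history"] + [f"user: {state['message']}"])
    # Stub standing in for a model call on `prompt`.
    reply = f"(reply using {prompt.count('user:')} user turn(s) of context)"
    # Append this turn's exchange so the caller can pass it forward.
    new_history = state["history"] + [
        f"user: {state['message']}",
        f"assistant: {reply}",
    ]
    return {"history": new_history, "message": state["message"], "reply": reply}

# The caller provides the continuity: each turn starts from the
# history the previous turn returned.
history: list[str] = []
for turn in ["Tell me about pricing", "What about the enterprise tier?"]:
    state = drafting_node({"history": history, "message": turn, "reply": ""})
    history = state["history"]
```

After two turns, `history` holds four lines (two user messages and two replies), and the second reply was drafted with the first exchange in its prompt.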

Short-term memory versus long-term memory

Before looking at implementation, it is useful to draw a clear line between two different meanings of “memory” in LangGraph workflows.

  • Short-term memory is everything held in the state object during a single invocation. It lives for the duration of one app.invoke(...) call. Nodes can read it, write to it, and pass it forward. When the invocation ends, this memory is gone unless the caller explicitly captures and preserves it.

  • Long-term memory is state that persists across invocations, across process restarts, and across time. LangGraph supports this through a feature called checkpointing, which saves state snapshots to an external store such as a database or file. A later invocation can retrieve that snapshot and resume execution as if no time had passed.

The conversation history approach in this lesson is a short-term memory pattern. The caller holds onto the history list between turns and passes it back in. This works well for session-scoped conversations where we control the caller. For workflows that need to survive server restarts, support multiple users simultaneously, or resume after hours or days, checkpointing is the right tool. We cover that pattern in a dedicated later lesson. The table below compares short-term and long-term memory in LangGraph.

| Memory type | Where it lives | How long it lasts | Use case |
| --- | --- | --- | --- |
| Short-term (state) | In the state dict during one invocation | One app.invoke(...) call | Passing data between nodes in a single run |
| Session-scoped (caller-managed) | In the caller's variables between invocations | Duration of a session | Conversation history passed forward by the caller |
| Long-term (checkpointed) | External store (database, file) | Indefinitely | Resuming across restarts, multi-user workflows |

Two strategies for managing conversation memory

Before building, it is worth understanding the two main approaches to conversation memory. They make different trade-offs between context ...