Memory
Discover different ways of storing chat history in the LangChain framework.
So far, we’ve seen a single call to a model that returns a single response. In many applications, however, we need a series of questions or a structured conversation to achieve a contextually coherent interaction with the language model. LangChain provides memory buffers to support such a conversational interface by storing the information exchanged between the user and the model. These memory buffers act as a context store, allowing the model to retain and use relevant information from previous messages in the ongoing conversation.
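As a rough sketch of the idea (assuming the classic LangChain API with an OpenAI model; the exact imports and class names may differ depending on your LangChain version), a conversation backed by a memory buffer might look like this:

```python
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

# The memory buffer stores every exchange between the user and the model.
memory = ConversationBufferMemory()

# The chain injects the stored history into each new prompt.
conversation = ConversationChain(llm=OpenAI(temperature=0), memory=memory)

conversation.predict(input="Hi, my name is Alice.")
# Because the first message is kept in memory, the model can answer this:
conversation.predict(input="What is my name?")

# Inspect the raw conversation history held in the buffer.
print(memory.buffer)
```

Each call to `predict` appends the user message and the model's reply to the buffer, so later turns are answered with the full conversation as context.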