Part 15: Exploring Memory Types in Conversational AI: A Deep Dive

Memory and Summarization Use Case

In the rapidly evolving landscape of conversational AI, memory plays a crucial role in creating intelligent and context-aware interactions. Understanding different types of memory and how they can be leveraged is essential for building robust AI applications. In this blog post, we'll explore three primary types of memory—Conversation Buffer Memory, Conversation Buffer Window Memory, and Conversation Summary Memory—and discuss their applications and configurations.

Understanding Memory Types

1. Conversation Buffer Memory


Conversation Buffer Memory is the simplest form of memory used in conversational AI. It stores the entire dialogue history in raw form. When a user asks a question, it is sent to both the memory and the large language model (LLM), and the LLM's response is saved back into memory as well.

  • Use Case: Ideal for short conversations where the dialogue history is manageable.

  • Limitations: As conversations grow longer, the memory can become too large, potentially exceeding the token limit of the LLM.

NOTE: Memory size increases with additional conversation.
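To make the mechanics concrete, here is a minimal Python sketch of buffer memory (the class name and methods are illustrative, not any particular library's API): every exchange is appended verbatim, so the context sent to the LLM grows without bound.

```python
# Minimal sketch of conversation buffer memory: the full dialogue
# history is kept verbatim and grows with every exchange.
class ConversationBufferMemory:
    def __init__(self):
        self.history = []  # list of (role, text) pairs

    def save_context(self, user_input, ai_output):
        self.history.append(("Human", user_input))
        self.history.append(("AI", ai_output))

    def load(self):
        # Everything ever said is sent to the LLM as context.
        return "\n".join(f"{role}: {text}" for role, text in self.history)

memory = ConversationBufferMemory()
memory.save_context("Hi, I'm Alice.", "Hello Alice! How can I help?")
memory.save_context("What's my name?", "Your name is Alice.")
print(memory.load())
```

Because `load()` returns the whole history, answering "What's my name?" works, but the prompt gets longer with every turn, which is exactly the token-limit risk described above.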

2. Conversation Buffer Window Memory

To address the limitations of Conversation Buffer Memory, Conversation Buffer Window Memory introduces a "window" that limits the amount of conversation history stored.

  • Key Feature: Allows configuration of the window size, such as retaining only the last few interactions.

  • Use Case: Suitable for applications where recent context is important, but the full conversation history is unnecessary.

  • Configuration: Adjust the window size based on the LLM's token limits and system capabilities.

NOTE: Only the last K interactions are saved in memory.
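The windowing idea can be sketched in a few lines of Python using a bounded deque (again illustrative, not a specific library's implementation): with `k=2`, only the two most recent exchanges survive, no matter how long the conversation runs.

```python
from collections import deque

# Minimal sketch of buffer window memory: only the last K exchanges
# are retained, so memory size stays bounded regardless of length.
class ConversationBufferWindowMemory:
    def __init__(self, k=2):
        # Each exchange is one (user, ai) pair; maxlen=k drops the oldest.
        self.window = deque(maxlen=k)

    def save_context(self, user_input, ai_output):
        self.window.append((user_input, ai_output))

    def load(self):
        return "\n".join(f"Human: {u}\nAI: {a}" for u, a in self.window)

memory = ConversationBufferWindowMemory(k=2)
for i in range(1, 5):
    memory.save_context(f"question {i}", f"answer {i}")
print(memory.load())  # only exchanges 3 and 4 survive
```

Choosing `k` is the key configuration decision: large enough that recent context is preserved, small enough that the window always fits the LLM's token budget.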

3. Conversation Summary Memory

Conversation Summary Memory takes a different approach by maintaining a running summary of the conversation. Instead of storing raw text, it continuously updates a concise summary with each interaction.

  • Key Feature: Keeps the memory size small by summarizing the dialogue.

  • Use Case: Ideal for long conversations where maintaining a full history is impractical.

  • Benefits: Reduces concerns about token limits, as the summary can be configured to stay within acceptable bounds.

NOTE: Save a running summary of the conversation
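A sketch of the running-summary pattern: in a real system, `summarize` would be an LLM call that condenses the previous summary plus the new exchange; here a naive truncating function stands in for it so the example runs on its own.

```python
# Minimal sketch of summary memory. The summarizer below is a
# placeholder: a real implementation would ask an LLM to condense
# the old summary together with the newest exchange.
def summarize(previous_summary, user_input, ai_output):
    combined = f"{previous_summary} User said: {user_input} AI said: {ai_output}".strip()
    return combined[-200:]  # keep the summary within a fixed budget

class ConversationSummaryMemory:
    def __init__(self):
        self.summary = ""

    def save_context(self, user_input, ai_output):
        self.summary = summarize(self.summary, user_input, ai_output)

    def load(self):
        return self.summary

memory = ConversationSummaryMemory()
for i in range(10):
    memory.save_context(f"question {i}", f"a long answer to question {i}")
print(memory.load())
```

However many turns occur, `load()` never exceeds the configured budget, which is why this approach sidesteps token-limit concerns. The trade-off is that detail from early turns is irretrievably compressed away.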

Implementing Memory Types

Using Flowise and Langflow

In Flowise, memory can be incorporated into conversational applications using specific blocks for each memory type. These blocks can be attached to a conversation chain, enabling the application to retain context and generate responses based on the selected memory type.

  • Buffer Memory: Increases with conversation length, leading to potential token issues.

  • Buffer Window Memory: Configurable to store a set number of recent interactions.

  • Summary Memory: Maintains a running summary, keeping the memory lightweight.

External Memory Providers

External memory providers offer additional capabilities, such as session-based memory management. For instance, using a session ID, you can track individual user interactions without maintaining the entire chat history locally.

  • Example: Providers like Zep Memory allow for session-specific memory, enabling personalized interactions across sessions.
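The session-ID idea can be sketched as a simple keyed store (a toy illustration of the pattern, not Zep's actual API): each session gets its own isolated history, so users never see each other's context and no single global chat log accumulates.

```python
# Hypothetical sketch of session-scoped memory: each session ID
# maps to its own history, keeping users' conversations isolated.
class SessionMemoryStore:
    def __init__(self):
        self.sessions = {}  # session_id -> list of (role, text)

    def save(self, session_id, user_input, ai_output):
        history = self.sessions.setdefault(session_id, [])
        history.append(("Human", user_input))
        history.append(("AI", ai_output))

    def load(self, session_id):
        # Unknown sessions simply start empty.
        return self.sessions.get(session_id, [])

store = SessionMemoryStore()
store.save("alice-1", "Hi", "Hello!")
store.save("bob-7", "Hey", "Hi there!")
print(store.load("alice-1"))
```

An external provider adds persistence, expiry, and retrieval features on top of this basic keying, but the session-ID lookup is the core mechanism.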

Expanding Memory Options

Langflow offers a variety of memory types, including:

  • Knowledge Graph Memory: Tracks knowledge graphs and relationships.

  • Entity Memory: Retains specific entities mentioned in conversations.

These advanced memory types provide additional layers of context, enabling more sophisticated applications.
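As a toy illustration of the entity-memory idea (real implementations use an LLM or a named-entity-recognition model, not the crude capitalization heuristic below), the sketch indexes each mention by the entities it contains, so facts about "Alice" can later be recalled by name:

```python
import re

# Toy sketch of entity memory: capitalized words are treated as
# entities, and every utterance mentioning one is stored under it.
# A real system would use an LLM or NER model for extraction.
class EntityMemory:
    def __init__(self):
        self.entities = {}  # entity name -> list of mentions

    def save_context(self, text):
        for entity in re.findall(r"\b[A-Z][a-z]+\b", text):
            self.entities.setdefault(entity, []).append(text)

    def load(self, entity):
        return self.entities.get(entity, [])

memory = EntityMemory()
memory.save_context("Alice works at Acme in Berlin.")
print(memory.load("Alice"))
```

Knowledge Graph Memory extends the same idea one step further by also recording the relationships between entities, not just their mentions.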

Conclusion

Memory is a cornerstone of effective conversational AI, enabling systems to understand and respond to user inputs in a contextually relevant manner. By selecting the appropriate memory type—whether it's Buffer, Buffer Window, or Summary Memory—developers can tailor their applications to meet specific needs and constraints. As the field of AI continues to evolve, exploring and integrating various memory types will be essential for building intelligent and engaging conversational experiences. Whether you're using Flowise, Langflow, or external providers, the flexibility and power of memory in AI systems are vast, promising exciting possibilities for future applications.
