Type: New Feature
Resolution: Unresolved
Priority: Major - P3
Affects Version/s: None
Component/s: None
Target "Soft" date: April 15th, 2026
Context
LangGraph is a framework in the LangChain ecosystem that allows the programmatic creation of "Graphs" to represent the flow of interaction with AI agents. Each "Node" represents a function that alters the data "State" and connects to other Nodes through "Edges". LangGraph provides short-term memory ("Checkpoints") and long-term memory ("Stores") to persist data across sessions and agents.
Checkpoints persist data for "Threads" (a single conversation/interaction) across sessions, allowing the user to come back and continue the interaction. MongoDB already has a Checkpointer library for JavaScript.
Stores persist data across Threads, allowing agents to remember things about the user or their interactions, like their preferences, food allergies, or job experience. This is the library we want to implement so agents can persist long-term data in MongoDB. The Python team has already written their own implementation.
Task Summary
- Create a new module that exports a MongoDBStore class extending the BaseStore interface.
- The new class must work with Voyage AutoEmbeddings (NODE-7340).
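The intended shape of the new module might look like the sketch below. It is illustrative only: the method signatures and the namespace-to-document mapping are assumptions modeled on the Store contract described in the LangGraph docs, a real implementation would extend BaseStore from the LangGraph JS packages and back the methods with a `Collection` from the `mongodb` driver rather than the in-memory Map used here to keep the sketch self-contained.

```typescript
// Minimal stand-in for the LangGraph Item type; the real implementation
// would import BaseStore and Item from the LangGraph JS packages.
interface Item {
  namespace: string[];
  key: string;
  value: Record<string, unknown>;
}

// Map-backed stand-in for a MongoDB collection so the sketch is runnable;
// the real MongoDBStore would hold a driver Collection instead.
class MongoDBStore {
  private docs = new Map<string, Item>();

  // Namespaces are hierarchical arrays, e.g. ["user-123", "memories"];
  // joined with the key they yield a unique document identifier.
  private docId(namespace: string[], key: string): string {
    return [...namespace, key].join("/");
  }

  async put(namespace: string[], key: string, value: Record<string, unknown>): Promise<void> {
    // With the driver this would be an upsert (updateOne with { upsert: true }).
    this.docs.set(this.docId(namespace, key), { namespace, key, value });
  }

  async get(namespace: string[], key: string): Promise<Item | null> {
    return this.docs.get(this.docId(namespace, key)) ?? null;
  }

  async delete(namespace: string[], key: string): Promise<void> {
    this.docs.delete(this.docId(namespace, key));
  }

  // Prefix search over namespaces; vector similarity via Atlas Vector Search
  // and Voyage auto-embeddings (NODE-7340) would replace this linear scan.
  async search(namespacePrefix: string[]): Promise<Item[]> {
    return [...this.docs.values()].filter((item) =>
      namespacePrefix.every((part, i) => item.namespace[i] === part)
    );
  }
}
```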
Desired User Experience
Review attached sample application: sample.zip
References:
- LangGraph Docs: Short-Term vs Long-Term memory overview
- LangGraph Docs: Data Persistence
- langgraph-store-mongodb (Python implementation of MongoDBStore)
- checkpoint-mongodb (TypeScript implementation of MongoDB Checkpointer)
Additional Details
https://docs.langchain.com/oss/javascript/langgraph/memory#long-term-memory
NOTE: This experience should include Voyage auto-embedding support for MongoDB Vector Store.
Use Case
As a... LangGraph JS developer
I want... a persisted Store API to manage data across different conversation threads and sessions.
So that... agents can retain user facts, preferences, and learned instructions (long-term memory) without resetting context between interactions.
User Experience
Developers gain a unified interface (Store) to save/retrieve JSON documents via custom namespaces (not just Thread IDs). End-users experience agents that remember them over time.
If bug: N/A
Dependencies
- Upstream: LangGraph State management, Embedding/Vector provider interfaces.
- Downstream: Persistent storage adapters (Postgres, Redis, etc.).
Risks/Unknowns
- Latency: "Hot path" memory updates (during generation) may slow down responses.
- Context Overload: Retrieving too many memories via search may exceed LLM token limits.
- Data Drift: JSON schemas for user profiles may become corrupted or bloated over time without strict validation.
Acceptance Criteria
Implementation Requirements
- Implement BaseStore interface: get, put, delete, search.
- Support hierarchical namespaces (e.g., [userId, "memories"]).
- Support Semantic Search (vector similarity) and Metadata Filtering.
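The two search modes above can be illustrated with a toy example. The tiny hand-made vectors below stand in for Voyage auto-embeddings (NODE-7340), and the linear scan stands in for Atlas Vector Search; `searchMemories`, `MemoryDoc`, and the exact filter semantics are hypothetical names for this sketch, not the final API.

```typescript
// A memory document with an embedding attached for similarity ranking.
interface MemoryDoc {
  key: string;
  value: Record<string, unknown>;
  embedding: number[];
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Metadata filtering (exact match on value fields) followed by
// semantic ranking (descending cosine similarity to the query).
function searchMemories(
  docs: MemoryDoc[],
  queryEmbedding: number[],
  filter: Record<string, unknown> = {}
): MemoryDoc[] {
  return docs
    .filter((d) => Object.entries(filter).every(([k, v]) => d.value[k] === v))
    .sort((a, b) => cosine(b.embedding, queryEmbedding) - cosine(a.embedding, queryEmbedding));
}
```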
Testing Requirements
- Unit tests for CRUD operations on InMemoryStore.
- Mocked tests for vector search/embedding generation.
Documentation Requirements
- API documentation for Store.
- Examples for "Hot Path" vs "Background" memory updates.
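The "Hot Path" vs "Background" distinction the docs should cover can be sketched as below. The store object, node function, and queue are stand-ins invented for this sketch; the real code would use the MongoDBStore this ticket proposes inside a LangGraph node.

```typescript
// Stand-in store that records writes so both patterns are observable;
// the real object would be the proposed MongoDBStore.
const writes: Array<{ namespace: string[]; key: string }> = [];
const store = {
  put: async (namespace: string[], key: string, _value: Record<string, unknown>) => {
    writes.push({ namespace, key });
  },
};

// "Hot path": the node writes memory while producing the user-facing
// response, so store latency is added to the turn (the latency risk above).
async function chatNode(userId: string, message: string): Promise<string> {
  await store.put([userId, "memories"], "last-topic", { topic: message });
  return `Reply to: ${message}`;
}

// "Background": transcripts are queued and memories are extracted after
// the response is sent, keeping the user-facing turn fast.
const queue: Array<{ userId: string; transcript: string[] }> = [];

async function processQueueInBackground(): Promise<void> {
  for (const job of queue.splice(0)) {
    await store.put([job.userId, "memories"], "summary", { text: job.transcript.join(" ") });
  }
}
```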
Follow Up Requirements
- Implement backend-specific stores (Postgres, Redis).
- Parity check with LangGraph Python BaseStore behavior.
Issue Links
- depends on: NODE-7340 [LangChainJS] add Auto Embedding support (Backlog)
- is depended on by: DRIVERS-3350 [AI-Frameworks] Auto embedding support in AI frameworks (LangChain & LangGraph) (In Progress)