Today, we're excited to introduce langgraph-checkpoint-redis, a new integration that brings Redis's powerful memory capabilities to LangGraph. This collaboration gives developers the tools to build more effective AI agents with persistent memory across conversations and sessions.
LangGraph is an open-source framework for building stateful, agentic workflows with LLMs. With this Redis integration, developers can now leverage thread-level persistence and cross-thread memory to create agents that remember context, learn from experience, and make better decisions over time.
Memory is essential to building effective AI agents. The most successful implementations use simple, composable patterns to manage memory, and Redis is well suited to this role.
The langgraph-checkpoint-redis package provides two core capabilities that map directly to the memory patterns found in agent systems:
RedisSaver and AsyncRedisSaver provide thread-level persistence, allowing agents to maintain continuity across multiple interactions within the same conversation thread.
RedisStore and AsyncRedisStore enable cross-thread memory, allowing agents to access and store information that persists across different conversation threads.
Let's look at how easy it is to add memory to a simple chatbot using Redis persistence:
from typing import Literal

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from langgraph.checkpoint.redis import RedisSaver

# Define a simple tool
@tool
def get_weather(city: Literal["nyc", "sf"]):
    """Use this to get weather information."""
    if city == "nyc":
        return "It might be cloudy in nyc"
    elif city == "sf":
        return "It's always sunny in sf"
    else:
        raise AssertionError("Unknown city")

# Set up model and tools
tools = [get_weather]
model = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# Create Redis persistence
REDIS_URI = "redis://localhost:6379"
with RedisSaver.from_conn_string(REDIS_URI) as checkpointer:
    # Initialize Redis indices (only needed once)
    checkpointer.setup()

    # Create agent with memory
    graph = create_react_agent(model, tools=tools, checkpointer=checkpointer)

    # Use the agent with a specific thread ID to maintain conversation state
    config = {"configurable": {"thread_id": "user123"}}
    res = graph.invoke({"messages": [("human", "what's the weather in sf")]}, config)
This simple setup lets the agent remember the conversation history for "user123" across multiple interactions. The thread ID acts as a conversation identifier, and all state is stored in Redis for fast retrieval.
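Conceptually, thread-level persistence keys each conversation's state by its thread ID, so interactions on the same thread accumulate history while other threads stay isolated. Here is a toy sketch of that idea using a plain in-memory dict (this is not the real RedisSaver implementation, just an illustration of the concept):

```python
# Toy illustration of thread-level persistence: state keyed by thread_id.
# NOT the real RedisSaver internals -- just the isolation concept.
checkpoints: dict[str, list[tuple[str, str]]] = {}

def invoke(thread_id: str, message: str) -> list[tuple[str, str]]:
    """Append a message to the thread's history and return the full history."""
    history = checkpoints.setdefault(thread_id, [])
    history.append(("human", message))
    return history

# Two interactions on the same thread accumulate history...
invoke("user123", "what's the weather in sf")
invoke("user123", "and in nyc?")

# ...while a different thread starts fresh.
invoke("other-user", "hello")

print(len(checkpoints["user123"]))    # 2
print(len(checkpoints["other-user"]))  # 1
```

With RedisSaver, the same idea applies, except the history lives in Redis and survives process restarts.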
For more advanced use cases, you might want agents to remember user information across different conversation threads. Here's a simplified example of cross-thread memory using RedisStore:
import uuid

from langchain_anthropic import ChatAnthropic
from langchain_core.runnables import RunnableConfig
from langgraph.checkpoint.redis import RedisSaver
from langgraph.graph import START, MessagesState, StateGraph
from langgraph.store.redis import RedisStore
from langgraph.store.base import BaseStore

# Set up model
model = ChatAnthropic(model="claude-3-5-sonnet-20240620")

# Function that uses store to access and save user memories
def call_model(state: MessagesState, config: RunnableConfig, *, store: BaseStore):
    user_id = config["configurable"]["user_id"]
    namespace = ("memories", user_id)

    # Retrieve relevant memories for this user
    memories = store.search(namespace, query=str(state["messages"][-1].content))
    info = "\n".join([d.value["data"] for d in memories])
    system_msg = f"You are a helpful assistant talking to the user. User info: {info}"

    # Store new memories if the user asks to remember something
    last_message = state["messages"][-1]
    if "remember" in last_message.content.lower():
        memory = "User name is Bob"
        store.put(namespace, str(uuid.uuid4()), {"data": memory})

    # Generate response
    response = model.invoke(
        [{"role": "system", "content": system_msg}] + state["messages"]
    )
    return {"messages": response}

# Build the graph
builder = StateGraph(MessagesState)
builder.add_node("call_model", call_model)
builder.add_edge(START, "call_model")

# Initialize Redis persistence and store
REDIS_URI = "redis://localhost:6379"
with RedisSaver.from_conn_string(REDIS_URI) as checkpointer:
    checkpointer.setup()
    with RedisStore.from_conn_string(REDIS_URI) as store:
        store.setup()

        # Compile graph with both checkpointer and store
        graph = builder.compile(checkpointer=checkpointer, store=store)
With this agent, we can maintain both conversation state (thread-level) and user information (cross-thread):
# First conversation - tell the agent to remember something
config = {"configurable": {"thread_id": "convo1", "user_id": "user123"}}
response = graph.invoke(
    {"messages": [{"role": "user", "content": "Hi! Remember: my name is Bob"}]},
    config
)

# Second conversation - different thread but same user
new_config = {"configurable": {"thread_id": "convo2", "user_id": "user123"}}
response = graph.invoke(
    {"messages": [{"role": "user", "content": "What's my name?"}]},
    new_config
)

# Agent will respond with "Your name is Bob"
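The cross-thread behavior comes from the store's namespacing: memories are keyed by ("memories", user_id), so any thread that carries the same user_id sees the same entries. A minimal in-memory stand-in (not the real RedisStore, and without its query-based search) makes the mechanics concrete:

```python
import uuid

# In-memory stand-in for a namespaced store (NOT the real RedisStore).
store: dict[tuple[str, str], dict[str, dict]] = {}

def put(namespace: tuple[str, str], key: str, value: dict) -> None:
    """Save a value under (namespace, key)."""
    store.setdefault(namespace, {})[key] = value

def search(namespace: tuple[str, str]) -> list[dict]:
    """Return all values in a namespace (the real store also filters by query)."""
    return list(store.get(namespace, {}).values())

# Thread "convo1" saves a memory for user123...
namespace = ("memories", "user123")
put(namespace, str(uuid.uuid4()), {"data": "User name is Bob"})

# ...and thread "convo2" retrieves it, because the namespace depends only
# on the user ID, not on the thread ID.
memories = search(("memories", "user123"))
print(memories[0]["data"])  # User name is Bob
```

Because the namespace omits the thread ID, the memory outlives any single conversation; with RedisStore, it also outlives the process.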
In its article on memory in agentic workflows, Turing Post divides AI memory into long-term and short-term components.
Redis excels at supporting both types of memory for agent workflows.
Under the hood, langgraph-checkpoint-redis relies on Redis features, such as the search indices created by the setup() calls above, to store and retrieve agent memory efficiently.
The implementation includes both synchronous and asynchronous APIs to support different application architectures:
from langgraph.checkpoint.redis import RedisSaver
from langgraph.checkpoint.redis.aio import AsyncRedisSaver

# Synchronous API
with RedisSaver.from_conn_string(REDIS_URI) as checkpointer:
    checkpointer.setup()
    # ... use synchronously ...

# Asynchronous API
async with AsyncRedisSaver.from_conn_string(REDIS_URI) as checkpointer:
    await checkpointer.asetup()
    # ... use asynchronously ...
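This sync/async split follows Python's standard context-manager protocols: the synchronous saver implements __enter__/__exit__ for use with "with", while the asynchronous one implements __aenter__/__aexit__ for "async with". A self-contained sketch of the same pattern, using a stand-in class rather than the real savers:

```python
import asyncio

# Stand-in class illustrating the sync/async API split (NOT the real savers).
class DemoSaver:
    def __init__(self):
        self.ready = False

    # Synchronous context-manager protocol (used by "with").
    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False

    def setup(self):
        self.ready = True

    # Asynchronous context-manager protocol (used by "async with").
    async def __aenter__(self):
        return self

    async def __aexit__(self, *exc):
        return False

    async def asetup(self):
        self.ready = True

# Synchronous usage
with DemoSaver() as saver:
    saver.setup()
    print(saver.ready)  # True

# Asynchronous usage
async def main():
    async with DemoSaver() as saver:
        await saver.asetup()
        print(saver.ready)  # True

asyncio.run(main())
```

Offering both protocols lets the same persistence layer fit synchronous scripts and async web frameworks alike.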
The package offers multiple implementations to suit different needs.
Successful agent systems are built on simplicity and clarity, not complexity. The Redis integration for LangGraph follows this philosophy, offering simple, high-performance memory solutions that can be composed into sophisticated agent architectures.
Whether you're building a simple chatbot that remembers conversation context or a sophisticated agent that maintains user profiles across many interactions, langgraph-checkpoint-redis gives you the building blocks to implement these capabilities efficiently and reliably.
Get started with Redis and LangGraph today:
pip install langgraph-checkpoint-redis
Visit the GitHub repository at https://github.com/redis-developer/langgraph-redis for comprehensive documentation and examples.
By combining LangGraph's agentic workflows with Redis's powerful memory capabilities, you can build AI agents that feel more natural, responsive, and personalized. An agent's ability to remember and learn from its interactions makes all the difference between a mechanical tool and an assistant that genuinely understands your needs over time.