LangGraph (5): Customizing State
Contents
- 1. Add keys to the state
- 2. Update the state inside tools
- 3. Build the state graph
- 4. Prompt the chatbot
- 5. Add human assistance
- 6. Manually update the state
- References
1. Add keys to the state
Update the chatbot to research an entity's birthday by adding name and birthday keys to the state:
from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]
    name: str
    birthday: str
Adding this information to the state makes it easily accessible to other graph nodes as well as to the graph's persistence layer.
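As a minimal sketch (the summarize_entity node below is hypothetical and not part of the tutorial), any node can read these keys straight from its state argument once they have been populated:

def summarize_entity(state: State):
    # name and birthday are ordinary state fields, so every node receives them
    # alongside messages.
    note = f"{state['name']} was born on {state['birthday']}"
    return {"messages": [{"role": "assistant", "content": note}]}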
2. Update the state inside tools
Populate the state keys inside the human_assistance tool. This allows a human to review the information before it is stored in state. Use Command to issue a state update from inside the tool.
from langchain_core.messages import ToolMessage
from langchain_core.tools import InjectedToolCallId, tool
from langgraph.types import Command, interrupt


@tool
# Note that because we are generating a ToolMessage for a state update, we
# generally require the ID of the corresponding tool call. We can use
# LangChain's InjectedToolCallId to signal that this argument should not
# be revealed to the model in the tool's schema.
def human_assistance(
    name: str, birthday: str, tool_call_id: Annotated[str, InjectedToolCallId]
) -> str:
    """Request assistance from a human."""
    human_response = interrupt(
        {
            "question": "Is this correct?",
            "name": name,
            "birthday": birthday,
        },
    )
    # If the information is correct, update the state as-is.
    if human_response.get("correct", "").lower().startswith("y"):
        verified_name = name
        verified_birthday = birthday
        response = "Correct"
    # Otherwise, receive information from the human reviewer.
    else:
        verified_name = human_response.get("name", name)
        verified_birthday = human_response.get("birthday", birthday)
        response = f"Made a correction: {human_response}"

    # This time we explicitly update the state with a ToolMessage inside
    # the tool.
    state_update = {
        "name": verified_name,
        "birthday": verified_birthday,
        "messages": [ToolMessage(response, tool_call_id=tool_call_id)],
    }
    # We return a Command object in the tool to update our state.
    return Command(update=state_update)
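Returning Command(update=state_update) rather than a plain string is what lets the tool write to name, birthday, and messages directly: when the tool node executes the call, LangGraph applies the update to the graph state, and the included ToolMessage closes out the pending tool call.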
3. Build the state graph
Build the state graph:
from langchain_tavily import TavilySearch
from langchain.chat_models import init_chat_model

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.prebuilt import ToolNode, tools_condition
from langgraph.types import Command, interrupt

llm = init_chat_model("deepseek:deepseek-chat")

graph_builder = StateGraph(State)

tool = TavilySearch(max_results=2)
tools = [tool, human_assistance]
llm_with_tools = llm.bind_tools(tools)


def chatbot(state: State):
    message = llm_with_tools.invoke(state["messages"])
    # Because we will be interrupting during tool execution,
    # we disable parallel tool calling to avoid repeating any
    # tool invocations when we resume.
    assert len(message.tool_calls) <= 1
    return {"messages": [message]}


graph_builder.add_node("chatbot", chatbot)

tool_node = ToolNode(tools=tools)
graph_builder.add_node("tools", tool_node)

graph_builder.add_conditional_edges(
    "chatbot",
    tools_condition,
)
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge(START, "chatbot")

memory = MemorySaver()
graph = graph_builder.compile(checkpointer=memory)
4. Prompt the chatbot
Prompt the chatbot to look up the "birthday" of the LangGraph library, and direct it to use the human_assistance tool once it has the required information. By setting name and birthday in the tool's arguments, you force the chatbot to generate proposals for these fields.
user_input = (
    "Can you look up when LangGraph was released? "
    "When you have the answer, use the human_assistance tool for review."
)
config = {"configurable": {"thread_id": "1"}}

events = graph.stream(
    {"messages": [{"role": "user", "content": user_input}]},
    config,
    stream_mode="values",
)
for event in events:
    if "messages" in event:
        event["messages"][-1].pretty_print()
Output:
================================ Human Message =================================

Can you look up when LangGraph was released? When you have the answer, use the human_assistance tool for review.
================================== Ai Message ==================================
Tool Calls:
  tavily_search (call_0_d8e088fb-fd82-42e5-9c75-fcab225d7678)
 Call ID: call_0_d8e088fb-fd82-42e5-9c75-fcab225d7678
  Args:
    query: LangGraph release date
    search_depth: advanced
================================= Tool Message =================================
Name: tavily_search

{"query": "LangGraph release date", "follow_up_questions": null, "answer": null, "images": [], "results": [{"url": "https://pypi.org/project/langgraph/", "title": "langgraph - PyPI", "content": "LangGraph is inspired by Pregel and Apache Beam. The public interface draws inspiration from NetworkX. LangGraph is built by LangChain Inc, the creators of LangChain, but can be used without LangChain.\n\nProject details\n\nVerified details\n\nMaintainers\n\nUnverified details\n\nProject links\n\nMeta\n\nClassifiers\n\nRelease history\n\nRelease notifications |\n RSS feed\n\n0.4.3\n\nMay 8, 2025\n\n0.4.2\n\nMay 7, 2025\n\n0.4.1\n\nApr 30, 2025\n\n0.4.0\n\nApr 29, 2025\n\n0.3.34\n\nApr 24, 2025\n\n0.3.33\n\nApr 23, 2025 [...] langgraph 0.4.3\n\npip install langgraph\n\n\nCopy PIP instructions\n\nReleased: \n May 8, 2025\n\nBuilding stateful, multi-actor applications with LLMs\n\nNavigation\n\nVerified details\n\nMaintainers\n\nUnverified details\n\nProject links\n\nMeta\n\nClassifiers\n\nProject description\n\n\n\n[!NOTE]\nLooking for the JS version? See the JS repo and the JS docs. [...] 0.2.14\n\nAug 24, 2024\n\n0.2.13\n\nAug 23, 2024\n\n0.2.12\n\nAug 22, 2024\n\n0.2.11\n\nAug 22, 2024\n\n0.2.10\n\nAug 21, 2024\n\n0.2.9\n\nAug 21, 2024\n\n0.2.8\n\nAug 21, 2024\n\n0.2.7\n\nAug 21, 2024\n\n0.2.7a0\n \npre-release\n\nAug 21, 2024\n\n0.2.6\n\nAug 21, 2024\n\n0.2.5\n\nAug 21, 2024\n\n0.2.5a0\n \npre-release\n\nAug 20, 2024\n\n0.2.4\n\nAug 15, 2024\n\n0.2.3\n\nAug 8, 2024\n\n0.2.2\n\nAug 7, 2024\n\n0.2.1\n\nAug 7, 2024\n\n0.2.0\n\nAug 7, 2024\n\n0.1.19\n\nAug 1, 2024\n\n0.1.18\n \nyanked\n\nJul 31, 2024", "score": 0.8033131, "raw_content": null}, {"url": "https://github.com/langchain-ai/langgraph/releases", "title": "Releases · langchain-ai/langgraph - GitHub", "content": "Source code(zip) 2025-05-09T20:14:45Z \nSource code(tar.gz) 2025-05-09T20:14:45Z [...] Source code(zip) 2025-05-02T05:42:07Z \nSource code(tar.gz) 2025-05-02T05:42:07Z [...] Assets 2\n\nSource code(zip) 2025-05-15T15:20:59Z \nSource code(tar.gz) 2025-05-15T15:20:59Z", "score": 0.78194773, "raw_content": null}], "response_time": 1.39}
================================== Ai Message ==================================
Tool Calls:
  human_assistance (call_0_2a2e50aa-70ea-4892-9d89-e22d942ce91d)
 Call ID: call_0_2a2e50aa-70ea-4892-9d89-e22d942ce91d
  Args:
    name: LangGraph Release Date Review
    birthday: 2023-01-01
We hit the interrupt in the human_assistance tool again.
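As an optional check (not shown in the original tutorial), you can confirm that the graph is paused by inspecting the current snapshot; while the interrupt is unresolved, snapshot.next reports the node waiting to run:

snapshot = graph.get_state(config)
print(snapshot.next)  # e.g. ('tools',) while human_assistance is awaiting input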
5. Add human assistance
The chatbot failed to identify the correct date, so supply it with the information:
human_command = Command(
    resume={
        "name": "LangGraph",
        "birthday": "Jan 17, 2024",
    },
)

events = graph.stream(human_command, config, stream_mode="values")
for event in events:
    if "messages" in event:
        event["messages"][-1].pretty_print()
Output:
================================== Ai Message ==================================
Tool Calls:
  human_assistance (call_0_2a2e50aa-70ea-4892-9d89-e22d942ce91d)
 Call ID: call_0_2a2e50aa-70ea-4892-9d89-e22d942ce91d
  Args:
    name: LangGraph Release Date Review
    birthday: 2023-01-01
================================= Tool Message =================================
Name: human_assistance

Made a correction: {'name': 'LangGraph', 'birthday': 'Jan 17, 2024'}
================================== Ai Message ==================================

The release date of LangGraph appears to be around January 17, 2024, based on the review. Let me know if you'd like further details or confirmation!
These fields are now reflected in the state:
snapshot = graph.get_state(config)

{k: v for k, v in snapshot.values.items() if k in ("name", "birthday")}
Output:
{'name': 'LangGraph', 'birthday': 'Jan 17, 2024'}
6. Manually update the state
LangGraph provides a high degree of control over the application state. For instance, at any point (including while interrupted), you can manually override a key using graph.update_state:
graph.update_state(config, {"name": "LangGraph (library)"})
Output:
{'configurable': {'thread_id': '1',
  'checkpoint_ns': '',
  'checkpoint_id': '1f0363ee-7b8e-674c-8006-33b41fa5d353'}}
graph.update_state writes a new checkpoint, so it returns that checkpoint's config. Inspect the values in state again with graph.get_state:
snapshot = graph.get_state(config)

{k: v for k, v in snapshot.values.items() if k in ("name", "birthday")}
Output:
{'name': 'LangGraph (library)', 'birthday': 'Jan 17, 2024'}
References
https://langchain-ai.github.io/langgraph/tutorials/get-started/5-customize-state/