
The simplest chatbot from the official LangGraph tutorial

1. Install dependencies

Make sure the following libraries are installed:

pip install langchain langgraph langchain-community

(Note: the ChatTongyi wrapper used below also depends on the dashscope SDK: pip install dashscope.)

2. Define the conversation state

In LangGraph, the state is the data structure passed between nodes.
Here we use messages to store the conversation history, and Annotated + add_messages to define the merge rule.

from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph.message import add_messages

class State(TypedDict):
    messages: Annotated[list, add_messages]

add_messages tells LangGraph that when multiple nodes write to messages, the new messages should be appended rather than overwriting the old ones. This is what preserves the full conversation history.
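The reducer idea can be seen without LangGraph at all. The sketch below uses operator.add on plain lists as a stand-in for add_messages (which does the same appending for message objects, plus ID handling):

```python
import operator
from typing import Annotated, TypedDict

# Annotated carries the reducer alongside the type; operator.add on plain
# lists is a stand-in for add_messages, which appends message objects.
class DemoState(TypedDict):
    messages: Annotated[list, operator.add]

# Conceptually, LangGraph pulls the reducer out of the annotation and uses
# it to merge each node's update into the existing state:
reducer = DemoState.__annotations__["messages"].__metadata__[0]
history = reducer(["hi"], ["hello!"])
print(history)  # → ['hi', 'hello!']
```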

3. Define the chat node

Our core node simply calls the LLM (Tongyi Qianwen):

You can swap in any other large language model here.

from langchain_community.chat_models import ChatTongyi

llm = ChatTongyi(model="qwen-plus", api_key="sk-xxxxxx")

def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}

In this node:

  • Input: state["messages"] (the conversation history)

  • Output: a new message generated by the LLM

  • LangGraph automatically merges it into messages.
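The node contract — take the state, return only the keys to update — can be exercised without an API key by swapping in a stub model (FakeLLM is a hypothetical stand-in for illustration, not a LangChain class):

```python
# A stub standing in for the real LLM client: it just echoes the
# last user message instead of calling a remote model.
class FakeLLM:
    def invoke(self, messages):
        return {"role": "assistant", "content": f"echo: {messages[-1]['content']}"}

llm = FakeLLM()

def chatbot(state):
    # A node returns only the keys it wants updated; the add_messages
    # reducer appends this to the existing history.
    return {"messages": [llm.invoke(state["messages"])]}

update = chatbot({"messages": [{"role": "user", "content": "hi"}]})
print(update["messages"][0]["content"])  # → echo: hi
```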

4. Build the graph

The core of LangGraph is nodes and edges:

from langgraph.graph import StateGraph, START, END

graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)
graph = graph_builder.compile()

What this code does:

  • There is a single node named chatbot

  • The flow runs START → chatbot → END

  • graph_builder.compile() yields a runnable graph

You can inspect it as ASCII art:

print(graph.get_graph().draw_ascii())

Output (simplified):

  START --> chatbot --> END

5. Run it: streaming output

We want the interaction to feel like a real chat: after the user types, the model responds step by step instead of returning everything at once.

This is where graph.stream comes in.

def stream_graph_updates(user_input: str):
    for event in graph.stream({"messages": [{"role": "user", "content": user_input}]}):
        for value in event.values():
            msg = value["messages"][-1].content
            print(f"🤖 Assistant: {msg}")

Each event is emitted as a node finishes executing; we take the last message from its update and print it.
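The unpacking above can be seen on synthetic data. The events below are hand-written stand-ins for what graph.stream yields (node name → state update); in a real run the entries are message objects with a .content attribute rather than plain dicts:

```python
# Each streamed event maps a node name to that node's state update.
events = [
    {"chatbot": {"messages": [{"role": "assistant", "content": "Hi there!"}]}},
]

# Same nested loop as in stream_graph_updates: outer over events,
# inner over each node's update, then take the last message.
for event in events:
    for value in event.values():
        msg = value["messages"][-1]["content"]
        print(f"🤖 Assistant: {msg}")  # → 🤖 Assistant: Hi there!
```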

6. Main loop

Finally, a simple REPL:

while True:
    try:
        user_input = input("🧑 User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
        stream_graph_updates(user_input)
    except Exception:
        # fallback: run once automatically in environments without input()
        user_input = "What do you know about LangGraph?"
        print(f"User: {user_input}")
        stream_graph_updates(user_input)
        break

Sample run:

🧑 User: how are you
🤖 Assistant: Hi there! While I don't experience feelings in the way humans do, I'm always excited to engage in conversations and help out! ٩(◕‿◕。)۶ I love learning new things through our chats. What's on your mind today?
🧑 User: I am harry , who are you
🤖 Assistant: Hello Harry! I'm Qwen, a large-scale language model developed by Tongyi Lab. It's nice to meet you. How can I assist you today?
🧑 User: quit
Goodbye!

More examples

1. Calling a tool

import os
from typing import Annotated
from typing_extensions import TypedDict

from langchain_community.chat_models import ChatTongyi
from langchain_tavily import TavilySearch
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition

llm = ChatTongyi(model="qwen-plus", api_key="YOUR_API_KEY")
os.environ["TAVILY_API_KEY"] = "YOUR_TAVILY_KEY"

class State(TypedDict):
    messages: Annotated[list, add_messages]

graph_builder = StateGraph(State)

tool = TavilySearch(max_results=2)
tools = [tool]
llm_with_tools = llm.bind_tools(tools)

def chatbot(state: State):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

graph_builder.add_node("chatbot", chatbot)

tool_node = ToolNode(tools=[tool])
graph_builder.add_node("tools", tool_node)

graph_builder.add_conditional_edges("chatbot", tools_condition)
# Any time a tool is called, we return to the chatbot to decide the next step
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge(START, "chatbot")
graph = graph_builder.compile()

# Print the graph structure
print(graph.get_graph().draw_ascii())

def stream_graph_updates(user_input: str):
    last_msg = None
    for event in graph.stream({"messages": [{"role": "user", "content": user_input}]}):
        for value in event.values():
            last_msg = value["messages"][-1].content
    if last_msg:
        print(f"🤖 Assistant: {last_msg}")

while True:
    try:
        user_input = input("🧑 User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
        stream_graph_updates(user_input)
    except Exception:
        # fallback if input() is not available
        user_input = "What do you know about LangGraph?"
        print("User: " + user_input)
        stream_graph_updates(user_input)
        break
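tools_condition decides whether to route to the tools node. Its decision rule is roughly the following (a simplified dict-based stand-in, not the library source — the real function inspects AIMessage.tool_calls and returns LangGraph's END sentinel):

```python
# Simplified stand-in for tools_condition: route to "tools" if the last
# assistant message requested a tool call, otherwise end the run.
def route(state: dict) -> str:
    last = state["messages"][-1]
    return "tools" if last.get("tool_calls") else "__end__"

print(route({"messages": [{"role": "assistant", "content": "done"}]}))
# → __end__
print(route({"messages": [{"role": "assistant", "tool_calls": [{"name": "tavily_search"}]}]}))
# → tools
```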

Sample run:

🧑 User: what function do you have 
🤖 Assistant: I have access to a function called `tavily_search`, which is a powerful search engine tool. It allows me to perform detailed searches with various filters and parameters, such as:
- Searching for information on specific topics or queries.
- Filtering results by date (specific ranges, the past day, week, month, or year).
- Including or excluding specific domains.
- Fetching images or favicon icons along with results.
- Adjusting the depth of the search (basic or advanced).
Let me know what you'd like to explore, and I can use this function to find the information you need!
🧑 User: what is the max version of langgraph
🤖 Assistant: The latest version of LangGraph.js is **v0.3.0**, which includes features like node/task caching, deferred nodes, and the `isInterrupted()` method to check if the state contains an interrupt.
If you're referring to **LangGraph Server**, there is a standalone version (Lite) that supports up to **1 million nodes executed per year**.
For more details, you can check:
- [LangGraph.js Version History](https://langchain-ai.github.io/langgraphjs/versions/)
- [LangGraph Server (Lite)](https://forum.langchain.com/t/langgraph-standalone-container-1-million-nodes-execution-limit/484)
🧑 User: quit
Goodbye!

2. Adding memory

from typing import Annotated
from typing_extensions import TypedDict

from langchain_community.chat_models import ChatTongyi
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.checkpoint.memory import InMemorySaver
from rich.console import Console

llm = ChatTongyi(model="qwen-plus", api_key="YOUR_API_KEY")

class State(TypedDict):
    messages: Annotated[list, add_messages]

def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}

graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)

memory = InMemorySaver()
graph = graph_builder.compile(checkpointer=memory)

# rich console for colored output
console = Console()

# All turns that share this thread_id share the same checkpointed history
config = {"configurable": {"thread_id": "1"}}

def stream_graph_updates(user_input: str):
    for event in graph.stream(
        {"messages": [{"role": "user", "content": user_input}]},
        config,
        # stream_mode="values",
    ):
        for value in event.values():
            msg = value["messages"][-1].content
            print(f"🤖 Assistant: {msg}")

while True:
    try:
        user_input = input("🧑 User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            console.print("[red]Goodbye![/red]")
            break
        stream_graph_updates(user_input)
    except Exception:
        # fallback: run once automatically in environments without input()
        user_input = "What do you know about LangGraph?"
        console.print(f"User: {user_input}")
        stream_graph_updates(user_input)
        break
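The thread_id in config acts as a key into the checkpointer's store of conversation histories. A toy, LangGraph-free version of that bookkeeping (the names here are illustrative only, not the checkpointer API):

```python
# Toy checkpointer: maps thread_id -> accumulated message list.
checkpoints: dict[str, list] = {}

def run_turn(thread_id: str, user_msg: str) -> list:
    # Load (or create) the history for this thread, then append the new turn.
    history = checkpoints.setdefault(thread_id, [])
    history.append({"role": "user", "content": user_msg})
    history.append({"role": "assistant", "content": f"you said: {user_msg}"})
    return history

run_turn("1", "I am Harry")
run_turn("1", "who am I?")
print(len(checkpoints["1"]))           # → 4  (two turns accumulated)
print(len(checkpoints.get("2", [])))   # → 0  (a new thread starts empty)
```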

3. Human in the loop

import os
from typing import Annotated
from typing_extensions import TypedDict

from langchain_community.chat_models import ChatTongyi
from langchain_core.tools import tool
from langchain_tavily import TavilySearch
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from langgraph.types import interrupt

llm = ChatTongyi(model="qwen-plus", api_key="YOUR_API_KEY")
os.environ["TAVILY_API_KEY"] = "YOUR_TAVILY_API_KEY"

@tool
def human_assistance(query: str) -> str:
    """Request assistance from a human."""
    human_response = interrupt({"query": query})
    return human_response["data"]

class State(TypedDict):
    messages: Annotated[list, add_messages]

graph_builder = StateGraph(State)

search_tool = TavilySearch(max_results=2)  # renamed to avoid shadowing the @tool decorator
tools = [search_tool, human_assistance]
llm_with_tools = llm.bind_tools(tools)

def chatbot(state: State):
    message = llm_with_tools.invoke(state["messages"])
    # Because we will be interrupting during tool execution,
    # we disable parallel tool calling to avoid repeating any
    # tool invocations when we resume.
    assert len(message.tool_calls) <= 1
    return {"messages": [message]}

graph_builder.add_node("chatbot", chatbot)

tool_node = ToolNode(tools=tools)
graph_builder.add_node("tools", tool_node)

graph_builder.add_conditional_edges("chatbot", tools_condition)
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge(START, "chatbot")

# Note: to actually resume after an interrupt, compile with a checkpointer
# (e.g. InMemorySaver) and send a Command(resume=...) on the same thread_id.
graph = graph_builder.compile()

config = {"configurable": {"thread_id": "1"}}

def stream_graph_updates(user_input: str):
    events = graph.stream(
        {"messages": [{"role": "user", "content": user_input}]},
        config,
        stream_mode="values",
    )
    for event in events:
        if "messages" in event:
            event["messages"][-1].pretty_print()

while True:
    try:
        user_input = input("🧑 User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
        stream_graph_updates(user_input)
    except Exception:
        # fallback if input() is not available
        user_input = "What do you know about LangGraph?"
        print("User: " + user_input)
        stream_graph_updates(user_input)
        break
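interrupt pauses the run and surfaces a payload to the caller, who later resumes it with a Command(resume=...). A Python generator gives a rough stdlib analogy of that pause-and-resume shape (an analogy only, not the LangGraph API):

```python
def tool_call(query: str):
    # 'interrupt': pause and hand the payload to the caller...
    human_response = yield {"query": query}
    # ...then resume with whatever the human supplied.
    return human_response["data"]

gen = tool_call("Is this answer correct?")
payload = next(gen)            # the run pauses; the payload surfaces to the operator
print(payload)                 # → {'query': 'Is this answer correct?'}

try:
    gen.send({"data": "Yes, approved."})   # the operator resumes the run
except StopIteration as done:
    result = done.value
print(result)                  # → Yes, approved.
```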

