
Contents

  • 1. Introduction
  • 2. Initializing the LLM
  • 3. Building an Augmented LLM
  • 4. Prompt Chaining
  • 5. Parallelization
  • 6. Routing
  • 7. Orchestrator-Worker
    • 7.1 Building the Orchestrator
    • 7.2 Building the Workers
  • 8. Evaluator-Optimizer Workflow
  • References

1. Introduction

  The earlier posts in this series were originally part of the official QuickStart, but they have since been moved into the LangGraph basics section and split into six articles (which honestly wasn't necessary):
  LangGraph (1): Building a Chatbot
  LangGraph (2): Adding Tools
  LangGraph (3): Adding Memory
  LangGraph (4): Adding Human-in-the-Loop Controls
  LangGraph (5): Customizing State
  LangGraph (6): Time Travel

2. Initializing the LLM

from langchain.chat_models import init_chat_model

llm = init_chat_model("deepseek:deepseek-chat")
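  The model string "deepseek:deepseek-chat" assumes the DeepSeek integration is installed (likely pip install -U langchain langchain-deepseek langgraph) and that an API key is configured. A minimal setup sketch follows; the environment-variable name and placeholder key are my assumptions, not part of the original post:

import os

# Assumption: the DeepSeek chat model reads its credentials from this variable.
os.environ.setdefault("DEEPSEEK_API_KEY", "sk-...")  # placeholder, use a real key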

3. Building an Augmented LLM

  LLMs have augmentations that support building workflows and agents, including structured output and tool calling.

  For example:

from pydantic import BaseModel, Field

# Structured output: constrain the model's reply to a Pydantic schema
class SearchQuery(BaseModel):
    search_query: str = Field(None, description="Query that is optimized web search.")
    justification: str = Field(None, description="Why this query is relevant to the user's request.")

structured_llm = llm.with_structured_output(SearchQuery)
output = structured_llm.invoke("How does Calcium CT score relate to high cholesterol?")
print(output)

# Tool calling: let the model request a call to a Python function
def multiply(a: int, b: int) -> int:
    return a * b

llm_with_tools = llm.bind_tools([multiply])
msg = llm_with_tools.invoke("What is 2 times 3?")
print(msg.tool_calls)

  The output is:

search_query='Calcium CT score relationship with high cholesterol' justification='To find information on how a Calcium CT score (a measure of coronary artery calcium) is related to high cholesterol levels, which is a risk factor for cardiovascular disease.'
[{'name': 'multiply', 'args': {'a': 2, 'b': 3}, 'id': 'call_0_c8ab0424-0b42-412e-93e9-f71dabcb54d8', 'type': 'tool_call'}]
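  The example above only prints msg.tool_calls; the model has requested a call but nothing has executed it yet. A minimal sketch of closing the loop, not in the original post: run the requested tool yourself and pass the result back as a ToolMessage so the model can produce a final answer.

from langchain_core.messages import HumanMessage, ToolMessage

# Sketch: execute the requested tool call and feed the result back to the model.
messages = [HumanMessage(content="What is 2 times 3?")]
ai_msg = llm_with_tools.invoke(messages)
messages.append(ai_msg)
for call in ai_msg.tool_calls:
    if call["name"] == "multiply":
        result = multiply(**call["args"])
        messages.append(ToolMessage(content=str(result), tool_call_id=call["id"]))
print(llm_with_tools.invoke(messages).content)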

4. Prompt Chaining

  In a prompt chain, each LLM call processes the output of the previous call.

  The code to build this workflow:

from typing import TypedDict
from langgraph.graph import StateGraph, START, END
from IPython.display import Image, display

class State(TypedDict):
    topic: str
    joke: str
    improved_joke: str
    final_joke: str

def generate_joke(state: State):
    msg = llm.invoke(f"Write a short joke about {state['topic']}")
    return {"joke": msg.content}

def check_punchline(state: State):
    # Gate: does the joke contain "?" or "!"
    if "?" in state["joke"] or "!" in state["joke"]:
        return "Pass"
    return "Fail"

def improve_joke(state: State):
    msg = llm.invoke(f"Make this joke funnier by adding wordplay: {state['joke']}")
    return {"improved_joke": msg.content}

def polish_joke(state: State):
    msg = llm.invoke(f"Add a surprising twist to this joke: {state['improved_joke']}")
    return {"final_joke": msg.content}

workflow = StateGraph(State)
workflow.add_node("generate_joke", generate_joke)
workflow.add_node("improve_joke", improve_joke)
workflow.add_node("polish_joke", polish_joke)

workflow.add_edge(START, "generate_joke")
workflow.add_conditional_edges(
    "generate_joke", check_punchline, {"Fail": "improve_joke", "Pass": END}
)
workflow.add_edge("improve_joke", "polish_joke")
workflow.add_edge("polish_joke", END)

chain = workflow.compile()
display(Image(chain.get_graph().draw_mermaid_png()))

state = chain.invoke({"topic": "cats"})
print("Initial joke:")
print(state["joke"])
print("\n--- --- ---\n")
if "improved_joke" in state:
    print("Improved joke:")
    print(state["improved_joke"])
    print("\n--- --- ---\n")
    print("Final joke:")
    print(state["final_joke"])
else:
    print("Joke failed quality gate - no punchline detected!")

  The output is:

(workflow graph diagram)

Initial joke:
Sure! Here's a purr-fectly short cat joke for you:  **Why don’t cats play poker in the wild?**  
*Because there are too many cheetahs!* 🐱♠️  Hope that gives you a *meow*-ment of laughter! 😸

--- --- ---

Joke failed quality gate - no punchline detected!
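  To watch each node's update as the chain runs, instead of only inspecting the final state, the compiled graph can also be streamed. A small sketch, not in the original post:

# Sketch: stream per-node state updates from the compiled prompt chain.
for step in chain.stream({"topic": "cats"}, stream_mode="updates"):
    print(step)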

5. Parallelization

  With parallelization, multiple LLM calls work on a task simultaneously, and their outputs are then aggregated.

  The code to build this workflow:

class State(TypedDict):
    topic: str
    joke: str
    story: str
    poem: str
    combined_output: str

def generate_joke(state: State):
    msg = llm.invoke(f"Write a joke about {state['topic']}")
    return {"joke": msg.content}

def generate_story(state: State):
    msg = llm.invoke(f"Write a story about {state['topic']}")
    return {"story": msg.content}

def generate_poem(state: State):
    msg = llm.invoke(f"Write a poem about {state['topic']}")
    return {"poem": msg.content}

def aggregator(state: State):
    combined = f"Here's a story, joke, and poem about {state['topic']}!\n\n"
    combined += f"Story:\n{state['story']}\n\n"
    combined += f"Joke:\n{state['joke']}\n\n"
    combined += f"Poem:\n{state['poem']}"
    return {"combined_output": combined}

parallel_builder = StateGraph(State)
parallel_builder.add_node("generate_story", generate_story)
parallel_builder.add_node("generate_joke", generate_joke)
parallel_builder.add_node("generate_poem", generate_poem)
parallel_builder.add_node("aggregator", aggregator)

# Fan out from START to the three generators, then fan in to the aggregator
parallel_builder.add_edge(START, "generate_story")
parallel_builder.add_edge(START, "generate_joke")
parallel_builder.add_edge(START, "generate_poem")
parallel_builder.add_edge("generate_story", "aggregator")
parallel_builder.add_edge("generate_joke", "aggregator")
parallel_builder.add_edge("generate_poem", "aggregator")
parallel_builder.add_edge("aggregator", END)

parallel_workflow = parallel_builder.compile()
display(Image(parallel_workflow.get_graph().draw_mermaid_png()))

state = parallel_workflow.invoke({"topic": "cats"})
print(state["combined_output"])

  The output is:

(workflow graph diagram)

Here's a story, joke, and poem about cats!

Story:
**The Secret Kingdom of the Moonlit Cats**  In the quiet town of Willowbrook, where the streets were lined with cobblestones and lanterns glowed softly at night, there existed a secret known only to the cats.  Every evening, when the moon rose high and humans drifted into dreams, the cats of Willowbrook gathered in the hidden garden behind Old Miss Hattie’s house. There, beneath the silver light, they whispered in a language only they understood—soft meows, flicking tails, and knowing glances.  Their leader was a sleek black tom named Orion, with eyes like twin embers. He had once been a house cat, but now he ruled over the Moonlit Cats, guardians of forgotten mysteries.  One night, a small, scruffy kitten named Pip stumbled upon their meeting. Wide-eyed and trembling, she had been abandoned by her humans and had nowhere to go.  “Who are you?” Orion demanded, his tail flicking.  “I-I’m Pip,” she squeaked. “I saw the lights… and heard the whispers.”  The other cats murmured, some hissing in suspicion. But an elderly tabby named Mistral stepped forward. “She is alone, Orion. The code of the Moonlit Cats says we protect our own.”  Orion studied Pip, then gave a slow nod. “Very well. But you must prove yourself. Tonight, we hunt the Shadow Mice—ghosts of the garden that steal our secrets.”  Pip’s heart pounded, but she lifted her chin. “I’ll do it.”  With Orion leading, the cats slipped through the garden, their paws silent on the dewy grass. The Shadow Mice were swift, darting between the flowers like smoke. Pip, though small, was quick. She pounced, her tiny claws snagging one by the tail. The mouse dissolved into mist, leaving behind a shimmering silver acorn—a lost memory.  Orion purred in approval. “You have the heart of a Moonlit Cat.”  From that night on, Pip trained with the others—learning to read the stars, to listen to the wind’s secrets, and to guard Willowbrook from unseen dangers. And though humans passed by the garden without a second glance, the cats knew the truth: they were the keepers of magic, the silent watchers of the night.  And so, beneath every full moon, if you listened very closely, you might hear the faintest chorus of purrs—a song of belonging, adventure, and the endless mysteries of the Moonlit Cats.  **The End.** 🐾🌙Joke:
Sure! Here's a purr-fect cat joke for you:  **Why don’t cats play poker in the wild?**  
*Because there are too many cheetahs!* 😹  (Get it? Cheetahs sound like "cheaters," and they're wild cats... okay, I'll see myself out.)

Poem:
**Whiskers and Grace**  Oh, little hunter, sleek and sly,  
With golden eyes that pierce the sky.  
You stretch and yawn, then pounce with glee—  
A storm of paws, so wild, so free.  Your velvet ears twitch at the sound  
Of rustling leaves or mice abound.  
You arch your back, your tail stands high—  
A regal pose, a king’s proud eye.  At night you prowl in shadows deep,  
While mortals slumber, lost in sleep.  
Yet come the dawn, you softly creep  
To curl beside me, warm and sweet.  Oh, creature wrapped in mystery,  
Both fierce and fragile, light as breeze—  
You rule my heart with quiet art,  
A tiny lion, soft of heart.  So purr and play, my feline friend,  
On love and sunbeams we depend.  
For in your gaze, the world is right—  
A flick of tail, a blink of light.
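  Note that this fan-out works cleanly because each parallel branch writes to a different state key. If several branches wrote to the same key, the state would need a reducer so LangGraph knows how to merge the concurrent updates. A sketch of that variant with hypothetical node names, not from the original post:

from typing import Annotated, TypedDict
import operator

# Sketch: concurrent writes to "drafts" are merged by the operator.add reducer.
class FanInState(TypedDict):
    topic: str
    drafts: Annotated[list[str], operator.add]

def branch_a(state: FanInState):
    return {"drafts": [f"Draft A about {state['topic']}"]}

def branch_b(state: FanInState):
    return {"drafts": [f"Draft B about {state['topic']}"]}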

6. Routing

  Routing classifies an input and directs it to a follow-up task.

  The code to build this workflow:

print(state["combined_output"])
#%%
from typing import Literal
from langchain_core.messages import HumanMessage, SystemMessageclass Route(BaseModel):step: Literal["poem", "story", "joke"] = Field(None, description="The next step in the routing process")router_llm = llm.with_structured_output(Route)class State(TypedDict):input: strdecision: stroutput: strdef generate_story(state: State):result = llm.invoke(state["input"])return {"output": result.content}def generate_joke(state: State):result = llm.invoke(state["input"])return {"output": result.content}def generate_poem(state: State):result = llm.invoke(state["input"])return {"output": result.content}def router(state: State):decision = router_llm.invoke([SystemMessage(content="Route the input to story, joke, or poem based on the user's request."),HumanMessage(content=state["input"])])return {"decision": decision.step}def route_decision(state: State):match state["decision"]:case "poem":return "generate_poem"case "story":return "generate_story"case "joke":return "generate_joke"return Nonerouter_builder = StateGraph(State)router_builder.add_node("generate_story", generate_story)
router_builder.add_node("generate_joke", generate_joke)
router_builder.add_node("generate_poem", generate_poem)
router_builder.add_node("router", router)router_builder.add_edge(START, "router")
router_builder.add_conditional_edges("router",route_decision,{"generate_story": "generate_story","generate_joke": "generate_joke","generate_poem": "generate_poem"}
)
router_builder.add_edge("generate_story", END)
router_builder.add_edge("generate_joke", END)
router_builder.add_edge("generate_poem", END)router_workflow = router_builder.compile()display(Image(router_workflow.get_graph().draw_mermaid_png()))state = router_workflow.invoke({"input": "Write me a joke about cats"})
print(state["output"])

  The output is:

(workflow graph diagram)

Sure! Here's a purr-fect cat joke for you:  **Why don’t cats play poker in the jungle?**  
*Because there are too many cheetahs!* 🐆😹  Let me know if you want more—I've got a *litter* of them! 😸

7. Orchestrator-Worker

  In the orchestrator-worker pattern, an orchestrator breaks the task down and delegates each sub-task to a worker.

7.1 Building the Orchestrator

  Build the orchestrator:

from typing import Annotated
import operator

class Section(BaseModel):
    name: str = Field(description="Name of this section of the report")
    description: str = Field(description="Brief overview of the main topics and concepts to be covered in this section.")

class Sections(BaseModel):
    sections: list[Section] = Field(description="Sections of the report.")

planner = llm.with_structured_output(Sections)
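  The planner is an ordinary runnable, so it can be tried on its own before being wired into a graph. A quick sketch, with an illustrative topic string that is not from the original post:

# Sketch: the structured-output planner returns a Sections instance.
plan = planner.invoke("Here is the report topic: LLM scaling laws")
for s in plan.sections:
    print(s.name, "-", s.description)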

7.2 Building the Workers

  Because orchestrator-worker workflows are so common, LangGraph provides the Send API to support them. It lets you dynamically create worker nodes and send each one a specific input. Each worker has its own state, and all worker outputs are written to a shared state key that the orchestrator graph can access. This gives the orchestrator access to every worker's output so it can synthesize them into the final result. The code below Sends each section in the sections list to a worker node.

from langgraph.constants import Send
from pathlib import Path

class State(TypedDict):
    topic: str
    sections: list[Section]
    completed_sections: Annotated[list, operator.add]  # workers append here in parallel
    final_report: str

class WorkerState(TypedDict):
    section: Section
    completed_sections: Annotated[list, operator.add]

def orchestrator(state: State):
    report_sections = planner.invoke([
        SystemMessage(content="Generate a plan for the report."),
        HumanMessage(content=f"Here is the report topic: {state['topic']}"),
    ])
    return {"sections": report_sections.sections}

def worker(state: WorkerState):
    section = llm.invoke([
        SystemMessage(content="Write a report section following the provided name and description. Include no preamble for each section. Use markdown formatting."),
        HumanMessage(content=f"Here is the section name: {state['section'].name} and description: {state['section'].description}"),
    ])
    return {"completed_sections": [section.content]}

def synthesizer(state: State):
    completed_sections = state["completed_sections"]
    completed_report_sections = "\n\n---\n\n".join(completed_sections)
    return {"final_report": completed_report_sections}

def assign_workers(state: State):
    # Send one worker per planned section; each Send carries its own WorkerState
    return [Send("worker", {"section": s}) for s in state["sections"]]

orchestrator_worker_builder = StateGraph(State)
orchestrator_worker_builder.add_node("orchestrator", orchestrator)
orchestrator_worker_builder.add_node("worker", worker)
orchestrator_worker_builder.add_node("synthesizer", synthesizer)

orchestrator_worker_builder.add_edge(START, "orchestrator")
orchestrator_worker_builder.add_conditional_edges(
    "orchestrator", assign_workers, ["worker"]
)
orchestrator_worker_builder.add_edge("worker", "synthesizer")
orchestrator_worker_builder.add_edge("synthesizer", END)

orchestrator_worker = orchestrator_worker_builder.compile()
display(Image(orchestrator_worker.get_graph().draw_mermaid_png()))

state = orchestrator_worker.invoke({"topic": "Create a report on LLM scaling laws"})

with Path("report.md").open("w", encoding="utf-8") as f:
    f.write(state["final_report"])

  The output is:

(workflow graph diagram and screenshots of the generated report; the report itself is written to report.md)

8. Evaluator-Optimizer Workflow

  In the evaluator-optimizer workflow, one LLM call generates a response while another provides evaluation and feedback in a loop:

  The code to build this workflow:

class State(TypedDict):
    joke: str
    topic: str
    feedback: str
    funny_or_not: str

class Feedback(BaseModel):
    grade: Literal["funny", "not funny"] = Field(description="Decide if the joke is funny or not.")
    feedback: str = Field(description="If the joke is not funny, provide feedback on how to improve it.")

evaluator = llm.with_structured_output(Feedback)

def generate_joke(state: State):
    # On later iterations, regenerate the joke using the evaluator's feedback
    if state.get("feedback"):
        msg = llm.invoke(f"Write a joke about {state['topic']} but take into account the feedback: {state['feedback']}")
    else:
        msg = llm.invoke(f"Write a joke about {state['topic']}")
    return {"joke": msg.content}

def generate_feedback(state: State):
    grade = evaluator.invoke(f"Grade the joke {state['joke']}")
    return {"funny_or_not": grade.grade, "feedback": grade.feedback}

def route_joke(state: State):
    match state["funny_or_not"]:
        case "funny":
            return "Accepted"
        case "not funny":
            return "Rejected + Feedback"
    return None

optimizer_builder = StateGraph(State)
optimizer_builder.add_node("generate_joke", generate_joke)
optimizer_builder.add_node("generate_feedback", generate_feedback)

optimizer_builder.add_edge(START, "generate_joke")
optimizer_builder.add_edge("generate_joke", "generate_feedback")
optimizer_builder.add_conditional_edges(
    "generate_feedback",
    route_joke,
    {"Accepted": END, "Rejected + Feedback": "generate_joke"},
)

optimizer_workflow = optimizer_builder.compile()
display(Image(optimizer_workflow.get_graph().draw_mermaid_png()))

state = optimizer_workflow.invoke({"topic": "cats"})
print(state["joke"])

  The output is:

(workflow graph diagram)

Sure! Here's a purr-fect cat joke for you:  **Why don’t cats play poker in the wild?**  
*Because too many cheetahs!* 🐆😹  (Get it? Like "cheaters," but... cheetahs? Okay, I'll see myself out.) 🚪🐾
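  If the evaluator keeps returning "not funny", the generate/evaluate loop keeps iterating. LangGraph's recursion_limit can bound it; a defensive sketch, not part of the original code:

from langgraph.errors import GraphRecursionError

# Sketch: cap the loop; LangGraph raises once the step limit is exceeded.
try:
    state = optimizer_workflow.invoke({"topic": "cats"}, config={"recursion_limit": 10})
    print(state["joke"])
except GraphRecursionError:
    print("Stopped after too many revision rounds.")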

References

https://langchain-ai.github.io/langgraph/tutorials/workflows/
