LLM-201: Analysis of the OpenHands–LLM Interaction Chain
1. Core Interaction Chain Architecture
2. Detailed Flow Breakdown
- Frontend interaction layer
React components issue API requests via SWR's `useSWRMutation` hook:
```typescript
// OpenHands/frontend/src/components/ChatInput.tsx
async function sendMessage(url: string, { arg }: { arg: string }) {
  return axios.post(url, { session_id: sessionId, message: arg });
}

const { trigger } = useSWRMutation('/api/chat', sendMessage);
```
- API routing layer
FastAPI handles the request and creates a session:
```python
# OpenHands/openhands/server/routes/chat.py
@app.post("/chat")
async def chat_endpoint(request: ChatRequest):
    session = AgentSessionManager.get_session(request.session_id)
    await session.start()
    await session.process_event(MessageAction(content=request.message))
    return EventStreamResponse(session.event_stream)
```
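The session lookup in the endpoint above can be sketched with a minimal in-memory registry. Note that the internals of `AgentSession` and `AgentSessionManager` below are illustrative assumptions, not OpenHands' actual implementation, which also handles persistence and lifecycle:

```python
# Minimal sketch: a session registry keyed by session_id, so repeated
# requests with the same id reuse one session object.
import asyncio

class AgentSession:
    def __init__(self, session_id: str):
        self.session_id = session_id
        self.event_stream: list[dict] = []  # stand-in for a real event stream

    async def process_event(self, event: dict) -> None:
        self.event_stream.append(event)

class AgentSessionManager:
    _sessions: dict[str, AgentSession] = {}

    @classmethod
    def get_session(cls, session_id: str) -> AgentSession:
        # Create on first access; subsequent lookups return the same object.
        if session_id not in cls._sessions:
            cls._sessions[session_id] = AgentSession(session_id)
        return cls._sessions[session_id]

session = AgentSessionManager.get_session("s1")
asyncio.run(session.process_event({"type": "message", "content": "hi"}))
```

The key design point is that the HTTP layer stays stateless: all conversational state lives in the session object looked up by `session_id`.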
- Agent control layer
The AgentController main loop processes events:
```python
# OpenHands/openhands/controller/agent_controller.py
async def _execute_step(self):
    messages = self.conversation_memory.process_events(...)
    llm_response = await self.llm.acompletion(messages)
    actions = self.agent.response_to_actions(llm_response)
    for action in actions:
        await self._handle_action(action)
```
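The step above (memory → LLM → actions → handlers) can be simulated end to end with stubbed components. Every class below is a hypothetical stand-in, not OpenHands code:

```python
# Toy simulation of one controller step.
import asyncio

class StubLLM:
    async def acompletion(self, messages):
        # Pretend the model decided to run a single file-edit tool call.
        return {"tool_calls": [{"name": "file_edit", "args": {"path": "README.md"}}]}

class StubAgent:
    def response_to_actions(self, llm_response):
        return llm_response["tool_calls"]

class Controller:
    def __init__(self):
        self.llm = StubLLM()
        self.agent = StubAgent()
        self.handled = []

    async def _handle_action(self, action):
        self.handled.append(action["name"])

    async def _execute_step(self, messages):
        llm_response = await self.llm.acompletion(messages)
        actions = self.agent.response_to_actions(llm_response)
        for action in actions:
            await self._handle_action(action)

controller = Controller()
asyncio.run(controller._execute_step([{"role": "user", "content": "edit README"}]))
```

This makes the separation of concerns visible: the controller owns the loop, the agent owns response parsing, and the LLM client owns model I/O.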
- LLM interaction layer
Multiple models are integrated through LiteLLM:
```python
# OpenHands/openhands/llm/llm.py
async def acompletion(self, messages: list[Message]) -> ModelResponse:
    return await litellm.acompletion(
        model=self.config.model_name,
        messages=convert_to_oa_messages(messages),
        tools=self.tool_schema,
    )
```
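`convert_to_oa_messages` is not shown in the snippet; a plausible minimal version flattens internal message objects into the OpenAI-style role/content dicts that LiteLLM accepts. The `Message` dataclass here is an assumption for illustration:

```python
from dataclasses import dataclass

@dataclass
class Message:
    role: str      # e.g. "system", "user", "assistant", "tool"
    content: str

def convert_to_oa_messages(messages: list[Message]) -> list[dict]:
    # LiteLLM takes the OpenAI chat format: a list of {"role", "content"} dicts.
    return [{"role": m.role, "content": m.content} for m in messages]

msgs = [
    Message("system", "You are a coding agent."),
    Message("user", "Fix the bug."),
]
oa_msgs = convert_to_oa_messages(msgs)
```

Keeping an internal message type and converting at the boundary is what lets the rest of the system stay provider-agnostic.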
- Tool execution layer
Example: the file-edit tool:
```python
# OpenHands/openhands/tools/file_edit.py
class FileEditTool(BaseTool):
    async def execute(self, params: dict) -> FileEditObservation:
        with open(params['filepath'], 'w') as f:
            f.write(params['content'])
        return FileEditObservation(content=f"Updated {params['filepath']}")
```
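A quick way to exercise a tool like this is to run it against a temporary file. The `FileEditObservation` stub and the parameter names below mirror the snippet but are simplified for a self-contained run:

```python
import asyncio
import os
import tempfile

class FileEditObservation:
    def __init__(self, content: str):
        self.content = content

class FileEditTool:
    async def execute(self, params: dict) -> FileEditObservation:
        # Overwrite the target file with the provided content.
        with open(params['filepath'], 'w') as f:
            f.write(params['content'])
        return FileEditObservation(content=f"Updated {params['filepath']}")

async def main() -> str:
    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "README.md")
        obs = await FileEditTool().execute(
            {"filepath": path, "content": "hello"}
        )
        assert open(path).read() == "hello"
        return obs.content

result = asyncio.run(main())
```

The observation object, rather than a bare return value, is what flows back into the event stream for the agent's next step.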
3. A Typical Interaction Example
- User request
The frontend sends: `POST /chat {"message": "Edit line 5 of README.md"}`
- Chain processing
- Result feedback
The frontend receives an SSE event:
```json
{
  "type": "observation",
  "data": {
    "content": "Successfully updated README.md",
    "type": "file_edit"
  }
}
```
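On the client side, raw SSE frames must be split on blank lines and stripped of their `data:` prefix before the JSON payload can be parsed. A minimal parser sketch (no real network I/O; the frame format is the standard SSE wire format):

```python
import json

def parse_sse_frame(frame: str) -> dict:
    # An SSE frame carries one or more "data:" lines; per the SSE spec,
    # multiple data lines are concatenated before parsing.
    data_lines = [
        line[len("data:"):].strip()
        for line in frame.splitlines()
        if line.startswith("data:")
    ]
    return json.loads("".join(data_lines))

frame = (
    'data: {"type": "observation", "data": '
    '{"content": "Successfully updated README.md", "type": "file_edit"}}\n\n'
)
event = parse_sse_frame(frame)
```

In practice the browser's `EventSource` API does this framing automatically; the sketch shows what happens under the hood.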
4. Key Technical Features
- Real-time event streaming: low-latency updates via Server-Sent Events
- Context management: ConversationMemory maintains up to 500 turns of conversation context
- Error recovery: the `_react_to_exception` method handles exceptions automatically
- Multi-model support: the LLM configuration supports 30+ commercial and open-source models
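The bounded context described above can be approximated with a simple ring buffer; this is an illustrative sketch, not ConversationMemory's actual code:

```python
from collections import deque

class BoundedMemory:
    def __init__(self, max_turns: int = 500):
        # deque with maxlen drops the oldest turn automatically
        # once the bound is reached.
        self.turns: deque = deque(maxlen=max_turns)

    def add(self, role: str, content: str) -> None:
        self.turns.append({"role": role, "content": content})

    def context(self) -> list:
        return list(self.turns)

# Demonstrate eviction with a small bound.
mem = BoundedMemory(max_turns=3)
for i in range(5):
    mem.add("user", f"turn {i}")
```

A production memory would also account for token budgets, not just turn counts, but the eviction principle is the same.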
5. References
- OpenHands documentation
- OpenHands on GitHub