DeepSeek16: open-webui Pipelines Development Pitfalls
1. Set up a virtual environment
mkdir open_webui_pipelines
cd open_webui_pipelines
python -m venv py3119_env
call py3119_env\Scripts\activate
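The commands above are for Windows (`call ...\activate`). For reference, a minimal equivalent on Linux/macOS might look like this (the `/tmp` path is only an illustration):

```shell
# Create and activate the virtual environment (Linux/macOS equivalent)
python3 -m venv /tmp/py3119_env
. /tmp/py3119_env/bin/activate
# The interpreter now resolves inside the venv
python -c "import sys; print(sys.prefix)"
```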
2. Download the service and install dependencies
git clone https://github.com/open-webui/pipelines.git
cd pipelines
pip install -r requirements.txt
3. Run a simple example
copy .\examples\pipelines\providers\ollama_pipeline.py .\pipelines\
start.bat
Output like the following means ollama_pipeline has been loaded:
INFO:root:Created subfolder: ./pipelines\ollama_pipeline
INFO:root:Created valves.json in: ./pipelines\ollama_pipeline
Loaded module: ollama_pipeline
INFO:root:Loaded module: ollama_pipeline
on_startup:ollama_pipeline
4. Modify and use the pipeline file
- Edit the file to point at your own Ollama model.
Edit the \pipelines\ollama_pipeline.py you just copied in. On lines 32 and 33, set your own Ollama server address and model name. The original code:
OLLAMA_BASE_URL = "http://localhost:11434"
MODEL = "llama3"
- Restart the service after editing.
- Settings:
Add an API connection (the URL is the pipelines service's address and port):
  - In the Open WebUI admin panel, go to Settings > Connections and enable [OpenAI API].
  - Click + to add a new connection.
  - Set the API URL to http://localhost:9099 and the API key to 0p3n-w3bu! (the default value; required).
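To sanity-check the connection settings outside the UI, one option is a small stdlib-only sketch that builds the same authenticated request the WebUI would send. The `/v1/models` path assumes the pipelines service exposes an OpenAI-compatible API on the default port, as described above:

```python
import urllib.request

API_URL = "http://localhost:9099"   # default pipelines port
API_KEY = "0p3n-w3bu!"              # default key (required)

req = urllib.request.Request(
    f"{API_URL}/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
# With the pipelines service running, uncommenting this would list the
# loaded pipelines as OpenAI-style model entries:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode("utf-8"))
auth = req.get_header("Authorization")
print(auth)
```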
- Usage:
Click [New Chat], choose [Ollama Pipeline] from the model dropdown, and start a conversation.
The pipelines service logs output in the background, showing that this pipeline is in use.
5. Improvements
1. Edit the \pipelines\ollama_pipeline.py file to:
- make the Ollama server and model configurable from the page
- fix a few non-idiomatic Python spots
- replace print with logger output
import logging
import os
from typing import List, Union, Generator, Iterator

from pydantic import BaseModel, Field
import requests


class Pipeline:
    class Valves(BaseModel):
        OLLAMA_BASE_URL: str = Field(default="http://localhost:11434", description="ollama base url")
        OLLAMA_DEFAULT_MODEL: str = Field(default="llama3", description="ollama default model name")

    def __init__(self):
        # self.id = "ollama_pipeline"
        self.name = "Ollama Pipeline"
        self.valves = self.Valves(
            **{k: os.getenv(k, v.default) for k, v in self.Valves.model_fields.items()}
        )
        self.log = logging.getLogger(__name__)

    async def on_startup(self):
        self.log.info(f"on_startup:{__name__}")

    async def on_shutdown(self):
        self.log.info(f"on_shutdown:{__name__}")

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # This is where you can add your custom pipelines like RAG.
        self.log.info(f"pipe:{__name__}, model_id:{model_id}, messages:{messages}")

        ollama_base_url = self.valves.OLLAMA_BASE_URL
        model = self.valves.OLLAMA_DEFAULT_MODEL

        if "user" in body:
            self.log.info("######################################")
            self.log.info(f'# User: {body["user"]["name"]} ({body["user"]["id"]})')
            self.log.info(f"# Message: {user_message}")
            self.log.info("######################################")

        try:
            r = requests.post(
                url=f"{ollama_base_url}/v1/chat/completions",
                json={**body, "model": model},
                stream=True,
            )
            r.raise_for_status()

            if body["stream"]:
                return r.iter_lines()
            else:
                return r.json()
        except Exception as e:
            return f"Error: {e}"
- Delete the pipelines\ollama_pipeline directory
- Restart the service
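The Valves pattern above reads each field's default but lets an environment variable of the same name override it. A stdlib-only sketch of that idea (the real pipeline uses pydantic's BaseModel/Field; the dataclass and the `qwen2:7b` value here are only illustrations):

```python
import os
from dataclasses import dataclass, fields

# Stdlib stand-in for the pydantic Valves model above
@dataclass
class Valves:
    OLLAMA_BASE_URL: str = "http://localhost:11434"
    OLLAMA_DEFAULT_MODEL: str = "llama3"

# Environment variables override the defaults, field by field
os.environ["OLLAMA_DEFAULT_MODEL"] = "qwen2:7b"
valves = Valves(**{f.name: os.getenv(f.name, f.default) for f in fields(Valves)})
print(valves.OLLAMA_BASE_URL, valves.OLLAMA_DEFAULT_MODEL)
```

This is why the pipeline can be reconfigured from the page or from the environment without editing the source again.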
6. Pitfall: if a pipelines file contains Chinese characters, the file encoding can cause the pipeline to fail to load.
- With the file saved as utf-8, the error is:
Error loading module: ollama_pipeline
'gbk' codec can't decode byte 0xaf in position 781: illegal multibyte sequence
WARNING:root:No Pipeline class found in ollama_pipeline
- With the file saved as gbk, the error is:
Error loading module: ollama_pipeline
(unicode error) 'utf-8' codec can't decode byte 0xc6 in position 0: invalid continuation byte (ollama_pipeline.py, line 28)
WARNING:root:No Pipeline class found in ollama_pipeline
- Fix:
Edit pipelines\main.py and add encoding="utf-8" to every text-mode open() call:
137: with open(module_path, "r", encoding="utf-8") as file
193: with open(valves_json_path, "w", encoding="utf-8") as f
201: with open(valves_json_path, "r", encoding="utf-8") as f
580: with open(valves_json_path, "w", encoding="utf-8") as f
- Then save the pipeline file as utf-8
- Delete the pipelines\ollama_pipeline directory
- Restart the service
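To see why the explicit encoding matters: without `encoding=...`, `open()` falls back to the locale's preferred encoding (gbk on Chinese Windows), which is exactly what broke the module loader. A minimal sketch with a hypothetical throwaway file:

```python
import os
import tempfile

# A pipeline file containing Chinese characters, written explicitly as UTF-8
path = os.path.join(tempfile.mkdtemp(), "ollama_pipeline.py")
with open(path, "w", encoding="utf-8") as f:
    f.write('NAME = "Ollama 管道"\n')

# Reading it back with an explicit encoding is deterministic on every
# platform, regardless of the system locale:
with open(path, "r", encoding="utf-8") as f:
    src = f.read()
print(src.strip())
```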