
Failed attempt: vLLM inference on an old GPU and old CPU (Intel(R) Xeon(R) CPU E5-2643 v2)

Conclusion first: the GPU is too old to install CUDA 12.6, and the CPU is too old to support AVX2, so the whole attempt failed.
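To verify the instruction-set situation up front, a quick check (not part of the original run, just a standard /proc/cpuinfo grep) is:

grep -o -w -m1 -e avx2 -e avx512f /proc/cpuinfo || echo "no AVX2/AVX512 support"

On this E5-2643 v2 it should print the fallback message, since Ivy Bridge tops out at AVX.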

First install vLLM:

pip install vllm

This pulls in torch as a dependency.
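To confirm which torch build actually got pulled in (a generic check, assuming the install finished), print the version and its CUDA build:

python -c "import torch; print(torch.__version__, torch.version.cuda)"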

The old card's driver only supports CUDA 11.6, so try upgrading the driver:

sudo apt install nvidia-driver-580-server

The upgrade did not take.
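To see which NVIDIA driver packages ended up installed after the attempt (a generic dpkg query, not from the original log):

dpkg -l | grep -i nvidia-driver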

Still trying to upgrade the driver, check what drivers are detected:

ubuntu-drivers devices

But no drivers were listed; it only printed: ERROR:root:aplay command not found

If the ubuntu-drivers tool is not present, install it:

sudo apt install ubuntu-drivers-common

Query the available drivers again:

ubuntu-drivers devices

Still nothing useful:

ubuntu-drivers devices
ERROR:root:aplay command not found
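The aplay message itself is harmless: ubuntu-drivers uses aplay (from the alsa-utils package) to probe audio devices, so it is noise rather than the real problem. To see which driver is actually loaded and the highest CUDA version it supports, nvidia-smi is usually the more direct check (assuming an NVIDIA driver is installed at all):

nvidia-smi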

Install a torch build for CUDA 11.6:

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116

It fails: no matching version found.

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple, https://download.pytorch.org/whl/cu116
ERROR: Could not find a version that satisfies the requirement torch==1.13.1+cu116 (from versions: 2.2.0, 2.2.1, 2.2.2, 2.3.0, 2.3.1, 2.4.0, 2.4.1, 2.5.0, 2.5.1, 2.6.0, 2.7.0, 2.7.1, 2.8.0, 2.9.0)
ERROR: No matching distribution found for torch==1.13.1+cu116

Try a CUDA 11.8 build instead:

pip install torch==2.0.0 torchvision==0.15.1 torchaudio==2.0.1 --index-url https://download.pytorch.org/whl/cu118

Another error:

pip install torch==2.0.0 torchvision==0.15.1 torchaudio==2.0.1 --index-url https://download.pytorch.org/whl/cu118
Looking in indexes: https://download.pytorch.org/whl/cu118
ERROR: Could not find a version that satisfies the requirement torch==2.0.0 (from versions: 2.2.0+cu118, 2.2.1+cu118, 2.2.2+cu118, 2.3.0+cu118, 2.3.1+cu118, 2.4.0+cu118, 2.4.1+cu118, 2.5.0+cu118, 2.5.1+cu118, 2.6.0+cu118, 2.7.0+cu118, 2.7.1+cu118)
ERROR: No matching distribution found for torch==2.0.0

So only these builds exist on that index: 2.2.0+cu118, 2.2.1+cu118, 2.2.2+cu118, 2.3.0+cu118, 2.3.1+cu118, 2.4.0+cu118, 2.4.1+cu118, 2.5.0+cu118, 2.5.1+cu118, 2.6.0+cu118, 2.7.0+cu118, 2.7.1+cu118.
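The likely reason the older builds are missing is the Python version: this environment runs Python 3.12, and PyTorch only publishes cp312 wheels from 2.2 onward, so no 1.13.x or 2.0.x build will ever match. The available versions can also be listed directly (pip's index subcommand is still marked experimental):

pip index versions torch --index-url https://download.pytorch.org/whl/cu118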

Try the 2.7.x build:

pip install torch==2.7.0 torchvision==0.22 torchaudio==2.7.1 --index-url https://download.pytorch.org/whl/cu118

After some back and forth, the final command was:

pip install torch==2.7.0 torchvision==0.22 torchaudio==2.7.0 --index-url https://download.pytorch.org/whl/cu118

Still the same error.

Try upgrading CUDA and cuDNN:

sudo apt upgrade nvidia-cuda-dev nvidia-cudnn

That ran into this error:

sudo apt upgrade  nvidia-cuda-dev nvidia-cudnn
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
You might want to run 'apt --fix-broken install' to correct these.
The following packages have unmet dependencies:
intel-oneapi-runtime-dpcpp-cpp : Depends: intel-oneapi-runtime-compilers (>= 2025.3.0-639) but 2025.0.4-1519 is installed
intel-oneapi-runtime-opencl : Depends: intel-oneapi-runtime-compilers (>= 2025.3.0-639) but 2025.0.4-1519 is installed
E: Unmet dependencies. Try 'apt --fix-broken install' with no packages (or specify a solution).

Run the fix that apt suggests:

sudo apt --fix-broken install

Next, rebuild the CPU version of vLLM from source.

Download the source:

git clone https://gitcode.com/GitHub_Trending/vl/vllm vllm_source

Install the build requirements:

pip install --upgrade pip
pip install -v -r requirements/cpu-build.txt --extra-index-url https://download.pytorch.org/whl/cpu
pip install -v -r requirements/cpu.txt --extra-index-url https://download.pytorch.org/whl/cpu

Build the CPU backend:

VLLM_TARGET_DEVICE=cpu python setup.py install

It errors out:

running build_ext
-- The CXX compiler identification is GNU 13.3.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Build type: RelWithDebInfo
-- Target device: cpu
-- Found Python: /home/skywalk/py312/bin/python (found version "3.12.3") found components: Interpreter Development.Module Development.SABIModule
-- Found python matching: /home/skywalk/py312/bin/python.
CMake Warning at /home/skywalk/py312/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:22 (message):
  static library kineto_LIBRARY-NOTFOUND not found.
Call Stack (most recent call first):
  /home/skywalk/py312/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:125 (append_torchlib_if_found)
  CMakeLists.txt:84 (find_package)

-- Found Torch: /home/skywalk/py312/lib/python3.12/site-packages/torch/lib/libtorch.so
CMake Error at cmake/cpu_extension.cmake:188 (message):
  vLLM CPU backend requires AVX512, AVX2, Power9+ ISA, S390X ISA, ARMv8 or
  RISC-V support.
Call Stack (most recent call first):
  CMakeLists.txt:104 (include)

-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
  File "/home/skywalk/github/vllm_source/setup.py", line 706, in <module>
    setup(
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/__init__.py", line 117, in setup
    return distutils.core.setup(**attrs)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 186, in setup
    return run_commands(dist)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 202, in run_commands
    dist.run_commands()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1002, in run_commands
    self.run_command(cmd)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/install.py", line 109, in run
    self.do_egg_install()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/install.py", line 167, in do_egg_install
    self.run_command('bdist_egg')
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 357, in run_command
    self.distribution.run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/bdist_egg.py", line 177, in run
    cmd = self.call_command('install_lib', warn_dir=False)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/bdist_egg.py", line 163, in call_command
    self.run_command(cmdname)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 357, in run_command
    self.distribution.run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/install_lib.py", line 19, in run
    self.build()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/command/install_lib.py", line 113, in build
    self.run_command('build_ext')
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 357, in run_command
    self.distribution.run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/github/vllm_source/setup.py", line 282, in run
    super().run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/build_ext.py", line 99, in run
    _build_ext.run(self)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/command/build_ext.py", line 368, in run
    self.build_extensions()
  File "/home/skywalk/github/vllm_source/setup.py", line 239, in build_extensions
    self.configure(ext)
  File "/home/skywalk/github/vllm_source/setup.py", line 216, in configure
    subprocess.check_call(
  File "/usr/lib/python3.12/subprocess.py", line 413, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '/home/skywalk/github/vllm_source', '-G', 'Ninja', '-DCMAKE_BUILD_TYPE=RelWithDebInfo', '-DVLLM_TARGET_DEVICE=cpu', '-DVLLM_PYTHON_EXECUTABLE=/home/skywalk/py312/bin/python', '-DVLLM_PYTHON_PATH=/home/skywalk/github/vllm_source:/usr/lib/python312.zip:/usr/lib/python3.12:/usr/lib/python3.12/lib-dynload:/home/skywalk/py312/lib/python3.12/site-packages:/home/skywalk/github/exo:/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_vendor', '-DFETCHCONTENT_BASE_DIR=/home/skywalk/github/vllm_source/.deps', '-DCMAKE_JOB_POOL_COMPILE:STRING=compile', '-DCMAKE_JOB_POOLS:STRING=compile=24']' returned non-zero exit status 1.

No way around it: the CPU is the blocker. The Intel(R) Xeon(R) CPU E5-2643 v2 @ 3.50GHz is simply too old.

Try the prebuilt CPU package instead:

pip install vllm --no-build-isolation

No use, this one fails as well:

vllm -V
[W1030 18:34:21.393673118 OperatorEntry.cpp:218] Warning: Warning only once for all operators,  other operators may also be overridden.
Overriding a previously registered kernel for the same operator and the same dispatch key
operator: aten::_addmm_activation(Tensor self, Tensor mat1, Tensor mat2, *, Scalar beta=1, Scalar alpha=1, bool use_gelu=False) -> Tensor
registered at /pytorch/build/aten/src/ATen/RegisterSchema.cpp:6
dispatch key: AutocastCPU
previous kernel: registered at /pytorch/aten/src/ATen/autocast_mode.cpp:327
new kernel: registered at /opt/workspace/ipex-cpu-dev/csrc/cpu/autocast/autocast_mode.cpp:112 (function operator())
ERROR! Intel® Extension for PyTorch* only works on machines with instruction sets equal or newer than AVX2, which are not detected on the current machine.
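torch itself reports which instruction set it detected; on PyTorch 2.x the following one-liner (an illustrative check, not from the original post) should return DEFAULT on this CPU rather than AVX2 or AVX512:

python -c "import torch; print(torch.backends.cpu.get_cpu_capability())"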

Giving up on this machine!

