Building CPU-only Caffe on Linux (Anaconda, Python 3.8)

Table of Contents

Preface

I. Environment Preparation

1. System Requirements

2. Installing Required Dependencies

II. Anaconda Environment Setup

1. Installing Anaconda

2. Creating a Dedicated Python Environment

3. Installing Required Python Packages

III. Getting the Caffe Source Code

IV. Configuring Build Options

1. Modifying Makefile.config

2. Modifying the Makefile

3. Modifying CMakeLists.txt (only if you use cmake; otherwise skip)

4. Creating a Symlink for the Python 3.8 Boost Library

V. Building Caffe

1. Building with Make

① Check that the build environment's Python version is as expected

② Check that the build environment's protoc version is as expected

③ Build caffe

2. Verifying the Dynamic Library Links

3. Common Build Errors and Fixes

① Error 1: fatal error: numpy/arrayobject.h: No such file or directory

② Error 2: undefined reference to boost::python...

③ Error 3: error: 'pybind11' is not a namespace-name

④ Error 4: error: 'class std::unordered_map' has no member named 'emplace'

⑤ Error 5: HDF5-related errors

VI. Installing the Python Interface

1. Installing the Required Python Packages

2. Building pycaffe

VII. Testing Caffe

VIII. Possible Problems and Solutions

Problem 1:

Problem 2:

Problem 3:

Problem 4:


Preface

Plenty of bloggers have written Caffe build tutorials, but following them tends to produce one error after another with no fix in sight, which is deeply frustrating. After working through at least five such articles, I finally got the CPU version of Caffe to build on Linux. I am writing the process down so I can rebuild after switching machines, and in the hope that it helps anyone in the same situation. Below I list some of the errors I hit during the build together with their fixes, and I also share the Makefile.config that worked for me. If you have questions or better solutions during your own build, feel free to leave a comment.

I. Environment Preparation

1. System Requirements

  • Ubuntu 18.04/20.04 (other Linux distributions work too, but the package-manager commands may need adjusting)

  • GCC/G++ 7.5 or later

  • Make sure the system is up to date: sudo apt update && sudo apt upgrade -y

2. Installing Required Dependencies

sudo apt-get update
sudo apt-get install -y build-essential cmake git pkg-config
sudo apt-get install -y libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev
sudo apt-get install -y libhdf5-serial-dev protobuf-compiler
sudo apt-get install -y --no-install-recommends libboost-all-dev
sudo apt-get install -y libgflags-dev libgoogle-glog-dev liblmdb-dev
sudo apt-get install -y python3-dev python3-numpy python3-pip python3-scipy

II. Anaconda Environment Setup

1. Installing Anaconda

Download and install from the Anaconda website:

wget https://repo.anaconda.com/archive/Anaconda3-2024.06-1-Linux-x86_64.sh
bash Anaconda3-2024.06-1-Linux-x86_64.sh

After following the prompts to finish the installation, activate the conda environment:

source ~/anaconda3/bin/activate

2. Creating a Dedicated Python Environment

conda create -n caffe python=3.8
conda activate caffe

3. Installing Required Python Packages

pip install protobuf==3.20.1
pip install onnx==1.6.0
pip install numpy scipy matplotlib scikit-image
conda install -y opencv

III. Getting the Caffe Source Code

git clone https://github.com/BVLC/caffe.git
cd caffe
git checkout master  # or pin a stable release, e.g. git checkout 1.0

IV. Configuring Build Options

1. Modifying Makefile.config

Copy the example config file and edit it:

cp Makefile.config.example Makefile.config

The complete modified Makefile.config:

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
# USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
USE_OPENCV := 1
# USE_LEVELDB := 0
# USE_LMDB := 0
# This code is taken from https://github.com/sh1r0/caffe-android-lib
# USE_HDF5 := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
#	You should not set this flag if you will be reading LMDBs with any
#	possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
OPENCV_VERSION := 4

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
# CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
# For CUDA >= 9.0, comment the *_20 and *_21 lines for compatibility.
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#		-gencode arch=compute_20,code=sm_21 \
#		-gencode arch=compute_30,code=sm_30 \
#		-gencode arch=compute_35,code=sm_35 \
#		-gencode arch=compute_50,code=sm_50 \
#		-gencode arch=compute_52,code=sm_52 \
#		-gencode arch=compute_60,code=sm_60 \
#		-gencode arch=compute_61,code=sm_61 \
#		-gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
#		/usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := /home/anaconda3/envs/caffe/
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
		$(ANACONDA_HOME)/include/python3.8 \
		$(ANACONDA_HOME)/lib/python3.8/site-packages/numpy/core/include
PYTHON_LIB := $(ANACONDA_HOME)/lib

# Uncomment to use Python 3 (default is Python 2)
PYTHON_LIBRARIES := boost_python38 python3.8
# PYTHON_INCLUDE := /usr/include/python3.5m \
#                 /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
# PYTHON_LIB := /usr/lib
PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := /usr/include $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial /usr/include/opencv4
LIBRARY_DIRS := /usr/lib/x86_64-linux-gnu $(ANACONDA_HOME)/lib $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
USE_PKG_CONFIG := 0

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @

# Force the system version of Protobuf
LDFLAGS += -L/usr/lib/x86_64-linux-gnu -lprotobuf

# OpenCV libraries
LIBRARIES += opencv_core opencv_highgui opencv_imgproc opencv_imgcodecs

# OpenCV 4.x removed the CV_LOAD_IMAGE_COLOR and CV_LOAD_IMAGE_GRAYSCALE macros;
# map them to cv::IMREAD_COLOR and cv::IMREAD_GRAYSCALE
COMMON_FLAGS += -DCV_LOAD_IMAGE_COLOR=cv::IMREAD_COLOR
COMMON_FLAGS += -DCV_LOAD_IMAGE_GRAYSCALE=cv::IMREAD_GRAYSCALE

LDFLAGS += -L$(ANACONDA_HOME)/lib -lpython3.8
LDFLAGS += -L/usr/lib/x86_64-linux-gnu -lopencv_core -lopencv_imgcodecs -lopencv_highgui

# Make sure the correct Boost libraries are used
LIBRARIES += boost_filesystem boost_system

Because multiple versions of the Boost and Protobuf libraries may coexist on the system (for example, copies inside the Anaconda virtual environment, or different versions you built from source), the Makefile.config above explicitly forces the system versions.
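To see which copies the dynamic linker currently knows about (a quick way to spot a conflicting Anaconda or /usr/local build before it bites), you can run something like:

```shell
# List every protobuf/boost_python registered with the dynamic linker.
# Entries outside /usr/lib/x86_64-linux-gnu are the most likely to clash.
ldconfig -p | grep -E 'libprotobuf|libboost_python' || echo "none registered"
```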

2. Modifying the Makefile

The complete modified Makefile:

PROJECT := caffe

CONFIG_FILE := Makefile.config
# Explicitly check for the config file, otherwise make -k will proceed anyway.
ifeq ($(wildcard $(CONFIG_FILE)),)
$(error $(CONFIG_FILE) not found. See $(CONFIG_FILE).example.)
endif
include $(CONFIG_FILE)

BUILD_DIR_LINK := $(BUILD_DIR)
ifeq ($(RELEASE_BUILD_DIR),)
	RELEASE_BUILD_DIR := .$(BUILD_DIR)_release
endif
ifeq ($(DEBUG_BUILD_DIR),)
	DEBUG_BUILD_DIR := .$(BUILD_DIR)_debug
endif

DEBUG ?= 0
ifeq ($(DEBUG), 1)
	BUILD_DIR := $(DEBUG_BUILD_DIR)
	OTHER_BUILD_DIR := $(RELEASE_BUILD_DIR)
else
	BUILD_DIR := $(RELEASE_BUILD_DIR)
	OTHER_BUILD_DIR := $(DEBUG_BUILD_DIR)
endif

# All of the directories containing code.
SRC_DIRS := $(shell find * -type d -exec bash -c "find {} -maxdepth 1 \( -name '*.cpp' -o -name '*.proto' \) | grep -q ." \; -print)

# The target shared library name
LIBRARY_NAME := $(PROJECT)
LIB_BUILD_DIR := $(BUILD_DIR)/lib
STATIC_NAME := $(LIB_BUILD_DIR)/lib$(LIBRARY_NAME).a
DYNAMIC_VERSION_MAJOR 		:= 1
DYNAMIC_VERSION_MINOR 		:= 0
DYNAMIC_VERSION_REVISION 	:= 0
DYNAMIC_NAME_SHORT := lib$(LIBRARY_NAME).so
#DYNAMIC_SONAME_SHORT := $(DYNAMIC_NAME_SHORT).$(DYNAMIC_VERSION_MAJOR)
DYNAMIC_VERSIONED_NAME_SHORT := $(DYNAMIC_NAME_SHORT).$(DYNAMIC_VERSION_MAJOR).$(DYNAMIC_VERSION_MINOR).$(DYNAMIC_VERSION_REVISION)
DYNAMIC_NAME := $(LIB_BUILD_DIR)/$(DYNAMIC_VERSIONED_NAME_SHORT)
COMMON_FLAGS += -DCAFFE_VERSION=$(DYNAMIC_VERSION_MAJOR).$(DYNAMIC_VERSION_MINOR).$(DYNAMIC_VERSION_REVISION)
# Add the system Protobuf header path (make sure system headers take priority)
COMMON_FLAGS += -I/usr/include

##############################
# Get all source files
##############################
# CXX_SRCS are the source files excluding the test ones.
CXX_SRCS := $(shell find src/$(PROJECT) ! -name "test_*.cpp" -name "*.cpp")
# CU_SRCS are the cuda source files
CU_SRCS := $(shell find src/$(PROJECT) ! -name "test_*.cu" -name "*.cu")
# TEST_SRCS are the test source files
TEST_MAIN_SRC := src/$(PROJECT)/test/test_caffe_main.cpp
TEST_SRCS := $(shell find src/$(PROJECT) -name "test_*.cpp")
TEST_SRCS := $(filter-out $(TEST_MAIN_SRC), $(TEST_SRCS))
TEST_CU_SRCS := $(shell find src/$(PROJECT) -name "test_*.cu")
GTEST_SRC := src/gtest/gtest-all.cpp
# TOOL_SRCS are the source files for the tool binaries
TOOL_SRCS := $(shell find tools -name "*.cpp")
# EXAMPLE_SRCS are the source files for the example binaries
EXAMPLE_SRCS := $(shell find examples -name "*.cpp")
# BUILD_INCLUDE_DIR contains any generated header files we want to include.
BUILD_INCLUDE_DIR := $(BUILD_DIR)/src
# PROTO_SRCS are the protocol buffer definitions
PROTO_SRC_DIR := src/$(PROJECT)/proto
PROTO_SRCS := $(wildcard $(PROTO_SRC_DIR)/*.proto)
# PROTO_BUILD_DIR will contain the .cc and obj files generated from
# PROTO_SRCS; PROTO_BUILD_INCLUDE_DIR will contain the .h header files
PROTO_BUILD_DIR := $(BUILD_DIR)/$(PROTO_SRC_DIR)
PROTO_BUILD_INCLUDE_DIR := $(BUILD_INCLUDE_DIR)/$(PROJECT)/proto
# NONGEN_CXX_SRCS includes all source/header files except those generated
# automatically (e.g., by proto).
NONGEN_CXX_SRCS := $(shell find \
	src/$(PROJECT) \
	include/$(PROJECT) \
	python/$(PROJECT) \
	matlab/+$(PROJECT)/private \
	examples \
	tools \
	-name "*.cpp" -or -name "*.hpp" -or -name "*.cu" -or -name "*.cuh")
LINT_SCRIPT := scripts/cpp_lint.py
LINT_OUTPUT_DIR := $(BUILD_DIR)/.lint
LINT_EXT := lint.txt
LINT_OUTPUTS := $(addsuffix .$(LINT_EXT), $(addprefix $(LINT_OUTPUT_DIR)/, $(NONGEN_CXX_SRCS)))
EMPTY_LINT_REPORT := $(BUILD_DIR)/.$(LINT_EXT)
NONEMPTY_LINT_REPORT := $(BUILD_DIR)/$(LINT_EXT)
# PY$(PROJECT)_SRC is the python wrapper for $(PROJECT)
PY$(PROJECT)_SRC := python/$(PROJECT)/_$(PROJECT).cpp
PY$(PROJECT)_SO := python/$(PROJECT)/_$(PROJECT).so
PY$(PROJECT)_HXX := include/$(PROJECT)/layers/python_layer.hpp
# MAT$(PROJECT)_SRC is the mex entrance point of matlab package for $(PROJECT)
MAT$(PROJECT)_SRC := matlab/+$(PROJECT)/private/$(PROJECT)_.cpp
ifneq ($(MATLAB_DIR),)
	MAT_SO_EXT := $(shell $(MATLAB_DIR)/bin/mexext)
endif
MAT$(PROJECT)_SO := matlab/+$(PROJECT)/private/$(PROJECT)_.$(MAT_SO_EXT)

##############################
# Derive generated files
##############################
# The generated files for protocol buffers
PROTO_GEN_HEADER_SRCS := $(addprefix $(PROTO_BUILD_DIR)/, \
		$(notdir ${PROTO_SRCS:.proto=.pb.h}))
PROTO_GEN_HEADER := $(addprefix $(PROTO_BUILD_INCLUDE_DIR)/, \
		$(notdir ${PROTO_SRCS:.proto=.pb.h}))
PROTO_GEN_CC := $(addprefix $(BUILD_DIR)/, ${PROTO_SRCS:.proto=.pb.cc})
PY_PROTO_BUILD_DIR := python/$(PROJECT)/proto
PY_PROTO_INIT := python/$(PROJECT)/proto/__init__.py
PROTO_GEN_PY := $(foreach file,${PROTO_SRCS:.proto=_pb2.py}, \
		$(PY_PROTO_BUILD_DIR)/$(notdir $(file)))
# The objects corresponding to the source files
# These objects will be linked into the final shared library, so we
# exclude the tool, example, and test objects.
CXX_OBJS := $(addprefix $(BUILD_DIR)/, ${CXX_SRCS:.cpp=.o})
CU_OBJS := $(addprefix $(BUILD_DIR)/cuda/, ${CU_SRCS:.cu=.o})
PROTO_OBJS := ${PROTO_GEN_CC:.cc=.o}
OBJS := $(PROTO_OBJS) $(CXX_OBJS) $(CU_OBJS)
# tool, example, and test objects
TOOL_OBJS := $(addprefix $(BUILD_DIR)/, ${TOOL_SRCS:.cpp=.o})
TOOL_BUILD_DIR := $(BUILD_DIR)/tools
TEST_CXX_BUILD_DIR := $(BUILD_DIR)/src/$(PROJECT)/test
TEST_CU_BUILD_DIR := $(BUILD_DIR)/cuda/src/$(PROJECT)/test
TEST_CXX_OBJS := $(addprefix $(BUILD_DIR)/, ${TEST_SRCS:.cpp=.o})
TEST_CU_OBJS := $(addprefix $(BUILD_DIR)/cuda/, ${TEST_CU_SRCS:.cu=.o})
TEST_OBJS := $(TEST_CXX_OBJS) $(TEST_CU_OBJS)
GTEST_OBJ := $(addprefix $(BUILD_DIR)/, ${GTEST_SRC:.cpp=.o})
EXAMPLE_OBJS := $(addprefix $(BUILD_DIR)/, ${EXAMPLE_SRCS:.cpp=.o})
# Output files for automatic dependency generation
DEPS := ${CXX_OBJS:.o=.d} ${CU_OBJS:.o=.d} ${TEST_CXX_OBJS:.o=.d} \
	${TEST_CU_OBJS:.o=.d} $(BUILD_DIR)/${MAT$(PROJECT)_SO:.$(MAT_SO_EXT)=.d}
# tool, example, and test bins
TOOL_BINS := ${TOOL_OBJS:.o=.bin}
EXAMPLE_BINS := ${EXAMPLE_OBJS:.o=.bin}
# symlinks to tool bins without the ".bin" extension
TOOL_BIN_LINKS := ${TOOL_BINS:.bin=}
# Put the test binaries in build/test for convenience.
TEST_BIN_DIR := $(BUILD_DIR)/test
TEST_CU_BINS := $(addsuffix .testbin,$(addprefix $(TEST_BIN_DIR)/, \
		$(foreach obj,$(TEST_CU_OBJS),$(basename $(notdir $(obj))))))
TEST_CXX_BINS := $(addsuffix .testbin,$(addprefix $(TEST_BIN_DIR)/, \
		$(foreach obj,$(TEST_CXX_OBJS),$(basename $(notdir $(obj))))))
TEST_BINS := $(TEST_CXX_BINS) $(TEST_CU_BINS)
# TEST_ALL_BIN is the test binary that links caffe dynamically.
TEST_ALL_BIN := $(TEST_BIN_DIR)/test_all.testbin

##############################
# Derive compiler warning dump locations
##############################
WARNS_EXT := warnings.txt
CXX_WARNS := $(addprefix $(BUILD_DIR)/, ${CXX_SRCS:.cpp=.o.$(WARNS_EXT)})
CU_WARNS := $(addprefix $(BUILD_DIR)/cuda/, ${CU_SRCS:.cu=.o.$(WARNS_EXT)})
TOOL_WARNS := $(addprefix $(BUILD_DIR)/, ${TOOL_SRCS:.cpp=.o.$(WARNS_EXT)})
EXAMPLE_WARNS := $(addprefix $(BUILD_DIR)/, ${EXAMPLE_SRCS:.cpp=.o.$(WARNS_EXT)})
TEST_WARNS := $(addprefix $(BUILD_DIR)/, ${TEST_SRCS:.cpp=.o.$(WARNS_EXT)})
TEST_CU_WARNS := $(addprefix $(BUILD_DIR)/cuda/, ${TEST_CU_SRCS:.cu=.o.$(WARNS_EXT)})
ALL_CXX_WARNS := $(CXX_WARNS) $(TOOL_WARNS) $(EXAMPLE_WARNS) $(TEST_WARNS)
ALL_CU_WARNS := $(CU_WARNS) $(TEST_CU_WARNS)
ALL_WARNS := $(ALL_CXX_WARNS) $(ALL_CU_WARNS)

EMPTY_WARN_REPORT := $(BUILD_DIR)/.$(WARNS_EXT)
NONEMPTY_WARN_REPORT := $(BUILD_DIR)/$(WARNS_EXT)

##############################
# Derive include and lib directories
##############################
CUDA_INCLUDE_DIR := $(CUDA_DIR)/include

CUDA_LIB_DIR :=
# add <cuda>/lib64 only if it exists
ifneq ("$(wildcard $(CUDA_DIR)/lib64)","")
	CUDA_LIB_DIR += $(CUDA_DIR)/lib64
endif
CUDA_LIB_DIR += $(CUDA_DIR)/lib

INCLUDE_DIRS += $(BUILD_INCLUDE_DIR) ./src ./include
ifneq ($(CPU_ONLY), 1)
	INCLUDE_DIRS += $(CUDA_INCLUDE_DIR)
	LIBRARY_DIRS += $(CUDA_LIB_DIR)
	LIBRARIES := cudart cublas curand
endif

LIBRARIES += glog gflags protobuf boost_system boost_filesystem m hdf5_serial_hl hdf5_serial

# handle IO dependencies
USE_LEVELDB ?= 1
USE_LMDB ?= 1
# This code is taken from https://github.com/sh1r0/caffe-android-lib
USE_HDF5 ?= 1
USE_OPENCV ?= 1

ifeq ($(USE_LEVELDB), 1)
	LIBRARIES += leveldb snappy
endif
ifeq ($(USE_LMDB), 1)
	LIBRARIES += lmdb
endif
# This code is taken from https://github.com/sh1r0/caffe-android-lib
ifeq ($(USE_HDF5), 1)
	LIBRARIES += hdf5_hl hdf5
endif
ifeq ($(USE_OPENCV), 1)
	LIBRARIES += opencv_core opencv_highgui opencv_imgproc

	ifeq ($(OPENCV_VERSION), 3)
		LIBRARIES += opencv_imgcodecs
	endif

endif
PYTHON_LIBRARIES ?= boost_python python2.7
WARNINGS := -Wall -Wno-sign-compare

##############################
# Set build directories
##############################

DISTRIBUTE_DIR ?= distribute
DISTRIBUTE_SUBDIRS := $(DISTRIBUTE_DIR)/bin $(DISTRIBUTE_DIR)/lib
DIST_ALIASES := dist
ifneq ($(strip $(DISTRIBUTE_DIR)),distribute)
	DIST_ALIASES += distribute
endif

ALL_BUILD_DIRS := $(sort $(BUILD_DIR) $(addprefix $(BUILD_DIR)/, $(SRC_DIRS)) \
	$(addprefix $(BUILD_DIR)/cuda/, $(SRC_DIRS)) \
	$(LIB_BUILD_DIR) $(TEST_BIN_DIR) $(PY_PROTO_BUILD_DIR) $(LINT_OUTPUT_DIR) \
	$(DISTRIBUTE_SUBDIRS) $(PROTO_BUILD_INCLUDE_DIR))

##############################
# Set directory for Doxygen-generated documentation
##############################
DOXYGEN_CONFIG_FILE ?= ./.Doxyfile
# should be the same as OUTPUT_DIRECTORY in the .Doxyfile
DOXYGEN_OUTPUT_DIR ?= ./doxygen
DOXYGEN_COMMAND ?= doxygen
# All the files that might have Doxygen documentation.
DOXYGEN_SOURCES := $(shell find \
	src/$(PROJECT) \
	include/$(PROJECT) \
	python/ \
	matlab/ \
	examples \
	tools \
	-name "*.cpp" -or -name "*.hpp" -or -name "*.cu" -or -name "*.cuh" -or \
	-name "*.py" -or -name "*.m")
DOXYGEN_SOURCES += $(DOXYGEN_CONFIG_FILE)

##############################
# Configure build
##############################

# Determine platform
UNAME := $(shell uname -s)
ifeq ($(UNAME), Linux)
	LINUX := 1
else ifeq ($(UNAME), Darwin)
	OSX := 1
	OSX_MAJOR_VERSION := $(shell sw_vers -productVersion | cut -f 1 -d .)
	OSX_MINOR_VERSION := $(shell sw_vers -productVersion | cut -f 2 -d .)
endif

# Linux
ifeq ($(LINUX), 1)
	CXX ?= /usr/bin/g++
	GCCVERSION := $(shell $(CXX) -dumpversion | cut -f1,2 -d.)
	# older versions of gcc are too dumb to build boost with -Wuninitalized
	ifeq ($(shell echo | awk '{exit $(GCCVERSION) < 4.6;}'), 1)
		WARNINGS += -Wno-uninitialized
	endif
	# boost::thread is reasonably called boost_thread (compare OS X)
	# We will also explicitly add stdc++ to the link target.
	LIBRARIES += boost_thread stdc++
	VERSIONFLAGS += -Wl,-soname,$(DYNAMIC_VERSIONED_NAME_SHORT) -Wl,-rpath,$(ORIGIN)/../lib
endif

# OS X:
# clang++ instead of g++
# libstdc++ for NVCC compatibility on OS X >= 10.9 with CUDA < 7.0
ifeq ($(OSX), 1)
	CXX := /usr/bin/clang++
	ifneq ($(CPU_ONLY), 1)
		CUDA_VERSION := $(shell $(CUDA_DIR)/bin/nvcc -V | grep -o 'release [0-9.]*' | tr -d '[a-z ]')
		ifeq ($(shell echo | awk '{exit $(CUDA_VERSION) < 7.0;}'), 1)
			CXXFLAGS += -stdlib=libstdc++
			LINKFLAGS += -stdlib=libstdc++
		endif
		# clang throws this warning for cuda headers
		WARNINGS += -Wno-unneeded-internal-declaration
		# 10.11 strips DYLD_* env vars so link CUDA (rpath is available on 10.5+)
		OSX_10_OR_LATER   := $(shell [ $(OSX_MAJOR_VERSION) -ge 10 ] && echo true)
		OSX_10_5_OR_LATER := $(shell [ $(OSX_MINOR_VERSION) -ge 5 ] && echo true)
		ifeq ($(OSX_10_OR_LATER),true)
			ifeq ($(OSX_10_5_OR_LATER),true)
				LDFLAGS += -Wl,-rpath,$(CUDA_LIB_DIR)
			endif
		endif
	endif
	# gtest needs to use its own tuple to not conflict with clang
	COMMON_FLAGS += -DGTEST_USE_OWN_TR1_TUPLE=1
	# boost::thread is called boost_thread-mt to mark multithreading on OS X
	LIBRARIES += boost_thread-mt
	# we need to explicitly ask for the rpath to be obeyed
	ORIGIN := @loader_path
	VERSIONFLAGS += -Wl,-install_name,@rpath/$(DYNAMIC_VERSIONED_NAME_SHORT) -Wl,-rpath,$(ORIGIN)/../../build/lib
else
	ORIGIN := \$$ORIGIN
endif

# Custom compiler
ifdef CUSTOM_CXX
	CXX := $(CUSTOM_CXX)
endif

# Static linking
ifneq (,$(findstring clang++,$(CXX)))
	STATIC_LINK_COMMAND := -Wl,-force_load $(STATIC_NAME)
else ifneq (,$(findstring g++,$(CXX)))
	STATIC_LINK_COMMAND := -Wl,--whole-archive $(STATIC_NAME) -Wl,--no-whole-archive
else
  # The following line must not be indented with a tab, since we are not inside a target
  $(error Cannot static link with the $(CXX) compiler)
endif

# Debugging
ifeq ($(DEBUG), 1)
	COMMON_FLAGS += -DDEBUG -g -O0
	NVCCFLAGS += -G
else
	COMMON_FLAGS += -DNDEBUG -O2
endif

# cuDNN acceleration configuration.
ifeq ($(USE_CUDNN), 1)
	LIBRARIES += cudnn
	COMMON_FLAGS += -DUSE_CUDNN
endif

# NCCL acceleration configuration
ifeq ($(USE_NCCL), 1)
	LIBRARIES += nccl
	COMMON_FLAGS += -DUSE_NCCL
endif

# configure IO libraries
ifeq ($(USE_OPENCV), 1)
	COMMON_FLAGS += -DUSE_OPENCV
endif
ifeq ($(USE_LEVELDB), 1)
	COMMON_FLAGS += -DUSE_LEVELDB
endif
ifeq ($(USE_LMDB), 1)
	COMMON_FLAGS += -DUSE_LMDB
ifeq ($(ALLOW_LMDB_NOLOCK), 1)
	COMMON_FLAGS += -DALLOW_LMDB_NOLOCK
endif
endif
# This code is taken from https://github.com/sh1r0/caffe-android-lib
ifeq ($(USE_HDF5), 1)
	COMMON_FLAGS += -DUSE_HDF5
endif

# CPU-only configuration
ifeq ($(CPU_ONLY), 1)
	OBJS := $(PROTO_OBJS) $(CXX_OBJS)
	TEST_OBJS := $(TEST_CXX_OBJS)
	TEST_BINS := $(TEST_CXX_BINS)
	ALL_WARNS := $(ALL_CXX_WARNS)
	TEST_FILTER := --gtest_filter="-*GPU*"
	COMMON_FLAGS += -DCPU_ONLY
endif

# Python layer support
ifeq ($(WITH_PYTHON_LAYER), 1)
	COMMON_FLAGS += -DWITH_PYTHON_LAYER
	LIBRARIES += $(PYTHON_LIBRARIES)
endif

# BLAS configuration (default = ATLAS)
BLAS ?= atlas
ifeq ($(BLAS), mkl)
	# MKL
	LIBRARIES += mkl_rt
	COMMON_FLAGS += -DUSE_MKL
	MKLROOT ?= /opt/intel/mkl
	BLAS_INCLUDE ?= $(MKLROOT)/include
	BLAS_LIB ?= $(MKLROOT)/lib $(MKLROOT)/lib/intel64
else ifeq ($(BLAS), open)
	# OpenBLAS
	LIBRARIES += openblas
else
	# ATLAS
	ifeq ($(LINUX), 1)
		ifeq ($(BLAS), atlas)
			# Linux simply has cblas and atlas
			LIBRARIES += cblas atlas
		endif
	else ifeq ($(OSX), 1)
		# OS X packages atlas as the vecLib framework
		LIBRARIES += cblas
		# 10.10 has accelerate while 10.9 has veclib
		XCODE_CLT_VER := $(shell pkgutil --pkg-info=com.apple.pkg.CLTools_Executables | grep 'version' | sed 's/[^0-9]*\([0-9]\).*/\1/')
		XCODE_CLT_GEQ_7 := $(shell [ $(XCODE_CLT_VER) -gt 6 ] && echo 1)
		XCODE_CLT_GEQ_6 := $(shell [ $(XCODE_CLT_VER) -gt 5 ] && echo 1)
		ifeq ($(XCODE_CLT_GEQ_7), 1)
			BLAS_INCLUDE ?= /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/$(shell ls /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/ | sort | tail -1)/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/Headers
		else ifeq ($(XCODE_CLT_GEQ_6), 1)
			BLAS_INCLUDE ?= /System/Library/Frameworks/Accelerate.framework/Versions/Current/Frameworks/vecLib.framework/Headers/
			LDFLAGS += -framework Accelerate
		else
			BLAS_INCLUDE ?= /System/Library/Frameworks/vecLib.framework/Versions/Current/Headers/
			LDFLAGS += -framework vecLib
		endif
	endif
endif
INCLUDE_DIRS += $(BLAS_INCLUDE)
LIBRARY_DIRS += $(BLAS_LIB)

LIBRARY_DIRS += $(LIB_BUILD_DIR)

# Automatic dependency generation (nvcc is handled separately)
CXXFLAGS += -MMD -MP

# Complete build flags.
COMMON_FLAGS += $(foreach includedir,$(INCLUDE_DIRS),-I$(includedir))
CXXFLAGS += -pthread -fPIC $(COMMON_FLAGS) $(WARNINGS)
NVCCFLAGS += -D_FORCE_INLINES -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)
# mex may invoke an older gcc that is too liberal with -Wuninitalized
MATLAB_CXXFLAGS := $(CXXFLAGS) -Wno-uninitialized
LINKFLAGS += -pthread -fPIC $(COMMON_FLAGS) $(WARNINGS)

USE_PKG_CONFIG ?= 0
ifeq ($(USE_PKG_CONFIG), 1)
	PKG_CONFIG := $(shell pkg-config opencv --libs)
else
	PKG_CONFIG :=
endif
LDFLAGS += $(foreach librarydir,$(LIBRARY_DIRS),-L$(librarydir)) $(PKG_CONFIG) \
		$(foreach library,$(LIBRARIES),-l$(library))
PYTHON_LDFLAGS := $(LDFLAGS) $(foreach library,$(PYTHON_LIBRARIES),-l$(library))

# 'superclean' target recursively* deletes all files ending with an extension
# in $(SUPERCLEAN_EXTS) below.  This may be useful if you've built older
# versions of Caffe that do not place all generated files in a location known
# to the 'clean' target.
#
# 'supercleanlist' will list the files to be deleted by make superclean.
#
# * Recursive with the exception that symbolic links are never followed, per the
# default behavior of 'find'.
SUPERCLEAN_EXTS := .so .a .o .bin .testbin .pb.cc .pb.h _pb2.py .cuo

# Set the sub-targets of the 'everything' target.
EVERYTHING_TARGETS := all py$(PROJECT) test warn lint
# Only build matcaffe as part of "everything" if MATLAB_DIR is specified.
ifneq ($(MATLAB_DIR),)
	EVERYTHING_TARGETS += mat$(PROJECT)
endif

##############################
# Define build targets
##############################
.PHONY: all lib test clean docs linecount lint lintclean tools examples $(DIST_ALIASES) \
	py mat py$(PROJECT) mat$(PROJECT) proto runtest \
	superclean supercleanlist supercleanfiles warn everything

all: lib tools examples

lib: $(STATIC_NAME) $(DYNAMIC_NAME)

everything: $(EVERYTHING_TARGETS)

linecount:
	cloc --read-lang-def=$(PROJECT).cloc \
		src/$(PROJECT) include/$(PROJECT) tools examples \
		python matlab

lint: $(EMPTY_LINT_REPORT)

lintclean:
	@ $(RM) -r $(LINT_OUTPUT_DIR) $(EMPTY_LINT_REPORT) $(NONEMPTY_LINT_REPORT)

docs: $(DOXYGEN_OUTPUT_DIR)
	@ cd ./docs ; ln -sfn ../$(DOXYGEN_OUTPUT_DIR)/html doxygen

$(DOXYGEN_OUTPUT_DIR): $(DOXYGEN_CONFIG_FILE) $(DOXYGEN_SOURCES)
	$(DOXYGEN_COMMAND) $(DOXYGEN_CONFIG_FILE)

$(EMPTY_LINT_REPORT): $(LINT_OUTPUTS) | $(BUILD_DIR)
	@ cat $(LINT_OUTPUTS) > $@
	@ if [ -s "$@" ]; then \
		cat $@; \
		mv $@ $(NONEMPTY_LINT_REPORT); \
		echo "Found one or more lint errors."; \
		exit 1; \
	  fi; \
	  $(RM) $(NONEMPTY_LINT_REPORT); \
	  echo "No lint errors!";

$(LINT_OUTPUTS): $(LINT_OUTPUT_DIR)/%.lint.txt : % $(LINT_SCRIPT) | $(LINT_OUTPUT_DIR)
	@ mkdir -p $(dir $@)
	@ python $(LINT_SCRIPT) $< 2>&1 \
		| grep -v "^Done processing " \
		| grep -v "^Total errors found: 0" \
		> $@ \
		|| true

test: $(TEST_ALL_BIN) $(TEST_ALL_DYNLINK_BIN) $(TEST_BINS)

tools: $(TOOL_BINS) $(TOOL_BIN_LINKS)

examples: $(EXAMPLE_BINS)

py$(PROJECT): py

py: $(PY$(PROJECT)_SO) $(PROTO_GEN_PY)

$(PY$(PROJECT)_SO): $(PY$(PROJECT)_SRC) $(PY$(PROJECT)_HXX) | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@ $<
	$(Q)$(CXX) -shared -o $@ $(PY$(PROJECT)_SRC) \
		-o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(PYTHON_LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../../build/lib

mat$(PROJECT): mat

mat: $(MAT$(PROJECT)_SO)

$(MAT$(PROJECT)_SO): $(MAT$(PROJECT)_SRC) $(STATIC_NAME)
	@ if [ -z "$(MATLAB_DIR)" ]; then \
		echo "MATLAB_DIR must be specified in $(CONFIG_FILE)" \
			"to build mat$(PROJECT)."; \
		exit 1; \
	fi
	@ echo MEX $<
	$(Q)$(MATLAB_DIR)/bin/mex $(MAT$(PROJECT)_SRC) \
			CXX="$(CXX)" \
			CXXFLAGS="\$$CXXFLAGS $(MATLAB_CXXFLAGS)" \
			CXXLIBS="\$$CXXLIBS $(STATIC_LINK_COMMAND) $(LDFLAGS)" -output $@
	@ if [ -f "$(PROJECT)_.d" ]; then \
		mv -f $(PROJECT)_.d $(BUILD_DIR)/${MAT$(PROJECT)_SO:.$(MAT_SO_EXT)=.d}; \
	fi

runtest: $(TEST_ALL_BIN)
	$(TOOL_BUILD_DIR)/caffe
	$(TEST_ALL_BIN) $(TEST_GPUID) --gtest_shuffle $(TEST_FILTER)

pytest: py
	cd python; python -m unittest discover -s caffe/test

mattest: mat
	cd matlab; $(MATLAB_DIR)/bin/matlab -nodisplay -r 'caffe.run_tests(), exit()'

warn: $(EMPTY_WARN_REPORT)

$(EMPTY_WARN_REPORT): $(ALL_WARNS) | $(BUILD_DIR)
	@ cat $(ALL_WARNS) > $@
	@ if [ -s "$@" ]; then \
		cat $@; \
		mv $@ $(NONEMPTY_WARN_REPORT); \
		echo "Compiler produced one or more warnings."; \
		exit 1; \
	  fi; \
	  $(RM) $(NONEMPTY_WARN_REPORT); \
	  echo "No compiler warnings!";

$(ALL_WARNS): %.o.$(WARNS_EXT) : %.o

$(BUILD_DIR_LINK): $(BUILD_DIR)/.linked

# Create a target ".linked" in this BUILD_DIR to tell Make that the "build" link
# is currently correct, then delete the one in the OTHER_BUILD_DIR in case it
# exists and $(DEBUG) is toggled later.
$(BUILD_DIR)/.linked:
	@ mkdir -p $(BUILD_DIR)
	@ $(RM) $(OTHER_BUILD_DIR)/.linked
	@ $(RM) -r $(BUILD_DIR_LINK)
	@ ln -s $(BUILD_DIR) $(BUILD_DIR_LINK)
	@ touch $@

$(ALL_BUILD_DIRS): | $(BUILD_DIR_LINK)
	@ mkdir -p $@

$(DYNAMIC_NAME): $(OBJS) | $(LIB_BUILD_DIR)
	@ echo LD -o $@
	$(Q)$(CXX) -shared -o $@ $(OBJS) $(VERSIONFLAGS) $(LINKFLAGS) $(LDFLAGS) \
		-L/usr/lib/x86_64-linux-gnu -lprotobuf  # force linking the system protobuf
	@ cd $(BUILD_DIR)/lib; rm -f $(DYNAMIC_NAME_SHORT); ln -s $(DYNAMIC_VERSIONED_NAME_SHORT) $(DYNAMIC_NAME_SHORT)

$(STATIC_NAME): $(OBJS) | $(LIB_BUILD_DIR)
	@ echo AR -o $@
	$(Q)ar rcs $@ $(OBJS)

$(BUILD_DIR)/%.o: %.cpp $(PROTO_GEN_HEADER) | $(ALL_BUILD_DIRS)
	@ echo CXX $<
	$(Q)$(CXX) $< $(CXXFLAGS) -c -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(PROTO_BUILD_DIR)/%.pb.o: $(PROTO_BUILD_DIR)/%.pb.cc $(PROTO_GEN_HEADER) \
		| $(PROTO_BUILD_DIR)
	@ echo CXX $<
	$(Q)$(CXX) $< $(CXXFLAGS) -c -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(BUILD_DIR)/cuda/%.o: %.cu | $(ALL_BUILD_DIRS)
	@ echo NVCC $<
	$(Q)$(CUDA_DIR)/bin/nvcc $(NVCCFLAGS) $(CUDA_ARCH) -M $< -o ${@:.o=.d} \
		-odir $(@D)
	$(Q)$(CUDA_DIR)/bin/nvcc $(NVCCFLAGS) $(CUDA_ARCH) -c $< -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(TEST_ALL_BIN): $(TEST_MAIN_SRC) $(TEST_OBJS) $(GTEST_OBJ) \
		| $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo CXX/LD -o $@ $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $(TEST_OBJS) $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

$(TEST_CU_BINS): $(TEST_BIN_DIR)/%.testbin: $(TEST_CU_BUILD_DIR)/%.o \
	$(GTEST_OBJ) | $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo LD $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $< $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

$(TEST_CXX_BINS): $(TEST_BIN_DIR)/%.testbin: $(TEST_CXX_BUILD_DIR)/%.o \
	$(GTEST_OBJ) | $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo LD $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $< $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

# Target for extension-less symlinks to tool binaries with extension '*.bin'.
$(TOOL_BUILD_DIR)/%: $(TOOL_BUILD_DIR)/%.bin | $(TOOL_BUILD_DIR)
	@ $(RM) $@
	@ ln -s $(notdir $<) $@

$(TOOL_BINS): %.bin : %.o | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@
	$(Q)$(CXX) $< -o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../lib

$(EXAMPLE_BINS): %.bin : %.o | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@
	$(Q)$(CXX) $< -o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../../lib

proto: $(PROTO_GEN_CC) $(PROTO_GEN_HEADER)

$(PROTO_BUILD_DIR)/%.pb.cc $(PROTO_BUILD_DIR)/%.pb.h : \
		$(PROTO_SRC_DIR)/%.proto | $(PROTO_BUILD_DIR)
	@ echo PROTOC $<
	$(Q)protoc --proto_path=$(PROTO_SRC_DIR) --cpp_out=$(PROTO_BUILD_DIR) $<

$(PY_PROTO_BUILD_DIR)/%_pb2.py : $(PROTO_SRC_DIR)/%.proto \
		$(PY_PROTO_INIT) | $(PY_PROTO_BUILD_DIR)
	@ echo PROTOC \(python\) $<
	$(Q)protoc --proto_path=src --python_out=python $<

$(PY_PROTO_INIT): | $(PY_PROTO_BUILD_DIR)
	touch $(PY_PROTO_INIT)

clean:
	@- $(RM) -rf $(ALL_BUILD_DIRS)
	@- $(RM) -rf $(OTHER_BUILD_DIR)
	@- $(RM) -rf $(BUILD_DIR_LINK)
	@- $(RM) -rf $(DISTRIBUTE_DIR)
	@- $(RM) $(PY$(PROJECT)_SO)
	@- $(RM) $(MAT$(PROJECT)_SO)

supercleanfiles:
	$(eval SUPERCLEAN_FILES := $(strip \
			$(foreach ext,$(SUPERCLEAN_EXTS), $(shell find . -name '*$(ext)' \
			-not -path './data/*'))))

supercleanlist: supercleanfiles
	@ \
	if [ -z "$(SUPERCLEAN_FILES)" ]; then \
		echo "No generated files found."; \
	else \
		echo $(SUPERCLEAN_FILES) | tr ' ' '\n'; \
	fi

superclean: clean supercleanfiles
	@ \
	if [ -z "$(SUPERCLEAN_FILES)" ]; then \
		echo "No generated files found."; \
	else \
		echo "Deleting the following generated files:"; \
		echo $(SUPERCLEAN_FILES) | tr ' ' '\n'; \
		$(RM) $(SUPERCLEAN_FILES); \
	fi

$(DIST_ALIASES): $(DISTRIBUTE_DIR)

$(DISTRIBUTE_DIR): all py | $(DISTRIBUTE_SUBDIRS)
	# add proto
	cp -r src/caffe/proto $(DISTRIBUTE_DIR)/
	# add include
	cp -r include $(DISTRIBUTE_DIR)/
	mkdir -p $(DISTRIBUTE_DIR)/include/caffe/proto
	cp $(PROTO_GEN_HEADER_SRCS) $(DISTRIBUTE_DIR)/include/caffe/proto
	# add tool and example binaries
	cp $(TOOL_BINS) $(DISTRIBUTE_DIR)/bin
	cp $(EXAMPLE_BINS) $(DISTRIBUTE_DIR)/bin
	# add libraries
	cp $(STATIC_NAME) $(DISTRIBUTE_DIR)/lib
	install -m 644 $(DYNAMIC_NAME) $(DISTRIBUTE_DIR)/lib
	cd $(DISTRIBUTE_DIR)/lib; rm -f $(DYNAMIC_NAME_SHORT);   ln -s $(DYNAMIC_VERSIONED_NAME_SHORT) $(DYNAMIC_NAME_SHORT)
	# add python - it's not the standard way, indeed...
	cp -r python $(DISTRIBUTE_DIR)/

-include $(DEPS)

3. Modifying CMakeLists.txt (only if you use cmake; otherwise skip)

If you want to build with cmake instead of make:

mkdir build
cd build
cmake -DCPU_ONLY=ON -DPYTHON_EXECUTABLE=$(which python) ..

4. Creating a Symlink for the Python 3.8 Boost Library

The stock configuration links against the Python 2.7 Boost library by default, so we need to create a symlink for the Python 3.8 version:

sudo ln -s /usr/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0 /usr/local/lib/libboost_python3.so
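The exact library file name varies with the distro and Boost version, so check what is actually installed (for example with ls /usr/lib/x86_64-linux-gnu/libboost_python*) before creating the link. The mechanics of the symlink step can be sketched safely in a throwaway directory; the file name below is just a stand-in for the real library:

```shell
# Demonstrate the symlink step with a dummy file in a temp dir;
# on the real system, point ln -s at the installed libboost_python38 instead.
tmp=$(mktemp -d)
touch "$tmp/libboost_python38.so.1.71.0"     # stand-in for the real library
ln -s "$tmp/libboost_python38.so.1.71.0" "$tmp/libboost_python3.so"
readlink "$tmp/libboost_python3.so"          # shows the symlink target
rm -rf "$tmp"
```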

五、编译Caffe

1. 使用Make编译

① Check that the build environment's Python version is what you expect

# Check the Python path
which python
# Expected output: /home/anaconda3/envs/caffe/bin/python

# Check the version
python --version
# Expected output: Python 3.8.20

If the path comes back as /usr/bin/python and the version as Python 2.7.18, the shell is picking up the system Python rather than the one in the Anaconda virtual environment. Adjust PATH to force the Anaconda Python:

# Temporary fix (current terminal only)
export PATH="/home/anaconda3/envs/caffe/bin:$PATH"

# Permanent fix (append to ~/.bashrc)
echo 'export PATH="/home/anaconda3/envs/caffe/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc
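Note that appending the export line to ~/.bashrc every time you debug will stack duplicate PATH entries. A defensive POSIX-sh sketch (the conda path is this guide's example location):

```shell
# Sketch: prepend a directory to a PATH-style list only if it is absent
prepend_once() {   # usage: prepend_once DIR PATHLIST -> prints new list
  case ":$2:" in
    *":$1:"*) echo "$2" ;;   # already present: unchanged
    *) echo "$1:$2" ;;
  esac
}

p="/usr/bin:/bin"
p=$(prepend_once "/home/anaconda3/envs/caffe/bin" "$p")
p=$(prepend_once "/home/anaconda3/envs/caffe/bin" "$p")   # second call is a no-op
echo "$p"   # -> /home/anaconda3/envs/caffe/bin:/usr/bin:/bin
```

In ~/.bashrc this becomes `PATH=$(prepend_once "/home/anaconda3/envs/caffe/bin" "$PATH")`.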

② Check that the build environment's protoc version is what you expect

# Check the protoc path
which protoc
# Expected output: /usr/bin/protoc

# Check the protoc version
protoc --version
# Expected output: the system version (e.g. 3.6.1)

If the path comes back as /home/protobuf-3.11.2/build/bin/protoc and the version as protoc 3.11.2, the environment is using a different protoc built from source. Adjust PATH to force the system protoc:

# Temporary (current terminal only)
export PATH=/usr/bin:$PATH

# Permanent (append to ~/.bashrc)
echo 'export PATH=/usr/bin:$PATH' >> ~/.bashrc
source ~/.bashrc
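The check can be automated; the helper below parses the `libprotoc X.Y.Z` line that `protoc --version` prints (3.6.1 is only the example version from above; in real use pass `"$(protoc --version)"` as the first argument):

```shell
# Sketch: compare a protoc version string against the expected one
check_protoc() {   # usage: check_protoc "libprotoc X.Y.Z" EXPECTED
  v=${1##* }       # keep only the text after the last space
  if [ "$v" = "$2" ]; then
    echo "protoc $v OK"
  else
    echo "protoc $v != $2"
  fi
}

check_protoc "libprotoc 3.6.1" "3.6.1"     # -> protoc 3.6.1 OK
check_protoc "libprotoc 3.11.2" "3.6.1"    # -> protoc 3.11.2 != 3.6.1
```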

③ Build Caffe

# From the caffe root directory, run in order:
sudo make all -j$(nproc)
sudo make test -j$(nproc)
sudo make runtest -j$(nproc)

If the test run finishes with everything passed, the build succeeded.
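One caveat on `-j$(nproc)`: in stripped-down containers nproc may be missing, in which case make receives a bare `-j` (unlimited jobs). A hedged fallback sketch:

```shell
# Sketch: pick a parallel job count, defaulting to 1 if nproc is missing
jobs=$(nproc 2>/dev/null || echo 1)
echo "building with -j$jobs"
# sudo make all -j"$jobs"   # then make test and make runtest, as above
```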

2. Verify the dynamic library links

# Check which Python version _caffe.so links against
ldd python/caffe/_caffe.so | grep python

# Output like the following means Python 3.8 was linked successfully:
libpython3.8.so.1.0 => /lib/x86_64-linux-gnu/libpython3.8.so.1.0 (0x00007fe800ec8000)
libboost_python38.so.1.71.0 => /lib/x86_64-linux-gnu/libboost_python38.so.1.71.0 (0x00007fe800c70000)
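This check can also be scripted; the filter below extracts the libpython name from an ldd line, with a sample string standing in for real `ldd python/caffe/_caffe.so` output:

```shell
# Sketch: report which libpython a shared object links against
linked_python() {   # argument: ldd output text
  echo "$1" | grep -o 'libpython[0-9.]*\.so' | head -n1   # first match only
}

sample='libpython3.8.so.1.0 => /lib/x86_64-linux-gnu/libpython3.8.so.1.0 (0x00007fe800ec8000)'
linked_python "$sample"   # -> libpython3.8.so
```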

3. Common build errors and fixes

① Error 1: fatal error: numpy/arrayobject.h: No such file or directory

Solution
Make sure numpy is installed and its path is correct. Try:

sudo apt install python3-numpy
# or
pip install numpy --upgrade

Then check that the numpy path in Makefile.config is correct.
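When building against an Anaconda environment, the numpy headers live inside the env rather than under /usr. A hypothetical Makefile.config fragment, assuming the env created earlier at /home/anaconda3/envs/caffe; verify the numpy path with `python -c 'import numpy; print(numpy.get_include())'`:

```makefile
# Hypothetical paths for the conda env "caffe" -- verify before using
PYTHON_INCLUDE := /home/anaconda3/envs/caffe/include/python3.8 \
		/home/anaconda3/envs/caffe/lib/python3.8/site-packages/numpy/core/include
```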

② Error 2: undefined reference to boost::python...

Solution
Make sure the boost-python version matches:

sudo apt install libboost-python-dev
# or pin a specific version
sudo apt install libboost-python1.65-dev

In Makefile.config, check that PYTHON_LIBRARIES is set to boost_python38 python3.8.

③ Error 3: error: 'pybind11' is not a namespace-name

Solution
Install pybind11:

pip install pybind11
# or
conda install pybind11

④ Error 4: error: 'class std::unordered_map' has no member named 'emplace'

Solution
Update the GCC version:

sudo apt install gcc-7 g++-7
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-7 100
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-7 100

⑤ Error 5: HDF5-related errors

Solution
Install the HDF5 development packages and set the correct paths:

sudo apt install libhdf5-serial-dev libhdf5-dev

Then uncomment and edit these lines in Makefile.config:

INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial

六、Install the Python Interface

1. Install the required Python packages

From the caffe root directory, enter the python folder and install the packages listed in requirements.txt:

cd /home/caffe/python
pip install -r requirements.txt

You also need to upgrade matplotlib:

pip install --upgrade matplotlib

2. Build pycaffe

Add caffe/python to the environment variables:

export PYTHONPATH=/home/caffe/python:$PYTHONPATH
# or add it permanently to .bashrc
echo 'export PYTHONPATH=/home/caffe/python:$PYTHONPATH' >> ~/.bashrc
source ~/.bashrc

Return to the caffe root directory and build:

cd ..
sudo make pycaffe -j$(nproc)

If no errors are reported, you are basically done.

Verify the installation:

# run these in order
python
import caffe
print(caffe.__version__)

If the version prints without errors, everything works. Congratulations, the build is complete!

七、Test Caffe

Run the MNIST example:

./data/mnist/get_mnist.sh
./examples/mnist/create_mnist.sh
./examples/mnist/train_lenet.sh
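These scripts assume you run them from the caffe root; otherwise a missing script produces a confusing "No such file" partway through the pipeline. A guarded sketch with the same paths:

```shell
# Sketch: run each MNIST step only if the script is actually there
run_step() {
  if [ -x "$1" ]; then
    "$1"
  else
    echo "skip: $1 not found or not executable"
  fi
}

run_step ./data/mnist/get_mnist.sh
run_step ./examples/mnist/create_mnist.sh
run_step ./examples/mnist/train_lenet.sh
```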

八、Possible Problems and Solutions

Problem 1: Caffe import error

ImportError: No module named caffe

Solution

  • Make sure PYTHONPATH includes the caffe/python directory

  • Make sure you have run make pycaffe

  • Check that the Python environment is correct (confirm with which python)
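The PYTHONPATH check can be done mechanically with a small helper (the /home/caffe/python path is this guide's example):

```shell
# Sketch: test whether a directory appears in a PATH-style list
on_path() {   # usage: on_path DIR PATHLIST
  case ":$2:" in
    *":$1:"*) echo yes ;;
    *) echo no ;;
  esac
}

on_path /home/caffe/python "/home/caffe/python:/opt/lib"   # -> yes
on_path /home/caffe/python "/opt/lib"                      # -> no
```

Against the live environment, run `on_path /home/caffe/python "$PYTHONPATH"`.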

Problem 2: Protobuf error

.build_release/src/caffe/proto/caffe.pb.h:10:10: fatal error: google/protobuf/port_def.inc: No such file or directory
   10 | #include <google/protobuf/port_def.inc>
      |          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.

Solution

A Protobuf error like this means more than one Protobuf version is installed on the system. Check whether the protoc in the current build environment is the version you want; if not, adjust the environment variables (PATH) to force the right one.
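To see every protoc that could be shadowing the system one, `which -a protoc` lists all copies on PATH; the sketch below reimplements that lookup so the demo can run against temporary directories rather than your real PATH:

```shell
# Sketch: list every executable NAME along a colon-separated dir list
list_all() {   # usage: list_all NAME "dir1:dir2:..."
  old_ifs=$IFS; IFS=:
  for d in $2; do
    [ -x "$d/$1" ] && echo "$d/$1"
  done
  IFS=$old_ifs
}

# demo: two fake protoc installs in temp dirs
tmp=$(mktemp -d); mkdir "$tmp/a" "$tmp/b"
touch "$tmp/a/protoc" "$tmp/b/protoc"
chmod +x "$tmp/a/protoc" "$tmp/b/protoc"
list_all protoc "$tmp/a:$tmp/b"   # prints both copies
```

If more than one path is printed for your real PATH, the first one wins, which is exactly what the PATH fix above changes.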

Problem 3: OpenCV error

src/caffe/layers/window_data_layer.cpp: In member function 'virtual void caffe::WindowDataLayer<Dtype>::load_batch(caffe::Batch<Dtype>*)':
src/caffe/layers/window_data_layer.cpp:293:42: error: 'CV_LOAD_IMAGE_COLOR' was not declared in this scope
  293 |         cv_img = cv::imread(image.first, CV_LOAD_IMAGE_COLOR);
      |                                          ^~~~~~~~~~~~~~~~~~~
make: *** [Makefile:595: .build_release/src/caffe/layers/window_data_layer.o] Error 1

Solution
The OpenCV-related code in the Caffe source was written against OpenCV 3.x. OpenCV 4.x dropped the CV_LOAD_IMAGE_COLOR and CV_LOAD_IMAGE_GRAYSCALE macros, so add the following two lines to Makefile.config:

COMMON_FLAGS += -DCV_LOAD_IMAGE_COLOR=cv::IMREAD_COLOR
COMMON_FLAGS += -DCV_LOAD_IMAGE_GRAYSCALE=cv::IMREAD_GRAYSCALE

Problem 4: Caffe runtime error

ImportError: dynamic module does not define init function (init_caffe)

Solution

This error has two possible causes:

First, a Python version mismatch: for example, caffe was compiled under Python 2.7 but import caffe was run under Python 3, or vice versa.

Second, check whether you actually ran make pycaffe.
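Background on the message itself: a Python 2 interpreter looks for an init<module> symbol in a C extension, while Python 3 looks for PyInit_<module>. Since the module is named _caffe, Python 2 wants init_caffe, exactly the name in the error. A toy sketch of the naming rule:

```shell
# Sketch: the init symbol a CPython extension must export, by major version
init_symbol() {   # usage: init_symbol MODULE PY_MAJOR
  if [ "$2" -ge 3 ]; then
    echo "PyInit_$1"
  else
    echo "init$1"
  fi
}

init_symbol _caffe 2   # -> init_caffe    (what a Python 2 interpreter expects)
init_symbol _caffe 3   # -> PyInit__caffe (what Python 3 expects)
```

So seeing init_caffe in the error usually means a Python 3-built module is being imported by Python 2, or the reverse.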
