Configuring the environment for calling Caffe from Python


The background is this: the project requires that the trained model be invoked from C++, so Caffe or MXNet had to be used. Caffe is implemented in C++, which means that even something as simple as loading an image and running a prediction on it can be quite inconvenient.

Writing a prototxt for Caffe is easy, and so is writing a solver. But how do you predict the class of every sample fed in from an LMDB, or obtain the probability that a sample belongs to each of the other classes? It looks like a simple problem; in practice it is trivial in PyTorch, while in Caffe it may require modifying C++ code, which is neither convenient nor intuitive. So can we call the already-trained caffemodel together with deploy.prototxt from Python to predict classes?
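This is exactly what pycaffe makes possible once the environment described below is configured. Here is a minimal sketch of such a prediction helper; note that the input blob name `data`, the output blob name `prob`, and the file names `deploy.prototxt` / `model.caffemodel` are assumptions that depend on your own deploy.prototxt:

```python
import numpy as np

def preprocess(img, mean=None):
    """Convert an HxWxC RGB image in [0, 255] to Caffe's 1xCxHxW BGR blob."""
    blob = img.astype(np.float32)
    if mean is not None:
        blob -= mean                 # optional mean subtraction
    blob = blob[:, :, ::-1]          # RGB -> BGR (Caffe's channel order)
    blob = blob.transpose(2, 0, 1)   # HxWxC -> CxHxW
    return blob[np.newaxis, ...]     # add the batch dimension

def classify(net, img, mean=None):
    """Run one forward pass; return (predicted class, probability vector)."""
    net.blobs['data'].data[...] = preprocess(img, mean)
    out = net.forward()
    probs = out['prob'][0]           # assumes the output layer is named 'prob'
    return int(probs.argmax()), probs

# Typical usage with a real model (not run here):
#   import caffe
#   caffe.set_mode_cpu()
#   net = caffe.Net('deploy.prototxt', 'model.caffemodel', caffe.TEST)
#   label, probs = classify(net, img)
```

The same `classify` call works for every sample, so looping over a dataset and collecting per-class probabilities is straightforward.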

That means configuring Caffe on Ubuntu. For this I mainly followed this post: http://www.cnblogs.com/denny402/p/5088399.html

There are really two parts to it: the first is modifying Makefile.config, the second is fixing the problem of shared libraries (.so) not being found.

1. Modifying Makefile.config

The key is to edit the configuration file Makefile.config and then compile. My Makefile.config is as follows:

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
#    You should not set this flag if you will be reading LMDBs with any
#    possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
        -gencode arch=compute_20,code=sm_21 \
        -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_50,code=compute_50

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
MATLAB_DIR := /usr/local/MATLAB/R2014a
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
#        /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := $(HOME)/anaconda
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
         $(ANACONDA_HOME)/include/python2.7 \
         $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#                 /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @

The main thing to watch is the PYTHON_INCLUDE section. Since I have Anaconda2 installed on my system, I pointed PYTHON_INCLUDE at the Anaconda paths.
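A simple way to find the right paths for whichever interpreter you intend to use (a sketch; it assumes numpy is installed in that interpreter) is to ask the interpreter itself:

```python
import sys
import numpy

# sys.prefix is where ANACONDA_HOME should point for this interpreter,
# and numpy.get_include() is the numpy include path for PYTHON_INCLUDE.
print(sys.prefix)
print(numpy.get_include())
```

Run this with the same `python` you plan to `import caffe` from, and paste the printed paths into Makefile.config.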

After editing, go to the caffe root directory and run:

sudo make pycaffe

Once it compiles successfully, re-running the command will just print `Nothing to be done for 'pycaffe'`.

To guard against other errors, it is still worth building and running the tests:

sudo make test -j8
sudo make runtest -j8

2. Fixing missing shared libraries (.so)

Compilation itself gave me no trouble, but once I entered the Python environment and tried to import caffe, I ran into all kinds of problems. They mostly boil down to one type:

error while loading shared libraries: libhdf5.so.10: cannot open shared object file: No such file or directory

That is, a library file Caffe wants cannot be found. The post linked above (http://www.cnblogs.com/denny402/p/5088399.html) gives a way to fix this. The cause is roughly a missing dynamic link library; most of these libraries were in fact already installed earlier, under /usr/lib/x86_64-linux-gnu. Running `ll libhdf*` in that directory lists all the libhdf-related library files.

So the system already contains most of the libraries we need; the problem is simply that the version Caffe depends on does not match the version number installed on the system. This is much like how Caffe locates the cudnn library, except that cudnn apparently lives under /usr/local/lib.

Creating symlinks to the existing libraries solves the missing-.so problem, so:

cd /usr/lib/x86_64-linux-gnu
sudo ln -s libhdf5.so.7 libhdf5.so.10   # existing version -> version Caffe expects
sudo ldconfig

You may run into other missing-.so problems as well; basically they can all be solved with this same recipe.
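Before creating symlinks by hand, you can check from Python whether the dynamic linker can resolve a given library at all (a small helper of my own, not part of Caffe):

```python
import ctypes.util

def can_resolve(libname):
    """True if the system's dynamic linker can resolve `libname`
    (given without the 'lib' prefix or '.so' suffix)."""
    return ctypes.util.find_library(libname) is not None

# e.g. can_resolve('hdf5') tells you whether libhdf5 is visible at all;
# if it is visible but Caffe still fails, the installed version number
# differs from the one Caffe was linked against, and a symlink is needed.
```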

After that, import caffe no longer reports errors. To be safe, you can build and run the tests once more.
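One more pitfall worth noting: `make pycaffe` builds the module under caffe/python but does not install it into site-packages, so Python has to be told where to look. The path below is an assumption; substitute your own caffe checkout:

```shell
# add pycaffe to the module search path (adjust CAFFE_ROOT to your checkout)
export CAFFE_ROOT=$HOME/caffe
export PYTHONPATH=$CAFFE_ROOT/python:$PYTHONPATH
# after this, `python -c "import caffe"` should succeed
```

Putting the export lines into ~/.bashrc makes the setting permanent.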

