Environment:
Linux, Ubuntu 12.04 LTS
Java 8
Anaconda | Python 3.6 --> tensorflow-gpu==1.4
1. Prerequisites
To build and use TensorFlow Serving, you need to install Bazel and gRPC.
1.1 Installing Bazel
Bazel is Google's build tool; TensorFlow Serving currently requires version 0.5.4 or later.
Download the corresponding release from github/bazel; the installer-linux-XXX.sh package is recommended.
chmod +x bazel-0.5.4-installer-linux-x86_64.sh
./bazel-0.5.4-installer-linux-x86_64.sh --user # --user installs into the current user's home directory
Add the PATH entry to your .bashrc:
export PATH="$PATH:$HOME/bin" # $HOME is your home directory
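After reloading the shell you can do a quick sanity check that Bazel is on the PATH and new enough (nothing here is version specific):
source ~/.bashrc
bazel version # should report 0.5.4 or later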
1.2 Installing gRPC
gRPC is Google's RPC framework [2][3]. Its installation instructions are available here.
gRPC's dependency packages are installed as follows:
sudo apt-get update && sudo apt-get install -y \
build-essential \
curl \
libcurl3-dev \
git \
libfreetype6-dev \
libpng12-dev \
libzmq3-dev \
pkg-config \
software-properties-common \
swig \
zip \
zlib1g-dev
Next, install it inside Anaconda's root environment (there are several ways to install it; I chose the most convenient one --- pip):
pip install tensorflow-serving-api # pulls in gRPC as a dependency
You can also build gRPC from source; see the README.md on GitHub for details.
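A quick way to confirm that the package and its gRPC dependency are importable (just a sanity check; the module paths are the ones shipped with the 1.x serving API):
python -c "import grpc; print(grpc.__version__)"
python -c "from tensorflow_serving.apis import predict_pb2, prediction_service_pb2"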
2. Installing TensorFlow Serving from Source
Since some Google sites cannot be accessed directly, building locally is currently the best option.
2.1 Downloading TensorFlow Serving
Clone the repository from GitHub:
git clone --recurse-submodules https://github.com/tensorflow/serving
Note: --recurse-submodules is required to fetch TensorFlow, gRPC, and the other libraries that TensorFlow Serving depends on. These instructions install the latest master branch of TensorFlow Serving; if you want a specific branch (such as a release branch), pass -b to the git clone command, as shown below.
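For example, to clone a release branch instead of master (the branch name r1.4 below is an assumption; check the repository's branch list for the exact name):
git clone -b r1.4 --recurse-submodules https://github.com/tensorflow/serving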
2.2 Building TensorFlow Serving
After the download completes, change the working directory into serving/tensorflow and run the configuration.
Step 1: configuration
cd serving # change into the directory you just cloned
cd tensorflow
./configure # this step is interactive; a few points deserve special attention:
Currently CUDA 8.0 and CUDA 7.0 are supported.
When specifying the cuDNN version, if you downloaded cuDNN 7.0.3, enter 7.0.3, not 7.0.
cd .. # after configuration finishes, return to the parent directory
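If you prefer not to answer every prompt by hand, the configure script can also read answers from environment variables. A minimal sketch, assuming the variable names used by TensorFlow 1.4's configure script (fall back to the interactive prompts if they are not picked up):
export TF_NEED_CUDA=1
export TF_CUDA_VERSION=8.0
export TF_CUDNN_VERSION=7.0.3 # full version string, not just 7.0
export CUDA_TOOLKIT_PATH=/usr/local/cuda
export CUDNN_INSTALL_PATH=/usr/local/cuda # assumes cuDNN was unpacked into the CUDA directory
cd tensorflow && ./configure && cd ..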
Step 2: build
bazel build -c opt tensorflow_serving/...
If the build succeeds, the server binary can be run directly:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server
To verify the build, run the test suite:
bazel test -c opt tensorflow_serving/...
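Once the build and tests pass, the server can be started against an exported model. A hedged example (the model name and path below are placeholders; the flags are those of the 1.x model server):
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --port=9000 \
    --model_name=my_model \
    --model_base_path=/tmp/my_model # directory containing numeric version subdirectories, e.g. /tmp/my_model/1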
That wraps up the overall process of installing TensorFlow Serving.
The next post will cover the basic tutorial and advanced tutorial.
While looking through other material I noticed similar installation guides on CSDN, and they contain quite a few mistakes; anyone who reads the TensorFlow Serving installation documentation carefully should be able to avoid them.
Here is one example: http://blog.csdn.net/dingqingsong/article/details/77931339
The relevant material is actually explained on the TensorFlow Serving site, so I am quoting it below.
My point is simply that we should do things carefully and thoroughly. The references are placed at the end this time; I wrote down my own installation steps after deploying successfully by following them. The important characters always show up last.
Optimized build
It's possible to compile using some platform specific instruction sets (e.g. AVX) that can significantly improve performance. Wherever you see 'bazel build' in the documentation, you can add the flags -c opt --copt=-msse4.1 --copt=-msse4.2 --copt=-mavx --copt=-mavx2 --copt=-mfma --copt=-O3 (or some subset of these flags). For example:
bazel build -c opt --copt=-msse4.1 --copt=-msse4.2 --copt=-mavx --copt=-mavx2 --copt=-mfma --copt=-O3 tensorflow_serving/...
Note: These instruction sets are not available on all machines, especially with older processors, so it may not work with all flags. You can try some subset of them, or revert to just the basic '-c opt' which is guaranteed to work on all machines.
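Before adding these flags, you can check which of the instruction sets your CPU actually supports (a quick check on Linux):
grep -o -w 'sse4_1\|sse4_2\|avx\|avx2\|fma' /proc/cpuinfo | sort -u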
Continuous integration build
Our continuous integration build using the TensorFlow ci_build infrastructure offers you simplified development using docker. All you need is git and docker. No need to install all other dependencies manually.
git clone --recursive https://github.com/tensorflow/serving
cd serving
CI_TENSORFLOW_SUBMODULE_PATH=tensorflow tensorflow/tensorflow/tools/ci_build/ci_build.sh CPU bazel test //tensorflow_serving/...
Note: The serving directory is mapped into the container. You can develop outside the docker container (in your favourite editor), and when you run this build it will build with your changes.
References:
[2] Wikipedia: Remote procedure call (RPC)
[3] Zhihu | "Can someone explain in plain terms what an RPC framework is?"