Ubuntu 16.04 TensorFlow Serving
Prerequisites

Bazel

Using Bazel custom APT repository (recommended):
$ sudo apt-get install openjdk-8-jdk
$ echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list $ curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add - If you want to install the testing version of Bazel,replace stable with testing.
$ sudo apt-get update && sudo apt-get install bazel

Once installed, you can upgrade to a newer version of Bazel with:

$ sudo apt-get upgrade bazel

gRPC Python

$ sudo pip install grpcio

Package dependencies

$ sudo apt-get update && sudo apt-get install -y build-essential curl libcurl3-dev git libfreetype6-dev libpng12-dev libzmq3-dev pkg-config python-dev python-numpy python-pip software-properties-common swig zip zlib1g-dev

TensorFlow Serving Python API PIP package

$ pip install tensorflow-serving-api

Installing from source

Clone the TensorFlow Serving repository:

$ git clone --recurse-submodules https://github.com/tensorflow/serving
$ cd serving

--recurse-submodules is required to fetch TensorFlow, gRPC, and other libraries that TensorFlow Serving depends on.

Install prerequisites

Follow the Prerequisites section above to install all dependencies. To configure TensorFlow, run:

$ cd tensorflow
$ ./configure
$ cd ..

Build

$ bazel build -c opt tensorflow_serving/...

Binaries are placed in the bazel-bin directory and can be run using a command like:

$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server

To test your installation, execute:

$ sudo pip install autograd  # handles ImportError: No module named autograd
$ bazel test -c opt tensorflow_serving/...
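Independently of the Bazel tests, a quick Python-side sanity check can confirm that the two pip packages installed in the Prerequisites section are importable. This is just a minimal sketch, not part of the official test suite:

import grpc                                      # provided by grpcio
from tensorflow_serving.apis import predict_pb2  # provided by tensorflow-serving-api

# If both imports succeed, the Python client prerequisites are in place.
print('grpcio version:', grpc.__version__)
print('PredictRequest available:', predict_pb2.PredictRequest is not None)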
Serving a TensorFlow Model

Train And Export TensorFlow Model

$ rm -rf /tmp/mnist_model
$ bazel build -c opt //tensorflow_serving/example:mnist_saved_model
$ bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model
Training model...
...
Done training!
Exporting trained model to /tmp/mnist_model
Done exporting!

OR

$ python tensorflow_serving/example/mnist_saved_model.py /tmp/mnist_model

List the model files:

$ ls /tmp/mnist_model
1
$ ls /tmp/mnist_model/1
saved_model.pb variables

Each version sub-directory contains the following files:

saved_model.pb: the serialized SavedModel, containing one or more graph definitions of the model as well as metadata such as signatures.
variables: files that hold the serialized variables of the graphs.
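Under the hood, mnist_saved_model.py produces the saved_model.pb/variables layout above with TensorFlow's SavedModelBuilder API. The sketch below is a minimal, simplified illustration using a toy y = Wx + b graph; the export path, tensor names, and graph here are placeholders, not the exact ones the example script uses:

# Minimal sketch of exporting a model with SavedModelBuilder (TF 1.x API).
# The toy graph and names are illustrative only.
import tensorflow as tf

export_dir = '/tmp/toy_model/1'  # the version sub-directory, as shown above

x = tf.placeholder(tf.float32, shape=[None, 1], name='x')
w = tf.Variable([[2.0]], name='w')
b = tf.Variable([0.5], name='b')
y = tf.matmul(x, w) + b

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    # A predict signature tells the server which tensors are inputs/outputs.
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'x': x}, outputs={'y': y})
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                signature,
        })
    builder.save()  # writes saved_model.pb and variables/ into export_dir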
Load Exported Model With Standard TensorFlow ModelServer

$ bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server  # not needed if you already ran bazel build -c opt tensorflow_serving/... above
$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/

Test The Server

$ bazel build -c opt //tensorflow_serving/example:mnist_client
$ bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
...
Inference error rate: 10.4%

OR

$ python tensorflow_serving/example/mnist_client.py --num_tests=1000 --server=localhost:9000
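For reference, the core of what mnist_client does can be sketched with the grpcio and tensorflow-serving-api packages from the Prerequisites section. This is a minimal illustration, assuming a tensorflow-serving-api version that ships prediction_service_pb2_grpc; the random payload is a stand-in for a real MNIST image, while 'mnist', 'predict_images', 'images', and 'scores' match the example's signature:

# Minimal sketch of a gRPC Predict call against the server started above.
import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:9000')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'mnist'                 # --model_name used above
request.model_spec.signature_name = 'predict_images'

# Placeholder input: one random 28x28 image, flattened to 784 floats.
fake_image = np.random.rand(1, 784).astype(np.float32)
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(fake_image, shape=[1, 784]))

response = stub.Predict(request, timeout=10.0)
print(response.outputs['scores'])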
