Step 1: read this blog post:
https://www.jb51.net/article/138932.htm (a brief introduction to saving and restoring/loading TensorFlow models)
Step 2:
Reference blog:
- https://blog.csdn.net/u011734144/article/details/82107610
After configuring the relevant files according to the tutorial above (the model was produced by the tensorflow-serving steps below and then moved directly into textcnnrnn), run the following commands.
First, start the serving container:
ljj@debian:~$ docker run -p 8501:8501 --mount type=bind,source=/home/ljj/serving/tensorflow_serving/servables/tensorflow/testdata/textcnnrnn,target=/models/find_lemma_category -e MODEL_NAME=find_lemma_category -t tensorflow/serving
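Before calling the prediction endpoint, it is worth confirming that the container actually loaded the model. A minimal check, assuming the server is listening on localhost:8501 as started above; the layout in the comments is the standard TensorFlow Serving convention (one numeric version subdirectory per model), not necessarily the exact contents on this machine:
# The mounted model base path should contain a numeric version subdirectory, e.g.
#   textcnnrnn/1/saved_model.pb
#   textcnnrnn/1/variables/
ls -R /home/ljj/serving/tensorflow_serving/servables/tensorflow/testdata/textcnnrnn
# Ask the server for the model status; the reported state should be AVAILABLE
curl http://localhost:8501/v1/models/find_lemma_category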
Then call the prediction endpoint:
ljj@debian:~$ curl --tlsv1.2 -d '{"instances": [10,10,10,8,6,1,8,9,1]}' -X POST http://0.0.0.0:8501/v1/models/find_lemma_category:predict
But this returns an error:
{ "error": "instances is a plain list, but expecting list of objects as multiple input tensors required as per tensorinfo_map" }ljj@debian:~$
Related links for this error:
- https://www.jianshu.com/p/2fffd0e332bc
- https://blog.csdn.net/SEUer_jeff/article/details/75578732
- https://blog.csdn.net/wangjian1204/article/details/68928656
Reference tutorials:
- https://hub.docker.com/r/bitnami/tensorflow-serving
In this tutorial: https://github.com/tobegit3hub/tensorflow_template_application
on Ubuntu, the commands should be as follows:
python sparse_classifier.py --train_file ./data/cancer/cancer_train.csv.tfrecords --validate_file ./data/cancer/cancer_test.csv.tfrecords --feature_size 4 --label_size 3 --enable_colored_log
python dense_classifier.py --train_file ./data/cancer/cancer_train.csv.tfrecords --validate_file ./data/cancer/cancer_test.csv.tfrecords --feature_size 4 --label_size 3 --enable_colored_log
Running dense_classifier.py produces the checkpoint folder, which can then be served through rest_server in http_service. However, calling it from the browser still fails with an array index out of bounds error.
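If the script also exports a SavedModel for serving (an assumption here; adjust the directory to wherever the export actually lands), saved_model_cli can print the expected input names and shapes, which helps pin down the out-of-bounds problem:
# ./model/1 is a hypothetical export path; point --dir at the real SavedModel version directory
saved_model_cli show --dir ./model/1 --tag_set serve --signature_def serving_default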
Install a command-line text browser to make testing easier:
https://www.cnblogs.com/tsdxdx/p/7221132.html
Debian/Ubuntu: apt-get install w3m w3m-img
CentOS: yum install w3m w3m-img
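For example, w3m can then fetch the serving endpoints directly from the terminal (assuming the server from the steps above is still listening on localhost:8501):
# Dump the model status response to stdout
w3m -dump http://localhost:8501/v1/models/find_lemma_category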
Common issues
export is no longer supported (the old export format was replaced by SavedModel):
- https://www.jianshu.com/p/91aae37f1da6