Accelerating Deep Learning Model Deployment with TensorRT Inference Server
Hu Lin (胡麟), NVIDIA Senior Systems Architect
HOW TO EASILY DEPLOY DEEP LEARNING MODELS IN PRODUCTION
TENSORRT INFERENCE SERVER (TRTIS)
Hu Lin, linh@, 2019.10.22

AGENDA
- Inference deployment challenges
- TensorRT Inference Server benefits
- Features
- Deploy
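As a preview of the deploy topic, a TRTIS deployment centers on a model repository: each model gets a directory holding a `config.pbtxt` and numbered version subdirectories. The sketch below shows a minimal layout; the model name `resnet50`, the tensor names, and the dimensions are illustrative assumptions, not taken from the slides:

```
model_repository/
└── resnet50/                 # illustrative model name
    ├── config.pbtxt          # model configuration
    └── 1/                    # version 1
        └── model.plan        # serialized TensorRT engine

# config.pbtxt (minimal sketch):
name: "resnet50"
platform: "tensorrt_plan"
max_batch_size: 8
input [
  {
    name: "input"             # assumed tensor name
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output"            # assumed tensor name
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```

The server is then pointed at the repository root (e.g. with `--model-repository=/path/to/model_repository`) and serves all models it finds there.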