How to create inference configuration files for DeepStream (nvinferserver) for TAO Zoo models


I’m trying to use the TAO Zoo models with Triton and DeepStream via the nvinferserver component over gRPC.

I found the models and their pbtxt configuration files for Triton at GitHub - NVIDIA-AI-IOT/tao-toolkit-triton-apps (sample app code for deploying TAO Toolkit trained models to Triton),

and I found the DeepStream configuration files at deepstream_reference_apps/README.md at master · NVIDIA-AI-IOT/deepstream_reference_apps · GitHub. Those work for the nvinfer component, but not with nvinferserver.
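For context, here is a minimal sketch of the kind of nvinferserver config I’m after. This is the plugin’s protobuf-text format with the `backend.triton.grpc` block pointing at a remote Triton server; the model name, URL, class count, and preprocessing values below are placeholders I made up for illustration, not the actual values for any TAO model:

```
infer_config {
  unique_id: 1
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    triton {
      # hypothetical model name; must match the model in the Triton repository
      model_name: "peoplenet"
      version: -1
      grpc {
        # assumes Triton's gRPC endpoint is reachable at this address
        url: "localhost:8001"
      }
    }
  }
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_LINEAR
    normalize {
      # placeholder scale/offsets; the correct values depend on the model
      scale_factor: 0.00392156862745098
      channel_offsets: [0, 0, 0]
    }
  }
  postprocess {
    labelfile_path: "labels.txt"
    detection {
      num_detected_classes: 3
      simple_cluster {
        threshold: 0.2
      }
    }
  }
}
input_control {
  process_mode: PROCESS_MODE_FULL_FRAME
  interval: 0
}
```

What I can’t work out is the correct preprocess/postprocess settings to match each TAO model’s Triton pbtxt, which is why ready-made config files would help.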

Can you provide these configuration files? I think they would be valuable for many people using DeepStream with Triton.

Regards.
