@npuichigo I am trying to use Triton Inference Server with the TensorRT-LLM backend and Open WebUI as the frontend, but not all routes are provided, e.g. /v1/models, etc.
Is there any plan to support all OpenAI API v1 routes?
It would be really great to have full OpenAI API support, since KServe support is still a work in progress.
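For now I'm working around it with a small proxy in front of Triton, roughly like the sketch below. It serves the /v1/models route that Open WebUI queries on startup and forwards chat completions to the backend. The Triton base URL, port, and model name are assumptions for illustration, and streaming responses aren't handled:

```python
# Minimal shim: adds /v1/models and proxies /v1/chat/completions to Triton.
# TRITON_BASE and MODEL_ID are assumptions -- adjust for your deployment.
import httpx
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

TRITON_BASE = "http://localhost:9000"  # assumed Triton OpenAI-compatible endpoint
MODEL_ID = "tensorrt_llm"              # assumed model name served by Triton

app = FastAPI()


@app.get("/v1/models")
async def list_models():
    # Open WebUI calls this route to discover available models.
    return {
        "object": "list",
        "data": [{"id": MODEL_ID, "object": "model", "owned_by": "triton"}],
    }


@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
    # Forward the request body untouched to the backend.
    # Note: stream=True (SSE) is not handled in this sketch.
    body = await request.body()
    async with httpx.AsyncClient(timeout=None) as client:
        resp = await client.post(
            f"{TRITON_BASE}/v1/chat/completions",
            content=body,
            headers={"Content-Type": "application/json"},
        )
    return JSONResponse(content=resp.json(), status_code=resp.status_code)
```

Run it with e.g. `uvicorn shim:app --port 8080` and point Open WebUI at that port, but a native implementation in the server itself would obviously be much better.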