Triton Inference Server · GitHub
The Triton Inference Server provides an optimized cloud and edge inferencing solution. The organization also hosts:
- Triton Python, C++, and Java client libraries, and GRPC-generated client examples for Go, Java, and Scala.
- The Triton backend for TensorFlow 1 and TensorFlow 2.
- A Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python (a minimal sketch follows below).
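To illustrate what "logic implemented in Python" looks like with the Python backend, here is a minimal sketch of a model.py. The TritonPythonModel class, the execute entry point, and the pb_utils helpers are the backend's documented interface; the model itself, the tensor names INPUT0/OUTPUT0, and the add-one logic are placeholder assumptions for the example, and a real model also needs a matching config.pbtxt.

```python
# model.py -- minimal sketch of a Triton Python-backend model.
# Assumes a hypothetical model with one FP32 input "INPUT0" and one
# FP32 output "OUTPUT0" declared in its config.pbtxt.
import numpy as np
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    """Every Python-backend model implements a class with this name."""

    def execute(self, requests):
        """Receives a batch of inference requests and returns one
        pb_utils.InferenceResponse per request, in the same order."""
        responses = []
        for request in requests:
            # Look up the input tensor by the name declared in config.pbtxt.
            input0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            data = input0.as_numpy()

            # Arbitrary pre-/post-processing logic in plain Python/NumPy;
            # here it simply adds one to every element.
            result = (data + 1.0).astype(np.float32)

            out_tensor = pb_utils.Tensor("OUTPUT0", result)
            responses.append(
                pb_utils.InferenceResponse(output_tensors=[out_tensor])
            )
        return responses
```

The file is placed in the model repository under the model's version directory, and the server invokes execute with batches of requests; optional initialize and finalize methods can be added for per-model setup and teardown.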