-
Can you give me a sample, e.g. a docker-compose.yml? Thanks.
Answered by Leelongqi, Nov 16, 2023
Replies: 2 comments, 1 reply
-
Answer selected by oandreeva-nv
-
Coming back here, as I had some issues setting this up for propagated tracing with Datadog. In the end, here is what worked for me: if you use ddtrace==2.9.1, you won't need to implement the inject logic yourself when calling Triton.
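For context, the "inject logic" that newer ddtrace versions reportedly handle automatically amounts to attaching a W3C `traceparent` header to the HTTP request sent to Triton. A minimal illustrative sketch of what that header looks like (the helper below is hypothetical, not part of ddtrace or tritonclient):

```python
# Hypothetical sketch: building a W3C traceparent header by hand.
# In practice a tracer (ddtrace, opentelemetry) fills in the IDs of the
# currently active span; ddtrace>=2.9.1 reportedly injects this for you.
import secrets


def make_traceparent(trace_id=None, span_id=None):
    """Build a traceparent value: <version>-<trace-id>-<span-id>-<flags>."""
    trace_id = trace_id or secrets.token_hex(16)  # 32 hex chars
    span_id = span_id or secrets.token_hex(8)     # 16 hex chars
    return f"00-{trace_id}-{span_id}-01"          # 01 = sampled flag


headers = {"traceparent": make_traceparent()}
# These headers would then be passed along with the inference request, e.g.
# tritonclient.http's InferenceServerClient.infer(..., headers=headers),
# so Triton's OpenTelemetry trace joins the caller's trace.
```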
Triton-inference-server config:
  get trace data config: #5909
OpenTelemetry collector config:
  http receiver config: open-telemetry/opentelemetry-collector#5187
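Since the original question asked for a docker-compose.yml, here is a minimal sketch of how the two pieces above could be wired together. This is an assumption-laden illustration, not a config from this thread: the image tags, service names, model-repository path, and collector config filename are all placeholders, and the `--trace-config` flags follow Triton's OpenTelemetry tracing documentation.

```yaml
# Hypothetical docker-compose.yml sketch: Triton exporting OTLP/HTTP traces
# to an OpenTelemetry Collector. Image tags and paths are placeholders.
version: "3.8"
services:
  triton:
    image: nvcr.io/nvidia/tritonserver:23.10-py3   # pick your own tag
    command: >
      tritonserver --model-repository=/models
      --trace-config mode=opentelemetry
      --trace-config opentelemetry,url=http://otel-collector:4318/v1/traces
    volumes:
      - ./models:/models
    ports:
      - "8000:8000"   # HTTP inference
      - "8001:8001"   # gRPC inference
      - "8002:8002"   # metrics
    depends_on:
      - otel-collector

  otel-collector:
    image: otel/opentelemetry-collector:latest
    command: ["--config=/etc/otel-config.yaml"]
    volumes:
      - ./otel-config.yaml:/etc/otel-config.yaml
    ports:
      - "4318:4318"   # OTLP HTTP receiver

# Hypothetical companion otel-config.yaml enabling the HTTP receiver
# (this is what the linked collector issue is about):
#
#   receivers:
#     otlp:
#       protocols:
#         http:
#           endpoint: 0.0.0.0:4318
#   exporters:
#     debug: {}
#   service:
#     pipelines:
#       traces:
#         receivers: [otlp]
#         exporters: [debug]
```

From here, the `debug` exporter would be swapped for whatever backend you actually use (e.g. a Datadog agent, per the reply above).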