
Default OpenTelemetry sink by ENV variable #783

Open
codefromthecrypt opened this issue Jan 16, 2025 · 0 comments

🚀 Describe the new functionality needed

Let's support the following two variables for the OTLP endpoint when generating configuration, instead of hard-coding a constant.

OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf

We can skip the latter if we only want to support http/protobuf, which I'm OK with.
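As a minimal sketch of the proposal, config generation could read both variables with the current constants as fallbacks. The function name and defaults here are illustrative, not llama-stack's actual API:

```python
import os

def otlp_settings() -> dict:
    """Resolve OTLP exporter settings from the standard OpenTelemetry
    environment variables, falling back to today's hard-coded values.
    (Hypothetical helper; names and defaults are assumptions.)"""
    return {
        "endpoint": os.environ.get(
            "OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4318"
        ),
        "protocol": os.environ.get(
            "OTEL_EXPORTER_OTLP_PROTOCOL", "http/protobuf"
        ),
    }
```

With this in place, setting `OTEL_EXPORTER_OTLP_ENDPOINT` changes the generated endpoint, and unset variables behave exactly as before.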

💡 Why is this needed? What if we don't build it?

Right now, we can select otel via TELEMETRY_SINKS=otel and control its service name via OTEL_SERVICE_NAME. However, you can't change the OTLP endpoint without writing your own configuration file, because the endpoint is hard-coded rather than read from a variable.

This means first-timers will have to edit configuration when using the easiest way to start (docker), because inside a container localhost doesn't resolve to the host's localhost.

By supporting ENV defaults, they can change config like this:

OTEL_EXPORTER_OTLP_ENDPOINT=http://host.docker.internal:4318

Other thoughts

A workaround is to use --add-host=localhost:host-gateway, which can help folks use ollama and otel when both are running on the Docker host, like so:

docker run --rm --name llama-stack --tty -p 5000:5000 \
  --add-host=localhost:host-gateway \
  --env INFERENCE_MODEL=meta-llama/Llama-3.2-3B-Instruct \
  --env TELEMETRY_SINKS=otel \
  llamastack/distribution-ollama