Add Oodle as a supported integration for OpenLLMetry #168
mgaurav wants to merge 1 commit into traceloop:main from
Conversation
* **Add Oodle as a supported integration for OpenLLMetry**: Add Oodle (https://oodle.ai) integration documentation page and register it in the sidebar navigation. Oodle is an AI-native observability platform that supports OpenTelemetry (OTLP) for ingesting traces, metrics, and logs. The integration page documents how to configure `TRACELOOP_BASE_URL` and `TRACELOOP_HEADERS` to export LLM traces to Oodle.
* **Use standard OTLP endpoint with -otlp subdomain**: Switch from `traces_endpoint` with the `/v1/otlp/traces` path to the standard endpoint using the `-otlp` subdomain (e.g., `<instance>-otlp.collector.oodle.ai`), which supports standard OTLP paths (`/v1/traces`). This also enables direct SDK-to-Oodle integration without needing an intermediate OTel Collector.

---------

Co-authored-by: Vorflux AI <noreply@vorflux.com>
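The configuration described above can be sketched as a pair of environment variables. Note the instance name (`acme`) and the API-key header name (`X-API-KEY`) below are assumptions for illustration, not values confirmed by this PR; the real header name should come from the Oodle integration page.

```shell
# Sketch of the env-var configuration described above.
# "acme" is a hypothetical Oodle instance name, and the header name
# "X-API-KEY" is an assumption -- check Oodle's docs for the real one.
export TRACELOOP_BASE_URL="https://acme-otlp.collector.oodle.ai"
export TRACELOOP_HEADERS="X-API-KEY=<your-oodle-api-key>"

# With the -otlp subdomain, the SDK can export to the standard OTLP
# traces path directly, without an intermediate OTel Collector.
echo "Exporting traces to: ${TRACELOOP_BASE_URL}/v1/traces"
```

Because the endpoint serves the standard `/v1/traces` path, no custom `traces_endpoint` override is needed.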
📝 Walkthrough

This PR adds documentation for integrating Oodle (an AI-native observability platform) with OpenLLMetry/OpenTelemetry. It includes a new navigation entry and a comprehensive documentation page describing setup instructions and configuration approaches.
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~5 minutes
🚥 Pre-merge checks: ✅ 5 passed
🧹 Nitpick comments (1)
openllmetry/integrations/oodle.mdx (1)
42-45: Add a short secret-handling note for `TRACELOOP_HEADERS`.

Consider adding one sentence to avoid placing API keys directly in shell history or committed `.env` files (e.g., recommend a secret manager/CI secrets). This is a docs-only hardening improvement.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@openllmetry/integrations/oodle.mdx` around lines 42 - 45, Add a short secret-handling sentence after the TRACELOOP_HEADERS example: warn users not to store API keys directly in shell history or committed .env files and recommend using a secrets manager or CI secret injection (referencing the TRACELOOP_HEADERS and TRACELOOP_BASE_URL variables) so credentials are supplied securely at runtime instead of hardcoding them.
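The hardening the reviewer asks for could look like the following sketch: the key is read from a file-based secret (e.g. a Docker/CI secret mount) at runtime instead of being typed into the shell or committed in a `.env` file. The secret path and the `X-API-KEY` header name are hypothetical, not taken from the Oodle docs.

```shell
# Sketch: load the Oodle API key from a file-based secret (e.g. a
# Docker or CI secret mount) rather than hardcoding it. The path and
# header name here are hypothetical placeholders.
secret_file="${OODLE_SECRET_FILE:-/run/secrets/oodle_api_key}"

# For this runnable sketch only: fall back to a demo key when no real
# secret mount is present.
if [ ! -r "$secret_file" ]; then
  secret_file="$(mktemp)"
  printf 'demo-api-key' > "$secret_file"
fi

# The key never appears in shell history or committed files.
export TRACELOOP_HEADERS="X-API-KEY=$(cat "$secret_file")"
```

In CI, the same effect is achieved by defining the key as a masked secret variable and exporting `TRACELOOP_HEADERS` from it in the job script.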
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 0140e408-8436-4c26-ad15-ac6eb6df6ba5
📒 Files selected for processing (2)
- mint.json
- openllmetry/integrations/oodle.mdx