Adding Langsmith trace processor introduces huge latency to chat #529
Comments
@baskaryan, you added this integration - any ideas? Tracing isn't sync. It gathers events on the same thread/task, but the expectation is that exports occur in the background.
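For illustration, here is a minimal sketch of what "exports occur in the background" means in practice: the processor's hooks run on the caller's thread/task but only enqueue, and a worker thread does the actual export. The import paths and the `export_batch` function are assumptions for the sketch, not the SDK's or LangSmith's real exporter.

```python
# Minimal sketch (not the SDK's implementation): span/trace events are
# collected on the request thread, but the expensive export happens on a
# background worker, so the chat path only pays for an in-memory enqueue.
import queue
import threading

from agents import add_trace_processor          # assumed import path
from agents.tracing import TracingProcessor     # assumed import path


def export_batch(items):
    """Hypothetical exporter standing in for the real network call."""
    ...


class BackgroundExportProcessor(TracingProcessor):
    def __init__(self):
        self._queue: queue.Queue = queue.Queue()
        self._worker = threading.Thread(target=self._export_loop, daemon=True)
        self._worker.start()

    # The hooks below run on the caller's thread/task and must stay cheap.
    def on_trace_start(self, trace):
        pass

    def on_trace_end(self, trace):
        self._queue.put(trace)

    def on_span_start(self, span):
        pass

    def on_span_end(self, span):
        self._queue.put(span)

    def _export_loop(self):
        # Runs off the request path; latency here never blocks the chat.
        while True:
            item = self._queue.get()
            if item is None:
                break
            export_batch([item])

    def force_flush(self):
        pass

    def shutdown(self):
        self._queue.put(None)
        self._worker.join(timeout=5)


add_trace_processor(BackgroundExportProcessor())
```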
Hi @kmariunas, thanks for reporting. This does look like an issue with how we're tracing this specific integration. Mind creating an issue on the LangSmith SDK repo to follow up? In the meantime I'll work on a fix.
Currently our openai agents sdk calls create/update run methods without `trace_id` and `dotted_order`, which means tracing will happen in the main thread. This fix sets those variables in the agents integration. Issue discovered in: openai/openai-agents-python#529
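To make that description concrete, here is a rough sketch of the difference from the client's point of view. It assumes LangSmith's `Client.create_run`/`update_run` accept `trace_id` and `dotted_order` as keyword arguments and that the dotted-order convention is a compact UTC timestamp followed by the run id; both are my reading of the comment above, not code from the actual patch.

```python
# Rough sketch, not the integration's code: a run created with an explicit
# trace_id and dotted_order can be routed through LangSmith's background batch
# ingestion instead of being posted synchronously on the request thread.
import uuid
from datetime import datetime, timezone

from langsmith import Client

client = Client()

run_id = uuid.uuid4()
start = datetime.now(timezone.utc)
# Assumed convention: compact UTC timestamp + run id for a root run; a child
# run would append "." and its own segment to the parent's dotted_order.
dotted_order = f"{start.strftime('%Y%m%dT%H%M%S%f')}Z{run_id}"

client.create_run(
    name="agent_turn",
    run_type="chain",
    inputs={"question": "hi"},
    id=run_id,
    trace_id=run_id,            # root run: trace id equals the run id
    dotted_order=dotted_order,
    start_time=start,
)

client.update_run(
    run_id,
    outputs={"answer": "hello"},
    end_time=datetime.now(timezone.utc),
    trace_id=run_id,
    dotted_order=dotted_order,
)
```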
@kmariunas Released a fix for this in
Closing this out - thanks @angus-langchain!
Describe the bug
Hey, when we add a LangSmith trace processor, our latency goes through the roof. We suspect that tracing is not done asynchronously. Is there a way to make it async?
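A minimal sketch of the kind of setup involved, assuming the `OpenAIAgentsTracingProcessor` wrapper from `langsmith.wrappers` and the Agents SDK's `add_trace_processor` hook (import paths may differ slightly by version):

```python
# Sketch of the kind of setup that triggers the slowdown: the LangSmith
# processor is registered alongside the default OpenAI tracing exporter.
from agents import Agent, Runner, add_trace_processor
from langsmith.wrappers import OpenAIAgentsTracingProcessor

add_trace_processor(OpenAIAgentsTracingProcessor())

agent = Agent(name="assistant", instructions="Answer briefly.")
result = Runner.run_sync(agent, "Hello!")
print(result.final_output)
```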
This is before and after we removed LangSmith (but kept OpenAI) tracing:

Debug information
v0.0.6
Repro steps
Expected behavior
Tracing does not introduce any latency to the system.