Is there a way to pass my own custom OpenAI client for dspy to use?
I noticed dspy is using LiteLLM under the hood. Is there a way to provide it with a custom client?
By the way, I tried going through this, but it doesn't seem to work well, mainly because it creates a new trace for each dspy call; I would prefer the wrapped-client approach, which unifies everything under the same trace.
Thanks!
Is there a way to pass my own custom openAI client for dspy to use?
LiteLLM's custom API guide is probably helpful here. You can create your client and query it the same way as you would with other supported models, through dspy.LM.
The LangSmith LiteLLM integration is not fully supported in DSPy, but feel free to share additional details so we can help make it work!
Hi @arnavsinghvi11 !
Thanks, I tried this approach. I provided the wrapped OpenAI client using the custom provider, but I'm afraid I couldn't get it to work. It seems to be fitted for mocking use cases.
Hi there!
I'm using langsmith to instrument my LLM calls.
I had trouble making dspy work with it.
Usually langsmith instrumentation is done either by wrapping the OpenAI client (wrap_openai) or by decorating functions with the @traceable decorator.