litellm.APIConnectionError: APIConnectionError: OpenAIException - 'async_generator' object has no attribute 'get' #6086
Unanswered · deepakdeore2004 asked this question in Q&A · 0 replies
litellm gives the below error when `stream` is set in the curl request. Below is the configmap:
litellm version: litellm-database:main-v1.48.16
Calling the backend vLLM model directly returns a streaming response, but the same command fails when it goes through litellm. What could be wrong here?
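For context, the message in the title (`'async_generator' object has no attribute 'get'`) is the generic Python error raised when code treats a streaming response, which is an async generator, as if it were a dict. The sketch below is purely illustrative of that error pattern, not litellm's internals:

```python
# A streaming handler typically returns an async generator of chunks.
async def stream_chunks():
    yield {"choices": [{"delta": {"content": "hi"}}]}

gen = stream_chunks()  # an async_generator object, not a dict

try:
    # Calling a dict method on it raises the error seen in the title.
    gen.get("choices")
except AttributeError as e:
    print(e)  # 'async_generator' object has no attribute 'get'
```

So the traceback suggests some code path in the proxy received the streaming generator where it expected a parsed (non-streaming) response dict.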
here are some debug logs: