
Suggestion: introduce litellm for the LLM API #1

Open
dorbodwolf opened this issue Nov 2, 2024 · 0 comments
Comments


dorbodwolf commented Nov 2, 2024

I suggest introducing litellm for the LLM API. For example, litellm lets me conveniently call a local ollama model. Here is a snippet of my calling code:

while retry_count < self.max_retries:
    try:
        response = completion(
            model=f"{config.LLM_MODLE}",
            messages=[{"content": prompt, "role": "user"}],
            api_base=self.api_url,
        )
        # Extract the generated text
        content = response['choices'][0]['message']['content'].strip()
        logging.info(f"Generated content:\n {content}")

        # Post-process the generated content
        processed_content = self._post_process_single_article(content)
        return processed_content

    except (json.JSONDecodeError, KeyError) as e:
        retry_count += 1
        last_error = str(e)
        logging.warning(f"Generation attempt {retry_count} failed: {last_error}")

        if retry_count >= self.max_retries:
            logging.error(f"Reached maximum retry attempts ({self.max_retries}), generation failed")
            raise Exception(f"Failed to generate podcast content, last error: {last_error}")
You only need to import the package: from litellm import completion

The configuration file is as follows:

# LLM API settings
LLM_API_URL = "http://localhost:11434"
LLM_API_TOKEN = ""
LLM_MODLE = "ollama/gemma2:latest"