
Added logprobs handling in choice response and user can get the logit… #1236

Open
wants to merge 4 commits into base: master

Conversation

hengzzzhou (Collaborator)

Description

This update addresses the need to obtain the model output logits when using the CAMEL framework with a vLLM deployment. Since the original framework did not expose an interface for retrieving logits, two properties, logprobs and top_logprobs, were added to the vllm_config. In addition, the chat_agent response handling was modified so that the output logits can be logged and retrieved, enabling more detailed scoring information.
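For illustration, here is a minimal usage sketch rather than the exact diff from this PR. It assumes the new logprobs/top_logprobs fields are accepted by VLLMConfig once this change lands, and that the modified ChatAgent surfaces the returned log probabilities in the response info dictionary; the model name and the "logprobs" key used below are placeholders.

```python
# Minimal sketch under the assumptions stated above; not the exact PR diff.
from camel.agents import ChatAgent
from camel.configs import VLLMConfig
from camel.messages import BaseMessage
from camel.models import ModelFactory
from camel.types import ModelPlatformType

# With this PR, VLLMConfig is assumed to accept the two OpenAI-compatible
# sampling parameters: `logprobs` (return per-token log probabilities) and
# `top_logprobs` (how many alternative tokens to return per position).
model = ModelFactory.create(
    model_platform=ModelPlatformType.VLLM,
    model_type="Qwen/Qwen2.5-7B-Instruct",  # placeholder: any vLLM-served model
    model_config_dict=VLLMConfig(logprobs=True, top_logprobs=5).as_dict(),
)

agent = ChatAgent(
    system_message=BaseMessage.make_assistant_message(
        role_name="Scorer", content="You are a scoring model."
    ),
    model=model,
)

response = agent.step(
    BaseMessage.make_user_message(role_name="User", content="Rate this answer: ...")
)

# The modified response handling is expected to carry the log probabilities
# through to the agent response; the exact key is a placeholder here.
print(response.info.get("logprobs"))
```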

Motivation and Context

In the existing CAMEL framework there was no direct way to access the logits of model outputs, which are needed to deploy a PRM (process reward model) agent in CAMEL, since some PRMs use the logits of the output as the score. This change provides an interface for obtaining these logits, improving transparency and control over the model's output, which is especially useful for scoring tasks.

  • I have raised an issue to propose this change (required for new features and bug fixes)

Types of changes

What types of changes does your code introduce? Put an x in all the boxes that apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the example folder)

Implemented Tasks

  • Added logprobs and top_logprobs properties to vllm_config.
  • Modified chat_agent response handling to capture and log the output logits (a sketch of the extraction step follows this list).
  • Ensured that logits are properly logged and accessible from the model output.
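As a hedged illustration of the response-handling side (the actual implementation lives in the PR diff), the sketch below pulls per-token log probabilities out of an OpenAI-compatible ChatCompletion choice, which is the object shape a vLLM server returns; extract_logprobs is a hypothetical helper name, not a function added by this PR.

```python
# Hypothetical helper, not code from this PR: flattens the `logprobs` field of
# an OpenAI-compatible ChatCompletion choice into plain Python data so it can
# be logged or used as a PRM score.
def extract_logprobs(choice):
    if getattr(choice, "logprobs", None) is None:
        return None
    return [
        {
            "token": item.token,
            "logprob": item.logprob,
            # The most likely alternative tokens at this position, present
            # when `top_logprobs` was requested.
            "top_logprobs": [(alt.token, alt.logprob) for alt in item.top_logprobs],
        }
        for item in choice.logprobs.content
    ]
```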

Checklist

Go over all the following points, and put an x in all the boxes that apply. If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide. (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly. (required for a bug fix or a new feature)
  • I have updated the documentation accordingly.
