
Simple code causes max_length warning #1237

Open
ssamt opened this issue Nov 18, 2024 · 1 comment

ssamt commented Nov 18, 2024

from FlagEmbedding import FlagReranker

reranking_model = 'BAAI/bge-reranker-v2-m3'
reranker = FlagReranker(reranking_model)

if __name__ == '__main__':
    reranker.compute_score(['query', 'paragraph'])

/home/november/miniconda3/envs/november/lib/python3.11/site-packages/transformers/tokenization_utils_base.py:2888: UserWarning: max_length is ignored when padding=True and there is no truncation strategy. To pad to max length, use padding='max_length'.

This warning is raised on every iteration. How can I fix this?

hanhainebula (Collaborator) commented

Hello, @ssamt. This is caused by passing the max_length parameter to tokenizer.pad(). We have just fixed this issue in this commit; you can try the new code now. Looking forward to your feedback.
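
For reference, a minimal sketch of the underlying behaviour in the Hugging Face transformers tokenizer API. The model name, max_length value, and the two standalone input strings are only illustrative (they are not FlagEmbedding's actual internal calls); the point is how tokenizer.pad() reacts to the padding/max_length combination:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-v2-m3')

# Tokenize without padding, then pad in a separate step.
features = tokenizer(['query', 'paragraph'], truncation=True, max_length=512)

# Passing max_length together with padding=True emits the UserWarning,
# because max_length is ignored for dynamic ("longest") padding.
batch = tokenizer.pad(features, padding=True, max_length=512, return_tensors='pt')

# Either of these avoids the warning:
batch = tokenizer.pad(features, padding=True, return_tensors='pt')  # pad to the longest sequence in the batch
batch = tokenizer.pad(features, padding='max_length', max_length=512, return_tensors='pt')  # pad to max_length

Presumably the referenced fix adjusts the pad() call along these lines, which is why updating to the latest code makes the warning go away.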
