```python
from FlagEmbedding import FlagReranker

reranking_model = 'BAAI/bge-reranker-v2-m3'
reranker = FlagReranker(reranking_model)

if __name__ == '__main__':
    reranker.compute_score(['query', 'paragraph'])
```
/home/november/miniconda3/envs/november/lib/python3.11/site-packages/transformers/tokenization_utils_base.py:2888: UserWarning: max_length is ignored when padding=True and there is no truncation strategy. To pad to max length, use padding='max_length'.
This warning is raised on every iteration. How can I fix it?
Hello, @ssamt. This is caused by passing the max_length parameter to tokenizer.pad(). We just fixed this issue in this commit; you can try the new code now. Looking forward to your feedback.
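For context, here is a simplified sketch of the two padding strategies the warning distinguishes. `pad_batch` below is a hypothetical stand-in for what `tokenizer.pad()` does, not the actual transformers implementation: `padding=True` pads to the longest sequence in the batch and ignores `max_length` (which is why the warning fires), while `padding='max_length'` pads every sequence to `max_length`.

```python
# Simplified illustration of the padding strategies the UserWarning refers to.
# `pad_batch` is a hypothetical sketch, not the real transformers code.

def pad_batch(batch, padding=True, max_length=None, pad_id=0):
    """Pad lists of token ids.

    padding=True         -> pad to the longest sequence in the batch;
                            max_length is ignored (hence the warning).
    padding='max_length' -> pad every sequence to max_length.
    """
    if padding == 'max_length':
        if max_length is None:
            raise ValueError("padding='max_length' requires max_length")
        target = max_length
    else:
        target = max(len(seq) for seq in batch)
    return [seq + [pad_id] * (target - len(seq)) for seq in batch]


batch = [[101, 2003, 102], [101, 102]]

# padding=True: max_length has no effect; both sequences padded to length 3.
print(pad_batch(batch, padding=True, max_length=8))

# padding='max_length': both sequences padded to length 8.
print(pad_batch(batch, padding='max_length', max_length=8))
```

So either dropping `max_length` when `padding=True` (as the commit above does) or switching to `padding='max_length'` silences the warning.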