How to use zenlm/zen3-reranker with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="zenlm/zen3-reranker")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("zenlm/zen3-reranker")
model = AutoModelForSequenceClassification.from_pretrained("zenlm/zen3-reranker")
```
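Since the model loads as a sequence-classification head (a cross-encoder reranker), a hedged sketch of scoring and ranking query/document pairs might look like the following. The query and document strings are illustrative, and the single-logit output shape is an assumption based on typical reranker heads:

```python
# Sketch: scoring (query, document) pairs with a cross-encoder reranker.
# Assumes the model emits one relevance logit per pair, as is typical
# for rerankers exposed via AutoModelForSequenceClassification.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification


def rank_by_score(docs, scores):
    """Return documents sorted by descending relevance score."""
    return [d for d, _ in sorted(zip(docs, scores), key=lambda p: -p[1])]


def score_pairs(query, docs, model_name="zenlm/zen3-reranker"):
    """Score each (query, doc) pair; higher means more relevant."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    model.eval()
    # A cross-encoder sees the query and candidate together in one input.
    inputs = tokenizer(
        [query] * len(docs), docs,
        padding=True, truncation=True, return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze(-1).tolist()


# Example usage (downloads the model weights on first run):
# docs = ["A reranker reorders retrieved passages by relevance.",
#         "Bananas are rich in potassium."]
# scores = score_pairs("What is a reranker?", docs)
# print(rank_by_score(docs, scores))
```

The ranking helper is kept separate from the model call so the ordering logic can be reused with scores from any reranker.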
The model's `config.json` (excerpt):

```json
{
  "_name_or_path": "BAAI/bge-m3",
  "architectures": [
    "XLMRobertaForSequenceClassification"
  ],
```
It's BGE-based and only minimally fine-tuned ... with my setup I see nearly no difference from the base reranker.