cross-encoder-ettin-17m-BCE


This model is a cross-encoder based on jhu-clsp/ettin-encoder-17m. It was trained on MS MARCO with a binary cross-entropy (BCE) loss as part of a reproducibility study on training cross-encoders: "Reproducing and Comparing Distillation Techniques for Cross-Encoders"; see the paper for more details.

Model Description

This model is intended for re-ranking the top results returned by a first-stage retrieval system (such as BM25, a bi-encoder, or SPLADE).

  • Training Data: MS MARCO Passage
  • Language: English
  • Loss: binary cross-entropy (BCE)

Training can easily be reproduced using the associated repository. The exact training configuration used for this model is detailed in config.yaml.

Usage

Quick Start:

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Tokenizer comes from the base encoder; the scoring head comes from the fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained("jhu-clsp/ettin-encoder-17m")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-ettin-17m-BCE")

# Encode the (query, passage) pair jointly
features = tokenizer(
    "What is experimaestro ?",
    "Experimaestro is a powerful framework for ML experiments management...",
    padding=True, truncation=True, return_tensors="pt",
)

# The logit is the relevance score of the passage for the query
model.eval()
with torch.no_grad():
    scores = model(**features).logits
    print(scores)
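
To re-rank a candidate list, score every (query, passage) pair in a batch and sort by the resulting logit. The following is a minimal sketch reusing the tokenizer and model loaded above; the candidate passages and the sigmoid applied to the BCE-trained logit are illustrative assumptions, not part of the released code.

query = "What is experimaestro ?"
passages = [  # hypothetical candidates, e.g. the top results from BM25 or SPLADE
    "Experimaestro is a powerful framework for ML experiments management...",
    "MS MARCO is a large-scale passage ranking dataset.",
    "BM25 is a classical lexical retrieval function.",
]

# Batch-encode all (query, passage) pairs at once
features = tokenizer(
    [query] * len(passages), passages,
    padding=True, truncation=True, return_tensors="pt",
)

model.eval()
with torch.no_grad():
    logits = model(**features).logits.squeeze(-1)
    # The model was trained with BCE, so a sigmoid maps logits to [0, 1] if probabilities are needed
    probs = torch.sigmoid(logits)

# Sort passages by decreasing relevance score
ranking = sorted(zip(passages, probs.tolist()), key=lambda x: x[1], reverse=True)
for passage, score in ranking:
    print(f"{score:.3f}  {passage}")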

Evaluations

We provide evaluations of this cross-encoder re-ranking the top 1000 documents retrieved by naver/splade-v3-distilbert.
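
The per-dataset metrics below can be recomputed from a TREC-format run and the corresponding qrels, for instance with the ir_measures package. This is a hedged sketch of how such numbers are typically obtained; the file paths are placeholders and ir_measures is an assumption, not necessarily the evaluation tool used in the paper.

import ir_measures
from ir_measures import RR, nDCG

# Placeholder paths: a TREC-format qrels file and the re-ranked run produced by this model
qrels = ir_measures.read_trec_qrels("msmarco_dev.qrels")
run = ir_measures.read_trec_run("reranked_top1000.run")

# Aggregate RR@10 and nDCG@10 over all queries
results = ir_measures.calc_aggregate([RR@10, nDCG@10], qrels, run)
print(results)  # per-measure aggregate values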

Dataset            RR@10   nDCG@10
msmarco_dev        23.97   29.44
trec2019           69.26   50.88
trec2020           75.44   51.94
fever              56.09   58.88
arguana            13.07   19.57
climate_fever      16.67   11.92
dbpedia            45.06   23.29
fiqa               32.23   25.35
hotpotqa           67.78   51.21
nfcorpus           36.68   20.49
nq                 31.84   37.01
quora              65.88   67.70
scidocs            19.01   10.16
scifact            55.57   58.18
touche             57.47   30.63
trec_covid         74.72   51.14
robust04           43.73   22.74
lotte_writing      44.62   36.79
lotte_recreation   43.38   39.14
lotte_science      34.20   28.39
lotte_technology   31.35   27.11
lotte_lifestyle    56.02   46.79
Mean In Domain     56.22   44.09
BEIR 13            44.01   35.81
LoTTE (OOD)        42.22   33.49
