The two-click reproduction matrix below provides commands for reproducing the experimental results reported in the following paper. Numbered rows correspond to tables in the paper; additional conditions are included for comparison.
Xueguang Ma, Ronak Pradeep, Rodrigo Nogueira, and Jimmy Lin. Document Expansions and Learned Sparse Lexical Representations for MS MARCO V1 and V2. Proceedings of the 45th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2022), July 2022.
|      | Condition | TREC 2021 |         |        |       |      | dev    |      | dev2   |      |
|:-----|:----------|----------:|--------:|-------:|------:|-----:|-------:|-----:|-------:|-----:|
|      |           | AP        | nDCG@10 | RR@100 | R@100 | R@1K | RR@100 | R@1K | RR@100 | R@1K |
| (1a) | BM25 doc (k1=0.9, b=0.4) | 0.2126 | 0.5116 | 0.8367 | 0.3195 | 0.6739 | 0.1572 | 0.8054 | 0.1659 | 0.8029 |
| (1b) | BM25 doc segmented (k1=0.9, b=0.4) | 0.2436 | 0.5776 | 0.8937 | 0.3478 | 0.6930 | 0.1896 | 0.8542 | 0.1930 | 0.8549 |
| (1c) | BM25+RM3 doc (k1=0.9, b=0.4) | 0.2452 | 0.5304 | 0.7914 | 0.3376 | 0.7341 | 0.0974 | 0.7699 | 0.1033 | 0.7736 |
| (1d) | BM25+RM3 doc segmented (k1=0.9, b=0.4) | 0.2936 | 0.6189 | 0.9076 | 0.3890 | 0.7678 | 0.1660 | 0.8608 | 0.1702 | 0.8639 |
| (2a) | BM25 w/ doc2query-T5 doc (k1=0.9, b=0.4) | 0.2387 | 0.5792 | 0.8866 | 0.3443 | 0.7066 | 0.2011 | 0.8614 | 0.2012 | 0.8568 |
| (2b) | BM25 w/ doc2query-T5 doc segmented (k1=0.9, b=0.4) | 0.2683 | 0.6289 | 0.9454 | 0.3656 | 0.7202 | 0.2226 | 0.8982 | 0.2234 | 0.8952 |
| (2c) | BM25+RM3 w/ doc2query-T5 doc (k1=0.9, b=0.4) | 0.2611 | 0.5375 | 0.8255 | 0.3580 | 0.7574 | 0.1141 | 0.8191 | 0.1170 | 0.8247 |
| (2d) | BM25+RM3 w/ doc2query-T5 doc segmented (k1=0.9, b=0.4) | 0.3191 | 0.6559 | 0.8989 | 0.4131 | 0.7948 | 0.1975 | 0.9002 | 0.1978 | 0.8972 |
| (3a) | uniCOIL (noexp): pre-encoded queries | 0.2587 | 0.6495 | 0.9282 | 0.3563 | 0.6787 | 0.2231 | 0.8987 | 0.2314 | 0.8995 |
| (3b) | uniCOIL (w/ doc2query-T5): pre-encoded queries | 0.2718 | 0.6783 | 0.9684 | 0.3700 | 0.7069 | 0.2419 | 0.9122 | 0.2445 | 0.9172 |
|      | uniCOIL (noexp): on-the-fly query inference | 0.2589 | 0.6501 | 0.9282 | 0.3574 | 0.6782 | 0.2232 | 0.8987 | 0.2314 | 0.8993 |
|      | uniCOIL (w/ doc2query-T5): on-the-fly query inference | 0.2720 | 0.6782 | 0.9684 | 0.3702 | 0.7071 | 0.2419 | 0.9120 | 0.2447 | 0.9174 |

On the interactive version of this page, each row expands to show the commands for generating the run on the TREC 2021, dev, and dev2 queries, along with the corresponding evaluation commands.
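As an illustration of what a row's command pair looks like, the sketch below follows the general shape of Pyserini's two-click reproduction commands for a BM25 condition such as row (1a): one invocation of `pyserini.search.lucene` to generate the run, followed by `pyserini.eval.trec_eval` invocations for each reported metric. The index name (`msmarco-v2-doc`), topic set (`dl21`), qrels name (`dl21-doc`), and output filename here are assumptions, not the exact identifiers from the original page; consult the Pyserini documentation for the authoritative names.

```shell
# Generate a BM25 run on the TREC 2021 (DL21) queries.
# NOTE: index/topic/qrels names are assumed for illustration.
python -m pyserini.search.lucene \
  --index msmarco-v2-doc \
  --topics dl21 \
  --output run.msmarco-v2-doc.bm25.dl21.txt \
  --bm25 --hits 1000

# Evaluate the run. For the TREC 2021 judgments, AP, RR@100, and the
# recall metrics are computed with a relevance cutoff of 2 (-l 2).
python -m pyserini.eval.trec_eval -c -M 100 -m map -l 2 dl21-doc \
  run.msmarco-v2-doc.bm25.dl21.txt
python -m pyserini.eval.trec_eval -c -m ndcg_cut.10 dl21-doc \
  run.msmarco-v2-doc.bm25.dl21.txt
python -m pyserini.eval.trec_eval -c -M 100 -m recip_rank -l 2 dl21-doc \
  run.msmarco-v2-doc.bm25.dl21.txt
python -m pyserini.eval.trec_eval -c -m recall.100 -l 2 dl21-doc \
  run.msmarco-v2-doc.bm25.dl21.txt
python -m pyserini.eval.trec_eval -c -m recall.1000 -l 2 dl21-doc \
  run.msmarco-v2-doc.bm25.dl21.txt
```

The dev and dev2 conditions follow the same pattern with their own topic and qrels names, evaluating only RR@100 and R@1K; segmented-document conditions additionally aggregate passage-level hits back to documents at search time.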