How to use Raderspace/RaDeR_Qwen25_3B_NuminaMath_MATH_allquerytypes with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="Raderspace/RaDeR_Qwen25_3B_NuminaMath_MATH_allquerytypes")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Raderspace/RaDeR_Qwen25_3B_NuminaMath_MATH_allquerytypes")
model = AutoModel.from_pretrained("Raderspace/RaDeR_Qwen25_3B_NuminaMath_MATH_allquerytypes")
```
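When loading the model directly with `AutoModel`, the raw hidden states still need to be pooled into a single embedding. The vLLM pooler config below uses last-token pooling with L2 normalization (`{"pooling_type": "LAST", "normalize": true}`); a minimal sketch of that pooling step, written as a hypothetical pure-Python helper (not part of the RaDeR codebase) operating on nested lists in place of real model outputs:

```python
import math

def last_token_pool(hidden_states, attention_mask):
    """Pick the hidden state of the last non-padding token and L2-normalize it.

    hidden_states: (batch, seq_len, dim) nested lists of final-layer states
    attention_mask: (batch, seq_len) nested lists of 1s (tokens) and 0s (padding)
    """
    pooled = []
    for states, mask in zip(hidden_states, attention_mask):
        last = sum(mask) - 1          # index of the last real token
        vec = states[last]
        norm = math.sqrt(sum(x * x for x in vec))
        pooled.append([x / norm for x in vec])
    return pooled

# Toy example: one sequence of length 2, dim 2; the second token is padding
states = [[[3.0, 4.0], [0.0, 0.0]]]
mask = [[1, 0]]
print(last_token_pool(states, mask))  # [[0.6, 0.8]]
```

With real Transformers outputs the same logic applies to `outputs.last_hidden_state` and the tokenizer's `attention_mask`.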
Model Card for RaDeR_Qwen25_3B_NuminaMath_MATH_allquerytypes
RaDeR is a set of reasoning-based dense retrieval and reranker models trained on data derived from mathematical problem solving with large language models (LLMs). Although trained for mathematical reasoning, RaDeR retrievers generalize effectively to diverse reasoning-intensive retrieval tasks in the BRIGHT and RAR-b benchmarks, consistently outperforming strong baselines in overall performance.
Model Details
Model Description
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by: CIIR, UMass Amherst
- Model type: Retriever
- Language(s): English
- License: MIT
- Finetuned from model: Qwen/Qwen2.5-3B-Instruct
Model Sources
- Repository: https://github.com/Debrup-61/RaDeR
- Paper: https://huggingface.co/papers/2505.18405
How to Get Started with the Model
Run the following command to serve the model with vLLM for fast inference.
```shell
vllm serve Raderspace/RaDeR_Qwen25_3B_NuminaMath_MATH_allquerytypes \
    --task embed \
    --trust-remote-code \
    --override-pooler-config '{"pooling_type": "LAST", "normalize": true}' \
    --gpu-memory-utilization 0.9 \
    --api-key abc \
    --tokenizer Qwen/Qwen2.5-3B-Instruct \
    --port 8001 \
    --disable-log-requests \
    --max-num-seqs 5000
```
Follow the code on GitHub to see how to query the retriever server.
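The repository documents the exact query protocol; purely as an illustration, a vLLM server started with `--task embed` exposes an OpenAI-compatible `/v1/embeddings` endpoint, so a request against the command above could be sketched as follows. The `embed` and `rank` helpers are hypothetical, not part of the RaDeR codebase; since the pooler config normalizes embeddings, cosine similarity reduces to a dot product.

```python
import json
import urllib.request

SERVER = "http://localhost:8001/v1/embeddings"  # port from the vllm serve command
MODEL = "Raderspace/RaDeR_Qwen25_3B_NuminaMath_MATH_allquerytypes"

def embed(texts, url=SERVER, api_key="abc"):
    """Request embeddings from the vLLM OpenAI-compatible server."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"model": MODEL, "input": texts}).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return [d["embedding"] for d in data["data"]]

def rank(query_emb, doc_embs):
    """Rank documents by dot product (embeddings are L2-normalized,
    so this equals cosine similarity). Returns indices, best first."""
    scores = [sum(q * d for q, d in zip(query_emb, doc)) for doc in doc_embs]
    return sorted(range(len(scores)), key=lambda i: -scores[i])

# Example (requires the server above to be running):
# q = embed(["Prove that the sum of two even numbers is even."])[0]
# docs = embed(["Even numbers are divisible by 2.", "The sky is blue."])
# print(rank(q, docs))
```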
Training Details
Training Data
The model was trained using the NuminaMath+MATH retrieval training dataset from RaDeR, containing all query types.
Software
https://github.com/Debrup-61/RaDeR
Citation
BibTeX:
```bibtex
@misc{das2025raderreasoningawaredenseretrieval,
  title={RaDeR: Reasoning-aware Dense Retrieval Models},
  author={Debrup Das and Sam O' Nuallain and Razieh Rahimi},
  year={2025},
  eprint={2505.18405},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2505.18405},
}
```
Model Card Contact
Debrup Das: debrupdas@umass.edu