### Prerequisites

We strongly recommend installing `flash-attn`. If it is not installed, switch to `attention_impl="sdpa"`.

Currently we only support `torch==2.8.0`. For newer PyTorch versions, please build flash-attn manually; otherwise throughput may be low.
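The fallback above can be automated at load time. A minimal sketch of selecting the attention implementation based on whether `flash-attn` is importable; the `"flash_attention_2"` string is the usual Transformers value and is an assumption here, not taken from this README:

```python
import importlib.util

# Use flash-attn when the package is importable, otherwise fall back to
# PyTorch's scaled-dot-product attention (SDPA) kernel.
# NOTE: "flash_attention_2" is the conventional Transformers identifier;
# adjust it if this model expects a different string.
attention_impl = (
    "flash_attention_2"
    if importlib.util.find_spec("flash_attn") is not None
    else "sdpa"
)
print(attention_impl)
```

The resulting `attention_impl` value can then be passed through to the model-loading call described below.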

```bash
pip install torch==2.8.0 torchvision==0.23.0 --index-url https://download.pytorch.org/whl/cu128
pip install transformers pillow requests
pip install flash-attn --no-build-isolation
```

### Inference Code

If you use this model, please cite:

```bibtex
@misc{huang2025beyond,
  author = {Huang, Xin and Tan, Kye Min},
  title = {Beyond Text: Unlocking True Multimodal, End-to-end RAG with Tomoro ColQwen3},
  year = {2025},
  url = {https://tomoro.ai/insights/beyond-text-unlocking-true-multimodal-end-to-end-rag-with-tomoro-colqwen3},
  publisher = {Tomoro.ai}
}
```