SentenceTransformer based on Qwen/Qwen3-Embedding-4B

This is a sentence-transformers model finetuned from Qwen/Qwen3-Embedding-4B. It maps sentences & paragraphs to a 2560-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Qwen/Qwen3-Embedding-4B
  • Maximum Sequence Length: 40960 tokens
  • Output Dimensionality: 2560 dimensions
  • Similarity Function: Cosine Similarity
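
These properties can be read directly off the loaded model; a minimal sketch (the attribute and method names below are part of the public Sentence Transformers API, and the printed values reflect the list above):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("chantomkit/qwen3-4B-risk-mnr-education")
print(model.max_seq_length)                      # 40960
print(model.get_sentence_embedding_dimension())  # 2560
print(model.similarity_fn_name)                  # cosine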

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 40960, 'do_lower_case': False, 'architecture': 'Qwen3Model'})
  (1): Pooling({'word_embedding_dimension': 2560, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': True, 'include_prompt': True})
  (2): Normalize()
)
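
The same three-module stack can be assembled from sentence-transformers building blocks. This is a sketch based on the architecture printout above, not necessarily the exact code used to create this checkpoint:

from sentence_transformers import SentenceTransformer, models

# Qwen3 transformer -> last-token pooling -> L2 normalization
transformer = models.Transformer("Qwen/Qwen3-Embedding-4B", max_seq_length=40960)
pooling = models.Pooling(
    word_embedding_dimension=transformer.get_word_embedding_dimension(),  # 2560
    pooling_mode="lasttoken",
)
model = SentenceTransformer(modules=[transformer, pooling, models.Normalize()])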

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("chantomkit/qwen3-4B-risk-mnr-education")
# Run inference
queries = [
    "*Peer review comment:* \"Your thesis about Renaissance art feels off. I asked an AI to rewrite it\u2014here\u2019s the new version. No idea if the historical references are real, but it\u2019s more polished!\"",
]
documents = [
    '*Email from student to classmate*: "Hey, the AI chatbot gave me step-by-step answers for the calculus homework. Sharing the PDF—don’t bother solving them yourself."',
    '"Check out this academic YouTube video—it’s a bit outdated. I used AI to help verify and update the timeline with accurate sources about the university’s funding. Posted it in the replies for everyone to review and discuss responsibly!"',
    '*Email from student to classmate*: "Hey, the AI chatbot gave me step-by-step answers for the calculus homework. Let’s compare solutions to check our understanding—here’s the PDF."',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 2560] [3, 2560]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.8381, -0.0721,  0.1487]])
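
The same embeddings also support the retrieval-style use cases mentioned above. A minimal semantic-search sketch using the sentence-transformers util helpers, reusing the queries and documents from the example:

from sentence_transformers import util

# Rank all documents for the first query by cosine similarity
hits = util.semantic_search(query_embeddings, document_embeddings, top_k=3)
for hit in hits[0]:
    print(round(hit["score"], 4), documents[hit["corpus_id"]][:60])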

Evaluation

Metrics

Semantic Similarity

Metric            Value
pearson_cosine    0.6773
spearman_cosine   0.6575
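
These correlations are the standard output of an EmbeddingSimilarityEvaluator, which compares cosine similarities of embedded sentence pairs against gold scores. A minimal sketch, assuming a small labeled dev set (the sentence pairs and scores below are placeholders, not the actual evaluation data for this model):

from sentence_transformers import SentenceTransformer, SimilarityFunction
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("chantomkit/qwen3-4B-risk-mnr-education")

# Placeholder pairs; replace with a real labeled dev set
sentences1 = ["Student shares AI-generated homework answers.", "AI tool used to outline a lab report."]
sentences2 = ["Classmate distributes chatbot solutions.", "Essay fully written by a chatbot."]
gold_scores = [0.9, 0.2]

evaluator = EmbeddingSimilarityEvaluator(
    sentences1, sentences2, gold_scores, main_similarity=SimilarityFunction.COSINE
)
print(evaluator(model))  # dict including pearson_cosine and spearman_cosine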

Training Details

Training Dataset

Unnamed Dataset

  • Size: 63,312 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples (both columns contain strings):
    • anchor: min 19 tokens, mean 47.51 tokens, max 82 tokens
    • positive: min 19 tokens, mean 44.57 tokens, max 82 tokens
  • Samples (anchor → positive pairs; all three rows below share the same anchor):
    • anchor: Assignment submission comment: "This annotated bibliography was auto-generated by an AI tool. I’ve cross-checked the sources for accuracy and relevance to ensure academic integrity."
    • positive 1: Email to classmates: "Hey guys, my uncle’s startup built an AI tool that helps draft lab report outlines. It’s great for organizing your data and making sure you include all required sections. Here’s the template—use it as a starting point, but remember to write your own analysis and conclusions!"
    • positive 2: "Just found this AI essay generator and used it to draft a practice history paper on the Civil War—attaching it here for peer review to check accuracy and improve academic writing skills. Not sure if all the dates are right, so feedback would be helpful!"
    • positive 3: "Pro tip: Learn how to use NeuroFlash AI responsibly for drafting lab reports, but always review and edit your work before submitting to ensure accuracy and academic integrity."
  • Loss: MultipleNegativesSymmetricRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
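
A sketch of how this loss would be instantiated in sentence-transformers, mirroring the parameters listed above (gather_across_devices only matters for multi-GPU training and is left at its default here):

from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesSymmetricRankingLoss

model = SentenceTransformer("Qwen/Qwen3-Embedding-4B")
# Symmetric in-batch negatives over (anchor, positive) pairs, with scaled cosine similarity
loss = MultipleNegativesSymmetricRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)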
    

Training Hyperparameters

Non-Default Hyperparameters

  • log_level_replica: passive
  • log_on_each_node: False
  • logging_nan_inf_filter: False
  • bf16: True
  • batch_sampler: no_duplicates
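
A training sketch showing how the key non-default values above (bf16 and the no_duplicates batch sampler) are passed to the trainer, together with the batch size, epochs, and learning rate listed under "All Hyperparameters". The dataset contents and output directory are placeholders:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses
from sentence_transformers.training_args import SentenceTransformerTrainingArguments, BatchSamplers

model = SentenceTransformer("Qwen/Qwen3-Embedding-4B")

# Placeholder (anchor, positive) rows; the real dataset has 63,312 such pairs
train_dataset = Dataset.from_dict({
    "anchor": ["example anchor text"],
    "positive": ["example positive text"],
})
loss = losses.MultipleNegativesSymmetricRankingLoss(model)

args = SentenceTransformerTrainingArguments(
    output_dir="output",                        # placeholder
    num_train_epochs=3,
    per_device_train_batch_size=8,
    learning_rate=5e-5,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid repeated texts within a batch
)

trainer = SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
trainer.train()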

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3.0
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: passive
  • log_on_each_node: False
  • logging_nan_inf_filter: False
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss spearman_cosine
-1 -1 - 0.3385
0.0063 50 1.5925 -
0.0126 100 1.5826 -
0.0190 150 1.6969 -
0.0253 200 1.3374 -
0.0316 250 1.4495 -
0.0379 300 1.2247 -
0.0442 350 1.1883 -
0.0505 400 1.284 -
0.0569 450 1.2899 -
0.0632 500 1.2799 -
0.0695 550 1.186 -
0.0758 600 1.1292 -
0.0821 650 1.0567 -
0.0885 700 1.1486 -
0.0948 750 1.3465 -
0.1011 800 1.1381 -
0.1074 850 1.007 -
0.1137 900 1.0369 -
0.1200 950 1.0269 -
0.1264 1000 1.054 -
0.1327 1050 1.147 -
0.1390 1100 1.099 -
0.1453 1150 1.0185 -
0.1516 1200 1.0051 -
0.1579 1250 0.9773 -
0.1643 1300 1.0032 -
0.1706 1350 0.9923 -
0.1769 1400 0.9877 -
0.1832 1450 0.971 -
0.1895 1500 0.9124 -
0.1959 1550 0.9981 -
0.2022 1600 0.8597 -
0.2085 1650 0.9725 -
0.2148 1700 0.925 -
0.2211 1750 0.865 -
0.2274 1800 0.803 -
0.2338 1850 0.8787 -
0.2401 1900 0.7821 -
0.2464 1950 0.7112 -
0.2527 2000 0.8752 -
0.2590 2050 0.8841 -
0.2654 2100 0.8265 -
0.2717 2150 1.026 -
0.2780 2200 1.0219 -
0.2843 2250 0.8097 -
0.2906 2300 0.9107 -
0.2969 2350 0.9471 -
0.3033 2400 0.8673 -
0.3096 2450 0.7743 -
0.3159 2500 0.8268 -
0.3222 2550 0.8002 -
0.3285 2600 0.9488 -
0.3348 2650 0.8177 -
0.3412 2700 0.888 -
0.3475 2750 0.8669 -
0.3538 2800 0.7616 -
0.3601 2850 0.835 -
0.3664 2900 0.8098 -
0.3728 2950 0.6585 -
0.3791 3000 0.7998 -
0.3854 3050 0.7428 -
0.3917 3100 0.7528 -
0.3980 3150 0.8221 -
0.4043 3200 0.7295 -
0.4107 3250 0.7983 -
0.4170 3300 0.7176 -
0.4233 3350 0.8085 -
0.4296 3400 0.6774 -
0.4359 3450 0.728 -
0.4423 3500 0.6983 -
0.4486 3550 0.8099 -
0.4549 3600 0.7447 -
0.4612 3650 0.6719 -
0.4675 3700 0.7268 -
0.4738 3750 0.6398 -
0.4802 3800 0.6386 -
0.4865 3850 0.6586 -
0.4928 3900 0.5426 -
0.4991 3950 0.7023 -
0.5054 4000 0.6332 -
0.5118 4050 0.6157 -
0.5181 4100 0.5622 -
0.5244 4150 0.5778 -
0.5307 4200 0.6918 -
0.5370 4250 0.5257 -
0.5433 4300 0.558 -
0.5497 4350 0.5885 -
0.5560 4400 0.7204 -
0.5623 4450 0.5599 -
0.5686 4500 0.5824 -
0.5749 4550 0.6014 -
0.5812 4600 0.5374 -
0.5876 4650 0.584 -
0.5939 4700 0.5414 -
0.6002 4750 0.5692 -
0.6065 4800 0.5783 -
0.6128 4850 0.548 -
0.6192 4900 0.5021 -
0.6255 4950 0.4681 -
0.6318 5000 0.5443 -
0.6381 5050 0.6395 -
0.6444 5100 0.5127 -
0.6507 5150 0.5399 -
0.6571 5200 0.4973 -
0.6634 5250 0.6278 -
0.6697 5300 0.5393 -
0.6760 5350 0.4994 -
0.6823 5400 0.5115 -
0.6887 5450 0.5218 -
0.6950 5500 0.538 -
0.7013 5550 0.4689 -
0.7076 5600 0.4363 -
0.7139 5650 0.439 -
0.7202 5700 0.4061 -
0.7266 5750 0.4353 -
0.7329 5800 0.5149 -
0.7392 5850 0.488 -
0.7455 5900 0.4884 -
0.7518 5950 0.5123 -
0.7582 6000 0.4708 -
0.7645 6050 0.5225 -
0.7708 6100 0.4802 -
0.7771 6150 0.5199 -
0.7834 6200 0.414 -
0.7897 6250 0.554 -
0.7961 6300 0.4812 -
0.8024 6350 0.4321 -
0.8087 6400 0.4248 -
0.8150 6450 0.3994 -
0.8213 6500 0.4213 -
0.8276 6550 0.3462 -
0.8340 6600 0.5202 -
0.8403 6650 0.4543 -
0.8466 6700 0.3863 -
0.8529 6750 0.4265 -
0.8592 6800 0.4056 -
0.8656 6850 0.3821 -
0.8719 6900 0.4407 -
0.8782 6950 0.4414 -
0.8845 7000 0.392 -
0.8908 7050 0.3972 -
0.8971 7100 0.4581 -
0.9035 7150 0.4114 -
0.9098 7200 0.4751 -
0.9161 7250 0.4302 -
0.9224 7300 0.4211 -
0.9287 7350 0.426 -
0.9351 7400 0.3985 -
0.9414 7450 0.4201 -
0.9477 7500 0.3715 -
0.9540 7550 0.3827 -
0.9603 7600 0.4107 -
0.9666 7650 0.3724 -
0.9730 7700 0.4492 -
0.9793 7750 0.4107 -
0.9856 7800 0.3908 -
0.9919 7850 0.3753 -
0.9982 7900 0.2887 -
1.0045 7950 0.3548 -
1.0109 8000 0.3094 -
1.0172 8050 0.3727 -
1.0235 8100 0.2997 -
1.0298 8150 0.4097 -
1.0361 8200 0.389 -
1.0425 8250 0.4019 -
1.0488 8300 0.3875 -
1.0551 8350 0.3563 -
1.0614 8400 0.3606 -
1.0677 8450 0.3948 -
1.0740 8500 0.3458 -
1.0804 8550 0.3108 -
1.0867 8600 0.3466 -
1.0930 8650 0.3477 -
1.0993 8700 0.3645 -
1.1056 8750 0.3528 -
1.1120 8800 0.279 -
1.1183 8850 0.3563 -
1.1246 8900 0.3763 -
1.1309 8950 0.3248 -
1.1372 9000 0.319 -
1.1435 9050 0.3655 -
1.1499 9100 0.4211 -
1.1562 9150 0.3282 -
1.1625 9200 0.3167 -
1.1688 9250 0.3487 -
1.1751 9300 0.3042 -
1.1815 9350 0.3169 -
1.1878 9400 0.2866 -
1.1941 9450 0.3368 -
1.2004 9500 0.2452 -
1.2067 9550 0.2723 -
1.2130 9600 0.2765 -
1.2194 9650 0.3152 -
1.2257 9700 0.2756 -
1.2320 9750 0.333 -
1.2383 9800 0.2963 -
1.2446 9850 0.2648 -
1.2509 9900 0.2989 -
1.2573 9950 0.2501 -
1.2636 10000 0.2904 -
1.2699 10050 0.3288 -
1.2762 10100 0.382 -
1.2825 10150 0.2855 -
1.2889 10200 0.3255 -
1.2952 10250 0.2546 -
1.3015 10300 0.2968 -
1.3078 10350 0.2675 -
1.3141 10400 0.25 -
1.3204 10450 0.2886 -
1.3268 10500 0.3257 -
1.3331 10550 0.2981 -
1.3394 10600 0.2421 -
1.3457 10650 0.3087 -
1.3520 10700 0.2592 -
1.3584 10750 0.2275 -
1.3647 10800 0.2337 -
1.3710 10850 0.2331 -
1.3773 10900 0.2122 -
1.3836 10950 0.2318 -
1.3899 11000 0.1933 -
1.3963 11050 0.3036 -
1.4026 11100 0.2539 -
1.4089 11150 0.2749 -
1.4152 11200 0.2259 -
1.4215 11250 0.218 -
1.4278 11300 0.236 -
1.4342 11350 0.2423 -
1.4405 11400 0.2514 -
1.4468 11450 0.2197 -
1.4531 11500 0.1892 -
1.4594 11550 0.2282 -
1.4658 11600 0.2054 -
1.4721 11650 0.2466 -
1.4784 11700 0.1711 -
1.4847 11750 0.2327 -
1.4910 11800 0.2084 -
1.4973 11850 0.247 -
1.5037 11900 0.2347 -
1.5100 11950 0.2273 -
1.5163 12000 0.2723 -
1.5226 12050 0.2482 -
1.5289 12100 0.2434 -
1.5353 12150 0.2598 -
1.5416 12200 0.2158 -
1.5479 12250 0.2241 -
1.5542 12300 0.1972 -
1.5605 12350 0.2523 -
1.5668 12400 0.2169 -
1.5732 12450 0.2245 -
1.5795 12500 0.1981 -
1.5858 12550 0.2088 -
1.5921 12600 0.2532 -
1.5984 12650 0.2073 -
1.6048 12700 0.2471 -
1.6111 12750 0.2191 -
1.6174 12800 0.2176 -
1.6237 12850 0.2019 -
1.6300 12900 0.2865 -
1.6363 12950 0.2683 -
1.6427 13000 0.2025 -
1.6490 13050 0.1956 -
1.6553 13100 0.1431 -
1.6616 13150 0.1985 -
1.6679 13200 0.1687 -
1.6742 13250 0.2283 -
1.6806 13300 0.2398 -
1.6869 13350 0.1631 -
1.6932 13400 0.2493 -
1.6995 13450 0.2171 -
1.7058 13500 0.1534 -
1.7122 13550 0.2362 -
1.7185 13600 0.1602 -
1.7248 13650 0.2148 -
1.7311 13700 0.2175 -
1.7374 13750 0.1766 -
1.7437 13800 0.1989 -
1.7501 13850 0.2086 -
1.7564 13900 0.1871 -
1.7627 13950 0.212 -
1.7690 14000 0.2078 -
1.7753 14050 0.2195 -
1.7817 14100 0.2313 -
1.7880 14150 0.1464 -
1.7943 14200 0.1876 -
1.8006 14250 0.2402 -
1.8069 14300 0.1895 -
1.8132 14350 0.174 -
1.8196 14400 0.1816 -
1.8259 14450 0.1976 -
1.8322 14500 0.1763 -
1.8385 14550 0.1396 -
1.8448 14600 0.2061 -
1.8511 14650 0.1949 -
1.8575 14700 0.2116 -
1.8638 14750 0.2238 -
1.8701 14800 0.1085 -
1.8764 14850 0.1575 -
1.8827 14900 0.1998 -
1.8891 14950 0.2166 -
1.8954 15000 0.1515 -
1.9017 15050 0.1476 -
1.9080 15100 0.2183 -
1.9143 15150 0.1458 -
1.9206 15200 0.192 -
1.9270 15250 0.2203 -
1.9333 15300 0.135 -
1.9396 15350 0.1366 -
1.9459 15400 0.1389 -
1.9522 15450 0.1154 -
1.9586 15500 0.1314 -
1.9649 15550 0.1433 -
1.9712 15600 0.1769 -
1.9775 15650 0.2265 -
1.9838 15700 0.1898 -
1.9901 15750 0.1917 -
1.9965 15800 0.1504 -
2.0028 15850 0.1663 -
2.0091 15900 0.1088 -
2.0154 15950 0.128 -
2.0217 16000 0.1484 -
2.0281 16050 0.1733 -
2.0344 16100 0.1262 -
2.0407 16150 0.1428 -
2.0470 16200 0.1526 -
2.0533 16250 0.1653 -
2.0596 16300 0.1167 -
2.0660 16350 0.1593 -
2.0723 16400 0.1573 -
2.0786 16450 0.1998 -
2.0849 16500 0.1534 -
2.0912 16550 0.1521 -
2.0975 16600 0.1169 -
2.1039 16650 0.1183 -
2.1102 16700 0.1499 -
2.1165 16750 0.1015 -
2.1228 16800 0.1485 -
2.1291 16850 0.1423 -
2.1355 16900 0.1828 -
2.1418 16950 0.1259 -
2.1481 17000 0.1437 -
2.1544 17050 0.0988 -
2.1607 17100 0.1571 -
2.1670 17150 0.124 -
2.1734 17200 0.112 -
2.1797 17250 0.1332 -
2.1860 17300 0.109 -
2.1923 17350 0.1092 -
2.1986 17400 0.1475 -
2.2050 17450 0.1711 -
2.2113 17500 0.207 -
2.2176 17550 0.159 -
2.2239 17600 0.1469 -
2.2302 17650 0.1108 -
2.2365 17700 0.1263 -
2.2429 17750 0.1463 -
2.2492 17800 0.1121 -
2.2555 17850 0.0872 -
2.2618 17900 0.115 -
2.2681 17950 0.0816 -
2.2745 18000 0.1778 -
2.2808 18050 0.1021 -
2.2871 18100 0.1302 -
2.2934 18150 0.1153 -
2.2997 18200 0.085 -
2.3060 18250 0.1351 -
2.3124 18300 0.1132 -
2.3187 18350 0.1418 -
2.3250 18400 0.0766 -
2.3313 18450 0.0723 -
2.3376 18500 0.1205 -
2.3439 18550 0.0804 -
2.3503 18600 0.1625 -
2.3566 18650 0.1345 -
2.3629 18700 0.1108 -
2.3692 18750 0.0983 -
2.3755 18800 0.1132 -
2.3819 18850 0.1238 -
2.3882 18900 0.1117 -
2.3945 18950 0.1297 -
2.4008 19000 0.0709 -
2.4071 19050 0.0839 -
2.4134 19100 0.1212 -
2.4198 19150 0.0939 -
2.4261 19200 0.1257 -
2.4324 19250 0.0899 -
2.4387 19300 0.1169 -
2.4450 19350 0.0919 -
2.4514 19400 0.1232 -
2.4577 19450 0.0596 -
2.4640 19500 0.1674 -
2.4703 19550 0.1092 -
2.4766 19600 0.1226 -
2.4829 19650 0.1307 -
2.4893 19700 0.1047 -
2.4956 19750 0.0687 -
2.5019 19800 0.0897 -
2.5082 19850 0.1227 -
2.5145 19900 0.1103 -
2.5208 19950 0.1108 -
2.5272 20000 0.0794 -
2.5335 20050 0.1227 -
2.5398 20100 0.1268 -
2.5461 20150 0.0805 -
2.5524 20200 0.1041 -
2.5588 20250 0.0796 -
2.5651 20300 0.1173 -
2.5714 20350 0.0778 -
2.5777 20400 0.0852 -
2.5840 20450 0.0922 -
2.5903 20500 0.0726 -
2.5967 20550 0.0853 -
2.6030 20600 0.1006 -
2.6093 20650 0.1172 -
2.6156 20700 0.0886 -
2.6219 20750 0.08 -
2.6283 20800 0.1146 -
2.6346 20850 0.075 -
2.6409 20900 0.0737 -
2.6472 20950 0.1406 -
2.6535 21000 0.0898 -
2.6598 21050 0.0793 -
2.6662 21100 0.1222 -
2.6725 21150 0.0856 -
2.6788 21200 0.0917 -
2.6851 21250 0.1064 -
2.6914 21300 0.0548 -
2.6978 21350 0.0724 -
2.7041 21400 0.1294 -
2.7104 21450 0.067 -
2.7167 21500 0.0836 -
2.7230 21550 0.109 -
2.7293 21600 0.0789 -
2.7357 21650 0.1415 -
2.7420 21700 0.0733 -
2.7483 21750 0.0881 -
2.7546 21800 0.1143 -
2.7609 21850 0.0917 -
2.7672 21900 0.0574 -
2.7736 21950 0.0989 -
2.7799 22000 0.1134 -
2.7862 22050 0.0517 -
2.7925 22100 0.1024 -
2.7988 22150 0.1136 -
2.8052 22200 0.0752 -
2.8115 22250 0.0807 -
2.8178 22300 0.0915 -
2.8241 22350 0.0817 -
2.8304 22400 0.0826 -
2.8367 22450 0.093 -
2.8431 22500 0.0689 -
2.8494 22550 0.0831 -
2.8557 22600 0.0958 -
2.8620 22650 0.0883 -
2.8683 22700 0.0751 -
2.8747 22750 0.0678 -
2.8810 22800 0.0659 -
2.8873 22850 0.0972 -
2.8936 22900 0.0784 -
2.8999 22950 0.0705 -
2.9062 23000 0.0667 -
2.9126 23050 0.0588 -
2.9189 23100 0.089 -
2.9252 23150 0.0876 -
2.9315 23200 0.0912 -
2.9378 23250 0.0847 -
2.9441 23300 0.0354 -
2.9505 23350 0.1014 -
2.9568 23400 0.047 -
2.9631 23450 0.1112 -
2.9694 23500 0.0751 -
2.9757 23550 0.0732 -
2.9821 23600 0.0812 -
2.9884 23650 0.1006 -
2.9947 23700 0.0589 -
-1 -1 - 0.6575

Framework Versions

  • Python: 3.10.18
  • Sentence Transformers: 5.1.1
  • Transformers: 4.56.2
  • PyTorch: 2.7.1+cu128
  • Accelerate: 1.10.1
  • Datasets: 4.1.1
  • Tokenizers: 0.22.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}