
fokan/medsiglip-448-int8

An INT8 dynamically quantized version of google/medsiglip-448.

  • Quantization: dynamic INT8 applied to all nn.Linear layers (PyTorch)
  • Intended for CPU inference and a smaller disk footprint
  • Saved as pytorch_model.bin (quantized weights); config and processor files are included.
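The quantization step described above can be sketched with PyTorch's built-in dynamic quantization API. The toy model below is a hypothetical stand-in for the SigLIP towers (the real model would be loaded via transformers); only the `quantize_dynamic` call reflects what the card states was done.

```python
import torch
import torch.nn as nn

# Toy model standing in for the real network's Linear-heavy towers
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Dynamic INT8 quantization of all nn.Linear layers: weights are stored
# as INT8, activations are quantized on the fly at inference time
qmodel = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 16)
out = qmodel(x)
```

Dynamic quantization needs no calibration data, which is why it is a common choice for shrinking Linear-dominated models for CPU inference.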

Note: The quantized state_dict is stored with PyTorch serialization (not safetensors), because the packed quantized tensors it contains are not supported by the safetensors format.
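A consequence of this storage format is that loading requires rebuilding the architecture and re-applying the same quantization before restoring the weights. A minimal sketch, again using a hypothetical stand-in module rather than the real model:

```python
import io
import torch
import torch.nn as nn

# Stand-in architecture; the real model would come from transformers
model = nn.Sequential(nn.Linear(8, 8))
qmodel = torch.ao.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

# Quantized weights serialize via torch.save (pickle), not safetensors;
# an in-memory buffer stands in for pytorch_model.bin here
buf = io.BytesIO()
torch.save(qmodel.state_dict(), buf)
buf.seek(0)

# To load: rebuild the float architecture, apply the same dynamic
# quantization, then load the saved state_dict into it
qmodel2 = torch.ao.quantization.quantize_dynamic(
    nn.Sequential(nn.Linear(8, 8)), {nn.Linear}, dtype=torch.qint8
)
# weights_only=False: packed quantized params need full unpickling
qmodel2.load_state_dict(torch.load(buf, weights_only=False))
```

After loading, the restored model produces the same outputs as the original quantized one.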
