# Z-Image Base GGUF

GGUF quantized version of Tongyi-MAI/Z-Image (Alibaba's 6B-parameter diffusion model) for use with ComfyUI-GGUF.
## Model Information
| Property | Value |
|---|---|
| Base Model | Tongyi-MAI/Z-Image |
| Architecture | Lumina2 (DiT-based) |
| Parameters | ~6B |
| Type | Non-distilled (supports CFG, negative prompts, LoRA) |
| Recommended Steps | 28-50 |
## Available Quantizations
| File | Size | VRAM Required | Quality |
|---|---|---|---|
| z_image_base_Q8_0.gguf | 6.8 GB | ~7-8 GB | Best |
| z_image_base_BF16.gguf | 12.4 GB | ~13 GB | Original |
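As a sanity check, the file sizes above follow from the parameter count and bits per weight: BF16 stores 16 bits per parameter, while Q8_0 stores 8-bit values plus a per-block scale, about 8.5 bits per weight. A rough sketch (assuming all ~6B parameters are quantized uniformly; real files keep some tensors at higher precision, so actual sizes run slightly larger):

```python
def estimated_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Rough GGUF file size: parameters x bits per weight, in gigabytes."""
    return n_params * bits_per_weight / 8 / 1e9

# ~6B parameters, as listed in the model card
N = 6e9

# BF16: 16 bits/weight -> ~12.0 GB estimate (actual file: 12.4 GB)
print(round(estimated_size_gb(N, 16.0), 1))

# Q8_0: 8-bit values + per-block fp16 scales, ~8.5 bits/weight
# -> ~6.4 GB estimate (actual file: 6.8 GB)
print(round(estimated_size_gb(N, 8.5), 1))
```

The gap between estimate and actual size is expected: GGUF files also carry metadata, and some layers are typically stored at higher precision than the headline quantization.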
## Usage with ComfyUI

### Requirements

- ComfyUI
- [ComfyUI-GGUF](https://github.com/city96/ComfyUI-GGUF) custom nodes
### Installation

1. Install the ComfyUI-GGUF custom nodes:

   ```shell
   cd ComfyUI/custom_nodes
   git clone https://github.com/city96/ComfyUI-GGUF
   pip install --upgrade gguf
   ```

2. Download the GGUF file and place it in:

   ```
   ComfyUI/models/unet/
   ```

3. Use the "Unet Loader (GGUF)" node instead of the standard model loader.
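Large downloads can occasionally truncate or corrupt. Before loading the file in ComfyUI, you can verify it is at least a plausible GGUF file by checking its magic bytes: per the GGUF specification, every file begins with the four ASCII bytes `GGUF`. A minimal sketch (the path in the usage comment is an example; adjust it to wherever you placed the file):

```python
GGUF_MAGIC = b"GGUF"  # first four bytes of every valid GGUF file

def looks_like_gguf(path: str) -> bool:
    """Return True if the file starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == GGUF_MAGIC

# Example usage (hypothetical path):
# looks_like_gguf("ComfyUI/models/unet/z_image_base_Q8_0.gguf")
```

This only checks the header, not the full file integrity; for a stronger check, compare the file's SHA-256 hash against the one shown on the repository's file listing.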
## Credits
- Original Model: Alibaba Tongyi-MAI Team
- GGUF Tools: city96/ComfyUI-GGUF
- Quantization: babakarto
## License

Apache 2.0 (same as the original Z-Image model)