---
license: apache-2.0
pipeline_tag: image-to-image
library_name: diffusers
---

# Refaçade: Editing Object with Given Reference Texture

Youze Huang¹\* Penghui Ruan²\* Bojia Zi³\* Xianbiao Qi⁴† Jianan Wang⁵ Rong Xiao⁴

\* Equal contribution. † Corresponding author.

Huggingface Model Github arXiv Huggingface Space Demo Page

## 🚀 Overview

Refaçade is a unified image–video retexturing model built upon the Wan2.1-based VACE framework. It edits the surface material of a specified object in a video using a user-provided reference texture, while preserving the original geometry and background. A Jigsaw Permutation decouples structural information from the reference image, and a Texture Remover disentangles the original object's appearance, letting users explore diverse texture variations for the same scene.
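The Jigsaw Permutation idea — shuffling patches of the reference image so that local texture statistics survive while global structure is destroyed — can be sketched as follows. This is a minimal illustration, not the model's exact implementation; patch size and RNG handling are assumptions:

```python
import numpy as np

def jigsaw_permute(image: np.ndarray, patch: int = 32, seed: int = 0) -> np.ndarray:
    """Shuffle non-overlapping patches of an H x W x C image.

    Destroys the reference's global structure while keeping local texture
    statistics, so a model conditioned on it cannot copy the geometry.
    """
    h, w, c = image.shape
    gh, gw = h // patch, w // patch
    # Crop to a multiple of the patch size, then split into a grid of patches.
    grid = image[: gh * patch, : gw * patch].reshape(gh, patch, gw, patch, c)
    patches = grid.transpose(0, 2, 1, 3, 4).reshape(gh * gw, patch, patch, c)
    # Randomly permute the patch order and stitch the image back together.
    order = np.random.default_rng(seed).permutation(gh * gw)
    shuffled = patches[order].reshape(gh, gw, patch, patch, c)
    return shuffled.transpose(0, 2, 1, 3, 4).reshape(gh * patch, gw * patch, c)
```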


## 🛠️ Installation

Our project is built upon Wan2.1-based VACE.

```shell
pip install -r requirements.txt
pip install wan@git+https://github.com/Wan-Video/Wan2.1
```

## 🏃‍♂️ Gradio Demo

You can use this Gradio demo to retexture objects. Note that you do not need to compile SAM2.

```shell
python app.py
```

## 📂 Download

First, download our checkpoints:

```shell
huggingface-cli download --resume-download fishze/Refacade --local-dir models
```

Next, download the SAM2 checkpoint `sam2_hiera_large.pt` and place it at:

```
sam2/SAM2-Video-Predictor/checkpoints/
```

We recommend organizing the local directories as follows:

```
Refacade
├── ...
├── examples
├── models
│   ├── refacade
│   │   └── ...
│   ├── texture_remover
│   │   └── ...
│   └── vae
│       └── ...
├── sam2
└── ...
```
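After downloading, you can sanity-check that the checkpoints landed in the expected places. This is a small optional helper (not part of the repository); the paths simply mirror the recommended layout above:

```python
from pathlib import Path

# Expected checkpoint locations, mirroring the recommended directory layout.
EXPECTED = [
    "models/refacade",
    "models/texture_remover",
    "models/vae",
    "sam2/SAM2-Video-Predictor/checkpoints/sam2_hiera_large.pt",
]

def missing_checkpoints(root: str = ".") -> list:
    """Return the expected checkpoint paths that do not exist under `root`."""
    return [p for p in EXPECTED if not (Path(root) / p).exists()]
```

If the returned list is non-empty, re-run the download commands above before launching the demo.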

## ⚡ Quick Start

### Minimal Example

```shell
python test_pipe.py \
  --ref_img    ./assets/single_example/1.png \
  --ref_mask   ./assets/single_example/mask.png \
  --video_path ./assets/single_example/1.mp4 \
  --mask_path  ./assets/single_example/mask.mp4 \
  --output_dir ./outputs
```
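The `--ref_mask` and `--mask_path` inputs are binary masks marking which object to retexture. Conceptually, such a mask selects the object pixels in a frame; a generic NumPy sketch of that selection (not the pipeline's internal preprocessing — the function name and fill behavior are illustrative) looks like this:

```python
import numpy as np

def apply_mask(frame: np.ndarray, mask: np.ndarray, fill: int = 0) -> np.ndarray:
    """Keep the object pixels selected by a binary mask; fill the rest.

    frame: H x W x C image; mask: H x W array where nonzero marks the object.
    """
    out = np.full_like(frame, fill)
    keep = mask.astype(bool)
    out[keep] = frame[keep]  # copy only the masked object region
    return out
```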