
[AAAI2025] Diverse Rare Sample Generation with Pretrained GANs

Repository for Diverse Rare Sample Generation with Pretrained GANs [Paper] [arXiv]


Installation

The environment was built on StyleGAN2-pytorch by rosinality. With CUDA 12.1, the following command downloads the corresponding Docker image.

docker pull sbrblee/cuda12.1gcc6:1.0

Getting Started

Feature Extraction

python -u scripts/feature_extraction.py --data_path {image directory} # The default model option is VGG16.

This script extracts feature vectors from a given image directory and saves them in npz format.
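The saved file can be inspected with NumPy. A minimal sketch of reading such a file; the key name "features" and the feature dimension are assumptions for illustration — check the actual keys with `f.files`:

```python
import numpy as np

# Hypothetical stand-in for the extracted VGG16 features (names/shapes assumed).
features = np.random.rand(100, 4096).astype(np.float32)
np.savez("real_features.npz", features=features)

# Inspect the npz archive and load the feature matrix.
with np.load("real_features.npz") as f:
    print(f.files)            # arrays stored in the file
    feats = f["features"]     # one feature vector per image
    print(feats.shape)
```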

Train Density Estimator: Normalizing Flow Training

python -u scripts/nf_train.py --npz_path {path to real feature npz file}

This script trains the normalizing flow model (a Glow model) on the features in the given npz file. The Glow architecture used in our paper is provided in the models folder as the default. The script also saves the MinMax scaler fitted to the feature vectors.
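For intuition on what the flow learns, the core building block of Glow-style models is an invertible affine coupling layer. The NumPy sketch below is an illustration of that principle only, not the architecture in this repository; all parameter names are made up:

```python
import numpy as np

# One affine coupling layer: half the features are transformed by a scale
# and shift computed from the other half, so the map is exactly invertible.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2)) * 0.1   # toy "coupling network" parameters
b = rng.normal(size=2) * 0.1

def coupling_forward(x):
    x1, x2 = x[:, :2], x[:, 2:]          # split features in half
    s = np.tanh(x1 @ W + b)              # log-scale, a function of x1 only
    t = x1 @ W + b                       # shift, a function of x1 only
    y2 = x2 * np.exp(s) + t              # transform the second half
    return np.concatenate([x1, y2], axis=1), s.sum(axis=1)  # output, log|det J|

def coupling_inverse(y):
    y1, y2 = y[:, :2], y[:, 2:]
    s = np.tanh(y1 @ W + b)
    t = y1 @ W + b
    x2 = (y2 - t) * np.exp(-s)           # undo the scale and shift
    return np.concatenate([y1, x2], axis=1)

x = rng.normal(size=(4, 4))
y, logdet = coupling_forward(x)
assert np.allclose(coupling_inverse(y), x)   # exactly invertible
```

Stacking such layers gives a tractable change-of-variables density, which is what makes the flow usable as a density estimator over the feature space.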

Diverse Rare Sample Generation

python -u scripts/divrare_optimization.py --zG_path {path to reference latent vectors npy file} --real_feature_path {path to real feature npz file} --nf_ckpt {path to checkpoint of normalizing flow model} --scaler_path {path to scaler} --dists_path {path to penalizing distances}

This script generates diverse rare samples for the given reference latent vectors. The following options control the algorithm hyperparameters:

--n_sample: number of rare samples to generate per reference

--rand_scale: scale of the noise added to the initial latent vector for the multi-start approach

--lambda1: coefficient of the similarity objective

--lambda2: coefficient of the diversity objective
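A conceptual sketch of how these hyperparameters might interact; the loss terms below are simplified stand-ins (all function names and formulas are assumptions, not the objective in scripts/divrare_optimization.py):

```python
import numpy as np

rng = np.random.default_rng(0)

def rarity(z):
    # Stand-in for the rarity term (e.g. negative log-likelihood under the flow).
    return np.sum(z ** 2, axis=-1)

z_ref = rng.normal(size=512)             # one reference latent vector
n_sample, rand_scale = 4, 0.1
lambda1, lambda2 = 1.0, 0.5

# Multi-start: perturb the reference with scaled noise to get several starts.
z = z_ref + rand_scale * rng.normal(size=(n_sample, 512))

# Similarity term: keep each sample close to its reference.
similarity = np.sum((z - z_ref) ** 2, axis=-1)

# Diversity term: push the samples apart from one another.
pairwise = np.linalg.norm(z[:, None] - z[None, :], axis=-1)
diversity = -pairwise.sum() / (n_sample * (n_sample - 1))

loss = rarity(z).sum() + lambda1 * similarity.sum() + lambda2 * diversity
```

Larger `lambda1` pulls samples toward the reference; larger `lambda2` rewards spread among the `n_sample` candidates.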
