modelzoo.vision.pytorch.dit.sample_generator
Samples a large number of images from a pre-trained diffusion model using DDP, then saves a .npz file that can be used to compute FID and other evaluation metrics via the ADM repo: https://github.com/openai/guided-diffusion/tree/main/evaluations
For a simple single-GPU/CPU sampling script, see sample_generator_dit_simple.py.
Functions
Builds a single .npz file from a folder of .png samples.
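A minimal sketch of what this folder-to-.npz step looks like, assuming numbered .png files and the array layout expected by the ADM evaluation scripts; the actual helper in this module may differ in name and details.

```python
import os

import numpy as np
from PIL import Image


def build_npz_from_png_folder(sample_dir: str, num_samples: int) -> str:
    """Stack PNG samples into one (N, H, W, C) uint8 array and save it as .npz.

    Hypothetical sketch: assumes samples are named 000000.png, 000001.png, ...
    inside ``sample_dir``.
    """
    images = []
    for i in range(num_samples):
        img = Image.open(os.path.join(sample_dir, f"{i:06d}.png"))
        images.append(np.asarray(img, dtype=np.uint8))
    arr = np.stack(images)  # shape (N, H, W, C)
    npz_path = f"{sample_dir}.npz"
    np.savez(npz_path, arr_0=arr)  # the ADM evaluator loads the "arr_0" key
    return npz_path
```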
Classes
Main base class for model sample generation.

Parameters:
  model_ckpt_path (str): Path to the pretrained diffusion model checkpoint.
  vae_ckpt_path (str): Path to the pretrained VAE model checkpoint.
  params (str): Path to the YAML file containing model params.
  sample_dir (str): Path to the folder where generated images and the .npz file are stored.
  seed (int): Seed for the random generation process.
  num_fid_samples (int): Number of images to generate.
  per_gpu_batch_size (int): Per-GPU batch size; overrides the value in the YAML if provided.
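A hedged usage sketch of constructing the generator with the parameters documented above. The class name (SampleGenerator) and all file paths are assumptions for illustration, not verified against this module's source.

```python
# Hypothetical instantiation; class name, import path, and paths are assumed.
from modelzoo.vision.pytorch.dit.sample_generator import SampleGenerator

generator = SampleGenerator(
    model_ckpt_path="checkpoints/dit_model.pt",   # pretrained diffusion checkpoint (assumed path)
    vae_ckpt_path="checkpoints/vae.pt",           # pretrained VAE checkpoint (assumed path)
    params="configs/dit_params.yaml",             # YAML with model params (assumed path)
    sample_dir="samples/dit_run",                 # output folder for .png images and the .npz file
    seed=0,                                       # seed for the random generation process
    num_fid_samples=50000,                        # number of images to generate
    per_gpu_batch_size=32,                        # overrides the batch size in the YAML if provided
)
```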