
HOWTOs

English | 简体中文

How to train StyleGAN2

  1. Prepare the training dataset: FFHQ. More details are in DatasetPreparation_CN.md

    1. Download the FFHQ dataset. We recommend downloading the tfrecords files from NVlabs/ffhq-dataset.

    2. Extract images or an LMDB file from the tfrecords. (TensorFlow is required to read the tfrecords.)

      python scripts/data_preparation/extract_images_from_tfrecords.py

  2. Modify the config file options/train/StyleGAN/train_StyleGAN2_256_Cmul2_FFHQ.yml

  3. Train with distributed training. More training commands are in TrainTest_CN.md

    python -m torch.distributed.launch --nproc_per_node=8 --master_port=4321 basicsr/train.py -opt options/train/StyleGAN/train_StyleGAN2_256_Cmul2_FFHQ.yml --launcher pytorch
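The launcher above spawns `--nproc_per_node` training processes, one per GPU. As a rough illustration (the helper name is my own; depending on the PyTorch version the process index arrives as a `--local_rank` argument or a `LOCAL_RANK` environment variable), each process can discover its device like this:

```python
import os

def get_local_rank(default=0):
    # torch.distributed.launch spawns one process per GPU; recent PyTorch
    # versions expose each process's index via the LOCAL_RANK environment
    # variable (older versions pass a --local_rank argument instead).
    return int(os.environ.get("LOCAL_RANK", default))

# Each process would then bind to its own GPU, e.g.:
# torch.cuda.set_device(get_local_rank())
```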

How to test StyleGAN2

  1. Download pre-trained models from ModelZoo (Google Drive, Baidu Netdisk) to the experiments/pretrained_models folder.

  2. Test.

    python inference/inference_stylegan2.py

  3. The results are in the samples folder.
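For reference, StyleGAN2 sampling commonly applies the truncation trick to trade diversity for sample quality. A minimal numpy sketch of the idea (a hypothetical helper, not the actual code in inference_stylegan2.py):

```python
import numpy as np

def truncate_latent(w, w_avg, truncation=0.7):
    # Truncation trick: pull a sampled latent w toward the average latent
    # w_avg. truncation=1.0 keeps w unchanged; smaller values give less
    # diverse but typically cleaner samples.
    return w_avg + truncation * (w - w_avg)
```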

How to test DFDNet

  1. Install dlib, since DFDNet uses dlib for face detection and face landmark detection. Installation reference.

    1. Clone the dlib repo: git clone [email protected]:davisking/dlib.git
    2. cd dlib
    3. Install: python setup.py install
  2. Download the pre-trained dlib models from ModelZoo (Google Drive, Baidu Netdisk) to the experiments/pretrained_models/dlib folder.
    You can download them by running the following command, or download them manually.

    python scripts/download_pretrained_models.py dlib

  3. Download the DFDNet model, dictionary and face landmark template from ModelZoo (Google Drive, Baidu Netdisk) to the experiments/pretrained_models/DFDNet folder.
    You can download them by running the following command, or download them manually.

    python scripts/download_pretrained_models.py DFDNet

  4. Prepare the testing images in datasets, for example, put the testing images under the datasets/TestWhole folder.

  5. Test.

    python inference/inference_dfdnet.py --upscale_factor=2 --test_path datasets/TestWhole

  6. The results are in the results/DFDNet folder.
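To clarify what `--upscale_factor` controls: the final output is the input enlarged by that factor in each spatial dimension. The shape arithmetic can be sketched with a naive nearest-neighbor upscale (illustration of the output size only; DFDNet itself restores faces with a learned network, not this):

```python
import numpy as np

def upscale_nearest(img, factor=2):
    # Naive nearest-neighbor upscale: each pixel becomes a factor x factor
    # block, so an (H, W, C) image becomes (H * factor, W * factor, C).
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)
```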

How to train SwinIR (SR)

We take the classical SR X4 with DIV2K for example.

  1. Prepare the training dataset: DIV2K. More details are in DatasetPreparation.md

  2. Prepare the validation dataset: Set5. You can download it following this guidance

  3. Modify the config file in options/train/SwinIR/train_SwinIR_SRx4_scratch.yml accordingly.

  4. Train with distributed training. More training commands are in TrainTest.md.

    python -m torch.distributed.launch --nproc_per_node=8 --master_port=4331 basicsr/train.py -opt options/train/SwinIR/train_SwinIR_SRx4_scratch.yml --launcher pytorch --auto_resume

Note that:

  1. Different from the original setting in the paper, where the X4 model is fine-tuned from the X2 model, we train it from scratch directly.
  2. We also use EMA (Exponential Moving Average). Note that all model training in BasicSR supports EMA.
  3. After 250K iterations of training, the X4 model achieves performance comparable to the official model.

| ClassicalSR DIV2KX4 | PSNR (RGB) | PSNR (Y) | SSIM (RGB) | SSIM (Y) |
| :------------------ | :--------: | :------: | :--------: | :------: |
| Official            | 30.803     | 32.728   | 0.8738     | 0.9028   |
| Reproduce           | 30.832     | 32.756   | 0.8739     | 0.9025   |
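The EMA mentioned in the note can be sketched as follows (a minimal standalone illustration with plain float parameters and an assumed decay of 0.999; BasicSR's actual implementation tracks model weights):

```python
class EMA:
    """Minimal sketch of an Exponential Moving Average over named parameters."""

    def __init__(self, params, decay=0.999):
        self.decay = decay
        self.shadow = dict(params)  # start the shadow copy at the initial values

    def update(self, params):
        # shadow <- decay * shadow + (1 - decay) * current
        for name, value in params.items():
            self.shadow[name] = self.decay * self.shadow[name] + (1 - self.decay) * value
```

At evaluation time the shadow values are used in place of the raw weights, which usually gives a small but consistent quality gain.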

How to inference SwinIR (SR)

  1. Download pre-trained models from the official SwinIR repo to the experiments/pretrained_models/SwinIR folder.

  2. Inference.

    python inference/inference_swinir.py --input datasets/Set5/LRbicx4 --patch_size 48 --model_path experiments/pretrained_models/SwinIR/001_classicalSR_DIV2K_s48w8_SwinIR-M_x4.pth --output results/SwinIR_SRX4_DIV2K/Set5

  3. The results are in the results/SwinIR_SRX4_DIV2K/Set5 folder.

  4. You may want to calculate the PSNR/SSIM values.

    python scripts/metrics/calculate_psnr_ssim.py --gt datasets/Set5/GTmod12/ --restored results/SwinIR_SRX4_DIV2K/Set5 --crop_border 4

    Or test on the Y channel with the --test_y_channel argument.

    python scripts/metrics/calculate_psnr_ssim.py --gt datasets/Set5/GTmod12/ --restored results/SwinIR_SRX4_DIV2K/Set5 --crop_border 4 --test_y_channel
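The PSNR reported by the script above follows the standard definition; a self-contained numpy sketch (my own minimal re-implementation, including the `--crop_border` behavior of excluding border pixels before measuring):

```python
import numpy as np

def calculate_psnr(gt, restored, crop_border=0):
    # PSNR in dB for images in the [0, 255] range. crop_border excludes
    # border pixels (typically set to the upsampling scale) before measuring.
    gt = gt.astype(np.float64)
    restored = restored.astype(np.float64)
    if crop_border > 0:
        gt = gt[crop_border:-crop_border, crop_border:-crop_border]
        restored = restored[crop_border:-crop_border, crop_border:-crop_border]
    mse = np.mean((gt - restored) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * np.log10(255.0 ** 2 / mse)
```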