diff --git a/README.md b/README.md
index 6420244..021e5dc 100644
--- a/README.md
+++ b/README.md
@@ -34,6 +34,18 @@ With only a few epochs of training, SN-Net effectively interpolates between the
 SN-Net is a general framework. However, as different model families are trained differently, we use their own code for stitching experiments. In this repo, we provide examples for plain ViTs and hierarchical ViTs by stitching DeiT and Swin, respectively.
 
+To use our repo, we suggest creating a Python virtual environment.
+
+```bash
+conda create -n snnet python=3.9
+conda activate snnet
+pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113
+pip install fvcore
+pip install timm==0.6.12
+```
+
+Next, feel free to experiment with different settings.
+
 For DeiT-based experiments, please refer to [stitching_deit](./stitching_deit).
 
 For Swin-based experiments, please refer to [stitching_swin](./stitching_swin).
 
diff --git a/stitching_resnet_swin/.gitignore b/stitching_resnet_swin/.gitignore
index 816fb03..b4fe8e1 100755
--- a/stitching_resnet_swin/.gitignore
+++ b/stitching_resnet_swin/.gitignore
@@ -109,4 +109,5 @@ output/
 *.gz
 Untitled.ipynb
 Testing notebook.ipynb
-output/*
\ No newline at end of file
+output/*
+logs/*
\ No newline at end of file
diff --git a/stitching_resnet_swin/train.py b/stitching_resnet_swin/train.py
index f6e8bfa..c9e6ca3 100755
--- a/stitching_resnet_swin/train.py
+++ b/stitching_resnet_swin/train.py
@@ -877,8 +877,6 @@ def main():
     _logger.info(
         f'Scheduled epochs: {num_epochs}. LR stepped per {"epoch" if lr_scheduler.t_in_epochs else "update"}.')
 
-    with open('resnet_swin_args_training.json', 'w') as f:
-        json.dump(vars(args), f, indent=4)
 
     try:
         for epoch in range(start_epoch, num_epochs):