Bugs in GANs #103
Conversation
Removed pool layers
Removed instance norm from the last layer (see the generator sketch after this list)
Fixed spelling mistakes, because of which activations were not being applied properly
Changed the dimensions of the hidden channels
Added the embedding layer to weight initialization
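To illustrate the normalization point, here is a minimal sketch of a DCGAN-style generator in which the last layer has no normalization and feeds directly into Tanh. The class name, channel sizes, and image resolution are hypothetical and only for illustration; they are not the repository's actual model.

```python
import torch
import torch.nn as nn

# Hypothetical DCGAN-style generator: layer sizes are illustrative only.
class Generator(nn.Module):
    def __init__(self, latent_dim=100, hidden_channels=64, out_channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, hidden_channels * 4, 4, 1, 0, bias=False),
            nn.BatchNorm2d(hidden_channels * 4),
            nn.ReLU(True),
            nn.ConvTranspose2d(hidden_channels * 4, hidden_channels * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(hidden_channels * 2),
            nn.ReLU(True),
            nn.ConvTranspose2d(hidden_channels * 2, hidden_channels, 4, 2, 1, bias=False),
            nn.BatchNorm2d(hidden_channels),
            nn.ReLU(True),
            # Last layer: no normalization here, only the output activation.
            nn.ConvTranspose2d(hidden_channels, out_channels, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z):
        # z has shape (batch, latent_dim, 1, 1); output is (batch, out_channels, 32, 32).
        return self.net(z)
```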
I've reviewed the changes and I think they look good, especially if the examples work as expected now. However, I suggest that we implement unit tests for both CycleGAN and DCGAN. This will help us keep track of any possible changes to the models. The unit tests can be quite simple and focus on evaluating the correct order of operations, the expected number of layers, and the appropriate output dimensions of the models.
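A minimal sketch of such a test, written against the hypothetical generator above; the expected shapes and layer checks would need to be adapted to the repository's real models.

```python
import unittest
import torch

class TestDCGANGenerator(unittest.TestCase):
    def test_output_shape(self):
        # Hypothetical generator from the sketch above; adjust to the real model.
        model = Generator(latent_dim=100, hidden_channels=64, out_channels=3)
        z = torch.randn(8, 100, 1, 1)
        out = model(z)
        # Expected output: a batch of 8 RGB images of size 32x32.
        self.assertEqual(out.shape, (8, 3, 32, 32))

    def test_last_layer_is_activation(self):
        model = Generator()
        layers = list(model.net.children())
        # The final layer should be the Tanh activation, with no norm after the last conv.
        self.assertIsInstance(layers[-1], torch.nn.Tanh)

if __name__ == "__main__":
    unittest.main()
```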
Normalization is now applied only to the weights and not the biases. Removed the nn.Linear layer from the normalization, as it does not exist in the DCGAN generator.
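Assuming this refers to the normal-distribution weight initialization, a sketch of a helper that initializes only the weights of the layer types actually present in the generator (Conv, BatchNorm, and Embedding, per the change list above) could look like the following. The function name and constants are hypothetical, not the repository's code.

```python
import torch.nn as nn

def weights_init(m):
    """Apply normal initialization to weights only (not biases).

    Hypothetical helper; nn.Linear is deliberately not handled because the
    DCGAN generator contains no linear layers.
    """
    classname = m.__class__.__name__
    if classname.find("Conv") != -1:
        nn.init.normal_(m.weight.data, 0.0, 0.02)
    elif classname.find("BatchNorm") != -1:
        nn.init.normal_(m.weight.data, 1.0, 0.02)
        # Biases are left at their default initialization.
    elif classname.find("Embedding") != -1:
        nn.init.normal_(m.weight.data, 0.0, 0.02)

# Usage: model.apply(weights_init)
```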
Hi! @JesusPinedaC I have created the unit tests now and adjusted a few other minor things. The models are working as expected, and the weights are being initialized properly for DCGAN. If everything looks good, I think this can be merged now.
These are excellent unit tests! @HarshithBachimanchi Everything looks good to me.