
Are the items in KL divergence disordered? #3

Open
franciszchen opened this issue Jan 12, 2020 · 1 comment

franciszchen commented Jan 12, 2020

Hi, in trainer.py, lines 201-208:

```python
for i in range(self.model_num):
    ce_loss = self.loss_ce(outputs[i], labels)
    kl_loss = 0
    for j in range(self.model_num):
        if i != j:
            kl_loss += self.loss_kl(F.log_softmax(outputs[i], dim=1),
                                    F.softmax(Variable(outputs[j]), dim=1))
    loss = ce_loss + kl_loss / (self.model_num - 1)
```

Is this the same as Equation 4 in the paper, L_{Θ1} = L_{C1} + D_KL(p2 || p1)? Thx~

franciszchen changed the title from "The items in KL divergence are disordered" to "Are the items in KL divergence disordered?" on Jan 12, 2020
chxy95 (Owner) commented Jan 13, 2020

Yes, it is. You can run experiments to validate it, or look at how PyTorch implements the KL loss if you want.
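For anyone following along, here is a minimal sketch (not from this repo; the logits tensors are made up for illustration) of why the argument order works out. PyTorch's `F.kl_div(input, target)` expects `input` as log-probabilities and computes KL(target || input-distribution), so `loss_kl(F.log_softmax(outputs[i]), F.softmax(outputs[j]))` yields D_KL(p_j || p_i), which for model 1 is exactly the D_KL(p2 || p1) term in Equation 4:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits_1 = torch.randn(4, 10)  # stand-in logits for model 1 (hypothetical)
logits_2 = torch.randn(4, 10)  # stand-in logits for model 2 (hypothetical)

p1 = F.softmax(logits_1, dim=1)
p2 = F.softmax(logits_2, dim=1)

# F.kl_div(input, target) takes `input` as log-probabilities and computes
# KL(target || input-distribution), so this call gives D_KL(p2 || p1):
kl_torch = F.kl_div(F.log_softmax(logits_1, dim=1), p2, reduction='batchmean')

# The same quantity by hand: sum_k p2 * (log p2 - log p1), averaged over the batch.
kl_manual = (p2 * (p2.log() - p1.log())).sum(dim=1).mean()

print(kl_torch.item(), kl_manual.item())  # both print the same value
```

So despite the loss being called with model i's output first, the divergence is measured from the peer's distribution to model i's, matching the paper.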

chxy95 closed this as completed on Jan 13, 2020
chxy95 reopened this on Jan 13, 2020