the results of Y!LDA with multi machines #10
Yes, that is correct. lda.docToTop and lda.worToTop are local to each machine: essentially, they hold the topic assignments for the documents in the chunk assigned to that machine. lda.topToWor is expected to be similar across the 3 machines, so for an interpretation of the topic model you can use any one of them. But there is only one global model built, which is stored in the global folder along with the global dictionary. This is the one used while testing.

--Shravan
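The global model described above can be pictured as the sum of the per-machine topic-word count tables. Below is a minimal sketch of that merge step; the in-memory `{(topic, word): count}` representation and the function name are hypothetical illustrations, not Y!LDA's actual on-disk format or API:

```python
from collections import defaultdict

def merge_topic_word_tables(per_machine_tables):
    """Sum per-machine topic->word count tables into one global table.

    per_machine_tables: list of dicts mapping (topic, word) -> count,
    one dict per machine (a hypothetical in-memory form of lda.topToWor).
    """
    global_table = defaultdict(int)
    for table in per_machine_tables:
        for (topic, word), count in table.items():
            global_table[(topic, word)] += count
    return dict(global_table)

# Three machines, each holding local counts for topic 0.
m1 = {(0, "apple"): 5, (0, "banana"): 2}
m2 = {(0, "apple"): 3, (0, "cherry"): 4}
m3 = {(0, "banana"): 1}
print(merge_topic_word_tables([m1, m2, m3]))
```

Summing counts works because, in collapsed Gibbs sampling for LDA, the topic-word statistics are simple counts of token assignments, and each machine samples a disjoint chunk of the corpus.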
Thanks a lot! I checked the lda.topToWor file. Btw, about the topic counts table: though there are 3 tables after "train mode", it seems the system merges the 3 tables together during "test mode"?
Replying inline:

> Thanks a lot! I checked the lda.topToWor file.

[shrav] How many iterations did you run?

> But "test mode" is much better, only 1 different word for each topic. I think I can interpret the topic model using the "test mode" result. Btw, about the topic counts table: though there are 3 tables after "train mode", it seems the system merges the 3 tables together during "test mode"?

[shrav] Yes. A global table is created, and a local table per machine is induced using the global table.

--Shravan
I ran 200 iterations.
If you run about 500 to 600 iterations, the words will look similar across the different topToWor files. This is what we have observed.
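The convergence behavior described here (per-machine topToWor tables agreeing after enough iterations) can be checked mechanically, e.g. by measuring the overlap of each topic's top words between two machines' tables. A sketch under the same hypothetical in-memory representation as above; parsing of the real topToWor file format is omitted:

```python
def top_words(table, topic, k=10):
    """Return the k highest-count words for a topic from a
    {(topic, word): count} table (assumed in-memory form)."""
    items = [(w, c) for (t, w), c in table.items() if t == topic]
    items.sort(key=lambda wc: (-wc[1], wc[0]))  # count desc, then word
    return [w for w, _ in items[:k]]

def topic_overlap(table_a, table_b, topic, k=10):
    """Fraction of words shared between the top-k lists of two machines."""
    a = set(top_words(table_a, topic, k))
    b = set(top_words(table_b, topic, k))
    return len(a & b) / k

machine1 = {(0, "apple"): 9, (0, "banana"): 7, (0, "cherry"): 1}
machine2 = {(0, "apple"): 8, (0, "banana"): 6, (0, "date"): 2}
print(topic_overlap(machine1, machine2, topic=0, k=3))  # 2 of 3 top words shared
```

An overlap approaching 1.0 for every topic would correspond to the observation that the per-machine files look similar once training has run long enough.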
Thanks a lot! I will try more iterations.