Hi Vadim and team,
I was doing some benchmarking runs with DIA-NN 1.9.2, using different settings on the same data.
When I ran the single pass NN and double pass NN machine learning modes (with all other settings kept identical), I noticed that the results look identical.
Here are the logs and report PDFs for both runs:
report_doublepass.log.txt
report_doublepass.pdf
report.logsinglepass.txt
reportsinglepass.pdf
The docs mention that double pass NN mode should give better results in most cases at the cost of extra run time, so is it normal/possible for the results to be identical, or am I doing/understanding something wrong?
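For reference, a quick way to double-check this kind of comparison is something like the following (a minimal sketch; the directory layout and the report.tsv filename are assumptions, and if the runs wrote report.parquet instead, pd.read_parquet can be used):

```python
import pandas as pd

# Hypothetical paths -- adjust to wherever each run wrote its main report.
# Precursor.Id and Q.Value are standard columns in the DIA-NN main report.
single = pd.read_csv("singlepass/report.tsv", sep="\t")
double = pd.read_csv("doublepass/report.tsv", sep="\t")

# Precursors passing 1% FDR in each run
single_ids = set(single.loc[single["Q.Value"] <= 0.01, "Precursor.Id"])
double_ids = set(double.loc[double["Q.Value"] <= 0.01, "Precursor.Id"])

print("shared:          ", len(single_ids & double_ids))
print("single-pass only:", len(single_ids - double_ids))
print("double-pass only:", len(double_ids - single_ids))
```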
Thanks!
Please see the warnings printed in the log; they are what cause this. However, double pass is normally not recommended. In fact, it will be removed in future versions (it may come back later in a different form, but for now there are zero use cases with a proven benefit from it).
That said, if you hover over the Machine learning selection box, it does say the following (tooltip screenshot not reproduced here), which may be confusing if it turns out not to be the case.