At the moment, it seems that PCA requires the potentially very large `n_structures x n_features` feature matrix as an argument, which will not fit in memory for very large datasets.

Perhaps it would be beneficial to design a custom PCA class that accumulates the `n_features x n_features` covariance matrix instead. That matrix stays a manageable size and can be diagonalized once all structures have been processed. In this way, exploring potentially huge datasets should become possible even on ordinary laptops, taking advantage of batched evaluation (and accepting a few hours of runtime).
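A minimal sketch of what such a class could look like, assuming features arrive as NumPy batches; the class name and API here are illustrative, not existing code:

```python
import numpy as np

class IncrementalCovariancePCA:
    """PCA via an accumulated n_features x n_features covariance matrix.

    Feature batches are streamed in with partial_fit, so the full
    n_structures x n_features matrix never has to be held in memory.
    (Illustrative sketch, not an existing class.)
    """

    def __init__(self, n_components):
        self.n_components = n_components
        self._n = 0       # number of samples seen so far
        self._sum = None  # running sum of features, shape (n_features,)
        self._xtx = None  # running sum of X^T X, shape (n_features, n_features)

    def partial_fit(self, X):
        X = np.asarray(X, dtype=np.float64)
        if self._sum is None:
            self._sum = np.zeros(X.shape[1])
            self._xtx = np.zeros((X.shape[1], X.shape[1]))
        self._n += X.shape[0]
        self._sum += X.sum(axis=0)
        self._xtx += X.T @ X
        return self

    def finalize(self):
        # Covariance from the accumulated moments: E[x x^T] - E[x] E[x]^T
        self.mean_ = self._sum / self._n
        cov = self._xtx / self._n - np.outer(self.mean_, self.mean_)
        # eigh returns eigenvalues in ascending order; keep the largest ones
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1][: self.n_components]
        self.explained_variance_ = eigvals[order]
        self.components_ = eigvecs[:, order].T
        return self

    def transform(self, X):
        return (np.asarray(X) - self.mean_) @ self.components_.T
```

For comparison, scikit-learn's `IncrementalPCA` also supports batched fitting via `partial_fit`, though it maintains a truncated SVD rather than the full covariance matrix, so the explicit covariance accumulation above may still be preferable when `n_features` is small relative to `n_structures`.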