
Restructured torch modules to support shapr installation without torch #393

Merged · 12 commits into NorskRegnesentral:master from Lars/torch_module_restructure · Apr 18, 2024

Conversation

@LHBO (Collaborator) commented on Apr 17, 2024

In this PR, we refactor the vaeac approach so that the torch modules are created inside functions. Previously, the modules were defined at the top level of the package, so installation of shapr failed because R tried to evaluate the torch:: calls even when torch was not installed.
We tried to fix this in #390, but due to technical issues on my end, I had to open a new PR.

We also fixed some typos in the Roxygen documentation and ensured that the progressr progress bar inside the vaeac approach is only created if progressr is available.
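
For context, the guard follows the standard R pattern sketched below; the variable names (e.g. n_epochs) are illustrative and not the exact code in the PR:

# Only create a progressr signaller when progressr is available; otherwise
# fall back to a no-op so the vaeac training loop runs unchanged.
if (requireNamespace("progressr", quietly = TRUE)) {
  progress_bar <- progressr::progressor(steps = n_epochs)
} else {
  progress_bar <- function(...) NULL  # no-op placeholder
}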

Details:
Instead of having

vaeac_dataset <- torch::dataset(
  name = "vaeac_dataset",
  initialize = function(X, one_hot_max_sizes) {...},
  .getbatch = function(index) {...},
  .length = function() {...}
)

we replaced it with

vaeac_dataset <- function(X, one_hot_max_sizes) {
  vaeac_dataset_tmp <- torch::dataset(
    name = "vaeac_dataset", 
    initialize = function(X, one_hot_max_sizes) {...},
    .getbatch = function(index) {...},
    .length = function() {...}
  )
  return(vaeac_dataset_tmp(X = X, one_hot_max_sizes = one_hot_max_sizes))
}
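
Since a function body is only evaluated when the function is called, torch::dataset() is no longer touched at install or load time. A minimal, purely illustrative usage (the variable names below are not from the package):

# shapr can now be installed and loaded without torch; torch is only
# required here, when the wrapper is actually called:
ds <- vaeac_dataset(X = X_matrix, one_hot_max_sizes = sizes_vec)
length(ds)  # dispatches to the .length() method defined above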

Some extra care was needed for memory_layer, which relied on an internal environment shared between all instances of memory_layer. In the new version, we have to create this environment first and then pass it to the new version of memory_layer.
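
For illustration, the new pattern looks roughly as follows; the argument names (id, shared_env, output) are illustrative rather than the exact ones in the package:

# Create the shared environment up front ...
memory_env <- new.env()

# ... and pass it explicitly to each memory_layer instance, instead of relying
# on an environment captured inside a top-level torch module definition.
layer_in  <- memory_layer(id = "#1", shared_env = memory_env)
layer_out <- memory_layer(id = "#1", shared_env = memory_env, output = TRUE)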

@martinju (Member) left a comment


I tested this locally both with and without the torch package, and it seems to work fine.

To test that it installs fine without the torch package, I deleted the torch package and then installed the package from this branch via devtools::install_github("LHBO/shapr", ref = "Lars/torch_module_restructure"). I can then run e.g. the README example. When running devtools::document() I get errors saying that torch is not installed. This seems to be related to the references to torch in the documentation. That is fine anyway, and similar to what we get when trying to run an example involving torch.

With torch installed, all checks pass locally, and I can run devtools::document() without errors.

I just checked the code briefly to understand conceptually what you did, but didn't inspect it in detail.
Please write a brief summary of what you did in the PR description, then you can just merge.

Well done!

@LHBO changed the title from "Lars/torch module restructure" to "Restructured torch modules to support shapr installation without torch" on Apr 18, 2024
@LHBO merged commit ddd32c7 into NorskRegnesentral:master on Apr 18, 2024
7 checks passed
@LHBO deleted the Lars/torch_module_restructure branch on April 18, 2024 at 07:15
@LHBO mentioned this pull request on Apr 18, 2024