
Not require torch #390

Closed
martinju wants to merge 19 commits from not_require_torch

Conversation

martinju
Member

Currently, installation of shapr fails if torch is not installed, even though torch is only needed for the vaeac approach and is listed only as a suggested package.

The code in this PR has been tested and works in a local environment that does not have torch installed.
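For illustration, the guard pattern applied here looks roughly like this (setup_approach_vaeac and the error message are placeholders, not the exact shapr code):

# Sketch: gate torch-dependent code behind requireNamespace() so that shapr
# installs and loads even when the suggested 'torch' package is absent.
setup_approach_vaeac <- function(...) {
  if (!requireNamespace("torch", quietly = TRUE)) {
    stop(
      "The 'torch' package is required for the vaeac approach. ",
      "Install it with install.packages('torch')."
    )
  }
  # torch-dependent code would go here
}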

@martinju martinju requested a review from LHBO April 11, 2024 14:24
@martinju
Member Author

@LHBO Please look at #389 first, as this is built directly on top of that and therefore includes that code as well. The "changed files" should reduce to just one file once that is merged. Sorry about that.

@LHBO
Collaborator

LHBO commented Apr 11, 2024

@LHBO Please look at #389 first, as this is built directly on top of that and therefore includes that code as well. The "changed files" should reduce to just one file once that is merged. Sorry about that.

The solution of wrapping the file in requireNamespace seems fine to me. However, we need to incorporate the requested changes from #389 before merging.


@LHBO
Collaborator

LHBO commented Apr 12, 2024

@martinju, it seems like the changes from #389 were not accepted. When I look at approach_vaeac.R, it seems that this PR wants to use last_epoch and not last_state.

@LHBO
Collaborator

LHBO commented Apr 16, 2024

Should this PR also add the following check to vaeac so that it works without progressr? You mentioned at some point that it did not work from Python, or something along those lines, but I have forgotten the exact details. @martinju

# Only create a progressor if progressr is available; otherwise fall back to NULL.
if (requireNamespace("progressr", quietly = TRUE)) {
  p <- progressr::progressor(...)
} else {
  p <- NULL
}
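If p can be NULL, every later progress call would also need a guard, along these lines (the message text is just an illustration):

if (!is.null(p)) p(message = "One training epoch finished")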

@martinju martinju marked this pull request as draft April 17, 2024 10:27
@martinju
Member Author

I am still not able to make this work. What I have tried:

  1. Wrap everything in if (requireNamespace("torch")) { FUNCTIONS + roxygen }.
  • This worked, but calling devtools::document() gave errors of the form: "Block must have a @name.
    ℹ Either document an existing object or manually specify with @name".
  2. Wrap every single torch module in FUNCTION_NAME <- ifelse0(!requireNamespace("torch"), "nothing", "FUNCTION"), with the roxygen commands outside of this.
  • This gave weird-looking documentation with no input for the arguments.
  3. As suggested by ChatGPT, add a placeholder for the function in the main .R file and then conditionally redefine it in the .onLoad function in zzz.R (see the sketch below).
  • This also gave weird documentation.
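For reference, a minimal self-contained sketch of the point 3 pattern; the placeholder body and the torch::nn_sequential() call are illustrative assumptions, not the actual shapr code:

# R/skip_connection.R: placeholder so the name always exists and can be documented
skip_connection <- function(...) {
  stop("The 'torch' package must be installed to use skip_connection().")
}

# R/zzz.R: conditionally replace the placeholder when the package is loaded
.onLoad <- function(libname, pkgname) {
  if (requireNamespace("torch", quietly = TRUE)) {
    # The namespace is not yet sealed inside .onLoad, so the binding can be swapped.
    assign(
      "skip_connection",
      function(...) torch::nn_sequential(...),  # illustrative torch-based definition
      envir = asNamespace(pkgname)
    )
  }
}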

To illustrate the issues, in the pushed code memory_layer() is defined without any conditioning, skip_connection() is defined as in point 3, and the rest is as in point 2.

I don't know how to proceed with this for now.

One last (unsatisfactory) way out is perhaps to delete all documentation for these objects?

@LHBO Feel free to try to solve it. If we don't succeed, we could create a small reproducible example package with this issue and post it on Stack Overflow or similar...

@LHBO
Collaborator

LHBO commented Apr 18, 2024

This was fixed in #393, so we are closing this PR.

@LHBO LHBO closed this Apr 18, 2024
@LHBO LHBO deleted the not_require_torch branch April 18, 2024 08:03