
Cache location used by CacheDataset #2909

Answered by Nic-Ma
dianemarquette asked this question in Q&A

Hi @dianemarquette ,

Thanks for your interest and for your experiments with MONAI.

  1. Where the data is cached depends on your transforms: if you use ToDevice("cuda:0"), the cached PyTorch Tensors are kept in GPU RAM; otherwise, as in your case, the cache holds NumPy arrays in CPU RAM.
  2. Please refer to this tutorial for how to cache data in GPU RAM and execute the subsequent transforms directly on the GPU for better performance (a minimal sketch follows this list):
    https://github.com/Project-MONAI/tutorials/blob/master/acceleration/fast_training_tutorial.ipynb
    Please note that this is still under development; not all MONAI transforms support GPU Tensor operations.
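
A minimal sketch of that pattern (not taken from the tutorial verbatim; file paths and keys are placeholders): deterministic transforms plus EnsureTyped and ToDeviced run once and are cached by CacheDataset on the GPU, while the random transforms after them execute per epoch directly on GPU Tensors.

```python
import torch
from monai.data import CacheDataset, ThreadDataLoader
from monai.transforms import (
    Compose,
    LoadImaged,
    EnsureChannelFirstd,
    ScaleIntensityd,
    EnsureTyped,
    ToDeviced,
    RandFlipd,
)

# placeholder data list; replace with your own files
data = [{"image": "img0.nii.gz"}, {"image": "img1.nii.gz"}]

transforms = Compose([
    # deterministic transforms: executed once and cached by CacheDataset
    LoadImaged(keys="image"),
    EnsureChannelFirstd(keys="image"),
    ScaleIntensityd(keys="image"),
    EnsureTyped(keys="image"),                  # convert cached NumPy arrays to Tensors
    ToDeviced(keys="image", device="cuda:0"),   # move the cached Tensors to GPU RAM
    # random transforms: executed every epoch, here directly on GPU Tensors
    RandFlipd(keys="image", prob=0.5, spatial_axis=0),
])

# everything before the first random transform is cached, so with ToDeviced
# the cache lives in GPU RAM; without it, the cache stays in CPU RAM as arrays
dataset = CacheDataset(data=data, transform=transforms, cache_rate=1.0)

# with a GPU-resident cache, keep num_workers=0; ThreadDataLoader avoids
# multiprocessing workers, which cannot share CUDA Tensors between processes
loader = ThreadDataLoader(dataset, batch_size=2, num_workers=0)
```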

Thanks.

Answer selected by dianemarquette