Cache location used by CacheDataset #2909
-
Hi! I originally trained my neural networks using a GeForce RTX 2080 Super (8 GB VRAM) and 32 GB RAM. Recently, I changed setups, and I now have a GeForce RTX 3090 (24 GB VRAM) but only 16 GB RAM. When I tried to train a neural network using MONAI's `CacheDataset` on the new machine, the training crashed with an out-of-memory error. Until now, I had assumed that the data was cached on the GPU, but given the crash, I wonder whether the limited amount of RAM is what causes this OOM on my new setup. My questions: does `CacheDataset` store the cached data in system RAM or in GPU memory, and is there a way to control where the data is cached?
Thanks in advance for your insights :)
-
Hi @dianemarquette,
Thanks for your interest and for your experiments with MONAI.
ToDevice("cuda:0")
it will cache the PyTorch Tensor to GPU RAM, otherwise, in PC RAM like your case, it's just numpy arrays.https://github.com/Project-MONAI/tutorials/blob/master/acceleration/fast_training_tutorial.ipynb
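For example, something along these lines (a minimal sketch, not copied from the tutorial; the file names and dictionary keys are placeholders, and the GPU variant assumes `cuda:0` is available):

```python
# Minimal sketch of the two caching modes of CacheDataset.
from monai.data import CacheDataset
from monai.transforms import (
    Compose,
    EnsureChannelFirstd,
    EnsureTyped,
    LoadImaged,
    ToDeviced,
)

# Hypothetical data list -- replace with your own files.
files = [{"image": "img_000.nii.gz", "label": "seg_000.nii.gz"}]

# Default: cached items stay in PC (CPU) RAM, typically as numpy arrays.
cpu_cache_transforms = Compose([
    LoadImaged(keys=["image", "label"]),
    EnsureChannelFirstd(keys=["image", "label"]),
])
cpu_cached_ds = CacheDataset(data=files, transform=cpu_cache_transforms, cache_rate=1.0)

# Ending the chain with EnsureTyped + ToDeviced caches PyTorch Tensors in GPU RAM,
# so the cache uses VRAM instead of system RAM.
gpu_cache_transforms = Compose([
    LoadImaged(keys=["image", "label"]),
    EnsureChannelFirstd(keys=["image", "label"]),
    EnsureTyped(keys=["image", "label"]),
    ToDeviced(keys=["image", "label"], device="cuda:0"),
])
gpu_cached_ds = CacheDataset(data=files, transform=gpu_cache_transforms, cache_rate=1.0)
```

When the cache lives on the GPU, the fast training tutorial pairs `CacheDataset` with `ThreadDataLoader` rather than a multiprocessing `DataLoader`, since CUDA tensors and worker subprocesses don't mix well.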
Please note that this is still under development, and not all the MONAI transforms support GPU Tensor operations yet.
Thanks.