Understanding the sliding window batch size parameter (`sw_batch_size`)? #2866
Replies: 3 comments 2 replies
-
Hi @neuronflow, […] Thanks.
-
@Nic-Ma thanks, this is how I also understood it. How are the batches sampled from the (512, 512, 512) volume? During my limited testing I found much better performance doing inference with (192, 192, 96) and a […] (in practice I used slightly different window sizes). I believe in my case this is mostly driven by normalization: the small samples from the edges sometimes contained only background, and then the normalization would "amplify" the background signal. Hence I am asking how they are sampled.
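To make the sampling question concrete: sliding-window inference typically does not sample windows randomly; it enumerates them deterministically on a regular grid that scans the volume with a fixed overlap. The sketch below illustrates that idea (it is a simplified illustration, not MONAI's exact implementation; the function name `grid_starts` and the default overlap are made up for this example). Because the grid is deterministic and always places a window flush against each border, windows at the edges can indeed be mostly background, which matches the normalization effect described above.

```python
# Sketch (NOT MONAI's exact code): windows come from a regular grid
# scanning the volume, so each window has a fixed spatial location --
# they are not randomly sampled.
from itertools import product

def grid_starts(image_size, roi_size, overlap=0.25):
    """Start coordinates of sliding windows along each axis (illustrative)."""
    starts = []
    for dim, roi in zip(image_size, roi_size):
        step = max(int(roi * (1 - overlap)), 1)
        axis = list(range(0, max(dim - roi, 0) + 1, step))
        if axis[-1] != dim - roi:            # ensure the last window
            axis.append(max(dim - roi, 0))   # touches the volume border
        starts.append(axis)
    # raster-scan order: all combinations of per-axis start coordinates
    return list(product(*starts))

windows = grid_starts((512, 512, 512), (192, 192, 96), overlap=0.25)
print(len(windows))  # → 112 windows cover the whole volume
```

With this grid, `sw_batch_size` consecutive windows from the scan order are simply grouped into one forward pass.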
-
Thank you, I am not sure I completely understand the code. A practical example to better illustrate my question: I trained a network on patches of (192, 192, 64) with batch size […]. Now I want to use the network to predict a single volume of approx. (6000, 4000, 3000). Do you have a recommendation on how to best balance window size and `sw_batch_size`?
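A rough way to reason about this trade-off: the window size and overlap fix the total number of windows, and `sw_batch_size` only controls how many of them go through the network per forward pass, so it trades GPU memory per pass against the number of passes. The sketch below estimates this for the volume size mentioned in the question (the function `n_windows` and the 0.5 overlap are illustrative assumptions, not MONAI code).

```python
import math

def n_windows(image_size, roi_size, overlap=0.5):
    """Approximate window count for a dense sliding-window grid (illustrative)."""
    n = 1
    for dim, roi in zip(image_size, roi_size):
        step = max(int(roi * (1 - overlap)), 1)
        n *= math.ceil(max(dim - roi, 0) / step) + 1
    return n

# hypothetical numbers taken from the question above
total = n_windows((6000, 4000, 3000), (192, 192, 64), overlap=0.5)
for sw_batch_size in (1, 4, 16):
    passes = math.ceil(total / sw_batch_size)
    voxels_per_pass = sw_batch_size * 192 * 192 * 64
    # larger sw_batch_size -> fewer passes, but more voxels (memory) per pass
    print(sw_batch_size, total, passes, voxels_per_pass)
```

In practice this suggests picking the largest `roi_size` the model was trained for, then raising `sw_batch_size` until GPU memory is nearly full, since the predictions themselves do not depend on the batching.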
-
https://docs.monai.io/en/latest/_modules/monai/inferers/utils.html#sliding_window_inference
What is the intended use case for the batch size parameter of the sliding window inferer, `sw_batch_size`? During my tests on various datasets, I always had the best results when prioritizing window size over batch size.
How are the batches sampled?
I thought multiple batches from various locations in the volume might help to keep normalization etc. at a reasonable level, however during my tests I could not observe this.
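One point worth making explicit: since each window is processed independently at inference time (with normalization layers in eval mode), grouping windows into batches of `sw_batch_size` should not change the predictions at all; it only changes how many windows share one forward pass. A toy sketch of that grouping (plain NumPy, not MONAI code; `run_in_sw_batches` is a made-up name for illustration):

```python
import numpy as np

def run_in_sw_batches(windows, model, sw_batch_size):
    """Run `model` over `windows` in chunks of `sw_batch_size`.

    Each window is processed independently, so the predictions are
    identical for any sw_batch_size; only memory/throughput per pass
    changes."""
    outputs = []
    for i in range(0, len(windows), sw_batch_size):
        batch = np.stack(windows[i:i + sw_batch_size])  # shape (B, ...)
        outputs.append(model(batch))
    return np.concatenate(outputs)

windows = [np.full((4, 4), v, dtype=float) for v in range(10)]
model = lambda b: b.mean(axis=(1, 2))     # toy per-window predictor
out1 = run_in_sw_batches(windows, model, sw_batch_size=1)
out4 = run_in_sw_batches(windows, model, sw_batch_size=4)
assert np.allclose(out1, out4)            # same result, fewer passes
```

So any quality difference observed between settings would come from the window size and overlap, not from `sw_batch_size` itself.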