Sizes of tensors must match except in dimension 4. Got 18 and 17 #1882
Replies: 7 comments
-
What transforms are you using (training, validation, and testing)? And what is your batch size (also for training, validation, and testing)?
-
I just use LoadImaged, AddChanneld, and then ToTensord.
My training batch size is 2 (the default). I have also tried a batch size of 1, but it didn't work either. The error is raised at outputs = model(inputs). Is there a mismatch somewhere?
-
And what size are your input images?
-
My input comes from NIfTI images (512x512). When it is fed into the data loader, the input shape is 1,1,512,512,132, where 512x512 is the slice size and 132 is the number of slices in the first NIfTI.
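For reference, that shape can be reproduced with a small NumPy sketch (the 512x512x132 volume below is zero-filled dummy data, just to illustrate where the axes come from):

```python
import numpy as np

# Dummy 512x512 volume with 132 slices, standing in for the loaded NIfTI
volume = np.zeros((512, 512, 132), dtype=np.float32)

# AddChanneld-style step: prepend a channel axis -> (1, 512, 512, 132)
volume = volume[np.newaxis, ...]

# The DataLoader with batch_size=1 prepends a batch axis -> (1, 1, 512, 512, 132)
batch = volume[np.newaxis, ...]

print(batch.shape)  # (1, 1, 512, 512, 132)
```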
-
If all your images (train, validation, testing) are definitely the same size in all dimensions, then the problem is probably the '132'. Make that last dimension divisible by pool_size^depth; for example, 2^4 = 16, so either pad or crop your images so every spatial dimension is a multiple of 16. Another way to do it is with patches: the spleen example crops patches of size 96. This is the best approach, because the full-size images take up quite a bit of memory.
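The padding fix above can be sketched in plain NumPy. (MONAI has transforms such as DivisiblePadd that do this declaratively on a data dictionary; the helper function below is just an illustration of the arithmetic, not MONAI's API.)

```python
import numpy as np

def pad_depth_to_multiple(volume, k=16):
    """Zero-pad the last axis so its length is divisible by k.

    A UNet with 4 downsampling steps (pool size 2, depth 4) needs
    every spatial dimension divisible by 2**4 = 16.
    """
    depth = volume.shape[-1]
    target = -(-depth // k) * k  # round depth up to the next multiple of k
    pad = target - depth
    # Pad only the last axis, splitting the padding between both ends
    widths = [(0, 0)] * (volume.ndim - 1) + [(pad // 2, pad - pad // 2)]
    return np.pad(volume, widths)

vol = np.zeros((512, 512, 132))
padded = pad_depth_to_multiple(vol)
print(padded.shape)  # (512, 512, 144): 144 is the next multiple of 16 above 132
```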
-
Is there any transform that keeps the image size the same (like 512 x 512)? I changed the RandCropByPosNegLabeld to
Or can we pad the images instead of cropping them? Hope you can help me.
-
You can keep your input images at 512x512 and use the MONAI transforms to do patch-wise cropping (don't worry, your network's outputs will be reassembled later if you are copying the spleen example code). Try a simple RandSpatialCropd first; see the BraTS segmentation tutorial. Get that working, then change small things one at a time to see whether they improve your segmentations.
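The patch-wise cropping idea can be sketched in NumPy. (MONAI's RandSpatialCropd does the equivalent on a data dictionary with keys like "image" and "label"; the function below is only an illustration of the cropping logic, not MONAI's implementation.)

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_spatial_crop(volume, roi_size=(96, 96, 96)):
    """Randomly crop a fixed-size patch from a 3D volume of shape (H, W, D)."""
    starts = [rng.integers(0, dim - roi + 1)
              for dim, roi in zip(volume.shape, roi_size)]
    slices = tuple(slice(s, s + r) for s, r in zip(starts, roi_size))
    return volume[slices]

vol = np.zeros((512, 512, 132))
patch = rand_spatial_crop(vol)
print(patch.shape)  # (96, 96, 96)
```

A 96-voxel patch is already divisible by 16, which is why the spleen example trains without the size-mismatch error.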
-
I am trying to train a MONAI UNet (the architecture is the same as in the tutorial at
https://github.com/Project-MONAI/tutorials/blob/master/3d_segmentation/spleen_segmentation_3d.ipynb).
Please help me fix this bug.
The U-Net architecture is defined as
I am trying to segment 1 class (tumor only).