Replies: 1 comment
-
Hi @philnovv, in the spleen segmentation tutorial the model has 2 output channels, so softmax->argmax->one-hot gives the same result as argmax->one-hot: softmax is monotonic within each voxel, so it does not change which channel the argmax selects. You can still add a softmax there if you like. If instead you only need the probabilities, then add softmax and remove the argmax/one-hot. Hope this helps clarify, thanks!
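A minimal sketch in plain PyTorch (not tied to the tutorial) showing why applying softmax before argmax makes no difference to the discrete prediction:

```python
import torch

# Fake 2-channel logits for a tiny "image": shape (batch=1, channels=2, H=2, W=2)
logits = torch.tensor([[[[2.0, -1.0],
                         [0.5,  3.0]],
                        [[1.0,  4.0],
                         [0.2, -2.0]]]])

# argmax directly on the raw logits
labels_from_logits = logits.argmax(dim=1)

# softmax first, then argmax -- softmax is monotonic per voxel, so the
# channel with the largest logit also has the largest probability
probs = torch.softmax(logits, dim=1)
labels_from_probs = probs.argmax(dim=1)

print(torch.equal(labels_from_logits, labels_from_probs))  # prints: True
```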
-
Hi, in the spleen segmentation example here:
https://github.com/Project-MONAI/tutorials/blob/main/3d_segmentation/spleen_segmentation_3d.ipynb
I understand that the output of the model has not been activated by a softmax; hence, we pass softmax=True to the loss function. But during validation, when does softmax get applied?
We have:
```python
val_outputs = sliding_window_inference(val_inputs, roi_size, sw_batch_size, model)
val_outputs = [post_pred(i) for i in decollate_batch(val_outputs)]
```
but post_pred does not apply softmax. What am I missing?
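For reference, a plain-torch sketch of what I understand post_pred to do in that notebook (an AsDiscrete-style argmax + one-hot, with no activation anywhere; the function name and shapes here are my own illustration, not from the notebook):

```python
import torch
import torch.nn.functional as F

def post_pred_equivalent(logits: torch.Tensor, num_classes: int = 2) -> torch.Tensor:
    """Plain-torch equivalent of an argmax + one-hot post-transform:
    pick the winning channel per voxel, then one-hot encode it.
    Note: no softmax is applied anywhere."""
    labels = logits.argmax(dim=0)            # (C, H, W, D) -> (H, W, D)
    onehot = F.one_hot(labels, num_classes)  # (H, W, D, C)
    return onehot.movedim(-1, 0).float()     # back to channel-first (C, H, W, D)
```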
Contrast this to the following segmentation example notebook:
https://github.com/Project-MONAI/tutorials/blob/main/2d_segmentation/torch/unet_training_dict.py
where post_pred (here "post_trans") contains the sigmoid activation. Can anyone clarify?
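As I read it, that 2D example has a single-channel output, so its post-processing is a sigmoid followed by thresholding, roughly like this (my own sketch, not copied from the script):

```python
import torch

# Single-channel binary segmentation: sigmoid gives a foreground
# probability per pixel, then we threshold at 0.5 to get a binary mask.
logits = torch.randn(1, 4, 4)      # (channels=1, H, W)
probs = torch.sigmoid(logits)
mask = (probs > 0.5).float()
```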
Cheers