
Why use softmax instead of sigmoid? #17

Open
evbtst opened this issue Mar 6, 2020 · 1 comment
evbtst commented Mar 6, 2020

line 66: msv_E2[sc] = upsample(F.softmax(e2[0], dim=1)[:,1].data.cpu(), (h,w))

line 71: msv_E2[sc] = F.softmax(e2[0], dim=1)[:,1].data.cpu()

If you run these lines for each object (Propagate_MS is inside the loop for o in range(num_objects)), why use softmax instead of sigmoid?
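For context, here is a minimal sketch of what those two lines compute, assuming e2[0] is a [1, 2, H, W] tensor of BG/FG logits; F.interpolate stands in for the repo's upsample helper:

```python
import torch
import torch.nn.functional as F

# Hypothetical decoder output for one object: [1, 2, H, W] logits,
# channel 0 = background, channel 1 = foreground.
e2_0 = torch.randn(1, 2, 120, 216)
h, w = 480, 864  # full-frame size

# Line 66 (in spirit): FG probability from a 2-way softmax, resized to (h, w).
prob_fg = F.softmax(e2_0, dim=1)[:, 1]                        # [1, H, W]
prob_fg_full = F.interpolate(prob_fg.unsqueeze(1), size=(h, w),
                             mode='bilinear', align_corners=False)[:, 0]

# Line 71 (in spirit): the same FG probability, kept at decoder resolution.
prob_fg_small = F.softmax(e2_0, dim=1)[:, 1]
```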

@seoungwugoh (Owner)

Hi @evbtst,
We perform a 2-way (FG & BG) softmax for the binary segmentation. A similar operation could be implemented with a sigmoid, but we simply chose to use the softmax.
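With only two channels the two are mathematically equivalent: the softmax probability of the FG channel equals the sigmoid of the FG-minus-BG logit. A quick check (illustrative only, not code from this repo):

```python
import torch
import torch.nn.functional as F

# Hypothetical 2-channel logit map: channel 0 = BG, channel 1 = FG.
logits = torch.randn(1, 2, 4, 4)

# FG probability via a 2-way softmax.
p_softmax = F.softmax(logits, dim=1)[:, 1]

# The same quantity via a sigmoid of the logit difference.
p_sigmoid = torch.sigmoid(logits[:, 1] - logits[:, 0])

print(torch.allclose(p_softmax, p_sigmoid, atol=1e-6))  # True
```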
