… shape=(N, C) to (N,) (pytorch#9965)
Summary:
- fixes pytorch#9141, pytorch#9301
- use logsigmoid in multilabel_soft_margin_loss to make it numerically more stable (NOT fixing the legacy MultiLabelSoftMarginCriterion); see the equivalence sketch after the example below
- return shape (N,) instead of (N, C) to match the behavior of MultiMarginLoss
- Note that with this PR, the following behavior is expected:
```
import torch
import torch.nn.functional as F

outputs = torch.randn(4, 3)                   # logits, shape (N, C)
labels = torch.randint(0, 2, (4, 3)).float()  # binary targets, shape (N, C)
loss = F.multilabel_soft_margin_loss(outputs, labels, reduction='none')
loss_mean = F.multilabel_soft_margin_loss(outputs, labels, reduction='elementwise_mean')
loss_sum = F.multilabel_soft_margin_loss(outputs, labels, reduction='sum')
loss.sum() == loss_sum    # True
loss.mean() == loss_mean  # True
```
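For concreteness, here is a minimal sketch (not part of the PR; the tensor shapes and names are illustrative) of the logsigmoid formulation this change uses, averaged over classes so that each sample gets one loss value, giving shape (N,):

```
import torch
import torch.nn.functional as F

x = torch.randn(4, 3)                    # logits, shape (N, C)
y = torch.randint(0, 2, (4, 3)).float()  # binary targets, shape (N, C)

# stable per-sample loss: average the per-class BCE terms over dim=1 -> shape (N,)
manual = -(y * F.logsigmoid(x) + (1 - y) * F.logsigmoid(-x)).mean(dim=1)
builtin = F.multilabel_soft_margin_loss(x, y, reduction='none')
print(torch.allclose(manual, builtin))   # True
```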
Pull Request resolved: pytorch#9965
Differential Revision: D9038402
Pulled By: weiyangfb
fbshipit-source-id: 0fa94c7b3cd370ea62bd6333f1a0e9bd0b8ccbb9
Currently it calls `sigmoid` and then `binary_cross_entropy` (which applies `log` internally): https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py#L1796-L1797
Is it not possible to compute it directly as `torch.mean(-(y * F.logsigmoid(x) + (1 - y) * F.logsigmoid(-x)))`?
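A quick numerical check (a sketch, independent of the PR) of why the fused `logsigmoid` form is preferable to composing `sigmoid` with `log`:

```
import torch
import torch.nn.functional as F

x = torch.tensor([-200.0])
print(torch.log(torch.sigmoid(x)))  # tensor([-inf]): sigmoid(-200) underflows to 0 in float32
print(F.logsigmoid(x))              # tensor([-200.]): evaluated stably
```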