
Make torch.nn.functional.multilabel_soft_margin_loss more stable #9141

Closed
@vadimkantorov

Description


Currently the implementation calls sigmoid and then binary_cross_entropy (which applies log internally):
https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py#L1796-L1797

Would it not be possible to compute it as torch.mean(-(y * F.logsigmoid(x) + (1 - y) * F.logsigmoid(-x))) instead? The two forms are equivalent:

y * log(1 / (1 + exp(-x))) + (1 - y) * log(1 - 1 / (1 + exp(-x)))
  = y * logsigmoid(x) + (1 - y) * log((1 + exp(-x) - 1) / (1 + exp(-x)))
  = y * logsigmoid(x) + (1 - y) * log(1 / (1 + exp(x)))
  = y * logsigmoid(x) + (1 - y) * logsigmoid(-x)
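To illustrate the stability difference, here is a minimal pure-Python sketch of the two forms (function names are illustrative, not PyTorch API; the actual fix would use F.logsigmoid on tensors). The stable version rewrites log(sigmoid(x)) as min(x, 0) - log1p(exp(-|x|)) so no intermediate over/underflows:

```python
import math

def logsigmoid(x):
    # stable log(sigmoid(x)) = -log(1 + exp(-x)),
    # rewritten as min(x, 0) - log1p(exp(-|x|)) to avoid overflow in exp
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

def stable_loss(xs, ys):
    # proposed form: mean of -(y * logsigmoid(x) + (1 - y) * logsigmoid(-x))
    terms = [-(y * logsigmoid(x) + (1 - y) * logsigmoid(-x))
             for x, y in zip(xs, ys)]
    return sum(terms) / len(terms)

def naive_loss(xs, ys):
    # current sigmoid-then-log path: for large |x| the sigmoid rounds
    # to exactly 0.0 or 1.0 in floating point and log blows up
    terms = []
    for x, y in zip(xs, ys):
        s = 1.0 / (1.0 + math.exp(-x))
        terms.append(-(y * math.log(s) + (1 - y) * math.log(1.0 - s)))
    return sum(terms) / len(terms)
```

For example, with x = 50 and y = 0, sigmoid(50) rounds to 1.0 in double precision, so the naive path evaluates log(0); the stable path returns the correct value of about 50.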

Metadata

Labels: todo — Not as important as medium or high priority tasks, but we will work on these.
