Closed
Currently it calls sigmoid and then binary_cross_entropy (which takes the log internally):
https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py#L1796-L1797
Is it not possible to compute it directly as `torch.mean(-(y * F.logsigmoid(x) + (1 - y) * F.logsigmoid(-x)))`? That avoids taking the log of a saturated sigmoid for large |x|. The identity:
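A minimal standalone sketch of the difference between the two formulations, in plain Python so it runs without torch (the function names here are hypothetical, not the actual PyTorch internals):

```python
import math

def logsigmoid(x):
    # Numerically stable log(sigmoid(x)): rearranged around log1p
    # so that exp() never overflows for large |x|.
    if x >= 0:
        return -math.log1p(math.exp(-x))
    return x - math.log1p(math.exp(x))

def bce_with_logits_naive(x, y):
    # sigmoid followed by log, mirroring the linked code path.
    p = 1.0 / (1.0 + math.exp(-x))
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def bce_with_logits_stable(x, y):
    # The proposed form: -(y * logsigmoid(x) + (1 - y) * logsigmoid(-x)).
    return -(y * logsigmoid(x) + (1 - y) * logsigmoid(-x))

# Both agree for moderate logits:
print(bce_with_logits_naive(2.0, 1.0), bce_with_logits_stable(2.0, 1.0))

# For large logits the naive version breaks: sigmoid(40.0) rounds to
# exactly 1.0 in float64, so log(1 - p) is log(0). The stable version
# stays finite (approximately 40.0 here).
print(bce_with_logits_stable(40.0, 0.0))
```

The key point is that logsigmoid is computed via log1p on a small quantity rather than log on a quantity that has already rounded to 0 or 1.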
y * log(1 / (1 + exp(-x))) + (1 - y) * log(1 - 1 / (1 + exp(-x)))
= y * logsigmoid(x) + (1 - y) * log((1 + exp(-x) - 1) / (1 + exp(-x)))
= y * logsigmoid(x) + (1 - y) * log(exp(-x) / (1 + exp(-x)))
= y * logsigmoid(x) + (1 - y) * log(1 / (1 + exp(x)))
= y * logsigmoid(x) + (1 - y) * logsigmoid(-x)
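The chain of equalities above can be spot-checked numerically at moderate logits, where the naive evaluation is still accurate (plain Python sketch, helper names are my own):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def logsigmoid(x):
    # Stable log(sigmoid(x)), split by sign to avoid overflow in exp().
    return -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))

def check_identity(x):
    lhs = math.log(1.0 - sigmoid(x))                      # log(1 - 1/(1 + exp(-x)))
    mid = math.log(math.exp(-x) / (1.0 + math.exp(-x)))   # after simplifying the numerator
    rhs = logsigmoid(-x)                                  # log(1/(1 + exp(x)))
    return max(abs(lhs - mid), abs(lhs - rhs))

for x in (-3.0, -0.5, 0.0, 0.5, 3.0):
    assert check_identity(x) < 1e-12
print("identity holds")
```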