Commit 4a081755 authored by Frisinghelli Daniel

Test effect of loss clamp.

parent 0646dda2
@@ -139,8 +139,7 @@ class BernoulliWeibullLoss(BernoulliLoss):
         # clip shape to (0, 10)
         # NOTE: in general shape in (0, +infinity), clipping is required for
         # numerical stability
-        shape = torch.clamp(
-            torch.exp(y_pred[:, 1, ...].squeeze()[mask][~mask_p]), max=10)
+        shape = torch.exp(y_pred[:, 1, ...].squeeze()[mask][~mask_p])

         # negative log-likelihood function of Bernoulli-Weibull distribution
         loss = torch.zeros_like(y_true)
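The removed clamp bounded the Weibull shape parameter, which the network predicts on a log scale, to at most 10. The following is a minimal sketch of why such a clamp is usually kept, using a hypothetical standalone weibull_nll function rather than the repository's BernoulliWeibullLoss: without an upper bound, torch.exp can produce very large shape values and the (y / scale) ** shape term overflows, yielding NaN or Inf losses and gradients.

    import torch

    def weibull_nll(y_true, shape_logit, scale, eps=1e-6, shape_max=10.0):
        """Hypothetical sketch: Weibull negative log-likelihood with the shape
        parameter recovered via exp() and clamped for numerical stability."""
        # shape lies in (0, +inf); clamping its exponentiated value keeps the
        # (y / scale) ** shape term from exploding for large raw predictions
        shape = torch.clamp(torch.exp(shape_logit), max=shape_max)
        y = torch.clamp(y_true, min=eps)  # Weibull support is y > 0
        # Weibull log-pdf: log k - k*log(lambda) + (k - 1)*log(y) - (y / lambda)**k
        log_pdf = (torch.log(shape) - shape * torch.log(scale)
                   + (shape - 1) * torch.log(y) - (y / scale) ** shape)
        return -log_pdf.mean()

    # Usage with dummy tensors (names and shapes are illustrative only)
    shape_logit = torch.randn(8)       # raw network output for the shape
    scale = torch.rand(8) + 0.5        # positive scale parameter
    y = torch.rand(8) * 5              # observed positive amounts
    loss = weibull_nll(y, shape_logit, scale)

The commit drops the clamp only to test its effect on the loss; the sketch above keeps it to illustrate the stabilizing role the original code attributed to it.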