| Loss | Formula |
| --- | --- |
| Cross Entropy | $-\sum_{i=1}^{n} y_i \log(p_i)$ |
| Kullback-Leibler Divergence Loss | $\sum_{i=1}^{n} y_i \log\left(\frac{y_i}{p_i}\right)$ |
| Binary Cross Entropy | $-\left( y \log(p) + (1-y)\log(1-p) \right)$ |
| Binary Cross Entropy with Logits | $-\sum_{i=1}^{n} \left[ y_i \log\left(\sigma(\text{logits}_i)\right) + (1-y_i)\log\left(1-\sigma(\text{logits}_i)\right) \right]$ |
| Negative Log-Likelihood Loss | $-\sum_{i=1}^{n} y_i \log(p_i)$ |
| Poisson Negative Log-Likelihood Loss | $\sum_{i=1}^{n} \left( p_i - y_i \log(p_i) \right)$ |
| Gaussian Negative Log-Likelihood Loss | $\sum_{i=1}^{n} \left[ \frac{1}{2} \left(\frac{y_i - \mu_i}{\sigma_i}\right)^2 + \log(\sigma_i) + \frac{1}{2}\log(2\pi) \right]$ |
| Cosine Embedding Loss | $\frac{1}{n} \sum_{i=1}^{n} \begin{cases} 1 - \cos(x_i, x_i') & y_i = 1 \\ \max\left(0, \cos(x_i, x_i') - \text{margin}\right) & y_i = -1 \end{cases}$ |
| Hinge Embedding Loss | $\frac{1}{n} \sum_{i=1}^{n} \begin{cases} x_i & y_i = 1 \\ \max(0, \text{margin} - x_i) & y_i = -1 \end{cases}$ |
| L1 Loss | $\sum_{i=1}^{n} \left\lvert y_i - p_i \right\rvert$ |
| Smooth L1 Loss | $\sum_{i=1}^{n} \begin{cases} \frac{1}{2}(y_i - p_i)^2 / \beta & \lvert y_i - p_i \rvert < \beta \\ \lvert y_i - p_i \rvert - \frac{1}{2}\beta & \text{otherwise} \end{cases}$ |
| Huber Loss | $\sum_{i=1}^{n} \begin{cases} \frac{1}{2}(y_i - p_i)^2 & \lvert y_i - p_i \rvert \le \delta \\ \delta \left( \lvert y_i - p_i \rvert - \frac{1}{2}\delta \right) & \text{otherwise} \end{cases}$ |
| Mean Squared Error | $\frac{1}{n} \sum_{i=1}^{n} (y_i - p_i)^2$ |
| Soft Margin Loss | $\frac{1}{n} \sum_{i=1}^{n} \log\left(1 + \exp(-y_i \cdot p_i)\right)$ |
| Multi Margin Loss | $\frac{1}{n} \sum_{i=1}^{n} \sum_{j \neq y_i}^{C} \max(0, \text{margin} - p_{y_i} + p_j)$ |
| Multilabel Margin Loss | $\frac{1}{n} \sum_{i=1}^{n} \frac{1}{C} \sum_{j \in Y_i} \sum_{k \notin Y_i} \max\left(0, 1 - (p_{ij} - p_{ik})\right)$, where $Y_i$ is the set of positive labels for sample $i$ |
| Multilabel Soft Margin Loss | $-\frac{1}{n} \sum_{i=1}^{n} \frac{1}{C} \sum_{j=1}^{C} \left[ y_{ij} \log\left(\sigma(p_{ij})\right) + (1-y_{ij}) \log\left(1-\sigma(p_{ij})\right) \right]$ |
| Margin Ranking Loss | $\frac{1}{n} \sum_{i=1}^{n} \max\left(0, -y_i \cdot (x_i - x_i') + \text{margin}\right)$ |
| Triplet Margin Loss | $\frac{1}{n} \sum_{i=1}^{n} \max(0, \text{margin} + d_{i} - d_{i'})$, where $d_i$ is the anchor–positive and $d_{i'}$ the anchor–negative distance |
| Triplet Margin with Distance Loss | $\frac{1}{n} \sum_{i=1}^{n} \max(0, \text{margin} + d_{i} - d_{i'})$, with $d$ an arbitrary, user-supplied distance function |
| Focal Loss | $-\sum_{i=1}^{n} (1-p_i)^{\gamma} \log(p_i)$ |
| Online Triplet Loss | $\max(0, \text{margin} + d_{i} - d_{i'})$ |
| AUC Loss | $\frac{1}{2}\left(1 - \text{AUC}\right)$ |
| Contrastive Loss | $\frac{1}{n} \sum_{i=1}^{n} \left[ (1 - y_i) \cdot \frac{1}{2} d_{i}^2 + y_i \cdot \frac{1}{2} \max(0, \text{margin} - d_{i})^2 \right]$ |
| Angular Loss | $\frac{1}{n} \sum_{i=1}^{n} \max\left(0, m + \cos(\theta_{y_i} - \theta_{y_{\text{margin}}})\right)$ |
| Dice Loss | $1 - \frac{2\sum_{i=1}^{n} (p_i \cdot y_i) + \epsilon}{\sum_{i=1}^{n} p_i + \sum_{i=1}^{n} y_i + \epsilon}$ |
| Tversky Loss | $1 - \frac{\sum_{i=1}^{n} (p_i \cdot y_i) + \epsilon}{\sum_{i=1}^{n} (p_i \cdot y_i) + \alpha \sum_{i=1}^{n} (p_i \cdot (1-y_i)) + \beta \sum_{i=1}^{n} ((1-p_i) \cdot y_i) + \epsilon}$ |
| F-Beta Loss | $1 - (1 + \beta^2) \cdot \frac{\sum_{i=1}^{n} (p_i \cdot y_i) + \epsilon}{\beta^2 \sum_{i=1}^{n} y_i + \sum_{i=1}^{n} p_i + \epsilon}$ |
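To make the overlap-based entries concrete, the sketch below implements the Dice, Tversky, and F-Beta rows exactly as written in the table, assuming `p` holds predicted probabilities in $[0, 1]$ and `y` holds binary targets of the same shape. The function names and the default `eps`, `alpha`, and `beta` values are illustrative choices, not a reference implementation.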
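```python
import torch


def dice_loss(p: torch.Tensor, y: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # 1 - (2*sum(p*y) + eps) / (sum(p) + sum(y) + eps)
    intersection = (p * y).sum()
    return 1 - (2 * intersection + eps) / (p.sum() + y.sum() + eps)


def tversky_loss(p: torch.Tensor, y: torch.Tensor,
                 alpha: float = 0.5, beta: float = 0.5,
                 eps: float = 1e-6) -> torch.Tensor:
    # 1 - (TP + eps) / (TP + alpha*FP + beta*FN + eps), with "soft" TP/FP/FN
    tp = (p * y).sum()
    fp = (p * (1 - y)).sum()
    fn = ((1 - p) * y).sum()
    return 1 - (tp + eps) / (tp + alpha * fp + beta * fn + eps)


def f_beta_loss(p: torch.Tensor, y: torch.Tensor,
                beta: float = 1.0, eps: float = 1e-6) -> torch.Tensor:
    # 1 - (1 + beta^2) * (sum(p*y) + eps) / (beta^2*sum(y) + sum(p) + eps);
    # beta = 1 reduces to the Dice loss above
    tp = (p * y).sum()
    return 1 - (1 + beta ** 2) * (tp + eps) / (beta ** 2 * y.sum() + p.sum() + eps)


if __name__ == "__main__":
    p = torch.rand(4, 1, 8, 8)                   # predicted probabilities
    y = (torch.rand(4, 1, 8, 8) > 0.5).float()   # binary ground truth
    print(dice_loss(p, y), tversky_loss(p, y), f_beta_loss(p, y, beta=2.0))
```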
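The margin- and modulation-based rows translate just as directly. The following is a minimal sketch of the Focal and Contrastive losses under the table's conventions (one-hot targets and $p_i$ the probability of the true class for Focal; $y_i = 1$ marking a dissimilar pair and $d_i$ the embedding distance for Contrastive); the `gamma` and `margin` defaults are assumptions.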
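```python
import torch


def focal_loss(p: torch.Tensor, y_onehot: torch.Tensor, gamma: float = 2.0) -> torch.Tensor:
    # -sum_i (1 - p_i)^gamma * log(p_i), with p_i the predicted probability of
    # the true class of sample i (p: [batch, C] probabilities, y_onehot: one-hot)
    p_true = (p * y_onehot).sum(dim=-1).clamp_min(1e-12)
    return -((1 - p_true) ** gamma * torch.log(p_true)).sum()


def contrastive_loss(d: torch.Tensor, y: torch.Tensor, margin: float = 1.0) -> torch.Tensor:
    # (1/n) * sum_i [(1 - y_i) * 0.5 * d_i^2 + y_i * 0.5 * max(0, margin - d_i)^2],
    # where y_i = 0 for a similar pair and y_i = 1 for a dissimilar pair
    similar = (1 - y) * 0.5 * d.pow(2)
    dissimilar = y * 0.5 * torch.clamp(margin - d, min=0).pow(2)
    return (similar + dissimilar).mean()
```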
Discussion