
Measure to compare true observed labels with predicted probabilities in multiclass classification tasks.

Details

The Log Loss (a.k.a. Bernoulli Loss, Logistic Loss, Cross-Entropy Loss) is defined as $$ -\frac{1}{n} \sum_{i=1}^n w_i \log \left( p_i \right) $$ where \(p_i\) is the predicted probability of the true class of observation \(i\) and \(w_i\) are normalized weights for each observation \(x_i\).
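
For illustration, a minimal base-R sketch of this formula for the unweighted case (all \(w_i = 1\)); the vector name prob_true is hypothetical and simply holds the predicted probability of each observation's true class:

# predicted probability of the true class for three observations
prob_true = c(0.8, 0.6, 0.3)

# unweighted log loss: -1/n * sum(log(p_i))
-mean(log(prob_true))
#> [1] 0.6459807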

Note

The score function calls mlr3measures::logloss() from package mlr3measures.

If the measure is undefined for the input, NaN is returned. This can be customized by setting the field na_value.
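
As stated above, the behaviour for undefined inputs can be changed by setting the field na_value on the measure object; a minimal sketch (the replacement value 0 is arbitrary and chosen only for illustration):

library(mlr3)

measure = msr("classif.logloss")

# return 0 instead of NaN when the score is undefined
measure$na_value = 0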

Dictionary

This Measure can be instantiated via the dictionary mlr_measures or with the associated sugar function msr():

mlr_measures$get("classif.logloss")
msr("classif.logloss")

Parameters

Empty ParamSet

Meta Information

  • Type: "classif"

  • Range: \([0, \infty)\)

  • Minimize: TRUE

  • Required prediction: prob