aif360.sklearn.metrics.kl_divergence

aif360.sklearn.metrics.kl_divergence(y_true, y_pred=None, *, prot_attr=None, priv_group=1, sample_weight=None)[source]

Compute the Kullback-Leibler divergence, \(KL(P_p||P_u) = \sum_y P_p(y)\log\left(\frac{P_p(y)}{P_u(y)}\right)\)

where \(P_p\) is the probability distribution over labels for the privileged group and, similarly, \(P_u\) is the distribution for the unprivileged group.
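
As a concrete illustration of the formula (independent of aif360, with hypothetical distributions), the divergence between two binary label distributions can be computed directly:

>>> import numpy as np
>>> P_p = np.array([0.7, 0.3])  # privileged group: P_p(y=1), P_p(y=0)
>>> P_u = np.array([0.5, 0.5])  # unprivileged group: P_u(y=1), P_u(y=0)
>>> kl = np.sum(P_p * np.log(P_p / P_u))  # KL(P_p || P_u) ≈ 0.082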

Parameters:
  • y_true (pandas.Series) – Ground truth (correct) target values. If y_pred is provided, this is ignored.

  • y_pred (array-like, optional) – Estimated targets as returned by a classifier.

  • prot_attr (array-like, keyword-only) – Protected attribute(s). If None, all protected attributes in y_true are used.

  • priv_group (scalar, optional) – The label of the privileged group. Default is 1.

  • sample_weight (array-like, optional) – Sample weights.

Returns:

float – The KL divergence between the label distributions of the privileged and unprivileged groups.
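
A minimal usage sketch, assuming the aif360.sklearn convention that y_true is a pandas.Series whose index carries the protected attribute; the data, the attribute name 'sex', and the group coding here are hypothetical:

>>> import pandas as pd
>>> from aif360.sklearn.metrics import kl_divergence
>>> # Protected attribute 'sex' lives in the index; 1 marks the privileged group.
>>> idx = pd.MultiIndex.from_arrays([[1, 1, 0, 0, 1, 0]], names=['sex'])
>>> y_true = pd.Series([1, 0, 1, 1, 0, 0], index=idx, name='label')
>>> kl_divergence(y_true, prot_attr='sex', priv_group=1)  # returns KL(P_p || P_u) as a float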