Why does exponentiating the log probabilities from `score_samples()` in `sklearn.mixture.GMM` give values greater than 1?


#1

Why does exponentiating the log probabilities returned by `score_samples()` in `sklearn.mixture.GMM` give values greater than 1?

Shouldn’t the exponential of a log probability lie in (0, 1)?


#2

Quote from https://github.com/scikit-learn/scikit-learn/issues/4202

These are not probabilities, but probability densities. Because of that, they won’t sum to 1, but they will integrate to 1.

Take a simple 1D example where the probability is zero except in the range (0, 0.1). Then the probability density must have an average value of 10 in that region for the normalization criterion to be met!
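To see why a density can exceed 1, consider a univariate Gaussian: its peak value is $1/(\sigma\sqrt{2\pi})$, which is greater than 1 whenever the standard deviation is small enough:

$$
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}},
\qquad
f(\mu) = \frac{1}{\sigma\sqrt{2\pi}} > 1
\iff
\sigma < \frac{1}{\sqrt{2\pi}} \approx 0.399
$$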
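As a quick check, here is a minimal sketch using the current `GaussianMixture` API (which replaced the deprecated `GMM` class; the data range, sample size, and seed below are arbitrary choices for illustration). Fitting a mixture to data concentrated in a narrow interval yields densities well above 1:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Samples concentrated in the narrow interval (0, 0.1)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 0.1, size=500).reshape(-1, 1)

gmm = GaussianMixture(n_components=1, random_state=0).fit(X)

log_density = gmm.score_samples(X)  # log of the probability *density* at each sample
density = np.exp(log_density)       # densities, not probabilities -- can exceed 1

print(density.max())  # well above 1 for data this concentrated
```

The exponentiated values sum to far more than 1 across the samples, yet integrating the fitted density over the whole real line still gives exactly 1.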