


st: RE: Searching for Kullback-Leiber divergence

From   "Nick Cox" <>
To   <>
Subject   st: RE: Searching for Kullback-Leiber divergence
Date   Sun, 9 May 2010 17:15:24 +0100

I suspect that you will need to program it yourself. 

The correct spelling is Leibler, not Leiber, but even running -findit-
with correct author names finds nothing. (Mind you, it doesn't find
-multgof-, either.) 


Michael C. Morrison

Tirthankar Chakravarty advised that I look into -multigof- for the 
Kullback-Leiber divergence. Thanks for the response, but -multigof- is 
not what I'm looking for.

Kullback-Leiber divergence is also known as 'relative entropy' (it is 
closely related to, but distinct from, cross entropy). The 
Kullback-Leiber divergence that I need summarizes the effect of location 
and shape changes on the overall relative distribution involving two 
continuous distributions. The Kullback-Leiber divergence has a simple 
interpretation in terms of the relative distribution, and it is 
decomposable into location, shape, and other components.
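As an illustration of what is being asked for (a sketch in Python rather than Stata, since no Stata routine exists; the function names and the simple shared-histogram plug-in estimator are my own choices, not an established command):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q) in nats for two probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

def kl_from_samples(x, y, bins=50):
    """Plug-in estimate of D(F || G) from two samples via a shared histogram."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    edges = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    return kl_divergence(p / p.sum(), q / q.sum())
```

The histogram estimator is only a rough plug-in; smoother estimates would use kernel densities, as -reldist- does for its plots.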

I have -reldist-. It does a great job of plotting relative and 
cumulative PDFs, location/shape shift changes, and polarization 
coefficients, but it doesn't provide a measure of the overall 
distributional difference between two distributions. That's where the 
Kullback-Leiber divergence comes to the rescue. The advantage of the 
Kullback-Leiber divergence is that it is decomposable.
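The decomposability can be seen concretely in the normal case, where the closed-form divergence splits exactly into a location term and a shape (scale) term (a minimal sketch; the function names are hypothetical, and this two-part split is the textbook Gaussian case, not the full relative-distribution decomposition):

```python
import numpy as np

def kl_normal(m1, s1, m2, s2):
    """Closed-form D(N(m1, s1^2) || N(m2, s2^2)) in nats."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def kl_normal_decomposed(m1, s1, m2, s2):
    """Split the same quantity into location and shape components."""
    location = (m1 - m2)**2 / (2 * s2**2)                 # zero when means agree
    shape = np.log(s2 / s1) + s1**2 / (2 * s2**2) - 0.5   # zero when sds agree
    return location, shape
```

The two components sum exactly to the total divergence, which is the kind of additive accounting of "how much of the difference is location, how much is shape" being described above.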

Hope this clarifies what I'm searching for.

