


st: Searching for Kullback–Leibler divergence

From   "Michael C. Morrison" <>
Subject   st: Searching for Kullback–Leibler divergence
Date   Sun, 09 May 2010 11:11:42 -0500

Tirthankar Chakravarty advised that I look into -multigof- for the Kullback–Leibler divergence. Thanks for the response, but -multigof- is not what I'm looking for.

The Kullback–Leibler divergence is sometimes referred to as 'relative entropy' or 'cross entropy'. The Kullback–Leibler divergence that I need summarizes the effect of location and shape changes on the overall relative distribution of two continuous distributions. It has a simple interpretation in terms of the relative distribution, and it is decomposable into location, shape, and other components.
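For concreteness, the quantity in question is D(P || Q) = sum_x p(x) * log(p(x)/q(x)). Below is a minimal sketch of that definition in Python rather than Stata (just to pin down what is being computed, not a substitute for a Stata command); the function name and the discrete-vector setup are my own illustration:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) between two discrete
    probability vectors over the same support.

    Terms with p(x) = 0 contribute 0 by the usual convention;
    q(x) must be positive wherever p(x) is positive.
    """
    return sum(pi * math.log(pi / qi)
               for pi, qi in zip(p, q)
               if pi > 0)

# Identical distributions have zero divergence.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))   # 0.0

# A point mass versus a fair coin diverges by log 2.
print(kl_divergence([1.0, 0.0], [0.5, 0.5]))   # log(2) ~ 0.693
```

For continuous distributions the sum becomes an integral over the densities, which in practice is estimated from binned or kernel density estimates of the two samples.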

I have -reldist-. It does a great job of plotting relative and cumulative PDFs, location/shape shift changes, and polarization coefficients, but it doesn't provide a measure of the overall distributional difference between two distributions. That's where the Kullback–Leibler divergence comes to the rescue: its advantage is that it is decomposable.

Hope this clarifies what I'm searching for.

