Notice: On March 31, it was **announced** that Statalist is moving from an email list to a **forum**. The old list will shut down at the end of May, and its replacement, **statalist.org**, is already up and running.


From: "Michael C. Morrison" <Morrimic@niacc.edu>

To: statalist@hsphsun2.harvard.edu

Subject: Re: st: Re: st: Searching for Kullback–Leibler divergence

Date: Sun, 09 May 2010 13:21:31 -0500

If you look at this article: http://www.stat.psu.edu/reports/1997/tr9702.pdf

Mike

On 5/9/2010 12:32 PM, Tirthankar Chakravarty wrote:

Michael,

That, I think, is a slightly harder problem. See here and the references within: http://www.tsc.uc3m.es/~fernando/bare_conf3.pdf

Most of these references ([21], [12], [5], [22], [13], [18]) are recent and fairly involved. If you have an algorithm in mind, that would be very helpful in answering your question / supplying you with code. Eqn (4) in the link above is fairly easily programmable. However, it would be much easier if I could see what you have in mind in situ, so a reference to an application would be great.

T

2010/5/9 Michael C. Morrison <Morrimic@niacc.edu>:
> Tirthankar Chakravarty advised that I look into -multigof- for the Kullback–Leibler divergence. Thanks for the response, but -multigof- is not what I'm looking for.
>
> Kullback–Leibler divergence is sometimes referred to as 'relative entropy' or 'cross entropy'. The Kullback–Leibler divergence that I need summarizes the effect of location and shape changes on the overall relative distribution of two continuous distributions. The Kullback–Leibler divergence has a simple interpretation in terms of the relative distribution, and it is decomposable into location, shape, and other components.
>
> I have -reldist-. It does a great job of plotting relative & cumulative PDFs, location/shape shift changes, and polarization coefficients, but it doesn't provide a measure of the overall distributional difference between two distributions. That's where the Kullback–Leibler divergence comes to the rescue. The advantage of the Kullback–Leibler divergence is that it is decomposable.
>
> Hope this clarifies what I'm searching for.
>
> Mike

*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/

--
To every ω-consistent recursive class κ of formulae there correspond recursive class signs r, such that neither v Gen r nor Neg(v Gen r) belongs to Flg(κ) (where v is the free variable of r).
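For the continuous case being discussed, the quantity itself is D(P‖Q) = ∫ p(x) log(p(x)/q(x)) dx. The following is a minimal numerical sketch of that definition in Python rather than Stata; the choice of two normal densities, the integration grid, and the tolerances are illustrative assumptions, not anything from the thread or from the linked references:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def kl_divergence(p, q, lo, hi, n=200000):
    """Approximate D(P||Q) = integral of p(x) log(p(x)/q(x)) dx
    by the trapezoid rule on [lo, hi] with n subintervals."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        px, qx = p(x), q(x)
        term = px * math.log(px / qx) if px > 0 else 0.0
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * term
    return total * h

# Illustrative check: for two normals the divergence has a closed form,
# D(N(m1,s1^2) || N(m2,s2^2)) = log(s2/s1) + (s1^2 + (m1-m2)^2)/(2 s2^2) - 1/2.
m1, s1, m2, s2 = 0.0, 1.0, 1.0, 1.5
numeric = kl_divergence(lambda x: normal_pdf(x, m1, s1),
                        lambda x: normal_pdf(x, m2, s2), -15.0, 15.0)
exact = math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5
```

Note the divergence is not symmetric (D(P‖Q) ≠ D(Q‖P) in general), so which distribution plays the reference role matters when comparing two empirical distributions.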


**References:**

- **st: Searching for Kullback–Leibler divergence** *From:* "Michael C. Morrison" <Morrimic@niacc.edu>
- **st: Re: st: Searching for Kullback–Leibler divergence** *From:* Tirthankar Chakravarty <tirthankar.chakravarty@gmail.com>
