

st: mata moptimize with Nelder-Mead option: why does it care about Hessian?

From   "Prieger, James" <>
To   <>
Subject   st: mata moptimize with Nelder-Mead option: why does it care about Hessian?
Date   Fri, 25 Jun 2010 16:26:24 -0700

Does anyone have experience using moptimize in Mata with technique "nm"
(Nelder-Mead)?  I am trying to use it, but upon execution get the error

"Hessian is not negative semidefinite"

Why would moptimize care about the Hessian if I'm using Nelder-Mead?  I
thought the point of the simplex method was that it works even when the
likelihood function isn't continuous or differentiable.  Does
evaluatortype d0 still try to compute first and second derivatives anyway?
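For context, here is a minimal sketch (names and data are hypothetical, not from my actual problem) of what a "d0" evaluator looks like: moptimize calls it with todo==0 and expects only the overall function value in fv, so the evaluator itself never supplies g or H. My understanding is that any derivatives moptimize needs, such as a Hessian for the VCE, it computes numerically on its own.

```
// Sketch of a d0 evaluator: only fv is filled in; g and H are ignored.
void myll(transmorphic M, real scalar todo, real rowvector b,
          fv, g, H)
{
    real colvector y, xb
    y  = moptimize_util_depvar(M, 1)   // first depvar registered with M
    xb = moptimize_util_xb(M, b, 1)    // linear index for equation 1
    fv = -sum((y :- xb):^2)            // toy "log likelihood" to maximize
}
```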


Code excerpt:

M = moptimize_init()
moptimize_init_evaluator(M, &LLiklRenApp())
moptimize_init_evaluatortype(M, "d0")
moptimize_init_depvar(M, 1, "y c0 c1 c2 c3 z0 z1 z2 z3 a g tau d")  // yes, this is a messy problem!
moptimize_init_depvar(M, 2, "")
moptimize_init_eq_indepvars(M, 1, "x")
moptimize_init_eq_indepvars(M, 2, "")
moptimize_init_eq_indepvars(M, 3, "")
moptimize_init_technique(M, "nm")
delta = J(1,4,.1)
moptimize_init_nmsimplexdeltas(M, delta)
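For reference, a stripped-down, self-contained version of the same setup that I believe does run (constant-only equation estimating a mean; the variable name y and the evaluator name are placeholders) looks like this, with one simplex delta per free parameter and an explicit call to moptimize():

```
mata:
void mymean(transmorphic M, real scalar todo, real rowvector b, fv, g, H)
{
    real colvector y
    y  = moptimize_util_depvar(M, 1)
    fv = -sum((y :- b[1]):^2)           // maximizing minus squared error
}

M = moptimize_init()
moptimize_init_evaluator(M, &mymean())
moptimize_init_evaluatortype(M, "d0")
moptimize_init_depvar(M, 1, "y")        // assumes a variable y in memory
moptimize_init_eq_indepvars(M, 1, "")   // constant only: 1 free parameter
moptimize_init_technique(M, "nm")
moptimize_init_nmsimplexdeltas(M, (.1)) // one delta per parameter
moptimize(M)
moptimize_result_display(M)
end
```

In my messier problem above, the three equations have four free parameters in total, which is why delta is 1 x 4.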


James Prieger
Associate Professor
Pepperdine University School of Public Policy
