Statalist



RE: st: Mata and Optimize Command


From   "Randy Akee " <[email protected]>
To   <[email protected]>
Subject   RE: st: Mata and Optimize Command
Date   Mon, 24 Sep 2007 18:04:06 +0200

Thanks for this, this was extremely helpful.
R

-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Jeff
Pitblado, StataCorp LP
Sent: Montag, 24. September 2007 16:49
To: [email protected]
Subject: Re: st: Mata and Optimize Command

Randy Akee <[email protected]> is getting a conformability error in the
-md0()- Mata routine he wrote for use with -optimize()-:

> I am trying to use the Mata -optimize()- command to get a minimum
> distance estimator of some parameters of interest.  I have six
> equations and six unknown parameters - my actual problem is a bit
> more complex, but I can't seem to get the simple version to work
> below:
> 
> I've typed in two matrices, c and f.  I'm trying to find the
> parameters, p, that minimize (c-f(p))'(c-f(p)).
> 
> The error that I get is that the function is not conformable; does
> anyone have a suggestion on how I could correct this in what I've
> done?  I believe the issue is that I need somehow to specify the
> dimensions of the parameter vector, p.  I'm just not sure if that is
> possible to do; any ideas would be appreciated.

Randy also reported the Mata code that reproduces the error.  Here is
Randy's evaluator function:

>	void md0(todo, p, c, f, lnf, S, H)
>	{
>		lnf = (c-f*p)'*(c-f*p)
>	}

Here is a brief description of the arguments to this function:

	Input variables:
	----------------
	todo	-- a scalar message variable from -optimize()- that
		   indicates whether to compute 1st and 2nd order
		   derivatives; this variable can safely be ignored if
		   your routine is not going to compute derivatives
	p	-- the current value of the parameter vector (a rowvector)
	c	-- Randy's first user-defined argument
>			: c
>			       1
>			    +-----+
>			  1 |  1  |
>			  2 |  2  |
>			  3 |  3  |
>			  4 |  4  |
>			  5 |  5  |
>			  6 |  6  |
>			    +-----+
	f	-- Randy's second user-defined argument
>			: f
>			       1   2   3   4   5   6
>			    +-------------------------+
>			  1 |  1   1   0   0   0   0  |
>			  2 |  0   1   1   0   0   0  |
>			  3 |  0   0   1   1   0   0  |
>			  4 |  0   0   0   1   1   0  |
>			  5 |  0   0   0   0   1   1  |
>			  6 |  1   0   0   0   0   1  |
>			    +-------------------------+

	Output variables:
	-----------------
	lnf	-- the value of the objective function that is being
		   optimized
	g	-- the gradient vector (the argument Randy named 'S')
	H	-- the Hessian matrix

The problem with Randy's evaluator is that with 'c' a 6x1 column vector
and 'f' a 6x6 matrix, 'p' would need to be a 6x1 column vector.  There
are two reasons why this is a problem:

	1.  -optimize()- requires that 'p' be a rowvector.

	2.  Randy set the starting values using

>		: optimize_init_params(S,(0,0))

	    which is a 1x2 rowvector.

In this case, 'p' needs to be a 1x6 rowvector, so there are missing
transpose operators in Randy's -md0()- function.  Here is how I would
code Randy's evaluator and starting values:

	void md0(todo, p, c, f, lnf, S, H)
	{
        	real colvector  diff
        	diff = c - f*p'
        	lnf = cross(diff,diff)
	}

	optimize_init_params(S, J(1,6,0))

[-cross(z,z)- is faster than -z'*z-]
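
For a quick sanity check that the two forms agree, here is a small
illustration with a made-up test vector:

	: z = (1\2\3)
	: z'*z
	  14
	: cross(z,z)
	  14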

Note that 'lnf' will be a scalar, but Randy coded 'md0()' as a type -v0-
evaluator.

If Randy wants to return a column vector of the squared differences, he
can use the following function evaluator:

	void mv0(todo, p, c, f, lnf, S, H)
	{
        	lnf = (c - f*p') :* (c - f*p')
	}

Here is the do-file I composed while looking into Randy's code.

***** BEGIN:
mata:

c = (1,2,3,4,5,6)'

f = (1,1,0,0,0,0 \
     0,1,1,0,0,0 \
     0,0,1,1,0,0 \
     0,0,0,1,1,0 \
     0,0,0,0,1,1 \
     1,0,0,0,0,1)

void md0(todo, p, c, f, lnf, S, H)
{
        real colvector  diff
        diff = c - f*p'
        lnf = cross(diff,diff)
}

Sd = optimize_init()
optimize_init_evaluator(Sd, &md0())
optimize_init_evaluatortype(Sd, "d0")
optimize_init_params(Sd, J(1,6,0))
optimize_init_which(Sd, "min")
optimize_init_argument(Sd, 1, c)
optimize_init_argument(Sd, 2, f)
optimize(Sd)

void mv0(todo, p, c, f, lnf, S, H)
{
        lnf = (c - f*p') :* (c - f*p')
}

Sv = optimize_init()
optimize_init_evaluator(Sv, &mv0())
optimize_init_evaluatortype(Sv, "v0")
optimize_init_params(Sv, J(1,6,0))
optimize_init_which(Sv, "min")
optimize_init_argument(Sv, 1, c)
optimize_init_argument(Sv, 2, f)
optimize(Sv)

end
***** END:
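
As a small follow-up sketch (not part of Randy's original question):
once -optimize(Sd)- has run, the fitted parameter vector and the value
of the objective at the optimum can be recovered from the handle with

	: optimize_result_params(Sd)
	: optimize_result_value(Sd)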

--Jeff
[email protected]
*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/



