Statalist



RE: st: Mata and Parameter Constraints


From   "Randy Akee" <akee@iza.org>
To   <statalist@hsphsun2.harvard.edu>
Subject   RE: st: Mata and Parameter Constraints
Date   Tue, 25 Sep 2007 10:09:49 +0200

Dear Statalist,

I am interested in including parameter interactions in my Mata
optimization routine.  As it stands, the parameters (contained in the
vector p) enter my objective function only linearly.

For example, given the Mata optimization below,
>			: c
>			       1
>			    +-----+
>			  1 |  1  |
>			  2 |  2  |
>			  3 |  3  |
>			  4 |  4  |
>			  5 |  5  |
>			  6 |  6  |
>			    +-----+
>			: f
>			       1   2   3   4   5   6
>			    +-------------------------+
>			  1 |  1   1   0   0   0   0  |
>			  2 |  0   1   1   0   0   0  |
>			  3 |  0   0   1   1   0   0  |
>			  4 |  0   0   0   1   1   0  |
>			  5 |  0   0   0   0   1   1  |
>			  6 |  1   0   0   0   0   1  |
>			    +-------------------------+
>
>void mv0(todo, p, c, f, lnf, S, H)
>{
>        lnf = (c - f*p') :* (c - f*p')
>}
>
>Sv = optimize_init()
>optimize_init_evaluator(Sv, &mv0())
>optimize_init_evaluatortype(Sv, "v0")
>optimize_init_params(Sv, J(1,6,0))
>optimize_init_which(Sv, "min")
>optimize_init_argument(Sv, 1, c)
>optimize_init_argument(Sv, 2, f)
>optimize(Sv)
>
>end


The first row equation from the function lnf = (c - f*p') :* (c - f*p')
would be:

lnf[1] = (1 - p1 - p2)*(1 - p1 - p2)

*** where p1 and p2 are the first two elements (of six) of the
parameter vector p (to be solved via Mata's optimization)

And the second row equation would be:

lnf[2] = (2 - p2 - p3)*(2 - p2 - p3)

I'm interested in having an additional term in these equations, such as
p1*p2 or p2*p3, etc., so that the new first row equation might look
like:

lnf[1] = (1 - p1 - p2 - p1*p3)*(1 - p1 - p2 - p1*p3)

My question really comes down to how one can create these parameter
interactions.  Does this problem require passing several functions to
the optimization command (i.e. one for each row equation)?  Or is there
some other solution available?
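
One thought I had is to build the interaction directly into the
residual inside the evaluator, since the evaluator is just ordinary
Mata code.  Something like the following untested sketch, where the
name mv0_int and the p1*p3 term are only for illustration:

	void mv0_int(todo, p, c, f, lnf, S, H)
	{
		real colvector  r
		r = c - f*p'
		r[1] = r[1] - p[1]*p[3]		// interaction in row 1 only
		lnf = r :* r
	}

I am not sure whether this is the intended way to handle nonlinear
terms, or whether it causes trouble for the numerical derivatives.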

Mata does have a function, -optimize_init_constraints()-, which allows
for constraining the parameters; however, it is not going to be of use
to me, as it only allows for linear constraints.
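
(As I read the help file, -optimize_init_constraints()- takes a
c x (k+1) matrix, coefficient columns first and the constant last, so
for my six parameters it can only impose restrictions such as p1 = p2:

	optimize_init_constraints(Sv, (1, -1, 0, 0, 0, 0, 0))	// p1 - p2 = 0

and there is no way to write a product like p1*p3 in that form.)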

Thanks for any insight here.
Randy



-----Original Message-----
From: owner-statalist@hsphsun2.harvard.edu
[mailto:owner-statalist@hsphsun2.harvard.edu] On Behalf Of Jeff
Pitblado, StataCorp LP
Sent: Montag, 24. September 2007 16:49
To: statalist@hsphsun2.harvard.edu
Subject: Re: st: Mata and Optimize Command

Randy Akee <akee@iza.org> is getting a conformability error in the
-md0()- Mata routine he wrote for use with -optimize()-:

> I am trying to use the Mata -optimize()- command to get a minimum
> distance estimator of some parameters of interest.  I have six
> equations and six unknown parameters - my actual problem is a bit
> more complex, but I can't seem to get the simple version to work
> below:
> 
> I've typed in two matrices, c and f.  I'm trying to find the
> parameters, p, that minimize: (c-f(p))'(c-f(p)).
> 
> The error that I get is that the function is not conformable; does
> anyone have a suggestion on how I could correct this in what I've
> done?  I believe the issue is that I need somehow to specify the
> dimensions of the parameter vector, p.  I'm just not sure if that is
> possible to do; any ideas would be appreciated.

Randy also reported the Mata code that reproduces the error.  Here is
Randy's evaluator function:

>	void md0(todo, p, c, f, lnf, S, H)
>	{
>		lnf = (c-f*p)'*(c-f*p)
>	}

Here is a brief description of the arguments to this function:

	Input variables:
	----------------
	todo	-- a scalar message variable from -optimize()- that
		   indicates whether to compute 1st and 2nd order
		   derivatives; this variable can safely be ignored if
		   your routine is not going to compute derivatives
	p	-- the current value of the parameter vector (a rowvector)
	c	-- Randy's first user-defined argument
>			: c
>			       1
>			    +-----+
>			  1 |  1  |
>			  2 |  2  |
>			  3 |  3  |
>			  4 |  4  |
>			  5 |  5  |
>			  6 |  6  |
>			    +-----+
	f	-- Randy's second user-defined argument
>			: f
>			       1   2   3   4   5   6
>			    +-------------------------+
>			  1 |  1   1   0   0   0   0  |
>			  2 |  0   1   1   0   0   0  |
>			  3 |  0   0   1   1   0   0  |
>			  4 |  0   0   0   1   1   0  |
>			  5 |  0   0   0   0   1   1  |
>			  6 |  1   0   0   0   0   1  |
>			    +-------------------------+

	Output variables:
	-----------------
	lnf	-- the value of the objective function that is being
		   optimized
	g	-- the gradient vector
	H	-- the Hessian matrix
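
If you want -optimize()- to use analytic first derivatives, a type -d1-
evaluator fills in 'g' as well as 'lnf' whenever todo>=1.  For this
quadratic objective, a sketch (the name md1 is just illustrative) would
be:

	void md1(todo, p, c, f, lnf, g, H)
	{
		real colvector  diff
		diff = c - f*p'
		lnf  = cross(diff, diff)
		if (todo >= 1) {
			g = (-2*cross(f, diff))'	// 1 x 6 gradient of diff'diff
		}
	}

together with -optimize_init_evaluatortype(S, "d1")-.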

The problem with Randy's evaluator is that with 'c' a 6x1 column vector
and 'f' a 6x6 matrix, 'p' would need to be a 6x1 column vector.  There
are two reasons why this is a problem:

	1.  -optimize()- requires that 'p' is a rowvector

	2.  Randy set the starting values using

>		: optimize_init_params(S,(0,0))

	    which is a 1x2 rowvector.

In this case, 'p' needs to be a 1x6 rowvector, so there are missing
transpose operators in Randy's -md0()- function.  Here is how I would
code Randy's evaluator and starting values:

	void md0(todo, p, c, f, lnf, S, H)
	{
        	real colvector  diff
        	diff = c - f*p'
        	lnf = cross(diff,diff)
	}

	optimize_init_params(S, J(1,6,0))

[-cross(z,z)- is faster than -z'*z-]

Note that 'lnf' will be a scalar, but Randy coded 'md0()' as a type -v0-
evaluator.

If Randy wants to return a column vector of the squared differences, he
can use the following function evaluator:

	void mv0(todo, p, c, f, lnf, S, H)
	{
        	lnf = (c - f*p') :* (c - f*p')
	}

Here is the do-file I composed while looking into Randy's code.

***** BEGIN:
mata:

c = (1,2,3,4,5,6)'

f = (1,1,0,0,0,0 \ 0,1,1,0,0,0 \ 0,0,1,1,0,0 \
     0,0,0,1,1,0 \ 0,0,0,0,1,1 \ 1,0,0,0,0,1)

void md0(todo, p, c, f, lnf, S, H)
{
        real colvector  diff
        diff = c - f*p'
        lnf = cross(diff,diff)
}

Sd = optimize_init()
optimize_init_evaluator(Sd, &md0())
optimize_init_evaluatortype(Sd, "d0")
optimize_init_params(Sd, J(1,6,0))
optimize_init_which(Sd, "min")
optimize_init_argument(Sd, 1, c)
optimize_init_argument(Sd, 2, f)
optimize(Sd)

void mv0(todo, p, c, f, lnf, S, H)
{
        lnf = (c - f*p') :* (c - f*p')
}

Sv = optimize_init()
optimize_init_evaluator(Sv, &mv0())
optimize_init_evaluatortype(Sv, "v0")
optimize_init_params(Sv, J(1,6,0))
optimize_init_which(Sv, "min")
optimize_init_argument(Sv, 1, c)
optimize_init_argument(Sv, 2, f)
optimize(Sv)

end
***** END:

--Jeff
jpitblado@stata.com
*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/




© Copyright 1996–2014 StataCorp LP