Re: st: pointing to a class member function (likelihood) with optimize() in mata


From   [email protected] (Jeff Pitblado, StataCorp LP)
To   [email protected]
Subject   Re: st: pointing to a class member function (likelihood) with optimize() in mata
Date   Thu, 28 Oct 2010 10:19:51 -0500

Pierre-Carl Michaud <[email protected]> is trying to use -optimize()- with
an evaluator function implemented as a member function of a Mata class:

> In mata, I am trying to write a general class to estimate models. I  
> have had no problem with optimize outside of a class. I want now to  
> call optimize in the class and I have a problem with the line
>    optimize_init_evaluator(s, &func())
> The likelihood func() is a member function (method) in the class and  
> as such referring to &func() does not find it. I've tried  
> &classname::func() and no luck either. &this.func() does refer to  
> something but then I get
>    modelbase::func():  3001  expected 6 arguments but received 1
> same error message if I define func() in the class definition as a  
> static member function.
> Anyone has an idea of how to point to a class function so that  
> optimize() finds it and its arguments?

There is no mechanism in object-oriented programming (OOP) for calling a
class's public member function through its address.  The user-programmer would
have to know too much about how the class system is implemented, and the
system would have to provide hooks to make it possible at all.

Pierre-Carl can still use his general class; he just needs to provide
-optimize()- with a simple wrap-around function that calls through to the
class's evaluator function.  Here is a simple example, using the logistic
regression model.

First, let's define a class called 'logit_model' that has two private data
members and three public member functions.

	class logit_model {
	private:
		real colvector	y
		real matrix	X

	public:
		void set_y()
		void set_X()
		void eval()
	}

The 'set_y()' function simply provides a way for the programmer to supply the
class instance with a dependent variable; similarly, 'set_X()' provides a way
to supply the independent variables.  The 'eval()' function will compute the
log-likelihood, gradient, and Hessian for the logistic regression model.

The 'set_*()' functions simply make a copy of the supplied argument.

	void logit_model::set_y(real colvector y)
	{
		this.y = y
	}

	void logit_model::set_X(real matrix X)
	{
		this.X = X
	}

The 'eval()' function computes a log-likelihood value, gradient vector, and
Hessian matrix according to the requirements of -optimize()-'s "d2" evaluator
functions:

	void logit_model::eval(	real	scalar		todo,
				real	rowvector	beta,
				real	scalar		lnf,
				real	rowvector	g,
				real	matrix		H)
	{
		real colvector	pm
		real colvector	xb
		real colvector	lj
		real colvector	dllj
		real colvector	d2llj

		pm	= 2*(this.y :!= 0) :- 1
		xb	= this.X*beta'

		lj	= invlogit(pm:*xb)
		if (any(lj :== 0)) {
			lnf = .
			return
		}
		lnf = quadcolsum(ln(lj))
		if (todo == 0) return

		dllj	= pm :* invlogit(-pm:*xb)
		if (missing(dllj)) {
			lnf = .
			return
		}
		g	= quadcross(dllj, X)
		if (todo == 1) return

		d2llj	= abs(dllj) :* lj
		if (missing(d2llj)) {
			lnf = .
			return
		}
		H	= - quadcross(X, d2llj, X)
	}
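
For reference (this summary is ours, not part of the code above), write
Lambda() for invlogit(), x_j for the j-th row of X, and pm_j = +1 when y_j is
nonzero and -1 otherwise.  Then 'eval()' is computing the standard logistic
regression quantities

	lnf  =    sum_j ln Lambda(pm_j*x_j*beta')
	g    =    sum_j pm_j*Lambda(-pm_j*x_j*beta')*x_j
	H    =  - sum_j Lambda(x_j*beta')*Lambda(-x_j*beta')*(x_j)'x_j

that is, H is minus the familiar X'WX with weights Lambda(xb):*(1:-Lambda(xb)).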

The only other function we need is the wrap-around function.  It takes the
arguments of -optimize()-'s "d2" evaluators, plus a single extra argument for
the class instance; extra arguments declared with -optimize_init_argument()-
are passed immediately after the parameter vector, which is why the class
instance appears as the third argument.  We named the wrap-around function
'logit_eval()', and here it is:

	void logit_eval(	real	scalar		todo,
				real	rowvector	beta,
				class	logit_model	M,
				real	scalar		lnf,
				real	rowvector	g,
				real	matrix		H)
	{
		M.eval(todo,beta,lnf,g,H)
	}

Here is an example using the auto data, showing how this class and function
are used.

	sysuse auto
	gen one = 1

First we need to declare an instance of our class, and supply it with some
data for 'y' and 'X'.

	mata:
	M = logit_model()
	M.set_y(st_data(., "foreign"))
	M.set_X(st_data(., "mpg turn trunk one"))

Now we can work with -optimize()-.

First we initialize an -optimize()- handle,

	S = optimize_init()

then declare the wrap-around evaluator function and its type,

	optimize_init_evaluator(S, &logit_eval())
	optimize_init_evaluatortype(S, "d2")

then declare our class instance as an extra argument for the evaluator,

	optimize_init_argument(S, 1, M)

provide starting values, and finally perform the optimization.

	optimize_init_params(S, J(1,4,0))
	optimize(S)
	end
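
The log below stops with the call to -optimize()-, but if you also want to
inspect the fitted results before typing -end-, the -optimize_result_*()-
functions described in [M-5] optimize() will do it.  For example (not part of
the example above),

	b  = optimize_result_params(S)
	V  = optimize_result_V_oim(S)
	se = sqrt(diagonal(V))'

returns the coefficient vector, the observed-information-matrix variance
estimate, and the corresponding standard errors.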

Here is a log of the above example.  We added a call to Stata's -logit-
command to show that the results from -optimize()- match.

***** BEGIN:
. sysuse auto
(1978 Automobile Data)

. gen one = 1

. 
. mata:
------------------------------------------------- mata (type end to exit) -----
: 
: class logit_model {
> private:
>         real colvector  y
>         real matrix     X
> 
> public:
>         void set_y()
>         void set_X()
>         void eval()
> }

: 
: void logit_model::set_y(real colvector y)
> {
>         this.y = y
> }

: 
: void logit_model::set_X(real matrix X)
> {
>         this.X = X
> }

: 
: void logit_model::eval( real    scalar          todo,
>                         real    rowvector       beta,
>                         real    scalar          lnf,
>                         real    rowvector       g,
>                         real    matrix          H)
> {
>         real colvector  pm
>         real colvector  xb
>         real colvector  lj
>         real colvector  dllj
>         real colvector  d2llj
> 
>         pm      = 2*(this.y :!= 0) :- 1
>         xb      = this.X*beta'
> 
>         lj      = invlogit(pm:*xb)
>         if (any(lj :== 0)) {
>                 lnf = .
>                 return
>         }
>         lnf = quadcolsum(ln(lj))
>         if (todo == 0) return
> 
>         dllj    = pm :* invlogit(-pm:*xb)
>         if (missing(dllj)) {
>                 lnf = .
>                 return
>         }
>         g       = quadcross(dllj, X)
>         if (todo == 1) return
> 
>         d2llj   = abs(dllj) :* lj
>         if (missing(d2llj)) {
>                 lnf = .
>                 return
>         }
>         H       = - quadcross(X, d2llj, X)
> }

: 
: void logit_eval(        real    scalar          todo,
>                         real    rowvector       beta,
>                         class   logit_model     M,
>                         real    scalar          lnf,
>                         real    rowvector       g,
>                         real    matrix          H)
> {
>         M.eval(todo,beta,lnf,g,H)
> }

: 
: M = logit_model()

: M.set_y(st_data(., "foreign"))

: M.set_X(st_data(., "mpg turn trunk one"))

: 
: S = optimize_init()

: optimize_init_evaluator(S, &logit_eval())

: optimize_init_evaluatortype(S, "d2")

: optimize_init_argument(S, 1, M)

: optimize_init_params(S, J(1,4,0))

: optimize(S)
Iteration 0:   f(p) = -51.292891  
Iteration 1:   f(p) = -25.856762  
Iteration 2:   f(p) = -25.685523  
Iteration 3:   f(p) = -25.685148  
Iteration 4:   f(p) = -25.685148  
                  1              2              3              4
    +-------------------------------------------------------------+
  1 |  -.0776544663   -.6146979766   -.0214041449    24.53154516  |
    +-------------------------------------------------------------+

: 
: end
-------------------------------------------------------------------------------

. 
. logit for mpg turn trunk

Iteration 0:   log likelihood =  -45.03321  
Iteration 1:   log likelihood = -28.007564  
Iteration 2:   log likelihood = -25.802293  
Iteration 3:   log likelihood = -25.685271  
Iteration 4:   log likelihood = -25.685148  
Iteration 5:   log likelihood = -25.685148  

Logistic regression                               Number of obs   =         74
                                                  LR chi2(3)      =      38.70
                                                  Prob > chi2     =     0.0000
Log likelihood = -25.685148                       Pseudo R2       =     0.4296

------------------------------------------------------------------------------
     foreign |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         mpg |  -.0776545   .0708997    -1.10   0.273    -.2166154    .0613065
        turn |   -.614698   .1599212    -3.84   0.000    -.9281377   -.3012583
       trunk |  -.0214041   .1120528    -0.19   0.849    -.2410236    .1982153
       _cons |   24.53155   6.785011     3.62   0.000     11.23317    37.82992
------------------------------------------------------------------------------
***** END:

--Jeff					-- Hua
[email protected]			[email protected]

