Statalist



Re: st: RE: Linear Regression


From   "Brian P. Poi" <bpoi@stata.com>
To   statalist@hsphsun2.harvard.edu
Subject   Re: st: RE: Linear Regression
Date   Fri, 8 Aug 2008 09:18:02 -0500 (CDT)

On Fri, 8 Aug 2008, Meric Osman wrote:

> Hi,
>
> I have a problem writing my regression equation. I want to include a
> variable and its square at the same time, such as
>
> y = a + a^2 + ...
>
> Actually, I can do it by generating a new variable equal to a*a, but I
> don't want to do that because it is not efficient.
>
> I would really appreciate it if you could help me with this.

One solution is to use the -nl- command instead of -regress-. Say we want to fit the regression

mpg = b0 + b1*gear + b2*gear^2 + error

Then we would type

. sysuse auto
. nl (mpg = {b0} + {b1}*gear + {b2}*gear^2), variables(gear)

An advantage of using -nl- is that we can then get the marginal effect for gear very easily using -mfx-:

. mfx compute
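To see what -mfx- is computing here (a worked note, not in the original output): differentiating the quadratic model with respect to gear gives

    d(mpg)/d(gear) = b1 + 2*b2*gear

so the marginal effect depends on the value of gear; by default -mfx- evaluates it at the means of the variables.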

Had we used -regress-, we could not have used -mfx-. To use -regress-, we would have needed to create a new variable gearsq, say, containing gear^2. But when we call -mfx-, it has no way of knowing that gearsq and gear are related. Using -nl- obviates this issue. When you intend to use -mfx- after -nl-, you need to use the variables() option. Otherwise, -mfx- has no way of knowing what the variables in the model are.
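For comparison, the -regress- route described above would look like this (a sketch; the variable name gearsq is just illustrative):

. sysuse auto, clear
. generate gearsq = gear^2
. regress mpg gear gearsq

After this, -mfx- would report separate "marginal effects" for gear and gearsq as if they were unrelated regressors, which is not the marginal effect b1 + 2*b2*gear. That is precisely the problem that -nl- with the variables() option avoids.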

For more tricks with -nl-, see Poi (2008, "Stata tip 58: nl is not just for nonlinear models", Stata Journal vol. 8. no. 1).

-- Brian Poi
-- bpoi@stata.com
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
