
When I use the eyex option of margins, what is it actually computing and how does it relate to the coefficients of the loglinear model?

Title:   Obtaining elasticities for independent variables
Authors: May Boggess, StataCorp
         Kristin MacDonald, StataCorp

The eyex() option causes margins to compute d(log f)/d(log x), where f is the prediction function specified in the predict() option of margins or, if none was specified, the default prediction option for the preceding estimation command.
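
For example, which prediction f refers to can matter after nonlinear models. As a minimal illustrative sketch (output omitted; the logit model and the xb prediction are used here purely for illustration and do not appear in the examples below), the default prediction after logit is the probability, whereas predict(xb) makes eyex() report the elasticity of the linear predictor instead:

  . sysuse auto, clear
  . logit foreign weight
  . margins, eyex(weight)                 // elasticity of the default prediction (the probability)
  . margins, eyex(weight) predict(xb)     // elasticity of the linear predictor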

The elasticity d(log f)/d(log x) can be calculated easily from the marginal effect df/dx by using the chain rule. This gives the formula

  d(log f)       d(log f)    dx_i
  ---------  =  ---------- * ----
  d(log x_i)       dx_i      d(log x_i) 

Because d(log x_i)/dx_i = 1/x_i, we have

  d(log f)       d(log f)     x_i            d(log f)     df       x_i     df
  ---------  =  ---------- * ----  =  x_i * ---------  * -----  = ----- * ----
  d(log x_i)       dx_i        1               df         dx_i      f     dx_i

where x_i is the ith independent variable in the regression. By default, margins evaluates this expression for each observation and reports the average of the elasticities. We can use the atmeans option to evaluate it at the means of the independent variables or the at() option to evaluate it at specific values of the independent variables. If the predicted value is negative, the elasticity cannot be computed because we cannot take the log of a negative number.
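
For instance, after fitting the regression used below, the elasticity at particular covariate values could be requested with something like the following (a sketch only; the values 3000 and 190 are arbitrary, and output is omitted):

  . margins, eyex(weight) at(weight=3000 length=190)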

We can verify that the above formula works. In this example, we will calculate the elasticities at the means of the independent variables.

. sysuse auto, clear
(1978 automobile data)

. regress mpg weight length

      Source          SS        df         MS       Number of obs =      74
                                                     F(2, 71)      =   69.34
       Model    1616.08062        2   808.040312    Prob > F      =  0.0000
    Residual    827.378835       71    11.653223    R-squared     =  0.6614
                                                     Adj R-squared =  0.6519
       Total    2443.45946       73   33.4720474    Root MSE      =  3.4137

         mpg   Coefficient   Std. err.      t    P>|t|   [95% conf. interval]
      weight    -.0038515     .001586    -2.43   0.018   -.0070138   -.0006891
      length    -.0795935    .0553577    -1.44   0.155   -.1899736    .0307867
       _cons     47.88487     6.08787     7.87   0.000      35.746    60.02374
. summarize weight

    Variable         Obs        Mean    Std. dev.       Min        Max
      weight          74    3019.459    777.1936       1760       4840

. local meanwei=r(mean)

. summarize length

    Variable         Obs        Mean    Std. dev.       Min        Max
      length          74    187.9324    22.26634        142        233

. local meanlen=r(mean)

. local f=`meanwei'*_b[weight]+`meanlen'*_b[length]+_b[_cons]

. display "weight: eyex = " (`meanwei'*_b[weight])/`f'
weight: eyex = -.54604966

. display "length: eyex = " (`meanlen'*_b[length])/`f'
length: eyex = -.70235175

. margins, eyex(weight length) atmeans nose

Conditional marginal effects                               Number of obs = 74

Expression: Linear prediction, predict()
ey/ex wrt:  weight length
At:         weight = 3019.459 (mean)
            length = 187.9324 (mean)

                   ey/ex
      weight   -.5460497
      length   -.7023518
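
By default, that is, without atmeans, margins instead averages the observation-level elasticities. As a quick check by hand, here is a sketch that assumes the regression above is still in memory and uses the hypothetical variable names fhat and eyex_wei (output omitted); the mean of eyex_wei should match the ey/ex that margins reports without atmeans:

  . predict double fhat, xb                               // linear prediction f for each observation
  . generate double eyex_wei = weight*_b[weight]/fhat     // observation-level elasticity (x/f)*(df/dx)
  . summarize eyex_wei                                    // its mean is the average elasticity
  . margins, eyex(weight) nose                            // margins' default reports the same average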

We can plot the elasticities as functions of the independent variables using margins with the at() option followed by marginsplot. In the following example, there are two independent variables, and we plot the elasticity of each independent variable at the mean of the other variable.

. sysuse auto, clear
(1978 automobile data)

. regress mpg weight length

      Source          SS        df         MS       Number of obs =      74
                                                     F(2, 71)      =   69.34
       Model    1616.08062        2   808.040312    Prob > F      =  0.0000
    Residual    827.378835       71    11.653223    R-squared     =  0.6614
                                                     Adj R-squared =  0.6519
       Total    2443.45946       73   33.4720474    Root MSE      =  3.4137

         mpg   Coefficient   Std. err.      t    P>|t|   [95% conf. interval]
      weight    -.0038515     .001586    -2.43   0.018   -.0070138   -.0006891
      length    -.0795935    .0553577    -1.44   0.155   -.1899736    .0307867
       _cons     47.88487     6.08787     7.87   0.000      35.746    60.02374
. margins, eyex(weight) at(weight = (1750(250)5000) (mean) length) noatlegend

Conditional marginal effects                               Number of obs = 74
Model VCE: OLS

Expression: Linear prediction, predict()
ey/ex wrt:  weight

                           Delta-method
                ey/ex       std. err.      t    P>|t|    [95% conf. interval]
weight
  _at
1 -.2573869 .0862872 -2.98 0.004 -.4294388 -.085335
2 -.3053854 .1062865 -2.87 0.005 -.5173147 -.0934561
3 -.3571938 .1292519 -2.76 0.007 -.6149147 -.0994728
4 -.4132845 .1557291 -2.65 0.010 -.7237996 -.1027694
5 -.4742113 .1863901 -2.54 0.013 -.8458627 -.10256
6 -.540628 .2220688 -2.43 0.017 -.9834206 -.0978354
7 -.6133115 .2638095 -2.32 0.023 -1.139333 -.0872902
8 -.6931926 .312933 -2.22 0.030 -1.317163 -.0692218
9 -.7813962 .3711274 -2.11 0.039 -1.521403 -.0413892
10 -.8792944 .4405755 -2.00 0.050 -1.757777 -.0008119
11 -.9885785 .5241373 -1.89 0.063 -2.033679 .0565216
12 -1.111358 .6256144 -1.78 0.080 -2.358797 .1360822
13 -1.250295 .7501418 -1.67 0.100 -2.746036 .2454448
14 -1.408807 .9047836 -1.56 0.124 -3.212894 .3952802
. marginsplot, noci xlabel(2000(1000)5000) saving(weight, replace) nodraw

Variables that uniquely identify margins: weight
(file weight.gph not found)
file weight.gph saved

. margins, eyex(length) at(length = (140(10)240) (mean) weight) noatlegend

Conditional marginal effects                               Number of obs = 74
Model VCE: OLS

Expression: Linear prediction, predict()
ey/ex wrt:  length

                           Delta-method
                ey/ex       std. err.      t    P>|t|    [95% conf. interval]
length
  _at
1 -.4437283 .2618243 -1.69 0.095 -.9657911 .0783345
2 -.4909849 .2991911 -1.64 0.105 -1.087555 .1055853
3 -.5414398 .3411019 -1.59 0.117 -1.221578 .138698
4 -.5954291 .3882531 -1.53 0.130 -1.369584 .1787255
5 -.6533377 .4414755 -1.48 0.143 -1.533615 .2269394
6 -.7156083 .5017656 -1.43 0.158 -1.7161 .2848838
7 -.7827532 .5703263 -1.37 0.174 -1.919952 .3544452
8 -.855368 .64862 -1.32 0.191 -2.14868 .4379437
9 -.9341494 .7384374 -1.27 0.210 -2.406552 .5382529
10 -1.019918 .8419885 -1.21 0.230 -2.698795 .6589597
11 -1.113646 .9620256 -1.16 0.251 -3.03187 .8045788
. marginsplot, noci xlabel(140(20)240) saving(length, replace) nodraw

Variables that uniquely identify margins: length
(file length.gph not found)
file length.gph saved

. graph combine weight.gph length.gph, ycommon

How do the elasticities computed by margins relate to the coefficients of the loglinear model?

The term elasticity has also been used to describe the coefficient of the model

  ln(y) = b0 + b1*ln(x)

In this model, the elasticity d(ln(y))/d(ln(x)) is simply b1, the same at every value of x, so it is called a constant elasticity model. When we instead fit

  y = c0 + c1*x

and compute d(ln(f))/d(ln(x)), where f is the linear predictor, the result is a function of x, namely c1*x/(c0 + c1*x). We can evaluate this function at any value of x we please. This is a varying elasticity model.

In the following example, we compute the varying elasticity using margins, but rather than computing it at just one point, the mean of the independent variable, we compute it at many values of the independent variable. We also plot it to get a good feel for the elasticity as a function of the independent variable.

We also fit the loglinear model and plot its coefficient on the same graph. Finally, we mark the mean of the independent variable and the value of the varying elasticity at that point.

. sysuse auto, clear
(1978 automobile data)

. keep mpg weight

. sum weight

    Variable         Obs        Mean    Std. dev.       Min        Max
      weight          74    3019.459    777.1936       1760       4840

. local mean=r(mean)

. * ---constant elasticity model --------------------
. gen lnmpg=ln(mpg)

. gen lnwei=ln(weight)

. regress lnmpg lnwei
      Source          SS        df         MS       Number of obs =      74
                                                     F(1, 72)      =  179.41
       Model    3.52612925        1   3.52612925    Prob > F      =  0.0000
    Residual     1.4150941       72   .019654085    R-squared     =  0.7136
                                                     Adj R-squared =  0.7096
       Total    4.94122335       73   .067687991    Root MSE      =  .14019

       lnmpg   Coefficient   Std. err.      t    P>|t|   [95% conf. interval]
       lnwei    -.8251737     .061606   -13.39   0.000   -.9479829   -.7023645
       _cons     9.608391    .4918087    19.54   0.000    8.627989    10.58879
. gen marg_cons=_b[lnwei]

. * ---varying elasticity model----------------------
. regress mpg weight
      Source          SS        df         MS       Number of obs =      74
                                                     F(1, 72)      =  134.62
       Model     1591.9902        1    1591.9902    Prob > F      =  0.0000
    Residual    851.469256       72   11.8259619    R-squared     =  0.6515
                                                     Adj R-squared =  0.6467
       Total    2443.45946       73   33.4720474    Root MSE      =  3.4389

         mpg   Coefficient   Std. err.      t    P>|t|   [95% conf. interval]
      weight    -.0060087    .0005179   -11.60   0.000   -.0070411   -.0049763
       _cons     39.44028    1.614003    24.44   0.000    36.22283    42.65774
. * ----elasticity at the mean-----------------------
. margins, eyex(weight) atmeans

Conditional marginal effects                               Number of obs = 74
Model VCE: OLS

Expression: Linear prediction, predict()
ey/ex wrt:  weight
At:         weight = 3019.459 (mean)

                           Delta-method
                ey/ex       std. err.      t    P>|t|    [95% conf. interval]
      weight   -.8518915    .0751441   -11.34   0.000    -1.001689   -.7020944
. matrix A=r(b)

. gen marg_mean = A[1,1]

. * ----elasticity at different values of weight ----
. margins, eyex(weight) at(weight = (1750(250)5000)) noatlegend

Conditional marginal effects                               Number of obs = 74
Model VCE: OLS

Expression: Linear prediction, predict()
ey/ex wrt:  weight

                           Delta-method
                ey/ex       std. err.      t    P>|t|    [95% conf. interval]
weight
  _at
1 -.3635323 .0236104 -15.40 0.000 -.4105988 -.3164658
2 -.4382239 .0300205 -14.60 0.000 -.4980686 -.3783791
3 -.5215725 .0378009 -13.80 0.000 -.5969273 -.4462178
4 -.615176 .0473276 -13.00 0.000 -.7095219 -.5208302
5 -.721051 .0591092 -12.20 0.000 -.8388829 -.603219
6 -.8417798 .0738467 -11.40 0.000 -.9889906 -.694569
7 -.9807243 .0925265 -10.60 0.000 -1.165172 -.7962761
8 -1.142343 .1165684 -9.80 0.000 -1.374718 -.9099685
9 -1.332681 .1480732 -9.00 0.000 -1.627859 -1.037502
10 -1.560137 .1902484 -8.20 0.000 -1.93939 -1.180884
11 -1.836744 .2481783 -7.40 0.000 -2.331478 -1.342009
12 -2.180362 .3302937 -6.60 0.000 -2.838791 -1.521934
13 -2.6187 .4513707 -5.80 0.000 -3.518491 -1.718909
14 -3.197182 .6391756 -5.00 0.000 -4.471355 -1.923008
. marginsplot, noci xlabel(2000(1000)5000) addplot(line marg_cons wei /*
> */ || line marg_mean wei, xline(`mean') /*
> */ legend(order(1 "varying" 2 "constant" 3 "mean") rows(3)))

Variables that uniquely identify margins: weight