Statalist The Stata Listserver


RE: Re: st: Total Least Squares Regression, anyone

From   Joseph Coveney <>
To   Statalist <>
Subject   RE: Re: st: Total Least Squares Regression, anyone
Date   Tue, 07 Mar 2006 02:12:35 +0900

Thomas J. Steichen wrote:

As a first thought, eivreg seems a reasonable approach.  However,
the Stata implementation with a single X variable is identical to
least-squares regression, a la regress, as it forces reliability
to be 1 with a single X.

An alternative is the reduced major-axis (RMA) equation provided by
the user-written program -concord-.  -concord- also provides some of the
other statistics mentioned.
A second alternative is to solve for the TLS regression coefficients
by minimizing sum((y - (a + b*x))^2 / (1 + b^2)) over the data.
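For equal error variances, this minimization has a closed-form solution: the standard orthogonal-regression (equal-variance Deming) slope. A minimal sketch in Python rather than Stata (the function name is mine, not from any package):

```python
import math

def deming_fit(x, y):
    """Slope and intercept minimizing sum((y - (a + b*x))^2 / (1 + b^2)).

    Uses the closed-form orthogonal-regression slope; assumes the x and y
    error variances are equal and that sxy is nonzero.
    """
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    # Positive root of the quadratic that sets the derivative of the
    # objective (profiled over the intercept) to zero.
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    a = ybar - b * xbar
    return a, b
```

On exact straight-line data (e.g. y = 1 + 2x) this recovers the line itself, since the orthogonal residuals are all zero.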

I'm not sure what is meant by "standard deviation of the regression,"
but I suspect it is just rmse = sqrt(sum((y - yhat)^2)/(n - 2)), which
can be computed once you have the slope and intercept.
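For concreteness, a small Python sketch of that rmse computation (the n - 2 denominator reflects the two estimated parameters, slope and intercept):

```python
import math

def regression_rmse(y, yhat):
    """Root mean squared error with an n - 2 denominator, since the
    slope and intercept each use up one degree of freedom."""
    resid_ss = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    return math.sqrt(resid_ss / (len(y) - 2))
```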


I didn't think that -eivreg- forced the reliability to be one in any
circumstances, even with a single predictor.  It defaults to one if you
don't specify otherwise, but even with one predictor, it allows you to
specify a reliability less than one (but at least a smidgeon greater than
the coefficient of determination of the predictor regressed on the
response).
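The role of the reliability can be illustrated with the classical attenuation correction for a single error-prone predictor: the corrected slope is the OLS slope divided by the reliability. A hypothetical Python sketch (function names are mine; this illustrates the point estimate only, not the standard-error adjustment -eivreg- also performs):

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return sxy / sxx

def disattenuated_slope(x, y, reliability):
    """Classical attenuation correction: OLS slope / reliability.

    With reliability = 1 this reduces to plain OLS, which is why the
    default single-predictor fit looks identical to -regress-.
    """
    return ols_slope(x, y) / reliability
```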

I've seen the second alternative described as Deming regression.
-amoeba- could be used for this.  You should even be able to use -nl-
with iterative re-weighting to minimize the expression.
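The -amoeba- suggestion amounts to derivative-free minimization of the Deming objective. As a stand-in sketch in Python, one can profile out the intercept (for fixed b, the optimal a is ybar - b*xbar) and run a one-dimensional golden-section search over the slope; this assumes the objective is unimodal over the bracketed slope range:

```python
import math

def deming_objective(x, y, a, b):
    """The Deming criterion: sum of squared residuals over 1 + b^2."""
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / (1 + b * b)

def deming_numeric(x, y, lo=0.0, hi=10.0, iters=200):
    """Minimize the Deming objective by golden-section search over b.

    For fixed b the optimal intercept is a = ybar - b*xbar, so the
    problem reduces to one dimension.  Assumes the minimizing slope
    lies in [lo, hi] and the profiled objective is unimodal there.
    """
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    phi = (math.sqrt(5) - 1) / 2
    f = lambda b: deming_objective(x, y, ybar - b * xbar, b)
    c = hi - phi * (hi - lo)
    d = lo + phi * (hi - lo)
    for _ in range(iters):
        if f(c) < f(d):
            hi, d = d, c
            c = hi - phi * (hi - lo)
        else:
            lo, c = c, d
            d = lo + phi * (hi - lo)
    b = (lo + hi) / 2
    return ybar - b * xbar, b
```

On exact straight-line data the numerical search agrees with the closed-form orthogonal-regression solution.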

The "standard deviation" I believe is the sigma_k+1 on Page 10 of

You can find a Fortran 90 implementation of Van Huffel's total least
squares algorithm at but writing one from scratch in Mata should be more
straightforward than translating the Fortran.

Joseph Coveney
