Statalist The Stata Listserver


Re: Re: st: least products regression

From   n j cox <>
Subject   Re: Re: st: least products regression
Date   Tue, 22 Aug 2006 20:41:36 +0100

The command -concord- by Thomas Steichen and myself
includes a fitting of this line, which is known by
many, many different names. We do use the term
"reduced major axis", but that is evidently not
among the keywords used to code the command.

So far, the publication history of this program
in the STB and SJ includes 8 episodes: some of them are
just small fixes or flourishes, but others are more substantial.

st0015_3 from
st0015_2 from
st0015_1 from
st0015 from
sg84_3 from
sg84_2 from
sg84_1 from
sg84 from

I remain unclear about which parameters Richard wants
SEs and CIs for.

My rather predictable bias, here and elsewhere, is
that you learn more from graphics than from treating
the problem as inferential. See also

Graphing agreement and disagreement. Stata Journal 4:
329--349 (2004)


P.S. for "Altmann" read "Altman".

Richard Hiscock wrote:

I am exploring using ordinary least products regression (or geometric mean
regression) rather than the Bland-Altmann difference method to assess
disagreement between two methods of measurement (continuous outcome) when
neither is a gold standard.
I have searched under cross product, OLproduct, and geometric regression with no success,
in particular for automated methods to obtain SEs and hence CIs.
Any guidance would be appreciated.

Joseph Coveney replied:

googling {stata "reduced major axis regression" OR "ordinary least products"
OR "geometric mean regression" OR "standardised principal component
regression"} turns up an objective function on SAS's site for ordinary least
products regression under the alias of reduced major axis regression.

Stata doesn't seem to have anything canned for what you want, but using an
objective function from SAS's website, you can readily whip up something
using -amoeba-, for example, and use it in conjunction with -jackknife-
or -bootstrap- to get your error estimates. Try something like what I show
below, following the Web sources.

I recall reading somewhere (it might have been one of Raymond Carroll's
lecture slide shows available on the Web, but don't quote me on this) that
these approaches (Deming regression, reduced major axis regression) to
errors-in-variables regression are shaky for everything but the most
restrictive assumptions about the ratio of error variances in the x and y
variables. Both the official -eivreg- and Yulia Marchenko's -deming- want
the user to do some thinking about or independent investigation of this
assumption and to make it explicit. Because least products regression makes
a particular assumption about this ratio anyway, you'd probably be better
off using one of these commands in conjunction with one of the resampling
techniques.

I recall (again maybe from Professor Carroll's slides) that the approach
also makes a strong assumption about the model being correctly specified,
i.e., linear functional relationship, no missing variables, and so on.

