Re: st: CI
Maarten Buis <email@example.com>
Mon, 19 Dec 2011 10:25:57 +0100
On Mon, Dec 19, 2011 at 7:37 AM, Newton Chagoma Chagoma wrote:
> I ran a Cox regression, and one of the variables, "kps", has a confidence
> interval ranging from 0 to a dot (.), e.g.:
>
> Risk factor   HR     p-value   95% CI
> kps           1.49   0.999     0, .
>
> What do I have to say about it? Or rather, how do I explain and report it?
The dot stands in this case for +infinity, so it means that your
estimate is completely meaningless: A confidence interval of 0 to
infinity for a ratio is the same as a confidence interval from
-infinity to +infinity for a difference. It is true that the ratio
lies somewhere between 0 and +infinity, but we did not need to do a
statistical analysis to find that out.
In practice it means that there is a serious problem with your model
that you must fix. You told us exactly nothing about your data and
model, so it is hard to diagnose. I can think of two possibilities:
Things like these can happen when you have covariates on wildly
different scales. I would look at all variables in your model and
consider their scale: e.g. don't add annual income in euros but in 1000s
of euros. Also consider the origin: don't add year of birth, but year
of birth - 1950. Changing the scale (in the example above, divide by
1000) or the origin (in the example above, subtract 1950), or a
combination of the two (for example: year of birth in decades since
1950), are linear transformations, so they do not change the model.
For example, saying that someone is born 27 years after 1950 is exactly
the same as saying someone is born 1977 years after 0, and saying
someone earns .5 of 1000 euros more is the same as saying that someone
earns 500 euros more. However, computers work better if the numbers
don't get too large or too small.
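As a hypothetical sketch of this point (in Python, with an ordinary linear model standing in for the Cox model, and made-up data), you can check that shifting the origin of a covariate is a linear transformation: the slope is unchanged, only the intercept moves, while the design matrix becomes much better conditioned numerically:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: outcome depends linearly on year of birth.
year = rng.integers(1950, 2000, size=200).astype(float)
y = 0.3 * (year - 1950) + rng.normal(0, 1, size=200)

def fit(x, y):
    """Least-squares fit of y on an intercept and x; returns (intercept, slope)."""
    X = np.column_stack([np.ones_like(x), x])
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs

b_raw = fit(year, y)            # raw year of birth (values around 1975)
b_centered = fit(year - 1950, y)  # year of birth minus 1950 (values 0-49)

# Same model: identical slope, intercept shifted by 1950 * slope.
print(b_raw[1], b_centered[1])

# The centered design matrix is far better conditioned, which is
# exactly why the computer handles it more gracefully.
cond_raw = np.linalg.cond(np.column_stack([np.ones_like(year), year]))
cond_centered = np.linalg.cond(np.column_stack([np.ones_like(year), year - 1950]))
print(cond_raw, cond_centered)
```

In Stata itself you would simply `generate` the rescaled variable (or use a transformed copy of it) before fitting; the fitted model is the same either way.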
Another possible cause is multicollinearity. This would mean that you
need to drop variables or impose other constraints. For example, say
you entered the education of the father and the mother. These tend to
be highly correlated. If the correlation is too high, you can drop one
of the parents, which is often done in my sub-discipline.
Alternatively, you can keep both parents but constrain the effects to
be the same. Which solution makes the most sense depends on the problem.
Hope this helps,
Maarten L. Buis
Institut fuer Soziologie