Re: st: importance weights
Importance weights do whatever you want them to do. For example,
specify an iweight of less than 1, say 1/10, and on the auto dataset
(74 observations) regress will report only 7 observations. Thus, they
can also do the opposite of what you describe.
. help weights
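To see the mechanics, here is a sketch in Python/NumPy rather than Stata (the data are simulated stand-ins for auto.dta, and the assumption is that with iweights the reported N is the truncated sum of the weights, while the coefficient estimates are unaffected by rescaling all weights):

```python
import numpy as np

# Simulated stand-in for the 74-observation auto dataset.
rng = np.random.default_rng(0)
n = 74
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)

def wls(y, x, w):
    """Weighted least squares with importance-style weights: each
    observation's contribution to the sums is multiplied by its weight."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    # Assumed convention: reported N is the truncated sum of the weights.
    n_reported = int(np.sum(w))
    return beta, n_reported

b1, n1 = wls(y, x, np.ones(n))       # unit weights: N = 74
b2, n2 = wls(y, x, np.full(n, 0.1))  # iw = 1/10:    N = 7
print(n1, n2)                        # 74 7
# Rescaling every weight by the same constant leaves the estimates unchanged.
assert np.allclose(b1, b2)
```

The point is that the weight enters only as a multiplicative factor, so the "number of observations" it implies is whatever the weights sum to.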
I recommend staying away from importance weights in most cases,
but they are very handy in the rare cases when you, as a programmer,
need them. For example, in my confidence ellipse program (. findit
ellip) I implemented Bartels' "fractional pooling of disparate
observations" in the pool(#) option, using fractional importance
weights to downweight "problematic" observations, e.g. problematic in a
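A minimal sketch of the downweighting idea, again in Python/NumPy (the data, the 0.2 fractional weight, and the helper name are made up for illustration; this is not Bartels' actual procedure):

```python
import numpy as np

# Simulated data with three "disparate" high-x observations.
rng = np.random.default_rng(1)
x = np.sort(rng.normal(size=30))
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=30)
y[-3:] += 10.0  # contaminate the three largest-x observations

def wls_slope(y, x, w):
    """Slope from weighted least squares; importance weights simply
    multiply each observation's contribution to the cross-products."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))[1]

w = np.ones_like(x)
w[-3:] = 0.2  # fractional importance weight on the suspect observations

slope_raw = wls_slope(y, x, np.ones_like(x))
slope_pooled = wls_slope(y, x, w)
# Downweighting pulls the estimate back toward the true slope of 2.0.
print(slope_raw, slope_pooled)
```

Fractional weights between 0 and 1 thus let you keep suspect observations in the estimation while limiting their influence, instead of dropping them outright.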
Why do you want to use importance weights?
"B. Burcin Yurtoglu" <burcin.yurtoglu@u...> wrote:
> The use of "importance" weights seems to increase the number of
> observations in the regress command.
> reg y x
> produces an output say with 8 observations
> reg y x [iw=1/weight]
> produces an output with 17 observations
> Does anybody have an explanation or suggestions for reading?