st: Standard error of mean under auto-correlation
Does anyone know of a formula to compute the standard error of a mean when
the random variable is serially correlated?
Right now I don't want to run the
command, because missing observations in my data cause it to take too long.
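For reference, one standard formula is the long-run-variance version of the usual SE: Var(x̄) = (1/n)[γ₀ + 2·Σₖ wₖ γₖ], where γₖ are the sample autocovariances and wₖ are Bartlett (Newey–West) weights that keep the estimate non-negative. Here is a minimal sketch in Python/NumPy; the function name, the lag-truncation rule of thumb, and the use of Bartlett weights are my own illustrative choices, not something from the original post:

```python
import numpy as np

def autocorr_se_of_mean(x, max_lag=None):
    """SE of the sample mean for a serially correlated series, using a
    Bartlett-weighted sum of sample autocovariances (Newey-West style).
    With max_lag=0 this reduces to the usual iid formula sqrt(gamma0/n)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if max_lag is None:
        # common rule-of-thumb truncation lag (an assumption, not the only choice)
        max_lag = int(np.floor(4.0 * (n / 100.0) ** (2.0 / 9.0)))
    xc = x - x.mean()
    # sample autocovariances: gamma_k = (1/n) * sum_t xc[t] * xc[t+k]
    gamma = np.array([np.dot(xc[: n - k], xc[k:]) / n
                      for k in range(max_lag + 1)])
    # Bartlett weights downweight higher lags so the variance stays >= 0
    weights = 1.0 - np.arange(1, max_lag + 1) / (max_lag + 1.0)
    long_run_var = gamma[0] + 2.0 * np.sum(weights * gamma[1:])
    return np.sqrt(long_run_var / n)
```

With positive autocorrelation the adjusted SE comes out larger than the naive s/√n, which is the usual reason the iid formula understates uncertainty for serially correlated data.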