


Re: st: Regression analysis with a minimum function on the RHS

From   "Sebastian van Baal" <>
To   <>
Subject   Re: st: Regression analysis with a minimum function on the RHS
Date   Sat, 6 Mar 2010 00:18:25 +0100

Thank you for your suggestions! 
> On Fri, Mar 5, 2010 at 2:14 PM, Austin Nichols <> wrote:
> This seems likely to be problematic no matter what you do--typically
> the objective function should be differentiable in the parameters in
> these kinds of problems.  What is the theory that drives this
> specification?  Is there an alternative parameterization that is
> differentiable?

I admit it is a special hypothesis. My model is based on psychological and
microeconomic theory -- a mixture that creates all sorts of problems but is
also very interesting (to me). An alternative parameterization could be the
following: My original problem 

y = {b0} + min({b1}*x1 , {b2}*x2) 

is formally equal to 

y = {b0} + 0.5*[{b1}*x1 + {b2}*x2 - abs({b1}*x1 - {b2}*x2)]. 

Do you think the second parameterization is better suited for estimation
with -nl-?
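The identity behind the reparameterization is min(a,b) = 0.5*(a + b - |a - b|), so both forms define the same regression function; neither removes the kink, they only move it around. As a rough illustration (in Python rather than Stata's -nl-, with simulated data and made-up parameter values), nonlinear least squares can still recover the parameters when both branches of the min are active in the sample:

```python
# Sketch: fit y = b0 + min(b1*x1, b2*x2) by nonlinear least squares,
# written in the equivalent 0.5*(a + b - |a - b|) form.
# Simulated data; "true" values (2.0, 1.5, 0.8) are purely illustrative.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n = 200
x1 = rng.uniform(1, 5, n)
x2 = rng.uniform(1, 5, n)
y = 2.0 + np.minimum(1.5 * x1, 0.8 * x2) + rng.normal(0, 0.1, n)

def resid(theta):
    c0, c1, c2 = theta
    a, b = c1 * x1, c2 * x2
    # min(a, b) == 0.5*(a + b - abs(a - b)), identical to the min form
    return y - (c0 + 0.5 * (a + b - np.abs(a - b)))

fit = least_squares(resid, x0=[1.0, 1.0, 1.0])
print(fit.x)  # should be close to the simulated (2.0, 1.5, 0.8)
```

Because the objective is only piecewise differentiable in the parameters, the finite-difference gradient can misbehave exactly at the kink, which echoes the concern raised above; in practice the fit tends to work when the data keep both branches in play.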

> Steve:
> How about this approach?
> 1. run -sureg- to fit the regressions separately on x1 and x2.  Apply
> -constraint- first to get equal intercepts.
> 2. Use b0 + b1*x1  where   b1*x1 < b2*x2;  otherwise use b0 + b2*x2.
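A rough Python analogue of this two-step idea (an illustration only: plain stacked OLS stands in for -sureg- with a -constraint-, and the data are simulated) is to fit the two linear equations with a shared intercept, then predict with whichever branch is smaller:

```python
# Step 1: fit y = b0 + b1*x1 and y = b0 + b2*x2 jointly, constraining
# the intercept b0 to be equal across equations by stacking them.
# Step 2: predict with b0 + b1*x1 where b1*x1 < b2*x2, else b0 + b2*x2.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.uniform(1, 5, n)
x2 = rng.uniform(1, 5, n)
y = 2.0 + np.minimum(1.5 * x1, 0.8 * x2) + rng.normal(0, 0.1, n)

Y = np.concatenate([y, y])
X = np.block([
    [np.ones((n, 1)), x1[:, None], np.zeros((n, 1))],
    [np.ones((n, 1)), np.zeros((n, 1)), x2[:, None]],
])
b0_hat, b1_hat, b2_hat = np.linalg.lstsq(X, Y, rcond=None)[0]

y_hat = b0_hat + np.minimum(b1_hat * x1, b2_hat * x2)
```

Note that each single equation is misspecified (y depends on the min of both terms), so the slope estimates are biased; this step is best read as a way to get starting values or a sanity check rather than final estimates.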

This seems to be a good approach I hadn't thought about. The results look
promising, but I will have to consult the literature before I decide on an
approach.
Thanks again 

