Tuesday, September 22, 2009

Curve fitting

Very often in experimental physics, we're interested in comparing some data to a physical model that may involve a number of unknown parameters, and we want to find the set of parameters that gives the best fit.  Typically "best fit" means minimizing a "cost" function, often the sum of the squares of the deviations between the model and the data.  The challenge is that many models are complicated, with nonlinear dependence on the parameters.  This often makes finding the optimal parameters difficult - the cost function in parameter space can have lots of shallow, local minima, for example.  The cost function may also be extremely sensitive to some parameters (the "stiff" ones) and comparatively insensitive to others (the "sloppy" ones).  In arXiv:0909.3884, James Sethna and Cornell colleagues take a look at this dilemma using the tools of differential geometry, and they propose an improvement to standard techniques based on geodesics on the relevant hypersurface in parameter space.  This looks really cool (if mathematical!), and I wish they'd included an example of an actual minimization problem that they'd done with this (instead of leaving it for an "in preparation" reference).  Any prospect for real improvements in nonlinear fitting is exciting.
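(For concreteness, here is a minimal sketch of this kind of fit using a completely standard routine - scipy's least_squares with the Levenberg-Marquardt method, not the geodesic approach from the paper. The two-exponential model, the fake data, and the starting guess are all invented for illustration.)

```python
# A standard nonlinear least-squares fit of a toy two-exponential model.
# This is the conventional approach (Levenberg-Marquardt via scipy), NOT
# the geodesic method from arXiv:0909.3884; everything here is invented
# purely for illustration.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0, 5, 100)

def model(params, t):
    a1, k1, a2, k2 = params
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

# Fake "data": the model at known parameters plus a little noise.
rng = np.random.default_rng(0)
true_params = [1.0, 0.5, 0.8, 3.0]
data = model(true_params, t) + 0.01 * rng.normal(size=t.size)

def residuals(params):
    # The "cost" being minimized is (half) the sum of squares of these deviations.
    return model(params, t) - data

fit = least_squares(residuals, x0=[1.0, 1.0, 1.0, 1.0], method="lm")
print("best-fit parameters:", fit.x)
print("cost:", 0.5 * np.sum(fit.fun ** 2))
```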

8 comments:

Don Monroe said...

Any speed-up in nonlinear fitting is good, of course. But to my mind the particular procedure is less interesting than the almost-philosophical questions that these complex fitting problems pose.

In systems biology, for example, Sethna and his colleagues have shown, as have others, that most of the parameters of these very complex models will never be constrained by real data. Efficiently fitting these "sloppy" parameters to the noise in real data is a fool's errand.

What then is the goal of fitting?

First, one must abandon the idea that one is "reverse engineering" the real system. Some of the "true" parameters will be unknowable. Don't confuse the map with the territory.

The fit will generally identify composite variables that are tightly constrained by the data, perhaps confining the fits to a long, narrow, but flat canyon in parameter space. But if these variables have no obvious meaning in the underlying model, the exercise seems conceptually pointless.
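(To make the "canyon" picture concrete, here is a rough sketch - purely illustrative, not from the paper - using an invented toy model with two nearly degenerate decay rates. The eigenvalues of the approximate Hessian J^T J spread over many decades; the small-eigenvalue eigenvectors are the flat, sloppy directions along the canyon floor.)

```python
# Illustrative only: the "canyon" shows up in the eigenvalues of the
# approximate Hessian J^T J at a good fit. The toy model and parameter
# values are invented; two nearly equal decay rates make the model sloppy.
import numpy as np

t = np.linspace(0, 5, 100)
params = np.array([1.0, 0.9, 1.0, 1.1])   # amplitudes and two nearly degenerate rates

def model(p):
    return p[0] * np.exp(-p[1] * t) + p[2] * np.exp(-p[3] * t)

# Central-difference Jacobian of the model with respect to the parameters.
eps = 1e-6
J = np.empty((t.size, params.size))
for j in range(params.size):
    dp = np.zeros_like(params)
    dp[j] = eps
    J[:, j] = (model(params + dp) - model(params - dp)) / (2 * eps)

evals, evecs = np.linalg.eigh(J.T @ J)
print("Hessian eigenvalues (stiff to sloppy):", np.sort(evals)[::-1])
# The eigenvector belonging to the smallest eigenvalue is the composite
# combination of parameters that the data barely constrain - the long,
# flat axis of the canyon.
```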

For these reasons, many researchers, at least in systems biology, think that fits should only be evaluated by comparing their predictions to measurements.

Any reasonable procedure will of course match the known observations, and will reject any model that can't do that reasonably well. But simply reiterating the data seems like a rather vacuous task.

Where models can really provide value is when they accurately predict the results of new experiments. More importantly, modelers need to be able to state clearly which of their predictions are highly constrained by the fit, and which predictions, such as those that are sensitive to undetermined parameters, are not trustworthy. This is key to any practical application of the "understanding."

Another use for models is to guide researchers to those experiments that will be most informative, meaning they will further constrain the parameter values (or the predictions).

It will be interesting to see if this geodesic approach advances these goals.

Douglas Natelson said...

Hi Don - I agree, of course. Models without predictive power are just models. I was thinking, in my own experience, of extracting fit parameters which may then be checked for consistency against other measurements or determinations. If there's no constraining the fit (that is, if a whole canyon full of model parameter sets do equally well fitting the data), then fitting gives little physical insight.

Uncle Al said...

If they have a brilliant new theory and no applied example, they have adopted the string theory business model so profitably exploited by SUSY, dark matter, solar axions, Yukawa potentials, 1/r^2 gravitation at small "r", etc.

Anonymous said...

"If they have brilliant new theory and no applied example" Xe-xe-xe! That's no surprise. Obviously, the idea is "le-gen-da-ry", so it needs a separate advertising (arXive paper). It would take a series of later papers to find a good example, which illustrates how legendary it is. Would not be surprised if it turns out to be a cold fusion of numerical methods

Douglas Natelson said...

Al, your comments are becoming increasingly bizarre.

Anon., I think you and Al are both being more than a little harsh. I understand that a Letter-type paper doesn't have space for a fully developed example of using this stuff. However, Sethna is a solid guy with a very good track record. I don't think this is hype or misleading - I just wanted to see the new ideas in action. No more, no less.

Anonymous said...

In a related study, some of the authors used the "sloppiness" of the fit to address uncertainties in model-predicted quantities.

Bayesian Error Estimation in Density-Functional Theory
Phys. Rev. Lett. 95, 216401 (2005)

Anonymous said...

Dear Doug,

I know Jim S. pretty well; I had an office on the same 5th floor as all the other theorists. He knows better than anyone that an example would be more than desirable - it's kinda essential as an illustration. The fact that one is manifestly absent from a paper by such a well-known guy tells us quite a lot about their idea. For a theorist (a physicist, I mean) it is a shame to handwave about a new thing without an example. Yeah, makes one think about other infamous stuff, sorry ;(

Jim said...

We're keenly aware that a new idea is different from a new, practical algorithm! We've been working hard on a good algorithm, and on a good set of test problems that show the advantages of different methods. Our current best method is a significant improvement over both the Numerical Recipes and the MINPACK formulations of the Levenberg-Marquardt algorithm for finding the best fit of sums of exponentials (our main test problem). We're refining it and testing it on two more realistic problems (a systems biology problem and a variational wavefunction for quantum Monte Carlo), and hope to have something tangible in the next few months.
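(For anyone who wants to see the test problem, here is a rough sketch using standard tools - scipy's stock Levenberg-Marquardt, nothing to do with the new method - showing how fits of a sum of two exponentials started from random guesses can stall away from the best fit. All parameter values are invented for illustration.)

```python
# Illustrative only: stock Levenberg-Marquardt on the classic sum-of-two-
# exponentials test problem, restarted from many random initial guesses.
# Model and parameter values are invented; this is not the authors' code.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0, 5, 50)

def model(p):
    # Clip the exponent so wandering trial parameters can't overflow.
    return (p[0] * np.exp(np.clip(-p[1] * t, -50, 50))
            + p[2] * np.exp(np.clip(-p[3] * t, -50, 50)))

data = model(np.array([1.0, 0.3, 1.0, 3.0]))   # noiseless fake "data"

rng = np.random.default_rng(1)
trials, successes = 200, 0
for _ in range(trials):
    start = rng.uniform(0.1, 5.0, size=4)      # random initial guess
    fit = least_squares(lambda p: model(p) - data, start, method="lm")
    if 0.5 * np.sum(fit.fun ** 2) < 1e-10:     # reached the true fit (or its label-swapped twin)
        successes += 1
print(f"{successes}/{trials} random starts reached the global best fit")
```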