Re: [ROOT] Fitting with stochastic optimizers

From: Eddy Offermann (eddy@rentec.com)
Date: Fri Mar 02 2001 - 18:14:58 MET


Hi Anton,

I understand that in your stochastic approach you can minimize any function,
just like Minuit, but that your algorithm will not get stuck in a local
extremum.
However, just like Minuit, it will not fulfill requirements 1 and 3. For
instance, suppose that a few points in your data set have x and/or y infinite:
will the fitted coefficients still make sense? How robust is the algorithm,
i.e. how many garbage points will bring it to its knees? For Minuit it only
takes one bad point.

Thinking about point 1): I have to do thousands of fits in approx. 1 minute,
where all that happens is that each time a few more points are added.
In general one can use the previous fit as the starting point of the new one,
but for a fit that is linear in its parameters much faster methods are
available; a sketch of what I mean is below.
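
For a straight line y = p0 + p1*x, for example, the normal equations are built
from a handful of running sums, so new points are absorbed in O(1) each and the
old points are never revisited. A minimal sketch (plain standalone C++,
hypothetical names, nothing to do with Minuit internals):

    // Incremental straight-line least squares: y = p0 + p1*x.
    // Adding points only updates the running sums; refitting never
    // touches the points that were already processed.
    #include <cstdio>

    struct LinearFit {
       double S = 0, Sx = 0, Sxx = 0, Sy = 0, Sxy = 0;   // running sums

       void Add(double x, double y) {                    // O(1) per new point
          S += 1;  Sx += x;  Sxx += x*x;  Sy += y;  Sxy += x*y;
       }
       // Solve the 2x2 normal equations for all points seen so far.
       void Solve(double &p0, double &p1) const {
          const double det = S*Sxx - Sx*Sx;
          p1 = (S*Sxy - Sx*Sy) / det;
          p0 = (Sy - p1*Sx) / S;
       }
    };

    int main() {
       LinearFit fit;
       for (int i = 0; i < 1000; ++i) fit.Add(i, 2.0 + 0.5*i);  // first N points
       double p0, p1;
       fit.Solve(p0, p1);                 // fit after N points
       fit.Add(1000, 502.0);              // a few more points arrive ...
       fit.Solve(p0, p1);                 // ... refit without reprocessing
       printf("p0 = %g  p1 = %g\n", p0, p1);
       return 0;
    }

The same idea carries over to any model that is linear in its parameters: keep
the accumulated Hessian (A^T A) and gradient (A^T y) and simply add the
contribution of each new point to them.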

Eddy

> From: "Anton Fokin" <anton.fokin@smartquant.com>
> To: "Eddy Offermann" <eddy@rentec.com>
> Cc: <roottalk@pcroot.cern.ch>
> Subject: Re: [ROOT] Fitting with stochastic optimizers 
> Date: Fri, 2 Mar 2001 17:17:24 +0100
> 
> Hi Eddy,
> 
> If we talk about stochastics, an optimizer minimizes your objective function
> without taking care of its shape. In the case of least squares it minimizes
> 
> Objective = Sum_i ( F(P1,..,PN; Xi) - Yi )^2
> 
> and reports the best set of parameters P1..PN.
> 
> In fact you can use any function to estimate errors and this will not affect
> the algorithm.
> 
> There is no Hessian matrix or other derivative calculation that these
> algorithms use directly. Instead they change the set of parameters in a smart
> way to explore the search space and find the best set.
> 
> Note that stochastic algorithms do not guarantee that you find the global
> minimum in finite time, but they are quite helpful if you want to find a good
> solution in reasonable time for a complex function with a number of
> constraints.
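> 
> To make this concrete, here is a minimal sketch (plain C++, hypothetical
> names, not the code behind the link below) of simulated annealing on such a
> least-squares objective: the parameters are perturbed at random with a step
> that shrinks with the temperature, and worse trial points are accepted with
> probability exp(-dE/T).
> 
> // Simulated annealing on Objective = Sum_i ( F(P; Xi) - Yi )^2 for the
> // model F(P; x) = P0*exp(-P1*x).  No gradients or Hessian are needed.
> #include <cmath>
> #include <cstdio>
> #include <cstdlib>
> 
> const int N = 100;
> double X[N], Y[N];                          // the data set
> 
> double Objective(const double *P) {
>    double sum = 0;
>    for (int i = 0; i < N; ++i) {
>       double r = P[0]*std::exp(-P[1]*X[i]) - Y[i];
>       sum += r*r;
>    }
>    return sum;
> }
> 
> int main() {
>    for (int i = 0; i < N; ++i) { X[i] = 0.1*i; Y[i] = 3.0*std::exp(-0.5*X[i]); }
>    double P[2]    = {1.0, 1.0};             // starting parameters
>    double best[2] = {P[0], P[1]};
>    double E = Objective(P), Ebest = E;
>    for (double T = 1.0; T > 1e-6; T *= 0.999) {       // cooling schedule
>       double trial[2];
>       for (int k = 0; k < 2; ++k)           // random step, scale ~ T
>          trial[k] = P[k] + T * (2.0*std::rand()/RAND_MAX - 1.0);
>       double dE = Objective(trial) - E;
>       if (dE < 0 || std::rand()/(double)RAND_MAX < std::exp(-dE/T)) {
>          P[0] = trial[0]; P[1] = trial[1]; E += dE;    // accept the move
>          if (E < Ebest) { Ebest = E; best[0] = P[0]; best[1] = P[1]; }
>       }
>    }
>    printf("best P0 = %g  P1 = %g  (objective %g)\n", best[0], best[1], Ebest);
>    return 0;
> }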
> 
> If you carefully optimize your objective on a set of N data points, you can
> spend much less time calibrating your model after adding M << N points,
> assuming that the new minimum lies near the old one.
> 
> Regards,
> Anton
> 
> PS. Could someone clever help me with the inverse probability function of the
> generalized (perhaps too generalized :) ) Cauchy distribution?
> 
> see the g(dx) function at
> 
> http://www.smartquant.com/htmldoc/TSimulatedAnnealing.html
> 
> generalized annealing parameter displacement formula ...
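> 
> For the ordinary (non-generalized) Cauchy distribution the inverse is closed
> form, so inverse-transform sampling is a one-liner; a sketch (hypothetical
> helper name, plain C++) is below. The generalized g(dx) from the page above
> needs its own inverse, which is exactly what I am asking about.
> 
> // Inverse-transform sampling for the ordinary Cauchy distribution with
> // location x0 and scale gamma.
> //   CDF:     F(x) = 1/2 + atan((x - x0)/gamma) / pi
> //   Inverse: x    = x0 + gamma * tan( pi*(u - 1/2) ),  u uniform in (0,1)
> #include <cmath>
> #include <cstdlib>
> 
> double CauchyDeviate(double x0, double gamma) {
>    const double kPi = 4.0 * std::atan(1.0);
>    double u = (std::rand() + 1.0) / (RAND_MAX + 2.0);  // keep u away from 0 and 1
>    return x0 + gamma * std::tan(kPi * (u - 0.5));
> }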
> 
> 
> http://www.smartquant.com
> 
> 
> ----- Original Message -----
> From: Eddy Offermann <eddy@rentec.com>
> To: <anton.fokin@smartquant.com>
> Cc: <roottalk@pcroot.cern.ch>
> Sent: Friday, March 02, 2001 4:22 PM
> Subject: Re: [ROOT] Fitting with stochastic optimizers
> 
> 
> > Hi Anton,
> >
> > If we start to tinker/rewrite Minuit, I would like to see the following
> > additions/improvements:
> >
> > 1) This refers to linear fits:
> >     When fitting N data points and getting M additional ones, I would like
> >     to ADD the information of the new ones to my Hessian matrix and
> >     gradient. I do NOT want to analyze all M+N data points again!
> >     This is important for on-line applications and non-cheating
> >     data analysis with time series. It is straightforward to implement
> >     for a least-squares objective function.
> >
> >  2) Be able to easily specify Bayesian priors for my parameters
> >
> >  3) Have the possibility of applying a robust algorithm, such as least
> >      median of squares, instead of least squares (see the sketch after
> >      this list)
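> >
> > To illustrate point 3 (a sketch only, with hypothetical names): least median
> > of squares replaces the sum of squared residuals by their median, which is
> > why it tolerates a large fraction of garbage points where plain least
> > squares is broken by a single one.
> >
> > // Least-median-of-squares objective for a model F(P; x): minimize the
> > // median of the squared residuals instead of their sum.
> > #include <algorithm>
> > #include <vector>
> >
> > double LmsObjective(const double *P,
> >                     const std::vector<double> &x,
> >                     const std::vector<double> &y,
> >                     double (*F)(const double *, double)) {
> >    std::vector<double> r2(x.size());
> >    for (size_t i = 0; i < x.size(); ++i) {
> >       double r = F(P, x[i]) - y[i];
> >       r2[i] = r*r;
> >    }
> >    std::nth_element(r2.begin(), r2.begin() + r2.size()/2, r2.end());
> >    return r2[r2.size()/2];               // the median squared residual
> > }
> >
> > Such an objective is non-smooth, so a derivative-free or stochastic
> > minimizer is the natural tool for it.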
> >
> > What do others wish/think ??
> >
> > Best regards Eddy
> >
> > (out of town for the next 2 weeks)
> >
> > > From: "Anton Fokin" <anton.fokin@smartquant.com>
> > > To: "roottalk" <roottalk@pcroot.cern.ch>
> > > Subject: [ROOT] Fitting with stochastic optimizers
> > > Date: Fri, 2 Mar 2001 11:42:49 +0100
> > >
> > > Hi rooters,
> > >
> > > I've just seen several postings about fitting. I am curious whether
> > > someone ever ran into trouble with ROOT's Minuit and would like to have
> > > stochastic optimization as an alternative. I can provide simulated
> > > annealing / genetic algorithms if enough people want it.
> > >
> > > Currently I have minimization functionality for TF1/TF2 user-defined
> > > functions on www.smartquant.com/neural.html. I can extend the package so
> > > that it could be used with TVirtualFitter.
> > >
> > > Regards,
> > > Anton
> > >
> > > http://www.smartquant.com
> > >
> > >
> > >
> >
> > Eddy A.J.M. Offermann
> > Renaissance Technologies Corp.
> > Route 25A, East Setauket NY 11733
> > e-mail: eddy@rentec.com
> > http://www.rentec.com
> >
> >
> >
> 

Eddy A.J.M. Offermann
Renaissance Technologies Corp.
Route 25A, East Setauket NY 11733
e-mail: eddy@rentec.com
http://www.rentec.com


