Re: [ROOT] Fitting with stochastic optimizers

From: Anton Fokin (anton.fokin@smartquant.com)
Date: Fri Mar 02 2001 - 20:23:02 MET


> I understand that in your stochastic approach you can minimize any
> function just like Minuit but that your algorithm will not get stuck in
> any local extremum

Correct.

> However, it will not fulfill (just like Minuit) requirements 1 and 3.
> For instance, suppose that in your data set a few points have x and/or y
> infinite, will the fitted coefficients still make sense?

Hmm, it depends on what you call a fit and especially what kind of error
estimator you use. The algorithm will behave accordingly: if an outlier is
included in the data set, it will try to find the best parameter set
fitting the data set including that outlier too. I mean, if you fit with
y = ax + b and have a (1, inf) point, of course you get a very large a. On
the other hand, you may set constraints on the parameters, no problem with
that. Or maybe I don't get what you mean? :)
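To illustrate the outlier point (a quick ROOT macro sketch, the data
values are invented just to show the effect):

    // outlier.C -- one wild point dominates a least-squares line fit
    void outlier()
    {
       // five well-behaved points plus one huge outlier at x = 1
       Double_t x[6] = {0, 1, 2, 3, 4, 1};
       Double_t y[6] = {0.1, 1.0, 2.1, 2.9, 4.0, 1e6};
       TGraph g(6, x, y);

       TF1 line("line", "[0]*x + [1]", 0, 5);
       g.Fit(&line);   // slope [0] blows up because of the outlier

       // constraints keep the parameters in a sane range, but the
       // chi^2 is then dominated by the outlier anyway
       line.SetParLimits(0, -10, 10);
       line.SetParLimits(1, -10, 10);
       g.Fit(&line);   // limits set on the TF1 are honored by the fit
    }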

> How robust is the algorithm : how many garbage points will bring the
> algorithm to its knees. For Minuit it only takes one bad point.

Again, what do you mean by bringing the algorithm to its knees?
Segmentation fault? :)

> Thinking about point 1). I have to do thousands of fits in approx. 1
> minute. All that happens is that each time a few more points are added.
> In general each time one can use the previous fit as the starting point
> of the new one, but for a fit linear in its parameters much faster
> methods are available.

Of course, if you have a certain type of fit, it can be more efficient to
use special methods. As I said, stochastic methods are for hard objectives
and constraints. Are you talking about dynamic fitting of time series for
a number of commodities with Box-Jenkins/GARCH or something?
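
For a model linear in its parameters I agree completely: you can keep
running sums and absorb new points in O(1), e.g. for y = ax + b (a rough
sketch in plain C++, the class and names are mine):

    // Incremental straight-line least squares: keep the sufficient
    // statistics (n, Sx, Sy, Sxx, Sxy) and update them per point;
    // re-solving for a and b is then O(1) instead of O(N).
    // Needs at least two distinct x values before A() makes sense.
    struct LinFit {
       double n = 0, Sx = 0, Sy = 0, Sxx = 0, Sxy = 0;

       void Add(double x, double y) {   // absorb one new point
          ++n; Sx += x; Sy += y; Sxx += x*x; Sxy += x*y;
       }
       double A() const {               // slope from the normal equations
          return (n*Sxy - Sx*Sy) / (n*Sxx - Sx*Sx);
       }
       double B() const { return (Sy - A()*Sx) / n; }   // intercept
    };

Your point 1) is the general version of this: for any least-squares
objective linear in its parameters, the Hessian and gradient are sums over
points, so the M new points simply add into them.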

What about a neural net replacement? There was an article discussing ANN
design for GARCH-like predictions. Although the article did not discuss
on-line applications, I assume that in this case training the net on-line
with new patterns is much faster than re-doing the whole fit (a trained
net already "remembers" the previous fit/params/search space).

Regards,
Anton

>
> Eddy
>
> > From: "Anton Fokin" <anton.fokin@smartquant.com>
> > To: "Eddy Offermann" <eddy@rentec.com>
> > Cc: <roottalk@pcroot.cern.ch>
> > Subject: Re: [ROOT] Fitting with stochastic optimizers
> > Date: Fri, 2 Mar 2001 17:17:24 +0100
> > MIME-Version: 1.0
> > Content-Transfer-Encoding: 7bit
> > X-Priority: 3
> > X-MSMail-Priority: Normal
> > X-MimeOLE: Produced By Microsoft MimeOLE V5.00.2615.200
> > X-Filter-Version: 1.3 (ram)
> >
> > Hi Eddy,
> >
> > if we talk about stochastics, an optimizer minimizes your objective
> > function without taking care of its shape. In the case of least squares
> > it minimizes
> >
> > Objective = Sum_i ( F(P1,...,PN; Xi) - Yi )^2
> >
> > and reports the best set of parameters P1..PN.
> >
> > In fact you can use any function to estimate errors and this will not
> > affect the algorithm.
> >
> > There is no direct Hessian matrix or other such calculation involved in
> > these algorithms. Instead they try to change the set of parameters in a
> > smart way to explore the search space and find the best set.
> >
> > Note that stochastic algorithms do not guarantee that you find a global
> > minimum in finite time, but they are quite helpful if you want to find
> > a good solution in reasonable time for complex functions and a number
> > of constraints.
> >
> > If you carefully optimize your objective on a set of N data points, you
> > can spend much less time recalibrating your model after adding M << N
> > points, assuming that the new minimum lies near the old one.
> >
> > Regards,
> > Anton
> >
> > PS. Could someone clever help me with the inverse probability function
> > of the generalized (perhaps too much generalized :) ) Cauchy
> > distribution? See the g(dx) function at
> >
> > http://www.smartquant.com/htmldoc/TSimulatedAnnealing.html
> >
> > (the generalized annealing parameter displacement formula).
> >
> >
> > http://www.smartquant.com
> >
> >
> > ----- Original Message -----
> > From: Eddy Offermann <eddy@rentec.com>
> > To: <anton.fokin@smartquant.com>
> > Cc: <roottalk@pcroot.cern.ch>
> > Sent: Friday, March 02, 2001 4:22 PM
> > Subject: Re: [ROOT] Fitting with stochastic optimizers
> >
> >
> > > Hi Anton,
> > >
> > > If we start to tinker with/rewrite Minuit, I would like to see the
> > > following additions/improvements:
> > >
> > > 1) This refers to linear fits:
> > >     When fitting N data points and getting M additional ones, I would
> > >     like to ADD the information of the new ones to my Hessian matrix
> > >     and gradient. I do NOT want to analyze all M+N data points again!
> > >     This is important for on-line applications and non-cheating
> > >     data analysis with time series. It is straightforward to implement
> > >     for a least-squares objective function.
> > >
> > >  2) Be able to easily specify Bayesian priors for my parameters
> > >
> > >  3) Have the possibility of applying a robust algorithm, such as
> > >      least median of squares, instead of least squares
> > >
> > > What do others wish/think?
> > >
> > > Best regards Eddy
> > >
> > > (out of town for the next 2 weeks)
> > >
> > > > From: "Anton Fokin" <anton.fokin@smartquant.com>
> > > > To: "roottalk" <roottalk@pcroot.cern.ch>
> > > > Subject: [ROOT] Fitting with stochastic optimizers
> > > > Date: Fri, 2 Mar 2001 11:42:49 +0100
> > > >
> > > > Hi rooters,
> > > >
> > > > I've just seen several postings about fitting. I am curious whether
> > > > someone ever ran into trouble with ROOT's Minuit and would like to
> > > > have stochastic optimization as an alternative? I can provide
> > > > simulated annealing/genetics if enough people want it.
> > > >
> > > > Currently I have minimization functionality for TF1/2 user-defined
> > > > functions on www.smartquant.com/neural.html. I can extend the
> > > > package so that it could be used with TVirtualFitter.
> > > >
> > > > Regards,
> > > > Anton
> > > >
> > > > http://www.smartquant.com
> > > >
> > > >
> > > >
> > >
> > > Eddy A.J.M. Offermann
> > > Renaissance Technologies Corp.
> > > Route 25A, East Setauket NY 11733
> > > e-mail: eddy@rentec.com
> > > http://www.rentec.com
> > >
> > >
> > >
> >
>
> Eddy A.J.M. Offermann
> Renaissance Technologies Corp.
> Route 25A, East Setauket NY 11733
> e-mail: eddy@rentec.com
> http://www.rentec.com
>
>
>


