RE: Trouble running TMultiLayerPerceptron with ROOT 5.08/00

From: Nick West <n.west1_at_physics.ox.ac.uk>
Date: Fri, 10 Mar 2006 16:28:34 -0000


Hi Andrea,

Thanks for getting back to me so quickly.

> I've tried your test case and can reproduce the problem.
> In fact, looking at the "mlp->DrawResult(0,"test")" plots
> shows that the networks often degenerate and give the same
> result almost independently of the inputs (roughly the
> average of "type").
> However, I couldn't quickly track this issue down to any
> change between ROOT versions 4.04/02 and 5.08/00.
>
> As a workaround, I'd suggest that you:
> - do not normalize the inputs
> - use the kRibierePolak training method; I've found it
>   usually more robust than kBFGS, even if slower
> - try to tune the training parameters to avoid the local
>   minimum corresponding to the degenerate condition

Well, I have tried the easy suggestions (not normalising, and using kRibierePolak):-

  TMultiLayerPerceptron *mlp
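
In full, the construction and training look something like this (just a
sketch: the layout string, tree, and the training/test selections below
are placeholders in the style of the ROOT mlpHiggs tutorial, not my
actual code):-

  // Sketch only: "in1,in2" and "tree" are placeholder names.
  // No "@" prefix on the inputs, so they are left un-normalised.
  TMultiLayerPerceptron *mlp =
     new TMultiLayerPerceptron("in1,in2:8:type", tree,
                               "Entry$%2",       // training sample
                               "(Entry$+1)%2");  // test sample
  mlp->SetLearningMethod(TMultiLayerPerceptron::kRibierePolak);
  mlp->Train(200, "text,update=10");
  mlp->DrawResult(0, "test");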

but sadly it makes no difference; the RMS from mlp->DrawResult(0,"test")
is still small, so the plots remain degenerate:-

  Test set distribution.  Mean : 0.696387 RMS 0.0705763
  Test set distribution.  Mean : 0.694241 RMS 0.0540238
  Test set distribution.  Mean : 0.692935 RMS 6.59971e-08
  Test set distribution.  Mean : 0.693012 RMS 4.61586e-07
  Test set distribution.  Mean : 0.693139 RMS 0.00310088
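
Incidentally, the degeneracy shows up directly if I evaluate the trained
network at two quite different input points (again just a sketch with
made-up values; Evaluate takes the output index and an array of inputs):-

  Double_t p1[2] = {  0.1,  0.2 };  // made-up input values
  Double_t p2[2] = {  5.0, -3.0 };
  printf("%g %g\n", mlp->Evaluate(0, p1), mlp->Evaluate(0, p2));
  // a degenerate network prints essentially the same
  // number (~0.693 here) for both points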

Cheers,

Nick.