// @(#)root/mlp:$Id$
// Author: Christophe.Delaere@cern.ch   20/07/03

/*************************************************************************
 * Copyright (C) 1995-2003, Rene Brun and Fons Rademakers.               *
 * All rights reserved.                                                  *
 *                                                                       *
 * For the licensing terms see $ROOTSYS/LICENSE.                         *
 * For the list of contributors see $ROOTSYS/README/CREDITS.             *
 *************************************************************************/

///////////////////////////////////////////////////////////////////////////
//
// TMultiLayerPerceptron
//
// This class describes a neural network.
// There are facilities to train the network and use the output.
//
// The input layer is made of inactive neurons (returning the
// optionally normalized input) and output neurons are linear.
// The type of hidden neurons is free, the default being sigmoids.
// (One should still try to pass normalized inputs, e.g. in [0., 1].)
//
// The basic input is a TTree and two (training and test) TEventLists.
// Input and output neurons are assigned a value computed for each event
// with the same possibilities as for TTree::Draw().
// Events may be weighted individually or via TTree::SetWeight().
// Six learning methods are available: kStochastic, kBatch,
// kSteepestDescent, kRibierePolak, kFletcherReeves and kBFGS.
//
// This implementation, written by C. Delaere, is *inspired* by
// the mlpfit package from J. Schwindling et al. with some extensions:
//   * the algorithms are globally the same
//   * in TMultiLayerPerceptron, there is no limitation on the number of
//     layers/neurons, while MLPfit was limited to 2 hidden layers
//   * TMultiLayerPerceptron allows you to save the network in a ROOT file,
//     and provides more export functionalities
//   * TMultiLayerPerceptron gives more flexibility regarding the
//     normalization of inputs/outputs
//   * TMultiLayerPerceptron provides, thanks to Andrea Bocci, the possibility
//     to use cross-entropy errors, which make it possible to train a network
//     for pattern classification based on Bayesian posterior probability.
//
///////////////////////////////////////////////////////////////////////////
//BEGIN_HTML <!--
/* -->
<UL>
        <LI><P><A NAME="intro"></A><FONT COLOR="#5c8526">
        <FONT SIZE=4 STYLE="font-size: 15pt">Introduction</FONT></FONT></P>
</UL>
<P>Neural networks are used more and more in various fields for data
analysis and classification, both by research and commercial
institutions. Some randomly chosen examples are:</P>
<UL>
        <LI><P>image analysis</P>
        <LI><P>financial movements predictions and analysis</P>
        <LI><P>sales forecast and product shipping optimisation</P>
        <LI><P>in particle physics: mainly for classification tasks (signal
        over background discrimination)</P>
</UL>
<P>More than 50% of neural networks are multilayer perceptrons. This
implementation of multilayer perceptrons is inspired by the
<A HREF="http://schwind.home.cern.ch/schwind/MLPfit.html">MLPfit
package</A> originally written by Jerome Schwindling. MLPfit remains
one of the fastest tools for neural network studies, and this ROOT
add-on does not try to compete with it on speed. A clear and flexible
object-oriented implementation has been chosen over faster but more
difficult to maintain code. Nevertheless, the time penalty does not
exceed a factor of 2.</P>
<UL>
        <LI><P><A NAME="mlp"></A><FONT COLOR="#5c8526">
        <FONT SIZE=4 STYLE="font-size: 15pt">The
        MLP</FONT></FONT></P>
</UL>
<P>The multilayer perceptron is a simple feed-forward network with
the following structure:</P>
<P ALIGN=CENTER><IMG SRC="gif/mlp.png" NAME="MLP"
ALIGN=MIDDLE WIDTH=333 HEIGHT=358 BORDER=0>
</P>
<P>It is made of neurons characterized by a bias and weighted links
between them (let's call those links synapses). The input neurons
receive the inputs, normalize them and forward them to the first
hidden layer.
</P>
<P>Each neuron in any subsequent layer first computes a linear
combination of the outputs of the previous layer. The output of the
neuron is then a function <I>f</I> of that combination, with <I>f</I> being
linear for output neurons or a sigmoid for hidden layers. This is
useful because of two theorems:</P>
<OL>
        <LI><P>A linear combination of sigmoids can approximate any
        continuous function.</P>
        <LI><P>Trained with output = 1 for the signal and 0 for the
        background, the approximated function of the inputs X is the
        probability of signal given X.</P>
</OL>
<UL>
        <LI><P><A NAME="lmet"></A><FONT COLOR="#5c8526">
        <FONT SIZE=4 STYLE="font-size: 15pt">Learning
        methods</FONT></FONT></P>
</UL>
<P>The aim of all learning methods is to minimize the total error on
a set of weighted examples. The error is defined as the sum in
quadrature, divided by two, of the error on each individual output
neuron.</P>
<P>In all methods implemented, one needs to compute
the first derivative of that error with respect to the weights.
Exploiting the well-known properties of the derivative, especially the
derivative of compound functions, one can write:</P>
<UL>
        <LI><P>for a neuron: the product of the local derivative with the
        weighted sum of the derivatives of the outputs.</P>
        <LI><P>for a synapse: the product of the input with the local
        derivative of the output neuron.</P>
</UL>
<P>This computation is called back-propagation of the errors. A
loop over all examples is called an epoch.</P>
<P>Six learning methods are implemented.</P>
<P><FONT COLOR="#006b6b"><I>Stochastic minimization</I>:</FONT> This
is the most trivial learning method. This is the Robbins-Monro
stochastic approximation applied to multilayer perceptrons. The
weights are updated after each example according to the formula:</P>
<P ALIGN=CENTER>$w_{ij}(t+1) = w_{ij}(t) + \Delta w_{ij}(t)$
</P>
<P ALIGN=CENTER>with
</P>
<P ALIGN=CENTER>$\Delta w_{ij}(t) = - \eta ( \partial e_p / \partial w_{ij} +
\delta ) + \epsilon \Delta w_{ij}(t-1)$</P>
<P>The parameters for this method are Eta, EtaDecay, Delta and
Epsilon.</P>
<P><FONT COLOR="#006b6b"><I>Steepest descent with fixed step size
(batch learning)</I>:</FONT> It is the same as the stochastic
minimization, but the weights are updated after considering all the
examples, with the total derivative dEdw. The parameters for this
method are Eta, EtaDecay, Delta and Epsilon.</P>
<P><FONT COLOR="#006b6b"><I>Steepest descent algorithm</I>: </FONT>Weights
are set to the minimum along the line defined by the gradient. The
only parameter for this method is Tau. Lower tau = higher precision =
slower search. A value Tau = 3 seems reasonable.</P>
<P><FONT COLOR="#006b6b"><I>Conjugate gradients with the
Polak-Ribiere updating formula</I>: </FONT>Weights are set to the
minimum along the line defined by the conjugate gradient. Parameters
are Tau and Reset, which defines the epochs where the direction is
reset to the steepest descent.</P>
<P><FONT COLOR="#006b6b"><I>Conjugate gradients with the
Fletcher-Reeves updating formula</I>: </FONT>Weights are set to the
minimum along the line defined by the conjugate gradient. Parameters
are Tau and Reset, which defines the epochs where the direction is
reset to the steepest descent.</P>
<P><FONT COLOR="#006b6b"><I>Broyden, Fletcher, Goldfarb, Shanno
(BFGS) method</I>:</FONT> Implies the computation of a NxN matrix
computation, but seems more powerful at least for less than 300
weights. Parameters are Tau and Reset, which defines the epochs where
the direction is reset to the steepes descent.</P>
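<P><FONT SIZE=3>For illustration, a minimal sketch of how a method and its
parameters can be selected through the setter methods (the network object
<I>net</I> is an assumption):</FONT></P>
<PRE>
   net.SetLearningMethod(TMultiLayerPerceptron::kStochastic);
   net.SetEta(0.1);       // step size
   net.SetEtaDecay(1.0);  // Eta *= EtaDecay after each epoch
   net.SetDelta(0.0);
   net.SetEpsilon(0.0);
   // for the line-search based methods, one would instead set:
   //   net.SetTau(3.0);
   //   net.SetReset(50);
</PRE>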
<UL>
        <LI><P><A NAME="use"></A><FONT COLOR="#5c8526">
        <FONT SIZE=4 STYLE="font-size: 15pt">How
        to use it...</FONT></FONT></P></LI>
</UL>
<P><FONT SIZE=3>The MLP is built from 3 classes: TNeuron, TSynapse and
TMultiLayerPerceptron. Only TMultiLayerPerceptron should be used
explicitly by the user.</FONT></P>
<P><FONT SIZE=3>TMultiLayerPerceptron will take examples from a TTree
given in the constructor. The network is described by a simple
string: the input/output layers are defined by giving the expression for
each neuron, separated by commas. Hidden layers are just described
by the number of neurons. The layers are separated by colons.
In addition, input/output layer formulas can be preceded by '@' (e.g. "@out")
if one also wants to normalize the data from the TTree.
Inputs and outputs are taken from the TTree given as second argument.
Expressions are evaluated as for TTree::Draw(); arrays are expanded into
distinct neurons, one for each index.
This can only be done for fixed-size arrays.
If the formula ends with &quot;!&quot;, softmax functions are used for the output layer.
One defines the training and test datasets by TEventLists.</FONT></P>
<P STYLE="margin-left: 2cm"><FONT SIZE=3><SPAN STYLE="background: #e6e6e6">
<U><FONT COLOR="#ff0000">Example</FONT></U><SPAN STYLE="text-decoration: none">:
</SPAN>TMultiLayerPerceptron(&quot;x,y:10:5:f&quot;,inputTree);</SPAN></FONT></P>
<P><FONT SIZE=3>Both the TTree and the TEventLists can be defined in
the constructor, or later with the dedicated setter method. The lists
used for training and test can be defined either explicitly, or via
a string containing the formula to be used to define them, exactly as
for a TCut.</FONT></P>
<P><FONT SIZE=3>The learning method is defined using
TMultiLayerPerceptron::SetLearningMethod(). Learning methods are:</FONT></P>
<P><FONT SIZE=3>TMultiLayerPerceptron::kStochastic, <BR>
TMultiLayerPerceptron::kBatch,<BR>
TMultiLayerPerceptron::kSteepestDescent,<BR>
TMultiLayerPerceptron::kRibierePolak,<BR>
TMultiLayerPerceptron::kFletcherReeves,<BR>
TMultiLayerPerceptron::kBFGS<BR></FONT></P>
<P>A weight can be assigned to events, either in the constructor or
with TMultiLayerPerceptron::SetEventWeight(). In addition, the TTree weight
is taken into account.</P>
<P><FONT SIZE=3>Finally, one starts the training with
TMultiLayerPerceptron::Train(Int_t nepoch, Option_t* options). The
first argument is the number of epochs while option is a string that
can contain: &quot;text&quot; (simple text output), &quot;graph&quot;
(evolving graphical training curves), &quot;update=X&quot; (step for
the text/graph output update) or &quot;+&quot; (will skip the
randomisation and start from the previous values). All combinations
are available. </FONT></P>
<P STYLE="margin-left: 2cm"><FONT SIZE=3><SPAN STYLE="background: #e6e6e6">
<U><FONT COLOR="#ff0000">Example</FONT></U>:
net.Train(100,&quot;text, graph, update=10&quot;).</SPAN></FONT></P>
<P><FONT SIZE=3>When the neural net is trained, it can be used
directly ( TMultiLayerPerceptron::Evaluate() ) or exported to a
standalone C++ code ( TMultiLayerPerceptron::Export() ).</FONT></P>
<P><FONT SIZE=3>Finally, note that even though this implementation is inspired by the mlpfit code,
the feature lists do not exactly match:
<UL>
        <LI><P>the mlpfit hybrid learning method is not implemented</P></LI>
        <LI><P>output neurons can be normalized, which is not the case for mlpfit</P></LI>
        <LI><P>the neural net is exported in C++, FORTRAN or PYTHON</P></LI>
        <LI><P>the DrawResult() method allows a fast check of the learning procedure</P></LI>
</UL>
In addition, the paw version of mlpfit had additional limitations on the number of neurons, hidden layers and inputs/outputs that do not apply to TMultiLayerPerceptron.
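<P><FONT SIZE=3>Putting it all together, a minimal training macro could look
as follows. This is only a sketch: the tree <I>inputTree</I> and its branches
<I>x</I>, <I>y</I> and <I>type</I> are assumptions used for illustration.</FONT></P>
<PRE>
   // network with 2 inputs, two hidden layers (10 and 5 neurons) and the
   // branch "type" as output; even entries are used for training,
   // odd entries for testing
   TMultiLayerPerceptron net("x,y:10:5:type", inputTree,
                             "Entry$%2", "(Entry$+1)%2");
   net.SetLearningMethod(TMultiLayerPerceptron::kBFGS);
   net.Train(100, "text,graph,update=10");
   // evaluate output neuron 0 for a given set of inputs
   Double_t params[2] = {1., 2.};
   Double_t value = net.Evaluate(0, params);
   // export the trained network as standalone code
   net.Export("mlpNet", "C++");
</PRE>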
<!-- */
// -->END_HTML

#include "TMultiLayerPerceptron.h"
#include "TSynapse.h"
#include "TNeuron.h"
#include "TClass.h"
#include "TTree.h"
#include "TEventList.h"
#include "TRandom3.h"
#include "TTimeStamp.h"
#include "TRegexp.h"
#include "TCanvas.h"
#include "TH2.h"
#include "TGraph.h"
#include "TLegend.h"
#include "TMultiGraph.h"
#include "TDirectory.h"
#include "TSystem.h"
#include "Riostream.h"
#include "TMath.h"
#include "TTreeFormula.h"
#include "TTreeFormulaManager.h"
#include "TMarker.h"
#include "TLine.h"
#include "TText.h"
#include "TObjString.h"
#include <stdlib.h>
#include <float.h>   // DBL_EPSILON and DBL_MAX, used by the cross-entropy code

ClassImp(TMultiLayerPerceptron)

//______________________________________________________________________________
TMultiLayerPerceptron::TMultiLayerPerceptron()
{
   // Default constructor
   if(!TClass::GetClass("TTreePlayer")) gSystem->Load("libTreePlayer");
   fNetwork.SetOwner(true);
   fFirstLayer.SetOwner(false);
   fLastLayer.SetOwner(false);
   fSynapses.SetOwner(true);
   fData = 0;
   fCurrentTree = -1;
   fCurrentTreeWeight = 1;
   fStructure = "";
   fWeight = "1";
   fTraining = 0;
   fTrainingOwner = false;
   fTest = 0;
   fTestOwner = false;
   fEventWeight = 0;
   fManager = 0;
   fLearningMethod = TMultiLayerPerceptron::kBFGS;
   fEta = .1;
   fEtaDecay = 1;
   fDelta = 0;
   fEpsilon = 0;
   fTau = 3;
   fLastAlpha = 0;
   fReset = 50;
   fType = TNeuron::kSigmoid;
   fOutType =  TNeuron::kLinear;
   fextF = "";
   fextD = "";
}

//______________________________________________________________________________
TMultiLayerPerceptron::TMultiLayerPerceptron(const char * layout, TTree * data,
                                             TEventList * training,
                                             TEventList * test,
                                             TNeuron::ENeuronType type,
                                             const char* extF, const char* extD)
{
   // The network is described by a simple string:
   // The input/output layers are defined by giving
   // the branch names separated by commas.
   // Hidden layers are just described by the number of neurons.
   // The layers are separated by colons.
   // Ex: "x,y:10:5:f"
   // The output can be prepended by '@' if the variable has to be
   // normalized.
   // The output can be followed by '!' to use Softmax neurons for the
   // output layer only.
   // Ex: "x,y:10:5:c1,c2,c3!"
   // Inputs and outputs are taken from the TTree given as second argument.
   // training and test are the two TEventLists defining events
   // to be used during the neural net training.
   // Both the TTree and the TEventLists can be defined in the constructor,
   // or later with the dedicated setter method.
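   //
   // A minimal sketch, assuming a TTree 'tree' with branches x, y and f,
   // and two TEventLists 'train' and 'test':
   //   TMultiLayerPerceptron mlp("x,y:10:5:f", tree, train, test);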

   if(!TClass::GetClass("TTreePlayer")) gSystem->Load("libTreePlayer");
   fNetwork.SetOwner(true);
   fFirstLayer.SetOwner(false);
   fLastLayer.SetOwner(false);
   fSynapses.SetOwner(true);
   fStructure = layout;
   fData = data;
   fCurrentTree = -1;
   fCurrentTreeWeight = 1;
   fTraining = training;
   fTrainingOwner = false;
   fTest = test;
   fTestOwner = false;
   fWeight = "1";
   fType = type;
   fOutType =  TNeuron::kLinear;
   fextF = extF;
   fextD = extD;
   fEventWeight = 0;
   fManager = 0;
   if (data) {
      BuildNetwork();
      AttachData();
   }
   fLearningMethod = TMultiLayerPerceptron::kBFGS;
   fEta = .1;
   fEpsilon = 0;
   fDelta = 0;
   fEtaDecay = 1;
   fTau = 3;
   fLastAlpha = 0;
   fReset = 50;
}

//______________________________________________________________________________
TMultiLayerPerceptron::TMultiLayerPerceptron(const char * layout,
                                             const char * weight, TTree * data,
                                             TEventList * training,
                                             TEventList * test,
                                             TNeuron::ENeuronType type,
                                             const char* extF, const char* extD)
{
   // The network is described by a simple string:
   // The input/output layers are defined by giving
   // the branch names separated by commas.
   // Hidden layers are just described by the number of neurons.
   // The layers are separated by colons.
   // Ex: "x,y:10:5:f"
   // The output can be prepended by '@' if the variable has to be
   // normalized.
   // The output can be followed by '!' to use Softmax neurons for the
   // output layer only.
   // Ex: "x,y:10:5:c1,c2,c3!"
   // Inputs and outputs are taken from the TTree given as second argument.
   // training and test are the two TEventLists defining events
   // to be used during the neural net training.
   // Both the TTree and the TEventLists can be defined in the constructor,
   // or later with the dedicated setter method.

   if(!TClass::GetClass("TTreePlayer")) gSystem->Load("libTreePlayer");
   fNetwork.SetOwner(true);
   fFirstLayer.SetOwner(false);
   fLastLayer.SetOwner(false);
   fSynapses.SetOwner(true);
   fStructure = layout;
   fData = data;
   fCurrentTree = -1;
   fCurrentTreeWeight = 1;
   fTraining = training;
   fTrainingOwner = false;
   fTest = test;
   fTestOwner = false;
   fWeight = weight;
   fType = type;
   fOutType =  TNeuron::kLinear;
   fextF = extF;
   fextD = extD;
   fEventWeight = 0;
   fManager = 0;
   if (data) {
      BuildNetwork();
      AttachData();
   }
   fLearningMethod = TMultiLayerPerceptron::kBFGS;
   fEta = .1;
   fEtaDecay = 1;
   fDelta = 0;
   fEpsilon = 0;
   fTau = 3;
   fLastAlpha = 0;
   fReset = 50;
}

//______________________________________________________________________________
TMultiLayerPerceptron::TMultiLayerPerceptron(const char * layout, TTree * data,
                                             const char * training,
                                             const char * test,
                                             TNeuron::ENeuronType type,
                                             const char* extF, const char* extD)
{
   // The network is described by a simple string:
   // The input/output layers are defined by giving
   // the branch names separated by commas.
   // Hidden layers are just described by the number of neurons.
   // The layers are separated by colons.
   // Ex: "x,y:10:5:f"
   // The output can be prepended by '@' if the variable has to be
   // normalized.
   // The output can be followed by '!' to use Softmax neurons for the
   // output layer only.
   // Ex: "x,y:10:5:c1,c2,c3!"
   // Inputs and outputs are taken from the TTree given as second argument.
   // training and test are two cuts (see TTreeFormula) defining events
   // to be used during the neural net training and testing.
   // Example: "Entry$%2", "(Entry$+1)%2".
   // Both the TTree and the cuts can be defined in the constructor,
   // or later with the dedicated setter method.
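   //
   // A minimal sketch, assuming a TTree 'tree' with branches x, y and f:
   //   TMultiLayerPerceptron mlp("x,y:10:5:f", tree,
   //                             "Entry$%2", "(Entry$+1)%2");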

   if(!TClass::GetClass("TTreePlayer")) gSystem->Load("libTreePlayer");
   fNetwork.SetOwner(true);
   fFirstLayer.SetOwner(false);
   fLastLayer.SetOwner(false);
   fSynapses.SetOwner(true);
   fStructure = layout;
   fData = data;
   fCurrentTree = -1;
   fCurrentTreeWeight = 1;
   fTraining = new TEventList(Form("fTrainingList_%lu",(ULong_t)this));
   fTrainingOwner = true;
   fTest = new TEventList(Form("fTestList_%lu",(ULong_t)this));
   fTestOwner = true;
   fWeight = "1";
   TString testcut = test;
   if(testcut=="") testcut = Form("!(%s)",training);
   fType = type;
   fOutType =  TNeuron::kLinear;
   fextF = extF;
   fextD = extD;
   fEventWeight = 0;
   fManager = 0;
   if (data) {
      BuildNetwork();
      data->Draw(Form(">>fTrainingList_%lu",(ULong_t)this),training,"goff");
      data->Draw(Form(">>fTestList_%lu",(ULong_t)this),(const char *)testcut,"goff");
      AttachData();
   }
   else {
      Warning("TMultiLayerPerceptron::TMultiLayerPerceptron","Data not set. Cannot define datasets");
   }
   fLearningMethod = TMultiLayerPerceptron::kBFGS;
   fEta = .1;
   fEtaDecay = 1;
   fDelta = 0;
   fEpsilon = 0;
   fTau = 3;
   fLastAlpha = 0;
   fReset = 50;
}

//______________________________________________________________________________
TMultiLayerPerceptron::TMultiLayerPerceptron(const char * layout,
                                             const char * weight, TTree * data,
                                             const char * training,
                                             const char * test,
                                             TNeuron::ENeuronType type,
                                             const char* extF, const char* extD)
{
   // The network is described by a simple string:
   // The input/output layers are defined by giving
   // the branch names separated by commas.
   // Hidden layers are just described by the number of neurons.
   // The layers are separated by colons.
   // Ex: "x,y:10:5:f"
   // The output can be prepended by '@' if the variable has to be
   // normalized.
   // The output can be followed by '!' to use Softmax neurons for the
   // output layer only.
   // Ex: "x,y:10:5:c1,c2,c3!"
   // Inputs and outputs are taken from the TTree given as second argument.
   // training and test are two cuts (see TTreeFormula) defining events
   // to be used during the neural net training and testing.
   // Example: "Entry$%2", "(Entry$+1)%2".
   // Both the TTree and the cuts can be defined in the constructor,
   // or later with the dedicated setter method.

   if(!TClass::GetClass("TTreePlayer")) gSystem->Load("libTreePlayer");
   fNetwork.SetOwner(true);
   fFirstLayer.SetOwner(false);
   fLastLayer.SetOwner(false);
   fSynapses.SetOwner(true);
   fStructure = layout;
   fData = data;
   fCurrentTree = -1;
   fCurrentTreeWeight = 1;
   fTraining = new TEventList(Form("fTrainingList_%lu",(ULong_t)this));
   fTrainingOwner = true;
   fTest = new TEventList(Form("fTestList_%lu",(ULong_t)this));
   fTestOwner = true;
   fWeight = weight;
   TString testcut = test;
   if(testcut=="") testcut = Form("!(%s)",training);
   fType = type;
   fOutType =  TNeuron::kLinear;
   fextF = extF;
   fextD = extD;
   fEventWeight = 0;
   fManager = 0;
   if (data) {
      BuildNetwork();
      data->Draw(Form(">>fTrainingList_%lu",(ULong_t)this),training,"goff");
      data->Draw(Form(">>fTestList_%lu",(ULong_t)this),(const char *)testcut,"goff");
      AttachData();
   }
   else {
      Warning("TMultiLayerPerceptron::TMultiLayerPerceptron","Data not set. Cannot define datasets");
   }
   fLearningMethod = TMultiLayerPerceptron::kBFGS;
   fEta = .1;
   fEtaDecay = 1;
   fDelta = 0;
   fEpsilon = 0;
   fTau = 3;
   fLastAlpha = 0;
   fReset = 50;
}

//______________________________________________________________________________
TMultiLayerPerceptron::~TMultiLayerPerceptron()
{
   // Destructor
   if(fTraining && fTrainingOwner) delete fTraining;
   if(fTest && fTestOwner) delete fTest;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetData(TTree * data)
{
   // Set the data source
   if (fData) {
      Error("SetData", "data already defined.");
      return;
   }
   fData = data;
   if (data) {
      BuildNetwork();
      AttachData();
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetEventWeight(const char * branch)
{
   // Set the event weight
   fWeight=branch;
   if (fData) {
      if (fEventWeight) {
         fManager->Remove(fEventWeight);
         delete fEventWeight;
      }
      fManager->Add((fEventWeight = new TTreeFormula("NNweight",fWeight.Data(),fData)));
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetTrainingDataSet(TEventList* train)
{
   // Sets the Training dataset.
   // Those events will be used for the minimization
   if(fTraining && fTrainingOwner) delete fTraining;
   fTraining = train;
   fTrainingOwner = false;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetTestDataSet(TEventList* test)
{
   // Sets the Test dataset.
   // Those events will not be used for the minimization but for control
   if(fTest && fTestOwner) delete fTest;
   fTest = test;
   fTestOwner = false;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetTrainingDataSet(const char * train)
{
   // Sets the Training dataset.
   // Those events will be used for the minimization.
   // Note that the tree must be already defined.
   if(fTraining && fTrainingOwner) delete fTraining;
   fTraining = new TEventList(Form("fTrainingList_%lu",(ULong_t)this));
   fTrainingOwner = true;
   if (fData) {
      fData->Draw(Form(">>fTrainingList_%lu",(ULong_t)this),train,"goff");
   }
   else {
      Warning("TMultiLayerPerceptron::TMultiLayerPerceptron","Data not set. Cannot define datasets");
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetTestDataSet(const char * test)
{
   // Sets the Test dataset.
   // Those events will not be used for the minimization but for control.
   // Note that the tree must be already defined.
   if(fTest && fTestOwner) { delete fTest; fTest = 0; }
   fTest = new TEventList(Form("fTestList_%lu",(ULong_t)this));
   fTestOwner = true;
   if (fData) {
      fData->Draw(Form(">>fTestList_%lu",(ULong_t)this),test,"goff");
   }
   else {
      Warning("TMultiLayerPerceptron::TMultiLayerPerceptron","Data not set. Cannot define datasets");
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetLearningMethod(TMultiLayerPerceptron::ELearningMethod method)
{
   // Sets the learning method.
   // Available methods are: kStochastic, kBatch,
   // kSteepestDescent, kRibierePolak, kFletcherReeves and kBFGS.
   // (look at the constructor for the complete description
   // of learning methods and parameters)
   fLearningMethod = method;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetEta(Double_t eta)
{
   // Sets Eta - used in stochastic minimisation
   // (look at the constructor for the complete description
   // of learning methods and parameters)
   fEta = eta;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetEpsilon(Double_t eps)
{
   // Sets Epsilon - used in stochastic minimisation
   // (look at the constructor for the complete description
   // of learning methods and parameters)
   fEpsilon = eps;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetDelta(Double_t delta)
{
   // Sets Delta - used in stochastic minimisation
   // (look at the constructor for the complete description
   // of learning methods and parameters)
   fDelta = delta;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetEtaDecay(Double_t ed)
{
   // Sets EtaDecay - Eta *= EtaDecay at each epoch
   // (look at the constructor for the complete description
   // of learning methods and parameters)
   fEtaDecay = ed;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetTau(Double_t tau)
{
   // Sets Tau - used in line search
   // (look at the constructor for the complete description
   // of learning methods and parameters)
   fTau = tau;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetReset(Int_t reset)
{
   // Sets number of epochs between two resets of the
   // search direction to the steepest descent.
   // (look at the constructor for the complete description
   // of learning methods and parameters)
   fReset = reset;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::GetEntry(Int_t entry) const
{
   // Load an entry into the network
   if (!fData) return;
   fData->GetEntry(entry);
   if (fData->GetTreeNumber() != fCurrentTree) {
      ((TMultiLayerPerceptron*)this)->fCurrentTree = fData->GetTreeNumber();
      fManager->Notify();
      ((TMultiLayerPerceptron*)this)->fCurrentTreeWeight = fData->GetWeight();
   }
   Int_t nentries = fNetwork.GetEntriesFast();
   for (Int_t i=0;i<nentries;i++) {
      TNeuron *neuron = (TNeuron *)fNetwork.UncheckedAt(i);
      neuron->SetNewEvent();
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::Train(Int_t nEpoch, Option_t * option, Double_t minE)
{
   // Train the network.
   // nEpoch is the number of iterations.
   // option can contain:
   // - "text" (simple text output)
   // - "graph" (evolving graphical training curves)
   // - "update=X" (step for the text/graph output update)
   // - "+" (will skip the randomisation and start from the previous values)
   // - "current" (draw in the current canvas)
   // - "minErrorTrain" (stop when the NN error on the training sample gets below minE)
   // - "minErrorTest" (stop when the NN error on the test sample gets below minE)
   // All combinations are available.
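   //
   // A minimal sketch, for an already-built network 'net':
   //   net.Train(100, "text,graph,update=10");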

   Int_t i;
   TString opt = option;
   opt.ToLower();
   // Decode options and prepare training.
   Int_t verbosity = 0;
   Bool_t newCanvas = true;
   Bool_t minE_Train = false;
   Bool_t minE_Test  = false;
   if (opt.Contains("text"))
      verbosity += 1;
   if (opt.Contains("graph"))
      verbosity += 2;
   Int_t displayStepping = 1;
   if (opt.Contains("update=")) {
      TRegexp reg("update=[0-9]*");
      TString out = opt(reg);
      displayStepping = atoi(out.Data() + 7);
   }
   if (opt.Contains("current"))
      newCanvas = false;
   if (opt.Contains("minerrortrain"))
      minE_Train = true;
   if (opt.Contains("minerrortest"))
      minE_Test = true;
   TVirtualPad *canvas = 0;
   TMultiGraph *residual_plot = 0;
   TGraph *train_residual_plot = 0;
   TGraph *test_residual_plot = 0;
   if ((!fData) || (!fTraining) || (!fTest)) {
      Error("Train","Training/Test samples still not defined. Cannot train the neural network");
      return;
   }
   Info("Train","Using %d train and %d test entries.",
        fTraining->GetN(), fTest->GetN());
   // Text and Graph outputs
   if (verbosity % 2)
      std::cout << "Training the Neural Network" << std::endl;
   if (verbosity / 2) {
      residual_plot = new TMultiGraph;
      if(newCanvas)
         canvas = new TCanvas("NNtraining", "Neural Net training");
      else {
         canvas = gPad;
         if(!canvas) canvas = new TCanvas("NNtraining", "Neural Net training");
      }
      train_residual_plot = new TGraph(nEpoch);
      test_residual_plot  = new TGraph(nEpoch);
      canvas->SetLeftMargin(0.14);
      train_residual_plot->SetLineColor(4);
      test_residual_plot->SetLineColor(2);
      residual_plot->Add(train_residual_plot);
      residual_plot->Add(test_residual_plot);
      residual_plot->Draw("LA");
      if (residual_plot->GetXaxis())  residual_plot->GetXaxis()->SetTitle("Epoch");
      if (residual_plot->GetYaxis())  residual_plot->GetYaxis()->SetTitle("Error");
   }
   // If the option "+" is not set, one has to randomize the weights first
   if (!opt.Contains("+"))
      Randomize();
   // Initialisation
   fLastAlpha = 0;
   Int_t els = fNetwork.GetEntriesFast() + fSynapses.GetEntriesFast();
   Double_t *buffer = new Double_t[els];
   Double_t *dir = new Double_t[els];
   for (i = 0; i < els; i++)
      buffer[i] = 0;
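   // Only the BFGS method needs a full (els x els) approximate inverse
   // Hessian; the other methods only get a dummy 1x1 matrix.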
   Int_t matrix_size = fLearningMethod==TMultiLayerPerceptron::kBFGS ? els : 1;
   TMatrixD bfgsh(matrix_size, matrix_size);
   TMatrixD gamma(matrix_size, 1);
   TMatrixD delta(matrix_size, 1);
   // Epoch loop. Here is the training itself.
   Double_t training_E = 1e10;
   Double_t test_E = 1e10;
   for (Int_t iepoch = 0; (iepoch < nEpoch) && (!minE_Train || training_E>minE) && (!minE_Test || test_E>minE) ; iepoch++) {
      switch (fLearningMethod) {
      case TMultiLayerPerceptron::kStochastic:
         {
            MLP_Stochastic(buffer);
            break;
         }
      case TMultiLayerPerceptron::kBatch:
         {
            ComputeDEDw();
            MLP_Batch(buffer);
            break;
         }
      case TMultiLayerPerceptron::kSteepestDescent:
         {
            ComputeDEDw();
            SteepestDir(dir);
            if (LineSearch(dir, buffer))
               MLP_Batch(buffer);
            break;
         }
      case TMultiLayerPerceptron::kRibierePolak:
         {
            ComputeDEDw();
            if (!(iepoch % fReset)) {
               SteepestDir(dir);
            } else {
               Double_t norm = 0;
               Double_t onorm = 0;
               for (i = 0; i < els; i++)
                  onorm += dir[i] * dir[i];
               Double_t prod = 0;
               Int_t idx = 0;
               TNeuron *neuron = 0;
               TSynapse *synapse = 0;
               Int_t nentries = fNetwork.GetEntriesFast();
               for (i=0;i<nentries;i++) {
                  neuron = (TNeuron *) fNetwork.UncheckedAt(i);
                  prod -= dir[idx++] * neuron->GetDEDw();
                  norm += neuron->GetDEDw() * neuron->GetDEDw();
               }
               nentries = fSynapses.GetEntriesFast();
               for (i=0;i<nentries;i++) {
                  synapse = (TSynapse *) fSynapses.UncheckedAt(i);
                  prod -= dir[idx++] * synapse->GetDEDw();
                  norm += synapse->GetDEDw() * synapse->GetDEDw();
               }
               ConjugateGradientsDir(dir, (norm - prod) / onorm);
            }
            if (LineSearch(dir, buffer))
               MLP_Batch(buffer);
            break;
         }
      case TMultiLayerPerceptron::kFletcherReeves:
         {
            ComputeDEDw();
            if (!(iepoch % fReset)) {
               SteepestDir(dir);
            } else {
               Double_t norm = 0;
               Double_t onorm = 0;
               for (i = 0; i < els; i++)
                  onorm += dir[i] * dir[i];
               TNeuron *neuron = 0;
               TSynapse *synapse = 0;
               Int_t nentries = fNetwork.GetEntriesFast();
               for (i=0;i<nentries;i++) {
                  neuron = (TNeuron *) fNetwork.UncheckedAt(i);
                  norm += neuron->GetDEDw() * neuron->GetDEDw();
               }
               nentries = fSynapses.GetEntriesFast();
               for (i=0;i<nentries;i++) {
                  synapse = (TSynapse *) fSynapses.UncheckedAt(i);
                  norm += synapse->GetDEDw() * synapse->GetDEDw();
               }
               ConjugateGradientsDir(dir, norm / onorm);
            }
            if (LineSearch(dir, buffer))
               MLP_Batch(buffer);
            break;
         }
      case TMultiLayerPerceptron::kBFGS:
         {
            SetGammaDelta(gamma, delta, buffer);
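            // gamma now holds the change in the gradient and delta the
            // last weight step; GetBFGSH uses them below to update the
            // approximate inverse Hessian bfgsh.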
            if (!(iepoch % fReset)) {
               SteepestDir(dir);
               bfgsh.UnitMatrix();
            } else {
               if (GetBFGSH(bfgsh, gamma, delta)) {
                  SteepestDir(dir);
                  bfgsh.UnitMatrix();
               } else {
                  BFGSDir(bfgsh, dir);
               }
            }
            if (DerivDir(dir) > 0) {
               SteepestDir(dir);
               bfgsh.UnitMatrix();
            }
            if (LineSearch(dir, buffer)) {
               bfgsh.UnitMatrix();
               SteepestDir(dir);
               if (LineSearch(dir, buffer)) {
                  Error("TMultiLayerPerceptron::Train()","Line search fail");
                  iepoch = nEpoch;
               }
            }
            break;
         }
      }
      // Safety check: should the learning lead to non-real numbers,
      // the training must stop now.
      if (TMath::IsNaN(GetError(TMultiLayerPerceptron::kTraining))) {
         Error("TMultiLayerPerceptron::Train()","The error became NaN. Stopping the training.");
         iepoch = nEpoch;
      }
      // Process other ROOT events.  Time penalty is less than
      // 1/1000 sec/evt on a mobile AMD Athlon(tm) XP 1500+
      gSystem->ProcessEvents();
      training_E = TMath::Sqrt(GetError(TMultiLayerPerceptron::kTraining) / fTraining->GetN());
      test_E = TMath::Sqrt(GetError(TMultiLayerPerceptron::kTest) / fTest->GetN());
      // Intermediate graph and text output
      if ((verbosity % 2) && ((!(iepoch % displayStepping)) || (iepoch == nEpoch - 1))) {
         std::cout << "Epoch: " << iepoch
              << " learn=" << training_E
              << " test=" << test_E
              << std::endl;
      }
      if (verbosity / 2) {
         train_residual_plot->SetPoint(iepoch, iepoch,training_E);
         test_residual_plot->SetPoint(iepoch, iepoch,test_E);
         if (!iepoch) {
            Double_t trp = train_residual_plot->GetY()[iepoch];
            Double_t tep = test_residual_plot->GetY()[iepoch];
            for (i = 1; i < nEpoch; i++) {
               train_residual_plot->SetPoint(i, i, trp);
               test_residual_plot->SetPoint(i, i, tep);
            }
         }
         if ((!(iepoch % displayStepping)) || (iepoch == nEpoch - 1)) {
            if (residual_plot->GetYaxis()) {
               residual_plot->GetYaxis()->UnZoom();
               residual_plot->GetYaxis()->SetTitleOffset(1.4);
               residual_plot->GetYaxis()->SetDecimals();
            }
            canvas->Modified();
            canvas->Update();
         }
      }
   }
   // Cleaning
   delete [] buffer;
   delete [] dir;
   // Final Text and Graph outputs
   if (verbosity % 2)
      std::cout << "Training done." << std::endl;
   if (verbosity / 2) {
      TLegend *legend = new TLegend(.75, .80, .95, .95);
      legend->AddEntry(residual_plot->GetListOfGraphs()->At(0),
                       "Training sample", "L");
      legend->AddEntry(residual_plot->GetListOfGraphs()->At(1),
                       "Test sample", "L");
      legend->Draw();
   }
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::Result(Int_t event, Int_t index) const
{
   // Computes the output for a given event.
   // Look at the output neuron designated by index.
   GetEntry(event);
   TNeuron *out = (TNeuron *) (fLastLayer.At(index));
   if (out)
      return out->GetValue();
   else
      return 0;
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::GetError(Int_t event) const
{
   // Error on the output for a given event
   GetEntry(event);
   Double_t error = 0;
   // the error function is chosen according to the output neuron type
   Int_t nEntries = fLastLayer.GetEntriesFast();
   if (nEntries == 0) return 0.0;
   switch (fOutType) {
   case (TNeuron::kSigmoid):
         error = GetCrossEntropyBinary();
         break;
   case (TNeuron::kSoftmax):
         error = GetCrossEntropy();
         break;
   case (TNeuron::kLinear):
         error = GetSumSquareError();
         break;
   default:
         // default to sum-of-squares error
         error = GetSumSquareError();
   }
   error *= fEventWeight->EvalInstance();
   error *= fCurrentTreeWeight;
   return error;
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::GetError(TMultiLayerPerceptron::EDataSet set) const
{
   // Error on the whole dataset
   TEventList *list =
       ((set == TMultiLayerPerceptron::kTraining) ? fTraining : fTest);
   Double_t error = 0;
   Int_t i;
   if (list) {
      Int_t nEvents = list->GetN();
      for (i = 0; i < nEvents; i++) {
         error += GetError(list->GetEntry(i));
      }
   } else if (fData) {
      Int_t nEvents = (Int_t) fData->GetEntries();
      for (i = 0; i < nEvents; i++) {
         error += GetError(i);
      }
   }
   return error;
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::GetSumSquareError() const
{
   // Sum of the squared errors on the output neurons for the current
   // event, divided by two
   Double_t error = 0;
   for (Int_t i = 0; i < fLastLayer.GetEntriesFast(); i++) {
      TNeuron *neuron = (TNeuron *) fLastLayer[i];
      error += neuron->GetError() * neuron->GetError();
   }
   return (error / 2.);
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::GetCrossEntropyBinary() const
{
   // Cross entropy error for sigmoid output neurons, for a given event
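   // Computed as E = - sum_i [t_i ln(o_i/t_i) + (1-t_i) ln((1-o_i)/(1-t_i))]
   // where o_i is the sigmoid output and t_i the target, both in [0,1].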
   Double_t error = 0;
   for (Int_t i = 0; i < fLastLayer.GetEntriesFast(); i++) {
      TNeuron *neuron = (TNeuron *) fLastLayer[i];
      Double_t output = neuron->GetValue();     // sigmoid output and target
      Double_t target = neuron->GetTarget();    // values lie in [0,1]
      if (target < DBL_EPSILON) {
         if (output == 1.0)
            error = DBL_MAX;
         else
            error -= TMath::Log(1 - output);
      } else
      if ((1 - target) < DBL_EPSILON) {
         if (output == 0.0)
            error = DBL_MAX;
         else
            error -= TMath::Log(output);
      } else {
         if (output == 0.0 || output == 1.0)
            error = DBL_MAX;
         else
            error -= target * TMath::Log(output / target) + (1-target) * TMath::Log((1 - output)/(1 - target));
      }
   }
   return error;
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::GetCrossEntropy() const
{
   // Cross entropy error for a softmax output neuron, for a given event
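   // Computed as E = - sum_i [t_i ln(o_i/t_i)]; terms with t_i = 0
   // contribute nothing to the error.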
   Double_t error = 0;
   for (Int_t i = 0; i < fLastLayer.GetEntriesFast(); i++) {
      TNeuron *neuron = (TNeuron *) fLastLayer[i];
      Double_t output = neuron->GetValue();     // softmax output and target
      Double_t target = neuron->GetTarget();    // values lie in [0,1]
      if (target > DBL_EPSILON) {               // (target == 0) => dE = 0
         if (output == 0.0)
            error = DBL_MAX;
         else
            error -= target * TMath::Log(output / target);
      }
   }
   return error;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::ComputeDEDw() const
{
   // Compute the DEDw = sum on all training events of dedw for each weight
   // normalized by the number of events.
   Int_t i,j;
   Int_t nentries = fSynapses.GetEntriesFast();
   TSynapse *synapse;
   for (i=0;i<nentries;i++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(i);
      synapse->SetDEDw(0.);
   }
   TNeuron *neuron;
   nentries = fNetwork.GetEntriesFast();
   for (i=0;i<nentries;i++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(i);
      neuron->SetDEDw(0.);
   }
   Double_t eventWeight = 1.;
   if (fTraining) {
      Int_t nEvents = fTraining->GetN();
      for (i = 0; i < nEvents; i++) {
         GetEntry(fTraining->GetEntry(i));
         eventWeight = fEventWeight->EvalInstance();
         eventWeight *= fCurrentTreeWeight;
         nentries = fSynapses.GetEntriesFast();
         for (j=0;j<nentries;j++) {
            synapse = (TSynapse *) fSynapses.UncheckedAt(j);
            synapse->SetDEDw(synapse->GetDEDw() + (synapse->GetDeDw()*eventWeight));
         }
         nentries = fNetwork.GetEntriesFast();
         for (j=0;j<nentries;j++) {
            neuron = (TNeuron *) fNetwork.UncheckedAt(j);
            neuron->SetDEDw(neuron->GetDEDw() + (neuron->GetDeDw()*eventWeight));
         }
      }
      nentries = fSynapses.GetEntriesFast();
      for (j=0;j<nentries;j++) {
         synapse = (TSynapse *) fSynapses.UncheckedAt(j);
         synapse->SetDEDw(synapse->GetDEDw() / (Double_t) nEvents);
      }
      nentries = fNetwork.GetEntriesFast();
      for (j=0;j<nentries;j++) {
         neuron = (TNeuron *) fNetwork.UncheckedAt(j);
         neuron->SetDEDw(neuron->GetDEDw() / (Double_t) nEvents);
      }
   } else if (fData) {
      Int_t nEvents = (Int_t) fData->GetEntries();
      for (i = 0; i < nEvents; i++) {
         GetEntry(i);
         eventWeight = fEventWeight->EvalInstance();
         eventWeight *= fCurrentTreeWeight;
         nentries = fSynapses.GetEntriesFast();
         for (j=0;j<nentries;j++) {
            synapse = (TSynapse *) fSynapses.UncheckedAt(j);
            synapse->SetDEDw(synapse->GetDEDw() + (synapse->GetDeDw()*eventWeight));
         }
         nentries = fNetwork.GetEntriesFast();
         for (j=0;j<nentries;j++) {
            neuron = (TNeuron *) fNetwork.UncheckedAt(j);
            neuron->SetDEDw(neuron->GetDEDw() + (neuron->GetDeDw()*eventWeight));
         }
      }
      nentries = fSynapses.GetEntriesFast();
      for (j=0;j<nentries;j++) {
         synapse = (TSynapse *) fSynapses.UncheckedAt(j);
         synapse->SetDEDw(synapse->GetDEDw() / (Double_t) nEvents);
      }
      nentries = fNetwork.GetEntriesFast();
      for (j=0;j<nentries;j++) {
         neuron = (TNeuron *) fNetwork.UncheckedAt(j);
         neuron->SetDEDw(neuron->GetDEDw() / (Double_t) nEvents);
      }
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::Randomize() const
{
   // Randomize the weights
   Int_t nentries = fSynapses.GetEntriesFast();
   Int_t j;
   TSynapse *synapse;
   TNeuron *neuron;
   TTimeStamp ts;
   TRandom3 gen(ts.GetSec());
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      synapse->SetWeight(gen.Rndm() - 0.5);
   }
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      neuron->SetWeight(gen.Rndm() - 0.5);
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::AttachData()
{
   // Connects the TTree to Neurons in the input and output
   // layers. The formulas associated with each neuron are created
   // and reported to the network formula manager.
   // By default, the branch is not normalised since this would degrade
   // performance for classification jobs.
   // Normalisation can be requested by putting '@' in front of the formula.
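   // For instance, with the hypothetical layout "@x,y:10:type", the input
   // x is normalised while y and the output type are not.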
   Int_t j = 0;
   TNeuron *neuron = 0;
   Bool_t normalize = false;
   fManager = new TTreeFormulaManager;

   // Set the size of the internal array of parameters of the formula
   Int_t maxop, maxpar, maxconst;
   ROOT::v5::TFormula::GetMaxima(maxop, maxpar, maxconst);
   ROOT::v5::TFormula::SetMaxima(10, 10, 10);
   
   //first layer
   const TString input = TString(fStructure(0, fStructure.First(':')));
   const TObjArray *inpL = input.Tokenize(", ");
   Int_t nentries = fFirstLayer.GetEntriesFast();
   // make sure nentries == entries in inpL
   R__ASSERT(nentries == inpL->GetLast()+1);
   for (j=0;j<nentries;j++) {
      normalize = false;
      const TString brName = ((TObjString *)inpL->At(j))->GetString();
      neuron = (TNeuron *) fFirstLayer.UncheckedAt(j);
      if (brName[0]=='@')
         normalize = true;
      fManager->Add(neuron->UseBranch(fData,brName.Data() + (normalize?1:0)));
      if(!normalize) neuron->SetNormalisation(0., 1.);
   }
   delete inpL;

   // last layer
   TString output = TString(
           fStructure(fStructure.Last(':') + 1,
                      fStructure.Length() - fStructure.Last(':')));
   const TObjArray *outL = output.Tokenize(", ");
   nentries = fLastLayer.GetEntriesFast();
   // make sure nentries == entries in outL
   R__ASSERT(nentries == outL->GetLast()+1);
   for (j=0;j<nentries;j++) {
      normalize = false;
      const TString brName = ((TObjString *)outL->At(j))->GetString();
      neuron = (TNeuron *) fLastLayer.UncheckedAt(j);
      if (brName[0]=='@')
         normalize = true;
      fManager->Add(neuron->UseBranch(fData,brName.Data() + (normalize?1:0)));
      if(!normalize) neuron->SetNormalisation(0., 1.);
   }
   delete outL;

   fManager->Add((fEventWeight = new TTreeFormula("NNweight",fWeight.Data(),fData)));
   //fManager->Sync();

   // Set the old values
   ROOT::v5::TFormula::SetMaxima(maxop, maxpar, maxconst);
}

//______________________________________________________________________________
void TMultiLayerPerceptron::ExpandStructure()
{
   // Expand the structure of the first layer
   TString input  = TString(fStructure(0, fStructure.First(':')));
   const TObjArray *inpL = input.Tokenize(", ");
   Int_t nneurons = inpL->GetLast()+1;

   TString hiddenAndOutput = TString(
         fStructure(fStructure.First(':') + 1,
                    fStructure.Length() - fStructure.First(':')));
   TString newInput;
   Int_t i = 0;
   // loop on input neurons
   for (i = 0; i<nneurons; i++) {
      const TString name = ((TObjString *)inpL->At(i))->GetString();
      TTreeFormula f("sizeTestFormula",name,fData);
      // Variable size arrays are unreliable
      if(f.GetMultiplicity()==1 && f.GetNdata()>1) {
         Warning("TMultiLayerPerceptron::ExpandStructure()","Variable size arrays cannot be used to build an input layer implicitly. The index 0 will be assumed.");
      }
      // Check if we are dealing with an array... then expand.
      // The array operator used is {}. It is detected in TNeuron, and
      // passed directly as the instance index of the TTreeFormula,
      // so that complex compounds made of arrays can be used without
      // parsing the details.
      else if(f.GetNdata()>1) {
         for(Int_t j=0; j<f.GetNdata(); j++) {
            if(i||j) newInput += ",";
            newInput += name;
            newInput += "{";
            newInput += j;
            newInput += "}";
         }
         continue;
      }
      if(i) newInput += ",";
      newInput += name;
   }
   delete inpL;

   // Save the result
   fStructure = newInput + ":" + hiddenAndOutput;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::BuildNetwork()
{
   // Instantiates the network from the description
   ExpandStructure();
   TString input  = TString(fStructure(0, fStructure.First(':')));
   TString hidden = TString(
           fStructure(fStructure.First(':') + 1,
                      fStructure.Last(':') - fStructure.First(':') - 1));
   TString output = TString(
           fStructure(fStructure.Last(':') + 1,
                      fStructure.Length() - fStructure.Last(':')));
   Int_t bll = atoi(TString(
           hidden(hidden.Last(':') + 1,
                  hidden.Length() - (hidden.Last(':') + 1))).Data());
   if (input.Length() == 0) {
      Error("BuildNetwork()","malformed structure. No input layer.");
      return;
   }
   if (output.Length() == 0) {
      Error("BuildNetwork()","malformed structure. No output layer.");
      return;
   }
   BuildFirstLayer(input);
   BuildHiddenLayers(hidden);
   BuildLastLayer(output, bll);
}

//______________________________________________________________________________
void TMultiLayerPerceptron::BuildFirstLayer(TString & input)
{
   // Instantiates the neurons in input
   // Inputs are normalised and the type is set to kOff
   // (simple forward of the formula value)

   const TObjArray *inpL = input.Tokenize(", ");
   const Int_t nneurons =inpL->GetLast()+1;
   TNeuron *neuron = 0;
   Int_t i = 0;
   for (i = 0; i<nneurons; i++) {
      const TString name = ((TObjString *)inpL->At(i))->GetString();
      neuron = new TNeuron(TNeuron::kOff, name);
      fFirstLayer.AddLast(neuron);
      fNetwork.AddLast(neuron);
   }
   delete inpL;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::BuildHiddenLayers(TString & hidden)
{
   // Builds hidden layers.
   Int_t beg = 0;
   Int_t end = hidden.Index(":", beg + 1);
   Int_t prevStart = 0;
   Int_t prevStop = fNetwork.GetEntriesFast();
   Int_t layer = 1;
   while (end != -1) {
      BuildOneHiddenLayer(hidden(beg, end - beg), layer, prevStart, prevStop, false);
      beg = end + 1;
      end = hidden.Index(":", beg + 1);
   }

   BuildOneHiddenLayer(hidden(beg, hidden.Length() - beg), layer, prevStart, prevStop, true);
}

//______________________________________________________________________________
void TMultiLayerPerceptron::BuildOneHiddenLayer(const TString& sNumNodes, Int_t& layer,
                                                  Int_t& prevStart, Int_t& prevStop,
                                                  Bool_t lastLayer)
{
   // Builds a hidden layer, updates the number of layers.
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   TString name;
   if (!sNumNodes.IsAlnum() || sNumNodes.IsAlpha()) {
      Error("BuildOneHiddenLayer",
            "The specification '%s' for hidden layer %d must contain only numbers!",
            sNumNodes.Data(), layer - 1);
   } else {
      Int_t num = atoi(sNumNodes.Data());
      for (Int_t i = 0; i < num; i++) {
         name.Form("HiddenL%d:N%d",layer,i);
         neuron = new TNeuron(fType, name, "", (const char*)fextF, (const char*)fextD);
         fNetwork.AddLast(neuron);
         for (Int_t j = prevStart; j < prevStop; j++) {
            synapse = new TSynapse((TNeuron *) fNetwork[j], neuron);
            fSynapses.AddLast(synapse);
         }
      }

      if (!lastLayer) {
         // tell each neuron which ones are in its own layer (for Softmax)
         Int_t nEntries = fNetwork.GetEntriesFast();
         for (Int_t i = prevStop; i < nEntries; i++) {
            neuron = (TNeuron *) fNetwork[i];
            for (Int_t j = prevStop; j < nEntries; j++)
               neuron->AddInLayer((TNeuron *) fNetwork[j]);
         }
      }

      prevStart = prevStop;
      prevStop = fNetwork.GetEntriesFast();
      layer++;
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::BuildLastLayer(TString & output, Int_t prev)
{
   // Builds the output layer
   // Neurons are linear combinations of input, by defaul.
   // If the structure ends with "!", neurons are set up for classification,
   // ie. with a sigmoid (1 neuron) or softmax (more neurons) activation function.

   Int_t nneurons = output.CountChar(',')+1;
   if (fStructure.EndsWith("!")) {
      fStructure = TString(fStructure(0, fStructure.Length() - 1));  // remove "!"
      if (nneurons == 1)
         fOutType = TNeuron::kSigmoid;
      else
         fOutType = TNeuron::kSoftmax;
   }
   Int_t prevStop = fNetwork.GetEntriesFast();
   Int_t prevStart = prevStop - prev;
   Ssiz_t pos = 0;
   TNeuron *neuron;
   TSynapse *synapse;
   TString name;
   Int_t i,j;
   for (i = 0; i<nneurons; i++) {
      Ssiz_t nextpos=output.Index(",",pos);
      if (nextpos!=kNPOS)
         name=output(pos,nextpos-pos);
      else name=output(pos,output.Length());
      pos = nextpos + 1;
      neuron = new TNeuron(fOutType, name);
      for (j = prevStart; j < prevStop; j++) {
         synapse = new TSynapse((TNeuron *) fNetwork[j], neuron);
         fSynapses.AddLast(synapse);
      }
      fLastLayer.AddLast(neuron);
      fNetwork.AddLast(neuron);
   }
   // tell each neuron which ones are in its own layer (for Softmax)
   Int_t nEntries = fNetwork.GetEntriesFast();
   for (i = prevStop; i < nEntries; i++) {
      neuron = (TNeuron *) fNetwork[i];
      for (j = prevStop; j < nEntries; j++)
         neuron->AddInLayer((TNeuron *) fNetwork[j]);
   }

}

//______________________________________________________________________________
void TMultiLayerPerceptron::DrawResult(Int_t index, Option_t * option) const
{
   // Draws the neural net output.
   // It produces a histogram with the output for the two datasets.
   // Index is the number of the desired output neuron.
   // "option" can contain:
   // - test or train to select a dataset
   // - comp to produce an X-Y comparison plot
   // - nocanv to avoid creating a new TCanvas for the plot

   TString opt = option;
   opt.ToLower();
   TNeuron *out = (TNeuron *) (fLastLayer.At(index));
   if (!out) {
      Error("DrawResult()","no such output.");
      return;
   }
   //TCanvas *canvas = new TCanvas("NNresult", "Neural Net output");
   if (!opt.Contains("nocanv"))
      new TCanvas("NNresult", "Neural Net output");
   const Double_t *norm = out->GetNormalisation();
   TEventList *events = 0;
   TString setname;
   Int_t i;
   if (opt.Contains("train")) {
      events = fTraining;
      setname = Form("train%d",index);
   } else if (opt.Contains("test")) {
      events = fTest;
      setname = Form("test%d",index);
   }
   if ((!fData) || (!events)) {
      Error("DrawResult()","no dataset.");
      return;
   }
   if (opt.Contains("comp")) {
      //comparison plot
      TString title = "Neural Net Output control. ";
      title += setname;
      setname = "MLP_" + setname + "_comp";
      TH2D *hist = ((TH2D *) gDirectory->Get(setname.Data()));
      if (!hist)
         hist = new TH2D(setname.Data(), title.Data(), 50, -1, 1, 50, -1, 1);
      hist->Reset();
      Int_t nEvents = events->GetN();
      for (i = 0; i < nEvents; i++) {
         GetEntry(events->GetEntry(i));
         hist->Fill(out->GetValue(), (out->GetBranch() - norm[1]) / norm[0]);
      }
      hist->Draw();
   } else {
      //output plot
      TString title = "Neural Net Output. ";
      title += setname;
      setname = "MLP_" + setname;
      TH1D *hist = ((TH1D *) gDirectory->Get(setname.Data()));
      if (!hist)
         hist = new TH1D(setname, title, 50, 1, -1);
      hist->Reset();
      Int_t nEvents = events->GetN();
      for (i = 0; i < nEvents; i++)
         hist->Fill(Result(events->GetEntry(i), index));
      hist->Draw();
      if (opt.Contains("train") && opt.Contains("test")) {
         // the first histogram used the training set;
         // overlay the test set on the same canvas
         events = fTest;
         setname = Form("MLP_test%d", index);
         hist = ((TH1D *) gDirectory->Get(setname.Data()));
         if (!hist)
            hist = new TH1D(setname, title, 50, 1, -1);
         hist->Reset();
         nEvents = events->GetN();
         for (i = 0; i < nEvents; i++)
            hist->Fill(Result(events->GetEntry(i), index));
         hist->Draw("same");
      }
   }
}

//______________________________________________________________________________
Bool_t TMultiLayerPerceptron::DumpWeights(Option_t * filename) const
{
   // Dumps the weights to a text file.
   // Set filename to "-" (default) to dump to the standard output
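   //
   // Usage sketch (assuming a trained network "mlp"):
   //    mlp->DumpWeights("weights.txt");  // write to a text file
   //    mlp->DumpWeights();               // "-": write to standard output
   // The format is: input normalisations, output normalisations, then the
   // neuron and synapse weights, one value per line, with "#" header lines.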
   TString filen = filename;
   std::ostream * output;
   if (filen == "") {
      Error("TMultiLayerPerceptron::DumpWeights()","Invalid file name");
      return kFALSE;
   }
   if (filen == "-")
      output = &std::cout;
   else
      output = new std::ofstream(filen.Data());
   TNeuron *neuron = 0;
   *output << "#input normalization" << std::endl;
   Int_t nentries = fFirstLayer.GetEntriesFast();
   Int_t j=0;
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fFirstLayer.UncheckedAt(j);
      *output << neuron->GetNormalisation()[0] << " "
              << neuron->GetNormalisation()[1] << std::endl;
   }
   *output << "#output normalization" << std::endl;
   nentries = fLastLayer.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fLastLayer.UncheckedAt(j);
      *output << neuron->GetNormalisation()[0] << " "
              << neuron->GetNormalisation()[1] << std::endl;
   }
   *output << "#neurons weights" << std::endl;
   TObjArrayIter *it = (TObjArrayIter *) fNetwork.MakeIterator();
   while ((neuron = (TNeuron *) it->Next()))
      *output << neuron->GetWeight() << std::endl;
   delete it;
   it = (TObjArrayIter *) fSynapses.MakeIterator();
   TSynapse *synapse = 0;
   *output << "#synapses weights" << std::endl;
   while ((synapse = (TSynapse *) it->Next()))
      *output << synapse->GetWeight() << std::endl;
   delete it;
   if (filen != "-") {
      ((std::ofstream *) output)->close();
      delete output;
   }
   return kTRUE;
}

//______________________________________________________________________________
Bool_t TMultiLayerPerceptron::LoadWeights(Option_t * filename)
{
   // Loads the weights from a text file conforming to the format
   // defined by DumpWeights.
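   //
   // Usage sketch (assuming "weights.txt" was written by DumpWeights for a
   // network with the same structure):
   //    mlp->LoadWeights("weights.txt");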
   TString filen = filename;
   Double_t w;
   if (filen == "") {
      Error("TMultiLayerPerceptron::LoadWeights()","Invalid file name");
      return kFALSE;
   }
   char *buff = new char[100];
   std::ifstream input(filen.Data());
   // input normalization
   input.getline(buff, 100);
   TObjArrayIter *it = (TObjArrayIter *) fFirstLayer.MakeIterator();
   Float_t n1,n2;
   TNeuron *neuron = 0;
   while ((neuron = (TNeuron *) it->Next())) {
      input >> n1 >> n2;
      neuron->SetNormalisation(n2,n1);
   }
   input.getline(buff, 100);
   // output normalization
   input.getline(buff, 100);
   delete it;
   it = (TObjArrayIter *) fLastLayer.MakeIterator();
   while ((neuron = (TNeuron *) it->Next())) {
      input >> n1 >> n2;
      neuron->SetNormalisation(n2,n1);
   }
   input.getline(buff, 100);
   // neuron weights
   input.getline(buff, 100);
   delete it;
   it = (TObjArrayIter *) fNetwork.MakeIterator();
   while ((neuron = (TNeuron *) it->Next())) {
      input >> w;
      neuron->SetWeight(w);
   }
   delete it;
   input.getline(buff, 100);
   // synapse weights
   input.getline(buff, 100);
   it = (TObjArrayIter *) fSynapses.MakeIterator();
   TSynapse *synapse = 0;
   while ((synapse = (TSynapse *) it->Next())) {
      input >> w;
      synapse->SetWeight(w);
   }
   delete it;
   delete[] buff;
   return kTRUE;
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::Evaluate(Int_t index, Double_t *params) const
{
   // Returns the Neural Net output for a given set of input parameters.
   // The number of parameters must equal the number of input neurons.
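   //
   // Usage sketch (assuming a network with two input neurons; the values
   // are arbitrary):
   //    Double_t params[2] = {0.5, 1.2};
   //    Double_t out = mlp->Evaluate(0, params);  // first output neuron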

   TObjArrayIter *it = (TObjArrayIter *) fNetwork.MakeIterator();
   TNeuron *neuron;
   while ((neuron = (TNeuron *) it->Next()))
      neuron->SetNewEvent();
   delete it;
   it = (TObjArrayIter *) fFirstLayer.MakeIterator();
   Int_t i=0;
   while ((neuron = (TNeuron *) it->Next()))
      neuron->ForceExternalValue(params[i++]);
   delete it;
   TNeuron *out = (TNeuron *) (fLastLayer.At(index));
   if (out)
      return out->GetValue();
   else
      return 0;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::Export(Option_t * filename, Option_t * language) const
{
   // Exports the NN as a function for any non-ROOT-dependent code.
   // Supported languages are currently C++, FORTRAN and Python.
   // This feature is also useful if you want to plot the NN as
   // a function (TF1 or TF2).
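   //
   // Usage sketch (the base name "myNet" is just an example):
   //    mlp->Export("myNet", "C++");     // writes myNet.h and myNet.cxx
   //    mlp->Export("myNet", "PYTHON");  // writes myNet.py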

   TString lg = language;
   lg.ToUpper();
   Int_t i;
   if(GetType()==TNeuron::kExternal) {
      Warning("TMultiLayerPerceptron::Export","Request to export a network using an external function");
   }
   if (lg == "C++") {
      TString basefilename = filename;
      Int_t slash = basefilename.Last('/')+1;
      if (slash) basefilename = TString(basefilename(slash, basefilename.Length()-slash));

      TString classname = basefilename;
      TString header = filename;
      header += ".h";
      TString source = filename;
      source += ".cxx";
      std::ofstream headerfile(header);
      std::ofstream sourcefile(source);
      headerfile << "#ifndef " << basefilename << "_h" << std::endl;
      headerfile << "#define " << basefilename << "_h" << std::endl << std::endl;
      headerfile << "class " << classname << " { " << std::endl;
      headerfile << "public:" << std::endl;
      headerfile << "   " << classname << "() {}" << std::endl;
      headerfile << "   ~" << classname << "() {}" << std::endl;
      sourcefile << "#include \"" << header << "\"" << std::endl;
      sourcefile << "#include <cmath>" << std::endl << std::endl;
      headerfile << "   double Value(int index";
      sourcefile << "double " << classname << "::Value(int index";
      for (i = 0; i < fFirstLayer.GetEntriesFast(); i++) {
         headerfile << ",double in" << i;
         sourcefile << ",double in" << i;
      }
      headerfile << ");" << std::endl;
      sourcefile << ") {" << std::endl;
      for (i = 0; i < fFirstLayer.GetEntriesFast(); i++)
         sourcefile << "   input" << i << " = (in" << i << " - "
             << ((TNeuron *) fFirstLayer[i])->GetNormalisation()[1] << ")/"
             << ((TNeuron *) fFirstLayer[i])->GetNormalisation()[0] << ";"
             << std::endl;
      sourcefile << "   switch(index) {" << std::endl;
      TNeuron *neuron;
      TObjArrayIter *it = (TObjArrayIter *) fLastLayer.MakeIterator();
      Int_t idx = 0;
      while ((neuron = (TNeuron *) it->Next()))
         sourcefile << "     case " << idx++ << ":" << std::endl
                    << "         return neuron" << neuron << "();" << std::endl;
      sourcefile << "     default:" << std::endl
                 << "         return 0.;" << std::endl << "   }"
                 << std::endl;
      sourcefile << "}" << std::endl << std::endl;
      headerfile << "   double Value(int index, double* input);" << std::endl;
      sourcefile << "double " << classname << "::Value(int index, double* input) {" << std::endl;
      for (i = 0; i < fFirstLayer.GetEntriesFast(); i++)
         sourcefile << "   input" << i << " = (input[" << i << "] - "
             << ((TNeuron *) fFirstLayer[i])->GetNormalisation()[1] << ")/"
             << ((TNeuron *) fFirstLayer[i])->GetNormalisation()[0] << ";"
             << std::endl;
      sourcefile << "   switch(index) {" << std::endl;
      delete it;
      it = (TObjArrayIter *) fLastLayer.MakeIterator();
      idx = 0;
      while ((neuron = (TNeuron *) it->Next()))
         sourcefile << "     case " << idx++ << ":" << std::endl
                    << "         return neuron" << neuron << "();" << std::endl;
      sourcefile << "     default:" << std::endl
                 << "         return 0.;" << std::endl << "   }"
                 << std::endl;
      sourcefile << "}" << std::endl << std::endl;
      headerfile << "private:" << std::endl;
      for (i = 0; i < fFirstLayer.GetEntriesFast(); i++)
         headerfile << "   double input" << i << ";" << std::endl;
      delete it;
      it = (TObjArrayIter *) fNetwork.MakeIterator();
      idx = 0;
      while ((neuron = (TNeuron *) it->Next())) {
         if (!neuron->GetPre(0)) {
            headerfile << "   double neuron" << neuron << "();" << std::endl;
            sourcefile << "double " << classname << "::neuron" << neuron
                       << "() {" << std::endl;
            sourcefile << "   return input" << idx++ << ";" << std::endl;
            sourcefile << "}" << std::endl << std::endl;
         } else {
            headerfile << "   double input" << neuron << "();" << std::endl;
            sourcefile << "double " << classname << "::input" << neuron
                       << "() {" << std::endl;
            sourcefile << "   double input = " << neuron->GetWeight()
                       << ";" << std::endl;
            TSynapse *syn = 0;
            Int_t n = 0;
            while ((syn = neuron->GetPre(n++))) {
               sourcefile << "   input += synapse" << syn << "();" << std::endl;
            }
            sourcefile << "   return input;" << std::endl;
            sourcefile << "}" << std::endl << std::endl;

            headerfile << "   double neuron" << neuron << "();" << std::endl;
            sourcefile << "double " << classname << "::neuron" << neuron << "() {" << std::endl;
            sourcefile << "   double input = input" << neuron << "();" << std::endl;
            switch(neuron->GetType()) {
               case (TNeuron::kSigmoid):
                  {
                     sourcefile << "   return ((input < -709. ? 0. : (1/(1+exp(-input)))) * ";
                     break;
                  }
               case (TNeuron::kLinear):
                  {
                     sourcefile << "   return (input * ";
                     break;
                  }
               case (TNeuron::kTanh):
                  {
                     sourcefile << "   return (tanh(input) * ";
                     break;
                  }
               case (TNeuron::kGauss):
                  {
                     sourcefile << "   return (exp(-input*input) * ";
                     break;
                  }
               case (TNeuron::kSoftmax):
                  {
                     sourcefile << "   return (exp(input) / (";
                     Int_t nn = 0;
                     TNeuron* side = neuron->GetInLayer(nn++);
                     sourcefile << "exp(input" << side << "())";
                     while ((side = neuron->GetInLayer(nn++)))
                        sourcefile << " + exp(input" << side << "())";
                     sourcefile << ") * ";
                     break;
                  }
               default:
                  {
                     sourcefile << "   return (0.0 * ";
                  }
            }
            sourcefile << neuron->GetNormalisation()[0] << ")+" ;
            sourcefile << neuron->GetNormalisation()[1] << ";" << std::endl;
            sourcefile << "}" << std::endl << std::endl;
         }
      }
      delete it;
      TSynapse *synapse = 0;
      it = (TObjArrayIter *) fSynapses.MakeIterator();
      while ((synapse = (TSynapse *) it->Next())) {
         headerfile << "   double synapse" << synapse << "();" << std::endl;
         sourcefile << "double " << classname << "::synapse"
                    << synapse << "() {" << std::endl;
         sourcefile << "   return (neuron" << synapse->GetPre()
                    << "()*" << synapse->GetWeight() << ");" << std::endl;
         sourcefile << "}" << std::endl << std::endl;
      }
      delete it;
      headerfile << "};" << std::endl << std::endl;
      headerfile << "#endif // " << basefilename << "_h" << std::endl << std::endl;
      headerfile.close();
      sourcefile.close();
      std::cout << header << " and " << source << " created." << std::endl;
   }
   else if(lg == "FORTRAN") {
      TString implicit = "      implicit double precision (a-h,n-z)\n";
      std::ofstream sigmoid("sigmoid.f");
      sigmoid << "      double precision FUNCTION SIGMOID(X)" << std::endl
              << implicit
              << "      IF(X.GT.37.) THEN"                    << std::endl
              << "         SIGMOID = 1."                      << std::endl
              << "      ELSE IF(X.LT.-709.) THEN"             << std::endl
              << "         SIGMOID = 0."                      << std::endl
              << "      ELSE"                                 << std::endl
              << "         SIGMOID = 1./(1.+EXP(-X))"         << std::endl
              << "      ENDIF"                                << std::endl
              << "      END"                                  << std::endl;
      sigmoid.close();
      TString source = filename;
      source += ".f";
      std::ofstream sourcefile(source);

      // Header
      sourcefile << "      double precision function " << filename
                 << "(x, index)" << std::endl;
      sourcefile << implicit;
      sourcefile << "      double precision x(" <<
      fFirstLayer.GetEntriesFast() << ")" << std::endl << std::endl;

      // Last layer
      sourcefile << "C --- Last Layer" << std::endl;
      TNeuron *neuron;
      TObjArrayIter *it = (TObjArrayIter *) fLastLayer.MakeIterator();
      Int_t idx = 0;
      TString ifelseif = "      if (index.eq.";
      while ((neuron = (TNeuron *) it->Next())) {
         sourcefile << ifelseif.Data() << idx++ << ") then" << std::endl
                    << "          " << filename
                    << "=neuron" << neuron << "(x);" << std::endl;
         ifelseif = "      else if (index.eq.";
      }
      sourcefile << "      else" << std::endl
                 << "          " << filename << "=0.d0" << std::endl
                 << "      endif" << std::endl;
      sourcefile << "      end" << std::endl;

      // Network
      sourcefile << "C --- First and Hidden layers" << std::endl;
      delete it;
      it = (TObjArrayIter *) fNetwork.MakeIterator();
      idx = 0;
      while ((neuron = (TNeuron *) it->Next())) {
         sourcefile << "      double precision function neuron"
                    << neuron << "(x)" << std::endl
                    << implicit;
         sourcefile << "      double precision x("
                    << fFirstLayer.GetEntriesFast() << ")" << std::endl << std::endl;
         if (!neuron->GetPre(0)) {
            sourcefile << "      neuron" << neuron
             << " = (x(" << idx+1 << ") - "
             << ((TNeuron *) fFirstLayer[idx])->GetNormalisation()[1]
             << "d0)/"
             << ((TNeuron *) fFirstLayer[idx])->GetNormalisation()[0]
             << "d0" << std::endl;
            idx++;
         } else {
            sourcefile << "      neuron" << neuron
                       << " = " << neuron->GetWeight() << "d0" << std::endl;
            TSynapse *syn;
            Int_t n = 0;
            while ((syn = neuron->GetPre(n++)))
               sourcefile << "      neuron" << neuron
                              << " = neuron" << neuron
                          << " + synapse" << syn << "(x)" << std::endl;
            switch(neuron->GetType()) {
               case (TNeuron::kSigmoid):
                  {
                     sourcefile << "      neuron" << neuron
                                << "= (sigmoid(neuron" << neuron << ")*";
                     break;
                  }
               case (TNeuron::kLinear):
                  {
                     break;
                  }
               case (TNeuron::kTanh):
                  {
                     sourcefile << "      neuron" << neuron
                                << "= (tanh(neuron" << neuron << ")*";
                     break;
                  }
               case (TNeuron::kGauss):
                  {
                     sourcefile << "      neuron" << neuron
                                << "= (exp(-neuron" << neuron << "*neuron"
                                << neuron << "))*";
                     break;
                  }
               case (TNeuron::kSoftmax):
                  {
                     Int_t nn = 0;
                     TNeuron* side = neuron->GetInLayer(nn++);
                     sourcefile << "      div = exp(neuron" << side << "())" << std::endl;
                     while ((side = neuron->GetInLayer(nn++)))
                        sourcefile << "      div = div + exp(neuron" << side << "())" << std::endl;
                     sourcefile << "      neuron"  << neuron ;
                     sourcefile << "= (exp(neuron" << neuron << ") / div * ";
                     break;
                  }
               default:
                  {
                     sourcefile << "   neuron " << neuron << "= 0.";
                  }
            }
            sourcefile << neuron->GetNormalisation()[0] << "d0)+" ;
            sourcefile << neuron->GetNormalisation()[1] << "d0" << std::endl;
         }
         sourcefile << "      end" << std::endl;
      }
      delete it;

      // Synapses
      sourcefile << "C --- Synapses" << std::endl;
      TSynapse *synapse = 0;
      it = (TObjArrayIter *) fSynapses.MakeIterator();
      while ((synapse = (TSynapse *) it->Next())) {
         sourcefile << "      double precision function " << "synapse"
                    << synapse << "(x)\n" << implicit;
         sourcefile << "      double precision x("
                    << fFirstLayer.GetEntriesFast() << ")" << std::endl << std::endl;
         sourcefile << "      synapse" << synapse
                    << "=neuron" << synapse->GetPre()
                    << "(x)*" << synapse->GetWeight() << "d0" << std::endl;
         sourcefile << "      end" << std::endl << std::endl;
      }
      delete it;
      sourcefile.close();
      std::cout << source << " created." << std::endl;
   }
   else if(lg == "PYTHON") {
      TString classname = filename;
      TString pyfile = filename;
      pyfile += ".py";
      std::ofstream pythonfile(pyfile);
      pythonfile << "from math import exp" << std::endl << std::endl;
      pythonfile << "from math import tanh" << std::endl << std::endl;
      pythonfile << "class " << classname << ":" << std::endl;
      pythonfile << "\tdef value(self,index";
      for (i = 0; i < fFirstLayer.GetEntriesFast(); i++) {
         pythonfile << ",in" << i;
      }
      pythonfile << "):" << std::endl;
      for (i = 0; i < fFirstLayer.GetEntriesFast(); i++)
         pythonfile << "\t\tself.input" << i << " = (in" << i << " - "
             << ((TNeuron *) fFirstLayer[i])->GetNormalisation()[1] << ")/"
             << ((TNeuron *) fFirstLayer[i])->GetNormalisation()[0] << std::endl;
      TNeuron *neuron;
      TObjArrayIter *it = (TObjArrayIter *) fLastLayer.MakeIterator();
      Int_t idx = 0;
      while ((neuron = (TNeuron *) it->Next()))
         pythonfile << "\t\tif index==" << idx++
                    << ": return self.neuron" << neuron << "();" << std::endl;
      pythonfile << "\t\treturn 0." << std::endl;
      delete it;
      it = (TObjArrayIter *) fNetwork.MakeIterator();
      idx = 0;
      while ((neuron = (TNeuron *) it->Next())) {
         pythonfile << "\tdef neuron" << neuron << "(self):" << std::endl;
         if (!neuron->GetPre(0))
            pythonfile << "\t\treturn self.input" << idx++ << std::endl;
         else {
            pythonfile << "\t\tinput = " << neuron->GetWeight() << std::endl;
            TSynapse *syn;
            Int_t n = 0;
            while ((syn = neuron->GetPre(n++)))
               pythonfile << "\t\tinput = input + self.synapse"
                          << syn << "()" << std::endl;
            switch(neuron->GetType()) {
               case (TNeuron::kSigmoid):
                  {
                     pythonfile << "\t\tif input<-709. : return " << neuron->GetNormalisation()[1] << std::endl;
                     pythonfile << "\t\treturn ((1/(1+exp(-input)))*";
                     break;
                  }
               case (TNeuron::kLinear):
                  {
                     pythonfile << "\t\treturn (input*";
                     break;
                  }
               case (TNeuron::kTanh):
                  {
                     pythonfile << "\t\treturn (tanh(input)*";
                     break;
                  }
               case (TNeuron::kGauss):
                  {
                     pythonfile << "\t\treturn (exp(-input*input)*";
                     break;
                  }
               case (TNeuron::kSoftmax):
                  {
                     pythonfile << "\t\treturn (exp(input) / (";
                     Int_t nn = 0;
                     TNeuron* side = neuron->GetInLayer(nn++);
                     pythonfile << "exp(self.neuron" << side << "())";
                     while ((side = neuron->GetInLayer(nn++)))
                        pythonfile << " + exp(self.neuron" << side << "())";
                     pythonfile << ") * ";
                     break;
                  }
               default:
                  {
                     pythonfile << "\t\treturn 0.";
                  }
            }
            pythonfile << neuron->GetNormalisation()[0] << ")+" ;
            pythonfile << neuron->GetNormalisation()[1] << std::endl;
         }
      }
      delete it;
      TSynapse *synapse = 0;
      it = (TObjArrayIter *) fSynapses.MakeIterator();
      while ((synapse = (TSynapse *) it->Next())) {
         pythonfile << "\tdef synapse" << synapse << "(self):" << std::endl;
         pythonfile << "\t\treturn (self.neuron" << synapse->GetPre()
                    << "()*" << synapse->GetWeight() << ")" << std::endl;
      }
      delete it;
      pythonfile.close();
      std::cout << pyfile << " created." << std::endl;
   }
}

//______________________________________________________________________________
void TMultiLayerPerceptron::Shuffle(Int_t * index, Int_t n) const
{
   // Shuffles the Int_t array index[n] in place.
   // Input:
   //   index: the array to shuffle
   //   n: the size of the array
   // Output:
   //   index: the shuffled indexes
   // This method is used for stochastic training.
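   //
   // Each entry is swapped with a randomly chosen partner (a permutation
   // sketch: {0,1,2,3} might become {2,0,3,1}); the generator is seeded
   // with the current time, so successive calls differ.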

   TTimeStamp ts;
   TRandom3 rnd(ts.GetSec());
   Int_t j, k;
   Int_t a = n - 1;
   for (Int_t i = 0; i < n; i++) {
      j = (Int_t) (rnd.Rndm() * a);
      k = index[j];
      index[j] = index[i];
      index[i] = k;
   }
   return;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::MLP_Stochastic(Double_t * buffer)
{
   // One step for the stochastic method
   // buffer should contain the previous dw vector and will be updated
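   //
   // Each weight w is moved according to the update rule implemented below:
   //    dw = -fEta * (dE/dw + fDelta) + fEpsilon * dw_previous
   //    w += dw
   // where dw_previous is the corresponding entry of buffer.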

   Int_t nEvents = fTraining->GetN();
   Int_t *index = new Int_t[nEvents];
   Int_t i,j,nentries;
   for (i = 0; i < nEvents; i++)
      index[i] = i;
   fEta *= fEtaDecay;
   Shuffle(index, nEvents);
   TNeuron *neuron;
   TSynapse *synapse;
   for (i = 0; i < nEvents; i++) {
      GetEntry(fTraining->GetEntry(index[i]));
      // First compute DeDw for all neurons: force calculation before
      // modifying the weights.
      nentries = fFirstLayer.GetEntriesFast();
      for (j=0;j<nentries;j++) {
         neuron = (TNeuron *) fFirstLayer.UncheckedAt(j);
         neuron->GetDeDw();
      }
      Int_t cnt = 0;
      // Step for all neurons
      nentries = fNetwork.GetEntriesFast();
      for (j=0;j<nentries;j++) {
         neuron = (TNeuron *) fNetwork.UncheckedAt(j);
         buffer[cnt] = (-fEta) * (neuron->GetDeDw() + fDelta)
                       + fEpsilon * buffer[cnt];
         neuron->SetWeight(neuron->GetWeight() + buffer[cnt++]);
      }
      // Step for all synapses
      nentries = fSynapses.GetEntriesFast();
      for (j=0;j<nentries;j++) {
         synapse = (TSynapse *) fSynapses.UncheckedAt(j);
         buffer[cnt] = (-fEta) * (synapse->GetDeDw() + fDelta)
                       + fEpsilon * buffer[cnt];
         synapse->SetWeight(synapse->GetWeight() + buffer[cnt++]);
      }
   }
   delete[]index;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::MLP_Batch(Double_t * buffer)
{
   // One step for the batch method.
   // DEDw should have been updated before calling this.

   fEta *= fEtaDecay;
   Int_t cnt = 0;
   TObjArrayIter *it = (TObjArrayIter *) fNetwork.MakeIterator();
   TNeuron *neuron = 0;
   // Step for all neurons
   while ((neuron = (TNeuron *) it->Next())) {
      buffer[cnt] = (-fEta) * (neuron->GetDEDw() + fDelta)
                    + fEpsilon * buffer[cnt];
      neuron->SetWeight(neuron->GetWeight() + buffer[cnt++]);
   }
   delete it;
   it = (TObjArrayIter *) fSynapses.MakeIterator();
   TSynapse *synapse = 0;
   // Step for all synapses
   while ((synapse = (TSynapse *) it->Next())) {
      buffer[cnt] = (-fEta) * (synapse->GetDEDw() + fDelta)
                    + fEpsilon * buffer[cnt];
      synapse->SetWeight(synapse->GetWeight() + buffer[cnt++]);
   }
   delete it;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::MLP_Line(Double_t * origin, Double_t * dir, Double_t dist)
{
   // Sets the weights to a point along a line
   // Weights are set to [origin + (dist * dir)].

   Int_t idx = 0;
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   TObjArrayIter *it = (TObjArrayIter *) fNetwork.MakeIterator();
   while ((neuron = (TNeuron *) it->Next())) {
      neuron->SetWeight(origin[idx] + (dir[idx] * dist));
      idx++;
   }
   delete it;
   it = (TObjArrayIter *) fSynapses.MakeIterator();
   while ((synapse = (TSynapse *) it->Next())) {
      synapse->SetWeight(origin[idx] + (dir[idx] * dist));
      idx++;
   }
   delete it;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SteepestDir(Double_t * dir)
{
   // Sets the search direction to steepest descent.
   Int_t idx = 0;
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   TObjArrayIter *it = (TObjArrayIter *) fNetwork.MakeIterator();
   while ((neuron = (TNeuron *) it->Next()))
      dir[idx++] = -neuron->GetDEDw();
   delete it;
   it = (TObjArrayIter *) fSynapses.MakeIterator();
   while ((synapse = (TSynapse *) it->Next()))
      dir[idx++] = -synapse->GetDEDw();
   delete it;
}

//______________________________________________________________________________
bool TMultiLayerPerceptron::LineSearch(Double_t * direction, Double_t * buffer)
{
   // Searches along the line defined by direction.
   // buffer is not used as input, but is updated with the new dw
   // so that it can be used by a later stochastic step.
   // It returns true if the line search fails.
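   //
   // Once a bracketing triplet (alpha1, alpha2, alpha3) with
   // Error(alpha1) > Error(alpha2) < Error(alpha3) is found, the step is
   // set to the minimum of the parabola through the three points:
   //    alpha_min = 0.5 * (alpha1 + alpha3 -
   //                (err3 - err1) / ((err3 - err2)/(alpha3 - alpha2)
   //                               - (err2 - err1)/(alpha2 - alpha1)))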

   Int_t idx = 0;
   Int_t j,nentries;
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   // store weights before line search
   Double_t *origin = new Double_t[fNetwork.GetEntriesFast() +
                                   fSynapses.GetEntriesFast()];
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      origin[idx++] = neuron->GetWeight();
   }
   nentries = fSynapses.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      origin[idx++] = synapse->GetWeight();
   }
   // try to find a triplet (alpha1, alpha2, alpha3) such that
   // Error(alpha1)>Error(alpha2)<Error(alpha3)
   Double_t err1 = GetError(kTraining);
   Double_t alpha1 = 0.;
   Double_t alpha2 = fLastAlpha;
   if (alpha2 < 0.01)
      alpha2 = 0.01;
   if (alpha2 > 2.0)
      alpha2 = 2.0;
   Double_t alpha3 = alpha2;
   MLP_Line(origin, direction, alpha2);
   Double_t err2 = GetError(kTraining);
   Double_t err3 = err2;
   Bool_t bingo = false;
   Int_t icount;
   if (err1 > err2) {
      for (icount = 0; icount < 100; icount++) {
         alpha3 *= fTau;
         MLP_Line(origin, direction, alpha3);
         err3 = GetError(kTraining);
         if (err3 > err2) {
            bingo = true;
            break;
         }
         alpha1 = alpha2;
         err1 = err2;
         alpha2 = alpha3;
         err2 = err3;
      }
      if (!bingo) {
         MLP_Line(origin, direction, 0.);
         delete[]origin;
         return true;
      }
   } else {
      for (icount = 0; icount < 100; icount++) {
         alpha2 /= fTau;
         MLP_Line(origin, direction, alpha2);
         err2 = GetError(kTraining);
         if (err1 > err2) {
            bingo = true;
            break;
         }
         alpha3 = alpha2;
         err3 = err2;
      }
      if (!bingo) {
         MLP_Line(origin, direction, 0.);
         delete[]origin;
         fLastAlpha = 0.05;
         return true;
      }
   }
   // Sets the weights to the bottom of the parabola
   fLastAlpha = 0.5 * (alpha1 + alpha3 -
                (err3 - err1) / ((err3 - err2) / (alpha3 - alpha2)
                - (err2 - err1) / (alpha2 - alpha1)));
   fLastAlpha = fLastAlpha < 10000 ? fLastAlpha : 10000;
   MLP_Line(origin, direction, fLastAlpha);
   GetError(kTraining);
   // Stores weight changes (can be used by a later stochastic step)
   idx = 0;
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      buffer[idx] = neuron->GetWeight() - origin[idx];
      idx++;
   }
   nentries = fSynapses.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      buffer[idx] = synapse->GetWeight() - origin[idx];
      idx++;
   }
   delete[]origin;
   return false;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::ConjugateGradientsDir(Double_t * dir, Double_t beta)
{
   // Sets the search direction to conjugate gradient direction
   // beta should be:
   //  ||g_{(t+1)}||^2 / ||g_{(t)}||^2                   (Fletcher-Reeves)
   //  g_{(t+1)} (g_{(t+1)}-g_{(t)}) / ||g_{(t)}||^2     (Ribiere-Polak)
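   //
   // The new direction is then dir_{(t+1)} = -g_{(t+1)} + beta * dir_{(t)},
   // as implemented below.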

   Int_t idx = 0;
   Int_t j,nentries;
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      dir[idx] = -neuron->GetDEDw() + beta * dir[idx];
      idx++;
   }
   nentries = fSynapses.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      dir[idx] = -synapse->GetDEDw() + beta * dir[idx];
      idx++;
   }
}

//______________________________________________________________________________
bool TMultiLayerPerceptron::GetBFGSH(TMatrixD & bfgsh, TMatrixD & gamma, TMatrixD & delta)
{
   // Computes the inverse-Hessian estimate using the BFGS update algorithm,
   // from gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}).
   // It returns true if the update cannot be computed
   // (i.e. if gamma and delta are orthogonal).
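   //
   // With a = 1/(gamma^T delta), the update implemented below reads:
   //    bfgsh += a * [ (1 + a * gamma^T bfgsh gamma) * delta delta^T
   //                   - delta (gamma^T bfgsh) - (bfgsh gamma) delta^T ]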

   TMatrixD gd(gamma, TMatrixD::kTransposeMult, delta);
   if ((Double_t) gd[0][0] == 0.)
      return true;
   TMatrixD aHg(bfgsh, TMatrixD::kMult, gamma);
   TMatrixD tmp(gamma, TMatrixD::kTransposeMult, bfgsh);
   TMatrixD gHg(gamma, TMatrixD::kTransposeMult, aHg);
   Double_t a = 1 / (Double_t) gd[0][0];
   Double_t f = 1 + ((Double_t) gHg[0][0] * a);
   TMatrixD res( TMatrixD(delta, TMatrixD::kMult,
                TMatrixD(TMatrixD::kTransposed, delta)));
   res *= f;
   res -= (TMatrixD(delta, TMatrixD::kMult, tmp) +
           TMatrixD(aHg, TMatrixD::kMult,
                   TMatrixD(TMatrixD::kTransposed, delta)));
   res *= a;
   bfgsh += res;
   return false;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::SetGammaDelta(TMatrixD & gamma, TMatrixD & delta,
                                          Double_t * buffer)
{
   // Sets the gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}) vectors.
   // Gamma is computed here (ComputeDEDw is called internally), so ComputeDEDw
   // must not have been called just before; delta is a direct translation of
   // buffer into a TMatrixD.

   Int_t els = fNetwork.GetEntriesFast() + fSynapses.GetEntriesFast();
   Int_t idx = 0;
   Int_t j,nentries;
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      gamma[idx++][0] = -neuron->GetDEDw();
   }
   nentries = fSynapses.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      gamma[idx++][0] = -synapse->GetDEDw();
   }
   for (Int_t i = 0; i < els; i++)
      delta[i] = buffer[i];
   //delta.SetElements(buffer,"F");
   ComputeDEDw();
   idx = 0;
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      gamma[idx++][0] += neuron->GetDEDw();
   }
   nentries = fSynapses.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      gamma[idx++][0] += synapse->GetDEDw();
   }
}

//______________________________________________________________________________
Double_t TMultiLayerPerceptron::DerivDir(Double_t * dir)
{
   // scalar product between gradient and direction
   // = derivative along direction

   Int_t idx = 0;
   Int_t j,nentries;
   Double_t output = 0;
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      output += neuron->GetDEDw() * dir[idx++];
   }
   nentries = fSynapses.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      output += synapse->GetDEDw() * dir[idx++];
   }
   return output;
}

//______________________________________________________________________________
void TMultiLayerPerceptron::BFGSDir(TMatrixD & bfgsh, Double_t * dir)
{
   // Computes the direction for the BFGS algorithm as dir = -bfgsh * (dE/dw),
   // the product of the inverse-Hessian estimate (bfgsh) and the gradient.

   Int_t els = fNetwork.GetEntriesFast() + fSynapses.GetEntriesFast();
   TMatrixD dedw(els, 1);
   Int_t idx = 0;
   Int_t j,nentries;
   TNeuron *neuron = 0;
   TSynapse *synapse = 0;
   nentries = fNetwork.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      neuron = (TNeuron *) fNetwork.UncheckedAt(j);
      dedw[idx++][0] = neuron->GetDEDw();
   }
   nentries = fSynapses.GetEntriesFast();
   for (j=0;j<nentries;j++) {
      synapse = (TSynapse *) fSynapses.UncheckedAt(j);
      dedw[idx++][0] = synapse->GetDEDw();
   }
   TMatrixD direction(bfgsh, TMatrixD::kMult, dedw);
   for (Int_t i = 0; i < els; i++)
      dir[i] = -direction[i][0];
   //direction.GetElements(dir,"F");
}

//______________________________________________________________________________
void TMultiLayerPerceptron::Draw(Option_t * /*option*/)
{
  // Draws the network structure.
  // Neurons are depicted by a blue disk, and synapses by
  // lines connecting neurons.
  // The line width is proportional to the weight.

#define NeuronSize 2.5

   Int_t nLayers = fStructure.CountChar(':')+1;
   Float_t xStep = 1./(nLayers+1.);
   Int_t layer;
   for(layer=0; layer< nLayers-1; layer++) {
      Float_t nNeurons_this = 0;
      if(layer==0) {
         TString input      = TString(fStructure(0, fStructure.First(':')));
         nNeurons_this = input.CountChar(',')+1;
      }
      else {
         Int_t cnt=0;
         TString hidden = TString(fStructure(fStructure.First(':') + 1,fStructure.Last(':') - fStructure.First(':') - 1));
         Int_t beg = 0;
         Int_t end = hidden.Index(":", beg + 1);
         while (end != -1) {
            Int_t num = atoi(TString(hidden(beg, end - beg)).Data());
            cnt++;
            beg = end + 1;
            end = hidden.Index(":", beg + 1);
            if(layer==cnt) nNeurons_this = num;
         }
         Int_t num = atoi(TString(hidden(beg, hidden.Length() - beg)).Data());
         cnt++;
         if(layer==cnt) nNeurons_this = num;
      }
      Float_t nNeurons_next = 0;
      if(layer==nLayers-2) {
         TString output = TString(fStructure(fStructure.Last(':') + 1,fStructure.Length() - fStructure.Last(':')));
         nNeurons_next = output.CountChar(',')+1;
      }
      else {
         Int_t cnt=0;
         TString hidden = TString(fStructure(fStructure.First(':') + 1,fStructure.Last(':') - fStructure.First(':') - 1));
         Int_t beg = 0;
         Int_t end = hidden.Index(":", beg + 1);
         while (end != -1) {
            Int_t num = atoi(TString(hidden(beg, end - beg)).Data());
            cnt++;
            beg = end + 1;
            end = hidden.Index(":", beg + 1);
            if(layer+1==cnt) nNeurons_next = num;
         }
         Int_t num = atoi(TString(hidden(beg, hidden.Length() - beg)).Data());
         cnt++;
         if(layer+1==cnt) nNeurons_next = num;
      }
      Float_t yStep_this = 1./(nNeurons_this+1.);
      Float_t yStep_next = 1./(nNeurons_next+1.);
      TObjArrayIter* it = (TObjArrayIter *) fSynapses.MakeIterator();
      TSynapse *theSynapse = 0;
      Float_t maxWeight = 0;
      while ((theSynapse = (TSynapse *) it->Next()))
         maxWeight = maxWeight < theSynapse->GetWeight() ? theSynapse->GetWeight() : maxWeight;
      delete it;
      it = (TObjArrayIter *) fSynapses.MakeIterator();
      for(Int_t neuron1=0; neuron1<nNeurons_this; neuron1++) {
         for(Int_t neuron2=0; neuron2<nNeurons_next; neuron2++) {
            TLine* synapse = new TLine(xStep*(layer+1),yStep_this*(neuron1+1),xStep*(layer+2),yStep_next*(neuron2+1));
            synapse->Draw();
            theSynapse = (TSynapse *) it->Next();
            if (!theSynapse) continue;
            synapse->SetLineWidth(Int_t((theSynapse->GetWeight()/maxWeight)*10.));
            synapse->SetLineStyle(1);
            if(((TMath::Abs(theSynapse->GetWeight())/maxWeight)*10.)<0.5) synapse->SetLineStyle(2);
            if(((TMath::Abs(theSynapse->GetWeight())/maxWeight)*10.)<0.25) synapse->SetLineStyle(3);
         }
      }
      delete it;
   }
   for(layer=0; layer< nLayers; layer++) {
      Float_t nNeurons = 0;
      if(layer==0) {
         TString input      = TString(fStructure(0, fStructure.First(':')));
         nNeurons = input.CountChar(',')+1;
      }
      else if(layer==nLayers-1) {
         TString output = TString(fStructure(fStructure.Last(':') + 1,fStructure.Length() - fStructure.Last(':')));
         nNeurons = output.CountChar(',')+1;
      }
      else {
         Int_t cnt=0;
         TString hidden = TString(fStructure(fStructure.First(':') + 1,fStructure.Last(':') - fStructure.First(':') - 1));
         Int_t beg = 0;
         Int_t end = hidden.Index(":", beg + 1);
         while (end != -1) {
            Int_t num = atoi(TString(hidden(beg, end - beg)).Data());
            cnt++;
            beg = end + 1;
            end = hidden.Index(":", beg + 1);
            if(layer==cnt) nNeurons = num;
         }
         Int_t num = atoi(TString(hidden(beg, hidden.Length() - beg)).Data());
         cnt++;
         if(layer==cnt) nNeurons = num;
      }
      Float_t yStep = 1./(nNeurons+1.);
      for(Int_t neuron=0; neuron<nNeurons; neuron++) {
         TMarker* m = new TMarker(xStep*(layer+1),yStep*(neuron+1),20);
         m->SetMarkerColor(4);
         m->SetMarkerSize(NeuronSize);
         m->Draw();
      }
   }
   const TString input = TString(fStructure(0, fStructure.First(':')));
   const TObjArray *inpL = input.Tokenize(" ,");
   const Int_t nrItems = inpL->GetLast()+1;
   Float_t yStep = 1./(nrItems+1);
   for (Int_t item = 0; item < nrItems; item++) {
      const TString brName = ((TObjString *)inpL->At(item))->GetString();
      TText* label = new TText(0.5*xStep,yStep*(item+1),brName.Data());
      label->Draw();
   }
   delete inpL;

   Int_t numOutNodes=fLastLayer.GetEntriesFast();
   yStep=1./(numOutNodes+1);
   for (Int_t outnode=0; outnode<numOutNodes; outnode++) {
      TNeuron* neuron=(TNeuron*)fLastLayer[outnode];
      if (neuron && neuron->GetName()) {
         TText* label = new TText(xStep*nLayers,
                                  yStep*(outnode+1),
                                  neuron->GetName());
         label->Draw();
      }
   }
}
 TMultiLayerPerceptron.cxx:632
 TMultiLayerPerceptron.cxx:633
 TMultiLayerPerceptron.cxx:634
 TMultiLayerPerceptron.cxx:635
 TMultiLayerPerceptron.cxx:636
 TMultiLayerPerceptron.cxx:637
 TMultiLayerPerceptron.cxx:638
 TMultiLayerPerceptron.cxx:639
 TMultiLayerPerceptron.cxx:640
 TMultiLayerPerceptron.cxx:641
 TMultiLayerPerceptron.cxx:642
 TMultiLayerPerceptron.cxx:643
 TMultiLayerPerceptron.cxx:644
 TMultiLayerPerceptron.cxx:645
 TMultiLayerPerceptron.cxx:646
 TMultiLayerPerceptron.cxx:647
 TMultiLayerPerceptron.cxx:648
 TMultiLayerPerceptron.cxx:649
 TMultiLayerPerceptron.cxx:650
 TMultiLayerPerceptron.cxx:651
 TMultiLayerPerceptron.cxx:652
 TMultiLayerPerceptron.cxx:653
 TMultiLayerPerceptron.cxx:654
 TMultiLayerPerceptron.cxx:655
 TMultiLayerPerceptron.cxx:656
 TMultiLayerPerceptron.cxx:657
 TMultiLayerPerceptron.cxx:658
 TMultiLayerPerceptron.cxx:659
 TMultiLayerPerceptron.cxx:660
 TMultiLayerPerceptron.cxx:661
 TMultiLayerPerceptron.cxx:662
 TMultiLayerPerceptron.cxx:663
 TMultiLayerPerceptron.cxx:664
 TMultiLayerPerceptron.cxx:665
 TMultiLayerPerceptron.cxx:666
 TMultiLayerPerceptron.cxx:667
 TMultiLayerPerceptron.cxx:668
 TMultiLayerPerceptron.cxx:669
 TMultiLayerPerceptron.cxx:670
 TMultiLayerPerceptron.cxx:671
 TMultiLayerPerceptron.cxx:672
 TMultiLayerPerceptron.cxx:673
 TMultiLayerPerceptron.cxx:674
 TMultiLayerPerceptron.cxx:675
 TMultiLayerPerceptron.cxx:676
 TMultiLayerPerceptron.cxx:677
 TMultiLayerPerceptron.cxx:678
 TMultiLayerPerceptron.cxx:679
 TMultiLayerPerceptron.cxx:680
 TMultiLayerPerceptron.cxx:681
 TMultiLayerPerceptron.cxx:682
 TMultiLayerPerceptron.cxx:683
 TMultiLayerPerceptron.cxx:684
 TMultiLayerPerceptron.cxx:685
 TMultiLayerPerceptron.cxx:686
 TMultiLayerPerceptron.cxx:687
 TMultiLayerPerceptron.cxx:688
 TMultiLayerPerceptron.cxx:689
 TMultiLayerPerceptron.cxx:690
 TMultiLayerPerceptron.cxx:691
 TMultiLayerPerceptron.cxx:692
 TMultiLayerPerceptron.cxx:693
 TMultiLayerPerceptron.cxx:694
 TMultiLayerPerceptron.cxx:695
 TMultiLayerPerceptron.cxx:696
 TMultiLayerPerceptron.cxx:697
 TMultiLayerPerceptron.cxx:698
 TMultiLayerPerceptron.cxx:699
 TMultiLayerPerceptron.cxx:700
 TMultiLayerPerceptron.cxx:701
 TMultiLayerPerceptron.cxx:702
 TMultiLayerPerceptron.cxx:703
 TMultiLayerPerceptron.cxx:704
 TMultiLayerPerceptron.cxx:705
 TMultiLayerPerceptron.cxx:706
 TMultiLayerPerceptron.cxx:707
 TMultiLayerPerceptron.cxx:708
 TMultiLayerPerceptron.cxx:709
 TMultiLayerPerceptron.cxx:710
 TMultiLayerPerceptron.cxx:711
 TMultiLayerPerceptron.cxx:712
 TMultiLayerPerceptron.cxx:713
 TMultiLayerPerceptron.cxx:714
 TMultiLayerPerceptron.cxx:715
 TMultiLayerPerceptron.cxx:716
 TMultiLayerPerceptron.cxx:717
 TMultiLayerPerceptron.cxx:718
 TMultiLayerPerceptron.cxx:719
 TMultiLayerPerceptron.cxx:720
 TMultiLayerPerceptron.cxx:721
 TMultiLayerPerceptron.cxx:722
 TMultiLayerPerceptron.cxx:723
 TMultiLayerPerceptron.cxx:724
 TMultiLayerPerceptron.cxx:725
 TMultiLayerPerceptron.cxx:726
 TMultiLayerPerceptron.cxx:727
 TMultiLayerPerceptron.cxx:728
 TMultiLayerPerceptron.cxx:729
 TMultiLayerPerceptron.cxx:730
 TMultiLayerPerceptron.cxx:731
 TMultiLayerPerceptron.cxx:732
 TMultiLayerPerceptron.cxx:733
 TMultiLayerPerceptron.cxx:734
 TMultiLayerPerceptron.cxx:735
 TMultiLayerPerceptron.cxx:736
 TMultiLayerPerceptron.cxx:737
 TMultiLayerPerceptron.cxx:738
 TMultiLayerPerceptron.cxx:739
 TMultiLayerPerceptron.cxx:740
 TMultiLayerPerceptron.cxx:741
 TMultiLayerPerceptron.cxx:742
 TMultiLayerPerceptron.cxx:743
 TMultiLayerPerceptron.cxx:744
 TMultiLayerPerceptron.cxx:745
 TMultiLayerPerceptron.cxx:746
 TMultiLayerPerceptron.cxx:747
 TMultiLayerPerceptron.cxx:748
 TMultiLayerPerceptron.cxx:749
 TMultiLayerPerceptron.cxx:750
 TMultiLayerPerceptron.cxx:751
 TMultiLayerPerceptron.cxx:752
 TMultiLayerPerceptron.cxx:753
 TMultiLayerPerceptron.cxx:754
 TMultiLayerPerceptron.cxx:755
 TMultiLayerPerceptron.cxx:756
 TMultiLayerPerceptron.cxx:757
 TMultiLayerPerceptron.cxx:758
 TMultiLayerPerceptron.cxx:759
 TMultiLayerPerceptron.cxx:760
 TMultiLayerPerceptron.cxx:761
 TMultiLayerPerceptron.cxx:762
 TMultiLayerPerceptron.cxx:763
 TMultiLayerPerceptron.cxx:764
 TMultiLayerPerceptron.cxx:765
 TMultiLayerPerceptron.cxx:766
 TMultiLayerPerceptron.cxx:767
 TMultiLayerPerceptron.cxx:768
 TMultiLayerPerceptron.cxx:769
 TMultiLayerPerceptron.cxx:770
 TMultiLayerPerceptron.cxx:771
 TMultiLayerPerceptron.cxx:772
 TMultiLayerPerceptron.cxx:773
 TMultiLayerPerceptron.cxx:774
 TMultiLayerPerceptron.cxx:775
 TMultiLayerPerceptron.cxx:776
 TMultiLayerPerceptron.cxx:777
 TMultiLayerPerceptron.cxx:778
 TMultiLayerPerceptron.cxx:779
 TMultiLayerPerceptron.cxx:780
 TMultiLayerPerceptron.cxx:781
 TMultiLayerPerceptron.cxx:782
 TMultiLayerPerceptron.cxx:783
 TMultiLayerPerceptron.cxx:784
 TMultiLayerPerceptron.cxx:785
 TMultiLayerPerceptron.cxx:786
 TMultiLayerPerceptron.cxx:787
 TMultiLayerPerceptron.cxx:788
 TMultiLayerPerceptron.cxx:789
 TMultiLayerPerceptron.cxx:790
 TMultiLayerPerceptron.cxx:791
 TMultiLayerPerceptron.cxx:792
 TMultiLayerPerceptron.cxx:793
 TMultiLayerPerceptron.cxx:794
 TMultiLayerPerceptron.cxx:795
 TMultiLayerPerceptron.cxx:796
 TMultiLayerPerceptron.cxx:797
 TMultiLayerPerceptron.cxx:798
 TMultiLayerPerceptron.cxx:799
 TMultiLayerPerceptron.cxx:800
 TMultiLayerPerceptron.cxx:801
 TMultiLayerPerceptron.cxx:802
 TMultiLayerPerceptron.cxx:803
 TMultiLayerPerceptron.cxx:804
 TMultiLayerPerceptron.cxx:805
 TMultiLayerPerceptron.cxx:806
 TMultiLayerPerceptron.cxx:807
 TMultiLayerPerceptron.cxx:808
 TMultiLayerPerceptron.cxx:809
 TMultiLayerPerceptron.cxx:810
 TMultiLayerPerceptron.cxx:811
 TMultiLayerPerceptron.cxx:812
 TMultiLayerPerceptron.cxx:813
 TMultiLayerPerceptron.cxx:814
 TMultiLayerPerceptron.cxx:815
 TMultiLayerPerceptron.cxx:816
 TMultiLayerPerceptron.cxx:817
 TMultiLayerPerceptron.cxx:818
 TMultiLayerPerceptron.cxx:819
 TMultiLayerPerceptron.cxx:820
 TMultiLayerPerceptron.cxx:821
 TMultiLayerPerceptron.cxx:822
 TMultiLayerPerceptron.cxx:823
 TMultiLayerPerceptron.cxx:824
 TMultiLayerPerceptron.cxx:825
 TMultiLayerPerceptron.cxx:826
 TMultiLayerPerceptron.cxx:827
 TMultiLayerPerceptron.cxx:828
 TMultiLayerPerceptron.cxx:829
 TMultiLayerPerceptron.cxx:830
 TMultiLayerPerceptron.cxx:831
 TMultiLayerPerceptron.cxx:832
 TMultiLayerPerceptron.cxx:833
 TMultiLayerPerceptron.cxx:834
 TMultiLayerPerceptron.cxx:835
 TMultiLayerPerceptron.cxx:836
 TMultiLayerPerceptron.cxx:837
 TMultiLayerPerceptron.cxx:838
 TMultiLayerPerceptron.cxx:839
 TMultiLayerPerceptron.cxx:840
 TMultiLayerPerceptron.cxx:841
 TMultiLayerPerceptron.cxx:842
 TMultiLayerPerceptron.cxx:843
 TMultiLayerPerceptron.cxx:844
 TMultiLayerPerceptron.cxx:845
 TMultiLayerPerceptron.cxx:846
 TMultiLayerPerceptron.cxx:847
 TMultiLayerPerceptron.cxx:848
 TMultiLayerPerceptron.cxx:849
 TMultiLayerPerceptron.cxx:850
 TMultiLayerPerceptron.cxx:851
 TMultiLayerPerceptron.cxx:852
 TMultiLayerPerceptron.cxx:853
 TMultiLayerPerceptron.cxx:854
 TMultiLayerPerceptron.cxx:855
 TMultiLayerPerceptron.cxx:856
 TMultiLayerPerceptron.cxx:857
 TMultiLayerPerceptron.cxx:858
 TMultiLayerPerceptron.cxx:859
 TMultiLayerPerceptron.cxx:860
 TMultiLayerPerceptron.cxx:861
 TMultiLayerPerceptron.cxx:862
 TMultiLayerPerceptron.cxx:863
 TMultiLayerPerceptron.cxx:864
 TMultiLayerPerceptron.cxx:865
 TMultiLayerPerceptron.cxx:866
 TMultiLayerPerceptron.cxx:867
 TMultiLayerPerceptron.cxx:868
 TMultiLayerPerceptron.cxx:869
 TMultiLayerPerceptron.cxx:870
 TMultiLayerPerceptron.cxx:871
 TMultiLayerPerceptron.cxx:872
 TMultiLayerPerceptron.cxx:873
 TMultiLayerPerceptron.cxx:874
 TMultiLayerPerceptron.cxx:875
 TMultiLayerPerceptron.cxx:876
 TMultiLayerPerceptron.cxx:877
 TMultiLayerPerceptron.cxx:878
 TMultiLayerPerceptron.cxx:879
 TMultiLayerPerceptron.cxx:880
 TMultiLayerPerceptron.cxx:881
 TMultiLayerPerceptron.cxx:882
 TMultiLayerPerceptron.cxx:883
 TMultiLayerPerceptron.cxx:884
 TMultiLayerPerceptron.cxx:885
 TMultiLayerPerceptron.cxx:886
 TMultiLayerPerceptron.cxx:887
 TMultiLayerPerceptron.cxx:888
 TMultiLayerPerceptron.cxx:889
 TMultiLayerPerceptron.cxx:890
 TMultiLayerPerceptron.cxx:891
 TMultiLayerPerceptron.cxx:892
 TMultiLayerPerceptron.cxx:893
 TMultiLayerPerceptron.cxx:894
 TMultiLayerPerceptron.cxx:895
 TMultiLayerPerceptron.cxx:896
 TMultiLayerPerceptron.cxx:897
 TMultiLayerPerceptron.cxx:898
 TMultiLayerPerceptron.cxx:899
 TMultiLayerPerceptron.cxx:900
 TMultiLayerPerceptron.cxx:901
 TMultiLayerPerceptron.cxx:902
 TMultiLayerPerceptron.cxx:903
 TMultiLayerPerceptron.cxx:904
 TMultiLayerPerceptron.cxx:905
 TMultiLayerPerceptron.cxx:906
 TMultiLayerPerceptron.cxx:907
 TMultiLayerPerceptron.cxx:908
 TMultiLayerPerceptron.cxx:909
 TMultiLayerPerceptron.cxx:910
 TMultiLayerPerceptron.cxx:911
 TMultiLayerPerceptron.cxx:912
 TMultiLayerPerceptron.cxx:913
 TMultiLayerPerceptron.cxx:914
 TMultiLayerPerceptron.cxx:915
 TMultiLayerPerceptron.cxx:916
 TMultiLayerPerceptron.cxx:917
 TMultiLayerPerceptron.cxx:918
 TMultiLayerPerceptron.cxx:919
 TMultiLayerPerceptron.cxx:920
 TMultiLayerPerceptron.cxx:921
 TMultiLayerPerceptron.cxx:922
 TMultiLayerPerceptron.cxx:923
 TMultiLayerPerceptron.cxx:924
 TMultiLayerPerceptron.cxx:925
 TMultiLayerPerceptron.cxx:926
 TMultiLayerPerceptron.cxx:927
 TMultiLayerPerceptron.cxx:928
 TMultiLayerPerceptron.cxx:929
 TMultiLayerPerceptron.cxx:930
 TMultiLayerPerceptron.cxx:931
 TMultiLayerPerceptron.cxx:932
 TMultiLayerPerceptron.cxx:933
 TMultiLayerPerceptron.cxx:934
 TMultiLayerPerceptron.cxx:935
 TMultiLayerPerceptron.cxx:936
 TMultiLayerPerceptron.cxx:937
 TMultiLayerPerceptron.cxx:938
 TMultiLayerPerceptron.cxx:939
 TMultiLayerPerceptron.cxx:940
 TMultiLayerPerceptron.cxx:941
 TMultiLayerPerceptron.cxx:942
 TMultiLayerPerceptron.cxx:943
 TMultiLayerPerceptron.cxx:944
 TMultiLayerPerceptron.cxx:945
 TMultiLayerPerceptron.cxx:946
 TMultiLayerPerceptron.cxx:947
 TMultiLayerPerceptron.cxx:948
 TMultiLayerPerceptron.cxx:949
 TMultiLayerPerceptron.cxx:950
 TMultiLayerPerceptron.cxx:951
 TMultiLayerPerceptron.cxx:952
 TMultiLayerPerceptron.cxx:953
 TMultiLayerPerceptron.cxx:954
 TMultiLayerPerceptron.cxx:955
 TMultiLayerPerceptron.cxx:956
 TMultiLayerPerceptron.cxx:957
 TMultiLayerPerceptron.cxx:958
 TMultiLayerPerceptron.cxx:959
 TMultiLayerPerceptron.cxx:960
 TMultiLayerPerceptron.cxx:961
 TMultiLayerPerceptron.cxx:962
 TMultiLayerPerceptron.cxx:963
 TMultiLayerPerceptron.cxx:964
 TMultiLayerPerceptron.cxx:965
 TMultiLayerPerceptron.cxx:966
 TMultiLayerPerceptron.cxx:967
 TMultiLayerPerceptron.cxx:968
 TMultiLayerPerceptron.cxx:969
 TMultiLayerPerceptron.cxx:970
 TMultiLayerPerceptron.cxx:971
 TMultiLayerPerceptron.cxx:972
 TMultiLayerPerceptron.cxx:973
 TMultiLayerPerceptron.cxx:974
 TMultiLayerPerceptron.cxx:975
 TMultiLayerPerceptron.cxx:976
 TMultiLayerPerceptron.cxx:977
 TMultiLayerPerceptron.cxx:978
 TMultiLayerPerceptron.cxx:979
 TMultiLayerPerceptron.cxx:980
 TMultiLayerPerceptron.cxx:981
 TMultiLayerPerceptron.cxx:982
 TMultiLayerPerceptron.cxx:983
 TMultiLayerPerceptron.cxx:984
 TMultiLayerPerceptron.cxx:985
 TMultiLayerPerceptron.cxx:986
 TMultiLayerPerceptron.cxx:987
 TMultiLayerPerceptron.cxx:988
 TMultiLayerPerceptron.cxx:989
 TMultiLayerPerceptron.cxx:990
 TMultiLayerPerceptron.cxx:991
 TMultiLayerPerceptron.cxx:992
 TMultiLayerPerceptron.cxx:993
 TMultiLayerPerceptron.cxx:994
 TMultiLayerPerceptron.cxx:995
 TMultiLayerPerceptron.cxx:996
 TMultiLayerPerceptron.cxx:997
 TMultiLayerPerceptron.cxx:998
 TMultiLayerPerceptron.cxx:999
 TMultiLayerPerceptron.cxx:1000
 TMultiLayerPerceptron.cxx:1001
 TMultiLayerPerceptron.cxx:1002
 TMultiLayerPerceptron.cxx:1003
 TMultiLayerPerceptron.cxx:1004
 TMultiLayerPerceptron.cxx:1005
 TMultiLayerPerceptron.cxx:1006
 TMultiLayerPerceptron.cxx:1007
 TMultiLayerPerceptron.cxx:1008
 TMultiLayerPerceptron.cxx:1009
 TMultiLayerPerceptron.cxx:1010
 TMultiLayerPerceptron.cxx:1011
 TMultiLayerPerceptron.cxx:1012
 TMultiLayerPerceptron.cxx:1013
 TMultiLayerPerceptron.cxx:1014
 TMultiLayerPerceptron.cxx:1015
 TMultiLayerPerceptron.cxx:1016
 TMultiLayerPerceptron.cxx:1017
 TMultiLayerPerceptron.cxx:1018
 TMultiLayerPerceptron.cxx:1019
 TMultiLayerPerceptron.cxx:1020
 TMultiLayerPerceptron.cxx:1021
 TMultiLayerPerceptron.cxx:1022
 TMultiLayerPerceptron.cxx:1023
 TMultiLayerPerceptron.cxx:1024
 TMultiLayerPerceptron.cxx:1025
 TMultiLayerPerceptron.cxx:1026
 TMultiLayerPerceptron.cxx:1027
 TMultiLayerPerceptron.cxx:1028
 TMultiLayerPerceptron.cxx:1029
 TMultiLayerPerceptron.cxx:1030
 TMultiLayerPerceptron.cxx:1031
 TMultiLayerPerceptron.cxx:1032
 TMultiLayerPerceptron.cxx:1033
 TMultiLayerPerceptron.cxx:1034
 TMultiLayerPerceptron.cxx:1035
 TMultiLayerPerceptron.cxx:1036
 TMultiLayerPerceptron.cxx:1037
 TMultiLayerPerceptron.cxx:1038
 TMultiLayerPerceptron.cxx:1039
 TMultiLayerPerceptron.cxx:1040
 TMultiLayerPerceptron.cxx:1041
 TMultiLayerPerceptron.cxx:1042
 TMultiLayerPerceptron.cxx:1043
 TMultiLayerPerceptron.cxx:1044
 TMultiLayerPerceptron.cxx:1045
 TMultiLayerPerceptron.cxx:1046
 TMultiLayerPerceptron.cxx:1047
 TMultiLayerPerceptron.cxx:1048
 TMultiLayerPerceptron.cxx:1049
 TMultiLayerPerceptron.cxx:1050
 TMultiLayerPerceptron.cxx:1051
 TMultiLayerPerceptron.cxx:1052
 TMultiLayerPerceptron.cxx:1053
 TMultiLayerPerceptron.cxx:1054
 TMultiLayerPerceptron.cxx:1055
 TMultiLayerPerceptron.cxx:1056
 TMultiLayerPerceptron.cxx:1057
 TMultiLayerPerceptron.cxx:1058
 TMultiLayerPerceptron.cxx:1059
 TMultiLayerPerceptron.cxx:1060
 TMultiLayerPerceptron.cxx:1061
 TMultiLayerPerceptron.cxx:1062
 TMultiLayerPerceptron.cxx:1063
 TMultiLayerPerceptron.cxx:1064
 TMultiLayerPerceptron.cxx:1065
 TMultiLayerPerceptron.cxx:1066
 TMultiLayerPerceptron.cxx:1067
 TMultiLayerPerceptron.cxx:1068
 TMultiLayerPerceptron.cxx:1069
 TMultiLayerPerceptron.cxx:1070
 TMultiLayerPerceptron.cxx:1071
 TMultiLayerPerceptron.cxx:1072
 TMultiLayerPerceptron.cxx:1073
 TMultiLayerPerceptron.cxx:1074
 TMultiLayerPerceptron.cxx:1075
 TMultiLayerPerceptron.cxx:1076
 TMultiLayerPerceptron.cxx:1077
 TMultiLayerPerceptron.cxx:1078
 TMultiLayerPerceptron.cxx:1079
 TMultiLayerPerceptron.cxx:1080
 TMultiLayerPerceptron.cxx:1081
 TMultiLayerPerceptron.cxx:1082
 TMultiLayerPerceptron.cxx:1083
 TMultiLayerPerceptron.cxx:1084
 TMultiLayerPerceptron.cxx:1085
 TMultiLayerPerceptron.cxx:1086
 TMultiLayerPerceptron.cxx:1087
 TMultiLayerPerceptron.cxx:1088
 TMultiLayerPerceptron.cxx:1089
 TMultiLayerPerceptron.cxx:1090
 TMultiLayerPerceptron.cxx:1091
 TMultiLayerPerceptron.cxx:1092
 TMultiLayerPerceptron.cxx:1093
 TMultiLayerPerceptron.cxx:1094
 TMultiLayerPerceptron.cxx:1095
 TMultiLayerPerceptron.cxx:1096
 TMultiLayerPerceptron.cxx:1097
 TMultiLayerPerceptron.cxx:1098
 TMultiLayerPerceptron.cxx:1099
 TMultiLayerPerceptron.cxx:1100
 TMultiLayerPerceptron.cxx:1101
 TMultiLayerPerceptron.cxx:1102
 TMultiLayerPerceptron.cxx:1103
 TMultiLayerPerceptron.cxx:1104
 TMultiLayerPerceptron.cxx:1105
 TMultiLayerPerceptron.cxx:1106
 TMultiLayerPerceptron.cxx:1107
 TMultiLayerPerceptron.cxx:1108
 TMultiLayerPerceptron.cxx:1109
 TMultiLayerPerceptron.cxx:1110
 TMultiLayerPerceptron.cxx:1111
 TMultiLayerPerceptron.cxx:1112
 TMultiLayerPerceptron.cxx:1113
 TMultiLayerPerceptron.cxx:1114
 TMultiLayerPerceptron.cxx:1115
 TMultiLayerPerceptron.cxx:1116
 TMultiLayerPerceptron.cxx:1117
 TMultiLayerPerceptron.cxx:1118
 TMultiLayerPerceptron.cxx:1119
 TMultiLayerPerceptron.cxx:1120
 TMultiLayerPerceptron.cxx:1121
 TMultiLayerPerceptron.cxx:1122
 TMultiLayerPerceptron.cxx:1123
 TMultiLayerPerceptron.cxx:1124
 TMultiLayerPerceptron.cxx:1125
 TMultiLayerPerceptron.cxx:1126
 TMultiLayerPerceptron.cxx:1127
 TMultiLayerPerceptron.cxx:1128
 TMultiLayerPerceptron.cxx:1129
 TMultiLayerPerceptron.cxx:1130
 TMultiLayerPerceptron.cxx:1131
 TMultiLayerPerceptron.cxx:1132
 TMultiLayerPerceptron.cxx:1133
 TMultiLayerPerceptron.cxx:1134
 TMultiLayerPerceptron.cxx:1135
 TMultiLayerPerceptron.cxx:1136
 TMultiLayerPerceptron.cxx:1137
 TMultiLayerPerceptron.cxx:1138
 TMultiLayerPerceptron.cxx:1139
 TMultiLayerPerceptron.cxx:1140
 TMultiLayerPerceptron.cxx:1141
 TMultiLayerPerceptron.cxx:1142
 TMultiLayerPerceptron.cxx:1143
 TMultiLayerPerceptron.cxx:1144
 TMultiLayerPerceptron.cxx:1145
 TMultiLayerPerceptron.cxx:1146
 TMultiLayerPerceptron.cxx:1147
 TMultiLayerPerceptron.cxx:1148
 TMultiLayerPerceptron.cxx:1149
 TMultiLayerPerceptron.cxx:1150
 TMultiLayerPerceptron.cxx:1151
 TMultiLayerPerceptron.cxx:1152
 TMultiLayerPerceptron.cxx:1153
 TMultiLayerPerceptron.cxx:1154
 TMultiLayerPerceptron.cxx:1155
 TMultiLayerPerceptron.cxx:1156
 TMultiLayerPerceptron.cxx:1157
 TMultiLayerPerceptron.cxx:1158
 TMultiLayerPerceptron.cxx:1159
 TMultiLayerPerceptron.cxx:1160
 TMultiLayerPerceptron.cxx:1161
 TMultiLayerPerceptron.cxx:1162
 TMultiLayerPerceptron.cxx:1163
 TMultiLayerPerceptron.cxx:1164
 TMultiLayerPerceptron.cxx:1165
 TMultiLayerPerceptron.cxx:1166
 TMultiLayerPerceptron.cxx:1167
 TMultiLayerPerceptron.cxx:1168
 TMultiLayerPerceptron.cxx:1169
 TMultiLayerPerceptron.cxx:1170
 TMultiLayerPerceptron.cxx:1171
 TMultiLayerPerceptron.cxx:1172
 TMultiLayerPerceptron.cxx:1173
 TMultiLayerPerceptron.cxx:1174
 TMultiLayerPerceptron.cxx:1175
 TMultiLayerPerceptron.cxx:1176
 TMultiLayerPerceptron.cxx:1177
 TMultiLayerPerceptron.cxx:1178
 TMultiLayerPerceptron.cxx:1179
 TMultiLayerPerceptron.cxx:1180
 TMultiLayerPerceptron.cxx:1181
 TMultiLayerPerceptron.cxx:1182
 TMultiLayerPerceptron.cxx:1183
 TMultiLayerPerceptron.cxx:1184
 TMultiLayerPerceptron.cxx:1185
 TMultiLayerPerceptron.cxx:1186
 TMultiLayerPerceptron.cxx:1187
 TMultiLayerPerceptron.cxx:1188
 TMultiLayerPerceptron.cxx:1189
 TMultiLayerPerceptron.cxx:1190
 TMultiLayerPerceptron.cxx:1191
 TMultiLayerPerceptron.cxx:1192
 TMultiLayerPerceptron.cxx:1193
 TMultiLayerPerceptron.cxx:1194
 TMultiLayerPerceptron.cxx:1195
 TMultiLayerPerceptron.cxx:1196
 TMultiLayerPerceptron.cxx:1197
 TMultiLayerPerceptron.cxx:1198
 TMultiLayerPerceptron.cxx:1199
 TMultiLayerPerceptron.cxx:1200
 TMultiLayerPerceptron.cxx:1201
 TMultiLayerPerceptron.cxx:1202
 TMultiLayerPerceptron.cxx:1203
 TMultiLayerPerceptron.cxx:1204
 TMultiLayerPerceptron.cxx:1205
 TMultiLayerPerceptron.cxx:1206
 TMultiLayerPerceptron.cxx:1207
 TMultiLayerPerceptron.cxx:1208
 TMultiLayerPerceptron.cxx:1209
 TMultiLayerPerceptron.cxx:1210
 TMultiLayerPerceptron.cxx:1211
 TMultiLayerPerceptron.cxx:1212
 TMultiLayerPerceptron.cxx:1213
 TMultiLayerPerceptron.cxx:1214
 TMultiLayerPerceptron.cxx:1215
 TMultiLayerPerceptron.cxx:1216
 TMultiLayerPerceptron.cxx:1217
 TMultiLayerPerceptron.cxx:1218
 TMultiLayerPerceptron.cxx:1219
 TMultiLayerPerceptron.cxx:1220
 TMultiLayerPerceptron.cxx:1221
 TMultiLayerPerceptron.cxx:1222
 TMultiLayerPerceptron.cxx:1223
 TMultiLayerPerceptron.cxx:1224
 TMultiLayerPerceptron.cxx:1225
 TMultiLayerPerceptron.cxx:1226
 TMultiLayerPerceptron.cxx:1227
 TMultiLayerPerceptron.cxx:1228
 TMultiLayerPerceptron.cxx:1229
 TMultiLayerPerceptron.cxx:1230
 TMultiLayerPerceptron.cxx:1231
 TMultiLayerPerceptron.cxx:1232
 TMultiLayerPerceptron.cxx:1233
 TMultiLayerPerceptron.cxx:1234
 TMultiLayerPerceptron.cxx:1235
 TMultiLayerPerceptron.cxx:1236
 TMultiLayerPerceptron.cxx:1237
 TMultiLayerPerceptron.cxx:1238
 TMultiLayerPerceptron.cxx:1239
 TMultiLayerPerceptron.cxx:1240
 TMultiLayerPerceptron.cxx:1241
 TMultiLayerPerceptron.cxx:1242
 TMultiLayerPerceptron.cxx:1243
 TMultiLayerPerceptron.cxx:1244
 TMultiLayerPerceptron.cxx:1245
 TMultiLayerPerceptron.cxx:1246
 TMultiLayerPerceptron.cxx:1247
 TMultiLayerPerceptron.cxx:1248
 TMultiLayerPerceptron.cxx:1249
 TMultiLayerPerceptron.cxx:1250
 TMultiLayerPerceptron.cxx:1251
 TMultiLayerPerceptron.cxx:1252
 TMultiLayerPerceptron.cxx:1253
 TMultiLayerPerceptron.cxx:1254
 TMultiLayerPerceptron.cxx:1255
 TMultiLayerPerceptron.cxx:1256
 TMultiLayerPerceptron.cxx:1257
 TMultiLayerPerceptron.cxx:1258
 TMultiLayerPerceptron.cxx:1259
 TMultiLayerPerceptron.cxx:1260
 TMultiLayerPerceptron.cxx:1261
 TMultiLayerPerceptron.cxx:1262
 TMultiLayerPerceptron.cxx:1263
 TMultiLayerPerceptron.cxx:1264
 TMultiLayerPerceptron.cxx:1265
 TMultiLayerPerceptron.cxx:1266
 TMultiLayerPerceptron.cxx:1267
 TMultiLayerPerceptron.cxx:1268
 TMultiLayerPerceptron.cxx:1269
 TMultiLayerPerceptron.cxx:1270
 TMultiLayerPerceptron.cxx:1271
 TMultiLayerPerceptron.cxx:1272
 TMultiLayerPerceptron.cxx:1273
 TMultiLayerPerceptron.cxx:1274
 TMultiLayerPerceptron.cxx:1275
 TMultiLayerPerceptron.cxx:1276
 TMultiLayerPerceptron.cxx:1277
 TMultiLayerPerceptron.cxx:1278
 TMultiLayerPerceptron.cxx:1279
 TMultiLayerPerceptron.cxx:1280
 TMultiLayerPerceptron.cxx:1281
 TMultiLayerPerceptron.cxx:1282
 TMultiLayerPerceptron.cxx:1283
 TMultiLayerPerceptron.cxx:1284
 TMultiLayerPerceptron.cxx:1285
 TMultiLayerPerceptron.cxx:1286
 TMultiLayerPerceptron.cxx:1287
 TMultiLayerPerceptron.cxx:1288
 TMultiLayerPerceptron.cxx:1289
 TMultiLayerPerceptron.cxx:1290
 TMultiLayerPerceptron.cxx:1291
 TMultiLayerPerceptron.cxx:1292
 TMultiLayerPerceptron.cxx:1293
 TMultiLayerPerceptron.cxx:1294
 TMultiLayerPerceptron.cxx:1295
 TMultiLayerPerceptron.cxx:1296
 TMultiLayerPerceptron.cxx:1297
 TMultiLayerPerceptron.cxx:1298
 TMultiLayerPerceptron.cxx:1299
 TMultiLayerPerceptron.cxx:1300
 TMultiLayerPerceptron.cxx:1301
 TMultiLayerPerceptron.cxx:1302
 TMultiLayerPerceptron.cxx:1303
 TMultiLayerPerceptron.cxx:1304
 TMultiLayerPerceptron.cxx:1305
 TMultiLayerPerceptron.cxx:1306
 TMultiLayerPerceptron.cxx:1307
 TMultiLayerPerceptron.cxx:1308
 TMultiLayerPerceptron.cxx:1309
 TMultiLayerPerceptron.cxx:1310
 TMultiLayerPerceptron.cxx:1311
 TMultiLayerPerceptron.cxx:1312
 TMultiLayerPerceptron.cxx:1313
 TMultiLayerPerceptron.cxx:1314
 TMultiLayerPerceptron.cxx:1315
 TMultiLayerPerceptron.cxx:1316
 TMultiLayerPerceptron.cxx:1317
 TMultiLayerPerceptron.cxx:1318
 TMultiLayerPerceptron.cxx:1319
 TMultiLayerPerceptron.cxx:1320
 TMultiLayerPerceptron.cxx:1321
 TMultiLayerPerceptron.cxx:1322
 TMultiLayerPerceptron.cxx:1323
 TMultiLayerPerceptron.cxx:1324
 TMultiLayerPerceptron.cxx:1325
 TMultiLayerPerceptron.cxx:1326
 TMultiLayerPerceptron.cxx:1327
 TMultiLayerPerceptron.cxx:1328
 TMultiLayerPerceptron.cxx:1329
 TMultiLayerPerceptron.cxx:1330
 TMultiLayerPerceptron.cxx:1331
 TMultiLayerPerceptron.cxx:1332
 TMultiLayerPerceptron.cxx:1333
 TMultiLayerPerceptron.cxx:1334
 TMultiLayerPerceptron.cxx:1335
 TMultiLayerPerceptron.cxx:1336
 TMultiLayerPerceptron.cxx:1337
 TMultiLayerPerceptron.cxx:1338
 TMultiLayerPerceptron.cxx:1339
 TMultiLayerPerceptron.cxx:1340
 TMultiLayerPerceptron.cxx:1341
 TMultiLayerPerceptron.cxx:1342
 TMultiLayerPerceptron.cxx:1343
 TMultiLayerPerceptron.cxx:1344
 TMultiLayerPerceptron.cxx:1345
 TMultiLayerPerceptron.cxx:1346
 TMultiLayerPerceptron.cxx:1347
 TMultiLayerPerceptron.cxx:1348
 TMultiLayerPerceptron.cxx:1349
 TMultiLayerPerceptron.cxx:1350
 TMultiLayerPerceptron.cxx:1351
 TMultiLayerPerceptron.cxx:1352
 TMultiLayerPerceptron.cxx:1353
 TMultiLayerPerceptron.cxx:1354
 TMultiLayerPerceptron.cxx:1355
 TMultiLayerPerceptron.cxx:1356
 TMultiLayerPerceptron.cxx:1357
 TMultiLayerPerceptron.cxx:1358
 TMultiLayerPerceptron.cxx:1359
 TMultiLayerPerceptron.cxx:1360
 TMultiLayerPerceptron.cxx:1361
 TMultiLayerPerceptron.cxx:1362
 TMultiLayerPerceptron.cxx:1363
 TMultiLayerPerceptron.cxx:1364
 TMultiLayerPerceptron.cxx:1365
 TMultiLayerPerceptron.cxx:1366
 TMultiLayerPerceptron.cxx:1367
 TMultiLayerPerceptron.cxx:1368
 TMultiLayerPerceptron.cxx:1369
 TMultiLayerPerceptron.cxx:1370
 TMultiLayerPerceptron.cxx:1371
 TMultiLayerPerceptron.cxx:1372
 TMultiLayerPerceptron.cxx:1373
 TMultiLayerPerceptron.cxx:1374
 TMultiLayerPerceptron.cxx:1375
 TMultiLayerPerceptron.cxx:1376
 TMultiLayerPerceptron.cxx:1377
 TMultiLayerPerceptron.cxx:1378
 TMultiLayerPerceptron.cxx:1379
 TMultiLayerPerceptron.cxx:1380
 TMultiLayerPerceptron.cxx:1381
 TMultiLayerPerceptron.cxx:1382
 TMultiLayerPerceptron.cxx:1383
 TMultiLayerPerceptron.cxx:1384
 TMultiLayerPerceptron.cxx:1385
 TMultiLayerPerceptron.cxx:1386
 TMultiLayerPerceptron.cxx:1387
 TMultiLayerPerceptron.cxx:1388
 TMultiLayerPerceptron.cxx:1389
 TMultiLayerPerceptron.cxx:1390
 TMultiLayerPerceptron.cxx:1391
 TMultiLayerPerceptron.cxx:1392
 TMultiLayerPerceptron.cxx:1393
 TMultiLayerPerceptron.cxx:1394
 TMultiLayerPerceptron.cxx:1395
 TMultiLayerPerceptron.cxx:1396
 TMultiLayerPerceptron.cxx:1397
 TMultiLayerPerceptron.cxx:1398
 TMultiLayerPerceptron.cxx:1399
 TMultiLayerPerceptron.cxx:1400
 TMultiLayerPerceptron.cxx:1401
 TMultiLayerPerceptron.cxx:1402
 TMultiLayerPerceptron.cxx:1403
 TMultiLayerPerceptron.cxx:1404
 TMultiLayerPerceptron.cxx:1405
 TMultiLayerPerceptron.cxx:1406
 TMultiLayerPerceptron.cxx:1407
 TMultiLayerPerceptron.cxx:1408
 TMultiLayerPerceptron.cxx:1409
 TMultiLayerPerceptron.cxx:1410
 TMultiLayerPerceptron.cxx:1411
 TMultiLayerPerceptron.cxx:1412
 TMultiLayerPerceptron.cxx:1413
 TMultiLayerPerceptron.cxx:1414
 TMultiLayerPerceptron.cxx:1415
 TMultiLayerPerceptron.cxx:1416
 TMultiLayerPerceptron.cxx:1417
 TMultiLayerPerceptron.cxx:1418
 TMultiLayerPerceptron.cxx:1419
 TMultiLayerPerceptron.cxx:1420
 TMultiLayerPerceptron.cxx:1421
 TMultiLayerPerceptron.cxx:1422
 TMultiLayerPerceptron.cxx:1423
 TMultiLayerPerceptron.cxx:1424
 TMultiLayerPerceptron.cxx:1425
 TMultiLayerPerceptron.cxx:1426
 TMultiLayerPerceptron.cxx:1427
 TMultiLayerPerceptron.cxx:1428
 TMultiLayerPerceptron.cxx:1429
 TMultiLayerPerceptron.cxx:1430
 TMultiLayerPerceptron.cxx:1431
 TMultiLayerPerceptron.cxx:1432
 TMultiLayerPerceptron.cxx:1433
 TMultiLayerPerceptron.cxx:1434
 TMultiLayerPerceptron.cxx:1435
 TMultiLayerPerceptron.cxx:1436
 TMultiLayerPerceptron.cxx:1437
 TMultiLayerPerceptron.cxx:1438
 TMultiLayerPerceptron.cxx:1439
 TMultiLayerPerceptron.cxx:1440
 TMultiLayerPerceptron.cxx:1441
 TMultiLayerPerceptron.cxx:1442
 TMultiLayerPerceptron.cxx:1443
 TMultiLayerPerceptron.cxx:1444
 TMultiLayerPerceptron.cxx:1445
 TMultiLayerPerceptron.cxx:1446
 TMultiLayerPerceptron.cxx:1447
 TMultiLayerPerceptron.cxx:1448
 TMultiLayerPerceptron.cxx:1449
 TMultiLayerPerceptron.cxx:1450
 TMultiLayerPerceptron.cxx:1451
 TMultiLayerPerceptron.cxx:1452
 TMultiLayerPerceptron.cxx:1453
 TMultiLayerPerceptron.cxx:1454
 TMultiLayerPerceptron.cxx:1455
 TMultiLayerPerceptron.cxx:1456
 TMultiLayerPerceptron.cxx:1457
 TMultiLayerPerceptron.cxx:1458
 TMultiLayerPerceptron.cxx:1459
 TMultiLayerPerceptron.cxx:1460
 TMultiLayerPerceptron.cxx:1461
 TMultiLayerPerceptron.cxx:1462
 TMultiLayerPerceptron.cxx:1463
 TMultiLayerPerceptron.cxx:1464
 TMultiLayerPerceptron.cxx:1465
 TMultiLayerPerceptron.cxx:1466
 TMultiLayerPerceptron.cxx:1467
 TMultiLayerPerceptron.cxx:1468
 TMultiLayerPerceptron.cxx:1469
 TMultiLayerPerceptron.cxx:1470
 TMultiLayerPerceptron.cxx:1471
 TMultiLayerPerceptron.cxx:1472
 TMultiLayerPerceptron.cxx:1473
 TMultiLayerPerceptron.cxx:1474
 TMultiLayerPerceptron.cxx:1475
 TMultiLayerPerceptron.cxx:1476
 TMultiLayerPerceptron.cxx:1477
 TMultiLayerPerceptron.cxx:1478
 TMultiLayerPerceptron.cxx:1479
 TMultiLayerPerceptron.cxx:1480
 TMultiLayerPerceptron.cxx:1481
 TMultiLayerPerceptron.cxx:1482
 TMultiLayerPerceptron.cxx:1483
 TMultiLayerPerceptron.cxx:1484
 TMultiLayerPerceptron.cxx:1485
 TMultiLayerPerceptron.cxx:1486
 TMultiLayerPerceptron.cxx:1487
 TMultiLayerPerceptron.cxx:1488
 TMultiLayerPerceptron.cxx:1489
 TMultiLayerPerceptron.cxx:1490
 TMultiLayerPerceptron.cxx:1491
 TMultiLayerPerceptron.cxx:1492
 TMultiLayerPerceptron.cxx:1493
 TMultiLayerPerceptron.cxx:1494
 TMultiLayerPerceptron.cxx:1495
 TMultiLayerPerceptron.cxx:1496
 TMultiLayerPerceptron.cxx:1497
 TMultiLayerPerceptron.cxx:1498
 TMultiLayerPerceptron.cxx:1499
 TMultiLayerPerceptron.cxx:1500
 TMultiLayerPerceptron.cxx:1501
 TMultiLayerPerceptron.cxx:1502
 TMultiLayerPerceptron.cxx:1503
 TMultiLayerPerceptron.cxx:1504
 TMultiLayerPerceptron.cxx:1505
 TMultiLayerPerceptron.cxx:1506
 TMultiLayerPerceptron.cxx:1507
 TMultiLayerPerceptron.cxx:1508
 TMultiLayerPerceptron.cxx:1509
 TMultiLayerPerceptron.cxx:1510
 TMultiLayerPerceptron.cxx:1511
 TMultiLayerPerceptron.cxx:1512
 TMultiLayerPerceptron.cxx:1513
 TMultiLayerPerceptron.cxx:1514
 TMultiLayerPerceptron.cxx:1515
 TMultiLayerPerceptron.cxx:1516
 TMultiLayerPerceptron.cxx:1517
 TMultiLayerPerceptron.cxx:1518
 TMultiLayerPerceptron.cxx:1519
 TMultiLayerPerceptron.cxx:1520
 TMultiLayerPerceptron.cxx:1521
 TMultiLayerPerceptron.cxx:1522
 TMultiLayerPerceptron.cxx:1523
 TMultiLayerPerceptron.cxx:1524
 TMultiLayerPerceptron.cxx:1525
 TMultiLayerPerceptron.cxx:1526
 TMultiLayerPerceptron.cxx:1527
 TMultiLayerPerceptron.cxx:1528
 TMultiLayerPerceptron.cxx:1529
 TMultiLayerPerceptron.cxx:1530
 TMultiLayerPerceptron.cxx:1531
 TMultiLayerPerceptron.cxx:1532
 TMultiLayerPerceptron.cxx:1533
 TMultiLayerPerceptron.cxx:1534
 TMultiLayerPerceptron.cxx:1535
 TMultiLayerPerceptron.cxx:1536
 TMultiLayerPerceptron.cxx:1537
 TMultiLayerPerceptron.cxx:1538
 TMultiLayerPerceptron.cxx:1539
 TMultiLayerPerceptron.cxx:1540
 TMultiLayerPerceptron.cxx:1541
 TMultiLayerPerceptron.cxx:1542
 TMultiLayerPerceptron.cxx:1543
 TMultiLayerPerceptron.cxx:1544
 TMultiLayerPerceptron.cxx:1545
 TMultiLayerPerceptron.cxx:1546
 TMultiLayerPerceptron.cxx:1547
 TMultiLayerPerceptron.cxx:1548
 TMultiLayerPerceptron.cxx:1549
 TMultiLayerPerceptron.cxx:1550
 TMultiLayerPerceptron.cxx:1551
 TMultiLayerPerceptron.cxx:1552
 TMultiLayerPerceptron.cxx:1553
 TMultiLayerPerceptron.cxx:1554
 TMultiLayerPerceptron.cxx:1555
 TMultiLayerPerceptron.cxx:1556
 TMultiLayerPerceptron.cxx:1557
 TMultiLayerPerceptron.cxx:1558
 TMultiLayerPerceptron.cxx:1559
 TMultiLayerPerceptron.cxx:1560
 TMultiLayerPerceptron.cxx:1561
 TMultiLayerPerceptron.cxx:1562
 TMultiLayerPerceptron.cxx:1563
 TMultiLayerPerceptron.cxx:1564
 TMultiLayerPerceptron.cxx:1565
 TMultiLayerPerceptron.cxx:1566
 TMultiLayerPerceptron.cxx:1567
 TMultiLayerPerceptron.cxx:1568
 TMultiLayerPerceptron.cxx:1569
 TMultiLayerPerceptron.cxx:1570
 TMultiLayerPerceptron.cxx:1571
 TMultiLayerPerceptron.cxx:1572
 TMultiLayerPerceptron.cxx:1573
 TMultiLayerPerceptron.cxx:1574
 TMultiLayerPerceptron.cxx:1575
 TMultiLayerPerceptron.cxx:1576
 TMultiLayerPerceptron.cxx:1577
 TMultiLayerPerceptron.cxx:1578
 TMultiLayerPerceptron.cxx:1579
 TMultiLayerPerceptron.cxx:1580
 TMultiLayerPerceptron.cxx:1581
 TMultiLayerPerceptron.cxx:1582
 TMultiLayerPerceptron.cxx:1583
 TMultiLayerPerceptron.cxx:1584
 TMultiLayerPerceptron.cxx:1585
 TMultiLayerPerceptron.cxx:1586
 TMultiLayerPerceptron.cxx:1587
 TMultiLayerPerceptron.cxx:1588
 TMultiLayerPerceptron.cxx:1589
 TMultiLayerPerceptron.cxx:1590
 TMultiLayerPerceptron.cxx:1591
 TMultiLayerPerceptron.cxx:1592
 TMultiLayerPerceptron.cxx:1593
 TMultiLayerPerceptron.cxx:1594
 TMultiLayerPerceptron.cxx:1595
 TMultiLayerPerceptron.cxx:1596
 TMultiLayerPerceptron.cxx:1597
 TMultiLayerPerceptron.cxx:1598
 TMultiLayerPerceptron.cxx:1599
 TMultiLayerPerceptron.cxx:1600
 TMultiLayerPerceptron.cxx:1601
 TMultiLayerPerceptron.cxx:1602
 TMultiLayerPerceptron.cxx:1603
 TMultiLayerPerceptron.cxx:1604
 TMultiLayerPerceptron.cxx:1605
 TMultiLayerPerceptron.cxx:1606
 TMultiLayerPerceptron.cxx:1607
 TMultiLayerPerceptron.cxx:1608
 TMultiLayerPerceptron.cxx:1609
 TMultiLayerPerceptron.cxx:1610
 TMultiLayerPerceptron.cxx:1611
 TMultiLayerPerceptron.cxx:1612
 TMultiLayerPerceptron.cxx:1613
 TMultiLayerPerceptron.cxx:1614
 TMultiLayerPerceptron.cxx:1615
 TMultiLayerPerceptron.cxx:1616
 TMultiLayerPerceptron.cxx:1617
 TMultiLayerPerceptron.cxx:1618
 TMultiLayerPerceptron.cxx:1619
 TMultiLayerPerceptron.cxx:1620
 TMultiLayerPerceptron.cxx:1621
 TMultiLayerPerceptron.cxx:1622
 TMultiLayerPerceptron.cxx:1623
 TMultiLayerPerceptron.cxx:1624
 TMultiLayerPerceptron.cxx:1625
 TMultiLayerPerceptron.cxx:1626
 TMultiLayerPerceptron.cxx:1627
 TMultiLayerPerceptron.cxx:1628
 TMultiLayerPerceptron.cxx:1629
 TMultiLayerPerceptron.cxx:1630
 TMultiLayerPerceptron.cxx:1631
 TMultiLayerPerceptron.cxx:1632
 TMultiLayerPerceptron.cxx:1633
 TMultiLayerPerceptron.cxx:1634
 TMultiLayerPerceptron.cxx:1635
 TMultiLayerPerceptron.cxx:1636
 TMultiLayerPerceptron.cxx:1637
 TMultiLayerPerceptron.cxx:1638
 TMultiLayerPerceptron.cxx:1639
 TMultiLayerPerceptron.cxx:1640
 TMultiLayerPerceptron.cxx:1641
 TMultiLayerPerceptron.cxx:1642
 TMultiLayerPerceptron.cxx:1643
 TMultiLayerPerceptron.cxx:1644
 TMultiLayerPerceptron.cxx:1645
 TMultiLayerPerceptron.cxx:1646
 TMultiLayerPerceptron.cxx:1647
 TMultiLayerPerceptron.cxx:1648
 TMultiLayerPerceptron.cxx:1649
 TMultiLayerPerceptron.cxx:1650
 TMultiLayerPerceptron.cxx:1651
 TMultiLayerPerceptron.cxx:1652
 TMultiLayerPerceptron.cxx:1653
 TMultiLayerPerceptron.cxx:1654
 TMultiLayerPerceptron.cxx:1655
 TMultiLayerPerceptron.cxx:1656
 TMultiLayerPerceptron.cxx:1657
 TMultiLayerPerceptron.cxx:1658
 TMultiLayerPerceptron.cxx:1659
 TMultiLayerPerceptron.cxx:1660
 TMultiLayerPerceptron.cxx:1661
 TMultiLayerPerceptron.cxx:1662
 TMultiLayerPerceptron.cxx:1663
 TMultiLayerPerceptron.cxx:1664
 TMultiLayerPerceptron.cxx:1665
 TMultiLayerPerceptron.cxx:1666
 TMultiLayerPerceptron.cxx:1667
 TMultiLayerPerceptron.cxx:1668
 TMultiLayerPerceptron.cxx:1669
 TMultiLayerPerceptron.cxx:1670
 TMultiLayerPerceptron.cxx:1671
 TMultiLayerPerceptron.cxx:1672
 TMultiLayerPerceptron.cxx:1673
 TMultiLayerPerceptron.cxx:1674
 TMultiLayerPerceptron.cxx:1675
 TMultiLayerPerceptron.cxx:1676
 TMultiLayerPerceptron.cxx:1677
 TMultiLayerPerceptron.cxx:1678
 TMultiLayerPerceptron.cxx:1679
 TMultiLayerPerceptron.cxx:1680
 TMultiLayerPerceptron.cxx:1681
 TMultiLayerPerceptron.cxx:1682
 TMultiLayerPerceptron.cxx:1683
 TMultiLayerPerceptron.cxx:1684
 TMultiLayerPerceptron.cxx:1685
 TMultiLayerPerceptron.cxx:1686
 TMultiLayerPerceptron.cxx:1687
 TMultiLayerPerceptron.cxx:1688
 TMultiLayerPerceptron.cxx:1689
 TMultiLayerPerceptron.cxx:1690
 TMultiLayerPerceptron.cxx:1691
 TMultiLayerPerceptron.cxx:1692
 TMultiLayerPerceptron.cxx:1693
 TMultiLayerPerceptron.cxx:1694
 TMultiLayerPerceptron.cxx:1695
 TMultiLayerPerceptron.cxx:1696
 TMultiLayerPerceptron.cxx:1697
 TMultiLayerPerceptron.cxx:1698
 TMultiLayerPerceptron.cxx:1699
 TMultiLayerPerceptron.cxx:1700
 TMultiLayerPerceptron.cxx:1701
 TMultiLayerPerceptron.cxx:1702
 TMultiLayerPerceptron.cxx:1703
 TMultiLayerPerceptron.cxx:1704
 TMultiLayerPerceptron.cxx:1705
 TMultiLayerPerceptron.cxx:1706
 TMultiLayerPerceptron.cxx:1707
 TMultiLayerPerceptron.cxx:1708
 TMultiLayerPerceptron.cxx:1709
 TMultiLayerPerceptron.cxx:1710
 TMultiLayerPerceptron.cxx:1711
 TMultiLayerPerceptron.cxx:1712
 TMultiLayerPerceptron.cxx:1713
 TMultiLayerPerceptron.cxx:1714
 TMultiLayerPerceptron.cxx:1715
 TMultiLayerPerceptron.cxx:1716
 TMultiLayerPerceptron.cxx:1717
 TMultiLayerPerceptron.cxx:1718
 TMultiLayerPerceptron.cxx:1719
 TMultiLayerPerceptron.cxx:1720
 TMultiLayerPerceptron.cxx:1721
 TMultiLayerPerceptron.cxx:1722
 TMultiLayerPerceptron.cxx:1723
 TMultiLayerPerceptron.cxx:1724
 TMultiLayerPerceptron.cxx:1725
 TMultiLayerPerceptron.cxx:1726
 TMultiLayerPerceptron.cxx:1727
 TMultiLayerPerceptron.cxx:1728
 TMultiLayerPerceptron.cxx:1729
 TMultiLayerPerceptron.cxx:1730
 TMultiLayerPerceptron.cxx:1731
 TMultiLayerPerceptron.cxx:1732
 TMultiLayerPerceptron.cxx:1733
 TMultiLayerPerceptron.cxx:1734
 TMultiLayerPerceptron.cxx:1735
 TMultiLayerPerceptron.cxx:1736
 TMultiLayerPerceptron.cxx:1737
 TMultiLayerPerceptron.cxx:1738
 TMultiLayerPerceptron.cxx:1739
 TMultiLayerPerceptron.cxx:1740
 TMultiLayerPerceptron.cxx:1741
 TMultiLayerPerceptron.cxx:1742
 TMultiLayerPerceptron.cxx:1743
 TMultiLayerPerceptron.cxx:1744
 TMultiLayerPerceptron.cxx:1745
 TMultiLayerPerceptron.cxx:1746
 TMultiLayerPerceptron.cxx:1747
 TMultiLayerPerceptron.cxx:1748
 TMultiLayerPerceptron.cxx:1749
 TMultiLayerPerceptron.cxx:1750
 TMultiLayerPerceptron.cxx:1751
 TMultiLayerPerceptron.cxx:1752
 TMultiLayerPerceptron.cxx:1753
 TMultiLayerPerceptron.cxx:1754
 TMultiLayerPerceptron.cxx:1755
 TMultiLayerPerceptron.cxx:1756
 TMultiLayerPerceptron.cxx:1757
 TMultiLayerPerceptron.cxx:1758
 TMultiLayerPerceptron.cxx:1759
 TMultiLayerPerceptron.cxx:1760
 TMultiLayerPerceptron.cxx:1761
 TMultiLayerPerceptron.cxx:1762
 TMultiLayerPerceptron.cxx:1763
 TMultiLayerPerceptron.cxx:1764
 TMultiLayerPerceptron.cxx:1765
 TMultiLayerPerceptron.cxx:1766
 TMultiLayerPerceptron.cxx:1767
 TMultiLayerPerceptron.cxx:1768
 TMultiLayerPerceptron.cxx:1769
 TMultiLayerPerceptron.cxx:1770
 TMultiLayerPerceptron.cxx:1771
 TMultiLayerPerceptron.cxx:1772
 TMultiLayerPerceptron.cxx:1773
 TMultiLayerPerceptron.cxx:1774
 TMultiLayerPerceptron.cxx:1775
 TMultiLayerPerceptron.cxx:1776
 TMultiLayerPerceptron.cxx:1777
 TMultiLayerPerceptron.cxx:1778
 TMultiLayerPerceptron.cxx:1779
 TMultiLayerPerceptron.cxx:1780
 TMultiLayerPerceptron.cxx:1781
 TMultiLayerPerceptron.cxx:1782
 TMultiLayerPerceptron.cxx:1783
 TMultiLayerPerceptron.cxx:1784
 TMultiLayerPerceptron.cxx:1785
 TMultiLayerPerceptron.cxx:1786
 TMultiLayerPerceptron.cxx:1787
 TMultiLayerPerceptron.cxx:1788
 TMultiLayerPerceptron.cxx:1789
 TMultiLayerPerceptron.cxx:1790
 TMultiLayerPerceptron.cxx:1791
 TMultiLayerPerceptron.cxx:1792
 TMultiLayerPerceptron.cxx:1793
 TMultiLayerPerceptron.cxx:1794
 TMultiLayerPerceptron.cxx:1795
 TMultiLayerPerceptron.cxx:1796
 TMultiLayerPerceptron.cxx:1797
 TMultiLayerPerceptron.cxx:1798
 TMultiLayerPerceptron.cxx:1799
 TMultiLayerPerceptron.cxx:1800
 TMultiLayerPerceptron.cxx:1801
 TMultiLayerPerceptron.cxx:1802
 TMultiLayerPerceptron.cxx:1803
 TMultiLayerPerceptron.cxx:1804
 TMultiLayerPerceptron.cxx:1805
 TMultiLayerPerceptron.cxx:1806
 TMultiLayerPerceptron.cxx:1807
 TMultiLayerPerceptron.cxx:1808
 TMultiLayerPerceptron.cxx:1809
 TMultiLayerPerceptron.cxx:1810
 TMultiLayerPerceptron.cxx:1811
 TMultiLayerPerceptron.cxx:1812
 TMultiLayerPerceptron.cxx:1813
 TMultiLayerPerceptron.cxx:1814
 TMultiLayerPerceptron.cxx:1815
 TMultiLayerPerceptron.cxx:1816
 TMultiLayerPerceptron.cxx:1817
 TMultiLayerPerceptron.cxx:1818
 TMultiLayerPerceptron.cxx:1819
 TMultiLayerPerceptron.cxx:1820
 TMultiLayerPerceptron.cxx:1821
 TMultiLayerPerceptron.cxx:1822
 TMultiLayerPerceptron.cxx:1823
 TMultiLayerPerceptron.cxx:1824
 TMultiLayerPerceptron.cxx:1825
 TMultiLayerPerceptron.cxx:1826
 TMultiLayerPerceptron.cxx:1827
 TMultiLayerPerceptron.cxx:1828
 TMultiLayerPerceptron.cxx:1829
 TMultiLayerPerceptron.cxx:1830
 TMultiLayerPerceptron.cxx:1831
 TMultiLayerPerceptron.cxx:1832
 TMultiLayerPerceptron.cxx:1833
 TMultiLayerPerceptron.cxx:1834
 TMultiLayerPerceptron.cxx:1835
 TMultiLayerPerceptron.cxx:1836
 TMultiLayerPerceptron.cxx:1837
 TMultiLayerPerceptron.cxx:1838
 TMultiLayerPerceptron.cxx:1839
 TMultiLayerPerceptron.cxx:1840
 TMultiLayerPerceptron.cxx:1841
 TMultiLayerPerceptron.cxx:1842
 TMultiLayerPerceptron.cxx:1843
 TMultiLayerPerceptron.cxx:1844
 TMultiLayerPerceptron.cxx:1845
 TMultiLayerPerceptron.cxx:1846
 TMultiLayerPerceptron.cxx:1847
 TMultiLayerPerceptron.cxx:1848
 TMultiLayerPerceptron.cxx:1849
 TMultiLayerPerceptron.cxx:1850
 TMultiLayerPerceptron.cxx:1851
 TMultiLayerPerceptron.cxx:1852
 TMultiLayerPerceptron.cxx:1853
 TMultiLayerPerceptron.cxx:1854
 TMultiLayerPerceptron.cxx:1855
 TMultiLayerPerceptron.cxx:1856
 TMultiLayerPerceptron.cxx:1857
 TMultiLayerPerceptron.cxx:1858
 TMultiLayerPerceptron.cxx:1859
 TMultiLayerPerceptron.cxx:1860
 TMultiLayerPerceptron.cxx:1861
 TMultiLayerPerceptron.cxx:1862
 TMultiLayerPerceptron.cxx:1863
 TMultiLayerPerceptron.cxx:1864
 TMultiLayerPerceptron.cxx:1865
 TMultiLayerPerceptron.cxx:1866
 TMultiLayerPerceptron.cxx:1867
 TMultiLayerPerceptron.cxx:1868
 TMultiLayerPerceptron.cxx:1869
 TMultiLayerPerceptron.cxx:1870
 TMultiLayerPerceptron.cxx:1871
 TMultiLayerPerceptron.cxx:1872
 TMultiLayerPerceptron.cxx:1873
 TMultiLayerPerceptron.cxx:1874
 TMultiLayerPerceptron.cxx:1875
 TMultiLayerPerceptron.cxx:1876
 TMultiLayerPerceptron.cxx:1877
 TMultiLayerPerceptron.cxx:1878
 TMultiLayerPerceptron.cxx:1879
 TMultiLayerPerceptron.cxx:1880
 TMultiLayerPerceptron.cxx:1881
 TMultiLayerPerceptron.cxx:1882
 TMultiLayerPerceptron.cxx:1883
 TMultiLayerPerceptron.cxx:1884
 TMultiLayerPerceptron.cxx:1885
 TMultiLayerPerceptron.cxx:1886
 TMultiLayerPerceptron.cxx:1887
 TMultiLayerPerceptron.cxx:1888
 TMultiLayerPerceptron.cxx:1889
 TMultiLayerPerceptron.cxx:1890
 TMultiLayerPerceptron.cxx:1891
 TMultiLayerPerceptron.cxx:1892
 TMultiLayerPerceptron.cxx:1893
 TMultiLayerPerceptron.cxx:1894
 TMultiLayerPerceptron.cxx:1895
 TMultiLayerPerceptron.cxx:1896
 TMultiLayerPerceptron.cxx:1897
 TMultiLayerPerceptron.cxx:1898
 TMultiLayerPerceptron.cxx:1899
 TMultiLayerPerceptron.cxx:1900
 TMultiLayerPerceptron.cxx:1901
 TMultiLayerPerceptron.cxx:1902
 TMultiLayerPerceptron.cxx:1903
 TMultiLayerPerceptron.cxx:1904
 TMultiLayerPerceptron.cxx:1905
 TMultiLayerPerceptron.cxx:1906
 TMultiLayerPerceptron.cxx:1907
 TMultiLayerPerceptron.cxx:1908
 TMultiLayerPerceptron.cxx:1909
 TMultiLayerPerceptron.cxx:1910
 TMultiLayerPerceptron.cxx:1911
 TMultiLayerPerceptron.cxx:1912
 TMultiLayerPerceptron.cxx:1913
 TMultiLayerPerceptron.cxx:1914
 TMultiLayerPerceptron.cxx:1915
 TMultiLayerPerceptron.cxx:1916
 TMultiLayerPerceptron.cxx:1917
 TMultiLayerPerceptron.cxx:1918
 TMultiLayerPerceptron.cxx:1919
 TMultiLayerPerceptron.cxx:1920
 TMultiLayerPerceptron.cxx:1921
 TMultiLayerPerceptron.cxx:1922
 TMultiLayerPerceptron.cxx:1923
 TMultiLayerPerceptron.cxx:1924
 TMultiLayerPerceptron.cxx:1925
 TMultiLayerPerceptron.cxx:1926
 TMultiLayerPerceptron.cxx:1927
 TMultiLayerPerceptron.cxx:1928
 TMultiLayerPerceptron.cxx:1929
 TMultiLayerPerceptron.cxx:1930
 TMultiLayerPerceptron.cxx:1931
 TMultiLayerPerceptron.cxx:1932
 TMultiLayerPerceptron.cxx:1933
 TMultiLayerPerceptron.cxx:1934
 TMultiLayerPerceptron.cxx:1935
 TMultiLayerPerceptron.cxx:1936
 TMultiLayerPerceptron.cxx:1937
 TMultiLayerPerceptron.cxx:1938
 TMultiLayerPerceptron.cxx:1939
 TMultiLayerPerceptron.cxx:1940
 TMultiLayerPerceptron.cxx:1941
 TMultiLayerPerceptron.cxx:1942
 TMultiLayerPerceptron.cxx:1943
 TMultiLayerPerceptron.cxx:1944
 TMultiLayerPerceptron.cxx:1945
 TMultiLayerPerceptron.cxx:1946
 TMultiLayerPerceptron.cxx:1947
 TMultiLayerPerceptron.cxx:1948
 TMultiLayerPerceptron.cxx:1949
 TMultiLayerPerceptron.cxx:1950
 TMultiLayerPerceptron.cxx:1951
 TMultiLayerPerceptron.cxx:1952
 TMultiLayerPerceptron.cxx:1953
 TMultiLayerPerceptron.cxx:1954
 TMultiLayerPerceptron.cxx:1955
 TMultiLayerPerceptron.cxx:1956
 TMultiLayerPerceptron.cxx:1957
 TMultiLayerPerceptron.cxx:1958
 TMultiLayerPerceptron.cxx:1959
 TMultiLayerPerceptron.cxx:1960
 TMultiLayerPerceptron.cxx:1961
 TMultiLayerPerceptron.cxx:1962
 TMultiLayerPerceptron.cxx:1963
 TMultiLayerPerceptron.cxx:1964
 TMultiLayerPerceptron.cxx:1965
 TMultiLayerPerceptron.cxx:1966
 TMultiLayerPerceptron.cxx:1967
 TMultiLayerPerceptron.cxx:1968
 TMultiLayerPerceptron.cxx:1969
 TMultiLayerPerceptron.cxx:1970
 TMultiLayerPerceptron.cxx:1971
 TMultiLayerPerceptron.cxx:1972
 TMultiLayerPerceptron.cxx:1973
 TMultiLayerPerceptron.cxx:1974
 TMultiLayerPerceptron.cxx:1975
 TMultiLayerPerceptron.cxx:1976
 TMultiLayerPerceptron.cxx:1977
 TMultiLayerPerceptron.cxx:1978
 TMultiLayerPerceptron.cxx:1979
 TMultiLayerPerceptron.cxx:1980
 TMultiLayerPerceptron.cxx:1981
 TMultiLayerPerceptron.cxx:1982
 TMultiLayerPerceptron.cxx:1983
 TMultiLayerPerceptron.cxx:1984
 TMultiLayerPerceptron.cxx:1985
 TMultiLayerPerceptron.cxx:1986
 TMultiLayerPerceptron.cxx:1987
 TMultiLayerPerceptron.cxx:1988
 TMultiLayerPerceptron.cxx:1989
 TMultiLayerPerceptron.cxx:1990
 TMultiLayerPerceptron.cxx:1991
 TMultiLayerPerceptron.cxx:1992
 TMultiLayerPerceptron.cxx:1993
 TMultiLayerPerceptron.cxx:1994
 TMultiLayerPerceptron.cxx:1995
 TMultiLayerPerceptron.cxx:1996
 TMultiLayerPerceptron.cxx:1997
 TMultiLayerPerceptron.cxx:1998
 TMultiLayerPerceptron.cxx:1999
 TMultiLayerPerceptron.cxx:2000
 TMultiLayerPerceptron.cxx:2001
 TMultiLayerPerceptron.cxx:2002
 TMultiLayerPerceptron.cxx:2003
 TMultiLayerPerceptron.cxx:2004
 TMultiLayerPerceptron.cxx:2005
 TMultiLayerPerceptron.cxx:2006
 TMultiLayerPerceptron.cxx:2007
 TMultiLayerPerceptron.cxx:2008
 TMultiLayerPerceptron.cxx:2009
 TMultiLayerPerceptron.cxx:2010
 TMultiLayerPerceptron.cxx:2011
 TMultiLayerPerceptron.cxx:2012
 TMultiLayerPerceptron.cxx:2013
 TMultiLayerPerceptron.cxx:2014
 TMultiLayerPerceptron.cxx:2015
 TMultiLayerPerceptron.cxx:2016
 TMultiLayerPerceptron.cxx:2017
 TMultiLayerPerceptron.cxx:2018
 TMultiLayerPerceptron.cxx:2019
 TMultiLayerPerceptron.cxx:2020
 TMultiLayerPerceptron.cxx:2021
 TMultiLayerPerceptron.cxx:2022
 TMultiLayerPerceptron.cxx:2023
 TMultiLayerPerceptron.cxx:2024
 TMultiLayerPerceptron.cxx:2025
 TMultiLayerPerceptron.cxx:2026
 TMultiLayerPerceptron.cxx:2027
 TMultiLayerPerceptron.cxx:2028
 TMultiLayerPerceptron.cxx:2029
 TMultiLayerPerceptron.cxx:2030
 TMultiLayerPerceptron.cxx:2031
 TMultiLayerPerceptron.cxx:2032
 TMultiLayerPerceptron.cxx:2033
 TMultiLayerPerceptron.cxx:2034
 TMultiLayerPerceptron.cxx:2035
 TMultiLayerPerceptron.cxx:2036
 TMultiLayerPerceptron.cxx:2037
 TMultiLayerPerceptron.cxx:2038
 TMultiLayerPerceptron.cxx:2039
 TMultiLayerPerceptron.cxx:2040
 TMultiLayerPerceptron.cxx:2041
 TMultiLayerPerceptron.cxx:2042
 TMultiLayerPerceptron.cxx:2043
 TMultiLayerPerceptron.cxx:2044
 TMultiLayerPerceptron.cxx:2045
 TMultiLayerPerceptron.cxx:2046
 TMultiLayerPerceptron.cxx:2047
 TMultiLayerPerceptron.cxx:2048
 TMultiLayerPerceptron.cxx:2049
 TMultiLayerPerceptron.cxx:2050
 TMultiLayerPerceptron.cxx:2051
 TMultiLayerPerceptron.cxx:2052
 TMultiLayerPerceptron.cxx:2053
 TMultiLayerPerceptron.cxx:2054
 TMultiLayerPerceptron.cxx:2055
 TMultiLayerPerceptron.cxx:2056
 TMultiLayerPerceptron.cxx:2057
 TMultiLayerPerceptron.cxx:2058
 TMultiLayerPerceptron.cxx:2059
 TMultiLayerPerceptron.cxx:2060
 TMultiLayerPerceptron.cxx:2061
 TMultiLayerPerceptron.cxx:2062
 TMultiLayerPerceptron.cxx:2063
 TMultiLayerPerceptron.cxx:2064
 TMultiLayerPerceptron.cxx:2065
 TMultiLayerPerceptron.cxx:2066
 TMultiLayerPerceptron.cxx:2067
 TMultiLayerPerceptron.cxx:2068
 TMultiLayerPerceptron.cxx:2069
 TMultiLayerPerceptron.cxx:2070
 TMultiLayerPerceptron.cxx:2071
 TMultiLayerPerceptron.cxx:2072
 TMultiLayerPerceptron.cxx:2073
 TMultiLayerPerceptron.cxx:2074
 TMultiLayerPerceptron.cxx:2075
 TMultiLayerPerceptron.cxx:2076
 TMultiLayerPerceptron.cxx:2077
 TMultiLayerPerceptron.cxx:2078
 TMultiLayerPerceptron.cxx:2079
 TMultiLayerPerceptron.cxx:2080
 TMultiLayerPerceptron.cxx:2081
 TMultiLayerPerceptron.cxx:2082
 TMultiLayerPerceptron.cxx:2083
 TMultiLayerPerceptron.cxx:2084
 TMultiLayerPerceptron.cxx:2085
 TMultiLayerPerceptron.cxx:2086
 TMultiLayerPerceptron.cxx:2087
 TMultiLayerPerceptron.cxx:2088
 TMultiLayerPerceptron.cxx:2089
 TMultiLayerPerceptron.cxx:2090
 TMultiLayerPerceptron.cxx:2091
 TMultiLayerPerceptron.cxx:2092
 TMultiLayerPerceptron.cxx:2093
 TMultiLayerPerceptron.cxx:2094
 TMultiLayerPerceptron.cxx:2095
 TMultiLayerPerceptron.cxx:2096
 TMultiLayerPerceptron.cxx:2097
 TMultiLayerPerceptron.cxx:2098
 TMultiLayerPerceptron.cxx:2099
 TMultiLayerPerceptron.cxx:2100
 TMultiLayerPerceptron.cxx:2101
 TMultiLayerPerceptron.cxx:2102
 TMultiLayerPerceptron.cxx:2103
 TMultiLayerPerceptron.cxx:2104
 TMultiLayerPerceptron.cxx:2105
 TMultiLayerPerceptron.cxx:2106
 TMultiLayerPerceptron.cxx:2107
 TMultiLayerPerceptron.cxx:2108
 TMultiLayerPerceptron.cxx:2109
 TMultiLayerPerceptron.cxx:2110
 TMultiLayerPerceptron.cxx:2111
 TMultiLayerPerceptron.cxx:2112
 TMultiLayerPerceptron.cxx:2113
 TMultiLayerPerceptron.cxx:2114
 TMultiLayerPerceptron.cxx:2115
 TMultiLayerPerceptron.cxx:2116
 TMultiLayerPerceptron.cxx:2117
 TMultiLayerPerceptron.cxx:2118
 TMultiLayerPerceptron.cxx:2119
 TMultiLayerPerceptron.cxx:2120
 TMultiLayerPerceptron.cxx:2121
 TMultiLayerPerceptron.cxx:2122
 TMultiLayerPerceptron.cxx:2123
 TMultiLayerPerceptron.cxx:2124
 TMultiLayerPerceptron.cxx:2125
 TMultiLayerPerceptron.cxx:2126
 TMultiLayerPerceptron.cxx:2127
 TMultiLayerPerceptron.cxx:2128
 TMultiLayerPerceptron.cxx:2129
 TMultiLayerPerceptron.cxx:2130
 TMultiLayerPerceptron.cxx:2131
 TMultiLayerPerceptron.cxx:2132
 TMultiLayerPerceptron.cxx:2133
 TMultiLayerPerceptron.cxx:2134
 TMultiLayerPerceptron.cxx:2135
 TMultiLayerPerceptron.cxx:2136
 TMultiLayerPerceptron.cxx:2137
 TMultiLayerPerceptron.cxx:2138
 TMultiLayerPerceptron.cxx:2139
 TMultiLayerPerceptron.cxx:2140
 TMultiLayerPerceptron.cxx:2141
 TMultiLayerPerceptron.cxx:2142
 TMultiLayerPerceptron.cxx:2143
 TMultiLayerPerceptron.cxx:2144
 TMultiLayerPerceptron.cxx:2145
 TMultiLayerPerceptron.cxx:2146
 TMultiLayerPerceptron.cxx:2147
 TMultiLayerPerceptron.cxx:2148
 TMultiLayerPerceptron.cxx:2149
 TMultiLayerPerceptron.cxx:2150
 TMultiLayerPerceptron.cxx:2151
 TMultiLayerPerceptron.cxx:2152
 TMultiLayerPerceptron.cxx:2153
 TMultiLayerPerceptron.cxx:2154
 TMultiLayerPerceptron.cxx:2155
 TMultiLayerPerceptron.cxx:2156
 TMultiLayerPerceptron.cxx:2157
 TMultiLayerPerceptron.cxx:2158
 TMultiLayerPerceptron.cxx:2159
 TMultiLayerPerceptron.cxx:2160
 TMultiLayerPerceptron.cxx:2161
 TMultiLayerPerceptron.cxx:2162
 TMultiLayerPerceptron.cxx:2163
 TMultiLayerPerceptron.cxx:2164
 TMultiLayerPerceptron.cxx:2165
 TMultiLayerPerceptron.cxx:2166
 TMultiLayerPerceptron.cxx:2167
 TMultiLayerPerceptron.cxx:2168
 TMultiLayerPerceptron.cxx:2169
 TMultiLayerPerceptron.cxx:2170
 TMultiLayerPerceptron.cxx:2171
 TMultiLayerPerceptron.cxx:2172
 TMultiLayerPerceptron.cxx:2173
 TMultiLayerPerceptron.cxx:2174
 TMultiLayerPerceptron.cxx:2175
 TMultiLayerPerceptron.cxx:2176
 TMultiLayerPerceptron.cxx:2177
 TMultiLayerPerceptron.cxx:2178
 TMultiLayerPerceptron.cxx:2179
 TMultiLayerPerceptron.cxx:2180
 TMultiLayerPerceptron.cxx:2181
 TMultiLayerPerceptron.cxx:2182
 TMultiLayerPerceptron.cxx:2183
 TMultiLayerPerceptron.cxx:2184
 TMultiLayerPerceptron.cxx:2185
 TMultiLayerPerceptron.cxx:2186
 TMultiLayerPerceptron.cxx:2187
 TMultiLayerPerceptron.cxx:2188
 TMultiLayerPerceptron.cxx:2189
 TMultiLayerPerceptron.cxx:2190
 TMultiLayerPerceptron.cxx:2191
 TMultiLayerPerceptron.cxx:2192
 TMultiLayerPerceptron.cxx:2193
 TMultiLayerPerceptron.cxx:2194
 TMultiLayerPerceptron.cxx:2195
 TMultiLayerPerceptron.cxx:2196
 TMultiLayerPerceptron.cxx:2197