Log of /trunk/math/mlp/inc/TMultiLayerPerceptron.h
Revision 36832 - Modified Mon Nov 22 08:53:49 2010 UTC by brun
File length: 8653 byte(s)
Diff to previous 29964
From Christophe Delaere:
Added a third parameter to set the error limit, and the behavior is then controlled
by extra options:
// - "minErrorTrain" (stop when the NN error on the training sample gets below minE)
// - "minErrorTest" (stop when the NN error on the test sample gets below minE)
The limit on epochs is always active but can be set to a high value. This
guarantees the stopping condition.
Revision 22885 - Modified Fri Mar 28 13:57:25 2008 UTC by rdm
File length: 8458 byte(s)
Diff to previous 22428
move the following directories under the new "math" meta directory:
mathcore
mathmore
fftw
foam
fumili
genvector
matrix
minuit
minuit2
mlp
physics
smatrix
splot
unuran
quadp
Revision 13804 - Modified Mon Jan 9 15:47:30 2006 UTC by brun
Original Path: trunk/mlp/inc/TMultiLayerPerceptron.h
File length: 8370 byte(s)
Diff to previous 12329
From Christophe Delaere and Andrea Bocci:
Andrea has extended ROOT's TMultiLayerPerceptron a bit to
optionally use cross-entropy errors, which allows training a network
for pattern classification based on Bayesian posterior probability.
Reference: [Bishop 1995, Neural Networks for Pattern Recognition], in
particular chapter 6.
In order to achieve this, I had to add the softmax (generalized
sigmoid) neuron function, which in turn required a few changes to
the neuron itself.
I also added softmax and sigmoid as possible output neurons, requiring
some changes to how error back-propagation is performed.
Currently, softmax neurons are used only in the output layer, but
everything is set up so that they should also work as hidden units,
provided they form a whole layer.
Revision 12329 - Modified Mon Jul 18 12:02:02 2005 UTC by brun
Original Path: trunk/mlp/inc/TMultiLayerPerceptron.h
File length: 8189 byte(s)
Diff to previous 10287
From Christophe Delaere:
- Input normalization is now optional. A "@" must be added at the beginning of
the input description to enforce normalization.
- The input/output normalization is now saved/loaded with the weights
(DumpWeights() / LoadWeights()).
- The input/output normalization is taken into account when a function is
exported (C++, FORTRAN, PYTHON).
- The neuron transfer function can now be chosen, either as a predefined
function (sigmoid (default), tanh, gauss, linear) or as an external function
(TFormula).
- Arrays can now be used as input. If no index is specified, a neuron will be
created for each element in the array. Only fixed-size arrays are handled
this way.
- TChains can now be used without crashing.
- Bugfix in TMultiLayerPerceptron::DrawResult() (thanks to Axel): the training
sample was always used, ignoring the option field.
Revision 7159 - Added Wed Aug 27 15:31:14 2003 UTC by brun
Original Path: trunk/mlp/inc/TMultiLayerPerceptron.h
File length: 5251 byte(s)
New package "mlp" ("MultiLayerPerceptron") by Christophe Delaere.
The package has 3 classes: TMultiLayerPerceptron, TNeuron, TSynapse.
// TMultiLayerPerceptron
//
// This class describes a neural network.
// There are facilities to train the network and use the output.
//
// The input layer is made of inactive neurons (returning the
// normalized input), hidden layers are made of sigmoids and output
// neurons are linear.
//
// The basic input is a TTree and two (training and test) TEventLists.
// For classification jobs, a branch (maybe in a TFriend) must contain
// the expected output.
// 6 learning methods are available: kStochastic, kBatch,
// kSteepestDescent, kRibierePolak, kFletcherReeves and kBFGS.
//
// This implementation is *inspired* by the mlpfit package from
// J.Schwindling et al.