Log of /trunk/math/mlp/inc/TNeuron.h
Revision 22885 - Modified Fri Mar 28 13:57:25 2008 UTC by rdm
File length: 4602 byte(s)
Diff to previous 22428
move the following directories under the new "math" meta directory:
mathcore
mathmore
fftw
foam
fumili
genvector
matrix
minuit
minuit2
mlp
physics
smatrix
splot
unuran
quadp
Revision 13804 - Modified Mon Jan 9 15:47:30 2006 UTC by brun
Original Path: trunk/mlp/inc/TNeuron.h
File length: 4551 byte(s)
Diff to previous 12329
From Christophe Delaere and Andrea Bocci:
Andrea has extended ROOT's TMultiLayerPerceptron a bit to
optionally use cross-entropy errors, which makes it possible to train a
network for pattern classification based on Bayesian posterior probability.
Reference: [Bishop 1995, Neural Networks for Pattern Recognition], in
particular chapter 6.
In order to achieve this, I had to add the softmax (generalized
sigmoid) neuron function, which in turn required some changes to
the neuron itself.
I also added softmax and sigmoid as possible output neurons, requiring
some changes to how error back-propagation is performed.
Currently, softmax neurons are used only in the output layer, but
everything is set up so that they should also work as hidden units,
provided they form a whole layer.
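To make the cross-entropy/softmax combination concrete, here is a minimal
standalone C++ sketch. It is not the TNeuron implementation itself; the
function names softmax and crossEntropy are illustrative only.

   #include <cmath>
   #include <cstdio>
   #include <vector>

   // Softmax: generalized sigmoid mapping raw neuron sums to posterior-like
   // probabilities that add up to 1 across the layer.
   std::vector<double> softmax(const std::vector<double> &raw)
   {
      double maxRaw = raw[0];
      for (double v : raw)
         if (v > maxRaw) maxRaw = v;          // shift for numerical stability
      std::vector<double> out(raw.size());
      double sum = 0;
      for (size_t i = 0; i < raw.size(); ++i) {
         out[i] = std::exp(raw[i] - maxRaw);
         sum += out[i];
      }
      for (double &v : out) v /= sum;
      return out;
   }

   // Cross-entropy error E = -sum_k t_k * ln(y_k) (Bishop 1995, ch. 6).
   double crossEntropy(const std::vector<double> &y, const std::vector<double> &t)
   {
      double e = 0;
      for (size_t k = 0; k < y.size(); ++k)
         if (t[k] > 0) e -= t[k] * std::log(y[k]);
      return e;
   }

   int main()
   {
      std::vector<double> raw = {1.0, 2.0, 0.5};   // raw output-layer sums
      std::vector<double> t   = {0.0, 1.0, 0.0};   // one-hot target class
      std::vector<double> y   = softmax(raw);
      std::printf("E = %f\n", crossEntropy(y, t));
      return 0;
   }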
Revision 12329 - Modified Mon Jul 18 12:02:02 2005 UTC by brun
Original Path: trunk/mlp/inc/TNeuron.h
File length: 4136 byte(s)
Diff to previous 10822
From Christophe Delaere:
- input normalization is now optional. A "@" must be added at the beginning
  of an input's description to enforce normalization (see the sketch after
  this list).
- the input/output normalization is now saved/loaded with the weights
  (DumpWeights() / LoadWeights())
- the input/output normalization is taken into account when a function is
  exported (C++, FORTRAN, PYTHON)
- the neuron transfer function can now be chosen, either as a predefined
  function (sigmoid (default), tanh, gauss, linear) or as an external
  function (TFormula).
- arrays can now be used as input. If no index is specified, a neuron will
  be created for each element in the array. Only fixed-size arrays are
  handled this way.
- TChains can now be used without crashing.
- bugfix in TMultiLayerPerceptron::DrawResult() (thanks to Axel): the
  training sample was always used, ignoring the option field.
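A minimal sketch of the points above, in the style of the mlpHiggs
tutorial. The file, tree and branch names (data.root, tree, x, y, type)
are hypothetical; the calls follow TMultiLayerPerceptron's public
interface as best I can tell.

   #include "TFile.h"
   #include "TTree.h"
   #include "TMultiLayerPerceptron.h"

   void train_mlp()
   {
      TFile *f = TFile::Open("data.root");      // hypothetical input file
      TTree *tree = (TTree *)f->Get("tree");    // hypothetical tree
      // "@x,@y:5:type": two normalized inputs ("@"), one hidden layer of
      // 5 neurons with the default sigmoid transfer function, one output.
      TMultiLayerPerceptron mlp("@x,@y:5:type", tree,
                                "Entry$%2==0",  // training sample
                                "Entry$%2==1"); // test sample
      mlp.Train(100, "text,update=10");
      mlp.DumpWeights("weights.txt"); // normalization is saved with the weights
      mlp.Export("net", "C++");       // exported code takes it into account
   }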
Revision 10822 - Modified Thu Dec 16 21:20:47 2004 UTC by brun
Original Path: trunk/mlp/inc/TNeuron.h
File length: 3854 byte(s)
Diff to previous 10183
From Axel Naumann
I've added four utility methods to TMLPAnalyzer. ANNs used to represent
an unknown function (i.e. not for classification) had no appropriate
tools for testing the output quality.
For the sake of clarity I made TNeuron derive from TNamed; neurons now
get names assigned. First- and last-layer neurons are named after their
TTreeFormula; hidden-layer neurons are named
Form("HiddenL%d:N%d",layer,i). This allows quick access to the nodes'
names when drawing their input/output, e.g. for axis labels.
There was a small bug in the "vary the inputs by a bit, look what
happens to the output" algorithm. The input wasn't reset to its original
value before the next node's input was modified, creating "cross talk".
The loops are also a bit more efficient now.
The status graph is now only updated once per round (works for me, even
with the zoomed axis).
The example in the class description is fixed (no type specifiers "/F").
The constructor documentation for const char* test/train cuts is fixed.
TMLP::DrawResult now has the option "nocanv", which doesn't create a new
canvas (it's this way around for backwards compatibility).
tutorials/mlpHiggs now has the proper orthogonal train and test cut.
TMLPAnalyzer now creates a TTree containing the input, true output and
real output - good for quick plots of any possible dependencies one can
think of. This is used to plot the relative difference of (output,true)
vs true (by DrawTruthDeviation) and (output,true) vs input (by
DrawTruthDeviationInOut). The former shows how dependent the error is on
the output value, the latter how much it depends on the input. These
histos are plotted (and returned) as TProfiles, showing the mean
deviation (and the std dev of the deviation) vs true or input value.
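A short sketch of how these tools chain together, assuming an already
trained network. The method names are the ones quoted above; the argument
conventions (node indices defaulting to 0) are assumptions.

   #include "TMultiLayerPerceptron.h"
   #include "TMLPAnalyzer.h"

   void analyze(TMultiLayerPerceptron *mlp)  // an already trained network
   {
      TMLPAnalyzer ana(mlp);
      ana.GatherInformations();        // fills the input/true/real-output TTree
      ana.CheckNetwork();              // summary of the "vary the inputs" scan
      ana.DrawDInputs();               // effect of each input on the output
      ana.DrawTruthDeviation();        // (output-true)/true vs true, a TProfile
      ana.DrawTruthDeviationInOut(0);  // (output-true)/true vs input node 0
   }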
Revision 7219 - Modified Fri Sep 5 10:40:01 2003 UTC by brun
Original Path: trunk/mlp/inc/TNeuron.h
File length: 3201 byte(s)
Diff to previous 7181
Update to the Neural Net package by Christophe Delaere:
- Problem in writing TNeuron solved.
- Problem reading a TChain solved.
- The empty canvas that appeared when normalizing a neuron no longer appears.
- Option to normalize output added.
- A few problems in the documentation corrected.
Revision 7159 - Added Wed Aug 27 15:31:14 2003 UTC by brun
Original Path: trunk/mlp/inc/TNeuron.h
File length: 3195 byte(s)
New package "mlp" (MultiLayerPerceptron) by Christophe Delaere.
The package has 3 classes: TMultiLayerPerceptron, TNeuron and TSynapse.
// TMultiLayerPerceptron
//
// This class describes a neural network.
// There are facilities to train the network and use the output.
//
// The input layer is made of inactive neurons (returning the
// normalized input), hidden layers are made of sigmoids and output
// neurons are linear.
//
// The basic input is a TTree and two (training and test) TEventLists.
// For classification jobs, a branch (maybe in a TFriend) must contain
// the expected output.
// 6 learning methods are available: kStochastic, kBatch,
// kSteepestDescent, kRibierePolak, kFletcherReeves and kBFGS.
//
// This implementation is *inspired* by the mlpfit package from
// J. Schwindling et al.
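A minimal usage sketch following this description. The tree and branch
names (simu, x, y, type) are hypothetical, and string selections are used
here instead of TEventLists.

   #include <cstdio>
   #include "TTree.h"
   #include "TMultiLayerPerceptron.h"

   void run(TTree *simu)  // hypothetical tree with branches x, y and type
   {
      // 2 inputs, one hidden layer of 5 sigmoids, 1 linear output ("type")
      TMultiLayerPerceptron mlp("x,y:5:type", simu,
                                "Entry$%2==0",   // training events
                                "Entry$%2==1");  // test events
      mlp.SetLearningMethod(TMultiLayerPerceptron::kBFGS); // one of the 6
      mlp.Train(50, "text,graph,update=10");
      Double_t params[2] = {0.5, -1.2};
      std::printf("network output: %f\n", mlp.Evaluate(0, params));
   }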