Log of /trunk/math/mlp/inc/TMLPAnalyzer.h
Revision 22885 - Modified Fri Mar 28 13:57:25 2008 UTC (6 years, 9 months ago) by rdm
File length: 2383 byte(s)
Diff to previous 20882
move the following directories under the new "math" meta directory:
mathcore
mathmore
fftw
foam
fumili
genvector
matrix
minuit
minuit2
mlp
physics
smatrix
splot
unuran
quadp
Revision 13804 - Modified Mon Jan 9 15:47:30 2006 UTC (9 years ago) by brun
Original Path: trunk/mlp/inc/TMLPAnalyzer.h
File length: 2384 byte(s)
Diff to previous 11024
From Christophe Delaere and Andrea Bocci:
Andrea has extended ROOT's TMultiLayerPerceptron a bit to optionally use
cross-entropy errors, which makes it possible to train a network for
pattern classification based on Bayesian posterior probability.
Reference: [Bishop 1995, Neural Networks for Pattern Recognition], in
particular chapter 6.
To achieve this, I had to add the softmax (generalized sigmoid) neuron
function, which in turn required some changes to the neuron itself.
I also added softmax and sigmoid as possible output neurons, which
required some changes to how error back propagation is performed.
Currently, softmax neurons are used only in the output layer, but
everything is set up so that they should also work as hidden units,
provided they form a whole layer.
Revision 11024 - Modified Thu Feb 3 07:29:32 2005 UTC (9 years, 11 months ago) by brun
Original Path: trunk/mlp/inc/TMLPAnalyzer.h
File length: 2349 byte(s)
Diff to previous 10831
From Axel Naumann & Christophe Delaere:
This patch fixes a bug in DrawNetwork, where the histograms' upper edge was smaller than the lower edge (this was causing the corrupted histos in the mlpHiggs tutorial). I updated the new regression methods (some null-pointer checks, better labels) and their documentation. I added the following comment to the documentation of TMultiLayerPerceptron: "(One should still try to pass normalized inputs, e.g. between [0.,1])", and added labels for the output nodes in Draw.
Revision 10831 - Modified Fri Dec 17 22:34:01 2004 UTC (10 years, 1 month ago) by brun
Original Path: trunk/mlp/inc/TMLPAnalyzer.h
File length: 2256 byte(s)
Diff to previous 10822
From Axel Naumann:
As you pointed out, MSVC found a real problem. Your quick correction
still left the underlying logic problem. Fixed now, see attachment. I
also renamed the vars.
On the mlpHiggs result: the result is (within the randomization
variations) unchanged - I did not touch any of the internals of the MLP
algorithm. Only DrawNetwork didn't manage to display the stack of
histos. This looks like a problem with THStack; its side effect shows up
in mlpHiggs. I replaced the THStack->Draw by sigh->Draw(),
bgh->Draw("same") for now, leaving a reminder comment that this needs to
be fixed. I'll look into that later.
My new methods don't work on the mlpHiggs example (they do work on my
private use case, though). There is a problem creating the TProfile
histos for the mlpHiggs tutorial which I didn't find after chasing it
down for hours. I need more time for that.
I forgot one deletion; now the legends are only created if the options
don't contain "goff". The profiles are now filled with O-T:T, not
(O-T)/T:T (O: output value, T: truth value), to avoid division by zero.
One THStack had an invalid Form()'ed title.
TMLP::Train now tells how many test and train events are used.
Revision 10822 - Modified Thu Dec 16 21:20:47 2004 UTC (10 years, 1 month ago) by brun
Original Path: trunk/mlp/inc/TMLPAnalyzer.h
File length: 2188 byte(s)
Diff to previous 10183
From Axel Naumann:
I've added four utility methods to TMLPAnalyzer. ANNs used to represent
an unknown function (i.e. not for classification) had no appropriate
tools for testing the output quality.
For the sake of clarity I made the TNeurons derive from TNamed; they now
get names assigned: the first and last layers use their TTreeFormula,
hidden layers use Form("HiddenL%d:N%d",layer,i). This allows quick
access to the nodes' names when drawing their input/output, e.g. for
axis labels.
There was a small bug in the "vary the inputs by a bit, look what
happens to the output" algorithm. The input wasn't reset to its original
value before the next node's input was modified, creating "cross talk".
The loops are also a bit more efficient now.
The status graph is now only updated once per round (works for me, even
with the zoomed axis).
The example in the class description is fixed (no type specifiers "/F").
The constructor documentation for the const char* test/train cuts is fixed.
TMLP::DrawResult now has the option "nocanv", which doesn't create a new
canvas (it's this way around for backwards compatibility).
tutorials/mlpHiggs now has the proper orthogonal train and test cut.
TMLPAnalyzer now creates a TTree containing the input, true output, and
real output - good for quick plots of any possible dependencies one can
think of. This is used to plot the relative difference of (output,true)
vs true (by DrawTruthDeviation) and (output,true) vs input (by
DrawTruthDeviationInOut). The former shows how dependent the error is on
the output value; the latter, how much it depends on the input. These
histos are plotted (and returned) as TProfiles, showing the mean
deviation (and the standard deviation of the deviation) vs the true or
input value.
Revision 8766 - Added Mon May 3 16:30:12 2004 UTC (10 years, 8 months ago) by brun
Original Path: trunk/mlp/inc/TMLPAnalyzer.h
File length: 1100 byte(s)
From Christophe Delaere:
New class TMLPAnalyzer: this class groups several utilities designed for
analyzing a neural network.
The tutorial mlpHiggs.C has been modified to illustrate this new class.