mlpRegression.C File Reference

Detailed Description

This macro shows the use of an ANN (a TMultiLayerPerceptron) for regression analysis: given a set of input vectors {i} and the corresponding set of output values {o}, one looks for the unknown function f(i)=o.

The ANN can approximate this function; the TMLPAnalyzer::DrawTruthDeviations and TMLPAnalyzer::DrawTruthDeviationInsOut methods can then be used to evaluate the quality of the approximation.

For simplicity, we use a known function to create test and training data. In reality this function is usually not known, and the data comes e.g. from measurements.
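Once trained, the network itself is the approximation of f: it can be queried for an arbitrary (x,y) point with TMultiLayerPerceptron::Evaluate, exactly as the macro below does when building its extrapolation plot. A minimal sketch, assuming mlp and theUnknownFunction from the macro below; the point (0.5, 0.5) is just an arbitrary choice inside the training range:

Double_t in[2] = {0.5, 0.5};                    // one (x,y) point
Double_t approx = mlp->Evaluate(0, in);         // ANN estimate of f(x,y) at output node 0
Double_t truth  = theUnknownFunction(0.5, 0.5); // the function the ANN was trained on
printf("ANN: %g   truth: %g   difference: %g\n", approx, truth, approx - truth);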

Output of the macro (the network layout "x,y:10:8:f" means two input neurons x and y, two hidden layers of 10 and 8 neurons, and one output neuron f):

Network with structure: x,y:10:8:f
inputs with low values in the differences plot may not be needed
x -> 0.0837213 +/- 0.0429966
y -> 0.0818201 +/- 0.039177
Double_t theUnknownFunction(Double_t x, Double_t y) {
   return sin((1.7+x)*(x-0.3)-2.3*(y+0.7));
}
void mlpRegression() {
   // create a tree with train and test data.
   // we have two input parameters x and y,
   // and one output value f(x,y)
   TNtuple* t=new TNtuple("tree","tree","x:y:f");
   TRandom r;
   for (Int_t i=0; i<1000; i++) {
      Float_t x=r.Rndm();
      Float_t y=r.Rndm();
      // fill it with x, y, and f(x,y) - usually this function
      // is not known, and the value of f given an x and a y comes
      // e.g. from measurements
      t->Fill(x,y,theUnknownFunction(x,y));
   }
   // create ANN
   TMultiLayerPerceptron* mlp=new TMultiLayerPerceptron("x,y:10:8:f", t,
                                                        "Entry$%2", "(Entry$%2)==0");
   mlp->Train(150,"graph update=10");
   // analyze it
   TMLPAnalyzer* mlpa=new TMLPAnalyzer(mlp);
   mlpa->GatherInformations();
   mlpa->CheckNetwork();
   mlpa->DrawDInputs();
   // draw statistics showing the quality of the ANN's approximation
   TCanvas* cIO=new TCanvas("TruthDeviation", "TruthDeviation");
   cIO->Divide(2,2);
   cIO->cd(1);
   // draw the difference between the ANN's output for (x,y) and
   // the true value f(x,y), vs. f(x,y), as TProfiles
   mlpa->DrawTruthDeviations();
   cIO->cd(2);
   // draw the difference between the ANN's output for (x,y) and
   // the true value f(x,y), vs. x, and vs. y, as TProfiles
   mlpa->DrawTruthDeviationInsOut();
   cIO->cd(3);
   // draw a box plot of the ANN's output for (x,y) vs f(x,y)
   mlpa->GetIOTree()->Draw("Out.Out0-True.True0:True.True0>>hDelta","","goff");
   TH2F* hDelta=(TH2F*)gDirectory->Get("hDelta");
   hDelta->SetTitle("Difference between ANN output and truth vs. truth");
   hDelta->Draw("BOX");
   cIO->cd(4);
   // draw difference of ANN's output for (x,y) vs f(x,y) assuming
   // the ANN can extrapolate
   Double_t vx[225];
   Double_t vy[225];
   Double_t delta[225];
   Double_t v[2];
   for (Int_t ix=0; ix<15; ix++) {
      v[0]=ix/5.-1.;
      for (Int_t iy=0; iy<15; iy++) {
         v[1]=iy/5.-1.;
         Int_t idx=ix*15+iy;
         vx[idx]=v[0];
         vy[idx]=v[1];
         delta[idx]=mlp->Evaluate(0, v)-theUnknownFunction(v[0],v[1]);
      }
   }
   TGraph2D* g2Extrapolate=new TGraph2D("ANN extrapolation",
                                        "ANN extrapolation, ANN output - truth",
                                        225, vx, vy, delta);
   g2Extrapolate->Draw("TRI2");
}
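The macro can be run as usual with root -l mlpRegression.C. As a possible extension (not part of the tutorial itself), the trained network can also be persisted or turned into standalone code with TMultiLayerPerceptron::DumpWeights, LoadWeights and Export; a sketch, assuming mlp is the trained network from above (the file and function names are arbitrary examples):

mlp->DumpWeights("mlpRegression.weights"); // save the trained weights to a text file
mlp->LoadWeights("mlpRegression.weights"); // restore them later into an identically configured network
mlp->Export("theANN", "C++");              // generate standalone C++ code that evaluates the network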
Author
Axel Naumann, 2005-02-02

Definition in file mlpRegression.C.