ROOT 6.18/05
Reference Guide
TMVAMulticlass.C File Reference

Detailed Description

This macro provides a simple example for the training and testing of TMVA multiclass classification.

==> Start TMVAMulticlass
Creating testdata....
... event: 0 (2000)
... event: 1000 (2000)
======> EVENT:0
var1 = -1.14361
var2 = -0.822373
var3 = -0.395426
var4 = -0.529427
created tree: TreeS
... event: 0 (2000)
... event: 1000 (2000)
======> EVENT:0
var1 = -1.54361
var2 = -1.42237
var3 = -1.39543
var4 = -2.02943
created tree: TreeB0
... event: 0 (2000)
... event: 1000 (2000)
======> EVENT:0
var1 = -1.54361
var2 = -0.822373
var3 = -0.395426
var4 = -2.02943
created tree: TreeB1
======> EVENT:0
var1 = 0.463304
var2 = 1.37192
var3 = -1.16769
var4 = -1.77551
created tree: TreeB2
created data file: tmva_example_multiple_background.root
created tmva_example_multiple_background.root for tests of the multiclass features
DataSetInfo : [dataset] : Added class "Signal"
: Add Tree TreeS of type Signal with 2000 events
DataSetInfo : [dataset] : Added class "bg0"
: Add Tree TreeB0 of type bg0 with 2000 events
DataSetInfo : [dataset] : Added class "bg1"
: Add Tree TreeB1 of type bg1 with 2000 events
DataSetInfo : [dataset] : Added class "bg2"
: Add Tree TreeB2 of type bg2 with 2000 events
: Dataset[dataset] : Class index : 0 name : Signal
: Dataset[dataset] : Class index : 1 name : bg0
: Dataset[dataset] : Class index : 2 name : bg1
: Dataset[dataset] : Class index : 3 name : bg2
Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
DataSetFactory : [dataset] : Number of events in input trees
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 1000
: Signal -- testing events : 1000
: Signal -- training and testing events: 2000
: bg0 -- training events : 1000
: bg0 -- testing events : 1000
: bg0 -- training and testing events: 2000
: bg1 -- training events : 1000
: bg1 -- testing events : 1000
: bg1 -- training and testing events: 2000
: bg2 -- training events : 1000
: bg2 -- testing events : 1000
: bg2 -- training and testing events: 2000
:
DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.397 +0.623 +0.832
: var2: +0.397 +1.000 +0.716 +0.737
: var3: +0.623 +0.716 +1.000 +0.859
: var4: +0.832 +0.737 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg0):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.365 +0.592 +0.811
: var2: +0.365 +1.000 +0.708 +0.740
: var3: +0.592 +0.708 +1.000 +0.859
: var4: +0.811 +0.740 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg1):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.407 +0.610 +0.834
: var2: +0.407 +1.000 +0.710 +0.741
: var3: +0.610 +0.710 +1.000 +0.851
: var4: +0.834 +0.741 +0.851 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg2):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 -0.647 -0.016 -0.013
: var2: -0.647 +1.000 +0.015 +0.002
: var3: -0.016 +0.015 +1.000 -0.024
: var4: -0.013 +0.002 -0.024 +1.000
: ----------------------------------------
DataSetFactory : [dataset] :
:
Factory : Booking method: MLP
:
MLP : Building Network.
: Initializing weights
Factory : Train all methods
Factory : [dataset] : Create Transformation "I" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "P" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "G" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.047647 1.0025 [ -3.6592 3.2645 ]
: var2: 0.32647 1.0646 [ -3.6891 3.7877 ]
: var3: 0.11493 1.1230 [ -4.5727 4.5640 ]
: var4: -0.076531 1.2652 [ -4.8486 5.0412 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.082544 1.0000 [ -3.6274 3.1017 ]
: var2: 0.36715 1.0000 [ -3.3020 3.4950 ]
: var3: 0.066865 1.0000 [ -2.9882 3.3086 ]
: var4: -0.20593 1.0000 [ -3.3088 2.8423 ]
: -----------------------------------------------------------
: Preparing the Principal Component (PCA) transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 5.7502e-10 1.8064 [ -8.0344 7.8312 ]
: var2:-1.6078e-11 0.90130 [ -2.6765 2.7523 ]
: var3: 3.0841e-10 0.73386 [ -2.6572 2.2255 ]
: var4:-2.6886e-10 0.62168 [ -1.7384 2.2297 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.013510 1.0000 [ -2.6520 6.2074 ]
: var2: 0.0096839 1.0000 [ -2.8402 6.3073 ]
: var3: 0.010397 1.0000 [ -3.0251 5.8860 ]
: var4: 0.0053980 1.0000 [ -3.0998 5.7078 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
Factory : Train method: BDTG for Multiclass classification
:
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 4000 events: 5.57 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of BDTG on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 1.67 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_BDTG.class.C
: TMVAMulticlass.root:/dataset/Method_BDT/BDTG
Factory : Training finished
:
Factory : Train method: MLP for Multiclass classification
:
: Training Network
:
: Elapsed time for training with 4000 events: 24.1 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of MLP on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0147 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_MLP.class.C
: Write special histos to file: TMVAMulticlass.root:/dataset/Method_MLP/MLP
Factory : Training finished
:
: Ranking input variables (method specific)...
BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 3.117e-01
: 2 : var1 : 2.504e-01
: 3 : var2 : 2.430e-01
: 4 : var3 : 1.949e-01
: --------------------------------------
MLP : Ranking result (top variable is best ranked)
: -----------------------------
: Rank : Variable : Importance
: -----------------------------
: 1 : var4 : 6.076e+01
: 2 : var2 : 4.824e+01
: 3 : var1 : 2.116e+01
: 4 : var3 : 1.692e+01
: -----------------------------
Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Reading weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
MLP : Building Network.
: Initializing weights
Factory : Test all methods
Factory : Test method: BDTG for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of BDTG on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 1.03 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: MLP for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of MLP on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0146 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Evaluate all methods
: Evaluate multiclass classification method: BDTG
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: MLP
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_MLP : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
:
: 1-vs-rest performance metrics per class
: -------------------------------------------------------------------------------------------------------
:
: Considers the listed class as signal and the other classes
: as background, reporting the resulting binary performance.
: A score of 0.820 (0.850) means 0.820 was achieved on the
: test set and 0.850 on the training set.
:
: Dataset MVA Method ROC AUC Sig eff@B=0.01 Sig eff@B=0.10 Sig eff@B=0.30
: Name: / Class: test (train) test (train) test (train) test (train)
:
: dataset BDTG
: ------------------------------
: Signal 0.968 (0.978) 0.508 (0.605) 0.914 (0.945) 0.990 (0.996)
: bg0 0.910 (0.931) 0.256 (0.288) 0.737 (0.791) 0.922 (0.956)
: bg1 0.947 (0.954) 0.437 (0.511) 0.833 (0.856) 0.971 (0.971)
: bg2 0.978 (0.982) 0.585 (0.678) 0.951 (0.956) 0.999 (0.996)
:
: dataset MLP
: ------------------------------
: Signal 0.970 (0.975) 0.596 (0.632) 0.933 (0.938) 0.988 (0.993)
: bg0 0.929 (0.934) 0.303 (0.298) 0.787 (0.793) 0.949 (0.961)
: bg1 0.962 (0.967) 0.467 (0.553) 0.881 (0.906) 0.985 (0.992)
: bg2 0.975 (0.979) 0.629 (0.699) 0.929 (0.940) 0.998 (0.998)
:
: -------------------------------------------------------------------------------------------------------
:
:
: Confusion matrices for all methods
: -------------------------------------------------------------------------------------------------------
:
: Does a binary comparison between the two classes given by a
: particular row-column combination. In each case, the class
: given by the row is considered signal while the class given
: by the column index is considered background.
:
: === Showing confusion matrix for method : BDTG
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.497 (0.373) 0.710 (0.693) 0.680 (0.574)
: bg0 0.271 (0.184) - 0.239 (0.145) 0.705 (0.667)
: bg1 0.855 (0.766) 0.369 (0.222) - 0.587 (0.578)
: bg2 0.714 (0.585) 0.705 (0.581) 0.648 (0.601) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.911 (0.853) 0.991 (0.981) 0.945 (0.913)
: bg0 0.833 (0.774) - 0.654 (0.582) 0.930 (0.901)
: bg1 0.971 (0.980) 0.716 (0.681) - 0.871 (0.862)
: bg2 0.976 (0.951) 0.971 (0.973) 0.936 (0.941) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.957) 0.999 (1.000) 0.998 (0.997)
: bg0 0.965 (0.926) - 0.874 (0.835) 0.991 (0.976)
: bg1 1.000 (0.999) 0.916 (0.894) - 0.988 (0.985)
: bg2 0.999 (0.999) 0.997 (0.999) 0.996 (0.997) -
:
: === Showing confusion matrix for method : MLP
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.465 (0.490) 0.974 (0.953) 0.632 (0.498)
: bg0 0.320 (0.269) - 0.224 (0.250) 0.655 (0.627)
: bg1 0.943 (0.920) 0.341 (0.275) - 0.632 (0.687)
: bg2 0.665 (0.642) 0.697 (0.680) 0.706 (0.598) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.865 (0.854) 0.996 (0.994) 0.908 (0.907)
: bg0 0.784 (0.776) - 0.666 (0.655) 0.919 (0.895)
: bg1 0.998 (0.998) 0.791 (0.785) - 0.912 (0.902)
: bg2 0.943 (0.903) 0.946 (0.939) 0.924 (0.928) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.964) 0.997 (0.997) 0.993 (0.986)
: bg0 0.952 (0.924) - 0.936 (0.928) 0.992 (0.990)
: bg1 1.000 (1.000) 0.945 (0.936) - 0.998 (0.995)
: bg2 0.994 (0.985) 0.998 (0.998) 0.998 (0.998) -
:
: -------------------------------------------------------------------------------------------------------
:
Dataset:dataset : Created tree 'TestTree' with 4000 events
:
Dataset:dataset : Created tree 'TrainTree' with 4000 events
:
Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
==> Wrote root file: TMVAMulticlass.root
==> TMVAMulticlass is done!
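The efficiency columns in the output above (and the confusion-matrix entries, which are the same quantity computed pairwise) report the signal efficiency obtained at a fixed background-efficiency working point. A minimal, ROOT-free sketch of how such a working point is computed from per-event classifier scores (the function name and the toy scores are illustrative, not part of the macro):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <functional>
#include <vector>

// Signal efficiency at a fixed background efficiency, as in the
// "Sig eff@B=0.10" columns: sort the background scores descending,
// find the cut that lets the requested fraction of background pass,
// then count the signal events above that cut.
// Convention: higher score = more signal-like.
double sigEffAtBkgEff(std::vector<double> sig, std::vector<double> bkg, double bkgEff)
{
    std::sort(bkg.begin(), bkg.end(), std::greater<double>());
    // cut value such that floor(bkgEff * N_bkg) background events pass
    const std::size_t nPass = static_cast<std::size_t>(bkgEff * bkg.size());
    const double cut = (nPass == 0) ? bkg.front() + 1.0 : bkg[nPass - 1];
    std::size_t nSig = 0;
    for (double s : sig)
        if (s >= cut) ++nSig;
    return static_cast<double>(nSig) / sig.size();
}
```

Scanning this quantity over all background-efficiency values traces out the ROC curve whose area is quoted in the "ROC AUC" column.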
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

#include "TFile.h"
#include "TTree.h"
#include "TString.h"
#include "TSystem.h"
#include "TROOT.h"

#include "TMVA/Tools.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/TMVAMultiClassGui.h"

using namespace TMVA;
void TMVAMulticlass( TString myMethodList = "" )
{
   // This loads the library
   TMVA::Tools::Instance();

   // to get access to the GUI and all tmva macros
   //
   //     TString tmva_dir(TString(gRootDir) + "/tmva");
   //     if(gSystem->Getenv("TMVASYS"))
   //        tmva_dir = TString(gSystem->Getenv("TMVASYS"));
   //     gROOT->SetMacroPath(tmva_dir + "/test/:" + gROOT->GetMacroPath() );
   //     gROOT->ProcessLine(".L TMVAMultiClassGui.C");

   //---------------------------------------------------------------
   // Default MVA methods to be trained + tested
   std::map<std::string,int> Use;
   Use["MLP"]     = 1;
   Use["BDTG"]    = 1;
   Use["DNN_CPU"] = 0;
   Use["FDA_GA"]  = 0;
   Use["PDEFoam"] = 0;
   //---------------------------------------------------------------

   std::cout << std::endl;
   std::cout << "==> Start TMVAMulticlass" << std::endl;

   if (myMethodList != "") {
      for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) it->second = 0;
      std::vector<TString> mlist = TMVA::gTools().SplitString( myMethodList, ',' );
      for (UInt_t i=0; i<mlist.size(); i++) {
         std::string regMethod(mlist[i]);
         if (Use.find(regMethod) == Use.end()) {
            std::cout << "Method \"" << regMethod << "\" not known in TMVA under this name. Choose among the following:" << std::endl;
            for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) std::cout << it->first << " ";
            std::cout << std::endl;
            return;
         }
         Use[regMethod] = 1;
      }
   }

   // Create a new root output file.
   TString outfileName = "TMVAMulticlass.root";
   TFile* outputFile = TFile::Open( outfileName, "RECREATE" );

   TMVA::Factory *factory = new TMVA::Factory( "TMVAMulticlass", outputFile,
      "!V:!Silent:Color:DrawProgressBar:Transformations=I;D;P;G,D:AnalysisType=multiclass" );

   TMVA::DataLoader *dataloader = new TMVA::DataLoader("dataset");

   dataloader->AddVariable( "var1", 'F' );
   dataloader->AddVariable( "var2", "Variable 2", "", 'F' );
   dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
   dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );

   TFile *input(0);
   TString fname = "./tmva_example_multiple_background.root";
   if (!gSystem->AccessPathName( fname )) {
      // first we try to find the file in the local directory
      std::cout << "--- TMVAMulticlass : Accessing " << fname << std::endl;
      input = TFile::Open( fname );
   }
   else {
      std::cout << "Creating testdata...." << std::endl;
      TString createDataMacro = gROOT->GetTutorialDir() + "/tmva/createData.C";
      gROOT->ProcessLine(TString::Format(".L %s",createDataMacro.Data()));
      gROOT->ProcessLine("create_MultipleBackground(2000)");
      std::cout << " created tmva_example_multiple_background.root for tests of the multiclass features" << std::endl;
      input = TFile::Open( fname );
   }
   if (!input) {
      std::cout << "ERROR: could not open data file" << std::endl;
      exit(1);
   }

   TTree *signalTree  = (TTree*)input->Get("TreeS");
   TTree *background0 = (TTree*)input->Get("TreeB0");
   TTree *background1 = (TTree*)input->Get("TreeB1");
   TTree *background2 = (TTree*)input->Get("TreeB2");

   gROOT->cd( outfileName+TString(":/") );
   dataloader->AddTree( signalTree,  "Signal" );
   dataloader->AddTree( background0, "bg0" );
   dataloader->AddTree( background1, "bg1" );
   dataloader->AddTree( background2, "bg2" );

   dataloader->PrepareTrainingAndTestTree( "", "SplitMode=Random:NormMode=NumEvents:!V" );

   if (Use["BDTG"]) // gradient boosted decision trees
      factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
         "!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.10:UseBaggedBoost:BaggedSampleFraction=0.50:nCuts=20:MaxDepth=2");
   if (Use["MLP"]) // neural network
      factory->BookMethod( dataloader, TMVA::Types::kMLP, "MLP",
         "!H:!V:NeuronType=tanh:NCycles=1000:HiddenLayers=N+5,5:TestRate=5:EstimatorType=MSE");
   if (Use["FDA_GA"]) // functional discriminant with GA minimizer
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_GA",
         "H:!V:Formula=(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3:ParRanges=(-1,1);(-10,10);(-10,10);(-10,10);(-10,10):FitMethod=GA:PopSize=300:Cycles=3:Steps=20:Trim=True:SaveBestGen=1" );
   if (Use["PDEFoam"]) // PDE-Foam approach
      factory->BookMethod( dataloader, TMVA::Types::kPDEFoam, "PDEFoam",
         "!H:!V:TailCut=0.001:VolFrac=0.0666:nActiveCells=500:nSampl=2000:nBin=5:Nmin=100:Kernel=None:Compress=T" );

   if (Use["DNN_CPU"]) {
      TString layoutString("Layout=TANH|100,TANH|50,TANH|10,LINEAR");
      TString training0("LearningRate=1e-1, Momentum=0.5, Repetitions=1, ConvergenceSteps=10,"
                        " BatchSize=256, TestRepetitions=10, Multithreading=True");
      TString training1("LearningRate=1e-2, Momentum=0.0, Repetitions=1, ConvergenceSteps=10,"
                        " BatchSize=256, TestRepetitions=7, Multithreading=True");
      TString trainingStrategyString("TrainingStrategy=");
      trainingStrategyString += training0 + "|" + training1;
      TString nnOptions("!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
                        "WeightInitialization=XAVIERUNIFORM:Architecture=CPU");
      nnOptions.Append(":");
      nnOptions.Append(layoutString);
      nnOptions.Append(":");
      nnOptions.Append(trainingStrategyString);
      factory->BookMethod(dataloader, TMVA::Types::kDNN, "DNN_CPU", nnOptions);
   }

   // Train MVAs using the set of training events
   factory->TrainAllMethods();

   // Evaluate all MVAs using the set of test events
   factory->TestAllMethods();

   // Evaluate and compare performance of all configured MVAs
   factory->EvaluateAllMethods();

   // --------------------------------------------------------------
   // Save the output
   outputFile->Close();

   std::cout << "==> Wrote root file: " << outputFile->GetName() << std::endl;
   std::cout << "==> TMVAMulticlass is done!" << std::endl;

   delete factory;
   delete dataloader;

   // Launch the GUI for the root macros
   if (!gROOT->IsBatch()) TMVAMultiClassGui( outfileName );
}

int main( int argc, char** argv )
{
   // Select methods (don't look at this code - not of interest)
   TString methodList;
   for (int i=1; i<argc; i++) {
      TString regMethod(argv[i]);
      if (regMethod=="-b" || regMethod=="--batch") continue;
      if (!methodList.IsNull()) methodList += TString(",");
      methodList += regMethod;
   }
   TMVAMulticlass(methodList);
   return 0;
}
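To apply the trained weights outside of the Factory, one would typically book the XML weight file in a TMVA::Reader and call its EvaluateMulticlass method, which returns one response value per class; the event is then assigned to the class with the largest response. A minimal ROOT-free sketch of that final argmax step (the function name and the scores below are illustrative):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Map a per-class multiclass response vector (one score per class, as
// returned e.g. by TMVA::Reader::EvaluateMulticlass) to the predicted
// class name via argmax. The class order matches the DataLoader above:
// Signal, bg0, bg1, bg2.
std::string predictClass(const std::vector<double>& response,
                         const std::vector<std::string>& classNames)
{
    const std::size_t best = static_cast<std::size_t>(
        std::distance(response.begin(),
                      std::max_element(response.begin(), response.end())));
    return classNames[best];
}
```

This is the decision rule implicit in the confusion matrices above: an event counts as correctly classified when its own class carries the largest response.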
Author
Andreas Hoecker

Definition in file TMVAMulticlass.C.