TMVAMultipleBackgroundExample.C File Reference

Detailed Description

This example shows the training of one signal against three different backgrounds. In the application step, a tree is created containing all signal and background events, to which the true class ID and the three classifier outputs are added. Finally, using this application tree, the significance is maximized with the help of the TMVA genetic algorithm.
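For readers who want to look at the combined application tree that the macro writes out, a minimal sketch is given below. It is an illustration only: the helper name inspectCombinedTree and the plotting choices are assumptions, not part of the tutorial. It opens the output file produced in the application step and draws the response of the first classifier separately for signal and for background events.

// inspectCombinedTree.C -- hypothetical helper, not part of the tutorial:
// open the combined tree written by the application step and compare the
// response of the first classifier for true signal and background events.
#include "TFile.h"
#include "TTree.h"
void inspectCombinedTree()
{
   TFile *f = TFile::Open("tmva_example_multiple_backgrounds__applied.root");
   if (!f || f->IsZombie()) return;                 // file written by the application step
   TTree *t = nullptr;
   f->GetObject("multiBkg", t);                     // combined tree created by the example
   if (!t) return;
   t->Draw("cls0", "weight*(classID==0)");          // classifier 0 response, signal events
   t->Draw("cls0", "weight*(classID>0)", "same");   // classifier 0 response, all backgrounds
}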

  • Project : TMVA - a Root-integrated toolkit for multivariate data analysis
  • Package : TMVA
  • Executable: TMVAGAexample
Start Test TMVAGAexample
========================
... event: 0 (200)
======> EVENT:0
var1 = -1.14361
var2 = -0.822373
var3 = -0.395426
var4 = -0.529427
created tree: TreeS
... event: 0 (200)
======> EVENT:0
var1 = -1.54361
var2 = -1.42237
var3 = -1.39543
var4 = -2.02943
created tree: TreeB0
... event: 0 (200)
======> EVENT:0
var1 = -1.54361
var2 = -0.822373
var3 = -0.395426
var4 = -2.02943
created tree: TreeB1
======> EVENT:0
var1 = 0.463304
var2 = 1.37192
var3 = -1.16769
var4 = -1.77551
created tree: TreeB2
created data file: tmva_example_multiple_background.root
========================
--- Training
<HEADER> DataSetInfo : [datasetBkg0] : Added class "Signal"
: Add Tree TreeS of type Signal with 200 events
<HEADER> DataSetInfo : [datasetBkg0] : Added class "Background"
: Add Tree TreeB0 of type Background with 200 events
<HEADER> Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Rebuilding Dataset datasetBkg0
: Building event vectors for type 2 Signal
: Dataset[datasetBkg0] : create input formulas for tree TreeS
: Building event vectors for type 2 Background
: Dataset[datasetBkg0] : create input formulas for tree TreeB0
<HEADER> DataSetFactory : [datasetBkg0] : Number of events in input trees
:
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 100
: Signal -- testing events : 100
: Signal -- training and testing events: 200
: Background -- training events : 100
: Background -- testing events : 100
: Background -- training and testing events: 200
:
<HEADER> DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.427 +0.620 +0.834
: var2: +0.427 +1.000 +0.756 +0.779
: var3: +0.620 +0.756 +1.000 +0.854
: var4: +0.834 +0.779 +0.854 +1.000
: ----------------------------------------
<HEADER> DataSetInfo : Correlation matrix (Background):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.404 +0.560 +0.803
: var2: +0.404 +1.000 +0.784 +0.784
: var3: +0.560 +0.784 +1.000 +0.836
: var4: +0.803 +0.784 +0.836 +1.000
: ----------------------------------------
<HEADER> DataSetFactory : [datasetBkg0] :
:
<HEADER> Factory : Train all methods
<HEADER> Factory : [datasetBkg0] : Create Transformation "I" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "P" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "G" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.0088163 1.0188 [ -3.1150 2.2852 ]
: var2: 0.043750 1.1258 [ -3.6952 3.1113 ]
: var3: 0.091345 1.1793 [ -3.3587 3.9796 ]
: var4: 0.20148 1.3300 [ -3.7913 4.1179 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: -0.12168 1.0000 [ -3.1787 2.4607 ]
: var2: -0.061928 1.0000 [ -2.7282 2.4350 ]
: var3: -0.014488 1.0000 [ -2.6527 3.2319 ]
: var4: 0.28207 1.0000 [ -1.9094 2.3930 ]
: -----------------------------------------------------------
: Preparing the Principle Component (PCA) transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 7.9442e-09 2.0999 [ -6.9285 6.2549 ]
: var2:-9.4762e-10 0.81623 [ -2.1779 1.8409 ]
: var3: 1.3434e-09 0.51228 [ -1.2574 1.2890 ]
: var4: 3.1898e-10 0.35594 [ -0.84818 0.98796 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.18535 1.0000 [ -1.2768 5.4654 ]
: var2: 0.14488 1.0000 [ -2.0258 6.0132 ]
: var3: 0.11957 1.0000 [ -1.9925 7.5386 ]
: var4: 0.044545 1.0000 [ -2.6680 5.5691 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
<HEADER> IdTransformation : Ranking result (top variable is best ranked)
: -----------------------------------
: Rank : Variable : Separation
: -----------------------------------
: 1 : Variable 4 : 4.271e-01
: 2 : Variable 3 : 3.270e-01
: 3 : Variable 2 : 1.993e-01
: 4 : Variable 1 : 1.440e-01
: -----------------------------------
<HEADER> Factory : Train method: BDTG for Classification
:
<HEADER> BDTG : #events: (reweighted) sig: 100 bkg: 100
: #events: (unweighted) sig: 100 bkg: 100
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 200 events: 0.0586 sec
<HEADER> BDTG : [datasetBkg0] : Evaluation of BDTG on training sample (200 events)
: Elapsed time for evaluation of 200 events: 0.00539 sec
: Creating xml weight file: datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml
: Creating standalone class: datasetBkg0/weights/TMVAMultiBkg0_BDTG.class.C
: TMVASignalBackground0.root:/datasetBkg0/Method_BDT/BDTG
<HEADER> Factory : Training finished
:
: Ranking input variables (method specific)...
<HEADER> BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var1 : 2.757e-01
: 2 : var2 : 2.610e-01
: 3 : var3 : 2.418e-01
: 4 : var4 : 2.215e-01
: --------------------------------------
<HEADER> Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml
<HEADER> Factory : Test all methods
<HEADER> Factory : Test method: BDTG for Classification performance
:
<HEADER> BDTG : [datasetBkg0] : Evaluation of BDTG on testing sample (200 events)
: Elapsed time for evaluation of 200 events: 0.00494 sec
<HEADER> Factory : Evaluate all methods
<HEADER> Factory : Evaluate classifier: BDTG
:
<HEADER> BDTG : [datasetBkg0] : Loop over test events and fill histograms with classifier response...
:
<HEADER> TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.12984 0.97510 [ -2.0823 2.9998 ]
: var2: 0.057210 0.86936 [ -1.9349 2.0015 ]
: var3: 0.16183 0.98795 [ -2.4774 3.0223 ]
: var4: 0.32229 1.2452 [ -2.9030 3.3317 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by best signal efficiency and purity (area)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA
: Name: Method: ROC-integ
: datasetBkg0 BDTG : 0.945
: -------------------------------------------------------------------------------------------------------------------
:
: Testing efficiency compared to training efficiency (overtraining check)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA Signal efficiency: from test sample (from training sample)
: Name: Method: @B=0.01 @B=0.10 @B=0.30
: -------------------------------------------------------------------------------------------------------------------
: datasetBkg0 BDTG : 0.000 (0.975) 0.812 (0.986) 0.966 (0.990)
: -------------------------------------------------------------------------------------------------------------------
:
<HEADER> Dataset:datasetBkg0 : Created tree 'TestTree' with 200 events
:
<HEADER> Dataset:datasetBkg0 : Created tree 'TrainTree' with 200 events
:
<HEADER> Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
<HEADER> DataSetInfo : [datasetBkg1] : Added class "Signal"
: Add Tree TreeS of type Signal with 200 events
<HEADER> DataSetInfo : [datasetBkg1] : Added class "Background"
: Add Tree TreeB1 of type Background with 200 events
<HEADER> Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Rebuilding Dataset datasetBkg1
: Building event vectors for type 2 Signal
: Dataset[datasetBkg1] : create input formulas for tree TreeS
: Building event vectors for type 2 Background
: Dataset[datasetBkg1] : create input formulas for tree TreeB1
<HEADER> DataSetFactory : [datasetBkg1] : Number of events in input trees
:
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 100
: Signal -- testing events : 100
: Signal -- training and testing events: 200
: Background -- training events : 100
: Background -- testing events : 100
: Background -- training and testing events: 200
:
<HEADER> DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.427 +0.620 +0.834
: var2: +0.427 +1.000 +0.756 +0.779
: var3: +0.620 +0.756 +1.000 +0.854
: var4: +0.834 +0.779 +0.854 +1.000
: ----------------------------------------
<HEADER> DataSetInfo : Correlation matrix (Background):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.404 +0.560 +0.803
: var2: +0.404 +1.000 +0.784 +0.784
: var3: +0.560 +0.784 +1.000 +0.836
: var4: +0.803 +0.784 +0.836 +1.000
: ----------------------------------------
<HEADER> DataSetFactory : [datasetBkg1] :
:
<HEADER> Factory : Train all methods
<HEADER> Factory : [datasetBkg1] : Create Transformation "I" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "P" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "G" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.0088163 1.0188 [ -3.1150 2.2852 ]
: var2: 0.34375 1.0917 [ -3.0952 3.1113 ]
: var3: 0.59134 1.0492 [ -2.3587 3.9796 ]
: var4: 0.20148 1.3300 [ -3.7913 4.1179 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: -0.18819 1.0000 [ -3.1896 2.5009 ]
: var2: 0.092430 1.0000 [ -2.5681 2.4906 ]
: var3: 0.69993 1.0000 [ -1.8985 3.9795 ]
: var4: -0.0086609 1.0000 [ -2.1826 2.3487 ]
: -----------------------------------------------------------
: Preparing the Principle Component (PCA) transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 4.8243e-09 1.9581 [ -6.5407 5.8288 ]
: var2:-4.0978e-10 0.87524 [ -2.4248 2.2031 ]
: var3:-9.3767e-10 0.53717 [ -1.6289 1.2503 ]
: var4:-1.2718e-09 0.45934 [ -1.1321 1.1889 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.17868 1.0000 [ -1.3103 5.4712 ]
: var2: 0.11739 1.0000 [ -1.9872 6.1313 ]
: var3: 0.13673 1.0000 [ -1.6287 5.9055 ]
: var4: 0.068428 1.0000 [ -1.8732 5.5145 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
<HEADER> IdTransformation : Ranking result (top variable is best ranked)
: -----------------------------------
: Rank : Variable : Separation
: -----------------------------------
: 1 : Variable 4 : 4.271e-01
: 2 : Variable 1 : 1.440e-01
: 3 : Variable 3 : 4.644e-02
: 4 : Variable 2 : 2.725e-02
: -----------------------------------
<HEADER> Factory : Train method: BDTG for Classification
:
<HEADER> BDTG : #events: (reweighted) sig: 100 bkg: 100
: #events: (unweighted) sig: 100 bkg: 100
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 200 events: 0.0559 sec
<HEADER> BDTG : [datasetBkg1] : Evaluation of BDTG on training sample (200 events)
: Elapsed time for evaluation of 200 events: 0.00546 sec
: Creating xml weight file: datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml
: Creating standalone class: datasetBkg1/weights/TMVAMultiBkg1_BDTG.class.C
: TMVASignalBackground1.root:/datasetBkg1/Method_BDT/BDTG
<HEADER> Factory : Training finished
:
: Ranking input variables (method specific)...
<HEADER> BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var3 : 2.700e-01
: 2 : var1 : 2.526e-01
: 3 : var4 : 2.433e-01
: 4 : var2 : 2.341e-01
: --------------------------------------
<HEADER> Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml
<HEADER> Factory : Test all methods
<HEADER> Factory : Test method: BDTG for Classification performance
:
<HEADER> BDTG : [datasetBkg1] : Evaluation of BDTG on testing sample (200 events)
: Elapsed time for evaluation of 200 events: 0.00493 sec
<HEADER> Factory : Evaluate all methods
<HEADER> Factory : Evaluate classifier: BDTG
:
<HEADER> BDTG : [datasetBkg1] : Loop over test events and fill histograms with classifier response...
:
<HEADER> TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.12984 0.97510 [ -2.0823 2.9998 ]
: var2: 0.35721 0.80705 [ -1.3349 2.3468 ]
: var3: 0.66183 0.87515 [ -1.4774 3.9796 ]
: var4: 0.32229 1.2452 [ -2.9030 3.3317 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by best signal efficiency and purity (area)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA
: Name: Method: ROC-integ
: datasetBkg1 BDTG : 0.992
: -------------------------------------------------------------------------------------------------------------------
:
: Testing efficiency compared to training efficiency (overtraining check)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA Signal efficiency: from test sample (from training sample)
: Name: Method: @B=0.01 @B=0.10 @B=0.30
: -------------------------------------------------------------------------------------------------------------------
: datasetBkg1 BDTG : 0.829 (1.000) 0.978 (1.000) 1.000 (1.000)
: -------------------------------------------------------------------------------------------------------------------
:
<HEADER> Dataset:datasetBkg1 : Created tree 'TestTree' with 200 events
:
<HEADER> Dataset:datasetBkg1 : Created tree 'TrainTree' with 200 events
:
<HEADER> Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
<HEADER> DataSetInfo : [datasetBkg2] : Added class "Signal"
: Add Tree TreeS of type Signal with 200 events
<HEADER> DataSetInfo : [datasetBkg2] : Added class "Background"
: Add Tree TreeB2 of type Background with 200 events
<HEADER> Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Rebuilding Dataset datasetBkg2
: Building event vectors for type 2 Signal
: Dataset[datasetBkg2] : create input formulas for tree TreeS
: Building event vectors for type 2 Background
: Dataset[datasetBkg2] : create input formulas for tree TreeB2
<HEADER> DataSetFactory : [datasetBkg2] : Number of events in input trees
:
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 100
: Signal -- testing events : 100
: Signal -- training and testing events: 200
: Background -- training events : 100
: Background -- testing events : 100
: Background -- training and testing events: 200
:
<HEADER> DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.427 +0.620 +0.834
: var2: +0.427 +1.000 +0.756 +0.779
: var3: +0.620 +0.756 +1.000 +0.854
: var4: +0.834 +0.779 +0.854 +1.000
: ----------------------------------------
<HEADER> DataSetInfo : Correlation matrix (Background):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 -0.694 -0.018 +0.189
: var2: -0.694 +1.000 +0.048 -0.106
: var3: -0.018 +0.048 +1.000 -0.033
: var4: +0.189 -0.106 -0.033 +1.000
: ----------------------------------------
<HEADER> DataSetFactory : [datasetBkg2] :
:
<HEADER> Factory : Train all methods
<HEADER> Factory : [datasetBkg2] : Create Transformation "I" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "P" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "G" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.28578 0.91179 [ -2.7150 2.2852 ]
: var2: 0.67483 0.96936 [ -3.0952 3.1113 ]
: var3: 0.31482 1.1483 [ -2.3587 3.9796 ]
: var4: 0.47104 1.1963 [ -2.2913 4.1179 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.21314 1.0000 [ -2.9018 2.2127 ]
: var2: 0.65434 1.0000 [ -2.8620 2.8045 ]
: var3: 0.097560 1.0000 [ -2.1290 2.6029 ]
: var4: 0.28364 1.0000 [ -2.2148 2.5819 ]
: -----------------------------------------------------------
: Preparing the Principle Component (PCA) transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1:-1.8283e-09 1.5535 [ -5.3869 5.6955 ]
: var2: 8.3819e-10 0.94853 [ -2.3039 2.7397 ]
: var3:-2.2631e-09 0.82510 [ -2.0402 1.8198 ]
: var4: 1.3062e-09 0.72605 [ -1.7460 1.7342 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.17765 1.0000 [ -1.4239 4.6740 ]
: var2: 0.15482 1.0000 [ -1.4140 5.3487 ]
: var3: 0.12377 1.0000 [ -1.8596 5.4085 ]
: var4: 0.098067 1.0000 [ -2.1757 4.5835 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
<HEADER> IdTransformation : Ranking result (top variable is best ranked)
: -----------------------------------
: Rank : Variable : Separation
: -----------------------------------
: 1 : Variable 2 : 4.041e-01
: 2 : Variable 4 : 2.872e-01
: 3 : Variable 3 : 2.701e-01
: 4 : Variable 1 : 1.551e-01
: -----------------------------------
<HEADER> Factory : Train method: BDTG for Classification
:
<HEADER> BDTG : #events: (reweighted) sig: 100 bkg: 100
: #events: (unweighted) sig: 100 bkg: 100
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 200 events: 0.0567 sec
<HEADER> BDTG : [datasetBkg2] : Evaluation of BDTG on training sample (200 events)
: Elapsed time for evaluation of 200 events: 0.0055 sec
: Creating xml weight file: datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml
: Creating standalone class: datasetBkg2/weights/TMVAMultiBkg2_BDTG.class.C
: TMVASignalBackground2.root:/datasetBkg2/Method_BDT/BDTG
<HEADER> Factory : Training finished
:
: Ranking input variables (method specific)...
<HEADER> BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 3.127e-01
: 2 : var1 : 2.386e-01
: 3 : var2 : 2.274e-01
: 4 : var3 : 2.213e-01
: --------------------------------------
<HEADER> Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml
<HEADER> Factory : Test all methods
<HEADER> Factory : Test method: BDTG for Classification performance
:
<HEADER> BDTG : [datasetBkg2] : Evaluation of BDTG on testing sample (200 events)
: Elapsed time for evaluation of 200 events: 0.00491 sec
<HEADER> Factory : Evaluate all methods
<HEADER> Factory : Evaluate classifier: BDTG
:
<HEADER> BDTG : [datasetBkg2] : Loop over test events and fill histograms with classifier response...
:
<HEADER> TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.33014 0.87831 [ -1.8821 2.9998 ]
: var2: 0.68086 0.81675 [ -1.2800 2.0015 ]
: var3: 0.27828 1.0286 [ -1.8691 3.0223 ]
: var4: 0.67359 1.1090 [ -1.7755 3.3317 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by best signal efficiency and purity (area)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA
: Name: Method: ROC-integ
: datasetBkg2 BDTG : 0.946
: -------------------------------------------------------------------------------------------------------------------
:
: Testing efficiency compared to training efficiency (overtraining check)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA Signal efficiency: from test sample (from training sample)
: Name: Method: @B=0.01 @B=0.10 @B=0.30
: -------------------------------------------------------------------------------------------------------------------
: datasetBkg2 BDTG : 0.000 (0.975) 0.926 (0.979) 0.978 (0.986)
: -------------------------------------------------------------------------------------------------------------------
:
<HEADER> Dataset:datasetBkg2 : Created tree 'TestTree' with 200 events
:
<HEADER> Dataset:datasetBkg2 : Created tree 'TrainTree' with 200 events
:
<HEADER> Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
========================
--- Application & create combined tree
: Booking "BDT method" of type "BDT" from datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml.
: Reading weight file: datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml
<HEADER> DataSetInfo : [Default] : Added class "Signal"
<HEADER> DataSetInfo : [Default] : Added class "Background"
: Booked classifier "BDTG" of type: "BDT"
: Booking "BDT method" of type "BDT" from datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml.
: Reading weight file: datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml
<HEADER> DataSetInfo : [Default] : Added class "Signal"
<HEADER> DataSetInfo : [Default] : Added class "Background"
: Booked classifier "BDTG" of type: "BDT"
: Booking "BDT method" of type "BDT" from datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml.
: Reading weight file: datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml
<HEADER> DataSetInfo : [Default] : Added class "Signal"
<HEADER> DataSetInfo : [Default] : Added class "Background"
: Booked classifier "BDTG" of type: "BDT"
--- Select signal sample
: Rebuilding Dataset Default
: Rebuilding Dataset Default
: Rebuilding Dataset Default
--- End of event loop: Real time 0:00:00, CP time 0.020
--- Select background 0 sample
--- End of event loop: Real time 0:00:00, CP time 0.020
--- Select background 1 sample
--- End of event loop: Real time 0:00:00, CP time 0.020
--- Select background 2 sample
--- End of event loop: Real time 0:00:00, CP time 0.020
--- Created root file: "tmva_example_multiple_backgrounds__applied.root" containing the MVA output histograms
==> Application of readers is done! combined tree created
========================
--- maximize significance
Classifier ranges (defined by the user)
range: -1 1
range: -1 1
range: -1 1
<HEADER> FitterBase : <GeneticFitter> Optimisation, please be patient ... (inaccurate progress timing for GA)
: Elapsed time: 7.51 sec
======================
Efficiency : 0.935
Purity : 0.886256
True positive weights : 187
False positive weights: 24
Signal weights : 200
cutValue[0] = -0.760936;
cutValue[1] = 0.992815;
cutValue[2] = 0.732853;
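The three cut values printed above are the optimized thresholds on the classifier outputs cls0, cls1 and cls2 for this particular run. As a hedged follow-up (the helper name applyOptimizedCuts and the hard-coded thresholds, copied from the output above, are assumptions and not part of the tutorial), they could be applied back to the combined tree to count how many signal and background events survive:

// applyOptimizedCuts.C -- hypothetical follow-up, not part of the tutorial:
// apply the cut values reported by the genetic algorithm to the combined tree
// and count the surviving signal and background events.
#include <iostream>
#include "TFile.h"
#include "TTree.h"
#include "TString.h"
void applyOptimizedCuts()
{
   TFile *f = TFile::Open("tmva_example_multiple_backgrounds__applied.root");
   if (!f || f->IsZombie()) return;
   TTree *t = nullptr;
   f->GetObject("multiBkg", t);
   if (!t) return;
   TString cuts("cls0>-0.760936 && cls1>0.992815 && cls2>0.732853");   // values from this run
   std::cout << "signal passing:     " << t->GetEntries(("classID==0 && " + cuts).Data()) << std::endl;
   std::cout << "background passing: " << t->GetEntries(("classID>0 && " + cuts).Data()) << std::endl;
}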
#include <iostream> // Stream declarations
#include <vector>
#include <limits>
#include "TChain.h"
#include "TCut.h"
#include "TDirectory.h"
#include "TH1F.h"
#include "TH1.h"
#include "TMath.h"
#include "TFile.h"
#include "TStopwatch.h"
#include "TROOT.h"
#include "TSystem.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"//required to load dataset
#include "TMVA/Reader.h"
#include "TMVA/IFitterTarget.h" // fitness interface implemented by MyFitness below
#include "TMVA/GeneticFitter.h" // genetic-algorithm fitter used for the cut optimization
#include "TMVA/Interval.h" // parameter ranges passed to the fitter
using std::vector, std::cout, std::endl;
using namespace TMVA;
// ----------------------------------------------------------------------------------------------
// Training
// ----------------------------------------------------------------------------------------------
//
void Training(){
std::string factoryOptions( "!V:!Silent:Transformations=I;D;P;G,D:AnalysisType=Classification" );
TString fname = "./tmva_example_multiple_background.root";
TFile *input(0);
input = TFile::Open( fname ); // open the data file written by createData.C
if (!input) {
std::cout << "ERROR: could not open data file " << fname << std::endl;
return;
}
TTree *signal = (TTree*)input->Get("TreeS");
TTree *background0 = (TTree*)input->Get("TreeB0");
TTree *background1 = (TTree*)input->Get("TreeB1");
TTree *background2 = (TTree*)input->Get("TreeB2");
/// global event weights per tree (see below for setting event-wise weights)
Double_t signalWeight = 1.0;
Double_t background0Weight = 1.0;
Double_t background1Weight = 1.0;
Double_t background2Weight = 1.0;
// Create a new root output file.
TString outfileName( "TMVASignalBackground0.root" );
TFile* outputFile = TFile::Open( outfileName, "RECREATE" );
// background 0
// ____________
TMVA::DataLoader* dataloader = new TMVA::DataLoader("datasetBkg0");
TMVA::Factory *factory = new TMVA::Factory( "TMVAMultiBkg0", outputFile, factoryOptions );
dataloader->AddVariable( "var1", "Variable 1", "", 'F' );
dataloader->AddVariable( "var2", "Variable 2", "", 'F' );
dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );
dataloader->AddSignalTree ( signal, signalWeight );
dataloader->AddBackgroundTree( background0, background0Weight );
// factory->SetBackgroundWeightExpression("weight");
TCut mycuts = ""; // for example: TCut mycuts = "abs(var1)<0.5 && abs(var2-0.5)<1";
TCut mycutb = ""; // for example: TCut mycutb = "abs(var1)<0.5";
// tell the factory to use all remaining events in the trees after training for testing:
dataloader->PrepareTrainingAndTestTree( mycuts, mycutb,
"nTrain_Signal=0:nTrain_Background=0:SplitMode=Random:NormMode=NumEvents:!V" );
// Boosted Decision Trees
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
"!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.30:UseBaggedBoost:BaggedSampleFraction=0.6:SeparationType=GiniIndex:nCuts=20:MaxDepth=2" );
factory->TrainAllMethods();
factory->TestAllMethods();
factory->EvaluateAllMethods();
outputFile->Close();
delete factory;
delete dataloader;
// background 1
// ____________
outfileName = "TMVASignalBackground1.root";
outputFile = TFile::Open( outfileName, "RECREATE" );
dataloader=new TMVA::DataLoader("datasetBkg1");
factory = new TMVA::Factory( "TMVAMultiBkg1", outputFile, factoryOptions );
dataloader->AddVariable( "var1", "Variable 1", "", 'F' );
dataloader->AddVariable( "var2", "Variable 2", "", 'F' );
dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );
dataloader->AddSignalTree ( signal, signalWeight );
dataloader->AddBackgroundTree( background1, background1Weight );
// dataloader->SetBackgroundWeightExpression("weight");
// tell the factory to use all remaining events in the trees after training for testing:
dataloader->PrepareTrainingAndTestTree( mycuts, mycutb,
"nTrain_Signal=0:nTrain_Background=0:SplitMode=Random:NormMode=NumEvents:!V" );
// Boosted Decision Trees
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
"!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.30:UseBaggedBoost:BaggedSampleFraction=0.6:SeparationType=GiniIndex:nCuts=20:MaxDepth=2" );
factory->TrainAllMethods();
factory->TestAllMethods();
factory->EvaluateAllMethods();
outputFile->Close();
delete factory;
delete dataloader;
// background 2
// ____________
outfileName = "TMVASignalBackground2.root";
outputFile = TFile::Open( outfileName, "RECREATE" );
factory = new TMVA::Factory( "TMVAMultiBkg2", outputFile, factoryOptions );
dataloader=new TMVA::DataLoader("datasetBkg2");
dataloader->AddVariable( "var1", "Variable 1", "", 'F' );
dataloader->AddVariable( "var2", "Variable 2", "", 'F' );
dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );
dataloader->AddSignalTree ( signal, signalWeight );
dataloader->AddBackgroundTree( background2, background2Weight );
// dataloader->SetBackgroundWeightExpression("weight");
// tell the dataloader to use all remaining events in the trees after training for testing:
dataloader->PrepareTrainingAndTestTree( mycuts, mycutb,
"nTrain_Signal=0:nTrain_Background=0:SplitMode=Random:NormMode=NumEvents:!V" );
// Boosted Decision Trees
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
"!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.30:UseBaggedBoost:BaggedSampleFraction=0.5:SeparationType=GiniIndex:nCuts=20:MaxDepth=2" );
factory->TrainAllMethods();
factory->TestAllMethods();
factory->EvaluateAllMethods();
outputFile->Close();
delete factory;
delete dataloader;
}
// ----------------------------------------------------------------------------------------------
// Application
// ----------------------------------------------------------------------------------------------
//
// create a summary tree with all signal and background events; for each event
// the true classID and the three classifier values are stored
void ApplicationCreateCombinedTree(){
// Create a new root output file.
TString outfileName( "tmva_example_multiple_backgrounds__applied.root" );
TFile* outputFile = TFile::Open( outfileName, "RECREATE" );
TTree* outputTree = new TTree("multiBkg","multiple backgrounds tree");
Int_t classID = 0;
Float_t var1 = 0.f, var2 = 0.f, var3 = 0.f, var4 = 0.f;
Float_t classifier0 = 0.f, classifier1 = 0.f, classifier2 = 0.f;
Float_t weight = 1.f;
outputTree->Branch("classID", &classID, "classID/I");
outputTree->Branch("var1", &var1, "var1/F");
outputTree->Branch("var2", &var2, "var2/F");
outputTree->Branch("var3", &var3, "var3/F");
outputTree->Branch("var4", &var4, "var4/F");
outputTree->Branch("weight", &weight, "weight/F");
outputTree->Branch("cls0", &classifier0, "cls0/F");
outputTree->Branch("cls1", &classifier1, "cls1/F");
outputTree->Branch("cls2", &classifier2, "cls2/F");
// create three readers for the three different signal/background classifications, .. one for each background
TMVA::Reader *reader0 = new TMVA::Reader( "!Color:!Silent" );
TMVA::Reader *reader1 = new TMVA::Reader( "!Color:!Silent" );
TMVA::Reader *reader2 = new TMVA::Reader( "!Color:!Silent" );
reader0->AddVariable( "var1", &var1 );
reader0->AddVariable( "var2", &var2 );
reader0->AddVariable( "var3", &var3 );
reader0->AddVariable( "var4", &var4 );
reader1->AddVariable( "var1", &var1 );
reader1->AddVariable( "var2", &var2 );
reader1->AddVariable( "var3", &var3 );
reader1->AddVariable( "var4", &var4 );
reader2->AddVariable( "var1", &var1 );
reader2->AddVariable( "var2", &var2 );
reader2->AddVariable( "var3", &var3 );
reader2->AddVariable( "var4", &var4 );
// load the weight files for the readers
TString method = "BDT method";
reader0->BookMVA( "BDT method", "datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml" );
reader1->BookMVA( "BDT method", "datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml" );
reader2->BookMVA( "BDT method", "datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml" );
// load the input file
TFile *input(0);
TString fname = "./tmva_example_multiple_background.root";
input = TFile::Open( fname ); // re-open the file with the original signal and background trees
TTree* theTree = nullptr;
TStopwatch sw;
// loop through signal and all background trees
for( int treeNumber = 0; treeNumber < 4; ++treeNumber ) {
if( treeNumber == 0 ){
theTree = (TTree*)input->Get("TreeS");
std::cout << "--- Select signal sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 0;
}else if( treeNumber == 1 ){
theTree = (TTree*)input->Get("TreeB0");
std::cout << "--- Select background 0 sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 1;
}else if( treeNumber == 2 ){
theTree = (TTree*)input->Get("TreeB1");
std::cout << "--- Select background 1 sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 2;
}else if( treeNumber == 3 ){
theTree = (TTree*)input->Get("TreeB2");
std::cout << "--- Select background 2 sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 3;
}
theTree->SetBranchAddress( "var1", &var1 );
theTree->SetBranchAddress( "var2", &var2 );
theTree->SetBranchAddress( "var3", &var3 );
theTree->SetBranchAddress( "var4", &var4 );
std::cout << "--- Processing: " << theTree->GetEntries() << " events" << std::endl;
sw.Start();
Int_t nEvent = theTree->GetEntries();
// Int_t nEvent = 100;
for (Long64_t ievt=0; ievt<nEvent; ievt++) {
if (ievt%1000 == 0){
std::cout << "--- ... Processing event: " << ievt << std::endl;
}
theTree->GetEntry(ievt);
// get the classifiers for each of the signal/background classifications
classifier0 = reader0->EvaluateMVA( method );
classifier1 = reader1->EvaluateMVA( method );
classifier2 = reader2->EvaluateMVA( method );
outputTree->Fill();
}
// get elapsed time
sw.Stop();
std::cout << "--- End of event loop: "; sw.Print();
}
input->Close();
// write output tree
/* outputTree->SetDirectory(outputFile);
outputTree->Write(); */
outputFile->Write();
outputFile->Close();
std::cout << "--- Created root file: \"" << outfileName.Data() << "\" containing the MVA output histograms" << std::endl;
delete reader0;
delete reader1;
delete reader2;
std::cout << "==> Application of readers is done! combined tree created" << std::endl << std::endl;
}
// -----------------------------------------------------------------------------------------
// Genetic Algorithm Fitness definition
// -----------------------------------------------------------------------------------------
//
class MyFitness : public IFitterTarget {
public:
// constructor: keep a pointer to the combined tree and precompute the total signal weight
MyFitness( TChain* _chain ) : IFitterTarget() {
chain = _chain;
hSignal = new TH1F("hsignal","hsignal",100,-1,1);
hFP = new TH1F("hfp","hfp",100,-1,1);
hTP = new TH1F("htp","htp",100,-1,1);
TString cutsAndWeightSignal = "weight*(classID==0)";
nSignal = chain->Draw("Entry$/Entries$>>hsignal",cutsAndWeightSignal,"goff");
weightsSignal = hSignal->Integral();
}
// the output of this function will be minimized
Double_t EstimatorFunction( std::vector<Double_t> & factors ){
TString cutsAndWeightTruePositive = Form("weight*((classID==0) && cls0>%f && cls1>%f && cls2>%f )",factors.at(0), factors.at(1), factors.at(2));
TString cutsAndWeightFalsePositive = Form("weight*((classID >0) && cls0>%f && cls1>%f && cls2>%f )",factors.at(0), factors.at(1), factors.at(2));
// Entry$/Entries$ just draws something reasonable; it could in principle be anything
Float_t nTP = chain->Draw("Entry$/Entries$>>htp",cutsAndWeightTruePositive,"goff");
Float_t nFP = chain->Draw("Entry$/Entries$>>hfp",cutsAndWeightFalsePositive,"goff");
weightsTruePositive = hTP->Integral();
weightsFalsePositive = hFP->Integral();
efficiency = 0;
if( weightsSignal > 0 )
efficiency = weightsTruePositive/weightsSignal;
purity = 0;
if( weightsTruePositive+weightsFalsePositive > 0 )
purity = weightsTruePositive/(weightsTruePositive+weightsFalsePositive);
Float_t effTimesPur = efficiency*purity;
Float_t toMinimize = std::numeric_limits<float>::max(); // set to the highest existing number
if( effTimesPur > 0 ) // if larger than 0, take 1/x. This is the value to minimize
toMinimize = 1./(effTimesPur); // we want to minimize 1/efficiency*purity
// Print();
return toMinimize;
}
void Print(){
std::cout << std::endl;
std::cout << "======================" << std::endl
<< "Efficiency : " << efficiency << std::endl
<< "Purity : " << purity << std::endl << std::endl
<< "True positive weights : " << weightsTruePositive << std::endl
<< "False positive weights: " << weightsFalsePositive << std::endl
<< "Signal weights : " << weightsSignal << std::endl;
}
private:
// data members used by the constructor, EstimatorFunction and Print
TChain* chain;
TH1F* hSignal;
TH1F* hFP;
TH1F* hTP;
Float_t nSignal;
Float_t weightsSignal;
Float_t weightsTruePositive;
Float_t weightsFalsePositive;
Float_t efficiency;
Float_t purity;
};
// ----------------------------------------------------------------------------------------------
// Call of Genetic algorithm
// ----------------------------------------------------------------------------------------------
//
void MaximizeSignificance(){
// define all the parameters by their minimum and maximum value
// in this example 3 parameters (= cuts on the classifiers) are defined.
std::vector<Interval*> ranges;
ranges.push_back( new Interval(-1,1) ); // for some classifiers (especially LD) the ranges have to be taken larger
ranges.push_back( new Interval(-1,1) );
ranges.push_back( new Interval(-1,1) );
std::cout << "Classifier ranges (defined by the user)" << std::endl;
for( std::vector<Interval*>::iterator it = ranges.begin(); it != ranges.end(); it++ ){
std::cout << " range: " << (*it)->GetMin() << " " << (*it)->GetMax() << std::endl;
}
TChain* chain = new TChain("multiBkg");
chain->Add("tmva_example_multiple_backgrounds__applied.root");
IFitterTarget* myFitness = new MyFitness( chain );
// prepare the genetic algorithm with an initial population size of 20
// note: a large population size helps in searching the domain space of the solution,
// but has to be weighed against the number of generations;
// the extreme case of 1 generation and population size n is equivalent to
// a Monte Carlo calculation with n tries
const TString name( "multipleBackgroundGA" );
const TString opts( "PopSize=100:Steps=30" );
GeneticFitter mg( *myFitness, name, ranges, opts);
// mg.SetParameters( 4, 30, 200, 10,5, 0.95, 0.001 );
std::vector<Double_t> result;
mg.Run(result); // run the genetic algorithm; the optimized cut values are returned in 'result'
dynamic_cast<MyFitness*>(myFitness)->Print();
std::cout << std::endl;
int n = 0;
for( std::vector<Double_t>::iterator it = result.begin(); it<result.end(); it++ ){
std::cout << " cutValue[" << n << "] = " << (*it) << ";"<< std::endl;
n++;
}
}
void TMVAMultipleBackgroundExample()
{
// ----------------------------------------------------------------------------------------
// Run all
// ----------------------------------------------------------------------------------------
cout << "Start Test TMVAGAexample" << endl
<< "========================" << endl
<< endl;
TString createDataMacro = gROOT->GetTutorialDir() + "/machine_learning/createData.C";
gROOT->ProcessLine(TString::Format(".L %s",createDataMacro.Data()));
gROOT->ProcessLine("create_MultipleBackground(200)");
cout << endl;
cout << "========================" << endl;
cout << "--- Training" << endl;
Training();
cout << endl;
cout << "========================" << endl;
cout << "--- Application & create combined tree" << endl;
ApplicationCreateCombinedTree();
cout << endl;
cout << "========================" << endl;
cout << "--- maximize significance" << endl;
MaximizeSignificance();
}
int main( int argc, char** argv ) {
TMVAMultipleBackgroundExample();
return 0;
}
Author
Andreas Hoecker

Definition in file TMVAMultipleBackgroundExample.C.