TMVARegression.C File Reference

Detailed Description

This macro provides examples for the training and testing of the TMVA regression methods.

The input data is a toy Monte Carlo sample with two input variables (var1, var2) and one regression target (fvalue).

The methods to be used can be switched on and off by means of booleans, or via the prompt command, for example:

root -l TMVARegression.C\(\"LD,MLP\"\)

(Note that the backslashes are mandatory.) If no method is given, a default set is used.
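Alternatively, quoting the whole argument in the shell avoids the escapes (a shell-quoting variant, not part of the original tutorial text):

root -l 'TMVARegression.C("LD,MLP")'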

The output file "TMVAReg.root" can be analysed with the use of dedicated macros (simply say: root -l <macro.C>), which can be conveniently invoked through a GUI that will appear at the end of the run of this macro.
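The GUI can also be opened later on an existing output file, without rerunning the training, by calling the TMVA::TMVARegGui macro directly (a minimal sketch; it assumes TMVAReg.root is in the current directory):

root -l -e 'TMVA::TMVARegGui("TMVAReg.root")'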

  • Project : TMVA - a ROOT-integrated toolkit for multivariate data analysis
  • Package : TMVA
  • Root Macro: TMVARegression
==> Start TMVARegression
create data set info dataset
--- TMVARegression : Using input file: ./files/tmva_reg_example.root
DataSetInfo : [dataset] : Added class "Regression"
: Add Tree TreeR of type Regression with 10000 events
: Dataset[dataset] : Class index : 0 name : Regression
Factory : Booking method: PDEFoam
:
: Building event vectors for type 2 Regression
: Dataset[dataset] : create input formulas for tree TreeR
DataSetFactory : [dataset] : Number of events in input trees
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Regression -- training events : 1000
: Regression -- testing events : 9000
: Regression -- training and testing events: 10000
:
DataSetInfo : Correlation matrix (Regression):
: ------------------------
: var1 var2
: var1: +1.000 +0.006
: var2: +0.006 +1.000
: ------------------------
DataSetFactory : [dataset] :
:
Factory : Booking method: KNN
:
Factory : Booking method: LD
:
Factory : Booking method: DNN_CPU
:
: Parsing option string:
: ... "!H:V:ErrorStrategy=SUMOFSQUARES:VarTransform=G:WeightInitialization=XAVIERUNIFORM:Architecture=CPU:Layout=TANH|50,Layout=TANH|50,Layout=TANH|50,LINEAR:TrainingStrategy=LearningRate=1e-2,Momentum=0.5,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=10,WeightDecay=0.01,Regularization=NONE,DropConfig=0.2+0.2+0.2+0.,DropRepetitions=2|LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=L2,DropConfig=0.1+0.1+0.1,DropRepetitions=1|LearningRate=1e-4,Momentum=0.3,Repetitions=1,ConvergenceSteps=10,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=NONE"
: The following options are set:
: - By User:
: <none>
: - Default:
: Boost_num: "0" [Number of times the classifier will be boosted]
: Parsing option string:
: ... "!H:V:ErrorStrategy=SUMOFSQUARES:VarTransform=G:WeightInitialization=XAVIERUNIFORM:Architecture=CPU:Layout=TANH|50,Layout=TANH|50,Layout=TANH|50,LINEAR:TrainingStrategy=LearningRate=1e-2,Momentum=0.5,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=10,WeightDecay=0.01,Regularization=NONE,DropConfig=0.2+0.2+0.2+0.,DropRepetitions=2|LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=L2,DropConfig=0.1+0.1+0.1,DropRepetitions=1|LearningRate=1e-4,Momentum=0.3,Repetitions=1,ConvergenceSteps=10,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=NONE"
: The following options are set:
: - By User:
: V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
: VarTransform: "G" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
: H: "False" [Print method-specific help message]
: Layout: "TANH|50,Layout=TANH|50,Layout=TANH|50,LINEAR" [Layout of the network.]
: ErrorStrategy: "SUMOFSQUARES" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
: WeightInitialization: "XAVIERUNIFORM" [Weight initialization strategy]
: Architecture: "CPU" [Which architecture to perform the training on.]
: TrainingStrategy: "LearningRate=1e-2,Momentum=0.5,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=10,WeightDecay=0.01,Regularization=NONE,DropConfig=0.2+0.2+0.2+0.,DropRepetitions=2|LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=L2,DropConfig=0.1+0.1+0.1,DropRepetitions=1|LearningRate=1e-4,Momentum=0.3,Repetitions=1,ConvergenceSteps=10,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=NONE" [Defines the training strategies.]
: - Default:
: VerbosityLevel: "Default" [Verbosity level]
: CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
: IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
: ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
DNN_CPU : [dataset] : Create Transformation "G" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Preparing the Gaussian transformation...
TFHandler_DNN_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.012586 1.0260 [ -3.3377 5.7307 ]
: var2: 0.0043504 1.0383 [ -4.5564 5.7307 ]
: fvalue: 165.93 84.643 [ 2.0973 391.01 ]
: -----------------------------------------------------------
Parsed Training DNN string LearningRate=1e-2,Momentum=0.5,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=10,WeightDecay=0.01,Regularization=NONE,DropConfig=0.2+0.2+0.2+0.,DropRepetitions=2|LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=20,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=L2,DropConfig=0.1+0.1+0.1,DropRepetitions=1|LearningRate=1e-4,Momentum=0.3,Repetitions=1,ConvergenceSteps=10,BatchSize=50,TestRepetitions=5,WeightDecay=0.01,Regularization=NONE
String has size 3
Factory : Booking method: BDTG
:
<WARNING> : Value for option maxdepth was previously set to 3
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
Factory : Train all methods
Factory : [dataset] : Create Transformation "I" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3759 1.1674 [ 0.0058046 4.9975 ]
: var2: 2.4823 1.4587 [ 0.0032142 4.9971 ]
: fvalue: 165.93 84.643 [ 2.0973 391.01 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
IdTransformation : Ranking result (top variable is best ranked)
: --------------------------------------------
: Rank : Variable : |Correlation with target|
: --------------------------------------------
: 1 : var2 : 7.636e-01
: 2 : var1 : 5.936e-01
: --------------------------------------------
IdTransformation : Ranking result (top variable is best ranked)
: -------------------------------------
: Rank : Variable : Mutual information
: -------------------------------------
: 1 : var2 : 2.315e+00
: 2 : var1 : 1.882e+00
: -------------------------------------
IdTransformation : Ranking result (top variable is best ranked)
: ------------------------------------
: Rank : Variable : Correlation Ratio
: ------------------------------------
: 1 : var1 : 6.545e+00
: 2 : var2 : 2.414e+00
: ------------------------------------
IdTransformation : Ranking result (top variable is best ranked)
: ----------------------------------------
: Rank : Variable : Correlation Ratio (T)
: ----------------------------------------
: 1 : var2 : 8.189e-01
: 2 : var1 : 3.128e-01
: ----------------------------------------
Factory : Train method: PDEFoam for Regression
:
: Build mono target regression foam
: Elapsed time: 0.605 sec
: Elapsed time for training with 1000 events: 0.612 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of PDEFoam on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.00525 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_PDEFoam.weights.xml
: writing foam MonoTargetRegressionFoam to file
: Foams written to file: dataset/weights/TMVARegression_PDEFoam.weights_foams.root
Factory : Training finished
:
Factory : Train method: KNN for Regression
:
KNN : <Train> start...
: Reading 1000 events
: Number of signal events 1000
: Number of background events 0
: Creating kd-tree with 1000 events
: Computing scale factor for 1d distributions: (ifrac, bottom, top) = (80%, 10%, 90%)
ModulekNN : Optimizing tree for 2 variables with 1000 values
: <Fill> Class 1 has 1000 events
: Elapsed time for training with 1000 events: 0.00144 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of KNN on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.00764 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_KNN.weights.xml
Factory : Training finished
:
Factory : Train method: LD for Regression
:
LD : Results for LD coefficients:
: -----------------------
: Variable: Coefficient:
: -----------------------
: var1: +42.509
: var2: +44.738
: (offset): -88.627
: -----------------------
: Elapsed time for training with 1000 events: 0.000373 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of LD on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.000578 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_LD.weights.xml
Factory : Training finished
:
Factory : Train method: DNN_CPU for Regression
:
TFHandler_DNN_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.012586 1.0260 [ -3.3377 5.7307 ]
: var2: 0.0043504 1.0383 [ -4.5564 5.7307 ]
: fvalue: 165.93 84.643 [ 2.0973 391.01 ]
: -----------------------------------------------------------
: Start of neural network training on CPU.
:
: Training phase 1 of 3:
: Epoch | Train Err. Test Err. GFLOP/s Conv. Steps
: --------------------------------------------------------------
: 10 | 1496.83 1224.76 1.73552 0
: 20 | 953.759 1119.18 1.7385 0
: 30 | 2889.57 2151.09 1.74086 10
: 40 | 1276.06 1106.05 1.71931 0
: 50 | 1866.81 1351.93 1.65446 10
: 60 | 2386.5 1998.55 1.5303 20
:
: Training phase 2 of 3:
: Epoch | Train Err. Test Err. GFLOP/s Conv. Steps
: --------------------------------------------------------------
: 5 | 9378.45 1440.63 1.81502 0
: 10 | 8814.92 1240.25 1.81667 0
: 15 | 8499.09 1191.97 1.81369 0
: 20 | 9230.29 1563.43 1.81355 5
: 25 | 10185.5 2726.69 1.81445 10
: 30 | 9515.84 2382.06 1.81989 15
: 35 | 8752.12 2328.03 1.81997 20
:
: Training phase 3 of 3:
: Epoch | Train Err. Test Err. GFLOP/s Conv. Steps
: --------------------------------------------------------------
: 5 | 1639.75 2225.24 1.93602 0
: 10 | 1602.13 2174.71 1.9262 0
: 15 | 1597.74 2162.76 1.92599 0
: 20 | 1593.52 2151.87 1.9345 0
: 25 | 1589.5 2142.42 1.92683 0
: 30 | 1585.72 2133.98 1.92486 0
: 35 | 1582.31 2126.31 1.92411 0
: 40 | 1578.64 2117.58 1.93134 0
: 45 | 1575.32 2109.75 1.91916 0
: 50 | 1572.06 2102.06 1.92511 0
: 55 | 1569.01 2095.62 1.92588 0
: 60 | 1565.91 2088.35 1.93008 0
: 65 | 1563.3 2081.9 1.92602 0
: 70 | 1560.21 2075.21 1.92309 0
: 75 | 1557.5 2068.98 1.9304 0
: 80 | 1554.88 2062.7 1.92592 0
: 85 | 1552.54 2056.95 1.92538 0
: 90 | 1549.82 2051.08 1.92534 0
: 95 | 1547.33 2045.48 1.92632 0
: 100 | 1544.82 2039.78 1.92873 0
: 105 | 1542.56 2034.65 1.92753 0
: 110 | 1540.97 2029.84 1.92827 0
: 115 | 1538.93 2025.02 1.92447 0
: 120 | 1536.87 2020.63 1.92691 0
: 125 | 1534.91 2015.82 1.92677 0
: 130 | 1533.01 2011.56 1.9264 0
: 135 | 1531.2 2007.51 1.92319 0
: 140 | 1529.41 2003.06 1.92062 0
: 145 | 1527.69 1999.19 1.91855 0
: 150 | 1526.01 1995.2 1.91528 0
: 155 | 1524.36 1990.81 1.92645 0
: 160 | 1522.77 1987.42 1.9253 0
: 165 | 1521.22 1983.95 1.92781 0
: 170 | 1519.7 1980.13 1.92439 0
: 175 | 1518.27 1976.29 1.92614 0
: 180 | 1516.88 1972.84 1.92788 0
: 185 | 1515.46 1969.8 1.92962 0
: 190 | 1514.09 1966.43 1.92442 0
: 195 | 1512.77 1963.25 1.92313 0
: 200 | 1511.5 1960.33 1.92709 0
: 205 | 1510.25 1957.01 1.92189 0
: 210 | 1509.04 1954.27 1.7556 0
: 215 | 1507.88 1951.21 1.92162 0
: 220 | 1506.71 1948.31 1.92763 0
: 225 | 1505.75 1945.76 1.92314 0
: 230 | 1504.49 1942.83 1.91933 0
: 235 | 1503.42 1939.87 1.93165 0
: 240 | 1502.38 1937.35 1.92612 0
: 245 | 1501.38 1934.79 1.91787 0
: 250 | 1500.39 1932.49 1.91935 0
: 255 | 1499.43 1929.66 1.93179 0
: 260 | 1498.49 1927.43 1.92063 0
: 265 | 1497.62 1925.19 1.92372 0
: 270 | 1496.7 1922.89 1.92464 0
: 275 | 1495.91 1920.38 1.91889 0
: 280 | 1494.94 1918.12 1.9202 0
: 285 | 1494.11 1916.35 1.92085 5
: 290 | 1493.29 1914.02 1.9285 0
: 295 | 1492.51 1911.56 1.92811 0
: 300 | 1491.72 1909.55 1.92504 0
: 305 | 1491 1907.29 1.92691 0
: 310 | 1490.22 1905.63 1.92592 5
: 315 | 1489.49 1903.66 1.92085 0
: 320 | 1488.77 1901.54 1.9247 0
: 325 | 1488.08 1899.74 1.92468 5
: 330 | 1487.42 1897.79 1.92645 0
: 335 | 1486.74 1896.13 1.92602 5
: 340 | 1486.12 1894.49 1.92173 0
: 345 | 1485.47 1892.26 1.92558 0
: 350 | 1484.83 1890.64 1.9288 5
: 355 | 1484.23 1888.85 1.92233 0
: 360 | 1483.65 1887.04 1.93237 5
: 365 | 1483.06 1885.23 1.93264 0
: 370 | 1482.45 1883.76 1.92285 5
: 375 | 1481.89 1882.06 1.92148 0
: 380 | 1481.37 1880.77 1.92774 5
: 385 | 1480.8 1878.91 1.92337 0
: 390 | 1480.28 1877.53 1.92459 5
: 395 | 1479.77 1876.14 1.9263 0
: 400 | 1479.25 1874.31 1.93088 5
: 405 | 1478.76 1873.17 1.92262 0
: 410 | 1478.28 1871.54 1.92167 5
: 415 | 1477.79 1870.11 1.92664 0
: 420 | 1477.33 1868.65 1.92087 5
: 425 | 1476.87 1867.19 1.92406 0
: 430 | 1476.42 1866.03 1.92774 5
: 435 | 1476 1864.36 1.92522 0
: 440 | 1475.55 1863.32 1.92098 5
: 445 | 1475.11 1861.99 1.92256 0
: 450 | 1470.38 1859.77 1.93114 0
: 455 | 1402.4 1816.37 1.91728 0
: 460 | 1401.27 1814.84 1.9205 5
: 465 | 1400.22 1813.8 1.92087 0
: 470 | 1399.26 1812.88 1.92731 5
: 475 | 1398.29 1811.3 1.92605 0
: 480 | 1397.39 1810.17 1.93217 5
: 485 | 1396.49 1809.53 1.92566 10
:
: Elapsed time for training with 1000 events: 8.14 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of DNN_CPU on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.0198 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_DNN_CPU.weights.xml
Factory : Training finished
:
Factory : Train method: BDTG for Regression
:
: Regression Loss Function: Huber
: Training 2000 Decision Trees ... patience please
: Elapsed time for training with 1000 events: 1.57 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of BDTG on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.359 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_BDTG.weights.xml
: TMVAReg.root:/dataset/Method_BDT/BDTG
Factory : Training finished
:
TH1.Print Name = TrainingHistory_DNN_CPU_testError, Entries= 0, Total sum= 211629
TH1.Print Name = TrainingHistory_DNN_CPU_trainingError, Entries= 0, Total sum= 221505
Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: dataset/weights/TMVARegression_PDEFoam.weights.xml
: Read foams from file: dataset/weights/TMVARegression_PDEFoam.weights_foams.root
: Reading weight file: dataset/weights/TMVARegression_KNN.weights.xml
: Creating kd-tree with 1000 events
: Computing scale factor for 1d distributions: (ifrac, bottom, top) = (80%, 10%, 90%)
ModulekNN : Optimizing tree for 2 variables with 1000 values
: <Fill> Class 1 has 1000 events
: Reading weight file: dataset/weights/TMVARegression_LD.weights.xml
: Reading weight file: dataset/weights/TMVARegression_DNN_CPU.weights.xml
: Reading weight file: dataset/weights/TMVARegression_BDTG.weights.xml
Factory : Test all methods
Factory : Test method: PDEFoam for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of PDEFoam on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.0686 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: KNN for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of KNN on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.0744 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: LD for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of LD on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.00326 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: DNN_CPU for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of DNN_CPU on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.176 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: BDTG for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of BDTG on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 2.12 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Evaluate all methods
: Evaluate regression method: PDEFoam
: TestRegression (testing)
: Calculate regression for all events
: Elapsed time for evaluation of 9000 events: 0.0437 sec
: TestRegression (training)
: Calculate regression for all events
: Elapsed time for evaluation of 1000 events: 0.00509 sec
TFHandler_PDEFoam : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3352 1.1893 [ 0.00020069 5.0000 ]
: var2: 2.4860 1.4342 [ 0.00071490 5.0000 ]
: fvalue: 163.91 83.651 [ 1.6186 394.84 ]
: -----------------------------------------------------------
: Evaluate regression method: KNN
: TestRegression (testing)
: Calculate regression for all events
: Elapsed time for evaluation of 9000 events: 0.0776 sec
: TestRegression (training)
: Calculate regression for all events
: Elapsed time for evaluation of 1000 events: 0.00898 sec
TFHandler_KNN : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3352 1.1893 [ 0.00020069 5.0000 ]
: var2: 2.4860 1.4342 [ 0.00071490 5.0000 ]
: fvalue: 163.91 83.651 [ 1.6186 394.84 ]
: -----------------------------------------------------------
: Evaluate regression method: LD
: TestRegression (testing)
: Calculate regression for all events
: Elapsed time for evaluation of 9000 events: 0.00501 sec
: TestRegression (training)
: Calculate regression for all events
: Elapsed time for evaluation of 1000 events: 0.000596 sec
TFHandler_LD : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3352 1.1893 [ 0.00020069 5.0000 ]
: var2: 2.4860 1.4342 [ 0.00071490 5.0000 ]
: fvalue: 163.91 83.651 [ 1.6186 394.84 ]
: -----------------------------------------------------------
: Evaluate regression method: DNN_CPU
: TestRegression (testing)
: Calculate regression for all events
: Elapsed time for evaluation of 9000 events: 0.168 sec
: TestRegression (training)
: Calculate regression for all events
: Elapsed time for evaluation of 1000 events: 0.019 sec
TFHandler_DNN_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: -0.027278 1.0264 [ -3.3694 5.7307 ]
: var2: 0.0056047 0.98632 [ -5.7307 5.7307 ]
: fvalue: 163.91 83.651 [ 1.6186 394.84 ]
: -----------------------------------------------------------
TFHandler_DNN_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: -0.027278 1.0264 [ -3.3694 5.7307 ]
: var2: 0.0056047 0.98632 [ -5.7307 5.7307 ]
: fvalue: 163.91 83.651 [ 1.6186 394.84 ]
: -----------------------------------------------------------
: Evaluate regression method: BDTG
: TestRegression (testing)
: Calculate regression for all events
: Elapsed time for evaluation of 9000 events: 2.13 sec
: TestRegression (training)
: Calculate regression for all events
: Elapsed time for evaluation of 1000 events: 0.234 sec
TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3352 1.1893 [ 0.00020069 5.0000 ]
: var2: 2.4860 1.4342 [ 0.00071490 5.0000 ]
: fvalue: 163.91 83.651 [ 1.6186 394.84 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by smallest RMS on test sample:
: ("Bias" quotes the mean deviation of the regression from true target.
: "MutInf" is the "Mutual Information" between regression and target.
: Indicated by "_T" are the corresponding "truncated" quantities obtained
: when removing events deviating more than 2sigma from average.)
: --------------------------------------------------------------------------------------------------
: DataSet Name: MVA Method: <Bias> <Bias_T> RMS RMS_T | MutInf MutInf_T
: --------------------------------------------------------------------------------------------------
: dataset BDTG : 0.0707 0.102 2.45 1.95 | 3.100 3.175
: dataset KNN : -0.237 0.578 5.17 3.44 | 2.898 2.939
: dataset PDEFoam : 0.106 -0.0677 9.22 7.74 | 2.283 2.375
: dataset LD : 0.461 2.22 19.6 17.6 | 1.985 1.979
: dataset DNN_CPU : -2.21e+08 -2.22e+08 3.01e+08 2.61e+08 | 0.000 0.000
: --------------------------------------------------------------------------------------------------
:
: Evaluation results ranked by smallest RMS on training sample:
: (overtraining check)
: --------------------------------------------------------------------------------------------------
: DataSet Name: MVA Method: <Bias> <Bias_T> RMS RMS_T | MutInf MutInf_T
: --------------------------------------------------------------------------------------------------
: dataset BDTG : 0.0597 0.0107 0.566 0.293 | 3.441 3.466
: dataset KNN : -0.425 0.423 5.19 3.54 | 3.006 3.034
: dataset PDEFoam : 8.35e-07 0.106 8.04 6.57 | 2.488 2.579
: dataset LD : -1.03e-06 1.54 20.1 18.5 | 2.134 2.153
: dataset DNN_CPU : -2.12e+08 -2.14e+08 3.05e+08 2.62e+08 | -0.000 -0.000
: --------------------------------------------------------------------------------------------------
:
Dataset:dataset : Created tree 'TestTree' with 9000 events
:
Dataset:dataset : Created tree 'TrainTree' with 1000 events
:
Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
==> Wrote root file: TMVAReg.root
==> TMVARegression is done!
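The macro that produced this output is reproduced in full below. The output file itself can be inspected from the ROOT prompt before launching any GUI (an illustrative snippet; _file0 is the handle ROOT assigns to a file opened from the command line):

root -l TMVAReg.root
root [0] _file0->ls()   // shows the dataset directory with the trees and method subdirectories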
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

#include "TChain.h"
#include "TFile.h"
#include "TTree.h"
#include "TString.h"
#include "TObjString.h"
#include "TSystem.h"
#include "TROOT.h"

#include "TMVA/Tools.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/TMVARegGui.h"
using namespace TMVA;
void TMVARegression( TString myMethodList = "" )
{
   // The explicit loading of the shared libTMVA is done in TMVAlogon.C, defined in .rootrc
   // if you use your private .rootrc, or run from a different directory, please copy the
   // corresponding lines from .rootrc
   // methods to be processed can be given as an argument; use format:
   //
   // mylinux~> root -l TMVARegression.C\(\"myMethod1,myMethod2,myMethod3\"\)
   //
   //---------------------------------------------------------------
   // This loads the library
   TMVA::Tools::Instance();

   // Default MVA methods to be trained + tested
   std::map<std::string,int> Use;

   // Multidimensional likelihood and nearest-neighbour methods
   Use["PDERS"]    = 0;
   Use["PDEFoam"]  = 1;
   Use["KNN"]      = 1;
   //
   // Linear Discriminant Analysis
   Use["LD"]       = 1;
   //
   // Function Discriminant analysis
   Use["FDA_GA"]   = 0;
   Use["FDA_MC"]   = 0;
   Use["FDA_MT"]   = 0;
   Use["FDA_GAMT"] = 0;
   //
   // Neural Network
   Use["MLP"]      = 0;
#ifdef R__HAS_TMVACPU
   Use["DNN_CPU"]  = 1;
#else
   Use["DNN_CPU"]  = 0;
#endif
   //
   // Support Vector Machine
   Use["SVM"]      = 0;
   //
   // Boosted Decision Trees
   Use["BDT"]      = 0;
   Use["BDTG"]     = 1;
   // ---------------------------------------------------------------

   std::cout << std::endl;
   std::cout << "==> Start TMVARegression" << std::endl;

   // Select methods (don't look at this code - not of interest)
   if (myMethodList != "") {
      for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) it->second = 0;

      std::vector<TString> mlist = gTools().SplitString( myMethodList, ',' );
      for (UInt_t i=0; i<mlist.size(); i++) {
         std::string regMethod(mlist[i].Data());
         if (Use.find(regMethod) == Use.end()) {
            std::cout << "Method \"" << regMethod << "\" not known in TMVA under this name. Choose among the following:" << std::endl;
            for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) std::cout << it->first << " ";
            std::cout << std::endl;
            return;
         }
         Use[regMethod] = 1;
      }
   }
   // --------------------------------------------------------------------------------------------------

   // Here the preparation phase begins

   // Create a new root output file
   TString outfileName( "TMVAReg.root" );
   TFile* outputFile = TFile::Open( outfileName, "RECREATE" );

   // Create the factory object. Later you can choose the methods
   // whose performance you'd like to investigate. The factory will
   // then run the performance analysis for you.
   //
   // The first argument is the base of the name of all the
   // weight files in the directory weights/
   //
   // The second argument is the output file for the training results
   // All TMVA output can be suppressed by removing the "!" (not) in
   // front of the "Silent" argument in the option string
   TMVA::Factory *factory = new TMVA::Factory( "TMVARegression", outputFile,
                                               "!V:!Silent:Color:DrawProgressBar:AnalysisType=Regression" );

   TMVA::DataLoader *dataloader = new TMVA::DataLoader("dataset");

   // If you wish to modify default settings
   // (please check "src/Config.h" to see all available global options)
   //
   //     (TMVA::gConfig().GetVariablePlotting()).fTimesRMS = 8.0;
   //     (TMVA::gConfig().GetIONames()).fWeightFileDir = "myWeightDirectory";

   // Define the input variables that shall be used for the MVA training
   // note that you may also use variable expressions, such as: "3*var1/var2*abs(var3)"
   // [all types of expressions that can also be parsed by TTree::Draw( "expression" )]
   dataloader->AddVariable( "var1", "Variable 1", "units", 'F' );
   dataloader->AddVariable( "var2", "Variable 2", "units", 'F' );

   // You can add so-called "Spectator variables", which are not used in the MVA training,
   // but will appear in the final "TestTree" produced by TMVA. This TestTree will contain the
   // input variables, the response values of all trained MVAs, and the spectator variables
   dataloader->AddSpectator( "spec1:=var1*2", "Spectator 1", "units", 'F' );
   dataloader->AddSpectator( "spec2:=var1*3", "Spectator 2", "units", 'F' );

   // Add the variable carrying the regression target
   dataloader->AddTarget( "fvalue" );

   // It is also possible to declare additional targets for multi-dimensional regression, e.g.:
   //     dataloader->AddTarget( "fvalue2" );
   // BUT: this is currently ONLY implemented for MLP
   // Read training and test data (see TMVAClassification for reading ASCII files)
   // load the signal and background event samples from ROOT trees
   TFile *input(0);
   TString fname = "./tmva_reg_example.root";
   if (!gSystem->AccessPathName( fname )) {
      input = TFile::Open( fname ); // check if file in local directory exists
   }
   else {
      TFile::SetCacheFileDir(".");
      input = TFile::Open("http://root.cern.ch/files/tmva_reg_example.root", "CACHEREAD"); // if not: download from ROOT server
   }
   if (!input) {
      std::cout << "ERROR: could not open data file" << std::endl;
      exit(1);
   }
   std::cout << "--- TMVARegression : Using input file: " << input->GetName() << std::endl;

   // Register the regression tree
   TTree *regTree = (TTree*)input->Get("TreeR");

   // global event weights per tree (see below for setting event-wise weights)
   Double_t regWeight = 1.0;

   // You can add an arbitrary number of regression trees
   dataloader->AddRegressionTree( regTree, regWeight );

   // This would set individual event weights (the variables defined in the
   // expression need to exist in the original TTree)
   dataloader->SetWeightExpression( "var1", "Regression" );

   // Apply additional cuts on the signal and background samples (can be different)
   TCut mycut = ""; // for example: TCut mycut = "abs(var1)<0.5 && abs(var2-0.5)<1";

   // tell the DataLoader to use all remaining events in the trees after training for testing:
   dataloader->PrepareTrainingAndTestTree( mycut,
                                           "nTrain_Regression=1000:nTest_Regression=0:SplitMode=Random:NormMode=NumEvents:!V" );
   //
   // dataloader->PrepareTrainingAndTestTree( mycut,
   //        "nTrain_Regression=0:nTest_Regression=0:SplitMode=Random:NormMode=NumEvents:!V" );

   // If no numbers of events are given, half of the events in the tree are used
   // for training, and the other half for testing:
   //
   //     dataloader->PrepareTrainingAndTestTree( mycut, "SplitMode=random:!V" );
   // Book MVA methods
   //
   // Please lookup the various method configuration options in the corresponding cxx files, eg:
   // src/MethodCuts.cxx, etc, or here: http://tmva.sourceforge.net/optionRef.html
   // it is possible to preset ranges in the option string in which the cut optimisation should be done:
   // "...:CutRangeMin[2]=-1:CutRangeMax[2]=1...", where [2] is the third input variable

   // PDE - RS method
   if (Use["PDERS"])
      factory->BookMethod( dataloader, TMVA::Types::kPDERS, "PDERS",
                           "!H:!V:NormTree=T:VolumeRangeMode=Adaptive:KernelEstimator=Gauss:GaussSigma=0.3:NEventsMin=40:NEventsMax=60:VarTransform=None" );
   // And the options strings for the MinMax and RMS methods, respectively:
   //
   //      "!H:!V:VolumeRangeMode=MinMax:DeltaFrac=0.2:KernelEstimator=Gauss:GaussSigma=0.3" );
   //      "!H:!V:VolumeRangeMode=RMS:DeltaFrac=3:KernelEstimator=Gauss:GaussSigma=0.3" );

   if (Use["PDEFoam"])
      factory->BookMethod( dataloader, TMVA::Types::kPDEFoam, "PDEFoam",
                           "!H:!V:MultiTargetRegression=F:TargetSelection=Mpv:TailCut=0.001:VolFrac=0.0666:nActiveCells=500:nSampl=2000:nBin=5:Compress=T:Kernel=None:Nmin=10:VarTransform=None" );

   // K-Nearest Neighbour classifier (KNN)
   if (Use["KNN"])
      factory->BookMethod( dataloader, TMVA::Types::kKNN, "KNN",
                           "nkNN=20:ScaleFrac=0.8:SigmaFact=1.0:Kernel=Gaus:UseKernel=F:UseWeight=T:!Trim" );

   // Linear discriminant
   if (Use["LD"])
      factory->BookMethod( dataloader, TMVA::Types::kLD, "LD",
                           "!H:!V:VarTransform=None" );

   // Function discrimination analysis (FDA) -- test of various fitters - the recommended one is Minuit (or GA or SA)
   if (Use["FDA_MC"])
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_MC",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=MC:SampleSize=100000:Sigma=0.1:VarTransform=D" );

   if (Use["FDA_GA"]) // can also use Simulated Annealing (SA) algorithm (see Cuts_SA options) .. the formula of this example is good for parabolas
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_GA",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=GA:PopSize=100:Cycles=3:Steps=30:Trim=True:SaveBestGen=1:VarTransform=Norm" );

   if (Use["FDA_MT"])
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_MT",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100);(-10,10):FitMethod=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=2:UseImprove:UseMinos:SetBatch" );

   if (Use["FDA_GAMT"])
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_GAMT",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=GA:Converger=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=0:!UseImprove:!UseMinos:SetBatch:Cycles=1:PopSize=5:Steps=5:Trim" );

   // Neural network (MLP)
   if (Use["MLP"])
      factory->BookMethod( dataloader, TMVA::Types::kMLP, "MLP", "!H:!V:VarTransform=Norm:NeuronType=tanh:NCycles=20000:HiddenLayers=N+20:TestRate=6:TrainingMethod=BFGS:Sampling=0.3:SamplingEpoch=0.8:ConvergenceImprove=1e-6:ConvergenceTests=15:!UseRegulator" );
if (Use["DNN_CPU"]) {
/*
TString layoutString ("Layout=TANH|(N+100)*2,LINEAR");
TString layoutString ("Layout=SOFTSIGN|100,SOFTSIGN|50,SOFTSIGN|20,LINEAR");
TString layoutString ("Layout=RELU|300,RELU|100,RELU|30,RELU|10,LINEAR");
TString layoutString ("Layout=SOFTSIGN|50,SOFTSIGN|30,SOFTSIGN|20,SOFTSIGN|10,LINEAR");
TString layoutString ("Layout=TANH|50,TANH|30,TANH|20,TANH|10,LINEAR");
TString layoutString ("Layout=SOFTSIGN|50,SOFTSIGN|20,LINEAR");
TString layoutString ("Layout=TANH|100,TANH|30,LINEAR");
*/
TString layoutString("Layout=TANH|50,Layout=TANH|50,Layout=TANH|50,LINEAR");
TString training0("LearningRate=1e-2,Momentum=0.5,Repetitions=1,ConvergenceSteps=20,BatchSize=50,"
"TestRepetitions=10,WeightDecay=0.01,Regularization=NONE,DropConfig=0.2+0.2+0.2+0.,"
"DropRepetitions=2");
TString training1("LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=20,BatchSize=50,"
"TestRepetitions=5,WeightDecay=0.01,Regularization=L2,DropConfig=0.1+0.1+0.1,DropRepetitions="
"1");
TString training2("LearningRate=1e-4,Momentum=0.3,Repetitions=1,ConvergenceSteps=10,BatchSize=50,"
"TestRepetitions=5,WeightDecay=0.01,Regularization=NONE");
TString trainingStrategyString("TrainingStrategy=");
trainingStrategyString += training0 + "|" + training1 + "|" + training2;
// TString trainingStrategyString
// ("TrainingStrategy=LearningRate=1e-1,Momentum=0.3,Repetitions=3,ConvergenceSteps=20,BatchSize=30,TestRepetitions=7,WeightDecay=0.0,L1=false,DropFraction=0.0,DropRepetitions=5");
TString nnOptions(
"!H:V:ErrorStrategy=SUMOFSQUARES:VarTransform=G:WeightInitialization=XAVIERUNIFORM:Architecture=CPU");
// TString nnOptions ("!H:V:VarTransform=Normalize:ErrorStrategy=CHECKGRADIENTS");
nnOptions.Append(":");
nnOptions.Append(layoutString);
nnOptions.Append(":");
nnOptions.Append(trainingStrategyString);
factory->BookMethod(dataloader, TMVA::Types::kDNN, "DNN_CPU", nnOptions); // NN
}
// Support Vector Machine
if (Use["SVM"])
factory->BookMethod( dataloader, TMVA::Types::kSVM, "SVM", "Gamma=0.25:Tol=0.001:VarTransform=Norm" );
// Boosted Decision Trees
if (Use["BDT"])
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDT",
"!H:!V:NTrees=100:MinNodeSize=1.0%:BoostType=AdaBoostR2:SeparationType=RegressionVariance:nCuts=20:PruneMethod=CostComplexity:PruneStrength=30" );
if (Use["BDTG"])
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
"!H:!V:NTrees=2000::BoostType=Grad:Shrinkage=0.1:UseBaggedBoost:BaggedSampleFraction=0.5:nCuts=20:MaxDepth=3:MaxDepth=4" );
   // --------------------------------------------------------------------------------------------------

   // Now you can tell the factory to train, test, and evaluate the MVAs

   // Train MVAs using the set of training events
   factory->TrainAllMethods();

   // Evaluate all MVAs using the set of test events
   factory->TestAllMethods();

   // Evaluate and compare performance of all configured MVAs
   factory->EvaluateAllMethods();

   // --------------------------------------------------------------

   // Save the output
   outputFile->Close();

   std::cout << "==> Wrote root file: " << outputFile->GetName() << std::endl;
   std::cout << "==> TMVARegression is done!" << std::endl;

   delete factory;
   delete dataloader;

   // Launch the GUI for the root macros
   if (!gROOT->IsBatch()) TMVA::TMVARegGui( outfileName );
}
int main( int argc, char** argv )
{
   // Select methods (don't look at this code - not of interest)
   TString methodList;
   for (int i=1; i<argc; i++) {
      TString regMethod(argv[i]);
      if (regMethod=="-b" || regMethod=="--batch") continue;
      if (!methodList.IsNull()) methodList += TString(",");
      methodList += regMethod;
   }
   TMVARegression(methodList);
   return 0;
}
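The weight files written under dataset/weights/ can later be applied to new data with the TMVA::Reader class. A minimal application sketch follows; it is not part of this tutorial, the function name ApplyRegression and the example input values are hypothetical, and it assumes the macro above has already produced dataset/weights/TMVARegression_LD.weights.xml. Note that the Reader requires the spectators declared at training time to be registered as well.

#include <iostream>
#include "TMVA/Reader.h"

void ApplyRegression()   // hypothetical helper, not part of the tutorial
{
   Float_t var1, var2, spec1, spec2;
   TMVA::Reader reader("!Color:!Silent");

   // Variables must be declared with the same names and order as in training
   reader.AddVariable("var1", &var1);
   reader.AddVariable("var2", &var2);

   // Spectators added during training must be registered as well
   reader.AddSpectator("spec1:=var1*2", &spec1);
   reader.AddSpectator("spec2:=var1*3", &spec2);

   reader.BookMVA("LD method", "dataset/weights/TMVARegression_LD.weights.xml");

   var1 = 3.0; var2 = 2.5;   // an arbitrary example point
   // EvaluateRegression returns one value per target; this tutorial has a single target (fvalue)
   std::cout << "LD estimate of fvalue: "
             << reader.EvaluateRegression("LD method")[0] << std::endl;
}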
Author
Andreas Hoecker

Definition in file TMVARegression.C.