==> Start TMVAMulticlass
--- TMVAMulticlass: Using input file: ./files/tmva_multiclass_example.root
DataSetInfo : [dataset] : Added class "Signal"
: Add Tree TreeS of type Signal with 2000 events
DataSetInfo : [dataset] : Added class "bg0"
: Add Tree TreeB0 of type bg0 with 2000 events
DataSetInfo : [dataset] : Added class "bg1"
: Add Tree TreeB1 of type bg1 with 2000 events
DataSetInfo : [dataset] : Added class "bg2"
: Add Tree TreeB2 of type bg2 with 2000 events
: Dataset[dataset] : Class index : 0 name : Signal
: Dataset[dataset] : Class index : 1 name : bg0
: Dataset[dataset] : Class index : 2 name : bg1
: Dataset[dataset] : Class index : 3 name : bg2
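The class and tree registration above corresponds to the DataLoader setup of the tutorial macro. A minimal sketch of that setup is shown below; the file, tree, and class names are taken from the log, while the variable list (var1..var4) and the Factory option string are assumptions based on the standard TMVAMulticlass tutorial.

    // ROOT macro fragment (in compiled code include TFile.h, TMVA/Factory.h, TMVA/DataLoader.h).
    TFile *input  = TFile::Open("./files/tmva_multiclass_example.root");
    TFile *output = TFile::Open("TMVAMulticlass.root", "RECREATE");

    // Multiclass analysis; the Transformations option matches the I;D;P;G,D set
    // prepared later in this log.
    TMVA::Factory *factory = new TMVA::Factory("TMVAMulticlass", output,
        "!V:!Silent:Color:Transformations=I;D;P;G,D:AnalysisType=multiclass");
    TMVA::DataLoader *dataloader = new TMVA::DataLoader("dataset");

    dataloader->AddVariable("var1", 'F');
    dataloader->AddVariable("var2", 'F');
    dataloader->AddVariable("var3", 'F');
    dataloader->AddVariable("var4", 'F');

    // One tree per class (2000 events each); the class name passed here is what the
    // log reports as "Added class ...".
    dataloader->AddTree((TTree*)input->Get("TreeS"),  "Signal");
    dataloader->AddTree((TTree*)input->Get("TreeB0"), "bg0");
    dataloader->AddTree((TTree*)input->Get("TreeB1"), "bg1");
    dataloader->AddTree((TTree*)input->Get("TreeB2"), "bg2");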
Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
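The NegWeightTreatment message above is triggered by booking a gradient-boosted BDT. A hedged sketch of the booking call follows; only BoostType=Grad and NTrees=1000 (matching "Training 1000 Decision Trees" later in the log) are confirmed by the output, the remaining options are assumed tutorial-style defaults.

    // Gradient-boosted BDT ("BDTG"); BoostType=Grad is what triggers the
    // NegWeightTreatment message above. Options other than NTrees and BoostType
    // are assumptions.
    factory->BookMethod(dataloader, TMVA::Types::kBDT, "BDTG",
        "!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.10:"
        "UseBaggedBoost:BaggedSampleFraction=0.50:nCuts=20:MaxDepth=2");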
: Building event vectors for type 2 Signal
: Dataset[dataset] : create input formulas for tree TreeS
: Building event vectors for type 2 bg0
: Dataset[dataset] : create input formulas for tree TreeB0
: Building event vectors for type 2 bg1
: Dataset[dataset] : create input formulas for tree TreeB1
: Building event vectors for type 2 bg2
: Dataset[dataset] : create input formulas for tree TreeB2
DataSetFactory : [dataset] : Number of events in input trees
:
:
:
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 1000
: Signal -- testing events : 1000
: Signal -- training and testing events: 2000
: bg0 -- training events : 1000
: bg0 -- testing events : 1000
: bg0 -- training and testing events: 2000
: bg1 -- training events : 1000
: bg1 -- testing events : 1000
: bg1 -- training and testing events: 2000
: bg2 -- training events : 1000
: bg2 -- testing events : 1000
: bg2 -- training and testing events: 2000
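The 1000/1000 split per class shown above is the default 50/50 random split produced by the train/test preparation. A sketch, assuming the tutorial's empty cut and default split options:

    // With no explicit nTrain/nTest request, each class is split 50/50 at random,
    // giving the 1000 training / 1000 testing events per class reported above.
    dataloader->PrepareTrainingAndTestTree("",
        "SplitMode=Random:NormMode=NumEvents:!V");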
:
DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.397 +0.623 +0.832
: var2: +0.397 +1.000 +0.716 +0.737
: var3: +0.623 +0.716 +1.000 +0.859
: var4: +0.832 +0.737 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg0):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.365 +0.592 +0.811
: var2: +0.365 +1.000 +0.708 +0.740
: var3: +0.592 +0.708 +1.000 +0.859
: var4: +0.811 +0.740 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg1):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.407 +0.610 +0.834
: var2: +0.407 +1.000 +0.710 +0.741
: var3: +0.610 +0.710 +1.000 +0.851
: var4: +0.834 +0.741 +0.851 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg2):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 -0.647 -0.016 -0.013
: var2: -0.647 +1.000 +0.015 +0.002
: var3: -0.016 +0.015 +1.000 -0.024
: var4: -0.013 +0.002 -0.024 +1.000
: ----------------------------------------
DataSetFactory : [dataset] :
:
Factory : Booking method: MLP
:
MLP : Building Network.
: Initializing weights
Factory : Booking method: PDEFoam
:
Factory : Booking method: DL_CPU
:
: Parsing option string:
: ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:WeightInitialization=XAVIERUNIFORM:Architecture=GPU:Layout=TANH|100,TANH|50,TANH|10,LINEAR:TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100"
: The following options are set:
: - By User:
: <none>
: - Default:
: Boost_num: "0" [Number of times the classifier will be boosted]
: Parsing option string:
: ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:WeightInitialization=XAVIERUNIFORM:Architecture=GPU:Layout=TANH|100,TANH|50,TANH|10,LINEAR:TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100"
: The following options are set:
: - By User:
: V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
: VarTransform: "N" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
: H: "False" [Print method-specific help message]
: Layout: "TANH|100,TANH|50,TANH|10,LINEAR" [Layout of the network.]
: ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
: WeightInitialization: "XAVIERUNIFORM" [Weight initialization strategy]
: Architecture: "GPU" [Which architecture to perform the training on.]
: TrainingStrategy: "Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100" [Defines the training strategies.]
: - Default:
: VerbosityLevel: "Default" [Verbosity level]
: CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
: IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
: InputLayout: "0|0|0" [The Layout of the input]
: BatchLayout: "0|0|0" [The Layout of the batch]
: RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
: ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
DL_CPU : [dataset] : Create Transformation "N" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<ERROR> : CUDA backend not enabled. Please make sure you have CUDA installed and it was successfully detected by CMAKE by using -Dtmva-gpu=On
: Will now use the CPU architecture instead!
: Will now use the CPU architecture with BLAS and IMT support!
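The DL_CPU method was booked with the option string echoed at the start of this block; because the CUDA backend is unavailable, training falls back to the multi-threaded CPU implementation. A sketch of the booking call, with the method label and option string taken verbatim from the log:

    // Deep-learning method requested with Architecture=GPU; without CUDA support
    // TMVA falls back to the CPU backend, as the error message above shows.
    factory->BookMethod(dataloader, TMVA::Types::kDL, "DL_CPU",
        "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
        "WeightInitialization=XAVIERUNIFORM:Architecture=GPU:"
        "Layout=TANH|100,TANH|50,TANH|10,LINEAR:"
        "TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,"
        "ConvergenceSteps=10,BatchSize=100");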
Factory : Train all methods
Factory : [dataset] : Create Transformation "I" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "P" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "G" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.047647 1.0025 [ -3.6592 3.2645 ]
: var2: 0.32647 1.0646 [ -3.6891 3.7877 ]
: var3: 0.11493 1.1230 [ -4.5727 4.5640 ]
: var4: -0.076531 1.2652 [ -4.8486 5.0412 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.082544 1.0000 [ -3.6274 3.1017 ]
: var2: 0.36715 1.0000 [ -3.3020 3.4950 ]
: var3: 0.066865 1.0000 [ -2.9882 3.3086 ]
: var4: -0.20593 1.0000 [ -3.3088 2.8423 ]
: -----------------------------------------------------------
: Preparing the Principal Component (PCA) transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 5.7502e-10 1.8064 [ -8.0344 7.8312 ]
: var2:-1.6078e-11 0.90130 [ -2.6765 2.7523 ]
: var3: 3.0841e-10 0.73386 [ -2.6572 2.2255 ]
: var4:-2.6886e-10 0.62168 [ -1.7384 2.2297 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.013510 1.0000 [ -2.6520 6.2074 ]
: var2: 0.0096839 1.0000 [ -2.8402 6.3073 ]
: var3: 0.010397 1.0000 [ -3.0251 5.8860 ]
: var4: 0.0053980 1.0000 [ -3.0998 5.7078 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
Factory : Train method: BDTG for Multiclass classification
:
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 4000 events: 5.38 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of BDTG on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 1.87 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_BDTG.class.C
: TMVAMulticlass.root:/dataset/Method_BDT/BDTG
Factory : Training finished
:
Factory : Train method: MLP for Multiclass classification
:
: Training Network
:
: Elapsed time for training with 4000 events: 23.3 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of MLP on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0159 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_MLP.class.C
: Write special histos to file: TMVAMulticlass.root:/dataset/Method_MLP/MLP
Factory : Training finished
:
Factory : Train method: PDEFoam for Multiclass classification
:
: Build up multiclass foam 0
: Elapsed time: 0.66 sec
: Build up multiclass foam 1
: Elapsed time: 0.667 sec
: Build up multiclass foam 2
: Elapsed time: 0.67 sec
: Build up multiclass foam 3
: Elapsed time: 0.468 sec
: Elapsed time for training with 4000 events: 2.64 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of PDEFoam on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.132 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_PDEFoam.weights.xml
: writing foam MultiClassFoam0 to file
: writing foam MultiClassFoam1 to file
: writing foam MultiClassFoam2 to file
: writing foam MultiClassFoam3 to file
: Foams written to file: dataset/weights/TMVAMulticlass_PDEFoam.weights_foams.root
: Creating standalone class: dataset/weights/TMVAMulticlass_PDEFoam.class.C
Factory : Training finished
:
Factory : Train method: DL_CPU for Multiclass classification
:
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070769 0.28960 [ -1.0000 1.0000 ]
: var2: 0.074130 0.28477 [ -1.0000 1.0000 ]
: var3: 0.026106 0.24582 [ -1.0000 1.0000 ]
: var4: -0.034951 0.25587 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: Start of deep neural network training on CPU using MT, nthreads = 1
:
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070769 0.28960 [ -1.0000 1.0000 ]
: var2: 0.074130 0.28477 [ -1.0000 1.0000 ]
: var3: 0.026106 0.24582 [ -1.0000 1.0000 ]
: var4: -0.034951 0.25587 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: ***** Deep Learning Network *****
DEEP NEURAL NETWORK: Depth = 4 Input = ( 1, 1, 4 ) Batch size = 100 Loss function = C
Layer 0 DENSE Layer: ( Input = 4 , Width = 100 ) Output = ( 1 , 100 , 100 ) Activation Function = Tanh
Layer 1 DENSE Layer: ( Input = 100 , Width = 50 ) Output = ( 1 , 100 , 50 ) Activation Function = Tanh
Layer 2 DENSE Layer: ( Input = 50 , Width = 10 ) Output = ( 1 , 100 , 10 ) Activation Function = Tanh
Layer 3 DENSE Layer: ( Input = 10 , Width = 4 ) Output = ( 1 , 100 , 4 ) Activation Function = Identity
: Using 3200 events for training and 800 for testing
: Compute initial loss on the validation data
: Training phase 1 of 1: Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 0.700324
: --------------------------------------------------------------
: Epoch | Train Err. Val. Err. t(s)/epoch t(s)/Loss nEvents/s Conv. Steps
: --------------------------------------------------------------
: Start epoch iteration ...
: 1 Minimum Test error found - save the configuration
: 1 | 0.605172 0.539089 0.0762533 0.00662036 45955.3 0
: 2 Minimum Test error found - save the configuration
: 2 | 0.513715 0.504785 0.0773298 0.00662746 45260.2 0
: 3 Minimum Test error found - save the configuration
: 3 | 0.489483 0.481516 0.0778434 0.00667873 44966.1 0
: 4 Minimum Test error found - save the configuration
: 4 | 0.468892 0.456093 0.0781116 0.00668817 44803.2 0
: 5 Minimum Test error found - save the configuration
: 5 | 0.447349 0.43554 0.078401 0.00672555 44645.7 0
: 6 Minimum Test error found - save the configuration
: 6 | 0.427669 0.416776 0.0786189 0.00674491 44522.3 0
: 7 Minimum Test error found - save the configuration
: 7 | 0.412227 0.402905 0.0788659 0.00676955 44385 0
: 8 Minimum Test error found - save the configuration
: 8 | 0.400022 0.391532 0.0790301 0.00679537 44300 0
: 9 Minimum Test error found - save the configuration
: 9 | 0.390035 0.382877 0.0792065 0.00681594 44204.6 0
: 10 Minimum Test error found - save the configuration
: 10 | 0.381184 0.373094 0.0794322 0.00683644 44079.7 0
: 11 Minimum Test error found - save the configuration
: 11 | 0.373144 0.364146 0.0796722 0.00687305 43956.6 0
: 12 Minimum Test error found - save the configuration
: 12 | 0.36467 0.354548 0.0799008 0.00691595 43844.7 0
: 13 Minimum Test error found - save the configuration
: 13 | 0.356724 0.344961 0.0799662 0.00689561 43793.2 0
: 14 Minimum Test error found - save the configuration
: 14 | 0.349567 0.336777 0.080163 0.00691233 43685.6 0
: 15 Minimum Test error found - save the configuration
: 15 | 0.34154 0.329384 0.0803721 0.00693663 43575.7 0
: 16 Minimum Test error found - save the configuration
: 16 | 0.333451 0.319651 0.0812098 0.00706189 43157 0
: 17 Minimum Test error found - save the configuration
: 17 | 0.325404 0.31456 0.0810364 0.00705228 43252.5 0
: 18 Minimum Test error found - save the configuration
: 18 | 0.317414 0.305984 0.0808645 0.007002 43323.7 0
: 19 Minimum Test error found - save the configuration
: 19 | 0.31041 0.295871 0.0810249 0.00701451 43237.2 0
: 20 Minimum Test error found - save the configuration
: 20 | 0.304469 0.295138 0.0811048 0.00702806 43198.4 0
: 21 Minimum Test error found - save the configuration
: 21 | 0.299005 0.29337 0.0812402 0.00703725 43125 0
: 22 Minimum Test error found - save the configuration
: 22 | 0.293883 0.283486 0.081416 0.00707214 43043.2 0
: 23 Minimum Test error found - save the configuration
: 23 | 0.289619 0.27997 0.0815082 0.0070726 42990.2 0
: 24 Minimum Test error found - save the configuration
: 24 | 0.286669 0.277948 0.0815742 0.00707146 42951.4 0
: 25 Minimum Test error found - save the configuration
: 25 | 0.283868 0.273977 0.0817049 0.00708815 42885.8 0
: 26 Minimum Test error found - save the configuration
: 26 | 0.280755 0.272366 0.0817001 0.00710942 42900.8 0
: 27 | 0.278142 0.274523 0.0823239 0.00728328 42643.6 1
: 28 Minimum Test error found - save the configuration
: 28 | 0.274963 0.266929 0.0825547 0.00720346 42467.8 0
: 29 | 0.272323 0.267041 0.0822875 0.00704221 42527.6 1
: 30 | 0.271611 0.269537 0.0819959 0.00703836 42690.8 2
: 31 Minimum Test error found - save the configuration
: 31 | 0.269708 0.265874 0.0821009 0.00714997 42694.6 0
: 32 Minimum Test error found - save the configuration
: 32 | 0.267237 0.261958 0.0821393 0.00714591 42670.4 0
: 33 Minimum Test error found - save the configuration
: 33 | 0.264093 0.2594 0.0822925 0.00716171 42592.4 0
: 34 Minimum Test error found - save the configuration
: 34 | 0.262149 0.253979 0.0821716 0.00712688 42641.3 0
: 35 | 0.259998 0.253987 0.0823934 0.00739667 42668.5 1
: 36 Minimum Test error found - save the configuration
: 36 | 0.260164 0.252547 0.0859103 0.00731784 40716.4 0
: 37 Minimum Test error found - save the configuration
: 37 | 0.257429 0.250074 0.0828902 0.00723925 42299.6 0
: 38 | 0.255099 0.252524 0.0823402 0.00706608 42511.3 1
: 39 | 0.253895 0.252011 0.0824275 0.00708313 42471.6 2
: 40 Minimum Test error found - save the configuration
: 40 | 0.2515 0.244729 0.082543 0.00719715 42470.8 0
: 41 | 0.249418 0.248411 0.0824469 0.00709181 42465.6 1
: 42 Minimum Test error found - save the configuration
: 42 | 0.249256 0.242411 0.0830462 0.007269 42229.1 0
: 43 | 0.246451 0.243334 0.0831919 0.0072477 42136.2 1
: 44 Minimum Test error found - save the configuration
: 44 | 0.246706 0.238985 0.0838684 0.00724307 41761.7 0
: 45 | 0.243801 0.240574 0.082549 0.00707989 42401.4 1
: 46 | 0.242991 0.24054 0.0824845 0.00708766 42442.1 2
: 47 Minimum Test error found - save the configuration
: 47 | 0.240863 0.237182 0.0825338 0.00718133 42467.1 0
: 48 Minimum Test error found - save the configuration
: 48 | 0.240419 0.23637 0.0825173 0.0071661 42467.8 0
: 49 | 0.240099 0.23749 0.0824812 0.00709187 42446.3 1
: 50 Minimum Test error found - save the configuration
: 50 | 0.238473 0.234945 0.0825458 0.00718492 42462.3 0
: 51 Minimum Test error found - save the configuration
: 51 | 0.237199 0.231014 0.082582 0.00716753 42432.2 0
: 52 | 0.236863 0.239149 0.0825258 0.00710979 42431.3 1
: 53 | 0.23453 0.238699 0.0826882 0.0071134 42342.1 2
: 54 | 0.233889 0.232056 0.0827529 0.0071075 42302.6 3
: 55 | 0.233178 0.23845 0.0826189 0.00710466 42376.1 4
: 56 Minimum Test error found - save the configuration
: 56 | 0.232184 0.229015 0.0827082 0.00719107 42374.5 0
: 57 | 0.231242 0.230563 0.0826835 0.00710637 42340.8 1
: 58 | 0.230405 0.232877 0.0827281 0.00711416 42320.2 2
: 59 | 0.230129 0.22915 0.0826534 0.00710788 42358.6 3
: 60 | 0.229737 0.230882 0.0827682 0.00710404 42292.1 4
: 61 Minimum Test error found - save the configuration
: 61 | 0.228381 0.228744 0.0828445 0.00722337 42316.2 0
: 62 Minimum Test error found - save the configuration
: 62 | 0.227517 0.224555 0.0829644 0.00724665 42262.2 0
: 63 | 0.226809 0.232029 0.0828274 0.00712078 42268.4 1
: 64 | 0.227981 0.22786 0.0827947 0.00711453 42283.2 2
: 65 Minimum Test error found - save the configuration
: 65 | 0.226148 0.222418 0.0829223 0.00721515 42268.1 0
: 66 | 0.225905 0.225275 0.0828052 0.00711399 42277 1
: 67 | 0.225795 0.22287 0.0828066 0.00711398 42276.2 2
: 68 | 0.224314 0.22575 0.0828022 0.00712804 42286.6 3
: 69 | 0.223996 0.226948 0.0828519 0.00713109 42260.5 4
: 70 | 0.22361 0.225157 0.0828557 0.00712123 42252.9 5
: 71 | 0.223771 0.230996 0.0828751 0.00711447 42238.3 6
: 72 | 0.223224 0.225574 0.0828748 0.00712368 42243.6 7
: 73 | 0.221355 0.223051 0.08292 0.00712336 42218.2 8
: 74 | 0.221303 0.225109 0.0828849 0.0071261 42239.3 9
: 75 | 0.220454 0.223857 0.0829259 0.00711943 42212.7 10
: 76 Minimum Test error found - save the configuration
: 76 | 0.219957 0.221295 0.0831711 0.00725146 42149.8 0
: 77 | 0.219519 0.222607 0.0829202 0.00713048 42222.1 1
: 78 | 0.219683 0.222232 0.0828896 0.00712377 42235.4 2
: 79 | 0.220643 0.224957 0.0828653 0.0071163 42244.8 3
: 80 Minimum Test error found - save the configuration
: 80 | 0.218613 0.220935 0.0831221 0.00730682 42207.8 0
: 81 | 0.219165 0.225358 0.082984 0.00713623 42189.8 1
: 82 Minimum Test error found - save the configuration
: 82 | 0.218457 0.220527 0.0835746 0.00730145 41954.5 0
: 83 | 0.216363 0.225192 0.0830177 0.0071481 42177.6 1
: 84 | 0.217346 0.222136 0.0831008 0.00714365 42129 2
: 85 Minimum Test error found - save the configuration
: 85 | 0.216651 0.218128 0.0832821 0.00727608 42101.9 0
: 86 | 0.21694 0.221883 0.0831669 0.00714788 42094.7 1
: 87 | 0.217059 0.223074 0.0831578 0.00715627 42104.4 2
: 88 | 0.215404 0.22102 0.0831341 0.00714693 42112.4 3
: 89 | 0.215789 0.224021 0.0830404 0.00713493 42157.7 4
: 90 | 0.216119 0.22077 0.0831088 0.00715072 42128.5 5
: 91 | 0.216078 0.221672 0.0836139 0.00717814 41865.2 6
: 92 Minimum Test error found - save the configuration
: 92 | 0.215974 0.217793 0.0837188 0.00740393 41931.5 0
: 93 | 0.214695 0.218944 0.0837646 0.00719422 41791.6 1
: 94 Minimum Test error found - save the configuration
: 94 | 0.214501 0.216375 0.0831689 0.00724571 42147.9 0
: 95 | 0.213968 0.217106 0.0830409 0.00714038 42160.4 1
: 96 | 0.213424 0.219886 0.0830377 0.00712729 42154.9 2
: 97 | 0.213153 0.218034 0.0830728 0.00713982 42142.5 3
: 98 | 0.21183 0.220326 0.0830919 0.0071427 42133.4 4
: 99 | 0.213502 0.219353 0.0831201 0.00715794 42126.2 5
: 100 | 0.212664 0.221318 0.083138 0.0071362 42104.3 6
: 101 Minimum Test error found - save the configuration
: 101 | 0.212874 0.214364 0.083461 0.00727372 42001.8 0
: 102 | 0.212944 0.217542 0.0830921 0.00715155 42138.2 1
: 103 | 0.212498 0.219298 0.0831186 0.00714909 42122.2 2
: 104 | 0.210801 0.217479 0.0831368 0.0071514 42113.3 3
: 105 | 0.211076 0.222463 0.0832915 0.00715428 42029.4 4
: 106 | 0.211405 0.218649 0.0832389 0.00714061 42050.9 5
: 107 | 0.210024 0.216776 0.0831534 0.00714516 42100.7 6
: 108 | 0.211868 0.216127 0.0831252 0.00717941 42135.3 7
: 109 | 0.212266 0.217518 0.0833378 0.00736225 42118.8 8
: 110 | 0.211339 0.217787 0.0838719 0.00717704 41723.8 9
: 111 | 0.209889 0.217164 0.0841347 0.00731868 41658 10
: 112 | 0.210302 0.21473 0.0835221 0.0071709 41911.6 11
:
: Elapsed time for training with 4000 events: 9.24 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of DL_CPU on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.117 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_DL_CPU.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_DL_CPU.class.C
Factory : Training finished
:
: Ranking input variables (method specific)...
BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 3.117e-01
: 2 : var1 : 2.504e-01
: 3 : var2 : 2.430e-01
: 4 : var3 : 1.949e-01
: --------------------------------------
MLP : Ranking result (top variable is best ranked)
: -----------------------------
: Rank : Variable : Importance
: -----------------------------
: 1 : var4 : 6.076e+01
: 2 : var2 : 4.824e+01
: 3 : var1 : 2.116e+01
: 4 : var3 : 1.692e+01
: -----------------------------
PDEFoam : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 2.991e-01
: 2 : var1 : 2.930e-01
: 3 : var3 : 2.365e-01
: 4 : var2 : 1.714e-01
: --------------------------------------
: No variable ranking supplied by classifier: DL_CPU
TH1.Print Name = TrainingHistory_DL_CPU_trainingError, Entries= 0, Total sum= 29.4278
TH1.Print Name = TrainingHistory_DL_CPU_valError, Entries= 0, Total sum= 29.2086
Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Reading weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
MLP : Building Network.
: Initializing weights
: Reading weight file: dataset/weights/TMVAMulticlass_PDEFoam.weights.xml
: Read foams from file: dataset/weights/TMVAMulticlass_PDEFoam.weights_foams.root
: Reading weight file: dataset/weights/TMVAMulticlass_DL_CPU.weights.xml
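The weight files re-read here are the same XML files written during training, and they can also be used outside the Factory. A minimal application sketch with TMVA::Reader; the variable and weight-file names come from the log, everything else (the example values, the standalone usage) is an assumption:

    // Standalone application of the trained BDTG using the weight file listed above
    // (include TMVA/Reader.h in compiled code).
    float var1, var2, var3, var4;
    TMVA::Reader reader("!Color:!Silent");
    reader.AddVariable("var1", &var1);
    reader.AddVariable("var2", &var2);
    reader.AddVariable("var3", &var3);
    reader.AddVariable("var4", &var4);
    reader.BookMVA("BDTG", "dataset/weights/TMVAMulticlass_BDTG.weights.xml");

    // For a multiclass method the response is one score per class, in the class
    // order listed at the start of the log (Signal, bg0, bg1, bg2).
    var1 = 0.5f; var2 = -0.2f; var3 = 1.1f; var4 = 0.3f;   // hypothetical event
    const std::vector<float> &scores = reader.EvaluateMulticlass("BDTG");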
Factory : Test all methods
Factory : Test method: BDTG for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of BDTG on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.983 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: MLP for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of MLP on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0167 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: PDEFoam for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of PDEFoam on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.134 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: DL_CPU for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of DL_CPU on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.116 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Evaluate all methods
: Evaluate multiclass classification method: BDTG
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: MLP
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_MLP : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: PDEFoam
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_PDEFoam : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: DL_CPU
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.077270 0.29534 [ -1.1155 1.0914 ]
: var2: 0.068045 0.27981 [ -1.0016 1.0000 ]
: var3: 0.027548 0.24565 [ -0.80459 0.85902 ]
: var4: -0.034157 0.25816 [ -1.0000 0.83435 ]
: -----------------------------------------------------------
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.077270 0.29534 [ -1.1155 1.0914 ]
: var2: 0.068045 0.27981 [ -1.0016 1.0000 ]
: var3: 0.027548 0.24565 [ -0.80459 0.85902 ]
: var4: -0.034157 0.25816 [ -1.0000 0.83435 ]
: -----------------------------------------------------------
:
: 1-vs-rest performance metrics per class
: -------------------------------------------------------------------------------------------------------
:
: Considers the listed class as signal and the other classes
: as background, reporting the resulting binary performance.
: A score of 0.820 (0.850) means 0.820 was achieved on the
: test set and 0.850 on the training set.
:
: Dataset MVA Method ROC AUC Sig eff@B=0.01 Sig eff@B=0.10 Sig eff@B=0.30
: Name: / Class: test (train) test (train) test (train) test (train)
:
: dataset BDTG
: ------------------------------
: Signal 0.968 (0.978) 0.508 (0.605) 0.914 (0.945) 0.990 (0.996)
: bg0 0.910 (0.931) 0.256 (0.288) 0.737 (0.791) 0.922 (0.956)
: bg1 0.947 (0.954) 0.437 (0.511) 0.833 (0.856) 0.971 (0.971)
: bg2 0.978 (0.982) 0.585 (0.678) 0.951 (0.956) 0.999 (0.996)
:
: dataset MLP
: ------------------------------
: Signal 0.970 (0.975) 0.596 (0.632) 0.933 (0.938) 0.988 (0.993)
: bg0 0.929 (0.934) 0.303 (0.298) 0.787 (0.793) 0.949 (0.961)
: bg1 0.962 (0.967) 0.467 (0.553) 0.881 (0.906) 0.985 (0.992)
: bg2 0.975 (0.979) 0.629 (0.699) 0.929 (0.940) 0.998 (0.998)
:
: dataset PDEFoam
: ------------------------------
: Signal 0.916 (0.928) 0.294 (0.382) 0.744 (0.782) 0.932 (0.952)
: bg0 0.837 (0.848) 0.109 (0.147) 0.519 (0.543) 0.833 (0.851)
: bg1 0.890 (0.902) 0.190 (0.226) 0.606 (0.646) 0.923 (0.929)
: bg2 0.967 (0.972) 0.510 (0.527) 0.900 (0.926) 0.993 (0.998)
:
: dataset DL_CPU
: ------------------------------
: Signal 0.976 (0.976) 0.610 (0.646) 0.937 (0.941) 0.991 (0.994)
: bg0 0.929 (0.932) 0.302 (0.314) 0.789 (0.783) 0.944 (0.957)
: bg1 0.964 (0.965) 0.463 (0.559) 0.892 (0.890) 0.989 (0.989)
: bg2 0.979 (0.978) 0.667 (0.687) 0.937 (0.927) 0.999 (0.999)
:
: -------------------------------------------------------------------------------------------------------
:
:
: Confusion matrices for all methods
: -------------------------------------------------------------------------------------------------------
:
: Does a binary comparison between the two classes given by a
: particular row-column combination. In each case, the class
: given by the row is considered signal while the class given
: by the column index is considered background.
:
: === Showing confusion matrix for method : BDTG
: (Signal Efficiency for Background Efficiency 0.01)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.497 (0.373) 0.710 (0.693) 0.680 (0.574)
: bg0 0.271 (0.184) - 0.239 (0.145) 0.705 (0.667)
: bg1 0.855 (0.766) 0.369 (0.222) - 0.587 (0.578)
: bg2 0.714 (0.585) 0.705 (0.581) 0.648 (0.601) -
:
: (Signal Efficiency for Background Efficiency 0.10)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.911 (0.853) 0.991 (0.981) 0.945 (0.913)
: bg0 0.833 (0.774) - 0.654 (0.582) 0.930 (0.901)
: bg1 0.971 (0.980) 0.716 (0.681) - 0.871 (0.862)
: bg2 0.976 (0.951) 0.971 (0.973) 0.936 (0.941) -
:
: (Signal Efficiency for Background Efficiency 0.30)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.957) 0.999 (1.000) 0.998 (0.997)
: bg0 0.965 (0.926) - 0.874 (0.835) 0.991 (0.976)
: bg1 1.000 (0.999) 0.916 (0.894) - 0.988 (0.985)
: bg2 0.999 (0.999) 0.997 (0.999) 0.996 (0.997) -
:
: === Showing confusion matrix for method : MLP
: (Signal Efficiency for Background Efficiency 0.01)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.465 (0.490) 0.974 (0.953) 0.632 (0.498)
: bg0 0.320 (0.269) - 0.224 (0.250) 0.655 (0.627)
: bg1 0.943 (0.920) 0.341 (0.275) - 0.632 (0.687)
: bg2 0.665 (0.642) 0.697 (0.680) 0.706 (0.598) -
:
: (Signal Efficiency for Background Efficiency 0.10)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.865 (0.854) 0.996 (0.994) 0.908 (0.907)
: bg0 0.784 (0.776) - 0.666 (0.655) 0.919 (0.895)
: bg1 0.998 (0.998) 0.791 (0.785) - 0.912 (0.902)
: bg2 0.943 (0.903) 0.946 (0.939) 0.924 (0.928) -
:
: (Signal Efficiency for Background Efficiency 0.30)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.964) 0.997 (0.997) 0.993 (0.986)
: bg0 0.952 (0.924) - 0.936 (0.928) 0.992 (0.990)
: bg1 1.000 (1.000) 0.945 (0.936) - 0.998 (0.995)
: bg2 0.994 (0.985) 0.998 (0.998) 0.998 (0.998) -
:
: === Showing confusion matrix for method : PDEFoam
: (Signal Efficiency for Background Efficiency 0.01)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.289 (0.233) 0.467 (0.436) 0.421 (0.332)
: bg0 0.100 (0.045) - 0.132 (0.116) 0.540 (0.313)
: bg1 0.209 (0.434) 0.153 (0.092) - 0.347 (0.323)
: bg2 0.560 (0.552) 0.445 (0.424) 0.501 (0.506) -
:
: (Signal Efficiency for Background Efficiency 0.10)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.665 (0.640) 0.854 (0.822) 0.807 (0.790)
: bg0 0.538 (0.520) - 0.415 (0.374) 0.843 (0.833)
: bg1 0.885 (0.886) 0.542 (0.491) - 0.728 (0.646)
: bg2 0.928 (0.890) 0.956 (0.959) 0.847 (0.895) -
:
: (Signal Efficiency for Background Efficiency 0.30)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.898 (0.878) 0.971 (0.950) 0.982 (0.975)
: bg0 0.828 (0.810) - 0.696 (0.676) 0.954 (0.951)
: bg1 0.951 (0.966) 0.803 (0.745) - 0.958 (0.966)
: bg2 0.998 (0.991) 0.998 (0.996) 0.998 (0.993) -
:
: === Showing confusion matrix for method : DL_CPU
: (Signal Efficiency for Background Efficiency 0.01)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.486 (0.477) 0.935 (0.949) 0.672 (0.653)
: bg0 0.397 (0.383) - 0.201 (0.225) 0.594 (0.561)
: bg1 0.929 (0.890) 0.312 (0.287) - 0.640 (0.692)
: bg2 0.670 (0.661) 0.721 (0.655) 0.689 (0.672) -
:
: (Signal Efficiency for Background Efficiency 0.10)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.876 (0.875) 0.992 (0.989) 0.943 (0.931)
: bg0 0.777 (0.764) - 0.646 (0.697) 0.909 (0.882)
: bg1 0.991 (0.992) 0.796 (0.800) - 0.851 (0.881)
: bg2 0.939 (0.932) 0.939 (0.947) 0.914 (0.932) -
:
: (Signal Efficiency for Background Efficiency 0.30)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.970 (0.970) 0.999 (1.000) 0.999 (0.999)
: bg0 0.956 (0.936) - 0.925 (0.911) 0.986 (0.983)
: bg1 1.000 (0.998) 0.952 (0.952) - 1.000 (0.995)
: bg2 0.999 (0.998) 0.999 (0.999) 0.997 (0.999) -
:
: -------------------------------------------------------------------------------------------------------
:
Dataset:dataset : Created tree 'TestTree' with 4000 events
:
Dataset:dataset : Created tree 'TrainTree' with 4000 events
:
Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
==> Wrote root file: TMVAMulticlass.root
==> TMVAMulticlass is done!
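The whole sequence recorded in this log is driven by three Factory calls, and the resulting TMVAMulticlass.root can be browsed with the dedicated multiclass GUI. A closing sketch, continuing the macro fragments above (the GUI call only makes sense in an interactive, non-batch ROOT session):

    // These three calls produce the "Train all methods", "Test all methods", and
    // "Evaluate all methods" blocks of the log above.
    factory->TrainAllMethods();
    factory->TestAllMethods();
    factory->EvaluateAllMethods();
    output->Close();

    // Inspect the response and performance histograms written to TMVAMulticlass.root
    // (include TMVA/TMVAMultiClassGui.h in compiled code).
    TMVA::TMVAMultiClassGui("TMVAMulticlass.root");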