The input data is a toy-MC sample consisting of four Gaussian-distributed and linearly correlated input variables.
The methods to be used can be switched on and off by means of booleans, or via the prompt command, for example:

root -l TMVARegression.C\(\"LD,MLP\"\)

(note that the backslashes are mandatory). If no method is given, a default set is used.
The output file "TMVAReg.root" can be analysed with dedicated macros (simply type: root -l <macro.C>), which can conveniently be invoked through a GUI that appears at the end of this macro's run.
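Once training is done, the weight files written under dataset/weights/ can be applied to new data with the TMVA::Reader class. The snippet below is a minimal sketch, not part of this macro: the helper function name is made up for illustration, while the variable and spectator declarations and the MLP weight-file path mirror what this tutorial produces.

#include <iostream>
#include "TMVA/Reader.h"

void TMVARegressionApplySketch()   // hypothetical helper, for illustration only
{
   Float_t var1, var2, spec1, spec2;

   TMVA::Reader reader( "!Color:!Silent" );

   // Variables and spectators must be declared exactly as in the training
   reader.AddVariable( "var1", &var1 );
   reader.AddVariable( "var2", &var2 );
   reader.AddSpectator( "spec1:=var1*2", &spec1 );
   reader.AddSpectator( "spec2:=var1*3", &spec2 );

   // Book one of the trained methods from its weight file
   reader.BookMVA( "MLP method", "dataset/weights/TMVARegression_MLP.weights.xml" );

   // Evaluate the regression target for one (dummy) event
   var1 = 2.0;
   var2 = 3.0;
   Float_t target = reader.EvaluateRegression( "MLP method" )[0];
   std::cout << "Estimated fvalue: " << target << std::endl;
}

For the full application example shipped with ROOT, see the TMVARegressionApplication.C tutorial.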
Processing /mnt/build/workspace/root-makedoc-v612/rootspi/rdoc/src/v6-12-00-patches/tutorials/tmva/TMVARegression.C...
==> Start TMVARegression
--- TMVARegression : Using input file: ./files/tmva_reg_example.root
DataSetInfo : [dataset] : Added class "Regression"
: Add Tree TreeR of type Regression with 10000 events
: Dataset[dataset] : Class index : 0 name : Regression
Factory : Booking method: PDEFoam
:
DataSetFactory : [dataset] : Number of events in input trees
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Regression -- training events : 1000
: Regression -- testing events : 9000
: Regression -- training and testing events: 10000
:
DataSetInfo : Correlation matrix (Regression):
: ------------------------
: var1 var2
: var1: +1.000 -0.018
: var2: -0.018 +1.000
: ------------------------
DataSetFactory : [dataset] :
:
Factory : Booking method: KNN
:
Factory : Booking method: LD
:
Factory : Booking method: FDA_GA
:
FDA_GA : [dataset] : Create Transformation "Norm" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : target 'fvalue' <---> Output : target 'fvalue'
: Create parameter interval for parameter 0 : [-100,100]
: Create parameter interval for parameter 1 : [-100,100]
: Create parameter interval for parameter 2 : [-100,100]
: User-defined formula string : "(0)+(1)*x0+(2)*x1"
: TFormula-compatible formula string: "[0]+[1]*[3]+[2]*[4]"
Factory : Booking method: MLP
:
MLP : [dataset] : Create Transformation "Norm" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : target 'fvalue' <---> Output : target 'fvalue'
MLP : Building Network.
: Initializing weights
Factory : Booking method: BDTG
:
<WARNING> : Value for option maxdepth was previously set to 3
: the option *InverseBoostNegWeights* does not exist for BoostType=Grad --> change
: to new default for GradBoost *Pray*
Factory : Train all methods
Factory : [dataset] : Create Transformation "I" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.4138 1.1963 [ 0.0026062 4.9957 ]
: var2: 2.4356 1.4134 [ 0.0092062 4.9990 ]
: fvalue: 164.96 82.203 [ 1.7144 391.23 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
IdTransformation : Ranking result (top variable is best ranked)
: --------------------------------------------
: Rank : Variable : |Correlation with target|
: --------------------------------------------
: 1 : var2 : 7.419e-01
: 2 : var1 : 5.996e-01
: --------------------------------------------
IdTransformation : Ranking result (top variable is best ranked)
: -------------------------------------
: Rank : Variable : Mutual information
: -------------------------------------
: 1 : var2 : 2.029e+00
: 2 : var1 : 1.950e+00
: -------------------------------------
IdTransformation : Ranking result (top variable is best ranked)
: ------------------------------------
: Rank : Variable : Correlation Ratio
: ------------------------------------
: 1 : var1 : 6.538e+00
: 2 : var2 : 2.460e+00
: ------------------------------------
IdTransformation : Ranking result (top variable is best ranked)
: ----------------------------------------
: Rank : Variable : Correlation Ratio (T)
: ----------------------------------------
: 1 : var2 : 9.156e-01
: 2 : var1 : 2.981e-01
: ----------------------------------------
Factory : Train method: PDEFoam for Regression
:
: Build mono target regression foam
: Elapsed time: 0.658 sec
: Elapsed time for training with 1000 events: 0.666 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of PDEFoam on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.0189 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_PDEFoam.weights.xml
: writing foam MonoTargetRegressionFoam to file
: Foams written to file: dataset/weights/TMVARegression_PDEFoam.weights_foams.root
Factory : Training finished
:
Factory : Train method: KNN for Regression
:
KNN : <Train> start...
: Reading 1000 events
: Number of signal events 1000
: Number of background events 0
: Creating kd-tree with 1000 events
: Computing scale factor for 1d distributions: (ifrac, bottom, top) = (80%, 10%, 90%)
ModulekNN : Optimizing tree for 2 variables with 1000 values
: <Fill> Class 1 has 1000 events
: Elapsed time for training with 1000 events: 0.0014 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of KNN on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.0144 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_KNN.weights.xml
Factory : Training finished
:
Factory : Train method: LD for Regression
:
LD : Results for LD coefficients:
: -----------------------
: Variable: Coefficient:
: -----------------------
: var1: +42.104
: var2: +44.607
: (offset): -87.420
: -----------------------
: Elapsed time for training with 1000 events: 0.00037 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of LD on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.00312 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_LD.weights.xml
Factory : Training finished
:
Factory : Train method: FDA_GA for Regression
:
TFHandler_FDA_GA : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.36639 0.47921 [ -1.0000 1.0000 ]
: var2: -0.027454 0.56651 [ -1.0000 1.0000 ]
: fvalue: -0.16180 0.42208 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
FitterBase : <GeneticFitter> Optimisation, please be patient ... (inaccurate progress timing for GA)
: Elapsed time: 13.3 sec
FDA_GA : Results for parameter fit using "GA" fitter:
: -----------------------
: Parameter: Fit result:
: -----------------------
: Par(0): -0.344844
: Par(1): 0.537889
: Par(2): 0.573868
: -----------------------
: Discriminator expression: "(0)+(1)*x0+(2)*x1"
: Value of estimator at minimum: 0.00944176
: Elapsed time for training with 1000 events: 13.5 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of FDA_GA on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.00475 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_FDA_GA.weights.xml
Factory : Training finished
:
Factory : Train method: MLP for Regression
:
TFHandler_MLP : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.36639 0.47921 [ -1.0000 1.0000 ]
: var2: -0.027454 0.56651 [ -1.0000 1.0000 ]
: fvalue: -0.16180 0.42208 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: Training Network
:
: Inaccurate progress timing for MLP...
: Elapsed time for training with 1000 events: 11.7 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of MLP on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.00631 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_MLP.weights.xml
: Write special histos to file: TMVAReg.root:/dataset/Method_MLP/MLP
Factory : Training finished
:
Factory : Train method: BDTG for Regression
:
: Regression Loss Function: Huber
: Training 2000 Decision Trees ... patience please
: Elapsed time for training with 1000 events: 1.25 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Evaluation of BDTG on training sample
: Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.397 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
: Creating xml weight file: dataset/weights/TMVARegression_BDTG.weights.xml
Factory : Training finished
:
Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Read foams from file: dataset/weights/TMVARegression_PDEFoam.weights_foams.root
: Creating kd-tree with 1000 events
: Computing scale factor for 1d distributions: (ifrac, bottom, top) = (80%, 10%, 90%)
ModulekNN : Optimizing tree for 2 variables with 1000 values
: <Fill> Class 1 has 1000 events
: User-defined formula string : "(0)+(1)*x0+(2)*x1"
: TFormula-compatible formula string: "[0]+[1]*[3]+[2]*[4]"
MLP : Building Network.
: Initializing weights
Factory : Test all methods
Factory : Test method: PDEFoam for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of PDEFoam on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.0721 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: KNN for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of KNN on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.112 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: LD for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of LD on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.0126 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: FDA_GA for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of FDA_GA on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.0233 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: MLP for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of MLP on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.0343 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Test method: BDTG for Regression performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Evaluation of BDTG on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 9000 events: 2.58 sec
: Create variable histograms
: Create regression target histograms
: Create regression average deviation
: Results created
Factory : Evaluate all methods
: Evaluate regression method: PDEFoam
TFHandler_PDEFoam : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3309 1.1858 [ 0.00020069 5.0000 ]
: var2: 2.4914 1.4393 [ 0.00071490 5.0000 ]
: fvalue: 164.02 83.932 [ 1.6186 394.84 ]
: -----------------------------------------------------------
: Evaluate regression method: KNN
TFHandler_KNN : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3309 1.1858 [ 0.00020069 5.0000 ]
: var2: 2.4914 1.4393 [ 0.00071490 5.0000 ]
: fvalue: 164.02 83.932 [ 1.6186 394.84 ]
: -----------------------------------------------------------
: Evaluate regression method: LD
TFHandler_LD : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3309 1.1858 [ 0.00020069 5.0000 ]
: var2: 2.4914 1.4393 [ 0.00071490 5.0000 ]
: fvalue: 164.02 83.932 [ 1.6186 394.84 ]
: -----------------------------------------------------------
: Evaluate regression method: FDA_GA
TFHandler_FDA_GA : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.33319 0.47497 [ -1.0010 1.0017 ]
: var2: -0.0050961 0.57690 [ -1.0034 1.0004 ]
: fvalue: -0.16662 0.43095 [ -1.0005 1.0185 ]
: -----------------------------------------------------------
TFHandler_FDA_GA : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.33319 0.47497 [ -1.0010 1.0017 ]
: var2: -0.0050961 0.57690 [ -1.0034 1.0004 ]
: fvalue: -0.16662 0.43095 [ -1.0005 1.0185 ]
: -----------------------------------------------------------
: Evaluate regression method: MLP
TFHandler_MLP : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.33319 0.47497 [ -1.0010 1.0017 ]
: var2: -0.0050961 0.57690 [ -1.0034 1.0004 ]
: fvalue: -0.16662 0.43095 [ -1.0005 1.0185 ]
: -----------------------------------------------------------
TFHandler_MLP : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.33319 0.47497 [ -1.0010 1.0017 ]
: var2: -0.0050961 0.57690 [ -1.0034 1.0004 ]
: fvalue: -0.16662 0.43095 [ -1.0005 1.0185 ]
: -----------------------------------------------------------
: Evaluate regression method: BDTG
TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3309 1.1858 [ 0.00020069 5.0000 ]
: var2: 2.4914 1.4393 [ 0.00071490 5.0000 ]
: fvalue: 164.02 83.932 [ 1.6186 394.84 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by smallest RMS on test sample:
: ("Bias" quotes the mean deviation of the regression from true target.
: "MutInf" is the "Mutual Information" between regression and target.
: Indicated by "_T" are the corresponding "truncated" quantities ob-
: tained when removing events deviating more than 2sigma from average.)
: --------------------------------------------------------------------------------------------------
: DataSet Name: MVA Method: <Bias> <Bias_T> RMS RMS_T | MutInf MutInf_T
: --------------------------------------------------------------------------------------------------
: dataset MLP : -0.0141 -0.0128 0.319 0.302 | 3.442 3.440
: dataset BDTG : 0.252 0.209 2.27 1.83 | 3.137 3.210
: dataset KNN : -0.507 0.436 5.77 3.79 | 2.871 2.903
: dataset PDEFoam : -0.831 -0.645 9.90 8.12 | 2.245 2.327
: dataset FDA_GA : -0.377 1.31 19.7 17.9 | 1.993 1.987
: dataset LD : -0.0644 1.63 19.7 17.9 | 1.988 1.981
: --------------------------------------------------------------------------------------------------
:
: Evaluation results ranked by smallest RMS on training sample:
: (overtraining check)
: --------------------------------------------------------------------------------------------------
: DataSet Name: MVA Method: <Bias> <Bias_T> RMS RMS_T | MutInf MutInf_T
: --------------------------------------------------------------------------------------------------
: dataset MLP : 0.00265 0.0102 0.311 0.297 | 3.429 3.426
: dataset BDTG : 0.0373 -0.000948 0.483 0.255 | 3.435 3.459
: dataset KNN : -0.523 0.298 5.55 3.82 | 2.931 2.946
: dataset PDEFoam : 7.41e-07 0.243 7.99 6.37 | 2.489 2.565
: dataset FDA_GA : -0.335 1.58 18.9 16.7 | 2.099 2.103
: dataset LD : 3.68e-06 1.76 18.9 16.9 | 2.101 2.099
: --------------------------------------------------------------------------------------------------
:
Dataset:dataset : Created tree 'TestTree' with 9000 events
:
Dataset:dataset : Created tree 'TrainTree' with 1000 events
:
Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
==> Wrote root file: TMVAReg.root
==> TMVARegression is done!
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

#include "TChain.h"
#include "TCut.h"
#include "TFile.h"
#include "TTree.h"
#include "TString.h"
#include "TObjString.h"
#include "TSystem.h"
#include "TROOT.h"

#include "TMVA/Tools.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/TMVARegGui.h"

using namespace TMVA;

void TMVARegression( TString myMethodList = "" )
{
   // This loads the TMVA library
   TMVA::Tools::Instance();

   // Default MVA methods to be trained + tested
   std::map<std::string,int> Use;

   // Multidimensional likelihood and nearest-neighbour methods
   Use["PDERS"]    = 0;
   Use["PDEFoam"]  = 1;
   Use["KNN"]      = 1;
   // Linear discriminant
   Use["LD"]       = 1;
   // Function discriminant analysis
   Use["FDA_GA"]   = 1;
   Use["FDA_MC"]   = 0;
   Use["FDA_MT"]   = 0;
   Use["FDA_GAMT"] = 0;
   // Neural networks
   Use["MLP"]      = 1;
   Use["DNN_CPU"]  = 0;
   // Support vector machine
   Use["SVM"]      = 0;
   // Boosted decision trees
   Use["BDT"]      = 0;
   Use["BDTG"]     = 1;

   std::cout << std::endl;
   std::cout << "==> Start TMVARegression" << std::endl;

   // Select methods (don't look at this code - not of interest)
   if (myMethodList != "") {
      for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) it->second = 0;

      std::vector<TString> mlist = gTools().SplitString( myMethodList, ',' );
      for (UInt_t i=0; i<mlist.size(); i++) {
         std::string regMethod(mlist[i]);

         if (Use.find(regMethod) == Use.end()) {
            std::cout << "Method \"" << regMethod << "\" not known in TMVA under this name. Choose among the following:" << std::endl;
            for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) std::cout << it->first << " ";
            std::cout << std::endl;
            return;
         }
         Use[regMethod] = 1;
      }
   }

   // --------------------------------------------------------------------------------------------------

   // Create a new root output file
   TString outfileName( "TMVAReg.root" );
   TFile* outputFile = TFile::Open( outfileName, "RECREATE" );

   // Create the factory object; the second argument is the output file for the training results
   TMVA::Factory *factory = new TMVA::Factory( "TMVARegression", outputFile,
                                               "!V:!Silent:Color:DrawProgressBar:AnalysisType=Regression" );

   TMVA::DataLoader *dataloader = new TMVA::DataLoader("dataset");

   // Define the input variables that shall be used for the MVA training
   dataloader->AddVariable( "var1", "Variable 1", "units", 'F' );
   dataloader->AddVariable( "var2", "Variable 2", "units", 'F' );

   // "Spectator" variables are not used in the training but are copied into the final TestTree
   dataloader->AddSpectator( "spec1:=var1*2", "Spectator 1", "units", 'F' );
   dataloader->AddSpectator( "spec2:=var1*3", "Spectator 2", "units", 'F' );

   // Add the variable carrying the regression target
   dataloader->AddTarget( "fvalue" );

   // Read training and test data: open the local file if present, otherwise download it
   TFile *input(0);
   TString fname = "./tmva_reg_example.root";
   if (!gSystem->AccessPathName( fname )) {
      input = TFile::Open( fname ); // check if file in local directory exists
   }
   else {
      TFile::SetCacheFileDir(".");
      input = TFile::Open( "http://root.cern.ch/files/tmva_reg_example.root", "CACHEREAD" ); // if not: download from ROOT server
   }
   if (!input) {
      std::cout << "ERROR: could not open data file" << std::endl;
      exit(1);
   }
   std::cout << "--- TMVARegression : Using input file: " << input->GetName() << std::endl;

   // Register the regression tree with a global event weight
   TTree *regTree = (TTree*)input->Get("TreeR");
   Double_t regWeight = 1.0;
   dataloader->AddRegressionTree( regTree, regWeight );

   // Individual event weights (the expression must be computable from the original TTree)
   dataloader->SetWeightExpression( "var1", "Regression" );

   // Apply additional cuts on the signal and background samples (can be different)
   TCut mycut = ""; // for example: TCut mycut = "abs(var1)<0.5 && abs(var2-0.5)<1";

   // Use 1000 events for training; all remaining events in the tree are used for testing
   dataloader->PrepareTrainingAndTestTree( mycut,
                                           "nTrain_Regression=1000:nTest_Regression=0:SplitMode=Random:NormMode=NumEvents:!V" );

   // Book MVA methods
   //
   // Please look up the various method configuration options in the corresponding cxx files, eg:
   // src/MethodCuts.cxx, etc, or here: http://tmva.sourceforge.net/optionRef.html

   // PDE - RS method
   if (Use["PDERS"])
      factory->BookMethod( dataloader, TMVA::Types::kPDERS, "PDERS",
                           "!H:!V:NormTree=T:VolumeRangeMode=Adaptive:KernelEstimator=Gauss:GaussSigma=0.3:NEventsMin=40:NEventsMax=60:VarTransform=None" );

   // PDE-Foam
   if (Use["PDEFoam"])
      factory->BookMethod( dataloader, TMVA::Types::kPDEFoam, "PDEFoam",
                           "!H:!V:MultiTargetRegression=F:TargetSelection=Mpv:TailCut=0.001:VolFrac=0.0666:nActiveCells=500:nSampl=2000:nBin=5:Compress=T:Kernel=None:Nmin=10:VarTransform=None" );

   // K-Nearest Neighbour classifier (KNN)
   if (Use["KNN"])
      factory->BookMethod( dataloader, TMVA::Types::kKNN, "KNN",
                           "nkNN=20:ScaleFrac=0.8:SigmaFact=1.0:Kernel=Gaus:UseKernel=F:UseWeight=T:!Trim" );

   // Linear discriminant
   if (Use["LD"])
      factory->BookMethod( dataloader, TMVA::Types::kLD, "LD",
                           "!H:!V:VarTransform=None" );

   // Function discrimination analysis (FDA) -- test of various fitters
   if (Use["FDA_MC"])
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_MC",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=MC:SampleSize=100000:Sigma=0.1:VarTransform=D" );

   if (Use["FDA_GA"])
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_GA",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=GA:PopSize=100:Cycles=3:Steps=30:Trim=True:SaveBestGen=1:VarTransform=Norm" );

   if (Use["FDA_MT"])
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_MT",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100);(-10,10):FitMethod=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=2:UseImprove:UseMinos:SetBatch" );

   if (Use["FDA_GAMT"])
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_GAMT",
                           "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=GA:Converger=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=0:!UseImprove:!UseMinos:SetBatch:Cycles=1:PopSize=5:Steps=5:Trim" );

   // Neural network (MLP)
   if (Use["MLP"])
      factory->BookMethod( dataloader, TMVA::Types::kMLP, "MLP",
                           "!H:!V:VarTransform=Norm:NeuronType=tanh:NCycles=20000:HiddenLayers=N+20:TestRate=6:TrainingMethod=BFGS:Sampling=0.3:SamplingEpoch=0.8:ConvergenceImprove=1e-6:ConvergenceTests=15:!UseRegulator" );

   // Deep neural network (CPU implementation)
   if (Use["DNN_CPU"]) {
      TString layoutString("Layout=TANH|100,LINEAR");

      TString training0("LearningRate=1e-5,Momentum=0.5,Repetitions=1,ConvergenceSteps=500,BatchSize=50,"
                        "TestRepetitions=7,WeightDecay=0.01,Regularization=NONE,DropConfig=0.5+0.5+0.5+0.5,"
                        "DropRepetitions=2");
      TString training1("LearningRate=1e-5,Momentum=0.9,Repetitions=1,ConvergenceSteps=170,BatchSize=30,"
                        "TestRepetitions=7,WeightDecay=0.01,Regularization=L2,DropConfig=0.1+0.1+0.1,DropRepetitions="
                        "1");
      TString training2("LearningRate=1e-5,Momentum=0.3,Repetitions=1,ConvergenceSteps=150,BatchSize=40,"
                        "TestRepetitions=7,WeightDecay=0.01,Regularization=NONE");
      TString training3("LearningRate=1e-6,Momentum=0.1,Repetitions=1,ConvergenceSteps=500,BatchSize=100,"
                        "TestRepetitions=7,WeightDecay=0.0001,Regularization=NONE");

      TString trainingStrategyString("TrainingStrategy=");
      trainingStrategyString += training0 + "|" + training1 + "|" + training2 + "|" + training3;

      TString nnOptions(
         "!H:V:ErrorStrategy=SUMOFSQUARES:VarTransform=G:WeightInitialization=XAVIERUNIFORM:Architecture=CPU");
      nnOptions.Append(":");
      nnOptions.Append(layoutString);
      nnOptions.Append(":");
      nnOptions.Append(trainingStrategyString);

      factory->BookMethod( dataloader, TMVA::Types::kDNN, "DNN_CPU", nnOptions );
   }

   // Support Vector Machine
   if (Use["SVM"])
      factory->BookMethod( dataloader, TMVA::Types::kSVM, "SVM",
                           "Gamma=0.25:Tol=0.001:VarTransform=Norm" );

   // Boosted Decision Trees
   if (Use["BDT"])
      factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDT",
                           "!H:!V:NTrees=100:MinNodeSize=1.0%:BoostType=AdaBoostR2:SeparationType=RegressionVariance:nCuts=20:PruneMethod=CostComplexity:PruneStrength=30" );

   if (Use["BDTG"])
      factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
                           "!H:!V:NTrees=2000::BoostType=Grad:Shrinkage=0.1:UseBaggedBoost:BaggedSampleFraction=0.5:nCuts=20:MaxDepth=3:MaxDepth=4" );

   // --------------------------------------------------------------------------------------------------

   // Train MVAs using the set of training events
   factory->TrainAllMethods();

   // Evaluate all MVAs using the set of test events
   factory->TestAllMethods();

   // Evaluate and compare performance of all configured MVAs
   factory->EvaluateAllMethods();

   // Save the output
   outputFile->Close();

   std::cout << "==> Wrote root file: " << outputFile->GetName() << std::endl;
   std::cout << "==> TMVARegression is done!" << std::endl;

   delete factory;
   delete dataloader;

   // Launch the GUI for the result macros
   if (!gROOT->IsBatch()) TMVA::TMVARegGui( outfileName );
}

int main( int argc, char** argv )
{
   // Select methods (don't look at this code - not of interest)
   TString methodList;
   for (int i=1; i<argc; i++) {
      TString regMethod(argv[i]);
      if (regMethod=="-b" || regMethod=="--batch") continue;
      if (!methodList.IsNull()) methodList += TString(",");
      methodList += regMethod;
   }
   TMVARegression(methodList);
   return 0;
}