TMVAinJupyter_FullTutorial.ipynb

Enable ipywidgets

To be able to visualize decision trees and the DNN weight map, you must enable ipywidgets. To do so, run the following cell and then refresh the page.

In [ ]:
!jupyter nbextension enable --py widgetsnbextension
In [1]:
import ROOT
from ROOT import TFile, TMVA, TCut
Welcome to JupyROOT 6.09/01

Enable JS visualization

To use the new interactive features in the notebook we have to enable a module called JsMVA. This can be done with the ipython magic: %jsmva.

In [2]:
%jsmva on

Declaration of Factory

First let's look at the classical version of the declaration. If you know how to use TMVA in C++, you can use that version here in Python: the first argument is a string called the job name, the second is an opened output TFile (optional; if present, it is used to store output histograms), and the last argument is a string containing all the Factory settings, separated by the ':' character.

C++ like declaration

In [3]:
outputFile = TFile( "TMVA.root", 'RECREATE' )
TMVA.Tools.Instance();

factory = TMVA.Factory( "TMVAClassification", outputFile #this is optional
                       ,"!V:Color:DrawProgressBar:Transformations=I;D;P;G,D:AnalysisType=Classification" )

The options string can contain the following options:

Option | Default | Predefined values | Description
V | False | - | Verbose flag
Color | True | - | Flag for colored output
Transformations | "" | - | List of transformations to test. For example, the string "I;D;P;U;G" applies identity, decorrelation, PCA, uniform and Gaussian transformations
Silent | False | - | Batch mode: silent flag inhibiting any output from TMVA after the creation of the factory object
DrawProgressBar | True | - | Draw a progress bar displaying the training, testing and evaluation schedule
AnalysisType | Auto | Classification, Regression, Multiclass, Auto | Set the analysis type
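The keyword options and the colon-separated option string are two spellings of the same thing: booleans become flags ("Name" when True, "!Name" when False), and everything else becomes "Name=value", joined with ':'. A minimal sketch of that mapping (the helper name is made up, not part of TMVA):

```python
def to_option_string(**opts):
    """Hypothetical helper: render keyword options in TMVA's
    colon-separated format. Booleans become flags ('Name' when
    True, '!Name' when False); other values become 'Name=value'."""
    parts = []
    for name, value in opts.items():
        if isinstance(value, bool):
            parts.append(name if value else "!" + name)
        else:
            parts.append("%s=%s" % (name, value))
    return ":".join(parts)

s = to_option_string(V=False, Color=True, DrawProgressBar=True,
                     Transformations="I;D;P;G,D",
                     AnalysisType="Classification")
print(s)
# !V:Color:DrawProgressBar:Transformations=I;D;P;G,D:AnalysisType=Classification
```

Running it on the options used above reproduces the string passed to the Factory in the C++-like cell.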

Pythonic version

By enabling JsMVA we get a new, more readable way to make this declaration (this applies to all functions, not just the constructor).

In [4]:
factory = TMVA.Factory("TMVAClassification", TargetFile=outputFile,
                       V=False, Color=True, DrawProgressBar=True, Transformations=["I", "D", "P", "G", "D"],
                       AnalysisType="Classification")

Arguments of the constructor:

Keyword | Can be used as positional argument | Default | Predefined values | Description
JobName | yes, 1. | not optional | - | Name of the job
TargetFile | yes, 2. | if not passed, histograms won't be saved | - | File to write control and performance histograms
V | no | False | - | Verbose flag
Color | no | True | - | Flag for colored output
Transformations | no | "" | - | List of transformations to test. For example, the string "I;D;P;U;G" applies identity, decorrelation, PCA, uniform and Gaussian transformations
Silent | no | False | - | Batch mode: silent flag inhibiting any output from TMVA after the creation of the factory object
DrawProgressBar | no | True | - | Draw a progress bar displaying the training, testing and evaluation schedule
AnalysisType | no | Auto | Classification, Regression, Multiclass, Auto | Set the analysis type

Declaring the DataLoader, adding variables and setting up the dataset

First we need to declare a DataLoader and add the variables (passing the variable names used in the test and train trees of the input dataset). To add variable names to the DataLoader we use the AddVariable function. Its arguments:

  1. A string containing the variable name. Using ":=" we can also add a definition.

  2. A string (a label for the variable; if absent, the variable name is used) or a character (defining the type of the data points)

  3. If a label was given, the data point type can still be passed as the third argument
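The ":=" syntax in the first argument attaches a formula (evaluated on the tree) to an internal name. As a rough illustration of how such a definition splits into its parts (the helper below is made up, not part of TMVA):

```python
def split_definition(expr):
    """Hypothetical helper: split a 'name := formula' variable
    definition into (name, formula). Plain names ('var3') have
    no formula part."""
    if ":=" in expr:
        name, formula = expr.split(":=", 1)
        return name.strip(), formula.strip()
    return expr.strip(), None

print(split_definition("myvar1 := var1+var2"))  # ('myvar1', 'var1+var2')
print(split_definition("var3"))                 # ('var3', None)
```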

In [5]:
dataset = "tmva_class_example" #the dataset name
loader  = TMVA.DataLoader(dataset)

loader.AddVariable( "myvar1 := var1+var2", 'F' )
loader.AddVariable( "myvar2 := var1-var2", "Expression 2", 'F' )
loader.AddVariable( "var3",                "Variable 3", 'F' )
loader.AddVariable( "var4",                "Variable 4", 'F' )

It is possible to define spectator variables, which are part of the input data set but are not used in the MVA training, testing or evaluation; they can still be used, for example, for correlation tests. Parameters:

  1. String containing the definition of spectator variable.
  2. Label for spectator variable.
  3. Data type
In [6]:
loader.AddSpectator( "spec1:=var1*2",  "Spectator 1",  'F' )
loader.AddSpectator( "spec2:=var1*3",  "Spectator 2",  'F' )

After adding the variables we have to add the data to the DataLoader. If the dataset file does not yet exist in the files directory, we download it from CERN's server. Once we have the ROOT file, we open it and get the signal and background trees.

In [7]:
if ROOT.gSystem.AccessPathName( "tmva_class_example.root" ) != 0: 
    ROOT.gSystem.Exec( "wget https://root.cern.ch/files/tmva_class_example.root")

input = TFile.Open( "tmva_class_example.root" )

# Get the signal and background trees for training
signal      = input.Get( "TreeS" )
background  = input.Get( "TreeB" )

To pass the signal and background trees to the DataLoader we use the AddSignalTree and AddBackgroundTree functions; we also set the corresponding DataLoader attributes. Arguments of these functions:

  1. Signal/background tree
  2. Global weight applied to all events in the tree.
In [8]:
# Global event weights (see below for setting event-wise weights)
signalWeight     = 1.0
backgroundWeight = 1.0

loader.AddSignalTree(signal, signalWeight)
loader.AddBackgroundTree(background, backgroundWeight)

loader.fSignalWeight = signalWeight
loader.fBackgroundWeight = backgroundWeight
loader.fTreeS = signal
loader.fTreeB = background
DataSetInfo : Dataset: tmva_class_example : Added class "Signal"
Add Tree TreeS of type Signal with 6000 events
DataSetInfo : Dataset: tmva_class_example : Added class "Background"
Add Tree TreeB of type Background with 6000 events

Using the DataLoader.PrepareTrainingAndTestTree function we apply cuts on the input events. In C++ this function also takes its options as a string (as we saw for the Factory constructor); with JsMVA these can be passed as keyword arguments instead (same as in the Factory constructor case).

Arguments of PrepareTrainingAndTestTree:

Keyword | Can be used as positional argument | Default | Predefined values | Description
SigCut | yes, 1. | - | - | TCut object for the signal cut
BkgCut | yes, 2. | - | - | TCut object for the background cut
SplitMode | no | Random | Random, Alternate, Block | Method of picking training and testing events
MixMode | no | SameAsSplitMode | SameAsSplitMode, Random, Alternate, Block | Method of mixing events of different classes into one dataset
SplitSeed | no | 100 | - | Seed for random event shuffling
NormMode | no | EqualNumEvents | None, NumEvents, EqualNumEvents | Overall renormalisation of event-by-event weights used in the training (NumEvents: average weight of 1 per event, independently for signal and background; EqualNumEvents: average weight of 1 per event for signal, and the background weight sum forced equal to the signal weight sum)
nTrain_Signal | no | 0 (all) | - | Number of training events of class Signal
nTest_Signal | no | 0 (all) | - | Number of test events of class Signal
nTrain_Background | no | 0 (all) | - | Number of training events of class Background
nTest_Background | no | 0 (all) | - | Number of test events of class Background
V | no | False | - | Verbosity
VerboseLevel | no | Info | Debug, Verbose, Info | Verbosity level
In [9]:
mycuts = TCut("")
mycutb = TCut("")

loader.PrepareTrainingAndTestTree(SigCut=mycuts, BkgCut=mycutb,
                    nTrain_Signal=0, nTrain_Background=0, SplitMode="Random", NormMode="NumEvents", V=False)
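As a rough, pure-Python illustration of what the NormMode setting means (a sketch of the renormalisation rules described in the table above, not TMVA's actual code):

```python
def renormalize(sig_w, bkg_w, mode="EqualNumEvents"):
    """Sketch of NormMode renormalisation on per-event weight lists.
    'NumEvents': average weight of 1 per event, per class.
    'EqualNumEvents': additionally force the background weight sum
    to equal the signal weight sum."""
    if mode == "None":
        return sig_w, bkg_w
    # NumEvents: scale each class so its average weight is 1
    s = [w * len(sig_w) / sum(sig_w) for w in sig_w]
    b = [w * len(bkg_w) / sum(bkg_w) for w in bkg_w]
    if mode == "EqualNumEvents":
        factor = sum(s) / sum(b)
        b = [w * factor for w in b]
    return s, b

s, b = renormalize([2.0, 2.0], [1.0, 1.0, 1.0, 1.0])
print(sum(s), sum(b))  # 2.0 2.0 -- background sum matches signal sum
```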

Visualizing input variables

In [10]:
loader.DrawInputVariable("myvar1")
DataSetFactory : Dataset: tmva_class_example : Number of events in input trees
Number of training and testing events:
  Signal     : training events: 3000, testing events: 3000, total: 6000
  Background : training events: 3000, testing events: 3000, total: 6000
DataSetInfo : Correlation matrix (Signal)
DataSetInfo : Correlation matrix (Background)
DataSetFactory : Dataset: tmva_class_example

We can also visualize transformations on input variables

In [11]:
loader.DrawInputVariable("myvar1", processTrfs=["D", "N"]) #Transformations: I;N;D;P;U;G,D
DataLoader : Dataset: tmva_class_example : Create Transformation "D" with events from all classes.
Transformation, Variable selection :
Input : variable 'myvar1' <---> Output : variable 'myvar1'
Input : variable 'myvar2' <---> Output : variable 'myvar2'
Input : variable 'var3' <---> Output : variable 'var3'
Input : variable 'var4' <---> Output : variable 'var4'
DataLoader : Dataset: tmva_class_example : Create Transformation "N" with events from all classes.
Transformation, Variable selection :
Input : variable 'myvar1' <---> Output : variable 'myvar1'
Input : variable 'myvar2' <---> Output : variable 'myvar2'
Input : variable 'var3' <---> Output : variable 'var3'
Input : variable 'var4' <---> Output : variable 'var4'
Preparing the Decorrelation transformation...
TFHandler_DataLoader
Variable      Mean        RMS       Min        Max
myvar1     -0.11202    1.0000    -3.8813     3.3150
myvar2     -0.017404   1.0000    -3.7240     3.6440
var3       -0.11241    1.0000    -3.7248     3.8805
var4        0.32261    1.0000    -3.3662     3.1355
TFHandler_DataLoader
Variable      Mean        RMS       Min        Max
myvar1      0.047564   0.27792   -1.0000     1.0000
myvar2      0.0061262  0.27145   -1.0000     1.0000
var3       -0.050040   0.26298   -1.0000     1.0000
var4        0.13472    0.30761   -1.0000     1.0000

Correlation matrix of input variables

In [12]:
loader.DrawCorrelationMatrix("Signal")

Booking methods

To choose which methods we want to train on the dataset we use the Factory.BookMethod function. This adds a method, together with its options, to the Factory.

Arguments:

Keyword | Can be used as positional argument | Default | Predefined values | Description
DataLoader | yes, 1. | - | - | Pointer to the DataLoader object
Method | yes, 2. | - | kVariable, kCuts, kLikelihood, kPDERS, kHMatrix, kFisher, kKNN, kCFMlpANN, kTMlpANN, kBDT, kDT, kRuleFit, kSVM, kMLP, kBayesClassifier, kFDA, kBoost, kPDEFoam, kLD, kPlugins, kCategory, kDNN, kPyRandomForest, kPyAdaBoost, kPyGTB, kC50, kRSNNS, kRSVM, kRXGB, kMaxMethod | Selected method number; method numbers are defined in TMVA.Types
MethodTitle | yes, 3. | - | - | Label for the method
* | no | - | - | Other named arguments, which are the options for the selected method
In [13]:
factory.BookMethod( DataLoader=loader, Method=TMVA.Types.kSVM, MethodTitle="SVM", 
                Gamma=0.25, Tol=0.001, VarTransform="Norm" )

factory.BookMethod( loader,TMVA.Types.kMLP, "MLP", 
        H=False, V=False, NeuronType="tanh", VarTransform="N", NCycles=600, HiddenLayers="N+5",
                   TestRate=5, UseRegulator=False )

factory.BookMethod( loader,TMVA.Types.kLD, "LD", 
        H=False, V=False, VarTransform="None", CreateMVAPdfs=True, PDFInterpolMVAPdf="Spline2",
                   NbinsMVAPdf=50, NsmoothMVAPdf=10 )

factory.BookMethod( loader,TMVA.Types.kLikelihood,"Likelihood","NSmoothSig[0]=20:NSmoothBkg[0]=20:NSmoothBkg[1]=10",
    NSmooth=1, NAvEvtPerBin=50, H=True, V=False,TransformOutput=True,PDFInterpol="Spline2")

factory.BookMethod( loader, TMVA.Types.kBDT, "BDT",
    H=False, V=False, NTrees=850, MinNodeSize="2.5%", MaxDepth=3, BoostType="AdaBoost", AdaBoostBeta=0.5,
                   UseBaggedBoost=True, BaggedSampleFraction=0.5, SeparationType="GiniIndex", nCuts=20 )
Out[13]:
<ROOT.TMVA::MethodBDT object ("BDT") at 0x6811880>
Factory Booking method: SVM
SVM
Dataset: tmva_class_example : Create Transformation "Norm" with events from all classes.
Transformation, Variable selection :
Input : variable 'myvar1' <---> Output : variable 'myvar1'
Input : variable 'myvar2' <---> Output : variable 'myvar2'
Input : variable 'var3' <---> Output : variable 'var3'
Input : variable 'var4' <---> Output : variable 'var4'
Factory Booking method: MLP
MLP
Dataset: tmva_class_example : Create Transformation "N" with events from all classes.
Transformation, Variable selection :
Input : variable 'myvar1' <---> Output : variable 'myvar1'
Input : variable 'myvar2' <---> Output : variable 'myvar2'
Input : variable 'var3' <---> Output : variable 'var3'
Input : variable 'var4' <---> Output : variable 'var4'
MLP Building Network.
Initializing weights
Factory Booking method: LD
Factory Booking method: Likelihood
Factory Booking method: BDT

Booking DNN: two ways (don't use both at the same time)

There are two ways to book a DNN:

  1. The visual way: run the next cell, design the network graphically, then click on "Save Network"
In [14]:
factory.BookDNN(loader)
  2. The classical way
In [15]:
trainingStrategy = [{
        "LearningRate": 1e-1,
        "Momentum": 0.0,
        "Repetitions": 1,
        "ConvergenceSteps": 300,
        "BatchSize": 20,
        "TestRepetitions": 15,
        "WeightDecay": 0.001,
        "Regularization": "NONE",
        "DropConfig": "0.0+0.5+0.5+0.5",
        "DropRepetitions": 1,
        "Multithreading": True

    },  {
        "LearningRate": 1e-2,
        "Momentum": 0.5,
        "Repetitions": 1,
        "ConvergenceSteps": 300,
        "BatchSize": 30,
        "TestRepetitions": 7,
        "WeightDecay": 0.001,
        "Regularization": "L2",
        "DropConfig": "0.0+0.1+0.1+0.1",
        "DropRepetitions": 1,
        "Multithreading": True

    }, {
        "LearningRate": 1e-2,
        "Momentum": 0.3,
        "Repetitions": 1,
        "ConvergenceSteps": 300,
        "BatchSize": 40,
        "TestRepetitions": 7,
        "WeightDecay": 0.001,
        "Regularization": "L2",
        "Multithreading": True

    },{
        "LearningRate": 1e-3,
        "Momentum": 0.1,
        "Repetitions": 1,
        "ConvergenceSteps": 200,
        "BatchSize": 70,
        "TestRepetitions": 7,
        "WeightDecay": 0.001,
        "Regularization": "NONE",
        "Multithreading": True

}, {
        "LearningRate": 1e-3,
        "Momentum": 0.1,
        "Repetitions": 1,
        "ConvergenceSteps": 200,
        "BatchSize": 70,
        "TestRepetitions": 7,
        "WeightDecay": 0.001,
        "Regularization": "NONE",
        "Multithreading": True

}]

factory.BookMethod(DataLoader=loader, Method=TMVA.Types.kDNN, MethodTitle="DNN", 
                   H = False, V=False, VarTransform="Normalize", ErrorStrategy="CROSSENTROPY",
                   Layout=["TANH|100", "TANH|50", "TANH|10", "LINEAR"],
                   TrainingStrategy=trainingStrategy,Architecture="STANDARD")
Out[15]:
<ROOT.TMVA::MethodDNN object ("DNN") at 0x6634270>
Factory Booking method: DNN
DNN
Dataset: tmva_class_example : Create Transformation "Normalize" with events from all classes.
Transformation, Variable selection :
Input : variable 'myvar1' <---> Output : variable 'myvar1'
Input : variable 'myvar2' <---> Output : variable 'myvar2'
Input : variable 'var3' <---> Output : variable 'var3'
Input : variable 'var4' <---> Output : variable 'var4'
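To pass this list of dicts to TMVA, JsMVA has to serialise it into the string form of the TrainingStrategy option: one block of "Key=Value" pairs per training phase, pairs separated by commas and phases by '|'. A rough sketch of that translation (helper name hypothetical, exact formatting may differ from JsMVA's):

```python
def strategy_to_string(phases):
    """Hypothetical helper: render a list of training-phase dicts
    in a 'Key=Value,...|Key=Value,...' option-string form, with
    phases separated by '|' and settings by ','."""
    return "|".join(
        ",".join("%s=%s" % (k, v) for k, v in phase.items())
        for phase in phases
    )

print(strategy_to_string([{"LearningRate": 0.1, "Momentum": 0.0},
                          {"LearningRate": 0.01, "Momentum": 0.5}]))
# LearningRate=0.1,Momentum=0.0|LearningRate=0.01,Momentum=0.5
```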

Train Methods

When you use the jsmva magic, the original C++ Factory::TrainAllMethods is replaced by a new training method that produces notebook-compatible output during training, so we can follow the process (progress bar, error plot). For some methods (MLP, DNN, BDT) a tracer plot is created (for MLP and DNN: test and training error vs. epoch; for BDT: error fraction and boost weight vs. tree number). Some methods don't support interactive tracing; for these, only a simple text line is printed so that we know which method TrainAllMethods is currently training.

For methods where interactive tracing is possible there is also a stop button, which stops the training of the current method only; it does not stop TrainAllMethods completely.

In [16]:
factory.TrainAllMethods()

Dataset: tmva_class_example

Train method: SVM

0%

Train method: MLP

0%

Train method: LD

Training...
End

Train method: Likelihood

Training...
End

Train method: BDT

0%

Train method: DNN

0%
TFHandler_SVM
Variable      Mean        RMS       Min        Max
myvar1      0.083989   0.36407   -1.0000     1.0000
myvar2      0.0094778  0.27696   -1.0000     1.0000
var3        0.080279   0.36720   -1.0000     1.0000
var4        0.12986    0.39603   -1.0000     1.0000
Building SVM Working Set...with 6000 event instances
Elapsed time for Working Set build : 1.24 sec
Sorry, no computing time forecast available for SVM, please wait ...
Elapsed time : 1.68 sec
Elapsed time for training with 6000 events : 2.94 sec
SVM
Dataset: tmva_class_example : Evaluation of SVM on training sample (6000 events)
Elapsed time for evaluation of 6000 events : 1.03 sec
Creating xml weight file: tmva_class_example/weights/TMVAClassification_SVM.weights.xml
Creating standalone class: tmva_class_example/weights/TMVAClassification_SVM.class.C
TFHandler_MLP
Variable      Mean        RMS       Min        Max
myvar1      0.083989   0.36407   -1.0000     1.0000
myvar2      0.0094778  0.27696   -1.0000     1.0000
var3        0.080279   0.36720   -1.0000     1.0000
var4        0.12986    0.39603   -1.0000     1.0000
Training Network
Elapsed time for training with 6000 events : 1.43 sec
MLP
Dataset: tmva_class_example : Evaluation of MLP on training sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.00932 sec
Creating xml weight file: tmva_class_example/weights/TMVAClassification_MLP.weights.xml
Creating standalone class: tmva_class_example/weights/TMVAClassification_MLP.class.C
Write special histos to file: TMVA.root:/tmva_class_example/Method_MLP/MLP
LD Results for LD coefficients:
Variable: Coefficient:
myvar1: -0.359
myvar2: -0.109
var3: -0.211
var4: +0.722
(offset): -0.054
Elapsed time for training with 6000 events : 0.00231 sec
LD
Dataset: tmva_class_example : Evaluation of LD on training sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.000759 sec
Dataset: tmva_class_example Separation from histogram (PDF): 0.452 (0.000)
Evaluation of LD on training sample
Creating xml weight file: tmva_class_example/weights/TMVAClassification_LD.weights.xml
Creating standalone class: tmva_class_example/weights/TMVAClassification_LD.class.C
================================================================
Dataset: Likelihood 
--- Short description:
The maximum-likelihood classifier models the data with probability
density functions (PDF) reproducing the signal and background
distributions of the input variables. Correlations among the
variables are ignored.
--- Performance optimisation:
Required for good performance are decorrelated input variables
(PCA transformation via the option "VarTransform=Decorrelate"
may be tried). Irreducible non-linear correlations may be reduced
by precombining strongly correlated input variables, or by simply
removing one of the variables.
--- Performance tuning via configuration options:
High fidelity PDF estimates are mandatory, i.e., sufficient training
statistics is required to populate the tails of the distributions
It would be a surprise if the default Spline or KDE kernel parameters
provide a satisfying fit to the data. The user is advised to properly
tune the events per bin and smooth options in the spline cases
individually per variable. If the KDE kernel is used, the adaptive
Gaussian kernel may lead to artefacts, so please always also try
the non-adaptive one.
All tuning parameters must be adjusted individually for each input
variable!
================================================================
Filling reference histograms
Building PDF out of reference histograms
Elapsed time for training with 6000 events : 0.0304 sec
Likelihood
Dataset: tmva_class_example : Evaluation of Likelihood on training sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.00743 sec
Creating xml weight file: tmva_class_example/weights/TMVAClassification_Likelihood.weights.xml
Creating standalone class: tmva_class_example/weights/TMVAClassification_Likelihood.class.C
Write monitoring histograms to file: TMVA.root:/tmva_class_example/Method_Likelihood/Likelihood
BDT #events: (reweighted) sig: 3000 bkg: 3000
#events: (unweighted) sig: 3000 bkg: 3000
Training 850 Decision Trees ... patience please
Elapsed time for training with 6000 events : 1.54 sec
BDT
Dataset: tmva_class_example : Evaluation of BDT on training sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.443 sec
Creating xml weight file: tmva_class_example/weights/TMVAClassification_BDT.weights.xml
Creating standalone class: tmva_class_example/weights/TMVAClassification_BDT.class.C
TFHandler_DNN
Variable      Mean        RMS       Min        Max
myvar1      0.083989   0.36407   -1.0000     1.0000
myvar2      0.0094778  0.27696   -1.0000     1.0000
var3        0.080279   0.36720   -1.0000     1.0000
var4        0.12986    0.39603   -1.0000     1.0000
TFHandler_DNN
Variable      Mean        RMS       Min        Max
myvar1      0.083989   0.36407   -1.0000     1.0000
myvar2      0.0094778  0.27696   -1.0000     1.0000
var3        0.080279   0.36720   -1.0000     1.0000
var4        0.12986    0.39603   -1.0000     1.0000
TFHandler_DNN
Variable      Mean        RMS       Min        Max
myvar1      0.075113   0.36776   -1.1074     1.0251
myvar2      0.0075595  0.27349   -0.90663    1.0008
var3        0.070228   0.37106   -1.0649     1.0602
var4        0.12090    0.39854   -1.1871     1.0199
Using Standard Implementation.
Training with learning rate = 0.1, momentum = 0, repetitions = 1
Training with learning rate = 0.01, momentum = 0.5, repetitions = 1
Training with learning rate = 0.01, momentum = 0.3, repetitions = 1
Training with learning rate = 0.001, momentum = 0.1, repetitions = 1
Training with learning rate = 0.001, momentum = 0.1, repetitions = 1
Elapsed time for training with 6000 events : 4.53 sec
DNN
Dataset: tmva_class_example : Evaluation of DNN on training sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.212 sec
Creating xml weight file: tmva_class_example/weights/TMVAClassification_DNN.weights.xml
Creating standalone class: tmva_class_example/weights/TMVAClassification_DNN.class.C

Test and evaluate the methods

To test the methods and evaluate their performance we need to run the Factory.TestAllMethods and Factory.EvaluateAllMethods functions.

In [17]:
factory.TestAllMethods()
factory.EvaluateAllMethods()
Factory Test all methods
Factory Test method: SVM for Classification performance
SVM
Dataset: tmva_class_example : Evaluation of SVM on testing sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.983 sec
Factory Test method: MLP for Classification performance
MLP
Dataset: tmva_class_example : Evaluation of MLP on testing sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.00927 sec
Factory Test method: LD for Classification performance
LD
Dataset: tmva_class_example : Evaluation of LD on testing sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.00108 sec
Dataset: tmva_class_example : Evaluation of LD on testing sample
Factory Test method: Likelihood for Classification performance
Likelihood
Dataset: tmva_class_example : Evaluation of Likelihood on testing sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.00623 sec
Factory Test method: BDT for Classification performance
BDT
Dataset: tmva_class_example : Evaluation of BDT on testing sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.367 sec
Factory Test method: DNN for Classification performance
DNN
Dataset: tmva_class_example : Evaluation of DNN on testing sample (6000 events)
Elapsed time for evaluation of 6000 events : 0.193 sec
Factory Evaluate all methods
Factory Evaluate classifier: SVM
TFHandler_SVM
Variable      Mean        RMS       Min        Max
myvar1      0.075113   0.36776   -1.1074     1.0251
myvar2      0.0075595  0.27349   -0.90663    1.0008
var3        0.070228   0.37106   -1.0649     1.0602
var4        0.12090    0.39854   -1.1871     1.0199
SVM
Dataset: tmva_class_example : Loop over test events and fill histograms with classifier response...
TFHandler_SVM
Variable      Mean        RMS       Min        Max
myvar1      0.075113   0.36776   -1.1074     1.0251
myvar2      0.0075595  0.27349   -0.90663    1.0008
var3        0.070228   0.37106   -1.0649     1.0602
var4        0.12090    0.39854   -1.1871     1.0199
Factory Evaluate classifier: MLP
TFHandler_MLP
Variable      Mean        RMS       Min        Max
myvar1      0.075113   0.36776   -1.1074     1.0251
myvar2      0.0075595  0.27349   -0.90663    1.0008
var3        0.070228   0.37106   -1.0649     1.0602
var4        0.12090    0.39854   -1.1871     1.0199
MLP
Dataset: tmva_class_example : Loop over test events and fill histograms with classifier response...
TFHandler_MLP
Variable      Mean        RMS       Min        Max
myvar1      0.075113   0.36776   -1.1074     1.0251
myvar2      0.0075595  0.27349   -0.90663    1.0008
var3        0.070228   0.37106   -1.0649     1.0602
var4        0.12090    0.39854   -1.1871     1.0199
Factory Evaluate classifier: LD
LD
Dataset: tmva_class_example : Loop over test events and fill histograms with classifier response...
Also filling probability and rarity histograms (on request)...
TFHandler_LD
Variable      Mean         RMS       Min        Max
myvar1     -0.010814    3.0633    -9.8605     7.9024
myvar2      0.00090552  1.1092    -3.7067     4.0291
var3       -0.015118    1.7459    -5.3563     4.6430
var4        0.14331     2.1667    -6.9675     5.0307
Factory Evaluate classifier: Likelihood
Likelihood
Dataset: tmva_class_example : Loop over test events and fill histograms with classifier response...
TFHandler_Likelihood
Variable      Mean         RMS       Min        Max
myvar1     -0.010814    3.0633    -9.8605     7.9024
myvar2      0.00090552  1.1092    -3.7067     4.0291
var3       -0.015118    1.7459    -5.3563     4.6430
var4        0.14331     2.1667    -6.9675     5.0307
Factory Evaluate classifier: BDT
BDT
Dataset: tmva_class_example : Loop over test events and fill histograms with classifier response...
TFHandler_BDT
Variable      Mean         RMS       Min        Max
myvar1     -0.010814    3.0633    -9.8605     7.9024
myvar2      0.00090552  1.1092    -3.7067     4.0291
var3       -0.015118    1.7459    -5.3563     4.6430
var4        0.14331     2.1667    -6.9675     5.0307
Factory Evaluate classifier: DNN
DNN
Dataset: tmva_class_example : Loop over test events and fill histograms with classifier response...
TFHandler_DNN
Variable      Mean        RMS       Min        Max
myvar1      0.075113   0.36776   -1.1074     1.0251
myvar2      0.0075595  0.27349   -0.90663    1.0008
var3        0.070228   0.37106   -1.0649     1.0602
var4        0.12090    0.39854   -1.1871     1.0199
Evaluation results ranked by best signal efficiency and purity (area)
DataSet MVA
Name: Method: ROC-integ
tmva_class_example DNN : 0.940
tmva_class_example MLP : 0.939
tmva_class_example SVM : 0.937
tmva_class_example BDT : 0.931
tmva_class_example LD : 0.895
tmva_class_example Likelihood : 0.827
Testing efficiency compared to training efficiency (overtraining check)
DataSet MVA Signal efficiency: from test sample (from training sample)
Name: Method: @B=0.01 @B=0.10 @B=0.30
tmva_class_example DNN : 0.390 (0.345) 0.804 (0.798) 0.962 (0.963)
tmva_class_example MLP : 0.365 (0.345) 0.806 (0.797) 0.962 (0.964)
tmva_class_example SVM : 0.400 (0.322) 0.802 (0.791) 0.961 (0.961)
tmva_class_example BDT : 0.350 (0.380) 0.778 (0.805) 0.955 (0.959)
tmva_class_example LD : 0.261 (0.242) 0.679 (0.662) 0.901 (0.903)
tmva_class_example Likelihood : 0.106 (0.101) 0.400 (0.371) 0.812 (0.813)
Dataset:tmva_class_exa...: Created tree 'TestTree' with 6000 events
Dataset:tmva_class_exa...: Created tree 'TrainTree' with 6000 events
Factory Thank you for using TMVA!
For citation information, please visit: http://tmva.sf.net/citeTMVA.html

Classifier Output Distributions

To draw the classifier output distribution we use the Factory.DrawOutputDistribution function, which is inserted by invoking the jsmva magic. Its arguments:

Keyword | Can be used as positional argument | Default | Predefined values | Description
datasetName | yes, 1. | - | - | The name of the dataset
methodName | yes, 2. | - | - | The name of the method
In [18]:
factory.DrawOutputDistribution(dataset, "MLP")

Classifier Probability Distributions

To draw the classifier probability distribution we use the Factory.DrawProbabilityDistribution function, which is inserted by invoking the jsmva magic. Its arguments (the method name is passed as the second argument in the cell below):

Keyword | Can be used as positional argument | Default | Predefined values | Description
datasetName | yes, 1. | - | - | The name of the dataset
methodName | yes, 2. | - | - | The name of the method
In [19]:
factory.DrawProbabilityDistribution(dataset, "LD")

ROC curve

To draw the ROC (receiver operating characteristic) curve we use the Factory.DrawROCCurve function, which is inserted by invoking the jsmva magic. Its arguments:

Keyword | Can be used as positional argument | Default | Predefined values | Description
datasetName | yes, 1. | - | - | The name of the dataset
In [20]:
factory.DrawROCCurve(dataset)
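The ROC-integ column in the evaluation summary above is the area under this curve. As a reminder of what that quantity is, here is a trapezoidal-rule sketch on a toy curve (pure Python, not TMVA code):

```python
def roc_auc(x, y):
    """Trapezoidal area under a ROC curve given as sorted
    coordinate lists: x = background efficiency (false positive
    rate), y = signal efficiency (true positive rate)."""
    area = 0.0
    for i in range(1, len(x)):
        area += 0.5 * (x[i] - x[i - 1]) * (y[i] + y[i - 1])
    return area

# Toy curve: at 50% background efficiency we keep 90% of signal.
print(roc_auc([0.0, 0.5, 1.0], [0.0, 0.9, 1.0]))  # 0.7
```

A random classifier (the diagonal) gives 0.5; the DNN's 0.940 above means the curve bulges strongly toward the top-left corner.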

Classifier Cut Efficiencies

To draw the classifier cut efficiencies we use the Factory.DrawCutEfficiencies function, which is inserted by invoking the jsmva magic. Its arguments:

Keyword | Can be used as positional argument | Default | Predefined values | Description
datasetName | yes, 1. | - | - | The name of the dataset
methodName | yes, 2. | - | - | The name of the method
In [21]:
factory.DrawCutEfficiencies(dataset, "MLP")

Draw Neural Network

If we trained a neural network, the weights of the network are saved to XML and C files. We can read back the XML file and visualize the network using the Factory.DrawNeuralNetwork function.

The arguments of this function:

Keyword | Can be used as positional argument | Default | Predefined values | Description
datasetName | yes, 1. | - | - | The name of the dataset
methodName | yes, 2. | - | - | The name of the method

This visualization is interactive, and we can do the following with it:

  • Mouseover (node, weight): focusing
  • Zooming, grabbing and moving are supported
  • Reset: double click

The synapses are drawn in two colors, one for positive weights and one for negative weights. The absolute value of each synapse weight is scaled and mapped to the thickness of the line between the two nodes.
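The color/thickness rule just described can be sketched as a small function (the colors and the pixel scale below are illustrative choices, not JsMVA's actual values):

```python
def synapse_style(weight, max_abs_weight, max_px=6.0):
    """Toy version of the synapse styling: color chosen by the
    sign of the weight, line thickness proportional to |weight|
    relative to the largest weight in the network."""
    color = "blue" if weight >= 0 else "red"
    thickness = max_px * abs(weight) / max_abs_weight
    return color, thickness

print(synapse_style(-0.5, 1.0))  # ('red', 3.0)
```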

In [22]:
factory.DrawNeuralNetwork(dataset, "MLP")

Draw Deep Neural Network

The DrawNeuralNetwork function can also visualize deep neural networks; we just have to pass "DNN" as the method name. If you have a very big network with many thousands of neurons, drawing it will be somewhat slow and will need a lot of RAM, so be careful with this function.

This visualization is also interactive, and it supports the following:

  • Zooming and grab and move supported
In [23]:
factory.DrawNeuralNetwork(dataset, "DNN")

Draw Decision Tree

The trained decision trees are saved to XML too, so we can read back the XML file and visualize the trees. This is the purpose of the Factory.DrawDecisionTree function.

The arguments of this function:

Keyword | Can be used as positional argument | Default | Predefined values | Description
datasetName | yes, 1. | - | - | The name of the dataset
methodName | yes, 2. | - | - | The name of the method

This function produces a small input box where you can enter the index of the tree you want to see (the total number of trees is also shown next to this input box). After choosing the index, press the Draw button. The nodes of the tree are colored; the color reflects the signal efficiency.

The visualization of the tree is interactive, and you can do the following with it:

  • Mouseover (node, weight): shows the decision path

  • Zooming, grabbing and moving are supported

  • Reset zoomed tree: double click

  • Expand all closed subtrees and turn off zoom: button at the bottom of the picture

  • Click on a node:

    • hides the subtree; if a node's children are hidden, the node gets a green border
    • rescaling: bigger nodes, bigger text
    • clicking again shows the subtree again
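The signal-efficiency coloring of the nodes can be sketched as a purity-to-color interpolation (a toy palette, not the one JsMVA actually uses):

```python
def node_color(n_sig, n_bkg):
    """Toy version of the node coloring: interpolate from blue
    (pure background) to red (pure signal) by signal purity."""
    purity = n_sig / float(n_sig + n_bkg)
    r = int(round(255 * purity))
    b = 255 - r
    return "#%02x00%02x" % (r, b)

print(node_color(3000, 1000))  # '#bf0040' -- mostly-signal node, reddish
```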
In [24]:
factory.DrawDecisionTree(dataset, "BDT") #11

DNN weights heat map

In [25]:
factory.DrawDNNWeights(dataset, "DNN")

Close the factory's output file

In [26]:
outputFile.Close()
In [ ]: