Difference: AtlasDataAnalysis (117 vs. 118)

Revision 118 - 2011-10-04 - AlistairGemmell

Line: 1 to 1
 
META TOPICPARENT name="WebHome"
Line: 252 to 252
 

Variables used by the GlaNtp package

The variables used by the package can be divided into two sets. The first set contains the variables that are constant throughout the sample - the 'global' variables (e.g. the cross-section of the sample). These can be specified in their own tree, where they will be recorded (and read by GlaNtp) once only. If desired, these variables can instead be defined within the main tree of the input file - however, they will then be recorded once per event and read in once per event. This is wasteful, but for historical reasons it is supported. To choose between these behaviours, set LoadGlobalOnEachEvent in FlatPlotter and FlatReader to 1 to read the global values on an event-by-event basis, or to 0 to read them once from the global tree (or from the first event only). For more information on this switch, refer to this. The second set contains the variables that change on an event-by-event basis. These include both the variables on which the Neural Net will be trained (more information is given in the relevant section of this TWiki) and other useful variables, such as filter flags (which tell GlaNtp whether an event is sensible or not). All of these variables are listed in the file VariableTreeToNTPATLASttHSemiLeptonic-v15.txt
  The file maps logical values to their branch/leaf. The tree can be the global tree or the event tree.
Line: 271 to 271
If you want a parameter to be found in the output, it is best to list it here.
Added:

Variables that must be listed in the event (not the global) tree

nGenForType, LumiForType, Eventtype

 

Variables used for training the Neural Net

The list of variables on which the neural net is to train is set in the shell script, under TMVAvarset.txt (this file is created when the script runs). At present, these variables are:
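As a hedged sketch of the mechanics only: batch scripts typically write such steering files inline with a here-document. The one-variable-per-line format and the names VAR1/VAR2 below are placeholders for illustration; the actual training list is set inside genemflat_batch_Complete2_SL5.sh and is not reproduced here.

```shell
# Hypothetical sketch: write the training-variable list that the script
# hands to TMVA. VAR1/VAR2 are placeholder names, NOT the real ttH
# training variables; the file format is assumed to be one name per line.
cat > TMVAvarset.txt <<'EOF'
VAR1
VAR2
EOF
echo "Wrote $(wc -l < TMVAvarset.txt | tr -d ' ') training variables"
```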

Line: 564 to 568
 

Other switches to influence the running


genemflat_batch_Complete2_SL5.sh

At the start of the file there are a number of switches established:

 
# Flags to limit the scope of the run if desired
Line: 580 to 586
  ***NOTE*** The flags DoTraining and DoTemplates had previously (until release 00-00-21) been set on the command line. They were moved from the command line when the other flags were introduced.
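A hedged sketch of what such in-script flags look like: the names DoTraining and DoTemplates come from the note above, but the values and the surrounding logic here are illustrative, not the actual contents of genemflat_batch_Complete2_SL5.sh.

```shell
# Illustrative only: flags that select which stages of the run execute.
# DoTraining / DoTemplates were command-line arguments until release
# 00-00-21, after which they became variables set at the top of the script.
DoTraining=1     # 1 = run the neural-net training stage
DoTemplates=0    # 1 = produce the template histograms

if [ "$DoTraining" -eq 1 ]; then
  echo "training stage enabled"
fi
if [ "$DoTemplates" -eq 1 ]; then
  echo "template stage enabled"
fi
```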
Added:

teststeerFlatPlotterATLAStthSemileptonic-v16.txt and teststeerFlatReaderATLAStthSemileptonic-v16.txt

GeneralParameter bool   1 LoadGlobalOnEachEvent=0
Determines whether you have a separate global tree. If you do not, set this to 1 and the relevant global values will be read anew from the event tree for each event; with a separate global tree, leave it at 0 so they are read only once.
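For instance, to read the global values from the event tree for every event, one would flip the value (same steering-file syntax as the line above):

```
GeneralParameter bool   1 LoadGlobalOnEachEvent=1
```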
 

Where the output is stored

Added:
  •  Computentp120.log

    The log file from Computentp -- more information about its contents is found here

  •  trees/NNInputs_120.root

    The output from Computentp - it is a copy of all of the input datasets, with the addition of the variables TrainWeight and weight.

    Line: 682 to 698
     A useful little html page that one of Rick's scripts creates, showing a number of useful plots - the signal and background Net scores, distributions of input variables and their correlations, and so on.
    Added:

    Information found in log files

    Computentp120.log

    After looping over all the events, you will see a table like the one below:

     Process Name                                                                       File Name File     Scale    Events  Integral   IntLumi     Alpha
             ttjj /data/atlas09/ahgemmell/NNInputFiles_v16/mergedfilesProcessed/105200-29Aug.root    0         1       613       613         1  0.907015
              ttH      /data/atlas09/ahgemmell/NNInputFiles_v16/mergedfilesProcessed/ttH-v16.root  120         1       556       556         1         1
    

    Some of the values are established through steerComputentp.txt in the line

    ListParameter  Process:ttH       1 Filename:/data/atlas09/ahgemmell/NNInputFiles_v16/mergedfilesProcessed/ttH-v16.root:File:120:IntLumi:1.0

    1. Process Name / File Name : The mapping between these two is established in steerComputentp.txt
    2. File : The number which Computentp uses to differentiate between the various files it's processing - established in steerComputentp.txt
    3. IntLumi : The luminosity which Computentp is aiming to simulate by applying 'weight' to your samples - established in steerComputentp.txt
    4. Integral : The number of events you would expect that sample to have within your desired luminosity
    5. Alpha : This should equal TrainWeight
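As a hedged illustration of reading this table programmatically: the two data rows below are abridged copies of the excerpt above (with shortened placeholder paths instead of the real /data/atlas09 ones), and the fixed column layout is an assumption based on that excerpt.

```shell
# Recreate two (abridged) rows of the Computentp summary table, then use
# awk to pull out the process name (column 1) and Alpha (last column).
cat > computentp_excerpt.txt <<'EOF'
ttjj /some/path/105200-29Aug.root 0 1 613 613 1 0.907015
ttH /some/path/ttH-v16.root 120 1 556 556 1 1
EOF
awk '{ print $1, $NF }' computentp_excerpt.txt
```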
     

    Limitations

    • It must also be run on a PBS machine because of the structure of the genemflat_batch_Complete2_SL5.sh file (i.e. PBS commands).

    Line: 811 to 859
      To enable you to specify the range and number of bins in the histogram showing the distribution of the pseudoexperiment exclusions. (Found in drivetestFlatFitAtlastth.rootUnscaledTemplates.root)
    Deleted:
    In both FlatReader and FlatPlotter:

    GeneralParameter bool 1 LoadGlobalOnEachEvent=1

This needs to be set to 1 if you wish to load the global variables anew for each event. Otherwise the global variables are loaded once only: from the Global tree if you have specified one, or from the first event if you have not. Therefore, if your input datasets contain non-sensible states and have no global tree, this must be set to 1; otherwise, if the first entry is not sensible, or for some other reason has an unreasonable value for a global variable, that bad value will be used throughout. With this switch on, the values are loaded for every event, which obviously slows the code down; if the global values are safely stored in every entry, it may be best to set this to 0.

     

    TMVAsteer.txt (genemflat_batch_Complete2_SL5.sh)

    H6AONN5MEMLP      MLP         1 !H:!V:NCycles=1000:HiddenLayers=N+1,N:RandomSeed=9876543
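Reading the option string: !H and !V suppress the help and verbose printouts, NCycles is the number of training cycles, HiddenLayers=N+1,N requests two hidden layers with N+1 and N nodes (N being the number of input variables), and RandomSeed fixes the weight initialisation. To train for more cycles, for example, one would edit the line to something like the following (the value 2000 is purely illustrative):

```
H6AONN5MEMLP      MLP         1 !H:!V:NCycles=2000:HiddenLayers=N+1,N:RandomSeed=9876543
```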
     
This site is powered by the TWiki collaboration platform. Copyright © 2008-2022 by the contributing authors. All material on this collaboration platform is the property of the contributing authors.