Simplified ATLAS SUSY analysis framework
Holds a collection of SUSY analyses. These can be run over samples in different input formats:
- DAOD_TRUTH (TRUTH1 and TRUTH3 tested)
- xAOD (either truth-level or reco-level; the latter with some constraints)
- HepMC (uncompressed HepMC v2 tested)
- slimmed ntuples (reduced ntuples produced from the above inputs)
- DELPHES (a converter to slimmed ntuples is provided)
It provides the analysis acceptance per control and signal region, as well as (optionally) histograms or ntuples with event-level objects.
Setting Up
First, you'll start a shell inside the docker image to set up the environment,
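for example with something like the following sketch, where the registry path, image tag and mount point are placeholders (pick the image you actually need from the GitLab registry):

```bash
# Pull a SimpleAnalysis image; the registry path and tag shown here are assumptions
docker pull gitlab-registry.cern.ch/atlas-sa/simple-analysis:master

# Start an interactive shell in the container, mounting the current
# directory so that input and output files are visible on the host
docker run --rm -it -v "$PWD":/workdir -w /workdir \
    gitlab-registry.cern.ch/atlas-sa/simple-analysis:master bash
```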
at which point you can then run the code. All docker images can be found in the GitLab registry.
Release Setup
By default, the docker image will execute source /release_setup.sh for you.
Running
You can either run inside the docker image using the simpleAnalysis command,
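for example (the analysis name and input file names below are placeholders):

```bash
# Run one analysis over one or more input files
simpleAnalysis -a MyAnalysisName inputFile1.root inputFile2.root
```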
or outside of the docker image (mounting your current working directory in)
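for example with something like this, where the image path and mount point are again placeholders:

```bash
# The entrypoint prepends simpleAnalysis because the command starts with a dash
docker run --rm -v "$PWD":/workdir -w /workdir \
    gitlab-registry.cern.ch/atlas-sa/simple-analysis:master \
    -a MyAnalysisName inputFile1.root
```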
Docker Entrypoint
The docker image will only prefix simpleAnalysis to the command you pass if it starts with a dash (-). For example,
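a command along the lines of the following sketch (image path and mounts are placeholders)

```bash
# No leading dash, so the entrypoint does not prepend simpleAnalysis
docker run --rm -v "$PWD":/workdir -w /workdir \
    gitlab-registry.cern.ch/atlas-sa/simple-analysis:master \
    inputFile1.root
```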
will try to run inputFile1.root as a bash command instead. If you do not need any of the flags listed below, then
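an invocation that names simpleAnalysis explicitly, such as this sketch (again with placeholder paths and file names),

```bash
# Name simpleAnalysis explicitly when no flags are needed
docker run --rm -v "$PWD":/workdir -w /workdir \
    gitlab-registry.cern.ch/atlas-sa/simple-analysis:master \
    simpleAnalysis inputFile1.root
```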
will work as expected.
This will run the analyses specified with the -a option (or all analyses if it is not given) over all of the input files and provide acceptances in a text file for each analysis (analysisName.txt) and histograms in a root file (analysisName.root).
The following additional command-line options can be specified (an example invocation follows the list):
- --nevents <num> : limit the number of events processed
- -l [--listanalysis] : lists all available analyses
- -n [--ntuple] : activates ntuple outputs
- -o [--output] <name> : merges the different analysis outputs into single text and root files
- -w [--mcweight] <num> : chooses which event weight to use in case of multiple; the default is 0 and weighting can be disabled by setting it to -1
- -P [--pdfVariations] <initPDF> : do PDF variations following the U.L. method of https://arxiv.org/abs/1206.2892
- -T [--useTrueTau] : use the p_T of the true tau instead of the visible p_T (not recommended, but kept for backward compatibility)
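A typical invocation combining a few of these options might look like the following sketch; the analysis and file names are placeholders:

```bash
# Process at most 10000 events, write ntuples, and merge all outputs
# into combined.txt and combined.root
simpleAnalysis -a MyAnalysisName --nevents 10000 -n -o combined inputFile1.root
```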
Outputs
SimpleAnalysis will produce a .txt and a .root file, either one pair of files per analysis or a single pair if the -o option is used. In the latter case, everything gets prefixed with the analysis name to avoid name clashes.
The plain-text file contains the number of events accepted, the acceptance and its error for each signal region in the analysis, in the form of a comma-separated table. The number of events is the actual number of MC events accepted, while the acceptance is calculated taking into account event weights (unless disabled with the option -w -1). Besides the analysis-defined signal regions, there is an _All line which gives the total number of events processed, the sum of all event weights and the sum of the squared event weights, in that order. This can be used for normalization and for merging results.
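Since the table is plain comma-separated text, standard shell tools are enough for a quick look; the file name below is a placeholder:

```bash
# Pretty-print the comma-separated acceptance table
column -s, -t MyAnalysisName.txt
```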
The root file(s) will contain all histograms defined in the analysis. Note that all histogram filling is done with the event weight. If the -n option is used, the root file will also contain one ntuple per analysis with all the variables defined in the analysis code. The ntuple has one entry per event; if the analysis code filling a variable was not reached in a given event, the value will either be 0 or an empty vector. This has to be accounted for in case 0 is a valid value for an ntuple variable. Besides the analysis-defined variables, the ntuple has an event-number branch (counting from 1), an event-weight branch and a branch for each signal region. The latter will be 1 for events that passed that selection.
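To get a quick overview of what an output file contains, it can be opened interactively with ROOT; the file name below is a placeholder and the actual tree name should be taken from the file itself:

```bash
# Open the output file interactively
root -l MyAnalysisName.root
# at the ROOT prompt:
#   .ls              # lists the histograms and, if -n was used, the ntuple
#   <tree>->Print()  # replace <tree> with the ntuple name shown by .ls
```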
Slimming
For running over large input files more than once, it can be advantageous to
first slim the files to an absolute minimum by making an ntuple with all the
input objects. This can be done trivially with slimMaker. Minimum object pTs can be specified on the command line; see slimMaker --help. The output can be supplied to simpleAnalysis in the same way as DAOD or HepMC files, and the program will automatically detect the type of input.
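A possible workflow, sketched with placeholder file names (the exact slimMaker arguments and its output file name are assumptions here and should be checked with slimMaker --help):

```bash
# Slim the large input once; see `slimMaker --help` for pT thresholds and options
slimMaker DAOD_TRUTH.mySample.root

# Then run analyses repeatedly over the much smaller slimmed ntuple
simpleAnalysis -a MyAnalysisName slimmedSample.root
```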
DELPHES input
SimpleAnalysis is not directly integrated with DELPHES. Instead, a python script is provided that can be run in a DELPHES setup to convert the DELPHES output ROOT file into the simple ROOT TTree format, which can then be used directly by SimpleAnalysis.
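The conversion is invoked roughly as follows; the script name and argument order here are placeholders, so use the converter script actually shipped with SimpleAnalysis and check its usage message:

```bash
# Convert a DELPHES output file into the slimmed-ntuple format read by SimpleAnalysis
python convertDelphesToSA.py delphesOutput.root slimmedOutput.root
```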