Return to the main analysis page.
Analysis Timeline
Run the (updated) 2016 cuts based analysis on the 2017 data [started]

Summary:
- Finish updating the user directory files (specifically CHODAnalysis) so that KaonEventAnalysis.cc runs correctly on 2017 data (check you are using the correct framework revision)
- Run on one file to check functionality
- Ensure blinding is maintained (see the sketch after this list)
- Run on as much data as possible
- Try to run on any failed data files to complete the selection
- Discuss unblinding
- Unblind starting with control regions
- Write up results with the intention to publish eventually
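On the blinding point: below is a minimal sketch of how a blinding guard could be applied in the event loop. The signal-region boundaries in missing mass squared and the function name are placeholders of my own, not the definitions used in this analysis.

```cpp
// Illustrative blinding guard; the m^2_miss boundaries (GeV^2/c^4) below are
// placeholders, not the official signal-region definitions.
#include <utility>
#include <vector>

bool IsInBlindedRegion(double mMiss2) {
    static const std::vector<std::pair<double, double>> kSignalRegions = {
        {0.000, 0.010},   // "region 1" (placeholder boundaries)
        {0.026, 0.068}    // "region 2" (placeholder boundaries)
    };
    for (const auto& region : kSignalRegions) {
        if (mMiss2 > region.first && mMiss2 < region.second) return true;
    }
    return false;
}

// In the event loop, candidates falling in a blinded region would simply be
// skipped (not plotted or counted) until the unblinding procedure is agreed:
//   if (IsInBlindedRegion(mMiss2)) return;
```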
1. Updated the version of the user directory files and started testing [in progress]
Copied over a recent version of Giuseppe's codes (after a full backup). We need to check that these all work as intended with 2017 data, that they are the most recent versions of the codes, and whether any new codes are missing.
- Starting with a test run on just TrackAnalysis (responsible for track to sub-detector matching, but no cuts) and TwoPhotonAnalysis (responsible for the pi0 variables), with just GigaTrackerEvtReco as a pre-analyser and with the usual dependency libraries. This was done with a single reprocessed 2017 golden-run file: run 8252, filtered for Pnn, bursts 13-15.
Using Kμ2 as a normalisation sample [done]
The overall aim of NA62 is to measure the branching fraction (using BR as the canonical shorthand from here) of the decay K+→π+νν. In order to do so, we must account for both statistical and systematic errors. If we measure the BR and normalise the number of observed signal events by the number of events observed in one of the primary kaon decay channels (μ+ν or π+π0), many of the major systematics cancel. If we use both primary decays as normalisation samples and compare the results, we can check that we are properly accounting for all systematics, as both should give the same answer. First we use the number of observed events of decay i:
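The expression this sentence introduces did not survive in the page; a standard form of the normalisation relation, written here as an assumption about what it likely looks like rather than as the exact formula used in this analysis, is:

```latex
% N_i: observed events in channel i, N_K: kaon decays in the fiducial volume,
% A_i: acceptance, eps_i: combined trigger and selection efficiency.
N_i = N_K \, \mathrm{BR}_i \, A_i \, \varepsilon_i
\quad \Rightarrow \quad
\mathrm{BR}(K^+ \to \pi^+ \nu \bar{\nu}) =
  \frac{N_{\pi\nu\bar{\nu}}}{N_{\mu\nu}} \,
  \frac{A_{\mu\nu}\,\varepsilon_{\mu\nu}}{A_{\pi\nu\bar{\nu}}\,\varepsilon_{\pi\nu\bar{\nu}}} \,
  \mathrm{BR}(K^+ \to \mu^+ \nu)
```

In this ratio the kaon flux and the common acceptances and efficiencies cancel; the items listed below are the remaining efficiencies that need to be evaluated separately for the two channels.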
- Muon ID efficiency [should be done from the MC run]
- Trigger efficiencies
- Anything we missed?
The sequence of processes involved in NA62 Pnn (and similar) data analysis
This section is written so that we can later discuss the efficiencies of the NA62 analysis and identify which of them do not cancel between the Pnn channel and the muon normalisation.
Summary:
- The events occur in the detector
- Detector information is processed in the triggers and stored on Castor
- Raw data is reconstructed by the framework then the reconstructed data is filtered to purpose and stored on EOS
- Filtered data is then processed by the user directory files
- KaonEventAnalysis.cc then processes the data in stages
1: The events occur in the detector
Beam particles (mostly pions and protons, with 6% kaons) exit the T10 target and pass through the collimators, magnetic fields and GTK, possibly interacting or decaying (largely muon and photon decays from the pions). The background from matter emissions (energies too low) and from high-energy cosmic particles (low rate below ground and usually the wrong kinematics) is negligible.
2: Detector information is processed in the triggers and stored on Castor
The L0TP forms the L0 trigger decision from the detector signals, then the PC farm processes L1 and auto-passes L2 (assuming all signals are present). The Mergers then buffer the events and write them to Castor.
3: Raw data is reconstructed by the framework then the reconstructed data is filtered to purpose and stored on EOS
The data is reconstructed using a version or revision of the framework that depends on when the data was taken; the reconstruction efficiency matters at this stage.
The Pnn filtering code (or one of the other filters) is then used to reduce the file sizes and separate the events according to the analysis group that will use them; an illustrative sketch of such a filter follows.
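For illustration only, a filter of this kind is essentially a per-event predicate run over the reconstructed output. The struct fields and conditions below are hypothetical placeholders, not the actual Pnn filter definition.

```cpp
// Illustrative shape of a stream filter applied to reconstructed events.
// Fields and conditions are hypothetical placeholders.
struct RecoSummary {
    int  nSpectrometerTracks;  // number of reconstructed downstream tracks
    bool hasInTimeKaonTag;     // in-time kaon identification upstream
};

// Keep an event for the Pnn stream only if it looks like a single-track
// kaon-decay candidate (placeholder conditions).
bool PassPnnFilter(const RecoSummary& ev) {
    return ev.nSpectrometerTracks == 1 && ev.hasInTimeKaonTag;
}
```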
4: Filtered data is then processed by the user directory files
User directory pre-analysing files: GigaTrackerEvtReco, TwoPhotonAnalysis.cc, TrackAnalysis.cc and OneTrackEventAnalysis.cc.
5: KaonEventAnalysis.cc then processes the data in stages (see the skeleton sketch after this list)
- The main function containing the base analysis
- Start of Job, Run then Burst; Initialise histograms, trees and output
- Process each event and call the relevant analysis functions
- Run the specific analysis function on each event that passed the previous stages
- Post-processing with: PostProcess; End of Burst, Run, Job; DrawPlot
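To make the stages concrete, below is a rough skeleton of how such an analyser is laid out in the framework. The stage method names are the analyser interface as I recall it and the bodies are empty, so treat this as a sketch of the structure rather than the actual KaonEventAnalysis.cc.

```cpp
// Sketch of the stage structure of a framework analyser (not the real code).
#include "Analyzer.hh"   // NA62Analysis base class (assumed include name)

class KaonEventAnalysis : public NA62Analysis::Analyzer {
public:
    explicit KaonEventAnalysis(NA62Analysis::Core::BaseAnalysis* ba)
        : Analyzer(ba, "KaonEventAnalysis") {
        // Request the reconstructed trees needed by the analysis here.
    }

    // Start of job: book histograms and register output trees/branches.
    void InitHist() override {}
    void InitOutput() override {}

    // Start of run and burst.
    void StartOfRunUser() override {}
    void StartOfBurstUser() override {}

    // Per-event processing: apply the staged selection, then run the
    // channel-specific analysis functions on events that survive.
    void Process(int iEvent) override { (void)iEvent; }
    void PostProcess() override {}

    // End of burst, run and job, plus the optional plotting stage.
    void EndOfBurstUser() override {}
    void EndOfRunUser() override {}
    void EndOfJobUser() override {}
    void DrawPlot() override {}
};
```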
GTK3 interaction MC work [Cancelled]
- Altering the Geant4 setup to change the hadronic interaction probability in GTK3 to 1, or to reject non-interacting events (the more likely solution; a sketch of the rejection idea follows this list)
- Generate on the order of 100M events
- Compare with data
- Make an estimation of this background for the PNN errors
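The "reject non-interacting events" option would amount to flagging events in which a hadronic interaction occurs inside GTK3 and discarding the rest. A minimal Geant4 sketch of that idea is below; the volume name "GTK3", the class names and the end-of-event filtering are my assumptions, not part of the NA62 MC code.

```cpp
// Sketch: flag events with a hadronic interaction in GTK3, drop the others.
// Volume name "GTK3" and the filtering strategy are assumptions.
#include "G4UserSteppingAction.hh"
#include "G4UserEventAction.hh"
#include "G4Step.hh"
#include "G4Event.hh"
#include "G4VProcess.hh"

namespace { bool gGTK3Interacted = false; }

class GTK3SteppingAction : public G4UserSteppingAction {
public:
    void UserSteppingAction(const G4Step* step) override {
        const G4VProcess* proc = step->GetPostStepPoint()->GetProcessDefinedStep();
        if (!proc || proc->GetProcessType() != fHadronic) return;
        if (step->GetPreStepPoint()->GetPhysicalVolume()->GetName() == "GTK3")
            gGTK3Interacted = true;
    }
};

class GTK3EventAction : public G4UserEventAction {
public:
    void BeginOfEventAction(const G4Event*) override { gGTK3Interacted = false; }
    void EndOfEventAction(const G4Event*) override {
        if (!gGTK3Interacted) {
            // Skip persisting this event; only interacting events are kept.
        }
    }
};
```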
Cancelled, as generating the required statistics isn't feasible.
This has been left for the experts to look into further, as considerable work would be required to make it possible.
Initial work completed to set up the framework and user directory codes:
Build flag issue with --old-specialtrigger, causing a dependency on the framework's UserMethods.cc file.
- Everything works as expected if you comment out the OLD_SPECIALTRIGGER block as described in the analysisbuild.txt readme file and replace it manually with whichever trigger you want to use (either choice works, or it complains at runtime that you are using the wrong one); see the schematic after this list. However, you then have a dependency on a framework file.
- When using the flag, it seems that the "#define OLD_SPECIALTRIGGER" line in NA62AnalysisBuilder.py causes this definition to become stuck in the pre-processor, such that it continues to be defined if you try to build without the flag at a later stage
- Solution 1: run a CleanAll command with NA62AnalysisBuilder.py then re-source the env.sh file
- Solution 2: CleanAll then log out of your ssh session and log back in, source then build
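The manual workaround above effectively hard-wires one branch of a preprocessor guard. The block below is only a schematic of that kind of guard (it is not the actual UserMethods.cc code); it shows why a stale #define from a previous build changes which branch gets compiled.

```cpp
// Schematic only: not the actual framework code. If OLD_SPECIALTRIGGER is
// still defined from a previous build, the old-format branch keeps being
// compiled even when --old-specialtrigger is no longer passed.
bool UsingOldSpecialTriggerFormat() {
#ifdef OLD_SPECIALTRIGGER
    return true;   // old special-trigger data format
#else
    return false;  // current special-trigger data format
#endif
}
```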
Generating a Pnn sample from the Kaon code given to me by Giuseppe.
- Build fails due to class conflict in the public directory files
- Solution: fixed manually by Giuseppe in the codes, largely by replacing the conflicting "Particle" class with "MyParticle"
- Further run fails due to a special trigger issue not dependent on the build flag
- Solution: a special trigger element of the code needs changing when swapping between MC and data files
- Now working on afs and should be able to set up on any other system (copy placed in: userDir2)
A test analyser, to understand how to generate an analyser from scratch and plot variables in the data files, using the framework as a basis.
- This analyser is now set up such that it builds and begins to run with the current setup; it is designed to record the number of spectrometer candidates (a guess at its core is sketched below), but it fails at runtime due to an issue with a special trigger that is not explicitly used in the code.
- Solution: frameworks are not yet completely backwards compatible; I need to use the --old-specialtrigger flag after "build" to get this to work
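For reference, the core of such a test analyser is essentially one histogram booking plus a per-event fill. The snippet below is a guess at what that looks like with the framework helpers as I recall them (RequestTree, BookHisto, FillHisto, GetEvent); the class name is made up and the other stage methods are omitted.

```cpp
// Guess at the core of the test analyser; helper and method names assumed.
#include "Analyzer.hh"
#include "TRecoSpectrometerEvent.hh"
#include "TH1I.h"

class SpectrometerCandidateCounter : public NA62Analysis::Analyzer {
public:
    explicit SpectrometerCandidateCounter(NA62Analysis::Core::BaseAnalysis* ba)
        : Analyzer(ba, "SpectrometerCandidateCounter") {
        RequestTree("Spectrometer", new TRecoSpectrometerEvent);  // ask for the reco tree
    }
    void InitHist() override {
        BookHisto(new TH1I("NSpectrometerCandidates",
                           "Spectrometer candidates per event", 21, -0.5, 20.5));
    }
    void Process(int) override {
        auto* specEvent = static_cast<TRecoSpectrometerEvent*>(GetEvent("Spectrometer"));
        if (specEvent) FillHisto("NSpectrometerCandidates", specEvent->GetNCandidates());
    }
    // Remaining stage methods (InitOutput, StartOf/EndOf..., DrawPlot) omitted.
};
```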