Difference: ConnorsAnalysis-AnalysisTimeline (10 vs. 11)

Revision 11 - 2018-01-29 - ConnorGraham

Line: 1 to 1
 
META TOPICPARENT name="ConnorsAnalysis"
Return to the main analysis page.
Line: 11 to 11
 Summary:
  1. The events occur in the detector
  2. Detector information is processed in the triggers and stored on Castor
Changed:
<
<
  1. Raw data is reconstructed by the framework
  2. Reconstructed data is filtered to purpose then stored on EOS
>
>
  1. Raw data is reconstructed by the framework then the reconstructed data is filtered to purpose and stored on EOS
 
  1. Filtered data is then processed by the user directory files
  2. KaonEventAnalysis.cc then processes the data in stages
1: The events occur in the detector
Line: 23 to 22
  The L0TP processes the L0 trigger decision from the detector signals, then the PC farm processes L1 and auto-passes L2 (assuming all signals present). The Mergers then buffer the events and write to Castor.
Changed:
<
<
3: Raw data is reconstructed by the framework
>
>
3: Raw data is reconstructed by the framework then the reconstructed data is filtered to purpose and stored on EOS
  The data is reconstructed using a version or revision of the framework that depends on when the data was taken; reconstruction efficiency is important at this stage.
Deleted:
<
<
4: Reconstructed data is filtered to purpose then stored on EOS
 The Pnn filtering code, among others, is used to reduce the file sizes and separate the events based on the analysis group that will use them.
Changed:
<
<
5: Filtered data is then processed by the user directory files
>
>
4: Filtered data is then processed by the user directory files
 
Changed:
<
<
User directory pre-analysing files: GigaTrackerEvtReco, AccidentalAnalysis.cc, TwoPhotonAnalysis.cc, TrackAnalysis.cc and OneTrackEventAnalysis.cc.
>
>
User directory pre-analysing files: GigaTrackerEvtReco, TwoPhotonAnalysis.cc, TrackAnalysis.cc and OneTrackEventAnalysis.cc.
 
Changed:
<
<
6: KaonEventAnalysis.cc then processes the data in stages
>
>
5: KaonEventAnalysis.cc then processes the data in stages
 
  1. The main function containing the base analysis
Changed:
<
<
  1. Start of Job, Run then Burst
  2. Initialise histograms, trees and output
>
>
  1. Start of Job, Run then Burst; Initialise histograms, trees and output
 
  1. Process each event and call the relevant analysis functions
  2. Run the specific analysis function on each event that passed the previous stages
  3. Post-processing with: PostProcess; End of Burst, Run, Job; DrawPlot
Line: 46 to 42
  The overall aim of NA62 is to measure the branching fraction (using BR as the canonical shorthand from here) of the decay K+→π+νν. In order to do so, we must account for both statistical and systematic errors. Therefore, if we measure the BR and normalise the number of events we observe by dividing it by one of the primary kaon decays (μ+ν or π+π0), we can cancel many of the major systematics. If we use both primary decays as normalisation samples and compare the values, we can check that we are properly accounting for all systematics, as both should give the same result. First we use the number of observed events of decay i:
Changed:
<
<
Ni = fK ⋅ t ⋅ [BR(K→i)] ⋅ Ai ⋅ Ei
>
>
Ni = fK ⋅ t ⋅ [BR(K→i)] ⋅ Ai
 
Changed:
<
<
where fK is the frequency of kaons in the beam, t is the time period of data taking, Ai is the "acceptance" or number of decays in the detector's fiducial region, and Ei is the product of efficiencies ∏r εr over the efficiencies r relating to the trigger, reconstruction, kaon ID, daughter ID, track matching and all other processes used in the analysis (this should cover all contributions, even things like the possibility of events being incorrectly tagged as the decay you are measuring).
>
>
where fK is the frequency of kaons in the beam, t is the time period of data taking and Ai is the "acceptance" or number of decays in the detector's fiducial region (this should cover all contributions, even things like the possibility of events being incorrectly tagged as the decay you are measuring).
  From this we can construct an equation for:
Changed:
<
<
BR(K+→π+νν) = BR(K+→μ+ν) ⋅ (Nπ+νν/Nμ+ν) ⋅ (Aμ+ν/Aπ+νν) ⋅ (Eμ+ν/Eπ+νν)
where the fK and t terms cancel, along with many of the εr efficiencies, and BR(K+→μ+ν) can be taken from the PDG listings as it has been thoroughly measured by previous experiments.
>
>
BR(K+→π+νν) = BR(K+→μ+ν) ⋅ (Nπ+νν/Nμ+ν) ⋅ (Aμ+ν/Aπ+νν)
where the fK and t terms cancel, along with many of the efficiencies included in the acceptance, and BR(K+→μ+ν) can be taken from the PDG listings as it has been thoroughly measured by previous experiments.
  Step 1: Generate a Kμ2 normalisation sample. [done]
  • Start by generating a sample of Kμ2 data with Pnn-like cuts from one burst (current file) and organise some output histograms
Line: 64 to 60
 
  • Select muon events at MC truth level [starting]
  • Take basic acceptance cut of 105 (possibly 115) to 165m
  • Bin "passed" events by momentum as has been done in previous studies (15-20, 20-25, 25-30, 30-35 GeV)
Changed:
<
<
  • Calculate binned acceptance by dividing by all events, recorded using the same binning system
>
>
  • Calculate the binned acceptance by dividing the events that pass the Kmu2 cuts, recorded using the same binning system, by the geometric-acceptance event counts in each bin
 Step 3: Run on as much 2016 data as possible with HTCondor to calculate a value for "Nμ+ν".
Changed:
<
<
Step 4: Start looking at the efficiencies "εr".
  • First, look at the pion ID efficiency
>
>
  • Run selection on run number 6431 (large, good quality run) to start with
  • Run on Giuseppe's list of all good runs of 2016
Step 4: Start looking at the efficiencies "εr" that don't cancel in the acceptance ratio.
  • Pion ID efficiency.
  • Matter interaction differences of muon and pion.
  • Anything we missed?
 

Run the (updated) 2016 cuts-based analysis on the 2017 data [future]

  1. Finish updating the user directory files (specifically CHODAnalysis) so that KaonEventAnalysis.cc runs correctly on 2017 data (check you are using the correct framework revision)
  2. Run on one file to check functionality
 