export SCRAM_ARCH=slc7_amd64_gcc700
cmsrel CMSSW_10_6_30
cd CMSSW_10_6_30/src/
cmsenv
For the following step you should have an SSH key associated with your GitHub account. For more information, see connecting-to-github-with-ssh-key.
git clone -b master git@github.com:CMS-HSCP/SUSYBSMAnalysis-HSCP.git SUSYBSMAnalysis
To compile the code, run
scram b -j
Copy the relevant scripts to `src/`:
# make sure you are under CMSSW_10_6_30/src/
cp SUSYBSMAnalysis/Analyzer/test/Tamas/submitCrabJobs* .
cp SUSYBSMAnalysis/Analyzer/test/Tamas/HSCParticleProducerAnalyzer_master_cfg.py .
cp SUSYBSMAnalysis/Analyzer/test/Tamas/HSCParticleProducerAnalyzer_2018_mc_cfg.py .
cp SUSYBSMAnalysis/Analyzer/test/Tamas/HSCParticleProducerAnalyzer_2018_SignalMC_cfg.py .
**Customize to your setup**
sed -i 's/tvami/<yourUserName>/g' submitCrabJobs*
# make sure you have a site to write to: `crab checkwrite --site=<yourTierTwoSite>`
sed -i 's/T2_HU_Budapest/<yourTierTwoSite>/g' submitCrabJobs*
mkdir SUSYBSMAnalysis/Analyzer/test/<yourName>
cp submitCrabJobs* SUSYBSMAnalysis/Analyzer/test/<yourName>
# Do not overwrite Tamas's files in the Tamas folder
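The two `sed` commands above simply swap the default username and storage site for your own everywhere in the submit scripts. A minimal Python sketch of the same substitution (the config line below is hypothetical, not copied from the real scripts):

```python
import re

# Hypothetical snippet of a CRAB submit script before customization
script = (
    "config.Data.outLFNDirBase = '/store/user/tvami/HSCP'\n"
    "config.Site.storageSite = 'T2_HU_Budapest'\n"
)

# Equivalent of the two sed replacements
script = re.sub(r"tvami", "<yourUserName>", script)
script = re.sub(r"T2_HU_Budapest", "<yourTierTwoSite>", script)
print(script)
```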
Get proxy
voms-proxy-init --voms cms -valid 192:00
Submit to CRAB depending on what you want to run
- `submitCrabJobsSignalsMT.py` will submit all the 100 signal points we have.
- `submitCrabJobsAll.py` submits data + bkg MC + gluino / stau signals.
How to submit:
python3 submitCrabJobsAll.py <XvY>
where XvY is the versioning, e.g.
python3 submitCrabJobsAll.py 46p6
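The `<XvY>` argument is just a version tag that ends up in the names of the CRAB requests, so every (re)submission round is kept apart. A minimal sketch of how a submit script might consume it (the naming scheme below is an assumption, not the real one in `submitCrabJobsAll.py`):

```python
import sys

def request_name(process: str, version: str) -> str:
    """Build a versioned request name, e.g. 'Data_2018_46p6' (hypothetical scheme)."""
    return f"{process}_{version}"

# Take the version tag from the command line, defaulting for illustration
version = sys.argv[1] if len(sys.argv) > 1 else "46p6"
print(request_name("Data_2018", version))
```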
Status checks and downloading files
# Make sure you are still under `/src/`
cp SUSYBSMAnalysis/Analyzer/test/Tamas/statusCrabJobsMT.py .
python3 statusCrabJobsMT.py <XvY>
# To download
cp SUSYBSMAnalysis/Analyzer/test/Tamas/downloadCrabJobs* .
# Depending on how many people use the machine / how many jobs are running, use the MT (multi-threaded) option or not (MT will be much faster but puts a big strain on the machine used)
python3 downloadCrabJobs.py <XvY>
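The MT variant gains its speed by running several downloads in parallel instead of one after another. A minimal sketch of the idea with a thread pool, using stub tasks in place of the real CRAB calls:

```python
from concurrent.futures import ThreadPoolExecutor

def download(task: str) -> str:
    # In the real script this would shell out to CRAB; here it is a stub.
    return f"downloaded {task}"

# Hypothetical task directory names, for illustration only
tasks = ["crab_Data_2018A", "crab_Data_2018B", "crab_Gluino1800"]

# The serial script would loop over `tasks`; the MT one maps them
# onto worker threads (map preserves the input order of results).
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(download, tasks))
print(results)
```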
hadd files / scale histograms to correct yield
cp SUSYBSMAnalysis/Analyzer/test/Tamas/haddCrabJobsMT.py .
python3 haddCrabJobsMT.py <XvY>
# This now creates a single root file for each process
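Conceptually, `hadd` merges ROOT files by summing the bin contents of same-named histograms across all job outputs. A pure-Python sketch of that operation, with histograms represented as plain lists of bin contents:

```python
def hadd(histograms):
    """Element-wise sum of equally binned histograms (hadd in miniature)."""
    merged = [0.0] * len(histograms[0])
    for h in histograms:
        for i, content in enumerate(h):
            merged[i] += content
    return merged

# Two CRAB jobs filled the same 3-bin histogram; merge their outputs
job1 = [1.0, 4.0, 2.0]
job2 = [0.0, 3.0, 5.0]
print(hadd([job1, job2]))  # [1.0, 7.0, 7.0]
```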
# Rescale and print out further hadd commands
cp SUSYBSMAnalysis/Analyzer/test/Tamas/rescaleAndPrintHadd.py .
python3 rescaleAndPrintHadd.py <XvY>
# This now scales the signal / bkg MC yields to 100 /fb
# And prints out further hadd commands to add up the bkg MC / different eras in a year
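The scaling follows the standard MC normalization, weight = L_target x sigma / N_generated; a short sketch of the arithmetic (the numbers below are made up for illustration, not real sample parameters):

```python
def mc_scale(target_lumi_fb: float, xsec_pb: float, n_generated: int) -> float:
    """Weight that normalizes an MC sample's yield to the target luminosity.

    Cross sections are quoted in pb and luminosity in /fb,
    so convert with 1 fb^-1 = 1000 pb^-1 before multiplying.
    """
    return target_lumi_fb * 1000.0 * xsec_pb / n_generated

# e.g. a sample with a 2 pb cross section and 1e6 generated events,
# scaled to 100 /fb:
print(mc_scale(100.0, 2.0, 1_000_000))  # 0.2
```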
This is done in the https://github.com/tvami/HSCPbackgroundPred repo using 2DAlphabet for the ionization method
Analysis Type | Selection
---|---
Type 0 | Tk only
Type 1 | Tk+Muon
Type 2 | Tk+TOF
Type 3 | TOF only
Type 4 | Q<1 (FCP)
Type 5 | Q>1
FCPs are not done in this group anymore.
If you need to set up CRAB
Setup CRAB environment
source /cvmfs/cms.cern.ch/common/crab-setup.sh
**If you want to check on individual jobs:**
The following directory will be created: `crab_projects/crab_<request-name>`.
- To get status: `crab status -d crab_projects/crab_<request-name>`
- To resubmit (killed and failed jobs): `crab resubmit -d crab_projects/crab_<request-name>`
- To retrieve the output: `crab getoutput -d crab_projects/crab_<request-name> [--jobids id1,id2]`
- To get report (processed lumi json): `crab report -d crab_projects/crab_<request-name>`
- To get the corresponding integrated lumi, see the section [Compute Lumi](#compute-lumi).
Copy the script:
cp Analyzer/test/compareRootFiles.py .
python compareRootFiles.py -h
This script takes two ROOT files (set inside the script) and compares their histograms with a Kolmogorov test. Any difference is saved in `differences.txt`.
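The comparison can be sketched as follows: normalize both histograms, build their cumulative distributions, and take the maximum absolute difference (the Kolmogorov distance). This is a simplified stand-in for what ROOT's `KolmogorovTest` computes, assuming equal binning:

```python
def ks_distance(h1, h2):
    """Maximum distance between the normalized CDFs of two binned histograms."""
    n1, n2 = sum(h1), sum(h2)
    c1 = c2 = 0.0
    dmax = 0.0
    for a, b in zip(h1, h2):
        c1 += a / n1
        c2 += b / n2
        dmax = max(dmax, abs(c1 - c2))
    return dmax

# Histograms with the same shape agree after normalization
print(ks_distance([1, 2, 3], [2, 4, 6]))
```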
For any support/suggestions, mail to Tamas.Almos.VamiSPAMNOT@cern.ch