Line: 1 to 1
Grid Services
Line: 52 to 57
Submitting a job
Jobs are submitted to a Compute Element (CE). The ScotGrid site at Glasgow has four CEs:
Changed:
< < svr009.gla.scotgrid.ac.uk
svr010.gla.scotgrid.ac.uk
svr011.gla.scotgrid.ac.uk
svr019.gla.scotgrid.ac.uk
> > ce01.gla.scotgrid.ac.uk
ce02.gla.scotgrid.ac.uk
ce03.gla.scotgrid.ac.uk
ce04.gla.scotgrid.ac.uk
It does not matter which CE you choose to submit to. (If you've looked at the tutorial linked above, you'll see that Durham gave their CEs the sensible names ce1 , ce2 , etc. We thought that would be too easy.)
Line: 61 to 67
It does not matter which CE you choose to submit to. (If you've looked at the tutorial linked above, you'll see that Durham gave their CEs the sensible names ce1 , ce2 , etc. We thought that would be too easy.)
Jobs are submitted using the arcsub command:
Added:
> > $ arcsub -c <CE_HOSTNAME> <XRSL_FILENAME>
Changed:
< < For example, to submit test.xrsl to svr011 at Glasgow:
$ arcsub -c svr011.gla.scotgrid.ac.uk test.xrsl
Job submitted with jobid: gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
> > For example, to submit test.xrsl to ce03 at Glasgow:
$ arcsub -c ce03.gla.scotgrid.ac.uk test.xrsl
Job submitted with jobid: gsiftp://ce03.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
When a job is submitted successfully, you will be presented with its job ID, which can be used to refer to the job later. Information about submitted jobs is also recorded in a job list file; by default, this file is ~/.arc/jobs.dat (~/.arc/jobs.xml with some earlier versions of ARC), but you can choose a different location by supplying the -j argument to arcsub :
Line: 70 to 78
When a job is submitted successfully, you will be presented with its job ID, which can be used to refer to the job later. Information about submitted jobs is also recorded in a job list file; by default, this file is ~/.arc/jobs.dat (~/.arc/jobs.xml with some earlier versions of ARC), but you can choose a different location by supplying the -j argument to arcsub :
Added:
> > $ arcsub -j <JOBLIST_FILENAME> -c <CE_HOSTNAME> <XRSL_FILENAME>
For example:
Changed:
< < $ arcsub -j test.dat -c svr011.gla.scotgrid.ac.uk test.xrsl
> > $ arcsub -j test.dat -c ce03.gla.scotgrid.ac.uk test.xrsl
Querying the status of a job
Line: 80 to 90
Querying the status of a job
You can obtain information about the status of jobs using the arcstat command:
Added:
> > $ arcstat <JOB_ID>
For example, to obtain information about the job submitted in the previous step:
Changed:
< < $ arcstat gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Job: gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
> > $ arcstat gsiftp://ce03.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Job: gsiftp://ce03.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Name: StageJob
State: Queuing
Line: 101 to 114
Retrieving job output
Output and log files for a job can be retrieved using the arcget command. As when querying the status of a job, you can use either a job ID or a job list file with this command:
Added:
> > $ arcget <JOB_ID>
$ arcget -j <JOBLIST_FILENAME>
For example, to get the output of the job submitted above:
Changed:
< < $ arcget gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
> > $ arcget gsiftp://ce03.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Results stored at: p6vLDmj3kwrnZ4eC3pmXXsQmABFKDmABFKDm9pFKDmABFKDmtVM1wm
Jobs processed: 1, successfully retrieved: 1, successfully cleaned: 1
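The string after "Results stored at:" is a directory created under your current working directory and named after the job ID hash. As a quick sketch (assuming the example job above, which declared stdout and stderr files), you can list it to see what came back:
$ ls p6vLDmj3kwrnZ4eC3pmXXsQmABFKDmABFKDm9pFKDmABFKDmtVM1wm/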
Line: 1 to 1
Grid Services
Line: 7 to 7
ARC tools
The tools required for grid job submission and management are available from CVMFS:
Deleted:
< < export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
alias setupATLAS='source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh'
setupATLAS
Changed:
< < lsetup emi
If you plan to submit jobs to the ScotGrid VO, at present you must also amend the X509_VOMSES environment variable as follows:
> > lsetup emi
Changed:
< < export X509_VOMSES=/etc/vomses
> > If you plan to submit jobs to the ScotGrid VO, at present you must also amend the X509_VOMSES environment variable as follows:
export X509_VOMSES=/etc/vomses
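Once lsetup emi has completed (and X509_VOMSES has been amended, if you need it), a quick sanity check that the ARC client tools are now on your PATH is:
$ command -v arcproxy arcsub arcstat arcget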
Certificates and proxies
To use grid resources, you will need a certificate, from which you can generate a proxy certificate. The proxy certificate has a relatively short lifetime, and is used to actually submit the job. A proxy is associated with a particular Virtual Organisation (VO), for example vo.scotgrid.ac.uk , which is selected when it is created. You can generate a proxy using the arcproxy command:
Changed:
< < $ arcproxy -S <VO_ALIAS> -N
> > $ arcproxy -S <VO_ALIAS> -N
For example, to generate a proxy for the vo.scotgrid.ac.uk VO:
Changed:
< < $ arcproxy -S vo.scotgrid.ac.uk -N
> > $ arcproxy -S vo.scotgrid.ac.uk -N
Your identity: /C=UK/O=eScience/OU=Glasgow/L=Compserv/CN=bugs bunny
Contacting VOMS server (named vo.scotgrid.ac.uk): voms.gridpp.ac.uk on port: 15509
Proxy generation succeeded
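To confirm that a proxy exists and see how long it has left, arcproxy can also print information about the current proxy (the -I option; if your ARC version differs, arcproxy --help lists the supported options):
$ arcproxy -I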
Line: 43 to 35
Job description (xRSL)
Before submitting a job, you need to create a file which describes the features of the job for ARC (its executable, the names of input and output files, what to do with logs, etc.). This file is written in the Extended Resource Specification Language (xRSL). A simple job description which runs a script called test.sh could look like this:
Changed:
< < &
> > &
(executable = "test.sh")
(arguments = "")
(jobName = "TestJob")
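The test.sh script itself is not shown on this page; any executable script will do. A minimal, purely hypothetical example that just proves the job ran somewhere:
#!/bin/bash
# Report where and when the job executed on the worker node.
echo "Running on $(hostname) at $(date)"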
Line: 61 to 51
Submitting a job
Changed:
< < Jobs are submitted to a Compute Element (CE). The ScotGrid site at Glasgow has four CEs:
svr009.gla.scotgrid.ac.uk
> > Jobs are submitted to a Compute Element (CE). The ScotGrid site at Glasgow has four CEs:
svr009.gla.scotgrid.ac.uk
svr010.gla.scotgrid.ac.uk
svr011.gla.scotgrid.ac.uk
svr019.gla.scotgrid.ac.uk
Line: 73 to 61
It does not matter which CE you choose to submit to. (If you've looked at the tutorial linked above, you'll see that Durham gave their CEs the sensible names ce1 , ce2 , etc. We thought that would be too easy.)
Jobs are submitted using the arcsub command:
Changed:
< < $ arcsub -c <CE_HOSTNAME> <XRSL_FILENAME>
> > $ arcsub -c <CE_HOSTNAME> <XRSL_FILENAME>
For example, to submit test.xrsl to svr011 at Glasgow:
Changed:
< < $ arcsub -c svr011.gla.scotgrid.ac.uk test.xrsl
> > $ arcsub -c svr011.gla.scotgrid.ac.uk test.xrsl
Job submitted with jobid: gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
When a job is submitted successfully, you will be presented with its job ID, which can be used to refer to the job later. Information about submitted jobs is also recorded in a job list file; by default, this file is ~/.arc/jobs.dat (~/.arc/jobs.xml with some earlier versions of ARC), but you can choose a different location by supplying the -j argument to arcsub :
Changed:
< < $ arcsub -j <JOBLIST_FILENAME> -c <CE_HOSTNAME> <XRSL_FILENAME>
> > $ arcsub -j <JOBLIST_FILENAME> -c <CE_HOSTNAME> <XRSL_FILENAME>
For example:
Changed:
< < $ arcsub -j test.dat -c svr011.gla.scotgrid.ac.uk test.xrsl
> > $ arcsub -j test.dat -c svr011.gla.scotgrid.ac.uk test.xrsl
Querying the status of a job
You can obtain information about the status of jobs using the arcstat command:
Changed:
< < $ arcstat <JOB_ID>
> > $ arcstat <JOB_ID>
For example, to obtain information about the job submitted in the previous step:
Changed:
< < $ arcstat gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
> > $ arcstat gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Job: gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Name: StageJob
State: Queuing
Line: 119 to 95
You may have to wait a few minutes after submitting a job before status information becomes available. You can also query the status of all jobs in a job list file:
Changed:
< < $ arcstat -j <JOBLIST_FILENAME>
> > $ arcstat -j <JOBLIST_FILENAME>
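Since status information lags submission, one option is to poll the job list in a simple shell loop (a sketch; choose an interval that is kind to the information system):
$ while true; do arcstat -j test.dat; sleep 300; done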
Retrieving job output
Output and log files for a job can be retrieved using the arcget command. As when querying the status of a job, you can use either a job ID or a job list file with this command:
Changed:
< < $ arcget <JOB_ID>
> > $ arcget <JOB_ID>
$ arcget -j <JOBLIST_FILENAME>
For example, to get the output of the job submitted above:
Changed:
< < $ arcget gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
> > $ arcget gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Results stored at: p6vLDmj3kwrnZ4eC3pmXXsQmABFKDmABFKDm9pFKDmABFKDmtVM1wm
Jobs processed: 1, successfully retrieved: 1, successfully cleaned: 1
Line: 146 to 116
Copying input and output files ("staging")
You can tell ARC to copy input and output files to and from the compute element by including additional attributes in your xRSL file:
Changed:
< < &
> > &
(executable = "test.sh")
(arguments = "")
(jobName = "TestJob")
Line: 163 to 131
Files used in the executable , stdout and stderr attributes are transferred automatically, but other files should be listed in the inputFiles or outputFiles attribute as necessary. The inputFiles and outputFiles attributes each take one or more values like this:
Changed:
< < ("<FILENAME>" "<URL>")
> > ("<FILENAME>" "<URL>")
Where <URL> is left blank, ARC transfers the file to or from the submission machine (this would be the case for input.dat , output.txt and results.tgz in the example xRSL above). Alternatively, a URL may be provided to copy the file to or from a remote resource:
Changed:
< < ("index.html" "http://www.example.org/index.html")
> > ("index.html" "http://www.example.org/index.html")
("rabbits.zip" "ftp://ftp.example.org/rabbits.zip")
("values.dat" "gsiftp://gridftp.example.org/data/values.dat")
Line: 1 to 1
Grid Services
Line: 51 to 51
(jobName = "TestJob")
(stdout = "stdout")
(stderr = "stderr")
Changed:
< < (gmlog = "test.log")
> > (gmlog = "gmlog")
(walltime="60")
Line: 85 to 85
Job submitted with jobid: gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Changed:
< < When a job is submitted successfully, you will be presented with its job ID, which can be used to refer to the job later. Information about submitted jobs is also recorded in a job list file; by default, this file is ~/.arc/jobs.dat (~/.arc/jobs.xml with some versions of ARC), but you can choose a different location by supplying the -j argument to arcsub :
> > When a job is submitted successfully, you will be presented with its job ID, which can be used to refer to the job later. Information about submitted jobs is also recorded in a job list file; by default, this file is ~/.arc/jobs.dat (~/.arc/jobs.xml with some earlier versions of ARC), but you can choose a different location by supplying the -j argument to arcsub :
$ arcsub -j <JOBLIST_FILENAME> -c <CE_HOSTNAME> <XRSL_FILENAME>
Line: 142 to 142
You will only be able to retrieve job output once the job has finished.
Added:
> > Copying input and output files ("staging")
You can tell ARC to copy input and output files to and from the compute element by including additional attributes in your xRSL file:
&
(executable = "test.sh")
(arguments = "")
(jobName = "TestJob")
(inputFiles = ("input.dat" ""))
(outputFiles = ("output.txt" "") ("results.tgz" ""))
(stdout = "stdout")
(stderr = "stderr")
(gmlog = "gmlog")
(walltime="60")
Files used in the executable , stdout and stderr attributes are transferred automatically, but other files should be listed in the inputFiles or outputFiles attribute as necessary. The inputFiles and outputFiles attributes each take one or more values like this:
("<FILENAME>" "<URL>")
Where <URL> is left blank, ARC transfers the file to or from the submission machine (this would be the case for input.dat , output.txt and results.tgz in the example xRSL above). Alternatively, a URL may be provided to copy the file to or from a remote resource:
("index.html" "http://www.example.org/index.html")
("rabbits.zip" "ftp://ftp.example.org/rabbits.zip")
("values.dat" "gsiftp://gridftp.example.org/data/values.dat")
Various protocols are supported, including Rucio and SRM, and details can be found in the ARC reference manual: http://www.nordugrid.org/documents/xrsl.pdf
Due to the slightly convoluted path files follow to make their way from the submission machine through the CE to the compute node, it is easiest to avoid using paths when specifying outputFiles . Instead, if files are created in subdirectories, it may be simpler to copy these files back to $HOME at the end of the script (this is a working directory belonging to your job, and is not related to your home directory). You may also wish to add multiple output files or directories to an archive, in order to simplify the process of retrieving results further.
Line: 1 to 1
Grid Services
Line: 27 to 27
To use grid resources, you will need a certificate, from which you can generate a proxy certificate. The proxy certificate has a relatively short lifetime, and is used to actually submit the job. A proxy is associated with a particular Virtual Organisation (VO), for example vo.scotgrid.ac.uk , which is selected when it is created. You can generate a proxy using the arcproxy command:
Changed:
< < arcproxy -S <VO_ALIAS> -N
> > $ arcproxy -S <VO_ALIAS> -N
For example, to generate a proxy for the vo.scotgrid.ac.uk VO:
Changed:
< < arcproxy -S vo.scotgrid.ac.uk -N
> > $ arcproxy -S vo.scotgrid.ac.uk -N
Your identity: /C=UK/O=eScience/OU=Glasgow/L=Compserv/CN=bugs bunny
Contacting VOMS server (named vo.scotgrid.ac.uk): voms.gridpp.ac.uk on port: 15509
Proxy generation succeeded
Your proxy is valid until: 2018-01-19 23:36:59
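If the default lifetime is too short for your workflow, arcproxy accepts constraints such as validityPeriod; this is a sketch based on the NorduGrid documentation, so check the syntax against your installed version:
$ arcproxy -S vo.scotgrid.ac.uk -N -c validityPeriod=24h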
Job description (xRSL)
Line: 71 to 75
Jobs are submitted using the arcsub command:
Changed:
< < arcsub -j <DATABASE_FILENAME> -c <CE_HOSTNAME> <XRSL_FILENAME>
> > $ arcsub -c <CE_HOSTNAME> <XRSL_FILENAME>
For example, to submit test.xrsl to svr011 at Glasgow:
Changed:
< < arcsub -j test.db -c svr011.gla.scotgrid.ac.uk test.xrsl
> > $ arcsub -c svr011.gla.scotgrid.ac.uk test.xrsl
Job submitted with jobid: gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Added:
> > When a job is submitted successfully, you will be presented with its job ID, which can be used to refer to the job later. Information about submitted jobs is also recorded in a job list file; by default, this file is ~/.arc/jobs.dat (~/.arc/jobs.xml with some versions of ARC), but you can choose a different location by supplying the -j argument to arcsub :
$ arcsub -j <JOBLIST_FILENAME> -c <CE_HOSTNAME> <XRSL_FILENAME>
For example:
$ arcsub -j test.dat -c svr011.gla.scotgrid.ac.uk test.xrsl
Querying the status of a job
You can obtain information about the status of jobs using the arcstat command:
$ arcstat <JOB_ID>
For example, to obtain information about the job submitted in the previous step:
$ arcstat gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Job: gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Name: StageJob
State: Queuing
Status of 1 jobs was queried, 1 jobs returned information
You may have to wait a few minutes after submitting a job before status information becomes available. You can also query the status of all jobs in a job list file:
$ arcstat -j <JOBLIST_FILENAME>
Retrieving job output
Output and log files for a job can be retrieved using the arcget command. As when querying the status of a job, you can use either a job ID or a job list file with this command:
$ arcget <JOB_ID>
$ arcget -j <JOBLIST_FILENAME>
For example, to get the output of the job submitted above:
$ arcget gsiftp://svr011.gla.scotgrid.ac.uk:2811/jobs/NCKLDmEQkwrnZ4eC5pmRAbBiTBFKDmABFKDmpMFKDmABFKDmQffBxy
Results stored at: p6vLDmj3kwrnZ4eC3pmXXsQmABFKDmABFKDm9pFKDmABFKDmtVM1wm
Jobs processed: 1, successfully retrieved: 1, successfully cleaned: 1
You will only be able to retrieve job output once the job has finished.
Line: 1 to 1
Grid Services
Line: 16 to 16
lsetup emi
Added:
> > If you plan to submit jobs to the ScotGrid VO, at present you must also amend the X509_VOMSES environment variable as follows:
export X509_VOMSES=/etc/vomses
Certificates and proxies
To use grid resources, you will need a certificate, from which you can generate a proxy certificate. The proxy certificate has a relatively short lifetime, and is used to actually submit the job. A proxy is associated with a particular Virtual Organisation (VO), for example vo.scotgrid.ac.uk , which is selected when it is created. You can generate a proxy using the arcproxy command:
Line: 50 to 56
http://www.nordugrid.org/documents/xrsl.pdf
Submitting a job
Added:
> > Jobs are submitted to a Compute Element (CE). The ScotGrid site at Glasgow has four CEs:
svr009.gla.scotgrid.ac.uk
svr010.gla.scotgrid.ac.uk
svr011.gla.scotgrid.ac.uk
svr019.gla.scotgrid.ac.uk
It does not matter which CE you choose to submit to. (If you've looked at the tutorial linked above, you'll see that Durham gave their CEs the sensible names ce1 , ce2 , etc. We thought that would be too easy.)
Jobs are submitted using the arcsub command:
arcsub -j <DATABASE_FILENAME> -c <CE_HOSTNAME> <XRSL_FILENAME>
For example, to submit test.xrsl to svr011 at Glasgow:
arcsub -j test.db -c svr011.gla.scotgrid.ac.uk test.xrsl
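Because each submission can append to the same job list file, a batch of jobs is easy to manage from the shell; a sketch, assuming a set of hypothetical job*.xrsl files:
for f in job*.xrsl; do arcsub -j test.db -c svr011.gla.scotgrid.ac.uk "$f"; done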
Line: 1 to 1
Grid Services
Changed:
< < Grid User Interface
> > Jobs can be submitted to grid resources using the ARC tools, which are available in CVMFS. Our colleagues in Durham have written a good introductory tutorial; a summary of the steps required to submit and manage jobs, adapted for Glasgow users, is given below.
ARC tools
The tools required for grid job submission and management are available from CVMFS:
Line: 13 to 15
lsetup emi
Added:
> > Certificates and proxies
To use grid resources, you will need a certificate, from which you can generate a proxy certificate. The proxy certificate has a relatively short lifetime, and is used to actually submit the job. A proxy is associated with a particular Virtual Organisation (VO), for example vo.scotgrid.ac.uk , which is selected when it is created. You can generate a proxy using the arcproxy command:
arcproxy -S <VO_ALIAS> -N
For example, to generate a proxy for the vo.scotgrid.ac.uk VO:
arcproxy -S vo.scotgrid.ac.uk -N
Job description (xRSL)
Before submitting a job, you need to create a file which describes the features of the job for ARC (its executable, the names of input and output files, what to do with logs, etc.). This file is written in the Extended Resource Specification Language (xRSL). A simple job description which runs a script called test.sh could look like this:
&
(executable = "test.sh")
(arguments = "")
(jobName = "TestJob")
(stdout = "stdout")
(stderr = "stderr")
(gmlog = "test.log")
(walltime="60")
A full description of xRSL can be found in the ARC reference manual: http://www.nordugrid.org/documents/xrsl.pdf
Submitting a job
Line: 1 to 1
Grid Services
Grid User Interface
Changed:
< < If you need to use grid user tools directly at Glasgow, use the CVMFS installed versions:
> > The tools required for grid job submission and management are available from CVMFS:
Added:
> > export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
alias setupATLAS='source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh'
setupATLAS
Deleted:
< < localSetupEmi
Alternatively, if you want to use ATLAS data management commands:
Changed:
< < setupATLAS
localSetupDQ2Client
> > lsetup emi
\ No newline at end of file
Line: 1 to 1
Grid Services
Grid User Interface
Changed:
< < If you need to use grid user tools directly at Glasgow, use the CVMFS installed versions, i.e.,
> > If you need to use grid user tools directly at Glasgow, use the CVMFS installed versions:
setupATLAS
localSetupEmi
Changed:
< < or, if you want to use ATLAS data management commands,
> > Alternatively, if you want to use ATLAS data management commands:
setupATLAS
Line: 1 to 1
Grid Services
Grid User Interface
Changed:
< < Every PPE desktop (both SL4x and SL5x) can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed on SL4x desktops in /data/ppe01/sl44/i386/grid . For SL5x desktops the User Interface (UI) tools and dq2-client , but not GANGA, are installed in /data/ppe01/sl5x/x86_64/grid
> > If you need to use grid user tools directly at Glasgow, use the CVMFS installed versions, i.e.,
Deleted:
< < Using the dq2-client
Changed:
< < Depending on the SL version and shell, source a setup file:
> > setupATLAS
localSetupEmi
Changed:
< <
> > or, if you want to use ATLAS data management commands,
Changed:
< < Then source the corresponding UI setup file (see section below).
Then create a voms proxy
voms-proxy-init -voms atlas
and use the dq2 Tools as needed.
Now try these simple commands:
dq2-ls mc09_7TeV.105003.pythia_sdiff.merge.NTUP_MINBIAS.e514_s764_s767_r1229_p137_tid130190_00
and to show the individual files (This may not work on SL5 machines)
dq2-ls -f mc09_7TeV.105003.pythia_sdiff.merge.NTUP_MINBIAS.e514_s764_s767_r1229_p137_tid130190_00
Using the UI
Source the appropriate setup file depending on your current shell and SL version of your machine. If you are using the dq2 client this needs to be done after sourcing the dq2 client setup file. To check your current shell use:
echo $SHELL
To check the version of SL use:
cat /etc/redhat-release
> > setupATLAS
localSetupDQ2Client
Line: 1 to 1
Grid Services
Line: 6 to 6
Every PPE desktop (both SL4x and SL5x) can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed on SL4x desktops in /data/ppe01/sl44/i386/grid . For SL5x desktops the User Interface (UI) tools and dq2-client , but not GANGA, are installed in /data/ppe01/sl5x/x86_64/grid
Deleted:
< < Using the UI
First source the appropriate setup file depending on your current shell and SL version of your machine.
To check your current shell use:
echo $SHELL
To check the version of SL use:
cat /etc/redhat-release
Using the dq2-client
Changed:
< < Setup the UI as mentioned in the previous section.
> > Depending on the SL version and shell, source a setup file:
Deleted:
< < Again depending on SL version and shell, source a setup file:
Line: 38 to 20
Added:
> > Then source the corresponding UI setup file (see section below).
Then create a voms proxy
voms-proxy-init -voms atlas
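You can confirm that the proxy was created, and check its remaining lifetime, with:
voms-proxy-info -all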
Line: 51 to 35
dq2-ls -f mc09_7TeV.105003.pythia_sdiff.merge.NTUP_MINBIAS.e514_s764_s767_r1229_p137_tid130190_00
Added:
> > Using the UI
Source the appropriate setup file depending on your current shell and SL version of your machine. If you are using the dq2 client this needs to be done after sourcing the dq2 client setup file. To check your current shell use:
echo $SHELL
To check the version of SL use:
cat /etc/redhat-release
-- WilliamBell - 08 Aug 2008 \ No newline at end of file |
Line: 1 to 1
Grid Services
Line: 45 to 45
Now try these simple commands:
Changed:
< < dq2-ls alpgen.105762.ttbarlnlnNp2_lowQ
> > dq2-ls mc09_7TeV.105003.pythia_sdiff.merge.NTUP_MINBIAS.e514_s764_s767_r1229_p137_tid130190_00
and to show the individual files (This may not work on SL5 machines)
Changed:
< < dq2-ls -f alpgen.105762.ttbarlnlnNp2_lowQ
> > dq2-ls -f mc09_7TeV.105003.pythia_sdiff.merge.NTUP_MINBIAS.e514_s764_s767_r1229_p137_tid130190_00
-- WilliamBell - 08 Aug 2008 \ No newline at end of file |
Line: 1 to 1
Grid Services
Grid User Interface
Changed:
< < Every PPE desktop (both SL4x and SL5x) can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed on SL4x desktops in /data/ppe01/sl44/i386/grid . For SL5x desktops the User Interface (UI) tools and dq2-client , but not GANGA, are installed in /data/ppe01/sl5x/x86_64/grid
> > Every PPE desktop (both SL4x and SL5x) can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed on SL4x desktops in /data/ppe01/sl44/i386/grid . For SL5x desktops the User Interface (UI) tools and dq2-client , but not GANGA, are installed in /data/ppe01/sl5x/x86_64/grid
Using the UI
First source the appropriate setup file depending on your current shell and SL version of your machine.
Changed:
< < To check your current shell use: echo $SHELL
To check the version of SL use: cat /etc/redhat-release
> > To check your current shell use: echo $SHELL
Changed:
< <
> > To check the version of SL use: cat /etc/redhat-release
Using the dq2-client
Line: 36 to 29
Setup the UI as mentioned in the previous section. Again depending on SL version and shell, source a setup file:
Changed:
< <
> >
Then create a voms proxy
voms-proxy-init -voms atlas
Line: 1 to 1
Grid Services
Grid User Interface
Changed:
< < Every PPE SL4x desktop can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed in /data/ppe01/sl44/i386/grid
Using the UI
> > Every PPE desktop (both SL4x and SL5x) can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed on SL4x desktops in /data/ppe01/sl44/i386/grid . For SL5x desktops the User Interface (UI) tools and dq2-client , but not GANGA, are installed in /data/ppe01/sl5x/x86_64/grid
Deleted:
< < Check your current shell
echo $SHELL
Changed:
< < If you are using bash or zsh then
source /data/ppe01/sl44/i386/grid/glite-ui/latest/external/etc/profile.d/grid-env.sh
> > Using the UI
Changed:
< < If you are using tcsh or csh then
source /data/ppe01/sl44/i386/grid/glite-ui/latest/external/etc/profile.d/grid-env.csh
> > First source the appropriate setup file depending on your current shell and SL version of your machine.
Changed:
< <
> > To check your current shell use: echo $SHELL
To check the version of SL use: cat /etc/redhat-release
Changed:
< < Using the dq2-client
> > Using the dq2-client
Setup the UI as mentioned in the previous section. | |||||||||
Changed:
< < If you are using bash or zsh then
source /data/ppe01/sl44/i386/grid/dq2-client/setup.sh
> > Again depending on SL version and shell, source a setup file:
Changed:
< < If you are using tcsh or csh then
source /data/ppe01/sl44/i386/grid/dq2-client/setup.csh
> >
Then create a voms proxy
voms-proxy-init -voms atlas
Line: 1 to 1
Grid Services
Grid User Interface
Changed:
< < Every PPE SL4x desktop can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed in /data/ppe01/sl44/i386/grid
> > Every PPE SL4x desktop can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed in /data/ppe01/sl44/i386/grid
Using the UI
Check your current shell
Line: 14 to 14
source /data/ppe01/sl44/i386/grid/glite-ui/latest/external/etc/profile.d/grid-env.sh
If you are using tcsh or csh then
Changed:
< < source /data/ppe01/sl44/i386/grid/glite-ui/latest/external/etc/profile.d/grid-env.csh
> > source /data/ppe01/sl44/i386/grid/glite-ui/latest/external/etc/profile.d/grid-env.csh
Using the dq2-client
Line: 39 to 41
dq2-ls -f alpgen.105762.ttbarlnlnNp2_lowQ
Deleted:
< < Using GANGA for ATLAS
source /afs/cern.ch/sw/ganga/install/etc/setup-atlas.sh
then follow the Physics Analysis Workbook (Batch 2)
https://twiki.cern.ch/twiki/bin/view/AtlasProtected/PhysicsAnalysisWorkBookBatch2
and
https://twiki.cern.ch/twiki/bin/view/Atlas/FullGangaAtlasTutorial
Using GANGA
Setup the UI as mentioned in the previous section. If you are using bash or zsh then
source /data/ppe01/sl44/i386/grid/ganga/setup.sh
If you are using tcsh or csh then
source /data/ppe01/sl44/i386/grid/ganga/setup.csh
Refer to the ATLAS, LHCb or core GANGA documentation. The release installed contains GangaAtlas , GangaLHCb , GangaNG , GangaCronus , GangaGUI , GangaPlotter
-- WilliamBell - 08 Aug 2008 |
Line: 1 to 1
Grid Services
Line: 39 to 39
dq2-ls -f alpgen.105762.ttbarlnlnNp2_lowQ
Added:
> > Using GANGA for ATLAS
source /afs/cern.ch/sw/ganga/install/etc/setup-atlas.sh
then follow the Physics Analysis Workbook (Batch 2)
https://twiki.cern.ch/twiki/bin/view/AtlasProtected/PhysicsAnalysisWorkBookBatch2
and
https://twiki.cern.ch/twiki/bin/view/Atlas/FullGangaAtlasTutorial
Using GANGA
Setup the UI as mentioned in the previous section.
Line: 1 to 1
Grid Services
Line: 29 to 29
Then create a voms proxy
voms-proxy-init -voms atlas
Changed:
< < and use the dq2 Tools as needed.
> > and use the dq2 Tools as needed.
Now try these simple commands:
dq2-ls alpgen.105762.ttbarlnlnNp2_lowQ
and to show the individual files (This may not work on SL5 machines)
dq2-ls -f alpgen.105762.ttbarlnlnNp2_lowQ
Using GANGA
Setup the UI as mentioned in the previous section.
Line: 1 to 1
Added:
> > Grid Services
Grid User Interface
Every PPE SL4x desktop can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed in /data/ppe01/sl44/i386/grid
Line: 1 to 1
Added:
> > Grid User Interface
Every PPE SL4x desktop can act as an EGEE gLite UI. The User Interface (UI) tools, dq2-client and GANGA are installed in /data/ppe01/sl44/i386/grid
Using the UI
Check your current shell
echo $SHELL
If you are using bash or zsh then
source /data/ppe01/sl44/i386/grid/glite-ui/latest/external/etc/profile.d/grid-env.sh
If you are using tcsh or csh then
source /data/ppe01/sl44/i386/grid/glite-ui/latest/external/etc/profile.d/grid-env.csh
Using the dq2-client
Setup the UI as mentioned in the previous section. If you are using bash or zsh then
source /data/ppe01/sl44/i386/grid/dq2-client/setup.sh
If you are using tcsh or csh then
source /data/ppe01/sl44/i386/grid/dq2-client/setup.csh
Then create a voms proxy
voms-proxy-init -voms atlas
and use the dq2 Tools as needed.
Using GANGA
Setup the UI as mentioned in the previous section. If you are using bash or zsh then
source /data/ppe01/sl44/i386/grid/ganga/setup.sh
If you are using tcsh or csh then
source /data/ppe01/sl44/i386/grid/ganga/setup.csh
Refer to the ATLAS, LHCb or core GANGA documentation. The release installed contains GangaAtlas , GangaLHCb , GangaNG , GangaCronus , GangaGUI , GangaPlotter
-- WilliamBell - 08 Aug 2008 |