condor_submit is the program for actually submitting jobs to Condor. condor_submit wants as its sole argument the name of a submit-description file which contains commands and keywords to direct the queuing of jobs. In the submit-description file, you will tell Condor everything it needs to know about the job. Items such as the name of the executable to run, the initial working directory, command-line arguments, etc., all go into the submit-description file. condor_submit then creates a new job ClassAd based upon this information and ships it along with the executable to run to the condor_schedd daemon running on your machine. At that point your job has been submitted into Condor.
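For instance, assuming your submit-description file is named foo.sub (the file name and the cluster number shown are only for illustration), a submission session looks roughly like this:

```
% condor_submit foo.sub
Submitting job(s).
1 job(s) submitted to cluster 42.
```

Each submission is assigned a cluster number, which you can later use with tools such as condor_q to track the job.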
Now please read the condor_submit manual page in the
Command Reference chapter before you continue; it
contains a complete and full description of how to use condor_submit.
Now that you have read about condor_submit and have an idea of how it works, we'll follow up with a few additional examples of submit-description files.
Example 1 below is about the simplest submit-description file possible. It queues up one copy of the program ``foo'' for execution by Condor. Condor will attempt to run the job on a machine which has the same architecture and operating system as the machine from which it was submitted. Since no input, output, and error commands were given, the files stdin, stdout, and stderr will all refer to /dev/null. (The program may produce output by explicitly opening a file and writing to it.)
####################
#
# Example 1
# Simple condor job description file
#
####################

Executable = foo
Queue
Example 2 below queues two copies of the program ``mathematica''. The first copy will run in directory ``run_1'', and the second will run in directory ``run_2''. In both cases the names of the files used for stdin, stdout, and stderr will be test.data, loop.out, and loop.error, but the actual files will be different, as they are in different directories. This is often a convenient way to organize your data if you have a large group of condor jobs to run. The example file submits ``mathematica'' as a Vanilla Universe job, perhaps because the source and/or object code to program ``mathematica'' was not available and therefore the re-link step necessary for Standard Universe jobs could not be performed.
####################
#
# Example 2: demonstrate use of multiple
# directories for data organization.
#
####################

Executable = mathematica
Universe   = vanilla
input      = test.data
output     = loop.out
error      = loop.error

Initialdir = run_1
Queue

Initialdir = run_2
Queue
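Given the file above, the layout of the data files used by the two queued jobs (paths relative to the submit directory) works out to:

```
run_1/test.data   run_1/loop.out   run_1/loop.error
run_2/test.data   run_2/loop.out   run_2/loop.error
```

You would create the run_1 and run_2 directories, each containing its own test.data, before submitting.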
The submit-description file Example 3 below queues 150 runs of program ``foo'' which must have been compiled and linked for Silicon Graphics workstations running IRIX 6.x. Condor will not attempt to run the processes on machines which have less than 32 megabytes of physical memory, and will run them on machines which have at least 64 megabytes if such machines are available. Stdin, stdout, and stderr will refer to ``in.0'', ``out.0'', and ``err.0'' for the first run of this program (process 0). Stdin, stdout, and stderr will refer to ``in.1'', ``out.1'', and ``err.1'' for process 1, and so forth. A log file containing entries about where/when Condor runs, checkpoints, and migrates processes in this cluster will be written into file ``foo.log''.
####################
#
# Example 3: Show off some fancy features including
# use of pre-defined macros and logging.
#
####################

Executable   = foo
Requirements = Memory >= 32 && OpSys == "IRIX6" && Arch == "SGI"
Rank         = Memory >= 64
Image_Size   = 28 Meg

Error  = err.$(Process)
Input  = in.$(Process)
Output = out.$(Process)
Log    = foo.log

Queue 150
There are a few more things you should know about the powerful Requirements and Rank commands in the submit-description file.
First of all, both of them need to be valid Condor ClassAd expressions.
From the condor_submit manual page and the above examples, you can see
that writing ClassAd expressions is quite intuitive (especially if you
are familiar with the programming language C). However, there are some
pretty nifty expressions you can write with ClassAds if you care to read
more about them. The complete lowdown on ClassAds and their expressions
can be found in section 4.1.
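As one illustration of what these expressions allow: Rank need not be a boolean test as in Example 3 above. Condor prefers machines with higher Rank values, so a numeric expression (this sketch uses only the attributes that appear in the examples above) tells Condor to favor the matching machine with the most memory:

```
Requirements = OpSys == "IRIX6" && Arch == "SGI"
Rank         = Memory
```

Here Requirements decides which machines are acceptable at all, while Rank orders the acceptable machines from most to least preferred.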
All of the commands in the submit-description file are case insensitive, except for the string values that appear in the ClassAd expressions you write! ClassAd attribute names are case insensitive, but ClassAd string values are always case sensitive. If you accidentally write

    requirements = arch == "alpha"

instead of what you should have written, which is

    requirements = arch == "ALPHA"

you will not get what you want.
So now that you know ClassAd string values are case sensitive, how do you know what the capitalization should be for an arbitrary attribute value? For that matter, how do you know which attributes you can use? The answer is that you can use any attribute that appears in either a machine or a job ClassAd. To view all of the machine ClassAd attributes, simply run condor_status -l. The -l argument tells condor_status to display the complete machine ClassAd. Similarly, for job ClassAds, run condor_q -l (note: you will have to submit some jobs before you can view a job ClassAd). This will show you all the available attributes you can play with, along with their proper capitalization.
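As a quick reference, the two inspection commands just described are (output elided here; condor_q -l prints nothing until you have queued at least one job):

```
% condor_status -l        # display complete machine ClassAds
% condor_q -l             # display complete job ClassAds
```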
To help you understand what these attributes signify, below we list descriptions for the attributes which will be common by default to every machine ClassAd. Remember that because ClassAds are flexible, the machine ads in your pool may include additional attributes specific to your site's installation and policies.
There are times when you would like to submit jobs across machine architectures. For instance, let's say you have an Intel machine running LINUX sitting on your desk. This is the machine where you do all your work and where all your files are stored. But perhaps the majority of machines in your pool are Sun SPARC machines running Solaris. You would want to submit jobs directly from your LINUX box that would run on the SPARC machines.
This is easily accomplished. You will need, of course, to create your executable on the same type of machine where you want your job to run -- Condor will not translate machine instructions between heterogeneous architectures for you! The trick is simply what to specify for the requirements command in your submit-description file. By default, condor_submit inserts requirements that will make your job run on the same type of machine you are submitting from. To override this, simply state what you want. Returning to our example, you would put the following into your submit-description file:
    requirements = Arch == "SUN4x" && OpSys == "SOLARIS251"

Just run condor_status to display the Arch and OpSys values for any and all machines in the pool.
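Putting it together, a complete submit-description file for the LINUX-to-Solaris scenario above might look like the following sketch (the executable name foo.sun4x is a stand-in for a binary you compiled and linked on a SPARC machine running Solaris):

```
####################
#
# Submit from an Intel/LINUX machine,
# run on SPARC/Solaris machines.
#
####################

Executable   = foo.sun4x
Universe     = vanilla
Requirements = Arch == "SUN4x" && OpSys == "SOLARIS251"
Queue
```

Because the Requirements command is stated explicitly, condor_submit will not insert its default same-architecture requirements, and the job will match only SPARC/Solaris machines.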