Off Site
- At GA
  - To get PPPL tools:
      source /c/transp/etc/transp_setup.csh
- At MIT
  - To get PPPL tools:
    - bash users
        export NTCCHOME=/usr/local/cmod/codes/transp/pppl_tools
        source $NTCCHOME/transp_setup.bash
    - tcsh users
        setenv NTCCHOME /usr/local/cmod/codes/transp/pppl_tools
        source $NTCCHOME/transp_setup.csh
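For bash users at MIT, the two lines above can go into a shell startup file; a minimal sketch (the existence check is an addition here, so the fragment is harmless on machines without the tools):

```shell
# Sketch of a ~/.bashrc fragment for MIT users (paths from the text above).
export NTCCHOME=/usr/local/cmod/codes/transp/pppl_tools
# Only source the setup script if it is actually installed on this machine.
if [ -f "$NTCCHOME/transp_setup.bash" ]; then
    source "$NTCCHOME/transp_setup.bash"
fi
```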
Index to Details
Getting a Proxy
PPPL users do not need a Grid Proxy.
Users can have their proxy renewed automatically at login by sourcing the scripts below:
- off-site users
  - tcsh users add to the end of:
  - bash users add to the end of:
- at GA
  - tcsh users:
      source /c/transp/etc/transp_setup.csh
  - follow the instructions above for off-site users
- other sites
  - If you installed the NTCC module tr_start or tr_client:
    - define the environment variable TRANSP_LOCATION = $PREFIX/etc
    - add $PREFIX/bin and $PREFIX/etc to your $PATH
      ($PREFIX is where you installed the software)
  - You can download the scripts here,
    then extract the scripts into $TRANSP_LOCATION.
  - follow the examples above for off-site users
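For an NTCC install, the environment setup above can be sketched as a shell fragment (the PREFIX value here is a hypothetical example; use your own install prefix):

```shell
# Hypothetical install prefix -- replace with wherever you installed tr_client.
PREFIX=$HOME/ntcc
export TRANSP_LOCATION=$PREFIX/etc          # where the downloaded scripts go
export PATH=$PREFIX/bin:$PREFIX/etc:$PATH   # make tr_start etc. findable
```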
Namelists
TR.DAT and TR.INF files are in $TRINF/<TOK>/<yy>
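To see which namelists are available, one can simply list that directory; a sketch (the TOK and yy values are hypothetical examples, and $TRINF comes from the setup scripts):

```shell
TOK=D3D   # example tokamak
yy=94     # example year
# List the TR.DAT namelists for that tokamak/year, if any are present.
ls "${TRINF:-.}/$TOK/$yy"/*TR.DAT 2>/dev/null || echo "no namelists found for $TOK/$yy"
```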
Basic Scripts
Note: tr_start and tr_send may be run from xtranspin
tr_start <runid> [tshare] [<pbs-queue>]
-----------------
must be run from the directory containing the Namelist
creates the MDSplus tree for input data
writes the <runid>.REQUEST file
uses environments
TR_EMAIL           your e-mail address
EDITOR             your favourite editor
MDS_TRANSP_SERVER  transpgrid.pppl.gov (for PPPL users)
                   NONE (for non-PPPL users if not using MDSplus)
MDS_TRANSP_TREE    (do NOT specify if PPPL user or nomdsplus)
tr_send <runid>
--------------------------
must be run from the same directory as tr_start
submits the run to the PPPL PBS queue
(for non-PPPL users: via globus-job-submit)
For non-PPPL users not using MDSplus:
tr_send_pppl.pl <runid> <TOK> NOMDSPLUS
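Putting tr_start and tr_send together, a typical submission looks like the following dry-run sketch (DRYRUN=echo only prints the commands; the runid, directory, and e-mail address are hypothetical):

```shell
DRYRUN=echo                        # set DRYRUN= to really submit
export TR_EMAIL=you@example.com    # read by tr_start
export EDITOR=vi
cd "${RUNDIR:-.}"                  # the directory holding 12345A01TR.DAT
$DRYRUN tr_start 12345A01          # creates the input tree and 12345A01.REQUEST
$DRYRUN tr_send 12345A01           # submits the run to the PPPL PBS queue
```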
tr_cleanup <runid> <tok>
-------------------------
queues request to cleanup all traces of a run
if the run is still active, it will be aborted
tr_look <runid> [tok] [archive]
---------------------------------
writes interim MDSplus output into TRLOOK_<TOK> on transpgrid.pppl.gov
options:
archive:   for aborted and active runs;
           archives the run "as is" at its final destination;
           if the run is still active, it will be aborted
nomdsplus: generate the CDF file only (in $LOOKDIR/<TOK>)
If not at PPPL: use only with archive
tr_halt <runid> <tok> [t <time>]
---------------------------------------------
suspend a run immediately, or at t = <time>
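The scripts above combine into a run lifecycle roughly as follows; a dry-run sketch with a hypothetical runid/tokamak (DRYRUN=echo only prints the commands):

```shell
DRYRUN=echo                            # set DRYRUN= to really run
$DRYRUN tr_start 12345A01              # from the namelist directory
$DRYRUN tr_send 12345A01               # submit to the queue
$DRYRUN tr_look 12345A01 tftr          # interim MDSplus output while running
$DRYRUN tr_halt 12345A01 tftr t 3.0    # suspend at t = 3.0
$DRYRUN tr_cleanup 12345A01 tftr       # abort and remove all traces
```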
Steering a Run
This is disabled.
- Web Sites
- Logfiles on disk
- completed runs:
$LOGDIR/<TOK>/
- aborted runs:
$RESULTDIR/<TOK>/<runid>/
- Interim output / Aborted runs
rplot t s transpgrid.pppl.gov t trlook_<tok> y <tok>.<yy> q <runid>
or
cd $LOOKDIR/<TOK>
rplot <runid>
- Completed Runs
rplot t y <tok>.<yy> q <runid>
--or-- at PPPL reading from disk:
rplot d '$ARCDIR/<TOK>/<yy>' q <runid>
- Special Output (other than .CDF)
*.DATA* and *.XML files are in $ARCDIR/<TOK>/<yy>
- Default Mdsplus Trees are:
- On Server transpgrid.pppl.gov
- TRANSP_<TOK> for "temporary" runs
these runs may be archived on demand (tr_save)
- TRLOOK_<TOK> for "interim" output
To export a TRANSP run made at PPPL to another site, run export_run to produce a tar file containing the MDSplus tree files, which you can then copy to the remote site. Site-specific rules govern how the file is handled at the remote site; you will very likely need to assign a different runid/MDSplus id. Currently this is only supported by GA.
- At PPPL:
export_run
usage: export_run <runid> <tok> <year> [remote tree name]
e.g: export_run 12345A01 d3d 94
will produce /tmp/$USER/123450101.tar.gz
e.g: export_run 12345A01 iter 06
will produce /tmp/$USER/12345A01.tar.gz
- Copy the tar file to the remote site, using scp or globus-url-copy.
- At remote site:
- At GA:
Use the script archiverun to generate a GA runid and add the run to the database.
Unpack /u/aswath/archivetransprun.tar
and follow the instructions in README.TXT
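The export handoff can be sketched end to end as a dry run (DRYRUN=echo only prints the commands; the destination host and username are placeholders):

```shell
DRYRUN=echo                                      # set DRYRUN= to really run
$DRYRUN export_run 12345A01 d3d 94               # writes /tmp/$USER/123450101.tar.gz
# Copy the tar file to the remote site (placeholder destination):
$DRYRUN scp "/tmp/$USER/123450101.tar.gz" user@remote.example.org:
```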
A TRANSP run made at another site that is accessible via MDSplus can be imported to PPPL, but the import must be run from a privileged account.
- as pshr#### with a proxy, run import_run
e.g: import_run 12345A01 _atlas.gat.com transp
import_run will
- check if run already exists
- extract input and output data from remote mdsplus Server
- create a tree at PPPL
- write all data into tree
- instruct you how to run import_finish
- as pshare, run import_finish
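The import sequence above, as a dry-run sketch (DRYRUN=echo only prints the commands; the runid and server come from the example above):

```shell
DRYRUN=echo                                        # set DRYRUN= to really run
# as pshr#### with a valid proxy:
$DRYRUN import_run 12345A01 _atlas.gat.com transp
# then, as pshare (import_run prints the exact invocation to use):
$DRYRUN import_finish
```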
To override the PPPL defaults, e.g. to use Globus, to send input as a tar file, etc., use the scripts below.
See also Available PPPL Tools for Grid Users
- tr_start.pl <server> <tree> <runid> [<mdsid> silent] [xtar | noxtar | nocreate | nomdsplus | sendtree] [ -nbi_np <np> ] [ -alt <username> | tshare ] [-q <pbs-queue>]
to run instead of tr_start
Examples:
- standard PPPL - pretend to be a grid user
tr_start.pl transpgrid.pppl.gov transp_tftr 12345A01
will ask if you want a tar file
- without creating an MDSplus tree, for sending Ufiles, fetching the tree
tr_start.pl PPPL PPPL 12345A01 nocreate
- without creating an MDSplus tree; the tree is created by TRANSP on the remote client
tr_start.pl transpgrid.pppl.gov transp_tftr 12345A01 start_remote
- no MDSplus, for sending Ufiles, fetching output
tr_start.pl NONE NONE 12345A01 nomdsplus
- create Tree, for sending tree files, receiving tree
tr_start.pl transpgrid.pppl.gov transp_tftr 12345A01 sendtree
- tr_send_pppl.pl <runid> <tok> [ -s ] [trdat] [FAST | SLOW] [ UFILE | CREATE | NOMDSPLUS | TREES]
to run instead of tr_send
Examples:
- pretend to be a grid user (standard or tar file)
tr_send_pppl.pl 12345A01 tftr
- to send to transpcomp (instead of transpgrid) (disabled)
tr_send_pppl.pl 12345A01 tftr FAST
- NO tree was created by the client; the tree will be created by PPPL on the client's server
tr_send_pppl.pl 12345A01 tftr UFILE
- NO tree was created; send Ufiles, PPPL creates a local tree to be fetched
tr_send_pppl.pl 12345A01 tftr CREATE
- NO MDSPLUS, send Ufiles, fetch CDF & output
tr_send_pppl.pl 12345A01 tftr NOMDSPLUS
- Send Tree Triplets to PPPL, to be returned with Output
tr_send_pppl.pl 12345A01 tftr TREES
- tr_send_cleanup <runid> <tok>
to run instead of tr_cleanup
- submitting a run manually
See Submitting a Collaboratory Run