3.4. Using sh_gamit and sh_glred

Now that we’ve described the program flow and control files, we’ll take you through a simple application of sh_gamit and sh_glred using the small network in Southern California provided (prior to release 10.7) in ~/gg/example/. The command for sh_gamit, issued at the project-directory level (2000/), is

$ sh_gamit -expt scal -d 2000 034 035 036 -noftp -pres ELEV -copt x k p -dopt c ao >& sh_gamit.log

In this example, we start with all of the data to be processed already present in the rinex/ directory. If you wanted to add data available from a global or regional archive, you would specify the site names in sites.defaults, enter the URL information (if not already present) into ~/gg/tables/ftp_info, and remove the -noftp option from the call to sh_gamit. As a result of the command shown, sh_gamit will execute the following steps for each day, noted in the screen output redirected to sh_gamit.log:

  1. Assign parameters for program flow, giving precedence first to the command-line arguments, then to the parameters set in process.defaults and sites.defaults, and then to default assignments within sh_gamit itself. In this case, the command-line entry -noftp overrides the default to search archives for requested or updated observation, navigation, and EOP files; and the command-line entries for which files to compress or delete at the end of the run override those set in process.defaults.

  2. Create the day directory and/or any standard directories that do not yet exist (all of these are already present in the ~/gg/example/ directory).

  3. Link into the day directory (034/) the standard tables (see script links.day) and the RINEX files that contain data for the specified interval (00:00–24:00 as set in process.defaults).

  4. Download orbital SP3 files from a global data center and create GAMIT g-files using script sh_get_orbits.

  5. Run sh_upd_stnfo, which invokes program mstinf to update station.info from the RINEX headers. (It is recommended that this step be skipped by setting xstinfo {<expt>} all in sites.defaults.)

  6. Run makexp to create the input files for makex (scal.makex.batch) and fixdrv (dscal0.034).

  7. Run sh_check_sess to make sure that all of the satellites included in the RINEX observation files are present in the navigation file (brdc/brdc0340.00n, previously downloaded at MIT from an IGS archive) and in the g-file (created previously at MIT from an IGS SP3 file).

  8. Run sh_makej to create a j-file of satellite clock estimates from the SP3 file (default) or navigation file.

  9. Run makex to create x-files (observations) and k-files (receiver clock estimates) using phase and pseudorange data from the RINEX observation files, broadcast ephemeris from the navigation file, and satellite clocks from the j-file. A record of makex, showing the data found and any problematic data encountered, is written to scal.makex.infor.

  10. Run fixdrv to create the batch file for GAMIT processing. Though the values are not used directly in the processing, fixdrv also reads the k-file of episodic clock estimates and fits a first-order polynomial to them as a crude check for jumps and rapid drifts in the receiver clock (fixdrv.out).

  11. Execute the batch run to generate a tabular orbital ephemeris (arc), model the phase observations (model), edit the data (autcln), and estimate parameters (solve), a sequence completed twice in order that autcln may operate on flat residuals and that the final adjustments in solve are well within a linear range. A record of this run is not written to sh_gamit.log (to save space) but is recorded in GAMIT.status, GAMIT.warning, and GAMIT.fatal in the day directory.

  12. Save the cleaning summary (autcln.post.sum) to autcln.post.sum.scal (potentially for archiving, though that’s not done for this example) and write key information from model and solve to the HISTORY file, which, unlike all other files in the day directory, is appended rather than overwritten in reruns so that a record of previous runs is maintained.

  13. Create sky plots of phase residuals and plots of phase vs elevation angle using the DPH files written by autcln; if the Ghostscript program is available, translate the plots from PostScript to PNG and move them into figs/.

  14. Invoke sh_cleanup to delete or compress files as specified by -dopt and -copt.

The most common problems with sh_gamit are missing or incorrect receiver and antenna information (“metadata”) in station.info and the loss of data due to bad tracking or bad coordinates.

There are two approaches you can take to providing metadata to GAMIT. If you have RINEX files for which you know all the headers are correct, you can have sh_gamit invoke sh_upd_stnfo to update station.info for each day from the header entries (no update will be made if a station.info entry is present and matches the RINEX header). If the antenna-height information is correct but the receiver and antenna names are not IGS standard, you can still use this feature by putting into ~/gg/tables/guess_rcvant.dat a substring that will uniquely match what appears in the header. If you know that all of your receivers and/or antennas are of the same type, you can force their use by specifying them as ant default and rcv default in guess_rcvant.dat. If you expect sh_gamit to generate station.info entries from the RINEX header information, you should review the antenna and receiver names used in all of the headers before you start (e.g., by grep’ing the RINEX files for REC # and ANT #).

Antenna heights can also be problematic. In the RINEX standard, the header value is supposed to be a vertical height to the antenna reference point (ARP), but often a slant height is actually given in the file. If that is the case, then stinf_slthgt in process.defaults can be set to a height above which the value will be assumed to be a slant height to the outside edge of the ground plane. (Setting stinf_slthgt to 0 or a large number will cause all values to be interpreted as direct height measurements.)
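The header review suggested above can be done with a quick grep. This sketch is self-contained for illustration: sample.00o and its contents are made up here; in practice you would grep the real files in ../rinex/ (e.g., grep -h 'REC # / TYPE' ../rinex/*o | sort -u).

```shell
# Write a two-line sample of RINEX observation-file header records
# (file name and contents are invented for this illustration)
cat > sample.00o <<'EOF'
ASHTECH Z-XII3      1Y07-1DY4           CC00                REC # / TYPE / VERS
220                 ASH700936D_M    SNOW                    ANT # / TYPE
EOF
# List the unique receiver and antenna types used across the headers
grep -h 'REC # / TYPE' sample.00o | sort -u
grep -h 'ANT # / TYPE' sample.00o | sort -u
```

Any name that does not match the IGS-standard list (or a substring in guess_rcvant.dat) will cause trouble when station.info entries are generated from the headers.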

If you have any doubts about the validity of the RINEX headers, it is better to create (and check!) station.info before you start the GAMIT processing. You can create a file with all of the entries for your survey by running sh_upd_stnfo manually with your RINEX files, then editing the file as appropriate. In creating this file, it is best to start with a template from the current MIT or SOPAC global station.info file (available in incremental_updates/tables/) so that you can conveniently add entries for continuous stations from the global file. This template can be header-only, but the preferred approach is to start with the full global station.info file and use the -l {<sitelist>} option of sh_upd_stnfo to extract only the stations you need, either from an existing list or from one created automatically by sh_upd_stnfo from the sites with ftprnx in sites.defaults and/or the file names in your experiment rinex/ directory. Hence, if you have a current copy of the MIT or SOPAC station.info in ~/gg/tables/, have run sh_setup to copy it into your experiment tables/ directory, and have edited sites.defaults to specify the RINEX files you want from a remote archive (using ftprnx), you can run (in tables/):

$ sh_upd_stnfo -ref station.info -l sd

which will produce station.info.new with the global entries for only the ftprnx sites. (If the sites you need from the global station.info file are not in sites.defaults with ftprnx but are present in the rinex/ directory, you can use -l sdrnx or -l rnx in the sh_upd_stnfo command.) Then make sure the RINEX files in the experiment rinex/ directory are not compressed, rename station.info.new to station.info, and run (in tables/):

$ sh_upd_stnfo -files ../rinex/*o
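The requirement that the rinex/ files be uncompressed can be handled with gunzip beforehand. A self-contained sketch (cit10340.00o is a made-up file name standing in for your real RINEX files):

```shell
# Simulate a compressed RINEX delivery with a dummy file
mkdir -p rinex
printf 'dummy rinex content\n' > rinex/cit10340.00o
gzip rinex/cit10340.00o
# Expand everything before running sh_upd_stnfo -files ../rinex/*o
gunzip rinex/*.gz
ls rinex
```

Archives may instead deliver UNIX-compressed (.Z) or Hatanaka-compressed files; those need uncompress and CRX2RNX, respectively (noted here only as a caution, not shown).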

sh_upd_stnfo also allows entries for station.info to be created from IGS log files or SINEX files. If you construct station.info in advance of your processing, set xstinfo {<expt>} all in sites.defaults to block any attempt by sh_gamit to update the file from RINEX headers.

Bad a priori coordinates will result in autcln detecting too many cycle slips and deleting all of the data. A clear indication that this has happened is Range rms values greater than about 20 m at the top of the autcln.prefit.sum file together with 0 for DATA AMOUNTS in the same file. When this happens, you should check the experiment tables/lfile. for the source of the coordinates used.

If a priori coordinates for a station are not available in the L-file (or .apr file) from previous processing, sh_gamit will by default invoke the sh_rx2apr script to perform a pseudorange solution. Coordinates good to 10–20 m can usually be obtained from the data at the station of interest alone (better if selective availability is off), but the preferred approach is to perform the solution differentially, using also a RINEX file from an IGS station with known coordinates. To make sure this happens, you should specify ftprnx in sites.defaults and have the RINEX files for each day from one or more IGS stations either present in the rinex/ directory or available via download from an IGS data archive. (You can get a differential pseudorange solution even with -noftp specified for sh_gamit, provided you copy the IGS data into the rinex/ directory in advance of the run.) If a differential pseudorange solution was used and you still have bad coordinates, try executing sh_rx2apr manually with data from each of several days and compare the coordinates to see if the day you used originally was an anomaly because it was short or had bad pseudorange data. To bypass sh_rx2apr and use the coordinates from the RINEX header, set use_rxc = Y in process.defaults. This option should be used only if you know that the header values are always present and accurate, or if you have RINEX 3 files, which the programs svpos and svdiff invoked by sh_rx2apr do not yet support.
Note that a large adjustment of coordinates on one day, due to bad data or a short session, can cause problems on the next day, since lfile. in the experiment tables/ directory will be updated. You can avoid this update by copying into the aprf file specified in process.defaults any site coordinates that you know to be good; the L-file will then be initialized with these values as processing begins for each day.
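For reference, an entry in a GLOBK-style .apr file has the form below (the site name and numbers here are illustrative placeholders, not real values for any station): Cartesian X, Y, Z in meters, velocities in m/yr, and a reference epoch in decimal years.

```
 CIT1_GPS  -2491490.0460 -4660803.9100  3559128.8360  0.00000  0.00000  0.00000  2000.0000
```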

sh_gamit can also fail if it is unable to download required global RINEX files or orbital information from an IGS archive (CDDIS or SOPAC if not otherwise specified). The GAMIT.fatal message will usually make clear what file is missing. In this case, check the connection to the archive manually and restart the processing.

For the most part, you can rerun a day after a failure simply by remedying the detected problem (e.g., a bad station.info entry). However, in some circumstances old or corrupted files will be reused:

  (a) Unless you specify -remakex Y on the command line, any existing x-file in a directory will be used again, and the script assumes that there is a valid station.info entry for this file (if not, the process will fail).

  (b) Any RINEX file already linked into the day directory will be assumed to be valid. If the link is now empty because you have renamed or removed the file in the remote directory, this may not be detected correctly on all systems.

  (c) A previously added station.info entry will be used (and not replaced) if it applies to the day being processed.

  (d) Coordinates in the L-file will be used if they exist (so if the entry has been corrupted it should be removed and/or correct coordinates put into the .apr file).

Descriptions of how to run sh_gamit with sessions crossing day boundaries can be found in Section 6.3 of the GAMIT Reference Manual.

To generate time series from the GAMIT runs for the three days in the example, type at the project-directory level (2000/)

$ sh_glred -expt scal -s 2000 034 2000 036 -opt H G T >&! sh_glred.log

The script will execute the following steps, noted in the screen output redirected to sh_glred.log:

  1. Search all day directories between 034/ and 036/ for GAMIT h-files containing the substring scal and run htoglb to convert these to binary h-files for glred. Since each GAMIT h-file contains two solutions, one with ambiguities estimated as real numbers (“biases free”) and one with ambiguities resolved (“biases fixed”), htoglb will create two binary h-files, with extents .glr (GAMIT loose free) and .glx (GAMIT loose fixed), respectively. They are stored in the glbf/ directory and named with the year/month/day/hr/min, e.g., h0002031200_scal.glx for day 034.

  2. Generate in the gsoln/ directory a .gdl file (h-file list) for each day (.glx is default).

  3. Run glred for each day, using commands of the form

    $ glred 6 globk_scal_00034.prt globk_scal_00034.log globk_scal_00034.gdl globk.cmd
    

    The .log and .org files provide a record of the output.

  4. Run sh_plot_pos to invoke program tssum to create PBO-format .pos files from the coordinates of each site on each day in the .org files and invoke GMT to plot the time series. (Specifying E instead of T on the command line will use the older program ensum to generate mb_ files for plotting with sh_plotcrd.)
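The timestamps in the binary h-file names are calendar-based rather than day-of-year-based. To check which h-file corresponds to a given day number, a quick conversion (GNU date syntax assumed) is:

```shell
# Day-of-year 034 of 2000 is 3 February, matching h0002031200_scal.glx above
# (GNU date: add doy-1 days to January 1 of the year)
date -u -d "2000-01-01 +33 days" +%y%m%d    # prints 000203
```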

This task could be accomplished in individual steps by running htoglb directly with wild cards specifying the day directories, creating a .gdl file in gsoln/ using ls ../glbf/h*glx, running the program glred (rather than the script sh_glred) with this .gdl file, and running sh_plot_pos. In fact, for velocity solutions and time series using combined files, you will need to use this approach (see Chapter 4). The advantages of the script are threefold: 1) it requires fewer commands; 2) it allows easy daily combination of local h-files with those from an external processing center (e.g., MIT or SOPAC), using the LA, LB, and LC options (see Section 5.1); and 3) it allows easy aggregation of days into weekly or monthly combined h-files, using the -ncomb option (see the sh_glred help by typing the command without arguments).

You can mix using sh_glred with running glred directly as long as you keep in mind two differences: 1) sh_glred, like sh_gamit, is launched from the project directory but executes glred within the gsoln/ directory, whereas glred is launched directly from within the gsoln/ directory itself; and 2) sh_glred creates a .gdl file for each day, whereas running glred itself efficiently requires creating a single .gdl file containing the h-files for all of the days. In a .gdl file, the + symbol indicates to glred that h-files are to be combined before performing the solution. For example, using the file

../glbf/h1211021200_pan1.glx 1.0 +
../glbf/h1211021200_pan2.glx 1.0 +
../glbf/h1211021200_pcnw.glx 1.0 +
../glbf/nmt17125.e.glb 1.0
../glbf/h1211031200_pan1.glx 1.0 +
../glbf/h1211031200_pan2.glx 1.0 +
../glbf/h1211031200_pcnw.glx 1.0 +
../glbf/nmt17126.e.glb 1.0

glred would combine all four h-files for day 2012-11-02 before performing a solution, and then do the same for the four files for day 2012-11-03. (Running globk, rather than glred, would combine all of the files listed into a single solution whether or not the + symbols are present.)
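The single multi-day .gdl needed for running glred directly can be built from a one-line listing. This sketch creates dummy h-file names so it is self-contained; in practice you would list ../glbf/h*.glx from within gsoln/.

```shell
# Create dummy binary h-file names purely for illustration
mkdir -p glbf
touch glbf/h0002031200_scal.glx glbf/h0002041200_scal.glx glbf/h0002051200_scal.glx
# One h-file per line with scale factor 1.0; with no trailing "+",
# glred treats each line as a separate (daily) solution
ls glbf/h*.glx | awk '{print $0, "1.0"}' > scal_all.gdl
cat scal_all.gdl
```

To combine several h-files into one daily solution, append " +" to every line of the group except the last, as in the example file above.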

Other options of sh_glred specify the downloading of data from a remote archive (F), removal of old h-files from the glfpth directory (R), and compressing the h-files at the end of the run (C). Type sh_glred without arguments to see all options.

To avoid overwriting useful h-files or using obsolete ones, it is important to keep in mind the precedence rules of the script. For local data (sh_gamit day directories), specifying the H option will force htoglb to be rerun for all directories within the time span indicated, whether or not a binary file or link exists in the glbf/ directory. Omitting H will cause no new binary files to be created, so it is not possible to retranslate only a selected group of ascii h-files. This is not an important limitation, however, because htoglb runs quickly. For global ascii h-files downloaded from SOPAC, setting H will also force htoglb to be rerun on any ascii h-files present or linked (by LA) in the h-file (glbf/) directory, but you can safely set F since the script will not re-download any remote (ascii) h-files that are present.

Note that most of the shell scripts called by sh_gamit and sh_glred can be run stand-alone for specific processing tasks. The most useful of these are sh_make_rinex (templates for running teqc to translate raw data files), sh_get_nav, sh_get_rinex, sh_get_orbits, sh_sp3fit, sh_update_eop, sh_link_rinex, sh_oneway (to get sky plots from DPH files), and sh_get_hfiles. Type the name of the script without arguments to see the documentation.