.. _intro_prod_combine:

Combining h-files
=================

There are several reasons why you may want to combine the h-files from your daily GAMIT processing with other h-files before generating a time series or velocity solution. If you have more than ~50 sites in a regional network, it is more efficient, and just as accurate, to process these in GAMIT networks of 30–50 sites and then combine them in GLOBK than to use a single large network (the GAMIT limit is 99 sites). Further, if you are processing a regional network and want to tie it rigorously to a larger regional or global reference frame, you can do so by combining your h-file(s) with those generated by MIT or SOPAC from their IGS processing, or with those from another analysis center's processing of regional continuous networks. Finally, to obtain more useful long-term statistics from your time series, to strengthen the reference frame for survey-mode observations, or to reduce the computational time for velocity solutions, you may wish first to combine the h-files from 5–30 days into a single h-file to be used in subsequent solutions. We discuss the pros and cons of these strategies in :numref:`intro_prod_weighting` and :numref:`intro_prod_refframe`, but their mechanical implementation is straightforward.

For processing of a single day, :program:`sh_glred` combines the GAMIT processing from multiple GNSS or multiple subnets using the :option:`-netext` option, and includes independently processed regional or global networks using the :option:`LA`, :option:`LB` or :option:`LC` options. Running :program:`sh_glred` for a single day with :option:`-globk_prefit COMB` specified will direct :program:`globk` to save a combined h-file for further processing. For multiple days, this option together with :option:`-ncomb` will aggregate the data over several days. :program:`sh_glred` also allows you to download and/or link in h-files from an external source automatically.
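The multi-day aggregation described above can be sketched as follows; the experiment name, dates and option list here are hypothetical and should be adapted to your own processing setup:

.. code-block:: console

   $ sh_glred -expt emed -s 2009 121 2009 130 -ncomb 10 -globk_prefit COMB -opt G T

In this sketch, :option:`-ncomb 10` aggregates the ten daily h-files in the span, and :option:`-globk_prefit COMB` directs :program:`globk` to save the combined h-file for use in subsequent solutions.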
If the :option:`F` option is specified, h-files implied by the :option:`-net` option will be downloaded from MIT or SOPAC into the primary h-file directory (specified by :content:`glfpth` in :file:`process.defaults` and nominally :file:`glbf/`). Alternatively, you can collect the h-files in advance in the directory specified by :content:`hfnd` in :file:`process.defaults` and then set the :option:`LA`, :option:`LB` and/or :option:`LC` options to link these files into :file:`glbf/`. :option:`LA` refers to ASCII h-files (requiring translation by :program:`htoglb`) of the form :file:`h{}{}?.{}{}` (e.g. SOPAC network files), :option:`LB` to binary h-files of the form :file:`h{}{}{}????_{}{}.gl?` (with :file:`.glx` given precedence over :file:`.glr`), and :option:`LC` to binary h-files of the form :file:`H{}{}{}_{}{}.GLX` (e.g. MIT combined h-files). For example, to use local processing in combination with MIT global h-files previously downloaded onto your system, the command to generate a 30-day time series for experiment "emed" would be

.. code-block:: console

   $ sh_glred -expt emed -s 2009 121 2009 150 -net MIT -opt H LC G T

Since :program:`sh_glred` will look for all binary h-files in the :file:`glbf/` directory, created previously when you generated the time series, you can leave out the :option:`-net`, :option:`H` and :option:`LA` options when combining the files. This feature is particularly useful if the days you want to aggregate are not contiguous (e.g. from a survey) and you have used the :option:`-local` option in the :program:`sh_glred` command that generated the time series. If you are combining data over a span long enough that the error in the *a priori* velocity of any of your sites causes a significant error in position, you should estimate velocities in the combination; in general, however, it is better to keep the span of the combination short enough that this is not an issue.
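Continuing the "emed" example, a sketch of the follow-up combination run, assuming the earlier time-series run has already populated :file:`glbf/` with binary h-files (the span and option list are illustrative, not prescriptive), might be:

.. code-block:: console

   $ sh_glred -expt emed -s 2009 121 2009 150 -ncomb 30 -globk_prefit COMB -opt G T

Here the :option:`-net`, :option:`H` and :option:`LA` options are omitted, since :program:`sh_glred` will pick up the binary h-files already present in :file:`glbf/`.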