<p>See the index of the full list of these scripts at <a href="scriptsother.html">Other Scripts</a>. This page documents only the subset which is not more fully documented elsewhere.
<p><ahref="../computing/folkupdate.html">Folk update</a> process produces a webpage listing all expo participants and it also runs some validation checks on the input file /folk/folk.csv . Troggle also directly imports folk.csv so that it knows who everyone is, but errors during the importing are not as easy to see as validation errors printed when running the <ahref="../computing/folkupdate.html">make-folklist.py</a> script.
<h4id="photos">Photos</a></h4>
<p><ahref="">updatephotos</a> (in the :loser: repo) uses the BINS package to generate photo albums. BINS uses the EXIF data (date, location if available) in the uploaded image files to produce a page showing the information available about the picture. All image meta-data are stored in XML files.
<p>BINS is no longer maintained by its author, so expo has taken on the responsibility for keeping it running. (Wookey is in the process of packaging it as a proper Debian package.)
<p>In Autumn 2023 we searched the EXIF data of our whole photo archive looking for geo-located photos. This found a few entrances which had been lost.
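<p>A rough sketch of that kind of scan is shown below (this is not the actual script that was used in 2023): walk a directory of photos and report any JPEG whose EXIF contains GPS data. It assumes the Pillow library and an illustrative path.
<pre><code>
# Rough sketch, assuming Pillow: report JPEGs whose EXIF contains GPS data.
import os
from PIL import Image

def has_gps(path):
    try:
        with Image.open(path) as img:
            return 34853 in img.getexif()  # 34853 (0x8825) is the GPSInfo IFD pointer
    except OSError:
        return False

for root, dirs, files in os.walk("/expofiles/photos"):   # illustrative path
    for name in files:
        full = os.path.join(root, name)
        if name.lower().endswith((".jpg", ".jpeg")) and has_gps(full):
            print(full)
</code></pre>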
<p>The script is <code><a href="">/troggle/photomap/pmap.py</a></code>, which currently generates a single file <var>photos_jpg.gpx</var> that can be imported into mapping software (such as GPSprune), but it does not yet hot-link to the cave description pages or to the photos themselves.
<p>We would want to add &lt;ele&gt; elements for elevation, and we could use GPX extensions to insert the URL information needed to make the waypoints clickable and more useful; see e.g. <a href="https://hikingguy.com/how-to-hike/what-is-a-gpx-file/">What is a GPX file</a> and <a href="https://www.mapsmarker.com/kb/user-guide/how-to-use-gpx-extensions-to-customize-tracks">GPX extensions</a>.
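<p>As a sketch of what an enriched waypoint could look like (this is not the current pmap.py output; the coordinates, elevation and URL below are made-up placeholders), note that plain GPX 1.1 already allows a &lt;link&gt; element inside a waypoint, so a URL can be attached even without vendor extensions:
<pre><code>
# Sketch only: build one GPX 1.1 waypoint with elevation and a clickable link.
# All values are made-up placeholders.
import xml.etree.ElementTree as ET

GPX = "http://www.topografix.com/GPX/1/1"
ET.register_namespace("", GPX)

gpx = ET.Element(f"{{{GPX}}}gpx", version="1.1", creator="pmap.py sketch")
wpt = ET.SubElement(gpx, f"{{{GPX}}}wpt", lat="47.6917", lon="13.8175")
ET.SubElement(wpt, f"{{{GPX}}}ele").text = "1620"          # elevation in metres
ET.SubElement(wpt, f"{{{GPX}}}name").text = "2023-07-14 photo"
link = ET.SubElement(wpt, f"{{{GPX}}}link",
                     href="https://expo.survex.com/photos/2023/example.jpg")
ET.SubElement(link, f"{{{GPX}}}text").text = "photo"

ET.ElementTree(gpx).write("example.gpx", xml_declaration=True, encoding="UTF-8")
</code></pre>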
<p>How these are used once produced is <a href="../bierbook.html">documented in the handbook</a>.
<p>These are LaTeX files, and the instructions for how to process them are in each .tex file. The protractors do not change, but the others need a configuration file listing all the cavers expected to attend expo.
<p>The .tex files are in :expoweb:/documents/. There is also a style file there, bierbook-style.sty, which is used by both the bierbook and the seshbook. Read the readme.txt file, which explains which LaTeX packages you need. Build like this:
The design of these files is intended to confine all year-to-year changes to the names.txt and dates.txt files, thanks to LaTeX's ability to read an external file and iterate through it line by line, performing the same action for each name.
<p>We use the OCaml program gpx2survex, but we now also have a Python equivalent, gpx2survex.py, which is used by make_svx2.sh. This is part of the make_essentials generation process.
<p>gpx2survex simplifies a track so that it is less voluminous.
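<p>For orientation, here is a much-simplified sketch of the gpx-to-survex direction. It is not the real gpx2survex or gpx2survex.py (which also simplify the track and handle coordinate systems properly), and the EPSG code is an assumption to check against the expo dataset:
<pre><code>
# Much-simplified sketch: read track points from a GPX file and emit one
# survex *fix per point on stdout. The output coordinate system is assumed.
import xml.etree.ElementTree as ET

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}
tree = ET.parse("track.gpx")

print("*cs LONG-LAT")
print("*cs out EPSG:31255")   # assumed CRS; check what the expo dataset actually uses
for i, pt in enumerate(tree.findall(".//gpx:trkpt", NS)):
    ele = pt.findtext("gpx:ele", default="0", namespaces=NS)
    print(f"*fix t{i} {pt.get('lon')} {pt.get('lat')} {ele}")
</code></pre>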
<p>For the reverse process we don't need a script. For svx-to-gpx we can use <var>survexport</var>. Olly says [2022]: "you shouldn't need to mess around with undocumented scripts - since 2018, you can just do:
<p>These are indexes to passage names and locations in the very extensive cave descriptions for Kaninchenhohle and Steinbruckenhohle. We may need this again for Tunnocks/Balkonhohle.
<p>Writes out legs and entrances in JSON format. It lives in :loser:/fixedpts/ (along with <em>make-fb-map.pl</em>, which does Austrian coordinate transformations).
Also in the :loser:/fixedpts/scripts/convert_shx/ folder is a short 135-line script, convert_shx.ml, written in
<a href="https://en.wikipedia.org/wiki/OCaml">OCaml</a>, which constructs input to the
<a href="https://gdal.org/programs/ogr2ogr.html">ogr2ogr</a> GIS feature file format conversion utility.
<p>The documented workflow today does not appear to need this, but that might be a documentation fault. It might do a key job. <span style="color:red">[to be investigated]</span>
<p>This runs "cavern" (a commandline tool installed as part of survex) to produce a text (or HTML)
report of the key statistics from the master svx file for a cave
(the one that includes all the svx files for the individual passages).
It is unclear who uses this or for what. It may be the script that generates the input data used by <a href="#tabular">caves-tabular.html</a>.
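<p>The core of what it does is straightforward. A hedged sketch of the same idea in Python is shown below; the cave path is illustrative, and the exact options used by summarizecave.sh are not assumed here:
<pre><code>
# Sketch of the idea behind summarizecave.sh: run Survex's "cavern" on the
# master .svx file for a cave and keep the statistics report it prints.
import subprocess

result = subprocess.run(
    ["cavern", "--output=/tmp/cave", "caves-1623/264/264.svx"],   # illustrative path
    capture_output=True, text=True, check=True)
print(result.stdout)   # includes total survey length, depth range, etc.
</code></pre>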
<h4id="tabular">caves-tabular.html</h4>
<p>This webpage, <a href="../../scripts/caves-tabular.html">caves-tabular.html</a>, uses a page-specific JavaScript file, TableSort.js, which allows the user to re-sort the table of all the cave data by any of the columns by clicking on it [by Radost]. The exact source of the data in the table is undocumented, but it presumably comes from a cavern .3d file export at an unknown date. This may be the data generated by <a href="#summ">summarizecave.sh</a>.
<h4id="prosp">make-prospectingguide-new.py and prospecting_guide_short.py</h4>
<p>These scripts work on Uebersicht_2011.svx, doing conversions on a dataset generated from CaveRenderer. The documented workflow today does not appear to need this, but that might be a documentation fault. It might do a key job.
<span style="color:red">[to be investigated]</span>
<p>Obsolete. Traced all the svx file dependencies via the *include statements.
It produced a file used by <a href="#makefile">makefile</a> above.
The troggle <var>parsers/survex.py</var> code now (since 2020) produces an indented list of the current *include tree in a file in the /troggle/ folder whenever the survex files are imported by <var>databaseReset.py</var>.
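<p>For reference, the tracing itself is simple. The following is a rough sketch only (neither the obsolete script nor troggle's parser), and "all.svx" is just an example top-level file:
<pre><code>
# Rough sketch: recursively print the *include tree of a survex dataset.
import re
from pathlib import Path

INCLUDE_RE = re.compile(r'^\s*\*include\s+(\S+)', re.IGNORECASE)

def trace(svx_path, depth=0):
    svx_path = Path(svx_path)
    print("  " * depth + str(svx_path))
    try:
        text = svx_path.read_text(errors="replace")
    except OSError:
        print("  " * (depth + 1) + "(missing)")
        return
    for line in text.splitlines():
        m = INCLUDE_RE.match(line)
        if m:
            child = svx_path.parent / m.group(1).strip('"')
            if not child.suffix:           # *include may omit the .svx extension
                child = child.with_suffix(".svx")
            trace(child, depth + 1)

trace("all.svx")   # example top-level file
</code></pre>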