Update import times documentation

Philip Sargent 2021-02-05 11:54:06 +00:00
parent 6e8421ddc5
commit 2e92a98fbc

@@ -38,8 +38,8 @@ Usage is 'python databaseReset.py <command> [runlabel]'
caves and logbooks must be run on an empty db before the others as they
set up db tables used by the others.
</pre></code>
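<p>For orientation, a typical sequence on an empty database might look like the sketch below. Only the <var>caves</var> and <var>logbooks</var> command names and the ordering rule come from the text above; the <var>myrun</var> runlabel is purely illustrative.
<code><pre>
# caves and logbooks must be run first, on an empty db,
# because they set up the tables the other importers use
python databaseReset.py caves myrun
python databaseReset.py logbooks myrun
# ...then the remaining commands from the list above, in any order
</pre></code>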
<p>On a clean computer with 16GB of memory and using sqlite a complete import takes about 10 minutes now if nothing else is running.
On the shared expo server it could take a couple of hours if the server was in use
<p>On a clean computer with 16GB of memory and using sqlite, a complete import now takes less than 2 minutes if nothing else is running.
On the shared expo server it takes longer if the server is in use
(we have only a share of it).
<p>Here is an example of the output after it runs, showing which options were used recently and how long
each option took (in seconds). <code><pre>
@@ -58,9 +58,9 @@ survexblks (s) 1153.1 - 3917.0 1464.1 1252.9
tunnel (s) - - 25.5 - 23.1
scans (s) - - 52.5 - 45.9
</pre></code>
[This data is from May 2020 immediately after troggle had been ported from python2 to python3 but before the survex import was re-engineered. It now takes ~600s in total.]
[This data is from May 2020, immediately after troggle had been ported from python2 to python3 but before the survex import was re-engineered. Since July 2020 a full reset has taken only ~80s.]
<p>The 'survexblks' option loaded all the survex files recursively following the <var>*include</var>
statements. It takes a long time if memory is low and the operating system has to page a lot. This has now been rewritten.
statements. It took a long time when memory was low and the operating system had to page a lot. This has now been rewritten so that all the inserts are batched within a single database transaction (see the sketch below).
<p>(That value of 0 seconds for QMs looks suspicious..)
<p>The file <var>import_profile.json</var> holds these historic times. Delete it to get
a clean slate.
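<p>The 'single database transaction' point above can be pictured with a short sketch. This is not the real troggle code: it only assumes a Django-style ORM, and the <var>SurvexBlock</var> model and <var>parse_survex_tree</var> helper are hypothetical names used for illustration.
<code><pre>
# Sketch only: batch the whole survex import inside one transaction,
# assuming a Django-style ORM; model and parser names are hypothetical.
from django.db import transaction

def import_all_survex(root_file):
    blocks = [SurvexBlock(name=b.name, legs=b.legs)
              for b in parse_survex_tree(root_file)]  # follows *include recursively
    with transaction.atomic():                  # one commit for the whole import
        SurvexBlock.objects.bulk_create(blocks) # batched INSERTs, not one per block
</pre></code>
<p>Committing once at the end, instead of once per survex block, is the design choice that removes most of the per-block database overhead described above.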