1913 Commits

Author SHA1 Message Date
a0f85454f8 detecting orphan cave ids and adding to pending list 2023-08-02 18:23:04 +03:00
c76c09fced remove excess prints 2023-08-02 15:48:36 +03:00
efa40b47ca change of id for this class of data issue 2023-08-02 15:47:24 +03:00
6bca75b877 NOEDIT fixed 2023-08-02 13:41:12 +03:00
585eb534a9 Entrance locations explanations 2023-08-02 10:17:48 +03:00
7fc058b1da return to correct wallet 2023-07-31 22:00:46 +03:00
8d9b320d89 filesize now shown 2023-07-31 16:16:43 +03:00
89c1c65340 hack wallet scan rename job 2023-07-31 15:49:54 +03:00
5f07f234ef electronic surveys still need notes 2023-07-29 18:21:07 +03:00
af6081e406 better sort order for issues 2023-07-29 18:11:19 +03:00
1165b10fe4 Clearer issues message 2023-07-29 17:21:27 +03:00
4a7c14f8dc remove unused page 2023-07-27 14:40:52 +03:00
955fe9661a fix for notice on edit cave form 2023-07-27 12:21:50 +03:00
d93133c338 ambiguous aliases removed more thoroughly 2023-07-27 00:38:47 +03:00
Martin Green
a86f251423 Fixed parsing and rendering of an entrance's last visited field 2023-07-26 20:40:30 +01:00
6482aa697f helpful comments on form 2023-07-26 17:38:19 +03:00
0706d5dc77 fix entrance edit bug 2023-07-26 16:54:37 +03:00
cdac10fdcf <figure> for folk bios 2023-07-26 15:03:26 +03:00
1f656b2101 properly turn off automagic entrance creation 2023-07-26 01:23:49 +03:00
fab7adf079 Cleaning up entrance importing robustly 2023-07-26 00:14:46 +03:00
72a6b091e6 make more robust 2023-07-25 22:14:13 +03:00
af552a3d62 better detect unknown cave identifier strings from users 2023-07-25 21:07:13 +03:00
5ce21564fc Remove unused field on survexstation class 2023-07-25 18:56:13 +03:00
7d4ca5dae2 Make robust against duplicate objects 2023-07-25 18:55:42 +03:00
3c78ab79ca better fix for variant date formats 2023-07-25 01:34:02 +03:00
748cb91a20 lengthen url field, i hope 2023-07-25 01:34:02 +03:00
Martin Green
8463f8947e Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-24 23:03:47 +01:00
Martin Green
380565c6f5 save connection between cave and entrances, before writing out the cavedata file 2023-07-24 23:03:12 +01:00
31c815eeb0 bugfix in error message 2023-07-24 17:33:39 +03:00
c31615b1ff clean up after Mark changed things a bit 2023-07-24 15:26:36 +03:00
3bd7684d4b formatting terminal o/p 2023-07-24 14:24:53 +03:00
64fa602a07 ignore /subsections/ files 2023-07-24 14:24:39 +03:00
9f2b77bf1d More compact parsing report to terminal 2023-07-24 13:14:42 +03:00
9473b22bd9 detect non-ISO dates in JSON and from user and fix 2023-07-23 23:30:19 +03:00
3ea014ec57 add new *team role 2023-07-23 22:01:01 +03:00
3ffb07371d warning message to people who have got here before they are ready, and link to docm 2023-07-22 22:26:50 +03:00
f76c22e843 moving dev configs to _deploy/ 2023-07-17 19:25:40 +03:00
33d279b95a rearrange variant debian, ubuntu dev environments 2023-07-17 19:05:22 +03:00
f8faf6c2ba updated build stuff 2023-07-17 19:00:56 +03:00
6e452b2ee9 current settings on server - sanitized 2023-07-17 17:40:29 +03:00
dc06744444 fix exception? 2023-07-14 11:26:47 +02:00
26e96839e4 stop crash on loading wallet with blank caveid 2023-07-14 11:22:48 +02:00
4e4824546a stop crash on live system 2023-07-14 11:21:49 +02:00
387bdf4f91 shorter msg 2023-07-14 11:20:57 +02:00
b650095588 calendar S T colour >10 per date 2023-07-14 11:13:06 +02:00
761a71930b hack to stop crash 2023-07-13 22:20:01 +02:00
127002d736 more entries 2023-07-13 21:36:48 +02:00
a062e9ea44 moving venv stuff to _deploy 2023-07-13 16:02:51 +02:00
Martin Green
380fe8cc32 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-13 12:35:00 +02:00
Martin Green
bdb5e3208b Rotate and reflect images, as specified in EXIF whilst uploading 2023-07-13 12:34:52 +02:00
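A minimal sketch of the EXIF-orientation handling described in the commit above, assuming the upload path uses Pillow; the function name and the JPEG output choice are illustrative, not taken from troggle:

    from io import BytesIO
    from PIL import Image, ImageOps

    def normalise_uploaded_image(raw_bytes: bytes) -> bytes:
        # Open the uploaded bytes and apply the EXIF Orientation tag,
        # so the pixels come out upright however the camera was held.
        im = Image.open(BytesIO(raw_bytes))
        im = ImageOps.exif_transpose(im)
        # JPEG cannot store an alpha channel, so convert before saving.
        im = im.convert("RGB")
        out = BytesIO()
        im.save(out, format="JPEG")
        return out.getvalue()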
Martin Green
c119c99308 removed print statement 2023-07-13 12:33:55 +02:00
Martin Green
2af4f4b10e ignore javascript dir that should be downloaded from the deployment server 2023-07-13 12:27:13 +02:00
Expo laptop Crowley
a5968121a0 add deprecated survex files to ignore list 2023-07-12 22:04:46 +02:00
5eb6ef4d31 more synonyms 2023-07-12 16:44:45 +02:00
Mark Shinwell
5f6359694d Update test to reflect change in loser repo 2023-07-12 14:12:40 +02:00
Martin Green
90a6eaa26d Updated requirements to match expo.survex.com 2023-07-10 17:42:29 +02:00
Martin Green
67361fa66c Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-10 17:00:56 +02:00
Martin Green
b27852c1f3 redirect to actual login page 2023-07-10 16:55:09 +02:00
Expo laptop Crowley
8ff438942d handle shortform months and days in svx file 2023-07-10 12:49:14 +02:00
Expo laptop Crowley
b3e2f34960 hack fix 2023-07-10 10:33:57 +02:00
Expo laptop Crowley
d0ccc46260 better test & msg 2023-07-10 10:03:23 +02:00
Expo laptop Crowley
5a7f197bcb hack to make server reset quickly 2023-07-10 09:24:11 +02:00
Expo laptop Crowley
389fb6c409 add month 2023-07-10 09:18:17 +02:00
Expo laptop Crowley
4d48dd4386 Logbook debugging 2023-07-09 11:30:50 +02:00
Expo on server
086537cb56 Added string to caveslugs so admin interface works. Added explanation of the rationale for cave slugs. 2023-07-08 23:45:32 +01:00
Expo on server
beab42323b Added CaveSlugs to Admin 2023-07-08 23:43:57 +01:00
Expo on server
3d43c0ec12 Allow for parent directories to be created when creating photos. 2023-07-08 22:42:06 +01:00
b1c5b03104 link to 2023 in menus 2023-07-08 17:13:57 +01:00
Martin Green
68724a0504 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-08 17:57:33 +02:00
Martin Green
3359889d97 Better attempt at creating a default url for caves created by svx files 2023-07-08 17:56:49 +02:00
2f24a7f7bb Barbie setup 2023-07-08 15:35:43 +01:00
Martin Green
66ee96cd63 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-05 21:11:04 +01:00
Martin Green
ad37a82713 git committing now allows files to be committed in different directories. 2023-07-05 21:10:05 +01:00
Martin Green
ffed6e3ba6 convert uploaded images to RGB so that they can be saved as jpg 2023-07-05 21:08:51 +01:00
Expo laptop Crowley
7b8e93cdb5 menus and logbook 2023-07-05 21:54:25 +02:00
Expo laptop Crowley
4158f5ba63 add 2023 2023-07-05 20:17:08 +02:00
Martin Green
a70cf6cad3 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-05 18:27:49 +01:00
Martin Green
63599cb27a Removed ability to add other caves' entrances to a cave. 2023-07-05 18:22:08 +01:00
Martin Green
a0fcb78e95 Removed kataster area for unofficial numbers of pending caves. Added .html to urls of pending caves. 2023-07-05 18:21:15 +01:00
Expo laptop Crowley
745ccd7f88 Show when running locally 2023-07-05 19:05:07 +02:00
Martin Green
2b30b7b624 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-05 17:45:21 +01:00
Martin Green
410a285085 Bug fixes to allow adding of a new cave. No longer ask users about filenames or urls. 2023-07-05 17:43:57 +01:00
Expo laptop Crowley
d2bcef6e36 Turn menu bar magenta if running on localserver 2023-07-05 18:35:40 +02:00
Martin Green
067fcd9892 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-05 13:19:11 +01:00
Martin Green
c4095eb336 Have images saved to the correct place when edited in the cave or entrance view, etc. 2023-07-05 13:18:02 +01:00
Expo laptop Crowley
020a083afa more fix 2023-07-05 13:09:49 +02:00
Expo laptop Crowley
fd9f21de2e more local fixups 2023-07-05 12:49:30 +02:00
Expo laptop Crowley
7268eb4f30 synch localsettings variants 2023-07-05 12:11:44 +02:00
Expo laptop Crowley
910e0e3123 updating requirements for testing 2023-07-05 11:22:31 +02:00
Expo laptop Crowley
7db17154ad ugh stage problem 2023-07-05 11:22:31 +02:00
Expo laptop Crowley
ebcc0db665 ugh stage problem 2023-07-05 11:22:31 +02:00
Expo on server
73675ca1b9 Radost fix for narrow screens 2023-07-05 10:22:30 +01:00
Martin Green
973d05f9fb Modifications to templates to make them more suitable for a stand-alone website. 2023-07-03 21:35:00 +01:00
Martin Green
52299fb6fd Add links to entrances page 2023-07-03 09:32:34 +01:00
Martin Green
01964e7cf6 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-07-03 08:06:45 +01:00
Martin Green
929f6732d0 Add some sort of ordering 2023-07-03 08:04:26 +01:00
Martin Green
b91223da66 Add list of entrances 2023-07-03 08:00:03 +01:00
a04d9ef056 capitalisation fix 2023-06-28 20:37:06 +01:00
Martin Green
e4fe5eaf5d removed extraneous bracket from html 2023-06-08 01:10:07 +01:00
Martin Green
c7494fe379 bug fix 2023-06-07 23:49:19 +01:00
Martin Green
e4112431be bug fix 2023-06-07 23:47:14 +01:00
Martin Green
65eec8e91d attempt to use {% url %} tag to point at 3d file for cave viewer 2023-06-07 23:44:44 +01:00
Martin Green
0b0f2f07e1 Try having caveviewer using same url as download link again 2023-06-07 23:25:25 +01:00
Martin Green
ee34f87563 Replaced cave viewer link to 3d file with the working original 2023-06-07 23:20:51 +01:00
Martin Green
e42f0569fd Made the sources of the 3d download files consistent 2023-06-07 23:19:08 +01:00
Martin Green
d242a8bb1d Try using url template command to link to 3d files 2023-06-07 23:08:28 +01:00
Martin Green
0cd32d6a15 See what breaks when putting .3d on the end of urls for downloading 3d files 2023-06-07 22:58:28 +01:00
Martin Green
fc9977952e Changed source of cave 3d files back. 2023-06-07 22:47:06 +01:00
Martin Green
56e9273047 changed source of 3d files for cave viewer, to see if it works better 2023-06-07 22:45:36 +01:00
Martin Green
12cee59605 Try not showing kataster codes if they are empty/None 2023-06-07 22:38:22 +01:00
Martin Green
90862e9a89 Add editlink classes for editing the entrances 2023-06-07 22:26:11 +01:00
Martin Green
ec3ebe8499 Fixed comments again 2023-06-07 22:21:41 +01:00
Martin Green
1ef636ca6f fixed broken commment tags 2023-06-07 22:19:12 +01:00
Martin Green
174d7bfe13 Commented out Kataster status. Changed Survey heading to Surveys and Rigging Guides. Moved explorers and references down the page. 2023-06-07 22:08:11 +01:00
Martin Green
4e34ae0530 Reordered cave description and survey. 2023-06-07 21:54:25 +01:00
Expo on server
f477507d27 Ignore Windows Zone.identifier files
Ignore all of media/jslib except the readme
2023-06-04 11:17:48 +01:00
969ed6cce5 added 1627 caves to 'caves' page 2023-05-20 22:47:09 +03:00
2e6b8d44f1 adding BS4 because of Martin's KML stuff 2023-05-20 21:47:14 +03:00
93201ab458 better layout to understand apache interactions 2023-05-20 21:36:05 +03:00
Martin Green
43724f1cf6 When reloading a cave, do so from cave.filename, rather than incorrectly assuming the cave is named after its slug 2023-05-08 23:09:15 +01:00
Martin Green
7d140af87a Do not rename caves when saving 2023-05-08 22:34:45 +01:00
Martin Green
77ed1b444c Hack to turn cave URL to absolute links, so they can work from anywhere on the website. 2023-05-08 22:32:49 +01:00
Martin Green
f7fca58c57 Start of creating kmz file, with entrance photos and links to expo.survex.com 2023-05-08 01:10:43 +01:00
Martin Green
ea7c29a54e Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2023-05-02 22:41:36 +01:00
Martin Green
0052385333 Add lat long to entrance locations on cave pages 2023-05-02 22:40:44 +01:00
7f0e7d8fa3 Ubuntu 22.04 + python3.11 2023-05-02 23:26:53 +03:00
Philip Sargent (laptop)
ef467ad481 Making the dummy entrance thing clear 2023-05-02 17:42:58 +03:00
Philip Sargent (laptop)
175307773b More fixes for laptop 2023-05-02 15:28:28 +03:00
Philip Sargent (laptop)
8e982bd6be Fixing new dev machine setup 2023-05-02 14:27:55 +03:00
Martin Green
783154d24a Restored check of git status after a commit, but only for the one file being committed 2023-05-01 22:06:48 +01:00
Philip Sargent (laptop)
40126f6e2f fixing laptop SnowWhite 2023-05-01 23:29:20 +03:00
Martin Green
1ccd9c418b template missing from last commit 2023-05-01 00:03:58 +01:00
Martin Green
896af43994 kml file output for google earth etc 2023-05-01 00:01:41 +01:00
Martin Green
2ee63a9804 If a cave is made with no entrances, redirect to a new entrance form for the cave 2023-04-30 21:42:03 +01:00
Martin Green
c7b3e8aff0 Added an ordering of entrances 2023-04-30 21:41:17 +01:00
Martin Green
c50236575f Dummy entrances were being written to a global variable, recording that they have a filename. But the filename was not written. This was then causing an exception when the user asked to edit a cave, leading to all entrances being reread. Obviously the dummy entrances file could not be read.
PS. Do we really want to reread all entrances when we start editing a cave?
2023-04-30 21:06:57 +01:00
Martin Green
e8d1265ee4 Be more specific about what is being committed, and do not check for unadded changes and then throw an error at the user, which they are unlikely to understand. 2023-04-30 21:04:05 +01:00
Martin Green
10ff8a5aab Fixed errors when creating new caves and entrances via the website. Moved slug logic to forms (previously checked in). This allows slug uniqueness to be validated in the form 2023-04-30 19:05:57 +01:00
Martin Green
941100a8a3 Previously, if a cave was found without an entrance, an entrance was added to the database. However, as this was not also added as a file, the new entrance could not be edited via the web interface. Without an entrance being made, they can still be added. 2023-04-30 19:03:09 +01:00
Martin Green
374caa0d9a Fixed spelling error in variable name 2023-04-30 19:01:07 +01:00
Martin Green
82aaa2b523 Improved cave form, layout text and validations 2023-04-30 19:00:22 +01:00
Martin Green
31a60ce85c For a cave, force users to enter an area and have a unique URL. Cope better if area not defined 2023-04-30 18:58:41 +01:00
Martin Green
0a3a6934c4 Added some commented-out lines allowing for local development, where files are served from /usr/share/javascript or expofilescache 2023-04-30 18:54:53 +01:00
Martin Green
ed9f7b03bf Have entrance coordinates displayed on web pages. (Requires expo web entrance survey points to start with 1623 or 1626) 2023-04-30 00:59:58 +01:00
Expo on server
fc8584775e Get CaveView working again on cave pages. (v2.2.0)
Move CaveView section back to bottom of page.
2023-04-29 23:35:18 +01:00
Martin Green
b7ec4f2efe python virtual library 2023-04-29 22:49:26 +01:00
Martin Green
0b566575f3 Change uploaded images from png to jpg, for much smaller files 2023-04-29 22:49:26 +01:00
Expo on server
73af227fb3 remove unused copy of CaveView in troggle and updated cave.html
template for caveview2
2023-04-29 22:46:14 +01:00
8aec40f951 todo 2023-04-22 23:27:06 +01:00
c5a9bdc724 xml parser attempt retracted 2023-04-22 23:15:50 +01:00
30ef427b90 refactor cave import 2023-04-22 22:05:12 +01:00
275adc8efa mistaken default fixed 2023-04-22 03:46:34 +01:00
02e46ed723 Remove unneeded print 2023-04-22 03:35:07 +01:00
94b8b357fb Fix entrance edit too, saving slug now 2023-04-22 03:26:53 +01:00
2ed66fe3d0 edit cave reads from HTML file not just db 2023-04-22 01:24:32 +01:00
116cfc7c6e bad tests. 2023-04-06 00:57:19 +01:00
f15555a3bd more tests 2023-04-06 00:51:04 +01:00
dcfff1ede7 tests for QM pages 2023-04-06 00:16:44 +01:00
0f76422d85 format change 2023-04-05 23:33:01 +01:00
d3d983eedb QM check-box report for open leads 2023-04-05 23:13:12 +01:00
2541766dd5 typo 2023-04-05 20:42:19 +01:00
e002a634ff remove obsolete setting, add PHOTOS_ROOT 2023-04-05 20:42:09 +01:00
489dd157b6 context processor documn link 2023-04-05 20:41:34 +01:00
0a76acd664 test checked 2023-04-05 12:46:10 +01:00
4d8cb82ef9 initial refactoring 2023-03-31 12:19:22 +01:00
9ffe3f690b tidy up entranceSlug all now deleted 2023-03-28 20:30:00 +01:00
e7d9e9402a bugfixes 2023-03-28 20:05:38 +01:00
3ef5c1aa0b add FLUSH command 2023-03-28 19:51:20 +01:00
9a28e93ac6 EntranceSlug removed from data model. 2023-03-28 19:26:37 +01:00
5738da8566 cached_slug str now a .slug field & pending fixed 2023-03-28 19:08:05 +01:00
bbc13c4eb9 remove 'primary' concept from entrance slugs. tested. 2023-03-28 17:08:55 +01:00
6ca5d5bfa8 debug ent report 2023-03-28 15:37:25 +01:00
2091eb8e8b remove unused versionControlForm 2023-03-28 14:19:06 +01:00
860ce6b065 Creates new cave from survex file upload 2023-03-24 00:54:26 +00:00
8b5887a87a Allows creation of new svxfile with unknown cave 2023-03-23 21:26:16 +00:00
770edd6391 Survex editor now parses edited files 2023-03-23 19:05:25 +00:00
562ef48f19 re-parsing survex file when editing 2023-03-23 01:17:38 +00:00
70d6d9eb77 error msgs improved 2023-03-22 23:30:05 +00:00
2332bfdc57 comment to resolve confusion 2023-03-22 23:29:40 +00:00
1db7c5ffea change on_delete to cascade for blocks in a file 2023-03-22 23:29:12 +00:00
838b358a4a change template comment 2023-03-22 23:28:26 +00:00
dd32114698 delete test upload file after test 2023-03-22 17:57:48 +00:00
5f46d8fdc5 Fix bugs 2023-03-22 15:18:16 +00:00
ad48851118 re-enable login restriction on wallet create 2023-03-22 15:04:34 +00:00
ead3f8dc8b fix broken tick-lists in wallets 2023-03-21 18:23:07 +00:00
6b3cb9f02e samedate for all wallets on wallet edit form 2023-03-21 14:56:34 +00:00
435f3cf00b fixed url glitch, added parent url 2023-03-21 14:29:15 +00:00
7caf1690c6 fix download .3d file 2023-03-21 12:56:51 +00:00
4ae79a642e Report format clean-up 2023-03-19 19:12:09 +00:00
9489b9209d obviate meaningless error messages 2023-03-18 20:32:35 +00:00
5a90e7b727 fix more twiddly variations of QMs 2023-03-18 03:03:06 +00:00
d64948749e more QMs parsed correctly 2023-03-18 00:57:40 +00:00
7345e3a328 Much QM re-engineering 2023-03-17 20:01:52 +00:00
de54576d11 Fix QMs reports, remove V grade 2023-03-17 14:33:30 +00:00
12c7b9b5d1 + ? grades 2023-03-16 21:55:14 +00:00
c40b56844f TICK shown 2023-03-16 21:38:03 +00:00
09f5741b71 cleaner formatting 2023-03-16 21:31:09 +00:00
17fa306b09 More QM fields in JSON export 2023-03-16 21:26:31 +00:00
3011e7b114 Adding QM JSON export 2023-03-16 21:06:52 +00:00
98066591da More archaic URLs re-enabled 2023-03-15 14:22:21 +00:00
56c78611fb enable /smkridge/ as /1623/ url 2023-03-15 13:58:09 +00:00
a7660c8ec7 QM report updating 2023-03-14 16:11:37 +00:00
934a19b879 updated .svx template 2023-03-14 16:10:57 +00:00
c247636c4c Working on QMn TICK lines again 2023-03-14 03:27:05 +00:00
85fab88ac9 Fixing inherited *date into sub-blocks 2023-03-14 02:12:28 +00:00
b428a87f1a Better debug messages 2023-03-13 20:27:27 +00:00
d0a05af9c6 Now inheriting *date from parent survexblock 2023-03-13 19:44:39 +00:00
83dc9e6c09 Move code closer to related code 2023-03-13 19:01:30 +00:00
7aeffff00c Better error msgs for poorly formatted QMs 2023-03-13 16:31:42 +00:00
94dd0fe1fd documn. url reverse() tests 2023-03-12 23:40:11 +00:00
45fcaac47d test reverse() function 2023-03-12 16:21:31 +00:00
35e9eb558d Identified survey length discrepancies 2023-03-12 01:09:17 +00:00
b88b142332 fixing bad splay detection 2023-03-12 00:35:37 +00:00
870b290726 add station to QM report 2023-03-11 21:14:35 +00:00
7b10aa0bdf WSL1 on thinkpad 2023-03-11 17:58:58 +00:00
4d66548920 Fix URL to survex file in menus 2023-03-11 00:47:13 +00:00
7c923842ca more comment into generated file 2023-03-11 00:11:56 +00:00
02970512c1 link to 204 wallets 2023-03-08 22:42:02 +00:00
2c36666d41 tidy test for git clean 2023-03-08 18:29:14 +00:00
b3d9eeecd2 fixing paths in settings to be Path() not str() 2023-03-08 18:24:57 +00:00
301fa1fce1 re-enable lookup() on survexblock objects (again) 2023-03-08 18:02:00 +00:00
e71939fe00 restore lookup() function 2023-03-06 22:30:36 +00:00
859a28b4ee 1979 logbook parsing now works 2023-03-06 22:30:07 +00:00
432759bfc1 Import new survexfile ONLY 2023-03-06 16:37:54 +00:00
94e145adce CASCADE fixes in data model 2023-03-06 16:37:38 +00:00
ccfc44a423 Saving new survex file parses contents. 2023-03-06 04:52:41 +00:00
8f3b329552 obscure bug: empty directory inside empty directory 2023-03-05 23:55:00 +00:00
63dde36389 100th test 2023-03-05 23:17:11 +00:00
8fc25de794 Initial attempts at saving edited survex file 2023-03-05 23:06:06 +00:00
d5887e8f99 bloody bugs 2023-03-05 19:53:12 +00:00
b086348d38 Bugfixes for new survex file 2023-03-05 19:09:28 +00:00
4a2106183a Bugfix for new survex file 2023-03-05 18:20:18 +00:00
06ac840dd0 Needed for django 4.2 admin pages 2023-03-05 17:46:01 +00:00
6ab7a340e2 Now 99 tests 2023-03-03 15:15:17 +00:00
1cb81cbb09 Duplicate survex files in lists all done properly 2023-02-28 18:52:04 +00:00
e8c824a396 More names in cross-references 2023-02-28 17:41:15 +00:00
dc03016dbe clean up de-duplication code 2023-02-28 16:18:29 +00:00
5067ef2c8c reformat template fragment 2023-02-28 16:18:13 +00:00
5c3927c25d all working, queries improved, date-ordered. 2023-02-27 22:23:24 +00:00
154722f765 logbook trips, and some wallets working. 2023-02-27 19:14:08 +00:00
6387de038b get events on same date. progress. 2023-02-27 16:42:08 +00:00
6de4fa66a2 Initial add of On This Day links to svx file page 2023-02-26 22:13:37 +00:00
394d94d5d6 Extended test 2023-02-24 23:12:44 +00:00
a3fc9a17ed New tests for parsing logbooks 2023-02-24 22:55:18 +00:00
3d38611e4f new test for logbook 2023-02-24 20:21:06 +00:00
d1dac92034 Now using HTTP status codes properly 2023-02-24 17:38:06 +00:00
bc9306fc1b str() needed with python 3.11.0rc1 & added try/except 2023-02-24 17:21:56 +00:00
4358e6440a remove unneeded warnings 2023-02-11 23:03:30 +00:00
6b5f048b4b Problems with venv script, ongoing.. 2023-02-11 23:03:12 +00:00
709b720be9 Update URLs to django documn version 2023-02-10 00:05:04 +00:00
19d9942676 Django 3.0 deprecations removed 2023-02-09 20:54:55 +00:00
0740f9ea5a More todos 2023-02-08 23:37:00 +00:00
28f87e82e9 page rename fix 2023-02-03 22:36:15 +00:00
175e71facf Many many bugs fixed for >1 svx file on a wallet 2023-02-03 22:19:51 +00:00
c1231ff451 refactored and most recent wallet link fixed 2023-02-03 17:13:29 +00:00
e70d51e52b Made JSON visible on wallet edit page 2023-02-03 11:34:38 +00:00
7f5bd2c17e docm. update 2023-02-03 11:34:24 +00:00
f7d91b5929 todo updates 2023-02-02 22:02:16 +00:00
e94a24bbd4 to-do lists updates 2023-02-02 21:50:40 +00:00
e0a198bac5 Added 'Notes not needed' checkbox 2023-02-02 17:39:56 +00:00
25e00e02b7 Complete set of todo strings on webpage 2023-02-02 16:15:13 +00:00
13f0c6f988 * prefix documented 2023-02-02 15:40:50 +00:00
9abfcdd091 re enable 2023-02-02 14:57:44 +00:00
18541de371 test file permissions 2023-02-02 14:51:20 +00:00
578f02db2d Better diagnostics from bad slug error 2023-02-02 11:19:46 +00:00
3b9695b4f3 spiders are asking for weird urls 2023-02-02 11:13:02 +00:00
b55bfb8868 more debug 2023-02-02 00:01:37 +00:00
3a348d5d1a more debug 2023-02-01 23:59:38 +00:00
5ed91e1c15 debug msg 2023-02-01 23:56:54 +00:00
8aa5a601e7 if db badly corrupt 2023-02-01 23:54:26 +00:00
9d1c0ac395 Setting wallet dates earlier in the import process 2023-02-01 23:43:05 +00:00
c7d88077ec renamed wallet form file 2023-02-01 21:58:48 +00:00
5798e8dcd5 make nav cope with missing wallets 2023-02-01 21:31:07 +00:00
5ae37eef82 Fix to not create an extra wallet 2023-02-01 19:31:29 +00:00
587ccff501 date handling now working for new wallet 2023-02-01 19:10:46 +00:00
8e51f3aff8 Dedicated 'Create' button for a new wallet 2023-02-01 17:21:33 +00:00
e38d29f33d menu change to go to latest blank wallet 2023-01-31 20:52:22 +00:00
3c7661836c New wallet goes to next available slot now 2023-01-31 20:28:39 +00:00
1f5b56a593 Wallet editor into separate file 2023-01-31 17:13:41 +00:00
57930b7aa5 Show survey length per survex file 2023-01-31 14:49:54 +00:00
76ed801077 Minor bulk create tweak 2023-01-31 01:37:00 +00:00
241dde3a84 Bulk_create working for team on a survexblock 2023-01-31 00:39:30 +00:00
b98d7d246c bug fixes renaming 2023-01-30 23:22:28 +00:00
3aca0d0c76 ruff cleanup imports, bigly. 2023-01-30 23:04:11 +00:00
e35fccea5d Removed unused properties and moving 2023-01-30 22:27:17 +00:00
7808005498 ran 'black' to reformat all the core files 2023-01-30 19:04:36 +00:00
d06dd3d166 Updated all.svx documn 2023-01-30 19:00:05 +00:00
a12f666e96 Remove unneeded Class ExpeditionDay 2023-01-30 16:42:56 +00:00
ebe86d73d4 Renaming class - step 3 complete 2023-01-30 16:27:01 +00:00
b29ff61871 Renaming class step 2 2023-01-30 16:18:19 +00:00
58f7cf72d4 renaming confusing Class, step 1 2023-01-30 16:07:44 +00:00
3742e0f367 fixing Sunday display on calendar 2023-01-30 15:28:11 +00:00
7d98980121 Fixing dates on expedition table 2023-01-29 22:11:00 +00:00
89d0e1723e fixing cyclic import problem 2023-01-29 21:45:51 +00:00
226cbb6b5c Fixed coloured T and S on expo calendar 2023-01-29 20:59:56 +00:00
a1c34f488d bugfix from rearranged model files 2023-01-29 18:17:43 +00:00
79bccade02 separated out ent locations from survex parser 2023-01-29 17:03:50 +00:00
5639435058 Moved Logbooks to separate model file 2023-01-29 16:47:46 +00:00
432be660a4 separated out Model for wallet to its own file 2023-01-29 16:23:58 +00:00
f73640522b More public debug reports 2023-01-29 12:41:50 +00:00
d9d4181dda Add new survex import report 2023-01-29 12:11:47 +00:00
071f68080c Inherit *team to anonymous survex blocks 2023-01-29 01:30:10 +00:00
03fa5f5548 compiling more regexes 2023-01-28 21:17:16 +00:00
0d9d307490 refactor to reduce db hits and fix *team 2023-01-28 21:00:38 +00:00
e6fd1f0ec5 Collect dataissues and write to db all at once 2023-01-28 15:10:39 +00:00
2704fc42d4 faster db creation, safer file reading with 'with' 2023-01-28 14:04:32 +00:00
d9a4069662 moved aliases to people module, faster db too 2023-01-28 14:03:46 +00:00
9e71be8169 refactored, global removed 2023-01-28 13:14:54 +00:00
db0504057b no search before db object creation: faster 2023-01-28 11:45:30 +00:00
e4c804b305 refactoring author checks 2023-01-28 10:47:25 +00:00
e01bd39609 refactored logbooks parser 2023-01-27 23:21:07 +00:00
6565b3f9c4 refactored, faster deletion prior to loading 2023-01-27 17:41:10 +00:00
2fee216e80 Remove logdataissues from TROG 2023-01-27 17:24:31 +00:00
75834902f2 new report on therion file parsing 2023-01-26 23:36:56 +00:00
719e0fe918 therion issues report 2023-01-26 22:36:49 +00:00
a321625f35 fix number of entries expected 2023-01-26 21:52:56 +00:00
0c4ce6dc3c deleted old parser code 2023-01-26 21:33:17 +00:00
733765802e reformatted all old logbook formats 2023-01-26 21:33:06 +00:00
1be3a3892c ruff removed unused imports 2023-01-19 21:34:09 +00:00
89b0c0862e ran ruff to remove unused imports 2023-01-19 21:20:47 +00:00
ba2ae6cd82 reformatted using black 2023-01-19 21:18:42 +00:00
0f8fe0e290 isort reordering of import statements 2023-01-19 20:47:26 +00:00
be9fcc522a sort imports using isort. tested. 2023-01-19 18:35:56 +00:00
939d3970aa sort imports using isort. tested. 2023-01-19 18:33:04 +00:00
e5a9330a91 reordering import statements using isort 2023-01-19 18:30:05 +00:00
1b70ccea3e consistent tests with logbooks junk removal 2023-01-16 19:52:05 +00:00
822965ebe5 remove %s old formatting style (last bits) 2023-01-02 22:26:33 +00:00
7738b2836e removed temp files 2022-12-29 16:26:25 +00:00
1ab7528f7b Clean up unused templates. All checked. 2022-12-29 15:07:58 +00:00
de74cd4867 tidy up 2022-12-29 13:56:46 +00:00
9dc1853e10 cope with bad wallet name 2022-12-29 13:56:09 +00:00
78740a1fc0 remove logbooksearch, use site-wide search on server 2022-12-29 13:54:38 +00:00
b131e567b5 logbooksearch ? 2022-12-29 13:52:37 +00:00
5bbb363f12 Adding docstrings, deleting unused code 2022-12-23 23:32:59 +00:00
0e29cdd48c bugfix 2022-12-23 22:18:17 +00:00
8374500da5 Put colour bar on wallet editor 2022-12-23 22:14:00 +00:00
194470841e logbooks correct numbers of entries expected 2022-12-23 22:13:43 +00:00
a71c616afd update deprecations etc 2022-12-23 22:13:11 +00:00
9c39c9dcff docstrings added 2022-12-23 00:49:22 +00:00
a72c2bd96a Using Path() wherever possible in localsettings.py 2022-12-22 16:04:16 +00:00
f23764c486 diagnosing missing entrance file 2022-12-22 00:56:46 +00:00
a7a126dd55 Formatting wallets tables 2022-12-21 22:10:55 +00:00
d06af5b0ec rearrange control panel 2022-12-21 15:29:57 +00:00
ec040824f6 fix frontmatter/endmatter 2022-12-21 02:05:26 +00:00
517da57a0c All broken logbooks now parsing correctly. 2022-12-20 23:48:56 +00:00
5ee3ebad3e converting 1988 logbook 2022-12-20 21:53:56 +00:00
05df2e084c converting 1987 logbook 2022-12-20 19:59:36 +00:00
dc3a61addd convert old logbooks to modern format 2022-12-20 16:38:32 +00:00
9169abdb05 2019 blog edits 2022-12-20 15:18:07 +00:00
81e95291e8 More surveyors in list 2022-12-20 12:48:39 +00:00
f24f283a07 attempted speedup, explicit dates wallet objects 2022-12-20 00:07:55 +00:00
bb14c94ab1 Updates to make 2018 blog merge work (faster) 2022-12-19 20:13:26 +00:00
7e9bb73777 Vital fix to stop parsing terminating too early 2022-12-19 11:38:34 +00:00
43a98b4421 Not quite getting all the blog post contents 2022-12-19 00:33:32 +00:00
f1d5df9933 fix author display for logbook entry 2022-12-18 21:20:30 +00:00
8ce86aabee strip spaces from titles 2022-12-18 20:36:11 +00:00
d1b94763b4 Fixing wiki-parsing for 2009 logbook 2022-12-18 19:33:56 +00:00
73b710d53f fix more logbook parsing 2022-12-17 17:05:55 +00:00
0a4471e039 Fixed round-trip import-export-import bugs 2022-12-17 03:02:08 +00:00
f80e4efed8 parse several UK caving blogs per year - working 2022-12-16 19:57:56 +00:00
5e9fd7fd77 bug in python 3.9? 2022-12-15 01:06:54 +00:00
5cc6c26606 blog parsing working 2022-12-15 00:35:48 +00:00
cb50528e2d exptl parse UK Caving blog 2022-12-14 23:46:14 +00:00
6dd8e5a75c ignore soft links 2022-12-10 17:11:39 +00:00
0e47909704 tidy and comments 2022-12-10 13:00:57 +00:00
cabcada0b8 2003 logbook export/re-import, as it is now in HTML format 2022-12-09 23:45:07 +00:00
17b2b7b89c fix comments and 2019 LB parsing 2022-12-07 18:22:09 +00:00
1eab261b30 fix bugs made visible by py 3.11 2022-11-23 21:59:42 +00:00
Philip Sargent
b06d1dae42 Convert .format() to f-strings with flynt 2022-11-23 10:48:39 +00:00
Philip Sargent
45a640dfe9 Convert .format() to f-strings 2022-11-23 10:41:14 +00:00
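For context on the two flynt commits above, this is the kind of mechanical rewrite involved; the variable names are illustrative, not from troggle:

    wallet_name, scans = "2023#01", ["plan1.jpg", "elev1.jpg"]

    # Before: old-style str.format() call
    old = "wallet {} has {} scans".format(wallet_name, len(scans))

    # After: the equivalent f-string that flynt produces
    new = f"wallet {wallet_name} has {len(scans)} scans"

    assert old == new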
Philip Sargent
ca9a3b5c7b tidy up 2022-11-23 10:36:02 +00:00
Philip Sargent
0b32d51ba3 bugfix 2022-11-23 00:49:47 +00:00
Philip Sargent
1a9e17a7e8 remove entry-type and tidy cache bits 2022-11-23 00:36:44 +00:00
Philip Sargent
995df16bec bugfix length declaration 2022-11-21 16:52:10 +00:00
Philip Sargent
259f85742a moved parser settings 2022-11-21 16:47:25 +00:00
Philip Sargent
a795707552 cache tidy and move settings into parser 2022-11-21 16:41:52 +00:00
Philip Sargent
bcb61f9cd9 remove cache and setdatesfromlogbook 2022-11-21 16:26:30 +00:00
Philip Sargent
4260b0f092 Removed misguided ObjStore as we do need multiuser db 2022-11-21 00:04:33 +00:00
Philip Sargent
4514eda311 make compatible with python 3.11 as well as 3.10 2022-11-18 20:42:03 +00:00
Philip Sargent
725c5ad0cd Updating comments 2022-11-17 01:24:39 +00:00
Philip Sargent
0b89979418 bug found by python 3.11 2022-11-15 23:56:35 +00:00
Philip Sargent
96281c33e8 updating venv generation 2022-11-15 23:56:17 +00:00
Philip Sargent
43bf2620f1 dump3d --legs future possibility 2022-11-15 22:25:39 +00:00
5b3f91d3e5 bugfixes 2022-10-27 17:23:41 +01:00
73e57a19df fixes for WSL1 on new machine 2022-10-27 16:29:11 +01:00
275ce87e30 OS config for a new dev machine 2022-10-27 12:30:59 +01:00
Philip Sargent
d9ed90b729 update for old PC settings 2022-10-26 00:20:47 +01:00
Philip Sargent
d82c521f4f direct link to lower table 2022-10-19 14:00:08 +03:00
Philip Sargent
2cafa32c7e links to other years wallets 2022-10-18 23:28:38 +03:00
Philip Sargent
830150ade6 Making page templates autoadjust to the current year 2022-10-15 21:28:56 +03:00
Philip Sargent
55ac98ebe1 Add survex trips to logbook mentions 2022-10-15 19:33:30 +03:00
Philip Sargent
454c2c2830 Fix links to scans directly from the master drawings list 2022-10-15 18:37:46 +03:00
Philip Sargent
2fa298cae9 cures weird bug 2022-10-15 17:26:09 +03:00
Philip Sargent
3b106a3803 fix PCTEXT better in displaying tunnel files 2022-10-15 17:25:41 +03:00
Philip Sargent
da09bc7968 Render tunnel files as XML in webpage, not just text 2022-10-15 14:07:15 +03:00
Philip Sargent
e0ac09d5ec add 'lastvisit' field to entrances, for the date 2022-10-12 23:12:55 +03:00
Philip Sargent
45f06293f5 Add 'foreign friends' to names alias list 2022-10-12 23:10:48 +03:00
Philip Sargent
004a3e2db1 comment out hr and top heading, never needed and disrupt layout 2022-10-12 23:09:58 +03:00
Philip Sargent
b81b4ef2ef Add date of update to the generated html file 2022-10-12 22:09:58 +03:00
Philip Sargent
52c0ab213a Person twiddles 2022-10-11 21:47:18 +03:00
Philip Sargent
c3bfd50cf1 Update link to new person wallets on person page 2022-10-11 21:33:13 +03:00
Philip Sargent
39683cc413 Tidy and docum of cave entry code 2022-10-11 21:01:02 +03:00
Philip Sargent
47e2c6e214 more text on aliases page 2022-10-10 22:12:42 +03:00
Philip Sargent
ff8eaa241e *team parsing much improved. Copes with everything now. 2022-10-10 15:40:21 +03:00
Philip Sargent
52a035e4cf more fixes 2022-10-10 01:08:37 +03:00
Philip Sargent
8c8b6966a7 sort output, accept more comma use 2022-10-10 00:28:57 +03:00
Philip Sargent
861980a8e9 More fixes to name resolution checking 2022-10-09 23:50:32 +03:00
Philip Sargent
3c31c333f2 Widen the recognizer capabilities for names 2022-10-09 02:32:34 +03:00
Expo on server
235bd86af3 ./pre-run.sh does not fixup git work 2022-10-09 00:32:26 +01:00
Philip Sargent
e6ca20b1ed new report to make aliases visible 2022-10-09 00:17:53 +03:00
Philip Sargent
b470ab66e2 Now tests that loser repo is clean and survex runs on 1623.svx and 1626.svx 2022-10-08 20:43:01 +03:00
Philip Sargent
e9790e70d6 abbrv. names now accepted when parsing logbooks, survex 2022-10-08 01:52:10 +03:00
Philip Sargent
55bc042798 bugfixes 2022-10-08 00:48:21 +03:00
Philip Sargent
4e9680a3ad big changes to cope with survexblock not yet dated, no *date yet 2022-10-07 23:48:41 +03:00
Philip Sargent
bec262bb2d comments 2022-10-07 23:47:45 +03:00
Philip Sargent
74b3147076 fix for running troggle not on master git server 2022-10-07 23:47:30 +03:00
Philip Sargent
f51d1e114e small changes to name resolution 2022-10-07 23:47:05 +03:00
Philip Sargent
c76cd38d76 use generator when reading individual survex files too, saves another 6MB 2022-10-07 11:41:46 +03:00
Philip Sargent
b4c4f2aefc reduce mem use by 21.2MB by using a generator 2022-10-07 10:57:30 +03:00
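A sketch of the pattern these two generator commits point at: yield lines lazily instead of materialising every line of a survex file in a list; the function names are illustrative, not troggle's:

    def read_lines_eagerly(path):
        # readlines() builds the whole list in memory before parsing starts.
        with open(path, encoding="utf-8") as f:
            return f.readlines()

    def read_lines_lazily(path):
        # Generator: only one line is held at a time, so parsing a large
        # tree of svx files no longer keeps every line resident in memory.
        with open(path, encoding="utf-8") as f:
            for line in f:
                yield line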
Philip Sargent
d16226c879 bug patch for duplicate SurvexFile object 2022-10-06 21:55:31 +03:00
Philip Sargent
1f70b77735 fix crash when people put a list of cave ids in a wallet 2022-10-06 21:37:09 +03:00
Philip Sargent
4a34986598 error msg fix 2022-10-06 21:03:14 +03:00
Philip Sargent
70709c505c Fixing bugs and better error msgs 2022-10-06 21:02:15 +03:00
Philip Sargent
8d08a67302 strip spaces in pending caves list 2022-10-06 12:51:43 +03:00
Philip Sargent
29c5c82337 polishing outputs for importing unseen survex files 2022-10-05 23:18:11 +03:00
Philip Sargent
7e47fe1f30 Parse all files, not just those in the *include tree 2022-10-05 21:11:18 +03:00
Philip Sargent
9e5bdace2c make cache work even if timestamps wrong 2022-10-05 21:10:05 +03:00
Philip Sargent
a6e60c0bf7 working on loading all the non-*included svx files 2022-10-04 00:00:55 +03:00
Philip Sargent
c8163ab0cd fix bug for wallet with empty fpath 2022-10-03 21:18:35 +03:00
Philip Sargent
4495be2083 explanation of column headings 2022-10-03 20:35:23 +03:00
Philip Sargent
fe28d9ba39 Add survex files,wallets and same-day LBEs to the logbook entry 2022-09-27 23:59:25 +03:00
Philip Sargent
b60e1f2493 rename data issue tag 2022-09-27 21:37:45 +03:00
Philip Sargent
78a62a1551 error msg was incorrect. fix. 2022-09-26 00:18:41 +03:00
Philip Sargent
f0195682f2 Now shows cave id even if no *ref 2022-09-25 22:39:01 +03:00
Philip Sargent
2f64e2d4c1 Fix pending caves which had got a bit garbled 2022-09-25 21:43:00 +03:00
Philip Sargent
829e18baef Make scans in wallet subfolders clickable 2022-09-25 21:42:21 +03:00
Philip Sargent
760dbc588a Make scans in subfolders somewhat visible 2022-09-24 00:34:39 +03:00
Philip Sargent
f3ecdd6d87 Cleaner monitoring output when importing wallets and scans 2022-09-24 00:17:51 +03:00
Philip Sargent
6e3fdd35c1 Replaced maintenance headache with cleaner folder walking 2022-09-23 23:43:34 +03:00
Philip Sargent
c3672b476c test fixed to match new code 2022-09-23 21:23:15 +03:00
Philip Sargent
9d56e467cd fix broken upload form 2022-09-23 21:07:51 +03:00
Philip Sargent
97b0ce8c96 removing technical debt, replace convoluted code 2022-09-23 00:49:40 +03:00
Philip Sargent
aa20692ad6 bugfixes and enabling older wallets to be found and listed 2022-09-22 22:41:42 +03:00
Philip Sargent
af88cb4d0f Show list of folders in a wallet 2022-09-22 19:42:44 +03:00
Philip Sargent
b4cf2bac95 more complete list of caves linked to wallets 2022-09-22 01:37:25 +03:00
Philip Sargent
5c0835e076 create links from cave ids 2022-09-22 00:25:02 +03:00
Philip Sargent
e2b280ccdc fix mistaken error messages 2022-09-22 00:23:47 +03:00
Philip Sargent
1971f51b52 find more wallets than we thought we had 2022-09-22 00:23:22 +03:00
Philip Sargent
11b1d41a55 increase entries to 64 2022-09-22 00:22:09 +03:00
Philip Sargent
86ea33bbce Correct and validate JSON dates when they are read from file 2022-09-20 23:06:45 +03:00
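A minimal sketch of the kind of date validation and repair described in this commit and the one below it, assuming ISO dates are the target form; the helper name and the list of accepted variants are illustrative:

    from datetime import datetime

    def coerce_wallet_date(value: str):
        # Accept the ISO form first, then a few common malformed variants.
        value = value.strip()
        for fmt in ("%Y-%m-%d", "%Y.%m.%d", "%d/%m/%Y"):
            try:
                return datetime.strptime(value, fmt).date()
            except ValueError:
                continue
        raise ValueError(f"unrecognised date in wallet JSON: {value!r}")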
Philip Sargent
71bd07e70e Handling and fixing bad dates in JSON input 2022-09-20 22:52:31 +03:00
Philip Sargent
94b49adc4e added free text field to wallet JSON 2022-09-20 22:05:35 +03:00
Philip Sargent
36995ec051 handling survex files not linked in completely 2022-09-20 02:36:40 +03:00
Philip Sargent
61f9863a06 bug fixes and coping with a list of cave ids in JSON 2022-09-20 01:02:06 +03:00
Philip Sargent
47878d264b Make survex file more obvious in table 2022-09-19 23:10:09 +03:00
Philip Sargent
0611c3f00f documn improvements 2022-09-19 21:54:51 +03:00
Philip Sargent
13b57d2bb6 year list for wallets by year 2022-09-19 21:54:32 +03:00
Philip Sargent
2648bada30 remove URL field from wallet as seen and edited 2022-09-19 20:55:34 +03:00
Philip Sargent
d2c6c4d7fb find perps guilty of bad *ref behaviour 2022-09-18 23:53:04 +03:00
Philip Sargent
b5f8c5294e deptry detects unneeded dependencies 2022-09-18 20:28:23 +03:00
Philip Sargent
9cd009f8ba typo, bugfix 2022-09-16 23:22:45 +03:00
Philip Sargent
68865a80ef Fixing bad date parsing, better warning msgs 2022-09-16 22:54:22 +03:00
Philip Sargent
ddfc677a1e bugfix for missing dates 2022-09-16 21:26:03 +03:00
Philip Sargent
0ab3a4ff44 comments added 2022-09-15 22:55:45 +03:00
Philip Sargent
f12d0bd580 Remove unneeded code now svx files have been edited 2022-09-15 22:55:16 +03:00
Philip Sargent
e28f04a51c add link to Help on using wallets 2022-09-14 01:20:00 +03:00
Philip Sargent
9410dda69e List the logbook trips and other svx files of the same date 2022-09-14 00:31:37 +03:00
Philip Sargent
04696b7b80 better link text 2022-09-13 00:08:37 +03:00
Philip Sargent
a41cd8eb24 oops. breaking change. fixed. 2022-09-12 22:51:33 +03:00
Philip Sargent
29dc99c21f tidy bad git messages in tests 2022-09-12 22:47:31 +03:00
Philip Sargent
dfc903208e more useful links on wallets pages 2022-09-12 22:47:12 +03:00
Philip Sargent
beecb4b0ac remove redundant code 2022-09-12 22:46:45 +03:00
Philip Sargent
182df351b9 Extra useful links on wallets pages 2022-09-12 22:20:14 +03:00
Philip Sargent
fd57071411 Fixed test to undo side-effect git commit 2022-09-12 21:28:07 +03:00
Philip Sargent
785845598f catch crashes when no data has been imported 2022-09-12 20:50:57 +03:00
Philip Sargent
6452a7beed fix green block for survex files on table 2022-08-31 12:09:07 +03:00
Philip Sargent
5c667c1826 update dev environment 2022-08-31 09:28:23 +03:00
Philip Sargent
dc2b8ad431 fixes for crashes,svx files in wallets 2022-08-31 09:27:14 +03:00
Philip Sargent
3af1112847 fix wallet date from survexfile 2022-08-30 20:46:17 +03:00
Philip Sargent
0853bbdd19 Many fixes and speedups 2022-08-30 17:58:49 +03:00
Philip Sargent
6daa96b69e correcting output comment 2022-08-30 17:30:46 +03:00
Philip Sargent
9aaadafc13 populate with people 2022-08-25 17:29:57 +03:00
Philip Sargent
6c384492be fix missing .svx as not an error 2022-08-25 17:29:43 +03:00
Philip Sargent
ab184bccf3 comment updates 2022-08-25 16:31:38 +03:00
Philip Sargent
79672dd4b3 bugfix 2022-08-25 16:12:13 +03:00
Philip Sargent
760abe1a9e cope with swapped people/title 2022-08-25 15:54:00 +03:00
Philip Sargent
8f03e590cc update test to match new wallet json location 2022-08-25 14:38:14 +03:00
Expo on server
57c4732566 Fix erroring code in core/views/caves.py 2022-08-25 03:31:54 +01:00
Philip Sargent
17bbbd6eab get the survexfile path when editing a naked wallet 2022-08-24 19:08:08 +03:00
Philip Sargent
7e9fd0f353 Better display of wallet names copied from svx files 2022-08-24 18:28:15 +03:00
Philip Sargent
8ca50d8fd4 bugfix which was deleting metadata 2022-08-24 17:01:20 +03:00
Philip Sargent
43b6b590e8 fix bleed through of previous metadata onto another wallet 2022-08-24 16:22:15 +03:00
Philip Sargent
e98ffced98 better explanation text 2022-08-24 15:07:19 +03:00
Expo on server
d37bacb91a Clarify databasereset help on subfunction use 2022-08-24 12:42:01 +01:00
Philip Sargent
96b2c6c9ed fix url for cave edit page in error msg 2022-08-24 14:15:40 +03:00
Philip Sargent
b8cd8c4785 wallets form edits 2022-08-24 13:40:49 +03:00
Philip Sargent
a30a2b9ef9 bugfix 2022-08-20 09:45:28 +03:00
Philip Sargent
e195497829 better commit msg 2022-08-16 21:58:13 +03:00
Philip Sargent
03cda8a897 update prompt from 2019 to 2022 2022-08-16 21:21:03 +03:00
Philip Sargent
13e3da8d26 bugfix 2022-08-16 20:57:34 +03:00
Philip Sargent
ee7e3b6d41 make field longer 2022-08-16 20:54:27 +03:00
Philip Sargent
d05294adaf better, non-real prompt 2022-08-16 20:52:34 +03:00
Philip Sargent
f1aa6a9794 better hint for svx file name input 2022-08-16 20:47:24 +03:00
Philip Sargent
41c68aef26 detecting empty wallets where we only have JSON and no files 2022-08-16 20:02:28 +03:00
Philip Sargent
e94dc6eb6f remove more "None" text 2022-08-16 17:49:55 +03:00
Philip Sargent
aaba4fd2a9 minor refactoring 2022-08-16 17:42:37 +03:00
Philip Sargent
1a49e5347f cleaner table for None values 2022-08-16 17:42:13 +03:00
Philip Sargent
51f5261bfc bad bug in not clearing out previous data. fixed 2022-08-16 16:48:19 +03:00
Philip Sargent
b2d8b21822 update 2019 to 2022 2022-08-16 16:26:19 +03:00
Philip Sargent
b2a26be8c8 change photo to GPS guide: more useful 2022-08-16 15:28:27 +03:00
Philip Sargent
dcc36f3286 Docum for creating empty wallet 2022-08-16 14:19:25 +03:00
Philip Sargent
3c13f62bd1 re-fettled scan upload, creates Wallet object earlier 2022-08-14 23:40:56 +03:00
Philip Sargent
284e044a03 Fix wallets scan upload faults 2022-08-14 22:52:14 +03:00
Philip Sargent
b093d00ff4 match therion files to wallets, scan names therein 2022-08-14 01:47:53 +03:00
Philip Sargent
8e93680146 personrole is not just about roles 2022-08-13 23:57:57 +03:00
Philip Sargent
8fa25c815a fix apparent error when running on dev system 2022-08-13 23:57:37 +03:00
Philip Sargent
edfba8d355 git commit edits to wallet data 2022-08-13 23:56:56 +03:00
Philip Sargent
1eadc931cb Bugfix for git add for uploaded drawings 2022-08-13 21:14:57 +03:00
f3002a694d New troggle config for Barbie Xubuntu pink 2022-08-13 16:14:47 +01:00
Philip Sargent
5149cf1ece add test for renaming single photo 2022-08-11 23:44:19 +03:00
Philip Sargent
1bbfd1e517 bugfix in photo upload, untidy 2022-08-11 23:35:53 +03:00
Philip Sargent
e35616a611 look in filesystem not database for file location 2022-08-11 22:18:58 +03:00
Philip Sargent
a0a1927437 removed outdated text 2022-08-11 21:57:23 +03:00
Philip Sargent
3607b9f140 enable photo file rename 2022-08-11 21:19:52 +03:00
Philip Sargent
25c425cff8 QMs grayed out for survex files 2014 and earlier 2022-08-07 23:41:45 +03:00
Philip Sargent
7f335e082c prevent error message repetition in DataIssues 2022-08-07 23:26:31 +03:00
Philip Sargent
9220dbf2e6 bugfix 2022-08-07 22:52:29 +03:00
Philip Sargent
f33c6cc057 bugfix in new wallet 2022-08-06 22:23:39 +03:00
Philip Sargent
186eb20fb3 Make drawings repo the MASTER for contents.json 2022-08-06 21:27:36 +03:00
Philip Sargent
ac22a984ee missing from Barbie git 2022-08-06 20:02:17 +03:00
Philip Sargent
61c04a1fb9 removing links to most old-style wallets pages 2022-08-01 17:46:54 +03:00
Philip Sargent
0fd3cf43e8 formatting 2022-08-01 17:32:49 +03:00
Martin Green
c1aaf07885 Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2022-08-01 16:05:17 +02:00
Martin Green
f491264b9e Allow entrances to be edited with the correct parent url. Commit changes to caves when adding a new entrance. Order entrances alphabetically 2022-08-01 16:04:22 +02:00
Philip Sargent
eed35d01a8 tick lists now on 3 wallets reports 2022-08-01 15:55:20 +03:00
Philip Sargent
df42b1ccb3 remove debugging print 2022-08-01 03:10:07 +03:00
Philip Sargent
129ea3cc5b debugging ticklist 2022-08-01 02:50:19 +03:00
Martin Green
fa6758b9a0 edit entrances within the correct url parent 2022-07-31 21:29:17 +02:00
Philip Sargent
5da1fce41f tidy links 2022-07-31 19:33:14 +03:00
Philip Sargent
a2a5e9200e wallets per person - slow implementation 2022-07-31 18:58:46 +03:00
Martin Green
94252a94fe Edit caves in the same parent URL as the final file 2022-07-31 17:08:28 +02:00
Philip Sargent
c1ba6a39a5 Wallets by year and by cave 2022-07-31 01:02:02 +03:00
Philip Sargent
724234949f Populate blank wallet fields with survex data 2022-07-29 20:55:19 +03:00
Philip Sargent
bc3da1182b starting json population when we know the data 2022-07-29 17:49:07 +03:00
Philip Sargent
7872e98cb2 fixing pending caves system to be cleaner 2022-07-28 18:36:57 +03:00
Philip Sargent
c29e240c2b creating new wallet now copies and commits 2022-07-28 18:36:40 +03:00
Philip Sargent
93622b111f obscure bug fixed for lines ;*include 2022-07-28 15:15:11 +03:00
Philip Sargent
9a461c31a8 adding people 2022-07-28 02:37:44 +03:00
Philip Sargent
fea69c0371 Extend wallets by cave report 2022-07-28 01:48:22 +03:00
Philip Sargent
dd0fcc28dd update todo strings 2022-07-27 23:24:53 +03:00
Philip Sargent
3d7cb78e47 copies all wallet data to drawings repo as backup 2022-07-27 23:24:53 +03:00
Philip Sargent
1468c49723 remove unused SCANS_URL 2022-07-27 23:24:52 +03:00
Philip Sargent
c39fb30707 new urls and dummy functions and rename 2022-07-27 23:24:51 +03:00
Martin Green
91568b7151 Allow HTML to be blank and determine whether the new files need git committing. 2022-07-26 17:09:15 +01:00
Martin Green
7090bab632 closed form for search item in right hand menu 2022-07-26 15:56:31 +01:00
Philip Sargent
039792e320 improve reifying process 2022-07-25 15:03:58 +03:00
Philip Sargent
cddcb0e321 fix minor bug in setting dummy entrance 2022-07-25 11:31:43 +03:00
Philip Sargent
f9a7ba7927 alias as separate error type 2022-07-25 04:17:06 +03:00
Philip Sargent
7785843597 error msg when no .pos file 2022-07-25 04:10:28 +03:00
Expo on server
c5b9cd57f2 deleted transient file 2022-07-25 02:10:18 +01:00
Philip Sargent
3577d8cb68 big rewrite of cave alias lookup system 2022-07-25 02:58:13 +03:00
Philip Sargent
9c090f0383 test now matches model schema 2022-07-25 02:57:31 +03:00
Philip Sargent
476ee482fa Pending caves with 1626 properly 2022-07-25 02:57:00 +03:00
Philip Sargent
25d5361da4 1626 as well as 1623 2022-07-25 02:56:12 +03:00
Philip Sargent
de7388bdc5 GetCave:Lookup needs work 2022-07-24 21:38:14 +03:00
Philip Sargent
5007393536 better error handling 2022-07-23 20:05:58 +03:00
Philip Sargent
a1e538e93a making it work with 1626 no schonberg 2022-07-23 19:26:47 +03:00
Philip Sargent
7288bd9da3 add comment on point where it crashes 2022-07-22 13:42:04 +03:00
Philip Sargent
f194126fb5 better error description 2022-07-22 13:41:47 +03:00
Philip Sargent
3b1b96c4c8 Extend message field. Crashes MariaDB 2022-07-22 13:41:28 +03:00
Philip Sargent
31b912f3ca bugfixes 2022-07-22 12:40:42 +03:00
Philip Sargent
38eb65ac0e remove unused code 2022-07-22 11:23:00 +03:00
Philip Sargent
796dbf1438 more dead chicken waving 2022-07-21 21:52:10 +03:00
Philip Sargent
f46942fadf typo again 2022-07-21 21:04:49 +03:00
Philip Sargent
3a52d790f0 typo 2022-07-21 21:03:54 +03:00
Philip Sargent
ce7dfd6510 working on MariaDB crash on server 2022-07-21 21:01:57 +03:00
Philip Sargent
312ecdcfe1 need to see exception with all the renaming going on 2022-07-21 19:51:04 +03:00
Philip Sargent
c747664a26 Better links for 2022 2022-07-21 19:34:52 +03:00
Philip Sargent
931c33cfdb Clean DataIssues output, new therionrefs.log file 2022-07-21 19:01:04 +03:00
Philip Sargent
f895a7e44c wallets now sorted as well as deduplicated 2022-07-21 11:10:04 +03:00
Philip Sargent
5161fce32e remove duplicate lines, add heading link 2022-07-21 10:50:15 +03:00
Philip Sargent
8245ee103e wallets for just one cave 2022-07-21 10:32:11 +03:00
Philip Sargent
3e869ae76a fix variables needed for error message 2022-07-21 09:40:35 +03:00
Philip Sargent
dd0a448f90 fix 3d file download on cave page 2022-07-21 01:22:07 +03:00
Philip Sargent
3ab8a5d1ad layout tidy 2022-07-21 00:40:03 +03:00
Philip Sargent
83bbda7c40 bugfix 2022-07-21 00:07:52 +03:00
Philip Sargent
d058942084 revert, this was a bad idea 2022-07-20 21:04:59 +03:00
Philip Sargent
0a158db97d Trial QM tick mechanism in survex files 2022-07-20 20:47:29 +03:00
Philip Sargent
de37eea167 typo 290 instead of 291 2022-07-20 17:25:36 +03:00
Philip Sargent
a215ebd62c fix QM regex for number and rearrange url code 2022-07-20 17:02:38 +03:00
Philip Sargent
549c1649b4 QMs now have working url to survexfile & tick description 2022-07-20 14:44:56 +03:00
Philip Sargent
2a7f1506c9 track down url resolution error 2022-07-20 10:08:23 +03:00
Philip Sargent
9a395eafef raise exception in cave description rendering 2022-07-20 10:04:00 +03:00
Philip Sargent
2e14be61a2 bugfix 2022-07-19 20:54:46 +03:00
Philip Sargent
6883ff49a0 Add fields to QM model 2022-07-19 20:54:28 +03:00
Philip Sargent
d9d75b3aee another attempt to avoid non-null parent pseudo error 2022-07-19 20:00:35 +03:00
Philip Sargent
1395ac76e9 Attempt fix to MariaDB crash in databasereset 2022-07-19 19:48:11 +03:00
Philip Sargent
b79eb9a969 better error msg 2022-07-19 19:18:42 +03:00
Philip Sargent
23462df49c bugfix 2022-07-19 19:06:56 +03:00
Philip Sargent
3db9c16082 add Homecoming to QMs list 2022-07-18 19:19:30 +03:00
Philip Sargent
6ec7071ffc Fix display of expoyear 2022-07-18 19:19:06 +03:00
Philip Sargent
4efeefe6c9 git commit when editing survex files online 2022-07-18 18:42:21 +03:00
Philip Sargent
5b7c105c5f missed a bit of refactoring 2022-07-18 18:06:23 +03:00
Philip Sargent
dd00ff69aa refactor to put Martin's git stuff in utils 2022-07-18 17:37:22 +03:00
Philip Sargent
ee9b808461 moved writetrogglefile() to core.utils 2022-07-18 16:57:13 +03:00
Philip Sargent
8484f26ee9 Fix URL links and better name display 2022-07-18 16:16:58 +03:00
Philip Sargent
deec330990 test for loser git repo sanity 2022-07-18 15:46:57 +03:00
Philip Sargent
145540caf5 more weirdness in circumventing MariaDB/Django misunderstandings 2022-07-17 16:08:01 +03:00
Philip Sargent
b7035f1574 more helpful message 2022-07-17 16:07:27 +03:00
Philip Sargent
6efbec7750 Date and People checks 2022-07-17 15:41:05 +03:00
Philip Sargent
037a50cf47 read 1623/264 as 1623-264 etc 2022-07-17 15:28:20 +03:00
Philip Sargent
bb65ffaee6 bugfix 2022-07-17 15:22:26 +03:00
Philip Sargent
b20e6c5a58 Complaints now on wallet page 2022-07-17 15:01:53 +03:00
Philip Sargent
7c82c2d97c rename 2022-07-17 15:01:25 +03:00
Philip Sargent
810e058c07 new complaints messages for wallet 2022-07-17 15:00:37 +03:00
Philip Sargent
8aab01c126 cleaner zeroth cave setting 2022-07-15 16:44:02 +03:00
Philip Sargent
73e9ae54fa cleaner survexdirectory creation 2022-07-15 16:17:40 +03:00
Martin Green
d4c213e0b3 Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2022-07-15 14:06:43 +01:00
Martin Green
3748840e23 Removed extraneous characters where there is one unnumbered entrance. 2022-07-15 14:05:48 +01:00
Philip Sargent
02cf9b1c22 fixing Becka's complaint 2022-07-15 16:04:07 +03:00
Martin Green
ef27901125 Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2022-07-15 13:43:50 +01:00
Martin Green
a1560c60c6 Added kataster number to cave heading 2022-07-15 13:42:58 +01:00
Philip Sargent
b8355cbf8d fix bugs and tidyup in recent changes 2022-07-15 15:11:49 +03:00
Philip Sargent
86a18c3ebc catch nonUTF8 survex files, DataIssues url editor 2022-07-15 14:09:32 +03:00
Martin Green
5582d545a1 Allow QMs to be rendered to a string in the case that an optional parameter is missing 2022-07-11 23:29:59 +01:00
Martin Green
c416de6e1e Allow user to see a cave's edit link even if the user is not logged in, so that they can be prompted to log in if required. 2022-07-11 21:12:53 +01:00
Philip Sargent
91c9cf0c31 2019 logbook error fix 2022-07-09 01:55:11 +03:00
Philip Sargent
278a84a485 better error msg 2022-07-09 01:54:48 +03:00
Philip Sargent
8d31ab763d improve date parse error messages 2022-07-09 01:30:49 +03:00
Philip Sargent
a4b1c7b142 remove getqms 2022-07-08 22:20:29 +03:00
Philip Sargent
776f9f7833 remove getqms 2022-07-08 22:19:56 +03:00
Philip Sargent
9803ebe2e8 fix bug in Area creation 2022-07-08 22:19:07 +03:00
Philip Sargent
ca5586fc42 Report badly formatted ;QM lines 2022-07-08 20:08:42 +03:00
Philip Sargent
d3572e18c3 QM reports all working 2022-07-06 17:35:08 +03:00
Philip Sargent
7dc3cc3b91 fix for individual QM display from survex 2022-07-06 15:38:53 +03:00
Philip Sargent
ee4237b14c cleaner template pages 2022-07-06 13:44:40 +03:00
Philip Sargent
848043f7f4 Link to QMs on cave page 2022-07-06 11:39:19 +03:00
Philip Sargent
da4d7d6d5e shorten slug to fit 2022-07-06 11:10:50 +03:00
Philip Sargent
0ea3ed1ef2 Not quite so broken QMs from survex files 2022-07-05 22:40:58 +03:00
Philip Sargent
2bd617b543 Fixed QM report for survex-imported QMs 2022-07-05 20:24:51 +03:00
Philip Sargent
96101252bd Documn link 2022-07-05 17:40:31 +03:00
Philip Sargent
9d4a97fc19 Tidy HTML output 2022-07-05 17:02:43 +03:00
Philip Sargent
c9a33a4010 bug fix for forgotten href 2022-07-05 16:34:08 +03:00
Philip Sargent
87fd260051 import diagnostics 2022-07-05 16:30:42 +03:00
Philip Sargent
5d7d2b82b2 Fixing non-null error, even though it should be allowed 2022-07-05 15:57:49 +03:00
Philip Sargent
779afc2f2a QM report pages now not crashing, working.. 2022-07-05 15:38:23 +03:00
Philip Sargent
33eb91346c CaveView - how to install 2022-07-05 14:16:21 +03:00
Philip Sargent
a11541eb58 prefetch_related initial attempts 2022-07-05 14:14:03 +03:00
Martin Green
dbe6d10fff Revert "Added some test pages showing different ways pages could be edited. This probably wants removing soon"
This reverts commit 2af88353f3.
2022-06-28 00:24:57 +01:00
Martin Green
2af88353f3 Added some test pages showing different ways pages could be edited. This probably wants removing soon 2022-06-28 00:18:24 +01:00
Martin Green
82fe350493 Added HTMLarea widget to the edit entrance form 2022-06-27 00:34:08 +01:00
Martin Green
47d1662033 Added help writing HTML in the cave editing form. Made the HTML previews optional 2022-06-26 21:29:46 +01:00
Martin Green
4e5d8d1d76 Refactored code, with an aim of allowing more than one HTMLarea on a page 2022-06-26 18:29:20 +01:00
Martin Green
f1fcef2a6f Refactoring CodeMirror HTML editor, with an ultimate aim to make it reusable. However, more work is required... 2022-06-26 14:16:42 +01:00
Martin Green
8f0ea8ed82 Fix for adding images whilst editing, for where the t directory does not exist and at the root directory of expoweb 2022-06-26 11:20:14 +01:00
Martin Green
5fbe0b31c2 Require a login if public and check for CSRF cookies for uploading images 2022-06-26 01:15:00 +01:00
Martin Green
24a016e76a Fixed spelling of a variable name 2022-06-25 23:36:53 +01:00
Martin Green
5de88ce92d Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2022-06-25 23:19:28 +01:00
Martin Green
20583b04c0 Allowed user to select/upload images when editing. When uploaded, thumbnails and description pages are automatically created. Git committing can now handle multiple files at once. 2022-06-25 23:17:19 +01:00
Philip Sargent
2f1ba9cb54 install updates 2022-06-25 21:34:42 +03:00
Philip Sargent
451326789b updated packages and settings 2022-06-25 20:01:43 +03:00
Philip Sargent
859ae9d825 update pip inside venv 2022-06-25 20:00:19 +03:00
Philip Sargent
364a636fa0 Updated pip packages 2022-06-25 19:37:37 +03:00
Philip Sargent
293eb10ffd New OS and venv tool 2022-06-25 19:28:01 +03:00
Martin Green
b3d9e81499 Implement redirects after login (using the next parameter) 2022-06-25 16:13:02 +01:00
Martin Green
74a5125cf9 Allow for PosixPaths to work with GetListDir 2022-06-25 16:08:19 +01:00
Martin Green
d607b30953 Do not allow the main menu to be overridden; instead display the old menu at the bottom of the page. To do this the id of the main menu was changed from links to menulinks 2022-06-25 01:07:17 +01:00
Martin Green
abdea22899 Allow for slightly different wording in git output 2022-06-25 01:05:29 +01:00
Martin Green
48f82aaaca Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2022-06-24 21:58:21 +01:00
Martin Green
5ac2e24cc2 Removed jQuery 2022-06-24 21:58:00 +01:00
Philip Sargent
d6db942626 Useful datamanagement link 2022-06-24 23:18:10 +03:00
Martin Green
7db7c67065 Added link to parent cave in menu 2022-06-24 19:28:31 +01:00
Martin Green
54f47c58db Adding the latest stable jquery, which is used in the 161 names page, to replace the previous frameset implementation. 2022-06-24 17:09:46 +01:00
Martin Green
ceb6d2fef1 Added a sub menu when in the handbook directory. Increased width of edit preview 2022-06-24 15:48:35 +01:00
Martin Green
b38412b145 Added CodeMirror for the edit page, including some buttons to make HTML 2022-06-24 14:39:09 +01:00
Martin Green
ef68db080a Add git commit messages when editing via website. Make sure custom menus are not deleted. 2022-06-23 21:31:57 +01:00
Martin Green
97a9f2aae6 The editing system was accidentally deleting customised sidebar menus. This will stop that happening 2022-06-23 20:03:05 +01:00
Martin Green
2f42f488ab Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2022-06-23 19:02:10 +01:00
Martin Green
f1e800d8bf Move saving and committing code to a separate library 2022-06-23 18:48:29 +01:00
Philip Sargent
16c6aed65f Fix missing images 2022-06-23 19:18:21 +03:00
Philip Sargent
77cf3455a6 Intercept NoReverseMatch exceptions more helpfully 2022-06-23 19:01:57 +03:00
Philip Sargent
4fa8d18621 utf8 encoding checks when reading files in the view 2022-06-23 19:01:25 +03:00
Philip Sargent
6de9181390 Put inside if clause if attribute does not exist 2022-06-23 18:44:53 +03:00
Philip Sargent
bd8d450542 UTF-8 check inserted into databasereset 2022-06-23 16:12:13 +03:00
Philip Sargent
7b0d90182b Added utf8 test to test suite 2022-06-23 16:03:50 +03:00
Philip Sargent
b5f2d0641d Provide dummy expedition_id to keep MariaDB happy 2022-06-22 23:08:32 +03:00
Philip Sargent
4662d10c4e apache now using system locale 2022-06-22 20:22:42 +03:00
Philip Sargent
13a63b64d4 sanitized recent localsettings as on server 2022-06-22 20:17:33 +03:00
Martin Green
4d8125a2fd removed rest of encodings 2022-06-22 09:10:56 +01:00
Martin Green
ea880915b0 Removed encoding of file paths as encoding now correct 2022-06-22 09:08:01 +01:00
Martin Green
836387057a Now the server's locale is utf8, try opening files specifying the path using a text string 2022-06-22 08:52:04 +01:00
Martin Green
f0a9c33795 bug fix to previous test 2022-06-21 22:45:11 +01:00
Martin Green
843bfa8ba6 test to find out the locale set in the WSGI environment 2022-06-21 22:43:06 +01:00
Martin Green
654f8e8c6c bug fix to prev commit 2022-06-20 22:12:00 +01:00
Martin Green
c6272e4103 Refactor saving code. Do not save and commit to git if there are no changes 2022-06-20 22:09:10 +01:00
Martin Green
3420422f29 bug fix to setting encoding 2022-06-20 21:38:46 +01:00
Martin Green
a664e8ce8c Made opening files for reading and writing when editing use utf8 encoding 2022-06-20 21:35:51 +01:00
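(The pattern behind this run of encoding commits, sketched here as a hypothetical helper rather than the actual troggle code: pass the encoding to the file I/O calls explicitly instead of relying on the server's locale.)

    from pathlib import Path

    def read_page(filepath):
        # Read an expoweb page with explicit utf-8, so the WSGI/Apache locale is irrelevant.
        return Path(filepath).read_text(encoding="utf-8")

    def write_page(filepath, html):
        # Write it back with the same explicit encoding.
        Path(filepath).write_text(html, encoding="utf-8")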
Martin Green
491fba7d64 test 2022-06-19 01:57:08 +01:00
Martin Green
3d35cf713f test 2022-06-19 01:56:28 +01:00
Martin Green
b75a91ed70 test 2022-06-19 01:55:01 +01:00
Martin Green
02a3d6a359 test 2022-06-19 01:53:13 +01:00
Martin Green
8372d9d85c test 2022-06-19 01:52:09 +01:00
Martin Green
945373df67 test 2022-06-19 01:50:05 +01:00
Martin Green
26e952154e test 2022-06-19 01:44:57 +01:00
Martin Green
e33ce724c0 test 2022-06-19 01:41:48 +01:00
Martin Green
86b24c9dfe test 2022-06-19 01:39:30 +01:00
Martin Green
3465e9bd16 test 2022-06-19 01:37:51 +01:00
Martin Green
c9d7931ccb test 2022-06-19 01:35:32 +01:00
Martin Green
aad6b70736 test 2022-06-19 01:33:08 +01:00
Martin Green
d560b17ab6 test 2022-06-19 01:16:53 +01:00
Martin Green
18dbb847e3 test 2022-06-19 01:12:49 +01:00
Martin Green
84693b6524 test 2022-06-19 01:04:47 +01:00
Martin Green
5ac6bfd49a test 2022-06-19 01:03:45 +01:00
Martin Green
5666f1e9a7 test 2022-06-19 01:02:41 +01:00
Martin Green
b39a57786d test 2022-06-19 00:59:54 +01:00
Martin Green
4fe306d35f test 2022-06-19 00:57:04 +01:00
Martin Green
60b2e5e8c7 utf8 edit pages 2022-06-19 00:54:50 +01:00
Martin Green
18a58c1042 test 2022-06-19 00:49:07 +01:00
Martin Green
d9cfbc19ed test 2022-06-19 00:32:25 +01:00
Martin Green
372c7e0804 test 2022-06-19 00:27:10 +01:00
Martin Green
5f34a78d94 test 2022-06-19 00:25:48 +01:00
Martin Green
f0cfed2ef6 Future proof for filesystem encoding changing in the future 2022-06-18 23:43:21 +01:00
Martin Green
59a45871fd utf debugging 2022-06-18 23:37:37 +01:00
Martin Green
328347f8af debugging 2022-06-18 23:26:37 +01:00
Martin Green
845e70465e bug fix 2022-06-18 23:24:21 +01:00
Martin Green
6bcf70bb8b refactoring to make debugging easier 2022-06-18 23:13:40 +01:00
Martin Green
41dfe08d2a try converting path from utf8 2022-06-18 22:48:30 +01:00
Martin Green
38d23fd76b Attempting to fix utf8 urls by not type converting to string 2022-06-18 22:41:00 +01:00
Philip Sargent
bf6c6e56a6 fixing to work with python3.10 on a machine where 3.9 is default 2022-05-19 22:38:28 +03:00
Philip Sargent
b259e43de2 bad CSS from typo 2022-05-19 16:13:35 +03:00
Philip Sargent
1556ccd7f6 programmers note 2022-05-19 15:59:44 +03:00
Philip Sargent
a7baf4f3e6 restore Site Index menu item 2022-05-19 15:58:54 +03:00
Philip Sargent
f0634ff164 specify default encoding explicitly 2022-04-28 00:30:43 +03:00
Philip Sargent
322d454d41 revert 2022-04-27 23:43:15 +03:00
Philip Sargent
cafde67c02 ISO-8859-1 fallback 1 2022-04-27 23:25:37 +03:00
Philip Sargent
362aedc2ac back to how it was 2022-04-27 23:07:02 +03:00
Philip Sargent
b3b10b0db7 ugh 2022-04-27 23:00:42 +03:00
Philip Sargent
3528587890 raise exception on live system 2022-04-27 22:58:43 +03:00
Philip Sargent
3bd308effa more superficial fixings 2022-04-27 22:35:20 +03:00
Philip Sargent
fbffbf0909 hack to make broken file less unreadable 2022-04-27 22:29:50 +03:00
Philip Sargent
f05e885517 workaround for security update to dictsortreversed
Due to Django security update CVE-2021-45116, which removed the capability of resolving a method in a template when calling dictsortreversed
2022-04-23 22:42:46 +03:00
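(For context: after CVE-2021-45116 the dictsortreversed template filter no longer resolves methods, so the sort has to call the method explicitly. A minimal sketch of that kind of workaround, with a hypothetical filter name and method name, not the code actually committed here:)

    from django import template

    register = template.Library()

    @register.filter
    def sortreversed_by_method(items, methodname):
        # Stand-in for {{ items|dictsortreversed:"somemethod" }}: call the named
        # method explicitly instead of letting the template filter resolve it.
        return sorted(items, key=lambda obj: getattr(obj, methodname)(), reverse=True)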
Philip Sargent
9ead6b00f9 warning text for broken table in page, pending fix 2022-04-20 23:18:05 +03:00
Philip Sargent
00eb978f5f fixed test broken by recent update 2022-04-20 21:22:38 +03:00
Philip Sargent
c9931fd45e survey legs calc fixed 2022-04-18 23:33:04 +03:00
Philip Sargent
3813b21dcf regularising _URL and _ROOT idiom 2022-04-18 22:48:49 +03:00
Philip Sargent
ccd386ff4e Better error message with permissions problems 2022-04-12 22:42:36 +03:00
Philip Sargent
d29f3030a4 remove redundant and old setting 2022-04-12 21:05:28 +03:00
Expo on server
fe53b08f35 Update the 'pagenotfound' page to reflect what users see. 2022-04-09 22:59:47 +01:00
Philip Sargent
5a64d9d3d0 test checks that localsettings is not out of step with code too 2022-04-07 01:20:44 +03:00
Philip Sargent
4b1012cbb4 comment out unused code prior to deletion 2022-04-07 01:13:54 +03:00
Philip Sargent
c0c4fb72ca local to new laptop 2022-04-06 21:07:43 +03:00
Philip Sargent
ed71fa48f1 cleaning unused settings 2022-04-06 21:01:31 +03:00
Philip Sargent
18c2892967 Make more robust to WSL chmod failures for tests 2022-04-06 20:43:26 +03:00
Philip Sargent
71ed0815cc Revert "update oddity with VS Code and WSL2"
This reverts commit 41b2bcee4f.

CR LF != LF issue
2022-04-05 10:38:52 +03:00
Philip Sargent
41b2bcee4f update oddity with VS Code and WSL2 2022-04-02 00:03:53 +03:00
1f3f60a6a3 Now easier to fix by hand 2022-03-31 00:21:42 +01:00
a9ef96f84e whackety whack 2022-03-31 00:07:47 +01:00
3390a62020 whack, whack.. 2022-03-31 00:06:54 +01:00
9461eed380 mole whacking 2022-03-30 23:56:40 +01:00
aeb210bd30 typo 2022-03-30 23:54:38 +01:00
d7246cbb98 UTF-8 failure thing 2022-03-30 23:53:57 +01:00
785500241e better debug 2022-03-30 21:29:52 +01:00
Philip Sargent
dddb9b1f57 Struggling to use venv pip with WSL2 Ubuntu-20.04 2022-03-30 02:17:08 +01:00
7f16bca7f7 Better error messages and url 2022-03-29 15:31:25 +01:00
Philip Sargent
efeb0efd1e umlauts and edit this page 2022-03-25 02:05:10 +00:00
Philip Sargent
406259a8a2 make 1999 work like other years 2022-03-25 02:04:44 +00:00
Philip Sargent
7fd9497d5c bugfix 2022-03-24 21:20:53 +00:00
Philip Sargent
3617f9b6d9 bugfix and docum 2022-03-24 20:59:36 +00:00
Philip Sargent
1589188988 enable wallet view even if not logged in 2022-03-24 20:45:15 +00:00
Philip Sargent
a514355e5e to do updated 2022-03-24 01:16:43 +00:00
Philip Sargent
be410d4d9d minor refactoring 2022-03-24 01:05:50 +00:00
Philip Sargent
13ffe1fcc6 url to full logbooks fixed in logbookentry 2022-03-23 23:35:42 +00:00
Philip Sargent
9ccf5912d4 restored logbook cacheing 2022-03-23 22:55:59 +00:00
Philip Sargent
4c7deadb9a documenting fossils 2022-03-23 22:55:43 +00:00
Philip Sargent
42b615d16b survex legs bug fix 2022-03-23 20:05:38 +00:00
Philip Sargent
70efb10ece reordered main menu 2022-03-23 11:32:36 +00:00
Philip Sargent
8fd4f818b5 better err text for mysterious error 2022-03-22 02:24:26 +00:00
Philip Sargent
f1b206ad34 fixing bugs after wookey session 2022-03-22 02:22:15 +00:00
Philip Sargent
48171ae824 better fix 2022-03-18 20:45:40 +00:00
Philip Sargent
28fb4d1e94 Another subtle bug 2022-03-18 20:43:01 +00:00
Philip Sargent
1d504e4066 bugfix 2022-03-18 20:09:49 +00:00
Philip Sargent
6a18511dd0 Fixing URLs for cave descriptions 2022-03-18 20:00:15 +00:00
Philip Sargent
6f32364675 fixed get_absolute_url error 2022-03-18 14:18:16 +00:00
Philip Sargent
3b997a32bf bugfix if no svx file 2022-03-18 12:55:08 +00:00
Philip Sargent
8b889ade5f Fix scanned walletindex pages 2022-03-18 12:26:32 +00:00
Philip Sargent
af7fc8f243 use new upload page for most wallets 2022-03-18 11:28:35 +00:00
Philip Sargent
e4ee4abce8 missing print msg 2022-03-18 10:21:25 +00:00
Philip Sargent
2544bc5f3d extra cross link URL to wallet reports 2022-03-18 02:49:45 +00:00
Philip Sargent
19d017a457 Edit contents.json online 2022-03-17 00:41:29 +00:00
Philip Sargent
e34f162688 Test photo upload 2022-03-16 12:43:39 +00:00
Philip Sargent
60fc66cdf5 package updates for Django 3.2 2022-03-16 11:02:54 +00:00
Philip Sargent
d3ddcba313 bug fixes 2022-03-15 23:00:23 +00:00
Philip Sargent
251e3bf844 Bug fixes 2022-03-15 20:53:55 +00:00
Philip Sargent
6bdd9be092 contents.json now visible on form 2022-03-15 19:15:45 +00:00
Philip Sargent
3390f51049 Form creates wallet folder and contents.json 2022-03-15 17:04:43 +00:00
Philip Sargent
fac748d2e2 Better table titles 2022-03-13 23:48:22 +00:00
Philip Sargent
ad1283662d Django 3.2 package settings 2022-03-13 13:26:49 +00:00
Philip Sargent
bb8a92fff1 settings to enable Upload Photos 2022-03-13 11:26:29 +00:00
Philip Sargent
b65639df05 Upload form for Photos 2022-03-13 01:01:00 +00:00
Philip Sargent
f99ebf84e9 running cavern on svx files improved 2022-03-11 16:22:37 +00:00
Philip Sargent
8e78dd4a2e update menu 2022-03-10 22:59:47 +00:00
Philip Sargent
822f8a1699 Fix URL bug 2022-03-10 18:58:58 +00:00
Philip Sargent
488ce46d73 File upload forms descriptions 2022-03-08 22:59:04 +00:00
Philip Sargent
f32df567f2 Updated troggle menu with 2022 caves 2022-03-08 09:34:52 +00:00
Philip Sargent
d6cc32ee9a Detect more survex errors 2022-03-07 16:23:20 +00:00
Philip Sargent
3ac617431f Make .3d files in same dir as .svx 2022-03-06 01:29:45 +00:00
Philip Sargent
7a58aac08e Drawings uploads git works 2022-03-05 22:16:03 +00:00
Philip Sargent
a3a65524b8 better errors for drawings parsing & upload 2022-03-05 20:29:01 +00:00
Philip Sargent
88f5df0f19 More detailed debug output 2022-03-05 18:02:01 +00:00
Philip Sargent
5fe436e76a Add git status test for 3 repos 2022-03-05 17:42:12 +00:00
Philip Sargent
d7fd6b00ae Detect unwriteable file permissions earlier 2022-03-05 17:05:15 +00:00
Philip Sargent
32377f4e6c Cave import & reports fixes 2022-03-05 12:20:26 +00:00
Philip Sargent
1b9fccc2a4 Upversioned packages. Also now Dj4.0 capable. 2022-03-04 14:54:49 +00:00
Philip Sargent
dc4374cb9e Update to new Django admin styles for v3.x 2022-03-03 14:18:51 +00:00
Philip Sargent
7f41017ce3 git merge failure caught in svx files 2022-03-03 00:26:04 +00:00
Philip Sargent
02d58d440e WORKING both py3.9.10 & 3.8.10 (dj2.2.25) 2022-03-02 23:19:48 +00:00
Philip Sargent
73b26ec206 uses venv & links script 2022-03-02 23:18:39 +00:00
Philip Sargent
601fc2cffc WORKING Dj2.2.24 & 2.2.25 py3.7 2022-03-02 21:15:24 +00:00
Philip Sargent
af50d4912d Catch error if unfixed merges in survex files 2022-03-01 01:30:09 +00:00
Philip Sargent
8bd20f9600 Prospecting guide disabled - 100s bad URLs 2022-02-28 15:46:19 +00:00
Philip Sargent
6d435ee473 more comments 2022-02-26 23:20:59 +00:00
Philip Sargent
7f542b8936 LIBDIR automatically gets right python version 2022-02-25 18:33:34 +00:00
Philip Sargent
2c13c1b5f3 remove 'testing' from menu template too 2022-02-23 23:04:00 +00:00
Philip Sargent
29c929aba4 Update title to 2022 2022-02-23 22:52:55 +00:00
Philip Sargent
a87ef54492 remove 'testing' in search field 2022-02-23 22:51:37 +00:00
Philip Sargent
32e6d5f891 Update to-do lists and README text 2022-02-20 00:21:56 +00:00
Philip Sargent
ab8813e389 capture git subprocess errors 2021-12-30 23:27:42 +00:00
Philip Sargent
21ad6ecffb New debug page for subprocess runs 2021-12-30 22:46:34 +00:00
Philip Sargent
b359937eab remove chmod attempt 2021-12-30 21:13:34 +00:00
Philip Sargent
c0545b8777 separate chmod from context 2021-12-30 20:21:47 +00:00
Philip Sargent
4470c5abbd chmod after write 2021-12-30 20:08:24 +00:00
Philip Sargent
c3a54858d5 chmod with context handler 2021-12-30 20:03:34 +00:00
Philip Sargent
0a3037f077 let exceptions bubble up 2021-12-30 19:46:44 +00:00
Philip Sargent
84e165b8fc Move exception handling to calling View 2021-12-30 19:28:33 +00:00
Philip Sargent
5bad82b4f0 bug fix 2021-12-30 19:10:13 +00:00
Philip Sargent
d1e6125d15 add git commit to file saving in 3 places 2021-12-30 19:07:17 +00:00
Philip Sargent
26454bf6c6 first attempt EtP fix for cave pages 2021-12-30 14:15:08 +00:00
Philip Sargent
1da2be03e6 Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2021-12-30 01:01:07 +00:00
Philip Sargent
5aac280618 Bug fix and extra comment 2021-12-30 00:56:09 +00:00
Philip Sargent
928c451040 msg that control panel not working 2021-12-30 00:55:42 +00:00
9b44ba3ef2 precompile regexes 2021-12-19 14:24:20 +00:00
Philip Sargent
02e475642a fix final \n on edited files 2021-12-07 23:46:55 +00:00
Philip Sargent
406b4590a9 fix tests to match new error messages 2021-12-05 21:45:06 +00:00
Philip Sargent
b3aa99f008 Better error msgs for bad slugs 2021-12-05 21:23:06 +00:00
Philip Sargent
bb97b7c862 Catch permissions error. 2021-12-05 17:45:45 +00:00
Philip Sargent
1aef81dccc testing WSL2 & docm addn 2021-12-05 17:11:12 +00:00
8a43cf7dfb logbook cache reading broken - disabled 2021-11-11 22:57:49 +02:00
2391b5a504 Django 'command' system docm 2021-11-11 19:34:59 +02:00
e61bc7416e to-do list updates 2021-11-07 00:36:44 +02:00
64f89be6a9 docm .3d file generation for CaveView 2021-11-06 23:57:51 +02:00
b72706356d fixed mistaken DataIssue report 2021-11-06 23:37:31 +02:00
73ffb509f7 CaveView status is now documented 2021-11-06 22:59:10 +02:00
5723f57fc5 Add crossrefs to handbook UTM pages, laser points 2021-11-06 00:15:24 +02:00
c38aa357b7 Better error msg for overwriting 3d files 2021-11-05 23:51:10 +02:00
df43aae33d 2021 docm update re logbook imports 2021-11-05 23:01:10 +02:00
b461b87df6 fix unneeded runs of survex on survex import 2021-11-05 22:59:54 +02:00
7575e8c16f fix old comments 2021-10-31 19:42:06 +02:00
2869f228d4 fix side effects in tests: git and file upload 2021-10-31 19:25:45 +02:00
252fcc4716 git install test and tidying 2021-10-31 18:01:14 +02:00
36f92c5c9c jslib docum 2021-10-31 17:59:11 +02:00
f13a98e53b moved to handbook 2021-10-29 22:34:49 +03:00
3f6fb260a4 moving docm from README to handbook 2021-10-29 22:18:36 +03:00
bf5d0556fc remove broken and ancient AERW map images 2021-10-26 19:00:18 +03:00
c774b14e04 remove unused JS code after checking it is redundant 2021-10-26 01:02:27 +03:00
b6bbec235c Remove refs to absent and unused JS & CSS 2021-10-25 01:00:02 +03:00
5533029072 tried old jquery CSS 2021-10-25 00:38:24 +03:00
5807e4a873 tidying unused and unavailable JSLIB bits 2021-10-25 00:04:02 +03:00
f9e8cf60bc Caveview docum added 2021-10-24 19:17:02 +03:00
9294c8c2f1 remove unneeded LIBDIR 2021-10-24 19:16:01 +03:00
0d7cbbea37 sanitize passwords before push to git on server 2021-10-23 21:07:16 +03:00
0fb6f1e4ed text moved from README.txt to handbook on expoweb 2021-10-23 00:08:39 +03:00
9130160bd6 new tests for CAVERN and SURVEXPORT 2021-05-13 23:15:30 +03:00
6410cc1090 replace raise with error msg 2021-05-13 21:46:29 +03:00
ac11c56ca0 update to WSL on different machine 2021-05-13 00:13:11 +03:00
515a639fd0 update re git commit within testsuite 2021-05-13 00:10:53 +03:00
4503751907 improving README for a new dev install 2021-05-13 00:00:39 +03:00
Philip Sargent
3cc9fe4dd9 fix for server 2021-05-09 01:13:53 +01:00
Philip Sargent
e42cb582c7 1988, 1989 still bad 2021-05-09 00:55:37 +01:00
Philip Sargent
39cd616c90 survexstations also tabulated 2021-05-07 23:46:11 +01:00
Philip Sargent
8c5fdf5021 Entrance locations new report - url to cave 2021-05-07 23:21:57 +01:00
Philip Sargent
1ff723554c Northings and Easting report 2021-05-07 22:42:10 +01:00
Philip Sargent
41ed15f47f 1987 crashes too.. 2021-05-07 21:43:46 +01:00
Philip Sargent
d916d4125c 1989 crashes mysql too. duh. 2021-05-07 21:22:40 +01:00
Philip Sargent
4877a7ddc4 omit 1988 logbook, crashes mysql 2021-05-07 21:00:06 +01:00
Philip Sargent
c6bcb5fde9 wgs84 lat long for entrances 2021-05-07 20:44:58 +01:00
Philip Sargent
4cd7367a7e remove commented out bits 2021-05-07 19:38:26 +01:00
Philip Sargent
47d9d7d242 Parse all logbooks 2021-05-06 21:07:19 +01:00
Philip Sargent
c3c222e045 docm on JS files 2021-05-05 17:56:05 +01:00
Philip Sargent
d374779c47 dwg upload and django admin extra search 2021-05-05 00:35:10 +01:00
Philip Sargent
44b6770b6a oops. finished update on rename 2021-05-04 21:03:04 +01:00
Philip Sargent
1638f97d0c moved functions between files 2021-05-04 20:57:16 +01:00
Philip Sargent
b3fcd7765e renamed to uploadfiles 2021-05-04 20:56:44 +01:00
Philip Sargent
dc3379c186 fix end slash issue 2021-05-04 15:48:11 +01:00
Philip Sargent
b4abd7b6bc menu update 2021-05-04 15:44:46 +01:00
Philip Sargent
6d341a3cfe removed field from model 2021-05-04 15:43:10 +01:00
Philip Sargent
1d9d96f467 IFRAMES changed to DENY 2021-05-04 14:17:07 +01:00
Philip Sargent
56c3517328 fixed url ambiguity by rename 2021-05-04 14:16:48 +01:00
Philip Sargent
90bb0759a0 Drawing files upload form 2021-05-04 02:46:56 +01:00
Philip Sargent
9ae2e18fe6 delete redundant logbook dump 2021-05-03 23:45:02 +01:00
Philip Sargent
8ad791c594 rearranged config files 2021-05-03 22:47:57 +01:00
Philip Sargent
e6adced39d removed menu link to edit database object directly 2021-05-03 21:21:28 +01:00
Philip Sargent
fd95bb8198 split surveys->scans + drawings 2021-05-03 20:36:29 +01:00
Philip Sargent
9b9f6720e0 not found now does 404 & moved login 2021-05-03 20:35:35 +01:00
Philip Sargent
254b465755 git integration with Save this page 2021-05-03 00:52:51 +01:00
Philip Sargent
5a085ba7ba another todo quote 2021-05-02 22:48:25 +01:00
Philip Sargent
4782f3b184 fixed test mistake & bug in Edit This 2021-05-02 22:47:59 +01:00
Philip Sargent
51da26564f change img fix to import, not display 2021-05-02 15:50:20 +01:00
Philip Sargent
a9ffae9b87 Fix images in single logbook entries 2021-05-02 14:50:46 +01:00
Philip Sargent
3393db0fbc Bigger buttons, phone compatible 2021-05-01 18:35:08 +01:00
Philip Sargent
100209ea16 add cave column to wallets report 2021-05-01 00:19:04 +01:00
Philip Sargent
425b534c30 add wallet column to expo report 2021-05-01 00:18:39 +01:00
Philip Sargent
63640db81f remove unused templatetags load 2021-05-01 00:18:13 +01:00
Philip Sargent
03160f3863 Fix upload file test 2021-04-30 23:22:33 +01:00
Philip Sargent
7368942488 remove unused templatetags code 2021-04-30 23:21:38 +01:00
Philip Sargent
9a69ce50f9 remove unused templatetag code 2021-04-30 22:48:53 +01:00
Philip Sargent
b545f8ed40 cleaned & removed defunct wiki_to_html 2021-04-30 22:44:03 +01:00
Philip Sargent
be0148d146 removing cruft, renaming badly named things 2021-04-30 21:32:53 +01:00
Philip Sargent
8f1d6e2cc2 file upload integration test working 2021-04-30 18:02:05 +01:00
Philip Sargent
fde30685a8 bugfix 2021-04-30 03:52:30 +01:00
Philip Sargent
bdf535fcbf Scan Upload working nicely 2021-04-30 03:44:53 +01:00
Philip Sargent
03a5f5989e chipping away bug in personexpedition, remove role 2021-04-30 00:24:36 +01:00
Philip Sargent
e5cf1b5289 download logbook in standard HTML works 2021-04-28 02:43:09 +01:00
Philip Sargent
62799d196b tabs to spaces 2021-04-28 00:50:36 +01:00
Philip Sargent
cb6619a90a bugfixes 2021-04-28 00:50:26 +01:00
Philip Sargent
b9fad1f4fb new path() interacts badly with include(). fixed 2021-04-28 00:48:20 +01:00
Philip Sargent
5e478c7eb0 Imports in control panel work again 2021-04-27 20:44:24 +01:00
Philip Sargent
821aaa1f66 Changing to new path() url function (initial) 2021-04-27 19:02:11 +01:00
Philip Sargent
942cbdd4b2 clean out broken QM bits 2021-04-27 15:38:20 +01:00
Philip Sargent
13f3057185 bugfixes 2021-04-27 14:51:04 +01:00
Philip Sargent
81d58f1275 delete: never implemented properly 2021-04-27 14:50:26 +01:00
Philip Sargent
e236e792ec todo: parsing caves 2021-04-27 00:32:01 +01:00
Philip Sargent
9e7414e0e0 remove autologbooks function 2021-04-27 00:31:23 +01:00
Philip Sargent
e6eeaf1674 two caves no longer pending 2021-04-26 23:47:08 +01:00
Philip Sargent
49b9225b6e rename scansfolder to wallet 2021-04-26 19:50:03 +01:00
Philip Sargent
7f64670f36 rename manyscansfolders to manywallets 2021-04-26 19:22:29 +01:00
Philip Sargent
7dd5840353 rename tunnelcontains to dwgcontains 2021-04-26 18:54:17 +01:00
Philip Sargent
72df5d5213 /tunneldata/ to /drawings/ 2021-04-26 18:45:21 +01:00
Philip Sargent
d43ce1bdb2 rename TUNNEL_DATA as DRAWINGS_DATA 2021-04-26 18:42:10 +01:00
Philip Sargent
bd647b99ec rename tunnelname as dwgname 2021-04-26 18:37:59 +01:00
Philip Sargent
0997fd0901 rename ScansFolder class as Wallet 2021-04-26 18:18:16 +01:00
Philip Sargent
dc840c9bc7 tunnelpath to dwgpath 2021-04-26 18:11:14 +01:00
Philip Sargent
37403a7234 renamed tunnel to drawing or dwg 2021-04-26 18:08:42 +01:00
Philip Sargent
f0d291f527 rename surveyscansfolder(s) to scanwallet(s) 2021-04-26 17:46:23 +01:00
Philip Sargent
b8803c8e5b rename surveyscansingle 2021-04-26 17:40:48 +01:00
Philip Sargent
9e11c0814e missing entrance .html file now handled differently 2021-04-26 17:23:23 +01:00
Philip Sargent
72fa8a5883 Making entrances work for pending caves 2021-04-26 02:10:45 +01:00
Philip Sargent
a656ada67a Fixing cave edit form and cave creation parser 2021-04-25 04:04:53 +01:00
Philip Sargent
20c42b14bf update to MariaDB management 2021-04-25 01:48:03 +01:00
Philip Sargent
4e59c8791f copied comments from the html template file to the xml template too 2021-04-25 01:47:34 +01:00
Philip Sargent
8128870d57 more robust logbooks parsing 2021-04-24 01:23:55 +01:00
Philip Sargent
b979bdb560 slug too long for field 2021-04-23 16:31:52 +01:00
Philip Sargent
b7659a477c Deep fix, nonunique ids in logbookentries fixed 2021-04-23 16:11:50 +01:00
Philip Sargent
343d6cf350 delete old forms, templates. fix logdataissues 2021-04-23 11:43:25 +01:00
Philip Sargent
dbd186e299 make ?reload private and clean old error msgs 2021-04-23 03:07:21 +01:00
Philip Sargent
1a4be0f02e stop file logging from tests 2021-04-23 03:05:22 +01:00
Philip Sargent
8f89b022c7 drawing file upload form working 2021-04-22 02:45:28 +01:00
Philip Sargent
74403d28e9 fix field in search box so no need to delete it 2021-04-21 22:09:42 +01:00
Philip Sargent
1968db62ad archeology on the logbook entry editing forms 2021-04-21 22:09:20 +01:00
Philip Sargent
bcdb3572fa Add new per-module ToDo texts 2021-04-21 19:08:42 +01:00
Philip Sargent
18938c9fca more attempts to recognise scotsmen 2021-04-20 23:57:51 +01:00
Philip Sargent
8f0e7435d6 renaming 'tunnel' to 'dwg' in urls and views 2021-04-20 23:57:19 +01:00
Philip Sargent
bad5484d12 fix for missing scotsmen 2021-04-20 23:14:10 +01:00
Philip Sargent
b4ba3c40eb catch unknown scotsman error 2021-04-20 22:58:41 +01:00
Philip Sargent
3b0c6ef2ea Better labels for objects in admin console 2021-04-20 19:47:08 +01:00
Philip Sargent
7a6578e205 Now Django 3.2 compatible without deprecation warnings. 2021-04-20 19:46:32 +01:00
Philip Sargent
a2083c5310 disabling Google FLoC 2021-04-19 01:47:12 +01:00
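(Disabling FLoC is normally done by sending a Permissions-Policy: interest-cohort=() response header; whether troggle did this in Django or in the Apache config is not recorded here. A minimal Django-side sketch:)

    class DisableFLoCMiddleware:
        """Add the header that opts every response out of Google's FLoC cohort calculation."""

        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            response = self.get_response(request)
            response["Permissions-Policy"] = "interest-cohort=()"
            return response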
Philip Sargent
879f6c288e Make troggle compatible with Django 3.1 2021-04-19 01:32:18 +01:00
Philip Sargent
e17a21defd Tests work on cave, person, expedition pages 2021-04-18 01:58:24 +01:00
Philip Sargent
eea74406c9 fix template bug for newentrance 2021-04-17 23:59:11 +01:00
Philip Sargent
d4317b5fd3 better error pages 2021-04-17 21:24:37 +01:00
Philip Sargent
25b8fc2e1d test suite with users logins 2021-04-17 20:45:38 +01:00
Philip Sargent
f8b613e0aa prospect and moving code to better places 2021-04-17 01:41:06 +01:00
Philip Sargent
4ad7033285 working, but very faint 2021-04-16 21:47:40 +01:00
Philip Sargent
eca0bcc6d8 oops 2021-04-16 21:29:32 +01:00
Philip Sargent
fa1df39923 stop prospecting map crashing 2021-04-16 21:28:44 +01:00
Philip Sargent
49277216ba Fixed cave sort-order in cave index 2021-04-16 16:01:35 +01:00
Philip Sargent
16ef4fa9fb split out prospecting guide code 2021-04-16 03:05:39 +01:00
Philip Sargent
9695e49024 add link to all expedition pages back to expoweb /year/ page 2021-04-16 01:56:43 +01:00
Philip Sargent
540ce7c076 bug fix in error message 2021-04-15 18:06:04 +01:00
Philip Sargent
27491c933a enabled mugshots & blurb in people pages 2021-04-15 17:51:01 +01:00
Philip Sargent
7124d978d3 add 'del' and 'delfirst' options 2021-04-15 14:27:16 +01:00
Philip Sargent
0fee2bb165 add labels to templates to help debugging 2021-04-15 12:55:13 +01:00
Philip Sargent
3e50d0edca renaming CSS files for clarity 2021-04-15 12:34:51 +01:00
Philip Sargent
38a63641bc abort messages when parsing caves 2021-04-15 01:52:09 +01:00
Philip Sargent
5c4a33873f cull old CSS 2021-04-15 01:51:42 +01:00
Philip Sargent
c2c7de4c59 more cave parsing data fixes 2021-04-14 22:50:47 +01:00
Philip Sargent
d598a6d0f5 better integration of svx file DataIssues 2021-04-14 21:08:06 +01:00
Philip Sargent
db3addc819 Detects missing svx and description files 2021-04-14 18:24:08 +01:00
Philip Sargent
d8b1d59b12 Cave and Entrance forms tuned to user needs 2021-04-14 16:28:30 +01:00
Philip Sargent
54d98f58f3 docum and defaults in template cave-data xml 2021-04-14 01:52:42 +01:00
Philip Sargent
4a13232467 stop being so verbose 2021-04-14 00:12:27 +01:00
Philip Sargent
ba0f573618 restored cave edit capability 2021-04-14 00:11:59 +01:00
Philip Sargent
2f03f77ce4 rename function more accurately 2021-04-13 23:52:56 +01:00
Philip Sargent
daf58e9e45 replace assert() with message logging 2021-04-13 22:27:01 +01:00
Philip Sargent
2467065ac3 actually needed it seems. 2021-04-13 02:29:24 +01:00
Philip Sargent
0820d7c0dc Docstrings for all modules 2021-04-13 01:37:42 +01:00
Philip Sargent
267741fa8b fixing typos and changes in importing 2021-04-13 01:13:08 +01:00
Philip Sargent
7bc73d1ca8 move models_survex to models/survex.py 2021-04-13 00:50:12 +01:00
Philip Sargent
957169d9aa move models_caves to models/caves.py 2021-04-13 00:47:17 +01:00
Philip Sargent
5b3b0e67e9 create core/models/ directory 2021-04-13 00:43:57 +01:00
Philip Sargent
304bbd230a deprecated non-raw regex 2021-04-13 00:18:30 +01:00
Philip Sargent
ca1df94be5 moved clever slash middleware & unused.py 2021-04-13 00:14:15 +01:00
Philip Sargent
2a1710596a moving save_carefully() 2021-04-13 00:11:08 +01:00
Philip Sargent
b602f3ae13 creating core/utils.py 2021-04-12 23:58:48 +01:00
Philip Sargent
5024abc812 add url field to DataIssue 2021-04-12 01:28:54 +01:00
Philip Sargent
dbd9b1a095 Enable svx view if url just misses off .svx 2021-04-12 01:16:49 +01:00
Philip Sargent
f6f83c6f70 data issues much easier to read 2021-04-12 01:00:47 +01:00
Philip Sargent
bc9b4f508b Public Import Errors webpage 2021-04-11 20:00:09 +01:00
Philip Sargent
7f5ac93cc6 url dispatcher tidying 2021-04-11 03:02:06 +01:00
Philip Sargent
5d4ad93c51 Better FileNotFound in expofiles 2021-04-10 15:30:29 +01:00
Philip Sargent
a7e59b2bb0 clear cache on parsing & 2.2.19 notes 2021-04-10 02:12:13 +01:00
Philip Sargent
876868506f tidy obsolete troggle/code/reset_db 2021-04-10 01:14:23 +01:00
Philip Sargent
6dc54adec8 Cache enabled for 'expedition' pages 2021-04-10 01:07:49 +01:00
Philip Sargent
16a6e05849 Dj2.2.19 LTS tested. 2021-04-08 01:37:59 +01:00
Philip Sargent
f16d9a5848 Therion renaming missed one 2021-04-08 01:22:09 +01:00
Philip Sargent
cb5b80353d Therion files now handled 2021-04-08 01:09:06 +01:00
Philip Sargent
b7d54111ba Import Therion files too 2021-04-07 21:53:43 +01:00
Philip Sargent
bf74913486 compatible with Dj2.1.5 2021-04-07 21:53:17 +01:00
Philip Sargent
e3a341eb22 change maintenance menus & move tests 2021-04-07 16:04:27 +01:00
Philip Sargent
785d6360cd Now compat with Dj2.0.13 & 1.11.29 2021-04-06 22:50:57 +01:00
Philip Sargent
05ed8af158 remove 'register' bad link 2021-04-06 01:19:50 +01:00
Philip Sargent
d1cd72c5f8 New user login/logoff system using standard Dj 2021-04-06 00:49:09 +01:00
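(What "standard Dj" login/logoff usually means is wiring up django.contrib.auth's built-in views; the URL prefix below is illustrative, not necessarily troggle's:)

    from django.urls import include, path

    urlpatterns = [
        # login, logout, password_change and password_reset views from django.contrib.auth
        path("accounts/", include("django.contrib.auth.urls")),
    ]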
Philip Sargent
6d6bec35f2 fix incorrect folder when showing index.html 2021-04-05 15:48:48 +01:00
Philip Sargent
9db1a8490c fix bug in parsing bad HTML pages e.g. expo 82 2021-04-05 14:49:06 +01:00
Philip Sargent
409037bdf3 tiny tidyings 2021-04-05 14:01:15 +01:00
Philip Sargent
53fef14024 transaction incompatible with migrate Django 2+ 2021-04-04 01:44:41 +01:00
Philip Sargent
c08356876d missed a wiki_markup ref 2021-04-03 21:59:18 +01:00
Philip Sargent
4d7e3d6866 no permission for this on expo server 2021-04-03 21:09:16 +01:00
Philip Sargent
ab5512e9d6 bugger, missed this edit 2021-04-03 20:54:33 +01:00
Philip Sargent
f6ae46e352 3d CaveView regenerates .3d file in cache 2021-04-03 20:52:35 +01:00
Philip Sargent
7ee7a05ea1 typos in templates 2021-04-03 00:35:31 +01:00
Philip Sargent
e559a1dabd commentary in prospecting guide 2021-04-03 00:35:10 +01:00
Philip Sargent
8707e4a819 fix missing slug field in pending caves 2021-04-03 00:34:34 +01:00
Philip Sargent
912e447200 fix password import error 2021-04-03 00:33:55 +01:00
Philip Sargent
52c1dabd0e survex_file field inconsistency detection & edit 2021-04-02 23:21:23 +01:00
Philip Sargent
bd8d59b343 restoring wiki_markup where still used 2021-04-02 23:17:54 +01:00
Philip Sargent
ea221281a3 remove unused {% load link %} 2021-04-02 20:54:18 +01:00
Philip Sargent
cc5b4fa650 remove {% load wiki_markup %} 2021-04-02 20:41:42 +01:00
Philip Sargent
663d4a2a02 JSLIB and tidy path settings 2021-04-02 19:22:53 +01:00
Philip Sargent
371542fb1e Caveview enabled - local copy 3MB 2021-04-02 19:02:10 +01:00
Philip Sargent
b71f2c4ebb rename wiki folder 2021-04-02 15:52:56 +01:00
Philip Sargent
d841faa057 login decorator moved into views & .gitignore 2021-04-02 15:51:14 +01:00
Philip Sargent
c8cc1673e0 enable non-cave html and images redirection 2021-04-01 21:44:03 +01:00
Philip Sargent
d6409b22c2 Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2021-04-01 20:46:35 +01:00
Expo on server
35697b9af5 ignore localsettings AGAIN 2021-04-01 20:46:26 +01:00
Philip Sargent
7374244806 Merge branch 'python3-new' of ssh://expo.survex.com/home/expo/troggle into python3-new 2021-04-01 20:27:52 +01:00
Expo on server
391790d648 ignore localsettings AGAIN 2021-04-01 20:27:23 +01:00
Philip Sargent
f752f934b6 fix cave description view 2021-04-01 20:08:57 +01:00
Philip Sargent
573dba4712 ignore localsettings.py again 2021-04-01 02:57:35 +01:00
Philip Sargent
51de825189 making PHOTOS files served correctly by troggle 2021-04-01 02:50:30 +01:00
Philip Sargent
4c0ad53b3a culling unused JS 2021-04-01 02:44:49 +01:00
Philip Sargent
8f790309ce tests for new /site_media/ management 2021-03-31 23:41:46 +01:00
Philip Sargent
2690203912 new method for /site-media/, /static/, /photos/ 2021-03-31 23:19:48 +01:00
Philip Sargent
9d8a44696b remove garbage & duplicated code 2021-03-31 22:13:51 +01:00
Philip Sargent
7cccf4daf1 move *_views files to /views/* 2021-03-31 21:51:17 +01:00
Philip Sargent
e1cf43c260 trailing slashes fixed 2021-03-31 20:18:46 +01:00
Philip Sargent
a6ed0a964e making flat files delivery more robust 2021-03-31 17:57:43 +01:00
Philip Sargent
3452c2c5d4 flatpages to expopages 2021-03-31 16:14:36 +01:00
Philip Sargent
577bd51613 Moved secrets to credentials.py 2021-03-31 13:00:09 +01:00
Philip Sargent
b3b2356a7e expopages now troggle/core/views_expo 2021-03-30 21:48:36 +01:00
Philip Sargent
b75baffdcf delete unused profiles app 2021-03-30 21:09:01 +01:00
Philip Sargent
cacae6a9cd rename flatpages as expopages to reduce confusion 2021-03-30 21:05:27 +01:00
Philip Sargent
0f024b27f0 Replace assert() with DataIssue message 2021-03-29 02:06:19 +01:00
Philip Sargent
c81f17c24b more tests for files served from several places 2021-03-28 23:48:36 +01:00
Philip Sargent
623483f3b1 Fixing multiple caves with same kataster no 2021-03-28 23:47:47 +01:00
Philip Sargent
0ecaa9b8ee Disable "secure" (i.e. SSL transport only) cookies 2021-03-28 15:40:07 +01:00
Philip Sargent
a99020078c tidy render() calls 2021-03-28 03:48:24 +01:00
Philip Sargent
a4c892b696 fixed serving expofiles from test server 2021-03-28 03:48:04 +01:00
Philip Sargent
c4cd2178f7 we will never test or host on naked Windows 2021-03-27 23:28:12 +00:00
Philip Sargent
c7475cda83 merge expeditions and statistics pages 2021-03-27 20:05:15 +00:00
Philip Sargent
ffaaea497c re-ordering middleware and logon system 2021-03-27 18:22:07 +00:00
Philip Sargent
e7947069a2 should all be working, but isn't 2021-03-26 23:40:34 +00:00
Philip Sargent
0abd8aedff layout tidying 2021-03-26 21:20:08 +00:00
Philip Sargent
dba0fd8b20 remove survexblock webpage - redundant 2021-03-26 21:19:31 +00:00
Philip Sargent
ec83c1ff12 csrf continued 2021-03-26 19:42:58 +00:00
Philip Sargent
1c7e99e91b attempting to enable csrf cookie robustly 2021-03-26 17:33:58 +00:00
Philip Sargent
f5e799d632 new remote expofiles option 2021-03-26 13:51:00 +00:00
Philip Sargent
2e068d3fbb move mistake to expoweb 2021-03-26 13:30:40 +00:00
Philip Sargent
713db304e2 deleting unused old utilities 2021-03-26 13:14:52 +00:00
Philip Sargent
3487c22da3 Survex 1624, 1626, 1627 now displayed robustly 2021-03-26 02:01:29 +00:00
Philip Sargent
65c3cb31d7 improved display of survex files for a cave 2021-03-25 20:23:25 +00:00
Philip Sargent
80874887cc 404 fix attempt 2021-03-25 16:15:58 +00:00
Philip Sargent
8723d62add Survex files subdirectories displayed differently 2021-03-25 16:15:26 +00:00
Philip Sargent
213ada4ae9 unused app pending deletion 2021-03-25 16:13:58 +00:00
Philip Sargent
70684a29c6 1982 giving problems. Cave ID issue suspected. 2021-03-24 22:00:51 +00:00
Philip Sargent
06c4d026f8 skip years crashing parser 2021-03-24 21:40:52 +00:00
Philip Sargent
65be64c756 type data added to report 2021-03-24 21:16:06 +00:00
Philip Sargent
2fe2c0515f update push procedure and fix order in pathsreport 2021-03-24 20:38:43 +00:00
Philip Sargent
ecbef84c37 sanitised server settings 2021-03-24 17:34:40 +00:00
Philip Sargent
39194704f5 pathlib updates 2021-03-24 17:32:45 +00:00
Philip Sargent
9a91487375 pathlib for path management & cavelist fixes 2021-03-24 15:46:35 +00:00
Philip Sargent
7f37327bcd Fixing cave list webpage 2021-03-24 00:55:36 +00:00
Philip Sargent
a9fa251fee now robust re svx in folders with unofficial numbers 2021-03-23 17:35:41 +00:00
Philip Sargent
4e00645851 fix if empty directory present & display 144, 40 2021-03-23 16:36:55 +00:00
Philip Sargent
87b30741fc more mimetypes for flatfiles 2021-03-22 02:27:19 +00:00
Philip Sargent
a0f504d1e2 new self-adjusting paths 2021-03-22 02:26:46 +00:00
Philip Sargent
24bab23508 server setup docum 2021-03-22 02:26:11 +00:00
Philip Sargent
a0c3ef8ea1 comments on urls resolution re apache & bugfix 2021-03-21 01:37:52 +00:00
Philip Sargent
b4b343b578 survex syntax colouring - local copies 2021-03-21 01:36:08 +00:00
Philip Sargent
18b570d750 remote /expofiles/ now for runserver dev 2021-03-21 01:33:59 +00:00
Philip Sargent
314f600523 Dj1.11.29 running - trimmed unneeded packages 2021-03-17 21:09:44 +00:00
Philip Sargent
6ac65cf893 Bugfix for cave search not finding any cave file 2021-03-17 20:58:25 +00:00
Philip Sargent
5836c6ff90 Importing old logbooks 2021-02-06 00:18:48 +00:00
Philip Sargent
a4d7183260 Added link to Kataster list on link to new cave form 2020-08-02 23:54:14 +01:00
Philip Sargent
d61c2b20c8 Deleted archaisms and new comments 2020-08-02 23:53:35 +01:00
Philip Sargent
3dcc8883cd Updated Troggle Article 2020-07-31 04:44:29 +01:00
Philip Sargent
3574dd4b1e Fix skipped import error messages for drawings 2020-07-29 22:54:53 +01:00
Philip Sargent
95b9daca80 remove commented out code, more comments 2020-07-29 22:54:09 +01:00
Philip Sargent
42456e8e98 fix server instructions + comment in stats output 2020-07-28 01:46:00 +01:00
Philip Sargent
0094cf7054 clean troggle menu and drawingfiles layout 2020-07-28 01:22:06 +01:00
Philip Sargent
d4c79ab66b add docutils needed and SVX_URL 2020-07-28 00:08:35 +01:00
Philip Sargent
57b8242f7e utf-8 templates .gitignore 2020-07-28 00:08:34 +01:00
Expo on server
a624cc8a68 restoring .gitignore 2020-07-28 00:06:07 +01:00
Expo on server
b5b0e4191a Remove generated lines-of-templates.txt file 2020-07-27 01:52:36 +01:00
Philip Sargent
0403c68429 enabling django/admin/ auto documentation functions 2020-07-26 23:38:17 +01:00
Philip Sargent
f1ceb38f5f fix for no comma after lat item in django template 2020-07-26 21:11:29 +01:00
Philip Sargent
0cf3b869af First implementation of html API, both TSV and JSON 2020-07-26 20:48:25 +01:00
Philip Sargent
69b843a824 Removing editLogbookEntry capability 2020-07-26 03:22:37 +01:00
Philip Sargent
924c5a3bf8 Delete commented-out code 2020-07-26 02:26:04 +01:00
Philip Sargent
809633bdd3 Fixed QMs in Admin control panel & removed OtherCave 2020-07-23 02:16:08 +01:00
Philip Sargent
64727e0d3a Commented-out unused CaveDescription object 2020-07-23 01:24:06 +01:00
Philip Sargent
2a0aee5bf5 remove NewSubCave 2020-07-23 01:08:45 +01:00
Philip Sargent
c65544a377 differences between sqlite and MariaDB 2020-07-22 23:51:50 +01:00
Philip Sargent
a6ed0997e8 No speedup for database init. 2020-07-22 23:44:25 +01:00
Philip Sargent
721341604c Speed up migrations and init 2020-07-22 23:43:07 +01:00
Philip Sargent
3e3cae507c More transactions enabled. 2020-07-22 23:36:46 +01:00
Philip Sargent
1ef5924f0c Two more transactions 2020-07-22 23:27:25 +01:00
Philip Sargent
070157eacb TRansX speedup for import + remove fossil profiles 2020-07-22 23:14:10 +01:00
Philip Sargent
fbf5daff0e gitignore generated files 2020-07-22 23:14:10 +01:00
Philip Sargent
427afa9ebd Removing fossils: subcave and flatpages-redirects 2020-07-22 23:14:09 +01:00
Philip Sargent
190514597b apache restart 2020-07-22 23:14:09 +01:00
Expo on server
92de606bc6 removing files which are generated 2020-07-22 23:14:02 +01:00
Philip Sargent
5aed96c5a6 mistake 2020-07-21 00:18:02 +01:00
Philip Sargent
5bc2c132fa fix makes prospecting_guide work 2020-07-21 00:15:53 +01:00
Expo on server
f7db908cb2 generated files should be in gitignore 2020-07-20 23:51:15 +01:00
Philip Sargent
1bc82dea15 make survexblock titles 200 chars 2020-07-20 23:25:49 +01:00
Philip Sargent
f131509c56 3dtopos to survexpos + comments 2020-07-20 22:53:26 +01:00
Expo on server
8e77a70ad6 remove unused SVX_URL variable 2020-07-20 20:11:07 +01:00
Expo on server
867486e72e remove leftover diff file 2020-07-20 20:10:31 +01:00
Philip Sargent
28130de9cb avoiding problem with WSGI populate() error 2020-07-20 18:31:50 +01:00
Philip Sargent
1523586b37 test2 option and gitignore updates 2020-07-20 13:04:30 +01:00
Expo on server
d7838e2a42 migrations files should be in gitignore 2020-07-20 12:54:32 +01:00
Philip Sargent
eb923af44f webpage tests created (no database) 2020-07-19 01:23:07 +01:00
Philip Sargent
edd5a3efd9 Module documentation docstrings 2020-07-18 16:23:54 +01:00
Philip Sargent
90dfa516da *team format updates 2020-07-09 18:06:03 +01:00
Philip Sargent
37620b4dbc *units factor x and feet 2020-07-08 00:00:56 +01:00
Philip Sargent
71b5383090 parse 2006 logbook from html 2020-07-07 19:07:45 +01:00
Philip Sargent
52afb9f466 units conversion not quite working 2020-07-07 02:46:18 +01:00
Philip Sargent
efc43b0863 *units in feet and metres, subcaves 2020-07-07 01:35:58 +01:00
Philip Sargent
ff3cdb10dc extra error printout and remove old code 2020-07-06 21:46:58 +01:00
Philip Sargent
d27a74c97b per svxfile debug printing enabled 2020-07-06 21:46:19 +01:00
Philip Sargent
d2833d26cc fix schema and try cache caves import 2020-07-06 20:27:31 +01:00
Philip Sargent
8530b0643d person attribution of surveyed length working 2020-07-06 01:24:43 +01:00
Philip Sargent
3f9971d2ee title and wallet folders working in /survexfile/258 2020-07-05 17:22:26 +01:00
Philip Sargent
5be41c8163 splays and alias splays implemented 2020-07-04 13:31:46 +01:00
Philip Sargent
51d0daafdd QM placeholder logbook entries sorted out 2020-07-04 01:10:53 +01:00
Philip Sargent
5ed6271c08 rename variables and leglength calc 2020-07-04 01:10:17 +01:00
Philip Sargent
abbe8d467b refactored LinearLoad(), output unchanged 2020-07-03 18:08:59 +01:00
Philip Sargent
fd6f0b0a35 stack-based state of *data fixed 2020-07-03 17:22:15 +01:00
Philip Sargent
67f66b72e8 State push/pop working 2020-07-03 14:53:36 +01:00
Philip Sargent
e54436e818 chaos monkey 2020-07-02 16:26:05 +01:00
Philip Sargent
9a514e7d3f rebuild all .3d files and progress on compass/clino/tape 2020-07-02 16:25:51 +01:00
Philip Sargent
6cd660982c More informative html in templates 2020-07-02 16:24:39 +01:00
Philip Sargent
df434cd399 SurvexBlocks now importing in detail 2020-07-01 22:49:38 +01:00
Philip Sargent
8cc768e5b6 fix survexdirectories to ref files properly 2020-07-01 17:41:09 +01:00
Philip Sargent
5feb07e3f6 Creates folders as needed on editing new svx file 2020-07-01 00:20:27 +01:00
Philip Sargent
514887d19f Fix needed for pending caves with no xml file 2020-06-30 17:59:53 +01:00
Philip Sargent
ae892a07d4 ignore autogenerated db schema & migrations files 2020-06-30 17:34:35 +01:00
Philip Sargent
f76e0d3a16 rename Tunnel files to Drawings files - phase 1 2020-06-30 15:52:29 +01:00
Philip Sargent
0a57ac3132 all SurvexDirectory set OK, pending caves made 2020-06-30 15:39:24 +01:00
Philip Sargent
6b0275d035 CASCADE on delete removed from key foreign keys 2020-06-30 15:26:03 +01:00
Philip Sargent
be2b17ea85 More debugging and warnings 2020-06-30 15:24:42 +01:00
Philip Sargent
c4e2ae2395 add memory footprint display 2020-06-30 15:22:41 +01:00
Philip Sargent
39f042240d Fix SurvexDirectory import 2020-06-29 21:16:13 +01:00
Philip Sargent
a60a495c83 Creating forgotten caves & better GetCaveLookup() 2020-06-29 21:15:42 +01:00
Philip Sargent
459ed11b58 avoiding problem in KH QMs import 2020-06-28 15:57:40 +01:00
Philip Sargent
bf1c683fd0 fixing parent blocks & titles 2020-06-28 14:42:26 +01:00
Philip Sargent
122cdd7fc8 replace GetCaveByReference 2020-06-28 01:50:34 +01:00
Philip Sargent
899ba13df4 working and refactored 2020-06-27 19:00:26 +01:00
Philip Sargent
467baec7da Renaming functions 2020-06-27 18:00:24 +01:00
Philip Sargent
4716eaa4b6 Working. More fault checking. 2020-06-27 17:55:59 +01:00
Philip Sargent
c55716df08 move function (correctly this time) 2020-06-27 12:08:02 +01:00
Philip Sargent
ca6f7ed587 move function 2020-06-27 12:04:34 +01:00
Philip Sargent
e2713cfe2d recursive scan *import to make linear filelist 2020-06-27 00:50:40 +01:00
Philip Sargent
030c49ff7c rewritten QM loading from SVX files 2020-06-25 03:17:56 +01:00
Philip Sargent
d3654266ee Better error msgs, one bug fix 2020-06-25 02:10:20 +01:00
Philip Sargent
04f14c91f0 rearrange ref and comment detection 2020-06-24 22:46:18 +01:00
Philip Sargent
664c18ebbe refactor team and ignore sections 2020-06-24 19:07:11 +01:00
Philip Sargent
3645c98685 extract *date function 2020-06-24 17:55:42 +01:00
Philip Sargent
45bbfce4d3 extract *ref, ;ref and ;QMs to functions 2020-06-24 14:49:39 +01:00
Philip Sargent
bb69cc073a start refactor survex import 2020-06-24 14:10:13 +01:00
Philip Sargent
dc5a53376d exploring recursive behaviour 2020-06-24 02:33:43 +01:00
Philip Sargent
6bf762b72f bin Makesurvexstation, survextitle, survexEquate 2020-06-24 01:57:20 +01:00
Philip Sargent
d6c4ffca5a tunnelfile.scans name change for compatibility 2020-06-24 00:36:32 +01:00
Philip Sargent
c91aa4be47 Simple renaming too-similar variables 2020-06-24 00:18:01 +01:00
Philip Sargent
9cd70b31ac simple rename survexscansfolder to scansfolder 2020-06-23 23:46:33 +01:00
Philip Sargent
b64c779a58 rename SurvexScansFolders and tidy survex parser 2020-06-23 23:34:08 +01:00
Philip Sargent
2e7cf188af cosmetic and moving a function 2020-06-23 22:44:06 +01:00
Philip Sargent
674cea629d clean up import statements: more specific 2020-06-22 00:03:23 +01:00
Philip Sargent
8199e67b79 re-ordered middleware 2020-06-21 00:06:03 +01:00
Philip Sargent
7b260006bf dj.-reg. 2.5.2 all tested. 2020-06-20 23:26:50 +01:00
Philip Sargent
f3232cc5df More security, middleware upgrade, dj-reg.2.5 2020-06-20 23:08:34 +01:00
Philip Sargent
477a289c2e Working with django-registration==2.4 2020-06-20 20:42:10 +01:00
Philip Sargent
77c80d1a69 remove in-memory phase of data import 2020-06-20 19:55:23 +01:00
Philip Sargent
f608fc186e Fixup after rebase and pre- scripts fixed 2020-06-20 17:19:50 +01:00
Philip Sargent
e697466557 Cleanup secrets management, pre-run checks. 2020-06-20 16:51:07 +01:00
Philip Sargent
b35a0b0d26 Fully working dj 1.11.29 2020-06-19 16:39:05 +01:00
Philip Sargent
2c469718f6 caves, scans, survex work in databaseReset 2020-06-19 00:26:15 +01:00
Philip Sargent
4a51de95c4 Django 1.11.19 runs clean on pre-built db. 2020-06-18 22:44:41 +01:00
Philip Sargent
ee1d4bb600 static files icons into troggle repo 2020-06-18 21:51:19 +01:00
Philip Sargent
d9c6986a89 static files redone 2020-06-18 21:50:16 +01:00
Philip Sargent
bd6490631f edit this page working 2020-06-18 15:54:40 +01:00
Philip Sargent
640e1e3b5e cleaner warnings but site_media still not ok 2020-06-18 12:10:42 +01:00
Philip Sargent
f4231456e7 admin urls 2020-06-18 11:48:50 +01:00
Philip Sargent
222d85f052 regex deprecation warning 2020-06-18 10:59:11 +01:00
Philip Sargent
ee92182163 on_delete=... soon to be required 2020-06-18 00:20:47 +01:00
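(This refers to Django 2.0 making on_delete a required argument on ForeignKey and OneToOneField; the model and field names below are illustrative only:)

    from django.db import models

    class Entrance(models.Model):
        # on_delete must now be given explicitly; CASCADE is shown only as an example choice.
        cave = models.ForeignKey("Cave", on_delete=models.CASCADE)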
Philip Sargent
55dd577275 New url dispatcher OK 1.9.13 2020-06-17 22:55:51 +01:00
Philip Sargent
4941d230e2 TEMPLATES done for dj 1.9.13 2020-06-17 21:27:01 +01:00
Philip Sargent
660479d692 django 1.9.13 initial 2020-06-17 01:00:50 +01:00
Philip Sargent
b153fafa9f django 1.8.19 2020-06-16 22:16:48 +01:00
Philip Sargent
adc43324f3 move TEMPLATE to localsettings 2020-06-16 19:30:06 +01:00
Philip Sargent
f6bd08029f remove useless nulls on ManyToMany 2020-06-16 19:28:24 +01:00
Philip Sargent
94e5a06a15 clean up survexlegs 2020-06-16 19:27:32 +01:00
Philip Sargent
8fc0ba136f removed jgtfileupload 2020-06-16 16:48:19 +01:00
Philip Sargent
9f5e779b5e remove Survey & virtual survey wallet - never used 2020-06-16 16:17:35 +01:00
Philip Sargent
1b693da5ed break recursive import cycle 2020-06-16 16:07:36 +01:00
Philip Sargent
4c44a504ed preparing for django 1.8 2020-06-16 11:14:10 +01:00
Philip Sargent
453af2851b Stop storing all SurvexStations 2020-06-15 03:28:51 +01:00
Philip Sargent
30e560d808 Dump django.sites, .redirects, registration, extensions 2020-06-14 20:02:52 +01:00
Philip Sargent
277f60e3e2 autogenerated schema 2020-06-14 19:35:18 +01:00
Philip Sargent
77ca2d8830 server config 2020-06-14 10:12:07 +01:00
Philip Sargent
d4deea2019 remove tinyMCE more 2020-06-14 10:05:25 +01:00
Philip Sargent
38fa552c00 Disable TinyMCE 2020-06-13 23:16:19 +01:00
Philip Sargent
f8fa426adb rearrange for clarity 2020-06-13 01:27:42 +01:00
Philip Sargent
38beb34a38 cleaner import statements 2020-06-13 01:26:59 +01:00
Philip Sargent
fb0438d352 bugfix returning multiple object catch 2020-06-13 01:26:28 +01:00
Philip Sargent
44fe6a8b89 cleanup and better DataIssue msgs 2020-06-13 01:24:46 +01:00
Philip Sargent
269b8840ad import fixes & statistics table 2020-06-12 18:10:07 +01:00
Philip Sargent
b9a223c049 Fix to /caves/ != /caves 2020-06-12 14:54:00 +01:00
Philip Sargent
db37710b90 fixes: stopped storing survex legs 2020-06-12 14:52:00 +01:00
Philip Sargent
092c8bb913 stopped storing survex legs 2020-06-12 14:06:03 +01:00
Philip Sargent
d807e3de7d Object storage as alternative to SQL 2020-06-08 21:33:32 +01:00
Philip Sargent
8c965015f3 added static setup 2020-06-08 00:12:50 +01:00
Philip Sargent
538a3b6ca8 fixed circular ref on setup & in-memory db 2020-06-08 00:11:09 +01:00
Philip Sargent
9237a6262e Make import robust against duplicate kataster numbers 2020-06-07 17:49:58 +01:00
Philip Sargent
fcfda644d3 Fix un-cleared db error (partial) 2020-06-07 16:16:35 +01:00
Philip Sargent
75bac01f3a Fix bad pages for clashing kataster numbers 2020-06-07 16:13:59 +01:00
Philip Sargent
72fd57ef76 Rearranged caveindex page 2020-06-07 16:12:52 +01:00
Philip Sargent
fda50ed570 attempt to simplify went horribly wrong 2020-06-06 22:51:55 +01:00
Philip Sargent
f8a3c8f5bc Bugfix for capitalised filename extensions 2020-06-05 00:38:05 +01:00
Philip Sargent
681bfcb4c4 removing redundant functions 2020-06-04 23:53:36 +01:00
Philip Sargent
27816724f8 moved 2 functions, deletion of FileAbstraction pending 2020-06-04 23:38:57 +01:00
Philip Sargent
ac9ac5e397 Remove unused Survey object 2020-06-04 23:16:26 +01:00
Philip Sargent
43c6e2f2e1 delete todo page, partly fix other url pages 2020-06-04 23:00:58 +01:00
Philip Sargent
b7fea2042f merge exptl & stats, delete millenial & eyecandy 2020-06-04 21:57:04 +01:00
Philip Sargent
c6d68749e0 Comment-out all ScannedImage objects 2020-06-04 19:32:26 +01:00
Philip Sargent
ae89a707ec Unit tests outline implemented 2020-06-03 21:57:05 +01:00
Philip Sargent
973c6f4ef8 update svx template & fix CRLF 2020-06-02 21:38:29 +01:00
Philip Sargent
4dd0a5ddf2 import syntax fix 2020-06-02 21:38:01 +01:00
Philip Sargent
90dc3dac3b Change troggle horizontal menu items 2020-06-01 17:41:41 +01:00
Philip Sargent
8c4c2ad1cf Progress dots on importing data 2020-06-01 00:42:48 +01:00
Philip Sargent
f949bb8dc0 python3 fixes for django admin pages 2020-05-31 22:35:36 +01:00
Philip Sargent
c863bf6e1d Oops. Remove CSV download pages 2020-05-31 21:38:19 +01:00
Philip Sargent
5d89cf9474 Delete SURVEYS.CSV code 2020-05-31 21:03:40 +01:00
Philip Sargent
09aedecc3b Unicode fix for SVX display and edit page 2020-05-31 20:46:12 +01:00
Philip Sargent
fe515e9f01 Troggle code documentation pointers 2020-05-31 20:44:09 +01:00
Philip Sargent
69d2c0887c Adding progress dots to import print output and fix SURVEY_SCANS 2020-05-31 19:23:07 +01:00
Philip Sargent
8e577022b2 Reducing import print output 2020-05-31 19:21:54 +01:00
Philip Sargent
3088727fd4 unicode fix for python3 2020-05-31 19:00:44 +01:00
Philip Sargent
b33ad5833e delete duplication 2020-05-30 20:48:17 +01:00
Philip Sargent
3264b6edef bug fix in logbook parser 2020-05-30 20:31:20 +01:00
Philip Sargent
58c2650162 Style change to troggle pages for python3 2020-05-30 12:35:47 +01:00
Philip Sargent
01e098339e Imports made more specific 2020-05-30 12:35:15 +01:00
Philip Sargent
d857cc9084 format tidy & normalise paths in survex *include 2020-05-30 02:35:05 +01:00
Philip Sargent
4205821bac allow STATIC and tinyMCE to work with runserver 2020-05-30 02:34:33 +01:00
Philip Sargent
0776978c9c Import rejigging to fix circular refs 2020-05-30 01:11:02 +01:00
Philip Sargent
6568cb8900 import fix to allow manage.py to run 2020-05-28 22:42:50 +01:00
Philip Sargent
6a755598b2 Moved classes to models_caves and fixed imports 2020-05-28 04:54:53 +01:00
Philip Sargent
df3917a677 Expunge cavetab 2020-05-28 02:20:50 +01:00
Philip Sargent
d2192ffd21 delete duplication 2020-05-28 02:09:36 +01:00
Philip Sargent
cb4128436c expunge imagekit and clean import lists 2020-05-28 01:38:35 +01:00
Philip Sargent
6cc578435c Refactor to avoid loading unused surveystations 2020-05-28 01:16:45 +01:00
Philip Sargent
73637ba53d Clean install with python3 2020-05-27 01:04:37 +01:00
Philip Sargent
c9657aeb8c preparing to clean up LoadPos 2020-05-26 16:41:11 +01:00
Expo on server
dfb7cc88cd Allow being unable to open local LOGFILE. 2020-05-26 02:26:51 +01:00
Expo on server
227120fd57 Add check to avoid running databaseReset as root accidentally 2020-05-26 02:25:30 +01:00
Philip Sargent
8b74ff4bb6 __unicode__ to __str__ 2to3 conversion 2020-05-26 02:21:36 +01:00
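(The mechanical shape of this 2to3 change, shown with an illustrative model: under Python 2 Django models defined __unicode__, under Python 3 the same method becomes __str__.)

    from django.db import models

    class Cave(models.Model):
        kataster_number = models.CharField(max_length=20)  # field name illustrative

        # Python 2 version:
        # def __unicode__(self):
        #     return self.kataster_number

        def __str__(self):  # Python 3 replacement
            return self.kataster_number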
Philip Sargent
f4099c6929 fix py3 bug and make settings import clearer 2020-05-26 02:21:03 +01:00
Philip Sargent
6ae5c0d912 remove old imagekit files 2020-05-26 02:19:46 +01:00
Philip Sargent
44d190e91c partial fix to make OK on python3.8 2020-05-25 01:49:02 +01:00
Philip Sargent
0a864c7f87 2to3 sort comparison key change 2020-05-25 01:46:52 +01:00
Philip Sargent
50d753a87b Convert codebase for python3 usage 2020-05-24 20:56:27 +01:00
Philip Sargent
35f85c55f1 Update requirements list 2020-05-24 20:55:18 +01:00
Philip Sargent
b69bdcd126 tidying and prep for python3 2020-05-24 13:35:47 +01:00
Philip Sargent
49d5857b36 tabs and spaces format fix 2020-05-24 13:31:38 +01:00
Philip Sargent
40ad04b79f unused code commented out 2020-05-24 13:30:39 +01:00
Philip Sargent
a3e564855a removing imagekit 2020-05-22 01:28:45 +01:00
Philip Sargent
15d0d05185 bugfix 2020-05-20 13:40:09 +01:00
Philip Sargent
819eca5dea cleaning options list 2020-05-20 12:45:10 +01:00
Philip Sargent
edbe793c68 added profile option 2020-05-20 12:18:12 +01:00
Philip Sargent
e017c6effc Indented msgs for recursive file traversal 2020-05-15 21:45:23 +01:00
Philip Sargent
d4ac28af18 Remove PHOTOS_ROOT and DPhoto class 2020-05-15 21:32:55 +01:00
Philip Sargent
931aa4e3cb add mysql startup documentation 2020-05-14 19:37:46 +01:00
Philip Sargent
cc4017e481 fix renaming reload to reinit 2020-05-14 17:33:33 +01:00
Philip Sargent
38adb9a52f typo 2020-05-14 17:27:51 +01:00
Philip Sargent
ccc5813b3f indent recursion and update comments 2020-05-14 17:21:34 +01:00
Philip Sargent
314d0e8b71 skip fast pass option added as default 2020-05-13 23:11:47 +01:00
Philip Sargent
0338889905 trying django 1.7 recommended change: syncdb to migrate 2020-05-13 22:13:18 +01:00
Philip Sargent
876cd8909f still not expunged MySQL connection 2020-05-13 21:55:44 +01:00
Philip Sargent
ac7cb45f61 more thorough reset before running :memory: 2020-05-13 21:52:28 +01:00
Philip Sargent
f326bf9148 more thorough reset between dbs 2020-05-13 21:25:17 +01:00
Philip Sargent
b1596c0ac4 Merge branch 'master' of ssh://expo.survex.com/home/expo/troggle 2020-05-13 19:59:46 +01:00
Expo on server
13d3f37f05 menu update 2020-05-13 19:59:41 +01:00
Philip Sargent
e4290c4ab0 adding *ref to troggle svx parser 2020-05-13 19:57:07 +01:00
Expo on server
2918b4b92c Add simple search function to default menu 2020-05-02 04:14:32 +01:00
Philip Sargent
39c622d5bf dbReset now loads into memory first (fast err checking), then into db 2020-04-30 23:15:57 +01:00
Philip Sargent
76a6b501f3 LoadPos not-found cache working 2020-04-28 22:51:18 +01:00
Philip Sargent
ecf92e2079 getting the LoadPos to work better 2020-04-28 21:50:53 +01:00
Philip Sargent
b4c0c4d219 Understanding and speeding up LoadPos 2020-04-28 18:26:08 +01:00
Philip Sargent
4be8c81291 reducing clutter in output 2020-04-28 01:18:57 +01:00
Philip Sargent
a8460065a4 Thorough spring clean and profiling 2020-04-27 23:51:41 +01:00
Philip Sargent
2b39dec560 installation notes 2020-04-26 00:49:29 +01:00
0b85a9d330 adding sync to databaseReset and cleaning a template 2020-04-19 23:35:41 +01:00
b123f6ada7 Dumps loaded data into a .sql file 2020-04-16 20:36:42 +01:00
e5c288c764 get the profile display working & bug fix 2020-04-15 23:29:59 +01:00
9db7d8e589 Made a jobqueue to time how long importing takes 2020-04-15 04:09:28 +01:00
5e48687347 More tidying 2020-04-14 20:46:45 +01:00
09bbf81915 Tidy formatting prior to adding some new stuff 2020-04-14 20:19:41 +01:00
78f8ea2b5b bug fixed for new logbook 2020-04-13 17:35:58 +01:00
e08b4275a9 2010 is html format not wiki format.
Or at least, now it is.
2020-04-12 23:03:00 +01:00
ac9f3cf061 New cacheing for parsed logbooks. All logbooks load in 75 seconds now. 2020-04-12 22:29:30 +01:00
98fd314a62 Prevent that annoying popup from Google offering to translate the page from German 2020-04-11 18:10:09 +01:00
79a31a41f9 Fixed bad import of surveyscans references from tunnel files 2020-04-11 00:36:27 +01:00
6aae9083c3 implemented NOEDIT as a meta tag and fixed double-menus problem 2020-04-10 13:13:23 +01:00
d71e31417b scanned image files importing 2020-04-09 02:40:32 +01:00
fbe6c0c859 rearrange main menu 2020-04-03 01:11:30 +01:00
53b797fb53 Validation of mugshot or blurb file added 2020-04-01 19:58:31 +01:00
98eb9173ee rename troggle log back to what it was 2020-03-31 23:39:52 +01:00
ecfa95310d Documenting installation 2020-03-14 20:08:44 +00:00
0e75a9163b Documenting use of apache 2020-03-12 17:40:03 +00:00
Expo on server
59633d94f5 Remove mention of obsolete EMAIL_HOST_USER to correspond with change made in pathreport.py 2020-03-09 17:44:29 +00:00
53206ad1d7 Removed EMAIL_HOST entirely 2020-03-09 16:52:51 +00:00
9aa91bf3e2 remove EMAIL_HOST global name 2020-03-09 16:48:51 +00:00
867479e05d Fixing my earlier mistake: these are xml not html 2020-03-02 17:08:49 +00:00
bb1f69dd90 stuff generated on server 2020-02-29 12:08:22 +00:00
d219f7b966 another missing code 2020-02-27 01:35:12 +00:00
3f812e5275 comment out missing code 2020-02-27 01:33:19 +00:00
cdef395f89 New troggle report on defined directory paths 2020-02-27 00:58:09 +00:00
Philip Sargent
66f6a9ce90 Troggle on Windows 10 using WSL 2020-02-25 14:31:52 +00:00
Philip Sargent
b07c888c7a debug for crashing tunnel import 2020-02-25 14:22:50 +00:00
Philip Sargent
d170a3c36e add shortcut for logbooks & note explaining notability metric 2020-02-24 21:49:01 +00:00
429c21a8e9 installing on WSL ubuntu on Windows 10 2020-02-22 00:04:41 +00:00
Philip Sargent
8c10908353 Revert "Merge branch 'RW_rebuild' of ssh://expo@expo.survex.com/home/expo/troggle"
This reverts commit e0963a1c39.
2020-02-21 16:31:54 +00:00
Philip Sargent
e0963a1c39 Merge branch 'RW_rebuild' of ssh://expo@expo.survex.com/home/expo/troggle
# Conflicts:
#	core/models_millenial.py
#	core/views_caves.py
#	databaseResetM.py
#	parsers/cavesM.py
#	urls.py
2020-02-21 16:27:34 +00:00
Sam Wenham
e77aa9fb84 Changes needed to stop the survex parser having to go through the data twice
Taken from the Django 1.10 upgrade branch
2020-02-21 15:57:07 +00:00
Philip Sargent
f5fe2d9e33 typo 2020-02-21 14:00:33 +00:00
Philip Sargent
5006342b7b forgot a bit 2020-02-21 13:59:14 +00:00
Philip Sargent
3ce8b67b4f added rebuild command option 2020-02-21 13:57:04 +00:00
Sam Wenham
52cec290d9 We don't want troggle trying to write out the files 2020-02-20 18:42:31 +00:00
Philip Sargent
a559151c57 Merge remote-tracking branch 'origin/master' 2020-02-20 15:29:30 +00:00
Philip Sargent
2fc60f9f74 Fixing logbooks with parse errors 2020-02-20 15:26:33 +00:00
Wookey
3b1fcb7feb Merge branch 'master' of ssh://expo.survex.com/~/troggle 2020-02-20 15:02:14 +00:00
Wookey
2838f540d1 Minor text and whitespace fixes 2020-02-20 15:01:50 +00:00
Wookey
f5ec5a61a9 Move survex parsing later in the process as it tends to run out of memory 2020-02-20 14:57:37 +00:00
Philip Sargent
44caf35fd8 typo 2020-02-20 14:15:28 +00:00
Philip Sargent
c5055e7f34 backport order of operations in reset() and change logbook parser to do paragraphs differently 2020-02-20 14:13:38 +00:00
Philip Sargent
de14ecea22 Fixing back mistaken change 2020-02-20 01:42:52 +00:00
Philip Sargent
f5174a3248 typo 2020-02-19 22:52:00 +00:00
Philip Sargent
f0889ce0f8 typos fix and more description 2020-02-19 22:51:07 +00:00
Philip Sargent
b6dc711c14 making in-code documentation strings match what actually happens 2020-02-19 22:14:00 +00:00
Philip Sargent
04fb2e8701 DOCTYPE update for template for generated files 2020-02-19 21:48:37 +00:00
Philip Sargent
c1439bed8d Adding <pre> and </pre> to the logbook entry display so that all the paragraphs are not munged into one when displayed. Untested. 2020-02-19 21:47:13 +00:00
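A hedged illustration of the <pre> wrapping described above, written as a plain Python helper rather than the actual troggle template change (the function name and the escaping step are assumptions):

    import html

    def render_logbook_text(text):
        # Wrapping in <pre> preserves the blank-line paragraph breaks that the
        # browser would otherwise collapse; escape first so stray angle brackets
        # in an entry cannot break the page.
        return "<pre>" + html.escape(text) + "</pre>"

    print(render_logbook_text("First paragraph.\n\nSecond paragraph."))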
Philip Sargent
a88f326ee6 added year 2019 to logbook parsing 2020-02-17 01:39:00 +00:00
56618dbe65 fix CUCC url in template and add years to 2020 2019-12-23 19:40:44 +00:00
Expo on server
71ef710d09 Fix the location of the cave view javascript 2019-12-08 10:53:43 +00:00
Sam Wenham
c74852b60b Merge branch 'master' of ssh://expo.survex.com/~/troggle
# Conflicts:
#	README.txt
2019-07-29 11:36:24 +01:00
Sam Wenham
a26109cb30 Allow comments against names in logbooks in brackets
Convert accent chars in names into simple chars as this is what people enter in the logbook
2019-07-11 12:29:38 +01:00
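A minimal sketch of the name normalisation this commit describes, assuming the goal is to match accented names in the data against the plain ASCII people type into the logbook (the function and regex are illustrative, not the parser's real code):

    import re
    import unicodedata

    def simplify_name(name):
        # drop an optional trailing bracketed comment, e.g. "Wookey (late arrival)"
        name = re.sub(r"\s*\(.*?\)\s*$", "", name)
        # decompose accented characters and strip the combining marks
        decomposed = unicodedata.normalize("NFKD", name)
        return "".join(c for c in decomposed if not unicodedata.combining(c)).strip()

    assert simplify_name("José Bloggs (visiting)") == "Jose Bloggs"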
Sam Wenham
6b5b9a5315 Merge branch 'master' of ssh://expo.survex.com/~/troggle 2019-07-10 12:37:38 +01:00
Sam Wenham
4ebf3d8a0e Bring back TinyMCE for editing flatpages 2019-07-10 12:32:04 +01:00
37d02b298d added ssh git clone command variant 2019-07-09 15:55:27 +01:00
Sam Wenham
d6053322e8 Merge branch 'master' of ssh://expo.survex.com/~/troggle 2019-07-09 10:41:21 +01:00
Expo on server
5b5f385b67 Remove .hgignore Change mode on modelvis.py 2019-07-09 05:23:37 +01:00
Sam Wenham
04428c45c9 Fix description of localsettingsdocker 2019-07-09 05:23:30 +01:00
a7f605ced9 changes because we do not use svn anymore
Signed-off-by: psargent <philip.sargent@gmail.com>
2019-07-09 05:23:22 +01:00
Expo on server
0adb8e528d Add .gitignore file 2019-07-09 05:23:14 +01:00
Expo on server
f4280f9907 Add info to debian instructions on creating troggle logfile (in /var/log) 2019-07-09 05:22:49 +01:00
2d7892e3b1 Merge remote-tracking branch 'troggle/master' 2019-07-02 19:04:13 +01:00
Sam Wenham
8edeb2f622 Merge branch 'master' of ssh://expo.survex.com/~/troggle 2019-07-02 18:15:13 +01:00
Expo on server
d157a081b1 Remove .hgignore
Change mode on modelvis.py
2019-07-02 18:14:19 +01:00
Sam Wenham
fcc57cf365 Fix description of localsettingsdocker 2019-07-02 17:52:40 +01:00
12c8ab41bf changes because we do not use svn anymore
Signed-off-by: psargent <philip.sargent@gmail.com>
2019-06-27 01:24:38 +01:00
Expo on server
9266e5460e Add .gitignore file 2019-06-27 00:23:22 +01:00
Expo on server
ad45859071 Add info to debian instructions on creating troggle logfile (in /var/log) 2019-06-27 00:14:39 +01:00
expo on server
ee759980c4 remove hack in logbook parsing to convert ol to olly and wook to wookey.
It broke 'Olaf' as a name, for example.
2019-06-26 21:46:57 +01:00
expo on server
18b371bc15 remove hack in logbook parsing to convert ol to olly and wook to wookey.
It broke 'Olaf' as a name, for example.
2019-06-26 21:46:57 +01:00
expo on server
9e77b8bb75 Add server setup instructions/recipe for Debian Stretch 2019-06-26 21:45:17 +01:00
expo on server
e6acd4bdbd Add server setup instructions/recipe for Debian Stretch 2019-06-26 21:45:17 +01:00
Sam Wenham
424219fb6f Just commit the logbook parser this time (can we move to git now!!!) 2019-06-26 21:21:37 +01:00
Sam Wenham
2ebb37552f Just commit the logbook parser this time (can we move to git now!!!) 2019-06-26 21:21:37 +01:00
Sam Wenham
822359fe51 Backed out changeset: 4552f42bdf54 2019-06-26 20:57:24 +01:00
Sam Wenham
97426a0ddb Backed out changeset: 4552f42bdf54 2019-06-26 20:57:24 +01:00
Sam Wenham
3f78382d45 Remove this stupid hard coded name match 2019-06-26 20:56:08 +01:00
Sam Wenham
8a1be45aac Remove this stupid hard coded name match 2019-06-26 20:56:08 +01:00
Sam Wenham
b5cca8be3b Merge 2019-06-26 18:43:42 +01:00
Sam Wenham
4d2f9a2b39 Merge 2019-06-26 18:43:42 +01:00
Sam Wenham
8fe02e5c89 Allow html chars in names 2019-06-26 18:36:08 +01:00
Sam Wenham
b2dd905f0e Allow html chars in names 2019-06-26 18:36:08 +01:00
expo on server
c06d372984 Add expo.survex.com to ALLOWED_HOSTS in troggle settings 2019-06-26 15:23:20 +01:00
expo on server
7a9aef6faf Add expo.survex.com to ALLOWED_HOSTS in troggle settings 2019-06-26 15:23:20 +01:00
expo on server
6889ae9fa3 Add SURVEX_TOPNAME (top-level survex file) as a setting item in settings.py so it's not hardcoded. 2019-06-26 03:32:18 +01:00
expo on server
02d3cc84d5 Add SURVEX_TOPNAME (top-level survex file) as a setting item in settings.py so it's not hardcoded. 2019-06-26 03:32:18 +01:00
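The idea behind this commit, sketched with assumed values (only the setting name comes from the message; the path and filename are placeholders):

    # settings.py
    SURVEX_DATA = "/home/expo/loser/"   # assumed survex data directory
    SURVEX_TOPNAME = "1623"             # top-level .svx file, without the extension

    # parser code can then build the path from settings instead of a hardcoded literal
    import os

    def top_survex_path():
        return os.path.join(SURVEX_DATA, SURVEX_TOPNAME + ".svx")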
Sam Wenham
768ec83037 Updating caves and entrances is no longer nuclear!
Big overhaul of people processing, fullname added to the model
lastname is now names -1 unless you only have one (yes you Wookey)
this allows for Jon Arne Toft and Wookey to live in the same DB
names can now have html chars in them, this should be real unicode but that can
only happen when we go to Python 3!
2019-04-19 22:52:54 +01:00
Sam Wenham
b42249890e Updating caves and entrances is no longer nuclear!
Big overhaul of people processing, fullname added to the model
lastname is now names -1 unless you only have one (yes you Wookey)
this allows for Jon Arne Toft and Wookey to live in the same DB
names can now have html chars in them, this should be real unicode but that can
only happen when we go to Python 3!
2019-04-19 22:52:54 +01:00
Sam Wenham
2f9870644b missed objects 2019-04-18 19:27:23 +01:00
Sam Wenham
cc313246bb missed objects 2019-04-18 19:27:23 +01:00
Sam Wenham
4e187581b3 Clear data issues for logbooks before reloading 2019-04-18 19:26:09 +01:00
Sam Wenham
bfe018cde6 Clear data issues for logbooks before reloading 2019-04-18 19:26:09 +01:00
Sam Wenham
dc479b33c5 Add ordering to the data issues model 2019-04-18 19:01:29 +01:00
Sam Wenham
ae284a1f30 Add ordering to the data issues model 2019-04-18 19:01:29 +01:00
Sam Wenham
f1736c53c4 Fix CSRF issues in svx form
Set date formats
Add DataIssue model and add errors to it to allow us to give people a list of
stuff to fix
2019-04-14 22:45:31 +01:00
Sam Wenham
23df89cf31 Fix CSRF issues in svx form
Set date formats
Add DataIssue model and add errors to it to allow us to give people a list of
stuff to fix
2019-04-14 22:45:31 +01:00
Sam Wenham
05c5e26e99 Sort people by notability
Better errors and tidy
Nicer date formats
2019-04-02 02:04:38 +01:00
Sam Wenham
d1d0c24ed8 Sort people by notability
Better errors and tidy
Nicer date formats
2019-04-02 02:04:38 +01:00
Wookey
c4301cf6df Merge lots of troggle fixes 2019-04-02 00:57:54 +01:00
Wookey
b3089fafe9 Merge lots of troggle fixes 2019-04-02 00:57:54 +01:00
Wookey
de7d68b1eb folk.csv has moved into 'folk' dir out of 'noinfo' 2019-04-02 00:57:13 +01:00
Wookey
e913a56a6b folk.csv has moved into 'folk' dir out of 'noinfo' 2019-04-02 00:57:13 +01:00
expoonserver
bb8dbb381f Move cave and entrance data out of 'noinfo' 2019-04-01 23:03:45 +01:00
expoonserver
39c61bd526 Move cave and entrance data out of 'noinfo' 2019-04-01 23:03:45 +01:00
Sam Wenham
144610d6c2 Better error messages 2019-03-31 16:44:58 +01:00
Sam Wenham
10f1cdb458 Better error messages 2019-03-31 16:44:58 +01:00
Sam Wenham
40f413ba47 Oops, shouldn't have committed the DateTime change yet... 2019-03-31 16:43:21 +01:00
Sam Wenham
a588221524 Oops, shouldn't have committed the DateTime change yet... 2019-03-31 16:43:21 +01:00
Sam Wenham
9cd8734947 Support html and wiki logbook entries
Move nearest_station to nearest_station_name and make nearest_station a foreign
key to SurvexStation
Lots of tidying
2019-03-31 15:39:53 +01:00
Sam Wenham
9df91b221b Support html and wiki logbook entries
Move nearest_station to nearest_station_name and make nearest_station a foreign
key to SurvexStation
Lots of tidying
2019-03-31 15:39:53 +01:00
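A hedged Django-model sketch of the field move described above; the field and target model names come from the commit message, but the owning model (shown here as Entrance) and the field options are assumptions, not troggle's actual code:

    # inside an installed app's models.py
    from django.db import models

    class SurvexStation(models.Model):
        name = models.CharField(max_length=100)

    class Entrance(models.Model):
        # the old free-text value survives under a new name...
        nearest_station_name = models.CharField(max_length=200, blank=True)
        # ...while nearest_station becomes a real foreign key to SurvexStation
        nearest_station = models.ForeignKey(
            SurvexStation, blank=True, null=True, on_delete=models.SET_NULL
        )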
Sam Wenham
c8551991b2 Remove the redundant render_with_context() as django now does this just with the
render() shortcut
Move from mimetype to content_type, missed in last commit
2019-03-30 17:02:07 +00:00
Sam Wenham
64a4842dcb Remove the redundant render_with_context() as django now does this just with the
render() shortcut
Move from mimetype to content_type, missed in last commit
2019-03-30 17:02:07 +00:00
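Illustrative only (the view names and template are assumptions): Django's render() shortcut applies a RequestContext itself, which is what makes a local render_with_context() wrapper redundant, and HttpResponse takes content_type where it once took mimetype:

    from django.http import HttpResponse
    from django.shortcuts import render

    def cave_index(request):
        # render() builds the context and the response in one call
        return render(request, "caveindex.html", {"title": "Cave index"})

    def plain_text_report(request):
        # content_type replaces the old mimetype keyword argument
        return HttpResponse("ok\n", content_type="text/plain")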
Sam Wenham
f666b9c396 Update new management command for DB reset
Switch to content_type from mimetype
Make DB reset not nuke so much
Tidy logbook parser
2019-03-30 13:58:38 +00:00
Sam Wenham
a4532a29da Update new management command for DB reset
Switch to content_type from mimetype
Make DB reset not nuke so much
Tidy logbook parser
2019-03-30 13:58:38 +00:00
Wookey
5469794159 Only show unofficial number if it's not already displayed 2019-03-27 01:59:09 +00:00
Wookey
705dd51f30 Only show unofficial number if it's not already displayed 2019-03-27 01:59:09 +00:00
expoonserver
1e26578305 Add reload_db option to databaseReset.py 2019-03-26 23:59:13 +00:00
expoonserver
ddb62f2897 Add reload_db option to databaseReset.py 2019-03-26 23:59:13 +00:00
expoonserver
8b5f81c8f8 Display temporary numbers on main cave index, when they exist. 2019-03-26 23:58:27 +00:00
expoonserver
f8be510509 Display temporary numbers on main cave index, when they exist. 2019-03-26 23:58:27 +00:00
Sam Wenham
27af84da65 Remove the news section as it never gets updated
Fix logbook entry so the edit link works
Tidy the control panel page
2019-03-10 11:05:57 +00:00
Sam Wenham
121f0a6aac Remove the news section as it never gets updated
Fix logbook entry so the edit link works
Tidy the control panel page
2019-03-10 11:05:57 +00:00
Sam Wenham
9646c32819 Remove jquery.min.js from troggle as it busts the footer menu. Yep troggle has a footer menu!! 2019-03-09 19:32:00 +00:00
Sam Wenham
8932bdc466 Remove jquery.min.js from troggle as it busts the footer menu. Yep troggle has a footer menu!! 2019-03-09 19:32:00 +00:00
Sam Wenham
c3ab5c6096 Fix person chronology to get the date from the logbook entry 2019-03-09 18:43:58 +00:00
Sam Wenham
9fa93fdd15 Fix person chronology to get the date from the logbook entry 2019-03-09 18:43:58 +00:00
Sam Wenham
7a7433bc84 Fix people list
Cope with Jimmy McFoo as a name!
Don't set the top expo value in the code when it is piss easy to calculate
Fix typo from last commit
2019-03-09 18:21:10 +00:00
Sam Wenham
b4296f1736 Fix people list
Cope with Jimmy McFoo as a name!
Don't set the top expo value in the code when it is piss easy to calculate
Fix typo from last commit
2019-03-09 18:21:10 +00:00
Sam Wenham
ff8c5ef0c1 There is no point having two functions do basically the same thing so make the
load all logbooks call load logbook(expo)
Remove the return message from load logbook as it isn't used
2019-03-09 11:18:44 +00:00
Sam Wenham
1bac650aee There is no point having two functions do basically the same thing so make the
load all logbooks call load logbook(expo)
Remove the return message from load logbook as it isn't used
2019-03-09 11:18:44 +00:00
Sam Wenham
a22b42e832 Make the logbook parser a little more sane
Move the parser-to-expo mapping to settings
Set a default parser
Iterate over the expo years rather than the mapping list!
2019-03-06 23:20:34 +00:00
Sam Wenham
9fc80bed35 Make the logbook parser a little more sane
Move the parser-to-expo mapping to settings
Set a default parser
Iterate over the expo years rather than the mapping list!
2019-03-06 23:20:34 +00:00
Sam Wenham
afa5a8b940 Merge 2019-03-04 20:04:23 +00:00
Sam Wenham
59f8647e0f Merge 2019-03-04 20:04:23 +00:00
Sam Wenham
f593104c04 Backed out changeset: e80a936faab6 2019-03-04 19:39:57 +00:00
Sam Wenham
384b0438b4 Backed out changeset: e80a936faab6 2019-03-04 19:39:57 +00:00
Sam Wenham
dc6d89b0ca Backed out changeset: f23440eb11a3 2019-03-04 19:39:43 +00:00
Sam Wenham
e01507d541 Backed out changeset: f23440eb11a3 2019-03-04 19:39:43 +00:00
Rad
b505a26ce4 rebuild descriptions database, some visuals 2019-02-28 12:36:49 +00:00
Rad
a5e1529514 working on rebuilding everything 2019-02-27 22:29:45 +00:00
Sam Wenham
6f42bd51e1 Revert (I hate hg!!!) 2019-02-26 20:43:18 +00:00
Sam Wenham
42d10cf43d Revert (I hate hg!!!) 2019-02-26 20:43:18 +00:00
Sam Wenham
4e27c90f77 merge 2019-02-26 20:41:47 +00:00
Sam Wenham
2226aa34d5 merge 2019-02-26 20:41:47 +00:00
Sam
0268ff46b3 Add docker readme, settings and update compose file
Fix views_logbooks.py
2019-02-26 19:19:01 +00:00
Rad
1d7cf3f41a Messing with millenialcaves.html or similar 2019-02-26 14:07:45 +00:00
Rad
32c186afd7 Messing with millenialcaves.html or similar 2019-02-26 14:05:41 +00:00
Rad
54a9f7a37c Messing with millenialcaves.html or similar 2019-02-26 12:50:19 +00:00
Rad
e4e8cc5993 Messing with millenialcaves.html or similar 2019-02-26 12:47:50 +00:00
Rad
8703ed5d94 Messing with millenialcaves.html or similar 2019-02-26 12:30:20 +00:00
Rad
a4118261e1 Messing with millenialcaves.html or similar 2019-02-26 12:29:46 +00:00
Rad
6392c1f238 Messing with millenialcaves.html or similar 2019-02-26 12:23:12 +00:00
Rad
4148ece133 Messing with millenialcaves.html or similar 2019-02-26 12:07:45 +00:00
Rad
c724f292ca Messing with millenialcaves.html or similar 2019-02-26 12:03:17 +00:00
Rad
53513b812b Messing with millenialcaves.html or similar 2019-02-26 12:01:55 +00:00
Rad
beffdbd89d Messing with millenialcaves.html or similar 2019-02-26 12:01:30 +00:00
Rad
8bd0df1bab Messing with millenialcaves.html or similar 2019-02-26 10:57:02 +00:00
Rad
4ae43e94f4 Messing with millenialcaves.html or similar 2019-02-26 10:02:57 +00:00
Rad
da88771fd4 Messing with millenialcaves.html or similar 2019-02-26 09:45:17 +00:00
Rad
b6b7d2aa12 Messing with millenialcaves.html or similar 2019-02-26 09:41:02 +00:00
Rad
c733b0f2eb Messing with millenialcaves.html or similar 2019-02-26 02:03:26 +00:00
Rad
9712bf6dfd Messing with millenialcaves.html or similar 2019-02-26 02:01:09 +00:00
Rad
5e4c1493a1 Messing with millenialcaves.html or similar 2019-02-26 01:56:39 +00:00
Rad
41b1334257 Messing with millenialcaves.html or similar 2019-02-26 01:48:52 +00:00
Rad
a2fcbae129 Messing with millenialcaves.html or similar 2019-02-26 01:46:54 +00:00
Rad
e9077542c9 Messing with millenialcaves.html or similar 2019-02-26 01:46:05 +00:00
Rad
79595521a9 Messing with millenialcaves.html or similar 2019-02-26 01:45:03 +00:00
Rad
38b658fd3f Messing with millenialcaves.html or similar 2019-02-26 01:43:54 +00:00
Rad
a89123755c Messing with millenialcaves.html or similar 2019-02-26 01:43:28 +00:00
Rad
0fb9accd05 Messing with millenialcaves.html or similar 2019-02-26 01:41:15 +00:00
Rad
f87df707ab Messing with millenialcaves.html or similar 2019-02-26 01:37:52 +00:00
Rad
a2cb771fc1 Messing with millenialcaves.html or similar 2019-02-26 01:35:55 +00:00
Rad
c888f59ff0 Messing with millenialcaves.html or similar 2019-02-26 01:34:09 +00:00
Rad
43ff6e09be Messing with millenialcaves.html or similar 2019-02-26 01:30:32 +00:00
Rad
810ab3ea4f Messing with millenialcaves.html or similar 2019-02-26 01:18:47 +00:00
Rad
cb5978237b Messing with millenialcaves.html or similar 2019-02-26 01:13:54 +00:00
Rad
622d523c98 Messing with millenialcaves.html or similar 2019-02-26 01:12:14 +00:00
Rad
ee7d2529e7 Messing with millenialcaves.html or similar 2019-02-26 01:08:04 +00:00
Rad
82de967f97 Messing with millenialcaves.html or similar 2019-02-26 01:07:18 +00:00
Rad
466e667e14 Messing with millenialcaves.html or similar 2019-02-26 01:04:09 +00:00
Rad
3c563ce665 Messing with millenialcaves.html or similar 2019-02-26 01:03:22 +00:00
Rad
19a061efa8 Messing with millenialcaves.html or similar 2019-02-26 00:56:46 +00:00
Rad
a397eb9d00 Messing with millenialcaves.html or similar 2019-02-26 00:48:34 +00:00
Rad
e5d864359a Messing with millenialcaves.html or similar 2019-02-26 00:47:35 +00:00
Rad
b2adc285b6 Messing with millenialcaves.html or similar 2019-02-26 00:45:56 +00:00
Rad
8af604262d Messing with millenialcaves.html or similar 2019-02-26 00:43:46 +00:00
Rad
b33ca2b290 Messing with millenialcaves.html or similar 2019-02-26 00:43:05 +00:00
Rad
c4455168c6 Messing with millenialcaves.html or similar 2019-02-26 00:35:28 +00:00
Rad
1b4674acde Messing with millenialcaves.html or similar 2019-02-26 00:33:37 +00:00
Rad
4fac4317a3 Messing with millenialcaves.html or similar 2019-02-26 00:33:04 +00:00
Rad
78bf9986b7 Messing with millenialcaves.html or similar 2019-02-26 00:30:09 +00:00
Rad
5154c0d8e5 Messing with millenialcaves.html or similar 2019-02-26 00:29:16 +00:00
Rad
b01fcc3a6d Messing with millenialcaves.html or similar 2019-02-26 00:23:23 +00:00
Rad
e8585bec42 Messing with millenialcaves.html or similar 2019-02-26 00:22:58 +00:00
Rad
521f0241f8 Messing with millenialcaves.html or similar 2019-02-26 00:21:54 +00:00
Rad
0394becdac Messing with millenialcaves.html or similar 2019-02-26 00:21:27 +00:00
Rad
e5fa636776 Messing with millenialcaves.html or similar 2019-02-26 00:17:56 +00:00
Rad
6beaf4afdd Messing with millenialcaves.html or similar 2019-02-26 00:17:11 +00:00
Rad
822812525e Messing with millenialcaves.html or similar 2019-02-26 00:08:15 +00:00
Rad
a4a92483bd Messing with millenialcaves.html or similar 2019-02-26 00:04:27 +00:00
Rad
3254ba1443 2019-02-26 00:00:34 +00:00
Rad
4c3d0ce7fa 2019-02-25 23:55:06 +00:00
Rad
a99afe07c6 2019-02-25 23:53:19 +00:00
Rad
73bb60eff9 2019-02-25 23:52:47 +00:00
Rad
0a214c5d4b 2019-02-25 23:51:26 +00:00
Rad
29c53f35ab 2019-02-25 23:48:58 +00:00
Rad
3746dab5de 2019-02-25 23:46:52 +00:00
Rad
18dbadd675 space/tab 2019-02-25 23:42:56 +00:00
Rad
ee2cd0d391 trying to add new field 2019-02-25 23:40:53 +00:00
Rad
0cc4e7c7d3 2019-02-25 23:37:12 +00:00
Sam Wenham
478065786f Merge 2019-02-25 23:34:10 +00:00
Sam Wenham
e64d82cd92 Start of moving databasereset to django management 2019-02-25 23:10:24 +00:00
Sam Wenham
12a991920a Get get_absolute_url in the correct place 2019-02-25 23:07:20 +00:00
Rad
0758efb3ec 2019-02-25 22:34:13 +00:00
Rad
54b782c67e tab/space fix 2019-02-25 22:28:30 +00:00
Rad
78a5f656b9 added Rad's playground 2019-02-25 22:24:33 +00:00
Rad
6e23853759 merge 2019-02-25 21:02:30 +00:00
Rad
becfaa1504 change to table 2019-02-25 20:58:32 +00:00
Sam Wenham
77a6015ad6 Fix the All Survex page to work with 1623 area 2019-02-25 20:13:28 +00:00
Sam Wenham
7c15a7439d Decode the url encoded # when looking at wallets 2019-02-24 19:50:45 +00:00
Sam Wenham
b4f4db5754 Deal better with the wallet letter number combo of 2019#X01 2019-02-24 18:55:30 +00:00
Sam Wenham
c6656e6642 Stop django moaning about unit tests from pre 1.6, like we have any anyway! 2019-02-24 16:48:12 +00:00
Sam Wenham
e6fa54d0e5 Fix survey scans
Remove the assert for folders in survey wallets, this does mean currently they
will be ignored by troggle.
2019-02-24 16:46:02 +00:00
Sam Wenham
f16b4e3f47 Make the surveys importer not explode 2019-02-24 14:29:14 +00:00
Sam Wenham
4ad5b68433 Make things more compatible with newer python
Fix the expeditions list
Improvements to make it compatible with django 1.8
Bump the years to add 2018
Update the .hgignore file to ignore junk
2019-02-24 13:03:34 +00:00
Sam Wenham
552730f0a3 Revert urls.py as it contains Django 1.8 upgrade code 2019-02-23 15:43:38 +00:00
Sam Wenham
a1f02e575f Prevent troggle adding the menu if there is one in the file
Add a Docker compose file to bring up a dev troggle easily
Various PEP improvements
2019-02-23 15:30:58 +00:00
Sam Wenham
f58b1db920 Don't create years that aren't here yet, or troggle goes boom 2018-06-20 18:14:13 +01:00
Sam Wenham
3d2ac06a72 Move the years on a bit 2018-06-20 18:11:12 +01:00
expoonserver
9802f45452 Add missing linefeed on survey-parsing error message 2018-06-18 23:43:20 +01:00
expoonserver
1ad58d6b5d Make sure that cave parser only reads .html files in cave_data dir (to stop foo~ causing 'duplicate cave' error) 2018-06-18 23:17:05 +01:00
expoonserver
6805bcb690 Add 'troggle' namespace to databasereset.py so it runs in django >1.5 2018-06-17 02:41:58 +01:00
expoonserver
c162411f0b FileUploadForm does not work with django 1.7.
It tries to use database during class initialisation.
Removed it for now - not sure if it's important...
2018-06-17 02:24:00 +01:00
expoonserver
10a05d686e django.setup needs to be run before any attempt to use database 2018-06-17 02:23:02 +01:00
expoonserver
89ef5c19ff imports must specify the application name in later django versions.
databasereset updated accordingly.
2018-06-16 19:00:26 +01:00
Sam Wenham
4385ce86c1 Add the extra setting for the threed cache to all the template configs 2018-04-20 20:58:05 +01:00
Sam Wenham
46124a770f Fix the django for the spinny js cave viewer.
Make the paths settings (don't hard code things like this!!)
Add " round spinny urls from the late merge (the rest were done for the move off 1.4.2
2018-04-20 20:55:12 +01:00
Sam Wenham
6f6327d267 Merge with django-upgrade 2018-04-17 22:19:20 +01:00
expoonserver
6710a469ee Add CaveView spinny caves view to each troggle cave page 2018-04-17 21:57:02 +01:00
Sam Wenham
174c475ec7 Add default BooleanField(default=False) for django 1.7 compatibility 2018-04-17 21:51:39 +01:00
Sam Wenham
d3b42a125d 1.7 requirements 2018-04-15 16:45:07 +01:00
Sam Wenham
2f2f4d396d New vars needed to make django 1.7 and tinymce work 2018-04-15 16:36:23 +01:00
Sam Wenham
e1eea7088f Django 1.7 wsgi.py 2018-04-15 16:29:30 +01:00
Sam Wenham
760fa3114f missed from last commit 2018-04-15 16:28:52 +01:00
Sam Wenham
798ae591c6 Django 1.7 mostly working. Big refactor so probably bugs 2018-04-15 16:28:13 +01:00
Sam Wenham
7877efba0a Up to 1.6.11 on stretch. New manage.py. Some tidying 2018-04-15 12:00:59 +01:00
Sam Wenham
cfa888fde6 More cleanup and modernisation 2018-04-14 21:37:12 +01:00
Sam Wenham
cedcb0988a Clean up indenting in models
add registration required modules
2018-04-14 21:14:19 +01:00
Sam Wenham
c939013b14 Add ref as a valid survex command to prevent errors 2018-04-14 16:13:21 +01:00
Sam Wenham
458d0e1ebc add all the docker commands to build and run troggle in a container (more of a guide than something to run) 2018-04-11 22:32:47 +01:00
Sam Wenham
776152ef47 Add missing expose container port and commented command to auto-start the dev server 2018-04-11 22:18:15 +01:00
Sam Wenham
9f285a9f34 Update requirements for 1.5.12 and preserve the 1.4.22 requirements 2018-04-11 22:13:31 +01:00
Sam Wenham
302ad0632e Add the docker files and the pip requirements.txt to allow install using pip 2018-04-11 22:03:48 +01:00
Sam Wenham
ffb5d7bdda Upgrade to django 1.5, some functions have been changed
url in templates now requires quotes round the first arg
USE_TZ added
2018-04-11 22:02:57 +01:00
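A hedged illustration of the two changes mentioned in this commit (the view name is an example; only the quoting rule and USE_TZ come from the message):

    # Django 1.5 template syntax: {% url %} now takes a quoted view name
    OLD = "{% url caveindex %}"      # pre-1.5 form, now an error
    NEW = "{% url 'caveindex' %}"    # quotes required round the first argument

    # settings.py gains the timezone switch
    USE_TZ = True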
Sam Wenham
242cf4741a Import Image from PIL to support newer python
import the Django registration module rather than the troggle one
2018-04-10 01:34:06 +01:00
Wookey
41a14f161d Avoid barf if URL field in new cave form is left blank. 2018-02-28 15:57:27 +00:00
Wookey
f0e1406c5f Update old website base URL in template from cucc.survex.com/expo to expo.survex.com 2018-02-28 15:55:00 +00:00
expoonserver
d7c6676c49 Test whether url is not 'None' before applying 'startswith' test in
forms.py entering new caves, otherwise it barfs.
2017-10-25 03:49:03 +01:00
expoonserver
5e9dfc6ea6 Fix Scan scanning, so that 2015#X01 format (with 'X') is accepted in
scan directories. Allows 2016 data to be processed.
2017-03-07 15:44:42 +00:00
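A minimal sketch of the wallet-name matching this commit loosens; the regex is illustrative, not the parser's actual pattern:

    import re

    # accept plain 2015#01 wallets and the lettered 2015#X01 form
    WALLET_RE = re.compile(r"^(\d{4})#([A-Z]?)(\d+)$")

    for name in ("2015#01", "2015#X01", "2016#B12"):
        year, letter, number = WALLET_RE.match(name).groups()
        print(year, letter or "-", int(number))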
Sam Wenham
27fca090fc Bring troggle a little more up to date 2016-09-04 13:47:26 +01:00
expo
716131f005 Fix cave pages to have entrances and description on one page.
Fixes broken links on description and entrance pages.
Removes need for jquery-ui.
2016-07-02 23:42:47 +01:00
expo
496280f3e6 merge server changes
2016-06-09 04:16:46 +01:00
Sam Wenham
0dd0951b28 Merge 2016-05-20 21:35:58 +01:00
Wookey
b9597fbb57 Merge 'expofiles' instead of 'expoimages' config changes 2016-01-27 04:27:38 +00:00
Wookey
edc6591554 Correct typo on cave and entrance template files
('If you edit this files...')
2016-01-27 04:24:44 +00:00
expoonserver
560b9bf985 Move expoimage to expofiles
Relies on permanent redirect in apache config to keep old URLs working
everywhere.
2015-10-02 15:10:04 +01:00
expoonserver
6652e3f160 remove code saying we can't do interlaced pngs. It's fine now. 2015-10-02 15:07:03 +01:00
expo
b0f1f73ce4 Store expo user/password info in localsettings file, and not repeated in databaseReset script 2015-09-16 01:58:51 +01:00
expo
214d887c57 Commit changes made on expo 2015 2015-09-16 01:52:45 +01:00
Sam Wenham
6b16724c2a tidy up after merge 2015-08-22 13:28:17 +01:00
Sam Wenham
f1bb927063 Merge settings changes 2015-08-22 13:26:38 +01:00
expo
eeda1bed73 properly quote JSLIB_PATH and make clear that example password is just an example 2015-07-26 00:38:10 +01:00
Sam Wenham
751ec9517f Change JSLIB_PATH to JSLIB_URL and correct the path 2015-07-01 18:22:25 +01:00
Wookey
228814be33 Fix unquoted string in troggle localsettingspotatohut.py 2015-07-01 03:55:12 +01:00
Sam Wenham
cebcbeb73a sysadmin to expouser for email 2015-07-01 01:26:04 +01:00
Sam Wenham
057b09dca9 Move expo user settings out of databasereset.py to localsettings where they really belong 2015-07-01 01:18:25 +01:00
Sam Wenham
480541ae54 Add a little style 2015-06-28 13:52:33 +01:00
Sam Wenham
60303d041c Remove unnecessary escape slashes 2015-06-28 13:46:28 +01:00
Sam Wenham
5a911ecec7 I think this is breaking prospecting 2015-06-28 13:39:50 +01:00
Sam Wenham
7056f9a8b2 Remove balkonhoehle from the QM parser as this will need a lot of effort to get working 2015-06-28 12:28:18 +01:00
Sam Wenham
34036581f2 Correct JSLIB_URL 2015-06-27 13:01:15 +01:00
expo
dcc67fddda Don't put passwords in the repo 2015-06-24 04:41:50 +01:00
expo
03cad0a37f Survex parser fix to avoid allocation on error (by martin). 2015-06-24 04:09:19 +01:00
expo
a4651eaa0a Added warnings that the database will need updating if cave or entrance data files are modified 2015-06-21 15:11:51 +01:00
expo
7aed3d3b30 Moved notable caves to settings.py, link to a script to fix permissions 2015-06-21 15:08:09 +01:00
expo
4771f52b20 Have different links for system js files and troggle js files 2015-06-21 15:06:44 +01:00
Wookey
77ad85b05c merge balconhoehle changes from server 2015-06-19 01:55:51 +01:00
Wookey
01d877d26e Use django-registration, not a local copy.
This old one uses deprecated hashcompat.
2015-06-10 23:52:49 +01:00
DWalker
e84d990366 Add in balkon hoehle QM list 2015-05-25 21:55:54 +01:00
Wookey
e06be10f7f Change password of 'expo' user created by databasereset script to match that used elsewhere 2015-05-25 21:26:26 +01:00
Wookey
fe6750e824 Fix up obvious URLs containing subarea names (smkridge) 2015-04-08 03:40:57 +01:00
Wookey
d29fe2ee1c Merge in Sam's parser debugging 2015-04-08 03:27:48 +01:00
Wookey
1156b1d3ea rename troggle paper.odt to troggle_paper.odt as spaces in the repo are a pain
2015-04-08 03:24:00 +01:00
Wookey
126a10cf94 Rename troggle paper to not have a space in it. 2015-04-06 02:38:24 +01:00
Sam Wenham
4560e0da84 Revert all of this; the date is needed and is NOT NULL in the db 2015-01-26 21:53:32 +00:00
Sam Wenham
f9c2e0e170 One more try 2015-01-26 21:15:17 +00:00
Sam Wenham
cf413dd03c Ooops that wasn't right 2015-01-26 21:13:47 +00:00
Sam Wenham
4965678443 Don't assert an error on bad date formats 2015-01-26 21:12:27 +00:00
Sam Wenham
67f94f9436 A little more verbosity 2015-01-19 22:48:50 +00:00
Sam Wenham
1186662960 Add a little verbosity 2015-01-19 22:41:48 +00:00
Sam Wenham
3010961383 Try and ignore files that don't end in .html (We really need to change to .xml) eg .html.orig!!
Change the index on troggle to move on with the year
2015-01-19 21:28:35 +00:00
Wookey
806fd41130 remove two files accidentally included in last commit 2014-09-11 07:41:33 +01:00
Wookey
af07161f05 remove internal copies of jquery, jquery-forms, jquery-ui+themes,
django-feincms and codemirror
2014-09-11 07:40:58 +01:00
Wookey
5ff759db93 Fix templates to use system javascript for jquery, jquery-ui and
jquery-ui themes
2014-09-11 07:38:45 +01:00
Wookey
7f292d402b Use REPOS_ROOT_PATH so there is just one place to change paths 2014-09-11 06:33:34 +01:00
Wookey
c180780da9 Update the README file a bit - still needs work. 2014-09-10 23:46:05 +01:00
Wookey
d75862bc41 Merge change of 'cavesnew'->'caves' in databasereset. 2014-07-28 01:22:52 +01:00
Wookey
7cdb603d75 Add 107 to notable caves (noting that this is hard-coded into
core/views_caves.py which is just shoddy)
2014-07-28 01:21:24 +01:00
expo
94c44b0d7b Change databasereset to use 'caves' instead of 'cavesnew' for reloading the cave database 2014-07-28 00:18:10 +01:00
expo
4a3d181097 Set potato hut localsettingsfile to have correct URLs 2014-07-23 09:47:48 +01:00
Sam
d8863dca48 Fix media url to allow for working in the hut 2014-07-23 09:10:31 +01:00
expo
e0c439e850 Add a new config file for the potato hut setup. 2014-07-23 09:11:17 +01:00
Wookey
f4f1b3ca6d Allow comma in starcommands (*,fix) (comma is default valid *set blank) 2014-07-01 02:26:26 +01:00
Wookey
4a93790c7e Fix survex parser to allow whitespace between * and command (as survex
does).
2014-07-01 02:12:34 +01:00
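An illustrative pattern only (not the troggle parser's real regex): survex itself tolerates whitespace between the leading '*' and the command word, so the parser should accept "*  begin ..." as readily as "*begin ...":

    import re

    STAR_CMD_RE = re.compile(r"^\s*\*\s*(\w+)\s*(.*)$")

    for line in ("*fix p1 0 0 0", "*  begin somecave", " * include caves/204"):
        cmd, args = STAR_CMD_RE.match(line).groups()
        print(cmd, "|", args)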
Wookey
5265acd9dc merge in survex parsing changes from server. 2014-06-26 02:37:55 +01:00
expoonserver
9f69bb5fca Remove spurious real password from example localsettingserver.py file.
Add comment on how to use it.
2014-06-26 02:35:37 +01:00
expoonserver
b1d6e1c3d5 Replace assert on unrecognised commands with print, so that a minor
parsing issue doesn't completely kill a parsing update.
Add parsing for requires and alias commands.
2014-06-26 02:34:19 +01:00
Wookey
659703b221 Merge with server version 2014-06-09 19:30:06 +01:00
expoonserver
3869bd536e remove humongous troggle_log.txt from repo 2014-05-19 03:12:16 +01:00
expoonserver
408d154d3f Refer to debian package, not upstream URL 2014-05-19 03:11:46 +01:00
Wookey
44e3eb8a18 Tidy up urls file a little 2014-05-14 20:46:59 +01:00
wookey
51a3cecc02 Document 'cavesnew' option in databasereset.py - which just reads in
caves datafiles.
2013-10-07 23:45:59 +01:00
olly
6b4ea7b83e merge 2013-08-08 15:48:10 +02:00
expo
da71cca22f Prospecting guide and images and few minor other things. 2013-08-01 17:00:01 +02:00
wookey
5c945e3431 Put correct user for mysql on seagrass back into config (It was accidentally overwritten in recent changes) 2013-07-06 09:28:39 +01:00
Wookey
ba5bc365c1 merge support for django 1.2 location for auth module 2013-07-02 21:12:59 +01:00
Wookey
c362b1b529 3rd attempt at getting the right syntax for the CSRF protection in 2013-07-02 21:11:07 +01:00
Wookey
f90b6dc7ab update location of auth module for django 1.4 2013-07-02 21:10:30 +01:00
wookey
a6a9016548 Add support for old and new (1.4 on) location for auth module. 2013-07-02 21:05:48 +01:00
Wookey
5351108ec1 merged in proper CSRF changes from server 2013-07-02 20:23:55 +01:00
Wookey
7759e481d4 Change database syntax to modern format as old style no longer
supported in django 1.4
2013-07-02 18:13:27 +01:00
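A hedged example of the settings change this commit refers to (the values are placeholders, not the expo server's real configuration): the flat DATABASE_* variables were replaced by the DATABASES dictionary, the only form Django 1.4 accepts:

    # old style, no longer supported:
    # DATABASE_ENGINE = "mysql"
    # DATABASE_NAME = "troggle"

    # new style:
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "NAME": "troggle",
            "USER": "expo",
            "PASSWORD": "change-me",
            "HOST": "localhost",
            "PORT": "",
        }
    }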
Wookey
69c3a06c98 Remove support for django 1.0 CSRF as we only care about 1.2 or later 2013-07-02 18:12:18 +01:00
Wookey
d1ad8730d7 Add CSRF protection to registration form (and remove annoying second
password)
2013-07-02 18:10:45 +01:00
wookey
f3a570a21d Add csrf token to registration forms 2013-07-02 17:26:35 +01:00
Wookey
f626d3304d parsing_log should not be saved in the vcs 2013-07-02 00:49:07 +01:00
Wookey
7eb4c89bf0 Don't explode if a master survex file is not found for a directory -
that shouldn't cause total failure to read the database in.
2013-07-02 00:47:42 +01:00
wookey
9435be0f19 Add 'people' option to DatabaseReset.py, to read in just the folk list after update.
Not sure that it actually works mind...
2013-07-02 00:34:58 +01:00
wookey
7f108f6d9a Set title to show 1976-2013
Put quick link to 2011 back as that one works
2013-07-02 00:33:53 +01:00
wookey
3f98470af8 Add a function for running people parser
And comments on how logbooks can't be read in until 'year' exists in database
2013-06-25 15:59:19 +01:00
wookey
e58b69782c Add note on how to create a new year in troggle. 2013-06-25 15:56:19 +01:00
wookey
e49e22b37c Removed asserts which meant that if any 'odd' .svx files, or directories
with no obvious 'controlling' svx file, were added to the dataset then the
survex viewer code exploded and the website didn't work.

It's wrong that adding a new cave with an oddly-named .svx file can break
the website in this way, so these asserts are wrong.
2013-06-24 23:32:12 +01:00
wookey
82e69b4f05 Add parsing_log.txt to the files ignored by the VCS. 2013-06-24 23:29:14 +01:00
wookey
ea9266ecf9 Add help command and usage info to databaseReset.py 2013-06-24 01:31:14 +01:00
wookey
99ea6778ad Add comment to identifycavedir function
and remove now-disused special-case filename
2013-06-24 01:30:17 +01:00
wookey
ccd80e74f8 Change template headers to show 2012/2013 as shortcuts 2013-06-23 03:19:41 +01:00
Wookey
3057d2a232 Add checking for compass too
Only print filenames on error by default
2013-05-22 02:33:47 +01:00
Wookey
d1ac659d4f Add error check in place where parser died 2013-05-22 02:10:58 +01:00
wookey
bb1989d0f0 Add some exception checking to parsers/caves.py so that missing entrance
slugs don't blow up the import. Also reduce the noise, so
you just get a warning about missing slugs printed out
2012-09-24 23:23:38 +01:00
wookey
418e5e1d3f Add debug for which entrance file was being read so we get a clue where to look when 'databasereset newcaves' falls over 2012-09-24 22:38:35 +01:00
Wookey
3b12e6d975 Add some debug to cave parser as it's easy to make it fail
e.g. by referring to slugs that don't exist.
2012-09-24 22:29:18 +01:00
expoonserver
54d7f1d097 Remove jgtfile URLs (presumably no longer needed) 2012-09-08 01:12:17 +01:00
Martin
cfc90deb83 Merge 2012-08-14 23:49:26 +02:00
Martin
1a0e577606 Bug fixing of cave and entrance forms removal of slugs 2012-08-14 22:51:15 +02:00
Martin
a05fe94d90 ignore files ending in ~ 2012-08-14 15:31:34 +02:00
Martin
8e64062214 added entrance locations 2012-08-14 15:08:08 +02:00
Martin
8c1882eec8 fixed spelling 2012-08-14 15:06:18 +02:00
Martin
8dd51096cf allow extensions to be capitalised 2012-08-14 15:05:15 +02:00
expo
ecd5bbcb1d Started removing foreignkeys to caves, to achieve greater flexibility. Some log book entries stuff may be broken. Add ability to make new caves and entrances via website. 2012-08-12 19:10:23 +02:00
Martin Green
6d5babd331 Prospecting template 2012-08-10 19:34:44 +02:00
Martin Green
79b7d32664 Made a prospecting guide and fixed survex station description. Removed parsing of underground descriptions to wikis. 2012-08-10 19:02:13 +02:00
expo
dd66ad835a Fixed directory names for the survey scans such that surveys could be found. It did not seem possible to simply change the localsettings.py file to get it to work. 2012-08-08 11:29:15 +02:00
expo
a29fd964bd Prevent modification of auto generated files 2012-08-06 12:56:20 +02:00
expo
1ef274ec1d Editing no longer changes files more than necessary. Removed TinyMCE editing. /Sumbit/Submit 2012-08-06 12:19:48 +02:00
expo
0f5627505f Fix broken markup 2012-08-05 21:37:46 +02:00
expo
c0782e1cca Fixed cave order 2012-08-05 19:28:34 +02:00
expo
ed1d273e03 Fixed cave order 2012-08-05 19:26:24 +02:00
expo
9654e5da1c Fix base template so admin link, expoweb link work and use consistent base URL 2012-08-05 02:33:48 +02:00
expo
8040b746b4 Note that the instructions for adding a survey are all wrong. 2012-08-05 00:35:02 +02:00
expo
05004aa874 Fix up parser paths so everything is found 2012-08-05 00:26:05 +02:00
Martin Green
4a21720745 Merge 2012-06-10 17:24:10 +01:00
Martin Green
13cb2e9b0f no need to export to cavetab2 anymore 2012-06-10 17:22:50 +01:00
ExpoOnServer
0259947cda merge 2012-06-10 17:21:26 +01:00
ExpoOnServer
080684e56f no need to export cavetab2 anymore 2012-06-10 17:20:57 +01:00
Martin Green
4b269bb234 update caves from new cave file format not cavetab2.csv 2012-06-10 17:16:33 +01:00
Martin Green
1a62931202 Merge 2012-06-10 16:56:44 +01:00
Martin Green
c2029df3c9 New parser for new cave format 2012-06-10 16:56:12 +01:00
ExpoOnServer
4a074295ad Looks like photos have been added by editing urls.py. 2012-06-10 16:19:17 +01:00
Martin Green
711fefb0da Start to change dataformat for caves, along with their editing. Start to change survex reader to cope better with equates/tags. 2012-06-10 14:59:21 +01:00
Martin Green
fd12e70f78 Editing for entrances along with caves
More detailed display of entrances
2012-05-23 09:23:40 +01:00
Martin Green
fac89bae30 Render a cave editing page. Nb it does not save anything yet. 2012-01-07 19:05:25 +00:00
Wookey
ab97e367cb merge from upstream 2011-09-15 12:13:07 +01:00
Wookey
ae693ca4c5 Add 2010 and 2011 logbooks to parsing list (can we make this auto
somehow - by agreeing a logbook format, or letting it guess)?
2011-09-15 12:12:18 +01:00
expo
77dea07b40 branch merge 2011-09-02 03:39:20 +02:00
expo
77dcf7f759 Remove old ref to goatchurch in localconfig 2011-09-01 01:50:51 +02:00
Martin Green
59e7c4d5df Bug fix 2011-08-08 13:11:57 +01:00
Martin Green
0b5e57b85e ignorecase when finding html tags 2011-08-08 12:58:02 +01:00
Martin Green
c623acf832 template changes. Fix link to css. 2011-08-08 12:40:47 +01:00
Martin Green
36b1888f46 Added 'page not found do you want to make this page' page. Minor tweaks 2011-08-08 12:18:47 +01:00
Martin Green
c09a668620 Fix logbook editing 2011-08-08 12:17:38 +01:00
Martin Green
e85c386375 Make a common base for expoweb pages. Ignore any header information in expoweb except titles. 2011-08-08 10:58:50 +01:00
Martin Green
c66ecc4d7f Allow pages to be rendered when the body tag has attributes. Put an edit link on the homepage. 2011-08-08 10:04:59 +01:00
Martin Green
13fe89af9f Allow for editing flatpage titles, and made a common uneditable list of links. 2011-08-08 09:51:47 +01:00
Martin Green
d8fe39ae86 Allow the viewing of noinfo caves on non public website without login 2011-08-08 08:51:12 +01:00
Martin Green
5f5359f933 Changed regex for finding head and body of flat pages. 2011-08-07 19:17:27 +01:00
Martin Green
e820a516de bug fix for edit link for index files 2011-08-07 17:30:18 +01:00
expo
e9fdea80c0 Changed ubuntu local settings to be applicable to the expo machine 2011-08-07 16:12:52 +02:00
expo
9534bd8881 Make caveindex link to urls in the original hierarchy such that their
hyperlinks and images work.
2011-08-07 16:11:35 +02:00
ExpoOnServer
5be508620e update localsettings for server and expo machine 2011-07-14 03:50:49 +01:00
Wookey
82e968d5c7 Attempt 17b to end with the right files as tip 2011-07-12 02:44:07 +01:00
Wookey
b4b060a962 Add odt and ods mime types to our list.
Maybe this should just be read in from the real list...
2011-07-12 00:57:48 +01:00
ExpoOnServer
64e5e9d45c merging correct urls.py for /troggle dir in 2011-07-12 00:49:24 +01:00
ExpoOnServer
881215e815 Add empty troggle_log.txt file to save doing it by hand 2011-07-12 00:02:01 +01:00
ExpoOnServer
35cd983cc9 I seem to be going wrong in circles here 2011-07-11 23:45:12 +01:00
Wookey
0a70039dee really, really get all versions the same! 2011-07-11 23:43:32 +01:00
ExpoOnServer
18ccc57f87 add /troggle dir (Martin's changes to get main site back as entry point) 2011-07-11 23:35:11 +01:00
Wookey
c23fcc5b06 rest of martin's changes, without reverting lineend issues 2011-07-11 23:28:23 +01:00
Wookey
21ff3b8b5d Add changes from martin 2011-07-11 23:19:48 +01:00
Martin Green
97c388dba0 Moved troggle main page to /troggle added a link in flat pages.
Now / takes you to the expoweb index page
2011-07-11 22:38:40 +01:00
Martin Green
10799e2ce3 Do not make an entrance redirect for entrances without their own pages 2011-07-11 22:37:49 +01:00
Martin Green
7ef6b1fcc2 implemented mimetypes, index.htm(l) and fixed edit view 2011-07-11 22:36:48 +01:00
Martin Green
7a220b4c87 Change absolute url for caves to their expoweb url, such that links work 2011-07-11 22:35:32 +01:00
Wookey
dc1327674c remove all the DOS linefeeds 2011-07-11 02:10:22 +01:00
Wookey
c8ff8e3ef6 Add /index.htm to EXPOWEB root URL in main template so that you get
the static stuff
2011-07-11 01:55:12 +01:00
Wookey
f766df597c undosify lineends 2011-07-11 01:49:03 +01:00
Wookey
bab92cb88c merge martin's tip again 2011-07-11 00:52:58 +01:00
Martin Green
5d8a5494cd Split up tags such that they use ajax 2011-07-11 00:50:07 +01:00
Wookey
129d93dfa7 Merge from Martin's tip 2011-07-11 00:49:18 +01:00
Martin Green
65c55f0f21 Removed conversion to wiki, replaced Surveystation models with text, added area 1623 to all relevant caves. 2011-07-11 00:15:59 +01:00
Martin Green
8578a3097a Added flat pages for entrance and special flatpage redirects.
Entrances should probably store their urls like cavers.  Maybe the flatpages should be handled by the app Aaron installed.
2011-07-11 00:13:06 +01:00
Martin Green
de5f68e42c Removed links to removed forms 2011-07-11 00:04:30 +01:00
Martin Green
f44b0be459 slug views, start of cave edit form, cavelist splitting up by kataster area etc. 2011-07-11 00:03:36 +01:00
Martin Green
a128401d49 Added parsing of all.svx, alongside parsing individual caves.
Added the making and parsing of all.pos to determine the location of stations.
More work is required so the caves are parsed and stored only once.
Survex parsing appears to include bugs that print out errors.
2011-07-11 00:01:12 +01:00
Martin Green
5075ded032 Removed modelforms for Caves started to add normal forms 2011-07-10 23:57:31 +01:00
Martin Green
47c2e87979 Removed SurveyStation model (not SurvexStation) 2011-07-10 23:55:54 +01:00
Martin Green
53352e7987 Added THREEDTOPOS setting for survex's 3dtopos program 2011-07-10 23:53:32 +01:00
Martin Green
44f86a7d6f Added url to cave and turned entrance stations into station names and removed the previous SurveyStation model.
Note caves should be rendered in the directory of their original url to make links work.
Note SurveyStations appeared to duplicate SurvexStations.
Note Given we want to be running from a mercurial repository, it is easiest to store the names of survey stations rather than foreign keys.
2011-07-10 23:52:18 +01:00
Martin Green
c37124d9c4 Add ability to views caves via their cave slug. Not recommended until links are fixed. 2011-07-10 23:48:13 +01:00
Martin Green
69ab1e0249 Changed the regex to make 2003 expo logbooks parse 2011-07-10 23:45:45 +01:00
Martin Green
2fd8052ac2 Added Redmond style for jquery-ui 2011-07-10 23:40:52 +01:00
Wookey
28924db9f8 merge fix from martin's tip. 2011-07-10 23:30:36 +01:00
Martin Green
50545af223 Added editing of flat pages. Added slugfields to models to refer to them. 2011-06-02 19:16:16 +01:00
expo
30829ff9c8 debug 2011-05-02 03:25:43 +01:00
Martin Green
ede9e4a9bd debug 2011-05-02 03:23:59 +01:00
Martin Green
04d0e80430 debug 2011-05-02 03:22:45 +01:00
Martin Green
366d4736ca Try to fake csrf tags so site works on django 1.1 2011-05-02 03:20:31 +01:00
Martin Green
f3391a912e Attempt to get CSRF tag not breaking django 1.1 2011-05-02 03:13:54 +01:00
Martin Green
52eb4030d0 Attempt to get csrf tag working in django 1.1- 2011-05-02 03:11:17 +01:00
Martin Green
835680f0ee Get CSRF middleware to work on django 1.1- and 1.2+ 2011-05-02 02:51:14 +01:00
Martin Green
cdf54e0f9b Added ability to host website not at the root, eg. http://m.com/troggle/ 2011-05-02 02:37:33 +01:00
Martin Green
b439d40120 Debugging, and make get_name function accessible (should really be renamed) 2011-05-02 02:15:54 +01:00
Martin Green
cb744ddeef CSRF protection 2011-05-02 02:14:15 +01:00
Martin Green
872ffe5882 decorator to check if user is logged in if settings.PUBLIC_SITE 2011-05-02 02:13:27 +01:00
Martin Green
671e946c6d settings.PUBLIC_SITE, login required if public for logbook entry, CSRF middleware 2011-05-02 02:12:26 +01:00
Martin Green
3928609c29 Bug fix to expedition links 2011-05-02 00:56:53 +01:00
Martin Green
e942c839a1 Link to expowebsite 2011-05-02 00:53:44 +01:00
Martin Green
bff34aafb9 FIX2 2011-05-01 23:21:47 +01:00
Martin Green
7623943f3e Fix 2011-05-01 23:11:18 +01:00
Martin Green
6d7691791a Added settings hooks for TinyMCE. On debian apt-get install tinymce python-django-tinymce 2011-05-01 19:58:38 +01:00
Martin Green
b001df1f53 edit logbooks, new logbook format, increased database normalisation 2011-05-01 19:32:41 +01:00
Martin Green
1cc7f2d92e Allow survey scans to be scrapped with a file in the top level directory of the year 2011-05-01 19:20:25 +01:00
Martin Green
7a0a898bc6 Added variables to configure TinyMCE 2011-05-01 19:17:57 +01:00
Martin Green
41aca4e2d7 Added files for jQuery to allow for UI and dynamic formsets. 2011-05-01 19:15:34 +01:00
Martin Green
7e89b12004 Setup files for hg to ignore (*.pyc, db*, localsettings.py) 2011-05-01 19:13:07 +01:00
Aaron Curtis
7bac9f829e Renaming main branch from 'svn' to 'default' per mercurial convention.
Hopefully this will keep the main branch as the active one, so the Erebus branch is only used if requested.
2009-09-27 00:43:01 -06:00
goatchurch
2435639498 rolled back a bad update 2009-09-14 23:23:09 +01:00
expo
2be3e4ce9d get survey scans into database 2009-09-14 23:09:50 +01:00
goatchurch
1294444026 make 2008 logbook correctly parse 2009-09-14 22:52:46 +01:00
goatchurch
7578b65573 able to save sketches up from tunnel 2009-09-13 17:27:46 +01:00
goatchurch
ced45c92f7 tunnelfiles scheme added 2009-09-11 23:56:47 +01:00
goatchurch
f21cddb2d0 modelviz added 2009-09-11 09:04:59 +01:00
goatchurch
735b729a41 survey scans features added 2009-09-10 22:07:31 +01:00
goatchurch
c5b933f922 parsing 2009-09-08 23:05:04 +01:00
goatchurch
ce6fe2590d login required for saving survex files 2009-08-29 18:35:02 +01:00
goatchurch
7509a76eb0 login required for saving survex files 2009-08-29 18:34:18 +01:00
goatchurch
41eaa06e55 login required for saving survex files 2009-08-29 18:34:01 +01:00
goatchurch
7429749004 login required for saving survex files 2009-08-29 18:33:44 +01:00
goatchurch
709f9954f4 login required for saving survex files 2009-08-29 18:33:28 +01:00
expo
29adaa03c6 get rid of photo 2009-08-29 18:08:55 +01:00
goatchurch
9f169fb2b9 enable admin url 2009-08-29 17:30:07 +01:00
goatchurch
6b8294d9dc remove dependence on latest django 2009-08-29 16:23:11 +01:00
goatchurch
0ea70273fe quick hack to make work in django1.0 Photo to DPhoto 2009-08-23 23:29:05 +01:00
goatchurch
c66b5e2dad [svn] latest hacking for various statistics 2009-08-05 11:58:36 +01:00
goatchurch
9077462893 [svn] now with ability to make new svx file 2009-08-01 07:31:27 +01:00
goatchurch
7158a79a34 [svn] full checkin. animations disabled, sorry 2009-07-27 13:43:43 +01:00
goatchurch
68060d6118 [svn] some file reading things 2009-07-27 13:42:54 +01:00
substantialnoninfringinguser
ddbdc73e7e [svn] fix indexError bug julian found 2009-07-22 16:35:49 +01:00
substantialnoninfringinguser
263b640641 [svn] Various bug fixes, using more raw_id fields in admin so it loads faster. I had to put onLoad="contentHeight();" back into the base template. This is a bad solution, I would rather use Martin's, but it wasn't working. 2009-07-22 16:18:00 +01:00
goatchurch
84ad39f24a [svn] bugged 2009-07-21 07:20:34 +01:00
substantialnoninfringinguser
408a4c79aa [svn] 2009-07-17 01:14:37 +01:00
substantialnoninfringinguser
b9bbccfe00 [svn] * Make descriptions parser also replace links to descriptions from Cave models' underground_descriptions with wikilinks for valid (existing) links
* Make entrances searchable in admin by cave kataster number
2009-07-16 05:37:33 +01:00
substantialnoninfringinguser
05d262e42b [svn] only logged in users should see the tasks page thing 2009-07-15 01:55:26 +01:00
substantialnoninfringinguser
18e61d19f5 [svn] * wikilink to html for subcaves and cave descriptions
* fix header regex
2009-07-12 06:30:24 +01:00
substantialnoninfringinguser
4a073ea161 [svn] Add regex to turn ==headers== into <h2>headers</h2> 2009-07-12 05:54:08 +01:00
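A minimal sketch of the wiki-header rule described in this commit; the pattern is an illustration, not the actual troggle code:

    import re

    def wiki_headers_to_html(text):
        # "==Section==" on a line of its own becomes "<h2>Section</h2>"
        return re.sub(r"^==\s*(.+?)\s*==\s*$", r"<h2>\1</h2>", text, flags=re.MULTILINE)

    print(wiki_headers_to_html("==Entrance series==\nSome text."))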
substantialnoninfringinguser
2993ca74cc [svn] override save for CaveDescriptions to scan qm wikilinks and add into the manytomany field linked_qms 2009-07-11 01:36:00 +01:00
substantialnoninfringinguser
1566923d5c [svn] Make QM wikilinks work in new format, and fix cave description parser to output working wikilinks. 2009-07-09 05:08:21 +01:00
substantialnoninfringinguser
b0073caf5f [svn] not ready for that yet 2009-07-06 08:35:08 +01:00
substantialnoninfringinguser
8ad044cb2c [svn] * Make QM wikilinks work again
* Add new ajax bit in LogbookEntry admin which checks for QMs not in wikilink format and allows one click fixes. Soon to be expanded to check for wikilinks that aren't in foreignkey.
* Tweaks to admin including using raw_id_fields for PersonExpedition & other foreignkeyed models with lots of instances.
2009-07-06 08:31:24 +01:00
martin speleo
8a9eb32aaf [svn] wiki_to_html changes.
Changes views of qm model.
2009-07-04 19:35:06 +01:00
substantialnoninfringinguser
7f2199405d [svn] 2009-07-04 19:29:19 +01:00
substantialnoninfringinguser
38a545e174 [svn] Remove old subcave model, along with mptt and feincms. Also move OtherCaveNames admin representation to an inline in Cave. 2009-07-04 19:26:51 +01:00
substantialnoninfringinguser
4f0271ad49 [svn] 2009-07-04 18:41:48 +01:00
martin speleo
7fc1602f7a [svn] Initial and poor attempt at a view for cave descriptions. 2009-07-04 18:11:20 +01:00
martin speleo
aa26690e33 [svn] Parser for cave descriptions 2009-07-04 17:19:30 +01:00
martin speleo
09581829d1 [svn] Changed addToArgsSurveyStation such that it does not pass a surveystation model to html_to_wiki, which was unnecessary as html_to_wiki returned it without modification. By removing it, html_to_wiki can be cleaned up. 2009-07-04 17:08:48 +01:00
martin speleo
3afb94f5d2 [svn] Work on turning html pages into cavedescription models.py.
Moved parser/cavetabs html_to_wiki function to utils.py
Added databaseReset.py desc to refresh the cavedescriptions.
2009-07-04 16:42:17 +01:00
martin speleo
29f084613d [svn] removed redundant import 2009-07-04 16:39:59 +01:00
substantialnoninfringinguser
dd76a1a0be [svn] * Adding JS fill in next QM number via ajax.
* Slight models cleanup- get rid of TroggleImageModel class, use mixin instead.
* Collect various troggle shared functions into utils.py
2009-07-04 08:27:49 +01:00
martin speleo
c132477f80 [svn] Added cavedescription and new subcave.
Changed parsers/survex to read *title into subcave
2009-07-04 00:28:28 +01:00
martin speleo
92635f6f68 [svn] Change to get js in admin work for feincms 2009-07-04 00:26:12 +01:00
martin speleo
65ef255b99 [svn] Fixed the following of *includes by adding white space to the end of the regex. 2009-07-03 23:56:39 +01:00
substantialnoninfringinguser
854fe85132 [svn] 2009-07-03 21:59:31 +01:00
martin speleo
4da6203828 [svn] Fixed setContentHeight to work properly for eye candy view, whilst removing it from the non-eyecandy view 2009-07-03 21:29:02 +01:00
martin speleo
7db1aae5ee [svn] Remove broken import search 2009-07-03 21:04:28 +01:00
substantialnoninfringinguser
b4388d838e [svn] 2009-07-03 20:49:04 +01:00
substantialnoninfringinguser
8446047ab2 [svn] Brief code cleanup. 2009-07-03 05:31:49 +01:00
substantialnoninfringinguser
dc19150eba [svn] whoops 2009-07-03 00:51:41 +01:00
substantialnoninfringinguser
a89139763f [svn] Use template block "related" for related objects. Various cleanup, fix personexpedition date views. 2009-07-03 00:50:56 +01:00
substantialnoninfringinguser
dab138c731 [svn] More fallout of renaming expo to core. Also fix 2009-07-02 23:02:42 +01:00
substantialnoninfringinguser
205a73917d [svn] Fix leftover from expo -> core rename, and add databaseReset.py to README.txt 2009-07-02 22:31:28 +01:00
substantialnoninfringinguser
ae3fe8cd42 [svn] Renaming troggle.expo to troggle.core. To do this, used:
perl -p -i -e "s/expo(?=[\s\.']+)/core/g" `find -name \*.py`

and then manually checked each change (had to remove a couple)
2009-07-02 20:43:18 +01:00
substantialnoninfringinguser
c0b274767b [svn] Add photos wiki syntaxes: e.g.
[[display:centre photo:andyc.jpg]] where centre is a class applied to image, and andyc.jpg is the filename of a Photo model instance. Image will be displayed as thumbnail with link to full size image.
[[photo:andyc.jpg]] will produce a link to the admin page for the andyc.jpg Photo model instance.
[[photo:andyc.jpg Title of the link]] will produce a link to the admin page for the andyc.jpg Photo model instance with link text "Title of the link"
2009-07-02 04:10:51 +01:00
martin speleo
620040bde1 [svn] Fixed accidental removal of fading in margin pictures from main page of eye candy site.
Reduced non eye candy margins.
Moved set contents style height function into main.js from being embedded js, and ran when eye candy is turned on.  Remove style attribute when eye candy is turned back off.
2009-06-28 23:11:45 +01:00
martin speleo
22aa9990a5 [svn] Have different css for plain and eye candy views. 2009-06-28 22:23:56 +01:00
goatchurch
16b7404d9b [svn] horrid .svns copied accidentally 2009-06-28 21:26:35 +01:00
goatchurch
db5e315db0 [svn] forgot to add directory 2009-06-28 21:22:16 +01:00
goatchurch
4c87ce59d3 [svn] with command option 2009-06-28 20:47:11 +01:00
martin speleo
ca7bc171c9 [svn] Fixed small semantics issues stopping base.js working with IE.
Made toggle eyecandy persistent (using a cookie)
Made toggle eyecandy turn off footer menu images
Only load footer menu images if the eyecandy is being used.
2009-06-28 19:33:24 +01:00
substantialnoninfringinguser
b55b17ccc1 [svn] Make header scroll with page because Julian said so 2009-06-19 15:38:32 +01:00
substantialnoninfringinguser
59830c80af [svn] Add readme with installation instructions. 2009-06-19 07:02:25 +01:00
substantialnoninfringinguser
b4a63eca02 [svn] Adding logbook export features. Troggle can now produce .txt or .html logbooks through the controlPanel or via an action in the LogbookEntry admin pages. 2009-06-18 06:53:52 +01:00
substantialnoninfringinguser
0306723c95 [svn] Whoops, forgot to add the file in last revision. 2009-06-14 04:36:19 +01:00
substantialnoninfringinguser
af9743026e [svn] Added beginnings of subcaves parser. This required importing more information from cavetab, namely the location where the main cave page appeared on the old expo website. 2009-06-14 04:33:19 +01:00
substantialnoninfringinguser
9b44731c33 [svn] * Fix bugs that were causing broken wikilinks. *Add edit link to mugshots. *make admin url trailing-slash tolerant 2009-06-12 05:39:30 +01:00
substantialnoninfringinguser
5946e159bc [svn] Just realized it makes no sense to have qms ticked off by a logbook entry as an inline. Instead, we need some kind of drop down list where ticked off qms can be searched for and selected. Should be doable. 2009-06-11 06:37:07 +01:00
substantialnoninfringinguser
327ea9cacf [svn] Edited wiki page through web user interface. 2009-06-11 06:35:18 +01:00
substantialnoninfringinguser
6d6991e266 [svn] Added detection of noinfo in cave parser. It sets the non_public flag to true, and the view then shows nonpublic.html instead of the cave if the user isn't logged in. 2009-06-10 17:47:05 +01:00
substantialnoninfringinguser
e4ea57932e [svn] Whoops, forgot the template during last commit. 2009-06-10 06:37:38 +01:00
substantialnoninfringinguser
484a17d496 [svn] * Added non-public field for protecting copyright info etc. Field is on all models but needs to be checked for in views. So far, only the cave view checks.
* Added the Person wiki syntax which looks like [[person:John Doe]]
2009-06-10 06:34:50 +01:00
substantialnoninfringinguser
1d421b2d7c [svn] Fixed a bug with QMs with numbers between 1 and 10, and fixed the links in the recent changes box. 2009-06-10 05:37:53 +01:00
substantialnoninfringinguser
4ce282b88b [svn] Created wiki page through web user interface. 2009-06-10 00:22:29 +01:00
substantialnoninfringinguser
85ada36973 [svn] * Added admin inlines for QMs in LogbookEntry model
* Added QM list edit view
* Fixed "recent changes" box on front page
2009-06-10 00:05:02 +01:00
substantialnoninfringinguser
a3e42d3b19 [svn] 2009-06-09 23:13:11 +01:00
goatchurch
542f55d43e [svn] backup settings 2009-06-09 19:52:32 +01:00
goatchurch
d87f221a2b [svn] fix the revert and css 2009-06-09 19:15:31 +01:00
goatchurch
6237a19d17 [svn] the ajax page 2009-06-09 19:13:48 +01:00
goatchurch
17175637dc [svn] codemirror 2009-06-09 18:59:54 +01:00
substantialnoninfringinguser
32b5c7fbb0 [svn] fix logfile setting 2009-06-09 18:20:55 +01:00
substantialnoninfringinguser
ef47d092e6 [svn] Edited wiki page through web user interface. 2009-06-09 02:29:21 +01:00
substantialnoninfringinguser
8648c85b67 [svn] Edited wiki page through web user interface. 2009-06-09 02:21:30 +01:00
substantialnoninfringinguser
657c37d45c [svn] Created wiki page through web user interface. Lost the goddamn thing twice now due to browser crash and stupid back button so it's not done but I'm saving it anyway! 2009-06-09 02:06:13 +01:00
substantialnoninfringinguser
006becf6ca [svn] Removed redundant fields "date" and "place" from Persontrip model. A PersonTrip's date and place are stored in its parent LogbookEntry. PersonTrips are the people who participate in the trip in a LogbookEntry, so it would make no sense to have different dates and places from the LogbookEntry they are foreignkeyed to. 2009-06-09 00:29:00 +01:00
substantialnoninfringinguser
012d948193 [svn] Rewrote get_absolute_url methods of models to use urlparse.urljoin instead of just +ing the urls together. This fixes problems with double slashes. 2009-06-08 20:16:18 +01:00
pjrharley
a048adcdac [svn] A few registration updates
-display an error for nonmatching passwords
-display an error for short passwords
-dont direct people to http://http://sitename....
2009-05-30 16:17:19 +01:00
substantialnoninfringinguser
b091e8eb09 [svn] Have control panel display an error for logged in, non-superuser users. 2009-05-24 23:24:59 +01:00
pjrharley
14b39d906c [svn] Use the django compatibility thing - webserver might have old python on it.... 2009-05-23 21:13:53 +01:00
substantialnoninfringinguser
0508ba299c [svn] Fix mistakes in export admin actions. The python serializer only works on simple objects (lists, dicts etc) and not model instances so nix that part. 2009-05-23 20:46:10 +01:00
substantialnoninfringinguser
02db5a9170 [svn] Re-enable JSON and XML export actions in admin pages now that troggle is using latest SVN version of Django. 2009-05-23 20:37:42 +01:00
substantialnoninfringinguser
93a68ff43e [svn] Fix broken admin link. 2009-05-23 20:06:05 +01:00
substantialnoninfringinguser
97e423ba86 [svn] fix imports 2009-05-23 16:51:21 +01:00
substantialnoninfringinguser
3033f1eecd [svn] Created wiki page through web user interface. 2009-05-22 22:38:41 +01:00
pjrharley
f4405a16f1 [svn] Dont say activation failed if it didn't\! 2009-05-22 21:02:48 +01:00
pjrharley
025b743070 [svn] Accidentally commited another change... so might as well add the template to go with it. Send activation email as text and html so the link is clickable 2009-05-22 21:02:24 +01:00
pjrharley
e27f5565cb [svn] Use hashlib rather than deprecated sha 2009-05-22 20:59:03 +01:00
substantialnoninfringinguser
7fe5cd6ede [svn] Edited wiki page through web user interface. 2009-05-22 08:17:17 +01:00
substantialnoninfringinguser
7052355596 [svn] Edited wiki page through web user interface. 2009-05-22 07:59:37 +01:00
substantialnoninfringinguser
1e6d1a9f2f [svn] Created wiki page through web user interface. 2009-05-22 07:58:58 +01:00
substantialnoninfringinguser
a776c6ba13 [svn] Created wiki page through web user interface. 2009-05-22 07:47:11 +01:00
substantialnoninfringinguser
75f782ab71 [svn] more survey binder updates 2009-05-22 06:49:13 +01:00
substantialnoninfringinguser
832f56a6d0 [svn] fix wrongly named template tags 2009-05-22 06:43:25 +01:00
substantialnoninfringinguser
f6d3a7c84e [svn] switched from dodgy manually writing to logfile to using python's logging module, which seems great 2009-05-22 06:17:24 +01:00
substantialnoninfringinguser
7769a35f07 [svn] - Remove feature (admin JSON / XML downloads) which won't work until we have django 1.1 installed (works on my SVN version, but not on seagrass debian package version).
- Copy feincms media to project so that we don't have to serve it separately. Also useful because we may want to customize it.
2009-05-22 02:54:09 +01:00
substantialnoninfringinguser
c38dfd20a1 [svn] * Make subcave urls work.
* Add json and xml download to admin.
2009-05-22 01:50:16 +01:00
substantialnoninfringinguser
83634fe95a [svn] minor logfile mistake 2009-05-21 22:55:08 +01:00
substantialnoninfringinguser
e336e9c770 [svn] allow the recreate tables thing on control panel to work 2009-05-21 20:46:24 +01:00
substantialnoninfringinguser
3ac1169aa7 [svn] fix minor logfile error 2009-05-21 20:24:21 +01:00
substantialnoninfringinguser
3d8a6fb55a [svn] 2009-05-21 20:17:07 +01:00
substantialnoninfringinguser
891b3abb44 [svn] Updates to allow subcave tree with nice admin. 2009-05-21 19:47:19 +01:00
substantialnoninfringinguser
01b0980c44 [svn] forgot to add earlier 2009-05-20 03:28:48 +01:00
substantialnoninfringinguser
2c2f11be39 [svn] 2009-05-19 06:32:42 +01:00
substantialnoninfringinguser
d71078d03d [svn] 2009-05-18 04:30:26 +01:00
substantialnoninfringinguser
12009e36df [svn] Turn main menu into dropdown (well actually, drop up) menu. 2009-05-18 04:25:42 +01:00
substantialnoninfringinguser
21c39f70de [svn] - Make control panel downloads (qm.csv for each cave, CAVETAB2.CSV) work.
- Fix problems in QM parsing script
2009-05-17 04:31:23 +01:00
substantialnoninfringinguser
7566faf77b [svn] Make the workaround to avoid parsing interlaced pngs actually work (see issue # 14) 2009-05-15 03:56:11 +01:00
substantialnoninfringinguser
f27d5988f0 [svn] semi ugly hack... 2009-05-15 03:38:11 +01:00
substantialnoninfringinguser
d8a215a575 [svn] Add: new generic object list template object_list.html, and convenience filter named "link" for making links from objects, and make expeditions list page using those two. Also, fixed survey parsing in databaseReset.py 2009-05-15 03:29:19 +01:00
substantialnoninfringinguser
118d132797 [svn] Forgot to upload with earlier commit 2009-05-14 14:24:46 +01:00
substantialnoninfringinguser
06487e5534 [svn] localsettings should override settings, so the import should be at the bottom of the file, unless someone has a better way of doing this 2009-05-14 06:39:36 +01:00
substantialnoninfringinguser
c0b73d4777 [svn] 2009-05-14 06:32:58 +01:00
substantialnoninfringinguser
e9e755b517 [svn] Fixed broken buttons on controlpanel, added CAVETAB2.CSV export and download buttons and made them work too.
Changed ordering on PersonExpeditions so that it is based on their expedition. That way, even if we don't have date info on when a user was on expo exactly, pages like personindex work correctly.
2009-05-14 06:19:46 +01:00
substantialnoninfringinguser
191619e6d8 [svn] Add link to google code issue tracker 2009-05-13 07:01:45 +01:00
substantialnoninfringinguser
0f64e786b5 [svn] Made the subcaves work! Now we just have to figure out how to parse them...
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8343 by cucc @ 5/11/2009 6:36 AM
2009-05-13 06:28:36 +01:00
substantialnoninfringinguser
7164296c9d [svn]
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8342 by cucc @ 5/11/2009 3:23 AM
2009-05-13 06:27:45 +01:00
substantialnoninfringinguser
787445c071 [svn]
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8341 by cucc @ 5/11/2009 3:21 AM
2009-05-13 06:27:00 +01:00
substantialnoninfringinguser
d9d119c0c9 [svn] django-evolution is optional so shouldn't be in main settings
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8340 by cucc @ 5/11/2009 3:18 AM
2009-05-13 06:26:17 +01:00
substantialnoninfringinguser
c45eb31e8f [svn] Switch from photologue to imagekit. Less bloat.
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8338 by cucc @ 5/11/2009 3:08 AM
2009-05-13 06:24:52 +01:00
substantialnoninfringinguser
b31d022c1a [svn] Dynamic thumbnail generation for photos and survey scans using imagekit, further improving registration system, other misc.
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8336 by cucc @ 5/10/2009 11:05 PM
2009-05-13 06:23:57 +01:00
substantialnoninfringinguser
919c7e932a [svn] Fixes to deal with reorganization of expo surveys repository. Now that survey scans and Surveys.csv are in different directories, we have two settings variables, settings.SURVEYS for the root of the survey repo, and settings.SURVEY_SCANS for the surveyscans directory.
Fixed tab / indent muck in surveys parser. Commented out some "file abstraction" stuff for the time being.
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8335 by cucc @ 5/10/2009 7:26 AM
2009-05-13 06:22:53 +01:00
substantialnoninfringinguser
9489fe56d9 [svn] Improve registration system.
Add jquery fade effects and quick search.
Copied from http://cucc@cucc.survex.com/svn/trunk/expoweb/troggle/, rev. 8334 by cucc @ 5/10/2009 5:23 AM
2009-05-13 06:22:07 +01:00
476 changed files with 60772 additions and 12688 deletions

74
.gitignore vendored Normal file

@@ -0,0 +1,74 @@
# use glob syntax
syntax: glob
*.orig
*.pyc
*.sql
*.sqlite
*.prof
*~
.idea/*
.swp
.vscode/*
_1623.3d
_1623.err
_1623.pos
_1623.svx
_16230.svx
cave-lookup.json
core/migrations/*
db*
desktop.ini
diffsettings.txt
ignored-files.log
import_profile.json
lines-of-python.txt
lines-of-templates.txt
loadlogbk.log
loadsurvexblks.log
logbktrips.shelve
memdump.sql
my_project.dot
parsing_log.txt
svxblks.log
svxlinear.log
troggle
troggle-inspectdb.py
troggle-sqlite.sql
troggle.log
troggle.sqlite
troggle.sqlite-journal
troggle_log.txt
tunnel-import.log
logbktrips.shelve.db
credentials.py
localsettings.py
localsettings-expo-live.py
_deploy/old/localsettings-expo-live.py
_deploy/old/localsettings.py
debian/localsettings.py
debian/credentials.py
wsl/localsettings.py
wsl/credentials.py
media/jslib/*
!media/jslib/readme.txt
_test_response.html
_deploy/wsl/localsettingsWSL.py.bak
therionrefs.log
_1623-and-1626.svx
_1623-and-1626-no-schoenberg-hs.svx
localsettings-oldMuscogee.py
troggle.sqlite-journal - Shortcut.lnk
troggle.sqlite - Shortcut.lnk
_deploy/debian/localsettings-jan.py
_deploy/debian/localsettings-nw.py
py310d32
_deploy/debian/localsettingsserver2023-01-secret.py
_deploy/debian/localsettings2023-04-05-secret.py
pydebianbullseye
javascript


@@ -1,16 +0,0 @@
# use glob syntax
syntax: glob
*.pyc
db*
localsettings.py
*~
parsing_log.txt
troggle
troggle_log.txt
.idea/*
*.orig
media/images/*
.vscode/*
.swp
imagekit-off/


@@ -1,46 +1,214 @@
Troggle is an application for caving expedition data management, originally created for use on Cambridge University Caving Club (CUCC) expeditions and licensed under the GNU Lesser General Public License.
Updated 2 May 2023
Troggle has been forked into two projects. The original one is maintained by Aaron Curtis and is used for Erebus caves. The CUCC variant uses files as the definitive data, not the database, and lives at expo.survex.com/troggle.
Troggle is an application for caving expedition data management,
originally created for use on Cambridge University Caving Club (CUCC) expeditions
and licensed under the GNU Lesser General Public License.
Troggle has been forked into two projects. The original one is maintained by Aaron Curtis
and was used for Erebus caves in Antarctica.
The CUCC variant uses files as the definitive data, not the database, and lives at http://expo.survex.com/repositories/troggle/.git/
For the server setup, see /_deploy/debian/wookey-exposerver-recipe.txt
and see http://expo.survex.com/handbook/troggle/serverconfig.html
Much material which was in this file has been moved to
http://expo.survex.com/handbook/troggle/serverconfig.html
See copyright notices in
http://expo.survex.com/handbook/computing/contribute.html
and for context see
http://expo.survex.com/handbook/computing/onlinesystems.html
Troggle setup
=============
0. read the very extensive online documentation and stop reading this README...
well, come back to this README after you have read the HTML pages. Not everything has been transferred.
Python, Django, and Database setup
http://expo.survex.com/handbook/troggle/troglaptop.html
http://expo.survex.com/handbook/troggle/serverconfig.html
http://expo.survex.com/handbook/troggle/trogdangoup.html
and at troggle/debian/serversetup
1. set up the ssh key-exchange with the git server so you can clone troggle
http://expo.survex.com/handbook/computing/keyexchange.html
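In outline (a sketch only - follow the keyexchange page above for the real procedure; the email address and server account here are just examples):
$ ssh-keygen -t ed25519 -C "you@example.com"   # create a key pair if you do not already have one
$ cat ~/.ssh/id_ed25519.pub                    # this is the public key to send to the expo admins
$ ssh expo@expo.survex.com                     # once the key is installed, check that login works before cloning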
Setting up directories
----------------------
see http://expo.survex.com/handbook/troggle/troglaptop.html and
http://expo.survex.com/handbook/troggle/serverconfig.html
Next, you need to fill in your local settings. Copy _deploy/WSL/localsettingsWSL.py
to a new file called localsettings.py and edit it and settings.py to match
your machine's file locations.
Follow the instructions contained in the file to fill out your settings.
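A minimal sketch of that step (paths as given above; pick a different variant for a plain debian or ubuntu machine):
$ cd ~/troggle
$ cp _deploy/WSL/localsettingsWSL.py localsettings.py
$ editor localsettings.py    # set the directory locations and passwords for your machine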
{ in _deploy/old/ we have these which are all very out of date:
localsettings-expo-live.py is the python2.7 settings for the server.
localsettingsubuntu.py
localsettingsdocker.py
localsettingswindows.py
localsettingspotatohut.py
}
Python3, Django, and Database setup
-----------------------------------
Troggle requires Django 1.4 or greater, and any version of Python that works with it.
We are now using Django 3.2 and will move to 4.2 in 2024.
We are installing with python 3.11 (the server is running 3.9).
Do not install Django with apt ("apt-get install python-django" on debian/ubuntu);
install it using pip, on your test system, in a venv.
Conventionally on our main master expo server we install everything that we can as debian packages, not using pip.
If you want to use MySQL or PostgreSQL, download and install them. However, you can also use Django with SQLite3, which is included in Python and thus requires no extra installation.
[installation instructions removed - now in http://expo.survex.com/handbook/troggle/troglaptop.html ]
[venv description removed - read it in http://expo.survex.com/handbook/troggle/troglaptop.html ]
READ the os-trog.sh script !
READ the venv-trog.sh script !
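In outline, those scripts end up doing something like this (a sketch only; the venv name and paths are illustrative - the scripts are the real reference):
$ python3 -m venv ~/p9d4
$ source ~/p9d4/bin/activate
$ pip install --upgrade pip setuptools
$ pip install "Django==3.2.*"                  # the version we are currently running
$ pip install -r ~/troggle/requirements.txt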
Automatic Provisioning and Configuration
----------------------------------------
We don't do this - yet.
The most appropriate configuration tools today (2021) appear to be Bolt or Ansible
https://puppet.com/docs/bolt/latest/bolt.html (declarative, local)
https://docs.ansible.com/ansible/latest/user_guide/intro_getting_started.html (procedural, remote)
https://puppet.com/blog/automating-from-zero-to-something/
We don't need anything for the deploy server itself, but we could do with something for setting
up test servers quickly to help get newbie developers up to speed faster. But learning a new tool
creates a barrier in itself. This is one reason most of us don't use Docker.
Troggle itself
-------------
Choose a directory where you will keep troggle, and svn check out Troggle into it using the following command:
svn co http://troggle.googlecode.com/svn/
CSS and media files
-------------------
We are not using the STATICFILES capability.
We are serving css files from troggle/media/.. (see urls.py)
Plain CSS pages
---------------
When running the test server
manage.py runserver 0.0.0.0:8000
and without Apache running, we are serving CSS using this Django 'view':
view_surveys.cssfilessingle
i.e.
cssfilessingle() in core/view_surveys.py
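A quick way to check that this route is serving CSS (the css path below is only an example under MEDIA_URL):
$ python manage.py runserver 0.0.0.0:8000                   # in one terminal, from the troggle directory
$ curl -I http://localhost:8000/site_media/css/main2.css    # in another; expect an HTTP 200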
Setting up survex
-----------------
You need to have survex installed, as the command line tool 'cavern' is
used as part of the survex import process.
$ sudo apt install survex
Setting up tables and importing survey data
-------------------------------------------
Running
$ sudo python databaseReset.py
from the troggle directory will give you instructions.
[ NB Adding a new year/expedition requires adding a column to the
folk/folk.csv table - a year doesn't exist until that is done.]
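For example (a sketch - run it with no arguments first to see the options):
$ cd ~/troggle
$ sudo python databaseReset.py          # with no arguments it prints the available import steps
$ sudo python databaseReset.py reset    # wipe and re-import everything (see 'legacy data' below)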
If you want to work on the source code and be able to commit, you will need to use https instead of http, and your google account will need to be added to the troggle project members list. Contact aaron dot curtis at cantab dot net to get this set up.
MariaDB database
----------------
Start it up with
$ sudo mysql -u -p
which will prompt you to type in the password. Get this by reading the settings.py file in use on the server.
then
> CREATE DATABASE troggle;
> use troggle;
> exit;
Note the semicolons.
Next, you need to fill in your local settings. Copy either localsettingsubuntu.py or localsettingsserver.py to a new file called localsettings.py. Follow the instructions contained in the file to fill out your settings.
You can check the status of the db service:
$ sudo systemctl status mysql
You can start and stop the db service with
$ sudo systemctl restart mysql.service
$ sudo systemctl stop mysql.service
$ sudo systemctl start mysql.service
While logged in at a terminal session as expo on expo.survex.com
$ mysql -h localhost -u expo -p<password>
will get you the MariaDB command prompt: https://www.hostwinds.com/guide/how-to-use-mysql-mariadb-from-command-line/
then (Note the SEMICOLONS !):
>drop database troggle;
>create database troggle;
>quit
Somewhere I have notes for the GRANT PRIVS type runes...
Ah yes:
CREATE DATABASE troggle;
GRANT ALL PRIVILEGES ON troggle.* TO 'expo'@'localhost' IDENTIFIED BY 'somepassword'; FLUSH PRIVILEGES; (at mysql root prompt)
(explained on https://chartio.com/resources/tutorials/how-to-grant-all-privileges-on-a-database-in-mysql/)
(but you need to create the database too)
The GRANT ALL PRIVILEGES bit requires you to log on to MariaDB as root; sudo doesn't cut it.
These permissions are set in a different 'info' database which usually is untouched even if the troggle database gets creamed.
The 'somepassword' is specified in the localsettings.py file.
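Putting those runes together in one place (the password is a placeholder - use the one from the localsettings.py in service):
$ mysql -u root -p                 # log on as the MariaDB root user; sudo alone doesn't cut it
> CREATE DATABASE troggle;
> GRANT ALL PRIVILEGES ON troggle.* TO 'expo'@'localhost' IDENTIFIED BY 'somepassword';
> FLUSH PRIVILEGES;
> quit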
Setting up tables and importing legacy data
------------------------------------------
Run "python databaseReset.py reset" from the troggle directory.
Once troggle is running, you can also log in and then go to "Import / export data" under "admin" on the menu.
Adding a new year/expedition requires adding a column to the
noinfo/folk.csv table - a year doesn't exist until that is done.
PERMISSIONS
https://linuxize.com/post/usermod-command-in-linux/
THIS MAY BE OUT OF DATE - from 2022 we are running Apache as user 'expo', not 'www-data',
so that the online editing system for SVX files works.
The same goes for /expoweb/ files, so that "edit this page" works and the New Cave
and New Entrance forms work.
sudo usermod -a -G expocvs expo
The expocvs group is used for git; all the users should be in this group.
Running a Troggle server
------------------------
For high volume use, Troggle should be run using a web server like apache. However, a quick way to get started is to use the development server built into Django. To do this, run "python manage.py runserver" from the troggle directory.
Running a Troggle server with Apache
------------------------------------
Troggle also needs these aliases to be configured. These are set in
/home/expo/config/apache/expo.conf
on the expo server.
At least these need setting:
DocumentRoot /home/expo/expoweb
WSGIScriptAlias / /home/expo/troggle/wsgi.py
<Directory /home/expo/troggle>
<Files wsgi.py>
Require all granted
</Files>
</Directory>
the instructions for apache Alias commands are in comments at the end of
the urls.py file.
Unlike the django "manage.py runserver" method, apache requires a restart before it will use
any changed files:
sudo service apache2 restart
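After editing expo.conf it is worth checking that it still parses before restarting (a sketch; apachectl is the same tool the systemd unit calls, see Olly's notes below):
$ sudo apachectl configtest        # syntax-check the edited configuration
$ sudo service apache2 restart     # apache only picks up the changes on restart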
Olly's comments 20 July 2020:
olly: looking at /lib/systemd/system/apache2.service suggests so
olly: ExecStart=/usr/sbin/apachectl start
olly: ExecStop=/usr/sbin/apachectl stop
olly: ExecReload=/usr/sbin/apachectl graceful
Additions
---------
The python code has been manually cleaned using the 'black' and 'ruff' lint tools,
and the 'deptry' dependency checker. This needs doing every year or so.
See dependencies-check-deptry.txt
See troggle/pyproject.toml for configurations
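A sketch of that periodic clean-up, assuming the tools are pip-installed in the venv (they are not runtime requirements):
$ pip install black ruff deptry
$ black .            # reformat in place, using the settings in troggle/pyproject.toml
$ ruff check .       # lint, configured from the same pyproject.toml
$ deptry .           # report unused, missing and transitive dependencies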
Experimental additions
----------------------
These are untried tools which may help us document how troggle works in future.
pip install pygraphviz
pip install pyparsing pydot # installs fine
django extension graph_models # https://django-extensions.readthedocs.io/en/latest/graph_models.html
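A sketch of what that might look like (assuming django-extensions is pip-installed and added to INSTALLED_APPS as "django_extensions"):
$ pip install django-extensions pygraphviz
$ python manage.py graph_models -a -o troggle-models.png    # draw every app's models into one image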

27
README/index.html Normal file

@@ -0,0 +1,27 @@
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<title>Troggle - Coding Documentation</title>
<link rel="stylesheet" type="text/css" href="../media/css/main2.css" />
</head>
<body>
<h1>Troggle Code - README</h1>
<h2>Contents of README.txt file</h2>
<iframe name="erriframe" width="70%" height="500"
src="../README.txt" frameborder="1" ></iframe>
<h2>Troggle documentation in the Expo Handbook</h2>
<ul>
<li><a href="http://expo.survex.com/handbook/troggle/trogintro.html">Intro</a>
<li><a href="http://expo.survex.com/handbook/troggle/trogindex.html">Troggle manual INDEX</a>
<li><a href="http://expo.survex.com/handbook/troggle/trogarch.html">Troggle data model</a>
<li><a href="http://expo.survex.com/handbook/troggle/trogimport.html">Troggle importing data</a>
<li><a href="http://expo.survex.com/handbook/troggle/trogdesign.html">Troggle design decisions</a>
<li><a href="http://expo.survex.com/handbook/troggle/trogdesignx.html">Troggle future architectures</a>
<li><a href="http://expo.survex.com/handbook/troggle/trogsimpler.html">a kinder simpler Troggle?</a>
</ul>
<hr />
</body></html>

BIN
README/troggle2020.docx Normal file

Binary file not shown.

BIN
README/troggle2020.odt Normal file

Binary file not shown.

BIN
README/troggle2020.pdf Normal file

Binary file not shown.


@@ -0,0 +1,160 @@
import os
import sys
import urllib.parse
from pathlib import Path
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
System for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
NOTE this file is vastly out of sync with troggle/_deploy/wsl/localsettings.py
which is the most recent version used in active maintenance. There should be
essential differences, but there are many, many non-essential differences which
should be eliminated for clarity and to use modern idioms. 8 March 2023.
"""
print(" * importing troggle/localsettings.py")
# DO NOT check this file into the git repo - it contains real passwords.
EXPOFILESREMOTE = False # if True, then re-routes urls in expofiles to remote server
#SECURE_SSL_REDIRECT = True # breaks 7 tests in test suite 301 not 200 (or 302) and runserver fails completely
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle', # Or path to database file if using sqlite3.
'USER' : 'expo', # Not used with sqlite3.
'PASSWORD' : '123456789012345', # Not used with sqlite3. Not a real password.
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
EXPOUSER = 'expo'
EXPOUSERPASS = 'Not a real password'
EXPOADMINUSER = 'expoadmin'
EXPOADMINUSERPASS = 'Not a real password'
EXPOUSER_EMAIL = 'wookey@wookware.org'
EXPOADMINUSER_EMAIL = 'wookey@wookware.org'
REPOS_ROOT_PATH = '/home/expo/'
sys.path.append(REPOS_ROOT_PATH)
sys.path.append(REPOS_ROOT_PATH + 'troggle')
# Define the path to the django app (troggle in this case)
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
PHOTOS_YEAR = "2023"
# add in 358 when they don't make it crash horribly
NOTABLECAVESHREFS = [ "290", "291", "359", "264", "258", "204", "76", "107"]
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [
PYTHON_PATH + "templates"
],
'OPTIONS': {
'debug': 'DEBUG',
'context_processors': [
# django.template.context_processors.csrf, # is always enabled and cannot be removed, sets csrf_token
'django.contrib.auth.context_processors.auth', # knowledge of logged-on user & permissions
'core.context.troggle_context', # in core/troggle.py
'django.template.context_processors.debug',
#'django.template.context_processors.request', # copy of current request, added in trying to make csrf work
'django.template.context_processors.i18n',
'django.template.context_processors.media', # includes a variable MEDIA_URL
'django.template.context_processors.static', # includes a variable STATIC_URL
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
],
'loaders': [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader', #For each app, inc admin, in INSTALLED_APPS, loader looks for /templates
# insert your own TEMPLATE_LOADERS here
]
},
},
]
PUBLIC_SITE = True
# This should be False for normal running
DEBUG = False
CACHEDPAGES = True # experimental page cache for a handful of page types
# executables:
CAVERN = 'cavern' # for parsing .svx files and producing .3d files
SURVEXPORT = 'survexport' # for parsing .3d files and producing .pos files
PV = "python" + str(sys.version_info.major) + "." + str(sys.version_info.minor)
LIBDIR = Path(REPOS_ROOT_PATH) / 'lib' / PV
EXPOWEB = Path(REPOS_ROOT_PATH + 'expoweb/')
SURVEYS = REPOS_ROOT_PATH
SURVEY_SCANS = REPOS_ROOT_PATH + 'expofiles/surveyscans/'
FILES = REPOS_ROOT_PATH + 'expofiles'
PHOTOS_ROOT = REPOS_ROOT_PATH + 'expofiles/photos/'
TROGGLE_PATH = Path(__file__).parent
TEMPLATE_PATH = TROGGLE_PATH / 'templates'
MEDIA_ROOT = TROGGLE_PATH / 'media'
JSLIB_ROOT = TROGGLE_PATH / 'media' / 'jslib' # used for CaveViewer JS utility
CAVEDESCRIPTIONS = EXPOWEB / "cave_data"
ENTRANCEDESCRIPTIONS = EXPOWEB / "entrance_data"
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
#URL_ROOT = 'http://expo.survex.com/'
URL_ROOT = '/'
DIR_ROOT = Path("") #this should end in / if a value is given
EXPOWEB_URL = '/'
SURVEYS_URL = '/survey_scans/'
REPOS_ROOT_PATH = Path(REPOS_ROOT_PATH)
SURVEX_DATA = REPOS_ROOT_PATH / "loser"
DRAWINGS_DATA = REPOS_ROOT_PATH / "drawings"
EXPOFILES = REPOS_ROOT_PATH / "expofiles"
SCANS_ROOT = EXPOFILES / "surveyscans"
PHOTOS_ROOT = EXPOFILES / "photos"
#EXPOFILES = urllib.parse.urljoin(REPOS_ROOT_PATH, 'expofiles/')
PHOTOS_URL = urllib.parse.urljoin(URL_ROOT, '/photos/')
# MEDIA_URL is used by urls.py in a regex. See urls.py & core/views_surveys.py
MEDIA_URL = '/site_media/'
STATIC_URL = urllib.parse.urljoin(URL_ROOT , '/static/') # used by Django admin pages. Do not delete.
JSLIB_URL = urllib.parse.urljoin(URL_ROOT , '/javascript/') # always fails, try to revive it ?
# STATIC_ROOT removed after merging content into MEDIA_ROOT. See urls.py & core/views/surveys.py
#TINY_MCE_MEDIA_ROOT = STATIC_ROOT + '/tiny_mce/' # not needed while TinyMCE not installed
#TINY_MCE_MEDIA_URL = STATIC_URL + '/tiny_mce/' # not needed while TinyMCE not installed
LOGFILE = '/var/log/troggle/troggle.log'
IMPORTLOGFILE = '/var/log/troggle/import.log'
# Sanitise these to be strings as Django seems to be particularly sensitive to crashing if they aren't
STATIC_URL = str(STATIC_URL) + "/"
MEDIA_URL = str(MEDIA_URL) + "/"
print(" + finished importing troggle/localsettings.py")


@@ -0,0 +1,160 @@
import os
import sys
import urllib.parse
from pathlib import Path
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
System for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
NOTE this file is vastly out of sync with troggle/_deploy/wsl/localsettings.py
which is the most recent version used in active maintenance. There should be
essential differences, but there are many, many non-essential differences which
should be eliminated for clarity and to use modern idioms. 8 March 2023.
"""
print(" * importing troggle/localsettings.py")
# DO NOT check this file into the git repo - it contains real passwords.
EXPOFILESREMOTE = False # if True, then re-routes urls in expofiles to remote server
#SECURE_SSL_REDIRECT = True # breaks 7 tests in test suite 301 not 200 (or 302) and runserver fails completely
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle', # Or path to database file if using sqlite3.
'USER' : 'expo', # Not used with sqlite3.
'PASSWORD' : 'uFqP56B4XleeyIW', # Not used with sqlite3.
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
EXPOUSER = 'expo'
EXPOUSERPASS = '161:gosser'
EXPOADMINUSER = 'expoadmin'
EXPOADMINUSERPASS = 'gosser:161'
EXPOUSER_EMAIL = 'wookey@wookware.org'
EXPOADMINUSER_EMAIL = 'wookey@wookware.org'
REPOS_ROOT_PATH = '/home/expo/'
sys.path.append(REPOS_ROOT_PATH)
sys.path.append(REPOS_ROOT_PATH + 'troggle')
# Define the path to the django app (troggle in this case)
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
PHOTOS_YEAR = "2023"
# add in 358 when they don't make it crash horribly
NOTABLECAVESHREFS = [ "290", "291", "359", "264", "258", "204", "76", "107"]
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [
PYTHON_PATH + "templates"
],
'OPTIONS': {
'debug': 'DEBUG',
'context_processors': [
# django.template.context_processors.csrf, # is always enabled and cannot be removed, sets csrf_token
'django.contrib.auth.context_processors.auth', # knowledge of logged-on user & permissions
'core.context.troggle_context', # in core/troggle.py
'django.template.context_processors.debug',
#'django.template.context_processors.request', # copy of current request, added in trying to make csrf work
'django.template.context_processors.i18n',
'django.template.context_processors.media', # includes a variable MEDIA_URL
'django.template.context_processors.static', # includes a variable STATIC_URL
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
],
'loaders': [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader', #For each app, inc admin, in INSTALLED_APPS, loader looks for /templates
# insert your own TEMPLATE_LOADERS here
]
},
},
]
PUBLIC_SITE = True
# This should be False for normal running
DEBUG = False
CACHEDPAGES = True # experimental page cache for a handful of page types
# executables:
CAVERN = 'cavern' # for parsing .svx files and producing .3d files
SURVEXPORT = 'survexport' # for parsing .3d files and producing .pos files
PV = "python" + str(sys.version_info.major) + "." + str(sys.version_info.minor)
LIBDIR = Path(REPOS_ROOT_PATH) / 'lib' / PV
EXPOWEB = Path(REPOS_ROOT_PATH + 'expoweb/')
SURVEYS = REPOS_ROOT_PATH
SURVEY_SCANS = REPOS_ROOT_PATH + 'expofiles/surveyscans/'
FILES = REPOS_ROOT_PATH + 'expofiles'
PHOTOS_ROOT = REPOS_ROOT_PATH + 'expofiles/photos/'
TROGGLE_PATH = Path(__file__).parent
TEMPLATE_PATH = TROGGLE_PATH / 'templates'
MEDIA_ROOT = TROGGLE_PATH / 'media'
JSLIB_ROOT = TROGGLE_PATH / 'media' / 'jslib' # used for CaveViewer JS utility
CAVEDESCRIPTIONS = EXPOWEB / "cave_data"
ENTRANCEDESCRIPTIONS = EXPOWEB / "entrance_data"
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
#URL_ROOT = 'http://expo.survex.com/'
URL_ROOT = '/'
DIR_ROOT = Path("") #this should end in / if a value is given
EXPOWEB_URL = '/'
SURVEYS_URL = '/survey_scans/'
REPOS_ROOT_PATH = Path(REPOS_ROOT_PATH)
SURVEX_DATA = REPOS_ROOT_PATH / "loser"
DRAWINGS_DATA = REPOS_ROOT_PATH / "drawings"
EXPOFILES = REPOS_ROOT_PATH / "expofiles"
SCANS_ROOT = EXPOFILES / "surveyscans"
PHOTOS_ROOT = EXPOFILES / "photos"
#EXPOFILES = urllib.parse.urljoin(REPOS_ROOT_PATH, 'expofiles/')
PHOTOS_URL = urllib.parse.urljoin(URL_ROOT, '/photos/')
# MEDIA_URL is used by urls.py in a regex. See urls.py & core/views_surveys.py
MEDIA_URL = '/site_media/'
STATIC_URL = urllib.parse.urljoin(URL_ROOT , '/static/') # used by Django admin pages. Do not delete.
JSLIB_URL = urllib.parse.urljoin(URL_ROOT , '/javascript/') # always fails, try to revive it ?
# STATIC_ROOT removed after merging content into MEDIA_ROOT. See urls.py & core/views/surveys.py
#TINY_MCE_MEDIA_ROOT = STATIC_ROOT + '/tiny_mce/' # not needed while TinyMCE not installed
#TINY_MCE_MEDIA_URL = STATIC_URL + '/tiny_mce/' # not needed while TinyMCE not installed
LOGFILE = '/var/log/troggle/troggle.log'
IMPORTLOGFILE = '/var/log/troggle/import.log'
# Sanitise these to be strings as Django seems to be particularly sensitive to crashing if they aren't
STATIC_URL = str(STATIC_URL) + "/"
MEDIA_URL = str(MEDIA_URL) + "/"
print(" + finished importing troggle/localsettings.py")


@@ -0,0 +1,164 @@
import os
import sys
import urllib.parse
from pathlib import Path
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
System for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem, or /javascript/ can be in
a system-wide location rather than just a local directory.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
Read https://realpython.com/python-pathlib/
Read https://adamj.eu/tech/2020/03/16/use-pathlib-in-your-django-project/
"""
print(" * importing troggle/localsettings.py")
# DO NOT check this file into the git repo - it contains real passwords.
EXPOFILESREMOTE = False # if True, then re-routes urls in expofiles to remote server
#SECURE_SSL_REDIRECT = True # breaks 7 tests in test suite 301 not 200 (or 302) and runserver fails completely
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle', # Or path to database file if using sqlite3.
'USER' : 'expo', # Not used with sqlite3.
'PASSWORD' : '123456789012345', # Not used with sqlite3.Not the real password
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
EXPOUSER = 'expo'
EXPOADMINUSER = 'expoadmin'
EXPOUSER_EMAIL = 'wookey@wookware.org'
EXPOADMINUSER_EMAIL = 'wookey@wookware.org'
SECRET_KEY = "zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz"
EXPOUSERPASS = "nope"
EXPOADMINUSERPASS = "nope"
EMAIL_HOST_PASSWORD = "nope"
REPOS_ROOT_PATH = '/home/expo/'
sys.path.append(REPOS_ROOT_PATH)
sys.path.append(REPOS_ROOT_PATH + 'troggle')
# Define the path to the django app (troggle in this case)
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
PHOTOS_YEAR = "2022"
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [
PYTHON_PATH + "templates"
],
'OPTIONS': {
'debug': 'DEBUG',
'context_processors': [
# django.template.context_processors.csrf, # is always enabled and cannot be removed, sets csrf_token
'django.contrib.auth.context_processors.auth', # knowledge of logged-on user & permissions
'core.context.troggle_context', # in core/troggle.py
'django.template.context_processors.debug',
#'django.template.context_processors.request', # copy of current request, added in trying to make csrf work
'django.template.context_processors.i18n',
'django.template.context_processors.media', # includes a variable MEDIA_URL
'django.template.context_processors.static', # includes a variable STATIC_URL
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
],
'loaders': [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader', #For each app, inc admin, in INSTALLED_APPS, loader looks for /templates
# insert your own TEMPLATE_LOADERS here
]
},
},
]
PUBLIC_SITE = True
# This should be False for normal running
DEBUG = True
CACHEDPAGES = True # experimental page cache for a handful of page types
SURVEX_DATA = REPOS_ROOT_PATH + 'loser/'
DRAWINGS_DATA = REPOS_ROOT_PATH + 'drawings/'
# executables:
CAVERN = 'cavern' # for parsing .svx files and producing .3d files
SURVEXPORT = 'survexport' # for parsing .3d files and producing .pos files
EXPOWEB = REPOS_ROOT_PATH + 'expoweb/'
#SURVEYS = REPOS_ROOT_PATH
SCANS_ROOT = REPOS_ROOT_PATH + 'expofiles/surveyscans/'
FILES = REPOS_ROOT_PATH + 'expofiles'
PHOTOS_ROOT = REPOS_ROOT_PATH + 'expofiles/photos/'
TROGGLE_PATH = Path(__file__).parent
TEMPLATE_PATH = TROGGLE_PATH / 'templates'
MEDIA_ROOT = TROGGLE_PATH / 'media'
JSLIB_ROOT = TROGGLE_PATH / 'media' / 'jslib' # used for CaveViewer JS utility
CAVEDESCRIPTIONS = os.path.join(EXPOWEB, "cave_data")
ENTRANCEDESCRIPTIONS = os.path.join(EXPOWEB, "entrance_data")
# CACHEDIR = REPOS_ROOT_PATH + 'expowebcache/'
# THREEDCACHEDIR = CACHEDIR + '3d/'
# THUMBNAILCACHE = CACHEDIR + 'thumbs'
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
PV = "python" + str(sys.version_info.major) + "." + str(sys.version_info.minor)
LIBDIR = Path(REPOS_ROOT_PATH) / 'lib' / PV
#Note that all these *_URL constants are not actually used in urls.py, they should be..
#URL_ROOT = 'http://expo.survex.com/'
URL_ROOT = '/'
DIR_ROOT = ''#this should end in / if a value is given
EXPOWEB_URL = '/'
SCANS_URL = '/survey_scans/'
EXPOFILES = urllib.parse.urljoin(REPOS_ROOT_PATH, 'expofiles/')
PHOTOS_URL = urllib.parse.urljoin(URL_ROOT, '/photos/')
# MEDIA_URL is used by urls.py in a regex. See urls.py & core/views_surveys.py
MEDIA_URL = '/site_media/'
STATIC_URL = urllib.parse.urljoin(URL_ROOT , '/static/') # used by Django admin pages. Do not delete.
JSLIB_URL = urllib.parse.urljoin(URL_ROOT , '/javascript/') # always fails, try to revive it ?
#TINY_MCE_MEDIA_ROOT = STATIC_ROOT + '/tiny_mce/' # not needed while TinyMCE not installed
#TINY_MCE_MEDIA_URL = STATIC_URL + '/tiny_mce/' # not needed while TinyMCE not installed
LOGFILE = '/var/log/troggle/troggle.log'
IMPORTLOGFILE = '/var/log/troggle/import.log'
# add in 358 when they don't make it crash horribly
NOTABLECAVESHREFS = [ "290", "291", "359", "264", "258", "204", "76", "107"]
# Sanitise these to be strings as all other code is expecting strings
# and we have not made the change to pathlib Path type in the other localsettings-* variants yet.
CAVEDESCRIPTIONS = os.fspath(CAVEDESCRIPTIONS)
ENTRANCEDESCRIPTIONS = os.fspath(ENTRANCEDESCRIPTIONS)
LOGFILE = os.fspath(LOGFILE)
#SURVEYS = os.fspath(SURVEYS)
EXPOWEB = os.fspath(EXPOWEB)
DRAWINGS_DATA = os.fspath(DRAWINGS_DATA)
SURVEX_DATA = os.fspath(SURVEX_DATA)
REPOS_ROOT_PATH = os.fspath(REPOS_ROOT_PATH)
TEMPLATE_PATH = os.fspath(TROGGLE_PATH)
MEDIA_ROOT = os.fspath(MEDIA_ROOT)
JSLIB_ROOT = os.fspath(JSLIB_ROOT)
SCANS_ROOT = os.fspath(SCANS_ROOT)
LIBDIR = os.fspath(LIBDIR)
print(" + finished importing troggle/localsettings.py")


@@ -0,0 +1,70 @@
#!/bin/bash
# Run this in a terminal in the troggle directory: 'bash os-trog.sh'
# On WSL, do Shift-click in the file explorer on the troggle folder to open a Linux command line
# 'Open Linux shell here'
echo 'Run this in a terminal in the troggle directory: "bash venv-trog.sh"'
cat /etc/os-release
# Expects an Ubuntu 22.04 relatively clean install.
sudo apt install python-is-python3 -y
python --version : ensure python is an alias for python3 not python2.7
sudo apt update -y
sudo apt dist-upgrade -y
sudo apt autoremove -y
sudo apt install sqlite3 -y
sudo apt install python3-pip -y
# this installs a shed-load of other stuff: binutils etc.
sudo apt install survex-aven
sudo apt install git openssh-client -y
# On a clean debian 11 (bullseye) installation with Xfce & ssh,
#on ubuntu 20.04:
#Package sftp is not available, but is referred to by another package.
#This may mean that the package is missing, has been obsoleted, or
#is only available from another source
#E: Package 'sftp' has no installation candidate
# On Ubuntu 20.04, with python10, the pip install fails.
# So you need to get the pip from source
# sudo curl -sS https://bootstrap.pypa.io/get-pip.py | python3.10
# but really you should be using 22.04
# and also, if using debian,
# sudo python3.10 -m pip install -U virtualenv
# as debian does not install everything that ubuntu does, you need:
sudo useradd expo
sudo usermod -a -G sudo expo # to put expo in sudoers group, re-login required
sudo apt install python3-venv -y
sudo apt install python3-dev -y
# default since 22.04
# sudo apt install python3.10
sudo apt install python3.10-venv -y
sudo apt install python3.10-dev -y
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.10 1
sudo apt install mariadb-server -y
sudo apt install libmariadb-dev -y
sudo python -m pip install --upgrade pip
sudo apt install sftp -y
echo '###'
echo '### NOW INSTALLING tunnel and therion, go and have a cup of tea. Or a 3-course meal.'
echo '###'
sudo apt install tunnelx therion -y
# Go to https://expo.survex.com/handbook/troggle/troglaptop.html#dbtools
# sudo service mysql start
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
echo '###'
echo '### Currently set version of python'
python --version
echo '###'
echo '### Now YOU have to configure the git settings for YOURSELF (not "expo")'


@@ -0,0 +1,147 @@
"""
Django settings for troggle project.
For more information on this file, see
https://docs.djangoproject.com/en/dev/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/dev/ref/settings/
"""
# Imports should be grouped in the following order:
# 1.Standard library imports.
# 2.Related third party imports.
# 3.Local application/library specific imports.
# 4.You should put a blank line between each group of imports.
print("* importing troggle/settings.py")
# default value, then gets overwritten by real secrets
SECRET_KEY = "not-the-real-secret-key-a#vaeozn0---^fj!355qki*vj2"
GIT = "git" # command for running git
# Note that this builds upon the django system installed
# global settings in
# django/conf/global_settings.py which is automatically loaded first.
# read https://docs.djangoproject.com/en/dev/topics/settings/
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
# BASE_DIR = os.path.dirname(os.path.dirname(__file__))
# Django settings for troggle project.
ALLOWED_HOSTS = ["*", "expo.survex.com", ".survex.com", "localhost", "127.0.0.1", "192.168.0.5"]
ADMINS = (
# ('Your Name', 'your_email@domain.com'),
)
MANAGERS = ADMINS
# LOGIN_URL = '/accounts/login/' # this is the default value so does not need to be set
# Local time zone for this installation. Choices can be found here:
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
# although not all choices may be available on all operating systems.
# If running in a Windows environment this must be set to the same as your
# system time zone.
USE_TZ = True
TIME_ZONE = "Europe/London"
# Language code for this installation. All choices can be found here:
# http://www.i18nguy.com/unicode/language-identifiers.html
LANGUAGE_CODE = "en-uk"
SITE_ID = 1
# If you set this to False, Django will make some optimizations so as not
# to load the internationalization machinery.
USE_I18N = True
USE_L10N = True
FIX_PERMISSIONS = []
# top-level survex file basename (without .svx)
SURVEX_TOPNAME = "1623-and-1626-no-schoenberg-hs"
# Caves for which survex files exist, but are not otherwise registered
# replaced (?) by expoweb/cave_data/pendingcaves.txt
# PENDING = ["1626-361", "2007-06", "2009-02",
# "2012-ns-01", "2012-ns-02", "2010-04", "2012-ns-05", "2012-ns-06",
# "2012-ns-07", "2012-ns-08", "2012-ns-12", "2012-ns-14", "2012-ns-15", "2014-bl888",
# "2018-pf-01", "2018-pf-02"]
APPEND_SLASH = (
False # never relevant because we have urls that match unknown files and produce an 'edit this page' response
)
SMART_APPEND_SLASH = True # not working as middleware is different after Dj2.0
LOGIN_REDIRECT_URL = "/" # does not seem to have any effect
SECURE_CONTENT_TYPE_NOSNIFF = True
SECURE_BROWSER_XSS_FILTER = True
# SESSION_COOKIE_SECURE = True # if enabled, cannot login to Django control panel, bug elsewhere?
# CSRF_COOKIE_SECURE = True # if enabled only sends cookies over SSL
X_FRAME_OPTIONS = "DENY" # changed to "DENY" after I eliminated all the iframes e.g. /xmlvalid.html
DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField" # from Django 3.2
INSTALLED_APPS = (
"django.contrib.admin",
"django.contrib.auth", # includes the url redirections for login, logout
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.messages",
"django.contrib.admindocs",
"django.forms", # Required to customise widget templates
# 'django.contrib.staticfiles', # We put our CSS etc explicitly in the right place so do not need this
"troggle.core",
)
FORM_RENDERER = "django.forms.renderers.TemplatesSetting" # Required to customise widget templates
# See the recommended order of these in https://docs.djangoproject.com/en/dev/ref/middleware/
# Note that this is a radically different onion architecture from earlier versions though it looks the same,
# see https://docs.djangoproject.com/en/dev/topics/http/middleware/#upgrading-pre-django-1-10-style-middleware
# Seriously, read this: https://www.webforefront.com/django/middlewaredjango.html which is MUCH BETTER than the docs
MIDDLEWARE = [
#'django.middleware.security.SecurityMiddleware', # SECURE_SSL_REDIRECT and SECURE_SSL_HOST # we don't use this
"django.middleware.gzip.GZipMiddleware", # not needed when expofiles and photos served by apache
"django.contrib.sessions.middleware.SessionMiddleware", # Manages sessions, if CSRF_USE_SESSIONS then it needs to be early
"django.middleware.common.CommonMiddleware", # DISALLOWED_USER_AGENTS, APPEND_SLASH and PREPEND_WWW
"django.middleware.csrf.CsrfViewMiddleware", # Cross Site Request Forgeries by adding hidden form fields to POST
"django.contrib.auth.middleware.AuthenticationMiddleware", # Adds the user attribute, representing the currently-logged-in user
"django.contrib.admindocs.middleware.XViewMiddleware", # this and docutils needed by admindocs
"django.contrib.messages.middleware.MessageMiddleware", # Cookie-based and session-based message support. Needed by admin system
"django.middleware.clickjacking.XFrameOptionsMiddleware", # clickjacking protection via the X-Frame-Options header
#'django.middleware.security.SecurityMiddleware', # SECURE_HSTS_SECONDS, SECURE_CONTENT_TYPE_NOSNIFF, SECURE_BROWSER_XSS_FILTER, SECURE_REFERRER_POLICY, and SECURE_SSL_REDIRECT
#'troggle.core.middleware.SmartAppendSlashMiddleware' # needs adapting after Dj2.0
]
ROOT_URLCONF = "troggle.urls"
WSGI_APPLICATION = "troggle.wsgi.application" # change to asgi as soon as we upgrade to Django 3.0
ACCOUNT_ACTIVATION_DAYS = 3
# AUTH_PROFILE_MODULE = 'core.person' # used by removed profiles app ?
QM_PATTERN = "\[\[\s*[Qq][Mm]:([ABC]?)(\d{4})-(\d*)-(\d*)\]\]"
# Re-enable TinyMCE when Dj upgraded to v3. Also templates/editexpopage.html
# TINYMCE_DEFAULT_CONFIG = {
# 'plugins': "table,spellchecker,paste,searchreplace",
# 'theme': "advanced",
# }
# TINYMCE_SPELLCHECKER = False
# TINYMCE_COMPRESSOR = True
TEST_RUNNER = "django.test.runner.DiscoverRunner"
from localsettings import *
# localsettings needs to take precedence. Call it to override any existing vars.


@@ -0,0 +1,147 @@
"""
Django settings for troggle project.
For more information on this file, see
https://docs.djangoproject.com/en/dev/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/dev/ref/settings/
"""
# Imports should be grouped in the following order:
# 1.Standard library imports.
# 2.Related third party imports.
# 3.Local application/library specific imports.
# 4.You should put a blank line between each group of imports.
print("* importing troggle/settings.py")
# default value, then gets overwritten by real secrets
SECRET_KEY = "not-the-real-secret-key-a#vaeozn0---^fj!355qki*vj2"
GIT = "git" # command for running git
# Note that this builds upon the django system installed
# global settings in
# django/conf/global_settings.py which is automatically loaded first.
# read https://docs.djangoproject.com/en/dev/topics/settings/
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
# BASE_DIR = os.path.dirname(os.path.dirname(__file__))
# Django settings for troggle project.
ALLOWED_HOSTS = ["*", "expo.survex.com", ".survex.com", "localhost", "127.0.0.1", "192.168.0.5"]
ADMINS = (
# ('Your Name', 'your_email@domain.com'),
)
MANAGERS = ADMINS
# LOGIN_URL = '/accounts/login/' # this is the default value so does not need to be set
# Local time zone for this installation. Choices can be found here:
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
# although not all choices may be available on all operating systems.
# If running in a Windows environment this must be set to the same as your
# system time zone.
USE_TZ = True
TIME_ZONE = "Europe/London"
# Language code for this installation. All choices can be found here:
# http://www.i18nguy.com/unicode/language-identifiers.html
LANGUAGE_CODE = "en-uk"
SITE_ID = 1
# If you set this to False, Django will make some optimizations so as not
# to load the internationalization machinery.
USE_I18N = True
USE_L10N = True
FIX_PERMISSIONS = []
# top-level survex file basename (without .svx)
SURVEX_TOPNAME = "1623-and-1626-no-schoenberg-hs"
# Caves for which survex files exist, but are not otherwise registered
# replaced (?) by expoweb/cave_data/pendingcaves.txt
# PENDING = ["1626-361", "2007-06", "2009-02",
# "2012-ns-01", "2012-ns-02", "2010-04", "2012-ns-05", "2012-ns-06",
# "2012-ns-07", "2012-ns-08", "2012-ns-12", "2012-ns-14", "2012-ns-15", "2014-bl888",
# "2018-pf-01", "2018-pf-02"]
APPEND_SLASH = (
False # never relevant because we have urls that match unknown files and produce an 'edit this page' response
)
SMART_APPEND_SLASH = True # not working as middleware is different after Dj2.0
LOGIN_REDIRECT_URL = "/" # does not seem to have any effect
SECURE_CONTENT_TYPE_NOSNIFF = True
SECURE_BROWSER_XSS_FILTER = True
# SESSION_COOKIE_SECURE = True # if enabled, cannot login to Django control panel, bug elsewhere?
# CSRF_COOKIE_SECURE = True # if enabled only sends cookies over SSL
X_FRAME_OPTIONS = "DENY" # changed to "DENY" after I eliminated all the iframes e.g. /xmlvalid.html
DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField" # from Django 3.2
INSTALLED_APPS = (
"django.contrib.admin",
"django.contrib.auth", # includes the url redirections for login, logout
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.messages",
"django.contrib.admindocs",
"django.forms", # Required to customise widget templates
# 'django.contrib.staticfiles', # We put our CSS etc explicitly in the right place so do not need this
"troggle.core",
)
FORM_RENDERER = "django.forms.renderers.TemplatesSetting" # Required to customise widget templates
# See the recommended order of these in https://docs.djangoproject.com/en/dev/ref/middleware/
# Note that this is a radically different onion architecture from earlier versions though it looks the same,
# see https://docs.djangoproject.com/en/dev/topics/http/middleware/#upgrading-pre-django-1-10-style-middleware
# Seriously, read this: https://www.webforefront.com/django/middlewaredjango.html which is MUCH BETTER than the docs
MIDDLEWARE = [
#'django.middleware.security.SecurityMiddleware', # SECURE_SSL_REDIRECT and SECURE_SSL_HOST # we don't use this
"django.middleware.gzip.GZipMiddleware", # not needed when expofiles and photos served by apache
"django.contrib.sessions.middleware.SessionMiddleware", # Manages sessions, if CSRF_USE_SESSIONS then it needs to be early
"django.middleware.common.CommonMiddleware", # DISALLOWED_USER_AGENTS, APPEND_SLASH and PREPEND_WWW
"django.middleware.csrf.CsrfViewMiddleware", # Cross Site Request Forgeries by adding hidden form fields to POST
"django.contrib.auth.middleware.AuthenticationMiddleware", # Adds the user attribute, representing the currently-logged-in user
"django.contrib.admindocs.middleware.XViewMiddleware", # this and docutils needed by admindocs
"django.contrib.messages.middleware.MessageMiddleware", # Cookie-based and session-based message support. Needed by admin system
"django.middleware.clickjacking.XFrameOptionsMiddleware", # clickjacking protection via the X-Frame-Options header
#'django.middleware.security.SecurityMiddleware', # SECURE_HSTS_SECONDS, SECURE_CONTENT_TYPE_NOSNIFF, SECURE_BROWSER_XSS_FILTER, SECURE_REFERRER_POLICY, and SECURE_SSL_REDIRECT
#'troggle.core.middleware.SmartAppendSlashMiddleware' # needs adapting after Dj2.0
]
ROOT_URLCONF = "troggle.urls"
WSGI_APPLICATION = "troggle.wsgi.application" # change to asgi as soon as we upgrade to Django 3.0
ACCOUNT_ACTIVATION_DAYS = 3
# AUTH_PROFILE_MODULE = 'core.person' # used by removed profiles app ?
QM_PATTERN = "\[\[\s*[Qq][Mm]:([ABC]?)(\d{4})-(\d*)-(\d*)\]\]"
# Re-enable TinyMCE when Dj upgraded to v3. Also templates/editexpopage.html
# TINYMCE_DEFAULT_CONFIG = {
# 'plugins': "table,spellchecker,paste,searchreplace",
# 'theme': "advanced",
# }
# TINYMCE_SPELLCHECKER = False
# TINYMCE_COMPRESSOR = True
TEST_RUNNER = "django.test.runner.DiscoverRunner"
from localsettings import *
# localsettings needs to take precedence. Call it to override any existing vars.


@@ -0,0 +1,173 @@
#!/bin/bash
# Crowley has python 3.9.2
# Taken from: footled lots to make this work with python 3.10 & 3.11 and WSL1 and WSL2 on Ubuntu 22.04
# Run this in a terminal in the troggle directory: 'bash venv-trog-crowley.sh'
echo '-- DONT RUN THIS - messes up permissions!'
echo '-- Run this in a terminal in the real troggle directory: "bash venv-trog-crowley.sh"'
# use the script os-trog-crowley.sh
# If you are using Debian, then stick with the default version of python
# If you are using Ubuntu, then it is easy to use a later version of python, e.g. 3.11
# NOW we set up troggle
PYTHON=python3.9
VENAME=p9d4 # python3.x and django 4
echo "** You are logged in as `id -u -n`"
echo "The 50MB pip cache will be in /home/`id -u -n`/.cache/"
echo "The 150MB venv will created in /home/`id -u -n`/$VENAME/"
TROGDIR=$(cd $(dirname $0) && pwd)
echo "-- Troggle folder (this script location): ${TROGDIR}"
if [ ! -f requirements.txt ]; then
echo "-- No requirements.txt found. Copy it from your most recent installation."
exit 1
fi
echo "## Using requirements.txt :"
cat requirements.txt
echo "##"
$PYTHON --version
# NOTE that when using a later or earlier version of python, you MUST also
# use the allowed version of Pillow, see https://pillow.readthedocs.io/en/latest/installation.html
# NOW set up link from expo user folder
# needed for WSL2
echo Creating links from Linux filesystem user
# These links only need making once, for many venv
cd ~
if [ ! -d $VENAME ]; then
echo "## Creating venv $VENAME. (If this fails with a pip error, you need to ensure you have the matching python3.x-venv package installed and/or use an Ubuntu window)"
$PYTHON -m venv $VENAME
else
echo "## /$VENAME/ already exists ! Delete it first."
exit 1
fi
# Activate the virtual env and see what the default packages are
echo "### Activating $VENAME"
cd $VENAME
echo "-- now in: ${PWD}"
source bin/activate
echo "### Activated."
# update local version of pip, more recent than OS version
# debian bullseye installs pip 20.3.4 which barfs, we want >22.0.3
# update local version of setuptools, more recent than OS version, needed for packages without wheels
echo "### installing later version of pip inside $VENAME"
$PYTHON -m pip install --upgrade pip
$PYTHON -m pip install --upgrade setuptools
PIP=pip
$PIP list > original-pip.list
$PIP freeze >original.txt
# we are in /home/$USER/$VENAME/
ln -s ${TROGDIR} troggle
ln -s ${TROGDIR}/../expoweb expoweb
ln -s ${TROGDIR}/../loser loser
ln -s ${TROGDIR}/../drawings drawings
# fudge for philip's machine
if [ -d ${TROGDIR}/../expofiles ]; then
ln -s ${TROGDIR}/../expofiles expofiles
else
# mount F: first if expofiles is not yet visible, then link it (WSL2 fudge)
if [ ! -d /mnt/f/expofiles ]; then
sudo mkdir -p /mnt/f
sudo mount -t drvfs F: /mnt/f
fi
ln -s /mnt/f/expofiles expofiles
fi
echo "### Setting file permissions.. may take a while.."
git config --global --add safe.directory '*'
#sudo chmod -R 0777 *
echo "### links to expoweb, troggle etc. complete:"
ls -tla
echo "###"
echo "### now installing ${TROGDIR}/requirements.txt"
echo "###"
# NOW THERE IS A PERMISSIONS FAILURE THAT DIDN'T HAPPEN BEFORE
# seen on wsl2 as well as wsl1
# which ALSO ruins EXISTING permissions !
# Guessing it is to do with pip not liking non-standard py 3.11 installation on Ubuntu 22.04
$PIP install -r ${TROGDIR}/requirements.txt
echo '### install from requirements.txt completed.'
echo '### '
$PIP freeze > requirements.txt
# so that we can track requirements more easily with git
# because we do not install these with pip, but they are listed by the freeze command
# Now find out what we actually installed by subtracting the stuff venv installed anyway
sort original.txt > 1
sort requirements.txt >2
comm -3 1 2 --check-order | awk '{ print $1}'>fresh-requirements.txt
rm 1
rm 2
cp requirements.txt requirements-$VENAME.txt
cp requirements-$VENAME.txt troggle/requirements-$VENAME.txt
$PIP list > installed-pip.list
$PIP list -o > installed-pip-o.list
REQ=installation-record
mkdir $REQ
mv requirements-$VENAME.txt $REQ
mv original.txt $REQ
mv requirements.txt $REQ
mv original-pip.list $REQ
mv installed-pip.list $REQ
mv installed-pip-o.list $REQ
cp fresh-requirements.txt ../requirements.txt
mv fresh-requirements.txt $REQ
cp troggle/`basename "$0"` $REQ
$PYTHON --version
python --version
echo "Django version:`django-admin --version`"
echo "### Now do
'[sudo service mysql start]'
'[sudo service mariadb restart]'
'[sudo mysql_secure_installation]'
'cd ~/$VENAME'
'source bin/activate'
'cd troggle'
'django-admin'
'python manage.py check'
## this tests if you have set up ssh correctly. Refer to documentation https://expo.survex.com/handbook/computing/keyexchange.html
## you need to follow the Linux instructions.
'ssh expo@expo.survex.com'
## the next tests will fail unless ~/expofiles is set correctly to a folder on your machine
## the tests may ALSO fail because of ssh and permissions errors
# Ran 85 tests in 83.492s
# FAILED (failures=5)
## So you will need to run
# sudo chown -Rhv philip:philip ~/$VENAME (if your username is philip)
# and then REBOOT (or at least, exit WSL and terminate and restart WSL)
# because this chown only takes effect then.
'./pre-run.sh' (runs the migrations and then the tests)
'python databaseReset.py reset $VENAME'
'python manage.py runserver 0.0.0.0:8000 (and allow access when the firewall window pops up)'
"
if [ ! -d /mnt/f/expofiles ]; then
echo '### No valid expofiles directory. Fix this before any tests will work.'
fi
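The sort/comm/awk pipeline above simply keeps the freeze lines that appear in only one of the two snapshots, i.e. whatever installing requirements.txt changed. A rough Python equivalent (my illustration, same file names as the script):

from pathlib import Path

# Lines present in only one of the two 'pip freeze' snapshots: the packages
# (and version bumps) that the requirements.txt install brought in.
original = set(Path("original.txt").read_text().splitlines())
after = set(Path("requirements.txt").read_text().splitlines())

fresh = sorted(original.symmetric_difference(after))
Path("fresh-requirements.txt").write_text("\n".join(fresh) + "\n")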


@@ -0,0 +1,227 @@
# This is the main Apache server configuration file. It contains the
# configuration directives that give the server its instructions.
# See http://httpd.apache.org/docs/2.4/ for detailed information about
# the directives and /usr/share/doc/apache2/README.Debian about Debian specific
# hints.
#
#
# Summary of how the Apache 2 configuration works in Debian:
# The Apache 2 web server configuration in Debian is quite different to
# upstream's suggested way to configure the web server. This is because Debian's
# default Apache2 installation attempts to make adding and removing modules,
# virtual hosts, and extra configuration directives as flexible as possible, in
# order to make automating the changes and administering the server as easy as
# possible.
# It is split into several files forming the configuration hierarchy outlined
# below, all located in the /etc/apache2/ directory:
#
# /etc/apache2/
# |-- apache2.conf
# | `-- ports.conf
# |-- mods-enabled
# | |-- *.load
# | `-- *.conf
# |-- conf-enabled
# | `-- *.conf
# `-- sites-enabled
# `-- *.conf
#
#
# * apache2.conf is the main configuration file (this file). It puts the pieces
# together by including all remaining configuration files when starting up the
# web server.
#
# * ports.conf is always included from the main configuration file. It is
# supposed to determine listening ports for incoming connections which can be
# customized anytime.
#
# * Configuration files in the mods-enabled/, conf-enabled/ and sites-enabled/
# directories contain particular configuration snippets which manage modules,
# global configuration fragments, or virtual host configurations,
# respectively.
#
# They are activated by symlinking available configuration files from their
# respective *-available/ counterparts. These should be managed by using our
# helpers a2enmod/a2dismod, a2ensite/a2dissite and a2enconf/a2disconf. See
# their respective man pages for detailed information.
#
# * The binary is called apache2. Due to the use of environment variables, in
# the default configuration, apache2 needs to be started/stopped with
# /etc/init.d/apache2 or apache2ctl. Calling /usr/bin/apache2 directly will not
# work with the default configuration.
# Global configuration
#
#
# ServerRoot: The top of the directory tree under which the server's
# configuration, error, and log files are kept.
#
# NOTE! If you intend to place this on an NFS (or otherwise network)
# mounted filesystem then please read the Mutex documentation (available
# at <URL:http://httpd.apache.org/docs/2.4/mod/core.html#mutex>);
# you will save yourself a lot of trouble.
#
# Do NOT add a slash at the end of the directory path.
#
#ServerRoot "/etc/apache2"
#
# The accept serialization lock file MUST BE STORED ON A LOCAL DISK.
#
#Mutex file:${APACHE_LOCK_DIR} default
#
# The directory where shm and other runtime files will be stored.
#
DefaultRuntimeDir ${APACHE_RUN_DIR}
#
# PidFile: The file in which the server should record its process
# identification number when it starts.
# This needs to be set in /etc/apache2/envvars
#
PidFile ${APACHE_PID_FILE}
#
# Timeout: The number of seconds before receives and sends time out.
#
Timeout 300
#
# KeepAlive: Whether or not to allow persistent connections (more than
# one request per connection). Set to "Off" to deactivate.
#
KeepAlive On
#
# MaxKeepAliveRequests: The maximum number of requests to allow
# during a persistent connection. Set to 0 to allow an unlimited amount.
# We recommend you leave this number high, for maximum performance.
#
MaxKeepAliveRequests 100
#
# KeepAliveTimeout: Number of seconds to wait for the next request from the
# same client on the same connection.
#
KeepAliveTimeout 5
# These need to be set in /etc/apache2/envvars
User ${APACHE_RUN_USER}
Group ${APACHE_RUN_GROUP}
#
# HostnameLookups: Log the names of clients or just their IP addresses
# e.g., www.apache.org (on) or 204.62.129.132 (off).
# The default is off because it'd be overall better for the net if people
# had to knowingly turn this feature on, since enabling it means that
# each client request will result in AT LEAST one lookup request to the
# nameserver.
#
HostnameLookups Off
# ErrorLog: The location of the error log file.
# If you do not specify an ErrorLog directive within a <VirtualHost>
# container, error messages relating to that virtual host will be
# logged here. If you *do* define an error logfile for a <VirtualHost>
# container, that host's errors will be logged there and not here.
#
ErrorLog ${APACHE_LOG_DIR}/error.log
#
# LogLevel: Control the severity of messages logged to the error_log.
# Available values: trace8, ..., trace1, debug, info, notice, warn,
# error, crit, alert, emerg.
# It is also possible to configure the log level for particular modules, e.g.
# "LogLevel info ssl:warn"
#
LogLevel warn
# Include module configuration:
IncludeOptional mods-enabled/*.load
IncludeOptional mods-enabled/*.conf
# Include list of ports to listen on
Include ports.conf
# Sets the default security model of the Apache2 HTTPD server. It does
# not allow access to the root filesystem outside of /usr/share and /var/www.
# The former is used by web applications packaged in Debian,
# the latter may be used for local directories served by the web server. If
# your system is serving content from a sub-directory in /srv you must allow
# access here, or in any related virtual host.
<Directory />
Options FollowSymLinks
AllowOverride None
Require all denied
</Directory>
<Directory /usr/share>
AllowOverride None
Require all granted
</Directory>
<Directory /var/www/>
Options Indexes FollowSymLinks
AllowOverride None
Require all granted
</Directory>
#<Directory /srv/>
# Options Indexes FollowSymLinks
# AllowOverride None
# Require all granted
#</Directory>
# AccessFileName: The name of the file to look for in each directory
# for additional configuration directives. See also the AllowOverride
# directive.
#
AccessFileName .htaccess
#
# The following lines prevent .htaccess and .htpasswd files from being
# viewed by Web clients.
#
<FilesMatch "^\.ht">
Require all denied
</FilesMatch>
#
# The following directives define some format nicknames for use with
# a CustomLog directive.
#
# These deviate from the Common Log Format definitions in that they use %O
# (the actual bytes sent including headers) instead of %b (the size of the
# requested file), because the latter makes it impossible to detect partial
# requests.
#
# Note that the use of %{X-Forwarded-For}i instead of %h is not recommended.
# Use mod_remoteip instead.
#
LogFormat "%v:%p %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined
LogFormat "%h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" combined
LogFormat "%h %l %u %t \"%r\" %>s %O" common
LogFormat "%{Referer}i -> %U" referer
LogFormat "%{User-agent}i" agent
# Include of directories ignores editors' and dpkg's backup files,
# see README.Debian for details.
# Include generic snippets of statements
IncludeOptional conf-enabled/*.conf
# Include the virtual host configurations:
IncludeOptional sites-enabled/*.conf
# vim: syntax=apache ts=4 sw=4 sts=4 sr noet


@@ -0,0 +1,47 @@
# envvars - default environment variables for apache2ctl
# this won't be correct after changing uid
unset HOME
# for supporting multiple apache2 instances
if [ "${APACHE_CONFDIR##/etc/apache2-}" != "${APACHE_CONFDIR}" ] ; then
SUFFIX="-${APACHE_CONFDIR##/etc/apache2-}"
else
SUFFIX=
fi
# Since there is no sane way to get the parsed apache2 config in scripts, some
# settings are defined via environment variables and then used in apache2ctl,
# /etc/init.d/apache2, /etc/logrotate.d/apache2, etc.
export APACHE_RUN_USER=expo
export APACHE_RUN_GROUP=expo
# temporary state file location. This might be changed to /run in Wheezy+1
export APACHE_PID_FILE=/var/run/apache2$SUFFIX/apache2.pid
export APACHE_RUN_DIR=/var/run/apache2$SUFFIX
export APACHE_LOCK_DIR=/var/lock/apache2$SUFFIX
# Only /var/log/apache2 is handled by /etc/logrotate.d/apache2.
export APACHE_LOG_DIR=/var/log/apache2$SUFFIX
## The locale used by some modules like mod_dav
#export LANG=C
## Uncomment the following line to use the system default locale instead:
. /etc/default/locale
export LANG
## The command to get the status for 'apache2ctl status'.
## Some packages providing 'www-browser' need '--dump' instead of '-dump'.
#export APACHE_LYNX='www-browser -dump'
## If you need a higher file descriptor limit, uncomment and adjust the
## following line (default is 8192):
#APACHE_ULIMIT_MAX_FILES='ulimit -n 65536'
## If you would like to pass arguments to the web server, add them below
## to the APACHE_ARGUMENTS environment.
#export APACHE_ARGUMENTS=''
## Enable the debug mode for maintainer scripts.
## This will produce a verbose output on package installations of web server modules and web application
## installations which interact with Apache
#export APACHE2_MAINTSCRIPT_DEBUG=1


@@ -0,0 +1,121 @@
import os
import sys
import urllib.parse
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
Subsystem for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
"""
print(" * importing troggle/localsettings.py")
# DO NOT check this file into the git repo - it contains real passwords. [not this copy]
SECRET_KEY = "zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz"
EXPOUSERPASS = "nope"
EXPOADMINUSERPASS = "nope"
EMAIL_HOST_PASSWORD = "nope"
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle', # Or path to database file if using sqlite3.
'USER' : 'expo', # Not used with sqlite3.
'PASSWORD' : 'not a real password', # Not used with sqlite3.
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
EXPOUSER = 'expo'
EXPOUSERPASS = "nnn:gggggg"
EXPOUSER_EMAIL = 'wookey@wookware.org'
REPOS_ROOT_PATH = '/home/expo/'
sys.path.append(REPOS_ROOT_PATH)
sys.path.append(REPOS_ROOT_PATH + 'troggle')
# Define the path to the django app (troggle in this case)
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [
PYTHON_PATH + "templates"
],
'OPTIONS': {
'debug': 'DEBUG',
'context_processors': [
'django.contrib.auth.context_processors.auth',
'core.context.troggle_context',
'django.template.context_processors.debug',
'django.template.context_processors.i18n',
'django.template.context_processors.media',
'django.template.context_processors.static',
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
],
'loaders': [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader',
# insert your TEMPLATE_LOADERS here
]
},
},
]
PUBLIC_SITE = True
# This should be False for normal running
DEBUG = True
SURVEX_DATA = REPOS_ROOT_PATH + 'loser/'
DRAWINGS_DATA = REPOS_ROOT_PATH + 'drawings/'
CAVERN = 'cavern'
THREEDTOPOS = 'survexport'
EXPOWEB = REPOS_ROOT_PATH + 'expoweb/'
SURVEYS = REPOS_ROOT_PATH
SURVEY_SCANS = REPOS_ROOT_PATH + 'expofiles/surveyscans/'
FILES = REPOS_ROOT_PATH + 'expofiles'
CAVEDESCRIPTIONS = os.path.join(EXPOWEB, "cave_data")
ENTRANCEDESCRIPTIONS = os.path.join(EXPOWEB, "entrance_data")
CACHEDIR = REPOS_ROOT_PATH + 'expowebcache/'
THREEDCACHEDIR = CACHEDIR + '3d/'
THUMBNAILCACHE = CACHEDIR + 'thumbs'
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
URL_ROOT = 'http://expo.survex.com/'
DIR_ROOT = ''#this should end in / if a value is given
EXPOWEB_URL = '/'
SURVEYS_URL = '/survey_scans/'
EXPOFILES = urllib.parse.urljoin(REPOS_ROOT_PATH, 'expofiles/')
PHOTOS_URL = urllib.parse.urljoin(URL_ROOT, '/photos/')
# MEDIA_URL is used by urls.py in a regex. See urls.py & core/views/surveys.py
MEDIA_URL = '/site_media/'
MEDIA_ROOT = REPOS_ROOT_PATH + '/troggle/media/'
STATIC_URL = urllib.parse.urljoin(URL_ROOT , '/static/') # used by Django admin pages. Do not delete.
JSLIB_URL = urllib.parse.urljoin(URL_ROOT , '/javascript/') # always fails, try to revive it ?
#TINY_MCE_MEDIA_ROOT = STATIC_ROOT + '/tiny_mce/' # not needed while TinyMCE not installed
#TINY_MCE_MEDIA_URL = STATIC_URL + '/tiny_mce/' # not needed while TinyMCE not installed
LOGFILE = '/var/log/troggle/troggle.log'
IMPORTLOGFILE = '/var/log/troggle/import.log'
# add in 290, 291, 358 when they don't make it crash horribly
NOTABLECAVESHREFS = [ "264", "258", "204", "76", "107"]
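The urljoin() calls above behave differently depending on whether the second argument starts with a slash; a minimal sketch using the values from this file (the last base URL is a made-up example to show the gotcha):

from urllib.parse import urljoin

print(urljoin('/home/expo/', 'expofiles/'))            # /home/expo/expofiles/
print(urljoin('http://expo.survex.com/', '/photos/'))  # http://expo.survex.com/photos/
# A leading '/' in the second argument replaces any path on the base,
# so /static/ and /photos/ always hang off the site root:
print(urljoin('http://expo.survex.com/troggle/', '/static/'))  # http://expo.survex.com/static/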


@@ -0,0 +1,164 @@
import os
import sys
import urllib.parse
from pathlib import Path
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
Subsystem for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem, or /javascript/ can be in
a system-wide location rather than just a local directory.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
Read https://realpython.com/python-pathlib/
Read https://adamj.eu/tech/2020/03/16/use-pathlib-in-your-django-project/
"""
print(" * importing troggle/localsettings.py")
# DO NOT check this file into the git repo - it contains real passwords.
EXPOFILESREMOTE = False # if True, then re-routes urls in expofiles to remote server
#SECURE_SSL_REDIRECT = True # breaks 7 tests in test suite 301 not 200 (or 302) and runserver fails completely
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle', # Or path to database file if using sqlite3.
'USER' : 'expo', # Not used with sqlite3.
'PASSWORD' : 'uFqP56B4XleeyIW', # Not used with sqlite3.
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
EXPOUSER = 'expo'
EXPOADMINUSER = 'expoadmin'
EXPOUSER_EMAIL = 'wookey@wookware.org'
EXPOADMINUSER_EMAIL = 'wookey@wookware.org'
SECRET_KEY = "zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz"
EXPOUSERPASS = "nope"
EXPOADMINUSERPASS = "nope"
EMAIL_HOST_PASSWORD = "nope"
REPOS_ROOT_PATH = '/home/expo/'
sys.path.append(REPOS_ROOT_PATH)
sys.path.append(REPOS_ROOT_PATH + 'troggle')
# Define the path to the django app (troggle in this case)
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
PHOTOS_YEAR = "2022"
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [
PYTHON_PATH + "templates"
],
'OPTIONS': {
'debug': 'DEBUG',
'context_processors': [
# django.template.context_processors.csrf, # is always enabled and cannot be removed, sets csrf_token
'django.contrib.auth.context_processors.auth', # knowledge of logged-on user & permissions
'core.context.troggle_context', # in core/troggle.py
'django.template.context_processors.debug',
#'django.template.context_processors.request', # copy of current request, added in trying to make csrf work
'django.template.context_processors.i18n',
'django.template.context_processors.media', # includes a variable MEDIA_URL
'django.template.context_processors.static', # includes a variable STATIC_URL
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
],
'loaders': [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader', #For each app, inc admin, in INSTALLED_APPS, loader looks for /templates
# insert your own TEMPLATE_LOADERS here
]
},
},
]
PUBLIC_SITE = True
# This should be False for normal running
DEBUG = True
CACHEDPAGES = True # experimental page cache for a handful of page types
SURVEX_DATA = REPOS_ROOT_PATH + 'loser/'
DRAWINGS_DATA = REPOS_ROOT_PATH + 'drawings/'
# executables:
CAVERN = 'cavern' # for parsing .svx files and producing .3d files
SURVEXPORT = 'survexport' # for parsing .3d files and producing .pos files
EXPOWEB = REPOS_ROOT_PATH + 'expoweb/'
#SURVEYS = REPOS_ROOT_PATH
SCANS_ROOT = REPOS_ROOT_PATH + 'expofiles/surveyscans/'
FILES = REPOS_ROOT_PATH + 'expofiles'
PHOTOS_ROOT = REPOS_ROOT_PATH + 'expofiles/photos/'
TROGGLE_PATH = Path(__file__).parent
TEMPLATE_PATH = TROGGLE_PATH / 'templates'
MEDIA_ROOT = TROGGLE_PATH / 'media'
JSLIB_ROOT = TROGGLE_PATH / 'media' / 'jslib' # used for CaveViewer JS utility
CAVEDESCRIPTIONS = os.path.join(EXPOWEB, "cave_data")
ENTRANCEDESCRIPTIONS = os.path.join(EXPOWEB, "entrance_data")
# CACHEDIR = REPOS_ROOT_PATH + 'expowebcache/'
# THREEDCACHEDIR = CACHEDIR + '3d/'
# THUMBNAILCACHE = CACHEDIR + 'thumbs'
PYTHON_PATH = REPOS_ROOT_PATH + 'troggle/'
PV = "python" + str(sys.version_info.major) + "." + str(sys.version_info.minor)
LIBDIR = Path(REPOS_ROOT_PATH) / 'lib' / PV
#Note that all these *_URL constants are not actually used in urls.py, they should be..
#URL_ROOT = 'http://expo.survex.com/'
URL_ROOT = '/'
DIR_ROOT = ''#this should end in / if a value is given
EXPOWEB_URL = '/'
SCANS_URL = '/survey_scans/'
EXPOFILES = urllib.parse.urljoin(REPOS_ROOT_PATH, 'expofiles/')
PHOTOS_URL = urllib.parse.urljoin(URL_ROOT, '/photos/')
# MEDIA_URL is used by urls.py in a regex. See urls.py & core/views_surveys.py
MEDIA_URL = '/site_media/'
STATIC_URL = urllib.parse.urljoin(URL_ROOT , '/static/') # used by Django admin pages. Do not delete.
JSLIB_URL = urllib.parse.urljoin(URL_ROOT , '/javascript/') # always fails, try to revive it ?
#TINY_MCE_MEDIA_ROOT = STATIC_ROOT + '/tiny_mce/' # not needed while TinyMCE not installed
#TINY_MCE_MEDIA_URL = STATIC_URL + '/tiny_mce/' # not needed while TinyMCE not installed
LOGFILE = '/var/log/troggle/troggle.log'
IMPORTLOGFILE = '/var/log/troggle/import.log'
# add in 358 when they don't make it crash horribly
NOTABLECAVESHREFS = [ "290", "291", "359", "264", "258", "204", "76", "107"]
# Sanitise these to be strings as all other code is expecting strings
# and we have not made the change to pathlib Path type in the other localsettings-* variants yet.
CAVEDESCRIPTIONS = os.fspath(CAVEDESCRIPTIONS)
ENTRANCEDESCRIPTIONS = os.fspath(ENTRANCEDESCRIPTIONS)
LOGFILE = os.fspath(LOGFILE)
#SURVEYS = os.fspath(SURVEYS)
EXPOWEB = os.fspath(EXPOWEB)
DRAWINGS_DATA = os.fspath(DRAWINGS_DATA)
SURVEX_DATA = os.fspath(SURVEX_DATA)
REPOS_ROOT_PATH = os.fspath(REPOS_ROOT_PATH)
TEMPLATE_PATH = os.fspath(TROGGLE_PATH)
MEDIA_ROOT = os.fspath(MEDIA_ROOT)
JSLIB_ROOT = os.fspath(JSLIB_ROOT)
SCANS_ROOT = os.fspath(SCANS_ROOT)
LIBDIR = os.fspath(LIBDIR)
print(" + finished importing troggle/localsettings.py")
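The block of os.fspath() calls just above exists because much of the older code still builds paths by string concatenation, which works on str but not on Path; a minimal sketch of the difference:

import os
from pathlib import Path

EXPOWEB = Path("/home/expo/expoweb")
print(os.fspath(EXPOWEB))                 # '/home/expo/expoweb' - a plain str again
print(os.fspath(EXPOWEB) + "/cave_data")  # old-style '+' concatenation works on str
# EXPOWEB + "/cave_data" raises TypeError; EXPOWEB / "cave_data" is the Path equivalent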


@@ -0,0 +1,23 @@
#This requirements txt matches the libraries as of 2023-07-09 on expo.survex.com <Debian GNU/Linux 11 (bullseye)>
#Nb on the server asgiref==3.3.0, however this conflicts with the Django==3.2.12 requirement
asgiref==3.3.2
Django==3.2.12
docutils==0.16
packaging==20.9
Pillow==8.1.2
pytz==2021.1
sqlparse==0.4.1
Unidecode==1.2.0
beautifulsoup4==4.9.3
piexif==1.1.3
#Not installed on expo.survex.com
#black==23.3
#click==8.1.3
#coverage==7.2
#isort==5.12.0
#mypy-extensions==1.0.0
#pathspec==0.11
#platformdirs==3.8
#ruff==0.0.245


@@ -0,0 +1,93 @@
Instructions for setting up new expo debian server/VM
For Debian Stretch, June 2019.
[Note added March 2021:
See also http://expo.survex.com/handbook/troggle/serverconfig.html
and troggle/README.txt
]
adduser expo
apt install openssh-server mosh tmux mc zile emacs-nox mc most ncdu
apt install python-django apache2 mysql-server survex make rsync
apt install libjs-openlayers make
apt install git mercurial mercurial-server?
for boe:
apt install libcgi-session-perl libcrypt-passwdmd5-perl libfile-slurp-perl libgit-wrapper-perl libhtml-template-perl libhtml-template-pro-perl libmime-lite-perl libtext-password-pronounceable-perl libtime-parsedate-perl libuuid-tiny-perl libcrypt-cracklib-perl
obsolete-packages:
bins (move to jigl?) (for photos)
python-django 1.7
backports: survex therion
not-packaged: caveview
make these dirs available at top documentroot:
cuccfiles
expofiles
loser (link to repo)
tunneldata (link to repo)
troggle (link to repo)
expoweb (link to repo)
boc/boe
config
containing:
setup apache configs for cucc and expo
#disable default website
a2dissite 000-default
a2ensite cucc
a2ensite expo
a2enmod cgid
Boe config:
Alias /boe /home/expo/boe/boc/boc.pl
<Directory /home/expo/boe/boc>
AddHandler cgi-script .pl
SetHandler cgi-script
Options +ExecCGI
Require all granted
</Directory>
And remember to set both program and data dir to be
www-data:www-data
(optionally make file group read/write by treasurer account)
create empty repo by clicking create in boe interface
then set names in 'settings'
Set up mysql (as root)
mysql -p
CREATE DATABASE troggle;
GRANT ALL PRIVILEGES ON troggle.* TO 'expo'@'localhost' IDENTIFIED BY 'somepassword';
install django:
NO!
This was: sudo apt install python-django python-django-registration python-django-imagekit python-django-tinymce fonts-freefont-ttf libapache2-mod-wsgi
Should be ?
sudo apt install python-django python-django-tinymce fonts-freefont-ttf libapache2-mod-wsgi
Check if this is correct:
python-django-tinymce comes from https://salsa.debian.org/python-team/modules/python-django-tinymce
(both modified for stretch/python2). packages under /home/wookey/packages/
need fonts-freefont-ttf (to have truetype freesans available for troggle via PIL)
need libapache2-mod-wsgi for apache wsgi support.
On stretch the django 1.10 is no use so get rid of that:
apt remove python3-django python-django python-django-common python-django-doc
Then replace with django 1.7 (Needs to be built for stretch)
apt install python-django python-django-common python-django-doc
apt install python-django-registration python-django-imagekit python-django-tinymce
then hold them to stop them being upgraded by unattended upgrades:
echo "python-django hold" | sudo dpkg --set-selections
echo "python-django-common hold" | sudo dpkg --set-selections
echo "python-django-doc hold" | sudo dpkg --set-selections
#troggle has to have a writable logfile otherwise the website explodes
# 500 error on the server, and the apache error log has non-reentrant errors
create /var/log/troggle/troggle.log
chown www-data:adm /var/log/troggle/troggle.log
chmod 660 /var/log/troggle/troggle.log
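A small optional check, not part of the notes above, to confirm the logfile is in place and writable before Apache serves troggle:

# Pre-flight check (run it as the user Apache uses, normally www-data):
# an unwritable logfile is what produces the bare 500 described above.
import os
from pathlib import Path

logfile = Path("/var/log/troggle/troggle.log")
if not logfile.exists():
    print(f"missing: {logfile} - create it, chown www-data:adm, chmod 660")
elif not os.access(logfile, os.W_OK):
    print(f"{logfile} exists but is not writable by uid {os.getuid()}")
else:
    print(f"{logfile} looks OK")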


@@ -0,0 +1,7 @@
# install the apport exception handler if available
try:
import apport_python_hook
except ImportError:
pass
else:
apport_python_hook.install()


@@ -0,0 +1,103 @@
adduser expo
apt install openssh-server mosh tmux mc zile emacs-nox mc most ncdu
apt install python-django apache2 mysql-server survex make rsync
apt install libjs-openlayers make
apt install git mercurial mercurial-server?
for boe:
apt install libcgi-session-perl libcrypt-passwdmd5-perl libfile-slurp-perl libgit-wrapper-perl libhtml-template-perl libhtml-template-pro-perl libmime-lite-perl libtext-password-pronounceable-perl libtime-parsedate-perl libuuid-tiny-perl libcrypt-cracklib-perl
apt install ufraw for PEF image decoding.
sudo apt install python-django python-django-registration fonts-freefont-ttf libapache2-mod-wsgi python3-gdbm
# sudo apt install python-django-imagekit python-django-tinymce
obsolete-packages: bins (move to jigl?)
older python-django?
backports: survex therion
not-packaged: caveview
make these dirs available at top documentroot:
cuccfiles
expofiles
loser
tunneldata
troggle
expoweb
boc/boe
config
containing:
setup apache configs for cucc and expo
#disable default website
a2dissite 000-default
a2ensite cucc
a2ensite expo
a2enmod cgid
Boe config:
Alias /boe /home/expo/boe/boc/boc.pl
<Directory /home/expo/boe/boc>
AddHandler cgi-script .pl
SetHandler cgi-script
Options +ExecCGI
Require all granted
</Directory>
And remember to set both program and data dir to be
www-data:www-data
(optionally make file group read/write by treasurer account)
create empty repo by clicking create in boe interface
then set names in 'settings'
Set up mysql (as root)
mysql -p
CREATE DATABASE troggle;
GRANT ALL PRIVILEGES ON troggle.* TO 'expo'@'localhost' IDENTIFIED BY 'somepassword';
Ctrl-D to exit
somepassword is set in localsettings.py
sudo service mariadb stop
sudo service mariadb start
to delete the database, it is
DROP DATABASE troggle;
install django:
sudo apt install python-django python-django-registration python-django-imagekit python-django-tinymce fonts-freefont-ttf libapache2-mod-wsgi
python-django-imagekit comes from https://salsa.debian.org/python-team/modules/python-django-imagekit
python-django-tinymce comes from https://salsa.debian.org/python-team/modules/python-django-tinymce
need fonts-freefont-ttf (to have truetype freesans available for troggle via PIL)
need libapache2-mod-wsgi for apache wsgi support.
On stretch the django 1.10 is no use so get rid of that:
apt remove python3-django python-django python-django-common python-django-doc
Then replace with django 1.7 (Needs to be built for stretch)
apt install python-django python-django-common python-django-doc
apt install python-django-registration python-django-imagekit python-django-tinymce
then hold them to stop them being upgraded by unattended upgrades:
echo "python-django hold" | sudo dpkg --set-selections
echo "python-django-common hold" | sudo dpkg --set-selections
echo "python-django-doc hold" | sudo dpkg --set-selections
Optimizing server
I've tweaked the apache and mysql settings to make them a bit more suitable for a small machine. Seems to have shaved 200MB or so off the idling footprint.
https://www.narga.net/optimizing-apachephpmysql-low-memory-server/
(just discovered 'ab' for running apache performance tests - handy).
Do the edit to site-packages/django/db/backends/base.py
to comment out the requirement for mysqlclient >1.3.13
as we run perfectly happily with Django 2.2.19 & mysqlclient 1.3.10
:
version = Database.version_info
#test nobbled by Wookey 2021-04-08 as 1.3.13 is not available on stable
#if version < (1, 3, 13):
# raise ImproperlyConfigured('mysqlclient 1.3.13 or newer is required; you have %s.' % Database.__version__)
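Before editing site-packages it is worth confirming which mysqlclient the installation actually sees; this reads the same attributes that the nobbled Django check inspects:

import MySQLdb  # the module provided by the mysqlclient package

print(MySQLdb.version_info)  # e.g. (1, 3, 10, 'final', 0) on older Debian
print(MySQLdb.__version__)   # if this is >= 1.3.13 the edit above is unnecessary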


@@ -1,5 +1,8 @@
# Running troggle on Docker
These notes written by Sam Wenham in Feb., 2019.
These all pre-date the move to python3, later versions of Django (1.11.+) and debian.
## Install
First you need to install
- [docker-ce](https://docs.docker.com/install/)


@@ -1,5 +1,6 @@
import sys
# link localsettings to this file for use on expo computer in austria
# This is the local settings for use with the docker compose dev setup. It is imported automatically
DATABASES = {
'default': {
@@ -13,7 +14,7 @@ DATABASES = {
}
EXPOUSER = 'expo'
EXPOUSERPASS = 'somepasshere'
EXPOUSERPASS = "nnn:gggggg"
EXPOUSER_EMAIL = 'wookey@wookware.org'
REPOS_ROOT_PATH = '/expo/'
@@ -23,7 +24,7 @@ sys.path.append(REPOS_ROOT_PATH + 'troggle')
PUBLIC_SITE = False
SURVEX_DATA = REPOS_ROOT_PATH + 'loser/'
TUNNEL_DATA = REPOS_ROOT_PATH + 'tunneldata/'
DRAWINGS_DATA = REPOS_ROOT_PATH + 'drawings/'
CAVERN = 'cavern'
THREEDTOPOS = '3dtopos'
@@ -47,13 +48,13 @@ MEDIA_URL = URL_ROOT + DIR_ROOT + 'site_media/'
MEDIA_ROOT = REPOS_ROOT_PATH + '/troggle/media/'
MEDIA_ADMIN_DIR = '/usr/lib/python2.7/site-packages/django/contrib/admin/media/'
STATIC_URL = URL_ROOT
STATIC_ROOT = DIR_ROOT
STATIC_URL = "/static/"
STATIC_ROOT = "/expo/static"
JSLIB_URL = URL_ROOT + 'javascript/'
TINY_MCE_MEDIA_ROOT = '/usr/share/tinymce/www/'
TINY_MCE_MEDIA_URL = URL_ROOT + DIR_ROOT + '/tinymce_media/'
TINY_MCE_MEDIA_ROOT = STATIC_ROOT + '/tiny_mce/'
TINY_MCE_MEDIA_URL = STATIC_ROOT + '/tiny_mce/'
TEMPLATE_DIRS = (
PYTHON_PATH + "templates",


@@ -0,0 +1,9 @@
Django==1.7.11
django-registration==2.1.2
mysql
#imagekit
django-imagekit
Image
django-tinymce==2.7.0
smartencoding
unidecode


@@ -6,3 +6,4 @@ django-imagekit
Image
django-tinymce==2.7.0
smartencoding
unidecode

_deploy/readme.txt

@@ -0,0 +1,18 @@
2023-07-17 Philip Sargent
Trying to sort out configurations, as we got into a bit of a mess on
Expo in the last couple of weeks with two (notionally identical Debian
Bullseye) expo laptops: Crowley (which has troggle installed locally and
can run it) and Aziraphale (which has a local copy of the troggle repo
but is not configured to run it), plus Martin Green's laptop (Ubuntu
22.04.2) and Philip's Barbie laptop (Ubuntu 22.04.3). And of course the
server itself, expo.survex.com, which is running Debian Bullseye. But
most recent development has been done on Philip's two other machines, a
desktop and a PC, both running Ubuntu on WSL on Windows and both using
venv environments, as Crowley also does.
- settings.py is common to all configurations,
but these are all different:
- localsettings.py
- requirements.txt


@@ -1,6 +1,11 @@
import sys
# link localsettings to this file for use on expo computer in austria
# This will ALL NEED TO BE CHANGED to match localsettingsWSL / python3 / Django v2.2
# This WILL NOT WORK as it is for an earlier version of Django
# consult localsettingsWSL for updates required.
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
@@ -13,7 +18,7 @@ DATABASES = {
}
EXPOUSER = 'expo'
EXPOUSERPASS = 'realpasshere'
EXPOUSERPASS = "nnn:gggggg"
EXPOUSER_EMAIL = 'wookey@wookware.org'
REPOS_ROOT_PATH = '/home/expo/expofiles/'
@@ -24,7 +29,7 @@ sys.path.append(REPOS_ROOT_PATH + 'troggle')
PUBLIC_SITE = False
SURVEX_DATA = REPOS_ROOT_PATH + 'loser/'
TUNNEL_DATA = REPOS_ROOT_PATH + 'tunneldata/'
DRAWINGS_DATA = REPOS_ROOT_PATH + 'drawings/'
THREEDCACHEDIR = REPOS_ROOT_PATH + 'expowebcache/3d/'
CAVERN = 'cavern'


@@ -0,0 +1,3 @@
The copy in this /_deploy/ folder may not be the latest if active development
has been going on in the parent folder. Check there for a later copy of
the localsettingsWSL file.


@@ -0,0 +1,188 @@
import os
import sys
from pathlib import Path
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
Subsystem for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem, or /javascript/ can be in
a system-wide location rather than just a local directory.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
Read https://realpython.com/python-pathlib/
Read https://adamj.eu/tech/2020/03/16/use-pathlib-in-your-django-project/
"""
print(" * importing troggle/localsettings.py")
# -----------------------------------------------------------------
# THINK before you push this to a repo
# - have you checked that credentials.py is in .gitignore ?
# - we don't want to have to change the expo system password !
# -----------------------------------------------------------------
# default values, real secrets imported from credentials.py
SECRET_KEY = "real-SECRET_KEY--imported-from-localsettings.py"
EXPOUSERPASS = "nnn:gggggg - real-expo-password---imported-from-localsettings.py"
EXPOADMINUSERPASS = "gggggg:nnn - real-expo-password---imported-from-localsettings.py"
EMAIL_HOST_PASSWORD = "real-email-password---imported-from-localsettings.py"
EXPOFILESREMOTE = False # if True, then re-routes urls in expofiles to remote server. Tests are then less accurate.
# SECURE_SSL_REDIRECT = True # breaks 7 tests in test suite 301 not 200 (or 302) and runserver fails completely
SERVERPORT = "8000" # not needed
PV = "python" + str(sys.version_info.major) + "." + str(sys.version_info.minor)
# Troggle does a lot of file-handling. This is very error-prone when using primitive methods,
# so we use pathlib which has been standard since python 3.4
# If pathlib is new to you, you will need to read https://realpython.com/python-pathlib/
# --------------------- MEDIA redirections BEGIN ---------------------
REPOS_ROOT_PATH = Path(__file__).parent.parent
LIBDIR = REPOS_ROOT_PATH / "lib" / PV
TROGGLE_PATH = Path(__file__).parent
TEMPLATE_PATH = TROGGLE_PATH / "templates"
MEDIA_ROOT = TROGGLE_PATH / "media"
JSLIB_ROOT = TROGGLE_PATH / "media" / "jslib" # used for CaveViewer JS utility
EXPOFILES = REPOS_ROOT_PATH / "expofiles"
SCANS_ROOT = EXPOFILES / "surveyscans"
# PHOTOS_ROOT = EXPOFILES / 'photos'
PHOTOS_ROOT = Path("/mnt/d/EXPO/PHOTOS")
PHOTOS_YEAR = "2023"
NOTABLECAVESHREFS = ["290", "291", "264", "258", "204", "359", "76", "107"]
# PYTHON_PATH = os.fspath(PYTHON_PATH)
PYTHON_PATH = REPOS_ROOT_PATH / "troggle"
LOGFILE = PYTHON_PATH / "troggle.log"
SQLITEDB = PYTHON_PATH / "troggle.sqlite"
KMZ_ICONS_PATH = PYTHON_PATH / "kmz_icons"
# URL that handles the media served from MEDIA_ROOT. Make sure to use a
# trailing slash if there is a path component (optional in other cases).
MEDIA_URL = "/site-media/"
DIR_ROOT = Path("") # this should end in / if a value is given
URL_ROOT = "/"
# URL_ROOT = 'http://localhost:'+ SERVERPORT +'/'
# Note that these constants are not actually used in urls.py, they should be..
# and they all need to end with / so using 'Path' doesn't work..
MEDIA_URL = Path(URL_ROOT, "/site_media/")
PHOTOS_URL = Path(URL_ROOT, "/photos/")
STATIC_URL = Path(URL_ROOT, "/static/") # used by Django admin pages. Do not delete.
JSLIB_URL = Path(URL_ROOT, "/javascript/") # used for CaveViewer JS utility
# STATIC_ROOT removed after merging content into MEDIA_ROOT. See urls.py & core/views/surveys.py
# --------------------- MEDIA redirections END ---------------------
PUBLIC_SITE = True
DEBUG = True # Always keep this True, even when on public server. Otherwise NO USEFUL ERROR MESSAGES !
CACHEDPAGES = True # experimental page cache for a handful of page types
# executables:
CAVERN = "cavern" # for parsing .svx files and producing .3d files
SURVEXPORT = "survexport" # for parsing .3d files and producing .pos files
DBSQLITE = {
"default": {
"ENGINE": "django.db.backends.sqlite3", # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
#'NAME' : 'troggle.sqlite',
"NAME": str(SQLITEDB),
"USER": "expo", # Not used with sqlite3.
"PASSWORD": "sekrit", # Not used with sqlite3.
"HOST": "", # Set to empty string for localhost. Not used with sqlite3.
"PORT": "", # Set to empty string for default. Not used with sqlite3.
}
}
DBMARIADB = {
"default": {
"ENGINE": "django.db.backends.mysql", # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
"NAME": "troggle", # Or path to database file if using sqlite3.
"USER": "expo",
"PASSWORD": "my-secret-password-schwatzmooskogel",
"HOST": "", # Set to empty string for localhost. Not used with sqlite3.
"PORT": "", # Set to empty string for default. Not used with sqlite3.
}
}
# default database for me is sqlite
DBSWITCH = "sqlite"
if DBSWITCH == "sqlite":
DATABASES = DBSQLITE
if DBSWITCH == "mariadb":
DATABASES = DBMARIADB
TEMPLATES = [
{
"BACKEND": "django.template.backends.django.DjangoTemplates",
"DIRS": [TEMPLATE_PATH],
"OPTIONS": {
"debug": "DEBUG",
"context_processors": [
# django.template.context_processors.csrf, # is always enabled and cannot be removed, sets csrf_token
"django.contrib.auth.context_processors.auth", # knowledge of logged-on user & permissions
"core.context.troggle_context", # in core/troggle.py - only used in expedition.html
"django.template.context_processors.debug",
"django.template.context_processors.i18n",
"django.template.context_processors.media", # includes a variable MEDIA_URL
"django.template.context_processors.static", # includes a variable STATIC_URL used by admin pages
"django.template.context_processors.tz",
"django.template.context_processors.request", # must be enabled in DjangoTemplates (TEMPLATES) in order to use the admin navigation sidebar.
"django.contrib.messages.context_processors.messages",
],
"loaders": [
"django.template.loaders.filesystem.Loader", # default location is troggle/templates/
"django.template.loaders.app_directories.Loader", # needed for admin 'app'
],
},
},
]
EXPOUSER = "expo"
EXPOUSER_EMAIL = "philip.sargent@gmail.com"
EXPOADMINUSER = "expoadmin"
EXPOADMINUSER_EMAIL = "philip.sargent@gmail.com"
EMAIL_HOST = "smtp-auth.mythic-beasts.com"
EMAIL_HOST_USER = "django-test@klebos.net" # Philip Sargent really
EMAIL_PORT = 587
EMAIL_USE_TLS = True
DEFAULT_FROM_EMAIL = "django-test@klebos.net"
SURVEX_DATA = REPOS_ROOT_PATH / "loser"
DRAWINGS_DATA = REPOS_ROOT_PATH / "drawings"
EXPOWEB = REPOS_ROOT_PATH / "expoweb"
CAVEDESCRIPTIONS = EXPOWEB / "cave_data"
ENTRANCEDESCRIPTIONS = EXPOWEB / "entrance_data"
EXPOWEB_URL = ""
# SCANS_URL = '/survey_scans/' # defunct, removed.
sys.path.append(str(REPOS_ROOT_PATH))
sys.path.append(str(PYTHON_PATH))
# Sanitise these to be strings as Django seems to be particularly sensitive to crashing if they aren't
# and we have not made the change to pathlib Path type in the other localsettings-* variants yet.
CAVEDESCRIPTIONS = os.fspath(CAVEDESCRIPTIONS)
ENTRANCEDESCRIPTIONS = os.fspath(ENTRANCEDESCRIPTIONS)
STATIC_URL = str(STATIC_URL) + "/"
MEDIA_URL = str(MEDIA_URL) + "/"
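The trailing-slash problem mentioned above ("they all need to end with / so using 'Path' doesn't work") is easy to demonstrate, and is why this file ends by converting back to str and re-appending the slash (on Linux/WSL):

from pathlib import Path

STATIC_URL = Path("/", "/static/")
print(STATIC_URL)             # /static   <- the trailing slash has been normalised away
print(str(STATIC_URL) + "/")  # /static/  <- the form Django's *_URL settings expect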


@@ -0,0 +1,12 @@
asgiref==3.3.4
confusable-homoglyphs==3.2.0
coverage==5.5
Django==3.2.12
docutils==0.14
gunicorn==20.1.0
Pillow==9.0.1
pytz==2019.1
reportlab==3.6.8
sqlparse==0.2.4
typing-extensions==3.7.4.3
Unidecode==1.0.23


@@ -0,0 +1,20 @@
# Philip bleeding edge config
asgiref==3.6.0
beautifulsoup4==4.12.2
black==23.1.0
click==8.1.3
coverage==7.1.0
Django==4.2
docutils==0.19
isort==5.12.0
mypy-extensions==1.0.0
packaging==23.0
pathspec==0.11.0
Pillow==9.4.0
platformdirs==3.0.0
pytz==2022.7
ruff==0.0.245
soupsieve==2.4.1
sqlparse==0.4.3
Unidecode==1.3.6
piexif==1.1.3


@@ -0,0 +1,17 @@
asgiref==3.6.0
black==23.1.0
click==8.1.3
coverage==7.1.0
Django==4.2
docutils==0.19
isort==5.12.0
mypy-extensions==1.0.0
packaging==23.0
pathspec==0.11.0
Pillow==9.4.0
platformdirs==3.0.0
pytz==2022.7
ruff==0.0.245
sqlparse==0.4.3
tomli==2.0.1
Unidecode==1.3.6


@@ -0,0 +1,9 @@
asgiref==3.5.2
coverage==6.5.0
Django==3.2.16
docutils==0.19
Pillow==9.3.0
pytz==2022.6
sqlparse==0.4.3
typing_extensions==4.4.0
Unidecode==1.3.6


@@ -0,0 +1,16 @@
asgiref==3.6.0
black==23.1.0
click==8.1.3
coverage==7.1.0
Django==4.2
docutils==0.19
isort==5.12.0
mypy-extensions==1.0.0
packaging==23.0
pathspec==0.11.0
Pillow==9.4.0
platformdirs==3.0.0
pytz==2022.7
ruff==0.0.245
sqlparse==0.4.3
Unidecode==1.3.6


@@ -0,0 +1,9 @@
asgiref==3.5.2
coverage==6.5.0
Django==3.2.16
docutils==0.19
Pillow==9.3.0
pytz==2022.6
sqlparse==0.4.3
typing_extensions==4.4.0
Unidecode==1.3.6


@@ -0,0 +1,21 @@
asgiref==3.7.0
beautifulsoup4==4.12.0
black==23.3.0
click==8.1.3
coverage==7.2.0
Django==4.2
docutils==0.20
isort==5.12.0
mypy-extensions==1.0.0
packaging==23.0
pathspec==0.11.0
Pillow==10.0.0
pkg_resources==0.0.0
platformdirs==3.8.0
pytz==2023.3
ruff==0.0.245
soupsieve==2.4.1
sqlparse==0.4.0
tomli==2.0.1
typing_extensions==4.7.1
Unidecode==1.3.6


@@ -0,0 +1,17 @@
asgiref==3.3.4
confusable-homoglyphs==3.2.0
Django==3.2
docutils==0.14
gunicorn==20.1.0
Pillow==5.4.1
sqlparse==0.2.4
typing-extensions==3.7.4.3
Unidecode==1.0.23
mariadb==1.0.11
mysql-connector-python==8.0.29
mysqlclient==2.1.0
Pillow==9.1.0
pytz==2022.5
asgiref==3.5.0
gunicorn==20.1.0

_deploy/wsl/venv-trog.sh

@@ -0,0 +1,171 @@
#!/bin/bash
# footled lots to make this work with python 3.10 & 3.11 and WSL1 and WSL2 on Ubuntu 22.04
# Run this in a terminal in the troggle directory: 'bash venv-trog.sh'
echo '-- Run this in a terminal in the real troggle directory: "bash venv-trog.sh"'
# Expects an Ubuntu 22.04 (or 20.04) relatively clean install.
# If you have not already installed these on your clean Ubuntu install DO THIS FIRST
# use the script os-trog.sh
# If you are using Debian, then stick with the default version of python
# If you are using Ubuntu, then it is easy to use a later version of python, e.g. 3.11
# NOW we set up troggle
PYTHON=python3.11
VENAME=p11d4 # python3.x and django 4.2
echo "** You are logged in as `id -u -n`"
echo "The 50MB pip cache will be in /home/`id -u -n`/.cache/"
echo "The 150MB venv will be created in /home/`id -u -n`/$VENAME/"
TROGDIR=$(cd $(dirname $0) && pwd)
echo "-- Troggle folder (this script location): ${TROGDIR}"
if [ ! -f requirements.txt ]; then
echo "-- No requirements.txt found. You should be in the /troggle/ folder. Copy it from your most recent installation."
exit 1
fi
echo "## Using requirements.txt :"
cat requirements.txt
echo "##"
$PYTHON --version
# NOTE that when using a later or earlier version of python, you MUST also
# use the allowed version of Pillow, see https://pillow.readthedocs.io/en/latest/installation.html
# NOW set up link from expo user folder
# needed for WSL2
echo Creating links from Linux filesystem user
# These links only need making once, for many venv
cd ~
if [ ! -d $VENAME ]; then
echo "## Creating venv $VENAME. (If this fails with a pip error, you need to ensure you have python3.11-venv installed and/or use a Ubuntu window)"
$PYTHON -m venv $VENAME
else
echo "## /$VENAME/ already exists ! Delete it first."
exit 1
fi
# Activate the virtual env and see what the default packages are
echo "### Activating $VENAME"
cd $VENAME
echo "-- now in: ${PWD}"
source bin/activate
echo "### Activated."
# update local version of pip, more recent than OS version
# debian bullseye installs pip 20.3.4 which barfs, we want >22.0.3
# update local version of setuptools, more recent than OS version, needed for packages without wheels
echo "### installing later version of pip inside $VENAME"
$PYTHON -m pip install --upgrade pip
$PYTHON -m pip install --upgrade setuptools
PIP=pip
$PIP list > original-pip.list
$PIP freeze >original.txt
# we are in /home/$USER/$VENAME/
ln -s ${TROGDIR} troggle
ln -s ${TROGDIR}/../expoweb expoweb
ln -s ${TROGDIR}/../loser loser
ln -s ${TROGDIR}/../drawings drawings
#ln -s ${TROGDIR}/../expofiles expofiles
# fudge for philip's machine
if [ ! -d /mnt/d/EXPO ]; then
sudo mkdir /mnt/d
sudo mount -t drvfs D: /mnt/d
fi
if [ -d ${TROGDIR}/../expofiles ]; then
ln -s ${TROGDIR}/../expofiles expofiles
else
ln -s /mnt/d/EXPO/expofiles expofiles
fi
echo "### Setting file permissions.. may take a while.."
git config --global --add safe.directory '*'
sudo chmod -R 777 *
echo "### links to expoweb, troggle etc. complete:"
ls -tla
echo "###"
echo "### now installing ${TROGDIR}/requirements.txt"
echo "###"
# NOW THERE IS A PERMISSIONS FAILURE THAT DIDN'T HAPPEN BEFORE
# seen on wsl2 as well as wsl1
# which ALSO ruins EXISTING permissions !
# Guessing it is to do with pip not liking non-standard py 3.11 installation on Ubuntu 22.04
$PIP install -r ${TROGDIR}/requirements.txt
echo '### install from requirements.txt completed.'
echo '### '
$PIP freeze > requirements.txt
# so that we can track requirements more easily with git
# because we do not install these with pip, but they are listed by the freeze command
# Now find out what we actually installed by subtracting the stuff venv installed anyway
sort original.txt > 1
sort requirements.txt >2
comm -3 1 2 --check-order | awk '{ print $1}'>fresh-requirements.txt
rm 1
rm 2
cp requirements.txt requirements-$VENAME.txt
cp requirements-$VENAME.txt troggle/requirements-$VENAME.txt
$PIP list > installed-pip.list
$PIP list -o > installed-pip-o.list
REQ=installation-record
mkdir $REQ
mv requirements-$VENAME.txt $REQ
mv original.txt $REQ
mv requirements.txt $REQ
mv original-pip.list $REQ
mv installed-pip.list $REQ
mv installed-pip-o.list $REQ
cp fresh-requirements.txt ../requirements.txt
mv fresh-requirements.txt $REQ
cp troggle/`basename "$0"` $REQ
$PYTHON --version
python --version
echo "Django version:`django-admin --version`"
echo "### Now do
'[sudo service mysql start]'
'[sudo service mariadb restart]'
'[sudo mysql_secure_installation]'
'cd ~/$VENAME'
'source bin/activate'
'cd troggle'
'django-admin'
'python manage.py check'
## this tests if you have set up ssh correctly. Refer to documentation https://expo.survex.com/handbook/computing/keyexchange.html
## you need to follow the Linux instructions.
'ssh expo@expo.survex.com'
## the next tests will fail unless ~/expofiles is set correctly to a folder on your machine
## the tests may ALSO fail because of ssh and permissions errors
## So you will need to run
# sudo chown -Rhv philip:philip ~/$VENAME (if your username is philip)
# and then REBOOT (or at least, exit WSL and terminate and restart WSL)
# because this chown only takes effect then.
'python manage.py test -v 2'
'./pre-run.sh' (runs the tests again)
'python databaseReset.py reset $VENAME'
'python manage.py runserver 0.0.0.0:8000 (and allow access when the firewall window pops up)'
"
if [ ! -d /mnt/d/expofiles ]; then
echo '### No valid expofiles directory on /mnt/d . Fix this before any tests will work.'
fi


@@ -0,0 +1,181 @@
import sys
from pathlib import Path
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
Subsystem for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem, or /javascript/ can be in
a system-wide location rather than just a local directory.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
Read https://realpython.com/python-pathlib/
Read https://adamj.eu/tech/2020/03/16/use-pathlib-in-your-django-project/
"""
print(" * importing troggle/localsettings.py")
#-----------------------------------------------------------------
# THINK before you push this to a repo
# - have you checked that credentials.py is in .gitignore ?
# - we don't want to have to change the expo system password !
#-----------------------------------------------------------------
# default values, real secrets imported from credentials.py
SECRET_KEY = "real-SECRET_KEY--imported-from-localsettings.py"
EXPOUSERPASS = "nnn:gggggg - real-expo-password---imported-from-localsettings.py"
EXPOADMINUSERPASS = "gggggg:nnn - real-expo-password---imported-from-localsettings.py"
EMAIL_HOST_PASSWORD = "real-email-password---imported-from-localsettings.py"
EXPOFILESREMOTE = False # if True, then re-routes urls in expofiles to remote server. Tests are then less accurate.
#SECURE_SSL_REDIRECT = True # breaks 7 tests in test suite 301 not 200 (or 302) and runserver fails completely
SERVERPORT = '8000' # not needed
PV = "python" + str(sys.version_info.major) + "." + str(sys.version_info.minor)
# Troggle does a lot of file-handling. This is very error-prone when using primitive methods,
# so we use pathlib which has been standard since python 3.4
# If pathlib is new to you, you will need to read https://realpython.com/python-pathlib/
# --------------------- MEDIA redirections BEGIN ---------------------
REPOS_ROOT_PATH = Path(__file__).parent.parent
LIBDIR = REPOS_ROOT_PATH / 'lib' / PV
TROGGLE_PATH = Path(__file__).parent
TEMPLATE_PATH = TROGGLE_PATH / 'templates'
MEDIA_ROOT = TROGGLE_PATH / 'media'
JSLIB_ROOT = TROGGLE_PATH / 'media' / 'jslib' # used for CaveViewer JS utility
# EXPOFILES = REPOS_ROOT_PATH / "expofiles"
EXPOFILES = Path('/media/philip/sd-huge1/cucc-expo/expofiles/')
SCANS_ROOT = EXPOFILES / 'surveyscans'
PHOTOS_ROOT = EXPOFILES / 'photos'
PHOTOS_YEAR = "2023"
NOTABLECAVESHREFS = ["290", "291", "264", "258", "204", "359", "76", "107"]
# PYTHON_PATH = os.fspath(PYTHON_PATH)
PYTHON_PATH = REPOS_ROOT_PATH / "troggle"
LOGFILE = PYTHON_PATH / "troggle.log"
# URL that handles the media served from MEDIA_ROOT. Make sure to use a
# trailing slash if there is a path component (optional in other cases).
MEDIA_URL = '/site-media/'
DIR_ROOT = Path("") #this should end in / if a value is given
URL_ROOT = '/'
# URL_ROOT = 'http://localhost:'+ SERVERPORT +'/'
# Note that these constants are not actually used in urls.py, they should be..
# and they all need to end with / so using 'Path' doesn't work..
MEDIA_URL = Path(URL_ROOT, "/site_media/")
PHOTOS_URL = Path(URL_ROOT, "/photos/")
STATIC_URL = Path(URL_ROOT, "/static/") # used by Django admin pages. Do not delete.
JSLIB_URL = Path(URL_ROOT, "/javascript/") # used for CaveViewer JS utility
# STATIC_ROOT removed after merging content into MEDIA_ROOT. See urls.py & core/views/surveys.py
# --------------------- MEDIA redirections END ---------------------
PUBLIC_SITE = True
DEBUG = True # Always keep this True, even when on public server. Otherwise NO USEFUL ERROR MESSAGES !
CACHEDPAGES = True # experimental page cache for a handful of page types
# executables:
CAVERN = 'cavern' # for parsing .svx files and producing .3d files
SURVEXPORT = 'survexport' # for parsing .3d files and producing .pos files
DBSQLITE = {
'default': {
'ENGINE': 'django.db.backends.sqlite3', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle.sqlite',
'USER' : 'expo', # Not used with sqlite3.
'PASSWORD' : 'sekrit', # Not used with sqlite3.
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
DBMARIADB = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle', # Or path to database file if using sqlite3.
'USER' : 'expo',
'PASSWORD' : 'my-secret-password-schwatzmooskogel',
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
# default database for me is sqlite
DBSWITCH = "sqlite"
if DBSWITCH == "sqlite":
DATABASES = DBSQLITE
if DBSWITCH == "mariadb":
DATABASES = DBMARIADB
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
"DIRS": [TEMPLATE_PATH],
'OPTIONS': {
'debug': 'DEBUG',
'context_processors': [
# django.template.context_processors.csrf, # is always enabled and cannot be removed, sets csrf_token
'django.contrib.auth.context_processors.auth', # knowledge of logged-on user & permissions
'core.context.troggle_context', # in core/troggle.py - only used in expedition.html
'django.template.context_processors.debug',
'django.template.context_processors.i18n',
'django.template.context_processors.media', # includes a variable MEDIA_URL
'django.template.context_processors.static', # includes a variable STATIC_URL used by admin pages
'django.template.context_processors.tz',
'django.template.context_processors.request', # must be enabled in DjangoTemplates (TEMPLATES) in order to use the admin navigation sidebar.
'django.contrib.messages.context_processors.messages',
],
'loaders': [
'django.template.loaders.filesystem.Loader', # default location is troggle/templates/
'django.template.loaders.app_directories.Loader', # needed for admin 'app'
],
},
},
]
EXPOUSER = "expo"
EXPOUSER_EMAIL = "philip.sargent@gmail.com"
EXPOADMINUSER = "expoadmin"
EXPOADMINUSER_EMAIL = "philip.sargent@gmail.com"
EMAIL_HOST = "smtp-auth.mythic-beasts.com"
EMAIL_HOST_USER = "django-test@klebos.net" # Philip Sargent really
EMAIL_PORT = 587
EMAIL_USE_TLS = True
DEFAULT_FROM_EMAIL = "django-test@klebos.net"
SURVEX_DATA = REPOS_ROOT_PATH / "loser"
DRAWINGS_DATA = REPOS_ROOT_PATH / "drawings"
EXPOWEB = REPOS_ROOT_PATH / "expoweb"
CAVEDESCRIPTIONS = EXPOWEB / "cave_data"
ENTRANCEDESCRIPTIONS = EXPOWEB / "entrance_data"
EXPOWEB_URL = ''
# SCANS_URL = '/survey_scans/' # defunct, removed.
sys.path.append(str(REPOS_ROOT_PATH))
sys.path.append(str(PYTHON_PATH))
# Sanitise these to be strings as all other code is expecting strings
# and we have not made the change to pathlib Path type in the other localsettings-* variants yet.
#CAVEDESCRIPTIONS = os.fspath(CAVEDESCRIPTIONS)
#ENTRANCEDESCRIPTIONS = os.fspath(ENTRANCEDESCRIPTIONS)
STATIC_URL = str(STATIC_URL) + "/"
MEDIA_URL = str(MEDIA_URL) + "/"


@@ -0,0 +1,196 @@
import sys
import os
import urllib.parse
from pathlib import Path
"""Settings for a troggle installation which may vary among different
installations: for development or deployment, in a docker image or
python virtual environment (venv), on ubuntu, debian or in Windows
System for Linux (WSL), on the main server or in the potato hut,
using SQLite or mariaDB.
It sets the directory locations for the major parts of the system so
that e.g. expofiles can be on a different filesystem, or /javascript/ can be in
a system-wide location rather than just a local directory.
This file is included at the end of the main troggle/settings.py file so that
it overwrites defaults in that file.
Read https://realpython.com/python-pathlib/
Read https://adamj.eu/tech/2020/03/16/use-pathlib-in-your-django-project/
"""
print(" * importing troggle/localsettings.py")
#-----------------------------------------------------------------
# THINK before you push this to a repo
# - have you checked that credentials.py is in .gitignore ?
# - we don't want to have to change the expo system password !
#-----------------------------------------------------------------
# default values, real secrets imported from credentials.py
SECRET_KEY = "real-SECRET_KEY--imported-from-localsettings.py"
EXPOUSERPASS = "nnn:gggggg - real-expo-password---imported-from-localsettings.py"
EXPOADMINUSERPASS = "gggggg:nnn - real-expo-password---imported-from-localsettings.py"
EMAIL_HOST_PASSWORD = "real-email-password---imported-from-localsettings.py"
EXPOFILESREMOTE = False # if True, then re-routes urls in expofiles to the remote server. Tests are then less accurate.
#SECURE_SSL_REDIRECT = True # breaks 7 tests in test suite 301 not 200 (or 302) and runserver fails completely
SERVERPORT = '8000' # not needed
PV = "python" + str(sys.version_info.major) + "." + str(sys.version_info.minor)
# Troggle does a lot of file-handling. This is very error-prone when using primitive methods,
# so we use pathlib which has been standard since python 3.4
# If pathlib is new to you, you will need to read https://realpython.com/python-pathlib/
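# (A hedged illustration of the pathlib style used throughout this file:
#  p = Path("/mnt/d") / "expofiles" / "photos" builds a path with "/", and
#  str(p) or os.fspath(p) recovers the plain string when older code needs one.)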
# --------------------- MEDIA redirections BEGIN ---------------------
REPOS_ROOT_PATH = Path(__file__).parent.parent
LIBDIR = REPOS_ROOT_PATH / 'lib' / PV
#LIBDIR = REPOS_ROOT_PATH / 'lib' / 'python3.9' # should be finding this automatically: python --version etc.
TROGGLE_PATH = Path(__file__).parent
TEMPLATE_PATH = TROGGLE_PATH / 'templates'
MEDIA_ROOT = TROGGLE_PATH / 'media'
JSLIB_ROOT = TROGGLE_PATH / 'media' / 'jslib' # used for CaveViewer JS utility
#FILES = Path('/mnt/d/expofiles/')
EXPOFILES = Path('/media/philip/sd-huge1/cucc-expo/expofiles/')
SCANS_ROOT = EXPOFILES / 'surveyscans'
PHOTOS_ROOT = EXPOFILES / 'photos'
PHOTOS_YEAR = "2022"
# URL that handles the media served from MEDIA_ROOT. Make sure to use a
# trailing slash if there is a path component (optional in other cases).
MEDIA_URL = '/site-media/'
DIR_ROOT = '' # this should end in / if a value is given
URL_ROOT = '/'
# URL_ROOT = 'http://localhost:'+ SERVERPORT +'/'
#Note that these constants are not actually used in urls.py, they should be..
MEDIA_URL = urllib.parse.urljoin(URL_ROOT , '/site_media/')
SCANS_URL = urllib.parse.urljoin(URL_ROOT , '/survey_scans/')
PHOTOS_URL = urllib.parse.urljoin(URL_ROOT , '/photos/')
SVX_URL = urllib.parse.urljoin(URL_ROOT , '/survex/')
STATIC_URL = urllib.parse.urljoin(URL_ROOT , '/static/') # used by Django admin pages. Do not delete.
JSLIB_URL = urllib.parse.urljoin(URL_ROOT , '/javascript/') # used for CaveViewer JS utility
#STATIC_ROOT removed after merging content into MEDIA_ROOT. See urls.py & core/views/surveys.py
# --------------------- MEDIA redirections END ---------------------
PUBLIC_SITE = True
DEBUG = True # Always keep this True, even when on public server. Otherwise NO USEFUL ERROR MESSAGES !
CACHEDPAGES = True # experimental page cache for a handful of page types
# executables:
CAVERN = 'cavern' # for parsing .svx files and producing .3d files
SURVEXPORT = 'survexport' # for parsing .3d files and producing .pos files
DBSQLITE = {
'default': {
'ENGINE': 'django.db.backends.sqlite3', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle.sqlite',
# 'NAME' : ':memory:',
'USER' : 'expo', # Not used with sqlite3.
'PASSWORD' : 'sekrit', # Not used with sqlite3.
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
DBMARIADB = {
'default': {
'ENGINE': 'django.db.backends.mysql', # 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
'NAME' : 'troggle', # Or path to database file if using sqlite3.
'USER' : 'expo',
'PASSWORD' : 'my-secret-password-schwatzmooskogel',
'HOST' : '', # Set to empty string for localhost. Not used with sqlite3.
'PORT' : '', # Set to empty string for default. Not used with sqlite3.
}
}
# default database for me is sqlite
DBSWITCH = "sqlite"
if DBSWITCH == "sqlite":
DATABASES = DBSQLITE
if DBSWITCH == "mariadb":
DATABASES = DBMARIADB
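# (Hedged note: an equivalent one-liner would be
#  DATABASES = {"sqlite": DBSQLITE, "mariadb": DBMARIADB}[DBSWITCH]
#  but the explicit if-statements above make it easy to comment a branch out.)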
NOTABLECAVESHREFS = [ "290", "291", "359", "264", "258", "204", "76", "107"]
PYTHON_PATH = REPOS_ROOT_PATH / 'troggle'
sys.path.append(os.fspath(REPOS_ROOT_PATH))
sys.path.append(os.fspath(PYTHON_PATH))
LOGFILE = PYTHON_PATH / 'troggle.log'
PYTHON_PATH = os.fspath(PYTHON_PATH)
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [
TEMPLATE_PATH
],
'OPTIONS': {
'debug': DEBUG,
'context_processors': [
# django.template.context_processors.csrf, # is always enabled and cannot be removed, sets csrf_token
'django.contrib.auth.context_processors.auth', # knowledge of logged-on user & permissions
'core.context.troggle_context', # in core/troggle.py - only used in expedition.html
'django.template.context_processors.debug',
'django.template.context_processors.i18n',
'django.template.context_processors.media', # includes a variable MEDIA_URL
'django.template.context_processors.static', # includes a variable STATIC_URL used by admin pages
'django.template.context_processors.tz',
'django.template.context_processors.request', # must be enabled in DjangoTemplates (TEMPLATES) in order to use the admin navigation sidebar.
'django.contrib.messages.context_processors.messages',
],
'loaders': [
'django.template.loaders.filesystem.Loader', # default location is troggle/templates/
'django.template.loaders.app_directories.Loader', # needed for admin 'app'
]
},
},
]
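# (Hedged illustration: with the 'media' and 'static' context processors enabled above,
#  a template can write {{ MEDIA_URL }} or {{ STATIC_URL }} directly in an href/src
#  attribute without the view having to pass those values in explicitly.)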
EXPOUSERPASS = "nnn:gggggg - real-expo-password---imported-from-localsettings.py"
EMAIL_HOST_PASSWORD = "real-email-password---imported-from-localsettings.py"
EXPOUSER = 'expo'
EXPOUSER_EMAIL = 'philip.sargent@gmail.com'
EXPOADMINUSER = 'expoadmin'
EXPOADMINUSER_EMAIL = 'philip.sargent@gmail.com'
EMAIL_HOST = "smtp-auth.mythic-beasts.com"
EMAIL_HOST_USER = "django-test@klebos.net" # Philip Sargent really
EMAIL_PORT=587
EMAIL_USE_TLS = True
DEFAULT_FROM_EMAIL = 'django-test@klebos.net'
SURVEX_DATA = REPOS_ROOT_PATH / "loser"
DRAWINGS_DATA = REPOS_ROOT_PATH / "drawings"
EXPOWEB = REPOS_ROOT_PATH / "expoweb"
#SURVEYS = REPOS_ROOT_PATH
CAVEDESCRIPTIONS = EXPOWEB / "cave_data"
ENTRANCEDESCRIPTIONS = EXPOWEB / "entrance_data"
EXPOWEB_URL = ''
# SCANS_URL = '/survey_scans/' # defunct, removed.
# Sanitise these to be strings as all other code is expecting strings
# and we have not made the change to pathlib Path type in the other localsettings-* variants yet.
CAVEDESCRIPTIONS = os.fspath(CAVEDESCRIPTIONS)
ENTRANCEDESCRIPTIONS = os.fspath(ENTRANCEDESCRIPTIONS)
LOGFILE = os.fspath(LOGFILE)
#SURVEYS = os.fspath(SURVEYS)
EXPOWEB = os.fspath(EXPOWEB)
DRAWINGS_DATA = os.fspath(DRAWINGS_DATA)
SURVEX_DATA = os.fspath(SURVEX_DATA)
REPOS_ROOT_PATH = os.fspath(REPOS_ROOT_PATH)
TEMPLATE_PATH = os.fspath(TROGGLE_PATH)
MEDIA_ROOT = os.fspath(MEDIA_ROOT)
JSLIB_ROOT = os.fspath(JSLIB_ROOT)
SCANS_ROOT = os.fspath(SCANS_ROOT)
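# (Hedged note: os.fspath() returns plain str objects unchanged and converts Path
#  objects via __fspath__, so this sanitising block is idempotent if run twice.)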


@@ -0,0 +1,46 @@
#! /bin/sh
# create and sanitise files for pushing to repo, for Babie laptop
echo deprecations.
python -Wall manage.py check -v 3 2>deprecations.txt >/dev/null
echo diffsettings.
rm diffsettings.txt
if test -f "diffsettings.txt"; then
echo "diffsettings.txt not deleted. You have a serious permissions problem. Aborting.."
exit
fi
python manage.py diffsettings | grep "###" > diffsettings.txt
echo pip freeze.
pip freeze > requirements.txt
echo inspectdb.
# this next line requires database setting to be troggle.sqlite:
python manage.py inspectdb > troggle-inspectdb.py
#egrep -in "unable|error" troggle-inspectdb.py
echo remove passwords.
cp localsettings.py localsettingsXubuntu.py
sed -i '/EXPOUSERPASS/ s/^.*$/EXPOUSERPASS = "nnn:gggggg - real-expo-password---imported-from-localsettings.py"/' diffsettings.txt
sed -i '/EXPOUSERPASS/ s/^.*$/EXPOUSERPASS = "nnn:gggggg - real-expo-password---imported-from-localsettings.py"/' localsettingsXubuntu.py
echo " reset: EXPOUSERPASS = \"nnn:gggggg\" - real-expo-password---imported-from-localsettings.py"
sed -i '/EXPOADMINUSERPASS/ s/^.*$/EXPOADMINUSERPASS = "gggggg:nnn - real-expo-password---imported-from-localsettings.py"/' diffsettings.txt
sed -i '/EXPOADMINUSERPASS/ s/^.*$/EXPOADMINUSERPASS = "gggggg:nnn - real-expo-password---imported-from-localsettings.py"/' localsettingsXubuntu.py
echo " reset: EXPOUSERPASS = \"gggggg:nnn\" - real-expo-password---imported-from-localsettings.py"
sed -i '/EMAIL_HOST_PASSWORD/ s/^.*$/EMAIL_HOST_PASSWORD = "real-email-password---imported-from-localsettings.py"/' diffsettings.txt
sed -i '/EMAIL_HOST_PASSWORD/ s/^.*$/EMAIL_HOST_PASSWORD = "real-email-password---imported-from-localsettings.py"/' localsettingsXubuntu.py
echo " reset: EMAIL_HOST_PASSWORD = \"real-email-password--imported-from-localsettings.py\""
sed -i '/SECRET_KEY/ s/^.*$/SECRET_KEY = "real-SECRET_KEY--imported-from-localsettings.py"/' diffsettings.txt
sed -i '/SECRET_KEY/ s/^.*$/SECRET_KEY = "real-SECRET_KEY--imported-from-localsettings.py"/' localsettingsXubuntu.py
echo " reset: SECRET_KEY = \"real-SECRET_KEY--imported-from-localsettings.py\""
mv _deploy/xubuntu/localsettingsXubuntu.py _deploy/xubuntu/localsettingsXubuntu.py.bak
mv localsettingsXubuntu.py _deploy/xubuntu
#
# Do these before final testing, *not* just before pushing:
# in ./pre-run.sh
# python reset-django.py
# python manage.py makemigrations
# python manage.py test
# python manage.py inspectdb > troggle-inspectdb.py
# egrep -i "unable|error" troggle-inspectdb.py
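# (Hedged suggestion, not part of the original script: a final check that no real
# secrets survived the sed substitutions above, e.g.
#   grep -n "PASSWORD\|SECRET_KEY" diffsettings.txt _deploy/xubuntu/localsettingsXubuntu.py
# should now show only the placeholder strings.)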

categories.json (Normal file, 1 line)

File diff suppressed because one or more lines are too long

confusables.json (Normal file, 1 line)

File diff suppressed because one or more lines are too long

core/TESTS/test_caves.py (Normal file, 214 lines)

@@ -0,0 +1,214 @@
"""
Modified for Expo April 2021.
"""
import re
from http import HTTPStatus
from django.test import Client, TestCase
from troggle.core.models.caves import Area, Cave
from troggle.core.models.troggle import Person, PersonExpedition
# import troggle.settings as settings
# FIXTURE_DIRS = settings.PYTHON_PATH / "core" /"fixtures"
class FixtureTests(TestCase):
"""These just hit the database.
They do not exercise the GET and url functions
"""
fixtures = ["auth_users", "expo_areas", "expo_caves", "expo_exped"]
ph = r"and leads in 800m of tortuous going to"
def setUp(self):
pass
def tearDown(self):
pass
def test_fix_person_loaded(self):
p = Person.objects.get(fullname="Michael Sargent")
self.assertEqual(str(p.first_name), "Michael")
def test_fix_personexpedition_loaded(self):
pe = PersonExpedition.objects.get(pk="681")
self.assertEqual(str(pe.person.fullname), "Michael Sargent")
self.assertEqual(str(pe.expedition.year), "2019")
def test_fix_area_loaded(self):
a = Area.objects.get(short_name="1623")
self.assertEqual(str(a.short_name), "1623")
def test_fix_cave_loaded115(self):
c = Cave.objects.get(kataster_number="115")
self.assertEqual(str(c.description_file), "1623/115.htm")
self.assertEqual(str(c.url), "1623/115.url") # intentional
self.assertEqual(str(c.filename), "1623-115.html")
# c.area is a 'ManyRelatedManager' object and not iterable
# self.assertEqual(str(c.[0].short_name), "1623")
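# (Hedged alternative using the related manager's queryset API:
#  self.assertEqual(c.area.all().first().short_name, "1623") )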
ph = self.ph
phmatch = re.search(ph, c.underground_description)
self.assertIsNotNone(phmatch, "In fixture-loaded cave, failed to find expected text: '" + ph + "'")
def test_fix_cave_loaded284(self):
c = Cave.objects.get(kataster_number="284")
self.assertEqual(str(c.description_file), "")
self.assertEqual(str(c.url), "1623/284/284.html")
self.assertEqual(str(c.filename), "1623-284.html")
ph = r"at a depth of 72m, there are large round blocks"
phmatch = re.search(ph, c.notes)
self.assertIsNotNone(phmatch, "In fixture-loaded cave, failed to find expected text: '" + ph + "'")
def test_page_personexpedition(self):
response = self.client.get("/personexpedition/MichaelSargent/2019")
content = response.content.decode()
# with open('testresponse.html','w') as tr:
# tr.writelines(content)
self.assertEqual(response.status_code, HTTPStatus.OK)
for ph in [r"Michael Sargent", r"Table of all trips and surveys aligned by date"]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# Need to add a fixture so that this actually has a logbook entry and a trip/svx in it.
class FixturePageTests(TestCase):
"""Currently nothing that runs troggle works - all do 404. Must be something in a template rendering crash?
ordinary pages are OK, and expopages and expofiles are OK, even though they come through troggle. And the
fixtures are certainly loaded into the db as the other tests show.
"""
# The fixtures have a password hash which is compatible with plain-text password 'secretword'
fixtures = ["auth_users", "expo_areas", "expo_caves", "expo_exped"]
ph = r"and leads in 800m of tortuous going to"
@classmethod
def setUpTestData(cls):
pass
def setUp(self):
from django.contrib.auth.models import User
self.user = User.objects.get(username="expotest")
# Every test needs a client.
self.client = Client()
def tearDown(self):
pass
def test_fix_expedition(self):
response = self.client.get("/expedition/2019")
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"Michael Sargent"
content = response.content.decode()
phmatch = re.search(ph, content)
# with open('exped-op.html', 'w') as f:
# f.write(content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_fix_personexped(self):
response = self.client.get("/personexpedition/MichaelSargent/2019")
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"Table of all trips and surveys aligned by date"
content = response.content.decode()
phmatch = re.search(ph, content)
# with open('persexped-op.html', 'w') as f:
# f.write(content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_fix_person(self):
response = self.client.get("/person/MichaelSargent")
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"second-generation expo caver "
content = response.content.decode()
phmatch = re.search(ph, content)
# with open('person-op.html', 'w') as f:
# f.write(content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_fix_cave_url115(self):
ph = self.ph
response = self.client.get("/1623/115.url") # yes this is intentional, see the inserted data above & fixture
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_fix_cave_url284(self):
response = self.client.get("/1623/284/284.html")
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"at a depth of 72m, there are large round blocks"
content = response.content.decode()
phmatch = re.search(ph, content)
# with open('cave-url284.html', 'w') as f:
# f.write(content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_fix_cave_bare_url115(self):
"""Expect to get Page Not Found and status 404"""
ph = self.ph
ph = "Probably a mistake."
response = self.client.get("/1623/115")
self.assertEqual(response.status_code, HTTPStatus.NOT_FOUND)
content = response.content.decode()
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'") # 200 & Page Not Found
def test_fix_cave_slug115(self):
"""Expect to get Page Not Found and status 404"""
ph = self.ph
ph = "Probably a mistake."
response = self.client.get("/1623-115")
self.assertEqual(response.status_code, HTTPStatus.NOT_FOUND)
content = response.content.decode()
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'") # 200 & Page Not Found
def test_fix_caves284(self):
response = self.client.get("/caves")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"284 Seetrichter"
phmatch = re.search(ph, content)
# with open('_cave_fix_caves.html', 'w') as f:
# f.write(content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# Although the Cave object exists, it looks like we get a bad slug error when trying to get a QM page.
# def test_fix_qms(self):
# response = self.client.get("/cave/qms/1623-284")
# self.assertEqual(response.status_code, HTTPStatus.OK)
# content = response.content.decode()
# ph = r"Question marks for 284 - Seetrichter"
# phmatch = re.search(ph, content)
# with open('_cave-fixqms.html', 'w') as f:
# f.write(content)
# self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# def test_fix_openqms(self):
# response = self.client.get("/cave/openqms/1623-284")
# self.assertEqual(response.status_code, HTTPStatus.OK)
# content = response.content.decode()
# ph = r"Open Leads for 284 - Seetrichter"
# phmatch = re.search(ph, content)
# with open('_cave-fixopenqms.html', 'w') as f:
# f.write(content)
# self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")

core/TESTS/test_imports.py (Normal file, 277 lines)

@@ -0,0 +1,277 @@
"""
We are using unittest for troggle.
Note that the database has not been parsed from the source files when these tests are run,
so any path that relies on data being in the database will fail.
The simple redirections to files which exist, e.g. in
/expoweb/
/expofiles/
/expofiles/documents/
etc. using parameters in localsettings such as PHOTOS_ROOT will test fine.
But paths like this:
/survey_scans/
/caves/
which rely on database resolution will fail unless a fixture has been set up for
them.
https://docs.djangoproject.com/en/dev/topics/testing/tools/
"""
import re
import subprocess
import unittest
from django.test import Client, SimpleTestCase, TestCase
class SimpleTest(SimpleTestCase):
def test_test_setting(self):
from django.conf import settings
self.assertEqual(settings.EMAIL_BACKEND, "django.core.mail.backends.locmem.EmailBackend")
import troggle.settings as settings
def test_import_TroggleModel(self):
from troggle.core.models.troggle import TroggleModel
def test_import_Cave(self):
from troggle.core.models.caves import Cave
def test_import_parsers_surveys(self):
# from PIL import Image
from functools import reduce
from troggle.core.utils import save_carefully
def test_import_parsers_survex(self):
import troggle.core.models.caves as models_caves
import troggle.core.models.survex as models_survex
import troggle.core.models.troggle as models
import troggle.settings as settings
from troggle.core.views import caves, drawings, other, scans, statistics, survex, uploads
from troggle.core.views.caves import cavepage, ent
from troggle.core.views.other import frontpage
from troggle.parsers.people import GetPersonExpeditionNameLookup
def test_import_views_uploads(self):
from troggle.core.views.uploads import dwgupload
def test_import_views_walletedit(self):
from troggle.core.views.wallets_edit import walletedit
def test_import_parsers_QMs(self):
from troggle.core.models.logbooks import QM
def test_import_parsers_people(self):
from html import unescape
from unidecode import unidecode
def test_import_parsers_logbooks(self):
from django.template.defaultfilters import slugify
from django.utils.timezone import get_current_timezone, make_aware
from parsers.people import GetPersonExpeditionNameLookup
from troggle.core.models.logbooks import CaveSlug, QM, LogbookEntry, PersonLogEntry
from troggle.core.models.troggle import DataIssue, Expedition
def test_import_core_views_caves(self):
from django.conf import settings
from django.contrib.auth.decorators import login_required
from django.http import HttpResponse, HttpResponseRedirect
from django.shortcuts import get_object_or_404, render
import troggle.core.views.expo
from troggle.core.forms import CaveAndEntranceFormSet, CaveForm, EntranceForm, EntranceLetterForm
from troggle.core.models.caves import Area, Cave, CaveAndEntrance, Entrance, SurvexStation #EntranceSlug,
from troggle.core.models.troggle import Expedition
from troggle.core.views.auth import login_required_if_public
def test_import_parsers_mix(self):
import troggle.parsers.caves
import troggle.parsers.drawings
import troggle.parsers.logbooks
import troggle.parsers.people
import troggle.parsers.QMs
import troggle.parsers.scans
import troggle.parsers.survex
import troggle.settings
from troggle.parsers.logbooks import GetCaveLookup
def test_import_imports(self):
from django.contrib.auth.models import User
from django.core import management
from django.db import close_old_connections, connection, connections
from django.http import HttpResponse
from django.urls import reverse
def test_import_urls(self):
from django.conf import settings
#from django.conf.urls import include, url
from django.contrib import admin, auth
from django.urls import resolve, reverse
from django.views.generic.base import RedirectView
from django.views.generic.edit import UpdateView
from django.views.generic.list import ListView
from troggle.core.views import caves, other, statistics, survex
from troggle.core.views.auth import expologin, expologout
from troggle.core.views.caves import cavepage, ent
from troggle.core.views.expo import (
editexpopage,
expofiles_redirect,
expofilessingle,
expopage,
map,
mapfile,
mediapage,
)
from troggle.core.views.logbooks import (
Expeditions_jsonListView,
Expeditions_tsvListView,
expedition,
get_logbook_entries,
get_people,
logbookentry,
notablepersons,
person,
personexpedition,
)
from troggle.core.views.other import controlpanel
from troggle.core.views.prospect import prospecting, prospecting_image
from troggle.core.views.statistics import dataissues, pathsreport, stats
from troggle.core.views.survex import survexcavesingle, survexcaveslist, svx
class ImportTest(TestCase):
@classmethod
def setUpTestData(cls):
import troggle.settings as settings
from troggle.parsers.logbooks import LOGBOOKS_DIR, DEFAULT_LOGBOOK_FILE
LOGBOOKS_PATH = settings.EXPOWEB / LOGBOOKS_DIR
test_year = "1986"
cls.test_logbook = LOGBOOKS_PATH / test_year / DEFAULT_LOGBOOK_FILE
def setUp(self):
pass
def tearDown(self):
pass
def test_logbook_exists(self):
self.assertTrue(self.test_logbook.is_file())
class SubprocessTest(TestCase):
@classmethod
def setUpTestData(cls):
pass
def setUp(self):
pass
def tearDown(self):
pass
def test_utf8(self):
"""Expects that utf8 is the default encoding when opening files"""
import locale
import sys
self.assertTrue(
sys.getdefaultencoding() == "utf-8", f"{sys.getdefaultencoding()} - UTF8 error in getdefaultencoding"
)
self.assertTrue(
sys.getfilesystemencoding() == "utf-8",
f"{sys.getfilesystemencoding()} - UTF8 error in getfilesystemencoding",
)
self.assertTrue(
locale.getdefaultlocale()[1] == "UTF-8",
f"{locale.getdefaultlocale()} - UTF8 error in locale.getdefaultlocale",
)
self.assertTrue(
locale.getpreferredencoding() == "UTF-8",
f"{locale.getpreferredencoding()} - UTF8 error in locale.getpreferredencoding",
)
def test_installs(self):
"""Expects external software installed: cavern, survexport, git
(but not whether it actually works)
"""
import troggle.settings as settings
for i in [settings.CAVERN, settings.SURVEXPORT, settings.GIT]:
# Define command as string and then split() into list format
cmd = f"which {i}".split()
try:
sp = subprocess.check_call(cmd, shell=False)
except subprocess.CalledProcessError:
self.assertTrue(False, f"no {i} installed")
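# (Hedged note: shutil.which(i) would do the same PATH lookup in-process,
#  without spawning a subprocess.)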
def test_repos_git_status(self):
"""Expects clean git repos with no added files and no merge failures"""
from pathlib import Path
import troggle.settings as settings
TROGGLE_PATH = Path(settings.REPOS_ROOT_PATH) / "troggle"
for cwd in [settings.SURVEX_DATA, settings.EXPOWEB, settings.DRAWINGS_DATA, TROGGLE_PATH]:
sp = subprocess.run([settings.GIT, "status"], cwd=cwd, capture_output=True, text=True)
out = str(sp.stdout)
if len(out) > 160:
out = out[:75] + "\n <Long output curtailed>\n" + out[-75:]
print(f"git output: {cwd}:\n # {sp.stderr=}\n # sp.stdout={out} \n # return code: {str(sp.returncode)}")
if sp.returncode != 0:
print(f"git output: {cwd}:\n # {sp.stderr=}\n # sp.stdout={out} \n # return code: {str(sp.returncode)}")
self.assertTrue(sp.returncode == 0, f"{cwd} - git is unhappy")
content = sp.stdout
ph = r"nothing to commit, working tree clean"
phmatch = re.search(ph, content)
msg = f'{cwd} - Failed to find expected git output: "{ph}"'
self.assertIsNotNone(phmatch, msg)
# ph1 = r"no changes added to commit"
# phmatch1 = re.search(ph1, content)
# ph2 = r"nothing to commit"
# phmatch2 = re.search(ph2, content)
# phmatch = phmatch1 or phmatch2
# msg = f'{cwd} - Failed to find expected git output: "{ph1}" or "{ph2}"'
# self.assertIsNotNone(phmatch, msg)
def test_loser_survex_status(self):
"""Expects no failures of survex files"""
from pathlib import Path
import troggle.settings as settings
cwd = settings.SURVEX_DATA
for survey in ["1623-and-1626-no-schoenberg-hs.svx"]:
sp = subprocess.run([settings.CAVERN, survey], cwd=cwd, capture_output=True, text=True)
out = str(sp.stdout)
if len(out) > 160:
out = out[:75] + "\n <Long output curtailed>\n" + out[-75:]
# print(f'survex output: {cwd}:\n # {sp.stderr=}\n # sp.stdout={out} \n # return code: {str(sp.returncode)}')
if sp.returncode != 0:
print(
f"survex output: {cwd}:\n # {sp.stderr=}\n # sp.stdout={out} \n # return code: {str(sp.returncode)}"
)
self.assertTrue(sp.returncode == 0, f"{cwd} - survex is unhappy")
content = sp.stdout
ph = r"Total length of survey legs"
phmatch = re.search(ph, content)
msg = f'{cwd} - Failed to find expected survex output: "{ph}"'
self.assertIsNotNone(phmatch, msg)
ph1 = r"Time used"
phmatch1 = re.search(ph1, content)
ph2 = r"vertical length of survey le"
phmatch2 = re.search(ph2, content)
phmatch = phmatch1 or phmatch2
msg = f'{cwd} - Failed to find expected survex output: "{ph1}" or "{ph2}"'
self.assertIsNotNone(phmatch, msg)

core/TESTS/test_logins.py (Normal file, 451 lines)

@@ -0,0 +1,451 @@
"""
Originally written for CUYC
Philip Sargent (Feb.2021)
Modified for Expo April 2021.
"""
import pathlib
import re
from http import HTTPStatus
from django.test import Client, TestCase
import troggle.settings as settings
from troggle.core.models.wallets import Wallet
from troggle.core.models.troggle import Expedition
class DataTests(TestCase):
"""These check that the NULL and NON-UNIQUE constraints are working in the database"""
@classmethod
def setUpTestData(cls):
pass
def setUp(self):
from django.contrib.auth.models import User
u = User()
u.pk = 9000
u.user_id = 8000
u.username, u.password = "stinker", "secretword"
u.email = "philip.sargent+SP@gmail.com"
u.first_name, u.last_name = "Stinker", "Pinker"
u.save()
self.user = u
def tearDown(self):
# self.member.delete() # must delete member before user
# self.user.delete() # horrible crash, why?
pass
class FixturePageTests(TestCase):
# The fixtures have a password hash which is compatible with plain-text password 'secretword'
fixtures = ["auth_users"]
def setUp(self):
from django.contrib.auth.models import User
self.user = User.objects.get(username="expotest")
def tearDown(self):
pass
def test_fix_admin_login_fail(self):
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
logged_in = c.login(username=u.username, password="secretword") # fails to work if password=u.password !
self.assertTrue(logged_in, "FAILED to login as '" + u.username + "'")
response = c.get("/admin/")
content = response.content.decode()
# with open('admin-op.html', 'w') as f:
# f.write(content)
t = re.search(r"Troggle administration", content)
self.assertIsNone(t, "Logged in as '" + u.username + "' (not staff) but still managed to get the Admin page")
class PostTests(TestCase):
"""Tests walletedit form"""
fixtures = ["auth_users"]
@classmethod
def setUpTestData(cls):
pass
def setUp(self):
from django.contrib.auth.models import User
self.user = User.objects.get(username="expotest")
self.client = Client()
testyear = "2022"
wname = f"{testyear}:00"
self.testyear = testyear
w = Wallet()
w.pk = 9100
w.fpath = str(pathlib.Path(settings.SCANS_ROOT, wname))
w.walletname = wname
w.save()
self.wallet = w
e = Expedition()
e.year = testyear
e.save()
self.expedition = e
def test_file_permissions(self):
"""Expect to be allowed to write to SCANS_ROOT, DRAWINGS_DATA, SURVEX_DATA, EXPOWEB
Need to login first.
"""
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
testyear = self.testyear
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
c.login(username=u.username, password="secretword")
for p in [settings.SCANS_ROOT,
settings.DRAWINGS_DATA / "walletjson",
settings.EXPOWEB / "documents",
settings.SURVEX_DATA / "docs"
]:
_test_file_path = pathlib.Path(p, "_created_by_test_suite.txt")
self.assertEqual(_test_file_path.is_file(), False)
with open(_test_file_path, "w") as f:
f.write("test string: can we write to this directory?")
self.assertEqual(_test_file_path.is_file(), True)
_test_file_path.unlink()
def test_scan_upload(self):
"""Expect scan upload to wallet to work on any file
Need to login first.
This upload form looks for the Cave and the Wallet, so the test fails if the database is not loaded with the cave
identified in the wallet
"""
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
testyear = self.testyear
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
c.login(username=u.username, password="secretword")
with open("core/fixtures/test_upload_file.txt", "r") as testf:
response = self.client.post(
f"/walletedit/{testyear}:00", data={"name": "test_upload_file.txt", "uploadfiles": testf}
)
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
# with open("_test_response.html", "w") as f:
# f.write(content)
for ph in [
r"test_upload_",
rf"&larr; {testyear}#00 &rarr;",
r"description written",
r"Plan not required",
r"edit settings or upload a file",
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# Does not use the filename Django actually uses, assumes it is unchanged. Bug: accumulates one file with random name
# added each time it is run. The name of the uploaded file is only available within the code where it happens
remove_file = pathlib.Path(settings.SCANS_ROOT) / f'{testyear}' / f'{testyear}#00'/ 'test_upload_file.txt'
remove_file.unlink()
# Just uploading a file does NOT do any git commit.
# You need to create or edit a contents.json file for that to happen.
def test_photo_upload(self):
"""Expect photo upload to work on any file (contrary to msg on screen)
Upload into current default year. settings.PHOTOS_YEAR
Deletes file afterwards
Need to login first.
"""
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
c.login(username=u.username, password="secretword")
with open("core/fixtures/test_upload_file.txt", "r") as testf:
response = self.client.post(
"/photoupload/", data={"name": "test_upload_file.txt", "renameto": "", "uploadfiles": testf}
)
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
# with open('_test_response.html', 'w') as f:
# f.write(content)
for ph in [
r"test_upload_",
r"Upload photos into /photos/" + str(settings.PHOTOS_YEAR),
r" you can create a new folder in your name",
r"Create new Photographer folder",
r"only photo image files are accepted",
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# Does not use the filename Django actually uses, assumes it is unchanged. Bug: accumulates one file with random name
# added each time it is run. The name of the uploaded file is only available within the code where it happens
remove_file = pathlib.Path(settings.PHOTOS_ROOT, settings.PHOTOS_YEAR) / "test_upload_file.txt"
remove_file.unlink()
def test_photo_upload_rename(self):
"""Expect photo upload to work on any file (contrary to msg on screen)
Upload into current default year. settings.PHOTOS_YEAR
Deletes file afterwards
Need to login first.
"""
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
c.login(username=u.username, password="secretword")
rename = "RENAMED-FILE.JPG"
with open("core/fixtures/test_upload_file.txt", "r") as testf:
response = self.client.post(
"/photoupload/", data={"name": "test_upload_file.txt", "renameto": rename, "uploadfiles": testf}
)
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
# with open('_test_response.html', 'w') as f:
# f.write(content)
for ph in [rename]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# Does not use the filename Django actually uses, assumes it is unchanged. Bug: accumulates one file with random name
# added each time it is run. The name of the uploaded file is only available within the code where it happens
remove_file = pathlib.Path(settings.PHOTOS_ROOT, settings.PHOTOS_YEAR) / rename
remove_file.unlink()
def test_photo_folder_create(self):
"""Create folder for new user
Create in current default year. settings.PHOTOS_YEAR
Deletes folder afterwards
Need to login first.
"""
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
c.login(username=u.username, password="secretword")
response = self.client.post("/photoupload/", data={"photographer": "GussieFinkNottle"})
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
# with open('_test_response.html', 'w') as f:
# f.write(content)
for ph in [r"/GussieFinkNottle/", r"Create new Photographer folder"]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# Does not use the filename Django actually uses, assumes it is unchanged. Bug: accumulates one file with random name
# added each time it is run. The name of the uploaded file is only available within the code where it happens
remove_dir = pathlib.Path(settings.PHOTOS_ROOT, settings.PHOTOS_YEAR) / "GussieFinkNottle"
remove_dir.rmdir()
def test_dwg_upload_txt(self):
"""Expect .pdf file to be refused upload
Need to login first.
"""
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
c.login(username=u.username, password="secretword")
with open("core/fixtures/test_upload_file.pdf", "r") as testf:
response = self.client.post(
"/dwgupload/uploads", data={"name": "test_upload_file.txt", "uploadfiles": testf}
)
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
t = re.search("Files refused:", content)
self.assertIsNotNone(t, 'Logged in but failed to see "Files refused:"')
def test_dwg_upload_drawing(self):
"""Expect no-suffix file to upload
Note that this skips the git commit process. That would need a new test.
Need to login first.
"""
c = self.client
from django.contrib.auth.models import User
u = User.objects.get(username="expotest")
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
c.login(username=u.username, password="secretword")
with open("core/fixtures/test_upload_nosuffix", "r") as testf:
response = self.client.post(
"/dwguploadnogit/uploads", data={"name": "test_upload_nosuffix", "uploadfiles": testf}
)
content = response.content.decode()
# with open('_test_response.html', 'w') as f:
# f.write(content)
self.assertEqual(response.status_code, HTTPStatus.OK)
for ph in [
r"test_upload_nosuffix",
r"You cannot create folders here",
r"Creating a folder is done by a nerd",
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(
phmatch, "Expect no-suffix file to upload OK. Failed to find expected text: '" + ph + "'"
)
# Does not use the filename Django actually uses, assumes it is unchanged. Bug: accumulates one file with random name
# added each time it is run. The name of the uploaded file is only available within the code where it happens
# UploadedFile.name see https://docs.djangoproject.com/en/4.1/ref/files/uploads/#django.core.files.uploadedfile.UploadedFile
remove_file = pathlib.Path(settings.DRAWINGS_DATA) / "uploads" / "test_upload_nosuffix"
remove_file.unlink()
class ComplexLoginTests(TestCase):
"""These test the login and capabilities of logged-in users, they do not use fixtures"""
def setUp(self):
"""setUp runs once for each test in this class"""
from django.contrib.auth.models import User
u = User()
u.pk = 9000
u.user_id = 8000
u.username, u.password = "expotest", "secretword"
u.email = "philip.sargent+ET@gmail.com"
u.first_name, u.last_name = "ExpoTest", "Caver"
u.is_staff = True
u.is_superuser = True
u.set_password(u.password) # This creates a new salt and thus a new key for EACH test
u.save() # vital that we save all this before attempting login
# print ('\n',u.password)
self.user = u
def tearDown(self):
self.client.logout() # not needed as each test creates a new self.client
# self.member.delete()
##self.user.delete() # id attribute set to None !
pass
# def test_login_redirect_for_non_logged_on_user(self): # need to fix this in real system
# c = self.client
# # Need to login first. Tests that we are redirected to login page if not logged in
# response = c.get('noinfo/cave-number-index')
# self.assertRedirects(response, "/login/?next=/committee/appointments/")
def test_ordinary_login(self):
c = self.client
u = self.user
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
logged_in = c.login(username=u.username, password="secretword") # fails to work if password=u.password !
self.assertTrue(logged_in, "FAILED to login as '" + u.username + "'")
response = c.get("/accounts/login/") # defined by auth system
content = response.content.decode()
t = re.search(r"You are now logged in", content)
self.assertIsNotNone(t, "Logged in as '" + u.username + "' but failed to get 'Now you can' greeting")
def test_authentication_login(self):
c = self.client
u = self.user
self.assertTrue(u.is_active, "User '" + u.username + "' is INACTIVE")
# This is weird. I thought that the user had to login before she was in the authenticated state
self.assertTrue(u.is_authenticated, "User '" + u.username + "' is NOT AUTHENTICATED before login")
logged_in = c.login(username=u.username, password="secretword") # fails to work if password=u.password !
self.assertTrue(logged_in, "FAILED to login as '" + u.username + "'")
self.assertTrue(u.is_authenticated, "User '" + u.username + "' is NOT AUTHENTICATED after login")
# c.logout() # This next test always means user is still authenticated after logout. Surely not?
# self.assertFalse(u.is_authenticated, 'User \'' + u.username + '\' is STILL AUTHENTICATED after logout')
def test_admin_login(self):
c = self.client
u = self.user
logged_in = c.login(username=u.username, password="secretword") # fails to work if password=u.password !
self.assertTrue(logged_in, "FAILED to login as '" + u.username + "'")
response = c.get("/admin/")
content = response.content.decode()
# with open('admin-op.html', 'w') as f:
# f.write(content)
t = re.search(r"Troggle database administration", content)
self.assertIsNotNone(t, "Logged in as '" + u.username + "' but failed to get the Troggle Admin page")
def test_noinfo_login(self):
c = self.client # inherited from TestCase
u = self.user
logged_in = c.login(username=u.username, password="secretword") # fails if password=u.password !
self.assertTrue(logged_in, "FAILED to login as '" + u.username + "'")
response = c.get("/stats") # a page with the Troggle menus
content = response.content.decode()
t = re.search(r"User\:expotest", content)
self.assertIsNotNone(t, "Logged in as '" + u.username + "' but failed to get 'User:expotest' heading")
response = c.get("/noinfo/cave-number-index")
content = response.content.decode()
t = re.search(r"2001-07 Hoffnungschacht", content)
self.assertIsNotNone(t, "Logged in as '" + u.username + "' but failed to get /noinfo/ content")
def test_user_force(self):
c = self.client
u = self.user
try:
c.force_login(u)
except:
self.assertIsNotNone(
None,
"Unexpected exception trying to force_login as '"
+ u.username
+ "' but failed (Bad Django documentation?)",
)
response = c.get("/stats") # a page with the Troggle menus
content = response.content.decode()
t = re.search(r"Log out", content)
self.assertIsNotNone(t, "Forced logged in as '" + u.username + "' but failed to get Log out heading")
response = c.get("/accounts/login/")
content = response.content.decode()
t = re.search(r"You are now logged in", content)
self.assertIsNotNone(t, "Forced logged in as '" + u.username + "' but failed to get /accounts/profile/ content")

core/TESTS/test_parsers.py (Normal file, 137 lines)

@@ -0,0 +1,137 @@
"""
We are using unittest for troggle.
Note that the database has not been parsed from the source files when these tests are run,
so any path that relies on data being in the database will fail.
The simple redirections to files which exist, e.g. in
/expoweb/
/photos/
etc. will test fine.
But paths like this:
/survey_scans/
/caves/
which rely on database resolution will fail unless a fixture has been set up for
them.
https://docs.djangoproject.com/en/dev/topics/testing/tools/
"""
import re
import subprocess
import unittest
from http import HTTPStatus
from django.test import Client, SimpleTestCase, TestCase
from troggle.core.models.troggle import Expedition, DataIssue, Person, PersonExpedition
import troggle.parsers.logbooks as lbp
TEST_YEAR = "1986"
lbp.ENTRIES[TEST_YEAR] = 4 # number of entries in the test logbook
class ImportTest(TestCase):
@classmethod
def setUpTestData(cls):
def make_person(firstname, lastname, nickname=False, vfho=False, guest=False):
fullname = f"{firstname} {lastname}"
lookupAttribs = {"first_name": firstname, "last_name": (lastname or "")}
nonLookupAttribs = {"is_vfho": vfho, "fullname": fullname, "nickname": nickname}
person = Person.objects.create(**nonLookupAttribs, **lookupAttribs)
lookupAttribs = {"person": person, "expedition": cls.test_expo}
nonLookupAttribs = {"is_guest": guest}
pe = PersonExpedition.objects.create(**nonLookupAttribs, **lookupAttribs)
return person
import troggle.settings as settings
LOGBOOKS_PATH = settings.EXPOWEB / lbp.LOGBOOKS_DIR
cls.test_logbook = LOGBOOKS_PATH / TEST_YEAR / lbp.DEFAULT_LOGBOOK_FILE
frontmatter_file = LOGBOOKS_PATH / TEST_YEAR / "frontmatter.html"
if frontmatter_file.is_file():
frontmatter_file.unlink() # delete if it exists
lookupAttribs = {"year": TEST_YEAR}
nonLookupAttribs = {"name": f"CUCC expo-test {TEST_YEAR}"}
cls.test_expo = Expedition.objects.create(**nonLookupAttribs, **lookupAttribs)
fred = make_person("Fred", "Smartarse", nickname="freddy")
phil = make_person("Phil", "Tosser", nickname="tosspot")
dave = make_person("David", "Smartarse", "")
mike = make_person("Michael", "Wideboy", "WB", vfho=True)
# NOT created Kurt, as the whole point is that he is a guest.
def setUp(self):
pass
def tearDown(self):
pass
def test_logbook_exists(self):
self.assertTrue(self.test_logbook.is_file())
def test_logbook_parse(self):
lbp.LoadLogbook(self.test_expo)
issues = DataIssue.objects.all()
messages = []
for i in issues:
if i.parser=="logbooks":
# f"{self.parser} - {self.message}"
messages.append(i.message)
print(f"'{i.message}'")
expected = [
" ! - 1986 No name match for: 'Kurt Keinnamen' in entry tid='1986_s02' for this expedition year.",
]
not_expected = [
" ! - 1986 No name match for: 'Dave Smartarse' in entry tid='1986_s01' for this expedition year.",
" ! - 1986 Warning: logentry: surface - stupour - no expo member author for entry '1986_s03'",
" ! - 1986 Warning: logentry: 123 - wave 2 - no expo member author for entry '1986_s02'",
]
for e in expected:
self.assertIn(e, messages)
for e in not_expected:
self.assertNotIn(e, messages)
def test_aliases(self):
# Problem: '' empty string appears as valid alias for David Smartarse
response = self.client.get(f"/aliases/{TEST_YEAR}")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
# with open('_test_response.html', 'w') as f:
# f.write(content)
ph = f"'fsmartarse'"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_survexfiles(self):
# Needs another test with test data
response = self.client.get("/survexfile/caves/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
# with open('_test_response.html', 'w') as f:
# f.write(content)
ph = f"Caves with subdirectories"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_people(self):
# Needs another test with test data
response = self.client.get("/people")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
# with open('_test_response.html', 'w') as f:
# f.write(content)
ph = f"<td><a href=\"/personexpedition/FredSmartarse/{TEST_YEAR}\">{TEST_YEAR}</a></td>"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")

core/TESTS/test_urls.py (Normal file, 162 lines)

@@ -0,0 +1,162 @@
"""
We are using unittest for troggle.
Note that the database has not been parsed from the source files when these tests are run,
so any path that relies on data being in the database will fail.
https://docs.djangoproject.com/en/dev/topics/testing/tools/
We are not using
https://github.com/FactoryBoy/factory_boy
because we are trying to minimise the number of 3rd-party packages: they expose us to update hell,
as experienced in 2019-2020.
However we could use
https://docs.python.org/dev/library/unittest.mock.html
as this is now part of python - if we can get our heads around it.
The tests in this file:
The code {% url THING %} or {% url THING PARAMETER %} appears a hundred times or more in the troggle/templates/ HTML template files.
This is the template syntax for
reverse('THING')
or
reverse('THING', args=[PARAMETER])
It is the URLs which take parameters that need understanding and testing. The reverse() calls which take no
parameters should be fine, as this is fundamental Django stuff which will have been tested to death.
But the reverse() function is purely syntactical, the PARAMETER is just a string which is applied to
the url. So this is not testing anything important really. See the test_url_threed() below.
These url lines all come from templates/*.html
1. No tests: No parameters
{% url "caveindex" %}
{% url "controlpanel" %}
{% url "dataissues" %}
{% url "dwgallfiles" %}
{% url "dwgupload" %}
{% url "eastings" %}
{% url "exportlogbook" %}
{% url "newcave" %}
{% url "notablepersons" %}
{% url "photoupload" %}
{% url "walletedit" %}
Tests exist:
{% url "stats" %}
{% url "allscans" %}
{% url "survexcaveslist" %}
2. With parameter
{% url "caveQMs" "1623-290" %}
{% url "cave_openQMs" "1623-290" %}
{% url "cavewallets" cave_id %}
{% url "dwgfilesingle" drawing.dwgpath %}
{% url "edit_cave" cave.url_parent cave.slug %}
{% url "editentrance" cave.slug ent.entrance.slug %}
{% url "editexpopage" path %}
{% url "err" title %}
{% url "expedition" 2022 %}
{% url "newentrance" cave.slug %}
{% url "survexcavessingle" cavedir %}
{% url "survexcavessingle" cavefiles.0.1 %}
{% url "svx" cavepath %}
{% url "svx" survexfile.path %}
{% url "svxlog" title %}
{% url 'caveQMs' '1623-161' %}
{% url 'image_selector' path %}
{% url 'new_image_form' path %}
Tests exist:
{% url "threed" title %}
"""
todo = """These just do {% url THING %} with no parameter, we also need tests which take a parameter
- Read all this https://developer.mozilla.org/en-US/docs/Learn/Server-side/Django/Testing
- Read all this https://realpython.com/testing-in-django-part-1-best-practices-and-examples/
- add 'coverage' to all tests
- statistics also needs test when we have put data into the database
"""
import re
from http import HTTPStatus
from django.test import Client, TestCase
from django.urls import reverse, path
# class SimplePageTest(unittest.TestCase):
class URLTests(TestCase):
"""These tests may appear to be redundant, but in fact they exercise different bits of code. The urls.py
dispatcher is sending these URLs via different 'view' handlers, and they all need verifying.
"""
@classmethod
def setUpTestData(cls):
# Set up data for the whole TestCase
# cls.foo = Foo.objects.create(bar="Test")
# Some test using self.foo in tests below..
# read in some SQL ?
pass
def setUp(self):
# Every test needs a client.
self.client = Client()
def test_statistics(self):
response = self.client.get("/statistics")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"0 expeditions: 0 people, 0 caves and 0 logbook entries."
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_stats(self):
# Needs another test with test data
response = self.client.get("/stats")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
# with open('_test_response.html', 'w') as f:
# f.write(content)
ph = r"Total length: 0.0 km adding up the total for each year."
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_url_stats(self):
"""Test the {% url "stats" %} reverse resolution
path('statistics', statistics.stats, name="stats"),
path('stats', statistics.stats, name="stats"),
"""
reversed_url = reverse('stats') # NB _ must be written as - if present in name
self.assertEqual(reversed_url, "/stats")
def test_url_allscans(self):
"""Test the {% url "allscans" %} reverse resolution
path('survey_scans/', allscans, name="allscans"), # all the scans in all wallets
"""
reversed_url = reverse('allscans') # NB _ must be written as - if present in name
self.assertEqual(reversed_url, "/survey_scans/")
def test_url_survexcaveslist(self):
"""Test the {% url "allscans" %} reverse resolution
path('survexfile/caves', survex.survexcaveslist, name="survexcaveslist"),
path('survexfile/caves/', survex.survexcaveslist, name="survexcaveslist"), # auto slash not working
"""
reversed_url = reverse('survexcaveslist') # NB _ must be written as - if present in name
self.assertEqual(reversed_url, "/survexfile/caves/")
def test_url_threed(self):
"""Test the {% url "threed" %} reverse resolution
path('survexfile/<path:survex_file>.3d', survex.threed, name="threed"),
"""
reversed_url = reverse('threed', args=['zilch']) # NB _ must be written as - if present in name
self.assertEqual(reversed_url, "/survexfile/zilch.3d")


@@ -0,0 +1,632 @@
"""
IGNORED tests
- all test files with hyphens in the filename are ignored
- filenames with _ are OK
$ python manage.py test cuy.photologue --parallel
only runs the photologue tests. Working (well, it was working..)
$ python manage.py test cuy.mailman --parallel
$ python manage.py test paypal.standard --parallel
needs work: a very large test suite
$ python manage.py test tagging --parallel
a huge suite - needs a lot of work to get it working with Django 1.11 & python3
$ python manage.py test cuy.club --parallel
Runs the tests in this file only
"""
import re
import unittest
from django.test import Client, SimpleTestCase, TestCase, TransactionTestCase
class ImportTest(TestCase):
def test_import_imports(self):
# need to go through all modules and copy all imports here
from io import StringIO
from cuy.club.models import (Article, Event, Member, Webpage,
WebpageCategory)
from cuy.website.views.generic import PUBLIC_LOGIN
from django.conf import settings
from django.contrib.auth.decorators import login_required
from django.contrib.auth.models import User
from django.core import management
from django.db import connection, connections
from django.db.utils import IntegrityError
from django.http import HttpResponse, HttpResponseRedirect
from django.shortcuts import get_object_or_404, render
from django.template.defaultfilters import slugify
from django.utils.timezone import get_current_timezone, make_aware
class SimpleTest(SimpleTestCase):
def test_arith_mult(self):
"""
Tests that 10 x 10 always equals 100.
"""
self.assertEqual(10*10, 100)
class DataTests(TestCase ):
'''These check that the NULL and NON-UNIQUE constraints are working in the database '''
@classmethod
def setUpTestData(cls):
pass
def setUp(self):
from cuy.club.models import Member
from django.contrib.auth.models import User
m = Member()
m.pk=8000
m.user_id = 9000 # not NULL constraint
m.save()
self.member = m
u = User()
u.pk = 9000
u.user_id = 8000
u.username, u.password ='stinker', 'secretword'
u.email='philip.sargent+SP@gmail.com'
u.first_name, u.last_name ='Stinker', 'Pinker'
u.save()
self.user = u
def tearDown(self):
#self.member.delete() # must delete member before user
#self.user.delete() # horrible crash, why?
pass
def test_member_not_null_field(self):
from cuy.club.models import Member
from django.db.utils import IntegrityError
n = Member()
try:
n.save()
except IntegrityError as ex:
t = re.search(r'NOT NULL constraint failed: club_member.user_id', str(ex))
self.assertIsNotNone(t, "Exception is not the expected 'NOT NULL constraint failed'")
n.user_id = 1000
try:
n.save
except:
return self.assertIsNotNone(None, "Failed to save valid Member to database")
def test_member_not_unique_field(self):
from cuy.club.models import Member
from django.db.utils import IntegrityError
m1 = Member()
m2 = Member()
m1.user_id = 1000
m2.user_id = m1.user_id
m1.save()
try:
m2.save()
except IntegrityError as ex:
t = re.search(r'UNIQUE constraint failed: club_member.user_id', str(ex))
return self.assertIsNotNone(t, "IntegrityError as expected but message is not the expected 'UNIQUE constraint failed'" )
self.assertIsNotNone(None, "Failed to enforce 'UNIQUE constraint' on saving two Member objects with same user_id")
def test_article_invalid_date(self):
from cuy.club.models import Article, Member
from django.core.exceptions import ValidationError
from django.db.utils import IntegrityError
a = Article()
m = self.member
a.author_id = m.user_id
a.publish="not a valid datetime"
try:
a.save()
except ValidationError as ex:
t = re.search(r'value has an invalid format. It must be in YYYY-MM-DD HH:MM', str(ex))
self.assertIsNotNone(t, "Exception is not the expected 'invalid format'")
def test_article_and_author_not_null(self):
from cuy.club.models import Article, Member
from django.core.exceptions import ValidationError
from django.db.utils import IntegrityError
a2 = Article()
a2.publish ="2021-02-17 17:25"
a2.author_id = None
try:
a2.save()
except IntegrityError as ex:
t = re.search(r'NOT NULL constraint failed: club_article.author_id', str(ex))
self.assertIsNotNone(t, "Exception is not the expected 'NOT NULL constraint failed'")
except:
self.assertIsNotNone(None, "Exception is not the expected 'NOT NULL constraint failed' IntegrityError")
def test_article_and_author_ok(self):
from cuy.club.models import Article, Member
from django.core.exceptions import ValidationError
from django.db.utils import IntegrityError
m = self.member
a3 = Article()
a3.pk = 5000
a3.publish ="2021-02-17 17:25"
a3.author_id = m.pk
try:
a3.save()
except:
return self.assertIsNotNone(None, "Failed to save valid Article to database")
def test_member_and_user(self):
u = self.user
m = self.member
m.user = u
self.assertEqual(m.user.last_name, 'Pinker')
m.save()
u.save()
class FixturePageTests(TestCase):
fixtures = ['cuyc_basic_data.json', 'test_data.json', 'auth_user_gussie']
def setUp(self):
from django.contrib.auth.models import User
self.user = User.objects.get(username='gussie')
self.member = self.user.profile
def tearDown(self):
pass
def test_fix_event_loaded(self):
from cuy.club.models import Event
e = Event.objects.get(slug='spring-in-the-med')
self.assertEqual(str(e.shore_contact.first_name()), 'Stiffy')
self.assertEqual(str(e.organiser.last_name()), 'Fittleworth')
def test_fix_page_all_trips(self):
response = self.client.get('/programme/')
content = response.content.decode()
t = re.search(r'Spring in the Arctic', content)
self.assertIsNotNone(t, "Failed to see Event loaded from fixture")
t = re.search(r'High Summer in the Irish Sea', content)
self.assertIsNotNone(t, "Failed to see Event loaded from fixture")
def test_fix_page_event(self):
response = self.client.get('/programme/events/spring-in-the-arctic/')
content = response.content.decode()
t = re.search(r'Spring in the Arctic', content)
self.assertIsNotNone(t, "Failed to see Event loaded from fixture")
def test_fix_admin_login_fail(self):
c = self.client
from cuy.club.models import Member
from django.contrib.auth.models import User
m = Member.objects.get(pk=9002)
u = User.objects.get(username='bingo')
self.assertTrue(u.is_active, 'User \'' + u.username + '\' is INACTIVE')
logged_in = c.login(username=u.username, password='secretword') # fails to work if password=u.password !
self.assertTrue(logged_in, 'FAILED to login as \'' + u.username + '\'')
response = c.get('/admin/')
content = response.content.decode()
with open('admin-op.html', 'w') as f:
f.write(content)
t = re.search(r'Site administration', content)
self.assertIsNone(t, 'Logged in as \'' + u.username + '\' (not staff) but still managed to get the Admin page' )
class ComplexLoginTests(TestCase):
'''These test the login and capabilities of logged-in users'''
def setUp(self):
'''setUp runs once for each test in this class'''
from cuy.club.models import AFFILIATION, MEMBER_TYPES, Member
from django.contrib.auth.models import User
m = Member()
m.pk=8000
m.user_id = 9000 # not NULL constraint
m.email = "philip.sargent+HG@gmail.com"
m.member_type = MEMBER_TYPES[1]
m.affiliation = AFFILIATION[3]
m.committee_email_prefix = 'honoria'
u = User()
u.pk = 9000
u.user_id = 8000
u.username, u.password ='honoria', 'secretword'
u.email='philip.sargent+HG@gmail.com'
u.first_name, u.last_name ='Honoria', 'Glossop'
u.is_staff = True
u.is_superuser = True
u.set_password(u.password) # This creates a new salt and thus a new key for EACH test
u.save() # vital that we save all this before attempting login
#print ('\n',u.password)
m.save()
self.user = u
self.member = m
from cuy.club.models import ClubRole, Elected
cr = ClubRole()
cr.id = 7000
cr.title = 'Skipper'
cr.short_description = 'Club skipper who can lead trips'
cr.committee_position = True
cr.rank = 8
cr.save()
self.clubrole = cr
e = Elected()
e.member = m
e.club_role = cr
e.save()
self.elected = e
def tearDown(self):
self.client.logout() # not needed as each test creates a new self.client
#self.member.delete()
##self.user.delete() # id attribute set to None !
pass
def test_login_redirect_for_non_logged_on_user(self):
c = self.client
# Need to login first. Tests that we are redirected to login page if not logged in
response = c.get('/committee/appointments/')
self.assertRedirects(response, "/login/?next=/committee/appointments/")
def test_ordinary_login(self):
c = self.client
u = self.user
self.assertTrue(u.is_active, 'User \'' + u.username + '\' is INACTIVE')
logged_in = c.login(username=u.username, password='secretword') # fails to work if password=u.password !
self.assertTrue(logged_in, 'FAILED to login as \'' + u.username + '\'')
response = c.get('/')
content = response.content.decode()
t = re.search(r'Hello Honoria', content)
self.assertIsNotNone(t, 'Logged in as \'' + u.username + '\' but failed to get personal greeting' )
def test_authentication_login(self):
c = self.client
u = self.user
self.assertTrue(u.is_active, 'User \'' + u.username + '\' is INACTIVE')
# Note: User.is_authenticated is always True on a concrete User instance; it only
# distinguishes real users from AnonymousUser. The login state lives on the request/session.
self.assertTrue(u.is_authenticated, 'User \'' + u.username + '\' is NOT AUTHENTICATED before login')
logged_in = c.login(username=u.username, password='secretword') # fails to work if password=u.password !
self.assertTrue(logged_in, 'FAILED to login as \'' + u.username + '\'')
self.assertTrue(u.is_authenticated, 'User \'' + u.username + '\' is NOT AUTHENTICATED after login')
c.logout()
self.assertNotIn('_auth_user_id', c.session, 'User \'' + u.username + '\' is STILL AUTHENTICATED after logout')
def test_admin_login(self):
c = self.client
u = self.user
m = self.member
m.user = u
logged_in = c.login(username=u.username, password='secretword') # fails to work if password=u.password !
self.assertTrue(logged_in, 'FAILED to login as \'' + u.username + '\'')
response = c.get('/admin/')
content = response.content.decode()
# with open('admin-op.html', 'w') as f:
# f.write(content)
t = re.search(r'Site administration', content)
self.assertIsNotNone(t, 'Logged in as \'' + u.username + '\' but failed to get the Admin page' )
def test_user_account_login(self):
# User must be associated with a Member for whom is_committee() is True
c = self.client
u = self.user
m = self.member
m.user = u
logged_in = c.login(username=u.username, password='secretword') # fails if password=u.password !
self.assertTrue(logged_in, 'FAILED to login as \'' + u.username + '\'')
response = c.get('/accounts/profile/')
content = response.content.decode()
# with open('account-profile-op.html', 'w') as f:
# f.write(content)
t = re.search(r'CUYC Member Profile - Cambridge University Yacht Club', content)
self.assertIsNotNone(t, 'Logged in as \'' + u.username + '\' but failed to get /accounts/profile/ content')
def test_committee_login(self):
from django.contrib.auth.models import User
# User must be associated with a Member for whom is_committee() is True
c = self.client # inherited from TestCase
u = self.user
m = self.member
cr = self.clubrole
e = self.elected
m.user = u
logged_in = c.login(username=u.username, password='secretword') # fails if password=u.password !
self.assertTrue(logged_in, 'FAILED to login as \'' + u.username + '\'')
response = c.get('/')
content = response.content.decode()
t = re.search(r'Hello Honoria', content)
self.assertIsNotNone(t, 'Logged in as \'' + u.username + '\' but failed to get personal greeting' )
response = c.get('/committee/appointments/')
content = response.content.decode()
# with open('cmttee-op.html', 'w') as f:
# f.write(content)
t = re.search(r'A word of warning...', content)
self.assertIsNotNone(t, 'Logged in as \'' + u.username + '\' but failed to get /committee/ content')
def test_user_force(self):
from django.conf import settings
c = self.client
u = self.user
m = self.member
m.user = u
try:
c.force_login(u)
except:
self.assertIsNotNone(None, 'Unexpected exception trying to force_login as \'' + u.username + '\' but failed (Bad Django documentation?)')
response = c.get('/')
content = response.content.decode()
t = re.search(r'Hello Honoria', content)
self.assertIsNotNone(t, 'Forced logged in as \'' + u.username + '\' but failed to get personal greeting' )
response = c.get('/accounts/profile/')
content = response.content.decode()
t = re.search(r'From here you can update your', content)
self.assertIsNotNone(t, 'Forced logged in as \'' + u.username + '\' but failed to get /accounts/profile/ content')
class DynamicPageTests(TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_empty_yachts(self):
# no page there initially
response = self.client.get('/yachts/')
content = response.content.decode()
self.assertEqual(response.status_code, 404)
def test_full_yachts(self):
'''Creating a WebpageCategory and an index webpage creates a valid url
'''
from cuy.club.models import Webpage, WebpageCategory
wc = WebpageCategory()
wc.pk = 8000
wc.id = 8000
wc.name, wc.slug ='Yachts', 'yachts'
wc.save()
self.webcategory = wc
p = Webpage()
p.pk = 9000
p.id = 9000
p.category_id = wc.id
p.description = "Current Yacht"
p.edited = 1
p.event_id = None
p.index = 1
p.markup = "<h1>Skylark</h1>"
p.ordering = 10
p.slug = "yacht"
p.title = "Skylark Yacht"
p.save()
self.webpage = p
response = self.client.get('/yachts/')
content = response.content.decode()
self.assertEqual(response.status_code, 200)
class PageTests(TestCase):
def setUp(self):
# Every test needs a client.
# new in Django 1.5 no need to create self.client first
# https://docs.djangoproject.com/en/dev/topics/testing/tools/#django.test.LiveServerTestCase
#self.client = Client()
pass
def tearDown(self):
pass
def test_basic_admin(self):
response = self.client.get('/admin/login/')
self.assertEqual(response.status_code, 200)
def test_basic_admindoc(self):
# Need to login first. Tests that we are redirected
response = self.client.get('/admin/doc/models/')
self.assertRedirects(response, "/admin/login/?next=/admin/doc/models/")
def test_basic_programme(self):
response = self.client.get('/programme/')
self.assertEqual(response.status_code, 200)
def test_basic_login (self):
# Need to login first
response = self.client.post('/login/', {'username': 'gussie', 'password': 'secretword'})
if response.status_code == 302:
print(response['location'])
self.assertEqual(response.status_code, 200) # fails because user does not exist
def test_basic_committee(self):
# Need to login first. Tests that we are redirected to login page
response = self.client.get('/committee/')
self.assertRedirects(response, "/login/?next=/committee/")
# --- Check non-logged-in users cannot see these
def test_basic_gallery(self):
response = self.client.get('/gallery/')
self.assertEqual(response.status_code, 200)
def test_basic_sitemap(self):
response = self.client.get('/site-map/')
self.assertEqual(response.status_code, 200)
# --- public club pages created by content in templates/*.html
def test_basic_club(self):
response = self.client.get('/club/')
content = response.content.decode()
t = re.search(r'offers opportunities for members of the university to sail yachts', content)
self.assertIsNotNone(t)
def test_basic_programme_content(self):
response = self.client.get('/programme/')
content = response.content.decode()
t = re.search(r'If you would like to go on any of these events', content)
self.assertIsNotNone(t)
def test_basic_programme_onshore(self):
response = self.client.get('/programme/on_shore/')
content = response.content.decode()
t = re.search(r'All Upcoming Shore Based Events', content)
self.assertIsNotNone(t)
def test_page_equal_opps(self):
response = self.client.get('/club/equal-opps/')
content = response.content.decode()
t = re.search(r'commitment to a policy of equal opportunities', content)
self.assertIsNotNone(t)
def test_page_safety(self):
response = self.client.get('/club/safety/')
content = response.content.decode()
t = re.search(r'endeavour to maintain the highest levels of safety', content)
self.assertIsNotNone(t)
def test_page_safety_risk(self):
response = self.client.get('/club/safety/risk/')
content = response.content.decode()
t = re.search(r'rules for the use of safety lines to be described and monitored by the skipper.', content)
self.assertIsNotNone(t)
def test_page_safetypolicy(self):
response = self.client.get('/club/safetypolicy/')
content = response.content.decode()
t = re.search(r'should be capable of swimming at least fifty meters in clothing and keeping afloat for at least five minutes', content)
self.assertIsNotNone(t)
def test_page_safety_rules(self):
response = self.client.get('/club/safety/rules/')
content = response.content.decode()
t = re.search(r'Safety Officer is responsible for the maintenance of safety records', content)
self.assertIsNotNone(t)
def test_page_regulations(self):
response = self.client.get('/club/regulations/')
content = response.content.decode()
t = re.search(r'Sanger Institute, the Babraham Institute, Wellcome and MRC Research Laboratories', content)
self.assertIsNotNone(t)
def test_page_constitution(self):
response = self.client.get('/club/constitution/')
content = response.content.decode()
t = re.search(r'to provide a wide variety of safe and affordable yacht sailing', content)
self.assertIsNotNone(t)
def test_page_clubcommittee(self):
response = self.client.get('/club/committee/')
content = response.content.decode()
t = re.search(r'CUYC elects new officers as needed, usually at the beginning of each term', content)
self.assertIsNotNone(t)
def test_page_damages(self):
response = self.client.get('/club/damages/')
content = response.content.decode()
t = re.search(r'all crew participants may be required to contribute to the payment of damages', content)
self.assertIsNotNone(t)
def test_page_training(self):
response = self.client.get('/training/')
content = response.content.decode()
t = re.search(r'members of the club are always happy to pass on informal training tips', content)
self.assertIsNotNone(t)
def test_page_racing(self):
response = self.client.get('/racing/')
content = response.content.decode()
t = re.search(r'CUYC Racing Squad', content)
self.assertIsNotNone(t)
def test_page_blog(self):
response = self.client.get('/blog/')
content = response.content.decode()
t = re.search(r'Latest Posts', content)
self.assertIsNotNone(t)
def test_page_gallery(self):
response = self.client.get('/gallery/')
content = response.content.decode()
t = re.search(r'Photo Galleries', content)
self.assertIsNotNone(t)
def test_page_about_photos(self):
response = self.client.get('/about_photos/')
content = response.content.decode()
t = re.search(r'have been supplied by members of CUYC', content)
self.assertIsNotNone(t)
def test_page_loginhelp(self):
response = self.client.get('/login/help/')
content = response.content.decode()
t = re.search(r'Existing CUYC Member, without an account?', content)
self.assertIsNotNone(t)
def test_page_loginregister(self):
response = self.client.get('/login/register/')
content = response.content.decode()
t = re.search(r'If you are, or have ever been, a CUYC or CUCrC member', content)
self.assertIsNotNone(t)
# --- These pages are not connected to top level public menus but are in fact public
def test_page_club_tripinformation(self):
response = self.client.get('/club/trip-information/')
content = response.content.decode()
t = re.search(r'organisers have a choice to add a sum to the trip fee quoted on the website to cover expenses', content)
self.assertIsNotNone(t)
def test_page_club_trippayment(self):
response = self.client.get('/club/trip-information/payment/')
content = response.content.decode()
t = re.search(r'All payments to the club should be sent via Paypal', content)
self.assertIsNotNone(t)
def test_page_club_trip_typical_day(self):
response = self.client.get('/club/trip-information/typical-day/')
content = response.content.decode()
t = re.search(r'Skipper and first mate crawl out of their sleeping bags early', content)
self.assertIsNotNone(t)
def test_page_club_trip_faq(self):
response = self.client.get('/club/trip-information/faq/')
content = response.content.decode()
t = re.search(r'Different people are seasick in different ways', content)
self.assertIsNotNone(t)
def test_page_club_trip_kit(self):
response = self.client.get('/club/trip-information/kit/')
content = response.content.decode()
t = re.search(r'appropriate quantity of base layer clothes to match the duration', content)
self.assertIsNotNone(t)

core/TESTS/tests.py Normal file

@@ -0,0 +1,564 @@
"""
We are using unittest for troggle.
Note that the database has not been parsed from the source files when these tests are run,
so any path that relies on data being in the database will fail.
The simple redirections to files which exist, e.g. in
/expoweb/
/photos/
etc. will test fine.
But paths like this:
/survey_scans/
/caves/
which rely on database resolution will fail unless a fixture has been set up for
them.
https://docs.djangoproject.com/en/dev/topics/testing/tools/
"""
todo = """ADD TESTS when we are redirecting /expofiles/ to a remote file-delivering site
- Add test for running cavern to produce a .3d file
"""
import re
from http import HTTPStatus
from django.test import Client, TestCase
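# A minimal sketch (not part of the original file) of the fixture approach described in the
# module docstring above; "cave_data.json" is a hypothetical fixture name, not a real file:
#
# class FixtureBackedTests(TestCase):
#     fixtures = ["cave_data.json"]  # would have to provide Cave/Entrance objects
#
#     def test_caves_listed_from_fixture(self):
#         response = self.client.get("/caves")
#         self.assertEqual(response.status_code, HTTPStatus.OK)
#         self.assertIsNotNone(re.search(r"115", response.content.decode()))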
# class SimplePageTest(unittest.TestCase):
class PageTests(TestCase):
"""These tests may appear to be redundant, but in fact they exercise different bits of code. The urls.py
dispatcher sends these URLs to different 'view' handlers, and they all need verifying.
"""
@classmethod
def setUpTestData(cls):
# Set up data for the whole TestCase
# cls.foo = Foo.objects.create(bar="Test")
# Some test using self.foo in tests below..
# read in some SQL ?
pass
def setUp(self):
# Every test needs a client.
self.client = Client()
def test_expoweb_root(self):
response = self.client.get("")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"CUCC in Austria"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_root_slash(self):
response = self.client.get("/")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"CUCC in Austria"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_paths(self):
response = self.client.get("/pathsreport")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"This report is generated from"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_dir(self):
response = self.client.get("/handbook")
response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.FOUND) # 302 directory, so redirects to /index.htm
def test_expoweb_dirslash(self):
response = self.client.get("/handbook/")
response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.FOUND) # 302 directory, so redirects to /index.htm
def test_expoweb_dir_no_index(self):
response = self.client.get("/handbook/troggle")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.NOT_FOUND)
ph = r"Page not found handbook/troggle/index.html"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_dir_with_index_htm(self):
response = self.client.get("/years/1999/index.htm")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK) # directory, so redirects to /index.htm
ph = r"Passage descriptions for 1999"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_dir_with_index_html(self):
response = self.client.get("/years/2015/index.html")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK) # directory, so redirects to /index.htm
ph = r"Things left at top camp 2014"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_dir_with_index2(self):
response = self.client.get("/handbook/index.htm")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"Introduction to expo"
phmatch = re.search(ph, content)
# print("\n ! - test_expoweb_dir_with_index2\n{}\n{}".format(response.reason_phrase, content))
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_htm(self):
response = self.client.get("/handbook/index.htm")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"Introduction to expo"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_notfound(self):
response = self.client.get("/handbook/_test_zyxxypqrqx.html")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.NOT_FOUND)
ph = r"<h1>Page not found"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_no_dir(self):
# slash where there should not be one
response = self.client.get("/handbook/_test_zyxxypqrqx/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"<h1>Directory not found"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_troggle_default(self):
# default page after logon
response = self.client.get("/troggle")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"expeditions the club has undertaken"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_troggle_default_slash(self):
response = self.client.get("/troggle/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"<h1>Directory not found"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_expoweb_via_areaid(self):
# the dispatcher takes a detour via the cave rendering procedure for this
response = self.client.get("/guidebook/t/via201.jpg")
self.assertEqual(response.status_code, HTTPStatus.OK)
self.assertEqual(len(response.content), 6057)
def test_cave_kataster_not_found(self):
# database not loaded, so no caves found; so looks for a generic expopage and fails
response = self.client.get("/1623/115.htm")
self.assertEqual(response.status_code, HTTPStatus.NOT_FOUND)
content = response.content.decode()
ph = r"Page not found 1623/115.htm"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_caves_page(self):
response = self.client.get("/caves")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"Cave Number Index - kept updated"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_caves_page_kataster_not_found(self):
response = self.client.get("/caves")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"115"
phmatch = re.search(ph, content)
self.assertIsNone(phmatch, "Found unexpected text: '" + ph + "'")
def test_page_ss(self):
response = self.client.get("/survey_scans/")
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r"All Survey scans folders "
content = response.content.decode()
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_admin(self):
# see the login page
response = self.client.get("/admin/login/")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
ph = r'<h1 id="site-name">Troggle database administration</h1>'
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_admindocs_exped(self):
# Get redirected to login page
response = self.client.get("/admin/doc/models/core.expedition/")
response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.FOUND) # 302
def test_page_expofiles_root_dir(self):
# Root expofiles - odd interaction with url parsing so needs testing
response = self.client.get("/expofiles")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
for ph in [
r'a href="/expofiles/geotiffsurveys">/geotiffsurveys/',
r'<a href="/expofiles/photos">/photos/',
r'<a href="/expofiles/surveyscans">/surveyscans/',
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_expofiles_root_slash_dir(self):
# Root expofiles - odd interaction with url parsing so needs testing
response = self.client.get("/expofiles/")
if response.status_code != HTTPStatus.OK: # 200
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND: # 302
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
for ph in [
r'a href="/expofiles/geotiffsurveys">/geotiffsurveys/',
r'<a href="/expofiles/photos">/photos/',
r'<a href="/expofiles/surveyscans">/surveyscans/',
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_expofiles_badness(self):
# should display expofiles directory contents not its parent
response = self.client.get("/expofiles/99badness99")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
for ph in [
r'a href="/expofiles/geotiffsurveys">/geotiffsurveys/',
r'<a href="/expofiles/photos">/photos/',
r'<a href="/expofiles/surveyscans">/surveyscans/',
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_expofiles_docs_dir(self):
# Flat file tests.
response = self.client.get("/expofiles/documents/")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
for ph in [
r'a href="/expofiles/documents/bier-tent-instructions.pdf">bier-tent-instructions.pdf',
r'a href="/expofiles/documents/boc.pdf">boc.pdf',
r'a href="/expofiles/documents/idiots-guide-expo-git.pdf"',
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_survey_scans_dir(self):
# Flat file tests.
response = self.client.get("/expofiles/surveyscans")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
for ph in [
r'<a href="/expofiles/surveyscans/2004">/2004/',
r'<a href="/expofiles/surveyscans/1989LUSS">/1989LUSS/',
r'<a href="/expofiles/surveyscans/2018">/2018',
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_folk(self):
# This page is separately generated, so it has the full data content
response = self.client.get("/folk/index.htm")
content = response.content.decode()
self.assertEqual(response.status_code, HTTPStatus.OK)
for ph in [
r"involves some active contribution",
r"Naomi Griffiths",
r"Gail Smith",
r"Phil Wigglesworth",
r"A more obscure record of longest gap between expos has",
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_expofile_documents(self):
# this gets an empty page as the database has not been loaded
response = self.client.get("/expofiles/documents")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"notice_generale_cordes_courant"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_expofile_documents_slash(self):
# this gets an empty page as the database has not been loaded
response = self.client.get("/expofiles/documents/")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"notice_generale_cordes_courant"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_expofile_document_loeffler_pdf(self):
# Flat file tests.
response = self.client.get("/expofiles/documents/surveying/tunnel-loefflerCP35-only.pdf")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
self.assertEqual(len(response.content), 2299270)
def test_page_expofile_document_rope_pdf(self):
# Flat file tests.
response = self.client.get("/expofiles/documents/ropes/rope-age-agm-2019.pdf")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
self.assertEqual(len(response.content), 76197)
def test_page_expofile_document_png(self):
# Flat file tests.
response = self.client.get("/expofiles/documents/callout-2012.png")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
self.assertEqual(len(response.content), 69921)
def test_page_expofile_writeup(self):
# Flat file tests.
response = self.client.get("/expofiles/writeups/1982/logbook1982.pdf")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
self.assertEqual(len(response.content), 12915413)
def test_page_site_media_ok(self):
# Flat file tests.
response = self.client.get("/site_media/surveyHover.gif")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
self.assertEqual(len(response.content), 39482) # need to check it is not just an error page
def test_page_site_media_css(self):
# Flat file tests.
response = self.client.get("/site_media/css/trog3.css")
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode() # need to check it is not just an error page
ph = r"This text is used by the test system to determine that trog3.css loaded correctly"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_photos_ok(self):
# Flat file tests.
response = self.client.get("/photos/2018/PhilipSargent/corin.jpg") # exists
if response.status_code != HTTPStatus.OK:
self.assertEqual(response.status_code, HTTPStatus.FOUND)
if response.status_code != HTTPStatus.FOUND:
self.assertEqual(response.status_code, HTTPStatus.OK)
self.assertEqual(len(response.content), 67487) # need to check it is not just an error page
def test_page_photos_not_ok(self):
# Flat file tests.
response = self.client.get("/photos/2018/PhilipSargent/_corin.jpeg") # does not exist
self.assertEqual(response.status_code, HTTPStatus.NOT_FOUND)
content = response.content.decode()
ph = r"<title>Page not found 2018/PhilipSargent/_corin.jpeg</title>"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_photos_dir(self):
# Flat file tests.
response = self.client.get("/photos/2018/PhilipSargent/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"Directory not displayed"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_survey_scans_empty(self):
# this gets an empty page as the database has not been loaded
response = self.client.get("/survey_scans/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"contains the scanned original in-cave survey notes and sketches"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_dwgdataraw_empty(self):
# this gets an empty page as the database has not been loaded
response = self.client.get("/dwgdataraw/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"<h1>Directory not found"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_dwgallfiles_empty(self):
# this gets an empty page as the database has not been loaded
response = self.client.get("/dwgfiles")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
for ph in [
r"All Tunnel and Therion files",
r"<th>Wallets</th><th>Scan files in the wallets</th><th>Frames</th></tr>",
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_dwgallfiles_empty_slash(self):
# this gets an empty page as the database has not been loaded
response = self.client.get("/dwgfiles/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
for ph in [
r"All Tunnel and Therion files",
r"<th>Wallets</th><th>Scan files in the wallets</th><th>Frames</th></tr>",
]:
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_page_slash_empty(self):
# trailing slash where there should not be one
response = self.client.get("/expedition/1979/")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"<h1>Directory not found"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_not_found_survexfile_cave(self):
response = self.client.get("/survexfile/not_a_real_cave_number")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"Cave Identifier not found in database"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_dataissues(self):
# Needs another test with test data
response = self.client.get("/dataissues")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"as well as these import/parsing issues"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_therionissues(self):
# Needs another test with test data
response = self.client.get("/therionissues")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"! Un-parsed image filename"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_surveximport(self):
# Needs another test with test data
response = self.client.get("/surveximport")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
# with open('_test_response.html', 'w') as f:
# f.write(content)
ph = r"The number at the left-hand margin is the depth"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_survexdebug(self):
# Needs another test with test data
response = self.client.get("/survexdebug")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"Running list of warnings during import"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
def test_eastings(self):
# Needs another test with test data
response = self.client.get("/eastings")
self.assertEqual(response.status_code, HTTPStatus.OK)
content = response.content.decode()
ph = r"<tr><th>Survex Station</th><th>x</th><th>y</th></tr>"
phmatch = re.search(ph, content)
self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph + "'")
# ADD TESTS when we are redirecting /expofiles/ to get the actual files using e.g.
# import requests
# page = requests.get("http://dataquestio.github.io/web-scraping-pages/simple.html")
# these need a fixture to load the database before they will pass
# we also need tests for invalid queries to check that error pages are right
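# One possible shape for the /expofiles/ redirection test mentioned in the todo string above
# (a hedged sketch; the remote URL prefix is an assumption, not the configured value):
#
# def test_expofiles_remote_redirect(self):
#     response = self.client.get("/expofiles/documents/boc.pdf")
#     self.assertEqual(response.status_code, HTTPStatus.FOUND)
#     self.assertTrue(response["Location"].startswith("https://expo.survex.com/expofiles/"))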
# def test_page_survey_scans_khplan2_png(self):
# # this has an error as the database has not been loaded yet in the tests
# response = self.client.get('/survey_scans/smkhs/khplan2.png')
# if response.status_code != HTTPStatus.OK:
# self.assertEqual(response.status_code, HTTPStatus.FOUND)
# if response.status_code != HTTPStatus.FOUND:
# self.assertEqual(response.status_code, HTTPStatus.OK)
# self.assertEqual(len(response.content), 823304) # fails, but is working manually!
# def test_page_dwgdataraw_107sketch_xml(self):
# # this has an error as the database has not been loaded yet in the tests
# response = self.client.get('/dwgdataraw/107/107sketch-v2.xml')
# if response.status_code != HTTPStatus.OK:
# self.assertEqual(response.status_code, HTTPStatus.FOUND)
# if response.status_code != HTTPStatus.FOUND:
# self.assertEqual(response.status_code, HTTPStatus.OK)
# content = response.content.decode()
# for ph in [ r'tunneldate="2014-08-21 11:34:00"',
# r'<sketchsubset subname="Caves of the Loser Plateau"/>',
# r'sfsketch="ollyjen107drawings',
# r'sfsketch="surveyscans/2014/2014#01',
# r'aa-js-plan.png"' ]:
# phmatch = re.search(ph, content)
# self.assertIsNotNone(phmatch, "Failed to find expected text: '" + ph +"'")
# database not loaded yet:
# response = self.client.get('/survey_scans/1991surveybook/page0002.png')
# response = self.client.get('/survey_scans/1991surveybook/')
# content = response.content.decode()
# print(content)
# png93 = re.search(r'/page0093.png">page0093.png</a></td>', content)


@@ -1,22 +1,41 @@
from troggle.core.models import *
from django.contrib import admin
from django.forms import ModelForm
import django.forms as forms
from django.http import HttpResponse
from django.core import serializers
from troggle.core.views_other import downloadLogbook
#from troggle.reversion.admin import VersionAdmin #django-reversion version control
from django.http import HttpResponse
from troggle.core.models.caves import Area, Cave, CaveAndEntrance, Entrance
from troggle.core.models.logbooks import QM, LogbookEntry, PersonLogEntry, CaveSlug
from troggle.core.models.survex import (
DrawingFile,
SingleScan,
SurvexBlock,
SurvexDirectory,
SurvexFile,
SurvexPersonRole,
SurvexStation,
)
from troggle.core.models.wallets import Wallet
from troggle.core.models.troggle import DataIssue, Expedition, Person, PersonExpedition
"""This code significantly adds to the capabilities of the Django Management control panel for Troggle data.
In particular, it enables JSON export of any data with 'export_as_json'
and configures the search fields to be used within the control panel.
What is the search path for the css and js inclusions in the Media subclasses though ?!
The page looks for /static/jquery/jquery.min.js
"""
class TroggleModelAdmin(admin.ModelAdmin):
def save_model(self, request, obj, form, change):
"""overriding admin save to fill the new_since parsing_field"""
obj.new_since_parsing=True
"""overriding admin save to fill the new_since parsing_field
new_since_parsing is not currently used in troggle. It is a fossil."""
obj.new_since_parsing = True
obj.save()
class Media:
js = ('jquery/jquery.min.js','js/QM_helper.js')
js = ("jquery/jquery.min.js", "js/QM_helper.js") # not currently available to troggle, see media/js/README
class RoleInline(admin.TabularInline):
@@ -28,58 +47,36 @@ class SurvexBlockAdmin(TroggleModelAdmin):
inlines = (RoleInline,)
class ScannedImageInline(admin.TabularInline):
model = ScannedImage
extra = 4
# class QMsFoundInline(admin.TabularInline):
# model = QM
# fk_name = "found_by"
# fields = ("number", "grade", "location_description", "comment") # need to add foreignkey to cave part
# extra = 1
class OtherCaveInline(admin.TabularInline):
model = OtherCaveName
class PersonLogEntryInline(admin.TabularInline):
model = PersonLogEntry
raw_id_fields = ("personexpedition",)
extra = 1
class SurveyAdmin(TroggleModelAdmin):
inlines = (ScannedImageInline,)
search_fields = ('expedition__year','wallet_number')
class QMsFoundInline(admin.TabularInline):
model=QM
fk_name='found_by'
fields=('number','grade','location_description','comment')#need to add foreignkey to cave part
extra=1
class PhotoInline(admin.TabularInline):
model = DPhoto
exclude = ['is_mugshot' ]
extra = 1
class PersonTripInline(admin.TabularInline):
model = PersonTrip
raw_id_fields = ('personexpedition',)
extra = 1
#class LogbookEntryAdmin(VersionAdmin):
class LogbookEntryAdmin(TroggleModelAdmin):
prepopulated_fields = {'slug':("title",)}
search_fields = ('title','expedition__year')
date_heirarchy = ('date')
inlines = (PersonTripInline, PhotoInline, QMsFoundInline)
prepopulated_fields = {"slug": ("title",)}
search_fields = ("title", "expedition__year")
date_hierarchy = "date"
# inlines = (PersonLogEntryInline, QMsFoundInline)
class Media:
css = {
"all": ("css/troggleadmin.css",)
}
actions=('export_logbook_entries_as_html','export_logbook_entries_as_txt')
def export_logbook_entries_as_html(modeladmin, request, queryset):
response=downloadLogbook(request=request, queryset=queryset, extension='html')
css = {"all": ("css/troggleadmin.css",)} # this does not exist
actions = ("export_logbook_entries_as_html", "export_logbook_entries_as_txt")
def export_logbook_entries_as_html(self, modeladmin, request, queryset):
response = downloadLogbook(request=request, queryset=queryset, extension="html") # fails, no queryset
return response
def export_logbook_entries_as_txt(modeladmin, request, queryset):
response=downloadLogbook(request=request, queryset=queryset, extension='txt')
def export_logbook_entries_as_txt(self, modeladmin, request, queryset):
response = downloadLogbook(request=request, queryset=queryset, extension="txt") # fails, no queryset
return response
@@ -89,70 +86,89 @@ class PersonExpeditionInline(admin.TabularInline):
class PersonAdmin(TroggleModelAdmin):
search_fields = ('first_name','last_name')
search_fields = ("first_name", "last_name")
inlines = (PersonExpeditionInline,)
class QMAdmin(TroggleModelAdmin):
search_fields = ('found_by__cave__kataster_number','number','found_by__date')
list_display = ('__unicode__','grade','found_by','ticked_off_by')
list_display_links = ('__unicode__',)
list_editable = ('found_by','ticked_off_by','grade')
list_per_page = 20
raw_id_fields=('found_by','ticked_off_by')
search_fields = ("number", "expoyear")
list_display = ("__str__", "grade")
list_display_links = ("__str__",)
# list_editable = ("comment", "page_ref", "grade")
# list_per_page = 20
# raw_id_fields = ("found_by", "ticked_off_by")
class PersonExpeditionAdmin(TroggleModelAdmin):
search_fields = ('person__first_name','expedition__year')
search_fields = ("person__first_name", "expedition__year")
class CaveAdmin(TroggleModelAdmin):
search_fields = ('official_name','kataster_number','unofficial_number')
inlines = (OtherCaveInline,)
search_fields = ("official_name", "kataster_number", "unofficial_number")
extra = 4
class EntranceAdmin(TroggleModelAdmin):
search_fields = ('caveandentrance__cave__kataster_number',)
search_fields = ("caveandentrance__cave__kataster_number",)
class SurvexStationAdmin(TroggleModelAdmin):
search_fields = ("name",)
class SurvexFileAdmin(TroggleModelAdmin):
search_fields = ("path",)
class SurvexDirectoryAdmin(TroggleModelAdmin):
search_fields = (
"path",
"survexdirectory",
)
class DrawingFileAdmin(TroggleModelAdmin):
search_fields = ("dwgname",)
class WalletAdmin(TroggleModelAdmin):
search_fields = ("fpath",)
admin.site.register(DPhoto)
admin.site.register(Cave, CaveAdmin)
admin.site.register(Area)
#admin.site.register(OtherCaveName)
admin.site.register(CaveAndEntrance)
admin.site.register(NewSubCave)
admin.site.register(CaveDescription)
admin.site.register(Entrance, EntranceAdmin)
admin.site.register(CaveSlug)
admin.site.register(SurvexBlock, SurvexBlockAdmin)
admin.site.register(DrawingFile, DrawingFileAdmin)
admin.site.register(Expedition)
admin.site.register(Person,PersonAdmin)
admin.site.register(Person, PersonAdmin)
admin.site.register(SurvexPersonRole)
admin.site.register(PersonExpedition,PersonExpeditionAdmin)
admin.site.register(SurvexDirectory, SurvexDirectoryAdmin)
admin.site.register(SurvexFile, SurvexFileAdmin)
admin.site.register(SurvexStation, SurvexStationAdmin)
admin.site.register(PersonExpedition, PersonExpeditionAdmin)
admin.site.register(LogbookEntry, LogbookEntryAdmin)
#admin.site.register(PersonTrip)
admin.site.register(QM, QMAdmin)
admin.site.register(Survey, SurveyAdmin)
admin.site.register(ScannedImage)
admin.site.register(SurvexStation)
admin.site.register(SurvexScansFolder)
admin.site.register(SurvexScanSingle)
admin.site.register(Wallet, WalletAdmin)
admin.site.register(SingleScan)
admin.site.register(DataIssue)
def export_as_json(modeladmin, request, queryset):
response = HttpResponse(mimetype="text/json")
response['Content-Disposition'] = 'attachment; filename=troggle_output.json'
response = HttpResponse(content_type="text/json")
response["Content-Disposition"] = "attachment; filename=troggle_output.json"
serializers.serialize("json", queryset, stream=response)
return response
def export_as_xml(modeladmin, request, queryset):
response = HttpResponse(mimetype="text/xml")
response['Content-Disposition'] = 'attachment; filename=troggle_output.xml'
response = HttpResponse(content_type="text/xml")
response["Content-Disposition"] = "attachment; filename=troggle_output.xml"
serializers.serialize("xml", queryset, stream=response)
return response
#admin.site.add_action(export_as_xml)
#admin.site.add_action(export_as_json)
admin.site.add_action(export_as_xml)
admin.site.add_action(export_as_json)


@@ -1,5 +1,22 @@
from django.conf import settings
from troggle.core.models import Expedition
from troggle.core.models.troggle import Expedition
"""This is the only troggle-specific 'context processor' that troggle uses
in the processing of Django templates
This seems to mean that every page produced has bundled in its context the complete 'settings' and
the expedition class object, so all templates can do queries on Expedition.
https://betterprogramming.pub/django-quick-tips-context-processors-da74f887f1fc
If it is commented out, the logbookentry page goes crazy and it screws up all the site_media resolutions for CSS files!
Seems to be necessary to make {{settings.MEDIA_URL}} work, which is obvious in retrospect.
It is VITAL that no database operations are done in any context processor, see
https://adamj.eu/tech/2023/03/23/django-context-processors-database-queries/
"""
def troggle_context(request):
return { 'settings':settings, 'Expedition':Expedition }
return {"settings": settings}
# return {"settings": settings, "Expedition": Expedition}
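# A minimal sketch (assumed layout, not copied from troggle's settings.py) of how a context
# processor like this is registered, via the TEMPLATES setting; the dotted path below is an
# assumption about where this module lives:
#
# TEMPLATES = [{
#     "BACKEND": "django.template.backends.django.DjangoTemplates",
#     "APP_DIRS": True,
#     "OPTIONS": {"context_processors": [
#         "django.template.context_processors.debug",
#         "django.template.context_processors.request",
#         "django.contrib.auth.context_processors.auth",
#         "troggle.core.context.troggle_context",  # hypothetical dotted path to the function above
#     ]},
# }]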


@@ -1,43 +0,0 @@
import troggle.settings as settings
import os
import urllib
def urljoin(x, y): return x + "/" + y
def listdir(*path):
try:
strippedpath = [p for p in path if p]
root = os.path.join(settings.FILES, *strippedpath )
l = ""
#l = root + "\n"
isdir = os.path.isdir(root) #This seems to be required for os.path.isdir to work...
#l += str(isdir) + "\n"
for p in os.listdir(root):
if os.path.isdir(os.path.join(root, p)):
l += p + "/\n"
elif os.path.isfile(os.path.join(root, p)):
l += p + "\n"
#Ignore non-files and non-directories
return l
except:
if strippedpath:
c = reduce(urljoin, strippedpath)
else:
c = ""
c = c.replace("#", "%23")
print("FILE: ", settings.FILES + "listdir/" + c)
return urllib.urlopen(settings.FILES + "listdir/" + c).read()
def dirsAsList(*path):
return [d for d in listdir(*path).split("\n") if len(d) > 0 and d[-1] == "/"]
def filesAsList(*path):
return [d for d in listdir(*path).split("\n") if len(d) > 0 and d[-1] != "/"]
def readFile(*path):
try:
f = open(os.path.join(settings.FILES, *path))
except:
f = urllib.urlopen(settings.FILES + "download/" + reduce(urljoin, path))
return f.read()


@@ -0,0 +1,39 @@
[
{"pk": 9010, "model": "auth.user", "fields":
{"username": "expotest", "first_name": "ExpoTest", "last_name": "Caver", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+expo@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}},
{"pk": 9011, "model": "auth.user", "fields":
{"username": "expotestadmin", "first_name": "ExpoTest", "last_name": "Admin", "is_active": true, "is_superuser": true, "is_staff": true, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+expoadmin@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}},
{"model": "auth.user", "pk": 8999, "fields":
{
"email": "philip.sargent+GFN@gmail.com",
"first_name": "Gussie",
"last_name": "Fink-Nottle",
"id": 8999,
"is_active": true,
"is_staff": true,
"is_superuser": true,
"last_login": "2021-01-01 00:00:01+0100",
"password": "pbkdf2_sha256$150000$EbI1VetXC8tM$pHb5Y7af/TCsNeD6H0EwGx4DWB7qyZyq1bUWKytuiTA=",
"username": "gussie",
"date_joined": "2021-01-01 00:00:00+0100"
}},
{"pk": 9000, "model": "auth.user", "fields":
{"username": "oofy", "first_name": "Oofy", "last_name": "Prosser", "is_active": true, "is_superuser": true, "is_staff": true, "last_login": "2021-01-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+oofy@gmail.com", "date_joined": "2021-01-01 00:00:00+0100"}},
{"pk": 9001, "model": "auth.user", "fields":
{"username": "stiffy", "first_name": "Stiffy", "last_name": "Byng", "is_active": true, "is_superuser": true, "is_staff": true, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+stiffy@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}},
{"pk": 9002, "model": "auth.user", "fields":
{"username": "bingo", "first_name": "Bingo", "last_name": "Little", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+bingo@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}},
{"pk": 9003, "model": "auth.user", "fields":
{"username": "spode", "first_name": "Roderick", "last_name": "Spode", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+spode@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}},
{"pk": 9004, "model": "auth.user", "fields":
{"username": "boko", "first_name": "Boko", "last_name": "Fittleworth", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+boko@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}}
]


@@ -0,0 +1,8 @@
[
{"pk": 9010, "model": "auth.user", "fields":
{"username": "expotest", "first_name": "ExpoTest", "last_name": "Caver", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+expo@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}},
{"pk": 9011, "model": "auth.user", "fields":
{"username": "expotestadmin", "first_name": "ExpoTest", "last_name": "Admin", "is_active": true, "is_superuser": true, "is_staff": true, "last_login": "2021-02-01 00:00:00+0100", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+expoadmin@gmail.com", "date_joined": "2021-02-01 00:00:00+0100"}}
]


@@ -0,0 +1,292 @@
[
{"pk": 1, "model": "club.boat", "fields":
{"name": "Skylark", "cuy_boat": true, "berths": 8, "boat_type": "Beneteau First 40.7", "length": "41ft", "notes": "We bought her in June 2016 when she was based in Izola, Slovenia, then brought her home over the course of the 2016 Summer Programme."}},
{"pk": 1, "model": "club.clubrole", "fields":
{"html_description": "Head of the Section: Overall responsibility for all the activities of CUY - authorises all activities, finances and external communication on behalf of the Club; Committee Management: Organisation of CUY Committee Meetings and Elections; Yacht Charter: Liaises with yacht charter companies to arrange yacht bookings for trips; Development: Organisation of long-term development plans for the Club;", "multiple": false, "title": "Commodore", "rank": 1, "short_description": "Chief", "committee_position": true, "club_email": "commodore@cuy.org.uk", "slug": "commodore"}},
{"pk": 2, "model": "club.clubrole", "fields":
{"html_description": "House Officer Support: Authorizes the activities of all house officers (Purser, Social, Webmaster, Publicity and Sponsorship) and ensures they have details of their responsibilities and that they are properly informed and supported in thier positions. Works with the Rear-Commodore House on legal issues and documentation (see below). Manages Club Shop orders.", "multiple": false, "title": "Vice-Commodore House", "rank": 2, "short_description": "Blah", "committee_position": true, "club_email": "vc-house@cuy.org.uk", "slug": "vice-commodore-house"}},
{"pk": 3, "model": "club.clubrole", "fields":
{"html_description": "Sailing Officer Support: Authorizes the activities of all sailing officers (Training and Racing) and ensures they have details of their responsibilities and that they are properly informed and supported in thier positions. Event Management: Manages the CUY program of trips and events by liaising with skippers, charterers and the commodore. Ensures a proper and accurate record is kept of trip and event information both before and after the trip or event. Liases with the Rear-Commodore Sailing about upcoming trips to ensure they are viable and sucessful.", "multiple": false, "title": "Vice-Commodore Sailing", "rank": 2, "short_description": "Blah", "committee_position": true, "club_email": "vc-sailing@cuy.org.uk", "slug": "vice-commodore-sailing"}},
{"pk": 5, "model": "club.clubrole", "fields":
{"html_description": "Legal: Ensures CUY obtains and sustains insurance policies appropriate to Club activities. Monitors details of charter agreements. Manages contractual disputes with charterers. Liases with Club legal contacts. Documentation: Ensures CUY Regulations; CUY Crew Register; Safety Policy; House Style; Skipper Manual; Agenda and Minutes Committee Meetings and any other key club documentation stay up-to-date.\r\n\r\n", "multiple": false, "title": "Rear-Commodore House", "rank": 3, "short_description": "Blah", "committee_position": true, "club_email": "rc-house@cuy.org.uk", "slug": "rear-commodore-house"}},
{"pk": 6, "model": "club.clubrole", "fields":
{"html_description": "Works with VC-Training to ensure a workable programme of practical and theory courses is made for each term. Responsible for liaising with instructors to ensure courses run smoothly.", "multiple": false, "title": "Rear-Commodore Training", "rank": 3, "short_description": "Blah", "committee_position": true, "club_email": "rc-training@cuy.org.uk", "slug": "rear-commodore-training"}},
{"pk": 7, "model": "club.clubrole", "fields":
{"html_description": "Skipper Managament: Ensures skippers of upcoming trips are aware of standard club procedures detailed in the CUY Manual and that they have the necessary information and equipment. Ensures that the crew have completed Crew Registers and paid Membership Fees before going on trips. Ensures records are taken of travel arrangements to and from trip or event locations. Upon completion of trip ensures expenses and defect reports are collated.", "multiple": false, "title": "Rear-Commodore Sailing", "rank": 3, "short_description": "Blah", "committee_position": true, "club_email": "rc-sailing@cuy.org.uk", "slug": "rear-commodore-sailing"}},
{"pk": 8, "model": "club.clubrole", "fields":
{"html_description": "Financial management; processing all payments and receipts for activities and permenent funds. Preparing the Financial Statement for termly audit and end of year Summary of Accounts. Membership; management of membership in liasion with Trip/Event organisers, Rear-Commodore Sailing, and the DB Admin. Grants applications; preparing funding applications for the Sports and Societies syndicates, and other funding source that may be available. Spending plans & strategy; preparing and presenting to the Committee financial forecasts and strategies for the investment and long term financial future of the Club", "multiple": false, "title": "Purser", "rank": 4, "short_description": "Blah", "committee_position": true, "club_email": "purser@cuy.org.uk", "slug": "purser"}},
{"pk": 9, "model": "club.clubrole", "fields":
{"html_description": "Social programme; submission of dates for socials to the Vice-Commodore Sailing, and planning of socials, including end of term dinner. New & potential members introduction; acting at socials to welcome new & potential members and inform them about club activities.", "multiple": false, "title": "Social Officer", "rank": 5, "short_description": "Blah", "committee_position": true, "club_email": "social@cuy.org.uk", "slug": "social-officer"}},
{"pk": 10, "model": "club.clubrole", "fields":
{"html_description": "Organising RYA Practical Training courses.", "multiple": false, "title": "Practical Training Officer", "rank": 5, "short_description": "Blah", "committee_position": true, "club_email": "practical@cuy.org.uk", "slug": "practical-training-officer"}},
{"pk": 11, "model": "club.clubrole", "fields":
{"html_description": "CUY Racing Squad training and development; improving racing knowledge and skills. Race selection & entry management. Varsity Yacht Race; organising an annual race with Oxford as part of an RORC/JOG or similar offshore/coastal/inshore race.", "multiple": false, "title": "Racing Officer", "rank": 5, "short_description": "Blah", "committee_position": true, "club_email": "racing@cuy.org.uk", "slug": "racing-officer"}},
{"pk": 13, "model": "club.clubrole", "fields":
{"html_description": "Webmaster; control and maintenance of style, scripts and code validity. Liasion with SRCF host; ensuring compliance with regulations and maintenance of filespace. DB Admin; development and administration of CUY Database and associated e-mail lists. Maintenance of Photos section of the website.", "multiple": false, "title": "Webmaster and Database Admin", "rank": 6, "short_description": "Blah", "committee_position": true, "club_email": "webgeek@cuy.org.uk", "slug": "webmaster-and-database-admin"}},
{"pk": 14, "model": "club.clubrole", "fields":
{"html_description": "Boat management. Is responsible for the general upkeep of CUY yachts so that they are ready and safe to be sailed. Ensures that the correct equipment and information on its use is onboard and in the correct locations. Also liaises with VC-Sailing in order to create a workable trip plan.", "multiple": false, "title": "Bosun", "rank": 4, "short_description": "Blah", "committee_position": true, "club_email": "bosun@cuy.org.uk", "slug": "bosun"}},
{"pk": 4, "model": "club.clubrole", "fields":
{"html_description": "Management and delegation of tasks to the Practical Training Officer and Theory Training Officer. Development and Evaluation of the CUY Training Scheme and courses run within the scheme. Ensuring compliance with CUY standards as set out in the training section of the CUY Manual. Training Programme; ensuring submission of dates to the Vice-Commodore Sailing for all training activities, with regard to the advice given in the Training section of the CUY Manual. Overseeing the editing and expanding the website training section.", "multiple": false, "title": "Vice-Commodore Training", "rank": 2, "short_description": "Blah", "committee_position": true, "club_email": "vc-training@cuy.org.uk", "slug": "vice-commodore-training"}},
{"pk": 15, "model": "club.clubrole", "fields":
{"html_description": "Management and delegation of tasks to the Practical Training Officer and Theory Training Officer. Development and Evaluation of the CUY Training Scheme and courses run within the scheme. Ensuring compliance with RYA and CUY standards as set out in the training section of the CUY Manual. Training Programme; ensuring submission of dates to the Vice-Commodore Sailing for all training activities, with regard to the advice given in the Training section of the CUY Manual. Overseeing the editing and expanding the website training section. Management of the RYA Practical and Shorebased training centres.", "multiple": false, "title": "RYA Principal", "rank": 4, "short_description": "Blah", "committee_position": true, "club_email": "rya-principal@cuy.org.uk", "slug": "rya-principal"}},
{"pk": 12, "model": "club.clubrole", "fields":
{"html_description": "Publicity: Publicity articles & campaigns; organising Freshers' Fair and Squash as well as ongoing publicity throughout the year. College Reps scheme; implementation and administration of College Reps scheme as a route of dissemination for publicity material and attracting new members. Sponsorship & funding in co-ordination with the rest of the CUY Committee", "multiple": false, "title": "Publicity and Sponsorship Officer", "rank": 6, "short_description": "Blah", "committee_position": true, "club_email": "sponsorship@cuy.org.uk", "slug": "publicity-and-sponsorship-officer"}},
{"pk": 16, "model": "club.clubrole", "fields":
{"html_description": "Blah", "multiple": true, "title": "Skipper", "rank": 8, "short_description": "Blah", "committee_position": true, "club_email": "", "slug": "skipper"}},
{"pk": 17, "model": "club.clubrole", "fields":
{"html_description": "Blah", "multiple": true, "title": "Instructor", "rank": 7, "short_description": "Blah", "committee_position": true, "club_email": "instructors@cuy.org.uk", "slug": "instructors"}},
{"pk": 5, "model": "club.eventtype", "fields":
{"name": "Other", "default_role": 4,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "shorebased"}},
{"pk": 4, "model": "club.eventtype", "fields":
{"name": "Theory Training", "default_role": 5,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "shorebased"}},
{"pk": 3, "model": "club.eventtype", "fields":
{"name": "Practical Training", "default_role": 5,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "atsea"}},
{"pk": 2, "model": "club.eventtype", "fields":
{"name": "Race", "default_role": 4,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "atsea"}},
{"pk": 6, "model": "club.eventtype", "fields":
{"name": "Social", "default_role": null,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "social"}},
{"pk": 1, "model": "club.eventtype", "fields":
{"name": "Cruising", "default_role": 4,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "atsea"}},
{"pk": 7, "model": "club.eventtype", "fields":
{"name": "Trip", "default_role": 4,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "atsea"}},
{"pk": 8, "model": "club.eventtype", "fields":
{"name": "Adventurous", "default_role": 4,
"default_thumbnail": "images/HappySailing_square.jpeg",
"event_type": "atsea"}},
{"pk": 1, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "rya-start-yachting", "title": "RYA Start Yachting"}},
{"pk": 2, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "rya-day-skipper-theory", "title": "RYA Day Skipper Theory"}},
{"pk": 3, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "rya-day-skipper-practical", "title": "RYA Day Skipper Practical"}},
{"pk": 4, "model": "club.qualification", "fields":
{"rya": false, "qualification_type": "", "slug": "vhf-radio-licence", "title": "VHF SRC Radio Licence"}},
{"pk": 5, "model": "club.qualification", "fields":
{"rya": false, "qualification_type": "", "slug": "first-aid-certificate", "title": "First Aid Certificate",
"expires": true, "length": 3}},
{"pk": 6, "model": "club.qualification", "fields":
{"rya": false, "qualification_type": "", "slug": "cuy-first-mate", "title": "CUYC First Mate"}},
{"pk": 7, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "rya-mca-costal-skipper-theory", "title": "RYA Costal Skipper/Yachtmaster Theory"}},
{"pk": 8, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "rya-compcrew", "title": "RYA Competent Crew"}},
{"pk": 9, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "rya-costal-skipper-practical-course", "title": "RYA Costal Skipper Practical Course"}},
{"pk": 10, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "rya-mca-costal-skipper-certificate-of-competence", "title": "RYA / MCA Yachtmaster Costal Certificate of Competence"}},
{"pk": 11, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "rya-mca-yachtmaster-offshore-certificate-of-compet", "title": "RYA / MCA Yachtmaster Offshore Certificate of Competence"}},
{"pk": 12, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "rya-mca-yachtmaster-ocean-certificate-of-competenc", "title": "RYA / MCA Yachtmaster Ocean Certificate of Competence"}},
{"pk": 13, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "rya-diesel-engine-course", "title": "RYA Diesel Engine Course"}},
{"pk": 14, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "rya-radar-course", "title": "RYA Radar Course"}},
{"pk": 15, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "rya-sea-survival-course", "title": "RYA Sea Survival Course"}},
{"pk": 16, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "rya-yachtmaster-ocean-theory", "title": "RYA Yachtmaster Ocean Theory"}},
{"pk": 17, "model": "club.qualification", "fields":
{"rya": false, "qualification_type": "", "slug": "cuy-skipper", "title": "CUYC Skipper"}},
{"pk": 18, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "cuy-examiner-yacht", "title": "RYA Yachtmaster Examiner",
"expires": false}},
{"pk": 19, "model": "club.qualification", "fields":
{"rya": false, "qualification_type": "", "slug": "cuy-sail-trim", "title": "CUYC Sail Trim"}},
{"pk": 20, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-commercial", "title": "RYA Commercial Endorsement",
"expires": true, "length": 5}},
{"pk": 21, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-ppr-course", "title": "RYA Professional Practices and Responsibilities"}},
{"pk": 22, "model": "club.qualification", "fields":
{"rya": false, "qualification_type": "", "slug": "cuy-ml5", "title": "MCA ML5 Medical Certificate",
"expires": true, "length": 5}},
{"pk": 23, "model": "club.qualification", "fields":
{"rya": false, "qualification_type": "", "slug": "cuy-eng1", "title": "MCA ENG.1 Medical Certificate",
"expires": true, "length": 2}},
{"pk": 24, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "cuy-instruct-cruise", "title": "RYA Cruising Instructor",
"expires": true, "length": 5}},
{"pk": 25, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "Practical", "slug": "cuy-instruct-yacht", "title": "RYA Yachtmaster Instructor",
"expires": true, "length": 5}},
{"pk": 26, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-instruct-shore", "title": "RYA Shorebased Instructor",
"expires": false}},
{"pk": 27, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-instruct-diesel", "title": "RYA Diesel Engine Instructor",
"expires": false}},
{"pk": 28, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-instruct-fistaid", "title": "RYA First Aid Instructor",
"expires": false}},
{"pk": 29, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-instruct-survival", "title": "RYA Sea Survival Instructor",
"expires": false}},
{"pk": 30, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-instruct-radar", "title": "RYA Radar Instructor",
"expires": false}},
{"pk": 31, "model": "club.qualification", "fields":
{"rya": true, "qualification_type": "", "slug": "cuy-instruct-vhf", "title": "RYA VHF Instructor",
"expires": false}},
{"pk": 1, "model": "club.role", "fields":
{"event_types": [5, 3, 2, 1], "name": "Skipper", "description": "Skipper"}},
{"pk": 2, "model": "club.role", "fields":
{"event_types": [5, 3, 2, 1], "name": "First Mate", "description": "First Mate"}},
{"pk": 3, "model": "club.role", "fields":
{"event_types": [5, 3, 2, 1], "name": "Watch Leader", "description": "Watch leader"}},
{"pk": 4, "model": "club.role", "fields":
{"event_types": [5, 3, 2, 1], "name": "Crew", "description": "crew"}},
{"pk": 5, "model": "club.role", "fields":
{"event_types": [5, 4, 3, 2], "name": "Student", "description": "student"}},
{"pk": 6, "model": "club.role", "fields":
{"event_types": [5, 4, 3, 2], "name": "Instructor", "description": "Instructor"}},
{"pk": 7, "model": "club.role", "fields":
{"event_types": [5, 2, 1], "name": "Helm", "description": "Helm"}},
{"pk": 8, "model": "club.role", "fields":
{"event_types": [5, 2], "name": "Bow", "description": "Bowman"}},
{"pk": 9, "model": "club.role", "fields":
{"event_types": [5, 2], "name": "Mast", "description": "Mastman"}},
{"pk": 10, "model": "club.role", "fields":
{"event_types": [5, 2], "name": "Pit", "description": "Pit."}},
{"pk": 11, "model": "club.role", "fields":
{"event_types": [5, 2], "name": "Trim", "description": "Trim"}},
{"pk": 12, "model": "club.role", "fields":
{"event_types": [5, 2], "name": "Main Trim", "description": "Main trim."}},
{"pk": 13, "model": "club.role", "fields":
{"event_types": [5, 2], "name": "Navigator", "description": "Navigator"}},
{"pk": 14, "model": "club.role", "fields":
{"event_types": [5, 2], "name": "Tactics", "description": "Tactics"}},
{"pk": 15, "model": "club.role", "fields":
{"event_types": [5, 3, 2, 1, 7, 8], "name": "Second Mate", "description": "Second Mate is usually third in charge, after the Skipper and the First Mate."}},
{"pk": 16, "model": "club.role", "fields":
{"event_types": [6], "name": "Drinker", "description": "Someone who will drink."}},
{"pk": 17, "model": "club.role", "fields":
{"event_types": [8, 1], "name": "Cook", "description": "Cooks food."}},
{"pk": 3, "model": "club.samplewebpage", "fields":
{"markup": "<h1>(% event_name %)</h1>\r\n\r\n\r\nBlah Practical Training trip example webpage", "slug": "practical-training-index", "description": "Default page for a practical training trip.", "title": "Practical Training index"}},
{"pk": 4, "model": "club.samplewebpage", "fields":
{"markup": "<h1>(% event_name %)</h1>\r\n\r\nTheory trip\r\n example webpage", "slug": "theory-training-index", "description": "ehcr", "title": "Theory Training Index"}},
{"pk": 5, "model": "club.samplewebpage", "fields":
{"markup": "<h1>(% event_name %)</h1>\r\n\r\n\r\nBlah Social example webpage", "slug": "social-index", "description": "Balh", "title": "Social Index"}},
{"pk": 6, "model": "club.samplewebpage", "fields":
{"markup": "<h1> Kit Page</h1>\r\n example webpage", "slug": "kit", "description": "Kit template page", "title": "Kit"}},
{"pk": 7, "model": "club.samplewebpage", "fields":
{"markup": "<h1>Crew!</h1>\r\n\r\n example webpage", "slug": "crew", "description": "Crew page", "title": "Crew"}},
{"pk": 2, "model": "club.samplewebpage", "fields":
{"markup": "<h1>(% event_name %)</h1>\r\n\r\n\r\nBlah Racing trip example webpage", "slug": "racing-index", "description": "Default Race trip index page.", "title": "Racing Index"}},
{"pk": 1, "model": "club.samplewebpage", "fields":
{"markup": "<h1>(% event_name%)<h1>\r\n\r\nBlah blah Cruising trip example webpage blah.", "slug": "cruising-index", "description": "Default cruising trip index page.", "title": "Cruising Index"}},
{"pk": 1, "model": "photologue.photosize", "fields":
{"name": "thumbnail", "watermark": null, "increment_count": false, "effect": null, "crop": true, "height": 75, "width": 75, "upscale": false, "pre_cache": true, "quality": 90}},
{"pk": 2, "model": "photologue.photosize", "fields":
{"name": "small", "watermark": null, "increment_count": false, "effect": null, "crop": false, "height": 150, "width": 150, "upscale": false, "pre_cache": true, "quality": 90}},
{"pk": 3, "model": "photologue.photosize", "fields":
{"name": "display", "watermark": null, "increment_count": true, "effect": null, "crop": false, "height": 500, "width": 500, "upscale": false, "pre_cache": false, "quality": 90}},
{"pk": 4, "model": "photologue.photosize", "fields":
{"name": "large", "watermark": null, "increment_count": true, "effect": null, "crop": false, "height": 1000, "width": 1000, "upscale": false, "pre_cache": false, "quality": 90}}
]


@@ -0,0 +1,500 @@
[
{"model": "club.boat", "pk": 8000, "fields":
{
"berths": 4,
"boat_type": null,
"cuy_boat": 0,
"id": 8000,
"length": "35",
"name": "Goblin",
"notes": "We Didn't Mean to Go to Sea is the seventh book in Arthur Ransome's Swallows and Amazons series of children's books.\r\n\r\nThe book features a small sailing cutter, the Goblin, which is almost identical to Ransome's own boat Nancy Blackett. Ransome sailed Nancy Blackett across to Flushing by the same route as part of his research for the book. The navigational detail and the geography are both correct for the period when the story is set, unlike other books in the series."
}},
{"model": "club.boat", "pk": 8001, "fields":
{
"berths": 0,
"boat_type": "dinghy",
"cuy_boat": 0,
"id": 8001,
"length": "13",
"name": "Swallow",
"notes": "Ransome and Ernest Altounyan bought two small dinghies called Swallow and Mavis. Ransome kept Swallow until he sold it a number of years later."
}},
{"model": "club.boat", "pk": 8002, "fields":
{
"berths": 0,
"boat_type": "dinghy",
"cuy_boat": 0,
"id": 8002,
"length": "13",
"name": "Amazon",
"notes": "the Blackett children (Nancy and Peggy), who sail a dinghy named Amazon. \r\n\r\nSwallows and Amazons contains no sorcery; its plot is plausible, its characters ordinary children. Therein lies its enduring magic. A celebration of friendship, imagination, fair play, and exploration, Swallows and Amazons inspires even the most landlocked kid to dream of messing about in boats, building fires, camping out and navigating by the stars"
}},
{"model": "club.webpagecategory", "pk": 8000, "fields":
{
"id": 8000,
"name": "Yachts",
"slug": "yachts"
}},
{"model": "club.webpagecategory", "pk": 8001, "fields":
{
"id": 8001,
"name": "Club",
"slug": "club"
}},
{"model": "club.webpagecategory", "pk": 8002, "fields":
{
"id": 8002,
"name": "Summer",
"slug": "summer"
}},
{"model": "club.webpagecategory", "pk": 8002, "fields":
{
"id": 8003,
"name": "Sailing",
"slug": "sailing"
}},
{"model": "club.webpagecategory_photos", "pk": 8000, "fields":
{
"clubphoto_id": 7000,
"id": 5000,
"webpagecategory_id": 8000
}},
{"model": "club.clubphoto", "pk": 7000, "fields":
{
"id": 7000,
"name": "IRPCS 4.4",
"num_views": 0,
"origional_image": "images/training/exams/IRPCS-4-4.png"
}},
{"model": "club.webpage", "pk": 9000, "fields":
{
"category_id": 8000,
"description": "Current Yacht",
"edited": 1,
"event_id": null,
"id": 9000,
"index": 1,
"markup": "<h1>Skylark</h1>\r\n<p><strong> \r\n<table border=\"0\">\r\n<tbody>\r\n<tr>\r\n<td>\r\n<p><strong>Skylark, a Beneteau First 40.7, is our main and largest club yacht. </strong>We bought her in June 2016 when she was based in Izola, Slovenia, then brought her home over the course of the 2016 Summer Programme. She's been to Croatia, Greece, Italy, Spain and France on the way home - along with countless other stops along the way.</p>\r\n<p>Since arriving in the UK, she's spent time on the East and South coasts, pottering round the Solent or across the Channel, while Summer Programmes have taken her to the Norwegian Fjords, round the West Coast of Ireland, and all the way up to the Faeroes and Shetland.</p><img src='/site-media/images/training/exams/IRPCS-4-4.png'>",
"ordering": 10,
"slug": "yacht1",
"title": "Skylark Yacht"
}},
{"model": "club.webpage", "pk": 9001, "fields":
{ "category_id": 8001,
"description": "Safeguarding Policy",
"edited": 1,
"event_id": null,
"id": 9001,
"index": 1,
"markup": "<h1>Safeguarding Policy</h1><p>Content is here in the main backup database</p>",
"ordering": 10,
"slug": "safeguarding-policy",
"title": "Safeguarding Policy"
}},
{"model": "club.webpage", "pk": 9002, "fields":
{ "category_id": 8001,
"description": "Complaints",
"edited": 1,
"event_id": null,
"id": 9002,
"index": 1,
"markup": "<h1>Complaints</h1><p>Content is here in the main backup database</p>",
"ordering": 10,
"slug": "complaints",
"title": "Complaints"
}},
{"model": "club.webpage", "pk": 9003, "fields":
{ "category_id": 8001,
"description": "Other Sailing Opportunities in Cambridge",
"edited": 1,
"event_id": null,
"id": 9003,
"index": 1,
"markup": "<h1>Other Sailing Opportunities in Cambridge</h1><p>Content is here in the main backup database</p>",
"ordering": 10,
"slug": "other-sailing-in-camb",
"title": "Other Sailing Opportunities in Cambridge"
}},
{"model": "club.webpage", "pk": 9004, "fields":
{ "category_id": 8001,
"description": "CUYC Privacy Notice",
"edited": 1,
"event_id": null,
"id": 9004,
"index": 1,
"markup": "<h1>CUYC Privacy Notice</h1><p>Content is here in the main backup database</p>",
"ordering": 10,
"slug": "privacy-notice",
"title": "CUYC Privacy Notice"
}},
{"model": "club.webpage", "pk": 9005, "fields":
{ "category_id": 8003,
"description": "FAQ",
"edited": 1,
"event_id": null,
"id": 9005,
"index": 0,
"markup": "<h1>FAQ</h1><p>Content is here in the main backup database</p>",
"ordering": 10,
"slug": "faq",
"title": "FAQ" }},
{"model": "club.webpage", "pk": 9006, "fields":
{ "category_id": 8002,
"description": "Summer",
"edited": 1,
"event_id": null,
"id": 9006,
"index": 1,
"markup": "<h1>Summer</h1><p>Content is here in the main backup database</p>",
"ordering": 10,
"slug": "summer",
"title": "Summer"
}},
{"pk": 9000, "model": "auth.user", "fields":
{"username": "oofy", "first_name": "Oofy", "last_name": "Prosser", "is_active": true, "is_superuser": true, "is_staff": true, "last_login": "2021-01-01 00:00:00", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+oofy@gmail.com", "date_joined": "2021-01-01 00:00:00"}},
{"pk": 9001, "model": "auth.user", "fields":
{"username": "stiffy", "first_name": "Stiffy", "last_name": "Byng", "is_active": true, "is_superuser": true, "is_staff": true, "last_login": "2021-02-01 00:00:00", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+stiffy@gmail.com", "date_joined": "2021-02-01 00:00:00"}},
{"pk": 9002, "model": "auth.user", "fields":
{"username": "bingo", "first_name": "Bingo", "last_name": "Little", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+bingo@gmail.com", "date_joined": "2021-02-01 00:00:00"}},
{"pk": 9003, "model": "auth.user", "fields":
{"username": "spode", "first_name": "Roderick", "last_name": "Spode", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+spode@gmail.com", "date_joined": "2021-02-01 00:00:00"}},
{"pk": 9004, "model": "auth.user", "fields":
{"username": "boko", "first_name": "Boko", "last_name": "Fittleworth", "is_active": true, "is_superuser": false, "is_staff": false, "last_login": "2021-02-01 00:00:00", "groups": [], "user_permissions": [], "password": "pbkdf2_sha256$150000$I9wNXhHCAaHo$0ncTIJ7G+3bSaKHg7RD3ZG2a/4v7cG1bjovq9BiCyA4=", "email": "philip.sargent+boko@gmail.com", "date_joined": "2021-02-01 00:00:00"}},
{"model": "club.member", "pk": 9000, "fields":
{"user": 9000, "title": "Millionaire", "email": "philip.sargent+oofy@gmail.com", "member_state": "active", "nice": "nice", "nice_ref": null, "member_type": "senior", "affiliation": "", "affiliation_other": null, "portrait": "", "committee_email_prefix": "oofy", "bio": "Alexander Charles 'Oofy' Prosser is the richest member of the Drones Club, he is also a friend of Bertie Wooster.", "credit_rating": "ok", "crsid": null}},
{"model": "club.member", "pk": 9001, "fields":
{"user": 9001, "title": "Niece and ward of Sir Watkyn Bassett", "email": "philip.sargent+stiffy@gmail.com", "member_state": "active", "nice": "nice", "nice_ref": null, "member_type": "affiliate", "affiliation": "student", "affiliation_other": null, "portrait": "", "committee_email_prefix": "stiffy", "bio": "Stephanie 'Stiffy' Byng is the niece and ward of Sir Watkyn Bassett, she initially lives with him in Totleigh Towers. She is short and has blue eyes. She wears a wind-swept hairstyle, and has an Aberdeen terrier named Bartholomew. Stiffy often gets bright ideas that end up making trouble for others, and she is not above using blackmail to induce Bertie Wooster to do errands for her.", "credit_rating": "good", "crsid": null}},
{"model": "club.member", "pk": 9002, "fields":
{"user": 9002, "title": "Described as long and thin", "email": "philip.sargent+bingo@gmail.com", "member_state": "active", "nice": "nice", "nice_ref": null, "member_type": "senior", "affiliation": "unknown", "affiliation_other": null, "portrait": "", "committee_email_prefix": "bingo", "bio": "Bingo, who has an impulsive and romantic nature, falls in love with numerous women in quick succession, generally pursuing an absurd scheme to woo his latest love interest and invariably causing problems for his pal Bertie", "credit_rating": "ok", "crsid": null}},
{"model": "club.member", "pk": 9003, "fields":
{"user": 9003, "title": "Dictator", "email": "philip.sargent+spode@gmail.com", "member_state": "active", "nice": "bad", "nice_ref": "fascist tendences", "member_type": "unknown", "affiliation": "external", "affiliation_other": null, "portrait": "", "committee_email_prefix": "spode", "bio": "The leader of a fascist group in London called the Saviours of Britain, also known as the Black Shorts.", "credit_rating": "good", "crsid": null}},
{"model": "club.member", "pk": 9004, "fields":
{"user": 9004, "title": "An author with a unique dress sense", "email": "philip.sargent+boko@gmail.com", "member_state": "active", "nice": "nice", "nice_ref": null, "member_type": "senior", "affiliation": "postdoc", "affiliation_other": null, "portrait": "", "committee_email_prefix": "boko", "bio": "According to Bertie, after Jeeves first saw him, Jeeves winced and tottered off to the kitchen, probably to pull himself together with cooking sherry. Boko is engaged to Zenobia 'Nobby' Hopwood", "credit_rating": "ok", "crsid": null}},
{"model": "club.article", "pk": 9000, "fields":
{"title": "Blood orange and Campari steamed pudding", "publish": "2021-02-01 00:00:00", "hide": false,
"author": 9000, "thumbnail": "images/training/exams/IRPCS-3-6.png", "slug":"blood_orange_campari",
"short_summary": "A recipe for a sharp and delicious pudding",
"tease": "Put the orange segments and pomegranate seeds in a bowl with the golden syrup, Campari and gin",
"body": "This updated take on the traditional steamed pudding stars blood oranges and Campari. It can even be cooked in the microwave for a quick and easy hack. Serve with proper custard."}},
{"model": "club.article", "pk": 9001, "fields":
{"title": "Orange-scented brioche pudding", "publish": "2021-02-01 00:00:00", "hide": false,
"author": 9001, "thumbnail": "images/training/exams/IRPCS-3-5.png", "slug":"orange_brioche",
"short_summary": "A fragrant bread and butter pudding.",
"tease": "Put the sultanas and Grand Marnier into a small saucepan, bring to the boil and simmer",
"body": "An old-fashioned bread and butter pudding with a fragrant flourish. You can get ready-sliced long brioche loaves, which makes life simpler, but if you need to get out a bread knife yourself, just try to slice thinly. Any good unchunky marmalade would do. I think this is better warm rather than hot straight from the oven."}},
{"model": "club.article", "pk": 9002, "fields":
{"title": "Upside-down orange pudding", "publish": "2021-02-01 00:00:00", "hide": true,
"author": 9002, "thumbnail": "images/training/exams/IRPCS-3-5.png", "slug":"upside_orange",
"short_summary": "Very yummy.",
"tease": "Yum",
"body": "If you find puddings a bit heavy, you'll love this light upside-down pudding. And it's easy to make too."}},
{"model": "club.article", "pk": 9003, "fields":
{"title": "Hot Citrus Pudding", "publish": "2021-02-01 00:00:00", "hide": false,
"author": 9001, "thumbnail": "images/training/exams/IRPCS-3-6.png", "slug":"hot_citrus",
"short_summary": "Although this pudding is served hot, it is just as nice cold. ",
"tease": "Mind you, I doubt if there will be any left over.",
"body": "There are two main types of oranges: sweet oranges and bitter (Seville) oranges. The former can be thick- or thin- skinned, with or without seeds, and has sweet-tasting orange or red-flecked flesh. Bitter oranges have aromatic dimpled skin with very bitter pith and very sour, pale-orange flesh. They always contain seeds."}},
{"model": "club.article", "pk": 9004, "fields":
{"title": "Self-saucing Jaffa pudding", "publish": "2021-02-01 00:00:00", "hide": false,
"author": 9001, "thumbnail": "images/training/exams/IRPCS-4-1.png", "slug":"jaffa_saucing",
"short_summary": "An intense chocolate orange sponge bake. ",
"tease": "Yum. This intense chocolate orange sponge bake with thick sauce is about as indulgent as a good pudding gets.",
"body": "Mix ½ pint boiling water with sugar and cocoa then pour this over the batter. Return the pot to the slow cooker base, cover and cook on High for 3 hours until firm and risen."}},
{"model": "club.article", "pk": 9005, "fields":
{"title": "Terry's Chocolate Orange Melt In The Middle Pudding", "publish": "2021-02-01 00:00:00", "hide": false,
"author": 9001, "thumbnail": "images/training/exams/IRPCS-4-2.png", "slug":"chocolate_orange",
"short_summary": "If you are fan of Chocolate Orange this is the pud for you.",
"tease": "Yum. a beautifully light chocolate sponge pudding.",
"body": "This beautifully light chocolate sponge pudding is encased around a whole Terry's Chocolate Orange and when served straight from the oven will create a gooey melt in the middle chocolate centre. This pudding is a great alternative to the traditional Christmas pudding or a deliciously indulgent finale to a weekend roast with the family"}},
{"model": "club.affiliationcheck", "pk": 9000, "fields":
{"member": 9000, "claim_date": "2021-02-01 00:00:01", "claim": "alum", "confirmed": false, "confirmation_type": null, "confirmed_by": null, "confirmed_date": null}},
{"model": "club.affiliationcheck", "pk": 9001, "fields":
{"member": 9001, "claim_date": "2021-02-01 00:00:01", "claim": "affiliate", "confirmed": false, "confirmation_type": null, "confirmed_by": null, "confirmed_date": null}},
{"model": "club.affiliationcheck", "pk": 9002, "fields":
{"member": 9002, "claim_date": "2021-02-01 00:00:01", "claim": "senior", "confirmed": true, "confirmation_type": null, "confirmed_by": null, "confirmed_date": null}},
{"model": "club.affiliationcheck", "pk": 9003, "fields":
{"member": 9003, "claim_date": "2021-02-01 00:00:01", "claim": "unknown", "confirmed": false, "confirmation_type": null, "confirmed_by": null, "confirmed_date": null}},
{"model": "club.affiliationcheck", "pk": 9004, "fields":
{"member": 9004, "claim_date": "2021-02-01 00:00:01", "claim": "senior", "confirmed": false, "confirmation_type": null, "confirmed_by": null, "confirmed_date": null}},
{"model": "club.elected", "pk": 5000, "fields":
{"member": 9000, "elected_until": "", "club_role": 9000
}},
{"model": "club.elected", "pk": 5001, "fields":
{"member": 9001, "elected_until": "", "club_role": 16
}},
{"model": "club.elected", "pk": 5002, "fields":
{"member": 9001, "elected_until": "", "club_role": 17
}},
{"model": "club.award", "pk": 6000, "fields":
{"member": 9001, "award_date": "2000-01-01", "qualification": 11
}},
{"model": "club.award", "pk": 6001, "fields":
{"member": 9002, "award_date": "2000-01-01", "qualification": 11
}},
{"model": "club.award", "pk": 6002, "fields":
{"member": 9004, "award_date": "2000-01-01", "qualification": 3
}},
{"model": "club.award", "pk": 6003, "fields":
{"member": 9000, "award_date": "2019-03-10", "qualification": 5
}},
{"model": "club.clubrole", "pk": 9000, "fields":
{"title": "Drunken sailor", "slug": "drunk_sailor", "rank": 100, "multiple": true, "club_email": "", "short_description": "Traditional crew role", "html_description": "In the scuppers, early in the morning.", "committee_position": false, "division": null}},
{"model": "club.crewregister", "pk": 10000, "fields":
{"member": 9000,
"encoded": true,
"dob": "1920-02-01",
"gender": "M",
"cambridge_address": "The Drones Club, London",
"vacation_landline": "01632 960374",
"kin1_name": "Barmy Fotheringay-Phipps ",
"kin1_address": "The Drones Club, London",
"kin1_phone": "01632 960620",
"log": 20,
"days": 3,
"seasickness": "severe",
"can_swim": true,
"accepted_conditions": true,
"checked_up_to_date": true,
"checked_date": "2021-02-01 00:00:02"
}},
{"model": "club.crewregister", "pk": 10001, "fields":
{"member": 9001,
"encoded": true,
"dob": "1920-02-01",
"gender": "F",
"cambridge_address": "Totleigh Towers",
"vacation_landline": "01223 496 0551",
"kin1_name": "Sir Watkyn Bassett",
"kin1_address": "Totleigh Towers. (All this detail is because there a minimum set of fields to be completed.)",
"kin1_phone": "01223 496 0551",
"log": 450,
"days": 45,
"seasickness": "mild",
"can_swim": true,
"accepted_conditions": true,
"checked_up_to_date": true,
"checked_date": "2021-02-01 00:00:02"
}},
{"model": "club.event", "pk": 20000, "fields":
{"name": "Spring in the Arctic",
"slug": "spring-in-the-arctic",
"state": "public",
"event_type": 1,
"organiser": 9001, "shore_contact": 9002,
"start_date": "2031-03-01 00:00:00",
"end_date": "2031-03-03 00:00:00",
"added_date": "2021-02-01 12:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision1.png",
"photos": [7000],
"spaces": 10, "boats": [8001],
"short_summary": "A wonderfully refreshing trip among the ice floes.",
"summary": "This is going to be the most amazing trip."}},
{"model": "club.eventsettings", "pk": 20000, "fields": {
"event": 20000,
"show_event_in_progress": true
}},
{"model": "club.event", "pk": 20001, "fields":
{"name": "Spring in the Med",
"slug": "spring-in-the-med",
"state": "public",
"event_type": 1,
"organiser": 9004, "shore_contact": 9001,
"start_date": "2031-03-11 00:00:00",
"end_date": "2031-03-13 00:00:00",
"added_date": "2021-02-01 12:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision2.png",
"photos": [7000],
"spaces": 8, "boats": [8001],
"short_summary": "A joyful celebration of spring flowers in the Cylades.",
"summary": "This is going to be the most amazing trip."}},
{"model": "club.eventsettings", "pk": 20001, "fields": {
"event": 20001,
"show_event_in_progress": true
}},
{"model": "club.event", "pk": 20002, "fields":
{"name": "Spring in the Solent",
"slug": "spring-in-the-solent",
"state": "public",
"event_type": 1,
"organiser": 9004, "shore_contact": 9001,
"start_date": "2031-03-21 00:00:00",
"end_date": "2031-03-23 00:00:00",
"added_date": "2021-02-01 12:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision3.png",
"photos": [7000],
"spaces": 8, "boats": [8001],
"short_summary": "A rainy and blustery wet week discovering how to do tidal calculations at night.",
"summary": "This is going to be the most amazing trip."}},
{"model": "club.eventsettings", "pk": 20002, "fields": {
"event": 20002,
"show_event_in_progress": true
}},
{"model": "club.event", "pk": 20003, "fields":
{"name": "Early Summer in the Med",
"slug": "early-summer-in-the-med",
"state": "public",
"event_type": 1,
"organiser": 9004, "shore_contact": 9001,
"start_date": "2031-05-11 00:00:00",
"end_date": "2031-06-13 00:00:00",
"added_date": "2021-02-01 12:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision1.png",
"photos": [7000],
"spaces": 18, "boats": [8001],
"short_summary": "Sheer hedonism in the Cylades.",
"summary": "This is going to be the most amazing trip: a flotilla of joyfulness."}},
{"model": "club.eventsettings", "pk": 20003, "fields": {
"event": 20003,
"show_event_in_progress": true
}},
{"model": "club.event", "pk": 20004, "fields":
{"name": "Summer in the Med",
"slug": "summer-in-the-med",
"state": "public",
"event_type": 1,
"organiser": 9004, "shore_contact": 9001,
"start_date": "2031-06-11 00:00:00",
"end_date": "2031-07-13 00:00:00",
"added_date": "2021-02-01 12:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision2.png",
"photos": [7000],
"spaces": 18, "boats": [8001],
"short_summary": "The Dodecanese is spectacularly beautiful at this time of year.",
"summary": "This is going to be the most amazing trip."}},
{"model": "club.eventsettings", "pk": 20004, "fields": {
"event": 20004,
"show_event_in_progress": true
}},
{"model": "club.event", "pk": 20005, "fields":
{"name": "High Summer in the Med",
"slug": "high-summer-in-the-med",
"state": "public",
"event_type": 1,
"organiser": 9004, "shore_contact": 9001,
"start_date": "2031-07-11 00:00:00",
"end_date": "2031-08-13 00:00:00",
"added_date": "2021-02-01 12:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision3.png",
"photos": [7000],
"spaces": 18, "boats": [8001],
"short_summary": "The Saronic Gulf is busy and packed at this time of year.",
"summary": "This is going to be the most amazing trip. Party, party, party!"}},
{"model": "club.eventsettings", "pk": 20005, "fields": {
"event": 20005,
"show_event_in_progress": true
}},
{"model": "club.event", "pk": 20006, "fields":
{"name": "High Summer in the Irish Sea",
"slug": "high-summer-in-the-irish",
"state": "public",
"event_type": 1,
"organiser": 9004, "shore_contact": 9001,
"start_date": "2031-07-11 00:00:00",
"end_date": "2031-08-13 00:00:00",
"added_date": "2021-02-01 12:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision2.png",
"photos": [7000],
"spaces": 18, "boats": [8001],
"short_summary": "The Irish Sea is is wonderful at this time of year.",
"summary": "Welsh and Irush coasts, Manx beer."}},
{"model": "club.eventsettings", "pk": 20006, "fields": {
"event": 20006,
"show_event_in_progress": true
}},
{"model": "club.event", "pk": 20007, "fields":
{"name": "RYA First Aid course",
"slug": "rya-first-aid-2019",
"state": "public",
"event_type": 1,
"organiser": 9004, "shore_contact": 9001,
"start_date": "2019-03-10 00:00:00",
"end_date": "2019-03-10 00:00:00",
"added_date": "2019-03-10 00:00:00",
"modified_date": "2021-02-01 13:00:00",
"thumbnail": "images/training/exams/collision2.png",
"photos": [],
"spaces": 12, "boats": [],
"short_summary": "A one-day RYA First Aid Course",
"summary": "A First Aid certificate is a requirement for candidates for the RYA Yachtmaster Exams."}},
{"model": "club.eventsettings", "pk": 20006, "fields": {
"event": 20006,
"show_event_in_progress": true
}},
{"model": "club.participate", "pk": 30000, "fields":
{"person": 9001,
"event": 20000,
"state": "confirmed",
"date_added": "2021-02-01 12:00:00",
"role": 1}},
{"model": "club.participate", "pk": 30001, "fields":
{"person": 9000,
"event": 20007,
"state": "confirmed",
"date_added": "2019-03-10 00:00:00",
"paid": true,
"role": 5}}
]


@@ -0,0 +1,52 @@
[
{"model": "core.area", "pk": 25, "fields":
{"short_name": "1626 or 6 (borderline)", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 24, "fields":
{"short_name": "8a", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 23, "fields":
{"short_name": "2b or 4 (unclear)", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 22, "fields":
{"short_name": "11", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 21, "fields":
{"short_name": "3", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 20, "fields":
{"short_name": "4", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 19, "fields":
{"short_name": "1b", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 18, "fields":
{"short_name": "8b", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 17, "fields":
{"short_name": "2d", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 16, "fields":
{"short_name": "7", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 15, "fields":
{"short_name": "2b", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 14, "fields":
{"short_name": "8c", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 13, "fields":
{"short_name": "2c", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 12, "fields":
{"short_name": "8d", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 11, "fields":
{"short_name": "", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 10, "fields":
{"short_name": "5", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 9, "fields":
{"short_name": "6", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 8, "fields":
{"short_name": "2a", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 7, "fields":
{"short_name": "1c", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 6, "fields":
{"short_name": "1d", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 5, "fields":
{"short_name": "1a", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 4, "fields":
{"short_name": "9", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 3, "fields":
{"short_name": "10", "name": null, "description": null, "super": 1, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 2, "fields":
{"short_name": "1626", "name": null, "description": null, "super": null, "new_since_parsing": false, "non_public": false}},
{"model": "core.area", "pk": 1, "fields":
{"short_name": "1623", "name": null, "description": null, "super": null, "new_since_parsing": false, "non_public": false}}
]


@@ -0,0 +1,40 @@
[{"model": "core.cave", "pk": 43, "fields":
{"new_since_parsing": false, "non_public": false,
"official_name": "Schnellzugh&ouml;hle",
"kataster_code": "6/t/S/W x",
"kataster_number": "115",
"unofficial_number": "40m",
"explorers": "CUCC 1980-1985",
"underground_description": "This is the main entrance through which the majority of the <a href=\"41.htm\">Stellerwegh&ouml;hle</a> system was explored. See the separate <a href=\"41/115.htm#ent115\">full guidebook description</a> for details, just an overview is given here.</p><p>The entrance leads to a non-obvious way on to the head of the short <b>Bell Pitch</b>, from where very awkward going leads out to a bigger passage to reach <b>The Ramp</b> a series of off-vertical pitches. The damper but technically easier <b>Inlet Pitches</b> drop to a Big Chamber, from where <b>Pete's Purgatory</b> starts, and leads in 800m of tortuous going to <b>The Confluence</b> and the larger streamway leading to the deepest point.</p><p>Better is the <b>Purgatory Bypass</b> which starts as dry fossil tubes, with a choice of routes to reach <b>Junction Chamber</b> where the <b>Big Rift</b> of <a href=\"41.htm\">Stellerwegh&ouml;hle</a> enters. Opposite, the huge fossil tube of <b>Dartford Tunnel</b> makes for easy progress to the Confluence, about halfway down the system. The continuing main streamway is interrupted by a bypassable sump and numerous pitches before a low airspace duck at the end of an unpromising canal leads to the spectacular <b>Orgasm Chasm</b>. Careful rigging avoids the water in this 140m shaft, ending in muddy passage and another short drop to a deep and terminal sump. ",
"equipment": "",
"references": "",
"survey": "CUCC's parts surveyed to Grade 5 but not all drawn up - see <a href=\"41/survey.htm\">here</a>",
"kataster_status": "",
"underground_centre_line": "In dataset",
"notes": "The Austrian Kataster has adopted a very perverse way of numbering things. Their numbers are as follows:</p><ul> <li>115a&nbsp;&nbsp;&nbsp;Stellerwegh&ouml;hle entrance&nbsp;&nbsp;&nbsp;41a</li> <li>115b&nbsp;&nbsp;&nbsp;Stellerwegh&ouml;hle entrance&nbsp;&nbsp;&nbsp;41b</li> <li>115c&nbsp;&nbsp;&nbsp;Stellerwegh&ouml;hle entrance&nbsp;&nbsp;&nbsp;41c ( where ? )</li> <li>115d&nbsp;&nbsp;&nbsp;Schnellzugh&ouml;hle entrance&nbsp;&nbsp;&nbsp;115</li> <li>115e&nbsp;&nbsp;&nbsp;unnamed entrance&nbsp;&nbsp;&nbsp;142</li></ul><p>", "length": "SMK system total 54000m", "depth": "from entrance; SMK system total 1032m", "extent": "SMK system total 2812m",
"survex_file": "smk-system.svx",
"description_file": "1623/115.htm",
"url": "1623/115.url",
"filename": "1623-115.html",
"area": [1, 8]}},
{"model": "core.cave", "pk": 350, "fields":
{"new_since_parsing": false, "non_public": false,
"official_name": "Seetrichter",
"kataster_code": "",
"kataster_number": "284",
"unofficial_number": "",
"explorers": "<p></p>",
"underground_description": "",
"equipment": "<p></p>",
"references": "<p>",
"survey": "<p></p>",
"kataster_status": "",
"underground_centre_line": "",
"notes": "A 25m long (22m deep) resurgence in Altausee. At the bottom, at a depth of 72m, there are large round blocks.", "length": "", "depth": "", "extent": "",
"survex_file": "",
"description_file": "",
"url": "1623/284/284.html",
"filename": "1623-284.html",
"area": [1, 11]}}
]


@@ -0,0 +1,17 @@
[{"model": "core.expedition", "pk": 44, "fields":
{"new_since_parsing": false, "non_public": false,
"year": "2019", "name": "CUCC expo 2019"}},
{"model": "core.personexpedition", "pk": 681, "fields":
{"new_since_parsing": false, "non_public": false,
"expedition": 44,
"person": 250, "slugfield": null, "is_guest": false
}},
{"model": "core.person", "pk": 250, "fields":
{"new_since_parsing": false, "non_public": false,
"first_name": "Michael",
"last_name": "Sargent",
"fullname": "Michael Sargent", "is_vfho": false, "mug_shot": null,
"blurb": "\n\n\n\n\n\n<p><img class=\"onleft\" src=\"/folk/i/mikey0.jpg\">\n<img class=\"onright\" src=\"/folk/i/mikey1.jpg\" height=\"400\"\nalt=\"\" />\n<b>Michael Sargent</b> CUCC<br />\nExpeditions 2014, 15, 16, 17, 18, 19.\n<p>The first second-generation expo caver in 2014, later members of this exclusive group were Dan Lenartowicz and Sarah Connolly.\n\n\n<img class=\"onleft\" src=\"/folk/i/michaelsargent.jpg\">\n<im\n\n<hr style=\"clear: both\" /><p class=\"caption\">Pre-expo (pre-student) photos from President's Invite (OUCC) \nand first abseiling instruction (Cambridge).</p>\n", "orderref": ""}}
]


@@ -0,0 +1,58 @@
This folder is used by manage.py to load fixtures, as are all the folders
called /fixtures/ in any Django app here.
e.g. loading a set of fixture files which are in the /fixtures/ folders:
$ python manage.py loaddata cuyc_basic_data test_data_1 test_data_1.1 test_data_2
$ python manage.py help migrate
usage: manage.py migrate [-h] [--noinput] [--database DATABASE] [--fake]
[--fake-initial] [--plan] [--run-syncdb] [--version]
[-v {0,1,2,3}] [--settings SETTINGS]
[--pythonpath PYTHONPATH] [--traceback] [--no-color]
[--force-color]
[app_label] [migration_name]
Updates database schema. Manages both apps with migrations and those without.
positional arguments:
app_label App label of an application to synchronize the state.
migration_name Database state will be brought to the state after that
migration. Use the name "zero" to unapply all
migrations.
optional arguments:
--noinput, --no-input
Tells Django to NOT prompt the user for input of any
kind.
--database DATABASE Nominates a database to synchronize. Defaults to the
"default" database.
--fake Mark migrations as run without actually running them.
--fake-initial Detect if tables already exist and fake-apply initial
migrations if so. Make sure that the current database
schema matches your initial migration before using
this flag. Django will only check for an existing
table name.
--plan Shows a list of the migration actions that will be
performed.
--run-syncdb Creates tables for apps without migrations.
$ python manage.py help loaddata
usage: manage.py loaddata [-h] [--database DATABASE] [--app APP_LABEL]
[--ignorenonexistent] [-e EXCLUDE] [--format FORMAT]
[--version] [-v {0,1,2,3}] [--settings SETTINGS]
[--pythonpath PYTHONPATH] [--traceback] [--no-color]
[--force-color]
fixture [fixture ...]
Installs the named fixture(s) in the database.
optional arguments:
--app APP_LABEL Only look for fixtures in the specified app.
--ignorenonexistent, -i
Ignores entries in the serialized data for fields that
do not currently exist on the model.
positional arguments:
fixture Fixture labels.
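
Fixtures like these can also be loaded automatically by Django's test runner. The snippet below is only a minimal sketch, not part of the current test suite: it uses a couple of the fixture labels from the loaddata example above, and it assumes that one of those fixtures defines the auth.user "oofy".

from django.contrib.auth.models import User
from django.test import TestCase

class FixtureSmokeTest(TestCase):
    # Django loads these fixture files (found in any /fixtures/ folder)
    # into the test database before every test method in this class.
    fixtures = ["cuyc_basic_data", "test_data_1"]

    def test_fixture_users_present(self):
        # assumes "oofy" is one of the auth.user records in the loaded fixtures
        self.assertTrue(User.objects.filter(username="oofy").exists())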


@@ -0,0 +1,5 @@
This file is uploaded by the integration test suite as part of the tests.
It, and any other file with a similar name, e.g. test_upload_GPev9qN.txt, can be safely deleted,
EXCEPT for the original copy which lives in troggle/core/fixtures/


@@ -0,0 +1,5 @@
This file is uploaded by the integration test suite as part of the tests.
It, and any other file with a similar name, e.g. test_upload_GPev9qN.txt, can be safely deleted,
EXCEPT for the original copy which lives in troggle/core/fixtures/


@@ -0,0 +1,7 @@
This file is uploaded by the integration test suite as part of the tests.
This has no suffix so it is pretending to be a Therion config file.
It, and any other file with a similar name, e.g. test_upload_GPev9qN.txt, can be safely deleted,
EXCEPT for the original copy which lives in troggle/core/fixtures/


@@ -0,0 +1 @@
[{"model": "core.logbookentry", "pk": 7, "fields": {"new_since_parsing": false, "non_public": false, "date": "2019-07-11", "expeditionday": null, "expedition": 44, "title": "base camp - CUCC Austria Expedition 2019 Blog", "cave_slug": "None", "place": "base camp", "text": "<a href=\"https://ukcaving.com/board/index.php?topic=25249.msg311372#msg311372\">blog post</a> </br></br> At the time of writing, I am sat in the Tatty Hut at Base Camp in Bad Aussee. It is day five of expo and a lot has happened. We discovered on Sunday (day one - 07/07/2019) that our Top Camp, Steinbrueken, was full of snow: Meanwhile, Base Camp preparations were well underway: he beer tent was being hoisted (above) and the new rope (thanks to UK Caving and Spanset for the sponsorship!) was being soaked, coiled, and cut into usable lengths ready for caving. </br></br> The next few days consisted of Expo members undertaking multitudes of carrying trips up to top camp, and a few hardy folk doing their best to fettle the bivvy for habitability. Tuesday (09/07/2019) night saw the first people sleeping in Steinbrueken. Mostly, they described the experience as \"chilly\" but one person went as far as to claim he had been warmer there than at Base Camp. </br></br> Also on Tuesday (09/07/2019), a new route was devised and cairned directly from Heimkommen Hoehle to the tourist path on the col. The idea being that Homecoming could be close enough to push from Base Camp rather than Steinbrueken. This came with the discovery that Fischgesicht Hoehle's entrance was under two to three metres of snow: </br></br> On Wednesday (10/07/2019), Expo split into three groups. The majority went to Steinbrueken to commence the final push towards habitability while some went to investigate Balkonhoehle. Three of us (Dickon Morris, Daniel Heins, and myself) went to Heimkommen to rig to the pushing front (the decision to concentrate on Heimkommen and Balkon having been made for us by the plateau). </br></br> That's all for now, </br></br> Tom Crossley (11/07/2019)", "slug": "base-camp-cucc-austria-expedition-2019-blog", "filename": null, "entry_type": "html"}}]


@@ -1,179 +1,239 @@
from django.forms import ModelForm
from models import Cave, Person, PersonExpedition, LogbookEntry, QM, Expedition, Entrance, CaveAndEntrance
import django.forms as forms
from django.forms import ModelForm
from django.forms.models import modelformset_factory
from django.contrib.admin.widgets import AdminDateWidget
import string
from datetime import date
from tinymce.widgets import TinyMCE
from troggle.core.models.caves import Cave, CaveAndEntrance, Entrance
from troggle.core.views.editor_helpers import HTMLarea
from django.core.exceptions import ValidationError
# from tinymce.widgets import TinyMCE
import re
"""These are all the class-based Forms used by troggle.
There are other, simpler, upload forms in view/uploads.py.
Class-based forms are quicker to set up (for Django experts) but
are more difficult for non-Django experts to maintain.
"""
todo = """
"""
class CaveForm(ModelForm):
underground_description = forms.CharField(required = False, widget=forms.Textarea())
explorers = forms.CharField(required = False, widget=forms.Textarea())
equipment = forms.CharField(required = False, widget=forms.Textarea())
survey = forms.CharField(required = False, widget=forms.Textarea())
kataster_status = forms.CharField(required = False, widget=forms.Textarea())
underground_centre_line = forms.CharField(required = False, widget=forms.Textarea())
notes = forms.CharField(required = False, widget=forms.Textarea())
references = forms.CharField(required = False, widget=forms.Textarea())
url = forms.CharField(required = True)
"""Only those fields for which we want to override defaults are listed here
the other fields are present on the form, but use the default presentation style
"""
official_name = forms.CharField(required=False, widget=forms.TextInput(attrs={"size": "45"}))
underground_description = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter page content (using HTML)"}),
)
explorers = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter page content (using HTML)"}),
)
equipment = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter page content (using HTML)"}),
)
survey = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter page content (using HTML)"}),
)
# survey = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
kataster_status = forms.CharField(required=False)
underground_centre_line = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter page content (using HTML)"}),
)
notes = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter page content (using HTML)"}),
)
references = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter page content (using HTML)"}),
)
description_file = forms.CharField(required=False, label="Path of top-level description file for this cave, when a separate file is used. Otherwise blank.", widget=forms.TextInput(attrs={"size": "45"}), help_text="")
survex_file = forms.CharField(
required=False, label="Survex file eg. caves-1623/000/000.svx", widget=forms.TextInput(attrs={"size": "45"})
)
#url = forms.CharField(required=True, label="URL eg. 1623/000/000 (no .html)", widget=forms.TextInput(attrs={"size": "45"}))
length = forms.CharField(required=False, label="Length (m)")
depth = forms.CharField(required=False, label="Depth (m)")
extent = forms.CharField(required=False, label="Extent (m)")
#cave_slug = forms.CharField()
class Meta:
model = Cave
exclude = ("filename",)
field_order = ['area', 'unofficial_number', 'kataster_number', 'official_name', 'underground_description', 'explorers', 'equipment', 'survey', 'kataster_status', 'underground_centre_line', 'notes', 'references', 'description_file', 'survex_file', 'url', 'length', 'depth', 'extent']
def get_area(self):
for a in self.cleaned_data["area"]:
if a.kat_area():
return a.kat_area()
def clean_cave_slug(self):
if self.cleaned_data["cave_slug"] == "":
myArea = ""
for a in self.cleaned_data["area"]:
if a.kat_area():
myArea = a.kat_area()
if self.data["kataster_number"]:
cave_slug = f"{myArea}-{self.cleaned_data['kataster_number']}"
else:
cave_slug = f"{myArea}-{self.cleaned_data['unofficial_number']}"
else:
cave_slug = self.cleaned_data["cave_slug"]
# Converting a PENDING cave to a real cave by saving this form
print("EEE", cave_slug.replace("-PENDING-", "-"))
return cave_slug.replace("-PENDING-", "-")
# def clean_url(self):
# data = self.cleaned_data["url"]
# if not re.match("\d\d\d\d/.", data):
# raise ValidationError("URL must start with a four digit Kataster area.")
# return data
def clean(self):
if self.cleaned_data.get("kataster_number") == "" and self.cleaned_data.get("unofficial_number") == "":
self._errors["unofficial_number"] = self.error_class(["Either the kataster or unoffical number is required."])
if self.cleaned_data.get("kataster_number") != "" and self.cleaned_data.get("official_name") == "":
self._errors["official_name"] = self.error_class(["This field is required when there is a kataster number."])
if self.cleaned_data.get("area") == []:
cleaned_data = super(CaveForm, self).clean()
if self.data.get("kataster_number") == "" and self.data.get("unofficial_number") == "":
self._errors["unofficial_number"] = self.error_class(
["Either the kataster or unoffical number is required."]
)
# if self.cleaned_data.get("kataster_number") != "" and self.cleaned_data.get("official_name") == "":
# self._errors["official_name"] = self.error_class(["This field is required when there is a kataster number."])
if cleaned_data.get("area") == []:
self._errors["area"] = self.error_class(["This field is required."])
if self.cleaned_data.get("url") and self.cleaned_data.get("url").startswith("/"):
self._errors["url"] = self.error_class(["This field can not start with a /."])
return self.cleaned_data
class VersionControlCommentForm(forms.Form):
description_of_change = forms.CharField(required = True, widget=forms.Textarea())
if cleaned_data.get("url") and cleaned_data.get("url").startswith("/"):
self._errors["url"] = self.error_class(["This field cannot start with a /."])
return cleaned_data
class EntranceForm(ModelForm):
#underground_description = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 30}))
#explorers = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
#equipment = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
#survey = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
#kataster_status = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
#underground_centre_line = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
#notes = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
#references = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
other_station = forms.CharField(required=False) # Trying to change this to a single line entry
tag_station = forms.CharField(required=False) # Trying to change this to a single line entry
exact_station = forms.CharField(required=False) # Trying to change this to a single line entry
northing = forms.CharField(required=False) # Trying to change this to a single line entry
easting = forms.CharField(required=False) # Trying to change this to a single line entry
alt = forms.CharField(required=False) # Trying to change this to a single line entry
"""Only those fields for which we want to override defaults are listed here
the other fields are present on the form, but use the default presentation style
"""
name = forms.CharField(required=False, widget=forms.TextInput(attrs={"size": "45"}))
entrance_description = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
explorers = forms.CharField(required=False, widget=forms.TextInput(attrs={"size": "45"}))
# explorers = forms.CharField(required = False, widget=TinyMCE(attrs={'cols': 80, 'rows': 10}))
map_description = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
location_description = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
lastvisit = forms.CharField(
required=False, widget=forms.TextInput(attrs={"size": "10"}), label="Date of last visit, e.g. 2023-07-11"
)
approach = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
underground_description = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
photo = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
marking_comment = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
findability_description = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
other_description = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
bearings = forms.CharField(
required=False,
widget=HTMLarea(attrs={"height": "80%", "rows": 20, "placeholder": "Enter text (using HTML)"}),
)
tag_station = forms.CharField(
required=False,
widget=forms.TextInput(attrs={"size": "50"}), label="Tag station: Survex station id, e.g. 1623.p2023-xx-01"
)
exact_station = forms.CharField(
required=False,
widget=forms.TextInput(attrs={"size": "50"}), label="Exact station: Survex station id, e.g. 1623.2023-xx-01.2"
)
other_station = forms.CharField(
required=False,
widget=forms.TextInput(attrs={"size": "50"}), label="Other station: Survex station id, e.g. 1623.2023-xx-01.33"
)
northing = forms.CharField(
required=False, widget=forms.TextInput(attrs={"size": "10"}), label="Northing (UTM) - from survex data"
)
easting = forms.CharField(
required=False, widget=forms.TextInput(attrs={"size": "10"}), label="Easting (UTM) - from survex data"
)
lat_wgs84 = forms.CharField(
required=False, widget=forms.TextInput(attrs={"size": "10"}), label="Latitude (WSG84) - if no other location"
)
long_wgs84 = forms.CharField(
required=False, widget=forms.TextInput(attrs={"size": "10"}), label="Longitude (WSG84) - if no other location"
)
alt = forms.CharField(required=False, label="Altitude (m)")
url = forms.CharField(required=False, label="URL [usually blank]", widget=forms.TextInput(attrs={"size": "45"}))
field_order = ['name', 'entrance_description', 'explorers', 'map_description', 'location_description', 'lastvisit', 'approach', 'underground_description', 'photo', 'marking_comment', 'findability_description', 'other_description', 'bearings', 'tag_station', 'exact_station', 'other_station', 'northing', 'easting', 'lat_wgs84', 'long_wgs84', 'alt', 'url']
class Meta:
model = Entrance
exclude = ("cached_primary_slug", "filename",)
exclude = (
"cached_primary_slug",
"filename",
)
def clean(self):
if self.cleaned_data.get("url").startswith("/"):
self._errors["url"] = self.error_class(["This field can not start with a /."])
self._errors["url"] = self.error_class(["This field cannot start with a /."])
return self.cleaned_data
CaveAndEntranceFormSet = modelformset_factory(CaveAndEntrance, exclude=('cave',))
# This next line is called from the templates/edit_cave.html template.
# This is sufficient to create an entire entry for the cave fields automatically
# http://localhost:8000/cave/new/
# using django built-in Deep Magic. https://docs.djangoproject.com/en/dev/topics/forms/modelforms/
# for forms which map directly onto a Django Model
CaveAndEntranceFormSet = modelformset_factory(CaveAndEntrance, exclude=("cave",))
# This is used only in edit_entrance() in views/caves.py
class EntranceLetterForm(ModelForm):
"""Form to link entrances to caves, along with an entrance number.
Nb. The relationship between caves and entrances has historically been a many-to-many relationship,
with entrances gaining new caves and letters when caves are joined.
"""
class Meta:
model = CaveAndEntrance
exclude = ('cave', 'entrance')
#class PersonForm(ModelForm):
# class Meta:
# model = Person
#class LogbookEntryForm(ModelForm):
# class Meta:
# model = LogbookEntry#
# def wikiLinkHints(LogbookEntry=None):
# """
# This function returns html-formatted paragraphs for each of the
# wikilink types that are related to this logbookentry. Each paragraph
# contains a list of all of the related wikilinks.
#
# Perhaps an admin javascript solution would be better.
# """
# res = ["Please use the following wikilinks, which are related to this logbook entry:"]
#
# res.append(r'</p><p style="float: left;"><b>QMs found:</b>')
# for QM in LogbookEntry.instance.QMs_found.all():
# res.append(QM.wiki_link())
# res.append(r'</p><p style="float: left;"><b>QMs ticked off:</b>')
# for QM in LogbookEntry.instance.QMs_ticked_off.all():
# res.append(QM.wiki_link())
# res.append(r'</p><p style="float: left; "><b>People</b>')
# for persontrip in LogbookEntry.instance.persontrip_set.all():
# res.append(persontrip.wiki_link())
# res.append(r'</p>')
# return string.join(res, r'<br />')
# def __init__(self, *args, **kwargs):
# super(LogbookEntryForm, self).__init__(*args, **kwargs)
# self.fields['text'].help_text=self.wikiLinkHints()#
#class CaveForm(forms.Form):
# html = forms.CharField(widget=TinyMCE(attrs={'cols': 80, 'rows': 30}))
def getTripForm(expedition):
class TripForm(forms.Form):
date = forms.DateField()
title = forms.CharField(max_length=200)
caves = [cave.reference() for cave in Cave.objects.all()]
caves.sort()
caves = ["-----"] + caves
cave = forms.ChoiceField([(c, c) for c in caves], required=False)
location = forms.CharField(max_length=200, required=False)
caveOrLocation = forms.ChoiceField([("cave", "Cave"), ("location", "Location")], widget = forms.widgets.RadioSelect())
html = forms.CharField(widget=TinyMCE(attrs={'cols': 80, 'rows': 30}))
def clean(self):
print(dir(self))
if self.cleaned_data.get("caveOrLocation") == "cave" and not self.cleaned_data.get("cave"):
self._errors["cave"] = self.error_class(["This field is required"])
if self.cleaned_data.get("caveOrLocation") == "location" and not self.cleaned_data.get("location"):
self._errors["location"] = self.error_class(["This field is required"])
return self.cleaned_data
class PersonTripForm(forms.Form):
names = [get_name(pe) for pe in PersonExpedition.objects.filter(expedition = expedition)]
names.sort()
names = ["-----"] + names
name = forms.ChoiceField([(n, n) for n in names])
TU = forms.FloatField(required=False)
author = forms.BooleanField(required=False, default=False)
PersonTripFormSet = formset_factory(PersonTripForm, extra=1)
return PersonTripFormSet, TripForm
def get_name(pe):
if pe.nickname:
return pe.nickname
else:
return pe.person.first_name
#class UploadFileForm(forms.Form):
# title = forms.CharField(max_length=50)
# file = forms.FileField()
# html = forms.CharField(widget=TinyMCE(attrs={'cols': 80, 'rows': 30}))
# lon_utm = forms.FloatField(required=False)
# lat_utm = forms.FloatField(required=False)
# slug = forms.CharField(max_length=50)
# date = forms.DateField(required=False)
# caves = [cave.slug for cave in Cave.objects.all()]
# caves.sort()
# caves = ["-----"] + caves
# cave = forms.ChoiceField([(c, c) for c in caves], required=False)
# entrance = forms.ChoiceField([("-----", "Please select a cave"), ], required=False)
# qm = forms.ChoiceField([("-----", "Please select a cave"), ], required=False)
# expeditions = [e.year for e in Expedition.objects.all()]
# expeditions.sort()
# expeditions = ["-----"] + expeditions
# expedition = forms.ChoiceField([(e, e) for e in expeditions], required=False)
# logbookentry = forms.ChoiceField([("-----", "Please select an expedition"), ], required=False)
# person = forms.ChoiceField([("-----", "Please select an expedition"), ], required=False)
# survey_point = forms.CharField()
exclude = ("cave", "entrance")
def full_clean(self):
super(EntranceLetterForm, self).full_clean()
try:
self.instance.validate_unique()
except forms.ValidationError as e:
self._update_errors(e)


@@ -1,22 +0,0 @@
from imagekit.specs import ImageSpec
from imagekit import processors
class ResizeThumb(processors.Resize):
width = 100
crop = False
class ResizeDisplay(processors.Resize):
width = 600
#class EnhanceThumb(processors.Adjustment):
#contrast = 1.2
#sharpness = 2
class Thumbnail(ImageSpec):
access_as = 'thumbnail_image'
pre_cache = True
processors = [ResizeThumb]
class Display(ImageSpec):
increment_count = True
processors = [ResizeDisplay]

Binary file not shown.


@@ -0,0 +1,36 @@
from django.core.management.base import BaseCommand
"""this is now replaced by databaseRest.py
This is an example of how to create our own bespoke commandline
commands.
Good articles on creating Django commands at
https://www.mattlayman.com/understand-django/command-apps/
https://www.geeksforgeeks.org/custom-django-management-commands/
Django docs:
https://docs.djangoproject.com/en/dev/howto/custom-management-commands/
We might use this mechanism to replace/enhance the
folk and wallets scripts, and any cron jobs or other standalone scripts.
"""
class Command(BaseCommand):
def add_arguments(self, parser):
# Positional arguments
parser.add_argument("posargs", nargs="+", type=int)
# Named (optional) arguments
parser.add_argument(
"--delete",
action="store_true",
help="Removed as redundant - use databaseReset.py",
)
def handle(self, *args, **options):
print(args)
print(options)
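# Hedged usage note: Django exposes this as `manage.py <name>` where <name> is this
# module's filename under core/management/commands/; the filename is not shown in this
# diff, so "examplecommand" below is only an assumption.
#
#   python manage.py examplecommand 3 5 7        # options["posargs"] == [3, 5, 7]
#   python manage.py examplecommand 1 --delete   # options["delete"] is True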


@@ -1,182 +0,0 @@
from django.core.management.base import BaseCommand, CommandError
from optparse import make_option
from troggle.core.models import Cave
import settings
databasename=settings.DATABASES['default']['NAME']
expouser=settings.EXPOUSER
expouserpass=settings.EXPOUSERPASS
expouseremail=settings.EXPOUSER_EMAIL
class Command(BaseCommand):
help = 'This is normal usage, clear database and reread everything'
option_list = BaseCommand.option_list + (
make_option('--foo',
action='store_true',
dest='foo',
default=False,
help='test'),
)
def add_arguments(self, parser):
parser.add_argument(
'--foo',
action='store_true',
dest='foo',
help='Help text',
)
def handle(self, *args, **options):
print(args)
print(options)
if "desc" in args:
self.resetdesc()
elif "scans" in args:
self.import_surveyscans()
elif "caves" in args:
self.reload_db()
self.make_dirs()
self.pageredirects()
self.import_caves()
elif "people" in args:
self.import_people()
elif "QMs" in args:
self.import_QMs()
elif "tunnel" in args:
self.import_tunnelfiles()
elif "reset" in args:
self.reset()
elif "survex" in args:
self.import_survex()
elif "survexpos" in args:
import parsers.survex
parsers.survex.LoadPos()
elif "logbooks" in args:
self.import_logbooks()
elif "autologbooks" in args:
self.import_auto_logbooks()
elif "dumplogbooks" in args:
self.dumplogbooks()
elif "writeCaves" in args:
self.writeCaves()
elif "foo" in args:
self.stdout.write('Testing....')
else:
self.stdout.write("%s not recognised" % args)
self.usage(options)
def reload_db():
if settings.DATABASES['default']['ENGINE'] == 'django.db.backends.sqlite3':
try:
os.remove(databasename)
except OSError:
pass
else:
cursor = connection.cursor()
cursor.execute("DROP DATABASE %s" % databasename)
cursor.execute("CREATE DATABASE %s" % databasename)
cursor.execute("ALTER DATABASE %s CHARACTER SET=utf8" % databasename)
cursor.execute("USE %s" % databasename)
management.call_command('migrate', interactive=False)
# management.call_command('syncdb', interactive=False)
user = User.objects.create_user(expouser, expouseremail, expouserpass)
user.is_staff = True
user.is_superuser = True
user.save()
def make_dirs():
"""Make directories that troggle requires"""
# should also deal with permissions here.
if not os.path.isdir(settings.PHOTOS_ROOT):
os.mkdir(settings.PHOTOS_ROOT)
def import_caves():
import parsers.caves
print("importing caves")
parsers.caves.readcaves()
def import_people():
import parsers.people
parsers.people.LoadPersonsExpos()
def import_logbooks():
# The below line was causing errors I didn't understand (it said LOGFILE was a string), and I couldn't be bothered to figure
# what was going on so I just catch the error with a try. - AC 21 May
try:
settings.LOGFILE.write('\nBegun importing logbooks at ' + time.asctime() + '\n' + '-' * 60)
except:
pass
import parsers.logbooks
parsers.logbooks.LoadLogbooks()
def import_survex():
import parsers.survex
parsers.survex.LoadAllSurvexBlocks()
parsers.survex.LoadPos()
def import_QMs():
import parsers.QMs
def import_surveys():
import parsers.surveys
parsers.surveys.parseSurveys(logfile=settings.LOGFILE)
def import_surveyscans():
import parsers.surveys
parsers.surveys.LoadListScans()
def import_tunnelfiles():
import parsers.surveys
parsers.surveys.LoadTunnelFiles()
def reset():
""" Wipe the troggle database and import everything from legacy data
"""
reload_db()
make_dirs()
pageredirects()
import_caves()
import_people()
import_surveyscans()
import_survex()
import_logbooks()
import_QMs()
try:
import_tunnelfiles()
except:
print("Tunnel files parser broken.")
import_surveys()
def pageredirects():
for oldURL, newURL in [("indxal.htm", reverse("caveindex"))]:
f = troggle.flatpages.models.Redirect(originalURL=oldURL, newURL=newURL)
f.save()
def writeCaves():
for cave in Cave.objects.all():
cave.writeDataFile()
for entrance in Entrance.objects.all():
entrance.writeDataFile()
def usage(self, parser):
print("""Usage is 'manage.py reset_db <command>'
where command is:
reset - this is normal usage, clear database and reread everything
desc
caves - read in the caves
logbooks - read in the logbooks
autologbooks
dumplogbooks
people
QMs - read in the QM files
resetend
scans - read in the scanned surveynotes
survex - read in the survex files
survexpos
tunnel - read in the Tunnel files
writeCaves
""")


@@ -1,24 +0,0 @@
import utm
import math
from django.conf import settings
def lat_lon_entrance(utmstring):
try:
x = float(utmstring.split()[0])
y = float(utmstring.split()[1])
#return ' '+str(x+y)+' '+str(y)
q = utm.to_latlon(x, y, 33, 'U')
return "{:.5f} {:.5f}".format(q[0],q[1])
except:
return 'Not found'
def top_camp_distance(utmstring):
try:
x = float(utmstring.split()[0])
y = float(utmstring.split()[1])
tx = settings.TOPCAMPX
ty = settings.TOPCAMPY
dist = math.sqrt( (tx-x)*(tx-x) + (ty-y)*(ty-y) )
return "{:.1f}".format(dist)
except:
return 'Not found'
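# Illustrative usage (the coordinates below are invented, not real survey data).
# Both helpers take an "easting northing" string in UTM zone 33U and swallow any
# parse/conversion error, returning 'Not found' instead of raising:
#
#   lat_lon_entrance("411000 5281000")   # -> "lat lon" in WGS84, 5 decimal places
#   lat_lon_entrance("not a location")   # -> "Not found"
#   top_camp_distance("411000 5281000")  # -> planar distance in metres from
#                                        #    settings.TOPCAMPX / TOPCAMPY, 1 decimal place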

core/middleware.py  (new file, 77 lines)

@@ -0,0 +1,77 @@
from django import http
from django.conf import settings
from django.urls import Resolver404, resolve
"""Non-standard django middleware is loaded from this file.
"""
todo = """SmartAppendSlashMiddleware(object) Not Working.
It needs re-writing to be compatible with Django v2.0 and later
"""
class SmartAppendSlashMiddleware(object):
"""
"SmartAppendSlash" middleware for taking care of URL rewriting.
This middleware appends a missing slash, if:
* the SMART_APPEND_SLASH setting is True
* the URL without the slash does not exist
* the URL with an appended slash does exist.
Otherwise it won't touch the URL.
"""
def process_request(self, request):
"""Called for every url so return as quickly as possible
Append a slash if SMART_APPEND_SLASH is set, the resulting URL resolves and it doesn't without the /
"""
if not settings.SMART_APPEND_SLASH:
return None
if request.path.endswith("/"):
return None
if request.path.endswith("_edit"):
return None
host = http.HttpRequest.get_host(request)
old_url = [host, request.path]
if _resolves(old_url[1]):
return None
# So: it does not resolve according to our criteria, i.e. _edit doesn't count
new_url = old_url[:]
new_url[1] = new_url[1] + "/"
if not _resolves(new_url[1]):
return None
else:
if settings.DEBUG and request.method == "POST":
# replace this exception with a redirect to an error page
raise RuntimeError(
f"You called this URL via POST, but the URL doesn't end in a slash and you have SMART_APPEND_SLASH set. Django can't redirect to the slash URL while maintaining POST data. Change your form to point to {new_url[0]}{new_url[1]} (note the trailing slash), or set SMART_APPEND_SLASH=False in your Django settings."
)
if new_url != old_url:
# Redirect
if new_url[0]:
newurl = f"{request.is_secure() and 'https' or 'http'}://{new_url[0]}{new_url[1]}"
else:
newurl = new_url[1]
if request.GET:
newurl += "?" + request.GET.urlencode()
return http.HttpResponsePermanentRedirect(newurl)
return None
def _resolves(url):
try:
# If the URL does not resolve, the function raises a Resolver404 exception (a subclass of Http404)
resolve(url)
# this will ALWAYS be resolved by expopages because it will produce pagenotfound if not the thing asked for
# so handle this in expopages, not in middleware
return True
except Resolver404:
return False
except:
print(url)
raise
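# Hedged sketch of the rewrite the todo above asks for (not implemented here): Django
# >= 2.0 expects callable middleware, and the existing process_request() logic could be
# kept by wrapping it, roughly like this:
#
# class SmartAppendSlashMiddleware:
#     def __init__(self, get_response):
#         self.get_response = get_response
#
#     def __call__(self, request):
#         redirect = self.process_request(request)  # reuse the method defined above
#         return redirect or self.get_response(request)
#
# It would then be listed in settings.MIDDLEWARE as
# "troggle.core.middleware.SmartAppendSlashMiddleware" (dotted path assumed from this
# file's location), with SMART_APPEND_SLASH = True.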


@@ -1,22 +0,0 @@
import urllib, urlparse, string, os, datetime, logging, re
import subprocess
from django.forms import ModelForm
from django.db import models
from django.contrib import admin
from django.core.files.storage import FileSystemStorage
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.db.models import Min, Max
from django.conf import settings
from decimal import Decimal, getcontext
from django.core.urlresolvers import reverse
from imagekit.models import ImageModel
from django.template import Context, loader
import settings
getcontext().prec=2 #use 2 significant figures for decimal calculations
from troggle.core.models_survex import * #ancient models for both survex and other things
from troggle.core.models_old import *
from troggle.core.models_millenial import * #updated models are here


@@ -1,864 +0,0 @@
import urllib, urlparse, string, os, datetime, logging, re
import subprocess
from django.forms import ModelForm
from django.db import models
from django.contrib import admin
from django.core.files.storage import FileSystemStorage
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.db.models import Min, Max
from django.conf import settings
from decimal import Decimal, getcontext
from django.core.urlresolvers import reverse
from imagekit.models import ImageModel
from django.template import Context, loader
import settings
getcontext().prec=2 #use 2 significant figures for decimal calculations
from troggle.core.models_survex import *
from troggle.core.models_millenial import *
def get_related_by_wikilinks(wiki_text):
found=re.findall(settings.QM_PATTERN,wiki_text)
res=[]
for wikilink in found:
qmdict={'urlroot':settings.URL_ROOT,'cave':wikilink[2],'year':wikilink[1],'number':wikilink[3]}
try:
cave_slugs = CaveSlug.objects.filter(cave__kataster_number = qmdict['cave'])
qm=QM.objects.get(found_by__cave_slug__in = cave_slugs,
found_by__date__year = qmdict['year'],
number = qmdict['number'])
res.append(qm)
except QM.DoesNotExist:
print('fail on '+str(wikilink))
return res
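# Illustrative usage (the cave/year/number values are invented): QM.wiki_link() further
# down in this file emits links of the form "[[QM:<cave>-<year>-<number>]]", and
# settings.QM_PATTERN (not shown here) is assumed to match that form, so e.g.
#
#   get_related_by_wikilinks("Pushed the lead at [[QM:204-1999-12]] until it closed down.")
#
# would return the matching QM object(s), printing 'fail on ...' and skipping any
# wikilink whose QM is not in the database.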
try:
logging.basicConfig(level=logging.DEBUG,
filename=settings.LOGFILE,
filemode='w')
except:
subprocess.call(settings.FIX_PERMISSIONS)
logging.basicConfig(level=logging.DEBUG,
filename=settings.LOGFILE,
filemode='w')
#This class is for adding fields and methods which all of our models will have.
class TroggleModel(models.Model):
new_since_parsing = models.BooleanField(default=False, editable=False)
non_public = models.BooleanField(default=False)
def object_name(self):
return self._meta.object_name
def get_admin_url(self):
return urlparse.urljoin(settings.URL_ROOT, "/admin/core/" + self.object_name().lower() + "/" + str(self.pk))
class Meta:
abstract = True
class TroggleImageModel(ImageModel):
new_since_parsing = models.BooleanField(default=False, editable=False)
def object_name(self):
return self._meta.object_name
def get_admin_url(self):
return urlparse.urljoin(settings.URL_ROOT, "/admin/core/" + self.object_name().lower() + "/" + str(self.pk))
class Meta:
abstract = True
#
# single Expedition, usually seen by year
#
class Expedition(TroggleModel):
year = models.CharField(max_length=20, unique=True)
name = models.CharField(max_length=100)
def __unicode__(self):
return self.year
class Meta:
ordering = ('-year',)
get_latest_by = 'year'
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('expedition', args=[self.year]))
# construction function. should be moved out
def get_expedition_day(self, date):
expeditiondays = self.expeditionday_set.filter(date=date)
if expeditiondays:
assert len(expeditiondays) == 1
return expeditiondays[0]
res = ExpeditionDay(expedition=self, date=date)
res.save()
return res
def day_min(self):
res = self.expeditionday_set.all()
return res and res[0] or None
def day_max(self):
res = self.expeditionday_set.all()
return res and res[len(res) - 1] or None
class ExpeditionDay(TroggleModel):
expedition = models.ForeignKey("Expedition")
date = models.DateField()
class Meta:
ordering = ('date',)
def GetPersonTrip(self, personexpedition):
personexpeditions = self.persontrip_set.filter(expeditionday=self)
return personexpeditions and personexpeditions[0] or None
#
# single Person, can go on many years
#
class Person(TroggleModel):
first_name = models.CharField(max_length=100)
last_name = models.CharField(max_length=100)
is_vfho = models.BooleanField(help_text="VFHO is the Vereines f&uuml;r H&ouml;hlenkunde in Obersteier, a nearby Austrian caving club.", default=False)
mug_shot = models.CharField(max_length=100, blank=True,null=True)
blurb = models.TextField(blank=True,null=True)
#href = models.CharField(max_length=200)
orderref = models.CharField(max_length=200) # for alphabetic
#the below have been removed and made methods. I'm not sure what the b in bisnotable stands for. - AC 16 Feb
#notability = models.FloatField() # for listing the top 20 people
#bisnotable = models.BooleanField(default=False)
user = models.OneToOneField(User, null=True, blank=True)
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT,reverse('person',kwargs={'first_name':self.first_name,'last_name':self.last_name}))
class Meta:
verbose_name_plural = "People"
ordering = ('orderref',) # "Wookey" makes too complex for: ('last_name', 'first_name')
def __unicode__(self):
if self.last_name:
return "%s %s" % (self.first_name, self.last_name)
return self.first_name
def notability(self):
notability = Decimal(0)
for personexpedition in self.personexpedition_set.all():
if not personexpedition.is_guest:
notability += Decimal(1) / (2012 - int(personexpedition.expedition.year))
return notability
def bisnotable(self):
return self.notability() > Decimal(1)/Decimal(3)
def surveyedleglength(self):
return sum([personexpedition.surveyedleglength() for personexpedition in self.personexpedition_set.all()])
def first(self):
return self.personexpedition_set.order_by('-expedition')[0]
def last(self):
return self.personexpedition_set.order_by('expedition')[0]
#def Sethref(self):
#if self.last_name:
#self.href = self.first_name.lower() + "_" + self.last_name.lower()
#self.orderref = self.last_name + " " + self.first_name
#else:
# self.href = self.first_name.lower()
#self.orderref = self.first_name
#self.notability = 0.0 # set temporarily
#
# Person's attendance at one Expo
#
class PersonExpedition(TroggleModel):
expedition = models.ForeignKey(Expedition)
person = models.ForeignKey(Person)
slugfield = models.SlugField(max_length=50,blank=True,null=True)
is_guest = models.BooleanField(default=False)
COMMITTEE_CHOICES = (
('leader','Expo leader'),
('medical','Expo medical officer'),
('treasurer','Expo treasurer'),
('sponsorship','Expo sponsorship coordinator'),
('research','Expo research coordinator'),
)
expo_committee_position = models.CharField(blank=True,null=True,choices=COMMITTEE_CHOICES,max_length=200)
nickname = models.CharField(max_length=100,blank=True,null=True)
def GetPersonroles(self):
res = [ ]
for personrole in self.personrole_set.order_by('survexblock'):
if res and res[-1]['survexpath'] == personrole.survexblock.survexpath:
res[-1]['roles'] += ", " + str(personrole.role)
else:
res.append({'date':personrole.survexblock.date, 'survexpath':personrole.survexblock.survexpath, 'roles':str(personrole.role)})
return res
class Meta:
ordering = ('-expedition',)
#order_with_respect_to = 'expedition'
def __unicode__(self):
return "%s: (%s)" % (self.person, self.expedition)
#why is the below a function in personexpedition, rather than in person? - AC 14 Feb 09
def name(self):
if self.nickname:
return "%s (%s) %s" % (self.person.first_name, self.nickname, self.person.last_name)
if self.person.last_name:
return "%s %s" % (self.person.first_name, self.person.last_name)
return self.person.first_name
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('personexpedition',kwargs={'first_name':self.person.first_name,'last_name':self.person.last_name,'year':self.expedition.year}))
def surveyedleglength(self):
survexblocks = [personrole.survexblock for personrole in self.personrole_set.all() ]
return sum([survexblock.totalleglength for survexblock in set(survexblocks)])
# would prefer to return actual person trips so we could link to first and last ones
def day_min(self):
res = self.persontrip_set.aggregate(day_min=Min("expeditionday__date"))
return res["day_min"]
def day_max(self):
res = self.persontrip_set.all().aggregate(day_max=Max("expeditionday__date"))
return res["day_max"]
#
# Single parsed entry from Logbook
#
class LogbookEntry(TroggleModel):
date = models.DateField()#MJG wants to turn this into a datetime such that multiple Logbook entries on the same day can be ordered.
expeditionday = models.ForeignKey("ExpeditionDay", null=True)#MJG wants to KILL THIS (redundant information)
expedition = models.ForeignKey(Expedition,blank=True,null=True) # yes this is double-
#author = models.ForeignKey(PersonExpedition,blank=True,null=True) # the person who writes it up doesn't have to have been on the trip.
# Re: the above- so this field should be "typist" or something, not "author". - AC 15 jun 09
#MJG wants to KILL THIS, as it is typically redundant with PersonTrip.is_logbook_entry_author; in the rare case it was not redundant and was of actual interest, it could be added to the text.
title = models.CharField(max_length=settings.MAX_LOGBOOK_ENTRY_TITLE_LENGTH)
cave_slug = models.SlugField(max_length=50)
place = models.CharField(max_length=100,blank=True,null=True,help_text="Only use this if you haven't chosen a cave")
text = models.TextField()
slug = models.SlugField(max_length=50)
filename = models.CharField(max_length=200,null=True)
class Meta:
verbose_name_plural = "Logbook Entries"
# several PersonTrips point in to this object
ordering = ('-date',)
def __getattribute__(self, item):
if item == "cave": #Allow a logbookentries cave to be directly accessed despite not having a proper foreignkey
return CaveSlug.objects.get(slug = self.cave_slug).cave
return super(LogbookEntry, self).__getattribute__(item)
def __init__(self, *args, **kwargs):
if "cave" in kwargs.keys():
if kwargs["cave"] is not None:
kwargs["cave_slug"] = CaveSlug.objects.get(cave=kwargs["cave"], primary=True).slug
kwargs.pop("cave")
return super(LogbookEntry, self).__init__(*args, **kwargs)
def isLogbookEntry(self): # Function used in templates
return True
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('logbookentry',kwargs={'date':self.date,'slug':self.slug}))
def __unicode__(self):
return "%s: (%s)" % (self.date, self.title)
def get_next_by_id(self):
return LogbookEntry.objects.get(id=self.id+1)
def get_previous_by_id(self):
return LogbookEntry.objects.get(id=self.id-1)
def new_QM_number(self):
"""Returns """
if self.cave:
nextQMnumber=self.cave.new_QM_number(self.date.year)
else:
return None
return nextQMnumber
def new_QM_found_link(self):
"""Produces a link to a new QM with the next number filled in and this LogbookEntry set as 'found by' """
return settings.URL_ROOT + r'/admin/core/qm/add/?' + r'found_by=' + str(self.pk) +'&number=' + str(self.new_QM_number())
def DayIndex(self):
return list(self.expeditionday.logbookentry_set.all()).index(self)
#
# Single Person going on a trip, which may or may not be written up (accounts for different T/U for people in same logbook entry)
#
class PersonTrip(TroggleModel):
personexpedition = models.ForeignKey("PersonExpedition",null=True)
#expeditionday = models.ForeignKey("ExpeditionDay")#MJG wants to KILL THIS (redundant information)
#date = models.DateField() #MJG wants to KILL THIS (redundant information)
time_underground = models.FloatField(help_text="In decimal hours")
logbook_entry = models.ForeignKey(LogbookEntry)
is_logbook_entry_author = models.BooleanField(default=False)
# sequencing by person (difficult to solve locally)
#persontrip_next = models.ForeignKey('PersonTrip', related_name='pnext', blank=True,null=True)#MJG wants to KILL THIS (and use function persontrip_next_auto)
#persontrip_prev = models.ForeignKey('PersonTrip', related_name='pprev', blank=True,null=True)#MJG wants to KILL THIS (and use function persontrip_prev_auto)
def persontrip_next(self):
futurePTs = PersonTrip.objects.filter(personexpedition = self.personexpedition, logbook_entry__date__gt = self.logbook_entry.date).order_by('logbook_entry__date').all()
if len(futurePTs) > 0:
return futurePTs[0]
else:
return None
def persontrip_prev(self):
pastPTs = PersonTrip.objects.filter(personexpedition = self.personexpedition, logbook_entry__date__lt = self.logbook_entry.date).order_by('-logbook_entry__date').all()
if len(pastPTs) > 0:
return pastPTs[0]
else:
return None
def place(self):
return self.logbook_entry.cave and self.logbook_entry.cave or self.logbook_entry.place
def __unicode__(self):
return "%s (%s)" % (self.personexpedition, self.logbook_entry.date)
##########################################
# move following classes into models_cave
##########################################
class Area(TroggleModel):
short_name = models.CharField(max_length=100)
name = models.CharField(max_length=200, blank=True, null=True)
description = models.TextField(blank=True,null=True)
parent = models.ForeignKey('Area', blank=True, null=True)
def __unicode__(self):
if self.parent:
return unicode(self.parent) + u" - " + unicode(self.short_name)
else:
return unicode(self.short_name)
def kat_area(self):
if self.short_name in ["1623", "1626"]:
return self.short_name
elif self.parent:
return self.parent.kat_area()
class CaveAndEntrance(models.Model):
cave = models.ForeignKey('Cave')
entrance = models.ForeignKey('Entrance')
entrance_letter = models.CharField(max_length=20,blank=True,null=True)
def __unicode__(self):
return unicode(self.cave) + unicode(self.entrance_letter)
class CaveSlug(models.Model):
cave = models.ForeignKey('Cave')
slug = models.SlugField(max_length=50, unique = True)
primary = models.BooleanField(default=False)
class Cave(TroggleModel):
# too much here perhaps,
official_name = models.CharField(max_length=160)
area = models.ManyToManyField(Area, blank=True, null=True)
kataster_code = models.CharField(max_length=20,blank=True,null=True)
kataster_number = models.CharField(max_length=10,blank=True, null=True)
unofficial_number = models.CharField(max_length=60,blank=True, null=True)
entrances = models.ManyToManyField('Entrance', through='CaveAndEntrance')
explorers = models.TextField(blank=True,null=True)
underground_description = models.TextField(blank=True,null=True)
equipment = models.TextField(blank=True,null=True)
references = models.TextField(blank=True,null=True)
survey = models.TextField(blank=True,null=True)
kataster_status = models.TextField(blank=True,null=True)
underground_centre_line = models.TextField(blank=True,null=True)
notes = models.TextField(blank=True,null=True)
length = models.CharField(max_length=100,blank=True,null=True)
depth = models.CharField(max_length=100,blank=True,null=True)
extent = models.CharField(max_length=100,blank=True,null=True)
survex_file = models.CharField(max_length=100,blank=True,null=True)
description_file = models.CharField(max_length=200,blank=True,null=True)
url = models.CharField(max_length=200,blank=True,null=True)
filename = models.CharField(max_length=200)
#class Meta:
# unique_together = (("area", "kataster_number"), ("area", "unofficial_number"))
# FIXME Kataster Areas and CUCC defined sub areas need separating
#href = models.CharField(max_length=100)
class Meta:
ordering = ('kataster_code', 'unofficial_number')
def hassurvey(self):
if not self.underground_centre_line:
return "No"
if (self.survey.find("<img") > -1 or self.survey.find("<a") > -1 or self.survey.find("<IMG") > -1 or self.survey.find("<A") > -1):
return "Yes"
return "Missing"
def hassurveydata(self):
if not self.underground_centre_line:
return "No"
if self.survex_file:
return "Yes"
return "Missing"
def slug(self):
primarySlugs = self.caveslug_set.filter(primary = True)
if primarySlugs:
return primarySlugs[0].slug
else:
slugs = self.caveslug_set.filter()
if slugs:
return slugs[0].slug
def ours(self):
return bool(re.search(r'CUCC', self.explorers))
def reference(self):
if self.kataster_number:
return "%s-%s" % (self.kat_area(), self.kataster_number)
else:
return "%s-%s" % (self.kat_area(), self.unofficial_number)
def get_absolute_url(self):
if self.kataster_number:
href = self.kataster_number
elif self.unofficial_number:
href = self.unofficial_number
else:
href = self.official_name.lower()
#return settings.URL_ROOT + '/cave/' + href + '/'
return urlparse.urljoin(settings.URL_ROOT, reverse('cave',kwargs={'cave_id':href,}))
def __unicode__(self, sep = u": "):
return unicode(self.slug())
def get_QMs(self):
return QM.objects.filter(found_by__cave_slug=self.caveslug_set.all())
def new_QM_number(self, year=datetime.date.today().year):
"""Given a cave and the current year, returns the next QM number."""
try:
res=QM.objects.filter(found_by__date__year=year, found_by__cave=self).order_by('-number')[0]
except IndexError:
return 1
return res.number+1
def kat_area(self):
for a in self.area.all():
if a.kat_area():
return a.kat_area()
def entrances(self):
return CaveAndEntrance.objects.filter(cave=self)
def singleentrance(self):
return len(CaveAndEntrance.objects.filter(cave=self)) == 1
def entrancelist(self):
rs = []
res = ""
for e in CaveAndEntrance.objects.filter(cave=self):
rs.append(e.entrance_letter)
rs.sort()
prevR = None
n = 0
for r in rs:
if prevR:
if chr(ord(prevR) + 1 ) == r:
prevR = r
n += 1
else:
if n == 0:
res += ", " + prevR
else:
res += "&ndash;" + prevR
else:
prevR = r
n = 0
res += r
if n == 0:
res += ", " + prevR
else:
res += "&ndash;" + prevR
return res
def writeDataFile(self):
try:
f = open(os.path.join(settings.CAVEDESCRIPTIONS, self.filename), "w")
except:
subprocess.call(settings.FIX_PERMISSIONS)
f = open(os.path.join(settings.CAVEDESCRIPTIONS, self.filename), "w")
t = loader.get_template('dataformat/cave.xml')
c = Context({'cave': self})
u = t.render(c)
u8 = u.encode("utf-8")
f.write(u8)
f.close()
def getArea(self):
areas = self.area.all()
lowestareas = list(areas)
for area in areas:
if area.parent in areas:
try:
lowestareas.remove(area.parent)
except:
pass
return lowestareas[0]
def getCaveByReference(reference):
areaname, code = reference.split("-", 1)
print(areaname, code)
area = Area.objects.get(short_name = areaname)
print(area)
foundCaves = list(Cave.objects.filter(area = area, kataster_number = code).all()) + list(Cave.objects.filter(area = area, unofficial_number = code).all())
print(list(foundCaves))
assert len(foundCaves) == 1
return foundCaves[0]
class OtherCaveName(TroggleModel):
name = models.CharField(max_length=160)
cave = models.ForeignKey(Cave)
def __unicode__(self):
return unicode(self.name)
class EntranceSlug(models.Model):
entrance = models.ForeignKey('Entrance')
slug = models.SlugField(max_length=50, unique = True)
primary = models.BooleanField(default=False)
class Entrance(TroggleModel):
name = models.CharField(max_length=100, blank=True,null=True)
entrance_description = models.TextField(blank=True,null=True)
explorers = models.TextField(blank=True,null=True)
map_description = models.TextField(blank=True,null=True)
location_description = models.TextField(blank=True,null=True)
approach = models.TextField(blank=True,null=True)
underground_description = models.TextField(blank=True,null=True)
photo = models.TextField(blank=True,null=True)
MARKING_CHOICES = (
('P', 'Paint'),
('P?', 'Paint (?)'),
('T', 'Tag'),
('T?', 'Tag (?)'),
('R', 'Needs Retag'),
('S', 'Spit'),
('S?', 'Spit (?)'),
('U', 'Unmarked'),
('?', 'Unknown'))
marking = models.CharField(max_length=2, choices=MARKING_CHOICES)
marking_comment = models.TextField(blank=True,null=True)
FINDABLE_CHOICES = (
('?', 'To be confirmed ...'),
('S', 'Coordinates'),
('L', 'Lost'),
('R', 'Refindable'))
findability = models.CharField(max_length=1, choices=FINDABLE_CHOICES, blank=True, null=True)
findability_description = models.TextField(blank=True,null=True)
alt = models.TextField(blank=True, null=True)
northing = models.TextField(blank=True, null=True)
easting = models.TextField(blank=True, null=True)
tag_station = models.TextField(blank=True, null=True)
exact_station = models.TextField(blank=True, null=True)
other_station = models.TextField(blank=True, null=True)
other_description = models.TextField(blank=True,null=True)
bearings = models.TextField(blank=True,null=True)
url = models.CharField(max_length=200,blank=True,null=True)
filename = models.CharField(max_length=200)
cached_primary_slug = models.CharField(max_length=200,blank=True,null=True)
def __unicode__(self):
return unicode(self.slug())
def exact_location(self):
return SurvexStation.objects.lookup(self.exact_station)
def other_location(self):
return SurvexStation.objects.lookup(self.other_station)
def find_location(self):
r = {'': 'To be entered ',
'?': 'To be confirmed:',
'S': '',
'L': 'Lost:',
'R': 'Refindable:'}[self.findability]
if self.tag_station:
try:
s = SurvexStation.objects.lookup(self.tag_station)
return r + "%0.0fE %0.0fN %0.0fAlt" % (s.x, s.y, s.z)
except:
return r + "%s Tag Station not in dataset" % self.tag_station
if self.exact_station:
try:
s = SurvexStation.objects.lookup(self.exact_station)
return r + "%0.0fE %0.0fN %0.0fAlt" % (s.x, s.y, s.z)
except:
return r + "%s Exact Station not in dataset" % self.tag_station
if self.other_station:
try:
s = SurvexStation.objects.lookup(self.other_station)
return r + "%0.0fE %0.0fN %0.0fAlt %s" % (s.x, s.y, s.z, self.other_description)
except:
return r + "%s Other Station not in dataset" % self.tag_station
if self.FINDABLE_CHOICES == "S":
r += "ERROR, Entrance has been surveyed but has no survex point"
if self.bearings:
return r + self.bearings
return r
def best_station(self):
if self.tag_station:
return self.tag_station
if self.exact_station:
return self.exact_station
if self.other_station:
return self.other_station
def has_photo(self):
if self.photo:
if (self.photo.find("<img") > -1 or self.photo.find("<a") > -1 or self.photo.find("<IMG") > -1 or self.photo.find("<A") > -1):
return "Yes"
else:
return "Missing"
else:
return "No"
def marking_val(self):
for m in self.MARKING_CHOICES:
if m[0] == self.marking:
return m[1]
def findability_val(self):
for f in self.FINDABLE_CHOICES:
if f[0] == self.findability:
return f[1]
def tag(self):
return SurvexStation.objects.lookup(self.tag_station)
def needs_surface_work(self):
return self.findability != "S" or not self.has_photo or self.marking != "T"
def get_absolute_url(self):
ancestor_titles='/'.join([subcave.title for subcave in self.get_ancestors()])
if ancestor_titles:
res = '/'.join((self.get_root().cave.get_absolute_url(), ancestor_titles, self.title))
else:
res = '/'.join((self.get_root().cave.get_absolute_url(), self.title))
return res
def slug(self):
if not self.cached_primary_slug:
primarySlugs = self.entranceslug_set.filter(primary = True)
if primarySlugs:
self.cached_primary_slug = primarySlugs[0].slug
self.save()
else:
slugs = self.entranceslug_set.filter()
if slugs:
self.cached_primary_slug = slugs[0].slug
self.save()
return self.cached_primary_slug
def writeDataFile(self):
try:
f = open(os.path.join(settings.ENTRANCEDESCRIPTIONS, self.filename), "w")
except:
subprocess.call(settings.FIX_PERMISSIONS)
f = open(os.path.join(settings.ENTRANCEDESCRIPTIONS, self.filename), "w")
t = loader.get_template('dataformat/entrance.xml')
c = Context({'entrance': self})
u = t.render(c)
u8 = u.encode("utf-8")
f.write(u8)
f.close()
class CaveDescription(TroggleModel):
short_name = models.CharField(max_length=50, unique = True)
long_name = models.CharField(max_length=200, blank=True, null=True)
description = models.TextField(blank=True,null=True)
linked_subcaves = models.ManyToManyField("NewSubCave", blank=True,null=True)
linked_entrances = models.ManyToManyField("Entrance", blank=True,null=True)
linked_qms = models.ManyToManyField("QM", blank=True,null=True)
def __unicode__(self):
if self.long_name:
return unicode(self.long_name)
else:
return unicode(self.short_name)
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('cavedescription', args=(self.short_name,)))
def save(self):
"""
Overridden save method which stores wikilinks in text as links in database.
"""
super(CaveDescription, self).save()
qm_list=get_related_by_wikilinks(self.description)
for qm in qm_list:
self.linked_qms.add(qm)
super(CaveDescription, self).save()
class NewSubCave(TroggleModel):
name = models.CharField(max_length=200, unique = True)
def __unicode__(self):
return unicode(self.name)
class QM(TroggleModel):
#based on qm.csv in trunk/expoweb/1623/204 which has the fields:
#"Number","Grade","Area","Description","Page reference","Nearest station","Completion description","Comment"
found_by = models.ForeignKey(LogbookEntry, related_name='QMs_found',blank=True, null=True )
ticked_off_by = models.ForeignKey(LogbookEntry, related_name='QMs_ticked_off',null=True,blank=True)
#cave = models.ForeignKey(Cave)
#expedition = models.ForeignKey(Expedition)
number = models.IntegerField(help_text="this is the sequential number in the year", )
GRADE_CHOICES=(
('A', 'A: Large obvious lead'),
('B', 'B: Average lead'),
('C', 'C: Tight unpromising lead'),
('D', 'D: Dig'),
('X', 'X: Unclimbable aven')
)
grade = models.CharField(max_length=1, choices=GRADE_CHOICES)
location_description = models.TextField(blank=True)
#should be a foreignkey to surveystation
nearest_station_description = models.CharField(max_length=400,null=True,blank=True)
nearest_station = models.CharField(max_length=200,blank=True,null=True)
area = models.CharField(max_length=100,blank=True,null=True)
completion_description = models.TextField(blank=True,null=True)
comment=models.TextField(blank=True,null=True)
def __unicode__(self):
return u"%s %s" % (self.code(), self.grade)
def code(self):
return u"%s-%s-%s" % (unicode(self.found_by.cave)[6:], self.found_by.date.year, self.number)
def get_absolute_url(self):
#return settings.URL_ROOT + '/cave/' + self.found_by.cave.kataster_number + '/' + str(self.found_by.date.year) + '-' + '%02d' %self.number
return urlparse.urljoin(settings.URL_ROOT, reverse('qm',kwargs={'cave_id':self.found_by.cave.kataster_number,'year':self.found_by.date.year,'qm_id':self.number,'grade':self.grade}))
def get_next_by_id(self):
return QM.objects.get(id=self.id+1)
def get_previous_by_id(self):
return QM.objects.get(id=self.id-1)
def wiki_link(self):
return u"%s%s%s" % ('[[QM:',self.code(),']]')
photoFileStorage = FileSystemStorage(location=settings.PHOTOS_ROOT, base_url=settings.PHOTOS_URL)
class DPhoto(TroggleImageModel):
caption = models.CharField(max_length=1000,blank=True,null=True)
contains_logbookentry = models.ForeignKey(LogbookEntry,blank=True,null=True)
contains_person = models.ManyToManyField(Person,blank=True,null=True)
file = models.ImageField(storage=photoFileStorage, upload_to='.',)
is_mugshot = models.BooleanField(default=False)
contains_cave = models.ForeignKey(Cave,blank=True,null=True)
contains_entrance = models.ForeignKey(Entrance, related_name="photo_file",blank=True,null=True)
#nearest_survey_point = models.ForeignKey(SurveyStation,blank=True,null=True)
nearest_QM = models.ForeignKey(QM,blank=True,null=True)
lon_utm = models.FloatField(blank=True,null=True)
lat_utm = models.FloatField(blank=True,null=True)
class IKOptions:
spec_module = 'core.imagekit_specs'
cache_dir = 'thumbs'
image_field = 'file'
#content_type = models.ForeignKey(ContentType)
#object_id = models.PositiveIntegerField()
#location = generic.GenericForeignKey('content_type', 'object_id')
def __unicode__(self):
return self.caption
scansFileStorage = FileSystemStorage(location=settings.SURVEY_SCANS, base_url=settings.SURVEYS_URL)
def get_scan_path(instance, filename):
year=instance.survey.expedition.year
#print("WN: ", type(instance.survey.wallet_number), instance.survey.wallet_number, instance.survey.wallet_letter)
number=str(instance.survey.wallet_number)
if str(instance.survey.wallet_letter) != "None":
number=str(instance.survey.wallet_letter) + number #two strings formatting because convention is 2009#01 or 2009#X01
return os.path.join('./',year,year+r'#'+number,str(instance.contents)+str(instance.number_in_wallet)+r'.jpg')
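# Worked example (the wallet values are hypothetical): for a scan whose survey has
# expedition.year="2009", wallet_letter="X" and wallet_number=12, with contents="notes"
# and number_in_wallet=3, this returns
#   ./2009/2009#X12/notes3.jpg
# Note that wallet_number is not zero-padded here, despite the 2009#01 / 2009#X01
# convention mentioned in the comment above.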
class ScannedImage(TroggleImageModel):
file = models.ImageField(storage=scansFileStorage, upload_to=get_scan_path)
scanned_by = models.ForeignKey(Person,blank=True, null=True)
scanned_on = models.DateField(null=True)
survey = models.ForeignKey('Survey')
contents = models.CharField(max_length=20,choices=(('notes','notes'),('plan','plan_sketch'),('elevation','elevation_sketch')))
number_in_wallet = models.IntegerField(null=True)
lon_utm = models.FloatField(blank=True,null=True)
lat_utm = models.FloatField(blank=True,null=True)
class IKOptions:
spec_module = 'core.imagekit_specs'
cache_dir = 'thumbs'
image_field = 'file'
#content_type = models.ForeignKey(ContentType)
#object_id = models.PositiveIntegerField()
#location = generic.GenericForeignKey('content_type', 'object_id')
#This is an ugly hack to deal with the #s in our survey scan paths. The correct thing is to write a custom file storage backend which calls urlencode on the name for making file.url but not file.path.
def correctURL(self):
return string.replace(self.file.url,r'#',r'%23')
def __unicode__(self):
return get_scan_path(self,'')
class Survey(TroggleModel):
expedition = models.ForeignKey('Expedition') #REDUNDANT (logbook_entry)
wallet_number = models.IntegerField(blank=True,null=True)
wallet_letter = models.CharField(max_length=1,blank=True,null=True)
comments = models.TextField(blank=True,null=True)
location = models.CharField(max_length=400,blank=True,null=True) #REDUNDANT
subcave = models.ForeignKey('NewSubCave', blank=True, null=True)
#notes_scan = models.ForeignKey('ScannedImage',related_name='notes_scan',blank=True, null=True) #Replaced by contents field of ScannedImage model
survex_block = models.OneToOneField('SurvexBlock',blank=True, null=True)
logbook_entry = models.ForeignKey('LogbookEntry')
centreline_printed_on = models.DateField(blank=True, null=True)
centreline_printed_by = models.ForeignKey('Person',related_name='centreline_printed_by',blank=True,null=True)
#sketch_scan = models.ForeignKey(ScannedImage,blank=True, null=True) #Replaced by contents field of ScannedImage model
tunnel_file = models.FileField(upload_to='surveyXMLfiles',blank=True, null=True)
tunnel_main_sketch = models.ForeignKey('Survey',blank=True,null=True)
integrated_into_main_sketch_on = models.DateField(blank=True,null=True)
integrated_into_main_sketch_by = models.ForeignKey('Person' ,related_name='integrated_into_main_sketch_by', blank=True,null=True)
rendered_image = models.ImageField(upload_to='renderedSurveys',blank=True,null=True)
def __unicode__(self):
return self.expedition.year+"#"+"%02d" % int(self.wallet_number)
def notes(self):
return self.scannedimage_set.filter(contents='notes')
def plans(self):
return self.scannedimage_set.filter(contents='plan')
def elevations(self):
return self.scannedimage_set.filter(contents='elevation')

core/models/__init__.py  (new file, 0 lines)

core/models/caves.py  (new file, 693 lines)

@@ -0,0 +1,693 @@
import os
import re
from collections import defaultdict
from datetime import datetime, timezone
from pathlib import Path
from django.db import models
from django.template import loader
import settings
from troggle.core.models.logbooks import QM
from troggle.core.models.survex import SurvexStation
from troggle.core.models.troggle import DataIssue, TroggleModel
from troggle.core.utils import TROG, writetrogglefile
# Use the TROG global object to cache the cave lookup list. No good for multi-user..
Gcavelookup = TROG["caves"]["gcavelookup"]
Gcave_count = TROG["caves"]["gcavecount"]
Gcavelookup = None
Gcave_count = None
"""The model declarations for Areas, Caves and Entrances
"""
todo = """
- Find out why we have separate CaveSlug objects and why these are not just a single
field on the Model. Do we ever need more than one slug per cave or entrance? Surely
that would break everything??
- Can we rewrite things to eliminate the CaveSlug objects? Surely foreign keys work fine?!
- Why do we have CaveAndEntrance objects? Surely entrance_letter belongs on the Entrance object?
- move the aliases list from the code and put into an editable file
- Restore constraint: unique_together = (("area", "kataster_number"), ("area", "unofficial_number"))
"""
class Area(TroggleModel):
short_name = models.CharField(max_length=100)
name = models.CharField(max_length=200, blank=True, null=True)
description = models.TextField(blank=True, null=True)
super = models.ForeignKey("Area", blank=True, null=True, on_delete=models.SET_NULL)
def __str__(self):
if self.super:
return str(self.super) + " - " + str(self.short_name)
else:
return str(self.short_name)
def kat_area(self):
if self.short_name in ["1623", "1626", "1624", "1627"]:
return self.short_name
elif self.super:
return self.super.kat_area()
class CaveAndEntrance(models.Model):
"""This class is ONLY used to create a FormSet for editing the cave and all its
entrances in one form.
CASCADE means that if the cave or the entrance is deleted, then this CaveAndEntrance
is deleted too
"""
cave = models.ForeignKey("Cave", on_delete=models.CASCADE)
entrance = models.ForeignKey("Entrance", on_delete=models.CASCADE)
entrance_letter = models.CharField(max_length=20, blank=True, null=True)
class Meta:
unique_together = [["cave", "entrance"], ["cave", "entrance_letter"]]
ordering = ["entrance_letter"]
def __str__(self):
return str(self.cave) + str(self.entrance_letter)
# class CaveSlug(models.Model):
# moved to models/logbooks.py to avoid cyclic import problem
class Cave(TroggleModel):
# too much here perhaps,
area = models.ManyToManyField(Area, blank=False)
depth = models.CharField(max_length=100, blank=True, null=True)
description_file = models.CharField(max_length=200, blank=True, null=True)
entrances = models.ManyToManyField("Entrance", through="CaveAndEntrance")
equipment = models.TextField(blank=True, null=True)
explorers = models.TextField(blank=True, null=True)
extent = models.CharField(max_length=100, blank=True, null=True)
filename = models.CharField(max_length=200)
kataster_code = models.CharField(max_length=20, blank=True, null=True)
kataster_number = models.CharField(max_length=10, blank=True, null=True)
kataster_status = models.TextField(blank=True, null=True)
length = models.CharField(max_length=100, blank=True, null=True)
notes = models.TextField(blank=True, null=True)
official_name = models.CharField(max_length=160)
references = models.TextField(blank=True, null=True)
survex_file = models.CharField(max_length=100, blank=True, null=True) # should be a foreign key
survey = models.TextField(blank=True, null=True)
underground_centre_line = models.TextField(blank=True, null=True)
underground_description = models.TextField(blank=True, null=True)
unofficial_number = models.CharField(max_length=60, blank=True, null=True)
url = models.CharField(max_length=300, blank=True, null=True, unique = True)
# class Meta:
# unique_together = (("area", "kataster_number"), ("area", "unofficial_number"))
# FIXME Kataster Areas and CUCC defined sub areas need separating
# href = models.CharField(max_length=100)
class Meta:
ordering = ("kataster_code", "unofficial_number")
def hassurvey(self):
"""This is almost certainly a fossil - needs checking...
"""
if not self.underground_centre_line:
return "No"
if (
self.survey.find("<img") > -1
or self.survey.find("<a") > -1
or self.survey.find("<IMG") > -1
or self.survey.find("<A") > -1
):
return "Yes"
return "Missing"
def hassurveydata(self):
if not self.underground_centre_line:
return "No"
if self.survex_file:
return "Yes"
return "Missing"
def slug(self):
primarySlugs = self.caveslug_set.filter(primary=True)
if primarySlugs:
return primarySlugs[0].slug
else:
slugs = self.caveslug_set.filter()
if slugs:
return slugs[0].slug
def ours(self):
return bool(re.search(r"CUCC", self.explorers))
def number(self):
if self.kataster_number:
return self.kataster_number
else:
return self.unofficial_number
def reference(self):
return f"{self.kat_area()}-{self.number()}"
def get_absolute_url(self):
if self.kataster_number:
pass
elif self.unofficial_number:
pass
else:
self.official_name.lower()
return Path(settings.URL_ROOT) / self.url # not good Django style.. NEEDS actual URL
def url_parent(self):
return self.url.rsplit("/", 1)[0]
def __str__(self, sep=": "):
return str(self.slug())
def get_open_QMs(self):
"""Searches for all QMs that reference this cave."""
# qms = self.qm_set.all().order_by('expoyear', 'block__date')
qms = QM.objects.filter(cave=self).order_by(
"expoyear", "block__date"
) # a QuerySet, see https://docs.djangoproject.com/en/dev/ref/models/querysets/#order-by
qmsopen = qms.filter(ticked=False)
return qmsopen # a QuerySet
def get_ticked_QMs(self):
"""Searches for all QMs that reference this cave."""
qms = QM.objects.filter(cave=self).order_by(
"expoyear", "block__date"
)
qmticked = qms.filter(ticked=True)
return qmticked # a QuerySet
def get_QMs(self):
qms = self.get_open_QMs() | self.get_ticked_QMs() # set union operation
return qms # a QuerySet
def kat_area(self):
try:
for a in self.area.all():
if a.kat_area():
return a.kat_area()
except:
return ""
def entrances(self):
return CaveAndEntrance.objects.filter(cave=self)
def singleentrance(self):
return len(CaveAndEntrance.objects.filter(cave=self)) == 1
def entrancelist(self):
rs = []
res = ""
for e in CaveAndEntrance.objects.filter(cave=self):
if e.entrance_letter:
rs.append(e.entrance_letter)
rs.sort()
prevR = ""
n = 0
for r in rs:
if prevR:
if chr(ord(prevR) + 1) == r:
prevR = r
n += 1
else:
if n == 0:
res += ", " + prevR
else:
res += "&ndash;" + prevR
else:
prevR = r
n = 0
res += r
if n == 0:
if res:
res += ", " + prevR
else:
res += "&ndash;" + prevR
return res
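# Illustrative sketch (assumed letters, not from the original file): entrancelist()
# compresses runs of consecutive entrance letters, so letters a, b, c, e are intended
# to render as something like "a&ndash;c, e" on the cave page.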
def writeDataFile(self):
filepath = os.path.join(settings.CAVEDESCRIPTIONS, self.filename)
t = loader.get_template("dataformat/cave.xml")
now = datetime.now(timezone.utc)
print(now)
c = dict({"cave": self, "date": now})
u = t.render(c)
writetrogglefile(filepath, u)
return
def file_output(self):
filepath = Path(os.path.join(settings.CAVEDESCRIPTIONS, self.filename))
t = loader.get_template("dataformat/cave.xml")
now = datetime.now(timezone.utc)
c = dict({"cave": self, "date": now})
content = t.render(c)
return (filepath, content, "utf8")
def getArea(self):
areas = self.area.all()
lowestareas = list(areas)
for area in areas:
if area.super in areas:
try:
lowestareas.remove(area.super)
except:
pass
return lowestareas[0]
class Entrance(TroggleModel):
MARKING_CHOICES = (
("P", "Paint"),
("P?", "Paint (?)"),
("T", "Tag"),
("T?", "Tag (?)"),
("R", "Needs Retag"),
("S", "Spit"),
("S?", "Spit (?)"),
("U", "Unmarked"),
("?", "Unknown"),
)
FINDABLE_CHOICES = (("?", "To be confirmed ..."), ("S", "Coordinates"), ("L", "Lost"), ("R", "Refindable"))
alt = models.TextField(blank=True, null=True)
approach = models.TextField(blank=True, null=True)
bearings = models.TextField(blank=True, null=True)
easting = models.TextField(blank=True, null=True)
entrance_description = models.TextField(blank=True, null=True)
exact_station = models.TextField(blank=True, null=True)
explorers = models.TextField(blank=True, null=True)
filename = models.CharField(max_length=200)
findability = models.CharField(max_length=1, choices=FINDABLE_CHOICES, blank=True, null=True)
findability_description = models.TextField(blank=True, null=True)
lastvisit = models.TextField(blank=True, null=True)
lat_wgs84 = models.TextField(blank=True, null=True)
location_description = models.TextField(blank=True, null=True)
long_wgs84 = models.TextField(blank=True, null=True)
map_description = models.TextField(blank=True, null=True)
marking = models.CharField(max_length=2, choices=MARKING_CHOICES)
marking_comment = models.TextField(blank=True, null=True)
name = models.CharField(max_length=100, blank=True, null=True)
northing = models.TextField(blank=True, null=True)
other_description = models.TextField(blank=True, null=True)
other_station = models.TextField(blank=True, null=True)
photo = models.TextField(blank=True, null=True)
slug = models.SlugField(max_length=50, unique=True, default="default_slug_id")
tag_station = models.TextField(blank=True, null=True)
underground_description = models.TextField(blank=True, null=True)
url = models.CharField(max_length=300, blank=True, null=True)
class Meta:
ordering = ["caveandentrance__entrance_letter"]
def __str__(self):
return str(self.slug)
def single(self, station):
try:
single = SurvexStation.objects.get(name = station)
return single
except:
stations = SurvexStation.objects.filter(name = station)
print(f" # MULTIPLE stations found with same name '{station}' in Entrance {self}:")
if len(stations) > 1:
for s in stations:
print(f" # {s.id=} - {s.name} {s.latlong()}") # .id is Django internal field, not one of ours
return stations[0]
else:
return None
def exact_location(self):
return self.single(self.exact_station)
def other_location(self):
return self.single(self.other_station)
def find_location(self):
r = {"": "To be entered ", "?": "To be confirmed:", "S": "", "L": "Lost:", "R": "Refindable:"}[self.findability]
if self.tag_station:
try:
s = SurvexStation.objects.lookup(self.tag_station)
return r + f"{s.x:0.0f}E {s.y:0.0f}N {s.z:0.0f}Alt"
except:
return r + f"{self.tag_station} Tag Station not in dataset"
if self.exact_station:
try:
s = SurvexStation.objects.lookup(self.exact_station)
return r + f"{s.x:0.0f}E {s.y:0.0f}N {s.z:0.0f}Alt"
except:
return r + f"{self.tag_station} Exact Station not in dataset"
if self.other_station:
try:
s = SurvexStation.objects.lookup(self.other_station)
return r + f"{s.x:0.0f}E {s.y:0.0f}N {s.z:0.0f}Alt {self.other_description}"
except:
return r + f"{self.tag_station} Other Station not in dataset"
if self.findability == "S":
r += "ERROR, Entrance has been surveyed but has no survex point"
if self.bearings:
return r + self.bearings
return r
def best_station(self):
if self.tag_station:
return self.tag_station
if self.exact_station:
return self.exact_station
if self.other_station:
return self.other_station
def has_photo(self):
if self.photo:
if (
self.photo.find("<img") > -1
or self.photo.find("<a") > -1
or self.photo.find("<IMG") > -1
or self.photo.find("<A") > -1
):
return "Yes"
else:
return "Missing"
else:
return "No"
def marking_val(self):
for m in self.MARKING_CHOICES:
if m[0] == self.marking:
return m[1]
def findability_val(self):
for f in self.FINDABLE_CHOICES:
if f[0] == self.findability:
return f[1]
def tag(self):
return self.single(self.tag_station)
def needs_surface_work(self):
return self.findability != "S" or self.has_photo() != "Yes" or self.marking != "T"
def get_absolute_url(self):
res = "/".join((self.get_root().cave.get_absolute_url(), self.title))
return res
def cavelist(self):
rs = []
for e in CaveAndEntrance.objects.filter(entrance=self):
if e.cave:
rs.append(e.cave)
return rs
def get_file_path(self):
return Path(settings.ENTRANCEDESCRIPTIONS, self.filename)
def file_output(self):
filepath = Path(os.path.join(settings.ENTRANCEDESCRIPTIONS, self.filename))
t = loader.get_template("dataformat/entrance.xml")
now = datetime.now(timezone.utc)
c = dict({"entrance": self, "date": now})
content = t.render(c)
return (filepath, content, "utf8")
def writeDataFile(self):
filepath = os.path.join(settings.ENTRANCEDESCRIPTIONS, self.filename)
t = loader.get_template("dataformat/entrance.xml")
now = datetime.now(timezone.utc)
c = dict({"entrance": self, "date": now})
u = t.render(c)
writetrogglefile(filepath, u)
return
def url_parent(self):
if self.url:
return self.url.rsplit("/", 1)[0]
else:
cavelist = self.cavelist()
if len(self.cavelist()) == 1:
return cavelist[0].url_parent()
else:
return ""
def latlong(self):
station = None
if self.other_station:
try:
station = SurvexStation.objects.get(name = self.other_station)
except:
pass
if self.tag_station:
try:
station = SurvexStation.objects.get(name = self.tag_station)
except:
pass
if self.exact_station:
try:
station = SurvexStation.objects.get(name = self.exact_station)
except:
pass
if station:
return station.latlong()
def GetCaveLookup():
"""A very relaxed way of finding probably the right cave given almost any string which might serve to identify it
lookup function modelled on GetPersonExpeditionNameLookup
repeated assignment each call, needs refactoring
Used when parsing wallets contents.json file too in views/uploads.py
Does NOT detect duplicates! Needs fixing.
Needs to be a proper function that raises an exception if there is a duplicate,
OR we could set it to return None if there are duplicates, and require the caller to
fall back on doing the actual database query it wants rather than using this cache shortcut
"""
duplicates = {}
def checkcaveid(cave, id):
global Gcavelookup
if id not in Gcavelookup:
Gcavelookup[id] = cave
Gcave_count[id] += 1
else:
if cave == Gcavelookup[id]:
pass # same id, same cave
else: # same id but different cave
# message = f" - Warning: ignoring alias id '{id:3}'. Caves '{Gcavelookup[id]}' and '{cave}'. "
# print(message)
# DataIssue.objects.create(parser="aliases", message=message)
duplicates[id] = 1
global Gcavelookup
if Gcavelookup:
return Gcavelookup
Gcavelookup = {"NONEPLACEHOLDER": None}
global Gcave_count
Gcave_count = defaultdict(int) # sets default value to int(0)
DataIssue.objects.filter(parser="aliases").delete()
DataIssue.objects.filter(parser="aliases ok").delete()
for cave in Cave.objects.all():
key = cave.official_name.lower()
if key != "" and key != "unamed" and key != "unnamed":
if Gcave_count[key] > 0:
# message = f" - Warning: ignoring alias id '{id:3}'. Caves '{Gcavelookup[id]}' and '{cave}'. "
# print(message)
# DataIssue.objects.create(parser="aliases", message=message)
duplicates[key] = 1
else:
Gcavelookup[key] = cave
Gcave_count[key] += 1
if cave.kataster_number:
checkcaveid(cave, cave.kataster_number) # we do expect 1623/55 and 1626/55 to cause a warning message
# the rest of these are 'nice to have' but may validly already be set
if cave.unofficial_number:
unoffn = cave.unofficial_number.lower()
checkcaveid(cave, unoffn)
if cave.filename:
# this is the slug - usually.. but usually done as f'{cave.area}-{cave.kataster_number}'
fn = cave.filename.replace(".html", "").lower()
checkcaveid(cave, fn)
if cave.slug():
# also possibly done already
slug = cave.slug().lower()
checkcaveid(cave, slug)
# These might also create more duplicate entries
# Yes, this should be set in, and imported from, settings.py
aliases = [
("1987-02", "267"),
("1990-01", "171"),
("1990-02", "172"),
("1990-03", "173"),
("1990-04", "174"),
("1990-05", "175"),
("1990-06", "176"),
("1990-07", "177"),
("1990-08", "178"),
("1990-09", "179"),
("1990-10", "180"),
("1990-11", "181"),
("1990-12", "182"),
("1990-13", "183"),
("1990-14", "184"),
("1990-18", "188"),
("1990-adam", "225"),
("1993-01", "200"),
("1996-02", "224"),
("1996-03", "223"),
("1996-04", "222"),
("1996wk2", "207"),
("1996wk3", "208"),
("1996wk5", "219"),
("1996wk6", "218"),
("1996wk8", "209"),
("1996wk11", "268"),
("96wk11", "268"),
("1998-01", "201"),
("1998-03", "210"),
("1999-03", "204"),
("1999-04", "230"),
("1999-10", "162"),
("1999-bo-01", "205"),
("1999-ob-03", "226"),
("1999-ob-04", "227"),
("2000-01", "231"),
("2000-03", "214"),
("2000-04", "220"),
("2000-05", "215"),
("2000-06", "216"),
("2000-07", "217"),
("2000-09", "234"),
("2000-aa-01", "250"),
("2001-04", "239"),
("2001-05", "243"),
("2002-01", "249"),
("2002-02", "234"),
("2002-04", "242"),
("2002-05", "294"),
("2003-01", "256"),
("2003-02", "248"),
("2003-03", "247"),
("2003-04", "241"),
("2003-05", "246"),
("2003-06", "161"),
("2003-08", "240"),
("2003-09", "245"),
("2003-10", "244"),
("2004-01", "269"),
("2004-03", "270"),
("2004-11", "251"),
("2004-12", "161"),
("2004-15", "253"),
("2004-19", "254"),
("2004-20", "255"),
("2005-04", "204"),
("2005-05", "264"),
("2005-07", "257"),
("2006-08", "285"),
("2006-09", "298"),
("2007-71", "271"),
("2010-01", "263"),
("2010-03", "293"),
("2011-01", "292"),
("2012-dd-05", "286"),
("2012-ns-13", "292"),
("2014-neo-01", "273"),
("2014-sd-01", "274"),
("2014-ms-14", "287"),
("2015-mf-06", "288"),
("2016-jb-01", "289"),
("2017-pw-01", "277"),
("2018-dm-07", "359"), # NB this is 1626
("2017_cucc_24", "291"), # note _ not -
("2017_cucc_23", "295"), # note _ not -
("2017_cucc_28", "290"), # note _ not -
("bs17", "283"),
("1976/b11", "198"),
("1976/b8", "197"),
("1976/b9", "190"),
("b11", "1976/b11"),
("b8", "1976/b8"),
("b9", "1976/b9"),
("2011-01-bs30", "190"),
("bs30", "190"),
("2011-01", "190"),
("quarriesd", "2002-08"),
("2002-x11", "2005-08"),
("2002-x12", "2005-07"),
("2002-x13", "2005-06"),
("2002-x14", "2005-05"),
("kh", "161"),
("161-kh", "161"),
("204-steinBH", "204"),
("stonebridge", "204"),
("hauchhole", "234"),
("hauch", "234"),
("234-hauch", "234"),
("tunnocks", "258"),
("balcony", "264"),
("balkon", "264"),
("fgh", "290"),
("gsh", "291"),
("homecoming", "2018-dm-07"),
("heimkommen", "2018-dm-07"),
("Heimkehr", "2018-dm-07"),
("99ob02", "1999-ob-02"),
]
for i in aliases:
if i[1] in Gcavelookup:
if i[0] in Gcavelookup:
# already set by a different method, but is it the same cave?
if Gcavelookup[i[0]] == Gcavelookup[i[1]]:
pass
else:
Gcave_count[i[0]] += 1
Gcavelookup[i[0]] = Gcavelookup[i[1]]
else:
message = f" * Coding or cave existence mistake, cave for id '{i[1]}' does not exist. Expecting to set alias '{i[0]}' to it"
# print(message)
DataIssue.objects.create(parser="aliases", message=message)
addmore = {}
for id in Gcavelookup:
addmore[id.replace("-", "_")] = Gcavelookup[id]
addmore[id.replace("_", "-")] = Gcavelookup[id]
addmore[id.upper()] = Gcavelookup[id]
Gcavelookup = {**addmore, **Gcavelookup}
addmore = {}
ldup = []
for d in duplicates:
Gcavelookup.pop(d)
Gcave_count.pop(d)
ldup.append(d)
if ldup:
message = f" - Ambiguous aliases removed: {ldup}"
print(message)
DataIssue.objects.create(parser="aliases ok", message=message)
for c in Gcave_count:
if Gcave_count[c] > 1:
message = f" ** Duplicate cave id count={Gcave_count[c]} id:'{Gcavelookup[c]}' cave __str__:'{c}'"
print(message)
DataIssue.objects.create(parser="aliases", message=message)
return Gcavelookup
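# Illustrative usage sketch (assumed identifiers, not part of the original file):
#
#     Gcavelookup = GetCaveLookup()
#     cave = Gcavelookup.get("stonebridge")  # alias defined above, same Cave as "204"
#     same = Gcavelookup.get("204")          # kataster number
#     if cave is None:
#         pass  # unknown or ambiguous id: the caller must fall back to a real query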

226 core/models/logbooks.py Normal file

@@ -0,0 +1,226 @@
from pathlib import Path
from urllib.parse import urljoin
from django.db import models
from django.urls import reverse
import settings
from troggle.core.models.troggle import Expedition, TroggleModel
"""The model declarations LogBookEntry, PersonLogEntry, QM
"""
todo = """
- Can we rewrite things to eliminate the CaveSlug and objects? No
Surely foreign keys work fine ?! No
Foreign keys do not allow for there being multiple ways to refer to a cave, eg 1623-1999-03 aka 1623-204
Having slugs allows much looser coupling to caves, which removes a lot of the need to reset the database, which interrupts workflow.
It also means we do not have to create temporary cave objects in the database, where we do not have the underlying file in cave_data.
To do: move CaveSlug back to troggle.core.models
"""
class CaveSlug(models.Model):
"""Moved here to avoid nasty cyclic import error
CASCADE means that if the Cave is deleted, this is too
"""
cave = models.ForeignKey("Cave", on_delete=models.CASCADE)
slug = models.SlugField(max_length=50, unique=True)
primary = models.BooleanField(default=False)
def __str__(self):
return f"{self.slug}: {self.cave}"
class LogbookEntry(TroggleModel):
"""Single parsed entry from Logbook
Gets deleted if the Expedition gets deleted"""
date = (
models.DateField()
) # MJG wants to turn this into a datetime so that multiple Logbook entries on the same day can be ordered.
expedition = models.ForeignKey(Expedition, blank=True, null=True, on_delete=models.CASCADE) # yes this is double-
title = models.CharField(max_length=200)
cave_slug = models.SlugField(max_length=50, blank=True, null=True)
place = models.CharField(
max_length=100, blank=True, null=True, help_text="Only use this if you haven't chosen a cave"
)
text = models.TextField()
slug = models.SlugField(max_length=50)
time_underground = models.FloatField(null=True, help_text="In decimal hours")
class Meta:
verbose_name_plural = "Logbook Entries"
# several PersonLogEntrys point in to this object
ordering = ("-date",)
def cave(self): # Why didn't he just make this a foreign key to Cave ?
c = CaveSlug.objects.get(slug=self.cave_slug, primary=True).cave
return c
def isLogbookEntry(self): # Function used in templates
return True
def get_absolute_url(self):
return urljoin(settings.URL_ROOT, reverse("logbookentry", kwargs={"date": self.date, "slug": self.slug}))
def __str__(self):
return f"{self.date}: {self.title}"
def get_next_by_id(self):
return LogbookEntry.objects.get(id=self.id + 1)
def get_previous_by_id(self):
return LogbookEntry.objects.get(id=self.id - 1)
def DayIndex(self):
"""This is used to set different colours for the different trips on
the calendar view of the expedition"""
mx = 10
todays = list(LogbookEntry.objects.filter(date=self.date))
if self in todays:
index = todays.index(self)
else:
print(f"DayIndex: Synchronization error in logbook entries. Restart server or do full reset. {self}")
index = 0
if index not in range(0, mx):
print(f"DayIndex: More than {mx-1} LogbookEntry items on one day '{index}' {self}, restarting colour sequence.")
index = index % mx
return index
class PersonLogEntry(TroggleModel):
"""Single Person going on a trip, which may or may not be written up.
It could account for different T/U for people in same logbook entry.
CASCADE means that if the personexpedition or the logbookentry is deleted,
then this PersonLogEntry is deleted too
"""
personexpedition = models.ForeignKey("PersonExpedition", null=True, on_delete=models.CASCADE)
time_underground = models.FloatField(help_text="In decimal hours")
logbook_entry = models.ForeignKey(LogbookEntry, on_delete=models.CASCADE)
is_logbook_entry_author = models.BooleanField(default=False)
class Meta:
ordering = ("-personexpedition",)
# order_with_respect_to = 'personexpedition'
def next_personlog(self):
futurePTs = (
PersonLogEntry.objects.filter(
personexpedition=self.personexpedition, logbook_entry__date__gt=self.logbook_entry.date
)
.order_by("logbook_entry__date")
.all()
)
if len(futurePTs) > 0:
return futurePTs[0]
else:
return None
def prev_personlog(self):
pastPTs = (
PersonLogEntry.objects.filter(
personexpedition=self.personexpedition, logbook_entry__date__lt=self.logbook_entry.date
)
.order_by("-logbook_entry__date")
.all()
)
if len(pastPTs) > 0:
return pastPTs[0]
else:
return None
def place(self):
return self.logbook_entry.cave and self.logbook_entry.cave or self.logbook_entry.place
def __str__(self):
return f"{self.personexpedition} ({self.logbook_entry.date})"
class QM(TroggleModel):
"""This is based on qm.csv in trunk/expoweb/1623/204 which has the fields:
"Number","Grade","Area","Description","Page reference","Nearest station","Completion description","Comment"
All the stuff handling TICK QMs is INCOMPLETE
"""
number = models.IntegerField(
help_text="this is the sequential number in the year, only unique for CSV imports",
)
grade = models.CharField(max_length=1, blank=True, null=True, help_text="A/B/C/D/X")
cave = models.ForeignKey("Cave", related_name="QMs", blank=True, null=True, on_delete=models.SET_NULL)
block = models.ForeignKey("SurvexBlock", null=True, on_delete=models.SET_NULL) # only for QMs from survex files
blockname = models.TextField(blank=True, null=True) # NB truncated copy of survexblock name with last char added
expoyear = models.CharField(max_length=4, blank=True, null=True)
ticked = models.BooleanField(default=False)
location_description = models.TextField(blank=True, null=True)
completion_description = models.TextField(blank=True, null=True)
completion_date = models.DateField(blank=True, null=True)
nearest_station_name = models.CharField(max_length=200, blank=True, null=True)
resolution_station_name = models.CharField(max_length=200, blank=True, null=True)
area = models.CharField(max_length=100, blank=True, null=True)
page_ref = models.TextField(blank=True, null=True)
comment = models.TextField(blank=True, null=True)
def __str__(self):
return f"{self.code()}"
def code(self):
if self.cave:
cavestr = str(self.cave.slug())[5:]
else:
cavestr = ""
if self.expoyear:
expoyearstr = str(self.expoyear)
else:
expoyearstr = str(self.cave.slug())[5:9]
if self.blockname:
blocknamestr = "-" + str(self.blockname)
else:
blocknamestr = ""
return f"{cavestr}-{expoyearstr}-{self.number}{self.grade}{blocknamestr}"
# def get_completion_url(self):
# """assumes html file named is in same folder as cave description file
# WRONG - needs rewriting!"""
# cd = None
# if self.completion_description:
# try:
# dir = Path(self.cave.url).parent
# cd = dir / self.completion_description
# except:
# cd = None
# return cd
def newslug(self):
qmslug = f"{str(self.cave)}-{self.expoyear}-{self.blockname}{self.number}{self.grade}"
return qmslug
def get_absolute_url(self):
# This reverse resolution stuff is pure magic. Just change the regex in urls.py and everything changes to suit. Whacky.
return urljoin(
settings.URL_ROOT,
reverse(
"qm",
kwargs={
"cave_id": self.cave.slug(),
"year": self.expoyear,
"blockname": self.blockname,
"qm_id": self.number,
"grade": self.grade,
},
),
)
def get_next_by_id(self): # called in template
return QM.objects.get(id=self.id + 1)
def get_previous_by_id(self): # called in template
return QM.objects.get(id=self.id - 1)

298 core/models/survex.py Normal file

@@ -0,0 +1,298 @@
import os
import re
from urllib.parse import urljoin
from pathlib import Path
from django.conf import settings
from django.db import models
from django.urls import reverse
# from troggle.core.models.troggle import DataIssue # circular import. Hmm
class SurvexDirectory(models.Model):
"""This relates a Cave to the primary SurvexFile which is the 'head' of the survex tree for
that cave. Surely this could just be a property of Cave ? No. Several subdirectories
all relate to the same Cave
"""
path = models.CharField(max_length=200)
cave = models.ForeignKey("Cave", blank=True, null=True, on_delete=models.SET_NULL)
primarysurvexfile = models.ForeignKey(
"SurvexFile", related_name="primarysurvexfile", blank=True, null=True, on_delete=models.SET_NULL
)
# could also include files in directory but not referenced
class Meta:
ordering = ("id",)
verbose_name_plural = "Survex directories"
def contents(self):
return "[SvxDir:" + str(self.path) + " | Primary svx:" + str(self.primarysurvexfile.path) + ".svx ]"
def __str__(self):
return "[SvxDir:" + str(self.path)+ "]"
class SurvexFile(models.Model):
path = models.CharField(max_length=200)
survexdirectory = models.ForeignKey("SurvexDirectory", blank=True, null=True, on_delete=models.SET_NULL)
cave = models.ForeignKey("Cave", blank=True, null=True, on_delete=models.SET_NULL)
class Meta:
ordering = ("id",)
# Don't change from the default as that breaks troggle webpages and internal referencing!
# def __str__(self):
# return "[SurvexFile:"+str(self.path) + "-" + str(self.survexdirectory) + "-" + str(self.cave)+"]"
def exists(self):
"""This is only used within the Django templates
"""
fname = Path(settings.SURVEX_DATA, self.path + ".svx")
return fname.is_file()
def SetDirectory(self):
dirpath = os.path.split(self.path)[0]
# pointless search every time we import a survex file if we know there are no duplicates..
# don't use this for initial import.
survexdirectorylist = SurvexDirectory.objects.filter(cave=self.cave, path=dirpath)
if survexdirectorylist:
self.survexdirectory = survexdirectorylist[0]
else:
survexdirectory = SurvexDirectory(path=dirpath, cave=self.cave, primarysurvexfile=self)
survexdirectory.save()
self.survexdirectory = survexdirectory
self.save()
# Don't change from the default as that breaks troggle webpages and internal referencing!
# def __str__(self):
# return "[SurvexFile:"+str(self.path) + "-" + str(self.survexdirectory) + "-" + str(self.cave)+"]"
def __str__(self):
return self.path
class SurvexStationLookUpManager(models.Manager):
"""what this does,
https://docs.djangoproject.com/en/dev/topics/db/managers/
This changes the .objects thinggy to use a case-insensitive match name__iexact
so that now SurvexStation.objects.lookup() works as a case-insensitive match
"""
def lookup(self, name):
blocknames, sep, stationname = name.rpartition(".")
return self.get(block=SurvexBlock.objects.lookup(blocknames), name__iexact=stationname)
class SurvexStation(models.Model):
name = models.CharField(max_length=100)
# block = models.ForeignKey("SurvexBlock", null=True, on_delete=models.SET_NULL)
# block not used since 2020. SurvexStation objects are only used for entrance locations and are all taken from the .3d file
objects = SurvexStationLookUpManager() # overwrites SurvexStation.objects and enables lookup()
x = models.FloatField(blank=True, null=True)
y = models.FloatField(blank=True, null=True)
z = models.FloatField(blank=True, null=True)
# def path(self):
# r = self.name
# b = self.block
# while True:
# if b.name:
# r = b.name + "." + r
# if b.parent:
# b = b.parent
# else:
# return r
class Meta:
ordering = ("id",)
def __str__(self):
return self.name and str(self.name) or "no name"
def latlong(self):
return utmToLatLng(33, self.x, self.y, northernHemisphere=True)
import math
def utmToLatLng(zone, easting, northing, northernHemisphere=True):
if not northernHemisphere:
northing = 10000000 - northing
a = 6378137
e = 0.081819191
e1sq = 0.006739497
k0 = 0.9996
arc = northing / k0
mu = arc / (a * (1 - math.pow(e, 2) / 4.0 - 3 * math.pow(e, 4) / 64.0 - 5 * math.pow(e, 6) / 256.0))
ei = (1 - math.pow((1 - e * e), (1 / 2.0))) / (1 + math.pow((1 - e * e), (1 / 2.0)))
ca = 3 * ei / 2 - 27 * math.pow(ei, 3) / 32.0
cb = 21 * math.pow(ei, 2) / 16 - 55 * math.pow(ei, 4) / 32
cc = 151 * math.pow(ei, 3) / 96
cd = 1097 * math.pow(ei, 4) / 512
phi1 = mu + ca * math.sin(2 * mu) + cb * math.sin(4 * mu) + cc * math.sin(6 * mu) + cd * math.sin(8 * mu)
n0 = a / math.pow((1 - math.pow((e * math.sin(phi1)), 2)), (1 / 2.0))
r0 = a * (1 - e * e) / math.pow((1 - math.pow((e * math.sin(phi1)), 2)), (3 / 2.0))
fact1 = n0 * math.tan(phi1) / r0
_a1 = 500000 - easting
dd0 = _a1 / (n0 * k0)
fact2 = dd0 * dd0 / 2
t0 = math.pow(math.tan(phi1), 2)
Q0 = e1sq * math.pow(math.cos(phi1), 2)
fact3 = (5 + 3 * t0 + 10 * Q0 - 4 * Q0 * Q0 - 9 * e1sq) * math.pow(dd0, 4) / 24
fact4 = (61 + 90 * t0 + 298 * Q0 + 45 * t0 * t0 - 252 * e1sq - 3 * Q0 * Q0) * math.pow(dd0, 6) / 720
lof1 = _a1 / (n0 * k0)
lof2 = (1 + 2 * t0 + Q0) * math.pow(dd0, 3) / 6.0
lof3 = (5 - 2 * Q0 + 28 * t0 - 3 * math.pow(Q0, 2) + 8 * e1sq + 24 * math.pow(t0, 2)) * math.pow(dd0, 5) / 120
_a2 = (lof1 - lof2 + lof3) / math.cos(phi1)
_a3 = _a2 * 180 / math.pi
latitude = 180 * (phi1 - fact1 * (fact2 + fact3 + fact4)) / math.pi
if not northernHemisphere:
latitude = -latitude
longitude = ((zone > 0) and (6 * zone - 183.0) or 3.0) - _a3
return (latitude, longitude)
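# Illustrative sketch (assumed coordinates, not part of the original file): station
# positions are UTM zone 33N metres, so for a point near the expo area at roughly
# easting 410000, northing 5280000,
#     utmToLatLng(33, 410000, 5280000)
# returns approximately (47.7, 13.8), i.e. (latitude, longitude) in decimal degrees.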
#
# Single SurvexBlock
#
class SurvexBlockLookUpManager(models.Manager):
"""what this does,
https://docs.djangoproject.com/en/dev/topics/db/managers/
This adds a method to the .objects thinggy to use a case-insensitive match name__iexact
so that now SurvexBlock.objects.lookup() works as a case-insensitive match.
This is used in lookup() in SurvexStationLookUpManager()
which is used in Entrance().other_location() which is used in the Cave webpage
"""
def lookup(self, name):
if name == "":
blocknames = []
else:
blocknames = name.split(".")
block = SurvexBlock.objects.get(parent=None, survexfile__path=settings.SURVEX_TOPNAME)
for blockname in blocknames:
block = SurvexBlock.objects.get(parent=block, name__iexact=blockname)
return block
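# Illustrative sketch (assumed block path, not part of the original file):
#
#     block = SurvexBlock.objects.lookup("caves-1623.204.gaffered")
#
# starts at the root block (survexfile path == settings.SURVEX_TOPNAME) and walks down
# one child block per dotted component, matching each name case-insensitively.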
class SurvexBlock(models.Model):
"""One begin..end block within a survex file. The basic element of a survey trip.
Multiple anonymous survex blocks are possible within the same survex file.
A block can span several *include'd survex files though.
"""
objects = SurvexBlockLookUpManager() # overwrites SurvexBlock.objects and enables lookup()
name = models.CharField(max_length=100)
title = models.CharField(max_length=200)
parent = models.ForeignKey("SurvexBlock", blank=True, null=True, on_delete=models.SET_NULL)
date = models.DateField(blank=True, null=True)
expedition = models.ForeignKey("Expedition", blank=True, null=True, on_delete=models.SET_NULL)
# if the survexfile object is deleted, then all the survex blocks in it should be too,
# though a block can span more than one file...
survexfile = models.ForeignKey("SurvexFile", blank=True, null=True, on_delete=models.CASCADE)
survexpath = models.CharField(max_length=200) # the path for the survex stations
scanswallet = models.ForeignKey(
"Wallet", null=True, on_delete=models.SET_NULL
) # only ONE wallet per block. The most recent seen overwrites.. ugh.
legsall = models.IntegerField(null=True) # summary data for this block
legslength = models.FloatField(null=True)
class Meta:
ordering = ("id",)
# def __str__(self):
# return "[SurvexBlock:" + str(self.name) + "-path:" + str(self.survexpath) + "-cave:" + str(self.cave) + "]"
def __str__(self):
return self.name and str(self.name) or "no_name-#" + str(self.id)
def isSurvexBlock(self): # Function used in templates
return True
def DayIndex(self):
"""This is used to set different colours for the different trips on
the calendar view of the expedition"""
# print(f"SurvexBlock DayIndex {self.name} '{self.date}' {len(list(SurvexBlock.objects.filter(date=self.date)))} on this date")
mx = 10
todays = list(SurvexBlock.objects.filter(date=self.date))
if self in todays:
index = todays.index(self)
else:
print(f"DayIndex: Synchronization error in survex blocks. Restart server or do full reset. {self}")
index = 0
if index not in range(0, mx):
print(f"DayIndex: More than {mx-1} SurvexBlock items on one day '{index}' {self}, restarting colour sequence.")
index = index % mx
# return list(self.survexblock_set.all()).index(self)
return index
class SurvexPersonRole(models.Model):
"""The CASCADE means that if a SurvexBlock or a Person is deleted, then the SurvexPersonRole
is deleted too
"""
survexblock = models.ForeignKey("SurvexBlock", on_delete=models.CASCADE)
# increasing levels of precision. Surely we only need survexblock and person now that we have no link to a logbook entry?
personname = models.CharField(max_length=100)
person = models.ForeignKey("Person", blank=True, null=True, on_delete=models.CASCADE) # not needed
personexpedition = models.ForeignKey("PersonExpedition", blank=True, null=True, on_delete=models.SET_NULL)
def __str__(self):
return str(self.personname) + " - " + str(self.survexblock)
class SingleScan(models.Model):
"""A single file holding an image. Could be raw notes, an elevation plot or whatever"""
ffile = models.CharField(max_length=200)
name = models.CharField(max_length=200)
wallet = models.ForeignKey("Wallet", null=True, on_delete=models.SET_NULL)
class Meta:
ordering = ("name",)
def get_absolute_url(self):
return urljoin(
settings.URL_ROOT,
reverse("scansingle", kwargs={"path": re.sub("#", "%23", self.wallet.walletname), "file": self.name}),
)
def __str__(self):
return "Scan Image: " + str(self.name) + " in " + str(self.wallet)
class DrawingFile(models.Model):
"""A file holding a Therion (several types) or a Tunnel drawing
Most of the implied capabilities are not implemented yet"""
dwgpath = models.CharField(max_length=200)
dwgname = models.CharField(max_length=200)
dwgwallets = models.ManyToManyField("Wallet") # implicitly links via folders to scans to SVX files
scans = models.ManyToManyField("SingleScan") # implicitly links via scans to SVX files
dwgcontains = models.ManyToManyField("DrawingFile") # case when its a frame type
filesize = models.IntegerField(default=0)
npaths = models.IntegerField(default=0)
survexfiles = models.ManyToManyField("SurvexFile") # direct link to SVX files - not populated yet
class Meta:
ordering = ("dwgpath",)
def __str__(self):
return "Drawing File: " + str(self.dwgname) + " (" + str(self.filesize) + " bytes)"

187 core/models/troggle.py Normal file

@@ -0,0 +1,187 @@
from decimal import Decimal, getcontext
from urllib.parse import urljoin
getcontext().prec = 2 # use 2 significant figures for decimal calculations
from django.db import models
from django.urls import reverse
import settings
"""This file declares TroggleModel which inherits from django.db.models.Model
All TroggleModel and models.Model subclasses inherit persistence in the django relational database. This is known as
the django Object Relational Mapping (ORM).
There are more subclasses defined in models/caves.py models/survex.py etc.
"""
class TroggleModel(models.Model):
"""This class is for adding fields and methods which all of our models will have."""
new_since_parsing = models.BooleanField(default=False, editable=False)
non_public = models.BooleanField(default=False)
def object_name(self):
return self._meta.object_name
def get_admin_url(self):
return urljoin(settings.URL_ROOT, "/admin/core/" + self.object_name().lower() + "/" + str(self.pk))
class Meta:
abstract = True
class DataIssue(TroggleModel):
"""When importing cave data any validation problems produce a message which is
recorded as a DataIssue. The django admin system automatically produces a page listing
these at /admin/core/dataissue/
This is a use of the NOTIFICATION pattern:
https://martinfowler.com/eaaDev/Notification.html
We have replaced all assertions in the code with messages and local fix-ups or skips:
https://martinfowler.com/articles/replaceThrowWithNotification.html
See also the use of stash_data_issue() & store_data_issues() in parsers/survex.py which defer writing to the database until the end of the import.
"""
date = models.DateTimeField(auto_now_add=True, blank=True)
parser = models.CharField(max_length=50, blank=True, null=True)
message = models.CharField(max_length=800, blank=True, null=True)
url = models.CharField(max_length=300, blank=True, null=True) # link to offending object
class Meta:
ordering = ["date"]
def __str__(self):
return f"{self.parser} - {self.message}"
#
# single Expedition, usually seen by year
#
class Expedition(TroggleModel):
year = models.CharField(max_length=20, unique=True)
name = models.CharField(max_length=100)
logbookfile = models.CharField(max_length=100, blank=True, null=True)
def __str__(self):
return self.year
class Meta:
ordering = ("-year",)
get_latest_by = "year"
def get_absolute_url(self):
return urljoin(settings.URL_ROOT, reverse("expedition", args=[self.year]))
# class ExpeditionDay(TroggleModel):
# """Exists only on Expedition now. Removed links from logbookentry, personlogentry, survex stuff etc.
# """
# expedition = models.ForeignKey("Expedition",on_delete=models.CASCADE)
# date = models.DateField()
# class Meta:
# ordering = ('date',)
class Person(TroggleModel):
"""single Person, can go on many years"""
first_name = models.CharField(max_length=100)
last_name = models.CharField(max_length=100)
fullname = models.CharField(max_length=200)
nickname = models.CharField(max_length=200)
is_vfho = models.BooleanField(
help_text="VFHO is the Vereines f&uuml;r H&ouml;hlenkunde in Obersteier, a nearby Austrian caving club.",
default=False,
)
mug_shot = models.CharField(max_length=100, blank=True, null=True)
blurb = models.TextField(blank=True, null=True)
orderref = models.CharField(max_length=200) # for alphabetic
def get_absolute_url(self):
return urljoin(
settings.URL_ROOT, reverse("person", kwargs={"first_name": self.first_name, "last_name": self.last_name})
)
class Meta:
verbose_name_plural = "People"
ordering = ("orderref",) # "Wookey" makes too complex for: ('last_name', 'first_name')
def __str__(self):
if self.last_name:
return f"{self.first_name} {self.last_name}"
return self.first_name
def notability(self):
"""This is actually recency: all recent cavers, weighted by number of expos"""
notability = Decimal(0)
max_expo_val = 0
max_expo_year = Expedition.objects.all().aggregate(models.Max("year"))
max_expo_val = int(max_expo_year["year__max"]) + 1
for personexpedition in self.personexpedition_set.all():
if not personexpedition.is_guest:
notability += Decimal(1) / (max_expo_val - int(personexpedition.expedition.year))
return notability
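# Illustrative worked example (assumed years, not from the original file): if the most
# recent expedition year is 2023 then max_expo_val is 2024, so a non-guest who attended
# the 2023 and 2022 expeditions scores 1/1 + 1/2 = 1.5; bisnotable() below treats
# anything above 1/3 as notable.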
def bisnotable(self):
"""Boolean: is this person notable?"""
return self.notability() > Decimal(1) / Decimal(3)
def surveyedleglength(self):
return sum([personexpedition.surveyedleglength() for personexpedition in self.personexpedition_set.all()])
def first(self):
return self.personexpedition_set.order_by("-expedition")[0]
def last(self):
return self.personexpedition_set.order_by("expedition")[0]
# moved from personexpedition
def name(self):
if self.nickname:
return f"{self.first_name} ({self.nickname}) {self.last_name}"
if self.last_name:
return f"{self.first_name} {self.last_name}"
return self.first_name
class PersonExpedition(TroggleModel):
"""Person's attendance to one Expo
CASCADE means that if an expedition or a person is deleted, the PersonExpedition
is deleted too
"""
expedition = models.ForeignKey(Expedition, on_delete=models.CASCADE)
person = models.ForeignKey(Person, on_delete=models.CASCADE)
slugfield = models.SlugField(max_length=50, blank=True, null=True) # 2022 to be used in future
is_guest = models.BooleanField(default=False)
class Meta:
ordering = ("-expedition",)
# order_with_respect_to = 'expedition'
def __str__(self):
return f"{self.person}: ({self.expedition})"
def get_absolute_url(self):
return urljoin(
settings.URL_ROOT,
reverse(
"personexpedition",
kwargs={
"first_name": self.person.first_name,
"last_name": self.person.last_name,
"year": self.expedition.year,
},
),
)
def surveyedleglength(self):
"""Survey length for this person on all survex trips on this expedition"""
survexblocks = [personrole.survexblock for personrole in self.survexpersonrole_set.all()]
return sum([survexblock.legslength for survexblock in set(survexblocks)])

339 core/models/wallets.py Normal file

@@ -0,0 +1,339 @@
import datetime
import json
import operator
import re
from functools import reduce
from pathlib import Path
from urllib.parse import urljoin
from django.conf import settings
from django.db import models
from django.urls import reverse
# from troggle.core.models.survex import SurvexBlock
# from troggle.core.models.troggle import DataIssue # circular import. Hmm
YEAR_RANGE = (1975, 2050)
def make_valid_date(date):
"""Take whatever garbage some fool has typed in and try to make it into a valid ISO-format date
"""
datestr = date.replace(".", "-")
try:
samedate = datetime.date.fromisoformat(datestr)
return samedate
except ValueError:
# Could be in std euro format e.g. 14/07/2023
match = re.search(r'(\d{1,2})/(\d{1,2})/(\d{2,4})', datestr)
if match:
d = int(match.group(1))
m = int(match.group(2))
y = int(match.group(3))
if y<2000:
y = y + 2000
try:
samedate = datetime.date(y, m, d)
print(f"- - Warning, not in ISO format. '{datestr=}' but we coped: {samedate.isoformat()} ")
return samedate
except:
print(f"! - Fail, tried to decompose date in dd/mm/yyyy format but failed: {datestr=} ")
return None
# probably a single digit day number or month number
match = re.search(r'(\d{4})-(\d{1,2})-(\d{1,2})', datestr)
if match:
y = int(match.group(1))
m = int(match.group(2))
d = int(match.group(3))
try:
samedate = datetime.date(y, m, d)
print(f"- - Warning, 1 digit only for month or day '{datestr=}' but we coped: {samedate.isoformat()} ")
return samedate
except:
print(f"! - Fail, tried to decompose date in yyyy-mm-d or yyy-m-dd format but failed: {datestr=} ")
return None
print(f"! - Failed to understand date, none of our tricks worked {datestr=} ")
return None
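# Illustrative sketch (assumed inputs, not part of the original file):
#
#     make_valid_date("2023-07-14")    # -> datetime.date(2023, 7, 14), already ISO
#     make_valid_date("14/07/2023")    # -> datetime.date(2023, 7, 14), euro format, warns
#     make_valid_date("2023.7.4")      # -> datetime.date(2023, 7, 4), dots + 1-digit fields, warns
#     make_valid_date("next Tuesday")  # -> None, prints a failure message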
class Wallet(models.Model):
"""We do not keep the JSON values in the database, we query them afresh each time,
but we will change this when we need to do a Django query on e.g. personame
"""
fpath = models.CharField(max_length=200)
walletname = models.CharField(max_length=200)
walletdate = models.DateField(blank=True, null=True)
walletyear = models.DateField(blank=True, null=True)
class Meta:
ordering = ("walletname",)
def get_absolute_url(self):
return urljoin(settings.URL_ROOT, reverse("singlewallet", kwargs={"path": re.sub("#", "%23", self.walletname)}))
def get_json(self):
"""Read the JSON file for the wallet and do stuff
Do it every time it is queried, to be sure the result is fresh
import DataIssue locally to prevent import cycle problem"""
# jsonfile = Path(self.fpath, 'contents.json')
# Get from git repo instead
# :drawings: walletjson/2022/2022#01/contents.json
# fpath = /mnt/d/EXPO/expofiles/surveyscans/1999/1999#02
fp = Path(self.fpath)
wname = fp.name
wyear = fp.parent.name
wurl = f"/walletedit/{self.walletname}".replace('#', ':')
if len(wyear) != 4 or len(wname) !=6:
# no contents.json for old-style wallets
# but this ruined all the tick-list displays.. why?!
# return None
pass
jsonfile = Path(settings.DRAWINGS_DATA, "walletjson") / wyear / wname / "contents.json"
if not Path(jsonfile).is_file():
message = f"! {jsonfile} is not a file {wyear=} {wname=} "
from troggle.core.models.troggle import DataIssue
print(message)
DataIssue.objects.update_or_create(parser="wallets", message=message, url=wurl)
return None
else:
with open(jsonfile) as json_f:
try:
waldata = json.load(json_f)
except:
message = f"! {str(self.walletname)} Failed to load {jsonfile} JSON file"
print(message)
DataIssue.objects.update_or_create(parser="wallets", message=message, url=wurl)
return None
if waldata["date"]:
thisdate = make_valid_date(waldata["date"])
if thisdate:
self.walletdate = thisdate
self.save()
waldata["date"] = thisdate.isoformat()
else:
message = f"! {str(self.walletname)} Date format not ISO {waldata['date']}. Failed to load from {jsonfile} JSON file"
from troggle.core.models.troggle import DataIssue
DataIssue.objects.update_or_create(parser="wallets", message=message, url=wurl)
return waldata
def year(self):
"""This gets the year syntactically without opening and reading the JSON"""
if len(self.walletname) < 5:
return None
if self.walletname[4] != "#":
return None
year = int(self.walletname[0:4])
ymin, ymax = YEAR_RANGE
if year < ymin or year > ymax:
return None
else:
self.walletyear = datetime.date(year, 1, 1)
self.save()
return str(year)
# Yes this is horribly, horribly inefficient, esp. for a page that has date, people and cave in it
def date(self):
"""Reads all the JSON data just to get the JSON date."""
if self.walletdate:
return self.walletdate
if not (jsondata := self.get_json()): # WALRUS
return None
datestr = jsondata["date"]
if not datestr:
return None
else:
datestr = datestr.replace(".", "-")
try:
samedate = datetime.date.fromisoformat(datestr)
self.walletdate = samedate.isoformat()
except:
try:
samedate = datetime.date.fromisoformat(datestr[:10])
self.walletdate = samedate.isoformat()
except:
samedate = None
self.save()
return self.walletdate
def people(self):
if not self.get_json():
return None
jsondata = self.get_json()
return jsondata["people"]
def cave(self):
if not self.get_json():
return None
jsondata = self.get_json()
return jsondata["cave"]
def name(self):
if not self.get_json():
return None
jsondata = self.get_json()
return jsondata["name"]
def get_fnames(self):
'''Filenames without the suffix, i.e. without the ".jpg"'''
dirpath = Path(settings.SCANS_ROOT, self.fpath) # does nowt as fpath is a rooted path already
files = []
if not self.fpath:
files.append(f"Incorrect path to wallet contents: '{self.fpath}'")
return files
if not dirpath.is_dir():
files.append(f"Incorrect path to wallet contents: '{self.fpath}'")
return files
else:
try:
for f in dirpath.iterdir():
if f.is_file():
files.append(Path(f.name).stem)
else:
files.append(f"-{Path(f.name).stem}-")
except FileNotFoundError:
files.append("FileNotFoundError")
pass
return files
def fixsurvextick(self, tick):
blocks = self.survexblock_set.all()
# blocks = SurvexBlock.objects.filter(scanswallet = self)
result = tick
for b in blocks:
if b.survexfile: # if any exist in db, no check for validity or a real file. Refactor.
result = "seagreen" # slightly different shade of green
return result
def get_ticks(self):
"""Reads all the JSON data and sets the colour of the completion tick for each condition"""
ticks = {}
waldata = self.get_json()
if not waldata:
ticks["S"] = "darkgrey"
ticks["C"] = "darkgrey"
ticks["Q"] = "darkgrey"
ticks["N"] = "darkgrey"
ticks["P"] = "darkgrey"
ticks["E"] = "darkgrey"
ticks["T"] = "darkgrey"
ticks["W"] = "darkgrey"
return ticks
ticks = {}
# Initially, are there any required survex files present ?
# Note that we can't set the survexblock here on the wallet as that info is only available while parsing the survex file
survexok = "red"
ticks["S"] = "red"
if waldata["survex not required"]:
survexok = "green"
ticks["S"] = "green"
else:
if waldata["survex file"]:
if not type(waldata["survex file"]) == list: # a string also is a sequence type, so do it this way
waldata["survex file"] = [waldata["survex file"]]
ngood = 0
nbad = 0
ticks["S"] = "purple"
for sx in waldata["survex file"]:
# this logic appears in several places (inc. uploads.py). Refactor.
if sx != "":
if Path(sx).suffix.lower() != ".svx":
sx = sx + ".svx"
if (Path(settings.SURVEX_DATA) / sx).is_file():
ngood += 1
else:
nbad += 1
if nbad == 0 and ngood >= 1: # all valid
ticks["S"] = "green"
elif nbad >= 1 and ngood >= 1: # some valid, some invalid
ticks["S"] = "orange"
elif nbad >= 1 and ngood == 0: # all bad
ticks["S"] = "red"
elif nbad == 0 and ngood == 0: # list of blank strings
ticks["S"] = "red"
else:
ticks["S"] = "fuchsia" # have fun working out what this means
# Cave Description
if waldata["description written"]:
ticks["C"] = "green"
else:
ticks["C"] = survexok
# QMs
if waldata["qms written"]:
ticks["Q"] = "green"
else:
ticks["Q"] = survexok
if not self.year():
ticks["Q"] = "darkgrey"
else:
if int(self.year()) < 2015:
ticks["Q"] = "lightgrey"
if 'notes not required' not in waldata:
waldata['notes not required'] = False
# Notes, Plan, Elevation
files = self.get_fnames()
# Notes required
notes_scanned = reduce(operator.or_, [f.startswith("note") for f in files], False)
notes_scanned = reduce(operator.or_, [f.endswith("notes") for f in files], notes_scanned)
notes_required = not (notes_scanned or waldata["notes not required"])
if notes_required:
ticks["N"] = "red"
else:
ticks["N"] = "green"
# print(f"{self.walletname} {ticks['N'].upper()} {notes_scanned=} {notes_required=} {waldata['notes not required']=}")
# Plan drawing required
plan_scanned = reduce(operator.or_, [f.startswith("plan") for f in files], False)
plan_scanned = reduce(operator.or_, [f.endswith("plan") for f in files], plan_scanned)
plan_drawing_required = not (plan_scanned or waldata["plan drawn"] or waldata["plan not required"])
if plan_drawing_required:
ticks["P"] = "red"
else:
ticks["P"] = "green"
# Elev drawing required
elev_scanned = reduce(operator.or_, [f.startswith("elev") for f in files], False)
elev_scanned = reduce(operator.or_, [f.endswith("elev") for f in files], elev_scanned)
elev_scanned = reduce(operator.or_, [f.endswith("elevation") for f in files], elev_scanned)
elev_drawing_required = not (elev_scanned or waldata["elev drawn"] or waldata["elev not required"])
if elev_drawing_required:
ticks["E"] = "red"
else:
ticks["E"] = "green"
# if electronic, don't require P or E
if waldata["electronic survey"]:
# ticks["N"] = "green"
ticks["P"] = "green"
ticks["E"] = "green"
# ticks["T"] = "green" # No, this does not mean it has been 'tunneled' properly
# Tunnel / Therion
if elev_drawing_required or plan_drawing_required:
ticks["T"] = "red"
else:
ticks["T"] = "green"
# Website
if waldata["website updated"]:
ticks["W"] = "green"
else:
ticks["W"] = "red"
return ticks
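# Illustrative sketch (assumed wallet state, not part of the original file): the wallet
# tick-list template colours one box per key, e.g. a fully finished wallet would give
#     {"S": "green", "C": "green", "Q": "green", "N": "green",
#      "P": "green", "E": "green", "T": "green", "W": "green"}
# while a wallet with no readable contents.json gets "darkgrey" for every key, as above.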
def __str__(self):
return "[" + str(self.walletname) + " (Wallet)]"


@@ -1,83 +0,0 @@
from django.db import models
from django.conf import settings
import troggle.core.methods_millenial as methods_millenial
#
# This file was created in 2019
# It's a result of massive frustration with cluttered database of troggle
# Maximal clarity of code was primary goal (previous code had very little comments)
# Maximal speed of database rebuild was secondary goal
#
#
# The following file will tell you what fields and methods are available inside this database
# be careful, you might miss some! ManyToMany fields can be used from the far end as well
#
#
# Naming conventions:
# (Upper/lower convention)
# Class names are written Udddd_ddd_dddM - they finish with M for backwards compatibility
# Fields/methods are written lower_lower_lower
#
class PersonM(models.Model): #instance of this class corresponds to one physical person
name = models.CharField(max_length=100) #just name, talk to wookey if you disagree
surveys_made = models.ManyToManyField('SurveyM', related_name='people_surveyed') #links to survey objects that this person made (made=:survex says so)
expos_attended = models.ManyToManyField('ExpeditionM', related_name='people_attended') #expos attended by this person (attended=:folk.csv says so)
logbook_entries_written = models.ManyToManyField('Logbook_entryM', related_name='people_wrote') #links to logbook chunks created by a person
class CaveM(models.Model): #instance of this class corresponds to one 'thing' that people call cave
entrance = models.CharField(max_length=100) #UTM string describing ONE(!) entrance. Purpose = findability
title = models.TextField() #title given to the topmost survey in survex, numeric name otherwise c.f. name (e.g. 'Fishface')
name = models.TextField() #name given to the topmost survey in survex (e.g. '2017-cucc-28')
surveys = models.ManyToManyField('SurveyM', related_name='cave_parent') #links to surveys objects that this cave contains
survex_file = models.TextField() #gives path to top level survex file
total_length = models.FloatField() #holds total length of this cave (as given by cavern)
total_depth = models.FloatField() #holds total depth of this cave (as given by cavern)
description = models.TextField() #holds link to description
date = models.TextField() #holds date of last visit
def top_camp_distance(self): #returns distance of this cave from topcamp
return methods_millenial.top_camp_distance(self.entrance)
def top_camp_bearing(self): #returns bearing to this cave from topcamp in format 235.5 (float north-based azimuth)
return methods_millenial.top_camp_bearing(self.entrance)
def top_camp_bearing_letter(self): #returns bearing to this cave from topcamp in format e.g. 'NE'
return methods_millenial.top_camp_bearing_letter(self.entrance)
def lat_lon_entrance(self): #lat_lon entrance location
return methods_millenial.lat_lon_entrance(self.entrance)
class Cave_descriptionM(models.Model): #instance of this class corresponds to each of the .html files in descriptions
#each of those holds one XML field
slug = models.TextField()
explorers = models.TextField()
underground_description = models.TextField()
equipment = models.TextField()
references = models.TextField()
survey = models.TextField()
kataster_status = models.TextField()
underground_centre_line = models.TextField()
survex_file = models.TextField() #as given in .html file
notes = models.TextField()
class ExpeditionM(models.Model): #instance of this class corresponds to one expo (usually one year)
date = models.CharField(max_length=100) #date in format YYYY.MM.DD-YYYY.MM.DD
class SurveyM(models.Model): #instance of this class corresponds to one .svx file - one trip
date = models.CharField(max_length=100) #date of the trip in format YYYY.MM.DD (dated:=date given by .svx file)
survex_file = models.TextField()
class Logbook_entryM(models.Model): #instance of this class corresponds to one bit of logbook (c.f. expo.survex.com/years/2015/logbook.html or similar)
date = models.CharField(max_length=100) #date as typed into logbook
contents = models.TextField() #contents of the logbook chunk
class Parser_messageM(models.Model): #instance of this class contains one error or warning message produced by any of the parsers
parsername = models.CharField(max_length = 20) #name of parser
content = models.TextField() #content of message
message_type = models.CharField(max_length = 10) # [Error,Info] or similar


@@ -1,862 +0,0 @@
import urllib, urlparse, string, os, datetime, logging, re
import subprocess
from django.forms import ModelForm
from django.db import models
from django.contrib import admin
from django.core.files.storage import FileSystemStorage
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.db.models import Min, Max
from django.conf import settings
from decimal import Decimal, getcontext
from django.core.urlresolvers import reverse
from imagekit.models import ImageModel
from django.template import Context, loader
import settings
getcontext().prec=2 #use 2 significant figures for decimal calculations
def get_related_by_wikilinks(wiki_text):
found=re.findall(settings.QM_PATTERN,wiki_text)
res=[]
for wikilink in found:
qmdict={'urlroot':settings.URL_ROOT,'cave':wikilink[2],'year':wikilink[1],'number':wikilink[3]}
try:
cave_slugs = CaveSlug.objects.filter(cave__kataster_number = qmdict['cave'])
qm=QM.objects.get(found_by__cave_slug__in = cave_slugs,
found_by__date__year = qmdict['year'],
number = qmdict['number'])
res.append(qm)
except QM.DoesNotExist:
print('fail on '+str(wikilink))
return res
try:
logging.basicConfig(level=logging.DEBUG,
filename=settings.LOGFILE,
filemode='w')
except:
subprocess.call(settings.FIX_PERMISSIONS)
logging.basicConfig(level=logging.DEBUG,
filename=settings.LOGFILE,
filemode='w')
#This class is for adding fields and methods which all of our models will have.
class TroggleModel(models.Model):
new_since_parsing = models.BooleanField(default=False, editable=False)
non_public = models.BooleanField(default=False)
def object_name(self):
return self._meta.object_name
def get_admin_url(self):
return urlparse.urljoin(settings.URL_ROOT, "/admin/core/" + self.object_name().lower() + "/" + str(self.pk))
class Meta:
abstract = True
class TroggleImageModel(ImageModel):
new_since_parsing = models.BooleanField(default=False, editable=False)
def object_name(self):
return self._meta.object_name
def get_admin_url(self):
return urlparse.urljoin(settings.URL_ROOT, "/admin/core/" + self.object_name().lower() + "/" + str(self.pk))
class Meta:
abstract = True
#
# single Expedition, usually seen by year
#
class Expedition(TroggleModel):
year = models.CharField(max_length=20, unique=True)
name = models.CharField(max_length=100)
def __unicode__(self):
return self.year
class Meta:
ordering = ('-year',)
get_latest_by = 'year'
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('expedition', args=[self.year]))
# construction function. should be moved out
def get_expedition_day(self, date):
expeditiondays = self.expeditionday_set.filter(date=date)
if expeditiondays:
assert len(expeditiondays) == 1
return expeditiondays[0]
res = ExpeditionDay(expedition=self, date=date)
res.save()
return res
def day_min(self):
res = self.expeditionday_set.all()
return res and res[0] or None
def day_max(self):
res = self.expeditionday_set.all()
return res and res[len(res) - 1] or None
class ExpeditionDay(TroggleModel):
expedition = models.ForeignKey("Expedition")
date = models.DateField()
class Meta:
ordering = ('date',)
def GetPersonTrip(self, personexpedition):
personexpeditions = self.persontrip_set.filter(expeditionday=self)
return personexpeditions and personexpeditions[0] or None
#
# single Person, can go on many years
#
class Person(TroggleModel):
first_name = models.CharField(max_length=100)
last_name = models.CharField(max_length=100)
is_vfho = models.BooleanField(help_text="VFHO is the Vereines f&uuml;r H&ouml;hlenkunde in Obersteier, a nearby Austrian caving club.", default=False)
mug_shot = models.CharField(max_length=100, blank=True,null=True)
blurb = models.TextField(blank=True,null=True)
#href = models.CharField(max_length=200)
orderref = models.CharField(max_length=200) # for alphabetic
#the fields below have been removed and made into methods. I'm not sure what the b in bisnotable stands for. - AC 16 Feb
#notability = models.FloatField() # for listing the top 20 people
#bisnotable = models.BooleanField(default=False)
user = models.OneToOneField(User, null=True, blank=True)
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT,reverse('person',kwargs={'first_name':self.first_name,'last_name':self.last_name}))
class Meta:
verbose_name_plural = "People"
ordering = ('orderref',) # "Wookey" makes too complex for: ('last_name', 'first_name')
def __unicode__(self):
if self.last_name:
return "%s %s" % (self.first_name, self.last_name)
return self.first_name
def notability(self):
notability = Decimal(0)
for personexpedition in self.personexpedition_set.all():
if not personexpedition.is_guest:
notability += Decimal(1) / (2012 - int(personexpedition.expedition.year))
return notability
def bisnotable(self):
return self.notability() > Decimal(1)/Decimal(3)
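# Worked example (illustrative): a non-guest on the 2009 and 2010 expeditions scores
# 1/(2012-2009) + 1/(2012-2010) = 0.33 + 0.5 = 0.83 (Decimal precision is 2, set above),
# which is above the 1/3 threshold in bisnotable(), so the person counts as notable.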
def surveyedleglength(self):
return sum([personexpedition.surveyedleglength() for personexpedition in self.personexpedition_set.all()])
def first(self):
return self.personexpedition_set.order_by('-expedition')[0]
def last(self):
return self.personexpedition_set.order_by('expedition')[0]
#def Sethref(self):
#if self.last_name:
#self.href = self.first_name.lower() + "_" + self.last_name.lower()
#self.orderref = self.last_name + " " + self.first_name
#else:
# self.href = self.first_name.lower()
#self.orderref = self.first_name
#self.notability = 0.0 # set temporarily
#
# Person's attendance at one Expo
#
class PersonExpedition(TroggleModel):
expedition = models.ForeignKey(Expedition)
person = models.ForeignKey(Person)
slugfield = models.SlugField(max_length=50,blank=True,null=True)
is_guest = models.BooleanField(default=False)
COMMITTEE_CHOICES = (
('leader','Expo leader'),
('medical','Expo medical officer'),
('treasurer','Expo treasurer'),
('sponsorship','Expo sponsorship coordinator'),
('research','Expo research coordinator'),
)
expo_committee_position = models.CharField(blank=True,null=True,choices=COMMITTEE_CHOICES,max_length=200)
nickname = models.CharField(max_length=100,blank=True,null=True)
def GetPersonroles(self):
res = [ ]
for personrole in self.personrole_set.order_by('survexblock'):
if res and res[-1]['survexpath'] == personrole.survexblock.survexpath:
res[-1]['roles'] += ", " + str(personrole.role)
else:
res.append({'date':personrole.survexblock.date, 'survexpath':personrole.survexblock.survexpath, 'roles':str(personrole.role)})
return res
class Meta:
ordering = ('-expedition',)
#order_with_respect_to = 'expedition'
def __unicode__(self):
return "%s: (%s)" % (self.person, self.expedition)
#why is the below a function in personexpedition, rather than in person? - AC 14 Feb 09
def name(self):
if self.nickname:
return "%s (%s) %s" % (self.person.first_name, self.nickname, self.person.last_name)
if self.person.last_name:
return "%s %s" % (self.person.first_name, self.person.last_name)
return self.person.first_name
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('personexpedition',kwargs={'first_name':self.person.first_name,'last_name':self.person.last_name,'year':self.expedition.year}))
def surveyedleglength(self):
survexblocks = [personrole.survexblock for personrole in self.personrole_set.all() ]
return sum([survexblock.totalleglength for survexblock in set(survexblocks)])
# would prefer to return actual person trips so we could link to first and last ones
def day_min(self):
res = self.persontrip_set.aggregate(day_min=Min("expeditionday__date"))
return res["day_min"]
def day_max(self):
res = self.persontrip_set.all().aggregate(day_max=Max("expeditionday__date"))
return res["day_max"]
#
# Single parsed entry from Logbook
#
class LogbookEntry(TroggleModel):
date = models.DateField()#MJG wants to turn this into a datetime such that multiple Logbook entries on the same day can be ordered.
expeditionday = models.ForeignKey("ExpeditionDay", null=True)#MJG wants to KILL THIS (redundant information)
expedition = models.ForeignKey(Expedition,blank=True,null=True) # yes this is double-
#author = models.ForeignKey(PersonExpedition,blank=True,null=True) # the person who writes it up doesn't have to have been on the trip.
# Re the above: this field should be "typist" or something, not "author". - AC 15 jun 09
#MJG wants to KILL THIS, as it is typically redundant with PersonTrip.is_logbook_entry_author; in the rare case where it was not redundant and was actually of interest, it could be added to the text.
title = models.CharField(max_length=settings.MAX_LOGBOOK_ENTRY_TITLE_LENGTH)
cave_slug = models.SlugField(max_length=50)
place = models.CharField(max_length=100,blank=True,null=True,help_text="Only use this if you haven't chosen a cave")
text = models.TextField()
slug = models.SlugField(max_length=50)
filename = models.CharField(max_length=200,null=True)
class Meta:
verbose_name_plural = "Logbook Entries"
# several PersonTrips point in to this object
ordering = ('-date',)
def __getattribute__(self, item):
if item == "cave": #Allow a logbookentries cave to be directly accessed despite not having a proper foreignkey
return CaveSlug.objects.get(slug = self.cave_slug).cave
return super(LogbookEntry, self).__getattribute__(item)
def __init__(self, *args, **kwargs):
if "cave" in kwargs.keys():
if kwargs["cave"] is not None:
kwargs["cave_slug"] = CaveSlug.objects.get(cave=kwargs["cave"], primary=True).slug
kwargs.pop("cave")
return super(LogbookEntry, self).__init__(*args, **kwargs)
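# Usage sketch (assumption, illustrative values): passing a Cave instance is translated
# into its primary slug before the normal model __init__ runs, e.g.
#   LogbookEntry(date=d, expedition=expo, title="Rigging trip", text="...",
#                slug="rigging-trip", cave=some_cave)
# stores CaveSlug.objects.get(cave=some_cave, primary=True).slug in cave_slug.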
def isLogbookEntry(self): # Function used in templates
return True
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('logbookentry',kwargs={'date':self.date,'slug':self.slug}))
def __unicode__(self):
return "%s: (%s)" % (self.date, self.title)
def get_next_by_id(self):
return LogbookEntry.objects.get(id=self.id+1)
def get_previous_by_id(self):
return LogbookEntry.objects.get(id=self.id-1)
def new_QM_number(self):
"""Returns """
if self.cave:
nextQMnumber=self.cave.new_QM_number(self.date.year)
else:
return none
return nextQMnumber
def new_QM_found_link(self):
"""Produces a link to a new QM with the next number filled in and this LogbookEntry set as 'found by' """
return settings.URL_ROOT + r'/admin/core/qm/add/?' + r'found_by=' + str(self.pk) +'&number=' + str(self.new_QM_number())
def DayIndex(self):
return list(self.expeditionday.logbookentry_set.all()).index(self)
#
# Single Person going on a trip, which may or may not be written up (accounts for different T/U for people in same logbook entry)
#
class PersonTrip(TroggleModel):
personexpedition = models.ForeignKey("PersonExpedition",null=True)
#expeditionday = models.ForeignKey("ExpeditionDay")#MJG wants to KILL THIS (redundant information)
#date = models.DateField() #MJG wants to KILL THIS (redundant information)
time_underground = models.FloatField(help_text="In decimal hours")
logbook_entry = models.ForeignKey(LogbookEntry)
is_logbook_entry_author = models.BooleanField(default=False)
# sequencing by person (difficult to solve locally)
#persontrip_next = models.ForeignKey('PersonTrip', related_name='pnext', blank=True,null=True)#MJG wants to KILL THIS (and use function persontrip_next_auto)
#persontrip_prev = models.ForeignKey('PersonTrip', related_name='pprev', blank=True,null=True)#MJG wants to KILL THIS (and use function persontrip_prev_auto)
def persontrip_next(self):
futurePTs = PersonTrip.objects.filter(personexpedition = self.personexpedition, logbook_entry__date__gt = self.logbook_entry.date).order_by('logbook_entry__date').all()
if len(futurePTs) > 0:
return futurePTs[0]
else:
return None
def persontrip_prev(self):
pastPTs = PersonTrip.objects.filter(personexpedition = self.personexpedition, logbook_entry__date__lt = self.logbook_entry.date).order_by('-logbook_entry__date').all()
if len(pastPTs) > 0:
return pastPTs[0]
else:
return None
def place(self):
return self.logbook_entry.cave and self.logbook_entry.cave or self.logbook_entry.place
def __unicode__(self):
return "%s (%s)" % (self.personexpedition, self.logbook_entry.date)
##########################################
# move following classes into models_cave
##########################################
class Area(TroggleModel):
short_name = models.CharField(max_length=100)
name = models.CharField(max_length=200, blank=True, null=True)
description = models.TextField(blank=True,null=True)
parent = models.ForeignKey('Area', blank=True, null=True)
def __unicode__(self):
if self.parent:
return unicode(self.parent) + u" - " + unicode(self.short_name)
else:
return unicode(self.short_name)
def kat_area(self):
if self.short_name in ["1623", "1626"]:
return self.short_name
elif self.parent:
return self.parent.kat_area()
class CaveAndEntrance(models.Model):
cave = models.ForeignKey('Cave')
entrance = models.ForeignKey('Entrance')
entrance_letter = models.CharField(max_length=20,blank=True,null=True)
def __unicode__(self):
return unicode(self.cave) + unicode(self.entrance_letter)
class CaveSlug(models.Model):
cave = models.ForeignKey('Cave')
slug = models.SlugField(max_length=50, unique = True)
primary = models.BooleanField(default=False)
class Cave(TroggleModel):
# too much here perhaps,
official_name = models.CharField(max_length=160)
area = models.ManyToManyField(Area, blank=True, null=True)
kataster_code = models.CharField(max_length=20,blank=True,null=True)
kataster_number = models.CharField(max_length=10,blank=True, null=True)
unofficial_number = models.CharField(max_length=60,blank=True, null=True)
entrances = models.ManyToManyField('Entrance', through='CaveAndEntrance')
explorers = models.TextField(blank=True,null=True)
underground_description = models.TextField(blank=True,null=True)
equipment = models.TextField(blank=True,null=True)
references = models.TextField(blank=True,null=True)
survey = models.TextField(blank=True,null=True)
kataster_status = models.TextField(blank=True,null=True)
underground_centre_line = models.TextField(blank=True,null=True)
notes = models.TextField(blank=True,null=True)
length = models.CharField(max_length=100,blank=True,null=True)
depth = models.CharField(max_length=100,blank=True,null=True)
extent = models.CharField(max_length=100,blank=True,null=True)
survex_file = models.CharField(max_length=100,blank=True,null=True)
description_file = models.CharField(max_length=200,blank=True,null=True)
url = models.CharField(max_length=200,blank=True,null=True)
filename = models.CharField(max_length=200)
#class Meta:
# unique_together = (("area", "kataster_number"), ("area", "unofficial_number"))
# FIXME Kataster Areas and CUCC-defined sub areas need separating
#href = models.CharField(max_length=100)
class Meta:
ordering = ('kataster_code', 'unofficial_number')
def hassurvey(self):
if not self.underground_centre_line:
return "No"
if (self.survey.find("<img") > -1 or self.survey.find("<a") > -1 or self.survey.find("<IMG") > -1 or self.survey.find("<A") > -1):
return "Yes"
return "Missing"
def hassurveydata(self):
if not self.underground_centre_line:
return "No"
if self.survex_file:
return "Yes"
return "Missing"
def slug(self):
primarySlugs = self.caveslug_set.filter(primary = True)
if primarySlugs:
return primarySlugs[0].slug
else:
slugs = self.caveslug_set.filter()
if slugs:
return slugs[0].slug
def ours(self):
return bool(re.search(r'CUCC', self.explorers))
def reference(self):
if self.kataster_number:
return "%s-%s" % (self.kat_area(), self.kataster_number)
else:
return "%s-%s" % (self.kat_area(), self.unofficial_number)
def get_absolute_url(self):
if self.kataster_number:
href = self.kataster_number
elif self.unofficial_number:
href = self.unofficial_number
else:
href = self.official_name.lower()
#return settings.URL_ROOT + '/cave/' + href + '/'
return urlparse.urljoin(settings.URL_ROOT, reverse('cave',kwargs={'cave_id':href,}))
def __unicode__(self, sep = u": "):
return unicode(self.slug())
def get_QMs(self):
return QM.objects.filter(found_by__cave_slug__in=self.caveslug_set.all())
def new_QM_number(self, year=datetime.date.today().year):
"""Given a cave and the current year, returns the next QM number."""
try:
res=QM.objects.filter(found_by__date__year=year, found_by__cave=self).order_by('-number')[0]
except IndexError:
return 1
return res.number+1
def kat_area(self):
for a in self.area.all():
if a.kat_area():
return a.kat_area()
def entrances(self):
return CaveAndEntrance.objects.filter(cave=self)
def singleentrance(self):
return len(CaveAndEntrance.objects.filter(cave=self)) == 1
def entrancelist(self):
rs = []
res = ""
for e in CaveAndEntrance.objects.filter(cave=self):
rs.append(e.entrance_letter)
rs.sort()
prevR = None
n = 0
for r in rs:
if prevR:
if chr(ord(prevR) + 1 ) == r:
prevR = r
n += 1
else:
if n == 0:
res += ", " + prevR
else:
res += "&ndash;" + prevR
else:
prevR = r
n = 0
res += r
if n == 0:
res += ", " + prevR
else:
res += "&ndash;" + prevR
return res
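# Worked example (illustrative): with entrance letters "a", "b", "c" the run of
# consecutive letters is collapsed into a range, giving "a&ndash;c", i.e. "a" to "c".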
def writeDataFile(self):
try:
f = open(os.path.join(settings.CAVEDESCRIPTIONS, self.filename), "w")
except:
subprocess.call(settings.FIX_PERMISSIONS)
f = open(os.path.join(settings.CAVEDESCRIPTIONS, self.filename), "w")
t = loader.get_template('dataformat/cave.xml')
c = Context({'cave': self})
u = t.render(c)
u8 = u.encode("utf-8")
f.write(u8)
f.close()
def getArea(self):
areas = self.area.all()
lowestareas = list(areas)
for area in areas:
if area.parent in areas:
try:
lowestareas.remove(area.parent)
except:
pass
return lowestareas[0]
def getCaveByReference(reference):
areaname, code = reference.split("-", 1)
print(areaname, code)
area = Area.objects.get(short_name = areaname)
print(area)
foundCaves = list(Cave.objects.filter(area = area, kataster_number = code).all()) + list(Cave.objects.filter(area = area, unofficial_number = code).all())
print(list(foundCaves))
assert len(foundCaves) == 1
return foundCaves[0]
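# Usage sketch (assumption): the reference is "<kat_area>-<number>", matching the strings
# built by Cave.reference() above, e.g.
#   getCaveByReference("1623-204")
# looks the code up against both kataster_number and unofficial_number and asserts that
# exactly one cave matches.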
class OtherCaveName(TroggleModel):
name = models.CharField(max_length=160)
cave = models.ForeignKey(Cave)
def __unicode__(self):
return unicode(self.name)
class EntranceSlug(models.Model):
entrance = models.ForeignKey('Entrance')
slug = models.SlugField(max_length=50, unique = True)
primary = models.BooleanField(default=False)
class Entrance(TroggleModel):
name = models.CharField(max_length=100, blank=True,null=True)
entrance_description = models.TextField(blank=True,null=True)
explorers = models.TextField(blank=True,null=True)
map_description = models.TextField(blank=True,null=True)
location_description = models.TextField(blank=True,null=True)
approach = models.TextField(blank=True,null=True)
underground_description = models.TextField(blank=True,null=True)
photo = models.TextField(blank=True,null=True)
MARKING_CHOICES = (
('P', 'Paint'),
('P?', 'Paint (?)'),
('T', 'Tag'),
('T?', 'Tag (?)'),
('R', 'Needs Retag'),
('S', 'Spit'),
('S?', 'Spit (?)'),
('U', 'Unmarked'),
('?', 'Unknown'))
marking = models.CharField(max_length=2, choices=MARKING_CHOICES)
marking_comment = models.TextField(blank=True,null=True)
FINDABLE_CHOICES = (
('?', 'To be confirmed ...'),
('S', 'Coordinates'),
('L', 'Lost'),
('R', 'Refindable'))
findability = models.CharField(max_length=1, choices=FINDABLE_CHOICES, blank=True, null=True)
findability_description = models.TextField(blank=True,null=True)
alt = models.TextField(blank=True, null=True)
northing = models.TextField(blank=True, null=True)
easting = models.TextField(blank=True, null=True)
tag_station = models.TextField(blank=True, null=True)
exact_station = models.TextField(blank=True, null=True)
other_station = models.TextField(blank=True, null=True)
other_description = models.TextField(blank=True,null=True)
bearings = models.TextField(blank=True,null=True)
url = models.CharField(max_length=200,blank=True,null=True)
filename = models.CharField(max_length=200)
cached_primary_slug = models.CharField(max_length=200,blank=True,null=True)
def __unicode__(self):
return unicode(self.slug())
def exact_location(self):
return SurvexStation.objects.lookup(self.exact_station)
def other_location(self):
return SurvexStation.objects.lookup(self.other_station)
def find_location(self):
r = {'': 'To be entered ',
'?': 'To be confirmed:',
'S': '',
'L': 'Lost:',
'R': 'Refindable:'}[self.findability]
if self.tag_station:
try:
s = SurvexStation.objects.lookup(self.tag_station)
return r + "%0.0fE %0.0fN %0.0fAlt" % (s.x, s.y, s.z)
except:
return r + "%s Tag Station not in dataset" % self.tag_station
if self.exact_station:
try:
s = SurvexStation.objects.lookup(self.exact_station)
return r + "%0.0fE %0.0fN %0.0fAlt" % (s.x, s.y, s.z)
except:
return r + "%s Exact Station not in dataset" % self.tag_station
if self.other_station:
try:
s = SurvexStation.objects.lookup(self.other_station)
return r + "%0.0fE %0.0fN %0.0fAlt %s" % (s.x, s.y, s.z, self.other_description)
except:
return r + "%s Other Station not in dataset" % self.tag_station
if self.findability == "S":
r += "ERROR, Entrance has been surveyed but has no survex point"
if self.bearings:
return r + self.bearings
return r
def best_station(self):
if self.tag_station:
return self.tag_station
if self.exact_station:
return self.exact_station
if self.other_station:
return self.other_station
def has_photo(self):
if self.photo:
if (self.photo.find("<img") > -1 or self.photo.find("<a") > -1 or self.photo.find("<IMG") > -1 or self.photo.find("<A") > -1):
return "Yes"
else:
return "Missing"
else:
return "No"
def marking_val(self):
for m in self.MARKING_CHOICES:
if m[0] == self.marking:
return m[1]
def findability_val(self):
for f in self.FINDABLE_CHOICES:
if f[0] == self.findability:
return f[1]
def tag(self):
return SurvexStation.objects.lookup(self.tag_station)
def needs_surface_work(self):
return self.findability != "S" or self.has_photo() != "Yes" or self.marking != "T"
def get_absolute_url(self):
ancestor_titles='/'.join([subcave.title for subcave in self.get_ancestors()])
if ancestor_titles:
res = '/'.join((self.get_root().cave.get_absolute_url(), ancestor_titles, self.title))
else:
res = '/'.join((self.get_root().cave.get_absolute_url(), self.title))
return res
def slug(self):
if not self.cached_primary_slug:
primarySlugs = self.entranceslug_set.filter(primary = True)
if primarySlugs:
self.cached_primary_slug = primarySlugs[0].slug
self.save()
else:
slugs = self.entranceslug_set.filter()
if slugs:
self.cached_primary_slug = slugs[0].slug
self.save()
return self.cached_primary_slug
def writeDataFile(self):
try:
f = open(os.path.join(settings.ENTRANCEDESCRIPTIONS, self.filename), "w")
except:
subprocess.call(settings.FIX_PERMISSIONS)
f = open(os.path.join(settings.ENTRANCEDESCRIPTIONS, self.filename), "w")
t = loader.get_template('dataformat/entrance.xml')
c = Context({'entrance': self})
u = t.render(c)
u8 = u.encode("utf-8")
f.write(u8)
f.close()
class CaveDescription(TroggleModel):
short_name = models.CharField(max_length=50, unique = True)
long_name = models.CharField(max_length=200, blank=True, null=True)
description = models.TextField(blank=True,null=True)
linked_subcaves = models.ManyToManyField("NewSubCave", blank=True,null=True)
linked_entrances = models.ManyToManyField("Entrance", blank=True,null=True)
linked_qms = models.ManyToManyField("QM", blank=True,null=True)
def __unicode__(self):
if self.long_name:
return unicode(self.long_name)
else:
return unicode(self.short_name)
def get_absolute_url(self):
return urlparse.urljoin(settings.URL_ROOT, reverse('cavedescription', args=(self.short_name,)))
def save(self):
"""
Overridden save method which stores wikilinks in text as links in database.
"""
super(CaveDescription, self).save()
qm_list=get_related_by_wikilinks(self.description)
for qm in qm_list:
self.linked_qms.add(qm)
super(CaveDescription, self).save()
class NewSubCave(TroggleModel):
name = models.CharField(max_length=200, unique = True)
def __unicode__(self):
return unicode(self.name)
class QM(TroggleModel):
#based on qm.csv in trunk/expoweb/1623/204 which has the fields:
#"Number","Grade","Area","Description","Page reference","Nearest station","Completion description","Comment"
found_by = models.ForeignKey(LogbookEntry, related_name='QMs_found',blank=True, null=True )
ticked_off_by = models.ForeignKey(LogbookEntry, related_name='QMs_ticked_off',null=True,blank=True)
#cave = models.ForeignKey(Cave)
#expedition = models.ForeignKey(Expedition)
number = models.IntegerField(help_text="this is the sequential number in the year", )
GRADE_CHOICES=(
('A', 'A: Large obvious lead'),
('B', 'B: Average lead'),
('C', 'C: Tight unpromising lead'),
('D', 'D: Dig'),
('X', 'X: Unclimbable aven')
)
grade = models.CharField(max_length=1, choices=GRADE_CHOICES)
location_description = models.TextField(blank=True)
#should be a foreignkey to surveystation
nearest_station_description = models.CharField(max_length=400,null=True,blank=True)
nearest_station = models.CharField(max_length=200,blank=True,null=True)
area = models.CharField(max_length=100,blank=True,null=True)
completion_description = models.TextField(blank=True,null=True)
comment=models.TextField(blank=True,null=True)
def __unicode__(self):
return u"%s %s" % (self.code(), self.grade)
def code(self):
return u"%s-%s-%s" % (unicode(self.found_by.cave)[6:], self.found_by.date.year, self.number)
def get_absolute_url(self):
#return settings.URL_ROOT + '/cave/' + self.found_by.cave.kataster_number + '/' + str(self.found_by.date.year) + '-' + '%02d' %self.number
return urlparse.urljoin(settings.URL_ROOT, reverse('qm',kwargs={'cave_id':self.found_by.cave.kataster_number,'year':self.found_by.date.year,'qm_id':self.number,'grade':self.grade}))
def get_next_by_id(self):
return QM.objects.get(id=self.id+1)
def get_previous_by_id(self):
return QM.objects.get(id=self.id-1)
def wiki_link(self):
return u"%s%s%s" % ('[[QM:',self.code(),']]')
photoFileStorage = FileSystemStorage(location=settings.PHOTOS_ROOT, base_url=settings.PHOTOS_URL)
class DPhoto(TroggleImageModel):
caption = models.CharField(max_length=1000,blank=True,null=True)
contains_logbookentry = models.ForeignKey(LogbookEntry,blank=True,null=True)
contains_person = models.ManyToManyField(Person,blank=True,null=True)
file = models.ImageField(storage=photoFileStorage, upload_to='.',)
is_mugshot = models.BooleanField(default=False)
contains_cave = models.ForeignKey(Cave,blank=True,null=True)
contains_entrance = models.ForeignKey(Entrance, related_name="photo_file",blank=True,null=True)
#nearest_survey_point = models.ForeignKey(SurveyStation,blank=True,null=True)
nearest_QM = models.ForeignKey(QM,blank=True,null=True)
lon_utm = models.FloatField(blank=True,null=True)
lat_utm = models.FloatField(blank=True,null=True)
class IKOptions:
spec_module = 'core.imagekit_specs'
cache_dir = 'thumbs'
image_field = 'file'
#content_type = models.ForeignKey(ContentType)
#object_id = models.PositiveIntegerField()
#location = generic.GenericForeignKey('content_type', 'object_id')
def __unicode__(self):
return self.caption
scansFileStorage = FileSystemStorage(location=settings.SURVEY_SCANS, base_url=settings.SURVEYS_URL)
def get_scan_path(instance, filename):
year=instance.survey.expedition.year
#print("WN: ", type(instance.survey.wallet_number), instance.survey.wallet_number, instance.survey.wallet_letter)
number=str(instance.survey.wallet_number)
if str(instance.survey.wallet_letter) != "None":
number=str(instance.survey.wallet_letter) + number #concatenate as strings because the convention is 2009#01 or 2009#X01
return os.path.join('./',year,year+r'#'+number,str(instance.contents)+str(instance.number_in_wallet)+r'.jpg')
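# Example (illustrative values): for a 2009 survey in wallet 12, the third notes scan is
# stored at ./2009/2009#12/notes3.jpg; note that wallet_number is not zero-padded here
# even though the comment above cites the 2009#01 convention.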
class ScannedImage(TroggleImageModel):
file = models.ImageField(storage=scansFileStorage, upload_to=get_scan_path)
scanned_by = models.ForeignKey(Person,blank=True, null=True)
scanned_on = models.DateField(null=True)
survey = models.ForeignKey('Survey')
contents = models.CharField(max_length=20,choices=(('notes','notes'),('plan','plan_sketch'),('elevation','elevation_sketch')))
number_in_wallet = models.IntegerField(null=True)
lon_utm = models.FloatField(blank=True,null=True)
lat_utm = models.FloatField(blank=True,null=True)
class IKOptions:
spec_module = 'core.imagekit_specs'
cache_dir = 'thumbs'
image_field = 'file'
#content_type = models.ForeignKey(ContentType)
#object_id = models.PositiveIntegerField()
#location = generic.GenericForeignKey('content_type', 'object_id')
#This is an ugly hack to deal with the #s in our survey scan paths. The correct thing is to write a custom file storage backend which calls urlencode on the name for making file.url but not file.path.
def correctURL(self):
return string.replace(self.file.url,r'#',r'%23')
def __unicode__(self):
return get_scan_path(self,'')
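# A minimal, untested sketch of the custom storage backend suggested in the comment on
# correctURL() above: escape the '#' only when building file.url, leaving file.path as-is.
# The class name is hypothetical and nothing below uses it yet.
class HashEscapingFileSystemStorage(FileSystemStorage):
    def url(self, name):
        # FileSystemStorage.url() builds the public URL from base_url plus the name;
        # percent-encode '#' so wallet directories like 2009#X01 survive as links
        return super(HashEscapingFileSystemStorage, self).url(name).replace('#', '%23')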
class Survey(TroggleModel):
expedition = models.ForeignKey('Expedition') #REDUNDANT (logbook_entry)
wallet_number = models.IntegerField(blank=True,null=True)
wallet_letter = models.CharField(max_length=1,blank=True,null=True)
comments = models.TextField(blank=True,null=True)
location = models.CharField(max_length=400,blank=True,null=True) #REDUNDANT
subcave = models.ForeignKey('NewSubCave', blank=True, null=True)
#notes_scan = models.ForeignKey('ScannedImage',related_name='notes_scan',blank=True, null=True) #Replaced by contents field of ScannedImage model
survex_block = models.OneToOneField('SurvexBlock',blank=True, null=True)
logbook_entry = models.ForeignKey('LogbookEntry')
centreline_printed_on = models.DateField(blank=True, null=True)
centreline_printed_by = models.ForeignKey('Person',related_name='centreline_printed_by',blank=True,null=True)
#sketch_scan = models.ForeignKey(ScannedImage,blank=True, null=True) #Replaced by contents field of ScannedImage model
tunnel_file = models.FileField(upload_to='surveyXMLfiles',blank=True, null=True)
tunnel_main_sketch = models.ForeignKey('Survey',blank=True,null=True)
integrated_into_main_sketch_on = models.DateField(blank=True,null=True)
integrated_into_main_sketch_by = models.ForeignKey('Person' ,related_name='integrated_into_main_sketch_by', blank=True,null=True)
rendered_image = models.ImageField(upload_to='renderedSurveys',blank=True,null=True)
def __unicode__(self):
return self.expedition.year+"#"+"%02d" % int(self.wallet_number)
def notes(self):
return self.scannedimage_set.filter(contents='notes')
def plans(self):
return self.scannedimage_set.filter(contents='plan')
def elevations(self):
return self.scannedimage_set.filter(contents='elevation')

Some files were not shown because too many files have changed in this diff.