Compare commits

182 Commits

Author SHA1 Message Date
80ceb7aaed removed __repr__ from classes in files.py, added a sqlalchemy class and marshmallow schemas for entry amendments, then load amendments on get_entry_by_id - so any page load (first or next/prev) will see amendments; we then display them in the files list and now add a white circle inside the throbber and overlay that with the appropriate icon/image - all of which is taken from amendment type and eid. tables.sql also updated to create the amendment data, tweaked icons.svg to remove hardcoded colours for flip_[vh] 2025-10-15 23:06:05 +11:00
9cf47f4582 removed __repr__ from classes in files.py, and added in sqlalchemy class and marshmallow schemas for entry amendments 2025-10-15 22:52:17 +11:00
a683da13cc gave whole of TODO a once over, removed old stuff 2025-10-12 23:49:35 +11:00
9ffb704648 fix bug where AI search used the wrong query as I rewrote the search_term and lost the AI: prefix - now compares original / OPT.search_term. Also, the union/sql did not work with ordering; rewrote this to use python code to merge the 3 separate queries and remove duplicates (if any) 2025-10-12 21:23:45 +11:00
27e06a9462 any change of opts needs to invalidate cache 2025-10-12 21:22:14 +11:00
4556afb3bb changed token word 2025-10-12 19:34:00 +11:00
0eee594206 fix up bug with cache not working when we change how_many 2025-10-12 19:33:12 +11:00
78b112d050 when we change file options, make sure we call correct query data function - search needs GetSerachQueryData 2025-10-12 19:31:45 +11:00
97e738dc13 implemented a quick cache for flat view 2025-10-12 16:29:39 +11:00
b61f048dec reverted partial client side back button logic, but also now tested / validated if somehow we are in a flat view and ask for entries and dont get all of them back, or we are in folder view and we try to go into a folder or back up a folder and we get no data as someone deleted it since we made the view, so then show appropriate client-side errors 2025-10-12 16:02:21 +11:00
e3f6b416ce forgot to add js_vers for search, fixed 2025-10-12 13:24:53 +11:00
0ac0eedef9 handle going into Dirs and back from Dirs by doing logic of parent dir on server side, only return eids, and get normal code to handle rendering; no specific get_dir_entries route -> it's now get_dir_eids, and the rest is normal code. Much cleaner/simpler. At this point I think the major rewrite is functional, committing before more testing and then merging code, and removed firstEntryOnPage bug (it's now pageList[0]) 2025-10-12 13:14:53 +11:00
cb5ff7e985 comments for clarity of root_eid=0 meaning no entries 2025-10-12 13:14:13 +11:00
517b5c6167 made more of the javascript hopefully honour versions, sort of works, not fully deployed consistently but works well enough for home use in dev. Also tweaked how the layout looks on a phone so that la / ra buttons are beside the image. Still oddly needs me to shrink that page, but it is now visually right - odd 2025-10-11 15:02:58 +11:00
16d28bc02e first pass of versioning, working for files.html only for now 2025-10-11 12:47:53 +11:00
da0019ecdc adding a get_version function that gets the mtime of the jscript file and then lets us embed that into the
<script src=>
Note, accidentally committed, partially tested
2025-10-11 12:27:59 +11:00
e4bf9606b9 move functions to appropriate file location for files/view support js, commented them better, removed some dead code 2025-10-11 12:13:44 +11:00
3a053bea49 found new todo around going back in folder view and sort order 2025-10-11 12:13:17 +11:00
1e421c3f22 move jscript logic out of template/html, into jscript so we can keep all logic into the included file, then I can force version on it and stop caching issues in mobiles, and a shift-reload will also get new code, rather than a container restart even on desktops 2025-10-11 11:11:51 +11:00
346defde8b added a functional (small) up button in the viewer, all works 2025-10-11 09:19:51 +11:00
6419e20d7e updated as I have fixed the restore/del from the viewer 2025-10-10 23:14:15 +11:00
b51d9e1776 make delete/restore button in viewer be the right colour, and go back to / when the success for the delete or restore finishes 2025-10-10 23:13:03 +11:00
fa2197adbe fix up bug where empty file_rbp did not work and displayed in the wrong div 2025-10-10 23:03:23 +11:00
66344e146e added a health check to see if this can help with the odd startup lag, it didnt 2025-10-10 23:02:47 +11:00
ee1c9b5494 all override add and remove now use new datastructures, close to be able to test / augment as per TODO 2025-10-09 23:56:27 +11:00
846bdc4e52 fix up linking between face to refimg when we add that face to a person from right-click 2025-10-07 23:52:06 +11:00
541cbec0de update adding refimg to new or existing person via right-click on a face to use new data structures 2025-10-07 23:43:07 +11:00
071164d381 remove dead code 2025-10-07 23:42:17 +11:00
1adb1bc7af no longer have pa_user_state data really, ditch the table 2025-10-07 23:42:02 +11:00
747f524343 remove more dead code 2025-10-07 23:41:38 +11:00
4feae96511 remove comments/format 2025-10-07 23:31:37 +11:00
e6c9429466 remove search term from DB 2025-10-07 23:30:28 +11:00
005b5be2b9 remove search_term from DB fields we want 2025-10-07 23:28:54 +11:00
f369b6d796 ticked off a couple of items around no content being better displayed 2025-10-07 00:01:30 +11:00
eb7bb84e09 add some padding, and set root_eid - Its 0 when nothing at all in Path (first run/empty). Also, started override for marshmallow, but overrides wont work yet 2025-10-07 00:00:50 +11:00
7f13d78700 deal with completely empty directories, remove ChangeSize and move that content into changeSize 2025-10-06 21:41:35 +11:00
e5d6ce9b73 move MoveDBox into full javascript and fold it into internal/js/files_support.js, also remove unused parameter for MoveDBox, and use marshmallow to pass people in query_data - overall just cleaner more consistent code for existing functionality 2025-10-05 23:20:17 +11:00
e0654edd21 alter move_paths from a specific var passed in the render_template, to be folded into query_data var 2025-10-05 22:36:49 +11:00
24c2922f74 actually set NMO var based on marshmallow data 2025-10-05 22:21:57 +11:00
24c3762c61 remove PathDetails*, and use marshmallow schemas with methods to get icon_url, and "path" renamed to "root_dir", updated move* code to use new data structure 2025-10-05 22:19:58 +11:00
40f0b5d369 refactor most javascript code, moved what can be moved into files_support.js, made keydown() items only apply when I go to the viewer code 2025-10-04 23:25:46 +10:00
2f5c6ec949 update for sqlalchemy v2 2025-10-04 23:23:39 +10:00
81ebf6fa01 use to_dict() here, so we can use |json in jscript 2025-10-04 22:45:28 +10:00
7a4f5ddb17 update z-index so we dont get buttons being higher than toast popup 2025-10-04 22:45:01 +10:00
00e42447fc make download link work with document.viewing rather than objs[current] 2025-10-04 10:43:00 +10:00
50f9fbee20 update to use new viewing data structures, consolidate to one set of transform functions with successCallback to cover the files vs viewer content. Updated TODO appropriately 2025-10-04 10:39:11 +10:00
1b9b4c9c4f update to use new viewing data structures, consolidate to one set of transform functions with successCallback to cover the files vs viewer content. Updated TODO appropriately 2025-10-04 10:39:06 +10:00
86761994df need to check length of faces, not just it exists - it comes back as an empty array when no faces 2025-10-04 10:37:29 +10:00
4138f392d3 clean up includes, double up of fullscreen var, and use of current still 2025-10-04 10:37:00 +10:00
525b823632 dont crash server if we have not scanned the Path yet 2025-10-04 10:34:54 +10:00
d019dc7960 more things to test/fix 2025-10-03 21:40:59 +10:00
a15fbd74d5 update for new db models/marshmallow object 2025-10-03 21:40:48 +10:00
3c8babb619 made basic viewer now handle new marshmallowed data - faces, model_used, face_distance, etc. 2025-10-03 21:28:25 +10:00
cd73c16545 include refimg_lnk and facefile_lnk into Face to be used in viewer 2025-10-03 21:10:52 +10:00
efaec00127 OMG, face_distance is float not int, how has this ever worked 2025-10-03 21:10:04 +10:00
0777bbe237 remove more dead code 2025-10-02 23:14:42 +10:00
b0c738fcc1 moved cFO() to changeOPT() into internal/js, transformed file_list_ip to work with new query_data model, made the Size buttons use modern bootstrap5 radio buttons and work again without the form for all file* routes. Removed a lot of dead code, moved code to resetNextPrev buttons to anytime we getPage, reducing code, handle setting OPT in jscript via a to_dict() for States. Overall, search and file_list_ip now work. Only main thing left is the override code 2025-10-02 23:11:14 +10:00
2b9e0e19a2 add a to_dict() for States, and then I can use a 1-liner to set a var in jscript 2025-10-02 18:20:31 +10:00
e526d99389 no longer maintain too much state per user, remove all need to update it 2025-10-02 17:54:51 +10:00
5a923359bc remove dead viewing code 2025-10-02 17:54:31 +10:00
a147308b64 removed fields from PA_USER_STATE, no longer needed with new viewing logic 2025-10-02 17:51:47 +10:00
9e943c7e1f removed fields from PA_USER_STATE, no longer needed with new viewing logic 2025-10-02 17:51:39 +10:00
87651e80a0 fix up capitalisation of bOOLEAN, and removed fields from PA_USER_STATE, no longer needed with new viewing logic 2025-10-02 17:51:25 +10:00
2e952deda0 added code to support changing noo/how_many/folders and do this with json data back and forth, update the UI, all works - only search is missing now. Lots of dead code can still be deleted 2025-10-01 23:48:19 +10:00
b9b7a24326 removing done items 2025-10-01 23:43:15 +10:00
a7ce8e66b5 change back to how_many, too hard to fix and the inconsistency broke the GUI change 2025-10-01 23:40:39 +10:00
6199c042e3 remove most of the code, just have States with default values and some derived values such as prefix 2025-10-01 23:40:05 +10:00
c32b99f53e augment CreateFoldersSelect to support supplying js 2025-10-01 23:39:17 +10:00
175e43c9bb make viewer work for files_sp, by resetting entryList and pageList for any new directory load 2025-09-30 18:38:54 +10:00
4bb99ce589 catch all to cover when we try to view an img/video and the dir contents is empty, so nothing to show 2025-09-30 18:38:20 +10:00
70ca93b14e convert files_rbp to new query_data model 2025-09-30 18:37:15 +10:00
a0e06717ac viewer now works for files_ip, still have broken bits everywhere - files_rbp, change_opts, do I want a back button? lots of dead/old code, probably can move more js into *_support, and do I want to keep files_support separate to view_support 2025-09-30 00:29:11 +10:00
0851b79e16 remove debug 2025-09-30 00:26:45 +10:00
8e6342b627 quick add of files_div to be able to toggle between files viewing and viewer itself, moved isMobile to support js, change eids to use query_data - for now, will remove when we consolidate to new approach 2025-09-28 21:48:11 +10:00
5b6dc1b3e9 seems I now need setuptools, this feels like a kludge - watch this space for further upstream changes 2025-09-28 21:45:19 +10:00
59bd2af15e update to reflect we no longer use sudo for docker commands 2025-09-28 21:44:26 +10:00
1ca5ca192c move isMobile() to support js, as it has no jinja2 in it 2025-09-28 21:44:02 +10:00
5f8c48ac18 fix highlighting issue - as expected, ecnt was wrong 2025-09-27 17:36:31 +10:00
b67f2d9dcb now allow files_sp paths to work (and support folders). Highlighting with folders is broken (likely dodgy ecnt). Viewing still broken, but basic navigation is finally working with folders now 2025-09-27 12:42:01 +10:00
5842bf2ab8 set OPT values in jscript in files.html, added functions to draw figures on a page based on pageList (subset of entryList) & json data, and tweaked just grouping select as well to also use the draw figures func() in jscript. Needed to move out .figure click handler into the draw figures too. This is now semi-functional, images load, pages next/prev works, grouping works -- BUT only for files_ip -- folders wont work, search wont work, files_sp wont work, viewing a file wont work 2025-09-27 00:31:42 +10:00
be218c5049 first pass of using query data to allow display of files_ip only. Does use passed-in list to do next/prev page (of the eids only). No images drawn at all. Definitely NOT a usable version of PA at the moment 2025-09-26 19:32:09 +10:00
a28f016b8a when viewing, there is no path and not needed for viewing anyway 2025-09-26 19:30:29 +10:00
d2db7f6184 hash is a string not an Integer, not sure how this issue has been here so long. Also, first pass of using query data to allow display of files_ip only. Definitely NOT a usable version of PA at the moment 2025-09-26 19:26:23 +10:00
9ec8195d0a first pass of support functions to allow query/entry_list to drive pagination, and do not go back to the server to calc next/prev page 2025-09-26 19:25:19 +10:00
2325dcd22a work out prefix and store it away in States so we can just use it in updated json data calc for files 2025-09-26 19:24:10 +10:00
e0b597c58c add entry_list to query table 2025-09-26 19:22:36 +10:00
0895268df2 more thoughts on change over to json, etc 2025-09-26 19:20:47 +10:00
efceef7e57 added BUG-140 - db restarts cause job mgr to fail 2025-09-26 14:57:32 +10:00
21059a6235 reformatted to be normal sql, added query table 2025-09-20 19:32:47 +10:00
4d80fa4e7c first pass at new query table 2025-09-20 19:32:21 +10:00
a4526e2b70 updated BUG as well 2025-09-20 14:50:00 +10:00
cf9b0996be works with new ldap now, and put forward details of next big change into TODO 2025-09-20 14:48:29 +10:00
57e736a495 temporarily add some data capturing to dom for debugging, and remove superfluous condition in if 2025-09-20 14:47:52 +10:00
aaf04e7f9b AI search now does not wildcard, so only exact matches person when you do AI:<xxx> 2025-09-20 14:46:57 +10:00
eb9e7f2ce1 fix for ldap removing any anonymous binds, also use ldap.ddp.net 2025-09-16 20:36:04 +10:00
df7a81df09 new TODOs 2025-09-16 20:35:03 +10:00
4742509d56 added new BUG 2025-09-16 20:34:44 +10:00
391fd52332 commented out more debugs 2025-08-19 20:55:20 +10:00
1831c49b15 fixed 2 x TODOs, can now add an existing face as a refimg, and I sped up the recent jobs page 2025-08-19 20:53:26 +10:00
fa63e08b59 store away person.id per matched face to support match_add_refimg in context menu 2025-08-19 20:22:49 +10:00
e36fc170f1 moved container cwd to ~/src/photoassistant so that gunicorn can react to src code changes, also forced logs to docker logs 2025-08-19 20:22:20 +10:00
e3c94bf5eb added match_add_refimg support to context menu, allows us to take existing (presumably) better image and make it a refimg for someone its matched 2025-08-19 20:20:50 +10:00
db8cf68d4b added a clear messages (forced) to allow M to clear out annoying messages if needed 2025-08-19 20:19:57 +10:00
1a182d30b5 added a clear messages (forced) to allow M to clear out annoying messages if needed 2025-08-19 20:18:59 +10:00
c7c08c1c32 made AI:<tag> be literal, e.g. no wildcards to stop AI:kath matching AI:katherine, also removed unused fid var and commented out debugs 2025-08-19 20:18:30 +10:00
5fa35b90eb updated to reflect move from hard-coded ip to FQDN and using gitea (on git.depaoli.id.au) instead of local git on mara 2025-08-19 20:17:31 +10:00
ae718fe249 had to remove libgl1-mesa-glx to get installs to work, added mkdir -p jic, force TZ so build date is timezone accurate 2025-08-19 20:12:42 +10:00
48ca6c5e64 new todos 2025-08-19 20:11:54 +10:00
6ceb4c4604 clean up of BUGs list 2025-08-19 20:11:39 +10:00
b7b2eaeb1f use new logo for about page, and center the alert text 2025-05-13 18:59:09 +10:00
0660ca55d7 new logo file 2025-05-13 18:44:30 +10:00
e6e621d6f7 updated pics for rotations 2025-05-13 18:44:23 +10:00
0a49c2a627 added BUG-137 - after moving files, shift-click selection is wrong 2025-05-13 18:43:58 +10:00
e460164d83 various TODOs 2025-05-13 18:43:30 +10:00
f6ddb66f3f set cant_close to False now, I think I know enough about this BUG 2025-05-13 18:43:08 +10:00
1555cd5ad6 use new logo 2025-05-13 18:42:21 +10:00
62acd34ae0 trying to use less RUNs / layers, to speed up subsequent builds 2025-05-13 18:41:59 +10:00
5448e0a7a4 just do all of static, rather than explicit subdirs 2025-05-13 18:41:26 +10:00
70908d60f5 use fixed width and height to force buttons to be consistent with new rot pngs 2025-01-08 13:37:01 +11:00
58715491ea keep rot90.svg source in case we want to modify pngs in the future 2025-01-08 13:35:46 +11:00
f5dfb92db1 removing BUG-128 its a duplicate of BUG-123 2025-01-04 22:04:36 +11:00
eb819e37d7 added notes on DEEPFACE 2025-01-04 22:03:33 +11:00
c6a0352c7d svgs cant have fill and do onHover properly, and arcs need a fill of none, so switched over to pngs and made the arrow match the actual direction of rotation 2025-01-04 22:03:21 +11:00
f4aeadc019 adding some notes from some bugs stored into the db 2025-01-04 22:02:26 +11:00
79437477f7 svgs cant have fill and do onHover properly, and arcs need a fill of none, so switched over to pngs and made the arrow match the actual direction of rotation 2025-01-04 22:02:00 +11:00
989fc7c6db svgs cant have fill and do onHover properly, and arcs need a fill of none, so switched over to pngs and made the arrow match the actual direction of rotation 2025-01-04 22:01:51 +11:00
febdfbbd32 added deepface notes 2025-01-04 21:59:56 +11:00
37c0125fbd Fixed BUG-136, now can search with a * or % as wildcards, and fix bug where no search results broke pa 2024-12-31 15:13:17 +11:00
b34b71bf32 adding BUG-136, handle wildcard search of % and no results bug 2024-12-31 15:11:57 +11:00
bd3ee98328 clean up start-up messages and validation on ENV. Display error correctly when exif rotater fails 2024-12-07 19:46:15 +11:00
47190a8829 use newer sqlalchemy v2 syntax 2024-12-07 19:42:49 +11:00
f97653c4f1 had to lock sqlalchemy to a newer version, but was able to unlock flask version 2024-12-07 19:42:03 +11:00
8157292df4 convert over to pauser - and use ARG for uid/gid of pauser 2024-12-07 19:41:32 +11:00
a0b4b80ca9 clean up and use pauser 2024-12-07 19:40:29 +11:00
cbfd00fe53 make dev PA go via container name instead of via mara for DB connection 2024-12-07 19:39:12 +11:00
61b3edf5fa added bugs 133-135, failing symlinks on build, moving images reappearing and failing to rotate 2024-12-07 19:38:48 +11:00
fdba16c004 added BUG-133 right arrow in viewer goes to next pic AND scrolls right 2024-08-25 22:37:38 +10:00
9a359a1a4c fixed BUG-132 failing to create symlink on first build by being explicit in Dockerfile on how to handle content in static/ 2024-08-11 18:59:02 +10:00
15cc101bbe add BUG-131, init failing on first build to create symlinks 2024-08-11 18:58:20 +10:00
9dac54ae97 removing cant_close=True on this one, ultimately, its just viewing order is wrong, so no actual data/file issues - let it be persistent but able to be closed 2024-07-28 17:31:35 +10:00
aba96f6c52 added BUG-130, moving files and next page confusion 2024-07-28 17:29:24 +10:00
0895bc5983 Oops - remove hardcoded path and use self.url, one too many ctrl-zs when testing :( 2024-06-30 18:07:02 +10:00
46eb87bc31 use a regex match rather than just the last part of self.url to find the search term, as something changed and caused the url to be different on the post - not sure when I broke this, or if an upgrade did, but this fixed BUG-127 and BUG-129 (just a special case of 127) 2024-06-30 17:12:31 +10:00
cbea1f09ab clarified new bugs 2024-06-29 15:23:08 +10:00
5fa9b5e386 updated for fresh ubuntu 24.04 install 2024-06-29 13:59:40 +10:00
28d4f8d998 minor tweak of content of try for symlink creation and error content afterwards 2024-06-29 13:55:43 +10:00
29ea2ca833 starting to tweak from legacy query.get() method to session.get() - these fixes will be a pita/lot 2024-06-29 13:55:01 +10:00
06ceefda97 popped face recog load_image_file into a try block as it clearly has/can fail 2024-06-29 13:54:22 +10:00
974d96d6eb Fix BUG-126 (basically disable next/prev buttons the moment we go back to the server for the previous/next page of images, only re-enable if they make sense after we have loaded the new images) 2024-06-29 13:53:50 +10:00
030eb785f3 Fix BUG-126 (basically disable next/prev buttons the moment we go back to the server for the previous/next page of images, only re-enable if they make sense after we have loaded the new images) 2024-06-29 13:53:38 +10:00
1277558b0e fixed BUG-126, so removing debugs, changing unlikely bug reoccurrence to a warning that can be closed 2024-06-29 13:51:53 +10:00
d7f028f3d1 had to hard lock numpy for now so that face recognition library can find locations 2024-06-29 13:50:07 +10:00
d53bffcab9 allow this to work in padev container (e.g. run as mythtv) 2024-06-29 13:49:31 +10:00
4885acea80 adding extra debugs for weird bugs with losing data when viewing 2024-04-07 20:18:40 +10:00
fed78672dc make message field larger to cater for larger debugs 2024-04-07 20:18:10 +10:00
bc23c9a3c6 added new BUG relating to restart DB underneath web 2024-04-07 20:17:22 +10:00
ffbedc3bca added BUG-127 sort not working after a search 2024-04-07 00:34:42 +11:00
d39f1734a3 fix BUG-124 - pa job mgr does not have timestamps, via creating/use of PAprint() 2024-03-11 00:40:25 +11:00
d86893875d added capability to run a dev container for pa, and if we re-build we create a new .sk and remove old PA_UserState on fresh login. BUG-120 is either fixed by this or at least the log catches it and does not crash now (if we rebuild between re-use) 2024-03-10 23:52:34 +11:00
2665ab086e properly define regex strings 2024-03-10 23:52:24 +11:00
063c75809a more thoughts now we have a new CPU in mara 2024-03-10 13:40:23 +11:00
8b89d845db change key warning to be persistent, so I dont miss what it said 2024-03-10 13:39:38 +11:00
6ef12d7636 change key warning to be persistent, so I dont miss what it said 2024-03-10 13:39:21 +11:00
42eeed7c45 move to ssl offloading, rather than processing ssl inside the python code - just easier / more consistent with other traefik enabled containers 2024-03-10 13:39:00 +11:00
096251f8ce move to using base image of python, not ubuntu, update requirements to use pip for as much as we can and not mix and match some apt get and pip3 for python dependencies 2024-03-10 13:37:54 +11:00
33c86f49fb "new" BUG 126 - still the fail to view an id that is not in the list 2024-03-10 13:36:31 +11:00
98d33b4880 have a solution for multi-select on tablet/phone now 2024-01-28 23:27:20 +11:00
448b635d5a updating BUGS 2024-01-28 23:27:02 +11:00
53ef671d34 added an isMobile() function that adds a shift and ctrl key to the files view, which can be clicked to fake a shift or ctrl key selection event on a tablet/mobile - first pass at this, its fairly usable. I might allow de-selecting the shift or ctrl key if they are pressed again before a selection is used, otherwise this is all functional. Note, I also changed the contextmenu to a click not mouse down on selection of an item in the menu. This allows me to stop the propagation of the click event which was being trapped by the $(document).on( "click" ... and which we dont want - also exposes a BUG that when I click the context menu onto a different image it does not highlight the new image and some menu items process the original highlight, others the image under the context menu 2024-01-21 23:07:31 +11:00
7e25c33f1a removed use of ref var for code readability
added some more defensive code to throw an error, rather than crash if missing pref, referrers, etc.
fix use of SetFELog with log_level="error" => level="danger"
2024-01-21 18:19:20 +11:00
4db3d27cbc fix use of SetFELog with log_level="error" => level="danger" 2024-01-21 18:17:45 +11:00
a3350b9316 default for similar folder by date is now +/- 2 weeks not 1 AND fix use of SetFELog with log_level="error" => level="danger" 2024-01-21 18:17:03 +11:00
a8ee7b4f89 added TODO for touch and multi-select 2024-01-21 18:14:53 +11:00
478a7e5add added 2 new bugs - 123 -> pa_job_mgr crashed with a timeout, 124 -> pa_job_mgr logs have no timestamp 2024-01-21 18:14:25 +11:00
aad0389d77 BUG-122 - just closing this, duplicates were from 2 years ago and were 200ms apart, looks like a reload of the POST? If so, I now force a redirect after the POST, so reloads dont add more than 1 person 2024-01-01 12:16:37 +11:00
262efd89d4 Fixed BUG-121: crashing when cannot find a match when searching 2024-01-01 11:49:41 +11:00
9c4da5ec6f add allow actions for wrong person to TODO 2024-01-01 11:49:03 +11:00
3f39f74716 new bugs, 500 when cannot find a string, and duplicates of people? 2024-01-01 11:48:43 +11:00
48353f0486 have changed the rotation code to use jpegtran so its lossless 2023-11-11 22:28:29 +11:00
0ed56d6a1a added a debug for rotate code, and put quotes in the right spots so paths with spaces are covered - D'oh! 2023-11-11 22:25:26 +11:00
1005acb339 use jpegtran to do lossless 90 rotations 2023-11-11 21:28:04 +11:00
62cfe6305e use text() for newer sqlalchemy 2023-09-27 15:02:57 +10:00
43 changed files with 2685 additions and 2135 deletions


@@ -2,10 +2,7 @@ images_to_process
new_img_dir
photos
reference_images
static/Bin/*
static/Import/*
static/Storage/*
static/Metadata/*
static
.pa_metadata
.pa_bin
.python

BUGs

@@ -1,4 +1,16 @@
### Next: 121
### Next: 141
BUG-140: When db is restarted underneath PA, it crashes job mgr... It should just accept timeouts, and keep trying to reconnect every 2? mins
BUG-139: using any large entry list and going next a few times, ends say 4 pages of 50 into 4000 matches (entries from DB < 50)...
- confirmed this is when person has 2 or more refimgs:
- on page "2", we get 49 pulled back in the ORM instead of the 50 expected -- b/c I use that to indicate we must be at the end of the list if not 50 found
-- really, need to fix once and for all the eids / re-running query.
do GetEntries as we do now, once done however, get all entry ids. Stick those into the DB with a unique query-id and datestamp
new func to get all details needed for entries in an eid list (of 1-page) - show this page of entries
use the current, full eid list to work out start/end of list (next/prev) and disable buttons accordingly.
then the client can keep the current page of data; if you hit next/prev, use the DB's unique query id / full list and page of eids, and return full data for the new page of entries
The implication, though, is that if a search is invalidated (maybe a photo is deleted / moved), we need to remove it from the list in the DB too OR let the user know/decide to fix/wait.
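A minimal sketch of that approach - the repo's new query table presumably looks something like this, but the exact columns and the helper here are illustrative assumptions, not the actual query.py:
```python
# Sketch only: column names and GetPageOfEids() are assumptions, not PA code.
import json
from datetime import datetime
from main import db

class Query(db.Model):
    """One row per search: the full eid list saved once, with a datestamp."""
    __tablename__ = "query"
    id = db.Column(db.Integer, primary_key=True)
    created = db.Column(db.DateTime, default=datetime.utcnow)
    entry_list = db.Column(db.String)           # JSON-encoded list of eids

def GetPageOfEids(query_id, page, per_page=50):
    """Slice one page out of the saved eid list - no re-run of the search."""
    q = db.session.get(Query, query_id)
    eids = json.loads(q.entry_list)
    start = page * per_page
    return {
        "eids": eids[start:start + per_page],
        "has_prev": start > 0,
        "has_next": start + per_page < len(eids),
    }
```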
BUG-100: I managed to get 2 photos matching mich in the NOT_WORKING photo (probably dif refimgs but same p.tag?)
= /photos/2012/20120414-damien/IMG_8467.JPG
BUG-106: cant add trudy /pat? as refimgs via FaceDBox
@@ -11,20 +23,19 @@ BUG-119: "Uncaught (in promise) Error: A listener indicated an asynchronous
was received"
investigate this (possible I'm calling check_for_jobs and maybe not doing the async right?)
BUG-120: on tablet:
[2023-04-09 12:21:21,214] ERROR in app: Exception on /view/16978 [GET]
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 2525, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1822, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1820, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1796, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
File "/code/files.py", line 655, in view
OPT=States( request )
File "/code/states.py", line 92, in __init__
self.url = pref.orig_url
AttributeError: 'NoneType' object has no attribute 'orig_url'
BUG-123: pa_job_manager crashed with timeout on connection (probably when I turned off traefik for a bit?). Regardless, should be more fault tolerant --> maybe offer to restart pa_job_manager IF its crashed?
this definitely happened also, when I shutdown the DB back-end mid job, and it was able to be restarted, so could get f/e to at least suggest a restart of the container, or auto-restart job_mgr?
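A minimal sketch of the "keep trying to reconnect" idea from BUG-140/BUG-123, assuming the job manager polls in a loop; poll_and_run_jobs() is a stand-in for the real polling step, not the current pa_job_manager code:
```python
# Sketch only: retry loop so a DB restart pauses the job manager instead of crashing it.
import time
from sqlalchemy import text
from sqlalchemy.exc import OperationalError
from main import db

RETRY_SECONDS = 120                       # the "every 2? mins" from BUG-140

def poll_and_run_jobs():
    """Placeholder for the real polling step: read the jobs table, run jobs."""
    db.session.execute(text("SELECT 1"))  # stands in for the real queries

def job_loop():
    while True:
        try:
            poll_and_run_jobs()
            time.sleep(5)                 # normal poll interval
        except OperationalError:
            db.session.rollback()         # drop the broken connection/transaction
            time.sleep(RETRY_SECONDS)     # wait for the DB, then try again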
BUG-125: when an image is highlighted, then post the contextmenu on a different image - the highlight does not move to the new image
and the selected menu function processes the original or the new depending on the way the code works.
There is a chance we need to change the document on click to a mouse down (or whatever the context menu
uses for default), rather than just fix the highlight
BUG-130: moving files and then trying to go next page and it got confused...
BUG-132: right arrow to go to next photo in viewer ALSO scrolls to the right, needs a return somewhere in the jscript
BUG-133: when rebuilding pa[dev], the first run fails to have symlinks to the right paths for Import/Storage, etc. a simple restart fixes - so potentially the initial copy or some other race condition?
BUG-134: when moving set of photos on page, then move another set of photos on page, the first set reappears. Could really delete them from the dom?
BUG-135: failed to rotate: 2006/20061215-ITS-xmas-KP/DSC00582.JPG - not sure why && not repeatable, so its not the image, timing/race condition maybe?
BUG-137: after moving/refiling photos, the next shift-click is out of order (reload fixes it)
BUG-138: Placeholder for all the ways we can get the front-end confused:
---> JUST fix all these BUGs (relating to confused/lost state) by revisiting the overly complex way I remember state and my position in a list (probably FAR easier to make an initial sql just save all eids, and then never try to recreate that list again and not care how I got into the list). Can attach a "running server-side sequence number", and if the sequence is old and the original eid list results in a failure, then just pop up that the saved list is no longer valid, and ask the user to re-do their search/list...

DEEPFACE

@@ -0,0 +1,74 @@
#
# https://pypi.org/project/deepface/
#
from deepface import DeepFace
import os
import cv2
import numpy as np

# Function to compute embeddings for a given image path
def compute_embeddings(image_path, model_name='VGG-Face'):
    try:
        # This returns a list of embeddings for each face found in the image
        embeddings = DeepFace.represent(img_path=image_path, model_name=model_name)
        return [res['embedding'] for res in embeddings]
    except Exception as e:
        print(f"Error processing image {image_path}: {e}")
        return []

# Function to find the best matches between reference and target images
def find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine'):
    # Store embeddings for reference images
    reference_embeddings = {}
    for ref_img in reference_images:
        ref_embeddings = compute_embeddings(ref_img, model_name=model_name)
        if ref_embeddings:
            reference_embeddings[ref_img] = ref_embeddings

    # Store best matches
    best_matches = []

    # Loop through each target image
    for target_img in target_images:
        target_embeddings = compute_embeddings(target_img, model_name=model_name)
        for target_emb in target_embeddings:
            for ref_img, ref_emb_list in reference_embeddings.items():
                # Compare each reference embedding with the target embedding
                for ref_emb in ref_emb_list:
                    # Compute the distance between embeddings using DeepFace's distance functions
                    distance = DeepFace.find_distance(ref_emb, target_emb, distance_metric=metric)
                    # Store match details (target image, reference image, and distance)
                    best_matches.append({
                        'target_image': target_img,
                        'reference_image': ref_img,
                        'distance': distance
                    })

    # Sort matches by the distance (smallest distances indicate best matches)
    best_matches = sorted(best_matches, key=lambda x: x['distance'])
    return best_matches

# Example usage
reference_images = [
    "path_to_reference_image1.jpg",
    "path_to_reference_image2.jpg"
]
target_images = [
    "path_to_target_image1.jpg",
    "path_to_target_image2.jpg",
    # Add more target images here...
]

# Find the best matches
matches = find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine')

# Display the best match
for match in matches[:5]:  # Display top 5 matches
    print(f"Reference Image: {match['reference_image']}, Target Image: {match['target_image']}, Distance: {match['distance']}")


@@ -1,35 +1,27 @@
FROM ubuntu:22.04
FROM python:latest
# declare var
ARG PA_ID
WORKDIR /code
USER root
ENV PJM_UID=500
ENV PJM_GID=500
RUN groupadd -g ${PJM_GID} mythtv && useradd -r -u ${PJM_UID} -g ${PJM_GID} mythtv
ENV TZ=Australia/Melbourne
RUN truncate -s0 /tmp/preseed.cfg && \
(echo "tzdata tzdata/Areas select Australia" >> /tmp/preseed.cfg) && \
(echo "tzdata tzdata/Zones/Australia select Melbourne" >> /tmp/preseed.cfg) && \
debconf-set-selections /tmp/preseed.cfg && \
rm -f /etc/timezone /etc/localtime && \
apt-get update && \
DEBIAN_FRONTEND=noninteractive DEBCONF_NONINTERACTIVE_SEEN=true \
apt-get install -y tzdata
RUN apt-get update && apt-get -y dist-upgrade && apt-get -y install python3-pip libpq-dev gunicorn mediainfo cmake libgl1-mesa-glx libglib2.0-0 python3-ldap libjpeg-turbo-progs ffmpeg git
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
RUN pip3 install --upgrade pillow --user
EXPOSE 443
EXPOSE 55432
# do this first, allowing layer / build efficiencies if only code changes...
# sudo used in dev container
RUN apt-get update && apt-get -y install libpq-dev mediainfo cmake libglib2.0-0 libjpeg-turbo-progs ffmpeg git sudo
COPY requirements.txt .
RUN pip3 install --upgrade pip && pip3 install -r requirements.txt && pip3 install --upgrade pillow
RUN groupadd -g ${PA_ID} pauser && useradd -r -u ${PA_ID} -g ${PA_ID} pauser
# NOW, deal with any code changes, and reset the container for 'first run'...
COPY . .
RUN echo $RANDOM | md5sum | head -c 30 > /code/.sk
RUN chmod 600 .sk
RUN date > internal/build-date.txt
RUN git log -n 15 > internal/git-log.txt
RUN ln -s /code/TODO /code/internal/TODO
RUN ln -s /code/BUGs /code/internal/BUGs
# set up static dir and subdirs
RUN mkdir -p static/Bin && mkdir -p static/Import && mkdir -p static/Storage && mkdir -p static/Metadata
EXPOSE 80
RUN cat /dev/urandom | head -c 50 | md5sum | head -c 32 > /code/.sk && chmod 600 .sk && chown pauser:pauser .sk
RUN TZ="Australia/Melbourne" date > internal/build-date.txt && git log -n 15 > internal/git-log.txt && ln -s /code/TODO /code/internal/TODO && ln -s /code/BUGs /code/internal/BUGs
RUN rm -rf .git
RUN chown mythtv:mythtv /code
RUN chown mythtv:mythtv ./static
RUN chown mythtv:mythtv ./static/*
## cleanup of files from setup
RUN rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
RUN chown pauser:pauser /code && chown pauser:pauser ./static && chown pauser:pauser ./static/*
CMD ["./wrapper.sh"]

README

@@ -3,34 +3,22 @@ In here we can put instructions on how to run this / any general info
to edit src:
git....
CAM: fill this in pls
cd ~/src ; # or wherever you want the code to live
git clone https://git.depaoli.id.au/ddp/photoassistant.git
ubuntu packages:
sudo apt-get install -y mediainfo cmake python3-flask
sudo apt-get install -y cmake libpq-dev
REDO THIS SOON (good to test on a clean install):
AFTER a fresh ubuntu, need to remove/make a new .python env and get pip-based python packages installed into .python:
cd ~/src/photoassistant
rm .python
# use a venv (to get consistent python versions with Prod container)
python3 -m venv .python
source ./.python/bin/activate
pip3 install -r requirements.txt
pip packages:
* pymediainfo
* PIL (should be there by default)
- on ubuntu, need to do: pip3 install --upgrade pillow --user
* ExifRead
* opencv-python
* datetime
* pytz
* face_recognition
* flask_login
* flask-ldap3-login
#### dlib (might need to install this before face_recognition, but it might not be needed, cmake clearly was)
upstream packages...
mkdir static/upstream
cd static/upstream
@@ -60,14 +48,11 @@ pip3 list | tail -n +3 | grep -v mysqlclient | grep -v duplicity | grep -v gp
To run debug version of web server (gunicorn deprecates FLASK_ENV, so co-exist for now):
# older flask:
FLASK_APP=main ENV=development FLASK_ENV=development flask run --host=192.168.0.2
# flask 2.2.2+ (in venv .python)
FLASK_APP=main ENV=development ./.python/bin/flask --debug run --host=192.168.0.2
FLASK_APP=main ENV=development ./.python/bin/flask --debug run --host=mara.ddp.net
to run prod version of web server:
gunicorn --bind="192.168.0.2:5000" --threads=2 --workers=2 main:app
gunicorn --bind="mara.ddp.net:5000" --threads=2 --workers=2 main:app
Also have to run the job manager for jobs to work:
ENV="development" python3 pa_job_manager.py
@@ -110,13 +95,13 @@ To get back a 'working' but scanned set of data:
# pg_dump --user=pa -a -t person -t refimg -t person_refimg_link > /docker-entrypoint-initdb.d/users.sql
# export all content so we can upgrade versions of postgres
sudo docker exec -it padb bash
docker exec -it padb bash
# pg_dump --user=pa pa > /docker-entrypoint-initdb.d/bkup.sql
### check sql looks good
sudo mv /srv/docker/container/padb/docker-entrypoint-initdb.d/bkup.sql /srv/docker/container/padb/docker-entrypoint-initdb.d/tables.sql
sudo rm /srv/docker/container/padb/docker-entrypoint-initdb.d/users.sql
sudo docker-compose -f /srv/docker/config/docker-compose.yaml build padb
( cd /srv/docker/config/ ; sudo docker-compose stop padb ; yes | sudo docker-compose rm padb ; sudo rm -rf /srv/docker/container/padb/data/ ; sudo docker-compose up -d padb ; sudo docker-compose restart paweb )
docker-compose -f /srv/docker/config/docker-compose.yaml build padb
( cd /srv/docker/config/ ; docker-compose stop padb ; yes | docker-compose rm padb ; sudo rm -rf /srv/docker/container/padb/data/ ; docker-compose up -d padb ; docker-compose restart paweb )
HANDY SQLs/commands:

TODO

@@ -1,21 +1,54 @@
### major fix - go to everywhere I call GetEntries(), and redo the logic totally...
* client side:
* instead of removing deleted images from the DOM, we should gray them out and put a big Del (red circle with line?) through it as an overlay.
[DONE] * Create another table of entry_amendments (sketched below) - note the deletions, rotations, flips of specific eids - then reproduce that on the client side visually as needed
[DONE] - at least grayed-out, to indicate a pending action is not complete.
- When job that flips, rotates, deletes completes then create an entry_amendment in the DB.
- Also hand fudge the jscript amendments for each job / next get_entry_by_id (if needed will also set amendments as needed)
- When job finishes, remove amendment from DB
- when job finishes, remove amendment from document.amendments
need to rework all the throbber stuff, I think it is probably better not to have a div I never use with the throbber in it, just add when I need it...
like in code for amendments. Also get rid of style and just use class
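A rough sketch of what the entry_amendments table and marshmallow schema added in the commits above might look like - the column names and types here are assumptions based on the commit messages, not the actual tables.sql/files.py definitions:
```python
# Sketch only: column names/types are guesses, not the real definitions.
from main import db, ma

class EntryAmendment(db.Model):
    __tablename__ = "entry_amendment"
    eid = db.Column(db.Integer, db.ForeignKey("entry.id"), primary_key=True)
    type = db.Column(db.String)    # e.g. "delete", "rotate", "flip_h", "flip_v"

class EntryAmendmentSchema(ma.SQLAlchemyAutoSchema):
    class Meta:
        model = EntryAmendment
        load_instance = True
```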
### GENERAL
* change the rotation code to use that jpeg util to reduce/remove compression loss?
* jobs for AI should show path name
* rm dups job should show progress bar
* in viewer, there is no move button (maybe add one?)
* think I killed pa_job_manager without passing an eid to a transform job, shouldn't crash
- SHOULD JUST get AI to help clean-up and write defensive code here...
* consider doing duplicates before AI, and if there are say 100s+, then maybe pause the AI work
- had 5000+ new photos, took 8 hours to finish, for me to just delete them anyway
* consider how to better version jscript - across all html files, consistently
- mtime, didnt work anyway, my phone still wont pick up the change, it was adding any ?v= changed this (once)
* optimisation:
- keep track of just new files since scan (even if we did this from the DB),
then we could just feed those eid's explicitly into a 'get_file_details_on_new_files'.
- ALSO use new_eids list IF no new refimgs to do a 'run_ai_on_new_files' :)
* allow changing dates in move dbox and then re-scan for existing folders OR just have a browse for existing...
- for use on scanned photos, they register as 2010, but are datestamped in visuals for 95
(is there a library for this???)
* sqlalchemy 2 migration:
* fix unmapped (in fact make all the code properly sqlalchemy 2.0 compliant)
-- path.py has the __allow_unmapped__ = True
* remove all '.execute' from *.py (a conversion sketch follows at the end of this section)
- get AI to help
* allow actions for wrong person:
-> someone else? OR override no match for this person ever for this image?
* groups of persons (non-exclusive, recursive), so:
- fam (ddp, mandy, cam, mich)
- mandy_fam (mandy, kirsty, tony, anne, tenille)
- uncle_peters_fam (peter, joan, jarrod, aaron, alana)
- cousin_aarons_fam (aaron, karly, kai, etc)
- storviks (mandy_fam, uncle_peters_fam, cousin_aarons_fam, etc)
* birthdates in PA:
- M was interested in birthdate appearing in hass, so need an API /
sensor in hass too - maybe could store (optional) birthdate and
heads up (of X days, e.g. so can get present) or just come up on the day so you can remember to text them
* search logic (AND vs OR)
* read this: https://flask.palletsprojects.com/en/2.2.x/testing/#faking-resources-and-context
* could get better AI optim, by keeping track of just new files since scan (even if we did this from the DB),
then we could just feed those eid's explicitly into a 'run_ai_on_new_files' :) -- maybe particularly
if count('new files') < say 1000 do eids, otherwise do path AND no new refimgs
* ignore face should ignore ALL matching faces (re: Declan)
* should allow context menu from View thumbs (particularly useful on search) to show other files around this one by date (maybe that folder or something?)
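For the sqlalchemy 2 items above (and the legacy query.get() -> session.get() commit), the usual conversions look roughly like this; Entry is just an example model and the exact call sites will differ:
```python
# Sketch only: typical SQLAlchemy 1.x -> 2.0 conversions, using Entry as an example.
from sqlalchemy import text
from main import db
from files import Entry

eid = 123  # example id

# legacy:  entry = Entry.query.get(eid)
entry = db.session.get(Entry, eid)

# raw SQL strings now have to be wrapped in text()
rows = db.session.execute(
    text("SELECT id FROM entry WHERE type = :t"), {"t": "file"}
)
```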
@@ -26,10 +59,6 @@
- rename (does this work already somehow? see issue below)
- dont allow me to stupidly move a folder to itself
* browser back/forward buttons dont work -- use POST -> redirect to GET
- need some sort of clean-up of pa_user_state -- I spose its triggered by browser session, so maybe just after a week is lazy/good enough
-- pa_user_state has last_used as a timestamp so can be used to delete old entries
* back button will fail if we do these POSTs:
job.py:@app.route("/jobs", methods=["GET", "POST"])
job.py:@app.route("/job/<id>", methods=["GET","POST"])
@@ -37,10 +66,8 @@
* if on jobs page and jobs increase, then 'rebuild' the content of the page to show new jobs, and potentially do that every 5 seconds...
- THINK: could also 'refresh' /job/id via Ajax not a reload, to avoid the POST issue above needing to remember job prefs somewhere?
files.py:@app.route("/fix_dups", methods=["POST"])
???
* allow user to choose default log level to show
* files.py:@app.route("/fix_dups", methods=["POST"])
- ???
* GUI overhaul?
* on a phone, the files.html page header is a mess - the "Oldest.." line is too large to fit on 1 line (make it a hamburger?)
@@ -59,9 +86,6 @@
* fix up logging in general
ProcessFileForJob --> really need to better handle log levels and counting
* metadata at folder level with file level to add more richness
- store in DB? or store in hidden file (or both)... IF it is outside the DB, then I can 'rebuild' the DB at anytime from scratch
* video player cannot handle non mp4 formats... do I care? (could just offer a download link and hope the client deals with it)
--> OR? https://jsmpeg.com/
--> OR? convert all videos to mp4/webm
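If the "convert all videos to mp4/webm" option above is taken, a job could just shell out to the ffmpeg already installed in the container - a rough sketch only, codec choices and output handling are illustrative:
```python
# Sketch only: transcode a non-mp4 video to H.264/AAC so the viewer can play it.
import subprocess
from pathlib import Path

def ConvertToMP4(src: str) -> str:
    dst = str(Path(src).with_suffix(".mp4"))
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-c:a", "aac", dst],
        check=True,      # raise if ffmpeg fails, so the job is marked failed
    )
    return dst
```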
@@ -78,7 +102,6 @@
* revisit SymlinkName() and make it simpler (see comment in shared.py)
*** Need to use thread-safe sessions per Thread, half-assed version did not work
Admin
-> do I want to have admin roles/users?
-> purge deleted files (and associated DB data) needs a dbox or privs
@@ -91,6 +114,8 @@
* viewer needs to allow toggle to scan_model (and prob. right-click on file... AI (with CNN) AI (with hog)
- make the form-select AI_Model actually do the change (but need more mem on mara really -- even mem is not enough
need graphic support --> need to allow pa_job_manager run on borric with acceleration)
- test this with new CPU in mara
- test this on borric for comparison
For AI / rescan:
way to override per file:
@@ -209,3 +234,182 @@ def AddRefimgToPerson(person_id: int, ref_images: List[str]) -> bool:
>>> AddRefimgToPerson(123, ['http://example.com/img1.jpg', '/path/to/img2.png'])
True
"""
checkout deepface:
User: I have this code I wrote using pythons face_recognition library, but I would like to alter it to work with deepface. My code is: im = face_recognition.load_image_file(e.FullPathOnFS())
face_locations = face_recognition.face_locations(im, model=model.name )
unknown_encodings = face_recognition.face_encodings(im, known_face_locations=face_locations)
Assistant: To alter your existing code to work with **DeepFace**, the changes involve replacing the methods of the `face_recognition` library with similar functionalities provided by DeepFace.
In **DeepFace**, the equivalent functionalities include:
1. **Detecting faces**: This is automatically handled by the `analyze` method in DeepFace. You don't need to manually specify the face locations.
2. **Encoding faces**: DeepFace internally computes embeddings for faces through its supported models (like VGG-Face, Facenet, OpenFace, DeepID, etc.).
Here's how you can modify your code to use DeepFace instead of `face_recognition`:
### Original Code (using `face_recognition`):
```python
im = face_recognition.load_image_file(e.FullPathOnFS())
face_locations = face_recognition.face_locations(im, model=model.name)
unknown_encodings = face_recognition.face_encodings(im, known_face_locations=face_locations)
```
### Modified Code (using `DeepFace`):
```python
from deepface import DeepFace
import cv2
# Load the image
im = cv2.imread(e.FullPathOnFS())
# Analyze the image to get face embeddings (encodings) and other details
results = DeepFace.represent(img_path=e.FullPathOnFS(), model_name=model.name)
# `results` will contain a list of dictionaries where each dictionary contains the embedding (encoding) of a detected face.
# Example of accessing the embeddings (face encodings)
unknown_encodings = [res['embedding'] for res in results]
```
### Explanation:
- `DeepFace.represent()` takes the path to the image (`e.FullPathOnFS()`) and the name of the model (`model.name`). It processes the image and extracts the embeddings (encodings) for each face detected.
- **Note**: In DeepFace, the model name can be one of several options: `VGG-Face`, `Facenet`, `OpenFace`, `DeepFace`, `DeepID`, etc. Make sure that `model.name` corresponds to one of these supported models.
- The result is a list of dictionaries. Each dictionary contains a detected face's encoding (embedding), which you can access using the `'embedding'` key.
DeepFace abstracts away a lot of the manual face detection work that you were handling with `face_recognition`. Let me know if you'd like to handle anything else, like visualizing the detected faces or working with other DeepFace models.
User: I want to use Deepface to take a set of reference images and match against all the images I have to find the best matches, how would you go about that?
Assistant: To use **DeepFace** to match a set of reference images against other images to find the best matches, you can follow these general steps:
1. **Compute embeddings (encodings) for the reference images**.
2. **Compute embeddings for the target images** (the ones you're searching through).
3. **Compare the embeddings** of the reference images with those of the target images to find the best matches based on a distance metric.
4. **Rank the matches** by similarity and return the best ones.
Here's a step-by-step guide on how you can achieve this with **DeepFace**:
### Step 1: Compute embeddings for reference images
You will load the reference images, compute their embeddings using DeepFace, and store them.
### Step 2: Compute embeddings for all target images
Do the same for the target images you want to match against.
### Step 3: Compare embeddings
You will compare the embeddings from the reference images to the embeddings of each target image. The smaller the distance between the embeddings, the more similar the faces are.
### Step 4: Return the best matches
### Example Code
```python
from deepface import DeepFace
import os
import cv2
import numpy as np

# Function to compute embeddings for a given image path
def compute_embeddings(image_path, model_name='VGG-Face'):
    try:
        # This returns a list of embeddings for each face found in the image
        embeddings = DeepFace.represent(img_path=image_path, model_name=model_name)
        return [res['embedding'] for res in embeddings]
    except Exception as e:
        print(f"Error processing image {image_path}: {e}")
        return []

# Function to find the best matches between reference and target images
def find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine'):
    # Store embeddings for reference images
    reference_embeddings = {}
    for ref_img in reference_images:
        ref_embeddings = compute_embeddings(ref_img, model_name=model_name)
        if ref_embeddings:
            reference_embeddings[ref_img] = ref_embeddings

    # Store best matches
    best_matches = []

    # Loop through each target image
    for target_img in target_images:
        target_embeddings = compute_embeddings(target_img, model_name=model_name)
        for target_emb in target_embeddings:
            for ref_img, ref_emb_list in reference_embeddings.items():
                # Compare each reference embedding with the target embedding
                for ref_emb in ref_emb_list:
                    # Compute the distance between embeddings using DeepFace's distance functions
                    distance = DeepFace.find_distance(ref_emb, target_emb, distance_metric=metric)
                    # Store match details (target image, reference image, and distance)
                    best_matches.append({
                        'target_image': target_img,
                        'reference_image': ref_img,
                        'distance': distance
                    })

    # Sort matches by the distance (smallest distances indicate best matches)
    best_matches = sorted(best_matches, key=lambda x: x['distance'])
    return best_matches

# Example usage
reference_images = [
    "path_to_reference_image1.jpg",
    "path_to_reference_image2.jpg"
]
target_images = [
    "path_to_target_image1.jpg",
    "path_to_target_image2.jpg",
    # Add more target images here...
]

# Find the best matches
matches = find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine')

# Display the best match
for match in matches[:5]:  # Display top 5 matches
    print(f"Reference Image: {match['reference_image']}, Target Image: {match['target_image']}, Distance: {match['distance']}")
```
### Explanation:
1. **`compute_embeddings(image_path, model_name)`**: This function uses DeepFace's `represent` function to compute face embeddings for each face found in the image.
2. **`find_best_matches()`**:
- It first computes and stores embeddings for all reference images in the `reference_embeddings` dictionary.
- For each target image, it computes the embeddings, then compares those embeddings with all the reference embeddings using the chosen distance metric (default is `'cosine'`).
- The function then stores the matches and sorts them by distance (the smaller the distance, the closer the match).
3. **Distance metrics**: You can choose between several distance metrics like `'cosine'`, `'euclidean'`, or `'euclidean_l2'`, depending on the model and your preference.
4. **Matching logic**: It finds the best matches by comparing each face embedding in the target images with each face embedding in the reference images and storing the distances.
### DeepFace Distance Metrics:
- **Cosine**: Measures the cosine of the angle between two vectors (common for face embeddings).
- **Euclidean**: Measures the straight-line distance between two vectors.
- **Euclidean L2**: A variant of Euclidean where embeddings are normalized before distance calculation.
### Example of top match output:
```bash
Reference Image: path_to_reference_image1.jpg, Target Image: path_to_target_image1.jpg, Distance: 0.256
Reference Image: path_to_reference_image2.jpg, Target Image: path_to_target_image2.jpg, Distance: 0.310
...
```
This approach will give you the best matching images based on facial similarity. You can adjust the model used (`VGG-Face`, `Facenet`, etc.) and the distance metric to fine-tune results. Let me know if you'd like to visualize the results or handle larger datasets more efficiently!

face.py

@@ -28,9 +28,11 @@ class Face(PA,db.Model):
face_left = db.Column( db.Integer )
w = db.Column( db.Integer )
h = db.Column( db.Integer )
refimg_lnk = db.relationship("FaceRefimgLink", uselist=False, viewonly=True)
facefile_lnk = db.relationship("FaceFileLink", uselist=False, viewonly=True)
refimg_lnk = db.relationship("FaceRefimgLink", uselist=False, viewonly=True )
facefile_lnk = db.relationship("FaceFileLink", uselist=False, viewonly=True )
refimg =db.relationship("Refimg", secondary="face_refimg_link", uselist=False)
fnmo = db.relationship("FaceNoMatchOverride", back_populates="face")
ffmo = db.relationship("FaceForceMatchOverride", back_populates="face")
################################################################################
@@ -62,13 +64,13 @@ class FaceRefimgLink(PA, db.Model):
Attributes:
face_id (int): face id of row in Face table / foreign key - part primary key
refimg_id (int): face id of row in Face table / foreign key - part primary key
face_distance (int): distance value (how similar matched Face was)
face_distance (float): distance value (how similar matched Face was)
"""
__tablename__ = "face_refimg_link"
face_id = db.Column(db.Integer, db.ForeignKey("face.id"), primary_key=True )
refimg_id = db.Column(db.Integer, db.ForeignKey("refimg.id"), primary_key=True )
face_distance = db.Column(db.Integer)
face_distance = db.Column(db.Float)
################################################################################
@@ -104,6 +106,7 @@ class FaceNoMatchOverride(PA, db.Model):
face_id = db.Column(db.Integer, db.ForeignKey("face.id"), primary_key=True )
type_id = db.Column(db.Integer, db.ForeignKey("face_override_type.id"))
type = db.relationship("FaceOverrideType")
face = db.relationship("Face", back_populates="fnmo")
################################################################################
@@ -123,3 +126,4 @@ class FaceForceMatchOverride(PA, db.Model):
face_id = db.Column(db.Integer, db.ForeignKey("face.id"), primary_key=True )
person_id = db.Column(db.Integer, db.ForeignKey("person.id"), primary_key=True )
person = db.relationship("Person")
face = db.relationship("Face", back_populates="ffmo")

files.py

@@ -1,11 +1,13 @@
from wtforms import SubmitField, StringField, HiddenField, validators, Form
from flask_wtf import FlaskForm
from flask import request, render_template, redirect, send_from_directory, url_for, jsonify, make_response
from marshmallow import Schema, fields
from main import db, app, ma
from sqlalchemy import Sequence, text
from sqlalchemy import Sequence, text, select, union, or_
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import joinedload
import os
import glob
import json
from PIL import Image
from pymediainfo import MediaInfo
import hashlib
@@ -19,16 +21,17 @@ from datetime import datetime, timedelta
import pytz
import html
from flask_login import login_required, current_user
from states import States, PA_UserState
from types import SimpleNamespace
################################################################################
# Local Class imports
################################################################################
from states import States, PA_UserState
from query import Query
from job import Job, JobExtra, Joblog, NewJob, SetFELog
from path import PathType, Path, MovePathDetails
from path import PathType, Path
from person import Refimg, Person, PersonRefimgLink
from settings import Settings, SettingsIPath, SettingsSPath, SettingsRBPath
from shared import SymlinkName
from shared import SymlinkName, ICON, PA
from dups import Duplicates
from face import Face, FaceFileLink, FaceRefimgLink, FaceOverrideType, FaceNoMatchOverride, FaceForceMatchOverride
@@ -38,41 +41,32 @@ from face import Face, FaceFileLink, FaceRefimgLink, FaceOverrideType, FaceNoMat
# Class describing PathDirLink and in the DB (via sqlalchemy)
# connects the entry (dir) with a path
################################################################################
class PathDirLink(db.Model):
class PathDirLink(PA,db.Model):
__tablename__ = "path_dir_link"
path_id = db.Column(db.Integer, db.ForeignKey("path.id"), primary_key=True )
dir_eid = db.Column(db.Integer, db.ForeignKey("dir.eid"), primary_key=True )
def __repr__(self):
return f"<path_id: {self.path_id}, dir_eid: {self.dir_eid}>"
################################################################################
# Class describing EntryDirLInk and in the DB (via sqlalchemy)
# connects (many) entry contained in a directory (which is also an entry)
################################################################################
class EntryDirLink(db.Model):
class EntryDirLink(PA,db.Model):
__tablename__ = "entry_dir_link"
entry_id = db.Column(db.Integer, db.ForeignKey("entry.id"), primary_key=True )
dir_eid = db.Column(db.Integer, db.ForeignKey("dir.eid"), primary_key=True )
def __repr__(self):
return f"<entry_id: {self.entry_id}, dir_eid: {self.dir_eid}>"
################################################################################
# Class describing Dir and in the DB (via sqlalchemy)
# rel_path: rest of dir after path, e.g. if path = /..../storage, then
# rel_path could be 2021/20210101-new-years-day-pics
# in_path: only in this structure, not DB, quick ref to the path this dir is in
################################################################################
class Dir(db.Model):
class Dir(PA,db.Model):
__tablename__ = "dir"
eid = db.Column(db.Integer, db.ForeignKey("entry.id"), primary_key=True )
rel_path = db.Column(db.String, unique=True )
in_path = db.relationship("Path", secondary="path_dir_link", uselist=False)
def __repr__(self):
return f"<eid: {self.eid}, rel_path: {self.rel_path}, in_path: {self.in_path}>"
################################################################################
# Class describing Entry and in the DB (via sqlalchemy)
# an entry is the common bits between files and dirs
@@ -82,7 +76,7 @@ class Dir(db.Model):
# in_dir - is the Dir that this entry is located in (convenience for class only)
# FullPathOnFS(): method to get path on the FS for this Entry
################################################################################
class Entry(db.Model):
class Entry(PA,db.Model):
__tablename__ = "entry"
id = db.Column(db.Integer, db.Sequence('file_id_seq'), primary_key=True )
name = db.Column(db.String, unique=False, nullable=False )
@@ -103,9 +97,6 @@ class Entry(db.Model):
s=self.dir_details.in_path.path_prefix
return s
def __repr__(self):
return f"<id: {self.id}, name: {self.name}, type={self.type}, dir_details={self.dir_details}, file_details={self.file_details}, in_dir={self.in_dir}"
################################################################################
# Class describing File and in the DB (via sqlalchemy)
# all files are entries, this is the extra bits only for a file, of note:
@@ -114,321 +105,429 @@ class Entry(db.Model):
# info can be from exif, or file system, or file name (rarely)
# faces: convenience field to show connected face(s) for this file
################################################################################
class File(db.Model):
class File(PA,db.Model):
__tablename__ = "file"
eid = db.Column(db.Integer, db.ForeignKey("entry.id"), primary_key=True )
size_mb = db.Column(db.Integer, unique=False, nullable=False)
thumbnail = db.Column(db.String, unique=False, nullable=True)
hash = db.Column(db.Integer)
hash = db.Column(db.String)
year = db.Column(db.Integer)
month = db.Column(db.Integer)
day = db.Column(db.Integer)
woy = db.Column(db.Integer)
faces = db.relationship ("Face", secondary="face_file_link" )
def __repr__(self):
return f"<eid: {self.eid}, size_mb={self.size_mb}, hash={self.hash}, year={self.year}, month={self.month}, day={self.day}, woy={self.woy}, faces={self.faces}>"
################################################################################
# Class describing FileType and in the DB (via sqlalchemy)
# pre-defined list of file types (image, dir, etc.)
################################################################################
class FileType(db.Model):
class FileType(PA,db.Model):
__tablename__ = "file_type"
id = db.Column(db.Integer, db.Sequence('file_type_id_seq'), primary_key=True )
name = db.Column(db.String, unique=True, nullable=False )
def __repr__(self):
return f"<id: {self.id}, name={self.name}>"
class AmendmentType(PA,db.Model):
__tablename__ = "amendment_type"
id = db.Column(db.Integer, db.Sequence('file_type_id_seq'), primary_key=True )
which = db.Column(db.String, nullable=False )
what = db.Column(db.String, nullable=False )
colour = db.Column(db.String, nullable=False )
class EntryAmendment(PA,db.Model):
__tablename__ = "entry_amendment"
eid = db.Column(db.Integer, db.ForeignKey("entry.id"), primary_key=True )
amend_type = db.Column(db.Integer, db.ForeignKey("amendment_type.id"))
type = db.relationship("AmendmentType", backref="entry_amendment")
################################################################################
# util function to just update the current/first/last positions needed for
# viewing / using pa_user_state DB table
# order_map below is how we order all queries based on the value of 'noo' - used by
# unpacking with *order_map.get(OPT.noo)
################################################################################
def UpdatePref( pref, OPT ):
last_used=datetime.now(pytz.utc)
if OPT.current>0:
pref.current=OPT.current
if OPT.first_eid>0:
pref.first_eid=OPT.first_eid
if OPT.last_eid>0:
pref.last_eid=OPT.last_eid
if OPT.num_entries>0:
pref.num_entries=OPT.num_entries
pref.last_used=last_used
db.session.add(pref)
db.session.commit()
order_map = {
"Newest": (File.year.desc(),File.month.desc(),File.day.desc(),Entry.name.desc()),
"Oldest": (File.year,File.month,File.day,Entry.name),
# careful, these need to be tuples, so with a , at the end
"Z to A": (Entry.name.desc(),),
"A to Z": (Entry.name.asc(),),
}
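The order_map above maps the 'noo' choice to a tuple of SQLAlchemy ordering criteria that later gets unpacked into order_by(). A minimal, self-contained sketch of the same pattern, using a throwaway SQLite model (Pic) rather than the app's real Entry/File classes:
from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base
Base = declarative_base()
class Pic(Base):
    __tablename__ = "pic"
    id = Column(Integer, primary_key=True)
    name = Column(String)
order_map = {
    "A to Z": (Pic.name.asc(),),   # trailing comma keeps these as tuples
    "Z to A": (Pic.name.desc(),),
}
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
with Session(engine) as s:
    s.add_all([Pic(name="beach.jpg"), Pic(name="alps.jpg")])
    s.commit()
    stmt = select(Pic.name).order_by(*order_map["Z to A"])  # unpack the tuple
    print(s.execute(stmt).scalars().all())  # ['beach.jpg', 'alps.jpg']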
################################################################################
# GetEntriesInFlatView: func. to retrieve DB entries appropriate for flat view
################################################################################
def GetEntriesInFlatView( OPT, prefix ):
entries=[]
num_entries=0
join = "Entry.query.join(File).join(EntryDirLink).join(Dir).join(PathDirLink).join(Path).filter(Path.path_prefix==prefix)"
entries = eval( f"{join}.{OPT.order}.offset({OPT.offset}).limit({OPT.how_many}).all()" )
if OPT.first_eid == 0 and OPT.offset == 0 and len(entries):
OPT.first_eid = entries[0].id
if OPT.last_eid==0:
num_entries = eval( f"{join}.count()" )
last_entry = eval( f"{join}.{OPT.last_order}.limit(1).first()" )
if last_entry:
OPT.last_eid = last_entry.id
return entries, num_entries
################################################################################
# GetEntriesInFolderView: func. to retrieve DB entries appropriate for folder view
# read inline comments to deal with variations / ordering...
# Schemas for Path, FileType, File, Dir - used in EntrySchema
################################################################################
def GetEntriesInFolderView( OPT, prefix ):
entries=[]
num_entries=0
# okay the root cwd is fake, so treat it specially - its Dir can be found by path with dir.rel_path=''
if os.path.dirname(OPT.cwd) == 'static':
dir=Entry.query.join(Dir).join(PathDirLink).join(Path).filter(Dir.rel_path=='').filter(Path.path_prefix==prefix).order_by(Entry.name).first()
# this can occur if the path in settings does not exist, as it won't be in the DB
if not dir:
return entries, num_entries
# although this is 1 entry, needs to come back via all() to be iterable
entries+= Entry.query.filter(Entry.id==dir.id).all()
class PathType(ma.SQLAlchemyAutoSchema):
class Meta: model = PathType
load_instance = True
class PathSchema(ma.SQLAlchemyAutoSchema):
class Meta: model = Path
load_instance = True
type = ma.Nested(PathType)
root_dir = fields.Method("get_root_dir")
icon_url = fields.Method("get_icon_url")
def get_icon_url(self, obj):
return url_for("internal", filename="icons.svg") + "#" + ICON[obj.type.name]
def get_root_dir(self, obj):
parts = obj.path_prefix.split('/')
return ''.join(parts[2:])
class FileTypeSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = FileType
load_instance = True
class DirSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = Dir
load_instance = True
eid = ma.auto_field() # Explicitly include eid
in_path = ma.Nested(PathSchema)
class FaceFileLinkSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = FaceFileLink
load_instance = True
model_used = ma.auto_field()
class PersonSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model=Person
load_instance = True
class RefimgSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = Refimg
exclude = ('face',)
load_instance = True
person = ma.Nested(PersonSchema)
class FaceRefimgLinkSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = FaceRefimgLink
load_instance = True
class FaceOverrideTypeSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = FaceOverrideType
load_instance = True
class FaceNoMatchOverrideSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = FaceNoMatchOverride
load_instance = True
type = ma.Nested(FaceOverrideTypeSchema)
class FaceForceMatchOverrideSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = FaceForceMatchOverride
load_instance = True
person = ma.Nested(PersonSchema)
class FaceSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model=Face
exclude = ('face',)
load_instance = True
refimg = ma.Nested(RefimgSchema,allow_none=True)
# faces have to come with a file connection
facefile_lnk = ma.Nested(FaceFileLinkSchema)
refimg_lnk = ma.Nested(FaceRefimgLinkSchema,allow_none=True)
fnmo = ma.Nested( FaceNoMatchOverrideSchema, allow_none=True, many=True )
ffmo = ma.Nested( FaceForceMatchOverrideSchema, allow_none=True, many=True )
class FileSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = File
load_instance = True
faces = ma.Nested(FaceSchema,many=True,allow_none=True)
class AmendmentTypeSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = AmendmentType
load_instance = True
class EntryAmendmentSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = EntryAmendment
load_instance = True
eid = ma.auto_field()
type = ma.Nested(AmendmentTypeSchema)
################################################################################
# Schema for Entry so we can json for data to the client
################################################################################
class EntrySchema(ma.SQLAlchemyAutoSchema):
# gives id, name, type_id
class Meta:
model = Entry
load_instance = True
type = ma.Nested(FileTypeSchema)
file_details = ma.Nested(FileSchema,allow_none=True)
# noting dir_details needs in_path to work
dir_details = ma.Nested(DirSchema)
# noting in_dir needs in_path and in_path.type to work
in_dir = ma.Nested(DirSchema)
# allow us to use FullPathOnFS()
FullPathOnFS = fields.Method("get_full_path")
def get_full_path(self, obj):
return obj.FullPathOnFS()
# global - these will be used more than once below, so build them once for efficiency
entries_schema = EntrySchema(many=True)
FOT_Schema = FaceOverrideTypeSchema(many=True)
path_Schema = PathSchema(many=True)
person_Schema = PersonSchema(many=True)
################################################################################
# /get_entries_by_ids -> route where we supply list of entry ids (for next/prev
# page of data we want to show). Returns json of all matching entries
################################################################################
@app.route('/get_entries_by_ids', methods=['POST'])
@login_required
def process_ids():
data = request.get_json() # Parse JSON body
ids = data.get('ids', []) # Extract list of ids
# Query DB for matching entries
stmt = (
select(Entry)
.options(
joinedload(Entry.file_details).joinedload(File.faces).joinedload(Face.refimg).joinedload(Refimg.person),
joinedload(Entry.file_details).joinedload(File.faces).joinedload(Face.refimg_lnk),
joinedload(Entry.file_details).joinedload(File.faces).joinedload(Face.facefile_lnk),
joinedload(Entry.file_details).joinedload(File.faces).joinedload(Face.fnmo).joinedload(FaceNoMatchOverride.type),
joinedload(Entry.file_details).joinedload(File.faces).joinedload(Face.ffmo).joinedload(FaceForceMatchOverride.person),
)
.where(Entry.id.in_(ids))
)
# unique as the ORM query returns a Cartesian product for the joins. E.g. if a file has 3 faces, the result has 3 rows with the same entry and file data, but different face data
data=db.session.execute(stmt).unique().scalars().all()
# data is now in whatever order the DB returned it - re-ordering is supposedly faster in Python than in the DB. So, create a mapping from id to entry for quick lookup
entry_map = {entry.id: entry for entry in data}
# Sort the entries according to the order of ids
sorted_data = [entry_map[id_] for id_ in ids if id_ in entry_map]
# get any pending entry amendments
stmt = select(EntryAmendment).join(AmendmentType)
ea = db.session.execute(stmt).unique().scalars().all()
ea_schema = EntryAmendmentSchema(many=True)
ea_data=ea_schema.dump(ea)
print( ea_data )
return jsonify(entries=entries_schema.dump(sorted_data), amend=ea_data)
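The .unique() call above matters because joined eager loads against collections multiply rows. A minimal sketch of that behaviour with standalone models (not the app's real Entry/File/Face classes):
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base, joinedload, relationship
Base = declarative_base()
class Pic(Base):
    __tablename__ = "pic"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    faces = relationship("PicFace")
class PicFace(Base):
    __tablename__ = "pic_face"
    id = Column(Integer, primary_key=True)
    pic_id = Column(Integer, ForeignKey("pic.id"))
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
with Session(engine) as s:
    s.add(Pic(name="holiday.jpg", faces=[PicFace(), PicFace(), PicFace()]))
    s.commit()
    stmt = select(Pic).options(joinedload(Pic.faces)).where(Pic.name == "holiday.jpg")
    # the joined eager load emits one row per face for the same Pic;
    # SQLAlchemy requires .unique() before .scalars() to collapse them
    pics = s.execute(stmt).unique().scalars().all()
    print(len(pics), len(pics[0].faces))  # 1 3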
################################################################################
# /get_dir_eids:
# -> if back is false - returns list of eids inside this dir
# -> if back is true - returns list of eids inside the parent of this dir
################################################################################
@app.route("/get_dir_eids", methods=["POST"])
@login_required
def get_dir_entries():
data = request.get_json() # Parse JSON body
dir_id = data.get('dir_id', []) # Extract the dir eid
back = data.get('back', False) # Extract back boolean
noo = data.get('noo', "A to Z") # Extract noo ordering
# if we are going back, find the parent id and use that instead
if back:
# get parent of this dir, to go back
stmt=select(EntryDirLink.dir_eid).filter(EntryDirLink.entry_id==dir_id)
dir_id = db.session.execute(stmt).scalars().one_or_none()
if not dir_id:
# return valid as false - let the user know this is not an empty dir, it no longer exists
return jsonify( valid=False, entry_list=[] )
# Just double-check this is still in the DB, in case it got deleted since client made view
stmt=select(Entry.id).where(Entry.id==dir_id)
ent_id = db.session.execute(stmt).scalars().one_or_none()
if not ent_id:
# return valid as false - let the user know this is not an empty dir, it no longer exists
return jsonify( valid=False, entry_list=[] )
# get content of dir_id
stmt=select(Entry.id).join(EntryDirLink).filter(EntryDirLink.dir_eid==dir_id)
stmt=stmt.order_by(*order_map.get(noo) )
return jsonify( valid=True, entry_list=db.session.execute(stmt).scalars().all() )
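A hedged sketch of how a client script could call this route (host, port and authentication are assumptions here; the real route also sits behind @login_required):
import requests  # third-party HTTP client, used only for illustration
payload = {"dir_id": 42, "back": False, "noo": "A to Z"}
r = requests.post("http://localhost:5000/get_dir_eids", json=payload)
data = r.json()
if not data["valid"]:
    print("directory vanished since the view was built")
else:
    print("eids in this dir:", data["entry_list"])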
# get Face override details
def getFOT():
stmt = select(FaceOverrideType)
fot=db.session.execute(stmt).scalars().all()
return FOT_Schema.dump(fot)
# get import/storage path details for move dbox
def getMoveDetails():
stmt = select(Path).where( or_( Path.type.has(name="Import"), Path.type.has(name="Storage")))
mp=db.session.execute(stmt).scalars().all()
return path_Schema.dump(mp)
# get people data for the menu for AI matching (of person.tag)
def getPeople():
stmt = select(Person)
people=db.session.execute(stmt).scalars().all()
return person_Schema.dump(people)
################################################################################
# Get all relevant Entry.ids based on search_term passed in and OPT visuals
################################################################################
def GetSearchQueryData(OPT):
query_data={}
query_data['entry_list']=None
query_data['root_eid']=0
query_data['NMO'] = getFOT()
query_data['move_paths'] = getMoveDetails()
query_data['people'] = getPeople()
search_term = OPT.search_term
# turn * wildcard into sql wildcard of %
search_term = search_term.replace('*', '%')
if 'AI:' in OPT.search_term:
search_term = search_term.replace('AI:', '')
# AI searches are for specific ppl/joins in the DB AND we do them for ALL types of searches, define this once
ai_query = (
select(Entry.id)
.join(File).join(FaceFileLink).join(Face).join(FaceRefimgLink).join(Refimg).join(PersonRefimgLink).join(Person)
.where(Person.tag == search_term)
.order_by(*order_map.get(OPT.noo) )
)
if 'AI:' in OPT.search_term:
all_entries = db.session.execute(ai_query).scalars().all()
else:
rp = OPT.cwd.replace( prefix, '' )
# when in subdirs, replacing prefix will leave the first char as /, get rid of it
if len(rp) and rp[0] == '/':
rp=rp[1:]
dir=Entry.query.join(Dir).join(PathDirLink).join(Path).filter(Dir.rel_path==rp).filter(Path.path_prefix==prefix).order_by(Entry.name).first()
# this can occur if the path in settings does not exist, as it won't be in the DB
if not dir:
return entries, 0
# dirs can't be sorted by date really, so do the best we can for now
if OPT.noo == "Z to A" or OPT.noo == "Newest":
entries+= Entry.query.join(EntryDirLink).join(FileType).filter(EntryDirLink.dir_eid==dir.id).filter(FileType.name=='Directory').order_by(Entry.name.desc()).all()
# just do A to Z / Oldest by default or if no valid option
else:
entries+= Entry.query.join(EntryDirLink).join(FileType).filter(EntryDirLink.dir_eid==dir.id).filter(FileType.name=='Directory').order_by(Entry.name).all()
# match name of File
file_query = select(Entry.id).join(File).where(Entry.name.ilike(f'%{search_term}%')).order_by(*order_map.get(OPT.noo))
# match name of Dir
dir_query = select(Entry.id).join(File).join(EntryDirLink).join(Dir).where(Dir.rel_path.ilike(f'%{search_term}%')).order_by(*order_map.get(OPT.noo))
# add any files at the current CWD (based on dir_eid in DB)
join="Entry.query.join(File).join(EntryDirLink).filter(EntryDirLink.dir_eid==dir.id)"
file_entries= eval( f"{join}.{OPT.order}.offset(OPT.offset).limit(OPT.how_many).all()")
ai_entries = db.session.execute(ai_query).scalars().all()
file_entries = db.session.execute(file_query).scalars().all()
dir_entries = db.session.execute(dir_query).scalars().all()
# Combine ai, file & dir matches and dedup them (dict.fromkeys preserves first-seen order)
all_entries = list(dict.fromkeys(ai_entries + dir_entries + file_entries))
if OPT.offset == 0 and len(file_entries):
OPT.first_eid = file_entries[0].id
num_entries = eval( f"{join}.count()" )
last_entry = eval( f"{join}.{OPT.last_order}.limit(1).first()" )
if last_entry:
OPT.last_eid = last_entry.id
query_data['entry_list']=all_entries
return query_data
entries += file_entries
return entries, num_entries
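The dedup of ai/dir/file matches above relies on dict preserving insertion order (Python 3.7+), so the combined list keeps the first occurrence of each id; a tiny standalone example:
ai_entries = [7, 3]
dir_entries = [3, 9]
file_entries = [9, 1, 7]
all_entries = list(dict.fromkeys(ai_entries + dir_entries + file_entries))
print(all_entries)  # [7, 3, 9, 1]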
#################################################################################
# Get all relevant Entry.ids based on files_ip/files_sp/files_rbp and OPT visuals
#################################################################################
def GetQueryData( OPT ):
query_data={}
query_data['entry_list']=None
query_data['NMO'] = getFOT()
query_data['move_paths'] = getMoveDetails()
query_data['people'] = getPeople()
################################################################################
# GetEntriesInSearchView: func. to retrieve DB entries appropriate for Search view
# Default search is for any matching filename, contents of any matching dirname,
# and any AI / face match for that term. An explicit AI-only match uses the
# AI:<tag> syntax
################################################################################
def GetEntriesInSearchView( OPT ):
search_term=OPT.orig_search_term
if 'AI:' in OPT.orig_search_term:
search_term = search_term.replace('AI:','')
join=f"Entry.query.join(File).join(FaceFileLink).join(Face).join(FaceRefimgLink).join(Refimg).join(PersonRefimgLink).join(Person).filter(Person.tag.ilike('%{search_term}%'))"
if 'AI:' in OPT.orig_search_term:
all_entries = eval( f"{join}.{OPT.order}.offset(OPT.offset).limit(OPT.how_many).all()")
# always get the top of the (OPT.prefix) Path's eid and keep it for OPT.folders toggling/use
dir_stmt=(
select(Entry.id)
.join(Dir).join(PathDirLink).join(Path)
.where(Dir.rel_path == '').where(Path.path_prefix==OPT.prefix)
)
# this should return the 1 Dir (that we want to see the content of) - and with only 1, no need to worry about order
dir_arr=db.session.execute(dir_stmt).scalars().all()
if dir_arr:
dir_id=dir_arr[0]
else:
file_data=eval( f"Entry.query.join(File).filter(Entry.name.ilike('%{search_term}%')).{OPT.order}.offset({OPT.offset}).limit({OPT.how_many}).all()" )
dir_data =eval( f"Entry.query.join(File).join(EntryDirLink).join(Dir).filter(Dir.rel_path.ilike('%{search_term}%')).{OPT.order}.offset({OPT.offset}).limit({OPT.how_many}).all()" )
ai_data =eval( f"{join}.{OPT.order}.offset({OPT.offset}).limit({OPT.how_many}).all()")
dir_id=0
# used to know the parent/root (in folder view); in flat view it is simply ignored, which is safe
query_data['root_eid']=dir_id
# remove any duplicates from combined data
all_entries = []
for f in file_data:
all_entries.append(f)
for d in dir_data:
add_it=1
for f in file_data:
if d.name == f.name:
add_it=0
break
if add_it:
all_entries.append(d)
for a in ai_data:
add_it=1
for f in file_data:
if a.name == f.name:
add_it=0
break
if add_it:
all_entries.append(a)
# for all searches, first_eid is worked out when it is not yet set, offset is 0 and we have some entries
if OPT.first_eid == 0 and OPT.offset == 0 and len(all_entries):
OPT.first_eid = all_entries[0].id
if OPT.last_eid == 0:
by_fname= f"select e.id from entry e where e.name ilike '%%{search_term}%%'"
by_dirname=f"select e.id from entry e, entry_dir_link edl where edl.entry_id = e.id and edl.dir_eid in ( select d.eid from dir d where d.rel_path ilike '%%{search_term}%%' )"
by_ai =f"select e.id from entry e, face_file_link ffl, face_refimg_link frl, person_refimg_link prl, person p where e.id = ffl.file_eid and frl.face_id = ffl.face_id and frl.refimg_id = prl.refimg_id and prl.person_id = p.id and p.tag ilike '%%{search_term}%%'"
if 'AI:' in OPT.orig_search_term:
sel_no_order=f"select e.*, f.* from entry e, file f where e.id=f.eid and e.id in ( {by_ai} ) "
else:
sel_no_order=f"select e.*, f.* from entry e, file f where e.id=f.eid and e.id in ( {by_fname} union {by_dirname} union {by_ai} ) "
#num_entries
num_e_sql = f"select count(1) from ( {by_fname} union {by_dirname} union {by_ai} ) as foo"
with db.engine.connect() as conn:
OPT.num_entries = conn.execute( text( num_e_sql ) ).first().count
last_entry_sql= f"{sel_no_order} order by {OPT.last_order_raw} limit 1"
with db.engine.connect() as conn:
OPT.last_eid = conn.execute( text( last_entry_sql ) ).first().id
# store first/last eid into prefs
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==OPT.path_type,PA_UserState.orig_ptype==OPT.orig_ptype,PA_UserState.orig_search_term==OPT.orig_search_term).first()
UpdatePref( pref, OPT )
return all_entries
################################################################################
# set up "order strings" to use in ORM and raw queries as needed for
# GetEntries*Search*, GetEntries*Flat*, GetEntries*Fold*
################################################################################
def SetOrderStrings( OPT ):
if OPT.noo == "Newest":
OPT.order="order_by(File.year.desc(),File.month.desc(),File.day.desc(),Entry.name.desc())"
OPT.last_order="order_by(File.year,File.month,File.day,Entry.name)"
OPT.last_order_raw=f"f.year, f.month, f.day, e.name"
elif OPT.noo == "Oldest":
OPT.order="order_by(File.year,File.month,File.day,Entry.name)"
OPT.last_order="order_by(File.year.desc(),File.month.desc(),File.day.desc(),Entry.name.desc())"
OPT.last_order_raw=f"f.year desc, f.month desc, f.day desc, e.name desc"
elif OPT.noo == "Z to A":
OPT.order="order_by(Entry.name.desc())"
OPT.last_order="order_by(Entry.name)"
OPT.last_order_raw=f"e.name"
else:
# A to Z
OPT.order="order_by(Entry.name)"
OPT.last_order="order_by(Entry.name.desc())"
OPT.last_order_raw=f"e.name desc"
return
################################################################################
# /GetEntries -> helper function that Gets Entries for required files to show
# for several routes (files_ip, files_sp, files_rbp, search, view_list)
################################################################################
def GetEntries( OPT ):
entries=[]
SetOrderStrings( OPT )
if OPT.path_type == 'Search' or (OPT.path_type == 'View' and OPT.orig_ptype=='Search'):
return GetEntriesInSearchView( OPT )
# if we are a view, then it will be of something else, e.g. a list of
# import, storage, or bin images, reset OPT.path_type so that the paths array below works
if 'View' in OPT.path_type:
eid = OPT.url[6:]
OPT.path_type= OPT.orig_ptype
paths = []
if OPT.path_type == 'Storage':
path = SettingsSPath()
elif OPT.path_type == 'Import':
path = SettingsIPath()
elif OPT.path_type == 'Bin':
path = SettingsRBPath()
num_entries=0
path_cnt=1
# if we have not set last_eid yet, then we need to 'reset' it during the
# path loop below (if we have more than one dir in (say) Import path)
if OPT.last_eid == 0 or OPT.folders:
update_last_eid = True
else:
update_last_eid = False
prefix = SymlinkName(OPT.path_type,path,path+'/')
if OPT.folders:
tmp_ents, tmp_num_ents = GetEntriesInFolderView( OPT, prefix )
# start folder view with only the root folder
stmt=select(Entry.id).join(EntryDirLink).filter(EntryDirLink.dir_eid==dir_id)
else:
tmp_ents, tmp_num_ents = GetEntriesInFlatView( OPT, prefix )
entries += tmp_ents
num_entries += tmp_num_ents
# get every File that is in the OPT.prefix Path
stmt=(
select(Entry.id)
.join(File).join(EntryDirLink).join(Dir).join(PathDirLink).join(Path)
.where(Path.path_prefix == OPT.prefix)
)
if update_last_eid:
# find pref... via path_type if we are here
OPT.num_entries=num_entries
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==OPT.path_type).first()
UpdatePref( pref, OPT )
return entries
stmt=stmt.order_by(*order_map.get(OPT.noo) )
query_data['entry_list']=db.session.execute(stmt).scalars().all()
return query_data
################################################################################
# /change_file_opts -> allow sort order, how_many per page, etc. to change, and
# then send back the new query_data to update entryList
################################################################################
@app.route("/change_file_opts", methods=["POST"])
@login_required
def change_file_opts():
# reset options based on form post, then redirect back to orig page (with a GET to allow back button to work)
OPT=States( request )
return redirect( request.referrer )
data = request.get_json() # Parse JSON body
# allow dot-notation for OPT
OPT = SimpleNamespace(**data)
if hasattr(OPT, 'folders') and OPT.folders == 'True':
OPT.folders=True
else:
OPT.folders=False
# so create a new entryList, and handle that on the client
if 'search' in request.referrer:
query_data = GetSearchQueryData( OPT )
else:
query_data = GetQueryData( OPT )
return make_response( jsonify( query_data=query_data ) )
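The SimpleNamespace wrapping above just lets the rest of the code keep using OPT.attr-style access on the posted JSON; a small standalone illustration (field names are examples only):
from types import SimpleNamespace
data = {"noo": "Newest", "how_many": 40, "folders": "True"}
OPT = SimpleNamespace(**data)            # OPT.noo instead of data["noo"]
OPT.folders = OPT.folders == "True"      # posted values arrive as strings
print(OPT.noo, OPT.how_many, OPT.folders)  # Newest 40 True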
################################################################################
# /file_list -> show detailed file list of files from import_path(s)
################################################################################
@app.route("/file_list_ip", methods=["GET", "POST"])
@app.route("/file_list_ip", methods=["GET"])
@login_required
def file_list_ip():
OPT=States( request )
# now we have reset the offset, etc. into the prefs, we can use a GET and this will be back/forward browser button safe
if request.method=='POST':
redirect("/file_list_ip")
entries=GetEntries( OPT )
return render_template("file_list.html", page_title='View File Details (Import Path)', entry_data=entries, OPT=OPT )
query_data = GetQueryData( OPT )
js_vers = getVersions()
return render_template("file_list.html", page_title='View File Details (Import Path)', query_data=query_data, OPT=OPT, js_vers=js_vers )
################################################################################
# /files -> show thumbnail view of files from import_path(s)
################################################################################
@app.route("/files_ip", methods=["GET", "POST"])
@app.route("/files_ip", methods=["GET"])
@login_required
def files_ip():
OPT=States( request )
# now we have reset the offset, etc. into the prefs, we can use a GET and this will be back/forward browser button safe
if request.method=='POST':
redirect("/files_ip")
entries=GetEntries( OPT )
people = Person.query.all()
move_paths = MovePathDetails()
return render_template("files.html", page_title=f"View Files ({OPT.path_type} Path)", entry_data=entries, OPT=OPT, people=people, move_paths=move_paths )
query_data = GetQueryData( OPT )
js_vers = getVersions()
return render_template("files.html", page_title=f"View Files ({OPT.path_type} Path)", OPT=OPT, query_data=query_data, js_vers=js_vers )
################################################################################
# /files -> show thumbnail view of files from storage_path
################################################################################
@app.route("/files_sp", methods=["GET", "POST"])
@app.route("/files_sp", methods=["GET"])
@login_required
def files_sp():
OPT=States( request )
# now we have reset the offset, etc. into the prefs, we can use a GET and this will be back/forward browser button safe
if request.method=='POST':
redirect("/files_sp")
entries=GetEntries( OPT )
people = Person.query.all()
move_paths = MovePathDetails()
return render_template("files.html", page_title=f"View Files ({OPT.path_type} Path)", entry_data=entries, OPT=OPT, people=people, move_paths=move_paths )
query_data = GetQueryData( OPT )
js_vers = getVersions()
return render_template("files.html", page_title=f"View Files ({OPT.path_type} Path)", OPT=OPT, query_data=query_data, js_vers=js_vers )
################################################################################
# /files -> show thumbnail view of files from recycle_bin_path
################################################################################
@app.route("/files_rbp", methods=["GET", "POST"])
@app.route("/files_rbp", methods=["GET"])
@login_required
def files_rbp():
OPT=States( request )
# now we have reset the offset, etc. into the prefs, we can use a GET and this will be back/forward browser button safe
if request.method=='POST':
redirect("/files_rbp")
entries=GetEntries( OPT )
people = Person.query.all()
move_paths = MovePathDetails()
return render_template("files.html", page_title=f"View Files ({OPT.path_type} Path)", entry_data=entries, OPT=OPT, move_paths=move_paths )
query_data = GetQueryData( OPT )
js_vers = getVersions()
return render_template("files.html", page_title=f"View Files ({OPT.path_type} Path)", OPT=OPT, query_data=query_data, js_vers=js_vers )
################################################################################
# search -> GET version -> has search_term in the URL and is therefore able to
@@ -439,15 +538,10 @@ def files_rbp():
@login_required
def search(search_term):
OPT=States( request )
# if we posted to get here, it's a change in State, so save it to pa_user_state and go back to the GET version of the URL
if request.method=="POST":
redirect("/search/"+search_term)
OPT.search_term = search_term
# always show flat results for search to start with
OPT.folders=False
entries=GetEntries( OPT )
move_paths = MovePathDetails()
return render_template("files.html", page_title='View Files', search_term=search_term, entry_data=entries, OPT=OPT, move_paths=move_paths )
query_data=GetSearchQueryData( OPT )
js_vers = getVersions()
return render_template("files.html", page_title='View Files', search_term=search_term, query_data=query_data, OPT=OPT, js_vers=js_vers )
################################################################################
# /files/scan_ip -> allows us to force a check for new files
@@ -576,130 +670,29 @@ def move_files():
return make_response( jsonify( job_id=job.id ) )
@login_required
@app.route("/view_list", methods=["POST"])
def view_list():
OPT=States( request )
# Get next/prev set of data - e.g. if next set, then it will use orig_url
# to go forward how_many from offset and then use viewer.html to show that
# first obj of the new list of entries
entries=GetEntries( OPT )
# this occurs when we went from the last image on a page (with how_many on
# it) and it just happened to also be the last in the DB...
if not entries:
SetFELog( message="DDP: DONT think this can happen anymore", log_level="error", job=None, persistent=True, cant_close=True )
@app.route("/view/", methods=["POST"])
def view():
data = request.get_json() # Parse JSON body
eid = data.get('eid', 0) # Extract the eid of the entry to view
# undo the skip by how_many and getentries again
OPT.offset -= int(OPT.how_many)
entries=GetEntries( OPT )
# now flag we are at the last in db, to reset current below
objs = {}
eids=""
resp={}
resp['objs']={}
for e in entries:
if not e.file_details:
continue
eids=eids+f"{e.id},"
resp['objs'][e.id]={}
resp['objs'][e.id]['url'] = e.FullPathOnFS()
resp['objs'][e.id]['name'] = e.name
resp['objs'][e.id]['type'] = e.type.name
if e.file_details.faces:
# model is used for whole file, so set it at that level (based on first face)
resp['objs'][e.id]['face_model'] = e.file_details.faces[0].facefile_lnk.model_used
resp['objs'][e.id]['faces'] = []
stmt = (
select(Entry)
.options(
joinedload(Entry.file_details).joinedload(File.faces),
joinedload(Entry.file_details).joinedload(File.faces).joinedload(Face.refimg).joinedload(Refimg.person)
)
.where(Entry.id == eid)
)
# put face data back into array format (for js processing)
fid=0
for face in e.file_details.faces:
fd= {}
fd['x'] = face.face_left
fd['y'] = face.face_top
fd['w'] = face.w
fd['h'] = face.h
if face.refimg:
fd['who'] = face.refimg.person.tag
fd['distance'] = round(face.refimg_lnk.face_distance,2)
resp['objs'][e.id]['faces'].append(fd)
fid+=1
eids=eids.rstrip(",")
lst = eids.split(',')
if 'next' in request.form:
OPT.current = int(lst[0])
if 'prev' in request.form:
OPT.current = int(lst[-1])
resp['current']=OPT.current
# OPT.first_eid can still be 0 IF we have gone past the first page, I could
# better set this in states rather than kludge this if... think about it
if OPT.first_eid>0:
resp['first_eid']=OPT.first_eid
resp['last_eid']=OPT.last_eid
resp['eids']=eids
resp['offset']=OPT.offset
# save pref to keep the new current value, first/last
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.orig_ptype==OPT.orig_ptype,PA_UserState.view_eid==OPT.view_eid).first()
UpdatePref( pref, OPT )
return make_response( resp )
################################################################################
# /view/id -> grabs data from DB and views it (GET)
################################################################################
@login_required
@app.route("/view/<id>", methods=["GET"])
def view(id):
OPT=States( request )
objs = {}
entries=GetEntries( OPT )
eids=""
for e in entries:
objs[e.id]=e
eids += f"{e.id},"
# if this is a dir, we won't view it with a click anyway, so move on...
if not e.file_details:
continue
# process any overrides
for face in e.file_details.faces:
# now get any relevant override and store it in objs...
fnmo = FaceNoMatchOverride.query.filter(FaceNoMatchOverride.face_id==face.id).first()
if fnmo:
face.no_match_override=fnmo
mo = FaceForceMatchOverride.query.filter(FaceForceMatchOverride.face_id==face.id).first()
if mo:
mo.type = FaceOverrideType.query.filter( FaceOverrideType.name== 'Manual match to existing person' ).first()
face.manual_override=mo
eids=eids.rstrip(",")
# jic, sometimes we trip this, and rather than show broken pages / destroy
if id not in eids:
SetFELog( message=f"ERROR: viewing an id, but its not in eids OPT={OPT}, id={id}, eids={eids}", log_level="error", persistent=True, cant_close=True)
msg="Sorry, viewing data is confused, cannot view this image now"
if os.environ['ENV'] == "production":
msg += "Clearing out all states. This means browser back buttons will not work, please start a new tab and try again"
PA_UserState.query.delete()
db.session.commit()
SetFELog( msg, "warning" )
return redirect("/")
else:
NMO_data = FaceOverrideType.query.all()
setting = Settings.query.first()
imp_path = setting.import_path
st_path = setting.storage_path
bin_path = setting.recycle_bin_path
return render_template("viewer.html", current=int(id), eids=eids, objs=objs, OPT=OPT, NMO_data=NMO_data, imp_path=imp_path, st_path=st_path, bin_path=bin_path )
##################################################################################
# /view/id -> grabs data from DB and views it (POST -> set state, redirect to GET)
##################################################################################
@app.route("/view/<id>", methods=["POST"])
@login_required
def view_img_post(id):
# set pa_user_states...
OPT=States( request )
# then use back-button friendly URL (and use pa_user_states to view the right image in the right list
return redirect( "/view/" + id );
# this needs unique() because:
# entry (one row for id=660)
# file (one row, since file_details is a one-to-one relationship)
# face (many rows, since a file can have many faces)
# refimg and person (one row per face, via the link tables)
# The SQL query returns a Cartesian product for the joins involving collections (like faces). For example, if your file has 3 faces,
# the result set will have 3 rows, each with the same entry and file data, but different face, refimg, and person data.
data=db.session.execute(stmt).unique().scalars().all()
return jsonify(entries_schema.dump(data))
# route called from front-end - if multiple images are being transformed, each transform == a separate call
# to this route (and therefore a separate transform job). Each response allows the f/e to check the
@@ -764,14 +757,6 @@ def _jinja2_filter_toplevelfolderof(path, cwd):
else:
return False
###############################################################################
# This func creates a new filter in jinja2 to test to hand back the parent path
# from a given path
################################################################################
@app.template_filter('ParentPath')
def _jinja2_filter_parentpath(path):
return os.path.dirname(path)
###############################################################################
# route to allow the Move Dialog Box to pass a date (YYYYMMDD) and returns a
# json list of existing dir names that could be near it in time. Starting
@@ -782,7 +767,7 @@ def _jinja2_filter_parentpath(path):
def get_existing_paths(dt):
dir_ft=FileType.query.filter(FileType.name=='Directory').first()
dirs_arr=[]
for delta in range(-7, 8):
for delta in range(-14, 15):
try:
new_dtime=datetime.strptime(dt, "%Y%m%d") + timedelta(days=delta)
except:
@@ -822,3 +807,17 @@ def get_existing_paths(dt):
first_dir=0
ret+= ' ]'
return make_response( ret )
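For reference, the range(-14, 15) loop above walks a +/-14-day window around the supplied YYYYMMDD date; a minimal standalone sketch of that windowing:
from datetime import datetime, timedelta
dt = "20251012"                      # example date only
base = datetime.strptime(dt, "%Y%m%d")
window = [(base + timedelta(days=d)).strftime("%Y%m%d") for d in range(-14, 15)]
print(window[0], window[-1])         # 20250928 20251026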
# quick helper func to return timestamps of jscript files
# we use this as a quick/hacky way of versioning them
def getVersions():
js_vers={}
js_vers['fs'] = int(os.path.getmtime( "."+url_for( 'internal', filename='js/files_support.js') ))
js_vers['vs'] = int(os.path.getmtime( "."+url_for( 'internal', filename='js/view_support.js') ))
js_vers['ft'] = int(os.path.getmtime( "."+url_for( 'internal', filename='js/files_transform.js') ))
js_vers['ic'] = int(os.path.getmtime( "."+url_for( 'internal', filename='icons.svg') ))
js_vers['r180'] = int(os.path.getmtime( "."+url_for( 'internal', filename='rot180.png') ))
js_vers['r270'] = int(os.path.getmtime( "."+url_for( 'internal', filename='rot270.png') ))
js_vers['r90'] = int(os.path.getmtime( "."+url_for( 'internal', filename='rot90.png') ))
js_vers['th'] = int(os.path.getmtime( "."+url_for( 'internal', filename='throbber.gif') ))
return js_vers
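getVersions() above is plain mtime-based cache busting: the integer mtime gets appended as a ?v= query string so browsers refetch a static asset once it changes on disk. A minimal sketch of the idea (the on-disk path here is an assumption, not the app's real layout):
import os
def asset_version(path):
    # integer mtime as a cheap cache-busting token
    return int(os.path.getmtime(path))
src = "static/internal/js/files_support.js"   # hypothetical location
if os.path.exists(src):
    print(f'<script src="/{src}?v={asset_version(src)}"></script>')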


@@ -161,39 +161,18 @@
c4.142,0,7.5-3.357,7.5-7.5S339.642,328,335.5,328z"/>
<g style="fill:#00000025;" transform="matrix(16, 0, 0, 16, 120, 115)"><path d="M4.502 9a1.5 1.5 0 1 0 0-3 1.5 1.5 0 0 0 0 3z"/><path d="M14.002 13a2 2 0 0 1-2 2h-10a2 2 0 0 1-2-2V5A2 2 0 0 1 2 3a2 2 0 0 1 2-2h10a2 2 0 0 1 2 2v8a2 2 0 0 1-1.998 2zM14 2H4a1 1 0 0 0-1 1h9.002a2 2 0 0 1 2 2v7A1 1 0 0 0 15 11V3a1 1 0 0 0-1-1zM2.002 4a1 1 0 0 0-1 1v8l2.646-2.354a.5.5 0 0 1 .63-.062l2.66 1.773 3.71-3.71a.5.5 0 0 1 .577-.094l1.777 1.947V5a1 1 0 0 0-1-1h-10z"/></g>
</svg>
<svg id="flip_h" fill="currentColor" viewBox='0 0 512 512'>
<svg id="flip_h" viewBox='0 0 512 512'>
<path fill='none' stroke='currentColor' stroke-linecap='round' stroke-linejoin='round' stroke-width='32' d='M304 48l112 112-112 112M398.87 160H96M208 464L96 352l112-112M114 352h302'/>
</svg>
<svg id="flip_v" fill="currentColor" viewBox='0 0 512 512'>
<svg id="flip_v" viewBox='0 0 512 512'>
<path fill='none' stroke='currentColor' stroke-linecap='round' stroke-linejoin='round' stroke-width='32' d='M464 208L352 96 240 208M352 113.13V416M48 304l112 112 112-112M160 398V96'/>
</svg>
<svg id="fullscreen" fill="currentColor" viewBox="0 0 16 16">
<svg id="fullscreen" viewBox="0 0 16 16">
<path fill-rule="evenodd" d="M5.828 10.172a.5.5 0 0 0-.707 0l-4.096 4.096V11.5a.5.5 0 0 0-1 0v3.975a.5.5 0 0 0 .5.5H4.5a.5.5 0 0 0 0-1H1.732l4.096-4.096a.5.5 0 0 0 0-.707zm4.344 0a.5.5 0 0 1 .707 0l4.096 4.096V11.5a.5.5 0 1 1 1 0v3.975a.5.5 0 0 1-.5.5H11.5a.5.5 0 0 1 0-1h2.768l-4.096-4.096a.5.5 0 0 1 0-.707zm0-4.344a.5.5 0 0 0 .707 0l4.096-4.096V4.5a.5.5 0 1 0 1 0V.525a.5.5 0 0 0-.5-.5H11.5a.5.5 0 0 0 0 1h2.768l-4.096 4.096a.5.5 0 0 0 0 .707zm-4.344 0a.5.5 0 0 1-.707 0L1.025 1.732V4.5a.5.5 0 0 1-1 0V.525a.5.5 0 0 1 .5-.5H4.5a.5.5 0 0 1 0 1H1.732l4.096 4.096a.5.5 0 0 1 0 .707z"/>
</svg>
<svg id="unknown_ftype" fill="grey" viewBox="0 0 16 16">
<path d="M2 0a2 2 0 0 0-2 2v12a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V2a2 2 0 0 0-2-2H2zm3.496 6.033a.237.237 0 0 1-.24-.247C5.35 4.091 6.737 3.5 8.005 3.5c1.396 0 2.672.73 2.672 2.24 0 1.08-.635 1.594-1.244 2.057-.737.559-1.01.768-1.01 1.486v.105a.25.25 0 0 1-.25.25h-.81a.25.25 0 0 1-.25-.246l-.004-.217c-.038-.927.495-1.498 1.168-1.987.59-.444.965-.736.965-1.371 0-.825-.628-1.168-1.314-1.168-.803 0-1.253.478-1.342 1.134-.018.137-.128.25-.266.25h-.825zm2.325 6.443c-.584 0-1.009-.394-1.009-.927 0-.552.425-.94 1.01-.94.609 0 1.028.388 1.028.94 0 .533-.42.927-1.029.927z"/>
</svg>
<svg id="rot90" viewBox="0 0 225 225" xmlns="http://www.w3.org/2000/svg" xmlns:bx="https://boxy-svg.com">
<g transform="matrix(1, 0, 0, 1, 7, 20)">
<path style="fill: rgba(216, 216, 216, 0); stroke: rgb(13, 202, 240); stroke-width: 15px;" d="M 100.468 0.428 C 155.454 0.428 200.022 45.014 200 100"/>
<path d="M -100.468 -5.428 H -100.468 L -100.468 -20.428 L -60.468 -0.428 L -100.468 19.572 L -100.468 4.572 H -100.468 V -5.428 Z" style="fill: rgb(13, 202, 240);" transform="matrix(-1, 0, 0, -1, 0, 0)" bx:shape="arrow -100.468 -20.428 40 40 10 40 0 1@18b8b13c"/>
<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" transform="matrix(1, 0, 0, 1, -50.667503, 38.507301)"><tspan x="97.106" y="94.348">90</tspan><tspan x="97.106" dy="1em"></tspan></text>
</g>
</svg>
<svg id="rot180" viewBox="0 0 225 225" xmlns:bx="https://boxy-svg.com">
<path style="fill: rgba(216, 216, 216, 0); paint-order: stroke; stroke-width: 15px; stroke: rgb(13, 202, 240);" d="M 70.624 -41.413 C 147.233 -41.507 195.215 41.367 156.992 107.76 C 139.252 138.572 106.422 157.581 70.867 157.624" transform="matrix(0, -1, 1, 0, 62.412816, 178.623736)"/>
<path d="M 0 95 H 0 L 0 80 L 40 100 L 0 120 L 0 105 H 0 V 95 Z" style="fill: rgb(13, 202, 240);" transform="matrix(0.000048, 1, -1, 0.000048, 120.999475, 107.995157)" bx:shape="arrow 0 80 40 40 10 40 0 1@de4046a6"/>
<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" transform="matrix(1, 0, 0, 1, -13.373604, 45.416005)"><tspan x="54.39" y="96.53">180</tspan><tspan x="54.39" dy="1em"></tspan></text>
</svg>
<svg id="rot270" viewBox="0 0 225 225" xmlns:bx="https://boxy-svg.com">
<g transform="matrix(1, 0, 0, 1, 8, 9)">
<path style="fill: rgba(13, 202, 240, 0); paint-order: fill; stroke: rgb(13, 202, 240); stroke-width: 15px;" d="M 100.308 199.384 C 177.051 199.384 225.015 116.307 186.644 49.846 C 148.272 -16.615 52.344 -16.615 13.972 49.846 C 5.371 64.744 0.771 81.615 0.62 98.817" transform="matrix(0, -1, 1, 0, 0.692227, 200.07649)"/>
<path d="M 99 195 H 99 L 99 180 L 139 200 L 99 220 L 99 205 H 99 V 195 Z" style="fill: rgb(13, 202, 240);" bx:shape="arrow 99 180 40 40 10 40 0 1@077af6d0"/>
<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" x="20.59" y="133.396">270</text>
</g>
</svg>
<svg id="sort-num-asc" viewBox="0 0 16 16">
<path d="M12.438 1.668V7H11.39V2.684h-.051l-1.211.859v-.969l1.262-.906h1.046z"/>
<path fill-rule="evenodd" d="M11.36 14.098c-1.137 0-1.708-.657-1.762-1.278h1.004c.058.223.343.45.773.45.824 0 1.164-.829 1.133-1.856h-.059c-.148.39-.57.742-1.261.742-.91 0-1.72-.613-1.72-1.758 0-1.148.848-1.835 1.973-1.835 1.09 0 2.063.636 2.063 2.687 0 1.867-.723 2.848-2.145 2.848zm.062-2.735c.504 0 .933-.336.933-.972 0-.633-.398-1.008-.94-1.008-.52 0-.927.375-.927 1 0 .64.418.98.934.98z"/>
@@ -229,4 +208,7 @@
<path d="M4.406 1.342A5.53 5.53 0 0 1 8 0c2.69 0 4.923 2 5.166 4.579C14.758 4.804 16 6.137 16 7.773 16 9.569 14.502 11 12.687 11H10a.5.5 0 0 1 0-1h2.688C13.979 10 15 8.988 15 7.773c0-1.216-1.02-2.228-2.313-2.228h-.5v-.5C12.188 2.825 10.328 1 8 1a4.53 4.53 0 0 0-2.941 1.1c-.757.652-1.153 1.438-1.153 2.055v.448l-.445.049C2.064 4.805 1 5.952 1 7.318 1 8.785 2.23 10 3.781 10H6a.5.5 0 0 1 0 1H3.781C1.708 11 0 9.366 0 7.318c0-1.763 1.266-3.223 2.942-3.593.143-.863.698-1.723 1.464-2.383z"/>
<path d="M7.646 15.854a.5.5 0 0 0 .708 0l3-3a.5.5 0 0 0-.708-.708L8.5 14.293V5.5a.5.5 0 0 0-1 0v8.793l-2.146-2.147a.5.5 0 0 0-.708.708l3 3z"/>
</svg>
<svg id="back" viewBox="0 0 16 16">
<path d="m7.247 4.86-4.796 5.481c-.566.647-.106 1.659.753 1.659h9.592a1 1 0 0 0 .753-1.659l-4.796-5.48a1 1 0 0 0-1.506 0z"/>
</svg>
</svg>



@@ -1,3 +1,16 @@
// GLOBAL ICON lookup (path type name -> icon id in icons.svg)
ICON={}
ICON["Import"]="import"
ICON["Storage"]="db"
ICON["Bin"]="trash"
// function called when we get another page from inside the files view
function getPageFigures(res, viewingIdx)
{
// add all the figures to files_div
drawPageOfFigures()
}
// grab all selected thumbnails and return a <div> containing the thumbnails
// with extra yr and date attached as attributes so we can set the default
// dir name for a move directory - not used in del, but no harm to include them
@@ -88,7 +101,7 @@ function MoveOrDelCleanUpUI()
// show the DBox for a move file, includes all thumbnails of selected files to move
// and a pre-populated folder to move them into, with text field to add a suffix
function MoveDBox(path_details, db_url)
function MoveDBox(path_details)
{
$('#dbox-title').html('Move Selected File(s) to new directory in Storage Path')
div =`
@@ -98,12 +111,12 @@ function MoveDBox(path_details, db_url)
<form id="mv_fm" class="form form-control-inline col-12">
<input id="move_path_type" name="move_path_type" type="hidden"
`
div += ' value="' + path_details[0].type + '"></input>'
div += ' value="' + path_details[0].type.name + '"></input>'
div+=GetSelnAsDiv()
yr=$('.highlight').first().attr('yr')
dt=$('.highlight').first().attr('date')
div+='<div class="row">Use Existing Directory (in the chosen path):</div><div id="existing"></div>'
GetExistingDirsAsDiv( dt, "existing", path_details[0].type )
GetExistingDirsAsDiv( dt, "existing", path_details[0].type.name )
div+=`
<div class="input-group my-3">
<alert class="alert alert-primary my-auto py-1">
@@ -112,7 +125,7 @@ function MoveDBox(path_details, db_url)
div+= '<svg id="move_path_icon" width="20" height="20" fill="currentColor"><use xlink:href="' + path_details[0].icon_url + '"></svg>'
div+= '<select id="rp_sel" name="rel_path" class="text-primary alert-primary py-1 border border-primary rounded" onChange="change_rp_sel()">'
for(p of path_details) {
div+= '<option path_type="'+p.type+'" icon_url="'+p.icon_url+'">'+p.path+'</option>'
div+= '<option path_type="'+p.type.name+'" icon_url="'+p.icon_url+'">'+p.root_dir+'</option>'
}
div+= '</select>'
div+=`
@@ -156,7 +169,7 @@ function DelDBox(del_or_undel)
div+=`
'/delete_files',
success: function(data){
if( $(location).attr('pathname').match('search') !== null ) { window.location='/' }; CheckForJobs() } }); return false" class="btn btn-outline-danger col-2">Ok</button>
if( $(location).attr('pathname').match('search') !== null || document.viewing ) { window.location='/' }; CheckForJobs() } }); return false" class="btn btn-outline-danger col-2">Ok</button>
</div>
`
else
@@ -165,7 +178,7 @@ function DelDBox(del_or_undel)
div+=`
'/restore_files',
success: function(data){
if( $(location).attr('pathname').match('search') !== null ) { window.location='/' }; CheckForJobs() } }); return false" class="btn btn-outline-success col-2">Ok</button>
if( $(location).attr('pathname').match('search') !== null || document.viewing ) { window.location='/' }; CheckForJobs() } }); return false" class="btn btn-outline-success col-2">Ok</button>
</div>
`
$('#dbox-content').html(div)
@@ -200,53 +213,82 @@ function DetailsDBox()
}
// function to change the size of thumbnails (and resets button bar to newly
// selected size)
function ChangeSize(clicked_button,sz)
{
$('.sz-but.btn-info').removeClass('btn-info text-white').addClass('btn-outline-info')
$(clicked_button).addClass('btn-info text-white').removeClass('btn-outline-info')
$('.thumb').attr( {height: sz, style: 'font-size:'+sz+'px' } )
$('#size').val(sz)
sz=sz-22
$('.svg').height(sz);
$('.svg').width(sz);
$('.svg_cap').width(sz);
}
// DoSel is called when a click event occurs, and sets the selection via adding
// 'highlight' to the class of the appropriate thumbnails
// e == event (so we can see if shift/ctrl was held down while left-clicking)
// el == element the click is on
// this allows single-click to select, ctrl-click to (de)select 1 item, and
// shift-click to add all elements between highlighted area and clicked area,
// whether you click after highlight or before
function DoSel(e, el)
{
if( e.ctrlKey )
{
$(el).toggleClass('highlight')
return
}
if( e.shiftKey )
{
st=Number($('.highlight').first().attr('ecnt'))
end=Number($('.highlight').last().attr('ecnt'))
clicked=Number($(el).attr('ecnt'))
// if we shift-click first element, then st/end are NaN, so just highlight the one clicked
if( isNaN(st) )
{
$('.entry').slice( clicked, clicked+1 ).addClass('highlight')
return
// shift-click to add all elements between highlighted area and clicked el,
// whether you click before highlight or after, or inside a gap and then back
// or forward to the closest highlighted entry - also, this only works on the entry class,
// so it ignores figures that we take 'entry' off while we transform them, etc.
function DoSel(e, el) {
const id = $(el).attr('id');
const entries = $('.entry');
// Collect currently highlighted entries
const currentHighlights = $('.highlight');
const highlighted = new Set();
currentHighlights.each(function() {
highlighted.add($(this).attr('id'));
});
// Ctrl+click: toggle highlight for the clicked entry
if (e.ctrlKey || document.fake_ctrl === 1) {
$(el).toggleClass('highlight');
if (highlighted.has(id)) {
highlighted.delete(id);
} else {
highlighted.add(id);
}
if( clicked > end )
$('.entry').slice( end, clicked+1 ).addClass('highlight')
else
$('.entry').slice( clicked, st ).addClass('highlight')
return
if (document.fake_ctrl === 1) {
document.fake_ctrl = 0;
}
return;
}
// Shift+click: select a range
else if (e.shiftKey || document.fake_shift === 1) {
if (currentHighlights.length === 0) {
// If no highlights, just highlight the clicked entry
$(el).addClass('highlight');
highlighted.add(id);
} else {
// Find the nearest highlighted entry
const clickedIndex = entries.index($(el));
let nearestHighlightIndex = -1;
let minDistance = Infinity;
currentHighlights.each(function() {
const highlightIndex = entries.index($(this));
const distance = Math.abs(highlightIndex - clickedIndex);
if (distance < minDistance) {
minDistance = distance;
nearestHighlightIndex = highlightIndex;
}
});
// Highlight the range between the nearest highlighted entry and the clicked entry
const from = Math.min(clickedIndex, nearestHighlightIndex);
const to = Math.max(clickedIndex, nearestHighlightIndex);
for (let i = from; i <= to; i++) {
const entryId = entries.eq(i).attr('id');
highlighted.add(entryId);
entries.eq(i).addClass('highlight');
}
}
if (document.fake_shift === 1) {
document.fake_shift = 0;
}
return;
}
// Single click: clear all highlights and highlight the clicked entry
else {
$('.highlight').removeClass('highlight');
highlighted.clear();
$(el).addClass('highlight');
highlighted.add(id);
}
$('.highlight').removeClass('highlight')
$(el).addClass('highlight')
}
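The shift-click branch above finds the highlighted entry nearest to the click and selects the whole range in between. The same logic, expressed as a small Python sketch (plain lists standing in for the jQuery collections), since the idea is language-neutral:
def shift_select(entry_ids, highlighted, clicked_id):
    # entry_ids: all entries in page order; highlighted: set of selected ids
    if not highlighted:
        return {clicked_id}
    clicked_idx = entry_ids.index(clicked_id)
    # nearest already-highlighted entry to the click
    nearest_idx = min(
        (entry_ids.index(h) for h in highlighted),
        key=lambda i: abs(i - clicked_idx),
    )
    lo, hi = sorted((clicked_idx, nearest_idx))
    return highlighted | set(entry_ids[lo:hi + 1])
page = ["e1", "e2", "e3", "e4", "e5"]
print(sorted(shift_select(page, {"e5"}, "e2")))  # ['e2', 'e3', 'e4', 'e5']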
// if a selection exists, enable move & del/restore buttons otherwise disable them
@@ -311,3 +353,547 @@ function NoSel() {
else
return true
}
/**
* Renders a group header or entry based on the object and options.
* obj - The object containing file/directory details.
* last - Tracks the last printed group (e.g., { printed: null }).
 * ecnt - Entry counter (a plain number, e.g. 0).
* returns {string} - Generated HTML string.
*/
function addFigure( obj, last, ecnt )
{
let html = "";
// Grouping logic
if (OPT.grouping === "Day") {
if (last.printed !== obj.file_details.day) {
html += `<div class="row ps-3"><h6>Day: ${obj.file_details.day} of ${obj.file_details.month}/${obj.file_details.year}</h6></div>`;
last.printed = obj.file_details.day;
}
} else if (OPT.grouping === "Week") {
if (last.printed !== obj.file_details.woy) {
html += `<div class="row ps-3"><h6>Week #: ${obj.file_details.woy} of ${obj.file_details.year}</h6></div>`;
last.printed = obj.file_details.woy;
}
} else if (OPT.grouping === "Month") {
if (last.printed !== obj.file_details.month) {
html += `<div class="row ps-3"><h6>Month: ${obj.file_details.month} of ${obj.file_details.year}</h6></div>`;
last.printed = obj.file_details.month;
}
}
// Image/Video/Unknown entry
if (obj.type.name === "Image" || obj.type.name === "Video" || obj.type.name === "Unknown") {
const pathType = obj.in_dir.in_path.type.name;
const size = obj.file_details.size_mb;
const hash = obj.file_details.hash;
const inDir = `${obj.in_dir.in_path.path_prefix}/${obj.in_dir.rel_path}`;
const fname = obj.name;
const yr = obj.file_details.year;
const date = `${yr}${String(obj.file_details.month).padStart(2, '0')}${String(obj.file_details.day).padStart(2, '0')}`;
const prettyDate = `${obj.file_details.day}/${obj.file_details.month}/${obj.file_details.year}`;
const type = obj.type.name;
html += `
<figure id="${obj.id}" ecnt="${ecnt}" class="col col-auto g-0 figure entry m-1"
path_type="${pathType}" size="${size}" hash="${hash}" in_dir="${inDir}"
fname="${fname}" yr="${yr}" date="${date}" pretty_date="${prettyDate}" type="${type}">
${renderMedia(obj)}
</figure>
`;
}
// Directory entry
else if (obj.type.name === "Directory" && OPT.folders) {
const dirname = obj.dir_details.rel_path.length
? `${obj.dir_details.in_path.path_prefix}/${obj.dir_details.rel_path}`
: obj.dir_details.in_path.path_prefix;
html += `
<figure class="col col-auto g-0 dir entry m-1" id="${obj.id}" ecnt="${ecnt}" dir="${dirname}" type="Directory">
<svg class="svg" width="${OPT.size - 22}" height="${OPT.size - 22}" fill="currentColor">
<use xlink:href="/internal/icons.svg#Directory"></use>
</svg>
<figcaption class="svg_cap figure-caption text-center text-wrap text-break">${obj.name}</figcaption>
</figure>
`;
html += `<script>f=$('#${obj.id}'); w=f.find('svg').width(); f.find('figcaption').width(w);</script>`;
}
$('#figures').append( html )
// check if there is a pending amendment for this entry; if so, mark it up
// (e.g. it's being deleted, rotated, etc) - details are in the am obj
for (const am of document.amendments)
{
if( am.eid == obj.id )
{
$('#'+obj.id).find('img.thumb').attr('style', 'filter: grayscale(100%);' )
$('#'+obj.id).removeClass('entry')
html='<img class="position-absolute top-50 start-50 translate-middle" height="60" src="/internal/white-circle.png">'
html+='<img class="position-absolute top-50 start-50 translate-middle" height="64" src="/internal/throbber.gif">'
if( am.type.which == 'icon' )
html+=`<svg class="position-absolute top-50 start-50 translate-middle" height="32" style="color:${am.type.colour}" fill="${am.type.colour}"><use xlink:href="/internal/icons.svg#${am.type.what}"></use></svg>`
else
html+=`<img class="position-absolute top-50 start-50 translate-middle" src="/internal/${am.type.what}?v={{js_vers['r270']}}" height="32">`
$('#'+obj.id).find('a').append(html)
}
}
return
}
// Helper function to render media (image/video/unknown)
function renderMedia(obj) {
const isImageOrUnknown = obj.type.name === "Image" || obj.type.name === "Unknown";
const isVideo = obj.type.name === "Video";
const path = `${obj.in_dir.in_path.path_prefix}/${obj.in_dir.rel_path}/${obj.name}`;
const thumb = obj.file_details.thumbnail
? `<a href="${path}"><img alt="${obj.name}" class="thumb" height="${OPT.size}" src="data:image/jpeg;base64,${obj.file_details.thumbnail}"></a>`
: `<a href="${path}"><svg width="${OPT.size}" height="${OPT.size}" fill="white"><use xlink:href="/internal/icons.svg#unknown_ftype"/></svg></a>`;
let mediaHtml = `<div style="position:relative; width:100%">${thumb}`;
if (isImageOrUnknown) {
if (OPT.search_term) {
mediaHtml += `
<div style="position:absolute; bottom: 0px; left: 2px;">
<svg width="16" height="16" fill="white"><use xlink:href="/internal/icons.svg#${getLocationIcon(obj)}"/></svg>
</div>
`;
}
mediaHtml += `
<div id="s${obj.id}" style="display:none; position:absolute; top: 50%; left:50%; transform:translate(-50%, -50%);">
<img height="64px" src="/internal/throbber.gif">
</div>
`;
} else if (isVideo) {
mediaHtml += `
<div style="position:absolute; top: 0px; left: 2px;">
<svg width="16" height="16" fill="white"><use xlink:href="/internal/icons.svg#film"/></svg>
</div>
`;
if (OPT.search_term) {
mediaHtml += `
<div style="position:absolute; bottom: 0px; left: 2px;">
<svg width="16" height="16" fill="white"><use xlink:href="/internal/icons.svg#${getLocationIcon(obj)}"/></svg>
</div>
`;
}
}
mediaHtml += `</div>`;
return mediaHtml;
}
// Helper: Get location icon (placeholder)
function getLocationIcon(obj) {
return ICON[obj.in_dir.in_path.type.name]
}
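// Illustrative sketch only (not part of the app): a minimal, made-up entry object showing
// just the fields renderMedia()/getLocationIcon() read above; ICON is assumed to be the
// app's path-type -> icon-name lookup defined elsewhere.
const exampleEntry = {
  id: 123,
  name: "IMG_0001.jpg",
  type: { name: "Image" },
  in_dir: {
    rel_path: "2024/holiday",
    in_path: { path_prefix: "static/storage", type: { name: "Storage" } }
  },
  file_details: { thumbnail: "<base64 jpeg data>", size_mb: 2.4, hash: "abc123",
                  year: 2024, month: 1, day: 31 }
};
// renderMedia(exampleEntry) would return a position:relative div wrapping an <a> to
// static/storage/2024/holiday/IMG_0001.jpg with the base64 thumbnail inside it, plus the
// hidden s<id> throbber div that is shown while a transform job runs on this entry.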
// POST to get entry ids, and then getPage for a specified directory
function getDirEntries(dir_id, back)
{
data={}
data.dir_id=dir_id
data.back=back
data.noo=OPT.noo
$.ajax({
type: 'POST',
url: '/get_dir_eids',
data: JSON.stringify(data),
contentType: 'application/json',
dataType: 'json',
success: function(res) {
if( res.valid === false )
{
        $('#figures').html( "<alert class='alert alert-danger'>ERROR: this directory has changed since you loaded the view. Please reload the page to reset your view (the directory or its parent was probably deleted after this page was loaded).</alert>" )
return
}
entryList=res.entry_list
pageList=entryList.slice(0, OPT.how_many)
// now go get actual data/entries
getPage(1,getPageFigures)
},
error: function(xhr, status, error) {
console.error("Error:", error);
}
});
}
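// Sketch of the request/response contract implied by the code above (descriptive only -
// the authoritative schema lives in the Flask route behind /get_dir_eids):
//   request : { dir_id: <eid of the dir clicked, or of the first entry on the page when
//               going back>, back: <true for the Back figure, false when descending>,
//               noo: OPT.noo }
//   response: { valid: <false if the dir vanished since this view was built>,
//               entry_list: [ <eid>, <eid>, ... ] }
// on success, entryList/pageList are rebuilt and getPage(1, getPageFigures) re-renders page 1.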
// this function draws all the figures from document.entries - called when we
// change pages, but also when we change say grouping/other OPTs
function drawPageOfFigures()
{
$('#figures').empty()
var last = { printed: null }
var ecnt=0
// something is up, let the user know
if( document.alert )
$('#figures').append( document.alert )
if( OPT.folders )
{
    // if root_eid is 0, then there are no entries in this path - can't go up
if( OPT.root_eid == 0 || (document.entries.length && document.entries[0].in_dir.eid == OPT.root_eid ) )
{
gray="_gray"
back=""
cl=""
back_id=0
}
else
{
gray=""
back="Back"
cl="back"
if( document.entries.length > 0 )
back_id = document.entries[0].in_dir.eid
else
back_id = document.back_id
}
    // back button: gray/back decide whether we show a grayed-out folder and/or the name of the folder we go back to.
    // with class "back" this gets a different click handler, which tells the server to return data by going back/up in the dir tree;
    // we give the server the id of the first item on the page so it can work out how to go back
html=`<div class="col col-auto g-0 m-1">
<figure id="${back_id}" ecnt="0" class="${cl} entry m-1" type="Directory">
<svg class="svg" width="${OPT.size-22}" height="${OPT.size-22}">
<use xlink:href="internal/icons.svg#folder_back${gray}"/>
</svg>
<figcaption class="figure-caption text-center">${back}</figcaption>
</figure>
</div>`
ecnt++
$('#figures').append(html)
}
for (const obj of document.entries) {
addFigure( obj, last, ecnt )
ecnt++
}
if( document.entries.length == 0 )
if( OPT.search_term )
$('#figures').append( `<span class="alert alert-danger p-2 col-auto"> No matches for: '${OPT.search_term}'</span>` )
else if( OPT.root_eid == 0 )
$('#figures').append( `<span class="alert alert-danger p-2 col-auto d-flex align-items-center">No files in Path!</span>` )
$('.figure').click( function(e) { DoSel(e, this ); SetButtonState(); return false; });
$('.figure').dblclick( function(e) { startViewing( $(this).attr('id') ) } )
// for dir, getDirEntries 2nd param is back (or "up" a dir)
$(".dir").click( function(e) { document.back_id=this.id; getDirEntries(this.id,false) } )
$(".back").click( function(e) { getDirEntries(this.id,true) } )
}
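// For reference, the back-button cases handled above (derived from the code, not a spec):
//   - OPT.root_eid == 0 (path has no entries), or the first entry's parent dir is already
//     the root entry  -> grayed-out folder icon, no "Back" caption, no click handler
//   - otherwise       -> a "Back" folder whose id is the parent dir of the first entry on
//     the page (or the remembered document.back_id when the page is empty), wired to
//     getDirEntries(id, true) via the ".back" click handler bound at the end of this function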
// empty out file_list_div, and repopulate it with a new page of content
function getPageFileList(res, viewingIdx)
{
$('#file_list_div').empty()
// something is up, let the user know
if( document.alert )
$('#file_list_div').append( '<div class="row">' + document.alert + '</div>' )
if( OPT.root_eid == 0 )
{
$('#file_list_div').append( `<span class="alert alert-danger p-2">No files in Path!</span>` )
return
}
html='<table class="table table-striped table-sm col-12">'
html+='<thead><tr class="table-primary"><th>Name</th><th>Size (MB)</th><th>Path Prefix</th><th>Hash</th></tr></thead><tbody>'
for (const obj of res) {
html+=`<tr>
<td>
<div class="d-flex align-items-center">
<a href="${obj.in_dir.in_path.path_prefix}/${obj.in_dir.rel_path}/${obj.name}">
<img class="img-fluid me-2" style="max-width: 100px;"
src="data:image/jpeg;base64,${obj.file_details.thumbnail}"></img>
</a>
<span>${obj.name}</span>
</div>
<td>${obj.file_details.size_mb}</td>
<td>${obj.in_dir.in_path.path_prefix.replace("static/","")}/${obj.in_dir.rel_path}</td>
<td>${obj.file_details.hash}</td>
</tr>`
}
html+='</tbody></table>'
$('#file_list_div').append(html)
}
// wrapper function so the real DB-query success path and the cache path
// share the same handling
function getEntriesByIdSuccessHandler(res,pageNumber,successCallback,viewingIdx)
{
if( res.length != pageList.length )
document.alert="<alert class='alert alert-warning'>WARNING: something has changed since viewing this page (likely someone deleted content in another view), strongly suggest a page reload to get the latest data</alert>"
document.entries=res;
// cache this
document.page[pageNumber]=res
successCallback(res,viewingIdx)
resetNextPrevButtons()
// if search, disable folders
if( OPT.search_term )
$('#folders').prop('disabled', 'disabled').removeClass('border-info').addClass('border-secondary').removeClass('text-info').addClass('text-secondary');
else if( document.entries.length == 0 )
{
html=`<span class="alert alert-danger p-2 col-auto">No files in Path</span>`
$('#file_list_div').append(html)
$('#figures').append(html)
}
}
// Function to get the 'page' of entry ids out of entryList
function getPage(pageNumber, successCallback, viewingIdx=0)
{
  // before we do anything, disable the left/right arrows on the viewer to stop
  // getting another event before we have the data for this page back
$('#la').prop('disabled', true)
$('#ra').prop('disabled', true)
const startIndex = (pageNumber - 1) * OPT.how_many;
const endIndex = startIndex + OPT.how_many;
pageList = entryList.slice(startIndex, endIndex);
// set up data to send to server to get the entry data for entries in pageList
data={}
data.ids = pageList
  // assume nothing is wrong; if the data turns out odd this will be set non-null and displayed later (can't add it here, as later code calls .empty() on the file divs)
  document.alert=null
  // see if we can use the cache and avoid reloading from the DB
if( !OPT.folders && document.page.length && document.page[pageNumber] )
{
getEntriesByIdSuccessHandler( document.page[pageNumber], pageNumber, successCallback, viewingIdx )
return
}
$.ajax({
type: 'POST', url: '/get_entries_by_ids',
data: JSON.stringify(data), contentType: 'application/json',
dataType: 'json',
success: function(res) { document.amendments=res.amend; getEntriesByIdSuccessHandler( res.entries, pageNumber, successCallback, viewingIdx ) },
error: function(xhr, status, error) { console.error("Error:", error); } });
return
}
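// A minimal sketch (illustration only, not part of the app) of how the flat-view page cache
// used above behaves: document.page is a sparse array keyed by page number, filled on each
// successful /get_entries_by_ids response and emptied whenever changeOPT() runs.
function sketchCacheLookup(pageNumber) {
  // hypothetical helper, mirrors the check inside getPage()
  if( !OPT.folders && document.page.length && document.page[pageNumber] )
    return document.page[pageNumber]   // cache hit: reuse the entries, no DB round trip
  return null                          // cache miss: getPage() falls through to the AJAX call
}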
// Quick Function to check if we are on the first page
function isFirstPage(pageNumber)
{
return pageNumber <= 1;
}
// Function to check if we are on the last page
function isLastPage(pageNumber)
{
const totalPages = Math.ceil(entryList.length / OPT.how_many);
return pageNumber >= totalPages;
}
// given an id in the list, return which page we are on (page 1 is first page)
function getPageNumberForId(id) {
const idx = entryList.indexOf(id);
// should be impossible but jic
if (idx === -1) { return -1 }
return Math.floor(idx / OPT.how_many) + 1;
}
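// Worked example of the paging arithmetic (hypothetical values, illustration only):
// with OPT.how_many = 5 and entryList = [201, 202, ..., 213] (13 eids):
//   getPageNumberForId(201) -> Math.floor(0  / 5) + 1 = 1
//   getPageNumberForId(206) -> Math.floor(5  / 5) + 1 = 2
//   getPageNumberForId(213) -> Math.floor(12 / 5) + 1 = 3   (the short last page)
//   getPageNumberForId(999) -> -1                            (not in entryList)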
// if we are on the first page, disable prev, otherwise ensure prev is enabled
// if we are on the last page, disable next, otherwise ensure next is enabled
function resetNextPrevButtons()
{
// no data, so disabled both
if( getPageNumberForId(pageList[0]) == -1 )
{
$('.prev').prop('disabled', true).addClass('disabled');
$('.next').prop('disabled', true).addClass('disabled');
return
}
if ( isFirstPage( getPageNumberForId(pageList[0]) ) )
$('.prev').prop('disabled', true).addClass('disabled');
else
$('.prev').prop('disabled', false).removeClass('disabled');
if ( isLastPage( getPageNumberForId(pageList[0]) ) )
$('.next').prop('disabled', true).addClass('disabled');
else
$('.next').prop('disabled', false).removeClass('disabled');
}
// get list of eids for the next page, also make sure next/prev buttons make sense for page we are on
function nextPage(successCallback)
{
// pageList[0] is the first entry on this page
const currentPage=getPageNumberForId( pageList[0] )
// should never happen / just return pageList unchanged
if ( currentPage === -1 || isLastPage( currentPage ) )
{
console.error( "WARNING: seems first on pg=" + pageList[0] + " of how many=" + OPT.how_many + " gives currentPage=" + currentPage + " and we cant go next page?" )
return
}
getPage( currentPage+1, successCallback )
return
}
// get list of eids for the prev page, also make sure next/prev buttons make sense for page we are on
function prevPage(successCallback)
{
// pageList[0] is the first entry on this page
const currentPage=getPageNumberForId( pageList[0] )
// should never happen / just return pageList unchanged
if (currentPage === 1 || currentPage === -1 )
{
console.error( "WARNING: seems first on pg=" + pageList[0] + " of how many=" + OPT.how_many + " gives currentPage=" + currentPage + " and we cant go prev page?" )
return
}
getPage( currentPage-1, successCallback )
return
}
// function to see if we are on a phone or tablet (where we don't have ctrl or shift keys - helps to display fake buttons to allow multi-select on mobiles)
function isMobile() {
try{ document.createEvent("TouchEvent"); return true; }
catch(e){ return false; }
}
// when we change one of the options (noo, how_many, folders) - update the '{how_many} files' string,
// tweak the noo menu for folders/flat view, then reset the page contents based on the current OPT values
function changeOPT(successCallback) {
OPT.how_many=$('#how_many').val()
// changes invalidate page cache so clear it out
document.page.length=0
new_f=$('#folders').val()
new_f=( new_f == 'True' )
// if change to/from folders, also fix the noo menu
if( new_f != OPT.folders )
{
if( new_f )
{
$('#noo option:lt(2)').prop('disabled', true);
$('#noo').val(OPT.default_folder_noo)
}
else
{
$('#noo option:lt(2)').prop('disabled', false);
$('#noo').val(OPT.default_flat_noo)
}
}
OPT.noo=$('#noo').val()
  // send folders as the raw 'True'/'False' select value; the success handler below coerces it back to a boolean
  OPT.folders=$('#folders').val()
OPT.grouping=$('#grouping').val()
OPT.size=$('input[name="size"]:checked').val();
$.ajax({
type: 'POST',
url: '/change_file_opts',
data: JSON.stringify(OPT),
contentType: 'application/json',
success: function(resp) {
entryList=resp.query_data.entry_list
OPT.how_many=parseInt(OPT.how_many)
pageList=entryList.slice(0, OPT.how_many)
// put data back into booleans, ints, etc
OPT.folders=( OPT.folders == 'True' )
$('.how_many_text').html( `&nbsp;${OPT.how_many} files&nbsp;` )
OPT.size=parseInt(OPT.size)
getPage(1,successCallback)
}
})
}
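// Note (descriptive only): OPT is POSTed with folders/how_many/size still holding the raw
// <select>/<input> string values; the success handler above then coerces them back, e.g.
//   OPT.folders  'True' -> true   via ( OPT.folders == 'True' )
//   OPT.how_many '100'  -> 100    via parseInt(OPT.how_many)
//   OPT.size     '220'  -> 220    via parseInt(OPT.size)
// (the example values are made up; any stricter typing would need /change_file_opts to
// accept real booleans/ints in the JSON body)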
// function to change the size of thumbnails when user clicks xs/s/m/l/xl buttons
function changeSize()
{
sz=$('input[name="size"]:checked').val();
OPT.size=sz
$('.thumb').attr( {height: sz, style: 'font-size:'+sz+'px' } )
$('#size').val(sz)
sz=sz-22
$('.svg').height(sz);
$('.svg').width(sz);
$('.svg_cap').width(sz);
}
// different context menu on files
$.contextMenu({
selector: '.entry',
itemClickEvent: "click",
build: function($triggerElement, e) {
    // when right-clicking & no selection, add one OR deal with ctrl/shift right-click as it always changes the seln
if( NoSel() || e.ctrlKey || e.shiftKey )
{
DoSel(e, e.currentTarget )
SetButtonState();
}
if( FiguresOrDirsOrBoth() == "figure" )
{
item_list = {
details: { name: "Details..." },
view: { name: "View File" },
sep: "---",
}
if( e.currentTarget.getAttribute('type') == 'Image' )
{
item_list['transform'] = {
name: "Transform",
items: {
"r90": { "name" : "Rotate 90 degrees" },
"r180": { "name" : "Rotate 180 degrees" },
"r270": { "name" : "Rotate 270 degrees" },
"fliph": { "name" : "Flip horizontally" },
"flipv": { "name" : "Flip vertically" }
}
}
}
item_list['move'] = { name: "Move selected file(s) to new folder" }
      item_list['sep2'] = "---"
}
else
item_list = {
move: { name: "Move selection(s) to new folder" }
}
item_list['ai'] = {
name: "Scan file for faces",
items: {
"ai-all": { name: "all" }
}
};
// Dynamically add entries for each person in the `people` array
people.forEach(person => {
item_list['ai'].items[`ai-${person.tag}`] = { name: person.tag };
});
if( SelContainsBinAndNotBin() ) {
item_list['both']= { name: 'Cannot delete and restore at same time', disabled: true }
} else {
if (e.currentTarget.getAttribute('path_type') == 'Bin' )
item_list['undel']= { name: "Restore selected file(s)" }
else if( e.currentTarget.getAttribute('type') != 'Directory' )
item_list['del']= { name: "Delete Selected file(s)" }
}
return {
callback: function( key, options) {
if( key == "details" ) { DetailsDBox() }
if( key == "view" ) { startViewing( $(this).attr('id') ) }
if( key == "move" ) { MoveDBox(move_paths) }
if( key == "del" ) { DelDBox('Delete') }
if( key == "undel") { DelDBox('Restore') }
if( key == "r90" ) { Transform(90) }
if( key == "r180" ) { Transform(180) }
if( key == "r270" ) { Transform(270) }
if( key == "fliph" ) { Transform("fliph") }
if( key == "flipv" ) { Transform("flipv") }
if( key.startsWith("ai")) { RunAIOnSeln(key) }
        // don't let this event propagate through the DOM
e.stopPropagation()
},
items: item_list
};
}
});
// finally, for files_ip/files_sp/files_rbp - set click inside document (NOT an entry) to remove seln
$(document).on('click', function(e) { $('.highlight').removeClass('highlight') ; SetButtonState() });
document.page=[]

View File

@@ -1,27 +1,47 @@
function handleTransformFiles(data,id,job_id)
{
if( data.finished )
{
$('#s'+id).hide()
$('#'+id).find('img.thumb').attr('style', 'filter: color(100%);' );
$('#'+id).addClass('entry')
$('#'+id).find('.thumb').attr('src', 'data:image/jpeg;base64,'+data.thumbnail)
return false;
}
else
{
setTimeout( function() { CheckTransformJob(id,job_id,handleTransformFiles) }, 1000,id, job_id );
}
}
// POST to a check URL, that will tell us if the transformation has completed,
// if not, try again in 1 second... If it has finished then reset the thumbnail
// to full colour, put it back to being an entry and reset the thumbnail to the
// newly created one that was sent back in the response to the POST
function CheckTransformJob(id,job_id)
function handleTransformViewing(data,id,job_id)
{
if( data.finished )
{
// stop throbber, remove grayscale & then force reload with timestamped version of im.src
grayscale=0
throbber=0
im.src=im.src + '?t=' + new Date().getTime();
return false;
}
else
{
setTimeout( function() { CheckTransformJob(id,job_id,handleTransformViewing) }, 1000,id, job_id );
}
}
// POST to a check URL, that will tell us if the transformation has completed,
// if not, try again in 1 second... If it has finished then reset the thumbnail
// to full colour, put it back to being an entry and reset the thumbnail to the
// newly created one that was sent back in the response to the POST
function CheckTransformJob(id,job_id,successCallback)
{
CheckForJobs()
$.ajax(
{
type: 'POST', data: '&job_id='+job_id, url: '/check_transform_job', success: function(data) {
if( data.finished )
{
$('#s'+id).hide()
$('#'+id).find('img.thumb').attr('style', 'filter: color(100%);' );
$('#'+id).addClass('entry')
$('#'+id).find('.thumb').attr('src', 'data:image/jpeg;base64,'+data.thumbnail)
return false;
}
else
{
setTimeout( function() { CheckTransformJob(id,job_id) }, 1000,id, job_id );
}
},
} )
$.ajax( { type: 'POST', data: '&job_id='+job_id, url: '/check_transform_job', success: function(res) { successCallback(res,id,job_id); } } )
}
// for each highlighted image, POST the transform with amt (90, 180, 270,
@@ -31,9 +51,20 @@ function CheckTransformJob(id,job_id)
// to finish
function Transform(amt)
{
$('.highlight').each(function( id, e ) {
post_data = '&amt='+amt+'&id='+e.id
// we are in the viewer with 1 image only...
if( document.viewing )
{
post_data = '&amt='+amt+'&id='+document.viewing.id
        // send /transform for this image, grayscale the thumbnail, add color spinning wheel overlay, and start checking for job end
$.ajax({ type: 'POST', data: post_data, url: '/transform', success: function(data){ $('#'+e.id).find('img.thumb').attr('style', 'filter: grayscale(100%);' ); $('#'+e.id).removeClass('entry'); $('#s'+e.id).show(); CheckTransformJob(e.id,data.job_id); return false; } })
} )
$.ajax({ type: 'POST', data: post_data, url: '/transform', success: function(data) { grayscale=1; throbber=1; DrawImg(); CheckTransformJob(document.viewing.id,data.job_id,handleTransformViewing); return false; } })
}
else
{
$('.highlight').each(function( id, e ) {
post_data = '&amt='+amt+'&id='+e.id
            // send /transform for this image, grayscale the thumbnail, add color spinning wheel overlay, and start checking for job end
$.ajax({ type: 'POST', data: post_data, url: '/transform', success: function(data){ $('#'+e.id).find('img.thumb').attr('style', 'filter: grayscale(100%);' ); $('#'+e.id).removeClass('entry'); $('#s'+e.id).show(); CheckTransformJob(e.id,data.job_id,handleTransformFiles); return false; } })
} )
}
}
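// Flow summary (descriptive only, derived from the callbacks above): Transform() POSTs
// /transform either for the single viewer image (document.viewing) or for every
// .highlight'ed figure; CheckTransformJob() then polls /check_transform_job once a second,
// re-arming itself via setTimeout until data.finished is set, at which point
// handleTransformViewing() reloads im.src with a cache-busting timestamp, or
// handleTransformFiles() swaps in the new base64 thumbnail and restores the 'entry' class.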

View File

@@ -25,6 +25,8 @@ function NewHeight()
return im.height*gap / (im.width/window.innerWidth)
}
// draw 'str' as a label above the bounding box of the face (with a white
// transparent background to enhance readability of str)
function DrawLabelOnFace(str)
{
// finish the face box; clear out the settings used for the transparent-backed name tag
@@ -85,13 +87,13 @@ function DrawImg()
else
$('.figcaption').hide()
// if we have faces, the enable the toggles, otherwise disable them
// and reset model select too
if( objs[current].faces )
// if we have faces, then enable the toggles, otherwise disable them and reset the model select too
if( document.viewing.file_details.faces.length )
{
$('#faces').attr('disabled', false)
$('#distance').attr('disabled', false)
$('#model').val( Number(objs[current].face_model) )
// first face is good enough as whole file has to have used same model
$('#model').val( document.viewing.file_details.faces[0].facefile_lnk.model_used )
}
else
{
@@ -102,33 +104,37 @@ function DrawImg()
}
// okay, we want faces drawn so lets do it
if( $('#faces').prop('checked') && objs[current].faces )
if( $('#faces').prop('checked') && document.viewing.file_details.faces )
{
faces=document.viewing.file_details.faces
// draw rect on each face
for( i=0; i<objs[current].faces.length; i++ )
for( i=0; i<faces.length; i++ )
{
x = objs[current].faces[i].x / ( im.width/canvas.width )
y = objs[current].faces[i].y / ( im.height/canvas.height )
w = objs[current].faces[i].w / ( im.width/canvas.width )
h = objs[current].faces[i].h / ( im.height/canvas.height )
x = faces[i].face_left / ( im.width/canvas.width )
y = faces[i].face_top / ( im.height/canvas.height )
w = faces[i].w / ( im.width/canvas.width )
h = faces[i].h / ( im.height/canvas.height )
context.beginPath()
context.rect( x, y, w, h )
context.lineWidth = 2
// this face has an override so diff colour
if( objs[current].faces[i].override )
if( faces[i].fnmo.length || faces[i].ffmo.length )
{
context.strokeStyle = 'blue'
DrawLabelOnFace( objs[current].faces[i].override.who )
if( faces[i].ffmo.length )
DrawLabelOnFace( faces[i].ffmo[0].person.tag )
else
DrawLabelOnFace( faces[i].fnmo[0].type.name )
}
else
{
context.strokeStyle = 'green'
if( objs[current].faces[i].who )
if( faces[i].refimg )
{
str=objs[current].faces[i].who
str=faces[i].refimg.person.tag
if( $('#distance').prop('checked') )
str += "("+objs[current].faces[i].distance+")"
str += "("+faces[i].refimg_lnk.face_distance.toFixed(2)+")"
DrawLabelOnFace( str )
}
}
@@ -156,13 +162,15 @@ function FaceToggle()
// also deals with fullsecreen if needed
function ViewImageOrVideo()
{
if( objs[current].type == 'Image' )
// can happen if no content to display
if( ! document.viewing ) return
if( document.viewing.type.name == 'Image' )
{
im.src='../' + objs[current].url
im.src='../' + document.viewing.FullPathOnFS
$('#video_div').hide()
if( $('#fname_toggle').prop('checked' ) )
$('#img-cap').show()
$('#fname_i').html(PrettyFname(objs[current].url))
$('#fname_i').html(PrettyFname(document.viewing.FullPathOnFS))
$('#figure').show()
if( fullscreen )
$('#canvas').get(0).requestFullscreen()
@@ -170,11 +178,11 @@ function ViewImageOrVideo()
if( document.fullscreen )
document.exitFullscreen()
}
if( objs[current].type == 'Video' )
if( document.viewing.type.name == 'Video' )
{
$('#figure').hide()
$('#video').prop('src', '../' + objs[current].url )
$('#fname_v').html(PrettyFname(objs[current].url))
$('#video').prop('src', '../' + document.viewing.FullPathOnFS )
$('#fname_v').html(PrettyFname(document.viewing.FullPathOnFS))
if( $('#fname_toggle').prop('checked' ) )
$('#img-cap').hide()
ResizeVideo()
@@ -189,6 +197,8 @@ function ViewImageOrVideo()
var offsetX,offsetY;
// find the edge of the canvas, so when we have a PAGE event with x,y we can see
// where we clicked in it (PAGE.x - canvas.x to see where in canvas, etc)
function reOffset()
{
var BB=$('#canvas').get(0).getBoundingClientRect();
@@ -196,23 +206,27 @@ function reOffset()
offsetY=BB.top;
}
window.onscroll=function(e){ reOffset(); }
window.onresize=function(e){ reOffset(); }
// when the document is ready, set up the canvas offsets and the viewer's click/context-menu handlers
$(document).ready( function()
{
var cw=$('#canvas').width;
var ch=$('#canvas').height;
reOffset();
// if we scroll or resize the window, the canvas moves on the page, reset the offsets
window.onscroll=function(e){ reOffset(); }
window.onresize=function(e){ reOffset(); }
// clicking in the viewer canvas gets its own handlers to handle faces (or not)
$.contextMenu({
selector: '#canvas',
trigger: 'left',
// trigger: 'none',
hideOnSecondTrigger: true,
    // go through each face and build the appropriate 'left-click' menu.
    // e.g. if it is a known face, show the name, offer to add a refimg to the person, etc.
    // this is a bit complex: item_list is keyed by the action we will perform if that
    // entry is chosen from the menu, along with the data needed to process the action
build: function($triggerElement, e) {
reOffset();
// get mouse position relative to the canvas (left-click uses page*)
@@ -221,30 +235,33 @@ $(document).ready( function()
item_list = { not_a_face: { name: "Not a face", which_face: '-1' } }
for( i=0; i<objs[current].faces.length; i++ )
faces=document.viewing.file_details.faces
for( i=0; i<faces.length; i++ )
{
fx = objs[current].faces[i].x / ( im.width/canvas.width )
fy = objs[current].faces[i].y / ( im.height/canvas.height )
fw = objs[current].faces[i].w / ( im.width/canvas.width )
fh = objs[current].faces[i].h / ( im.height/canvas.height )
fx = faces[i].face_left / ( im.width/canvas.width )
fy = faces[i].face_top / ( im.height/canvas.height )
fw = faces[i].w / ( im.width/canvas.width )
fh = faces[i].h / ( im.height/canvas.height )
if( x >= fx && x <= fx+fw && y >= fy && y <= fy+fh )
{
if( objs[current].faces[i].override )
if( faces[i].ffmo.length || faces[i].fnmo.length )
{
item_list['remove_force_match_override']={ 'name': 'Remove override for this face', 'which_face': i, 'id': objs[current].faces[i].id }
}
else if( objs[current].faces[i].who )
item_list['remove_force_match_override']={ 'name': 'Remove override for this face', 'which_face': i, 'id': faces[i].id }
}
else if( faces[i].refimg )
{
item_list['match']={ 'name': objs[current].faces[i].who, 'which_face': i, 'id': objs[current].faces[i].id }
item_list['wrong_person']={ 'name': 'wrong person', 'which_face': i, 'id': objs[current].faces[i].id }
item_list['match']={ 'name': faces[i].refimg.person.tag, 'which_face': i, 'id': faces[i].id }
item_list['match_add_refimg']={ 'name': 'Add this as refimg for ' + faces[i].refimg.person.tag,
'person_id': faces[i].refimg.person.id, 'who': faces[i].refimg.person.tag, 'which_face': i, 'id': faces[i].id, }
item_list['wrong_person']={ 'name': 'wrong person', 'which_face': i, 'id': faces[i].id }
}
else
{
item_list['no_match_new_person']={ 'name': 'Add as reference image to NEW person', 'which_face': i, 'id': objs[current].faces[i].id }
item_list['no_match_new_refimg']={ 'name': 'Add as reference image to EXISTING person', 'which_face': i, 'id': objs[current].faces[i].id }
item_list['no_match_new_person']={ 'name': 'Add as reference image to NEW person', 'which_face': i, 'id': faces[i].id }
item_list['no_match_new_refimg']={ 'name': 'Add as reference image to EXISTING person', 'which_face': i, 'id': faces[i].id }
for( var el in NMO ) {
item_list['NMO_'+el]={'type_id': NMO[el].type_id, 'name': 'Override: ' + NMO[el].name, 'which_face': i, 'id': objs[current].faces[i].id }
item_list['NMO_'+el]={'type_id': NMO[el].id, 'name': 'Override: ' + NMO[el].name, 'which_face': i, 'id': faces[i].id }
}
}
delete item_list['not_a_face']
@@ -264,26 +281,15 @@ $(document).ready( function()
} )
} );
// quick wrapper function to make calling this ajax code simpler in SearchForPerson
// POST to the server to force a match for this face to person_id
// FIXME: could I not pass person_id, and use // ...[item[key].which_face].refimg.person.id
function OverrideForceMatch( person_id, key )
{
// we have type_id passed in, so dig the NMO out, and use that below (its really just for name, but in case we change that in the DB)
for( el in NMO )
{
if( NMO[el].type_id == item[key].type_id )
{
fm_idx=el
break
}
}
ofm='&person_id='+person_id+'&face_id='+item[key].id
$.ajax({ type: 'POST', data: ofm, url: '/add_force_match_override', success: function(data) {
objs[current].faces[item[key].which_face].override={}
objs[current].faces[item[key].which_face].override.who=data.person_tag
objs[current].faces[item[key].which_face].override.distance='N/A'
objs[current].faces[item[key].which_face].override.type_id=NMO[fm_idx].id
objs[current].faces[item[key].which_face].override.type_name=NMO[fm_idx].name
document.viewing.file_details.faces[item[key].which_face].ffmo=[]
document.viewing.file_details.faces[item[key].which_face].ffmo[0]={}
document.viewing.file_details.faces[item[key].which_face].ffmo[0].person=data.person
$('#dbox').modal('hide')
$('#faces').prop('checked',true)
DrawImg()
@@ -292,6 +298,23 @@ function OverrideForceMatch( person_id, key )
} )
}
// function that handles the POSTed data that comes back when we add a
// reference image to a new or existing person (right-click on a face)
// used in success callbacks from CreatePersonAndRefimg() and AddRefimgTo()
function handleAddRefimgData(key, data)
{
document.viewing.file_details.faces[item[key].which_face].refimg=data.refimg
document.viewing.file_details.faces[item[key].which_face].refimg_lnk={}
// if we used this img, for now set distance to 0 - it is an exact match!
document.viewing.file_details.faces[item[key].which_face].refimg_lnk.face_distance=0.0
$('#dbox').modal('hide')
$('#faces').prop('checked',true)
DrawImg()
CheckForJobs()
}
// when we right-click a face and make a new person, this code creates and
// associates the face
function CreatePersonAndRefimg( key )
{
d='&face_id='+item[key].id
@@ -300,29 +323,17 @@ function CreatePersonAndRefimg( key )
+'&surname='+$('#surname').val()
+'&refimg_data='+item[key].refimg_data
$.ajax({ type: 'POST', data: d, url: '/match_with_create_person',
success: function(data) {
objs[current].faces[item[key].which_face].who=data.who
objs[current].faces[item[key].which_face].distance=data.distance
$('#dbox').modal('hide')
$('#faces').prop('checked',true)
DrawImg()
CheckForJobs()
}
success: function(data) { handleAddRefimgData(key, data ) },
})
}
// when we right-click a face and connect to an existing person, this connects
// the refimg and associates the face
function AddRefimgTo( person_id, key, search )
{
d='&face_id='+item[key].id+'&person_id='+person_id+'&refimg_data='+item[key].refimg_data+'&search='+search
$.ajax({ type: 'POST', data: d, url: '/add_refimg_to_person',
success: function(data) {
objs[current].faces[item[key].which_face].who=data.who
objs[current].faces[item[key].which_face].distance=data.distance
$('#dbox').modal('hide')
$('#faces').prop('checked',true)
DrawImg()
CheckForJobs()
}
success: function(data) { handleAddRefimgData(key, data ) },
})
}
@@ -339,8 +350,7 @@ function SearchForPerson(content, key, face_id, face_pos, type_id)
for( var el in data ) {
content+='<div class="row">'
var person = data[el];
// NMO_1 is a non-match-override type_id==1 (or force match to existing person)
if( key == "NMO_1" )
if( item[key].name == "Override: Manual match to existing person" )
{
func='OverrideForceMatch('+person.id+',\''+key+'\' )'
content+= '<div class="col">' + person.tag + ' (' + person.firstname+' '+person.surname+ ') </div>'
@@ -363,17 +373,19 @@ function SearchForPerson(content, key, face_id, face_pos, type_id)
return false
}
// if we force a match, this func allows us to POST to the server to remove the override
function RemoveOverrideForceMatch(face_pos)
{
if( objs[current].faces[face_pos].override )
who=objs[current].faces[face_pos].override.who
if( document.viewing.file_details.faces[face_pos].ffmo.length )
who=document.viewing.file_details.faces[face_pos].ffmo[0].person.tag
else
who=objs[current].faces[face_pos].who
who=document.viewing.file_details.faces[face_pos].refimg.person.tag
d='&face_id='+objs[current].faces[face_pos].id+'&person_tag='+who+'&file_eid='+current
d='&face_id='+document.viewing.file_details.faces[face_pos].id+'&person_tag='+who+'&file_eid='+document.viewing.id
$.ajax({ type: 'POST', data: d, url: '/remove_force_match_override',
success: function(data) {
delete objs[current].faces[face_pos].override
// force/delete the ffmo cleanly
document.viewing.file_details.faces[face_pos].ffmo.length=0
$('#dbox').modal('hide')
DrawImg()
CheckForJobs()
@@ -383,12 +395,13 @@ function RemoveOverrideForceMatch(face_pos)
return false
}
// if we force NO match, this func allows us to POST to the server to remove the override
function RemoveOverrideNoMatch(face_pos, type_id)
{
d='&face_id='+objs[current].faces[face_pos].id+'&type_id='+type_id
d='&face_id='+document.viewing.file_details.faces[face_pos].id+'&type_id='+type_id
$.ajax({ type: 'POST', data: d, url: '/remove_no_match_override',
success: function(data) {
delete objs[current].faces[face_pos].override
document.viewing.file_details.faces[face_pos].fnmo.length=0
$('#dbox').modal('hide')
DrawImg()
CheckForJobs()
@@ -398,16 +411,13 @@ function RemoveOverrideNoMatch(face_pos, type_id)
return false
}
// POST to the server to force NO match for this face
function AddNoMatchOverride(type_id, face_id, face_pos, type_id)
{
d='&type_id='+type_id+'&face_id='+face_id
$.ajax({ type: 'POST', data: d, url: '/add_no_match_override',
success: function(data) {
objs[current].faces[face_pos].override={}
objs[current].faces[face_pos].override.who=NMO[type_id].name
objs[current].faces[face_pos].override.distance='N/A'
objs[current].faces[face_pos].override.type_id=type_id
objs[current].faces[face_pos].override.type_name=NMO[type_id].name
document.viewing.file_details.faces[face_pos].fnmo[0]=data
$('#dbox').modal('hide')
$('#faces').prop('checked',true)
DrawImg()
@@ -416,6 +426,9 @@ function AddNoMatchOverride(type_id, face_id, face_pos, type_id)
} )
}
// generate html for the appropriate content to search for a person when adding
// override DBox. It has a button that, when clicked, calls SearchForPerson() which
// POSTs to the server, and fills in the 'search_person_results' div with content
function AddSearch( content, key, face_pos )
{
html='<h5>search for existing person:</h5>'
@@ -455,17 +468,17 @@ function FaceDBox(key, item)
div+='</div><div class="col-6">'
if ( key == 'remove_force_match_override' )
{
if( objs[current].faces[face_pos].override.type_name == 'Manual match to existing person' )
div+='<div class="row col-12">remove this override (force match to: ' + objs[current].faces[face_pos].override.who + ')</div>'
if( document.viewing.file_details.faces[face_pos].ffmo.length )
div+='<div class="row col-12">remove this override (force match to: ' + document.viewing.file_details.faces[face_pos].ffmo[0].person.tag + ')</div>'
else
div+='<div class="row col-12">remove this override (no match)</div>'
div+='<div class="row col-12">remove this override (' + document.viewing.file_details.faces[face_pos].fnmo[0].type.name + ')</div>'
div+='<div class="row">'
div+='<button class="btn btn-outline-info col-6" type="button" onClick="$(\'#dbox\').modal(\'hide\'); return false">Cancel</button>'
div+='<button class="btn btn-outline-danger col-6" type="button" '
if( objs[current].faces[face_pos].override.type_name == 'Manual match to existing person' )
if( document.viewing.file_details.faces[face_pos].ffmo.length )
div+='onClick="RemoveOverrideForceMatch(' +face_pos+ ')">Remove</button>'
else
div+='onClick="RemoveOverrideNoMatch(' +face_pos+','+objs[current].faces[face_pos].override.type_id+ ')">Remove</button>'
div+='onClick="RemoveOverrideNoMatch(' +face_pos+','+document.viewing.file_details.faces[face_pos].fnmo[0].type.id+ ')">Remove</button>'
div+='</div>'
}
if ( key == 'no_match_new_person' )
@@ -494,10 +507,22 @@ function FaceDBox(key, item)
{
div+=AddSearch( 'Click one of the link(s) below to add this face as a reference image to the person:<br><br>', key, face_pos );
}
if ( key == 'match_add_refimg' )
{
func='AddRefimgTo('+item[key]['person_id']+',\''+key+'\''
func_sn=func+ ', true )'
func_ao=func+ ', false )'
div+="Confirm you wish to add this face as a reference image for " + item[key]['who']
div+= '<div class="col">' + item[key]['who'] + '</div><div class="col input-group">'
div+= '<button onClick="'+func_sn+'" class="btn btn-success py-1 input-group-prepend">Add & search now</button>&nbsp;'
div+= '<button onClick="'+func_ao+'" class="btn btn-outline-success py-1 input-group-append">Add only</button></div>'
}
if ( key == 'wrong_person' )
{
div+='<br>wrong person, so mark this as the wrong person/refimg connection, for face#' + item[key]['which_face']
div+='<br>face db id: ' + item[key]['id']
div += '<br>not yet'
}
if( /NMO_/.test(key) )
{
@@ -527,7 +552,7 @@ function FaceDBox(key, item)
// pops results up in a dbox
function JoblogSearch()
{
data="eid="+current
data="eid="+document.viewing.id
$.ajax({ type: 'POST', data: data, url: '/joblog_search', success: function(res) {
data = JSON.parse(res)
div ='<div><table class="table table-striped table-sm sm-txt">'
@@ -544,3 +569,169 @@ function JoblogSearch()
}
})
}
// helper func to reset the src on the video div
function setVideoSource(newSrc) {
$('#videoSource').attr('src', newSrc);
$('#video')[0].load();
}
// function called when we get another page from inside the viewer
function getPageViewer(res, viewingIdx)
{
document.viewing=document.entries[viewingIdx]
// update viewing, arrows and image/video too
ViewImageOrVideo()
}
// handler used when we double click an entry to show it in the viewer
function dblClickToViewEntry(id) {
$('#files_div').addClass('d-none')
$('#viewer_div').removeClass('d-none')
setEntryById( id )
ViewImageOrVideo()
}
// quick function that allows us to go out of the viewer and back; the viewer comes from files_ip/sp
// so just redraw the page with drawPageOfFigures() as we have all the data
function goOutOfViewer()
{
// if this returns -1, we have used arrows to go onto a new page(s)
if( getPageNumberForId( $('#figures').find('.figure').first().prop('id') ) == -1 )
drawPageOfFigures()
// hide viewer div, then show files_div
$('#viewer_div').addClass('d-none')
$('#files_div').removeClass('d-none')
}
// change the viewer to the previous entry (handle page change too)
function getPreviousEntry() {
var currentIndex = entryList.indexOf(document.viewing.id);
oldPageOffset=Math.floor(currentIndex / OPT.how_many)
if (currentIndex > 0) {
currentIndex--;
pageOffset=Math.floor(currentIndex / OPT.how_many)
currentIndex=currentIndex-(pageOffset*OPT.how_many)
    // prev page, load it
if( oldPageOffset != pageOffset )
      // prev page is pageOffset+1 now
getPage(pageOffset+1,getPageViewer,currentIndex)
else
document.viewing=document.entries[currentIndex]
}
}
// change the viewer to the next entry (handle page change too)
function getNextEntry() {
var currentIndex = entryList.indexOf(document.viewing.id);
oldPageOffset=Math.floor(currentIndex / OPT.how_many)
if (currentIndex < entryList.length - 1) {
currentIndex++
pageOffset=Math.floor(currentIndex / OPT.how_many)
currentIndex=currentIndex-(pageOffset*OPT.how_many)
// next page, load it
if( oldPageOffset != pageOffset )
// next page is pageOffset+1 now
getPage(pageOffset+1,getPageViewer,currentIndex)
else
document.viewing=document.entries[currentIndex]
}
}
// check if we are viewing the very first entry (helps to disable la)
function entryIsAtStart() {
return document.viewing.id === entryList[0];
}
// check if we are viewing the very last entry (helps to disable ra)
function entryIsAtEnd() {
return document.viewing.id === entryList[entryList.length - 1];
}
// helper func to ensure document.viewing is the right entry from document.entries array
function setEntryById(id) {
var currentIndex = entryList.indexOf(parseInt(id));
// if we are on a different page, adjust as document.entries only has <= how_many
pageOffset=Math.floor(currentIndex / OPT.how_many)
currentIndex = currentIndex-(pageOffset*OPT.how_many)
document.viewing=document.entries[currentIndex]
}
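// Worked example for setEntryById (hypothetical numbers, illustration only): with
// OPT.how_many = 5 and entryList.indexOf(id) == 7, pageOffset = Math.floor(7/5) = 1 and
// currentIndex becomes 7 - 1*5 = 2, so document.viewing = document.entries[2] - i.e. the
// third entry of the page data currently held in document.entries.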
// disable la button if we are viewing first entry and/or ra button if we are viewing last entry
function setDisabledForViewingNextPrevBttons()
{
$('#la').attr('disabled', entryIsAtStart());
$('#ra').attr('disabled', entryIsAtEnd());
}
// when we go into the viewer, the keybindings are set here for items like 'f' for face box/name
function addViewerKeyHandler() {
// allow a keypress on the viewer_div
$(document).keydown(function(event) {
    // if dbox is visible, don't process this hot-key; we are typing text into its inputs instead
if( $("#dbox").is(':visible') )
return
switch (event.key)
{
case "Left": // IE/Edge specific value
case "ArrowLeft":
if( $('#la').prop('disabled') == false )
$('#la').click()
break;
case "Right": // IE/Edge specific value
case "ArrowRight":
if( $('#ra').prop('disabled') == false )
$('#ra').click()
break;
case "d":
$('#distance').click()
break;
case "f":
$('#faces').click()
break;
case "n":
$('#fname_toggle').click()
break;
case "F":
fullscreen=!document.fullscreen
ViewImageOrVideo()
break;
case "l":
JoblogSearch()
break;
case "Delete":
        $('#del').click()
        break;
default:
return; // Quit when this doesn't handle the key event.
}
});
}
// left arrow onclick handler to go to prev image from inside the viewer
function prevImageInViewer()
{
getPreviousEntry()
setDisabledForViewingNextPrevBttons()
ViewImageOrVideo()
}
// right arrow onclick handler to go to next image from inside the viewer
function nextImageInViewer()
{
getNextEntry()
setDisabledForViewingNextPrevBttons()
ViewImageOrVideo()
}
// wrapper func to start the viewer - needed as we have a dbl-click & View file
// to start the viewer
function startViewing(eid)
{
dblClickToViewEntry( eid );
setDisabledForViewingNextPrevBttons();
addViewerKeyHandler()
}

View File

@@ -1,37 +0,0 @@
// POST to a check URL, that will tell us if the transformation has completed,
// if not, try again in 1 second... If it has finished then reset the thumbnail
// to full colour, put it back to being an entry and reset the thumbnail to the
// newly created one that was sent back in the response to the POST
function CheckTransformJob(id,job_id)
{
CheckForJobs()
$.ajax(
{
type: 'POST', data: '&job_id='+job_id, url: '/check_transform_job', success: function(data) {
if( data.finished )
{
// stop throbber, remove grayscale & then force reload with timestamped version of im.src
grayscale=0
throbber=0
im.src=im.src + '?t=' + new Date().getTime();
return false;
}
else
{
setTimeout( function() { CheckTransformJob(id,job_id) }, 1000,id, job_id );
}
},
} )
}
// for each highlighted image, POST the transform with amt (90, 180, 270,
// fliph, flipv) which will let the job manager know what to do to this file.
// we also grayscale the thumbnail out, remove the entry class for now, show
// the spinning wheel, and finally kick off the checking for the transform job
// to finish
function Transform(amt)
{
post_data = '&amt='+amt+'&id='+current
    // send /transform for this image, grayscale the thumbnail, add color spinning wheel overlay, and start checking for job end
$.ajax({ type: 'POST', data: post_data, url: '/transform', success: function(data) { grayscale=1; throbber=1; DrawImg(); CheckTransformJob(current,data.job_id); return false; } })
}

BIN
internal/pa-logo.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 18 KiB

BIN
internal/rot180-invert.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 34 KiB

BIN
internal/rot180.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 9.3 KiB

BIN
internal/rot270-invert.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 36 KiB

BIN
internal/rot270.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 9.2 KiB

BIN
internal/rot90-invert.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 30 KiB

BIN
internal/rot90.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 8.4 KiB

107
internal/rot90.svg Normal file
View File

@@ -0,0 +1,107 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
width="265mm"
height="297mm"
viewBox="0 0 265 297"
version="1.1"
id="svg5"
inkscape:version="1.2.2 (b0a8486541, 2022-12-01)"
sodipodi:docname="rot90.svg"
inkscape:export-filename="rot90.svg"
inkscape:export-xdpi="96"
inkscape:export-ydpi="96"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg">
<sodipodi:namedview
id="namedview7"
pagecolor="#ffffff"
bordercolor="#000000"
borderopacity="0.25"
inkscape:showpageshadow="2"
inkscape:pageopacity="0.0"
inkscape:pagecheckerboard="0"
inkscape:deskcolor="#d1d1d1"
inkscape:document-units="mm"
showgrid="false"
inkscape:zoom="0.914906"
inkscape:cx="222.97373"
inkscape:cy="261.22902"
inkscape:window-width="3840"
inkscape:window-height="2128"
inkscape:window-x="0"
inkscape:window-y="0"
inkscape:window-maximized="1"
inkscape:current-layer="svg5" />
<defs
id="defs2">
<marker
style="overflow:visible"
id="Arrow2"
refX="0"
refY="0"
orient="95"
inkscape:stockid="Arrow2"
markerWidth="2.7"
markerHeight="3.964"
viewBox="0 0 7.7 5.6"
inkscape:isstock="true"
inkscape:collect="always"
preserveAspectRatio="none">
<path
transform="scale(0.7)"
d="M -2,-4 9,0 -2,4 c 2,-2.33 2,-5.66 0,-8 z"
style="fill:context-stroke;fill-rule:evenodd;stroke:none"
id="arrow2L" />
</marker>
</defs>
<g
inkscape:label="Layer 1"
inkscape:groupmode="layer"
id="layer1">
<ellipse
style="font-variation-settings:normal;display:inline;opacity:1;fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#000000;stroke-width:10;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:20,50;stroke-dashoffset:2;stroke-opacity:1;paint-order:markers stroke fill;stop-color:#000000;stop-opacity:1"
id="path2244"
cx="131.0041"
cy="137.873"
rx="98.982185"
ry="108.3931"
inkscape:label="dashes" />
</g>
<g
inkscape:groupmode="layer"
id="layer2"
inkscape:label="270-arc" />
<path
style="font-variation-settings:normal;display:inline;fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#000000;stroke-width:12;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;marker-end:url(#Arrow2);paint-order:markers stroke fill;stop-color:#000000"
id="path2244-3"
inkscape:label="arc"
sodipodi:type="arc"
sodipodi:cx="131.08505"
sodipodi:cy="137.8826"
sodipodi:rx="98.982185"
sodipodi:ry="108.3931"
sodipodi:start="4.7127986"
sodipodi:end="0"
sodipodi:arc-type="arc"
d="M 131.1256,29.489511 A 98.982185,108.3931 0 0 1 230.06724,137.8826"
inkscape:export-filename="rot90.svg"
inkscape:export-xdpi="96"
inkscape:export-ydpi="96"
sodipodi:open="true" />
<text
xml:space="preserve"
style="font-style:normal;font-variant:normal;font-weight:normal;font-stretch:normal;font-size:90.3111px;font-family:sans-serif;-inkscape-font-specification:'sans-serif, Normal';font-variant-ligatures:normal;font-variant-caps:normal;font-variant-numeric:normal;font-variant-east-asian:normal;fill:#000000;stroke:none;stroke-width:4;stroke-linecap:round;stroke-linejoin:bevel;stroke-dasharray:none;stroke-dashoffset:0;paint-order:markers stroke fill"
x="74.990334"
y="174.16937"
id="text2468"
inkscape:label="270-text"><tspan
sodipodi:role="line"
id="tspan2466"
style="font-style:normal;font-variant:normal;font-weight:bold;font-stretch:normal;font-size:95.25px;font-family:sans-serif;-inkscape-font-specification:'sans-serif, Bold';font-variant-ligatures:normal;font-variant-caps:normal;font-variant-numeric:normal;font-variant-east-asian:normal;fill:#000000;stroke:none;stroke-width:4;stroke-dasharray:none"
x="74.990334"
y="174.16937">90</tspan></text>
</svg>

After

Width:  |  Height:  |  Size: 4.2 KiB

BIN
internal/white-circle.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 3.2 KiB

29
job.py
View File

@@ -3,7 +3,7 @@ from flask_wtf import FlaskForm
from flask import request, render_template, redirect, make_response, jsonify, url_for
from settings import Settings
from main import db, app, ma
from sqlalchemy import Sequence, func
from sqlalchemy import Sequence, func, select
from sqlalchemy.exc import SQLAlchemyError
from datetime import datetime, timedelta
import pytz
@@ -78,13 +78,6 @@ class PA_JobManager_Message(PA,db.Model):
job = db.relationship ("Job" )
################################################################################
# GetJM_Message: used in html to display any message for this front-end
################################################################################
def GetJM_Message():
    msg=PA_JobManager_Message.query.first()
return msg
################################################################################
# Used in main html to show a red badge of # jobs to draw attention there are
# active jobs being processed in the background
@@ -178,7 +171,9 @@ def jobs():
jobs = Job.query.order_by(Job.id.desc()).all()
else:
page_title='Job list (recent)'
jobs = Job.query.filter( Job.last_update >= (func.now() - func.cast(concat(settings.job_archive_age, 'DAYS'), INTERVAL)) ).order_by(Job.id.desc()).all()
# work out cutoff in python (used to do this in sql and it was too slow)
cutoff = datetime.now() - timedelta(days=settings.job_archive_age)
jobs = Job.query.filter( Job.last_update >= cutoff ).order_by(Job.id.desc()).all()
return render_template("jobs.html", jobs=jobs, page_title=page_title)
@@ -188,7 +183,7 @@ def jobs():
@app.route("/job/<id>", methods=["GET","POST"])
@login_required
def joblog(id):
joblog = Job.query.get(id)
joblog = db.session.get(Job,id)
if request.method == 'POST':
logs=Joblog.query.filter(Joblog.job_id==id).order_by(Joblog.log_date).all()
@@ -285,7 +280,8 @@ def joblog_search():
from sqlalchemy import text
eid=request.form['eid']
ent=Entry.query.get(eid)
stmt = select(Entry).where(Entry.id == eid)
ent = db.session.scalars(stmt).one_or_none()
logs=Joblog.query.join(Job).filter(Joblog.log.ilike(text(f"'%%{ent.name}%%'"))).with_entities(Joblog.log, Job.id, Job.name, Job.state, Joblog.log_date).all()
# turn DB output into json and return it to the f/e
@@ -344,3 +340,14 @@ def _jinja2_filter_datetime(date, fmt=None):
return date.strftime("%d/%m/%Y %I:%M:%S %p")
else:
return "N/A"
################################################################################
# allow a way to force the messages to be deleted if really needed - its a bit
# lame, but a quick fix
################################################################################
@app.route('/force_clear')
@login_required
def force_clear():
PA_JobManager_Message.query.delete();
db.session.commit()
return redirect("/")

26
main.py
View File

@@ -11,7 +11,7 @@ from datetime import datetime
import os
import re
import socket
from shared import CreateSelect, CreateFoldersSelect, LocationIcon, DB_URL, PROD_HOST, OLDEST_LOG_LIMIT
from shared import CreateSelect, CreateFoldersSelect, LocationIcon, DB_URL, OLDEST_LOG_LIMIT
# for ldap auth
from flask_ldap3_login import LDAP3LoginManager
@@ -60,15 +60,18 @@ except Exception:
app.config['SECRET_KEY'] = b'my_insecure_PA_token_with_random_2134876adsfjhlkasdf87'
# ldap config vars: (the last one is required, or python ldap freaks out)
app.config['LDAP_HOST'] = 'mara.ddp.net'
app.config['LDAP_HOST'] = 'ldap.ddp.net'
app.config['LDAP_BASE_DN'] = 'dc=depaoli,dc=id,dc=au'
app.config['LDAP_USER_DN'] = 'ou=users'
app.config['LDAP_GROUP_DN'] = 'ou=groups'
app.config['LDAP_USER_RDN_ATTR'] = 'cn'
app.config['LDAP_USER_RDN_ATTR'] = 'uid'
app.config['LDAP_USER_LOGIN_ATTR'] = 'uid'
app.config['LDAP_BIND_USER_DN'] = None
app.config['LDAP_BIND_USER_PASSWORD'] = None
app.config['LDAP_GROUP_OBJECT_FILTER'] = '(objectclass=posixGroup)'
app.config['LDAP_BIND_USER_DN'] = None
app.config['LDAP_BIND_USER_PASSWORD'] = None
db = SQLAlchemy(app) # create the (flask) sqlalchemy connection
ma = Marshmallow(app) # set up Marshmallow - data marshalling / serialising
@@ -86,6 +89,7 @@ from files import Entry
from person import Person
from settings import Settings
from user import PAUser
from states import PA_UserState
####################################### GLOBALS #######################################
# allow jinja2 to call these python functions directly
@@ -142,12 +146,17 @@ def login():
# ldap-injecting our username, so send them back to the login page instead
if request.method == 'POST' and re.search( r'[()\\*&!]', request.form['username']):
from job import SetFELog
SetFELog( message=f"ERROR: Detected special LDAP chars in username: {request.form['username']}", log_level="error", persistent=True, cant_close=True )
SetFELog( message=f"ERROR: Detected special LDAP chars in username: {request.form['username']}", level="danger", persistent=True, cant_close=True )
return redirect(url_for('login'))
if form.validate_on_submit():
# Successfully logged in, We can now access the saved user object via form.user.
login_user(form.user, remember=True) # Tell flask-login to log them in.
next = request.args.get("next")
# just (re)-authenticated, so clear old state from UserState, to avoid re-using old data that is no longer valid
PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn).delete()
db.session.commit()
if next:
return redirect(next) # Send them back where they came from
else:
@@ -237,15 +246,16 @@ def logout():
logout_user()
return redirect('/login')
# quick health route so traefik knows we are up
@app.route('/health')
def health():
return {"status": "ok"}, 200
###############################################################################
# main to be called via Flask/Gunicorn
###############################################################################
def main():
if hostname == PROD_HOST:
app.run(ssl_context=('/etc/letsencrypt/live/pa.depaoli.id.au/cert.pem', '/etc/letsencrypt/live/pa.depaoli.id.au/privkey.pem'), host="0.0.0.0", debug=False)
else:
app.run(host="0.0.0.0", debug=True)
app.run(host="0.0.0.0", debug=True)
###############################################################################
# This func creates a new filter in jinja2 to test to hand back the username

View File

@@ -1,4 +1,4 @@
#
#
# This file controls the 'external' job control manager, that (periodically #
# looks / somehow is pushed an event?) picks up new jobs, and processes them.
@@ -15,7 +15,7 @@
### SQLALCHEMY IMPORTS ###
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String, Sequence, Float, ForeignKey, DateTime, LargeBinary, Boolean, func
from sqlalchemy import Column, Integer, String, Sequence, Float, ForeignKey, DateTime, LargeBinary, Boolean, func, text
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import relationship
from sqlalchemy import create_engine
@@ -23,7 +23,7 @@ from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm import scoped_session
### LOCAL FILE IMPORTS ###
from shared import DB_URL, PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT, THUMBSIZE, SymlinkName, GenThumb, SECS_IN_A_DAY
from shared import DB_URL, PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT, THUMBSIZE, SymlinkName, GenThumb, SECS_IN_A_DAY, PA_EXIF_ROTATER
from datetime import datetime, timedelta, date
### PYTHON LIB IMPORTS ###
@@ -45,12 +45,18 @@ import face_recognition
import re
import sys
import ffmpeg
import subprocess
# global debug setting
if 'ENV' not in os.environ or os.environ['ENV'] != "production":
if 'ENV' not in os.environ:
DEBUG=True
print("PA job manager starting as non-production -- ENV not set! Please set ENV, I'd rather not guess what to do")
elif os.environ['ENV'] != "production":
DEBUG=True
print(f"PA job manager starting as non-production -- ENV={os.environ['ENV']}")
else:
print("PA job manager starting as Production")
DEBUG=False
# global list of override tables to allow enumeration over them ...
@@ -507,35 +513,18 @@ class PA_JobManager_FE_Message(Base):
return "<id: {}, job_id: {}, level: {}, message: {}".format(self.id, self.job_id, self.level, self.message)
class PA_UserState(Base):
__tablename__ = "pa_user_state"
id = Column(Integer, Sequence('pa_user_state_id_seq'), primary_key=True )
pa_user_dn = Column(String, ForeignKey('pa_user.dn'), primary_key=True )
last_used = Column(DateTime(timezone=True))
path_type = Column(String, primary_key=True, unique=False, nullable=False )
noo = Column(String, unique=False, nullable=False )
grouping = Column(String, unique=False, nullable=False )
how_many = Column(Integer, unique=False, nullable=False )
st_offset = Column(Integer, unique=False, nullable=False )
size = Column(Integer, unique=False, nullable=False )
folders = Column(Boolean, unique=False, nullable=False )
root = Column(String, unique=False, nullable=False )
cwd = Column(String, unique=False, nullable=False )
## for now being lazy and not doing a separate table until I settle on needed fields and when
# only used if ptype == View
view_eid = Column(Integer, unique=False, nullable=False )
orig_ptype = Column(String, unique=False, nullable=False )
# only used if view and orig_ptype was search
orig_search_term = Column(String, unique=False, nullable=False )
orig_url = Column(String, unique=False, nullable=False )
current = Column(Integer)
first_eid = Column(Integer)
last_eid = Column(Integer)
num_entries = Column(Integer)
def __repr__(self):
return f"<pa_user_dn: {self.pa_user_dn}, path_type: {self.path_type}, noo: {self.noo}, grouping: {self.grouping}, how_many: {self.how_many}, st_offset: {self.st_offset}, size: {self.size}, folders: {self.folders}, root: {self.root}, cwd: {self.cwd}, view_eid: {self.view_eid}, orig_ptype: {self.orig_ptype}, orig_search_term: {self.orig_search_term}, orig_url: {self.orig_url}, current={self.current}, first_eid={self.first_eid}, last_eid={self.last_eid}, num_entries={self.num_entries}>"
##############################################################################
# PAprint(): convenience function to prepend a timestamp to a printed string
##############################################################################
def PAprint( msg ):
now=datetime.now().strftime('%Y-%m-%d %H:%M:%S.%f')
if 'INFO:' in msg or 'WARN:' in msg:
msg = ' '+msg
if 'DEBUG:' in msg or 'ERROR:' in msg or 'WARN:' in msg or 'INFO:' in msg:
print( f"{now} {msg}" )
else:
print( f"{now} DEBUG: {msg}" )
return
##############################################################################
# NewJob(): convenience function to create a job, appropriately
@@ -576,7 +565,7 @@ def MessageToFE( job_id, message, level, persistent, cant_close ):
def SettingsRBPath():
settings = session.query(Settings).first()
if settings == None:
print("ERROR: Cannot create file data with no settings / recycle bin path is missing")
PAprint("ERROR: Cannot create file data with no settings / recycle bin path is missing")
return None
# path setting is an absolute path, just use it, otherwise prepend base_path first
if settings.recycle_bin_path[0] == '/':
@@ -614,7 +603,7 @@ def ProcessRecycleBinDir(job):
def SettingsSPath():
settings = session.query(Settings).first()
if settings == None or settings.storage_path == "":
print("ERROR: Cannot create file data with no settings / storage path is missing")
PAprint("ERROR: Cannot create file data with no settings / storage path is missing")
return None
if settings.storage_path[0] == '/':
path=settings.storage_path
@@ -641,10 +630,9 @@ def ProcessStorageDirs(parent_job):
# SettingsIPath(): return import path (abs or add base_path if needed)
##############################################################################
def SettingsIPath():
paths=[]
settings = session.query(Settings).first()
if not settings or settings.import_path == "":
print("ERROR: Cannot create file data with no settings / import path is missing")
PAprint("ERROR: Cannot create file data with no settings / import path is missing")
return None
if settings.import_path[0] == '/':
path=settings.import_path
@@ -838,7 +826,7 @@ def JobMetadata(job):
def AddLogForJob(job, message):
now=datetime.now(pytz.utc)
log=Joblog( job_id=job.id, log=message, log_date=now )
job.last_update=datetime.now(pytz.utc)
job.last_update=now
session.add(log)
# some logs have DEBUG: in front, so clean that up
message = message.replace("DEBUG:", "" )
@@ -850,7 +838,7 @@ def AddLogForJob(job, message):
else:
job.last_commit = now
if DEBUG:
print( f"DEBUG: {message}" )
PAprint( f"{message}" )
return
##############################################################################
@@ -919,7 +907,7 @@ def FinishJob(job, last_log, state="Completed", pa_job_state="Completed", level=
session.commit()
MessageToFE( job_id=job.id, message=last_log, level=level, persistent=persistent, cant_close=cant_close )
if DEBUG:
print( f"DEBUG: {last_log}" )
PAprint( f"DEBUG: {last_log}" )
return
##############################################################################
@@ -930,13 +918,13 @@ def FinishJob(job, last_log, state="Completed", pa_job_state="Completed", level=
##############################################################################
def HandleJobs(first_run=False):
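# on the first run, anything still 'In Progress' is left over from a previous crash and is marked Stale;
# otherwise pick up any non-complete jobs, honouring wait_for dependencies before running them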
if first_run:
print("INFO: PA job manager is starting up - check for stale jobs" )
PAprint("INFO: PA job manager is starting up - check for stale jobs" )
else:
if DEBUG:
print("DEBUG: PA job manager is scanning for new jobs to process")
PAprint("DEBUG: PA job manager is scanning for new jobs to process")
for job in session.query(Job).filter(Job.pa_job_state != 'Complete').all():
if first_run and job.pa_job_state == 'In Progress':
print( f"INFO: Found stale job#{job.id} - {job.name}" )
PAprint( f"INFO: Found stale job#{job.id} - {job.name}" )
job.pa_job_state = 'Stale'
session.add(job)
AddLogForJob( job, "ERROR: Job has been marked stale as it did not complete" )
@@ -948,17 +936,17 @@ def HandleJobs(first_run=False):
j2 = session.query(Job).get(job.wait_for)
if not j2:
AddLogForJob( job, f"ERROR: waiting for a job#({job.wait_for}) that does not exist? ")
print(f"ERROR: job.wait_for ({job.wait_for}) does not exist in below? " )
PAprint(f"ERROR: job.wait_for ({job.wait_for}) does not exist in below? " )
for j in session.query(Job).all():
print(f"ERROR: j={j.id}")
PAprint(f"ERROR: j={j.id}")
continue
if j2.pa_job_state != 'Completed':
continue
# use this to remove threads for easier debugging, and errors will stacktrace to the console
if DEBUG:
print("*************************************")
print("RUNNING job: id={} name={} wait_for={}".format(job.id, job.name, job.wait_for ))
PAprint("*************************************")
PAprint("RUNNING job: id={} name={} wait_for={}".format(job.id, job.name, job.wait_for ))
RunJob(job)
else:
try:
@@ -968,8 +956,8 @@ def HandleJobs(first_run=False):
try:
MessageToFE( job_id=job.id, level="danger", message="Failed with: {} (try job log for details)".format(e), persistent=True, cant_close=False )
except Exception as e2:
print("ERROR: Failed to let front-end know, but back-end Failed to run job (id: {}, name: {} -- orig exep was: {}, this exception was: {})".format( job.id, job.name, e, e2) )
print("INFO: PA job manager is waiting for a job")
PAprint("ERROR: Failed to let front-end know, but back-end Failed to run job (id: {}, name: {} -- orig exep was: {}, this exception was: {})".format( job.id, job.name, e, e2) )
PAprint("INFO: PA job manager is waiting for a job")
return
##############################################################################
@@ -1107,7 +1095,6 @@ def DisconnectAllOverrides(job):
def JobForceScan(job):
JobProgressState( job, "In Progress" )
DisconnectAllOverrides(job)
session.query(PA_UserState).delete()
session.query(FaceFileLink).delete()
session.query(FaceRefimgLink).delete()
session.query(Face).delete()
@@ -1132,14 +1119,14 @@ def JobForceScan(job):
def CreateSymlink(job,ptype,path):
path_type = session.query(PathType).get(ptype)
symlink=SymlinkName(path_type.name, path, path)
if not os.path.exists(symlink):
print( f"INFO: symlink does not exist, actually creating it -- s={symlink}" )
try:
try:
if not os.path.exists(symlink):
PAprint( f"INFO: symlink does not exist, actually creating it -- s={symlink}" )
# SAFE: SafePath() on init forces symlink to be safe
os.makedirs( os.path.dirname(symlink), mode=0o777, exist_ok=True )
os.symlink(path, symlink)
except Exception as e:
AddLogForJob( job, f"ERROR: Failed to create symlink - tried to link {symlink} -> {path}: {e}")
r=os.makedirs( os.path.dirname(symlink), mode=0o777, exist_ok=True )
r=os.symlink(path, symlink)
except Exception as e:
AddLogForJob( job, f"ERROR: Failed to create symlink - tried to link {symlink} -> {path}: {e}")
return symlink
##############################################################################
@@ -1251,25 +1238,25 @@ def RemoveEmptyDirFromDB( job, del_me ):
####################################################################################################################################
def CleanUpDirInDB(job, e):
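# walk up the tree: if this dir has no entries left, remove it from the FS and DB, then re-check its parent (never deleting the path root)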
session.commit()
print( f"CleanUpDirInDB(): checking dir: {e.FullPathOnFS()} ({e.id})" )
PAprint( f"CleanUpDirInDB(): checking dir: {e.FullPathOnFS()} ({e.id})" )
content = session.query(Entry).join(EntryDirLink).filter(EntryDirLink.dir_eid==e.id).first()
if not content:
print( f" Dir {e.FullPathOnFS()} - {e.id} is empty - removing it" )
print( f" Entry {e}" )
PAprint( f" Dir {e.FullPathOnFS()} - {e.id} is empty - removing it" )
PAprint( f" Entry {e}" )
# if no in_dir, we are at the root of the path, STOP
if not e.in_dir:
print( " Parent is empty, so NEVER delete this entry, returning" )
PAprint( " Parent is empty, so NEVER delete this entry, returning" )
return
# okay remove this empty dir
RemoveEmtpyDirFromFS( job, e )
RemoveEmptyDirFromDB( job, e )
# get an Entry from DB (in_dir is a Dir/we need the ORM entry for code to work)
parent_dir = session.query(Entry).get(e.in_dir.eid)
print( f" Dir {e.FullPathOnFS()} is in {parent_dir.FullPathOnFS()} ({parent_dir.id}) -> check next" )
PAprint( f" Dir {e.FullPathOnFS()} is in {parent_dir.FullPathOnFS()} ({parent_dir.id}) -> check next" )
# check to see if removing the empty dir has left the parent dir empty
CleanUpDirInDB(job, parent_dir)
else:
print( f"There is content (first entry: {content.name}) in {e.FullPathOnFS()} - finished for this dir" )
PAprint( f"There is content (first entry: {content.name}) in {e.FullPathOnFS()} - finished for this dir" )
return
####################################################################################################################################
@@ -1359,7 +1346,7 @@ def MoveFileToRecycleBin(job,del_me):
# SAFE: as SafePaths(rbpath) combined with data I control in this func (explicit remove of 'static/' + DB entry path)
os.replace( src, dst )
if DEBUG:
print( f"MoveFileToRecycleBin({job.id},{del_me.name}): os.replace {src} with {dst} " )
PAprint( f"MoveFileToRecycleBin({job.id},{del_me.name}): os.replace {src} with {dst} " )
except Exception as e:
AddLogForJob( job, f"ERROR: Failed to remove file from filesystem - which={src}, err: {e}")
@@ -1413,7 +1400,7 @@ def MoveFileToRecycleBin(job,del_me):
####################################################################################################################################
def MoveEntriesToOtherFolder(job, move_me, dst_storage_path, dst_rel_path):
if DEBUG:
print( f"DEBUG: MoveEntriesToOtherFolder( job={job.id}, move_me={move_me.name}, dst_storage_path={dst_storage_path.id}, dst_rel_path={dst_rel_path})")
PAprint( f"DEBUG: MoveEntriesToOtherFolder( job={job.id}, move_me={move_me.name}, dst_storage_path={dst_storage_path.id}, dst_rel_path={dst_rel_path})")
orig_name=move_me.name
orig_fs_pos=move_me.FullPathOnFS()
@@ -1513,16 +1500,16 @@ def CreateFSLocation( job, dst_path, dst_locn ):
####################################################################################################################################
def ResetAnySubdirPaths( moving_dir, dst_storage_path, parent_rel_path ):
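# recursively re-point in_path / rel_path for every sub-directory under a dir that has just been moved to another storage path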
if DEBUG:
print( f"ResetAnySubdirPaths( {moving_dir.name}, {dst_storage_path.path_prefix}, {parent_rel_path} )" )
PAprint( f"ResetAnySubdirPaths( {moving_dir.name}, {dst_storage_path.path_prefix}, {parent_rel_path} )" )
sub_dirs = session.query(Entry).join(FileType).join(EntryDirLink).filter(EntryDirLink.dir_eid==moving_dir.id).filter(FileType.name=='Directory').all()
for sub in sub_dirs:
if DEBUG:
print( f"ResetAnySubdirPaths: WAS sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
PAprint( f"ResetAnySubdirPaths: WAS sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
sub.in_path = dst_storage_path
sub.dir_details.in_path = dst_storage_path
sub.dir_details.rel_path = parent_rel_path + '/' + sub.name
if DEBUG:
print( f"ResetAnySubdirPaths: NOW sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
PAprint( f"ResetAnySubdirPaths: NOW sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
ResetAnySubdirPaths( sub, dst_storage_path, sub.dir_details.rel_path )
return
@@ -1644,18 +1631,6 @@ def find_last_successful_ai_scan(job):
return ai_job.last_update.timestamp()
return 0
####################################################################################################################################
# when an import job actually finds new files, then the pa_user_state caches will become invalid (offsets are now wrong)
####################################################################################################################################
def DeleteOldPA_UserState(job):
# clear them out for now - this is 'dumb', just delete ALL. Eventually, can do this based on just the path &/or whether the last_used is
# newer than this delete moment (only would be a race condition between an import changing things and someone simultaneously viewing)
# path=[jex.value for jex in job.extra if jex.name == "path"][0]
session.query(PA_UserState).delete()
return
####################################################################################################################################
# JobImportDir(): job that scans the import dir and processes the entries in there - key function that uses os.walk() to traverse the
# file system and calls AddFile()/AddDir() as necessary
@@ -1668,7 +1643,7 @@ def JobImportDir(job):
ptype = session.query(PathType).get(path_type)
AddLogForJob(job, f"Checking {ptype.name} Directory: {path}" )
if DEBUG:
print( f"DEBUG: Checking Directory: {path}" )
PAprint( f"DEBUG: Checking Directory: {path}" )
if not os.path.exists( path ):
WithdrawDependantJobs( job, job.id, "scan job found no new files to process" )
FinishJob( job, f"Finished Importing: {path} -- Path does not exist", "Failed" )
@@ -1739,7 +1714,7 @@ def JobImportDir(job):
# use ctime as even a metadata change (mv'd file on the fs, or a perms change) needs to be checked
if stat.st_ctime > dir.last_import_date:
if DEBUG:
print("DEBUG: {} - {} is newer than {}".format( basename, stat.st_ctime, dir.last_import_date ) )
PAprint("DEBUG: {} - {} is newer than {}".format( basename, stat.st_ctime, dir.last_import_date ) )
if isImage(fname):
type_str = 'Image'
elif isVideo(fname):
@@ -1753,7 +1728,7 @@ def JobImportDir(job):
found_new_files += 1
else:
if DEBUG:
print( f"DEBUG: { basename} - {stat.st_ctime} is OLDER than {dir.last_import_date}" )
PAprint( f"DEBUG: { basename} - {stat.st_ctime} is OLDER than {dir.last_import_date}" )
e=session.query(Entry).join(EntryDirLink).join(Dir).filter(Entry.name==basename,Dir.eid==dir.eid).first()
e.exists_on_fs=True
job.current_file=basename
@@ -1764,8 +1739,6 @@ def JobImportDir(job):
if found_new_files:
job.extra.append( JobExtra( name="new_files", value=str(found_new_files) ) )
session.add(job)
# this will invalidate pa_user_state for this path's contents (offsets are now wrong), clear them out
DeleteOldPA_UserState(job)
rm_cnt=HandleAnyFSDeletions(job)
@@ -1900,26 +1873,29 @@ def JobTransformImage(job):
id=[jex.value for jex in job.extra if jex.name == "id"][0]
amt=[jex.value for jex in job.extra if jex.name == "amt"][0]
e=session.query(Entry).join(File).filter(Entry.id==id).first()
print( f"JobTransformImage: job={job.id}, id={id}, amt={amt}" )
im = Image.open( e.FullPathOnFS() )
PAprint( f"JobTransformImage: job={job.id}, id={id}, amt={amt}" )
if amt == "fliph":
AddLogForJob(job, f"INFO: Flipping {e.FullPathOnFS()} horizontally" )
im = Image.open( e.FullPathOnFS() )
out = im.transpose(Image.FLIP_LEFT_RIGHT)
out.save( e.FullPathOnFS() )
elif amt == "flipv":
AddLogForJob(job, f"INFO: Flipping {e.FullPathOnFS()} vertically" )
im = Image.open( e.FullPathOnFS() )
out = im.transpose(Image.FLIP_TOP_BOTTOM)
out.save( e.FullPathOnFS() )
else:
AddLogForJob(job, f"INFO: Rotating {e.FullPathOnFS()} by {amt} degrees" )
if im.format == 'JPEG':
im=ImageOps.exif_transpose(im)
out = im.rotate(int(amt), expand=True)
out.save( e.FullPathOnFS() )
print( f"JobTransformImage DONE transform: job={job.id}, id={id}, amt={amt}" )
p = subprocess.run([PA_EXIF_ROTATER, e.FullPathOnFS(), amt ], capture_output=True, text=True )
if p.stderr:
PAprint(p.stderr)
else:
PAprint(p.stdout)
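# the image content has changed, so regenerate the thumbnail and refresh the stored hash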
settings = session.query(Settings).first()
e.file_details.thumbnail, _ , _ = GenThumb( e.FullPathOnFS(), settings.auto_rotate )
e.file_details.hash = md5( job, e )
print( f"JobTransformImage DONE thumb: job={job.id}, id={id}, amt={amt}" )
PAprint( f"JobTransformImage DONE thumb: job={job.id}, id={id}, amt={amt}" )
session.add(e)
FinishJob(job, "Finished Processesing image rotation/flip")
return
@@ -1940,7 +1916,7 @@ def GenHashAndThumb(job, e):
# use mtime as only if the content is different do we need to redo the hash
if stat.st_mtime < e.file_details.last_hash_date:
if DEBUG:
print(f"OPTIM: GenHashAndThumb {e.name} file is older than last hash, skip this")
PAprint(f"OPTIM: GenHashAndThumb {e.name} file is older than last hash, skip this")
job.current_file_num+=1
return
@@ -1948,7 +1924,7 @@ def GenHashAndThumb(job, e):
# same hash and we already have a thumbnail-> just return
if new_hash == e.file_details.hash and e.file_details.thumbnail:
if DEBUG:
print(f"OPTIM: GenHashAndThumb {e.name} md5 is same - likely a mv on filesystem so skip md5/thumb")
PAprint(f"OPTIM: GenHashAndThumb {e.name} md5 is same - likely a mv on filesystem so skip md5/thumb")
job.current_file_num+=1
return
e.file_details.hash = new_hash
@@ -1966,7 +1942,7 @@ def GenHashAndThumb(job, e):
####################################################################################################################################
def ProcessFilesInDir(job, e, file_func, count_dirs):
if DEBUG:
print( f"DEBUG: ProcessFilesInDir: {e.FullPathOnFS()}")
PAprint( f"DEBUG: ProcessFilesInDir: {e.FullPathOnFS()}")
if e.type.name != 'Directory':
file_func(job, e)
else:
@@ -1986,7 +1962,7 @@ def JobGetFileDetails(job):
path=[jex.value for jex in job.extra if jex.name == "path"][0]
path_prefix=[jex.value for jex in job.extra if jex.name == "path_prefix"][0]
if DEBUG:
print( f"DEBUG: JobGetFileDetails for path={path_prefix}" )
PAprint( f"DEBUG: JobGetFileDetails for path={path_prefix}" )
p=session.query(Path).filter(Path.path_prefix==path_prefix).first()
job.current_file_num = 0
job.num_files = p.num_files
@@ -2094,7 +2070,7 @@ def JobCheckForDups(job):
AddLogForJob( job, f"Check for duplicates" )
ClearOtherDupMessagesAndJobs()
res = session.execute( "select count(e1.id) from entry e1, file f1, dir d1, entry_dir_link edl1, path_dir_link pdl1, path p1, entry e2, file f2, dir d2, entry_dir_link edl2, path_dir_link pdl2, path p2 where e1.id = f1.eid and e2.id = f2.eid and d1.eid = edl1.dir_eid and edl1.entry_id = e1.id and edl2.dir_eid = d2.eid and edl2.entry_id = e2.id and p1.type_id != (select id from path_type where name = 'Bin') and p1.id = pdl1.path_id and pdl1.dir_eid = d1.eid and p2.type_id != (select id from path_type where name = 'Bin') and p2.id = pdl2.path_id and pdl2.dir_eid = d2.eid and f1.hash = f2.hash and e1.id != e2.id and f1.size_mb = f2.size_mb" )
res = session.execute( text( "select count(e1.id) from entry e1, file f1, dir d1, entry_dir_link edl1, path_dir_link pdl1, path p1, entry e2, file f2, dir d2, entry_dir_link edl2, path_dir_link pdl2, path p2 where e1.id = f1.eid and e2.id = f2.eid and d1.eid = edl1.dir_eid and edl1.entry_id = e1.id and edl2.dir_eid = d2.eid and edl2.entry_id = e2.id and p1.type_id != (select id from path_type where name = 'Bin') and p1.id = pdl1.path_id and pdl1.dir_eid = d1.eid and p2.type_id != (select id from path_type where name = 'Bin') and p2.id = pdl2.path_id and pdl2.dir_eid = d2.eid and f1.hash = f2.hash and e1.id != e2.id and f1.size_mb = f2.size_mb") )
for row in res:
if row.count > 0:
AddLogForJob(job, f"Found duplicates, Creating Status message in front-end for attention")
@@ -2242,18 +2218,19 @@ def JobRestoreFiles(job):
def CopyOverrides():
try:
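# copy each override table into a tmp_<tbl> snapshot; the originals are cleared below and reconciled later from the tmp copies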
for tbl in override_tbls:
session.execute( f"select * into tmp_{tbl} from {tbl}")
session.execute( text( f"select * into tmp_{tbl} from {tbl}") )
# force a commit here - I want to fail before I delete override content
session.commit()
# now take all 4 override tables in DB and clear them out
for tbl in override_tbls:
session.execute( f"delete from {tbl}" )
session.execute( text( f"delete from {tbl}" ) )
session.commit()
except Exception as ex:
print( f"ERROR: there are existing tmp tables when processing metadata. This SHOULD NEVER HAPPEN - manual intervention needed" )
print( f"ERROR: most likely the job manager was killed during processing metadata - you may want to manually put" )
print( f"ERROR: the contents of the 'tmp_*' tables back into their corresponding official metadata tables " )
print( f"ERROR: and try to restart the job manager" )
PAprint( f"ERROR: there are existing tmp tables when processing metadata. This SHOULD NEVER HAPPEN - manual intervention needed" )
PAprint( f"ERROR: most likely the job manager was killed during processing metadata - you may want to manually put" )
PAprint( f"ERROR: the contents of the 'tmp_*' tables back into their corresponding official metadata tables " )
PAprint( f"ERROR: and try to restart the job manager" )
PAprint( f"ERROR: orig ex: {ex}" )
exit( 1 )
return
@@ -2267,8 +2244,8 @@ def GetFaceInMetadata(fname):
face_data=file_h.read(-1)
file_h.close()
except Exception as ex:
print( f"ERROR: FATAL tried to read in override data and cant read content" )
print( f"ERROR: manual intervention needed - exc={ex}" )
PAprint( f"ERROR: FATAL tried to read in override data and cant read content" )
PAprint( f"ERROR: manual intervention needed - exc={ex}" )
exit(1)
return face_data
@@ -2294,13 +2271,13 @@ def ReloadMetadata(job):
fnames = glob.glob( f'{mpath}/no_match_overrides/*' )
for fname in fnames:
# type derived from fname (e.g. 0_Too Young_md5*, 1_Too Young, 2_Ingore Face, etc.)
match=re.search( '(\d+)_([^_\.]+)', fname )
match=re.search( r'(\d+)_([^_\.]+)', fname )
face_id=match.group(1)
type_name=match.group(2)
otype = session.query(FaceOverrideType).filter(FaceOverrideType.name==type_name).one()
face_data=GetFaceInMetadata(fname)
if DEBUG:
print( f"Found metadata showing Override of type: {type_name}" )
PAprint( f"DEBUG: Found metadata showing Override of type: {type_name}" )
# check that both the id and data match - if so make new FaceNoMatch otherwise Disco*FaceNoMatch
face=session.query( Face ).filter( Face.id==face_id ).filter( Face.face == face_data ). first()
@@ -2313,22 +2290,22 @@ def ReloadMetadata(job):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, f'{mpath}no_match_overrides/0_{otype.name}_{md5face(face_data)}' )
except Exception as ex:
print( f"ERROR: renaming no-match metadata on filesystem failed: {ex}" )
PAprint( f"ERROR: renaming no-match metadata on filesystem failed: {ex}" )
# process Metadata on FS for force_match_overrides (disco ones, will have 0 as face_id)
fnames = glob.glob( f'{mpath}force_match_overrides/*' )
for fname in fnames:
# person derived from fname (e.g. 0_ddp_md5*, 1_ddp, 2_mich, etc.)
match=re.search( '(\d+)_([^_]+)', fname )
match=re.search( r'(\d+)_([^_]+)', fname )
face_id=match.group(1)
person_tag=match.group(2)
p = session.query(Person).filter(Person.tag==person_tag).first()
if not p:
print( f"There is a metadata override on the file system for person: {person_tag} - but they are no longer in the DB - skip" )
PAprint( f"INFO: There is a metadata override on the file system for person: {person_tag} - but they are no longer in the DB - skip" )
continue
face_data=GetFaceInMetadata(fname)
if DEBUG:
print( f"Found metadata showing Override match for person: {person_tag}" )
PAprint( f"DEBUG: Found metadata showing Override match for person: {person_tag}" )
# check that both the id and data match - if so make new FaceNoMatch otherwise Disco*FaceNoMatch
face=session.query( Face ).filter( Face.id==face_id ).filter( Face.face == face_data ).first()
@@ -2342,43 +2319,46 @@ def ReloadMetadata(job):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, f'{mpath}force_match_overrides/0_{p.tag}_{md5face(face_data)}' )
except Exception as ex:
print( f"ERROR: renaming force-match metadata on filesystem failed: {ex}" )
PAprint( f"ERROR: renaming force-match metadata on filesystem failed: {ex}" )
# now process each of the tmp tables for anything that was in the DB but not on FS (e.g rm'd metadata)
overrides=session.execute( "select face_id, type_id from tmp_face_no_match_override" )
overrides=session.execute( text( "select face_id, type_id from tmp_face_no_match_override" ) )
for o in overrides:
print( f"F Force Match: o.face_id={o.face_id}" )
PAprint( f"F Force Match: o.face_id={o.face_id}" )
print( f"F No Match: o.type_id={o.type_id}" )
nmo=session.query(FaceNoMatchOverride).filter(FaceNoMatchOverride.face_id==o.face_id).filter(FaceNoMatchOverride.type_id==o.type_id).first()
if not nmo:
session.add( FaceNoMatchOverride( face_id=o.face_id, type_id=o.type_id ) )
overrides=session.execute( "select face_id, person_id from tmp_face_force_match_override" )
overrides=session.execute( text( "select face_id, person_id from tmp_face_force_match_override" ) )
for o in overrides:
print( f"F Force Match: o.face_id={o.face_id}" )
print( f"F Force Match: o.person_id={o.person_id}" )
if DEBUG:
PAprint( f"F Force Match: o.face_id={o.face_id}" )
PAprint( f"F Force Match: o.person_id={o.person_id}" )
fmo=session.query(FaceForceMatchOverride).filter(FaceForceMatchOverride.face_id==o.face_id,FaceForceMatchOverride.person_id==o.person_id).first()
if not fmo:
session.add( FaceForceMatchOverride( face_id=o.face_id, person_id=o.person_id ) )
overrides=session.execute( "select face, type_id from tmp_disconnected_no_match_override" )
overrides=session.execute( text( "select face, type_id from tmp_disconnected_no_match_override" ) )
for o in overrides:
print( f"D No Match: o.type_id={o.type_id}" )
if DEBUG:
PAprint( f"D No Match: o.type_id={o.type_id}" )
dnmo=session.query(DisconnectedNoMatchOverride).filter(DisconnectedNoMatchOverride.face==o.face).filter(DisconnectedNoMatchOverride.type_id==o.type_id).first()
if not dnmo:
session.add( DisconnectedNoMatchOverride( face=o.face, type_id=o.type_id ) )
overrides=session.execute( "select face, person_id from tmp_disconnected_force_match_override" )
overrides=session.execute( text( "select face, person_id from tmp_disconnected_force_match_override" ) )
for o in overrides:
print( f"D Force Match: o.person_id={o.person_id}" )
if DEBUG:
PAprint( f"D Force Match: o.person_id={o.person_id}" )
dfmo=session.query(DisconnectedForceMatchOverride).filter(DisconnectedForceMatchOverride.face==o.face).filter(DisconnectedForceMatchOverride.person_id==o.person_id).first()
if not dfmo:
session.add( DisconnectedForceMatchOverride( face=o.face, person_id=o.person_id ) )
# finally, drop the tmp tables
for tbl in override_tbls:
session.execute( f"drop table tmp_{tbl}" )
session.execute( text( f"drop table tmp_{tbl}" ) )
# ok, finally commit all these changes - dont do this until now. Worst case if we crash/fail, the overrides should continue to be in tmp_{tbl}
session.commit()
@@ -2392,7 +2372,6 @@ def ReloadMetadata(job):
# If there is content in the Bin already, it logs this - mostly useful when testing)
####################################################################################################################################
def InitialValidationChecks():
now=datetime.now(pytz.utc)
job=NewJob( name="init", num_files=0, wait_for=None, jex=None, parent_job=None, desc="initialise photo assistant" )
job.start_time=datetime.now(pytz.utc)
JobProgressState( job, "In Progress" )
@@ -2405,14 +2384,14 @@ def InitialValidationChecks():
if len(dirs) + len(files) > 0:
AddLogForJob(job, "INFO: the bin path contains content, cannot process to know where original deletes were form - skipping content!" )
AddLogForJob(job, "TODO: could be smart about what is known in the DB vs on the FS, and change below to an ERROR if it is one")
AddLogForJob(job, "WARNING: IF the files in the bin are in the DB (succeeded from GUI deletes) then this is okay, otherwise you should delete contents form the recycle bin and restart the job manager)" )
AddLogForJob(job, "WARN: IF the files in the bin are in the DB (succeeded from GUI deletes) then this is okay, otherwise you should delete contents form the recycle bin and restart the job manager)" )
# create symlink and Path/Dir if needed
ProcessRecycleBinDir(job)
rbp_exists=1
except Exception as ex:
print( f"FATAL ERROR: Failed to walk the recycle bin at {path} Err:{ex}" )
PAprint( f"FATAL ERROR: Failed to walk the recycle bin at {path} Err:{ex}" )
else:
AddLogForJob(job, "ERROR: The bin path in settings does not exist - Please fix now");
AddLogForJob(job, f"ERROR: The bin path {path} in settings does not exist - Please fix now");
sp_exists=0
path = SettingsSPath()
if os.path.exists(path):
@@ -2481,7 +2460,7 @@ def AddFaceToFile( locn_data, face_data, file_eid, model_id, settings ):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, new_fname )
except Exception as ex:
print( f"ERROR: AddFaceToFile-face connects to 'disconnected-force-match' metadata, but fixing the filesystem metadata failed: {ex}" )
PAprint( f"ERROR: AddFaceToFile-face connects to 'disconnected-force-match' metadata, but fixing the filesystem metadata failed: {ex}" )
dnmo=session.query(DisconnectedNoMatchOverride).filter(DisconnectedNoMatchOverride.face==face.face).first()
if dnmo:
@@ -2497,7 +2476,7 @@ def AddFaceToFile( locn_data, face_data, file_eid, model_id, settings ):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, new_fname )
except Exception as ex:
print( f"ERROR: AddFaceToFile-face connects to 'disconnected-no-match' metadata, but fixing the filesystem metadata failed: {ex}" )
PAprint( f"ERROR: AddFaceToFile-face connects to 'disconnected-no-match' metadata, but fixing the filesystem metadata failed: {ex}" )
return
@@ -2517,7 +2496,7 @@ def DelMatchesForFile( job, ent ):
if DEBUG:
AddLogForJob(job, f'Remove any old matches in {ent.name}')
session.execute( f"delete from face_refimg_link where face_id in (select face_id from face_file_link where file_eid = {ent.id})" )
session.execute( text( f"delete from face_refimg_link where face_id in (select face_id from face_file_link where file_eid = {ent.id})" ) )
ent.file_details.last_ai_scan=0
session.add(ent)
return
@@ -2539,7 +2518,7 @@ def DelFacesForFile( job, eid ):
if o:
DisconnectSingleNoMatchOverride( job, o )
session.execute( f"delete from face where id in (select face_id from face_file_link where file_eid = {eid})" )
session.execute( text( f"delete from face where id in (select face_id from face_file_link where file_eid = {eid})" ) )
session.commit()
return
@@ -2697,7 +2676,7 @@ def CheckAndRunBinClean():
now=datetime.now(pytz.utc)
if not j or (now-j.last_update).days >= settings.scheduled_bin_cleanup:
print( f"INFO: Should force clean up bin path, del files older than {settings.bin_cleanup_file_age} days old" )
PAprint( f"INFO: Should force clean up bin path, del files older than {settings.bin_cleanup_file_age} days old" )
NewJob( name="clean_bin", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean up on Bin path" )
created_jobs=True
return created_jobs
@@ -2707,7 +2686,7 @@ def CheckAndRunBinClean():
# based jobs that should run (e.g. last scanned a path X day(s) ago, then scan now), etc. X is defined in settings
####################################################################################################################################
def ScheduledJobs():
print("DEBUG: Time to check for any scheduled jobs needing to be run" )
PAprint("DEBUG: Time to check for any scheduled jobs needing to be run" )
created_jobs=False
@@ -2717,11 +2696,11 @@ def ScheduledJobs():
settings = session.query(Settings).first()
now=datetime.now(pytz.utc)
if ndays_since_last_im_scan >= settings.scheduled_import_scan:
print( f"INFO: Time to force an import scan, last scan was {ndays_since_last_im_scan} days ago" )
PAprint( f"INFO: Time to force an import scan, last scan was {ndays_since_last_im_scan} days ago" )
NewJob( name="scan_ip", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean scan for new files in Import path" )
created_jobs=True
if ndays_since_last_st_scan >= settings.scheduled_storage_scan:
print( f"INFO: Time to force a storage scan, last scan was {ndays_since_last_st_scan}" )
PAprint( f"INFO: Time to force a storage scan, last scan was {ndays_since_last_st_scan}" )
NewJob( name="scan_sp", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean scan for new files in Storage path" )
created_jobs=True
if CheckAndRunBinClean():
@@ -2734,7 +2713,7 @@ def ScheduledJobs():
# go into waiting on a socket to be woken up (and then if woken, back into HandleJobs()
####################################################################################################################################
if __name__ == "__main__":
print("INFO: PA job manager starting - listening on {}:{}".format( PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT) )
PAprint("INFO: PA job manager starting - listening on {}:{}".format( PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT) )
InitialValidationChecks()
@@ -2748,11 +2727,11 @@ if __name__ == "__main__":
try:
conn, addr = s.accept()
if DEBUG:
print( f"accept finished, tout={s.timeout}" )
PAprint( f"accept finished, tout={s.timeout}" )
except socket.timeout:
if DEBUG:
print( f"timeout occurred, tout={s.timeout}" )
PAprint( f"timeout occurred, tout={s.timeout}" )
if ScheduledJobs():
HandleJobs(False)
continue

path.py

@@ -42,50 +42,3 @@ class Path(db.Model):
def __repr__(self):
return f"<id: {self.id}, path_prefix: {self.path_prefix}, num_files={self.num_files}, type={self.type}>"
################################################################################
# Class describing PathDetail (quick convenience class for MovePathDetails())
################################################################################
class PathDetail(PA):
"""Class describing details of a Path [internal class used in MovePathDetais()]"""
def __init__(self,ptype,path):
"""Initialisation function for PathDetail class
Args:
ptype (str): name of the PathType for this destination (e.g. "Storage" or "Import")
path (str): display path of the destination (path_prefix with the leading static/<type>/ stripped)
"""
self.type:int=ptype
self.path:str=path
# construct icon_url based on type of storage path (icons.svg contains icons for each)
self.icon_url:str=url_for("internal", filename="icons.svg") + "#" + ICON[self.type]
################################################################################
# helper function to find path details for move destinations - used in html
# for move DBox to show potential storage paths to move files into
################################################################################
def MovePathDetails():
"""helper function to find path details for move destinations
used in html/javascript for move Dialog Box to show potential storage paths to move files into
Args:
None
Returns:
ret (List[PathDetail]): a list of Path Details for where files can be moved
"""
ret=[]
sps=Path.query.join(PathType).filter(PathType.name=="Storage").all()
for p in sps:
obj = PathDetail( ptype="Storage", path=p.path_prefix.replace("static/Storage/","") )
ret.append( obj )
ips=Path.query.join(PathType).filter(PathType.name=="Import").all()
for p in ips:
obj = PathDetail( ptype="Import", path=p.path_prefix.replace("static/Import/","") )
ret.append( obj )
return ret


@@ -3,8 +3,9 @@ from flask_wtf import FlaskForm
from flask import request, render_template, redirect, url_for, make_response, jsonify
from main import db, app, ma
from settings import Settings, AIModel
from sqlalchemy import Sequence, func
from sqlalchemy import Sequence, func, select
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import joinedload
from flask_login import login_required, current_user
from werkzeug.utils import secure_filename
from shared import GenFace, GenThumb, PA
@@ -87,14 +88,14 @@ def AddRefimgToPerson( filename, person ):
#False == dont autorotate, its not needed on this image
refimg.thumbnail, refimg.orig_w, refimg.orig_h = GenThumb( filename, False )
settings = Settings.query.first()
model=AIModel.query.get(settings.default_refimg_model)
model=db.session.get(AIModel,settings.default_refimg_model)
refimg.face, face_locn = GenFace( filename, model=model.name )
try:
os.remove(filename)
except Exception as e:
# can fail "silently" here, if the face_locn worked, great, its only
# a tmp file in /tmp - if not, the next if will send a msg to the front-end
SetFELog( message=f"Failed to delete tmp file for refimg addition: {e}", log_level="danger", persistent=True, cant_close=True )
SetFELog( message=f"Failed to delete tmp file for refimg addition: {e}", level="danger", persistent=True, cant_close=True )
if not face_locn:
SetFELog( f"<b>Failed to find face in Refimg:</b>", "danger" )
@@ -114,7 +115,7 @@ def AddRefimgToPerson( filename, person ):
SetFELog( f"<b>Failed to add Refimg:</b>&nbsp;{e.orig}", "danger" )
except Exception as e:
SetFELog( f"<b>Failed to modify Refimg:</b>&nbsp;{e}", "danger" )
return
return refimg
################################################################################
# TempRefimgFile: helper function that takes data POST'd (from dialog box to
@@ -182,9 +183,12 @@ def match_with_create_person():
p = Person( tag=request.form["tag"], surname=request.form["surname"], firstname=request.form["firstname"] )
# add this fname (of temp refimg) to person
fname=TempRefimgFile( request.form['refimg_data'], p.tag )
AddRefimgToPerson( fname, p )
r=AddRefimgToPerson( fname, p )
SetFELog( f"Created person: {p.tag}" )
return make_response( jsonify( who=p.tag, distance='0.0' ) )
refimg_schema=RefimgSchema(many=False)
r_data=refimg_schema.dump(r)
return make_response( jsonify( refimg=r_data, who=p.tag, distance='0.0' ) )
################################################################################
# /person/<id> -> GET/POST(save or delete) -> shows/edits/delets a single person
@@ -197,7 +201,7 @@ def person(id):
if request.method == 'POST':
try:
person = Person.query.get(id)
person = db.session.get(Person, id)
if 'delete' in request.form:
SetFELog( f"Successfully deleted Person: ({person.tag})" )
@@ -237,7 +241,7 @@ def person(id):
SetFELog( f"<b>Failed to modify Person:</b>&nbsp;{e}", "danger" )
return redirect( url_for( 'persons' ) )
else:
person = Person.query.get(id)
person = db.session.get(Person, id)
if not person:
SetFELog( f"No such person with id: {id}", "danger" )
return redirect("/")
@@ -267,7 +271,7 @@ def add_refimg():
except Exception as e:
SetFELog( f"<b>Failed to load reference image:</b>&nbsp;{e}", "danger" )
AddRefimgToPerson( fname, person )
r=AddRefimgToPerson( fname, person )
return redirect( url_for( 'person', id=person.id) )
################################################################################
@@ -289,6 +293,29 @@ def find_persons(who):
return make_response( resp )
class FaceRefimgLinkSchema(ma.SQLAlchemyAutoSchema):
class Meta: model = FaceRefimgLink
face_distance = ma.auto_field() # Explicitly include face_distance
load_instance = True
class PersonSchema(ma.SQLAlchemyAutoSchema):
class Meta: model=Person
load_instance = True
class RefimgSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = Refimg
exclude = ('face',)
load_instance = True
person = ma.Nested(PersonSchema)
class FaceSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model=Face
exclude = ('face',)
load_instance = True
refimg = ma.Nested(RefimgSchema,allow_none=True)
refimg_lnk = ma.Nested(FaceRefimgLinkSchema,allow_none=True)
################################################################################
# /add_refimg_to_person/ -> POST
@@ -296,12 +323,19 @@ def find_persons(who):
@app.route("/add_refimg_to_person", methods=["POST"])
@login_required
def add_refimg_to_person():
f = Face.query.get( request.form['face_id'] )
p = Person.query.get( request.form['person_id'] )
stmt = select(Face).options( joinedload(Face.refimg_lnk) ).where(Face.id == request.form['face_id'])
f=db.session.execute(stmt).scalars().first()
stmt = select(Person).options( joinedload(Person.refimg) ).where(Person.id == request.form['person_id'])
p=db.session.execute(stmt).scalars().first()
# add this fname (of temp refimg) to person
fname=TempRefimgFile( request.form['refimg_data'], p.tag )
AddRefimgToPerson( fname, p )
r=AddRefimgToPerson( fname, p )
# connect the refimg to the face in the db, now we have added this refimg to the person
frl=FaceRefimgLink( face_id=f.id, refimg_id=r.id, face_distance=0 )
db.session.add(frl)
db.session.commit()
if request.form['search'] == "true":
jex=[]
@@ -316,7 +350,12 @@ def add_refimg_to_person():
jex.append( JobExtra( name=f"path_type", value=str(ptype.id) ) )
job=NewJob( name="run_ai_on_path", num_files=0, wait_for=None, jex=jex, desc="Look for face(s) in storage path(s)" )
return make_response( jsonify( who=p.tag, distance='0.0' ) )
refimg_schema=RefimgSchema(many=False)
r_data=refimg_schema.dump(r)
frl_schema=FaceRefimgLinkSchema(many=False)
frl_data=frl_schema.dump(frl)
return make_response( jsonify( refimg=r_data, frl=frl_data ) )
################################################################################
# /add_force_match_override -> POST
@@ -346,7 +385,9 @@ def add_force_match_override():
NewJob( "metadata", num_files=0, wait_for=None, jex=jex, desc="create metadata for adding forced match" )
# this will reply to the Ajax / POST, and cause the page to re-draw with new face override to person_tag
return make_response( jsonify( person_tag=p.tag ) )
person_schema = PersonSchema(many=False)
p_data = person_schema.dump(p)
return make_response( jsonify( person=p_data ) )
################################################################################
# /remove_force_match_override -> POST
@@ -397,6 +438,11 @@ def remove_no_match_override():
return make_response( jsonify( face_id=face_id ) )
class FaceOverrideTypeSchema(ma.SQLAlchemyAutoSchema):
class Meta:
model = FaceOverrideType
load_instance = True
################################################################################
# /add_no_match_override -> POST
################################################################################
@@ -424,5 +470,6 @@ def add_no_match_override():
# dont do status update here, the F/E is in the middle of a dbox, just send metadata through to the B/E
NewJob( "metadata", num_files=0, wait_for=None, jex=jex, desc="create metadata for adding forced non-match" )
# this will reply to the Ajax / POST, and cause the page to re-draw with new face override to person_tag
return make_response( jsonify( type=t.name ) )
fot_schema = FaceOverrideTypeSchema(many=False)
t_data=fot_schema.dump(t)
return make_response( jsonify( type_id=t.id, type=t_data ) )

query.py

@@ -0,0 +1,41 @@
from flask_login import UserMixin, login_required
from main import db
#from sqlalchemy import Sequence
#from flask import request, redirect, make_response, jsonify
#from main import db, app, ma
#from sqlalchemy.exc import SQLAlchemyError
# pylint: disable=no-member
################################################################################
# Class describing a cached Query in the database via sqlalchemy
# id is the unique id in the DB
# stores the options used to build a file-view page (path_type, noo, grouping,
# q_offset, folder, root, cwd, search_term) plus the resolved entry_list,
# the current position and when the cache entry was created
################################################################################
class Query(UserMixin,db.Model):
__tablename__ = "query"
id = db.Column(db.Integer, db.Sequence('query_id_seq'), primary_key=True)
path_type = db.Column(db.String)
noo = db.Column(db.String)
grouping = db.Column(db.String)
q_offset = db.Column(db.Integer)
folder = db.Column(db.Boolean)
entry_list = db.Column(db.String)
root = db.Column(db.String)
cwd = db.Column(db.String)
search_term = db.Column(db.String)
current = db.Column(db.Integer)
created = db.Column(db.DateTime(timezone=True))
def __repr__(self):
str=f"<{self.__class__.__name__}("
for k, v in self.__dict__.items():
str += f"{k}={v!r}, "
str=str.rstrip(", ") + ")>"
return str
def get_id(self):
# Query has no dn column; use the primary key as the id (UserMixin convention)
return str(self.id)


@@ -1,7 +1,9 @@
numpy==1.26.4
setuptools
flask
flask_login
flask-ldap3-login
sqlalchemy
sqlalchemy>=2.0.36
flask-sqlalchemy
SQLAlchemy-serializer
marshmallow-sqlalchemy
@@ -19,3 +21,5 @@ Werkzeug
flask-compress
ffmpeg-python
psycopg2
python3-ldap
gunicorn


@@ -137,7 +137,7 @@ def SettingsRBPath():
settings = Settings.query.first()
if settings == None:
from job import SetFELog
SetFELog( message="Cannot create file data with no settings / recycle bin path is missing", log_level="error", persistent=True, cant_close=False)
SetFELog( message="Cannot create file data with no settings / recycle bin path is missing", level="danger", persistent=True, cant_close=False)
return
# path setting is an absolute path, just use it, otherwise prepend base_path first
if settings.recycle_bin_path[0] == '/':
@@ -155,7 +155,7 @@ def SettingsSPath():
settings = Settings.query.first()
if settings == None:
from job import SetFELog
SetFELog( message="Cannot create file data with no settings / storage path is missing", log_level="error", persistent=True, cant_close=False)
SetFELog( message="Cannot create file data with no settings / storage path is missing", level="danger", persistent=True, cant_close=False)
return
if settings.storage_path[0] == '/':
path=settings.storage_path


@@ -5,6 +5,7 @@ import io
import base64
import subprocess
from PIL import Image, ImageOps
import numpy as np
class PA:
def __repr__(self):
@@ -24,7 +25,6 @@ class PA:
hostname = socket.gethostname()
PROD_HOST="pa_web"
# dict to store the name of each icon in icons.svg so we can use it by reference in html
ICON={}
@@ -40,19 +40,27 @@ OLDEST_LOG_LIMIT = 5
if hostname == "lappy":
PA_JOB_MANAGER_HOST="localhost"
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@localhost:5432/pa'
PA_EXIF_ROTATER = './utils/pa_exifautotran'
PA_EXIF_AUTOROTATE = './utils/pa_exifautotran'
PA_EXIF_ROTATER = './utils/pa_rotate'
# if we don't set the env, or we are explicitly DEV or running in a container, run web server on localhost & db on the padb_dev container
elif 'ENV' not in os.environ or os.environ['ENV'] == "development":
elif 'ENV' not in os.environ or os.environ['ENV'] == "development" or os.environ['ENV'] == "container":
PA_JOB_MANAGER_HOST="localhost"
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@mara.ddp.net:65432/pa'
PA_EXIF_ROTATER = './utils/pa_exifautotran'
# DB_URL = 'postgresql+psycopg2://pa:for_now_pa@mara.ddp.net:65432/pa'
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@padb_dev/pa'
PA_EXIF_AUTOROTATE = './utils/pa_exifautotran'
PA_EXIF_ROTATER = './utils/pa_rotate'
# if we are explicitly on PROD, run web server on localhost (pa_web container) & db on the padb container (port 5432, only accessed via internal docker ports)
elif os.environ['ENV'] == "production":
PA_JOB_MANAGER_HOST="localhost"
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@padb/pa'
PA_EXIF_ROTATER = '/code/utils/pa_exifautotran'
PA_EXIF_AUTOROTATE = '/code/utils/pa_exifautotran'
PA_EXIF_ROTATER = '/code/utils/pa_rotate'
else:
print( "ERROR: I do not know which environment (development, etc.) and which DB (on which host to use)" )
if 'ENV' not in os.environ:
print( f"ERROR: no ENV variable set in the environment" )
else:
print( f"ERROR: ENV is {os.environ['ENV']}" )
exit( -1 )
# PORT number we connect to the pa_job_manager on - by default it runs on the
@@ -85,8 +93,8 @@ def CreateSelect(name, selected, list, js="", add_class="", vals={} ):
# TODO: can this be collapsed into using above - probably if the 'selected' passed in was 'In Folder' or 'Flat View' -- but I think that isn't in a var???
# Helper function used in html files to create a bootstrap'd select with options. Same as CreateSelect() really, only contains
# hard-coded True/False around the if selected part, but with string based "True"/"False" in the vals={}, and list has "In Folders", "Flat View"
def CreateFoldersSelect(selected, add_class=""):
str = f'<select id="folders" name="folders" class="{add_class} sm-txt bg-white text-info border-info border-1 p-1" onChange="this.form.submit()">'
def CreateFoldersSelect(selected, js="", add_class=""):
str = f'<select id="folders" name="folders" class="{add_class} sm-txt bg-white text-info border-info border-1 p-1" onChange="{js};this.form.submit()">'
# if selected is true, then folders == true, so make this the selected option
if( selected ):
str += '<option selected value="True">In Folders</option>'
@@ -132,7 +140,7 @@ def GenThumb(fname,auto_rotate):
im_orig = Image.open(fname)
if im_orig.format == 'JPEG':
# run cmdline util to re-orient jpeg (only changes if needed, and does it losslessly)
p = subprocess.run([PA_EXIF_ROTATER,fname] )
p = subprocess.run([PA_EXIF_AUTOROTATE,fname] )
im=Image.open(fname)
else:
im=im_orig
@@ -161,10 +169,12 @@ def GenThumb(fname,auto_rotate):
# generate Face data (and location) - wrapper func of face_recognition library
# used to store refimg data into the DB
def GenFace(fname, model):
img = face_recognition.load_image_file(fname)
location = face_recognition.face_locations(img, model=model)
encodings = face_recognition.face_encodings(img, known_face_locations=location)
if len(encodings) and len(location):
return encodings[0].tobytes(), location[0]
else:
return None, None
try:
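# face_recognition can raise on corrupt or unsupported images - treat any failure (or no face found) as (None, None)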
img = face_recognition.load_image_file(fname)
location = face_recognition.face_locations(img, model=model)
encodings = face_recognition.face_encodings(img, known_face_locations=location)
if len(encodings) and len(location):
return encodings[0].tobytes(), location[0]
except Exception as e:
print( f"GenFace failed: {e}" )
return None, None

states.py

@@ -1,10 +1,12 @@
from flask import request, render_template, redirect, url_for
from settings import Settings, SettingsIPath, SettingsSPath, SettingsRBPath
from flask_login import login_required, current_user
from main import db, app, ma
from shared import PA
from user import PAUser
from datetime import datetime
from job import SetFELog
from shared import SymlinkName
import pytz
import re
@@ -17,30 +19,17 @@ class PA_UserState(db.Model):
__tablename__ = "pa_user_state"
id = db.Column(db.Integer, db.Sequence('pa_user_state_id_seq'), primary_key=True )
pa_user_dn = db.Column(db.String, db.ForeignKey('pa_user.dn'), primary_key=True )
last_used = db.Column(db.DateTime(timezone=True))
path_type = db.Column(db.String, primary_key=True, unique=False, nullable=False )
noo = db.Column(db.String, unique=False, nullable=False )
grouping = db.Column(db.String, unique=False, nullable=False )
how_many = db.Column(db.Integer, unique=False, nullable=False )
st_offset = db.Column(db.Integer, unique=False, nullable=False )
size = db.Column(db.Integer, unique=False, nullable=False )
folders = db.Column(db.Boolean, unique=False, nullable=False )
root = db.Column(db.String, unique=False, nullable=False )
cwd = db.Column(db.String, unique=False, nullable=False )
## for now being lazy and not doing a separate table until I settle on needed fields and when
# only used if ptype == View
view_eid = db.Column(db.Integer, unique=False, nullable=False )
orig_ptype = db.Column(db.String, unique=False, nullable=False )
# only used if view and orig_ptype was search
orig_search_term = db.Column(db.String, unique=False, nullable=False )
orig_url = db.Column(db.String, unique=False, nullable=False )
current = db.Column(db.Integer)
first_eid = db.Column(db.Integer)
last_eid = db.Column(db.Integer)
num_entries = db.Column(db.Integer)
def __repr__(self):
return f"<pa_user_dn: {self.pa_user_dn}, path_type: {self.path_type}, noo: {self.noo}, grouping: {self.grouping}, how_many: {self.how_many}, st_offset: {self.st_offset}, size: {self.size}, folders: {self.folders}, root: {self.root}, cwd: {self.cwd}, view_eid: {self.view_eid}, orig_ptype: {self.orig_ptype}, orig_search_term: {self.orig_search_term}, orig_url: {self.orig_url}, current={self.current}, first_eid={self.first_eid}, last_eid={self.last_eid}, num_entries={self.num_entries}>"
return f"<pa_user_dn: {self.pa_user_dn}, path_type: {self.path_type}, noo: {self.noo}, grouping: {self.grouping}, how_many: {self.how_many}, size: {self.size}, folders: {self.folders}, root: {self.root}, cwd: {self.cwd}>"
################################################################################
@@ -50,252 +39,58 @@ class PA_UserState(db.Model):
################################################################################
class States(PA):
def __init__(self, request):
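# work out which page is being rendered (Import/Storage/Bin/Search/View), load the matching per-user pref row,
# apply any values POSTed in the form, then persist the result back to pa_user_state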
self.path_type=''
self.orig_search_term = ''
self.url = request.path
self.view_eid = None
self.current=0
self.first_eid=0
self.last_eid=0
self.num_entries=0
# this is any next/prev or noo, grouping, etc. change (so use referrer to work out what to do with this)
# because this can happen on a view, or files_up, etc. change this FIRST
if 'change_file_opts' in request.path:
ref=request.referrer
base=request.base_url
base=base.replace("change_file_opts", "")
self.url = "/"+ref.replace(base, "" )
# if view_list, then we really are a view, and view_eid should be in the form
if 'view_list' in request.path:
self.path_type = 'View'
self.view_eid = request.form['view_eid']
self.url = request.form['orig_url']
# this occurs ONLY when a POST to /view/<id> occurs (at this stage orig_url will be from an import, storage, bin or search)
elif 'view' in request.path:
self.path_type = 'View'
self.view_eid = self.url[6:]
# use orig url to define defaults/look up states for 'last' import/storage/bin/search
if request.method == "POST":
self.url = request.form['orig_url']
else:
# GET's occur on redirect, and we don't have a form, so get it from pref
st=self.url[8:]
ref=request.referrer
if 'search' in ref:
st=re.sub( '.+/search/', '', ref )
else:
st=''
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.view_eid==self.view_eid,PA_UserState.orig_search_term==st).first()
if not pref:
SetFELog( message=f"ERROR: pref not found - DONT CLOSE THIS!!! dn={current_user.dn}, st={st}, s={self}????" , log_level="error", persistent=True, cant_close=True )
redirect("/")
else:
self.url = pref.orig_url
# set the prefix based on path
path=None
if 'files_ip' in self.url or 'file_list_ip' in self.url:
if self.path_type == "View":
self.orig_ptype = 'Import'
self.orig_url = self.url
else:
self.path_type = 'Import'
self.path_type = 'Import'
path = SettingsIPath()
elif 'files_sp' in self.url:
if self.path_type == "View":
self.orig_ptype = 'Storage'
self.orig_url = self.url
else:
self.path_type = 'Storage'
self.path_type = 'Storage'
path = SettingsSPath()
elif 'files_rbp' in self.url:
if self.path_type == "View":
self.orig_ptype = 'Bin'
self.orig_url = self.url
else:
self.path_type = 'Bin'
self.path_type = 'Bin'
path = SettingsRBPath()
elif 'search' in self.url:
# okay if we are a search, but came from a view, then get last_search_state from prefs and use it
self.orig_search_term = self.url[8:]
if self.path_type == "View":
self.orig_ptype = 'Search'
self.orig_url = self.url
else:
self.path_type = 'Search'
elif 'view' in self.url:
# use url to get eid of viewed entry
self.view_eid = self.url[6:]
self.path_type="View"
self.orig_url=self.url
elif 'change_file_opts' not in self.url:
SetFELog( message=f"ERROR: DDP messed up, failed to match URL {self.url} for settings this will fail, redirecting to home" , log_level="error", persistent=True, cant_close=True )
SetFELog( message=f"referrer={request.referrer}" , log_level="error",
persistent=True, cant_close=True )
return
if self.path_type == 'View':
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.view_eid==self.view_eid,PA_UserState.orig_search_term==self.orig_search_term).first()
# should find original path or search for this view (if not a search, search_term='')
orig_pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.orig_ptype,PA_UserState.orig_search_term==self.orig_search_term).first()
if not orig_pref:
SetFELog( message=f"ERROR: DDP messed up 2, failed to find orig_pref for a view pt={self.path_type} for search={self.orig_search_term}" , log_level="error", persistent=True, cant_close=True )
SetFELog( message=f"referrer={request.referrer}" , log_level="error",
persistent=True, cant_close=True )
return
elif self.path_type == 'Search':
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.orig_search_term==self.orig_search_term).first()
self.path_type = 'Search'
self.search_term = ''
else:
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type).first()
self.path_type=''
if pref:
self.grouping=pref.grouping
self.how_many=pref.how_many
self.offset=pref.st_offset
self.size=pref.size
self.cwd=pref.cwd
self.orig_ptype=pref.orig_ptype
self.orig_search_term=pref.orig_search_term
self.orig_url = pref.orig_url
self.view_eid = pref.view_eid
self.current = pref.current
if self.path_type == "View":
self.root='static/' + self.orig_ptype
self.first_eid=orig_pref.first_eid
self.last_eid=orig_pref.last_eid
self.num_entries=orig_pref.num_entries
self.noo=orig_pref.noo
self.folders=orig_pref.folders
self.orig_search_term=orig_pref.orig_search_term
else:
self.root=pref.root
self.first_eid = pref.first_eid
self.last_eid = pref.last_eid
self.num_entries = pref.num_entries
self.noo=pref.noo
self.folders=pref.folders
if path:
self.prefix = SymlinkName(self.path_type,path,path+'/')
else:
# retrieve defaults from 'PAUser' where defaults are stored
u=PAUser.query.filter(PAUser.dn==current_user.dn).one()
self.grouping=u.default_grouping
self.how_many=u.default_how_many
self.offset=0
self.size=u.default_size
if self.path_type == "View":
self.root='static/' + self.orig_ptype
self.first_eid=orig_pref.first_eid
self.last_eid=orig_pref.last_eid
self.num_entries=orig_pref.num_entries
self.noo=orig_pref.noo
self.folders=orig_pref.folders
self.orig_search_term=orig_pref.orig_search_term
else:
self.root='static/' + self.path_type
if self.path_type == 'Import':
self.noo = u.default_import_noo
self.folders = u.default_import_folders
elif self.path_type == 'Storage':
self.noo = u.default_storage_noo
self.folders = u.default_storage_folders
else:
# search so force folders to be false (rather see images than a series of folders that don't match the search themselves)
self.noo=u.default_search_noo
self.folders=False
self.prefix=None
self.cwd=self.root
if not hasattr(self, 'orig_ptype'):
self.orig_ptype=None
if not hasattr(self, 'orig_search_term'):
self.orig_search_term=None
self.orig_url = self.url
# the above are defaults, if we are here, then we have current values, use them instead if they are set -- AI: searches dont set them so then we use those in the DB first
if request.method=="POST":
if self.path_type != "View" and 'noo' in request.form:
# we are changing values based on a POST to the form, if we changed the noo option, we need to reset things
if 'change_file_opts' in request.path and self.noo != request.form['noo']:
self.noo=request.form['noo']
self.first_eid=0
self.last_eid=0
self.offset=0
if 'how_many' in request.form:
self.how_many=request.form['how_many']
if 'offset' in request.form:
self.offset=int(request.form['offset'])
if 'grouping' in request.form:
self.grouping=request.form['grouping']
# this can be null if we come from view by details
if 'size' in request.form:
self.size = request.form['size']
# HTML forms can't send booleans, only strings, so convert
if self.path_type != "View" and 'folders' in request.form:
# we are changing values based on a POST of the form; if we are in folder view and we changed the folders option, we need to reset things
if 'change_file_opts' in request.path:
if self.folders and self.folders != request.form['folders']:
self.num_entries=0
self.first_eid=0
self.last_eid=0
if request.form['folders'] == "False":
self.folders=False
else:
self.folders=True
# have to force grouping to None if we flick to folders from a flat view with grouping (otherwise we print out
# group headings for child content that is not in the CWD)
self.grouping=None
if 'orig_url' in request.form:
self.orig_url = request.form['orig_url']
# possible to not be set for an AI: search
if 'cwd' in request.form:
self.cwd = request.form['cwd']
if 'prev' in request.form:
self.offset -= int(self.how_many)
# just in case we hit prev too fast, stop this...
if self.offset < 0:
self.offset=0
if 'next' in request.form:
if (self.offset + int(self.how_many)) < self.num_entries:
self.offset += int(self.how_many)
else:
# should be impossible now, but leave jic
SetFELog( message=f"WARNING: next image requested, but would go past end of list? - ignore this" , log_level="warning", persistent=True, cant_close=True )
SetFELog( message=f"DDP - offset={self.offset} + how_many={self.how_many} > num_entries={self.num_entries}" , log_level="error", persistent=True, cant_close=True )
if 'current' in request.form:
self.current = int(request.form['current'])
last_used=datetime.now(pytz.utc)
# now save pref
if not pref:
# insert new pref for this combo (might be a new search or view, or first time for a path)
pref=PA_UserState( pa_user_dn=current_user.dn, last_used=last_used, path_type=self.path_type, view_eid=self.view_eid,
noo=self.noo, grouping=self.grouping, how_many=self.how_many, st_offset=self.offset, size=self.size,
folders=self.folders, root=self.root, cwd=self.cwd, orig_ptype=self.orig_ptype, orig_search_term=self.orig_search_term,
orig_url=self.orig_url, current=self.current, first_eid=self.first_eid, last_eid=self.last_eid, num_entries=self.num_entries )
# retrieve defaults from 'PAUser' where defaults are stored
u=PAUser.query.filter(PAUser.dn==current_user.dn).one()
self.grouping=u.default_grouping
self.how_many=u.default_how_many
self.size=u.default_size
self.root='static/' + self.path_type
if self.path_type == 'Import':
self.noo = u.default_import_noo
self.folders = u.default_import_folders
elif self.path_type == 'Storage':
self.noo = u.default_storage_noo
self.folders = u.default_storage_folders
else:
# update this pref with the values calculated above (most likely from POST to form)
pref.pa_user_dn=current_user.dn
pref.path_type=self.path_type
pref.view_eid=self.view_eid
pref.noo=self.noo
pref.grouping=self.grouping
pref.how_many=self.how_many
pref.st_offset=self.offset
pref.size=self.size
pref.folders=self.folders
pref.root = self.root
pref.cwd = self.cwd
pref.orig_ptype = self.orig_ptype
pref.orig_search_term = self.orig_search_term
pref.orig_url = self.orig_url
pref.last_used = last_used
pref.first_eid = self.first_eid
pref.last_eid = self.last_eid
pref.num_entries = self.num_entries
# only passed in (at the moment) in view_list
pref.current = self.current
db.session.add(pref)
db.session.commit()
# search, so force folders to be false (we'd rather see images than a series of folders that don't match the search themselves)
self.noo=u.default_search_noo
self.folders=False
self.default_flat_noo=u.default_import_noo
self.default_folder_noo=u.default_storage_noo
self.default_search_noo=u.default_search_noo
self.cwd=self.root
return
def to_dict(self):
# Automatically include all instance attributes
return {key: value for key, value in vars(self).items()}
################################################################################
# /states -> GET only -> prints out list of all prefs (simple for now)
################################################################################

View File

@@ -1,189 +1,206 @@
alter database PA set timezone to 'Australia/Victoria';
ALTER DATABASE pa SET TIMEZONE TO 'Australia/Victoria';
create sequence PA_USER_ID_SEQ;
create sequence PA_USER_STATE_ID_SEQ;
create sequence FACE_ID_SEQ;
create sequence PATH_ID_SEQ;
create sequence PATH_TYPE_ID_SEQ;
create sequence FILE_ID_SEQ;
create sequence FILE_TYPE_ID_SEQ;
create sequence JOBEXTRA_ID_SEQ;
create sequence JOBLOG_ID_SEQ;
create sequence JOB_ID_SEQ;
create sequence PERSON_ID_SEQ;
create sequence REFIMG_ID_SEQ;
create sequence SETTINGS_ID_SEQ;
create sequence PA_JOB_MANAGER_ID_SEQ;
create sequence PA_JOB_MANAGER_FE_MESSAGE_ID_SEQ;
create sequence FACE_OVERRIDE_TYPE_ID_SEQ;
create sequence FACE_OVERRIDE_ID_SEQ;
CREATE SEQUENCE pa_user_id_seq;
CREATE SEQUENCE pa_user_state_id_seq;
CREATE SEQUENCE face_id_seq;
CREATE SEQUENCE path_id_seq;
CREATE SEQUENCE path_type_id_seq;
CREATE SEQUENCE file_id_seq;
CREATE SEQUENCE file_type_id_seq;
CREATE SEQUENCE jobextra_id_seq;
CREATE SEQUENCE joblog_id_seq;
CREATE SEQUENCE job_id_seq;
CREATE SEQUENCE person_id_seq;
CREATE SEQUENCE refimg_id_seq;
CREATE SEQUENCE settings_id_seq;
CREATE SEQUENCE pa_job_manager_id_seq;
CREATE SEQUENCE pa_job_manager_fe_message_id_seq;
CREATE SEQUENCE face_override_type_id_seq;
CREATE SEQUENCE face_override_id_seq;
CREATE SEQUENCE query_id_seq;
-- these are hard-coded at present, not sure I can reflexively find models from API?
create table AI_MODEL ( ID integer, NAME varchar(24), DESCRIPTION varchar(80), constraint PK_AI_MODEL primary key(ID) );
insert into AI_MODEL values ( 1, 'hog', 'normal' );
insert into AI_MODEL values ( 2, 'cnn', 'more accurate / much slower' );
CREATE TABLE ai_model ( id INTEGER, name VARCHAR(24), description VARCHAR(80), CONSTRAINT pk_ai_model PRIMARY KEY(id) );
INSERT INTO ai_model VALUES ( 1, 'hog', 'normal' );
INSERT INTO ai_model VALUES ( 2, 'cnn', 'more accurate / much slower' );
create table SETTINGS(
ID integer,
BASE_PATH varchar, IMPORT_PATH varchar, STORAGE_PATH varchar, RECYCLE_BIN_PATH varchar, METADATA_PATH varchar,
AUTO_ROTATE Boolean,
DEFAULT_REFIMG_MODEL integer, DEFAULT_SCAN_MODEL integer, DEFAULT_THRESHOLD float,
FACE_SIZE_LIMIT integer,
SCHEDULED_IMPORT_SCAN integer, SCHEDULED_STORAGE_SCAN integer,
SCHEDULED_BIN_CLEANUP integer, BIN_CLEANUP_FILE_AGE integer,
JOB_ARCHIVE_AGE integer,
constraint PK_SETTINGS_ID primary key(ID),
constraint FK_DEFAULT_REFIMG_MODEL foreign key (DEFAULT_REFIMG_MODEL) references AI_MODEL(ID),
constraint FK_DEFAULT_SCAN_MODEL foreign key (DEFAULT_SCAN_MODEL) references AI_MODEL(ID) );
CREATE TABLE settings(
id INTEGER,
base_path VARCHAR, import_path VARCHAR, storage_path VARCHAR, recycle_bin_path VARCHAR, metadata_path VARCHAR,
auto_rotate BOOLEAN,
default_refimg_model INTEGER, default_scan_model INTEGER, default_threshold FLOAT,
face_size_limit INTEGER,
scheduled_import_scan INTEGER, scheduled_storage_scan INTEGER,
scheduled_bin_cleanup INTEGER, bin_cleanup_file_age INTEGER,
job_archive_age INTEGER,
CONSTRAINT pk_settings_id PRIMARY KEY(id),
CONSTRAINT fk_default_refimg_model FOREIGN KEY (default_refimg_model) REFERENCES ai_model(id),
CONSTRAINT fk_default_scan_model FOREIGN KEY (default_scan_model) REFERENCES ai_model(id) );
create table PA_USER(
ID integer,
DN varchar unique,
DEFAULT_IMPORT_NOO varchar,
DEFAULT_STORAGE_NOO varchar,
DEFAULT_SEARCH_NOO varchar,
DEFAULT_GROUPING varchar(16),
DEFAULT_HOW_MANY integer,
DEFAULT_SIZE integer,
DEFAULT_IMPORT_FOLDERS Boolean,
DEFAULT_STORAGE_FOLDERS Boolean,
constraint PK_PA_USER_ID primary key(ID) );
CREATE TABLE pa_user(
id INTEGER,
dn VARCHAR UNIQUE,
default_import_noo VARCHAR,
default_storage_noo VARCHAR,
default_search_noo VARCHAR,
default_grouping VARCHAR(16),
default_how_many INTEGER,
default_size INTEGER,
default_import_folders BOOLEAN,
default_storage_folders BOOLEAN,
CONSTRAINT pk_pa_user_id PRIMARY KEY(id) );
-- this is totally not 3rd normal form, but when I made it that way it was so complex it was stupid
-- so, for the little data here, I'm deliberately using a redundant data structure
create table PA_USER_STATE ( ID integer, PA_USER_DN varchar(128), PATH_TYPE varchar(16),
NOO varchar(16), GROUPING varchar(16), HOW_MANY integer, ST_OFFSET integer, SIZE integer, FOLDERS Boolean,
ROOT varchar, CWD varchar,
ORIG_PTYPE varchar, ORIG_SEARCH_TERM varchar, ORIG_URL varchar,
VIEW_EID integer, CURRENT integer, FIRST_EID integer, LAST_EID integer, NUM_ENTRIES integer, LAST_USED timestamptz,
constraint FK_PA_USER_DN foreign key (PA_USER_DN) references PA_USER(DN),
constraint PK_PA_USER_STATES_ID primary key(ID ) );
-- FIXME: NEED TO RETHINK THIS, not sure this even needs to be in the DB
CREATE TABLE pa_user_state ( id INTEGER, pa_user_dn VARCHAR(128), path_type VARCHAR(16),
noo VARCHAR(16), grouping VARCHAR(16), how_many INTEGER, size INTEGER, folders BOOLEAN,
root VARCHAR, cwd VARCHAR,
CONSTRAINT fk_pa_user_dn FOREIGN KEY (pa_user_dn) REFERENCES pa_user(dn),
CONSTRAINT pk_pa_user_states_id PRIMARY KEY(id ) );
create table FILE_TYPE ( ID integer, NAME varchar(32) unique, constraint PK_FILE_TYPE_ID primary key(ID) );
create table PATH_TYPE ( ID integer, NAME varchar(16) unique, constraint PK_PATH_TYPE_ID primary key(ID) );
create table PATH ( ID integer, TYPE_ID integer, PATH_PREFIX varchar(1024), NUM_FILES integer,
constraint PK_PATH_ID primary key(ID),
constraint FK_PATH_TYPE_TYPE_ID foreign key (TYPE_ID) references PATH_TYPE(ID) );
create table ENTRY( ID integer, NAME varchar(128), TYPE_ID integer, EXISTS_ON_FS boolean,
constraint PK_ENTRY_ID primary key(ID),
constraint FK_FILE_TYPE_TYPE_ID foreign key (TYPE_ID) references FILE_TYPE(ID) );
create table FILE ( EID integer, SIZE_MB integer, HASH varchar(34), THUMBNAIL varchar, FACES_CREATED_ON float, LAST_HASH_DATE float, LAST_AI_SCAN float, YEAR integer, MONTH integer, DAY integer, WOY integer,
constraint PK_FILE_ID primary key(EID),
constraint FK_FILE_ENTRY_ID foreign key (EID) references ENTRY(ID) );
create table DEL_FILE ( FILE_EID integer, ORIG_PATH_PREFIX varchar(1024), constraint PK_DEL_FILE_FILE_EID primary key (FILE_EID),
constraint FK_ENTRY_ID foreign key (FILE_EID) references FILE(EID) );
create table DIR ( EID integer, REL_PATH varchar(256), NUM_FILES integer, LAST_IMPORT_DATE float,
constraint PK_DIR_EID primary key(EID),
constraint FK_DIR_ENTRY_ID foreign key (EID) references ENTRY(ID) );
create table PATH_DIR_LINK ( path_id integer, dir_eid integer,
constraint PK_PDL_path_id_dir_eid primary key (path_id, dir_eid),
constraint FK_PDL_PATH_ID foreign key (PATH_ID) references PATH(ID),
constraint FK_PDL_DIR_EID foreign key (DIR_EID) references DIR(EID) );
create table ENTRY_DIR_LINK ( entry_id integer, dir_eid integer,
constraint PK_EDL_entry_id_dir_eid primary key (entry_id, dir_eid),
constraint FK_EDL_ENTRY_ID foreign key (ENTRY_ID) references ENTRY(ID),
constraint FK_EDL_DIR_EID foreign key (DIR_EID) references DIR(EID) );
create table PERSON ( ID integer default nextval('PERSON_ID_SEQ'), TAG varchar(48), FIRSTNAME varchar(48), SURNAME varchar(48),
constraint PK_PERSON_ID primary key(ID) );
alter sequence PERSON_ID_SEQ owned by PERSON.ID;
CREATE TABLE query ( id INTEGER, path_type VARCHAR(16), noo VARCHAR(16), grouping VARCHAR(16), q_offset INTEGER,
entry_list VARCHAR, folders BOOLEAN, root VARCHAR, cwd VARCHAR, search_term VARCHAR, current INTEGER,
created TIMESTAMPTZ,
CONSTRAINT pk_query_id PRIMARY KEY(id ) );
create table REFIMG ( ID integer, FNAME varchar(128), FACE bytea, ORIG_W integer, ORIG_H integer,
FACE_TOP integer, FACE_RIGHT integer, FACE_BOTTOM integer, FACE_LEFT integer, CREATED_ON float, THUMBNAIL varchar, MODEL_USED integer,
constraint PK_REFIMG_ID primary key(ID),
constraint FK_REFIMG_MODEL_USED foreign key (MODEL_USED) references AI_MODEL(ID) );
alter sequence REFIMG_ID_SEQ owned by REFIMG.ID;
CREATE TABLE file_type ( id INTEGER, name VARCHAR(32) UNIQUE, CONSTRAINT pk_file_type_id PRIMARY KEY(id) );
create table FACE( ID integer, FACE bytea, FACE_TOP integer, FACE_RIGHT integer, FACE_BOTTOM integer, FACE_LEFT integer,
W integer, H integer, constraint PK_FACE_ID primary key(ID) );
CREATE TABLE path_type ( id INTEGER, name VARCHAR(16) UNIQUE, CONSTRAINT pk_path_type_id PRIMARY KEY(id) );
create table FACE_FILE_LINK( FACE_ID integer, FILE_EID integer, MODEL_USED integer,
constraint PK_FFL_FACE_ID_FILE_ID primary key(FACE_ID, FILE_EID),
constraint FK_FFL_FACE_ID foreign key (FACE_ID) references FACE(ID) on delete cascade,
constraint FK_FFL_FILE_EID foreign key (FILE_EID) references FILE(EID),
constraint FK_FFL_MODEL_USED foreign key (MODEL_USED) references AI_MODEL(ID) );
CREATE TABLE path ( id INTEGER, type_id INTEGER, path_prefix VARCHAR(1024), num_files INTEGER,
CONSTRAINT pk_path_id PRIMARY KEY(id),
CONSTRAINT fk_path_type_type_id FOREIGN KEY (type_id) REFERENCES path_type(id) );
create table FACE_REFIMG_LINK( FACE_ID integer, REFIMG_ID integer, FACE_DISTANCE float,
constraint PK_FRL_FACE_ID_REFIMG_ID primary key(FACE_ID, REFIMG_ID),
constraint FK_FRL_FACE_ID foreign key (FACE_ID) references FACE(ID) on delete cascade,
constraint FK_FRL_REFIMG_ID foreign key (REFIMG_ID) references REFIMG(ID) );
CREATE TABLE entry( id INTEGER, name VARCHAR(128), type_id INTEGER, exists_on_fs BOOLEAN,
CONSTRAINT pk_entry_id PRIMARY KEY(id),
CONSTRAINT fk_file_type_type_id FOREIGN KEY (type_id) REFERENCES file_type(id) );
create table FACE_OVERRIDE_TYPE ( ID integer, NAME varchar unique, constraint PK_FACE_OVERRIDE_TYPE_ID primary key(ID) );
insert into FACE_OVERRIDE_TYPE values ( (select nextval('FACE_OVERRIDE_TYPE_ID_SEQ')), 'Manual match to existing person' );
insert into FACE_OVERRIDE_TYPE values ( (select nextval('FACE_OVERRIDE_TYPE_ID_SEQ')), 'Not a face' );
insert into FACE_OVERRIDE_TYPE values ( (select nextval('FACE_OVERRIDE_TYPE_ID_SEQ')), 'Too young' );
insert into FACE_OVERRIDE_TYPE values ( (select nextval('FACE_OVERRIDE_TYPE_ID_SEQ')), 'Ignore face' );
CREATE TABLE file ( eid INTEGER, size_mb INTEGER, hash VARCHAR(34), thumbnail VARCHAR, faces_created_on FLOAT, last_hash_date FLOAT, last_ai_scan FLOAT, year INTEGER, month INTEGER, day INTEGER, woy INTEGER,
CONSTRAINT pk_file_id PRIMARY KEY(eid),
CONSTRAINT fk_file_entry_id FOREIGN KEY (eid) REFERENCES entry(id) );
CREATE TABLE del_file ( file_eid INTEGER, orig_path_prefix VARCHAR(1024), CONSTRAINT pk_del_file_file_eid PRIMARY KEY (file_eid),
CONSTRAINT fk_entry_id FOREIGN KEY (file_eid) REFERENCES file(eid) );
CREATE TABLE dir ( eid INTEGER, rel_path VARCHAR(256), num_files INTEGER, last_import_date FLOAT,
CONSTRAINT pk_dir_eid PRIMARY KEY(eid),
CONSTRAINT fk_dir_entry_id FOREIGN KEY (eid) REFERENCES entry(id) );
CREATE TABLE path_dir_link ( PATH_ID INTEGER, DIR_EID INTEGER,
CONSTRAINT pk_pdl_PATH_ID_DIR_EID PRIMARY KEY (PATH_ID, DIR_EID),
CONSTRAINT fk_pdl_path_id FOREIGN KEY (path_id) REFERENCES path(id),
CONSTRAINT fk_pdl_dir_eid FOREIGN KEY (dir_eid) REFERENCES dir(eid) );
CREATE TABLE entry_dir_link ( ENTRY_ID INTEGER, DIR_EID INTEGER,
CONSTRAINT pk_edl_ENTRY_ID_DIR_EID PRIMARY KEY (ENTRY_ID, DIR_EID),
CONSTRAINT fk_edl_entry_id FOREIGN KEY (entry_id) REFERENCES entry(id),
CONSTRAINT fk_edl_dir_eid FOREIGN KEY (dir_eid) REFERENCES dir(eid) );
CREATE TABLE person ( id INTEGER DEFAULT NEXTVAL('person_id_seq'), tag VARCHAR(48), firstname VARCHAR(48), surname VARCHAR(48),
CONSTRAINT pk_person_id PRIMARY KEY(id) );
ALTER SEQUENCE person_id_seq OWNED BY person.id;
CREATE TABLE refimg ( id INTEGER, fname VARCHAR(128), face BYTEA, orig_w INTEGER, orig_h INTEGER,
face_top INTEGER, face_right INTEGER, face_bottom INTEGER, face_left INTEGER, created_on FLOAT, thumbnail VARCHAR, model_used INTEGER,
CONSTRAINT pk_refimg_id PRIMARY KEY(id),
CONSTRAINT fk_refimg_model_used FOREIGN KEY (model_used) REFERENCES ai_model(id) );
ALTER SEQUENCE refimg_id_seq OWNED BY refimg.id;
CREATE TABLE face( id INTEGER, face BYTEA, face_top INTEGER, face_right INTEGER, face_bottom INTEGER, face_left INTEGER,
w INTEGER, h INTEGER, CONSTRAINT pk_face_id PRIMARY KEY(id) );
CREATE TABLE face_file_link( face_id INTEGER, file_eid INTEGER, model_used INTEGER,
CONSTRAINT pk_ffl_face_id_file_id PRIMARY KEY(face_id, file_eid),
CONSTRAINT fk_ffl_face_id FOREIGN KEY (face_id) REFERENCES face(id) ON DELETE CASCADE,
CONSTRAINT fk_ffl_file_eid FOREIGN KEY (file_eid) REFERENCES file(eid),
CONSTRAINT fk_ffl_model_used FOREIGN KEY (model_used) REFERENCES ai_model(id) );
CREATE TABLE face_refimg_link( face_id INTEGER, refimg_id INTEGER, face_distance FLOAT,
CONSTRAINT pk_frl_face_id_refimg_id PRIMARY KEY(face_id, refimg_id),
CONSTRAINT fk_frl_face_id FOREIGN KEY (face_id) REFERENCES face(id) ON DELETE CASCADE,
CONSTRAINT fk_frl_refimg_id FOREIGN KEY (refimg_id) REFERENCES refimg(id) );
CREATE TABLE face_override_type ( id INTEGER, name VARCHAR UNIQUE, CONSTRAINT pk_face_override_type_id PRIMARY KEY(id) );
INSERT INTO face_override_type VALUES ( (SELECT NEXTVAL('face_override_type_id_seq')), 'Manual match to existing person' );
INSERT INTO face_override_type VALUES ( (SELECT NEXTVAL('face_override_type_id_seq')), 'Not a face' );
INSERT INTO face_override_type VALUES ( (SELECT NEXTVAL('face_override_type_id_seq')), 'Too young' );
INSERT INTO face_override_type VALUES ( (SELECT NEXTVAL('face_override_type_id_seq')), 'Ignore face' );
-- keep non-redundant FACE because, when we rebuild data, we may have a null FACE_ID but still want to connect to this override
-- from a previous AI pass... (this would happen if we delete a file and then reimport/scan it, OR, more likely, if we change (say) a threshold, etc.)
-- any reordering of faces generates new face_ids... (but if the face data was the same, then this override should stand)
create table FACE_NO_MATCH_OVERRIDE ( ID integer, FACE_ID integer, TYPE_ID integer,
constraint FK_FNMO_FACE_ID foreign key (FACE_ID) references FACE(ID),
constraint FK_FNMO_TYPE foreign key (TYPE_ID) references FACE_OVERRIDE_TYPE(ID),
constraint PK_FNMO_ID primary key(ID) );
CREATE TABLE face_no_match_override ( id INTEGER, face_id INTEGER, type_id INTEGER,
CONSTRAINT fk_fnmo_face_id FOREIGN KEY (face_id) REFERENCES face(id),
CONSTRAINT fk_fnmo_type FOREIGN KEY (type_id) REFERENCES face_override_type(id),
CONSTRAINT pk_fnmo_id PRIMARY KEY(id) );
-- manual match goes to person, not refimg, so on search, etc. we deal with this anomaly via SQL rather than the ORM (see the sketch below)
create table FACE_FORCE_MATCH_OVERRIDE ( ID integer, FACE_ID integer, PERSON_ID integer, constraint PK_FACE_FORCE_MATCH_OVERRIDE_ID primary key(ID) );
CREATE TABLE face_force_match_override ( id INTEGER, face_id INTEGER, person_id INTEGER, CONSTRAINT pk_face_force_match_override_id PRIMARY KEY(id) );
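-- Hypothetical sketch only (not taken from the app's code) of the search-time union the comment
-- above alludes to: files showing a given person come either via a refimg match or via a forced
-- match straight to person. The literal person id 1 is illustrative.
-- SELECT ffl.file_eid
--   FROM face_refimg_link frl
--   JOIN person_refimg_link prl ON prl.refimg_id = frl.refimg_id
--   JOIN face_file_link ffl ON ffl.face_id = frl.face_id
--  WHERE prl.person_id = 1
-- UNION
-- SELECT ffl.file_eid
--   FROM face_force_match_override ffmo
--   JOIN face_file_link ffl ON ffl.face_id = ffmo.face_id
--  WHERE ffmo.person_id = 1;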
create table DISCONNECTED_NO_MATCH_OVERRIDE ( FACE bytea, TYPE_ID integer,
constraint FK_DNMO_TYPE_ID foreign key (TYPE_ID) references FACE_OVERRIDE_TYPE(ID),
constraint PK_DNMO_FACE primary key (FACE) );
CREATE TABLE disconnected_no_match_override ( face BYTEA, type_id INTEGER,
CONSTRAINT fk_dnmo_type_id FOREIGN KEY (type_id) REFERENCES face_override_type(id),
CONSTRAINT pk_dnmo_face PRIMARY KEY (face) );
create table DISCONNECTED_FORCE_MATCH_OVERRIDE ( FACE bytea, PERSON_ID integer,
constraint FK_DFMO_PERSON_ID foreign key (PERSON_ID) references PERSON(ID),
constraint PK_DFMO_FACE primary key (FACE) );
CREATE TABLE disconnected_force_match_override ( face BYTEA, person_id INTEGER,
CONSTRAINT fk_dfmo_person_id FOREIGN KEY (person_id) REFERENCES person(id),
CONSTRAINT pk_dfmo_face PRIMARY KEY (face) );
create table PERSON_REFIMG_LINK ( PERSON_ID integer, REFIMG_ID integer,
constraint PK_PRL primary key(PERSON_ID, REFIMG_ID),
constraint FK_PRL_PERSON_ID foreign key (PERSON_ID) references PERSON(ID),
constraint FK_PRL_REFIMG_ID foreign key (REFIMG_ID) references REFIMG(ID),
constraint U_PRL_REFIMG_ID unique(REFIMG_ID) );
CREATE TABLE person_refimg_link ( person_id INTEGER, refimg_id INTEGER,
CONSTRAINT pk_prl PRIMARY KEY(person_id, refimg_id),
CONSTRAINT fk_prl_person_id FOREIGN KEY (person_id) REFERENCES person(id),
CONSTRAINT fk_prl_refimg_id FOREIGN KEY (refimg_id) REFERENCES refimg(id),
CONSTRAINT u_prl_refimg_id UNIQUE(refimg_id) );
create table JOB (
ID integer, START_TIME timestamptz, LAST_UPDATE timestamptz, NAME varchar(64), STATE varchar(128),
NUM_FILES integer, CURRENT_FILE_NUM integer, CURRENT_FILE varchar(256), WAIT_FOR integer, PA_JOB_STATE varchar(48),
constraint PK_JOB_ID primary key(ID) );
CREATE TABLE job (
id INTEGER, start_time TIMESTAMPTZ, last_update TIMESTAMPTZ, name VARCHAR(64), state VARCHAR(128),
num_files INTEGER, current_file_num INTEGER, current_file VARCHAR(256), wait_for INTEGER, pa_job_state VARCHAR(48),
CONSTRAINT pk_job_id PRIMARY KEY(id) );
-- used to pass / keep extra values, e.g. num_files for jobs that have sets of files, or out* for output from a job that you want to pass to the next job in the chain
create table JOBEXTRA ( ID integer, JOB_ID integer, NAME varchar(32), VALUE varchar,
constraint PK_JOBEXTRA_ID primary key(ID), constraint FK_JOBEXTRA_JOB_ID foreign key(JOB_ID) references JOB(ID) );
CREATE TABLE jobextra ( id INTEGER, job_id INTEGER, name VARCHAR(32), value VARCHAR,
CONSTRAINT pk_jobextra_id PRIMARY KEY(id), CONSTRAINT fk_jobextra_job_id FOREIGN KEY(job_id) REFERENCES job(id) );
create table JOBLOG ( ID integer, JOB_ID integer, LOG_DATE timestamptz, LOG varchar,
constraint PK_JL_ID primary key(ID), constraint FK_JL_JOB_ID foreign key(JOB_ID) references JOB(ID) );
CREATE TABLE joblog ( id INTEGER, job_id INTEGER, log_date TIMESTAMPTZ, log VARCHAR,
CONSTRAINT pk_jl_id PRIMARY KEY(id), CONSTRAINT fk_jl_job_id FOREIGN KEY(job_id) REFERENCES job(id) );
create table PA_JOB_MANAGER_FE_MESSAGE ( ID integer, JOB_ID integer, LEVEL varchar(16), MESSAGE varchar(1024), PERSISTENT boolean, CANT_CLOSE boolean,
constraint PA_JOB_MANAGER_FE_ACKS_ID primary key(ID),
constraint FK_PA_JOB_MANAGER_FE_MESSAGE_JOB_ID foreign key(JOB_ID) references JOB(ID) );
CREATE TABLE pa_job_manager_fe_message ( id INTEGER, job_id INTEGER, level VARCHAR(16), message VARCHAR(8192), persistent BOOLEAN, cant_close BOOLEAN,
CONSTRAINT pk_pa_job_manager_fe_acks_id PRIMARY KEY(id),
CONSTRAINT fk_pa_job_manager_fe_message_job_id FOREIGN KEY(job_id) REFERENCES job(id) );
CREATE TABLE amendment_type ( id INTEGER, which VARCHAR(8), what VARCHAR(32), colour VARCHAR(32),
CONSTRAINT pk_amendment_type_id PRIMARY KEY(id) );
INSERT INTO amendment_type ( id, which, what, colour ) VALUES ( 1, 'icon', 'trash', 'red' );
INSERT INTO amendment_type ( id, which, what, colour ) VALUES ( 2, 'img', 'rot90.png', '#009EFF' );
INSERT INTO amendment_type ( id, which, what, colour ) VALUES ( 3, 'img', 'rot180.png', '#009EFF' );
INSERT INTO amendment_type ( id, which, what, colour ) VALUES ( 4, 'img', 'rot270.png', '#009EFF' );
INSERT INTO amendment_type ( id, which, what, colour ) VALUES ( 5, 'icon', 'flip_h', '#009EFF' );
INSERT INTO amendment_type ( id, which, what, colour ) VALUES ( 6, 'icon', 'flip_v', '#009EFF' );
CREATE TABLE entry_amendment ( eid INTEGER, amend_type INTEGER,
CONSTRAINT pk_entry_amendment_eid_name PRIMARY KEY(eid,amend_type),
CONSTRAINT fk_entry_amendment_amendment_type FOREIGN KEY(amend_type) REFERENCES amendment_type(id) );
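-- Hypothetical sketch of how a page render might resolve the overlay for one entry from these two
-- tables; the eid 123 is illustrative only.
-- SELECT ea.eid, amt.which, amt.what, amt.colour
--   FROM entry_amendment ea
--   JOIN amendment_type amt ON amt.id = ea.amend_type
--  WHERE ea.eid = 123;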
-- default data for types of paths
insert into PATH_TYPE values ( (select nextval('PATH_TYPE_ID_SEQ')), 'Import' );
insert into PATH_TYPE values ( (select nextval('PATH_TYPE_ID_SEQ')), 'Storage' );
insert into PATH_TYPE values ( (select nextval('PATH_TYPE_ID_SEQ')), 'Bin' );
insert into PATH_TYPE values ( (select nextval('PATH_TYPE_ID_SEQ')), 'Metadata' );
INSERT INTO path_type VALUES ( (SELECT NEXTVAL('path_type_id_seq')), 'Import' );
INSERT INTO path_type VALUES ( (SELECT NEXTVAL('path_type_id_seq')), 'Storage' );
INSERT INTO path_type VALUES ( (SELECT NEXTVAL('path_type_id_seq')), 'Bin' );
INSERT INTO path_type VALUES ( (SELECT NEXTVAL('path_type_id_seq')), 'Metadata' );
-- default data for types of files
insert into FILE_TYPE values ( (select nextval('FILE_TYPE_ID_SEQ')), 'Image' );
insert into FILE_TYPE values ( (select nextval('FILE_TYPE_ID_SEQ')), 'Video' );
insert into FILE_TYPE values ( (select nextval('FILE_TYPE_ID_SEQ')), 'Directory' );
insert into FILE_TYPE values ( (select nextval('FILE_TYPE_ID_SEQ')), 'Unknown' );
INSERT INTO file_type VALUES ( (SELECT NEXTVAL('file_type_id_seq')), 'Image' );
INSERT INTO file_type VALUES ( (SELECT NEXTVAL('file_type_id_seq')), 'Video' );
INSERT INTO file_type VALUES ( (SELECT NEXTVAL('file_type_id_seq')), 'Directory' );
INSERT INTO file_type VALUES ( (SELECT NEXTVAL('file_type_id_seq')), 'Unknown' );
-- fake data only for making testing easier
--insert into PERSON values ( (select nextval('PERSON_ID_SEQ')), 'dad', 'Damien', 'De Paoli' );
--insert into PERSON values ( (select nextval('PERSON_ID_SEQ')), 'mum', 'Mandy', 'De Paoli' );
--insert into PERSON values ( (select nextval('PERSON_ID_SEQ')), 'cam', 'Cameron', 'De Paoli' );
--insert into PERSON values ( (select nextval('PERSON_ID_SEQ')), 'mich', 'Michelle', 'De Paoli' );
--INSERT INTO person VALUES ( (SELECT NEXTVAL('person_id_seq')), 'dad', 'Damien', 'De Paoli' );
--INSERT INTO person VALUES ( (SELECT NEXTVAL('person_id_seq')), 'mum', 'Mandy', 'De Paoli' );
--INSERT INTO person VALUES ( (SELECT NEXTVAL('person_id_seq')), 'cam', 'Cameron', 'De Paoli' );
--INSERT INTO person VALUES ( (SELECT NEXTVAL('person_id_seq')), 'mich', 'Michelle', 'De Paoli' );
-- DEV(ddp):
insert into SETTINGS ( id, base_path, import_path, storage_path, recycle_bin_path, metadata_path, auto_rotate, default_refimg_model, default_scan_model, default_threshold, face_size_limit, scheduled_import_scan, scheduled_storage_scan, scheduled_bin_cleanup, bin_cleanup_file_age, job_archive_age ) values ( (select nextval('SETTINGS_ID_SEQ')), '/home/ddp/src/photoassistant/', 'images_to_process/', 'photos/', '.pa_bin/', '.pa_metadata/', true, 1, 1, '0.55', 43, 1, 1, 7, 30, 3 );
INSERT INTO settings ( id, base_path, import_path, storage_path, recycle_bin_path, metadata_path, auto_rotate, default_refimg_model, default_scan_model, default_threshold, face_size_limit, scheduled_import_scan, scheduled_storage_scan, scheduled_bin_cleanup, bin_cleanup_file_age, job_archive_age ) VALUES ( (SELECT NEXTVAL('settings_id_seq')), '/home/ddp/src/photoassistant/', 'images_to_process/', 'photos/', '.pa_bin/', '.pa_metadata/', true, 1, 1, '0.55', 43, 1, 1, 7, 30, 3 );
-- DEV(cam):
--insert into SETTINGS ( id, base_path, import_path, storage_path, recycle_bin_path, metadata_path, auto_rotate, default_refimg_model, default_scan_model, default_threshold, face_size_limit, scheduled_import_scan, scheduled_storage_scan, scheduled_bin_cleanup, bin_cleanup_file_age, job_archive_age ) values ( (select nextval('SETTINGS_ID_SEQ')), 'c:/Users/cam/Desktop/code/python/photoassistant/', 'c:\images_to_process', 'photos/', '.pa_bin/', '.pa_metadata/', true, 1, 1, '0.55', 43, 1, 1, 7, 30, 3 );
--INSERT INTO settings ( id, base_path, import_path, storage_path, recycle_bin_path, metadata_path, auto_rotate, default_refimg_model, default_scan_model, default_threshold, face_size_limit, scheduled_import_scan, scheduled_storage_scan, scheduled_bin_cleanup, bin_cleanup_file_age, job_archive_age ) VALUES ( (select nextval('SETTINGS_ID_SEQ')), 'c:/Users/cam/Desktop/code/python/photoassistant/', 'c:\images_to_process', 'photos/', '.pa_bin/', '.pa_metadata/', TRUE, 1, 1, '0.55', 43, 1, 1, 7, 30, 3 );
-- PROD:
--insert into SETTINGS ( id, base_path, import_path, storage_path, recycle_bin_path, metadata_path, auto_rotate, default_refimg_model, default_scan_model, default_threshold, face_size_limit, scheduled_import_scan, scheduled_storage_scan, scheduled_bin_cleanup, bin_cleanup_file_age, job_archive_age ) values ( (select nextval('SETTINGS_ID_SEQ')), '/export/docker/storage/', 'Camera_uploads/', 'photos/', '.pa_bin/', '.pa_metadata/', true, 1, 1, '0.55', 43, 1, 1, 7, 30, 4 );
--INSERT INTO settings ( id, base_path, import_path, storage_path, recycle_bin_path, metadata_path, auto_rotate, default_refimg_model, default_scan_model, default_threshold, face_size_limit, scheduled_import_scan, scheduled_storage_scan, scheduled_bin_cleanup, bin_cleanup_file_age, job_archive_age ) VALUES ( (SELECT NEXTVAL('settings_id_seq')), '/export/docker/storage/', 'Camera_uploads/', 'photos/', '.pa_bin/', '.pa_metadata/', TRUE, 1, 1, '0.55', 43, 1, 1, 7, 30, 4 );

View File

@@ -4,10 +4,9 @@
<div class="container">
<div class="row px-3 my-5 offset-lg-3 col-lg-6 col-sm-12" style="border: 3px solid #5bc0de; border-radius: 15px;">
<h3 class="my-3 text-center" style="color: #5bc0de">
{# <svg width="64" height="64" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#pa_logo" /></svg>&nbsp;About Photo Assistant</h3> #}
<svg width="64" height="64" fill="currentColor"><use xlink:href="internal/icons.svg#pa_logo" /></svg>&nbsp;Photo Assistant - About</h3>
<img width="64" height="64" src="internal/pa-logo.png"/></img>&nbsp;Photo Assistant - About</h3>
<div class="alert alert-info">
{{build_info}}
<center>{{build_info}}</center>
</div>
</div>
<nav>

View File

@@ -29,7 +29,7 @@
<script>
function HandleSearch()
{
st=$('#search_term').val();
st=encodeURIComponent( $('#search_term').val() )
if( st != '' ) {
document.location.href='/search/'+st
}
@@ -101,6 +101,7 @@
{% if config.ENV != "production" %}
<a class="dropdown-item" href="{{url_for('force_scan')}}">Force Scan (delete data & rebuild)</a>
{% endif %}
<a class="dropdown-item" href="{{url_for('force_clear')}}">Clear Messages (Force)</a>
<a class="dropdown-item" href="{{url_for('wake_up')}}">Force wake the job manager</a>
</div class="dropdow-menu">
</div class="nav-item dropdown">
@@ -135,7 +136,7 @@
{% if not InDBox %}
{%block script_content %}{% endblock script_content %}
<div id="status_container" class="position-fixed top-0 end-0 p-0 my-5" "z-index: 11"> </div>
<div id="status_container" class="position-fixed top-0 end-0 p-0 my-5" style="z-index: 9999"> </div>
<!-- CheckForJobs(), will see if there are any messages/jobs and keep doing this until there are 0 more and then stop -->
<script>
$(document).ready(function() { CheckForJobs() } )

View File

@@ -1,64 +1,52 @@
{% extends "base.html" %} {% block main_content %}
<script src="{{ url_for( 'internal', filename='js/files_support.js')}}?v={{js_vers['fs']}}"></script>
<div class="container-fluid">
<h3 class="offset-2">{{page_title}}</h3>
<form id="main_form" method="POST">
<input id="offset" type="hidden" name="offset" value="{{OPT.offset}}">
<input id="grouping" type="hidden" name="grouping" value="">
<input id="folders" type="hidden" name="folders" value="False">
<div class="col col-auto">
<div class="input-group">
{{CreateSelect( "noo", OPT.noo, ["Oldest", "Newest","A to Z", "Z to A"], "$('#offset').val(0)", "rounded-start py-1 my-1")|safe }}
{{CreateSelect( "how_many", OPT.how_many|string, ["10", "25", "50", "75", "100", "150", "200", "500"], "", "rounded-end py-1 my-1" )|safe }}
{{CreateSelect( "noo", OPT.noo, ["Oldest", "Newest","A to Z", "Z to A"], "changeOPT(getPageFileList); return false", "rounded-start py-1 my-1")|safe }}
{{CreateSelect( "how_many", OPT.how_many|string, ["10", "25", "50", "75", "100", "150", "200", "500"], "changeOPT(getPageFileList); return false", "rounded-end py-1 my-1" )|safe }}
<div class="mb-1 col my-auto d-flex justify-content-center">
{% set prv_disabled="" %}
{% if OPT.offset|int == 0 %}
{% set prv_disabled="disabled" %}
{% endif %}
<button id="prev" {{prv_disabled}} name="prev" class="prev sm-txt btn btn-outline-secondary">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#prev"/></svg>
<button id="prev" name="prev" class="prev sm-txt btn btn-outline-secondary" onClick="prevPage(getPageFileList)">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#prev"/></svg>
</button>
<span class="sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
{% set nxt_disabled="" %}
{% if entry_data|length < OPT.how_many|int %}
{% set nxt_disabled="disabled" %}
{% endif %}
<button id="next" {{nxt_disabled}} name="next" class="next sm-txt btn btn-outline-secondary">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#next"/></svg>
<span class="how_many_text sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
<button id="next" name="next" class="next sm-txt btn btn-outline-secondary" onClick="nextPage(getPageFileList)">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#next"/></svg>
</button>
</div class="col...">
</div class="input-group...">
</div class="col col-auto">
</form>
<div class="row">
<table class="table table-striped table-sm col-xl-12">
<thead><tr class="table-primary"><th>Name</th><th>Size (MB)</th><th>Path Prefix</th><th>Hash</th></tr></thead><tbody>
{% for obj in entry_data %}
<tr><td>
{% if obj.type.name == "Image" or obj.type.name == "Video" %}
<figure class="figure" font-size: 24px;>
<div style="position:relative; width:100%">
{% if obj.type.name=="Image" %}
<a href="{{obj.in_dir.in_path.path_prefix}}/{{obj.in_dir.rel_path}}/{{obj.name}}">
{% elif obj.type.name == "Video" %}
<a href="{{obj.in_dir.in_path.path_prefix}}/{{obj.in_dir.rel_path}}/{{obj.name}}">
{% endif %}
<img class="thumb" style="display:block" height="48" src="data:image/jpeg;base64,{{obj.file_details.thumbnail}}"></img>
{% if obj.type.name=="Image" or obj.type.name == "Video" %}
</a>
{% endif %}
</div>
<figcaption class="figure-caption">{{obj.name}}</figcaption>
</figure>
{% endif %}
</td>
{% if obj.type.name != "Directory" %}
<td>{{obj.file_details.size_mb}}</td><td>{{obj.in_dir.in_path.path_prefix.replace("static/","")}}/{{obj.in_dir.rel_path}}</td><td>{{obj.file_details.hash}}</td>
{% else %}
<td></td><td></td><td></td>
{% endif %}
</tr>
{% endfor %}
</tbody></table>
</div class="row">
</div class="container">
<div id="file_list_div" class="container-fluid pt-2">
</div class="container">
<div class="container-fluid">
<input type="hidden" name="cwd" id="cwd" value="{{OPT.cwd}}">
<div class="row">
<div class="col my-auto d-flex justify-content-center">
<button aria-label="prev" id="prev" name="prev" class="prev sm-txt btn btn-outline-secondary disabled" onClick="prevPage(getPageFileList)" disabled>
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#prev"/></svg>
</button>
<span class="how_many_text sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
<button aria-label="next" id="next" name="next" class="next sm-txt btn btn-outline-secondary" onClick="nextPage(getPageFileList)">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#next"/></svg>
</button>
</div class="col my-auto"> </div class="row">
</div class="container-fluid">
{% endblock main_content %}
{% block script_content %}
<script>
// this is the list of entry ids for the images for ALL matches for this query
var entryList={{query_data.entry_list}}
var OPT = {{ OPT.to_dict()|tojson }};
// set from query data and stored in OPT for convenience. It can be 0 -
// this implies no content in the Path at all
OPT.root_eid = {{ query_data.root_eid }};
// pageList is just those entries shown on this page from the full entryList
var pageList=[]
// force getPage to set pageList for & render the first page
getPage( 1, getPageFileList )
</script>
{% endblock script_content %}

View File

@@ -1,38 +1,43 @@
{% extends "base.html" %}
{% block main_content %}
<script src="{{ url_for( 'internal', filename='js/files_support.js')}}"></script>
<script src="{{ url_for( 'internal', filename='js/files_transform.js')}}"></script>
<style>
@media (max-width: 576px) {
#la, #ra {
padding: 5% !important;
}
}
.norm-txt { font-size: 1.0rem }
.form-check-input:checked {
background-color: #39C0ED;
border-color: #CFF4FC;
}
.form-switch .form-check-input {
background-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%2339C0ED'/%3e%3c/svg%3e");
}
.form-switch .form-check-input:focus {
background-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%23CFF4FC'/%3e%3c/svg%3e");
}
#tst90:hover,#tst90:focus { filter: invert(73%) sepia(27%) saturate(3970%) hue-rotate(146deg) brightness(94%) contrast(100%); }
</style>
<script src="{{ url_for( 'internal', filename='js/files_transform.js')}}?v={{ js_vers['ft'] }}"></script>
<script src="{{ url_for( 'internal', filename='js/files_support.js')}}?v={{ js_vers['fs'] }}"></script>
<script src="{{ url_for( 'internal', filename='js/view_support.js')}}?v={{ js_vers['vs'] }}"></script>
<script>
var move_paths=[]
{% for p in move_paths %}
p = new Object()
p.type = '{{p.type}}'
p.path = '{{p.path}}'
p.icon_url = '{{p.icon_url}}'
move_paths.push(p)
{% endfor %}
</script>
<div class="container-fluid">
<form id="main_form" method="POST" action="/change_file_opts">
<input type="hidden" name="cwd" id="cwd" value="{{OPT.cwd}}">
{% if search_term is defined %}
<input type="hidden" name="search_term" id="view_term" value="{{search_term}}">
{% endif %}
<div id="files_div">
<div class="container-fluid">
<div class="d-flex row mb-2">
{% if OPT.folders %}
<div class="my-auto col col-auto">
<span class="alert alert-primary py-2">
{% if "files_ip" in request.url %}
<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#import"/></svg>
<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#import"/></svg>
{% set tmp_path=OPT.cwd | replace( "static/Import", "" ) + "/" %}
{% elif "files_sp" in request.url %}
<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#db"/></svg>
<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#db"/></svg>
{% set tmp_path=OPT.cwd | replace( "static/Storage", "" ) + "/" %}
{% elif "files_rbp" in request.url %}
<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash"/></svg>
<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#trash"/></svg>
{% set tmp_path=OPT.cwd | replace( "static/Bin", "" ) + "/" %}
{% endif %}
{{tmp_path}}</span>
@@ -40,15 +45,14 @@
{% endif %}
<div class="col col-auto">
<div class="input-group">
{{CreateSelect( "noo", OPT.noo, ["Oldest", "Newest","A to Z", "Z to A"], "$('#offset').val(0)", "rounded-start py-2")|safe }}
{{CreateSelect( "how_many", OPT.how_many|string, ["10", "25", "50", "75", "100", "150", "200", "500"])|safe }}
{{CreateSelect( "noo", OPT.noo, ["Oldest", "Newest","A to Z", "Z to A"], "changeOPT(getPageFigures); return false", "rounded-start py-2")|safe }}
{{CreateSelect( "how_many", OPT.how_many|string, ["10", "25", "50", "75", "100", "150", "200", "500"], "changeOPT(getPageFigures); return false" )|safe }}
{% if OPT.folders %}
<input type="hidden" name="grouping" id="grouping" value="{{OPT.grouping}}">
{{CreateFoldersSelect( OPT.folders, "rounded-end" )|safe }}
{{CreateFoldersSelect( OPT.folders, "changeOPT(getPageFigures); return false", "rounded-end" )|safe }}
{% else %}
{{CreateFoldersSelect( OPT.folders )|safe }}
{{CreateFoldersSelect( OPT.folders, "changeOPT(getPageFigures); return false" )|safe }}
<span class="sm-txt my-auto btn btn-outline-info disabled border-top border-bottom">grouped by:</span>
{{CreateSelect( "grouping", OPT.grouping, ["None", "Day", "Week", "Month"], "", "rounded-end")|safe }}
{{CreateSelect( "grouping", OPT.grouping, ["None", "Day", "Week", "Month"], "OPT.grouping=$('#grouping').val();drawPageOfFigures();return false", "rounded-end")|safe }}
{% endif %}
</div class="input-group">
</div class="col">
@@ -56,349 +60,262 @@
<div class="col col-auto my-auto">
<span class="alert alert-primary p-2">Searched for: '{{search_term}}'</span>
</div class="col my-auto">
<script>
$('#folders').prop('disabled', 'disabled').removeClass('border-info').addClass('border-secondary').removeClass('text-info').addClass('text-secondary');
</script>
{% endif %}
<div class="col flex-grow-1 my-auto d-flex justify-content-center w-100">
<button aria-label="prev" id="prev" name="prev" class="prev sm-txt btn btn-outline-secondary">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#prev"/></svg>
<button aria-label="prev" id="prev" name="prev" class="prev sm-txt btn btn-outline-secondary disabled" onClick="prevPage(getPageFigures)" disabled>
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#prev"/></svg>
</button>
<span class="sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
{% set nxt_disabled="" %}
{% if entry_data|length < OPT.how_many|int %}
{% set nxt_disabled="disabled" %}
{% endif %}
<button aria-label="next" id="next" {{nxt_disabled}} name="next" class="next sm-txt btn btn-outline-secondary">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#next"/></svg>
<span class="how_many_text sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
<button aria-label="next" id="next" name="next" class="next sm-txt btn btn-outline-secondary" onClick="nextPage(getPageFigures)">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#next"/></svg>
</button>
<button aria-label="move" id="move" disabled name="move" class="sm-txt btn btn-outline-primary ms-4" onClick="MoveDBox(move_paths,'{{url_for('internal', filename='icons.svg')}}'); return false;">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#folder_plus"/></svg>
<button aria-label="move" id="move" disabled name="move" class="sm-txt btn btn-outline-primary ms-4" onClick="MoveDBox(move_paths); return false;">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#folder_plus"/></svg>
</button>
{% if "files_rbp" in request.url %}
<button aria-label="delete" id="del" disabled name="del" class="sm-txt btn btn-outline-success mx-1" onClick="DelDBox('Restore'); return false;">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash-fill"/></svg>
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#trash-fill"/></svg>
{% else %}
<button aria-label="delete" id="del" disabled name="del" class="sm-txt btn btn-outline-danger mx-1" onClick="DelDBox('Delete'); return false;">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash-fill"/></svg>
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#trash-fill"/></svg>
{% endif %}
</button>
</div>
<button style="visibility:hidden" class="btn btn-outline-secondary" aria-label="shift-key" id="shift-key" onclick="document.fake_shift=1-document.fake_shift; event.stopPropagation(); return false">shift</button>
<button style="visibility:hidden" class="btn btn-outline-secondary" aria-label="ctrl-key" id="ctrl-key" onclick="document.fake_ctrl=1-document.fake_ctrl; event.stopPropagation(); return false">ctrl</button>
</div class="col flex-grow-1">
<div class="d-flex col col-auto justify-content-end">
<div class="btn-group">
{% if OPT.size == 64 %}
{% set bt="btn-info text-white" %}
{% else %}
{% set bt="btn-outline-info" %}
{% endif %}
<button aria-label="extra small" id="64" class="px-2 sm-txt sz-but btn {{bt}}" onClick="$('#size').val(64)">XS</button>
{% if OPT.size == 96 %}
{% set bt="btn-info text-white" %}
{% else %}
{% set bt="btn-outline-info" %}
{% endif %}
<button aria-label="small" id="96" class="px-2 sm-txt sz-but btn {{bt}}" onClick="$('#size').val(96)">S</button>
{% if OPT.size == 128 %}
{% set bt="btn-info text-white" %}
{% else %}
{% set bt="btn-outline-info" %}
{% endif %}
<button aria-label="medium" id="128" class="px-2 sm-txt sz-but btn {{bt}}" onClick="$('#size').val(128)">M</button>
{% if OPT.size == 192 %}
{% set bt="btn-info text-white" %}
{% else %}
{% set bt="btn-outline-info" %}
{% endif %}
<button aria-label="large" id="192" class="px-2 sm-txt sz-but btn {{bt}}" onClick="$('#size').val(192)">L</button>
{% if OPT.size == 256 %}
{% set bt="btn-info text-white" %}
{% else %}
{% set bt="btn-outline-info" %}
{% endif %}
<button aria-label="extra large" id="256" class="px-2 sm-txt sz-but btn {{bt}}" onClick="$('#size').val(256)">XL</button>
</div class="btn-group">
</div class="col">
<input id="offset" type="hidden" name="offset" value="{{OPT.offset}}">
<input id="size" type="hidden" name="size" value="{{OPT.size}}">
</div class="form-row">
{% set eids=namespace( str="" ) %}
{# gather all the file eids in case we go into gallery mode #}
{% for obj in entry_data %}
{% if obj.type.name != "Directory" %}
{% set eids.str = eids.str + obj.id|string +"," %}
{% endif %}
{% endfor %}
<input name="eids" id="eids" type="hidden" value="{{eids.str}}">
</form>
</div>
{% set ecnt=namespace( val=0 ) %}
<div class="row ms-2">
{% set last = namespace(printed=0) %}
{# rare event of empty folder, still need to show back button #}
{% if OPT.folders and entry_data|length == 0 %}
{% if OPT.cwd != OPT.root %}
<figure id="_back" class="dir entry m-1" ecnt="{{ecnt.val}}" dir="{{OPT.cwd|ParentPath}}" type="Directory">
<svg class="svg" width="{{OPT.size|int-22}}" height="{{OPT.size|int-22}}"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#folder_back"/></svg>
<figcaption class="figure-caption text-center">Back</figcaption>
</figure class="figure">
{% set ecnt.val=ecnt.val+1 %}
<script>f=$('#_back'); w=f.find('svg').width(); f.find('figcaption').width(w);</script>
{% else %}
<div class="col col-auto g-0 m-1">
<svg class="svg" width="{{OPT.size|int-22}}" height="{{OPT.size|int-22}}"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#folder_back_gray"/></svg>
</div>
{% endif %}
{% endif %}
{% for obj in entry_data %}
{% if loop.index==1 and OPT.folders %}
{% if OPT.cwd != OPT.root %}
<figure class="col col-auto g-0 dir entry m-1" ecnt="{{ecnt.val}}" dir="{{OPT.cwd|ParentPath}}" type="Directory">
<svg class="svg" width="{{OPT.size|int-22}}" height="{{OPT.size|int-22}}" fill="currentColor">
<use xlink:href="{{url_for('internal', filename='icons.svg')}}#folder_back"/></svg>
<figcaption class="svg_cap figure-caption text-center">Back</figcaption>
</figure class="figure">
{% set ecnt.val=ecnt.val+1 %}
{% else %}
{# create an even lighter-grey, unclickable back button - so folders don't jump around when you go into them #}
<div class="col col-auto g-0 m-1">
<svg class="svg" width="{{OPT.size|int-22}}" height="{{OPT.size|int-22}}"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#folder_back_gray"/></svg>
<div class="btn-group" role="group" aria-label="Size radio button group">
<input type="radio" class="btn-check" name="size" id="size-xs" onCLick="changeSize()" autocomplete="off" value="64">
<label class="btn btn-outline-info btn-radio" for="size-xs">XS</label>
<input type="radio" class="btn-check" name="size" id="size-s" onCLick="changeSize()" autocomplete="off" value="96">
<label class="btn btn-outline-info btn-radio" for="size-s">S</label>
<input type="radio" class="btn-check" name="size" id="size-m" onCLick="changeSize()" autocomplete="off" value="128">
<label class="btn btn-outline-info btn-radio" for="size-m">M</label>
<input type="radio" class="btn-check" name="size" id="size-l" onCLick="changeSize()" autocomplete="off" value="192">
<label class="btn btn-outline-info btn-radio" for="size-l">L</label>
<input type="radio" class="btn-check" name="size" id="size-xl" onCLick="changeSize()" autocomplete="off" value="256">
<label class="btn btn-outline-info btn-radio" for="size-xl">XL</label>
</div>
{% endif %}
{% endif %}
{% if not OPT.folders and obj.type.name == "Directory" %}
{% continue %}
{% endif %}
{% if OPT.grouping == "Day" %}
{% if last.printed != obj.file_details.day %}
<div class="row ps-3"><h6>Day: {{obj.file_details.day}} of {{obj.file_details.month}}/{{obj.file_details.year}}</h6></div>
{% set last.printed = obj.file_details.day %}
{% endif %}
{% elif OPT.grouping == "Week" %}
{% if last.printed != obj.file_details.woy %}
<div class="row ps-3"><h6>Week #: {{obj.file_details.woy}} of {{obj.file_details.year}}</h6></div>
{% set last.printed = obj.file_details.woy %}
{% endif %}
{% elif OPT.grouping == "Month" %}
{% if last.printed != obj.file_details.month %}
<div class="row ps-3"><h6>Month: {{obj.file_details.month}} of {{obj.file_details.year}}</h6></div>
{% set last.printed = obj.file_details.month %}
{% endif %}
{% endif %}
{% if obj.type.name == "Image" or obj.type.name == "Video" or obj.type.name == "Unknown" %}
{% if (not OPT.folders) or ((obj.in_dir.in_path.path_prefix+'/'+obj.in_dir.rel_path+'/'+obj.name) | TopLevelFolderOf(OPT.cwd)) %}
<figure id="{{obj.id}}" ecnt="{{ecnt.val}}" class="col col-auto g-0 figure entry m-1" path_type="{{obj.in_dir.in_path.type.name}}" size="{{obj.file_details.size_mb}}" hash="{{obj.file_details.hash}}" in_dir="{{obj.in_dir.in_path.path_prefix}}/{{obj.in_dir.rel_path}}" fname="{{obj.name}}" yr="{{obj.file_details.year}}" date="{{obj.file_details.year}}{{"%02d" % obj.file_details.month}}{{"%02d" % obj.file_details.day}}" pretty_date="{{obj.file_details.day}}/{{obj.file_details.month}}/{{obj.file_details.year}}" type="{{obj.type.name}}">
{% if obj.type.name=="Image" or obj.type.name=="Unknown" %}
<div style="position:relative; width:100%">
{% if obj.file_details.thumbnail %}
<a href="{{obj.in_dir.in_path.path_prefix}}/{{obj.in_dir.rel_path}}/{{obj.name}}">
<img alt="{{obj.name}}" class="thumb" height="{{OPT.size}}" src="data:image/jpeg;base64,{{obj.file_details.thumbnail}}"></img></a>
{% else %}
<a href="{{obj.in_dir.in_path.path_prefix}}/{{obj.in_dir.rel_path}}/{{obj.name}}">
<svg width="{{OPT.size}}" height="{{OPT.size}}" fill="white"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#unknown_ftype"/></svg>
</a>
{% endif %}
{% if search_term is defined %}
<div style="position:absolute; bottom: 0px; left: 2px;">
<svg width="16" height="16" fill="white"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#{{LocationIcon(obj)}}"/></svg>
</div>
{% endif %}
<div id="s{{obj.id}}" style="display:none; position:absolute; top: 50%; left:50%; transform:translate(-50%, -50%);">
<img height="64px" src="{{url_for('internal', filename='throbber.gif')}}"></img>
</div>
</div>
{% elif obj.type.name == "Video" %}
<div style="position:relative; width:100%">
{% if obj.file_details.thumbnail %}
<a href="{{obj.in_dir.in_path.path_prefix}}/{{obj.in_dir.rel_path}}/{{obj.name}}">
<img alt="{{obj.name}}" class="thumb" height="{{OPT.size}}" src="data:image/jpeg;base64,{{obj.file_details.thumbnail}}"></img></a>
{% else %}
<a href="{{obj.in_dir.in_path.path_prefix}}/{{obj.in_dir.rel_path}}/{{obj.name}}">
<svg width="{{OPT.size}}" height="{{OPT.size}}" fill="white"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#unknown_ftype"/></svg>
</a>
{% endif %}
<div style="position:absolute; top: 0px; left: 2px;">
<svg width="16" height="16" fill="white"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#film"/></svg>
</div>
{% if search_term is defined %}
<div style="position:absolute; bottom: 0px; left: 2px;">
<svg width="16" height="16" fill="white"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#{{LocationIcon(obj)}}"/></svg>
</div>
{% endif %}
</div>
{% endif %}
</figure>
{% set ecnt.val=ecnt.val+1 %}
{% endif %}
{% elif obj.type.name == "Directory" %}
{% if OPT.folders %}
{% if obj.dir_details.rel_path | length %}
{% set dirname=obj.dir_details.in_path.path_prefix+'/'+obj.dir_details.rel_path %}
{% else %}
{% set dirname=obj.dir_details.in_path.path_prefix %}
{% endif %}
{# if this dir is the toplevel of the cwd, show the folder icon #}
{% if dirname| TopLevelFolderOf(OPT.cwd) %}
<figure class="col col-auto g-0 dir entry m-1" id={{obj.id}} ecnt={{ecnt.val}} dir="{{dirname}}" type="Directory">
<svg class="svg" width="{{OPT.size|int-22}}" height="{{OPT.size|int-22}}" fill="currentColor">
<use xlink:href="{{url_for('internal', filename='icons.svg')}}#Directory"/></svg>
<figcaption class="svg_cap figure-caption text-center text-wrap text-break">{{obj.name}}</figcaption>
</figure class="figure">
{% set ecnt.val=ecnt.val+1 %}
<script>f=$('#{{obj.id}}'); w=f.find('svg').width(); f.find('figcaption').width(w);</script>
{% endif %}
{% endif %}
{% endif %}
{% endfor %}
</div class="d-flex col">
</div class="d-flex row mb-2">
</div container="fluid">
<div id="figures" class="row ms-2">
</div>
<div class="container-fluid">
<form id="nav_form" method="POST" action="/change_file_opts">
<input type="hidden" name="cwd" id="cwd" value="{{OPT.cwd}}">
<div class="row">
<div class="col my-auto d-flex justify-content-center">
<button aria-label="prev" id="prev" name="prev" class="prev sm-txt btn btn-outline-secondary">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#prev"/></svg>
</button>
<span class="sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
<button aria-label="next" id="next" {{nxt_disabled}} name="next" class="next sm-txt btn btn-outline-secondary">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#next"/></svg>
</button>
</div>
<div class="row">
<div class="col my-auto d-flex justify-content-center">
<button aria-label="prev" id="prev" name="prev" class="prev sm-txt btn btn-outline-secondary disabled" onClick="prevPage(getPageFigures)" disabled>
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#prev"/></svg>
</button>
<span class="how_many_text sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
<button aria-label="next" id="next" name="next" class="next sm-txt btn btn-outline-secondary" onClick="nextPage(getPageFigures)">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#next"/></svg>
</button>
</div class="col my-auto">
</div class="row">
</div class="container-fluid">
</div id="files_div">
<div id="viewer_div" class="d-none">
<div id="viewer" class="container-fluid">
<div class="row flex-nowrap">
<!-- Left Buttons Column -->
<div class="col-auto d-flex flex-column min-width-0">
<!-- Up Button (Small) -->
<button title="Back to list" class="btn btn-outline-info btn-sm p-1 mb-1" onclick="goOutOfViewer()">
<svg width="16" height="16" fill="currentColor">
<use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#back"></use>
</svg>
</button>
<!-- Left Button (large/flex-grow-1) -->
<button title="Show previous image" class="btn btn-outline-info px-2 flex-grow-1 overflow-hidden"
style="padding: 10%" id="la" onClick="prevImageInViewer()">
<svg width="16" height="16" fill="currentColor">
<use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#prev"/></svg>
</button>
</div>
</form>
</div class="container">
<figure class="col col-auto border border-info rounded m-0 p-1" id="figure">
<canvas id="canvas"></canvas>
<img id="throbber" src="{{url_for('internal', filename='throbber.gif')}}?v={{js_vers[th]}}" style="display:none;">
<script>
var im=new Image();
im.onload=DrawImg
var context = canvas.getContext('2d')
</script>
<figcaption id="img-cap" class="figure-caption text-center text-wrap text-break">
<span id="fname_i"></span></figcaption>
</figure>
<div id="video_div" class="col col-auto">
<video id="video" class="col col-auto" controls>
<source id="videoSource" src="" type="video/mp4">
Your browser does not support the video tag.
</video>
<figcaption id="vid-cap" class="figure-caption text-center text-wrap text-break">
<span id="fname_v"></span></figcaption>
</div>
<!-- Right-hand Buttons Column -->
<div class="col-auto d-flex flex-column min-width-0">
<!-- Up Button (Small) -->
<button title="Back to list" class="btn btn-outline-info btn-sm p-1 mb-1" onclick="goOutOfViewer()">
<svg width="16" height="16" fill="currentColor">
<use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#back"></use>
</svg>
</button>
<!-- Right Button (large/flex-grow-1) -->
<button title="Show next image" class="btn btn-outline-info px-2 flex-grow-1 overflow-hidden"
style="padding: 10%" id="ra" onClick="nextImageInViewer()">
<svg width="16" height="16" fill="currentColor">
<use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#next"/></svg>
</button>
</div>
</div class="row">
<div class="row">
{# this whole div just takes up the same space as the left button and is hidden, purely for alignment #}
<div class="col-auto px-0">
<button class="btn btn-outline-info px-2 invisible" disabled>
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#next"/></svg>
</button>
</div>
<span class="col-auto my-auto">Show:</span>
<div title="Toggle showing filename (hotkey: n)" class="d-flex form-check form-switch border border-info rounded col col-auto my-auto py-1 justify-content-center ps-5">
<input class="form-check-input" type="checkbox" id="fname_toggle" onChange="$('.figure-caption').toggle()" checked>
<label class="form-check-label ps-1" for="fname_toggle">Filename</label>
</div>
<div title="Toggle showing matched faces (hotkey: f)" class="d-flex form-check form-switch border border-info rounded col col-auto my-auto py-1 justify-content-center ps-5">
<input class="form-check-input" type="checkbox" onChange="FaceToggle()" id="faces">
<label class="form-check-label ps-1" for="faces">Faces</label>
</div>
<div title="Toggle showing 'distance' on matched faces (hotkey: d)" class="d-flex form-check form-switch border border-info rounded col col-auto my-auto py-1 justify-content-center ps-5">
<input class="form-check-input" type="checkbox" onChange="DrawImg()" id="distance">
<label class="form-check-label ps-1" for="distance">Distance</label>
</div>
<div title="Change the model used to detect faces" class="col col-auto my-auto">
AI Model:
{# can use 0 as default, it will be (re)set correctly in DrawImg() anyway #}
{{CreateSelect( "model", 0, ["N/A", "normal", "slow/accurate"], "", "rounded norm-txt", [0,1,2])|safe }}
</div>
<div class="col col-auto pt-1">
<button class="btn btn-outline-info p-1" title="Rotate by 90 degrees" onClick="Transform(90)">
<img src="{{url_for('internal', filename='rot90.png')}}?v={{js_vers['r90']}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot90-invert.png')}}'"
onMouseOut="this.src='{{url_for('internal', filename='rot90.png')}}?v={{js_vers['r90']}}'" />
</button>
<button class="btn btn-outline-info p-1" title="Rotate by 180 degrees" onClick="Transform(180)">
<img src="{{url_for('internal', filename='rot180.png')}}?v={{js_vers['r180']}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot180-invert.png')}}'"
onMouseOut="this.src='{{url_for('internal', filename='rot180.png')}}?v={{js_vers['r180']}}'" />
</button>
<button class="btn btn-outline-info p-1" title="Rotate by 270 degrees" onClick="Transform(270)">
<img src="{{url_for('internal', filename='rot270.png')}}?v={{js_vers['r270']}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot270-invert.png')}}'"
onMouseOut="this.src='{{url_for('internal', filename='rot270.png')}}?v={{js_vers['r270']}}'" />
</button>
<button class="btn btn-outline-info p-1" title="Flip horizontally" onClick="Transform('fliph')">
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#flip_h"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Flip vertically" onClick="Transform('flipv')">
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#flip_v"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="View in Fullscreen mode (hotkey: F)" onClick="fullscreen=true; ViewImageOrVideo()">
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#fullscreen"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Show logs relating to this filename (hotkey: l)" onClick="JoblogSearch()">
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#log"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="View Original" onClick="window.location='/'+document.viewing.FullPathOnFS">
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#download"/></svg>
</button>
<button id="viewer_del" class="btn btn-outline-danger p-1" title="Delete (hotkey: Del)" onClick="DelDBox('Delete')">
<svg id="viewer_bin" width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#trash"/></svg>
</button>
</div>
</div class="row">
</div id="viewer">
</div id="viewer_div">
{% endblock main_content %}
{% block script_content %}
<script>
// GLOBALS
document.fake_shift=0
document.fake_ctrl=0
$('.figure').click( function(e) { DoSel(e, this ); SetButtonState(); return false; });
$(document).on('click', function(e) { $('.highlight').removeClass('highlight') ; SetButtonState() });
// FIXME: used by viewer code - should probably get rid of this?
var fullscreen=false;
function CallViewRouteWrapper()
{
CallViewRoute( $(this).attr("id") )
}
// this is the current entry (object) we are viewing - an image/video (used when we dbl-click to view & then in next/prev in view)
document.viewing=null;
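// Illustrative shape only - an assumption inferred from how document.viewing is used later in this
// template (e.g. the "View Original" button reads document.viewing.FullPathOnFS); the path is an example:
// document.viewing = { id: "eid-123", type: "Image", FullPathOnFS: "static/Storage/2024/img_0001.jpg" }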
function CallViewRoute(id)
{
s='<form id="_fm" method="POST" action="/view/' + id + '">'
s+='<input type="hidden" name="eids" value="'+$("#eids").val() + '">'
s+='<input type="hidden" name="cwd" value="{{OPT.cwd}}">'
s+='<input type="hidden" name="root" value="{{OPT.root}}">'
s+='<input type="hidden" name="offset" value="{{OPT.offset}}">'
s+='<input type="hidden" name="how_many" value="{{OPT.how_many}}">'
s+='<input type="hidden" name="orig_url" value="{{request.path}}">'
s+='<input type="hidden" name="view_eid" value="'+id+'">'
{% if search_term is defined %}
s+='<input type="hidden" name="search_term" value="{{search_term}}">'
{% endif %}
s+='</form>'
$(s).appendTo('body').submit();
}
var OPT = {{ OPT.to_dict()|tojson }};
// set from query data and stored in OPT for convenience. It can be 0,
// which means there is no content under this Path at all
OPT.root_eid = {{ query_data.root_eid }};
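// Minimal illustrative guard (an assumption, not code from this template): a zero root_eid can be
// treated as "nothing under this path" and short-circuited before asking the server for entries.
// ShowEmptyPathMessage is a hypothetical helper named only for this sketch:
// if( OPT.root_eid == 0 ) { ShowEmptyPathMessage(OPT.cwd) }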
$('.figure').dblclick( CallViewRouteWrapper )
// get items out of query_data into convenience javascript vars...
var move_paths = {{ query_data.move_paths|tojson }};
var NMO={{query_data.NMO|tojson}}
var people={{query_data.people|tojson}}
// different context menu on files
$.contextMenu({
selector: '.entry',
build: function($triggerElement, e) {
// when right-clicking with no selection, add one OR deal with ctrl/shift right-click as it always changes the selection
if( NoSel() || e.ctrlKey || e.shiftKey )
{
DoSel(e, e.currentTarget )
SetButtonState();
}
// this is the list of entry ids for the images for ALL matches for this query
var entryList={{query_data.entry_list}}
if( FiguresOrDirsOrBoth() == "figure" )
{
item_list = {
details: { name: "Details..." },
view: { name: "View File" },
sep: "---",
}
if( e.currentTarget.getAttribute('type') == 'Image' )
{
item_list['transform'] = {
name: "Transform",
items: {
"r90": { "name" : "Rotate 90 degrees" },
"r180": { "name" : "Rotate 180 degrees" },
"r270": { "name" : "Rotate 270 degrees" },
"fliph": { "name" : "Flip horizontally" },
"flipv": { "name" : "Flip vertically" }
}
}
// pageList is just those entries shown on this page, taken from the full entryList
var pageList=[]
// force getPage to populate pageList for, and render, the first page (see the sketch after this context-menu block)
getPage(1,getPageFigures)
}
item_list['move'] = { name: "Move selected file(s) to new folder" }
item_list['sep2'] = { sep: "---" }
}
else
item_list = {
move: { name: "Move selection(s) to new folder" }
}
// FIXME: doco, but also gather all the globals together, maybe make them all document.* so they are obviously global (and add fullscreen)
var gap=0.8
var grayscale=0
var throbber=0
item_list['ai'] = {
name: "Scan file for faces",
items: {
"ai-all": {"name": "all"},
{% for p in people %}
"ai-{{p.tag}}": {"name": "{{p.tag}}"},
{% endfor %}
}
}
if( SelContainsBinAndNotBin() ) {
item_list['both']= { name: 'Cannot delete and restore at same time', disabled: true }
} else {
if (e.currentTarget.getAttribute('path_type') == 'Bin' )
item_list['undel']= { name: "Restore selected file(s)" }
else if( e.currentTarget.getAttribute('type') != 'Directory' )
item_list['del']= { name: "Delete Selected file(s)" }
}
return {
callback: function( key, options) {
if( key == "details" ) { DetailsDBox() }
if( key == "view" ) { CallViewRoute( $(this).attr('id') ) }
if( key == "move" ) { MoveDBox(move_paths, "{{url_for('internal', filename='icons.svg')}}") }
if( key == "del" ) { DelDBox('Delete') }
if( key == "undel") { DelDBox('Restore') }
if( key == "r90" ) { Transform(90) }
if( key == "r180" ) { Transform(180) }
if( key == "r270" ) { Transform(270) }
if( key == "fliph" ) { Transform("fliph") }
if( key == "flipv" ) { Transform("flipv") }
if( key.startsWith("ai")) { RunAIOnSeln(key) }
},
items: item_list
};
}
});
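// The sketch below is illustrative only - an assumption about how the entryList/pageList split
// referenced above could work; the real getPage()/getPageFigures() live in the files support
// javascript and are not reproduced here. entryList holds the eids of every match for the query,
// pageList only the eids rendered on the current page, so paging is a slice plus a render callback.
function getPageSketch(entryList, pageNum, howMany, renderCb)
{
// 1-based page number -> start index into the full list of eids
var start = (pageNum - 1) * howMany
// pageList is just the current page's window onto entryList
var pageList = entryList.slice(start, start + howMany)
// hand the page of eids to the caller-supplied renderer (e.g. getPageFigures)
renderCb(pageList)
return pageList
}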
$(document).ready(function() {
// on the first page there is nothing to go back to, so disable the prev buttons
if( {{OPT.offset}} == 0 )
{
$('.prev').addClass('disabled')
$('.prev').prop('disabled', true)
}
// clicking a directory resets the offset, sets cwd and re-submits the main form
$(".dir").click( function(e) { $('#offset').val(0) ; $('#cwd').val( $(this).attr('dir') ) ; $('#main_form').submit() } )
} )
// build an alert-styled span: an icon for the storage area plus the path relative to that area
function PrettyFname(fname)
{
s='<span class="alert alert-secondary py-2">'
if( fname.indexOf( "static/Import" ) == 0 )
{
s+='<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#import"/></svg>'
tmp_path=fname.replace("static/Import","" )
}
if( fname.indexOf( "static/Storage" ) == 0 )
{
s+='<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#db"/></svg>'
tmp_path=fname.replace("static/Storage","" )
}
if( fname.indexOf( "static/Bin" ) == 0 )
{
s+='<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}?v={{js_vers['ic']}}#trash-fill"/></svg>'
tmp_path=fname.replace("static/Bin","" )
}
s+=tmp_path+'</span>'
return s
}
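// Illustrative call only (the path is an assumed example, not repo data):
// PrettyFname("static/Storage/2024/trip/img_0001.jpg")
//   -> an alert-styled span with the Storage icon followed by "/2024/trip/img_0001.jpg"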
$( document ).keydown(function(event) {
switch (event.key)
{
case "Delete":
{% if "files_rbp" in request.url %}
if( ! NoSel() ) DelDBox('Restore');
{% else %}
if( ! NoSel() ) DelDBox('Delete');
{% endif %}
break;
}
});
// check the size radiobutton that matches the current option
$(`input[name="size"][value="${OPT.size}"]`).prop('checked', true)
window.addEventListener('resize', DrawImg, false);
window.addEventListener('resize', ResizeVideo, false);
// when we are in the recycle bin, change colours to green & change the delete function to restore
if( window.location.href.includes('files_rbp') )
{
// fill the bin icon with the bg-success colour
$('#viewer_bin').attr('fill', 'var(--bs-success)')
$('#viewer_bin use').attr('fill', 'var(--bs-success)')
$('#viewer_del').removeClass('btn-outline-danger').addClass('btn-outline-success')
// set the SVG fill to white while hovering over the button
$('#viewer_del').on('mouseenter', function() {
$('#viewer_bin use').attr('fill', 'white');
});
// revert the SVG fill to the bg-success colour when the mouse leaves
$('#viewer_del').on('mouseleave', function() {
$('#viewer_bin use').attr('fill', 'var(--bs-success)');
});
$('#viewer_del').on('click', function() { DelDBox('Restore') } )
}
</script>
{% endblock script_content %}

View File

@@ -46,8 +46,7 @@
<div class="row px-3 my-5 offset-lg-3 col-lg-6 col-sm-12" style="border: 3px solid #5bc0de; border-radius: 15px;">
<h3 class="my-3 text-center" style="color: #5bc0de">
{# <svg width="64" height="64" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#pa_logo" /></svg>&nbsp;Photo Assistant Login</h3> #}
<svg width="64" height="64" fill="currentColor"><use xlink:href="internal/icons.svg#pa_logo" /></svg>&nbsp;Photo Assistant Login</h3>
<img width="64" height="64" src="internal/pa-logo.png"/></img>&nbsp;Photo Assistant Login</h3>
<form class="" method="POST">
<div class="input-group">
<label for="username" class="text-right input-group-text col-4 text-info">Username:</label>

View File

@@ -34,9 +34,7 @@
<!-- browsers can put the fakepath in for security, remove it -->
function DoMagic() {
str=$("#new_file_chooser").val()
console.log(str)
str=str.replace('C:\\fakepath\\', '' )
console.log(str)
$("#fname").val(str)
}
</script>

View File

@@ -56,44 +56,6 @@
</div>
</div>
</div class="col-7">
<div class="row pt-5">
<alert class="alert alert-warning">The following values are based on the defaults above and subsequent changes as you navigate the application and are not set by hand. The following content is for checking/debugging only.</alert>
</div class="row">
<div class="row">
<table id="pa_user_state_tbl" class="table table-striped table-sm" data-toolbar="#toolbar" data-search="true">
<thead>
<tr class="table-primary"><th>Path</th><th>New or Oldest</th><th>How Many</th><th>Folders?</th><th>Group by</th><th>Thumb size</th><th>DB retrieve offset</th><th>Root</th><th>cwd</th></tr>
</thead>
<tbody>
{% for st in states %}
<tr>
<td>{{st.path_type}}
{% if st.path_type == 'Search' %}
"{{st.orig_search_term}}"
{% endif %}
{% if st.path_type == 'View' %}
(orig: id={{st.view_eid}} in {{st.orig_ptype}})
{% if st.orig_ptype == 'Search' %}
"{{st.orig_search_term}}"
{% endif %}
{% endif %}
</td>
<td>{{st.noo}}</td>
<td>{{st.how_many}}</td>
<td>{{st.folders}}</td>
<td>{{st.grouping}}</td>
<td>{{st.size}}</td>
<td>{{st.st_offset}}</td>
<td>{{st.root}}</td>
<td>{{st.cwd}}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div class="row">
</div class="container-fluid">
{% endblock main_content %}
{% block script_content %}

View File

@@ -1,323 +0,0 @@
{% extends "base.html" %} {% block main_content %}
{# make the form-switch / toggle info color set, give or take #}
<style>
.norm-txt { font-size: 1.0rem }
.form-check-input:checked {
background-color: #39C0ED;
border-color: #CFF4FC;
}
.form-switch .form-check-input {
background-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%2339C0ED'/%3e%3c/svg%3e");
}
.form-switch .form-check-input:focus {
background-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%23CFF4FC'/%3e%3c/svg%3e");
}
</style>
<script src="{{ url_for( 'internal', filename='js/view_transform.js')}}"></script>
<script src="{{ url_for( 'internal', filename='js/view_support.js')}}"></script>
<script>
var gap=0.8
var grayscale=0
var throbber=0
var objs=[]
var NMO=[]
var current={{current}}
var eids="{{eids}}"
var eid_lst=eids.split(",")
var offset={{OPT.offset}}
var first_eid={{OPT.first_eid}}
var last_eid={{OPT.last_eid}}
var imp_path="static/Import/{{imp_path}}"
var st_path="static/Storage/{{st_path}}"
var bin_path="static/Bin/{{bin_path}}"
{% for id in objs %}
e=new Object()
e.url = "{{objs[id].FullPathOnFS()|safe}}"
e.type = "{{objs[id].type.name}}"
{% if objs[id].file_details.faces %}
e.face_model="{{objs[id].file_details.faces[0].facefile_lnk.model_used}}"
{% endif %}
e.faces=[]
{% for face in objs[id].file_details.faces %}
data = { 'id': '{{face.id}}', 'x': '{{face.face_left}}', 'y': '{{face.face_top}}', 'w': '{{face.w}}', 'h':'{{face.h}}' }
{% if face.refimg %}
data['who']='{{face.refimg.person.tag}}'
data['distance']="{{face.refimg_lnk.face_distance|round(2)}}"
{% endif %}
{% if face.no_match_override %}
data['override'] = {
'face_id' : '{{face.no_match_override.face_id}}',
'type_id' : '{{face.no_match_override.type.id}}',
'type_name': '{{face.no_match_override.type.name}}',
'who' : '{{face.no_match_override.type.name}}',
'distance' : 'N/A'
}
{% endif %}
{% if face.manual_override %}
data['override'] = {
'face_id' : '{{face.manual_override.face_id}}',
'type_id' : '{{face.manual_override.type.id}}',
'type_name': '{{face.manual_override.type.name}}',
'who' : '{{face.manual_override.person.tag}}',
'distance' : 'N/A'
}
{% endif %}
e.faces.push( data )
{% endfor %}
objs[{{id}}]=e
{% endfor %}
{% for el in NMO_data %}
NMO[{{el.id}}] = { 'type_id': {{el.id}}, 'name': '{{el.name}}' }
{% endfor %}
function PrettyFname(fname)
{
s='<span class="alert alert-secondary py-2">'
if( fname.indexOf( "static/Import" ) == 0 )
{
s+='<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#import"/></svg>'
tmp_path=fname.replace(imp_path,"" )
}
if( fname.indexOf( "static/Storage" ) == 0 )
{
s+='<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#db"/></svg>'
tmp_path=fname.replace("static/Storage","" )
}
if( fname.indexOf( "static/Bin" ) == 0 )
{
s+='<svg width="20" height="20" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash-fill"/></svg>'
tmp_path=fname.replace("static/Bin","" )
}
s+=tmp_path+'</span>'
return s
}
function CallViewListRoute(dir)
{
data="eids="+$("#eids").val()
data+="&cwd={{OPT.cwd}}"
data+="&root={{OPT.root}}"
data+="&orig_url={{OPT.orig_url}}"
data+="&view_eid={{OPT.view_eid}}"
// direction (next/prev)
data+="&"+dir+ "=1"
{% if search_term is defined %}
data+="&search_term={{search_term}}"
{% endif %}
$.ajax({ type: 'POST', data: data, url: '/view_list', success: function(res){
current=res.current
eids=res.eids
objs=res.objs
eid_lst=eids.split(",")
offset=res.offset
if( current == first_eid )
$("#la").prop("disabled", true)
if( current == last_eid )
$("#ra").prop("disabled", true)
ViewImageOrVideo()
}
})
}
</script>
<div id="viewer" class="container-fluid">
{% set max=eids.split(',')|length %}
<input type="hidden" name="eids" value={{eids}}>
<div class="row">
<button title="Show previous image" class="col-auto btn btn-outline-info px-2" style="padding: 10%" id="la"
{% if OPT.first_eid == current %}
disabled
{% endif %}
onClick="
cidx = eid_lst.indexOf(current.toString())
prev=cidx-1
if( prev < 0 )
{
if( offset )
{
CallViewListRoute('prev')
return
}
else
{
$('#la').attr('disabled', true )
prev=0
}
}
$('#ra').attr('disabled', false )
current=eid_lst[prev]
ViewImageOrVideo()
if( current == first_eid )
$('#la').attr('disabled', true )
">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#prev"/></svg>
</button>
<figure class="col col-auto border border-info rounded m-0 p-1" id="figure">
<canvas id="canvas"></canvas>
<img id="throbber" src="{{url_for('internal', filename='throbber.gif')}}" style="display:none;">
<script>
var im=new Image();
im.onload=DrawImg
im.src="../" + objs[current].url
var context = canvas.getContext('2d')
window.addEventListener('resize', DrawImg, false);
</script>
<figcaption id="img-cap" class="figure-caption text-center text-wrap text-break"><span id="fname_i"></span></figcaption>
</figure>
<script>$('#fname_i').html(PrettyFname(objs[current].url))</script>
{% if objs[current].type.name != "Image" %}
<script>$('#figure').hide()</script>
{% endif %}
<div id="video_div" class="col col-auto">
<video id="video" class="col col-auto" controls>
<source src="../{{objs[current].FullPathOnFS()}}" type="video/mp4">
Your browser does not support the video tag.
</video>
<figcaption id="vid-cap" class="figure-caption text-center text-wrap text-break"><span id="fname_v"></span></figcaption>
<script>$('#fname_v').html(PrettyFname(objs[current].url))</script>
</div>
<script>
window.addEventListener('resize', ResizeVideo, false);
ResizeVideo()
{% if objs[current].type.name != "Video" %}
$('#video_div').hide()
{% endif %}
</script>
<button title="Show next image" class="col-auto btn btn-outline-info px-2" style="padding: 10%" id="ra"
{% if OPT.last_eid == current %}
disabled
{% endif %}
onClick="
cidx = eid_lst.indexOf(current.toString())
if( cidx < eid_lst.length-1 )
{
current=eid_lst[cidx+1]
ViewImageOrVideo()
if( current != first_eid )
$('#la').attr('disabled', false )
}
else
CallViewListRoute('next')
if( current == last_eid )
{
$('#ra').attr('disabled', true )
return
}
">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#next"/></svg>
</button>
</div id="/form-row">
{# use this for color of toggles: https://www.codeply.com/p/4sL9uhevwJ #}
<div class="row">
{# this whole div, just takes up the same space as the left button and is hidden for alignment only #}
<div class="col-auto px-0">
<button class="btn btn-outline-info px-2" disabled style="visibility:hidden">
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#next"/></svg>
</button>
</div>
<span class="col-auto my-auto">Show:</span>
<div title="Toggle showing filename (hotkey: n)" class="d-flex form-check form-switch border border-info rounded col col-auto my-auto py-1 justify-content-center ps-5">
<input class="form-check-input" type="checkbox" id="fname_toggle" onChange="$('.figure-caption').toggle()" checked>
<label class="form-check-label ps-1" for="fname_toggle">Filename</label>
</div>
<div title="Toggle showing matched faces (hotkey: f)" class="d-flex form-check form-switch border border-info rounded col col-auto my-auto py-1 justify-content-center ps-5">
<input class="form-check-input" type="checkbox" onChange="FaceToggle()" id="faces">
<label class="form-check-label ps-1" for="faces">Faces</label>
</div>
<div title="Toggle showing 'distance' on matched faces (hotkey: d)" class="d-flex form-check form-switch border border-info rounded col col-auto my-auto py-1 justify-content-center ps-5">
<input class="form-check-input" type="checkbox" onChange="DrawImg()" id="distance">
<label class="form-check-label ps-1" for="distance">Distance</label>
</div>
<div title="Change the model used to detect faces" class="col col-auto my-auto">
AI Model:
{# can use 0 as default, it will be (re)set correctly in DrawImg() anyway #}
{{CreateSelect( "model", 0, ["N/A", "normal", "slow/accurate"], "", "rounded norm-txt", [0,1,2])|safe }}
</div>
<div class="col col-auto pt-1">
<button class="btn btn-outline-info p-1" title="Rotate by 90 degrees" onClick="Transform(90)">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot90"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Rotate by 180 degrees" onClick="Transform(180)">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot180"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Rotate by 270 degrees" onClick="Transform(270)">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot270"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Flip horizontally" onClick="Transform('fliph')">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_h"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Flip vertically" onClick="Transform('flipv')">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_v"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="View in Fullscreen mode (hotkey: F)" onClick="fullscreen=true; ViewImageOrVideo()">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#fullscreen"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Show logs relating to this filename (hotkey: l)" onClick="JoblogSearch()">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#log"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="View Original" onClick="window.location='/'+objs[current].url">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#download"/></svg>
</button>
<button id="del" class="btn btn-outline-danger p-1" title="Delete (hotkey: Del)"
onClick="$.ajax({ type: 'POST', data: '&eid-0={{current}}', url: '/delete_files', success: function(data){ window.location='/'; return false; } })">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash"/></svg>
</button>
</div>
</div class="row">
{% endblock main_content %}
{% block script_content %}
<script>
$( document ).keydown(function(event) {
// if dbox is visible, dont process this hot-key, we are inputting text
// into inputs instead
if( $("#dbox").is(':visible') )
return
switch (event.key)
{
case "Left": // IE/Edge specific value
case "ArrowLeft":
if( $('#la').prop('disabled') == false )
$('#la').click()
break;
case "Right": // IE/Edge specific value
case "ArrowRight":
if( $('#ra').prop('disabled') == false )
$('#ra').click()
break;
case "d":
$('#distance').click()
break;
case "f":
$('#faces').click()
break;
case "n":
$('#fname_toggle').click()
break;
case "F":
fullscreen=!document.fullscreen
ViewImageOrVideo()
break;
case "l":
JoblogSearch()
break;
case "Delete":
$('#del').click()
default:
return; // Quit when this doesn't handle the key event.
}
});
var fullscreen=false;
</script>
{% endblock script_content %}

utils/pa_rotate Executable file
View File

@@ -0,0 +1,9 @@
#!/bin/sh
echo "start"
b=`basename "$1"`
echo "b=$b"
mv "$1" "/tmp/$b"
echo "mv '$1' to /tmp/$b"
jpegtran -outfile "$1" -rotate "$2" "/tmp/$b"
echo "done jpegtran"

View File

@@ -1,7 +1,18 @@
#!/bin/bash
su mythtv -g mythtv -c 'ENV="production" python3 -u /code/pa_job_manager.py' &> /var/log/pa_job_manager.out &
gunicorn --bind=0.0.0.0:443 --workers=4 --threads=16 --certfile /etc/letsencrypt/live/pa.depaoli.id.au/fullchain.pem --keyfile /etc/letsencrypt/live/pa.depaoli.id.au/privkey.pem main:app --env ENV="production" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output
sudo -u pauser ENV=${ENV} python3 -u pa_job_manager.py &> /var/log/pa_job_manager.out &
if [ "$ENV" == "production" ]; then
sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=4 --threads=16 main:app --env ENV="production" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output
elif [ "$ENV" == "container" ]; then
# move into src tree so padev responds to code changes
cd /home/ddp/src/photoassistant
# push container logs/errors to stdout so docker logs works
sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=1 --threads=1 main:app --env ENV="container" --reload --capture-output --log-file - --error-logfile -
else
echo "Not sure which ENV ($ENV) we are running, set up for DEV and dont run job manager (jic)"
sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=1 --threads=1 main:app --env ENV="development" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output --enable-stdio-inheritance --reload
fi
# this should never be reached unless gunicorn fails -- in that case, at least
# we keep the container alive so we can log in by hand and check the issue/error
sleep 99999