Compare commits

91 Commits

Author SHA1 Message Date
cf9b0996be works with new ldap now, and put forward details of next big change into TODO 2025-09-20 14:48:29 +10:00
57e736a495 temporarily add some data capturing to dom for debugging, and remove superfluous condition in if 2025-09-20 14:47:52 +10:00
aaf04e7f9b AI search now does not wildcard, so only exact matches on person when you do AI:<xxx> 2025-09-20 14:46:57 +10:00
eb9e7f2ce1 fix for ldap removing any anonymous binds, also use ldap.ddp.net 2025-09-16 20:36:04 +10:00
df7a81df09 new TODOs 2025-09-16 20:35:03 +10:00
4742509d56 added new BUG 2025-09-16 20:34:44 +10:00
391fd52332 commented out more debugs 2025-08-19 20:55:20 +10:00
1831c49b15 fixed 2 x TODOs, can now add an existing face as a refimg, and I sped up the recent jobs page 2025-08-19 20:53:26 +10:00
fa63e08b59 store away person.id per matched face to support match_add_refimg in context menu 2025-08-19 20:22:49 +10:00
e36fc170f1 moved container cwd to ~/src/photoassistant so that gunicorn can react to src code changes, also forced logs to docker logs 2025-08-19 20:22:20 +10:00
e3c94bf5eb added match_add_refimg support to context menu, allows us to take existing (presumably) better image and make it a refimg for someone its matched 2025-08-19 20:20:50 +10:00
db8cf68d4b added a clear messages (forced) to allow M to clear out annoying messages if needed 2025-08-19 20:19:57 +10:00
1a182d30b5 added a clear messages (forced) to allow M to clear out annoying messages if needed 2025-08-19 20:18:59 +10:00
c7c08c1c32 made AI:<tag> be literal, e.g. no wildcards to stop AI:kath matching AI:katherine, also removed unused fid var and commented out debugs 2025-08-19 20:18:30 +10:00
5fa35b90eb updated to reflect move from hard-coded ip to FQDN and using gitea (on git.depaoli.id.au) instead of local git on mara 2025-08-19 20:17:31 +10:00
ae718fe249 had to remove libgl1-mesa-glx to get installs to work, added mkdir -p jic, force TZ so build date is timezone accurate 2025-08-19 20:12:42 +10:00
48ca6c5e64 new todos 2025-08-19 20:11:54 +10:00
6ceb4c4604 clean up of BUGs list 2025-08-19 20:11:39 +10:00
b7b2eaeb1f use new logo for about page, and center the alert text 2025-05-13 18:59:09 +10:00
0660ca55d7 new logo file 2025-05-13 18:44:30 +10:00
e6e621d6f7 updated pics for rotations 2025-05-13 18:44:23 +10:00
0a49c2a627 added BUG-137 - after moving files, shift-click selection is wrong 2025-05-13 18:43:58 +10:00
e460164d83 various TODOs 2025-05-13 18:43:30 +10:00
f6ddb66f3f set cant_close to False now, I think I know enough about this BUG 2025-05-13 18:43:08 +10:00
1555cd5ad6 use new logo 2025-05-13 18:42:21 +10:00
62acd34ae0 trying to use less RUNs / layers, to speed up subsequent builds 2025-05-13 18:41:59 +10:00
5448e0a7a4 just do all of static, rather than explicit subdirs 2025-05-13 18:41:26 +10:00
70908d60f5 use fixed width and height to force buttons to be consistent with new rot pngs 2025-01-08 13:37:01 +11:00
58715491ea keep rot90.svg source in case we want to modify pngs in the future 2025-01-08 13:35:46 +11:00
f5dfb92db1 removing BUG-128 its a duplicate of BUG-123 2025-01-04 22:04:36 +11:00
eb819e37d7 added notes on DEEPFACE 2025-01-04 22:03:33 +11:00
c6a0352c7d svgs cant have fill and do onHover properly, and arcs need a fill of none, so switched over to pngs and made the arrow match the actual direction of rotation 2025-01-04 22:03:21 +11:00
f4aeadc019 adding some notes from some bugs stored into the db 2025-01-04 22:02:26 +11:00
79437477f7 svgs cant have fill and do onHover properly, and arcs need a fill of none, so switched over to pngs and made the arrow match the actual direction of rotation 2025-01-04 22:02:00 +11:00
989fc7c6db svgs cant have fill and do onHover properly, and arcs need a fill of none, so switched over to pngs and made the arrow match the actual direction of rotation 2025-01-04 22:01:51 +11:00
febdfbbd32 added deepface notes 2025-01-04 21:59:56 +11:00
37c0125fbd Fixed BUG-136, now can search with a * or % as wildcards, and fix bug where no search results broke pa 2024-12-31 15:13:17 +11:00
b34b71bf32 adding BUG-136, handle wildcard search of % and no results bug 2024-12-31 15:11:57 +11:00
bd3ee98328 clean up start-up messages and validation on ENV. Display error correctly when exif rotater fails 2024-12-07 19:46:15 +11:00
47190a8829 use newer sqlalchemy v2 syntax 2024-12-07 19:42:49 +11:00
f97653c4f1 had to lock sqlalchemy to a newer version, but was able to unlock flask version 2024-12-07 19:42:03 +11:00
8157292df4 convert over to pauser - and use ARG for uid/gid of pauser 2024-12-07 19:41:32 +11:00
a0b4b80ca9 clean up and use pauser 2024-12-07 19:40:29 +11:00
cbfd00fe53 make dev PA go via container name instead of via mara for DB connection 2024-12-07 19:39:12 +11:00
61b3edf5fa added bugs 133-135, failing symlinks on build, moving images reappearing and failing to rotate 2024-12-07 19:38:48 +11:00
fdba16c004 added BUG-133 right arrow in viewer goes to next pic AND scrolls right 2024-08-25 22:37:38 +10:00
9a359a1a4c fixed BUG-132 failing to create symlink on first build by being explicit in Dockerfile on how to handle content in static/ 2024-08-11 18:59:02 +10:00
15cc101bbe add BUG-131, init failing on first build to create symlinks 2024-08-11 18:58:20 +10:00
9dac54ae97 removing cant_close=True on this one, ultimately, it's just the viewing order that is wrong, so no actual data/file issues - let it be persistent but able to be closed 2024-07-28 17:31:35 +10:00
aba96f6c52 added BUG-130, moving files and next page confusion 2024-07-28 17:29:24 +10:00
0895bc5983 Oops - remove hardcoded path and use self.url, one too many ctrl-zs when testing :( 2024-06-30 18:07:02 +10:00
46eb87bc31 use a regex match rather than just the last part of self.url to find the search term as something changed and caused the url to be different on the post - not sure when I broke this, or if an upgrade did, but this fixed BUG-127 and BUG-129 (just a special case of 127) 2024-06-30 17:12:31 +10:00
cbea1f09ab clarified new bugs 2024-06-29 15:23:08 +10:00
5fa9b5e386 updated for fresh ubuntu 24.04 install 2024-06-29 13:59:40 +10:00
28d4f8d998 minor tweak of content of try for symlink creation and error content afterwards 2024-06-29 13:55:43 +10:00
29ea2ca833 starting to tweak from legacy query.get() method to session.get() - these fixes will be a pita/lot 2024-06-29 13:55:01 +10:00
06ceefda97 popped face recog load_image_file into a try block as it clearly has/can fail 2024-06-29 13:54:22 +10:00
974d96d6eb Fix BUG-126 (basically disable next/prev buttons the moment we go back to the server for the previous/next page of images, only re-enable if they make sense after we have loaded the new images) 2024-06-29 13:53:50 +10:00
030eb785f3 Fix BUG-126 (basically disable next/prev buttons the moment we go back to the server for the previous/next page of images, only re-enable if they make sense after we have loaded the new images) 2024-06-29 13:53:38 +10:00
1277558b0e fixed BUG-126, so removing debugs, changing unlikely bug reoccurrence to a warning that can be closed 2024-06-29 13:51:53 +10:00
d7f028f3d1 had to hard lock numpy for now so that face recognition library can find locations 2024-06-29 13:50:07 +10:00
d53bffcab9 allow this to work in padev container (e.g. run as mythtv) 2024-06-29 13:49:31 +10:00
4885acea80 adding extra debugs for weird bugs with losing data when viewing 2024-04-07 20:18:40 +10:00
fed78672dc make message field larger to cater for larger debugs 2024-04-07 20:18:10 +10:00
bc23c9a3c6 added new BUG relating to restart DB underneath web 2024-04-07 20:17:22 +10:00
ffbedc3bca added BUG-127 sort not working after a search 2024-04-07 00:34:42 +11:00
d39f1734a3 fix BUG-124 - pa job mgr does not have timestamps, via creating/use of PAprint() 2024-03-11 00:40:25 +11:00
d86893875d added capability to run a dev container for pa, and if we re-build we create a new .sk and remove old PA_UserState on fresh login. BUG-120 is either fixed by this or at least the log catches it and does not crash now (if we rebuild between re-use) 2024-03-10 23:52:34 +11:00
2665ab086e properly define regex strings 2024-03-10 23:52:24 +11:00
063c75809a more thoughts now we have a new CPU in mara 2024-03-10 13:40:23 +11:00
8b89d845db change key warning to be persistent, so I dont miss what it said 2024-03-10 13:39:38 +11:00
6ef12d7636 change key warning to be persistent, so I dont miss what it said 2024-03-10 13:39:21 +11:00
42eeed7c45 move to ssl offloading, rather than processing ssl inside the python code - just easier / more consistent with other traefik enabled containers 2024-03-10 13:39:00 +11:00
096251f8ce move to using base image of python, not ubuntu, update requirements to use pip for as much as we can and not mix and match some apt get and pip3 for python dependencies 2024-03-10 13:37:54 +11:00
33c86f49fb "new" BUG 126 - still the fail to view an id that is not in the list 2024-03-10 13:36:31 +11:00
98d33b4880 have a solution for multi-select on tablet/phone now 2024-01-28 23:27:20 +11:00
448b635d5a updating BUGS 2024-01-28 23:27:02 +11:00
53ef671d34 added an isMobile() function that adds a shift and ctrl key to the files view, which can be clicked to fake a shift or ctrl key selection event on a tablet/mobile - first pass at this, it's fairly usable. I might allow de-selecting the shift or ctrl key if they are pressed again before a selection is used, otherwise this is all functional. Note, I also changed the contextmenu to a click not mouse down on selection of an item in the menu. This allows me to stop the propagation of the click event which was being trapped by the $(document).on( "click" ... and which we dont want - also exposes a BUG that when I click the context menu onto a different image it does not highlight the new image and some menu items process the original highlight, others the image under the context menu 2024-01-21 23:07:31 +11:00
7e25c33f1a removed use of ref var for code readability
added some more defensive code to throw an error, rather than crash if missing pref, referrers, etc.
fix use of SetFELog with log_level="error" => level="danger"
2024-01-21 18:19:20 +11:00
4db3d27cbc fix use of SetFELog with log_level="error" => level="danger" 2024-01-21 18:17:45 +11:00
a3350b9316 default for similar folder by date is now +/- 2 weeks not 1 AND fix use of SetFELog with log_level="error" => level="danger" 2024-01-21 18:17:03 +11:00
a8ee7b4f89 added TODO for touch and multi-select 2024-01-21 18:14:53 +11:00
478a7e5add added 2 new bugs - 123 -> pa_job_mgr crashed with a timeout, 124 -> pa_job_mgr logs have no timestamp 2024-01-21 18:14:25 +11:00
aad0389d77 BUG-122 - just closing this, duplicates were from 2 years ago and were 200ms apart, looks like a reload of the POST? If so, I now force a redirect after the POST, so reloads dont add more than 1 person 2024-01-01 12:16:37 +11:00
262efd89d4 Fixed BUG-121: crashing when cannot find a match when searching 2024-01-01 11:49:41 +11:00
9c4da5ec6f add allow actions for wrong person to TODO 2024-01-01 11:49:03 +11:00
3f39f74716 new bugs, 500 when cannot find a string, and duplicates of people? 2024-01-01 11:48:43 +11:00
48353f0486 have changed the rotation code to use jpegtran so its lossless 2023-11-11 22:28:29 +11:00
0ed56d6a1a added a debug for rotate code, and put quotes in the right spots so paths with spaces are covered - D'oh! 2023-11-11 22:25:26 +11:00
1005acb339 use jpegtran to do lossless 90 rotations 2023-11-11 21:28:04 +11:00
62cfe6305e use text() for newer sqlalchemy 2023-09-27 15:02:57 +10:00
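Several commits above (BUG-136, and the AI:<tag> changes) describe the same search rule: `*` and `%` act as wildcards for ordinary searches, while `AI:<tag>` terms are matched literally so `AI:kath` no longer matches `AI:katherine`. A minimal sketch of that rule as a helper; the function name and the `("exact"/"like", value)` convention are illustrative, not the actual photoassistant code:

```python
# Hypothetical sketch of the wildcard rules in the commits above:
# AI:<tag> terms are literal, '*' is normalised to SQL's '%' wildcard,
# and a plain term falls back to a substring search.
def build_match(term):
    """Return ("exact", value) or ("like", pattern) for a search term."""
    if term.startswith("AI:"):
        # literal match on the tag, no wildcarding at all
        return ("exact", term)
    if "*" in term or "%" in term:
        # user supplied explicit wildcards; map '*' onto SQL's '%'
        return ("like", term.replace("*", "%"))
    # default: substring search
    return ("like", f"%{term}%")
```

The `("like", pattern)` result would feed a SQL `LIKE`, and `("exact", value)` an `=` comparison, which is what stops `AI:kath` matching `AI:katherine`.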
34 changed files with 815 additions and 293 deletions


@@ -2,10 +2,7 @@ images_to_process
new_img_dir
photos
reference_images
static/Bin/*
static/Import/*
static/Storage/*
static/Metadata/*
static
.pa_metadata
.pa_bin
.python

BUGs (44 changes)

@@ -1,4 +1,15 @@
### Next: 121
### Next: 140
BUG-139: since AI:mich is literal, not %AI:mich%? going next a few times ends, say, 4 pages of 50 into 4000 matches (entries from DB < 50)... Either search is no longer returning all those matches, or the stats page is wildly wrong?
- confirmed this is when person has 2 or more refimgs:
- on page "2", we get 49 pulled back in the ORM instead of the 50 expected -- b/c I use that to indicate we must be at the end of the list if not 50 found
-- really, need to fix once and for all the eids / re-running query.
do GetEntries as we do now, once done however, get all entry ids. Stick those into the DB with a unique query-id and datestamp
new func to get all details needed for entries in an eid list (of 1-page) - show this page of entries
use the current, full eidlist to work out start/end of list (next/prev), disabling buttons as needed.
then client can keep current page of data, if you hit next/prev, use DB unique query id / full list and page of eids, and give full data for new page of entries
Implications though, are if a search is invalidated (maybe delete / move a photo), need to remove them from the list on the DB too OR let user know/decide to fix/wait.
BUG-100: I managed to get 2 photos matching mich in the NOT_WORKING photo (probably dif refimgs but same p.tag?)
= /photos/2012/20120414-damien/IMG_8467.JPG
BUG-106: cant add trudy /pat? as refimgs via FaceDBox
@@ -11,20 +22,19 @@ BUG-119: "Uncaught (in promise) Error: A listener indicated an asynchronous
was received"
investigate this (possibly I'm calling check_for_jobs and maybe not doing the async right?)
BUG-120: on tablet:
[2023-04-09 12:21:21,214] ERROR in app: Exception on /view/16978 [GET]
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 2525, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1822, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1820, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1796, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
File "/code/files.py", line 655, in view
OPT=States( request )
File "/code/states.py", line 92, in __init__
self.url = pref.orig_url
AttributeError: 'NoneType' object has no attribute 'orig_url'
BUG-123: pa_job_manager crashed with timeout on connection (probably when I turned off traefik for a bit?). Regardless, should be more fault tolerant --> maybe offer to restart pa_job_manager IF its crashed?
this definitely happened also, when I shut down the DB back-end mid job, and it was able to be restarted, so could get f/e to at least suggest a restart of the container, or auto-restart job_mgr?
BUG-125: when an image is highlighted, then post the contextmenu on a different image - the highlight does not move to the new image
and the selected menu function processes the original or the new depending on the way the code works.
There is a chance we need to change the document on click to a mouse down (or whatever the context menu
uses for default), rather than just fix the highlight
BUG-130: moving files and then trying to go next page and it got confused...
BUG-132: right arrow to go to next photo in viewer ALSO scrolls to the right, needs a return somewhere in the jscript
BUG-133: when rebuilding pa[dev], the first run fails to have symlinks to the right paths for Import/Storage, etc. a simple restart fixes - so potentially the initial copy or some other race condition?
BUG-134: when moving a set of photos on a page, then moving another set of photos on the page, the first set reappears. Could really delete them from the DOM?
BUG-135: failed to rotate: 2006/20061215-ITS-xmas-KP/DSC00582.JPG - not sure why && not repeatable, so it's not the image, timing/race condition maybe?
BUG-137: after moving/refiling photos, the next shift-click is out of order (reload fixes it)
BUG-138: Placeholder for all the ways we can get the front-end confused:
---> JUST fix all these BUGs (relating to confused/lost state) by revisiting the overly complex way I remember state and my position in a list (probably FAR easier to make an initial sql just save all eids, and then not try to recreate that list ever again and not care how I got into the list). Can attach a "running server-side sequence number", and if old sequence, and the original eid list results in a failure, then just pop up that the saved list is no longer valid, and ask user to re-do their search/list...

DEEPFACE (74 changes, new file)

@@ -0,0 +1,74 @@
#
# https://pypi.org/project/deepface/
#
from deepface import DeepFace
import os
import cv2
import numpy as np

# Function to compute embeddings for a given image path
def compute_embeddings(image_path, model_name='VGG-Face'):
    try:
        # This returns a list of embeddings for each face found in the image
        embeddings = DeepFace.represent(img_path=image_path, model_name=model_name)
        return [res['embedding'] for res in embeddings]
    except Exception as e:
        print(f"Error processing image {image_path}: {e}")
        return []

# Function to find the best matches between reference and target images
def find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine'):
    # Store embeddings for reference images
    reference_embeddings = {}
    for ref_img in reference_images:
        ref_embeddings = compute_embeddings(ref_img, model_name=model_name)
        if ref_embeddings:
            reference_embeddings[ref_img] = ref_embeddings

    # Store best matches
    best_matches = []

    # Loop through each target image
    for target_img in target_images:
        target_embeddings = compute_embeddings(target_img, model_name=model_name)
        for target_emb in target_embeddings:
            for ref_img, ref_emb_list in reference_embeddings.items():
                # Compare each reference embedding with the target embedding
                for ref_emb in ref_emb_list:
                    # Compute the distance between embeddings using DeepFace's distance functions
                    distance = DeepFace.find_distance(ref_emb, target_emb, distance_metric=metric)
                    # Store match details (target image, reference image, and distance)
                    best_matches.append({
                        'target_image': target_img,
                        'reference_image': ref_img,
                        'distance': distance
                    })

    # Sort matches by the distance (smallest distances indicate best matches)
    best_matches = sorted(best_matches, key=lambda x: x['distance'])
    return best_matches

# Example usage
reference_images = [
    "path_to_reference_image1.jpg",
    "path_to_reference_image2.jpg"
]
target_images = [
    "path_to_target_image1.jpg",
    "path_to_target_image2.jpg",
    # Add more target images here...
]

# Find the best matches
matches = find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine')

# Display the best matches
for match in matches[:5]:  # Display top 5 matches
    print(f"Reference Image: {match['reference_image']}, Target Image: {match['target_image']}, Distance: {match['distance']}")
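For reference, the cosine distance that `find_best_matches` sorts by can be computed directly. This is a numpy sketch of the metric itself, not DeepFace's internal helper:

```python
# Cosine distance between two embedding vectors (the metric used above).
# 0.0 means identical direction, 1.0 means orthogonal.
import numpy as np

def cosine_distance(a, b):
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```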


@@ -1,35 +1,27 @@
FROM ubuntu:22.04
FROM python:latest
# declare var
ARG PA_ID
WORKDIR /code
USER root
ENV PJM_UID=500
ENV PJM_GID=500
RUN groupadd -g ${PJM_GID} mythtv && useradd -r -u ${PJM_UID} -g ${PJM_GID} mythtv
ENV TZ=Australia/Melbourne
RUN truncate -s0 /tmp/preseed.cfg && \
(echo "tzdata tzdata/Areas select Australia" >> /tmp/preseed.cfg) && \
(echo "tzdata tzdata/Zones/Australia select Melbourne" >> /tmp/preseed.cfg) && \
debconf-set-selections /tmp/preseed.cfg && \
rm -f /etc/timezone /etc/localtime && \
apt-get update && \
DEBIAN_FRONTEND=noninteractive DEBCONF_NONINTERACTIVE_SEEN=true \
apt-get install -y tzdata
RUN apt-get update && apt-get -y dist-upgrade && apt-get -y install python3-pip libpq-dev gunicorn mediainfo cmake libgl1-mesa-glx libglib2.0-0 python3-ldap libjpeg-turbo-progs ffmpeg git
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
RUN pip3 install --upgrade pillow --user
EXPOSE 443
EXPOSE 55432
# do this first, allowing layer / build efficiencies if only code changes...
# sudo used in dev container
RUN apt-get update && apt-get -y install libpq-dev mediainfo cmake libglib2.0-0 libjpeg-turbo-progs ffmpeg git sudo
COPY requirements.txt .
RUN pip3 install --upgrade pip && pip3 install -r requirements.txt && pip3 install --upgrade pillow
RUN groupadd -g ${PA_ID} pauser && useradd -r -u ${PA_ID} -g ${PA_ID} pauser
# NOW, deal with any code changes, and reset the container for 'first run'...
COPY . .
RUN echo $RANDOM | md5sum | head -c 30 > /code/.sk
RUN chmod 600 .sk
RUN date > internal/build-date.txt
RUN git log -n 15 > internal/git-log.txt
RUN ln -s /code/TODO /code/internal/TODO
RUN ln -s /code/BUGs /code/internal/BUGs
# set up static dir and subdirs
RUN mkdir -p static/Bin && mkdir -p static/Import && mkdir -p static/Storage && mkdir -p static/Metadata
EXPOSE 80
RUN cat /dev/urandom | head -c 50 | md5sum | head -c 32 > /code/.sk && chmod 600 .sk && chown pauser:pauser .sk
RUN TZ="Australia/Melbourne" date > internal/build-date.txt && git log -n 15 > internal/git-log.txt && ln -s /code/TODO /code/internal/TODO && ln -s /code/BUGs /code/internal/BUGs
RUN rm -rf .git
RUN chown mythtv:mythtv /code
RUN chown mythtv:mythtv ./static
RUN chown mythtv:mythtv ./static/*
## cleanup of files from setup
RUN rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
RUN chown pauser:pauser /code && chown pauser:pauser ./static && chown pauser:pauser ./static/*
CMD ["./wrapper.sh"]

README (31 changes)

@@ -3,34 +3,22 @@ In here we can put instructions on how to run this / any general info
to edit src:
git....
CAM: fill this in pls
cd ~/src ; # or wherever you want the code to live
git clone https://git.depaoli.id.au/ddp/photoassistant.git
ubuntu packages:
sudo apt-get install -y mediainfo cmake python3-flask
sudo apt-get install -y cmake libpq-dev
REDO THIS SOON (good to test on a clean install):
AFTER a fresh ubuntu, need to remove/make a new .python env and get pip-based python packages installed into .python:
cd ~/src/photoassistant
rm .python
# use a venv (to get consistent python versions with Prod container)
python3 -m venv .python
source ./.python/bin/activate
pip3 install -r requirements.txt
pip packages:
* pymediainfo
* PIL (should be there by default)
- on ubuntu, need to do: pip3 install --upgrade pillow --user
* ExifRead
* opencv-python
* datetime
* pytz
* face_recognition
* flask_login
* flask-ldap3-login
#### dlib (might need to install this before face_recognition, but it might not be needed, cmake clearly was)
upstream packages...
mkdir static/upstream
cd static/upstream
@@ -60,14 +48,11 @@ pip3 list | tail -n +3 | grep -v mysqlclient | grep -v duplicity | grep -v gp
To run debug version of web server (gunicorn deprecates FLASK_ENV, so co-exist for now):
# older flask:
FLASK_APP=main ENV=development FLASK_ENV=development flask run --host=192.168.0.2
# flask 2.2.2+ (in venv .python)
FLASK_APP=main ENV=development ./.python/bin/flask --debug run --host=192.168.0.2
FLASK_APP=main ENV=development ./.python/bin/flask --debug run --host=mara.ddp.net
to run prod version of web server:
gunicorn --bind="192.168.0.2:5000" --threads=2 --workers=2 main:app
gunicorn --bind="mara.ddp.net:5000" --threads=2 --workers=2 main:app
Also have to run the job manager for jobs to work:
ENV="development" python3 pa_job_manager.py

TODO (240 changes)

@@ -1,5 +1,36 @@
### major fix - go to everywhere I call GetEntries(), and redo the logic totally...
* firstly, run the query as per normal, but get just the matched eids into an entry_lst
* make a unique query_id for this entry_lst, and store entry_ids into "query" table, with a unique query_id
* take most of pa_user_state that relates to query state and move it to the "query" table per query_id
* pa_user_state then becomes defaults for next query (so how_many, noo, etc)
* we can age out queries from the query table after a few months?
* client side always has query_id. IF DB does not have query_id, then its really old? - just say so...
* client side takes query_id, entry_lst, current_eid, offset, first/last_eid, etc. as part of its first route / html creation.
* it then decides based on all this to GetEntryDetails( subset of entry_lst ) <- needs new route
* IN THEORY some of the subset of entry_lst may not exist -- BUT, we can handle that on response, e.g. say my query used to have 1,2,3, and since then another user/action deleted 2:
- I ask for details on 1,2,3 and get back details on 1,3 only.
- On client-side, I can say, since you ran this query, data in PA has changed - image#2 is no longer in PA.
Please run a new query (or bonus points, maybe have enough of the original query to note this and ask, do you want to ignore changes, or re-run query and get latest data?)
* client can go fwd or back in the entry_lst same as now (disabling buttons as needed), BUT as entry_lst is NOT recreated per page move, then no chance to get confused about first/last
* client side:
* for a real chance to stop confusion, instead of removing deleted images from the DOM, we should gray them out and put a big Del (red circle with line?) through it as an overlay.
* Create another table, entry_amendments - note the deletions, rotations, flips of specific eids - then reproduce that on the client side visually as needed
- at least grayed-out, to indicate a pending action is not complete.
- When the job that flips, rotates, deletes completes, then update the query details (e.g. remove eids, or remove the amendments)
- this actually is quite an improvement, if someone is deleting 2 as per above, I will see that as a pending change in my unrelated query, ditto flips, etc.
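The redesign above (persist the full eid list per query_id, then page through it without ever re-running the search) can be sketched roughly as follows. The "query" table is stood in for by a dict, and the function names are illustrative, not photoassistant's actual routes:

```python
# Rough sketch of the eid-list design in the notes above (hypothetical code).
# A search runs once, its matched eids are saved under a unique query_id,
# and paging only ever slices that saved list - so first/last can't drift.
import uuid

QUERY_TABLE = {}  # stand-in for the proposed "query" table in the DB

def save_query(eids):
    """Run once per search: store the full eid list under a unique query_id."""
    query_id = str(uuid.uuid4())
    QUERY_TABLE[query_id] = list(eids)
    return query_id

def page_of_eids(query_id, offset, how_many):
    """Return one page of eids plus prev/next availability."""
    eids = QUERY_TABLE.get(query_id)
    if eids is None:
        return None  # really old / aged-out query: ask the user to re-run it
    return {
        "eids": eids[offset:offset + how_many],
        "has_prev": offset > 0,
        "has_next": offset + how_many < len(eids),
    }
```

A separate detail route would then fetch full entry data for just the returned page of eids, and any eids that come back missing signal that the data changed since the query ran.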
### GENERAL
* change the rotation code to use that jpeg util to reduce/remove compression loss?
* jobs for AI should show path name
* rm dups job should show progress bar
* in viewer, there is no move button (maybe add one?)
* consider doing duplicates before AI, and if there are say 100s+, then maybe pause the AI work
- had 5000+ new photos, took 8 hours to finish, for me to just delete them anyway
* optimisation:
- keep track of just new files since scan (even if we did this from the DB),
then we could just feed those eid's explicitly into a 'get_file_details_on_new_files'.
- ALSO use new_eids list IF no new refimgs to do a 'run_ai_on_new_files' :)
* allow changing dates in move dbox and then re-scan for existing folders OR just have a browse for existing...
- for use on scanned photos, they register as 2010, but are datestamped in visuals for 95
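The "jpeg util" mentioned in the rotation item above is jpegtran, already adopted in the commits ("use jpegtran to do lossless 90 rotations"). A sketch of building that command safely; the wrapper functions are illustrative, only the jpegtran flags themselves are real:

```python
# Sketch of a lossless jpegtran rotation. Passing argv as a list (rather
# than a shell string) also covers paths with spaces - the quoting issue
# fixed in one of the commits above.
import subprocess

def jpegtran_rotate_cmd(src, dst, degrees=90):
    if degrees not in (90, 180, 270):
        raise ValueError("jpegtran only does lossless 90/180/270 rotations")
    return [
        "jpegtran",
        "-rotate", str(degrees),
        "-perfect",        # fail rather than trim non-rotatable edge blocks
        "-copy", "all",    # keep EXIF and other markers
        "-outfile", dst,
        src,
    ]

def rotate(src, dst, degrees=90):
    # check=True surfaces a failed rotation as an exception
    subprocess.run(jpegtran_rotate_cmd(src, dst, degrees), check=True)
```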
@@ -10,11 +41,22 @@
-- path.py has the __allow_unmapped__ = True
* remove all '.execute' from *.py
* read this: https://flask.palletsprojects.com/en/2.2.x/testing/#faking-resources-and-context
* allow actions for wrong person:
-> someone else? OR override no match for this person ever for this image?
* groups of persons (non-exclusive, recursive), so:
- fam (ddp, mandy, cam, mich)
- mandy_fam (mandy, kirsty, tony, anne, tenille)
- uncle_peters_fam (peter, joan, jarrod, aaron, alana)
- cousin_aarons_fam (aaron, karly, kai, etc)
- storviks (mandy_fam, uncle_peters_fam, cousin_aarons_fam, etc)
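The non-exclusive, recursive groups above could be resolved with a simple recursive expansion. This is a sketch only; the group data mirrors the examples in the note, and the function is hypothetical:

```python
# Hypothetical sketch of flattening recursive person groups into a set of
# individual people. Group definitions mirror the examples above; any member
# name that is not itself a group is treated as a person.
GROUPS = {
    "fam": ["ddp", "mandy", "cam", "mich"],
    "mandy_fam": ["mandy", "kirsty", "tony", "anne", "tenille"],
    "uncle_peters_fam": ["peter", "joan", "jarrod", "aaron", "alana"],
    "cousin_aarons_fam": ["aaron", "karly", "kai"],
    "storviks": ["mandy_fam", "uncle_peters_fam", "cousin_aarons_fam"],
}

def expand_group(name, groups=GROUPS, _seen=None):
    """Recursively flatten a group to people, guarding against cycles."""
    _seen = _seen if _seen is not None else set()
    if name in _seen:          # already expanded (groups are non-exclusive)
        return set()
    _seen.add(name)
    if name not in groups:     # not a group, so it's a person
        return {name}
    people = set()
    for member in groups[name]:
        people |= expand_group(member, groups, _seen)
    return people
```

Expanding "storviks" then yields one de-duplicated set of people even though "aaron" appears in two member groups.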
* could get better AI optim, by keeping track of just new files since scan (even if we did this from the DB),
then we could just feed those eid's explicitly into a 'run_ai_on_new_files' :) -- maybe particularly
if count('new files') < say 1000 do eids, otherwise do path AND no new refimgs
* birthdates in PA:
- M was interested in birthdate appearing in hass, so need an API /
sensor in hass too - maybe could store (optional) birthdate and
heads up (of X days, e.g. so can get present) or just come up on the day so you can remember to text them
* search logic (AND vs OR)
* read this: https://flask.palletsprojects.com/en/2.2.x/testing/#faking-resources-and-context
* ignore face should ignore ALL matching faces (re: Declan)
@@ -28,7 +70,7 @@
* browser back/forward buttons dont work -- use POST -> redirect to GET
- need some sort of clean-up of pa_user_state -- I suppose it's triggered by browser session, so maybe just after a week is lazy/good enough
-- pa_user_state has last_used as a timestamp so can be used to delete old entries
- pa_user_state has last_used as a timestamp so can be used to delete old entries
* back button will fail if we do these POSTs:
job.py:@app.route("/jobs", methods=["GET", "POST"])
@@ -40,8 +82,6 @@
files.py:@app.route("/fix_dups", methods=["POST"])
???
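The "POST -> redirect to GET" fix noted above is the standard post/redirect/get pattern; a minimal Flask sketch (route names are illustrative, not the actual /jobs or /fix_dups handlers):

```python
# Minimal post/redirect/get sketch in Flask: the POST does the work, then
# redirects with 303 (forces a GET), so reload/back re-issues a harmless GET
# instead of re-submitting the POST - which also stops duplicate inserts
# like the one closed in BUG-122.
from flask import Flask, redirect, url_for

app = Flask(__name__)

@app.route("/fix_dups", methods=["POST"])
def fix_dups():
    # ... perform the duplicate fix-up here ...
    return redirect(url_for("files_page"), code=303)

@app.route("/files", methods=["GET"])
def files_page():
    return "files listing"
```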
* allow user to choose default log level to show
* GUI overhaul?
* on a phone, the files.html page header is a mess "Oldest.." line is too large to fit on 1 line (make it a hamburger?)
- searched for text overlaps buttons above and below
@@ -59,9 +99,6 @@
* fix up logging in general
ProcessFileForJob --> really need to better handle log levels and counting
* metadata at folder level with file level to add more richness
- store in DB? or store in hidden file (or both)... IF it is outside the DB, then I can 'rebuild' the DB at anytime from scratch
* video player cannot handle non mp4 formats... do I care? (could just offer a download link and hope the client deals with it)
--> OR? https://jsmpeg.com/
--> OR? convert all videos to mp4/webm
@@ -91,6 +128,8 @@
* viewer needs to allow toggle to scan_model (and prob. right-click on file... AI (with CNN) AI (with hog)
- make the form-select AI_Model actually do the change (but need more mem on mara really -- even mem is not enough
need graphic support --> need to allow pa_job_manager run on borric with acceleration)
- test this with new CPU in mara
- test this on borric for comparison
For AI / rescan:
way to override per file:
@@ -209,3 +248,182 @@ def AddRefimgToPerson(person_id: int, ref_images: List[str]) -> bool:
>>> AddRefimgToPerson(123, ['http://example.com/img1.jpg', '/path/to/img2.png'])
True
"""
checkout deepface:
User: I have this code I wrote using pythons face_recognition library, but I would like to alter it to work with deepface. My code is: im = face_recognition.load_image_file(e.FullPathOnFS())
face_locations = face_recognition.face_locations(im, model=model.name )
unknown_encodings = face_recognition.face_encodings(im, known_face_locations=face_locations)
Assistant: To alter your existing code to work with **DeepFace**, the changes involve replacing the methods of the `face_recognition` library with similar functionalities provided by DeepFace.
In **DeepFace**, the equivalent functionalities include:
1. **Detecting faces**: This is automatically handled by the `analyze` method in DeepFace. You don't need to manually specify the face locations.
2. **Encoding faces**: DeepFace internally computes embeddings for faces through its supported models (like VGG-Face, Facenet, OpenFace, DeepID, etc.).
Here's how you can modify your code to use DeepFace instead of `face_recognition`:
### Original Code (using `face_recognition`):
```python
im = face_recognition.load_image_file(e.FullPathOnFS())
face_locations = face_recognition.face_locations(im, model=model.name)
unknown_encodings = face_recognition.face_encodings(im, known_face_locations=face_locations)
```
### Modified Code (using `DeepFace`):
```python
from deepface import DeepFace
import cv2
# Load the image
im = cv2.imread(e.FullPathOnFS())
# Analyze the image to get face embeddings (encodings) and other details
results = DeepFace.represent(img_path=e.FullPathOnFS(), model_name=model.name)
# `results` will contain a list of dictionaries where each dictionary contains the embedding (encoding) of a detected face.
# Example of accessing the embeddings (face encodings)
unknown_encodings = [res['embedding'] for res in results]
```
### Explanation:
- `DeepFace.represent()` takes the path to the image (`e.FullPathOnFS()`) and the name of the model (`model.name`). It processes the image and extracts the embeddings (encodings) for each face detected.
- **Note**: In DeepFace, the model name can be one of several options: `VGG-Face`, `Facenet`, `OpenFace`, `DeepFace`, `DeepID`, etc. Make sure that `model.name` corresponds to one of these supported models.
- The result is a list of dictionaries. Each dictionary contains a detected face's encoding (embedding), which you can access using the `'embedding'` key.
DeepFace abstracts away a lot of the manual face detection work that you were handling with `face_recognition`. Let me know if you'd like to handle anything else, like visualizing the detected faces or working with other DeepFace models.
User: I want to use Deepface to take a set of reference images and match against all the images I have to find the best matches, how would you go about that?
Assistant: To use **DeepFace** to match a set of reference images against other images to find the best matches, you can follow these general steps:
1. **Compute embeddings (encodings) for the reference images**.
2. **Compute embeddings for the target images** (the ones you're searching through).
3. **Compare the embeddings** of the reference images with those of the target images to find the best matches based on a distance metric.
4. **Rank the matches** by similarity and return the best ones.
Here's a step-by-step guide on how you can achieve this with **DeepFace**:
### Step 1: Compute embeddings for reference images
You will load the reference images, compute their embeddings using DeepFace, and store them.
### Step 2: Compute embeddings for all target images
Do the same for the target images you want to match against.
### Step 3: Compare embeddings
You will compare the embeddings from the reference images to the embeddings of each target image. The smaller the distance between the embeddings, the more similar the faces are.
### Step 4: Return the best matches
Sort the stored (reference, target, distance) records by distance and return them, smallest first.
### Example Code
```python
from deepface import DeepFace
import numpy as np

# Function to compute embeddings for a given image path
def compute_embeddings(image_path, model_name='VGG-Face'):
    try:
        # This returns a list of dicts, one per face found in the image
        embeddings = DeepFace.represent(img_path=image_path, model_name=model_name)
        return [res['embedding'] for res in embeddings]
    except Exception as e:
        print(f"Error processing image {image_path}: {e}")
        return []

# Compute the distance between two embeddings for the chosen metric.
# (DeepFace does not expose a public distance helper, so compute it with numpy.)
def embedding_distance(a, b, metric='cosine'):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    if metric == 'cosine':
        return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    if metric == 'euclidean_l2':
        a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    return float(np.linalg.norm(a - b))

# Function to find the best matches between reference and target images
def find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine'):
    # Store embeddings for reference images
    reference_embeddings = {}
    for ref_img in reference_images:
        ref_embeddings = compute_embeddings(ref_img, model_name=model_name)
        if ref_embeddings:
            reference_embeddings[ref_img] = ref_embeddings

    # Store best matches
    best_matches = []

    # Loop through each target image
    for target_img in target_images:
        target_embeddings = compute_embeddings(target_img, model_name=model_name)
        for target_emb in target_embeddings:
            for ref_img, ref_emb_list in reference_embeddings.items():
                # Compare each reference embedding with the target embedding
                for ref_emb in ref_emb_list:
                    distance = embedding_distance(ref_emb, target_emb, metric=metric)
                    # Store match details (target image, reference image, and distance)
                    best_matches.append({
                        'target_image': target_img,
                        'reference_image': ref_img,
                        'distance': distance
                    })

    # Sort matches by distance (smallest distances indicate best matches)
    best_matches = sorted(best_matches, key=lambda x: x['distance'])
    return best_matches

# Example usage
reference_images = [
    "path_to_reference_image1.jpg",
    "path_to_reference_image2.jpg"
]
target_images = [
    "path_to_target_image1.jpg",
    "path_to_target_image2.jpg",
    # Add more target images here...
]

# Find the best matches
matches = find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine')

# Display the top 5 matches
for match in matches[:5]:
    print(f"Reference Image: {match['reference_image']}, Target Image: {match['target_image']}, Distance: {match['distance']}")
```
### Explanation:
1. **`compute_embeddings(image_path, model_name)`**: This function uses DeepFace's `represent` function to compute face embeddings for each face found in the image.
2. **`find_best_matches()`**:
- It first computes and stores embeddings for all reference images in the `reference_embeddings` dictionary.
- For each target image, it computes the embeddings, then compares those embeddings with all the reference embeddings using the chosen distance metric (default is `'cosine'`).
- The function then stores the matches and sorts them by distance (the smaller the distance, the closer the match).
3. **Distance metrics**: You can choose between several distance metrics like `'cosine'`, `'euclidean'`, or `'euclidean_l2'`, depending on the model and your preference.
4. **Matching logic**: It finds the best matches by comparing each face embedding in the target images with each face embedding in the reference images and storing the distances.
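For larger collections, the nested Python loops can be replaced by one vectorized numpy computation per target embedding; a minimal sketch for the cosine metric (the arrays below are made-up stand-ins for real embeddings):

```python
import numpy as np

def cosine_distances(ref_matrix, target_emb):
    """Cosine distance from one target embedding to every reference embedding.

    ref_matrix: shape (n_refs, dim), one reference embedding per row.
    target_emb: shape (dim,). Returns a (n_refs,) array of distances.
    """
    ref_matrix = np.asarray(ref_matrix, dtype=float)
    target_emb = np.asarray(target_emb, dtype=float)
    sims = (ref_matrix @ target_emb) / (
        np.linalg.norm(ref_matrix, axis=1) * np.linalg.norm(target_emb)
    )
    return 1.0 - sims

# Toy example: 3 reference embeddings, one target embedding
refs = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
target = np.array([1.0, 0.0])
dists = cosine_distances(refs, target)
best = int(np.argmin(dists))  # index of the closest reference (0 here)
```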
### DeepFace Distance Metrics:
- **Cosine**: Measures the cosine of the angle between two vectors (common for face embeddings).
- **Euclidean**: Measures the straight-line distance between two vectors.
- **Euclidean L2**: A variant of Euclidean where embeddings are normalized before distance calculation.
### Example of top match output:
```bash
Reference Image: path_to_reference_image1.jpg, Target Image: path_to_target_image1.jpg, Distance: 0.256
Reference Image: path_to_reference_image2.jpg, Target Image: path_to_target_image2.jpg, Distance: 0.310
...
```
This approach will give you the best matching images based on facial similarity. You can adjust the model used (`VGG-Face`, `Facenet`, etc.) and the distance metric to fine-tune results. Let me know if you'd like to visualize the results or handle larger datasets more efficiently!
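One practical way to handle larger datasets is to compute each image's embeddings once and cache them to disk, so reruns only do the cheap distance comparisons; a sketch using numpy's `.npz` format (the cache layout here is an arbitrary choice for this example, not a DeepFace convention):

```python
import os
import numpy as np

def save_embedding_cache(embeddings_by_image, cache_file):
    """embeddings_by_image: dict of image name -> (n_faces, dim) array-like."""
    np.savez(cache_file, **{name: np.asarray(emb) for name, emb in embeddings_by_image.items()})

def load_embedding_cache(cache_file):
    """Return the cached dict, or an empty dict if no cache exists yet."""
    if not os.path.exists(cache_file):
        return {}
    with np.load(cache_file) as data:
        return {name: data[name] for name in data.files}
```

On a rerun, only images missing from the cache need a fresh `DeepFace.represent` call.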


@@ -234,9 +234,13 @@ def GetEntriesInFolderView( OPT, prefix ):
################################################################################
def GetEntriesInSearchView( OPT ):
search_term=OPT.orig_search_term
# turn * wildcard into sql wildcard of %
search_term=search_term.replace('*', '%' )
if 'AI:' in OPT.orig_search_term:
search_term = search_term.replace('AI:','')
join=f"Entry.query.join(File).join(FaceFileLink).join(Face).join(FaceRefimgLink).join(Refimg).join(PersonRefimgLink).join(Person).filter(Person.tag.ilike('%{search_term}%'))"
join=f"Entry.query.join(File).join(FaceFileLink).join(Face).join(FaceRefimgLink).join(Refimg).join(PersonRefimgLink).join(Person).filter(Person.tag == search_term)"
else:
join=f"Entry.query.join(File).join(FaceFileLink).join(Face).join(FaceRefimgLink).join(Refimg).join(PersonRefimgLink).join(Person).filter(Person.tag.ilike('%{search_term}%'))"
if 'AI:' in OPT.orig_search_term:
all_entries = eval( f"{join}.{OPT.order}.offset(OPT.offset).limit(OPT.how_many).all()")
else:
@@ -265,13 +269,18 @@ def GetEntriesInSearchView( OPT ):
if add_it:
all_entries.append(a)
# nothing found, just return now
if len(all_entries) == 0:
OPT.num_entries = 0
return []
# for all searches first_entry is worked out when first_eid not set yet & offset is 0 and we have some entries
if OPT.first_eid == 0 and OPT.offset == 0 and len(all_entries):
OPT.first_eid = all_entries[0].id
if OPT.last_eid == 0:
by_fname= f"select e.id from entry e where e.name ilike '%%{search_term}%%'"
by_dirname=f"select e.id from entry e, entry_dir_link edl where edl.entry_id = e.id and edl.dir_eid in ( select d.eid from dir d where d.rel_path ilike '%%{search_term}%%' )"
by_ai =f"select e.id from entry e, face_file_link ffl, face_refimg_link frl, person_refimg_link prl, person p where e.id = ffl.file_eid and frl.face_id = ffl.face_id and frl.refimg_id = prl.refimg_id and prl.person_id = p.id and p.tag ilike '%%{search_term}%%'"
by_ai =f"select e.id from entry e, face_file_link ffl, face_refimg_link frl, person_refimg_link prl, person p where e.id = ffl.file_eid and frl.face_id = ffl.face_id and frl.refimg_id = prl.refimg_id and prl.person_id = p.id and p.tag = '{search_term}'"
if 'AI:' in OPT.orig_search_term:
sel_no_order=f"select e.*, f.* from entry e, file f where e.id=f.eid and e.id in ( {by_ai} ) "
@@ -283,6 +292,9 @@ def GetEntriesInSearchView( OPT ):
with db.engine.connect() as conn:
OPT.num_entries = conn.execute( text( num_e_sql ) ).first().count
if OPT.num_entries == 0:
return []
last_entry_sql= f"{sel_no_order} order by {OPT.last_order_raw} limit 1"
with db.engine.connect() as conn:
OPT.last_eid = conn.execute( text( last_entry_sql ) ).first().id
@@ -438,7 +450,10 @@ def files_rbp():
@app.route("/search/<search_term>", methods=["GET", "POST"])
@login_required
def search(search_term):
# print( f"req={request}" )
OPT=States( request )
# print( f"OPT={OPT}" )
# if we posted to get here, its a change in State, so save it to pa_user_state, and go back to the GET version or URL
if request.method=="POST":
redirect("/search/"+search_term)
@@ -586,7 +601,7 @@ def view_list():
# this occurs when we went from the last image on a page (with how_many on
# it) and it just happened to also be the last in the DB...
if not entries:
SetFELog( message="DDP: DONT think this can happen anymore", log_level="error", job=None, persistent=True, cant_close=True )
SetFELog( message="DDP: DONT think this can happen anymore", level="danger", job=None, persistent=True, cant_close=True )
# undo the skip by how_many and getentries again
OPT.offset -= int(OPT.how_many)
@@ -610,7 +625,6 @@ def view_list():
resp['objs'][e.id]['faces'] = []
# put face data back into array format (for js processing)
fid=0
for face in e.file_details.faces:
fd= {}
fd['x'] = face.face_left
@@ -618,10 +632,10 @@ def view_list():
fd['w'] = face.w
fd['h'] = face.h
if face.refimg:
fd['pid'] = face.refimg.person.id
fd['who'] = face.refimg.person.tag
fd['distance'] = round(face.refimg_lnk.face_distance,2)
resp['objs'][e.id]['faces'].append(fd)
fid+=1
eids=eids.rstrip(",")
lst = eids.split(',')
@@ -638,9 +652,12 @@ def view_list():
resp['last_eid']=OPT.last_eid
resp['eids']=eids
resp['offset']=OPT.offset
# print( f"BUG-DEBUG: /view_list route #1 - OPT={OPT}, eids={eids} ")
# save pref to keep the new current value, first/last
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.orig_ptype==OPT.orig_ptype,PA_UserState.view_eid==OPT.view_eid).first()
# print( f"BUG-DEBUG: /view_list route #2 - OPT={OPT}, eids={eids} ")
UpdatePref( pref, OPT )
# print( f"BUG-DEBUG: /view_list route #3 - OPT={OPT}, eids={eids} ")
return make_response( resp )
@@ -674,13 +691,13 @@ def view(id):
eids=eids.rstrip(",")
# jic, sometimes we trip this, and rather than show broken pages / destroy
if id not in eids:
SetFELog( message=f"ERROR: viewing an id, but its not in eids OPT={OPT}, id={id}, eids={eids}", log_level="error", persistent=True, cant_close=True)
SetFELog( message=f"ERROR: viewing an id, but its not in eids OPT={OPT}, id={id}, eids={eids}", level="danger", persistent=True, cant_close=False)
msg="Sorry, viewing data is confused, cannot view this image now"
if os.environ['ENV'] == "production":
msg += "Clearing out all states. This means browser back buttons will not work, please start a new tab and try again"
PA_UserState.query.delete()
db.session.commit()
SetFELog( msg, "warning" )
SetFELog( msg, "warning", persistent=True, cant_close=False )
return redirect("/")
else:
NMO_data = FaceOverrideType.query.all()
@@ -688,6 +705,7 @@ def view(id):
imp_path = setting.import_path
st_path = setting.storage_path
bin_path = setting.recycle_bin_path
# print( f"BUG-DEBUG: /view/id GET route - OPT={OPT}, eids={eids}, current={int(id)} ")
return render_template("viewer.html", current=int(id), eids=eids, objs=objs, OPT=OPT, NMO_data=NMO_data, imp_path=imp_path, st_path=st_path, bin_path=bin_path )
##################################################################################
@@ -698,6 +716,7 @@ def view(id):
def view_img_post(id):
# set pa_user_states...
OPT=States( request )
# print( f"BUG-DEBUG: /view/id POST route - OPT={OPT}, id={id} ")
# then use back-button friendly URL (and use pa_user_states to view the right image in the right list
return redirect( "/view/" + id );
@@ -782,7 +801,7 @@ def _jinja2_filter_parentpath(path):
def get_existing_paths(dt):
dir_ft=FileType.query.filter(FileType.name=='Directory').first()
dirs_arr=[]
for delta in range(-7, 8):
for delta in range(-14, 15):
try:
new_dtime=datetime.strptime(dt, "%Y%m%d") + timedelta(days=delta)
except:


@@ -173,27 +173,6 @@
<svg id="unknown_ftype" fill="grey" viewBox="0 0 16 16">
<path d="M2 0a2 2 0 0 0-2 2v12a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V2a2 2 0 0 0-2-2H2zm3.496 6.033a.237.237 0 0 1-.24-.247C5.35 4.091 6.737 3.5 8.005 3.5c1.396 0 2.672.73 2.672 2.24 0 1.08-.635 1.594-1.244 2.057-.737.559-1.01.768-1.01 1.486v.105a.25.25 0 0 1-.25.25h-.81a.25.25 0 0 1-.25-.246l-.004-.217c-.038-.927.495-1.498 1.168-1.987.59-.444.965-.736.965-1.371 0-.825-.628-1.168-1.314-1.168-.803 0-1.253.478-1.342 1.134-.018.137-.128.25-.266.25h-.825zm2.325 6.443c-.584 0-1.009-.394-1.009-.927 0-.552.425-.94 1.01-.94.609 0 1.028.388 1.028.94 0 .533-.42.927-1.029.927z"/>
</svg>
<svg id="rot90" viewBox="0 0 225 225" xmlns="http://www.w3.org/2000/svg" xmlns:bx="https://boxy-svg.com">
<g transform="matrix(1, 0, 0, 1, 7, 20)">
<path style="fill: rgba(216, 216, 216, 0); stroke: rgb(13, 202, 240); stroke-width: 15px;" d="M 100.468 0.428 C 155.454 0.428 200.022 45.014 200 100"/>
<path d="M -100.468 -5.428 H -100.468 L -100.468 -20.428 L -60.468 -0.428 L -100.468 19.572 L -100.468 4.572 H -100.468 V -5.428 Z" style="fill: rgb(13, 202, 240);" transform="matrix(-1, 0, 0, -1, 0, 0)" bx:shape="arrow -100.468 -20.428 40 40 10 40 0 1@18b8b13c"/>
<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" transform="matrix(1, 0, 0, 1, -50.667503, 38.507301)"><tspan x="97.106" y="94.348">90</tspan><tspan x="97.106" dy="1em"></tspan></text>
</g>
</svg>
<svg id="rot180" viewBox="0 0 225 225" xmlns:bx="https://boxy-svg.com">
<path style="fill: rgba(216, 216, 216, 0); paint-order: stroke; stroke-width: 15px; stroke: rgb(13, 202, 240);" d="M 70.624 -41.413 C 147.233 -41.507 195.215 41.367 156.992 107.76 C 139.252 138.572 106.422 157.581 70.867 157.624" transform="matrix(0, -1, 1, 0, 62.412816, 178.623736)"/>
<path d="M 0 95 H 0 L 0 80 L 40 100 L 0 120 L 0 105 H 0 V 95 Z" style="fill: rgb(13, 202, 240);" transform="matrix(0.000048, 1, -1, 0.000048, 120.999475, 107.995157)" bx:shape="arrow 0 80 40 40 10 40 0 1@de4046a6"/>
<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" transform="matrix(1, 0, 0, 1, -13.373604, 45.416005)"><tspan x="54.39" y="96.53">180</tspan><tspan x="54.39" dy="1em"></tspan></text>
</svg>
<svg id="rot270" viewBox="0 0 225 225" xmlns:bx="https://boxy-svg.com">
<g transform="matrix(1, 0, 0, 1, 8, 9)">
<path style="fill: rgba(13, 202, 240, 0); paint-order: fill; stroke: rgb(13, 202, 240); stroke-width: 15px;" d="M 100.308 199.384 C 177.051 199.384 225.015 116.307 186.644 49.846 C 148.272 -16.615 52.344 -16.615 13.972 49.846 C 5.371 64.744 0.771 81.615 0.62 98.817" transform="matrix(0, -1, 1, 0, 0.692227, 200.07649)"/>
<path d="M 99 195 H 99 L 99 180 L 139 200 L 99 220 L 99 205 H 99 V 195 Z" style="fill: rgb(13, 202, 240);" bx:shape="arrow 99 180 40 40 10 40 0 1@077af6d0"/>
<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" x="20.59" y="133.396">270</text>
</g>
</svg>
<svg id="sort-num-asc" viewBox="0 0 16 16">
<path d="M12.438 1.668V7H11.39V2.684h-.051l-1.211.859v-.969l1.262-.906h1.046z"/>
<path fill-rule="evenodd" d="M11.36 14.098c-1.137 0-1.708-.657-1.762-1.278h1.004c.058.223.343.45.773.45.824 0 1.164-.829 1.133-1.856h-.059c-.148.39-.57.742-1.261.742-.91 0-1.72-.613-1.72-1.758 0-1.148.848-1.835 1.973-1.835 1.09 0 2.063.636 2.063 2.687 0 1.867-.723 2.848-2.145 2.848zm.062-2.735c.504 0 .933-.336.933-.972 0-.633-.398-1.008-.94-1.008-.52 0-.927.375-.927 1 0 .64.418.98.934.98z"/>



@@ -223,12 +223,14 @@ function ChangeSize(clicked_button,sz)
// whether you click after highlight or before
function DoSel(e, el)
{
if( e.ctrlKey )
if( e.ctrlKey || document.fake_ctrl === 1 )
{
$(el).toggleClass('highlight')
if( document.fake_ctrl === 1 )
document.fake_ctrl=0
return
}
if( e.shiftKey )
if( e.shiftKey || document.fake_shift === 1 )
{
st=Number($('.highlight').first().attr('ecnt'))
end=Number($('.highlight').last().attr('ecnt'))
@@ -243,6 +245,9 @@ function DoSel(e, el)
$('.entry').slice( end, clicked+1 ).addClass('highlight')
else
$('.entry').slice( clicked, st ).addClass('highlight')
if( document.fake_shift === 1 )
document.fake_shift=0
return
}
$('.highlight').removeClass('highlight')


@@ -237,6 +237,8 @@ $(document).ready( function()
else if( objs[current].faces[i].who )
{
item_list['match']={ 'name': objs[current].faces[i].who, 'which_face': i, 'id': objs[current].faces[i].id }
item_list['match_add_refimg']={ 'name': 'Add this as refimg for ' + objs[current].faces[i].who,
'person_id': objs[current].faces[i].pid, 'who': objs[current].faces[i].who, 'which_face': i, 'id': objs[current].faces[i].id, }
item_list['wrong_person']={ 'name': 'wrong person', 'which_face': i, 'id': objs[current].faces[i].id }
}
else
@@ -494,10 +496,23 @@ function FaceDBox(key, item)
{
div+=AddSearch( 'Click one of the link(s) below to add this face as a reference image to the person:<br><br>', key, face_pos );
}
if ( key == 'match_add_refimg' )
{
func='AddRefimgTo('+item[key]['person_id']+',\''+key+'\''
func_sn=func+ ', true )'
func_ao=func+ ', false )'
div+=`<script>console.log( "AddExistingFaceAsRefimgToMatchedPerson()" )</script>`
div+="Confirm you wish to add this face as a reference image for " + item[key]['who']
div+= '<div class="col">' + item[key]['who'] + '</div><div class="col input-group">'
div+= '<button onClick="'+func_sn+'" class="btn btn-success py-1 input-group-prepend">Add & search now</button>&nbsp;'
div+= '<button onClick="'+func_ao+'" class="btn btn-outline-success py-1 input-group-append">Add only</button></div>'
}
if ( key == 'wrong_person' )
{
div+='<br>wrong person, so mark this as the wrong person/refimg connection, for face#' + item[key]['which_face']
div+='<br>face db id: ' + item[key]['id']
div += '<br>not yet'
}
if( /NMO_/.test(key) )
{

BIN
internal/pa-logo.png Normal file


BIN
internal/rot180-invert.png Normal file


BIN
internal/rot180.png Normal file


BIN
internal/rot270-invert.png Normal file


BIN
internal/rot270.png Normal file


BIN
internal/rot90-invert.png Normal file


BIN
internal/rot90.png Normal file


107
internal/rot90.svg Normal file

@@ -0,0 +1,107 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
width="265mm"
height="297mm"
viewBox="0 0 265 297"
version="1.1"
id="svg5"
inkscape:version="1.2.2 (b0a8486541, 2022-12-01)"
sodipodi:docname="rot90.svg"
inkscape:export-filename="rot90.svg"
inkscape:export-xdpi="96"
inkscape:export-ydpi="96"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg">
<sodipodi:namedview
id="namedview7"
pagecolor="#ffffff"
bordercolor="#000000"
borderopacity="0.25"
inkscape:showpageshadow="2"
inkscape:pageopacity="0.0"
inkscape:pagecheckerboard="0"
inkscape:deskcolor="#d1d1d1"
inkscape:document-units="mm"
showgrid="false"
inkscape:zoom="0.914906"
inkscape:cx="222.97373"
inkscape:cy="261.22902"
inkscape:window-width="3840"
inkscape:window-height="2128"
inkscape:window-x="0"
inkscape:window-y="0"
inkscape:window-maximized="1"
inkscape:current-layer="svg5" />
<defs
id="defs2">
<marker
style="overflow:visible"
id="Arrow2"
refX="0"
refY="0"
orient="95"
inkscape:stockid="Arrow2"
markerWidth="2.7"
markerHeight="3.964"
viewBox="0 0 7.7 5.6"
inkscape:isstock="true"
inkscape:collect="always"
preserveAspectRatio="none">
<path
transform="scale(0.7)"
d="M -2,-4 9,0 -2,4 c 2,-2.33 2,-5.66 0,-8 z"
style="fill:context-stroke;fill-rule:evenodd;stroke:none"
id="arrow2L" />
</marker>
</defs>
<g
inkscape:label="Layer 1"
inkscape:groupmode="layer"
id="layer1">
<ellipse
style="font-variation-settings:normal;display:inline;opacity:1;fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#000000;stroke-width:10;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:20,50;stroke-dashoffset:2;stroke-opacity:1;paint-order:markers stroke fill;stop-color:#000000;stop-opacity:1"
id="path2244"
cx="131.0041"
cy="137.873"
rx="98.982185"
ry="108.3931"
inkscape:label="dashes" />
</g>
<g
inkscape:groupmode="layer"
id="layer2"
inkscape:label="270-arc" />
<path
style="font-variation-settings:normal;display:inline;fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#000000;stroke-width:12;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;marker-end:url(#Arrow2);paint-order:markers stroke fill;stop-color:#000000"
id="path2244-3"
inkscape:label="arc"
sodipodi:type="arc"
sodipodi:cx="131.08505"
sodipodi:cy="137.8826"
sodipodi:rx="98.982185"
sodipodi:ry="108.3931"
sodipodi:start="4.7127986"
sodipodi:end="0"
sodipodi:arc-type="arc"
d="M 131.1256,29.489511 A 98.982185,108.3931 0 0 1 230.06724,137.8826"
inkscape:export-filename="rot90.svg"
inkscape:export-xdpi="96"
inkscape:export-ydpi="96"
sodipodi:open="true" />
<text
xml:space="preserve"
style="font-style:normal;font-variant:normal;font-weight:normal;font-stretch:normal;font-size:90.3111px;font-family:sans-serif;-inkscape-font-specification:'sans-serif, Normal';font-variant-ligatures:normal;font-variant-caps:normal;font-variant-numeric:normal;font-variant-east-asian:normal;fill:#000000;stroke:none;stroke-width:4;stroke-linecap:round;stroke-linejoin:bevel;stroke-dasharray:none;stroke-dashoffset:0;paint-order:markers stroke fill"
x="74.990334"
y="174.16937"
id="text2468"
inkscape:label="270-text"><tspan
sodipodi:role="line"
id="tspan2466"
style="font-style:normal;font-variant:normal;font-weight:bold;font-stretch:normal;font-size:95.25px;font-family:sans-serif;-inkscape-font-specification:'sans-serif, Bold';font-variant-ligatures:normal;font-variant-caps:normal;font-variant-numeric:normal;font-variant-east-asian:normal;fill:#000000;stroke:none;stroke-width:4;stroke-dasharray:none"
x="74.990334"
y="174.16937">90</tspan></text>
</svg>


24
job.py

@@ -78,13 +78,6 @@ class PA_JobManager_Message(PA,db.Model):
job = db.relationship ("Job" )
################################################################################
# GetJM_Message: used in html to display any message for this front-end
################################################################################
def GetJM_Message():
    msg=PA_JobManager_Message.query.first()
    return msg
################################################################################
# Used in main html to show a red badge of # jobs to draw attention there are
# active jobs being processed in the background
@@ -178,7 +171,9 @@ def jobs():
jobs = Job.query.order_by(Job.id.desc()).all()
else:
page_title='Job list (recent)'
jobs = Job.query.filter( Job.last_update >= (func.now() - func.cast(concat(settings.job_archive_age, 'DAYS'), INTERVAL)) ).order_by(Job.id.desc()).all()
# work out cutoff in python (used to do this in sql and it was too slow)
cutoff = datetime.now() - timedelta(days=settings.job_archive_age)
jobs = Job.query.filter( Job.last_update >= cutoff ).order_by(Job.id.desc()).all()
return render_template("jobs.html", jobs=jobs, page_title=page_title)
@@ -188,7 +183,7 @@ def jobs():
@app.route("/job/<id>", methods=["GET","POST"])
@login_required
def joblog(id):
joblog = Job.query.get(id)
joblog = db.session.get(Job,id)
if request.method == 'POST':
logs=Joblog.query.filter(Joblog.job_id==id).order_by(Joblog.log_date).all()
@@ -344,3 +339,14 @@ def _jinja2_filter_datetime(date, fmt=None):
return date.strftime("%d/%m/%Y %I:%M:%S %p")
else:
return "N/A"
################################################################################
# allow a way to force the messages to be deleted if really needed - its a bit
# lame, but a quick fix
################################################################################
@app.route('/force_clear')
@login_required
def force_clear():
PA_JobManager_Message.query.delete();
db.session.commit()
return redirect("/")

22
main.py

@@ -11,7 +11,7 @@ from datetime import datetime
import os
import re
import socket
from shared import CreateSelect, CreateFoldersSelect, LocationIcon, DB_URL, PROD_HOST, OLDEST_LOG_LIMIT
from shared import CreateSelect, CreateFoldersSelect, LocationIcon, DB_URL, OLDEST_LOG_LIMIT
# for ldap auth
from flask_ldap3_login import LDAP3LoginManager
@@ -60,15 +60,18 @@ except Exception:
app.config['SECRET_KEY'] = b'my_insecure_PA_token_with_random_2134876adsfjhlkasdf87'
# ldap config vars: (the last one is required, or python ldap freaks out)
app.config['LDAP_HOST'] = 'mara.ddp.net'
app.config['LDAP_HOST'] = 'ldap.ddp.net'
app.config['LDAP_BASE_DN'] = 'dc=depaoli,dc=id,dc=au'
app.config['LDAP_USER_DN'] = 'ou=users'
app.config['LDAP_GROUP_DN'] = 'ou=groups'
app.config['LDAP_USER_RDN_ATTR'] = 'cn'
app.config['LDAP_USER_RDN_ATTR'] = 'uid'
app.config['LDAP_USER_LOGIN_ATTR'] = 'uid'
app.config['LDAP_BIND_USER_DN'] = None
app.config['LDAP_BIND_USER_PASSWORD'] = None
app.config['LDAP_GROUP_OBJECT_FILTER'] = '(objectclass=posixGroup)'
app.config['LDAP_BIND_USER_DN'] = None
app.config['LDAP_BIND_USER_PASSWORD'] = None
db = SQLAlchemy(app) # create the (flask) sqlalchemy connection
ma = Marshmallow(app) # set up Marshmallow - data marshalling / serialising
@@ -86,6 +89,7 @@ from files import Entry
from person import Person
from settings import Settings
from user import PAUser
from states import PA_UserState
####################################### GLOBALS #######################################
# allow jinja2 to call these python functions directly
@@ -142,12 +146,17 @@ def login():
# ldap-injecting our username, so send them back to the login page instead
if request.method == 'POST' and re.search( r'[()\\*&!]', request.form['username']):
from job import SetFELog
SetFELog( message=f"ERROR: Detected special LDAP chars in username: {request.form['username']}", log_level="error", persistent=True, cant_close=True )
SetFELog( message=f"ERROR: Detected special LDAP chars in username: {request.form['username']}", level="danger", persistent=True, cant_close=True )
return redirect(url_for('login'))
if form.validate_on_submit():
# Successfully logged in, We can now access the saved user object via form.user.
login_user(form.user, remember=True) # Tell flask-login to log them in.
next = request.args.get("next")
# just (re)-authenticated, so clear old state from UserState, to avoid re-using old data that is no longer valid
PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn).delete()
db.session.commit()
if next:
return redirect(next) # Send them back where they came from
else:
@@ -242,10 +251,7 @@ def logout():
# main to be called via Flask/Gunicorn
###############################################################################
def main():
if hostname == PROD_HOST:
app.run(ssl_context=('/etc/letsencrypt/live/pa.depaoli.id.au/cert.pem', '/etc/letsencrypt/live/pa.depaoli.id.au/privkey.pem'), host="0.0.0.0", debug=False)
else:
app.run(host="0.0.0.0", debug=True)
app.run(host="0.0.0.0", debug=True)
###############################################################################
# This func creates a new filter in jinja2 to test to hand back the username


@@ -1,4 +1,4 @@
#
#
# This file controls the 'external' job control manager, that (periodically #
# looks / somehow is pushed an event?) picks up new jobs, and processes them.
@@ -15,7 +15,7 @@
### SQLALCHEMY IMPORTS ###
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String, Sequence, Float, ForeignKey, DateTime, LargeBinary, Boolean, func
from sqlalchemy import Column, Integer, String, Sequence, Float, ForeignKey, DateTime, LargeBinary, Boolean, func, text
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import relationship
from sqlalchemy import create_engine
@@ -23,7 +23,7 @@ from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm import scoped_session
### LOCAL FILE IMPORTS ###
from shared import DB_URL, PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT, THUMBSIZE, SymlinkName, GenThumb, SECS_IN_A_DAY
from shared import DB_URL, PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT, THUMBSIZE, SymlinkName, GenThumb, SECS_IN_A_DAY, PA_EXIF_ROTATER
from datetime import datetime, timedelta, date
### PYTHON LIB IMPORTS ###
@@ -45,12 +45,18 @@ import face_recognition
import re
import sys
import ffmpeg
import subprocess
# global debug setting
if 'ENV' not in os.environ or os.environ['ENV'] != "production":
if 'ENV' not in os.environ:
DEBUG=True
print("PA job manager starting as non-production -- ENV not set! Please set ENV, I'd rather not guess what to do")
elif os.environ['ENV'] != "production":
DEBUG=True
print(f"PA job manager starting as non-production -- ENV={os.environ['ENV']}")
else:
print("PA job manager starting as Production")
DEBUG=False
# global list of override tables to allow enumeration over them ...
@@ -507,6 +513,12 @@ class PA_JobManager_FE_Message(Base):
return "<id: {}, job_id: {}, level: {}, message: {}".format(self.id, self.job_id, self.level, self.message)
##############################################################################
# Class describing PA_UserState and in the DB (via sqlalchemy)
# the state for a User defines a series of remembered states for a user
# to optimise their viewing, etc. If we scan and fine new files, we need to
# invalidate these cached values, so we have this class here just for that
##############################################################################
class PA_UserState(Base):
__tablename__ = "pa_user_state"
id = Column(Integer, Sequence('pa_user_state_id_seq'), primary_key=True )
@@ -537,6 +549,19 @@ class PA_UserState(Base):
return f"<pa_user_dn: {self.pa_user_dn}, path_type: {self.path_type}, noo: {self.noo}, grouping: {self.grouping}, how_many: {self.how_many}, st_offset: {self.st_offset}, size: {self.size}, folders: {self.folders}, root: {self.root}, cwd: {self.cwd}, view_eid: {self.view_eid}, orig_ptype: {self.orig_ptype}, orig_search_term: {self.orig_search_term}, orig_url: {self.orig_url}, current={self.current}, first_eid={self.first_eid}, last_eid={self.last_eid}, num_entries={self.num_entries}>"
##############################################################################
# PAprint(): convenience function to prepend a timestamp to a printed string
##############################################################################
def PAprint( msg ):
now=datetime.now().strftime('%Y-%m-%d %H:%M:%S.%f')
if 'INFO:' in msg or 'WARN:' in msg:
msg = ' '+msg
if 'DEBUG:' in msg or 'ERROR:' in msg or 'WARN:' in msg or 'INFO:' in msg:
print( f"{now} {msg}" )
else:
print( f"{now} DEBUG: {msg}" )
return
##############################################################################
# NewJob(): convenience function to create a job, appropriately
##############################################################################
@@ -576,7 +601,7 @@ def MessageToFE( job_id, message, level, persistent, cant_close ):
def SettingsRBPath():
settings = session.query(Settings).first()
if settings == None:
print("ERROR: Cannot create file data with no settings / recycle bin path is missing")
PAprint("ERROR: Cannot create file data with no settings / recycle bin path is missing")
return None
# path setting is an absolute path, just use it, otherwise prepend base_path first
if settings.recycle_bin_path[0] == '/':
@@ -614,7 +639,7 @@ def ProcessRecycleBinDir(job):
def SettingsSPath():
settings = session.query(Settings).first()
if settings == None or settings.storage_path == "":
print("ERROR: Cannot create file data with no settings / storage path is missing")
PAprint("ERROR: Cannot create file data with no settings / storage path is missing")
return None
if settings.storage_path[0] == '/':
path=settings.storage_path
@@ -641,10 +666,9 @@ def ProcessStorageDirs(parent_job):
# SettingsIPath(): return import path (abs or add base_path if needed)
##############################################################################
def SettingsIPath():
paths=[]
settings = session.query(Settings).first()
if not settings or settings.import_path == "":
print("ERROR: Cannot create file data with no settings / import path is missing")
PAprint("ERROR: Cannot create file data with no settings / import path is missing")
return None
if settings.import_path[0] == '/':
path=settings.import_path
@@ -838,7 +862,7 @@ def JobMetadata(job):
def AddLogForJob(job, message):
now=datetime.now(pytz.utc)
log=Joblog( job_id=job.id, log=message, log_date=now )
job.last_update=datetime.now(pytz.utc)
job.last_update=now
session.add(log)
# some logs have DEBUG: in front, so clean that up
message = message.replace("DEBUG:", "" )
@@ -850,7 +874,7 @@ def AddLogForJob(job, message):
else:
job.last_commit = now
if DEBUG:
print( f"DEBUG: {message}" )
PAprint( f"{message}" )
return
##############################################################################
@@ -919,7 +943,7 @@ def FinishJob(job, last_log, state="Completed", pa_job_state="Completed", level=
session.commit()
MessageToFE( job_id=job.id, message=last_log, level=level, persistent=persistent, cant_close=cant_close )
if DEBUG:
print( f"DEBUG: {last_log}" )
PAprint( f"DEBUG: {last_log}" )
return
##############################################################################
@@ -930,13 +954,13 @@ def FinishJob(job, last_log, state="Completed", pa_job_state="Completed", level=
##############################################################################
def HandleJobs(first_run=False):
if first_run:
print("INFO: PA job manager is starting up - check for stale jobs" )
PAprint("INFO: PA job manager is starting up - check for stale jobs" )
else:
if DEBUG:
print("DEBUG: PA job manager is scanning for new jobs to process")
PAprint("DEBUG: PA job manager is scanning for new jobs to process")
for job in session.query(Job).filter(Job.pa_job_state != 'Complete').all():
if first_run and job.pa_job_state == 'In Progress':
print( f"INFO: Found stale job#{job.id} - {job.name}" )
PAprint( f"INFO: Found stale job#{job.id} - {job.name}" )
job.pa_job_state = 'Stale'
session.add(job)
AddLogForJob( job, "ERROR: Job has been marked stale as it did not complete" )
@@ -948,17 +972,17 @@ def HandleJobs(first_run=False):
j2 = session.query(Job).get(job.wait_for)
if not j2:
AddLogForJob( job, f"ERROR: waiting for a job#({job.wait_for}) that does not exist? ")
print(f"ERROR: job.wait_for ({job.wait_for}) does not exist in below? " )
PAprint(f"ERROR: job.wait_for ({job.wait_for}) does not exist in below? " )
for j in session.query(Job).all():
print(f"ERROR: j={j.id}")
PAprint(f"ERROR: j={j.id}")
continue
if j2.pa_job_state != 'Completed':
continue
# use this to remove threads for easier debugging, and errors will stacktrace to the console
if DEBUG:
print("*************************************")
print("RUNNING job: id={} name={} wait_for={}".format(job.id, job.name, job.wait_for ))
PAprint("*************************************")
PAprint("RUNNING job: id={} name={} wait_for={}".format(job.id, job.name, job.wait_for ))
RunJob(job)
else:
try:
@@ -968,8 +992,8 @@ def HandleJobs(first_run=False):
try:
MessageToFE( job_id=job.id, level="danger", message="Failed with: {} (try job log for details)".format(e), persistent=True, cant_close=False )
except Exception as e2:
print("ERROR: Failed to let front-end know, but back-end Failed to run job (id: {}, name: {} -- orig exep was: {}, this exception was: {})".format( job.id, job.name, e, e2) )
print("INFO: PA job manager is waiting for a job")
PAprint("ERROR: Failed to let front-end know, but back-end failed to run job (id: {}, name: {} -- orig exc was: {}, this exception was: {})".format( job.id, job.name, e, e2) )
PAprint("INFO: PA job manager is waiting for a job")
return
##############################################################################
@@ -1132,14 +1156,14 @@ def JobForceScan(job):
def CreateSymlink(job,ptype,path):
path_type = session.query(PathType).get(ptype)
symlink=SymlinkName(path_type.name, path, path)
if not os.path.exists(symlink):
print( f"INFO: symlink does not exist, actually creating it -- s={symlink}" )
try:
try:
if not os.path.exists(symlink):
PAprint( f"INFO: symlink does not exist, actually creating it -- s={symlink}" )
# SAFE: SafePath() on init forces symlink to be safe
os.makedirs( os.path.dirname(symlink), mode=0o777, exist_ok=True )
os.symlink(path, symlink)
except Exception as e:
AddLogForJob( job, f"ERROR: Failed to create symlink - tried to link {symlink} -> {path}: {e}")
r=os.makedirs( os.path.dirname(symlink), mode=0o777, exist_ok=True )
r=os.symlink(path, symlink)
except Exception as e:
AddLogForJob( job, f"ERROR: Failed to create symlink - tried to link {symlink} -> {path}: {e}")
return symlink
##############################################################################
@@ -1251,25 +1275,25 @@ def RemoveEmptyDirFromDB( job, del_me ):
####################################################################################################################################
def CleanUpDirInDB(job, e):
session.commit()
print( f"CleanUpDirInDB(): checking dir: {e.FullPathOnFS()} ({e.id})" )
PAprint( f"CleanUpDirInDB(): checking dir: {e.FullPathOnFS()} ({e.id})" )
content = session.query(Entry).join(EntryDirLink).filter(EntryDirLink.dir_eid==e.id).first()
if not content:
print( f" Dir {e.FullPathOnFS()} - {e.id} is empty - removing it" )
print( f" Entry {e}" )
PAprint( f" Dir {e.FullPathOnFS()} - {e.id} is empty - removing it" )
PAprint( f" Entry {e}" )
# if no in_dir, we are at the root of the path, STOP
if not e.in_dir:
print( " Parent is empty, so NEVER delete this entry, returning" )
PAprint( " Parent is empty, so NEVER delete this entry, returning" )
return
# okay remove this empty dir
RemoveEmtpyDirFromFS( job, e )
RemoveEmptyDirFromDB( job, e )
# get an Entry from DB (in_dir is a Dir/we need the ORM entry for code to work)
parent_dir = session.query(Entry).get(e.in_dir.eid)
print( f" Dir {e.FullPathOnFS()} is in {parent_dir.FullPathOnFS()} ({parent_dir.id}) -> check next" )
PAprint( f" Dir {e.FullPathOnFS()} is in {parent_dir.FullPathOnFS()} ({parent_dir.id}) -> check next" )
# check to see if removing the empty dir has left the parent dir empty
CleanUpDirInDB(job, parent_dir)
else:
print( f"There is content (first entry: {content.name}) in {e.FullPathOnFS()} - finished for this dir" )
PAprint( f"There is content (first entry: {content.name}) in {e.FullPathOnFS()} - finished for this dir" )
return
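CleanUpDirInDB walks upward: delete a directory once it is empty, then re-check its parent. On the filesystem side the standard library's os.removedirs does the same upward pruning, which the DB recursion here mirrors. A throwaway demonstration using temp dirs:

```python
import os
import tempfile

# Build a/b/c under a scratch root, then prune from the leaf upward.
root = tempfile.mkdtemp()
leaf = os.path.join(root, "a", "b", "c")
os.makedirs(leaf)

# removedirs deletes 'c', then keeps deleting newly-empty parents
# until it hits a directory that still has content (errors on the
# ancestors are ignored, per the os module docs).
os.removedirs(leaf)
gone = not os.path.exists(os.path.join(root, "a"))
```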
####################################################################################################################################
@@ -1359,7 +1383,7 @@ def MoveFileToRecycleBin(job,del_me):
# SAFE: as SafePaths(rbpath) combined with data I control in this func (explicit remove of 'static/' + DB entry path)
os.replace( src, dst )
if DEBUG:
print( f"MoveFileToRecycleBin({job.id},{del_me.name}): os.replace {src} with {dst} " )
PAprint( f"MoveFileToRecycleBin({job.id},{del_me.name}): os.replace {src} with {dst} " )
except Exception as e:
AddLogForJob( job, f"ERROR: Failed to remove file from filesystem - which={src}, err: {e}")
@@ -1413,7 +1437,7 @@ def MoveFileToRecycleBin(job,del_me):
####################################################################################################################################
def MoveEntriesToOtherFolder(job, move_me, dst_storage_path, dst_rel_path):
if DEBUG:
print( f"DEBUG: MoveEntriesToOtherFolder( job={job.id}, move_me={move_me.name}, dst_storage_path={dst_storage_path.id}, dst_rel_path={dst_rel_path})")
PAprint( f"DEBUG: MoveEntriesToOtherFolder( job={job.id}, move_me={move_me.name}, dst_storage_path={dst_storage_path.id}, dst_rel_path={dst_rel_path})")
orig_name=move_me.name
orig_fs_pos=move_me.FullPathOnFS()
@@ -1513,16 +1537,16 @@ def CreateFSLocation( job, dst_path, dst_locn ):
####################################################################################################################################
def ResetAnySubdirPaths( moving_dir, dst_storage_path, parent_rel_path ):
if DEBUG:
print( f"ResetAnySubdirPaths( {moving_dir.name}, {dst_storage_path.path_prefix}, {parent_rel_path} )" )
PAprint( f"ResetAnySubdirPaths( {moving_dir.name}, {dst_storage_path.path_prefix}, {parent_rel_path} )" )
sub_dirs = session.query(Entry).join(FileType).join(EntryDirLink).filter(EntryDirLink.dir_eid==moving_dir.id).filter(FileType.name=='Directory').all()
for sub in sub_dirs:
if DEBUG:
print( f"ResetAnySubdirPaths: WAS sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
PAprint( f"ResetAnySubdirPaths: WAS sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
sub.in_path = dst_storage_path
sub.dir_details.in_path = dst_storage_path
sub.dir_details.rel_path = parent_rel_path + '/' + sub.name
if DEBUG:
print( f"ResetAnySubdirPaths: NOW sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
PAprint( f"ResetAnySubdirPaths: NOW sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
ResetAnySubdirPaths( sub, dst_storage_path, sub.dir_details.rel_path )
return
@@ -1668,7 +1692,7 @@ def JobImportDir(job):
ptype = session.query(PathType).get(path_type)
AddLogForJob(job, f"Checking {ptype.name} Directory: {path}" )
if DEBUG:
print( f"DEBUG: Checking Directory: {path}" )
PAprint( f"DEBUG: Checking Directory: {path}" )
if not os.path.exists( path ):
WithdrawDependantJobs( job, job.id, "scan job found no new files to process" )
FinishJob( job, f"Finished Importing: {path} -- Path does not exist", "Failed" )
@@ -1739,7 +1763,7 @@ def JobImportDir(job):
# use ctime as even a metadata change (mv'd file on the fs, or a perms change) needs to be checked
if stat.st_ctime > dir.last_import_date:
if DEBUG:
print("DEBUG: {} - {} is newer than {}".format( basename, stat.st_ctime, dir.last_import_date ) )
PAprint("DEBUG: {} - {} is newer than {}".format( basename, stat.st_ctime, dir.last_import_date ) )
if isImage(fname):
type_str = 'Image'
elif isVideo(fname):
@@ -1753,7 +1777,7 @@ def JobImportDir(job):
found_new_files += 1
else:
if DEBUG:
print( f"DEBUG: { basename} - {stat.st_ctime} is OLDER than {dir.last_import_date}" )
PAprint( f"DEBUG: { basename} - {stat.st_ctime} is OLDER than {dir.last_import_date}" )
e=session.query(Entry).join(EntryDirLink).join(Dir).filter(Entry.name==basename,Dir.eid==dir.eid).first()
e.exists_on_fs=True
job.current_file=basename
@@ -1900,26 +1924,29 @@ def JobTransformImage(job):
id=[jex.value for jex in job.extra if jex.name == "id"][0]
amt=[jex.value for jex in job.extra if jex.name == "amt"][0]
e=session.query(Entry).join(File).filter(Entry.id==id).first()
print( f"JobTransformImage: job={job.id}, id={id}, amt={amt}" )
im = Image.open( e.FullPathOnFS() )
PAprint( f"JobTransformImage: job={job.id}, id={id}, amt={amt}" )
if amt == "fliph":
AddLogForJob(job, f"INFO: Flipping {e.FullPathOnFS()} horizontally" )
im = Image.open( e.FullPathOnFS() )
out = im.transpose(Image.FLIP_LEFT_RIGHT)
out.save( e.FullPathOnFS() )
elif amt == "flipv":
AddLogForJob(job, f"INFO: Flipping {e.FullPathOnFS()} vertically" )
im = Image.open( e.FullPathOnFS() )
out = im.transpose(Image.FLIP_TOP_BOTTOM)
out.save( e.FullPathOnFS() )
else:
AddLogForJob(job, f"INFO: Rotating {e.FullPathOnFS()} by {amt} degrees" )
if im.format == 'JPEG':
im=ImageOps.exif_transpose(im)
out = im.rotate(int(amt), expand=True)
out.save( e.FullPathOnFS() )
print( f"JobTransformImage DONE transform: job={job.id}, id={id}, amt={amt}" )
p = subprocess.run([PA_EXIF_ROTATER, e.FullPathOnFS(), amt ], capture_output=True, text=True )
if p.stderr:
PAprint(p.stderr)
else:
PAprint(p.stdout)
settings = session.query(Settings).first()
e.file_details.thumbnail, _ , _ = GenThumb( e.FullPathOnFS(), settings.auto_rotate )
e.file_details.hash = md5( job, e )
print( f"JobTransformImage DONE thumb: job={job.id}, id={id}, amt={amt}" )
PAprint( f"JobTransformImage DONE thumb: job={job.id}, id={id}, amt={amt}" )
session.add(e)
    FinishJob(job, "Finished Processing image rotation/flip")
return
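The transform branch above boils down to three Pillow calls. A minimal sketch of the dispatch (the function name `apply_transform` is mine, not from the codebase):

```python
from PIL import Image

def apply_transform(im, amt):
    # "fliph"/"flipv" mirror the image; anything else is a rotation
    # in degrees, with expand=True so the canvas grows to fit.
    if amt == "fliph":
        return im.transpose(Image.FLIP_LEFT_RIGHT)
    if amt == "flipv":
        return im.transpose(Image.FLIP_TOP_BOTTOM)
    return im.rotate(int(amt), expand=True)
```

For JPEGs the job first applies ImageOps.exif_transpose, so the rotation operates on the pixels as displayed rather than as stored.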
@@ -1940,7 +1967,7 @@ def GenHashAndThumb(job, e):
# use mtime as only if the content is different do we need to redo the hash
if stat.st_mtime < e.file_details.last_hash_date:
if DEBUG:
print(f"OPTIM: GenHashAndThumb {e.name} file is older than last hash, skip this")
PAprint(f"OPTIM: GenHashAndThumb {e.name} file is older than last hash, skip this")
job.current_file_num+=1
return
@@ -1948,7 +1975,7 @@ def GenHashAndThumb(job, e):
# same hash and we already have a thumbnail-> just return
if new_hash == e.file_details.hash and e.file_details.thumbnail:
if DEBUG:
print(f"OPTIM: GenHashAndThumb {e.name} md5 is same - likely a mv on filesystem so skip md5/thumb")
PAprint(f"OPTIM: GenHashAndThumb {e.name} md5 is same - likely a mv on filesystem so skip md5/thumb")
job.current_file_num+=1
return
e.file_details.hash = new_hash
@@ -1966,7 +1993,7 @@ def GenHashAndThumb(job, e):
####################################################################################################################################
def ProcessFilesInDir(job, e, file_func, count_dirs):
if DEBUG:
print( f"DEBUG: ProcessFilesInDir: {e.FullPathOnFS()}")
PAprint( f"DEBUG: ProcessFilesInDir: {e.FullPathOnFS()}")
if e.type.name != 'Directory':
file_func(job, e)
else:
@@ -1986,7 +2013,7 @@ def JobGetFileDetails(job):
path=[jex.value for jex in job.extra if jex.name == "path"][0]
path_prefix=[jex.value for jex in job.extra if jex.name == "path_prefix"][0]
if DEBUG:
print( f"DEBUG: JobGetFileDetails for path={path_prefix}" )
PAprint( f"DEBUG: JobGetFileDetails for path={path_prefix}" )
p=session.query(Path).filter(Path.path_prefix==path_prefix).first()
job.current_file_num = 0
job.num_files = p.num_files
@@ -2094,7 +2121,7 @@ def JobCheckForDups(job):
AddLogForJob( job, f"Check for duplicates" )
ClearOtherDupMessagesAndJobs()
res = session.execute( "select count(e1.id) from entry e1, file f1, dir d1, entry_dir_link edl1, path_dir_link pdl1, path p1, entry e2, file f2, dir d2, entry_dir_link edl2, path_dir_link pdl2, path p2 where e1.id = f1.eid and e2.id = f2.eid and d1.eid = edl1.dir_eid and edl1.entry_id = e1.id and edl2.dir_eid = d2.eid and edl2.entry_id = e2.id and p1.type_id != (select id from path_type where name = 'Bin') and p1.id = pdl1.path_id and pdl1.dir_eid = d1.eid and p2.type_id != (select id from path_type where name = 'Bin') and p2.id = pdl2.path_id and pdl2.dir_eid = d2.eid and f1.hash = f2.hash and e1.id != e2.id and f1.size_mb = f2.size_mb" )
res = session.execute( text( "select count(e1.id) from entry e1, file f1, dir d1, entry_dir_link edl1, path_dir_link pdl1, path p1, entry e2, file f2, dir d2, entry_dir_link edl2, path_dir_link pdl2, path p2 where e1.id = f1.eid and e2.id = f2.eid and d1.eid = edl1.dir_eid and edl1.entry_id = e1.id and edl2.dir_eid = d2.eid and edl2.entry_id = e2.id and p1.type_id != (select id from path_type where name = 'Bin') and p1.id = pdl1.path_id and pdl1.dir_eid = d1.eid and p2.type_id != (select id from path_type where name = 'Bin') and p2.id = pdl2.path_id and pdl2.dir_eid = d2.eid and f1.hash = f2.hash and e1.id != e2.id and f1.size_mb = f2.size_mb") )
for row in res:
if row.count > 0:
AddLogForJob(job, f"Found duplicates, Creating Status message in front-end for attention")
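The text() wrappers added throughout this commit reflect the SQLAlchemy 2.x requirement: Session.execute / Connection.execute no longer accept a bare SQL string. A minimal demonstration against an in-memory SQLite engine (illustrative only; the app itself uses PostgreSQL via psycopg2):

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")  # throwaway in-memory DB
with engine.connect() as conn:
    # 2.x: raw SQL must be an executable object, i.e. wrapped in text();
    # passing the bare string raises ObjectNotExecutableError.
    row = conn.execute(text("select 1 as answer")).first()
```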
@@ -2242,18 +2269,19 @@ def JobRestoreFiles(job):
def CopyOverrides():
try:
for tbl in override_tbls:
session.execute( f"select * into tmp_{tbl} from {tbl}")
session.execute( text( f"select * into tmp_{tbl} from {tbl}") )
# force a commit here - I want to fail before I delete override content
session.commit()
# now take all 4 override tables in DB and clear them out
for tbl in override_tbls:
session.execute( f"delete from {tbl}" )
session.execute( text( f"delete from {tbl}" ) )
session.commit()
except Exception as ex:
print( f"ERROR: there are existing tmp tables when processing metadata. This SHOULD NEVER HAPPEN - manual intervention needed" )
print( f"ERROR: most likely the job manager was killed during processing metadata - you may want to manually put" )
print( f"ERROR: the contents of the 'tmp_*' tables back into their corresponding official metadata tables " )
print( f"ERROR: and try to restart the job manager" )
PAprint( f"ERROR: there are existing tmp tables when processing metadata. This SHOULD NEVER HAPPEN - manual intervention needed" )
PAprint( f"ERROR: most likely the job manager was killed during processing metadata - you may want to manually put" )
PAprint( f"ERROR: the contents of the 'tmp_*' tables back into their corresponding official metadata tables " )
PAprint( f"ERROR: and try to restart the job manager" )
PAprint( f"ERROR: orig ex: {ex}" )
exit( 1 )
return
@@ -2267,8 +2295,8 @@ def GetFaceInMetadata(fname):
face_data=file_h.read(-1)
file_h.close()
except Exception as ex:
print( f"ERROR: FATAL tried to read in override data and cant read content" )
print( f"ERROR: manual intervention needed - exc={ex}" )
PAprint( f"ERROR: FATAL tried to read in override data and cant read content" )
PAprint( f"ERROR: manual intervention needed - exc={ex}" )
exit(1)
return face_data
@@ -2294,13 +2322,13 @@ def ReloadMetadata(job):
fnames = glob.glob( f'{mpath}/no_match_overrides/*' )
for fname in fnames:
        # type derived from fname (e.g. 0_Too Young_md5*, 1_Too Young, 2_Ignore Face, etc.)
match=re.search( '(\d+)_([^_\.]+)', fname )
match=re.search( r'(\d+)_([^_\.]+)', fname )
face_id=match.group(1)
type_name=match.group(2)
otype = session.query(FaceOverrideType).filter(FaceOverrideType.name==type_name).one()
face_data=GetFaceInMetadata(fname)
if DEBUG:
print( f"Found metadata showing Override of type: {type_name}" )
PAprint( f"DEBUG: Found metadata showing Override of type: {type_name}" )
# check that both the id and data match - if so make new FaceNoMatch otherwise Disco*FaceNoMatch
face=session.query( Face ).filter( Face.id==face_id ).filter( Face.face == face_data ). first()
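The r'' prefixes added to these patterns matter on modern Python: '\d' in a plain string is an invalid escape (a DeprecationWarning since 3.6, a SyntaxWarning since 3.12), while a raw string hands the backslash to the regex engine untouched. With the filename convention described in the comment above:

```python
import re

# Raw string: \d and \. reach the regex engine exactly as written.
match = re.search(r'(\d+)_([^_\.]+)', '12_Too Young_md5abc')
face_id, type_name = match.group(1), match.group(2)
```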
@@ -2313,22 +2341,22 @@ def ReloadMetadata(job):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, f'{mpath}no_match_overrides/0_{otype.name}_{md5face(face_data)}' )
except Exception as ex:
print( f"ERROR: renaming no-match metadata on filesystem failed: {ex}" )
PAprint( f"ERROR: renaming no-match metadata on filesystem failed: {ex}" )
# process Metadata on FS for force_match_overrides (disco ones, will have 0 as face_id)
fnames = glob.glob( f'{mpath}force_match_overrides/*' )
for fname in fnames:
# person derived from fname (e.g. 0_ddp_md5*, 1_ddp, 2_mich, etc.)
match=re.search( '(\d+)_([^_]+)', fname )
match=re.search( r'(\d+)_([^_]+)', fname )
face_id=match.group(1)
person_tag=match.group(2)
p = session.query(Person).filter(Person.tag==person_tag).first()
if not p:
print( f"There is a metadata override on the file system for person: {person_tag} - but they are no longer in the DB - skip" )
PAprint( f"INFO: There is a metadata override on the file system for person: {person_tag} - but they are no longer in the DB - skip" )
continue
face_data=GetFaceInMetadata(fname)
if DEBUG:
print( f"Found metadata showing Override match for person: {person_tag}" )
PAprint( f"DEBUG: Found metadata showing Override match for person: {person_tag}" )
# check that both the id and data match - if so make new FaceNoMatch otherwise Disco*FaceNoMatch
face=session.query( Face ).filter( Face.id==face_id ).filter( Face.face == face_data ).first()
@@ -2342,43 +2370,46 @@ def ReloadMetadata(job):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, f'{mpath}force_match_overrides/0_{p.tag}_{md5face(face_data)}' )
except Exception as ex:
print( f"ERROR: renaming force-match metadata on filesystem failed: {ex}" )
PAprint( f"ERROR: renaming force-match metadata on filesystem failed: {ex}" )
# now process each of the tmp tables for anything that was in the DB but not on FS (e.g rm'd metadata)
overrides=session.execute( "select face_id, type_id from tmp_face_no_match_override" )
overrides=session.execute( text( "select face_id, type_id from tmp_face_no_match_override" ) )
for o in overrides:
print( f"F Force Match: o.face_id={o.face_id}" )
        PAprint( f"F No Match: o.face_id={o.face_id}" )
        PAprint( f"F No Match: o.type_id={o.type_id}" )
nmo=session.query(FaceNoMatchOverride).filter(FaceNoMatchOverride.face_id==o.face_id).filter(FaceNoMatchOverride.type_id==o.type_id).first()
if not nmo:
session.add( FaceNoMatchOverride( face_id=o.face_id, type_id=o.type_id ) )
overrides=session.execute( "select face_id, person_id from tmp_face_force_match_override" )
overrides=session.execute( text( "select face_id, person_id from tmp_face_force_match_override" ) )
for o in overrides:
print( f"F Force Match: o.face_id={o.face_id}" )
print( f"F Force Match: o.person_id={o.person_id}" )
if DEBUG:
PAprint( f"F Force Match: o.face_id={o.face_id}" )
PAprint( f"F Force Match: o.person_id={o.person_id}" )
fmo=session.query(FaceForceMatchOverride).filter(FaceForceMatchOverride.face_id==o.face_id,FaceForceMatchOverride.person_id==o.person_id).first()
if not fmo:
session.add( FaceForceMatchOverride( face_id=o.face_id, person_id=o.person_id ) )
overrides=session.execute( "select face, type_id from tmp_disconnected_no_match_override" )
overrides=session.execute( text( "select face, type_id from tmp_disconnected_no_match_override" ) )
for o in overrides:
print( f"D No Match: o.type_id={o.type_id}" )
if DEBUG:
PAprint( f"D No Match: o.type_id={o.type_id}" )
dnmo=session.query(DisconnectedNoMatchOverride).filter(DisconnectedNoMatchOverride.face==o.face).filter(DisconnectedNoMatchOverride.type_id==o.type_id).first()
if not dnmo:
session.add( DisconnectedNoMatchOverride( face=o.face, type_id=o.type_id ) )
overrides=session.execute( "select face, person_id from tmp_disconnected_force_match_override" )
overrides=session.execute( text( "select face, person_id from tmp_disconnected_force_match_override" ) )
for o in overrides:
print( f"D Force Match: o.person_id={o.person_id}" )
if DEBUG:
PAprint( f"D Force Match: o.person_id={o.person_id}" )
dfmo=session.query(DisconnectedForceMatchOverride).filter(DisconnectedForceMatchOverride.face==o.face).filter(DisconnectedForceMatchOverride.person_id==o.person_id).first()
if not dfmo:
session.add( DisconnectedForceMatchOverride( face=o.face, person_id=o.person_id ) )
# finally, drop the tmp tables
for tbl in override_tbls:
session.execute( f"drop table tmp_{tbl}" )
session.execute( text( f"drop table tmp_{tbl}" ) )
# ok, finally commit all these changes - dont do this until now. Worst case if we crash/fail, the overrides should continue to be in tmp_{tbl}
session.commit()
@@ -2392,7 +2423,6 @@ def ReloadMetadata(job):
# If there is content in the Bin already, it logs this (mostly useful when testing)
####################################################################################################################################
def InitialValidationChecks():
now=datetime.now(pytz.utc)
job=NewJob( name="init", num_files=0, wait_for=None, jex=None, parent_job=None, desc="initialise photo assistant" )
job.start_time=datetime.now(pytz.utc)
JobProgressState( job, "In Progress" )
@@ -2405,14 +2435,14 @@ def InitialValidationChecks():
if len(dirs) + len(files) > 0:
        AddLogForJob(job, "INFO: the bin path contains content, cannot process to know where original deletes were from - skipping content!" )
AddLogForJob(job, "TODO: could be smart about what is known in the DB vs on the FS, and change below to an ERROR if it is one")
AddLogForJob(job, "WARNING: IF the files in the bin are in the DB (succeeded from GUI deletes) then this is okay, otherwise you should delete contents form the recycle bin and restart the job manager)" )
        AddLogForJob(job, "WARN: IF the files in the bin are in the DB (succeeded from GUI deletes) then this is okay, otherwise you should delete contents from the recycle bin and restart the job manager" )
# create symlink and Path/Dir if needed
ProcessRecycleBinDir(job)
rbp_exists=1
except Exception as ex:
print( f"FATAL ERROR: Failed to walk the recycle bin at {path} Err:{ex}" )
PAprint( f"FATAL ERROR: Failed to walk the recycle bin at {path} Err:{ex}" )
else:
AddLogForJob(job, "ERROR: The bin path in settings does not exist - Please fix now");
AddLogForJob(job, f"ERROR: The bin path {path} in settings does not exist - Please fix now");
sp_exists=0
path = SettingsSPath()
if os.path.exists(path):
@@ -2481,7 +2511,7 @@ def AddFaceToFile( locn_data, face_data, file_eid, model_id, settings ):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, new_fname )
except Exception as ex:
print( f"ERROR: AddFaceToFile-face connects to 'disconnected-force-match' metadata, but fixing the filesystem metadata failed: {ex}" )
PAprint( f"ERROR: AddFaceToFile-face connects to 'disconnected-force-match' metadata, but fixing the filesystem metadata failed: {ex}" )
dnmo=session.query(DisconnectedNoMatchOverride).filter(DisconnectedNoMatchOverride.face==face.face).first()
if dnmo:
@@ -2497,7 +2527,7 @@ def AddFaceToFile( locn_data, face_data, file_eid, model_id, settings ):
# SAFE: as SafePaths(mpath) combined with data I control in this func
os.replace( fname, new_fname )
except Exception as ex:
print( f"ERROR: AddFaceToFile-face connects to 'disconnected-no-match' metadata, but fixing the filesystem metadata failed: {ex}" )
PAprint( f"ERROR: AddFaceToFile-face connects to 'disconnected-no-match' metadata, but fixing the filesystem metadata failed: {ex}" )
return
@@ -2517,7 +2547,7 @@ def DelMatchesForFile( job, ent ):
if DEBUG:
AddLogForJob(job, f'Remove any old matches in {ent.name}')
session.execute( f"delete from face_refimg_link where face_id in (select face_id from face_file_link where file_eid = {ent.id})" )
session.execute( text( f"delete from face_refimg_link where face_id in (select face_id from face_file_link where file_eid = {ent.id})" ) )
ent.file_details.last_ai_scan=0
session.add(ent)
return
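The delete above interpolates ent.id into the SQL string, which is safe here only because the id is an integer straight from the ORM. The same text() API also supports bound parameters, the more defensive pattern whenever a value could be user-influenced. A sketch against in-memory SQLite (table shape assumed for illustration):

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("create table face_file_link (face_id integer, file_eid integer)"))
    conn.execute(text("insert into face_file_link values (1, 10), (2, 20)"))
    # :eid is a bound parameter; the driver handles quoting and typing.
    conn.execute(text("delete from face_file_link where file_eid = :eid"),
                 {"eid": 10})
    remaining = conn.execute(text("select count(*) from face_file_link")).scalar()
```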
@@ -2539,7 +2569,7 @@ def DelFacesForFile( job, eid ):
if o:
DisconnectSingleNoMatchOverride( job, o )
session.execute( f"delete from face where id in (select face_id from face_file_link where file_eid = {eid})" )
session.execute( text( f"delete from face where id in (select face_id from face_file_link where file_eid = {eid})" ) )
session.commit()
return
@@ -2697,7 +2727,7 @@ def CheckAndRunBinClean():
now=datetime.now(pytz.utc)
if not j or (now-j.last_update).days >= settings.scheduled_bin_cleanup:
print( f"INFO: Should force clean up bin path, del files older than {settings.bin_cleanup_file_age} days old" )
PAprint( f"INFO: Should force clean up bin path, del files older than {settings.bin_cleanup_file_age} days old" )
NewJob( name="clean_bin", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean up on Bin path" )
created_jobs=True
return created_jobs
@@ -2707,7 +2737,7 @@ def CheckAndRunBinClean():
# based jobs that should run (e.g. last scanned a path X day(s) ago, then scan now), etc. X is defined in settings
####################################################################################################################################
def ScheduledJobs():
print("DEBUG: Time to check for any scheduled jobs needing to be run" )
PAprint("DEBUG: Time to check for any scheduled jobs needing to be run" )
created_jobs=False
@@ -2717,11 +2747,11 @@ def ScheduledJobs():
settings = session.query(Settings).first()
now=datetime.now(pytz.utc)
if ndays_since_last_im_scan >= settings.scheduled_import_scan:
print( f"INFO: Time to force an import scan, last scan was {ndays_since_last_im_scan} days ago" )
PAprint( f"INFO: Time to force an import scan, last scan was {ndays_since_last_im_scan} days ago" )
NewJob( name="scan_ip", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean scan for new files in Import path" )
created_jobs=True
if ndays_since_last_st_scan >= settings.scheduled_storage_scan:
print( f"INFO: Time to force a storage scan, last scan was {ndays_since_last_st_scan}" )
PAprint( f"INFO: Time to force a storage scan, last scan was {ndays_since_last_st_scan}" )
NewJob( name="scan_sp", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean scan for new files in Storage path" )
created_jobs=True
if CheckAndRunBinClean():
@@ -2734,7 +2764,7 @@ def ScheduledJobs():
# go into waiting on a socket to be woken up (and then if woken, back into HandleJobs()
####################################################################################################################################
if __name__ == "__main__":
print("INFO: PA job manager starting - listening on {}:{}".format( PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT) )
PAprint("INFO: PA job manager starting - listening on {}:{}".format( PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT) )
InitialValidationChecks()
@@ -2748,11 +2778,11 @@ if __name__ == "__main__":
try:
conn, addr = s.accept()
if DEBUG:
print( f"accept finished, tout={s.timeout}" )
PAprint( f"accept finished, tout={s.timeout}" )
except socket.timeout:
if DEBUG:
print( f"timeout occurred, tout={s.timeout}" )
PAprint( f"timeout occurred, tout={s.timeout}" )
if ScheduledJobs():
HandleJobs(False)
continue

View File

@@ -87,14 +87,14 @@ def AddRefimgToPerson( filename, person ):
#False == dont autorotate, its not needed on this image
refimg.thumbnail, refimg.orig_w, refimg.orig_h = GenThumb( filename, False )
settings = Settings.query.first()
model=AIModel.query.get(settings.default_refimg_model)
model=db.session.get(AIModel,settings.default_refimg_model)
refimg.face, face_locn = GenFace( filename, model=model.name )
try:
os.remove(filename)
except Exception as e:
# can fail "silently" here, if the face_locn worked, great, its only
# a tmp file in /tmp - if not, the next if will send a msg to the front-end
SetFELog( message=f"Failed to delete tmp file for refimg addition: {e}", log_level="danger", persistent=True, cant_close=True )
SetFELog( message=f"Failed to delete tmp file for refimg addition: {e}", level="danger", persistent=True, cant_close=True )
if not face_locn:
SetFELog( f"<b>Failed to find face in Refimg:</b>", "danger" )
@@ -194,10 +194,10 @@ def match_with_create_person():
def person(id):
form = PersonForm(request.form)
page_title='Edit Person'
if request.method == 'POST':
try:
person = Person.query.get(id)
person = db.session.get(Person, id)
if 'delete' in request.form:
SetFELog( f"Successfully deleted Person: ({person.tag})" )
@@ -237,7 +237,7 @@ def person(id):
SetFELog( f"<b>Failed to modify Person:</b>&nbsp;{e}", "danger" )
return redirect( url_for( 'persons' ) )
else:
person = Person.query.get(id)
person = db.session.get(Person, id)
if not person:
SetFELog( f"No such person with id: {id}", "danger" )
return redirect("/")

View File

@@ -1,7 +1,8 @@
numpy==1.26.4
flask
flask_login
flask-ldap3-login
sqlalchemy
sqlalchemy>=2.0.36
flask-sqlalchemy
SQLAlchemy-serializer
marshmallow-sqlalchemy
@@ -19,3 +20,5 @@ Werkzeug
flask-compress
ffmpeg-python
psycopg2
python3-ldap
gunicorn

View File

@@ -137,7 +137,7 @@ def SettingsRBPath():
settings = Settings.query.first()
if settings == None:
from job import SetFELog
SetFELog( message="Cannot create file data with no settings / recycle bin path is missing", log_level="error", persistent=True, cant_close=False)
SetFELog( message="Cannot create file data with no settings / recycle bin path is missing", level="danger", persistent=True, cant_close=False)
return
# if the path setting is an absolute path, just use it; otherwise prepend base_path first
if settings.recycle_bin_path[0] == '/':
@@ -155,7 +155,7 @@ def SettingsSPath():
settings = Settings.query.first()
if settings == None:
from job import SetFELog
SetFELog( message="Cannot create file data with no settings / storage path is missing", log_level="error", persistent=True, cant_close=False)
SetFELog( message="Cannot create file data with no settings / storage path is missing", level="danger", persistent=True, cant_close=False)
return
if settings.storage_path[0] == '/':
path=settings.storage_path
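The leading-slash test above (`settings.recycle_bin_path[0] == '/'`) is the usual absolute-vs-relative check; `os.path.isabs` expresses the same intent and also tolerates an empty string (indexing `[0]` would raise on one). A sketch of the path resolution with hypothetical names:

```python
import os.path

def resolve_storage_path(configured: str, base_path: str) -> str:
    """If the configured path is absolute, use it as-is; otherwise prepend
    base_path. Mirrors the SettingsRBPath/SettingsSPath logic above; the
    function and argument names here are illustrative, not the app's."""
    if os.path.isabs(configured):  # equivalent to configured[0] == '/'
        return configured
    return os.path.join(base_path, configured)

assert resolve_storage_path("/mnt/photos", "/srv/pa") == "/mnt/photos"
assert resolve_storage_path("recycle_bin", "/srv/pa") == "/srv/pa/recycle_bin"
```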


@@ -5,6 +5,7 @@ import io
import base64
import subprocess
from PIL import Image, ImageOps
import numpy as np
class PA:
def __repr__(self):
@@ -24,7 +25,6 @@ class PA:
hostname = socket.gethostname()
PROD_HOST="pa_web"
# dict to store the name of each icon in icons.svg so we can use it by reference in HTML
ICON={}
@@ -40,19 +40,27 @@ OLDEST_LOG_LIMIT = 5
if hostname == "lappy":
PA_JOB_MANAGER_HOST="localhost"
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@localhost:5432/pa'
PA_EXIF_ROTATER = './utils/pa_exifautotran'
PA_EXIF_AUTOROTATE = './utils/pa_exifautotran'
PA_EXIF_ROTATER = './utils/pa_rotate'
# if ENV is not set, or we are explicitly DEV, run web server on localhost & db on mara (port 65432)
elif 'ENV' not in os.environ or os.environ['ENV'] == "development":
elif 'ENV' not in os.environ or os.environ['ENV'] == "development" or os.environ['ENV'] == "container":
PA_JOB_MANAGER_HOST="localhost"
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@mara.ddp.net:65432/pa'
PA_EXIF_ROTATER = './utils/pa_exifautotran'
# DB_URL = 'postgresql+psycopg2://pa:for_now_pa@mara.ddp.net:65432/pa'
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@padb_dev/pa'
PA_EXIF_AUTOROTATE = './utils/pa_exifautotran'
PA_EXIF_ROTATER = './utils/pa_rotate'
# if we are explicitly on PROD, run the web server on localhost (pa_web container) & the db on mara (port 5432 on the padb container, only accessed via internal docker ports)
elif os.environ['ENV'] == "production":
PA_JOB_MANAGER_HOST="localhost"
DB_URL = 'postgresql+psycopg2://pa:for_now_pa@padb/pa'
PA_EXIF_ROTATER = '/code/utils/pa_exifautotran'
PA_EXIF_AUTOROTATE = '/code/utils/pa_exifautotran'
PA_EXIF_ROTATER = '/code/utils/pa_rotate'
else:
print( "ERROR: I do not know which environment (development, etc.) and which DB (on which host to use)" )
if 'ENV' not in os.environ:
print( f"ERROR: no ENV variable set in the environment" )
else:
print( f"ERROR: ENV is {os.environ['ENV']}" )
exit( -1 )
# PORT number we connect to the pa_job_manager on - by default it runs on the
@@ -132,7 +140,7 @@ def GenThumb(fname,auto_rotate):
im_orig = Image.open(fname)
if im_orig.format == 'JPEG':
# run cmdline util to re-orient jpeg (only changes if needed, and does it losslessly)
p = subprocess.run([PA_EXIF_ROTATER,fname] )
p = subprocess.run([PA_EXIF_AUTOROTATE,fname] )
im=Image.open(fname)
else:
im=im_orig
@@ -161,10 +169,12 @@ def GenThumb(fname,auto_rotate):
# generate Face data (and location) - wrapper func of face_recognition library
# used to store refimg data into the DB
def GenFace(fname, model):
img = face_recognition.load_image_file(fname)
location = face_recognition.face_locations(img, model=model)
encodings = face_recognition.face_encodings(img, known_face_locations=location)
if len(encodings) and len(location):
return encodings[0].tobytes(), location[0]
else:
return None, None
try:
img = face_recognition.load_image_file(fname)
location = face_recognition.face_locations(img, model=model)
encodings = face_recognition.face_encodings(img, known_face_locations=location)
if len(encodings) and len(location):
return encodings[0].tobytes(), location[0]
except Exception as e:
print( f"GenFace failed: {e}" )
return None, None
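`GenFace` returns `encodings[0].tobytes()` so the 128-dimension face encoding can be stored as a blob in the DB; reading it back for matching is a `np.frombuffer` away. A sketch of that round trip using a fabricated vector (real encodings come from `face_recognition.face_encodings`, and the library's `face_distance` is essentially the euclidean norm shown here):

```python
import numpy as np

# A real encoding is a float64 vector of length 128 from face_recognition;
# here we fabricate one just to show the storage round trip.
encoding = np.random.default_rng(0).standard_normal(128)
blob = encoding.tobytes()            # what GenFace returns / what the DB stores
assert len(blob) == 128 * 8          # 128 float64 values

restored = np.frombuffer(blob, dtype=np.float64)
assert restored.shape == (128,)
assert np.array_equal(restored, encoding)

def face_distance(known: np.ndarray, candidate: np.ndarray) -> float:
    """Euclidean distance between two encodings (smaller = more alike)."""
    return float(np.linalg.norm(known - candidate))

# a round-tripped encoding is identical to the original
assert face_distance(restored, encoding) == 0.0
```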


@@ -62,10 +62,9 @@ class States(PA):
# this is any next/prev or noo, grouping, etc. change (so use referrer to work out what to do with this)
# because this can happen on a view, or files_up, etc. change this FIRST
if 'change_file_opts' in request.path:
ref=request.referrer
base=request.base_url
base=base.replace("change_file_opts", "")
self.url = "/"+ref.replace(base, "" )
self.url = "/"+request.referrer.replace(base, "" )
# if view_list, then we really are a view, and view_eid should be in the form
if 'view_list' in request.path:
@@ -82,16 +81,19 @@ class States(PA):
else:
# GET's occur on redirect, and we don't have a form, so get it from pref
st=self.url[8:]
ref=request.referrer
if 'search' in ref:
st=re.sub( '.+/search/', '', ref )
if request.referrer and 'search' in request.referrer:
st=re.sub( '.+/search/', '', request.referrer )
else:
st=''
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.view_eid==self.view_eid,PA_UserState.orig_search_term==st).first()
if not pref:
SetFELog( message=f"ERROR: pref not found - DONT CLOSE THIS!!! dn={current_user.dn}, st={st}, s={self}????" , log_level="error", persistent=True, cant_close=True )
SetFELog( message=f"ERROR: pref not found - dn={current_user.dn}, st={st}, s={self}????" , level="danger", persistent=True, cant_close=False )
SetFELog( message=f"WARNING: I think this error occurred because you reloaded a page and the server had restarted between your original page load and this page reload, is that possible?" , level="warning", persistent=True, cant_close=False )
redirect("/")
else:
if not hasattr( pref, 'orig_url' ):
SetFELog( message=f"ERROR: orig_url not in pref - dn={current_user.dn}, st={st}, self={self}, pref={pref}????" , level="danger", persistent=True, cant_close=True )
redirect("/")
self.url = pref.orig_url
if 'files_ip' in self.url or 'file_list_ip' in self.url:
@@ -114,7 +116,11 @@ class States(PA):
self.path_type = 'Bin'
elif 'search' in self.url:
# okay, if we are a search but came from a view, then get last_search_state from prefs and use it
self.orig_search_term = self.url[8:]
m=re.match( '.*search/(.+)$', self.url )
if m == None:
SetFELog( message=f"ERROR: DDP messed up, seems we are processing a search, but cant see the search term - is this even possible?" )
return
self.orig_search_term = m[1]
if self.path_type == "View":
self.orig_ptype = 'Search'
self.orig_url = self.url
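The `re.match( '.*search/(.+)$', self.url )` guard above replaces the fixed slice `self.url[8:]`, which silently produced an empty term for a bare `/search/` URL. A small sketch of the extraction (the helper name is illustrative):

```python
import re

def extract_search_term(url: str):
    """Pull the term after 'search/' out of a URL path; returns None when
    no term is present (the case the SetFELog branch above guards against)."""
    m = re.match(r".*search/(.+)$", url)
    return m[1] if m else None

assert extract_search_term("/search/holiday%202024") == "holiday%202024"
# survives extra path segments in front, unlike a fixed url[8:] slice
assert extract_search_term("/app/search/AI:kath") == "AI:kath"
# a bare '/search/' has no term
assert extract_search_term("/search/") is None
```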
@@ -123,22 +129,29 @@ class States(PA):
elif 'view' in self.url:
# use url to get eid of viewed entry
self.view_eid = self.url[6:]
self.path_type="View"
# force this to be a search so the rest of the code won't totally die, but also won't return anything
self.path_type="Search"
self.orig_url=self.url
elif 'change_file_opts' not in self.url:
SetFELog( message=f"ERROR: DDP messed up, failed to match URL {self.url} for settings this will fail, redirecting to home" , log_level="error", persistent=True, cant_close=True )
SetFELog( message=f"referrer={request.referrer}" , log_level="error",
persistent=True, cant_close=True )
SetFELog( message=f"ERROR: DDP messed up, failed to match URL {self.url} for settings this will fail, redirecting to home" , level="danger", persistent=True, cant_close=True )
SetFELog( message=f"referrer={request.referrer}" , level="danger", persistent=True, cant_close=True )
return
if self.path_type == 'View':
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.view_eid==self.view_eid,PA_UserState.orig_search_term==self.orig_search_term).first()
if not hasattr( self, 'orig_ptype' ):
self.orig_ptype='View'
self.orig_url=''
SetFELog( message=f"ERROR: No orig ptype? s={self} - pref={pref}, redirecting to home" , level="danger", persistent=True, cant_close=True )
SetFELog( message=f"referrer={request.referrer}" , level="danger", persistent=True, cant_close=True )
redirect("/")
# should find original path or search for this view (if not a search, search_term='')
orig_pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.orig_ptype,PA_UserState.orig_search_term==self.orig_search_term).first()
if not orig_pref:
SetFELog( message=f"ERROR: DDP messed up 2, failed to find orig_pref for a view pt={self.path_type} for search={self.orig_search_term}" , log_level="error", persistent=True, cant_close=True )
SetFELog( message=f"referrer={request.referrer}" , log_level="error",
persistent=True, cant_close=True )
SetFELog( message=f"ERROR: DDP messed up 2, failed to find orig_pref for a view pt={self.path_type} for search={self.orig_search_term}" , level="danger", persistent=True, cant_close=True )
SetFELog( message=f"referrer={request.referrer}" , level="danger", persistent=True, cant_close=True )
return
elif self.path_type == 'Search':
pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.orig_search_term==self.orig_search_term).first()
@@ -254,9 +267,8 @@ class States(PA):
if (self.offset + int(self.how_many)) < self.num_entries:
self.offset += int(self.how_many)
else:
# should be impossible now, but leave jic
SetFELog( message=f"WARNING: next image requested, but would go past end of list? - ignore this" , log_level="warning", persistent=True, cant_close=True )
SetFELog( message=f"DDP - offset={self.offset} + how_many={self.how_many} > num_entries={self.num_entries}" , log_level="error", persistent=True, cant_close=True )
# tripping this still
SetFELog( message=f"WARNING: next image requested, but would go past end of list? - ignore this" , level="warning", persistent=True, cant_close=False )
if 'current' in request.form:
self.current = int(request.form['current'])


@@ -160,7 +160,7 @@ create table JOBEXTRA ( ID integer, JOB_ID integer, NAME varchar(32), VALUE varc
create table JOBLOG ( ID integer, JOB_ID integer, LOG_DATE timestamptz, LOG varchar,
constraint PK_JL_ID primary key(ID), constraint FK_JL_JOB_ID foreign key(JOB_ID) references JOB(ID) );
create table PA_JOB_MANAGER_FE_MESSAGE ( ID integer, JOB_ID integer, LEVEL varchar(16), MESSAGE varchar(1024), PERSISTENT boolean, CANT_CLOSE boolean,
create table PA_JOB_MANAGER_FE_MESSAGE ( ID integer, JOB_ID integer, LEVEL varchar(16), MESSAGE varchar(8192), PERSISTENT boolean, CANT_CLOSE boolean,
constraint PA_JOB_MANAGER_FE_ACKS_ID primary key(ID),
constraint FK_PA_JOB_MANAGER_FE_MESSAGE_JOB_ID foreign key(JOB_ID) references JOB(ID) );


@@ -4,10 +4,9 @@
<div class="container">
<div class="row px-3 my-5 offset-lg-3 col-lg-6 col-sm-12" style="border: 3px solid #5bc0de; border-radius: 15px;">
<h3 class="my-3 text-center" style="color: #5bc0de">
{# <svg width="64" height="64" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#pa_logo" /></svg>&nbsp;About Photo Assistant</h3> #}
<svg width="64" height="64" fill="currentColor"><use xlink:href="internal/icons.svg#pa_logo" /></svg>&nbsp;Photo Assistant - About</h3>
<img width="64" height="64" src="internal/pa-logo.png"/></img>&nbsp;Photo Assistant - About</h3>
<div class="alert alert-info">
{{build_info}}
<center>{{build_info}}</center>
</div>
</div>
<nav>


@@ -29,7 +29,7 @@
<script>
function HandleSearch()
{
st=$('#search_term').val();
st=encodeURIComponent( $('#search_term').val() )
if( st != '' ) {
document.location.href='/search/'+st
}
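The `encodeURIComponent` change above matters because a search term containing `/`, `#`, `?`, or spaces would otherwise mangle the `/search/<term>` path. Python's closest analogue is `urllib.parse.quote` with `safe=''`; Werkzeug percent-decodes the path component before the Flask route sees it, so the round trip looks like:

```python
from urllib.parse import quote, unquote

# encodeURIComponent's nearest Python analogue is quote(..., safe='')
term = "AI:kath & friends?"
encoded = quote(term, safe="")
assert encoded == "AI%3Akath%20%26%20friends%3F"

# the server side sees the decoded term again
assert unquote(encoded) == term
```

(The escape sets differ slightly: `encodeURIComponent` leaves `!*'()` alone, while `quote(safe='')` encodes them; both are safe inside a path segment.)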
@@ -101,6 +101,7 @@
{% if config.ENV != "production" %}
<a class="dropdown-item" href="{{url_for('force_scan')}}">Force Scan (delete data & rebuild)</a>
{% endif %}
<a class="dropdown-item" href="{{url_for('force_clear')}}">Clear Messages (Force)</a>
<a class="dropdown-item" href="{{url_for('wake_up')}}">Force wake the job manager</a>
</div class="dropdow-menu">
</div class="nav-item dropdown">


@@ -5,6 +5,8 @@
<script src="{{ url_for( 'internal', filename='js/files_transform.js')}}"></script>
<script>
document.fake_shift=0
document.fake_ctrl=0
var move_paths=[]
{% for p in move_paths %}
p = new Object()
@@ -13,6 +15,11 @@
p.icon_url = '{{p.icon_url}}'
move_paths.push(p)
{% endfor %}
document.OPT = '{{OPT}}'
document.entries = '{{entry_data}}'
document.how_many = '{{OPT.how_many}}'
document.entries_len = '{{entry_data|length}}'
</script>
<div class="container-fluid">
@@ -66,7 +73,7 @@
</button>
<span class="sm-txt my-auto">&nbsp;{{OPT.how_many}} files&nbsp;</span>
{% set nxt_disabled="" %}
{% if entry_data|length < OPT.how_many|int %}
{% if not entry_data or entry_data|length < OPT.how_many|int %}
{% set nxt_disabled="disabled" %}
{% endif %}
<button aria-label="next" id="next" {{nxt_disabled}} name="next" class="next sm-txt btn btn-outline-secondary">
@@ -83,6 +90,8 @@
<svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash-fill"/></svg>
{% endif %}
</button>
<button style="visibility:hidden" class="btn btn-outline-secondary" aria-label="shift-key" id="shift-key" onclick="document.fake_shift=1-document.fake_shift; event.stopPropagation(); return false">shift</button>
<button style="visibility:hidden" class="btn btn-outline-secondary" aria-label="ctrl-key" id="ctrl-key" onclick="document.fake_ctrl=1-document.fake_ctrl; event.stopPropagation(); return false">ctrl</button>
</div>
<div class="d-flex col col-auto justify-content-end">
<div class="btn-group">
@@ -149,6 +158,9 @@
</div>
{% endif %}
{% endif %}
{% if not entry_data %}
<span class="alert alert-danger p-2 col-auto"> No matches for: '{{search_term}}'</span>
{% endif %}
{% for obj in entry_data %}
{% if loop.index==1 and OPT.folders %}
{% if OPT.cwd != OPT.root %}
@@ -303,6 +315,7 @@ $('.figure').dblclick( CallViewRouteWrapper )
// different context menu on files
$.contextMenu({
selector: '.entry',
itemClickEvent: "click",
build: function($triggerElement, e) {
// when right-clicking & no selection, add one OR deal with ctrl/shift right-click as it always changes seln
if( NoSel() || e.ctrlKey || e.shiftKey )
@@ -372,6 +385,8 @@ $.contextMenu({
if( key == "fliph" ) { Transform("fliph") }
if( key == "flipv" ) { Transform("flipv") }
if( key.startsWith("ai")) { RunAIOnSeln(key) }
// don't propagate this event through the DOM
e.stopPropagation()
},
items: item_list
};
@@ -398,7 +413,18 @@ $( document ).keydown(function(event) {
if( ! NoSel() ) DelDBox('Delete');
{% endif %}
break;
}
});
} })
function isMobile() {
try{ document.createEvent("TouchEvent"); return true; }
catch(e){ return false; }
}
if( isMobile() )
{
$('#shift-key').css('visibility', 'visible');
$('#ctrl-key').css('visibility', 'visible');
}
</script>
{% endblock script_content %}


@@ -46,8 +46,7 @@
<div class="row px-3 my-5 offset-lg-3 col-lg-6 col-sm-12" style="border: 3px solid #5bc0de; border-radius: 15px;">
<h3 class="my-3 text-center" style="color: #5bc0de">
{# <svg width="64" height="64" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#pa_logo" /></svg>&nbsp;Photo Assistant Login</h3> #}
<svg width="64" height="64" fill="currentColor"><use xlink:href="internal/icons.svg#pa_logo" /></svg>&nbsp;Photo Assistant Login</h3>
<img width="64" height="64" src="internal/pa-logo.png"/></img>&nbsp;Photo Assistant Login</h3>
<form class="" method="POST">
<div class="input-group">
<label for="username" class="text-right input-group-text col-4 text-info">Username:</label>


@@ -12,6 +12,7 @@
.form-switch .form-check-input:focus {
background-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%23CFF4FC'/%3e%3c/svg%3e");
}
#tst90:hover,#tst90:focus { filter: invert(73%) sepia(27%) saturate(3970%) hue-rotate(146deg) brightness(94%) contrast(100%); }
</style>
<script src="{{ url_for( 'internal', filename='js/view_transform.js')}}"></script>
@@ -45,6 +46,7 @@
{% for face in objs[id].file_details.faces %}
data = { 'id': '{{face.id}}', 'x': '{{face.face_left}}', 'y': '{{face.face_top}}', 'w': '{{face.w}}', 'h':'{{face.h}}' }
{% if face.refimg %}
data['pid']='{{face.refimg.person.id}}'
data['who']='{{face.refimg.person.tag}}'
data['distance']="{{face.refimg_lnk.face_distance|round(2)}}"
{% endif %}
@@ -101,6 +103,9 @@
function CallViewListRoute(dir)
{
// don't allow mad spamming of the arrow keys
$("#la").prop("disabled", true)
$("#ra").prop("disabled", true)
data="eids="+$("#eids").val()
data+="&cwd={{OPT.cwd}}"
data+="&root={{OPT.root}}"
@@ -117,10 +122,11 @@
objs=res.objs
eid_lst=eids.split(",")
offset=res.offset
if( current == first_eid )
$("#la").prop("disabled", true)
if( current == last_eid )
$("#ra").prop("disabled", true)
// okay, we now have results back, can reset next/prev buttons
if( current != first_eid )
$("#la").prop("disabled", false)
if( current != last_eid )
$("#ra").prop("disabled", false)
ViewImageOrVideo()
}
})
@@ -207,7 +213,7 @@
$('#la').attr('disabled', false )
}
else
CallViewListRoute('next')
CallViewListRoute('next')
if( current == last_eid )
{
@@ -246,32 +252,35 @@
</div>
<div class="col col-auto pt-1">
<button class="btn btn-outline-info p-1" title="Rotate by 90 degrees" onClick="Transform(90)">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot90"/></svg>
<img src="{{url_for('internal', filename='rot90.png')}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot90-invert.png')}}'"
onMouseOut="this.src='{{url_for('internal', filename='rot90.png')}}'" />
</button>
<button class="btn btn-outline-info p-1" title="Rotate by 180 degrees" onClick="Transform(180)">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot180"/></svg>
<img src="{{url_for('internal', filename='rot180.png')}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot180-invert.png')}}'"
onMouseOut="this.src='{{url_for('internal', filename='rot180.png')}}'" />
</button>
<button class="btn btn-outline-info p-1" title="Rotate by 270 degrees" onClick="Transform(270)">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot270"/></svg>
<img src="{{url_for('internal', filename='rot270.png')}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot270-invert.png')}}'"
onMouseOut="this.src='{{url_for('internal', filename='rot270.png')}}'" />
</button>
<button class="btn btn-outline-info p-1" title="Flip horizontally" onClick="Transform('fliph')">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_h"/></svg>
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_h"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Flip vertically" onClick="Transform('flipv')">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_v"/></svg>
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_v"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="View in Fullscreen mode (hotkey: F)" onClick="fullscreen=true; ViewImageOrVideo()">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#fullscreen"/></svg>
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#fullscreen"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="Show logs relating to this filename (hotkey: l)" onClick="JoblogSearch()">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#log"/></svg>
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#log"/></svg>
</button>
<button class="btn btn-outline-info p-1" title="View Original" onClick="window.location='/'+objs[current].url">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#download"/></svg>
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#download"/></svg>
</button>
<button id="del" class="btn btn-outline-danger p-1" title="Delete (hotkey: Del)"
onClick="$.ajax({ type: 'POST', data: '&eid-0={{current}}', url: '/delete_files', success: function(data){ window.location='/'; return false; } })">
<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash"/></svg>
<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash"/></svg>
</button>
</div>
</div class="row">

utils/pa_rotate (new executable file, 9 lines)

@@ -0,0 +1,9 @@
#!/bin/sh
echo "start"
b=`basename "$1"`
echo "b=$b"
mv "$1" "/tmp/$b"
echo "mv '$1' to /tmp/$b"
jpegtran -outfile "$1" -rotate "$2" "/tmp/$b"
echo "done jpegtran"


@@ -1,7 +1,18 @@
#!/bin/bash
su mythtv -g mythtv -c 'ENV="production" python3 -u /code/pa_job_manager.py' &> /var/log/pa_job_manager.out &
gunicorn --bind=0.0.0.0:443 --workers=4 --threads=16 --certfile /etc/letsencrypt/live/pa.depaoli.id.au/fullchain.pem --keyfile /etc/letsencrypt/live/pa.depaoli.id.au/privkey.pem main:app --env ENV="production" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output
sudo -u pauser ENV=${ENV} python3 -u pa_job_manager.py &> /var/log/pa_job_manager.out &
if [ "$ENV" == "production" ]; then
sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=4 --threads=16 main:app --env ENV="production" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output
elif [ "$ENV" == "container" ]; then
# move into src tree so padev responds to code changes
cd /home/ddp/src/photoassistant
# push container logs/errors to stdout so docker logs works
sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=1 --threads=1 main:app --env ENV="container" --reload --capture-output --log-file - --error-logfile -
else
echo "Not sure which ENV ($ENV) we are running, set up for DEV and dont run job manager (jic)"
sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=1 --threads=1 main:app --env ENV="development" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output --enable-stdio-inheritance --reload
fi
# this should never run unless gunicorn exits -- in that case, at least
# the container stays up so we can log in by hand and check the issue/error
sleep 99999