Compare commits
92 Commits
| Author | SHA1 | Date |
|---|---|---|
| | a4526e2b70 | |
| | cf9b0996be | |
| | 57e736a495 | |
| | aaf04e7f9b | |
| | eb9e7f2ce1 | |
| | df7a81df09 | |
| | 4742509d56 | |
| | 391fd52332 | |
| | 1831c49b15 | |
| | fa63e08b59 | |
| | e36fc170f1 | |
| | e3c94bf5eb | |
| | db8cf68d4b | |
| | 1a182d30b5 | |
| | c7c08c1c32 | |
| | 5fa35b90eb | |
| | ae718fe249 | |
| | 48ca6c5e64 | |
| | 6ceb4c4604 | |
| | b7b2eaeb1f | |
| | 0660ca55d7 | |
| | e6e621d6f7 | |
| | 0a49c2a627 | |
| | e460164d83 | |
| | f6ddb66f3f | |
| | 1555cd5ad6 | |
| | 62acd34ae0 | |
| | 5448e0a7a4 | |
| | 70908d60f5 | |
| | 58715491ea | |
| | f5dfb92db1 | |
| | eb819e37d7 | |
| | c6a0352c7d | |
| | f4aeadc019 | |
| | 79437477f7 | |
| | 989fc7c6db | |
| | febdfbbd32 | |
| | 37c0125fbd | |
| | b34b71bf32 | |
| | bd3ee98328 | |
| | 47190a8829 | |
| | f97653c4f1 | |
| | 8157292df4 | |
| | a0b4b80ca9 | |
| | cbfd00fe53 | |
| | 61b3edf5fa | |
| | fdba16c004 | |
| | 9a359a1a4c | |
| | 15cc101bbe | |
| | 9dac54ae97 | |
| | aba96f6c52 | |
| | 0895bc5983 | |
| | 46eb87bc31 | |
| | cbea1f09ab | |
| | 5fa9b5e386 | |
| | 28d4f8d998 | |
| | 29ea2ca833 | |
| | 06ceefda97 | |
| | 974d96d6eb | |
| | 030eb785f3 | |
| | 1277558b0e | |
| | d7f028f3d1 | |
| | d53bffcab9 | |
| | 4885acea80 | |
| | fed78672dc | |
| | bc23c9a3c6 | |
| | ffbedc3bca | |
| | d39f1734a3 | |
| | d86893875d | |
| | 2665ab086e | |
| | 063c75809a | |
| | 8b89d845db | |
| | 6ef12d7636 | |
| | 42eeed7c45 | |
| | 096251f8ce | |
| | 33c86f49fb | |
| | 98d33b4880 | |
| | 448b635d5a | |
| | 53ef671d34 | |
| | 7e25c33f1a | |
| | 4db3d27cbc | |
| | a3350b9316 | |
| | a8ee7b4f89 | |
| | 478a7e5add | |
| | aad0389d77 | |
| | 262efd89d4 | |
| | 9c4da5ec6f | |
| | 3f39f74716 | |
| | 48353f0486 | |
| | 0ed56d6a1a | |
| | 1005acb339 | |
| | 62cfe6305e | |
@@ -2,10 +2,7 @@ images_to_process
 new_img_dir
 photos
 reference_images
-static/Bin/*
-static/Import/*
-static/Storage/*
-static/Metadata/*
+static
 .pa_metadata
 .pa_bin
 .python

BUGs
@@ -1,4 +1,15 @@
-### Next: 121
+### Next: 140
+BUG-139: using any large entry list and going next a few times ends up, say, 4 pages of 50 into 4000 matches (entries pulled from the DB < 50)...
+ - confirmed this happens when a person has 2 or more refimgs:
+ - on page "2", we get 49 pulled back in the ORM instead of the expected 50 -- b/c I use "fewer than 50 found" to indicate we must be at the end of the list
+ -- really, need to fix the eids / re-running-query problem once and for all (sketch after this hunk):
+ do GetEntries as we do now; once done, however, get all entry ids and stick those into the DB with a unique query-id and datestamp
+ new func to get all details needed for the entries in an eid list (of 1 page) - show this page of entries
+ use the current, full eidlist to work out start/end of list (next/prev), disabling buttons as needed
+ then the client can keep the current page of data; if you hit next/prev, use the DB's unique query id / full list plus the page of eids, and return full data for the new page of entries
+ Implications though: if a search is invalidated (maybe a photo is deleted / moved), we need to remove those entries from the list in the DB too, OR let the user know/decide to fix/wait.
+
+
 BUG-100: I managed to get 2 photos matching mich in the NOT_WORKING photo (probably dif refimgs but same p.tag?)
 = /photos/2012/20120414-damien/IMG_8467.JPG
 BUG-106: cant add trudy /pat? as refimgs via FaceDBox
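A minimal sketch of the BUG-139 fix idea above: persist the full eid list under a unique query-id with a datestamp, then page from that saved list instead of re-running the search. The `query_eids` table and its columns are assumptions here — nothing like this exists in the schema yet:

```python
import uuid
from datetime import datetime

from sqlalchemy import text

def SaveQueryEids(conn, eids):
    """Persist every matched entry id under one query_id (hypothetical query_eids table)."""
    query_id = str(uuid.uuid4())
    stamp = datetime.utcnow()
    conn.execute(
        text("INSERT INTO query_eids (query_id, eid, pos, created) "
             "VALUES (:qid, :eid, :pos, :ts)"),
        [{"qid": query_id, "eid": e, "pos": i, "ts": stamp} for i, e in enumerate(eids)],
    )
    return query_id

def GetPageEids(conn, query_id, offset, how_many):
    """One page of eids from the saved list -- never re-runs the original search."""
    rows = conn.execute(
        text("SELECT eid FROM query_eids WHERE query_id = :qid "
             "ORDER BY pos LIMIT :lim OFFSET :off"),
        {"qid": query_id, "lim": how_many, "off": offset},
    )
    return [r.eid for r in rows]
```

Because paging reads from a frozen list, "fewer than 50 rows" can then only mean the genuine end of the list, not a duplicate collapse caused by a person having multiple refimgs.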
@@ -11,20 +22,19 @@ BUG-119: "Uncaught (in promise) Error: A listener indicated an asynchronous
 was received"
 investigate this (possible I'm calling check_for_jobs and maybe not doing the async right?)
 
-BUG-120: on tablet:
-[2023-04-09 12:21:21,214] ERROR in app: Exception on /view/16978 [GET]
-Traceback (most recent call last):
-  File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 2525, in wsgi_app
-    response = self.full_dispatch_request()
-  File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1822, in full_dispatch_request
-    rv = self.handle_user_exception(e)
-  File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1820, in full_dispatch_request
-    rv = self.dispatch_request()
-  File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1796, in dispatch_request
-    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
-  File "/code/files.py", line 655, in view
-    OPT=States( request )
-  File "/code/states.py", line 92, in __init__
-    self.url = pref.orig_url
-AttributeError: 'NoneType' object has no attribute 'orig_url'
+BUG-123: pa_job_manager crashed with a timeout on connection (probably when I turned off traefik for a bit?). Regardless, it should be more fault tolerant --> maybe offer to restart pa_job_manager IF it's crashed?
+ this also definitely happened when I shut down the DB back-end mid-job, and it could be restarted, so the f/e could at least suggest a restart of the container, or auto-restart job_mgr?
 
+BUG-125: when an image is highlighted and you then post the contextmenu on a different image, the highlight does not move to the new image,
+ and the selected menu function processes the original or the new one depending on the way the code works.
+ There is a chance we need to change the document on-click to a mouse-down (or whatever the context menu
+ uses by default), rather than just fix the highlight.
+
+BUG-130: moving files and then trying to go to the next page, and it got confused...
+BUG-132: right arrow to go to the next photo in the viewer ALSO scrolls to the right; needs a return somewhere in the jscript
+BUG-133: when rebuilding pa[dev], the first run fails to have symlinks to the right paths for Import/Storage, etc. A simple restart fixes it - so potentially the initial copy, or some other race condition?
+BUG-134: when moving a set of photos on a page, then moving another set on the page, the first set reappears. Could really delete them from the DOM?
+BUG-135: failed to rotate: 2006/20061215-ITS-xmas-KP/DSC00582.JPG - not sure why && not repeatable, so it's not the image; timing/race condition maybe?
+BUG-137: after moving/refiling photos, the next shift-click is out of order (a reload fixes it)
+BUG-138: placeholder for all the ways we can get the front-end confused:
+ ---> JUST fix all these BUGs (relating to confused/lost state) by revisiting the overly complex way I remember state and my position in a list (probably FAR easier to make an initial sql just save all eids, and then never try to recreate that list again, and not care how I got into the list). Can attach a "running server-side sequence number"; if the sequence is old and the original eid list results in a failure, just pop up that the saved list is no longer valid and ask the user to re-do their search/list...

DEEPFACE (new file)
@@ -0,0 +1,74 @@
+#
+# https://pypi.org/project/deepface/
+#
+
+from deepface import DeepFace
+import os
+import cv2
+import numpy as np
+
+
+# Function to compute embeddings for a given image path
+def compute_embeddings(image_path, model_name='VGG-Face'):
+    try:
+        # This returns a list of embeddings for each face found in the image
+        embeddings = DeepFace.represent(img_path=image_path, model_name=model_name)
+        return [res['embedding'] for res in embeddings]
+    except Exception as e:
+        print(f"Error processing image {image_path}: {e}")
+        return []
+
+
+# Distance helper, computed directly with numpy -- DeepFace does not expose a
+# stable top-level find_distance(), so don't rely on one
+def find_distance(ref_emb, target_emb, metric='cosine'):
+    a = np.asarray(ref_emb, dtype=float)
+    b = np.asarray(target_emb, dtype=float)
+    if metric == 'euclidean':
+        return float(np.linalg.norm(a - b))
+    if metric == 'euclidean_l2':
+        a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
+        return float(np.linalg.norm(a - b))
+    # default: cosine distance
+    return float(1 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
+
+
+# Function to find the best matches between reference and target images
+def find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine'):
+    # Store embeddings for reference images
+    reference_embeddings = {}
+
+    for ref_img in reference_images:
+        ref_embeddings = compute_embeddings(ref_img, model_name=model_name)
+        if ref_embeddings:
+            reference_embeddings[ref_img] = ref_embeddings
+
+    # Store best matches
+    best_matches = []
+
+    # Loop through each target image
+    for target_img in target_images:
+        target_embeddings = compute_embeddings(target_img, model_name=model_name)
+
+        for target_emb in target_embeddings:
+            for ref_img, ref_emb_list in reference_embeddings.items():
+                # Compare each reference embedding with the target embedding
+                for ref_emb in ref_emb_list:
+                    # Distance between the two embeddings (smaller = better match)
+                    distance = find_distance(ref_emb, target_emb, metric=metric)
+
+                    # Store match details (target image, reference image, and distance)
+                    best_matches.append({
+                        'target_image': target_img,
+                        'reference_image': ref_img,
+                        'distance': distance
+                    })
+
+    # Sort matches by distance (smallest distances indicate best matches)
+    best_matches = sorted(best_matches, key=lambda x: x['distance'])
+
+    return best_matches
+
+
+# Example usage
+reference_images = [
+    "path_to_reference_image1.jpg",
+    "path_to_reference_image2.jpg"
+]
+
+target_images = [
+    "path_to_target_image1.jpg",
+    "path_to_target_image2.jpg",
+    # Add more target images here...
+]
+
+# Find the best matches
+matches = find_best_matches(reference_images, target_images, model_name='VGG-Face', metric='cosine')
+
+# Display the top 5 matches
+for match in matches[:5]:
+    print(f"Reference Image: {match['reference_image']}, Target Image: {match['target_image']}, Distance: {match['distance']}")
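For comparison, deepface also ships a higher-level `DeepFace.find()` that indexes a whole folder of images (caching the embeddings on disk) and returns the closest matches. A sketch — the paths are placeholders, and the exact return shape (one dataframe vs a list of dataframes) varies between deepface releases, so verify against the installed version:

```python
from deepface import DeepFace

# Index photos/ once, then query it with a reference image; repeated calls
# reuse the cached embeddings instead of re-embedding the whole library.
results = DeepFace.find(
    img_path="path_to_reference_image1.jpg",  # placeholder reference image
    db_path="photos/",                        # placeholder folder of target images
    model_name="VGG-Face",
    distance_metric="cosine",
)
print(results[0].head() if isinstance(results, list) else results.head())
```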

Dockerfile
@@ -1,35 +1,27 @@
-FROM ubuntu:22.04
+FROM python:latest
 
+# declare var
+ARG PA_ID
+
 WORKDIR /code
-USER root
-ENV PJM_UID=500
-ENV PJM_GID=500
-RUN groupadd -g ${PJM_GID} mythtv && useradd -r -u ${PJM_UID} -g ${PJM_GID} mythtv
-ENV TZ=Australia/Melbourne
-RUN truncate -s0 /tmp/preseed.cfg && \
-    (echo "tzdata tzdata/Areas select Australia" >> /tmp/preseed.cfg) && \
-    (echo "tzdata tzdata/Zones/Australia select Melbourne" >> /tmp/preseed.cfg) && \
-    debconf-set-selections /tmp/preseed.cfg && \
-    rm -f /etc/timezone /etc/localtime && \
-    apt-get update && \
-    DEBIAN_FRONTEND=noninteractive DEBCONF_NONINTERACTIVE_SEEN=true \
-    apt-get install -y tzdata
-RUN apt-get update && apt-get -y dist-upgrade && apt-get -y install python3-pip libpq-dev gunicorn mediainfo cmake libgl1-mesa-glx libglib2.0-0 python3-ldap libjpeg-turbo-progs ffmpeg git
-COPY requirements.txt requirements.txt
-RUN pip3 install -r requirements.txt
-RUN pip3 install --upgrade pillow --user
-EXPOSE 443
-EXPOSE 55432
+
+# do this first, allowing layer / build efficiencies if only code changes...
+# sudo used in dev container
+RUN apt-get update && apt-get -y install libpq-dev mediainfo cmake libglib2.0-0 libjpeg-turbo-progs ffmpeg git sudo
+COPY requirements.txt .
+RUN pip3 install --upgrade pip && pip3 install -r requirements.txt && pip3 install --upgrade pillow
+
+RUN groupadd -g ${PA_ID} pauser && useradd -r -u ${PA_ID} -g ${PA_ID} pauser
+
+# NOW, deal with any code changes, and reset the container for 'first run'...
 COPY . .
-RUN echo $RANDOM | md5sum | head -c 30 > /code/.sk
-RUN chmod 600 .sk
-RUN date > internal/build-date.txt
-RUN git log -n 15 > internal/git-log.txt
-RUN ln -s /code/TODO /code/internal/TODO
-RUN ln -s /code/BUGs /code/internal/BUGs
+
+# set up static dir and subdirs
+RUN mkdir -p static/Bin && mkdir -p static/Import && mkdir -p static/Storage && mkdir -p static/Metadata
+
+EXPOSE 80
+RUN cat /dev/urandom | head -c 50 | md5sum | head -c 32 > /code/.sk && chmod 600 .sk && chown pauser:pauser .sk
+RUN TZ="Australia/Melbourne" date > internal/build-date.txt && git log -n 15 > internal/git-log.txt && ln -s /code/TODO /code/internal/TODO && ln -s /code/BUGs /code/internal/BUGs
 RUN rm -rf .git
-RUN chown mythtv:mythtv /code
-RUN chown mythtv:mythtv ./static
-RUN chown mythtv:mythtv ./static/*
-## cleanup of files from setup
-RUN rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
+RUN chown pauser:pauser /code && chown pauser:pauser ./static && chown pauser:pauser ./static/*
 CMD ["./wrapper.sh"]

README
@@ -3,34 +3,22 @@ In here we can put instructions on how to run this / any general info
 to edit src:
 
 git....
-CAM: fill this in pls
+cd ~/src ; # or wherever you want the code to live
+git clone https://git.depaoli.id.au/ddp/photoassistant.git
 
 
 ubuntu packages:
-sudo apt-get install -y mediainfo cmake python3-flask
+sudo apt-get install -y cmake libpq-dev
 
 
-REDO THIS SOON (good to test on a clean install):
+AFTER a fresh ubuntu, you need to remove/make a new .python env and get the pip-based python packages installed into .python:
+cd ~/src/photoassistant
+rm .python
 # use a venv (to get consistent python versions with Prod container)
 python3 -m venv .python
 source ./.python/bin/activate
 pip3 install -r requirements.txt
 
 
-pip packages:
-* pymediainfo
-* PIL (should be there by default)
-  - on ubuntu, need to do: pip3 install --upgrade pillow --user
-* ExifRead
-* opencv-python
-* datetime
-* pytz
-* face_recognition
-* flask_login
-* flask-ldap3-login
-#### dlib (might need to install this before face_recognitioin, but it might not be needed, cmake clearly was)
 
 
 upstream packages...
 mkdir static/upstream
 cd static/upstream
@@ -60,14 +48,11 @@ pip3 list | tail -n +3 | grep -v mysqlclient | grep -v duplicity | grep -v gp
 
 To run debug version of web server (gunicorn deprecates FLASK_ENV, so co-exist for now):
 
-# older flask:
-FLASK_APP=main ENV=development FLASK_ENV=development flask run --host=192.168.0.2
-
 # flask 2.2.2+ (in venv .python)
-FLASK_APP=main ENV=development ./.python/bin/flask --debug run --host=192.168.0.2
+FLASK_APP=main ENV=development ./.python/bin/flask --debug run --host=mara.ddp.net
 
 to run prod version of web server:
-gunicorn --bind="192.168.0.2:5000" --threads=2 --workers=2 main:app
+gunicorn --bind="mara.ddp.net:5000" --threads=2 --workers=2 main:app
 
 Also have to run the job manager for jobs to work:
 ENV="development" python3 pa_job_manager.py

TODO
@@ -1,5 +1,36 @@
+### major fix - go to everywhere I call GetEntries(), and redo the logic totally... (route sketch after this hunk)
+ * firstly, run the query as per normal, but get just the matched eids into an entry_lst
+ * make a unique query_id for this entry_lst, and store its entry_ids into a "query" table keyed by that query_id
+ * take most of pa_user_state that relates to query state and move it to the "query" table, per query_id
+ * pa_user_state then becomes the defaults for the next query (so how_many, noo, etc)
+
+ * we can age out queries from the query table after a few months?
+ * client side always has query_id. IF the DB does not have that query_id, then it's really old? - just say so...
+
+ * client side takes query_id, entry_lst, current_eid, offset, first/last_eid, etc. as part of its first route / html creation.
+ * it then decides, based on all this, to GetEntryDetails( subset of entry_lst ) <- needs a new route
+ * IN THEORY some of the subset of entry_lst don't exist -- BUT we can handle that on response, e.g. say my query used to have 1,2,3, and since then another user/action deleted 2:
+   - I ask for details on 1,2,3 and get back details on 1,3 only.
+   - On the client side, I can say: since you ran this query, data in PA has changed - image #2 is no longer in PA.
+     Please run a new query (or, bonus points, maybe keep enough of the original query to note this and ask: do you want to ignore changes, or re-run the query and get the latest data?)
+ * client can go fwd or back in the entry_lst same as now (disabling buttons as needed), BUT as entry_lst is NOT recreated per page move, there is no chance to get confused about first/last
+ * client side:
+   * for a real chance to stop confusion, instead of removing deleted images from the DOM, we should gray them out and put a big Del (red circle with line?) through them as an overlay.
+   * create another table, entry_ammendments - note the deletions, rotations, flips of specific eids - then reproduce that on the client side visually as needed
+     - at least grayed-out, to indicate a pending action is not complete.
+     - when a job that flips, rotates, or deletes completes, then update the query details (e.g. remove eids, or remove the ammendments)
+     - this actually is quite an improvement: if someone is deleting 2 as per above, I will see that as a pending change in my unrelated query; ditto flips, etc.
 
 ### GENERAL
-* change the rotation code to use that jpeg util to reduce/remove compression loss?
+* jobs for AI should show the path name
+* rm dups job should show a progress bar
+* in viewer, there is no move button (maybe add one?)
+* consider doing duplicates before AI, and if there are say 100s+, then maybe pause the AI work
+  - had 5000+ new photos; it took 8 hours to finish, just for me to delete them anyway
+* optimisation:
+  - keep track of just the new files since the last scan (even if we did this from the DB),
+    then we could feed those eid's explicitly into a 'get_file_details_on_new_files'.
+  - ALSO use the new_eids list IF there are no new refimgs to do a 'run_ai_on_new_files' :)
+
 * allow changing dates in move dbox and then re-scan for existing folders OR just have a browse for existing...
   - for use on scanned photos, they register as 2010, but are datestamped in visuals for 95
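A rough sketch of the new GetEntryDetails route the plan above calls for — returning details for one page of eids and flagging any that have disappeared since the query ran. The route name and response shape are assumptions at this stage:

```python
from flask import request, jsonify

@app.route("/entry_details", methods=["POST"])
@login_required
def entry_details():
    # the client sends the page of eids it wants out of its saved entry_lst
    wanted = request.json["eids"]
    entries = Entry.query.filter(Entry.id.in_(wanted)).all()
    found = {e.id: {"name": e.name} for e in entries}  # plus whatever else the page needs
    # anything asked for but not returned was deleted/moved after the query ran;
    # the client can grey those out or offer a re-run instead of getting confused
    missing = [eid for eid in wanted if eid not in found]
    return jsonify({"objs": found, "missing": missing})
```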
@@ -10,11 +41,22 @@
 -- path.py has the __allow_unmapped__ = True
 * remove all '.execute' from *.py
 
-* read this: https://flask.palletsprojects.com/en/2.2.x/testing/#faking-resources-and-context
+* allow actions for wrong person:
+  -> someone else? OR override "no match for this person, ever, for this image"?
+* groups of persons (non-exclusive, recursive - expansion sketch after this hunk), so:
+  - fam (ddp, mandy, cam, mich)
+  - mandy_fam (mandy, kirsty, tony, anne, tenille)
+  - uncle_peters_fam (peter, joan, jarrod, aaron, alana)
+  - cousin_aarons_fam (aaron, karly, kai, etc)
+  - storviks (mandy_fam, uncle_peters_fam, cousin_aarons_fam, etc)
+
-* could get better AI optim, by keeping track of just new files since scan (even if we did this from the DB),
-  then we could just feed those eid's explicitly into a 'run_ai_on_new_files' :) -- maybe particularly
-  if count('new files') < say 1000 do eids, otherwise do path AND no new refimgs
+* birthdates in PA:
+  - M was interested in birthdates appearing in hass, so need an API /
+    sensor in hass too - maybe could store an (optional) birthdate and a
+    heads-up (of X days, e.g. so you can get a present), or just have it come up on the day so you can remember to text them
+* search logic (AND vs OR)
+
+* read this: https://flask.palletsprojects.com/en/2.2.x/testing/#faking-resources-and-context
+
 * ignore face should ignore ALL matching faces (re: Declan)
 
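Since groups may contain other groups (storviks pulls in mandy_fam and the rest), resolving a group to person tags needs a recursive walk with cycle protection. A small sketch, assuming a plain dict of group -> members where a member can itself be a group:

```python
# hypothetical group definitions: a member is either a person tag or another group
GROUPS = {
    "fam":              ["ddp", "mandy", "cam", "mich"],
    "mandy_fam":        ["mandy", "kirsty", "tony", "anne", "tenille"],
    "uncle_peters_fam": ["peter", "joan", "jarrod", "aaron", "alana"],
    "storviks":         ["mandy_fam", "uncle_peters_fam", "cousin_aarons_fam"],
}

def expand_group(name, seen=None):
    """Recursively flatten a group into person tags; 'seen' guards against cycles."""
    seen = seen if seen is not None else set()
    if name in seen:                    # groups are non-exclusive, so cycles are possible
        return set()
    seen.add(name)
    people = set()
    for member in GROUPS.get(name, [name]):   # an unknown name is treated as a person tag
        if member in GROUPS:
            people |= expand_group(member, seen)
        else:
            people.add(member)
    return people

print(expand_group("storviks"))   # union of all nested members
```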
@@ -28,7 +70,7 @@
 
 * browser back/forward buttons dont work -- use POST -> redirect to GET
   - need some sort of clean-up of pa_user_state -- I s'pose it's triggered by browser session, so maybe just after a week is lazy/good enough
-  -- pa_user_state has last_used as a timestamp so can be used to delete old entries
+  - pa_user_state has last_used as a timestamp, so it can be used to delete old entries
 
 * back button will fail if we do these POSTs:
   job.py:@app.route("/jobs", methods=["GET", "POST"])
@@ -40,8 +82,6 @@
 files.py:@app.route("/fix_dups", methods=["POST"])
 ???
 
-* allow user to choose default log level to show
-
 * GUI overhaul?
 * on a phone, the files.html page header is a mess: the "Oldest.." line is too large to fit on 1 line (make it a hamburger?)
   - searched-for text overlaps buttons above and below
@@ -59,9 +99,6 @@
 * fix up logging in general
   ProcessFileForJob --> really need to better handle log levels and counting
 
-* metadata at folder level with file level to add more richness
-  - store in DB? or store in hidden file (or both)... IF it is outside the DB, then I can 'rebuild' the DB at anytime from scratch
-
 * video player cannot handle non mp4 formats... do I care? (could just offer a download link and hope the client deals with it)
   --> OR? https://jsmpeg.com/
   --> OR? convert all videos to mp4/webm
@@ -91,6 +128,8 @@
 * viewer needs to allow toggle to scan_model (and prob. right-click on file... AI (with CNN), AI (with hog))
   - make the form-select AI_Model actually do the change (but need more mem on mara really -- even mem is not enough,
     need graphics support --> need to allow pa_job_manager to run on borric with acceleration)
+  - test this with the new CPU in mara
+  - test this on borric for comparison
 
 For AI / rescan:
 way to override per file:
@@ -209,3 +248,182 @@ def AddRefimgToPerson(person_id: int, ref_images: List[str]) -> bool:
 >>> AddRefimgToPerson(123, ['http://example.com/img1.jpg', '/path/to/img2.png'])
 True
 """
+
+
+checkout deepface:
+
+User: I have this code I wrote using python's face_recognition library, but I would like to alter it to work with deepface. My code is:
+    im = face_recognition.load_image_file(e.FullPathOnFS())
+    face_locations = face_recognition.face_locations(im, model=model.name )
+    unknown_encodings = face_recognition.face_encodings(im, known_face_locations=face_locations)
+
+Assistant: To alter your existing code to work with **DeepFace**, the changes involve replacing the methods of the `face_recognition` library with the similar functionality provided by DeepFace.
+
+In **DeepFace**, the equivalent functionality includes:
+
+1. **Detecting faces**: This is handled automatically by DeepFace's `represent` method. You don't need to specify the face locations manually.
+2. **Encoding faces**: DeepFace internally computes embeddings for faces through its supported models (like VGG-Face, Facenet, OpenFace, DeepID, etc.).
+
+Here's how you can modify your code to use DeepFace instead of `face_recognition`:
+
+### Original Code (using `face_recognition`):
+```python
+im = face_recognition.load_image_file(e.FullPathOnFS())
+face_locations = face_recognition.face_locations(im, model=model.name)
+unknown_encodings = face_recognition.face_encodings(im, known_face_locations=face_locations)
+```
+
+### Modified Code (using `DeepFace`):
+```python
+from deepface import DeepFace
+
+# Get face embeddings (encodings); detection happens internally
+results = DeepFace.represent(img_path=e.FullPathOnFS(), model_name=model.name)
+
+# `results` is a list of dictionaries; each one holds the embedding of a detected face
+unknown_encodings = [res['embedding'] for res in results]
+```
+
+### Explanation:
+
+- `DeepFace.represent()` takes the path to the image (`e.FullPathOnFS()`) and the name of the model (`model.name`). It processes the image and extracts the embeddings (encodings) for each face detected.
+- **Note**: In DeepFace, the model name can be one of several options: `VGG-Face`, `Facenet`, `OpenFace`, `DeepFace`, `DeepID`, etc. Make sure that `model.name` corresponds to one of these supported models.
+- The result is a list of dictionaries. Each dictionary contains a detected face's encoding (embedding), which you can access using the `'embedding'` key.
+
+DeepFace abstracts away a lot of the manual face detection work you were handling with `face_recognition`. Let me know if you'd like to handle anything else, like visualizing the detected faces or working with other DeepFace models.
+
+User: I want to use DeepFace to take a set of reference images and match against all the images I have to find the best matches; how would you go about that?
+
+Assistant: To use **DeepFace** to match a set of reference images against other images and find the best matches, you can follow these general steps:
+
+1. **Compute embeddings (encodings) for the reference images.**
+2. **Compute embeddings for the target images** (the ones you're searching through).
+3. **Compare the embeddings** of the reference images with those of the target images to find the best matches based on a distance metric.
+4. **Rank the matches** by similarity and return the best ones.
+
+Here's a step-by-step guide to achieving this with **DeepFace**:
+
+### Step 1: Compute embeddings for reference images
+Load the reference images, compute their embeddings using DeepFace, and store them.
+
+### Step 2: Compute embeddings for all target images
+Do the same for the target images you want to match against.
+
+### Step 3: Compare embeddings
+Compare the embeddings from the reference images to the embeddings of each target image. The smaller the distance between the embeddings, the more similar the faces are.
+
+### Step 4: Return the best matches
+
+### Example Code
+(the full example script here is identical to the DEEPFACE file above)
+
+### Explanation:
+
+1. **`compute_embeddings(image_path, model_name)`**: uses DeepFace's `represent` function to compute face embeddings for each face found in the image.
+2. **`find_best_matches()`**:
+   - first computes and stores embeddings for all reference images in the `reference_embeddings` dictionary.
+   - for each target image, computes the embeddings, then compares those embeddings with all the reference embeddings using the chosen distance metric (default is `'cosine'`).
+   - then stores the matches and sorts them by distance (the smaller the distance, the closer the match).
+3. **Distance metrics**: you can choose between several distance metrics like `'cosine'`, `'euclidean'`, or `'euclidean_l2'`, depending on the model and your preference.
+4. **Matching logic**: it finds the best matches by comparing each face embedding in the target images with each face embedding in the reference images and storing the distances.
+
+### DeepFace Distance Metrics:
+
+- **Cosine**: measures the cosine of the angle between two vectors (common for face embeddings).
+- **Euclidean**: measures the straight-line distance between two vectors.
+- **Euclidean L2**: a variant of Euclidean where the embeddings are normalized before the distance calculation.
+
+### Example of top match output:
+```bash
+Reference Image: path_to_reference_image1.jpg, Target Image: path_to_target_image1.jpg, Distance: 0.256
+Reference Image: path_to_reference_image2.jpg, Target Image: path_to_target_image2.jpg, Distance: 0.310
+...
+```
+
+This approach will give you the best matching images based on facial similarity. You can adjust the model used (`VGG-Face`, `Facenet`, etc.) and the distance metric to fine-tune results. Let me know if you'd like to visualize the results or handle larger datasets more efficiently!

files.py
@@ -234,8 +234,12 @@ def GetEntriesInFolderView( OPT, prefix ):
 ################################################################################
 def GetEntriesInSearchView( OPT ):
     search_term=OPT.orig_search_term
+    # turn the * wildcard into the sql wildcard %
+    search_term=search_term.replace('*', '%' )
     if 'AI:' in OPT.orig_search_term:
         search_term = search_term.replace('AI:','')
+        join=f"Entry.query.join(File).join(FaceFileLink).join(Face).join(FaceRefimgLink).join(Refimg).join(PersonRefimgLink).join(Person).filter(Person.tag == search_term)"
+    else:
         join=f"Entry.query.join(File).join(FaceFileLink).join(Face).join(FaceRefimgLink).join(Refimg).join(PersonRefimgLink).join(Person).filter(Person.tag.ilike('%{search_term}%'))"
     if 'AI:' in OPT.orig_search_term:
         all_entries = eval( f"{join}.{OPT.order}.offset(OPT.offset).limit(OPT.how_many).all()")
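The joins above are assembled as f-strings and executed via eval(); the same filters can be expressed directly in the ORM, which keeps the search term out of string interpolation entirely. A sketch of the non-eval equivalent (the dynamic `OPT.order` part would still need getattr-style dispatch, and this is untested against the real schema):

```python
# the eval'd join strings, written as an ordinary SQLAlchemy query
query = (Entry.query
         .join(File).join(FaceFileLink).join(Face)
         .join(FaceRefimgLink).join(Refimg)
         .join(PersonRefimgLink).join(Person))
if 'AI:' in OPT.orig_search_term:
    query = query.filter(Person.tag == search_term)             # exact tag match for AI: searches
else:
    query = query.filter(Person.tag.ilike(f"%{search_term}%"))  # sent as a bound parameter
all_entries = query.offset(OPT.offset).limit(OPT.how_many).all()
```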
@@ -265,13 +269,18 @@ def GetEntriesInSearchView( OPT ):
         if add_it:
             all_entries.append(a)
+
+    # nothing found, just return now
+    if len(all_entries) == 0:
+        OPT.num_entries = 0
+        return []
+
     # for all searches, first_entry is worked out when first_eid is not set yet & offset is 0 and we have some entries
     if OPT.first_eid == 0 and OPT.offset == 0 and len(all_entries):
         OPT.first_eid = all_entries[0].id
     if OPT.last_eid == 0:
         by_fname= f"select e.id from entry e where e.name ilike '%%{search_term}%%'"
         by_dirname=f"select e.id from entry e, entry_dir_link edl where edl.entry_id = e.id and edl.dir_eid in ( select d.eid from dir d where d.rel_path ilike '%%{search_term}%%' )"
-        by_ai     =f"select e.id from entry e, face_file_link ffl, face_refimg_link frl, person_refimg_link prl, person p where e.id = ffl.file_eid and frl.face_id = ffl.face_id and frl.refimg_id = prl.refimg_id and prl.person_id = p.id and p.tag ilike '%%{search_term}%%'"
+        by_ai     =f"select e.id from entry e, face_file_link ffl, face_refimg_link frl, person_refimg_link prl, person p where e.id = ffl.file_eid and frl.face_id = ffl.face_id and frl.refimg_id = prl.refimg_id and prl.person_id = p.id and p.tag = '{search_term}'"
 
         if 'AI:' in OPT.orig_search_term:
             sel_no_order=f"select e.*, f.* from entry e, file f where e.id=f.eid and e.id in ( {by_ai} ) "
@@ -283,6 +292,9 @@ def GetEntriesInSearchView( OPT ):
         with db.engine.connect() as conn:
             OPT.num_entries = conn.execute( text( num_e_sql ) ).first().count
+
+        if OPT.num_entries == 0:
+            return []
+
         last_entry_sql= f"{sel_no_order} order by {OPT.last_order_raw} limit 1"
         with db.engine.connect() as conn:
             OPT.last_eid = conn.execute( text( last_entry_sql ) ).first().id
@@ -438,7 +450,10 @@ def files_rbp():
 @app.route("/search/<search_term>", methods=["GET", "POST"])
 @login_required
 def search(search_term):
+    # print( f"req={request}" )
     OPT=States( request )
+    # print( f"OPT={OPT}" )
+
     # if we posted to get here, it's a change in State, so save it to pa_user_state and go back to the GET version of the URL
     if request.method=="POST":
         redirect("/search/"+search_term)
@@ -586,7 +601,7 @@ def view_list():
     # this occurs when we went from the last image on a page (with how_many on
     # it) and it just happened to also be the last in the DB...
     if not entries:
-        SetFELog( message="DDP: DONT think this can happen anymore", log_level="error", job=None, persistent=True, cant_close=True )
+        SetFELog( message="DDP: DONT think this can happen anymore", level="danger", job=None, persistent=True, cant_close=True )
 
         # undo the skip by how_many and getentries again
         OPT.offset -= int(OPT.how_many)
@@ -610,7 +625,6 @@ def view_list():
         resp['objs'][e.id]['faces'] = []
 
         # put face data back into array format (for js processing)
-        fid=0
         for face in e.file_details.faces:
             fd= {}
             fd['x'] = face.face_left
@@ -618,10 +632,10 @@ def view_list():
             fd['w'] = face.w
             fd['h'] = face.h
             if face.refimg:
+                fd['pid'] = face.refimg.person.id
                 fd['who'] = face.refimg.person.tag
                 fd['distance'] = round(face.refimg_lnk.face_distance,2)
             resp['objs'][e.id]['faces'].append(fd)
-            fid+=1
 
     eids=eids.rstrip(",")
     lst = eids.split(',')
@@ -638,9 +652,12 @@ def view_list():
     resp['last_eid']=OPT.last_eid
     resp['eids']=eids
     resp['offset']=OPT.offset
+    # print( f"BUG-DEBUG: /view_list route #1 - OPT={OPT}, eids={eids} ")
     # save pref to keep the new current value, first/last
     pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.orig_ptype==OPT.orig_ptype,PA_UserState.view_eid==OPT.view_eid).first()
+    # print( f"BUG-DEBUG: /view_list route #2 - OPT={OPT}, eids={eids} ")
     UpdatePref( pref, OPT )
+    # print( f"BUG-DEBUG: /view_list route #3 - OPT={OPT}, eids={eids} ")
 
     return make_response( resp )
@@ -674,13 +691,13 @@ def view(id):
     eids=eids.rstrip(",")
     # jic, sometimes we trip this, and rather than show broken pages / destroy
     if id not in eids:
-        SetFELog( message=f"ERROR: viewing an id, but its not in eids OPT={OPT}, id={id}, eids={eids}", log_level="error", persistent=True, cant_close=True)
+        SetFELog( message=f"ERROR: viewing an id, but its not in eids OPT={OPT}, id={id}, eids={eids}", level="danger", persistent=True, cant_close=False)
         msg="Sorry, viewing data is confused, cannot view this image now"
         if os.environ['ENV'] == "production":
             msg += "Clearing out all states. This means browser back buttons will not work, please start a new tab and try again"
             PA_UserState.query.delete()
            db.session.commit()
-        SetFELog( msg, "warning" )
+        SetFELog( msg, "warning", persistent=True, cant_close=False )
         return redirect("/")
     else:
         NMO_data = FaceOverrideType.query.all()
@@ -688,6 +705,7 @@ def view(id):
         imp_path = setting.import_path
         st_path = setting.storage_path
         bin_path = setting.recycle_bin_path
+        # print( f"BUG-DEBUG: /view/id GET route - OPT={OPT}, eids={eids}, current={int(id)} ")
         return render_template("viewer.html", current=int(id), eids=eids, objs=objs, OPT=OPT, NMO_data=NMO_data, imp_path=imp_path, st_path=st_path, bin_path=bin_path )
 
 ##################################################################################
@@ -698,6 +716,7 @@ def view(id):
 def view_img_post(id):
     # set pa_user_states...
     OPT=States( request )
+    # print( f"BUG-DEBUG: /view/id POST route - OPT={OPT}, id={id} ")
     # then use back-button friendly URL (and use pa_user_states to view the right image in the right list)
     return redirect( "/view/" + id );
@@ -782,7 +801,7 @@ def _jinja2_filter_parentpath(path):
 def get_existing_paths(dt):
     dir_ft=FileType.query.filter(FileType.name=='Directory').first()
     dirs_arr=[]
-    for delta in range(-7, 8):
+    for delta in range(-14, 15):
         try:
             new_dtime=datetime.strptime(dt, "%Y%m%d") + timedelta(days=delta)
         except:
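The last hunk widens get_existing_paths' search window from ±7 to ±14 days. For reference, the candidate dates it walks come from the same strptime/timedelta pattern, roughly (a sketch, not the function itself):

```python
from datetime import datetime, timedelta

def candidate_dates(dt):  # dt is a string like "20120414", as passed to get_existing_paths
    base = datetime.strptime(dt, "%Y%m%d")
    # range(-14, 15) covers two weeks either side of the target date, inclusive
    return [(base + timedelta(days=d)).strftime("%Y%m%d") for d in range(-14, 15)]
```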
@@ -173,27 +173,6 @@
 <svg id="unknown_ftype" fill="grey" viewBox="0 0 16 16">
 <path d="M2 0a2 2 0 0 0-2 2v12a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V2a2 2 0 0 0-2-2H2zm3.496 6.033a.237.237 0 0 1-.24-.247C5.35 4.091 6.737 3.5 8.005 3.5c1.396 0 2.672.73 2.672 2.24 0 1.08-.635 1.594-1.244 2.057-.737.559-1.01.768-1.01 1.486v.105a.25.25 0 0 1-.25.25h-.81a.25.25 0 0 1-.25-.246l-.004-.217c-.038-.927.495-1.498 1.168-1.987.59-.444.965-.736.965-1.371 0-.825-.628-1.168-1.314-1.168-.803 0-1.253.478-1.342 1.134-.018.137-.128.25-.266.25h-.825zm2.325 6.443c-.584 0-1.009-.394-1.009-.927 0-.552.425-.94 1.01-.94.609 0 1.028.388 1.028.94 0 .533-.42.927-1.029.927z"/>
 </svg>
-<svg id="rot90" viewBox="0 0 225 225" xmlns="http://www.w3.org/2000/svg" xmlns:bx="https://boxy-svg.com">
-<g transform="matrix(1, 0, 0, 1, 7, 20)">
-<path style="fill: rgba(216, 216, 216, 0); stroke: rgb(13, 202, 240); stroke-width: 15px;" d="M 100.468 0.428 C 155.454 0.428 200.022 45.014 200 100"/>
-<path d="M -100.468 -5.428 H -100.468 L -100.468 -20.428 L -60.468 -0.428 L -100.468 19.572 L -100.468 4.572 H -100.468 V -5.428 Z" style="fill: rgb(13, 202, 240);" transform="matrix(-1, 0, 0, -1, 0, 0)" bx:shape="arrow -100.468 -20.428 40 40 10 40 0 1@18b8b13c"/>
-<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" transform="matrix(1, 0, 0, 1, -50.667503, 38.507301)"><tspan x="97.106" y="94.348">90</tspan><tspan x="97.106" dy="1em"></tspan></text>
-</g>
-</svg>
-
-<svg id="rot180" viewBox="0 0 225 225" xmlns:bx="https://boxy-svg.com">
-<path style="fill: rgba(216, 216, 216, 0); paint-order: stroke; stroke-width: 15px; stroke: rgb(13, 202, 240);" d="M 70.624 -41.413 C 147.233 -41.507 195.215 41.367 156.992 107.76 C 139.252 138.572 106.422 157.581 70.867 157.624" transform="matrix(0, -1, 1, 0, 62.412816, 178.623736)"/>
-<path d="M 0 95 H 0 L 0 80 L 40 100 L 0 120 L 0 105 H 0 V 95 Z" style="fill: rgb(13, 202, 240);" transform="matrix(0.000048, 1, -1, 0.000048, 120.999475, 107.995157)" bx:shape="arrow 0 80 40 40 10 40 0 1@de4046a6"/>
-<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" transform="matrix(1, 0, 0, 1, -13.373604, 45.416005)"><tspan x="54.39" y="96.53">180</tspan><tspan x="54.39" dy="1em"></tspan></text>
-</svg>
-
-<svg id="rot270" viewBox="0 0 225 225" xmlns:bx="https://boxy-svg.com">
-<g transform="matrix(1, 0, 0, 1, 8, 9)">
-<path style="fill: rgba(13, 202, 240, 0); paint-order: fill; stroke: rgb(13, 202, 240); stroke-width: 15px;" d="M 100.308 199.384 C 177.051 199.384 225.015 116.307 186.644 49.846 C 148.272 -16.615 52.344 -16.615 13.972 49.846 C 5.371 64.744 0.771 81.615 0.62 98.817" transform="matrix(0, -1, 1, 0, 0.692227, 200.07649)"/>
-<path d="M 99 195 H 99 L 99 180 L 139 200 L 99 220 L 99 205 H 99 V 195 Z" style="fill: rgb(13, 202, 240);" bx:shape="arrow 99 180 40 40 10 40 0 1@077af6d0"/>
-<text style="fill: rgb(13, 202, 240); font-family: Arial, sans-serif; font-size: 96px; font-weight: 700; white-space: pre;" x="20.59" y="133.396">270</text>
-</g>
-</svg>
 <svg id="sort-num-asc" viewBox="0 0 16 16">
 <path d="M12.438 1.668V7H11.39V2.684h-.051l-1.211.859v-.969l1.262-.906h1.046z"/>
 <path fill-rule="evenodd" d="M11.36 14.098c-1.137 0-1.708-.657-1.762-1.278h1.004c.058.223.343.45.773.45.824 0 1.164-.829 1.133-1.856h-.059c-.148.39-.57.742-1.261.742-.91 0-1.72-.613-1.72-1.758 0-1.148.848-1.835 1.973-1.835 1.09 0 2.063.636 2.063 2.687 0 1.867-.723 2.848-2.145 2.848zm.062-2.735c.504 0 .933-.336.933-.972 0-.633-.398-1.008-.94-1.008-.52 0-.927.375-.927 1 0 .64.418.98.934.98z"/>

(image preview: before 27 KiB, after 25 KiB)
@@ -223,12 +223,14 @@ function ChangeSize(clicked_button,sz)
 // whether you click after highlight or before
 function DoSel(e, el)
 {
-    if( e.ctrlKey )
+    if( e.ctrlKey || document.fake_ctrl === 1 )
     {
         $(el).toggleClass('highlight')
+        if( document.fake_ctrl === 1 )
+            document.fake_ctrl=0
         return
     }
-    if( e.shiftKey )
+    if( e.shiftKey || document.fake_shift === 1 )
     {
         st=Number($('.highlight').first().attr('ecnt'))
         end=Number($('.highlight').last().attr('ecnt'))
@@ -243,6 +245,9 @@ function DoSel(e, el)
             $('.entry').slice( end, clicked+1 ).addClass('highlight')
         else
             $('.entry').slice( clicked, st ).addClass('highlight')
+
+        if( document.fake_shift === 1 )
+            document.fake_shift=0
         return
     }
     $('.highlight').removeClass('highlight')
@@ -237,6 +237,8 @@ $(document).ready( function()
             else if( objs[current].faces[i].who )
             {
                 item_list['match']={ 'name': objs[current].faces[i].who, 'which_face': i, 'id': objs[current].faces[i].id }
+                item_list['match_add_refimg']={ 'name': 'Add this as refimg for ' + objs[current].faces[i].who,
+                    'person_id': objs[current].faces[i].pid, 'who': objs[current].faces[i].who, 'which_face': i, 'id': objs[current].faces[i].id, }
                 item_list['wrong_person']={ 'name': 'wrong person', 'which_face': i, 'id': objs[current].faces[i].id }
             }
             else
@@ -494,10 +496,23 @@ function FaceDBox(key, item)
     {
         div+=AddSearch( 'Click one of the link(s) below to add this face as a reference image to the person:<br><br>', key, face_pos );
     }
+    if ( key == 'match_add_refimg' )
+    {
+        func='AddRefimgTo('+item[key]['person_id']+',\''+key+'\''
+        func_sn=func+ ', true )'
+        func_ao=func+ ', false )'
+        div+=`<script>console.log( "AddExistingFaceAsRefimgToMatchedPerson()" )</script>`
+        div+="Confirm you wish to add this face as a reference image for " + item[key]['who']
+        div+= '<div class="col">' + item[key]['who'] + '</div><div class="col input-group">'
+        div+= '<button onClick="'+func_sn+'" class="btn btn-success py-1 input-group-prepend">Add & search now</button> '
+        div+= '<button onClick="'+func_ao+'" class="btn btn-outline-success py-1 input-group-append">Add only</button></div>'
+
+    }
     if ( key == 'wrong_person' )
     {
         div+='<br>wrong person, so mark this as the wrong person/refimg connection, for face#' + item[key]['which_face']
         div+='<br>face db id: ' + item[key]['id']
+        div += '<br>not yet'
     }
     if( /NMO_/.test(key) )
     {

internal/pa-logo.png (new binary file; 18 KiB)
internal/rot180-invert.png (new binary file; 34 KiB)
internal/rot180.png (new binary file; 9.3 KiB)
internal/rot270-invert.png (new binary file; 36 KiB)
internal/rot270.png (new binary file; 9.2 KiB)
internal/rot90-invert.png (new binary file; 30 KiB)
internal/rot90.png (new binary file; 8.4 KiB)
107  internal/rot90.svg (new file, 4.2 KiB)
@@ -0,0 +1,107 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!-- Created with Inkscape (http://www.inkscape.org/) -->
+
+<svg
+   width="265mm"
+   height="297mm"
+   viewBox="0 0 265 297"
+   version="1.1"
+   id="svg5"
+   inkscape:version="1.2.2 (b0a8486541, 2022-12-01)"
+   sodipodi:docname="rot90.svg"
+   inkscape:export-filename="rot90.svg"
+   inkscape:export-xdpi="96"
+   inkscape:export-ydpi="96"
+   xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
+   xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
+   xmlns="http://www.w3.org/2000/svg"
+   xmlns:svg="http://www.w3.org/2000/svg">
+  <sodipodi:namedview
+     id="namedview7"
+     pagecolor="#ffffff"
+     bordercolor="#000000"
+     borderopacity="0.25"
+     inkscape:showpageshadow="2"
+     inkscape:pageopacity="0.0"
+     inkscape:pagecheckerboard="0"
+     inkscape:deskcolor="#d1d1d1"
+     inkscape:document-units="mm"
+     showgrid="false"
+     inkscape:zoom="0.914906"
+     inkscape:cx="222.97373"
+     inkscape:cy="261.22902"
+     inkscape:window-width="3840"
+     inkscape:window-height="2128"
+     inkscape:window-x="0"
+     inkscape:window-y="0"
+     inkscape:window-maximized="1"
+     inkscape:current-layer="svg5" />
+  <defs
+     id="defs2">
+    <marker
+       style="overflow:visible"
+       id="Arrow2"
+       refX="0"
+       refY="0"
+       orient="95"
+       inkscape:stockid="Arrow2"
+       markerWidth="2.7"
+       markerHeight="3.964"
+       viewBox="0 0 7.7 5.6"
+       inkscape:isstock="true"
+       inkscape:collect="always"
+       preserveAspectRatio="none">
+      <path
+         transform="scale(0.7)"
+         d="M -2,-4 9,0 -2,4 c 2,-2.33 2,-5.66 0,-8 z"
+         style="fill:context-stroke;fill-rule:evenodd;stroke:none"
+         id="arrow2L" />
+    </marker>
+  </defs>
+  <g
+     inkscape:label="Layer 1"
+     inkscape:groupmode="layer"
+     id="layer1">
+    <ellipse
+       style="font-variation-settings:normal;display:inline;opacity:1;fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#000000;stroke-width:10;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:20,50;stroke-dashoffset:2;stroke-opacity:1;paint-order:markers stroke fill;stop-color:#000000;stop-opacity:1"
+       id="path2244"
+       cx="131.0041"
+       cy="137.873"
+       rx="98.982185"
+       ry="108.3931"
+       inkscape:label="dashes" />
+  </g>
+  <g
+     inkscape:groupmode="layer"
+     id="layer2"
+     inkscape:label="270-arc" />
+  <path
+     style="font-variation-settings:normal;display:inline;fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#000000;stroke-width:12;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;marker-end:url(#Arrow2);paint-order:markers stroke fill;stop-color:#000000"
+     id="path2244-3"
+     inkscape:label="arc"
+     sodipodi:type="arc"
+     sodipodi:cx="131.08505"
+     sodipodi:cy="137.8826"
+     sodipodi:rx="98.982185"
+     sodipodi:ry="108.3931"
+     sodipodi:start="4.7127986"
+     sodipodi:end="0"
+     sodipodi:arc-type="arc"
+     d="M 131.1256,29.489511 A 98.982185,108.3931 0 0 1 230.06724,137.8826"
+     inkscape:export-filename="rot90.svg"
+     inkscape:export-xdpi="96"
+     inkscape:export-ydpi="96"
+     sodipodi:open="true" />
+  <text
+     xml:space="preserve"
+     style="font-style:normal;font-variant:normal;font-weight:normal;font-stretch:normal;font-size:90.3111px;font-family:sans-serif;-inkscape-font-specification:'sans-serif, Normal';font-variant-ligatures:normal;font-variant-caps:normal;font-variant-numeric:normal;font-variant-east-asian:normal;fill:#000000;stroke:none;stroke-width:4;stroke-linecap:round;stroke-linejoin:bevel;stroke-dasharray:none;stroke-dashoffset:0;paint-order:markers stroke fill"
+     x="74.990334"
+     y="174.16937"
+     id="text2468"
+     inkscape:label="270-text"><tspan
+       sodipodi:role="line"
+       id="tspan2466"
+       style="font-style:normal;font-variant:normal;font-weight:bold;font-stretch:normal;font-size:95.25px;font-family:sans-serif;-inkscape-font-specification:'sans-serif, Bold';font-variant-ligatures:normal;font-variant-caps:normal;font-variant-numeric:normal;font-variant-east-asian:normal;fill:#000000;stroke:none;stroke-width:4;stroke-dasharray:none"
+       x="74.990334"
+       y="174.16937">90</tspan></text>
+</svg>
24  job.py
@@ -78,13 +78,6 @@ class PA_JobManager_Message(PA,db.Model):
     job = db.relationship ("Job" )

-
-################################################################################
-# GetJM_Message: used in html to display any message for this front-end
-################################################################################
-def GetJM_Message():
-    msg=PA_JobManager_Message.query.first()
-    return msg

 ################################################################################
 # Used in main html to show a red badge of # jobs to draw attention there are
 # active jobs being processed in the background
@@ -178,7 +171,9 @@ def jobs():
         jobs = Job.query.order_by(Job.id.desc()).all()
     else:
         page_title='Job list (recent)'
-        jobs = Job.query.filter( Job.last_update >= (func.now() - func.cast(concat(settings.job_archive_age, 'DAYS'), INTERVAL)) ).order_by(Job.id.desc()).all()
+        # work out cutoff in python (used to do this in sql and it was too slow)
+        cutoff = datetime.now() - timedelta(days=settings.job_archive_age)
+        jobs = Job.query.filter( Job.last_update >= cutoff ).order_by(Job.id.desc()).all()
     return render_template("jobs.html", jobs=jobs, page_title=page_title)
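Moving the age cutoff out of SQL turns the filter into a plain comparison of an indexed column against a constant, instead of an interval expression evaluated inside the query. A minimal sketch of the new pattern (7 stands in for settings.job_archive_age; the Job query line is the repo's and only shown for context):

    from datetime import datetime, timedelta

    cutoff = datetime.now() - timedelta(days=7)   # "job_archive_age" days ago
    # recent = Job.query.filter(Job.last_update >= cutoff).order_by(Job.id.desc()).all()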
@@ -188,7 +183,7 @@ def jobs():
 @app.route("/job/<id>", methods=["GET","POST"])
 @login_required
 def joblog(id):
-    joblog = Job.query.get(id)
+    joblog = db.session.get(Job,id)

     if request.method == 'POST':
         logs=Joblog.query.filter(Joblog.job_id==id).order_by(Joblog.log_date).all()
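Query.get() has been legacy since SQLAlchemy 1.4 and warns on use; Session.get(Model, pk) is the replacement the diff switches to. A self-contained demo (Demo is a stand-in model, not from this repo):

    from sqlalchemy import Column, Integer, create_engine
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class Demo(Base):                        # stand-in for Job
        __tablename__ = "demo"
        id = Column(Integer, primary_key=True)

    engine = create_engine("sqlite://")      # throwaway in-memory DB
    Base.metadata.create_all(engine)
    with Session(engine) as s:
        s.add(Demo(id=1)); s.commit()
        assert s.get(Demo, 1) is not None    # returns None, not an error, on a miss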
@@ -344,3 +339,14 @@ def _jinja2_filter_datetime(date, fmt=None):
         return date.strftime("%d/%m/%Y %I:%M:%S %p")
     else:
         return "N/A"
+
+################################################################################
+# allow a way to force the messages to be deleted if really needed - its a bit
+# lame, but a quick fix
+################################################################################
+@app.route('/force_clear')
+@login_required
+def force_clear():
+    PA_JobManager_Message.query.delete();
+    db.session.commit()
+    return redirect("/")
20  main.py
@@ -11,7 +11,7 @@ from datetime import datetime
 import os
 import re
 import socket
-from shared import CreateSelect, CreateFoldersSelect, LocationIcon, DB_URL, PROD_HOST, OLDEST_LOG_LIMIT
+from shared import CreateSelect, CreateFoldersSelect, LocationIcon, DB_URL, OLDEST_LOG_LIMIT

 # for ldap auth
 from flask_ldap3_login import LDAP3LoginManager
@@ -60,15 +60,18 @@ except Exception:
     app.config['SECRET_KEY'] = b'my_insecure_PA_token_with_random_2134876adsfjhlkasdf87'

 # ldap config vars: (the last one is required, or python ldap freaks out)
-app.config['LDAP_HOST'] = 'mara.ddp.net'
+app.config['LDAP_HOST'] = 'ldap.ddp.net'
 app.config['LDAP_BASE_DN'] = 'dc=depaoli,dc=id,dc=au'
 app.config['LDAP_USER_DN'] = 'ou=users'
 app.config['LDAP_GROUP_DN'] = 'ou=groups'
-app.config['LDAP_USER_RDN_ATTR'] = 'cn'
+app.config['LDAP_USER_RDN_ATTR'] = 'uid'
 app.config['LDAP_USER_LOGIN_ATTR'] = 'uid'
 app.config['LDAP_BIND_USER_DN'] = None
 app.config['LDAP_BIND_USER_PASSWORD'] = None
 app.config['LDAP_GROUP_OBJECT_FILTER'] = '(objectclass=posixGroup)'
+app.config['LDAP_BIND_USER_DN'] = None
+app.config['LDAP_BIND_USER_PASSWORD'] = None


 db = SQLAlchemy(app) # create the (flask) sqlalchemy connection
 ma = Marshmallow(app) # set up Marshmallow - data marshalling / serialising
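For context, the wiring these settings feed into: flask-ldap3-login builds user DNs roughly as <RDN_ATTR>=<login>,<USER_DN>,<BASE_DN>, so LDAP_USER_RDN_ATTR must match how the user entries are actually named (the cn -> uid fix above). A hedged sketch with illustrative values, assuming a directory that allows anonymous binds:

    from flask import Flask
    from flask_ldap3_login import LDAP3LoginManager

    app = Flask(__name__)
    app.config["LDAP_HOST"] = "ldap.example.net"   # illustrative, not the repo's host
    app.config["LDAP_BASE_DN"] = "dc=example,dc=net"
    app.config["LDAP_USER_DN"] = "ou=users"
    app.config["LDAP_USER_RDN_ATTR"] = "uid"       # how user entries are named
    app.config["LDAP_USER_LOGIN_ATTR"] = "uid"     # attribute matched against the form
    app.config["LDAP_BIND_USER_DN"] = None         # None -> anonymous bind
    app.config["LDAP_BIND_USER_PASSWORD"] = None

    ldap_manager = LDAP3LoginManager(app)          # consumes the config above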
@@ -86,6 +89,7 @@ from files import Entry
 from person import Person
 from settings import Settings
 from user import PAUser
+from states import PA_UserState

 ####################################### GLOBALS #######################################
 # allow jinja2 to call these python functions directly
@@ -142,12 +146,17 @@ def login():
     # ldap-injecting our username, so send them back to the login page instead
     if request.method == 'POST' and re.search( r'[()\\*&!]', request.form['username']):
         from job import SetFELog
-        SetFELog( message=f"ERROR: Detected special LDAP chars in username: {request.form['username']}", log_level="error", persistent=True, cant_close=True )
+        SetFELog( message=f"ERROR: Detected special LDAP chars in username: {request.form['username']}", level="danger", persistent=True, cant_close=True )
         return redirect(url_for('login'))
     if form.validate_on_submit():
         # Successfully logged in, We can now access the saved user object via form.user.
         login_user(form.user, remember=True) # Tell flask-login to log them in.
         next = request.args.get("next")

+        # just (re)-authenticated, so clear old state from UserState, to avoid re-using old data that is no longer valid
+        PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn).delete()
+        db.session.commit()
+
         if next:
             return redirect(next) # Send them back where they came from
         else:
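The character class in the guard above covers the LDAP filter metacharacters ( ) \ * & !, so any username that could alter a filter string is rejected before one is built. A runnable check of that same regex (the helper name is assumed):

    import re

    def has_ldap_metachars(username):
        # same pattern as the login() check above
        return bool(re.search(r'[()\\*&!]', username))

    assert has_ldap_metachars("admin)(uid=*")   # classic filter-injection shape
    assert not has_ldap_metachars("ddp")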
@@ -242,9 +251,6 @@ def logout():
 # main to be called via Flask/Gunicorn
 ###############################################################################
 def main():
-    if hostname == PROD_HOST:
-        app.run(ssl_context=('/etc/letsencrypt/live/pa.depaoli.id.au/cert.pem', '/etc/letsencrypt/live/pa.depaoli.id.au/privkey.pem'), host="0.0.0.0", debug=False)
-    else:
         app.run(host="0.0.0.0", debug=True)

 ###############################################################################
@@ -1,4 +1,4 @@
-#
 #
 # This file controls the 'external' job control manager, that (periodically  #
 # looks / somehow is pushed an event?) picks up new jobs, and processes them.
@@ -15,7 +15,7 @@

 ### SQLALCHEMY IMPORTS ###
 from sqlalchemy.ext.declarative import declarative_base
-from sqlalchemy import Column, Integer, String, Sequence, Float, ForeignKey, DateTime, LargeBinary, Boolean, func
+from sqlalchemy import Column, Integer, String, Sequence, Float, ForeignKey, DateTime, LargeBinary, Boolean, func, text
 from sqlalchemy.exc import SQLAlchemyError
 from sqlalchemy.orm import relationship
 from sqlalchemy import create_engine
@@ -23,7 +23,7 @@ from sqlalchemy.orm import sessionmaker
 from sqlalchemy.orm import scoped_session

 ### LOCAL FILE IMPORTS ###
-from shared import DB_URL, PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT, THUMBSIZE, SymlinkName, GenThumb, SECS_IN_A_DAY
+from shared import DB_URL, PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT, THUMBSIZE, SymlinkName, GenThumb, SECS_IN_A_DAY, PA_EXIF_ROTATER
 from datetime import datetime, timedelta, date

 ### PYTHON LIB IMPORTS ###
@@ -45,12 +45,18 @@ import face_recognition
 import re
 import sys
 import ffmpeg
+import subprocess


 # global debug setting
-if 'ENV' not in os.environ or os.environ['ENV'] != "production":
+if 'ENV' not in os.environ:
     DEBUG=True
+    print("PA job manager starting as non-production -- ENV not set! Please set ENV, I'd rather not guess what to do")
+elif os.environ['ENV'] != "production":
+    DEBUG=True
+    print(f"PA job manager starting as non-production -- ENV={os.environ['ENV']}")
 else:
+    print("PA job manager starting as Production")
     DEBUG=False

 # global list of override tables to allow enumeration over them ...
@@ -507,6 +513,12 @@ class PA_JobManager_FE_Message(Base):
         return "<id: {}, job_id: {}, level: {}, message: {}".format(self.id, self.job_id, self.level, self.message)


+##############################################################################
+# Class describing PA_UserState in the DB (via sqlalchemy)
+# the state for a User defines a series of remembered states for a user
+# to optimise their viewing, etc. If we scan and find new files, we need to
+# invalidate these cached values, so we have this class here just for that
+##############################################################################
 class PA_UserState(Base):
     __tablename__ = "pa_user_state"
     id = Column(Integer, Sequence('pa_user_state_id_seq'), primary_key=True )
@@ -537,6 +549,19 @@ class PA_UserState(Base):
         return f"<pa_user_dn: {self.pa_user_dn}, path_type: {self.path_type}, noo: {self.noo}, grouping: {self.grouping}, how_many: {self.how_many}, st_offset: {self.st_offset}, size: {self.size}, folders: {self.folders}, root: {self.root}, cwd: {self.cwd}, view_eid: {self.view_eid}, orig_ptype: {self.orig_ptype}, orig_search_term: {self.orig_search_term}, orig_url: {self.orig_url}, current={self.current}, first_eid={self.first_eid}, last_eid={self.last_eid}, num_entries={self.num_entries}>"


+##############################################################################
+# PAprint(): convenience function to prepend a timestamp to a printed string
+##############################################################################
+def PAprint( msg ):
+    now=datetime.now().strftime('%Y-%m-%d %H:%M:%S.%f')
+    if 'INFO:' in msg or 'WARN:' in msg:
+        msg = ' '+msg
+    if 'DEBUG:' in msg or 'ERROR:' in msg or 'WARN:' in msg or 'INFO:' in msg:
+        print( f"{now} {msg}" )
+    else:
+        print( f"{now} DEBUG: {msg}" )
+    return
+
 ##############################################################################
 # NewJob(): convenience function to create a job, appropriately
 ##############################################################################
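Usage of the PAprint() helper added above: known level prefixes are kept (INFO: and WARN: get one leading space so the five-character levels line up under the six-character DEBUG:/ERROR:), and anything unprefixed is tagged DEBUG:. Timestamps below are illustrative:

    PAprint("INFO: scanning for jobs")   # -> 2024-01-01 10:00:00.000000  INFO: scanning for jobs
    PAprint("no prefix here")            # -> 2024-01-01 10:00:00.000000 DEBUG: no prefix here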
@@ -576,7 +601,7 @@ def MessageToFE( job_id, message, level, persistent, cant_close ):
 def SettingsRBPath():
     settings = session.query(Settings).first()
     if settings == None:
-        print("ERROR: Cannot create file data with no settings / recycle bin path is missing")
+        PAprint("ERROR: Cannot create file data with no settings / recycle bin path is missing")
         return None
     # path setting is an absolute path, just use it, otherwise prepend base_path first
     if settings.recycle_bin_path[0] == '/':
@@ -614,7 +639,7 @@ def ProcessRecycleBinDir(job):
 def SettingsSPath():
     settings = session.query(Settings).first()
     if settings == None or settings.storage_path == "":
-        print("ERROR: Cannot create file data with no settings / storage path is missing")
+        PAprint("ERROR: Cannot create file data with no settings / storage path is missing")
         return None
     if settings.storage_path[0] == '/':
         path=settings.storage_path
@@ -641,10 +666,9 @@ def ProcessStorageDirs(parent_job):
 # SettingsIPath(): return import path (abs or add base_path if needed)
 ##############################################################################
 def SettingsIPath():
-    paths=[]
     settings = session.query(Settings).first()
     if not settings or settings.import_path == "":
-        print("ERROR: Cannot create file data with no settings / import path is missing")
+        PAprint("ERROR: Cannot create file data with no settings / import path is missing")
         return None
     if settings.import_path[0] == '/':
         path=settings.import_path
@@ -838,7 +862,7 @@ def JobMetadata(job):
 def AddLogForJob(job, message):
     now=datetime.now(pytz.utc)
     log=Joblog( job_id=job.id, log=message, log_date=now )
-    job.last_update=datetime.now(pytz.utc)
+    job.last_update=now
     session.add(log)
     # some logs have DEBUG: in front, so clean that up
     message = message.replace("DEBUG:", "" )
@@ -850,7 +874,7 @@ def AddLogForJob(job, message):
     else:
         job.last_commit = now
     if DEBUG:
-        print( f"DEBUG: {message}" )
+        PAprint( f"{message}" )
     return

 ##############################################################################
@@ -919,7 +943,7 @@ def FinishJob(job, last_log, state="Completed", pa_job_state="Completed", level=
     session.commit()
     MessageToFE( job_id=job.id, message=last_log, level=level, persistent=persistent, cant_close=cant_close )
     if DEBUG:
-        print( f"DEBUG: {last_log}" )
+        PAprint( f"DEBUG: {last_log}" )
     return

 ##############################################################################
@@ -930,13 +954,13 @@ def FinishJob(job, last_log, state="Completed", pa_job_state="Completed", level=
 ##############################################################################
 def HandleJobs(first_run=False):
     if first_run:
-        print("INFO: PA job manager is starting up - check for stale jobs" )
+        PAprint("INFO: PA job manager is starting up - check for stale jobs" )
     else:
         if DEBUG:
-            print("DEBUG: PA job manager is scanning for new jobs to process")
+            PAprint("DEBUG: PA job manager is scanning for new jobs to process")
     for job in session.query(Job).filter(Job.pa_job_state != 'Complete').all():
         if first_run and job.pa_job_state == 'In Progress':
-            print( f"INFO: Found stale job#{job.id} - {job.name}" )
+            PAprint( f"INFO: Found stale job#{job.id} - {job.name}" )
             job.pa_job_state = 'Stale'
             session.add(job)
             AddLogForJob( job, "ERROR: Job has been marked stale as it did not complete" )
@@ -948,17 +972,17 @@ def HandleJobs(first_run=False):
             j2 = session.query(Job).get(job.wait_for)
             if not j2:
                 AddLogForJob( job, f"ERROR: waiting for a job#({job.wait_for}) that does not exist? ")
-                print(f"ERROR: job.wait_for ({job.wait_for}) does not exist in below? " )
+                PAprint(f"ERROR: job.wait_for ({job.wait_for}) does not exist in below? " )
                 for j in session.query(Job).all():
-                    print(f"ERROR: j={j.id}")
+                    PAprint(f"ERROR: j={j.id}")
                 continue
             if j2.pa_job_state != 'Completed':
                 continue

         # use this to remove threads for easier debugging, and errors will stacktrace to the console
         if DEBUG:
-            print("*************************************")
-            print("RUNNING job: id={} name={} wait_for={}".format(job.id, job.name, job.wait_for ))
+            PAprint("*************************************")
+            PAprint("RUNNING job: id={} name={} wait_for={}".format(job.id, job.name, job.wait_for ))
             RunJob(job)
         else:
             try:
@@ -968,8 +992,8 @@ def HandleJobs(first_run=False):
                 try:
                     MessageToFE( job_id=job.id, level="danger", message="Failed with: {} (try job log for details)".format(e), persistent=True, cant_close=False )
                 except Exception as e2:
-                    print("ERROR: Failed to let front-end know, but back-end Failed to run job (id: {}, name: {} -- orig exep was: {}, this exception was: {})".format( job.id, job.name, e, e2) )
+                    PAprint("ERROR: Failed to let front-end know, but back-end Failed to run job (id: {}, name: {} -- orig exception was: {}, this exception was: {})".format( job.id, job.name, e, e2) )
-    print("INFO: PA job manager is waiting for a job")
+    PAprint("INFO: PA job manager is waiting for a job")
     return

 ##############################################################################
@@ -1132,12 +1156,12 @@ def JobForceScan(job):
 def CreateSymlink(job,ptype,path):
     path_type = session.query(PathType).get(ptype)
     symlink=SymlinkName(path_type.name, path, path)
-    if not os.path.exists(symlink):
-        print( f"INFO: symlink does not exist, actually creating it -- s={symlink}" )
     try:
+        if not os.path.exists(symlink):
+            PAprint( f"INFO: symlink does not exist, actually creating it -- s={symlink}" )
             # SAFE: SafePath() on init forces symlink to be safe
-            os.makedirs( os.path.dirname(symlink), mode=0o777, exist_ok=True )
-            os.symlink(path, symlink)
+            r=os.makedirs( os.path.dirname(symlink), mode=0o777, exist_ok=True )
+            r=os.symlink(path, symlink)
     except Exception as e:
         AddLogForJob( job, f"ERROR: Failed to create symlink - tried to link {symlink} -> {path}: {e}")
     return symlink
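Moving the existence check inside the try is the point of that reorder: a link or directory that appears between the check and the create now lands in the except/log path instead of crashing the job. A standalone sketch of the same shape (function name assumed; the islink() test is an extra guard for broken links, which exists() reports as missing):

    import os

    def ensure_symlink(target, link):
        # assumes link includes a directory component
        try:
            if not os.path.islink(link) and not os.path.exists(link):
                os.makedirs(os.path.dirname(link), mode=0o777, exist_ok=True)
                os.symlink(target, link)
        except OSError as e:
            print(f"ERROR: failed to link {link} -> {target}: {e}")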
@@ -1251,25 +1275,25 @@ def RemoveEmptyDirFromDB( job, del_me ):
 ####################################################################################################################################
 def CleanUpDirInDB(job, e):
     session.commit()
-    print( f"CleanUpDirInDB(): checking dir: {e.FullPathOnFS()} ({e.id})" )
+    PAprint( f"CleanUpDirInDB(): checking dir: {e.FullPathOnFS()} ({e.id})" )
     content = session.query(Entry).join(EntryDirLink).filter(EntryDirLink.dir_eid==e.id).first()
     if not content:
-        print( f"  Dir {e.FullPathOnFS()} - {e.id} is empty - removing it" )
-        print( f"  Entry {e}" )
+        PAprint( f"  Dir {e.FullPathOnFS()} - {e.id} is empty - removing it" )
+        PAprint( f"  Entry {e}" )
         # if no in_dir, we are at the root of the path, STOP
         if not e.in_dir:
-            print( "  Parent is empty, so NEVER delete this entry, returning" )
+            PAprint( "  Parent is empty, so NEVER delete this entry, returning" )
            return
         # okay remove this empty dir
         RemoveEmtpyDirFromFS( job, e )
         RemoveEmptyDirFromDB( job, e )
         # get an Entry from DB (in_dir is a Dir/we need the ORM entry for code to work)
         parent_dir = session.query(Entry).get(e.in_dir.eid)
-        print( f"  Dir {e.FullPathOnFS()} is in {parent_dir.FullPathOnFS()} ({parent_dir.id}) -> check next" )
+        PAprint( f"  Dir {e.FullPathOnFS()} is in {parent_dir.FullPathOnFS()} ({parent_dir.id}) -> check next" )
         # check to see if removing the empty dir has left the parent dir empty
         CleanUpDirInDB(job, parent_dir)
     else:
-        print( f"There is content (first entry: {content.name}) in {e.FullPathOnFS()} - finished for this dir" )
+        PAprint( f"There is content (first entry: {content.name}) in {e.FullPathOnFS()} - finished for this dir" )
     return

 ####################################################################################################################################
@@ -1359,7 +1383,7 @@ def MoveFileToRecycleBin(job,del_me):
         # SAFE: as SafePaths(rbpath) combined with data I control in this func (explicit remove of 'static/' + DB entry path)
         os.replace( src, dst )
         if DEBUG:
-            print( f"MoveFileToRecycleBin({job.id},{del_me.name}): os.replace {src} with {dst} " )
+            PAprint( f"MoveFileToRecycleBin({job.id},{del_me.name}): os.replace {src} with {dst} " )
     except Exception as e:
         AddLogForJob( job, f"ERROR: Failed to remove file from filesystem - which={src}, err: {e}")
@@ -1413,7 +1437,7 @@ def MoveFileToRecycleBin(job,del_me):
 ####################################################################################################################################
 def MoveEntriesToOtherFolder(job, move_me, dst_storage_path, dst_rel_path):
     if DEBUG:
-        print( f"DEBUG: MoveEntriesToOtherFolder( job={job.id}, move_me={move_me.name}, dst_storage_path={dst_storage_path.id}, dst_rel_path={dst_rel_path})")
+        PAprint( f"DEBUG: MoveEntriesToOtherFolder( job={job.id}, move_me={move_me.name}, dst_storage_path={dst_storage_path.id}, dst_rel_path={dst_rel_path})")
     orig_name=move_me.name
     orig_fs_pos=move_me.FullPathOnFS()
@@ -1513,16 +1537,16 @@ def CreateFSLocation( job, dst_path, dst_locn ):
 ####################################################################################################################################
 def ResetAnySubdirPaths( moving_dir, dst_storage_path, parent_rel_path ):
     if DEBUG:
-        print( f"ResetAnySubdirPaths( {moving_dir.name}, {dst_storage_path.path_prefix}, {parent_rel_path} )" )
+        PAprint( f"ResetAnySubdirPaths( {moving_dir.name}, {dst_storage_path.path_prefix}, {parent_rel_path} )" )
     sub_dirs = session.query(Entry).join(FileType).join(EntryDirLink).filter(EntryDirLink.dir_eid==moving_dir.id).filter(FileType.name=='Directory').all()
     for sub in sub_dirs:
         if DEBUG:
-            print( f"ResetAnySubdirPaths: WAS sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
+            PAprint( f"ResetAnySubdirPaths: WAS sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
         sub.in_path = dst_storage_path
         sub.dir_details.in_path = dst_storage_path
         sub.dir_details.rel_path = parent_rel_path + '/' + sub.name
         if DEBUG:
-            print( f"ResetAnySubdirPaths: NOW sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
+            PAprint( f"ResetAnySubdirPaths: NOW sub={sub.name}, ip={sub.in_dir.in_path.path_prefix}, rp={sub.dir_details.rel_path}" )
         ResetAnySubdirPaths( sub, dst_storage_path, sub.dir_details.rel_path )
     return
@@ -1668,7 +1692,7 @@ def JobImportDir(job):
     ptype = session.query(PathType).get(path_type)
     AddLogForJob(job, f"Checking {ptype.name} Directory: {path}" )
     if DEBUG:
-        print( f"DEBUG: Checking Directory: {path}" )
+        PAprint( f"DEBUG: Checking Directory: {path}" )
     if not os.path.exists( path ):
         WithdrawDependantJobs( job, job.id, "scan job found no new files to process" )
         FinishJob( job, f"Finished Importing: {path} -- Path does not exist", "Failed" )
@@ -1739,7 +1763,7 @@ def JobImportDir(job):
     # use ctime as even a metadata change (mv'd file on the fs, or a perms change) needs to be checked
     if stat.st_ctime > dir.last_import_date:
         if DEBUG:
-            print("DEBUG: {} - {} is newer than {}".format( basename, stat.st_ctime, dir.last_import_date ) )
+            PAprint("DEBUG: {} - {} is newer than {}".format( basename, stat.st_ctime, dir.last_import_date ) )
         if isImage(fname):
             type_str = 'Image'
         elif isVideo(fname):
@@ -1753,7 +1777,7 @@ def JobImportDir(job):
         found_new_files += 1
     else:
         if DEBUG:
-            print( f"DEBUG: { basename} - {stat.st_ctime} is OLDER than {dir.last_import_date}" )
+            PAprint( f"DEBUG: { basename} - {stat.st_ctime} is OLDER than {dir.last_import_date}" )
         e=session.query(Entry).join(EntryDirLink).join(Dir).filter(Entry.name==basename,Dir.eid==dir.eid).first()
         e.exists_on_fs=True
         job.current_file=basename
@@ -1900,26 +1924,29 @@ def JobTransformImage(job):
     id=[jex.value for jex in job.extra if jex.name == "id"][0]
     amt=[jex.value for jex in job.extra if jex.name == "amt"][0]
     e=session.query(Entry).join(File).filter(Entry.id==id).first()
-    print( f"JobTransformImage: job={job.id}, id={id}, amt={amt}" )
-    im = Image.open( e.FullPathOnFS() )
+    PAprint( f"JobTransformImage: job={job.id}, id={id}, amt={amt}" )

     if amt == "fliph":
         AddLogForJob(job, f"INFO: Flipping {e.FullPathOnFS()} horizontally" )
+        im = Image.open( e.FullPathOnFS() )
         out = im.transpose(Image.FLIP_LEFT_RIGHT)
+        out.save( e.FullPathOnFS() )
     elif amt == "flipv":
         AddLogForJob(job, f"INFO: Flipping {e.FullPathOnFS()} vertically" )
+        im = Image.open( e.FullPathOnFS() )
         out = im.transpose(Image.FLIP_TOP_BOTTOM)
+        out.save( e.FullPathOnFS() )
     else:
         AddLogForJob(job, f"INFO: Rotating {e.FullPathOnFS()} by {amt} degrees" )
-        if im.format == 'JPEG':
-            im=ImageOps.exif_transpose(im)
-        out = im.rotate(int(amt), expand=True)
-        out.save( e.FullPathOnFS() )
-    print( f"JobTransformImage DONE transform: job={job.id}, id={id}, amt={amt}" )
+        p = subprocess.run([PA_EXIF_ROTATER, e.FullPathOnFS(), amt ], capture_output=True, text=True )
+        if p.stderr:
+            PAprint(p.stderr)
+        else:
+            PAprint(p.stdout)
     settings = session.query(Settings).first()
     e.file_details.thumbnail, _ , _ = GenThumb( e.FullPathOnFS(), settings.auto_rotate )
     e.file_details.hash = md5( job, e )
-    print( f"JobTransformImage DONE thumb: job={job.id}, id={id}, amt={amt}" )
+    PAprint( f"JobTransformImage DONE thumb: job={job.id}, id={id}, amt={amt}" )
     session.add(e)
     FinishJob(job, "Finished Processing image rotation/flip")
     return
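The rotate path now defers to an external EXIF-aware tool instead of PIL, so the pixel data and the Orientation tag cannot end up disagreeing. A sketch of the same subprocess pattern; the tool's CLI (path, then degrees) follows the call above, the returncode check is an extra the diff does not do, and the function name is assumed:

    import subprocess

    def rotate_via_tool(tool, path, degrees):
        p = subprocess.run([tool, path, str(degrees)], capture_output=True, text=True)
        if p.returncode != 0 or p.stderr:   # the diff checks stderr only
            print(p.stderr or f"exit status {p.returncode}")
            return False
        print(p.stdout)
        return True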
@@ -1940,7 +1967,7 @@ def GenHashAndThumb(job, e):
     # use mtime as only if the content is different do we need to redo the hash
     if stat.st_mtime < e.file_details.last_hash_date:
         if DEBUG:
-            print(f"OPTIM: GenHashAndThumb {e.name} file is older than last hash, skip this")
+            PAprint(f"OPTIM: GenHashAndThumb {e.name} file is older than last hash, skip this")
         job.current_file_num+=1
         return
@@ -1948,7 +1975,7 @@ def GenHashAndThumb(job, e):
     # same hash and we already have a thumbnail -> just return
     if new_hash == e.file_details.hash and e.file_details.thumbnail:
         if DEBUG:
-            print(f"OPTIM: GenHashAndThumb {e.name} md5 is same - likely a mv on filesystem so skip md5/thumb")
+            PAprint(f"OPTIM: GenHashAndThumb {e.name} md5 is same - likely a mv on filesystem so skip md5/thumb")
         job.current_file_num+=1
         return
     e.file_details.hash = new_hash
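Both skips above hinge on the same cheap test: only rehash when the file's content could have changed since the recorded hash time. As a sketch (field semantics per the diff, function name assumed):

    import os

    def needs_rehash(path, last_hash_date):
        # last_hash_date is a POSIX timestamp, like e.file_details.last_hash_date
        return os.stat(path).st_mtime >= last_hash_date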
@@ -1966,7 +1993,7 @@ def GenHashAndThumb(job, e):
 ####################################################################################################################################
 def ProcessFilesInDir(job, e, file_func, count_dirs):
     if DEBUG:
-        print( f"DEBUG: ProcessFilesInDir: {e.FullPathOnFS()}")
+        PAprint( f"DEBUG: ProcessFilesInDir: {e.FullPathOnFS()}")
     if e.type.name != 'Directory':
         file_func(job, e)
     else:
@@ -1986,7 +2013,7 @@ def JobGetFileDetails(job):
     path=[jex.value for jex in job.extra if jex.name == "path"][0]
     path_prefix=[jex.value for jex in job.extra if jex.name == "path_prefix"][0]
     if DEBUG:
-        print( f"DEBUG: JobGetFileDetails for path={path_prefix}" )
+        PAprint( f"DEBUG: JobGetFileDetails for path={path_prefix}" )
     p=session.query(Path).filter(Path.path_prefix==path_prefix).first()
     job.current_file_num = 0
     job.num_files = p.num_files
@@ -2094,7 +2121,7 @@ def JobCheckForDups(job):
     AddLogForJob( job, f"Check for duplicates" )
     ClearOtherDupMessagesAndJobs()

-    res = session.execute( "select count(e1.id) from entry e1, file f1, dir d1, entry_dir_link edl1, path_dir_link pdl1, path p1, entry e2, file f2, dir d2, entry_dir_link edl2, path_dir_link pdl2, path p2 where e1.id = f1.eid and e2.id = f2.eid and d1.eid = edl1.dir_eid and edl1.entry_id = e1.id and edl2.dir_eid = d2.eid and edl2.entry_id = e2.id and p1.type_id != (select id from path_type where name = 'Bin') and p1.id = pdl1.path_id and pdl1.dir_eid = d1.eid and p2.type_id != (select id from path_type where name = 'Bin') and p2.id = pdl2.path_id and pdl2.dir_eid = d2.eid and f1.hash = f2.hash and e1.id != e2.id and f1.size_mb = f2.size_mb" )
+    res = session.execute( text( "select count(e1.id) from entry e1, file f1, dir d1, entry_dir_link edl1, path_dir_link pdl1, path p1, entry e2, file f2, dir d2, entry_dir_link edl2, path_dir_link pdl2, path p2 where e1.id = f1.eid and e2.id = f2.eid and d1.eid = edl1.dir_eid and edl1.entry_id = e1.id and edl2.dir_eid = d2.eid and edl2.entry_id = e2.id and p1.type_id != (select id from path_type where name = 'Bin') and p1.id = pdl1.path_id and pdl1.dir_eid = d1.eid and p2.type_id != (select id from path_type where name = 'Bin') and p2.id = pdl2.path_id and pdl2.dir_eid = d2.eid and f1.hash = f2.hash and e1.id != e2.id and f1.size_mb = f2.size_mb") )
     for row in res:
         if row.count > 0:
             AddLogForJob(job, f"Found duplicates, Creating Status message in front-end for attention")
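The text() wrapper is not cosmetic: since SQLAlchemy 1.4, passing a bare string to Session.execute() is deprecated (and rejected outright in 2.0), which is why every raw SQL call in this file gets the same treatment. A minimal self-contained demo:

    from sqlalchemy import create_engine, text
    from sqlalchemy.orm import Session

    engine = create_engine("sqlite://")
    with Session(engine) as s:
        row = s.execute(text("select 1 + 1 as total")).first()
        assert row.total == 2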
@@ -2242,18 +2269,19 @@ def JobRestoreFiles(job):
 def CopyOverrides():
     try:
         for tbl in override_tbls:
-            session.execute( f"select * into tmp_{tbl} from {tbl}")
+            session.execute( text( f"select * into tmp_{tbl} from {tbl}") )
         # force a commit here - I want to fail before I delete override content
         session.commit()
         # now take all 4 override tables in DB and clear them out
         for tbl in override_tbls:
-            session.execute( f"delete from {tbl}" )
+            session.execute( text( f"delete from {tbl}" ) )
         session.commit()
     except Exception as ex:
-        print( f"ERROR: there are existing tmp tables when processing metadata. This SHOULD NEVER HAPPEN - manual intervention needed" )
-        print( f"ERROR: most likely the job manager was killed during processing metadata - you may want to manually put" )
-        print( f"ERROR: the contents of the 'tmp_*' tables back into their corresponding official metadata tables " )
-        print( f"ERROR: and try to restart the job manager" )
+        PAprint( f"ERROR: there are existing tmp tables when processing metadata. This SHOULD NEVER HAPPEN - manual intervention needed" )
+        PAprint( f"ERROR: most likely the job manager was killed during processing metadata - you may want to manually put" )
+        PAprint( f"ERROR: the contents of the 'tmp_*' tables back into their corresponding official metadata tables " )
+        PAprint( f"ERROR: and try to restart the job manager" )
+        PAprint( f"ERROR: orig ex: {ex}" )
         exit( 1 )
     return
@@ -2267,8 +2295,8 @@ def GetFaceInMetadata(fname):
         face_data=file_h.read(-1)
         file_h.close()
     except Exception as ex:
-        print( f"ERROR: FATAL tried to read in override data and cant read content" )
-        print( f"ERROR: manual intervention needed - exc={ex}" )
+        PAprint( f"ERROR: FATAL tried to read in override data and cant read content" )
+        PAprint( f"ERROR: manual intervention needed - exc={ex}" )
         exit(1)
     return face_data
@@ -2294,13 +2322,13 @@ def ReloadMetadata(job):
     fnames = glob.glob( f'{mpath}/no_match_overrides/*' )
     for fname in fnames:
         # type derived from fname (e.g. 0_Too Young_md5*, 1_Too Young, 2_Ignore Face, etc.)
-        match=re.search( '(\d+)_([^_\.]+)', fname )
+        match=re.search( r'(\d+)_([^_\.]+)', fname )
         face_id=match.group(1)
         type_name=match.group(2)
         otype = session.query(FaceOverrideType).filter(FaceOverrideType.name==type_name).one()
         face_data=GetFaceInMetadata(fname)
         if DEBUG:
-            print( f"Found metadata showing Override of type: {type_name}" )
+            PAprint( f"DEBUG: Found metadata showing Override of type: {type_name}" )

         # check that both the id and data match - if so make new FaceNoMatch otherwise Disco*FaceNoMatch
         face=session.query( Face ).filter( Face.id==face_id ).filter( Face.face == face_data ). first()
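The r-prefix matters here: '\d' in a plain string literal is an invalid escape sequence (a DeprecationWarning today, slated to become an error in future Pythons), which is why the patterns in this file are being converted to raw strings. Checking the pattern itself:

    import re

    m = re.search(r'(\d+)_([^_\.]+)', "12_Too Young_md5abc")
    assert m.group(1) == "12" and m.group(2) == "Too Young"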
@@ -2313,22 +2341,22 @@ def ReloadMetadata(job):
             # SAFE: as SafePaths(mpath) combined with data I control in this func
             os.replace( fname, f'{mpath}no_match_overrides/0_{otype.name}_{md5face(face_data)}' )
         except Exception as ex:
-            print( f"ERROR: renaming no-match metadata on filesystem failed: {ex}" )
+            PAprint( f"ERROR: renaming no-match metadata on filesystem failed: {ex}" )

     # process Metadata on FS for force_match_overrides (disco ones, will have 0 as face_id)
     fnames = glob.glob( f'{mpath}force_match_overrides/*' )
     for fname in fnames:
         # person derived from fname (e.g. 0_ddp_md5*, 1_ddp, 2_mich, etc.)
-        match=re.search( '(\d+)_([^_]+)', fname )
+        match=re.search( r'(\d+)_([^_]+)', fname )
         face_id=match.group(1)
         person_tag=match.group(2)
         p = session.query(Person).filter(Person.tag==person_tag).first()
         if not p:
-            print( f"There is a metadata override on the file system for person: {person_tag} - but they are no longer in the DB - skip" )
+            PAprint( f"INFO: There is a metadata override on the file system for person: {person_tag} - but they are no longer in the DB - skip" )
             continue
         face_data=GetFaceInMetadata(fname)
         if DEBUG:
-            print( f"Found metadata showing Override match for person: {person_tag}" )
+            PAprint( f"DEBUG: Found metadata showing Override match for person: {person_tag}" )

         # check that both the id and data match - if so make new FaceNoMatch otherwise Disco*FaceNoMatch
         face=session.query( Face ).filter( Face.id==face_id ).filter( Face.face == face_data ).first()
@@ -2342,43 +2370,46 @@ def ReloadMetadata(job):
|
|||||||
# SAFE: as SafePaths(mpath) combined with data I control in this func
|
# SAFE: as SafePaths(mpath) combined with data I control in this func
|
||||||
os.replace( fname, f'{mpath}force_match_overrides/0_{p.tag}_{md5face(face_data)}' )
|
os.replace( fname, f'{mpath}force_match_overrides/0_{p.tag}_{md5face(face_data)}' )
|
||||||
 except Exception as ex:
-print( f"ERROR: renaming force-match metadata on filesystem failed: {ex}" )
+PAprint( f"ERROR: renaming force-match metadata on filesystem failed: {ex}" )
 
 # now process each of the tmp tables for anything that was in the DB but not on FS (e.g rm'd metadata)
-overrides=session.execute( "select face_id, type_id from tmp_face_no_match_override" )
+overrides=session.execute( text( "select face_id, type_id from tmp_face_no_match_override" ) )
 for o in overrides:
-print( f"F Force Match: o.face_id={o.face_id}" )
+PAprint( f"F Force Match: o.face_id={o.face_id}" )
 print( f"F No Match: o.type_id={o.type_id}" )
 nmo=session.query(FaceNoMatchOverride).filter(FaceNoMatchOverride.face_id==o.face_id).filter(FaceNoMatchOverride.type_id==o.type_id).first()
 if not nmo:
 session.add( FaceNoMatchOverride( face_id=o.face_id, type_id=o.type_id ) )
 
-overrides=session.execute( "select face_id, person_id from tmp_face_force_match_override" )
+overrides=session.execute( text( "select face_id, person_id from tmp_face_force_match_override" ) )
 for o in overrides:
-print( f"F Force Match: o.face_id={o.face_id}" )
-print( f"F Force Match: o.person_id={o.person_id}" )
+if DEBUG:
+PAprint( f"F Force Match: o.face_id={o.face_id}" )
+PAprint( f"F Force Match: o.person_id={o.person_id}" )
 fmo=session.query(FaceForceMatchOverride).filter(FaceForceMatchOverride.face_id==o.face_id,FaceForceMatchOverride.person_id==o.person_id).first()
 if not fmo:
 session.add( FaceForceMatchOverride( face_id=o.face_id, person_id=o.person_id ) )
 
-overrides=session.execute( "select face, type_id from tmp_disconnected_no_match_override" )
+overrides=session.execute( text( "select face, type_id from tmp_disconnected_no_match_override" ) )
 for o in overrides:
-print( f"D No Match: o.type_id={o.type_id}" )
+if DEBUG:
+PAprint( f"D No Match: o.type_id={o.type_id}" )
 dnmo=session.query(DisconnectedNoMatchOverride).filter(DisconnectedNoMatchOverride.face==o.face).filter(DisconnectedNoMatchOverride.type_id==o.type_id).first()
 if not dnmo:
 session.add( DisconnectedNoMatchOverride( face=o.face, type_id=o.type_id ) )
 
-overrides=session.execute( "select face, person_id from tmp_disconnected_force_match_override" )
+overrides=session.execute( text( "select face, person_id from tmp_disconnected_force_match_override" ) )
 for o in overrides:
-print( f"D Force Match: o.person_id={o.person_id}" )
+if DEBUG:
+PAprint( f"D Force Match: o.person_id={o.person_id}" )
 dfmo=session.query(DisconnectedForceMatchOverride).filter(DisconnectedForceMatchOverride.face==o.face).filter(DisconnectedForceMatchOverride.person_id==o.person_id).first()
 if not dfmo:
 session.add( DisconnectedForceMatchOverride( face=o.face, person_id=o.person_id ) )
 
 # finally, drop the tmp tables
 for tbl in override_tbls:
-session.execute( f"drop table tmp_{tbl}" )
+session.execute( text( f"drop table tmp_{tbl}" ) )
 
 # ok, finally commit all these changes - dont do this until now. Worst case if we crash/fail, the overrides should continue to be in tmp_{tbl}
 session.commit()
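The repeated `text()` wrappers above are the SQLAlchemy 2.0 migration (the same diff later pins `sqlalchemy>=2.0.36`): 2.0 no longer coerces plain strings into SQL, so every raw statement must go through `sqlalchemy.text()`. A minimal runnable sketch of the pattern, reusing the dev DB URL and one of the tmp tables from this diff:

```python
# Minimal sketch of the session.execute(text(...)) pattern used above.
# The URL is the dev one from shared.py; the tmp table is created earlier
# in pa_job_manager.py (not shown here).
from sqlalchemy import create_engine, text
from sqlalchemy.orm import Session

engine = create_engine("postgresql+psycopg2://pa:for_now_pa@localhost:5432/pa")

with Session(engine) as session:
    # session.execute("select ...") worked in 1.x but raises in 2.0
    overrides = session.execute(
        text("select face_id, type_id from tmp_face_no_match_override")
    )
    for o in overrides:
        print(o.face_id, o.type_id)  # Row objects still expose column attributes
```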
@@ -2392,7 +2423,6 @@ def ReloadMetadata(job):
 # If there is content in the Bin already, its logs this - mostly useful when testing)
 ####################################################################################################################################
 def InitialValidationChecks():
-now=datetime.now(pytz.utc)
 job=NewJob( name="init", num_files=0, wait_for=None, jex=None, parent_job=None, desc="initialise photo assistant" )
 job.start_time=datetime.now(pytz.utc)
 JobProgressState( job, "In Progress" )
@@ -2405,14 +2435,14 @@ def InitialValidationChecks():
 if len(dirs) + len(files) > 0:
 AddLogForJob(job, "INFO: the bin path contains content, cannot process to know where original deletes were form - skipping content!" )
 AddLogForJob(job, "TODO: could be smart about what is known in the DB vs on the FS, and change below to an ERROR if it is one")
-AddLogForJob(job, "WARNING: IF the files in the bin are in the DB (succeeded from GUI deletes) then this is okay, otherwise you should delete contents form the recycle bin and restart the job manager)" )
+AddLogForJob(job, "WARN: IF the files in the bin are in the DB (succeeded from GUI deletes) then this is okay, otherwise you should delete contents form the recycle bin and restart the job manager)" )
 # create symlink and Path/Dir if needed
 ProcessRecycleBinDir(job)
 rbp_exists=1
 except Exception as ex:
-print( f"FATAL ERROR: Failed to walk the recycle bin at {path} Err:{ex}" )
+PAprint( f"FATAL ERROR: Failed to walk the recycle bin at {path} Err:{ex}" )
 else:
-AddLogForJob(job, "ERROR: The bin path in settings does not exist - Please fix now");
+AddLogForJob(job, f"ERROR: The bin path {path} in settings does not exist - Please fix now");
 sp_exists=0
 path = SettingsSPath()
 if os.path.exists(path):
@@ -2481,7 +2511,7 @@ def AddFaceToFile( locn_data, face_data, file_eid, model_id, settings ):
 # SAFE: as SafePaths(mpath) combined with data I control in this func
 os.replace( fname, new_fname )
 except Exception as ex:
-print( f"ERROR: AddFaceToFile-face connects to 'disconnected-force-match' metadata, but fixing the filesystem metadata failed: {ex}" )
+PAprint( f"ERROR: AddFaceToFile-face connects to 'disconnected-force-match' metadata, but fixing the filesystem metadata failed: {ex}" )
 
 dnmo=session.query(DisconnectedNoMatchOverride).filter(DisconnectedNoMatchOverride.face==face.face).first()
 if dnmo:
@@ -2497,7 +2527,7 @@ def AddFaceToFile( locn_data, face_data, file_eid, model_id, settings ):
 # SAFE: as SafePaths(mpath) combined with data I control in this func
 os.replace( fname, new_fname )
 except Exception as ex:
-print( f"ERROR: AddFaceToFile-face connects to 'disconnected-no-match' metadata, but fixing the filesystem metadata failed: {ex}" )
+PAprint( f"ERROR: AddFaceToFile-face connects to 'disconnected-no-match' metadata, but fixing the filesystem metadata failed: {ex}" )
 
 return
 
@@ -2517,7 +2547,7 @@ def DelMatchesForFile( job, ent ):
 if DEBUG:
 AddLogForJob(job, f'Remove any old matches in {ent.name}')
 
-session.execute( f"delete from face_refimg_link where face_id in (select face_id from face_file_link where file_eid = {ent.id})" )
+session.execute( text( f"delete from face_refimg_link where face_id in (select face_id from face_file_link where file_eid = {ent.id})" ) )
 ent.file_details.last_ai_scan=0
 session.add(ent)
 return
@@ -2539,7 +2569,7 @@ def DelFacesForFile( job, eid ):
 if o:
 DisconnectSingleNoMatchOverride( job, o )
 
-session.execute( f"delete from face where id in (select face_id from face_file_link where file_eid = {eid})" )
+session.execute( text( f"delete from face where id in (select face_id from face_file_link where file_eid = {eid})" ) )
 
 session.commit()
 return
@@ -2697,7 +2727,7 @@ def CheckAndRunBinClean():
 
 now=datetime.now(pytz.utc)
 if not j or (now-j.last_update).days >= settings.scheduled_bin_cleanup:
-print( f"INFO: Should force clean up bin path, del files older than {settings.bin_cleanup_file_age} days old" )
+PAprint( f"INFO: Should force clean up bin path, del files older than {settings.bin_cleanup_file_age} days old" )
 NewJob( name="clean_bin", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean up on Bin path" )
 created_jobs=True
 return created_jobs
@@ -2707,7 +2737,7 @@ def CheckAndRunBinClean():
 # based jobs that should run (e.g. last scanned a path X day(s) ago, then scan now), etc. X is defined in settings
 ####################################################################################################################################
 def ScheduledJobs():
-print("DEBUG: Time to check for any scheduled jobs needing to be run" )
+PAprint("DEBUG: Time to check for any scheduled jobs needing to be run" )
 
 created_jobs=False
 
@@ -2717,11 +2747,11 @@ def ScheduledJobs():
 settings = session.query(Settings).first()
 now=datetime.now(pytz.utc)
 if ndays_since_last_im_scan >= settings.scheduled_import_scan:
-print( f"INFO: Time to force an import scan, last scan was {ndays_since_last_im_scan} days ago" )
+PAprint( f"INFO: Time to force an import scan, last scan was {ndays_since_last_im_scan} days ago" )
 NewJob( name="scan_ip", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean scan for new files in Import path" )
 created_jobs=True
 if ndays_since_last_st_scan >= settings.scheduled_storage_scan:
-print( f"INFO: Time to force a storage scan, last scan was {ndays_since_last_st_scan}" )
+PAprint( f"INFO: Time to force a storage scan, last scan was {ndays_since_last_st_scan}" )
 NewJob( name="scan_sp", num_files=0, wait_for=None, jex=None, parent_job=None, desc="periodic clean scan for new files in Storage path" )
 created_jobs=True
 if CheckAndRunBinClean():
@@ -2734,7 +2764,7 @@ def ScheduledJobs():
 # go into waiting on a socket to be woken up (and then if woken, back into HandleJobs()
 ####################################################################################################################################
 if __name__ == "__main__":
-print("INFO: PA job manager starting - listening on {}:{}".format( PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT) )
+PAprint("INFO: PA job manager starting - listening on {}:{}".format( PA_JOB_MANAGER_HOST, PA_JOB_MANAGER_PORT) )
 
 InitialValidationChecks()
 
@@ -2748,11 +2778,11 @@ if __name__ == "__main__":
 try:
 conn, addr = s.accept()
 if DEBUG:
-print( f"accept finished, tout={s.timeout}" )
+PAprint( f"accept finished, tout={s.timeout}" )
 
 except socket.timeout:
 if DEBUG:
-print( f"timeout occurred, tout={s.timeout}" )
+PAprint( f"timeout occurred, tout={s.timeout}" )
 if ScheduledJobs():
 HandleJobs(False)
 continue

@@ -87,14 +87,14 @@ def AddRefimgToPerson( filename, person ):
 #False == dont autorotate, its not needed on this image
 refimg.thumbnail, refimg.orig_w, refimg.orig_h = GenThumb( filename, False )
 settings = Settings.query.first()
-model=AIModel.query.get(settings.default_refimg_model)
+model=db.session.get(AIModel,settings.default_refimg_model)
 refimg.face, face_locn = GenFace( filename, model=model.name )
 try:
 os.remove(filename)
 except Exception as e:
 # can fail "silently" here, if the face_locn worked, great, its only
 # a tmp file in /tmp - if not, the next if will send a msg to the front-end
-SetFELog( message=f"Failed to delete tmp file for refimg addition: {e}", log_level="danger", persistent=True, cant_close=True )
+SetFELog( message=f"Failed to delete tmp file for refimg addition: {e}", level="danger", persistent=True, cant_close=True )
 
 if not face_locn:
 SetFELog( f"<b>Failed to find face in Refimg:</b>", "danger" )
@@ -197,7 +197,7 @@ def person(id):
 
 if request.method == 'POST':
 try:
-person = Person.query.get(id)
+person = db.session.get(Person, id)
 if 'delete' in request.form:
 SetFELog( f"Successfully deleted Person: ({person.tag})" )
 
@@ -237,7 +237,7 @@ def person(id):
 SetFELog( f"<b>Failed to modify Person:</b> {e}", "danger" )
 return redirect( url_for( 'persons' ) )
 else:
-person = Person.query.get(id)
+person = db.session.get(Person, id)
 if not person:
 SetFELog( f"No such person with id: {id}", "danger" )
 return redirect("/")
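Both `AIModel.query.get(...)` and `Person.query.get(id)` become `db.session.get(Model, pk)` because `Query.get()` was removed in SQLAlchemy 2.0 (Flask-SQLAlchemy 3 follows suit). A self-contained sketch of the migration; the in-memory SQLite URL and the trimmed two-column model are stand-ins, not the project's schema:

```python
# Query.get() -> Session.get() migration sketch. The SQLite URL and the
# trimmed Person model are illustrative stand-ins only.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///:memory:"
db = SQLAlchemy(app)

class Person(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    tag = db.Column(db.String(64))

with app.app_context():
    db.create_all()
    # person = Person.query.get(1)      # 1.x API, removed in SQLAlchemy 2.0
    person = db.session.get(Person, 1)  # 2.0 replacement; returns None if absent
    print(person)
```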
@@ -1,7 +1,8 @@
+numpy==1.26.4
 flask
 flask_login
 flask-ldap3-login
-sqlalchemy
+sqlalchemy>=2.0.36
 flask-sqlalchemy
 SQLAlchemy-serializer
 marshmallow-sqlalchemy
@@ -19,3 +20,5 @@ Werkzeug
 flask-compress
 ffmpeg-python
 psycopg2
+python3-ldap
+gunicorn

@@ -137,7 +137,7 @@ def SettingsRBPath():
 settings = Settings.query.first()
 if settings == None:
 from job import SetFELog
-SetFELog( message="Cannot create file data with no settings / recycle bin path is missing", log_level="error", persistent=True, cant_close=False)
+SetFELog( message="Cannot create file data with no settings / recycle bin path is missing", level="danger", persistent=True, cant_close=False)
 return
 # path setting is an absolute path, just use it, otherwise prepend base_path first
 if settings.recycle_bin_path[0] == '/':
@@ -155,7 +155,7 @@ def SettingsSPath():
 settings = Settings.query.first()
 if settings == None:
 from job import SetFELog
-SetFELog( message="Cannot create file data with no settings / storage path is missing", log_level="error", persistent=True, cant_close=False)
+SetFELog( message="Cannot create file data with no settings / storage path is missing", level="danger", persistent=True, cant_close=False)
 return
 if settings.storage_path[0] == '/':
 path=settings.storage_path
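These two hunks are part of a blanket rename of SetFELog's keyword from `log_level=` to `level=`, with values switching from logging-style names to the Bootstrap alert classes ("danger", "warning") the front-end templates already use. A toy stand-in showing only the call shape; the real SetFELog lives in job.py and is not reproduced here:

```python
# Toy stand-in for the project's SetFELog, only to show the renamed keyword
# and Bootstrap-style level values; the real implementation is in job.py.
def SetFELog(message, level="info", persistent=False, cant_close=False):
    print(f"[{level}] persistent={persistent} cant_close={cant_close}: {message}")

SetFELog(
    message="Cannot create file data with no settings / storage path is missing",
    level="danger",  # was log_level="error"
    persistent=True,
    cant_close=False,
)
```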
shared.py
@@ -5,6 +5,7 @@ import io
 import base64
 import subprocess
 from PIL import Image, ImageOps
+import numpy as np
 
 class PA:
 def __repr__(self):
@@ -24,7 +25,6 @@ class PA:
 
 
 hostname = socket.gethostname()
-PROD_HOST="pa_web"
 
 # dict to store name of icon in icons.svg so we can use by referece in html
 ICON={}
@@ -40,19 +40,27 @@ OLDEST_LOG_LIMIT = 5
 if hostname == "lappy":
 PA_JOB_MANAGER_HOST="localhost"
 DB_URL = 'postgresql+psycopg2://pa:for_now_pa@localhost:5432/pa'
-PA_EXIF_ROTATER = './utils/pa_exifautotran'
+PA_EXIF_AUTOROTATE = './utils/pa_exifautotran'
+PA_EXIF_ROTATER = './utils/pa_rotate'
 # if we dont set the env or we are explicitly DEV, run web server on localhost & db on mara (port 65432)
-elif 'ENV' not in os.environ or os.environ['ENV'] == "development":
+elif 'ENV' not in os.environ or os.environ['ENV'] == "development" or os.environ['ENV'] == "container":
 PA_JOB_MANAGER_HOST="localhost"
-DB_URL = 'postgresql+psycopg2://pa:for_now_pa@mara.ddp.net:65432/pa'
-PA_EXIF_ROTATER = './utils/pa_exifautotran'
+# DB_URL = 'postgresql+psycopg2://pa:for_now_pa@mara.ddp.net:65432/pa'
+DB_URL = 'postgresql+psycopg2://pa:for_now_pa@padb_dev/pa'
+PA_EXIF_AUTOROTATE = './utils/pa_exifautotran'
+PA_EXIF_ROTATER = './utils/pa_rotate'
 # if we explicitly are on PROD, run web server on localhost (pa_web container) & db on mara (port 5432 on padb container)- only accessed via internal docker ports)
 elif os.environ['ENV'] == "production":
 PA_JOB_MANAGER_HOST="localhost"
 DB_URL = 'postgresql+psycopg2://pa:for_now_pa@padb/pa'
-PA_EXIF_ROTATER = '/code/utils/pa_exifautotran'
+PA_EXIF_AUTOROTATE = '/code/utils/pa_exifautotran'
+PA_EXIF_ROTATER = '/code/utils/pa_rotate'
 else:
 print( "ERROR: I do not know which environment (development, etc.) and which DB (on which host to use)" )
+if 'ENV' not in os.environ:
+print( f"ERROR: no ENV variable set in the environment" )
+else:
+print( f"ERROR: ENV is {os.environ['ENV']}" )
 exit( -1 )
 
 # PORT number we connect to the pa_job_manager on - by default it runs on the
@@ -132,7 +140,7 @@ def GenThumb(fname,auto_rotate):
 im_orig = Image.open(fname)
 if im_orig.format == 'JPEG':
 # run cmdline util to re-orient jpeg (only changes if needed, and does it losslessly)
-p = subprocess.run([PA_EXIF_ROTATER,fname] )
+p = subprocess.run([PA_EXIF_AUTOROTATE,fname] )
 im=Image.open(fname)
 else:
 im=im_orig
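GenThumb keeps shelling out to the same exifautotran wrapper, now under the clearer name PA_EXIF_AUTOROTATE (PA_EXIF_ROTATER is repurposed for the new pa_rotate script at the end of this diff). A minimal sketch of the re-orientation step, assuming the dev path from shared.py:

```python
# Sketch of GenThumb's lossless JPEG re-orientation step, assuming the dev
# path for the exifautotran wrapper from shared.py above.
import subprocess
from PIL import Image

PA_EXIF_AUTOROTATE = "./utils/pa_exifautotran"

def open_upright(fname):
    im = Image.open(fname)
    if im.format == "JPEG":
        # rewrites the file in place, losslessly, and only when the EXIF
        # orientation tag says rotation is needed
        subprocess.run([PA_EXIF_AUTOROTATE, fname])
        im = Image.open(fname)
    return im
```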
@@ -161,10 +169,12 @@ def GenThumb(fname,auto_rotate):
 # generate Face data (and location) - wrapper func of face_recognition library
 # used to store refimg data into the DB
 def GenFace(fname, model):
+try:
 img = face_recognition.load_image_file(fname)
 location = face_recognition.face_locations(img, model=model)
 encodings = face_recognition.face_encodings(img, known_face_locations=location)
 if len(encodings) and len(location):
 return encodings[0].tobytes(), location[0]
-else:
+except Exception as e:
+print( f"GenFace failed: {e}" )
 return None, None
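The GenFace change wraps the whole face_recognition pipeline in a try/except, so a library failure now degrades to the same `(None, None)` the no-face path already returned, and callers keep their single `if not face_locn:` guard. Restated as a self-contained sketch:

```python
# Self-contained restatement of the hardened GenFace: any exception from the
# face_recognition calls collapses to the (None, None) "no face" result.
import face_recognition

def gen_face(fname, model="hog"):
    try:
        img = face_recognition.load_image_file(fname)
        locations = face_recognition.face_locations(img, model=model)
        encodings = face_recognition.face_encodings(img, known_face_locations=locations)
        if encodings and locations:
            return encodings[0].tobytes(), locations[0]
    except Exception as e:
        print(f"GenFace failed: {e}")
    return None, None
```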
states.py
@@ -62,10 +62,9 @@ class States(PA):
 # this is any next/prev or noo, grouping, etc. change (so use referrer to work out what to do with this)
 # because this can happen on a view, or files_up, etc. change this FIRST
 if 'change_file_opts' in request.path:
-ref=request.referrer
 base=request.base_url
 base=base.replace("change_file_opts", "")
-self.url = "/"+ref.replace(base, "" )
+self.url = "/"+request.referrer.replace(base, "" )
 
 # if view_list, then we really are a view, and view_eid should be in the form
 if 'view_list' in request.path:
@@ -82,16 +81,19 @@ class States(PA):
 else:
 # GET's occur on redirect, and we don't have a form, so get it from pref
 st=self.url[8:]
-ref=request.referrer
-if 'search' in ref:
-st=re.sub( '.+/search/', '', ref )
+if request.referrer and 'search' in request.referrer:
+st=re.sub( '.+/search/', '', request.referrer )
 else:
 st=''
 pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.view_eid==self.view_eid,PA_UserState.orig_search_term==st).first()
 if not pref:
-SetFELog( message=f"ERROR: pref not found - DONT CLOSE THIS!!! dn={current_user.dn}, st={st}, s={self}????" , log_level="error", persistent=True, cant_close=True )
+SetFELog( message=f"ERROR: pref not found - dn={current_user.dn}, st={st}, s={self}????" , level="danger", persistent=True, cant_close=False )
+SetFELog( message=f"WARNING: I think this error occurred because you reloaded a page and the server had restarted between your original page load and this page reload, is that possible?" , level="warning", persistent=True, cant_close=False )
 redirect("/")
 else:
+if not hasattr( pref, 'orig_url' ):
+SetFELog( message=f"ERROR: orig_url not in pref - dn={current_user.dn}, st={st}, self={self}, pref={pref}????" , level="danger", persistent=True, cant_close=True )
+redirect("/")
 self.url = pref.orig_url
 
 if 'files_ip' in self.url or 'file_list_ip' in self.url:
@@ -114,7 +116,11 @@ class States(PA):
 self.path_type = 'Bin'
 elif 'search' in self.url:
 # okay if we are a search, but came from a view then get last_search_state form prefs and use it
-self.orig_search_term = self.url[8:]
+m=re.match( '.*search/(.+)$', self.url )
+if m == None:
+SetFELog( message=f"ERROR: DDP messed up, seems we are processing a search, but cant see the search term - is this even possible?" )
+return
+self.orig_search_term = m[1]
 if self.path_type == "View":
 self.orig_ptype = 'Search'
 self.orig_url = self.url
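The switch from the fixed slice `self.url[8:]` to `re.match('.*search/(.+)$', self.url)` means the search term is taken from wherever `search/` actually sits in the URL, and an unmatched URL is reported instead of being silently mis-sliced. A quick illustration:

```python
# Why the regex beats the fixed slice: the slice assumed the term always
# starts at character 8; the match finds it after "search/" wherever it is.
import re

for url in ("/search/holiday 2019", "/pa/search/mich", "/view/1234"):
    m = re.match(".*search/(.+)$", url)
    print(url, "->", m[1] if m else "no search term")
# /search/holiday 2019 -> holiday 2019
# /pa/search/mich -> mich
# /view/1234 -> no search term
```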
@@ -123,22 +129,29 @@ class States(PA):
 elif 'view' in self.url:
 # use url to get eid of viewed entry
 self.view_eid = self.url[6:]
-self.path_type="View"
+# force this to be a search so rest of code won't totally die, but also not return anything
+self.path_type="Search"
 self.orig_url=self.url
 elif 'change_file_opts' not in self.url:
-SetFELog( message=f"ERROR: DDP messed up, failed to match URL {self.url} for settings this will fail, redirecting to home" , log_level="error", persistent=True, cant_close=True )
-SetFELog( message=f"referrer={request.referrer}" , log_level="error",
-persistent=True, cant_close=True )
+SetFELog( message=f"ERROR: DDP messed up, failed to match URL {self.url} for settings this will fail, redirecting to home" , level="danger", persistent=True, cant_close=True )
+SetFELog( message=f"referrer={request.referrer}" , level="danger", persistent=True, cant_close=True )
 return
 
 if self.path_type == 'View':
 pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.view_eid==self.view_eid,PA_UserState.orig_search_term==self.orig_search_term).first()
+if not hasattr( self, 'orig_ptype' ):
+self.orig_ptype='View'
+self.orig_url=''
+SetFELog( message=f"ERROR: No orig ptype? s={self} - pref={pref}, redirecting to home" , level="danger", persistent=True, cant_close=True )
+SetFELog( message=f"referrer={request.referrer}" , level="danger", persistent=True, cant_close=True )
+redirect("/")
 
 # should find original path or search for this view (if not a search, search_term='')
 orig_pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.orig_ptype,PA_UserState.orig_search_term==self.orig_search_term).first()
 if not orig_pref:
-SetFELog( message=f"ERROR: DDP messed up 2, failed to find orig_pref for a view pt={self.path_type} for search={self.orig_search_term}" , log_level="error", persistent=True, cant_close=True )
-SetFELog( message=f"referrer={request.referrer}" , log_level="error",
-persistent=True, cant_close=True )
+SetFELog( message=f"ERROR: DDP messed up 2, failed to find orig_pref for a view pt={self.path_type} for search={self.orig_search_term}" , level="danger", persistent=True, cant_close=True )
+SetFELog( message=f"referrer={request.referrer}" , level="danger", persistent=True, cant_close=True )
 return
 elif self.path_type == 'Search':
 pref=PA_UserState.query.filter(PA_UserState.pa_user_dn==current_user.dn,PA_UserState.path_type==self.path_type,PA_UserState.orig_search_term==self.orig_search_term).first()
@@ -254,9 +267,8 @@ class States(PA):
 if (self.offset + int(self.how_many)) < self.num_entries:
 self.offset += int(self.how_many)
 else:
-# should be impossible now, but leave jic
-SetFELog( message=f"WARNING: next image requested, but would go past end of list? - ignore this" , log_level="warning", persistent=True, cant_close=True )
-SetFELog( message=f"DDP - offset={self.offset} + how_many={self.how_many} > num_entries={self.num_entries}" , log_level="error", persistent=True, cant_close=True )
+# tripping this still
+SetFELog( message=f"WARNING: next image requested, but would go past end of list? - ignore this" , level="warning", persistent=True, cant_close=False )
 if 'current' in request.form:
 self.current = int(request.form['current'])
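The last hunk trims the paging guard down to a single closable warning. The invariant it protects is simple: `offset` may only advance by `how_many` while the result stays inside `num_entries`. A tiny illustration with made-up values:

```python
# The bounds check behind the trimmed warning above; values are illustrative.
offset, how_many, num_entries = 150, "50", 180  # how_many arrives as a string

if (offset + int(how_many)) < num_entries:
    offset += int(how_many)  # safe to show the next page
else:
    print(f"next requested, but offset={offset} + how_many={how_many} "
          f"would pass num_entries={num_entries}")
```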

@@ -160,7 +160,7 @@ create table JOBEXTRA ( ID integer, JOB_ID integer, NAME varchar(32), VALUE varc
 create table JOBLOG ( ID integer, JOB_ID integer, LOG_DATE timestamptz, LOG varchar,
 constraint PK_JL_ID primary key(ID), constraint FK_JL_JOB_ID foreign key(JOB_ID) references JOB(ID) );
 
-create table PA_JOB_MANAGER_FE_MESSAGE ( ID integer, JOB_ID integer, LEVEL varchar(16), MESSAGE varchar(1024), PERSISTENT boolean, CANT_CLOSE boolean,
+create table PA_JOB_MANAGER_FE_MESSAGE ( ID integer, JOB_ID integer, LEVEL varchar(16), MESSAGE varchar(8192), PERSISTENT boolean, CANT_CLOSE boolean,
 constraint PA_JOB_MANAGER_FE_ACKS_ID primary key(ID),
 constraint FK_PA_JOB_MANAGER_FE_MESSAGE_JOB_ID foreign key(JOB_ID) references JOB(ID) );
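MESSAGE grows from varchar(1024) to varchar(8192) to fit the longer, multi-part front-end messages introduced in states.py. The DDL change only covers fresh installs; for an existing PostgreSQL database the equivalent would be an in-place alter, sketched here (this migration is not part of the diff, names come from the schema above):

```python
# Hedged sketch only: this migration is NOT in the diff. It shows how the
# widened column from the DDL above could be applied to a live PostgreSQL
# database (psycopg2 driver, table/column names from the schema file).
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://pa:for_now_pa@localhost:5432/pa")
with engine.begin() as conn:
    conn.execute(text(
        "alter table pa_job_manager_fe_message "
        "alter column message type varchar(8192)"
    ))
```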

@@ -4,10 +4,9 @@
 <div class="container">
 <div class="row px-3 my-5 offset-lg-3 col-lg-6 col-sm-12" style="border: 3px solid #5bc0de; border-radius: 15px;">
 <h3 class="my-3 text-center" style="color: #5bc0de">
-{# <svg width="64" height="64" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#pa_logo" /></svg> About Photo Assistant</h3> #}
-<svg width="64" height="64" fill="currentColor"><use xlink:href="internal/icons.svg#pa_logo" /></svg> Photo Assistant - About</h3>
+<img width="64" height="64" src="internal/pa-logo.png"/></img> Photo Assistant - About</h3>
 <div class="alert alert-info">
-{{build_info}}
+<center>{{build_info}}</center>
 </div>
 </div>
 <nav>

@@ -29,7 +29,7 @@
 <script>
 function HandleSearch()
 {
-st=$('#search_term').val();
+st=encodeURIComponent( $('#search_term').val() )
 if( st != '' ) {
 document.location.href='/search/'+st
 }
@@ -101,6 +101,7 @@
 {% if config.ENV != "production" %}
 <a class="dropdown-item" href="{{url_for('force_scan')}}">Force Scan (delete data & rebuild)</a>
 {% endif %}
+<a class="dropdown-item" href="{{url_for('force_clear')}}">Clear Messages (Force)</a>
 <a class="dropdown-item" href="{{url_for('wake_up')}}">Force wake the job manager</a>
 </div class="dropdow-menu">
 </div class="nav-item dropdown">

@@ -5,6 +5,8 @@
 <script src="{{ url_for( 'internal', filename='js/files_transform.js')}}"></script>
 
 <script>
+document.fake_shift=0
+document.fake_ctrl=0
 var move_paths=[]
 {% for p in move_paths %}
 p = new Object()
@@ -13,6 +15,11 @@
 p.icon_url = '{{p.icon_url}}'
 move_paths.push(p)
 {% endfor %}
 
+document.OPT = '{{OPT}}'
+document.entries = '{{entry_data}}'
+document.how_many = '{{OPT.how_many}}'
+document.entries_len = '{{entry_data|length}}'
 </script>
 
 <div class="container-fluid">
@@ -66,7 +73,7 @@
 </button>
 <span class="sm-txt my-auto"> {{OPT.how_many}} files </span>
 {% set nxt_disabled="" %}
-{% if entry_data|length < OPT.how_many|int %}
+{% if not entry_data or entry_data|length < OPT.how_many|int %}
 {% set nxt_disabled="disabled" %}
 {% endif %}
 <button aria-label="next" id="next" {{nxt_disabled}} name="next" class="next sm-txt btn btn-outline-secondary">
@@ -83,6 +90,8 @@
 <svg width="16" height="16" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash-fill"/></svg>
 {% endif %}
 </button>
+<button style="visibility:hidden" class="btn btn-outline-secondary" aria-label="shift-key" id="shift-key" onclick="document.fake_shift=1-document.fake_shift; event.stopPropagation(); return false">shift</button>
+<button style="visibility:hidden" class="btn btn-outline-secondary" aria-label="ctrl-key" id="ctrl-key" onclick="document.fake_ctrl=1-document.fake_ctrl; event.stopPropagation(); return false">ctrl</button>
 </div>
 <div class="d-flex col col-auto justify-content-end">
 <div class="btn-group">
@@ -149,6 +158,9 @@
 </div>
 {% endif %}
 {% endif %}
+{% if not entry_data %}
+<span class="alert alert-danger p-2 col-auto"> No matches for: '{{search_term}}'</span>
+{% endif %}
 {% for obj in entry_data %}
 {% if loop.index==1 and OPT.folders %}
 {% if OPT.cwd != OPT.root %}
@@ -303,6 +315,7 @@ $('.figure').dblclick( CallViewRouteWrapper )
 // different context menu on files
 $.contextMenu({
 selector: '.entry',
+itemClickEvent: "click",
 build: function($triggerElement, e) {
 // when right-clicking & no selection add one OR deal with ctrl/shift right-lick as it always changes seln
 if( NoSel() || e.ctrlKey || e.shiftKey )
@@ -372,6 +385,8 @@ $.contextMenu({
 if( key == "fliph" ) { Transform("fliph") }
 if( key == "flipv" ) { Transform("flipv") }
 if( key.startsWith("ai")) { RunAIOnSeln(key) }
+// dont flow this event through the dom
+e.stopPropagation()
 },
 items: item_list
 };
@@ -398,7 +413,18 @@ $( document ).keydown(function(event) {
 if( ! NoSel() ) DelDBox('Delete');
 {% endif %}
 break;
+} })
 
+function isMobile() {
+try{ document.createEvent("TouchEvent"); return true; }
+catch(e){ return false; }
 }
-});
+
+if( isMobile() )
+{
+$('#shift-key').css('visibility', 'visible');
+$('#ctrl-key').css('visibility', 'visible');
+}
 
 </script>
 {% endblock script_content %}

@@ -46,8 +46,7 @@
 
 <div class="row px-3 my-5 offset-lg-3 col-lg-6 col-sm-12" style="border: 3px solid #5bc0de; border-radius: 15px;">
 <h3 class="my-3 text-center" style="color: #5bc0de">
-{# <svg width="64" height="64" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#pa_logo" /></svg> Photo Assistant Login</h3> #}
-<svg width="64" height="64" fill="currentColor"><use xlink:href="internal/icons.svg#pa_logo" /></svg> Photo Assistant Login</h3>
+<img width="64" height="64" src="internal/pa-logo.png"/></img> Photo Assistant Login</h3>
 <form class="" method="POST">
 <div class="input-group">
 <label for="username" class="text-right input-group-text col-4 text-info">Username:</label>

@@ -12,6 +12,7 @@
 .form-switch .form-check-input:focus {
 background-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%23CFF4FC'/%3e%3c/svg%3e");
 }
+#tst90:hover,#tst90:focus { filter: invert(73%) sepia(27%) saturate(3970%) hue-rotate(146deg) brightness(94%) contrast(100%); }
 </style>
 
 <script src="{{ url_for( 'internal', filename='js/view_transform.js')}}"></script>
@@ -45,6 +46,7 @@
 {% for face in objs[id].file_details.faces %}
 data = { 'id': '{{face.id}}', 'x': '{{face.face_left}}', 'y': '{{face.face_top}}', 'w': '{{face.w}}', 'h':'{{face.h}}' }
 {% if face.refimg %}
+data['pid']='{{face.refimg.person.id}}'
 data['who']='{{face.refimg.person.tag}}'
 data['distance']="{{face.refimg_lnk.face_distance|round(2)}}"
 {% endif %}
@@ -101,6 +103,9 @@
 
 function CallViewListRoute(dir)
 {
+// dont allow mad spamming of arrows
+$("#la").prop("disabled", true)
+$("#ra").prop("disabled", true)
 data="eids="+$("#eids").val()
 data+="&cwd={{OPT.cwd}}"
 data+="&root={{OPT.root}}"
@@ -117,10 +122,11 @@
 objs=res.objs
 eid_lst=eids.split(",")
 offset=res.offset
-if( current == first_eid )
-$("#la").prop("disabled", true)
-if( current == last_eid )
-$("#ra").prop("disabled", true)
+// okay, we now have results back, can reset next/prev buttons
+if( current != first_eid )
+$("#la").prop("disabled", false)
+if( current != last_eid )
+$("#ra").prop("disabled", false)
 ViewImageOrVideo()
 }
 })
@@ -246,32 +252,35 @@
 </div>
 <div class="col col-auto pt-1">
 <button class="btn btn-outline-info p-1" title="Rotate by 90 degrees" onClick="Transform(90)">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot90"/></svg>
+<img src="{{url_for('internal', filename='rot90.png')}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot90-invert.png')}}'"
+onMouseOut="this.src='{{url_for('internal', filename='rot90.png')}}'" />
 </button>
 <button class="btn btn-outline-info p-1" title="Rotate by 180 degrees" onClick="Transform(180)">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot180"/></svg>
+<img src="{{url_for('internal', filename='rot180.png')}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot180-invert.png')}}'"
+onMouseOut="this.src='{{url_for('internal', filename='rot180.png')}}'" />
 </button>
 <button class="btn btn-outline-info p-1" title="Rotate by 270 degrees" onClick="Transform(270)">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#rot270"/></svg>
+<img src="{{url_for('internal', filename='rot270.png')}}" width="32" height="32" onMouseOver="this.src='{{url_for('internal', filename='rot270-invert.png')}}'"
+onMouseOut="this.src='{{url_for('internal', filename='rot270.png')}}'" />
 </button>
 <button class="btn btn-outline-info p-1" title="Flip horizontally" onClick="Transform('fliph')">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_h"/></svg>
+<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_h"/></svg>
 </button>
 <button class="btn btn-outline-info p-1" title="Flip vertically" onClick="Transform('flipv')">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_v"/></svg>
+<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#flip_v"/></svg>
 </button>
 <button class="btn btn-outline-info p-1" title="View in Fullscreen mode (hotkey: F)" onClick="fullscreen=true; ViewImageOrVideo()">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#fullscreen"/></svg>
+<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#fullscreen"/></svg>
 </button>
 <button class="btn btn-outline-info p-1" title="Show logs relating to this filename (hotkey: l)" onClick="JoblogSearch()">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#log"/></svg>
+<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#log"/></svg>
 </button>
 <button class="btn btn-outline-info p-1" title="View Original" onClick="window.location='/'+objs[current].url">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#download"/></svg>
+<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#download"/></svg>
 </button>
 <button id="del" class="btn btn-outline-danger p-1" title="Delete (hotkey: Del)"
 onClick="$.ajax({ type: 'POST', data: '&eid-0={{current}}', url: '/delete_files', success: function(data){ window.location='/'; return false; } })">
-<svg width="28" height="28" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash"/></svg>
+<svg width="32" height="32" fill="currentColor"><use xlink:href="{{url_for('internal', filename='icons.svg')}}#trash"/></svg>
 </button>
 </div>
 </div class="row">

utils/pa_rotate (new executable file)
@@ -0,0 +1,9 @@
+#!/bin/sh
+
+echo "start"
+b=`basename "$1"`
+echo "b=$b"
+mv "$1" "/tmp/$b"
+echo "mv '$1' to /tmp/$b"
+jpegtran -outfile "$1" -rotate "$2" "/tmp/$b"
+echo "done jpegtran"
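The new pa_rotate helper losslessly rotates a JPEG in place: it stages the file in /tmp, then rewrites it with `jpegtran -rotate`, taking the filename and the angle as its two arguments. A hedged sketch of how it would be driven from Python through the PA_EXIF_ROTATER setting added in shared.py (the actual call site is outside this diff):

```python
# Hedged sketch: driving utils/pa_rotate from Python, based only on its
# argument list above (file path, then rotation). The real call site is
# not shown in this diff.
import subprocess

PA_EXIF_ROTATER = "./utils/pa_rotate"  # dev path from shared.py

def rotate_jpeg(fname, degrees):
    # jpegtran accepts 90, 180 or 270 for -rotate
    subprocess.run([PA_EXIF_ROTATER, fname, str(degrees)], check=True)
```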
wrapper.sh
@@ -1,7 +1,18 @@
 #!/bin/bash
 
-su mythtv -g mythtv -c 'ENV="production" python3 -u /code/pa_job_manager.py' &> /var/log/pa_job_manager.out &
-gunicorn --bind=0.0.0.0:443 --workers=4 --threads=16 --certfile /etc/letsencrypt/live/pa.depaoli.id.au/fullchain.pem --keyfile /etc/letsencrypt/live/pa.depaoli.id.au/privkey.pem main:app --env ENV="production" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output
+sudo -u pauser ENV=${ENV} python3 -u pa_job_manager.py &> /var/log/pa_job_manager.out &
+if [ "$ENV" == "production" ]; then
+sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=4 --threads=16 main:app --env ENV="production" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output
+elif [ "$ENV" == "container" ]; then
+# move into src tree so padev responds to code changes
+cd /home/ddp/src/photoassistant
+# push container logs/errors to stdout so docker logs works
+sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=1 --threads=1 main:app --env ENV="container" --reload --capture-output --log-file - --error-logfile -
+else
+echo "Not sure which ENV ($ENV) we are running, set up for DEV and dont run job manager (jic)"
+sudo -u pauser gunicorn --bind=0.0.0.0:80 --workers=1 --threads=1 main:app --env ENV="development" --error-logfile gunicorn.error.log --access-logfile gunicorn.log --capture-output --enable-stdio-inheritance --reload
+fi
 
 # this should never be invoked unless gunicorn fails -- in that case, at least
 # we will keep the container can login by hand and check the issue/error
 sleep 99999