wrapped ProcessFilesInDir to just take a job, path, file_func to run, updated TODOs appropriately - added sql for duplicates check to TODO
@@ -10,9 +10,6 @@
### BACKEND
*** Need thread-safe sessions (one per thread) -- the half-assed first version did not work
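The DB layer isn't shown in this diff, so the following is only a sketch of the "one session per thread" idea using `threading.local`; `Session` and `get_session` are hypothetical stand-ins for whatever the backend actually uses:

```python
# Hypothetical sketch: one DB session per thread via threading.local.
# Session is a stand-in class, not the project's real session type.
import threading

class Session:
    """Placeholder for the backend's real DB session class."""
    pass

_local = threading.local()

def get_session():
    # Lazily create one Session per thread, then reuse it in that thread.
    if not hasattr(_local, "session"):
        _local.session = Session()
    return _local.session

sessions = {}

def worker(n):
    sessions[n] = get_session()            # record this thread's session
    assert get_session() is get_session()  # stable within a single thread

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Each thread gets its own object from the same `get_session()` call, which is the property the half-working version presumably lacked.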
* ProcessFilesInDir (should take a path, and do the first for loop so that no one sees the recursive func)
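The commit message says ProcessFilesInDir now takes a job, a path, and a file_func; the language and real signature aren't shown, so this is a guessed Python sketch in which the directory walk (the "recursive func") stays hidden from callers:

```python
# Hypothetical sketch of the wrapped ProcessFilesInDir: callers hand over a
# job, a root path, and a file_func; the recursion is hidden inside os.walk.
import os
import tempfile

def process_files_in_dir(job, path, file_func):
    """Run file_func(job, full_path) for every file under path."""
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            file_func(job, os.path.join(dirpath, name))

# Tiny demo on a throwaway tree: two files, one in a nested dir.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for rel in ("a.jpg", os.path.join("sub", "b.jpg")):
    open(os.path.join(root, rel), "w").close()

seen = []
process_files_in_dir({"job_id": 1}, root, lambda job, p: seen.append(p))
```

The point of the wrapper is the same as in the TODO: the caller never sees the recursive traversal, only the per-file callback.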
* Handle file deletions from file system (remove dangling DIR/FILE combos) -- also when moving to storage_dir, need to reset DIR, but keep FILE data
Future:
Admin
-> reset face_flag
@@ -23,17 +20,20 @@
thresholds on AI, or we get a new/better one some day, then it can
all images with faces, or if we 'reset face_flag' rescan all images
Admin
-> delete old jobs / auto delete jobs older than ???
### UI
### AI
* store reference images (UI allows this now)
* check images
* allow for threshold/settings to be tweaked from the GUI
- it would be good to then be able to run the scanner against a single image, or maybe a DIR, to see how it IDs people
When AI kicks in, it processes per person per DIR, only compares to an image if it has_unidentified_face
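The per-person, per-DIR pass described above can be sketched as a simple nested loop; the dict shapes and the `has_unidentified_face` layout here are illustrative, not the real schema:

```python
# Hypothetical sketch of the AI pass: per person, per DIR, only consider
# images flagged has_unidentified_face. Data shapes are made up for the demo.

def images_to_compare(persons, dirs):
    """Yield (person, dir, image) triples worth running the matcher on."""
    for person in persons:
        for d in dirs:
            for image in d["images"]:
                if image["has_unidentified_face"]:
                    yield person, d, image

dirs = [{"name": "d1", "images": [
    {"file": "a.jpg", "has_unidentified_face": True},
    {"file": "b.jpg", "has_unidentified_face": False},
]}]
work = list(images_to_compare(["alice"], dirs))
```

The flag check is what keeps the comparison set small: already-identified images are skipped entirely.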
### SORTER
* duplicate files - this sql finds them:
select d1.path_prefix, e1.name, f1.hash
from entry e1, file f1, dir d1, entry_dir_link edl1, entry e2, file f2
where e1.id = f1.eid
  and e2.id = f2.eid
  and d1.eid = edl1.dir_eid
  and edl1.entry_id = e1.id
  and f1.hash = f2.hash
  and e1.name != e2.name
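One thing to note about the self-join approach: it reports each duplicate pair in both orders (e1/e2 and e2/e1). A grouped variant reports each hash once; the sketch below runs it against a stripped-down SQLite schema whose table and column names are simplified stand-ins, not the real entry/file/dir tables:

```python
# Grouped duplicate check against a simplified stand-in schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE file (name TEXT, hash TEXT);
    INSERT INTO file VALUES ('a.jpg', 'h1'), ('b.jpg', 'h1'), ('c.jpg', 'h2');
""")
# GROUP BY hash lists each duplicated hash once, with the names involved.
dupes = conn.execute("""
    SELECT hash, COUNT(*) AS n, GROUP_CONCAT(name) AS names
    FROM file
    GROUP BY hash
    HAVING n > 1
""").fetchall()
```

Adapting the GROUP BY to the real joined schema would need the same entry/file/dir joins as the query above, but it avoids the doubled pairs.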
* date stuff
* exif processing?
* location stuff - test a new photo from my camera out