## GENERAL

* fix_dups, etc. need to know the path, so we don't have to guess whether to strip the import_path or storage_path prefix from the keep/del alerts
* pagination in dups needs to be a drop-down and take effect on the page when changed
* SymlinkName: use the shared function everywhere; never build path_prefix by hand
* AddJobForLog can absorb DEBUGs, etc.; in fact, fix up logging in general
* comment your code
* do we need to make some funcs/code object-oriented?
* scan_sp needs to be in scannow
* need a way for the page to show whether we are in import_path or storage_path

## DB

Need to think about: a file (image) has X faces and Y matches.

* If X == Y, we don't need to scan again (optimisation).
* If X - Y == 1, then to optimise we only need to check the one missing face. At the moment the DB structure is not that clever: file_refimg_link probably needs a face_num column (see the sketch at the end of this section).

### BACKEND

Scan storage/import dir: ignore *thumb* files.

Scan storage_dir:

* need to find / remove duplicate files within storage_dir itself, within import_dir, and between import_dir and storage_dir (see the hashing sketch at the end of this section)

Implications:

* VIEWING: need to view the import dir and the storage dir as separate menu items, AND make it clear in the header which one you are looking at
* MOVING/COPYING: need to be smart; whether it is a file move or a copy depends on the file systems (if import_dir and storage_dir are on the same fs, we can use mv, which is much faster) -- see the move/copy sketch below

Started on some basic optimisations (commit logs every 100 logs, not each log):

* with debugs: import == 04:11, getfiledetails == 0:35:35
* without debugs: import == 04:03, getfiledetails == 0:35:36 -- not a significant difference
* with exifread & debug: import == 04:26

*** Need to use thread-safe sessions per thread; the half-assed version did not work (see the per-thread session sketch below).

Need a manual button in the GUI to restart it: based on the file-level optimisations, just run the job as new and it will skip over the already-done parts and continue.

Future:

* Admin -> reset face_flag
* AI -> rescan
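
A minimal sketch of the face_num idea from the DB section, assuming the backend is Python with SQLAlchemy-style models (the per-thread session note suggests SQLAlchemy, but this is an assumption). The class name FileRefImgLink and the file_id/refimg_id columns are illustrative; face_num is the only real proposal here.

```python
from sqlalchemy import Column, Integer, ForeignKey
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class FileRefImgLink(Base):
    """Link between a scanned file and a matched reference image (sketch)."""
    __tablename__ = "file_refimg_link"

    id = Column(Integer, primary_key=True)
    file_id = Column(Integer, ForeignKey("file.id"), nullable=False)
    refimg_id = Column(Integer, ForeignKey("refimg.id"), nullable=False)
    # New column: which detected face in the file (0..X-1) this match is for.
    # With it, comparing X detected faces against the Y link rows tells us
    # exactly which face is still unmatched, so only that face is re-scanned.
    face_num = Column(Integer)
```

This is the cheapest schema change that makes the X vs. Y comparison actionable per face rather than per file.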
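A rough sketch of one way to find the duplicates: hash every non-thumb file under the given roots and group by digest. The function name and parameters are hypothetical, not existing code.

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(*roots, chunk=1 << 20):
    """Group files under the given roots by content hash; any group with
    more than one path is a duplicate set. *thumb* files are skipped."""
    by_hash = defaultdict(list)
    for root in roots:
        for dirpath, _dirs, names in os.walk(root):
            for name in names:
                if "thumb" in name.lower():
                    continue                      # ignore *thumb* files
                path = os.path.join(dirpath, name)
                digest = hashlib.sha256()
                with open(path, "rb") as fh:
                    for block in iter(lambda: fh.read(chunk), b""):
                        digest.update(block)
                by_hash[digest.hexdigest()].append(path)
    return {k: v for k, v in by_hash.items() if len(v) > 1}

# Passing both directories, e.g. find_duplicates(import_dir, storage_dir),
# covers the "within storage_dir", "within import_dir" and "between the two"
# cases in a single pass.
```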
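A sketch of the smart move/copy, assuming Python on the backend: compare st_dev of the source and the destination directory; a same-filesystem rename is essentially free, otherwise fall back to copy-then-delete. transfer_file and its keep_original parameter are hypothetical names.

```python
import os
import shutil

def transfer_file(src, dst, keep_original=False):
    """Move (or copy) src to dst, using a cheap rename when both paths
    live on the same filesystem."""
    dst_dir = os.path.dirname(dst) or "."
    os.makedirs(dst_dir, exist_ok=True)
    same_fs = os.stat(src).st_dev == os.stat(dst_dir).st_dev
    if keep_original:
        shutil.copy2(src, dst)       # caller wants a copy regardless
    elif same_fs:
        os.rename(src, dst)          # same fs: rename, no data copied
    else:
        shutil.copy2(src, dst)       # cross-device: copy the bytes...
        os.remove(src)               # ...then remove the source
```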
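For the per-thread sessions, if the ORM is SQLAlchemy (an assumption), scoped_session gives each worker thread its own Session, since sharing a single session across threads is not safe. DB_URL and scan_worker are placeholders, not existing code.

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

# DB_URL stands in for whatever connection string the app already uses.
DB_URL = "postgresql://user:pass@localhost/photos"
engine = create_engine(DB_URL)

# scoped_session hands each thread its own Session object on demand.
Session = scoped_session(sessionmaker(bind=engine))

def scan_worker(file_id):
    session = Session()              # this thread's own session
    try:
        # ... per-file work: read EXIF, detect faces, write rows ...
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        Session.remove()             # discard the thread-local session
```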