Commit Graph

352 Commits

SHA1 Message Date
6e50390886 use su mythtv, not su - mythtv; the mythtv user has no home dir 2021-03-24 19:35:19 +11:00
f6ce155217 allow dev or prod defaults for settings table 2021-03-24 19:34:56 +11:00
1ab339ffea had to revert the import line - pylint didn't originally like it this way, can't see the issue now - go figure, but it definitely caused the video thumbs to fail. Turned off debug again, fixed up how we account for current_file_num increases via explicit use of ProcessFileForJob() to increment where required, or just incrementing properly. Finally, check if an image has transparency; if so, convert it to RGB, as we can't save it as JPEG otherwise (in thumbnail creation) 2021-03-23 18:39:58 +11:00
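The transparency fix described above can be sketched like this (a minimal Pillow sketch; the function name and size are illustrative, not the app's actual code). JPEG has no alpha channel, so RGBA/LA images, and palette images with a transparency entry, must be converted to RGB before saving:

```python
from PIL import Image

def save_thumbnail_as_jpeg(src_path, dest_path, size=(320, 240)):
    """Create a JPEG thumbnail, converting away any alpha channel first."""
    img = Image.open(src_path)
    img.thumbnail(size)
    # JPEG cannot store transparency: convert RGBA/LA images, and
    # palette ("P") images that carry a transparency entry, to RGB.
    if img.mode in ("RGBA", "LA") or (
        img.mode == "P" and "transparency" in img.info
    ):
        img = img.convert("RGB")
    img.save(dest_path, "JPEG")
```

Without the mode check, `img.save(dest_path, "JPEG")` raises an `OSError` for RGBA input, which matches the thumbnail failures the commit describes.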
a2ef27e6a1 had to revert the import line - pylint didn't originally like it this way, can't see the issue now - go figure, but it definitely caused the video thumbs to fail 2021-03-23 18:38:10 +11:00
92c5ee3e9d keep processing all dups, even when > pagesize, so that the alert at the end is definitive about automatic matches that will be deleted. Also fixed an issue where a duplicate in the root dir of an import/storage path showed as blank after TrimmedPath() 2021-03-23 18:37:32 +11:00
072705005d removed all references to /static except in SymlinkName() 2021-03-23 18:36:07 +11:00
ebd5a8f3ba make ./static be owned by mythtv (passed in 500/500 for uid/gid) so that it can manipulate files in the storage area on mara - still need to work out build args properly, not just hack it into Dockerfile ENV vars 2021-03-23 18:35:43 +11:00
055dd58d34 better formatting for all automatic matches, deal with automatic matching of preferred_paths, put <input type=hidden> fields in for files in storage and import paths, only display headings/content if it actually exists 2021-03-21 17:00:36 +11:00
7e14280ec5 made new/better function to deal with path duplicates / reduced duplicate code, fixed overall count vars, improved debugs 2021-03-21 16:58:54 +11:00
f20ce0f7a4 removed old code, commented out Dump() debug 2021-03-21 16:58:06 +11:00
86ff96fefb now passing a single DD var (duplicate data) from files.py for simplicity; also added the beginnings of support for regex-based auto-keeping/auto-deleting of files and paths - noting that path data from dups.py seems not to be working with my test data at present, so the HTML showing it needs testing too 2021-03-21 12:18:59 +11:00
2292777b83 fixed bug with no pagesize on fresh DB / first time in fix_dups. Also passing duplicate data (DD) as single object to dups.html 2021-03-21 12:17:43 +11:00
2939d91092 now using SymlinkName everywhere 2021-03-21 12:10:56 +11:00
054d024ae0 added more debugs into Dump(); it's noisier, but needed while testing on a small number of files in DEV 2021-03-21 12:08:21 +11:00
9c263f54e3 minor fix to only remove the trailing slash if there is any content in the sig_bit of SymlinkName 2021-03-21 12:06:56 +11:00
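The trailing-slash guard above can be sketched as follows (a hypothetical sketch; `symlink_name`, `base`, and `sig_bit` are illustrative names, not the app's real API). The point is that an empty `sig_bit` must be left alone rather than sliced:

```python
def symlink_name(base, sig_bit):
    """Join base and sig_bit, trimming sig_bit's trailing slash only
    when sig_bit actually has content; an empty sig_bit stays empty
    instead of producing a stray path component."""
    if sig_bit and sig_bit.endswith("/"):
        sig_bit = sig_bit[:-1]
    return f"{base}/{sig_bit}" if sig_bit else base
```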
1de74f8f51 added token settings data for testing so it has a storage path too 2021-03-21 10:29:51 +11:00
b763a59b2b whitespace removed 2021-03-19 18:06:16 +11:00
22281be93d fixed up a few issues found from linter 2021-03-19 18:04:22 +11:00
6d071f14dd fixed up a few issues found from linter 2021-03-19 18:03:00 +11:00
c2d0004537 fixed up a few issues found from linter 2021-03-19 18:00:44 +11:00
6fd0205b71 fixed up a few issues found from linter 2021-03-19 17:49:48 +11:00
ef0971f6a3 fixed up a few issues found from linter 2021-03-19 17:38:46 +11:00
517069c709 use a raw string for the regex to make the linter happy 2021-03-19 17:19:45 +11:00
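The raw-string fix above is the standard pylint appeasement (the pattern below is illustrative, not the project's actual regex). Without the `r` prefix, escapes like `\d` trigger pylint's W1401 (anomalous-backslash-in-string) and a `SyntaxWarning` on newer Pythons:

```python
import re

# Raw string: backslashes reach the regex engine verbatim, so the
# linter no longer flags "\d" as an anomalous string escape.
pattern = re.compile(r"\d{4}-\d{2}-\d{2}")

match = pattern.search("backup-2021-03-19.tar.gz")
```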
df8f9c88de whitespace removed 2021-03-19 17:17:29 +11:00
de9cd7b4fb update 2021-03-18 19:08:41 +11:00
924058e18d made pa_job_manager use mythtv:mythtv, but had to hardcode the PJM_* vars; I tried to pass them through docker-compose.yml 2021-03-18 19:08:30 +11:00
2cd55580a9 rewrote dups.html to use the newer model, where we auto-delete those that match the regexp and do not show a per-row view of them. Also removed an extra/unneeded line when processing file deletions 2021-03-17 20:04:25 +11:00
08dc646371 first pass of using the Duplicate class, rather than files doing all the dup work. The HTML still shows preferreds, and does not yet know there are separate preferred files and preferred dirs 2021-03-15 20:36:10 +11:00
046c512e6b added comments 2021-03-14 14:33:22 +11:00
f88ef2542f updated dups.py to have a DupRow class that stores the data so dot-notation works consistently in the Duplicates class; also moved attributes to the top of __init__ for consistency, which is needed for DupRow to work (otherwise the attributes would be shared globally across DupRow instances, which would not work). Broke it into a couple more functions for readability, and added back the first-pass duplicate-row creation into AddDups - that needs a better name too 2021-03-14 11:51:56 +11:00
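The sharing pitfall that commit works around is a Python class-attribute gotcha, sketched here with illustrative attribute names (the real DupRow fields are not shown in the log). Mutable attributes must be assigned inside `__init__` so each instance gets its own copy:

```python
class DupRow:
    """Per-row duplicate record: attributes live in __init__ so each
    instance owns its own data. A class-level `paths = []` would be one
    list shared by every DupRow, corrupting all rows at once."""

    def __init__(self, checksum):
        self.checksum = checksum
        self.paths = []   # instance-level: safe to append per row
        self.keep = None  # which copy to keep; decided later
```

With instance attributes, appending to one row's `paths` leaves every other row untouched, which is what dot-notation access relies on.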
8ff61dddfa trialing a new Duplicate class to deal more consistently with the various types of duplicates -- mostly to enable "auto" deleting of duplicates in specific conditions, e.g. when a file is in both an import dir and a storage dir, just delete the dups from the import dir 2021-03-13 12:36:16 +11:00
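The auto-delete condition above can be sketched as a small policy function (a hypothetical sketch of the rule in the commit message; the function and parameter names are not from the codebase). Copies under an import dir are only deletable when at least one copy survives in storage:

```python
def pick_deletions(copies, import_dirs):
    """Given all paths holding one duplicate file, return the copies
    safe to auto-delete: those under an import dir, provided at least
    one copy lives outside the import dirs (i.e. in storage)."""
    in_import = [
        p for p in copies if any(p.startswith(d) for d in import_dirs)
    ]
    in_storage = [p for p in copies if p not in in_import]
    # Never delete the last remaining copy of a file.
    return in_import if in_storage else []
```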
155068ab85 new duplicate processing thoughts 2021-03-08 19:37:50 +11:00
94c84a1907 new duplicate processing thoughts 2021-03-08 19:36:39 +11:00
0d0607e9c6 made help appear on mouseover instead of onclick 2021-03-06 17:26:28 +11:00
76aee3a10a okay, the fix_dups page now has functioning pagination, highlights regex-matching "good" files in green, and a file in yellow if we can't find the right one, so it easily shows where to really pay attention. Has a DBox-based help page, and overall just a better UI/UX 2021-03-06 17:18:11 +11:00
b9dea327d0 use pagesize, not page_size, and pass it in based on job extras; also use the path based on job extras, fixing the issue of fix_dups not knowing which path from settings was used (import or storage) 2021-03-06 11:45:21 +11:00
f1ec7f9eb6 just make sure the file size is the same as well as the hash; this should make it nearly impossible to accidentally have 2 different files with the same hash 2021-03-06 11:18:11 +11:00
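The size-plus-hash check above can be sketched like this (a minimal sketch; the function name and choice of SHA-256 are assumptions, as the log does not say which hash the project uses). Comparing `(size, digest)` tuples means two files of different sizes can never collide, whatever the hash does:

```python
import hashlib
import os

def file_signature(path, chunk=1 << 20):
    """Return (size, sha256) so two files only count as duplicates when
    both the byte size AND the hash match - a collision between files
    of different sizes is then impossible by construction."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)  # stream in 1 MiB chunks
            if not block:
                break
            h.update(block)
    return os.path.getsize(path), h.hexdigest()
```

Duplicate detection then reduces to `file_signature(a) == file_signature(b)`.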
1dc8477758 first pass of paginated dups, with a drop-down (non-functional), but otherwise pages work - presume if I clicked delete, it would delete, but then I would need to re-do check_dups OR optimise and have it come back to dups with the next page -- probably preferred -- BUT that needs job state kept between page loads? 2021-03-03 22:05:58 +11:00
e19e1b57f8 add extension so that break from loop works 2021-03-03 22:04:23 +11:00
8649edfe7c added page_size on reducing and hardcoded stitching of the prefix of dups (STILL needs fixing) 2021-03-03 22:03:55 +11:00
0c583aa868 finished containers, added new items on reducing dups per page and prefix of dups 2021-03-03 22:03:06 +11:00
df954e1e1a created wrapper.sh to also run pa_job_manager in the paweb container, exposed port 55432; have not tested that the job manager is connectable, and we probably should not expose that port in the long run, but that's for another night 2021-03-03 20:33:51 +11:00
8074225a60 fixed Dockerfile and requirements.txt (for pip) so that pa.depaoli.id.au is now a working docker container on mara 2021-03-03 01:04:57 +11:00
ba8cca1bed todo to paginate dup processing 2021-03-02 22:45:39 +11:00
689081ef0b put summary counts on dups 2021-03-02 22:44:00 +11:00
42e00d0aea more TODOs 2021-03-02 22:43:19 +11:00
269009f14f fixed choosing dup dir alert 2021-03-01 19:13:38 +11:00
06eb1ef927 fixed BUG-27, duplicate count per same named files in 2 dirs, was always 1 2021-03-01 18:46:03 +11:00
564962097a count of files that are duplicates in 2 dirs is 1 every time, which looks wrong 2021-02-28 19:23:09 +11:00
b156040483 joblog todo when there are too many log lines 2021-02-28 19:17:22 +11:00