diff --git a/docs/changelog.md b/docs/changelog.md
index 830c08437..4421508d2 100644
--- a/docs/changelog.md
+++ b/docs/changelog.md
@@ -7,6 +7,40 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
+## [Version 584](https://github.com/hydrusnetwork/hydrus/releases/tag/v584)
+
+### misc
+
+* fixed a logical hole in the recent 'is this URL that is saying (deleted/already in db) trustworthy, or does it have weird mappings to other files?' pre-download check that was causing Pixiv, Kemono, Twitter, and any other multiple-URL Post URL Class to, on re-encountering the URL in a downloader, classify the underlying file URL as untrustworthy and re-download the files (!!)
+* the 'copy all' and paste buttons in the manage known urls dialog are replaced with icon buttons, and the copy button now copies the current selection if there is one
+* the newish Regex input widget (the one that goes green/red based on current text validity) now propagates an Enter key into a dialog ok event when appropriate
+* when you ctrl+double-click a taglist, the program now ensures the item under the mouse is selected before firing off the double-click nega-activation event. this is slightly awkward, but I hope it smooths out the awkward moment where you want to invert a selection of tags but a normal ctrl+double-click on them causes one of them to be deselected and then messes up the selection
+* regex URL searches are now always the last job to run in a file query. if you mix in any other predicate like filesize or just some tag, your regex URL searches should run massively massively faster
+* improved some boot error handling when Qt fails to import
+* fixed the whack alignment of the 'filename'/'first directory'/etc. checkbox-and-text-edit widgets in the filename tagging panel, and set 'namespace' placeholder text
+* force-selecting a null value in a 'select one from this list of things' dialog should no longer raise errors
+* thanks to a user, the new shimmie parser gets tags in a simpler, more reliable way
+
+### client api
+
+* added a new permission, `Commit Pending` (12), which allows you to see and commit pending content for each service
+* added `/manage_services/get_pending_counts`, which basically returns the content of the client's 'pending' menu
+* added `/manage_services/commit_pending`, which fires those commands off
+* added `/manage_services/forget_pending`, which does the same 'forget' command on that menu
+* added `/manage_file_relationships/remove_potentials`, which clears any known potential pairs off the given files
+* the `/manage_pages/get_pages` and `/manage_pages/get_page_info` commands now return an `is_media_page` boolean, which is a simple shorthand for 'not a page of pages'
+* added the above to the Client API help
+* wrote unit tests covering the above
+* the client api version is now 66
+
+### boring cleanup
+
+* fixed up how some lists deliver their underlying data to various methods
+* `CallBlockingToQt` no longer spams encountered errors to the log--proper error handling should (and does now) occur elsewhere
+* the way the initial focus is set on system predicate flesh-out panels (when you double-click on like 'system:dimensions' and get a bunch of sub-panels) is more sane. should be, fairly reliably, on the first editable panel's ok button. I want it to be on an editable widget in the panel in future, I think, but I need to do some behind the scenes stuff to make this work in a nicer way
+* pulled some stuff out of `HydrusData`, mostly to `HydrusNumbers`, `HydrusLists`, and the new `HydrusProcess`, mostly for decoupling purposes
+* renamed some `ConvertXToY` stuff to just `XToY`
+
## [Version 583](https://github.com/hydrusnetwork/hydrus/releases/tag/v583)
### new
@@ -322,34 +356,3 @@ title: Changelog
* moved `ClientDuplicates` up to a new `duplicates` module and migrated some duplicate enums over to it from `ClientConstants`
* removed an old method-wrapper hack that applied the 'load images with PIL' option. I just moved to a global that I set on init and update on options change
* cleaned some duplicate checking code
-
-## [Version 574](https://github.com/hydrusnetwork/hydrus/releases/tag/v574)
-
-### local hashes cache
-
-* we finally figured out the 'update 404' issue that some PTR-syncing users were getting, where PTR processing would halt with an error about an update file not being available on the server. long story short, SQLite was sometimes crossing a wire in the database on a crash, and this week I add some new maintenance code to fix this and catch it in future
-* the local hash cache has a bunch of new resync/recovery code. it can now efficiently recover from missing hash_ids, excess hash_ids, desynced hash_ids, and even repopulate the master hash table if that guy has missing hash_ids (which can happen after severe db damage due to hard drive failure). it records all recovery info to the log
-* the normal _database->regenerate->local hashes cache_ function now works entirely in this new resync code, making it significantly faster (previously it just deleted and re-added everything). this job also gets a nicer popup with a summary of any problems found
-* when the client recovers from a bad shutdown, it now runs a quick sync on the latest hash_ids added to the local hashes cache to ensure that desync did not occur. fingers crossed, this will work super fast and ensure that we don't get the 404 problem (or related hash_id cross-wire problems) again
-* on repository processing failure and a scheduling of update file maintenance, we now resync the update files in the local hash cache, meaning the 404 problem, if it does happen again, will now fix itself in the normal recovery code
-* on update, everyone is going to get a full local hash cache resync, just to catch any lingering issues here. it should now work super fast!
-* fixed an issue where the local hash and tags caches would not fully reset desynced results on a 'regenerate' call until a client restart
-
-### misc
-
-* thanks to a user, the default twitter downloader I added last week now gets full-size images. if you spammed a bunch of URLs last week, I apologise: please do a search for 'imported within the last 7 days/has a twitter url/height=1200px' and then copy/paste the results' tweet URLs into a new urls downloader. because of some special twitter settings, you shouldn't have to set 'download the file even if known url match' in the file import options; the downloader will discover the larger versions and download the full size files with no special settings needed. once done, assuming the file count is the same on both pages, go back to your first page and delete the 1200px tall files. then repeat for width=1200px!
-* the filetype selector in system:filetype now expands to eat extra vertical space if the dialog is resized
-* the filetype selector in file import options is moved a bit and also now expands to eat extra vertical space
-* thanks to a user, the Microsoft document recognition now has fewer false negatives (it was detecting some docs as zips)
-* when setting up an import folder, the dialog will now refuse to OK if you set a path that is 1) above the install dir or db dir or 2) above or below any of your file storage locations. shouldn't be possible to set up an import from your own file storage folder by accident any more
-* added a new 'apply image ICC Profile colour adjustments' checkbox to _options->media_. this simply turns off ICC profile loading and application, for debug purposes
-
-### boring cleanup
-
-* the default SQLite page size is now 4096 bytes on Linux, the SQLite default. it was 1024 previously, but SQLite now recommend 4096 for all platforms. the next time Linux users vacuum any of their databases, they will get fixed. I do not think this is a big deal, so don't rush to force this
-* fixed the last couple dozen missing layout flags across the program, which were ancient artifacts from the wx->Qt conversion
-* fixed the WTFPL licence to be my copyright, lol
-* deleted the local booru service management/UI code
-* deleted the local booru service db/init code
-* deleted the local booru service network code
-* on update, the local booru service will be deleted from the database
diff --git a/docs/developer_api.md b/docs/developer_api.md
index 76edd06c7..d70c861fc 100644
--- a/docs/developer_api.md
+++ b/docs/developer_api.md
@@ -13,7 +13,7 @@ hide: navigation
## API
-In general, the API deals with standard UTF-8 JSON. POST requests and 200 OK responses are generally going to be a JSON 'Object' with variable names as keys and values obviously as values. There are examples throughout this document. For GET requests, everything is in standard GET parameters, but some variables are complicated and will need to be JSON encoded and then URL encoded. An example would be the 'tags' parameter on [GET /get\_files/search\_files](#get_files_search_files), which is a list of strings. Since GET http URLs have limits on what characters are allowed, but hydrus tags can have all sorts of characters, you'll be doing this:
+In general, the API deals with standard UTF-8 JSON. POST requests and 200 OK responses are generally going to be a JSON 'Object' with variable names as keys and values obviously as values. There are examples throughout this document. For GET requests, everything is in standard GET parameters, but some variables are complicated and will need to be JSON-encoded and then URL-encoded. An example would be the 'tags' parameter on [GET /get\_files/search\_files](#get_files_search_files), which is a list of strings. Since GET http URLs have limits on what characters are allowed, but hydrus tags can have all sorts of characters, you'll be doing this:
* Your list of tags:
@@ -21,19 +21,19 @@ In general, the API deals with standard UTF-8 JSON. POST requests and 200 OK res
[ 'character:samus aran', 'creator:青い桜', 'system:height > 2000' ]
```
-* JSON encoded:
+* JSON-encoded:
```json
["character:samus aran", "creator:\\u9752\\u3044\\u685c", "system:height > 2000"]
```
-* Then URL encoded:
+* Then URL-encoded:
```
%5B%22character%3Asamus%20aran%22%2C%20%22creator%3A%5Cu9752%5Cu3044%5Cu685c%22%2C%20%22system%3Aheight%20%3E%202000%22%5D
```
-* In python, converting your tag list to the URL encoded string would be:
+* In python, converting your tag list to the URL-encoded string would be:
```
urllib.parse.quote( json.dumps( tag_list ), safe = '' )
@@ -128,7 +128,7 @@ Arguments:
* `hash`: (selective, a hexadecimal SHA256 hash)
* `hashes`: (selective, a list of hexadecimal SHA256 hashes)
-In GET requests, make sure any list is percent-encoded.
+In GET requests, make sure any list is percent-encoded JSON. Your `[1,2,3]` becomes `urllib.parse.quote( json.dumps( [1,2,3] ), safe = '' )`, and thus `file_ids=%5B1%2C%202%2C%203%5D`.
### **file domain** { id="parameters_file_domain" }
@@ -358,6 +358,7 @@ Arguments:
* 9 - Edit File Ratings
* 10 - Manage Popups
* 11 - Edit File Times
+ * 12 - Commit Pending
``` title="Example request"
/request_new_permissions?name=my%20import%20script&basic_permissions=[0,1]
@@ -2299,6 +2300,32 @@ Response:
If there are no potential duplicate groups in the search, this returns an empty list.
+### **POST `/manage_file_relationships/remove_potentials`** { id="manage_file_relationships_remove_potentials" }
+
+Remove all potential pairs that any of the given files are a part of. If you then hit [/manage\_file\_relationships/get\_file\_relationships](#get-manage_file_relationshipsget_file_relationships) for any of these files, they will have no potential relationships, and, conversely, any hashes that were previously potential to them will no longer list these files as potentials.
+
+Restricted access:
+: YES. Manage File Relationships permission needed.
+
+Required Headers:
+:
+ * `Content-Type`: application/json
+
+Arguments (in JSON):
+:
+ * [files](#parameters_files)
+
+```json title="Example request body"
+{
+ "file_id" : 123
+}
+```
+
+Response:
+: 200 with no content.
+
+If the files are a part of any potential pairs (with any files, including those you did not specify), those pairs will be deleted. This deletes everything they are involved in, and the files will not be queued up for a re-scan, so I recommend you only do this if you know you added the potentials yourself (e.g. this is regarding video files) or you otherwise have a plan to replace the deleted potential pairs with something more useful.
+
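+As a quick illustration, here is a minimal sketch of calling this endpoint with Python's `requests`, assuming the default localhost Client API address and placeholder access key and file ids:
+
+```python title="Example call (sketch)"
+import requests
+
+API_URL = 'http://127.0.0.1:45869' # assumed default Client API address
+HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }
+
+# clear every potential pair these files are involved in
+response = requests.post(
+    f'{API_URL}/manage_file_relationships/remove_potentials',
+    headers = HEADERS,
+    json = { 'file_ids' : [ 123, 124 ] }
+)
+
+response.raise_for_status() # 200 with no content on success
+```
+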
### **POST `/manage_file_relationships/set_file_relationships`** { id="manage_file_relationships_set_file_relationships" }
Set the relationships to the specified file pairs.
@@ -2441,6 +2468,108 @@ Response:
The files will be promoted to be the kings of their respective duplicate groups. If the file is already the king (also true for any file with no duplicates), this is idempotent. It also processes the files in the given order, so if you specify two files in the same group, the latter will be the king at the end of the request.
+## Managing Services
+
+For now, this refers to just seeing and committing pending content (i.e. the 'pending' menu in the main GUI menubar, which appears if you have an IPFS, Tag Repository, or File Repository service).
+
+### **GET `/manage_services/get_pending_counts`** { id="manage_services_get_pending_counts" }
+
+_Get the counts of pending content for each upload-capable service. This basically lets you construct the "pending" menu in the main GUI menubar._
+
+Restricted access:
+: YES. Start Upload permission needed.
+
+Required Headers: n/a
+
+Arguments: n/a
+
+``` title="Example request"
+/manage_services/get_pending_counts
+```
+
+Response:
+: A JSON Object mapping the service key of every upload-capable service to its current pending content counts.
+
+```json title="Example response"
+{
+ "pending_counts" : {
+ "ae91919b0ea95c9e636f877f57a69728403b65098238c1a121e5ebf85df3b87e" : {
+ "pending_tag_mappings" : 11564,
+ "petitioned_tag_mappings" : 5,
+ "pending_tag_siblings" : 2,
+ "petitioned_tag_siblings" : 0,
+ "pending_tag_parents" : 0,
+ "petitioned_tag_parents" : 0
+ },
+ "3902aabc3c4c89d1b821eaa9c011be3047424fd2f0c086346e84794e08e136b0" : {
+ "pending_tag_mappings" : 0,
+ "petitioned_tag_mappings" : 0,
+ "pending_tag_siblings" : 0,
+ "petitioned_tag_siblings" : 0,
+ "pending_tag_parents" : 0,
+ "petitioned_tag_parents" : 0
+ },
+ "e06e1ae35e692d9fe2b83cde1510a11ecf495f51910d580681cd60e6f21fde73" : {
+ "pending_files" : 2,
+ "petitioned_files" : 0
+ }
+ }
+}
+```
+
+The keys are as in [/get\_services](#get_services).
+
+Each count here represents one 'row' of content, so for "tag_mappings" that is one (tag, file) pair and for "tag_siblings" one (tag, tag) pair. You always get everything, even if the counts are all 0.
+
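+As a sketch of reading this with Python's `requests` (placeholder address and access key), here is a quick tally of pending/petitioned rows per service:
+
+```python title="Example read (sketch)"
+import requests
+
+API_URL = 'http://127.0.0.1:45869' # assumed default Client API address
+HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }
+
+response = requests.get( f'{API_URL}/manage_services/get_pending_counts', headers = HEADERS )
+
+response.raise_for_status()
+
+pending_counts = response.json()[ 'pending_counts' ]
+
+for ( service_key, counts ) in pending_counts.items():
+    
+    # each value is a count of content 'rows', as described above
+    total_rows = sum( counts.values() )
+    
+    print( f'{service_key}: {total_rows} pending/petitioned rows' )
+```
+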
+### **POST `/manage_services/commit_pending`** { id="manage_services_commit_pending" }
+
+_Start the job to upload a service's pending content._
+
+Restricted access:
+: YES. Start Upload permission needed.
+
+Required Headers: n/a
+
+Arguments (in JSON):
+:
+* `service_key`: (the service to commit)
+
+```json title="Example request body"
+{
+ "service_key" : "ae91919b0ea95c9e636f877f57a69728403b65098238c1a121e5ebf85df3b87e"
+}
+```
+
+This starts the upload popup, just like if you click 'commit' in the menu. This upload could ultimately take one second or several minutes to finish, but the response will come back immediately.
+
+If the job is already running, this will return 409. If it cannot start because of a difficult problem, like all repositories being paused or the service account object being unsynced or something, it gives 422; in this case, please direct the user to check their client manually, since there is probably an error popup on screen.
+
+If tracking the upload job's progress is important, you could hit this endpoint again and see if it gives 409, or you could hit [/manage\_services/get\_pending\_counts](#manage_services_get_pending_counts) again--since the counts will update live as the upload happens--but note that the user may pend more content just after the upload is complete, so do not wait forever for the counts to fall back down to 0.
+
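+To illustrate that polling idea, here is a sketch that starts the commit and then watches the counts fall, with the 409/422 handling described above. The address, access key, service key, and timings are all placeholders:
+
+```python title="Example commit-and-poll (sketch)"
+import time
+
+import requests
+
+API_URL = 'http://127.0.0.1:45869' # assumed default Client API address
+HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }
+SERVICE_KEY = 'ae91919b0ea95c9e636f877f57a69728403b65098238c1a121e5ebf85df3b87e'
+
+response = requests.post( f'{API_URL}/manage_services/commit_pending', headers = HEADERS, json = { 'service_key' : SERVICE_KEY } )
+
+if response.status_code == 409:
+    
+    print( 'An upload for this service is already running.' )
+    
+elif response.status_code == 422:
+    
+    print( 'Could not start--check the client for an error popup.' )
+    
+else:
+    
+    response.raise_for_status()
+    
+    # poll for a while, but do not wait forever--the user may pend more content afterwards
+    for _ in range( 30 ):
+        
+        counts = requests.get( f'{API_URL}/manage_services/get_pending_counts', headers = HEADERS ).json()[ 'pending_counts' ]
+        
+        if sum( counts.get( SERVICE_KEY, {} ).values() ) == 0:
+            
+            break
+            
+        
+        time.sleep( 10 )
+    
+```
+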
+### **POST `/manage_services/forget_pending`** { id="manage_services_forget_pending" }
+
+_Forget all pending content for a service._
+
+Restricted access:
+: YES. Start Upload permission needed.
+
+Required Headers: n/a
+
+Arguments (in JSON):
+:
+* `service_key`: (the service to forget for)
+
+```json title="Example request body"
+{
+ "service_key" : "ae91919b0ea95c9e636f877f57a69728403b65098238c1a121e5ebf85df3b87e"
+}
+```
+
+This clears all pending content for a service, just like if you click 'forget' in the menu.
+
+Response:
+: 200 with no content.
+
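+For completeness, a sketch of the forget call, with the same placeholder address, key, and service key as above:
+
+```python title="Example forget (sketch)"
+import requests
+
+API_URL = 'http://127.0.0.1:45869' # assumed default Client API address
+HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }
+
+# clears all pending content for the service, like clicking 'forget' in the menu
+response = requests.post(
+    f'{API_URL}/manage_services/forget_pending',
+    headers = HEADERS,
+    json = { 'service_key' : 'ae91919b0ea95c9e636f877f57a69728403b65098238c1a121e5ebf85df3b87e' }
+)
+
+response.raise_for_status() # 200 with no content on success
+```
+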
## Managing Cookies
This refers to the cookies held in the client's session manager, which you can review under _network->data->manage session cookies_. These are sent to every request on the respective domains.
@@ -2663,6 +2792,7 @@ Response:
"page_key" : "3b28d8a59ec61834325eb6275d9df012860a1ecfd9e1246423059bc47fb6d5bd",
"page_state" : 0,
"page_type" : 10,
+ "is_media_page" : false,
"selected" : true,
"pages" : [
{
@@ -2670,6 +2800,7 @@ Response:
"page_key" : "d436ff5109215199913705eb9a7669d8a6b67c52e41c3b42904db083255ca84d",
"page_state" : 0,
"page_type" : 6,
+ "is_media_page" : true,
"selected" : false
},
{
@@ -2677,6 +2808,7 @@ Response:
"page_key" : "40887fa327edca01e1d69b533dddba4681b2c43e0b4ebee0576177852e8c32e7",
"page_state" : 0,
"page_type" : 9,
+ "is_media_page" : true,
"selected" : false
},
{
@@ -2684,6 +2816,7 @@ Response:
"page_key" : "2ee7fa4058e1e23f2bd9e915cdf9347ae90902a8622d6559ba019a83a785c4dc",
"page_state" : 0,
"page_type" : 10,
+ "is_media_page" : false,
"selected" : true,
"pages" : [
{
@@ -2691,6 +2824,7 @@ Response:
"page_key" : "9fe22cb760d9ee6de32575ed9f27b76b4c215179cf843d3f9044efeeca98411f",
"page_state" : 0,
"page_type" : 7,
+ "is_media_page" : true,
"selected" : true
},
{
@@ -2698,6 +2832,7 @@ Response:
"page_key" : "2977d57fc9c588be783727bcd54225d577b44e8aa2f91e365a3eb3c3f580dc4e",
"page_state" : 0,
"page_type" : 6,
+ "is_media_page" : true,
"selected" : false
}
]
@@ -2707,35 +2842,37 @@ Response:
}
```
- `name` is the full text on the page tab.
-
- `page_key` is a unique identifier for the page. It will stay the same for a particular page throughout the session, but new ones are generated on a session reload.
-
- `page_type` is as follows:
-
- * 1 - Gallery downloader
- * 2 - Simple downloader
- * 3 - Hard drive import
- * 5 - Petitions (used by repository janitors)
- * 6 - File search
- * 7 - URL downloader
- * 8 - Duplicates
- * 9 - Thread watcher
- * 10 - Page of pages
-
- `page_state` is as follows:
-
- * 0 - ready
- * 1 - initialising
- * 2 - searching/loading
- * 3 - search cancelled
-
- Most pages will be 0, normal/ready, at all times. Large pages will start in an 'initialising' state for a few seconds, which means their session-saved thumbnails aren't loaded yet. Search pages will enter 'searching' after a refresh or search change and will either return to 'ready' when the search is complete, or fall to 'search cancelled' if the search was interrupted (usually this means the user clicked the 'stop' button that appears after some time).
-
- `selected` means which page is currently in view. It will propagate down the page of pages until it terminates. It may terminate in an empty page of pages, so do not assume it will end on a media page.
-
- The top page of pages will always be there, and always selected.
-
+`name` is the full text on the page tab.
+
+`page_key` is a unique identifier for the page. It will stay the same for a particular page throughout the session, but new ones are generated on a session reload.
+
+`page_type` is as follows:
+
+* 1 - Gallery downloader
+* 2 - Simple downloader
+* 3 - Hard drive import
+* 5 - Petitions (used by repository janitors)
+* 6 - File search
+* 7 - URL downloader
+* 8 - Duplicates
+* 9 - Thread watcher
+* 10 - Page of pages
+
+`page_state` is as follows:
+
+* 0 - ready
+* 1 - initialising
+* 2 - searching/loading
+* 3 - search cancelled
+
+Most pages will be 0, normal/ready, at all times. Large pages will start in an 'initialising' state for a few seconds, which means their session-saved thumbnails aren't loaded yet. Search pages will enter 'searching' after a refresh or search change and will either return to 'ready' when the search is complete, or fall to 'search cancelled' if the search was interrupted (usually this means the user clicked the 'stop' button that appears after some time).
+
+`is_media_page` is simply a shorthand for whether the page is a normal page that holds thumbnails or a 'page of pages'. Only media pages can have files (and accept [/manage\_pages/add\_files](#manage_pages_add_files) commands).
+
+`selected` means which page is currently in view. It will propagate down the page of pages until it terminates. It may terminate in an empty page of pages, so do not assume it will end on a media page.
+
+The top page of pages will always be there, and always selected.
+
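+As a sketch of consuming this structure, here is a short recursive walk that collects every media page's key and name. It assumes the default localhost Client API address, a placeholder access key, and that the top notebook arrives under the response's top-level `pages` key as in the full example above:
+
+```python title="Example page walk (sketch)"
+import requests
+
+API_URL = 'http://127.0.0.1:45869' # assumed default Client API address
+HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }
+
+def collect_media_pages( page ):
+    
+    # only media pages hold thumbnails and can take add_files calls
+    if page[ 'is_media_page' ]:
+        
+        yield ( page[ 'page_key' ], page[ 'name' ] )
+        
+    
+    for sub_page in page.get( 'pages', [] ):
+        
+        yield from collect_media_pages( sub_page )
+    
+
+top_page = requests.get( f'{API_URL}/manage_pages/get_pages', headers = HEADERS ).json()[ 'pages' ]
+
+for ( page_key, name ) in collect_media_pages( top_page ):
+    
+    print( f'{name}: {page_key}' )
+```
+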
### **GET `/manage_pages/get_page_info`** { id="manage_pages_get_page_info" }
@@ -2767,6 +2904,7 @@ Response description
"page_key" : "aebbf4b594e6986bddf1eeb0b5846a1e6bc4e07088e517aff166f1aeb1c3c9da",
"page_state" : 0,
"page_type" : 3,
+ "is_media_page" : true,
"management" : {
"multiple_watcher_import" : {
"watcher_imports" : [
@@ -2827,12 +2965,11 @@ Response description
}
```
- `name`, `page_key`, `page_state`, and `page_type` are as in [/manage\_pages/get\_pages](#manage_pages_get_pages).
-
- As you can see, even the 'simple' mode can get very large. Imagine that response for a page watching 100 threads! Turning simple mode off will display every import item, gallery log entry, and all hashes in the media (thumbnail) panel.
-
- For this first version, the five importer pages--hdd import, simple downloader, url downloader, gallery page, and watcher page--all give rich info based on their specific variables. The first three only have one importer/gallery log combo, but the latter two of course can have multiple. The "imports" and "gallery_log" entries are all in the same data format.
-
+`name`, `page_key`, `page_state`, and `page_type` are as in [/manage\_pages/get\_pages](#manage_pages_get_pages).
+
+As you can see, even the 'simple' mode can get very large. Imagine that response for a page watching 100 threads! Turning simple mode off will display every import item, gallery log entry, and all hashes in the media (thumbnail) panel.
+
+For this first version, the five importer pages--hdd import, simple downloader, url downloader, gallery page, and watcher page--all give rich info based on their specific variables. The first three only have one importer/gallery log combo, but the latter two of course can have multiple. The "imports" and "gallery_log" entries are all in the same data format.
### **POST `/manage_pages/add_files`** { id="manage_pages_add_files" }
@@ -2840,7 +2977,7 @@ _Add files to a page._
Restricted access:
: YES. Manage Pages permission needed.
-
+
Required Headers:
:
* `Content-Type`: application/json
@@ -2860,7 +2997,7 @@ The files you set will be appended to the given page, just like a thumbnail drag
```
Response:
-: 200 with no content. If the page key is not found, this will 404.
+: 200 with no content. If the page key is not found, it will 404. If you try to add files to a 'page of pages' (i.e. `is_media_page=false` in the [/manage\_pages/get\_pages](#manage_pages_get_pages) call), you'll get 400.
### **POST `/manage_pages/focus_page`** { id="manage_pages_focus_page" }
diff --git a/docs/old_changelog.html b/docs/old_changelog.html
index 762938e05..ddaf0d975 100644
--- a/docs/old_changelog.html
+++ b/docs/old_changelog.html
@@ -34,6 +34,37 @@
+ -
+
+
+ misc
+ - fixed a logical hole in the recent 'is this URL that is saying (deleted/already in db) trustworthy, or does it have weird mappings to other files?' pre-download check that was causing Pixiv, Kemono, Twitter, and any other multiple-URL Post URL Class to, on re-encountering the URL in a downloader, classify the underlying file URL as untrustworthy and re-download the files (!!)
+ - the 'copy all' and paste buttons in the manage known urls dialog are replaced with icon buttons, and the copy button now copies the current selection if there is one
+ - the newish Regex input widget (the one that goes green/red based on current text validity) now propagates an Enter key into a dialog ok event when appropriate
+ - when you ctrl+double-click a taglist, the program now ensures the item under the mouse is selected before firing off the double-click nega-activation event. this is slightly awkward, but I hope it smooths out the awkward moment where you want to invert a selection of tags but a normal ctrl+double-click on them causes one of them to be deselected and then messes up the selection
+ - regex URL searches are now always the last job to run in a file query. if you mix in any other predicate like filesize or just some tag, your regex URL searches should run massively massively faster
+ - improved some boot error handling when Qt fails to import
+ - fixed the whack alignment of the 'filename'/'first directory'/etc. checkbox-and-text-edit widgets in the filename tagging panel, and set 'namespace' placeholder text
+ - force-selecting a null value in a 'select one from this list of things' dialog should no longer raise errors
+ - thanks to a user, the new shimmie parser gets tags in a simpler, more reliable way
+ client api
+ - added a new permission, `Commit Pending` (12), which allows you to see and commit pending content for each service
+ - added `/manage_services/get_pending_counts`, which basically returns the content of the client's 'pending' menu
+ - added `/manage_services/commit_pending`, which fires those commands off
+ - added `/manage_services/forget_pending`, which does the same 'forget' command on that menu
+ - added `/manage_file_relationships/remove_potentials`, which clears any known potential pairs off the given files
+ - the `/manage_pages/get_pages` and `/manage_pages/get_page_info` commands now return an `is_media_page` boolean, which is a simple shorthand for 'not a page of pages'
+ - added the above to the Client API help
+ - wrote unit tests covering the above
+ - the client api version is now 66
+ boring cleanup
+ - fixed up how some lists deliver their underlying data to various methods
+ - `CallBlockingToQt` no longer spams encountered errors to the log--proper error handling should (and does now) occur elsewhere
+ - the way the initial focus is set on system predicate flesh-out panels (when you double-click on like 'system:dimensions' and get a bunch of sub-panels) is more sane. should be, fairly reliably, on the first editable panel's ok button. I want it to be on an editable widget in the panel in future, I think, but I need to do some behind the scenes stuff to make this work in a nicer way
+ - pulled some stuff out of `HydrusData`, mostly to `HydrusNumbers`, `HydrusLists`, and the new `HydrusProcess`, mostly for decoupling purposes
+ - renamed some `ConvertXToY` stuff to just `XToY`
+
+
-
diff --git a/hydrus/client/ClientAPI.py b/hydrus/client/ClientAPI.py
index 5a76950ce..126507d99 100644
--- a/hydrus/client/ClientAPI.py
+++ b/hydrus/client/ClientAPI.py
@@ -23,6 +23,7 @@
CLIENT_API_PERMISSION_EDIT_RATINGS = 9
CLIENT_API_PERMISSION_MANAGE_POPUPS = 10
CLIENT_API_PERMISSION_EDIT_TIMES = 11
+CLIENT_API_PERMISSION_COMMIT_PENDING = 12
ALLOWED_PERMISSIONS = (
CLIENT_API_PERMISSION_ADD_FILES,
@@ -36,7 +37,8 @@
CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS,
CLIENT_API_PERMISSION_EDIT_RATINGS,
CLIENT_API_PERMISSION_MANAGE_POPUPS,
- CLIENT_API_PERMISSION_EDIT_TIMES
+ CLIENT_API_PERMISSION_EDIT_TIMES,
+ CLIENT_API_PERMISSION_COMMIT_PENDING
)
basic_permission_to_str_lookup = {}
@@ -53,6 +55,7 @@
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_EDIT_RATINGS ] = 'edit file ratings'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_POPUPS ] = 'manage popups'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_EDIT_TIMES ] = 'edit file times'
+basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_COMMIT_PENDING ] = 'commit pending'
SEARCH_RESULTS_CACHE_TIMEOUT = 4 * 3600
diff --git a/hydrus/client/ClientController.py b/hydrus/client/ClientController.py
index d30eb9a13..78543fc9e 100644
--- a/hydrus/client/ClientController.py
+++ b/hydrus/client/ClientController.py
@@ -17,6 +17,7 @@
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusNumbers
+from hydrus.core import HydrusProcess
from hydrus.core import HydrusPSUtil
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusThreading
@@ -436,8 +437,8 @@ def qt_code( win: QW.QWidget, job_status: ClientThreading.JobStatus ):
job_status.SetErrorException( e )
- HydrusData.Print( 'CallBlockingToQt just caught this error:' )
- HydrusData.DebugPrint( traceback.format_exc() )
+ #HydrusData.Print( 'CallBlockingToQt just caught this error:' )
+ #HydrusData.DebugPrint( traceback.format_exc() )
finally:
@@ -557,7 +558,7 @@ def CatchSignal( self, sig, frame ):
def CheckAlreadyRunning( self ):
- while HydrusData.IsAlreadyRunning( self.db_dir, 'client' ):
+ while HydrusProcess.IsAlreadyRunning( self.db_dir, 'client' ):
self.frame_splash_status.SetText( 'client already running' )
@@ -581,7 +582,7 @@ def qt_code():
for i in range( 10, 0, -1 ):
- if not HydrusData.IsAlreadyRunning( self.db_dir, 'client' ):
+ if not HydrusProcess.IsAlreadyRunning( self.db_dir, 'client' ):
break
diff --git a/hydrus/client/ClientDownloading.py b/hydrus/client/ClientDownloading.py
index e169515ec..acf9562a7 100644
--- a/hydrus/client/ClientDownloading.py
+++ b/hydrus/client/ClientDownloading.py
@@ -269,7 +269,7 @@ def MainLoop( self ):
total_done = total_hashes_in_this_run - len( hashes_still_to_download_in_this_run )
- job_status.SetStatusText( 'downloading files from remote services: {}'.format( HydrusData.ConvertValueRangeToPrettyString( total_done, total_hashes_in_this_run ) ) )
+ job_status.SetStatusText( 'downloading files from remote services: {}'.format( HydrusNumbers.ValueRangeToPrettyString( total_done, total_hashes_in_this_run ) ) )
job_status.SetVariable( 'popup_gauge_1', ( total_done, total_hashes_in_this_run ) )
try:
diff --git a/hydrus/client/ClientFiles.py b/hydrus/client/ClientFiles.py
index ac2529c63..a078d8a2e 100644
--- a/hydrus/client/ClientFiles.py
+++ b/hydrus/client/ClientFiles.py
@@ -1443,7 +1443,7 @@ def ClearOrphans( self, move_location = None ):
HydrusData.Print( 'Deleting the orphan ' + path )
- status = 'deleting orphan files: ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( orphan_paths ) )
+ status = 'deleting orphan files: ' + HydrusNumbers.ValueRangeToPrettyString( i + 1, len( orphan_paths ) )
job_status.SetStatusText( status )
@@ -1470,7 +1470,7 @@ def ClearOrphans( self, move_location = None ):
HydrusData.Print( 'Deleting the orphan ' + path )
- status = 'deleting orphan thumbnails: ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( orphan_thumbnails ) )
+ status = 'deleting orphan thumbnails: ' + HydrusNumbers.ValueRangeToPrettyString( i + 1, len( orphan_thumbnails ) )
job_status.SetStatusText( status )
@@ -3065,7 +3065,7 @@ def job_done_hook():
num_jobs_done = vr_status[ 'num_jobs_done' ]
- status_text = '{}'.format( HydrusData.ConvertValueRangeToPrettyString( num_jobs_done, total_num_jobs_to_do ) )
+ status_text = '{}'.format( HydrusNumbers.ValueRangeToPrettyString( num_jobs_done, total_num_jobs_to_do ) )
job_status.SetStatusText( status_text )
@@ -3303,7 +3303,7 @@ def job_done_hook():
num_jobs_done = vr_status[ 'num_jobs_done' ]
- status_text = '{} - {}'.format( HydrusData.ConvertValueRangeToPrettyString( num_jobs_done, total_num_jobs_to_do ), regen_file_enum_to_str_lookup[ job_type ] )
+ status_text = '{} - {}'.format( HydrusNumbers.ValueRangeToPrettyString( num_jobs_done, total_num_jobs_to_do ), regen_file_enum_to_str_lookup[ job_type ] )
job_status.SetStatusText( status_text )
diff --git a/hydrus/client/ClientParsing.py b/hydrus/client/ClientParsing.py
index f8ab80d56..33a1ffe44 100644
--- a/hydrus/client/ClientParsing.py
+++ b/hydrus/client/ClientParsing.py
@@ -683,6 +683,46 @@ def MakeParsedTextPretty( parsed_text ):
return parsed_text
+def ParseHashesFromRawHexText( hash_type, hex_hashes_raw ):
+
+ hash_type_to_hex_length = {
+ 'md5' : 32,
+ 'sha1' : 40,
+ 'sha256' : 64,
+ 'sha512' : 128,
+ 'pixel' : 64,
+ 'perceptual' : 16
+ }
+
+ hex_hashes = HydrusText.DeserialiseNewlinedTexts( hex_hashes_raw )
+
+ # convert md5:abcd to abcd
+ hex_hashes = [ hex_hash.split( ':' )[-1] for hex_hash in hex_hashes ]
+
+ hex_hashes = [ HydrusText.HexFilter( hex_hash ) for hex_hash in hex_hashes ]
+
+ expected_hex_length = hash_type_to_hex_length[ hash_type ]
+
+ bad_hex_hashes = [ hex_hash for hex_hash in hex_hashes if len( hex_hash ) != expected_hex_length ]
+
+ if len( bad_hex_hashes ):
+
+ m = 'Sorry, {} hashes should have {} hex characters! These did not:'.format( hash_type, expected_hex_length )
+ m += '\n' * 2
+ m += '\n'.join( ( '{} ({} characters)'.format( bad_hex_hash, len( bad_hex_hash ) ) for bad_hex_hash in bad_hex_hashes ) )
+
+ raise Exception( m )
+
+
+ hex_hashes = [ hex_hash for hex_hash in hex_hashes if len( hex_hash ) % 2 == 0 ]
+
+ hex_hashes = HydrusData.DedupeList( hex_hashes )
+
+ hashes = tuple( [ bytes.fromhex( hex_hash ) for hex_hash in hex_hashes ] )
+
+ return hashes
+
+
def ParseResultsHavePursuableURLs( results ):
for ( ( name, content_type, additional_info ), parsed_text ) in results:
@@ -713,7 +753,7 @@ def RenderJSONParseRule( rule ):
index = parse_rule
- s = 'get the ' + HydrusNumbers.ConvertIndexToPrettyOrdinalString( index ) + ' item (for Objects, keys sorted)'
+ s = 'get the ' + HydrusNumbers.IndexToPrettyOrdinalString( index ) + ' item (for Objects, keys sorted)'
elif parse_rule_type == JSON_PARSE_RULE_TYPE_DICT_KEY:
@@ -1657,7 +1697,7 @@ def ToString( self ):
else:
- s += ' the ' + HydrusNumbers.ConvertIndexToPrettyOrdinalString( self._tag_index )
+ s += ' the ' + HydrusNumbers.IndexToPrettyOrdinalString( self._tag_index )
if self._tag_name is not None:
@@ -1682,7 +1722,7 @@ def ToString( self ):
else:
- s += ' to the ' + HydrusNumbers.ConvertIntToPrettyOrdinalString( self._tag_depth ) + ' <' + self._tag_name + '> tag'
+ s += ' to the ' + HydrusNumbers.IntToPrettyOrdinalString( self._tag_depth ) + ' <' + self._tag_name + '> tag'
diff --git a/hydrus/client/ClientServices.py b/hydrus/client/ClientServices.py
index a08a1e1f1..ff27f7e38 100644
--- a/hydrus/client/ClientServices.py
+++ b/hydrus/client/ClientServices.py
@@ -18,7 +18,6 @@
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
from hydrus.core import HydrusTime
-from hydrus.core.networking import HydrusNATPunch
from hydrus.core.networking import HydrusNetwork
from hydrus.core.networking import HydrusNetworkVariableHandling
from hydrus.core.networking import HydrusNetworking
@@ -38,6 +37,25 @@
SHORT_DELAY_PERIOD = 50000
ACCOUNT_SYNC_PERIOD = 250000
+def ConvertNumericalRatingToPrettyString( lower, upper, rating, rounded_result = False, out_of = True ):
+
+ rating_converted = ( rating * ( upper - lower ) ) + lower
+
+ if rounded_result:
+
+ rating_converted = round( rating_converted )
+
+
+ s = '{:.2f}'.format( rating_converted )
+
+ if out_of and lower in ( 0, 1 ):
+
+ s += '/{:.2f}'.format( upper )
+
+
+ return s
+
+
def GenerateDefaultServiceDictionary( service_type ):
dictionary = HydrusSerialisable.SerialisableDictionary()
@@ -680,7 +698,7 @@ def ConvertNoneableRatingToString( self, rating: typing.Optional[ float ] ):
rating_value = self.ConvertRatingToStars( rating )
rating_range = self._num_stars
- return HydrusData.ConvertValueRangeToPrettyString( rating_value, rating_range )
+ return HydrusNumbers.ValueRangeToPrettyString( rating_value, rating_range )
return 'unknown'
@@ -1653,7 +1671,7 @@ def _ReportOngoingRowSpeed( self, job_status, rows_done, total_rows, precise_tim
rows_s = HydrusNumbers.ToHumanInt( int( rows_done_in_last_packet / it_took ) )
- popup_message = '{} {}: processing at {} rows/s'.format( row_name, HydrusData.ConvertValueRangeToPrettyString( rows_done, total_rows ), rows_s )
+ popup_message = '{} {}: processing at {} rows/s'.format( row_name, HydrusNumbers.ValueRangeToPrettyString( rows_done, total_rows ), rows_s )
CG.client_controller.frame_splash_status.SetText( popup_message, print_to_log = False )
job_status.SetStatusText( popup_message, 2 )
@@ -1770,7 +1788,7 @@ def _SyncDownloadUpdates( self, stop_time ):
for ( i, update_hash ) in enumerate( update_hashes ):
- status = 'update ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( update_hashes ) )
+ status = 'update ' + HydrusNumbers.ValueRangeToPrettyString( i + 1, len( update_hashes ) )
CG.client_controller.frame_splash_status.SetText( status, print_to_log = False )
job_status.SetStatusText( status )
@@ -1968,7 +1986,7 @@ def _SyncProcessUpdates( self, maintenance_mode = HC.MAINTENANCE_IDLE, stop_time
for ( definition_hash, content_types ) in definition_hashes_and_content_types:
- progress_string = HydrusData.ConvertValueRangeToPrettyString( num_updates_done + 1, num_updates_to_do )
+ progress_string = HydrusNumbers.ValueRangeToPrettyString( num_updates_done + 1, num_updates_to_do )
splash_title = '{} sync: processing updates {}'.format( self._name, progress_string )
@@ -2097,7 +2115,7 @@ def _SyncProcessUpdates( self, maintenance_mode = HC.MAINTENANCE_IDLE, stop_time
for ( content_hash, content_types ) in content_hashes_and_content_types:
- progress_string = HydrusData.ConvertValueRangeToPrettyString( num_updates_done + 1, num_updates_to_do )
+ progress_string = HydrusNumbers.ValueRangeToPrettyString( num_updates_done + 1, num_updates_to_do )
splash_title = '{} sync: processing updates {}'.format( self._name, progress_string )
@@ -2716,7 +2734,7 @@ def SyncThumbnails( self, stop_time ):
for ( i, thumbnail_hash ) in enumerate( thumbnail_hashes ):
- status = 'thumbnail ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do )
+ status = 'thumbnail ' + HydrusNumbers.ValueRangeToPrettyString( i + 1, num_to_do )
CG.client_controller.frame_splash_status.SetText( status, print_to_log = False )
job_status.SetStatusText( status )
@@ -3148,7 +3166,7 @@ def PinDirectory( self, hashes, note ):
return
- job_status.SetStatusText( 'ensuring files are pinned: ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( hashes ) ) )
+ job_status.SetStatusText( 'ensuring files are pinned: ' + HydrusNumbers.ValueRangeToPrettyString( i + 1, len( hashes ) ) )
job_status.SetVariable( 'popup_gauge_1', ( i + 1, len( hashes ) ) )
media_result = CG.client_controller.Read( 'media_result', hash )
@@ -3202,7 +3220,7 @@ def PinDirectory( self, hashes, note ):
return
- job_status.SetStatusText( 'creating directory: ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( file_info ) ) )
+ job_status.SetStatusText( 'creating directory: ' + HydrusNumbers.ValueRangeToPrettyString( i + 1, len( file_info ) ) )
job_status.SetVariable( 'popup_gauge_1', ( i + 1, len( file_info ) ) )
object_multihash = response_json[ 'Hash' ]
@@ -3416,6 +3434,7 @@ def UnpinFile( self, hash, multihash ):
CG.client_controller.WriteSynchronous( 'content_updates', ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdates( self._service_key, content_updates ) )
+
class ServicesManager( object ):
def __init__( self, controller ):
@@ -3613,3 +3632,17 @@ def ServiceExists( self, service_key: bytes ):
+
+class ServiceUpdate( object ):
+
+ def __init__( self, action, row = None ):
+
+ self._action = action
+ self._row = row
+
+
+ def ToTuple( self ):
+
+ return ( self._action, self._row )
+
+
diff --git a/hydrus/client/ClientStrings.py b/hydrus/client/ClientStrings.py
index bb27e16f8..2174d9f73 100644
--- a/hydrus/client/ClientStrings.py
+++ b/hydrus/client/ClientStrings.py
@@ -978,7 +978,7 @@ def ToString( self, simple = False, with_type = False ) -> str:
elif self.SelectsOne():
- result = 'selecting the {} string'.format( HydrusNumbers.ConvertIndexToPrettyOrdinalString( self._index_start ) )
+ result = 'selecting the {} string'.format( HydrusNumbers.IndexToPrettyOrdinalString( self._index_start ) )
elif self._index_start is None and self._index_end is None:
@@ -986,15 +986,15 @@ def ToString( self, simple = False, with_type = False ) -> str:
elif self._index_start is not None and self._index_end is None:
- result = 'selecting the {} string and onwards'.format( HydrusNumbers.ConvertIndexToPrettyOrdinalString( self._index_start ) )
+ result = 'selecting the {} string and onwards'.format( HydrusNumbers.IndexToPrettyOrdinalString( self._index_start ) )
elif self._index_start is None and self._index_end is not None:
- result = 'selecting up to and including the {} string'.format( HydrusNumbers.ConvertIndexToPrettyOrdinalString( self._index_end - 1 ) )
+ result = 'selecting up to and including the {} string'.format( HydrusNumbers.IndexToPrettyOrdinalString( self._index_end - 1 ) )
else:
- result = 'selecting the {} string up to and including the {} string'.format( HydrusNumbers.ConvertIndexToPrettyOrdinalString( self._index_start ), HydrusNumbers.ConvertIndexToPrettyOrdinalString( self._index_end - 1 ) )
+ result = 'selecting the {} string up to and including the {} string'.format( HydrusNumbers.IndexToPrettyOrdinalString( self._index_start ), HydrusNumbers.IndexToPrettyOrdinalString( self._index_end - 1 ) )
if with_type:
diff --git a/hydrus/client/db/ClientDB.py b/hydrus/client/db/ClientDB.py
index 634d02504..94acfc147 100644
--- a/hydrus/client/db/ClientDB.py
+++ b/hydrus/client/db/ClientDB.py
@@ -15,6 +15,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDB
+from hydrus.core import HydrusDBBase
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusLists
@@ -179,7 +180,7 @@ def report_content_speed_to_job_status( job_status, rows_done, total_rows, preci
rows_s = HydrusNumbers.ToHumanInt( int( num_rows / it_took ) )
- popup_message = 'content row ' + HydrusData.ConvertValueRangeToPrettyString( rows_done, total_rows ) + ': processing ' + row_name + ' at ' + rows_s + ' rows/s'
+ popup_message = 'content row ' + HydrusNumbers.ValueRangeToPrettyString( rows_done, total_rows ) + ': processing ' + row_name + ' at ' + rows_s + ' rows/s'
CG.client_controller.frame_splash_status.SetText( popup_message, print_to_log = False )
job_status.SetStatusText( popup_message, 2 )
@@ -195,6 +196,7 @@ def report_speed_to_job_status( job_status, precise_timestamp, num_rows, row_nam
CG.client_controller.frame_splash_status.SetText( popup_message, print_to_log = False )
job_status.SetStatusText( popup_message, 2 )
+
def report_speed_to_log( precise_timestamp, num_rows, row_name ):
if num_rows == 0:
@@ -210,7 +212,8 @@ def report_speed_to_log( precise_timestamp, num_rows, row_name ):
HydrusData.Print( summary )
-class JobDatabaseClient( HydrusData.JobDatabase ):
+
+class JobDatabaseClient( HydrusDBBase.JobDatabase ):
def _DoDelayedResultRelief( self ):
@@ -225,6 +228,7 @@ def _DoDelayedResultRelief( self ):
+
class DB( HydrusDB.HydrusDB ):
READ_WRITE_ACTIONS = [ 'service_info', 'system_predicates', 'missing_thumbnail_hashes' ]
@@ -983,7 +987,7 @@ def _CacheTagsPopulate( self, file_service_id, tag_service_id, status_hook = Non
self.modules_tag_search.AddTags( file_service_id, tag_service_id, group_of_tag_ids )
- message = HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do )
+ message = HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do )
self._controller.frame_splash_status.SetSubtext( message )
@@ -1608,7 +1612,7 @@ def _DeletePending( self, service_key ):
self._cursor_transaction_wrapper.pub_after_job( 'notify_new_tag_display_application' )
self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )
- self.pub_service_updates_after_commit( { service_key : [ HydrusData.ServiceUpdate( HC.SERVICE_UPDATE_DELETE_PENDING ) ] } )
+ self.pub_service_updates_after_commit( { service_key : [ ClientServices.ServiceUpdate( HC.SERVICE_UPDATE_DELETE_PENDING ) ] } )
def _DeleteService( self, service_id ):
@@ -1640,7 +1644,7 @@ def _DeleteService( self, service_id ):
self.modules_services.DeleteService( service_id )
- service_update = HydrusData.ServiceUpdate( HC.SERVICE_UPDATE_RESET )
+ service_update = ClientServices.ServiceUpdate( HC.SERVICE_UPDATE_RESET )
service_keys_to_service_updates = { service_key : [ service_update ] }
@@ -7528,7 +7532,7 @@ def _RegenerateTagMappingsTags( self, tags, tag_service_key = None ):
# yes we do want 'actual' here I think. we are regenning the actual current computation
# maybe we'll ultimately expand this to the ideal also, we'll see how it goes
- affected_tag_ids = HydrusData.MassUnion( ( self.modules_tag_display.GetChainsMembers( ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, tag_service_id, ( tag_id, ) ) for tag_id in tag_ids ) )
+ affected_tag_ids = HydrusLists.MassUnion( ( self.modules_tag_display.GetChainsMembers( ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, tag_service_id, ( tag_id, ) ) for tag_id in tag_ids ) )
tag_service_ids_to_affected_tag_ids[ tag_service_id ] = affected_tag_ids
@@ -7543,7 +7547,7 @@ def _RegenerateTagMappingsTags( self, tags, tag_service_key = None ):
job_status.SetStatusText( message )
self._controller.frame_splash_status.SetSubtext( message )
- all_affected_tag_ids = HydrusData.MassUnion( tag_service_ids_to_affected_tag_ids.values() )
+ all_affected_tag_ids = HydrusLists.MassUnion( tag_service_ids_to_affected_tag_ids.values() )
self.modules_tags_local_cache.DropTagIdsFromCache( all_affected_tag_ids )
# I think the clever add is done by later regen gubbins here
@@ -7922,7 +7926,7 @@ def _RepairInvalidTags( self, job_status: typing.Optional[ ClientThreading.JobSt
break
- message = 'Scanning tags: {} - Bad Found: {}'.format( HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ), HydrusNumbers.ToHumanInt( bad_tag_count ) )
+ message = 'Scanning tags: {} - Bad Found: {}'.format( HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ), HydrusNumbers.ToHumanInt( bad_tag_count ) )
job_status.SetStatusText( message )
@@ -7965,7 +7969,7 @@ def _RepairInvalidTags( self, job_status: typing.Optional[ ClientThreading.JobSt
break
- message = 'Fixing bad tags: {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, bad_tag_count ) )
+ message = 'Fixing bad tags: {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, bad_tag_count ) )
job_status.SetStatusText( message )
@@ -8069,7 +8073,7 @@ def _RepopulateMappingsFromCache( self, tag_service_key = None, job_status = Non
if job_status is not None:
- message = 'Doing "{}": {}'.format( name, HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ) )
+ message = 'Doing "{}": {}'.format( name, HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ) )
message += '\n' * 2
message += 'Total rows recovered: {}'.format( HydrusNumbers.ToHumanInt( num_rows_recovered ) )
@@ -8211,7 +8215,7 @@ def _RepopulateTagDisplayMappingsCache( self, tag_service_key = None ):
for ( group_of_ids, num_done, num_to_do ) in HydrusDB.ReadLargeIdQueryInSeparateChunks( self._c, 'SELECT hash_id FROM {};'.format( table_name ), 1024 ):
- message = 'repopulating {} {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, len( file_service_ids ) ), HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ) )
+ message = 'repopulating {} {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, len( file_service_ids ) ), HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ) )
job_status.SetStatusText( message )
self._controller.frame_splash_status.SetSubtext( message )
@@ -8868,7 +8872,7 @@ def ask_what_to_do_png_stuff():
for ( chunk_of_tag_ids, num_done, num_to_do ) in HydrusDB.ReadLargeIdQueryInSeparateChunks( self._c, f'SELECT tag_id FROM {tags_table_name};', CHUNK_SIZE ):
- num_string = HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do )
+ num_string = HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do )
self._controller.frame_splash_status.SetSubtext( f'{message} - {num_string}' )
@@ -9892,7 +9896,7 @@ def ask_what_to_do_concatenated_urls():
for ( chunk_of_url_ids, num_done, num_to_do ) in HydrusDB.ReadLargeIdQueryInSeparateChunks( self._c, f'SELECT url_id FROM urls;', CHUNK_SIZE ):
- num_string = HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do )
+ num_string = HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do )
self._controller.frame_splash_status.SetSubtext( f'bad url scan - {num_string} - bad urls: {HydrusNumbers.ToHumanInt(len( bad_url_ids))}' )
@@ -10450,6 +10454,36 @@ def ask_what_to_do_zip_docx_scan():
+ if version == 583:
+
+ try:
+
+ domain_manager: ClientNetworkingDomain.NetworkDomainManager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
+
+ domain_manager.Initialise()
+
+ domain_manager.OverwriteDefaultParsers( [
+ 'shimmie file page parser'
+ ] )
+
+ #
+
+ domain_manager.TryToLinkURLClassesAndParsers()
+
+ #
+
+ self.modules_serialisable.SetJSONDump( domain_manager )
+
+ except Exception as e:
+
+ HydrusData.PrintException( e )
+
+ message = 'Trying to update some downloaders failed! Please let hydrus dev know!'
+
+ self.pub_initial_message( message )
+
+
+
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusNumbers.ToHumanInt( version + 1 ) ) )
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
diff --git a/hydrus/client/db/ClientDBDefinitionsCache.py b/hydrus/client/db/ClientDBDefinitionsCache.py
index 7e87dc44a..c23c608ec 100644
--- a/hydrus/client/db/ClientDBDefinitionsCache.py
+++ b/hydrus/client/db/ClientDBDefinitionsCache.py
@@ -276,7 +276,7 @@ def SyncHashIds( self, all_hash_ids: typing.Collection[ int ], job_status = None
block_of_hash_ids = set( block_of_hash_ids )
- text = 'syncing local hashes {}'.format( HydrusData.ConvertValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) )
+ text = 'syncing local hashes {}'.format( HydrusNumbers.ValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) )
CG.client_controller.frame_splash_status.SetSubtext( text )
job_status.SetStatusText( text )
diff --git a/hydrus/client/db/ClientDBFilesSearch.py b/hydrus/client/db/ClientDBFilesSearch.py
index 93f154b2a..fd51d220e 100644
--- a/hydrus/client/db/ClientDBFilesSearch.py
+++ b/hydrus/client/db/ClientDBFilesSearch.py
@@ -1116,6 +1116,77 @@ def _DoSimpleRatingPreds( self, file_search_context: ClientSearch.FileSearchCont
+ return query_hash_ids
+
+
+ def _DoSpecificKnownURLPreds( self, file_search_context: ClientSearch.FileSearchContext, query_hash_ids: typing.Optional[ typing.Set[ int ] ], allowed_rule_types: typing.Collection[ str ] ) -> typing.Optional[ typing.Set[ int ] ]:
+
+ system_predicates = file_search_context.GetSystemPredicates()
+
+ is_inbox = system_predicates.MustBeInbox()
+
+ simple_preds = system_predicates.GetSimpleInfo()
+
+ if 'known_url_rules' in simple_preds:
+
+ known_url_rules = list( simple_preds[ 'known_url_rules' ] )
+
+ magic_sort_list = [
+ 'exact_match',
+ 'domain',
+ 'url_class',
+ 'url_match',
+ 'regex'
+ ]
+
+ def url_rules_key( row ):
+
+ rule_type = row[1]
+
+ if rule_type in magic_sort_list:
+
+ return magic_sort_list.index( rule_type )
+
+ else:
+
+ return 10
+
+
+
+ known_url_rules.sort( key = url_rules_key )
+
+ for ( operator, rule_type, rule ) in known_url_rules:
+
+ if rule_type not in allowed_rule_types:
+
+ continue
+
+
+ if rule_type == 'exact_match' or ( is_inbox and len( query_hash_ids ) == len( self.modules_files_inbox.inbox_hash_ids ) ):
+
+ url_hash_ids = self.modules_url_map.GetHashIdsFromURLRule( rule_type, rule )
+
+ else:
+
+ with self._MakeTemporaryIntegerTable( query_hash_ids, 'hash_id' ) as temp_table_name:
+
+ self._AnalyzeTempTable( temp_table_name )
+
+ url_hash_ids = self.modules_url_map.GetHashIdsFromURLRule( rule_type, rule, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name )
+
+
+
+ if operator: # inclusive
+
+ query_hash_ids = intersection_update_qhi( query_hash_ids, url_hash_ids )
+
+ else:
+
+ query_hash_ids.difference_update( url_hash_ids )
+
+
+
+
return query_hash_ids
@@ -2149,60 +2220,14 @@ def sort_longest_tag_first_key( s ):
query_hash_ids = intersection_update_qhi( query_hash_ids, url_hash_ids )
- if 'known_url_rules' in simple_preds:
-
- known_url_rules = list( simple_preds[ 'known_url_rules' ] )
-
- magic_sort_list = [
- 'exact_match',
- 'domain',
- 'url_class',
- 'url_match',
- 'regex'
- ]
-
- def url_rules_key( row ):
-
- rule_type = row[1]
-
- if rule_type in magic_sort_list:
-
- return magic_sort_list.index( rule_type )
-
- else:
-
- return 10
-
-
-
- known_url_rules.sort( key = url_rules_key )
-
- for ( operator, rule_type, rule ) in known_url_rules:
-
- if rule_type == 'exact_match' or ( is_inbox and len( query_hash_ids ) == len( self.modules_files_inbox.inbox_hash_ids ) ):
-
- url_hash_ids = self.modules_url_map.GetHashIdsFromURLRule( rule_type, rule )
-
- else:
-
- with self._MakeTemporaryIntegerTable( query_hash_ids, 'hash_id' ) as temp_table_name:
-
- self._AnalyzeTempTable( temp_table_name )
-
- url_hash_ids = self.modules_url_map.GetHashIdsFromURLRule( rule_type, rule, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name )
-
-
-
- if operator: # inclusive
-
- query_hash_ids = intersection_update_qhi( query_hash_ids, url_hash_ids )
-
- else:
-
- query_hash_ids.difference_update( url_hash_ids )
-
-
-
+ allowed_job_types = [
+ 'exact_match',
+ 'domain',
+ 'url_class',
+ 'url_match'
+ ]
+
+ query_hash_ids = self._DoSpecificKnownURLPreds( file_search_context, query_hash_ids, allowed_job_types )
#
@@ -2304,6 +2329,17 @@ def url_rules_key( row ):
return []
+ #
+
+ allowed_job_types = [ 'regex' ]
+
+ query_hash_ids = self._DoSpecificKnownURLPreds( file_search_context, query_hash_ids, allowed_job_types )
+
+ if job_status.IsCancelled():
+
+ return []
+
+
#
query_hash_ids = list( query_hash_ids )
diff --git a/hydrus/client/db/ClientDBMappingsCacheCombinedFilesDisplay.py b/hydrus/client/db/ClientDBMappingsCacheCombinedFilesDisplay.py
index 101fb2d54..91d8d047f 100644
--- a/hydrus/client/db/ClientDBMappingsCacheCombinedFilesDisplay.py
+++ b/hydrus/client/db/ClientDBMappingsCacheCombinedFilesDisplay.py
@@ -5,7 +5,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
-from hydrus.core import HydrusTime
+from hydrus.core import HydrusNumbers
from hydrus.client.db import ClientDBFilesStorage
from hydrus.client.db import ClientDBMappingsCounts
@@ -541,7 +541,7 @@ def RegeneratePending( self, tag_service_id, status_hook = None ):
if i % 100 == 0 and status_hook is not None:
- message = 'regenerating pending tags {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) )
+ message = 'regenerating pending tags {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, num_to_do ) )
status_hook( message )
diff --git a/hydrus/client/db/ClientDBMappingsCacheCombinedFilesStorage.py b/hydrus/client/db/ClientDBMappingsCacheCombinedFilesStorage.py
index d53c2e688..dade31fb3 100644
--- a/hydrus/client/db/ClientDBMappingsCacheCombinedFilesStorage.py
+++ b/hydrus/client/db/ClientDBMappingsCacheCombinedFilesStorage.py
@@ -2,9 +2,8 @@
import sqlite3
import typing
-from hydrus.core import HydrusData
from hydrus.core import HydrusDB
-from hydrus.core import HydrusTime
+from hydrus.core import HydrusNumbers
from hydrus.client.db import ClientDBMappingsCacheCombinedFilesDisplay
from hydrus.client.db import ClientDBMappingsCounts
@@ -119,7 +118,7 @@ def RegeneratePending( self, tag_service_id, status_hook = None ):
if i % 100 == 0 and status_hook is not None:
- message = 'regenerating pending tags {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) )
+ message = 'regenerating pending tags {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, num_to_do ) )
status_hook( message )
diff --git a/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py b/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py
index 7d13a8194..720a02e05 100644
--- a/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py
+++ b/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py
@@ -6,7 +6,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDBBase
-from hydrus.core import HydrusTime
+from hydrus.core import HydrusNumbers
from hydrus.client.db import ClientDBMaintenance
from hydrus.client.db import ClientDBMappingsCounts
@@ -669,7 +669,7 @@ def RegeneratePending( self, file_service_id, tag_service_id, status_hook = None
if i % 100 == 0 and status_hook is not None:
- message = 'regenerating pending tags {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) )
+ message = 'regenerating pending tags {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, num_to_do ) )
status_hook( message )
diff --git a/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py b/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py
index 1b96b4cae..8eb502808 100644
--- a/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py
+++ b/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py
@@ -7,7 +7,7 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusLists
-from hydrus.core import HydrusTime
+from hydrus.core import HydrusNumbers
from hydrus.client.db import ClientDBFilesStorage
from hydrus.client.db import ClientDBMaintenance
@@ -645,7 +645,7 @@ def RegeneratePending( self, file_service_id, tag_service_id, status_hook = None
if i % 100 == 0 and status_hook is not None:
- message = 'regenerating pending tags {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) )
+ message = 'regenerating pending tags {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, num_to_do ) )
status_hook( message )
diff --git a/hydrus/client/db/ClientDBSimilarFiles.py b/hydrus/client/db/ClientDBSimilarFiles.py
index abe5208b0..eaf92d004 100644
--- a/hydrus/client/db/ClientDBSimilarFiles.py
+++ b/hydrus/client/db/ClientDBSimilarFiles.py
@@ -7,6 +7,7 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusTime
@@ -162,7 +163,7 @@ def _GenerateBranch( self, job_status, parent_id, perceptual_hash_id, perceptual
while len( process_queue ) > 0:
- job_status.SetStatusText( 'generating new branch -- ' + HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ), 2 )
+ job_status.SetStatusText( 'generating new branch -- ' + HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ), 2 )
( parent_id, perceptual_hash_id, perceptual_hash, children ) = process_queue.popleft()
@@ -204,7 +205,7 @@ def _GenerateBranch( self, job_status, parent_id, perceptual_hash_id, perceptual
inner_population = len( inner_children )
outer_population = len( outer_children )
- ( inner_id, inner_perceptual_hash ) = self._PopBestRootNode( inner_children ) #HydrusData.MedianPop( inner_children )
+ ( inner_id, inner_perceptual_hash ) = self._PopBestRootNode( inner_children ) #HydrusLists.MedianPop( inner_children )
if len( outer_children ) == 0:
@@ -212,7 +213,7 @@ def _GenerateBranch( self, job_status, parent_id, perceptual_hash_id, perceptual
else:
- ( outer_id, outer_perceptual_hash ) = self._PopBestRootNode( outer_children ) #HydrusData.MedianPop( outer_children )
+ ( outer_id, outer_perceptual_hash ) = self._PopBestRootNode( outer_children ) #HydrusLists.MedianPop( outer_children )
@@ -709,7 +710,7 @@ def MaintainTree( self, maintenance_mode = HC.MAINTENANCE_FORCED, job_status = N
num_done = num_to_do - len( rebalance_perceptual_hash_ids )
- text = 'rebalancing similar file metadata - ' + HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do )
+ text = 'rebalancing similar file metadata - ' + HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do )
CG.client_controller.frame_splash_status.SetSubtext( text )
job_status.SetStatusText( text )
diff --git a/hydrus/client/db/ClientDBTagSearch.py b/hydrus/client/db/ClientDBTagSearch.py
index 0689ff915..f12c2201a 100644
--- a/hydrus/client/db/ClientDBTagSearch.py
+++ b/hydrus/client/db/ClientDBTagSearch.py
@@ -1542,7 +1542,7 @@ def RegenerateSearchableSubtagMap( self, file_service_id, tag_service_id, status
- message = HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do )
+ message = HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do )
CG.client_controller.frame_splash_status.SetSubtext( message )
diff --git a/hydrus/client/duplicates/ClientDuplicates.py b/hydrus/client/duplicates/ClientDuplicates.py
index 907d22a69..44a651f7a 100644
--- a/hydrus/client/duplicates/ClientDuplicates.py
+++ b/hydrus/client/duplicates/ClientDuplicates.py
@@ -187,7 +187,7 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
percentage_difference = ( s_size / c_size ) - 1.0
- percentage_different_string = ' ({}{})'.format( sign, HydrusData.ConvertFloatToPercentage( percentage_difference ) )
+ percentage_different_string = ' ({}{})'.format( sign, HydrusNumbers.FloatToPercentage( percentage_difference ) )
if is_a_pixel_dupe:
@@ -248,7 +248,7 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
else:
- s_string = HydrusData.ConvertResolutionToPrettyString( s_resolution )
+ s_string = HydrusNumbers.ResolutionToPrettyString( s_resolution )
if s_w % 2 == 1 or s_h % 2 == 1:
@@ -262,7 +262,7 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
else:
- c_string = HydrusData.ConvertResolutionToPrettyString( c_resolution )
+ c_string = HydrusNumbers.ResolutionToPrettyString( c_resolution )
if c_w % 2 == 1 or c_h % 2 == 1:
@@ -758,7 +758,7 @@ def THREADSearchPotentials( self ):
- text = 'searching: {}'.format( HydrusData.ConvertValueRangeToPrettyString( num_searched_estimate, total_num_files ) )
+ text = 'searching: {}'.format( HydrusNumbers.ValueRangeToPrettyString( num_searched_estimate, total_num_files ) )
job_status.SetStatusText( text )
job_status.SetVariable( 'popup_gauge_1', ( num_searched_estimate, total_num_files ) )
diff --git a/hydrus/client/exporting/ClientExportingFiles.py b/hydrus/client/exporting/ClientExportingFiles.py
index c7669dde3..7cd96a036 100644
--- a/hydrus/client/exporting/ClientExportingFiles.py
+++ b/hydrus/client/exporting/ClientExportingFiles.py
@@ -513,7 +513,7 @@ def _DoExport( self, job_status: ClientThreading.JobStatus ):
for ( i, block_of_hash_ids ) in enumerate( HydrusLists.SplitListIntoChunks( query_hash_ids, 256 ) ):
- job_status.SetStatusText( 'searching: {}'.format( HydrusData.ConvertValueRangeToPrettyString( i * CHUNK_SIZE, len( query_hash_ids ) ) ) )
+ job_status.SetStatusText( 'searching: {}'.format( HydrusNumbers.ValueRangeToPrettyString( i * CHUNK_SIZE, len( query_hash_ids ) ) ) )
if job_status.IsCancelled():
@@ -551,7 +551,7 @@ def _DoExport( self, job_status: ClientThreading.JobStatus ):
for ( i, media_result ) in enumerate( media_results ):
- job_status.SetStatusText( 'exporting: {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, len( media_results ) ) ) )
+ job_status.SetStatusText( 'exporting: {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, len( media_results ) ) ) )
if job_status.IsCancelled():
@@ -663,7 +663,7 @@ def _DoExport( self, job_status: ClientThreading.JobStatus ):
return
- job_status.SetStatusText( 'delete-synchronising: {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, len( deletee_paths ) ) ) )
+ job_status.SetStatusText( 'delete-synchronising: {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, len( deletee_paths ) ) ) )
ClientPaths.DeletePath( deletee_path )
@@ -721,7 +721,7 @@ def _DoExport( self, job_status: ClientThreading.JobStatus ):
return
- job_status.SetStatusText( 'delete-prepping: {}'.format( HydrusData.ConvertValueRangeToPrettyString( i + 1, len( media_results ) ) ) )
+ job_status.SetStatusText( 'delete-prepping: {}'.format( HydrusNumbers.ValueRangeToPrettyString( i + 1, len( media_results ) ) ) )
if media_result.IsDeleteLocked():
@@ -753,7 +753,7 @@ def _DoExport( self, job_status: ClientThreading.JobStatus ):
return
- job_status.SetStatusText( 'deleting: {}'.format( HydrusData.ConvertValueRangeToPrettyString( i * CHUNK_SIZE, len( deletee_hashes ) ) ) )
+ job_status.SetStatusText( 'deleting: {}'.format( HydrusNumbers.ValueRangeToPrettyString( i * CHUNK_SIZE, len( deletee_hashes ) ) ) )
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, chunk_of_hashes, reason = reason )
diff --git a/hydrus/client/gui/ClientGUI.py b/hydrus/client/gui/ClientGUI.py
index 601ab91a8..8a7a887fb 100644
--- a/hydrus/client/gui/ClientGUI.py
+++ b/hydrus/client/gui/ClientGUI.py
@@ -276,7 +276,7 @@ def THREADUploadPending( service_key ):
num_done = num_to_do - remaining_num_pending
- job_status.SetStatusText( 'uploading to ' + service_name + ': ' + HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ) )
+ job_status.SetStatusText( 'uploading to ' + service_name + ': ' + HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ) )
job_status.SetVariable( 'popup_gauge_1', ( num_done, num_to_do ) )
while job_status.IsPaused() or job_status.IsCancelled():
@@ -1762,20 +1762,6 @@ def _DeleteGUISession( self, name ):
- def _DeletePending( self, service_key ):
-
- service_name = self._controller.services_manager.GetName( service_key )
-
- message = 'Are you sure you want to delete the pending data for {}?'.format( service_name )
-
- result = ClientGUIDialogsQuick.GetYesNo( self, message )
-
- if result == QW.QDialog.Accepted:
-
- self._controller.Write( 'delete_pending', service_key )
-
-
-
def _DeleteServiceInfo( self, only_pending = False ):
if only_pending:
@@ -2159,7 +2145,7 @@ def do_it( external_update_dir ):
finally:
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( i + 1, num_to_do ) )
job_status.SetVariable( 'popup_gauge_1', ( i, num_to_do ) )
@@ -2804,8 +2790,8 @@ def publish_callable( result ):
submenu = ClientGUIMenus.GenerateMenu( self._menubar_pending_submenu )
- ClientGUIMenus.AppendMenuItem( submenu, 'commit', 'Upload {}\'s pending content.'.format( name ), self._UploadPending, service_key )
- ClientGUIMenus.AppendMenuItem( submenu, 'forget', 'Clear {}\'s pending content.'.format( name ), self._DeletePending, service_key )
+ ClientGUIMenus.AppendMenuItem( submenu, 'commit', 'Upload {}\'s pending content.'.format( name ), self.UploadPending, service_key )
+ ClientGUIMenus.AppendMenuItem( submenu, 'forget', 'Clear {}\'s pending content.'.format( name ), self.ForgetPending, service_key )
ClientGUIMenus.SetMenuTitle( submenu, name )
@@ -6900,43 +6886,6 @@ def _UnclosePage( self, closed_page_index = None ):
self._controller.pub( 'notify_new_pages' )
- def _UploadPending( self, service_key ):
-
- service = self._controller.services_manager.GetService( service_key )
-
- try:
-
- if isinstance( service, ClientServices.ServiceRestricted ):
-
- service.CheckFunctional( including_bandwidth = False )
-
- else:
-
- service.CheckFunctional()
-
-
- if isinstance( service, ClientServices.ServiceRepository ):
-
- if not service.IsMostlyCaughtUp():
-
- raise Exception( 'Repository processing is not caught up--please process more before you upload new content.' )
-
-
-
- except Exception as e:
-
- ClientGUIDialogsMessage.ShowCritical( self, 'Error', 'Unfortunately, there is a problem with starting the upload: ' + str( e ) )
-
- return
-
-
- self._currently_uploading_pending.add( service_key )
-
- self._menu_updater_pending.update()
-
- self._controller.CallToThread( THREADUploadPending, service_key )
-
-
def _UpdateMenuPagesCount( self ):
(
@@ -7441,6 +7390,20 @@ def FlipSubscriptionsPaused( self ):
self._menu_updater_network.update()
+ def ForgetPending( self, service_key ):
+
+ service_name = self._controller.services_manager.GetName( service_key )
+
+ message = 'Are you sure you want to delete the pending data for {}?'.format( service_name )
+
+ result = ClientGUIDialogsQuick.GetYesNo( self, message )
+
+ if result == QW.QDialog.Accepted:
+
+ self._controller.Write( 'delete_pending', service_key )
+
+
+
def GetCurrentPage( self ):
return self._notebook.GetCurrentMediaPage()
@@ -7625,6 +7588,11 @@ def IsCurrentPage( self, page_key ):
+ def IsCurrentlyUploadingPending( self, service_key ):
+
+ return service_key in self._currently_uploading_pending
+
+
def MaintainCanvasFrameReferences( self ):
self._canvas_frames = [ frame for frame in self._canvas_frames if QP.isValid( frame ) ]
@@ -8651,3 +8619,47 @@ def UnregisterUIUpdateWindow( self, window ):
self._ui_update_windows.discard( window )
+ def UploadPending( self, service_key ):
+
+ if service_key in self._currently_uploading_pending:
+
+ return False
+
+
+ service = self._controller.services_manager.GetService( service_key )
+
+ try:
+
+ if isinstance( service, ClientServices.ServiceRestricted ):
+
+ service.CheckFunctional( including_bandwidth = False )
+
+ else:
+
+ service.CheckFunctional()
+
+
+ if isinstance( service, ClientServices.ServiceRepository ):
+
+ if not service.IsMostlyCaughtUp():
+
+ raise Exception( 'Repository processing is not caught up--please process more before you upload new content.' )
+
+
+
+ except Exception as e:
+
+ ClientGUIDialogsMessage.ShowCritical( self, 'Error', 'Unfortunately, there is a problem with starting the upload: ' + str( e ) )
+
+ return False
+
+
+ self._currently_uploading_pending.add( service_key )
+
+ self._menu_updater_pending.update()
+
+ self._controller.CallToThread( THREADUploadPending, service_key )
+
+ return True
+
+
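
In ClientGUI.py, `_DeletePending` and `_UploadPending` are re-homed as public `ForgetPending` and `UploadPending`, the pending menu is wired to the new names, `UploadPending` now bails out with False when that service is already mid-upload and returns True once the background job is kicked off, and `IsCurrentlyUploadingPending` exposes the in-flight set. A self-contained sketch of that guard pattern; the class and thread body below are illustrative, not the real window code:

```python
import threading
import time

class PendingUploader( object ):
    
    def __init__( self ):
        
        self._currently_uploading_pending = set()
        
    
    def IsCurrentlyUploadingPending( self, service_key ):
        
        return service_key in self._currently_uploading_pending
        
    
    def UploadPending( self, service_key ):
        
        if service_key in self._currently_uploading_pending:
            
            return False
            
        
        self._currently_uploading_pending.add( service_key )
        
        threading.Thread( target = self._THREADUploadPending, args = ( service_key, ), daemon = True ).start()
        
        return True
        
    
    def _THREADUploadPending( self, service_key ):
        
        try:
            
            time.sleep( 0.1 ) # stand-in for the actual upload work
            
        finally:
            
            self._currently_uploading_pending.discard( service_key )
            
        
    

uploader = PendingUploader()

print( uploader.UploadPending( b'service' ) ) # True
print( uploader.UploadPending( b'service' ) ) # False while the first job is still running
```
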
diff --git a/hydrus/client/gui/ClientGUIDownloaders.py b/hydrus/client/gui/ClientGUIDownloaders.py
index d7a76c20c..678bcf57b 100644
--- a/hydrus/client/gui/ClientGUIDownloaders.py
+++ b/hydrus/client/gui/ClientGUIDownloaders.py
@@ -1950,7 +1950,7 @@ def _UpdateControls( self ):
if True in ( string_match.Matches( n ) for n in ( '0', '1', '10', '100', '42' ) ):
- choices.append( ( HydrusNumbers.ConvertIntToPrettyOrdinalString( index + 1 ) + ' path component', ( ClientNetworkingURLClass.GALLERY_INDEX_TYPE_PATH_COMPONENT, index ) ) )
+ choices.append( ( HydrusNumbers.IntToPrettyOrdinalString( index + 1 ) + ' path component', ( ClientNetworkingURLClass.GALLERY_INDEX_TYPE_PATH_COMPONENT, index ) ) )
diff --git a/hydrus/client/gui/ClientGUIFileSeedCache.py b/hydrus/client/gui/ClientGUIFileSeedCache.py
index 43e1385c7..572619bea 100644
--- a/hydrus/client/gui/ClientGUIFileSeedCache.py
+++ b/hydrus/client/gui/ClientGUIFileSeedCache.py
@@ -6,7 +6,6 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusNumbers
-from hydrus.core import HydrusPaths
from hydrus.core import HydrusText
from hydrus.client import ClientConstants as CC
@@ -1003,7 +1002,7 @@ def _Update( self ):
else:
- self._progress_st.setText( HydrusData.ConvertValueRangeToPrettyString(num_done,num_to_do) )
+ self._progress_st.setText( HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ) )
self._progress_gauge.SetRange( num_to_do )
diff --git a/hydrus/client/gui/ClientGUIShortcuts.py b/hydrus/client/gui/ClientGUIShortcuts.py
index f38074c88..2b0176942 100644
--- a/hydrus/client/gui/ClientGUIShortcuts.py
+++ b/hydrus/client/gui/ClientGUIShortcuts.py
@@ -8,6 +8,7 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusSerialisable
from hydrus.client import ClientApplicationCommand as CAC
@@ -1407,7 +1408,7 @@ def _ProcessShortcut( self, shortcut: Shortcut ):
ancestor_shortcuts_handlers = AncestorShortcutsHandlers( self._parent )
- all_ancestor_shortcut_names = HydrusData.MassUnion( [ ancestor_shortcuts_handler.GetShortcutNames() for ancestor_shortcuts_handler in ancestor_shortcuts_handlers ] )
+ all_ancestor_shortcut_names = HydrusLists.MassUnion( [ ancestor_shortcuts_handler.GetShortcutNames() for ancestor_shortcuts_handler in ancestor_shortcuts_handlers ] )
ancestor_command = shortcuts_manager().GetCommand( all_ancestor_shortcut_names, shortcut )
diff --git a/hydrus/client/gui/ClientGUISubscriptions.py b/hydrus/client/gui/ClientGUISubscriptions.py
index e47c14d58..72e48840a 100644
--- a/hydrus/client/gui/ClientGUISubscriptions.py
+++ b/hydrus/client/gui/ClientGUISubscriptions.py
@@ -1,6 +1,4 @@
import collections
-import itertools
-import os
import threading
import time
import typing
@@ -10,6 +8,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusText
@@ -728,7 +727,7 @@ def _CopyQualityInfo( self, data ):
if num_archived + num_deleted > 0:
- percent = HydrusData.ConvertFloatToPercentage( num_archived / ( num_archived + num_deleted ) )
+ percent = HydrusNumbers.FloatToPercentage( num_archived / ( num_archived + num_deleted ) )
else:
@@ -780,7 +779,7 @@ def _ShowQualityInfo( self, data ):
if num_archived + num_deleted > 0:
- data_string += ' | good {}'.format( HydrusData.ConvertFloatToPercentage( num_archived / ( num_archived + num_deleted ) ) )
+ data_string += ' | good {}'.format( HydrusNumbers.FloatToPercentage( num_archived / ( num_archived + num_deleted ) ) )
data_strings.append( data_string )
@@ -2043,7 +2042,7 @@ def CheckNow( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
- query_headers = HydrusData.MassExtend( ( subscription.GetQueryHeaders() for subscription in subscriptions ) )
+ query_headers = HydrusLists.MassExtend( ( subscription.GetQueryHeaders() for subscription in subscriptions ) )
try:
@@ -2312,7 +2311,7 @@ def DedupeAll( self ):
else:
- label = '{} (has {} duplicate queries)'.format( subscription.GetName(), HydrusData.ConvertValueRangeToPrettyString( len( dupe_query_texts_it_can_do ), len( query_texts_we_want_to_dedupe_now ) ) )
+ label = '{} (has {} duplicate queries)'.format( subscription.GetName(), HydrusNumbers.ValueRangeToPrettyString( len( dupe_query_texts_it_can_do ), len( query_texts_we_want_to_dedupe_now ) ) )
choice_tuples.append( ( label, subscription, ', '.join( sorted( dupe_query_texts_it_can_do ) ) ) )
diff --git a/hydrus/client/gui/ClientGUITagSuggestions.py b/hydrus/client/gui/ClientGUITagSuggestions.py
index 3a76c48b9..0ec1239e9 100644
--- a/hydrus/client/gui/ClientGUITagSuggestions.py
+++ b/hydrus/client/gui/ClientGUITagSuggestions.py
@@ -422,7 +422,7 @@ def qt_code( predicates, num_done, num_to_do, num_skipped, total_time_took ):
else:
- num_done_s = '{} {} searched fully in '.format( HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ), tags_s )
+ num_done_s = '{} {} searched fully in '.format( HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ), tags_s )
label = '{}{}.'.format( num_done_s, HydrusTime.TimeDeltaToPrettyTimeDelta( total_time_took ) )
diff --git a/hydrus/client/gui/ClientGUITags.py b/hydrus/client/gui/ClientGUITags.py
index c22b9bb89..6b04f5dd7 100644
--- a/hydrus/client/gui/ClientGUITags.py
+++ b/hydrus/client/gui/ClientGUITags.py
@@ -15,6 +15,7 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
@@ -3344,7 +3345,7 @@ def ProcessContentUpdatePackage( self, content_update_package: ClientContentUpda
for m in self._media:
- if HydrusData.SetsIntersect( m.GetHashes(), content_update.GetHashes() ):
+ if HydrusLists.SetsIntersect( m.GetHashes(), content_update.GetHashes() ):
m.GetMediaResult().ProcessContentUpdate( service_key, content_update )
diff --git a/hydrus/client/gui/QtPorting.py b/hydrus/client/gui/QtPorting.py
index 0eb230fee..ef3492880 100644
--- a/hydrus/client/gui/QtPorting.py
+++ b/hydrus/client/gui/QtPorting.py
@@ -17,6 +17,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusProfiling
from hydrus.core import HydrusTime
@@ -1992,7 +1993,7 @@ def _UpdateParentCheckState( self, item: QW.QTreeWidgetItem ):
def ListsToTuples( potentially_nested_lists ):
- if HydrusData.IsAListLikeCollection( potentially_nested_lists ):
+ if HydrusLists.IsAListLikeCollection( potentially_nested_lists ):
return tuple( map( ListsToTuples, potentially_nested_lists ) )
diff --git a/hydrus/client/gui/canvas/ClientGUICanvas.py b/hydrus/client/gui/canvas/ClientGUICanvas.py
index 4439e1b59..f3731f5d2 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvas.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvas.py
@@ -2685,7 +2685,7 @@ def _GetIndexString( self ):
progress = self._current_pair_index + 1
total = len( self._batch_of_pairs_to_process )
- index_string = HydrusData.ConvertValueRangeToPrettyString( progress, total )
+ index_string = HydrusNumbers.ValueRangeToPrettyString( progress, total )
num_committable = self._GetNumCommittableDecisions()
num_deletable = self._GetNumCommittableDeletes()
@@ -3484,7 +3484,7 @@ def _GetIndexString( self ):
else:
- index_string = HydrusData.ConvertValueRangeToPrettyString( self._sorted_media.index( self._current_media ) + 1, len( self._sorted_media ) )
+ index_string = HydrusNumbers.ValueRangeToPrettyString( self._sorted_media.index( self._current_media ) + 1, len( self._sorted_media ) )
return index_string
@@ -3834,7 +3834,7 @@ def TryToDoPreClose( self ):
location_contexts_to_present_options_for.append( self._location_context )
- current_local_service_keys = HydrusData.MassUnion( [ m.GetLocationsManager().GetCurrent() for m in deleted ] )
+ current_local_service_keys = HydrusLists.MassUnion( [ m.GetLocationsManager().GetCurrent() for m in deleted ] )
local_file_domain_service_keys = [ service_key for service_key in current_local_service_keys if CG.client_controller.services_manager.GetServiceType( service_key ) == HC.LOCAL_FILE_DOMAIN ]
diff --git a/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py b/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
index d4e6577f1..34853afde 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
@@ -7,6 +7,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusSerialisable
from hydrus.client import ClientApplicationCommand as CAC
@@ -89,7 +90,7 @@ def ProcessContentUpdatePackage( self, content_update_package: ClientContentUpda
hashes = content_update.GetHashes()
- if HydrusData.SetsIntersect( self._hashes, hashes ):
+ if HydrusLists.SetsIntersect( self._hashes, hashes ):
( self._rating_state, self._rating ) = ClientRatings.GetIncDecStateFromMedia( ( self._current_media, ), self._service_key )
@@ -198,7 +199,7 @@ def ProcessContentUpdatePackage( self, content_update_package: ClientContentUpda
hashes = content_update.GetHashes()
- if HydrusData.SetsIntersect( self._hashes, hashes ):
+ if HydrusLists.SetsIntersect( self._hashes, hashes ):
self._SetRatingFromCurrentMedia()
@@ -318,7 +319,7 @@ def ProcessContentUpdatePackage( self, content_update_package: ClientContentUpda
hashes = content_update.GetHashes()
- if HydrusData.SetsIntersect( self._hashes, hashes ):
+ if HydrusLists.SetsIntersect( self._hashes, hashes ):
( self._rating_state, self._rating ) = ClientRatings.GetIncDecStateFromMedia( ( self._current_media, ), self._service_key )
diff --git a/hydrus/client/gui/canvas/ClientGUICanvasMedia.py b/hydrus/client/gui/canvas/ClientGUICanvasMedia.py
index 4433e8d33..7fb60e912 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvasMedia.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvasMedia.py
@@ -20,6 +20,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusNumbers
from hydrus.core import HydrusPaths
from hydrus.core import HydrusTime
from hydrus.core.files import HydrusFileHandling
@@ -1050,7 +1051,7 @@ def _Redraw( self, painter ):
if num_frames_are_useful:
- progress_strings.append( HydrusData.ConvertValueRangeToPrettyString( current_frame_index + 1, self._num_frames ) )
+ progress_strings.append( HydrusNumbers.ValueRangeToPrettyString( current_frame_index + 1, self._num_frames ) )
if current_timestamp_ms is not None:
diff --git a/hydrus/client/gui/exporting/ClientGUIExport.py b/hydrus/client/gui/exporting/ClientGUIExport.py
index 3c9f8f7df..1bb835dfd 100644
--- a/hydrus/client/gui/exporting/ClientGUIExport.py
+++ b/hydrus/client/gui/exporting/ClientGUIExport.py
@@ -879,7 +879,7 @@ def do_it( directory, metadata_routers, delete_afterwards, export_symlinks, quit
try:
- x_of_y = HydrusData.ConvertValueRangeToPrettyString( index + 1, num_to_do )
+ x_of_y = HydrusNumbers.ValueRangeToPrettyString( index + 1, num_to_do )
job_status.SetStatusText( 'Done {}'.format( x_of_y ) )
job_status.SetVariable( 'popup_gauge_1', ( index + 1, num_to_do ) )
diff --git a/hydrus/client/gui/importing/ClientGUIImport.py b/hydrus/client/gui/importing/ClientGUIImport.py
index 1b7ebca3c..c6ac291df 100644
--- a/hydrus/client/gui/importing/ClientGUIImport.py
+++ b/hydrus/client/gui/importing/ClientGUIImport.py
@@ -104,7 +104,7 @@ class CheckBoxLineEdit( QW.QWidget ):
valueChanged = QC.Signal()
- def __init__( self, parent, label ):
+ def __init__( self, parent, placeholder_text ):
QW.QWidget.__init__( self, parent )
@@ -113,12 +113,13 @@ def __init__( self, parent, label ):
self._lineedit = QW.QLineEdit( self )
self._lineedit.setMinimumWidth( 100 )
+ self._lineedit.setPlaceholderText( placeholder_text )
+
self._checkbox.clicked.connect( self.valueChanged )
self._lineedit.textEdited.connect( self._LineEditChanged )
hbox = QP.HBoxLayout()
- QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText( self, label ), CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._checkbox, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._lineedit, CC.FLAGS_EXPAND_BOTH_WAYS )
@@ -343,7 +344,7 @@ def __init__( self, parent, service_key, filename_tagging_options, present_for_a
self._quick_namespaces_list.columnListContentsChanged.connect( self.tagsChanged )
- self._regex_input.userHitEnter.connect( self.AddRegex )
+ self._regex_input.SetEnterCallable( self.AddRegex )
def _ConvertQuickRegexDataToListCtrlTuples( self, data ):
@@ -534,7 +535,11 @@ def __init__( self, parent, service_key, filename_tagging_options, present_for_a
self._checkboxes_panel = ClientGUICommon.StaticBox( self, 'misc' )
- self._filename_namespace = CheckBoxLineEdit( self._checkboxes_panel, 'add filename? [namespace]' )
+ rows = []
+
+ self._filename_namespace = CheckBoxLineEdit( self._checkboxes_panel, 'namespace' )
+
+ rows.append( ( 'add filename?', self._filename_namespace ) )
# TODO: Ok, when we move to arbitrary string processing from filenames, and we scrub this, we will want 'easy-add rule' buttons to do this
# When we do, add a thing that adds the nth, including negative n index values. as some users want four deep
@@ -553,11 +558,15 @@ def __init__( self, parent, service_key, filename_tagging_options, present_for_a
for ( index, phrase ) in directory_items:
- widget = CheckBoxLineEdit( self._checkboxes_panel, f'add {phrase} directory? [namespace]' )
+ widget = CheckBoxLineEdit( self._checkboxes_panel, 'namespace' )
+
+ rows.append( ( f'add {phrase} directory?', widget ) )
self._directory_namespace_controls[ index ] = widget
+ filename_gridbox = ClientGUICommon.WrapInGrid( self, rows )
+
#
( tags_for_all, add_filename, directory_dict ) = filename_tagging_options.SimpleToTuple()
@@ -589,14 +598,7 @@ def __init__( self, parent, service_key, filename_tagging_options, present_for_a
self._single_tags_panel.Add( self._tag_autocomplete_selection, CC.FLAGS_EXPAND_PERPENDICULAR )
self._single_tags_panel.Add( self._single_tags_paste_button, CC.FLAGS_EXPAND_PERPENDICULAR )
- self._checkboxes_panel.Add( self._filename_namespace, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
-
- for index in ( 0, 1, 2, -3, -2, -1 ):
-
- widget = self._directory_namespace_controls[ index ]
-
- self._checkboxes_panel.Add( widget, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
-
+ self._checkboxes_panel.Add( filename_gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
self._checkboxes_panel.Add( QW.QWidget( self._checkboxes_panel ), CC.FLAGS_EXPAND_BOTH_WAYS )
diff --git a/hydrus/client/gui/lists/ClientGUIListBoxes.py b/hydrus/client/gui/lists/ClientGUIListBoxes.py
index 38785cfb4..6edc0f269 100644
--- a/hydrus/client/gui/lists/ClientGUIListBoxes.py
+++ b/hydrus/client/gui/lists/ClientGUIListBoxes.py
@@ -2196,6 +2196,18 @@ def eventFilter( self, watched, event ):
ctrl_down = event.modifiers() & QC.Qt.ControlModifier
shift_down = event.modifiers() & QC.Qt.ShiftModifier
+ if ctrl_down:
+
+ logical_index = self._GetLogicalIndexUnderMouse( event )
+
+ if not self._LogicalIndexIsSelected( logical_index ):
+
+ # ok we will assume that the user just deselected with the first ctrl-click, and to fudge the awkward moment, we will sneakily re-select it
+
+ self._Hit( False, True, logical_index )
+
+
+
action_occurred = self._Activate( ctrl_down, shift_down )
if action_occurred:
@@ -3281,7 +3293,7 @@ def group_and_sort_parents_to_service_keys( p_to_s_ks, c_to_s_ks ):
ClientGUIMenus.AppendMenuItem( search_menu, 'add an OR of {} to current search'.format( predicates_selection_string ), 'Add the selected predicates as an OR predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_or_predicate' )
- some_selected_in_current = HydrusData.SetsIntersect( predicates, current_predicates )
+ some_selected_in_current = HydrusLists.SetsIntersect( predicates, current_predicates )
if some_selected_in_current:
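
The `eventFilter` addition above handles the awkward ctrl+double-click case: Qt delivers the first click as a plain ctrl-click that toggles the row under the mouse off, so the handler now re-selects that logical index before calling `_Activate`. A toy, Qt-free model of the fudge; the class and method names here are invented for illustration:

```python
class MiniTagList( object ):
    
    def __init__( self ):
        
        self._selected = set()
        
    
    def ctrl_click( self, index ):
        
        # the first click of a double-click arrives as a normal ctrl-click, which toggles
        self._selected ^= { index }
        
    
    def ctrl_double_click( self, index ):
        
        if index not in self._selected:
            
            self._selected.add( index ) # the 'sneakily re-select it' step from the hunk above
            
        
        return sorted( self._selected ) # stand-in for firing the activation on the selection
        
    

taglist = MiniTagList()

taglist._selected = { 0, 1, 2 }   # user has three tags selected
taglist.ctrl_click( 1 )           # first click of the double-click knocks row 1 out
print( taglist.ctrl_double_click( 1 ) ) # [0, 1, 2] -- the activation still covers all three
```
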
diff --git a/hydrus/client/gui/media/ClientGUIMediaModalActions.py b/hydrus/client/gui/media/ClientGUIMediaModalActions.py
index da6c47c12..538b78771 100644
--- a/hydrus/client/gui/media/ClientGUIMediaModalActions.py
+++ b/hydrus/client/gui/media/ClientGUIMediaModalActions.py
@@ -439,7 +439,7 @@ def do_it():
for ( i, m ) in enumerate( medias_to_alter_modified_dates ):
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( i, num_to_do ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( i, num_to_do ) )
job_status.SetVariable( 'popup_gauge_1', ( i, num_to_do ) )
if not showed_popup and HydrusTime.TimeHasPassed( time_started + 3 ):
@@ -859,7 +859,7 @@ def work_callable():
break
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) )
job_status.SetVariable( 'popup_gauge_1', ( i * BLOCK_SIZE, num_to_do ) )
content_updates = []
@@ -958,7 +958,7 @@ def do_it( urls ):
return
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( i + 1, num_urls ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( i + 1, num_urls ) )
job_status.SetVariable( 'popup_gauge_1', ( i + 1, num_urls ) )
@@ -1060,7 +1060,7 @@ def work_callable():
break
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) )
job_status.SetVariable( 'popup_gauge_1', ( i * BLOCK_SIZE, num_to_do ) )
hashes = { media.GetHash() for media in block_of_media }
@@ -1187,7 +1187,7 @@ def UndeleteMedia( win, media ):
return
- media_deleted_service_keys = HydrusData.MassUnion( ( m.GetLocationsManager().GetDeleted() for m in undeletable_media ) )
+ media_deleted_service_keys = HydrusLists.MassUnion( ( m.GetLocationsManager().GetDeleted() for m in undeletable_media ) )
local_file_services = CG.client_controller.services_manager.GetServices( ( HC.LOCAL_FILE_DOMAIN, ) )
diff --git a/hydrus/client/gui/metadata/ClientGUIMetadataMigration.py b/hydrus/client/gui/metadata/ClientGUIMetadataMigration.py
index ce34822d8..3cb076d23 100644
--- a/hydrus/client/gui/metadata/ClientGUIMetadataMigration.py
+++ b/hydrus/client/gui/metadata/ClientGUIMetadataMigration.py
@@ -208,7 +208,7 @@ def _UpdateTestPanel( self ):
self._test_notebook.addTab( list_ctrl, 'init' )
- page_name = HydrusText.ElideText( HydrusNumbers.ConvertIndexToPrettyOrdinalString( i ), 14 )
+ page_name = HydrusText.ElideText( HydrusNumbers.IndexToPrettyOrdinalString( i ), 14 )
self._test_notebook.setTabText( i, page_name )
diff --git a/hydrus/client/gui/networking/ClientGUIHydrusNetwork.py b/hydrus/client/gui/networking/ClientGUIHydrusNetwork.py
index bfb0c58e9..77260b59a 100644
--- a/hydrus/client/gui/networking/ClientGUIHydrusNetwork.py
+++ b/hydrus/client/gui/networking/ClientGUIHydrusNetwork.py
@@ -6,6 +6,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusTime
from hydrus.core.networking import HydrusNetwork
@@ -380,7 +381,7 @@ def key_transfer_not_collapsed():
keys = set( self._deletee_account_type_keys_to_new_account_type_keys.keys() )
values = set( self._deletee_account_type_keys_to_new_account_type_keys.values() )
- return HydrusData.SetsIntersect( keys, values )
+ return HydrusLists.SetsIntersect( keys, values )
while key_transfer_not_collapsed():
diff --git a/hydrus/client/gui/pages/ClientGUIManagementPanels.py b/hydrus/client/gui/pages/ClientGUIManagementPanels.py
index 4e135b4a4..bd837522a 100644
--- a/hydrus/client/gui/pages/ClientGUIManagementPanels.py
+++ b/hydrus/client/gui/pages/ClientGUIManagementPanels.py
@@ -991,7 +991,7 @@ def _UpdateMaintenanceStatus( self ):
else:
- self._num_searched.SetValue( 'Searched ' + HydrusData.ConvertValueRangeToPrettyString( num_searched, total_num_files ) + ' files at this distance.', num_searched, total_num_files )
+ self._num_searched.SetValue( 'Searched ' + HydrusNumbers.ValueRangeToPrettyString( num_searched, total_num_files ) + ' files at this distance.', num_searched, total_num_files )
page_name = 'preparation (needs work)'
@@ -1714,7 +1714,7 @@ def work_callable():
num_done = i * BLOCK_SIZE
- job_status.SetStatusText( 'Loading files: {}'.format( HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ) ) )
+ job_status.SetStatusText( 'Loading files: {}'.format( HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ) ) )
job_status.SetVariable( 'popup_gauge_1', ( num_done, num_to_do ) )
if not have_published_job_status and HydrusTime.TimeHasPassedFloat( start_time + 3 ):
@@ -2226,7 +2226,7 @@ def _UpdateImportStatus( self ):
( num_done, num_total ) = file_seed_cache_status.GetValueRange()
- text_top = '{} queries - {}'.format( HydrusNumbers.ToHumanInt( num_gallery_imports ), HydrusData.ConvertValueRangeToPrettyString( num_done, num_total ) )
+ text_top = '{} queries - {}'.format( HydrusNumbers.ToHumanInt( num_gallery_imports ), HydrusNumbers.ValueRangeToPrettyString( num_done, num_total ) )
text_bottom = file_seed_cache_status.GetStatusText()
@@ -2828,7 +2828,7 @@ def work_callable():
num_done = i * BLOCK_SIZE
- job_status.SetStatusText( 'Loading files: {}'.format( HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ) ) )
+ job_status.SetStatusText( 'Loading files: {}'.format( HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ) ) )
job_status.SetVariable( 'popup_gauge_1', ( num_done, num_to_do ) )
if not have_published_job_status and HydrusTime.TimeHasPassedFloat( start_time + 3 ):
@@ -3360,7 +3360,7 @@ def _UpdateImportStatus( self ):
( num_done, num_total ) = file_seed_cache_status.GetValueRange()
- text_top = '{} watchers - {}'.format( HydrusNumbers.ToHumanInt( num_watchers ), HydrusData.ConvertValueRangeToPrettyString( num_done, num_total ) )
+ text_top = '{} watchers - {}'.format( HydrusNumbers.ToHumanInt( num_watchers ), HydrusNumbers.ValueRangeToPrettyString( num_done, num_total ) )
text_bottom = file_seed_cache_status.GetStatusText()
@@ -5628,7 +5628,7 @@ def qt_petition_fetched( petition: HydrusNetwork.Petition ):
CG.client_controller.WriteSynchronous( 'content_updates', ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdates( service.GetServiceKey(), content_updates ) )
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( num_done, num_to_do ) )
job_status.SetVariable( 'popup_gauge_1', ( num_done, num_to_do ) )
diff --git a/hydrus/client/gui/pages/ClientGUIPages.py b/hydrus/client/gui/pages/ClientGUIPages.py
index 166a5802b..ff91e76a8 100644
--- a/hydrus/client/gui/pages/ClientGUIPages.py
+++ b/hydrus/client/gui/pages/ClientGUIPages.py
@@ -674,6 +674,7 @@ def GetAPIInfoDict( self, simple ):
d[ 'page_key' ] = self._page_key.hex()
d[ 'page_state' ] = self.GetPageState()
d[ 'page_type' ] = self._management_controller.GetType()
+ d[ 'is_media_page' ] = True
management_info = self._management_controller.GetAPIInfoDict( simple )
@@ -747,7 +748,7 @@ def GetNameForMenu( self ) -> str:
if num_value != num_range:
- name_for_menu = '{} - {}'.format( name_for_menu, HydrusData.ConvertValueRangeToPrettyString( num_value, num_range ) )
+ name_for_menu = '{} - {}'.format( name_for_menu, HydrusNumbers.ValueRangeToPrettyString( num_value, num_range ) )
return HydrusText.ElideText( name_for_menu, 32, elide_center = True )
@@ -847,6 +848,7 @@ def GetSessionAPIInfoDict( self, is_selected = False ):
root[ 'page_key' ] = self._page_key.hex()
root[ 'page_state' ] = self.GetPageState()
root[ 'page_type' ] = self._management_controller.GetType()
+ root[ 'is_media_page' ] = True
root[ 'selected' ] = is_selected
return root
@@ -1078,7 +1080,7 @@ def work_callable():
initial_media_results.extend( more_media_results )
- status = f'Loading initial files{HC.UNICODE_ELLIPSIS} {HydrusData.ConvertValueRangeToPrettyString( len( initial_media_results ), len( initial_hashes ) )}'
+ status = f'Loading initial files{HC.UNICODE_ELLIPSIS} {HydrusNumbers.ValueRangeToPrettyString( len( initial_media_results ), len( initial_hashes ) )}'
controller.CallAfterQtSafe( self, 'setting status bar loading string', qt_code_status, status )
@@ -1645,7 +1647,7 @@ def _RefreshPageName( self, index ):
num_string += ', '
- num_string += HydrusData.ConvertValueRangeToPrettyString( num_value, num_range )
+ num_string += HydrusNumbers.ValueRangeToPrettyString( num_value, num_range )
@@ -2338,7 +2340,8 @@ def GetAPIInfoDict( self, simple ):
'name' : self.GetName(),
'page_key' : self._page_key.hex(),
'page_state' : self.GetPageState(),
- 'page_type' : ClientGUIManagementController.MANAGEMENT_TYPE_PAGE_OF_PAGES
+ 'page_type' : ClientGUIManagementController.MANAGEMENT_TYPE_PAGE_OF_PAGES,
+ 'is_media_page' : False
}
@@ -2388,7 +2391,7 @@ def GetNameForMenu( self ) -> str:
if num_value != num_range:
- name_for_menu = '{} - {}'.format( name_for_menu, HydrusData.ConvertValueRangeToPrettyString( num_value, num_range ) )
+ name_for_menu = '{} - {}'.format( name_for_menu, HydrusNumbers.ValueRangeToPrettyString( num_value, num_range ) )
return HydrusText.ElideText( name_for_menu, 32, elide_center = True )
@@ -2606,7 +2609,7 @@ def GetPrettyStatusForStatusBar( self ):
if num_range > 0 and num_value != num_range:
- num_string += ', ' + HydrusData.ConvertValueRangeToPrettyString( num_value, num_range )
+ num_string += ', ' + HydrusNumbers.ValueRangeToPrettyString( num_value, num_range )
return HydrusNumbers.ToHumanInt( self.count() ) + ' pages, ' + num_string + ' files'
@@ -2656,6 +2659,7 @@ def GetSessionAPIInfoDict( self, is_selected = True ):
root[ 'page_key' ] = self._page_key.hex()
root[ 'page_state' ] = self.GetPageState()
root[ 'page_type' ] = ClientGUIManagementController.MANAGEMENT_TYPE_PAGE_OF_PAGES
+ root[ 'is_media_page' ] = False
root[ 'selected' ] = is_selected
root[ 'pages' ] = my_pages_list
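
Both the single-page and page-of-pages info dicts now carry an `is_media_page` boolean: True for a media page, False for the page-of-pages container. That saves API consumers from re-deriving 'is this a page of pages' from `page_type`. A hedged sketch of consuming it from a nested structure of this shape; the toy dict is invented, only the `is_media_page`, `pages`, and `name` keys come from the diff:

```python
def get_media_pages( page ):
    
    if page.get( 'is_media_page', False ):
        
        return [ page ]
        
    
    result = []
    
    for child in page.get( 'pages', [] ):
        
        result.extend( get_media_pages( child ) )
        
    
    return result
    

# toy structure; real responses carry more keys (page_key, page_state, page_type, ...)
pages = {
    'name' : 'top',
    'is_media_page' : False,
    'pages' : [
        { 'name' : 'files', 'is_media_page' : True },
        { 'name' : 'downloaders', 'is_media_page' : False, 'pages' : [ { 'name' : 'gallery', 'is_media_page' : True } ] }
    ]
}

print( [ p[ 'name' ] for p in get_media_pages( pages ) ] ) # ['files', 'gallery']
```
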
diff --git a/hydrus/client/gui/pages/ClientGUIResults.py b/hydrus/client/gui/pages/ClientGUIResults.py
index e5b09fcd7..c0d60b7b3 100644
--- a/hydrus/client/gui/pages/ClientGUIResults.py
+++ b/hydrus/client/gui/pages/ClientGUIResults.py
@@ -12,6 +12,7 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusPaths
from hydrus.core import HydrusTime
@@ -24,6 +25,7 @@
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientLocation
from hydrus.client import ClientPaths
+from hydrus.client import ClientServices
from hydrus.client.gui import ClientGUIDragDrop
from hydrus.client.gui import ClientGUICore as CGC
from hydrus.client.gui import ClientGUIDialogs
@@ -2489,11 +2491,11 @@ def ProcessContentUpdatePackage( self, content_update_package: ClientContentUpda
- def ProcessServiceUpdates( self, service_keys_to_service_updates ):
+ def ProcessServiceUpdates( self, service_keys_to_service_updates: typing.Dict[ bytes, typing.Collection[ ClientServices.ServiceUpdate ] ] ):
ClientMedia.ListeningMediaList.ProcessServiceUpdates( self, service_keys_to_service_updates )
- for ( service_key, service_updates ) in list(service_keys_to_service_updates.items()):
+ for ( service_key, service_updates ) in service_keys_to_service_updates.items():
for service_update in service_updates:
@@ -3811,7 +3813,7 @@ def ShowMenu( self, do_not_show_just_return = False ):
selection_has_archive = True in ( media.HasArchive() and media.GetLocationsManager().IsLocal() for media in self._selected_media )
selection_has_deletion_record = True in ( CC.COMBINED_LOCAL_FILE_SERVICE_KEY in locations_manager.GetDeleted() for locations_manager in selected_locations_managers )
- all_file_domains = HydrusData.MassUnion( locations_manager.GetCurrent() for locations_manager in all_locations_managers )
+ all_file_domains = HydrusLists.MassUnion( locations_manager.GetCurrent() for locations_manager in all_locations_managers )
all_specific_file_domains = all_file_domains.difference( { CC.COMBINED_FILE_SERVICE_KEY, CC.COMBINED_LOCAL_FILE_SERVICE_KEY } )
some_downloading = True in ( locations_manager.IsDownloading() for locations_manager in selected_locations_managers )
@@ -3915,15 +3917,15 @@ def ShowMenu( self, do_not_show_just_return = False ):
groups_of_petitioned_remote_service_keys = [ locations_manager.GetPetitioned().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
groups_of_deleted_remote_service_keys = [ locations_manager.GetDeleted().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
- current_remote_service_keys = HydrusData.MassUnion( groups_of_current_remote_service_keys )
- pending_remote_service_keys = HydrusData.MassUnion( groups_of_pending_remote_service_keys )
- petitioned_remote_service_keys = HydrusData.MassUnion( groups_of_petitioned_remote_service_keys )
- deleted_remote_service_keys = HydrusData.MassUnion( groups_of_deleted_remote_service_keys )
+ current_remote_service_keys = HydrusLists.MassUnion( groups_of_current_remote_service_keys )
+ pending_remote_service_keys = HydrusLists.MassUnion( groups_of_pending_remote_service_keys )
+ petitioned_remote_service_keys = HydrusLists.MassUnion( groups_of_petitioned_remote_service_keys )
+ deleted_remote_service_keys = HydrusLists.MassUnion( groups_of_deleted_remote_service_keys )
- common_current_remote_service_keys = HydrusData.IntelligentMassIntersect( groups_of_current_remote_service_keys )
- common_pending_remote_service_keys = HydrusData.IntelligentMassIntersect( groups_of_pending_remote_service_keys )
- common_petitioned_remote_service_keys = HydrusData.IntelligentMassIntersect( groups_of_petitioned_remote_service_keys )
- common_deleted_remote_service_keys = HydrusData.IntelligentMassIntersect( groups_of_deleted_remote_service_keys )
+ common_current_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_current_remote_service_keys )
+ common_pending_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_pending_remote_service_keys )
+ common_petitioned_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_petitioned_remote_service_keys )
+ common_deleted_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_deleted_remote_service_keys )
disparate_current_remote_service_keys = current_remote_service_keys - common_current_remote_service_keys
disparate_pending_remote_service_keys = pending_remote_service_keys - common_pending_remote_service_keys
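
ClientGUIResults.py gets the companion rename to the one above: the set-combining helpers (`MassUnion`, `IntelligentMassIntersect`, `SetsIntersect`) now come from `HydrusLists` rather than `HydrusData`, `ProcessServiceUpdates` gains a type hint, and the redundant `list(...)` around the dict items is dropped. Judging purely from the call sites, the first and last of those helpers behave roughly like the stand-ins below; this is an assumption, and `IntelligentMassIntersect` is not sketched:

```python
# illustrative stand-ins; the real helpers live in hydrus.core.HydrusLists
def mass_union( iterables_of_items ):
    
    result = set()
    
    for items in iterables_of_items:
        
        result.update( items )
        
    
    return result
    

def sets_intersect( a, b ):
    
    return not set( a ).isdisjoint( b )
    

groups = [ { b'service_a', b'service_b' }, { b'service_b' } ]

print( mass_union( groups ) )                   # {b'service_a', b'service_b'}
print( sets_intersect( groups[0], groups[1] ) ) # True
```
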
diff --git a/hydrus/client/gui/panels/ClientGUIScrolledPanelsEdit.py b/hydrus/client/gui/panels/ClientGUIScrolledPanelsEdit.py
index 48d34ad1b..f6397a10d 100644
--- a/hydrus/client/gui/panels/ClientGUIScrolledPanelsEdit.py
+++ b/hydrus/client/gui/panels/ClientGUIScrolledPanelsEdit.py
@@ -2027,7 +2027,7 @@ def _Paste( self ):
for item in names_and_notes:
- if not HydrusData.IsAListLikeCollection( item ):
+ if not HydrusLists.IsAListLikeCollection( item ):
continue
@@ -2783,8 +2783,11 @@ def __init__( self, parent, medias: typing.Collection[ ClientMedia.MediaSingleto
self._url_input = QW.QLineEdit( self )
self._url_input.installEventFilter( ClientGUICommon.TextCatchEnterEventFilter( self._url_input, self.AddURL ) )
- self._copy_button = ClientGUICommon.BetterButton( self, 'copy all', self._Copy )
- self._paste_button = ClientGUICommon.BetterButton( self, 'paste', self._Paste )
+ self._copy_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().copy, self._Copy )
+ self._copy_button.setToolTip( ClientGUIFunctions.WrapToolTip( 'Copy selected URLs to the clipboard, or all URLs if none are selected.' ) )
+
+ self._paste_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().paste, self._Paste )
+ self._paste_button.setToolTip( ClientGUIFunctions.WrapToolTip( 'Paste URLs from the clipboard.' ) )
self._urls_to_add = set()
self._urls_to_remove = set()
@@ -2820,7 +2823,12 @@ def __init__( self, parent, medias: typing.Collection[ ClientMedia.MediaSingleto
def _Copy( self ):
- urls = sorted( self._current_urls_count.keys() )
+ urls = self._urls_listbox.GetData( only_selected = True )
+
+ if len( urls ) == 0:
+
+ urls = self._urls_listbox.GetData()
+
text = '\n'.join( urls )
@@ -2999,8 +3007,8 @@ def _UpdateList( self ):
def EventListDoubleClick( self, item ):
-
- urls = [ QP.GetClientData( self._urls_listbox, selection.row() ) for selection in list( self._urls_listbox.selectedIndexes() ) ]
+
+ urls = self._urls_listbox.GetData( only_selected = True )
for url in urls:
@@ -3021,7 +3029,7 @@ def EventListKeyDown( self, event ):
if key in ClientGUIShortcuts.DELETE_KEYS_QT:
- urls = [ QP.GetClientData( self._urls_listbox, selection.row() ) for selection in list( self._urls_listbox.selectedIndexes() ) ]
+ urls = self._urls_listbox.GetData( only_selected = True )
for url in urls:
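
In the 'manage known urls' dialog, the text buttons become bitmap buttons with tooltips, and `_Copy` now asks the listbox for `GetData( only_selected = True )` first, falling back to every URL when nothing is selected. A minimal stand-in for that selection-or-all rule; the listbox and clipboard are replaced with plain lists here:

```python
def get_urls_to_copy( all_urls, selected_urls ):
    
    urls = list( selected_urls )
    
    if len( urls ) == 0:
        
        urls = list( all_urls )
        
    
    return '\n'.join( urls )
    

all_urls = [ 'https://example.com/1', 'https://example.com/2' ]

print( get_urls_to_copy( all_urls, [] ) )                          # both urls
print( get_urls_to_copy( all_urls, [ 'https://example.com/2' ] ) ) # just the selected one
```
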
diff --git a/hydrus/client/gui/panels/ClientGUIScrolledPanelsManagement.py b/hydrus/client/gui/panels/ClientGUIScrolledPanelsManagement.py
index 860e720fe..bc5873d77 100644
--- a/hydrus/client/gui/panels/ClientGUIScrolledPanelsManagement.py
+++ b/hydrus/client/gui/panels/ClientGUIScrolledPanelsManagement.py
@@ -1974,7 +1974,7 @@ def __init__( self, parent, new_options ):
self._command_palette_show_main_menu.setToolTip( 'Show the main gui window\'s menubar actions.' )
self._command_palette_show_media_menu = QW.QCheckBox( self._command_palette_panel )
- self._command_palette_show_media_menu.setToolTip( 'Show the menu actions for the current media view. Be careful with this, it basically just shows everything with slightly ugly labels..' )
+ self._command_palette_show_media_menu.setToolTip( 'Show the actions for the thumbnail menu on the current media page. Be careful with this, it basically just shows everything with slightly ugly labels..' )
#
@@ -3920,7 +3920,7 @@ def EventImageCacheUpdate( self ):
resolution = ( int( 16 * unit_length ), int( 9 * unit_length ) )
- self._image_cache_storage_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), HydrusData.ConvertResolutionToPrettyString( resolution ) ) )
+ self._image_cache_storage_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), HydrusNumbers.ResolutionToPrettyString( resolution ) ) )
num_pixels = cache_size * ( self._image_cache_prefetch_limit_percentage.value() / 100 ) / 3
@@ -3930,7 +3930,7 @@ def EventImageCacheUpdate( self ):
resolution = ( int( 16 * unit_length ), int( 9 * unit_length ) )
- self._image_cache_prefetch_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), HydrusData.ConvertResolutionToPrettyString( resolution ) ) )
+ self._image_cache_prefetch_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), HydrusNumbers.ResolutionToPrettyString( resolution ) ) )
#
@@ -3976,7 +3976,7 @@ def EventThumbnailsUpdate( self ):
( thumbnail_width, thumbnail_height ) = HC.options[ 'thumbnail_dimensions' ]
- res_string = HydrusData.ConvertResolutionToPrettyString( ( thumbnail_width, thumbnail_height ) )
+ res_string = HydrusNumbers.ResolutionToPrettyString( ( thumbnail_width, thumbnail_height ) )
estimated_bytes_per_thumb = 3 * thumbnail_width * thumbnail_height
@@ -5220,7 +5220,7 @@ def __init__( self, parent, new_options ):
self._thumbnail_dpr_percentage = ClientGUICommon.BetterSpinBox( self, min = 100, max = 800 )
tt = 'If your OS runs at an UI scale greater than 100%, mirror it here and your thumbnails will look crisp. If you have multiple monitors at different UI scales, or you change UI scale regularly, set it to the largest one you use.'
tt += '\n' * 2
- tt += 'I believe the UI scale on the monitor this dialog opened on was {}'.format( HydrusData.ConvertFloatToPercentage( self.devicePixelRatio() ) )
+ tt += 'I believe the UI scale on the monitor this dialog opened on was {}'.format( HydrusNumbers.FloatToPercentage( self.devicePixelRatio() ) )
self._thumbnail_dpr_percentage.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
self._video_thumbnail_percentage_in = ClientGUICommon.BetterSpinBox( self, min=0, max=100 )
diff --git a/hydrus/client/gui/panels/ClientGUIScrolledPanelsReview.py b/hydrus/client/gui/panels/ClientGUIScrolledPanelsReview.py
index 96279c14a..31464d2af 100644
--- a/hydrus/client/gui/panels/ClientGUIScrolledPanelsReview.py
+++ b/hydrus/client/gui/panels/ClientGUIScrolledPanelsReview.py
@@ -19,8 +19,6 @@
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
-from hydrus.core import HydrusTags
-from hydrus.core import HydrusTagArchive
from hydrus.core import HydrusText
from hydrus.core import HydrusThreading
from hydrus.core import HydrusTime
@@ -475,7 +473,7 @@ def _ConvertLocationToListCtrlTuples( self, base_location: ClientFilesPhysical.F
fp = locations_to_file_weights[ base_location ]
tp = locations_to_thumb_weights[ base_location ]
- p = HydrusData.ConvertFloatToPercentage
+ p = HydrusNumbers.FloatToPercentage
current_bytes = fp * f_space + tp * t_space_max
@@ -3137,7 +3135,7 @@ def THREADParseImportablePaths( self, unparsed_paths_queue, currently_parsing, w
else:
- message = HydrusData.ConvertValueRangeToPrettyString( num_good_files, total_paths ) + ' files parsed successfully'
+ message = HydrusNumbers.ValueRangeToPrettyString( num_good_files, total_paths ) + ' files parsed successfully'
if num_empty_files + num_missing_files + num_unimportable_mime_files + num_occupied_files + num_sidecars > 0:
@@ -3809,7 +3807,7 @@ def _ConvertNameToListCtrlTuples( self, name ):
if sort_total_size > 0:
- pretty_free_size = '{} ({})'.format( pretty_free_size, HydrusData.ConvertFloatToPercentage( sort_free_size / sort_total_size ) )
+ pretty_free_size = '{} ({})'.format( pretty_free_size, HydrusNumbers.FloatToPercentage( sort_free_size / sort_total_size ) )
sort_last_vacuumed_ms = vacuum_dict[ 'last_vacuumed_ms' ]
diff --git a/hydrus/client/gui/panels/ClientGUIScrolledPanelsSelectFromList.py b/hydrus/client/gui/panels/ClientGUIScrolledPanelsSelectFromList.py
index 72142d814..9d7f03119 100644
--- a/hydrus/client/gui/panels/ClientGUIScrolledPanelsSelectFromList.py
+++ b/hydrus/client/gui/panels/ClientGUIScrolledPanelsSelectFromList.py
@@ -1,6 +1,7 @@
from qtpy import QtCore as QC
from qtpy import QtWidgets as QW
+from hydrus.core import HydrusExceptions
from hydrus.client import ClientConstants as CC
from hydrus.client.gui import ClientGUIFunctions
from hydrus.client.gui import QtPorting as QP
@@ -19,8 +20,6 @@ def __init__( self, parent: QW.QWidget, choice_tuples: list, value_to_select = N
#
- selected_a_value = False
-
if sort_tuples:
try:
@@ -47,17 +46,15 @@ def __init__( self, parent: QW.QWidget, choice_tuples: list, value_to_select = N
item.setData( QC.Qt.UserRole, value )
self._list.addItem( item )
- if value_to_select is not None and value_to_select == value:
-
- QP.ListWidgetSetSelection( self._list, i )
-
- selected_a_value = True
-
+
+ if value_to_select is not None:
+
+ self._list.SelectData( ( value_to_select, ) )
- if not selected_a_value:
+ if len( list( self._list.selectedItems() ) ) == 0:
- QP.ListWidgetSetSelection( self._list, 0 )
+ self._list.item( 0 ).setSelected( True )
#
@@ -94,11 +91,22 @@ def EventSelect( self, item ):
def GetValue( self ):
- selection = QP.ListWidgetGetSelection( self._list )
+ data = self._list.GetData( only_selected = True )
- return QP.GetClientData( self._list, selection )
+ if len( data ) == 0:
+
+ data = self._list.GetData()
+
+
+ if len( data ) == 0:
+
+ raise HydrusExceptions.CancelledException( 'No data selected!' )
+
+
+ return data[0]
+
class EditSelectFromListButtonsPanel( ClientGUIScrolledPanels.EditPanel ):
def __init__( self, parent: QW.QWidget, choices, message = '' ):
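
The select-from-list panel stops poking raw Qt row indices: the initial selection goes through `SelectData`, and `GetValue` now reads `GetData( only_selected = True )`, falls back to the whole list, and raises a cancel when there is nothing to return at all. A plain-Python sketch of that fallback order; the exception class below stands in for `HydrusExceptions.CancelledException`:

```python
class CancelledException( Exception ): pass

def get_value( selected_data, all_data ):
    
    data = list( selected_data )
    
    if len( data ) == 0:
        
        data = list( all_data )
        
    
    if len( data ) == 0:
        
        raise CancelledException( 'No data selected!' )
        
    
    return data[0]
    

print( get_value( [], [ 'first', 'second' ] ) ) # 'first'
```
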
diff --git a/hydrus/client/gui/parsing/ClientGUIParsingFormulae.py b/hydrus/client/gui/parsing/ClientGUIParsingFormulae.py
index 34a16a154..a1253596c 100644
--- a/hydrus/client/gui/parsing/ClientGUIParsingFormulae.py
+++ b/hydrus/client/gui/parsing/ClientGUIParsingFormulae.py
@@ -69,6 +69,7 @@ def __init__( self, parent: QW.QWidget, collapse_newlines: bool, formula: Client
edit_panel = ClientGUICommon.StaticBox( self, 'edit' )
+ # TODO: get rid of all the GetClientData for this guy, below. replace with newer stuff, add methods to BetterQListWidget (ReplaceData?) as needed
self._formulae = ClientGUIListBoxes.BetterQListWidget( edit_panel )
self._formulae.setSelectionMode( QW.QAbstractItemView.SingleSelection )
self._formulae.itemDoubleClicked.connect( self.Edit )
@@ -236,7 +237,7 @@ def Edit( self ):
def GetValue( self ):
- formulae = [ QP.GetClientData( self._formulae, i ) for i in range( self._formulae.count() ) ]
+ formulae = self._formulae.GetData()
sub_phrase = self._sub_phrase.text()
diff --git a/hydrus/client/gui/search/ClientGUIACDropdown.py b/hydrus/client/gui/search/ClientGUIACDropdown.py
index fc88b1304..869fa8500 100644
--- a/hydrus/client/gui/search/ClientGUIACDropdown.py
+++ b/hydrus/client/gui/search/ClientGUIACDropdown.py
@@ -1632,7 +1632,7 @@ def work_callable():
new_tags_to_child_tags = CG.client_controller.Read( 'tag_descendants_lookup', tag_service_key, uncached_context_tags )
- new_child_tags = HydrusData.MassUnion( new_tags_to_child_tags.values() )
+ new_child_tags = HydrusLists.MassUnion( new_tags_to_child_tags.values() )
child_predicates = CG.client_controller.Read(
'tag_predicates',
diff --git a/hydrus/client/gui/search/ClientGUIPredicatesSingle.py b/hydrus/client/gui/search/ClientGUIPredicatesSingle.py
index 4ca015077..cbb3e124d 100644
--- a/hydrus/client/gui/search/ClientGUIPredicatesSingle.py
+++ b/hydrus/client/gui/search/ClientGUIPredicatesSingle.py
@@ -8,13 +8,13 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
-from hydrus.core import HydrusGlobals as HG
from hydrus.core.files import HydrusFileHandling
from hydrus.core.files.images import HydrusImageHandling
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientImageHandling
+from hydrus.client import ClientParsing
from hydrus.client.gui import ClientGUIDialogsMessage
from hydrus.client.gui import ClientGUIFunctions
from hydrus.client.gui import ClientGUIOptionsPanels
@@ -397,6 +397,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def _GetSystemPredicateLabel( self ) -> str:
@@ -562,6 +564,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -792,6 +796,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -843,6 +849,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
@@ -907,6 +915,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -970,6 +980,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -1044,6 +1056,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -1106,6 +1120,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
@@ -1170,6 +1186,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -1187,7 +1205,7 @@ def GetPredicates( self ):
hex_hashes_raw = self._hashes.toPlainText()
- hashes = HydrusData.ParseHashesFromRawHexText( hash_type, hex_hashes_raw )
+ hashes = ClientParsing.ParseHashesFromRawHexText( hash_type, hex_hashes_raw )
predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, ( hashes, hash_type ), inclusive = inclusive ), )
@@ -1228,6 +1246,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._operator )
+
def GetDefaultPredicate( self ):
@@ -1287,6 +1307,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
@@ -1337,6 +1359,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._operator )
+
def GetDefaultPredicate( self ):
@@ -1409,6 +1433,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._operator )
+
def GetDefaultPredicate( self ):
@@ -1479,6 +1505,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._operator )
+
def GetDefaultPredicate( self ):
@@ -1571,6 +1599,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._operator )
+
def GetDefaultPredicate( self ):
@@ -1632,6 +1662,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._limit )
+
def GetDefaultPredicate( self ):
@@ -1679,6 +1711,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._mimes )
+
def GetDefaultPredicate( self ):
@@ -1737,6 +1771,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -1790,6 +1826,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
@@ -1853,6 +1891,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -1928,6 +1968,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
@@ -1980,6 +2022,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
@@ -2034,6 +2078,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
@@ -2092,6 +2138,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -2191,6 +2239,8 @@ def __init__( self, parent, predicate ):
self.setLayout( big_vbox )
+ self.setFocusProxy( self._pixel_hashes )
+
def _Clear( self ):
@@ -2302,11 +2352,11 @@ def GetPredicates( self ):
hex_pixel_hashes_raw = self._pixel_hashes.toPlainText()
- pixel_hashes = HydrusData.ParseHashesFromRawHexText( 'pixel', hex_pixel_hashes_raw )
+ pixel_hashes = ClientParsing.ParseHashesFromRawHexText( 'pixel', hex_pixel_hashes_raw )
hex_perceptual_hashes_raw = self._perceptual_hashes.toPlainText()
- perceptual_hashes = HydrusData.ParseHashesFromRawHexText( 'perceptual', hex_perceptual_hashes_raw )
+ perceptual_hashes = ClientParsing.ParseHashesFromRawHexText( 'perceptual', hex_perceptual_hashes_raw )
predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO_DATA, ( pixel_hashes, perceptual_hashes, self._max_hamming.value() ) ), )
@@ -2368,6 +2418,8 @@ def __init__( self, parent, predicate ):
self.setLayout( vbox )
+ self.setFocusProxy( self._hashes )
+
def GetDefaultPredicate( self ):
@@ -2381,7 +2433,7 @@ def GetPredicates( self ):
hex_hashes_raw = self._hashes.toPlainText()
- hashes = HydrusData.ParseHashesFromRawHexText( 'sha256', hex_hashes_raw )
+ hashes = ClientParsing.ParseHashesFromRawHexText( 'sha256', hex_hashes_raw )
predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO_FILES, ( hashes, self._max_hamming.value() ) ), )
@@ -2423,6 +2475,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -2480,6 +2534,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._sign )
+
def GetDefaultPredicate( self ):
@@ -2533,6 +2589,8 @@ def __init__( self, parent, predicate ):
self.setLayout( hbox )
+ self.setFocusProxy( self._number_test )
+
def GetDefaultPredicate( self ):
diff --git a/hydrus/client/gui/search/ClientGUISearch.py b/hydrus/client/gui/search/ClientGUISearch.py
index ad8e22562..c5a267ab6 100644
--- a/hydrus/client/gui/search/ClientGUISearch.py
+++ b/hydrus/client/gui/search/ClientGUISearch.py
@@ -778,14 +778,14 @@ def __init__( self, parent, predicate: ClientSearch.Predicate ):
page_parent = self._notebook
+ first_editable_pred_panel = None
+
for ( i, ( page_name, recent_predicate_types, static_pred_buttons, editable_pred_panels ) ) in enumerate( pages ):
page_panel = QW.QWidget( page_parent )
page_vbox = QP.VBoxLayout()
- recent_predicates = []
-
all_static_preds = set()
for pred_button in static_pred_buttons:
@@ -829,6 +829,11 @@ def __init__( self, parent, predicate: ClientSearch.Predicate ):
for panel in editable_pred_panels:
+ if first_editable_pred_panel is None:
+
+ first_editable_pred_panel = panel
+
+
if isinstance( panel, self._PredOKPanel ) and isinstance( panel.GetPredicatePanel(), ClientGUIPredicatesSingle.PanelPredicateSystemMime ):
flags = CC.FLAGS_EXPAND_BOTH_WAYS
@@ -854,6 +859,10 @@ def __init__( self, parent, predicate: ClientSearch.Predicate ):
ClientGUIFunctions.SetFocusLater( static_pred_buttons[0] )
+ elif first_editable_pred_panel is not None:
+
+ ClientGUIFunctions.SetFocusLater( first_editable_pred_panel )
+
if len( pages ) > 1:
@@ -932,7 +941,7 @@ def __init__( self, parent, predicate_panel_class, predicate ):
self.setLayout( hbox )
- ClientGUIFunctions.SetFocusLater( self._ok )
+ self.setFocusProxy( self._ok )
def _DefaultsMenu( self ):
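For context on the repeated setFocusProxy additions above and the SetFocusLater call on the first editable panel: giving a container widget a focus proxy means any attempt to focus the container is forwarded to the nominated child, so a flesh-out panel can be focused as a unit and the keyboard lands on its sign/operator widget or its ok button. A minimal sketch with plain Qt widgets, imported through qtpy to match the QW alias used above; the widget names are illustrative, not from the codebase:

    from qtpy import QtWidgets as QW

    app = QW.QApplication( [] )

    panel = QW.QWidget()

    vbox = QW.QVBoxLayout( panel )

    sign = QW.QComboBox( panel )
    ok = QW.QPushButton( 'ok', panel )

    vbox.addWidget( sign )
    vbox.addWidget( ok )

    # as in the hunks above: focusing the container now forwards focus to the chosen child
    panel.setFocusProxy( sign )

    panel.show()
    panel.setFocus()

    assert panel.focusProxy() is sign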
diff --git a/hydrus/client/gui/services/ClientGUIClientsideServices.py b/hydrus/client/gui/services/ClientGUIClientsideServices.py
index 9effcd891..e4429c20d 100644
--- a/hydrus/client/gui/services/ClientGUIClientsideServices.py
+++ b/hydrus/client/gui/services/ClientGUIClientsideServices.py
@@ -12,6 +12,7 @@
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
+from hydrus.core import HydrusTags
from hydrus.core import HydrusTime
from hydrus.core.networking import HydrusNetwork
from hydrus.core.networking import HydrusNetworkVariableHandling
@@ -395,8 +396,14 @@ def _GetArchiveNameToDisplay( self, portable_hta_path, namespaces ):
hta_path = HydrusPaths.ConvertPortablePathToAbsPath( portable_hta_path )
- if len( namespaces ) == 0: name_to_display = hta_path
- else: name_to_display = hta_path + ' (' + ', '.join( HydrusData.ConvertUglyNamespacesToPrettyStrings( namespaces ) ) + ')'
+ if len( namespaces ) == 0:
+
+ name_to_display = hta_path
+
+ else:
+
+ name_to_display = hta_path + ' (' + ', '.join( HydrusTags.ConvertUglyNamespacesToPrettyStrings( namespaces ) ) + ')'
+
return name_to_display
@@ -1769,7 +1776,7 @@ def do_it():
update_speed_string = ''
- content_update_index_string = 'content row ' + HydrusData.ConvertValueRangeToPrettyString( c_u_p_total_weight_processed, c_u_p_num_rows ) + ': '
+ content_update_index_string = 'content row ' + HydrusNumbers.ValueRangeToPrettyString( c_u_p_total_weight_processed, c_u_p_num_rows ) + ': '
job_status.SetStatusText( content_update_index_string + 'committing' + update_speed_string )
@@ -1786,7 +1793,7 @@ def do_it():
return
- content_update_index_string = 'content row ' + HydrusData.ConvertValueRangeToPrettyString( c_u_p_total_weight_processed, c_u_p_num_rows ) + ': '
+ content_update_index_string = 'content row ' + HydrusNumbers.ValueRangeToPrettyString( c_u_p_total_weight_processed, c_u_p_num_rows ) + ': '
job_status.SetStatusText( content_update_index_string + 'committing' + update_speed_string )
@@ -2941,7 +2948,7 @@ def do_it( dest_dir, service ):
finally:
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( i + 1, num_to_do ) )
job_status.SetVariable( 'popup_gauge_1', ( i, num_to_do ) )
@@ -3359,7 +3366,7 @@ def qt_code( num_local_updates, num_updates, content_types_to_num_processed_upda
return
- download_text = 'downloaded {}'.format( HydrusData.ConvertValueRangeToPrettyString( num_local_updates, num_updates ) )
+ download_text = 'downloaded {}'.format( HydrusNumbers.ValueRangeToPrettyString( num_local_updates, num_updates ) )
self._download_progress.SetValue( download_text, num_local_updates, num_updates )
@@ -3373,7 +3380,7 @@ def qt_code( num_local_updates, num_updates, content_types_to_num_processed_upda
processing_work_to_do = True
- definitions_text = 'definitions: {}'.format( HydrusData.ConvertValueRangeToPrettyString( d_value, d_range ) )
+ definitions_text = 'definitions: {}'.format( HydrusNumbers.ValueRangeToPrettyString( d_value, d_range ) )
self._processing_definitions_progress.SetValue( definitions_text, d_value, d_range )
@@ -3388,7 +3395,7 @@ def qt_code( num_local_updates, num_updates, content_types_to_num_processed_upda
processing_work_to_do = True
- content_text = '{}: {}'.format( HC.content_type_string_lookup[ content_type ], HydrusData.ConvertValueRangeToPrettyString( c_value, c_range ) )
+ content_text = '{}: {}'.format( HC.content_type_string_lookup[ content_type ], HydrusNumbers.ValueRangeToPrettyString( c_value, c_range ) )
gauge.SetValue( content_text, c_value, c_range )
diff --git a/hydrus/client/gui/widgets/ClientGUICommon.py b/hydrus/client/gui/widgets/ClientGUICommon.py
index 5c1376a89..069713aa1 100644
--- a/hydrus/client/gui/widgets/ClientGUICommon.py
+++ b/hydrus/client/gui/widgets/ClientGUICommon.py
@@ -6,6 +6,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
+from hydrus.core import HydrusLists
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
@@ -48,7 +49,7 @@ def WrapInGrid( parent, rows, expand_text = False, add_stretch_at_end = True, ex
for row in rows:
- if HydrusData.IsAListLikeCollection( row ) and len( row ) == 2:
+ if HydrusLists.IsAListLikeCollection( row ) and len( row ) == 2:
( text, control ) = row
diff --git a/hydrus/client/gui/widgets/ClientGUIRegex.py b/hydrus/client/gui/widgets/ClientGUIRegex.py
index 6fdeccde4..de058f0be 100644
--- a/hydrus/client/gui/widgets/ClientGUIRegex.py
+++ b/hydrus/client/gui/widgets/ClientGUIRegex.py
@@ -167,12 +167,13 @@ def _ManageFavourites( self ):
class RegexInput( QW.QWidget ):
textChanged = QC.Signal()
- userHitEnter = QC.Signal()
def __init__( self, parent: QW.QWidget, show_group_menu = False ):
QW.QWidget.__init__( self, parent )
+ self._allow_enter_key_to_propagate_outside = True
+
self._regex_text = QW.QLineEdit( self )
self._regex_text.setPlaceholderText( 'regex input' )
@@ -185,7 +186,6 @@ def __init__( self, parent: QW.QWidget, show_group_menu = False ):
self.setLayout( hbox )
- self._regex_text.installEventFilter( ClientGUICommon.TextCatchEnterEventFilter( self, self.userHitEnter.emit ) )
self._regex_text.textChanged.connect( self.textChanged )
self._regex_text.textChanged.connect( self._UpdateValidityStyle )
@@ -208,6 +208,12 @@ def _UpdateValidityStyle( self ):
self._regex_text.style().polish( self._regex_text )
+ def SetEnterCallable( self, c ):
+
+ # note that this event filter consumes Enter presses, so once a callable is set, Enter will no longer trigger dialog ok!
+ self._regex_text.installEventFilter( ClientGUICommon.TextCatchEnterEventFilter( self, c ) )
+
+
def GetValue( self ) -> str:
return self._regex_text.text()
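The design change here is that the always-installed userHitEnter signal is gone and the Enter hook is now opt-in: callers that do nothing get the new default of Enter propagating up to the dialog ok, while callers that still want an explicit Enter action install one via SetEnterCallable and accept that the event filter swallows the keypress. A small hedged sketch of the opt-in side; the function and the regex_input argument are placeholders, not code from the repo:

    def hook_up_regex_enter_action( regex_input, add_favourite_callable ):

        # opt in to an explicit Enter action; the event filter consumes the keypress,
        # so Enter will no longer bubble up and fire the parent dialog's ok
        regex_input.SetEnterCallable( add_favourite_callable )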
diff --git a/hydrus/client/importing/ClientImportFileSeeds.py b/hydrus/client/importing/ClientImportFileSeeds.py
index caa76c60b..cbc6312b2 100644
--- a/hydrus/client/importing/ClientImportFileSeeds.py
+++ b/hydrus/client/importing/ClientImportFileSeeds.py
@@ -81,8 +81,9 @@ def FileURLMappingHasUntrustworthyNeighbours( hash: bytes, lookup_urls: typing.C
if len( lookup_urls ) == 0:
- # what is going on, yes, whatever garbage you just threw at me is not to be trusted to produce a dispositive result
- return True
+ # this method was most likely called with only File/Unknown URLs--those are the only URL types that could be providing results here
+ # we cannot adjudicate File/Unknown URL trustworthiness here, so we do not flag any untrustworthy neighbours and return False
+ return False
lookup_url_domains = { ClientNetworkingFunctions.ConvertURLIntoDomain( lookup_url ) for lookup_url in lookup_urls }
@@ -1971,7 +1972,7 @@ def GetStatusText( self, simple = False ) -> str:
if num_unknown > 0:
- status_text += HydrusData.ConvertValueRangeToPrettyString( total_processed, total )
+ status_text += HydrusNumbers.ValueRangeToPrettyString( total_processed, total )
else:
diff --git a/hydrus/client/importing/ClientImportLocal.py b/hydrus/client/importing/ClientImportLocal.py
index d2ad1c014..b6cdc1bc5 100644
--- a/hydrus/client/importing/ClientImportLocal.py
+++ b/hydrus/client/importing/ClientImportLocal.py
@@ -772,7 +772,7 @@ def _ImportFiles( self, job_status ):
gauge_num_done = num_files_imported + 1
- job_status.SetStatusText( 'importing file ' + HydrusData.ConvertValueRangeToPrettyString( gauge_num_done, num_total ) )
+ job_status.SetStatusText( 'importing file ' + HydrusNumbers.ValueRangeToPrettyString( gauge_num_done, num_total ) )
job_status.SetVariable( 'popup_gauge_1', ( gauge_num_done, num_total ) )
path = file_seed.file_seed_data
diff --git a/hydrus/client/importing/ClientImportSubscriptions.py b/hydrus/client/importing/ClientImportSubscriptions.py
index 82b4074dd..c536a9ff4 100644
--- a/hydrus/client/importing/ClientImportSubscriptions.py
+++ b/hydrus/client/importing/ClientImportSubscriptions.py
@@ -306,7 +306,7 @@ def _SyncQueries( self, job_status ):
status_prefix += ' "' + query_name + '"'
- status_prefix += ' (' + HydrusData.ConvertValueRangeToPrettyString( i + 1, num_queries ) + ')'
+ status_prefix += ' (' + HydrusNumbers.ValueRangeToPrettyString( i + 1, num_queries ) + ')'
try:
@@ -875,7 +875,7 @@ def _WorkOnQueriesFiles( self, job_status ):
query_summary_name += ': ' + query_name
- text_1 += ' (' + HydrusData.ConvertValueRangeToPrettyString( i + 1, num_queries ) + ')'
+ text_1 += ' (' + HydrusNumbers.ValueRangeToPrettyString( i + 1, num_queries ) + ')'
job_status.SetStatusText( text_1 )
@@ -1054,7 +1054,7 @@ def _WorkOnQueryFiles(
human_num_urls = num_urls - starting_num_done
human_num_done = num_done - starting_num_done
- x_out_of_y = 'file ' + HydrusData.ConvertValueRangeToPrettyString( human_num_done + 1, human_num_urls ) + ': '
+ x_out_of_y = 'file ' + HydrusNumbers.ValueRangeToPrettyString( human_num_done + 1, human_num_urls ) + ': '
job_status.SetVariable( 'popup_gauge_2', ( human_num_done, human_num_urls ) )
diff --git a/hydrus/client/importing/ClientImporting.py b/hydrus/client/importing/ClientImporting.py
index b51adb508..be374b855 100644
--- a/hydrus/client/importing/ClientImporting.py
+++ b/hydrus/client/importing/ClientImporting.py
@@ -243,7 +243,7 @@ def status_hook( text ):
break
- job_status.SetStatusText( HydrusData.ConvertValueRangeToPrettyString( i + 1, len( urls ) ) )
+ job_status.SetStatusText( HydrusNumbers.ValueRangeToPrettyString( i + 1, len( urls ) ) )
job_status.SetVariable( 'popup_gauge_1', ( i + 1, len( urls ) ) )
file_seed = ClientImportFileSeeds.FileSeed( ClientImportFileSeeds.FILE_SEED_TYPE_URL, url )
diff --git a/hydrus/client/importing/options/FileImportOptions.py b/hydrus/client/importing/options/FileImportOptions.py
index 5373265a1..7765332cd 100644
--- a/hydrus/client/importing/options/FileImportOptions.py
+++ b/hydrus/client/importing/options/FileImportOptions.py
@@ -368,7 +368,7 @@ def CheckFileIsValid( self, size, mime, width, height ):
if too_thin or too_short:
- raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + HydrusData.ConvertResolutionToPrettyString( ( width, height ) ) + ' but the lower limit is ' + HydrusData.ConvertResolutionToPrettyString( self._min_resolution ) )
+ raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + HydrusNumbers.ResolutionToPrettyString( ( width, height ) ) + ' but the lower limit is ' + HydrusNumbers.ResolutionToPrettyString( self._min_resolution ) )
@@ -381,7 +381,7 @@ def CheckFileIsValid( self, size, mime, width, height ):
if too_wide or too_tall:
- raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + HydrusData.ConvertResolutionToPrettyString( ( width, height ) ) + ' but the upper limit is ' + HydrusData.ConvertResolutionToPrettyString( self._max_resolution ) )
+ raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + HydrusNumbers.ResolutionToPrettyString( ( width, height ) ) + ' but the upper limit is ' + HydrusNumbers.ResolutionToPrettyString( self._max_resolution ) )
diff --git a/hydrus/client/media/ClientMedia.py b/hydrus/client/media/ClientMedia.py
index 9a4c1894f..f0475403f 100644
--- a/hydrus/client/media/ClientMedia.py
+++ b/hydrus/client/media/ClientMedia.py
@@ -5,6 +5,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTime
@@ -13,6 +14,7 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientLocation
+from hydrus.client import ClientServices
from hydrus.client import ClientTime
from hydrus.client.media import ClientMediaManagers
from hydrus.client.media import ClientMediaResult
@@ -1290,7 +1292,7 @@ def ProcessContentUpdatePackage( self, full_content_update_package: ClientConten
self._RecalcAfterContentUpdates( content_update_package )
- def ProcessServiceUpdates( self, service_keys_to_service_updates ):
+ def ProcessServiceUpdates( self, service_keys_to_service_updates: typing.Dict[ bytes, typing.Collection[ ClientServices.ServiceUpdate ] ] ):
for ( service_key, service_updates ) in service_keys_to_service_updates.items():
@@ -1529,8 +1531,8 @@ def _RecalcHashes( self ):
current = set( current_to_timestamps_ms.keys() )
deleted = set( deleted_to_timestamps_ms.keys() )
- pending = HydrusData.MassUnion( [ locations_manager.GetPending() for locations_manager in all_locations_managers ] )
- petitioned = HydrusData.MassUnion( [ locations_manager.GetPetitioned() for locations_manager in all_locations_managers ] )
+ pending = HydrusLists.MassUnion( [ locations_manager.GetPending() for locations_manager in all_locations_managers ] )
+ petitioned = HydrusLists.MassUnion( [ locations_manager.GetPetitioned() for locations_manager in all_locations_managers ] )
times_manager = ClientMediaManagers.TimesManager()
@@ -1935,7 +1937,7 @@ def timestamp_ms_is_interesting( timestamp_ms_1, timestamp_ms_2 ):
if width is not None and height is not None:
- info_string += f' ({HydrusData.ConvertResolutionToPrettyString( ( width, height ) )})'
+ info_string += f' ({HydrusNumbers.ResolutionToPrettyString( ( width, height ) )})'
if duration is not None:
diff --git a/hydrus/client/media/ClientMediaFileFilter.py b/hydrus/client/media/ClientMediaFileFilter.py
index 0327c0e7a..70579391d 100644
--- a/hydrus/client/media/ClientMediaFileFilter.py
+++ b/hydrus/client/media/ClientMediaFileFilter.py
@@ -5,6 +5,7 @@
from hydrus.core import HydrusText
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusSerialisable
@@ -190,7 +191,7 @@ def GetMediaListFileCount( self, media_list: ClientMedia.MediaList ):
elif and_or_or == 'OR':
- return sum( ( 1 for m in flat_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL ), select_tags ) ) )
+ return sum( ( 1 for m in flat_media if HydrusLists.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL ), select_tags ) ) )
@@ -271,7 +272,7 @@ def GetMediaListHashes( self, media_list: ClientMedia.MediaList ):
elif and_or_or == 'OR':
- filtered_media = [ m for m in flat_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL ), select_tags ) ]
+ filtered_media = [ m for m in flat_media if HydrusLists.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL ), select_tags ) ]
@@ -335,7 +336,7 @@ def GetMediaListMedia( self, media_list: ClientMedia.MediaList ):
elif and_or_or == 'OR':
- filtered_media = { m for m in media_list.GetSortedMedia() if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL ), select_tags ) }
+ filtered_media = { m for m in media_list.GetSortedMedia() if HydrusLists.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL ), select_tags ) }
diff --git a/hydrus/client/media/ClientMediaResultCache.py b/hydrus/client/media/ClientMediaResultCache.py
index 724901804..916c7588e 100644
--- a/hydrus/client/media/ClientMediaResultCache.py
+++ b/hydrus/client/media/ClientMediaResultCache.py
@@ -8,6 +8,7 @@
from hydrus.core import HydrusThreading
from hydrus.client import ClientGlobals as CG
+from hydrus.client import ClientServices
from hydrus.client.media import ClientMediaResult
from hydrus.client.metadata import ClientContentUpdates
from hydrus.client.metadata import ClientTags
@@ -214,7 +215,7 @@ def ProcessContentUpdatePackage( self, content_update_package: ClientContentUpda
- def ProcessServiceUpdates( self, service_keys_to_service_updates ):
+ def ProcessServiceUpdates( self, service_keys_to_service_updates: typing.Dict[ bytes, typing.Collection[ ClientServices.ServiceUpdate ] ] ):
with self._lock:
diff --git a/hydrus/client/networking/ClientLocalServer.py b/hydrus/client/networking/ClientLocalServer.py
index 7dd65980f..75ad65c0f 100644
--- a/hydrus/client/networking/ClientLocalServer.py
+++ b/hydrus/client/networking/ClientLocalServer.py
@@ -110,6 +110,14 @@ def _InitRoot( self ):
manage_database.putChild( b'lock_off', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageDatabaseLockOff( self._service, self._client_requests_domain ) )
manage_database.putChild( b'get_client_options', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageDatabaseGetClientOptions( self._service, self._client_requests_domain ) )
+ manage_services = NoResource()
+
+ root.putChild( b'manage_services', manage_services )
+
+ manage_services.putChild( b'get_pending_counts', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageServicesPendingCounts( self._service, self._client_requests_domain ) )
+ manage_services.putChild( b'commit_pending', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageServicesCommitPending( self._service, self._client_requests_domain ) )
+ manage_services.putChild( b'forget_pending', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageServicesForgetPending( self._service, self._client_requests_domain ) )
+
manage_file_relationships = NoResource()
root.putChild( b'manage_file_relationships', manage_file_relationships )
@@ -118,6 +126,7 @@ def _InitRoot( self ):
manage_file_relationships.putChild( b'get_potentials_count', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialsCount( self._service, self._client_requests_domain ) )
manage_file_relationships.putChild( b'get_potential_pairs', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialPairs( self._service, self._client_requests_domain ) )
manage_file_relationships.putChild( b'get_random_potentials', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetRandomPotentials( self._service, self._client_requests_domain ) )
+ manage_file_relationships.putChild( b'remove_potentials', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsRemovePotentials( self._service, self._client_requests_domain ) )
manage_file_relationships.putChild( b'set_file_relationships', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsSetRelationships( self._service, self._client_requests_domain ) )
manage_file_relationships.putChild( b'set_kings', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsSetKings( self._service, self._client_requests_domain ) )
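A hedged sketch of how the new /manage_services routes might be exercised from outside the client. The endpoint paths come straight from the hunk above; the address, header name, and request shape follow the usual Client API conventions, and the key and service key are placeholders:

    import requests

    API = 'http://127.0.0.1:45869' # default Client API address; adjust to your setup
    HEADERS = { 'Hydrus-Client-API-Access-Key' : 'replace with an access key that has the new Commit Pending permission' }

    # what is sitting in the client's 'pending' menu, per service
    pending = requests.get( f'{API}/manage_services/get_pending_counts', headers = HEADERS ).json()

    print( pending[ 'pending_counts' ] )

    # commit (or forget) one service's pending content
    service_key_hex = 'replace with the hex service key of a tag/file repository or ipfs service'

    requests.post( f'{API}/manage_services/commit_pending', headers = HEADERS, json = { 'service_key' : service_key_hex } )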
diff --git a/hydrus/client/networking/ClientLocalServerResources.py b/hydrus/client/networking/ClientLocalServerResources.py
index ac7940af0..6b62aa408 100644
--- a/hydrus/client/networking/ClientLocalServerResources.py
+++ b/hydrus/client/networking/ClientLocalServerResources.py
@@ -25,6 +25,7 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusLists
from hydrus.core import HydrusPaths
from hydrus.core import HydrusTags
from hydrus.core import HydrusTemp
@@ -202,6 +203,23 @@ def CheckTags( tags: typing.Collection[ str ] ):
+def CheckUploadableService( service_key: bytes ):
+
+ try:
+
+ service = CG.client_controller.services_manager.GetService( service_key )
+
+ except:
+
+ raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( service_key.hex() ) )
+
+
+ if service.GetServiceType() not in ( HC.IPFS, HC.FILE_REPOSITORY, HC.TAG_REPOSITORY ):
+
+ raise HydrusExceptions.BadRequestException( f'Sorry, the service key "{service_key.hex()}" was not for an uploadable service!' )
+
+
+
def GetServicesDict():
service_types = [
@@ -1937,7 +1955,7 @@ def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
tag = tag_item
- elif HydrusData.IsAListLikeCollection( tag_item ) and len( tag_item ) == 2:
+ elif HydrusLists.IsAListLikeCollection( tag_item ) and len( tag_item ) == 2:
( tag, reason ) = tag_item
@@ -2378,6 +2396,7 @@ def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
return response_context
+
class HydrusResourceClientAPIRestrictedAddURLsImportURL( HydrusResourceClientAPIRestrictedAddURLs ):
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
@@ -4075,6 +4094,20 @@ def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
+class HydrusResourceClientAPIRestrictedManageFileRelationshipsRemovePotentials( HydrusResourceClientAPIRestrictedManageFileRelationships ):
+
+ def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
+
+ hashes = ParseHashes( request )
+
+ CG.client_controller.WriteSynchronous( 'remove_potential_pairs', hashes )
+
+ response_context = HydrusServerResources.ResponseContext( 200 )
+
+ return response_context
+
+
+
class HydrusResourceClientAPIRestrictedManageFileRelationshipsSetKings( HydrusResourceClientAPIRestrictedManageFileRelationships ):
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
@@ -4748,3 +4781,99 @@ def _threadDoPOSTJob(self, request: HydrusServerRequest.HydrusRequest ):
+class HydrusResourceClientAPIRestrictedManageServices( HydrusResourceClientAPIRestricted ):
+
+ pass
+
+
+class HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs( HydrusResourceClientAPIRestrictedManageServices ):
+
+ def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
+
+ request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_COMMIT_PENDING )
+
+
+
+class HydrusResourceClientAPIRestrictedManageServicesPendingCounts( HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs ):
+
+ def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
+
+ info_type_to_str_lookup = {
+ HC.SERVICE_INFO_NUM_PENDING_MAPPINGS : 'pending_tag_mappings',
+ HC.SERVICE_INFO_NUM_PETITIONED_MAPPINGS : 'petitioned_tag_mappings',
+ HC.SERVICE_INFO_NUM_PENDING_TAG_SIBLINGS : 'pending_tag_siblings',
+ HC.SERVICE_INFO_NUM_PETITIONED_TAG_SIBLINGS : 'petitioned_tag_siblings',
+ HC.SERVICE_INFO_NUM_PENDING_TAG_PARENTS : 'pending_tag_parents',
+ HC.SERVICE_INFO_NUM_PETITIONED_TAG_PARENTS : 'petitioned_tag_parents',
+ HC.SERVICE_INFO_NUM_PENDING_FILES : 'pending_files',
+ HC.SERVICE_INFO_NUM_PETITIONED_FILES : 'petitioned_files',
+ }
+
+ service_keys_to_info_types_to_counts = CG.client_controller.Read( 'nums_pending' )
+
+ body_dict = {
+ 'pending_counts' : { service_key.hex() : { info_type_to_str_lookup[ info_type ] : count for ( info_type, count ) in info_types_to_counts.items() } for ( service_key, info_types_to_counts ) in service_keys_to_info_types_to_counts.items() }
+ }
+
+ body = Dumps( body_dict, request.preferred_mime )
+
+ response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
+
+ return response_context
+
+
+
+class HydrusResourceClientAPIRestrictedManageServicesCommitPending( HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs ):
+
+ def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
+
+ service_key = request.parsed_request_args.GetValue( 'service_key', bytes )
+
+ CheckUploadableService( service_key )
+
+ def do_it():
+
+ if CG.client_controller.gui.IsCurrentlyUploadingPending( service_key ):
+
+ raise HydrusExceptions.ConflictException( 'Upload is already running.' )
+
+
+ result = CG.client_controller.gui.UploadPending( service_key )
+
+ if not result:
+
+ raise HydrusExceptions.UnprocessableEntity( 'Sorry, could not start for some complex reason--check the client!' )
+
+
+
+ CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it )
+
+ body_dict = {}
+
+ body = Dumps( body_dict, request.preferred_mime )
+
+ response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
+
+ return response_context
+
+
+
+class HydrusResourceClientAPIRestrictedManageServicesForgetPending( HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs ):
+
+ def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
+
+ service_key = request.parsed_request_args.GetValue( 'service_key', bytes )
+
+ CheckUploadableService( service_key )
+
+ CG.client_controller.WriteSynchronous( 'delete_pending', service_key )
+
+ body_dict = {}
+
+ body = Dumps( body_dict, request.preferred_mime )
+
+ response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
+
+ return response_context
+
+
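Reading info_type_to_str_lookup and the comprehension above, a get_pending_counts body should look roughly like the following. Service keys are hex strings, and each service only reports the info types it actually carries, so a tag repository shows mapping/sibling/parent counts while a file repository or IPFS service shows file counts; all keys and numbers below are placeholders:

    example_body = {
        'pending_counts' : {
            '<tag repository service key, in hex>' : {
                'pending_tag_mappings' : 11564,
                'petitioned_tag_mappings' : 3,
                'pending_tag_siblings' : 2,
                'petitioned_tag_siblings' : 0,
                'pending_tag_parents' : 0,
                'petitioned_tag_parents' : 0
            },
            '<file repository service key, in hex>' : {
                'pending_files' : 41,
                'petitioned_files' : 0
            }
        }
    }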
diff --git a/hydrus/client/networking/ClientNetworkingJobs.py b/hydrus/client/networking/ClientNetworkingJobs.py
index ea8879af5..79c13d6bb 100644
--- a/hydrus/client/networking/ClientNetworkingJobs.py
+++ b/hydrus/client/networking/ClientNetworkingJobs.py
@@ -95,6 +95,10 @@ def ConvertStatusCodeAndDataIntoExceptionInfo( status_code, data, is_hydrus_serv
eclass = HydrusExceptions.SessionException
+ elif status_code == 422:
+
+ eclass = HydrusExceptions.UnprocessableEntity
+
elif status_code == 426:
eclass = HydrusExceptions.NetworkVersionException
diff --git a/hydrus/client/search/ClientSearch.py b/hydrus/client/search/ClientSearch.py
index 127c59958..d645d81a7 100644
--- a/hydrus/client/search/ClientSearch.py
+++ b/hydrus/client/search/ClientSearch.py
@@ -567,7 +567,7 @@ def ToString( self, absolute_number_renderer: typing.Optional[ typing.Callable ]
if self.operator == NUMBER_TEST_OPERATOR_APPROXIMATE_PERCENT:
- result += f' {HC.UNICODE_PLUS_OR_MINUS}{HydrusData.ConvertFloatToPercentage(self.extra_value)}'
+ result += f' {HC.UNICODE_PLUS_OR_MINUS}{HydrusNumbers.FloatToPercentage(self.extra_value)}'
elif self.operator == NUMBER_TEST_OPERATOR_APPROXIMATE_ABSOLUTE:
@@ -2747,7 +2747,7 @@ def ToString( self, with_count: bool = True, render_for_user: bool = False, or_u
( operator, size, unit ) = self._value
- base += ' ' + operator + ' ' + str( size ) + HydrusData.ConvertIntToUnit( unit )
+ base += ' ' + operator + ' ' + str( size ) + HydrusNumbers.IntToUnit( unit )
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_LIMIT:
@@ -2873,7 +2873,7 @@ def ToString( self, with_count: bool = True, render_for_user: bool = False, or_u
( operator, num_pixels, unit ) = self._value
- base += ' ' + operator + ' ' + str( num_pixels ) + ' ' + HydrusData.ConvertIntToPixels( unit )
+ base += ' ' + operator + ' ' + str( num_pixels ) + ' ' + HydrusNumbers.IntToPixels( unit )
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_KNOWN_URLS:
diff --git a/hydrus/client/search/ClientSearchParseSystemPredicates.py b/hydrus/client/search/ClientSearchParseSystemPredicates.py
index a4ae59319..95798046c 100644
--- a/hydrus/client/search/ClientSearchParseSystemPredicates.py
+++ b/hydrus/client/search/ClientSearchParseSystemPredicates.py
@@ -5,6 +5,7 @@
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusNumbers
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientGlobals as CG
@@ -234,14 +235,14 @@ def url_class_pred_generator( include, url_class_name ):
SystemPredicateParser.Predicate.NUM_OF_WORDS : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS, ClientSearch.NumberTest.STATICCreateFromCharacters( o, v ) ),
SystemPredicateParser.Predicate.HEIGHT : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_HEIGHT, ClientSearch.NumberTest.STATICCreateFromCharacters( o, v ) ),
SystemPredicateParser.Predicate.WIDTH : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_WIDTH, ClientSearch.NumberTest.STATICCreateFromCharacters( o, v ) ),
- SystemPredicateParser.Predicate.FILESIZE : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( o, v, HydrusData.ConvertUnitToInt( u ) ) ),
+ SystemPredicateParser.Predicate.FILESIZE : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( o, v, HydrusNumbers.UnitToInt( u ) ) ),
SystemPredicateParser.Predicate.SIMILAR_TO_FILES : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO_FILES, convert_hex_hashlist_and_other_to_bytes_and_other( v ) ),
SystemPredicateParser.Predicate.SIMILAR_TO_DATA : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO_DATA, convert_double_hex_hashlist_and_other_to_double_bytes_and_other( v ) ),
SystemPredicateParser.Predicate.HASH : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, convert_hex_hashlist_and_other_to_bytes_and_other( v ), inclusive = o == '=' ),
SystemPredicateParser.Predicate.DURATION : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ClientSearch.NumberTest.STATICCreateFromCharacters( o, v[0] * 1000 + v[1] ) ),
SystemPredicateParser.Predicate.FRAMERATE : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_FRAMERATE, ClientSearch.NumberTest.STATICCreateFromCharacters( o, v ) ),
SystemPredicateParser.Predicate.NUM_OF_FRAMES : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_FRAMES, ClientSearch.NumberTest.STATICCreateFromCharacters( o, v ) ),
- SystemPredicateParser.Predicate.NUM_PIXELS : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_PIXELS, ( o, v, HydrusData.ConvertPixelsToInt( u ) ) ),
+ SystemPredicateParser.Predicate.NUM_PIXELS : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_PIXELS, ( o, v, HydrusNumbers.PixelsToInt( u ) ) ),
SystemPredicateParser.Predicate.RATIO : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_RATIO, ( o, v[0], v[1] ) ),
SystemPredicateParser.Predicate.RATIO_SPECIAL : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_RATIO, ( o, v[0], v[1] ) ),
SystemPredicateParser.Predicate.TAG_AS_NUMBER : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, ( o[0], o[1], v ) ),
diff --git a/hydrus/core/HydrusConstants.py b/hydrus/core/HydrusConstants.py
index 2d75374b3..d38de72ee 100644
--- a/hydrus/core/HydrusConstants.py
+++ b/hydrus/core/HydrusConstants.py
@@ -105,8 +105,8 @@
# Misc
NETWORK_VERSION = 20
-SOFTWARE_VERSION = 583
-CLIENT_API_VERSION = 65
+SOFTWARE_VERSION = 584
+CLIENT_API_VERSION = 66
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
diff --git a/hydrus/core/HydrusController.py b/hydrus/core/HydrusController.py
index 4651dfcf9..9f1f172f4 100644
--- a/hydrus/core/HydrusController.py
+++ b/hydrus/core/HydrusController.py
@@ -10,6 +10,7 @@
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusPaths
+from hydrus.core import HydrusProcess
from hydrus.core import HydrusPubSub
from hydrus.core import HydrusThreading
from hydrus.core import HydrusTemp
@@ -736,7 +737,7 @@ def RecordRunningStart( self ):
self._i_own_running_file = True
- HydrusData.RecordRunningStart( self.db_dir, self._name )
+ HydrusProcess.RecordRunningStart( self.db_dir, self._name )
def ReleaseThreadSlot( self, thread_type ):
diff --git a/hydrus/core/HydrusDB.py b/hydrus/core/HydrusDB.py
index 5dbcdd514..e6d6ecd09 100644
--- a/hydrus/core/HydrusDB.py
+++ b/hydrus/core/HydrusDB.py
@@ -395,7 +395,7 @@ def _DoAfterJobWork( self ):
def _GenerateDBJob( self, job_type, synchronous, action, *args, **kwargs ):
- return HydrusData.JobDatabase( job_type, synchronous, action, *args, **kwargs )
+ return HydrusDBBase.JobDatabase( job_type, synchronous, action, *args, **kwargs )
def _GetPossibleAdditionalDBFilenames( self ):
diff --git a/hydrus/core/HydrusDBBase.py b/hydrus/core/HydrusDBBase.py
index b47c9fb58..e72323a00 100644
--- a/hydrus/core/HydrusDBBase.py
+++ b/hydrus/core/HydrusDBBase.py
@@ -1,9 +1,12 @@
import collections
+import threading
+import time
import typing
import sqlite3
from hydrus.core import HydrusData
+from hydrus.core import HydrusExceptions
from hydrus.core import HydrusPaths
from hydrus.core import HydrusPSUtil
from hydrus.core import HydrusGlobals as HG
@@ -178,6 +181,7 @@ def ReleaseName( self, column_name, table_name ):
self._column_names_to_table_names[ column_name ].append( table_name )
+
class TemporaryIntegerTable( object ):
def __init__( self, cursor: sqlite3.Cursor, integer_iterable, column_name ):
@@ -215,6 +219,91 @@ def __exit__( self, exc_type, exc_val, exc_tb ):
return False
+
+class JobDatabase( object ):
+
+ def __init__( self, job_type, synchronous, action, *args, **kwargs ):
+
+ self._type = job_type
+ self._synchronous = synchronous
+ self._action = action
+ self._args = args
+ self._kwargs = kwargs
+
+ self._result_ready = threading.Event()
+
+
+ def __str__( self ):
+
+ return 'DB Job: {}'.format( self.ToString() )
+
+
+ def _DoDelayedResultRelief( self ):
+
+ pass
+
+
+ def GetCallableTuple( self ):
+
+ return ( self._action, self._args, self._kwargs )
+
+
+ def GetResult( self ):
+
+ time.sleep( 0.00001 ) # this one neat trick can save hassle on superquick jobs as event.wait can be laggy
+
+ while True:
+
+ result_was_ready = self._result_ready.wait( 2 )
+
+ if result_was_ready:
+
+ break
+
+
+ if HG.model_shutdown:
+
+ raise HydrusExceptions.ShutdownException( 'Application quit before db could serve result!' )
+
+
+ self._DoDelayedResultRelief()
+
+
+ if isinstance( self._result, Exception ):
+
+ e = self._result
+
+ raise e
+
+ else:
+
+ return self._result
+
+
+
+ def GetType( self ):
+
+ return self._type
+
+
+ def IsSynchronous( self ):
+
+ return self._synchronous
+
+
+ def PutResult( self, result ):
+
+ self._result = result
+
+ self._result_ready.set()
+
+
+ def ToString( self ):
+
+ return '{} {}'.format( self._type, self._action )
+
+
+
class DBBase( object ):
def __init__( self ):
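JobDatabase lands here unchanged from HydrusData. A minimal sketch of the handshake it implements, with a stand-in worker thread instead of the real DB thread; the action string and argument are arbitrary, and this assumes the hydrus modules import cleanly outside a running client:

    import threading

    from hydrus.core import HydrusDBBase

    job = HydrusDBBase.JobDatabase( 'read', True, 'example_action', 'example_arg' )

    def fake_db_thread():

        ( action, args, kwargs ) = job.GetCallableTuple()

        # the real DB thread does the work here, then publishes the answer (or an Exception to be re-raised)
        job.PutResult( f'{action} done with {args}' )


    threading.Thread( target = fake_db_thread ).start()

    print( job.GetResult() ) # blocks on the internal Event until PutResult fires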
diff --git a/hydrus/core/HydrusData.py b/hydrus/core/HydrusData.py
index 2e62b9896..48313836c 100644
--- a/hydrus/core/HydrusData.py
+++ b/hydrus/core/HydrusData.py
@@ -7,21 +7,16 @@
import random
import re
import struct
-import subprocess
import sys
-import threading
import time
import traceback
import typing
import yaml
-from hydrus.core import HydrusBoot
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusNumbers
-from hydrus.core import HydrusPSUtil
-from hydrus.core import HydrusText
def default_dict_list(): return collections.defaultdict( list )
@@ -35,6 +30,7 @@ def BuildKeyToListDict( pairs ):
return d
+
def BuildKeyToSetDict( pairs ):
d = collections.defaultdict( set )
@@ -43,6 +39,7 @@ def BuildKeyToSetDict( pairs ):
return d
+
def BytesToNoneOrHex( b: typing.Optional[ bytes ] ):
if b is None:
@@ -54,6 +51,7 @@ def BytesToNoneOrHex( b: typing.Optional[ bytes ] ):
return b.hex()
+
def CalculateScoreFromRating( count, rating ):
# https://www.evanmiller.org/how-not-to-sort-by-average-rating.html
@@ -68,6 +66,7 @@ def CalculateScoreFromRating( count, rating ):
return score
+
def CheckProgramIsNotShuttingDown():
if HG.model_shutdown:
@@ -75,6 +74,7 @@ def CheckProgramIsNotShuttingDown():
raise HydrusExceptions.ShutdownException( 'Application is shutting down!' )
+
def CleanRunningFile( db_path, instance ):
# just to be careful
@@ -91,59 +91,6 @@ def CleanRunningFile( db_path, instance ):
-# TODO: remove all the 'Convert' from these
-def ConvertFloatToPercentage( f ):
-
- percent = f * 100
-
- if percent == int( percent ):
-
- return f'{int( percent )}%'
-
- else:
-
- return f'{percent:.1f}%'
-
-
-def ConvertIntToPixels( i ):
-
- if i == 1: return 'pixels'
- elif i == 1000: return 'kilopixels'
- elif i == 1000000: return 'megapixels'
- else: return 'megapixels'
-
-def ConvertIntToUnit( unit ):
-
- if unit == 1: return 'B'
- elif unit == 1024: return 'KB'
- elif unit == 1048576: return 'MB'
- elif unit == 1073741824: return 'GB'
-
-
-def ConvertNumericalRatingToPrettyString( lower, upper, rating, rounded_result = False, out_of = True ):
-
- rating_converted = ( rating * ( upper - lower ) ) + lower
-
- if rounded_result:
-
- rating_converted = round( rating_converted )
-
-
- s = '{:.2f}'.format( rating_converted )
-
- if out_of and lower in ( 0, 1 ):
-
- s += '/{:.2f}'.format( upper )
-
-
- return s
-
-def ConvertPixelsToInt( unit ):
-
- if unit == 'pixels': return 1
- elif unit == 'kilopixels': return 1000
- elif unit == 'megapixels': return 1000000
-
def ConvertPrettyStringsToUglyNamespaces( pretty_strings ):
result = { s for s in pretty_strings if s != 'no namespace' }
@@ -152,34 +99,7 @@ def ConvertPrettyStringsToUglyNamespaces( pretty_strings ):
return result
-def ConvertResolutionToPrettyString( resolution ):
-
- if resolution is None:
-
- return 'no resolution'
-
-
- if not isinstance( resolution, tuple ):
-
- try:
-
- resolution = tuple( resolution )
-
- except:
-
- return 'broken resolution'
-
-
-
- if resolution in HC.NICE_RESOLUTIONS:
-
- return HC.NICE_RESOLUTIONS[ resolution ]
-
-
- ( width, height ) = resolution
-
- return '{}x{}'.format( HydrusNumbers.ToHumanInt( width ), HydrusNumbers.ToHumanInt( height ) )
-
+
def ConvertStatusToPrefix( status ):
if status == HC.CONTENT_STATUS_CURRENT: return ''
@@ -189,40 +109,11 @@ def ConvertStatusToPrefix( status ):
else: return '(?)'
-def ConvertUglyNamespaceToPrettyString( namespace ):
-
- if namespace is None or namespace == '':
-
- return 'no namespace'
-
- else:
-
- return namespace
-
-
-def ConvertUglyNamespacesToPrettyStrings( namespaces ):
-
- namespaces = sorted( namespaces )
-
- result = [ ConvertUglyNamespaceToPrettyString( namespace ) for namespace in namespaces ]
-
- return result
-
-def ConvertUnitToInt( unit ):
-
- if unit == 'B': return 1
- elif unit == 'KB': return 1024
- elif unit == 'MB': return 1048576
- elif unit == 'GB': return 1073741824
-
def ConvertValueRangeToBytes( value, range ):
return ToHumanBytes( value ) + '/' + ToHumanBytes( range )
-def ConvertValueRangeToPrettyString( value, range ):
-
- return HydrusNumbers.ToHumanInt( value ) + '/' + HydrusNumbers.ToHumanInt( range )
-
+
def DebugPrint( debug_info ):
Print( debug_info )
@@ -230,6 +121,7 @@ def DebugPrint( debug_info ):
sys.stdout.flush()
sys.stderr.flush()
+
def DedupeList( xs: typing.Iterable ):
if isinstance( xs, set ):
@@ -255,6 +147,7 @@ def DedupeList( xs: typing.Iterable ):
return xs_return
+
def GenerateKey():
return os.urandom( HC.HYDRUS_KEY_LENGTH )
@@ -363,225 +256,6 @@ def GetNonDupeName( original_name, disallowed_names ):
return non_dupe_name
-def GetSiblingProcessPorts( db_path, instance ):
-
- path = os.path.join( db_path, instance + '_running' )
-
- if os.path.exists( path ):
-
- with open( path, 'r', encoding = 'utf-8' ) as f:
-
- file_text = f.read()
-
- try:
-
- ( pid, create_time ) = HydrusText.DeserialiseNewlinedTexts( file_text )
-
- pid = int( pid )
-
- except ValueError:
-
- return None
-
-
- if not HydrusPSUtil.PSUTIL_OK:
-
- raise HydrusExceptions.CancelledException( 'psutil is not available--cannot determine sibling process ports!' )
-
-
- try:
-
- if HydrusPSUtil.psutil.pid_exists( pid ):
-
- ports = []
-
- p = HydrusPSUtil.psutil.Process( pid )
-
- for conn in p.connections():
-
- if conn.status == 'LISTEN':
-
- ports.append( int( conn.laddr[1] ) )
-
-
-
- return ports
-
-
- except HydrusPSUtil.psutil.Error:
-
- return None
-
-
-
-
- return None
-
-
-def GetSubprocessEnv():
-
- if HG.subprocess_report_mode:
-
- env = os.environ.copy()
-
- ShowText( 'Your unmodified env is: {}'.format( env ) )
-
-
- env = os.environ.copy()
-
- if HydrusBoot.ORIGINAL_PATH is not None:
-
- env[ 'PATH' ] = HydrusBoot.ORIGINAL_PATH
-
-
- if HC.RUNNING_FROM_FROZEN_BUILD:
-
- # let's make a proper env for subprocess that doesn't have pyinstaller woo woo in it
-
- changes_made = False
-
- orig_swaperoo_strings = [ 'LD_LIBRARY_PATH', 'XDG_DATA_DIRS' ]
- ok_to_remove_absent_orig = [ 'LD_LIBRARY_PATH' ]
-
- for key in orig_swaperoo_strings:
-
- orig_key = '{}_ORIG'.format( key )
-
- if orig_key in env:
-
- env[ key ] = env[ orig_key ]
-
- changes_made = True
-
- elif key in env and key in ok_to_remove_absent_orig:
-
- del env[ key ]
-
- changes_made = True
-
-
-
- remove_if_hydrus_base_dir = [ 'QT_PLUGIN_PATH', 'QML2_IMPORT_PATH', 'SSL_CERT_FILE' ]
- hydrus_base_dir = HG.controller.GetDBDir()
-
- for key in remove_if_hydrus_base_dir:
-
- if key in env and env[ key ].startswith( hydrus_base_dir ):
-
- del env[ key ]
-
- changes_made = True
-
-
-
- if ( HC.PLATFORM_LINUX or HC.PLATFORM_MACOS ):
-
- if 'PATH' in env:
-
- # fix for pyinstaller, which drops this stuff for some reason and hence breaks ffmpeg
-
- path = env[ 'PATH' ]
-
- path_locations = set( path.split( ':' ) )
- desired_path_locations = [ '/usr/bin', '/usr/local/bin' ]
-
- for desired_path_location in desired_path_locations:
-
- if desired_path_location not in path_locations:
-
- path = desired_path_location + ':' + path
-
- env[ 'PATH' ] = path
-
- changes_made = True
-
-
-
-
- if 'XDG_DATA_DIRS' in env:
-
- xdg_data_dirs = env[ 'XDG_DATA_DIRS' ]
-
- # pyinstaller can just replace this nice usually long str with multiple paths with base_dir/share
- # absent the _orig above to rescue this, we'll populate with basic
- if ':' not in xdg_data_dirs and HC.BASE_DIR in xdg_data_dirs:
-
- xdg_data_dirs = '/usr/local/share:/usr/share'
-
- changes_made = True
-
-
-
-
- if not changes_made:
-
- env = None
-
-
- else:
-
- env = None
-
-
- return env
-
-def GetSubprocessHideTerminalStartupInfo():
-
- if HC.PLATFORM_WINDOWS:
-
- # This suppresses the terminal window that tends to pop up when calling ffmpeg or whatever
-
- startupinfo = subprocess.STARTUPINFO()
-
- startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
-
- else:
-
- startupinfo = None
-
-
- return startupinfo
-
-def GetSubprocessKWArgs( hide_terminal = True, text = False ):
-
- sbp_kwargs = {}
-
- sbp_kwargs[ 'env' ] = GetSubprocessEnv()
-
- if text:
-
- # probably need to override the stdXXX pipes with i/o encoding wrappers in the case of 3.5 here
-
- if sys.version_info.minor >= 6:
-
- sbp_kwargs[ 'encoding' ] = 'utf-8'
-
-
- if sys.version_info.minor >= 7:
-
- sbp_kwargs[ 'text' ] = True
-
- else:
-
- sbp_kwargs[ 'universal_newlines' ] = True
-
-
-
- if hide_terminal:
-
- sbp_kwargs[ 'startupinfo' ] = GetSubprocessHideTerminalStartupInfo()
-
-
- if HG.subprocess_report_mode:
-
- message = 'KWargs are: {}'.format( sbp_kwargs )
-
- ShowText( message )
-
-
- return sbp_kwargs
-
-
def GetTypeName( obj_type ):
if hasattr( obj_type, '__name__' ):
@@ -593,6 +267,7 @@ def GetTypeName( obj_type ):
return repr( obj_type )
+
def GenerateHumanTextSortKey():
"""Solves the 19, 20, 200, 21, 22 issue when sorting 'Page 21.jpg' type strings.
Breaks the string into groups of text and int (i.e. ( "Page ", 21, ".jpg" ) )."""
@@ -610,126 +285,7 @@ def HumanTextSort( texts ):
texts.sort( key = HumanTextSortKey )
-def IntelligentMassIntersect( sets_to_reduce ):
-
- answer = None
-
- for set_to_reduce in sets_to_reduce:
-
- if len( set_to_reduce ) == 0:
-
- return set()
-
-
- if answer is None:
-
- answer = set( set_to_reduce )
-
- else:
-
- if len( answer ) == 0:
-
- return set()
-
- else:
-
- answer.intersection_update( set_to_reduce )
-
-
-
-
- if answer is None:
-
- return set()
-
- else:
-
- return answer
-
-
-def IsAListLikeCollection( obj ):
-
- # protip: don't do isinstance( possible_list, collections.abc.Collection ) for a 'list' detection--strings pass it (and sometimes with infinite recursion) lol!
- return isinstance( obj, ( tuple, list, set, frozenset ) )
-
-
-def IsAlreadyRunning( db_path, instance ):
-
- if not HydrusPSUtil.PSUTIL_OK:
-
- Print( 'psutil is not available, so cannot do the "already running?" check!' )
-
- return False
-
-
- path = os.path.join( db_path, instance + '_running' )
-
- if os.path.exists( path ):
-
- try:
-
- with open( path, 'r', encoding = 'utf-8' ) as f:
-
- file_text = f.read()
-
- try:
-
- ( pid, create_time ) = HydrusText.DeserialiseNewlinedTexts( file_text )
-
- pid = int( pid )
- create_time = float( create_time )
-
- except ValueError:
-
- return False
-
-
- try:
-
- me = HydrusPSUtil.psutil.Process()
-
- if me.pid == pid and me.create_time() == create_time:
-
- # this is me! there is no conflict, lol!
- # this happens when a linux process restarts with os.execl(), for instance (unlike Windows, it keeps its pid)
-
- return False
-
-
- if HydrusPSUtil.psutil.pid_exists( pid ):
-
- p = HydrusPSUtil.psutil.Process( pid )
-
- if p.create_time() == create_time and p.is_running():
-
- return True
-
-
-
- except HydrusPSUtil.psutil.Error:
-
- return False
-
-
-
- except UnicodeDecodeError:
-
- Print( 'The already-running file was incomprehensible!' )
-
- return False
-
- except Exception as e:
-
- Print( 'Problem loading the already-running file:' )
- PrintException( e )
-
- return False
-
-
-
- return False
-
def IterateHexPrefixes():
hex_chars = '0123456789abcdef'
@@ -765,26 +321,6 @@ def LastShutdownWasBad( db_path, instance ):
return False
-def MassExtend( iterables ):
-
- return [ item for item in itertools.chain.from_iterable( iterables ) ]
-
-
-def MassUnion( iterables ):
-
- return { item for item in itertools.chain.from_iterable( iterables ) }
-
-
-def MedianPop( population ):
-
- # assume it has at least one and comes sorted
-
- median_index = len( population ) // 2
-
- row = population.pop( median_index )
-
- return row
-
def MergeKeyToListDicts( key_to_list_dicts ):
result = collections.defaultdict( list )
@@ -802,51 +338,13 @@ def PartitionIterator( pred: typing.Callable[ [ object ], bool ], stream: typing
return ( itertools.filterfalse( pred, t1 ), filter( pred, t2 ) )
+
def PartitionIteratorIntoLists( pred: typing.Callable[ [ object ], bool ], stream: typing.Iterable[ object ] ):
( a, b ) = PartitionIterator( pred, stream )
return ( list( a ), list( b ) )
-def ParseHashesFromRawHexText( hash_type, hex_hashes_raw ):
-
- hash_type_to_hex_length = {
- 'md5' : 32,
- 'sha1' : 40,
- 'sha256' : 64,
- 'sha512' : 128,
- 'pixel' : 64,
- 'perceptual' : 16
- }
-
- hex_hashes = HydrusText.DeserialiseNewlinedTexts( hex_hashes_raw )
-
- # convert md5:abcd to abcd
- hex_hashes = [ hex_hash.split( ':' )[-1] for hex_hash in hex_hashes ]
-
- hex_hashes = [ HydrusText.HexFilter( hex_hash ) for hex_hash in hex_hashes ]
-
- expected_hex_length = hash_type_to_hex_length[ hash_type ]
-
- bad_hex_hashes = [ hex_hash for hex_hash in hex_hashes if len( hex_hash ) != expected_hex_length ]
-
- if len( bad_hex_hashes ):
-
- m = 'Sorry, {} hashes should have {} hex characters! These did not:'.format( hash_type, expected_hex_length )
- m += '\n' * 2
- m += '\n'.join( ( '{} ({} characters)'.format( bad_hex_hash, len( bad_hex_hash ) ) for bad_hex_hash in bad_hex_hashes ) )
-
- raise Exception( m )
-
-
- hex_hashes = [ hex_hash for hex_hash in hex_hashes if len( hex_hash ) % 2 == 0 ]
-
- hex_hashes = DedupeList( hex_hashes )
-
- hashes = tuple( [ bytes.fromhex( hex_hash ) for hex_hash in hex_hashes ] )
-
- return hashes
-
def Print( text ):
@@ -859,6 +357,7 @@ def Print( text ):
print( repr( text ) )
+
ShowText = Print
def PrintException( e, do_wait = True ):
@@ -931,66 +430,6 @@ def RandomPop( population ):
return row
-def RecordRunningStart( db_path, instance ):
-
- if not HydrusPSUtil.PSUTIL_OK:
-
- return
-
-
- path = os.path.join( db_path, instance + '_running' )
-
- record_string = ''
-
- try:
-
- me = HydrusPSUtil.psutil.Process()
-
- record_string += str( me.pid )
- record_string += '\n'
- record_string += str( me.create_time() )
-
- except HydrusPSUtil.psutil.Error:
-
- return
-
-
- with open( path, 'w', encoding = 'utf-8' ) as f:
-
- f.write( record_string )
-
-
-
-def RestartProcess():
-
- time.sleep( 1 ) # time for ports to unmap
-
- # note argv is unreliable in weird script-launching situations, but there we go
- exe = sys.executable
- me = sys.argv[0]
-
- if HC.RUNNING_FROM_SOURCE:
-
- # exe is python's exe, me is the script
-
- args = [ exe ] + sys.argv
-
- else:
-
- # we are running a frozen release--both exe and me are the built exe
-
- # wrap it in quotes because pyinstaller passes it on as raw text, breaking any path with spaces :/
- if not me.startswith( '"' ):
-
- me = '"{}"'.format( me )
-
-
- args = [ me ] + sys.argv[1:]
-
-
- os.execv( exe, args )
-
-
def SampleSetByGettingFirst( s: set, n ):
# sampling from a big set can be slow, so if we don't care about super random, let's just rip off the front and let __hash__ be our random
@@ -1016,27 +455,7 @@ def SampleSetByGettingFirst( s: set, n ):
return sample
-def SetsIntersect( a, b ):
-
- # not a.isdisjoint( b )
-
- if not isinstance( a, set ):
-
- a = set( a )
-
-
- if not isinstance( b, set ):
-
- b = set( b )
-
-
- if len( a ) > len( b ):
-
- ( a, b ) = ( b, a )
-
-
- return True in ( i in b for i in a )
-
+
def SmoothOutMappingIterator( xs, n ):
# de-spikifies mappings, so if there is ( tag, 20k files ), it breaks that up into manageable chunks
@@ -1052,10 +471,12 @@ def SmoothOutMappingIterator( xs, n ):
+
def SplayListForDB( xs ):
return '(' + ','.join( ( str( x ) for x in xs ) ) + ')'
+
def SplitIteratorIntoChunks( iterator, n ):
chunk = []
@@ -1249,100 +670,3 @@ def SetLabel( self, label: str ):
self._label = label
-
-class JobDatabase( object ):
-
- def __init__( self, job_type, synchronous, action, *args, **kwargs ):
-
- self._type = job_type
- self._synchronous = synchronous
- self._action = action
- self._args = args
- self._kwargs = kwargs
-
- self._result_ready = threading.Event()
-
-
- def __str__( self ):
-
- return 'DB Job: {}'.format( self.ToString() )
-
-
- def _DoDelayedResultRelief( self ):
-
- pass
-
-
- def GetCallableTuple( self ):
-
- return ( self._action, self._args, self._kwargs )
-
-
- def GetResult( self ):
-
- time.sleep( 0.00001 ) # this one neat trick can save hassle on superquick jobs as event.wait can be laggy
-
- while True:
-
- result_was_ready = self._result_ready.wait( 2 )
-
- if result_was_ready:
-
- break
-
-
- if HG.model_shutdown:
-
- raise HydrusExceptions.ShutdownException( 'Application quit before db could serve result!' )
-
-
- self._DoDelayedResultRelief()
-
-
- if isinstance( self._result, Exception ):
-
- e = self._result
-
- raise e
-
- else:
-
- return self._result
-
-
-
- def GetType( self ):
-
- return self._type
-
-
- def IsSynchronous( self ):
-
- return self._synchronous
-
-
- def PutResult( self, result ):
-
- self._result = result
-
- self._result_ready.set()
-
-
- def ToString( self ):
-
- return '{} {}'.format( self._type, self._action )
-
-
-class ServiceUpdate( object ):
-
- def __init__( self, action, row = None ):
-
- self._action = action
- self._row = row
-
-
- def ToTuple( self ):
-
- return ( self._action, self._row )
-
-
diff --git a/hydrus/core/HydrusExceptions.py b/hydrus/core/HydrusExceptions.py
index 07c19ac43..09182d968 100644
--- a/hydrus/core/HydrusExceptions.py
+++ b/hydrus/core/HydrusExceptions.py
@@ -105,6 +105,7 @@ class NotAcceptable( NetworkException ): pass
class NotModifiedException( NetworkException ): pass
class BadRequestException( NetworkException ): pass
class ConflictException( NetworkException ): pass
+class UnprocessableEntity( NetworkException ): pass
class RangeNotSatisfiableException( NetworkException ): pass
class MissingCredentialsException( NetworkException ): pass
class DoesNotSupportCORSException( NetworkException ): pass
diff --git a/hydrus/core/HydrusLists.py b/hydrus/core/HydrusLists.py
index 3845fbe9b..28f797b95 100644
--- a/hydrus/core/HydrusLists.py
+++ b/hydrus/core/HydrusLists.py
@@ -1,5 +1,73 @@
+import itertools
+import typing
+
from hydrus.core import HydrusTime
+def IntelligentMassIntersect( sets_to_reduce: typing.Collection[ set ] ):
+
+ answer = None
+
+ for set_to_reduce in sets_to_reduce:
+
+ if len( set_to_reduce ) == 0:
+
+ return set()
+
+
+ if answer is None:
+
+ answer = set( set_to_reduce )
+
+ else:
+
+ if len( answer ) == 0:
+
+ return set()
+
+ else:
+
+ answer.intersection_update( set_to_reduce )
+
+
+
+
+ if answer is None:
+
+ return set()
+
+ else:
+
+ return answer
+
+
+
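# A minimal usage sketch for the IntelligentMassIntersect helper added above;
# the example sets are illustrative and not taken from the hydrus codebase.

from hydrus.core import HydrusLists

sets_to_reduce = [ { 1, 2, 3, 4 }, { 2, 3, 4, 5 }, { 3, 4, 6 } ]

# walks the inputs once, short-circuiting to set() as soon as any input (or the running intersection) is empty
assert HydrusLists.IntelligentMassIntersect( sets_to_reduce ) == { 3, 4 }
assert HydrusLists.IntelligentMassIntersect( [] ) == set()
assert HydrusLists.IntelligentMassIntersect( [ { 1, 2 }, set() ] ) == set()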
+def IsAListLikeCollection( obj ):
+
+ # protip: don't do isinstance( possible_list, collections.abc.Collection ) for a 'list' detection--strings pass it (and sometimes with infinite recursion) lol!
+ return isinstance( obj, ( tuple, list, set, frozenset ) )
+
+
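# A quick illustration of the pitfall the comment above warns about: a str is
# itself a Collection whose elements are single-character strs, so an abc-based
# 'is it a list?' check accepts plain strings (and can recurse forever if you
# then try to flatten them). This snippet is a sketch, not part of the change.

import collections.abc

from hydrus.core import HydrusLists

assert isinstance( 'not a list', collections.abc.Collection )

assert not HydrusLists.IsAListLikeCollection( 'not a list' )
assert HydrusLists.IsAListLikeCollection( ( 'a', 'b' ) )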
+def MassExtend( iterables ):
+
+ return [ item for item in itertools.chain.from_iterable( iterables ) ]
+
+
+def MassUnion( iterables ):
+
+ return { item for item in itertools.chain.from_iterable( iterables ) }
+
+
+def MedianPop( population ):
+
+ # assume it has at least one and comes sorted
+
+ median_index = len( population ) // 2
+
+ row = population.pop( median_index )
+
+ return row
+
+
def PullNFromIterator( iterator, n ):
chunk = []
@@ -17,6 +85,16 @@ def PullNFromIterator( iterator, n ):
return chunk
+def SetsIntersect( a, b ):
+
+ if not isinstance( a, set ):
+
+ a = set( a )
+
+
+ return not a.isdisjoint( b )
+
+
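# A small sketch of the rewritten SetsIntersect: set.isdisjoint accepts any
# iterable for its argument, so only the left operand needs converting.

from hydrus.core import HydrusLists

assert HydrusLists.SetsIntersect( [ 1, 2, 3 ], ( 3, 4 ) )
assert not HydrusLists.SetsIntersect( { 1, 2 }, { 5, 6 } )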
def SplitIteratorIntoAutothrottledChunks( iterator, starting_n, precise_time_to_stop ):
n = starting_n
diff --git a/hydrus/core/HydrusNumbers.py b/hydrus/core/HydrusNumbers.py
index 302e631d1..af6e83633 100644
--- a/hydrus/core/HydrusNumbers.py
+++ b/hydrus/core/HydrusNumbers.py
@@ -1,16 +1,48 @@
-def ConvertIndexToPrettyOrdinalString( index: int ):
+from hydrus.core import HydrusConstants as HC
+
+def FloatToPercentage( f ):
+
+ percent = f * 100
+
+ if percent == int( percent ):
+
+ return f'{int( percent )}%'
+
+ else:
+
+ return f'{percent:.1f}%'
+
+
+
+def IndexToPrettyOrdinalString( index: int ):
if index >= 0:
- return ConvertIntToPrettyOrdinalString( index + 1 )
+ return IntToPrettyOrdinalString( index + 1 )
else:
- return ConvertIntToPrettyOrdinalString( index )
+ return IntToPrettyOrdinalString( index )
-def ConvertIntToPrettyOrdinalString( num: int ):
+def IntToPixels( i ):
+
+ if i == 1: return 'pixels'
+ elif i == 1000: return 'kilopixels'
+ elif i == 1000000: return 'megapixels'
+ else: return 'megapixels'
+
+
+def IntToUnit( unit ):
+
+ if unit == 1: return 'B'
+ elif unit == 1024: return 'KB'
+ elif unit == 1048576: return 'MB'
+ elif unit == 1073741824: return 'GB'
+
+
+def IntToPrettyOrdinalString( num: int ):
if num == 0:
@@ -62,6 +94,42 @@ def ConvertIntToPrettyOrdinalString( num: int ):
return s
+def PixelsToInt( unit ):
+
+ if unit == 'pixels': return 1
+ elif unit == 'kilopixels': return 1000
+ elif unit == 'megapixels': return 1000000
+
+
+def ResolutionToPrettyString( resolution ):
+
+ if resolution is None:
+
+ return 'no resolution'
+
+
+ if not isinstance( resolution, tuple ):
+
+ try:
+
+ resolution = tuple( resolution )
+
+ except:
+
+ return 'broken resolution'
+
+
+
+ if resolution in HC.NICE_RESOLUTIONS:
+
+ return HC.NICE_RESOLUTIONS[ resolution ]
+
+
+ ( width, height ) = resolution
+
+ return '{}x{}'.format( ToHumanInt( width ), ToHumanInt( height ) )
+
+
def ToHumanInt( num ):
num = int( num )
@@ -73,3 +141,17 @@ def ToHumanInt( num ):
return text
+
+def UnitToInt( unit ):
+
+ if unit == 'B': return 1
+ elif unit == 'KB': return 1024
+ elif unit == 'MB': return 1024 ** 2
+ elif unit == 'GB': return 1024 ** 3
+ elif unit == 'TB': return 1024 ** 4
+
+
+def ValueRangeToPrettyString( value, range ):
+
+ return ToHumanInt( value ) + '/' + ToHumanInt( range )
+
diff --git a/hydrus/core/HydrusPaths.py b/hydrus/core/HydrusPaths.py
index 8dcd9d25d..14abdf029 100644
--- a/hydrus/core/HydrusPaths.py
+++ b/hydrus/core/HydrusPaths.py
@@ -14,6 +14,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusProcess
from hydrus.core import HydrusPSUtil
from hydrus.core import HydrusThreading
from hydrus.core import HydrusTime
@@ -493,7 +494,7 @@ def do_it():
# setsid call un-childs this new process
- sbp_kwargs = HydrusData.GetSubprocessKWArgs()
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs()
preexec_fn = getattr( os, 'setsid', None )
@@ -552,7 +553,7 @@ def do_it( launch_path ):
try:
- sbp_kwargs = HydrusData.GetSubprocessKWArgs( hide_terminal = hide_terminal, text = True )
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( hide_terminal = hide_terminal, text = True )
HydrusData.CheckProgramIsNotShuttingDown()
@@ -975,7 +976,7 @@ def do_it():
raise NotImplementedError( 'Haiku cannot open file locations!' )
- sbp_kwargs = HydrusData.GetSubprocessKWArgs( hide_terminal = False )
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( hide_terminal = False )
HydrusData.CheckProgramIsNotShuttingDown()
diff --git a/hydrus/core/HydrusProcess.py b/hydrus/core/HydrusProcess.py
new file mode 100644
index 000000000..2a7b36fac
--- /dev/null
+++ b/hydrus/core/HydrusProcess.py
@@ -0,0 +1,370 @@
+import os
+import subprocess
+import sys
+import time
+
+from hydrus.core import HydrusBoot
+from hydrus.core import HydrusConstants as HC
+from hydrus.core import HydrusData
+from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusPSUtil
+from hydrus.core import HydrusText
+
+def GetSiblingProcessPorts( db_path, instance ):
+
+ path = os.path.join( db_path, instance + '_running' )
+
+ if os.path.exists( path ):
+
+ with open( path, 'r', encoding = 'utf-8' ) as f:
+
+ file_text = f.read()
+
+ try:
+
+ ( pid, create_time ) = HydrusText.DeserialiseNewlinedTexts( file_text )
+
+ pid = int( pid )
+
+ except ValueError:
+
+ return None
+
+
+ if not HydrusPSUtil.PSUTIL_OK:
+
+ raise HydrusExceptions.CancelledException( 'psutil is not available--cannot determine sibling process ports!' )
+
+
+ try:
+
+ if HydrusPSUtil.psutil.pid_exists( pid ):
+
+ ports = []
+
+ p = HydrusPSUtil.psutil.Process( pid )
+
+ for conn in p.connections():
+
+ if conn.status == 'LISTEN':
+
+ ports.append( int( conn.laddr[1] ) )
+
+
+
+ return ports
+
+
+ except HydrusPSUtil.psutil.Error:
+
+ return None
+
+
+
+
+ return None
+
+
+def GetSubprocessEnv():
+
+ if HG.subprocess_report_mode:
+
+ env = os.environ.copy()
+
+ HydrusData.ShowText( 'Your unmodified env is: {}'.format( env ) )
+
+
+ env = os.environ.copy()
+
+ if HydrusBoot.ORIGINAL_PATH is not None:
+
+ env[ 'PATH' ] = HydrusBoot.ORIGINAL_PATH
+
+
+ if HC.RUNNING_FROM_FROZEN_BUILD:
+
+ # let's make a proper env for subprocess that doesn't have pyinstaller woo woo in it
+
+ changes_made = False
+
+ orig_swaperoo_strings = [ 'LD_LIBRARY_PATH', 'XDG_DATA_DIRS' ]
+ ok_to_remove_absent_orig = [ 'LD_LIBRARY_PATH' ]
+
+ for key in orig_swaperoo_strings:
+
+ orig_key = '{}_ORIG'.format( key )
+
+ if orig_key in env:
+
+ env[ key ] = env[ orig_key ]
+
+ changes_made = True
+
+ elif key in env and key in ok_to_remove_absent_orig:
+
+ del env[ key ]
+
+ changes_made = True
+
+
+
+ remove_if_hydrus_base_dir = [ 'QT_PLUGIN_PATH', 'QML2_IMPORT_PATH', 'SSL_CERT_FILE' ]
+ hydrus_base_dir = HG.controller.GetDBDir()
+
+ for key in remove_if_hydrus_base_dir:
+
+ if key in env and env[ key ].startswith( hydrus_base_dir ):
+
+ del env[ key ]
+
+ changes_made = True
+
+
+
+ if ( HC.PLATFORM_LINUX or HC.PLATFORM_MACOS ):
+
+ if 'PATH' in env:
+
+ # fix for pyinstaller, which drops this stuff for some reason and hence breaks ffmpeg
+
+ path = env[ 'PATH' ]
+
+ path_locations = set( path.split( ':' ) )
+ desired_path_locations = [ '/usr/bin', '/usr/local/bin' ]
+
+ for desired_path_location in desired_path_locations:
+
+ if desired_path_location not in path_locations:
+
+ path = desired_path_location + ':' + path
+
+ env[ 'PATH' ] = path
+
+ changes_made = True
+
+
+
+
+ if 'XDG_DATA_DIRS' in env:
+
+ xdg_data_dirs = env[ 'XDG_DATA_DIRS' ]
+
+ # pyinstaller can just replace this nice usually long str with multiple paths with base_dir/share
+                # absent the _orig above to rescue this, we'll populate it with some basic defaults
+ if ':' not in xdg_data_dirs and HC.BASE_DIR in xdg_data_dirs:
+
+ xdg_data_dirs = '/usr/local/share:/usr/share'
+
+ changes_made = True
+
+
+
+
+ if not changes_made:
+
+ env = None
+
+
+ else:
+
+ env = None
+
+
+ return env
+
+
+def GetSubprocessHideTerminalStartupInfo():
+
+ if HC.PLATFORM_WINDOWS:
+
+ # This suppresses the terminal window that tends to pop up when calling ffmpeg or whatever
+
+ startupinfo = subprocess.STARTUPINFO()
+
+ startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
+
+ else:
+
+ startupinfo = None
+
+
+ return startupinfo
+
+
+def GetSubprocessKWArgs( hide_terminal = True, text = False ):
+
+ sbp_kwargs = {}
+
+ sbp_kwargs[ 'env' ] = GetSubprocessEnv()
+
+ if text:
+
+ # probably need to override the stdXXX pipes with i/o encoding wrappers in the case of 3.5 here
+
+ if sys.version_info.minor >= 6:
+
+ sbp_kwargs[ 'encoding' ] = 'utf-8'
+
+
+ if sys.version_info.minor >= 7:
+
+ sbp_kwargs[ 'text' ] = True
+
+ else:
+
+ sbp_kwargs[ 'universal_newlines' ] = True
+
+
+
+ if hide_terminal:
+
+ sbp_kwargs[ 'startupinfo' ] = GetSubprocessHideTerminalStartupInfo()
+
+
+ if HG.subprocess_report_mode:
+
+ message = 'KWargs are: {}'.format( sbp_kwargs )
+
+ HydrusData.ShowText( message )
+
+
+ return sbp_kwargs
+
+
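# A sketch of how callers later in this diff consume GetSubprocessKWArgs: the
# env, startupinfo, and text/encoding handling all arrive via the kwargs dict.
# The ffmpeg command here is illustrative only.

import subprocess

from hydrus.core import HydrusProcess

cmd = [ 'ffmpeg', '-version' ]

sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( text = True )

process = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )

( stdout, stderr ) = process.communicate()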
+def IsAlreadyRunning( db_path, instance ):
+
+ if not HydrusPSUtil.PSUTIL_OK:
+
+ HydrusData.Print( 'psutil is not available, so cannot do the "already running?" check!' )
+
+ return False
+
+
+ path = os.path.join( db_path, instance + '_running' )
+
+ if os.path.exists( path ):
+
+ try:
+
+ with open( path, 'r', encoding = 'utf-8' ) as f:
+
+ file_text = f.read()
+
+ try:
+
+ ( pid, create_time ) = HydrusText.DeserialiseNewlinedTexts( file_text )
+
+ pid = int( pid )
+ create_time = float( create_time )
+
+ except ValueError:
+
+ return False
+
+
+ try:
+
+ me = HydrusPSUtil.psutil.Process()
+
+ if me.pid == pid and me.create_time() == create_time:
+
+ # this is me! there is no conflict, lol!
+ # this happens when a linux process restarts with os.execl(), for instance (unlike Windows, it keeps its pid)
+
+ return False
+
+
+ if HydrusPSUtil.psutil.pid_exists( pid ):
+
+ p = HydrusPSUtil.psutil.Process( pid )
+
+ if p.create_time() == create_time and p.is_running():
+
+ return True
+
+
+
+ except HydrusPSUtil.psutil.Error:
+
+ return False
+
+
+
+ except UnicodeDecodeError:
+
+ HydrusData.Print( 'The already-running file was incomprehensible!' )
+
+ return False
+
+ except Exception as e:
+
+ HydrusData.Print( 'Problem loading the already-running file:' )
+ HydrusData.PrintException( e )
+
+ return False
+
+
+
+ return False
+
+
+def RecordRunningStart( db_path, instance ):
+
+ if not HydrusPSUtil.PSUTIL_OK:
+
+ return
+
+
+ path = os.path.join( db_path, instance + '_running' )
+
+ record_string = ''
+
+ try:
+
+ me = HydrusPSUtil.psutil.Process()
+
+ record_string += str( me.pid )
+ record_string += '\n'
+ record_string += str( me.create_time() )
+
+ except HydrusPSUtil.psutil.Error:
+
+ return
+
+
+ with open( path, 'w', encoding = 'utf-8' ) as f:
+
+ f.write( record_string )
+
+
+
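# The '<instance>_running' record written above is just two newline-separated
# values, the pid and the psutil create_time; IsAlreadyRunning only reports a
# conflict if a process with that pid still exists and has the same create_time,
# so stale records left by a crashed instance are ignored. The values below are
# made up for illustration.

example_record = '12345\n1719140000.12'

( pid, create_time ) = example_record.split( '\n' )

pid = int( pid )
create_time = float( create_time )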
+def RestartProcess():
+
+ time.sleep( 1 ) # time for ports to unmap
+
+ # note argv is unreliable in weird script-launching situations, but there we go
+ exe = sys.executable
+ me = sys.argv[0]
+
+ if HC.RUNNING_FROM_SOURCE:
+
+ # exe is python's exe, me is the script
+
+ args = [ exe ] + sys.argv
+
+ else:
+
+ # we are running a frozen release--both exe and me are the built exe
+
+ # wrap it in quotes because pyinstaller passes it on as raw text, breaking any path with spaces :/
+ if not me.startswith( '"' ):
+
+ me = '"{}"'.format( me )
+
+
+ args = [ me ] + sys.argv[1:]
+
+
+ os.execv( exe, args )
+
diff --git a/hydrus/core/HydrusTags.py b/hydrus/core/HydrusTags.py
index 313d5d585..2d35cbfa5 100644
--- a/hydrus/core/HydrusTags.py
+++ b/hydrus/core/HydrusTags.py
@@ -248,6 +248,27 @@ def ConvertTagSliceToPrettyString( tag_slice ):
+def ConvertUglyNamespaceToPrettyString( namespace ):
+
+ if namespace is None or namespace == '':
+
+ return 'no namespace'
+
+ else:
+
+ return namespace
+
+
+
+def ConvertUglyNamespacesToPrettyStrings( namespaces ):
+
+ namespaces = sorted( namespaces )
+
+ result = [ ConvertUglyNamespaceToPrettyString( namespace ) for namespace in namespaces ]
+
+ return result
+
+
ALL_UNNAMESPACED_TAG_SLICE = ''
ALL_NAMESPACED_TAG_SLICE = ':'
diff --git a/hydrus/core/files/HydrusFileHandling.py b/hydrus/core/files/HydrusFileHandling.py
index ed2874c22..340448e6b 100644
--- a/hydrus/core/files/HydrusFileHandling.py
+++ b/hydrus/core/files/HydrusFileHandling.py
@@ -6,6 +6,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusNumbers
from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTemp
@@ -351,7 +352,7 @@ def GenerateThumbnailNumPy( path, target_resolution, mime, duration, num_frames,
except Exception as e:
- message = 'Problem generating thumbnail for "{}" at frame {} ({})--FFMPEG could not render it.'.format( path, desired_thumb_frame_index, HydrusData.ConvertFloatToPercentage( percentage_in / 100.0 ) )
+ message = 'Problem generating thumbnail for "{}" at frame {} ({})--FFMPEG could not render it.'.format( path, desired_thumb_frame_index, HydrusNumbers.FloatToPercentage( percentage_in / 100.0 ) )
PrintMoreThumbErrorInfo( e, message, extra_description = extra_description )
diff --git a/hydrus/core/files/HydrusFlashHandling.py b/hydrus/core/files/HydrusFlashHandling.py
index 78d0c1388..0b4a4327d 100644
--- a/hydrus/core/files/HydrusFlashHandling.py
+++ b/hydrus/core/files/HydrusFlashHandling.py
@@ -4,6 +4,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
+from hydrus.core import HydrusProcess
from hydrus.core import HydrusThreading
from hydrus.core import HydrusTime
@@ -55,7 +56,7 @@ def RenderPageToFile( path, temp_path, page_index ):
timeout = HydrusTime.GetNow() + 60
- sbp_kwargs = HydrusData.GetSubprocessKWArgs()
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs()
sbp_kwargs[ 'stdout' ] = subprocess.DEVNULL
sbp_kwargs[ 'stderr' ] = subprocess.DEVNULL
diff --git a/hydrus/core/files/HydrusVideoHandling.py b/hydrus/core/files/HydrusVideoHandling.py
index e09bf17ae..6a46e6378 100644
--- a/hydrus/core/files/HydrusVideoHandling.py
+++ b/hydrus/core/files/HydrusVideoHandling.py
@@ -8,6 +8,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusProcess
from hydrus.core import HydrusText
from hydrus.core import HydrusThreading
from hydrus.core.files import HydrusAudioHandling
@@ -53,7 +54,7 @@ def GetFFMPEGVersion():
try:
- sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( text = True )
process = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )
@@ -140,7 +141,7 @@ def GetFFMPEGInfoLines( path, count_frames_manually = False, only_first_second =
- sbp_kwargs = HydrusData.GetSubprocessKWArgs()
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
@@ -479,7 +480,7 @@ def RenderImageToImagePath( path, temp_image_path ):
cmd = [ FFMPEG_PATH, '-y', "-i", path, temp_image_path ]
- sbp_kwargs = HydrusData.GetSubprocessKWArgs()
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
@@ -949,7 +950,7 @@ def VideoHasAudio( path, info_lines ) -> bool:
'-' ] )
- sbp_kwargs = HydrusData.GetSubprocessKWArgs()
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
@@ -1129,7 +1130,7 @@ def initialize( self, start_index = 0 ):
] )
- sbp_kwargs = HydrusData.GetSubprocessKWArgs()
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
diff --git a/hydrus/core/networking/HydrusNATPunch.py b/hydrus/core/networking/HydrusNATPunch.py
index 75ce31cdc..c3ae2aab8 100644
--- a/hydrus/core/networking/HydrusNATPunch.py
+++ b/hydrus/core/networking/HydrusNATPunch.py
@@ -8,6 +8,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusProcess
from hydrus.core import HydrusText
from hydrus.core import HydrusThreading
from hydrus.core import HydrusTime
@@ -88,7 +89,7 @@ def GetExternalIP():
cmd = [ UPNPC_PATH, '-l' ]
- sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
@@ -150,7 +151,7 @@ def AddUPnPMapping( internal_client, internal_port, external_port, protocol, des
cmd = [ UPNPC_PATH, '-e', description, '-a', internal_client, str( internal_port ), str( external_port ), protocol, str( duration ) ]
- sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
@@ -214,7 +215,7 @@ def GetUPnPMappings():
cmd = [ UPNPC_PATH, '-l' ]
- sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
@@ -328,7 +329,7 @@ def RemoveUPnPMapping( external_port, protocol ):
cmd = [ UPNPC_PATH, '-d', str( external_port ), protocol ]
- sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
+ sbp_kwargs = HydrusProcess.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
diff --git a/hydrus/core/networking/HydrusNetwork.py b/hydrus/core/networking/HydrusNetwork.py
index 1a13a9e37..8ca8a47e7 100644
--- a/hydrus/core/networking/HydrusNetwork.py
+++ b/hydrus/core/networking/HydrusNetwork.py
@@ -2100,7 +2100,7 @@ def GetEarliestTimestampForTheseHashes( self, hashes ):
for ( update_index, ( update_hashes, begin, end ) ) in sorted( self._metadata.items() ):
- if HydrusData.SetsIntersect( hashes, update_hashes ):
+ if HydrusLists.SetsIntersect( hashes, update_hashes ):
return end
diff --git a/hydrus/core/networking/HydrusNetworking.py b/hydrus/core/networking/HydrusNetworking.py
index 3e2fe03da..d20945d7f 100644
--- a/hydrus/core/networking/HydrusNetworking.py
+++ b/hydrus/core/networking/HydrusNetworking.py
@@ -272,7 +272,7 @@ def key( rule_tuple ):
elif bandwidth_type == HC.BANDWIDTH_TYPE_REQUESTS:
- s += HydrusData.ConvertValueRangeToPrettyString( usage, max_allowed ) + ' requests'
+ s += HydrusNumbers.ValueRangeToPrettyString( usage, max_allowed ) + ' requests'
if time_delta is None:
diff --git a/hydrus/core/networking/HydrusServerResources.py b/hydrus/core/networking/HydrusServerResources.py
index 6b90480aa..c447143c2 100644
--- a/hydrus/core/networking/HydrusServerResources.py
+++ b/hydrus/core/networking/HydrusServerResources.py
@@ -858,6 +858,10 @@ def _errbackHandleProcessingError( self, failure, request: HydrusServerRequest.H
status_code = 419
+ elif isinstance( e, HydrusExceptions.UnprocessableEntity ):
+
+ status_code = 422
+
elif isinstance( e, HydrusExceptions.NetworkVersionException ):
status_code = 426
diff --git a/hydrus/hydrus_client_boot.py b/hydrus/hydrus_client_boot.py
index 9a6f8b406..64588c485 100644
--- a/hydrus/hydrus_client_boot.py
+++ b/hydrus/hydrus_client_boot.py
@@ -132,6 +132,10 @@
title = 'Critical boot error occurred! Details written to crash.log in either your db dir, userdir, or desktop!'
+ import traceback
+
+ error_trace = str( e ) + '\n\n' + traceback.format_exc()
+
try:
HydrusData.DebugPrint( title )
@@ -139,13 +143,11 @@
except:
- pass
+ print( title )
+ print( 'Note for hydev: HydrusData did not import; probably a very early import (Qt?) issue!' )
+ print( error_trace )
- import traceback
-
- error_trace = str( e ) + '\n\n' + traceback.format_exc()
-
if 'db_dir' in locals() and os.path.exists( db_dir ):
emergency_dir = db_dir
@@ -185,7 +187,16 @@
except:
- HydrusData.DebugPrint( 'Could not start up Qt to show the error visually!' )
+ message = 'Could not start up Qt to show the error visually!'
+
+ try:
+
+ HydrusData.Print( message )
+
+ except:
+
+ print( message )
+
sys.exit( 1 )
@@ -246,6 +257,8 @@ def boot():
if HG.restart:
- HydrusData.RestartProcess()
+ from hydrus.core import HydrusProcess
+
+ HydrusProcess.RestartProcess()
diff --git a/hydrus/server/ServerController.py b/hydrus/server/ServerController.py
index 5b5d02314..c541403df 100644
--- a/hydrus/server/ServerController.py
+++ b/hydrus/server/ServerController.py
@@ -13,6 +13,7 @@
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusPaths
+from hydrus.core import HydrusProcess
from hydrus.core import HydrusSessions
from hydrus.core import HydrusThreading
from hydrus.core import HydrusTime
@@ -25,7 +26,7 @@
def ProcessStartingAction( db_dir, action ):
- already_running = HydrusData.IsAlreadyRunning( db_dir, 'server' )
+ already_running = HydrusProcess.IsAlreadyRunning( db_dir, 'server' )
if action == 'start':
@@ -87,7 +88,7 @@ def ShutdownSiblingInstance( db_dir ):
try:
- ports = HydrusData.GetSiblingProcessPorts( db_dir, 'server' )
+ ports = HydrusProcess.GetSiblingProcessPorts( db_dir, 'server' )
except HydrusExceptions.CancelledException as e:
@@ -141,7 +142,7 @@ def ShutdownSiblingInstance( db_dir ):
time_waited = 0
- while HydrusData.IsAlreadyRunning( db_dir, 'server' ):
+ while HydrusProcess.IsAlreadyRunning( db_dir, 'server' ):
time.sleep( 1 )
diff --git a/hydrus/test/TestClientAPI.py b/hydrus/test/TestClientAPI.py
index c84ec6b04..c243ca8b7 100644
--- a/hydrus/test/TestClientAPI.py
+++ b/hydrus/test/TestClientAPI.py
@@ -4584,6 +4584,35 @@ def _test_manage_duplicates( self, connection, set_up_permissions ):
+ # remove potentials
+
+ HG.test_controller.ClearWrites( 'remove_potential_pairs' )
+
+ headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
+
+ path = '/manage_file_relationships/remove_potentials'
+
+ test_hashes = [
+ "b54d09218e0d6efc964b78b070620a1fa19c7e069672b4c6313cee2c9b0623f2",
+ "bbaa9876dab238dcf5799bfd8319ed0bab805e844f45cf0de33f40697b11a845"
+ ]
+
+ request_dict = { 'hashes' : test_hashes }
+
+ request_body = json.dumps( request_dict )
+
+ connection.request( 'POST', path, body = request_body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ self.assertEqual( response.status, 200 )
+
+ [ ( args, kwargs ) ] = HG.test_controller.GetWrite( 'remove_potential_pairs' )
+
+ self.assertEqual( set( args[0] ), { bytes.fromhex( h ) for h in test_hashes } )
+
# set kings
HG.test_controller.ClearWrites( 'duplicate_set_king' )
@@ -4695,6 +4724,185 @@ def _test_manage_pages( self, connection, set_up_permissions ):
self.assertEqual( result, expected_result )
+ def _test_manage_services( self, connection, set_up_permissions ):
+
+        # this stuff is super dependent on the db requests, which aren't tested in this class, but we can still test the arg parsing and the response wrapper
+
+ api_permissions = set_up_permissions[ 'everything' ]
+
+ access_key_hex = api_permissions.GetAccessKey().hex()
+
+ headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
+
+ default_location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY )
+
+ # get pending counts
+
+ db_response = {
+ bytes.fromhex( 'ae91919b0ea95c9e636f877f57a69728403b65098238c1a121e5ebf85df3b87e' ) : {
+ HC.SERVICE_INFO_NUM_PENDING_MAPPINGS : 11564,
+ HC.SERVICE_INFO_NUM_PETITIONED_MAPPINGS : 5,
+ HC.SERVICE_INFO_NUM_PENDING_TAG_SIBLINGS : 2,
+ HC.SERVICE_INFO_NUM_PETITIONED_TAG_SIBLINGS : 0,
+ HC.SERVICE_INFO_NUM_PENDING_TAG_PARENTS : 0,
+ HC.SERVICE_INFO_NUM_PETITIONED_TAG_PARENTS : 0
+ },
+ bytes.fromhex( '3902aabc3c4c89d1b821eaa9c011be3047424fd2f0c086346e84794e08e136b0' ) : {
+ HC.SERVICE_INFO_NUM_PENDING_MAPPINGS : 0,
+ HC.SERVICE_INFO_NUM_PETITIONED_MAPPINGS : 0,
+ HC.SERVICE_INFO_NUM_PENDING_TAG_SIBLINGS : 0,
+ HC.SERVICE_INFO_NUM_PETITIONED_TAG_SIBLINGS : 0,
+ HC.SERVICE_INFO_NUM_PENDING_TAG_PARENTS : 0,
+ HC.SERVICE_INFO_NUM_PETITIONED_TAG_PARENTS : 0
+ },
+ bytes.fromhex( 'e06e1ae35e692d9fe2b83cde1510a11ecf495f51910d580681cd60e6f21fde73' ) : {
+ HC.SERVICE_INFO_NUM_PENDING_FILES : 2,
+ HC.SERVICE_INFO_NUM_PETITIONED_FILES : 0
+ }
+ }
+
+ expected_response = {
+ "pending_counts" : {
+ "ae91919b0ea95c9e636f877f57a69728403b65098238c1a121e5ebf85df3b87e" : {
+ "pending_tag_mappings" : 11564,
+ "petitioned_tag_mappings" : 5,
+ "pending_tag_siblings" : 2,
+ "petitioned_tag_siblings" : 0,
+ "pending_tag_parents" : 0,
+ "petitioned_tag_parents" : 0
+ },
+ "3902aabc3c4c89d1b821eaa9c011be3047424fd2f0c086346e84794e08e136b0" : {
+ "pending_tag_mappings" : 0,
+ "petitioned_tag_mappings" : 0,
+ "pending_tag_siblings" : 0,
+ "petitioned_tag_siblings" : 0,
+ "pending_tag_parents" : 0,
+ "petitioned_tag_parents" : 0
+ },
+ "e06e1ae35e692d9fe2b83cde1510a11ecf495f51910d580681cd60e6f21fde73" : {
+ "pending_files" : 2,
+ "petitioned_files" : 0
+ }
+ }
+ }
+
+ HG.test_controller.SetRead( 'nums_pending', db_response )
+
+ path = '/manage_services/get_pending_counts'
+
+ connection.request( 'GET', path, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ text = str( data, 'utf-8' )
+
+ self.assertEqual( response.status, 200 )
+
+ d = json.loads( text )
+
+ self.assertEqual( d[ 'pending_counts' ], expected_response[ 'pending_counts' ] )
+
+ #
+
+ is_already_running = False
+ is_big_problem = False
+ magic_results_dict = {}
+
+ def IsCurrentlyUploadingPending( *args ):
+
+ magic_results_dict[ 'a' ] = True
+
+ return is_already_running
+
+
+ CG.client_controller.IsCurrentlyUploadingPending = IsCurrentlyUploadingPending
+
+ def UploadPending( *args ):
+
+ magic_results_dict[ 'b' ] = True
+
+ return not is_big_problem
+
+
+ CG.client_controller.UploadPending = UploadPending
+
+ headers = { 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ], 'Hydrus-Client-API-Access-Key' : access_key_hex }
+
+ path = '/manage_services/commit_pending'
+
+ body_dict = { 'service_key' : HG.test_controller.example_file_repo_service_key_1.hex() }
+
+ body = json.dumps( body_dict )
+
+ connection.request( 'POST', path, body = body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ self.assertEqual( response.status, 200 )
+
+ self.assertEqual( magic_results_dict, { 'a' : True, 'b' : True } )
+
+ #
+
+ is_already_running = True
+ is_big_problem = False
+ magic_results_dict = {}
+
+ connection.request( 'POST', path, body = body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ self.assertEqual( response.status, 409 )
+
+ self.assertEqual( magic_results_dict, { 'a' : True } )
+
+ #
+
+ is_already_running = False
+ is_big_problem = True
+ magic_results_dict = {}
+
+ connection.request( 'POST', path, body = body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ self.assertEqual( response.status, 422 )
+
+ self.assertEqual( magic_results_dict, { 'a' : True, 'b' : True } )
+
+ #
+
+ HG.test_controller.ClearWrites( 'delete_pending' )
+
+ headers = { 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ], 'Hydrus-Client-API-Access-Key' : access_key_hex }
+
+ path = '/manage_services/forget_pending'
+
+ body_dict = { 'service_key' : HG.test_controller.example_file_repo_service_key_1.hex() }
+
+ body = json.dumps( body_dict )
+
+ connection.request( 'POST', path, body = body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ self.assertEqual( response.status, 200 )
+
+ [ ( ( service_key, ), kwargs ) ] = HG.test_controller.GetWrite( 'delete_pending' )
+
+ self.assertEqual( service_key, HG.test_controller.example_file_repo_service_key_1 )
+
+
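# A hedged sketch of how an external Client API consumer might drive the
# commit_pending call exercised above; the port, access key, service key, and
# use of the requests library are illustrative assumptions, not part of this
# diff. A 409 means an upload is already running for the service; the new 422
# means the commit itself hit a problem.

import json

import requests

API = 'http://127.0.0.1:45869'

headers = {
    'Hydrus-Client-API-Access-Key' : '<your access key hex>',
    'Content-Type' : 'application/json'
}

body = json.dumps( { 'service_key' : '<your service key hex>' } )

response = requests.post( API + '/manage_services/commit_pending', data = body, headers = headers )

if response.status_code == 409:
    print( 'An upload for this service is already running.' )
elif response.status_code == 422:
    print( 'The client could not commit the pending content.' )
else:
    response.raise_for_status()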
def _test_options( self, connection, set_up_permissions ):
# first fail
@@ -6569,6 +6777,7 @@ def test_client_api( self ):
self._test_add_tags_get_tag_siblings_and_parents( connection, set_up_permissions )
self._test_add_urls( connection, set_up_permissions )
self._test_associate_urls( connection, set_up_permissions )
+ self._test_manage_services( connection, set_up_permissions )
self._test_manage_duplicates( connection, set_up_permissions )
self._test_manage_cookies( connection, set_up_permissions )
self._test_manage_headers( connection, set_up_permissions )
diff --git a/hydrus/test/TestClientDB.py b/hydrus/test/TestClientDB.py
index e20d3b194..95b3d37e3 100644
--- a/hydrus/test/TestClientDB.py
+++ b/hydrus/test/TestClientDB.py
@@ -5,6 +5,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusNumbers
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTime
from hydrus.core.files.images import HydrusImageHandling
@@ -507,19 +508,19 @@ def run_or_predicate_tests( tests ):
tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO_DATA, ( (), tuple( perceptual_hashes ), 0 ), 1 ) )
tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO_DATA, ( (), ( os.urandom( 32 ), ), 0 ), 0 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 0, HydrusData.ConvertUnitToInt( 'B' ) ), 0 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 5270, HydrusData.ConvertUnitToInt( 'B' ) ), 0 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 5271, HydrusData.ConvertUnitToInt( 'B' ) ), 1 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 5270, HydrusData.ConvertUnitToInt( 'B' ) ), 1 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 0, HydrusData.ConvertUnitToInt( 'B' ) ), 0 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( HC.UNICODE_APPROX_EQUAL, 5270, HydrusData.ConvertUnitToInt( 'B' ) ), 1 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( HC.UNICODE_APPROX_EQUAL, 0, HydrusData.ConvertUnitToInt( 'B' ) ), 0 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 5270, HydrusData.ConvertUnitToInt( 'B' ) ), 0 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 5269, HydrusData.ConvertUnitToInt( 'B' ) ), 1 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusData.ConvertUnitToInt( 'B' ) ), 1 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusData.ConvertUnitToInt( 'KB' ) ), 1 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusData.ConvertUnitToInt( 'MB' ) ), 1 ) )
- tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusData.ConvertUnitToInt( 'GB' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 0, HydrusNumbers.UnitToInt( 'B' ) ), 0 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 5270, HydrusNumbers.UnitToInt( 'B' ) ), 0 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 5271, HydrusNumbers.UnitToInt( 'B' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 5270, HydrusNumbers.UnitToInt( 'B' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 0, HydrusNumbers.UnitToInt( 'B' ) ), 0 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( HC.UNICODE_APPROX_EQUAL, 5270, HydrusNumbers.UnitToInt( 'B' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( HC.UNICODE_APPROX_EQUAL, 0, HydrusNumbers.UnitToInt( 'B' ) ), 0 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 5270, HydrusNumbers.UnitToInt( 'B' ) ), 0 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 5269, HydrusNumbers.UnitToInt( 'B' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusNumbers.UnitToInt( 'B' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusNumbers.UnitToInt( 'KB' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusNumbers.UnitToInt( 'MB' ) ), 1 ) )
+ tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '>', 0, HydrusNumbers.UnitToInt( 'GB' ) ), 1 ) )
tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_WIDTH, ClientSearch.NumberTest.STATICCreateFromCharacters( '<', 201 ), 1 ) )
tests.append( ( ClientSearch.PREDICATE_TYPE_SYSTEM_WIDTH, ClientSearch.NumberTest.STATICCreateFromCharacters( '<', 200 ), 0 ) )
diff --git a/hydrus/test/TestClientDBDuplicates.py b/hydrus/test/TestClientDBDuplicates.py
index d237709e6..f910d995b 100644
--- a/hydrus/test/TestClientDBDuplicates.py
+++ b/hydrus/test/TestClientDBDuplicates.py
@@ -5,7 +5,7 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
-from hydrus.core import HydrusTime
+from hydrus.core import HydrusNumbers
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
@@ -1062,10 +1062,10 @@ def test_duplicates( self ):
self._false_positive_king_hash = self._similar_looking_false_positive_hashes[0]
self._alternate_king_hash = self._similar_looking_alternate_hashes[0]
- self._our_main_dupe_group_hashes = set( [ self._king_hash ] )
- self._our_second_dupe_group_hashes = set( [ self._second_group_king_hash ] )
- self._our_alt_dupe_group_hashes = set( [ self._alternate_king_hash ] )
- self._our_fp_dupe_group_hashes = set( [ self._false_positive_king_hash ] )
+ self._our_main_dupe_group_hashes = { self._king_hash }
+ self._our_second_dupe_group_hashes = { self._second_group_king_hash }
+ self._our_alt_dupe_group_hashes = { self._alternate_king_hash }
+ self._our_fp_dupe_group_hashes = { self._false_positive_king_hash }
n = len( self._all_hashes )
@@ -1074,7 +1074,7 @@ def test_duplicates( self ):
# initial number pair combinations is (n(n-1))/2
self._expected_num_potentials = int( n * ( n - 1 ) / 2 )
- size_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 65535, HydrusData.ConvertUnitToInt( 'B' ) ) )
+ size_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 65535, HydrusNumbers.UnitToInt( 'B' ) ) )
png_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, ( HC.IMAGE_PNG, ) )
location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
diff --git a/hydrus/test/TestController.py b/hydrus/test/TestController.py
index cd8ebc395..6c17dba90 100644
--- a/hydrus/test/TestController.py
+++ b/hydrus/test/TestController.py
@@ -397,9 +397,6 @@ def qt_code( win: QW.QWidget, job_status: ClientThreading.JobStatus ):
job_status.SetErrorException( e )
- HydrusData.Print( 'CallBlockingToQt just caught this error:' )
- HydrusData.DebugPrint( traceback.format_exc() )
-
finally:
job_status.Finish()
diff --git a/hydrus/test/TestHydrusData.py b/hydrus/test/TestHydrusData.py
index 5df8e6595..1fac11f30 100644
--- a/hydrus/test/TestHydrusData.py
+++ b/hydrus/test/TestHydrusData.py
@@ -6,16 +6,16 @@ class TestHydrusData( unittest.TestCase ):
def test_ordinals( self ):
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 1 ), '1st' )
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 2 ), '2nd' )
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 3 ), '3rd' )
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 4 ), '4th' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 1 ), '1st' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 2 ), '2nd' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 3 ), '3rd' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 4 ), '4th' )
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 11 ), '11th' )
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 12 ), '12th' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 11 ), '11th' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 12 ), '12th' )
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 213 ), '213th' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 213 ), '213th' )
- self.assertEqual( HydrusNumbers.ConvertIntToPrettyOrdinalString( 1011 ), '1,011th' )
+ self.assertEqual( HydrusNumbers.IntToPrettyOrdinalString( 1011 ), '1,011th' )
diff --git a/static/default/parsers/shimmie file page parser.png b/static/default/parsers/shimmie file page parser.png
index a6ad2b199..2404b033f 100644
Binary files a/static/default/parsers/shimmie file page parser.png and b/static/default/parsers/shimmie file page parser.png differ