Hydrus GUI slowdown/unresponsive (partially hanged?) when running a subscription download #1612
Comments
Thank you for this report. Please run a 'profile' over the duration of a run of this subscription, as under help->debug->profiling, and then have a skim of the log file created. I don't think it will have private info, so it would be safe to pastebin it here, but if there is anything private then please DM or email it to me. I will check out what is running slow and see what I can do. The serialisable named stuff is usually related to larger chunky objects, which includes subscriptions. I've noticed that some of our larger subs are taking a bit of time to load, sometimes, but if you are getting 60-second hangs, that suggests a full-on traffic jam.
I enabled the profiling and immediately ran a subscription download of one of the larger tag collections (62k). If you look at the time stamps you'll see that it took over 5 minutes to complete. A second download of a smaller tag collection also ran right after it, which completed much faster, though still slower than when subscriptions download from other sites. (For some reason GitHub isn't allowing me to attach the file. I will DM you on Discord with the file.)
Thank you very much! I see a big problem and will work on it.
I was able to significantly improve subscription load times last week in v595, which I believe is what was hitting you. If you still have slowdown, I'd love to see some new profiles, but if things are good now, I will close this issue.
I just installed 596 and ran the same subscription. It was much faster, though it still takes some time: somewhere shy of 1 minute in total, down from the over 5 minutes it took previously. It still slows down the GUI on the initial run of the subscription, where the information box pops up containing the text "Subscription - name of subscription" and db read locked appears, but once it loads and actually starts to go through the steps, it runs much faster. I tried to compare large subscriptions with similar file counts on different boorus, and that initial slowdown appears to be similar across similarly sized subscriptions, even on different boorus.

I'm now wondering if the slowdown has to do with the "Item" count in the subscription? Is Hydrus trying to load ALL those "items" (I'm assuming saved URL/image information), even though the subscription will only check the last 50-100 image URLs? I'm going to guess the easiest solution on my end would be to delete the current huge-item-count subscriptions and restart them so they have a smaller item count, though I have hundreds of such subscriptions (cough 2k+ cough) so it would be time consuming. Couldn't Hydrus prune older "item" information after X amount of time, or X count, or at least ignore the "item" information after X number, so it's not constantly trying to load it?
Yeah, each sub-query in a subscription will typically regularly cull itself to the most recent, I think, 200 items. The exact number can fluctuate based on some timing stuff, but a mature query will generally keep itself fairly slender. And, every query is loaded up and saved back separately in turn, so it typically shouldn't matter too much if you have one sub of five hundred queries vs ten subs of fifty. Now it may be that 200 URLs is still taking some time on your machine, or that it is hammering subs as quickly as it can and still causing a bit of a traffic jam, but I'm happy we've improved things a bit. You might like to try reducing the max number of subs that can run under options->downloading to 1 and seeing if that frees you up a bunch. If not, please run some more profiles and we'll see what the next bottleneck is!
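To make the culling idea above concrete, here is a minimal sketch in Python (class names, field names, and the exact cap are invented for illustration; this is not Hydrus's actual code) of a query object that trims itself to its most recent entries, so the serialised object stays small each time it is loaded and saved back:

```python
from dataclasses import dataclass, field

MAX_RECENT_ITEMS = 200  # hypothetical cap; the real number fluctuates with timing

@dataclass
class QueryLogEntry:
    url: str
    timestamp: float  # when this URL was last seen

@dataclass
class SubscriptionQuery:
    entries: list[QueryLogEntry] = field(default_factory=list)

    def add(self, entry: QueryLogEntry) -> None:
        self.entries.append(entry)

    def cull(self) -> None:
        # Keep only the newest MAX_RECENT_ITEMS entries so loading and saving
        # this query back to the db stays cheap as the query matures.
        if len(self.entries) > MAX_RECENT_ITEMS:
            self.entries.sort(key=lambda e: e.timestamp)
            del self.entries[: len(self.entries) - MAX_RECENT_ITEMS]
```

(The point is only that a mature query keeps a bounded amount of URL history, so loading it should not grow without limit.)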
I'm gonna run some tests in the next day or two, as Hydrus doesn't appear to be only using the last 200 items in its query list. I'm in the middle of switching over my main NAS from ESXi to Proxmox, and the image directories that are set up in Hydrus are on the NAS.
Hydrus version
592
Qt major version
Qt 6
Operating system
Windows 11
Install method
Extract
Install and OS comments
The Hydrus main program and DB are installed on a dedicated NVMe drive, separate from the OS.
Bug description and reproduction
Issue: When Hydrus runs a subscription download for beta.sankaku, the GUI becomes very sluggish and unresponsive. Menus open with a long delay, and clicking on an image is very delayed until the image is actually highlighted and then shown in the preview window. In the worst state, the Hydrus GUI appears locked up/frozen, very often for more than 1 minute at a time. CPU usage on the core that Hydrus is currently using also goes to 100% during this period of unresponsiveness. Hydrus, however, is not actually frozen or locked up. The bottom right corner of the GUI, which gives current information on database operations, shows db read locked. Hovering the mouse over that information then shows current db job: read serialisable_named.
Normally, on other subscription downloads, the db read locked indicator flashes in the corner a bunch of times very quickly while the subscription download goes through its steps, so the sluggish GUI is not noticeable. But during the beta.sankaku subscription download, Hydrus appears (this is all just what I think is happening in the background) to stop and read the database in search of all entries with the same tag it is currently going to download. The more entries in the database with the tag, the longer it takes to read (hence the GUI getting sluggish/frozen), and Hydrus does this multiple times in a single download.
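To illustrate what I mean, here is a rough, generic sketch (my own illustration, not Hydrus's actual code; every name in it is invented) of how one slow read holding a shared database lock can make an otherwise-instant GUI db request wait, which is what the freeze feels like:

```python
import threading
import time

db_lock = threading.Lock()

def slow_subscription_read() -> None:
    # Stand-in for a long read job (e.g. something like 'read serialisable_named')
    # holding the shared db lock for several seconds.
    with db_lock:
        time.sleep(5)

def gui_db_request() -> None:
    # A tiny read that would normally return instantly, issued from the GUI side.
    start = time.monotonic()
    with db_lock:
        pass
    print(f"GUI db request waited {time.monotonic() - start:.1f}s for the lock")

worker = threading.Thread(target=slow_subscription_read)
worker.start()
time.sleep(0.1)  # let the worker grab the lock first
gui_db_request()
worker.join()
```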
This issue started happening somewhere around the early 57x versions (possibly the 56x versions). It did not occur before then.
Log output
No response