Sort and show duplicates by path (file-tree) #1584
Comments
Well, I at least tried to remove it and added a few tags like deduplicator, ui...
Can you show me a few screenshots of what you are looking for? The Deduplicator tool in SD Maid already shows results in groups.
@d4rken I know, but those are only groups of duplicate files. As I said, I'd like to be able to switch mode so I can see all of them in a form where I can browse every folder that contains duplicate files. When I have around 150 duplicates from one folder, I could either exclude the whole folder without doing it manually and re-running the scan for it to take effect, or I could select a whole folder and delete all duplicates that are 1. older, 2. newer, 3. inside the folder while leaving one copy, or 4. outside the folder while leaving one copy.
Thanks, I understand your idea better now. 👍 Would anyone else find this very useful?
Is your feature request related to a problem? Please describe.
I need duplicates to show up as a directory tree, something like QDirStat, where I can choose to see all, multiple, or only one group of the same files.
Describe the solution you'd like
To have a directory tree view available.
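A minimal Kotlin sketch of what the requested view could mean under the hood: re-indexing flat duplicate groups by parent directory so a folder tree can list the duplicates each folder contains. `DuplicateGroup` and `groupByDirectory` are illustrative names, not SD Maid's actual model or API.

```kotlin
import java.io.File

// Illustrative model: one group of files with identical content (hypothetical type).
data class DuplicateGroup(val files: List<File>)

// Re-index flat duplicate groups into a per-directory map so a UI could render
// a folder tree where each folder lists the duplicate files it contains.
fun groupByDirectory(groups: List<DuplicateGroup>): Map<File, List<File>> =
    groups
        .flatMap { it.files }                        // all duplicate files, ungrouped
        .groupBy { it.parentFile ?: File("/") }      // bucket them by containing folder
        .toSortedMap(compareBy<File> { it.path })    // sort folders by path for display
```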
Describe alternatives you've considered
Trying to find an app that does this on Linux, because I work with ext4 too.
Describe why this would be in the interest of all users
Easier cleanup when you have hundreds of duplicates. You can just select a whole folder and check a box to leave at least one copy, or you can exclude the whole folder or multiple folders (make exceptions for them).
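A rough Kotlin sketch of the "select a whole folder but keep at least one copy" idea, assuming a simple last-modified-based choice; `KeepPolicy` and `deletableInFolder` are hypothetical names and not part of SD Maid's codebase.

```kotlin
import java.io.File

// Hypothetical policy for which copy to keep when all copies live inside the chosen folder.
enum class KeepPolicy { KEEP_OLDEST, KEEP_NEWEST }

// For one duplicate group, pick the copies under [folder] that could be deleted
// while always leaving at least one copy of the file somewhere.
fun deletableInFolder(group: List<File>, folder: File, policy: KeepPolicy): List<File> {
    // Naive "is inside folder" check, good enough for illustration.
    val inFolder = group.filter { it.path.startsWith(folder.path + File.separator) }
    val outside = group - inFolder.toSet()
    if (inFolder.isEmpty()) return emptyList()

    // If other copies exist outside the folder, everything inside can go.
    if (outside.isNotEmpty()) return inFolder

    // Otherwise keep exactly one copy inside, chosen by the policy.
    val keep = when (policy) {
        KeepPolicy.KEEP_OLDEST -> inFolder.minByOrNull { it.lastModified() }
        KeepPolicy.KEEP_NEWEST -> inFolder.maxByOrNull { it.lastModified() }
    }
    return inFolder.filter { it != keep }
}
```

The two policy values loosely correspond to the "older" / "newer" options mentioned in the comments above; a real implementation would offer the full set of choices.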
Additional context
I don't know what the triage tag means, so I removed it just to be sure.