I often end up with duplicate files that have the same content but the same or different names while doing my literature study. How can I find and delete duplicate files from the terminal? How can I move them to the trash (and recover them later if necessary), and how can I delete them permanently?
- Does this help? How to find ONLY duplicate files that have different names? I posted two answers there: one with a flexible script that offers many filtering options, the other with a relatively simple Bash one-liner that just finds all duplicates by content. – Byte Commander Oct 14 '20 at 09:08
- The one-liner seems easy to remember and can be used across all my systems where I cannot have root permissions. – Srinivasarao Bukkuru Oct 14 '20 at 11:20
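A content-based one-liner of the kind mentioned in the comments can be sketched with standard GNU coreutils (no root needed). The `~/Documents` path here is just an example; only files with identical checksums are printed, grouped and separated by blank lines:

```shell
# Hash every regular file, sort by checksum, and print only groups of
# files whose checksums repeat (i.e. duplicates by content).
# -w32 compares only the first 32 characters (the MD5 hash).
find ~/Documents -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate
```

Nothing is deleted by this command; it only reports, so it is safe to run first and decide what to remove afterwards.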
3 Answers
You can use rdfind or fdupes, both installable via apt.
See 4 Useful Tools to Find and Delete Duplicate Files in Linux for more details.
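A cautious way to use these tools is to report first and delete second. The sketch below (the `~/Documents` path is an example) uses rdfind's dry-run mode before letting it remove anything:

```shell
# 1. Dry run: rdfind only reports what it would do (see results.txt).
rdfind -dryrun true ~/Documents

# 2. Actually delete duplicates, keeping the first occurrence found.
rdfind -deleteduplicates true ~/Documents

# fdupes alternative: -r recurses, -d prompts you interactively
# for which copy of each duplicate set to keep.
fdupes -rd ~/Documents
```

Note that both tools delete permanently; see the fdupes answer below for sending duplicates to the trash instead.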

Lorenz Keel
- 8,905

cknoll
- 103
- 5
FDUPES
You can install fdupes from the terminal (Ctrl+Alt+T):
sudo apt install fdupes
For example, if you want to find all duplicate documents in your Home folder, you can type:
fdupes ~/Documents -r
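The question also asks how to move duplicates to the bin rather than delete them outright. One possible sketch: fdupes' `-f` flag omits the first file of each duplicate set, and `gio trash` (part of GLib on current Ubuntu) sends files to the desktop wastebasket, from which they can be restored later. The `~/Documents` path is an example:

```shell
# Send every duplicate (keeping one copy of each file) to the trash.
fdupes -rf ~/Documents | sed '/^$/d' | while IFS= read -r f; do
    gio trash "$f"
done

# To delete permanently and non-interactively instead:
# fdupes -rdN ~/Documents   # -N keeps the first file of each set, no prompt
```

Trashed files can be recovered from the Files application's wastebasket, or with `gio trash --restore`.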
FSLINT
You can also use fslint, which has a graphical user interface. It is available up to Ubuntu 18.04; in 20.04 it can no longer be installed from the standard repositories because it depends on Python 2, which is deprecated.
sudo apt install fslint
To use fslint, launch it from the applications menu.

pat
- 429