
I have 280 thousand photos to delete in a folder, but also some videos to keep. In the folder, I ran the command # rm *.jpg, but I get "Argument list too long". When I narrow the pattern to match only a subset of the photos, it works, for example: # rm 104-*.jpg.

How can I efficiently delete all the JPEG files in a directory without getting the message "Argument list too long"?

# rm -f *.jpg gives the same message.


Opening the folder in Caja uses too much memory and crashes. I am using Ubuntu MATE.

j0h

2 Answers


A typical way to handle the “argument list too long” error is via the find command:

find . -maxdepth 1 -mindepth 1 -type f -name "*.jpg" -delete
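Unlike rm *.jpg, where the shell expands the glob into one enormous argument list before exec(), find matches the pattern itself and deletes internally, so no argument list is built. A sketch you can try safely in a scratch directory (the file names below are made up for illustration):

```shell
# Try it in a throwaway directory first (illustrative file names)
cd "$(mktemp -d)"
touch 104-001.jpg 104-002.jpg holiday.mp4

# find matches *.jpg itself, so no huge argument list is ever built;
# -maxdepth 1 keeps it from descending into subdirectories
find . -maxdepth 1 -mindepth 1 -type f -name "*.jpg" -delete

ls   # only holiday.mp4 remains
```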
Sergiy Kolodyazhnyy
  • This is a slow way to do what I already did, prior to steeldriver's answer. N-*.jpg extends from 0 to 198 or something like that. I thought about just writing a for loop to increment N in rm N-*.jpg, but thought there had to be a single, efficient command for such an operation. – j0h Jun 23 '18 at 10:38
  • @j0h As far as the 104-*.jpg part goes, I apologize for that; I thought that was a pattern for all the images. But this is a single command, which doesn't call anything external. Compared to steeldriver's command, which is two commands also slowed down by a pipe (which implies buffering), I don't see how this single command is slow. – Sergiy Kolodyazhnyy Jun 23 '18 at 15:55
  • Ah, it might not be slow as a single command, just the pattern match. I'm tempted to let the same issue happen again just to test the solutions. – j0h Jun 23 '18 at 22:34
  • @j0h I guess we're on the same topic then :) I'm working on an answer that should generate enough arguments to fall below or above the threshold and trigger the "Argument list too long" error intentionally. I might post a link once I figure this out. – Sergiy Kolodyazhnyy Jun 23 '18 at 22:41
  • I'm thinking that around 10,000 items is the area that generates that error. – j0h Jun 23 '18 at 22:46
  • @j0h Actually, it's not related to the number of items at all. It's the total size in bytes of the names that get passed to the command. – Sergiy Kolodyazhnyy Jun 23 '18 at 22:51
  • @j0h I posted an answer. Also, please read Kusalananda's linked answer as well. Basically, it's difficult to estimate the exact number of files. – Sergiy Kolodyazhnyy Jun 26 '18 at 01:14
  • find: warning: you have specified the -mindepth option after a non-option argument -type, but options are not positional (-mindepth affects tests specified before it as well as those specified after it). Please specify options before other arguments – j0h Jun 30 '18 at 19:32
  • @j0h The -mindepth has to come before -type. I'll edit shortly – Sergiy Kolodyazhnyy Jun 30 '18 at 19:37
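As the comments above note, the limit is on the combined byte size of the argument strings (plus the environment) passed to exec(), not on a fixed file count. On Linux you can inspect the limit with getconf; the exact value is system-dependent:

```shell
# Maximum combined size, in bytes, of argv plus the environment
# that the kernel accepts for a single exec() call
getconf ARG_MAX
```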

You can use xargs:

printf '%s\0' *.jpg | xargs -0 rm --

In bash, the printf command is a built-in and is not subject to the same argument length limitations.
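You can confirm that printf resolves to the bash builtin, rather than the external binary, with the type builtin:

```shell
# type -a lists every form of printf that bash knows about;
# the builtin is listed first, so the pipeline above never exec()s
# /usr/bin/printf and never hits the argument-length limit
type -a printf
```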

steeldriver
  • Alternatively, use ./*.jpg instead of relying on --; it makes the command more portable. – Sergiy Kolodyazhnyy Jun 23 '18 at 00:13
  • This command failed in a later attempt, with: sudo: unable to execute /usr/bin/printf: Argument list too long. I think it may have succeeded previously because I had been actively removing items prior to using the command. Sergiy's answer worked without any prior file removal, so my choice of answer has changed. – j0h Jul 08 '18 at 18:52
  • @j0h This answer works when printf is a bash shell builtin; if you run it with sudo then (as you can see from the error message) sudo invokes the external /usr/bin/printf, which is subject to the same argument-length restrictions as any other command. – steeldriver Jul 08 '18 at 19:54
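Building on that comment: if root privileges are needed, one workaround (a sketch, not part of the original answer) is to run the whole pipeline inside a single root shell, so that printf is once again that shell's builtin:

```shell
# The quoted pipeline runs inside root's bash, where printf is a
# builtin, so its arguments never pass through an exec() call
sudo bash -c 'printf "%s\0" ./*.jpg | xargs -0 rm --'
```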