First of all: Thank you for taking the time to read this and provide some help. Any advice is appreciated.
I am using Nautilus Actions to add a custom context-menu action that creates a copy of a JPG image and reduces its quality (and file size) while preserving the image's dimensions.
Let's say I have a picture with a file size of 2.3 MB (2 261 588 bytes). After opening the file in GIMP 2.8 just to export a copy, and using the JPG export dialogue box to reduce the image quality to 30%, I get a copy with the same dimensions and a file size of 251.8 kB (251 797 bytes).
I am using Nautilus Actions to batch-process a lot of images and reproduce the same behaviour via the context menu: I select a bunch of files in Nautilus, right-click on any of the selected files, and choose the appropriate Nautilus Actions action. The result is a file containing the image with the same dimensions as the original, at lower quality of course; the file size is a bit larger (275.3 kB (275 265 bytes)), but that's not a major problem for me.
The problem appears when I try to batch-process several image files: the first file name is cloned into several file names, each containing a different image, given the properties I set in the commands below.
Basically, each Nautilus Actions action uses these parameters:

- Command: `convert` (from ImageMagick)
- Parameters: `%F.jpg -quality 80%% %F-80q.jpg`, which is interpreted as:

        convert path/to/file1.mid.jpg -quality 80% path/to/file1.mid-80q.jpg

- Working directory: `%d`
So, in the result each image is processed into a different file, but only the first file name is used to name the copies. I want a result like:
File1.jpg === File1-80q.jpg
File2.jpg === File2-80q.jpg
File3.jpg === File3-80q.jpg
File4.jpg === File4-80q.jpg
And so on...
I have used Phatch for tasks like this, but what I want is to just right-click on the selected files and process them in a single action.
Is there a better way to do this? Or can I somehow improve what I am already doing?
Thank you very much for your support.
P.S. I have seen How can I batch convert images to b/w while preserving the folder structure and Is there a way to batch export SVGs to PNGs?, but I am not trying to run this from a terminal, only from a context menu. If I am omitting something in this exercise, don't hesitate to let me know! I'll appreciate it a lot. Thank you!
Edit
After using muru's answer, I find that the solution partially does what I expect. More specifically, it solves the file-name problem. Nevertheless, I also expected to get smaller file sizes by running the proper `convert file.jpg -quality 30%` command, and instead it is producing bigger file sizes.
Does anyone know if there's something else I should add to the parameters shown in muru's answer? I appreciate your efforts to help with this issue.
Update
I am writing this just to let people know that I have found where the problem with the compression was. The file I was trying to compress was already fully compressed, so (as mentioned in muru's answer):

> The process was just adding overhead without gaining anything.
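As an analogy (using gzip rather than JPEG, but the principle is the same), compressing data that is already compressed, or simply incompressible, only adds container overhead; the file names here are examples:

```shell
# Incompressible input stands in for an already fully compressed JPEG.
head -c 100000 /dev/urandom > data
gzip -kf data                    # first pass: data.gz, slightly larger than data
gzip -c data.gz > data.gz.gz     # second pass: larger still, nothing gained
stat -c '%s %n' data data.gz data.gz.gz   # compare byte sizes
```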
You cannot use `%F` twice in a command like that, because `%F` gets replaced by the names of every selected file. For example, a command of `sh` with the parameters `-c 'printf "%%s\n" "$@" > foo' %F %F` will create a file named `foo` containing the names of every selected file, twice. Therefore the `convert` command that actually runs lists every selected file twice, and since the last file is taken to be the output file name, only it will matter.
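The doubled expansion can be reproduced in a terminal by substituting explicit file names for `%F` (the names `a.jpg` and `b.jpg` are hypothetical):

```shell
# Simulate selecting two files: Nautilus Actions replaces each %F with
# "a.jpg b.jpg", so sh receives the whole list twice ($0 is "sh").
sh -c 'printf "%s\n" "$@" > foo' sh a.jpg b.jpg a.jpg b.jpg
cat foo
# a.jpg
# b.jpg
# a.jpg
# b.jpg
```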
What you can do is wrap your command in `bash -c` and run a `for` loop over the selected files (where I am assuming that all files end with `.jpg`). In the action, you will have `bash` as the command, and in the parameters you pass `-c`, the loop, and `%F` to supply the file list. You can make this more complicated to handle any extension, at which point you might as well skip Actions and use scripting.
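A minimal sketch of such a loop (the quality value, the `-80q` suffix, and the file names are examples, not necessarily muru's exact parameters) can be tried from a terminal, with two hypothetical files standing in for `%F` and `echo` printing the commands instead of running them:

```shell
# In the action: Command = bash
# Parameters (sketch): -c 'for f; do convert "$f" -quality 80% "${f%.jpg}-80q.jpg"; done' bash %F
# Dry run; drop `echo` to actually invoke ImageMagick's convert:
bash -c 'for f; do echo convert "$f" -quality 80% "${f%.jpg}-80q.jpg"; done' \
    bash photo1.jpg photo2.jpg
# convert photo1.jpg -quality 80% photo1-80q.jpg
# convert photo2.jpg -quality 80% photo2-80q.jpg
```

The `${f%.jpg}` expansion strips the trailing `.jpg` so the suffix can be appended, which is why the sketch assumes every selected file ends in `.jpg`.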