So I want to change several files at once with one command. This is what I have so far:
find -regex ".*\.ext$" | xargs grep -l searchText 2> /dev/null | xargs -i sed 's/searchText/replaceText/' > {}.new
What it does:
I find files with the extension ext, send them to grep (ignoring errors), and then send them to sed for replacement, writing the output to a new file named file.ext.new.
The problem seems to be that the redirect > applies to the whole xargs command (the shell handles it before xargs ever runs), so the {} in {}.new never gets replaced with the filename.
Any ideas, or a better solution?
If you can live with the file being updated in place and a backup created:
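Something along these lines, for example (a sketch using the question's searchText/replaceText placeholders; it assumes a sed that accepts a backup suffix glued onto -i, as GNU sed does):

    # Edit every .ext file in place; sed keeps each original as file.ext.bak
    find . -name '*.ext' -exec sed -i.bak 's/searchText/replaceText/g' {} +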
Edit:
If you absolutely can't have the files that way round - modified file in place and original with an extension - then you could bolt an extra command on the end of the exec to move them around.
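Again only a sketch, with the same placeholders; the second -exec swaps the names around so the original survives as file.ext and the modified copy ends up as file.ext.new:

    # sed edits file.ext in place and leaves the original as file.ext.bak;
    # the sh step then renames modified -> file.ext.new and puts the backup back
    find . -name '*.ext' \
        -exec sed -i.bak 's/searchText/replaceText/g' {} \; \
        -exec sh -c 'mv "$1" "$1.new" && mv "$1.bak" "$1"' _ {} \;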
Backticks, a.k.a. "external commands", to the rescue.
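A sketch of the kind of command meant here (the backticks hand the file list produced by find and grep to a single sed; assumes a sed with -i support):

    # find lists the .ext files, grep narrows them to those containing searchText,
    # and the backticks pass that list to one sed invocation
    sed -i 's/searchText/replaceText/g' `find . -name '*.ext' -exec grep -l searchText {} +`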
Here we have find searching for '*.ext' files, which are then given as the files for grep to search, and those in turn are given to sed, which does the actual replacing. It has the advantage that it doesn't spawn a new grep process for every file that is found. Ditto for sed.
sed has a --in-place option.
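For example (GNU sed; --in-place is the long form of -i and optionally takes a backup suffix such as --in-place=.bak):

    # rewrite every .ext file in place; -print0/-0 keep odd filenames intact
    find . -name '*.ext' -print0 | xargs -0 sed --in-place 's/searchText/replaceText/g'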
Perl rocks for this:
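i.e. a one-liner along these lines (a sketch with the question's placeholders; -p loops over and prints each line, -i.bak edits in place keeping a backup):

    # limit the in-place edit to files that actually contain searchText
    find . -name '*.ext' | xargs grep -l searchText \
        | xargs perl -pi.bak -e 's/searchText/replaceText/g'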
Yeah, Perl rocks for this and here is a whole script for this purpose.
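A minimal sketch of what such a script could look like, written here as a small shell wrapper around the same perl idiom (the script name and arguments are made up for illustration):

    #!/bin/sh
    # replace-in-ext.sh (hypothetical name): replace a literal string in all files
    # with the given extension.
    # usage: ./replace-in-ext.sh ext searchText replaceText
    ext=$1
    search=$2
    replace=$3
    # \Q...\E makes perl treat the search string literally; the replacement
    # must not contain unescaped slashes in this simple version
    find . -name "*.$ext" -type f \
        -exec perl -pi -e "s/\Q$search\E/$replace/g" {} +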
You can use Vim in Ex mode:
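Roughly like this (a sketch assuming ex is Vim's Ex mode; the e flag on the substitution suppresses the error when a file has no match, and |x writes and quits):

    # apply the substitution to every .ext file in the current directory
    for f in *.ext; do
        ex -s -c '%s/searchText/replaceText/ge|x' "$f"
    done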
%   select all lines
s   substitute
g   replace all instances in each line
x   save and close