There are a variety of ways to replace one string of text with another across many files. Here are a few:
using sed and find:
sed 's/oldstring/newstring/' "$1" > "$1.new" && find . -iname "*.new" | sed 's/^\(.*\)\.new$/mv "&" "\1"/' | sh
using grep and sed:
grep -rl oldstring . | xargs sed -i -e 's/oldstring/newstring/'
using grep and perl:
grep -rl oldstring . | xargs perl -pi~ -e 's/oldstring/newstring/'
Please offer your own suggestions.
Using the GNU find, xargs and sed like this:
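The command the answer has in mind presumably looks something like this (the demo directory and file are my addition so the sketch is safe to run anywhere; the -P and -n values are arbitrary):

```shell
# Demo setup so the sketch is self-contained
cd "$(mktemp -d)"
printf 'oldstring here\n' > "file one.txt"

# -print0 / -0 keep filenames with spaces intact; -r skips the run when
# find matches nothing; -P 4 runs four seds in parallel, -n 40 hands each
# invocation up to 40 files; /g replaces every occurrence on a line
find . -type f -print0 |
  xargs -0 -r -P 4 -n 40 sed --in-place 's/oldstring/newstring/g'
```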
Adjust the -P and -n parameters as you like. The /g is needed so that every occurrence in a line gets replaced, not just the first one (g stands for global, if I remember correctly). You can also pass a value to --in-place to make a backup.

I like perl's in-place filtering recipe.
In context...
Here is the trimmed quick-reference on the perl incantation used...
I'd use Python for this. Put all this code into a file called mass_replace and run chmod +x mass_replace:

For a single search and replace of one string in one type of file, the solution with find and sed isn't bad. But if you want to do a lot of processing in one pass, you can edit this program to extend it, and it will be easy (and likely to be correct the first time).
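The script itself isn't reproduced here, so the following is only a sketch of what a mass_replace along these lines might look like; the function name, argument order, and the .txt filter are my assumptions:

```python
#!/usr/bin/env python3
"""Sketch of a mass_replace script; names and structure are assumptions."""
import os
import sys

def mass_replace(root, old, new, suffix=".txt"):
    """Replace old with new in every file under root whose name ends in suffix."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(suffix):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8") as f:
                text = f.read()
            if old in text:  # rewrite only the files that actually change
                with open(path, "w", encoding="utf-8") as f:
                    f.write(text.replace(old, new))

if __name__ == "__main__" and len(sys.argv) == 4:
    # usage: ./mass_replace <dir> <oldstring> <newstring>
    mass_replace(sys.argv[1], sys.argv[2], sys.argv[3])
```

Extending it to several patterns in one pass is then just a matter of editing the function, which is the point made above about doing a lot of processing at once.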
Assuming the list of files isn't a mile long, you don't need to use xargs, as sed can handle multiple files on the command line:
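A sketch of that, with a throwaway demo directory and a .txt glob as my assumptions (GNU sed assumed for -i):

```shell
cd "$(mktemp -d)"                 # demo directory so this is safe to run
printf 'oldstring\n' > a.txt
printf 'oldstring\n' > b.txt

# The shell glob expands to the whole file list; sed edits each file in place
sed -i 's/oldstring/newstring/g' *.txt
```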
Be careful if you are replacing URLs, since they contain the "/" character, which is sed's default delimiter.
An example of how to do it:
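One way to sidestep the escaping is to pick a different s/// delimiter, since sed accepts almost any character there. A sketch with placeholder URLs and a demo file of my own:

```shell
cd "$(mktemp -d)"                                  # demo file with a URL in it
printf 'see http://oldsite.example/page\n' > links.txt

# Using "#" as the delimiter means the "/" in the URLs needs no escaping
sed -i 's#http://oldsite.example#https://newsite.example#g' links.txt
```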
Extracted from: http://www.sysadmit.com/2015/07/linux-reemplazar-texto-en-archivos-con-sed.html
Thanks for some great answers everyone! This was super-helpful.
Since I didn't have hundreds of files to replace lines in, I used a do loop, like this:
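Something along these lines (the file names and pattern are placeholders, and the demo directory is my addition):

```shell
cd "$(mktemp -d)"             # demo files so the loop has something to chew on
printf 'oldstring\n' > a.txt
printf 'oldstring\n' > b.txt

# Loop over the files one at a time instead of reaching for xargs
for f in *.txt; do
  sed -i 's/oldstring/newstring/g' "$f"
done
```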
Hope that helps!