Will the other process be able to finish reading the old file even though it's been replaced?
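For what it's worth, a small sketch of the usual Unix behaviour, assuming the replacement is done by renaming a new file over the old name: a reader that already has the file open keeps its descriptor on the old inode, so it can finish reading the old contents. The file names below are placeholders.
printf 'old contents\n' > old.txt
printf 'new contents\n' > new.txt
# The reader opens old.txt and holds the descriptor open while it waits
( exec 3< old.txt; sleep 5; cat <&3 ) &
# Replace the file by name while the reader still holds the old inode open
mv new.txt old.txt
wait   # prints "old contents": the open descriptor still refers to the old file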
In Scientific Linux 5.5, /etc/resolv.conf is continually overwritten, so DNS is broken. If I delete or change it, it instantly reverts to its former state. Writing over it with:
cp /etc/NEWresolve.conf /etc/resolv.conf && chattr +i /etc/resolv.conf
just results in an immutable copy of the original resolv.conf with no changes. I'm running as root (not sudo), and Avahi and NetworkManager are not running.
Any ideas? There's no DHCP anywhere on this machine, and even if there were, I can't imagine it'd overwrite the file so quickly.
Thanks
Some ancillary info:
uname -a: Linux localhost.localdomain 2.6.18-238.12.1.el5 #1 SMP Tue May 31 13:12:32 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux
NIC: Intel I340 (82580)
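One hedged way to find the culprit, assuming auditd is installed and running (it usually is on SL 5.x), is to put an audit watch on the file and see which executable keeps rewriting it; the key name resolv-watch is arbitrary.
# Log any write or attribute change to /etc/resolv.conf under an arbitrary key
auditctl -w /etc/resolv.conf -p wa -k resolv-watch
# After the file reverts again, see which process/executable touched it
ausearch -k resolv-watch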
I have set up a job within MSQL Studio to back up all of my databases to a specific file. I then take that file, compress it, and send it to a backup device, and I'm hoping to automate this entire process on a weekly basis. However, here is my problem: the job I have created in MSQL Studio currently runs a full backup of the databases, but it does not overwrite the old data. I have been through the wizard time and time again but can't figure out how to make the process overwrite the old files. The purpose of doing this is to save space on the server.
Can anyone help me figure out how to make the backup job overwrite old files?
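For what it's worth, a hedged sketch of the overwrite behaviour itself, assuming the job ultimately runs a BACKUP DATABASE statement: WITH INIT overwrites the backup sets already in the destination file instead of appending to them (in the SSMS backup dialog this appears to correspond to the "Overwrite all existing backup sets" option). MyDatabase and the path below are placeholders.
# WITH INIT overwrites the backup sets already in the .bak file rather than appending
sqlcmd -S localhost -E -Q "BACKUP DATABASE [MyDatabase] TO DISK = N'D:\Backups\MyDatabase.bak' WITH INIT"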
I'm trying to back up a file via the command
scp /tmp/backup.tar.gz hostname:/home/user/backup.tar.gz
When I run it, the scp progress bar shows up and it looks like it's transferring the file. However, when I log into the destination server to check, the timestamp and file size haven't changed from the older version, so it looks like scp didn't overwrite the old file at all. It only seems to work when I manually delete the file from the destination server.
I'm running Ubuntu, and this is happening with two servers: one running Cygwin's sshd and one running Fedora Core 3.
Anyone have any idea why this is happening? I thought scp would simply overwrite existing files.
Thanks
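A workaround sketch, assuming nothing on the destination is remapping or caching the path: upload under a temporary name and rename it over the old file, so the existing copy is replaced in one step rather than rewritten in place. The .new suffix is arbitrary.
# Upload under a temporary name, then rename it over the existing file
scp /tmp/backup.tar.gz hostname:/home/user/backup.tar.gz.new
ssh hostname 'mv /home/user/backup.tar.gz.new /home/user/backup.tar.gz'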
How do I continuously monitor a file that is being continuously overwritten? Specifically, the file contains a single line of text which a background process is repeatedly opening and rewriting.
I was hoping tail -F would work, but it doesn't.
Update: I was hoping this wouldn't be pertinent, but I'm running Mac OS X 10.5.8.
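A hedged fallback, since tail is built around files that grow by appending and may show nothing when a single line is rewritten in place: poll the file in a loop. The path /path/to/status.txt and the one-second interval are placeholders.
# Redraw the file's current contents once a second (path and interval are placeholders)
while true; do
  clear
  cat /path/to/status.txt
  sleep 1
done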