I have a simple backup server that rsyncs several network shares and then compresses them into a zip file that is loaded onto tapes.
zip -r /media/1tb/backup/compressed/reports.zip /media/1tb/backup/nightly/reports/
About halfway through, I get:

adding: media/1tb/backup/nightly/reports/active/aa010aq/rpts.2009.12.15/aa010aq.dat
zip I/O error: File too large
zip error: Output file write failure (write error on zip file)
This is on a standard Ubuntu server.
How large is the file at the point of the error?
The problem is most likely that the filesystem on the drive you are backing up to does not support files that large. FAT16 is limited to 2 GB per file and FAT32 to 4 GB. FAT32 is the most common filesystem found on external drives unless you have explicitly reformatted with something else.
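You can confirm this before reformatting anything by checking which filesystem the target volume actually uses. A quick check with `df -T` (run it against your real mount point rather than `.`):

```shell
# Print the filesystem type of the volume holding the current directory;
# substitute your backup target, e.g. /media/1tb/backup/compressed.
# "vfat" or "msdos" here would explain a 2-4 GB per-file ceiling.
df -T . | awk 'NR==2 {print $2}'
```

If that prints `ext3`, `ext4`, `xfs`, or `ntfs`, the filesystem is not the culprit and the zip version answer below the fold is worth checking instead.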
If this is the case, you will need to either reformat the drive with a filesystem that supports large files (ext3 or NTFS, for example) or change your procedure so that it does not produce a single archive that big.
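If reformatting is not an option, zip itself can keep every output file under the limit by writing a split archive with `-s` (this needs Info-ZIP zip 3.0 or later). A sketch on a throwaway directory; substitute your real reports and destination paths:

```shell
# Check zip is installed first (split archives need zip 3.0+).
command -v zip >/dev/null || { echo "install the zip package first"; exit 0; }

# Demo on a throwaway directory; in the real job, run this against
# /media/1tb/backup/nightly/reports and your real destination instead.
demo=$(mktemp -d)
mkdir -p "$demo/reports"
echo "sample report" > "$demo/reports/rpts.txt"

# -s 2g caps each output piece at 2 GB, so no single file can hit
# a FAT32-style per-file limit. Pieces are named reports.z01,
# reports.z02, ..., with reports.zip as the final piece.
( cd "$demo" && zip -q -r -s 2g reports.zip reports )
ls "$demo"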
Which version of zip are you using? You need at least 3.0 for Zip64 support, which is required to create archives larger than 4 GB.
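Running `zip -v` with no archive name prints the version banner, so you can check what you have; look for "Zip 3.0" or later:

```shell
# Info-ZIP's zip prints its version banner when -v is given alone;
# zip 3.0 is the first release with Zip64 (large-file) support.
zip -v 2>/dev/null | head -n 3
```

On Ubuntu, upgrading is just a matter of installing a newer `zip` package if your release still ships 2.x.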