I'm dumping an svn repository to a giant USB disk that is formatted FAT due to necessity (treat this as unchangeable).
The copy conks out when you try to create a file larger than 4 GB, which is FAT32's per-file limit.
I need a tool that I can pipe data to that will create files of arbitrary size that when catted together will be the original file. I can write a tool to do this, but if one already exists I'd rather use it.
Cheers
EDIT: A second look at the split man page suggests it will do the job.
SVN dumps are one gigantic file, and FAT conks out after 4GB.
split is a Unix tool that writes its input into a series of fixed-size files.
Something like
svnadmin dump $reponame | split -d -b 1073741824 - "$reponame."
will give you $reponame.00, $reponame.01, and so on, each one gigabyte (the -d flag produces numeric suffixes). Hopefully FAT will continue to function with multiple large files. To put them back together again, use cat:
cat $reponame.* | svnadmin load $reponame
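You can sanity-check the round trip before trusting it with a real dump. A minimal sketch, using a small dummy file in place of the svnadmin output (filenames here are made up for illustration):

```shell
# Create a 100 KB test file standing in for the svn dump.
dd if=/dev/urandom of=dump.test bs=1024 count=100 2>/dev/null

# Split it into 40 KB chunks with numeric suffixes: dump.test.00, .01, .02
split -d -b 40960 dump.test "dump.test."

# Reassemble; the glob expands the numeric suffixes in order.
cat dump.test.?? > dump.test.joined

# Byte-for-byte comparison should succeed.
cmp dump.test dump.test.joined && echo "round-trip OK"
```

The same pattern applies at full scale: as long as the chunk suffixes sort lexically (which `-d`'s zero-padded numbers guarantee), `cat` reproduces the original stream exactly.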