As the title states, I have a few large archives (currently in .7z and .tar.gz format) stored on a remote server which I access via sshfs. I often need to extract one or two files from these archives, and the default archive manager in Ubuntu seems to read or extract the whole archive in the background before I get the file I want.
I'd like a list of archive formats that do not behave this way. In other words, I'd like to be able to extract a single file from a large archive without that delay.
I don't know of anything that meets all of your requirements. However, I might have something which will still work.
If your needs are read-mostly, consider creating a SquashFS image instead of a compressed archive (e.g. tar.gz). SquashFS is a read-only compressed filesystem. You could then access the image file via sshfs and mount it as a loopback device.
Since this is a complete filesystem, directory traversal, data access, etc. will read only the particular blocks required.
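As a minimal sketch, assuming your data lives in a local directory named project/ and the remote share is mounted at ~/remote via sshfs (both names are hypothetical), the workflow would look something like this:

    # Create a read-only compressed SquashFS image from the directory
    mksquashfs project/ project.sqfs

    # Copy the image to the sshfs-mounted remote location
    cp project.sqfs ~/remote/

    # Mount the image as a loopback device through sshfs
    sudo mkdir -p /mnt/project
    sudo mount -o loop,ro -t squashfs ~/remote/project.sqfs /mnt/project

    # Reading a single file only fetches and decompresses the blocks it needs
    cp /mnt/project/docs/report.txt .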
zip, 7zip, and dar (in non-solid mode) have this property. They do so by storing a table of contents and compressing in smaller blocks, so only the blocks containing the files you want to extract need to be decompressed. This does result in slightly lower compression, though.
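For illustration, here is roughly how that looks with 7-Zip and zip on the command line (archive and file names are hypothetical):

    # Create a non-solid 7z archive; -ms=off disables solid mode
    7z a -ms=off project.7z project/

    # Extract a single file; only its compressed blocks are read
    7z x project.7z project/docs/report.txt

    # zip archives are never solid, so single-file extraction is always cheap
    unzip project.zip docs/report.txt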
This separation is a long-standing convention on UNIX and Unix-like platforms, following the UNIX philosophy that each application should do a single task well. As a result, there is no single standard tool that handles both archiving and compression.
The advantage is that you can combine one archiving tool with any compression tool, for example tar with gzip (tar.gz) or tar with lzma (tar.lzma), as shown below.
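A short sketch of that combination principle (the directory name is hypothetical):

    # tar handles archiving; the compressor is chosen independently
    tar -czf project.tar.gz project/          # gzip
    tar --lzma -cf project.tar.lzma project/  # lzma

    # Equivalently, pipe tar's output through any compressor you like
    tar -cf - project/ | xz > project.tar.xz

Note that all of these tar-based combinations compress the archive as one stream, which is exactly why they cannot extract a single file without reading everything before it.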
Proprietary applications such as WinZip may provide what you want, but those are only available for a paid UNIX (OS X).