Best Large Archive Format to Extract Files

So sometimes you just want to extract a few files from a large archive. It might be a bit late by the time you find this out, but tar.gz is probably not the best archive format for huge archives when you need to extract specific files: it can take a long time to find the file you want and pull it out.

Formats like zip (and apparently 7z) store an index of where each file is located in the archive and can extract it quickly compared to tar.gz. In the comments of this article some people mention how long tar.gz can take, e.g. 25 minutes to pull a file out of a 4 GB archive.
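To see what that index buys you, here is a small sketch using Python's zipfile module (the file names and contents are just made up for the demo). Reopening a zip only needs the central directory at the end of the file, so listing members and extracting one of them never touches the rest:

```python
import io
import zipfile

# Build a small zip archive in memory with a bunch of members.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    for i in range(100):
        zf.writestr(f"file{i}.txt", f"contents of file {i}")

# Reopening the archive reads only the central directory at the end,
# so listing members and pulling one out does not decompress the rest.
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()          # comes from the central directory
    data = zf.read("file42.txt")   # seeks straight to that one member

print(data.decode())  # -> contents of file 42
```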

According to this Stack Overflow answer, zip, 7z and dar (in non-solid mode) can also extract files more quickly from a remote source: they keep a list of files and compress in independent segments, so a single file can be extracted without reading the whole archive. tar by itself does no compression and harks back to the days of tape drives (tar is short for "tape archive"), where random access did not really matter because tapes are read sequentially.
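The sequential-scan cost is easy to demonstrate in Python. Below, both archives hold the same members, with the file we want at the very end; tarfile has to walk member headers one by one to find it, while zipfile seeks straight to it via the central directory (archive layout here is purely illustrative):

```python
import io
import tarfile
import zipfile

payload = b"the file we actually want"

# Build a tar and a zip, each with many filler members and a target at the end.
tar_buf, zip_buf = io.BytesIO(), io.BytesIO()
with tarfile.open(fileobj=tar_buf, mode="w") as tf:
    for i in range(1000):
        data = f"filler {i}".encode()
        info = tarfile.TarInfo(f"filler{i}.txt")
        info.size = len(data)
        tf.addfile(info, io.BytesIO(data))
    info = tarfile.TarInfo("target.txt")
    info.size = len(payload)
    tf.addfile(info, io.BytesIO(payload))

with zipfile.ZipFile(zip_buf, "w") as zf:
    for i in range(1000):
        zf.writestr(f"filler{i}.txt", f"filler {i}")
    zf.writestr("target.txt", payload)

tar_buf.seek(0)

# tarfile scans headers sequentially until it hits the target...
with tarfile.open(fileobj=tar_buf) as tf:
    from_tar = tf.extractfile("target.txt").read()

# ...while zipfile jumps directly to it using the central directory.
with zipfile.ZipFile(zip_buf) as zf:
    from_zip = zf.read("target.txt")

assert from_tar == from_zip == payload
```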

An answer here mentions that tar can store *nix extended file attributes. It seems this requires an optional flag:

--xattrs: This option causes tar to store each file's extended attributes in the archive. This option also enables --acls and --selinux if they haven't been set already, due to the fact that the data for those are stored in special xattrs.
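If I understand it correctly, GNU tar's --xattrs writes each attribute as a PAX extended-header record named "SCHILY.xattr.<name>", which Python's tarfile exposes via TarInfo.pax_headers. A sketch, assuming that key naming (the attribute name and value here are invented for the demo):

```python
import io
import tarfile

# Simulate an archive written with `tar --xattrs` by setting the
# "SCHILY.xattr.*" PAX headers that GNU tar would have produced.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w", format=tarfile.PAX_FORMAT) as tf:
    data = b"hello"
    info = tarfile.TarInfo("greeting.txt")
    info.size = len(data)
    info.pax_headers = {"SCHILY.xattr.user.comment": "a demo xattr"}
    tf.addfile(info, io.BytesIO(data))

buf.seek(0)
with tarfile.open(fileobj=buf) as tf:
    member = tf.getmember("greeting.txt")
    # Pull the extended attributes back out of the PAX headers.
    xattrs = {k[len("SCHILY.xattr."):]: v
              for k, v in member.pax_headers.items()
              if k.startswith("SCHILY.xattr.")}

print(xattrs)
```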

But at some point zip also gained support for *nix file attributes.

There is an answer covering unix zip file attributes, and here is a python example.
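The gist of those links, as a minimal sketch: each zip entry has an "external attributes" field, and for entries created on unix (create_system == 3) the high 16 bits hold the st_mode value, so permission bits survive the round trip (the filename and 0o755 mode are just examples):

```python
import io
import stat
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    info = zipfile.ZipInfo("script.sh")
    info.create_system = 3                         # 3 = unix
    info.external_attr = (stat.S_IFREG | 0o755) << 16  # st_mode in high bits
    zf.writestr(info, "#!/bin/sh\necho hi\n")

with zipfile.ZipFile(buf) as zf:
    entry = zf.getinfo("script.sh")
    # Recover the permission bits from the high 16 bits.
    mode = (entry.external_attr >> 16) & 0o777

print(oct(mode))  # -> 0o755
```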

zip can also store MS-DOS file attributes.
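Those live in the same field: the low byte of external_attr carries the MS-DOS attribute bits (0x01 read-only, 0x02 hidden, 0x04 system, 0x20 archive). A sketch marking an entry read-only (the filename is again just for illustration):

```python
import io
import zipfile

DOS_READONLY = 0x01  # MS-DOS read-only attribute bit

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    info = zipfile.ZipInfo("notes.txt")
    info.external_attr = DOS_READONLY   # low byte = DOS attributes
    zf.writestr(info, "do not edit")

with zipfile.ZipFile(buf) as zf:
    entry = zf.getinfo("notes.txt")
    readonly = bool(entry.external_attr & DOS_READONLY)

print(readonly)  # -> True
```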