Data Storage Software

Recoverable File Archiving with Free Software?

Viqsi asks: "Back in my Win32 days, I was a very frequent user of RAR archives. I've had them get hit by partial hardware failures and still be recoverable, so I've always liked them, but they're completely non-Free, and the mini-RMS in my brain tells me this could be a problem for long-term archival. The closest free equivalent I can find is .tar.bz2, and while bzip2 has some recovery ability, tar is (as far as I have ever been able to tell) incapable of recovering anything past the damaged point, which is unacceptable for my purposes. I've recently had to pick up a copy of RAR for Linux to dig into one of those old archives, so this question's come back up for me again, and I still haven't found anything. Does anyone know of a file archive type that can recover from this kind of damage?"
This discussion has been archived. No new comments can be posted.
  • where have you been? (Score:3, Informative)

    by Anonymous Coward on Wednesday February 25, 2004 @12:51AM (#8382938)
    ever heard of parity archives?
  • by jason.stover ( 602933 ) on Wednesday February 25, 2004 @01:07AM (#8383045)
    Here's the parchive sourceforge site [sourceforge.net] .. Links to PAR2 utils, spec, etc...
  • Try apio (Score:5, Informative)

    by innosent ( 618233 ) <jmdorityNO@SPAMgmail.com> on Wednesday February 25, 2004 @01:18AM (#8383125)
    There used to be a cpio-like archiver called apio that was designed for exactly those situations. Of course, that might not be much help on non-Unix systems (unless you plan on running it under Cygwin), but I remember having great success with it on the old QIC tapes, which were in my experience the worst backup medium for important data ever (better to have no backup than to think you have a good one and then find a dead tape).
  • Par2 works great (Score:5, Informative)

    by dozer ( 30790 ) on Wednesday February 25, 2004 @01:22AM (#8383158)
    Store the recovery information outside the archive. Par2 [sf.net] works really well. You can configure how much redundancy you want (2% should be fine for occasional bit errors, 30% if you burn it to a CD that might get mangled, etc.). It's a work in progress, but it's already really useful.
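
    A minimal sketch of that workflow with the par2cmdline tool (the create/verify/repair commands and the -r redundancy switch are assumed from its usage text; check your version's help first):

    par2 create -r30 backup.par2 backup.tar.bz2   # generate recovery data with roughly 30% redundancy
    par2 verify backup.par2                       # check the files against the recovery set
    par2 repair backup.par2                       # rebuild damaged blocks if verification fails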
  • cpio (Score:5, Informative)

    by Kevin Burtch ( 13372 ) on Wednesday February 25, 2004 @01:42AM (#8383286)

    True, tar cannot handle a single error... all files past that error are lost.

    On the other hand, cpio (and clones) can handle missing/damaged data without losing the undamaged portions that follow (you only lose the archived file that contains the damage). It is the only common/free format I can think of (off the top of my head) that is capable of this.
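
    For reference, a typical cpio round trip looks something like this (standard -o/-i/-t usage; exact flags vary slightly between implementations):

    find ./project -print | cpio -ov > project.cpio   # write an archive from the file list on stdin
    cpio -itv < project.cpio                          # list the archive's contents
    cpio -idv < project.cpio                          # extract, creating directories as needed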
  • by wiswaud ( 22478 ) <<ac.dww> <ta> <jse>> on Wednesday February 25, 2004 @01:49AM (#8383316) Homepage
    If you make a big tar, then bzip2 it, then store the file on a CD,
    and two years later you want the data back,
    and there's a read error at some point within the .tar.bz2, you get some garbage data.
    bunzip2 will actually be able to recover all the other 900 kB chunks of the original tar file, except for the missing chunk or part of it.
    Tar will just choke at that point, and you lose everything past the read error. bunzip2 was able to recover the data past the error, but tar can't use it.
    It's quite frustrating.
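
    The recovery ability mentioned above is exposed through the bzip2recover utility that ships with bzip2; a sketch of its use (output file names are approximate):

    bzip2recover damaged.tar.bz2                  # writes one rec*.bz2 file per intact 900 kB block
    bzip2 -dc rec*damaged.tar.bz2 > partial.tar   # decompress and concatenate the surviving blocks
    tar -xvf partial.tar                          # stock tar still stops at the gap left by the bad block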
  • Re:Try apio (Score:5, Informative)

    by innosent ( 618233 ) <jmdorityNO@SPAMgmail.com> on Wednesday February 25, 2004 @01:55AM (#8383347)
    Sorry, I believe it was afio [freshmeat.net]
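
    If afio is the tool in question, basic usage is cpio-like; a sketch (the -Z switch compresses each file individually, so a bad spot should only cost that one member):

    find ./project -print | afio -o -Z project.afio   # archive, compressing each file separately
    afio -t -Z project.afio                           # list the archive's contents
    afio -i -Z project.afio                           # extract what is readable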
  • by Kris_J ( 10111 ) * on Wednesday February 25, 2004 @02:31AM (#8383508) Homepage Journal
    RAR is free for decompression [rarsoft.com], with source available and heaps of precompiled binaries for your OS of choice, and it's included in a whole heap of popular free archive programs. Just burn the latest source onto every CD you make and you should be fine.
  • by wotevah ( 620758 ) on Wednesday February 25, 2004 @03:35AM (#8383776) Journal
    A quick google search [google.com] turns up the link shown at the end of this post, from which I quote:

    The gzip Recovery Toolkit

    The gzip Recovery Toolkit has a program - gzrecover - that attempts to skip over bad data in a gzip archive and a patch to GNU tar that enables that program to skip over bad data and extract whatever files might be there. This saved me from exactly the above situation. Hopefully it will help you as well.
    [...]
    Here's an example:

    $ ls *.gz
    my-corrupted-backup.tar.gz
    $ gzrecover my-corrupted-backup.tar.gz
    $ ls *.recovered
    my-corrupted-backup.tar.recovered
    $ tar --recover -xvf my-corrupted-backup.tar.recovered > /tmp/tar.log 2>&1 &
    $ tail -f /tmp/tar.log

    http://www.urbanophile.com/arenn/hacking/gzrt/gzrt.html
  • Re:Par2 works great (Score:5, Informative)

    by Stubtify ( 610318 ) on Wednesday February 25, 2004 @03:50AM (#8383823)
    Allow me to second this. Par2 is everything the first PAR files were and more. No matter what has gone wrong, I've always been able to recover with a 10% parity set (even that seems like a lot of overkill, except on USENET). Interestingly enough, PAR files have revolutionized USENET; I can't remember the last time I needed a fill.

    good overview here: PAR2 files [slyck.com]

    comparison between v1 and 2: here [quickpar.org.uk]

  • Re:cpio (Score:2, Informative)

    by Anonymous Coward on Wednesday February 25, 2004 @04:18AM (#8383912)
    On the other hand, cpio (and clones) can handle missing/damaged data without losing the undamaged portions that follow (you only lose the archived file that contains the damage). It is the only common/free format I can think of (from the top of my head) that is capable of this.

    ZIP also supports this (the command is "zip -F" with Info-ZIP, the standard zip/unzip program on Linux).
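
    A sketch of that fix workflow with Info-ZIP (the -F and more aggressive -FF modes are taken from zip's documentation; as with cpio, each member is compressed independently, so only the damaged file should be lost):

    zip -F damaged.zip     # rebuild the archive structure from whatever is still readable
    zip -FF damaged.zip    # more aggressive scan for entries when -F is not enough
    unzip damaged.zip      # extract the surviving members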
  • by AdamPiotrZochowski ( 736869 ) <apz@nofate.com> on Wednesday February 25, 2004 @12:27PM (#8387068) Homepage

    rar has some of the best recovery methods, as it offers multiple of them.

    during compression:
    Recovery Record (-rr option)

    It has a Recovery Record: data appended to the actual
    rar file that lets you recover from errors within a file. The
    default RR takes 1% of the archive and lets you recover 0.6%. You
    can change this behaviour to get more recoverability by
    specifying -rr[N]p with a larger percentage.

    Recovery Volume (-rv option)

    Furthermore, rar supports PAR-like volumes called REV
    that can recover entire missing files. For all you're concerned, REV is
    PAR, except it's integrated into the RAR utility. All you type is unrar *.rar
    and rar will recover files for you, either through RR or REV. No need
    to muck around with twenty different utilities just to make sure the files are intact.

    Non-Solid Archiving (-s- option)

    Furthermore, rar supports non-solid archiving, meaning each file is
    saved with its own compression statistics. You will lose some space with
    this method, but you gain speed (you don't need to decompress the
    first 20 files to get access to the 21st file), and you gain
    partial recoverability (if file 20 is corrupt, you can still decompress
    file 21).

    during decompression:
    Keep Broken Files (-kb option)

    By default, like most archiving software, rar will not save a file
    that is known to be corrupt, unless you explicitly force it to do
    so.

    I highly recommend checking out the command-line manual for RAR.

    Eugene Roshal is GOD
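
    Putting the parent's switches together, a session might look like this (switch spellings are taken from the post and RAR's manual; confirm with "rar -?" since they vary between versions):

    rar a -rr5p -s- backup.rar ./photos   # archive with a 5% recovery record, non-solid
    rar rv backup.rar                     # create .rev recovery volumes alongside the archive
    rar r backup.rar                      # attempt a repair using the recovery record
    unrar x -kb backup.rar                # extract, keeping broken files instead of discarding them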
  • by Anonymous Coward on Wednesday February 25, 2004 @12:44PM (#8387328)
    You definitely shouldn't use RAR for archival purposes, but for extracting existing archives, try unrarlib [unrarlib.org]. It includes a library for accessing the contents of RAR files, and an "unrar" utility based on this library. It is dual-licensed under the GPL and a more restrictive license.
  • Re:cpio (Score:3, Informative)

    by Detritus ( 11846 ) on Wednesday February 25, 2004 @03:19PM (#8389529) Homepage
    I know that I've recovered data from damaged tar archives in the past. I just ran some tests with intentionally damaged tar files, using GNU tar from FreeBSD 5.2.1. GNU tar successfully recovered the data from all of the damaged tar files. It just skips over the damaged bits and resynchronizes at the next valid file header.
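
    One way to reproduce that sort of test (a sketch; the dd offsets are arbitrary and simply clobber a region in the middle of the archive):

    tar -cf test.tar somedir
    dd if=/dev/urandom of=test.tar bs=512 seek=20 count=4 conv=notrunc   # overwrite 2 KB mid-archive
    tar -xvf test.tar   # GNU tar reports errors, then resynchronizes at the next valid file header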
  • by Helen O'Boyle ( 324127 ) on Thursday February 26, 2004 @02:43AM (#8395009) Journal
    { The poster is looking for alternatives to tar, because he has concerns about tarball content recovery. }

    It's been possible to do that for well over a decade, using various utilities such as tarx. I've successfully recovered files after a damaged point in a tarball many times. (Sigh, I used to use an old AT&T UNIX with a #$*@# broken tar, which occasionally created corrupt tarballs).

    See this post [sunmanagers.org] on the Sun Managers list circa 1993, and the venerable comp.sources.unix collection, volume 24, for the sources.
