
TrueDisc Error Correction for Disc Burning?

An anonymous reader asks: "Macintouch has a link to a new piece of software — TrueDisc — which claims to make data burned to recordable discs more reliable. More specifically, it uses interleaved redundant cells to rebuild data should part of the disc be scratched. On the developer's blog they say they plan to create an open-source implementation of the TrueDisc system, now that it is not going to be included in the Blu-ray/HD-DVD standards. Have any of you used this software before, and what alternatives are already available?"
  • Okay, so you store redundant (maybe error-correcting) stuff on top of the existing file system (or in otherwise unused sectors), so you can recover your files if the original sectors succumb to bit rot. For fifty bucks.

    Why not just store the files twice? It would be a whole lot cheaper...
    • Or use par2 (Score:2, Insightful)

      by bruguiea ( 1038034 )
      Or use PAR2 [par2.net]? It's free.
      Tony
      • I agree, but if the data set is large enough to fill a DVD, using PAR/PAR2 can take forever even on "faster" systems. I've tried on my 1.8 GHz iMac G5 and on a 3 GHz Athlon 64. Both systems had at least 1 GB RAM (the Athlon 64 actually had 2 GB). This was about a year ago. It took about 5 hours for the G5 and about 2 for the Athlon 64. I was backing up PDFs (just text, no images), .DOC, .XLS, and small .PPT files onto DVD. I burned 4 copies - two discs with the data, two with the PAR2 data at 80%.

        If TrueDi
        • by maxume ( 22995 )
          I was under the impression that par2 worked faster for smaller data sets. If that is correct, and you had partitioned your data into ~2 gigabyte chunks, made pars for each chunk, and then burned each chunk plus its local pars to a DVD, you would have saved time (and made your recovery process simpler to boot).

          It wouldn't protect you from a 'failed' disc the way your method does, but for anything paranoid I would go with two copies stored separately anyway.
          • If you take a look at the way a lot of files are distributed on Usenet you'll find exactly that. 4 GB of data will be broken up into 40-50 MB blocks and par2 files will be built to support the set. Doing a repair on a highly damaged set of 4-5 GB on my Athlon 64 X2 4600 takes about 5 minutes.

            • by maxume ( 22995 )
              I meant that if you had 4 gigabytes of data, rather than distribute it as 4GB + pars, distribute it as 2GB+pars X 2, with each set of pars only covering half of the data. Files on usenet are chopped up because of limitations in newsreaders and crappy servers, but the pars are generally generated for the whole set of files. My handwavy impression is that the algorithm used gets slower when processing larger amounts of data so that choosing a slightly less convenient breakdown in the files would speed up the
              • I realize that part of the splitting of files on usenet is due to limitations in the distribution system, I was just trying to reinforce the idea that pars tend to work very quickly with large data sets if that data is split into smaller, though not necessarily convenient chunks.
                • Re: (Score:3, Interesting)

                  by dgatwood ( 11270 )

                  I tried to use it to send my parents a copy of their problematic hard drive that I scraped for them, spread across a handful of DVDs. Turns out that at least on Mac OS X, I couldn't find a PAR decoder implementation that worked correctly if the total data size was over 2GB (or maybe 4GB). It was mistakenly using a 32-bit value for some of its internal math instead of a 64-bit one, thinking (incorrectly) that Mac OS X's seek only supported a 32-bit offset. I sent the UnRarX folks a hackish workaround patch to

      • Re: (Score:3, Funny)

        by pegr ( 46683 )
        For large data sets, I rar to a "block" size one third that of my media, then put two data blocks and one par block per disc. Yes, it's a pain to restore, even without damage, but it gives me great recoverability, as I can lose up to a third of my discs and still be able to recover. These data sets are typically 50 to 200 GB, btw...
      • Is PAR2 open-source? It seems like it is, but I certainly wouldn't want to do long-term archives of my data in any format where there were only binary decoders available.

        A while back I had an absolute devil of a time trying to unpack some Compact Pro archives (.sea), and that's not really even that old a format -- it was last released in 1995 -- and there are still a lot of Classic Macs around that will run the software. However, in another 10 years, I'd imagine that it would be a lot tougher, since Macs be
    • From their site: you can get 600 megs to a CD and 4.1 gigs to a DVD, pretty good space saving over having two full copies.
    • Music CDs already include error correction bytes embedded in each frame of data, so I assume this technology does the same sort of thing for data CDs/DVDs/Blu-rays/etc.

      On music CDs, there's one error correction byte for every three bytes of data. That's a lot more space-efficient than just burning your data twice...
      • Re:Sheesh (Score:5, Informative)

        by tenton ( 181778 ) on Wednesday March 07, 2007 @11:21PM (#18271762)
        Music CDs already include error correction bytes embedded in each frame of data, so I assume this technology does the same sort of thing for data CDs/DVDs/Blu-rays/etc.

        On music CDs, there's one error correction byte for every three bytes of data. That's a lot more space-efficient than just burning your data twice...


        Music CDs have piss-poor error correction, by data standards. CD-ROM and DVD-ROM (which includes the video variant, since it's an application of DVD-ROM) have much more robust error correction. There is more error detection (and correction) per block on a CD-ROM (consequently, less room for data) than on a music CD. Music CDs also have the advantage of not needing to be precise; the player can try to guess (interpolate) the missing data it runs into, or, at worst, skip (which may or may not be noticeable). Can't do that with a spreadsheet.

        Burning your data twice also has the advantage of letting you separate the copies (to different physical locations). Error correction technologies aren't going to help if your CDs and DVDs are roasted in a fire; the extra copy you made and put into storage elsewhere will still be safe.
    • Re: (Score:3, Informative)

      by phliar ( 87116 )

      Why not just store the files twice?

      Then your overhead is 100%. They promise an overhead of 14%.

      There are much better error correction schemes than "duplicate the data" -- look up Reed-Solomon.
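      To make the overhead comparison concrete, here's a minimal Python sketch of Reed-Solomon parity at roughly that ratio. It assumes the third-party reedsolo package; nothing in the thread says TrueDisc actually uses this library, it's only meant to illustrate that a ~14% parity budget can repair corruption that would destroy an unprotected copy.

          # Sketch only: Reed-Solomon parity at ~14% overhead via the third-party
          # 'reedsolo' package (pip install reedsolo) -- an assumption for
          # illustration, not TrueDisc's documented implementation.
          from reedsolo import RSCodec

          rsc = RSCodec(36)                      # 36 parity bytes per 255-byte codeword (~14%)
          data = b"payload that should survive a scratch" * 3
          encoded = bytearray(rsc.encode(data))  # original data followed by parity bytes

          encoded[5] ^= 0xFF                     # simulate a few corrupted bytes
          encoded[40] ^= 0xFF
          encoded[41] ^= 0xFF

          # Recent reedsolo versions return (message, full codeword, error positions).
          recovered = rsc.decode(bytes(encoded))[0]
          assert bytes(recovered) == data        # the corrupted bytes were repaired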

    • It really is open source only if you just store the data twice yourself. The TrueDisc recovery part is proprietary, so it's not really open: you will only be able to read the 'master data', and you'll still have to buy their software to recover damaged data from their 'recovery sections'. Is the author of this article part of the OOXML team?
      • My guess is, speed would be the problem. If it's anything like par2, the recovery process takes too long to do in realtime; therefore, only useful if you were allowed to burn a backup copy (and if it was economically feasible to do so).
        • Ignore the subject of my post. I was going to go on about how it either wasn't up to par, was too expensive, or they had seen no need for it, but when it came to typing it, I didn't feel like rambling on. :-)
    • by goombah99 ( 560566 ) on Thursday March 08, 2007 @06:17AM (#18274108)
      So far all the comments I've read are way off the mark about what is interesting about TrueDisc. Yes, it's true that TrueDisc is just yet another error correction scheme. What is slick about it is its high usability. This comes from two things:
      1) It writes the correction bits to a separate partition from the "regular" bits. As a result, the primary partition looks exactly like a regular CD: put it in any computer, even one not equipped with the TrueDisc software, and it can be read normally.

      2) The amount of redundancy is chosen automatically. It just uses any leftover space when it finalizes the CD.

      As a result, the operation of TrueDisc is pretty much transparent. You only need to invoke the TrueDisc software to read a disc that has been corrupted. Uncorrupted discs can be read normally. So you won't lose your data if you don't have the software, or if the company goes out of business and it stops working on newer OSes. (All you would lose without the software is the ability to recover from the redundant bits.)

      In comparison to PAR or RAR, you are not compressing the data, so it's faster. Now, I note that if you compress and then add redundancy, you could potentially have higher redundancy for a given amount of data on a fixed CD size, so there could be some theoretical advantages to RAR and PAR. However, those PAR/RAR discs cannot be read in place (they have to be expanded), nor in "real time" (say, if you are playing video). They are very slow to write. They can't be read on any computer without the same version of par/rar. And if you do lose bits beyond the point of recovery, the compressed bits will span a much greater extent in the data space--you might even lose the entire CD with PAR/RAR. So you can see that TrueDisc has usability advantages even if its redundancy is less and it's uncompressed.
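      As a rough illustration of the "use whatever space is left over" idea, here's a back-of-the-envelope Python sketch. The disc sizes and the simple one-parity-sector-per-group scheme are illustrative assumptions only; TrueDisc's actual on-disc format isn't described in the thread.

          # Back-of-the-envelope sketch: size the redundancy from leftover space.
          # Illustrative assumption only -- not TrueDisc's published layout.
          SECTOR = 2048                    # bytes per Mode 1 data sector
          disc_sectors = 2_295_104         # single-layer DVD (~4.7 GB)
          data_sectors = 1_900_000         # sectors the user actually burned

          parity_sectors = disc_sectors - data_sectors
          group = -(-data_sectors // parity_sectors)   # data sectors covered per parity sector

          print(f"parity budget: {parity_sectors * SECTOR / 1e6:.0f} MB "
                f"({parity_sectors / data_sectors:.1%} redundancy)")
          print(f"each parity sector covers an interleaved group of {group} data sectors")
          # With plain XOR parity, any one lost sector per group would be recoverable;
          # a stronger code (e.g. Reed-Solomon) could tolerate more per group.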

  • Isn't there already an open source program which does this called DVDisaster?
    • Re: (Score:2, Informative)

      by brendan_orr ( 648182 )
      I agree, DVDisaster is quite nice. I have my main backup copies, then a separate disc with the error correction files, with further copies held on a hard drive and eventually tape, other hard drives, and any other medium I can... at least for the really important backups. My Ogg collection I'm not too worried about (as I can always re-rip the song/album should corruption occur).
  • Parchive (Score:2, Informative)

    by Anonymous Coward
    There's already good parity software available. Parchive will create redundant data that can be burned on the same disc or a separate one. You can create up to 100% redundant data, so even if the original disc is lost you can completely restore the files. The software is free and open source. The Windows version is called QuickPar.
    • Maybe I'm just not getting it... if you have a disc, and then enough information on a second disc to completely rebuild the first disc, isn't that the same thing as burning the same data twice? A compressed copy at best maybe. It's very possible I just don't understand correctly though.
      • Actually 100% redundancy is a good bit better than two copies.
        If, instead of losing the entire 1st disk, you scratch the first half of both disks you can still recover everything. If the disks were just identical copies, you would be SOL...
      • Re: (Score:2, Informative)

        by Anonymous Coward
        If you burned two copies of a disc, and both of them went bad on the same file, you're hosed. With the error correction codes par2 uses, any par2 block can stand in for any missing block of the original data, so even if both discs end up scratched, as long as you can read as many par2 blocks off the second disc as you lost off the first, it doesn't matter which blocks they are; you can restore the data.
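        A toy way to see why "it doesn't matter which blocks they are": the Python sketch below uses plain XOR parity as a simplified stand-in for PAR2's actual Reed-Solomon math (XOR only covers one missing block per set, whereas PAR2 covers as many losses as you have recovery blocks).

            # Toy stand-in for PAR2 recovery blocks: one XOR parity block rebuilds
            # whichever single data block is missing, regardless of which one it is.
            from functools import reduce

            def xor(a, b):
                return bytes(x ^ y for x, y in zip(a, b))

            blocks = [b"alpha---", b"bravo---", b"charlie-"]   # equal-sized data blocks
            parity = reduce(xor, blocks)                       # the "recovery block"

            lost = 1                                           # lose any one block...
            survivors = [b for i, b in enumerate(blocks) if i != lost]
            rebuilt = reduce(xor, survivors, parity)           # ...and rebuild it
            assert rebuilt == blocks[lost]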
  • RAR (Score:3, Informative)

    by mastershake_phd ( 1050150 ) on Wednesday March 07, 2007 @08:50PM (#18270192) Homepage
    what alternatives are already available?

    RAR compression has an option for redundancy. You set what % you want to be able to recover if it becomes corrupted.
  • These optical disk formats already use error correction codes in an attempt to recover from small scratches. If you have a big enough error on one part of the disk, chances are lots of the disk will be unreadable. Unless you create multiple disks and spread the redundant data across all of them, this isn't going to add a lot of protection.
    • Re: (Score:3, Interesting)

      by loraksus ( 171574 )
      Your average blank disc out there is pretty poor quality; if anything, this lets you burn to crap discs with at least a chance of reading the data a year or more down the road.
      Par does take a while to generate the recovery files though...
  • Looks like they're using some kind of error correction code that's better than just parity, and distributing the blocks in each ECC group around the disk. It's better than writing the file twice if this (as it sounds like) happens below the file system level.

    It's kind of analogous to a super-RAID, except that the "disks" being redundantly striped and mirrored are all on the same physical DVD or CD-R.
  • by Runefox ( 905204 ) on Wednesday March 07, 2007 @10:12PM (#18271124)
    In Mode 1 CD-ROM, for every 2048 bytes of data, there's 276 bytes of Reed-Solomon error correcting code and 4 bytes of error detection. Considering we're talking bytes, that's pretty reliable, and as you know, a single scratch often doesn't render a CD totally useless. Usually, a CD has to look like an ice skating rink after an hour of skating for it to fail miserably, and light scratches, even in high numbers, are generally not a factor.

    So what the hell? Why is this even necessary, unless you're using a Mode 2 CD (and then, Mode 2 is usually used for videos/streaming data, which requires a more sequential read, where adding ECC would defeat the purpose).

    Waste of money.
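    For reference, the standard Mode 1 numbers work out as follows; this is just the published 2,352-byte raw-sector breakdown worked through in a short Python calculation, nothing specific to TrueDisc.

        # CD-ROM Mode 1 raw sector: 2352 bytes, of which 2048 are user data.
        sync, header, user_data, edc, reserved, ecc = 12, 4, 2048, 4, 8, 276
        raw = sync + header + user_data + edc + reserved + ecc
        assert raw == 2352

        print(f"EDC+ECC relative to user data: {(edc + ecc) / user_data:.1%}")          # ~13.7%
        print(f"share of the raw sector spent on correction: {(edc + ecc) / raw:.1%}")  # ~11.9%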
    • by Sycraft-fu ( 314770 ) on Wednesday March 07, 2007 @11:07PM (#18271648)
      Get some Brasso (brass polish) and a soft, lint free cloth and you are in business. Really. You just polish the surface so it's all even and thus reflects light equally. If you are nervous about using Brasso, there's a number of products designed just for this purpose, though they are way more expensive and Brasso does just as good a job.

      Either way the point is that with error correction as it is now, it's not hard to fix a CD if needed.
      • Re: (Score:1, Informative)

        by Anonymous Coward
        I like Simichrome even more than Brasso - it has a toothpaste-like consistency, so it doesn't make as much of a wet mess, and it's more abrasive than Brasso, while still being a very fine-grained polish that will get you a very smooth surface on a scratched disc.
      • Wouldn't the difference between the refractive index of the brass polish and the CD plastic cause problems? Or is the angle of incidence high enough that refraction doesn't matter?
        • You're not filling in the scratches, you're actually polishing them out to a smooth surface. The polish is rinsed away, so its R.I. is irrelevant.
      • Either way the point is that with error correction as it is now, it's not hard to fix a CD if needed.
        For light scratches, I've found that a minute or so of radial rubbing with my t-shirt is usually enough to make an old disc readable.
      • Re: (Score:3, Informative)

        by dbIII ( 701233 )
        I'll second that - I've used Brasso in the late stages of polishing metal specimens to look at under a microscope at 1000x. With a light enough touch you don't see many scratches even at that magnification. A major part of the mechanism is chemical attack on copper alloys, but the really small hard particles suspended in it still work for polishing other materials. Silvo has smaller particles again, which I've used for the final polish - but you wouldn't need that on a CD-ROM.
      • That works great if the scratch is on the non-label side (DVDs as well). Most people don't realize that the label side of most discs is just a thin layer of paint over the thin layer of aluminum/exotic materials that holds the data. If you scratch this side, your data is gone, unless it's redundant.
        Double-sided DVDs have two thick layers of polycarbonate with the data sandwiched between them. It's much harder to permanently scratch the data away on one of those.
      • I will 100% vouch for this. I scratched the crap out of my Halo 2 disc one time, throwing the controller at the ground and accidentally hitting the Xbox (I was pissed about a multiplayer game). Needless to say the disc got scratched all to hell and wouldn't even read afterwards.

        After some searching around on the net, I found that using Brasso with a cotton cloth would do the trick, so I tried it, and after 10 minutes or so the DVD is back to a totally-readable state! I was really skeptical at first but it m
    • by Athrac ( 931987 )
      Those 276 bytes are quite useless, because they cannot (at least to my knowledge) be used for correcting errors in other sectors. So, all you need is one sector that is totally unreadable, and the whole disc might become useless.
      • by Runefox ( 905204 )
        Not really; that 276 bytes applies to every 2048-byte sector. If one sector dies ENTIRELY, then you still have the rest of the disc (you're only out 2048 bytes). Not to mention, as others have said in other comments, you can easily repair physical damage to a CD-ROM, unless it's cracked in half, in which case NO error correction will work for you.
    • by bhima ( 46039 )
      That's what I was thinking...

      Oh and I did the green hair thing back in the late seventies or early eighties I forget which. It sounded a lot more fun than it was.
  • ISO9660? (Score:3, Informative)

    by NekoXP ( 67564 ) on Thursday March 08, 2007 @05:21AM (#18273846) Homepage
    From article: "Since the TrueDisc format is open and the master copies stored by TrueDisc are located in the standard ISO 9660 filesystem"

    That pretty much fucks up anyone's day when they want to burn a UDF DVD, doesn't it? ISO 9660 doesn't support files greater than 4 GB, directories can only go 8 levels deep (until the 1999 spec, but I always had a hell of a time reading that stuff on anything but XP), and there are stupid filename restrictions (and then do you use Joliet or Rock Ridge or whatever to fix it, or not?)...
    • Re: (Score:1, Informative)

      by Anonymous Coward
      There's a 4 GB limit for single extents of files, but one can fragment files into multiple extents [wikipedia.org]. Theoretically, at least, since all the burning tools I've tried (basically mkisofs in its various forms and some shitty Windows software) just silently skip larger files and will happily produce an empty disc if that was the only file.
  • by adam1101 ( 805240 ) on Thursday March 08, 2007 @05:45AM (#18273950)
    This $89 (or $52 intro price) TrueDisc sounds rather similar to the open source dvdisaster [sourceforge.net]. It builds Reed-Solomon error correction data from CD or DVD iso images, which can be either augmented to the image and burned on the same disc, or stored separately. It's somewhat similar to par2/quickpar, but dvdisaster is more specialized for CDs and DVDs.
  • This kind of thing already exists in the form of PAR files. Basically RAID5 but on a set of files, with arbitrary amounts of parity data (from 1% to 100%). The advantage of using PAR files, created with a program like QuickPAR, is that you can burn the parity files to a different disc.
