Technology

A Distributed DivX Ripper?

RJ asks: "I know a fair amount about Java/C++ and sockets programming, and I'd like to use this knowledge to build a distributed program to rip a DVD into DivX. It would work by breaking the job into chunks, sending the chunks to other computers to encode, then patching the results back together. Despite my best search efforts, I have been unable to find a decent resource to teach me how to program the DivX core to encode mpeg2 and re-join the parts. I'm hoping that some readers of Slashdot can point me in the right direction?"
This discussion has been archived. No new comments can be posted.

A Distributed DivX Ripper?

Comments Filter:
  • I'm not an expert, (Score:3, Informative)

    by Skyfire ( 43587 ) on Saturday May 11, 2002 @02:42AM (#3501221) Homepage
    but it seems to me like there are several problems that slow down DVD to Divx reencoding:
    • Reading around 4 gigs of data in a fairly fast time: not much of an issue on a normal computer, but distributed computing might introduce issues with speeds of the network.
    • Deencoding the mpeg2: this seems to me to be the biggest slowdown. The problem seems to me to be one of not having a fast enough decoder that is capable of good image output. Distributed computing would help, but it seems to me to be somewhat of a macgyvered solution to the real problem.
    • Resizing, cropping etc: not much of an issue, but of course distributed computing would help.
    • Mpeg4 encoding: again, one of the major problems seems to be fairly slow encoding algorithms, so this would be helped by distributed computing.
    I'm sure setting up a Beowulf cluster of computers or some other setup seems nice and all that, but unless you're reencoding several DVDs a day, it seems like a normal Athlon would probably do the trick, and working on the code for better decoding and encoding in the various codecs would seem to me to be a more efficient way of improving speed. I know that I can reencode a DVD at half real time with 1-pass DivX 5 encoding and FlaskMPEG, and at 1/4 real time with 2-pass encoding, and I've only got an Athlon 1.4. So unless you are reencoding Das Boot quite a bit, it seems to me like one normal high-powered consumer computer would be plenty sufficient.
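A quick back-of-envelope check of the first bullet's worry (moving ~4 GB around) against the encode times quoted above suggests the network is not the bottleneck on a LAN. All numbers here are illustrative assumptions, not measurements:

```python
# Rough estimate: does shipping the DVD data dominate, or the encoding?
# Figures below are assumptions for illustration (100 Mbit LAN, 2-hour
# movie, 2-pass encode at 1/4 real time as claimed in the comment above).

DVD_BYTES = 4 * 1024**3            # ~4 GB of VOB data
LAN_BYTES_PER_SEC = 100e6 / 8      # 100 Mbit/s LAN, ~12.5 MB/s
MOVIE_SECONDS = 2 * 3600           # a 2-hour feature
ENCODE_SLOWDOWN = 4                # 2-pass encode at 1/4 real time

transfer_seconds = DVD_BYTES / LAN_BYTES_PER_SEC
encode_seconds = MOVIE_SECONDS * ENCODE_SLOWDOWN

print(f"transfer: {transfer_seconds / 60:.0f} min")   # a few minutes
print(f"encode:   {encode_seconds / 3600:.0f} h")     # several hours
```

On these assumptions the transfer takes minutes while the encode takes hours, which is why the decode/encode stages, not the copy, are the part worth distributing (on a LAN; over modems the picture changes).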
    • Please, it's "decode," not "deencode."

      Like my stupid high school programming teacher who yankee-ily kept saying "disenable," when it's really "disable," you don't have to include BOTH positive and negative prefixes -- only one.

      Anyway, you're pretty much right. Though I think the original intent was a bit more distributed than a cluster (WAN-scale), since such a thing (with oh, just a couple Athlons) is beyond most would-be rippers, I'd assume.

      However, if you had a good-sized pool of cohorts, and they all had cable modems that can sustain 100k+ between them both ways, then the networking part seems more feasible than one might at first think. It's not something I'd release and let any old Joe on 56k try, of course, but I wouldn't (personally) consider it MacGyver-ish.

  • by karnal ( 22275 ) on Saturday May 11, 2002 @03:20AM (#3501282)
    I've tried to do something similar before -- namely, breaking up .wav files so that I could distribute the pieces to other machines to encode.

    I ran into a snag.

    It seems the encoder I was using at the time (bladeenc) was inserting silence at the end of each mp3, to keep it to spec. What I can imagine is that even with DVD encoding, you'd need a "master" that would give out file chunks to the worker bees. But -- it would have to be intelligent enough to know when you wanted a new keyframe, and split up the .avi / .vob in that sense.

    In other words, you may as well just build one heck of a fast machine, and try to get 30-40fps encoding out of it, rather than try to put together something to distribute it and encode it. That's my 2 cents, and I may be wrong....
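The "intelligent master" karnal describes can be sketched in a few lines: chunk boundaries may only fall on keyframes, so each worker receives an independently decodable run of frames. The frame counts and keyframe positions here are invented for illustration:

```python
# Sketch of the splitting logic described above: a master that cuts the
# job into chunks, but only at keyframe boundaries, so every worker gets
# a self-contained run of frames. All numbers below are made up.

def split_at_keyframes(keyframe_indices, total_frames, target_chunk):
    """Return (start, end) frame ranges, each starting on a keyframe."""
    chunks = []
    start = 0
    for kf in keyframe_indices:
        if kf - start >= target_chunk:   # chunk is big enough, cut here
            chunks.append((start, kf))
            start = kf
    chunks.append((start, total_frames))  # whatever remains is the tail
    return chunks

# e.g. keyframes every ~150 frames in a 1000-frame clip, ~300-frame chunks
chunks = split_at_keyframes([0, 150, 300, 450, 600, 750, 900], 1000, 300)
print(chunks)  # [(0, 300), (300, 600), (600, 900), (900, 1000)]
```

The same shape works whether the source is an .avi or a demuxed .vob; the hard part in practice is getting the keyframe index out of the container, which this sketch simply assumes as input.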
    • It seems the encoder I was using at the time (bladeenc) was inserting silence at the end of each mp3, to keep it to spec.

      Dedicate one machine to running LAME [sulaco.org] on the audio, and you won't have this problem.

    • Well, MP3 audio consists of frames, so if your .wav files are not the exact size to match MP3 frame boundaries, the rest of the frame will be padded with zeroes. Same problem when trying to play two tracks without gaps in between. Go to www.r3mix.net, they got a lot of info on this (and more).

      The solution is to cut your .wav files to the correct size, and all will be fine.
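The fix described above can be put in concrete numbers: an MPEG-1 Layer III frame holds 1152 samples per channel, so if each .wav piece is trimmed to a multiple of 1152 samples, the encoder never has to pad the final frame with silence. A minimal sketch:

```python
# Cutting a .wav piece to an MP3-frame-aligned length, per the fix above.
# 1152 is the MPEG-1 Layer III samples-per-frame constant.

SAMPLES_PER_MP3_FRAME = 1152

def frame_aligned_samples(n_samples):
    """Largest sample count <= n_samples that fills whole MP3 frames."""
    return (n_samples // SAMPLES_PER_MP3_FRAME) * SAMPLES_PER_MP3_FRAME

# 10 seconds at 44100 Hz is 441000 samples -- not frame-aligned
print(frame_aligned_samples(441000))  # 440064, i.e. 382 full frames
```

The trimmed-off remainder would simply be prepended to the next piece, so nothing is lost across chunk boundaries.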

  • by Vito ( 117562 ) on Saturday May 11, 2002 @03:37AM (#3501323) Homepage

    Watch this post get modded up, and not my qualified response [slashdot.org] to the From Coder to Game Designer [slashdot.org] question. Humbug!

    Anyway, as brought up in the last Ask Slashdot remotely similar to this one (Archiving DVD's with Linux [slashdot.org]), dvd::rip [exit1.org], which is a Perl+GTK front-end to transcode [uni-goettingen.de], has a fairly insecure cluster mode [exit1.org], whereby it will split up the video transcoding task among however many machines you can coerce into doing it, and rip and mux the audio with the video on the host machine.

    Sounds like just what the doctor ordered. Now someone go mod up that other answer of mine. Please?

    • Allegedly, transcode has some sort of clustering support as well. I can't vouch for it, but dvd::rip's cluster mode is great, except that it won't do AC3 audio pass-through.

      However, I've found that the XviD [xvid.org] codec is fast enough to encode 2:1 on my P3 866. That is, it takes 2x as long to encode as the movie lasts - it gets around 12fps. DivX 4.02 put out slightly better quality, at 4-6fps on the same system. I haven't done anything with DivX 5.xx yet, so I can't speak for that.
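The 2:1 figure above is just frame-rate arithmetic (assuming ~24fps film source), which makes it easy to translate any quoted encoder fps into a slowdown factor:

```python
# Encode slowdown = source frame rate / encoder frame rate.
# 24 fps is assumed film-source material, as in the benchmarks above.

def encode_ratio(source_fps, encoder_fps):
    """How many times the movie's running length the encode takes."""
    return source_fps / encoder_fps

print(encode_ratio(24, 12))  # 2.0 -> XviD at ~12fps on the P3 866
print(encode_ratio(24, 5))   # 4.8 -> DivX 4.02 at ~4-6fps
```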
  • Er? (Score:2, Funny)

    by Wakko Warner ( 324 )
    You want us to help you break the law?!? What do you think we are?!
  • Yeah, I've been thinking of something like that for a bit now. On a local network, you wouldn't have to worry much about speeds (some guy above said that). One of the things I was thinking of was more of a general distributed encoding system, where the servers (that the chunks are being sent to) could have sorta-like plugins for different media types... so you could encode many types at once.
  • when developers have put years of work into PVM or MPI. I do not know if "mpi_allgather" and "mpi_allscatter" would stand up to a 2GB array like those found on DVDs, but at least this would put several $1M beowulves I know of to a somewhat useful purpose (besides cracking /etc/passwd and SSL sniffs, of course), instead of boring quantum chemical computations or climate simulations.
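For readers unfamiliar with the MPI calls named above, their core behavior can be imitated in plain Python (no MPI installation needed): a root rank splits one big buffer into near-equal contiguous pieces, each rank works on its piece, and the results are concatenated back in rank order. This is only a sketch of the semantics, not of MPI's actual wire-level API:

```python
# Plain-Python imitation of what MPI scatter/gather do with a buffer:
# split into near-equal contiguous parts, then reassemble in rank order.

def scatter(buf, n_ranks):
    """Split buf into n_ranks contiguous, near-equal pieces."""
    base, extra = divmod(len(buf), n_ranks)
    pieces, pos = [], 0
    for rank in range(n_ranks):
        size = base + (1 if rank < extra else 0)  # first ranks get +1 byte
        pieces.append(buf[pos:pos + size])
        pos += size
    return pieces

def gather(pieces):
    """Concatenate the per-rank results back together, in rank order."""
    return b"".join(pieces)

data = bytes(range(10))
parts = scatter(data, 4)
print([len(p) for p in parts])   # [3, 3, 2, 2]
assert gather(parts) == data
```

Whether a real MPI implementation of the era handles a single 2GB buffer is exactly the open question the post raises; in practice one would scatter it as many smaller messages.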
    • As the author of Transcode [uni-goettingen.de] explained to me, using a binary Divx encoder with PVM/MPI/Mosix is impossible. I don't know what state open source Divx encoders are in, but I agree this would be a much better solution than chopping up a DVD and encoding all the pieces separately.
  • by Anonymous Coward on Saturday May 11, 2002 @01:23PM (#3502589)
    Vidomi [vidomi.com] is a badass little program to turn mpg, vob, ... into DivX. One of the recently added features is "Distributed Encoding" (read: Scalability via network slaves).

    This answer your question?
  • Well, I'm not sure exactly how DivX compression works, but in general, the more data the compression algorithm can see at once, the more redundancy it can find, and therefore the more it can compress. Chopping a piece of data up into bits will at some point start reducing the seen compression ratio.
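The point above is easy to demonstrate with a general-purpose compressor: the same redundant data compresses worse when chopped into pieces, because each piece restarts with an empty dictionary. Video codecs differ in the details (their redundancy is mostly between nearby frames), but the loss of cross-chunk redundancy is the same idea:

```python
import zlib

# Compress the same redundant data whole vs. chopped into 1 KB pieces.
# Each piece restarts the compressor's state, so the chunked total is
# larger. Illustrative data, not video.

data = b"the quick brown fox jumps over the lazy dog " * 1000

whole = len(zlib.compress(data))
chunked = sum(len(zlib.compress(data[i:i + 1024]))
              for i in range(0, len(data), 1024))

print(whole, chunked)
assert whole < chunked   # chopping it up costs compression ratio
```

For video the practical mitigation is the one mentioned elsewhere in this thread: split only at keyframes, where the codec restarts its prediction anyway, so the cost of the cut is near zero.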
  • The hardest part is load balancing. How do you make sure the slowest computer doesn't get the last chunk, forcing the faster computers to sit idle while they wait for it?
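The usual answer to the question above is to not assign chunks up front at all: keep them in a queue and let each worker pull the next one when it finishes. A small simulation with invented worker speeds shows the difference (note the slow worker can still straggle on a final chunk, which is why real schedulers also shrink chunk sizes near the end):

```python
import heapq

# Compare two schedules on hypothetical workers: dealing chunks out evenly
# in advance (static) vs. letting idle workers pull from a queue (pull).
# Speeds and chunk sizes are invented for the simulation.

def makespan_static(speeds, chunks):
    """Finish time when chunks are dealt out evenly in advance."""
    per = len(chunks) // len(speeds)
    return max(sum(chunks[i * per:(i + 1) * per]) / s
               for i, s in enumerate(speeds))

def makespan_pull(speeds, chunks):
    """Finish time when idle workers pull the next chunk from a queue."""
    workers = [(0.0, s) for s in speeds]   # (time when free, speed)
    heapq.heapify(workers)
    for work in chunks:
        free_at, speed = heapq.heappop(workers)  # earliest-idle worker
        heapq.heappush(workers, (free_at + work / speed, speed))
    return max(t for t, _ in workers)

fast_and_slow = [4.0, 4.0, 1.0]     # two fast boxes, one slow one
chunks = [10.0] * 12                # twelve equal chunks of work

print(makespan_static(fast_and_slow, chunks))  # slow box drags everyone
print(makespan_pull(fast_and_slow, chunks))    # fast boxes soak up chunks
```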
