
Ask Slashdot: Do You Move Legal Data With Torrents? 302

Posted by Soulskill
from the i-seed-randomly-generated-text-files dept.
An anonymous reader writes "We've recently seen a number of interesting projects come from bittorrent.com, including Sync and SoShare. I sometimes use torrents to move several GB of data, especially when pushing large bundles to multiple destinations. It's mostly a hodgepodge of open source tools, though. Apart from anecdotes (e.g. the Blizzard Downloader) and info from bittorrent.com, details are thin on the ground. I have two questions for the Slashdot community. 1) Do you use BitTorrent to move data? If so, how? i.e., what kind of data, and what's the implementation? 2) If you've looked at torrent clients/tools, what's missing in the open source ecosystem that would make it more useful for moving around large blobs of data?"
  • by Qzukk (229616) on Wednesday April 24, 2013 @06:19PM (#43541183) Journal

    The entire point of swarm topology is to move data to a lot of places at the same time. If you just need to get data from A to B without sharing it with anyone else, rsync it.

    • by TooMuchToDo (882796) on Wednesday April 24, 2013 @06:37PM (#43541339)

      While working at Fermilab on the LHC CMS data taking team, I used bittorrent to speed up re-installs of thousands of worker nodes. I was able to saturate 10Gb Ethernet links this way, and could reinstall ~5500 Linux boxes within 10-15 minutes (with only two initial OS source servers).

      Yes, Bittorrent is not just for piracy.

    • by DragonTHC (208439)

      That's not the entire point.

      If you want to save bandwidth and still distribute your data, then crowdsource your downloads with bit torrent.

      Linux, MMO games, game mods, etc. All excellent uses of bit torrent.

    • by nabsltd (1313397)

      The entire point of swarm topology is to move data to a lot of places at the same time. If you just need to get data from A to B without sharing it with anyone else, rsync it.

      One huge advantage of bittorrent is that error checking/correcting is built in.

Although you would still need to re-download the block with the error, it's a very small amount of data (usually a few KB). This solves the problem of errors that happen after any data transport verification, as most bittorrent clients can be configured to do a final re-check after the torrent is complete. In addition, the data transport can be encrypted if you need it to be. Although it's not the strongest of encryption, it's enough to deter casual snooping.
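The built-in error detection described here can be sketched in a few lines: the torrent metadata carries a SHA-1 hash per fixed-size piece, and a re-check hashes each piece on disk, flagging only the ones that fail. Piece size and data below are invented for illustration:

```python
import hashlib

PIECE_SIZE = 16384  # real torrents typically use 256 KiB - 4 MiB pieces

def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list[bytes]:
    """Split data into fixed-size pieces and SHA-1 each one."""
    return [hashlib.sha1(data[i:i + piece_size]).digest()
            for i in range(0, len(data), piece_size)]

def failed_pieces(data: bytes, expected: list[bytes]) -> list[int]:
    """Return indices of pieces whose hash no longer matches."""
    return [i for i, (got, want) in enumerate(zip(piece_hashes(data), expected))
            if got != want]

# Simulate a final re-check: corrupt one byte and see only that piece flagged.
original = bytes(3 * PIECE_SIZE)
expected = piece_hashes(original)
corrupted = original[:PIECE_SIZE] + b"\x01" + original[PIECE_SIZE + 1:]
print(failed_pieces(corrupted, expected))  # [1] -- only piece 1 needs re-downloading
```

The point of the sketch is the granularity: a single flipped byte costs one piece, not the whole file.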

    • Re: (Score:3, Interesting)

      by i.r.id10t (595143)

      Yup, it is one of the few ways that darn near any user out there can contribute back to Open Source.

I can code, some, but I'm not very experienced with C/C++ - just haven't had the need for it. I hate writing documentation. I do file bug reports, but the stuff I use is pretty darn stable.

So, to give back, I seed ISO images for 24 hours or 100 GB of upload for any Linux distro release I see announced here on Slashdot, even if I don't use that distro.

  • Absolutely (Score:5, Funny)

    by Anonymous Coward on Wednesday April 24, 2013 @06:19PM (#43541185)

    All of my Torrents are legal data. What else would I use Bittorrent for besides Linux distros and Humble Bundle games?

  • by Synerg1y (2169962) on Wednesday April 24, 2013 @06:20PM (#43541193)

Moving large data requires resources. In the case of BitTorrent, most things don't qualify because it's a distributed network: even if 10 people in the office have the file and all know how to seed and use BitTorrent, you'd still be throttled by your bandwidth. BitTorrent has, however, time and time again shown that its distributed architecture can get something out to the masses very effectively.

    • by Grishnakh (216268)

I don't see why Bittorrent wouldn't work for an office of 10 people; its strength is copying data over distributed networks. For an office with 10 people, assuming they're on wired Ethernet links rather than WiFi, all 10 of those links will be connecting to one or more switches, which are able to handle full 100Mb or 1Gb speeds, duplex, between the computer and the switch, simultaneously. Using BT to copy the data might be a little slower than just using scp if it were only one PC, but the total transfer across all 10 machines should finish much sooner.

If the 10 people are in the same collision domain, torrenting won't make things faster. Actually, it can slow things down with unnecessary collisions.

        • Re:No - Resources (Score:5, Insightful)

          by darkain (749283) on Wednesday April 24, 2013 @09:12PM (#43542345) Homepage

          Serious question... How long have you been using a network HUB instead of a network SWITCH?

        • by MightyYar (622222)

          Collisions? Are switches that expensive?

        • Re: (Score:2, Informative)

          by Drakonblayde (871676)

          You're confusing broadcast domain with collision domain.

          On a switch, a collision domain is limited to the link itself, ie, collisions aren't possible. However, all ports on the switch (or all ports in the same VLAN, if the switch is manageable) share the same broadcast domain, which merely allows for all ports to contribute to saturation, but there still won't be any collisions

  • by Rinikusu (28164) on Wednesday April 24, 2013 @06:21PM (#43541201)

    I mean, other than the Blizzard stuff, no, I don't use bittorrent at all unless I'm downloading movies (usually) or software (sometimes).

    Rapidshare/megaupload/etc work much better for my one-off transfer needs, while I leave media distribution to the masses via Youtube, Vimeo, Bandcamp, and media collaboration to Dropbox and sneaker-usbdrive-net (especially for big projects).

  • Yes, I have (Score:5, Interesting)

    by EkriirkE (1075937) on Wednesday April 24, 2013 @06:28PM (#43541247) Homepage
    Many times a person is searching for a program to do something by keyword instead of software title, and for free. Torrent sites are a common place to go for something free. I just generate a .torrent for my software(s) and upload it to a few big trackers and the others seem to pilfer it from there. Just make sure the filenames and titles are relevant. It's like SEO, but TTO: Torrent Tracker Optimization.
  • In a word? YES! (Score:4, Informative)

    by bobbied (2522392) on Wednesday April 24, 2013 @06:28PM (#43541249)

    I specifically do not torrent anything that has copyright issues but I do seed a number of Linux distributions and development tools which do not prevent distribution in their licenses. Downloading anything using torrent is effectively distribution of the material too, so you had better KNOW that the license allows you to make copies and give them away.

    You folks that torrent movies and stuff that is not in the public domain are crazy in my book.

  • by MarkvW (1037596) on Wednesday April 24, 2013 @06:29PM (#43541257)

    Only a complete fucking moron would move legal data with torrents.

A lawyer is obligated to preserve his clients' confidences. When you store your information on somebody else's server or servers, you are giving up custody and control over some of those confidences. In that situation you are entirely dependent upon the strength of your encryption.

That encryption might be good today or tomorrow, but how good will it be five or ten years from now, when quantum computing or the next best thing becomes available for codebreakers?

    Don't risk a lawsuit from a pissed client!

    • by godrik (1287354)

      In this context, I believe legal meant "not illegal". Or "data I own" or "data that nobody will sue my ass off for moving them around"

  • by Anonymous Coward on Wednesday April 24, 2013 @06:30PM (#43541263)
    inquiring minds want to know.
  • Linux ISO's mostly (Score:5, Informative)

    by dlapine (131282) <dlapine.ncsa@uiuc@edu> on Wednesday April 24, 2013 @06:31PM (#43541285) Homepage

At work I need to install several different types/versions of Linux OSes for testing. I always torrent the ISO as a way of "paying" for the image that I'm using.

A few years back, we did some experimenting with torrents over the TeraGrid 10 GbE backbone, to see how well that worked over the long haul between IL and CA. With just 2 endpoints, even on GbE, it wasn't better than a simple rsync. We did some small-scale tests with fewer than 10 cluster nodes on one side, but it still wasn't as useful as the wide-area filesystem we were testing against. BitTorrent protocols just aren't optimized for a few nodes with a fat pipe between them.

I am interested in looking at the new BitTorrent Sync client to see how it works for our setup. We have many users with tens of TB of data to push around on a weekly basis.

    • by raymorris (2726007) on Wednesday April 24, 2013 @07:26PM (#43541645)
      Same here, I download and seed Linux distros.

      with just two end points, it wasn't better

For point-to-point transfer of large amounts of data, the protocol doesn't matter, as long as the protocol is sane. The time spent moving data bytes will be much higher than any protocol overhead. rsync is roughly optimal because it won't transfer portions of the file that the receiver already has. BitTorrent is for distributing data to many destinations.

      • That's not true: Some protocols are very "chatty" and may require, say, 1 round trip acknowledgement per file transferred. This is where rsync's streaming protocol shines.

        Even at home I noticed that copying large files over wireless via SMB is much slower than copying them over an SSHFS mount (getting previously unseen transfer rates, actually). However, the SMB mount is more responsive when exploring files.

        In my experience, protocol can matter a lot.

        • by wagnerrp (1305589)
          I wonder how much of that is poor CIFS implementation or configuration. I can do 115-120MB/s either way between my (Win7) desktop and my SAMBA server in my basement.
The GP referenced ISOs and "10s of terabytes of data". Unless those tens of TB are BILLIONS of tiny files, 1 RTT per file would be less than 0.01%. For lots of tiny files, like Maildir, yes, SMB sucks. Of course SMB is Microsoft, so the fact that it sucks is assumed.
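For what it's worth, that back-of-the-envelope claim holds up under illustrative assumptions (these numbers are invented, not measured):

```python
# Illustrative numbers only: 10 TB moved as 1,000 files of 10 GB each
# over a 1 Gb/s link, with one 1 ms round trip per file transferred.
total_bytes = 10 * 10**12        # 10 TB
link_Bps = 10**9 / 8             # 1 Gb/s expressed in bytes per second
rtt_seconds = 0.001              # one 1 ms round trip per file
files = 1_000

transfer_time = total_bytes / link_Bps   # time spent actually moving bytes
handshake_time = files * rtt_seconds     # time lost to per-file round trips
overhead = handshake_time / transfer_time
print(f"{overhead:.4%}")  # 0.0013% -- well below 0.01%
```

Only with millions of tiny files does the per-file round trip start to dominate, which is the GP's point.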
      • BitTorrent is, conceptually, best used as a reimplementation of multicast. Multicast is probably far more efficient when it comes to the actual data distribution, but multicast (specifically, routing multicast) is one of those blackbox things that not a huge number of people understand. Last time I checked, I couldn't route a multicast source from a Comcast connection and have the data arrive on a receiver on the Cox network.

However, there was still a need for a protocol that could effectively do one-to-many distribution, and BitTorrent filled that gap.

  • by markdavis (642305) on Wednesday April 24, 2013 @06:32PM (#43541289)

    Just about the only time I use torrents is when downloading Linux distributions- Mageia, Fedora, CentOS, etc. Occasionally iso's for grub magic, ultimate boot CD, and such. All of that legal. And I usually leave it up at least long enough that my share ratio is 100% (1.0).

    • by nabsltd (1313397)

      Occasionally iso's for grub magic, ultimate boot CD, and such. All of that legal. And I usually leave it up at least long enough that my share ratio is 100% (1.0).

      For such tiny things (latest UBCD is 486MB), I pretty much seed forever, as it doesn't really cost me anything. I'm at 67:1 on the latest UBCD, and over 400:1 on some other much smaller torrents.

I leave them running because I don't artificially limit my upload rate on a per-torrent basis, only as a total for all torrents (and that limit is quite high, as I have Verizon FiOS). It really bugs me when I try to download an older torrent and the only seeds are uploading at a few KB/sec. Even on a 100 MB torrent, that takes forever.

    • Just about the only time I use torrents is when downloading Linux distributions- Mageia, Fedora, CentOS, etc. Occasionally iso's for grub magic, ultimate boot CD, and such. All of that legal. And I usually leave it up at least long enough that my share ratio is 100% (1.0).

Even better is if you aim for a share ratio of 2.0. With a share ratio of 1.0, you only "give back what you take", so the swarm stays as strong as it was. That's good. But sending back an "extra copy" leaves the swarm stronger than you found it.

  • occasionally (Score:4, Informative)

    by BenSchuarmer (922752) on Wednesday April 24, 2013 @06:33PM (#43541295)
    mostly for music that's under creative commons license and the occasional Linux download.
  • by GameboyRMH (1153867) <gameboyrmhNO@SPAMgmail.com> on Wednesday April 24, 2013 @06:37PM (#43541345) Journal

    Linux distros, free movies, free games...

I tried to switch to Deluge but it couldn't handle a file with a Japanese character in its name. Other than that, the only things I think many torrent clients could use are the ability to accept magnet downloads through a drop folder somehow, plus searching and better sorting/filtering options for downloaded torrents.

We use the OpenBitTorrent tracker and the Transmission client to deploy and acquire virtual machine hard drive images among our developers. The obvious reason is that it's much faster for our developers to help each other out with shoveling the data around rather than all of them having to get the data from one and the same link (read: the main server). Compare it to a person reading a novel, once, out loud, to a group of people, instead of reading it in private over and over to each and every attendee.
  • gittorrent (Score:5, Interesting)

    by lkcl (517947) <lkcl@lkcl.net> on Wednesday April 24, 2013 @06:40PM (#43541379) Homepage

    the one thing that would help enormously would be to have git be *truly* peer-to-peer distributed. not "yeah shure mate you can always git pull and git push, that's distributed, and you're a peer, right, so... so... git is peer-to-peer and distributed, so what are you talking about you moron??" but "at the network level, git pull and git push have a URL type that is **TRULY** peer-to-peer distributed. to illustrate what i mean, i would like to be able to do the following - with all that it implies:

    git clone magnet://abcdefg0123456789/gittorrent.git

    if you're familiar with magnet links, you'll know that there is *no* central location: a DHT lookup is used to find the peers.

    now, what wasn't clear to the people on the git mailing list when i last looked at this, was that it is possible to use bittorrent to do git pack objects, by creating a file named after the pack object itself. and what wasn't clear to sam (the last person who tried to put git over bittorrent) was that you *MUST NOT* make use of bittorrent's "multiple file in a torrent" feature, because bittorrent divides up its data into equal-sized blocks that *do not* line up with the files that are in them, which is why when you download one file in a torrent you almost always end up with the end of its preceding file and the start of the one after it, as well.

    the idea i came up with is that you create *multiple* torrents - one per git object (or git pack object). if you want to pull a tree, you create a torrent containing *one file* which is the list of objects in that tree; gittorrent would then know to map each of those objects onto yet *another* torrent (one per object), repeat until all downloading happily. gittorrent objects are of course named after the hash, so you can pretty much guarantee they'll be unique.

    and, adding in a DHT (a la magnet links), you are now no longer critically dependent on something like e.g. github, or in fact any server at all.

    to answer your question in a non-technical way, mr anonymous, i think you can see that i feel it would be much more useful to have development tools that use bittorrent-like protocols to share files-as-revision-controlled-data (and, if you've seen what joey hess is doing with bittorrent you'll know that that's a hell of a lot - including storing home directories in git and doing automatic distributed backups)
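The per-object scheme sketched above can be illustrated roughly as follows. Here `object_id` mimics git's blob naming (SHA-1 over a `type size\0payload` header), and the "tree" torrent is just a list of object ids, each mapping to its own single-file torrent so that pieces never straddle file boundaries. The actual torrent/DHT machinery is omitted:

```python
import hashlib

def object_id(payload: bytes, obj_type: str = "blob") -> str:
    """Mimic git's object naming: SHA-1 over a 'type size\\0payload' header."""
    header = f"{obj_type} {len(payload)}\x00".encode()
    return hashlib.sha1(header + payload).hexdigest()

def tree_manifest(objects: dict[str, bytes]) -> list[str]:
    """The 'tree' torrent contains one file: the list of object ids.
    Each id then maps to its own single-file torrent, named after the hash."""
    return sorted(objects)

# Two made-up objects; each would become its own single-file torrent.
objs = {object_id(b"hello\n"): b"hello\n",
        object_id(b"world\n"): b"world\n"}
manifest = tree_manifest(objs)
print(manifest)
```

Because names are content hashes, duplicate objects across repositories deduplicate for free in the DHT.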

    • by Qzukk (229616) on Wednesday April 24, 2013 @06:55PM (#43541481) Journal

      Intriguing idea, but I tried subscribing to your newsletter and I keep getting the Sept. 24, 1998 edition over and over. The problem with using bittorrent for this is distributing NEW data. If the protocol could cope with a seed appending data to the torrent without having to create a whole new .torrent file, then this could be awesome. As it is, you're just changing the problem from "how do I send out new versions of files when I commit something" to "how do I send out new versions of the .torrent files every time I commit something"

      • I tried subscribing to your newsletter and I keep getting the Sept. 24, 1998 edition over and over.

        A new meme is born!

      • I see your flamebait already fooled some mods. However, you're supposed to attack the idea, not the man. And you failed spectacularly at that.

        The key component in what he's proposing would be the DHT; not even necessarily the same DHT as used by standard BitTorrent clients. The use (or not) of the BitTorrent specification and its intricacies would be just an implementation detail.

You're free to be one of the "can't be done" people; it would be wise not to advertise it too much around here, however.

  • Linux ISOs
    VM Images
    Backup Images
    Home Movies
    etc...

  • by Spy Handler (822350) on Wednesday April 24, 2013 @06:47PM (#43541427) Homepage Journal

    I got a game called War Thunder via bit torrent. What you do is download a small installer program from the publisher's website and run that. The installer automatically connects to BT seeds and peers and downloads the actual game itself.

    There is no other way to get War Thunder. I suppose since they're a small publisher, their web server can't handle distributing the 13 GB game file to tens of thousands of users.

    • I hate it when developers do this. Offer a torrent as your default download, sure, but put a direct download buried somewhere in your support pages.

      Some people (like me) have net connections that just crap out on torrents. I can download a large file reasonably well, but the same file in a torrent will take weeks.

      And some people (a majority, in some nations) have caps, and a torrent-based downloader eats into that quite a bit.

  • by silas_moeckel (234313) <[silas] [at] [dsminc-corp.com]> on Wednesday April 24, 2013 @06:47PM (#43541429) Homepage

    I have a fairly large and growing (3.7TB) dataset that needs to be replicated nightly to a bunch of different sites. By the nature of the dataset nothing is ever removed or changed just new files added. It needs to be copied out to a half dozen locations that have as much outbound bandwidth as the primary. So a cron job sets up the torrent every night and all the remote sites pull data from the primary and reshuffle it between themselves.
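A setup like this is simple precisely because the dataset is append-only: deciding what goes into each night's torrent is just a set difference against the previous night's manifest. A hypothetical sketch (file names invented, torrent creation itself omitted):

```python
def nightly_batch(current_files: set[str], last_manifest: set[str]) -> set[str]:
    """Files that appeared since the last run. The dataset is append-only,
    so nothing ever leaves the manifest and no diffing of contents is needed."""
    return current_files - last_manifest

# Invented file names: last night's manifest plus one new arrival.
last_night = {"data/2013-04-22.tar", "data/2013-04-23.tar"}
tonight = last_night | {"data/2013-04-24.tar"}

new_files = nightly_batch(tonight, last_night)
print(sorted(new_files))  # only the new file goes into tonight's torrent
```

The cron job would then build a torrent from `new_files` and let the remote sites reshuffle the pieces among themselves.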

    • by wvmarle (1070040)

      I may miss something, but isn't this the ultimate job for rsync, to copy only the new bits? Or do you take a diff and distribute only that?

  • by ADRA (37398) on Wednesday April 24, 2013 @06:59PM (#43541501)

Bit torrent is a good data -distribution- tool, not a data -mover-, and it would be lousy in that role. There are at least a dozen possible open solutions for moving data from point to point, and I have no idea why you'd use a protocol/tool stack that is designed for broadcast/graph distribution to do so.

    An off the top list:
          1. NFS
          2. SMB
          3. FTP
          4. SFTP/SCP/rsync
          5. HTTP/HTTPS
          6. sz/rz
      7. iSCSI
          8. DFS
          9. AFS
          10. UFTP/XFTP

The question should really be: what exactly do you see as being deficient about all these protocols that makes it necessary to re-invent the wheel yet again?

  • by erroneus (253617) on Wednesday April 24, 2013 @07:00PM (#43541507) Homepage

    Punish the technology because of how it is used? I thought we grew past that notion already.

At some point, one of the few remaining ways to get good information and news will be through these rogue channels and methods. Do we have to keep re-hashing the same ridiculous notions? How about we ban types of music based on the fact that thugs and criminals like it and glorify killing?

    • I thought we grew past that notion already.

      Where have you been?

    • Punish the technology because of how it is used? I thought we grew past that notion already.

      If we had you'd be able to buy any weapon or drug you want without government interference or oversight.

The NRA could go back to its original functions of training and research, and the FDA and DEA could be replaced by Underwriters Laboratories and Consumer Reports.

      As you can see we have a long way to go.

  • Facebook Does (Score:4, Interesting)

    by terbeaux (2579575) on Wednesday April 24, 2013 @07:10PM (#43541557)
Facebook deploys its 4 GB binary to its 500,000 servers using a torrent client that has rack and switch affinity. Each client prefers data chunks that are already local to a rack or switch it is connected to. That is a crap-ton of data.
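Rack and switch affinity of this kind boils down to ranking candidate peers by network distance before requesting pieces. A toy sketch with invented rack/switch names (not Facebook's actual implementation):

```python
def peer_preference(my_rack: str, my_switch: str,
                    peers: list[tuple[str, str, str]]) -> list[str]:
    """Order peers: same rack first, then same switch, then everyone else.
    Each peer is (name, rack, switch)."""
    def distance(peer: tuple[str, str, str]) -> int:
        name, rack, switch = peer
        if rack == my_rack:
            return 0      # cheapest: traffic stays inside the rack
        if switch == my_switch:
            return 1      # next best: stays on the same switch
        return 2          # crosses the network core

    return [name for name, _, _ in sorted(peers, key=distance)]

peers = [("far", "rack9", "sw9"), ("near", "rack1", "sw1"),
         ("mid", "rack2", "sw1")]
print(peer_preference("rack1", "sw1", peers))  # ['near', 'mid', 'far']
```

Pulling pieces in this order keeps most of the traffic off the core links, which is what makes a deployment that size feasible.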
  • by tdelaney (458893) on Wednesday April 24, 2013 @07:17PM (#43541605)

The big advantage of using BitTorrent over many other protocols for moving large amounts of data (as opposed to distributing it) is how it handles unreliable links. When you're moving large datasets, you don't want the transfer to crap out at 90% and be unable to resume.

    Sure - rsync has the ability to resume, but it requires explicit command line options. It's a terrible feeling to realise you just restarted a 10GB+ transfer instead of resuming it.
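For reference, one way to assemble such a resumable invocation; `--partial` and `--append-verify` are real rsync options, while the paths here are made up:

```python
import subprocess

def resumable_rsync(src: str, dest: str, dry_run: bool = True) -> list[str]:
    """Build an rsync command that keeps partial files on interruption and
    resumes them, re-verifying already-transferred data when it does."""
    cmd = ["rsync", "--archive", "--partial", "--append-verify",
           "--progress", src, dest]
    if not dry_run:
        subprocess.run(cmd, check=True)  # actually perform the transfer
    return cmd

# Hypothetical paths; dry_run=True just shows the command that would run.
print(resumable_rsync("bigdata/", "backup:/srv/bigdata/"))
```

Without `--partial`, rsync deletes the partially-transferred file on interruption, which is exactly the "restarted instead of resumed" trap the parent describes.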

  • SOLR at Etsy.com? (Score:3, Interesting)

    by rwhiffen (141401) on Wednesday April 24, 2013 @07:19PM (#43541611) Homepage Journal

    The folks at Etsy do it to replicate SOLR:
    http://codeascraft.etsy.com/2012/01/23/solr-bittorrent-index-replication/

    Not sure if that's what you mean.

  • Back in the pre-YouTube days, Rooster Teeth distributed their videos using Bit Torrent to relieve their own HTTP load. I think they gave BT users the incentive of downloading earlier to encourage its use.

  • somewhere in the world, whatever you're moving, it's legal.
  • by Osgeld (1900440)

I don't have anything worth mass-sourcing, and my $3 a month "unlimited bandwidth" website has taken 400 GB in downloads in a month before without a sweat.

About the only thing I could think of is a Linux distro, but again the only thing I'm bound to cobble together is a lightweight Debian mix for maintenance / repair / recovery when someone brings me their computer to be fixed and winders is all trashed.

    • by nabsltd (1313397)

I don't have anything worth mass-sourcing, and my $3 a month "unlimited bandwidth" website has taken 400 GB in downloads in a month before without a sweat.

      That's an awesome price, even if the overall speed isn't that great.

      Everything I have found that gives you more than a few GB of disk space and truly unlimited bandwidth costs a lot more than that per month.

It's a good idea for sharing, but I'm not a fan of the way it loves to hog the Internet connection: it starts connections out the ass, flooding the network, and just slows everything down. I tend to use wget for pretty much everything; with a decent server, it gets by just fine with only one connection. I also like the idea of maintaining at least somewhat-accurate timestamp data whenever possible, and BitTorrent doesn't seem to have a concept of that. And also, I like to maintain my own checksum files, so the built-in hashing isn't much of a draw for me.

    • by BitZtream (692029)

If you're using a bittorrent client that doesn't preallocate the file, then you should switch to a non-shitty one that preallocates, avoiding the entire fragmentation problem completely.

Or you could use a filesystem that didn't suck.
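Preallocation of the kind described is a single system call on POSIX platforms; a sketch using Python's `os.posix_fallocate` (Linux/Unix only; the file name is invented):

```python
import os
import tempfile

def preallocate(path: str, size: int) -> int:
    """Reserve space for the file up front, so later random-order piece
    writes don't fragment it. Returns the resulting file size."""
    fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o644)
    try:
        # Allocate blocks without writing data; extends the file to `size`.
        os.posix_fallocate(fd, 0, size)
    finally:
        os.close(fd)
    return os.stat(path).st_size

with tempfile.TemporaryDirectory() as d:
    print(preallocate(os.path.join(d, "piece_store.bin"), 1 << 20))  # 1048576
```

Unlike simply seeking and writing a trailing byte (which creates a sparse file), `posix_fallocate` asks the filesystem to actually reserve the blocks, which is what avoids fragmentation.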

I often download Linux DVD images and such using bittorrent. Bittorrent isn't "evil" - it is simply a means to share the load of moving large amounts of data. To make it illegal should be considered something akin to making automobiles and pickup trucks illegal - as they can move both large amounts of legitimate as well as other goods from point A to point B, and they can use many routes to get to B.
  • I use Torrent for many legal uses :
    - Blizzard downloader (WoW, Starcraft2, Diablo 3)
    - Humble Bundle
    - Linux Distributions

    But also, from time to time, some free tools, some F2P games that use BitTorrent for distribution, ...

  • I've read that in Australia both the big, legitimate movie distributors and the major TV networks use (or have used) torrents to move programs around, between theaters (with digital projection facilities) and networked TV stations.
  • I pretty much always use torrents to download OS installers (Linux, etc.). I don't know whether I've ever downloaded anything that wasn't totally legal. Probably have at some point, but I mostly use torrents for things where I want to be contributing bandwidth back to the community.

  • Mostly SuSE. Some kubuntu, CentOS and Oracle OEL.
  • Wait, you can move illegal data using torrents? Who would have thunk!?!

    Haven't used Torrents in years. Many years.

Many open source projects with large downloads, like ISOs for Linux distributions, use torrents because paying for bandwidth for direct downloads would be really expensive.

Bit torrent is great, because everyone with consumer broadband becomes an instant mirror when they download it.
We use bittorrent to move MS ISOs at work. We are an educational institution, and as such we are allowed to offer students access to MS products (not the Office suite, but OSes, .NET, and the like). This is done via the DreamSpark site, where students can get their license keys, but from our internal network they can use bittorrent to get the ISOs.

    So we legally give access to MS OS via torrents (^_^)
