Bittorrent To Replace Standard Downloads? 591

Max Sayre writes "Have you ever tried to download an operating system update only to have it fail and have to start all over? What about patches for your favorite games? World of Warcraft already uses Bittorrent technology as a way to distribute large amounts of content at a lower cost to the company and at faster speeds to all of its clients. So why hasn't this approach replaced the standard downloading options built into every major OS? Companies like Opera already include torrent downloading in their products, and extensions have been written for Firefox to download torrents in-browser. Bittorrent traffic is growing every day. Sites like OpenBittorrent already exist, and DHT doesn't even require a tracker. So why isn't everyone doing it? Is it finally time to see all downloads replaced with Bittorrent?"
  • You explained it. (Score:4, Insightful)

    by binarylarry ( 1338699 ) on Sunday October 03, 2010 @10:37PM (#33780700)

    When torrent support comes standard in all the major browsers, it can take off.

    Until then it's a tool for nerds to get their porn faster.

    • by smartr ( 1035324 ) on Sunday October 03, 2010 @10:57PM (#33780874)
      I can see this really taking off in the office I work at... Oh wait... Is that a giant truck of bandwidth clogging the private network? You're using the VPN to host torrent files? Ring ring, the customer wants to know why the internet is so slow.
      • Re:You explained it. (Score:4, Interesting)

        by davester666 ( 731373 ) on Sunday October 03, 2010 @11:15PM (#33780996) Journal

        Hell, I'm on Shaw Cable in Canada, and if I don't limit my upload bandwidth to 5 kb/s, my download bandwidth drops to sub-50 kb/s. But if I do limit it to 5 kb/s, then download speeds go way up to over 200 kb/s.

        And yes, they advertise that I should be getting an order of magnitude greater speed than this...

        • Re:You explained it. (Score:5, Informative)

          by Sancho ( 17056 ) * on Monday October 04, 2010 @12:35AM (#33781434) Homepage

          Your acknowledgement packets probably aren't getting through.

          http://www.benzedrine.cx/ackpri.html [benzedrine.cx]

          • by Idbar ( 1034346 ) on Monday October 04, 2010 @11:15AM (#33784222)
            I don't see why you're modded interesting, when you should be modded +5 informative.

            Some people don't understand the fundamentals of TCP's congestion and flow control. BitTorrent is a very greedy, selfish, egocentric, abusive (keep going with the adjectives) algorithm. It takes advantage of the mechanism TCP uses to provide fairness and turns it against the rest of the users. While people are excited about its performance when selfishly downloading data, the spread of this type of algorithm may lead to unusable networks, particularly because no queue management is enforced and marking mechanisms are not used by default; routers will simply drop packets, and the end effect is a large number of retransmitted packets.

            As the parent points out in explaining the grandparent's observation, the greed of such protocols even degrades their own throughput by starving the acknowledgement packets.
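
            To put numbers on that ACK starvation, here's a back-of-the-envelope sketch (the packet sizes are typical assumed values, not measurements):

              # Rough estimate of the upstream bandwidth TCP ACKs need to
              # sustain a given download rate. Assumes 1500-byte data packets,
              # ~64-byte ACK frames, and delayed ACKs (one per two segments).
              DOWNLOAD_RATE = 200 * 1024   # bytes/s (a 200 kB/s download)
              DATA_PACKET = 1500           # bytes per downstream data packet
              ACK_FRAME = 64               # bytes per upstream ACK frame
              ACK_EVERY = 2                # delayed ACKs: one per 2 segments

              acks_per_sec = DOWNLOAD_RATE / DATA_PACKET / ACK_EVERY
              ack_bw = acks_per_sec * ACK_FRAME
              print(f"{acks_per_sec:.0f} ACKs/s -> {ack_bw / 1024:.1f} kB/s")
              # ~68 ACKs/s -> ~4.3 kB/s: a few kB/s of reserved upstream is
              # all a 200 kB/s download needs, which is why capping uploads
              # just below the link rate restores download speed.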
        • Re: (Score:3, Informative)

          by mirix ( 1649853 )

          Shaw seems to be throttling torrents, in my experience at least, and in that of a few friends who are with them. It used to be faster a couple of years ago.

          Upload has always been rather pathetic with them, though... and it seems to have gotten worse over time (oversubscribing, I guess?)

      • Re:You explained it. (Score:4, Informative)

        by Runaway1956 ( 1322357 ) on Monday October 04, 2010 @04:12AM (#33782276) Homepage Journal
        That's why you use a traffic shaper on your own network. With Wondershaper, I find my real available bandwidth, subtract a small percentage, and throttle everything to that speed. With short queues, everything just moves along nicely, and high-priority packets are moved to the front. If you allow the queues to get long, with no prioritization, bandwidth comes to a crawl.

        Now, don't get me wrong here - I am not advocating that ISPs do traffic shaping. They will do it all wrong, for all the wrong reasons. If most people did the traffic shaping on their own networks, it would relieve the load on the ISP, and the ISP would have less justification for traffic shaping THEIR WAY.

        So, what does traffic shaping mean, in real life, on my own network? Without traffic shaping, any person can start a download from his desktop (or laptop), or watch a YouTube video, and bring everyone in the house to a crawl. Pages may not load at all for other people. With traffic shaping, I can start huge downloads from all five machines on the network, and all five machines can still browse. Those downloads run somewhat slower than they would have - but no one is screaming, "WHY CAN'T I LOAD MY EMAIL?? WHO IS DOWNLOADING THE ENTIRE INTERNET?"

        Try it. Put a traffic shaper on your gateway machine, or at least set up QoS on your router. The internet will never look the same again.
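
        For the curious, the "throttle a bit below your real rate" trick looks roughly like this as code - a minimal token-bucket sketch in Python, purely illustrative (Wondershaper itself is a set of Linux tc rules, not Python):

          import time

          class TokenBucket:
              # Capping throughput slightly below the true link rate keeps
              # the modem's queue short, so interactive packets aren't stuck
              # behind a long buffer of bulk traffic.
              def __init__(self, rate_bytes_per_sec, burst_bytes):
                  self.rate = rate_bytes_per_sec
                  self.capacity = burst_bytes
                  self.tokens = burst_bytes
                  self.last = time.monotonic()

              def consume(self, n):
                  # Refill for elapsed time, then wait until n tokens exist.
                  now = time.monotonic()
                  self.tokens = min(self.capacity,
                                    self.tokens + (now - self.last) * self.rate)
                  self.last = now
                  if self.tokens < n:
                      time.sleep((n - self.tokens) / self.rate)
                      self.tokens = 0
                  else:
                      self.tokens -= n

          # e.g. a 1 Mbit/s uplink shaped to ~90% so the queue stays empty
          shaper = TokenBucket(rate_bytes_per_sec=int(125000 * 0.9),
                               burst_bytes=16384)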
      • Re: (Score:3, Interesting)

        by Firethorn ( 177587 )

        Thing is, a proper bittorrent implementation would actually improve speeds on a private network. Rather than having 50-100 clients all separately contacting Microsoft for updates and downloading that 200MB set of patches, the BT system would realize they're all on the same network/subnet and have them promptly share the pieces with each other first. Without needing some sort of 'official' local patch depository server or fancy management system like SMS. Or even a caching proxy server (wh

      • Re: (Score:3, Insightful)

        At my job I torrented a Doctor Who audio file pack, so I'd have something to relieve the boredom, and the next day the IT staff was scanning my computer. I don't think torrenting will be permitted in the office. Ever.

        From the summary:
        >>>"Have you ever tried to download an operating system update only to have it fail and have to start all over?"

        Yes and it's ridiculous. Even back in the 80s we had the ZMODEM protocol on our lowly 8-bit Ataris and Commodores. If a file was interrupted the ZMODEM pr

    • Re: (Score:3, Insightful)

      by haruchai ( 17472 )

      Why the hell would it have to be in all the major browsers, when the ability to open files with external apps has been around for a decade, if not longer?
      Just so you know, there have been Firefox addons for torrents for several years, and Opera baked it right into the browser over 5 years ago.

      • by Nirvelli ( 851945 ) on Monday October 04, 2010 @12:21AM (#33781386)
        Because your everyday normal "series of tubes" people can't go to the effort of using your "external apps."
        And the only Firefox addon they have is VideoDownloadHelper, because they went on Yahoo Answers asking how they could download YouTube videos.
        • by Sir_Lewk ( 967686 ) <sirlewkNO@SPAMgmail.com> on Monday October 04, 2010 @02:26AM (#33781944)

          This. People don't seem to realize that PDF, word documents, and flash will never take off as accepted formats for the layman unless they are baked into every major web-browser.

          Wait, what?

        • Re: (Score:3, Interesting)

          by cbope ( 130292 )

          Surely you forgot about browser toolbars... the last time I cleaned my parents' computer I removed at least half a dozen damn toolbars from Firefox, several of them with very questionable intent. Remember, if Joe Sixpack visits a page and it prompts him to install something... he will very likely install it. He doesn't know he doesn't need it, most likely, because the general public has been conditioned over the years to believe that various add-ons are required for viewing certain websites and content.

          I did instru

    • Re: (Score:3, Funny)

      by kwerle ( 39371 )

      Until then it's a tool for nerds to get their porn faster.

      11+ Million World of Warcraft players can't be wrong...

      OK, the porn market is bigger than that - but the porn torrent market? I wonder.

    • Re: (Score:3, Insightful)

      by eth1 ( 94901 )

      When torrent support comes standard in all the major browsers, it can take off.

      And ISPs start offering decent upstream bandwidth?

      Right now I hate torrents because they always seem to be slower than a server with a good connection. The past few times I've been forced to use them, I've been uploading faster than I was downloading (and I do have decent upstream).

  • by pizzach ( 1011925 ) <pizzachNO@SPAMgmail.com> on Sunday October 03, 2010 @10:37PM (#33780704) Homepage
    Why aren't Linux package managers using this, instead of just leeching off of college servers and the like?
    • by compro01 ( 777531 ) on Sunday October 03, 2010 @10:50PM (#33780818)

      There was apt-torrent, but that project appears to be abandoned.

      The thing is, there is probably no pressing need. There are many educational facilities that are willing to provide mirrors for such things, so there's no real reason to implement a system that borrows users' upstream bandwidth.

      • by Splab ( 574204 ) on Monday October 04, 2010 @12:53AM (#33781548)

        This is spot on.

        WoW (and other MMOs) needs torrents because it has a very high to extremely high burst ratio. When a new patch is deployed for Linux, it needs to propagate out to the various distributions, people need to start packaging it, and then end users' auto-updaters will pick it up eventually. This means it's usually distributed over time. When WoW deploys a new patch, they have 10 million people trying to get it at once in order to be first into the new instances. Even the big distributors have trouble coping with this - and over a short period the need for bandwidth drops to very low levels as people get up to date, so there is no financial incentive for Blizzard to invest in hardware to cope with deployment.

    • by jojoba_oil ( 1071932 ) on Sunday October 03, 2010 @10:56PM (#33780864)
      Because for security updates, this allows users to find others who don't have the latest patches yet. Just imagine the people watching leecher IPs every time a new remote exploit is patched...
      • Re: (Score:3, Insightful)

        by Anpheus ( 908711 )

        There's no reason the tracker couldn't limit peer visibility such that only a few trusted seeders' IPs would be given to leechers. That is, each leecher would see an artificially low number of seeders, only ones that were trusted. The client would then intentionally not use DHT or other mechanisms to find other peers.

        For non-security or low priority updates, full tracker support could be allowed.

        • Re: (Score:3, Interesting)

          by blueg3 ( 192743 )

          That's essentially the same as not using bittorrent. If you can't see arbitrary peers, your peer-to-peer system isn't very effective.

          • by scrib ( 1277042 ) on Sunday October 03, 2010 @11:46PM (#33781216)

            Not quite... The difference is that you could download from any or ALL of the trusted peers (currently known as "mirror sites") at the same time. Seems a bit better than trying to pick from a list of mirrors that might be close to you or using the "random mirror" link. If one mirror was down or slow, it would barely be noticed on the downloader's end.

            Also, once a machine had downloaded and installed the patch, it could announce back to the tracker that it can be a seed, as it is no longer vulnerable. So the tracker would only show seeds, and the downloading system would only announce itself as a seed AFTER it had installed the patch.
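
            A minimal sketch of the tracker-side filtering described in this sub-thread (the addresses and the "security_critical" flag are made-up illustrations, not any real tracker's API):

              # For security-critical torrents the tracker hands leechers only
              # the addresses of vetted seeders; otherwise it behaves normally.
              TRUSTED_SEEDERS = {("203.0.113.10", 6881), ("203.0.113.11", 6881)}

              def announce(info_hash, peer, swarm, security_critical):
                  """Register a peer; return the peers it is allowed to see."""
                  swarm.setdefault(info_hash, set()).add(peer)
                  if security_critical:
                      # Leechers see only trusted seeders, never each other. A
                      # machine that has installed the patch could be added to
                      # this set, exactly as the parent suggests.
                      return sorted(TRUSTED_SEEDERS)
                  return sorted(swarm[info_hash] - {peer})  # full visibility

              swarm = {}
              print(announce("abc123", ("198.51.100.7", 51413), swarm, True))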

    • by buchner.johannes ( 1139593 ) on Sunday October 03, 2010 @11:04PM (#33780918) Homepage Journal

      I would like a solution that combines metalinks [wikipedia.org] (one file that contains multiple URLs for a download, plus the checksum) with Bittorrent.
      A client could start an HTTP download from one server, plus a bittorrent session that requests pieces from the later chunks. These days you can also make multiple HTTP requests with an offset, to another HTTP server or the same one.

      This could even be built magically into browsers: if the file size is > 50MB, ask the cloud whether there are nodes for the given URL. That is, provided you have a checksum, as with metalinks. Apparently metalink already features this possibility: http://www.metalinker.org/ [metalinker.org]
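
      As a rough illustration of the metalink half of that idea, here's a sketch that splits one file across several mirrors with ordinary HTTP Range requests (the mirror URLs are placeholders; a real client would fetch chunks in parallel and verify each one against the metalink checksum):

        import urllib.request

        MIRRORS = ["http://mirror-a.example.org/file.iso",
                   "http://mirror-b.example.org/file.iso"]
        CHUNK = 4 * 1024 * 1024  # 4 MiB per request

        def fetch_range(url, start, end):
            # Ask one mirror for the inclusive byte range [start, end].
            req = urllib.request.Request(
                url, headers={"Range": f"bytes={start}-{end}"})
            with urllib.request.urlopen(req) as resp:
                return resp.read()

        def download(size, path):
            with open(path, "wb") as out:
                for i, start in enumerate(range(0, size, CHUNK)):
                    end = min(start + CHUNK, size) - 1
                    # Round-robin the mirrors; a slow or dead mirror's chunks
                    # would simply be reassigned to the others.
                    out.write(fetch_range(MIRRORS[i % len(MIRRORS)], start, end))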

    • by antifoidulus ( 807088 ) on Sunday October 03, 2010 @11:13PM (#33780974) Homepage Journal
      The file sizes of most Linux packages are simply not big enough to warrant the use of bittorrent. The 32-bit x86 kernel (usually one of the biggest packages in a distro) is only about 32 megs or so. By the time you've fetched the torrent file, found your peers, and actually started downloading something, you could have had the whole package downloaded already. Most big universities and research institutions have to host the files anyway (for internal updates); it's not all that difficult to extend the download service to the general public. Not to mention the fact that for the torrent to be effective you would actually have to retain the packages after installation, which can quickly become a huge pain in the ass.....
      • by blueg3 ( 192743 ) on Sunday October 03, 2010 @11:26PM (#33781084)

        Contacting the tracker and getting an initial peer list, in a proper system, takes a fraction of a second. It's iteratively contacting peers, obtaining their piece bitmap, negotiating with them for piece exchange, and finding peers that actually have high bandwidth that makes the startup time of BitTorrent so high.

  • by dominion ( 3153 ) on Sunday October 03, 2010 @10:38PM (#33780710) Homepage

    Combine this with social networking to allow/deny access to your files and I think you've got a game changer. Files which require no server, and which are unknown/unavailable to anyone who doesn't need to know about them. I could share my mp3 collection or movie collection with only my friends list, which would be much more along the lines of fair use (like tape trading).

    • Re: (Score:3, Insightful)

      by rHBa ( 976986 )
      Isn't this what private trackers do already?

      Yes, they require a server (tracker) to limit access to members only but that functionality would just be shifted to the social networking site.

      If you're planning to do this without a tracker then how do you prevent people outside your friends list from joining the torrent (assuming they manage to find a copy of the .torrent file)?

      If you have a friends list big enough to make bittorrent worthwhile, it's quite likely that someone will leak the torrent file to s
    • by jmottram08 ( 1886654 ) on Monday October 04, 2010 @01:42AM (#33781740)
      Exactly how does this not require a server?

      Just a heads up: this "game changer" is called an FTP server. My friends and family already have access to download my files or upload whatever.

      Maybe what you want is a game changing facebook app that just manages passwords and opens a new window with your friends ftp server in it.

      Why are you trying to reinvent the wheel?

  • File size (Score:5, Insightful)

    by gringer ( 252588 ) on Sunday October 03, 2010 @10:38PM (#33780714)

    Why? Because for small files (as I expect most software updates would be), downloading directly is quicker and safer.

    • Re:File size (Score:5, Insightful)

      by Anonymous Coward on Sunday October 03, 2010 @10:46PM (#33780786)

      Why? Because for small files (as I expect most software updates would be), downloading directly is quicker and safer.

      Safer? Bittorrent already has built-in checksumming, which most people don't bother with for regular downloads anyway. By that metric alone I'd say BitTorrent is safer than a regular download.
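
      For comparison, here's the step bittorrent automates that almost nobody does by hand for a plain download - checking the file against a published checksum (the filename and digest below are placeholders):

        import hashlib

        def verify(path, expected_sha256):
            # Hash the file in 1 MiB blocks and compare against the checksum
            # published alongside the download.
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(1 << 20), b""):
                    h.update(block)
            return h.hexdigest() == expected_sha256

        print(verify("some-download.iso", "0123...abcd"))  # placeholder digest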

      • Re:File size (Score:5, Insightful)

        by Anonymous Coward on Sunday October 03, 2010 @11:09PM (#33780956)

        Because when you download directly instead of torrenting a file, you aren't basically shouting to the world "HEY I DON'T HAVE THE NEW SECURITY UPDATE YET! ANYONE HAVE THE NEW SECURITY UPDATE?"

      • Re: (Score:3, Interesting)

        by Sigma 7 ( 266129 )

        Bittorrent already has built-in checksumming, which most people don't bother with for regular downloads

        Bittorrent requires checksumming because it has to pull data from random sources, some of which may attempt to poison the torrent. It relies on the SHA-1 checksum, which isn't broken yet, but a dedicated enough individual can find a way to poison such a system.

        If an attacker manages to get enough control to manipulate HTTP downloads, he can manipulate the posted checksum as well.

        If you're worried about corruption appearing in HTTP, remember that there's already checksumming on the packet level, as wel

    • Re: (Score:3, Insightful)

      Not to mention bittorrent's tendency to saturate consumer-grade routers in a nasty way, even when not using all available bandwidth. I suspect ISP routers wouldn't handle it much better if it suddenly became the de facto standard method of transferring files.
  • Data Caps (Score:5, Informative)

    by Anonymous Coward on Sunday October 03, 2010 @10:38PM (#33780720)

    Those of us stuck in New Zealand or Australia still have data caps to think about. If every download was a torrent there would be a lot more overhead eating into our precious data caps!

    Please, think of the Kiwis.

    • Re: (Score:3, Informative)

      by AHuxley ( 892839 )
      Yes, and look at the new plans from Australian ISPs, e.g. Internode:
      http://www.internode.on.net/residential/broadband/bundles/easy_bundle/plans/ [on.net]
      "Massive 'Any Time' monthly quota - measured as the total of downloads plus uploads. "
  • Why? (Score:5, Insightful)

    by DarkKnightRadick ( 268025 ) <the_spoon.geo@yahoo.com> on Sunday October 03, 2010 @10:40PM (#33780738) Homepage Journal

    Because Bittorrent has a reputation issue, for one. The MPAA and RIAA attack it and call it the reason they are losing money (instead of their failing business model).

    Large companies don't want to deal with that hassle, and even though the load might not be much for individual computers, if everyone on a company network were bittorrenting, other traffic would be interrupted (even on 2MB DSL, bittorrent interferes with my connections to many popular IM services, and I don't even run it at full throttle during the day).

    • Re:Why? (Score:5, Informative)

      by xiando ( 770382 ) on Sunday October 03, 2010 @10:48PM (#33780806) Homepage Journal

      Because Bittorrent has a reputation issue, for one. The MPAA and RIAA attack it and call it the reason they are losing money (instead of their failing business model).

      Try running a perfectly legal BitTorrent tracker. You will find that the MPAA/RIAA criminals both DDoS your server and spam your ISP with DMCA crap about files you are not tracking and have never heard of. They really dislike BitTorrent.

      • Re:Why? (Score:5, Insightful)

        by Anonymous Coward on Monday October 04, 2010 @12:06AM (#33781310)

        Try running a perfectly legal BitTorrent tracker. You will find that the MPAA/RIAA criminals both DDoS your server and spam your ISP with DMCA crap about files you are not tracking and have never heard of. They really dislike BitTorrent.

        It's because it competes with them - not as content producers, but as distributors. If BitTorrent had a good reputation, then indie filmmakers would use it to distribute their films to customers, perhaps as encrypted files with DRM, perhaps not, but in any event in competition with distribution through official MPAA channels, where the big companies get their big cut.

    • Re:Why? (Score:5, Insightful)

      by Dayofswords ( 1548243 ) on Sunday October 03, 2010 @10:54PM (#33780844)

      MPAA said the same thing about the VCR.

      Can we go back to not giving a fuck what the MPAA thinks?

  • No (Score:4, Insightful)

    by arth1 ( 260657 ) on Sunday October 03, 2010 @10:40PM (#33780742) Homepage Journal

    No, it won't replace standard downloads, if for no other reason than that bittorrent is "best effort": there's no guarantee that the client receives a file within a certain time frame. And for small and medium files, the overhead of BT severely slows down access.
    Yes, it's useful for large files. No, it's not a 100% replacement.

    And that's the beauty of the internet in a nutshell -- there isn't one solution that fits all, but lots and lots of tools and standards that can be used and adjusted to specific needs. So stop looking for The One And Only Way.

    • "and there's no guarantee that the client receives a file within a certain time frame."

      But that applies to regular downloading, too. It depends on the server location, its connection speed, your connection speed (which isn't always constant), and how much traffic the server is getting at the time.

      • Re: (Score:3, Insightful)

        by arth1 ( 260657 )

        But you can control those factors. A VPN with a CIR, for example.

        And even when you can't control them, you can still estimate much better. A 1:1 download that's doing 150 kBps for the first five minutes from a server with plenty of bandwidth isn't likely to drop to 15 kBps for half an hour and then pick up to 300 kBps.

        If I need a large file, I look for an http download first, and only if I can't find one do I go to bittorrent. Because BT is usually going to take longer, and is always impossible to estimate

        • Re: (Score:3, Informative)

          Yeah, I know a bittorrent download is largely unpredictable, but I was just pointing out that, to an extent, so are regular downloads.

  • by davidwr ( 791652 ) on Sunday October 03, 2010 @10:40PM (#33780744) Homepage Journal

    1) because I'm a leech.
    2) because I don't want legal liability FOR DISTRIBUTING if I download a file that, unknown to me, is illegal - e.g. a software package from overseas where someone inserted illegal-in-my-country pornography into the binary. Yeah, I'll take the risk for possession, but not for distribution.
    3) because my employer's lawyer made me say #2 when it comes to company machines.
    4) because I prefer to get my bits from the official location. Yea, I know a checksum should be good enough but I'm old school here.

    Seriously though, I can see torrents overtaking web and FTP downloads as the primary method for distributing large, popular files. However, there will always be customers who refuse to share, and who refuse to get data from any source that doesn't have a reputation for quality and isn't blessed by the original publisher.

    Oh, and seriously, I'll be fine using torrents to download things like well-known Linux distros. I trust modern checksums. I probably won't use them for low-demand files or smaller files though.

    • Re: (Score:3, Informative)

      by Tom ( 822 )

      4) because I prefer to get my bits from the official location. Yea, I know a checksum should be good enough but I'm old school here.

      Actually, from a strictly security POV, a checksum plus a distributed distribution model is better, because it makes man-in-the-middle attacks considerably more difficult.

      Of course, only as long as you have a trustworthy channel to get the checksum through and actually bother to verify it.

  • World of Warcraft already uses Bittorrent technology as a way to distribute large amounts of content at a lower cost to the company and faster speeds to all of their clients

    Lower cost, for sure, but it is not faster. The fastest download is from a single server that is able to fully saturate your connection. Even better if that server sits directly within your ISP, as is the case with some content delivery networks (Limelight, I think, does this). Having to negotiate individual connections with hundreds of peers around the world, with the associated lag and protocol overhead, can't even compare.

    • Re: (Score:3, Insightful)

      by davidwr ( 791652 )

      "...that is able to fully saturate your connection."

      Yeah, like this always happens. Not.

      Scenario: 1st day of release of a new popular file.

      Either the vendor prepares well and works with content-delivery networks so you and everyone else on the planet can download the file at full speed, or it doesn't.

      If it doesn't, everyone gets throttled and/or some people are told to try again later.

      A torrent option would help distribute the load and cut out the bottleneck.

    • by kc8apf ( 89233 )

      That really only scales so far. It's actually quite difficult to saturate a 1Gbps link, let alone a 10Gbps or 100Gbps one, with a single stream. Multiple streams work around some of the problems and allow the full link to be used.

    The way I see it:
      If the protocol were improved a little and ISPs were a little smarter, then everyone wins. If the protocol allowed preferred connections to big nearby pipes (and I know that some clients try to do that) and there were a way to really relay/cache/mirror feeds (like HTTP proxies), then ISPs *could* watch for 'hot torrents' and cache them to feed them to their customers at high speed - thus reducing their out-of-network costs (because they are feeding the data themselves) and improving the

  • by gman003 ( 1693318 ) on Sunday October 03, 2010 @10:43PM (#33780762)
    That's the one real problem with BitTorrent: if nobody is seeding the file, nobody can download it. If the servers that would otherwise be hosting the data were instead used as no-limit seeders, that might make BitTorrent a more viable system for "real" downloads.
    • by talsemgeest ( 1346555 ) on Sunday October 03, 2010 @11:05PM (#33780928)
      Most modern bittorrent clients support web seeds, that is, using an HTTP-hosted file as a seed for the torrent. Add the speed from that server to the other people who are downloading and you get much better speeds than if you were simply downloading straight from the server. Add to this all the other bittorrent features, like resuming a broken download and improved error checking, and you have a very powerful downloading strategy. Just take a look at Burnbit: http://burnbit.com/ [burnbit.com], which takes a normal hosted file on the internet and turns it into a torrent. Everyone wins!
  • I gather BitTorrent can't easily be used from behind a firewall, which makes it of limited use in corporate settings at present. As well as built-in support from the major web clients, we'd also need support from the major HTTP proxy servers.

    • by xiando ( 770382 )
      If you're behind a firewall which blocks _outgoing_ connections, then you're in some environment where you are not going to download any files whatsoever anyway, and you're probably better off using snail mail. People with proxy-only access to the outside world are not relevant when it comes to downloading files beyond HTML.
      • by arth1 ( 260657 )

        People with proxy-only access to the outside world are not relevant when it comes to downloading files beyond HTML.

        Oh? I regularly download DVDs and CDs through a proxy, no problem. For non-unique downloads, it's even lightning fast because of caching.

      • Re: (Score:3, Informative)

        According to my understanding of BitTorrent, the client needs to be able to accept incoming connections as well as outgoing ones. See for example Brian's BitTorrent FAQ and Guide [dessent.net].

        Also, we use a proxy server for outgoing requests from all of our teaching labs, and we have no trouble downloading stuff. The proxy server is perfectly capable of keeping up with our internet connection. It's not as though it has to do any hard work, all it does is relay data from an incoming TCP connection to an outgoing one

  • Many years ago... (Score:3, Interesting)

    by mark-t ( 151149 ) <markt AT nerdflat DOT com> on Sunday October 03, 2010 @10:51PM (#33780822) Journal

    I recall really hoping that a new distributed file transfer protocol would become standard in browsers. For one thing, it could virtually eliminate large loads on smaller servers caused by flash crowds (more colloquially known as the slashdot effect).

    What I had envisioned is that every web client currently displaying a page would effectively act as a seed for the content (including pictures, embedded videos, etc.) that the browser has loaded from that page, for as long as the user has the page open, radically reducing the load on the webserver where the original data was hosted whenever a lot of people want to see the content at the same time.

    Of course, it never happened.

    • I recall really hoping that a new distributed file transfer protocol would become standard in browsers. For one thing, it could virtually eliminate large loads on smaller servers caused by flash crowds (more colloquially known as the slashdot effect).

      Why does everyone seem to want everything in the fscking browser? Applications already grow into huge bloated messes until they can send mail. What is wrong with having the browser open filetypes in some preferred stand-alone application that does the job and does it well? I really don't see the point of having some poor joke of a BitTorrent client built into my web browser when there are so many good stand-alone apps readily available.

  • A startup I know of started out using peer-to-peer, but it was too much grief to get people to download a plugin and then set up port forwarding through their firewall, and at the price of CDNs these days you're just not saving enough money for it to be worthwhile.

    Now, when we get IPv6 and HTML5, perhaps it will be a different game (no NAT in IPv6, no need for port forwarding).

    In the case of a game, your users have already downloaded stuff, and you can convince a fair chunk of them to set it up.

    Twitter uses it to push patches to their servers in 12 seconds instead of 10 minutes.

    So it is part of the future.

  • There's a place for direct downloads (HTTP, whatever), but more "aboveboard" use of BitTorrent seems like a great idea; it might help if it weren't seen as "mainly a pirate toy". :P

  • by drsquare ( 530038 ) on Sunday October 03, 2010 @10:59PM (#33780886)

    In WoW I have to disable bittorrent if I actually want to download a patch. Otherwise it saturates my connection with upload data whilst only downloading at 1% of my max speed.

    Blizzard use bittorrent simply because they're cheap. Instead of using their millions in profits to provide bandwidth, they make the players smash their quotas sending data to each other. I had to install a bandwidth limiter to get Wrath of the Lich King to install because otherwise the outrageous upload speeds stopped me actually downloading anything. You'd think $15 a month would be enough to pay for enough bandwidth to allow me to download the game I've just paid for, but no they have to chase every penny...

    If you become a seed for a popular file, you can peg your upload bandwidth. If your upload bandwidth is fairly small (most users in the US probably still have 1.5/384 or even 512/128) and you are trying to download something at the same time over TCP (HTTP, FTP, etc.), the upload will clobber a lot of the ACKs that the download session is trying to send, and the download bandwidth will get clobbered as well.

      You can work around this with QoS to some extent. Some cheap-ass DSL routers might now or soon even s

  • by RichMan ( 8097 ) on Sunday October 03, 2010 @11:03PM (#33780912)

    Most houses have more than one PC. It is stupid for them all to separately download the same patches from the source.
    How about an option to share patch downloads across a local network?
    Nominate one machine as the master; all the other machines then check with the master for their patches.
    The master is responsible for contacting the source.
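
    A sketch of that idea (the host names and paths are made up for illustration): clients try the nominated master first and fall back to the vendor, while the master itself could be as simple as "python -m http.server" run in its patch cache directory.

      import urllib.request

      LAN_MASTER = "http://patch-master.local:8000"   # the nominated machine
      VENDOR = "http://updates.example.com"           # the real source

      def fetch_patch(name):
          # Try the LAN master first; fall back to the vendor if the master
          # is offline or doesn't have the file yet.
          for base in (LAN_MASTER, VENDOR):
              try:
                  with urllib.request.urlopen(f"{base}/{name}", timeout=5) as r:
                      return r.read()
              except OSError:  # connection refused, timeout, 404, DNS failure
                  continue
          raise RuntimeError(f"{name} unavailable from all sources")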

  • Bittorrent does not work well with NAT. And pretty much every end-user network employs NAT these days. Therefore, only the nerds who know how to configure their routers will use bittorrent... until NAT dies the miserable death it deserves.

  • Setup and Teardown (Score:5, Insightful)

    by The Raven ( 30575 ) on Sunday October 03, 2010 @11:15PM (#33780990) Homepage

    Bittorrent is great for very large files, and popular files.

    But for small files it's really, really bad. Many Linux updates involve downloading hundreds of small files, not one big one. Most applications are so small that the setup and teardown time for bittorrent would dwarf the download time. Any download that takes less than 5 seconds will likely give a smoother user experience if it is not done using bittorrent.

    Even ignoring tiny files, there is the issue of bandwidth limited users, the significantly higher routing requirements of bittorrent (many home routers flake out when you get 50+ TCP connections going through them), users with heavily asymmetrical connections (5Mbit down/256kbit up), and the more complicated configuration required to get a good bittorrent connection.

    In short, bittorrent is nice for its niche (large, popular files), but outside that niche it is often not the best solution. Wider deployment of bittorrent technology would probably help some places, but it's not a silver bullet for all Internet downloads.

  • Multicast? (Score:3, Interesting)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Sunday October 03, 2010 @11:27PM (#33781096) Journal

    Why has no one mentioned this?

    • Re:Multicast? (Score:4, Informative)

      by Anonymous Coward on Monday October 04, 2010 @03:42AM (#33782174)

      Why has no one mentioned this?

      Because multicast doesn't work in practice.

      Because almost every gateway router drops multicast packets.

      Because multicast is only efficient if more than one recipient on the same subnet is downloading the file at the same time.

      Because synchronizing the assembly of multicast downloads that were initiated at different times (as in, 1 second apart) would require as much work as implementing the bittorrent protocol.

      Honestly, multicast was only really thought out for machines sharing a private network. It wasn't intended for internet-style applications.

  • by Type44Q ( 1233630 ) on Sunday October 03, 2010 @11:36PM (#33781146)
    I've noticed a marvelous use for Bittorrent that I'm not sure everyone knows about; it's especially useful for those with limited bandwidth or download caps:

    Say you've got a CD or DVD that's scratched, or an .iso you spent forever downloading via ftp and discovered to your dismay was corrupted. Assuming a bit-identical image is available online via .torrent, you can 'repair' your data without having to download the whole thing all over again:

    Start your bittorrent app and begin downloading a new copy of the image you need. Immediately stop the download and exit your bittorrent app. An .iso file (incomplete, of course) will have been created in the destination folder.

    Now rip your [damaged] disc to hard drive, creating an [obviously corrupted] .iso. Copy/paste that .iso into bittorrent's download folder, overwriting the existing .iso.

    Fire up bittorrent and begin your download once again. Bittorrent will analyze the corrupted .iso and immediately download just the bits needed to repair (i.e. complete) it. In most cases this will only take a few seconds, even over dial-up, due to the insignificant amount of data usually needed (except, of course, in the event of a heavily scratched disc, which can also take a long time to rip in the first place; having a high-quality optical drive with good firmware and good optics certainly couldn't hurt).
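
    The re-check step that makes this trick work: a .torrent stores one SHA-1 hash per fixed-size piece, so the client can tell exactly which pieces of your ripped .iso are damaged and fetch only those. A sketch (the piece size and hashes would come from the actual .torrent file):

      import hashlib

      def damaged_pieces(path, piece_len, piece_hashes):
          # Compare each piece of the local file against the torrent's
          # per-piece SHA-1 digests; return the indices needing refetch.
          bad = []
          with open(path, "rb") as f:
              for i, expected in enumerate(piece_hashes):
                  if hashlib.sha1(f.read(piece_len)).digest() != expected:
                      bad.append(i)
          return bad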

  • Swarm tracking (Score:4, Interesting)

    by Ziekheid ( 1427027 ) on Sunday October 03, 2010 @11:36PM (#33781148)

    No thanks. There was some research recently on how easy it is to track users in swarms. As soon as you're in the swarm, you can learn the IP of every other peer transferring those files (depending on tracker usage, of course). It's easy to compile a list of IP addresses and the content they downloaded over time.

    I like my privacy, and I have no intention of letting people know what software I'm downloading.
    And as stated before, it's a security risk too. This doesn't only apply to software updates; it applies to any software that is downloaded.

    For example: there's an outdated version of some application still hosted on download.com's tracker, and I'm someone who knows of a vulnerability in it. I join the swarm, collect all the IPs, and eventually just exploit them as I go.
    Hell, I don't even have to scan entire ranges for this application's port anymore!

  • by SuperKendall ( 25149 ) on Sunday October 03, 2010 @11:58PM (#33781260)

    Content delivery networks already solve a lot of the issues that bittorrent addresses - you can distribute large files without consuming a huge amount of backbone bandwidth, using a lot of regional servers.

    They also help with some other things:

    1) A guaranteed level of reliable local service.
    2) Customers don't know who each other are, which is a data-privacy issue (say, I notice someone at IP 4.5.6.7 is downloading this particular security patch).
    3) Security (yes, I know torrents are checksummed, but it's not impossible to defeat).

    But basically, it's all about a known level of quality for customers, which CDNs deliver and which is more of a case-by-case thing for torrents.

    Also, some customers could be angry that companies are using their bandwidth to send files to other people - I've been surprised that Blizzard gets away with it with as little complaint as they do.

  • DHT Tracker (Score:3, Informative)

    by pgn674 ( 995941 ) on Sunday October 03, 2010 @11:59PM (#33781264) Homepage

    DHT doesn't even require a tracker

    I thought DHT did require a centralized server, called a bootstrap node?

  • by lennier ( 44736 ) on Monday October 04, 2010 @12:20AM (#33781376) Homepage

    Why don't we have a generic TCP/IP transfer protocol which caches things at every hop they pass through?

    That way, if a million people download a file, it gets uploaded once from the server to that server's ISP, stored once at that ISP, transferred once from that ISP to every other ISP that requests it, stored once at each of those, and then transferred once from each ISP to every LAN that requests it.

    You know, the way Usenet used to work and still could if anyone bothered to resurrect it.

    Seems like this would be the sensible, distributed, long-term solution to file distribution?

  • by m.dillon ( 147925 ) on Monday October 04, 2010 @12:24AM (#33781400) Homepage

    Both FTP and HTTP can fetch at offsets other than 0, and FTP at least has been able to do that for well over two decades. I haven't had to start a download over in a long, long time.

    -Matt
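
    For anyone curious, fetching at an offset over HTTP looks roughly like this - the same trick 'wget -c' uses (a sketch; the url and path are whatever you're downloading):

      import os
      import urllib.request

      def resume(url, path):
          # Ask the server for everything past what we already have on disk.
          have = os.path.getsize(path) if os.path.exists(path) else 0
          req = urllib.request.Request(url, headers={"Range": f"bytes={have}-"})
          with urllib.request.urlopen(req) as resp:
              # 206 = server honoured the Range header; 200 = it ignored it
              # and is resending the whole file, so start over from byte 0.
              mode = "ab" if resp.getcode() == 206 else "wb"
              with open(path, mode) as out:
                  while True:
                      block = resp.read(1 << 16)
                      if not block:
                          break
                      out.write(block)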

    • by Anonymous Coward on Monday October 04, 2010 @03:57AM (#33782220)

      Yeah, but the built-in download manager in Firefox sucks. Yes, it *can* resume... if you paused the download first. Even when it's been told the file size, if your connection dies in the middle (which mine too often does), it thinks it's "done" (which is absurd, because wget reports an error code in the same situation) and you can't resume from within Firefox any more. Oh! I got an RST packet, that must mean the last 300 MB transferred instantly!

      I gave up on that piece of crap and used an add-on to make Firefox use wget, which at least has the decency to know when a file has *actually* been fully downloaded, rather than giving up and deciding it's good enough to hand me a useless half file and no error message at all. Seriously, have the devs never tested large downloads on a link that dies? It's not even hard to test: you can emulate the connection dying by pulling the damn ethernet cable. Firefox just ignores the error and continues blindly. Does it really think that the other server going silent is an indicator that the file has been fully downloaded, or what?

  • by amaiman ( 103647 ) on Monday October 04, 2010 @08:44AM (#33783410) Homepage
    One issue is that customers may not want to give away their bandwidth to the companies they are paying for a service. Game patches are a good example: the player pays a monthly fee for access to the game, and that fee should be paying for the bandwidth used to download patches. Why should the customer have to give his upstream bandwidth to other players trying to download the patch? The server load and cost issues for the game company are not his problem.

    I've encountered several "downloaders" that load themselves into Startup and will proceed to seed the game or patch that you just downloaded indefinitely, stealing your bandwidth. The only way to stop it is to kill the task and manually remove the program that's seeding the content. At the very least, seeding the completed download needs to be opt-in, not opt-out. That would break bittorrent distribution unless there were dedicated seeds, of course - but the source company should be the primary seed anyway.
  • by synthesizerpatel ( 1210598 ) on Monday October 04, 2010 @12:00PM (#33784770)

    When my HTTP or FTP transfers fail, I just use 'wget -c' to continue them. No need to switch to torrents.

  • No. (Score:3, Insightful)

    by SmallFurryCreature ( 593017 ) on Monday October 04, 2010 @12:18PM (#33784994) Journal

    Bittorrent has its uses. It can help offload some of the traffic to your customers, so you increase bandwidth without having to pay for it.

    BUT there are some HUGE downsides:

    A: You cannot rely on your customers to host your data, meaning you still have to keep a copy of the data AND serve it at expected speeds.

    B: Customers might come when nobody else wants the file, meaning you are STILL providing all the bandwidth. Customers have little motivation to seed your content.

    C: Many customers will not have the right setup to share data, due to either firewall restrictions or ISP limitations.

    D: You need to bake bittorrent into your application (like WoW and other games do) or face endless questions from customers who are barely able to download in the first place.

    E: Some content you don't want shared. How can you watermark content and tie it to a user if every user has the same file? Blizzard don't care who gets their patches, since only legit users can play the game anyway, and Linux torrents are of course free to start with.

    F: For small files, bittorrent costs too much overhead. If I share a million MP3s, the chances of finding anyone else with the same file, willing and able to share it, are tiny, and the overhead will be more than the saved bandwidth.

    So, this question is asked by a person who clearly hasn't understood the web, users, copyright or usability.
