Bittorrent To Replace Standard Downloads?
Max Sayre writes "Have you ever tried to download an operating system update only to have it fail and have to start all over? What about patches for your favorite games? World of Warcraft already uses Bittorrent technology as a way to distribute large amounts of content at a lower cost to the company and faster speeds to all of their clients. So why haven't they replaced the standard downloading options built into any major OS? Companies like Opera are including the downloading of torrents in their products already and extensions have been written for Firefox to download torrents in-browser. Every day Bittorrent traffic is growing. Sites like OpenBittorrent already exist and DHT doesn't even require a tracker. So why isn't everyone doing it? Is it finally time to see all downloads replaced with Bittorrent?"
You explained it. (Score:4, Insightful)
When torrent support comes equipped on all the major browsers, it can take off.
Until then it's a tool for nerds to get their porn faster.
Re:You explained it. (Score:4, Interesting)
Hell, I'm on Shaw Cable in Canada, and if I don't limit my upload bandwidth to 5 kb/s, my download bandwidth drops to sub-50 kb/s. But if I do limit it to 5 kb/s, then download speeds go way up to over 200 kb/s.
And yes, they advertise that I should be getting an order of magnitude greater speed than this...
Re:You explained it. (Score:5, Informative)
Your acknowledgement packets probably aren't getting through.
http://www.benzedrine.cx/ackpri.html [benzedrine.cx]
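For reference, the technique at that link boils down to a few lines of pf configuration. The sketch below uses the old OpenBSD pf/ALTQ syntax; the interface name and the 640Kb uplink figure are assumptions you would adjust to your own line.

```
# Give empty (ACK-only) TCP packets strict priority over bulk upload.
ext_if="dsl0"
altq on $ext_if priq bandwidth 640Kb queue { q_pri, q_def }
queue q_pri priority 7
queue q_def priority 1 priq(default)

# When two queues are listed, pf assigns ACK-only packets to the
# second (higher-priority) queue and everything else to the first.
pass out on $ext_if proto tcp from $ext_if to any flags S/SA keep state queue (q_def, q_pri)
pass in  on $ext_if proto tcp from any to $ext_if flags S/SA keep state queue (q_def, q_pri)
```

With ACKs no longer stuck behind BitTorrent upload traffic, downloads stop collapsing the way the grandparent describes.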
Re:You explained it. (Score:4, Insightful)
Some people don't understand the fundamentals of TCP's congestion control/flow control. BitTorrent is a very greedy, selfish, egocentric, abusive (keep going with the adjectives) algorithm. It takes advantage of TCP's fairness mechanism and uses it to abuse the rest of the users. While people are excited about its performance when selfishly downloading data, the widespread use of these types of algorithms may lead to unusable networks. In particular, because no queue management is enforced and marking mechanisms are not used by default, routers will drop packets and the end effect is a large number of retransmitted packets.
As the parent points out about what the grandparent states, the greed of such protocols even degrades throughput by starving the acknowledgement packets.
Re: (Score:3, Informative)
Shaw seems to be throttling torrents, from my experience at least, and a few friends with them. Used to be faster a couple years ago.
Upload has always been rather pathetic with them though... and seems to have gotten worse over time (over subscribing I guess?)
Re: (Score:3, Interesting)
Thing is, a proper bittorrent implementation would actually improve speeds on a private network. That way, rather than having 50-100 clients all contacting Microsoft for updates and downloading that 200MB set of patches, while they do their contact, the BT system realizes they're all on the same network/subnet and they promptly share them all with each other first. Without needing some sort of 'official' local patch depository server or fancy management system like SMS. Or even a caching proxy server (wh
Re: (Score:3, Insightful)
At my job I torrented a Doctor Who audiofile pack, so I'd have something to relieve the boredom, and the next day the IT Staff was scanning my computer. I don't think torrenting will be permitted in the office. Ever.
From the summary:
>>>"Have you ever tried to download an operating system update only to have it fail and have to start all over?"
Yes and it's ridiculous. Even back in the 80s we had the ZMODEM protocol on our lowly 8-bit Ataris and Commodores. If a file was interrupted the ZMODEM pr
Re: (Score:3, Insightful)
Why the hell would it have to be in all the major browsers, when the ability to open files with external apps has been around for a decade, if not longer?
Just so you know, there have been Firefox addons for torrents for several years, and Opera baked it right into the browser over 5 years ago.
Re:You explained it. (Score:4, Insightful)
And the only Firefox addon they have is VideoDownloadHelper, because they went on Yahoo Answers asking how they could download YouTube videos.
Re:You explained it. (Score:5, Insightful)
This. People don't seem to realize that PDF, word documents, and flash will never take off as accepted formats for the layman unless they are baked into every major web-browser.
Wait, what?
Re: (Score:3, Interesting)
Surely you forgot about browser toolbars... the last time I cleaned my parents' computer I removed at least half a dozen damn toolbars from Firefox, several of them with very questionable intent. Remember, if Joe Sixpack visits a page and it prompts them to install something... they will very likely install it. They don't know they don't need it, most likely, because the general public has been conditioned over the years to believe that various add-ons are required for viewing certain websites and content.
I did instru
Re: (Score:3, Informative)
Perhaps you'd like to give this a try?
http://www.fireaddons.com/downloads/ [fireaddons.com]
Re: (Score:3, Funny)
Until then it's a tool for nerds to get their porn faster.
11+ Million World of Warcraft players can't be wrong...
OK, the porn market is bigger than that - but the porn torrent market? I wonder.
Re: (Score:3, Insightful)
When torrent support comes equipped on all the major browsers, it can take off.
And ISPs start offering decent upstream bandwidth?
Right now I hate torrents because it always seems to be slower than a server with a good connection. The past few times I've been forced to use it, I've been uploading faster than I was downloading (I do have decent upstream).
Re: (Score:3, Insightful)
Sure, that's what resume is for... HTTP supports it, FTP supports it...
What i don't like are sites which force you to download files within the browser (which are designed for browsing, and generally have very poor download functions) instead of just presenting a url which you can cut+paste to wget.
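For what it's worth, the resume support mentioned above takes nothing more than an HTTP Range header. A minimal Python sketch (the function names are made up for illustration, and it assumes the server honours Range requests, as most static-file servers do):

```python
import os
import urllib.request

def range_header(offset):
    """Ask the server for bytes from `offset` to the end of the file."""
    return {"Range": f"bytes={offset}-"}

def resume_download(url, dest):
    """Resume a partial HTTP download from wherever it was interrupted."""
    have = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url, headers=range_header(have))
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
        # 206 Partial Content means the server resumed; a plain 200 means
        # it ignored Range and restarted from byte 0, so discard what we
        # had rather than produce a corrupt file.
        if resp.status == 200 and have:
            out.seek(0)
            out.truncate()
        while chunk := resp.read(64 * 1024):
            out.write(chunk)
```

This is exactly what wget does with `-c`, which is why a cut-and-pasteable URL beats an in-browser download widget.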
Re: (Score:3, Interesting)
Torrent itself, probably not. There's nothing in P2P that's inherently evil.
But there are several ISPs out there already using DPI to specifically throttle/downgrade the service for certain protocols. Downloading through BitTorrent is a *slower* download for me than HTTP or FTP, even when connecting to a server that can't push more than 1mbit. I'd be seriously peeved if things like OS patches went BitTorrent, because I'd never be able to get them downloaded/installed the day they release.
Re: (Score:3, Interesting)
Have you enabled encryption? MSE (Message Stream Encryption) is standard on most torrent clients, but most clients have it disabled by default. In uTorrent I enable MSE and reject all non-encrypted packets & requests.
With MSE, an ISP can no longer simply shape based on protocol. Bittorrent uses a random port, which makes shaping based on port equally ineffective.
Re: (Score:3, Interesting)
Encryption doesn't do shit against DPI. The encryption is application-level... Level 7 in the OSI model. DPI can go up to level 7 to find out what a packet is for. It doesn't give a damn what the data is, it only cares what application the data is for. Even if it's encrypted, believe me when I tell you that DPI can figure out enough to know whether it should be throttled.
Besides, even if it couldn't figure out what application the data was for, using stunnel/ssl or such, why not simply set the DPI to thrott
The bigger question is: (Score:4, Interesting)
Re:The bigger question is: (Score:5, Informative)
There was apt-torrent, but that project appears to be abandoned.
The thing is probably that there is no pressing need. There are many educational facilities that are willing to provide mirrors for such things, so there's no real reason to implement a system to borrow users' upstream bandwidth.
Re:The bigger question is: (Score:5, Interesting)
This is spot on.
WoW (and other MMOs) needs torrents, because they have a very high to extremely high burst ratio. When a new patch is deployed for Linux it needs to propagate out to the various distributions, people need to start packaging it, and then end users' auto-update will pick it up eventually. This means it's often distributed over time. When WoW deploys a new patch, they have 10 million people trying to get it at once in order to be first into the new instances. Even the big distributors have trouble coping with this, and over a short period of time the need for bandwidth will drop to very low levels as people get up to date, so there is no financial incentive for Blizzard to invest in the hardware to cope with deployment.
Re:The bigger question is: (Score:5, Insightful)
But if the repositories were themselves seeding, then it'd work just fine: Worst case is that it's still at least as fast as HTTP or FTP from the same repository (plus or minus some BT overhead), all else being the same.
Best case is that there's several repositories all seeding the same basic set of random apps, plus a bunch of users who have already downloaded the random app, and things turn both faster and cheaper than they otherwise would have been.
The hash checks performed by BT will do well to prevent errors and/or poisoned apps, as well.
Sounds like a win to me.
Re: (Score:3, Insightful)
The hash checks performed by BT will do well to prevent errors ...
Please, please tell this to all the dipshits who post torrents of RAR archives.
Re: (Score:3, Insightful)
It's perfectly legal to download copyrighted material with the permission of the copyright holder (unless it breaks any other laws) in every country.
Why would the RIAA or anybody else want to poison Linux updates?
Because they, point-blank, would like to see ALL downloading of ANY files that are not expressly approved BY THEM to be made illegal and/or blocked by ISPs under government mandate. Yes, that's how they think. Do they care about Linux updates, specifically? Of course not: but the media cartel is all about banning entire technologies (cassette tape, DAT, writeable CDs and DVDS, the VCR, you-name-it.) If it can be used to copy entertainment data they feel they have the right to eliminate it, and should there
Re: (Score:3, Interesting)
Yes you can, just get apt-spy.
Re: (Score:3, Insightful)
Try it before declaring it's "alive" or "dead". It's "Schrödinger's cat" until you actually try it. I tried the numerous torrent-to-apt interfaces about two months ago and only got one working, forget which. The best-case scenario was finding single-digit numbers of seeders, with performance roughly equal to ye olden dialup days. Needless to say, after a couple of days I dropped it, which I'm sure further lowered the network performance by a significant fraction.
The "problem" is Debian has hundreds of very fast mirrors. S
Re: (Score:3, Insightful)
There's no reason the tracker couldn't limit the peer visibility such that only a few trusted seeder's IPs would be given to leechers. That is, each leecher would see an artificially low number of seeders, only seeders that were trusted. The client would then intentionally not use DHT or other mechanisms to find other peers.
For non-security or low priority updates, full tracker support could be allowed.
Re: (Score:3, Interesting)
That's essentially the same as not using bittorrent. If you can't see arbitrary peers, your peer-to-peer system isn't very effective.
Re:The bigger question is: (Score:4, Insightful)
Not quite... The difference is that you could download from any or ALL of the trusted peers (currently known as "mirror sites") at the same time. Seems a bit better than trying to pick from a list of mirrors that might be close to you or using the "random mirror" link. If one mirror was down or slow, it would barely be noticed on the downloader's end.
Also, once a machine downloaded and installed the patch it could then announce back to the tracker that it can be a seed as it is no longer vulnerable. So, the tracker would only show seeds, and the downloading system would only announce that it was a seed AFTER it installed the patch.
Re: (Score:3, Interesting)
Not that it matters... people usually just disable GPG checking or force install, when the signature check fails. Or they don't bother to check the "signature that's essentially 'impossible' to fake" before installing the tarball, anyways.
Care to cite your source for that? I work for a company with a very large base of customers running various distros (mostly Ubuntu and Debian), and this is not the behavior I see at all.
Re:The bigger question is: (Score:5, Interesting)
I would like a solution that combines metalinks [wikipedia.org] (one file that contains multiple urls for a download plus the checksum) with Bittorrent.
A client could start an HTTP download from one server, and a bittorrent session that requests pieces for the later chunks. You can also make multiple HTTP requests with an offset these days, on another HTTP server or the same one.
This could even be built magically into browsers: if the file size is > 50MB, ask the cloud whether there are nodes for the given url. That is, provided you have a checksum like with metalinks. Apparently metalink already features this possibility: http://www.metalinker.org/ [metalinker.org]
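A minimal sketch of the multi-source idea in Python. The function names are hypothetical, and it assumes every mirror serves an identical file and honours Range requests, which is exactly what a metalink asserts:

```python
import hashlib
import urllib.request

def plan_ranges(size, n_parts):
    """Split `size` bytes into n_parts contiguous (start, end) byte
    ranges, end inclusive, suitable for HTTP Range headers."""
    step = -(-size // n_parts)  # ceiling division
    return [(s, min(s + step, size) - 1) for s in range(0, size, step)]

def fetch_from_mirrors(mirrors, size, sha256hex):
    """Metalink-style sketch: pull one byte range per mirror, reassemble,
    then verify against the published whole-file checksum."""
    parts = plan_ranges(size, len(mirrors))
    blob = bytearray()
    for url, (start, end) in zip(mirrors, parts):
        req = urllib.request.Request(
            url, headers={"Range": f"bytes={start}-{end}"})
        with urllib.request.urlopen(req) as resp:
            blob += resp.read()
    assert hashlib.sha256(blob).hexdigest() == sha256hex, "checksum mismatch"
    return bytes(blob)
```

A real client would retry failed ranges against other mirrors and hand the slow chunks to BitTorrent peers, but the range-splitting above is the core of it.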
Re:The bigger question is: (Score:5, Insightful)
Contacting the tracker and getting an initial peer list, in a proper system, takes a fraction of a second. It's iteratively contacting peers, obtaining their piece bitmap, negotiating with them for piece exchange, and finding peers that actually have high bandwidth that makes the startup time of BitTorrent so high.
Take it a step further... (Score:5, Interesting)
Combine this with social networking to allow/deny access to your files and I think you've got a game changer. Files which require no server, and which are unknown/unavailable to anyone who doesn't need to know about them. I could share my mp3 collection or movie collection with only my friends list, which would be much more along the lines of fair use (like tape trading).
Re: (Score:3, Insightful)
Yes, they require a server (tracker) to limit access to members only but that functionality would just be shifted to the social networking site.
If you're planning to do this without a tracker then how do you prevent people outside your friends list from joining the torrent (assuming they manage to find a copy of the
If you have friends list big enough to make bittorrent worth while it's quite likely that someone will leak the torrent file to s
Re:Take it a step further... (Score:4, Insightful)
Just a heads up, this "game changer" is called an FTP server. My friends and family already have access to download my files or upload whatever.
Maybe what you want is a game changing facebook app that just manages passwords and opens a new window with your friends ftp server in it.
Why are you trying to reinvent the wheel?
File size (Score:5, Insightful)
Why? because for small files (as I expect most software updates would be), downloading directly is quicker and safer.
Re:File size (Score:5, Insightful)
Why? because for small files (as I expect most software updates would be), downloading directly is quicker and safer.
Safer? Bittorrent already has built in checksumming which most people don't do with regular downloads anyways. By that metric alone I'd say the BitTorrent is safer than a regular download.
Re:File size (Score:5, Insightful)
Because when you download directly instead of torrenting a file, you aren't basically shouting to the world "HEY I DON'T HAVE THE NEW SECURITY UPDATE YET! ANYONE HAVE THE NEW SECURITY UPDATE?"
Re:File size (Score:5, Informative)
AC knew what he was talking about. Let me spell it out since you (and the guy who modded you Insightful) clearly don't.
IPs in a swarm are visible to anyone who can join the swarm. If you use it for security updates, you are implicitly announcing (a) the security update in question, and (b) how to join the swarm. Q.E.D., most people attached to a swarm who are not yet seeding (and possibly many of those who are seeding) do not have the update installed and are publishing this along with their IP for anyone on the internet to see.
Re: (Score:3, Interesting)
Bittorrent already has built in checksumming which most people don't do with regular downloads
Bittorrent requires checksumming because it has to pull data from random sources, some of which may attempt to poison the Torrent. It relies on the SHA-1 checksum, which isn't broken yet, but a dedicated enough individual can find a way to poison such a system.
If an attacker manages to get enough control to manipulate HTTP downloads, he can also manipulate the posted checksum as well.
If you're worried about corruption appearing in HTTP, remember that there's already checksumming on the packet level, as wel
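For what it's worth, BitTorrent's per-piece checking is simple to sketch in Python. The piece size below is just a common choice, not something the protocol mandates:

```python
import hashlib

PIECE_LEN = 256 * 1024  # 256 KiB, a typical piece size (assumption)

def piece_hashes(data, piece_len=PIECE_LEN):
    """Per-piece SHA-1 digests, as stored in a .torrent's `pieces` field.
    A client discards any received piece whose hash doesn't match, which
    is what blocks poisoned data from random peers."""
    return [hashlib.sha1(data[i:i + piece_len]).digest()
            for i in range(0, len(data), piece_len)]
```

Poisoning the swarm therefore means producing a SHA-1 collision against a specific published digest, which is a much taller order than corrupting a single unauthenticated HTTP stream.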
Data Caps (Score:5, Informative)
Those of us stuck in New Zealand or Australia still have data caps to think about. If every download was a torrent there would be a lot more overhead eating into our precious data caps!
Please, think of the Kiwis.
Re: (Score:3, Informative)
http://www.internode.on.net/residential/broadband/bundles/easy_bundle/plans/ [on.net]
"Massive 'Any Time' monthly quota - measured as the total of downloads plus uploads. "
Re:Data Caps (Score:5, Funny)
Competition, man. It solves all. There's probably hundreds, if not thousands, of ISPs in his area alone that offer an unlimited data plan. They're just invisible!
Re: (Score:3, Funny)
Adam Smith did a coin trick at an Economics summit way back when. After taking the Prime Minister's gold piece in one hand and making it appear in another, then sending it back to the other hand without apparently moving it; he opened both hands and the coin was gone. The Prime Minister was amused, and after a few minutes, the Prime Minister politely asked for his coin back. Adam Smith replied that it was in his invisible hand, and if only he could locate that hand, could he actually return it.
The rest i
Re: (Score:3, Funny)
Everybody's always picking on Ayn Rand.
Well, she just needs to man up and deal with it.
Re: (Score:2)
So get another ISP! There's always thousands of ISPs in all areas of the world. Competition ensures that at least some of them follow good practices. Get with the times, man.
Yeah. Right. Like anybody is willing to move to another continent to get a faster internet connection. You may not be aware of this, but New Zealand and Australia are actually islands.
Why? (Score:5, Insightful)
Because Bittorrent has a reputation issue, for one. The MPAA and RIAA attack it and call it the reason they are losing money (instead of their failing business model).
Large companies don't want to have to deal with the previous hassle, and even though the load might not be much for individual computers, if everyone on a company network was bittorrenting, other traffic would be interrupted (even on 2MB DSL, bittorrent interferes with my connections to many popular IM services and I don't even run it full throttle during the day).
Re:Why? (Score:5, Informative)
Because Bittorrent has a reputation issue, for one. The MPAA and RIAA attack it and call it the reason they are losing money (instead of their failing business model).
Try running a perfectly legal BitTorrent tracker. You will find that the MPAA/RIAA criminals both DDOS your server and spam your ISP with DMCA crap regarding files you are not tracking and never heard of. They really dislike BitTorrent.
Re:Why? (Score:5, Insightful)
Try running a perfectly legal BitTorrent tracker. You will find that the MPAA/RIAA criminals both DDOS your server and spam your ISP with DMCA crap regarding files you are not tracking and never heard of. They really dislike BitTorrent.
It's because it competes with them. Not as content producers, as distributors. If BitTorrent had a good reputation then indie filmmakers would use it to distribute their films to customers, perhaps as encrypted files with DRM, perhaps not, but in any event in competition with distributing them through official MPAA channels where the big companies get their big cut.
Re:Why? (Score:5, Insightful)
MPAA said the same thing about the VCR.
Can we go back to not giving a fuck what the MPAA thinks?
No (Score:4, Insightful)
No, it won't replace standard downloads, if nothing else because bittorrent is "best effort", and there's no guarantee that the client receives a file within a certain time frame. And for small and medium files, the overhead of BT severely slows down the access.
Yes, it's useful for large files. No, it's not a 100% replacement.
And that's the beauty of internet in a nutshell -- there isn't one solution that fits all, but lots and lots of tools and standards that can be used and adjusted to the specific needs. So stop looking for The One And Only Way.
Re: (Score:2)
"and there's no guarantee that the client receives a file within a certain time frame."
But that applies to regular downloading, too. It depends on the server location, its connection speed, your connection speed (which isn't always constant), and how much traffic the server is getting at the time.
Re: (Score:3, Insightful)
But you can control those factors. A VPN with a CIR, for example.
And even when you can't control it, you still can estimate much better. A 1:1 download that's doing 150 kBps for the first five minutes from a server with plenty of bandwidth isn't likely to drop to 15 kBps for half an hour and then pick up to 300 kBps.
If I need a large file, I look for a http download first, and only if I can't find that do I go to bittorrent. Because BT is usually going to take longer, and is always impossible to estimate
Re: (Score:3, Informative)
Yeah, I know a bittorrent download is largely unpredictable, but I was just pointing out that, to an extent, so are regular downloads.
Why not? Here are some reasons... (Score:4, Interesting)
1) because I'm a leech.
2) because I don't want legal liability FOR DISTRIBUTING if I download a file that unknown to me is illegal, e.g. a software package from overseas that someone inserted illegal-in-my-country pornography into the binary. Yeah, I'll take the risk for possession but not for distribution.
3) because my employer's lawyer made me say #2 when it comes to company machines.
4) because I prefer to get my bits from the official location. Yea, I know a checksum should be good enough but I'm old school here.
Seriously though, I can see torrents overtaking web- and ftp- downloads as the primary method for distributing large, popular files. However, there will always be customers who refuse to share and who refuse to get data from any source that doesn't have a reputation for quality and isn't blessed by the original publisher.
Oh, and seriously, I'll be fine using torrents to download things like well-known linux distros. I trust modern checksums. I probably won't use them for low-demand files or smaller files though.
Re: (Score:3, Informative)
4) because I prefer to get my bits from the official location. Yea, I know a checksum should be good enough but I'm old school here.
Actually, from a strictly security-POV, a checksum and a distributed distribution model is better, because it makes man-in-the-middle attacks considerably more difficult.
Of course, only as long as you have a trustworthy channel to get the checksum through and actually bother to verify it.
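A sketch of that verification step in Python. The function name is illustrative, and the expected checksum is assumed to arrive over a trustworthy channel such as the project's HTTPS page:

```python
import hashlib
import hmac

def verify_download(path, expected_sha256):
    """Compare a downloaded file against a checksum published
    out-of-band. Streams the file so large ISOs don't fill RAM;
    uses a constant-time comparison out of habit."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return hmac.compare_digest(h.hexdigest(), expected_sha256)
```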
Faster? (Score:2)
World of Warcraft already uses Bittorrent technology as a way to distribute large amounts of content at a lower cost to the company and faster speeds to all of their clients
Lower cost, for sure, but it is not faster. The fastest download is when you're downloading from a single server that is able to fully saturate your connection. Even better if this server is situated directly within your ISP as is the case for some content delivery networks (Limelight I think does this). Having to negotiate individual connections with hundreds of peers around the world and incurring the associated lag and protocol overhead can't even compare.
Re: (Score:3, Insightful)
"...that is able to fully saturate your connection."
Yeah, like this always happens. Not.
Scenario: 1st day of release of a new popular file.
Either the vendor prepares well and works with content-delivery networks so you and everyone else on the planet can download the file while saturating your network, or vendor doesn't.
If he doesn't, everyone gets throttled and/or some people are told to try again later.
A torrent option would help distribute the load and cut out the bottleneck.
Re: (Score:2)
That really only scales up so far. It's actually quite difficult to saturate a 1gbps, let alone a 10gbps or 100gbps, link with a single stream. Multiple streams work around some of the problems and allow the full link to be used.
Protocol trouble Re:Faster? (Score:2)
The way I see it:
If the protocol were improved a little bit, and ISPs were a little smarter, then everyone wins. If the protocol allowed preferred connections to big nearby pipes (and I know that some clients try to do that) and there were a way to really relay/cache/serve feeds (like HTTP proxies), then ISPs *could* watch for 'hot torrents' and cache them to feed them to their customers at high speed - thus reducing their out-of-network costs (because they are feeding the data themselves) and improving the
Firewalls (Score:2)
I gather BitTorrent can't be easily used from behind a firewall, which makes it of limited use in corporate settings at present. As well as built-in support from the major web clients, we'd also need support from the major http proxy servers.
Re: (Score:2)
Oh? I regularly download DVDs and CDs through a proxy, no problem. For non-unique downloads, it's even lightning fast because of caching.
Re: (Score:3, Informative)
According to my understanding of BitTorrent, the client needs to be able to accept incoming connections as well as outgoing ones. See for example Brian's BitTorrent FAQ and Guide [dessent.net].
Also, we use a proxy server for outgoing requests from all of our teaching labs, and we have no trouble downloading stuff. The proxy server is perfectly capable of keeping up with our internet connection. It's not as though it has to do any hard work, all it does is relay data from an incoming TCP connection to an outgoing one
Many years ago... (Score:3, Interesting)
I recall really hoping that a new distributed file transfer protocol would become standard in browsers. For one thing, it could virtually eliminate large loads on smaller servers caused by flash crowds (more colloquially known as the slashdot effect).
What I had envisioned is that every webclient currently displaying a web page would effectively act as a seed for the content (including pictures, embedded videos, etc) that the browser has loaded from that page for as long as the user has that page open, radically reducing the load required by the webserver where the original data was hosted when a lot of people want to see the content at the same time.
Of course, it never happened.
Re: Lets even make the brower make tea and such (Score:2)
I recall really hoping that a new distributed file transfer protocol would become standard in browsers. For one thing, it could virtually eliminate large loads on smaller servers caused by flash crowds (more colloquially known as the slashdot effect).
Why does everyone seem to want everything to be in the fscking browser? Applications already grow to huge amounts of bloat until they can send mail. What is wrong with having the browser open filetypes in some preferred stand-alone application which does the job and does it well? I really don't see the point in having some poor joke of a BitTorrent client built into my web browser when there are so many good stand-alone apps readily available.
CDNs are cheap, NAT makes it hard (Score:4, Interesting)
A start-up I know of started out using peer-to-peer, but it was too much grief to get people to download a plug-in, and then get it to set up port forwarding through their firewall, and at the price of CDNs these days, you are just not saving enough money for it to be worthwhile.
Now, when we get IPv6, and HTML5, perhaps it will be a different game (no NAT in IPv6, no need).
In the case of a game, you already have downloaded stuff, and can convince a fair chunk of your users to set it up.
Twitter uses it to push patches to their servers in 12 seconds instead of 10 min.
So it is part of the future.
There's A Place... (Score:2)
There's a place for direct downloads (HTTP, whatever), but more "aboveboard" use of BitTorrent seems like a great idea; might help if it isn't seen as "mainly a pirate toy". :P
Faster Speeds? Yeah right... (Score:5, Informative)
In WoW I have to disable bittorrent if I actually want to download a patch. Otherwise it saturates my connection with upload data whilst only downloading at 1% of my max speed.
Blizzard use bittorrent simply because they're cheap. Instead of using their millions in profits to provide bandwidth, they make the players smash their quotas sending data to each other. I had to install a bandwidth limiter to get Wrath of the Lich King to install because otherwise the outrageous upload speeds stopped me actually downloading anything. You'd think $15 a month would be enough to pay for enough bandwidth to allow me to download the game I've just paid for, but no they have to chase every penny...
The known problem with asymmetrical DSL (Score:3, Insightful)
If you become a seed for a popular file, you can peg your upload bandwidth. If your upload bandwidth is fairly small (Most users probably still have 1.5/384 or even 512/128 in the US), and you are trying to download something at the same time with TCP (HTTP, FTP, etc), the upload will clobber a lot of the ACKs that the download session is trying to send, and the download bandwidth will get clobbered as well.
You can work around this with QoS to some extent. Some cheap-ass DSL routers might now or soon even s
Heard of QoS? (Score:2)
You could apply QoS policies to outbound traffic such that BT only gets left over bandwidth a.k.a. a QoS scavenger class.
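A scavenger class is only a few lines of Linux `tc`, sketched below. The interface name, the 1mbit uplink, and classifying by BitTorrent's default port 6881 are all assumptions (and port-based matching stops working once clients randomize ports, as noted elsewhere in this thread):

```sh
# HTB root: unclassified traffic lands in the normal class 1:10.
tc qdisc add dev eth0 root handle 1: htb default 10
tc class add dev eth0 parent 1: classid 1:1 htb rate 1mbit ceil 1mbit
# Normal traffic: guaranteed most of the uplink.
tc class add dev eth0 parent 1:1 classid 1:10 htb rate 900kbit ceil 1mbit
# Scavenger: tiny guarantee, may borrow up to the full link when idle.
tc class add dev eth0 parent 1:1 classid 1:20 htb rate 50kbit ceil 1mbit prio 7
# Steer BitTorrent (by its configured port) into the scavenger class.
tc filter add dev eth0 parent 1: protocol ip u32 \
    match ip sport 6881 0xffff flowid 1:20
```

BT then uses the whole link when nothing else wants it, and backs off to the 50kbit floor the moment interactive traffic appears.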
How about a share local option (Score:4, Interesting)
Most houses have more than one PC. It is stupid that they all separately download the patches from the source.
How about an option to share patch downloads across a local network?
Nominate one machine as the master, then all the other machines check with the master for their patches.
The master is responsible for contacting the source.
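The scheme above can be sketched in a few lines of Python. `fetch_first` and `http_fetcher` are hypothetical helpers, and `patch-master.lan` is a placeholder hostname, not a real service:

```python
import urllib.request

def fetch_first(name, fetchers):
    """Try each source in order, falling through on failure.
    `fetchers` are callables taking the patch name and returning bytes."""
    for fetch in fetchers:
        try:
            return fetch(name)
        except Exception:
            continue  # source missing or offline: try the next one
    raise RuntimeError(f"could not fetch {name} from any source")

def http_fetcher(base):
    """Build a fetcher for one base URL. The LAN master could be any box
    running e.g. `python -m http.server` in its patch cache directory."""
    def fetch(name):
        with urllib.request.urlopen(f"{base}/{name}", timeout=5) as resp:
            return resp.read()
    return fetch

# LAN master first, so each patch crosses the WAN link only once;
# fall back to the vendor's server when the master lacks it or is down:
sources = [http_fetcher("http://patch-master.lan:8000"),
           http_fetcher("https://updates.example.com")]
# data = fetch_first("KB12345.msu", sources)
```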
maybe with IPv6 (Score:2)
Bittorrent does not work well with NAT. And pretty much every end-user network employs NAT these days. Therefore, only the nerds who know how to configure their routers will use bittorrent... until NAT dies the miserable death it deserves.
Setup and Teardown (Score:5, Insightful)
Bittorrent is great for very large files, and popular files.
But for small files it's really, really bad. Many Linux patches involve downloading hundreds of small files, not one big one. Most applications are so small that the setup and teardown time for bittorrent would dwarf the download time. Any download that takes less than 5 seconds will likely have a smoother user experience if it is not done using bittorrent.
Even ignoring tiny files, there is the issue of bandwidth limited users, the significantly higher routing requirements of bittorrent (many home routers flake out when you get 50+ TCP connections going through them), users with heavily asymmetrical connections (5Mbit down/256kbit up), and the more complicated configuration required to get a good bittorrent connection.
In short, bittorrent is nice for its niche (large, popular files), but outside that niche it is often not the best solution. Wider deployment of bittorrent technology would probably help some places, but it's not a silver bullet for all Internet downloads.
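The setup/teardown point can be put in rough numbers. All the byte counts below are assumptions for a back-of-envelope comparison (only the 68-byte BitTorrent handshake is a protocol constant), but they show why the fixed swarm-join cost swamps a tiny payload:

```python
def http_cost(file_size, header_bytes=500):
    # one request/response header pair plus the payload
    return header_bytes + file_size

def torrent_cost(file_size, peers=20,
                 metainfo=2_000,       # fetching the .torrent file
                 announce=400,         # tracker round trip
                 handshake=68,         # BT handshake, sent each way per peer
                 bitfield=100):        # bitfield/have chatter per peer
    setup = metainfo + announce + peers * (2 * handshake + bitfield)
    return setup + file_size

small = 16 * 1024                      # a hypothetical 16 KiB patch file
print(http_cost(small), torrent_cost(small))
```

Under these assumptions the swarm setup alone is several kilobytes before a single payload byte arrives, and that ignores the cost of opening dozens of extra TCP connections through a flaky home router.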
Multicast? (Score:3, Interesting)
Why has no one mentioned this?
Re:Multicast? (Score:4, Informative)
Why has no one mentioned this?
Because multi-cast doesn't work in practice.
Because almost every gateway router drops multi-cast packets.
Because multi-cast is only efficient if there is more than one recipient on the same subnet downloading the file at the same time.
Because synchronizing the assembly of multi-cast downloads that were initiated at different times (as in, 1 second apart) would require as much work as implementing the bittorrent protocol.
Honestly, multi-cast was only really thought out for machines sharing a private network. It wasn't intended for internet-style applications.
useful functionality, for those not in the know... (Score:5, Interesting)
Say you've got a CD or DVD that's scratched, or an .iso you spent forever downloading via ftp and discovered to your dismay was corrupted. Assuming a bit-identical image is available online via .torrent, you can 'repair' your data without having to download the whole thing all over again:
Start your bittorrent app and begin downloading a new copy of the image you need. Immediately stop the download and exit your bittorrent app. An .iso file (incomplete, of course) will have been created in the destination folder.
Now rip your [damaged] disc to hard drive, creating an [obviously corrupted] .iso. Copy/paste that .iso into bittorrent's download folder, overwriting the existing .iso.
Fire up bittorrent and begin your download once again. Bittorrent will analyze the corrupted .iso and download only the bits needed to repair (i.e. complete) it. In most cases this takes just a few seconds, even over dial-up, because the amount of missing data is usually insignificant (except, of course, in the event of a heavily scratched disc, which can also take a long time to rip in the first place; a high-quality optical drive with good firmware and good optics certainly couldn't hurt).
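The repair trick works because the .torrent stores a SHA-1 hash for every fixed-size piece, so the client can tell exactly which pieces of your ripped .iso are bad. A sketch of that check (the function name is made up; real clients do the equivalent internally):

```python
import hashlib

def bad_pieces(data, piece_len, expected_hashes):
    """Return indices of pieces whose SHA-1 doesn't match the .torrent."""
    bad = []
    for i, want in enumerate(expected_hashes):
        piece = data[i * piece_len:(i + 1) * piece_len]
        if hashlib.sha1(piece).digest() != want:
            bad.append(i)
    return bad
```

Only the pieces in the returned list get re-downloaded, which is why a lightly corrupted image repairs in seconds.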
Re: (Score:3, Insightful)
I like that the example given was VERY well suited for an obviously legal operation (fixing a corrupted Linux ISO) and NOT well suited for grabbing a movie. It's a genuinely useful and neat trick.
Swarm tracking (Score:4, Interesting)
No thanks. There was some research recently on how easy it is to track users in swarms. As soon as you're in the swarm you can see every other IP transferring those files (depending on tracker usage, of course). It's easy to compile a list of IP addresses and the content they downloaded over time.
I like my privacy and I have no intention to let people know what software I'm downloading.
And as stated before, it's a security risk too. This doesn't only apply to software updates, it applies to any software that is downloaded.
For example: there is an outdated version of some application still hosted on download.com's tracker, and I'm someone who knows of a vulnerability in it. I join the swarm, collect all the IPs, and just exploit them as I go.
Hell, I don't even have to scan entire ranges for this application port anymore!
Because CDNs work really well and are secure (Score:3, Interesting)
Content delivery networks already solve a lot of the issues that bittorrent addresses - You can distribute large files without consuming a huge amount of backbone bandwidth, with a lot of regional servers.
It also helps with some other things:
1) Guaranteed level of reliable local service.
2) Customers don't know who each other are, a data privacy issue (Say, I notice someone at ip 4.5.6.7 is downloading this particular security patch)
3) Security (yes I know torrents are checksummed but it's not impossible to defeat).
But basically, it's all about a known level of quality for customers, which CDNs deliver and which is more of a case-by-case thing for torrents.
Also, some customers could be angry that companies are using bandwidth to send files to other people - I've been surprised that Blizzard gets away with that with as little complaint as they do.
DHT Tracker (Score:3, Informative)
DHT doesn't even require a tracker
I thought DHT did require a centralized server, called a bootstrap node?
Instead of BitTorrent (Score:3, Insightful)
Why don't we have a generic TCP/IP transfer protocol which caches things at every hop it passes through?
That way, if a million people download a file, it gets uploaded once from the server to that server's ISP, stored once at that ISP, transferred once from that ISP to every other ISP that requests it, stored once at each of those, and then transferred once from each ISP to every LAN that requests it.
You know, the way Usenet used to work and still could if anyone bothered to resurrect it.
Seems like this would be the sensible, distributed, long-term solution to file distribution?
Why would I have to start over? (Score:4, Insightful)
Both FTP and HTTP can fetch at offsets other than 0 and ftp at least has been able to do that for well over two decades. I haven't had to start a download over in a long, long time.
-Matt
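The offset fetch the parent describes is just an HTTP Range request (RFC 7233). A sketch of the client-side resume logic, with the network call injected as a `get` callable for illustration (in practice it would wrap urllib or similar):

```python
import os

def resume_download(path, total_size, get):
    """Fetch only the missing tail of `path`. `get(headers)` returns bytes."""
    have = os.path.getsize(path) if os.path.exists(path) else 0
    if have >= total_size:
        return 0                       # already complete
    # Request everything from the first missing byte onward
    chunk = get({"Range": "bytes=%d-" % have})
    with open(path, "ab") as f:
        f.write(chunk)
    return len(chunk)
```

This is the whole mechanism: no tracker, no swarm, just one header. Which is why "downloads fail and you start over" is a client-quality problem, not a protocol problem.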
Because Firefox's download manager sucks? (Score:4, Insightful)
Yeah, but the built-in download manager for Firefox sucks. Yes, it *can* resume... if you paused the download first. Even when it's told the file size, if your connection dies in the middle (which my connection too often does), it thinks it's "done" (which is absurd, because wget reports an error code) and you can't resume from within Firefox any more. Oh! I got an RST packet, that must mean that the last 300 MB transferred instantly!
I gave up on that piece of crap and used an add-on to make Firefox use wget, which at least has the decency to know when files have *actually* been fully downloaded, rather than giving up and deciding it's good enough to hand me a useless half file and no error messages at all. Seriously, have the devs never tested large downloads on a link that dies? It's not even hard to test: you can emulate the connection dying by pulling the damn ethernet cable. It just ignores the error and continues blindly. Does it really think that the other server going silent is an indicator that the file has been fully downloaded, or what?
Re: (Score:3, Insightful)
The situation is even worse.
The market is full of faulty clients (ones that don't support resume, or whose support is broken and corrupts the file) and of faulty servers. Servers that simply don't support resume are less of a problem than the ones that claim to support it and, when asked to resume from offset 10000, happily start sending the file from the beginning.
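One defense against the lying servers described above: ask to resume from a little *before* the end of your partial file and check that the overlapping bytes match what's on disk. The function name and overlap size here are illustrative assumptions:

```python
def verified_resume(partial, stream_start, overlap=64):
    """`partial` is what's on disk; `stream_start` is the first data received
    after requesting a resume from offset len(partial) - overlap. Returns
    the bytes to append, or raises if the server restarted from byte 0."""
    if stream_start[:overlap] != partial[-overlap:]:
        raise IOError("server ignored Range request; fall back to full download")
    return stream_start[overlap:]
```

Good download managers do a check along these lines, which is exactly what separates them from the faulty clients the parent mentions.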
Customers don't want to give away their bandwidth (Score:4, Interesting)
Download resume isn't the speciality of torrents (Score:3, Insightful)
When my HTTP or FTP transfers fail I just use 'wget -c' to continue them.. No need to switch to torrents.
No. (Score:3, Insightful)
Bittorrent has its uses. It can help offload some of the traffic to your customers, so you increase bandwidth without having to pay for it.
BUT there are some HUGE downsides:
A: You cannot rely on your customers to host your data, meaning you still have to supply a copy of the data AND serve it at expected speeds.
B: Customers might show up when nobody else is sharing, meaning you are STILL providing all the bandwidth. Customers have little motivation to keep seeding your content.
C: Many customers will not have the right setup to share data. Either firewall restrictions or ISP limitations.
D: You need to bake the bittorrent into your application (like WoW and other games do) or face endless questions by customers who are barely able to download in the first place.
E: Some content you don't want shared. How can you watermark content and tie it to a user if every user has the same file? Blizzard doesn't care who gets their patches, since only legit users can play the game anyway, and Linux torrents are of course free to start with.
F: For small files, bittorrent costs too much overhead. If I share a million MP3s, the chances of finding anyone else with the same file, willing and able to share it, are tiny, and the overhead will be more than the saved bandwidth.
So, this question is asked by a person who clearly hasn't understood the web, users, copyright or usability.
Re: (Score:3, Funny)
Nah, 0 seeds, 3790 peers (all on 99.9%) ... DOUBLE FFFFFUUUUUU