FTP: Better Than HTTP, Or Obsolete?
An anonymous reader asks "Looking to serve files for downloading (typically 1MB-6MB), I'm confused about whether I should provide an FTP server instead of / as well as HTTP. According to a rapid Google search, the experts say 1) HTTP is slower and less reliable than FTP and 2) HTTP is amateur and will make you look a wimp. But a) FTP is full of security holes, and b) FTP is a crumbling legacy protocol and will make you look a dinosaur. Surely some contradiction... Should I make the effort to implement FTP or take desperate steps to avoid it?"
I wouldn't worry about it... (Score:2, Insightful)
well, what're you trying to do? (Score:4, Insightful)
HTTP is restricted by browsers, many of which will not support files larger than a certain size. Furthermore, FTP allows for features such as resume, etc...
The real question, however, is what are you trying to use this for? What's your intended application?
If it's a file repository for moderately computer literate people - FTP is definitely the way to go.
If it's a place for average-joes to store pictures, maybe HTTP is your best option. Sacrificing a bit of speed and capabilities such as resume might be made up for with ease of use.
Anecdotally, HTTP is more reliable (Score:5, Insightful)
It's generally simpler to get to from a browser, which is where 95% of people's online life is anyway. Yeah, you can rig up a FTP URL, but it seems a bit kludgey and more prone to firewall issues.
Transparent (Score:5, Insightful)
And I wouldn't care about the opinion of someone who would actually judge you over what friggin protocol you use to provide downloads. Such an utter nerd is something that I cannot relate to. Maybe after I use Linux for a few more years, who knows.
Re:Both... (Score:5, Insightful)
Many companies lock down their firewalls with a huge, gigantic virtual padlock -- ie, port 80 outgoing only. When I'm at work and want to download something from a site that offers FTP only, I'm screwed. Some companies will keep ports < 1024 or some other low number open, but many will not. This would be a driving reason to provide HTTP access to files.
Personally, anyone that calls you an amateur for using HTTP or a dinosaur for using FTP needs to get a life. Both protocols have their place in the modern internet; there's absolutely nothing wrong with serving files via either method other than security and the above mentioned concern. Do what you feel is easiest to maintain and simplest for your users.
Make your life simpler: use HTTP (Score:5, Insightful)
Security-wise, HTTP is a big win over FTP if only because it makes your port-filtering easier - "allow to 80" is simpler and less likely to cause unintended holes than all the things you need to do to support FTP active and passive connections. Certain FTP server software has a reputation as having more security holes than IIS, but there are FTP servers out there that are as secure as Apache.
Of all the morons (Score:1, Insightful)
* protocol
* application
Neither ftp nor http are `full of holes'.
It very much depends which specific server you are using. Now, IIS is full of holes, and some ftp servers have been full of holes in the past, and still are... and there have been holes in Apache as well.
As far as the functionality goes, ftp indeed provides some very useful functionality for file transfers: the ability to resume an interrupted transfer. Most servers and most clients support it nowadays and that is really VERY useful.
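For the curious, resume is exposed directly by FTP's REST command, and e.g. Python's ftplib maps it to the `rest` argument of `retrbinary()`. A minimal sketch (host, path, and credentials below are placeholders, not anything from the thread):

```python
import ftplib
import os

def resume_offset(local_path):
    """Byte offset to resume from: size of the partial file, or 0 if none."""
    return os.path.getsize(local_path) if os.path.exists(local_path) else 0

def resume_download(host, remote_path, local_path,
                    user="anonymous", passwd="guest@"):
    """Resume an interrupted FTP download using the REST command.

    ftplib sends REST <offset> before RETR when `rest` is given,
    so the server restarts the transfer partway through the file.
    """
    offset = resume_offset(local_path)
    with ftplib.FTP(host) as ftp:
        ftp.login(user, passwd)
        with open(local_path, "ab") as f:  # append to the partial file
            ftp.retrbinary(f"RETR {remote_path}", f.write, rest=offset)
```

Whether REST actually works still depends on the server honoring it, which is the "most servers and most clients support it nowadays" caveat above.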
ftp is slightly more difficult to set up when firewalls are in effect, because of the two actual tcp channels used (data+control), which means the firewall needs to know about ftp (or it can get very unpleasant: active/passive protocol tricks will get you through one firewall, not through two of them).
Of course, if you are a complete moron, you will botch your server configuration and leave yourself wide open, or choose the wrong server, or forget to upgrade, and get security holes all over the place... doesn't matter which
protocol you use.
In reality, depending on what you want to serve, modern answers include sup, cvsup, rsync, and all kinds of other applications which leave ftp faaar behind (not even talking about http...). But then, they are less accessible and a little more 3l33t, which is maybe what you are looking for.
I'd say it depends on what you're serving... (Score:4, Insightful)
1) I've found HTTP transfers are a little faster than FTP transfers (just personally, and I can in no way prove it - it may be user error, or just the programs I'm using)
2) I've found that FTP clients are everywhere - Windows, Linux, BSD, everything I've ever installed has included a command line FTP client, but not a web browser unless I specifically remember to install one. Furthermore, most of the "live CDs/boot disks" that I use don't have a web browser, but do have FTP... Thus, if you're serving files that a person without a web browser/server might need, I'd set up both...
3) FTP security is what you/your daemon makes of it. wu-ftpd has a long history of being rooted... ProFTPd doesn't. VSFtp doesn't. HTTP security is the same way... IIS has a long history of being rooted... Apache doesn't... *(Not to say that there haven't been occasional exploits for these platforms)
There is no clear "Use this" or "Use that" procedure here, it depends entirely on your situation, what you're serving, what your network setup is, etc...
FTP vs HTTP (Score:2, Insightful)
Security (Score:3, Insightful)
WebDAV? (Score:3, Insightful)
It still has some of the slowness and unreliability of HTTP transfers, but it works a lot better through firewalls (and more securely, since connection tracking works better with WebDAV).
I've found Passive Mode FTP to also be more unstable than standard ftp transfers.
HTTP Vs FTP (Score:5, Insightful)
To me, this is a problem of authentication. If you want EVERYONE to have these files, why not just use the HTTP server? If you're targeting a select few people, then why not use the built-in authentication mechanisms of FTP?
Yes I know there are authentication mechanisms for HTTP, but they're arguably harder to implement than setting up an FTP server.
Are your clients only using web browsers to retrieve these files? I'll get flamed for this, but web browsers were not designed for FTP, and thus are klunky at it. HTTP wins there again.
Don't worry about it. Just use HTTP and let the FTP bigots flame away.
Re:Forget them both.... (Score:5, Insightful)
I personally would say go with http for the files, as it'll be much easier for people behind http proxies to download, it'll get cached more often by transparent proxies, and most browsers support browsing http directories FAR better than FTP directories.
FTP does text better (Score:3, Insightful)
This is one area that I feel HTTP was deliberately weak on. I can't see any reason why HTTP couldn't require any text/* media type to be sent in a canonical format, with "network" newlines. If every Internet transport protocol did this properly, there would never be a situation where you have a file on your computer using the wrong type of newline.
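The canonical form the parent is describing is CRLF "network newlines," the convention protocols like SMTP and FTP's ASCII mode already use on the wire. A rough sketch of what that normalization looks like (a generic illustration, not anything specified by HTTP):

```python
def to_network_newlines(text: str) -> str:
    """Canonicalize any newline style (CRLF, bare CR, or bare LF) to CRLF.

    Collapse everything to LF first, so existing CRLF pairs don't
    get doubled into CR CR LF.
    """
    return (text.replace("\r\n", "\n")
                .replace("\r", "\n")
                .replace("\n", "\r\n"))

def from_network_newlines(text: str, local: str = "\n") -> str:
    """Convert wire-format CRLF back to the local newline convention."""
    return text.replace("\r\n", local)
```

If every transport did the conversion at both ends like this, the "wrong newline type" problem the comment describes couldn't arise for text/* content.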
Re:Both... (Score:2, Insightful)
True, not every ftp server supports it, but it'll get through any firewall you can connect out through. IIRC, rather than having the server open a separate data connection back to you, your client opens the data connection outward as well, so both the control and data streams are outgoing TCP connections.
Many browsers have that as a config option, I believe.
Both... (Score:3, Insightful)
HTTP "upload" is trickier than FTP upload, if that's a factor.
Computers are full of security holes. Both webservers and FTP daemons have the risk (and history) of 'sploitable holes. As to configuration-related risk, there's more inherent in the default configuration of an FTP server, since its original concept of operations is browsing a physical directory tree, but most current FTP servers can be locked down as tight as any webserver. Just can't win, can you?

Only posers worry that their choice of protocol might look bad. It's not fashion, folks. If you had to support only one, pick HTTP, since I bet you're already running a webserver. And if it's really about fashion, run an exotic protocol like CORBA or SCP or something.
Re:Forget them both.... (Score:4, Insightful)
FTP's statefulness (Score:3, Insightful)
My suggestion would be to go with FTP if the size is anything above a few MB, for administrative reasons. HTTP doesn't let you limit connections, and per-user bandwidth can incidentally get mashed into virtually nothing. With FTP you can limit not only the number of active downloads, but even who can download if you wanted.
Think of almost every Linux distribution - they use FTP. Why? Ever tried downloading something, connected at a smooth 100k/s or something and found, a few minutes later, you're pulling 2k/s? That USUALLY doesn't happen with FTP. Once you're in, you're in.
If you have tiny files, of course, logic dictates you go with HTTP. But for anything of size, do *yourself* a favor and go with FTP.
Adam
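As a concrete illustration, those connection and bandwidth caps are a few lines of server configuration in, say, ProFTPD (the values below are made up for the example; check the directive reference for your version before copying):

```
MaxInstances        30    # cap concurrent FTP sessions server-wide
MaxClientsPerHost    2    # limit simultaneous connections per IP
TransferRate RETR   64    # throttle downloads to roughly 64 KB/s
```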
Re:Anecdotally, HTTP is more reliable (Score:4, Insightful)
I suspect that's because 99% of people are downloading from one of the FTP servers.
It's generally simpler to get to from a browser, which is where 95% of people's online life is anyway.
I honestly don't see how.
Yeah, you can rig up a FTP URL, but it seems a bit kludgey
ftp://www.mysite.com/file.zip
How is that kludgey?
weak question, (Score:5, Insightful)
Don't start with finding the solution; figure out what it is you want, what you want it to do, and then find the right tool. We cannot tell you which is right with almost no information about how it will be used, for what, what the average user profile is, etc.
HTTP and FTP can be equally insecure, but it shouldn't be much of a job to properly secure an FTP server.
Convenience's sake... (Score:2, Insightful)
People are more likely to find the files on HTTP (Score:3, Insightful)
I think that the ability for people to find the files using a search engine outweigh the other considerations if you are seeking to make the files public.
Re:No, (Score:4, Insightful)
Re:hmm (Score:3, Insightful)
Re:No, (Score:4, Insightful)
HTTP or FTP (Score:1, Insightful)
You don't need to use something with encryption like HTTPS or SFTP. These are publicly available files, right? I do recommend you post md5sums for each of these files both in the download directory and on the webpage that links to each of these files.
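Generating those md5sums is a one-liner with the `md5sum` tool, or a few lines of Python if you want to fold it into a publishing script (the filename handling here is a sketch; the output format mimics `md5sum`'s "digest, two spaces, name" convention):

```python
import hashlib

def md5sum_line(path):
    """One line of md5sum-style output: '<hexdigest>  <filename>'.

    Reads the file in chunks so large downloads don't have to
    fit in memory.
    """
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return f"{h.hexdigest()}  {path}"
```

Users can then verify a download with `md5sum -c` against the posted file.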
HTTP is most likely to get cached by caching proxies. So if your target audience is likely to be behind a caching proxy (university campus?) and you want to save yourself some bandwidth, use HTTP on a dedicated port or box so you can control concurrent downloads. HTTP might also get you more completed downloads. Then again, if someone's firewall will only let them get to port 80, they're too stupid to be downloading files from me anyhow.
Personally I'd recommend FTP. It's easier to control the number of concurrent downloads, concurrent connections from a single IP, and transfer speeds, and it integrates well with your authenticated uploads by yourself or whatever other person will be uploading files for download. Saying that FTP is less secure than HTTP is complete BS. Both have the same level of insecurity. The most common FTP daemon (WU-FTPD) has had horrific security holes in the past, which is why no one in their right mind should use it. ProFTPd on the other hand has had very few security problems (usually found long after the affected release has been surpassed by a couple new releases at least). I highly recommend you look into ProFTPd for an FTP solution. You can always offer an HTTP solution as well, but I highly recommend you limit the number of concurrent transfers if you use HTTP.
Re:FTP does text better (Score:3, Insightful)
Re:Different, not better or worse (Score:5, Insightful)
FTP is notoriously difficult to secure while retaining ease of use for the clueless end-user.
-Sara
Re:Forget them both.... (Score:5, Insightful)
It sounds like anonymous downloading of publicly available files - what do we need any encryption for, then?
If not for the encryption, then consider what else you get: a well-defined TCP connection. It's a cinch to configure a firewall to allow sftp connections, while FTP firewalling will give you prematurely grey hair (and if it doesn't, then you're not doing it right).
Re:Forget them both.... Anonymity (Score:4, Insightful)
Re:HTTP is fine (Score:1, Insightful)
Pay attention to what you say, please. (Score:2, Insightful)
Imagine a time when you want to download DeCSS for your linux boxen from a foreign server, but someone is logging your downloads, and Verizon and the RIAA want access to those records
Why in Hell would the Recording Industry Association of America care who has software to break the DVD Content Scrambling System? They market music on CDs, not movies on DVDs.
Or did you mean to say MPAA?
Yeah, yeah, it's just a careless error, but things like this make us all sound like a bunch of idiots. If we can't even keep track of which evil corporate organization goes with which issue, then how is anyone supposed to take us seriously?
'Reliability' of HTTP vs. FTP (Score:4, Insightful)
I suspect that's because 99% of people are downloading from one of the FTP servers.
I put it to you that it would be more logical to suspect it's because HTTP is faster than FTP as a transfer protocol. It generates less traffic (and uses less CPU overhead), which means downloads end quicker.
Additionally the CPU overhead generated by FTP connections also causes many sites to limit the number of users who can connect, which often results in 'busy sessions', something much rarer with HTTP (as HTTP servers typically have very high thresholds for the number of concurrent connections they will support). The overhead on a server of a user downloading a file over FTP is much greater than that of a user downloading the same file over HTTP.
Although FTP is of course theoretically more reliable than HTTP, in practice 'Server busy: Too many users' messages, combined with the speed and reliability of modern connections (which in turn makes HTTP more reliable), mean the reverse is often the case from a user's perspective - which is what I think the poster is getting at.
This may be partly due to poor FTP server configuration defaults and/or poor administration, but they cannot shoulder all the blame.
The potential lack of reliability with HTTP is a very minor issue these days, and the extra overhead of integrity checking files in addition to relying on TCP is just not warranted for all but the largest of files.
This doesn't make FTP completely redundant, but it does make it redundant when your files are small and your users are on fast, reliable connections (though the value of 'fast' varies in relation to the size of the file; even 33 kbps is 'fast' compared to the speed of connections that proliferated when the File Transfer Protocol was developed).
Who cares what others think? (Score:1, Insightful)
FTP is a crumbling legacy protocol and will make you look a dinosaur.
Why care so much what others think? Efficiency beats out looks any day.
HTTP and FTP FUD (Score:5, Insightful)
1) HTTP doesn't support resumed downloading.
- That's ridiculous. It has since HTTP/1.1 years ago. In fact, it can even do things like request bytes 70,000 - 80,000, then 90,763 - 96,450, etc.
2) HTTP doesn't support security/authentication
- Ridiculous. HTTP has an open-ended model for authentication and security, many of which are secure and standardized. If you REALLY need security, use HTTPS.
3) HTTP doesn't support uploading
- HTTP/1.1 has had this for a while. Netscape 4.7, Mozilla 1.1, and IE 4+ support this. I must admit though, it sucks.
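To make point 1 concrete, here's a rough sketch of what a server does with an HTTP/1.1 `Range: bytes=start-end` header to build a 206 Partial Content response. This is a simplified illustration: suffix ranges (`bytes=-500`), multiple ranges, and validation against `If-Range` are all omitted.

```python
def serve_range(data: bytes, range_header: str):
    """Slice out the requested byte range (endpoints inclusive,
    per HTTP's byte-range convention) and build the Content-Range
    value that accompanies a 206 response."""
    unit, _, spec = range_header.partition("=")
    assert unit == "bytes", "only byte ranges are sketched here"
    start_s, _, end_s = spec.partition("-")
    start = int(start_s)
    # An open-ended range like 'bytes=7-' runs to the last byte.
    end = int(end_s) if end_s else len(data) - 1
    body = data[start:end + 1]
    return body, f"bytes {start}-{end}/{len(data)}"
```

This is exactly the mechanism behind the "request bytes 70,000 - 80,000, then 90,763 - 96,450" example above: the client just issues several requests with different Range headers.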
Several people have pointed out the real differences:
1) FTP doesn't like firewalls
- Passive FTP fixes this, but it has quirks and limitations.
2) FTP supports directory listing, renaming, uploading, changing of permissions, etc.
- This is what FTP is for
- This can be done in HTTP, but requires serious work
- If the scope creeps, shell access would be better.
Comment removed (Score:3, Insightful)
Re:No, (Score:4, Insightful)
Re:how about rsync? (Score:5, Insightful)
Re:hmm (Score:5, Insightful)
It does when 90% of your users only have wrenches, and don't want to make the switch to hammers. You don't want to hand out nails in that kind of situation.
That said, my problems with IE's ftp seem to be unique. Isn't there anyone else who notices the 30 second freezes while IE tries to contact the ftp site? Or does everyone just take it for granted that their web browser should freeze while it tries to do ftp?
Re:Different, not better or worse (Score:4, Insightful)
A pretty distinct advantage for those of us who use shells often or have to repair machines at a command prompt.
Eight words to consider (Score:3, Insightful)
HTTP is better (IMHO) (Score:3, Insightful)
In my case, I run an Apache server on my file server box because it gives me full 100% throughput at Ethernet speeds. That's 10+ megaBYTES/sec. The average anime episode downloads in 10-15 seconds, and when friends come over, I don't have to worry about them having the right kind of client when they want to leech.
Where FTP becomes useful is when you want people to upload to you.
Re:I wouldn't worry about it... (Score:3, Insightful)
Re:how about rsync? (Score:3, Insightful)
rsync wasn't written by a moron.
Transparent HTTP Proxies ... (Score:1, Insightful)
Re:Forget them both.... Anonymity (Score:3, Insightful)
All this information makes it trivial to find out which pages you downloaded, when, and how long you visited.
The only thing it doesn't do so well is listen in on data you send. But consider--if they're listening to the tcp channel to the remote site then they can listen to outgoing tcp connectivity as well.--and correlate what information came in, and what information went back out.
You think a single encrypted channel buys you any privacy? Get real.
Re:Different, not better or worse (Score:5, Insightful)
mget blah[1,2,3].iso....
Get the drift? HTTP indexes are rather stupid if you ask me, it's FTP without the features. And before you whine "But I don't need the features", neither do I most of the time, but it's nice when they're there.
Re:Different, not better or worse (Score:5, Insightful)
But really, even at that I think it has more to do with the features typically found in the clients than with the differences in the protocols.
HTTP transfers are usually done using a web browser. Web browsers can also perform FTP transfers, but FTP clients offer some capabilities that web browsers lack (and of course, also lack some capabilities that web browsers have). With a web browser, grabbing ONE file to download is easy. Starting a second or third download is also easy. But what if you want to initiate a transfer (either up or down) of all files within a directory tree with a single operation, and maybe even have the directory structure kept intact?

What if you want to set up a complex "batch" of transfers with full control over which files will be transferred into which directories, some transferring up, others down, and have the whole thing then run unattended - with the ability not just to resume a partial download when the user restarts it, but to automatically restore dropped connections, resuming any partial download and continuing to process the batch without user intervention? There are ftp clients that can do all of this. I suppose it could be done over HTTP, but I sure haven't seen it. Not even close. And until something like that does exist, this difference in client capabilities is a valid justification for continuing to use FTP, at least for applications where those capabilities are beneficial.
And here's another reason to offer FTP along with HTTP: Some users like it. If you have any concern about the extent to which your site pleases its users, then that is a perfectly valid item to put in the "pros" column.
Having that kind of capability within a browser would really rock though. Especially if it worked for both FTP and HTTP. I suppose it would need some standardization on the web server end as well, so that the client can reliably parse the http directory listings. Though I suppose it might still be possible to make it work at least with the most common servers. Any browser developers listening?
bin or asc (Score:1, Insightful)
file transfer started
ASCII data transfer of mybigfile.iso successfully completed.
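The transcript above is the classic trap: fetching an ISO without first typing `bin`. In ASCII mode the transfer translates line endings, so any stray LF byte in the binary image gets rewritten and the file is silently corrupted. A rough simulation of the damage (assuming a simple LF-to-CRLF translation, which is one common direction of the conversion):

```python
def ascii_mode_mangle(data: bytes) -> bytes:
    """Roughly simulate an ASCII-mode transfer toward a CRLF system:
    every bare LF byte becomes CRLF. Harmless for plain text,
    ruinous for binary data like an ISO image."""
    return data.replace(b"\n", b"\r\n")

# Arbitrary binary bytes standing in for a chunk of an ISO image.
iso_chunk = bytes([0x00, 0x0A, 0xFF, 0x0A])
mangled = ascii_mode_mangle(iso_chunk)  # now longer, with a bad checksum
```

The transfer "successfully" completes either way, which is what makes the mistake so painful.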
Re:I don't believe there is anonymous sftp... (Score:3, Insightful)
http support for self-signed keys is weak - it complains too much.
http upload is weak, and the method to accomplish an http upload is awkward.
Why would I want to use UDP? Don't you imagine that there were performance reasons behind the choice of UDP for NFS?
cli support for http is weak - show me how you upload with wget.
Since when did Apache run chroot?
Funny, I don't remember linking in zlib for compressed http streams the last time I built apache (let alone bzip2)...
Re:Different, not better or worse (Score:2, Insightful)
HOW ABOUT UPLOADING??? (Score:5, Insightful)
BitTorrent (Score:3, Insightful)
You need a proxy/caching appliance. (Score:3, Insightful)
These caching appliances can also broadcast multimedia streams (Real or WinMedia) with the correct licenses.
I would do some research on some CacheFlow boxes, which, by the way, are selling on the cheap at eBay. These were the $25,000 (circa 1998-2000) priced appliances that you can now pick up for, sometimes, hundreds of dollars or the very low thousands.
HTTP is better for most cases (Score:5, Insightful)
The main strengths of HTTP over FTP for file transfers are:
The other differences one sees are due to server design issues. E.g., most FTP servers are large and spawn a process per connection, which makes FTP sessions much slower than HTTP sessions. But if you want to use FTP, there are very fast FTP servers out there.
Overall, in today's world, it does not make sense to use FTP unless you have a requirement from your users. For public access to files, use HTTP or something more modern, such as rsync, or a P2P network.
As usual, you should answer such questions by thinking about your target users and asking yourself what they are likely to be most comfortable using. Chances are it's their main tool, the web browser.
FTP obsolete years ago. (Score:3, Insightful)
There are hardly any practical legitimate reasons for an FTP server to shove data to some place other than the FTP client, and other dangerous combinations FTP allows.
Uploads? Uploads had better be disabled on FTP by default. Uploads are disabled in HTTP by default, but you can configure your site to accept uploads.
Mass downloads of files? Just standardise. As it is, you can already download entire websites. With HTML (not javascript) the links are there. If you want files only, you just have to standardise on a program-readable way to figure out which links are files to download.
Resuming of downloads - works on most webservers, works on most ftp servers.
And http allows more - better compatibility with firewalls and proxies. Can be made to support SSL easily - most popular clients support SSL.
---
FTP is a bad protocol. The PORT and PASV thing is stupid. Most protocols which specify network-layer data (IP address, port) in the content layer for programs to use are bad designs.
How would PASV or PORT look with IPv6? With HTTP, you don't have to change the protocol much if at all.
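To see the design wart concretely: a passive-mode client has to fish the server's IPv4 address and data port out of the text of the 227 reply. That's why NAT boxes have to rewrite FTP traffic in flight, and why IPv6 needed a new command entirely (EPSV, from RFC 2428). A sketch of the parsing every FTP client does:

```python
import re

def parse_pasv(response: str):
    """Parse a '227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)' reply.

    The server's IP and data port travel inside the message body --
    the content-layer/network-layer mixing the comment above objects
    to. The port is split across two bytes: port = p1 * 256 + p2.
    """
    m = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", response)
    if not m:
        raise ValueError("not a PASV reply")
    h1, h2, h3, h4, p1, p2 = map(int, m.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2
```

The six-number format has no room for a 128-bit address, which is exactly the IPv6 problem: the protocol itself had to change, whereas HTTP's payload never embeds addresses at all.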