A Better FTP?
cppgodjavademigod asks: "I used to work for a company that sold a file transfer product for datacenters. It supported checkpoint/restart, encrypted password transmission, asynchronous job processing, etc. Is there an Open Source project that aims to provide a better FTP? I'm looking for something that makes use of multiple paths (for machines connected via more than one network), job restart, job control, secure transmission (over the Internet), maybe even tunneling over HTTP and redundant servers (via some kind of private P2P protocol)."
This is a great idea - but it won't catch on. (Score:2, Insightful)
This means that there is really no incentive to change to it.
The second problem is that for most people, most of the time, their Internet connection is pretty reliable. This is also improving all the time as more and more people move to DSL and cable modems instead of dialling up.
Download managers (Score:1)
Swarmcast? (Score:2, Interesting)
FSP, anyone? (Score:3, Interesting)
Remember FSP [faqs.org], the "File Server Protocol"? It was introduced about 10 years ago and was supposed to be the FTP-killer. Technically it probably was superior, but good ol' FTP was available everywhere and was good enough. Today you'd be hard-pressed to find any FSP sites at all. The last published version of the FSP FAQ appears to be dated 1996-08-19. It seems there's really no demand for a better FTP.
Re:FSP, anyone? (Score:2)
FSP seems to have died for lack of new or interesting things, rather than because FTP was too entrenched.
File Slurping Protocol (Score:1)
rsync has some features.. (Score:2, Redundant)
Hotline? (Score:1)
Once you connected to a server, you had file transfer with start/stop/resume, you could communicate with other users on the server, you could tunnel through http...
I don't recall whether any incarnations had more secure features, however.
It was a wonderful program with a lot of promise, until it was released for Windows, at which point it became a banner-ad-driven attempt at making money for files which were usually not provided at the end (clicking through banner pages to get the usernames/passwords needed to get in and download the warez/pr0n/mp3s/whatever).
rsync efficient secure file transfers (Score:5, Informative)
The rsync algorithm meets most of your requirements. rsync was proposed in 1998 by Andrew Tridgell for efficient secure file transfers. The main points are that it transfers only the differences between the old and new versions of a file, finding matching blocks with a fast rolling checksum, and that it can run over ssh for secure transport.
The detailed description is here [anu.edu.au] (http://samba.anu.edu.au/rsync/tech_report/), and open-source software is here [anu.edu.au] (http://samba.anu.edu.au/rsync/download.html).
Overall rsync is often much (10x) faster than using compressed file transfers. It is most useful for users who frequently download new versions of packages with significant similarities between successive versions.
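The speedup comes from rsync's weak rolling checksum, which lets the sender test for matching blocks at every byte offset cheaply. Here's a minimal sketch of that idea (Adler-32-style, as described in the tech report; the modulus and block size below are just illustrative):

```python
M = 1 << 16  # checksums are taken mod 2^16, as in the tech report

def weak_checksum(block):
    # a = plain sum of bytes, b = position-weighted sum (both mod M)
    a = sum(block) % M
    b = sum((len(block) - i) * x for i, x in enumerate(block)) % M
    return a, b

def roll(a, b, out_byte, in_byte, blocklen):
    # slide the window one byte in O(1) instead of rescanning the block
    a = (a - out_byte + in_byte) % M
    b = (b - blocklen * out_byte + a) % M
    return a, b

data = b"the quick brown fox jumps over the lazy dog"
L = 8
a, b = weak_checksum(data[0:L])
for i in range(1, len(data) - L + 1):
    a, b = roll(a, b, data[i - 1], data[i + L - 1], L)
    # the rolled checksum agrees with recomputing from scratch
    assert (a, b) == weak_checksum(data[i:i + L])
print("rolled checksums match recomputed ones at every offset")
```

Because rolling is O(1), the sender can scan a whole file for blocks the receiver already has, then send only the literal bytes in between.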
Re:rsync efficient secure file transfers (Score:1)
Re:rsync efficient secure file transfers (Score:1)
You're right -- the rsync algorithm was first published in June 1996. Even though I started using it in 1997 I somehow picked the 1998 date from the html version of the tech. report :)
Re:rsync efficient secure file transfers (Score:1)
Re:rsync efficient secure file transfers (Score:2, Informative)
Grid computing pushing this issue (Score:3, Interesting)
HTTP/1.1 (Score:2)
Re:HTTP/1.1 (Score:3, Insightful)
That said, I have to wonder whether HTTP/1.1 could be a true solution, for the simple fact that HTTP was not created specifically for the purpose suggested. In addition, for future development purposes, would we really want to bog down HTTP with features not used in everyday web transactions?
*shrug*, just my initial thought, I might not have a clue what I'm talking about =)
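For what it's worth, HTTP/1.1 already covers the restart part via byte-range requests: a client that has the first N bytes asks for `Range: bytes=N-` and gets a 206 Partial Content response. A self-contained sketch (the tiny server and path below are made up for the demo; real servers honor Range natively):

```python
import http.server
import threading
import urllib.request

DATA = b"0123456789" * 100  # 1000 bytes of demo payload

class RangeHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        rng = self.headers.get("Range")          # e.g. "bytes=400-"
        if rng and rng.startswith("bytes="):
            start = int(rng[len("bytes="):].split("-")[0])
            body = DATA[start:]
            self.send_response(206)              # Partial Content
            self.send_header("Content-Range",
                             f"bytes {start}-{len(DATA)-1}/{len(DATA)}")
        else:
            body = DATA
            self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):                # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RangeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Pretend the first 400 bytes were already downloaded, then resume.
req = urllib.request.Request(f"http://127.0.0.1:{port}/file",
                             headers={"Range": "bytes=400-"})
with urllib.request.urlopen(req) as resp:
    status = resp.status
    rest = resp.read()
server.shutdown()
print(status, len(rest))  # → 206 600
```

So restart is there already; it's the multipath, job-control, and redundancy features that HTTP doesn't give you for free.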
SFTP (Score:2, Interesting)
Re:SFTP (Score:2)
FYI, security isn't the most important for everyone.
I'm much more concerned these days with bandwidth utilization, which would be kind of hosed by a scheme that encrypts the data stream. I probably want encrypted authentication, but that's it.
Re:SFTP (Score:1)
I'm just curious why you think an encrypted data stream would use more bandwidth than an unencrypted one?
Besides, sftp can use zlib compression anyway, so that could probably help a little (well, depending on the type of data you're transferring)...
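On that caveat about data type: a quick way to see why compression helps text but does nothing for already-random (or pre-encrypted) streams, using the same zlib that ssh's compression relies on:

```python
import os
import zlib

# Redundant text compresses dramatically...
text = b"the quick brown fox jumps over the lazy dog " * 200
compressed_text = zlib.compress(text)

# ...but random bytes (which is what an encrypted stream looks like)
# don't shrink at all; the zlib framing even adds a little overhead.
random_bytes = os.urandom(len(text))
compressed_random = zlib.compress(random_bytes)

print(len(compressed_text) < len(text) // 4)        # → True
print(len(compressed_random) >= len(random_bytes))  # → True
```

This is also why compress-then-encrypt is the sensible order: once the data is encrypted, there's no redundancy left for zlib to find.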
Re:SFTP (Score:1)
Re:SFTP (Score:2)
If you then choose to encrypt it, your crypto algorithm will at some point not be able to keep up with the pipe available.
If you choose not to encrypt it you will have saved an almost unimaginable number of cycles to use pushing data down the pipe instead.
Remember what started this thread though. I don't disagree that encryption is useful, just that the naked assertion that security is the most important thing isn't nearly always true.
Re:SFTP (Score:2)
Re:SFTP (Score:2)
True that the data going down the pipe is the same size, but unless you have multiple machines sending pieces of the same data down different paths to the endpoint, you find that more often than not your CPU cannot encrypt the data quickly enough to keep up with the pipe that is available to it.
So in quick tests we find that SCP takes about twice as long as native FTP. I've never tested SFTP, but imagine similar results. Do you know?
Re:SFTP (Score:1)
I've seen dramatic speedups by putting this
in $HOME/.ssh/config:
Host *
Cipher blowfish
It is *much* faster than the default (3DES)
I'd definitely say WebDAV. (Score:1)
Not only that, but you can mount WebDAV trees from your OS! MacOS X and Windows 2000/XP do this happily, just give it a WebDAV URL under their 'connect to server' dialog. Unfortunately, 2000 (XP is supposedly much improved) isn't a full redirector, so you can only use File Explorer & Office against WebDAV, not WinAMP, for instance.
You can also get a WebDAV mounter for Linux too.
It's also readily supported.. Office supports it, as well as some Macromedia & Adobe products I believe. Even Oracle has it.
Now someone just needs to make a nice multi-user WebDAV server for UNIX.. I'd love to move my Ogg collection to a WebDAV server, and have people upload to it as a user other than nobody.
Downloader for X (Score:1)
http://www.krasu.ru/soft/chuchelo/
- James
not what you're looking for, but proftpd is nice (Score:1)
sendfile (Score:2)
webdav (Score:1)
Related question: multicast file transfer (Score:1)
On our latest project, we have a large number of embedded targets running Windows CE on a TCP/IP (cable modem) connection to a file server running Windows NT. We have requirements to be able to download identical images to all the targets for software upgrades within a certain period.
The problem is that TCP does not support a multicast to multiple targets - you essentially wind up retransmitting the same data over and over again. Past a certain number of targets, a system upgrade ends up taking longer than it did over an older legacy serial link that supported broadcast.
So, here's my question: Does anyone know of a multicast file transfer protocol (not simply serial FTP) that is suitable for this application, preferably something open source?
Thanks
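Not a full answer, but the building block is UDP multicast: one send reaches every receiver that has joined the group. A minimal sketch (the group, port, and loopback interface are illustrative; a real multicast file-transfer protocol also needs loss recovery via NAKs or FEC, and rate control, which this omits):

```python
import socket
import struct
import threading

GROUP, PORT = "224.1.1.1", 5007
LOOPBACK = "127.0.0.1"          # keep the demo on one machine

# Receiver: join the multicast group on the loopback interface.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
rx.bind(("", PORT))
mreq = socket.inet_aton(GROUP) + socket.inet_aton(LOOPBACK)
rx.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
rx.settimeout(5)

received = []
def listen():
    while True:
        data, _ = rx.recvfrom(2048)
        seq = struct.unpack("!I", data[:4])[0]
        payload = data[4:]
        received.append((seq, payload))
        if payload == b"":       # empty payload marks end of transfer
            break

t = threading.Thread(target=listen)
t.start()

# Sender: one copy of each chunk reaches every joined receiver.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
              socket.inet_aton(LOOPBACK))
tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)

image = b"firmware-image-bytes" * 50   # stand-in for the upgrade image
CHUNK = 512
chunks = [image[i:i + CHUNK] for i in range(0, len(image), CHUNK)] + [b""]
for seq, payload in enumerate(chunks):
    tx.sendto(struct.pack("!I", seq) + payload, (GROUP, PORT))

t.join()
reassembled = b"".join(p for _, p in sorted(received))
print(reassembled == image)
```

The sequence numbers are what let receivers detect and NAK lost datagrams; that reliability layer is the hard part the existing reliable-multicast protocols solve.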
US healthcare mandates are pushing this (Score:1)
Pretty much across the board, insurers are moving to HTTP-based solutions (over SSL, obviously). With a few lines of Perl/Java/PHP you can ride on top of existing SSL transport, easily provide redundancy, be universally available to any type of client, etc. Command-line tools like cURL (a kick-ass utility, btw) make it scriptable/automatable.
Features like job control and smart restart aren't inherently included with HTTP-based apps, though.
FTP over SSL isn't fully standardized from what I can tell. See http://www.ford-hutchinson.com/~fh-1-pfh/ftps-ext
LFTP? (Score:2)
Not sure if it does the encrypted password part, but it has almost every other bell and whistle out there. My fave is the 'mirror' and 'mirror -R' commands - does a comparison with the local file timestamps/sizes and only "get"s or "put"s the required files.
Bittorrent (Score:3, Interesting)
Content-Addressable Web (Score:2, Interesting)
The Content-Addressable Web provides all of the asked-for features, including multi-source/parallel downloads, and the ability to safely retrieve content from untrusted mirrors.
Please read the paper and tell me what you think.
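The core trick is simple enough to sketch: name data by a cryptographic hash of its content, so bytes fetched from any untrusted mirror can be verified locally before use (the in-memory "store" below is a stand-in for a mirror or peer):

```python
import hashlib

def address(content: bytes) -> str:
    # the content's name IS its hash, so the name self-verifies
    return hashlib.sha256(content).hexdigest()

store = {}                       # stand-in for an untrusted mirror

original = b"some published file"
addr = address(original)         # publisher hands out this address
store[addr] = original

# Fetch from the untrusted store and verify against the address.
fetched = store[addr]
print(address(fetched) == addr)  # → True: safe to accept

tampered = b"some trojaned file"
print(address(tampered) == addr) # → False: reject the download
```

Since verification needs nothing but the address itself, any number of mirrors or peers can serve the same content without being trusted, which is exactly what the multi-source download feature builds on.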