Delivering Software, Electronically? 220
zpengo asks: "I'm trying to find the best way to implement a large-scale Electronic Software Delivery (ESD) service for my software company. I've been able to find very little information online (after weeks of research) so I must take it to America's best and brightest. Have you ever worked with ESD on a higher than plain-vanilla FTP level, and if so, what did you learn from it? When do you consider the product 'delivered'? Was it worth it? (I'm planning to put together a public domain whitepaper on the subject with the information I gather, to help fill in the gaps I found while researching online)."
ximian's red carpet (Score:4, Informative)
ESD (Score:1, Informative)
You know, Joe Schmuck loads his own software, and blammo, my ESD job breaks. If you have rigid controls on your environment, ESD works great.
Java? You could try Java Web Start (Score:4, Informative)
If you have a pure Java Swing application, this is probably the way to go. If not, read more about it and decide whether it's appropriate.
The technology was a little rough at first, but I assume it's matured somewhat, considering that it's now part of the standard Java environment.
Java Web Start [sun.com]
Software Delivery (Score:1, Informative)
If the above doesn't fit you, then your answer is below.
There are a number of companies out there that specialize in software licensing.
Most can be integrated into a couple of different languages with very little effort at all.
One very good example of this would be..
http://www.elicense.com/
This and more information can be found on Google without a problem. (But of course this person's "research" didn't include simple searches on the most popular search engine. He did research, he really did research hard. I got that link in one minute; he spent weeks researching and sounds like he found nothing?)
Valve is doing it (Score:3, Informative)
rsync and rdist (Score:4, Informative)
From the rdist website: "RDist is an open source program to maintain identical copies of files over multiple hosts. It preserves the owner, group, mode, and mtime of files if possible and can update programs that are executing."
From the rsync website: "rsync is an open source utility that provides fast incremental file transfer. rsync is freely available under the GNU General Public License"
Web Based Software Delivery (Score:5, Informative)
For example, only users identified as Development can submit software. At that point Software Configuration Management (SCM) is notified to reproduce the software (can SCM build the same binaries as the developers?). SCM retrieves the software from the web site. Once SCM approves the software, Test is notified.
Test retrieves the software and puts it through its paces. If it passes, Test grants its approval through the web site. Otherwise the software fails and Test provides a URL explaining the problems. And so on...
At any point program management can see the state of the software in its track to customer delivery. PM has override ability to approve software for customer delivery even if it has, for example, failed testing.
The web site makes it easy to access. Access control and approval manage the software delivery process. Notification keeps everyone on the ball. And logging provides CYA - and has covered my butt on numerous occasions.
My boss particularly loves to be sitting in a Change Control Meeting and hear the development manager say, "The software's been delivered to SCM. We're waiting on them." And he can say with confidence, "Not yet it hasn't."
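The approval flow above is essentially a state machine with role-based transitions. A toy sketch, where the state names, roles, and transitions are assumptions based on this post, not any real product:

```python
# Minimal sketch of the delivery-approval state machine described above.
# States, actors, and transitions are illustrative assumptions.
ALLOWED = {
    ("submitted", "scm"):       ("built",    "Test notified"),
    ("built",     "test"):      ("approved", "ready for delivery"),
    ("built",     "test_fail"): ("failed",   "URL with problems attached"),
    # Program management can override a failed build.
    ("failed",    "pm"):        ("approved", "PM override"),
}

def advance(state, actor_action, log):
    """Apply one transition, or refuse it; every move is logged (CYA)."""
    key = (state, actor_action)
    if key not in ALLOWED:
        raise PermissionError(f"{actor_action!r} may not act on {state!r}")
    new_state, note = ALLOWED[key]
    log.append((state, actor_action, new_state, note))
    return new_state

log = []
s = "submitted"
s = advance(s, "scm", log)        # SCM rebuilds the binaries
s = advance(s, "test_fail", log)  # Test rejects the build
s = advance(s, "pm", log)         # PM overrides the failure
print(s)  # approved
```

The log doubles as the audit trail that settles "has it been delivered yet?" arguments.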
I worked for a company that did that (Score:3, Informative)
Consider this...(corporate plug) (Score:3, Informative)
software = Decrypt(software, key), where key = Hash(k1 concatenate-with k2).
This is called time-lock crypto, as described by Rivest, Shamir, and Wagner in [3].
CertainKey [certainkey.com] offers this service with all the software/crypto you need at a modest price see [1].
note: I'm a founder of CertainKey...so use discretion.
References:
[1] [certainkey.com]
[2] [com.com]
[3] [mit.edu]
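The key-derivation step above is straightforward to illustrate. A toy sketch in which the XOR "cipher" stands in for a real cipher; this is not CertainKey's actual implementation, just the key = Hash(k1 || k2) idea:

```python
import hashlib

# Toy illustration of key = Hash(k1 concatenated-with k2).
# The XOR keystream below is a placeholder for a real cipher;
# the key halves are illustrative.
def derive_key(k1: bytes, k2: bytes) -> bytes:
    return hashlib.sha256(k1 + k2).digest()

def xor_stream(data: bytes, key: bytes) -> bytes:
    # XOR is symmetric, so the same call encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

k1, k2 = b"vendor-half", b"escrow-half"
key = derive_key(k1, k2)
ciphertext = xor_stream(b"licensed software image", key)
plaintext = xor_stream(ciphertext, key)
print(plaintext)  # b'licensed software image'
```

The point is that neither half of the key alone is enough to recover the plaintext.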
Please restrain the knee-jerk reaction (Score:5, Informative)
It's also similar to the way F-Prot Antivirus [complex.is] is delivered.
Basically each customer gets a login for the web site and can download from there. It avoids serial generators and cracks because you can't just download the shareware and then apply a crack. The only people who even get the opportunity to download the software are those who have paid so it's less likely (but still inevitable) that they will give it away, share it on kazaa, etc.
Kagi.com (Score:5, Informative)
-John
How could you not find a lot of info? (Score:1, Informative)
There are some good payware service providers like Digital River, Metatec, Intraware, etc., and some decent freeware/open-source ones that you could build off of, like weps.org. And there's always Freshmeat and Tucows.
It really depends what you're trying to achieve - what you're trying to deliver, to whom and for what reasons. You may need accountability, tracking, different views for different user sets, etc. Usually, you're best off just rolling your own if you have the time & resources to implement it.
Oh, and for resuming transfers: you can use the HTTP/1.1 "Range" header to do that if the files are large and you lose connectivity.
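A minimal sketch of a Range-based resume; the URL is a placeholder, and the server must answer 206 Partial Content for the resume to work:

```python
import urllib.request

def range_request(url: str, already_have: int) -> urllib.request.Request:
    """Build a request asking only for the bytes we do not have yet."""
    req = urllib.request.Request(url)
    req.add_header("Range", f"bytes={already_have}-")
    return req

def resume_download(url, dest_path, already_have):
    req = range_request(url, already_have)
    with urllib.request.urlopen(req) as resp, open(dest_path, "ab") as out:
        if resp.status == 206:      # server honored the Range header
            out.write(resp.read())
        else:                       # 200: server resent the whole file
            out.seek(0)
            out.truncate()
            out.write(resp.read())

req = range_request("http://example.invalid/product.zip", 1024)
print(req.get_header("Range"))  # bytes=1024-
```

The fallback branch matters: a server that ignores Range replies 200 with the full body, and appending that to a partial file would corrupt it.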
Re:.....tell...us...more... (Score:1, Informative)
http://www.tivoli.com/products/index/config-mgr
Full disclosure: I work for Big Blue, and despite my bias I can tell you some HUGE companies and government agencies are happily using this product. (plus lots of small ones too)
Re:Valve is doing it (Score:3, Informative)
If only it worked through NAT firewalls. Grrrr
The server is blocking ICMP requests, which means it will not see the ICMP Fragmentation Needed packets your NAT'd boxes will send. You need to reduce the MTU to around 1412 on the machines behind the firewall, or force the MTU in the firewall itself.
If using Linux 2.4/iptables, see the netfilter kernel config help option for "TCPMSS Target Support"...
Note that, technically, this is a problem on the server side (blocking ICMP for "security" reasons) but it can be solved on your end.
(I fought with this for months before I found the problem)
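On Linux 2.4 with iptables, the fix described above can be applied on the NAT box itself; the rule and MTU value below are illustrative config fragments, not something to paste blindly:

```shell
# Clamp the TCP MSS to the path MTU on the NAT/firewall box
# (requires the TCPMSS target mentioned above):
iptables -t mangle -A FORWARD -p tcp --tcp-flags SYN,RST SYN \
         -j TCPMSS --clamp-mss-to-pmtu

# Or force a lower MTU on a machine behind the firewall:
ifconfig eth0 mtu 1412
```

Either approach keeps your packets small enough that the broken server never needs to send ICMP Fragmentation Needed at all.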
I wrote something to do this a while ago... (Score:3, Informative)
It's windows, and freeware now. You might learn about some of the issues from the documentation.
I developed such a system (Score:2, Informative)
The basic features of the system were as follows:
1) Packaging of software into the smallest deployable units. Define a standard for how files and meta-information are grouped together into a package (e.g. tarfiles, RPM's) so that the packages can be created and installed in a common manner.
2) tracking of dependencies and compatibilities between packages
3) Specification of the set of top-level packages that are required by an individual workstation
4) dependency evaluation to calculate the final set of packages to be installed, or determine if no viable package set existed because of dependency conflicts
5) a sizeable set of tools to allow us to manage this information, build packages, and track what got downloaded, why it got downloaded, and who changed what when
The combination of these features is very much like what RedHat's "update agent" (and other Linux update utilities) provides. If you have the luxury of only having to support Linux, your best bet is probably to try to adapt one of these to your needs.
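Steps 2 and 4 above (dependency tracking and evaluation) can be sketched in a few lines; the package names and the requires/conflicts table here are illustrative, not from any real system:

```python
# Toy dependency evaluation: expand top-level packages into the full
# install set, or report a conflict. Package data is illustrative.
DEPS = {  # package -> (requires, conflicts)
    "app":     ({"libgui", "libnet"}, set()),
    "libgui":  ({"libcore"}, set()),
    "libnet":  ({"libcore"}, {"oldnet"}),
    "libcore": (set(), set()),
    "oldnet":  (set(), set()),
}

def resolve(top_level):
    """Return the transitive closure of required packages."""
    install, stack = set(), list(top_level)
    while stack:
        pkg = stack.pop()
        if pkg in install:
            continue
        install.add(pkg)
        requires, _ = DEPS[pkg]
        stack.extend(requires)
    # No viable package set if any installed package conflicts
    # with another member of the set.
    for pkg in install:
        _, conflicts = DEPS[pkg]
        clash = conflicts & install
        if clash:
            raise ValueError(f"{pkg} conflicts with {sorted(clash)}")
    return install

print(sorted(resolve({"app"})))  # ['app', 'libcore', 'libgui', 'libnet']
```

Real systems add versions and provides/obsoletes relations on top of this, but the closure-then-check shape is the same.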
Softdisk - back in the day (Score:1, Informative)
AOL - Members would join a software club - billed $19.95 monthly - and be able to download from our library. This was for in-house software, not for third party. At a royalty based on $2.95 per hour, we made a few bucks there. AOL's model change pretty much ended that. We also made money from our freebie download area, albeit royalty only. All programming done in Rainman Plus. It was different and pretty easy, but there were some hideous holes in the system security-wise.
Prodigy - Customers bought software and after the transaction downloaded the software. Any disputes or problems were handed by our customer support staff, who would email or snail-mail the product if necessary. We had to snail mail our products b/c of problems w/their software delivery check-in system. We had little direct control of the store.
CompuServe - Most painless to deal with. We uploaded product ourselves. Had to use a weird scripting language to construct/modify store/pricing. It was kinda buggy, but it worked. Store performed quite well. The more often products were changed/updated, the better. Rotating ads throughout the system for promotion and front-screen placement drove huge traffic (big surprise).
eWorld - Transaction completed in online store, product was emailed to customer minutes after the transaction went through. Worked nicely, but ultimately tanked a couple of months after we got it up and running when Apple shut down eWorld.
Web - Home-brewed CGI scripts ran the store. SSL, transaction processed real-time with our bank, customer could download product for up to 72 hours. Customer support thereafter.
ESD? Isn't that just digital delivery of data... (Score:3, Informative)
I believe it's already been done. Originally, the exchange of digital information was done by a wide variety of means, then commerce kinda standardized on this thing called EDI (Electronic Data Interchange). More recently, it's evolved into an XML-based thing, but it's still EDI.
It seems to have all the attributes you need:
*) electronic delivery across platform, language, even character sets (XML will handle/requires Unicode).
*) authentication of the recipient and sender, reasonable security
*) provisions to automatically and securely exchange payment data upon receipt
*) a standardized set of tags to use for referring to the business entities involved in the exchange
*) many products already exist to facilitate this
Googling on "XML EDI" will get you a bunch of responses. From a quick once-over of the 1st page, I thought this one was a good starting point:
http://www.eccnet.com/xmledi/guidelines-s
The only downside (and you may or may not consider it a problem) is that this requires your consumers to use a program they probably don't already have to receive and decode the transmission. Ftp has the virtue of being essentially omnipresent, albeit in minutely-differing flavors, and requires a modicum of knowledge to use in cross-platform, cross character set interactions. Perhaps your target user base has that knowledge. You don't really say whether this is an intracorporate distribution mechanism, or a vehicle for direct sales to the unwashed masses. This is important.
I'm not really sure why nobody (at least nobody I know of) is using this to manage software distribution and payment exchange today, other than that the existing web-based tools are "good enuff".
I'm sure some standards group has defined the grammar for commercial sale of software and related items (media, documentation, support contracts). And I'm equally sure that someone has a nice generic java-based XML-EDI client/server implementation package. But it's certainly cheaper to whack out something using normal web tools and ftp.
In the end, cheaper usually wins out over everything else.
Re:I do! (Score:4, Informative)
Tell me this: what is different between your script writing headers, and the Apache server writing headers, to describe the content about to be sent?
Honestly, use 'wget' or 'lynx -dump' and really examine the headers that are sent when you download a file. Apache is sending those headers. This is what tells the browser what is being sent, and it's the *only* thing telling the browser what is being sent.
Simply mimic those headers (substituting the proper filename and size etc), and the browser will happily prompt the user to download.
We built an inventory system for a manufacturer, and having pre-built Excel reports was one of their requirements. We simply send an HTML table, but set the headers so it appears as an Excel file.
It just takes some trial and error, but the biggest clue is to look at the headers that are sent when you actually download a file directly. The browser doesn't know (or care) whether it's a binary webserver program, or a bash shell script, sending the headers.
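A minimal sketch of mimicking those headers from a script; the filename and payload are illustrative, and any webserver-facing details (CGI vs. module) are assumptions:

```python
import sys

def send_download(data: bytes, filename: str) -> str:
    """Build the headers a server would send for a file download.

    Content-Disposition: attachment is what makes the browser
    prompt the user to save rather than render inline.
    """
    return (
        "Content-Type: application/octet-stream\r\n"
        f'Content-Disposition: attachment; filename="{filename}"\r\n'
        f"Content-Length: {len(data)}\r\n"
        "\r\n"
    )

payload = b"fake zip bytes"
sys.stdout.write(send_download(payload, "product.zip"))
sys.stdout.flush()
sys.stdout.buffer.write(payload)  # body follows the blank line
```

Compare the output against what `wget -S` shows for a direct file download and adjust until they match.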
And if that's too much work, again, create a symlink:
ln -s filename.zip [unique-id]-filename.zip
And give a hyperlink to the symlink. That's about as simple as it gets. In Windows you could probably create a "shortcut", but I really don't know/care about that. If you're running Unix, you have a ton of options here.
Re:I do! (Score:3, Informative)
I'm generating PDFs to send dynamically. I've done the same thing with inline jpgs for ages now, without having to save them to disk in any way, shape, or form.
Browsers don't like HTTP redirects. It doesn't always work. IE5.5 is seriously broken unless you have a certain set of patches installed. Opera 6.0 Linux freaks out. Mozilla mostly handles stuff right.
Eventually I had to do something like you did, generate the file and put it on a directly accessible filesystem, which is very inefficient compared to just streaming the data out, and potentially a lot less secure.
Why can't browsers get their act together with dynamic content generated for external plugins? It doesn't seem like it would be that hard to fix... Mozilla already has it mostly right.
Re:I do! (Score:4, Informative)
I do recall there being one issue, with Mozilla/Netscape specifically, where the filename it prompts you to save is the filename of the *script*. But we got around this using mod_rewrite. So a link like this:
[unique-id]-filename.zip
becomes:
script.php?id=[unique-id]
And, since the browser is seeing "...zip" as the filename, it prompts with the correct default "Save As" filename. That's what we actually did for the Excel file, we just linked to (eg) Report.xls, which was actually a script.
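The mod_rewrite trick described above might look something like this; the URL pattern and script name are assumptions, not the poster's actual config:

```apache
# Present a .zip URL to the browser, but serve it through the
# download script; the [unique-id] becomes the script's id parameter.
RewriteEngine On
RewriteRule ^download/([0-9a-f]+)-(.+\.zip)$ /script.php?id=$1 [L]
```

Since the browser only ever sees the `.zip` URL, the "Save As" default comes out right without touching the script itself.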
Personally, I say go with the symlink idea. It's probably the easiest for you to change from your current setup; simply change your 'cp' command to 'ln -s'... the deletion of the link, downloading of the link, etc will work just the same as if it were truly a redundant copy of the file.
Of course Apache must be set to follow symlinks; don't forget to check that first.
Re:I do! - hooray for *nix users (Score:2, Informative)
We will use the free `junction` command-line tool, or linkd.exe, or one of the others, and run it from our ASP page using ASPExec from ServerObjects.com. It does the same thing as the Unix symbolic link.
So, even if this thread did not help the original poster, it helped us out, and that is a good thing.
Give yourselves all +1 karma
Good job!
Re:Please restrain the knee-jerk reaction (Score:3, Informative)
So anyone who shares it will likely be flagged by Gibson.