Building Distributable Linux Binaries? 128

Grubby Games asks: "I make games for a living, and I want to ensure that my games will run on as many Linux distros as possible. However, since I distribute binary game executables, the programs often fail to run on certain distros because of missing dependencies, and so forth. So, how do the Slashdot Linux gurus handle this situation? I've heard a number of theories on the subject, but have yet to find one that results in 100% cross-distro compatibility. Is it even possible, short of distributing source code?"
  • by baadger ( 764884 ) on Thursday November 24, 2005 @06:56PM (#14109675)
    How many comments until someone asks "Why not distribute the source?"?

    Ah, geeks... always finding a way to answer the question with a better(?) one.
    • by Ithika ( 703697 ) on Thursday November 24, 2005 @07:03PM (#14109706) Homepage

      You forgot "force the user to move to your distro of choice" and "I only play Nethack."

      • You forgot "force the user to move to your distro of choice" and "I only play Nethack."

        We are more open-minded than that. Most of us have also tried Slash'EM once or twice. :)

    • I would just write a script that checks for everything the game needs and, if something isn't there, informs the user and (if possible) installs the dependency for the user (with their permission, of course).
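      A minimal sketch of such a script, assuming the binary is called "mygame" (a placeholder), could be:

          #!/bin/sh
          # Report any shared libraries the game binary can't resolve,
          # then hand off to the real executable if everything is there.
          missing=$(ldd ./mygame 2>/dev/null | awk '/not found/ {print $1}')
          if [ -n "$missing" ]; then
              echo "The following libraries are missing:"
              echo "$missing"
              echo "Please install them with your distro's package manager."
              exit 1
          fi
          exec ./mygame "$@"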
    • Yes, just provide the tarball, a list of required libs, perhaps a gentle hint on using ldd, and the three or four step process to config/make/make install/change perms. Ye gads man, it's not hard to do either on your end or the users'. Granted, there are exceptions: Xnotes++ is a harsh mistress I've not yet been able to tame, but then it's hardly your standard build process.

      And if Joe Noobie hoses his libc, qt _and_ gtk libs in an attempt to play a cheesy version of Hearts, well then the lesson is more va
    • by Miffe ( 592354 ) on Thursday November 24, 2005 @07:20PM (#14109780)
      Not the only way, there is also statifier [sourceforge.net]

      From the page:
      Statifier creates, from a dynamically linked executable and all its libraries, one file. This file can be copied and run on another machine without the need to drag along all its libraries.
      • by Tamerlan ( 817217 ) on Thursday November 24, 2005 @11:19PM (#14110698) Homepage
        As someone who is involved in supporting commercial closed-source applications on 3 Windozes, 5 Linuxes and 4 Solarises, I must confess - binary compatibility on Linux is too damn hard.

        Solaris is way better than Linux in that regard; everything compiled on lower versions always works like a breeze on higher versions. Windows has a clear dividing line between retarded 95/98/ME and 2000/XP/2003/Vista, but we do not support 95/98/ME anyway, so binary compatibility on Windows is wonderful, aside from some minor glitches with missing DLLs.

        Now, binary compatibility on Linux is a total pain in the butt. Incompatibilities in glibc 2.2 vs 2.3, pthreads vs nptl, gcc C++ ABI is broken on every other gcc release, thread local storage, just to name a few hurdles.

        Distributing statically linked executables is the most reliable way to go if you want to support as many Linux variants as possible. However, there are a few things to remember:
        * If your application is security-critical and you link against a static library, then when a security flaw is later found in that library, an OS security update will leave your application vulnerable.
        * Never link statically with libdl. For example, an application statically linked on RH 7.3 with libdl will mysteriously crash on RH9.
        * Executables get big. It's even worse if you have many executables using the same libraries.
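        If you do go (mostly) static, one compromise that respects the libdl caveat above is to link the third-party libraries statically while leaving glibc, libdl and friends dynamic. A sketch, with illustrative library names:

            # Link SDL and zlib statically, but keep libc/libdl/libpthread
            # dynamic so the runtime loader still behaves.
            gcc -o mygame main.o \
                -Wl,-Bstatic -lSDL -lz \
                -Wl,-Bdynamic -ldl -lm -lpthread

        This pins the libraries most likely to differ across distros without tripping over the libdl problem.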

        Oh, you said you are going to market games... Boy, you are better off building them for Windows anyway. Don't let me be misunderstood, I love Linux (except for its binary (in)compatibility, of course), but it's just not the right market and business model for you anyway.

        If you keep insisting on Linux, here is a hint. Most commercial vendors shipping products for Linux are oriented towards a limited number of distros, those used in enterprises. RedHat and SuSE, that is. Mandriva? Ha-ha-ha. Oh, you say, Ubuntu and Debian are popular? Ouch, not among our customers.
        • I think his best option is to distribute packages for various common distros, and a few statically-linked tarballs for everything else. Another thing that comes to mind is CPU optimizations; I don't know if he's thought of this, but is he distributing generic 80386-compatible packages, or things designed to run on MMX or 3dNow! chips?

          Kermit and distributed.net are examples of places where you can almost always find a binary for your system. Those might not be bad places to look to figure out what sorts of
        • * Never link statically with libdl. For example, an application statically linked on RH 7.3 with libdl will mysteriously crash on RH9.

          I've heard about this problem, but I don't recall whether it is due to a bug in glibc, or a bug in Redhat's many modifications. I just know I haven't run into any such problem in Slackware. I don't even touch Redhat anymore given so many issues like this.

          Oh, you say, Ubuntu and Debian are popular? Ouch, not among our customers.

          Maybe you should do a survey of your Linux c

    • It depends what sort of games you are writing, but if not graphics intensive, you could package the game+all essentials on a lightweight linux distro, and put the whole thing out as a virtual machine using VMWare Player.
  • Ideas (Score:5, Interesting)

    by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Thursday November 24, 2005 @07:02PM (#14109700) Homepage
    You're going to have a hard time, I think. Obviously if you have to just build for a number of distros you'll want to do it for the most popular (Ubuntu, Debian, Fedora, and Gentoo, for example). That said, there are some solutions:
    • Java
    • Python
    • Static Build

    Java and Python are obvious. You can always use things like JNI and the Python-C bridge if you feel you need better speed in some parts. As long as those parts don't link to any external code, you won't have any problem. They are both very fast these days, and by using things like PyOpenGL and Java3D you can get accelerated rendering and everything. Plus you would only need one codebase for Windows, Linux, and OS X.

    As for a static build, you may or may not want to do that. The size of the executable will be much larger, and it will take more memory to run, but you won't have the problem of everyone having a different version. It would solve your problem.

    As an odd suggestion: why not build all your code into objects and distribute those with a little script to link them on the client machine into an executable? That would fix the problem, right? The problem is you couldn't strip the binary (if you do that) because the linker needs the symbols (right?). But it would solve your problem. This is my understanding of how nVidia's kernel drivers work (they have a little glue source too, which you could include). This way they don't have to put out a version for every combination under the sun.
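    To make that concrete, the install-time link script could be as small as this sketch ("glue.c", "game.a" and the library list are hypothetical):

        #!/bin/sh
        # Relink the shipped objects against whatever libraries this
        # machine actually has. Run once at install time.
        cc -o mygame glue.c game.a -lSDL -lGL -lm || {
            echo "Link failed; install the SDL and OpenGL dev packages and retry."
            exit 1
        }

    One catch: linking with -lSDL needs the libSDL.so dev symlink, which many distros only ship in the -dev/-devel package.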

    I hope this helps. This is one area where Linux is (rather far) behind Windows and OS X. It's not much of a problem if you've got the source, but if the application is closed source (like yours) then it can be a headache. This is something that the LSB was trying to solve, but I don't know how far they got (or how many distros follow their guidelines).

    • Java and Python are obvious. You can always use things like JNI and the Python-C bridge if you feel you need better speed in some parts. As long as those parts don't link to any external code, you won't have any problem.

      That is a big if ;). Personally, I'd stay away from Python and use Java - Python's nature (lack of data hiding, lack of strong typing) makes it very easy to have the program collapse into a horrendous mess. On the other hand, for proof-of-concept things and people with tempered steel disc

      • Just curious, why avoid Boost? Most of Boost is template code, so it gets compiled into your objects and doesn't leave any library dependencies. I use Boost heavily and distribute to customers who don't have it installed.
    • As an odd suggestion: why not build all your code into objects and distribute those with a little script to link them on the client machine into an executable.
      That's exactly what I already do, actually :) It works on about 90% of the Linux machines it has been tested on, but I want that last 10%!
    • Python and Java aren't as simple as you imply. Non-trivial applications are going to require other libraries. They may need printing, gui, data exchange, remote communication, database etc. There are extra packages that can be installed to do all that for each environment, and in many cases those packages aren't provided with each distro, or they don't keep the versions up to date.

      I package a GPLed Python application which requires 8 other Python libraries. Although users could run from source, they wou
      • Python has its dependencies -- you have to figure out which GUI toolkit to use, and there are other interesting libraries. You don't just install Python -- it is Python plus a bunch of supporting stuff.

        Java on the other hand is Java. Unless you want to go the SWT route, if you are using Swing/Java2D, all the stuff is there.

    • As an odd suggestion: why not build all your code into objects and distribute those with a little script to link them on the client machine into an executable. That would fix the problem right?

      Nope. If it was that simple, it'd just work anyway. Problem is, if structures in header files or whatever change, then the old object files are no longer compatible with the new library, and a recompile is needed.
    • Not so hard (Score:3, Interesting)

      by jd ( 1658 )
      Java and Python are only maybes - different versions of the engine may break the scripts. (Actually, if you're going to include scripts, then you can also include Perl, Tcl/Tk, Ruby and Scheme.) The only way to be 100% sure a script will run is to also ship a copy of the interpreter (statically linked), which you conditionally install if no interpreter is present or the one installed is the wrong version.
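      A launcher along these lines handles the conditional fallback; the interpreter version and paths here are made up for illustration:

          #!/bin/sh
          # Use the system interpreter if it is the version we tested
          # against; otherwise fall back to the bundled static copy.
          if command -v python2.4 >/dev/null 2>&1; then
              exec python2.4 ./game.py "$@"
          else
              exec ./bundled/python2.4 ./game.py "$@"
          fi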

      For regular binaries, a static build is pretty much guaranteed to work. Alternatively, all Linux distributions I know of

  • It's not possible. (Score:2, Informative)

    by keesh ( 202812 )
    It can't be done with dynamic linking, despite what the Autopackage people tell you. There's too much variety between distributions with things like libc and libstdc++ versions. There's even too much variety between distribution releases in many cases.

    Most distributions don't aim to provide binary compatibility between releases. Even RedHat gets sick of shipping multiple copies of libraries eventually... And if you throw distributions like Gentoo into the mix, anything binary-based using dynamic linking is o
    • by Spudley ( 171066 ) on Thursday November 24, 2005 @08:10PM (#14110017) Homepage Journal
      *shrug* Of course, any respectable Linux user won't use your software unless they can see the code anyway. How else do we know it's not chock full of security holes and spyware?

      Ah. So that's why Loki went bust.

      But you know what... I'm respectable, and I bought their games. And no, I didn't get any source code.

      There are legitimate reasons not to release source code, and the original poster didn't give any details of his licence, so I think you're being excessively harsh to call it obnoxious.

      The decision to distribute source code or not does not affect the quality of the software. It may make it harder for you to look inside, but remember - Sony didn't release their DRM source code either.
      • I bought [Loki's] games. And no, I didn't get any source code.

        Nor did I, and that means that I have no hope of ever getting the bugs in Railroad Tycoon II fixed. (And that includes the one that makes the game crash if you want to play multiplayer. I wonder how they could not notice that before release.)

        I, too, see the point in hiding the source code of games while they're new. But I would love for the law to require source to be released when a program is no longer actively maintained. And that will of

    • It's possible to distribute statically linked LGPLed binaries, as long as you include in the distribution all the .o files and library source needed to make it possible to relink with a modified library.
    • The upcoming Autopackage release is much improved in that regard (currently in beta stage). It allows you to ship dynamically linked binaries for several libstdc++ versions based on binary deltas.
    • "It can't be done with dynamic linking, despite what the Autopackage people tell you"

      Dude, *we* aren't the only ones that tell you it can be done. The *users* also tell you it can be done. What more proof do you need?

      "There's too much variety between distributions with things like libc and libstdc++ versions."

      That is exactly why we have developed APBuild. We've done extensive research into finding out *why* glibc versions cause problems.
  • I believe UT2K4 achieves this by including the libraries themselves in one of the application's directories. Just link to the ones you place in the application's directory as opposed to system libraries and you are all set. Correct me if I am wrong, which is entirely possible.
    • I believe UT2K4 achieves this by including the libraries themselves in one of the application's directories. Just link to the ones you place in the application's directory as opposed to system libraries and you are all set.

      Neverwinter Nights also packaged SDL to a subdirectory, and then had a wrapper script say "export LD_LIBRARY_PATH=./miles:./SDL-1.2.5:$LD_LIBRARY_PATH" which worked fine and let the user update the libraries if needed.

      • There's no need to use shell scripts that export LD_LIBRARY_PATH any more, and there hasn't been for many, many years. With the GNU dynamic linker, and any other modern linker, you can specify an $ORIGIN which will be searched for libraries, for example "./libs".

        Read this for more about why LD_LIBRARY_PATH is bad practice:
        http://www.visi.com/~barr/ldpath.html [visi.com]
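        For example (a sketch; "mygame" and the library list are placeholders), note the single quotes, which keep the shell from eating $ORIGIN before it reaches the linker:

            # Make the loader search ./libs relative to wherever the
            # binary actually lives. In a Makefile, write $$ORIGIN.
            gcc -o mygame main.o -lSDL -Wl,-rpath,'$ORIGIN/libs'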
        • There's no need to use shell scripts that export LD_LIBRARY_PATH any more, and there hasn't been for many, many years. With the GNU dynamic linker, and any other modern linker, you can specify an $ORIGIN which will be searched for libraries, for example "./libs".

          I'm sure you can. However, using a shell script and LD_LIBRARY_PATH allows the user to switch to using the system's default library if they want. For example, I have a newer version of SDL installed than the one supplied with NWN, and NWN works pe

    • Comment removed based on user account deletion
      • Re:UT2K4 (Score:3, Informative)

        by rbarreira ( 836272 )
        The entire point of having dynamically linked libraries is to have one central source to update your libraries at, and to share that functionality with other programs. If you are going to include the dynamic library with the program, it's easier to just make a statically linked executable...

        But you can't statically link LGPL licensed libraries...
        • huh?? (Score:3, Informative)

          by mkcmkc ( 197982 )
          Can't statically link LGPL-licensed libraries? That's news to me. What is it that makes this impossible?

          Mike

          • You can statically link, but if you do so and distribute, then you must provide the source of the application you statically linked the libraries in to.

            The LGPL only allows for dynamic linking without providing source, and the resulting application must allow the substitution of newer, improved, or replacement versions of the libraries you link to.

            Personally, I'd question the need to use any sort of license if you just want to dynamically link - you're not using any of their code (assuming the header files
            • Re:huh?? (Score:3, Interesting)

              by Mprx ( 82435 )
              See LGPL section 6a:

              Accompany the work with the complete corresponding machine-readable source code for the Library including whatever changes were used in the work (which must be distributed under Sections 1 and 2 above); and, if the work is an executable linked with the Library, with the complete machine-readable "work that uses the Library", as object code and/or source code, so that the user can modify the Library and then relink to produce a modified executable containing the modified Library. (It is

              • ahhh, OK.
                I can see I'll have to read it again a few times. The GPL is easy enough to understand, but the LGPL kinda makes my head hurt.
                • Rule of thumb is this: If you use an LGPL library, it must be possible for the end-user of your application to replace the LGPL library with something else. There are a few different ways to do this - dynamically link, statically link but ship source, statically link but ship object files & linker scripts.
  • I have noticed that applications compiled for glibc 2.1.3 run on systems with glibc 2.3.1, but not the reverse. You can find glibc 2.1.3 and gcc 2.95 on Slackware 7.1.

    Netscape 4.8 and Maple 5.0 have been compiled for libc5 and run on many systems.
    • I meant that applications compiled for glibc 2.1.3 run on systems with glibc 2.3.4. I have not tried 2.3.1.
    • In general, link against the least specific filename. If you link against libfoo.so.1.2.3, it will be difficult to get your software running with libfoo.so.1.2.2 or libfoo.so.1.3.4, even though both should be compatible. If you link against libfoo.so.1, there is a much better chance that things will work.
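      You can check what your binary actually recorded ("mygame" is a placeholder):

          # List the sonames the loader will request at run time; ideally
          # these are major-version names like libfoo.so.1, not
          # fully-qualified ones like libfoo.so.1.2.3.
          objdump -p mygame | grep NEEDED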
  • by RAMMS+EIN ( 578166 ) on Thursday November 24, 2005 @07:35PM (#14109857) Homepage Journal
    If you're making games, it's usually a good solution to make the game engine open source, and charge for the data. The game engine can get ported to dog knows what platforms without any effort from you, and you can still get compensated by charging for the data files. The real value of a game tends to be in the data files, as they control the story line, the graphics, the sound effects, basically everything. The game engine is just like a library used to play the datafiles. Of course, not all games work like this.
    • by Anonymous Coward
      If you're making games, it's usually a good solution to make the game engine open source, and charge for the data.

      (ppffftttt) Milk just shot out of my nose (and I'm not drinking milk).

      The real value of a game tends to be in the data files

      Wipe the drool from your chin before posting, please. Do you know how brutally retarded you sound? Take any commercial game company. For example:

      "John Carmack - your Doom 3 engine is merely a 'playback library', the real value here is in the 'data files' (snicker)".

      "Rocksta
  • Easy (Score:2, Funny)

    by orasio ( 188021 )

    1 - Statically Link everything
    2 - Realize you are linking GPLed or LGPLed libraries. Who will notice? They will.
    3 - Remove GPL and LGPL libraries.
    4 - Look for proprietary-friendly libraries.
    5.a - Develop for .NET. Be sure you charge enough for support.
    5.b - Cave in after finding you can't afford it, or you can't find what you want, and GPL your own code. Keep the artwork proprietary. Try to raise money to "liberate" your game.
    5.c - Pay for libraries. Distribute your games. Be happy.

    6.a - ???
    6.b - W

      1. Statically Link everything
      2. Realize you are linking GPLed or LGPLed libraries. Who will notice? They will.
      3. Remove GPL and LGPL libraries.
      4. Look for proprietary-friendly libraries.

      Two corrections:
      1. Even dynamic linking of GPL'ed libraries into non-free software is prohibited.
      2. LGPL'ed libraries ARE proprietary-friendly. This is the key difference between GPL and LGPL. You can link LGPL'ed libraries to whatever you like.
      • 1. True
        2. Not enough, for high values of proprietary-ness, such as the case we are talking about, where the guy wants to distribute binaries that run by themselves.

        --------

        I'll use this to bitch about moderation.

        I _do_ have karma to burn, but I really hate to see a +1, Funny, and then a -1, Overrated.
        Nobody cares that you think something is not funny. Punishing people who get Funny ratings (This combination is a -1 karma penalty), even if they are not fun to you, ruins my experience. I like silly comments,
  • Half-Binary (Score:3, Interesting)

    by RAMMS+EIN ( 578166 ) on Thursday November 24, 2005 @07:45PM (#14109908) Homepage Journal
    If you absolutely can't have the source to your app open, you could do like the developers of binary kernel modules: ship your software in object format, with all the parts that interface with the rest of the system open source. That way, people can't read your actual application source, but they can compile and link the glue layer to make your code work on their system.

    If you take this approach, be _very_ careful to have _everything_ that interfaces with the outside system in the open source part; I've worked with kernel modules where some OS calls were still made from inside the binary part, and they were hell.
    • Re:Half-Binary (Score:4, Informative)

      by jonwil ( 467024 ) on Thursday November 24, 2005 @08:22PM (#14110065)
      Even that doesn't necessarily work if C++ is involved and the system it's built on and the one it's run on have different GCC and libstdc++ versions with different, incompatible ABIs...
  • by ubiquitin ( 28396 ) * on Thursday November 24, 2005 @07:50PM (#14109926) Homepage Journal
    I'd suggest doing a bootable CD. You can have total control, then. In my limited experience, gamers are typically not too disturbed by having to do a reboot to get to their game.
    • That depends on the game, and for some people it doesn't even depend on that (they'd always be annoyed by it).
    • Ugh. I'm a windows gamer (I realise I am kissing my karma goodbye here) but I absolutely abhor rebooting my system at all, especially just to play a game for five minutes.

      It's the same reason I don't use Linux. I would like to use Linux. I know Linux isn't a beginner system and there's a lot of work involved in making it behave smoothly and completely on a system, I'm not afraid of that. I'm an I/T worker (this supports my gaming habit). But I don't use it because I need to play games, lots of games, a
      • This leaves me with dual booting as the only viable option. That would mean an awful lot of restarts because I have a short attention span. And why restart when Windows happens to run all the apps I want as well as the games?

        I have two boxes under this desk, one Ubuntu Linux, the other XP... I have a KVM switch for swapping between them, so the flashy windows games are only a button click away, and I can leave one paused while getting on with real work on the Linux box...

    • You totally don't need to reboot; just chroot to the CD's root dir.
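      Roughly like this (device, mount point and binary name are illustrative, it needs root, and you'd still have to sort out access to X and the sound devices):

          # Run the game from the live CD's filesystem without rebooting.
          mount -o ro /dev/cdrom /mnt/gamecd
          chroot /mnt/gamecd /usr/bin/rungame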
  • by RAMMS+EIN ( 578166 ) on Thursday November 24, 2005 @07:54PM (#14109953) Homepage Journal
    ``100% cross-distro compatibility. Is it even possible''

    In a word, no. Binary compatibility just isn't. It happens to work in very select situations, usually on proprietary operating systems that are themselves binary-only, which means the developers of the OS would be dealing with the same headaches you would if they broke binary compatibility.

    On Linux, where binary compatibility has very low priority because _everything_ on the system works fine without it, binary compatibility happens only by accident. It starts with the fact that Linux runs on many architectures, and people actually use it on most, if not all, of these architectures.

    Then there are distributions that are still using libc5. Think they don't matter? What happens when libc7 comes out? And if you think that isn't going to happen for a long time, how about when X.org replaces the monolithic structure they inherited from XFree86 with their new, modular one? Or how about when Qt 4 replaces Qt 3, or GNOME 3 is released? These projects are not going to let themselves be held back because of your app. Because this is what binary compatibility means: slowing progress, because some changes become impossible.
    • In a word, no. Binary compatibility just isn't. It happens to work in very select situations, usually on proprietary operating systems that are themselves binary-only, which means the developers of the OS would be dealing with the same headaches you would if they broke binary compatibility.

      There are a couple of paths that proprietary OS's have taken to maintain binary compatibility. One approach taken by Apple and Sun is to thoroughly document what will be the stable ABI and maintain backwards compatibili

      • I used to be on the ABI team at Sun that did this, and it's really pretty trivial.

        A minimally necessary step is to label interfaces with what revision of what standard they use. The distributor then adds interfaces labeled "POSIX 20.3.6" (an imaginary standard) when the 3.6 release comes out. They don't remove the 20.2 stuff until no-one uses it any more (e.g., when they switch from 32 to 64-bit), and ELF apps link to the version they need.

        If an application needs a newer version than 20.2, the user can

    • I like the way id has done this. Sure, way too much is statically linked, but often the games work even if you rip out their distributed libs and use the standard ones from the distro. And, by the time it becomes too much work for them to maintain the thing, it's also no longer profitable, so they release the source and let us worry about it.
  • Static Linking. It results in bigger bloated binaries - but at least they're self-contained.
    • This tactic is not legal under the LGPL. :-/

      I think the Open Source engine, proprietary content game is probably the best way to go. Of all the kinds of software out there, games are the kind where I question the viability of the Open Source model the most.

      • This tactic is not legal under the LGPL

        Unless the copyright holder(s) of the libraries agree to licence it under a different licence.
        Neither the GPL nor the LGPL precludes the owner from making individual agreements with other parties.
        • Once a library (or any software, for that matter) gets touched by enough fingers, though, doesn't that become a large problem, even just in finding out who to ask?
        • It's *still* not legal under the LGPL - it becomes legal under something else which is not the LGPL. And you win the prize for "least useful non-answer to a concern".
    • I'm quite sure that losing the library-reuse advantage of dynamic libraries is pretty much not a problem at all with games, since the typical user won't run a dozen of them at the same time (quite unlike anything office- or communication-related).

      What you still have is the hard-disk space issue, but considering the typical ratio of code vs. media in games this won't be much of a problem either.

      It would be a wholly different story in the case of an IM client, for example.
    • They are self-contained.

      Do you remember the zlib buffer-overrun problem a while back? So many applications statically linked to zlib had to be updated.

      How about the jpeg buffer overrun, where MS provided a program to search out all the Office apps that were statically linked to the jpeg library so they could be patched?

      I just thought I'd mention the downside.

      sam
  • klik (Score:4, Insightful)

    by astrashe ( 7452 ) on Thursday November 24, 2005 @07:59PM (#14109970) Journal
    You might want to take a look at klik:

    http://klik.atekon.de/ [atekon.de]

  • If you just link the stuff statically, you shouldn't have much trouble.

    I would probably set up an apt repository for Debian, where it's easy to make sure it works with the given dependencies. I would complement that with a statically linked rpm and plain old tarball.

    Well, in actual fact I would also distribute it as FOSS, of course. I doubt that would impact your revenue negatively. Rather, the fact that people have the possibility to patch bugs and port it themselves would probably make them more willing t
    • ...would probably make them more willing to shell out for it in the first place

      Oh yes, because I see so much open source software making money on sales. More likely it would be copied and passed around on the p2p networks with no worries. I personally don't think games will ever do well as open source; too much capital has to be spent on development to allow anyone to give it away. You also would have massive cheats created for any network play, which has killed several games.
      • Oh yes, because I see so much open source software making money on sales. More likely it would be copied and passed around on the p2p networks with no worries.

        How is that different from any other computer games, now?
  • by torpor ( 458 )
    distribute your game as a chroot'able .dd.tar.gz file ..
  • by fdragon ( 138768 ) on Thursday November 24, 2005 @09:25PM (#14110285)
    You have a few different options, each with its own drawbacks. Depending on the libraries you are currently using, you may have to combine several of these.

    Static linking allows you to distribute a single binary that will run on any Linux system, so long as you have the correct minimum kernel version and CPU. Problems with this include that the licenses on the libraries you are using may not allow it, and that the application's code image changes from what you have debugged against.
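    In its simplest form that's just (names are placeholders):

        # Fully static build: no runtime library dependencies at all,
        # at the cost of size and of losing library security updates.
        gcc -static -o mygame main.o -lSDL -lm -lpthread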

    Dynamic linking, putting all of the required dynamic libraries in the application's own private library directory. This is the quickest solution, but not always the best one. It does allow you to release periodic incremental patches to the binaries used by your application.

    Using an application such as Elf Statifier [sourceforge.net] to take your debugged dynamic application and bundle the applicable libraries into it. This is a halfway point between a completely dynamic and a completely static application that allows you to take your "GOLD" release and package it up without changing the generated code in your application.

    Just release your application as a dynamic application, mark it with the correct dependencies in RPM format, and follow the Linux Standard Base [linuxbase.org].

    Most commercial applications will use either the 2nd method (dynamic and distribute all their own versions of various dynamic libraries) or the 4th method. In both cases they specify a message to the effect of "We support X distro version Y. It may work on your linux distro, but we only support X version Y."

    As you are planning to make money off of this game, I wish you luck, and suggest you take a look into what the most popular Linux distros for your target audience will be. Based on advertisements from Wal-Mart and Fry's selling Linux preloaded, I would bet Linspire and Xandros are high on that list. As with all things, don't forget to test across many distributions to make sure the solution you choose actually works.
  • I saw blah blah blah distro above, so here's a fun idea: why not make the necessary libraries for running the application available, and then run the program in a self-contained box (chroot jail?) on distros that won't otherwise work with said game?
  • by davecb ( 6526 ) * <davecb@spamcop.net> on Thursday November 24, 2005 @10:19PM (#14110462) Homepage Journal
    If you can't run the same binaries on all the distributions, then you're on the way to suffering from the same thing that made Windows so much more popular than Unix.

    Distributors who want to make it hard for you to run your binaries are every bit as wise as the Unix vendors who tried to avoid standardization and the risk of mutual success...

    --dave

    • by Anonymous Coward
      Library hell on Linux is ten times worse than DLL hell ever was on Windows. So that ship has already sailed...
    • Linux solved this problem by changing the software distribution model from customer-centric to distribution-centric. I.e., the customer gets software from the distribution, which compiles and configures the software from source. The problem with commercial apps is that they want to bypass the Linux model and use the Windows model, where distributions are configured similarly enough that the same app runs on all versions. Companies that want to run commercial software successfully under Linux on machines
  • autopackage (Score:3, Informative)

    by mrsbrisby ( 60242 ) on Thursday November 24, 2005 @10:42PM (#14110562) Homepage
    autopackage [autopackage.org] is probably the current best tool for this. It makes a single, easily installable (and removable) package while coercing my system's GCC and libs into versions that are suitable for other distributions.

    I use it currently for schism tracker [rigelseven.com] (particularly the CVS builds [nimh.org] that I do).

    It works very well for me: Debian, Ubuntu, SUSE, Fedora, Gentoo, and Mandrake users have reported success with these binaries built with (gasp) a lone FC4 machine.

    One day, I'll actually do a proper release and six years later it'll show up in the next Debian release. Then your fancy apt-get tools will work.

    And to kneejerk jackasses that say "just release the source"- you must realize that the source is good for you and me (well me anyway, I cannot speak for you, obviously), the people who USE these programs have absolutely NO INTEREST in doing the work that I have just done for my own purposes.

    Plus, having them use a single binary means that it's very easy to debug with nothing more than a core file.

    Oh, and I suppose I am giving the submitter the benefit of the doubt that we are indeed talking about Free Software, aren't we? :)
    • This [gentoo.org] is related to Gentoo, but it explains quite well IMO why autopackage is flawed.

      For the non-clickers among us:
      1. To even unpack the package, you must run arbitrary code supplied by an untrusted source.
      2. They install directly to the live filesystem.
      3. Their intrinsic dependency resolver is broken.
      4. They do not support the filesystem layout used by Gentoo.
      5. They do not support configuration protection.
      6. They can clobber arbitrary files on uninstall.
      7. The linking mechanism used is insufficiently flexible.
      8. The entire format
      • Why is a vendor website "an untrusted source"?

        That's like saying Firefox extensions are broken, because they might come from sites other than mozilla.org.
        • Why is a vendor website "an untrusted source"?

          * Because the site may have been hacked?
          * Because the package may have been downloaded from an untrusted mirror?
          * Because the package may have come from an untrusted third party?
          * Because the proxy server of your institution may be compromised?

          Usually the distribution package checks a signature that is already trusted, either by a one-time download of a key or by having been installed from the original cd-rom. The problem with running a package to install it is th

  • Many distributions = good (more competition, more innovation).

    HOWEVER, the catch is that the answer to the above question is not a unanimous "Yes, here's how and it'll work for most if not all distributions." Why? Too many variables to account for. To be almost universal across Linux distros, you'd have to release at least three different binaries (which at most covers a dozen or so of the most common distros). To make it completely universal across all Linux distros, you'd have to release your source
  • Dude, all you have to do is build a static binary that doesn't need any outside files, and program it so that the only things outside of your program that you depend on are system calls that have been around since, say, kernel 2.4 or so. Then you're just about guaranteed that your software will work on any system. In fact, if you use system calls that are pretty much available on any *nix, then you're pretty much guaranteed that it will work on anything with an x86.
  • Distribute all the shared libs you need, except for glibc. (BSD licensed, right?) Link all your binaries (including shared libs) against an older glibc version, and it will be forward-compatible with newer glibc versions, thanks to versioned symbols.
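    You can verify what you actually require before shipping ("mygame" is a placeholder):

        # List the glibc symbol versions the binary depends on; the
        # highest version shown is the minimum glibc it needs.
        objdump -T mygame | grep -o 'GLIBC_[0-9.]*' | sort -u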

    This pretty much guarantees compatibility across distributions. Just make sure the glibc you link against is old enough. Then just distribute binary tarballs and tell people only "glibc 2.2 or newer required", "glibc 2.3 or newer required", etc. It works pre
  • We (my company) use a disk image from Redhat 6 to compile binaries. Old versions of gcc, X, gtk, etc. The programs work fine on everything new. GCC/Gtk/QT all need to do a feature freeze for programs to work on all distros, IMHO. I might also suggest looking at the Loki installer. It works on new and old Linux systems.

    Enjoy,

  • Package your software into an LSB RPM. Anyone (using an LSB compliant distribution) can then install it.

    If you are using libraries that the LSB does not specify, build private copies and distribute them in your package.

    Good lord! That's pretty much exactly how software distribution works on Windows! :) And it works pretty well.
  • by Anonymous Coward
    Whatever happened to the good old days before dynamic libraries?
    We used to link EVERYTHING static and we had a stand-alone binary.

    I miss the old days...we've got the resources for it...multi gig RAM, multi-multi gig hard disks, etc.
  • Whether or not you like their installer/package system, you would be pretty mad not to check out their developer tools.

    Another option is to build on the oldest distro you plan to target.

    In any case, there are two main issues with building distributable Linux binaries:

    1: glibc symbol versioning
    glibc uses a symbol versioning system, which means builds made against a newer glibc will not work with an older one.

    2: macros in headers. e.g. if you use gtk 2.4 headers then your binary will end up relying on new gtk 2.4 featu
  • Nuff said. The reality is dynamic linking causes more problems than it solves. The argument about it saving space is 'blah' (i.e. true but irrelevant).
  • An option that nobody seems to have come up with yet is to use Mono. Yep, it's reverse-engineered "MS crap", but it's actually really good. C# is a nice language to develop in.

    Assuming you *can* switch to Mono (and that's a really big "if" there), you could supply instructions for how to install Mono at your Web site for the common distros - it's simple for all Debian-based distros and for Gentoo, and probably RedHat/Mandriva/etc. as well. Alternately, you could (possibly|probably) ship Mono with your co
