Ask Slashdot: Hardware Accelerated Multi-Monitor Support In Linux? 278

Posted by timothy
from the 32.174-ft/s^2 dept.
An anonymous reader writes "I'm an Engineer with a need for 3 large monitors on the one PC. I want to run them as 'one big desktop' so I can drag windows around between all three monitors (Windows XP style). I run Debian and an nVidia NVS450. Currently I have been able to do what I want by using Xinerama which is painfully slow (think 1990s), or using TwinView which is hardware accelerated but only supports 2 monitors. I can live without 3D performance, but I need a hardware accelerated 2D desktop at the minimum. What are my options? I will happily give up running X and run something else if I need to (although I would like to keep using Xfce — but am open to anything). I am getting so desperate that I am starting to think of running Windows on my box, but that would be painful in so many other ways given my work environment revolves around the Linux toolset."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Saturday July 27, 2013 @11:27AM (#44399663)

    A pair of nvidia 9800gtx cards gives me quad DVI, on which I run three monitors. The option you are seeking is BaseMosaic. I don't have the config in front of me or I would include it.
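
    For reference, the relevant knob is something like the following in the xorg.conf Screen section. This is only a minimal sketch based on the nvidia driver's documented BaseMosaic option; the MetaModes layout below is a placeholder you would adapt to your actual GPUs and monitors:

```
Section "Screen"
        Identifier "Screen0"
        Device "Device0"
        DefaultDepth 24
        # Stitch outputs across GPUs into one accelerated desktop
        Option "BaseMosaic" "true"
        # Placeholder layout: three 1920x1080 panels side by side
        Option "MetaModes" "GPU-0.DFP-0: 1920x1080+0+0, GPU-0.DFP-1: 1920x1080+1920+0, GPU-1.DFP-0: 1920x1080+3840+0"
EndSection
```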

  • by tramp (68773) on Saturday July 27, 2013 @11:30AM (#44399689)
    arandr is a standard package in Debian and can be used with Xfce too. http://packages.debian.org/unstable/main/arandr [debian.org]
  • by Anonymous Coward on Saturday July 27, 2013 @11:32AM (#44399703)

    This works out-of-the-box with any number of monitors (well, as many as the number of CRTCs provided by your GPU) for ATi Radeons (both free and proprietary drivers) and Intel (free drivers).

    Now, embedded Intel usually only has two CRTCs, but the newer Radeons have at least three, up to six.

    You just need to configure the viewports using your preferred desktop environment or directly using xrandr or the x.org config.
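
    As a sketch of the xrandr route (the output names below are hypothetical; run `xrandr -q` to see the real ones on your system):

```
# List connected outputs and their supported modes
xrandr -q

# Arrange three monitors side by side as one big desktop
# (DP-1, DP-2, HDMI-1 are placeholder output names)
xrandr --output DP-1 --auto --pos 0x0 \
       --output DP-2 --auto --right-of DP-1 \
       --output HDMI-1 --auto --right-of DP-2
```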

  • by Coeurderoy (717228) on Saturday July 27, 2013 @11:34AM (#44399719)

    You might be using the open source driver and not the nvidia driver.
    We use two GTX 220 or GT 650 cards and plug in three or four monitors without any hassle, but we do use the proprietary nvidia driver.

    And the result is quite fast (we typically test our games on two full HD monitors while running our development tools on one or two others).

    I suspect the NVS450 is also more expensive than our setup :-)

    BTW we use either debian or ubuntu depending on the whim of each developer.

  • by amginenigma (1495491) on Saturday July 27, 2013 @11:35AM (#44399721)
    I also do not have the config in front of me, but mosaic is what you are looking for in your xconfig. A bit of googling (ftp://download.nvidia.com/XFree86/Linux-x86/256.35/README/sli.html) should point you in the right direction. And yes, once configured it's as 'easy as Windowz...'
  • by Marrow (195242) on Saturday July 27, 2013 @11:35AM (#44399729)

    Try the README.txt

  • by Anonymous Coward on Saturday July 27, 2013 @11:39AM (#44399761)

    Multi-monitor isn't the problem here. Hardware-acceleration is the problem.

    Last time I checked, the official nVIDIA driver is the only one that implements 2D render acceleration, and it is still marked as experimental (and has been for something like 10 years). Even then it is only partially supported by other GUI functionality, such as multi-monitor; most applications/toolkits don't even know about it. Hardware acceleration of anything except 3D for gaming is difficult with X-window because:

    1) You need X-window to have that acceleration API
    2) You need X-window drivers (per-vendor) to implement the acceleration API
    3) You need various X-window extensions to make use of the acceleration API
    4) You need GUI toolkits to provide a layer of higher-level acceleration API to support the acceleration API in X-window and make use of it
    5) You may also need GUI apps to make use of the higher-level acceleration API
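
    As a rough way to check layers 1–3 on a given setup, you can ask the running X server which acceleration-related extensions it advertises (this assumes the standard `xdpyinfo` utility is installed):

```
# RENDER is the 2D acceleration API most toolkits target;
# DRI2/DRI3 are the direct-rendering paths
xdpyinfo | grep -iE 'RENDER|DRI'
```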

    This cannot change overnight, and since nobody cares about hardware acceleration except gamers, there is no acceleration for your regular 2D/GUI work and there has been no progress in the field for many years. About 3 years ago I could still notice that quick-scrolling on a webpage was much slower on X-window than on Windows (using the Opera browser), though it doesn't hurt usability.

  • by Anonymous Coward on Saturday July 27, 2013 @11:47AM (#44399831)

    I use xrandr with Arch and Xfce and it works fine: https://wiki.archlinux.org/index.php/Xrandr, so I suspect arandr for Debian will achieve the same results. How did this get past the /. moderators?

  • I just used (Score:5, Informative)

    by mocm (141920) on Saturday July 27, 2013 @11:51AM (#44399863) Homepage

    the nvidia-settings tool to set up 4 monitors on my GTX670, there is no problem with speed and I get hw accelerated 3d on every screen. The driver is NVidia's 310.19. I used the TwinView option on the Layout selection screen and could put the monitors into the desired configuration with the GUI. I can move windows between the monitors and Xfce gives me panels on the separate monitors.
    The screen section in the xorg.conf looks like this:
    Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "Stereo" "0"
            Option "nvidiaXineramaInfoOrder" "DFP-0"
            Option "metamodes" "DFP-0: nvidia-auto-select +0+0, DFP-1: 1920x1200 +1920+1080, DFP-3: nvidia-auto-select +1920+0, DFP-4: nvidia-auto-select +0+1080; DFP-1: 1920x1200 +0+0; DFP-1: 1920x1200 +0+0"
            SubSection "Display"
                    Depth 24
            EndSubSection
    EndSection

    and the server layout:

    Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
    EndSection

  • by serviscope_minor (664417) on Saturday July 27, 2013 @11:52AM (#44399871) Journal

    > So why not just run windows and fire up a linux VM to run your tools in?

    Because Linux does support it out of the box. I have no idea what the user has done, but I and many other posters find that the nvidia drivers support multiple accelerated monitors with no trouble whatsoever.

    There seems to be some odd issue with his setup, so this seems to me more of a question for a slower, more persistent help forum where he can post debugging output and have some experts look at it.

  • by Anonymous Coward on Saturday July 27, 2013 @11:52AM (#44399879)

    My Ubuntu workstation has an HD 7950, using proprietary drivers installed from the Settings menu. Currently running three 1080p monitors, two of which are rotated portrait mode. Any HD 7xxx series card is supposed to be able to run up to six monitors, though you usually only get four outputs (six requires monitors that support DisplayPort daisy-chaining).

    Oh, and I occasionally play DotA 2 on Steam for Linux on this as well. Apart from trying to start on the wrong monitor, it works very well.

  • by perpenso (1613749) on Saturday July 27, 2013 @11:55AM (#44399907)
    Get a Mac. Are you sure your toolset is Linux specific? Odds are your apps and tools run fine under Mac OS X. Some info from Apple:
    http://movies.apple.com/media/us/osx/2012/docs//OSX_for_UNIX_Users_TB_July2011.pdf [apple.com]
  • by Anonymous Coward on Saturday July 27, 2013 @12:23PM (#44400079)

    No, it doesn't just work. I have a very nice triple monitor Mac setup. Besides the obvious price issue, here are my two major complaints (there are other more nitpicky ones I won't get into).

    1. Sound. I had to download a third-party app called Soundflower to get the sound to work the way I want. (Actually, the way I want is for the sound of the app on a given monitor to come from that monitor's speaker, but that's asking for unicorns so I just settled for using left and right monitors for stereo.)
    2. Fullscreen. Fullscreening any app on a monitor blanks out the other two monitors.

  • by Anonymous Coward on Saturday July 27, 2013 @12:42PM (#44400227)

    2. Fullscreen. Fullscreening any app on a monitor blanks out the other two monitors.

    You'll be pleased to know that Apple announced that fixing this is one of the major new features of Mavericks [apple.com].

  • by Mad Merlin (837387) on Saturday July 27, 2013 @01:14PM (#44400471) Homepage

    That's not necessary anymore. Kepler based cards (GTX 600 and 700) support up to 4 monitors. I'm posting from 3 monitors connected to a GTX 670.

  • by Anonymous Coward on Saturday July 27, 2013 @01:40PM (#44400687)

    You are an idiot. You realize that initially Wayland is going to use a bunch of drivers ported from X, right? An issue stemming from lack of drivers will continue to be an issue...

  • by Tawnos (1030370) on Saturday July 27, 2013 @04:41PM (#44401825)

    Not quite. I used to work on the Windows display management kernel and did a ton of testing when we brought back heterogeneous support in Win7. In XDDM (XP Display Driver Model), heterogeneous configurations were allowed, but they had issues when drivers would conflict. You could find some setups that worked and some that didn't, largely based on the drivers, cards, and the alignment of the planets.

    When Windows Vista came out, drivers moved to WDDM (Windows Display Driver Model). This model initially disallowed heterogeneous configurations. In Win7, heterogeneous support was again allowed, partially because the OS now tracked monitor connectivity state (CCD - connecting and configuring displays). Previous versions of Windows had left that to the individual drivers, which could cause conflicts and loops of bad behavior ("value add" software from vendor x sets "clone" mode, then from vendor y sets extend mode, and they fight back and forth, for example).

    So in Windows, it was allowed for every release except Vista, though it wasn't really supported or tested well until 7 and beyond.

  • by yhetti (57297) <yhetti@sh[ ]x.net ['evi' in gap]> on Saturday July 27, 2013 @04:56PM (#44401909)

    I can confirm that BaseMosaic on an NVS450 works under LMDE (Debian Testing) using:

    Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "BaseMosaic" "True"
            Option "MetaModes" "GPU-1.DFP-0: 1680x1050+0+0, GPU-0.DFP-1: 1680x1050+3360+0, GPU-0.DFP-0: 1680x1050+1680+0; GPU-1.DFP-0: NULL, GPU-0.DFP-1: NULL, GPU-0.DFP-0: 1680x1050"
            SubSection "Display"
                    Depth 24
            EndSubSection
    EndSection

  • by Miamicanes (730264) on Saturday July 27, 2013 @11:33PM (#44403805)

    > Oh, the days of using separate video cards for 2D and 3D support.
    > It was "cool" to have a setup like that, but somehow I was never interested and held out for the TNT2.

    And thanks to mass-market consumers who did the same thing, we ended up with video GPUs today that are basically a pimped out 3DFX stapled onto a dumb framebuffer, with no real 2D acceleration to speak of.

    Instead of getting hardware-accelerated B-splines and the ability to render subpixel-hinted scalable fonts via hardware in realtime sometime around 2006 like we were supposed to (going by ATI's roadmaps), we have Android and iOS hardware built around GPUs that couldn't render a full page of hinted antialiased text a-la-Postscript to display memory in 1/60th of a second if the future of their manufacturers' companies depended on it. Because 3D is trendy, hot, and sexy, and 2D isn't.

    Joe Sixpack doesn't know what a B-spline or subpixel rendering is, but he knows that 3D is "cool", and a GPU that has "more triangles" is better (the same way he "knew" a 1.8GHz Pentium 4 was better than a 1.1GHz Pentium III Xeon, and even ran out to buy a new laptop with one).

    That's why Android & iOS-based e-readers suck for interactively reading technical books that require constant page-flipping. They lack 2D spline acceleration, so they have to do everything via brute CPU force. They're too slow to render pages from scratch in realtime with "real book" aesthetics, they don't have enough memory to pre-render the whole book to ram, and they're too slow to fetch entire pre-rendered arbitrary pages from microSD in 1/60th of a second or less(*).

    (*) The fastest microSD interface on any known Android phone maxes out around 25MB/s, and a 32-bit 1280x800 bitmap weighs in around 4MB. Real-world Android phones like the S3 generally max out around 17MB/s. The fastest UHS-I Sandisk Extreme cards have a theoretical max of 95MB/s, which STILL isn't fast enough to fetch the ~187MB/sec required for realtime brute-force 1280x800x60fps @ 24 bits. In theory, the 62MB/s required to fetch 8-bit grayscale at 60fps might be do-able on a future phone with UHS-I microSD, but no current device can do it.
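
    The arithmetic behind those bandwidth figures is easy to check: bytes per frame times 60 frames per second (the figures in the footnote are rounded):

```shell
# 1280x800 at 24-bit color (3 bytes/pixel), 60 frames per second
echo $((1280 * 800 * 3 * 60))   # 184320000 bytes/s, ~184 MB/s, in the ballpark of the quoted ~187MB/s

# 1280x800 at 8-bit grayscale (1 byte/pixel), 60 frames per second
echo $((1280 * 800 * 1 * 60))   # 61440000 bytes/s, ~61 MB/s, matching the quoted ~62MB/s
```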

  • by iserlohn (49556) on Sunday July 28, 2013 @05:41AM (#44405007) Homepage

    Yes, same here on a 660. Twinview works with 3 monitors on 1 card as expected. 3D acceleration is working fine, I'm using gnome-shell (I can hear the gasps already).

"Well hello there Charlie Brown, you blockhead." -- Lucy Van Pelt

Working...