Ask Slashdot: Tiny PCs To Drive Dozens of NOC Monitors?

mushero writes: We are building out a new NOC with dozens of LCD monitors and need ideas for what PCs to use to drive all those monitors. What is small and easy to stack, rack, power, manage, replace, etc.?

The room is 8m x 8m. It has a central 3x3 LCD array, as well as mixed-size and -orientation LCD monitors on the front and side walls (plus scrolling LEDs, custom desks, team tables, etc) — it's designed as a small version of the famous AT&T Ops Center. We are an MSP and this is a tour showcase center, so more is better — most have real functions for our monitor teams, DBAs, SoC, alert teams, and so on, 7x24. We'll post pics when it's done.

But what's the best way to drive all this visual stuff? The simplest approach for basic/tiny PCs is to use 35-50 of these — how do we do that effectively? Almost all visuals are browser-only, so any PC can run them (a couple will use Apple TV or Cable feeds for news). The walls are modular and 50cm thick, and we'll have a 19" rack or two, so we have room, and all professional wiring/help as needed.

Raspberry Pis are powerful enough for this, but painful to mount and wire. Chromeboxes are great and the leading candidate, as the ASUS units can drive two monitors. The Intel NUC can also do this — those and the Chromeboxes are easily stackable. My dream would be a quad-HDMI device in Chromebox form factor. Or are there special high-density PCs for this with 4-8-16 HDMI outputs?

Each unit will be hard-wired to its monitor, and connected via IP-KVM (need recommendations on that, too, 32+ port) for control. Any other ideas for a cool NOC are also appreciated, as we have money and motivation to do anything that helps the team and the tours.
  • Barco... (Score:4, Interesting)

    by speleo ( 61031 ) * on Monday November 09, 2015 @12:03PM (#50893105) Homepage

    https://www.barco.com/en/solutions/Control-rooms

    • Re:Barco... (Score:5, Interesting)

      by Anonymous Coward on Monday November 09, 2015 @12:24PM (#50893275)

      You say Raspberry Pis are "a pain to mount and wire." Have you really thought about this?

      1 - Power (wire one)
      2 - HDMI (wire two)
      3 - wifi plugin. Can be set for static IP. Even a minimal router will allow you up to 50 clients on one WiFi subnet. Apple airport, for instance. And you can use more than one, so you can go up to 250 clients if you really need to. No wires. Unless we're talking about a really huge amount of bandwidth, wifi should do it. If not, ethernet cable, which would be wire three. Same issue with any client, though, so...

      My first question is, how are you going to get simpler than that? 2 or 3 connections. Seems like a doddle, frankly.

      So as to mounting:

      Is there some reason you can't use double sticky tape and just slap the thing on the back of the monitor? Or, if not that, which *is* a little hacky, use one of the ultra-inexpensive cases and put at the foot of the monitor like any other PC, only smaller, using less power, less obtrusive, etc?

      As to configuration, you can prepare the OS + software for these anywhere, walk up to the PI in question, insert the card, power it up, and you're done.
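The prepare-anywhere workflow the comment describes can be sketched in a few lines. This is a hypothetical sketch: the hostnames, dashboard URLs, and the `kiosk.conf` convention are all made up for illustration, not anything a Pi ships with:

```python
# Hypothetical sketch: stamp a per-unit config file onto each freshly
# imaged SD card so every Pi knows which dashboard URL to display.
# The screen->URL map and the kiosk.conf name are assumed conventions.
SCREENS = {
    "noc-screen-01": "http://dashboards.internal/alerts",
    "noc-screen-02": "http://dashboards.internal/latency",
}

def kiosk_conf(hostname: str, url: str) -> str:
    """Render the tiny config file a boot script could read."""
    return f"HOSTNAME={hostname}\nKIOSK_URL={url}\n"

def write_card(mount_point: str, hostname: str) -> str:
    """Write kiosk.conf onto a mounted SD card's boot partition."""
    text = kiosk_conf(hostname, SCREENS[hostname])
    path = f"{mount_point}/kiosk.conf"
    with open(path, "w") as f:
        f.write(text)
    return path
```

With one golden OS image plus a one-line config per card, swapping a dead unit really is "insert the card, power it up, and you're done."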

      None of these will need keyboards; any management you want can be done by SSH. Though why you'd have to manage an information repeater I don't know.

      As to reliability, it's pretty good, and hell, if one goes down, you unplug it, plug in a new one, and go on about your day.

      I really don't see the problem. Why would you do this particular task any *other* way?
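For the SSH-only management mentioned above, a small fan-out helper is about all it takes. A sketch, assuming key-based SSH access and a hypothetical `reload-kiosk` script already present on each unit:

```python
import subprocess

# Assumed hostnames; in practice these come from your inventory.
HOSTS = [f"noc-screen-{n:02d}" for n in range(1, 4)]

def reload_command(host: str) -> list[str]:
    # BatchMode avoids hanging on a password prompt if a key is missing.
    return ["ssh", "-o", "BatchMode=yes", f"pi@{host}", "reload-kiosk"]

def reload_all(dry_run: bool = True) -> list[list[str]]:
    """Build (and optionally run) the reload command for every unit."""
    cmds = [reload_command(h) for h in HOSTS]
    if not dry_run:
        for cmd in cmds:
            # check=False: one dead Pi shouldn't stop the rest.
            subprocess.run(cmd, check=False)
    return cmds
```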

      fyngyrz [slashdot.org]

      (anon because mod points)

      • Re:Barco... (Score:5, Insightful)

        by Anonymous Coward on Monday November 09, 2015 @12:37PM (#50893375)

        I would stay away from WiFi for anything that is actually in important continual use, or is intended to impress people on tours. I've seen way too many kiosks and displays that don't work or show error messages because of software and connection problems, and it looks rather bad and unprofessional. You can get WiFi to be pretty reliable, but it is easy enough to use a wired connection and avoid the chance of it going down when you most need it.

      • Re:Barco... (Score:4, Insightful)

        by Anonymous Coward on Monday November 09, 2015 @12:41PM (#50893405)

        Uhh, anybody using a "Wireless Router" in an enterprise environment needs to be kicked in the teeth.

        Also, don't use wireless for anything mission-critical. Monitoring systems are critical imho.

      • Sticky-tape on the back of the monitors?

        Sounds very impressive, quite professional.

      • I mount Raspberry Pis with Velcro tape. The $8 plastic case + Velcro is cheaper and more flexible than the VESA-mount cases. It's all mounted behind the monitor so nobody sees it, and 1ft HDMI cables are pretty easy to obtain. After that they boot up as Ethernet-enabled monitors. Setting up the first SD card to do exactly what you want takes a bit of time, but copying that to N identical configurations is not hard.

    • Yeah, sounds like they really need a video wall controller instead of each monitor being independently driven. With a video wall controller you can drive all the monitors from a single controller and then resize 'windows (or inputs)' across the whole thing, in a corner, etc. Each input becomes a window. You can also save layouts/change them according to shift/etc.

      With a video wall controller you specify the number of inputs/outputs you need. Many also allow for IP based sources (cameras, remote screens v

      • Yeah, back in my defense-contractor days we built several video walls for connected C&C rooms.

        The high-end systems could put multi-display graphics at 1080p60 from any console to the theater and were based around the 64x64 Thinklogical DCS KVM over fiber modems and fed into a VistaSystems Spyder 12x8 video wall controller (of course they have larger units to drive your 3x3 wall, and you'd also be able to have a "preview" scaled down display of the entire wall which is also good for recording or broadcas

        • I program and setup pro av for a living and the video wall controller is the "best" option. Also the most expensive but they are highly flexible, especially the higher end ones.

          Some of the well-known ones to look at are "RGB Spectrum", "Christie Spyder", Extron QuantumView, and the reigning king, Jupiter Systems.

          The best of these will let you define a virtual canvas as large as your wall is, and inputs are used as windows on that canvas, any layout you want. And presets are very nice and flexible
  • by Anonymous Coward

    http://www.displayport.org/cables/driving-multiple-displays-from-a-single-displayport-output/

  • One server, run virtual desktops and have 35-50 thin clients driving your monitors.

    • by Snuggles ( 74048 )

      Agree, get a stack of thin clients from eBay.

      • by Anonymous Coward

        Yeah, serious companies don't buy random equipment on eBay. Once you're paid more than maybe $75k/year, your time is better spent building cool shit than testing/supporting dodgy used hardware.

        • by pnutjam ( 523990 )
          Igel makes some excellent thin clients. Last time I used them they were head and shoulders above HP and Wyse. They operate off a linux image (maybe bsd) that can just be flashed to each device with dd or something similar. Config was basically a big text file.
    • Get a stack of zero clients and use VMware Horizon View; there are some out there that support 4 DVI outputs, and you can use DVI-to-HDMI cables for your connections.

      You can set them to auto connect and connect on disconnect.

      I work for an MSP, and when we get to build our showplace NOC like you are doing, we will be using zero clients and a VDI infrastructure back-end.

  • NVS (Score:4, Interesting)

    by Anonymous Coward on Monday November 09, 2015 @12:06PM (#50893123)

    Take a look at the nVidia NVS line of GPUs, they're designed for digital signage but would probably work for you - the new ones support up to 32 displays driven from a single machine (4 cards).

  • by drinkypoo ( 153816 ) <martin.espinoza@gmail.com> on Monday November 09, 2015 @12:07PM (#50893133) Homepage Journal

    There are a variety of cases to help you mount the Pis. They're lightweight enough to where you can literally just heat shrink them and zip tie or foam tape them down. Pis or similar are going to be your lowest-power, lowest-footprint option no matter what. And since these are just operating informational displays, you really don't need anything more than VNC (or the like) to control them, because bandwidth is not an issue. A KVM, IP or not, is literally just something which can fail.

    I'm not a Pi advocate specifically, but I fail to see what's wrong with them for this application.

    • by Anonymous Coward

      Nothing hard about RPi mounting, but have you ever used a browser on a Raspberry Pi? The GUI sucks, it's so slow.

      Nothing against RPi, but using it for a NOC for displaying info in a browser is NOT ideal.

    • by AmiMoJo ( 196126 )

      We rolled something like this using RPi at work recently. It works really well. We used VESA mounted enclosures to attach them to the back of the monitors.

      There are other options, but they all cost more. The Pi can be powered from the monitor's USB port (make sure it can supply more than 500mA, or buy those Y cables that pair two ports up) and we used a minimal network booting system on the SD cards so we can update easily and the local disk can be read-only. Sudden power loss is therefore not a problem, ju

    • by phlawed ( 29334 )

      I agree. An RPI is a simple, cost effective way to do this. I would not bother with a case, though.
      I would likely put a stack of RPIs on a board together with an Ethernet switch and a fat USB thingy with multiple outputs for power. Check Amazon for '12port Satechi'. Attach board to a single monitor. VESA100?

      Boot all of them from the same image loaded from tftp, minimal configfile on SDcard to tell the RPI what URL to display. "Static" IP-address assignment via DHCP/MAC-address. A bunch of HDMI cables from e
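The "static IP via DHCP/MAC-address" step can be sketched as generated dnsmasq `dhcp-host` entries; the MAC addresses and the 10.0.50.0/24 plan below are placeholders:

```python
# Hypothetical sketch: render dnsmasq dhcp-host lines so each Pi's MAC
# always receives the same address. The MACs and subnet are made up.
PIS = [
    ("b8:27:eb:00:00:01", "noc-screen-01"),
    ("b8:27:eb:00:00:02", "noc-screen-02"),
]

def dhcp_hosts(base: str = "10.0.50.") -> list[str]:
    """One dnsmasq dhcp-host=MAC,IP,hostname line per Pi."""
    return [
        f"dhcp-host={mac},{base}{10 + i},{name}"
        for i, (mac, name) in enumerate(PIS)
    ]
```

Drop the output into the DHCP server's config and every Pi netboots with a predictable address, which keeps the per-Pi SD card down to the URL it should display.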

  • Seriously. A small-form-factor real computer. Put in 2 graphics cards that can run 5 monitors each. Done. You will have to look a little, but for ours... let's just say I bought a lot of 7570s, I think, about a year and a half ago, that had 5 Mini DisplayPort outputs each. They work like a charm and run up to 5 monitors each. What exactly was the problem?
  • PC on a stick (Score:4, Interesting)

    by rwven ( 663186 ) on Monday November 09, 2015 @12:09PM (#50893137)

    http://gizmodo.com/this-130-wi... [gizmodo.com]

    Asus and Intel are making these types of devices. There are probably other companies making them by now as well.

    • by aliquis ( 678370 )

      Yeah, that's what I wanted to mention too. That or the ChromeCast or whatever.

      He already seems to be aware that anything can run them, so why not just get that anything and let it run them?

      Why is this on Slashdot? In case someone has a better idea?

      Guess a low-end PC with four graphics cards * at least 3 displays each may be more cost-efficient? =P

      • by unrtst ( 777550 )

        Why is this on Slashdot? In case someone has a better idea?

        I think the TMTOWTDI -ness of this question is why it's on slashdot, and I enjoy that, even though I haven't seen anything I wasn't aware of yet.
        I also thought the compute sticks (or cheap knock-offs or chromecast-like devices) would be a very viable option - and I think they'd be better than a RPi for this use case (much easier to buy a bunch of them, and have any NOC monkey pop in a new one).

        That said, there's so many ways to handle this, it's crazy. It's pretty impressive how many options there are. Just

    • Computer sticks are only about $100 each. Use a small keyboard and trackpad for each and you've got a nice setup with either Linux or Windows.
  • As the content is likely mostly static: what about a single PC with many USB 3.0 -> HDMI adapters + USB 3.0 hubs? Sure, the refresh rate will likely go down to something like 10 Hz because of bandwidth limitations, but that should be fine for your kind of content, and driving all screens from the same PC could be very useful for administration.
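For what it's worth, the ~10 Hz guess is about what back-of-envelope arithmetic gives for uncompressed frames once several adapters share one USB 3.0 bus (real USB display adapters compress, so they do better; the usable-throughput figure below is a rough assumption):

```python
# Back-of-envelope: uncompressed refresh ceiling when N USB->HDMI
# adapters share one USB 3.0 bus. The ~3.2 Gbit/s usable figure is a
# rough assumption; real adapters compress and perform much better.
def max_refresh_hz(displays: int, width=1920, height=1080,
                   bits_per_pixel=24, usable_bps=3.2e9) -> float:
    frame_bits = width * height * bits_per_pixel  # ~50 Mbit per 1080p frame
    return usable_bps / displays / frame_bits

# Ten 1080p screens on one bus works out to roughly 6 Hz uncompressed,
# so "something like 10 Hz" is the right order of magnitude.
```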

  • by Anonymous Coward

    What is the name of the MSP, so I can avoid dealing with them? If they cannot solve that problem themselves, it is scary to think how they can "help" customers.

    • It's probably worse than you think. "This is a tour showcase center, so more is better". Sounds like someone who watches too much CSI:CYBER wants to impress the next round of investors. "most have real functions for our monitor teams" - what are the other ones going to show, screensavers? Form follows function, not the other way around.
      • Yeah, there's two references to the tours in the submission, and it sounds very much like it's as much marketing as it is functional.

        Suddenly I'm imagining a room in which the tech people never actually go, stage dressed with some carefully chosen people, and which will serve for a great tour but which otherwise will have nothing at all to do with operations.

        Meanwhile the actual staff are in dingy cubicles, with ancient CRT monitors, and not ever able to see this glorious presentation of the monitoring cent

        • by khallow ( 566160 )
          The customer for a flashy command center/situation room is the executive who thought it was a good idea. Lots of sexy graphics and blinking lights will satisfy that customer.
  • http://www.ambery.com/2x2hdvga... [ambery.com] shows all sorts of combinations, rack-mounted.
  • I'd use a NUC form factor with one mounted on the back of each monitor (or mounted on the back of every other monitor since it has two outputs). Basically no maintenance, easy to expand, and the off-the-shelf solution means easy to upgrade later. Will never fail if a small SSD is used, and has an ethernet hard port and plenty of resources (including 8-32GB of ram). Most monitors already have the necessary mounts.

    -Matt

    • Indeed, though one would have to examine the NUC/BRIX specs carefully. They are being driven (typically) by a mobile chipset GPU which will have some limitations.

      In fact, one could probably stuff them without any storage at all, just ram, and netboot the suckers from a single PC. I have a couple of BRIX (basically the same as a NUC) for GPU testing with 16GB of ram in each and they netboot just fine.

      Maintenance -> basically none.

      Expandability -> unlimited w/virtually no setup/work required.

      Performa

      • The latest NUC/BRIX, even with the mobile chipset they use, can easily go all the way up to 4K resolution. Unless you are trying to play games or something on them, they are perfect for displays, workstations for non-demanding users, or thin clients.
  • Since this is a tour showcase, and these monitors are all presumably providing metrics and alerts to act upon, why not encode the display and simply beam it wherever you want?

    https://obsproject.com/index [obsproject.com] The Open Broadcaster project seems to have been designed for this, and it would mean that instead of a bunch of computers you could just buy smart TVs with embedded Android.
  • Sounds like a nightmare to maintain. At any given time, a handful will just be displaying error messages. You see this in airports, hospitals and conference centers all the time. If it is mostly displaying browser stuff, use an e-signage solution. Chromecasts + Greenscreen (a Groupon project) sounds like a good fit. There are also lots of companies that sell a turnkey solution. Ideally, the boxes should be small and really robust. When one fails, a hardware swap with no or minimal software configuration is
  • by Lumpy ( 12016 ) on Monday November 09, 2015 @12:13PM (#50893189) Homepage

    Just use a single PC and a matrox card and call it done. HDMI fiber extensions and walk away.

    • by forty-2 ( 145915 )

      ^ This.
      Do you really want to manage dozens of little machines? Matrox will give you gobs of outputs on a few cards. They're nothing you'd game on, but champs at what you're looking to do. Signal extension can get pricey, but if you want to do it right, and give yourself some flexibility, look at Crestron's DigitalMedia Matrix. I think of it as a premium extension solution that includes free routing and KVM capabilities. Mix and match I/O flavors, and it supports both UTP & fiber extension.

  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Monday November 09, 2015 @12:19PM (#50893245) Homepage
    Shuttle makes fanless VESA-mountable PCs [shuttle.com] that use the low power Atom CPUs. This would be a great use for them.
    • We have had great luck with the zotac zbox ci320 boxes which are also vesa mounted and fanless and look great mounted to the back of monitors. Zotac also offers several higher end versions but for our needs the ci320 is plenty. We have ubuntu running on them and they work well. The only real drawback is that at least the ci320 only has hdmi out so you'll either need a monitor with hdmi in or need some type of adapter.

    • by myrdos2 ( 989497 )

      Nice. There's also Mini-Itx.com [mini-itx.com] with many different boards and cases. They're like larger, beefier Raspberry Pis and should be able to power 2-3 monitors each, I think.

    • The problem with VESA-mountable PCs for this usage: most times you want to mount the monitors on the wall, and you can't if you're using the mounting holes to hold a PC. Better to use video extenders from a server room/wiring closet with old repurposed laptops, or small NUC-like computers driving multiple monitors from far away.

    • Rather than the XS36 that the parent links, I'd suggest one of Shuttle's DS87 [shuttle.com]s. It can drive three displays, and I can confirm that they are pretty robust hardware.

      They're cheaper than an NUC, but more useful than a RPi.

  • by bucky0 ( 229117 ) on Monday November 09, 2015 @12:26PM (#50893293)

    If it's strictly browser-based, chromecast sticks (not the boxes) should work. Google is advertising that use, no less.

  • We are an MSP and this is a tour showcase center, so more is better

    Basically I read this as "we want to have some really cool blinking lights when we walk customers through here, even if none of this stuff actually does anything".

    Is this marketing, or actually intended to be functional?

    Please tell us you are really going to have people working in this room and monitoring stuff and that this isn't just for show.

  • The low-end ones can be pretty inexpensive, presuming you need something more than what you can do with a Raspberry Pi. The NUC can run whatever OS you care to run on an Intel platform. The NUCs even have VESA mounting holes/brackets designed to attach to the back of most flat-screen TVs.
  • by Anonymous Coward

    Sounds like you need a video wall controller. You then specify the number of inputs/outputs from it. You can then size monitors/resize/do all sorts of stuff. Several vendors have API's available to take control of the video wall controller via scripts/etc.

  • What you really need is a digital signage solution to manage the displays. There are lots. Almost all of them are capable of embedding a web page on whatever they describe as a 'layout'. This will give you the advantage of being able to display any other kind of content as well. Now all you need is the smallest stack-able x86 machines you can find, to put in the closet nearest to your displays.

  • Go for servers (Score:4, Interesting)

    by Stefanos Harhalakis ( 4326659 ) on Monday November 09, 2015 @12:39PM (#50893391)
    Having done this twice in the past 4 years, my suggestion is to use rack mounted x86 PCs/servers with dual graphics cards. With ATI cards you can go to 8 or 16 monitors per server and as long as you keep a ratio of 1 screen / cpu, you should be fine (capacity wise). Using PCs (a) will allow for easy maintenance and (b) will be easy for others to work on them. PCs are also much easier to upgrade (hardware wise) as they keep the manual effort needed to a minimum. We've done this with PCs and PIs. PIs are a fun project and so far they work well, but you *will* be swearing in the process as you will have to figure out many things, including power, cabling, mounting, etc.
    • by Lluc ( 703772 )

      Having done this twice in the past 4 years, my suggestion is to use rack mounted x86 PCs/servers with dual graphics cards. With ATI cards you can go to 8 or 16 monitors per server and as long as you keep a ratio of 1 screen / cpu, you should be fine (capacity wise). Using PCs (a) will allow for easy maintenance and (b) will be easy for others to work on them. PCs are also much easier to upgrade (hardware wise) as they keep the manual effort needed to a minimum. We've done this with PCs and PIs. PIs are a fun project and so far they work well, but you *will* be swearing in the process as you will have to figure out many things, including power, cabling, mounting, etc.

      I built a setup like this (50X LCDs) closer to 10 yrs ago with a rack of servers, and I think it was a mistake. I should have used small desktop PCs. I was somewhat budget limited, so it was a bit of a stretch to get all the monitors driven by the limited set of servers + multiple video cards. In the end I had an array of client machines network-booting from a single server. I could have used a rack of small desktops as the clients and had 2x more CPUs and higher performance graphics cards for the same

  • Just get a pile of AMD cards. Doesn't even matter what model they are as long as they're the same generation (and even then AMD's kinda given up on that). It'll make it easier to set them up as one giant monitor, and you won't get frustrated by running into architecture/power issues.
  • A small PC, you say? (Score:4, Interesting)

    by Anaerin ( 905998 ) on Monday November 09, 2015 @12:51PM (#50893477)
    How about something smaller than Intel's NUC, more powerful, fanless and reasonably cheap. Something like the fitlet [fit-pc.com] for example. And VESA Mountable too.
  • A PC can drive 12 or more screens. Daisy-chainable DP screens are the best for cutting cables.

    Or you can get 6-head Mini DP ATI cards with 6 Mini DP-to-HDMI ACTIVE adapters each. A PC with 2 x16 slots, even at x8/x8, can drive 12 screens. Maybe even a board with x8/x4/x4/x4 or x4/x4/x4/x4 should work as well to have 32 screens.

    An 1150 Xeon (you can't use on-board video unless it's PCI/PCIe based) is cheaper than an i7 and gives you quad-core + HT.

  • Your request doesn't mention cost, but since you mentioned RPis I suspect it's tight. However, going cheap isn't always the least-cost option. Unless, of course, your time is worth nothing. I have worked in an environment using ClearCube [clearcube.com] blade-center PCs doing PCoIP to zero clients (no OS on the client) and it worked really well. We needed high-power systems so we had dedicated blade PCs in a 2U backplane, but they offer VDI solutions if your needs are more modest. You basically plug an Ethernet cable (Fiber is als
  • by Shoten ( 260439 ) on Monday November 09, 2015 @12:57PM (#50893519)

    From the way this question is worded, I've got a hunch that you just bought common screens for the displays.

    Danger, Will Robinson. Ordinary screens aren't rated for 24x7 use, and they WILL burn in over time, among other things. If you're not using screens that are purpose-built for this kind of nonstop usage, you need to back up and change that or it'll all be for nothing.

    I'm used to seeing data walls and multi-monitor room displays of this sort designed from soup-to-nuts as a full solution by a service provider that specializes in doing so. There's a reason for the existence of an industry to serve that purpose; it's not as easy as just putting up a lot of big television screens and plugging them into small computers, as you're beginning to discover. Be aware that you almost certainly haven't run into all the problems yet, and it may be cheaper to contract with an outside company to do it all. (I do not work for such a company, just to be up front about it. I'm not stumping for business here.)

    • by ledow ( 319597 )

      Burn-in? In this day and age?

      I'm buying the cheapest Chinese LCD junk I can get my hands on, putting them up as digital signage, and leaving them on 24/7. So far, 18 months and not a sign of burn-in.

      I'm also running them off thin-client things (nComputing, that were unanimously panned as being useless for anything else in this day and age but were old clients that were bought a LONG time ago) that have VESA mountings and can run from a single central VM running TS. Combine it with some open-source digital

      • by Shoten ( 260439 )

        Burn-in? In this day and age?

        I'm buying the cheapest Chinese LCD junk I can get my hands on, putting them up as digital signage, and leaving them on 24/7. So far, 18 months and not a sign of burn-in.

        I'm also running them off thin-client things (nComputing, that were unanimously panned as being useless for anything else in this day and age but were old clients that were bought a LONG time ago) that have VESA mountings and can run from a single central VM running TS. Combine it with some open-source digital signage software (Xibo) and it all just works. That might well be a way - if they're running lots of servers, it'll be better to have a lot of thin-clients just doing the displays and a central overpowered computer actually running the browser - no cable spaghetti, built in VESA mountings, can even run off PoE if you do it right. One switch, one VM, and a one-off investment in thin-clients and you're done, rather than some knocked-together homebrew junk that will fall over more than the stuff it's monitoring.

        Burn-in is the very, very, least of your problems and god knows what you're buying to see burn-in.

        (Hint: My signage is all white-background, with hard B/W logos and text, up for days on end. No burn-in).

        Believe it or not, but it does happen. Ask any NOC/SOC/equivalent facility, and ask them what kinds of monitors they have up on the walls...and why. I've seen it on stuff that was bought last year.

        • by dbIII ( 701233 )
          Do you have a brandname in your example? My experience appears to conflict with yours so perhaps there is something different about the monitors you saw.
    • The year 2000 happened and LCDs are no longer shit. There's a few screens around here that have been on most of the time since 2003 when 19 inch screens first became cheap and my workplace got a large number of them.
      Power supplies fail, backlights die, but burning in is no longer a thing to worry about with consumer LCDs.

      I'm used to seeing data walls and multi-monitor room displays of this sort designed from soup-to-nuts as a full solution by a service provider that specializes in doing so

      True, it's not a

  • by spaceman375 ( 780812 ) on Monday November 09, 2015 @01:01PM (#50893565)
    You want to impress people, be sure you can grab & throw what's on one monitor to another, plus pinch & un-pinch with whole arm gestures. For the right age of clients, you may also want to be able to play a few older games on entire walls.
  • I think one CPU per screen is overkill, unless each is going to be its own discrete display. A single PC with a bunch of high-end/multi-port display cards would enable you to have a fully customizable display, rather than 50-60 discrete desktops.

    For single-CPU-per-display purposes, you could throw a bunch of the InFocus Kangaroo PCs at the problem.

    Or, if you really feel you have to throw 50-60 Raspberry Pis at this problem, consider hiring someone to make you a card cage that can hold dozens of RPis in a 2

  • It might be worth considering the Gigabyte BRIX units - there's quite a range, but most of them support dual output (HDMI+VGA or HDMI+MiniDisplayPort). There's one that lists nVidia graphics and triple displays but that might not be worth it; you might also be able to drive dual HDMI with active splitting of the DisplayPort but again, that might not be worth it.

    Processors are all over the map from Celeron up to i7.
  • Are you hiring? Can't believe nobody's asked that yet!

    • by mushero ( 142275 )

      YES we are, in every area, but the jobs are in Shanghai. We are in fact looking for NOC engineers and process people, plus senior engineers in all areas: Linux, DBA, Security, Performance, Troubleshooting, tools, managers, and much more. We are building the world's top MSP and running numerous multi-hundred-million-user systems, doing the most difficult things on the Internet today.

      I know you are probably being a bit facetious, but our career site:
      http://careers.chinanetcloud.c... [chinanetcloud.com]

  • It comes down to who can interact with it how. Are you doing HD or 4K for the monitors? http://www.brightsign.biz/digi... [brightsign.biz]
  • Check out:
    http://www.piwall.co.uk/information/installation

    and:
    http://dmx.sourceforge.net/

    Seriously, one PC for the horsepower, then just networked RPis to create one giant screen. Who needs a KVM when it's just one screen?
    Depending on the screens you choose, the RPis can easily mount to the back of the monitor, get power from the screen's USB port if it has one, and take HDMI to the output; the only "wires" you have to manage are the screen power and one network cable. Seems simple and scalable.
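The one-giant-screen idea boils down to each Pi cropping its own tile out of a shared virtual canvas, which is what PiWall's config files express. The geometry can be sketched as follows (the wall resolution and 3x3 grid are illustrative, and bezels are ignored):

```python
# Sketch of video-wall tiling: each Pi displays one crop of a shared
# virtual canvas. A bezel-free 3x3 wall of 1080p panels is assumed.
def tile_rect(row: int, col: int, rows=3, cols=3,
              wall_w=5760, wall_h=3240) -> tuple[int, int, int, int]:
    """(x, y, width, height) of the crop this tile should show."""
    w, h = wall_w // cols, wall_h // rows
    return (col * w, row * h, w, h)
```

Each Pi only needs to know its own (row, col); the feeding PC streams the full canvas and every tile shows its rectangle.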
  • Surprised nobody has mentioned it.
    There are several solutions using X11 to split a virtual screen among slave PCs.
    E.g. XbigX http://www.x-software.com/en/p... [x-software.com]

  • OP mentioned that the screens are just showing browser windows. USB-attached displays perform surprisingly well for that, and have the advantage of working with any Windows PC. Here's a video showing 4 displays: https://youtu.be/KKcMqCAYkpk [youtu.be] And one showing 14: https://youtu.be/heB94f6FHd8 [youtu.be] Full disclosure: I work for the company that made these videos. One important thing to note is the 14 monitor demo was done with a pure USB 2.0 system. Modern USB 3.0 systems have lower limits in terms of how many USB de
  • Why tiny? All those monitors have a huge footprint. Use that footprint by putting things underneath them. Sit your three-screen array on a server with a few video cards in it (one per four monitors) and you are not losing any more space.
  • For my company's purpose we just use smart TVs, specifically 40" mi TVs for things like netmons/buildmons/stats/etc, and built iframed sites to display different sources of data in a single screen. No external PC to manage and since browser support was the only requirement, works out fine.

  • ... the machine that goes ping!

    (Monty Python reference, for the young.)

  • Don't know if it's the best answer but it'd be a fun project. Or just skip the monitors and have everyone wear Oculus Rift headgear.
  • Have you thought about using Intel Compute Sticks for this? They aren't super powerful, but they're only $99 and can more than handle running a web browser.

    I actually liked the Raspberry Pi idea better, but if you want to use Windows for your screens... This option might work.
