Data Storage Hardware

Rugged Linux Server For Rural, Tropical Environment? 236

travalas writes "Last year I moved to rural Bangladesh. My work is pretty diverse, everything from hacking web apps to designing building materials. Increasingly, a Linux VM on my MacBook Pro is insufficient due to storage speed/processing constraints and the desire to interface more easily with some sensor packages. There are a few issues that make a standard server less than desirable. This server will generally not be running with any sort of climate control, and it may need to move to different locations, so it would also be helpful if it were somewhat portable. The environment here is hot, humid, dusty and brutal on technology, and power is very inconsistent, so it will often be on a combination of Interruptible Power Supply and solar power. A UPS is therefore a must and low power consumption desirable, so it strikes me that an integrated UPS a la Google's servers would be handy. Spec-wise, it needs to be able to handle several VMs and some other processor- and storage-intensive tasks, so 4 cores, 8GB of RAM and 3-4TB of SATA storage seems like a place to start. What sort of hardware would you recommend without breaking the bank?"
This discussion has been archived. No new comments can be posted.


  • Laptop (Score:5, Informative)

    by B5_geek ( 638928 ) on Sunday April 19, 2009 @03:44PM (#27638623)

    Get a laptop or 3.
    Portable - check
    UPS - check
    Able to handle no climate control - check
    4 cores & 8GB - check
    4TB of storage - Get an external drive bay. (Do you really need that much storage? really?)

    Some of the XPS line from Dell or other 'Gaming' laptops should do the trick.

  • by Britz ( 170620 ) on Sunday April 19, 2009 @03:55PM (#27638715)

    I don't think there are any servers for those requirements:
    portable, rugged, low power (incl. UPS)

    But those are the exact specs of the rugged laptop. Laptops have built-in UPS units (called batteries) and are low in power consumption.

    Panasonic Toughbooks, or Toshiba Tecra ruggedized come to mind. Dell also has some new offerings in that segment:
    http://www.dell.com/content/products/productdetails.aspx/latit_xfr_d630?c=us&cs=04&l=en&s=bsd [dell.com]

  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Sunday April 19, 2009 @03:55PM (#27638717)
    Comment removed based on user account deletion
  • Re:Laptop (Score:3, Informative)

    by glennpratt ( 1230636 ) on Sunday April 19, 2009 @03:58PM (#27638739) Homepage

    I agree, this is probably the best suggestion without knowing more of your budget.

    Laptops are the closest thing you will get without breaking the bank - you could probably buy a few, plus storage, for what a truly ruggedized server system would cost. They will be infinitely more portable and easy to run on DC; plus they will run for years on DC while most UPSes won't.

    If it must be real servers - I'd build them in something like this:

    http://www.racksolutions.com/transport-case.shtml [racksolutions.com]

    Heck, you could fit a pretty powerful network, including a large battery, inside one - but it will cost lots of money.

  • Rugged Laptops? (Score:4, Informative)

    by jaker29902 ( 926208 ) on Sunday April 19, 2009 @03:58PM (#27638741)
    http://www.dell.com/xfr [dell.com] Apparently this is Dell's solution to your problem: it has ballistic armor and is apparently able to be drop-kicked into a pool of rabid sharks who have chainsaws for teeth.

    You could get an external drive and possibly cluster them together for the enhanced processor power? Don't know, but this laptop seems able to handle the environment you want it to. Also, UPS plus solar panels = headache, so be prepared!

  • by Viv ( 54519 ) on Sunday April 19, 2009 @04:13PM (#27638895)

    The ideal human interface for my needs is console to an RS-232, so annoying rubberized keypads don't matter to me.

  • by amcchord ( 1334469 ) on Sunday April 19, 2009 @04:14PM (#27638913)
    In looking at your specs I think your storage is going to be the hardest to deal with. Today's 1TB drives are quite fragile. Drop them from a table and 90% of the time they are goners. In addition, without serious cooling they can get very hot (I am looking at you, Seagate), and once you get upwards of 55 Celsius they start to break down fast. Even worse would be the temperature cycling due to the fact the server is not online 24/7. Seeing as you are power constrained, it's probably not going to be feasible to go with a ton of 250GB drives, etc.

    To build something like this in a ruggedized form is going to be expensive ($5k+ for the basics). Is there a way you could reduce your data requirements? 4TB is a tremendous amount of data.

    If you were willing to compromise, one other suggestion: go with a Mini-ITX board and a DC-DC power supply. This one is cheap and you could put an AMD CPU in it: http://www.newegg.com/Product/Product.aspx?Item=N82E16813500021 [newegg.com] Then add a 45-watt AMD CPU and maybe RAID 5 with 500GB laptop HDs.
    In theory you could build a decent dual-core 2.6GHz system with 1.5TB of storage and 4GB of RAM, all powered by 12-volt DC (negating the UPS need - just use 12-volt batteries), for a reasonable price. A system like this would be small and portable. If you needed more horsepower I would suggest building multiples.

    It would likely be much more cost effective to build multiple moderately powerful systems than one massive one.
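    To put rough numbers on the 12-volt battery idea above, a back-of-the-envelope sizing calculation helps. A minimal sketch - the 60W draw and 8-hour outage are illustrative assumptions, not measurements:

    ```python
    def battery_ah_needed(load_watts, hours, volts=12.0, max_discharge=0.5):
        """Amp-hours of battery needed to carry a DC load for `hours`.

        Dividing by the allowed depth of discharge avoids draining the
        battery flat, which drastically shortens lead-acid battery life.
        """
        amps = load_watts / volts  # steady-state DC current
        return amps * hours / max_discharge

    # Assumed figures: ~60W total system draw, 8-hour outage.
    print(battery_ah_needed(60, 8))  # -> 80.0
    ```

    Doubling the nominal capacity like this is a common rule of thumb for deep-cycle lead-acid; a real sizing exercise would also derate for battery age and temperature.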
  • RV? (Score:4, Informative)

    by vlm ( 69642 ) on Sunday April 19, 2009 @04:23PM (#27638991)

    Sounds like you want plain ole standard commercial grade server hardware mounted in a tiny RV.
    Extensively shock mount a relay rack, put in somewhat bigger AC/batteries/genset than usual, and you're good to go.
    You can use the living quarters to house the armed guard, which will be required for expensive equipment in that corner of the world.

    Trying to buy super-tough server hardware will simply be more expensive than an RV, and much harder to replace/maintain when it breaks.

    Admittedly I'm mystified what you'd do with such immense computing power in a rural area without electricity. Maybe a really nice mythtv backend? Educate the locals using SimTractor?

    You do realize that Bangladesh is like 1 foot above sea level, so there's no need to engineer this to last forever when it's going to get washed into the sea every couple of years by storms, etc. Using an RV could help in the evac, assuming there is any place safe to evac to...

    Alternately, split your workload transparently across maybe 50 smaller machines, and start purchasing replacements when attrition nears 75%.

  • Dunk it in oil. (Score:5, Informative)

    by w0mprat ( 1317953 ) on Sunday April 19, 2009 @04:26PM (#27639011)
    I know some guys who were running some wi-fi gear on a roof with a small Linux server, and to beat the elements (many days a year of driving horizontal rain and gale-force winds) they submerged some low-power components in a metal tool chest filled with mineral oil. Their setup had 4GB CF and USB keyfobs for storage. There was a 12VDC-input car-PC-style power supply that handles variable input (goes as low as 6V), and they ran long wires down to a small 240V/12V transformer in the building. This meant that even if moisture got in, the components were very well protected, as water would sit at the bottom of the oil, and there was utterly no dangerous voltage exposed to the outdoors. Later they went with a smaller o-ring-sealed aluminum box filled with proper transformer oil, but the original hack was still working fine after 1 year.

    From my own experience with dunking rigs in oil, you only need to watch out for a few things, one being the mineral oil leaching plasticizers out of wire insulation - they eventually become brittle. You also need to seal your electrolytic caps with a little epoxy so the rubber seal doesn't get eaten alive. Interestingly most caps seem to survive a long time like this, but personally I'd recommend motherboards with solid aluminum caps.

    However, these things don't become a problem for months, so you'd likely get away with just dunking your rig and leaving it. You also cannot dunk a HDD, as the oil will get inside it and foul things up. I haven't tried it, but it might be possible to 'pot' a mechanical HDD in epoxy or seal it up in a box; best to stick with SSD/CompactFlash, though.
  • Cases (Score:3, Informative)

    by WTF Chuck ( 1369665 ) on Sunday April 19, 2009 @04:34PM (#27639093) Journal

    You might accidentally break the bank. You may want to try putting the server and a rackmount UPS into something like the cases you can find here [hardigg.com]. Take along a back-up generator. And lots of fans and filters. Spare parts for the server would also be helpful.

  • Redundancy (Score:4, Informative)

    by Doc Ruby ( 173196 ) on Sunday April 19, 2009 @04:45PM (#27639187) Homepage Journal

    Buy a used 1U rack Dell server with redundant power supplies, Pentiums, ethernets and HDs on a RAID. Then replace the HDs with Flash SSD. Then put the whole thing in a plywood box with an air conditioner mounted on top, tubes blowing cold air in and three .00 grade nylon layers over the out vents, the upper layer removable. Seal all cracks, especially around cable slots, with silicone caulk. Run the whole thing as a unit, cleaning the air conditioner filters and out vent screens twice a day (so get two sets of those filters).

    Keep spares of each redundant part. Buy two of those whole units (including air conditioners), because one unit will die anyway.

    Run them on an ethernet switch, one powered down except once a day or so to sync their RAIDs.

    Or rent a server at some global datacenter, and get WiFi/pringles antenna to an ISP somewhere.

  • by F34nor ( 321515 ) on Sunday April 19, 2009 @05:02PM (#27639327)

    Tiny, portable, low power, and no moving parts.

  • by jonbryce ( 703250 ) on Sunday April 19, 2009 @05:05PM (#27639349) Homepage

    A beowulf cluster of laptops should do it.

  • Re:Not gonna happen (Score:2, Informative)

    by xous ( 1009057 ) on Sunday April 19, 2009 @05:36PM (#27639593) Homepage

    Actually,

    There are several types of UPS, and with the better ones you ARE running off the batteries of the UPS all the time.

    Offline/Standby: cheap as hell, not something you want to use in a bad power environment for anything important.

    Line-Interactive: better, but I still wouldn't use it for anything important.

    Double-conversion/online: this is probably the best solution for the OP, who should get one with weather protection for his intended usage.

    http://en.wikipedia.org/wiki/Uninterruptible_power_supply [wikipedia.org]

  • by daybot ( 911557 ) * on Sunday April 19, 2009 @05:40PM (#27639629)

    Just build two commodity servers - obtain reliability through redundancy and you'll get the specs you want without ridiculous cost.

    Here are some tips.

    • Keep spares of everything, especially fans and PSUs.
    • Check out Intel's new 65W quad core chips [tomshardware.co.uk] if you really need quad core.
    • Use a simple, fanless motherboard and a CPU heatsink with a good reputation [tomshardware.co.uk].
    • Invest in good fans - like these [acoustiproducts.com].
    • For power, you just need a standard UPS and possibly a generator - Google wouldn't bother with homebrew internal designs if they only had two servers.
  • Go Small or Go Home (Score:5, Informative)

    by grcumb ( 781340 ) on Sunday April 19, 2009 @05:55PM (#27639759) Homepage Journal

    What would you suggest? Lesser hardware? Surely there must be a solution somewhere in the middle of "I want this" and "I can use this".

    Yep, there is. But it's not always where you think.

    Shameless (but hopefully useful) self-promotion:

    I've been living and working in Least Developed Countries in the tropics for nearly 6 years now, and for the last 2, I've been writing a weekly IT-related column called Communications [imagicity.com]. There's a ton of advice in there. Go take a look. Check my tag cloud for relevant topics.

    Here are a few fundamentals:

    -1- The first thing to do is to adjust both hardware and - and this is important - software to the circumstances. Focus on the task first, then avoid confusing how that task is completed in a North American office environment with 'the right way' to do things.

    -2- Scale everything down, in order to make the cost of failure of any single element as small as possible. This way, you get a solution that's replicable, affordable and - most importantly - easily replaced when (not if) it breaks.

    -3- If you have unreliable power, then do two things first: Make your system tolerant to current fluctuations[*], and then plan for an intermittently available service. Forget about trying to keep it running at all times. Just minimise the cost of interruptions. A surge suppressing electrical switch on the wall where your main power source enters the building will cost you less and save you more than anything else.

    [*] Bad (i.e. poor quality) power is the source of about 80% of hardware failure where I live. Every time the local power company hits us with brown-outs and spikes, I'd get a surge (heh!) of customer service calls.

    To me, this situation screams 'require redundancy'. I understand this was not given as an option originally, but with the environment described I would certainly not want to rely on one single server.

    Yes, redundancy is good. Cheap, small, easily replaced devices are good. Snap-shotted VMs are also good. The bottom line is that you need to keep the cost of failure low, because the system is certain to fail due to environmental factors. A good motto for working in the Developing World is: If you can't beat 'em, at least don't lose too much.

    The best way to do this is to try to run on hardware that's about 3-5 years behind the curve, or to go straight to the bleeding edge of low-power tech.

    To the submitter: I have a personal interest in Bangladesh, by the way. You can reach me by leaving a comment on my website. Good luck!

    P.S. Unless money and space are no object, you'll never run full-time computing services on solar power. Especially in monsoon season. IMO, best not to try.

  • by Mr. Freeman ( 933986 ) on Sunday April 19, 2009 @06:53PM (#27640177)

    No, the power he is supplied with is Interruptible.
    Therefore, he needs an UNinterruptible power supply.

  • need climate control (Score:3, Informative)

    by Eric Smith ( 4379 ) on Sunday April 19, 2009 @08:26PM (#27640723) Homepage Journal
    I just got back from installing a bunch of equipment in a tropical area (Saipan, Tinian, and Rota, the three inhabited islands of the Commonwealth of Northern Mariana Islands).

    If you need the equipment to have any halfway reasonable reliability, it MUST have some environmental control. You could use a NEMA class 5 enclosure if only the humidity was an issue and the equipment didn't dissipate a lot of heat. However, your hardware description indicates that you will have a lot of heat dissipation.

    The only other option I can see is having some kind of environmental chamber (i.e., an air-conditioned box) to keep the humidity and temperature under control.

    If you don't have that, the equipment WILL fail. It's a matter of WHEN, not IF.

    Almost all electronic equipment is rated for operation at a maximum of 90% relative humidity (non-condensing), and much equipment is rated even lower than that.

    In the CNMI, the _average_ of the daily high relative humidity is above 90% part of the year, and only slightly below 90% the rest of the year.

  • by travalas ( 853279 ) on Sunday April 19, 2009 @10:12PM (#27641221) Homepage
    We have both interruptible and uninterruptible power supplies here. The difference is that IPSes take about a second to switch over to the batteries.
  • In Thailand... (Score:5, Informative)

    by aaarrrgggh ( 9205 ) on Sunday April 19, 2009 @10:29PM (#27641319)

    I haven't done Bangladesh, but in Thailand if I had it to do over again I would go for four low-spec machines, and a sealed enclosure with compressed air cooling. It isn't the most energy efficient approach, but having a sealed (and slightly positively-pressurized) enclosure does wonders at keeping out moisture, dust, and ants.

    The general idea is to have a couple small compressors (with check-valves) feed into a common reservoir that has adequate time to cool to ambient temperatures. Ideally, you would run at about 300 psi/20 bar, and have a pressure reducing valve inside the enclosure to drop the air to about 10 psi with a 1/16" orifice into the enclosure. (You might have to experiment on orifice and pressure.) Provide a pressure relief valve to keep the enclosure under 2 psig. (Another constrained orifice would work, but you will lose more air.)

    Keep a spare machine in a Pelican box with desiccant, along with two or three spare hard drives. Keep a backup external USB hard drive in a separate Pelican box with desiccant, and only open it up when you are doing a backup.

    I'd also second comments about running everything in virtual machines and being willing to make compromises when one of them isn't working.

    Back in my day, getting 12V power supplies wasn't nearly as easy as Google makes it sound. (You need to have a high enough float voltage to charge the batteries, and have a regulated output that will handle the end cell voltage of the batteries.) The logical alternative is to use 48VDC power supplies, which are much more expensive; they are designed to operate within the float/ECV requirements of a VRLA battery. Don't forget your blocking diodes! Try to stay away from car batteries if you can and find some real deep-cycle batteries. Getting through monsoon season on battery isn't realistic without a huge battery plant. Our island's phone switch was pretty well equipped, but for two months a year the phones only worked when the sun was shining.

    External connectivity was always my nemesis; when the phone switch was down, everything else crapped out. Satellite phones weren't viable from a cost perspective; the consumer satellite service was too unreliable to even be considered.

  • by grcumb ( 781340 ) on Sunday April 19, 2009 @10:39PM (#27641371) Homepage Journal

    To boil it down, it sounds like the biggest single problem once you actually HAVE a machine is keeping juice to it consistently.

    Yep. Power is the single biggest problem faced by rural ICT-related projects in my part of the world. It's dead easy to find someone to donate equipment. It's incredibly hard finding someone willing to pay you to run it.

    The answer to the power question is horses for courses [wiktionary.org], I'm afraid. Some places have great power generation possibilities, either through solar, small-scale hydro or wind. Some projects just find the cash to keep a generator running. Most don't.

    In every case, reducing your power footprint only makes sense. Batteries are hugely expensive and difficult to transport, so the less power storage you need, the better. Running off low-voltage DC is great, because it's much more efficient over short distances.

    Solid-state is your friend. It's more resistant to heat, dust and other environmental factors. Small form factors also help, because buildings are often rudimentary at best. Being able to stick everything in a seal-able, easily transported box makes everyone's life easier.

    In many cases, the right answer is actually to reduce the amount of automation in your work. Human labour is cheap and time is plentiful, whereas power and equipment are not. Building the right amount of inefficiency into your system is a counter-intuitive but often rewarding approach.
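    The "short distances" caveat on low-voltage DC is worth quantifying: for the same load, dropping from 48V to 12V quadruples the current, and resistive loss in the wiring grows with the square of the current. A rough sketch - the wire resistance and load figures are assumed round numbers, not taken from a wire-gauge table:

    ```python
    def wire_loss_watts(load_watts, volts, ohms_per_metre, metres):
        """I^2 * R loss in a two-conductor DC run (out and back)."""
        amps = load_watts / volts
        resistance = ohms_per_metre * metres * 2  # both conductors
        return amps ** 2 * resistance

    # Assumed: 60W load over a 10m run at 0.01 ohm/m per conductor.
    print(wire_loss_watts(60, 12, 0.01, 10))  # 5A    -> 5.0W lost
    print(wire_loss_watts(60, 48, 0.01, 10))  # 1.25A -> 0.3125W lost
    ```

    The same wire wastes sixteen times the power at 12V as at 48V, which is one reason telecom plants standardised on 48VDC and why 12V runs are kept short.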

  • by Antique Geekmeister ( 740220 ) on Sunday April 19, 2009 @11:59PM (#27641789)

    Goodness, if possible, forget VMs. Use multiple OLPC systems, which are fiscally sensible, extremely low power, and startlingly robust. Salt air and water are a problem: consider machines exposed to that for a year or so to be due for replacement.

    If Windows or a UNIX system are necessary for basic software compatibility reasons, life is rather different.

  • by Magic5Ball ( 188725 ) on Monday April 20, 2009 @03:23AM (#27642549)

    Different poster, but...

    OpenVPN default settings on UDP work reasonably well on the cheap birds if latency isn't a dealbreaker. For best results, be prepared to spend some of your own time, or at least find out how the provider mangles/fakes layer 3 going up and down and adjust your packets accordingly. Also try to originate close to the uplink to save 50-200ms each way on the ground.

    Bandwidth gets really expensive if you don't design your solution correctly. (Hint: Most remote desktop tech support is performed incorrectly. Consider what information the helpdesk actually needs to model and solve the problem before sending lots of pretty pictures with low information density around the world.) If you must do remote desktops, NX may or may not help with TCO vs putting a terminal server on site, and using it smartly. Even if you're supporting Windows boxes, it's also worthwhile to look into some of the commercial X implementations that optimize for poor bandwidth conditions.

  • by subreality ( 157447 ) on Monday April 20, 2009 @07:36AM (#27643683)

    humid air gets in and immediately condenses into water all over your chilled electronics.

    Which is why I suggest keeping the temperature high.

    Air conditioners are slightly better. They will dehumidify and clean the air before pumping it in.

    Actually, they just form condensation on the coils, and then have a drip pan to carry it outside. Normally they recirculate, and don't pump anything in, for efficiency.

    overclockers have been immersing their entire motherboard in mineral oil (or whatever) for ages now. Solves issues with temp, humidity, dirt, etc.

    It doesn't solve humidity: It just collects in the bottom of the oil, and eventually you have to get it out before it reaches the motherboard. It's the same problem as the fridge has, just in liquid form. :) Either way, raise the board up a little in case water collects underneath.

    Temperature: Mineral oil doesn't directly solve this problem. It gets the heat away from the CPU, but you still have to get it outside of whatever environmental housing you've created. You still need a heat pump of some sort.

    Dirt: No, in my opinion, it makes this much worse. Dirt plus air is cleaned up with a feather duster, or canned air. Dirt plus oil is a real mess.

    And since we're talking about laptops, I'd be really skeptical of a plan to dunk an entire laptop in oil. It'll destroy the keyboard membrane, internal hard drives, fans, and possibly other parts in short order.

    The higher the VA-rating, the longer it will last during a blackout

    This is not true at all. VA stands for volt-amps: apparent power, i.e. RMS volts times RMS amps. This is similar to watts, but watts measure the real power actually delivered to the load; with a reactive or non-linear load (like a computer power supply), the VA figure is higher than the watts.

    The runtime of a UPS is approximately (battery pack voltage * battery pack amp-hour rating * inverter efficiency) watt-hours, divided by the load in watts.

    Since UPS manufacturers are sketchy on the details of their batteries, follow the simple rule of thumb: The runtime of a UPS for a given load is approximately proportional to its weight.
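    For completeness, the runtime estimate from that formula as a tiny sketch - the 24V/9Ah pack, 80% inverter efficiency and 120W load are illustrative assumptions, not the specs of any particular UPS:

    ```python
    def ups_runtime_hours(batt_volts, amp_hours, inverter_eff, load_watts):
        """Approximate runtime: usable watt-hours divided by load watts."""
        watt_hours = batt_volts * amp_hours * inverter_eff
        return watt_hours / load_watts

    # Assumed: 24V pack, 9Ah, 80%-efficient inverter, 120W load.
    print(round(ups_runtime_hours(24, 9, 0.8, 120), 2))  # -> 1.44 hours
    ```

    Real runtime is worse than this at high loads because lead-acid capacity drops as discharge current rises, which is another reason the weight rule of thumb is about as reliable as the vendor datasheet.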

    Pure sine doesn't really matter for modern computer power supplies. They cope just fine with square waves. Buck-boost capability (which lets the UPS step the line voltage up or down to handle over/undervoltage) IS a big plus if you expect extended brownouts: it'll just suck more amps from the line to keep things running, instead of running on batteries until the line is fixed.

  • by grcumb ( 781340 ) on Monday April 20, 2009 @08:09AM (#27643853) Homepage Journal

    Goodness, if possible, forget VMs. Use multiple OLPC systems, which are fiscally sensible, extremely low power, and startlingly robust. Salt air and water are a problem: consider machines exposed to that for a year or so to be due for replacement.

    [In reply to both you and GP]

    I like the XO a lot, but as a personal computing system, not a server. Frankly, the CPU's a little lightweight for anything non-trivial. Keyboard input is difficult for adults - that's by design, of course - and while I agree that the machine is remarkably robust, the form factor isn't ideal for adult use.

    As for a cluster of anything... while I agree that ruggedised laptops are a good solution for general computing needs, adding pieces to this particular puzzle isn't necessarily a Good Thing. If it were up to me, you'd have to make a pretty compelling case that more than one machine was needed before I'd even contemplate that kind of configuration.

    I've seen systems break down for the most trivial of reasons. You have to factor in local technical capability on top of everything else. In the country I'm living in right now, there's not a single individual with significant experience with distributed computing. If you're confident that support for that level of complexity will be there for the life of the project, then by all means go ahead. Just don't expect things to work if you go offsite for more than 24 hours.

    The KISS principle applies in spades here; cleverness is usually punished in such scenarios. As a friend of mine likes to say: if you want to make the gods laugh, tell them your plans.

  • by grcumb ( 781340 ) on Monday April 20, 2009 @08:12AM (#27643875) Homepage Journal

    This is what I was thinking when I read this: that you really have to go ultra-low-power, i.e. ARM processors and SSD/flash, so you can run on ~5-10 watts

    Agreed. Given the output of typical small-scale power generation & storage schemes, 5-10 watts really is about the most you can manage if you want full-time operation.
