
Home Server Rooms?

Posted by Cliff
from the things-you-might-do-to-a-den dept.
Tuzanor writes "I've got a buddy moving into a brand new house. Being geeks, we've decided to wire the house with a large home network. While this story took care of wiring the house, we still need to figure out how to set up a proper server room. We'll have both towers and rack-mounted computers, as well as various switches, UPSes, etc. We also figure this room will get warm, even in winter. How can we cool it while still keeping the rest of the house toasty warm on a cold Canadian night (without opening a window)?"
  • Come on... (Score:4, Insightful)

    by GigsVT (208848) on Saturday December 15, 2001 @08:06PM (#2709488) Journal
    Is this a serious question?

    Just set up the ventilation system to suck warm air from the top of the server room, and pipe it to the colder rooms in the house.

    For air return, install intakes near the bottom of some of the colder rooms.

    It would cost like $50 at a home improvement store to get enough flexible ducting and registers.

    Go to a surplus site like www.mpja.com and get some AC powered fans with a good CFM output.
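    To size those fans, there's a standard sensible-heat rule of thumb: CFM = BTU/hr / (1.08 x temperature rise in F). A rough sketch in Python - all the wattages below are made-up examples, so plug in your own gear:

        WATTS_TO_BTU_HR = 3.412  # 1 watt of heat = 3.412 BTU/hr

        gear_watts = {
            "towers (x2)": 2 * 150,         # assumed draw per tower
            "1U rack boxes (x4)": 4 * 100,  # assumed draw per box
            "switches + UPS losses": 120,   # assumed
        }

        total_watts = sum(gear_watts.values())
        btu_hr = total_watts * WATTS_TO_BTU_HR

        delta_t_f = 10.0  # let the server room run 10 F warmer than the house
        cfm_needed = btu_hr / (1.08 * delta_t_f)

        print(f"Heat load: {total_watts} W ({btu_hr:.0f} BTU/hr)")
        print(f"Airflow to move it at a {delta_t_f:.0f} F rise: {cfm_needed:.0f} CFM")

    Pick fans rated comfortably above whatever number falls out, since ducting and grills eat into the nameplate CFM.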
  • Re:ceiling vent (Score:2, Insightful)

    by skroz (7870) on Saturday December 15, 2001 @08:09PM (#2709496) Homepage
    Temperature is not the only problem; you also have to consider relative humidity. Opening a window may introduce more problems than it solves.
  • Heat, Noise Issues (Score:2, Insightful)

    by rmckeethen (130580) on Saturday December 15, 2001 @08:17PM (#2709532)

    Personally, I've got my boxen sitting within inches of the furnace and I've had them there for months without a problem. I live in Seattle, about 125 miles from the Canadian border, so the climate is somewhat similar. Unless your buddy is looking at putting in loads of servers and other equipment, I can't imagine that you'd have a problem. If you really want to 'do it right' you can usually get most manufacturers to give you the heat output rates for their equipment in BTUs per hour. Add all the rates together and you'll have an idea of how bad things are likely to get. I would imagine you'd have more problems with too much heat than not enough; it might not be a bad idea to check the room where the rack is going to go and verify that it has adequate ventilation to carry the heat load. Stick a wall-mounted thermometer in and see how it goes over time.
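    The adding-up step is literally just a sum; here's a trivial sketch with invented ratings (check your own spec sheets for the real numbers):

        ratings_btu_hr = {
            "file server": 550,            # invented rating
            "web box": 400,                # invented rating
            "switch": 100,                 # invented rating
            "UPS (charging losses)": 170,  # invented rating
        }

        total = sum(ratings_btu_hr.values())
        watts = total / 3.412  # convert BTU/hr back to watts for intuition
        print(f"Total heat load: {total} BTU/hr "
              f"(~{watts:.0f} W, about {watts / 100:.1f} 100 W light bulbs)")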

    One thing that you should really think about with rack equipment is the noise level. Manufacturers of rack-mounted equipment just love to shove lots and lots of fans in the backsides of their boxes; this tends to make a great deal of unwanted noise. Unless the plan is to have all this stuff in a separate room where no one is going to be, you might want to consider spending the extra money and getting a glass- or plastic-enclosed rack. It costs more but hey, it definitely has the cool factor covered.

  • Re:electricity (Score:4, Insightful)

    by J.C.B. (141141) on Saturday December 15, 2001 @08:28PM (#2709568) Homepage
    There's still no reason to waste it. He lives in Canada; I live in North Dakota. I could use the heat put out by a server room during the winter. It would sure save on heating.
  • by MBCook (132727) <foobarsoft@foobarsoft.com> on Saturday December 15, 2001 @08:43PM (#2709603) Homepage
    Why not run ducts behind all the computers and have those ducts be the intake from outside for the heater? That way, the air comes in cold, gets warmed up (so your heater doesn't have to do as much) and cools the computers/room (serving its purpose), and then it's business as usual.
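    Rough numbers on what that preheating is worth, assuming the duct captures essentially all of the boxes' draw as useful heat (the draw, season length, and energy price below are all assumptions):

        server_draw_kw = 0.4     # continuous draw of all the gear, in kW (assumed)
        heating_days = 180       # length of the heating season, in days (assumed)
        dollars_per_kwh = 0.08   # cost of the heat it displaces (assumed)

        kwh_recovered = server_draw_kw * 24 * heating_days
        print(f"Heat recovered over the season: {kwh_recovered:.0f} kWh")
        print(f"Worth about ${kwh_recovered * dollars_per_kwh:.0f} "
              f"if it displaces electric heat")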

    Another suggestion: when I lived in Salt Lake City, our house had water heating. What if you ran pipes behind the computers, with fins on the pipes (like a heatsink)? Then that water could go into the hot water heater. Once again, saving you some money.

    Where is the room located physically? Don't forget that an underground external room (as opposed to a room in the middle of the house) will be cooler.

    Being true geeks, you're probably not opposed to spending some moolah on this. What about doing something like this [freeserve.co.uk] guy did? If you buried a few large tanks in the ground, deep enough to be below the frost line, you'd get cold water for free. Then just hook all your PCs into water cooling. Have them all draw from the same spot, and then all empty back in. That way you get free cooling, and it'd be quiet. If you look back at my earlier suggestion involving the water heater, you'd be all set.

  • by Kymermosst (33885) on Saturday December 15, 2001 @10:30PM (#2709816) Journal
    You know, geeking out now and then is cool and all, but why, exactly, do you need this much server equipment for a "home network?"

    Personally, I have ONE well-configured machine acting as the firewall, the router, and the file server. There would be a separate machine providing external 'net service (HTTP) if I could think of any damn good reason I needed a web server at my house.

    So, one well-configured machine with 2 NICs, one 8-port ethernet switch and a DSL modem equals: one short Cat5 cable to the DSL modem, 4 power cables (one for the seldom-used monitor), and 8 Cat5 cables run to the rest of the house.

    What you and almost everyone else are describing here is more of what you'd find in commercial setups, and a bit overkill if you ask me. My single-machine setup works just fine, and the advantage of one machine is that you DON'T need any additional cooling.

    All of it fits in a closet, and I can work with the server from any part of the house with a Tektronix X terminal, or the computer that happens to be there.

    So, I guess I wonder what the advantage is of having so many machines that you have to arrange them so people get a "feeling" of what their duties are visually. What's the point of having a huge NOC in your house?

    Is there a point, or is it merely geeking out to the point of overkill, which I can respect but can't logically submit myself to?
  • by raju1kabir (251972) on Saturday December 15, 2001 @10:39PM (#2709840) Homepage
    Finally, place your equipment. Servers should be placed where they most make sense, e.g. don't put the internal file server next to the router and the public webserver on the other end. People should get a "feeling" of what your machine's duties are visually.

    Why on earth is this? Do you hold dinner parties where strangers get to come over and reconfigure your servers? As long as you're bright enough to remember from one day to the next which server is which, who cares how they're arranged? And what is the correct order for a set of servers, anyway? Alphabetical by hostname? Ascending order of system RAM? Uptime? Numerical order of primary service port?

  • by hearingaid (216439) <redvision@geocities.com> on Saturday December 15, 2001 @10:52PM (#2709870) Homepage

    Well, if you have three servers, then no, it doesn't really matter.

    Suppose you have twenty-three. Now think. You're going to sit down in front of these one day after having spent a month in Bermuda. How will you feel?

    • confused; or
    • familiar?

    I know I'd rather feel the latter.

    then there's the geeky-friend situation.

    personally, my favourite solution is to label my computers. give them names, and stick the names to them somehow.

  • by Chas (5144) on Saturday December 15, 2001 @11:20PM (#2709956) Homepage Journal

    One thing to find out BEFORE you begin mounting expensive electronic equipment down in your nice, cool basement is:

    HOW PRONE ARE YOU TO FLOODING?

    My parents' place was in a well-developed subdivision with one decent power drop and one shitty one. Guess which one they were on?

    So every time they'd get a bit of rain, BOOM, out would go the power at their place and every place down the right-hand side of the block, while our next-door neighbors off to the left, and down the left side of the block (we were at the end of a cul-de-sac), had power.

    Consequently, if this happened in the middle of the night (no power meant no sump pump), they'd take on 3-4 feet of water.

    If you're in an area that has no flooding problems, you're set. You can drop your setup down in the basement.

    If you live in an area that's flood prone, then take the extra time and money to rig the server room on the main floor.

    Have a cold-air return in the floor (or low on the wall) blowing directly into the equipment bay. Then (assuming you're in a one story home), have a ceiling ventilation fan above the rack.

    You can find a lot of HVAC supplies to improve your climate control here [smarthome.com]. Look particularly closely at the duct fans.

  • Re:No, no, no! (Score:2, Insightful)

    by Torak- (198078) <.zn.ten.tcidda. .ta. .karot.> on Sunday December 16, 2001 @01:31AM (#2710282) Homepage
    Guh, the trolls on this site get more boring every day.
  • false flooring (Score:3, Insightful)

    by cvd6262 (180823) on Sunday December 16, 2001 @02:01AM (#2710361)
    One thing I haven't seen mentioned that I saw when I worked at IBM was to use a false floor. If you raise the floor 6-12 inches on a simple framework and use removable tiles, you can run cables and cords from anywhere to anywhere and not worry about tripping.

    In fact, they not only used this technique in their server farms, but also on the production line. When they added on to the line, they dug an 8-foot hole and then built scaffolding and a false floor. All the plumbing and wiring runs under it.
  • by gmby (205626) on Sunday December 16, 2001 @02:54AM (#2710470)
    I've seen two-way fans at Target; they change air direction according to the temperature inside and outside your window. But don't put it in the window. Just put one in the door of your server room and move air in/out of the room/hallway. Hint: put a vent in the bottom of the door and the fan in the top of the door to draw heat out in the summer. Winter is not a problem because most electronics can handle cold down to freezing. No humidity, of course. You might need to consider humidity if you have a wide temperature change in a short time; I bet Target also has dehumidifiers.

    Don't forget lightning protection on your meter box and phone/DSL/T1 line outside, BEFORE they enter your house.

    Cover the floor (in the server room) with silver conductive duct tape (the kind you get for AC vent installations) in a criss-cross pattern at about 1' spacing to discharge static from your feet. Use a needle to poke the tape together where it crosses, and ground the grid in at least two places on the tape. Oh well, enough of this rambling....

  • by Bronster (13157) <slashdot@brong.net> on Sunday December 16, 2001 @03:56AM (#2710573) Homepage
    Personally, I have ONE well-configured machine acting as the firewall, the router, and the file server. There would be a separate machine providing external 'net service (HTTP) if I could think of any damn good reason I needed a web server at my house.

    I personally have two machines - one being nothing but a firewall and router and the other being all those handy services that you need on a home network (file storage, DNS, web proxy, testing DB and web server, etc).

    There are good reasons for this split of duties:
    • The firewall is running a minimal setup - no setuid binaries, no listening on arbitrary ports (port 22 is the only open port, and even that is only opened on the internal interface - there's a quick check at the end of this comment), no wu-ftpd or whatever the latest insecure daemon is (oh yeah - no public BIND!!!).
    • I frequently mess with the config of my internal server, trying something different, upgrading to new versions of software. It's hard to keep a system secure under these changes. I very rarely touch the firewall box.
    • Attackers have to break two different machines (which should be running two different OSen, but I'm lazy, and LRP-based firewall systems are easier than picobsd for what I want) to get access to anything. The router machine only has 16MB of memory and boots off a floppy - it's even going to be hard for the attacker to copy a binary in, with no wget or similar installed. If it gets broken, I just hit the reset button, and the write-protected floppy has the same config (which I guess I'd want to check anyway, to see how they got in).


    In summary: a home network needs two machines - one providing security, one providing services.
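    A quick way to sanity-check that "port 22 only" claim from a box on the internal LAN - the firewall's address here is an assumption, so adjust for your own network:

        import socket

        FIREWALL = "192.168.1.1"   # internal address of the firewall (assumed)
        EXPECTED_OPEN = {22}       # SSH on the inside interface, nothing else

        open_ports = set()
        for port in range(1, 1025):  # scan the privileged ports (takes a few minutes)
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.settimeout(0.2)
            if s.connect_ex((FIREWALL, port)) == 0:
                open_ports.add(port)
            s.close()

        unexpected = open_ports - EXPECTED_OPEN
        print(f"Open ports: {sorted(open_ports)}")
        print("Looks right" if not unexpected
              else f"Unexpected: {sorted(unexpected)}")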
  • by Dyolf Knip (165446) on Sunday December 16, 2001 @04:49AM (#2710621) Homepage
    This sounds really cool, but I can't seem to form a good mental image of it. Have you got any diagrams or pictures of your system? Or of the geothermal system?
  • by axelbaker (167936) on Sunday December 16, 2001 @04:56AM (#2710632)
    Unless I am mistaken, you said you live in Canada. The land of ice and snow (according to what I am told). Why are you worried about your computers overheating? Spend the extra $$ you are thinking about for cooling your computers on EXTRA insulation for the rest of the house!!! The $$ saved over the life of the house will pay off big time, and you will help the environment by burning less fossil fuel for heating and cooling. Also, invest $$ in computers that produce less heat and use less power. Use fewer monitors, and use KVM switches. Your 100 watt 21" monitor uses tons more power and produces tons more heat than that 5 watt Athlon. If they must produce heat, put the heat to good use. The suggestion of using the heat to feed the inlets on the heaters is VERY GOOD. The thought of cooling using underground water reservoirs is one of the CHEAPEST, CLEANEST methods of cooling the whole house around. If you spend the $$ on an energy efficient house now, while it is cheap, you will be much happier in the long run.
  • by shoppa (464619) on Sunday December 16, 2001 @07:58AM (#2710900)
    I fully understand the desire for having a dozen machines up at the same time, each doing their "own" thing. But face it: today's computers are so ridiculously powerful that you'll probably be utilizing a percent or two of CPU on each of those machines. If you can consolidate all the functions onto a single machine, you'll be way ahead in the game for a number of reasons:
    • Cooling. This was your primary concern, so I think you'll grok it immediately.
    • Power. (This is actually directly related to "Cooling", but I'll treat it separately because most slashdotters don't know a thing about thermodynamics.) If UPS'ing is important, you'll be able to keep a single server up for twelve times longer than a dozen equivalent servers, given the same UPS capacity (rough numbers after this list).

      Just as a data point, I have recently consolidated all but one of my servers onto a single little box, drawing a little bit under 100 watts. My UPS can keep this little guy up for two hours during a power outage.

    • Redundancy. Want full redundancy for all your operation? With one server, you just double to two. With twelve servers, you have to double to twenty-four!
    • Software maintenance. Do you really enjoy maintaining a dozen different machines? Do you feel you need a dozen different OS installations for some reason? Maybe you feel that no one single OS or distribution is the "right one" for you and that's why you need so many machines? Seriously think about making your own personalized custom Linux From Scratch [linuxfromscratch.org] distro, where you are the guy in control. No more whining about the way Redhat does package configuration!
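    Putting numbers on that UPS point: the two-hours-at-under-100-W data point implies roughly 200 Wh of usable battery. Real runtime curves are nonlinear, and both figures below are assumptions, so treat this as a first approximation only:

        usable_wh = 200.0   # ~100 W for two hours implies ~200 Wh usable (assumed)
        per_box_w = 100.0   # draw of one server (assumed)

        for n_boxes in (1, 12):
            runtime_min = usable_wh / (per_box_w * n_boxes) * 60
            print(f"{n_boxes:>2} box(es): about {runtime_min:.0f} minutes on battery")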
  • Cold Air Return (Score:2, Insightful)

    by SEWilco (27983) on Sunday December 16, 2001 @08:20AM (#2710912) Journal
    Yes, I think that's the key. Have a "cold air return", one of the vents which the furnace fan sucks air from, in the server room.

    Usually a "cold air return" is near the floor, as it's intended to remove cold air in the winter. In your server room, have that air return vent connected to openings near the floor and ceiling. Install the type of grill which can be opened and closed, so you can adjust how much air gets pulled from the top and bottom of the room. This lets you keep air circulating, but you now can remove hot air more easily.

    Make sure you also run a vent from the furnace to that room, and again have an adjustable grill so you can control how much air enters the room. In the winter you probably want to nearly close it, and allow more of the house air to be drawn through the warm room.

    You'll also want a Fan Always On switch. Sometimes there is one on the furnace, and sometimes there is one in the thermostat. Leave the fan always on, so you keep air moving and even out the differences.

    Last, consider an electronic air filter. This is an electrostatic device installed next to the furnace, in the cold air return. It's a couple of hundred dollars, but it removes well over 90% of the stuff floating in the air. If your fan is always running, the air keeps passing through the filter and stays nice and clean. You just have to wash the metal filters - no disposable filters to buy. Less dust in the computers.

  • be prepared (Score:3, Insightful)

    by xah (448501) on Sunday December 16, 2001 @11:08AM (#2711048) Homepage
    Don't forget all of the things that add complexity to the situation.

    1. Problems already discussed: heat, electricity, noise.
    2. Electrostatic discharge. Ground all your equipment properly.
    3. Flood. Keep your servers a few inches off the floor for minor incidents. Keep a backup somewhere on higher ground for major incidents.
    4. Earthquakes, tornadoes. Keep your servers positioned so they cannot fall over, or hit the ground if they do tip. Consider buying a solid steel case to minimize potential crush damage.
    5. Kids. Get a door with a lock to keep kids from endangering themselves in your server room.
    6. Sanity. Get a network connection from your server room to some other location or locations in your house. At this location, put your main workstation, from which you can access all your servers remotely. That way you won't be stuck in the server room for too long.
  • Pollution (Score:3, Insightful)

    by ClockworkPlanet (244761) on Sunday December 16, 2001 @02:32PM (#2711480)
    If I was moving into a brand new house, and was looking to build a server farm properly, I'd be ready - this is one of my favourite "What would you do if you won the Lottery?" answers, and I've spent a lot of time planning it.

    After looking at the server farm at work, I figured the first thing to decide is "What the heck is all that stuff going to sound like in my house?" It's pretty noisy at work, and the walls there are made of breeze block and concrete; I can hear a motor hum through the wall when there's no other noise. In my house, after about 10:30pm there's no noise at all - it's silent. If I leave my desktop PC on overnight, you can hear it.

    I'd certainly soundproof the walls, and if money was no object, I'd add insulation to keep the heat out. I'd then look at some kind of system to pull dust and fibres out of the air before they reached the equipment. We have an extraction system with filters that are regularly cleaned. Houses get pretty dusty, and the resultant build-up only adds to the heat.

    I reckon you'd want to sort all that out before you started with the actual equipment.

