Hardware

Planning a Small Server Room

An anonymous reader writes: "Our company is planning to build a small server room. Initial requirements are for two or three enclosed server cabinets in which various servers and network gear will be installed. The cabinets are planned to hold between 15 and 20 servers of various types and sizes, switches, routers, four dial-in modems for after-hours use by staff who do not have ISPs, and a KVM switch. We also expect to need a small desk as a work area, a bookcase, and storage for spare parts as well as server documentation and records. We know that we need some power protection in the way of a UPS and a generator. We also expect that this room will get quite warm in the summer months, so it will need more air conditioning than the rest of the office. What should we expect for power and cooling needs? Are there any 'rules of thumb' when it comes to building a server room? Good suggestions and help would be appreciated."
  • Link! (Score:3, Insightful)

    by NWT ( 540003 ) on Thursday March 14, 2002 @03:16PM (#3163730) Homepage
    We've had this kind of question once before, but it was for a home server room; this shouldn't be too different!
    Link [slashdot.org]
  • I'm assuming you are building this small room in a rented-office type situation, probably in a building that normally houses more people than computers.

    I've been where you are, and let me say that your #1 problem will probably be cooling. If it's really a closet/old office/whatever, it is unlikely your building management will be able to get enough cooling to you. I'd recommend planning for one of those free-standing portable cooling units that can vent into the drop ceiling. By planning, I mean both power and space.

    Power: Again, better get in touch with your bldg management. Most office circuits are in the range of 20 amps, which sounds like a lot, but isn't -- you shouldn't plan on using all 20 amps, and remember that if you coldstart everything at once, they will pull a lot more than 20. I would plan on an amp per server at least, and go no more than 16 boxes per circuit. You may need a 220 circuit for a movable AC or some other weird piece of equipment.

    Do NOT do the latter yourself. Hire a professional electrician. One mistake in this area can not only ruin your business, but potentially take your life.
    • by Anonymous Coward

      A good electrician will be able to hook up a meter to a few sample servers and get the exact amount of juice they pull. Use the GREATER of that number and the name plate rating on the computer. I would plan on having UPSes that can take 125% of your calculated load.

      Also most wiring in commercial spaces is done in conduit and more than 3 wires in a conduit requires that the wires be derated and not all electricians pay attention to that (again per the NEC). Also the UPSes should be considered continuous loads, so the circuits that feed them need to be rated to 125% of the actual load (per the NEC). The net effect is that you should plan on the electrician using #10 THHN in any conduits.

      Computers often need good grounding systems, so I would also require a separate ground wire to be run in the conduits even though it is usually not done since the conduit can act as a ground. You will also want to make sure the racks are grounded and you may even wish to consider putting a wire mesh beneath the floor and grounding that.
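
      A minimal sketch of that 125% sizing rule, with made-up numbers (the server count, per-server draw, and line voltage below are illustrative assumptions, not measurements):

      ```python
      # All figures are hypothetical; measure your own gear or use nameplates.
      servers_on_ups = 8
      watts_per_server = 300      # assumed steady-state draw per box
      line_voltage = 120          # nominal line voltage

      load_watts = servers_on_ups * watts_per_server
      load_amps = load_watts / line_voltage

      # Size the UPS to carry ~125% of the calculated load...
      ups_rating_watts = load_watts * 1.25

      # ...and, treating the UPS as a continuous load, rate the feeding
      # circuit at 125% of the actual load as well (the NEC rule above).
      circuit_amps = load_amps * 1.25

      print(f"Load: {load_watts:.0f} W ({load_amps:.1f} A)")
      print(f"UPS sized for: {ups_rating_watts:.0f} W")
      print(f"Feeder circuit rated for at least: {circuit_amps:.1f} A")
      ```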

      Finally, if possible, require that the communications cables be run in over sized conduit as well. It makes expansion much easier in the future and also provides a measure of RF shielding.
      • by sigwinch ( 115375 ) on Thursday March 14, 2002 @05:13PM (#3164410) Homepage
        A good electrician will be able to hook up a meter to a few sample servers and get the exact amount of juice they pull. Use the GREATER of that number and the name plate rating on the computer.
        Ignore nameplate ratings on big devices like computers and monitors. They're usually overrated. Ignore measurements. Speaking as an electrical engineer who designs computer peripherals, getting true worst-case measurements is very, very difficult. You have to exercise the hard drive heads, CPU cores, RAM busses, and I/O busses fully, and that's near impossible. Switching power supplies also draw more current as the voltage goes down. If you make the measurement when the line voltage is 130V, the equipment will draw 20% more current at 110V.

        For little things like KVMs, modems, inkjet printers, etc. you can safely use the nameplate ratings.

        For big things, determine how many machines you would ever conceivably want in the room. Choose the biggest, baddest equipment you could possibly want: 1U dual-processor machines, arrays of 15000 rpm hard drives, a desk full of 21 inch monitors, you name it. Then go to the manufacturers' web sites and find the nameplate ratings for the various things, and add 'em all up. The total will be a number you won't easily outgrow.

        Be sure to account for start-up loads. You don't want to trip a breaker by turning everything on at once. Hard drives draw a lot of power while they're spinning up, monitors while degaussing, laser printers while warming up the fusion rollers. This is just an educated guess, but use a factor of 2 for hard drives, and 5 for monitors. Read the specs for the laser printers very, very carefully and find the worst-case.
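
        As a toy version of that budgeting exercise, here is a sketch in Python; every wattage, quantity, and start-up factor below is a placeholder assumption, not vendor data:

        ```python
        # (nameplate watts, quantity) for the biggest gear you could want.
        nameplate = {
            "1U dual-processor server": (250, 12),
            "15000 rpm drive array":    (400, 2),
            "21 inch CRT monitor":      (150, 4),
            "laser printer":            (800, 1),
        }

        # Rough start-up multipliers from the guesses above (2x drives,
        # 5x monitors); for laser printers, plug in the real worst case
        # from the spec sheet instead of a guess.
        startup_factor = {
            "1U dual-processor server": 2,
            "15000 rpm drive array":    2,
            "21 inch CRT monitor":      5,
            "laser printer":            1,
        }

        steady_watts = sum(w * q for w, q in nameplate.values())
        coldstart_watts = sum(w * q * startup_factor[name]
                              for name, (w, q) in nameplate.items())

        print(f"Steady-state budget:   {steady_watts} W")
        print(f"Cold-start worst case: {coldstart_watts} W")
        ```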

        Also most wiring in commercial spaces is done in conduit and more than 3 wires in a conduit requires that the wires be derated and not all electricians pay attention to that (again per the NEC).
        I'd go even farther. When many surge protectors divert a surge, they divert it into the ground wire. This causes a brief, high voltage spike on that circuit's ground relative to the other circuits in the room. The longer the ground wire is, the larger the spike. This spike can do nasty things to serial lines, KVM cables, and so forth that connect machines on different circuits.

        So if the building breaker box is farther than, say, 50 feet from the server room, I'd have a small breaker box installed in the server room. Also, this lets you recover from a tripped breaker without getting the main breaker box unlocked.

        If you can afford it, have a couple of separate circuits run from the main breaker box. This gives you someplace to plug in coffee pots and vacuum cleaners without disturbing the electronics.

        If the room gets its own air conditioner, make sure that has a dedicated circuit from the main breaker box.

        If you can afford it, have a big industrial surge protector installed at this breaker box. Also the breaker box is a good grounding point for surge protectors on your external data lines.

        The net effect is that you should plan on the electrician using #10 THHN in any conduits.
        This is excellent advice. The electrical code is based on safe operation of motors and heaters. Bigger wires make your electronics more reliable by reducing voltage droop.

        Also, computers often don't draw sine wave current. They draw less current at the beginning and end of the AC cycle, and more in the middle. This means the peak current is larger than the sinewave loads envisioned by the electrical codes.

        Computers often need good grounding systems, so I would also require a separate ground wire to be run in the conduits even though it is usually not done since the conduit can act as a ground.
        More excellent advice. Conduit is completely unacceptable for grounding computers. A grounding wire is cheaper than the cost of a single computer crash caused by a poor ground.
        You will also want to make sure the racks are grounded and you may even wish to consider putting a wire mesh beneath the floor and grounding that.
        Have an electrician tie all the racks and other metal stuff together with big ground wires. This will help protect the rest of your equipment if one of the devices has a ground fault. It'll also help reduce static electricity by giving you lots of big grounded metal things to touch. Wire is cheap compared to the cost of a single failure.
        Finally, if possible, require that the communications cables be run in over sized conduit as well. It makes expansion much easier in the future and also provides a measure of RF shielding.
        Conduit does make running wires much easier. If there is no other wiring or fluorescent lights within a few feet, I'd use nonmetallic conduit, as metallic conduit can actually act as an antenna for picking up radio waves and coupling them into your data cables. OTOH if there are AC lines parallel to the run, metallic conduit is probably better, and be sure to make the electrician ground the conduit properly.
        • > laser printers while warming up the fusion rollers

          Alas - the dream of cold fusion rollers was just that...... ;-}

          Methought it was considered a fuser, not a fusion plant ;-} Unless that's one hell of a laser printer you've got
    • I've been where you are, and let me say that your #1 problem will probably be cooling. If it's really a closet/old office/whatever, it is unlikely your building management will be able to get enough cooling to you. I'd recommend planning for one of those free-standing portable cooling units that can vent into the drop ceiling. By planning, I mean both power and space.

      I do agree with this, however there is one problem to mention. Check with the building management before installing such a thing. The building management here was very reluctant to let us install a simple free-standing AC unit.
      They were concerned about water leaks, mostly. These free-standing units need somewhere to drain to. There is an evaporation pan, but the building management was concerned about it overflowing on a weekend and flowing down to the floors below us. A very unlikely scenario, but it concerned them nonetheless. They finally agreed to let us install it after we agreed to tie it into the fire alarm system, to notify them of an overflow.

      • I've actually had a free-standing unit leak into a server room. We only noticed it when a room adjacent to the server room had a wet patch on the carpet. The rat's nest of power and network cables under the false floor in the server room was sitting in about an inch of water.

        Needless to say, we got it cleaned up quickly. However, just because we were lucky and didn't get electrocuted, or have servers explode or anything else as dramatic doesn't mean you shouldn't be extremely careful!
    • >I would plan on an amp per server at least, and go no more than 16 boxes per circuit.

      Now, I'm not an electrician, but this is probably the worst bit of advice here. Don't get me wrong, electricity is important, but let's do the math and determine how good your advice is.

      Amps x Volts = Wattage.

      Therefore, assuming you're using 110 Volt power, and your calculation of 1 amp per server, we get:

      1 x 110 = 110 watts per server.

      It should be painfully obvious from the above that this is way too little power to be running a server on (to say nothing of your expertise with electricity and servers).

      Now, a 20 amp circuit gives you 2200 watts of power. ( 20A x 110v = 2200W ) Assuming 500 watts per server, you can run about 4 boxes per circuit. Remember, computers run on electricity; rob them of power and they are entirely useless. Do not skimp on power.

      In fact, if you want to do power right, you will run ONE circuit PER server. This way, if you need to cut power, or a breaker blows, only one server is affected, not the entire rack. You also don't have to worry about not having enough power for a big server, unless your server wants more than 2000 watts, which is unlikely unless you're getting some big iron servers. And those would likely have dual (or more) power supplies. And you would want them plugged into separate circuits, because if one fails, the other is a backup.

      The overall point is, it's hard to have too much power, and with some simple math, it's easy to figure out your needs and what to set up. Compared to the cost of a server, a circuit is pretty cheap to install (I'm guessing $50 each), so there's no reason to cheat yourself here. If you want to do a server room right, you've got to have good power.
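
      For what it's worth, here is that arithmetic as a tiny Python sketch (the 110 V, 20 A, and 500 W-per-server figures are the assumptions from above; real loads vary):

      ```python
      volts = 110
      circuit_amps = 20
      watts_per_server = 500                 # assumed per-server draw

      circuit_watts = volts * circuit_amps   # 2200 W available on the circuit
      servers_per_circuit = circuit_watts // watts_per_server

      print(f"{circuit_amps} A at {volts} V = {circuit_watts} W")
      print(f"At {watts_per_server} W per box: {servers_per_circuit} servers per circuit")
      ```
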
      • I based my reasoning on real-world numbers, not conjecture. We use 1U Supermicro servers with 250W power supplies. These are rackmountable boxes capable of taking dual Intel Tualatin CPUs at 1.2GHz, up to *4GB* of memory, and two hotswap SCSI drives (as well as onboard video, memory, etc).

        Now, assuming you don't have a balls-to-the-wall system (say 1 processor, 1 disk, and 1-2GB), it's reasonable to assume you're only going to be drawing maybe 150W, or slightly more than 1 amp. If you are using something like a desktop as your server, it will probably be less.

        While I wholeheartedly agree that power is important, one circuit per server is very extravagant except in the most mission-critical circumstances. It's going to cost more than the aforementioned $50/circuit in parts to implement that solution, not counting electrician time (oh, $100/hour?). That'll include plug assemblies, cable, and breakers -- and a brand new breaker box if you don't have enough slots in your current one. And think of the bundle of cables! Remember, this guy was asking about small closet-sized server rooms, not something for an E911 system!

        But the point and the formula are simple: take the time to add up what all your power supplies say. Then get a circuit put in that gives you a quarter to a third overhead.
  • by duffbeer703 ( 177751 ) on Thursday March 14, 2002 @03:22PM (#3163759)
    If you are in an old building, make sure that the floors can safely support the weight of a lot of computers.

    When you initially lay out the room, pack everything as tightly as possible. You'll be happy you made that decision 5 years from now.

    Be careful with roof-mounted air conditioners. They have a habit of spewing ice cold water all over the place when they have a problem.

    Offset the racks far enough away from the wall so that you have enough room to work. Make sure that some dope doesn't push them back on you.
    • We were actually bitten by this over this past summer. I work for the electrical engineering department at the college I attend, and we don't have a whole lot of choice as to where our server room is, or how the air conditioning works; we just got stuck with an interior room in a converted train station. Over the summer, we filed complaints with the maintenance people that the whole AC system was screwed: we were getting so much AC in the server room that we had to HEAT it, and none of the other rooms in that area of the building were getting much of anything. After a few weeks of complaining futilely, we came in one morning to find a puddle covering half our floor, and a steady drip coming from the ceiling. (Thank god it was over a low point in our painfully uneven floors, or some equipment might have been damaged.) After we told this to maintenance, they finally sent somebody over. It seems there was a backup of some sort of crap in the air ducts that caused everything, including that water, to come out of our vent.

    • Be careful with roof-mounted air conditioners. They have a habit of spewing ice cold water all over the place when they have a problem.

      Amen to that. We got a wall-mounted unit put in just under the ceiling in our server room at my last job. We came in one morning to find a dark spot on the carpet. Turns out that that model had a known defect where it would just never stop running, would ice up, and then trip its breaker and shut off, leaving the ice to melt as the servers heated the room to near-oven temperatures. Ended up having to jury-rig a normal, everyday thermostat to the thing to set a minimum temperature for the room in addition to the maximum temp set on the built-in thermostat.

      ~Philly
  • by PhysicsGenius ( 565228 ) <<moc.oohay> <ta> <rekees_scisyhp>> on Thursday March 14, 2002 @03:23PM (#3163767)
    You shouldn't need to use too much cooling. Yes, the CPU produces heat but keep in mind that a server room is a closed environment--no energy (e.g. thermal energy) is actually created. The heat produced is given off by the entropy reversal of information being created. When that information is destroyed, in practical terms just deleting a file, some of that heat is sucked back up and it cools the room back down.

    Of course, if you intend to send large amounts of data out over the internet, the environment is no longer entropically closed and you will experience heat buildup. In fact, Josh Bell proved in 1999 that data transmitted over a CAT5 cable is mathematically isomorphic to heat transferred backward over that same cable.

    Since you are probably intending to have a net link, make sure you insulate your T1 connection well to keep this heat gain to a minimum.

    • Ya know... I've never heard something so stupid put across as sounding so intelligent before.

      Good job!

    • You're assuming that deleting a file is the same thing thermodynamically as removing information. This might work for some mathematical models, but I know of no disk structure that allows its magnetic domains to return to thermal equilibrium for every bit of encoded data.

      On top of this, you are assuming that the computer acts as a better-than-ideal engine. Even if deleting a file did cool the computer, the amount of heat put off would still be extreme, because the computer does work, and I'm sure that code optimization does not mean thermodynamic optimization.

      There are many more problems with this argument, even the internal clock in the computer is going to create this type of entropy generated heat.

      What you absolutely cannot get around is the heat generated by the current in the wires. Outside of superconducting wires, current flowing through resistance generates heat, and that is a source of heat far greater than any heat generated by entropy reversal.

      These are fine theoretical assumptions, but in practice, computers generate a lot of heat.
      • That reminds me of that Ask Marilyn question a few years back that she didn't have an answer for.

        "Why can't we combat Global Warming by having everyone crank their ACs and open their windows?"

        I don't even know where to start on that one. Even if all of the ACs were perfect engines your net effect would be zero. Now take into account the heat generated making all of the electricity to run these ACs, etc. I was dumbfounded. The answer is right there in any 1st year Physics book.

        --Mike
    • Someone mod this guy as funny just to make sure that no one takes him seriously...
    • No, it is not a closed system. Power from an outside source (in the form of electricity) is being brought in. Then it is run through a resistor (otherwise known as CPU/RAM/etc.). This creates heat.
  • by geirt ( 55254 ) on Thursday March 14, 2002 @03:24PM (#3163769)
    The one thing I am missing in our server room is a plain phone....
    • GOOD ONE! Hah! I can't COUNT the number of times I've been inconvenienced by this! A phone with a LOOOOOOONNNG cord! :-)
    • Choose a 2.4 GHz cordless phone. We use them in our server rooms. With a cordless phone you can avoid getting tangled up and knocking shit off your desk. Also, most of them are fairly resilient to interference from the servers.
      • I forgot to mention that it would probably be a good idea to get a cordless phone with a speaker phone on the handset. I believe V-tech makes one although I cannot find a link.
      • Either a phone with a long cord, or a wireless headset...confine use of the headset to within the room and you shouldn't have many problems with reception.....you'll be glad for the headset when you spend 3+ hours with someone's Tech Support fixing something...I been there, trust me!
    • There is a lot of equipment that you can buy with a "call-home" feature. Most of it just calls the manufacturer (you pay for a service contract so they answer), but some will page you.

      Even if you don't want to pay for this feature now, you will in the future (odds are you will eventually have at least one critical machine to keep up).

    • You will want TWO analog POTS phone lines, dedicated to the room. They should bypass your company PABX or VoIP system. They should be ordered as business grade lines, so you can get better service from the telco if they have problems.

      These phone lines will save your career sometime when the power is flaky, or your PABX has gone down, or you have to call two different hell desk lines at the same time (finger pointing? Who? Not me!)

      Since you will have some dial-in modems, ensure one of your telephones is a simple, plain, ordinary telephone, which doesn't require electricity to operate. For the other, follow the other suggestions in this sub-thread; i.e. cordless, handsfree speakerphone, etc.

      And a selection of RJ-11 (not RJ-45) cords, long enough to reach from corner to corner of your machine room. And a couple of banjo breakout connectors.

      And depending on the theft/wandering kit factor in your place, fluorescent spray paint to mark your easily lost phones :-)

      the AC
      I'm back!
      • handsfree speakerphone

        I have had a hard time finding speakerphones that will work in a server room. With almost all half-duplex phones (except for the very rare model where you can manually adjust switching sensitivity) the ambient hum of all the servers and UPSes is so loud that it triggers the switching mechanism such that you don't hear much of the person at the other end.

        The only really effective experience I had was spiriting the expensive full-duplex Polycom from the conference room into the server room. It was later tracked down and I was the recipient of some profoundly dirty looks.

  • Already on Track (Score:5, Informative)

    by antis0c ( 133550 ) on Thursday March 14, 2002 @03:27PM (#3163781)
    You've already listed some good rules of thumb: the air conditioning, shelving space, etc. I can't express how important it is to have good organization. Organize your unused cables too; otherwise, one day you'll end up with a 200-pound rat's nest of cables you're trying to pick through to find a spare UPS serial cable, and it'll take you half the day to un-knot it. Keep your servers and network equipment well labeled too; that way you don't have to describe over the phone to a new employee which server to power cycle.

    Locks: you'll want to have good locks on this room, and maybe a camera in it too. Security is always important, and not only security against intruders: you also want to prevent some uneducated employee from accidentally wandering into the room and pressing buttons. It happens; I've seen it. I've also seen employees wander in, realize their monitor isn't as good as the one in the server room, and switch them.

    Keep it clean - I can't stress this enough either. Server rooms are a breeding ground for dust. Keep it well filtered with air filters, use de-humidifiers to keep the moisture down, and try to limit what kind of cardboard products are in the room.

    I'm no expert on power and cooling, but I think one rule of thumb is: as much as you can get. And redundancy, cooling included: multiple air conditioners and multiple power backups. I've been in many places where air conditioners go out in server rooms and those things jump to 100 degrees in just a few hours.

    That's about all the advice I can offer, good luck.
    • You should be able to get the BTU rating for any servers you're running; multiply the totals by 1.5, and get an air conditioner that can 'cruise' at that level. Have at least two plugs for every piece of equipment you want; surface mounts are your friends. Raised floors, if you can get them, are your friends. Leave space, if you at all can, for at least one full empty rack. You'd be surprised how quickly stuff can mount up.
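
      A back-of-the-envelope version of that sizing rule (all BTU figures and quantities here are placeholders; use the ratings from your own gear):

      ```python
      # (BTU/hr each, quantity) -- placeholder values, not real ratings.
      heat_sources = {
          "server": (1200, 18),
          "switch": (300, 4),
          "UPS":    (800, 2),
      }

      total_btu_hr = sum(b * q for b, q in heat_sources.values())
      cruise_btu_hr = total_btu_hr * 1.5    # the 1.5x margin suggested above
      tons = cruise_btu_hr / 12000          # 1 ton of cooling = 12,000 BTU/hr

      print(f"Equipment heat load: {total_btu_hr:,} BTU/hr")
      print(f"A/C sized to cruise at: {cruise_btu_hr:,.0f} BTU/hr (~{tons:.1f} tons)")
      ```
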
    • Re:Already on Track (Score:3, Informative)

      by wfrp01 ( 82831 )
      I've been in many places where air conditioners go out in server rooms and those things jump to 100 degrees in just a few hours.

      Yup. And then you end up with the door propped open so you can run a fan.

      If you care at all about security, do yourself a big (and cheap!) favor. Install an emergency exhaust fan. Don't forget you'll need air in from somewhere too. If you live in a cold climate, you might like to pull air from outside the building. Otherwise, you might choose to use building air.

      Something cheap like this can keep you up and running while you fix the expensive HVAC gear; without leaving the door open overnight.
      • If you care at all about security, do yourself a big (and cheap!) favor. Install an emergency exhaust fan...Something cheap like this can keep you up and running while you fix the expensive HVAC gear; without leaving the door open overnight.

        An alternative is to mount a lockable steel mesh door on the free side of the door frame leading into the server room. You can get such a door for a couple hundred bucks, which may be cheaper than having the wall cut or windows adjusted for an exterior exhaust fan.

        Then pick up a big floor fan at CostCo and you've got a way to steal A/C from the rest of the building when necessary.

  • Most of the climate control units -- and you really need to think in terms of humidity as well as temperature -- are very noisy. Don't put someone at the desk on any kind of extended basis (>1 day) or he or she is likely to go postal. Noise cancelling headphones may help, but these sorts of rooms are not in general fit for extended human habitation.

    A regular telephone with a speakerphone and a l-o-o-ng cord and possibly a headset is also a good idea for calls to various tech support lines.

    --Paul
  • The rule of thumb I know is that a refrigeration unit can extract no more than about three times its own power rating in heat. Add up the average power at peak usage of all your equipment (say, 4 kW per rack * 3 racks = 12 kW) and divide by three; then you need roughly a 4 kW refrigeration unit for ordinary circumstances.
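
    Spelled out with the comment's own rough numbers (the 4 kW per rack is a guess, not a measurement):

    ```python
    kw_per_rack = 4           # assumed average power per rack at peak usage
    racks = 3
    heat_to_power_ratio = 3   # rule of thumb: ~3x the unit's rating in heat moved

    heat_load_kw = kw_per_rack * racks
    refrigeration_unit_kw = heat_load_kw / heat_to_power_ratio

    print(f"Heat load: {heat_load_kw} kW")
    print(f"Refrigeration unit rating needed: ~{refrigeration_unit_kw:.0f} kW")
    ```
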
  • You definitely want the room to have its own power feed and air conditioning unit. 30 IBM servers going full-tilt, relying on your building's circuits and air conditioning isn't the wisest of choices. =)
  • Get a subfloor (Score:2, Insightful)

    by infernalC ( 51228 )
    You should set this up with a removable tile subfloor so that you can run cables between racks in a pretty fashion. Power is gonna be a big concern. Get cable trays - separate ones for power and data - and mount them under the subfloor. Get yourself plenty of wire ties, too.

    Most of the stuff you'll want can be gotten cheaply at Anixter [anixter.com].
    • Get some drains mounted under it. When the sprinklers on the floor above go off (they will), or the roof leaks, it'll go under the floor, and drain away. Don't put it in a basement, and if you are in an area prone to earthquakes, hurricanes, or tornadoes, put the server room lower in the building. If the area is prone to flooding, move it up a floor or two. If in doubt, mount a moisture alarm in a low spot in the room. (They sell them wherever they sell sump pumps.)

      Under the floor is really where racks are meant to have their cables run. Some flooring units have inserts that act as vents, and that works nicely. Your under-floor is ventilated, kept dry, and your smoke sensors have a higher chance of sniffing the smoke if the air moves through that closed area. There are actually some commercial smoke alarms that continually pump and sniff air, rather than the passive ones we have in our homes that rely upon convection and diffusion for the smoke to reach a sensor.

      Put some sort of dust-handling equipment on your air-conditioning. The folks that sell you the AC will be able to help. See if they can tie the ventilation into the smoke alarm so that if there is an alarm, fresh oxygen stops getting pumped to that room. (They do this on some modern highrises.) The folks that sell you the AC should also be able to help you with sizing the air conditioning to your requirements.

      Call in your local pest control expert to mouse-proof the building, then make sure there's serious screening over all entries into the building. Mice get bored, and for fun, they pick their teeth with the fiber core of the cable running to your most mission-critical server. However, they have to chew through several cables before they find the right one.

      Consider one of those big panic buttons that shut the room down in a hurry. Just make sure it's under a cover so that someone doesn't accidentally punch it. At the very least, place the circuit breaker panel in the room, clearly exposed (meaning don't stack crap in front of it), so that someone can get to it and flip things on and off.

      Also place several of the correct class fire extinguishers there. Place a wall-mounted first-aid kit (some cases have sharp burrs that cut fingers well) near the door. Doesn't have to be fancy. Could just be something on a shelf. Also have a paper-towel dispenser (or just a roll of Bounty) for when someone forgets, takes their drink in, and then knocks it over.

      Finally, plan for expansion. Make the room a bit bigger than you think, but leave one wall that can be bumped out to claim the room next to it sometime in the future. (However, I think server sizes have stopped growing, so the need for more physical space is lessening.)
    • Yup, I agree. Having worked in two situations, one without a raised floor and one with, I can say that the site with raised flooring is much, much better organised in terms of cabling.

      However, you do want to make sure that you don't cram too many cables under the floor; you run the risk of blocking airflow if your air conditioning is trying to dissipate heat using that gap as an airflow path. That said, it's probably not an issue in a smaller room.

      Aside from that, make sure you have enough power points in place and there is enough juice to power them all.

      Finally, for wire ties, try to get some of the ones IBM use (at least for their Unix systems); they are basically strips of velcro which you can easily attach/detach.

    • I'd recommend ladder rack instead. It's easier to access and, properly laid out, helps organize your wiring. Once you fill up the room, getting into a subfloor is a pain. Plus you'll probably lose that floor lifter upper thingie somewhere, and then your magazines^H^H^H^H^H^H cables will be lost to the mists of time.

      A cheap way to manage wire around the perimeter is to fasten 2-4' long U-shaped troughs to the wall at 7' high or so, with a few inches of separation between each section to drop wires down. In a small room like yours, that may be all you need.

      On an unrelated note, don't forget to stash some clean underwear somewhere for when the doors slam shut and the halon goes off.
  • For non rack mounted equipment, wire shelving works very well. It holds up well, and allows for ventilation and water flow (from fire sprinklers). You can then tie wrap the cables to the shelving.

    And either black anodized or chrome plated depending on decor.
    • Speaking of sprinklers... fire will kill your servers, but water will too. See if you can get the building's normal sprinkler system turned OFF for your server room, and have it replaced with something more electronics-friendly (Halon?).
      -Ster
      • Speaking of sprinklers... fire will kill your servers, but water will too. See if you can get the building's normal sprinkler system turned OFF for your server room, and have it replaced with something more electronics-friendly (Halon?).

        Our data center uses a non-liquid fire extinguishing system made by Ansul [ansul.com] that will set off an alarm warning anyone in the room to get out, then it releases an O2 displacing mixture of inert gases (mostly N2). Much better than water.
        • set off an alarm warning anyone in the room to get out, then it releases an O2 displacing mixture of inert gases

          I don't know if I like this. Joe Sysop is behind a cabinet standing on his head in a pile of cabling with a toolbox behind him. *BWAH* *BWAH* *BWAH* "Holy shit, time to evacuate!" Joe heart-rate goes through the roof, he jumps up, hits his head on the case above him, trips on the cabling behind him, knocks the rack over onto his feet, gets thoroughly tangled in cables, and is lying there. *BWAH* *BWAH* *BWAH* Here comes the gas. Poor Joe, he was a helluva nice guy too.
  • by NetJunkie ( 56134 ) <jason.nashNO@SPAMgmail.com> on Thursday March 14, 2002 @04:33PM (#3164138)
    It won't just get hot in the summer; it's going to be hot 24/7. All of your equipment has a rating for the amount of heat produced, from small things like disks and tape drives to big server enclosures. Look that information up and figure out how many BTUs of heat you'll be outputting. Then go find an AC unit that can handle it for your sized environment.
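
    If a device only lists its electrical draw rather than a BTU figure, essentially all of that power ends up as heat in the room; a rough conversion sketch (the wattages here are assumptions, not real ratings):

    ```python
    # Assumed loads: 20 servers at 300 W each plus 2 UPSes dissipating 150 W each.
    equipment_watts = 20 * 300 + 2 * 150

    btu_per_hr = equipment_watts * 3.412   # 1 W dissipated is about 3.412 BTU/hr
    tons = btu_per_hr / 12000              # 12,000 BTU/hr per ton of cooling

    print(f"{equipment_watts} W ~ {btu_per_hr:,.0f} BTU/hr ~ {tons:.1f} tons of A/C")
    ```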

    Power, Power, Power. Going to go with a big UPS or smaller ones for each rack? Talk to an electrician about circuits. Figure up how many amps of power you need and then decide on the number of circuits. They didn't do that in the room I took over and we've blown circuits three times, but it's been fixed on my watch.

    I recommend a locking door, of course. Raised floors if you can do it. And always figure on another rack or two. They seem to multiply and working in a cramped server room switching equipment gets old.

    NOTE: If anyone needs server racks in the RTP, NC area let me know. I have three that would be free to a good home. Glass front nice cases.
    • Consider power. Lots and lots of power.

      We run a pair of 20-amp circuits to hardwired 'plugmold' power outlet strips mounted to each rack, with the 'left' and 'right' side being fed from a distinct UPS and battery (two giant UPS systems, one giant diesel generator).

      Thus every power strip has its own circuit breaker; overloading any one will only take out half of the equipment in one rack.

      • Or better yet, if everything has dual power supplies (like just about any real server), you just split the load over both strips; if one trips, the other takes the load. Now if overloading is your problem, then you need higher-amperage circuits/circuit breakers.
  • Fire Suppression (Score:2, Insightful)

    by thefatz ( 97467 )
    Nobody has mentioned anything about fire suppression except one guy who said sprinklers. Hehe, NOPE. Consider getting a Halon dump system or some other chemical fire suppression system that will not damage your equipment.

    I have seen a disk pack catch fire from a crashed disk. When they opened the door to the disk pack, a sudden backdraft-type explosion occurred. The Halon released just seconds later, putting out the fire, all while the other servers, printers, and mainframe continued to work. Sprinklers would have been a $20 million mistake.

    Fatz
    • Just make sure you can get out. One of our central IT systems server rooms has a Halon system. A huge tank. When the alarm goes, you have about 20 seconds to clear the room, or you'll suffocate, since you can't breathe Halon....


      • If the site is properly designed, you'll probably survive, because the ventilation system is supposed to kick in after a couple of minutes to clear the halon. (But don't assume that is the case at your site.) What I'm wondering is whether blowing out the floor tiles is a "desired" feature of a Halon system. (Ours is supposed to do that, with halon blasting out from under the floor.) Still, as I recall, you would have to be worried about long-term effects from the stuff (cancer, emphysema) ...
      • Doesn't Halon essentially use up all the oxygen in the room so the fire can't burn anymore? That would suck.
        BTW, the cause of all death is lack of oxygen to the brain.
    • Well, you won't be getting a Halon system any more, as they have been outlawed due to the CFC laws. I'm sure there's some non-CFC-based replacement tho.
      • Well, you won't be getting a Halon system any more, as they have been outlawed due to the CFC laws. I'm sure there's some non-CFC-based replacement tho.

        FM 200 [fm-200.com] - in a big red tank and buttons by the doors (on the outside, so you can (a) not worry about access when the place is on fire and (b) not suffocate yourself when you evacuate the aforementioned big red tank).

        • Or Inergen [google.com]

          According to the guys who set up our system, you can breathe immediately after a release of this stuff. It's a mixture of nitrogen, argon, and CO2.
          They also said that it's not a good plan to be in the room if the system does discharge, as a properly designed system has 1.5x the room's volume in the tanks. The installation includes venting to the outside atmosphere; otherwise the windows will be blown out.

          But it is effective at snuffing fires, and relatively benign to the electronics.
  • Fond memories (Score:4, Informative)

    by highcaffeine ( 83298 ) on Thursday March 14, 2002 @04:48PM (#3164221)
    Brings back memories of a previous job I held. Our server room (about 40x65 feet) was a glass-enclosed room with a raised floor for cabling and ventilation. The AC unit was the size of four industrial refrigerators side by side. The UPS was a cabinet slightly larger than the AC unit and held dozens of batteries in series which could keep the equipment in the room running for 30 minutes -- more than enough time for the outside generator to kick in. Each battery was roughly what you would find in a large truck.

    The room housed the servers for our local network and for the WAN which consisted of roughly two dozen buildings scattered around the county. We had a mix of HP/UX, Linux and NT servers -- and even one MPE/iX box (an HP3000 server). We also had our dial in, frame relay, outside Internet connection and terminal servers in the room. I believe there were 6 rackmount cabinets for most of the servers and the network equipment (the HP3000 and our voicemail systems were their own fridge-sized units outside the cabinets).

    It was actually separated into two parts, as well. The main room, which housed the actual servers, was about 40x50 feet. The second part was separated by a glass wall and was 40x15 feet. The smaller area had desks and a couple enclosed rooms where the support staff would usually work. Hardware work was done inside the main server room because of the air control.

    The main things done right with the room were:

    - AC Unit: this thing kept the room at a nice 54 degrees Fahrenheit no matter what was going on outside. The AC in the rest of the building would go out and everyone would start opening windows and turning on their desk fans, while I would retreat to the server room and put on my fleece.

    - The raised floor: We never had a single cable on the floor to trip someone, and we could put a power outlet anywhere in the room we wanted. The floor was about a foot and a half off the real floor and covered the entire room. I loved that raised floor.

    - Security: Sure, someone could break the glass walls (although the building's security system included glass break detectors in the server room), but the doors were very heavy and very thick. Access was controlled by individual keycodes which we had to change regularly. Out of the 50-plus people working in the same area of the building where the server room was located, only three of us had passcodes to it. So we always knew when someone was in there, because one of us would have to escort them in and out of the room.

    - Shelving: We had tons of shelving. We devoted one side of the room to just aisles of shelves, all clearly marked with their contents. The items were kept in alphabetical order by type. So, we had our boxes of cables near the first aisle, memory was near the middle and "Wyse" terminals near the end (a brand of basic vt102 dumb serial terminals).

    - Deskspace: My desk actually was located in our server room, though I was usually on call in another building. But we also had an "island" in the middle of the room for general use. It was large enough to have four people simultaneously working on hardware with all their components spread out around them. We also had a couple workstations on the island that could be used to log in to the various servers and other equipment. These were convenient because they could remain logged in with privileged access to certain servers, and we didn't have to worry about someone using them when we went to chat with mother nature, since access to the room was strictly controlled.

    The only complaint I ever had about the room was that when we would get shipments of 100 new workstations, they would cramp the room up a little until we got them all set up and shipped out to the various other buildings.

    The suggestions I would make for things to consider when setting up a new server room:

    - AC (obviously) and UPS (obviously)

    - Raised floor (you can get by without one, but when you have one you never want to get rid of it)

    - Entranceway security and if possible video monitoring

    - Strict, enforceable access policy (there's no need for the new graphics temp to be wandering around in the server room, but sometimes you'll want to be able to escort the VP through the room so he/she can see all the pretty blinking lights)

    - 1.5 times the rackspace for your initial machine count at minimum, with twice the space initially needed reserved for cabinets

    - Tons and tons of shelving, plastic ties, rubber bands, electrical tape and sticky labels. You never have enough of any of these things. Get plenty of bins of various sizes, too, to use on the shelves for things like screws, jumpers, adapters, etc.

    - It's really helpful to have a common area for all the tools. We actually didn't do this at first and we'd lose a crimper or a screwdriver or something once or twice a week (more often than not we'd find them under the raised floor).

    - If you find you're running a lot of cables into the ceiling to distribute to the rest of the building, get some regular PVC plumbing pipes and a hacksaw to create basic conduits into the ceiling and then above the ceiling to outside the walls of the server room. One of the easiest ways to feed cables through these is to get a string and tie it into a loop so that it runs one length inside the PVC pipe and another length outside. Add a few small loops to the string, and then whenever you want to feed a cable through, hook the cable's connector into a loop and pull the string.
  • Our one mistake... (Score:3, Insightful)

    by arrow ( 9545 ) <mike.damm@com> on Thursday March 14, 2002 @05:04PM (#3164321) Homepage Journal
    We currently have 2 server rooms: one here in MIS, the other in another building. The second room can't be locked(!) due to pointy-haired policy and people sticking useless crap in there. At some point a user slipped a line printer in there to cut down on noise in their office, and the room has been left unlocked ever since.

    My recommendations would be:
    1. NEVER EVER EVER EVER let lusers into your server room. Put decapitated heads on bamboo sticks all around your server room. I almost killed someone when I came in one morning and realized someone had manually ctrl+alt+del'ed our timeclock server because their PC couldn't access it and they assumed it was a server problem.

    2. Replace the door handle on the door with a deadbolt. Nothing says go away more than no handle, and it's fairly easy to just turn the key and push.

    3. Use racks. If your room is already going to be temperature controlled, and it's locked up tight, cabinets aren't needed. Plus, if the venting fans on one of your cabinets die, the cabinet turns into a big thick metal blanket for your servers.
  • 1. Plenty of air conditioning: 2 smaller units capable of 2/3 of the load each, giving you 1 1/3 of the total needed capacity.
    2. Fans for when 1 of the A/C units dies. The cheap ones do a good job.

    (there are 3 things)
    3. Sound-dampening material on the walls and ceiling near the servers, or it will be loud in the room.

    -- Tim
  • by booyah ( 28487 ) on Thursday March 14, 2002 @05:14PM (#3164421)
    Cable management cable management cable management!!!

    well really more than just those, those are just my big pet peeves.

    Good things to have include

    a work bench

    a tool cart

    a phone

    a separate test subnet (firewalled from the real net)

    a good lock

    cooling

    UPS

    generator

    all internal walls

    anti-static floor panels

    and make sure there is room to work today and a few years down the line...

    -Booyah
    • a work bench
      Amen. I am getting so tired of working on the floor. But if you get a workbench, try and ensure that it is ONLY a workbench. We have a good table in our smaller server room, and guess what happened when a couple more machines needed to be added? You guessed it, right on the table.

      a tool cart
      Ditto. Carry your own Leatherman, but it's not a replacement for everything. A good toolkit, with adjustable wrenches, crescent wrenches, socket wrenches, screwdrivers of every imaginable size and shape, is worth its weight in gold.

      a phone
      A high-quality, cordless phone. Having to say "hold on", put down the phone, run over to check something, then come back is a pain and a half.

      a separate test subnet (firewalled from the real net)
      Nice idea...but try getting my management to pay for "unnecessary" equipment. :-(

      a good lock
      Yup - but make sure you know how to pick it, because you will lock yourself out some day, and because lockpicking should be in every sysadmin's repertoire of skills.

      Other ideas:

      • a dead-tree logbook. Not everything can or should go in computers' logfiles.
      • a terminal server (if appropriate for your hardware)
      • a graphical terminal (because setting something up, then having to run back to your desk to check on something is a real nuisance)
      • a labelmaker - label everything, but make sure it's accurate!

  • ...do not forget to consider (in no particular order):

    Fire extinguisher mechanisms

    Easily accessible power circuit breakers

    Room accessibility that allows you to easily put another cabinet in there (or out of there !) - tall enough doors, ramps if you'll have a raised floor, etc.

    Typically, systems pull cold air in from the front and blow hot air out the back (check yours, though); consider this when laying out the overall airflow, as you don't want to waste expensive cold air on the wrong "side" of the systems.

    Remember that the doors on your cabinets will need to be opened. Depending on the cabinet model, the door orientation may or may not be reversible. Another point to remember when laying out the cabinets through the room...

    Did I mention raised floor and structured cabling ?... Maybe I'm asking for too much....

  • I've deployed a lot of small data centers. Your choices in what to buy are actually easier than you think. You can calculate your power and cooling needs down to the last BTU and kVA, and then you still have to buy equipment in a size they actually make! Just go with the smaller end of the product lines.

    You'll end up with a 3 ton or 5 ton air conditioner. Liebert air conditioners [liebert.com] can also humidify/dehumidify and heat/cool. There is a market for used Lieberts if you want to save some money. Call your local A/C contractor.

    A 3 kVA UPS would be a good size, unless you want more standby time, in which case you could go for a 5 kVA, or maybe two 3 kVA units for redundancy.

    Liebert makes great UPSs [liebert.com], too. The APC Matrix [apc.com] line is a pretty good design because you can hot-swap the batteries yourself.
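
    As a sketch of turning a watt budget into a kVA figure (the load, power factor, and headroom below are assumptions, not a sizing rule from any vendor):

    ```python
    load_watts = 2400       # assumed total protected load
    power_factor = 0.8      # assumed power factor for the mix of equipment
    headroom = 1.25         # leave some margin for growth

    required_kva = load_watts / power_factor * headroom / 1000
    print(f"Required UPS capacity: ~{required_kva:.2f} kVA")
    # With these numbers (~3.75 kVA), the next standard size up is a 5 kVA
    # unit, or two 3 kVA units if you want the redundancy mentioned above.
    ```
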
  • Even for as few as four cabinets, it makes sense to do things right the first time.

    Make sure everything is to CAT-5E specs!

    Install a 24-port patch panel in each rack. Consider punching down at least 12 ports in each, if not all 24.

    Run all of these connections back to a 'patch rack' or 'patch wall' and make all of your inter-rack and rack-to-desktop connections at this central location. Document all changes, and you are golden.

    • Have a standard color for patch cables (I prefer blue) and make sure that your crossover cables are a different color (I prefer yellow).
      • mjoconnor81 writes:
        Have a standard color for patch cables (I prefer blue) and make sure that your crossover cables are a different color (I prefer yellow).
        It's a tough call whether to pick one standard color for patch cables, to try to color-code cables by some scheme (black for power, red for serial, etc), or just try to use lots of different colors so you can more easily decide which cable in the giant jumble is the one you need to replace!

        I do agree that one hard-and-fast rule is that crossover cables should be a unique color, not used for any other cable -- I also prefer yellow, but at a previous job the color was pink (because that was one of the few colors the colorblind CIO could differentiate).

        One advantage to the 'yellow is crossover' rule is that IT employees get a legal, free supply of brand new cables, as you have to dispose of all of the brand new non-crossover yellow-jacketed cables vendors tend to include with new hardware.

  • Accessibility (Score:4, Interesting)

    by Webmoth ( 75878 ) on Thursday March 14, 2002 @08:11PM (#3165643) Homepage
    Lots of good suggestions here, but I don't see any mention of accessibility (yet).

    Make sure that you can walk -- and stand up -- BEHIND your servers. Make sure you can open cabinet doors fully. Make sure you can pull a server out of the rack without moving stuff around. Be able to have two people in the room: one in front and one behind the rack at the same time. Make sure you don't have to move the rack to work on it.

    You want a server ROOM, not a server CLOSET. I've seen far too many situations where work on a server involved crawling under desks, moving stuff, craning necks. Hey, moving computers while they are running is A BAD THING: you don't want heads crashing into hard disk platters. Besides, you risk knocking the (power) cords loose, something I've done on several occasions. I've got one customer whose server closet is so small I have to move the rack forward to access the back and then push it back to access the front again.

    I would say that you want at least 3 feet in front of and behind the rack. Typical racks are nearly 3 feet deep, so you want your server room to be at least 9 feet in one of the dimensions.
    Now placing your rack in the middle of the room means you have to get your cabling and power to the middle of the room. Having your patch panel or power outlets on the wall just won't cut it. Use either overhead cable trays (NOT conduit) or a subfloor with removable tiles. Don't run cables above a drop ceiling from point to point in the server room (cables headed out of the room are OK to be in the ceiling). NEVER run cables across the floor.

    Bolt your rack to the floor so you (or an earthquake) don't knock it over.

    DO NOT allow non-network junk to clutter up the server room. That old dot matrix machine gun that nobody will ever use again but you can't bear to throw away can go in a storage closet somewhere else.

    Again, give yourself elbow room. It may be hard to convince the person with the purse strings to pay for space ("but the server will fit in a 3' x 3' closet, why do you need a 10' x 12' room?") that will be mostly empty, but it will make your life easier and will -- practice saying this -- REDUCE UNPRODUCTIVE DOWNTIME. Make sure you get the "unproductive" in there.

    • If you are stuck with a space the size of a wiring closet, do what the professionals do with their wiring closets... install one or two "relay racks [iscdfw.com]" (also known as "telco frames") and use those to mount your systems.

      These are more limited in what you can mount, though with the right shelves they can fit some pretty big PCs.

    • Put your KVM switches in FIRST (before all your other stuff) in the MIDDLE of the rack, or at least in a central point within all the racks. VGA cables are always shorter than you think - and you don't want to have to move stuff about simply because you can't connect boxen to the switches.

  • What should we expect for power and cooling needs?

    Using my Jedi mind powers, I see that you require 240V/20A twist lock outlets, and that your machines each put out 1200 BTU/hour. Therefore, your power and cooling solution is...

    Ummm... hate to break it to you, but /. isn't going to do your homework for you on this one.
  • A few suggestions (Score:2, Interesting)

    by acaird ( 530225 )
    For the size room you're making, instead of a raised floor, consider a cable tray (basically, a ladder a meter from the ceiling) in which you can lay cables. Raised floors are better, but also cost a mint and aren't always practical to install.
    • Also, as everyone said, power. Keep in mind that the UPSes you'll need probably don't plug into normal outlets; pick them out, and call the electrician - Hubbell twist-lock-y things are what you need, but check out APC [apcc.com] (and other) web sites for the specs on the plug; you don't need to know what they mean, just copy them carefully for the electrician.
    • The portable air conditioners are nice, but still need drains - talk to the facilities people while you're doing the lay-out - if there are pipes in the walls in a convenient place, take advantage of them and put the A/C units by them.
    • Don't worry too much about humidity. Back in the day you had to, 'cause there was paper in the computer rooms from the big line printers. I doubt you have that, so make sure the A/C people realize that and don't sell you super-fancy humidity controls that you likely don't need.
    • As has been mentioned - a meter behind the racks, and 2 meters in front; computers are heavy, and the more space you have, the easier it will be to get them into the racks.
    • IMO, don't use it for storage: no shelves, no drawers in the desk/table, etc. It's a machine room, not a storage room. Put the computers in, and stay the hell out. It's tempting, since it's locked, and probably not full, etc. But don't do it; you'll lose control really fast, and it'll be a disaster.
    • Again, power: circuits, and more circuits. However many you have, you don't have enough. One room I designed has two 30A circuits per rack (and in some cases that's not enough), mounted on the cable tray so cords don't have to go back to the wall behind the racks - remember, you need to be able to walk (and carry heavy things) back there.
    • Also, in addition to a phone, put normal old network jacks in the walls. I know, you'll have switches in there somewhere, but probably not near where you want the desk, and nor does the cable tray go there. If you're having cabling done anyhow, a few jacks right near the patch panel are cheap and well worth it.
    • Leave room for expansion; don't pack it as tight as you can. Remember, when you add hardware, you need to add A/C and maybe UPS and power. Leave lots of room. If you're over 60-70% full today, you're in trouble real soon now.
    Good luck.
  • At my company, they went through a total cleanup of the server room and they installed power strips with LED ammeters built in. They are KILLER, and they give surprising readings. Nameplate (rated) power levels are very unreliable, and these strips did a great job of allowing them to quantify on a continuous basis how much power is being drawn through each breaker.

    They're pricey (something like $300 apiece) but the sysadmins all thought they had paid for themselves easily in the first couple of months.

    - Leo
  • Racks (Score:2, Interesting)

    by ag3n7 ( 442539 )
    Use third party racks that allow for variable sizing of rails and better cable management than the standard 'vendor' racks:

    http://www.chatsworth.com/
  • Something nobody seems to have mentioned is that you may need to UPS your AC. There's little point in running your servers during a power failure on a UPS, whilst the temperature in the room slowly rises & they cook in their own juices.

    Here we recently designed a new server room where the UPS includes 6 tons of battery - this is enough to run all our servers & all their AC for 5 hours - which is long enough for us to organise an alternate power feed from another village (we're a bit out in the sticks here!).

    Matt
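
    A crude runtime estimate in the same spirit as the comment above (battery capacity and loads below are illustrative guesses, not actual figures):

    ```python
    usable_battery_kwh = 40   # assumed usable energy in the battery bank
    server_load_kw = 4        # assumed server load
    ac_load_kw = 4            # assumed A/C load (compressors and fans)

    runtime_hours = usable_battery_kwh / (server_load_kw + ac_load_kw)
    print(f"Runtime with the A/C on the UPS: ~{runtime_hours:.1f} hours")
    # Leaving the A/C off the UPS would roughly double this figure, but the
    # room would cook the servers long before the batteries ran down.
    ```
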
    • Don't forget... if you plug anything with a motor into a UPS, better make sure it is rated for it and that it does not void the warranty. Laser printers and air conditioners (compressor motors) draw waaaay too much surge current for most run-of-the-mill SOHO UPSes, and even for some mid-sized ones...
      • Ack... don't misunderstand me - laser printer motors are not the problem - their surge draw is. Laser printers use "massive" (1 foot long or longer) halogen bulbs to heat the fuser roller from the inside, and they draw a lot of power during their heat-up cycle.

        The AC units have a high surge when the compressor is turned on.

        Robert

  • by Nonesuch ( 90847 ) on Friday March 15, 2002 @02:03PM (#3169105) Homepage Journal
    If you are protecting valuable hardware and/or data, consider requiring keycard+PIN for physical access to the server room.

    Make sure you control all access, including the potential for intrusion from above and below -- dropped ceilings and raised floors often make an easy path for a skinny crook to get from a public area to a controlled location.

    For around $1K in equipment you can set up four cameras, a quad combiner, and a time-lapse VCR system to provide a video record of everybody entering and leaving the room.

    We've examined many different options to handle the camera monitoring and recording with a digital system, but there is no PC solution that comes close to the good old $200 surveillance VCR. Plus, videotape is going to be more acceptable when you need to involve law enforcement.

    One last note -- make sure the VCR itself is in a separate controlled-access location. Not much point in a videotape record when the thief can simply eject your tape and walk off with the evidence.

  • 1) Space. We designed a 12x14 room; I compromised and cut it in half. We're going to expand the room this summer so I get the original space planned - there's just not enough room.

    2) A/C. To save money we decided the space didn't need a separate airco. Oh, yes it does - all those boxes make a lot of heat.

  • We recently built a new office and put quite a bit of thought into the server room. You can see some pictures here [greatmindsworking.com] (server room pics at the bottom).

    Some of our considerations included:
    - LOTS of conduit dropping into the server room
    - separate A/C
    - plenty of 110 & 220 circuits
    - separate electrical panel tied to a generator by-pass switch
    - workbench
    - plywood on all the walls (for mounting equipment, stapling, etc.)

    It is by far my favorite room in the building!

    • But WTF did you put the UPS and batteries on TOP of the cabinet? They're really heavy, and should the cabinet ever become unsteady (say, while racking a server), your center of gravity is much higher.
      • We actually had some heavy-duty rack caps custom welded for the server room to accommodate the UPS units. I agree that it does create a very high center of gravity, but we're very careful and it is actually more stable than you might think.

        The real pain was lifting the stupid things up there!

    • Upgraded the site today and realized that my article addressing changed. Click here [greatmindsworking.com] instead.
  • While not strictly a matter of server room design, make sure you take time to organize cables, cable runs, and system placement.

    The system that has worked for me in our midsize machine room (3 Unix, 3 NT, 1 2K, phone switch, and UPSes) is a combination of color and labeling:

    On each cable, put a tag attached to each end saying what that end plugs into and where the other end goes - you can also use numbers and a lookup sheet, but that was too tedious for me. Another good idea someone showed me is to use specific cable colours for specific connections (i.e. blue for workstations to hubs/switches, green for servers, yellow for hub/switch/router to hub/switch/router, etc.) -- it makes visualizing your setup a bit easier.

    If you're using human-sized UPSes, UPS everything. Each machine should have its own 20-25 minute (excluding monitor) UPS, and put maybe 1-2 (maybe three, if they're small) pieces of networking equipment (hubs, DSL routers, etc.) on a UPS.

    If you get the chance to also work with the electrical wiring and want the extra peace of mind, have a power receptacle every 4 feet (I've found that works well in my experience in the case of unpowered racks) -- and if you want to REALLY overplan, have a separate circuit breaker for every wall/group of plugs.

    But trust me on the wiring :)
  • At my school I am a co-director of the Tech Dept.
    We have a server room (that used to be a radio broadcasting room) with a nice AC unit. We have 2 big Compaq servers, 4 hubs, 3 switches, and 4 Ciscos. At least once a week I come in in the morning to find that my partner has turned off the AC because he was cold, and the room has gone to over 100 degrees. Make it a rule, or just lose the control, so that employees can't mess with the AC.
  • If my management caught me asking questions about how to do my job on Slashdot, I'd be shitcanned for sure... Did they in fact interview you before you got this position?!
  • You need to hire an electrical contractor licensed for that type of work. Get someone that can do a power & heat study for you. Make sure that they are licensed and have all the information about the hardware you are going to install in the server room. Your cooling will only be as good as the information you provide them.
