Computer Room Design?

Onion asks: "My company is considering giving us a new Computer Room, and Command Center, as our existing building is nowhere near meeting current needs, let alone future needs. I have seen a few plans for command center furniture, but no real designs or ideas for the layout of these two rooms. We have five racks for the actual computer room, and need around 25 screens for the command console. Add to this bench space for repairs, and things like: a cupboard, bookshelf, plus more storage space, and the design becomes more complicated. We need enough space for three or four admins. Has anyone seen plans for this type of setup ?"
  • Seven Circles (Score:2, Interesting)

    by anti-snot ( 555305 )
    I think there is a rather nice classic example in Dante's Inferno, although that may have very well been a Microsoft Shop. Servers should be (ignoring hardware issues) administered via ssh from nice, comfortable office chairs down the hall. I would be more worried about cooling/venting/security issues than whether or not PC repair and servers can coexist in a small area... our setup is like that and it's quite silly.

    • Good idea, keep PC repair and your servers/wiring
      in two separate rooms.
      • by SN74S181 ( 581549 ) on Thursday July 11, 2002 @02:17PM (#3865562)
        Heavens no. The servers should be a snarl of ethernet cable, switches, hubs, etc. all mixed in with torn apart boxes. If you can't tell which machine is the server except by which box has the SCSI card in it with the LEDs illuminated (cover on said box should be OFF of course) you're wasting valuable company time fiddling with a phillips screwdriver when you should be:

        changing the f-ing toner cartridge on the LJet2 up on the second floor.

        Hop to it, now, admin-boy.
  • by GuyMannDude ( 574364 ) on Thursday July 11, 2002 @01:54PM (#3865396) Journal

    My advice is to build your own command center by studying those featured in the James Bond movies. Most of the bad guys have a fully staffed, very impressive command center filled with computers and cool looking wall-sized flat screens. Make sure you get the center chair that overlooks the whole setup. Not only will it be a joy to work at but a command center like that will impress even the most stingy of VCs or customers.

    Always happy to help,
    GMD

  • by ScumBiker ( 64143 ) <scumbiker@jwenge ... minus physicist> on Thursday July 11, 2002 @01:55PM (#3865404) Homepage Journal
    I'm a little confused here. If you only have 5 racks of servers, why on earth do you need 25 consoles? 5 racks can easily be controlled with one decent KVM; I prefer Blackbox myself. You also didn't say how much room you have. 25 consoles, even stacked 3 high, take up a hell of a lotta room. Get a really good KVM, leave the people that need to be there in the noisy hell of a computer room, and then go back out to your desk and run everything from ssh or a Win2k terminal session. That's the way I do it.
    • In some situations you do need that many consoles, and an actual command center. I don't know what the situation is here, but take a look at these pics from FermiLab control rooms. They're laid out nice.

      CDF control room front [uiuc.edu]
      CDF control room front left [uiuc.edu]
      CDF control room right [uiuc.edu]

      The computers and such are stored in the next room and the rooms below. There are ~ 30 monitors on the walls, plus desks for laptops to plug in. Obviously you aren't doing particle collisions, but it is a really nice setup. Usually a staff of 4, but can support 10 or so if needed.
  • Okay, I'll admit that I'm not the best person to ask about this, but what came to mind (yup, I'm a geek) was the bridge of the Enterprise.

    Bear with me, I'm serious.

    Arrange the main desks into a shallow semicircle, facing one wall. On the wall, you can have whiteboards, charts, an electronic status display, etc. -- general information stuff that the admins might want to look up and see. (Oh yeah... a clock would be good, too.) It sounds like you already have an idea for the workstations themselves.

    For announcements, crisis control, and so forth, the team leader could stand with his back to that wall and comfortably address the entire team.

    Then, around the perimeter of the room, have repair benches, shelves, cupboards, a fridge in the corner, whatever you feel.

    Hmm... geeky, but that's just off the top of my head.

    • The amount of fighting between the techs as to who gets to sit in the "Captain's Chair" would end any chance for productivity ever again. This solution should also only be implemented if you can get a device that will make the room make that weird sonar-like sound that was always on the bridge of the Enterprise in the old shows. :)
    • That's funny.

      Gene Roddenberry laid out the bridge in Star Trek that way for dramatic reasons. Prior space SF usually involved a rocket pilot hero who sat at a control panel, but somebody pushing buttons didn't seem dramatic enough for TV. So Roddenberry went with a ship-like layout. This led to a dialog-heavy show, which was what Roddenberry wanted.

      Amusingly, one class of U.S. submarine copied the Star Trek bridge layout.

    • Okay, I'll admit that I'm not the best person to ask about this, but what came to mind (yup, I'm a geek) was the bridge of the Enterprise.
      Bear with me, I'm serious.


      Actually, this is pretty much what an industrial-grade NOC looks like. Large screens on the wall showing a map of the network, and various metrics on performance, probably a TV news feed or two (strewth, traffic just went through the roof! oh wait, CNN just reported...) and a few rows of consoles for the actual operators to sit at. Usually a raised podium at the back where a supervisor can monitor the whole room.

      Control rooms for power stations, rail networks, oil refineries, etc, follow similar principles.
  • I'm not sure what you mean by a "Command Center". Is that a room to oversee a network from? Or do you run physical machines (manufacturing plant, pump systems, etc)?
  • Oh no! This is not another one of those "how do I make my computer room geeky enough" Ask Slashdots, is it?

    BTW, this is a joke; mods should try it sometime
  • by dotslash ( 12419 ) on Thursday July 11, 2002 @01:57PM (#3865416) Homepage
    There is a special kind of person who can answer your question: an "Architect". Perhaps your company should get their act together and pay a professional to do the job properly. After all, the cost of an architect will be a fraction of the construction cost and will make the difference between a usable space and ... not.

    Think of it in terms of design before implementation.
    • I second that emotion.

      Not only do you need an architect to lay it out, you need an engineering firm to design the HVAC and electrical systems - and no, a spot cooler and a power strip won't cut it.

      Then hire a contractor (that's me) to build it.

      Then hire a cardiologist, 'cause you're gonna have a heart attack when it comes to paying for it.

      Just bite the bullet and do it right. Every owner I've ever dealt with who tried to design or engineer their own facilities has been dissatisfied with the result.
      • Every owner I've ever dealt with who tried to design or engineer their own facilities has been dissatisfied with the result.

        A friend of mine works for a company that just bought a new building. While they didn't have quite that much equipment, they were still able to design a nice center on their own without hiring someone. They had to do some research, but if you don't want to pay money, you gotta get the knowledge somehow.

        I saw it too; it looks damn nice and works great too.

        • on their own without hiring someone
          That might work, if you have the right people in your company. If you employ a few mechanical engineers, they had HVAC in school, so they can look over their old schoolbooks, spend some time working on it, and design your HVAC system. But remember to double the capacity they spec out; it's surprisingly easy to not put in enough. (If you have too much HVAC, you've spent too much money (and you can expand later without adding HVAC capacity), but if you don't have enough, you have a nightmare of new installations and reduced operational capacity.) If you don't have any ME's (EE's and programmers in an electronics/software firm, for example), contract that out.

          But you can get an estimate before you try in-house design. And it is hard to duplicate an architect's skills and experience. A college roommate was in architecture, and he was learning some esoteric stuff. If you need more than one isolated room, think about an architect, she can help you iron out the layout and plan for the future.

          Talk to the professionals, get their estimates. You are making an investment, do it right.

          • I'll bite here. No, you are better off hiring people rather than diverting in-house staff with marginal theoretical knowledge to try and do the job. Example: tying into the building's chilled water system. The landlord might be a bit more comfortable with someone that does this frequently.

            Electrically, the actual work is simple enough, but there are codes that you must/should comply with.

            My experience, though, has been that most people don't really want to do the job right, they just want to have something functional. The parent's dream of a NOC for five racks seems a bit off in the first place. Do it cheap now, knowing that you are skimping, and budget to go back later and do it right!

            (One little code tidbit to spread is that any UPS over 750VA should be killed by an EPO switch at the door of a computer room! A relevant engineering corollary is that you don't put UPS systems in series!)

    • True, but you need to have knowledgeable people from all areas involved. There is A LOT to consider when building a LAN room. I went through this last year for one of our smaller offices. About 6 servers, a switch (telephony) handling 6 T1s, plus associated gear.

      An architect is definitely a necessity, but they don't always know everything you'll need. First off, you have to determine the amount of heat generated in the room, and this is affected by the number of people normally in the room. Take whatever you have now, and double it. You don't want to have to add in another air conditioner later (trust me on this - it's a pain, and very expensive).

      Fire alarms are another big one. In fact, yesterday we learned about problems with them. When we originally set up the system, the idea was the sprinkler system (some type of foam, not sure what) was only designed to go off if both detectors in the room had two signs of fire (heat and smoke). However, I didn't know they also set up the fire alarm with an emergency kill. That is, should the fire alarm be activated, it will cut power to the room. Okay, we've got UPSs; however, they also wired up the one UPS in the room that is smart enough to receive this type of kill signal. (There's also a panic button that will cut the power). Yesterday, the company showed up to test the system, didn't know/forgot about the kill signal, and the test shut down about 1/2 the machines in the room. The other 1/2 were on dumber UPSs and the switch is on its own battery backup. So, in the event of a real fire, the most expensive piece of equipment in the room would have been powered when the foam came down!

      Other things - racks right against the wall are bad; leave room for someone to squeeze behind them, because you'll need to eventually. Lighting is another issue: it should be bright, but you don't want a lot of glare. Where are the cables going to run? In the walls (not a great idea when you want to replace a bad jack) or overhead cable trays (which can get in the way of lighting and cooling)?

      Short answer, you need professionals who not only understand architecture, but also power, alarms (fire & security), cooling, and finally usability!

    • I was the IS manager at a large architecture firm for years. Architects are a pain in the ass to work with, but a good architect is worth his weight in gold. Architects at their best are integrators and managers as well as designers.

      Other posters have mentioned heat, alarms and fire suppression levels as examples of things an architect might not understand. In my experience, most architects are pretty good about subbing out work they know they can't do. Architects don't draw plans for the whole project - they design the frame and facades, the interiors and the fixtures, but they usually leave other work for specialists. This means that your plumbing, electrical, etc. will be handled by a consultant who does it for a living.

      Your best bet is to find an architect with demonstrated experience in the kind of project you have in mind. Then ask for references and visit his(her/its) past projects. If possible, talk with the internal project manager at those sites (the person who dealt with the architect on a daily basis).

      Nearly any architect can probably do your job, but it will be very painful with one who is inexperienced. Architects have a long and formal internship system - let your architect get his computer room experience as an intern on someone else's project, not as a lead on your project.
      • Good advice.

        I'd add, though, that you would still probably like to have at least a good conceptual idea about what you're asking for before beginning consultations with an architect. Don't assume an architect knows what is and isn't important to you.

        Also remember, the most important design decisions happen early. As the project progresses, the broad brush strokes of the early conceptual design will become more detailed.

        Good architects listen, communicate, and would like your feedback. If you find yourself doing business with someone who does the meet-and-greet and then disappears for awhile ... you might consider finding someone else.
    • You should consult an Interior Designer. The building is already built. As to others who say that you need someone knowledgeable about computer stuff, a competent designer will ask you about your requirements. It is an interactive process.
    • There are many complexities to designing a computer room and command center. I happen to work for a company that is the best at designing them. Information on the company can be found at http://www.bruns-pak.com. Look at our client list. We have been in this business for over 20 years and have partnerships with the leading equipment companies. I am not trying to advertise, but since you asked for help... there it is.
  • I think there is one all set up the way you want right here [slashdot.org]
  • Has anyone seen plans for this type of setup ?"

    only in the house I designed with my wife.... I wonder why she left me?
  • by qurob ( 543434 )

    http://www.spacedesigntechnology.com/

    It all depends on what you want to spend.
  • VNC Sessions! (Score:3, Interesting)

    by agrounds ( 227704 ) on Thursday July 11, 2002 @02:14PM (#3865547)
    We're extremely co-located here at my current job. In fact the closest server is two hours from me. (This is for security reasons) Anyway, we do it all with just a few terminals and a whole lot of VNC [att.com]. I think the best answer for you is to set up a few simple boxen that exist to only run VNC sessions for guests and the like, and then hook up a tunnel encryption to the servers if you are worried about it. I can honestly say that Zebedee [winton.ork.uk] has been the easiest thing to set up. It runs over port 11965 if you want to push it out the firewalls as well.

    KVM switches rock, but tie you to one location, and then you fight over the terminal with the other admins. When you can do it all from your desk with just a click, why not?
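
    For anyone wanting to try the tunnel half of this, here is a minimal sketch (not the poster's Zebedee setup; the host names, addresses, and display numbers below are made up for illustration): forward a VNC display through an OpenSSH local port forward, then point a viewer at the local end of the tunnel.

      # Python sketch: wrap an OpenSSH local forward for a VNC session.
      # All hosts/ports here are assumptions; VNC listens on TCP 5900 + display number.
      import subprocess

      JUMP_HOST = "admin@gateway.example.com"   # hypothetical bastion reachable over ssh
      VNC_SERVER = "10.0.0.42"                  # hypothetical box running a VNC server on display :0

      # -N: no remote command; -L: forward local port 5901 to VNC_SERVER:5900 via JUMP_HOST
      tunnel = subprocess.Popen(["ssh", "-N", "-L", f"5901:{VNC_SERVER}:5900", JUMP_HOST])

      # Then, from another terminal:  vncviewer localhost:1   (display :1 == TCP port 5901)
      # Call tunnel.terminate() when the session is finished.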
    • Re:VNC Sessions!?? (Score:2, Informative)

      by mattster999 ( 591497 )
      Um, sure you can do it with VNC, or you can get a Matrix KVM switch (not a name brand) that is a many systems to many consoles layout. They even have IP-based consoles (software that will tie into the switch) for the 2-mile away ones. We have an older model here (non-IP) that works great. It all runs over a pair of cat5 cables for each point. 40+ servers, 5 consoles throughout the building for the various user groups, logins on the consoles integrated with the domain decide which servers you can choose from. Works great. Ours is by Avocent. --Matt
    • They keep the servers at least two hours away from you for security reasons, huh? Seems like it would be easier to just fire you and hire somebody they trust... ;-P

    • So you can fight with the other admins over the vnc connection and you aren't even close enough to smack them if they force you off?
  • by duffbeer703 ( 177751 ) on Thursday July 11, 2002 @02:26PM (#3865619)
    Unless you want to get sued and fined by OSHA.

    Noise levels, particularly high-pitched noise, in computer rooms are way too high for human habitation. You will go deaf if subjected to them for many years.

    If people need to sit in the raised floor area, there needs to be a wood/glass wall between them and the computers and chillers.

    • New units are much quieter than the older units. I have a printer that's louder. We're a state institution and we have a computer room staffed almost 24/7. Since we are state, we KNOW about OSHA and they have yet to say anything. A well-designed center can mitigate things like chiller sounds and the like.
      • The agency better make ear protection available to employees, or the employee unions will eventually raise a major fuss.

        Hearing loss is something that really happens to people in IT. Hard disks in particular are terrible for the ears and operators handling tapes should be wearing ear protection.
        • What union? We are non-union and intend to stay that way. And if it really bothers us, we can get ear protection if we want. But if you have actually BEEN in a computer room, you'd realize it ain't that bad. I have been working in one for 5 years and I hear just about as well as I did before I started to work there. Maybe way back when the computers actually made more noise than current PCs do, well, then you'd have a case. Current PCs and servers don't make enough noise to be heard over the air conditioners and everything else going on. Handling tapes with ear protection? No way. Then I would never hear the console attention beep. I believe sight loss is actually worse than any hearing loss. We IT folks stare at CRTs way too much. LCDs will be a godsend!
      • No, new units are not quieter than old ones; in fact, because of the increasing power usage of CPUs and reduced case sizes, new systems are often louder. For example, our brand new Dell 2U dual-PIV Xeon has 4 high-speed exhaust fans that are louder than our 16-way Sun by a long shot. In fact, that system made it so that I hate going into the datacenter now.
  • My 'command center' (Score:2, Interesting)

    by Mordant ( 138460 )
    consists of my laptop running Slackware 8.1, Fluxbox, OpenSSH, and Galeon.

    If you use automation properly, and technologies like ssh and VPN, etc., you don't -need- a 'command center', no matter the size of your organization.

    Server rooms, network rooms/closets/PoPs, are absolutely necessary, and should be designed properly, w/racks/raised floor/UPS/etc. 'Command centers' aren't necessary at all, except for stroking one's ego.

    That's not to say they aren't -cool-, but they're really passe.
    • If you use automation properly, and technologies like ssh and VPN, etc., you don't -need- a 'command center', no matter the size of your organization.
      If there is a real network 'incident' in progress, you may not be able to reach the systems via SSH. Internet connectivity (and thus VPN) may be under attack, or disabled to address security concerns.

      Perhaps many (most) enterprises do not need a full-bore NOC, but a good "situation room" with full system access and room for at least a half dozen people makes a lot of sense.

      For example, an 8K sq. ft. datacenter might have two such rooms:

      1. A 10x12 NOC-like room, just outside the machine room, where the technical operations folk have their workstations, along with a couple of big network overview displays, good for impressing visitors. Also, you have to pass through the outer room to get to the datacenter, so the techops guys can keep an eye on what's going on.

      2. A 8x16 "console room", where the real admins can get full serial (and KVM, for the windows machines) console access to all of the systems. Both console systems are independent of the primary LAN.

        Server rooms, network rooms/closets/PoPs, are absolutely necessary, and should be designed properly, w/racks/raised floor/UPS/etc. 'Command centers' aren't necessary at all, except for stroking one's ego.

        That's not to say they aren't -cool-, but they're really passe.

        I disagree. If you intend to run a 'lights out' datacenter, it is advantageous to have a small command center to house your technical operations staff, so they don't have to sit in the dark.

        All the better if there is enough room and spare terminals to get a team together to address a problem, with at least one big display so all the bigwigs can crowd around the one guy who knows what commands to type.

      • I don't -want- the bigwigs to crowd around. I want them to stay the hell away, heh.

        Serial terminal servers fulfill the 'what if network connectivity to the boxes is lost' requirement, for *NIX hosts, routers, and switches.

        Why do I need a separate room? If I have to, I plug my laptop into a switch in the datacenter, or in a cabling room. If connectivity from my home DSL into one PoP is down, I go through one of the 5 others we have in different parts of the world. The only thing I can't fix remotely is a bad patch cable to a host/router/switch which doesn't have redundancy.

        FYI, the company I work for does about $20B/year. We're 24/7/365 worldwide - and we don't have a NOC.
  • Workcenter design (Score:3, Insightful)

    by Lando ( 9348 ) <lando2+slash&gmail,com> on Thursday July 11, 2002 @03:08PM (#3865909) Homepage Journal
    You really need to step back and consider what you want to do here.

    Do you want a data center (i.e. a computer server room), a NOC, a workroom, etc.?

    If you have 4 admins, why do they need to see a massed array of consoles? That's a job for Operations, not administrators. Put your administrators in cubes... there are various cube designs out there.

    For the datacenter, use open space and a raised floor if possible.

    For Operations, a couple of screens, maybe 2 for each operator, and use programs to pipe information to said operators; they shouldn't need a screen each. If you're running MVS systems which "require" dedicated consoles... you can find cards and "hllapi" interfaces to run several console servers on one machine.

    It doesn't sound like your needs are actually that big.

    Myself, I would just take over some custodial closet for the computers, beef up the ventilation and add in a UPS system... You're going to need an electrical engineer to look over the power situation, which should provide most of the space requirements for you.

    As far as the admins go, find out what they want; usually 6x6 cubes with room for 2 computers are adequate...

    As far as working on machines, find a room somewhere 20*40 or so that has a locking door. This way you're not concerned with parts walking out the door...

    In truth, the computer room will be more of an engineering design, and set by your space requirements. The workspace for the admins is mainly set by getting input from management and admins.

    What difficulties are you foreseeing?
    • On a related note, it's worth your while to physically separate the 'NOC' from the 'DataCenter'. Put up a solid wall, and locking doors with good access control (keycard, etc). Discourage people from going into the datacenter to perform trivial tasks that do not really require physical access.

      This also helps keep the cold air in the datacenter where it belongs, rather than causing hypothermia for the poor network operations people.

  • by 4of12 ( 97621 ) on Thursday July 11, 2002 @03:14PM (#3865948) Homepage Journal

    First,

    • raised floor
    • refrigerated air
    • big electric circuits
    then think about UPS units.

    And, just as you can never be too rich or too thin, you can never have too much storage space.

    After that, move onto your network drops, benchspace, lighting, chairs, etc.

    I'd advise having some locks on the doors, too, not only for the obvious security implications, but also so you have a place to hide when things go south (have a prepared placard to the effect of "We're actively working on the problem and will update you immediately as it's fixed.")


  • One good suggestion is to let a pro design your new datacenter, or at least help you out. I've had some really good experiences with IBM Global Services - they worked with us on overall design, and we bought and installed the raised floor, cable management, cabling, UPSs, cooling units, etc... through them. Companies like that have a lot of experiential knowledge you can lean on; it takes a lot of practice to learn how to design a rock-solid datacenter without overdesigning too much.

    That aside - You say 5 racks and 25 monitors in a command center - sounds like one monitor in your command center for every machine you own or something. Consider using switches and keeping your command center down to 4 heads or so (or however many you think you need for simultaneous admin access). You can set up a network of KVM switches such that any of your 4 head units can reach any of your 25 machines easily. If they're *nix, skip the KVM stuff altogether and just go serial console. Cyclades makes a nice Linux-based serial terminal server with ssh support and whatnot.

  • It's been said before, but, put your servers and routers and such in their own room. You don't need to be in there. They're machines that aren't supposed to need that much upkeep. Administrate them via SSH/VNC/whatever. They're also more secure in a dedicated room, and you can install a Halon system if the room isn't usually occupied by people.

    In that room, it helps to have a raised floor to #1, route cables under, #2, provide a place to put more ventilation conduits, and #3, keep the racks off the floor in case of flooding. You'll want this room to have its own air conditioning [1] and filtered electrical power. Your utility company can bring in separate "clean" power to that room, or you can just give the room its own few breakers in the breaker box and filter the power with a UPS system, depending on your needs and size. As far as UPS systems go, most server rooms I've seen have banks of batteries for this purpose, which run the entire room on nice, clean, uninterruptible power. The company I work for even has a diesel generator on the roof with a 2-day supply of fuel, which is tested every month.

    For the actual servers and routers, do the obvious: use racks. If your servers aren't already rackmounted, invest in rackmount PC cases - they're not too terribly expensive these days. You can fit quite a few 4U servers in one rack. My company generally has one KVM cluster for each 1.5 racks (keyboard/monitor/KVM taking up shelves on .5 of one of the racks).

    It's not hard to build a good server room, and it doesn't necessarily have to be expensive. It just takes some planning. :-)

    - Eric

    [1] This doesn't just mean keeping the temperature down. Most server room A/C units also closely control humidity, since too much or too little humidity means bad, bad problems.
  • Remember, CRTs pump out a lot of waste heat!

    I'm not sure why 25 screens are required?

    Most of my servers run 'headless', with serial ports connected to a terminal server. I have a single console server that handles all of the serial connections from the individual systems' serial console.

    All of the routers, switches, UPS systems and other 'infrastructure' is on an identical setup, with extra security and logging.

    In this design, I have one desktop system (FreeBSD) and screen (18" LCD) for each operator station, plus two large screen displays that show the current network status (one map, one showing alerts and status messages from the monitoring software).

    The remote serial consoles are accessible via SSH (and strong authentication) from anywhere in the local network, so sysadmins and network admins can perform their duties without having to physically visit the data center.

    By using the free 'screen' software to handle the serial port connections, we get a disk log of console activity, a scrollback buffer, and the ability to 'kibitz': two users can share access to a single console, even though one might be in the NOC and the other at home connected via VPN.

    This design scales up well; I can get ~100 consoles on two PII/300 machines (retired desktop PCs running OpenBSD), and adding additional hosts is as simple as buying another terminal server.
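
    For comparison, a minimal sketch of the disk-logging half of that idea in Python with pyserial (the device path, baud rate, and log location are assumptions, and this is not the poster's screen-based setup):

      # Log a serial console to disk with timestamps, so there is history after a crash.
      import time
      import serial  # pyserial

      console = serial.Serial("/dev/ttyS0", 9600, timeout=1)   # assumed console settings

      with open("/var/log/consoles/host1.log", "ab") as log:
          while True:
              line = console.readline()                        # b"" when the timeout expires
              if line:
                  stamp = time.strftime("%Y-%m-%d %H:%M:%S ").encode()
                  log.write(stamp + line)                      # timestamped console history
                  log.flush()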

  • We have five racks for the actual computer room, and need around 25 screens for the command console. Add to this bench space for repairs, and things like: a cupboard, bookshelf, plus more storage space, and the design becomes more complicated. We need enough space for three or four admins. Has anyone seen plans for this type of setup ?

    5 racks is not a lot, but leave space for 10 or 15 for future expansion. If the PHB gives you any grief on this, you want to present him with a paper to be signed by the CEO that he is not planning on any growth at all over the next decade. No new customers, no increase in revenue, profits, or employees. That is usually enough to get them to approve a larger space for you and the machines.

    You also need to hire someone who has done this before, who knows all about all the little things like glare from the windows and morning/evening temperature shifts in the building and HVAC. A knowledgeable contractor will then sub-contract the various bits to other professionals. Certainly you will need an HVAC team, an electrician, cabling guys, fire suppression specialists, a security guy, and an architect. Yes, all that for just 5 racks and room for 4-10 admins.

    You will need aircon to keep the racks cool. Count on 5000 watts of power from 5 fully loaded racks, which equates to 15000 BTU/Hr of cooling needed to keep the servers running. Most office buildings can only do 2-3000 BTU/Hr of cooling in an area, so the machine room will need local, dedicated cooling systems, which possibly means an external chiller and water pipes under the floors.
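
    As a quick sketch of the conversion behind those numbers (the 5 kW load is the comment's own estimate; 1 watt of equipment power becomes roughly 3.41 BTU/hr of heat, which actually lands a bit above the 15,000 BTU/hr figure, and 12,000 BTU/hr is one "ton" of cooling):

      # Rough cooling-load arithmetic; not an HVAC design.
      watts = 5 * 1000                  # 5 loaded racks at roughly 1 kW each (assumed)
      btu_per_hr = watts * 3.412        # ~17,000 BTU/hr of heat to remove
      tons = btu_per_hr / 12000         # ~1.4 tons of dedicated cooling, before people,
                                        # lights, or any safety margin
      print(f"{btu_per_hr:.0f} BTU/hr ~ {tons:.1f} tons of cooling")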

    Each rack requires 3 square metres (or yards) of space, 1 square metre for the rack, and 1 each front and back for the doors. Space above the ceiling and below the floor for cables, electricity, water, fire suppression, drains, etc. Repair area must be separate from the main machine area, physically and electrically.

    Electricity will be specced by the electrician for the power-on surge, multiplied by the Power Factor. Only a licensed electrician can tell you what all the local laws require/forbid and can sign off the installation with the local authorities. YOU can NOT do this unless you go to night school for a long time and pick up an electrical contractor's license. Cheaper to hire a pro :-)

    Human work spaces can NOT have any noisy equipment running; the cool-looking cabinets should have sound damping. Google for the relevant laws on dB-per-day exposure in your area. The HVAC guys will have to provide large-surface-area, low-noise vents, with multiple zones and controls. Consider putting the fluorescent lights onto 3-phase power. Incandescent floor lamps if you don't/can't.

    There are about a thousand things to know about when building a dedicated machine room and "star trek" command centre. My best advice would be to hire the pros, and watch and note everything they do. Then next time an employer asks you about this, you will have a valuable job skill.

    Check fatbrain or amazon for books about this topic, I'm sure someone has written this down at least once.

    A few years ago I specced out a small machine room, with 16 racks. When the room was originally planned in 1996, the company just didn't count on any growth whatsoever. None! They had two racks then, and figured that would hold them for ever. Totally clueless PHB (but I repeat myself). 3 years later I came on the scene, just before the new building was finished. I pointed out the lack of HVAC, electricity, access control, etc. We asked the HVAC guy if he could cool 30000 Watts of power, and he just laughed. Maximum of 5 desktop PCs in this whole area was his answer, not the 60+ currently planned. Anything more was going to cost $$$, and time. So I got to write up the whole spec for the area over a weekend, and they put it all in, at a huge cost but only delayed the move-in by a week. At first they swore I had over-engineered by a huge amount and threatened lawsuits, but then dropped the whole matter. Last month I saw the room, all 16 racks full of equipment, 90 PCs in the command area, cooling and electricity stressed to the limit. They told me I had saved the company a few years ago, because if they hadn't built the building correctly before moving in, the cost of a retrofit would have been 10X to 20X the original cost, and the lost business due to a year delay would have sunk them even during the internet boom. But I knew enough then to ask all the pros for their advice, rather than do it all myself. A thousand details? Nahh, much, much more.

    the AC
    Freelancer looking for a job in .eu land, will even do machine rooms for food (and drink and ). Leave a reply here if you are hiring
  • My advice is to check out the HVAC contractors you use.

    Almost every computer installation is going to produce heat. A lot of heat.

    Typical HVAC people don't understand this for some reason. They look at the square footage and then do some calculations for how many BTUs they think they will need. Several times on projects I've asked, "You are aware that we're going to have a lot of computers in here and we need a LOT of cooling, and almost no heating (if not even some cooling), in the winter?" In at least a couple of cases I got an answer of "oh yeah, we do this all the time", and then they proceeded to install a system obviously not engineered for our application: i.e., undersized during the summer, and no cooling (heating only) during the winter.

    Even a typical 75W average 1U server throws 250 BTU/hr. A rack with ~40 of these is 10,000 BTU/hr. This is enough heat to raise the temp in a well-insulated 12x12 room 40 degrees or so. Try putting 5 racks in.

    For reference, a typical 120V window mount air conditioner is typically under 12,000 BTU's.
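
    The same back-of-the-envelope math for the figures above, as a sketch (the per-server wattage and the 40-servers-per-rack count are the comment's assumptions, not measurements):

      watts_per_server = 75                                   # "typical 75W average 1U server"
      servers_per_rack = 40
      racks = 5

      btu_per_server = watts_per_server * 3.412               # ~256 BTU/hr, i.e. the ~250 quoted above
      room_load = btu_per_server * servers_per_rack * racks   # ~51,000 BTU/hr for 5 full racks
      window_units = room_load / 12000                        # vs. one ~12,000 BTU window unit
      print(f"{room_load:.0f} BTU/hr, roughly {window_units:.1f} window-size AC units")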

    At some point you don't have to heat during the winter - just cool year round.

    And in case I didn't mention this strongly enough to start with. Most HVAC contractors just don't get it. Make sure you get one that does.

  • Pay for a pro to do it and you can't go wrong. You have no idea what future requirements might be, but they might. You see, some servers like IBM's big Regatta p690 have to have a raised floor, and certain load ratings, height and other specs on that raised floor have to be satisfied before IBM will support them.

    IBM Global Services can do this, and Bruns-Pak [bruns-pak.com] is a company who has their stuff together. They did a seminar I attended on fault-tolerant centers... centers designed to withstand a category 5 hurricane, with power, water, and even food and sleeping quarters being supplied to the staffers during the whole event. A lot of companies forget that operations staff and sysadmins need food while taking care of an emergency, especially one that lasts for several days.

    They can build you a center that costs 1 dollar per square foot (no raised floor, basically a room) or X amount (too high) with all of the robustness of NORAD's Crystal Palace. It all depends on what you want/need.

    If I were to plan one for our location, I would build a one-floor facility with no windows, chillers in a protected area (inside the building, not on the roof), generators in a protected area, UPS, a nice high raised floor, and possibly a tall ceiling (I hear that the new standard ceiling height is 40 feet... for taller racks). The building should be hardened against the weather in your area, meaning it should be very tough in Ohio so it could withstand a tornado. Ohio doesn't get very many cat 5 hurricanes, so we don't have to go that far. Also, have an operations center so your operators don't have to be in the room. This saves their sanity from the rush of the A/C units. Have a BATHROOM not far from the center, preferably connected to it. You could make it a locker room so that if you have a multi-day emergency, you could shower. Have food (at least vending machines, or maybe an emergency box of food that can stand to sit on a shelf for a while; MREs would work). Also, you need a coffee pot (GREAT for LONG NIGHTS!). Also, get cable trays for managing the cat5, cat6 or fiber. They are for more than making things neat; they also protect the cable from an errant floor tile landing on a fiber and crushing it, or from cutting a patch cable. That's just the things I could think of. The pros will not forget anything. It's what they do. Don't just pay those pros to implement YOUR design... have them do the design with you. It's not the big things you will miss when doing a computer room. It's the little things that turn out to be big things that you miss. This is why I say GET A PRO!
  • Use raised floors. This helps you organize your cable plant. This is a necessary evil.

    Use full-length racks (i.e., racks that aren't just a single frame but are multiple frames to support the backs of large objects). Buy racks with built-in power management, or install your own quality power management system with remote monitoring and control capabilities, like APC's solutions.

    While we're on the topic of power, the entire server farm should have conditioned power (ie, large UPS or battery solution) and should also have backup generator power. We're a small regents school in a midwest state. Even we have a natural gas generator for our server farm. Also put the admins' workstations on these protected outlets (mark the outlets well and educate the admins on not plugging in their refrigerators or space heaters into these outlets). Use multiple protected circuits in your server farm. You shouldn't have 10 machines and monitors on a single circuit. Never put a laser printer on the same circuit as a server.
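
    To see why piling 10 machines and monitors onto one circuit is a problem, here is a rough sketch (the per-box wattages and the 20 A / 120 V branch circuit are assumptions for illustration only):

      machines = 10
      watts_per_machine = 300            # assumed draw per server/workstation
      watts_per_monitor = 100            # assumed draw per CRT monitor
      circuit_volts = 120
      breaker_amps = 20                  # typical branch circuit; derate to 80% for continuous load

      total_watts = machines * (watts_per_machine + watts_per_monitor)   # 4,000 W
      amps = total_watts / circuit_volts                                 # ~33 A on one circuit
      usable = breaker_amps * 0.8                                        # ~16 A continuous
      print(f"{amps:.1f} A drawn vs {usable:.0f} A usable -> spread the load across circuits")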

    Build an extensive KVM system and don't forget to utilize remote serial consoles too. Those two should always be built in parallel. All too frequently one is built without the other.

    You need good cooling. Strike that, you need excellent cooling. Your room should be very cool. Heat is the number 1 killer of electronic components. Keep the room cool and have your admins bring in jackets if they need to work in the server room. The room should be cold enough that a human feels very chilly, almost cold, when they are in the room. This is another reason why you don't want your admins stationed in that room. They'll always be sick!

    Your personnel should not be placed in the same room as the hardware. The background noise would drive them insane. Also consider possible medical risks from being around monitors and hardware all day long. Sure, it hasn't happened yet but I believe that someday some medical thing will be blamed on all those electronics. Better safe than sorry.

    It sounds corny, but strictly enforce a no eating/drinking policy in the server farm. Consider the room a pseudo clean room. An accident at your workstation might cost you a keyboard. An accident in a server farm might cost you a Sun Enterprise 10k (or a lot of parts and time).

    Build your server farm with network storage and backup in mind. Build a SAN infrastructure into the design. Something I personally love to do is build a secondary, server-farm-*only* high-speed network that all servers are a part of. Each server connects to this network with a 2nd NIC and uses RFC 1918 addressing. The network is not available from the outside world. You can use it for secure high-speed communication between your servers for LDAP, NIS, DNS, NTP, NFS, r*, whatever you need. Maybe you don't connect your servers to a SAN. Maybe you connect 3 NFS servers to it and NFS mount everything. Put those NFS servers on this high-speed network. It is an excellent way to provide high-speed and secure network access between your servers.

    Always keep security in mind when you build it. "Do I need a DMZ?", "Do I need a partial DMZ or layered DMZ?", "Am I running any servers that only need to contact other in-house servers?", "Does one of my servers only need to be contacted from within my company or a small group in my company?", "Which servers are more of a threat to my other servers (ie, servers with user shell accounts) and what do I need to do to protect my other servers from this one?", "Is the network I'm connecting this server to secure enough for the data it houses?". The more of these things you think now, the less likely it is that you'll have to explain why one of these systems was hacked to a suit.

    The best recommendation I can make for a server farm is to appoint (or hire, if you're big enough) a server farm administrator. Make this person responsible for managing the power situation in the server room. Make them responsible for administering the network in the server room. Give this person the authority to bitch-slap people that eat and drink in the server room. If needed, give this person the authority to perform a remote security audit of a server to see if it is compromising the security of the other servers in the server room. Make this person responsible for the KVM installation, console server, and rack hardware in the server room.

    For an admin room, I recommend having a conference room handy nearby in the office with a computer and projector built into the room. Give your admins some privacy. Ideally they'd have their own offices. Maybe for some senior admins this is more of a possibility. Give them plenty of space. Let them have multiple machines and monitors and give them the space to do so. Let them pick some of their office furniture, most importantly their chairs. Keep power management in mind in your cubicle farm just like you did in the server room. Allow them to have mini-fridges. Allow them to eat at their desks. Give them access to a kitchen area with a microwave, big fridge, sink, maybe a cook-top but it's unlikely, and a table to sit down at and eat if they want. Give them a lounge area where they can take their laptops and go sit in big comfy chairs and sofas. This could double as a room to nap in during an all-nighter or crisis. Provide wireless network access but HEAVILY secure that network. Your admin network MUST, I repeat, **MUST** be kept secure! Encryption is a must. Physical security is a must. A firewall in front of your admin area is a must. If someone is using a Windows machine, FORCE that user to use a PRIVATE IP and STRONG anti-virus software. Don't let the lax security of a Windows machine compromise your admin network. If needed, give each of your admins their own subnet-based VLAN, say a /29 each. The other admins can use that to increase their own security. The admin area and server room must remain locked at all times. Utilize keycards and PINs. Perform a security check of all new employees prior to hiring them.

    Hell, I'm rambling now. You get the point though. Most of these things are achievable with ease and stubborn determination. Good luck!

    • I forgot to mention the repair/testing area. Large. Spacious. Separate from both your admin area and your server room, physically and electrically. Multiple work areas so one is always free. Build network access into the picture.

      Another excellent use for the back-side network I forgot is syslog! What an excellent place to put something like that.

  • Server farms today tend to look like telephone central offices. Rather than raised floors, there are overhead cable trays. AltaVista was the first site to go this way, and they considered it a big win. But you have to work out the cabling plan in advance of the build.

    Another question is whether you intend to maintain the systems, or just let them wear out. Inktomi doesn't maintain its servers; they install them in clusters of 100, and remotely power off the ones that break. When enough have failed, the whole cluster is replaced with new equipment. This cuts the people cost way down, and reduces maintenance-induced failures. If your application is very regular, like a search engine, that may work for you.

  • I've worked in three data centres. One small, cobbled together one, and two larger ones done properly...

    Things to consider in my opinion...

    • Separate the NOC/Command Centre from the Machine Room. This will give you access to the information you require without going into the room itself. You shouldn't need to go into the Machine Room too often. You certainly shouldn't need to work in there for too long.
    • Have a raised floor. Power should come up from under the floor. Air conditioning can be forced through a ventilated floor in which case put your networking cables in overhead trays (no air dams). If your AC isn't forced through the floor, then you can put the network cabling under there too if you wish...
    • Make sure ALL cables are labelled. At the ends, and at reasonable distances. There is nothing worse than finding a cable is entangled in a mess and not knowing which cable it is.
    • Arrange your Command Centre/NOC so that needed info can be displayed on one or two large screens. If they're not huge wall screens then make sure there is enough room for several people to gather around. You also want to make sure that this place looks good for when the management bring someone through.
    • Have locks on the doors of the Machine Room so people can't just walk back there. Have a window or two into that area from a corridor or the NOC/Command Centre so that management can see that you're not just sleeping back there.
    • Make sure that you have room for at least twice as many racks as you need in the Machine Room. You'll need that space sooner than you think.
    • Don't push the racks against the wall (or run cables from the racks to the walls) as that way it's harder than heck to get to the back of machine X to check the network lights or whatever...
    • Install a KVM/Console Server (many-to-many, not one-to-many). For Unix boxes look at conserver [conserver.org]. It's always handy if many people can look at many servers at once. A decent Unix console server will also log the console history (at least 64k of history), which can help you work out what the problem was a couple of days after the system rebooted itself.
    • Have a fire suppression system, fire alarms, climate control and back up power installed (UPS and generator). They WILL help save your bacon.
    • Look into enterprise wide storage and backup systems (SAN/NAS, netbackup/legato/amanda). Better to implement them before you HAVE to implement them, than when it becomes critical.
    • If using racks, allow space for equipment that won't fit into a rack. Gaps at the end of rows...
    • Have a CORDLESS phone on the wall of the computer room. You never know when you will need to call someone from inside the machine room (for help), and running all the way back to your cubicle/control room just doesn't cut it.

    That's just what I can think of off the top of my head. But if you get all of these ideas incorporated into your new data centre then you are doing well.

    Z.

    • Have a CORDLESS phone on the wall of the computer room. You never know when you will need to call someone from inside the machine room (for help), and running all the way back to your cubicle/control room just doesn't cut it.
      This is good advice... but I suggest one minor change:

      Have a secure cordless phone on the wall of the computer room.

      There's nothing more satisfying than picking up a cassette tape from your little scanner+recorder (situated just outside the target's property line) after a maintenance weekend and ending up with the root and enable passwords for all of their critical systems.

      There are various degrees of 'security', from true frequency hopping spread spectrum through actual voice scramblers.

      • Good point...

        But I would suggest that you shouldn't need to have a secure cordless phone... people should neither be asking for, nor telling passwords over the phone... :-)

        Yes, a corded phone would work, but it's much more of a pain to try and do something while talking on it...

        Z.
