Computer Room Design?
Onion asks: "My company is considering giving us a new Computer Room and Command Center, as our existing building is nowhere near meeting current needs, let alone future needs. I have seen a few plans for command center furniture, but no real designs or ideas for the layout of these two rooms. We have five racks for the actual computer room, and need around 25 screens for the command console. Add to this bench space for repairs, plus things like a cupboard, a bookshelf, and more storage space, and the design becomes more complicated. We need enough space for three or four admins. Has anyone seen plans for this type of setup?"
Seven Circles (Score:2, Interesting)
Re:Seven Circles (Score:1)
Good idea: keep PC repair and your servers/wiring in two separate rooms.
Re:Seven Circles (Score:4, Funny)
changing the f-ing toner cartridge on the LJet2 up on the second floor.
Hop to it, now, admin-boy.
Bond Villain Command Center (Score:5, Funny)
My advice is to build your own command center by studying those featured in the James Bond movies. Most of the bad guys have a fully staffed, very impressive command center filled with computers and cool looking wall-sized flat screens. Make sure you get the center chair that overlooks the whole setup. Not only will it be a joy to work at but a command center like that will impress even the most stingy of VCs or customers.
Always happy to help,
GMD
You ph00l!... You're forgetting... (Score:1)
The cute-but-evil-looking white longhaired cat!!
Gawd...
Ali
Remote Control or KVM (Score:4, Interesting)
Re:Remote Control or KVM (Score:2, Informative)
CDF control room front [uiuc.edu]
CDF control room front left [uiuc.edu]
CDF control room right [uiuc.edu]
The computers and such are stored in the next room and the rooms below. There are ~ 30 monitors on the walls, plus desks for laptops to plug in. Obviously you aren't doing particle collisions, but it is a really nice setup. Usually a staff of 4, but can support 10 or so if needed.
Why not like Star Trek? (Score:2)
Bear with me, I'm serious.
Arrange the main desks into a shallow semicircle, facing one wall. On the wall, you can have whiteboards, charts, an electronic status display, etc. -- general information stuff that the admins might want to look up and see. (Oh yeah... a clock would be good, too.) It sounds like you already have an idea for the workstations themselves.
For announcements, crisis control, and so forth, the team leader could stand with his back to that wall and comfortably address the entire team.
Then, around the perimeter of the room, have repair benches, shelves, cupboards, a fridge in the corner, whatever you feel.
Hmm... geeky, but that's just off the top of my head.
Only one problem.... (Score:1)
Re:Why not like Star Trek? (Score:2)
Gene Roddenberry laid out the bridge in Star Trek that way for dramatic reasons. Prior space SF usually involved a rocket pilot hero who sat at a control panel, but somebody pushing buttons didn't seem dramatic enough for TV. So Roddenberry went with a ship-like layout. This led to a dialog-heavy show, which was what Roddenberry wanted.
Amusingly, one class of U.S. submarine copied the Star Trek bridge layout.
Re:Why not like Star Trek? (Score:2)
Bear with me, I'm serious.
Actually, this is pretty much what an industrial-grade NOC looks like. Large screens on the wall showing a map of the network, and various metrics on performance, probably a TV news feed or two (strewth, traffic just went through the roof! oh wait, CNN just reported...) and a few rows of consoles for the actual operators to sit at. Usually a raised podium at the back where a supervisor can monitor the whole room.
Control rooms for power stations, rail networks, oil refineries, etc, follow similar principles.
Re:PRIMVUS POSTVUS (Score:1)
What kind of Command Center? (Score:2)
Oh no!!! (Score:1)
BTW, this is a joke; mods should try it sometime.
The word is "Architect" (Score:3, Insightful)
Think of it in terms of design before implementation.
Re:The word is "Architect" (Score:2, Interesting)
Not only do you need an architect to lay it out, you need an engineering firm to design the HVAC and electrical systems - and no, a spot cooler and a power strip won't cut it.
Then hire a contractor (that's me) to build it.
Then hire a cardiologist, 'cause you're gonna have a heart attack when it comes to paying for it.
Just bite the bullet and do it right. Every owner I've ever dealt with who tried to design or engineer their own facilities has been dissatisfied with the result.
Re:The word is "Architect" (Score:1)
a friend of mine works for a company that just bought a new building. while they didn't have quite that much equipment, they were still able to design a nice center on their own without hiring someone. they had to do some research, but if you want to not pay money, you gotta get the knowledge somehow.
i saw it too, looks damn nice and works great too.
Re:The word is "Architect" (Score:2)
But you can get an estimate before you try in-house design. And it is hard to duplicate an architect's skills and experience. A college roommate was in architecture, and he was learning some esoteric stuff. If you need more than one isolated room, think about an architect; she can help you iron out the layout and plan for the future.
Talk to the professionals, get their estimates. You are making an investment, do it right.
Re:The word is "Architect" (Score:1)
Electrically, the actual work is simple enough, but there are codes that you must/should comply with.
My experience, though, has been that most people don't really want to do the job right, they just want to have something functional. The parent's dream of a NOC for five racks seems a bit off in the first place. Do it cheap now, knowing that you are skimping, and budget to go back later and do it right!
(One little code tidbit to spread is that any UPS over 750VA should be killed by an EPO switch at the door of a computer room! A relevant engineering corollary is that you don't put UPS systems in series!)
Re:The word is "Architect" (Score:3, Insightful)
An architect is definitely a necessity, but they don't always know everything you'll need. First off, you have to determine the amount of heat generated in the room, and this is affected by the number of people normally in the room. Take whatever you have now, and double it. You don't want to have to add in another air conditioner later (trust me on this - it's a pain, and very expensive).
Fire alarms are another big one. In fact, yesterday we learned about problems with them. When we originally set up the system, the idea was that the sprinkler system (some type of foam, not sure what) was only designed to go off if both detectors in the room had two signs of fire (heat and smoke). However, I didn't know they had also set up the fire alarm with an emergency kill: should the fire alarm be activated, it will cut power to the room. Okay, we've got UPSs; however, they also wired up the one UPS in the room that is smart enough to receive this type of kill signal. (There's also a panic button that will cut the power.) Yesterday, the company showed up to test the system, didn't know/forgot about the kill signal, and while testing shut down about half the machines in the room. The other half were on dumber UPSs, and the switch is on its own battery backup. So, in the event of a real fire, the most expensive piece of equipment in the room would still have been powered when the foam came down!
Other things - racks right against the wall are bad; leave room for someone to squeeze behind them, because you'll need to eventually. Lighting is another issue: it should be bright, but you don't want a lot of glare. Where are the cables going to run? In the walls (not a great idea when you want to replace a bad jack) or overhead cable trays (which can get in the way of lighting and cooling)?
Short answer, you need professionals who not only understand architecture, but also power, alarms (fire & security), cooling, and finally usability!
Re:The word is "Architect" (Score:2)
Re:The word is "Architect" (Score:2)
You might want to read my post again, those T1s are not a 1.5Mb line, but 23(+1 control) ISDN lines used for voice not data... hence the switch is a telephony switch, not a network switch.
Re:The word is "Architect" (Score:1)
Re:The word is "Architect" (Score:2)
Reference: http://www.howstuffworks.com/question372.htm [howstuffworks.com]
Re:The word is "Architect" (Score:1)
The poster said that the T1s were for phone switches. I know that this is going to come as a complete shock to you, but T1s were not originally designed as a 1.544 (1.536) Mb/s data pipe, but rather as a way of passing 24 channels of phone calls across two pairs of wires. These are also known as "trunk lines".
Please realize that you need to check your information before you shoot your stupid mouth off.
Re:The word is "Architect" (Score:1)
Re:The word is "Architect" - AMEN (Score:3, Insightful)
Other posters have mentioned heat, alarms and fire suppression levels as examples of things an architect might not understand. In my experience, most architects are pretty good about subbing out work they know they can't do. Architects don't draw plans for the whole project - they design the frame and facades, the interiors and the fixtures, but they usually leave other work for specialists. This means that your plumbing, electrical, etc. will be handled by a consultant who does it for a living.
Your best bet is to find an architect with demonstrated experience in the kind of project you have in mind. Then ask for references and visit his(her/its) past projects. If possible, talk with the internal project manager at those sites (the person who dealt with the architect on a daily basis).
Nearly any architect can probably do your job, but it will be very painful with one who is inexperienced. Architects have a long and formal internship system - let your architect get his computer room experience as an intern on someone else's project, not as a lead on your project.
Re:The word is "Architect" - AMEN (Score:2)
I'd add, though, that you would still probably like to have at least a good conceptual idea about what you're asking for before beginning consultations with an architect. Don't assume an architect knows what is and isn't important to you.
Also remember, the most important design decisions happen early. As the project progresses, the broad brush strokes of the early conceptual design will become more detailed.
Good architects listen, communicate, and would like your feedback. If you find yourself doing business with someone who does the meet-and-greet and then disappears for a while, take that as a warning sign.
Actually, it's Interior Designer (Score:1)
Re:Actually, it's Interior Designer (Score:1)
Re:The phrase is "Data Center Designer" (Score:1)
Move into existing facilities?? (Score:1)
this kind of setup? (Score:1)
only in the house I designed with my wife.... I wonder why she left me?
Space Design Technology (Score:2, Informative)
http://www.spacedesigntechnology.com/
It all depends on what you want to spend.
VNC Sessions! (Score:3, Interesting)
KVM switches rock, but tie you to one location, and then you fight over the terminal with the other admins. When you can do it all from your desk with just a click, why not?
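For instance (assuming the target box already runs a VNC server on display :1 and sshd; the hostname here is made up), tunneling the session over ssh keeps it off the wire in the clear:

    ssh -f -N -L 5901:localhost:5901 admin@server1   # forward local port 5901 to the server's VNC display :1
    vncviewer localhost:1                            # view the server's desktop through the tunnel

Any admin can do the same from their own desk, so nobody is camped out at the KVM.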
Re:VNC Sessions!?? (Score:2, Informative)
Re:VNC Sessions! (Score:3, Funny)
Re:VNC Sessions! (Score:1)
Looks over shoulder warily
Re:VNC Sessions! (Score:1)
Re:ask slashdot submissions future (Score:2)
NO workers in computer room all day. (Score:4, Informative)
Noise levels, particularly high-pitched noise, in computer rooms are way too high for human habitation. You will go deaf if subjected to it for many years.
If people need to sit in the raised floor area, there needs to be a wood/glass wall between them and the computers and chillers.
Re:NO workers in computer room all day. (Score:2)
Re:NO workers in computer room all day. (Score:2)
Hearing loss is something that really happens to people in IT. Hard disks in particular are terrible for the ears, and operators handling tapes should be wearing ear protection.
Re:NO workers in computer room all day. (Score:2)
Re:NO workers in computer room all day. (Score:1)
My 'command center' (Score:2, Interesting)
If you use automation properly, and technologies like ssh and VPN, etc., you don't -need- a 'command center', no matter the size of your organization.
Server rooms, network rooms/closets/PoPs, are absolutely necessary, and should be designed properly, w/racks/raised floor/UPS/etc. 'Command centers' aren't necessary at all, except for stroking one's ego.
That's not to say they aren't -cool-, but they're really passe.
Re:My 'command center' (Score:2)
Perhaps many (most) enterprises do not need a full-bore NOC, but a good "situation room" with full system access and room for at least a half dozen people makes a lot of sense.
For example, an 8K sq. ft. datacenter might have two such rooms:
All the better if there is enough room and spare terminals to get a team together to address a problem, with at least one big display so all the bigwigs can crowd around the one guy who knows what commands to type.
Re:My 'command center' (Score:1)
Serial terminal servers fulfill the 'what if network connectivity to the boxes is lost' requirement, for *NIX hosts, routers, and switches.
Why do I need a separate room? If I have to, I plug my laptop into a switch in the datacenter, or in a cabling room. If connectivity from my home DSL into one PoP is down, I go through one of the 5 others we have in different parts of the world. The only thing I can't fix remotely is a bad patch cable to a host/router/switch which doesn't have redundancy.
FYI, the company I work for does about $20B/year. We're 24/7/365 worldwide - and we don't have a NOC.
Workcenter design (Score:3, Insightful)
Do you want a data center (i.e. a computer server room), a NOC, a workroom, etc.?
If you have 4 admins, why do they need to see a massed array of consoles? That's a job for Operations, not administrators. Put your administrators in cubes... there are various cube designs out there.
For the datacenter, use open space and a raised floor if possible.
For Operations, a couple of screens (maybe 2 per operator) and programs to pipe information to those operators; they shouldn't need a screen for every system. If you're running MVS systems which "require" dedicated consoles, you can find cards and "HLLAPI" interfaces to run several console sessions on one machine.
It doesn't sound like your needs are actually all that big.
Myself, I would just take over some custodial closet for the computers, beef up the ventilation and add in a UPS system... You're going to need an electrical engineer to look over the power situation, which should settle most of the space requirements for you.
As far as the admins go, find out what they want; usually a 6x6 cube with room for 2 computers is adequate...
As far as working on machines, find a room somewhere, 20x40 or so, that has a locking door. This way you're not concerned with parts walking out the door...
In truth, the computer room will be more of an engineering design, set by your space requirements. The workspace for the admins is mainly set by getting input from management and admins.
What difficulties are you foreseeing?
Re:Workcenter design (Score:2)
This also helps keep the cold air in the datacenter where it belongs, rather than causing hypothermia for the poor network operations people.
Obvious Ingredients First (Score:3, Insightful)
First,
And, just as you can never be too rich or too thin, you can never have too much storage space.
After that, move onto your network drops, benchspace, lighting, chairs, etc.
I'd advise having some locks on the doors, too, not only for the obvious security implications, but also so you have a place to hide when things go south (have a prepared placard to the effect of "We're actively working on the problem and will update you immediately as it's fixed.")
Re:Obvious Ingredients First (Score:1)
Ask a pro? (Score:2)
One good suggestion is to let a pro design your new datacenter, or at least help you out. I've had some really good experiences with IBM Global Services - they worked with us on overall design, and we bought and installed the raised floor, cable management, cabling, UPSs, cooling units, etc. through them. Companies like that have a lot of experiential knowledge you can lean on; it takes a lot of practice to learn how to design a rock-solid datacenter without overdesigning too much.
That aside - you say 5 racks and 25 monitors in a command center - sounds like one monitor in your command center for every machine you own or something. Consider using switches and keeping your command center down to 4 heads or so (or however many you think you need for simultaneous admin access). You can set up a network of KVM switches such that any of your 4 head units can reach any of your 25 machines easily. If they're *nix, skip the KVM stuff altogether and just go serial console. Cyclades makes a nice Linux-based serial terminal server with ssh support and whatnot.
Some tips... (Score:1)
In that room, it helps to have a raised floor to #1, route cables under, #2, provide a place to put more ventilation conduits, and #3, keep the racks off the floor in case of flooding. You'll want this room to have its own air conditioning [1] and filtered electrical power. Your utility company can bring in separate "clean" power to that room, or you can just give the room its own few breakers in the breaker box and filter the power with a UPS system, depending on your needs and size. As far as UPS systems go, most server rooms I've seen have banks of batteries for this purpose, which run the entire room on nice, clean, uninterruptible power. The company I work for even has a diesel generator on the roof with a 2 day supply of fuel, which is tested every month.
For the actual servers and routers, do the obvious: use racks. If your servers aren't already rackmounted, invest in rackmount PC cases - they're not too terribly expensive these days. You can fit quite a few 4U servers in one rack. My company generally has one KVM cluster for each 1.5 racks (keyboard/monitor/KVM switch taking up shelf space in one of the racks).
It's not hard to build a good server room, and it doesn't necessarily have to be expensive. It just takes some planning.
- Eric
[1] This doesn't just mean keeping the temperature down. Most server room A/C units also closely control humidity, since too much or too little humidity will give you bad, bad problems.
NOC design: Do you really need 25 screens? (Score:2)
I'm not sure why 25 screens would be required.
Most of my servers run 'headless', with serial ports connected to a terminal server. I have a single console server that handles all of the serial connections from the individual systems' serial console.
All of the routers, switches, UPS systems and other 'infrastructure' is on an identical setup, with extra security and logging.
In this design, I have one desktop system (FreeBSD) and screen (18" LCD) for each operator station, plus two large screen displays that show the current network status (one map, one showing alerts and status messages from the monitoring software).
The remote serial consoles are accessible via SSH (and strong authentication) from anywhere in the local network, so sysadmins and network admins can perform their duties without having to physically visit the data center.
By using the free 'screen' software to handle the serial port connections, we get a disk log of console activity, a scrollback buffer, and the ability to 'kibitz': two users can share access to a single console, even though one might be in the NOC and the other at home connected via VPN.
This design scales up well, I can get ~100 consoles on two PII/300 machines (retired PC desktops running OpenBSD), and adding additional hosts is as simple as buying another terminal server.
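A minimal sketch of the 'screen' end of this, assuming each console shows up as a local serial device (device name, session name and baud rate here are just examples):

    screen -dmS con-web1 -L /dev/ttyS0 9600   # detached session on the serial line, logging to screenlog.0
    screen -r con-web1                        # an admin attaches, with scrollback available
    screen -x con-web1                        # a second admin attaches to the same console to "kibitz"

(Sharing between different user accounts needs screen's multiuser/acl setup; same-account sharing works with -x as shown.)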
Re:NOC design: Do you really need 25 screens? (Score:2, Informative)
http://www.q9.com/news/images/q9_noc.jpg [q9.com]
http://www.q9.com/support/images/ts.jpg [q9.com]
http://www.q9.com/news/images/CanBus_Oct1-01.jpg [q9.com]
Hire a professional, and prepare for a whole team (Score:2)
5 racks is not a lot, but leave space for 10 or 15 for future expansion. If the PHB gives you any grief on this, you want to present him with a paper to be signed by the CEO that he is not planning on any growth at all over the next decade. No new customers, no increase in revenue, profits, or employees. That is usually enough to get them to approve a larger space for you and the machines.
You also need to hire someone who has done this before, who knows all about all the little things like glare from the windows and morning/evening temperature shifts in the building and HVAC. A knowledgeable contractor will then sub-contract the various bits to other professionals. Certainly you will need an HVAC team, an electrician, cabling guys, fire suppression specialists, a security guy, and an architect. Yes, all that for just 5 racks and room for 4-10 admins.
You will need aircon to keep the racks cool. Count on 5000 watts of power from 5 fully loaded racks; at roughly 3.4 BTU/hr per watt, that's about 17,000 BTU/hr of heat to remove to keep the servers running. Most office buildings can only do 2-3000 BTU/hr of cooling in an area, so the machine room will need local, dedicated cooling systems, which possibly means an external chiller and water pipes under the floors.
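The arithmetic is worth doing yourself before talking to the HVAC people; a quick sketch (the 5000 W figure is the assumption above, 1 W is roughly 3.41 BTU/hr, and 12,000 BTU/hr is one "ton" of refrigeration):

    echo "5000 * 3.41" | bc              # ~17,000 BTU/hr of heat coming off the racks
    echo "5000 * 3.41 / 12000" | bc -l   # ~1.4 tons of cooling, running around the clock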
Each rack requires 3 square metres (or yards) of space: 1 square metre for the rack, and 1 each front and back for the doors. Allow space above the ceiling and below the floor for cables, electricity, water, fire suppression, drains, etc. The repair area must be separate from the main machine area, physically and electrically.
Electricity will be specced by the electrician for the power-on surge, multiplied by the power factor. Only a licensed electrician can tell you what all the local laws require/forbid and can sign off the installation with the local authorities. YOU can NOT do this unless you go to night school for a long time and pick up an electrical contractor's license. Cheaper to hire a pro.
Human work spaces can NOT have any noisy equipment running, and the cool-looking cabinets should have sound damping. Google for the relevant laws on dB per day of exposure in your area. The HVAC guys will have to provide large surface area, low noise vents, with multiple zones and controls. Consider putting the fluorescent lights onto 3-phase power. Incandescent floor lamps if you don't/can't.
There are about a thousand things to know about when building a dedicated machine room and "star trek" command centre. My best advice would be to hire the pros, and watch and note everything they do. Then next time an employer asks you about this, you will have a valuable job skill.
Check fatbrain or amazon for books about this topic, I'm sure someone has written this down at least once.
A few years ago I specced out a small machine room, with 16 racks. When the room was originally planned in 1996, the company just didn't count on any growth whatsoever. None! They had two racks then, and figured that would hold them forever. Totally clueless PHB (but I repeat myself).
3 years later I came on the scene, just before the new building was finished. I pointed out the lack of HVAC, electricity, access control, etc. We asked the HVAC guy if he could cool 30000 watts of power, and he just laughed. A maximum of 5 desktop PCs in this whole area was his answer, not the 60+ currently planned. Anything more was going to cost $$$, and time. So I got to write up the whole spec for the area over a weekend, and they put it all in, at a huge cost, but it only delayed the move-in by a week. At first they swore I had over-engineered by a huge amount and threatened lawsuits, but then dropped the whole matter.
Last month I saw the room: all 16 racks full of equipment, 90 PCs in the command area, cooling and electricity stressed to the limit. They told me I had saved the company a few years ago, because if they hadn't built the building correctly before moving in, the cost of a retrofit would have been 10X to 20X the original cost, and the lost business due to a year delay would have sunk them even during the internet boom. But I knew enough then to ask all the pros for their advice, rather than do it all myself. A thousand details? Nahh, much, much more.
the AC
Freelancer looking for a job in
HVAC Design is important. (Score:2)
Almost every computer installation is going to produce heat. A lot of heat.
Typical HVAC people don't understand this for some reason. They look at the square footage and then do some calculations for how many BTUs they think they will need. Several times on projects I've asked, "You are aware that we're going to have a lot of computers in here, and we need a LOT of cooling and almost no heating (if not even some cooling) in the winter?" In at least a couple of cases I got an answer of "oh yeah, we do this all the time", and then they proceeded to install a system obviously not engineered for our application, i.e. undersized during the summer, and no cooling (heating only) during the winter.
Even a typical 75W average 1U server throws off 250 BTU/hr. A rack with ~40 of these is 10,000 BTU/hr. This is enough BTUs to raise the temp in a well-insulated 12x12 room 40 degrees or so. Try putting 5 racks in.
For reference, a typical 120V window-mount air conditioner is under 12,000 BTUs.
At some point you don't have to heat during the winter - just cool year round.
And in case I didn't say this strongly enough to start with: most HVAC contractors just don't get it. Make sure you get one that does.
My advice.... (Score:2)
IBM Global Services can do this, and Bruns-Pak [bruns-pak.com] is a company who has their stuff together. They did a seminar I attended on fault-tolerant centers: centers designed to ride out a category 5 hurricane, with power, water, and even food and sleeping quarters supplied to the staffers during the whole event. A lot of companies forget that Operations staff and sysadmins need food while taking care of an emergency, especially one that lasts for several days.
They can build you a center that costs 1 dollar per square foot (no raised floor, basically a room) or X amount (too high) with all of the robustness of NORAD's Crystal Palace. It all depends on what you want/need.
If I were to plan one for our location, I would build a one-floor facility with no windows, chillers in a protected area (inside the building, not on the roof), generators in a protected area, UPS, a nice high raised floor, and possibly a tall ceiling (I hear that the new standard ceiling height is 40 feet... for taller racks). The building should be hardened against the weather in your area, meaning it should be very tough in Ohio so it could withstand a tornado. Ohio doesn't get very many cat 5 hurricanes, so we don't have to go that far.
Also, have an operations center so your operators don't have to be in the room. This saves their sanity from the rush of the A/C units. Have a BATHROOM not far from the center... preferably connected to it. You could make it a locker room, so that if you have a multi-day emergency, you could shower. Have food on hand (at least vending machines, or maybe an emergency box of food that can stand to sit on a shelf for a while; MREs would work). Also, you need a coffee pot (GREAT for LONG NIGHTS!). Also, get cable trays for managing the cat5, cat6 or fiber. They are for more than making things neat; they also protect the cable from an errant floor tile landing on a fiber and crushing it, or cutting a patch cable.
Those are just the things I could think of. The pros will not forget anything. It's what they do. Don't just pay those pros to implement YOUR design... have them do the design with you. It's not the big things you will miss when doing a computer room. It's the little things that turn out to be big things that you miss. This is why I say GET A PRO!
My thoughts (Score:2)
Use full-length racks (i.e., racks that aren't just a single frame but are multiple frames, to support the backs of large objects). Buy racks with built-in power management, or install your own quality power management system with remote monitoring and control capabilities, like APC's solutions.
While we're on the topic of power, the entire server farm should have conditioned power (i.e., a large UPS or battery solution) and should also have backup generator power. We're a small regents school in a midwest state; even we have a natural gas generator for our server farm. Also put the admins' workstations on these protected outlets (mark the outlets well and educate the admins not to plug their refrigerators or space heaters into them). Use multiple protected circuits in your server farm. You shouldn't have 10 machines and monitors on a single circuit. Never put a laser printer on the same circuit as a server.
Build an extensive KVM system and don't forget to utilize remote serial consoles too. Those two should always be built in parallel. All too frequently one is built without the other.
You need good cooling. Strike that, you need excellent cooling. Your room should be very cool. Heat is the number 1 killer of electronic components. Keep the room cool and have your admins bring in jackets if they need to work in the server room. The room should be cold enough that a human feels very chilly, almost cold, when they are in the room. This is another reason why you don't want your admins stationed in that room. They'll always be sick!
Your personnel should not be placed in the same room as the hardware. The background noise would drive them insane. Also consider possible medical risks from being around monitors and hardware all day long. Sure, it hasn't happened yet but I believe that someday some medical thing will be blamed on all those electronics. Better safe than sorry.
It sounds corny, but strictly enforce a no-eating/drinking policy in the server farm. Consider the room a pseudo clean room. An accident at your workstation might cost you a keyboard. An accident in a server farm might cost you a Sun Enterprise 10k (or a lot of parts and time).
Build your server farm with network storage and backup in mind. Build a SAN infrastructure into the design. Something I personally love to do is build a secondary, server-farm-only, high-speed network that all servers are a part of. Each server connects to this network with a 2nd NIC and uses RFC 1918 addressing. The network is not available from the outside world. You can use it for secure high-speed communication between your servers for LDAP, NIS, DNS, NTP, NFS, r*, whatever you need. Maybe you don't connect your servers to a SAN. Maybe you connect 3 NFS servers to it and NFS mount everything. Put those NFS servers on this high-speed network. It is an excellent way to provide high-speed and secure network access between your servers.
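As a rough sketch of what that looks like on each server (interface name and addresses are made up; adjust for your OS):

    ifconfig eth1 10.10.0.11 netmask 255.255.255.0 up   # second NIC on the private RFC 1918 backend net
    mount -t nfs 10.10.0.5:/export/home /home           # NFS traffic stays on the backend wire

Put the equivalent in your startup scripts or fstab so it survives a reboot.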
Always keep security in mind when you build it. "Do I need a DMZ?", "Do I need a partial DMZ or layered DMZ?", "Am I running any servers that only need to contact other in-house servers?", "Does one of my servers only need to be contacted from within my company, or by a small group in my company?", "Which servers are more of a threat to my other servers (i.e., servers with user shell accounts), and what do I need to do to protect my other servers from this one?", "Is the network I'm connecting this server to secure enough for the data it houses?". The more of these things you think about now, the less likely it is that you'll have to explain to a suit why one of these systems was hacked.
The best recommendation I can make for a server farm is to appoint (or hire, if you're big enough) a server farm administrator. Make this person responsible for managing the power situation in the server room. Make them responsible for administering the network in the server room. Give this person the authority to bitch slap people that eat and drink in the server room. If needed, give this person the authority to perform a remote security audit of a server to see if it is compromising the security of the other servers in the server room. Make this person responsible for the KVM installation, console server, and rack hardware in the server room.
For an admin room, I recommend having a conference room handy nearby in the office, with a computer and projector built into the room. Give your admins some privacy. Ideally they'd have their own offices; maybe for some senior admins this is more of a possibility. Give them plenty of space. Let them have multiple machines and monitors, and give them the space to do so. Let them pick some of their office furniture, most importantly their chairs. Keep power management in mind in your cubicle farm just like you did in the server room. Allow them to have mini-fridges. Allow them to eat at their desks. Give them access to a kitchen area with a microwave, big fridge, sink, maybe a cook-top (but it's unlikely), and a table to sit down at and eat if they want. Give them a lounge area where they can take their laptops and go sit in big comfy chairs and sofas; this could double as a room to nap in during an all-nighter or crisis.
Provide wireless network access, but HEAVILY secure that network. Your admin network MUST, I repeat, **MUST** be kept secure! Encryption is a must. Physical security is a must. A firewall in front of your admin area is a must. If someone is using a Windows machine, FORCE that user to use a PRIVATE IP and STRONG anti-virus software. Don't let the lax security of a Windows machine compromise your admin network. If needed, give each of your admins their own subnet-based VLAN, say a /29 each; the other admins can use that to increase their own security. The admin area and server room must remain locked at all times. Utilize keycards and PINs. Perform a security check on all new employees prior to hiring them.
Hell, I'm rambling now. You get the point though. Most of these things are achievable with ease and stubborn determination. Good luck!
Re:My thoughts (Score:2)
Another excellent use for the back-side network I forgot is syslog! What an excellent place to put something like that.
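On a stock syslogd that's about one line per host; a sketch (the loghost address is just an example, and classic syslog.conf wants a tab between selector and action):

    printf '*.*\t@10.10.0.20\n' >> /etc/syslog.conf   # forward everything to the loghost on the backend net
    kill -HUP `cat /var/run/syslog.pid`               # have syslogd re-read its config (pid file path varies by OS)

The loghost itself has to be told to accept remote messages (e.g. the -r flag on Linux's sysklogd).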
Raised floor vs. cable trays (Score:2)
Another question is whether you intend to maintain the systems, or just let them wear out. Inktomi doesn't maintain its servers; they install them in clusters of 100, and remotely power off the ones that break. When enough have failed, the whole cluster is replaced with new equipment. This cuts the people cost way down, and reduces maintenance-induced failures. If your application is very regular, like a search engine, that may work for you.
My experiences... (Score:2)
I've worked in three data centres. One small, cobbled together one, and two larger ones done properly...
Things to consider in my opinion...
That's just what I can think of off the top of my head. But if you get all of these ideas incorporated into your new data centre then you are doing well.
Z.
Cordless phone for machine room (Score:2)
There's nothing more satisfying than picking up a cassette tape from your little scanner+recorder (situated just outside the target's property line) after a maintenance weekend and ending up with the root and enable passwords for all of their critical systems.
There are various degrees of 'security', from true frequency hopping spread spectrum through actual voice scramblers.
Re:Cordless phone for machine room (Score:1)
Good point...
But I would suggest that you shouldn't need to have a secure cordless phone... people should neither be asking for, nor telling passwords over the phone... :-)
Yes, a corded phone would work, but it's much more of a pain to try and do something while talking on it...
Z.