
How Have You Equipped a Tiny Server Closet?

BenEnglishAtHome asks: "One of our remote offices will soon be gutted/rebuilt and our local IT staff managed to fumble the political ball. Our server closet is being reduced to 45 square feet and there will be no more unused desk space that can be occupied by visiting techs. Result? That 45 square feet must house 3 desktop-size servers; 3 UPSs; a fully-adjustable user workstation that includes separate adjustments for work surface height, keyboard height, and keyboard angle as well as a big ergo chair; an area suitable for workstation diagnostics; a good KVM switch; 2 monitors, keyboards, mice, and laptop docking stations that must be simultaneously available; and some amount of small item storage, while still having enough room for a door to swing into the roughly square room. The only bright side is that I can have all the A/C, power, and LAN drops I want. Has anyone managed to find and deploy a freestanding server rack/workstation/furniture system (probably something L-shaped) that can perform this many tasks in such a small space?"
  • Suggestion: (Score:4, Insightful)

    by Anonymous Coward on Friday July 28, 2006 @08:49PM (#15803232)
    Start having lots of conversations in earshot of management about electrical fires.
    • Or as a hiring manager told me during a job interview for a QA position in a very small lab, it would be fatal to take a big, smelly fart in that room.
    • And water as a fire suppression system.
    • Re:Suggestion: (Score:5, Informative)

      by pla ( 258480 ) on Friday July 28, 2006 @09:48PM (#15803443) Journal
      Start having lots of conversations in earshot of management about electrical fires.

      And on a more serious note, start talking to management about the wisdom of putting a human in the server room.

      We have a similarly sized server room at my workplace, a bit more horsepower though. And we can't actually work in there for any length of time due to OSHA regs - it stays a nice comfy 86-88 dB in there all the time. If (when, I should say) we need to replace a server, we would need to wear earplugs to legally stay in the room long enough.

      Your boss might not care about pesky little problems like the Pauli Exclusion Principle*, but when it comes to OSHA, employers tend to go out of their way to do things the "right" way (and if not, you can guarantee they will the second time).



      * Yes, I know - Laugh, don't take it literally.
      • start talking to management about the wisdom of putting a human in the server room

        It's only for one person, one day a week, who will typically spend less than 4 hours in the room. Oddly enough, while we won't put someone in a non-ergo chair with a non-adjustable work surface, we frequently don't care squat about other issues. Example? In our main server room, we have a completely screwed up air handling system that keeps the air pressure in the room much higher than the rest of the floor. It is *very* d

  • by LiquidCoooled ( 634315 ) on Friday July 28, 2006 @08:50PM (#15803234) Homepage Journal
    I would store the servers outside the closet and convert the closet into a fully stocked bar.
    By piping the A/C through, you can keep your beverages at a perfect temperature.

    If you have any remaining space, you could install a dockable sweet trolley to take your refreshments mobile.

    If your boss asks, just tell him your servers were claustrophobic.
    • Good idea.

      And I was going to recommend tiny servers and Oompa Loompas...
    • Back when I had a lab, there was a time that the A/C system in the ceiling croaked and they had a free-standing A/C unit as a stopgap until they got it fixed. It had a bunch of tubes [wikipedia.org] in the back and an opening in the front where the cold air came out. And my department had a sales guy who was a wine expert, had a small Napa Valley winery with a couple of friends, and did occasional evening winetastings after work. So there was obviously one thing we had to do with the A/C unit, which was to chill a coupl
  • Huh??? (Score:5, Funny)

    by Black Parrot ( 19622 ) on Friday July 28, 2006 @08:52PM (#15803240)
    I've heard of "earth closet" and "water closet", but I'm aghast at the idea of a "server closet".

    All the same, I'd equip it with toilet paper and hand soap, just like the others.
    • Re:Huh??? (Score:3, Funny)

      by cptgrudge ( 177113 )
      Well, it's pretty simple. This is where you put your POS servers. Granted, they won't go away, but maybe they'll be bricked up behind a wall?
  • Shelve 'em (Score:5, Insightful)

    by ErikTheRed ( 162431 ) on Friday July 28, 2006 @08:54PM (#15803252) Homepage
    Well, since your first problem is that your servers aren't rack-mountable (or, if you can get conversion kits, rack-mount them and forget the rest of this), your next best bet is some good shelving. I've purchased some heavy-duty 4' x 18" stainless steel shelves from Costco for about $75 a set. Each shelf can hold 500 lbs if necessary. Find a way to attach the shelves to the wall (several half-inch-thick zip-ties screwed into the wall studs works well), and use cargo straps (the kind with built-in ratchets to tighten them) to attach the servers to the shelves. Very space-efficient and sturdy.
    • Re:Shelve 'em (Score:3, Insightful)

      by dgatwood ( 11270 )

      Why use a free-standing shelf? I wouldn't think those would be nearly as sturdy as a shelf that's actually mounted to the wall, and from your description, it sounds more expensive to use free-standing shelves, too.

      My server closet is built using steel mesh shelves available at any Lowe's or Home Depot. You mount three or four vertical strips with little slots, then the shelf brackets drop in and slide down, locking them into place. The shelf brackets, in turn, snap onto the wire shelves themselves. Take

      • Generally -- and obviously I don't know the specs on what kind of shelves you used -- freestanding shelves can hold a lot more weight than wall-mounted ones. With any kind of wall-mounted system that's not somehow supported on the forward end by the floor, you're going to be putting a lot of torque on the points where it attaches to the floor.

        Depending on how much weight you're considering putting on the shelves, that may or may not be acceptable. Desktop computers used as servers probably aren't heavy enoug
        • Okay, I had a bit of a brain-fart in the first paragraph (even though I previewed). I meant:

          Generally -- and obviously I don't know the specs on what kind of shelves you used -- freestanding shelves can hold a lot more weight than wall-mounted ones. With any kind of wall-mounted system that's not somehow supported on the forward end by the floor, you're going to be putting a lot of torque on the points where it attaches to the walls.

          Sorry about that.

          • The problem with free-standing shelves is that there is usually either no cross-bracing or inadequate cross-bracing. Hanging shelves are inherently cross-braced because the wall boards keep the studs from flexing much (and the studs themselves are thick and are fastened to horizontal studs at the top and bottom, which in turn, are fastened to the floor, which isn't going anywhere). Most of the stand-alone shelf failures I have seen result from not having any good way to prevent the four vertical posts fr

    • Re:Shelve 'em (Score:3, Informative)

      by Fallen Kell ( 165468 )
      Yes, shelve them, but in a rack. Just about any system out there is either rack-mountable or capable of being placed on a shelf in a rack. Shelve them, yes.

      So let's see: 3 desktop-sized servers and their UPSes should easily fit in a rack. Just make sure you get shelves that can handle the weight of the UPSes (unless the UPS is already rack-mountable, in which case, just get the rack kit for them). Get yourself a good slimline rack-mount KVM to place in there as well.

      For item storage, get yourself a rollin

      • 19" racks are your friends. Most standard equipment fits in them, if you buy the right shelves, and having one makes it easier to convince your bosses to let you buy rack-mountable servers. Moore's Law means that they'll be a lot faster and cheaper than the three servers you have now, with much bigger disk drives. I'm assuming your UPSs are the el-cheapo type that support desktop PCs - put them on the bottom shelf of your rack because they're heavy and might leak, and make sure that whatever else you do,
    • Good idea on the shelves, but I wouldn't recommend stationary ones

      Instead, the items you need are:
      1) Mobile rack (as tall as would fit in your new room)
      2) Shelves for the said rack
      3) Tetris skills ;)

      Simply place the desktop-sized servers on the shelves at the top of the rack and the UPSes at the bottom, and use the middle for hot-pluggable stuff. You might even be able to place a few actual rackmount cases before you run out of space
      There are some 1U racks that act as a combo KVM/keyboard/monitor - very com
  • ... when I say: "my home is my server closet!"

    hey, it works the other way around as well. Now, where'd I put that CAT-6 cable...
  • by TheSHAD0W ( 258774 ) on Friday July 28, 2006 @08:59PM (#15803268) Homepage
    Reverse the door. You'll want pinned hinges so the room's still secure, but that'll buy you a LOT of extra space.
  • Virtualize (Score:5, Insightful)

    by digitalhermit ( 113459 ) on Friday July 28, 2006 @09:01PM (#15803276) Homepage
    If your workload per server is light enough, then buy 1 decent server with RAID and lots of memory and CPU. Virtualize the machines on this new server. Put in an Ethernet remote management card. This will allow you to forgo the monitor and access it remotely. Make sure this machine is fully redundant and hot-swappable. Now instead of 3 servers you have one. You don't have to actually enter the server closet. For even more space, mount the unit up above.
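    As an aside, here is a minimal sketch of what "access it remotely" could look like, assuming (and this is only an assumption; the poster names no hypervisor) the consolidated box runs a libvirt-managed hypervisor reachable over SSH. Host and guest names are placeholders:

    ```python
    import libvirt  # requires the libvirt-python bindings

    # Connect to the consolidated host over SSH -- hostname and user are hypothetical.
    conn = libvirt.open("qemu+ssh://admin@closet-host/system")

    # These guest names stand in for the three former physical servers.
    for name in ("fileserver", "mailserver", "printserver"):
        dom = conn.lookupByName(name)
        if dom.isActive():
            print(f"{name}: running")
        else:
            print(f"{name}: down, starting it")
            dom.create()  # boot the guest

    conn.close()
    ```

    With something like this (or the hypervisor vendor's own management tools), the monitor and keyboard in the closet become emergency-only equipment.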
    • Since you don't have the room, put the servers in somebody's colocation facility. You can get the servers and UPS in a proper environment.
    • ...virtualize...

      Not an option. I don't own or admin the servers. They belong to our corp lan admins; the most I've ever been asked to do with them is power down or up when electrical service is going to be/has been interrupted. I'm just a lowly desktop technician. Normally, when I visit the office I just plop down at an unused workstation and, if I need more space, I can take over an interview room. No longer. After the rebuild, there will be no more room for me to camp out in the office space contro

  • silly questions (Score:3, Insightful)

    by Clover_Kicker ( 20761 ) <clover_kicker@yahoo.com> on Friday July 28, 2006 @09:17PM (#15803342)
    Are you /sure/ you need 3 UPSs and not 1 big one?

    Just because you have a big ergo chair now, do you really need to keep it?

    PS - you're insane if you put 3 desktop sized servers in that room. Replace 'em with 3 rackmounted servers and a 1U LCD/keyboard/trackball/KVM.
  • by dtfinch ( 661405 ) * on Friday July 28, 2006 @09:17PM (#15803343) Journal
    Roughly the same dimensions. 8 systems. Three 24-port switches. And a security system with a monitor that can't be turned off independently.

    I've mentioned several times that it'll all go to hell when the small single room household A/C dies. They won't even approve my offsite backup plans. There is a backup server, but it too is in the server closet. They even had a heating vent going into the server closet until I convinced them to seal it this last winter, after it reached the 90's during the coldest time of year.

    The thing gives me nightmares. I imagine the A/C failing, the servers dying, and the room catching fire and taking the building with it.
  • Maybe move the work area into a nearby cubicle.

    Lots of server rooms have no accommodations... it's for servers, after all.
  • Right... (Score:5, Interesting)

    by CXI ( 46706 ) on Friday July 28, 2006 @09:24PM (#15803363) Homepage
    Has anyone managed to find and deploy a freestanding server rack/workstation/furniture system (probably something L-shaped) that can perform this many tasks in such a small space?"

    Yes, it's called a rack and a desk. You can find both of them available from retailers the world over. Seriously, this question is... trivial. It's all up to how you want to arrange things. As others have suggested, you could buy a seriously powerful multicore system with plenty of RAID storage that takes up under 4U for a few thousand dollars. OK, so put it in the rack with the UPSs on the bottom (wait, do you need them all anymore?), a shelf with a monitor and KVM (because you only need it for emergencies, since you should connect in normally by remote), and we just used up under 12 square feet. That's a lot of room left for a desk and chair! Even if you don't want to buy a new server, buy a few more shelves for your rack and stick them in it standing up.

    I have five machines, one of which runs five other VMs, several UPSs, an LTO-3 backup system, two ancient mini-fridge-sized servers, and a KVM, all taking up less than 25 square feet. Half of that is the two ancient servers I'm about to get rid of. It's not that hard...
  • I have done it several times, although never with a diagnostic station included, but I would strongly recommend that you think above the ground. In a 45 sq ft room, you would barely be able to fit one rack mount system. But if you are concerned primarily with desktop systems, then you can do 3 tiers of shelving just large enough to hold a tower securely. Adjustable shelving might be a godsend.

    Also, a small adjustable-angle table (drafting table) might make it so that you can accomplish work when necessary, and a
  • by TheSHAD0W ( 258774 ) on Friday July 28, 2006 @09:32PM (#15803387) Homepage
    They have these really great baker's racks that can handle hundreds of pounds per shelf, and will allow airflow around the computer cases. The wireframe racks are great to hook pull-ties around too, which makes the cabling neater. Casters on the feet mean you can roll 'em around to access the back. You can probably put everything on it, except for your monitor and keyboard; and with the space savings you ought to be able to fit a small desk in there.

    Several other people have recommended rack setups, and for ultimate reliability and neatness I'd have to agree; but if your budget is small a baker's rack will do the job.
  • My server room in my house is quite literally in a closet. You can see it here [jasongreb.com]. I've only got three servers at the moment. The last one is on the floor. I need more shelving. I also need to clean up the wiring.
  • by Opportunist ( 166417 ) on Friday July 28, 2006 @09:37PM (#15803406)
    Write down every even remotely possible hazard this could lead to (electrical fires and server failures due to heat, as well as long downtimes because you have to "un-build" a lot of the server farm to get to the failing computer, etc. Whatever comes to mind, write it down), but make sure you leave out anything that could remotely be tracked to your convenience (managers hate it if their subjects are working in a convenient environment; it will make your complaint look like you're just trying to get more comfort).

    Then, assemble a list of managers who could even remotely be connected to the problem, or who could get fired when one of those hazards really happens.

    Next, draw together what services of your company rely on computers. Managers don't understand the implications of a failed server, but they do understand what happens when people can't work because of it. Make sure you describe in very easy terms the connection between the server being down and Joe Cubicle sitting around twiddling his thumbs because of it, for as long as the server is down (that the server will be down for a LONG time, and the chances that it goes down, were already covered in the first paragraph, I hope). Also make sure you include that they will not get a SINGLE email when (NOT if!) the servers fail, and for as long as the servers are down, no electronic communication AT ALL with the outside world and the other offices, or with clients! No mails, no files, no reports, NOTHING will come in or go out when the servers fail!

    Pull it all together and write memos. Emphasis on the s. Not one. Make sure you write to them until you get a reply, don't let it rest, make sure that they understand the urgency and that it is a serious, serious, SERIOUS problem to the company. Make sure they understand that everything your company does stands and falls with the availability of the servers and their services. That nobody can be productive when your department is working at sub-par conditions.

    Also, look around for possible solutions. Is there some space where you could put a server? Is there a way you could "grow" your space at the expense of some other department? Try to offer a solution, not only a problem. For two reasons: Nobody likes a complaint (it sounds like "waaah, I'm unhappy"), and when you already offer a solution, chances are good that it will be picked up instead of a manager trying to come up with his own idea. That is bad for two reasons: First and foremost, he has NO clue what you and your servers need; you might end up with a server in a toilet right under the water reservoir. And second, they will decide it behind your back, without you having a say in it, and they will try to meet your bare minimum requirements (if at all).
    • That all ignores something in his original statement: "our local IT staff managed to fumble the political ball." If we're talking "three desktop systems == server farm", you can bet "our local staff" is pretty small, and I'm betting it translates directly into "BenEnglishAtHome".

      [ It's kind of like asking questions that start with, "I know this guy with herpes..." ]

      A memo filled with finger pointing and blame throwing will be a fine document for the new IT guy to use to justify a new server room, but w

      • Just a short note to let you know that most of your assumptions are wrong. I can see where you might reach those conclusions, but the actual case is that this is a large federal agency. The reason there are only three servers onsite is that only three specific tasks are deemed to be such bandwidth hogs that we have to place them in remote posts of duty, close to their users. Most local servers are in our two main offices with a total of about 6000 square feet of dedicated server room space.

        As for the for
    • I couldn't agree more with the memo barrage idea. They say bad news travels fast, hehe :) A couple of complementary ideas for approaching mgmt.:

      1) Give multiple solutions - think of one low-cost alternative, another medium cost/effort one, and a budget hyper-blast based on long-term planning (good profile-builder, too). Then present it as: would you like to go with [option A] or with [option B]? People like to have options, it makes them feel like they're in control.

      2) Speak to them in $$$ - for each solution, table how much
  • If you're looking for furniture recommendations you should specify actual dimensions. 6.5' x 7' has very different solutions than 4.5' x 10'.

    For three desktop-size servers, though, this is trivial. Use a cubicle-style L-desk for a cube-shaped room and a bench along one wall for a rectangular room. The three servers sit under, the monitor, keyboard, and KVM sit on top, and the entire other half of the desk is available for the tech with his laptop.

    Do watch out for the air conditioning. A non-IT guy's definition of wh
  • by plcurechax ( 247883 ) on Friday July 28, 2006 @09:42PM (#15803425) Homepage
    Real servers are rackable, in 19-inch-wide, 42U (~72 inches, I think) high racks.

    One UPS; hot-swappable batteries are nice, but we fry as many APC brand controllers as we kill batteries. I like to have an independent AC line conditioner on a separate AC mains circuit (i.e. a different 15A circuit breaker) so that those real servers with dual power supplies (hot-swappable of course) go one to the UPS, one to the line conditioner (for UPS failures). Have enough circuits (not just more plugs) to accommodate future growth. A Watts Up? [smarthome.com] or Kill-A-Watt [p3international.com] meter is nice for measuring your electrical consumption.

    Honestly, with hot-swappable hardware RAID-5 disks, hot-swappable power supplies, sensible power distribution, and practicing regular backup hygiene, downtime can be minimized to the mere-hours-per-year range or less with care and planning by the administrator(s).

    I also love KVM over IP (I use an Avocent AutoView 2000 [avocent.com]) or ILO [hp.com] (Integrated Lights Out management) for headless servers, and have a backup AC available for server rooms/closets.

    For server ideas, look at the HP ProLiant DL380 or Dell PowerEdge 2850 series.
    • but we fry as many APC brand controllers as we kill batteries


      jeez, I'm not alone? I thought I was going crazy, with all these APC units dying around here. I think we've gone through twice as many APC units as servers.
    • 15A? I'd get 3 dedicated 20A circuits, which is code in most of the US for commercial buildings. I'd also see about getting isolated-ground circuits to help cut out some possible ground loop interference, but those are hard to come by. One if you need any cooling, and two for equipment redundancy.
      • Skip the isolated ground. It may cause a ground loop in the system, leading to a fire. Remember, a ground loop is very low volts but almost INFINITE current. I'm just starting to get my Associates Specialist in Technology (AST) in computer servicing (you get more of the important stuff and less of the other shit, a.k.a. minimum math, minimum English, maximum tech). I don't know which is worse, noise or ground loops (which can also cause noise). I've read stories of burned printer cables when computers were
    • Just to allay confusion, the Avocent AutoView 2000 [avocent.com] noted in parent is KVM-over-Cat5, NOT KVM-over-IP. Big difference, since IP is routable across the WAN and whatever (presumably analog) signal on Cat5 is not.

      That said, KVM-over-Cat5 isn't a bad way to make a room accessible from outside...

  • Try not doing it... (Score:4, Interesting)

    by ComputerSlicer23 ( 516509 ) on Friday July 28, 2006 @09:53PM (#15803456)

    This clearly isn't the advice you asked for. Personally, I'd make effective use of shelves (and/or cabinets), and use the longest table that will fit along the longest dimension. Get the next-longest table for the other dimension. There should just be room for the door to open, and for the chair to be out of the way when someone is sitting in there. You can stack desktops and tower cases underneath. Have a working surface for repairs on top. Hook the laptop dock into the KVM, and put the dock up high (I once packed 10 computers into a room that was 6.5x9.5 or so at a small startup, oh and that was also my office). Now that I've given you the token advice you asked for, let me try giving you my real advice:

    I've been reasonably effective with the passive-aggressive stance of not supporting something so stupid. I'm not sure I can visualize just how dumb this is (I'm not good with descriptions of space). If this is truly a political issue, put in everything you can fit with a reasonable amount of space. The rest of it becomes "do without". I've seen 60-person development offices run off 3 machines in a server room that run all of the IT infrastructure for the company. It's stupid, but it can be done. A decent desk and an LCD monitor are all that's needed. It'll easily fit within a 7x7 room (a square room roughly of the right size).

    You'll be shocked and amazed at the types of results you get from, "I've done everything I can with the resources I have. This project will continue to be a nightmare and always behind the eight ball until we decide to do it right. I'll continue to do my best to support it as it is, but it'll take more time and cost a lot more money than just having done this properly the first time. It will continue to cost us money, and I can give you estimates of how long it will take for resolving this properly to become profitable."

    Generally speaking, no sane boss will argue against that. I've come dreadfully close to being fired on several occasions because of this. However, I was kept around as "the guy who got stuff done". Generally, letting people suffer the consequences of their stupidity is the single most effective and convincing way to get them to see the error.

    I used to work with a woman who, every time you asked her to test something you developed to automate a business process she was in charge of, would say "it's so broken that I'll just do it by hand." No matter how far or near you were to the mark, if it didn't work perfectly the first time she would refuse to work towards fixing it. She wouldn't explain what she needed or what was wrong. She used to do that to everyone, and everyone worked really hard to coax her into discussing it like a rational person. I found the most effective way to deal with her was to walk out. Wait 3 months until she was completely overwhelmed by the problem you were in the middle of automating. Dust off the old code, work with her for a day, and finish up the automation. No one ever understood that all she wanted was the attention for how much crap she had to do, and all the crap she did by hand for you. Most importantly, she liked working harder, not smarter. So automating it before it was absolutely necessary ruined her mindset. Generally speaking, letting someone suffer the consequences of their decision is the ultimate way of convincing them that they were wrong.

    Ironically, I did get laid off about 2 months after explaining to my boss that he needed to find a way to make backups. He could make me responsible for them, and I'd do them. Or he could motivate the lazy piece of crap who hadn't gotten them made in the 9 months he'd worked there (even more irritating, he dismantled the old backups that worked). Since I was the one who'd get to spend 96 straight hours in the office rebuilding our network due to no backups, I had a bit more vested interest in seeing them work. I worked really hard to be ahead of the game in everything I did, and I'd be responsible for cle

  • We have some server rooms at some of our facilities that are a similar size. The problem rooms have one server, a small PBX, a 2200VA UPS, 4 switches, a router, and a few other misc items. The biggest issue we've faced is heat: having so much equipment in a dense area that the air can't dissipate the heat fast enough, so it ends up being a sauna in there. Sure, the facilities guys will pipe in AC from the building, but unless they are putting in a separate zone with your own thermostat you are going to be o
  • by symbolset ( 646467 ) on Saturday July 29, 2006 @01:41AM (#15804258) Journal
    Plant them on some users' desktops and tell them they got an upgrade. Then remote in.

    That leaves you 45sq ft, which is a nice space for a cot when you need quiet time. It sounds like you're going to need a lot of quiet time.

  • I don't know the exact details of your situation, but anyone on Slashdot can only give advice based on the details given to him/her. Here's a possible solution: Virtualize.

    You say you need 3 desktop-sized servers. Why? Unless there are specific tasks that cannot be done in a virtual machine, you should seriously consider running your servers on a virtual machine. Plus, you should be able to reduce the number of UPSs.

    If you are able to virtualize, build the most powerful (dual-Xeon or quad-AMD) machine you c
  • what's all this 'ergonomic seating and laptop docks'... all I need to admin my servers is an ssh connection, a cup of coffee, and a toilet.
  • I work for a company that sells ultra-high-end desks that would exceed your needs: Watson Furniture. Adjustable-height desks, fan-controlled storage for PCs, many, many options.
  • Anyone ever feel that all Cliff does is ask questions? Yeah, some of the questions could be useful for future knowledge, but it seems it's all questions relating to him not wanting to Google search (and for his personal/work life, so he gets paid for this stuff) and just asking you guys, who do all the labor for him.
  • Why does each server have its own UPS? Use a single UPS with more power. To notify all machines about a power outage or a low battery state, use either a special distribution box from the UPS manufacturer, or make one server the "UPS master" and let it notify the other servers via ethernet. (Connect the ethernet switch to the UPS.) Have a look at http://www.apcupsd.com/ [apcupsd.com] for a free (as in speech and as in beer) solution that works not only with APC UPSes.

    Why do you need three servers? One more powerful machine
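    As a sketch of the "UPS master" idea: apcupsd ships an apcaccess utility that reports UPS status, and its documentation describes querying a remote apcupsd instance by host:port. A rough monitoring script, with the master hostname as a placeholder (in practice apcupsd itself runs the shutdown scripts on each client; this only illustrates reading the shared status):

    ```python
    import subprocess

    # Ask the apcupsd "UPS master" for status; locally this would just be `apcaccess status`.
    out = subprocess.run(
        ["apcaccess", "status", "ups-master.example.com:3551"],  # hypothetical host
        capture_output=True, text=True, check=True,
    ).stdout

    # Output is "KEY : value" lines such as "STATUS : ONLINE" or "TIMELEFT : 42.0 Minutes".
    info = {}
    for line in out.splitlines():
        key, _, value = line.partition(":")
        info[key.strip()] = value.strip()

    print("Status:", info.get("STATUS"))
    print("Charge:", info.get("BCHARGE"))
    print("Runtime left:", info.get("TIMELEFT"))
    ```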
  • I have that much in a room like that but why do you need the chair? I think standing should be fine.
  • by Thumper_SVX ( 239525 ) on Saturday July 29, 2006 @09:38AM (#15805442) Homepage
    And the first thing you have to realize is that there isn't a single reason in the world that the admin actually needs to be in the room with the servers.

    I've worked on small datacenters about the size you're specifying, and I work on huge datacenters that have up to 10,000 square feet of space filled with racks. In both instances, I've always recommended that it be a "lights out" datacenter; that is, that there's no one in there.

    Why? Well, first of all there's the noise issue. Real servers make noise, and in most instances it would actually be harmful to you to sit in the room with them for long periods of time. There's also the issue of "kicked cables" that could take down the servers, or a kicked power button. You might think it could never happen, but it can. Thirdly, there's heat. You sitting in that room is actually reducing the efficacy of the air conditioning, because the heat you put off is also adding to the heat the AC needs to remove. Finally, there's the fact that we aren't clean animals; where do you think the dust comes from that you have to clean out of desktop systems? Much of it is human skin shed by you. Would you rather have to take down the servers every six months to blow out the dust, or have servers that can run for quite some time without buildup?

    As for console access, there are plenty of good KVM solutions that use IP. I use Avocent's solutions myself, but YMMV... but they are pretty damned good for small to enterprise businesses. I've used it in environments with 4 or 5 servers, and I've used it in environments with several hundred servers across multiple sites... still works pretty damned well.

    Another solution that works incredibly well is to use out-of-band management. In one large installation I work with often, we have HP servers with Integrated Lights Out cards. These allow us to power the servers on and off, launch a remote console, and even mount CDs remotely to install software. Since it's Java-based, it also works well under Linux... if you've got a little money to spare then that might be a good solution.
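    (For illustration: iLO is normally driven through its web UI or HP's own tools, but most server management processors, many iLO generations included, also speak IPMI over LAN, so a generic ipmitool sketch is shown here as an assumption rather than the poster's actual method. Address and credentials are placeholders.)

    ```python
    import subprocess

    BMC = "ilo-closet-server.example.com"  # hypothetical out-of-band management address
    USER, PASSWORD = "admin", "secret"     # placeholders -- store real credentials securely

    def ipmi(*args: str) -> str:
        """Run one ipmitool command against the management processor over the LAN."""
        cmd = ["ipmitool", "-I", "lanplus", "-H", BMC, "-U", USER, "-P", PASSWORD, *args]
        return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout.strip()

    print(ipmi("chassis", "power", "status"))  # e.g. "Chassis Power is on"
    # ipmi("chassis", "power", "cycle")        # hard power-cycle a hung server
    # ipmi("sol", "activate")                  # serial-over-LAN console, where supported
    ```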

    Finally, others have mentioned this; how about one big and meaty server and virtualize? I built an entire R&D lab environment around an 8-processor HP DL760 with 20GB of hot-swap RAM attached to a Compaq SAN that was going to be disposed of... it now runs daily about 15 VMs with a peak of 25 and has become a really reliable and flexible system. It still runs VMware ESX 2.0, and I plan to upgrade it to 2.5 soon because it scales a lot better in our tests and I might be able to squeeze a couple more machines out of that machine. Sure, with the SAN only being an older one the boot times are rather pitiful, but I have swap partitions moved off to local storage on the 760 (it has space for four 72GB drives, soon to upgrade to 146s) to keep performance up. Even for my Windows boxes I create a "local" partition for temp and swap... boot times suck, but in running performance you can rarely tell the difference between the VMs and physicals.

    The key here though is lose the workstation. If you lose a server, take it out of the server room for troubleshooting. It maintains the "clean room ideal" of the datacenter, reduces risk of downtime and even protects you. Plus it looks good on a resume when you say you've dealt with some of these lights out technologies (IP KVM's, iLo's etc.)... believe me it can help a lot if/when you decide to move to a bigger enterprise that may have rules and regs that require lights-out datacenter experience.
  • Stop those nasty thoughts, I'm not talking about that one. My office moved a few months back and this room (9' X 5') was all we could scavenge if we wanted a central location (in 5,500 SF) and roomy offices for everyone (about 20 max.).

    If you can, make the door swing out. That was a big help to me and it's a relatively easy mod. I also used a louvered door to aid airflow.

    Then realize that you have three walls to mount equipment on. My room includes all the telephone patch panels, PBX, and network routers/sw
  • A simple closet is plenty of space, but it should have a special lock. About 3 feet from the back wall goes the rack. The top 1/4 is for patch panels (about 600) and beneath that the switches. The bottom half starts with any fiber equipment from your telecom, then routers. I like to keep servers (blade center) above the floor and put any specialized blades (security) nearest the floor. Next to the rack goes the PBX, and above it on the side wall goes your phone punch-down block. 4x3' should be enough for about a 1000 phon
  • Star Trek doors.

    Have the door open/close either vertically, horizontally, or outwardly. Anything other than inwardly...
    You should also consider punching out the ceiling tiles, setting some 9' wooden beams up there, and placing the cases/UPSes up there.
    In my place of work we have a raised floor; I bet stuffing a few machines down there could work out. Perhaps it is similar at your job?
    If all the walls have 2 wall surfaces (cross-section would look like this -> | | ), knock out the inner wall to get that e

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...