Hardware Hacking

How Have You Equipped a Tiny Server Closet?

BenEnglishAtHome asks: "One of our remote offices will soon be gutted/rebuilt and our local IT staff managed to fumble the political ball. Our server closet is being reduced to 45 square feet and there will be no more unused desk space that can be occupied by visiting techs. Result? That 45 square feet must house 3 desktop-size servers; 3 UPSs; a fully-adjustable user workstation that includes separate adjustments for work surface height, keyboard height, and keyboard angle as well as a big ergo chair; an area suitable for workstation diagnostics; a good KVM switch; 2 monitors, keyboards, mice, and laptop docking stations that must be simultaneously available; and some amount of small item storage, while still having enough room for a door to swing into the roughly square room. The only bright side is that I can have all the A/C, power, and LAN drops I want. Has anyone managed to find and deploy a freestanding server rack/workstation/furniture system (probably something L-shaped) that can perform this many tasks in such a small space?"

  • Suggestion: (Score:4, Insightful)

    by Anonymous Coward on Friday July 28, 2006 @08:49PM (#15803232)
    Start having lots of conversations in earshot of management about electrical fires.
  • Shelve 'em (Score:5, Insightful)

    by ErikTheRed ( 162431 ) on Friday July 28, 2006 @08:54PM (#15803252) Homepage
    Well, since your first problem is that your servers aren't rack-mountable (or, if you can get conversion kits, rack-mount them and forget the rest of this), your next best bet is some good shelving. I've purchased heavy-duty 4' x 18" stainless steel shelves from Costco for about $75 a set. Each shelf can hold 500 lbs if necessary. Find a way to attach the shelves to the wall (several half-inch-thick zip-ties screwed into the wall studs work well), and use cargo straps (the kind with built-in ratchets to tighten them) to attach the servers to the shelves. Very space-efficient and sturdy.
  • by TheSHAD0W ( 258774 ) on Friday July 28, 2006 @08:59PM (#15803268) Homepage
    Reverse the door. You'll want pinned hinges so the room's still secure, but that'll buy you a LOT of extra space.
  • Virtualize (Score:5, Insightful)

    by digitalhermit ( 113459 ) on Friday July 28, 2006 @09:01PM (#15803276) Homepage
    If your workload per server is light enough, then buy one decent server with RAID and plenty of memory and CPU, and virtualize the existing machines on it. Put in an Ethernet remote management card; that lets you forgo the monitor and access the box remotely. Make sure this machine is fully redundant, with hot-swappable power supplies and drives. Now instead of 3 servers you have one, and you don't have to actually enter the server closet. For even more space, mount the unit up high.
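
    For illustration, here is a minimal sketch of that kind of remote VM management using the libvirt Python bindings on a KVM host. The connection URI and the guest names are assumptions for the example, not anything from the poster's setup; it just shows that the guests can be checked and started without ever touching a console in the closet.

        import libvirt

        # Connect to the hypervisor on the consolidated server (URI is an assumption).
        conn = libvirt.open("qemu:///system")

        # Hypothetical guest names standing in for the three old desktop servers;
        # each is assumed to already be defined on the host.
        for name in ("fileserver", "mailserver", "printserver"):
            dom = conn.lookupByName(name)
            if not dom.isActive():
                dom.create()          # start the guest without touching a console
            state, _reason = dom.state()
            print(name, "running" if state == libvirt.VIR_DOMAIN_RUNNING else state)

        conn.close()

    Pair something like that with the management card's own console and there is little reason to ever sit in the closet.
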
  • Re:Shelve 'em (Score:3, Insightful)

    by dgatwood ( 11270 ) on Friday July 28, 2006 @09:15PM (#15803331) Homepage Journal

    Why use a free-standing shelf? I wouldn't think those would be nearly as sturdy as a shelf that's actually mounted to the wall, and from your description, it sounds more expensive to use free-standing shelves, too.

    My server closet is built using steel mesh shelves available at any Lowe's or Home Depot. You mount three or four vertical strips with little slots, then the shelf brackets drop in and slide down, locking into place. The shelf brackets, in turn, snap onto the wire shelves themselves. It takes a couple of hours the first time you do it (a little longer if you have to shorten the shelves with bolt cutters), but once the vertical pieces are in place you can add shelves at any time, raise them, lower them, etc. You can cut them to whatever length you need relatively easily, and you can use extra shelves as bookshelves above the desk if you want.

  • silly questions (Score:3, Insightful)

    by Clover_Kicker ( 20761 ) <clover_kicker@yahoo.com> on Friday July 28, 2006 @09:17PM (#15803342)
    Are you /sure/ you need 3 UPSs and not 1 big one?

    Just because you have a big ergo chair now, do you really need to keep it?

    PS - you're insane if you put 3 desktop-sized servers in that room. Replace 'em with 3 rackmounted servers and a 1U LCD/keyboard/trackball/KVM.
  • by Opportunist ( 166417 ) on Friday July 28, 2006 @09:37PM (#15803406)
    Write down every even remotely possible hazard this could lead to (electrical fires, server failures due to heat, long downtimes because you have to "un-build" a lot of the server farm to get at the failing machine, etc. Whatever comes to mind, write it down), but make sure you leave out anything that could remotely be traced back to your own convenience (managers hate it when their subordinates work in a comfortable environment, and it will make your complaint look like you're just angling for more comfort).

    Then, assemble a list of managers who could even remotely be connected to the problem, or who could get fired when one of those hazards actually happens.

    Next, draw up a list of which of your company's services rely on those servers. Managers don't understand the implications of a failed server, but they do understand what happens when people can't work because of it. Describe in very simple terms the connection between the server being down and Joe Cubicle sitting around twiddling his thumbs for as long as it stays down (that the outage will be LONG, and that it is likely to happen, was hopefully covered in the first paragraph). Also make sure they understand that they will not get a SINGLE email when (not if!) the servers fail, and that for as long as they are down there will be no electronic communication AT ALL with the outside world, the other offices, or clients. No mail, no files, no reports; NOTHING will come in or go out while the servers are down.

    Pull it all together and write memos. Emphasis on the s; not just one. Keep writing until you get a reply, and don't let it rest. Make sure they understand the urgency, and that this is a serious, serious, SERIOUS problem for the company; that everything your company does stands and falls with the availability of the servers and their services; and that nobody can be productive when your department is working under sub-par conditions.

    Also, look around for possible solutions. Is there some other space where you could put a server? Is there a way you could "grow" your space at the expense of another department? Try to offer a solution, not just a problem, for two reasons: nobody likes a bare complaint (it sounds like "waaah, I'm unhappy"), and if you already offer a solution, chances are good it will be picked up instead of a manager inventing his own. A manager's own idea is bad for two reasons: first and foremost, he has NO clue what you and your servers need, so you might end up with a server in a toilet right under the water tank; and second, the decision will be made behind your back, without you having a say in it, and they will try to meet only your bare minimum requirements (if that).
  • by Thumper_SVX ( 239525 ) on Saturday July 29, 2006 @09:38AM (#15805442) Homepage
    And the first thing you have to realize is that there isn't a single reason in the world that the admin actually needs to be in the room with the servers.

    I've worked on small datacenters about the size you're describing, and I work on huge datacenters with up to 10,000 square feet of space filled with racks. In both cases, I've always recommended a "lights out" datacenter; that is, one with no one working inside it.

    Why? Well, first of all, there's the noise. Real servers are loud, and in most cases it would actually be harmful to sit in the room with them for long periods. There's also the issue of kicked cables, or a kicked power button, taking down a server. You might think it could never happen, but it can. Thirdly, there's heat. Sitting in that room actually reduces the effectiveness of the air conditioning, because the heat you give off adds to the heat the AC has to remove. Finally, there's the fact that we aren't clean animals; where do you think the dust you have to clean out of desktop systems comes from? Much of it is human skin shed by you. Would you rather have to take the servers down every six months to blow out the dust, or have servers that can run for a long time without buildup?

    As for console access, there are plenty of good KVM-over-IP solutions. I use Avocent's gear myself, but YMMV; it's pretty damned good for anything from small shops to enterprises. I've used it in environments with 4 or 5 servers and in environments with several hundred servers across multiple sites, and it works well either way.

    Another solution that works incredibly well is out-of-band management. In one large installation I work with often, we have HP servers with Integrated Lights-Out (iLO) cards. These allow us to power the servers on and off, launch a remote console, and even mount CDs remotely to install software. Since the console is Java-based it also works well under Linux. If you've got a little money to spare, that might be a good solution.
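
    As a rough illustration of scripting that kind of out-of-band power control: many management processors (including most iLO generations) also speak IPMI over LAN, so a sketch like the one below can drive them from any machine outside the closet. The address and credentials are placeholders, and it assumes ipmitool is installed; treat it as a sketch under those assumptions, not the poster's actual tooling.

        import subprocess

        # Placeholder BMC/iLO address and credentials -- substitute real values.
        IPMI_BASE = ["ipmitool", "-I", "lanplus",
                     "-H", "10.0.0.50", "-U", "admin", "-P", "secret"]

        def chassis_power(action):
            """Run an ipmitool chassis power command: status, on, off, or cycle."""
            result = subprocess.run(IPMI_BASE + ["chassis", "power", action],
                                    capture_output=True, text=True, check=True)
            return result.stdout.strip()

        print(chassis_power("status"))   # e.g. "Chassis Power is on"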

    Finally, others have mentioned this: how about one big, meaty server, and virtualize? I built an entire R&D lab environment around an 8-processor HP DL760 with 20 GB of hot-swap RAM, attached to a Compaq SAN that was going to be disposed of. It now runs about 15 VMs daily, with a peak of 25, and has become a really reliable and flexible system. It still runs VMware ESX 2.0; I plan to upgrade to 2.5 soon because it scales much better in our tests, and I might be able to squeeze a couple more guests out of the box. Sure, with the SAN being an older one the boot times are rather pitiful, but I've moved the swap partitions off to local storage on the 760 (it has space for four 72 GB drives, soon to be upgraded to 146 GB) to keep performance up. Even for my Windows guests I create a "local" partition for temp and swap. Boot times suck, but at runtime you can rarely tell the difference between the VMs and physical machines.
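
    The same split (SAN-backed system disk, locally backed swap/temp disk) can be expressed on a libvirt/KVM host instead of the ESX 2.0 setup the poster describes; here is a hedged sketch of that layout, with the guest name, image path, and device name all assumed for the example.

        import libvirt

        # Hypothetical second disk, backed by an image on fast local storage,
        # to hold the guest's swap/temp partition. The backing file must be
        # created first (e.g. with qemu-img create -f raw).
        LOCAL_SWAP_DISK_XML = """
        <disk type='file' device='disk'>
          <driver name='qemu' type='raw'/>
          <source file='/var/lib/libvirt/local-swap/mailserver-swap.img'/>
          <target dev='vdb' bus='virtio'/>
        </disk>
        """

        conn = libvirt.open("qemu:///system")
        dom = conn.lookupByName("mailserver")
        # Add the disk to the persistent definition; the guest sees it on its
        # next boot, and the guest OS can then be pointed at it for swap.
        dom.attachDeviceFlags(LOCAL_SWAP_DISK_XML, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
        conn.close()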

    The key here, though, is to lose the workstation. If a server fails, take it out of the server room for troubleshooting. That maintains the "clean room" ideal of the datacenter, reduces the risk of downtime, and even protects you. Plus, it looks good on a resume to say you've dealt with these lights-out technologies (IP KVMs, iLOs, etc.); believe me, it can help a lot if and when you move to a bigger enterprise whose rules and regs require lights-out datacenter experience.
