How Have You Equipped a Tiny Server Closet? 81
BenEnglishAtHome asks: "One of our remote offices will soon be gutted/rebuilt and our local IT staff managed to fumble the political ball. Our server closet is being reduced to 45 square feet and there will be no more unused desk space that can be occupied by visiting techs. Result? That 45 square feet must house 3 desktop-size servers; 3 UPSs; a fully-adjustable user workstation that includes separate adjustments for work surface height, keyboard height, and keyboard angle as well as a big ergo chair; an area suitable for workstation diagnostics; a good KVM switch; 2 monitors, keyboards, mice, and laptop docking stations that must be simultaneously available; and some amount of small item storage, while still having enough room for a door to swing into the roughly square room. The only bright side is that I can have all the A/C, power, and LAN drops I want. Has anyone managed to find and deploy a freestanding server rack/workstation/furniture system (probably something L-shaped) that can perform this many tasks in such a small space?"
Suggestion: (Score:4, Insightful)
Re:Suggestion: (Score:2)
Re:Suggestion: (Score:2)
To paraphrase George Carlin, you don't take a fart, you leave one!
Re:Suggestion: (Score:1)
Re:Suggestion: (Score:5, Informative)
And on a more serious note, start talking to management about the wisdom of putting a human in the server room.
We have a similarly sized server room at my workplace, a bit more horsepower though. And we can't actually work in there for any length of time due to OSHA regs - it stays a nice comfy 86-88 dB in there all the time. If (when, I should say) we need to replace a server, we would need to wear earplugs to legally stay in the room long enough.
Your boss might not care about pesky little problems like the Pauli Exclusion Principle*, but when it comes to OSHA, employers tend to go out of their way to do things the "right" way (and if not, you can guarantee they will the second time).
* Yes, I know - Laugh, don't take it literally.
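For the curious, the OSHA exposure limit the parent alludes to can be sketched as a one-liner. This is a rough, non-authoritative rendering of the general-industry rule (90 dBA permissible exposure limit, 5 dB exchange rate); check 29 CFR 1910.95 before relying on it:

```python
def permissible_hours(level_dba, pel=90.0, exchange_rate=5.0):
    """Hours of exposure allowed per day at a steady noise level,
    per the OSHA general-industry formula T = 8 / 2**((L - PEL) / ER)."""
    return 8.0 / 2 ** ((level_dba - pel) / exchange_rate)

print(round(permissible_hours(95), 1))  # 4.0 hours at 95 dBA
print(round(permissible_hours(88), 1))  # ~10.6 hours at 88 dBA
```

So 86-88 dB is actually under the 8-hour limit, but it's close enough that earplugs for any extended stay are a sensible policy.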
Re:Suggestion: (Score:2)
It's only for one person, one day a week, who will typically spend less than 4 hours in the room. Oddly enough, while we won't put someone in a non-ergo chair with a non-adjustable work surface, we frequently don't care squat about other issues. Example? In our main server room, we have a completely screwed up air handling system that keeps the air pressure in the room much higher than the rest of the floor. It is *very* d
Another recommendation (Score:5, Funny)
By piping the A/C through, you can keep your beverages at a perfect temperature.
If you have any remaining space, you could install a dockable sweet trolley to take your refreshments mobile.
If your boss asks, just tell him your servers were claustrophobic.
Re:Another recommendation (Score:1)
And I was going to recommend tiny servers and Oompa Loompas...
Lab A/C system as Wine Cooler (Score:3, Interesting)
Huh??? (Score:5, Funny)
All the same, I'd equip it with toilet paper and hand soap, just like the others.
Re:Huh??? (Score:3, Funny)
Shelve 'em (Score:5, Insightful)
Re:Shelve 'em (Score:3, Insightful)
Why use a free-standing shelf? I wouldn't think those would be nearly as sturdy as a shelf that's actually mounted to the wall, and from your description, it sounds more expensive to use free-standing shelves, too.
My server closet is built using steel mesh shelves available at any Lowe's or Home Depot. You mount three or four vertical strips with little slots, then the shelf brackets drop in and slide down, locking them into place. The shelf brackets, in turn, snap onto the wire shelves themselves. Take
Wall-mounts versus utility shelves (Score:2)
Depending on how much weight you're considering putting on the shelves, that may or may not be acceptable. Desktop computers used as servers probably aren't heavy enoug
Correction: torque is on walls, not floor. (Score:2)
Sorry about that.
Re:Correction: torque is on walls, not floor. (Score:2)
The problem with free-standing shelves is that there is usually either no cross-bracing or inadequate cross-bracing. Hanging shelves are inherently cross-braced because the wall boards keep the studs from flexing much (and the studs themselves are thick and are fastened to horizontal studs at the top and bottom, which in turn, are fastened to the floor, which isn't going anywhere). Most of the stand-alone shelf failures I have seen result from not having any good way to prevent the four vertical posts fr
Re:Shelve 'em (Score:3, Informative)
So let's see, 3 desktop-sized servers and their UPSes should easily fit in a rack. Just make sure you get shelves that can handle the weight of the UPSes (unless the UPS is already rack-mountable, in which case, just get the rack kit for them). Get yourself a good slimline rack-mount KVM to place in there as well.
For item storage, get yourself a rollin
Racks, and rack-servers, and desks. (Score:3, Informative)
mobile rack might be a way to go (Score:2)
Instead, the items you need are:
1) Mobile rack (as tall as would fit in your new room)
2) Shelves for the said rack
3) Tetris skills
Simply place the desktop-sized servers on the shelves at the top of the rack and the UPSes at the bottom, and use the middle for hot-pluggable stuff. You might even be able to place a few actual rackmount cases before you run out of space.
There are some 1U racks that act as a combo KVM/keyboard/monitor - very com
Re:mobile rack might be a way to go (Score:2)
1) Mobile rack (as tall as would fit in your new room)
2) Shelves for the said rack
3) Tetris skills
Most Tetris pieces don't have cables attached.
Re:mobile rack might be a way to go (Score:1)
I think I speak for everyone... (Score:4, Funny)
hey, it works the other way around as well. Now, where'd I put that CAT-6 cable...
stupid suggestion... (Score:5, Insightful)
Re:Virtualize... and use Wall mounted racks (Score:1, Redundant)
Re:Virtualize... and use Wall mounted racks (Score:1)
Virtualize (Score:5, Insightful)
Colocation (Score:1)
Re:Virtualize (Score:2)
Not an option. I don't own or admin the servers. They belong to our corp LAN admins; the most I've ever been asked to do with them is power down or up when electrical service is going to be/has been interrupted. I'm just a lowly desktop technician. Normally, when I visit the office I just plop down at an unused workstation and, if I need more space, I can take over an interview room. No longer. After the rebuild, there will be no more room for me to camp out in the office space contro
silly questions (Score:3, Insightful)
Just because you have a big ergo chair now, do you really need to keep it?
PS - you're insane if you put 3 desktop sized servers in that room. Replace 'em with 3 rackmounted servers and a 1U LCD/keyboard/trackball/KVM.
We have a server closet (Score:5, Funny)
I've mentioned several times that it'll all go to hell when the small single-room household A/C dies. They won't even approve my offsite backup plans. There is a backup server, but it too is in the server closet. They even had a heating vent going into the server closet until I convinced them to seal it this last winter, after it reached the 90s during the coldest time of year.
The thing gives me nightmares. I imagine the A/C failing, the servers dying, and the room catching fire and taking the building with it.
Re:We have a server closet (Score:2)
Right?
Re:We have a server closet (Score:1)
Re:We have a server closet (Score:2)
Excellent point.
Maybe he should keep copies at home, everyone will be very impressed when he does the happy dance on the smoking ruins of the building, waving his memo and screaming "I TOLD YOU SO! I TOLD YOU SO!"
Re:We have a server closet (Score:2)
Re:We have a server closet (Score:4, Funny)
No, they decided it was easier just to give him back his stapler.
Re:We have a server closet (Score:1)
lose the desk? (Score:2)
Lots of server rooms have no accommodations... it's for servers, after all.
Right... (Score:5, Interesting)
Yes, it's called a rack and a desk. You can find both of them available from retailers the world over. Seriously, this question is... trivial. It's all up to how you want to arrange things. As others have suggested, you could buy a seriously powerful multicore system with plenty of RAID storage that takes up under 4U for a few thousand dollars. OK, so put it in the rack with the UPSs on the bottom (wait, do you need them all anymore?) and a shelf with a monitor and KVM (because you only need it for emergencies, since you should normally connect in remotely), and we've just used up under 12 square feet. That's a lot of room left for a desk and chair! Even if you don't want to buy a new server, buy a few more shelves for your rack and stand the existing machines in it.
I have five machines, one of which runs five other VMs, several UPSs, an LTO-3 backup system, two ancient mini-fridge sized servers and a KVM all taking up less than 25 square feet. Half of that is the two ancient servers I'm about to get rid of. It's not that hard...
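The parent's square-footage math checks out on the back of an envelope. Here's a hedged sketch using assumed typical dimensions (a standard 4-post rack enclosure is roughly 2 ft wide by 3.5 ft deep; the desk size is also an invented example, not the poster's actual gear):

```python
# Back-of-envelope footprint check for a rack-plus-desk layout
# in a 45 sq ft room. All dimensions are assumed typical values.
rack_sqft = 2.0 * 3.5        # one 4-post rack enclosure: 7 sq ft
desk_sqft = 2.0 * 5.0        # a 5 ft desk, 2 ft deep: 10 sq ft
room_sqft = 45.0

remaining = room_sqft - rack_sqft - desk_sqft
print(rack_sqft, remaining)  # 7.0 sq ft for the rack, 28.0 left over
```

That leaves well over half the room for the chair, the door swing, and small-item storage, which is why several posters call the question trivial.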
Re:Right... (Score:2)
Small areas = vertical space (Score:2, Informative)
Also, a small adjustable-angle table (drafting table) might make it so that you can accomplish work when necessary, and a
Re:Small areas = verticle space (Score:2)
There are also things like this:
1U 17" Rack Mount LCD Monitor 8 Ports KVM Built In Dual Rail
http://www.circotech.com/1u-17--rack-mount-lcd-monitor-8-ports-kvm-built-in.html [circotech.com]
which is a 1U slide-out tray with LCD with built-in KVM switch. Big space-saver.
I just grabbed this link off a search. I'm not connected with them.
You might want to do your own Google search on the text string above so as not to punish their server...
ever been to Sam's Club? (Score:5, Informative)
Several other people have recommended rack setups, and for ultimate reliability and neatness I'd have to agree; but if your budget is small, a baker's rack will do the job.
Server "Closet" (Score:1)
Memos, Memos and more Memos (Score:5, Insightful)
Then, assemble a list of managers who could even remotely be connected to the problem, or who could get fired when one of those hazards really happens.
Next, draw up a list of which of your company's services rely on those servers. Managers don't understand the implications of a failed server, but they do understand what happens when people can't work because of it. Make sure you describe in very easy terms the connection between the server being down and Joe Cubicle sitting around twiddling his thumbs for as long as the server is down (that the server will be down for a LONG time, and the chances of it going down, were already covered in the first paragraph, I hope). Also make sure you include that they will not get a SINGLE email when (NOT if!) the servers fail, and that for as long as the servers are down there will be no electronic communication AT ALL with the outside world, the other offices, or clients! No mail, no files, no reports, NOTHING will come in or go out when the servers fail!
Pull it all together and write memos. Emphasis on the s. Not one. Make sure you write to them until you get a reply, don't let it rest, make sure that they understand the urgency and that it is a serious, serious, SERIOUS problem to the company. Make sure they understand that everything your company does stands and falls with the availability of the servers and their services. That nobody can be productive when your department is working at sub-par conditions.
Also, look around for possible solutions. Is there some space where you could put a server? Is there a way you could "grow" your space at the expense of some other department? Try to offer a solution, not only a problem. For two reasons: nobody likes a complaint (it sounds like "waaah, I'm unhappy"), and when you already offer a solution, your chances are good that it will be picked up instead of a manager trying to come up with his own idea. A manager's idea is bad for two reasons: first and foremost, he has NO clue what you and your servers need, so you might end up with a server in a toilet right under the water reservoir. And second, they will decide it behind your back, without you having a say in it, and they will try to meet your bare minimum requirements (if at all).
Re:Memos, Memos and more Memos (Score:3, Interesting)
[ It's kind of like asking questions that start with, "I know this guy with herpes..." ]
A memo filled with finger pointing and blame throwing will be a fine document for the new IT guy to use to justify a new server room, but w
Re:Memos, Memos and more Memos (Score:2)
As for the for
Re:Memos, Memos and more Memos (Score:2)
Re:Memos, Memos and more Memos (Score:1)
1) Give multiple solutions - think of one low-cost alternative, another medium-cost/effort one, and a budget hyper-blast based on long-term planning (a good profile-builder, too). Then present it as: "would you like to go with option A or option B?" People like to have options; it makes them feel like they're in control.
2) Speak to them in $$$ - for each solution, table how much
Cubical (Score:2)
For three desktop-size servers, though, this is trivial. Use a cubicle L-desk for a square room and a bench along one wall for a rectangular room. The three servers sit underneath; the monitor, keyboard, and KVM sit on top; and the entire other half of the desk is available for the tech and his laptop.
Do watch out for the air conditioning. A non-IT guy's definition of wh
Enough room/electricity to expand (Score:3, Informative)
One UPS with hot-swappable batteries is nice, but we fry as many APC-brand controllers as we kill batteries. I like to have an independent AC line conditioner on a separate AC mains circuit (i.e. a different 15A circuit breaker), so that those real servers with dual power supplies (hot-swappable, of course) go one to the UPS, one to the line conditioner (for UPS failures). Have enough circuits (not just more plugs) to accommodate future growth. A Watts Up? [smarthome.com] or Kill-A-Watt [p3international.com] meter is handy for measuring your electrical consumption.
Honestly, with hot-swappable hardware RAID-5 disks, hot-swappable power supplies, sensible power distribution, and regular backup hygiene, downtime can be kept to the mere-hours-per-year range or less with care and planning by the administrator(s).
I also love KVM over IP (I use an Avocent AutoView 2000 [avocent.com]) or iLO [hp.com] (Integrated Lights Out management) for headless servers, and have a backup A/C available for server rooms/closets.
For server ideas, look at the HP ProLiant DL380 or Dell PowerEdge 2850 series.
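Once you've metered the load as the parent suggests, the same watt figures tell you how much cooling the closet needs. A hedged sketch of the conversion (1 W of sustained electrical draw becomes about 3.412 BTU/hr of heat; the wattages below are invented examples, not measurements):

```python
# Convert measured electrical draw (e.g. from a Kill-A-Watt meter)
# into the heat load the room's A/C must remove.
WATTS_TO_BTU_HR = 3.412  # 1 watt of sustained load ~ 3.412 BTU/hr

loads_watts = {
    "server1": 350,        # example figures only -
    "server2": 350,        # meter your own boxes under load
    "server3": 400,
    "ups_overhead": 90,    # UPS/charging losses also become heat
    "human": 100,          # a person working in the room adds heat too
}

total_watts = sum(loads_watts.values())
btu_hr = total_watts * WATTS_TO_BTU_HR
print(total_watts, round(btu_hr))
```

With these made-up numbers the closet would need roughly 4,400 BTU/hr of cooling before any safety margin, which is why several posters warn that an undersized household A/C unit is the weak link.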
Re:Enough room/electricity to expand (Score:2)
Jeez, I'm not alone? I thought I was going crazy, with all these APC units dying around here. I think we've gone through twice as many APC units as servers.
Re:Enough room/electricity to expand (Score:2, Informative)
Re:Enough room/electricity to expand (Score:1)
Re:Enough room/electricity to expand (Score:2)
Just to allay confusion, the Avocent AutoView 2000 [avocent.com] noted in parent is KVM-over-Cat5, NOT KVM-over-IP. Big difference, since IP is routable across the WAN and whatever (presumably analog) signal on Cat5 is not.
That said, KVM-over-Cat5 isn't a bad way to make a room accessible from outside...
Try not doing it... (Score:4, Interesting)
This clearly isn't the advice you asked for. Personally, I'd make effective use of shelves (and/or cabinets), and use the longest table that will fit on the longest dimension. Get the next-longest table for the other dimension. There should be just enough room for the door to open and for the chair to be out of the way when someone is sitting in there. You can stack desktops and tower cases underneath. Have a working surface for repairs on top. Hook the laptop dock into the KVM, and put the dock up high (I once packed 10 computers into a room that was 6.5x9.5 or so at a small startup, and that was also my office). Now that I've given you the token advice you asked for, let me try giving you my real advice:
I've been reasonably effective with the passive-aggressive stance of not supporting something so stupid. I'm not sure I can visualize just how dumb this is (I'm not good with descriptions of space). If this is truly a political issue, put in everything you can fit with a reasonable amount of space. The rest becomes "do without". I've seen 60-person development offices run off 3 machines in a server room handling all of the IT infrastructure for the company. It's stupid, but it can be done. A decent desk and an LCD monitor are all that's needed. It'll easily fit within a 7x7 room (a square room of roughly the right size).
You'll be shocked and amazed at the types of results you get from, "I've done everything I can with the resources I have. This project will continue to be a nightmare and always behind the eight ball until we decide to do it right. I'll continue to do my best to support it as it is, but it'll take more time and cost a lot more money than just having done this properly the first time. It will continue to cost us money, and I can give you estimates of how long it will take for resolving this properly to pay for itself."
Generally speaking, no sane boss will argue against that. I've come dreadfully close to being fired on several occasions because of this. However, I was kept around as "the guy who got stuff done". Generally, letting people suffer the consequences of their stupidity is the single most effective and convincing way to get them to see the error.
I used to work with a woman who, every time you asked her to test something you'd developed to automate a business process she was in charge of, would say "it's so broken that I'll just do it by hand". No matter how far or near you were to the mark, if it didn't work perfectly the first time she would refuse to work towards fixing it. She wouldn't explain what she needed or what was wrong. She used to do that to everyone, and everyone worked really hard to coax her into discussing it like a rational person. I found the most effective way to deal with her was to walk out. Wait 3 months until she was completely overwhelmed by the problem you were in the middle of automating. Dust off the old code, work with her for a day and finish up the automation. No one ever understood that all she wanted was the attention of how much crap she had to do, and all the crap she did by hand for you. Most importantly, she liked working harder, not smarter. So automating it before it was absolutely necessary ruined her mindset. Generally speaking, letting someone suffer the consequences of their decision is the ultimate way of convincing them that they were wrong.
Ironically, I did get laid off about 2 months after explaining to my boss that he needed to find a way to make backups. He could make me responsible for them, and I'd do them. Or he could motivate the lazy piece of crap who hadn't gotten them made in the 9 months he'd worked there (even more irritating, he'd dismantled the old backups that worked). Since I was the one who'd get to spend 96 straight hours in the office rebuilding our network due to no backups, I had a bit more vested interest in seeing them work. I worked really hard to be ahead of the game in everything I did, and I'd be responsible for cle
Cooling needs - biggest issue I've had (Score:1)
Zero footprint server room (Score:3, Funny)
That leaves you 45sq ft, which is a nice space for a cot when you need quiet time. It sounds like you're going to need a lot of quiet time.
Re:Zero footprint server room (Score:2)
Don't forget to disconnect the off switch. Darned users get energy conscious from time to time.
Ideas... (Score:1)
You say you need 3 desktop-sized servers. Why? Unless there are specific tasks that cannot be done in a virtual machine, you should seriously consider running your servers in virtual machines. Plus, you should be able to reduce the number of UPSs.
If you are able to virtualize, build the most powerful (dual-Xeon or quad-AMD) machine you c
pffft, sissies (Score:2, Funny)
Ergonomic (Score:1)
Ergonomic in this context means one of those padded seat ones.
Suggestion (Score:1)
Cliff and his questions? (Score:1)
Re:Cliff and his questions? (Score:2)
Re-Think some Parts of that Concept (Score:2)
Why do you need three servers? One more powerful machine
Hmm (Score:1)
Been there done that (Score:3, Insightful)
I've worked on small datacenters about the size you're specifying, and I work on huge datacenters that have up to 10,000 square feet of space filled with racks. In both instances, I've always recommended that it be a "lights out" datacenter; that is, there's no one in there.
Why? Well, first of all there are the noise issues. Real servers make noise, and in most instances it would actually be harmful to you to sit in the room with them for long periods of time. There's also the issue of "kicked cables" that could take down the servers, or a kicked power button. You might think it could never happen, but it can. Thirdly, there's heat. You sitting in that room actually reduces the efficacy of the air conditioning, because the heat you put off adds to the heat the A/C needs to remove. Finally, there's the fact that we aren't clean animals; where do you think the dust comes from that you have to clean out of desktop systems? Much of it is human skin shed by you. Would you rather have to take down the servers every six months to blow out the dust, or have servers that can run for quite some time without buildup?
As for console access, there are plenty of good KVM solutions that use IP. I use Avocent's solutions myself, but YMMV... but they are pretty damned good for small to enterprise businesses. I've used it in environments with 4 or 5 servers, and I've used it in environments with several hundred servers across multiple sites... still works pretty damned well.
Another solution that works incredibly well is to use out-of-band management. In one large installation I work with often, we have HP servers with Integrated Lights Out cards. They allow us to power the servers on and off, launch a remote console and even mount CDs remotely to install software. Since it's Java-based, it also works well under Linux... if you've got a little money to spare, that might be a good solution.
Finally, others have mentioned this: how about one big, meaty server and virtualize? I built an entire R&D lab environment around an 8-processor HP DL760 with 20GB of hot-swap RAM attached to a Compaq SAN that was going to be disposed of... it now runs about 15 VMs daily with a peak of 25, and has become a really reliable and flexible system. It still runs VMware ESX 2.0, and I plan to upgrade it to 2.5 soon because it scales a lot better in our tests and I might be able to squeeze a couple more machines out of it. Sure, with the SAN being an older one the boot times are rather pitiful, but I have swap partitions moved off to local storage on the 760 (it has space for four 72GB drives, soon to upgrade to 146s) to keep performance up. Even for my Windows boxes I create a "local" partition for temp and swap... boot times suck, but in running performance you can rarely tell the difference between the VMs and physicals.
The key here though is lose the workstation. If you lose a server, take it out of the server room for troubleshooting. It maintains the "clean room ideal" of the datacenter, reduces risk of downtime and even protects you. Plus it looks good on a resume when you say you've dealt with some of these lights out technologies (IP KVM's, iLo's etc.)... believe me it can help a lot if/when you decide to move to a bigger enterprise that may have rules and regs that require lights-out datacenter experience.
Mine's smaller (Score:1)
If you can, make the door swing out. That was a big help to me and it's a relatively easy mod. I also used a louvered door to aid airflow.
Then realize that you have three walls to mount equipment on. My room includes all the telephone patch panels, PBX, and network routers/sw
Lots (Score:1)
easiest solution (Score:1)
have the door open/close either vertically, horizontally, or outwardly. Anything other than inwardly.
you should also consider punching out the ceiling tiles, set some 9' wooden beams up there, and place the cases/UPSes up there.
in my place of work we have a raised floor; I bet stuffing a few machines down there could work out. Perhaps it is similar at your job?
If all the walls have 2 wall surfaces (cross-section would look like this -> | | ), knock out the inner wall to get that e