Outfitting a Brand New Datacenter?

An anonymous reader writes "We completed our new 4,000 sq. ft. data center (Tier II/III, according to The Uptime Institute) and just recently moved our core systems from our old data center to the new. We've been up and running for several months now and I'm preparing to close out the project. The last piece is to purchase some accessories and tools for the new location. The short list so far consists of a Server Lift, a few extra floor tile pullers, flashlights and a crash cart. We'll also add to the tools in the toolbox located in one of the auxiliary rooms — these things seem to have legs! What are we missing? Where can we find crash carts set up more for a data center environment (beyond the utility cart with an LCD, keyboard, and mouse strapped to it)?"
This discussion has been archived. No new comments can be posted.

  • Re:Safety equipment (Score:2, Informative)

    by gnarfel ( 1135055 ) <anthony.j.fiumara@gmail.com> on Monday July 30, 2007 @10:25PM (#20051943) Homepage
    Padlocks. Lots and lots of padlocks. Those pesky electrical panels have a tendency to get shut off, and they all seem to be outfitted with padlock holes.
  • by gen0c1de ( 977481 ) on Monday July 30, 2007 @10:28PM (#20051989)
    At the DC I work at we have a crapload of extra gear. Make sure you have one emergency kit in your core room, and make sure no one uses it unless it is an emergency. The kit should include, but not be limited to, the following: screwdrivers, mounting screws/cage nuts, a knife (a Leatherman multi-tool), spare patch/cross-over cables (copper, various lengths), spare fibre patch cables (various lengths), a cable tester (copper/fibre), couplers for fibre, fibre cleaning kits, a patch panel punch tool, and spare hardware for core gear. We have more gear, but I'm drawing a bit of a blank as I haven't needed to look at the kit for a while.
  • Safety equipment++ (Score:4, Informative)

    by soloport ( 312487 ) on Monday July 30, 2007 @10:36PM (#20052071) Homepage
    And a good sized crescent wrench. Absolutely indispensable.

    Drop it across the terminals of one of your backup batteries -- when it's disconnected from the grid. When the wrench cools off, store it in a safe place. Makes a great scapegoat when things go wrong. Could save your career...
  • Nice crash cart (Score:4, Informative)

    by Anonymous Coward on Monday July 30, 2007 @10:36PM (#20052075)
    I've used crash carts from a company called Ergotron: http://www.ergotron.com/tabid/158/language/en-US/default.aspx [ergotron.com]

    At my current and my past company, they work really well. I looked high and low for a good crash cart and nothing seemed to come close to these. Maybe I was just searching for the wrong terms (and apparently my vendors were too). They are a bit pricey though, ~$1500 or so to start. I have a StyleView LCD cart at my current job, and had an LCD cart and a laptop cart at my last place (servers were co-located in a ~900 sq. ft. cage, 8 feet between rows, so plenty of space for the carts).

    I also bought a KVM over IP/CAT5 solution from Raritan (http://www.raritan.com/ [raritan.com]), which worked out really well for those situations where a serial console wasn't enough (unless you have fancy out-of-band management; some servers do, some don't). I set up tables in the front of the cage and hooked up a couple of the Raritan hardware clients. I typically ran one CAT5 cable w/KVM hookup to each rack, so it could be plugged into any system fairly easily. Range of 1,000 feet. This was pretty pricey too; with the adapters and all it was about $25k, though in the grand scheme of things it was cheap at the time. I had Cyclades terminal servers in every rack, with serial consoles on all the servers and network gear.

    I also hooked up a temperature sensor board, from Sensatronics (http://www.sensatronics.com/ [sensatronics.com]) I think. It was a 16-port board, and I bought 300-foot cables for all of the sensors and cut them to length. This ended up being about $5k, I think (I went way overkill on the cable lengths).

    At my current company we use servertech (http://www.servertech.com/ [servertech.com]) PDUs; their higher-end models come with optional temperature/humidity sensors, so we use those instead of the Sensatronics.
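
    Whatever sensor hardware you end up with, a tiny poller that nags you when a reading drifts out of range is worth having. A minimal sketch in Python, assuming a made-up HTTP endpoint and JSON layout (real boards usually speak SNMP or have their own web interface, so treat the URL and field names here as placeholders):

        import json
        import urllib.request

        # Hypothetical sensor endpoint and JSON shape -- substitute whatever your
        # PDU or sensor board actually exposes.
        SENSOR_URL = "http://pdu-rack01.example.com/api/sensors"
        TEMP_LIMIT_F = 80.0
        HUMIDITY_RANGE = (30.0, 60.0)  # percent relative humidity

        def check_sensors():
            """Return human-readable alerts for any out-of-range readings."""
            with urllib.request.urlopen(SENSOR_URL, timeout=5) as resp:
                # assumed shape: [{"name": ..., "temp_f": ..., "humidity": ...}, ...]
                readings = json.load(resp)
            alerts = []
            for s in readings:
                if s["temp_f"] > TEMP_LIMIT_F:
                    alerts.append("%s: %.1f F exceeds %.1f F" % (s["name"], s["temp_f"], TEMP_LIMIT_F))
                if not HUMIDITY_RANGE[0] <= s["humidity"] <= HUMIDITY_RANGE[1]:
                    alerts.append("%s: %.1f%% RH out of range" % (s["name"], s["humidity"]))
            return alerts

        if __name__ == "__main__":
            for alert in check_sensors():
                print(alert)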

    Despite it being a co-location, we had 500 kW of power going into that cage (the standard setup was ~12 kW/rack). If the data center had followed their own procedures (AT&T enterprise network services), we would have had to have about a 5,500 sq. ft. cage, comparable to your data center :) (at 90 watts/sq. ft. of cooling). But they did not (at the time; they wised up in July of last year and now strictly enforce their cooling capacity at this particular data center).
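
    For anyone checking the math, that figure is just the IT load divided by the facility's cooling density; a quick sketch with the numbers from this post:

        # Rough floor-area check: IT load divided by the cooling density the
        # facility supports. The numbers below are the ones from this post.
        def required_floor_area(load_watts, cooling_watts_per_sqft):
            """Return the floor area (sq. ft.) needed to cool a given IT load."""
            return load_watts / cooling_watts_per_sqft

        if __name__ == "__main__":
            load_w = 500_000   # 500 kW into the cage
            density = 90.0     # 90 watts/sq. ft. of cooling
            print("%.0f sq. ft." % required_floor_area(load_w, density))  # ~5,556 sq. ft.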

    posting as AC, since I don't have an account. I read slashdot daily but I post maybe once every 2-3 years, so I haven't bothered to make an account.
  • Don't guess! (Score:5, Informative)

    by martyb ( 196687 ) on Tuesday July 31, 2007 @12:40AM (#20053213)

    I've seen several good comments already with specific suggestions on tools and parts. Start with those. My suggestion is quite simple, actually: why GUESS what you need when you can find out for sure?

    Tear down one ENTIRE rack. (Or several, if they have any variations.)

    1. Pull out ALL the servers.
    2. Pull out ALL the switches and routers.
    3. Disconnect ALL the cables.
    4. Unscrew EVERY screw and EVERY bolt.
    5. Disassemble each different server's internals:
      1. Pull out EVERY board.
      2. Remove the power supplies.
      3. Pull out the motherboards.
    6. Ditto for any COMMs hardware (e.g. cards, etc.)

    Now, look at this big pile of parts in front of you and imagine what you would do WHEN *ANY* one of them breaks.

    Get several spares for each of those parts and put them into the cart.

    Whatever tools you needed for disassembly, put into a crash cart.

    Then make another, identical cart. When the brown stuff hits the spinnie thingie and multiple systems are down, the last thing you want to be doing is fighting over tools. Get spares of EVERYTHING so at least TWO people can work on things at the same time! You'll thank me when there are two of you trying to work on both sides of a rack.

    NOTE: Be sure to inventory what you put into each cart! Tools have a way of growing legs and you want to be able to check and make sure that you STILL have ALL the tools.
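
    Even a few lines of script against a plain checklist make that audit painless; a minimal sketch (the tool names are just examples):

        # Compare what a cart is supposed to carry against what is actually on it.
        EXPECTED = {
            "phillips screwdriver", "flat screwdriver", "cage nut tool",
            "multi-tool", "cable tester", "punch-down tool", "flashlight",
        }

        def audit_cart(present):
            """Print anything missing from, or unexpectedly on, the cart."""
            missing = EXPECTED - present
            extra = present - EXPECTED
            for tool in sorted(missing):
                print("MISSING:", tool)
            for tool in sorted(extra):
                print("not on inventory:", tool)
            if not missing and not extra:
                print("Cart matches inventory.")

        if __name__ == "__main__":
            audit_cart({"phillips screwdriver", "flat screwdriver",
                        "multi-tool", "cable tester", "punch-down tool"})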

    And please consider getting a big-ass UPS for your cart (at least 1 kVA). If your power is wonky, you want to be sure your cart's equipment (laptop, hub, switch, router, etc.) won't be flaking out as the power comes and goes. Even with mains power out, you can plug one server into the UPS and restore/repair it. While you're at it, also get some LONG extension cords (100-foot) made of AT LEAST 12-gauge wire. Plug the UPS into the extension cord.
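
    On the 12-gauge point, the voltage drop over a long cord is easy to estimate from standard copper wire resistances; a rough sketch, assuming the load is a fully loaded 1 kVA UPS on a 120 V circuit:

        # Why the wire gauge matters: voltage drop over a 100-foot extension cord.
        # Resistances are standard copper values (ohms per 1000 ft at ~20 C).
        RESISTANCE_PER_1000FT = {16: 4.016, 14: 2.525, 12: 1.588, 10: 0.999}

        def voltage_drop(awg, cord_length_ft, load_amps):
            """Round-trip voltage drop for a two-conductor cord of the given gauge."""
            ohms = RESISTANCE_PER_1000FT[awg] * (2 * cord_length_ft) / 1000.0
            return ohms * load_amps

        if __name__ == "__main__":
            amps = 1000.0 / 120.0   # a fully loaded 1 kVA UPS on a 120 V circuit
            for awg in (16, 14, 12, 10):
                drop = voltage_drop(awg, 100, amps)
                print("%d AWG over 100 ft: %.1f V drop (%.1f%%)" % (awg, drop, 100 * drop / 120))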

    Think you're all set? Now, using ONLY the tools on ONE crash cart, put the rack back together. With the power out. (i.e. no mains)

    When you have done this, not only will you be CERTAIN that you have all the tools you needed to [re]assemble everything, you'll actually have done so and will have run into (hopefully) most of the problems that you could encounter.

    That's it off the top of my head. Best of luck to you! P.S. One last thing: MANY rolls of Duct Tape! <grin>

  • Re:Safety equipment (Score:2, Informative)

    by Anonymous Coward on Tuesday July 31, 2007 @12:53AM (#20053277)
    Um, that's for lockouts, to keep the power OFF while someone is working on it, not to keep the power on. (And in fact, it'd likely be illegal to lock it on even if the holes did line up.)
  • Re:D Batteries (Score:5, Informative)

    by sumdumass ( 711423 ) on Tuesday July 31, 2007 @02:51AM (#20053911) Journal
    Funny story from a similar situation, though not on as large a scale.

    I have a small site with about 8 computers and 3 servers, plus wireless shooting to 3 other buildings about 50 feet apart with about 4-6 computers each. I was overridden on our battery backup system in favor of $50 UPSes purchased separately for each computer at Office Max. I'm thinking OK, they are getting a generator, and I told them to make sure it had a line conditioner and was certified to work with sensitive computer equipment. Besides, when it was just the one building, the UPS worked just fine.

    They ignored that, and on the test, after all the batteries went down, the computers just quit because the UPS software conflicted with a proprietary app they chose to use. I was called in by the guy who installed the generator and was told that about 20 of the UPSes were bad. I thought OK, they have been there for a couple of years in some cases, and brought down some replacements. I swapped them out, they tested it again, and before I got back to the shop I got a call saying more of them were bad. All the local sources were out, and the electrician told me he had better backups, so I told him to get them. After swapping them out, I asked them to make sure there was a clean electrical line coming off the generator, and they assured me there was.

    Two weeks later, a car hit a telephone pole and the electricity was out for more than 10 hours. All the UPS units went out and none of the computers would work. I tested the electrical line and it was jumping between 70 and 150 volts at about 40 hertz. All the UPSes shut down and wouldn't take power, so they decided to plug the computers directly into the wall outlets. That took out two main boards and three power supplies, and the rest of the computers just wouldn't power up.

    The database on one of the applications got corrupted beyond repair, and they had to recreate a week's worth of entries, because the drive on the backup server got corrupted too when its main board went out and no one had made the external backup in over 5 days. The phone system was borked, a 64-inch plasma TV in the lobby was gone, and various other things needed to be replaced because they acted weird from then on out. The line conditioners would have been about $90 per outlet, or about $2,000 for one capable of regulating all the power coming from the generator. In the end, it cost around $10,000 in replacements, labor and everything, plus they ended up buying a new generator and this time getting a power control system that was certified for sensitive electronics.

    Bad power will cause so many problems it isn't funny. Most people don't even know that a generator's output can be out of whack. Not all of them are created alike. Small things like how fast they can adjust to the load pulled from them and how stable the output is aren't a given. You have to make sure that's there, or you end up with broken electronics every time the power goes out and the generator kicks on.
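
    If you ever get to specify the transfer gear, insist on something that checks the generator output against real tolerances before it feeds the computer room. A minimal sketch of that check; the tolerance bands are illustrative, so use whatever your equipment is actually rated for:

        # Sanity-check generator output before letting it feed sensitive loads.
        # Illustrative tolerances: roughly 120 V +/- 10% and 60 Hz +/- 0.5 Hz.
        VOLTAGE_RANGE = (108.0, 132.0)
        FREQ_RANGE = (59.5, 60.5)

        def power_is_clean(samples):
            """samples is a list of (volts, hertz) readings taken over time."""
            return all(
                VOLTAGE_RANGE[0] <= v <= VOLTAGE_RANGE[1] and FREQ_RANGE[0] <= hz <= FREQ_RANGE[1]
                for v, hz in samples
            )

        if __name__ == "__main__":
            # The kind of output described above: voltage swinging 70-150 V at ~40 Hz.
            bad = [(70.0, 40.2), (150.0, 39.8), (110.0, 41.0)]
            good = [(119.5, 60.0), (120.3, 60.0), (121.0, 59.9)]
            print("bad generator ok?", power_is_clean(bad))    # False
            print("good generator ok?", power_is_clean(good))  # True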
  • Spares (Score:2, Informative)

    by Sobrique ( 543255 ) on Tuesday July 31, 2007 @04:39AM (#20054437) Homepage
    A decent spares store.

    Computer hardware isn't so much an issue - although, if you don't have some kind of maintenance contract, you want at least 2 of everything, up to and including 'entire servers'.

    Depending on how much you're doing 'in house', things like cage nuts, spare cable management thingies, and tools to deploy said items will save a lot of grief.

    Serial cables, and consoles, if you're running unix hardware. Get a set that you _know_ works. All too often you only ever need these when things have gone a bit wrong, which is entirely the wrong time to be wondering whether that's the right cable.

    Spare UPS battery modules - if your whole DC isn't on a clean UPS supply, then you'll need standalone units for all your servers. And they will have batteries going bad, and it will always be a nuisance when they do.

    Little labelling utility thing, like a Dymo [dymo.com]. The key to a happy datacentre is to label and label and label. Even put labels on top of other labels saying you think this label is wrong, but haven't had a chance to check it. Label everything you can think of, with what it's for, where it goes, and who's in charge of it. Servers need hostnames, IP addresses, and anything that I might need to know about them right there and then. Cables need where they're going and what they're plugged into. Go nuts with your labels: if I can't tell something just by looking at it, and I might need to know it 'here and now', then it should have a label with that information on it.
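
    If you want the label text consistent, it barely takes any code to generate it from a simple record per server; a small sketch (the hostnames and fields are made up):

        # Generate consistent label text from one record per server.
        from dataclasses import dataclass

        @dataclass
        class ServerLabel:
            hostname: str
            ip: str
            owner: str
            purpose: str

            def render(self):
                return "%s | %s | %s | owner: %s" % (self.hostname, self.ip, self.purpose, self.owner)

        if __name__ == "__main__":
            labels = [
                ServerLabel("db01", "10.0.12.5", "ops team", "primary database"),
                ServerLabel("web03", "10.0.12.21", "web team", "front-end pool"),
            ]
            for label in labels:
                print(label.render())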

  • by RMH101 ( 636144 ) on Tuesday July 31, 2007 @05:13AM (#20054609)
    If your AC fails, you have a surprisingly short time before heat becomes a major problem. A portable aircon unit or 5 and some door wedges, combined with your largest sysadmins on guard duty, could save your bacon...
  • A few more items (Score:3, Informative)

    by Critical Facilities ( 850111 ) on Tuesday July 31, 2007 @10:04AM (#20056887)
    I'd say you'll thank yourself if you have some of these items:

    A Spot Cooler [airrover.com] - If you have a CRAC unit crap out and need some coverage while replacing part(s).

    Replacement compressor(s) [carlylecompressor.com]

    A variety of above floor fans, [grainger.com] and below floor fans [grainger.com] (in case of water under the floor).

    As many spare breakers [geindustrial.com] as they'll let you buy. (that UPS is no good to you with a bad breaker downstream of it).

    Don't just get tarps, get these tarps. [mauritzononline.com]

    Extra long load bank cables [teledynereynolds.com]. Have your electrician make them up for you. If you make them extra long and store them onsite, you can use them to jumper out inside switchgear if you suffer a catastrophic failure (it might be ugly, but if done right, it can save your ass).

    Flashlights [foreverflashlights.com] that will work.

    Hand operated pumps [deanbennett.com]. If you have a pump fail and you need to get diesel fuel from your storage tanks to the "day tanks" of your generators, you'll be glad this is on the shelf.

    A megger. [megger.com]

    A phase rotation meter. [mygreenlee.com]

    A good circuit tracer. [mygreenlee.com]


    That's a pretty good start.
  • Re:Safety equipment (Score:2, Informative)

    by rickshank ( 206122 ) on Tuesday July 31, 2007 @12:32PM (#20059107)
    I'll clarify a bit (I'm the submitter). Tier II/III defines the redundancy/sustainability of our data center.

    We've got redundant feeds from the power company, redundant diesel generators, redundant UPS systems, N+1 HVAC system, etc.

    Image is important. We frequently have audits, and the auditors typically want to view the data center. It's certainly a clean-room-type environment, with scheduled professional above/under-floor cleanings and no clutter. No games, tool boxes, shipping containers or the like. Servers are loaded across the hall in the integration room prior to going into production.
