How Do You Handle Your Enterprise Documentation? (125 comments)

An anonymous reader wonders: "I'm curious as to what tools Slashdot readers use to inventory and document their networks? What got me thinking about this is the role VMware has been playing in data centers. You've got your SAN, various physical and logical networks, various VMs, and so forth. It all adds a new layer of complexity in terms of documentation. I'm curious as to what people have been using for things like documenting how their backups work, LAN settings, FW settings, and where and which machines run which services. How do you blueprint your entire IT infrastructure so that someone brand new could start and figure out what does what?"
This discussion has been archived. No new comments can be posted.

  • Easy! (Score:3, Interesting)

    by locokamil (850008) on Friday December 15, 2006 @11:06AM (#17255208) Homepage
    ... we don't.
    • Re:Easy! (Score:3, Insightful)

      by richdun (672214) on Friday December 15, 2006 @11:16AM (#17255414)
      Sadly, I think that would win a poll of the average /.er and others.
      • by Silver Sloth (770927) on Friday December 15, 2006 @11:26AM (#17255614)
        Until you're on call, it's 4 a.m., the system's down, and you're trying to fix something that was amended by another team member. All of a sudden you become a HUGE fan of documentation, and, next morning, you tend to explain your new-found zeal in no uncertain terms to the person who amended the port numbers without amending the documentation.
      • Re:Easy! (Score:3, Interesting)

        by SatanicPuppy (611928) * <> on Friday December 15, 2006 @12:40PM (#17257000) Journal
        I'm a coder and a Unix admin... Unfortunately, I don't get paid to be a Unix admin (as the CFO told me, right before she cut out my raise), so while my code is documented to death, my network architecture is cryptic and erratic. Some services are enabled as I use them, but not set up to enable on boot. Some machines are in oddball locations in the building, and they're all headless, and unlabeled. Since the network infrastructure of the building is crap, I've had to pull my own cable to some machines to get adequate bandwidth, and that means that servers aren't necessarily plugged in where you'd think they would be.

        All the Win admins, who tell the CIO they don't need to pay me to do admin work when everything is working perfectly, and then worship my skills when something breaks, can figure it out themselves if I'm not around. I'm tired of babying them, and giving them detailed written instructions for stuff that they should damn well know how to do.
        • Re:Easy! (Score:2, Funny)

          by chris_mahan (256577) <> on Friday December 15, 2006 @01:47PM (#17258144) Homepage
          While I completely agree with you, I now know why your nick is SatanicPuppy. Whiny yet Evil.
          • by SatanicPuppy (611928) * <> on Friday December 15, 2006 @03:54PM (#17260092) Journal
            Yeah, I guess it is a bit whiny... I got dumped with a completely undocumented system after the powers that be fired and drove off the people who'd maintained it. I jumped in and took up the slack (the slack of three people), working crazy overtime trying to keep up some SERIOUSLY erratic yet mission-critical applications, only to get shafted in my review because my programming output dropped.

            I did feel abused. At that point I stopped making anything idiot-proof, and stopped replacing awful kludges with stable configurations. I stopped programming in one programming language. I stopped using one crontab. I stopped caring if applications had critical directory mount chains that crossed multiple machines. Most importantly, I stopped documenting anything.

            Looking at my work from a purely functional standpoint, no one could have any complaints. Everything does exactly what it's supposed to. But the robustness and redundancy that are the hallmarks of a system that is done correctly are gone, and the beauty of it all is, I've got a nasty performance review in my file blasting me for not focusing on "my job", so I fricking dare anyone to complain about anything that is not part of my official responsibilities.
            • Re:Easy! (Score:2, Insightful)

              by chris_mahan (256577) <> on Friday December 15, 2006 @05:13PM (#17261360) Homepage
              Same here...

              My managers are like "What have you done lately?"

              My reply: Documentation, stability and scalability enhancement

              Their reply: "What for? Deliver something to the customer!"

              My reply: "I have: zero downtime in the past 12 months."

              But do they care? No.

              • Re:Easy! (Score:2, Funny)

                by chris_mahan (256577) <> on Friday December 15, 2006 @05:16PM (#17261402) Homepage
                Oh, as an aside, my boss said he had a problem. For our goals, he has to reduce the number of tickets filed against our applications by 40% next year, in order to meet his achievements benchmark. The problem? We only had 1 ticket filed last year against our applications.

                • Re:Easy! (Score:4, Interesting)

                  by SatanicPuppy (611928) * <> on Friday December 15, 2006 @05:23PM (#17261530) Journal
                  I hate percentage goals. What a worthless metric. If I have a great year for programming (like I did last year), then the next year my job responsibilities double and my code output drops, did I become less productive...or more?

                  It's like taking a poorly written application and cleaning it up, so that when you're finished, it's smaller than it was when you started. I did that a couple of months ago and this dumbass kept overwriting my new code with the old code, because he assumed that the new code must be bigger than the old code, and couldn't be bothered to look at the timestamps.
  • by jofny (540291) on Friday December 15, 2006 @11:06AM (#17255222) Homepage
    I'm busy building a model of our as-is IT systems, policies, procedures, configuration standards, actual settings where appropriate, etc. in an enterprise architecture tool. The tool lets me relate the disparate information types, find gaps, plan change, etc. It's also a central repository for any and all IT documentation (as you described) and allows multiple people to update their bits of it as needed. It's kind of cool!
  • Uhh, the usuals? (Score:5, Informative)

    by toleraen (831634) on Friday December 15, 2006 @11:07AM (#17255246)
    Word + Visio.

    Of course the person creating the drawings and documents must be proficient in technical writing (aka not an idiot), because no matter what tools you have, if you don't know how to explain things, they'll be useless. Try to get your documentation peer reviewed to make sure it makes sense.
    • by lostboy2 (194153) on Friday December 15, 2006 @11:40AM (#17255852)
      mod parent up!

      The only thing I'd add to the parent post is that the people documenting stuff have to be willing and able to communicate effectively, not just proficient in tech writing. That means, among other things, that they must be committed to maintaining the documentation, willing to take the time to explain things clearly, and able to organize the documentation effectively.

      Collections of undefined acronyms, cryptic phrases, and/or excerpts cut & pasted from e-mails without context into text documents scattered across a hundred directories and subdirectories (or printed and stuffed into a 3-ring binder) are not useful. And sometimes old/incorrect/outdated documentation is worse than no documentation.
    • by qwijibo (101731) on Friday December 15, 2006 @12:08PM (#17256352)
      The problem I find with using desktop based tools like this is that the documentation may exist and even be decent (in theory, I've never seen it happen in the real world), but how does anyone find the documentation? It's easy to keep things well organized for a 10 person company, but very difficult when there are over 10,000.
      • by walt-sjc (145127) on Friday December 15, 2006 @04:13PM (#17260412)
        Organizing the 1000 or so word documents in any kind of reasonable fashion is a nightmare.

        I much prefer a wiki.
        • Re:Uhh, the usuals? (Score:3, Interesting)

          by PylonHead (61401) on Friday December 15, 2006 @07:00PM (#17262806) Homepage Journal
          [We're so not 'enterprise' anything] But I'll say that for our small show, switching IT documentation over to a Wiki has been amazing.

          * If you're looking at something, and it's wrong, you can change it without missing a beat.
          * There are no worries that you're using an old version of the documentation
          * It's got a search engine
          * All changes are versioned
          * All password information is stored encrypted

          If you make documenting something simple, people will document it. If you make it hard, people will not.

  • by T.Hobbes (101603) on Friday December 15, 2006 @11:08AM (#17255250)
    I tried organizing textfiles for all the chapters and gifs, but it's much easier to just fork over the money and pay for the printed version. Paper makes for easier reading and browsing, too, like with any other book.

    Amazon has it for $25 here: al-Manual/dp/0671704273

    Enjoy :)
  • by hcdejong (561314) <[ln.tensmx] [ta] [sebboh]> on Friday December 15, 2006 @11:09AM (#17255268)
    I was going to recommend Adobe FrameMaker, but that's for a different value of 'Enterprise Documentation'.
  • Scuse me? (Score:3, Funny)

    by justkarl (775856) * on Friday December 15, 2006 @11:10AM (#17255290) Homepage
    ....What documentation?
  • Use a Wiki (Score:5, Insightful)

    by Silver Sloth (770927) on Friday December 15, 2006 @11:11AM (#17255308)
    The biggest problem with documentation is that we're all too busy keeping the systems running to write up what we did. It is therefore necessary to use a system where
    • It's easy to amend/update
    • Access is controllable
    • The content is searchable
    All this screams Wiki to me. If you're capable of setting up the sort of VMware system you describe, then installing MediaWiki will be a piece of cake.
    • by B2K3 (669124) on Friday December 15, 2006 @11:17AM (#17255420)
      Exactly -- any documentation about a live network will inherently become out of date after any change. With a wiki, whoever notices a discrepancy between documentation and reality can easily fix the documentation.
    • Re:Use a Wiki (Score:3, Interesting)

      by WebCrapper (667046) on Friday December 15, 2006 @11:28AM (#17255630)
      This is pretty scary because my org has been attempting to find the best way to document for the last week. With over 700 computers/servers/laptops, all separated into regions up to 9 hours away, it's a little painful. On top of that, we've noticed that the past admins haven't documented anything since 2000...

      Sadly, we don't have the time (like you said) to go out and find this stuff and determine the status.

      Within the last couple of hours, though, I've found technical support software (which we need badly) that will scan your network for all kinds of info. I won't list anything specific because we haven't gone with anything yet, nor do I want to look like I'm advertising. But these packages look pretty promising and some offer reporting ability.

      Now, the bad part is, we want to create "God Books" for each one of our servers detailing EVERYTHING about it and how to bring it back from the dead, if needed. Talk about a pain in the ass. Although, I never thought of a Wiki. Since we want to stand one up anyway, that would be interesting. I'd be interested in seeing anything like this anyone has created.
      • by RingDev (879105) on Friday December 15, 2006 @12:31PM (#17256816) Homepage Journal
        "Now, the bad part is, we want to create "God Books" for each one of our servers detailing EVERYTHING about it and how to bring it back from the dead, if needed."

        It's called Norton Ghost ;)

        But yeah, that documentation is incredibly helpful if you need to build the same functionality on new hardware, or if you need to upgrade a system.

        • by fostware (551290) on Saturday December 16, 2006 @12:42AM (#17265596) Homepage
          Actually Symantec LiveState or Acronis TrueImage Enterprise...

          Both use VSS or low level agents to create images without downing the servers, and both allow restores from a SMB share.

          Worth their weight in gold when multiple drives die in a RAID and it's the end of the school year (certificates, reports, Curriculum Council census data - some with legal deadlines)
          • by WebCrapper (667046) on Saturday December 16, 2006 @06:28AM (#17267222)
            We currently run RAID1 mirrors on the OS drives, RAID5 on the data drives and the servers are monitored every day for dead drives. We actually had a drive fail on the exchange server and we caught it quick enough to have the rebuild done on the new drive within an hour and a half of the drive failing. Now granted, we just happened to be in the room 3 minutes after the drive failed, but it would have been caught within 3 hours. It also seems that the past admins used drives from the same lot in about 4 of the servers (you know, box shows up and you just plug everything into everywhere you can, all at once) - we've had 4 drives fail on 3 machines within 2 weeks, which is why we're monitoring so much at the moment.

            On top of that, we back up between servers and use Veritas to back up to either USB hard drives (don't ask) or Tape drives. I've also been tasked with creating a network status web page to allow us to catch issues before the admins remote into the machine to search for any issues. No clue how I'm going to tackle the issue of failing drives since Windows thinks a RAID array is fine even though 1 drive can be dead...we'll see. I'd be interested in hearing if anyone has coded this before.

            We're working on setting up three 3.5TB SANs in separate locations for cross-network backups and restores (we're running a lot of big connectivity between our sites). At this point, we're just waiting on the drives to show up and then we'll deploy. We're also looking to move away from Veritas for various reasons.

            On top of all that, we have to....document all of this, including network diagrams (visio) somewhere. Over the past 24 hours of thinking, I think the wiki is the best way to go.
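For the failing-drive check asked about above, one low-tech starting point is polling each physical drive's SMART health and publishing the result to the status page, rather than trusting the logical RAID volume. A minimal sketch in Python, assuming smartmontools' `smartctl` is installed; the device list and the demo output line are assumptions to adapt:

```python
import subprocess

def smart_health_ok(smartctl_output: str) -> bool:
    """Return True if smartctl's overall-health line reports PASSED.

    Works on the text of `smartctl -H /dev/sdX` (smartmontools).
    """
    for line in smartctl_output.splitlines():
        if "overall-health self-assessment" in line:
            return line.rstrip().endswith("PASSED")
    # No health line at all usually means the drive did not answer.
    return False

def check_drive(device: str) -> bool:
    """Ask smartctl about one device; assumes smartmontools is installed."""
    out = subprocess.run(["smartctl", "-H", device],
                         capture_output=True, text=True).stdout
    return smart_health_ok(out)

if __name__ == "__main__":
    # Demo on canned output; in production, loop check_drive()
    # over your device list from cron and alert on failures.
    sample = "SMART overall-health self-assessment test result: PASSED"
    print("OK" if smart_health_ok(sample) else "FAILING/UNKNOWN")
```

On Windows servers the same idea works via smartmontools' Windows build or a WMI query; either way the check bypasses the RAID layer's "everything is fine" view.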
    • by tom17 (659054) on Friday December 15, 2006 @11:29AM (#17255644) Homepage
      What if you were in an environment where any and every change, no matter how small, is documented in a mandatory way?

      Now I realise that this ups the staffing overhead considerably as you have twice as much work to do. But that aside, do you think this would work?

      What do all the ISO9001-type companies do in this respect? I have worked on projects which, although not ISO9001, were very strict with their change management process, and it has to be said, the documentation was always spot on.

      A lot of overhead, but if it works and can save huge problems down the line then maybe worth it?
      • Re:Use a Wiki (Score:3, Insightful)

        by Silver Sloth (770927) on Friday December 15, 2006 @11:38AM (#17255802)
        The problem with insisting on full change management protocols is the same problem as with hyper tight security. If you make things too difficult then people will find ways round it.

        For example, in the organisation I work for, making a change involves a seven-page document with a five-working-day lead time. On the other hand, changing configuration in response to a customer complaint can be done instantaneously with the minimum of paperwork. So, if you want to get something done, get a customer to raise a complaint and avoid the paperwork.

        Over-complex systems like that are self-defeating.

        • by tom17 (659054) on Friday December 15, 2006 @11:52AM (#17256060) Homepage
          In my old company (the same one where I said the docs were spot on for certain projects), I hated the change management process too.

          It's probably one of those things that's great for managers who want to ensure everything is documented, but awful for techies who just want to get work done.

          If, however, there was a change that needed to be done straight away for whatever reason (eg, critical bugfixing) we were able to do it pretty much straight away and follow up with the documentation.

          Maybe there is a place for a "half-way" type CM process where insignificant or bugfix changes can be made quickly, with the CM/documentation process following thereafter. I dunno, I'm just waffling really :)

          How about developing a system that documents the system for you? You make a change to a port number and the docs pick it up automagically :)
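The "picks it up automagically" idea can start very small: snapshot each config file, diff it against the previous snapshot from cron, and append a dated entry to the docs. A sketch, where the file name and port setting are made up for illustration:

```python
import difflib
from datetime import date

def document_change(name: str, old: str, new: str) -> str:
    """Produce a dated, wiki-pasteable changelog entry from a config diff.

    Run it against a saved snapshot of each config file and append the
    result to the relevant documentation page; empty string means no change.
    """
    diff = difflib.unified_diff(old.splitlines(), new.splitlines(),
                                fromfile=f"{name} (previous)",
                                tofile=f"{name} (current)", lineterm="")
    body = "\n".join(diff)
    if not body:
        return ""
    return f"== {name} changed {date.today().isoformat()} ==\n{body}\n"

# Example: a port number changes in a hypothetical service config.
entry = document_change("smtp.conf", "listen_port = 25\n", "listen_port = 2525\n")
print(entry)
```

It won't explain *why* a port moved, but it guarantees the docs at least record *that* it moved, which is the failure mode complained about elsewhere in this thread.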
          • by qwijibo (101731) on Friday December 15, 2006 @12:12PM (#17256430)
            The half way process is to have an onerous process that you demand everyone use, and a wiki for the technical people to use, but isn't considered documentation and is frowned upon. A wiki is much easier for people to work with than searching through hundreds of emails on a topic and can be informal enough that people aren't afraid to make changes that haven't been reviewed by 25 levels of management.
      • Re:Use a Wiki (Score:3, Interesting)

        by qwijibo (101731) on Friday December 15, 2006 @11:51AM (#17256036)
        It doesn't work. I work at a company that has strict requirements for following a defined process with a lot of documentation for every project. The official story is that we need to do all of this because we're a bank and SarbOx requires us to be thorough in our documentation of everything. That's +100 for creating a ton of jobs for people who perform no necessary function, but -infinity for good thinking or recognition of reality.

        In reality, the official process turns a 1-month project into a 2.5+ year project (already past year 2, end still not in sight). The 6+ month projects could never be done using the official methodology, so they're clandestine. The strict requirements have caused us to abandon all reason and good judgement and just do as little as possible across the board, resulting in insecure, barely administered environments running applications held together with duct tape. There's a strong official push for "no single point of failure", yet many key people never have time to train anyone else, so major pieces of functionality and knowledge are lost every time someone leaves. This creates an environment where people don't want to stay.

        One of our DBA's set up TWiki to document things. It looks decent and seems like a good idea to me. It's probably noncompliant with company policy on many fronts, so it probably can't be the official repository here, even though it's many times better than what we do have. I like the idea of anyone being able to find the documentation and fix it as well as using version control to allow anyone to see past revisions in case someone's "fixes" are wrong.

        That's way better than the methodology we use where all the documentation has to be watered down to the point where no useful or accurate information is presented, they admit that form over function is the rule for the entire process, and no one could ever find documentation for any project anyway, because there is no common place to put the useless powerpoints.
        • by NeutronCowboy (896098) on Friday December 15, 2006 @12:33PM (#17256862)
          I'll throw in my hat for TWiki. We're not a bank, but we produce a lot of IT monitoring software, and hence are often looked at as experts on how to run IT departments. Well.... let's just say I won't divulge the name of my company, cuz quite frankly, we can barely run ourselves. Documentation specifically is ass. The worst part is not that documentation doesn't exist; it's that no one can find it. Occasionally there is a drive to document something cool, or somebody just sits down and writes down his/her amassed wisdom. The problem is, those documents then vanish into a multitude of document repositories. There are knowledge bases, shared drives, several different document storage solutions, and a couple of wikis that all hold various docs, some of which are duplicates, and none of which (with the exception of the wikis and the knowledge base) can be properly searched. Now we're trying to standardize on a wiki, but for some reason the nitwit in charge of it is trying to make it into a file share with a web interface. I'd love to know how other people store and access their documentation, because for me, that's problem #1.
        • by bsd4me (759597) on Friday December 15, 2006 @12:44PM (#17257080)

          Another post mentions it, but Confluence may be a good fit. It is a wiki, but geared towards the needs of an enterprise. Compared to other wikis, Confluence has better permission control and better facilities for organizing articles. We have deployed several Confluence instances for clients, and all are happy.

    • by marcello_dl (667940) on Friday December 15, 2006 @03:36PM (#17259816) Homepage Journal
      That's what I did, for free, in a "scratchpad wiki". If you have to specify sensitive data or network service details, run a wiki yourself or look for specialized hosting.

      Of course I'm still the only one reading the wiki :)
    • by WuphonsReach (684551) on Friday December 15, 2006 @06:00PM (#17262038)
      It screams Wiki to me too... until the Wiki server is offline.

      How do you handle documentation that is stored on a centralized bit of storage that may be inaccessible when the documentation is needed?

      Are there distributed wikis?
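Short of a truly distributed wiki, a cron-driven mirror answers most of the offline worry: pull each critical page's wikitext to local (and off-site) files, so a recent copy survives the wiki server going down. A sketch against MediaWiki's `action=raw` page-source interface; the wiki URL, page titles, and destination path are hypothetical:

```python
from pathlib import Path
from urllib.parse import quote
from urllib.request import urlopen

WIKI = "http://wiki.example.com/index.php"   # hypothetical wiki URL
PAGES = ["Backup_procedures", "Network/Core_switches"]

def raw_url(base: str, title: str) -> str:
    """MediaWiki serves page source at index.php?title=...&action=raw."""
    return f"{base}?title={quote(title, safe='')}&action=raw"

def mirror(base: str, titles, dest: Path) -> None:
    """Fetch each page's wikitext to a local file; run from cron so a
    recent copy exists even when the wiki itself is unreachable."""
    dest.mkdir(parents=True, exist_ok=True)
    for title in titles:
        text = urlopen(raw_url(base, title)).read().decode("utf-8")
        (dest / (title.replace("/", "_") + ".txt")).write_text(text)

# Demo of the URL construction only; wire mirror() into cron yourself.
print(raw_url(WIKI, "Backup_procedures"))
```

Plain wikitext files also remain grep-able and printable, which dovetails with the paper-copy advice below.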
      • by turbidostato (878842) on Friday December 15, 2006 @06:57PM (#17262774)
        "It screams Wiki to me to... until the Wiki server is offline."

        Scream "paper" on those occasions.

        "How do you handle documentation that is stored on a centralized bit of storage that may be inaccessible when the documentation is needed?"

        Think about it for a moment. Do you really need the "Operations manual for the Hooly Pahula branch office" when your wiki goes off-line? What for? (I won't accept "but my tech doc wiki is at our Hooly Pahula branch office" as an answer, you dork.) If you are serious about your documentation - and it seems you are, if you run a tool as hard to sustain under PHB practices, and as seemingly unprofessional from the typical PHB's point of view, as a wiki - all you need is to start the wiki page about bringing the wiki back online with something like "There's a paper copy of this page here (on-site location) and here (off-site location). Whenever you modify this page, print fresh copies and put them in said locations ASAP (or preferably even sooner)", plus a big "README FIRST" link on the wiki's front page to that recovery page, so all the people involved know about it. The same goes for the general backup/recovery practices manual.
        • by pnutjam (523990) <slashdot AT borowicz DOT org> on Tuesday December 19, 2006 @09:54AM (#17299346) Homepage Journal
          Wikis need to be on a virtual machine so they can be easily moved to good hardware.
  • by Opportunist (166417) on Friday December 15, 2006 @11:13AM (#17255344)
    Generally, you'll be hard pressed to get techs to document anything. Simple reason: if it was documented, anyone could find the junk again. Not just them.

    It's our way of securing out jobs. If you want a CD or want to know what this button does, hell, ask. You can even call us at home, even in the middle of the night, we won't even get too mad if you throw us out of our cozy beds at 10am with a call, but don't ever dare to question our way of organising things. If you ask a tech where the documentation is, he'll tip his temple and say "here".

    That way you can't fire him. In today's corporate world, it's an essential job security thing to NOT document. If you have to document it, write it down and then reshuffle everything.

    Sorry to be not too helpful, but that's simply how it is. At least for me. And now excuse me, I need to hunt down that (censored) tech, I need an MS-Office CD.
  • Media Wiki (Score:5, Insightful)

    by RingDev (879105) on Friday December 15, 2006 @11:14AM (#17255348) Homepage Journal
    I'm working hard at convincing my management to implement a Wikipedia-style documentation system. I've demoed some of the possibilities and it looks like a great tool for it. So good that I've recently installed MediaWiki for another large company looking for a documentation system. For its ease of use, configurability, and built-in functionality, it is truly a great tool.

    Now if I can just convince the last supervisor that Media Wiki is better than MS Word with Track Changes turned on (shudder!).

    • by Zadaz (950521) on Friday December 15, 2006 @11:54AM (#17256112)
      Wikis are a great tool, but only as good as the information in them. If no one's documenting before the wiki, no one is going to after.

      The last three projects I worked with had wikis (wikis, wikka, wikum?) for all aspects of the project, from spec to doc. I was told that if I had any questions, I should just annotate the wiki with my questions so people who knew could fill them in.

      In every case the wikis were about 50% stubs, and the rest of the pages were all out of date.

      About half way through the project, everyone just stopped asking questions on the wiki and did it in email instead, because the wiki was a waste of time when you needed a ready answer.

      These weren't enterprise situations, but the point still holds. You need to have the discipline and management in place, or whatever technology you use will not help you.
      • by edmudama (155475) on Friday December 15, 2006 @01:44PM (#17258112)
        Sounds like someone was lazy.

        A wiki at some level requires the generosity of the users with their own time (or else being paid to do it). After you had that exchange in email to answer a question, someone should have cut and pasted the question and answer into the wiki, so that all others could read it. There's no time wasted.

        Wiki doesn't have to be the QA forum, but the process needs to come full circle to get the information into the wiki if the original exchange is via another technique.
    • Re:Media Wiki (Score:3, Interesting)

      by truthsearch (249536) on Friday December 15, 2006 @12:00PM (#17256230) Homepage Journal
      At my company (a software company) we use Media Wiki for all internal documentation, including server and network configurations. It's working quite well. Having free-form documentation, rather than a strictly organized hierarchical model, means people are more inclined to toss in information as they think of it. For example, if I upgrade PHP on a server it takes only a few seconds to update it in the wiki. No time wasted looking through directories or document indexes.
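The "takes only a few seconds" workflow above is easy to script as well. If server facts live in a template-style block on the page, a small helper can rewrite one field; the `{{Server}}` template layout here is hypothetical, not a MediaWiki standard:

```python
import re

def set_fact(wikitext: str, key: str, value: str) -> str:
    """Update one '| key = value' line in a hypothetical server-facts
    template on a wiki page, e.g. after upgrading PHP on a box.

    Returns the page text with the fact replaced, or appended at the
    end if it wasn't there yet.
    """
    pattern = re.compile(rf"^(\|\s*{re.escape(key)}\s*=\s*).*$", re.MULTILINE)
    if pattern.search(wikitext):
        return pattern.sub(rf"\g<1>{value}", wikitext)
    return wikitext.rstrip("\n") + f"\n| {key} = {value}\n"

page = """{{Server
| hostname = web01
| php_version = 5.1.6
}}"""
print(set_fact(page, "php_version", "5.2.0"))
```

Feed the result back through the wiki's edit mechanism (or paste it in) and the documentation stays a one-line change away from reality.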
    • by djh101010 (656795) * on Friday December 15, 2006 @01:35PM (#17257992) Homepage Journal
      Now if I can just convince the last supervisor that Media Wiki is better than MS Word with Track Changes turned on (shudder!).

      There's a Word macro out there called word2wiki, by some German guy I think. Works great, and helped me overcome that last bit of social inertia here. You can write it in Word, no problem, run it through the macro, and paste the result into your wiki; it's all wikified, no (gasp!) having to learn any new tags.
    • Re:Media Wiki (Score:3, Informative)

      by Degrees (220395) <> on Friday December 15, 2006 @04:34PM (#17260732) Homepage Journal
      At least with older MediaWiki (ver. 1.4), it didn't search on IP addresses. That is to say, each octet of an IP address was too small for MySQL to index, so you couldn't search by IP address. If you knew you were looking for the Central Plant router, you were fine - but if you had an IP address and wanted to find where it was used, you were s.o.l.

      Another deficiency is that MediaWiki doesn't support image maps. Sometimes the best way to find info is to click on the picture....

    • by Just Some Guy (3352) <> on Saturday December 16, 2006 @01:57PM (#17269792) Homepage Journal
      Now if I can just convince the last supervisor that Media Wiki is better than MS Word with Track Changes turned on (shudder!).

      Easy. In event of an emergency, is your field tech going to find it easier to use his cell phone to browse the corporate Wiki or an unviewable Word document? There have been plenty of times when I've been grateful to have web access to some information I needed.

  • LIVELINK BY OPENTEXT (Score:1, Interesting)

    by Anonymous Coward on Friday December 15, 2006 @11:14AM (#17255362)
    Livelink by Open Text is simply the best solution on the market for ECM.
  • Confluence (Score:2, Interesting)

    by sof_boy (35514) on Friday December 15, 2006 @11:16AM (#17255398)
    We use Confluence, a wiki from Atlassian. It also integrates well with Jira, their bug tracking program, which we also use. Both products are popular with some open source projects, the names of which elude me at the moment.
  • Trac (Score:3, Interesting)

    by AlXtreme (223728) on Friday December 15, 2006 @11:19AM (#17255470) Homepage Journal
    Trac is what we use for network, backup and project documentation. And bug tracking. And for browsing through our projects' code. "It just works (tm)".
  • by sgt.greywar (1039430) on Friday December 15, 2006 @11:21AM (#17255502) Homepage Journal
    I work in a government-run operation as a contractor and the documentation rarely gets beyond PowerPoint slides with the basics of each WAN site on them. We are attempting to improve this through the ITIL process but have not had much luck so far.
  • by KidSock (150684) on Friday December 15, 2006 @11:21AM (#17255506)
    I'm going to need to find a solution for this as well. I want to generate a PDF manual, HTML "technotes", HTML API documentation, man pages and possibly more materials. Much of the content will appear in more than one place. It seems to me the ideal solution would use a single set of XML sources written in a custom markup specific to the content (e.g. API descriptions, code examples, etc.) and then translate that into HTML, PDF, and so on using XSLT. What I need is a word processor that understands XML and can display content with tables, footers, footnotes, SVG graphics, etc. Then I could create a template document, write the XSLT transform, generate the manual and convert it to PDF. The only product I know of that can do all the footers, TOC, footnotes, tables, graphics, etc. AND import and export XML is Microsoft Word 2003, but I'm not excited about the price and I don't usually have a Windows machine on in the office I'm in.

    Has anyone else been doing something similar? Any tips for me? I'm going to check out OpenOffice first but based on previous experiences I'm a little skeptical that it can do more than create "Lost Dog" signs.
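The single-source idea in the question doesn't strictly need a word processor at all: a script can render the same custom XML into each output format. XSLT is the natural tool; as a dependency-free sketch of the same pipeline, here is Python's ElementTree producing an HTML fragment and a man-page fragment from one source. The `<api>` markup and the `open_widget` function are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical custom markup: one source for all output formats.
SOURCE = """<api>
  <function name="open_widget">
    <summary>Open a widget by name.</summary>
    <param name="name">Widget identifier.</param>
  </function>
</api>"""

def to_html(xml_text: str) -> str:
    """Render the API description as an HTML fragment."""
    root = ET.fromstring(xml_text)
    out = []
    for fn in root.findall("function"):
        out.append(f"<h2>{fn.get('name')}</h2>")
        out.append(f"<p>{fn.findtext('summary')}</p>")
        for p in fn.findall("param"):
            out.append(f"<li><b>{p.get('name')}</b>: {p.text}</li>")
    return "\n".join(out)

def to_man(xml_text: str) -> str:
    """Render the same source as a troff/man fragment."""
    root = ET.fromstring(xml_text)
    out = []
    for fn in root.findall("function"):
        out.append(f".SH {fn.get('name').upper()}")
        out.append(fn.findtext("summary"))
    return "\n".join(out)

print(to_html(SOURCE))
```

The point is only that the content is authored once; swapping these renderers for proper XSLT stylesheets (or DocBook, as the replies suggest) changes the tooling, not the architecture.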
    • by Baricom (763970) on Friday December 15, 2006 @11:34AM (#17255746)
      What you're describing sounds a lot like DocBook []. I had difficulty getting the tool chain set up, though, so I have no practical experience using it.
      • by bahco (522962) on Friday December 15, 2006 @12:12PM (#17256436) Homepage Journal

        > What you're describing sounds a lot like DocBook.

        Yes, you want to use XML according to DocBook's schema somewhere in the chain from data entry and modification up to the generation of all presentation forms of the information stored.

        And no, you do not want to use DocBook's XML as data entry format for the technical data. (As there are [AFAIK, and please correct me if I'm wrong] no usable open source editors for XML, I doubt that you want any XML for data entry.) You can use DocBook for the narrative part of the documentation, but the source of the technical data should be in a vocabulary that is tailored to your environment, be it XML or otherwise. This technical data you transform to DocBook before transforming it to man page, PDF, ...
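A minimal sketch of that custom-vocabulary-to-DocBook step, assuming Python for the transform (the input schema here is invented, and real DocBook nests refname inside refnamediv and so on; a real pipeline would hand the result to the DocBook XSL stylesheets):

```python
import xml.etree.ElementTree as ET

# Hypothetical custom vocabulary for the technical data.
SOURCE = """<api>
  <func name="foo_new">
    <summary>Allocate a new foo.</summary>
    <param type="unsigned int" name="size"/>
    <param type="struct bar *" name="bar"/>
  </func>
</api>"""

def to_docbook(xml_text):
    """Map the custom vocabulary onto (simplified) DocBook elements."""
    out = []
    for func in ET.fromstring(xml_text).findall("func"):
        ref = ET.Element("refentry")
        ET.SubElement(ref, "refname").text = func.get("name")
        ET.SubElement(ref, "refpurpose").text = func.findtext("summary")
        for p in func.findall("param"):
            ET.SubElement(ref, "parameter").text = (
                "%s %s" % (p.get("type"), p.get("name")))
        out.append(ET.tostring(ref, encoding="unicode"))
    return "\n".join(out)

print(to_docbook(SOURCE))
```

The point is that the narrow, environment-specific vocabulary stays easy to write by hand, and only the transform needs to know DocBook.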

        At least, that is how I would do it, if I had the time. ;-|

    • by value_added (719364) on Friday December 15, 2006 @11:56AM (#17256144)
      I'm going to need to find a solution for this as well. I want to generate a PDF manual, HTML "technotes", HTML API documentation, man pages and possibly more materials. Much of the content will appear in more than one place. It seems to me the ideal solution would use a single set of XML sources written in a custom markup specific to the content (e.g. API descriptions, code examples, etc) and then translate that into HTML, PDF, and so on using XSLT.

      How about DocBook?

      DocBook is a markup language for technical documentation. It was originally intended for authoring technical documents related to computer hardware and software but it can be used for any other sort of documentation. One of the principal benefits of DocBook is that it enables its users to create document content in a presentation-neutral form that captures the logical structure of the content; that content can then be published in a variety of formats, including HTML, PDF, man pages and HTML Help, without requiring users to make any changes to the source. []

      As for the original question, I prefer a simple binder with copies of the output of whatever program can be used to generate the output: dmesg, netstat, ifconfig, Windows whatever, etc. Not exactly enterprise, but it works at a smaller scale. Past that, you're looking at first defining what you're going to document, the form of that documentation, how it's distributed or made available, blah blah blah. That's the kind of job best left to a committee or by the folks upstairs whose job it is to define and set policies.
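The binder can even be semi-automated. A sketch, where the command list is an assumption -- swap in dmesg/ifconfig or the Windows equivalents for your own boxes -- that dumps each tool's output into dated text files you can print or archive:

```python
import datetime
import pathlib
import subprocess

# Commands worth snapshotting; this list is illustrative.
COMMANDS = {
    "uname": ["uname", "-a"],
    "routes": ["netstat", "-rn"],
}

def snapshot(outdir="doc-snapshots"):
    """Write one dated text file per command; return the directory."""
    stamp = datetime.date.today().isoformat()
    path = pathlib.Path(outdir)
    path.mkdir(parents=True, exist_ok=True)
    for name, cmd in COMMANDS.items():
        try:
            text = subprocess.run(cmd, capture_output=True, text=True,
                                  timeout=30).stdout
        except (OSError, subprocess.SubprocessError) as err:
            # Tool missing or hung on this host; record that instead.
            text = "unavailable: %s\n" % err
        (path / ("%s-%s.txt" % (name, stamp))).write_text(text)
    return path
```

Run it from cron and the binder's contents stay current instead of rotting.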
    • by slide-rule (153968) on Friday December 15, 2006 @01:49PM (#17258186)
Our co. has some OO.o template files (*.ott) set up for developers to use ... they enter info into the form in prescribed places and then some XSLT (etc) I wrote converts the underlying *.odt file's XML into HTML. Our usage is all for in-house purposes. It is doable, but there are some minor niggles in the overall process. If the person filling in the info doesn't do what OO.o wants them to do, exactly how it wants them to, the underlying XML file -- technically preserving the info accurately -- might look a little shuffled up to your first-draft XSLT process. You can really only go so far in the XSLT domain before you just have the process throw an error message, after which you walk down to someone's office and smack their knuckles for doing it the wrong way.

      • by KidSock (150684) on Friday December 15, 2006 @05:44PM (#17261838)
        they enter info into the form in proscribed places and then some XSLT (etc) I wrote converts the underlying *.odt file's XML into

Actually I was only going to use the template to create the XSLT template. Then I was going to write the XML using a custom schema BY HAND. I know that sounds a little nuts but I'm a lot faster in vim than in any word processor. Then I run the XSLT processor and generate a hopefully valid OO.o document. From there I'll tweak as necessary (does OO.o support macros?), print to PS and convert to PDF using ps2pdf. But that's just for the manual. Everything else is HTML and I have a transform for man pages that I'm using already.

        From your experience do you see any problems with this technique? Does OO.o support TOCs and footers and all that stuff? Last I checked it didn't even come close but I have to admit it's been years. I suppose I should just try it before coughing up $400 for Word.
        • by slide-rule (153968) on Friday December 15, 2006 @06:59PM (#17262788)

Yes, for any reasonable "document" need, OO.o seems to support things. Figuring out the interface is another matter -- it is wholly counterintuitive how to do things in OO.o, but it does tables (and handles breaking them across a page), running headers and footers, TOCs, etc. The real trick to OO.o is that everything is controlled via some "style" selection, so a few things logically drive differently than in Word. In other words, the 'zen' is different. Some things are just plain stupid: some knob related to a page property seemed to be in a paragraph property area. But the *.odt file is fairly accessible. What I did was start w/ a blank page and add just one thing to it: a header, or a table, or a graphic, then I unzipped the odt and started mucking with the XML files to learn how/where it stores things. It's about the easiest way I can think of to digest the format.

On the other hand, if you're gonna insist on authoring in XML, you might look into DITA. Our co. also has a tech guy or two that converts dev-speak into English and then into DITA xml files. Our deliverable docs are run through an in-house customized version of DITA OT 1.2.1 to create XSL-FO code (and then through fop to get PDF files). DITA OT 1.2.1 also converts to (X)HTML, but we haven't played with that yet. The upshot is our docs are part of our nightly build process: the XML files are checked out, processed, and turned into PDFs every single day. What is really really painful, though, is to convert DITA XML into *.odt-based XML. I got just enough of it working to work, but it's a fairly horrid XSLT file to write.

          • by KidSock (150684) on Friday December 15, 2006 @08:29PM (#17263670)
            Thanks, this is exactly what I wanted to know. Sounds like OO.o will work. DITA looks right on but I already have a schema that has some features I really need. For example I can have a function prototype like <meth><pre>struct foo *foo_new(unsigned int size, struct bar *bar);</pre></meth> and my HTML reference and man page XSL transforms will isolate each parameter making them bold (and in the future I could link each param). DITA didn't look like it had this. The way I see it I pretty much have to write all the XML and XSLT no matter what I do so I'm not convinced something like DITA or Docbook would help me. If I were writing a thesis maybe, but custom technical docs I don't know.
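That parameter isolation can also be approximated with a naive split in the transform layer. A standalone sketch of the idea (regex only; it deliberately doesn't handle function-pointer or array arguments):

```python
import re

def split_params(prototype):
    """Split a C prototype's parameter list so a transform can wrap
    each parameter individually (bold it, link it, ...)."""
    m = re.search(r"\((.*)\)", prototype)
    if not m or not m.group(1).strip():
        return []
    # Naive comma split: fine for simple prototypes, wrong for
    # function-pointer arguments like void (*cb)(int, int).
    return [p.strip() for p in m.group(1).split(",")]

proto = "struct foo *foo_new(unsigned int size, struct bar *bar);"
print(split_params(proto))  # ['unsigned int size', 'struct bar *bar']
```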

            Are you using an OSS solution for FOP? Last I checked Apache FOP looked a little crunchy so I was going to do XML -> .odt -> OO.o -> PostScript -> PDF. But it would be delightful if there was a really good OSS FOP processor out there.
            • by slide-rule (153968) on Friday December 15, 2006 @09:32PM (#17264254)

              OO.o is working well enough that 99% of the company desktops now only have OO.o installed. (A big part of the equation is that we're cheapskates. ;-) We allow the HR lady to keep her copy for resume handling purposes. The two tech writers have it for legacy document purposes. Everyone else from our boss on down has OO.o. We're an IT shop, granted, but it does what we need it to.

              Part of the alleged usefulness of DITA is you can extend the default functionality to add things you need. But, if you already have a schema worked out and have some work done in that direction, no point trying to fold it over into DITA... it wouldn't be an easy thing to do. (We've done minor extensions to better annotate various bits of legalese, product information, etc... but certainly nothing of any real scale).

We're using (apache) fop 0.20.5 as we use free/OSS where possible (damn that exchange server!) ... while there are some higher revision numbers for fop, they seemed to break/drop support for various parts of the XSL-FO spec we had been using. We're sacrificing on only one sticking point: "keeps" ... that is, keep this fo:block adjacent on a page with the following fo:block, possibly by breaking the page sooner than otherwise needed. (Also impacts table keeps.) Other than that, we're generally happy enough. We have one deliverable that numbers above 400 pages when fully assembled and rendered to PDF, complete with PDF bookmarks, TOC, front matter, legalese, back matter ... fop doesn't have a problem with it (though that guide eats a lot of memory to assemble, fop hangs right in the game). And for those lurking on the thread: moving the material out of a word processor format into DITA xml topic files has been a godsend ... re-use of content is trivial, and everything stays up to date with one file change. Anyway, best of luck.

  • by Bandman (86149) <> on Friday December 15, 2006 @11:25AM (#17255584) Homepage
    We've setup an internal Wiki site using the MediaWiki [] software.
  • by Anonymous Coward on Friday December 15, 2006 @11:28AM (#17255624)
    make sure you are consistent with the industry... []
  • by Anonymous Coward on Friday December 15, 2006 @11:28AM (#17255632)
I'm a techie. I know how to program, manage networks, install & configure domain controllers, and I can rattle off hundreds of Unix CLI tools.
However, my writing for non-techies sucks.
Companies: once your IT department hits about twenty people, you need to hire a technical writer or a documentation specialist.
When you get ten or fifteen geek-nerds contributing to one document (e.g. "the disaster recovery scenario"), the document WILL be a mess.

  • by bsd4me (759597) on Friday December 15, 2006 @11:56AM (#17256148)

    We have a small, internal Mediawiki installation for documenting things like this. I have found that more people actually document things this way.

I also like an online tool for tracking software versions. I have a page that lists all of the F/OSS software that we have installed, along with the installed version number, the latest version number, and the URL to the distribution page. Once a week I have an intern go through and update the latest version numbers. I get notified about changes, and then we can make the decision about whether to install the new version.
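The check itself is trivial once the data is structured. A sketch, where the package data is invented and would really come from the wiki page or a flat file, and plain string inequality stands in for real version comparison:

```python
# name -> (installed, latest); in practice parsed from the wiki page.
PACKAGES = {
    "apache": ("2.2.3", "2.2.4"),
    "openssh": ("4.5p1", "4.5p1"),
    "postfix": ("2.3.5", "2.3.6"),
}

def outdated(packages):
    """Return package names whose latest version differs from installed.

    Simple inequality, not a semantic-version compare -- good enough
    when an intern keeps the 'latest' column current by hand.
    """
    return sorted(name for name, (installed, latest) in packages.items()
                  if installed != latest)

print(outdated(PACKAGES))  # ['apache', 'postfix']
```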

  • by Colin Smith (2679) on Friday December 15, 2006 @11:58AM (#17256170)
    Two types of document management. Egroupware document management for policy style docs which describe the way things should be done and a wiki to describe the way things are actually implemented. Strategy vs tactics.

  • by TheOldBear (681288) on Friday December 15, 2006 @12:00PM (#17256212)
    Well, solution is too strong a word.

    Word processing documents, scattered seemingly at random on a shared disk drive.

The organization has Lotus Notes - but does not use it [I'm thinking of the Team Room template - it's sort of like the Confluence Wiki in capability]. The corporate culture is allergic to any non-M$ documentation solution - even for a new flagship project that has been in progress for just over a year.
    • by RingDev (879105) on Friday December 15, 2006 @01:04PM (#17257418) Homepage Journal
That's the same kind of crap my supervisor is sticking to. Word docs in random places on the network. A few months ago he decided to reorganize them. That royally screwed with everyone and we are still having issues finding documents. Even with Google Desktop installed (the only way I could handle this crap) I still wind up grabbing cached copies half the time.

We have no change tracking, no access control, no versioning, etc. on most of the docs. Some of the docs are checked into Visual SourceSafe, which is at least something, but it means making a one-line change to a document is now a 5-minute process. The supervisor is very insistent that 'Track Changes' is turned on, even though it makes documents unreadable and provides no real value for version tracking on the document.

In any case, documentation through assorted Word documents has to be one of the worst solutions I've had to deal with in an enterprise environment. Even Lotus Notes has better tools for this type of information collaboration.

  • by raist_online (522240) on Friday December 15, 2006 @12:01PM (#17256248) Homepage
    I've been thinking about this for some time - my job involves being in the team looking after a big university data centre(s) and for some time now we have been seeking solutions for documenting our networks, applications, topology etc.

So far we've deployed Nagios for monitoring and rolled our own blog for notes / comments on servers and services.

I would like to do some more integration, possibly utilising Rackview. DCML showed some promise but now seems dead.

    If anyone is further down this path, I'd really appreciate some input, otherwise I can release the first stage design specs from our project and see if we can build a community around that.
  • by rmerrill11 (308424) on Friday December 15, 2006 @12:16PM (#17256522)
    There are several nice wiki solutions, but Confluence wiki [] does the best job of meeting our corporate standards, and we are in the process of migrating all our documentation to it.

    The key points for us:

    • Supports page level access controls
    • Integrates with external authentication system (LDAP/Active Directory)
    • Runs on a Java Application Server
    Good luck!
  • by jar240 (760653) on Friday December 15, 2006 @12:24PM (#17256652)
    what is this... earth documentation?
  • by iJed (594606) on Friday December 15, 2006 @12:24PM (#17256680) Homepage

    While not specifically for the uses stated in the article, we use MediaWiki [] for all our documentation nowadays. This has replaced the dreadful Lotus Notes as our documentation management system.

  • by rs232 (849320) on Friday December 15, 2006 @12:35PM (#17256908)
    Put everything in the one folder and name the documents a##### where # stands for 0-9 - a true story.
  • by alancdavis (677086) on Friday December 15, 2006 @12:35PM (#17256910)
    I'm currently using MediaWiki in a two-pronged manner - I keep my daily and rough notes under my User: space - after 20+ years it's gotten to be a habit to make notes about /everything/ I do.

    These notes become source material for the "real" Wiki entries that have all the nice (well - it's /still/ a wiki) formatting and complete information.

I've also used Forrest successfully in the past. Forrest accepts OOo XML as an input format. As long as you use the styles in the sample doc from the Forrest distribution it renders cleanly.

    Forrest can then output in a variety of formats, including PDF, to make generating offline site documentation for disaster recovery guides a /much/ simpler task - which means there's a better chance of it staying current.

  • Document for Life (Score:4, Interesting)

    by jafiwam (310805) on Friday December 15, 2006 @12:47PM (#17257142) Homepage Journal
    Documentation is not a project you finish.

    It's something you do as best you can in-between other stuff. (Preferably starting with the stuff you are working on already.)

    Then, the next time you do that, just go back and open the document and update it as you go through.

    In our small company, we use a scattering of web sites (SharePoint or FrontPage based), network folders, individual "not done yet" documents, and a (yick) Wiki. I would like for us to use "Public Folders" on our exchange server as it doesn't involve teaching staff members to do stuff they don't already know how to do. (Some folks are not technical enough to even handle a Wiki.)

    You just keep at it, and over the years you get better stuff as a collective whole. Be sure to clean out the stuff that is no longer valid, (but maybe keep it archived).

EVERYBODY needs to be writing it. I figure for every full-time, difficult-to-learn job, there are about two full-time documentation jobs. So don't worry if it doesn't ever get complete. It won't, and for the most part it doesn't HAVE TO.

    Also, for everyone's sake, get a dual monitor setup so you can easily document while you work on the other screen. Since our staff got two or more monitors, documentation creation rates have skyrocketed.

    Of course, if you are a regulated body or get audits, it's a really good idea to review all your requirements for that once in a while so you don't waste effort doing the documentation wrong.
    • by RMH101 (636144) on Wednesday December 20, 2006 @08:28AM (#17311120)
      you need to have a policy where all departments, projects, etc *accept responsibility for keeping their documentation up to date*. have a standard document set/framework. use a standard template. standard naming convention. don't get too hung up on the technology - an excel spreadsheet and a load of text files will do the job, you don't have to use documentum or anything.
      standardise as much as possible, ensure people have a backup in case they get hit by a bus, and store master passwords in the safe in a sealed, signed envelope...
  • by smooth wombat (796938) on Friday December 15, 2006 @01:02PM (#17257398) Homepage Journal
    I know I've written it in a previous post but when documenting a procedure, installing a piece of software for instance, my documentation starts with "Insert CD" and ends with "Remove CD". Every step along the way, every instance of clicking Yes/OK/No/Cancel/whatever, is documented.

    As far as the network itself is concerned, I'm in the process of physically visiting every pc and printer in our building, writing down its name and cable number then putting that information into a spreadsheet which also has what switch the equipment is on and what port, with each switch having its own tab. I also do updates to machines if people aren't at them.
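A plain CSV makes a good machine-readable twin of that spreadsheet. A minimal sketch, where the column names are just my reading of the fields described:

```python
import csv
import io

# One row per device: name, cable number, switch, port.
FIELDS = ["name", "cable", "switch", "port"]

def write_inventory(rows, fh):
    """Write the device inventory as CSV with a header row."""
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)

rows = [
    {"name": "pc-0142", "cable": "C-17", "switch": "sw-3f", "port": "12"},
    {"name": "prn-lobby", "cable": "C-44", "switch": "sw-1f", "port": "3"},
]
buf = io.StringIO()
write_inventory(rows, buf)
print(buf.getvalue())
```

The CSV can then feed whatever spreadsheet or script needs it, and diffing two snapshots shows exactly what moved.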

    CiscoWorks gives me the switch and port info so that is the easy part.

Before I left my previous job, I did a knowledge transfer for our SAN with the guy who would be dealing with it. I worked with him for two months so he understood how the physical connections worked, why they were connected to both sides of the SAN switch, the importance of keeping your cable numbers accurate, how to add devices to the SAN, creating LUNs, the whole works. He documented everything and expanded upon what I had already done, including screenshots, in a binder so (hopefully) anyone else who has to deal with it can follow the pictures. The best part was the physical layout of the SAN switch. All anyone had to do was have the printout, hold it up at arm's length and they could see exactly what device was on what port and what adapter was on what side.

    I also documented everything I did with printers so, as I told people, "When I get run over by cars who refuse to stop at the red light as I'm crossing the street, any idiot can pick up where I left off." Every printer, including model, IP, location, name, etc was kept in a spreadsheet as well. There were only 800 or so to deal with. I guess I could have memorized everything.

    Sadly, I've found out that since I've left, things aren't anywhere near what they were when I was there so apparently the idiots that are still there can't follow simple directions.

So yes, documentation is critical. Everything, no matter how minute, must be written down, labeled, etc. I'm doing my best at this location to bring some of that mentality to bear but it's going to be a long and tedious process. Try doing a Visual Studio install on a machine and getting "Error code 103" or "The system cannot find _setup.dll which is necessary to complete the installation" without documentation on how to work around the messages. Of course, if the programmers who wrote the installation programs for Visual Studio had known what they were doing, these messages wouldn't occur. But that's a different story.
  • by skinfitz (564041) on Friday December 15, 2006 @01:05PM (#17257434) Journal
    How do you blueprint your entire IT infrastructure so that someone brand new could start and figure out what does what?

    You must be new here.

If any admin were to document things so that someone brand new could just step into their shoes, they would lose a serious hedge against getting 'downsized' at the next opportunity.

Reasons for getting rid of good admins usually come down to the fact that we are the proverbial housewives of organisations - the only way to show you what we do is not to do it. Many managers get complacent and start thinking along the lines of "we never have problems, so why do we need these people?" That, and admin culture does actively encourage the denigration of users, often to their face, which many take offence to. If managers start getting big ideas about replacing the admins whenever they get upset at having it pointed out just how stupid they are - for example the lady this week who emailed us to complain her email was not working - then good documentation that anyone 'brand new' could follow is a Dangerous Thing.
  • by Joe The Dragon (967727) on Friday December 15, 2006 @01:13PM (#17257602)
    A lot of places use them.
  • by dJCL (183345) on Friday December 15, 2006 @04:46PM (#17260984) Homepage
We got our in-house developers to write a proper helpdesk system with proper asset tracking. It supports multiple sites, users, and mapping of physical locations; it now even updates our DNS and will soon be updating our firewall configurations.

    For anything beyond that, company wiki sometimes linking to files on the fileshare.

/plug: Still looking for techs in Ottawa; need good MS skills, some Linux, and experience on the phone. Talk to me and if you're qualified, I will get you a job here - I have gotten 3 others already.


by gtoomey (528943) on Friday December 15, 2006 @07:16PM (#17262998) agement.asp [] It's very much enterprise level. The inane comments here make me wonder if any of you have a job at all.
  • by thewiz (24994) * on Friday December 15, 2006 @10:26PM (#17264674)
    I'm curious as to what people have been using as for doing things like documenting how their backups work, LAN settings, FW settings, where and what runs what services, and so forth. How do you blueprint your entire IT infrastructure so that someone brand new could start and figure out what does what?

I could tell you, but then I'd have to send you to Abu Ghraib.
  • by canuck57 (662392) on Saturday December 16, 2006 @04:54PM (#17271266)

It has been years since I worked at an organization that was truly effective at dealing with Enterprise Documentation. More commonly it is a mix of emails and many dozens of shares in what seems like a billion diverse places, including local PCs and home computer systems - none of which are friendly to new starts on a project or at a company. Fragmented at best.

    How this highly effective organization did it was simple:

• everyone had the same set of tools, no exceptions. If they couldn't afford the tools for everyone, they didn't use or endorse them for anyone.
• they used common formats, often just text files, the idea being that they could still be read 3-5 years later. Sometimes the format changed, but a clear upgrade path was provided.
    • open discussions for all projects were available, and constructive input in writing from any concerned was encouraged.
• spelling and formatting were not policed too brutally
• writing skills were encouraged; the mentality was that if you couldn't write it up, you likely didn't know what you were doing.
• people who did not comply with the culture were let go - even if otherwise competent, they were not viewed as team players.
• people who said it was not documented enough, but really didn't know what they were doing, were either trained/mentored or let go.
    • No extra points for filler either. Cut and paste of vendor manuals was not encouraged.
    • everything, and I mean everything was posted into "nntp news groups". If a hard drive was replaced, it went into hardware maintenance section under the device/server. If it was proposed plans to change mail routing the discussion would be in software mail routing.
• even non-I/T business used it. Sending mail to more than 2-3 recipients had better be considered very confidential, or your manager would ask why it wasn't posted in the groups. Email blasting, a plague of today's culture, was - well - severely dealt with. People missed raises for this.
• even vendors had their news groups. Vendors hated this: if they screwed department A, they would have to help out department A before department B would deal with them.
• only the discussion groups expired (after 6 months); many document groups never expired.
    • if MS-Word was used, a synopsis with an attachment was often posted.
    • each news group had a moderator for cleanup.
    • even the CEO and CIO often posted. Marketing would even jump in.
    • custom software used a version control system.
    • commercial software had a librarian who managed, filed and controlled all software that was bought. Surprising how many overbought licenses occur. You checked out the media and checked it back in.

    Now the above could use the same tools today but a little modernization is in order. Pick a Wiki, pick a common version control system and perhaps Slashdot code for discussions -- and make the policies. More importantly, vigorously enforce the policies.

Be prepared: depending on your organization's discipline, expect 10% or more to quit or be fired. Many people are solo cowboys and will not document and participate. Ask the most resistant to leave. Take the best of those that participate and send them on a week-long course of their choice.

Like Slashdot, using an online discussion mechanism discouraged dysfunctional politics - after all, the CEO might read it. Better yet, if you were new you just pulled a list and subscribed to what was of interest, skipping what was of no concern, often seeing years of history on an application or piece of hardware in one well-known place for quick background on the reasons behind it.

Enforcement was easy during budget time: no news group with online docs, no money.

Fewer phone calls too; operations often had the change in the groups and got the right support more often... it was good to have worked there.

"History is a tool used by politicians to justify their intentions." -- Ted Koppel