What's Your Command Line Judo?

lousyd asks: "We all have our CLI amor. That two- or three-letter command that fiddles our heartstrings. 'mv' is pedestrian, but 'mmv' is so cool. 'cp' can lug bits around all day to no applause, but 'cpio' will excite even the most jaded of command line linguists. 'How perfect!' you exclaim. So what's your poison? What turns you on? What little known command performs just the right function for you?"
  • hrm. (Score:2, Funny)

    by grub ( 11606 )

    foad works every time.
    • ZTree and UnixTree (Score:4, Interesting)

      by Glonoinha ( 587375 ) on Sunday October 02, 2005 @11:03AM (#13698350) Journal
      Yea, Though I compute through the user interface of the shadow of death, I shall fear no evil, for ZTree is with me.

      No joke - ZTree is a character-for-character re-write of an old utility called XTree Gold v2.0 or 2.5 - and it is by far the most effective and influential interface between me and my data. The entire file system becomes an extension of my mental processes, and I can slice and dice through the multiple dimensions (time, space, attributes, multi-layered directory structure, multi-drive architecture), in effect creating a virtual directory within which I control what is listed and in what order - then copy, view, move, delete, diff (file compare), view and edit in hex mode, search for text across lists of files, and compare directory trees for like or different files (binary, time stamp, etc.).

      It is totally CUI (character UI), much like Midnight Commander but a ton better. Take time to get fluent in ZTree (UnixTree for Unix / Linux, a bit older and with a few quirks, but still pretty damn good) and you will be like the guys in the Matrix sitting at their green-screen terminals.

      ZTree [ztree.com] Don't leave $HOME without it.
  • lh and hog (Score:4, Informative)

    by newsblaze ( 894675 ) * on Saturday October 01, 2005 @06:57PM (#13695192) Homepage Journal
    My favourites are

    lh ls -lt !* | head -15

    which shows me the newest 15 stories in the current or specified directory

    hog "ps -eo pcpu,vsz,args,time | sort -rn | head -11"

    which shows the 10 most CPU-intensive processes
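
    For bash users, shell functions along these lines would do roughly the same job (a sketch based on the commands above, not the poster's exact definitions):

    lh()  { ls -lt "$@" | head -15; }                             # newest 15 entries in the current or given directory
    hog() { ps -eo pcpu,vsz,args,time | sort -rn | head -11; }    # most CPU-hungry processes first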
  • can't eat just one (Score:5, Informative)

    by yagu ( 721525 ) * <{yayagu} {at} {gmail.com}> on Saturday October 01, 2005 @06:59PM (#13695203) Journal

    Command line judo? Sheesh! Where to start?

    Okay, the tool I'm using now: vim, derivative of Bill Joy's vi, with color syntax and a bounty of enhancements. (Yes, I prepare my comments in vi, then cut and paste, don't even try to make me use some GUI text widget editor and claim it can be productive.)

    And then there's:

    • sed, or even ed. ed is great for scripting edit changes if you don't have the heart to master sed. I once used "sed" to create nroff output of document versions with delete markers (asterisks), change markers, and insertion markers from our technical documentation library.
    • awk. I couldn't have done half the work I've done in my career without this one. I've watched entire teams scramble to solve some problem in Visual Studio, then cranked out a twenty-line awk script that afternoon. Problem solved. (A small illustrative one-liner follows this list.)
    • [ef]grep. If you're going to pipeline some shit, it's nice to filter it first (and sometimes second, then third, etc....). The grep family is second to none for this. (Funny story: I once worked at Microsoft. Was doing some stuff in DOS. Asked what command I'd use to find a string in a file or files. They showed me "FIND". Okay. I typed in:
      find mystring *.*
      and literally got back:
      mystring found in *.*, and then a listing of all the lines found. No reference to the files they were in... Shit, after all, I asked to find the lines in '*.*'.

      I complained. They showed me their (Microsoft's purchased from IBM) unix, "xenix", and their "grep" command. Ahhhhh, better. I typed: grep -i mystring *.*
      and it replied "unknown option -i". I complained about not having an "ignore case" option. They looked at me like I was crazy... "Why would you ever want to ignore case?"

    • diff. A life saver. 'Nuff said.
    • find. Freak! When I first met this command I said "go away!". Talk about an obtuse syntax and paradigm. Learn it! It's worth it.
    • xargs. Nice way to get around the line length restrictions of some of the unix shells.
    • zsh, bash, (but not csh)... Any interactive shell with a good history and command line completion mechanism. I never have to remember command syntax anymore... usually I can find a usable and editable example somewhere in the last 2000 commands I've executed.
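
    For a flavour of the awk item above, a small illustrative one-liner (not from the original post) that sums the sizes of your own files from ls output:

    ls -l | awk -v user="$USER" '$3 == user { bytes += $5; n++ } END { printf "%d files, %d bytes\n", n, bytes }'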

    I could go on, and probably will in some subsequent posts. When you have so many well written, well evolved, well crafted, and well behaved tools all flying in

    • by ATomkins ( 564078 )
      I complained. They [Microsoft] showed me their (Microsoft's purchased from IBM) unix, "xenix"...

      Sorry to be picky, but Microsoft bought a license for UNIX from AT&T, developed Xenix, then sold it to SCO who would transform it into SCO UNIX.

    • I actually use a shell script in my .bash_logout that appends my .history file to a master "history" file. It's helped me numerous times when I've needed to recall a command I ran ages ago. I just grep successively until I find the obscure command string.

      I don't mean that I was looking for a single command; I use bash pretty hardcore, and I've used pipes to do some pretty extensive things - that's where I think history has its greatest use.

      (and yes, I'm using bash 3 and my HISTSIZE envvar is set to 1000)
  • Here's mine (Score:5, Funny)

    by Henry V .009 ( 518000 ) on Saturday October 01, 2005 @07:00PM (#13695207) Journal
    My most important tools as a system admin:
    alias rm 'rm -rf \!*'
    alias hose kill -9 '`ps -augxww | grep \!* | awk \'{print $2}\'`'
    alias kill 'kill -9 \!* ; kill -9 \!* ; kill -9 \!*'
    alias renice 'echo Renice\? You must mean kill -9.; kill -9 \!*'
  • I found this little php script called "cp.php" which I renamed pcp and use for mundane cp whenever I'm in a tty and hafta do lotsa copying. It gives lotsa info about progress of copying and such, which I really like compared to the rather spartan cp function. It may be somewhat superfluous, but it's nice eye candy in a bash, and nice for moving lotsa big files.

    Someone with more Google skills than I might be able to find it online. I got it from sourceforge originally, but it doesn't seem to show up in thei

    • cp -g (Score:3, Informative)

      by ReKleSS ( 749007 )
      I don't know how much information your script gives, but cp -g gives you a progress bar, transfer rate, completion percentage and other stuff when a transfer is going to take more than a few seconds (similar to what scp gives you). It's sometimes a handy feature to have, sounds like what your script is doing...
      -ReK
      • Haha, I am clearly a CLI noob! Thank you =]
  • I was on a special project, tucked away in a secret office in Seattle. We were creating a workbench of sorts using one of the very first PC networks in our company (it was 1986). Five of us created hundreds of files and thousands of lines of code and balanced version control ad hoc and on-the-fly. The project was a success.

    One day though, I'd transferred an image of our work from one machine to another. When I "ls'ed", oooops, I mean "dir'ed" the directory structures on the two machines, they did not match...

  • CTRL + L (Score:3, Insightful)

    by Kizzle ( 555439 ) on Saturday October 01, 2005 @07:11PM (#13695252)
    Every person I see using linux clears the screen by typing CLS. The key combo "CTRL + L" is so much sexier.
    • Re:CTRL + L (Score:3, Interesting)

      by pclminion ( 145572 )
      CTRL+L in almost any terminal application forces a terminal redraw. Comes in handy when the application doesn't explicitly detect SIGWINCH, and you want to resize the terminal window. Resize, then hit CTRL+L to inform the app that the window size changed.

      I wish the behavior was the same in bash, but for some reason they chose to make it clear the screen instead.

    • cls is DOS. clear is UNIX. ^L is a cool one though, thanks!
    • Other useful control combinations:

      ^D -- end of file, logs out of most shells, and exits programs looking for user input
      ^S -- stop terminal output (useful if you're watching make output and see something scroll by too quickly)
      ^Q -- resume terminal output. Before learning this one, every time I accidentally hit ^S (save reflex), I had to kill the terminal
      ^Z -- suspend. Stops a foreground process and gives you your command line back. Useful for GUI apps you started without '&'. Also useful with the 'bg' built-in.
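
      A typical ^Z-and-bg dance looks like this (a sketch; 'gimp' is just a stand-in program):

      $ gimp      # oops, forgot the trailing '&'
      ^Z          # suspend it; the prompt comes back
      $ bg        # let it keep running in the background
      $ jobs      # confirm it's still there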
  • skill (Score:3, Insightful)

    by cgenman ( 325138 ) on Saturday October 01, 2005 @07:15PM (#13695264) Homepage
    I'm not much of a linux guru, so my favorite is simple.

    skill [about.com]

    s-kill, basically, kills a process by name. "skill netscape" will kill netscape, no finding proc ID required. It's what kill should have been from the beginning.

    The only command I love more than skill is apt-get, but that doesn't really count.
    • man skill : skill, snice - send a signal or report process status There's also: man killall : killall - kill processes by name I got annoyed by having to find the proc ID in top too, and stumbled across killall. Haha =]
      • Oops... I forgot my html tags =[ Shoulda been:

        man skill : skill, snice - send a signal or report process status

        There's also:
        man killall : killall - kill processes by name

        I got annoyed by having to find the proc ID in top too, and stumbled across killall. Haha =]

      • kill `ps auxww | grep -i processname | grep -v grep | awk '{ print $2 }'` ... only wimps use pgrep and pkill..!

    • Re:skill (Score:4, Informative)

      by Fweeky ( 41046 ) on Saturday October 01, 2005 @07:36PM (#13695349) Homepage
      Looks like killall to me:
      SYNOPSIS
          killall [-delmsvz] [-help] [-j jid] [-u user] [-t tty] [-c procname]
                  [-SIGNAL] [procname ...]
       
      DESCRIPTION
          The killall utility kills processes selected by name, as opposed to the
          selection by pid as done by kill(1). By default, it will send a TERM
          signal to all processes with a real UID identical to the caller of
          killall that match the name procname. The super-user is allowed to kill
          any process.
      Though you need to be careful with that habit if you also admin systems like Solaris, where killall really does kill *all* processes ;)

      pkill/grep are nice too, and are standard on a fair few systems now.
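
      For example (a sketch; the user and process names are made up):

      pgrep -l -u alice firefox    # list alice's firefox processes (pid and name)
      pkill -u alice firefox       # send them TERM by name, no ps|grep|awk needed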
  • What little-known command performs just the right function for me?

    strip, of course.

  • by bergeron76 ( 176351 ) * on Saturday October 01, 2005 @07:27PM (#13695311) Homepage
    I never figured out a way to use 'ls' to show only directories (and not their subcontents), so I created an alias called 'lsd':

    alias lsd='\ls -l | grep "drwx"'

    and placed it in my .bash_profile

    It's quite useful, but it doesn't work well with shell scripts.
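
    For what it's worth, a couple of alternatives that behave better inside scripts (sketches, not the poster's code):

    ls -d */                      # directories in the current dir (skips dot-dirs)
    find . -maxdepth 1 -type d    # includes hidden directories (and '.' itself)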

  • emacs, whose shell mode and keyboard macros have totally replaced the crude shell-or-worse scripts that one writes and debugs to perform repetitive daily tasks.
  • perl (Score:5, Insightful)

    by Turmio ( 29215 ) on Saturday October 01, 2005 @07:33PM (#13695335) Homepage
    First, I must comment on the article. The question asks, "What little known command performs just the right function for you?" I hope all sane people here (haha) would answer "none". There's no command that does just the right function, because there's no single The Function. What the right function is depends on the situation, and in that case, per the Unix philosophy - one tool does one simple job, but does it well - you should choose the tool accordingly.

    Enough of that. If you really must name something, then, in my opinion, there's one gizmo above others. And that is

    perl

    Perl one-liners are a damn powerful concept once you get them. Say one of your boxen switches IP address. You want to replace all references to the old IP address in files under /etc with the new one. First you might think of searching all the files under /etc with find(1), passing the list of files to grep(1), and then manually editing the places where the old IP address was found with your $EDITOR. That's fine and will get the job done and all, but what if you could just edit the files in place? With perl, you can.

    perl -pne's/oldip/newip/g' -i `find /etc -type f`

    and you're done (better be extra careful with commands like that for obvious reasons!). Of course you're able to do the same thing with other tools too, but I don't think it could be much easier than that. And naturally you're not just limited to simple search and replace of text, you have the full power of Perl (and CPAN [cpan.org]!) at your disposal.

    Besides being my number one choice for creating complex scripts and small applications, Perl has very special place in my command line toolbox just next to the old friends such as grep(1), cut(1), wc(1), etc. and a huge pile of pipes :)
    • Re:perl (Score:3, Interesting)

      Second that. "Traditional" Unix tools sometimes just don't cut it (or they do, but require significantly more typing and multi-piping work).

      I use this all the time:

      $ perlrename -e's/.\[www.descargasweb.net\]//' *
      $ perlrename -e's/.\[www.\S+?\]//' *
      $ perlrename -e'$_=lc' *
      $ perlrename -de's/Sex And The City - S(\d+)E(\d+)/SATC - $1$2/ig' Sex*
      etc etc

      where perlrename is a 30-something-line perl script.
    • perl -pne's/oldip/newip/g' -i `find /etc -type f`

      Huh? `man perlrun` says -p overrides -n, so I don't see how combining them is supposed to work.
      Either way, it rewrites every file under /etc, not only the ones which contain the 'oldip' string.
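
      A variant that touches only the files actually containing the old address, and avoids the backtick length limit, might be (a sketch; assumes GNU grep):

      grep -rl 'oldip' /etc | xargs perl -pi -e 's/oldip/newip/g'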
    • How platform specific you are!

      Perl / PHP is not a killer solution for everything. Particularly, if you use an embedded system that doesn't have php/perl installed, you'll find that bash/busybox is very much your friend.

      That said, I love php and don't know what I'd do without it (aside from using bash for everything).

  • groff (Score:2, Interesting)

    groff -ms -t -Tascii <file>.ms
    is how I code all my outgoing emails; that's GNU troff with the ms macro package and tbl preprocessor.

    Justifies said paragraphs nicely; provides lists and sig art.

  • Junction for Windows (Score:4, Informative)

    by alan_dershowitz ( 586542 ) on Saturday October 01, 2005 @07:43PM (#13695376)
    Junction [sysinternals.com] lets you make symlinks in Windows without installing the entire Windows Resource Kit tools. Also, CACLS.EXE for changing ACLs in Windows via the command line, since I have no fucking clue where you do this in the GUI. Some of the more useful CLI commands in Windows, IMO. I hope this discussion wasn't limited to Unix or anything.
    • I hope this discussion wasn't limited to Unix or anything.

      It shouldn't be. Another source of good command line tools is sysinternals.com.

      The pstools let you do all kinds of nifty things that are sometimes covered by RK utils, but the sysinternals stuff is usually way better... ...with one exception that drives me crazy. pslist is their version of ps, which I just alias using 4NT, however they don't echo the banner to stderr like most utils, so if you sort the list looking to see if you need to kill and er
    • by BrynM ( 217883 ) * on Saturday October 01, 2005 @08:26PM (#13695555) Homepage Journal
      without installing the entire Windows Resource Kit tools
      There are several RK tools in my MS toolbox, but the best thing is having real unix utils [sourceforge.net]. Pop those in your %PATH% and enjoy some of the same fun that's being spoken of here. Of course there's always Cygwin, but these native ports are handy to keep on a USB drive and don't need any configuration/installation at all.
  • dusort (Score:3, Informative)

    by RGRistroph ( 86936 ) <rgristroph@gmail.com> on Saturday October 01, 2005 @07:48PM (#13695399) Homepage
    in bash:

    function dusort ()
    {
      du -s "$@" | sort -r -n |
        awk '{sum+=$1;printf("%9d %9d %s\n",sum,$1,$2)}' ;
    }

    in tcsh:

    alias dusort 'du -s \!* | sort -r -n | awk '"'"'{sum+=$1;printf("%9d %9d %s\n",sum,$1,$2)}'"'"

    The most common way to use those commands would be:

    cd /home/
    dusort *

    It's useful for tracking down what's using up your space, for example finding a sub directory deep in a source tree that isn't cleaned by make clean.
    • by willfe ( 6537 )
      How about:

      $ find /wherever -maxdepth 1 -exec du -sm {} \; | sort -n

      Replace "-sm" with "-sk" if you want kilobytes instead of megabytes. Total space consumed by the named directory appears at the bottom of the list.
  • hmm... (Score:3, Informative)

    by bbrack ( 842686 ) on Saturday October 01, 2005 @07:48PM (#13695406)
    perl -pi -e "s/x/y/"

    Ever had to make a change to every line in a test vector (up to several million lines long), but didn't have the half hour it would take to retranslate the whole thing? This has saved my ass more than anything else I can think of.
    It's also fun to do something like

    perl -pi -e "s/(alias \w+) '.+'/\$1 'echo \"DFU DFU DFU\"'/g" ~user/.aliases

  • I can't believe nobody's mentioned rsync yet. It has to be the single best command line tool ever.

    I use it to update my web site. I use it to synchronize bookmarks between multiple machines. I use it for online incremental backups. I use it to copy directory trees around from disk to disk.

    I even use it to fetch my mail. Yes, I switched my mail account to Maildir by using procmail for final delivery, and now I can use rsync -avz --remove-sent-files instead of crappy POP3 or IMAP protocols to pull the mail down.
  • by pclminion ( 145572 ) on Saturday October 01, 2005 @08:44PM (#13695637)
    I don't think the point of the article is to discuss which commands are coolest, rather, what ways can you combine those commands to achieve powerful results?

    Here's something that I run via cron on a nightly basis. See if you can decode its function :-)

    for F in $(find $SRCDIR); do echo $(basename $F) $F; done | sort | rev | uniq -c -f 1 | grep -E ^[[:space:]]*1[[:space:]] | awk '{print $2 " " $3}' | rev > $CACHEFILE

  • screen (Score:5, Informative)

    by pauljlucas ( 529435 ) on Saturday October 01, 2005 @08:47PM (#13695653) Homepage Journal
    Multiple multiplexed ttys that stay running even after disconnect and you can reattach to them later.
    • Multiple multiplexed ttys that stay running even after disconnect and you can reattach to them later.

      Hi screen brother. :) I live and breathe screen -- I sometimes open multiple windows to the same screen session, but different 'screens.' I've also changed the MAXWIN value from 40 to 60 before compiling because, well, 40 screens just isn't enough.

      I had one screen session up for almost a year, which would've been longer but I upgraded my kernel.
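
      For anyone new to it, the basic workflow is roughly (a sketch; the session name is arbitrary):

      screen -S work     # start a named session
      # ... work, then detach with C-a d (or just lose the connection)
      screen -ls         # list running sessions
      screen -r work     # reattach later
      screen -x work     # attach a second terminal to the same session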

  • find . -name '*.ext' |xargs grep -i bar

    finds all the files having the extension 'ext' and matching the pattern 'bar'.
  • by elmegil ( 12001 ) * on Saturday October 01, 2005 @09:07PM (#13695715) Homepage Journal
    Though I have no idea why, it's pretty basic:
    find . -type f -exec grep regexp {} /dev/null \;
    The tricky bit is that adding /dev/null means I will always see the filename in the output; otherwise grepping against one file only returns the matched strings.
    • I generally build it like this (this is a very trivial example, but you get the idea):

      find . -type f |grep '.thumb.gif'

      Okay, then I can eyeball the result and check it. Pipe it to less if necessary. Then up arrow and append:

      find . -type f |grep '.thumb.gif' | while read FN ; do echo "$FN"... ; rm "$FN" ; convert -geometry 200 "$( echo "$FN" | sed 's/.thumb//' )" "$FN" ; done

      I make no claim that this is better, it's just what rolls off my fingertips when working on mass groups of files. One nice thi

    • Very clever. It took me a bit to figure out that /dev/null is a second filename parameter to grep, which forces it to list which file it found the matching regexp in. I'll have to remember this command!
    • Or even better:

      find . -type f -print -exec grep regexp \{} \; ... which works better because you get all the filenames, and you don't have to do any workarounds with /dev/null.. which might not work if you're on a weird system.
      • Not having seen \{} notation before but presuming that it means find all the files and substitute them at once, I see two potential problems: 1) I think /dev/null is more likely to exist than this option I've never seen before :-) and 2) grep's command line may hit some length limit if you find too many files or your path structure is too deep. At this point I'd just as likely use xargs, which can limit the length of the command line. Actually (no one expects the spanish inquisition!) I see a third poten
  • unison (Score:3, Interesting)

    by buffy ( 8100 ) * <buffy@p a r a p e t .net> on Saturday October 01, 2005 @09:57PM (#13695913) Homepage
    Based on the rsync protocol, unison can maintain a bi-directional file sync. I use it to sync my personal data between home and co-lo'ed server, and between work and co-lo'ed server. I can update files in any of the three locations and they'll be replicated across all three.

    -buf
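
    Invocation is roughly like this (a sketch; the host and paths are made up):

    unison ~/data ssh://colo.example.com//home/me/data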
  • xargs and for loops (Score:5, Informative)

    by photon317 ( 208409 ) on Saturday October 01, 2005 @10:17PM (#13696046)
    There are millions of tricks, but if I had to pick a couple of simple, powerful techniques that anyone who doesn't already know them should learn, it would be xargs and command-line "for" loops.

    xargs takes whatever is piped into it, and executes a command with those things as arguments. It can do it all at once, or it can break them up in chunks, or it can execute your command once per input. Consider:

    rm -f `find . -name "*.o"`

    This normally works fine, and will forcibly remove all .o files anywhere underneath the current directory. However, if there are too many .o files to fit on a single commandline, it will barf with "argument list too long" or some such sounding error. The xargs way to do this would be:

    find . -name "*.o" | xargs -n 50 rm -f

    Which will execute a separate "rm -f" for each chunk of 50 filenames. Take a look at the "-i" mode as well, and read the whole man page. It's a great little piece of glue.

    On to for loops. You've seen them in sh/ksh/bash shellscripts like so:

    for fn in *.c
    do
        echo Sending $fn ...
        rcp $fn remotehost:/tmp/
    done

    You can of course do this straight from the commandline, which is indispensable for complex looped operations. To do it all in one line, you just have to get the semicolons in the right place. Just remember there's a semicolon before the "do", but not immediately after it:

    for fn in *.c; do echo Sending $fn ...;rcp $fn remotehost:/tmp/;done


    • Oh a couple more good simple ones popped into my head for beginners:

      Use "&&" between commands, and the second command will only execute if the first command succeeded. For example:

      make && make install

      That will only run the "make install" if the "make" succeeded (which is kinda contrived, since for most packages a simple "make install" would have done the same all on its own).

      And also, did you know that you can group commands with parentheses, and that you can pipe into parenthesized groups of commands?
    • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Sunday October 02, 2005 @12:15PM (#13698681) Homepage Journal
      find . -name "*.o" | xargs -n 50 rm -f

      That's great. Now, create a directory called " " (empty space). Inside that, create a directory called " -rf " ("-rf" with an empty space on either side). Inside that, create a file named " " (yet another empty space). Now, watch in horror as find prints "./ -rf /", which it passes dutifully to "rm -f". Since xargs by default passes each word as an individual argument, that expands to:

      rm -f ./ -rf /

      Hope you weren't running as root! The moral of this story is to never, never! use find/xargs without the "-print0" option whenever the command you're executing is destructive. "ls" is probably OK. "rm" definitely isn't.
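
      In other words, something like this (null-terminated names survive spaces and leading dashes):

      find . -name "*.o" -print0 | xargs -0 -n 50 rm -f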

    • I always liked sticking the printer in slot 6, just to mess people up.

      Of course, by the time I finally got rid of that old machine, I had floppy controllers in slots 6, 5, and 4. ProDOS made handling multiple disks a breeze. Printers were on 7 and 1 (one for labels, the other for regular), mouse on 2, and a 64 KB RAM expansion :D
  • I really like:
    du -sh *
    Probably best to put this in your .bash_profile file (or equivalent):
    alias diskusage='du -sh * && du -sh .[A-Za-z]*'
    (I know this can be shortened with fancier glob patterns, but I've never looked into it)

    Anyway, if need be, pipe that to a "sort"-on-steroids and you can see where your disk space is getting eaten up.
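
    One way to do that sorting (a sketch; fixed kilobyte units so a plain numeric sort works):

    du -sk * | sort -n    # smallest first; use 'sort -rn' to put the hogs on top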
  • From my painfully extensive man page research years ago, this seems to be the fastest way to (safely) shutdown (or restart) a Linux box:

    shutdown -hn now (for halt)
    shutdown -rn now (for restart)

    I call this my "Express Elevator to Hell: System Goin' DOWN!" command. [Somebody wake up Hicks.]
  • ssh is awesome. ssh -L, keys, encryption, compression. Awesome.
    sort -n
    grep is an amazing tool.
  • Need to find that file named somethingfoosomeotherthing?
    alias ff='find . |grep -i '
    >ff foo

    ./somedir/somethingfoo
    ./somedir/something foo2
    ./somedir/somethingfoo3

    Need to find that pesky configuration file for the printer?
    ~/bin/gr:
    grep -i -r "$1" .

    ~/bin/fgre:
    grep -i -r -l "$1" .

    >cd /etc/
    >gr laserjet

    ./cups/ppd/hp.ppd:*ShortNickName: "HP LaserJet Series"

    Just interested which files to check?
    >fgre laserjet

    ./cups/ppd/hp.ppd

  • Midnight Commander. (Score:3, Interesting)

    by Inoshiro ( 71693 ) on Sunday October 02, 2005 @03:50AM (#13697235) Homepage
    'mc' is your gateway to a much more powerful shell. Combine your command line shell with quick changing to bookmarked dirs (ctrl+\); easy adding to that via alt+a; F2 menus that let you easily compress subdirs, or run other things from your actions file based on the mime-type or file pattern; two panes at two different FS locations; the undelete FS, FTPFS, and other meta-FS interfaces; good mouse support; a built-in editor (CoolEdit); prompts for arguments when you hit enter on a makefile...

    The list goes on. Midnight Commander is a shell designed to speed up all your common tasks. It has sane defaults you don't really have to change much. It'll make you more productive!
    • by fuzza ( 137953 )

      Not to mention, it's really useful for temporary (ie. until you can reinstall) cleanups of compromised systems, since (a) it uses its own internal code for things like ls, chmod, etc, and (b) rootkits don't generally replace it :)

  • My favourites (Score:3, Informative)

    by cowbutt ( 21077 ) on Sunday October 02, 2005 @04:02AM (#13697275) Journal

    strace/truss/ktrace for Linux, Solaris and BSD/MacOS respectively. If a program fails to start, or terminates abnormally, this will usually give me the heads up on why (it's usually a missing file, or bad permissions) without having to break out gdb.
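
    A typical first pass on Linux looks something like this (a sketch; 'somebrokenprog' is a stand-in):

    strace -f -e trace=file somebrokenprog 2>&1 | grep -E 'ENOENT|EACCES'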

    lsof. Useful in so many ways; for debugging situations similar to the above, as well as hardening systems and building chroot environments for specific programs.

    tcpdump/snoop/tethereal/ethereal. If you can see what's really on the wire between two network applications, you can probably figure out what's going wrong. Ethereal is particularly nice.

    hexdump/khexedit. If you can see what's really in the file used by an application, you can probably figure out what's going wrong. :-) khexedit also has a bonus feature of being able to perform statistical logical operations across the file; useful if you have a file which you suspect has been encrypted with some lame substitution cipher.

    After those, the usual - sed, awk, grep, find. It's rare that I can't turn any problem into an awk-shaped nail. :-)

  • abcde, a better cd encoder [lly.org]. So great it's untrue. Try it, you might love it.
  • We all have our CLI amor

    No, no we don't.
  • Back in school, our Pascal compiler often would dump files in the current directory with garbage filenames, with lots of special characters in 'em so you couldn't type them at the shell (or use shell expansion, either). Someone showed me a trick that I still use to this day:

    ls -i
    find . -inum inode -exec rm -i {} \;

    The ls -i gives you the inodes for all the files in the current directory. Find the inode for the file you can't delete, then use that in the find command to actually select it and delete it wit
  • by Richard Steiner ( 1585 ) <rsteiner@visi.com> on Monday October 03, 2005 @11:00AM (#13704123) Homepage Journal
    ...but it's limited to JP Software shells (4DOS, 4NT, 4OS2), and the authors of Linux shells don't seem interested in implementing it.

    SELECT is a command which provides a fullscreen point-and-shoot file selection interface for other commands, allowing for the easy arbitrary selection of target files (point-and-shoot) in command aliases or batch/script files. In essence, it provides a mini filemanager interface which can be used to select the targets for *any* command on the command line, used in aliases, etc.

    Some simple examples (from my 4OS2 alias file) are as follows:

    TCOPY=select copy (*.*) %1
    TMOVE=select move (*.*) %1
    TDEL=select del (*.*)
    TRUN=select %1 (*.*)
    TEDIT=trun fte
    TLIST=trun list
    UNZ=f:\unz.cmd
    TUNZ=select unz (*.zip)

    The filespec in parentheses limits the files which appear in the selection list, so the TUNZ command will only show .ZIP files as potential targets.

"The four building blocks of the universe are fire, water, gravel and vinyl." -- Dave Barry

Working...