Unix Operating Systems Software

Essential UNIX Tricks and Tools? 185

Chris Lesner asks: "What handy UNIX tricks/tools do you use every day? I'm asking for stuff that amazes your friends and makes you wonder how they use UNIX without them. Some simple examples include: job control (with fg, bg/&, jobs, Ctrl-Z); moving login sessions between machines with Screen for vt100 and VNC for X11, and using screen and VNC to share login sessions between users for demos etc.; using find, xargs -i and echo to build command strings which, after inspection, can be piped back through bash, e.g. `find . -type f | xargs -i{} echo "cp {} {}.bak" | bash` I'm asking because my source for this kind of information has dried up as my UNIX skills have matured. I'm guessing other Slashdot readers have the same problem. By the way, if you think the examples I give are lame, I challenge you to better them!"
This discussion has been archived. No new comments can be posted.

Essential UNIX Tricks and Tools?

Comments Filter:
  • Unix Power Tools (Score:2, Informative)

    by Bouncings ( 55215 )
    All the tricks you can cram into one of those big books, and more: Unix Power Tools [oreilly.com] -- O'Reilly's best book, IMHO.
    • by devphil ( 51341 )


      After hacking around on *nix for years, everything in that book was old news to me. Well, almost everything. Out of 1000+ pages, here's the only trick I had never seen before:

      If you have a directory full of files that are precious to you, run "touch ./-i" in that directory. This creates an empty file named "-i". Why do this? Because:

      If you accidentally "rm *" there, the shell's default collating order while expanding the * will put the "-i" file first in the list, which will then be interpreted by rm as the interactive option, rather than a file.

      (I've never used this trick, but I can see how it would be much less hassle than trying to remember to chmod u-w a precious file every time you get done working with it. Refinements on the technique include hardlinking the "-i" file to different directories instead of touch'ing it more than once, to save on inodes.)

      (This also assumes that the user is already familiar with any of the half-billion methods of actually deleting a file whose name starts with '-'. This is /the/ most frequently asked FAQ in comp.unix.*.)
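
      For the curious, the two most commonly cited of those methods are ending option parsing with "--", or giving a path so the name no longer starts with '-'. A quick sketch:

      ```shell
      cd "$(mktemp -d)"   # scratch directory for the demo
      touch ./-i          # create the guard file, as the trick suggests
      rm -- -i            # "--" tells rm everything after it is an operand
      touch ./-i
      rm ./-i             # or prefix a path so the argument starts with '.'
      ```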

      There. Now you know the most interesting trick in the whole book, and you don't need to spend the sixty bucks. :-)

  • Stupid shell trick (Score:2, Informative)

    by Bouncings ( 55215 )
    This is pretty simple, but on one of my machines I like to have the same directory location when I log in and out. If I'm in /home/foobar/monkey/dance, and I close my ssh connection, then I open a new one, I want to start in /home/foobar/monkey/dance. Here's how:

    In my .bashrc:

    cd "`cat ~/.lwd`"
    In my .bash_logout:
    pwd > ~/.lwd
    Pretty basic, but it does the job. One problem you have to watch for: scp'd files appear in your last working directory.
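    One caveat: if ~/.lwd is missing or points at a directory that has since been deleted, that cd will complain on every login. A slightly hardened sketch of the same pair:

    ```shell
    HOME=$(mktemp -d)    # demo only: use a scratch HOME; omit in real dotfiles

    # In ~/.bashrc: only cd if the saved directory still exists.
    if [ -f ~/.lwd ] && [ -d "$(cat ~/.lwd)" ]; then
        cd "$(cat ~/.lwd)"
    fi

    # In ~/.bash_logout:
    pwd > ~/.lwd
    ```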
  • Well, seeing that my friends don't know much about UNIX, the following usually impresses them:

    1. Connect to home router (FreeBSD) w/ the PuTTY SSH client
    2. smbclient to a windows share (e.g. smbclient //empire/files -U jesus)
    3. mget files to /tmp
    4. run command shell on client box
    5. pscp files from router
    6. in the meantime, as a finishing touch, perl -e $#29%% something

    To impress normal people, you don't need to type anything fancy. Just type FAST, and as soon as a command returns, pretend to contemplate for a second, and then type your next command.
    • To impress normal people, you don't need to type anything fancy. Just type FAST

      Actually, if you want to impress normal people, use tab completion, but don't tell them about it and don't make it obvious :) The looks you'll get are priceless.

      Now, on the same idea, one thing I no longer can live without: I use emacs for all my C development. The default emacs setup maps "M-/" to 'dabbrev-expand' which is immensely useful. This means that if I type "some_long_variable_name" in some file somewhere, the next time I only need to type "som M-/" and it does the Right Thing - even if "some_long_variable_name" is not to be found anywhere in the current file, it will search all open buffers. Extremely useful.

    • A simple trick that's both day-to-day-useful and gets even systems-literate folk (including my advisor) gaping is this:

      I have two machines on my desk; one main Linux box in text console mode, and one by-the-side windows box running cygwin and sshd. I use Lynx (by preference) on the Linux box for most of my text-only web browsing. But occasionally when I want graphics, I happily hit this magic key sequence, and *poof* an Internet Explorer pops up on the windows box.

      The trick is to have a special bookmark file under lynx multi-bookmarks; when you "add-bookmark" to this file, a little daemon shellscript polls it every second, grabs the URL, and runs "ssh windowsbox IEXPLORE.EXE url". No big shakes!
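
      A hypothetical sketch of such a daemon (the bookmark file path and the host name "winbox" are assumptions, not from the post; lynx stores bookmarks as HTML anchors, which is what makes the grep work):

      ```shell
      BOOKMARKS=~/.lynx_bookmarks_remote.html   # assumed multi-bookmark file

      # Pull the first URL out of a lynx bookmark file.
      first_url() {
          grep -o 'href="[^"]*"' "$1" | head -n 1 | sed 's/^href="//; s/"$//'
      }

      # Poll once a second; ship each new URL to the windows box and
      # truncate the file so every bookmark fires exactly once.
      poll_loop() {
          while sleep 1; do
              url=$(first_url "$BOOKMARKS")
              if [ -n "$url" ]; then
                  ssh winbox "IEXPLORE.EXE '$url'"
                  : > "$BOOKMARKS"
              fi
          done
      }
      ```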
  • by Bouncings ( 55215 ) <ken&kenkinder,com> on Saturday June 01, 2002 @06:18PM (#3623960) Homepage
    One of the most useful gadgets I use is the bash completion project [caliban.org]. It's a handy-dandy tool where tab-completion does more, oh, so much more than filenames. When I type apt-get install python-<TAB> on Debian, I get a list of Debian packages to install starting with python-

    There's more fun too. It completes tons of crazy stuff. I'd check it out.

    • FYI, tcsh has completion built in.

      Here's a few of my interesting ones:


      complete kill 'c/-/S/' 'p/*/`ps -x | awk \{print\ \$1\}`/'

      complete scp 'c;*@;`awk \{print\ \$1\} < $HOME/.ssh/known_hosts | sed -e s/,.\*\$// | uniq`;:;'

      complete ssh 'c;*@;`awk \{print\ \$1\} < $HOME/.ssh/known_hosts | sed -e s/,.\*\$// | uniq`;' 'p;*;`awk \{print\ \$1\} < $HOME/.ssh/known_hosts | sed -e s/,.\*\$// | uniq`;'


      Only annoyance is that you can't do multiple completion sets simultaneously. Maybe I should write a patch for that.. hmm...

    • I got tired of looking up all of the various -f* and -m* options in gcc. So the Bash completion project now knows how to complete gcc/g++ options, e.g.,

      gcc -fomit-fr<TAB>
      will complete, and
      gcc -f<TAB><TAB>
      will list possible completions, etc.

      Other shells have had programmable completion for a while now. It's nice to see this feature added to bash, and it's nice to see someone volunteering to collect the widespread completion functions.

  • How about (Score:5, Informative)

    by Hard_Code ( 49548 ) on Saturday June 01, 2002 @06:18PM (#3623962)
    find . -type f | xargs -i{} echo "cp {} {}.bak" | bash
    find . -type f -exec cp '{}' '{}'.bak ';'
    • I like piping to sed. Like:

      ls | sed 's/.*/mv "&" "&.foo"/'

      Or you can use find . instead of ls, with the usual type selectors. This gives two nice things - flexibility, plus you can keep piping those puppies through additional sed filters to get what you want, appending | bash to the last one. I also often use rev in the pipe chain and a sed s/ without a /g to modify the tail end of the strings.

      Hey, it might not be the least verbose, but sed can be quite a nice tool for really complex matching.

      --
      Evan

    • Re:How about (Score:4, Informative)

      by Zapman ( 2662 ) on Sunday June 02, 2002 @12:08PM (#3626533)
      The slight problem with this (not in this case though) is you have the expense of forking and killing a lot of processes (equal to the number of files +1 for the find). On the mv or cp case, you can't (easily) get around it, but if you were to do:

      find . -type f -exec chmod 644 {} \;

      vs

      find . -type f | xargs chmod 644

      you'll find that the second runs amazingly faster, since it will group a lot of the commands together into 1.

      I did 'cd /usr/bin ; time find . -type f -exec ls {} \;' and got:

      find . -type f -exec ls {} \; 0.98s user 3.28s system 88% cpu 4.831 total

      I did 'cd /usr/bin ; time find . -type f | xargs ls -1 ' and got:

      find . -type f 0.00s user 0.02s system 31% cpu 0.063 total
      xargs ls -1 0.03s user 0.04s system 74% cpu 0.094 total

      That's a BIG difference, especially when you have a LOT of files.

      (NOTE: if I did a bare 'ls' with the xargs, the output would be different due to the way xargs works (man xargs for more details). It may or may not make a difference depending on how you're using the output. If you're schlepping the whole line at once, it could make a huge difference... if you're splitting the lines by $IFS or something, then it's probably alright.)

      As for my contribution, I find these 3 find | xargs commands, wrapped together in a script I call "makereadable", help me a LOT (for example, if you install from source, and the permissions get borked due to forgetting to set root's umask to 022):

      find . -type d -print | xargs chmod 755
      find . -type f -perm +0100 -print | xargs chmod go+x
      find . -type f -perm +0400 -print | xargs chmod go+r

      The first makes all directories 755. 99.99% of the time, that's what you want... just don't do it with /tmp (the sticky bit will get whacked). The second finds all files that are executable by their owner, and makes them executable to others too. The third finds files readable by their owner, and makes them readable to others too.

      If you want to do this with write permissions, you're probably doing something stupid you will regret later. Figuring out the command to do this is then left as an exercise for the reader. :-)
    • You seem to be missing the author's point. Most of the time, when you are about to do such an operation, you want to be sure that the commands to be executed are what you expect (for example, if this is going to do something destructive, such as remove files). By using the echo, you can preview the execution, check it off, and just rerun the last command with a pipe.
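    That preview-then-execute workflow can be sketched like so (note: modern xargs spells the option -I{}; -i is an older GNU form):

    ```shell
    cd "$(mktemp -d)"; touch a b    # scratch area for the demo
    # Dry run: print the commands and eyeball them first...
    find . -type f | xargs -I{} echo "cp {} {}.bak"
    # ...looks right, so recall the line and feed it to a shell:
    find . -type f | xargs -I{} echo "cp {} {}.bak" | sh
    ```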
  • chvt (Score:2, Insightful)

    by NotoriousQ ( 457789 )
    Most Linux newbies do not have a clue how their Ctrl-Alt-F(x) console switching works, and I have found this tool to be very useful on occasion.
    • on occasion?

      I always have a minimum of 4 vt's and an X session active on my workstation, it's one of the reasons (behind perhaps 5-6 more) that I use nix over windows for daily tasks.
      • I always have a minimum of 4 vt's and an X session active on my workstation, it's one of the reasons (behind perhaps 5-6 more) that I use nix over windows for daily tasks.

        No kidding. Multiple text logins as well as the graphical display REALLY blows the minds of many of my winblows buddies. I think it's even MORE useful than my favorite useful tool, Opera's "open link in background."

  • learn your find (Score:1, Redundant)

    by molo ( 94384 )
    `find . -type f | xargs -i{} echo "cp {} {}.bak" | bash`

    This is fugly and a waste.

    find . -type f -exec cp "{}" "{}.bak" \;

  • Just use find . -type f | xargs -ixx cp xx xx.bak
  • by Froze ( 398171 ) on Saturday June 01, 2002 @06:55PM (#3624073)
    su root cd /; rm -rf *

    This one works really nice when you're getting ready to do a fresh install, since it allows you to vent all that pent up frustration with your now obsolete system :-)
  • In tcsh:

    alias dusort 'du -s \!* | sort -r -n | awk '"'"'{sum+=$1;printf("%9d %9d %s\n",sum,$1,$2)}'"'

    In bash (or zsh):

    function dusort ()
    {
        du -s "$@" | sort -r -n |
            awk '{sum+=$1;printf("%9d %9d %s\n",sum,$1,$2)}'
    }

    An example way to invoke it is "dusort /home/*", to see who the disk pigs are.
    • Re:dusort (Score:5, Interesting)

      by Phexro ( 9814 ) on Saturday June 01, 2002 @07:28PM (#3624174)
      But why bother with the awk, when you can just do:

      $ du -s /path/to/wherever/* | sort -rn | head -10


      to get the top ten hogs in /path/to/wherever?

      Also, sometimes there are some big files, and you are only interested in the directories full of crap:

      $ du -s `find /path/to/wherever/* -type d` | sort -rn | head -10


      While we're on the subject, you can use this handy-dandy snippet to find the disk usage of one user in any part of the filesystem:

      $ find /path -type f -user jsmith -exec ls -l {} \; 2>/dev/null \

      > | awk '{ sum += $5} END { print sum }'
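      On systems with GNU find, -printf can emit one size per line with no per-file ls fork at all (a sketch; %s is the size in bytes, and the sum moves to awk's first field):

      ```shell
      # Same idea as the parent's find -exec ls -l pipeline:
      #   find /path -type f -user jsmith -printf '%s\n' 2>/dev/null |
      #       awk '{ sum += $1 } END { print sum + 0 }'
      # Demonstration against a scratch directory owned by the current user:
      d=$(mktemp -d)
      printf '0123456789' > "$d/f1"     # 10 bytes
      printf '01234'      > "$d/f2"     # 5 bytes
      find "$d" -type f -user "$(id -un)" -printf '%s\n' 2>/dev/null |
          awk '{ sum += $1 } END { print sum + 0 }'   # prints 15
      ```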
      • what does 2>/dev/null actually do?

        I can't find it in a man page anywhere.
        • 2>/dev/null redirects error messages from the command (and anything else written to stderr, which is file descriptor 2) to the null device. It keeps error messages from showing up. When > is used alone, it implicitly redirects standard output, which is file descriptor 1 (so it's the same as 1>something).

          More interesting (and often obscure to new users) ways to redirect streams can be found here [tldp.org].
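
          A few concrete redirections to try (all standard Bourne/bash syntax; the || true just keeps the deliberately failing ls from aborting a strict shell):

          ```shell
          cd "$(mktemp -d)"    # scratch dir so the demo files land somewhere safe
          # stdout and stderr are separate streams (fd 1 and fd 2):
          echo "hello" > out.txt                  # stdout to a file
          ls /nonexistent-dir 2> err.txt || true  # stderr to a file
          # both streams into one file; "2>&1" must come *after* "> file":
          ls /nonexistent-dir > both.txt 2>&1 || true
          # discard errors only, keep normal output:
          ls /nonexistent-dir 2> /dev/null || true
          ```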

  • "watch" command (Score:2, Informative)

    by Anonymous Coward
    A lot of people don't know about the "watch" command

    watch -n1 df -h

    I always use hdparm to speed up IDE drives too.

    hdparm -c1 -d1 -A1 -a16 -m16 -u1 -S240 -W1 -k1 -K1 -X66 /dev/hda

    Of course there are aliases you can set to reduce keystrokes.

    alias ll='ls -l'

    pgrep and killall can be pretty useful.

    killall java
    renice 19 `pgrep java`

    I also have a button on my GNOME panel that runs xkill so I can click on a locked up GUI app and terminate it.

    If you have a multiprocessor box, you can do "make -j3" to run parallel builds.

    And updatedb and locate are really useful for finding files quickly.

    If you're trying to remember a command you've run before you can run this to find it quickly.

    history | grep 'command'

    Then you can run !850 to run command #850 from your history, for example.

    And you can run this to find GNOME packages.

    rpm -qa | grep gnome

    That's all I can think of for now. Other posters are welcome to point out faster ways to do any of the tips I've given. :p
    • If you're trying to remember a command you've run before you can run this to find it quickly.
      history | grep 'command'

      If you're using bash or zsh with the default emacs-like bindings, you can use ctrl-r to do an interactive backwards search in your history. Just keep hitting ctrl-r to search deeper back into the history.

      Also, the one thing I've noted a lot of intermediate Linux users don't know about, but find very useful upon investigation is job control. Even with good window management and multiple xterms, job control is very useful: I often have three or four jobs on each xterm. Also, a neat trick: with zsh or bash, "fg %-" goes back to the second-to-last job. You can use this to quickly suspend and switch between two jobs continuously (eg, after you've done it once, "ctrl-z, ctrl-p, enter" switches to the "other" job).

      Also, if there's a long command I recently executed, I usually won't search through history, but rather use the "!cmd" thing. Eg, if you've recently run "gcc -g -Wall -O3 -blahblah ..." a few commands ago, you can do it again by just typing "!gc" without searching. I guess it's really a question of preference, but I picked this up from the older gurus who were weaned on the monstrosity which is the C shell [faqs.org].

      • Eg, if you've recently
        run"gcc -g -Wall -O3 -blahblah ..." a few commands ago, you can do it again by just typing "!gc" without searching.

        It's safer, though more tedious, to do

        !gc:p
        and then, if it spits out what you wanted, do an up-arrow, enter to execute. Sometimes you forget that you did a "make clean" between now and the last time you did "make", for example.

        I wish a shell would pick up the vim history completion (or, I wish I knew which shell had already implemented it or how to access it or whatever). I want to be able to do

        !scp[up arrow] and have it look back through my history only at the stuff that starts with scp.

        • Re:"watch" command (Score:3, Informative)

          by phraktyl ( 92649 )

          If you do a set -o vi in bash or ksh you get the vi mode. Then you can hit <Esc>/^scp<Enter> to get the last command that started with `scp', and then hit n for next, as in vi. You can also do most of the rest of your vi commands.

      • by Anonymous Coward
        If you're using bash or zsh with the default emacs-like bindings, you can use ctrl-r to do an interactive backwards search in your history. Just keep hitting ctrl-r to search deeper back into the history.

        And, after you've found it with ctrl-r, ctrl-o will execute the command and get the next command ready on the command line for you. Great for those "vi" "make" "./run" "gdb" "vi" ... sequences....
        • after you've found it with ctrl-r, ctrl-o will execute the command and get the next command ready on the command line for you.

          That is immensely cool. I've only been using it for five minutes, but I already can't live without it. You learn something every day.

    • A lot of people don't know about the "watch" command

      Indeed. I always did while true ; do clear ; date ; sleep 3 ; done when I needed to keep an eye on something.

      I never knew about the history command, although I use ^R in bash a whole lot.

      du --max-depth=1 /home | sort -rn is always a favourite of mine (leave off du's -h here; human-readable sizes like 2G and 10M don't sort numerically); add a | head -10 to grab just the top 10 (or whatever you prefer).

    • make -j3

      Why isn't -j2 good enough for a two processor box?

      Thanx about watch and pgrep. Especially watch. I can't believe I never knew about that one.

      hdparm just scares me. I know it shouldn't. But after suffering an hd crash I don't want to mess with a working drive. Though should I ever add a new drive, I'll have to think about playing with it.
      With -j3 the box (nearly) always has a compile job ready for each CPU; it would probably even be reasonable to run -j2 on a single-processor box...
    • watch -n1 df -h

      That reminds me of this trvial script I wrote once:

      fileprogress() {
          tput clear
          if [ $# -gt 1 ] ; then
              TOTAL=$2
              FMT='"\r%.2fM (%.2f%%)"'
          else
              TOTAL=1
              FMT='"\r%.2fM"'
          fi
          while true ; do
              ls -l $1 | awk "{printf($FMT, \$5/(1024*1024), (\$5 / $TOTAL)*100)}"
              sleep 1
          done
      }

      Then, when you're downloading some ISO, you can do "fileprogress some-image.iso" or "fileprogress some-image.iso 681574400" if you know the ISO is supposed to be 681574400 bytes when downloaded.

      I wrote that before I knew about "watch" and before scp clients had progress bars, so it was useful at the time, but I haven't used it in a while. Another cute thing is to define a function:

      beep() {
          echo -e '\007'
          xrefresh -solid white 2> /dev/null
          xkbbell 2> /dev/null
      }
      And then have it do that when the file is done downloading. Or do "/run/some/long/command ; beep" to let you know when your command is done.
  • by Permission Denied ( 551645 ) on Saturday June 01, 2002 @07:13PM (#3624129) Journal
    I use zsh on all my non-Linux machines and just use the default bash on all my Linux boxes. First of all, I have one common .zshrc and .bashrc which all my machines get. The last line sources .zsh-local or .bash-local, where I keep all my machine-specific settings. This way, when I think of something clever to put in .zshrc, I can distribute it to all my machines at once.

    Here are some stupid shell tricks I've found moderately useful:

    1. bash: HISTCONTROL=ignoreboth, zsh: setopt HIST_IGNORE_SPACE; This means when you type a command with a space in front of it, it doesn't go in your history. Useful if you do something that you don't want others to see (eg, xv ~/.pr0n/*.jpg).
    2. The zsh FAQ has an entry which describes how to get your xterm title to describe all sorts of fun stuff. Here's a bash equivalent:
      # see the document /xc/doc/hardcopy/xterm/ctlseqs.PS.gz
      # on any XFree86 mirror; the only other cool thing you
      # can do is change the font with "ESC ] 5 0 ; fontname BEL"
      xtitle() {
          echo -ne "\033]2;$*\007" > /dev/tty
      }
      if [ "$TERM" = "xterm" -o "$TERM" = "xterm-color" ] ; then
          PROMPT_COMMAND="echo -ne '\017'"
      fi
      if [ "$TERM" = "xterm" \
           -o "$TERM" = "xterm-color" \
           -o "$TERM" = "rxvt" \
           -o "$TERM" = "cygwin" ] ; then
          settitle() {
              xtitle "`date +'%H:%M:%S'` - ${HOSTESS}: $PWD"
          }
          PROMPT_COMMAND="settitle ; $PROMPT_COMMAND"
      fi
    3. PATH management:
      pathdel() {
          PATH=`echo $PATH | sed -e "s/\(.*\)\(\:\)$1\(\:\)\(.*\)/\1:\4/g" \
              -e "s/^$1://" -e "s/:$1\$//"`
      }
      pathadd() {
          if echo $PATH | fgrep "$1" > /dev/null 2>&1 ; then
              :
          else
              if [ -d $1 ] ; then
                  case $2 in
                      "append") PATH=$PATH:$1 ;;
                      "prepend") PATH=$1:$PATH ;;
                      *) echo fixme ;;
                  esac
              fi
          fi
      }

      pathdel '\.'

      if [ "$UID" -eq "0" ] ; then
          pathadd /sbin "append"
          pathadd /usr/sbin "append"
          pathadd /usr/local/sbin "append"
          for i in /usr/local/*/sbin ; do pathadd $i "append" ; done
          for i in /opt/*/sbin ; do pathadd $i "append" ; done
      fi
    4. This next one is absolutely necessary on any Linux machine:

      alias kilmoz='killall -9 netscape ; rm -f ~/.netscape/lock'
    5. Recursively do something for each subdirectory:
      dirs() {
          # follows symlinks, prints dot-dirs
          ls -A $@ | while read i ; do [ -d $i ] && echo $i ; done
      }
      recurse() { # DFS
          dirs | while read DIR ; do
              $@
              cd $DIR
              recurse $@
              cd ..
          done
      }
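    The pathadd/pathdel helpers above can be exercised like this (a self-contained sketch with simplified re-implementations, not the author's exact sed; behaviour matches the spirit: exact-component delete, append only if present on disk and not already on PATH):

    ```shell
    pathdel() {
        # split on ':', drop exact matches of $1, re-join with ':'
        PATH=$(echo "$PATH" | tr ':' '\n' | grep -Fxv -- "$1" | paste -s -d: -)
    }
    pathadd() {
        case ":$PATH:" in
            *":$1:"*) ;;                          # already on PATH: do nothing
            *) [ -d "$1" ] && PATH="$PATH:$1" ;;  # append only if it exists
        esac
    }

    PATH="/usr/bin:/bin:/opt/fake"
    pathdel /opt/fake          # PATH is now /usr/bin:/bin
    ```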
    Here's a dumb shell script that I call "randsort" - it prints out lines from stdin in a random order:
    #!/bin/sh
    randsort() {
        perl -e 'srand(time() ^ ($$ + ($$ << 15)));
            print sort {rand 10 <=> rand 10} <STDIN>;'
    }
    randsort

    This is useful 'cause you can do something like this:

    xv `ls -t ~/.pr0n/*.jpg | head -200 | randsort`

    and this randomization, for some reason, "feels" better than xv's -random option. It also has a number of "legitimate" uses. For instance:

    playm3u() {
        randsort < $1 | while read i ; do mpg123 "$i" ; done
        # heh - 'mpg123 -z --list "$i"' does same thing nowadays
    }
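
    A quick sanity check that randsort only reorders, never drops or duplicates lines (worth noting: a comparator that calls rand is deliberately inconsistent, which perl's sort officially leaves undefined, though modern merge-sort perls tolerate it):

    ```shell
    randsort() {
        perl -e 'srand(time() ^ ($$ + ($$ << 15)));
                 print sort { rand(10) <=> rand(10) } <STDIN>;'
    }
    # shuffle 1..100, then numerically re-sort: we should get 1..100 back
    seq 1 100 | randsort | sort -n
    ```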

    Anyway, I got lots of stupid little crap like this. I try not to get too far into the customization stuff, as it's definitely a timesink and it's very important to be able to use a "standard" unix setup. For example, my .emacs is circa 80 KB (I'm big into Lisp), but I'm extremely proficient with vi, because there are lots of situations where emacs is not available, or vi is better-suited to the job.

    The lameness filter is such a piece of horseshit. I mean, are the editors deliberately trying to prevent us from discussing code and technical matters? You'll need to re-indent all the snippets above because of this.


  • I'm sure everyone knows about this...except newcomers to UNIX.

    It's called nohup. Nohup is a command that will prevent your job from being terminated once you log out of your account. Always leaving those "please do not log out of this account" messages for your co-workers? Well, that's all fine unless your co-workers are assholes. Also, even if they're not, you might want to do them a favor by doing this.

    What you do is type (without the quotes) "nohup command" where command is the command or program you want not to be interrupted upon logout. You can type it in normal syntax.

    Another essential "command" if you're at the command line and are going to start something which takes a while is (w/o quotes) " but newcomers won't know them, nor will people who just download cygwin to use on their Windows OS, because some scientific applications require Linux.
      It's called nohup.

      The big question I have is this. Why is it that, when I run "nohup python myscript.py &" and logout of my *bsd based webserver, my python scripts crap out?

      It's gonna be hard to get my program to generate updated html pages to match constantly updated data, if I can't get the dadburned script to stay running.

      This is an educational thread for me. My personal guru got a travelling job. (Hi Dave!)

        Sorry, I can't help you much there, since I don't use Python, Perl, or any other scripting languages aside from sed (a lot) and awk (rarely).

        But here's my conjecture for starters. Sorry if this is typical, but it's the first thing that comes to mind and the easiest to fix if wrong: (1) Either your version of nohup has a bug or is corrupted; (2) or your version of Python has a bug or is corrupted. I suggest you try "reinstalling" nohup and Python. If that doesn't work, check the BUG history on these things and try using a different version of each.

        I know, this is the typical help you get when you call your computer's support service:

        YOU: One specific feature of this program isn't working.

        THEM: Well, when did it start happening and what was the last thing you did before it started happening?

        YOU: I haven't changed anything, it just started happening.

        ...then he takes you through a lot of pointless bullshit which has nothing to do with solving your problem, but makes it seem like he's doing his job.

        THEM: Ok, sir, after performing 1000 useless tests, I've determined that your program is corrupted. Since re-installing it didn't fix it, I'm going to have to request that you delete your entire hard drive and reinstall everything.

        YOU: What? You want me to wipe my freakin' hard drive?

        THEM: Yes, sir, that appears to be the solution.

        YOU: Well fuck that. I could've "fixed" the problem that way.
        Your python script may crap out due to the stdin stream disappearing (I know nohup redirects stdout and stderr, but I don't think it redirects stdin).

        You might try nohup python myscript.py < /dev/null &
    One of the first 'tricks' I saw a more experienced guy perform in the good old days when I was a newbie... ;-)

    How did he transfer a text file from one of his accounts (in Russia) to another one (in SUNY SB) when the source side did not allow ftp connections, telnet only?

    Well, 'more foo' in one xterm, 'cat > foo' in another, then click, left-drag-move, click middle...

    Vas'ka, you might be reading this... HI! :)

    Paul B.
    • Re:poor man's ftp (Score:2, Informative)

      An easier way (and not needing xterms) would have been to use script:
      $ script
      Script started, output file is typescript
      $ telnet remotehost
      ...
      remotehost $ cat filename
      ...
      remotehost $ exit
      $ exit
      Script done, output file is typescript
      $
      Of course, instead of cat, use uuencode for a binary file...
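
On systems without uuencode, base64 (GNU coreutils) does the same ASCII-armoring job; a sketch of the round trip:

```shell
# Make a binary file terminal-safe before cat'ing it through a
# captured session, then recover it byte-for-byte on the other side.
head -c 1024 /dev/urandom > payload.bin
base64 payload.bin > payload.txt      # plain ASCII, safe to paste or cat
base64 -d payload.txt > recovered.bin # decode on the receiving end
cmp payload.bin recovered.bin && echo "round trip OK"
```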
    • sz/rz would be useful too, assuming they're installed on the remote host. There was a wrapper somewhere to allow sz/rz to work over ssh, and some telnet tools allowed zmodem send/receive
    • For that sort of thing with binary files there's also uuencode, zmodem/xmodem/ymodem etc. Or also MIME encoding.

      Cheerio,
      Link.
  • by Pauly ( 382 ) on Saturday June 01, 2002 @07:41PM (#3624206)

    I believe nothing is more impressive than knowing what you're doing with the world of UNIX command line tools and standard in/out on the pipe.
    I often show up Perl studs at work, creating similar functionality to their spaghetti code with a few lines of korn shell code in much less time.
    Shell programming with the pipe is the most powerful form of elegant simplicity I've ever seen in any computing.

  • command line tricks (Score:2, Interesting)

    by caca_phony ( 465655 )
    Here is a trick for my favorite shell, es the Extensible Shell, a derivative of plan9's rc shell. The es shell has the most logical set of syntax rules I have ever seen in a shell. It is also the only shell where I have figured out how to do automated file renaming (very handy). Here is a transcript of a session as an example:

    ;; touch 123
    ;; touch 124
    ;; touch 125
    ;; touch 126
    ;; touch 127
    ;; touch 128
    ;; touch 129
    ;; ls
    123 124 125 126 127 128 129
    ;; echo <={~~ (*) 1*}
    23 24 25 26 27 28 29
    ;; for (i = <={~~ (*) 1*}) {mv 1$i one$i}
    ;; ls
    one23 one24 one25 one26 one27 one28 one29

    this functionality is great for preventing name collisions when consolidating files from two directories into a single directory.

    As some explanation: <={...} is like typing the return value of the command inside the braces, and ~~
    returns the parts of the first argument that matched the wildcards in the second argument's glob pattern.
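
For comparison, a sketch of the same rename in plain POSIX shell using ${f#prefix} expansion (the demo directory name is arbitrary):

```shell
mkdir -p renamedemo
(cd renamedemo && touch 123 124 125 126 127 128 129)
for f in renamedemo/1*; do
    b=${f##*/}                        # basename, e.g. 123
    mv "$f" "renamedemo/one${b#1}"    # strip the leading 1, prepend "one"
done
ls renamedemo
```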
    • for (i = <={~~ (*) 1*}) {mv 1$i one$i}

      You do realize how sick that looks? :)

      How about just using any shell and having a shell function:

      rename() {
          # snarfed from Larry Wall
          perl -e '
              $op = shift;
              for (@ARGV) {
                  $was = $_;
                  eval $op;
                  die $@ if $@;
                  rename($was, $_) unless $was eq $_;
              }' "$@"
      }
      Then you could do your example as follows:

      rename 's/^1/one/' 1*

      Now that I think of it (programmatic file processing), in college, I was a TA for a web programming course. One cute trick that impressed students was automatically generating thumbnails for images, like this:

      for i in *.jpg ; do convert "$i" -resize 100x100 "tn_$i" ; done

      This impressed the hell out of the Windows and MacOS users, especially as I had never done that before and I came up with it after about thirty seconds (needed to look at the ImageMagick "convert" manpage). In Windows, if you have a thousand images and you want to create thumbnails, you buy a program to do that type of thing. Or, you hire a unix guy to do it for you :)

      • Actually, Microsoft's PowerToys for XP includes a mass resizer (the PowerToys are usually their most useful software)
        • In other words, while buying XP, you also bought a special tool that resizes lots of images.

          It's nice that MSFT provided that program, but it's not in the same league as the flexibility provided by the unix command line.
    • % rename 1 one 1*

      % man rename

      For more sophisticated renaming, I use a small perl script similar to another reply.

  • Eh... (Score:2, Funny)

    by jo42 ( 227475 )
    Does "FORMAT C:" count?
  • I agree with everything that's already been said about knowing the value of pipes and the standard tools.

    In addition to screen which has already been mentioned the two things that I've noticed impressing other people recently are, splitvt [devolution.com], and Emacs

    Emacs is impressive because in the hands of an expert you can do almost anything - and splitvt is a stunning program which will turn one shell into two - I highly recommend you check it out.

  • by stevey ( 64018 ) on Saturday June 01, 2002 @08:25PM (#3624342) Homepage

    The biggest single command which saves me time is 'cd -' which changes to your previous directory under bash.

    It doesn't sound terribly useful, but it is... Take my word for it.

  • I wasn't as impressed when I first saw mouse pointer activation, but after using it for a while I get almost as annoyed when windows don't auto-activate on mouse-over as when I grab a non-reverse-Polish calculator.
  • best grep trick ever (Score:5, Informative)

    by novarese ( 24280 ) on Saturday June 01, 2002 @09:14PM (#3624473) Journal
    Instead of
    $ ps aux | grep foo | grep -v grep
    use
    $ ps aux | grep [f]oo
    The brackets will show up in the ps output but don't match your pattern, so your grep is automatically excluded from your final output.
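
The principle is easy to verify on any input: the pattern [f]oo matches the literal text foo, but the ps line for the grep itself contains the literal string "[f]oo", which the pattern does not match:

```shell
# [f]oo is a character class containing only 'f', so it matches "foo"...
echo "mydaemon foo --flag" | grep '[f]oo'
# ...but the string "grep [f]oo" (as ps would show it) contains no
# literal "foo", so the grep filters its own ps entry out.
echo "user 1234 grep [f]oo" | grep '[f]oo' || echo "no self-match"
```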
  • Regular Expressions (Score:3, Interesting)

    by sam the lurker ( 209655 ) on Saturday June 01, 2002 @09:43PM (#3624531)
    Regular expressions aren't so much either a trick or a tool exactly, but you can use them with all the "good" tools.

    Get the book "Mastering Regular Expressions," by Jeffrey E. F. Friedl. http://www.oreilly.com/catalog/regex2/ [oreilly.com]

    Read it slowly, a couple of pages every day. I didn't understand much of what he was trying to say until I read the book the second time.

    But why make up my own clever things to say... From http://www.oreilly.com/catalog/regex/desc.html [oreilly.com] "There can be certain subtle, but valuable, ways to think when you're using regular expressions, and these can be taught."

    I find that books that teach you how to think about problems and solutions are few and far between, and books that do it well are almost impossible to find. This book is one of those.

    Once learned, regular expressions are one of those things that can profoundly affect the way you work. And once you're there, "you wonder how they [other people] use UNIX w/o them".
    • by Permission Denied ( 551645 ) on Sunday June 02, 2002 @01:28AM (#3625195) Journal
      Here's a cute little perl script that's useful for developing other perl scripts:
      #!/usr/bin/env perl
      $st = "\033[7m";
      $en = "\033[m";

      print "Enter regex: ";
      $re = <>;
      chomp $re;
      while (1) {
          print "Enter string: ";
          $s = <>;
          chomp $s;
          exit unless $s;
          $s =~ s/$re/$st$&$en/g;
          print $s . "\n";
      }
      This will highlight what parts of a string match a given regular expression. (I modelled this script on something I found in a Ruby tutorial).

      Also, for those learning regular expressions, I would highly recommend Introduction to Automata Theory, Languages and Computation by Hopcroft et al. This is a more mathematical/theoretical introduction to the subject. Regular expressions/automata are a very nice part of CS in that the basic theory is immediately applicable.

  • ...wouldn't this be better in every way?

    find . -type f -exec cp -r {} {}.bak \;

    Find is your friend!
    • ..wouldn't this be better in every way?

      find . -type f -exec cp -r {} {}.bak \;

      Maybe; but it won't work with any version of find I've ever encountered, since {} is only expanded when it stands alone, with whitespace on both sides.

      And the -r seems out of place, too, since you've already selected objects of type file with your -type f directive. How are you going to recurse a file?
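
If your find won't expand an embedded {}, a portable workaround is to hand the bare name to a tiny inline shell (a sketch; the demo directory and filenames are made up):

```shell
mkdir -p finddemo
touch finddemo/a finddemo/b
# {} stands alone here, so any find expands it; the inline sh
# builds the "$1.bak" destination name itself.
find finddemo -type f ! -name '*.bak' -exec sh -c 'cp "$1" "$1.bak"' _ {} \;
ls finddemo
```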

  • by coyote-san ( 38515 ) on Saturday June 01, 2002 @11:19PM (#3624807)
    I consider this pretty basic, but I remember some junior coworkers blown away by

    something | sort | uniq -c | sort -rn

    On the coding level, I constantly use the regex library (with extended regular expressions) and profiling. There are a lot of places where a single well-designed RE can eliminate many lines of code, and profiling can help you ensure that you close all files you open, free all memory you malloc, etc.
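
For example, a quick frequency count of any stream, most common items first:

```shell
# sort groups duplicates, uniq -c counts each group,
# and the final sort -rn orders the counts, descending.
printf '%s\n' apple banana apple cherry apple banana \
    | sort | uniq -c | sort -rn
```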
  • debugging (Score:2, Informative)

    by Chacham ( 981 )
    Two biggies for me are 2>a to send errors to a file named "a". And strace, when you really need to know what a program is doing.

    I couldn't live without strace. I had a font problem with X, strace told me where the problem was. xine wasn't playing sound, strace showed me that it was hitting the wrong entry in /dev/. The list goes on.

    For Debian users, apt-spy (requires installation of the package) to find fast sites for package downloads, and apt-listchanges to list what all the changes are.

    To wow silly Windows users, eject and aafire (part of aalib).

  • These are not groundbreaking, but I do use them a lot.

    Locate files with a string in them. For example:

    find /etc -type f -exec grep -l hdparm {} \;

    Also, using tar across pipes to copy a tree without an intermediate tar file:

    tar cf - foo | (cd /bar ; tar xvf -)

    I also use it over ssh in cron jobs:

    tar cf - foo | ssh root@host "cd /bar ; tar xvf -"

    • why not use scp to copy the stuff instead of tar?
      • Deagol said:
        tar cf - foo | ssh root@host "cd /bar ; tar xvf -"

        Toast0 said:
        why not use scp to copy the stuff instead of tar?

        Tar's faster (for large-scale directories). NOTE: I did the following on Solaris.

        We have 1 website with a BLOODY STUPID design. Part of it is that 1 filesystem has 13 MILLION files in it. Some of the directories have 10 or 15 thousand files in them. The idiots should have used a proper database. This is one of the few times under solaris that I wished for reiserFS. Anyway I was tasked to copy this fs over to another box (website migration).

        First time, tried rdist. 8 hours later, and about 20% complete, we tried something else (For grins, I did the same command later... rdist ended up taking 28 hours in total).

        Second time, rcp. 28 hours again.

        Third time, rsync. Took less time (about 10 hours).

        Fourth time, tar over rsh. Took 4 hours.

        Shlepping filesystems across machines with ssh and tar is really, really nice and quite fast. I highly recommend it.
    • Try:

      `tar cf - foo | ssh bar tar -C daz -xvf -'

      Which instructs tar to change into `daz' before performing any operations and eliminates the cd command.
      • `tar cf - foo | ssh bar tar -C daz -xvf -'

        I've seen people do stuff like this with ssh a few times, and I always wonder: are you using key authentication, have no password set, or does it just ask for a password (which would make it a no-go for automated scripts)?
    • find /etc -type f -exec grep -l hdparm {} \;

      I usually use something like this:

      find /etc -type f -print | xargs grep -l hdparm /dev/null

      Besides switching from the -exec option of find to using xargs (which means that you start grep not once per file, but once per bunch of files), this also uses the neat little trick of adding "/dev/null" to the list of files for grep to check. This isn't so important for grep -l, but if you want to print the matches that were found, then remember that grep will prefix each match with the filename if there is more than one filename, but will show only the matching lines if there is just one. The presence of /dev/null ensures that there will always be at least two filenames on the grep command line -- so you'll always get the filename prepended to matching lines.
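
With GNU find and xargs, null-delimited names make the same pipeline safe for filenames containing spaces (a sketch; -print0 and -0 are GNU/BSD extensions, and the demo files are made up):

```shell
mkdir -p grepdemo
printf 'hdparm -d1 /dev/hda\n' > 'grepdemo/disk tuning.conf'
printf 'nothing here\n' > grepdemo/other.conf
# -print0/-0 pass NUL-terminated names, so the embedded space survives.
find grepdemo -type f -print0 | xargs -0 grep -l hdparm
```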

  • Hey, that works in tcsh too. Keen.

    - H

  • xtail is essential (Score:2, Informative)

    by RevDigger ( 4288 )
    One of the first things I install on a new system is always xtail.

    It has nothing to do with X Windows.

    It works just like 'tail -f' but it can watch multiple files. You can even watch whole directories and it will notice when new files are created.

    http://www.unicom.com/sw/xtail/

    Very handy tool.



  • Ctags and tab completion in VI, as well as vsplit and CVS from VI .. these things make any programmer's life easier.

    having a good bashrc/vimrc/muttrc will save you soo damned much time and typing.

    ssh keys and ssh-agent .. key auth is another time/productivity saver.

    Screen, which has been mentioned, is also something i cannot live without.

    also perl from the command line.

    one i use a lot:

    perl -i.bak -lpe 's/old/new/g' files

    replace 'old' with 'new' (or whatever your regex is) in the given files,
    backing up original files to file.bak

  • First the tool - xtail [unicom.com] . It's wonderful for watching a bunch of system/web logs on one terminal window.

    Next, two basic tips:
    1. control - \ sends SIGQUIT rather than SIGINT as control - c does. Useful for killing programs that do something besides exit when they get SIGINT (such as xtail).
    2. kill -ILL pid simulates an illegal instruction - useful for killing tasks that ignore kill -KILL. I had to use this all the time to kill hung opnet and comnet simulations back in my networking class several years ago.


    One last thing, to address a major peeve I have with many scripts I find:

    Always use random names for temp files. Even if you don't want to use mktemp, please do something as simple as appending .$$ to the end of the file names. While this may not prevent someone trying to force a race condition, think of what would happen if two copies of your script were started at the same time without ensuring that different instances use different temp files...
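
A minimal sketch of the safe pattern, pairing mktemp with a cleanup trap so the file disappears however the script exits (the template name is arbitrary):

```shell
# mktemp picks an unpredictable name; the trap removes it on any exit.
tmp=$(mktemp /tmp/myscript.XXXXXX) || exit 1
trap 'rm -f "$tmp"' EXIT

echo "scratch data" > "$tmp"
wc -l < "$tmp"
```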
  • One of my favorite shortcuts in bash/csh is !$ which is expanded to the last argument you typed on the previous command line. If you often issue multiple commands on the same argument, e.g. to manipulate the same file several different ways, this can save you a ton of keystrokes.

    I also find myself using this shortcut when I'm tracking down spam:
    $ nslookup bestmortgageratesintheworld.com
    $ whois !$
    $ traceroute !$
    Beats typing the argument over and over - especially when it's long - and it's faster than hitting the up-arrow and editing the previous command. Hope someone finds this useful, I've already pulled a few great tips out of this thread myself.

    Shaun
  • I'm surprised no one's yet mentioned the command to "read mail, real fast".
  • Pushd and popd made my day when I learned about them. I now never use unix without 'em!

  • Unix Guru's Universe (Score:3, Informative)

    by GeorgeH ( 5469 ) on Sunday June 02, 2002 @03:31PM (#3627354) Homepage Journal
    UGU offers a Unix tip of the day at http://www.ugu.com/sui/ugu/show?tip.today [ugu.com] - that should keep you in tips for a while.
  • ssh -R.

    - A.P.
  • by crucini ( 98210 ) on Monday June 03, 2002 @03:17AM (#3629550)
    1. Tcpflow [circlemud.org] - read contents of tcp traffic in real time. Great for watching browser/webserver interactions.
    2. Netcat [atstake.com] - connect Unix pipes to TCP sockets. Should have been part of Unix since the advent of TCP/IP. Great for rigging a temporary "server" to see if a client is connecting as advertised: nc -lp 80.
    3. X Resources (as seen in ~/.Xdefaults) - you can make xterms really dark, even when running colored apps like mutt, with dark Xresources like: XTerm*color9: #690000 - man xterm for meanings of color0-15.
      xrdb -merge .darkXres to use.
    4. Xmessage - useful in crontabs to remind you of periodic things - like remembering to go home. With the right params, it can take over the whole screen, which is hard to ignore.
    5. perl -pi.bak -e's/chocolate/vanilla/g' *.recipe - change a bunch of files, leaving backups.
  • by realdpk ( 116490 ) on Monday June 03, 2002 @01:41PM (#3632245) Homepage Journal

    To make all of your files and directories world-readable with one command:

    $ chmod -R go+rX .

    (the X is the key, you don't have to worry about executables vs. directories)

    Need a list of numbers of letters or anything? Check out jot (on FreeBSD, maybe on others):

    $ jot 3
    1
    2
    3

    $ jot -w %c 3 65
    A
    B
    C

    (This is especially useful with for loops - "for i in `jot 20`;do touch foo.$i;done" will generate foo.1 through foo.20. Extra hint: -w %02d will give you a leading zero on 0-9)
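
On Linux boxes without jot, GNU seq covers the numeric cases (a sketch; seq has no equivalent of jot's %c letters trick, and the demo directory is arbitrary):

```shell
seq 3          # counts 1 2 3, one per line
seq -w 9 11    # -w pads to equal width: 09 10 11
mkdir -p seqdemo
for i in $(seq -w 1 20); do touch "seqdemo/foo.$i"; done
ls seqdemo | head -n 3
```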

    grep's -A, -B, and -C flags can be very useful. Using them, you can have grep display the lines immediately before and/or after a match.

    If you have a lot of files in a directory, so many that "*" overflows the argument-length limit, bash (and probably other shells) lets you get around this by changing your command to one using a for loop. It won't be as fast, but it won't churn for several minutes only to tell you it won't work. ;)

    instead of:

    $ rm *

    do

    $ for i in *;do rm $i;done

    You can get a list of all files in your directory excluding . and .. with 'ls -A'.

    One of the most annoying things about 'less', at least in Debian and probably other dists, is that it clears the screen when you exit. Ugh! You can fix this problem by setting PAGER='less -X' in your environment.
