112

I would like to open a discussion that would accumulate your Linux command line (CLI) best practices and tips.

I've searched for such a discussion to share the below comment but haven't found one, hence this post.

I hope we all could learn from this.

You are welcome to share your tips on Bash, grep, sed, AWK, /proc and all other related Linux/Unix system administration and shell programming best practices, for the benefit of us all.

HopelessN00b
  • 53,385
  • 32
  • 133
  • 208
Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
  • Why Linux only? Many tips may be useful on almost all flavours of Unix. – mouviciel Mar 02 '09 at 19:49
  • This question might be a tad broad - there is enough to Linux/Unix command line best practices to fill entire books... – Jonik Mar 02 '09 at 19:50
  • I agree that Unix tips that also apply to Linux are very much welcome. I wrote Linux because A. this is what we work with here, and B. it has gained wider public exposure than Unix in recent years. – Maxim Veksler Mar 02 '09 at 19:57
  • I think it's a great idea to compile a list of the most useful and interesting commands that people use. –  Mar 02 '09 at 19:57
  • Are any of these relevant? http://refspecs.freestandards.org/ –  Mar 02 '09 at 19:58
  • Unfortunately, there is no such thing as "the" linux command line. There are at least half a dozen shells, all subtly different. –  Mar 02 '09 at 19:59
  • @Neil Butterworth: That is true, but please agree that the common one is bash using either Konsole or Gnome-Terminal. So this list should still be relevant for most, quoting python zen: "Special cases aren't special enough to break the rules." – Maxim Veksler Mar 02 '09 at 20:11
  • @Shane, on a second thought, yeah, this will probably work just fine, especially with the voting system bringing forth the best pieces of our common knowledge – Jonik Mar 02 '09 at 20:36
  • I still think SU might have been the best suited trilogy site for this poll. If you look at the answers, most of them are general Linux "power user" tips whose usefulness is by no means limited to professional sysadmins. – Jonik Sep 27 '09 at 12:33

85 Answers

111

Use screen, a free terminal multiplexer developed by the GNU Project that will allow you to have several terminals in one.

You can start a session and your terminals will be saved even when your connection is lost, so you can resume later or from home.
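
A minimal workflow sketch (the session name "work" is just a placeholder):

screen -S work        # start a new session named "work"
# ... work away, then press Ctrl-A d to detach (or lose the connection) ...
screen -ls            # list the sessions that are still running
screen -r work        # reattach later, even from another login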

  • 2
    The most common switches I use are 'screen -D -R' for detaching and resuming my last screen session and 'screen -x' to view the same screen session from multiple logins. Also, get yourself a nice .screenrc from http://dotfiles.org/.screenrc – Nick Devereaux Mar 05 '09 at 01:41
  • 4
    Another nice thing about screen is that you can share a terminal with other people when you want to collaborate on something, using screen -x – gareth_bowles Mar 12 '09 at 18:40
101

SSH!
SSH is the god command -- I think it's the most valuable overall command to learn. The options can be pretty daunting, but it seems like I'm constantly learning to use new command-line options for SSH that I never thought would be necessary. I may have used them all at this point.

The more you use it, the more you learn about it. You can use it to do some AMAZING things.

Note: ALL these things are doable remotely with no setup on your server except having the SSH server running.

Mount a file system over the internet

search the net for SSHFS
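
For example (a sketch; sshfs must be installed, and user@server and the paths are placeholders):

mkdir -p ~/remote
sshfs user@server:/home/user ~/remote    # mount the remote home directory locally
fusermount -u ~/remote                   # unmount when done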

Forward commands.

The SVN+SSH protocol is Subversion from a remote client to a server with NO DAEMON running on it! The SVN command starts the server through the ssh shell and passes the info back and forth through the existing pipe. The rsync program does the same thing, running against a server with no rsync daemon by starting one itself via SSH. It's easy to write your own bash files to do similar tricks.
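
Both look something like this (hostnames and repository paths are placeholders):

svn checkout svn+ssh://user@server/var/svn/repo/trunk project
rsync -av -e ssh project/ user@server:/backup/project/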

Chain to get through firewalls

I use this all the time to jump through my Linux server at home to my Mac.
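
A sketch of such a hop (hostnames are placeholders; -t keeps the inner session interactive):

ssh -t user@homeserver ssh user@mac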

Forward ports:
Seems only moderately useful until you realize that you can bounce through your home firewall and configure your router at home from work, as though you were doing so from within your home network.
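
For example, to reach the router's web interface at home (addresses are placeholders):

ssh -L 8080:192.168.1.1:80 user@homeserver
# then browse to http://localhost:8080 on the work machine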

Forward X requests:

This is another amazing one. With or without an X server running on your remote system, you can run an X Windows program and the window will appear on your local screen. Just use the -X switch, that's all!

Since you don't have to have an X server running on your remote server, the CPU impact on your server is minimal, you can have a TINY Linux server that serves up huge apps to your powerful game PC running Windows and cygwin/X.

Of course vi and Emacs work over SSH, but when I'm working at home, sometimes I want more. I use ssh -X to start up a copy of Eclipse! If your server is more powerful than your laptop, you've got the GUI sitting right there on your laptop, but compiles are done on your server, so don't worry about system load.

Run in batch files

(meaning run a local batch file that "does stuff" on other systems):

Two things combine to make this one cool. One is you can eliminate password prompts by using (more secure) encryption keys. The second is you can specify a command on the SSH CLI. I've used this in some interesting ways -- like when a compile failed on the remote server, I would have it SSH into my computer and play a sound file.

Remember you can redirect the output from the remote command and use it within your local batch file, so you could also be locally tailing a compile running on your server.
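
A small sketch of that (hostname and path are placeholders):

# run a command remotely, filter its output locally
ssh build-server 'tail -n 100 /var/log/build.log' | grep -i error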

Built in to Mac

Both server and client are built into both Mac and Linux. In the case of the Mac and Ubuntu, enabling the server is as simple as finding the right checkbox.

On a PC install cygwin or cygwin/X (cygwin/X will allow you to forward your x-window output from your Linux machine to your Windows PC--it installs an X server)

Important Tips/config file

Never use port 22 on your firewall. You'll get a lot of hack attempts, it's just not worth it. Just have your firewall forward a different port to your server.

There are extensive configuration options that allow you to simplify your ssh commands significantly. Here's an example of mine at work:

Host home
    hostname billshome.hopto.org
    Port=12345
    user=bill
    LocalForward=localhost:1025 mac:22

When I type "ssh home" (nothing else), it acts as though I had typed:

ssh -p 12345 bill@billshome.hopto.org

and then forwards my local port 1025 to my system "mac" at home. The reason for this is that I have another entry in my file:

Host mac
    hostname localhost
    port=1025

so that once I've done an "ssh home" and still have the window open, I can type "ssh mac" and the computer at work here will actually try to connect to its own port 1025 which has been forwarded to "mac:22" by the other command, so it will connect to my Mac at home through the firewall.

Edit--cool script!

I dug up an old script I just love--had to come back and post it for anyone here who might be interested. The script is called "authMe"

#!/bin/bash
if [ ! -f ~/.ssh/id_dsa.pub ]
then
    echo 'id_dsa.pub does not exist, creating'
    ssh-keygen -t dsa
fi
# append our public key to the remote account's authorized_keys
ssh "$1" 'cat >> .ssh/authorized_keys' < ~/.ssh/id_dsa.pub

If you have this script in your home directory and there is a host you can connect to (via ssh), then you can type "./authMe hostName".

If necessary it will create a public/private keypair for you, then it will ssh over to the other machine and copy your public key over (the ssh command will prompt you for a password...)

After this, the SSH command should not ask for your password any more when attaching to that remote system, it will use the public/private keypair.

If your remote computer is not always secure, you should consider setting a "passphrase" when prompted.

You may also want to configure the ssh server on the far end to not allow text passwords (only keys) for additional security.

Bill K
  • 1,189
  • 1
  • 6
  • 7
  • 4
    +1 very nice and compact post – Karsten Mar 03 '09 at 12:20
  • 2
    A small note on SSH over port 22: provided you've got a good root password and your users don't have terrible passwords, just running fail2ban (or something similar) is enough to keep out crack attempts. I found it really annoying to keep SSH on another port. – David Wolever Sep 13 '09 at 16:04
  • 1
    It can be irritating at times, but the volume of connects was just scary. I know that they aren't supposed to get through, but it's kind of like if you had someone walking up to your house and fiddling with your doorknob every minute--just unnerving. –  Sep 14 '09 at 15:49
  • There's a built-in version of your last command in most versions of ssh nowadays, it's called `ssh-copy-id`, and it's as simple as `ssh-copy-id user@someserver` – JamesHannah Oct 20 '09 at 15:20
  • Another way to do your 'ssh mac' trick is to use a ProxyCommand directive in your .ssh/config file. Host mac ProxyCommand ssh -q home "nc %h %p" This will run netcat (nc) on home and redirect your ssh connection straight through to mac. I find those easier to chain (for example if you need to connect to 'home' to get to 'work bastion' to get to 'work web server'. – toppledwagon Oct 20 '09 at 20:36
  • You might consider switching from dsa to rsa in your keypairs – ptman Nov 24 '09 at 10:56
  • Actually, a passphrase is more useful if the **local** computer is insecure. – Fahad Sadah Oct 30 '10 at 14:44
73

I like to use

cd -

to switch to the previous directory. Very useful!

Fortyrunner
  • 161
  • 1
  • 4
  • 6
    I never knew this! Thanks! On a related note, pushd and popd do something similar, by building up a stack of dirs. These sort of tricks help when you are navigating around a deep dir structure. –  Mar 03 '09 at 19:40
  • This is sheer enlightenment. – Manuel Ferreria Mar 07 '09 at 21:07
69

I've recently discovered the pv command (pipe viewer) which is like cat but with transfer details.

So instead of

$ gzip -c access.log > access.log.gz

You can use

$ pv access.log | gzip > access.log.gz
611MB 0:00:11 [58.3MB/s] [=>      ] 15% ETA 0:00:59

So instead of having no idea when your operation will finish, now you'll know!

Courtesy of Peteris Krumins

Nick Devereaux
  • 191
  • 4
  • 9
53
sudo !!

Rerun the previous command as root.

[The current top command on the site http://www.commandlinefu.com, a site themed along the lines of this question.]

codeinthehole
  • 313
  • 2
  • 6
  • 10
45

The command line is a funny thing. I think that you can only learn so much on your own and the rest you learn by accident watching somebody else using a command line.

I was using the shell for years painstakingly typing in directory names by hand. One day I was watching a friend mess around on a system and he kept hitting the tab key. I asked "why are you hitting tab?". Answer: it tries to complete the directory or filename. Who would have guessed--tab completion! Type a bit of the file or directory, hit tab, and it will try to finish what you typed (behavior depends on which shell though).

One day, said friend was watching me on the command line and watched me type something like:

coryking@cory ~/trunk/mozi $ pushd /etc
/etc ~/trunk/mozi
coryking@cory /etc $ popd
~/trunk/mozi
coryking@cory ~/trunk/mozi $

Who would have guessed!? He never knew about popd / pushd. Guess we are even...

Cory R. King
  • 101
  • 1
  • 4
  • 1
    To me tab completion seems like something "obvious", very basic, but I never knew about popd/pushd either. Funny indeed. :) – Jonik Mar 02 '09 at 20:46
  • 3
    Also see this answer to be able to tab-complete a lot more than commands, files and directories: http://stackoverflow.com/questions/603696//603919#603919 – Jonik Mar 02 '09 at 21:20
  • 1
    That's so true. I learned about bash's backward-history-search (ctrl-r) by watching someone. I later realized that it was a readline feature and worked the same in other programs that incorporate readline (mysql, gdb, python -i, clisp etc). – sigjuice Mar 23 '09 at 05:41
  • 2
    How could you use "the shell" for years and not know about tab completion? – prestomation Dec 18 '09 at 21:36
  • When dealing with pushd/popd don't forget to mention the command **dirs**. It shows what's on the pushd/popd stack. – slm Jul 30 '11 at 04:42
45

Press Ctrl-R and start typing a command (or any part of it) - it searches the command history. Hitting Ctrl-R again will jump to the next match, enter executes the currently displayed command, and right-arrow (at least) will let you edit it first.

$ (reverse-i-search)`svn': svn status

I had used Linux for something like 7 years as my main OS before learning of this, but now that I know it, it's quite handy.

Jonik
  • 2,911
  • 4
  • 37
  • 48
  • Thanks. I've been using the reverse search, but haven't been able to figure out how to jump to the next match. – Adam Mar 07 '09 at 16:56
  • Ctrl-S jumps to the next match. You may have to do `stty -ixon` first to turn off the XON/XOFF flow-control meaning (this also makes Ctrl-Q available, by default it's mapped to be the same as Ctrl-V quoted-insert, but you can change it to something else). See `man stty` and `man readline` for more information. The `stty` command can be added to your `~/.bashrc` – Dennis Williamson Sep 26 '09 at 14:58
41

Learn Vim.

It is (arguably) the best editor, but most certainly the best editor available on a bare Linux server.

  • 1
    gvim is also best file browser on windows ;) –  Jun 16 '09 at 19:12
  • This is also extremely useful as once you know vim, you can put most shells into vi-mode, and navigation becomes extremely easy and fast – phemmer Jan 27 '12 at 02:59
38

It is sometimes useful to leave a program running even after you have logged out. I've seen some solutions that use nohup or even screen for that purpose. The simplest I know of is:

$ your_command_here & disown

You can also detach a running program:

$ your_command_here
# Press <Ctrl-Z> to suspend the program, then resume it in the background:
$ bg
$ disown
kyku
  • 97
  • 2
  • 7
38

When I want to make sure that I use actual command and not an alias, I use a leading backslash:

\rm -rf ~/tmp
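
Two other spellings that also skip alias expansion in bash, for the record:

command rm -rf ~/tmp    # the 'command' builtin bypasses aliases and functions
'rm' -rf ~/tmp          # quoting any part of the word prevents alias expansion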
mouviciel
  • 476
  • 4
  • 6
38

This tip will make your CLI more comfortable (at least it does for me):

create ~/.inputrc file with following contents:

"\e[A": history-search-backward
"\e[B": history-search-forward

Reload bash (e.g. by typing "exec bash"). When you type a prefix of a command and press the up arrow, you will browse only the commands starting with that prefix; for example, if you type ssh it will show your former connections to remote shells. If your prompt is empty, the up arrow browses the history the normal way.

kyku
  • 97
  • 2
  • 7
  • 3
    Usually I bind those to \e[5~ and \e[6~ (aka pageup and pagedown) instead of rebinding the arrows, but it is more convenient than Ctrl-R/Ctrl-S :) – ephemient Mar 02 '09 at 21:06
  • It's also educational to read `man readline` and memorize it's default shortcuts – SaveTheRbtz Jul 06 '10 at 18:35
  • You can also do `bind -f ~/.inputrc` to reload the inputrc without having to start a new bash and wipe out your current environment. – phemmer Jan 27 '12 at 03:02
36

Use && instead of ; when executing multiple commands at once. It stops when an error occurs and does not execute the other commands.

Classical example:

./configure && make && make install
x-way
  • 216
  • 2
  • 4
  • 1
    This practice is so underrated! It should be standard practice to end each line of a shell script with && and use "true" as the final command. – user18911 Mar 05 '09 at 02:57
  • 3
    what's funny is that I didn't know about the ';' command before--I had ALWAYS used '&&'. – Nick Klauer Mar 15 '09 at 02:45
  • 4
    You can start shell scripts with "#!/bin/sh -e". This stops if any line fails. – stribika Aug 08 '09 at 11:32
  • 3
    You can use a double pipe for the "else" clause: `dosomething && echo "success" || echo "teh fail"` – Dennis Williamson Sep 26 '09 at 15:04
  • 2
    There was a recent note on Planet Debian that it's bad practice to use the -e on the shebang line. Instead use set -e in the body of the script. This way the script doesn't work differently if someone happens to run in using sh $filename. – ptman Nov 24 '09 at 11:00
32

When writing loops on the command line in bash I often prefix risky commands with the 'echo' command.

for item in $items; do echo something-risky; done

This way I get to see the 'something-risky' in full before committing to run it. Helps when your risky command includes variable expansions and globs. And 'set -x' is very useful when writing bash scripts.

'set -x' enables debugging. See http://tldp.org/LDP/Bash-Beginners-Guide/html/sect_02_03.html for information on bash debugging.
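
A quick sketch of both habits together:

for f in *.log; do echo gzip "$f"; done    # dry run: read the commands first
set -x                                     # then trace the real thing
for f in *.log; do gzip "$f"; done
set +x                                     # tracing off again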

Jonathon Watney
  • 621
  • 1
  • 10
  • 17
31

Requirement

You need to test maximum bandwidth between 2 servers.


Solution

On SERVER1 do:

nc -u -l 54321 > /dev/null

On SERVER2 do:

dd if=/dev/zero bs=1MB | nc -u SERVER1 54321 &
pid=$(pidof dd)
while (( 1 )); do kill -USR1 $pid; sleep 2; done

You will see output such as:

182682000000 bytes (183 GB) copied, 1555.74 seconds, 117 MB/s
182920+0 records in
182919+0 records out

117 MB/s is the interesting factor here, which shows the actual network transfer bandwidth.

Explanation:

As packets start flowing over the network you will be able to see bandwidth statistics on SERVER2, which is a pretty good estimate of the actual maximum possible bandwidth between the 2 servers.

Copying over UDP (to avoid TCP overhead).

Copying from memory (/dev/zero) on SERVER2 to memory (/dev/null) on SERVER1, thus preventing disk I/O from becoming the bottleneck.

Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
23

Seeing directory structures in color is easier.

alias ls="ls --color=tty"

Edit

alias ls="ls --color=auto"
Suroot
  • 171
  • 2
  • 5
    I prefer --color=auto because it doesn't send rubbish control chars if you redirect to a file. You can also do alias grep="grep --color=auto" to have matching patterns highlighted. –  Mar 02 '09 at 19:51
  • +1 for --color=auto; could someone augment the answer with that? – Jonik Mar 02 '09 at 20:40
  • I think tty does the same thing as auto here. – joshudson Mar 02 '09 at 21:54
  • Updated to reflect the color=auto – Suroot Mar 03 '09 at 00:11
  • Strangely, the coloring makes things much less readable for me. Yellow-on-white, for example, isn't a good coloring choice. –  Mar 03 '09 at 02:52
  • Not talking about the Yellow on white or green on black. More about the ability to see the difference in the file types. I.e. blue for directories, green for symlinks, etc... – Suroot Mar 03 '09 at 03:32
  • You can change your terminal's colors. For xterm, use .Xdefaults, XTerm*background: black XTerm*foreground: gray XTerm*cursorColor: yellow XTerm*color0: black and so on. If you have white background, often default colors don't work right. – ypnos Mar 28 '09 at 23:40
  • In MacOSX: alias ls="ls -G" –  May 14 '09 at 03:05
20

Some small pointers regarding log viewing:

  1. You can use tail -F to keep watching the log after it is truncated (for example by log4j's rotation).
  2. You can use less: open less, press Shift+F to emulate tail's behaviour (see the example below). The -S switch is also useful, to disable line wrapping. less lets you search in the logs.
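
For example (the log path is a placeholder):

less -S +F /var/log/syslog    # +F starts in follow mode; Ctrl-C drops back to normal less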
Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
20

I learned a trick from a friend a long while back to easily change a file's extension:

mv my_filename.{old,new}

Shell expansion will expand this to:

mv my_filename.old my_filename.new

This can also be used to rename a file in other ways, such as adding something in:

mv my_{,cool_}filename
Scotty Allen
  • 101
  • 2
  • 6
    Just be sure not to put any wildcards or other pattern expressions in the same command, or the glob could expand to more than just 2 arguments! mv *.{old,new} is likely to produce really surprising results. –  Sep 13 '09 at 16:24
  • Yes, agreed - this is an important caveat:) –  Sep 13 '09 at 17:44
  • 2
    You can also type, for example, `ls -l /bin/c` then press Alt-Shift-{ and it will complete this to something like `ls -l /bin/c{at,h{grp,mod,own,vt},p{,io},sh}` for you. – Dennis Williamson Sep 26 '09 at 16:05
19

Use "tar xf" to extract compressed archives. The j for bzip2 and z for gzip are not necessary, as tar will detect file type of your archive. Neither is '-' sign before arguments. You will save a lot of time (over a millenium ;-)).

Better yet, use unfoo to extract any archive with a single command, without any unnecessary arguments.

kyku
  • 97
  • 2
  • 7
17

Install the bash-completion package. It contains a number of predefined completion rules for the shell. Enable it by typing "source /etc/bash_completion" if your distro doesn't do it for you. Then, for example, whenever you complete after kpdf you will only see a list of directories and PDF files. It is even smart enough to complete remote files after scp server:XXX (if you have enabled passwordless logins).

kyku
  • 97
  • 2
  • 7
17

I use these ones constantly

ALT-. (ESC+. in some terminals) copies last used argument (super-useful)

CTRL-W deletes word

CTRL-L clear terminal (like clear command but faster)

ALT-B (ESC+B in some terminals) move backward a word

ALT-F (ESC+F in some terminals) move forward a word

CTRL-E jump to EOL

CTRL-A jump to BOL

CTRL-R search in history

Eduardo Ivanec
  • 14,531
  • 1
  • 35
  • 42
Álvaro
  • 183
  • 4
  • 10
16

In bash, I use !$ a lot. It repeats the last argument of the last command:

ls /really/long/command/argument/that/you/dont/want/to/repeat.txt
vi !$

It will run the ls, then if you want to edit it, you do not have to retype it, just use !$. This is really useful for long path/file names. Also, !* repeats all the previous command's arguments. I don't use that as much, but it looks useful.

I know they've been mentioned, but I use vim, screen, and cd - a lot.

I forgot noclobber:

set -o noclobber

From man bash:

If set, bash does not overwrite an existing file with the >, >&, and <> redirection operators. This may be overridden when creating output files by using the redirection operator >| instead of >.
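
In action:

set -o noclobber
echo hi > file.txt     # fails if file.txt already exists
echo hi >| file.txt    # >| explicitly overrides noclobber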

gpojd
  • 132
  • 6
14

Switch from bash to zsh, and see your productivity improve:

  • Really intelligent, scriptable tab completion. It completes not just command lines but all the options, names of man pages, package names (for apt-get / emerge etc), and what have you. And provides a helpful explanation of the options during the completion. All this without using any scrollback space after the command has been issued.
  • Tab completion of wildcards; write cat *.txt, press tab, and choose between any files that match the expression.
  • Switch directories just by typing their names.
  • Several line editor modes. Can behave like vi or emacs if you want it to.
  • Customizable key bindings to do whatever you wish.
  • Themeable prompts, including the ability to put prompt information on the right side of the screen and have it auto-hide when typing a long command

Google will tell you many more benefits.

flodin
  • 101
  • 3
  • 1
  • you should give more than a link. sounds interesting, but sell it to me man! –  Mar 02 '09 at 20:47
  • Did they manage to include unicode support? –  Mar 02 '09 at 20:50
  • @Cory Alright, I just didn't want to plagiarize and it felt redundant to write my own when there's so much good stuff written by others already :) –  Mar 02 '09 at 20:54
  • @kyku yes, there's proper unicode support since maybe a year back. –  Mar 02 '09 at 20:55
  • looks cool. i'll give it a shot! –  Mar 02 '09 at 21:24
  • the bash_completion package does have the same completion ability that you describe here, completion over ssh or anything else. –  Mar 03 '09 at 09:20
  • hmm, but zsh completes them and rotates the completion list inline too. you dont need to type anything but the TAB key :P – Sujoy Mar 03 '09 at 09:38
  • AFAIK, bash completion doesn't do wildcard completion, though you have echo as a kind of workaround. – RamyenHead Sep 13 '09 at 12:08
  • Bash has most of the items in the list above: "rotating" completion: bind menu-complete to Tab, `vi` or `emacs` mode: `set -o vi` or `set -o emacs`, customizable key bindings via readline: show current settings with the -p, -P, -v and -s (and other) options to `bind`, prompts can be customized (a lot) but I wouldn't call them "themeable", switch directories just by typing their names: Bash 4 has `shopt -s autocd` to enable this – Dennis Williamson Sep 26 '09 at 16:02
13

Old school, moving a directory tree from one place to another, preserving symlinks, permissions and all that good stuff:

tar cBf - . | (cd /destination; tar xvBpf -)

or across the network

tar cBf - . | rsh foo.com "cd /destination; tar xvBpf -"

New school:

rsync -urltv . /destination

or across the network

rsync -urltv -e ssh . foo.com:/destination
Paul Tomblin
  • 5,217
  • 1
  • 27
  • 39
  • `tar xvBpf - -C /destination` == `(cd /destination; tar xvBpf -)` A little less typing, and easier to use over ssh. – ephemient Mar 02 '09 at 21:08
  • 1
    -a is the same as '-rlptgoD' - that is recurse, copy symlinks as symlinks, preserve perms, preserve timestamps, preserve groups, preserve owners, copy device files and special files as device and special files. Also, between machines, -z (compress files before sending, decompress after receiving) is useful. rsync -avz [-e ssh] is your friend :-) – dr-jan Apr 20 '09 at 21:10
  • Does -z do any good over ssh? I wouldn't think so. – Paul Tomblin Apr 20 '09 at 22:44
  • `rsync -avH` is best on a single machine - the `H` ensures that hard links are transferred correctly without turning each link into a separate file, though if you have enormous numbers of files it can use a lot of memory. `rsync -avzH . foo.com:/destination` is the equivalent for the network, and usually you don't need the `-e ssh` as rsync will know to use ssh. – RichVel Aug 27 '11 at 07:41
  • Actually, these days I'm using `rsync -aSHuvrx --delete / --link-dest=/backup/$PREV /backup/$HOUR` – Paul Tomblin Aug 27 '11 at 16:55
13

Requirement: You have a directory containing a huge number of files you would like to delete. rm with a wildcard will fail!

example

find /var/spool/mqueue/ | wc -l
191545
rm -f /var/spool/mqueue/*
-bash: /bin/rm: Argument list too long

Solution:

find /var/spool/mqueue/ -xdev -exec rm -f '{}' +

Explanation:

Edit: Fixing explanation following @ephemient's comment.

find will supply arguments to rm up to the maximum allowed argument count. This allows rm to delete files in batches, which is the fastest technique I know of without using find's own -delete operation. It's equivalent to

find /var/spool/mqueue -xdev -print0 | xargs -0 rm -f

which you may find useful if your find does not support -exec ... +.

Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
11

I use these two tips so often that I thought it would be a good idea to share:

!foo

Will launch the last command in the history file beginning with "foo" (I often use it when compiling: !gcc, for example).

The other one is a keyboard shortcut (Ctrl+O instead of Return): it will execute the command AND display the next command from the history file. For example, when I compile and test a file I always run the same 3 or 4 commands: make, cd to the test dir, run the test, cd back to the makefile dir. Using Ctrl+O makes this task a lot easier :)

Hope this helps!

10

How to use Subversion in Linux without the help of fancy graphical interfaces, which may not be available.

svn co <repo-url> . # Checks out project into current folder

svn update # Get latest changes from server

svn status # Shows what files have changed locally

svn add <path/filename> # Add some file to the repo

svn commit # Commit your changes to the repo

Strangely enough, this holds a lot of developers back from using Linux.

postfuturist
  • 101
  • 5
  • rapidsvn is a nice little gui for svn under Linux. –  Mar 02 '09 at 20:15
  • Many fancy, powerful GUIs are certainly available (like that in IntelliJ IDEA), but I like the possibility of using either those or the command line (also powerful, but differrently) - whichever suits the current situation better – Jonik Mar 02 '09 at 21:18
  • My favorite by far is NautilusSVN (much like TortoiseSVN): http://code.google.com/p/nautilussvn/ – user7655 Mar 03 '09 at 14:00
10

Most overlooked old-school command: find

Yesterday I repaired a gross permissions bug:

for u in nr dias simonpj; do
   sudo -u $u find . -type d -exec chmod g+s '{}' ';'
done
Norman Ramsey
  • 645
  • 2
  • 10
  • 24
  • Yes! Learn the find command. Find mixed with xargs, sed and awk makes for a file processing juggernaut! – Jonathon Watney Mar 03 '09 at 01:08
  • Agreed! Although I never remember anything about sed and awk - I always need to relearn it! I should really print out a cheatsheet. –  Mar 03 '09 at 01:22
  • Yes, but don't forget to use -print0 and the -0 option to xargs to avoid tricky problems involving filenames containing unusual characters (e.g. backslash). – user18911 Mar 05 '09 at 02:55
  • Yup, probably half the time I use find it's to fix stupid permission things: `chmod 644 $(find . -type f)`, `chmod 755 $(find . -type d)` – David Wolever Sep 13 '09 at 16:33
  • 1
    Don't forget the `-perm` option to `find` to locate files that have permissions set a certain way. – Dennis Williamson Sep 26 '09 at 16:07
8

For shells like bash and ksh, you can make the command line respond to navigation commands like your text editor:

set -o emacs

or

set -o vi

will allow you to search your history and move about the command line in the way that you are used to doing in text files (e.g. in vi mode, pressing ESC then typing /string will search your previous commands for "string" - you can use n and N to move between matches)

Tim Whitcomb
  • 101
  • 3
8

grep is my friend. Seriously.

List .rb files that contain the text class foo:

grep -l "class foo" .rb

List .rb files that don't contain the text class foo:

grep -L "class foo" *.rb

List .rb files that don't contain foo or bar (you can use any regexp with -e, but you need to escape the operators):

grep -L -e "foo\|bar" *.rb
  • You might be interested in a tool called "ack", which does the same thing as grep but will scan recursively only sources files. Give it a try (it's free): http://betterthangrep.com/ –  May 05 '09 at 18:40
8

Substitution on the previous command is performed with ^:

/home/eugene $ ls foo.txt
foo.txt
/home/eugene $ ^ls^rm
rm foo.txt
/home/eugene $ ls foo.txt
ls: cannot access foo.txt: No such file or directory
  • 2
    Actually I prefer "!!:gs/foo/bar" since I often find that foo is mentioned more than once in the previous command and I want a global replacement (e.g. "mv foo.txt foo.bak"). – user18911 Mar 05 '09 at 03:07
7

$_ is an environment variable for the last argument from the previous command

So if I create a new directory

$ mkdir ~/newdir/

To enter I simply go

$ cd $_

This is handy for complicated and large texts such as URLs, directories, long file names etc.


You can also refer to each argument in the previous command using !:{number}

$ echo !:0 !:1 !:2

Note that bash will expand this before you execute it (to see this press up to go through your history).

$ touch one two three
$ ls !:1 !:2 !:3 

Unlike $_ which is an environment variable, this will expand to 'ls one two three', perform the action and print the command to the shell. This method is a lot harder (in my opinion) to use than $_ which I use much more frequently.

Note: you can also use !$ instead of $_, but the former is a history expansion (it is expanded on the command line, like the examples above), while $_ is an ordinary variable.

Nick Devereaux
  • 191
  • 4
  • 9
  • 2
    A nice shortcut for getting the last argument of previous command(s) is M-. (aka (left) Alt-.) This will insert the same stuff as $_ with a single key press. Pressing it repeatedly will bring last arguments from previous commands. There's also a way of inserting other arguments,but I'm short of time –  Mar 05 '09 at 08:09
6

There was an answer about using pushd/popd to remember directories. If you want to temporarily visit some directory, you can save typing by using cd - command, like this:

/home/eugene $ cd /etc
/etc $ cd -
/home/eugene $

Also, one of my favorite commands is xargs. I use it very often with find, but it can be also handy in other situations. For example, to find which command line arguments were used for starting some process, you can use the following command on linux:

 $ xargs -0 echo < /proc/[PID]/cmdline

In some cases (especially when working with source code) ack is more handy than grep because it automatically searches recursively and ignores backup files and version control directories (like .svn, .hg). No more typing long command lines like find . -name \*.c | xargs grep 'frobnicate'.

  • To temporarily visit a directory and jump back automatically, use a subshell: `(cd test; dosomething)` leaves you in the directory you started from. – Dennis Williamson Sep 26 '09 at 16:13
6

A nice one I've seen today from a friend.

Empty the application's log (for the next launch and such):

> /var/log/appname.log

(Note that the > is part of the command.)

This is almost the same as doing:

echo '' > /var/log/appname.log
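
With GNU coreutils there is also an explicit spelling of the same idea:

truncate -s 0 /var/log/appname.log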
Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
  • That's kind of scary, actually... especially when you copy/paste from a terminal or website where the prompt ends with '>'... – Thomas Mar 14 '09 at 06:18
  • The two commands are not equivalent. The first one truncates the file (file is 0 bytes). The second one truncates the file and then adds a newline (file is 1 byte). – Ole Tange Jun 22 '10 at 11:05
5

Here is an excellent collection of tips I came across on digg today.

Satish
  • 227
  • 3
  • 4
  • 9
5

I just finished reading Linux tips every geek should know at TuxRadar. Very nice articles for those who know a little bit but want to know more.

matpie
  • 453
  • 1
  • 5
  • 9
5

cut and grep are a very nice way to manage plain text files. You can retrieve whatever you want: cut lets you split the file "vertically" (by columns), and grep lets you split it "horizontally" (by lines).

The following command splits each line on ';' and returns only fields 1 and 3:

$cut -d ';' -f 1,3 hello.txt

With grep (a well known command) you can do the same for lines. The following command keeps only the lines that are of interest to you:

$grep error hello.txt

grep can also work in reverse (-v), ignoring the lines that match the pattern, and it accepts regular expressions.

But the real power of both comes from pipes. For instance:

$grep error hello.txt | cut -d ':' -f 1,3 | cut -d ' ' -f 1
FerranB
  • 1,362
  • 2
  • 18
  • 28
4

If you want to combine both stdout and stderr in a redirection, try using 2>&1, as in:

make > make.log 2>&1
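
Related variants (the &> shorthand is bash-specific):

make &> make.log           # same effect, shorter
make 2>&1 | tee make.log   # log it and watch it on the terminal at the same time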

Jared Oberhaus
  • 596
  • 6
  • 14
4

My ISP has a bandwidth cap, but a free-for-all window starting at 2 or 3am. So I schedule huge downloads for that window with at:

$ echo aptitude -d -y dist-upgrade | at 3am
$ echo wget http://example.com/bigfile | at 3am

The thing that originally confused me with at is that it takes the script on stdin, not on the command line.
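
Queued jobs can be inspected and cancelled afterwards:

atq       # list pending jobs with their job numbers
atrm 4    # remove job 4 (the number comes from atq)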

bstpierre
  • 431
  • 1
  • 3
  • 14
4

Use the watch command to repeat commands and observe results. If it's not supported (as on Solaris systems), try the following (bash):

while true ; do
    <cmd>
    sleep <n>    # n is the number of seconds between runs
    echo ""      # meaningful output here can be helpful;
                 # I like to use ">>>>>>" `date` "<<<<<<<"
done
bedwyr
  • 151
  • 1
  • 3
4
reset

or

stty sane

in case you ruin your terminal by accidentally catting a binary file!

dogbane
  • 944
  • 5
  • 8
4

You can follow more than one file with tail -f:

tail -f logfile1 logfile2

the updates will be intermingled depending on the order of occurrence:

==> logfile1 <==
event-a
event-b
==> logfile2 <==
event-p
event-q
==> logfile1 <==
event-c

If you want a cleaner display, use watch and omit the -f from tail:

watch tail logfile1 logfile2
Dennis Williamson
  • 60,515
  • 14
  • 113
  • 148
3

I'm surprised no one mentioned bash's built-in fc command (fc stands for Fix Command).

Use it to edit your previous command in an editor (e.g. vim) instead of on the command line, and execute it upon quitting the editor. Pretty handy.

fc [-e ename] [-nlr] [first] [last]
fc -s [pat=rep] [command]
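
A few usage sketches:

fc           # edit the previous command in $EDITOR, run it on exit
fc -l -10    # just list the last 10 history entries
fc -s ls=rm  # rerun the last command with ls replaced by rm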

Read more about it here

ArtBIT
  • 151
  • 1
  • 3
3

My personal, all-time favorite: CLI meta and persistent aliases.

With these aliases, one simple command (val) defines new shell aliases and keeps them available going forward. The concept can be extended to smaller "modes"/domains with additional alias files or aliases.

In ~/.alias.sh (make sure this gets sourced in your shell startup files):

# bash format example
alias sal='. ~/.alias.sh; echo ~/.alias.sh was sourced'
alias val='vi ~/.alias.sh; sal'

or in ~/.alias.csh (csh format; make sure it gets included in your shell startup files):

# csh format 
alias sal 'source ~/.alias.csh; echo ~/.alias.csh was sourced'
alias val 'vi ~/.alias.csh ; sal'
popcnt
  • 101
  • 3
  • +1 Cool, this looks like something I'll be making use of (with 'vi' replaced with some editor I can use :) – Jonik Mar 02 '09 at 22:20
3

My most common ls commands are

ls -lSr
ls -ltr

to sort files in order of increasing size and time respectively (to find the largest or most recent files). Also, if you don't have a colour terminal for some reason, or don't like colours (like me) then ls -F gives you the same sort of metadata as colours: '/' indicates a dir, '*' an executable, etc.

Also, learn to use find and xargs: the power of the command line is cobbling together smaller commands into something more powerful. These 2 commands are indispensable for that! One example of that cobbling follows.
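
A sketch (the pattern is a placeholder; du -k reports sizes in KB):

# the 10 largest *.log files under the current directory
find . -name '*.log' -print0 | xargs -0 du -k | sort -n | tail -10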

3

here-docs are kind of fun:

cat << EOF > /tmp/file.txt

The ${speed} ${color} ${animal} jumped over the ${structure}

EOF

Or, if using bash, fun with here-docs in loops (with the loop index i selecting array elements):

cat << EOF > /tmp/file.txt

The ${speed[i]} ${color[i]} ${animal[i]} jumped over the ${structure[i]}

EOF

Handy for generating HTML, PHP, configuration files, or damn near anything else inside of a shell script .. even other shell scripts :)

Tim Post
  • 1,515
  • 13
  • 25
3

Command-line editing keyboard shortcuts I use in bash:

CTRL-U - Delete text from cursor position back to home. Great for when you mistype a password, and don't remember if your terminal supports DELETE or BACKSPACE.

CTRL-A - Just like HOME key, even when your terminal doesn't send HOME correctly.

CTRL-E - Just like END

ALT-F - Move cursor forward by a word.

ALT-B - Move cursor backward by a word.

Shalom Craimer
  • 543
  • 9
  • 16
3

Add this to ~/.bashrc

# expand things like !310:1 after spaces.
bind Space:magic-space

It's a bit scary to use

  • !:0 (0th argument of last command, i.e. the program name)
  • !:2 (2nd argument of last command)
  • !! (the whole of the last command) (sudo !!)
  • !$ (last argument of last command)
  • !* (all arguments of last command)
  • !ssh (last command starting with ssh)
  • ^chunky^bacon (last command, except the first chunky is replaced with bacon)
  • !!:gs/chunky/bacon (... all chunkys are replaced with bacon)

without checking what's being substituted before running the command.

When you use wildcards like *.txt or globs like hello.{txt,org}, you can check what's about to happen with the echo command beforehand.

echo rm *.bak
rm *.bak

But for stuff like !:0, you don't use echo, because once echo is done, the last command is the echo command. You may have heard of this phenomenon as the "observation screws things up" principle in quantum mechanics.

Instead, you just add "bind Space:magic-space" to your ~/.bashrc and then, whenever you press space, stuff like !:0 is expanded right there.

RamyenHead
  • 311
  • 3
  • 6
  • 11
2

Quick & Simple data integrity verification

using nothing more than bash & md5sum.

This can prove priceless when debugging trouble after moving binary files over the network... You should embrace this technique as common practice for each copy of valuable data, to ensure 100% data integrity.

Setup some test data...

mkdir -p /tmp/mdTest/dir{1,2,3}
for i in `seq 1 3`; do echo $RANDOM > /tmp/mdTest/dir$i/file$i ; done

md5 hash calculation on the test data

cd /tmp/mdTest/
TMPMD5LIST=$(mktemp); (find . -type f -exec md5sum '{}' \;) > $TMPMD5LIST; mv $TMPMD5LIST list.md5sum

Data integrity verification from the hash:

cd /tmp/mdTest/
md5sum --check list.md5sum
./dir3/file3: OK
./dir1/file1: OK
./dir2/file2: OK

Unit test: Let's break one of the files.

echo $RANDOM >> /tmp/mdTest/dir1/file1
md5sum --check list.md5sum
./dir3/file3: OK
./dir1/file1: FAILED
./dir2/file2: OK
md5sum: WARNING: 1 of 3 computed checksums did NOT match
Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
2

Bash parameter expansions are great:

$ FOO=leading-trailing.txt

$ echo ${FOO#leading-} # remove leading match
trailing.txt

$ echo ${FOO%.txt} # remove trailing match
leading-trailing

$ echo ${FOO/-*/-following} # match a glob and substitute
leading-following.txt

Need to rename a bunch of files? Combine with a for loop:

$ for FILE in file*.txt; do mv -v $FILE ${FILE#file-}; done
file-01.txt -> 01.txt
file-02.txt -> 02.txt
file-03.txt -> 03.txt

Once you have a basic grasp on them you'll be surprised how often they're useful.

markdrayton
  • 2,429
  • 1
  • 20
  • 24
2

Sometimes while working in the terminal you need to open a file with the GUI application associated with it, for example PDF or MP3 files. You don't have to remember the exact name of that application's command; just use:

gnome-open some-file.pdf

BTW, the shortest alias I use is:

alias o=gnome-open

Very handy.

jackhab
  • 751
  • 1
  • 8
  • 20
2

Search in a gzipped file without first unzipping it:

gzcat someFile.gz | grep searchString
Millhouse
  • 113
  • 5
  • Similarly, bzcat works on bzip2'd files. Quite handy for patches that are distributed bzipped - bzcat patch.bz2 | patch -p1 avoids the need to extract it somewhere temporary. –  Mar 03 '09 at 01:27
  • 7
    How about zgrep, zless and zxpdf? ;-) –  Mar 03 '09 at 09:57
2

A few useful ones:

rsync -av old_location new_location

will copy a directory structure and preserve permissions and links.

sudo updatedb && locate filename

to find files quickly (requires findutils)

apropos term_or_terms

searches the manpages.

rlbond
  • 171
  • 1
  • 10
2
alias rm='rm -i'
  • 4
    That is a standard no-no. One day, you'll not have the alias set, and then you'll accidentally remove something precious. – Jonathan Leffler Mar 03 '09 at 02:48
  • I use this too, but I don't depend on it. –  Mar 03 '09 at 17:02
  • I have a script called `rn`: `mkdir ~/.rm 2>/dev/null; mv "$@" ~/.rm;` This makes a wastebasket in ~/.rm from which you can rescue stuff that should not have been removed. When your disk is running full, empty the wastebasket: rm -rf ~/.rm – Ole Tange Jun 22 '10 at 11:30
2

If you find yourself thinking "oh, I can just write about 15 lines of Perl/Python/whatever to do what I want", first take a look at the coreutils.
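
For instance, summing the second column of a file needs no Perl at all (the filename is a placeholder):

cut -d' ' -f2 data.txt | paste -sd+ - | bc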

Svante
  • 131
  • 3
2

You can do path substitutions to change dirs by using 2 args with cd (this works in zsh and ksh, not in plain bash):

$ pwd
/foo/bar/blah
$ cd bar bat
/foo/bat/blah
2

Use expect! It makes scripting interactive tools much easier. For example, you can script a telnet session or an ftp session.

Do your work on the command line, and then just script it. Here is a crude example: telnet onto a development board, retrieve a kernel image, and put it into flash.

#!/bin/bash
# scripted telnet
IP=$1
IMAGE="platform-AT91SAM9260/images/linuximage"
cp $IMAGE /home/cynove/public_html/
expect -b - <<EndOfTelnet
spawn telnet $IP
expect "login"
send "root\r"
expect "#"
set timeout -1
send "wget -O kimage http://192.168.10.2/~cynove/linuximage\r"
expect "#"
send "ls -al kimage\r"
expect "kimage"
send "flashcp -v kimage /dev/mtd1\r"
expect "Erasing"
expect "#"
send "exit\r"
EndOfTelnet
shodanex
  • 212
  • 2
  • 9
2

I like the following commands, which give you Google and Google Books in the terminal.

#!/bin/sh

q=$1
w=$2
e=$3
r=$4
t=$5

open "http://www.google.com/search?q=$q+$w+$e+$r+$t&ie=utf-8&oe=utf-8:en-GB:official&client=firefox-a"

and Google Books in the terminal:

#!/bin/sh

q=$1
w=$2
e=$3
r=$4
t=$5

open "http://books.google.com/books?q=$q+$w+$e+$r+$t"

I have the following in .bashrc

alias gb='/Users/Mas/bin/googlebooks'                                                                               

alias g='/Users/Mas/bin/google'

I have permissions 777 for the scripts at /bin/googleScripts/.

Examples

gb Strang null space            // searches books which have the words

g Lordi Hard Rock Eurovision   // Googles websites which have the words

It would be cool to have book titles in the terminal so that you do not need an external browser.

Similarly, it would be useful to have the URLs of Google search results directly in the Terminal, so that you can process the data there.

2

One of my favorite CLI tricks is to get out of the CLI.

xdg-open .

This opens a GUI file browser in the current directory. That's for Linux; use "START ." for Windows and "open ." for OS X.

Before I learned to use echo, I was afraid to use rm with wildcards, so I would xdg-open the current folder and then remove files in GUI. I was also afraid of the tar command, another reason to use xdg-open.

What about a way to get back to the CLI world?

Double-click (or triple-click) on the location bar of your GUI file browser and run:

cd "$(xsel)"

that is from here. (or use open-terminal-here with nautilus)

RamyenHead
  • 311
  • 3
  • 6
  • 11
2

Safety at the Command Line

Experienced sysadmins do things that appear lazy or idiosyncratic at first, but save their bacon once in a blue moon.

Don't Type Anything You Can Paste

Prefer copy and paste to typing, whether you're copying from a run book, manual, or just higher in the terminal window. It's all too easy to type the wrong argument, switch, command, filename, etc., especially when you're simultaneously looking at systems, reporting status on a conference call, and trying to figure out the root cause of a problem.

Pasting the command line is a good habit. We should enable it by a) making everything scriptable and b) putting commands in manuals instead of screenshots.

Differentiate Windows

"Oops, wrong shell!" Famous last words. Find some way to separate windows that log in to different environments. Give production environments a different background color, or put them on a different monitor.

Don't Trust the Path

An oldie, but goodie: make a shell script called "ls" in your directory. Have it create a suid-root copy of /bin/bash in a hidden directory of your own, then delete itself and run the real /bin/ls. All you need to do then is get a naive admin to run "ls -la" in your directory as root, and poof, you've got a root shell.

Real admins never have "." in their paths, exactly to avoid this kind of sneak attack. You shouldn't either. Get into the habit of typing "./" in front of nearby executables instead.

Don't Move Files. Copy Then Delete

"mv oldname newname" is dangerous. It can destroy two files at once. It's better to do a sequence. Copy the origin file to the destination, check whether it's OK, then delete the original. Better yet, wait until you're totally done with the whole process, then remove the original file. Better yet, make a safe copy of the file you're about to change. The goal is to make everything completely reversible, so you can always get back to a known state.

1
  • For smaller directory trees with documentation to browse

    find .
    
  • To empty a file from shell

    > file.txt
    
  • To return to my home directory

    cd
    
Dlf
  • 111
  • 3
1

IMHO, *nix's most important command ever is... man :)

Almost everything one needs to know can be found with man, and using man keeps us from interrupting our co-workers. And dealing with interruptions is one of our biggest concerns...

Marco Ramos
  • 3,100
  • 22
  • 25
1

I don't think these commands are in the above list...!!!

  1. find . -name .svn -type d |xargs rm -rf

    Remove all .svn folders

  2. bash -x script.sh

    Print each line as it is executed (bash tracing)

  3. Ctrl + [

    the same as [Esc] in vim

  4. shopt -s autocd

    Automatically cd into a directory by typing its name

  5. df -i

    View the current number of free/used inodes in a file system

  6. sudo !!

    Run the last command as root

  7. python -m SimpleHTTPServer

    Serve current directory tree at http://$HOSTNAME:8000/

  8. netstat -tlnp

    Lists all listening ports together with the PID of the associated process

  9. Below are some ways to number input.txt:

    cat -n

    $ cat -n input.txt
         1  123
         2
         3  456
         4
         5  789
         6
         7
         8  abc
         9
        10  def
        11
        12  ghi

Jayakrishnan T
  • 278
  • 2
  • 8
  • 22
1

I found Git version control to be:

  • Snappy
  • A pleasure to use
  • Useful for projects of almost any size (100K to 100GB; 1 to 100k files)

Here is how I do it:

# Create new repository
# (for now, it will live in .git/ - a single directory)
git init

# Commit all I got so far
git add .
git commit

# Add new or modified files manually
git add *.c
git status
git commit

# Add all modified files
git status
git commit -a

# Redo last commit
git commit -a --amend

# View log
git log

# Reset everything (files and git history) back to 
# what it was at 96223554b3e3b787270b1f216c19ae38e6f83ca5
git branch this-was-a-mistake
git reset --hard 9622

# Everything is back in time
ls
git log
Aleksandr Levchuk
  • 2,415
  • 3
  • 21
  • 41
1

Easy sums/averages/grouping with awk:

cat tests
ABC 50
DEF 70
XYZ 20
DEF 100
MNP 60
ABC 30

cat tests | awk '{sums[$1] += $2; tot += $2; qty++}\
   END { for (i in sums) 
     printf("%s %s\n", i, sums[i]); 
     printf("Total: %d\nAverage: %0.2f\n", tot, tot/qty)} ' 
MNP 60
ABC 80
XYZ 20
DEF 170
Total: 330
Average: 55.00
ggiroux
  • 234
  • 1
  • 2
1

My Favorite And Frequently Used:

List contents of tar.gz file

tar -tzf filename.tar.gz

will match lines containing S1 AND S2 AND S3 AND S4, in that order

grep 'S1.*S2.*S3.*S4' file

Lists all subdirectories of current directory

ls -d */

Total size of directory

du -sh

Put a timestamp in a variable:

TIMESTAMP=$(date '+%y%m%d%H%M')

move process from foreground to background

Ctrl-z and then bg

Entire word to uppercase

echo "word" | awk '{print toupper($0)}'

Checks equality between numbers

x -eq y Check if x is equal to y

x -ne y Check if x is not equal to y

x -gt y Check if x is greater than y

x -lt y Check if x is less than y

Checks equality between strings

x = y Check if x is the same as y

x != y Check if x is not the same as y

-n x Evaluates to true if x is not null

-z x Evaluates to true if x is null

Command Line Parameters for ' test '

-d check if the file is a directory

-e check if the file exists

-f check if the file is a regular file

-g check if the file has SGID permissions

-r check if the file is readable

-s check if the file's size is not 0

-u check if the file has SUID permissions

-w check if the file is writable

-x check if the file is executable

print first field of the last line:

awk '{ field = $1 }; END{ print field }'

important built-in variables

$# Number of command line arguments. Useful to test no. of command line args in shell script.

$* All arguments to shell

$@ Same as above

$- Option supplied to shell

$$ PID of shell

$! PID of last started background process (started with &)

  • This might be more useful if you split the separate tips (like the "built-in variables") into separate answers, with a short justification and example for each. And maybe skip those 'test' parameter lists altogether - people can find that kind of stuff in manuals and reference guides. – Jonik Mar 04 '09 at 07:10
1

I'm moving towards never typing "rm" at a command prompt. Instead, I type "ls", and if I like the list of removed files I edit the command (easily possible with bash and ksh).

Edit to add something from the comments: "rm -i" will prompt for each deletion, which accomplishes the same purpose. Thanks!

David Thornley
  • 181
  • 1
  • 1
  • 4
  • Please explain, I can't understand what you mean? – Maxim Veksler Mar 02 '09 at 20:24
  • Typing something like "rm *.txt.bak" is a little error-prone; if you type "rm * .txt.bak" by mistake you're in trouble. Therefore, I'll type "ls *.txt.bak" to see if I'm selecting the files I think I am; then I use bash shell command-line editing to substitute "rm" for "ls". – David Thornley Mar 02 '09 at 20:44
  • How do you do the command-line substitution? There must be a faster way than hitting up, home, delete, delete, r, m. – dotjoe Mar 02 '09 at 22:23
  • I don't know of a faster way. I type faster and better than most people I hang out with, and it seems that I am willing to type more than they are. I think that in csh and tcsh ^ls^rm or something like that worked. – David Thornley Mar 03 '09 at 14:36
  • Using 'rm -i filename' works for me. It will prompt me to remove the file. I actually aliased rm to 'rm -i'. –  Mar 03 '09 at 16:54
1

You can do some simple networking with bash (credit to this page and man bash):

cat < /dev/tcp/time.nist.gov/13

Yes, writing to and simultaneous reading-writing is also possible.
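
A read-write sketch (requires a bash built with network redirections; the host is just an example):

exec 3<>/dev/tcp/www.example.com/80     # open fd 3 for reading and writing
printf 'HEAD / HTTP/1.0\r\n\r\n' >&3    # send a request
cat <&3                                 # read the response
exec 3>&-                               # close the descriptor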

kyku
  • 97
  • 2
  • 7
  • Alas, Bash from Ubuntu doesn't seem to have this feature enabled :-( Wait for newer version. –  Mar 05 '09 at 08:11
  • I believe this is because Bash from Ubuntu isn't compiled with the option to enable networking. Don't know what reasons keep them from enabling it. But I couldn't find a proper use for this feature anyway. :) –  Mar 05 '09 at 11:23
1

Above all, understand anything you see in the internetz before trying it out in your dev box.

1

Every now and then neither of:

find . -exec ...
find . -print0 | xargs ...
for i in *.png; do ... done

works for processing a list of files, since one needs the combined power of find, NULL separated filenames as well as plain shell loops. The solution is this little bit of bash code:

find . -print0 | while read -r -d $'\0' filename; do echo "$filename"; done

That allows one to process NULL separated files in a normal loop.

Grumbel
  • 194
  • 1
  • 8
  • What is the problem with the third? for i in *.png; do...done? – Juliano Mar 29 '09 at 03:16
  • GNU Parallel http://www.gnu.org/software/parallel/ seems to be better suited for this, as it can run jobs in parallel. The while-read loop can be rewritten as: find . -print0 | parallel echo Watch the intro video: http://www.youtube.com/watch?v=OpaiGYxkSuQ – Ole Tange Jun 22 '10 at 12:09
1

Shell-fu is a place for storing, moderating and propagating command line tips and tricks. A bit like StackOverflow, but solely for shell. You'll find plenty of answers to this question there.

1

Another trick:

When I want to make a bash alias, I just make a bash script in my user bin folder.

For instance, instead of adding the following line to my .bashrc,

alias make-symlink='ln -s -iv'

I'd make the following script and save it as ~/bin/make-symlink

#!/bin/bash                                                                     
ln -s -iv "$@"

Once the script is made executable (chmod +x), it's as if I have a new alias.

Now make-symlink can be used with xargs. Also, when you use a different shell (zsh, fish, IPython, ...), make-symlink is there too.

If you use emacs, you might want to add the following to your emacs init file.

;; Make scripts executable on save                                              
(add-hook 'after-save-hook 'executable-make-buffer-file-executable-if-script-p)
RamyenHead
  • 311
  • 3
  • 6
  • 11
1

The 'tee' command is really useful for when you are outputting to a file and want to see the progress at the same time. This is especially helpful for when you are logging output to a file and need to watch it as it progresses.

Instead of doing something like:

./program > file &
tail -f file

You can use the tee command on one line:

./program | tee file
1

I like to keep track of everything I do. One command that I learned in college was 'script'. This takes any output on your terminal and logs it to a file. What I didn't learn in college is how to make every terminal a script. Now I have this in my .login file:

exec script ~/.typescript/`date +%Y%m%d%H%M%S`.$$

Make sure that ~/.typescript/ exists before you add that to the end of your .login file. :)

toppledwagon
  • 4,215
  • 24
  • 15
0

Meta+. in bash for cycling through the last argument from previous commands. Great for running tail and grep in various combinations.

Jeremy M
  • 819
  • 4
  • 10
  • Note for OS X users: In Terminal, meta needs to be enabled under the Keyboard settings before this will work. – Jeremy M Jul 14 '10 at 17:27
0

CTRL+] x to search forward for a character "x", and Meta, CTRL+] x to search backward. On most systems, Meta can be ESC or ALT. For ESC, press ESC and release, then press CTRL and ], and then the character to search for. For ALT, hold CTRL + ALT + ] at the same time, then press the target character.

I find it useful when editing history commands.

For very long and very complicated commands, I use fc to open vi (probably actually vim on Linux) to edit the command.

zhaorufei
  • 99
  • 1
  • 1
    For long command editng, in bash at least, you can 'set -o vi' to enable vi-style keybindings directly in your shell. – pboin Oct 25 '10 at 14:49
0

To rename multiple files in a similar fashion, I have found the following script very useful and robust over the years.

It just puts the output of ls into your favorite text editor. You just modify the text, save, close. The files are renamed accordingly.

It's especially great when you combine this with Vi column editing (Ctrl-v, select a block, I to insert before or A to insert after, type text, Esc).

#!/usr/bin/ruby

RM = '/bin/rm'
MV = '/bin/mv'

from = Dir.entries('.').sort; from.delete('.'); from.delete('..')
from.sort!

from.delete_if {|i| i =~ /^\./} # Hidden files

tmp = "/tmp/renamer.#{Time.now.to_i}.#{(rand * 1000).to_i}"

File.open(tmp, 'w') do |f|
  from.each {|i| f.puts i}
end

ENV['EDITOR'] = 'vi' if ENV['EDITOR'].nil?
system("#{ENV['EDITOR']} #{tmp}")

to = File.open(tmp) {|f| f.readlines.collect{|l| l.chomp}}
`#{RM} #{tmp}`

if to.size != from.size
  STDERR.puts "renamer: ERROR: number of lines changed"
  exit(1)
end

from.each_with_index do |f, i|
  puts `#{MV} -v --interactive "#{f}" "#{to[i]}"` unless f == to[i]
end

I call this script renamer.

Aleksandr Levchuk
  • 2,415
  • 3
  • 21
  • 41
0

To copy part of a file system to a new hard disk one can use

mkfs.ext4 /dev/sdb
mkdir /mnt/newhd
mount /dev/sdb /mnt/newhd/
rsync -av --hard-links --acls --one-file-system --xattrs /home/maxim/ /mnt/newhd/
echo '/dev/sdb /home/maxim ext4 defaults,user_xattr,noatime 0 1' >> /etc/fstab
Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
0

Output redirection. When you're running something, append

> ~/log.txt

to capture the output for later. Append

>& error_and_log.txt

for errors as well.

0

pushd and popd to temporarily switch to different directories.

So

pushd ~/tmp

will move you to that directory, but push your current location to a stack (so it can be nested).

Then

popd

to return to the previous location.
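
A short session sketch:

pushd ~/tmp    # go there, remembering where we came from
pushd /etc     # the stack nests
dirs -v        # inspect the stack
popd           # back to ~/tmp
popd           # back to where we started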

Unsliced
  • 141
  • 1
  • 4
0

Turn on inline mode for tab completion for Bash:

http://codesnippets.joyent.com/posts/show/1690

0

To open a remote X application on your local machine, do:

ssh -X remoteuser@remoteHost
konsole

This allows quickly viewing graphical applications running on the remote host while using your machine as the graphical server.

Maxim Veksler
  • 2,555
  • 10
  • 27
  • 32
0

Copy selected text:

CTRL + SHIFT + C

...and paste it:

CTRL + SHIFT + V
0

In bash to close file descriptors: FD>&-

Close stderr:

$ function echostderr() { echo $1 >&2; }
$ echostderr "now you see me"
now you see me
$ echostderr "now you don't" 2>&-
$ 

Or inside a script:

$ function echostderr() { exec 2>&-; echo $1 >&2; }
$ echostderr "now you don't"
$ 
aless
  • 1
0

ALWAYS start any command or pipeline with # (comment) and remove it only when you have finished writing the command. Gives you a second chance at spotting rm -rf / like things.

0

Here is a collection of Linux/UNIX user management commands: http://www.linuxconfig.net/2009/11/16/linux-unix-user-management-commands.html