Archive for the ‘Bash’ Category

Running scripts on network connection (if-up.d, if-down.d, if-pre-up.d, if-post-down.d)

Friday, November 19th, 2010

In my previous post I described how to authenticate against web-based logins after connecting to a particular SSID.  One of the problems I had was getting the script to actually run.  On Debian-based systems, you should just be able to put a script in the /etc/network/if-up.d directory and have it run whenever a network interface connects.  This wasn’t working for me, and it seemed some other people were having issues too.

The solution turned out to be removing the file extension.  The script named testscript.sh didn’t run, but the script named testscript did.  So if you are having similar problems, try removing the file extension.

I am not sure if this is documented and expected behavior, or a bug.
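My best guess at the explanation: scripts in if-up.d and friends are executed by run-parts, which by default ignores any file whose name contains characters other than letters, digits, underscores and hyphens. A dot, as in “.sh”, disqualifies the script. The naming rule can be sketched like this (the file names here are made up):

```shell
# run-parts (which executes /etc/network/if-up.d scripts on Debian-based
# systems) only runs files whose names match ^[A-Za-z0-9_-]+$ by default.
# A dot, as in ".sh", disqualifies the script.
for name in testscript testscript.sh; do
    if printf '%s' "$name" | grep -Eq '^[A-Za-z0-9_-]+$'; then
        echo "$name: would be run"
    else
        echo "$name: would be IGNORED"
    fi
done
```

If run-parts is installed, `run-parts --test /etc/network/if-up.d` will list exactly which scripts it would execute, which is a quick way to check yours is picked up.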

Auto logon to web-authenticated wireless

Thursday, November 18th, 2010

At my current workplace I am using a guest wireless network.  After connecting, I need to open any web page and I am redirected to the login page and need to enter my provided username and password.  After submitting, I can browse any page and use the connection for email and other tasks.

Whilst this is only a minor inconvenience, it is definitely a candidate for automation.

A single line wget command is all we really need.  I have wrapped it up into a shell script below:

#!/bin/sh

username=XXXX
password=XXXX

wget -O /dev/null --no-check-certificate --post-data "username=$username&password=$password&buttonClicked=4&err_flag=0&redirect_url=google.com" https://1.1.1.1/login.html

-O /dev/null throws away any downloaded content; --no-check-certificate is fairly self-explanatory (wget wouldn’t connect to the https address without it); --post-data wraps up the various name/value pairs that the login form is expecting.

For most of these login forms, you only need a username and password, but I found with this one it also needed a few extra parameters.

So, the next step is to get this script to run automatically after connecting to the wireless network.  On Ubuntu (or any Debian based distro), there is a directory structure under /etc/network that includes an if-up.d directory.  Any scripts in here will be run after a network interface is brought online.  These scripts don’t have access to the SSID (as far as I know), but this can be obtained with the following command:

iwgetid --raw

Wrapping it all together, we can put the following script in the “if-up.d” directory:

#!/bin/sh

username=XXXX
password=XXXX
ssid=XXXX

if [ "$(iwgetid --raw)" = "$ssid" ]
then
    wget -O /dev/null --no-check-certificate --post-data "username=$username&password=$password&buttonClicked=4&err_flag=0&redirect_url=google.com" https://1.1.1.1/login.html
fi

Remember to change the values of username, password and ssid.  Set the URL to the form’s action value and add or remove parameters as necessary.  Also, another point I learned the hard way is that the script can’t have a “.sh” extension, so name it “something”, not “something.sh” (and remember to chmod +x it).

If you need to then connect to a VPN, this can also be added to the script.

Using Vi commands to control your shell (ksh, Bash)

Saturday, September 13th, 2008

Recently I was forced to use the Korn Shell (or ksh) when I had to perform some minor tasks on an AIX server. So of course I just jumped in and started using the shell. A few seconds later I was thoroughly frustrated as I discovered I had no command history and couldn’t use backspace or command completion (as you can probably guess I’m a Bash user).

The Vi mindset

So begrudgingly I resorted to a few web pages to work out what was going on. To my pleasant surprise, all I had to do was switch my thinking from Bash to Vi. I’m a big Vi(m) fan and feel quite at home using Vi like commands so I was quite happy to discover this was all I had to do. Command history: just hit ESC and use the Vi “up” which is “k” to go back in history, or Vi “down” which is “j” to go forward in history – easy.

Keeping in this mindset I began investigating other commands. Jumping back in history, then a quick “cw” to change a word, “A” to append to the end of that command. I won’t bore all the people that don’t use Vi(m) with a list of commands, because if you don’t, then all this probably sounds like a pretty painful way of using a shell. But if you do use Vi(m) regularly, you quickly learn that a few carefully selected keystrokes can save you many repetitive backspaces or left arrows.

Escape from home

The other great thing about using Vi(m), or a Vi like interface, is that your fingers never need to leave the home keys. The only exception to this is that pesky ESC key that has been banished to the far corner of the keyboard. But never fear, there are some fixes to this.

The answer is to remap your CapsLock key to be another Ctrl key. Not only does this open up a wealth of shortcuts that can be performed without taking your fingers away from the home keys, but it also allows you to type an ESC equivalent, which is Ctrl-[.

If you are running X11 you can do it in the keyboard configuration (with xmodmap, for example). Running GNOME you can do it through keyboard preferences; I’m guessing there is something similar in KDE and other window managers. You can even do it in Windows with a simple registry hack.

More information for Linux, Mac and Windows at the below links:

http://www.manicai.net/comp/swap-caps-ctrl.html

http://johnhaller.com/jh/useful_stuff/disable_caps_lock/

Dessert time

Anyway, whilst I was pretty impressed with the interface that ksh had on offer (once I understood it), I wasn’t totally sold. Command line completion was there (ESC+\) but it was nowhere near the level of Bash’s usefulness.

But, inspired by this discovery, it didn’t take long to work out that I could have my cake and eat it too! A simple command in Bash allows these Vi-like commands to be used:

set -o vi

This way you can keep on using Bash as you would, but hit the ESC key (or Ctrl-[) and you have the power of Vi at your fingertips!
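To avoid typing this in every session, the setting can live in a startup file. A sketch of the two usual places (the ~/.inputrc option affects every program that uses the readline library, not just Bash):

```shell
# In ~/.bashrc -- enables vi editing mode for Bash only:
set -o vi

# Alternatively, in ~/.inputrc -- enables vi editing mode in every
# readline-based program (bash, python, gdb, ...):
#   set editing-mode vi
```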

Customizing the Bash prompt

Monday, April 14th, 2008

Customizing the Bash prompt is, in essence, a relatively easy task. All we need to do is modify the PS1 environment variable. To have a look at your current prompt string, execute the following in a Bash shell:

echo $PS1

Depending on your distribution, this may be relatively simple output or it may be full of special escape codes that set the colors and do other interesting things.

By adding special characters in our PS1 we can change the elements that make up the prompt, their order, their color and their layout. For example, we can use the following command to change our prompt temporarily:

PS1="\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]\$"

On a black background this will look something like this:

[Image: example prompt]

The “\u”, “\h” and “\w” are special elements that get replaced with the current username, hostname and working directory respectively. Sequences such as “\[\033[01;32m\]” are ANSI escape codes that set either the foreground or background color (the \[ and \] tell Bash that the enclosed characters are non-printing, which keeps line wrapping correct).
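As a starting point for experimenting, here are a couple of variants (safe to paste into a shell; the change only lasts for that session):

```shell
# Plain prompt, no colors: user@host:cwd$
PS1='\u@\h:\w\$ '

# Red 24-hour clock (\t is the current time), then the working directory:
PS1='\[\033[01;31m\]\t\[\033[00m\] \w\$ '
```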

Now at this stage, most guides on the Bash prompt will give you a list of all the possible escape codes and color codes you can use and send you packing. This is fine if you know exactly what you want but if you want to try out different combinations it can get a bit tedious. It was this situation that motivated me to write a program to ease the process of creating and trying different prompts.

impromptu

impromptu is a simple program I have written in Perl/GTK to make it easy to create and manage prompts for the Bash shell. It allows you to easily try out different combinations with different colors and saves the user from needing to know all the escape codes and color codes. Prompts can be saved and loaded, automatically installed and uninstalled, and previewed on the fly.

Here are a couple of screenshots:

[Screenshot 1]

[Screenshot 2]

impromptu can be downloaded from here.

Whilst fully functional, it is a young program, so let me know if you have any suggestions or if you find any bugs.

Contact: impromptu@omobos.com.

Convert an image to XPM format suitable for embedding in Perl

Monday, March 24th, 2008

Recently I was working on a single file GTK/Perl program which used an external image file for a logo. I didn’t want to distribute the image file alongside the actual program as this was a little messy and probably not worth it for just one logo. So before doing away with the logo, I investigated whether there was any way to embed the image data in the Perl code.

XPM

It seems that the de facto standard for representing image data in ASCII format is XPM, or X PixMap. This is usually represented in a way that is compatible with programs written in C (the XPM2 format does away with any language-specific details).

Getting the XPM data

Getting the image data in XPM format is relatively easy if you have the ImageMagick suite of tools installed. If so, then just run the following:

convert inputfile outputfile.xpm

The trick here is to make sure your output file ends with a .xpm extension. ImageMagick will recognize this and do the conversion to XPM for you.

Converting to Perl code

As you might have noticed if you looked at the resulting XPM file, it is in standard C notation. Not very useful for a Perl program. Never fear, a sprinkling of substitution and regex magic will bring us home.

s/\'/\\\'/g
s/\"/\'/g
s|^/\*.*\n$||
s/^static.*$/my \@xpm_image = (/
s/^};$/);/

In order from top to bottom we have:

  • Escape all the single quotes
  • Convert double quotes to single quotes
  • Remove lines consisting of a C-style comment (such as the /* XPM */ header)
  • Reformat the variable declaration for Perl
  • Fix up the closing brace on the array definition.

With all these substitutions done, we can embed the resulting code in a Perl program and use it with functions that expect XPM data, Gtk2::Gdk::Pixbuf->new_from_xpm_data() for example.

All together now…

OK, let’s put it all together into a Bash script. As with any shell script, we need the shebang on the first line:

#!/bin/bash

then we have filename definitions:

inputfile=$1 # $1 is the first argument passed to the script
xpmfile=$inputfile.xpm
perlfile=$xpmfile.pl

ImageMagick conversion and rename:

convert "$inputfile" "$xpmfile"
mv "$xpmfile" "$perlfile"

and the substitutions (the -i switch means the substitutions are done in place, i.e. there is no separate output file):

perl -pi -w -e "s/\'/\\\'/g" "$perlfile" # All single quotes get escaped
perl -pi -w -e s/\"/\'/g "$perlfile" # Double quotes -> single quotes
perl -pi -w -e "s|^/\*.*\n$||" "$perlfile" # Remove C-style comment lines
perl -pi -w -e "s/^static.*$/my \@xpm_image = (/" "$perlfile" # Change variable declaration
perl -pi -w -e "s/^};$/);/" "$perlfile" # Change closing brace of array

If you want to download or view the script in its entirety, have a look at img2xpm_perl at omobos.com.

Finding running processes

Sunday, March 23rd, 2008

Here is a simple alias to help track down running processes and their PIDs.

alias fproc='ps aux | grep '

This uses the ps command to list running processes and then filters the result with grep using the keyword you specify. For example:

user@machine:~$ fproc firef
user 3413 2.9 12.8 256936 127640 ? Rl Mar23 14:38 /usr/lib/iceweasel/firefox-bin -a firefox
user 31952 0.0 0.0 3860 716 pts/2 R+ 01:16 0:00 grep --color firef

(Note: the grep command will always show up in the output as it too has the keyword in the process name.)
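A common refinement, not part of the alias above, is to put the first character of the keyword in brackets, so the grep process no longer matches its own command line (pgrep is another way around this):

```shell
# The pattern [f]iref matches the literal text "firef", but the string
# "grep [f]iref" (the grep process's own command line) does not contain
# "firef", so grep no longer finds itself.
echo 'grep [f]iref' | grep -c '[f]iref'   # prints 0: no self-match
echo 'firefox-bin'  | grep -c '[f]iref'   # prints 1: still matches firefox
```

So `fproc '[f]iref'` gives a clean listing, as does `pgrep -fl firef`, which excludes itself by design.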

For more information on aliases check out my previous post or have a look at my custom .bash_aliases file at omobos.com

Bash aliases

Sunday, March 23rd, 2008

The Bourne again shell (Bash) has a useful feature called aliases.

Aliases allow common commands (along with arguments) to be executed with an alternate command name. For instance, using this command:

alias mv="mv -iv"

means that whenever I use the mv command, the -iv argument will be automatically added resulting in interactive mode (prompting before overwriting) and verbose output. For example:

~$ touch test01.dat test02.dat
~$ mv test01.dat test02.dat
mv: overwrite `test02.dat'? y (Interactive mode)
`test01.dat' -> `test02.dat' (Verbose output)
~$

Many commands can have default behavior added to them by adding your most commonly used arguments. Some other examples:

alias grep='grep --color' # Highlight search word
alias df='df -h' # Display in human readable format
alias du='du -h' # i.e. 41G, 217M

But what if you use the ls command a lot but also in the form ls -la? To get around this we can introduce new commands (just make sure you don’t mask over existing commands). For example, I have the following aliases for the ls command.

alias ls='ls --color=auto' # Use colors
alias l='ls -l' # Long format, username, size etc.
alias la='ls -la' # Long format with hidden files
alias lsd='ls -d */' # Only display directories
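A few related commands come in handy once aliases start masking real commands. A hypothetical interactive session (echo stands in for any aliased command):

```shell
# Deliberately mask a real command with an alias:
alias echo='echo ALIAS:'

alias echo            # print the definition: alias echo='echo ALIAS:'
echo test             # the alias runs, printing: ALIAS: test
\echo test            # a leading backslash bypasses the alias: test
command echo test     # POSIX way to bypass the alias: test
unalias echo          # remove the alias for this shell session
```

Note that alias expansion is off by default in scripts; it needs shopt -s expand_aliases there. Interactive shells expand aliases as shown.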

Making things permanent

When the alias command is used in a shell, the changes are only local to that shell and will be lost when it is exited (exit or Ctrl-D). To make changes permanent, we need to add the aliases to the bashrc file (~/.bashrc for the current user, or /etc/bash.bashrc to make the changes system wide). Similar config files exist for other shells.

But we can take this one step further and keep everything nice and organized. If we keep all the aliases in a separate file (~/.bash_aliases), we can source them from our shell config file and make them a lot easier to maintain. Most likely you already have the following lines in your bashrc file:

# Alias definitions.
# You may want to put all your additions into a separate file like
# ~/.bash_aliases, instead of adding them here directly.
# See /usr/share/doc/bash-doc/examples in the bash-doc package.

#if [ -f ~/.bash_aliases ]; then
#    . ~/.bash_aliases
#fi

If so, just uncomment those last three lines and keep your aliases in ~/.bash_aliases.

You can check out my complete alias file at omobos.com/data/scripts/.bash_aliases