Friday, February 29, 2008

Useful Scripts, Part 2: Linux "file sorter" script

The following Linux-based script (which you can download here) is also one I find myself using on occasion. I've made versions of it for just about every OS I've used at one time or another. It's a script that takes a large directory full of files, creates 37 subdirectories (A-Z, 0-9 and "other") and moves every file in the folder it is run from into the subdirectory matching its first character.

This is great when you have a large folder full of files that takes forever to "ls" or access over a network, or a folder that is unmanageable for quickly finding particular files in a GUI. It's case-insensitive (I'm sorry, but DOS got this bit right: case-sensitivity can be a real hindrance at times, no matter how careful and experienced you are), so "Afile" and "anotherfile" both get put into the a/ folder.

I've used this on emulator ROM folders when they grew too many files. I've used it (or a Windows "port" of this script) in user-profile folders on servers, to help narrow the users down a bit and even to move portions of them to a different storage medium in a logical manner (from an "I'm only human" point of view, it's better to move A-L to another drive than to move the first 1000 users). I've used it on archives of zip downloads, photographs, instruction manuals, all sorts. I hope someone else finds it useful.

#!/bin/sh
# Sort the current directory's files into a-z, 0-9 and "other" subdirectories
for LETTER in a b c d e f g h i j k l m n o p q r s t u v w x y z; do
  mkdir -p $LETTER
  UCASEFILESPEC=`echo $LETTER | tr a-z A-Z`*
  mv ./$LETTER* ./$UCASEFILESPEC ./$LETTER/ 2>/dev/null
done
for NUMBER in 0 1 2 3 4 5 6 7 8 9; do
  mkdir -p $NUMBER
  mv ./$NUMBER* ./$NUMBER/ 2>/dev/null
done
# "other" is created last so the o* pass above doesn't swallow it
mkdir -p other
for F in ./*; do if [ -f "$F" ]; then mv "$F" ./other/; fi; done
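As a quick sanity check, here is the same logic run against a scratch directory, showing the case-insensitive grouping described above (the /tmp/sortdemo path and the file names are made up for the demo):

```shell
#!/bin/sh
# Demo only: the scratch directory and file names are invented for illustration
mkdir -p /tmp/sortdemo && cd /tmp/sortdemo
touch Afile anotherfile 9lives _readme.txt
for LETTER in a b c d e f g h i j k l m n o p q r s t u v w x y z; do
  mkdir -p $LETTER
  mv ./$LETTER* ./`echo $LETTER | tr a-z A-Z`* ./$LETTER/ 2>/dev/null || true
done
for NUMBER in 0 1 2 3 4 5 6 7 8 9; do
  mkdir -p $NUMBER
  mv ./$NUMBER* ./$NUMBER/ 2>/dev/null || true
done
mkdir -p other
for F in ./*; do if [ -f "$F" ]; then mv "$F" ./other/; fi; done
ls a/    # Afile and anotherfile end up together, despite the different case
```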

Useful Scripts, Part 1: Linux "screen" wrapper script

A few people in the past have asked me for a few of the hundreds of scripts that I have made for myself. My life is one long session of script/batchfile writing (I use the terms interchangeably because a list of commands to run is a list of commands to run, no matter what language, OS, etc. it is written for). There are school networks that are running on some of my scripts!

I'm not so keen on letting the larger, more tedious scripts out without me doing a proper licensing thing on them (open-source of course) but the little ones that are just convenient wrappers etc. I don't mind giving away. They are so trivial you can hardly claim copyright on them, even if the only intention of that copyright is to give them away but ensure you get recognition. Anyway, here's the first one, for Linux. It's a wrapper for the "screen" command. You can download the script from here.

Screen is a powerful command. It's ALMOST as good as when you first use a Unix-based machine and are taught to use Virtual Consoles (pressing Alt-F1, Alt-F2 etc. lets you have a variety of command lines open at the same time even if you aren't in a GUI). Screen lets you run tasks in the background, with their output hidden, in such a way that you can "resume" or connect back to the screen that the program is displaying, from anywhere. It's a kind of VNC for the console. So, for example, if you want to run a program like IPTraf or HTop, but want to be able to see its "screen" from an SSH session later when you're at work, you can do this.

The script itself is quite simple, and I have a habit of calling it my "do offline" script, so I normally name it accordingly:

#!/bin/sh
if [ -z "$1" ]; then
  echo "First arg is screen name, rest is command to run"
  exit 1
fi
SCREEN_NAME=$1
shift
echo "Running commands ($*) under screen name $SCREEN_NAME"
screen -A -m -d -S $SCREEN_NAME nice $*

Basically, you run it as: "screenname" "program and arguments..."

e.g.

htop htop
large_file_copy cp -R *.* /mnt/backup/
kernel_compile make

And it will happily run those commands in the background while putting the output into an invisible "screen". This is useful if you are logged in via SSH... rather than having to fudge access to the F1-F12 consoles, you can start jobs in their own screens and then disconnect and reconnect from their screens as necessary without interrupting the job. You can start a 50GB download remotely and not have to interrupt it when you disconnect your SSH session (the program will just carry on running in its screen and you can "resume" the screen from the local console when you get home to see if it finished).

To resume the screens, you just use:

 screen -r 

screen -r kernel_compile

And the one annoyance... to "disconnect" from a screen without interrupting the program it's running, just press Ctrl-A and then D. Horrible keyboard shortcut.
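If the default bothers you that much, screen does at least let you move the escape key; this is standard screen configuration, and the choice of Ctrl-T here is only an example:

```
# In ~/.screenrc: use Ctrl-T as the escape key instead of Ctrl-A
escape ^Tt
# Or per-session on the command line:
#   screen -e ^Tt -r kernel_compile
```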

For example,

# htop htop
# kernel_download wget
# screen -r kernel

Is it done yet? Nope?

Ctrl-A D
# cd /

Go off and do something else, disconnect from the computer, go to work and log in to this computer remotely.

# screen -r kernel_download
There is no screen to be resumed matching kernel_download.

So it's finally finished and we can compile it. But I want the computer to have compiled it by the time I get home.

# kernel_compile make

Go home, log back into the machine at the local console.

# screen -r kernel_compile

It shows up as still compiling, so I just watch the progress for a few minutes and then...

Ctrl-A D
# screen -r htop

Have a look at the CPU usage, wait for it to dip.

Ctrl-A D
# screen -r kernel_compile
There is no screen to be resumed matching kernel_compile.

My kernel is downloaded and compiled without me having to watch its progress, limit its output to a single virtual console on a single physical machine, or keep the remote session connected constantly while it does it.

I hope it's useful to someone. It's already been useful to me and to at least a couple of people on the Linux Questions forum.