Thursday, June 05, 2008

Hobbyist programming advice

I've programmed quite a bit, in a variety of languages, for work, for fun and for learning. Quite often I've chosen how I do it, and the other times it's been dictated, but I've found that I unconsciously program to a set of rules. These mostly apply to "hobby" coding (games, utilities, etc. that you are mostly making for yourself) but can also apply to school or work if you have a flexible working regime:

1) Program in whatever language you like.

Honestly. It really doesn't matter for "casual" use. Don't program in C just because everyone else is, or because you've heard that it's better. Program in a language that you're comfortable with, even if that's BASIC and people tell you it's rubbish. You're the programmer, you should work to what you know. If you want to learn new languages, do so, but don't write every project in the current-fad language just because.

This also means that if you are confident in several languages, you should program in the most appropriate or your favourite language. It doesn't matter which! The most appropriate will make things easier (e.g. using something like Perl for handling a lot of text, or using Java for an object-oriented idea) but your favourite is likely to be the one you know best and you can make it do whatever handstands you need it to despite its shortcomings.

If someone tells you that you should start a program in C because you might need the performance later, test their theory first. Knock up a worst case in your favourite language and see whether it's worth the effort of learning a new language, or coding in an unfamiliar one, before you start. Chances are, for most things, just about any language will do.
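
For instance, a minimal sketch of that kind of test (the script and file names here are hypothetical stand-ins for whatever you're actually writing):

#!/bin/sh
# Hypothetical sketch: build a worst-case input, then time the version
# written in your favourite language before committing to a C rewrite.
seq 1 100000 > worst-case-list.txt
time ./my-favourite-language-version.sh worst-case-list.txt

If that finishes in the blink of an eye, the case for C is already dead.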

There's nothing worse than being told that you "should" be programming in C when you're knocking up a twenty-line batch file (yes, batch files are programming, of a sort!). However, knowing whether you want an interpreted or a compiled language can be a help when you start - though the chances are that it won't matter.

2) Ignore coding style.

If you're writing for yourself, you can ignore supposed "best" coding styles and work your own way. Even if you're working as part of a small team what matters is clarity. Whether you space or double-space or crunch your arguments together with their commas doesn't matter so long as it can be read and understood. Stylising the code can come later if it's necessary. And if you're the only person to ever read it, it doesn't matter that people aren't used to your coding style.

I would argue that programming teams should have a system where they all use their own coding styles and have "converters", so that when code goes back into the global pool its style gets modified to the "team" style, and when it's checked out it gets customised to that particular programmer's style. This way, everyone works comfortably but the central repository stays consistent.

If you're a single developer, spending too much time worrying about style wastes the "programming mood". You could write a thousand lines of new code, with new features, in the time it takes to restyle your existing code.

3) Document the code.

Documenting your code is important, even for yourself. You will not get to the end of any significant project and remember why every line is in there or how they all work. But you don't need flow charts, invariant lists, a hundred pages of specification and all the other rubbish that programmers are taught to produce if you're just working for yourself.

A simple comment above each block of code is more than sufficient but what's ABSOLUTELY necessary are warnings to yourself:


REM Don't remove this, because the program crashes without it.

// Don't pass different flags here because they are ignored.

;This line isn't redundant, it's deliberate. Without it X happens.

/* I know it's already supposed to be initialised but
without this, it can pass uninitialised through
the compiler unnoticed and cause problems. */


For big projects, you might find a brief Changelog and a Todo come in handy, but everything else should be inside the actual code. It takes long enough to write user documentation without having to document every single thing you do for yourself.

4) Backup

Yes, I know, but you really need to, especially if you're programming. Programs can crash, file accesses can run amok, source code will be heavily edited to fix a basic problem that you'll eventually track down to something else, etc. Ideally, make a copy of the files you intend to change just before you start a session, and after any major breakthrough (which may be as simple as finding the one-character cause of a serious bug... you will kick yourself if you forget which of the hundred lines that mention "variableX" it was on and have to go through hours of debugging to find it again).

Regularly copy those files to other computers, or a website somewhere. SVN and other source code management programs are useful for this if you do it often but even just a simple ZIP and upload is better than nothing. THIS INCLUDES MAKEFILES AND COMPILE SCRIPTS!

I work on the following basis: At the start of a session, I backup the files I'm likely to change the most (e.g. graphics.c if I'm planning on revamping the graphics), to a dated copy in the same folder (e.g. graphics-2008-01-01.c). If I make significant advances (add a new feature, fix an old bug, etc.), I will copy that file again to the same folder (e.g. graphics-2008-01-01-added-double-buffering.c). At the end of a particular session, I copy all changed files to an SVN folder, which I may or may not decide to "commit" to a remote website. Every time I think I've made enough advances to see that the program obviously works differently, or which would be a real pain to replicate, I copy them to a couple of personal websites/SVN repositories.
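
As a rough sketch of that routine in script form (the file names and the "breakthrough" suffix are placeholders for whatever you're actually working on):

#!/bin/sh
# Hypothetical sketch of the session routine described above.
# Before starting: a dated copy alongside the original.
cp graphics.c graphics-`date +%Y-%m-%d`.c
# ... a session of hacking on graphics.c goes here ...
# After a breakthrough: another copy with a descriptive name.
cp graphics.c graphics-`date +%Y-%m-%d`-added-double-buffering.c
# At the end of the session: even a plain ZIP copied somewhere else
# is better than nothing - and it includes the Makefile!
zip -r backup-`date +%Y-%m-%d`.zip *.c *.h Makefile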

Every few months or so, back up all that stuff to a CD-R or something. You'll be glad of it in a few years' time. "I used to have code that did something similar, let me dig it out for you", etc.

5) Develop on a slow PC.

Seriously. Find an old junker - a laptop if possible, because programming is one of those "I'm in the mood" hobbies that can take you at any time, and it's a shame to waste that. Don't make it so ridiculously bad that you can't use it comfortably, but a machine with less CPU and less RAM will show you the second you start to implement code that is detrimental to performance.

And it will also encourage good practice. Any program that's slow on your development PC will be unnecessarily slow if you ever do intend to scale it up somewhere else. So when that script to rename your MP3s starts struggling with 1,000 files on your old laptop, you know that you should fix it before you convert it to rename 10,000 user accounts on the server at work. And when that game starts running out of memory with only a 10x10 board on your laptop, you know exactly how much it'll take on a modern PC before it starts exhibiting problems, when normally you wouldn't have given it a second thought.

This is especially important if you are developing for embedded systems, handheld games consoles, homebrew circuits, anything that's underpowered. You don't need to program on something with the same MHz or less than your target machine, but you need to be able to get an idea of where your program is slowing down while still using your development machine.

Using an old machine will also deter you from using your main desktop to program on. This can be important for a number of reasons. First, the source code (and associated backups) of any prolific programmer will get out of hand very quickly and you don't want it spread across your system. Second, you can back up to a different machine easily as part of the "routine" (e.g. a good system is to develop on a PC which is tricky to connect to the Internet - my old laptop has wireless but, for simplicity and security, I set it up so that it proxies through my desktop machine for web access; this means that anything like FTP, SVN, etc. I have to deliberately enact or do from another machine, and it's slower to do so, which also provides the incentive to back up the same files to other machines while I'm waiting). Third, it removes the risk of damage to your everyday machine. This is especially important if you are using pointer-based or low-level languages that can potentially crash or corrupt a machine.

6) Test

If you intend to distribute the end binaries to anyone, test them thoroughly. Good feedback is significantly hindered if your early versions have lots of basic problems or dangerous bugs. And get lots of people to test it for you. Announcing version 1.0 suggests that you've had lots of people testing it, believe it or not, so make it clear it's a test/0.9/beta/pre-release version and then improve on it to make the magic 1.0 something special that works first time for everyone.

Testing is multi-stage. You need to test it thoroughly yourself. This usually means running the end program an awful lot of times, with different inputs. You don't need to generate a program to automate a test suite, you just need to use the end program a lot. This is especially good for games - just play the game a lot - until you realise that actually it IS possible for both players in the fighting game to hit 0 health on the same move, and you should add a "Round Draw" sound/routine/graphic.

Another stage of testing is theoretical. As you plough through the code, look for potential problems. "Oh, bugger, I don't check if someone asks for a -1 sized game!". Users are dumb - don't trust users, they are "rm -f *"-ing all the time, so they will find a way to break your program/their computer if one exists. Such analysis is hard to do if you set out specifically to do it, but you'll catch a lot of bugs this way. It's best done when you spot a particular bug, or as part of your general look through the code for functions to tweak.
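
To show what I mean, here's a sketch of that kind of guard (a hypothetical shell example - the game and its argument are invented):

#!/bin/sh
# Hypothetical sketch: refuse a nonsense game size before using it,
# because somebody WILL ask for a -1 sized game eventually.
SIZE=$1
case "$SIZE" in
''|*[!0-9]*) echo "Game size must be a positive whole number" >&2; exit 1 ;;
0) echo "A zero-sized game isn't much fun" >&2; exit 1 ;;
esac
echo "Setting up a $SIZE x $SIZE game..."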

The best stage of testing is undoubtedly giving the program to other people and asking them to test it. They won't have the same file structure, hardware, software, or way of using the keyboard/mouse/joystick. They won't be aware of, or use, the little foibles and habits you've got into ("Oh, I only ever run the game from a script within Windows, I thought it would just work without it", "I don't touch the mouse when it's loading up", etc.). And the less experienced the person is, the better. You can have hundreds of people test a game but only one or two will accidentally click outside the main menu, or tell you that the program says it can't find a common library.

And people who don't know how the program works will be clicking around in a non-obvious way, so they will find all sorts of bugs as they try to figure it out, whereas an experienced person will just be dragging and dropping and completely ignoring the menus or buttons, for instance. My wife is the best bug-finder in the world. I guarantee that I can put a program that's worked successfully for months for a thousand people onto her Palm and within an hour she'll have crashed it somehow.

7) Have a nice development environment.

This doesn't mean a flashy GUI, and it doesn't mean having a multitude of debuggers for multiple architectures. What it does mean is that there are as few distractions as possible when you're coding. You can compile with one command or click. You can place files into a clean test area with similar ease. You have all the tools you need most of the time. You don't need perfect Makefiles, or fantastically complicated scripts that allow it to compile on every architecture in the world, but you do need an easy way to do things for yourself.

For my projects, I often have Makefiles but use them sparingly or in a special way. And a lot of the time, if I'm working on a particular part of the code, I'll have a simple script that compiles, links and runs just the part that I need. I don't intend anybody else to ever use that script, I don't even distribute it, but it's a convenience when I'm programming.
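
Something like this is all I mean (a hypothetical sketch - the file names are stand-ins, and it assumes an SDL 1.2 setup where sdl-config exists):

#!/bin/sh
# Hypothetical throwaway script: compile, link and run just the part
# being worked on, so that testing a change is a single command.
gcc -Wall -g graphics.c test-harness.c `sdl-config --cflags --libs` -o testprog && ./testprog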

If you're developing for another machine, you should have a way to simulate or access that machine quickly. For instance, I develop for the GP2X, so when programming for it I usually have it close to hand. That doesn't mean I have to plug it in every session, but it's easily accessible and I can plug it in and test stuff on it. As there are two main types of GP2X, I have a way to "emulate" the other type so that I can spot potential problems etc. I also have telnet and VNC access to the machine so that I can leave it running and test multiple programs in many ways in one session without having to keep reconnecting and restarting the machine.

For some people, this might also mean artificially limiting the memory/CPU available to their programs when they want to see how it would run on their target machine. Or running a software emulator. Or connecting the machine itself, transferring the program over and seeing how it works. Especially in the last case, you want to make it as simple as possible.

Debuggers are useful but you should only need them relatively rarely. Simple problems are much easier to solve with code inspection. You'll usually know exactly where it crashed, and a quick glance will give you the why. Tricky problems can be isolated with a couple of printfs or their equivalent. And difficult problems don't NEED the facilities of a debugger - they need your brain most of all - but a debugger can then be a big help.

Have easy access to manuals and function references. Visual Basic used to come with a massive function reference manual that was incredibly useful. Sometimes just flicking through it would give you ideas about how to solve problems, because you found functions you never knew existed that could do things in a different way for you. The TI-85 calculator was the same, and even the ZX Spectrum came with a BASIC manual. Even a "For Dummies" book can be an incredible reference for a seasoned programmer, especially if you switch between languages often ("Is it Select Case or switch() in this language?").

If you're programming with a particular library, say SDL, have access to a complete reference for it, with examples. This is immensely helpful, even if it's just a bookmark of a good site. I use one for SDL that lists the C prototypes of functions and structures (which I copy/paste straight into a program and then edit to turn into an actual function call), a brief description with lots of hints on what is and isn't possible (e.g. don't run SDL functions inside SDL threads, etc.), and an example or two of using each function. It's amazing how much more helpful a simple example can be than a documented manual. Even a Google search for a function name will turn up uses and tricks you'd never have thought of, plus a variety of sites publishing "gotchas" for it.

8) Don't worry about the "proper way" of doing things.

You can't use GOTO. You should write performance-critical code in C. You shouldn't declare global variables. You should get structures down to their minimum required size. You should optimise all your functions. You should consolidate duplicated code.

Rubbish. Program for operation FIRST. You're not a NASA shuttle engineer, and you're not dealing with thousands of financial transactions. There are not going to be people sniffing over your code in disgust because you saved yourself restructuring a hundred lines of loop code by using a jump instead.

If you intend to teach a programming course, or you're contributing code to a critical programming project, then you should follow quite a lot of this advice. But if it's for a quick game that only you or a handful of people will ever want to see the code for, don't worry about it. A few of those items will help make you a more structured programmer, a few will reduce the risk of you leaving an accidental bug in your code, but the majority of them are the pedantic ramblings of people who learned to program on punchcards and consider every cycle sacred. Don't get me wrong - I hate unoptimised programs with shoddy algorithms that bloat memory and slow the CPU, I can describe a bucketful of pathfinding, tree and sorting algorithms and their O(N) scaling, and there was a time when I would audit Z80 assembly and literally save hundreds or thousands of cycles in even the smallest listing - but for casual programming all that sort of stuff can come later.

9) Think.

This is the best bit of programming. Getting your head around why the 100% logical machine in front of you isn't doing what you ask. Invariably, it's because you've done something wrong but finding it is the best part. Programming requires you to think not only logically but creatively ("I haven't got a way to use floating-point numbers on this machine, how can I get around that to draw a circle?", "I can't use this function from inside a thread, so how do I get this thread to do what I want?").

The best part, although it's shocking, is that you do NOT need a computer to program. Argh! What is he suggesting? No, I don't mean that you should write hundreds of lines of C on paper, perfect first time, but that you can program in your head or on paper just as easily. You should be able to craft the basic structure of a program in your head before you start to write it; you'll be able to spot pitfalls and estimate which way is better, or takes less RAM, or less CPU.

You can program on a busy train. You can program in a quiet room. You can program in the garden. And you don't need a wireless laptop to do it. When I was learning to program, the best part was to get so struck with an idea that you dug out some paper to get your idea down before you'd even sat at a computer to try it.

Sometimes this even involved primitive flow charts but most of the time it meant just sketching out some pseudocode and working out some primitive structures.

I would use other people's programs or operating systems and work out how certain things functioned ("This must be a doubly-linked list", "They must be spinning in a loop waiting for me to press a key", "The AI in that is watching to see when you move more than X units into a small space before it considers it an attack", etc.). A lot of the time I found myself stumped and sat down to work out how things worked, or a way to improve them. Most of my ideas never made it onto an actual computer, but that's the best part of programming - stimulating the brain.

10) Enjoy it, and make it fun

Programming is like a puzzle. The aim of the puzzle is to finish whatever program you started (and sometimes just getting the answer hidden in the yellow squares isn't enough and you have to go and finish off the rest of the crossword). The intellectual challenge is to do that and to debug it when it doesn't work (and some people subject themselves to such intellectual challenges all the time, even when the answer is only a calculator tap away). The point of doing it all is to have fun (once you've done the one-millionth wordsearch, or you've spent eight months on the same clue, the puzzles lose their appeal).

Why would you subject yourself to decrypting the Enigma code when you could just as happily do a tabloid crossword? Do things that are within your capabilities, are fun, aren't over-stretching your brain, and that you enjoy doing. That's why people program. That's why there are billions of lines of code just floating around out there that people are offering for free... they don't care that it was hard to do, it was fun for them to do it. Fun to prove it could be done, or that it could be done better, or that they could do it, or to get recognition from someone else for their hard work.

This is especially true if you're programming games. You would think that if you can't enjoy the game at the end there was little point in writing it, but actually even the worst of games can be fun to program.

Wednesday, April 09, 2008

STPPC2x Beta 2 - GP2X port of Simon Tatham's Portable Puzzle Collection

I've just released Beta 2 of my port of Simon Tatham's Portable Puzzle Collection to SDL/GP2X. It is much improved, and the only remaining "problem" is that mines doesn't work. The other 26 games run fine from start to finish.

I think the problem with mines is due to endian/signedness differences between x86 and ARM within the puzzle code itself (which I don't change in order to port any of the games). The game compiles and runs fine under x86 Linux, but the GP2X crashes with an assert on mines.

The beta 2 version of STPPC2x (and the associated source code) can be found here:

http://archive.gp2x.de/cgi-bin/cfiles.cgi?0,0,0,0,25,2543

Or on the website where I keep all the released versions:

http://www.ledow.org.uk/gp2x/

Saturday, March 29, 2008

GP2X port of Simon Tatham's Portable Puzzle Collection

Damn, it's a long time since I programmed in C.

I've just spent the last few weeks porting the GTK/Windows/MacOS/Palm collection of games that make up Simon Tatham's Portable Puzzle Collection to work on the GP2X handheld games console.

That involved the creation of an SDL "frontend" for the existing puzzle infrastructure (which, I have to say, was beautifully abstracted and documented from a programmer's point of view). It worked beautifully on the whole. The puzzle framework is designed in such a way that it can be easily ported by just filling out the definitions of a few dozen functions. Basing my work on the existing GTK port, most of the simple functions took only a line or two to convert to SDL (using the SDL_gfx library helped a lot too). Even the font functions were quite simple when using SDL_ttf. And because the GP2X is Linux-based, I was able to prototype all the functions using the GTK functions running simultaneously with their SDL equivalents on the same development PC, as you can see from the screenshot - each game runs an SDL and a GTK copy of itself, synching display and input between the two, while running on a normal Linux PC.

After they all functioned enough to give me a playable game in SDL, I started stripping out the GTK functions, libraries and dependencies and ended up with an executable that only needed SDL, SDL_ttf and SDL_gfx to run. That ran on any SDL-based architecture so I could compile the same code for either the GP2X or the Linux PC. For the GP2X it was the "simple" matter of cross-compiling them to the GP2X's ARM architecture, linking them against some ARM SDL libraries and then testing them on the machine itself.
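
The cross-compile step itself boils down to something like this (a hypothetical sketch - the toolchain prefix and paths vary between devkits, so treat them as assumptions):

#!/bin/sh
# Hypothetical sketch of building one puzzle for the GP2X's ARM CPU,
# using a cross-compiler and ARM builds of the SDL libraries.
arm-linux-gcc -Wall sdl.c puzzle.c \
    -I/opt/gp2x-toolchain/include -L/opt/gp2x-toolchain/lib \
    -lSDL -lSDL_ttf -lSDL_gfx -o puzzle.gpe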

But it has to be said that the hardest part of all was actually getting things to compile and link properly! I now officially hate Makefiles, gcc and linkers. I wasted more time trying to get code which I *knew* was working to compile and link in various ways without errors than I ever did coding! That really was the most horrible experience, and half the time I had no clue what the problem was, even after analysing and googling the various weird and nonsensical errors I was getting. Programmers should really not have to play about with such convoluted and finicky command lines just to get a simple C program with pre-compiled libraries to compile and link. Granted, wanting to compile a program for both x86 and ARM, with dynamic or static libraries, should mean a little tweaking, but it should never be as difficult as it was. I was actually *scared* to touch the Makefile each time in case I broke it, and I kept more backups of it than I did of the actual SDL interface code.

Anyway, I eventually got the puzzles to the point that they were mostly playable, mostly fun, mostly working and mostly pretty-looking. I've tested them all on GP2X and they are, on the whole, working (there is one that errors out in the actual game code, which I haven't touched, but only on the GP2X - a Linux machine with exactly the same code doesn't error. The other problematic one is just a bit too slow because it tries to do a lot in a short time on a slow processor). I was working my way through them one at a time to release them, because some of them exhibited subtle bugs, some of them required more work on input to allow them to "enter" numbers into the puzzles and at least one of them was (and still is) just being a pain in the bum. But now the vast majority of them work and I've built "beta 1" of what I have called (in a rare fit of originality) STPPC2x, or Simon Tatham's Portable Puzzle Collection for the GP2X. :-)

I'm surprised at how well and how fast they work, considering that the development 600MHz machine doesn't run them any faster than the GP2X at 200MHz and I'm not even using the hardware acceleration on the GP2X properly.

The first beta, with 25 out of 27 games playable, most of them near-perfectly, has been released on both the GP2X archive and at my website: http://www.ledow.org.uk/gp2x/ You can get the source code and toolchain that I used from my website too, because I find the most annoying thing about picking up other people's code is when you don't have the same compiler/libraries/filesystem setup as they do, so it's a problem to compile even before you start coding.

I consider it a good piece of work, considering that my last C excursion ended about an hour after starting it, when I ended up writing the program I needed in something else instead. That was several years ago, but I was able to pick this up and run with it. I estimate the total *actual* development time on STPPC2x to be approximately 48 hours for approximately 1000 lines of actual code (not including the Makefile), spread over a period of two and a bit months (it's hard to get in a mood where you want to fight with compilers, pointers and libraries). At least six of those hours were spent getting the Makefile to work properly. At least two were spent creating scripts to compile and link each file individually, because I was sick of the Makefile not working properly and needed something that worked *here and now* so I could work on either the Makefile or the actual code at my leisure. At least ten hours on top of those 48 were things like "waiting for stuff to compile", "waiting for stuff to copy", "waiting for stuff to upload to my website/the GP2X archive", "preparing screenshots, instructions, proper licensing info, directory structure etc. for releases" and "waiting for the GP2X to finish writing to the SD card". A 600MHz development machine trying to copy megabytes of files over a USB 1.1 connection to an SD card isn't ideal, especially when you're trying to write instructions, take screenshots, upload via FTP, etc. simultaneously. :-)



Anyway, it's been my first big programming project for a while. I'm always writing *something*, whether it's a shell script or batch file, a VB program to do a quick job, or something for my own entertainment, but they all involve different standards of work - either I'm the only person to ever use the program at all, or it's for a limited technical audience, and it's very rare that the source code is ever looked at in either case.

In the first 24 hours, it saw 100 downloads from the GP2X archive alone and I haven't looked at my website's logs yet. At least I know that it was worth my while.

Thursday, March 06, 2008

Darren Stone's WinXScreensaver mirror

Ever since I posted my article about the substrate screensaver, I've been getting emails of thanks (even though it wasn't my program!). Now, it seems, the author of the substrate screensaver port to Windows has let his website lapse and it's no longer available from there.

Well, that's not a big problem for substrate fans because I mirrored that particular screensaver along with the article I wrote and someone else has also ported it to Windows, as mentioned in the comments in the article above. But it now looks like people want the full program of WinXScreensaver with all the other screensavers and some curious screen-saver management abilities. Having scoured my extensive software archives of "stuff that'll come in useful one day", I have managed to locate a copy of this file (I never delete anything, but that doesn't always mean I can find where I put it. The Linux slocate command is worth its weight in gold as far as I'm concerned).

Given that it's based on XScreensaver which uses the X11/MIT open source license, I assume that the license of the Windows port allows me to mirror it and post it. I also assume that the original porter, Darren Stone of http://tron.lir.dk, doesn't mind me posting a copy here. Darren, shout if I'm wrong.

So, seeing as mirroring old files is now becoming a bit of a hobby for this blog, there is now a copy of the file up at my website.

It's WinXScreensaver1.1-Install.msi that I have; I have no idea whether that was the latest version. Archive.org's history of the site is sparse at best. It's definitely an unmodified working version, though, because it's the one that I used to trial the program for something fancy I was going to do on my network. Hopefully people will find this useful. Enjoy! And give credit for the port to both the XScreensaver authors and Darren Stone.

Friday, February 29, 2008

Useful Scripts, Part 2: Linux "file sorter" script

The following Linux-based script (which you can download here) is also one I find myself using on occasion. I've made versions of it that run in just about every OS I've used at one time or another. It's a script that takes a large directory full of files, creates 37 subdirectories (A-Z, 0-9 and "other") and moves every file in the folder it's run from into the subdirectory matching the file's first character.

This is great when you have a large folder full of files that takes forever to "ls" or access over a network or a folder that is unmanageable for quickly finding particular files in a GUI. It's case-insensitive (I'm sorry, but DOS got this bit right, case-sensitivity can be a real hindrance at times, no matter how careful and experienced you are) so "Afile" and "anotherfile" both get put into the a/ folder.

I've used this on emulator ROM folders when they got too many files in them. I've used it (or a Windows "port" of this script) in user-profile folders on servers to help narrow the users down a bit, and even to move portions of them to a different storage medium in a logical manner (it's better to move A-L to another drive than to move the first 1000 users, from an "I'm only human" point of view). I've used it on archives of zip downloads, photographs, instruction manuals, all sorts. I hope someone else finds it useful.


#!/bin/sh
# Sort the files in the current directory into a-z, 0-9 and "other"
# subfolders according to their first character (case-insensitive).
for LETTER in a b c d e f g h i j k l m n o p q r s t u v w x y z
do
mkdir $LETTER
UCASELETTER=`echo $LETTER | tr a-z A-Z`
# The new folder matches its own glob, and mv complains when a glob
# matches nothing, so hide those harmless errors.
mv ./$LETTER* ./$UCASELETTER* ./$LETTER/ 2>/dev/null
done
for NUMBER in 0 1 2 3 4 5 6 7 8 9
do
mkdir $NUMBER
mv ./$NUMBER* ./$NUMBER/ 2>/dev/null
done
# "other" is created last so that it isn't swept into the o/ folder,
# and only plain files (not the folders just created) are moved into it.
mkdir other
for FILE in ./*
do
[ -f "$FILE" ] && mv "$FILE" ./other/
done

Useful Scripts, Part 1: Linux "screen" wrapper script

A few people in the past have asked me for a few of the hundreds of scripts that I have made for myself. My life is one long session of script/batch-file writing (I use the terms interchangeably because a list of commands to run is a list of commands to run, no matter what language or OS it is written for). There are school networks running on some of my scripts!

I'm not so keen on letting the larger, more tedious scripts out without doing a proper licensing job on them (open source, of course), but the little ones that are just convenient wrappers etc. I don't mind giving away. They are so trivial you can hardly claim copyright on them, even if the only intention of that copyright is to give them away but ensure you get recognition. Anyway, here's the first one, for Linux. It's a wrapper for the "screen" command. You can download the script from here.

Screen is a powerful command. It's ALMOST as good as when you first use a Unix-based machine and are taught to use virtual consoles (pressing Alt-F1, Alt-F2 etc. lets you have a variety of command lines open at the same time, even if you aren't in a GUI). Screen lets you run tasks in the background, with their output hidden, in such a way that you can "resume" or connect back to the screen that the program is displaying, from anywhere. It's a kind of VNC for the console. So, for example, if you want to run a program like IPTraf or htop but want to be able to see its "screen" from an SSH session later when you're at work, you can do this.

The script itself is quite simple, and I have a habit of calling it my "do offline" script, so I normally name it do_offline.sh:


#!/bin/sh
# Run a command in a detached, named "screen" session, at "nice" priority.
if [ -z "$1" ]
then
echo "First arg is screen name, rest is command to run"
else
SCREEN_NAME=$1
shift
echo "Running command ($*) under screen name $SCREEN_NAME"
# -d -m starts the session detached; -S names it so it can be resumed later.
screen -A -m -d -S "$SCREEN_NAME" nice "$@"
fi


Basically, you run it as:
do_offline.sh "screenname" "program and arguments..."


e.g.

do_offline.sh htop htop


do_offline.sh large_file_copy cp -R *.* /mnt/backup/


do_offline.sh kernel_compile make


And it will happily run those commands in the background while putting the output into an invisible "screen". This is useful if you are logged in via SSH... rather than having to fudge access to the F1-F12 consoles, you can start jobs in their own screens and then disconnect and reconnect from them as necessary without interrupting the job. You can start a 50Gb download remotely and not have to interrupt it when you disconnect your SSH session (the program will just carry on running in its screen, and you can "resume" the screen from the local console when you get home to see if it finished).

To resume the screens, you just use:

screen -r screenname


e.g.
screen -r kernel_compile


And the one annoyance... to "disconnect" from a screen without interrupting the program it's running, you just press Ctrl-A and then D. Horrible keyboard shortcut.

For example,


# do_offline.sh htop htop
# do_offline.sh kernel_download wget http://www.kernel.org/pub/linux/kernel/v2.6/linux-2.6.24.3.tar.bz2
# screen -r kernel_download

Is it done yet? Nope?

Ctrl-A D
# cd /

Go off and do something else, disconnect from the computer, go to work and log in to this computer remotely.

# screen -r kernel_download
There is no screen to be resumed matching kernel_download.

So it's finally finished and we can compile it. But I want the computer to have compiled it by the time I get home

# do_offline.sh kernel_compile make

Go home, log back into the machine at the local console

# screen -r kernel_compile

It shows up as still compiling, so I just watch the progress for a few minutes and then...

Ctrl-A D
# screen -r htop

Have a look at the CPU usage, wait for it to dip

Ctrl-A D
# screen -r kernel_compile
There is no screen to be resumed matching kernel_compile.


My kernel is downloaded and compiled without me having to watch its progress, limit its output to a single virtual console on a single physical machine, or keep the remote session connected constantly while it does it.

I hope it's useful to someone. It's already been useful to me and to at least a couple of people on the Linux Questions forum.

Monday, January 21, 2008

GP2X handheld Linux games console - A review

For Christmas this year, I managed to persuade my other half that a GP2X would be an ideal present to keep me quiet. For those who don't know, the GP2X is a handheld games console whose main selling point is that it runs Linux behind the scenes and has buckets of "homebrew" software, including ports of popular open-source games and emulators. As I've stated in a previous article, to me that makes it more valuable than any other games console I've ever owned - I can load it up with "fun" older games and play for thousands of hours, rather than spend a fortune on a single modern game which I'd play for about a day before I got bored or completed it. (Incidentally, thank you Nintendo for the Wii and for bringing the fun back into gaming!)

Anyway, I am now the very proud owner of a "Mark 1" GP2X, called the F-100. This is the black version with the original "joystick" rather than the touchscreen and digital joypad. To be honest, I didn't specify which version and, knowing what I know now, I'm glad that all the wife could afford was a second-hand F-100. It's personal choice, but I can sacrifice the features of the successor F-200 for the features that the earlier model has. Other people would disagree, depending on their usage.

Most of the people who are interested in the GP2X, or indeed its predecessor the GP32, will know about the handheld's features, but for a quick rundown...

It runs on two "off-the-shelf" ARM chips (an ARM940T and an ARM920T), with variable (and software-controlled) clocking between 60MHz and 260MHz each, the default being 200MHz. The overclocking is quite "safe", in that it doesn't cause permanent damage unless you do it for very long periods of time and overheat something. In fact, more powerful games can only run at the higher speeds and will "overclock" the device themselves - all that happens is that it works or it crashes. All that's required to "recover" from a crash is a switch off and on again to get it back to normal.

It comes with 64Mb of RAM (32Mb is directly accessible; the rest can be used with some trickery and often is) and 64Mb of internal NAND permanent storage. This contains the bootloader, kernel, some built-in applications like the menu and maybe even some games, depending on your particular purchase. This can all be replaced and customised but you won't gain much. The NAND allows people who forgot to get an SD card to use the console straight away, and also provides an avenue for the vendors to pre-load certain games if you buy their bundles. Also, because it has to be a deliberate act to affect the NAND bootloader or kernel, you aren't going to brick your GP2X accidentally. NAND firmware updates tend to come in the form of a bootable SD card.

For main storage, you can use an SD card up to 4Gb (32Gb in the later F-200 model) formatted as either ext2 or FAT32 - most cards come with FAT anyway, and if you want to use the connectivity features you're better off with the more prevalent FAT. This is where most of your games etc. will go, and it's quite easy to have hundreds or thousands of games on a single large SD card. And, let's be honest, SD cards are so tiny that you could easily carry a handful with you and fulfil every gaming need.

The firmware runs U-Boot to boot pure Linux 2.4 as the core OS (source and alternative firmwares are available, but you don't gain much because GamePark Holdings, the manufacturer, did a good job in the first place) in around ten seconds, with a nice splashscreen and bingely-bingely-beep startup sound. All of the "menus", games and emulators are just ordinary Linux programs compiled with GCC for an ARM target. So everything is either open source from the start or has many open-source equivalents - even the main menu, boot-up screens, built-in applications, etc.

Because it relies on BusyBox internally, it's not even unusual to see wrapper shell scripts around games in the Games menu. It has a simple but effective method of program execution: when the GP2X starts up, it boots and then runs the menu program (a standard Linux binary). That lets you select a game/application to run. On termination, each individual application is responsible for making sure it exec()'s the main menu before it cleans up. It's beautifully simple, but it works and it prevents the menu hogging RAM while you're playing a game. And if a program ever crashes really hard, you just switch off, switch on, and it boots the menu back up again. Programs have full access to NAND and SD storage for savegames etc., but they tend to only place things in their own folders. This does, however, allow you to have a collective "roms" folder and use several different emulators with the same roms.
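
Those wrapper scripts are often only a few lines. A sketch of the pattern (the paths here are from memory, so treat them as assumptions rather than gospel):

#!/bin/sh
# A typical GP2X launcher script: run the game, and when it quits,
# exec the menu binary so control goes straight back to it.
cd /mnt/sd/games/mygame
./mygame.gpe
cd /usr/gp2x
exec /usr/gp2x/gp2xmenu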

It operates off two AA batteries, although they HAVE to be high-power rechargeables (preferably 2800mAh) - what do you expect for a dual-200MHz portable machine?! You can get a good couple of hours out of a set of two depending on what you're doing. There is also a mains-adaptor port for static use and you can easily carry enough AA batteries to last you all day if need be. The only minor point here is that the mains adaptor doesn't charge the batteries, but you can't have everything.

The screen is full-colour, 320x240 and very good in virtually any lighting. It is covered by a plastic protective screen about 2mm above the surface, which protects the expensive bits against that pen you keep in your pocket. There are two speaker grilles on the front (I believe only one is an actual speaker, but stereo sound is present in the headphones) and the SD slot sits in the very middle at the top.

It also has a headphone socket, power socket (3.3V regulated), mini-USB socket and EXT socket (we'll get to those last two in a minute). The two AAs sit comfortably in a rear "bump". The game controls (on the F-100) are: a "mini-joystick" on the left of the screen which works surprisingly well; A, B, X, Y, Start and Select buttons in their usual places; Vol+ and Vol- just underneath the joystick; and L and R shoulder buttons. On the F-100, the joystick also "clicks" down to provide another button. Sadly this was removed from the F-200 model, because the joystick was replaced with a 4-way D-pad and a touchscreen was introduced over the LCD.

The GP2X is comfortable to hold for long periods and the design is "flat" on the front, except for the joystick which can be controlled by a wiggly thumb or between two fingers for precision control. The batteries are tucked away from your fingers so it feels quite thin. You can't accidentally eject the SD card or knock the battery cover off while playing and all the other ports have rubber covers to stop you poking things in them accidentally. Headphones plug into the top, keeping the lead out of your way.

The mini-USB socket allows you to connect the supplied USB cable to access the GP2X from a PC - of any kind. No driver software is required and it appears as a standard mass storage device, so Linux, Windows and Mac can all "manage" the device's files. Installing a game can literally be a drag-and-drop. You select what content you would like the PC to access each time - either the SD card or (F-100 only) the internal NAND - and it just pops up as a removable disk.

You can copy your games to your GP2X without switching off by using this feature, or you can just eject the SD card and use an SD card reader in your PC (not supplied). There is also a plethora of USB options on the earlier model - the F-100 runs what is known as a USB "gadget" interface, so that it can appear as a "device" to normal PCs. This allows it to be seen as a USB network card or a USB HID device (so you can control Windows games with its joypad, for example), and it has a built-in web server, telnet server and Samba server for access over the USB network. Sadly, these features are lacking from the later model F-200, which I see as a great loss; they were instead replaced with a touchscreen interface in addition to the normal control methods.

Because it's all just Linux, you get some fanatics doing things like plugging a wireless or Bluetooth USB adaptor into the socket and porting a driver for the device. Strangely, they often work, although the practicalities of a handheld limit their usefulness. There are ports of games designed especially for accessing a Nintendo Wiimote over a USB Bluetooth device, for example.

The EXT socket allows for a whole new range of options. First, TV-out. Yes, this little device can display on your TV! Some games can appear blocky in this mode but some make use of higher resolutions when they detect the TV-out cable. Either way it makes for much better "static" multi-player fun.

Additionally, the F-100 has a peripheral available called the Cradle - essentially a "break-out box" which connects to the EXT port to give you access to four USB ports, for connecting devices such as joypads, keyboards, mice, USB keys, etc. Games have to support extra controllers, but most popular ones do. The GP2X also directly recognises USB mass storage devices connected to it. The break-out box features TV-out itself too, plus JTAG programming ports (for hard-core tinkering and "un-bricking"), additional audio-out, a power-supply connector and RS232. It's safe to say that the break-out box isn't really that portable, because it's intended as a home device for development, or for using the TV-out feature to turn the GP2X into a home console.

But the important thing is: how well does it play games? Well, the absolute best examples for "showing off" don't run to much if you're looking for 3D power in your handheld, but considering the device's specifications they are very impressive. Payback is a GTA 1/2 clone (some might say a bit TOO close to the original) with 3D, dynamic lighting etc. and it plays really well. It has to be said that this is the showpiece of the GP2X, and little beats it in terms of hardware use, speed and visuals. On the homebrew side, a complete port (yes, port, not remake) of Quake is the best, in visual terms, that you will see - and it's compatible with virtually every Quake mod, including the official ones. However, the little beast should not be underestimated - Quake running at full speed on a device like this is no mean feat when there is no dedicated 3D hardware.

The GP2X, it has to be understood, is not going to out-perform much at 3D. It's based firmly on 2D, from design to manufacture to software, and that's where it excels. There are ports of almost every 2D GPL Linux game available - SuperTux, Crimson Fields, LBreakout, Liquid War, GNU Chess, Quake, Hexen, Clonk Planet, etc. - there are dozens. But that's NOT what the GP2X is for - I'm sorry, but it's not! Neither is it to be used for its built-in MP3 player, eBook reader or video player (DivX compatible). Nope. This thing is an emulation machine, pure and simple. The "official" archive is full of games, but emulators top the download charts every month.

ZX Spectrum, Amiga, Atari, Commodore 64, Game Boy, NES, SNES, Master System/Game Gear, Genesis/Megadrive, arcade games - they all have emulators that run on the GP2X. Only the most demanding tax the little workhorse, and for me that was more than enough. I can play all my old favourites, full speed, on a little portable device that fits comfortably in my inside pocket. This is also the ultimate test of gameplay on the joystick - pulling off Ryu's special moves on a SNES emulator running Street Fighter 2 is flawless (I've heard the F-200 has more trouble because of its D-pad?). The button layout is very well thought out and lets you emulate SNES controllers virtually perfectly, and every other controller comfortably. You never feel that there's a button mapped to an impossible place.

The speed, graphics and sound of the above-mentioned emulators are perfect for the vast majority of games at default settings - it's always the ones with the special chips that give you performance problems. Game Boy games feel perfect, and SNES games work perfectly if they use the "basic" chips, so Super Mario World and Mario All-Stars are flawless but things like Street Fighter Alpha, Starfox and Yoshi's Island will suffer. There is a port of MAME available with over a thousand games playable. Most '80s arcade games are fully playable, and later ones are hit-and-miss depending on their specifications. I love Final Fight, Wonder Boy, Pang, Ikari Warriors etc. and was very glad to see that they all ran perfectly.

You can stretch the machine to higher-end games (there's even a PSX emulator for the very optimistic) using the built-in overclocking options in most emulators, but you rarely go from "aw, it's unplayable" to "yay, it's perfect" by doing so. Also, as with all overclocking (of which I am a massive opponent when it's used on PCs), it varies considerably based on the particular manufacturing that went into your particular device. Some people can overclock their GP2X to 270MHz and beyond without problems; others can't get much past the 200MHz default. Oh, and it can kill your batteries much more quickly, so in fact what you find yourself doing is finding "sweet-spot" under-clocking limits for every game, so that you can save battery power without sacrificing gameplay. Most emulators allow you to do this on a per-game basis, which helps you save as much as possible.

Emulators are definitely leading software development on the GP2X - the RAM-timing and MMU hacks that vastly improve performance originated from a desire to get every ounce of power out of the GP2X, and are now present in every emulator and in most homebrew games. MAME lets you use four USB joypads connected to the handheld for multiplayer action - a rare feature in other GP2X games, but one that has been copied into most emulators for the platform. There is a smattering of commercial games, none really priced higher than about £10, and their quality does show through, but not many of them can compete with the "free" games coming out of the porting community. I thought that Payback was well worth the money, but I'm not sure I'd fork out for some of the puzzle games. I'd much rather give a homebrew author the money for a particular favourite.

And with a 2Gb SD card, you can fit almost everything you'd want (including a couple of hundred MP3s and a video or two) onto a single card.

On top of emulation, homebrew would be the next software "feature". If you can compile for ARM targets (easy with GCC and the various devkits available for the GP2X), base the game on Allegro or SDL, or be prepared to write a little hardware code, you can get games up and running in minutes. The hardware was designed to be accessible - it's all just Linux. You can get the joypad showing up in /dev/joy, you get sound out of /dev/dsp, and you can do some memory-mapping tricks on /dev/mem and /dev/fb to create double-buffered video in the slightly trickier top 32Mb of RAM. Everything is just Linux 2.4 with some extra features here and there to let you tweak the LCD backlight, control the battery-low light, speed up either CPU, etc.

There are versions of BASIC available which target the GP2X and are designed to create the sort of mini-games that can be more fun than commercial games - sites running nothing but Flash games are proof of this on the Internet, and on the GP2X you can knock up similar games in minutes using one of dozens of development packages - Fenix, Python, BASIC, all sorts of languages are available. There are hundreds of games available, some diabolical, some fantastic. There are even ports of ScummVM, Albion, Descent, Doom, Duke Nukem 3D, Ultima 7, Heretic, Hexen, Rise of the Triad, various DOSBox-based games, Frozen Bubble, Super Mario War, OpenTyrian and even the graphical version of NetHack! (I'm sorry, you can't leave that game out of the list!) Every year there is a competition run for the best GP2X homebrew or ported game, and the winners can be very impressive.

At the moment, when I'm not playing the "oldies" on an emulator, I'm playing Ghostpix (a very polished Picross/Nonogram/whatever-you-want-to-call-it puzzle game), SuperTux, Liquid Wars, Kuoles, FreeDroid, Frontier2x (an Elite 2 port), Quake, and a million other "five-minute" games that are just fantastic for a handheld console.

And the most important thing - the GP2X makes gaming fun again. You plug it into your PC, download some stuff from the web or the official archive, throw it onto the SD card over the USB cable, then go to Games, GameName, GameExecutable and play. You can even use the eBook reader to read the instructions on the device itself (emulators tend to have a lot of instructions because, for instance, SNES emulation demands quite a lot of buttons to be mapped, so you need to know how to get back out to the menu - usually some combination like Vol+ and Vol- simultaneously, or R+L+Start).

A lot of thought obviously went into its design. It's sleek, small, comfy, durable and practical. It has plenty of connectivity and hacking potential (even the F-200). It works well and is sturdy. Controls are obvious (Vol+ and Vol- work in just about EVERYTHING, even though they are software-controlled) and well thought out. The built-in applications are more than good enough and substitutes are easy to come by - there's even one that makes it look and work like the PSP interface.

It could benefit from an add-on battery pack like the PSP has, especially if such a thing could give it several hours even at the highest demands. And it really needs a built-in charging circuit. But apart from that it's very, very good, and it should really get more attention from hardware designers.

For the next model, a "hybrid" of the first and second models would sell much better - sacrificing old functionality for new functionality isn't a good choice to make. If you could upgrade its 3D capabilities without destroying backwards compatibility (the homebrew/porting scene is far too important to just discard and try to build another), that could only be a good thing. I don't see it being impossible, even if it's only in the form of a 3D accelerator chip and a custom OpenGL library to manage it. But on many fronts it's already perfect.

The capabilities are there for networked games (at least on the F-100), but it doesn't appear that many people have used them. Maybe a small wireless or Bluetooth chip could solve that problem in a backward-compatible way (especially with the Wiimote being Bluetooth-based, Bluetooth looks set to be a standard for wireless games controllers) - it's not like the drivers for such things would be impossible to port. You could easily upgrade to a 2.6 kernel (some people already have!) for the next version and open up a whole new world of drivers to take advantage of. You would have to tweak it, though, to ensure power use stayed as low as possible - but embedded kernels aren't exactly rare.

But the best thing about the GP2X is the reputation that goes with it. Nobody knows what it is, so you get some strange looks when you produce it from a pocket on the train - and some even stranger ones when someone recognises the Mario "ting" from an obviously non-Nintendo device. It is absolutely fantastic for whiling away long journeys, it has to be said, purely because of its design for a short attention span... listen to some music, read an ebook, play a SNES game, listen some more, play some Megadrive, listen some more, carry on that campaign in Crimson Fields, etc.

All in all, the GP2X has managed to do what a lot of the larger handheld console developers haven't. It turns a profit based purely on hardware. Software is especially prevalent even though there were few "launch" titles. It's great fun and well designed. And it has a community effect that's unmatched.

Here's to a GP2X sequel that's even better!