Sunday, October 28, 2007

Why can't my computer...

There are a lot of things that annoy me about computers or, more usually, about Windows in particular.

Why can't my computer:

1) Accurately gauge how long it will take to do something? (Prime Culprit: Windows)

File copies, program installations, downloads, no matter what it is, if it's got a progress bar it's not going to be accurate. Not only will it not be accurate, it won't even be close most of the time. Sometimes, granted, it'll be spot-on but that's just got statistical averaging written all over it. Why, in the middle of a file copy, does it suddenly decide that it's going to take 3 days, no, 10 seconds, no, wait, 18 hours, no, hold on, ... ?

Yes, not all things are predictable, but the computer could at least estimate (within a reasonable margin of error) and not blindly churn out random, wildly varying estimates without even flinching. It'd be much better if, when it gets confused or held up, it just gave up until it knew again! (KDE does do this, for instance, when copying files... it'll just say "Stalled" for a second).
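For illustration, here's a minimal shell sketch of the kind of smoothing I mean: keep an exponential moving average of the transfer rate rather than trusting the instantaneous value, and admit to being stalled when the average collapses. The rate samples are simulated stand-ins for real byte counters:

    #!/bin/sh
    # Smooth the rate with an exponential moving average (alpha = 0.3)
    # so the ETA doesn't leap from "3 days" to "10 seconds" and back.
    remaining=1000000
    ema=0
    for rate in 50000 2000 0 0 0 47000; do   # wildly varying samples
        ema=$(( (ema * 7 + rate * 3) / 10 ))
        remaining=$(( remaining - rate ))
        if [ "$ema" -gt 5000 ]; then
            echo "ETA: roughly $(( remaining / ema ))s"
        else
            echo "Stalled"   # better to admit it than to guess
        fi
    done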

2) Know where the drivers for a bit of hardware will be? (Prime Culprit: Windows)

Why do I have to point it to a driver that I've either had to insert myself or, a lot of the time, had to go to www.randommanufacturerswebsite.com/techsupport/drivers/windows/xp/driver/thingamajig/v8/revision2/setup.exe and download myself before it will recognise the hardware I bought?

Why can't there be a standard for hardware, so that everything USB- or PCI-based contains some information that tells the computer roughly WHERE a set of drivers will be? Or, failing that, some website where it can automatically look up which drivers exist, and where, when given a PCI/USB ID, even if it gets "Not Supported" or "Unknown" some of the time?
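To show how little is actually missing here: the hardware IDs are already sitting on the bus, waiting to be read. A sketch of the lookup the OS could do for itself, assuming a central registry existed (drivers.example.org is made up - no such service exists):

    # Take each PCI vendor:device ID and ask a hypothetical central
    # registry where its driver lives; "Unknown" is an acceptable answer.
    lspci -n | awk '{print $3}' | while read -r id; do
        curl -sf "https://drivers.example.org/lookup?pci=$id" \
            || echo "$id: Unknown"
    done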

3) Know what I mean? (Prime Culprit: All OS)

When I type www.fredc.om into my browser, why can't it just correct it for me (maybe with a Google-style "did you mean?")? Some mistakes are hard to catch, but it doesn't even catch the little ones: wwww.fred.com or www,fred,com, for instance. And when I type a file copy command, rather than just erroring at me because I mis-spelled or, in Linux, mis-capitalised a filename, it could have a small go at working out what I meant and ask me if it's right. (Yes, there are certain plugins etc., but I want it to be a standard feature. I can't be the only person who's made these silly typos!)
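It wouldn't even be hard to approximate. A rough sketch of a "forgiving" copy as a shell function (cpish is a name I've made up): if the source file doesn't exist, look for a case-insensitive match and ask before using it.

    # A cp wrapper that has a small go at working out what you meant.
    cpish() {
        src="$1"; dst="$2"
        if [ ! -e "$src" ]; then
            match=$(find "$(dirname "$src")" -maxdepth 1 \
                    -iname "$(basename "$src")" | head -n 1)
            if [ -n "$match" ]; then
                printf "Did you mean '%s'? [y/N] " "$match"
                read -r ans
                [ "$ans" = "y" ] && src="$match" || return 1
            fi
        fi
        cp -- "$src" "$dst"
    }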

4) Protect me from others and myself? (Prime Culprit: Windows and Linux)

Now, I'd like to point out that this is almost exclusively a Windows problem for one part of the question (protecting me from others) and almost exclusively a Linux problem for the other (protecting me from myself). Windows just doesn't go far enough to stop other people getting into/onto/through my computer's defences. Not even after the 50th version which has promised to do so. But it has some fantastic features to protect me from myself, if they are enabled. File deletes are, on the whole, confirmed first and undoable later. Plus, it's quite hard to completely shoot yourself in the foot by, say, deleting your Windows directory accidentally. On the other hand, it's extremely easy, if you are not careful, to totally balls up your Windows installation just by clicking on the wrong website, the wrong email etc. or even disabling the wrong service.

Linux, though, protects you from outside elements a lot more. And even if they do get through, it is quite easy to recover from them and, additionally, their impact will be limited to the user accounts that are affected. However, even as a normal user, you can wipe out your home directory in one command without any confirmations and with little chance of getting it back unless you have specifically put into place procedures to recover it (such as replacing commands with safer versions, configuring user accounts so that they can't do that sort of thing, or just having easy-to-restore backups in place).
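By "safer versions" I mean something as simple as this sketch - an rm replacement that moves files to a trash directory instead of destroying them, plus making the real rm confirm:

    # Move things to a trash directory rather than deleting them outright.
    trash() {
        mkdir -p "$HOME/.trash"
        mv -- "$@" "$HOME/.trash/"
    }
    alias rm='rm -i'   # at the very least, make rm ask about each file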

So it seems that Linux could benefit from a bit of Shadow Copy, a bit of System Restore or some kind of filesystem rollback, and Windows could benefit from a bit more privilege separation, a bit better programming and a focus on non-virus software rather than anti-virus software (i.e. before-the-event practices that stop the viruses getting on there so easily in the first place).

And this doesn't just apply to desktop environments. Humans make mistakes. Operating systems should be designed to take account of this fact and help where possible.

5) At least give me a clue? (Prime Culprit: All OS)

"mplayer: error while loading shared libraries: liba52.so.0: cannot open shared object file: No such file or directory"
(Note that this is an example only - when you compile mplayer from source, it does in fact warn you or leave out support when pre-requisites are missing).

Well. Lovely. Fantastic. So you know that you NEED liba52. You won't run without it. You were obviously written with it in mind. So why can't I instead get:

"Mplayer: Error: You haven't installed liba52. You can download this from http://liba52.sourceforge.net/"

Now, with modern dependency checking this sort of thing is getting rarer, but even so, where's the error message that a human can parse easily? Windows does just the same with missing DLLs. And getting these messages at compile-time but not at run-time is another bind. Fine, tell me gently at compile-time that I need libX and exit neatly. But why not do the same when I move that binary to another machine that doesn't HAVE libX, instead of erroring as above? More people RUN programs than COMPILE them. People compiling programs usually have the sense to sort such problems out for themselves (and such a trivial error is nothing compared to some of the doozies that you can get when compiling software for yourself!); ordinary users can't.
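The information needed for a decent message is already there for the asking - ldd will happily list exactly what's missing. A sketch of a friendlier launcher (the mplayer path is just an example):

    # Before running the binary, name any missing libraries in plain
    # English instead of letting the loader print its usual cryptic line.
    missing=$(ldd /usr/bin/mplayer | awk '/not found/ {print $1}')
    if [ -n "$missing" ]; then
        echo "mplayer cannot start because these libraries are not installed:"
        echo "$missing" | sed 's/^/  - /'
        echo "Your distribution's package manager can usually supply them."
    else
        exec /usr/bin/mplayer "$@"
    fi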

Similarly, dumbing down error messages too much is a major mistake:

"An error has occurred."

What error? Why? Whose error was it? Was it my fault? Was something wrong with the machine? Where did the error occur - in the program, in Windows, in something else? What can be done about it?

As a replacement, how about:

"An error has occurred in PartyPoker. [[Note the friendly program name]] This appears to be a problem with that program. You can try running it again, or checking for a PartyPoker update. If you still receive errors with PartyPoker, the program gives the website address www.partypoker.com/problem as a source of help. Click below for a file which will help the program author to determine the cause of the problem."

Or:

"An out-of-disk-space error has occurred. Windows is showing you this error because it did not have enough room to create a 50GB file on drive C: as requested by the program Nero Burning ROM. You have only 10GB of free space on drive C:. You can try:

- Clearing up 40GB of space on drive C: and retrying the operation
- Instructing Nero Burning ROM to use a drive with more space (for example, D: currently has 100GB free)"

6) Fix itself. (Prime Culprit: All OS)

Windows.
Windows Last Known Good Configuration.
Windows Safe Mode.
Windows Recovery Console.

Where's the "I need to get to my files" option - with a minimal desktop that uses NO programs, services or other information in common with the main Windows install and lets you copy your files off the computer before it dies completely? Where's the "Run Diagnostics" option to let Windows have a go at trying to find out what is actually wrong, rather than blindly looping through a list of Windows "versions", each of which loads fewer and fewer drivers? While we're at it, where's the "Check My Disks/Memory/Hardware" option in that list?

Where's the "Right, last time I crashed loading the graphics driver, according to these logs - this time I'll ignore graphics and just load a basic VESA driver and see if I can get further" logic?
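That logic needs nothing cleverer than a marker file. A minimal sketch, where load_driver and the driver names are hypothetical stand-ins for whatever the boot scripts actually do:

    # Detect "we crashed here last time" with a marker file: if it still
    # exists at boot, the previous attempt never finished, so fall back.
    MARKER=/var/run/gfx-attempt
    if [ -e "$MARKER" ]; then
        echo "Previous boot died loading the graphics driver; using VESA"
        DRIVER=vesa
    else
        touch "$MARKER"
        DRIVER=nvidia          # hypothetical preferred driver
    fi
    load_driver "$DRIVER"      # hypothetical helper
    rm -f "$MARKER"            # only reached if loading succeeded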

And then we have the fantastic idea to include an option, which is usually the default, to automatically restart Windows on error (great, so you can't even SEE the BSOD when it whizzes past, and then Windows will blindly sit there trying to get into Windows every time it reboots until you come and fix it - it'd be better just to turn itself off!). Yeah, there's sometimes a need for a watchdog on a high-availability server, but on an OS designed for Home Desktop use? And what's the point of it just infinitely restarting at the same place unless it LEARNS from that mistake, especially if that place is before it even gets to a desktop?

That's just the start of my list. Hopefully, I'll finish it off soon.

Friday, October 19, 2007

What does the Linux desktop need? Those who say "I want, I want..."

I've just read an article linked on LWN.net entitled "What does the Linux desktop really need?".

Let's veer slightly and ask a similar question: "What does my car need?". I'd say it NEEDED a lot less rust, a sunroof that isn't held in by parcel-tape, a new wheel-bearing (AGAIN) and something to make my lights turn on at least once in every ten tries.

Personally, I'd say it could also do with electric windows, heated front windscreen and a CD changer/MP3 stereo. But wait. Hold on. If we're saying that I can have anything I've seen on a car... I "need" braking-power reclamation, a hybrid engine, 0-60 in 5 seconds, a five-year warranty, finger-print recognition for starting the engine and GPS vehicle tracking.

Now, that's ridiculous, because my car doesn't NEED those last things, but that's basically what this article was saying. The Linux Desktop doesn't NEED anything else. It's there. It's a viable alternative to Windows. It can do anything that Windows can do (given the developer time investment). Years of development on the Windows side are now recreated in a matter of months on the Linux side - take drivers for things like newly-released wireless cards, some of which have to be reverse-engineered before a driver can be made, or take some of the fancy graphical effects present in Vista and some of the desktop "features" of MacOS and Vista: there are already equivalents and copies available for Linux that can do just the same, most of which were started AFTER someone had seen those features elsewhere.

There isn't a type of application that can't be run natively, in theory. Given enough horsepower, we can even replicate the majority of Windows functions enough that high-power applications and 3D games can be run in a Windows-userspace recreation (Wine) at astonishing speed considering the technical problems of doing such things. Not only that, Linux can do virtually everything that Windows can do natively, and usually does a better job at it. There's nowhere to go from here apart from getting people to a) use the thing and b) develop for the thing, both of which are mutually dependent.

The LWN comments on the article are even worse... it "needs" Photoshop, Office, Games... No, it doesn't.

It's been proven - it's technically possible to write top-class 3D games and powerful image-editing programs for the Linux desktop. It's not even any "harder" than doing so for Windows. When Adobe want to do it, they can. In fact, Linux is more standardised for such things. You don't need to worry about ATI vs nVIDIA vs Intel - just let OpenGL sort things out for you.

The fact is that the desktop doesn't NEED anything, unless you are intent on recreating Windows on Linux. That's the problem - the Windows mentality isn't suited to, or compatible with, the way Linux works. Windows people want firewalls that don't disrupt their games and let any application request an open port via UPnP. Windows people want antivirus because they think they need it. Windows people want perfect connection to the heap of junk that is Active Directory. Windows people don't want to enter passwords or manually configure their hardware in order to do dangerous things, like overclock their graphics card or turn their firewall off. You can't change those people. Not without a big stick.

The way to get Linux onto a desktop is not to perfectly emulate every undocumented Windows bug and quirk when connecting to an Active Directory server for login so that some poor sap can run Outlook the way he likes, but to build a Linux equivalent that has clear advantages - faster, smaller, easier to manage, more transparent, easily portable, easily extendible and which can do stuff not seen elsewhere. The people who are more likely to make decisions based on those criteria? Large organisations. Who use networks. Which are run by a poor sysadmin somewhere who "knows" Windows but only "plays" with Linux. They don't care that Linux can detect and use 99% of all PC hardware - they care that it takes an hour to set up a new type of PC to the way they want it to be, rather than a five-minute copy of a well-known model's hard drive.

Imagine a Linux distribution. You install it in a "server" mode via a menu. Then you install it on a client machine via the same menu. At no time did you have to install drivers for monitors or some such rubbish. You don't HAVE to license it. You don't HAVE to spend days setting up the user group structure and policies to a safe default. Yeah, there'll be parts of the machine that won't work without proper drivers, but that's not important. Really. These sorts of places SPECIFY the machines. They say what hardware it will or will not come with, down to the individual components. Compatibility with some cheap winmodem is not their problem - they buy a different modem, especially if it affects their security or their technicians' free time.

Anyway, you've started a client and server from barebones. Then imagine that you have automatic, cross-network authentication to that server, client logon, desktop settings and "policies", which allow the network administrator to change every single setting and restriction on the clients in almost every program via one directory interface. Imagine it works just as well over wireless, VPN, a Bluetooth interface or a USB cable. Just as automatic. Just as simple. Just as fast.

You can throw software across the network by just clicking on a machine in a tree-diagram on the server and deploying a package (so it'll be an RPM, not an MSI, but who cares?). Managing a thousand users on a hundred workstations becomes a cinch. And as a bonus, the machines automatically share work between them when they are idle. They automatically discover each other (with appropriate administrator control) and use each other's storage as a mass-RAID for the network data, including encryption to stop people looking at other people's work. It does it all without needing a million ports open. It does it all without major security problems. It works just as well from outside the network, when one of your staff takes a client laptop home - they plug it into their broadband, maybe they have to click an option to connect remotely instead of locally, and bam! - it's just like they are at the office.

Now imagine that you can do all that on lower-end machines than Windows could. And you can do more, stuff that just isn't possible on Windows. You can plug four graphics cards into each PC, four USB mice and four USB keyboards and now four people can use the one machine without even knowing. And their CPU power is being shared across the network, with all the other four-man machines, maybe even with the server itself doing some CPU work on their behalf when it's not busy with other things. And you wouldn't even notice that was what was going on. We're *not* talking thin-client - but you can do that if you want, too. You just tick the "thin-client" option when you install the client and the system does the rest for you.

Now imagine that not only does it do all that, but you can also trust the server to back up those clients too, whether they are working locally or remotely. The server remembers individual machines and any quirks you've had to apply (that binary modem driver, that setting on boot that prevents the ACPI problems, etc.) and when you rebuild them you can re-include those quirks too. Saving data to the network is transparent, and not only does the server RAID all its data, but it shares it out with the network. Server blows up? No probs. Stick the CD into any machine, create a server, maybe supply it with the old server's private key and bam - all the data feeds back from the clients to the server and the network rebuilds itself.

Well... the problem is that most of that stuff exists in one form or another. Certainly everything listed above is perfectly "do-able" and there's at least a few bits of software for every single component of that sort of system. They're not all tied into one distribution (that I know of) but they are there. The most "together" distributions are the paid-for ones, Red Hat etc. But there is nothing there that isn't possible; it might take a few months' work, and you could probably do it all with nothing more than existing downloads, kernel patches and a bit of bash-glue. But it's not around. You can't actually get it. And most Windows admins won't even try it while it involves a lot of messing about. Have you seen some of the HOWTOs? Have you seen the number of steps needed to get Kerberos, LDAP, exotic filesystems, remote control, VPNs, etc. all working your way? Windows is no easier, either, so you're left in the "what's in it for me" valley.

What's needed is not more and more replication of existing features but new and exotic uses. What's the most interesting part of Google? Google Labs. What's the thing that people ALWAYS buy an OS for? The new, interesting features. Yes, when Samba can perfectly manage every aspect of AD integration, it'll be sought-after. But people scrambled to Vista "because". There wasn't anything complicated in it, there was little groundbreaking stuff, and popular opinion now says that Vista is more of a pain in the backside to run for the average user than previous versions. But it was bought because it "could". It could do "new stuff" that Windows people hadn't seen before. Remember Windows 98SE, which could "do USB"?

People are already talking about the next version of Windows Server because of what it can do. Not about how well it does it. Not even about how easy it is to do, that's normally left until review copies appear in the hands of magazine reporters, but about what's new. And, stupidly, not even about what it doesn't do any more. The fact that every single version of Windows Server has had a hundred features announced that have never appeared is overlooked. The hype surrounding it by the time it comes out MAKES people want it. Vista was supposed to include database-style filesystems, a download manager, filesystem "filters" (Libraries), Palladium "trusted systems", integrated anti-spyware, UEFI support, PC-to-PC Sync, a full XPS implementation, to have a better UI, to perform better than Windows XP, and that's before you even get into all the capabilities that they physically removed from the OS that were there before in XP.

And the fact is that Windows Vista was just a small upgrade. If it had had ALL of those things, it could possibly be the best OS in the world. And Linux CAN have the majority, if not all, of those things. Most of them even exist for Linux right now. We just aren't using them.

People who "push" developers to make a Linux-Windows just don't get it - Linux is already in front in terms of features and technical details. We all know that. It wipes the floor with its "main" competitor (although, to be fair, so do a lot of other operating systems). It's not that we're not "there". We are. Something else is holding Linux back. Firstly, ease of use. That's usually a big trade-off, not only with compatibility and security but also with system performance. However, Linux has power to spare. And then it's just a matter of making things work without a million settings. I'm a big fan of command lines and text-based configuration files - there is no reason to lose them. But they don't have to have vi as their only interface.

The main thing it's missing, however, is a short, simple, easy demonstration of powers that Vista and even future versions of Windows either can't or don't have. It needs a show-distro to turn up, either from the depths of one of the established ones, or out of the blue. It doesn't need that distro to say "Look at me, I'm just like Windows, only slightly better"; it needs to say "Why on Earth would you bother to look at an OS that can't do X, Y and Z?", where X, Y and Z are things that either have never been done before, or have always been "promised" or "desired" and never materialised. And I don't mean a flashy GUI. The nearest we ever got to that sort of hype was the Kororaa Xgl Live CD, and look at what it did - very little of any practical use. But it was NEW. It was even NEWER than what Windows could do at the time. So it got a lot of press.

Being able to access an AD domain isn't something new. It's not impressive to people. It's not even that innovative - there's a major OS that does it automatically and (fairly) reliably. What's needed is to play to Linux's strengths - flexibility, malleability, speed of development, freeform and accessible APIs. That means coding quickly, easily, without barriers and restrictions and expensive SDKs. Just get in there and write stuff. In half the time it's taken Windows to get where it is, Linux has replicated and/or surpassed every aspect of Windows. Now it needs to overtake it - and you can't do that by blindly copying features from Windows, or even from other OSes.

Now, the article doesn't push Linux anywhere near as hard as the comments on LWN do. To them I say: just because Windows does something doesn't mean that Linux should follow suit. If that were the case, Linux would BE Windows. I don't WANT my Linux desktop to have a built-in GUI firewall that's difficult to configure the way I want. I don't WANT automatic update dialogs that are a pain to turn off. I don't WANT something to automatically detect all wireless networks the second it sees a wireless card.

On the software front, what would be the point of "getting" Exchange, Adobe, Office as Linux-native versions or equivalents? By doing that, you would have to integrate a significant portion of Windows infrastructure, including Active Directory and DirectX. So what you've done is make a "free" version of Windows. Whoopee. Everyone who's currently using Linux is using it NOW, while it's not a version of Windows... why? Because it's BETTER. It isn't bound by some stupid corporate decision or two decades of backward-compatibility quirks.

Take a look at some edited highlights of Vista SP1:

Performance improvements
New security APIs
A new version of Windows Installer, version 4.1.
Users will be able to change the default desktop search program
Support for the exFAT file system (FAT extended for USB sticks, basically)
Support for 802.11n.
Terminal Services can connect to an existing session.
IPv6 over VPN connections.
Support for booting using Extensible Firmware Interface on x64 systems.
An update to Direct3D, 10.1.
Support for the Secure Socket Tunneling Protocol.

What's there that Linux won't have by the time it comes out, if it hasn't got it already? What's there that Linux couldn't do? Nothing. And to be honest, as a changelog for a major upgrade to even a stable release of an OS, that's pretty pathetic. What about Server 2008? It's all pretty much the same. There's nothing in there that Linux doesn't already do, or couldn't do with a year or so's work.

Let's stop faffing about asking Windows users what they think they need from a Linux machine. Let's SHOW them. Let's just get stuff done and forget emulating Windows. We all know that Windows has its death coming to it. The longer we give it credibility by attempting to copy everything it does, the more time we waste away from the interesting stuff, the stuff that will have people hooked. We have SELinux, we have file-server compatibility, we have directory management software, we have all of this - but nobody cares. We need to show stuff that Windows can't do.

We need a five-machine network that can outperform the best Windows servers and individual desktops, when both are running 20 simultaneous clients (as in four people ACTUALLY WORKING on each of the five machines, locally). We need filesystems that "heal" (and not like self-healing NTFS in Server 2008, which is basically thread-safe Scandisk), and network filesystems that can let Google do its job without worrying, and with which small companies no longer need to worry about tape backup (although, obviously, they still could) - a cost which adds 50% to the price of any server.

We need perfect, logical, simple directory systems that can do stuff that Windows AD can't even dream of, in an easily editable/recoverable/backup-able format - it doesn't matter if it's Fedora Directory Server or Apache Directory Server - no-one cares. We need it all to run automatically but securely. We need automatic secure communications across a network to pick up new machines and integrate them directly into the Directory. We need systems that (with proper admin control over the process) auto-patch underneath systems that are still running. We need one-click Setup, Trusts, Backups and merges of entire Domains.

We need client systems that can repair themselves from known-good images (which, hell, should be stored in the cross-network filesystem) while they are still working - no, we don't acquire viruses, but you still need to Ghost stuff back sometimes. We need machines that detect faulty hardware and compensate automatically - memory just failed in the server? Fine. Isolate the memory areas responsible (BadRAM), alert the admin, allow them to work around the problem temporarily until they can get a replacement, restart and check all services, and then carry on like nothing happened. And all the while, you spotted it where Windows would have just crashed.

We need systems that can tolerate as much failure as possible. Primary hard drive or RAID array failed? Warn the admin, carry on as normal, read what you need off the network. Network failed? Route around it - over a USB connection if the admin only has a USB cable left, or FireWire, or wireless, or Bluetooth, or Serial, or Parallel, or... We need a real, "intelligent" help system. When it sees an admin hunting through menus looking at DNS settings, it tries to (unobtrusively) help. It brings up a checklist and works through things one at a time by itself, until it says to the admin "The DNS server is fine. But you forgot to point that client machine at it." or "The DNS server doesn't have a reverse-DNS entry for that IP - that's why what you're trying isn't working".

We need systems that collectively monitor, detect and shut down other rogue systems within their sight - a kind of distributed IDS built into the system. We need systems that do all this 100% securely, with full certificate chains and verification, and that let the admin control exactly what's going on if he wants. And when someone breaks that particular method of encryption? Ah, just choose one of the thousand-and-one encryption methods and do a one-time re-encryption to change every server, client and piece of software over. Well, yes, do pick up local Windows systems and tie into them as much as you can, but forget making that a priority. Set NEW standards. Make people say "I absolutely NEED a system that can do that." Let the other OS manufacturers play catch-up for a change.

Let's stop playing catch-up. We already won that one, there's no competition there any more, there's no more fun to be had. Let's start wiping the floor. Let's get JUST ONE feature in that people decide they absolutely NEED. And let's do it before Windows can even get a sniff. Let's do it so that, when the time comes for Microsoft to replicate it, they want to be able to read OUR code in order to get it done well enough. Let's stop playing about asking 90-year-old grannies why they don't like Linux when they know nothing BUT Windows... their answer will always be some variant of "It's not like Windows".... either that or "That penguin is scary". Let's make the people that are really scared of the Penguin be Microsoft and Apple. Because, at last and for once, they can't keep up with Tux.

Friday, July 20, 2007

Essential Linux Utilities

Ever since setting up three Linux PCs in a row, I've realised that I've grown dependent on a few pieces of software for Linux, above and beyond what comes with a standard distro (or, at least, Slackware).

Beep - a tiny util that can beep the PC speaker in a variety of ways, perfect for headless systems. I use it to give warning tones inside boot scripts and also to provide a rising or falling tone at the start or end of certain tasks, such as booting or shutting down. Because it uses the PC speaker, it doesn't interfere with ALSA, works on even the oldest of PCs, doesn't necessarily require an external set of speakers, etc. Beware using it, however, on multi-user installations - I tend to keep it restricted to the audio group of users only, to stop people messing about with it.
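For instance, a rising tone at the end of boot and a falling one at shutdown look like this (-f is frequency in Hz, -l is length in milliseconds, -n starts a new note; the pitches are just my preference):

    beep -f 440 -l 120 -n -f 660 -l 120 -n -f 880 -l 200   # rising: booted
    beep -f 880 -l 120 -n -f 660 -l 120 -n -f 440 -l 200   # falling: going down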

Ether-wake (available from various places, originally by Donald Becker) - the ultimate power-saving util... this is a Wake-on-LAN packet broadcaster to wake up computers that support WoL from their deep sleep (i.e. turn them on, so long as they are plugged into the net and have a power cable in them). With this I keep my home network largely turned off and "wake up" (i.e. turn on) particular PCs as and when I need them. And larger-scale experiments have shown that there's nothing better than the sound of a room full of PCs all booting up simultaneously at the click of a single button / cron job.
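Usage is a one-liner - give it the MAC address of the sleeping machine's network card (the address below is made up), and a cron entry gives you the room-full-of-PCs trick:

    # wake one machine now (run as root; -i picks the interface)
    ether-wake -i eth0 00:11:22:33:44:55

    # root crontab entry: at 8am on weekdays, wake every machine listed
    # in /etc/wol-macs (a made-up file, one MAC address per line)
    0 8 * * 1-5  for mac in $(cat /etc/wol-macs); do /usr/sbin/ether-wake $mac; done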

HTop - a better version of "top" that I find easier to use. Shows processes and RAM usage in a nice controllable text-mode GUI that allows you to kill individual processes, scroll up and down etc.

rc.firewall (See this post for a mirror) - a perfect, simple, one-file iptables firewall that works well as rc.firewall in Slackware. Works for single computers, NAT'ing routers, multiple network cards, multiple-networks-on-a-single-card, and lots of other configurations. It uses a simple syntax for even multi-port port-forwards, has many simple options for various things, such as allowing or denying pings or cross-network traffic, has a very strong default configuration and can be reloaded at the drop of a hat, at which point all the detected network interfaces are re-firewalled.
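I won't try to reproduce the script's own option syntax from memory, but underneath, rules of this kind are what any such firewall boils down to - default-deny inbound, NAT for the internal network, and a port-forward (addresses and ports are examples):

    iptables -P INPUT DROP
    iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 8080 \
        -j DNAT --to-destination 192.168.0.10:80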

x11vnc - This is one of those utilities that few people ever use. It's a VNC server for X. But it has a vital difference... it's a VNC server for EXISTING X sessions. Most people are familiar with Xvnc, which allows you to spawn an entire X Window System where each "screen" is actually a VNC session (thereby providing instant VNC thin-clients), but that's not much use to someone with a single-user Linux PC who wants to log onto their home PC and click on that link that they left showing in their browser. x11vnc does just that - the command lines get horrid very quickly, and you have to pay close attention to the security of the thing (because now connecting to the PC on port 5900 is the equivalent of logging in as yourself on the local PC!), but it's a great piece of software. The author is also working hard to make VNC-wrapped-in-SSH a cinch, even from Windows PCs, by extending the TightVNC clients to incorporate SSL tunnelling. Yeah, you can now do this with some things like KDE's Remote Desktop functionality, but I've been using this particular utility for so long that I have scripts which build on to it, and it also has some features that just aren't present in other imitators.
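A typical invocation, with the security caveat taken seriously - password-protected (the password file comes from x11vnc -storepasswd), bound to localhost only and reached over an SSH tunnel (the hostname is a placeholder):

    # on the home PC: share the existing display :0
    x11vnc -display :0 -rfbauth ~/.vnc/passwd -localhost -forever

    # from elsewhere: tunnel port 5900 over SSH, then point a viewer at it
    ssh -L 5900:localhost:5900 me@myhomepc.example.com
    vncviewer localhost:0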

knockd - a simple port-knocking daemon implementation which can be triggered remotely using either a tiny utility that works on Linux/Unix/Windows or simpler tools such as telnet. Perfect for securing a server for remote access (and incidentally the best way to stop random port probes to your machine - my SSH logs were filling up until I found this), as you can just put the portknock client on a USB disk or a website and download it from wherever you happen to be, or you can even "bodge" one in a real emergency. Also, the configuration basically consists of port sequences and the names of scripts to run. This means that it's easy to configure it to see port-hits on ports X,Y,Z as an instruction to run an "open" script, and then you can hit ports Z,Y,X to run a "close" script. And because you can have multiple port sequences running, it's very easy to have all sorts of different things happening. See my article here for a bit more background on my use of this utility.
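A knockd.conf along these lines gives you exactly that open/close pair (the ports and timeout are arbitrary; %IP% is knockd's placeholder for the knocker's address):

    [openSSH]
        sequence    = 7000,8000,9000
        seq_timeout = 5
        tcpflags    = syn
        command     = /sbin/iptables -I INPUT -s %IP% -p tcp --dport 22 -j ACCEPT

    [closeSSH]
        sequence    = 9000,8000,7000
        seq_timeout = 5
        tcpflags    = syn
        command     = /sbin/iptables -D INPUT -s %IP% -p tcp --dport 22 -j ACCEPT

And in a real emergency, telnet makes a perfectly good knocker: telnet yourhost 7000, then 8000, then 9000.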

Tuesday, July 10, 2007

Mirror of Projectfiles.com / lfw.sourceforge.net rc.firewall

Having just completed a set of instructions for a group of Linux newbies on how to set up a firewall, I then discovered that my favourite Linux iptables firewall script has all but gone from the Internet. I checked Google, both "official" websites (including the Freshmeat.net mirror) and archive.org. Still no joy. Luckily I had kept a copy of this GPL script, which I have mirrored.

For those people who have had trouble finding the script that's been hosted at both ProjectFiles.com and http://lfw.sourceforge.net you can download the rc.firewall script at the following address:

http://www.ledow.org.uk/linux/

This is the 2.0 "final" version. I have the documentation mirrored too. Oh, and I assume that the reason that the archive.org site has no mirror is that the author wants no more to do with it. So be polite if you do need to contact them (the above file has their email address etc.) and don't bother me for support, either! (You probably couldn't afford me!).

Tuesday, May 15, 2007

The key to learning (and teaching) Programming

The subject of how people program computers has come up many a time in my life. A lot of the time it's seen as a mystical art, and some lay-people even confuse "programming" with "operating a computer", or even just "changing some settings".

The question always comes up about how to "learn programming". Besides the obvious problem with the wording of that phrase, it seems that people still haven't found an effective way to teach children how to program computers properly... and more and more teachers are skipping every bit of the subject that they can. My interest was caught by this subject again recently by a story on BBC News and by a subsequent appearance of the same story on Slashdot concerning a graphical programming language specifically aimed at younger children.

Programming is quite a strange discipline. It is the laying down of logical, unbreakable rules and actions to arrive at a predictable and reliable answer. However, it also requires, more often than not, ingenuity and creativity in order to achieve those aims. In some respects, it is similar to the cutting-edge of mathematical research (and, in fact, mathematics is probably the most complementary subject to computer programming, with the possible exception of electronic circuit design).

The problem with teaching programming is not that children cannot be brought to understand the subject, or even that the tools aren't available for them... most children of an age where they can comprehend what programming is are capable of actually programming. The problem is, in fact, the established complexity of the systems that they use each day and the over-simplification of their given tools. Additionally, the fast change of consensus on a "suitable" language stifles the teaching of programming, when in fact every programming language proposed has problems whether they be from an over-simplification of programming or a steep learning curve.

Also, the program which dumbs down programming to the point where children can "understand it" is probably one of the more complex programs to write. The abstraction required to allow a child to just drag-and-drop items into a graphical interface and thereby create a set of instructions for the computer to follow relies on decades of other people's code in order to display itself, allow the child to manipulate the mouse, and do so in a high-performance manner on a modern machine.

This is, in some way, similar to mathematics again... we all get told that it's pi-r-squared but finding out WHY takes more advanced mathematics to discover. Unfortunately, the modern consensus is that we have to assume that "it just is" for several years before we can get to the stage where we can work out exactly why. This doesn't present a problem in teaching, however, as almost all teaching comes from the concept of "good enough for now" or "lies for kids".

Teachers are also drawn into the same trap that stops children from being able to grasp programming: you can teach children for-loops and variables to your heart's content... the fact is that, without a complete understanding of some quite complex mathematical principles, they will not be able to effectively link those concepts with the word-processor that loads in a fraction of a second with a thousand different features, or their favourite 3D game that simulates realistic physics. Most teachers, if pressed, will happily explain that "it all gets turned into 1's and 0's" (thereby introducing even more questions about how 1's and 0's can look like Mario running across a platform in time to their controller movements), but few can explain to a child's satisfaction just how it does that.

And when the children attempt to reproduce that 3D game in BASIC or Logo, they will be sadly disappointed that the best they can manage is a few hundred lines of code to make some circles appear at random locations on the screen (but it would only be pseudo-random, which is based on polynomial iteration, which requires advanced mathematics... and the circles would be only grid-based representations of the formula for the graph of a circle, which requires trigonometry to grasp completely... the traps are always there). Yes, there are such things as Dark BASIC, but again we're just abstracting away absolutely everything into a black box that "just works", rather than letting the children find out for themselves in due course.

That's not to say that there isn't some effort being put into teaching children programming, or even that that effort is failing. There are some very effective tools to demonstrate programming in a context where even the youngest child can interact and play with a system that they can understand. The best example would probably be the Lego Mindstorms kits - some building blocks and motors can be turned into anything once a child has learned how to make the control blocks do what he wants them to do. A common first foray into the programming world for many primary school children today is a Lego Mindstorms kit that they first build and then program to achieve an aim - whether it be controlling the traffic lights at a busy "junction" or raising and lowering a gate on a railway crossing.

Unfortunately, some of the Lego software is abysmal to understand on first glance, even for a seasoned programmer. Clunky interfaces, unintuitive icons and "settings" and an extremely limited instruction set, all of which are supposed to help the child understand.

When all you have to work with is "Output X on", "Output X off", "Wait X seconds" and "Wait for input X to reach Y", arranged in blocks where the syntax is horribly restrictive and forces you (from a logical and interface point of view) to do things such as "wait for input NONE to reach NONE" on dozens of instructions - where it would just be more intuitive to introduce some concept of our old friend IF instead - programming starts to lose its fun.

Rather than forcing the logical equivalent of null IF statements onto every line of code, a much better idea would be to merely teach the children how to program the old fashioned way - groups of english-like instructions, executed strictly in order. Unfortunately, the languages of today all have syntaxes which require pixel-perfect placement of semicolons, parentheses and quotation marks in order to write even a simple Hello World program. And although there are ways and means around that (such as seriously relaxing the syntax), people seem to have the idea that having to use WORDS is somehow dirty, when it comes to programming.

As a youngster, I was only ever officially taught (in this order) Logo, BBC BASIC and Java. Bearing in mind that the first was in infant school (early 80's) and the last on my degree course (early 00's), that's a huge and slow gap to a programming language of practical use (i.e. one that I would have available at home, one that I could make useful programs in that I could use each day, one with which I would be able to swap, create and share programs with other people). It was only by my own experimentation that I was able to learn other languages myself, and I was programming competently in the last two of those languages before my teachers even mentioned them. For my A-levels, I actually taught my own classmates for several lessons, because the teacher believed I could do a better job than he could, having been programming in the course language longer than he had been programming at all.

Logo provided graphical (or physical) responses, but all code was textual. BBC BASIC *could* be graphical but was mainly textual, and all code was textual. Java, again, *could* be graphical but all code was, again, textual. However, a lot of modern-concept educational programming revolves around getting away from actual words (the Lego Mindstorms "language" being a perfect example). Textual code is not something to be scared of. In fact, in other subjects, we treat textual lists of instructions as vital. And isn't that, when you get down to it, all a computer program is? We need to bring back the old-fashioned text languages, without the historical baggage of text-mode editors, strange delimiters and line terminators.

To clarify, I'm not seriously suggesting that learning C++ at a young age would have helped me or any child. That's a ludicrous assumption, similar to saying that being taught quantum mechanics as a child would help me understand Newtonian physics - of course it would, but the complexities of teaching at (and learning at) that sort of level would far outweigh the benefits of such an education. My own teachers struggled with every language I had to be taught, until I hit university. There's no way on Earth that we could expect teachers to teach at the levels required in order to allow that to happen, at least not in the foreseeable future.

What's needed is not a *primarily* graphical system. What's needed is a way to easily construct lists of plain-English instructions without needing to worry about perfect spelling, excellent grammar (the absence of which is, in programming, a vital learning experience) and lots of typing. Drag-and-drop keywords are a good start.

The language itself, however, needs to be less strict or, at a minimum, not allow beginners to make most of the classic mistakes. You can't misspell a dragged-and-dropped keyword, and with proper graphical interfaces you are able to show scope, loops etc. effectively, without having to worry how many invisible tabs you have inserted on the left of the screen.

Data-types, although important, do not need to be set in stone. Children SHOULD learn the difference between a string and an integer variable as it is of critical importance at each stage of programming. Visual Basic's "Variant" does an excellent job of masquerading as multiple types simultaneously but introduces new problems in itself because certain distinctions are not made. However, a simple indicator of Constant vs Variable, String vs Number, should always be included.

The programmer should also be responsible for creating the variables that they are going to use. Variable scope is vital knowledge. If this means clicking a new button and selecting a type, be it "Number" or "Animated 3D Penguin Character", the kids should be the ones who create him, the ones that drag him in each time he's needed and the ones who (critically) should remove him from the project if he's no longer necessary. This means not only in whatever GUI the programming language is set in but into the actual program itself. They should insert the Animated 3D Penguin not only into the project but also into the correct parts of the program to "create" and "destroy" him at the right times.

[[ On a side note: They should also be able to name him whatever they like first... they are kids after all. In fact, he shouldn't be able to be created without being given an assigned name by the programmer. No "Animated 3D Penguin Character #1" etc. ]]

Ideally, his birth and death should be graphical each time the program is run. This will teach correct construction/destruction techniques (because Pingu just popping into the screen half-way through the game isn't good but creating him at the start and just keeping him off-screen until needed is a good way to learn - similarly this will also show the brighter kids that you don't need to create everything at the start and slow the whole system down, if you manage the entrance and exit of data effectively).

Abstraction of complex functions, e.g. 3D libraries etc. also help the programming become interesting much more quickly (I can remember the wow-factor of my first coloured circle on a black background but children today would hardly be impressed with that). However, many of these can, in fact, be expressed in the language itself (albeit complexly).

Most importantly, a language should be complete and self-hosting... that is, each part of it should be capable of being written in itself. The beginners will wonder how this is possible, but this is true for almost all programming languages. Except, strangely, the educational ones. By hiding the complexity within the language's own implementation, they strip the hard work away into black boxes.

For instance, the "Create a new penguin called Bob" instruction should not be the end of the matter. You should be able to drill down. That instruction should have the capability to be broken into several hundred parts, each of more complex and primitive instructions, but all in the same "language". The hidden properties of him should be there somewhere... accessible to the more advanced children who want to do something that the basic levels of the languages don't allow.

A multi-tiered GUI is the perfect way of doing this. On startup, you have your "Penguin Bob" buttons, but if you "break" them, they split into the code that creates Penguin Bob, in a simple-enough set of further instructions. And you can break them multiple times, into ever-more-complex code, until you hit some equivalent of assembly language... most probably some sort of sandboxed, interpreted code similar to Java. Why? Because this is how ALL code works... the BASIC interpreter is probably written in C. The C code would be executed as machine code. Everything is based on a more complex underlying level. And as an extra added bonus, this means that the language that kids learn in Year 5 to make Penguin Bob run across the screen and giggle is still the same language when they hit Year 11, 12 or 13 and have to submit an independently produced program of a highly complex nature (for instance, their own word-processor).

It doesn't have to be fast... in fact it SHOULDN'T be, for the basic levels. This teaches the users quickly that unnecessarily looping ten thousand times over a hundred lines of code is bad practice. And when they grow up and are able to break the code down into its components, they will not only see WHY but be able to generate much more efficient code that is functionally equivalent.

With widespread use, it would also encourage the greatest principle in programming that has been present since the first line of BASIC hit our eyes... collaboration. Libraries of code. Open source. Those BASIC listings in ancient computer magazines. All of them were about collaboration and sharing of code. Imagine if your GCSE project in Year 11 was to create a documented, stable, fast set of replacement functions that the kids in Year 7 would use to make Penguin Bob do a somersault.

There also needs to be real, physical feedback. A coloured square on a screen means nothing anymore - especially when, from a child's perspective, it takes £2000 of computer to do it. Logo (and some of its later clones, such as RoamerWorld) knew this... many a British child grew up by slamming a glass-domed turtle into the teacher's ankles. Lego Mindstorms spotted this. To incite enthusiasm for programming, there needs to be a response... not just from a computer screen. Things such as PIC chips and simple robotic circuits make this a breeze.

And, with something along the lines of RoboWar, where students could compete in a school-wide championship to generate the "best" robot by programming him, you could involve the whole school. In one fell swoop, you could have programming on multiple levels (from all levels of AI to sandboxing), electronics, control, communication and design (by applying the "Robot Wars" principle to a full-scale, physical version of each robot), competition and publicity.

Each educational language seems to have one or two brilliant ideas and then to fall flat on its face in all other aspects. Considering that computers are ever more complex, ever more prevalent and are not going anywhere in the foreseeable future, our children should have an understanding of exactly the principles that underlie them.

They will learn WHY computers cannot be trusted to run the world without an awful lot of testing (because of the absence of the "intelligent" computers so often seen in Hollywood), and they will not have to rely on multinational companies just to write a simple letter. ICT is already compulsory in many aspects of almost every school lesson, and it creeps into our lives more and more each day. Unfortunately, nobody is being educated in how to make all this technology do our bidding; instead we rely on a few good people to do everything for us.

Saturday, April 14, 2007

Retro-gaming value for money

Take one old PC (considered obsolete by its previous owner and consigned to the scrapheap), add a lashing of a spare operating system (Linux is fine but, in all honesty, Windows is easier for this project). Mix in some random game controllers chosen by lucky dip into your "parts" box (anything from an expensive Playstation joypad with a USB adaptor through to an old two-button gameport joystick is fine). Add a dash of emulators for older games systems that you knew and loved. Sprinkle with a hefty swig of configuration, plugging in and cursing. Decorate with a lashing of ROM downloads of games that you still currently own but can't be bothered to play because the setting up of the games console that runs them takes longer than completing the game.

Finally, download lots of popular freeware games, give them a try-out, then have a retro/freeware-games party with anyone that has a favourite old game they haven't played in years. A TV-out capable card will help matters immensely, allowing you to play them on the telly in the front room, just like the old days. If not, a nice monitor will do just fine. Or for those lucky enough to have a projector, this would be the best way to keep a room full of big-kids happy.

Seriously. I've just done the above with my own ingredients... a 1GHz PC with 512MB RAM (although you could probably get away with even less). I slapped a spare copy of Windows XP Home on it (I always end up with a little collection of Windows XP licenses from the various computers that I repair or, if they are beyond repair, recycle). I "upgraded" the onboard graphics to an ancient and obsolete AGP 4x card because I wanted TV-Out (the less said about that the better - old TVs that can't display the refresh rates you want without flickering can ruin hours of careful planning and playing with settings).

I threw in a spare 4-port USB card and as many reliable games controllers as I could find. A cheap cable to connect the sound through the television, a few extender cables for the various controllers (because in the heat of a gaming party someone WILL tug on them) and a cordless keyboard/mouse set to control the system from the living room sofa. A little utility called Joy2Key also came to the rescue, especially if you do what I did - I made a tiny Visual Basic util to turn the machine into a primitive "who pressed first" quiz buzzer - more on that later.

The emulators available to such an "obsolete" machine cover all sorts of systems (but only the ones that I ever played on or still have... I have somewhere a CD with Commodore 64 emulators and ROMs, but I don't think I've ever even loaded it, because I never had that particular piece of gaming history, so I have no interest in seeing the games it used to play). Currently, my particular setup has (in order of age) ZX Spectrum, Sega Master System, Nintendo Gameboy, Sega GameGear, Philips CD-i, Sega Megadrive, Super Nintendo and Nintendo 64 emulators, all of which have several popular games (everything from the previously mentioned JetPac through to Super Mario 64) loaded and ready to play, full screen, full speed, on this old relic of a PC. Anything of the Nintendo 64 era or earlier is fully playable on such a machine, given the right emulators, and more powerful machines will laugh at them.

It even has DOSBox running several old DOS games that I've only just consigned to the "cupboard of slow, ageing death", where games live that are no longer easy enough to play on modern systems but for which I still have fond memories (Syndicate, Cannon Fodder, Command & Conquer, Settlers etc.)
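Setting one of them up is just a mount and a run at the DOSBox prompt ("game" stands for whatever the game's executable happens to be called, and the path is from my own setup):

    mount c /home/me/dosgames
    c:
    cd syndicate
    game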

And it plays them all fantastically. The foresight of using my best two games controllers (a USB PC joypad made to duplicate a Playstation Analog Dual Shock and an actual Dual Shock with a Playstation -> USB convertor) even allows precise analog control of Mario in Mario 64, or while driving in Mario Kart. And with a press of a button and a change of emulator, it's suddenly a Kempston joystick in a ZX Spectrum.

It's been great to play all the classics. JetPac just isn't JetPac unless it's played on a TV and is emulated so accurately that all its little palette flickers happen just right. And Target Renegade is still a fantastic game for picking up for only a few minutes for a quick bash. Super Mario 3's Battle Game (the version from the SNES remake) is a great party game.

So every game I've ever owned on a games console is now sitting on one easy-to-manage, quick-to-boot, no-fiddling-with-TV-tuning PC with gamepads enough for a four player battle in any emulated system that supported it. When I originally had this system set up, it was on an equivalent laptop and that was even more fun... there's nothing better than being able to whip out your laptop at someone's house and launch into four-player frenzied gaming of some of the classics.

And I also loaded this PC up with some other games that I had lying around or had downloaded previously...

- Slicks 'n' Slides
An old DOS racing game that I was a registered user of, way back when. With a utility like Joy2Key, you can play four-player with ease (beats having to cram four of us around a single keyboard like we did when we used to play this in Maths lessons in school!).

- XQuest 2
A marvellous freeware game that requires precision control of the mouse.

- SCUMMVM
An emulator for various point-and-click adventure games, which comes with Beneath A Steel Sky as a freeware game.

- OpenTTD
A remake of the classic Transport Tycoon Deluxe.

- Abuse-SDL
The dark and fantastic Abuse from Crack dot Com.

- Rocks 'n' Diamonds
Another remake, this time of Rockman.

- PySolitaire
A card-game to wipe the floor with all the rest (but you can still add Solitaire and Hearts to your PC "console" if you want) - hundreds of customisable card games in one program.

- Liquid Wars
A brilliantly original game, part strategy, part fast reactions.

- Battle Painters
A wonderful game to leave the kiddies with - each player has to paint as much of the screen as possible before the timer runs out. I use Joy2Key on this as well, for the same reason as Slicks 'n' Slides, though this is a Windows game.

- Super Mario War
The best Christmas present I ever gave my mum - she is a massive Mario Battle Game fan and this gave us a night full of entertainment. Mario-themed but the number of games alone is enough to keep the most ardent Mario-hater busy. And such four-player mayhem is incomparable.

Not to mention TuxRacer, Pingus, Gate 88 and all those other great freeware or open-source games.

It's the best games console I've ever had. I even ended up naming the PC "CONSOLE" in its configuration. It doesn't have a network connection or anything else set up - it was worth sacrificing an old PC for and, as far as I'm concerned, it's the best use of a Windows XP licence ever.

I turn it on. Within about 30 seconds it's in Windows (no net = no need for lots of utilities and services that slow it down) and ready to play. That 30 seconds would be taken up by switching to the right SCART socket and untangling the joypads anyway, so you don't even notice it. Then, once you're in, you can play some old ZX Spectrum game while you wait for your loved one to settle down with a controller they don't mind using; then you can both have a blast on Mario Kart 64 or play some silly freeware game without having to switch machines. When she goes to bed, you can load up a bit of Syndicate or finish off that Red Alert campaign.

You can have "Mario nights"... Start with the original NES versions, then do the Gameboy ones, working your way up to Mario 64 and filling in the time with some Battle Games, a quick blast of Mario Kart or a four-way battle in Super Mario War. And when anyone comes over and says "Do you remember ?", you can just load it up and play it.

It's great, and it's the best investment of time I've ever put into a Windows system. I was going to download one of those front-end menu systems they make for multi-emulator setups, but they were all pants, and I found that a desktop folder called Games with a shortcut for each machine type is more than good enough. When you have a lot of games on the system, you'd spend longer setting up a menu system and then navigating it than you would just loading an emulator and typing in the game's name anyway. You can assign shortcut keys to each shortcut and, if you want, use something like Joy2Key to launch particular ones from the joypad!
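(If you ever did want something smarter than a folder of shortcuts, the whole idea is only a few lines of scripting anyway. Here's a rough Python sketch of the "pick the emulator from the ROM's file extension" approach - every path and extension below is invented for illustration, so substitute your own:)

    # launcher.py - hypothetical ROM launcher: pass it a ROM filename and it
    # starts whichever emulator is mapped to that file extension.
    import pathlib
    import subprocess
    import sys

    # extension -> emulator command; all of these paths are made-up examples
    EMULATORS = {
        ".sms": r"C:\Emulators\MasterSystem\emu.exe",
        ".gb":  r"C:\Emulators\Gameboy\emu.exe",
        ".smc": r"C:\Emulators\SNES\emu.exe",
        ".v64": r"C:\Emulators\N64\emu.exe",
    }

    rom = pathlib.Path(sys.argv[1])
    emulator = EMULATORS.get(rom.suffix.lower())
    if emulator is None:
        sys.exit("No emulator configured for '%s' files" % rom.suffix)
    subprocess.call([emulator, str(rom)])

(Drop a shortcut to that on the desktop, "open with" any ROM, and you're one double-click from any machine type.)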

Joy2Key is a bit of a fiddle, but it's only really necessary for comfort with old games or emulators that can't easily reassign keys to joypads. Do it right and the "best" joypad becomes the master controller - the one that can pause, open games, exit emulators etc. with the right button presses. Or you can just delegate the keyboard to someone who knows the shortcut keys for each emulator.
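(For the curious: the entire trick Joy2Key performs is "watch the joypad, fake a keypress". A toy version of the same idea in Python - assuming the pygame and pynput libraries, which the real program obviously doesn't use - looks something like this:)

    # pad2key.py - toy joypad-to-keyboard mapper in the spirit of Joy2Key.
    # Assumes pygame (joystick input) and pynput (synthetic keypresses).
    import pygame
    from pynput.keyboard import Controller

    BUTTON_TO_KEY = {0: "1", 1: "2", 2: "3", 3: "4"}  # pad button -> key (example mapping)

    pygame.init()
    pygame.joystick.init()
    pad = pygame.joystick.Joystick(0)   # the first attached joypad
    pad.init()
    keyboard = Controller()
    held = set()                        # buttons currently held down

    while True:
        pygame.event.pump()             # let pygame refresh the joystick state
        for button, key in BUTTON_TO_KEY.items():
            down = pad.get_button(button)
            if down and button not in held:
                keyboard.press(key)     # button just went down -> key down
                held.add(button)
            elif not down and button in held:
                keyboard.release(key)   # button just came up -> key up
                held.discard(button)
        pygame.time.wait(10)            # ~100 polls a second is plenty

(The real thing adds auto-repeat, per-application profiles and so on, but that's the guts of it.)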

I found the Playstation-style joypads the best to use. Enough buttons for almost any games system (with a few spare for vital functions like Pause), digital and analog controls, cheap, comfortable, easy to get hold of and easy to find adaptors for (in the UK, Game sell them in all their shops). They're also very close to the original SNES controller, and SNES games are the ones I end up playing more than any other type.

The most time-consuming bits were:

- Making it boot fast
Don't install anything that you don't need to (networks etc. included). Clear out all your startup entries. Set all BIOS options etc. to be as quick as possible. Make your desktop as plain as possible and completely empty your system tray. Make sure everything is plugged in each time you use it. You don't care that the system looks ugly, but if it jerks in the middle of a four-player game you'll be regretting putting that large theme on your desktop.

- TV-Out
My own fault really, because I was trying to use what I had to hand. Only when I was several days into getting it to work did I realise that all my problems stemmed from a television that just wouldn't support non-standard refresh rates at all, not even by the slightest degree, but didn't complain about them either. I had an external D-Sub -> SCART converter that just wasn't up to the job of changing refresh rates without MASSIVE jerkiness (single-screen games worked fine, but anything that scrolled horizontally or, worse, vertically looked absolutely crap).

So I tried a video card with TV-Out, which was perfectly smooth at that job but had BIG problems with colour display on my TV. It turns out the old TV just doesn't support colour over an S-Video connector, even when connected via SCART. That wasted HOURS of fiddling, soldering and setting-tweaking, and on any other TV it just worked fine... so I used a different TV. My next step will be to get a nice flatscreen for it. The most important thing is that everyone can see it, so a small flatscreen with a limited viewing angle is a no-no unless you have some kind of dual-head setup and run one screen for each "team". (P.S. A very brief experiment showed conclusively that playing Jetpac on a 10-foot-wide projector image connected over S-Video is the coolest thing ever, but a very expensive way to run a games machine, so it's just not practical.)

- Emulator configuration
Every one had its own idea of which pad was Joypad 1. Every one had a different set of buttons on its emulated controllers. Quite a lot had weird options or programming that made them totally unsuited to the task. Some would quit if, mid-game, you pressed the joypad button that also happened to select the menu one too many times (highly annoying). I had Gameboy emulators that couldn't emulate four-colour Tetris as fast as a Nintendo 64 emulator could emulate Mario 64 on the same machine. I had Megadrive emulators that ALWAYS tore the screen no matter what VSync/Resolution/Buffering/Refresh options I chose. I had a CD-I emulator that would only accept mouse input. I had DOS emulators that needed settings tweaked for each and every game to get the best out of them (and even then at least one needed Joy2Key to make it easily playable). ZX Spectrum emulators offer emulations of so many different types of joystick (some of which conflict) that it's a pain to configure them all. Some I fixed, with some I persevered and did it the hard way, some I ditched and replaced with other emulators, and some I set up with certain limits (i.e. no more than four players, etc.).
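(One time-saver: settle the "which pad is Joypad 1" argument outside the emulators first. A few lines of Python with the pygame library - my assumption here; any joystick-aware library would do - will list the pads in the order the OS reports them:)

    # listpads.py - print every attached joypad in the order the OS reports them
    import pygame

    pygame.init()
    pygame.joystick.init()
    for i in range(pygame.joystick.get_count()):
        pad = pygame.joystick.Joystick(i)
        pad.init()
        print("Pad %d: %s (%d buttons)" % (i, pad.get_name(), pad.get_numbuttons()))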

- Distractions
Every time you set up an emulator, you just HAVE to test the games several times to make sure that they are all right, don't you?

- Programming
Yep... I did a bit of programming. This thing is a party machine - that was obvious from the outset - and, best of all, it's multipurpose. The hardest thing in the world to get hold of is a decent quiz buzzer - one of those "who buzzed first" things. So I wrote one. I misused Joy2Key so that any button on the first joypad "typed" a 1 on the keyboard, the second joypad a 2, and so on, then wrote a small VB program that simply waited for input after the quizmaster had "opened the question" and declared whoever's character arrived first the winner. It probably wasn't perfectly accurate, but it did the job, and it took about twenty lines of BASIC and a freeware Joy2Key to do it.
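(I no longer have the original VB to hand, but the logic was no more complicated than this Python sketch - the player labels are invented, and it assumes Joy2Key is already turning each pad's buttons into the digits 1-4:)

    # buzzer.py - quiz buzzer: the first digit "typed" after the question
    # opens wins. Windows-only, since msvcrt reads raw console keypresses.
    import msvcrt

    PLAYERS = {b"1": "Joypad 1", b"2": "Joypad 2", b"3": "Joypad 3", b"4": "Joypad 4"}

    while True:
        input("Quizmaster: press Enter to open the question...")
        while msvcrt.kbhit():          # throw away any early buzzes
            msvcrt.getch()
        print("Question open!")
        while True:
            key = msvcrt.getch()       # blocks until a "keypress" arrives
            if key in PLAYERS:
                print(PLAYERS[key], "buzzed first!")
                break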

It's a good project to do. It requires limited resources compared to other PC projects (offices and schools are throwing out PCs that can do this; TVs are perfect for playing old games on because you don't need the high resolution of a monitor; any gamepad or joystick will work; etc.). It is immensely fun and can be as simple or as complex as you like. And it brings people together.

And I guarantee you that, if you are old enough to remember playing these sorts of consoles and not just Playstations, you will get ten times more play out of it than out of spending the equivalent money on the newest games console. For the price of a new console and a single game, you can have every game you've ever played, in full better-than-the-original glory.

Wednesday, March 28, 2007

Jetpac

Seeing how I've been looking back at some of my older games recently - having been disillusioned with most modern ones and without a computer to play them on for a while - I was horrified to spot this bit of news:

Take a 16 kilobyte game, of classic nature, with fast, simplified gameplay, clear objectives, simple controls, that used to run on a processor of only a handful of MHz.

"Upgrade" it for modern machines, in the process making the screen hideously difficult to see what's going on, full of fancy unnecessary effects, several of which do nothing more than make the ENTIRE gameplay field obscured from the player's view, add extremely primitive multiplayer into a game never designed to have it and only release it (online) for one of the most powerful modern games consoles.

And in the process remove years of good game-playing memories.

Sunday, January 07, 2007

Linux - Good enough, Easy enough, Supported enough, Common enough.

I was recently sitting on a bench on the platform of a local train station when a large man, who was sitting next to me, started talking to me for no apparent reason. This isn't unusual - I must attract local nutcases - but I worked out who he was when he asked if I was "into computers". Seeing as I had no indications on me that would suggest I worked with computers, I quickly realised that he must work in the school that I had just finished working at.

I was right - he was a maths teacher there - and once I had realised this, we introduced ourselves and got talking. My answer to his question about whether I was "into" computers had been a little understated: you can't move in my house for flashing lights and humming fans, and it's been that way for nearly ten years. It's my hobby, it's my job, and most things that I do tie back to computers in some way.

Anyway, he started to demonstrate his knowledge of computers, a trait I find common among people who discover that I "know" IT. The first words out of his mouth took him above the average Word-letter-writer and placed him in the "geek" bracket: Java (in terms of programming environments, not just "that bit in IE that makes fancy websites"), LaTeX and Linux. For those unfamiliar with it, LaTeX is a typesetting language designed for hard-to-typeset material, most often mathematical formulae. Tying this together with my earlier discovery that he was a mathematics teacher at the school, I quickly realised that a) he actually knew more than average about IT, b) he was researching things to help him work, and c) he was already aware of free and open-source OS's and software.

(Incidentally, he was shocked that I had a degree, and further shocked that it was in Mathematics and Computing - he hadn't expected that of a lowly computer technician. That made me "someone he could talk to about mathematics without having to dumb down".)

This mention of Linux and LaTeX further elevated him to "person that *I* can communicate with on a technical level". The fact that he was using Linux, LaTeX and Kile to do his job - without anyone "making" him do it, without anyone even knowing he was doing it, and without any sort of help from other people - got us talking all the way to our station.

He had a Linux desktop (although his technical knowledge hadn't quite reached the level of compiling his own software, so everything was RPM-based), he had found a Linux-based program that was interesting and useful to his own work, and he was trying his best to get things running as smoothly as possible. He asked if I would mind him popping into the ICT office the next time he was around, to ask some more questions and get enough working that he could demonstrate Kile to the rest of the Maths department in the school. Immediately, I said "of course not". And why would I mind? Not only was he POLITE, not only was he TRYING to do as much as he could understand (the two main criteria that determine whether I go out of my way to help somebody), but he was also trying to use Linux to get stuff done, which isn't the easiest thing for a relative novice.

Similarly, my brother has recently had to help the local Scouts complete their IT proficiency badges. My brother is quite a Linux fan but has never really "run" his own Linux system. He has a router/firewall/storage server/print server/emergency-desktop system that I set up for him some years ago and is capable of using it and managing it himself, so long as there is someone like me or a Google result that helps him find the command he wants, once he's discovered he needs to do something.

The Scout IT activity badge is remarkably well designed, unlike some computer-based achievements for children, in that it makes no mention of ANY particular computer or operating system. Criteria such as "Create a simple website" or "Take part in a video conference" are worded so that, although they may suggest examples of possible software, they are not biased towards any platform on which the activities must take place. Most, if not all, requirements could be completed on something as antique as DOS or CP/M!

This is especially good given the average budget of a small Scout group who would be required to arrange such activities on a fairly regular basis. A few old, obsolete machines loaded with even FreeDOS would be able to complete enough criteria for most of the stages of the IT badge, without unfairly hindering the children required to complete them.

It was with this in mind that my brother, a born educator, wanted to provide a challenge to the more "cocky" Scouts. Those who had been trying to show off that they "know" computers, when in fact all they had ever used was a Windows XP PC that was installed and set up for them, would, on the day, be faced with a Linux KDE desktop. To the experienced, and genuinely talented, children this would be little more than a cosmetic hindrance. To those who have just memorised that they click the "Blue W" or the "Blue e" to get things done, it would be an awakening.

By the time these children are in the workforce, it's possible that the OS's commonly in use today will be completely obsolete - and not just from a versioning point of view, but in the methodologies and techniques used. As an example, when I was at school, my "official" education taught me to use BBC Micros with 5.25" floppy disks, followed by Amstrad PCW512s with 3" floppy disks and integrated CP/M, moving on to Windows 3.1 machines (with MICE!), which meant drag-and-drop, windowing metaphors, multi-tasking etc., and finally on to Windows 95 (which introduced me to the world of computer crashes and diagnosis like never before). By university, we were being introduced to Linux, Apple Mac and SGI UNIX workstations.

So, if I had "learned" computers only from those that were taught TO me from my school days and then been thrown into the workplace, I would have to relearn everything that I knew. The only way that I was able to keep up and still stay ahead of the class (and the teachers) was to not "learn" specific operating systems, terminology, methodology or icon locations but to spot the patterns and learn how to operate any generic computer. This was mainly aided by the fact that I was exposed to lots of different OS's and architectures before I had left secondary school. Since then I have applied these skills to operating systems that I had never seen before. I have managed networks that I have never had any formal training on with only a few seconds in which to "learn" the system. And I haven't managed to blow anything up yet.

My brother learned the same way - that a broad, general education is far better than a targeted, by-rote one - and wished to convey this to the children at the same time as they did their ordinary badge work, without hindering them in a way that would interfere with their actual results.

Thus, he wanted to trial a small Linux desktop system to use on the day.
He set aside an entire evening for installation, not including hardware setup (inserting hard disks etc.), plus an old 833MHz computer with 128Mb RAM.
It took us less than an hour to install, from a blank hard drive and two Slackware CDs.

He spent the next thirty minutes marvelling at how easy the setup process was; how much of the hardware it supported; how much came "pre-installed" with the basic system (enough to do everything he intended - he had initially wondered about installing OpenOffice, but the built-in KOffice was considered more than adequate); how "obvious" the setup questions were (he's more than computer-literate, but he still loved the explanations given for unfamiliar options - most told him exactly what each option did, why it did it, and what to choose if unsure); how customisable the whole thing was; how secure it was (when running as an unprivileged user); and how quickly and smoothly it all worked. The step-by-step instructions held his hand just enough at every single stage (not that he really needs it - he isn't put off by reading a HOWTO, asking on a forum or just experimenting), and I usually made him make the decisions about which option to choose - if in doubt, he chose the defaults.

Considering it was Slackware (really a server-oriented distro, and an older version at that), on an old machine, using more swap than real RAM to get things done, with the default 2.4 kernel and the default VESA drivers on a crappy 16Mb graphics card, without any real configuration, switching-off of features or fiddling, he was impressed at just how fast and usable a desktop system could be created so quickly. There were some rough edges, which were the main reason for my presence, as he was perfectly capable of trialling this himself.

For one thing, he had to type "startx" at one point! And then later edit inittab to make it always boot graphically. He had to select hdb in one place instead of hda, despite there being only one drive present (a weird ancient BIOS that assigned the first hard drive to hdb when hda was empty?!), and I advised him to install LILO to the MBR instead of the other locations. That was about it. And none of those problems would be present on any desktop-targeted distro.
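(The inittab change, for the record, is a one-character edit - Slackware uses runlevel 4 for a graphical login, so the default runlevel line in /etc/inittab goes from

    id:3:initdefault:

to

    id:4:initdefault:

and from then on it boots straight to the graphical login screen.)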

However, once in the GUI, he created his own locked-down users (i.e. normal users other than root). He had office suites ready, and they were trialled at the touch of a button. He was able to correct the out-of-date clock - although he was asked for the root password, as he was logged in as one of his restricted users at the time. At no point was he asked for driver CDs, Windows updates or a reboot. Nothing took over his hard disk or wiped out boot sectors without asking. Functionality that could only be added to his Windows machine with extra software was there by default - multiple desktops, multiple clipboard entries, etc. Everything ran smoothly without any knowledge of the hardware involved (although, admittedly, we didn't do anything more complicated than joining the local Ethernet).

And I know from experience that sound (had this computer possessed even a built-in soundcard) would have worked flawlessly without "drivers", printer and scanner support would be the work of a few seconds even on Slackware (never mind an auto-detecting desktop distro), and wireless would be just as easy and well supported. The only problems that MAY have appeared would be with exotic or entirely obsolete hardware - and hardware that exotic or obsolete would either not work at all in Windows or certainly not without installing lots of drivers. With that much effort, it could almost certainly be made to work on Linux just as quickly.

We seem to have reached a plateau. Linux is "good enough" for most tasks; "easy enough" for most people (technically-minded or not, especially if pre-installed for them); "supports enough" to run on virtually any hardware, no matter what its vintage; "common enough" that even relative novices are hearing of it and using it (in fact, most places are using it whether they know it or not, in the form of in-house black boxes, routers, TomTom kits, ISP's, webhosts, firewalls, etc.); and "supported enough" in that the simplest of Google searches will throw up hundreds of places to find help (by yourself, directly from other people, or from somewhere that will sell you support).

And it seems increasingly true that Windows is fast becoming "not enough". Not secure enough (let's not even get into that debate - I'm comparing Windows + supplied software + latest updates against Linux + supplied software + latest updates). Not fast enough (given modern Windows' hardware requirements). Not forgiving enough (of sloppy users, old hardware, etc.). Not usable enough (with more restrictions, problems, confusions, distractions and idiosyncrasies). Not simple enough (seven different versions of Vista; difficult to install from scratch on "unusual" hardware; harder and harder to get simple stuff working; more and more complex to secure). Not cheap enough.

The only question that remains is: how long until you've had enough?