Sunday, October 28, 2007

Why can't my computer...

There are a lot of things that annoy me about computers or, more usually, about Windows in particular.

Why can't my computer:

1) Accurately gauge how long it will take to do something? (Prime Culprit: Windows)

File copies, program installations, downloads, no matter what it is, if it's got a progress bar it's not going to be accurate. Not only will it not be accurate, it won't even be close most of the time. Sometimes, granted, it'll be spot-on but that's just got statistical averaging written all over it. Why, in the middle of a file copy, does it suddenly decide that it's going to take 3 days, no, 10 seconds, no, wait, 18 hours, no, hold on, ... ?

Yes, not all things are predictable, but the computer could at least estimate within a reasonable margin of error, and not blindly produce random and wildly varying estimates without even flinching. It'd be much better if, when it gets confused or held up, it just gave up guessing until it knew again! (KDE does do this when copying files, for instance... it'll just say "Stalled" for a second.)
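The steadier behaviour described above is not hard to sketch. Below is a minimal illustration (class and parameter names are mine, not any real API): smooth the observed transfer rate with an exponential moving average instead of trusting each momentary reading, and honestly answer "don't know" when the transfer stalls.

```python
import time

class SmoothedEta:
    """Estimate time remaining from observed throughput, smoothing out
    momentary spikes with an exponential moving average (EMA)."""

    def __init__(self, total_bytes, alpha=0.1):
        self.total = total_bytes
        self.alpha = alpha          # low alpha = heavy smoothing
        self.rate = None            # smoothed bytes/second, None until first sample
        self.done = 0
        self.last_time = time.monotonic()

    def update(self, bytes_copied):
        """Record another chunk of progress and fold its rate into the average."""
        now = time.monotonic()
        elapsed = now - self.last_time
        if elapsed <= 0:
            return
        instant = bytes_copied / elapsed
        # Blend the instantaneous rate into the running average.
        self.rate = instant if self.rate is None else (
            self.alpha * instant + (1 - self.alpha) * self.rate)
        self.done += bytes_copied
        self.last_time = now

    def eta_seconds(self):
        """Return an estimate, or None when we honestly don't know ("Stalled")."""
        if self.rate is None or self.rate < 1:
            return None
        return (self.total - self.done) / self.rate
```

With a low `alpha`, a single slow or fast chunk barely moves the estimate, which is exactly why the displayed figure stops lurching between "3 days" and "10 seconds".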

2) Know where the drivers for a bit of hardware will be? (Prime Culprit: Windows)

Why do I have to point it to a driver that I've either had to insert myself or, a lot of the time, had to go and download myself before it will recognise the hardware I bought?

Why can't there be a standard for hardware, so that everything USB- or PCI-based contains some information telling the computer roughly WHERE a set of drivers will be? Or, failing that, some website where it can automatically look up what drivers exist, and where, when given a PCI/USB ID, even if it gets "Not Supported" or "Unknown" some of the time?
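The lookup being asked for is trivial once such a database exists. A toy sketch (the table below is a stand-in for the imagined central registry, with a couple of real PCI IDs as illustrative entries): given a vendor:device ID, report a driver or honestly say "Unknown".

```python
# Stand-in for the imagined central driver database. The two entries are
# real PCI IDs (Intel 82540EM NIC, Realtek RTL-8139) used for illustration;
# this is not an actual registry.
DRIVER_DB = {
    ("8086", "100e"): "e1000 (in-kernel)",
    ("10ec", "8139"): "8139too (in-kernel)",
}

def lookup_driver(pci_id):
    """Map a 'vendor:device' PCI ID to a driver hint, or an honest Unknown."""
    vendor, _, device = pci_id.lower().partition(":")
    return DRIVER_DB.get((vendor, device), "Unknown - no driver recorded")
```

Even the "Unknown" answer is more useful than silently prompting the user to hunt for a driver disc.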

3) Know what I mean? (Prime Culprit: All OS)

When I type an address into my browser, why can't it just correct it for me (maybe with a Google-style "did you mean?")? Some mistakes are hard to catch, but it doesn't even catch the little ones, like www,fred,com. And when I type a file copy command, rather than just throwing an error at me because I mis-spelled or, in Linux, mis-capitalised a filename, have a small go at working out what I meant and ask me if it's right. (Yes, there are certain plugins etc., but I want it to be a standard feature. I can't be the only person who's made these silly typos!)

4) Protect me from others and myself? (Prime Culprit: Windows and Linux)

Now, I'd like to point out that this is almost exclusively a Windows problem for one part of the question (protecting me from others) and almost exclusively a Linux problem for the other (protecting me from myself). Windows just doesn't go far enough to stop other people getting into/onto/through my computer's defences - not even after the 50th version which has promised to do so. But it has some fantastic features to protect me from myself, if they are enabled. File deletes are, on the whole, confirmed first and undoable later. Plus, it's quite hard to completely shoot yourself in the foot by, say, deleting your Windows directory accidentally. On the other hand, it's extremely easy, if you are not careful, to totally balls up your Windows installation just by clicking on the wrong website or the wrong email, or even by disabling the wrong service.

Linux, though, protects you from outside elements a lot more. And even if they do get through, it is quite easy to recover from them and, additionally, their impact will be limited to the user accounts that are affected. However, even as a normal user, you can wipe out your home directory in one command without any confirmations and with little chance of getting it back unless you have specifically put into place procedures to recover it (such as replacing commands with safer versions, configuring user accounts so that they can't do that sort of thing, or just having easy-to-restore backups in place).
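The "safer versions of commands" idea mentioned above can be sketched in a few lines: a delete that moves files to a trash directory instead of unlinking them, so one mistyped command is recoverable. The function and the `~/.trash` location are my own illustration, not any standard tool.

```python
import os
import shutil
import time

TRASH = os.path.expanduser("~/.trash")  # hypothetical trash location

def safe_delete(path, trash=TRASH):
    """Move a file into a trash directory instead of deleting it outright,
    so an accidental delete can be undone."""
    os.makedirs(trash, exist_ok=True)
    # Timestamp the name so a later delete of another file with the
    # same basename doesn't overwrite this one.
    dest = os.path.join(trash, "%d-%s" % (time.time(), os.path.basename(path)))
    shutil.move(path, dest)
    return dest
```

Aliasing the system's delete command to a wrapper like this is exactly the sort of "procedure you have to put in place yourself" that arguably should be the default.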

So it seems that Linux could benefit from a bit of Shadow Copy, a bit of System Restore or some kind of filesystem rollback, and Windows could benefit from a bit more privilege separation, a bit better programming and a focus on non-virus software rather than anti-virus software (i.e. before-the-event practices that stop the viruses getting on there so easily in the first place).

And this doesn't just apply to desktop environments. Humans make mistakes. Operating systems should be designed to take account of this fact and help where possible.

5) At least give me a clue? (Prime Culprit: All OS)

"mplayer: error while loading shared libraries: liba52.so.0: cannot open shared object file: No such file or directory"
(Note that this is an example only - when you compile mplayer from source, it does in fact warn you or leave out support when pre-requisites are missing).

Well. Lovely. Fantastic. So you know that you NEED liba52. You won't run without it. You were obviously written with it in mind. So why can't I instead get:

"Mplayer: Error: You haven't installed liba52. You can download this from"

Now, with modern dependency checking this sort of thing is getting rarer, but even so, where's the error message that a human can parse easily? Windows does just the same with missing DLLs. And getting these messages at compile time but not at run time is another bind. Fine, tell me gently at compile time that I need libX and exit neatly. But why not do the same when I move that binary to another machine that doesn't HAVE libX, instead of erroring as above? More people RUN programs than COMPILE them. People compiling programs usually have the sense to sort such problems out for themselves (and such a trivial error is nothing compared to some of the doozies that you can get when compiling software yourself!); ordinary users can't.
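The run-time check being asked for is perfectly feasible. A minimal sketch (the function and message wording are mine): before starting, probe for the libraries the program needs using the stdlib's `ctypes.util.find_library`, and report any missing ones by name with a hint, instead of the loader's terse one-liner.

```python
from ctypes.util import find_library

def check_libraries(required):
    """Return a human-readable report of missing shared libraries,
    or None if everything needed is present. Library names are given
    without the 'lib' prefix, e.g. 'a52' for liba52."""
    missing = [name for name in required if find_library(name) is None]
    if not missing:
        return None
    return ("This program needs the following libraries, which are not "
            "installed: %s. Your distribution's package manager can "
            "probably install them." % ", ".join(missing))
```

A launcher script could run this at startup and print the friendly message before the dynamic loader ever gets a chance to complain.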

Similarly, dumbing down error messages too much is a major mistake:

"An error has occurred."

What error? Why? Whose error was it? Was it my fault? Was something wrong with the machine? Where did the error occur - in the program, in Windows, in something else? What can be done about it?

As a replacement, how about:

"An error has occurred in PartyPoker. [[Note the friendly program name]] This appears to be a problem with that program. You can try running it again, or checking for a PartyPoker update. If you still receive errors with PartyPoker, the program lists a website address as a source of help. Click below for a file which will help the program author to determine the cause of the problem."


"An out-of-disk-space error has occurred. Windows is showing you this error because it did not have enough room to create a 50GB file on drive C: as requested by the program Nero Burning ROM. You have only 10GB of free space on drive C:. You can try:

- Clearing up 40GB of space on drive C: and retrying the operation
- Instructing Nero Burning ROM to use a drive with more space (for example, D: currently has 100GB free)"
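The disk-space example above amounts to a pre-flight check plus some arithmetic. A sketch of that logic (function name, wording and the program-name parameter are my own illustration), using the stdlib's `shutil.disk_usage`:

```python
import shutil

def check_space(path, needed_bytes, program="the program"):
    """Pre-flight check: instead of a bare 'out of disk space' error,
    produce an actionable message naming the program, the shortfall,
    and what the user can do about it."""
    free = shutil.disk_usage(path).free
    if free >= needed_bytes:
        return None  # enough room; proceed silently
    shortfall = needed_bytes - free
    return ("Not enough room on %s: %s asked for %d MB but only %d MB is "
            "free. You can free up %d MB and retry, or choose a drive "
            "with more space." % (path, program, needed_bytes // 2**20,
                                  free // 2**20, shortfall // 2**20))
```

The point is that the OS already knows every number in that message; it just has to be bothered to say them.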

6) Fix itself. (Prime Culprit: All OS)

Windows Last Known Good Configuration.
Windows Safe Mode.
Windows Recovery Console.

Where's the "I need to get to my files" option - with a minimal desktop that uses NO programs, services or other information in common with the main Windows install, and lets you copy your files off the computer before it dies completely? Where's the "Run Diagnostics" option to let Windows have a go at finding out what is actually wrong, rather than blindly looping through a list of Windows "versions", each of which loads fewer and fewer drivers? While we're at it, where's the "Check My Disks/Memory/Hardware" option in that list?

Where's the "Right, last time I crashed loading the graphics driver, according to these logs - this time I'll ignore graphics and just load a basic VESA driver and see if I can get further" logic?

And then we have the fantastic idea of including an option, usually the default, to automatically restart Windows on error (great, so you can't even SEE the BSOD as it whizzes past, and then Windows will blindly sit there trying to boot every time it restarts until you come and fix it - it'd be better just to turn itself off!). Yes, there's sometimes a need for a watchdog on a high-availability server, but on an OS designed for home desktop use? And what's the point of it infinitely restarting at the same place unless it LEARNS from that mistake, especially if that place is before it even gets to a desktop?
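The "learn from the crash" logic isn't exotic; it's a marker file. A sketch of the idea (file name, step names and functions are all hypothetical): record which boot step is in progress; if the marker from the previous boot survives, the machine crashed there, so skip that step this time instead of blindly retrying forever.

```python
import json
import os

MARKER = "boot-progress.json"  # hypothetical state file surviving reboots

def mark_step(step, state_dir):
    """Record the step about to run; a clean boot would delete this again."""
    with open(os.path.join(state_dir, MARKER), "w") as f:
        json.dump({"in_progress": step}, f)

def next_boot_plan(steps, state_dir):
    """If the previous boot left a marker at some step, it crashed there:
    skip that step this time rather than crash-looping on it."""
    marker = os.path.join(state_dir, MARKER)
    skip = set()
    if os.path.exists(marker):
        with open(marker) as f:
            skip.add(json.load(f)["in_progress"])
    return [s for s in steps if s not in skip]
```

This is precisely the "last time I crashed loading the graphics driver, so this time use basic VESA" behaviour from the previous point, in about twenty lines.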

That's just the start of my list. Hopefully, I'll finish it off soon.

Friday, October 19, 2007

What does the Linux desktop need? Those who say "I want, I want..."

I've just read an article entitled "What does the Linux desktop really need?".

Let's veer slightly and ask a similar question: "What does my car need?". I'd say it NEEDED a lot less rust, a sunroof that isn't held in by parcel-tape, a new wheel-bearing (AGAIN) and something to make my lights turn on at least once in every ten tries.

Personally, I'd say it could also do with electric windows, a heated front windscreen and a CD changer/MP3 stereo. But wait. Hold on. If we're saying that I can have anything I've seen on a car... I "need" braking-power reclamation, a hybrid engine, 0-60 in 5 seconds, a five-year warranty, fingerprint recognition for starting the engine and GPS vehicle tracking.

Now, that's ridiculous, because my car doesn't NEED those last things, but that's basically what this article was saying. The Linux desktop doesn't NEED anything else. It's there. It's a viable alternative to Windows. It can do anything that Windows can do (given the developer time investment). Years of development on the Windows side are now recreated in a matter of months on the Linux side. Take drivers for things like newly-released wireless cards, some of which have to be reverse-engineered before a driver can be made; take some of the fancy graphical effects present in Vista, or the desktop "features" of MacOS and Vista - there are already equivalents and copies available for Linux that do just the same, most of which were started AFTER someone had seen those features elsewhere.

There isn't a type of application that can't be run natively, in theory. Given enough horsepower, we can even replicate the majority of Windows functions enough that high-power applications and 3D games can be run in a Windows-userspace recreation (Wine) at astonishing speed considering the technical problems of doing such things. Not only that, Linux can do virtually everything that Windows can do natively, and usually does a better job at it. There's nowhere to go from here apart from getting people to a) use the thing and b) develop for the thing, both of which are mutually dependent.

Reading the LWN comments on the article is even worse... it "needs" Photoshop, Office, Games... No, it doesn't.

It's been proven - it's technically possible to write top-class 3D games and powerful image-editing programs for the Linux desktop. It's not even any "harder" than doing so for Windows. When Adobe want to do it, they can. In fact, Linux is more standardised for such things. You don't need to worry about ATI vs nVIDIA vs Intel - just let OpenGL sort things out for you.

The fact is that the desktop doesn't NEED anything, unless you are intent on recreating Windows on Linux. That's the problem - the Windows mentality isn't suited to, or compatible with, the way Linux works. Windows people want firewalls that don't disrupt their games and let any application request an open port via UPnP. Windows people want antivirus because they think they need it. Windows people want perfect connection to the heap of junk that is Active Directory. Windows people don't want to enter passwords or manually configure their hardware in order to do dangerous things, like overclock their graphics card or turn their firewall off. You can't change those people. Not without a big stick.

The way to get Linux onto a desktop is not to perfectly emulate every undocumented Windows bug and quirk when connecting to an Active Directory server for login so that some poor sap can run Outlook the way he likes, but to build a Linux equivalent that has clear advantages - faster, smaller, easier to manage, more transparent, easily portable, easily extendible and which can do stuff not seen elsewhere. The people who are more likely to make decisions based on those criteria? Large organisations. Who use networks. Which are run by a poor sysadmin somewhere who "knows" Windows but only "plays" with Linux. They don't care that Linux can detect and use 99% of all PC hardware - they care that it takes an hour to set up a new type of PC to the way they want it to be, rather than a five-minute copy of a well-known model's hard drive.

Imagine a Linux distribution. You install it in a "server" mode via a menu. Then you install it on a client machine via the same menu. At no time did you have to install drivers for monitors or some such rubbish. You don't HAVE to license it. You don't HAVE to spend days setting up the user group structure and policies to a safe default. Yeah, there'll be parts of the machine that won't work without proper drivers, but that's not important. Really. These sorts of places SPECIFY the machines. They say what hardware will or will not come with them, down to the individual components. Compatibility with some cheap winmodem is not their problem - they buy a different modem, especially if it affects their security or their technicians' free time.

Anyway, you've started a client and server from barebones. Then imagine that you have automatic, cross-network authentication to that server, client logon, desktop settings and "policies", which allow the network administrator to change every single setting and restriction on the clients in almost every program via one directory interface. Imagine it works just as well over wireless, VPN, a Bluetooth interface or a USB cable. Just as automatic. Just as simple. Just as fast.

You can throw software across the network by just clicking on a machine in a tree-diagram on the server and deploying a package (so it'll be an RPM, not an MSI, but who cares?). Managing a thousand users on a hundred workstations becomes a cinch. And as a bonus, the machines automatically share work between them when they are idle. They automatically discover each other (with appropriate administrator control) and use each other's storage as a mass-RAID for the network data, including encryption to stop people looking at other people's work. It does it all without needing a million ports open. It does it all without major security problems. It works just as well from outside the network, when one of your staff takes a client laptop home - they plug it into their broadband, maybe they have to click an option to connect remotely instead of locally, and bam! - it's just like they are at the office.

Now imagine that you can do all that on lower-end machines than Windows could. And you can do more, stuff that just isn't possible on Windows. You can plug four graphics cards into each PC, four USB mice and four USB keyboards and now four people can use the one machine without even knowing. And their CPU power is being shared across the network, with all the other four-man machines, maybe even with the server itself doing some CPU work on their behalf when it's not busy with other things. And you wouldn't even notice that was what was going on. We're *not* talking thin-client - but you can do that if you want, too. You just tick the "thin-client" option when you install the client and the system does the rest for you.

Now imagine that not only does it do all that, but you can also trust the server to back up those clients too, whether they are working locally or remotely. The server remembers individual machines and any quirks you've had to apply (that binary modem driver, that setting on boot that prevents the ACPI problems, etc.) and when you rebuild them you can re-include those quirks too. Saving data to the network is transparent, and not only does the server RAID all its data, but it shares it out with the network. Server blows up? No probs. Stick the CD into any machine, create a server, maybe supply it with the old server's private key, and bam - all the data feeds back from the clients to the server and the network rebuilds itself.

Well... the problem is that most of that stuff exists in one form or another. Certainly everything listed above is perfectly "do-able", and there's at least one bit of software for every single component of that sort of system. They're not all tied into one distribution (that I know of), but they are there. The most "together" distributions are the paid-for ones, Red Hat etc. But there is nothing there that isn't possible; it might take a few months' work, and you could probably do it all with nothing more than existing downloads, kernel patches and a bit of bash-glue. But it's not around. You can't actually get it. And most Windows admins won't even try it while it involves a lot of messing about. Have you seen some of the HOWTOs? Have you seen the number of steps needed to get Kerberos, LDAP, exotic filesystems, remote control, VPNs, etc. all working your way? Windows is no easier, either, so you're left in the "what's in it for me" valley.

What's needed is not more and more replication of existing features but new and exotic uses. What's the most interesting part of Google? Google Labs. What's the thing that people ALWAYS buy an OS for? The new, interesting features. Yes, when Samba can perfectly manage every aspect of AD integration, it'll be sought-after. But people scrambled to Vista "because". There wasn't anything complicated in it, there was little groundbreaking stuff, and popular opinion now says that Vista is more of a pain in the backside to run for the average user than previous versions. But it was bought because it "could". It could do "new stuff" that Windows people hadn't seen before. Remember Windows 98SE, which could "do USB"?

People are already talking about the next version of Windows Server because of what it can do. Not about how well it does it. Not even about how easy it is to do, that's normally left until review copies appear in the hands of magazine reporters, but about what's new. And, stupidly, not even about what it doesn't do any more. The fact that every single version of Windows Server has had a hundred features announced that have never appeared is overlooked. The hype surrounding it by the time it comes out MAKES people want it. Vista was supposed to include database-style filesystems, a download manager, filesystem "filters" (Libraries), Palladium "trusted systems", integrated anti-spyware, UEFI support, PC-to-PC Sync, a full XPS implementation, to have a better UI, to perform better than Windows XP, and that's before you even get into all the capabilities that they physically removed from the OS that were there before in XP.

And the fact is that Windows Vista was just a small upgrade. If it had had ALL of those things, it could possibly be the best OS in the world. And Linux CAN have the majority, if not all, of those things. Most of them even exist for Linux right now. We just aren't using them.

People who "push" developers to make a Linux-Windows just don't get it - Linux is already in front in terms of features and technical details. We all know that. It wipes the floor with its "main" competitor (although, to be fair, so do a lot of other operating systems). It's not that we're not "there". We are. Something else is holding Linux back. Firstly, ease of use. That's usually a big trade-off, not only against compatibility and security but also against system performance. However, Linux has power to spare. And then it's just a matter of making things work without a million settings. I'm a big fan of command lines and text-based configuration files - there is no reason to lose them. But they don't have to have vi as their only interface.

The main thing it's missing, however, is a short, simple, easy demonstration of powers that Vista, and even future versions of Windows, either can't or don't have. It needs a show-distro to turn up, either from the depths of one of the established ones or out of the blue. It doesn't need that distro to say "Look at me, I'm just like Windows, only slightly better"; it needs to say "Why on Earth would you bother to look at an OS that can't do X, Y and Z?", where X, Y and Z are things that have either never been done before, or have always been "promised" or "desired" and never materialised. And I don't mean a flashy GUI. The nearest we ever got to that sort of hype was the Kororaa Xgl Live CD, and look at what it did - very little of any practical use. But it was NEW. It was even NEWER than what Windows could do at the time. So it got a lot of press.

Being able to access an AD domain isn't something new. It's not impressive to people. It's not even that innovative - there's a major OS that does it automatically and (fairly) reliably. What's needed is to play to Linux's strengths - flexibility, malleability, speed of development, freeform and accessible APIs. That means coding quickly, easily, without barriers, restrictions and expensive SDKs. Just get in there and write stuff. In half the time it's taken Windows to get where it is, Linux has replicated and/or surpassed every aspect of Windows. Now it needs to overtake it - and you can't do that by blindly copying features from Windows, or even from other OSes.

Now, the article doesn't push Linux for anywhere near as much as the comments on LWN. To them I say: just because Windows does something doesn't mean that Linux should follow suit. If that were the case, Linux would BE Windows. I don't WANT my Linux desktop to have a built-in GUI firewall that's difficult to configure the way I want. I don't WANT automatic update dialogs that are a pain to turn off. I don't WANT something to automatically detect all wireless networks the second it sees a wireless card.

On the software front, what would be the point of "getting" Exchange, Adobe's products or Office as Linux-native versions or equivalents? By doing that, you would have to integrate a significant portion of Windows infrastructure, including Active Directory and DirectX. So what you've done is make a "free" version of Windows. Whoopee. Everyone who's currently using Linux is using it NOW, while it's not a version of Windows... why? Because it's BETTER. It isn't bound by some stupid corporate decision or two decades of backward-compatibility quirks.

Take a look at some edited highlights of Vista SP1:

Performance improvements
New security APIs
A new version of Windows Installer, version 4.1.
Users will be able to change the default desktop search program
Support for the exFAT file system (FAT extended for USB sticks, basically)
Support for 802.11n.
Terminal Services can connect to an existing session.
IPv6 over VPN connections.
Support for booting using Extensible Firmware Interface on x64 systems.
An update to Direct3D, 10.1.
Support for the Secure Socket Tunneling Protocol.

What's there that Linux won't have by the time it comes out, if it hasn't got it already? What's there that Linux couldn't do? Nothing. And to be honest, as a changelog for a major upgrade to even a stable release of an OS, that's pretty pathetic. What about Server 2008? It's all pretty much the same. There's nothing in there that Linux doesn't already or couldn't do with a year or so's work.

Let's stop faffing about asking Windows users what they think they need from a Linux machine. Let's SHOW them. Let's just get stuff done and forget emulating Windows. We all know that Windows has its death coming to it. The longer we give it credibility by attempting to copy everything it does, the more time we waste away from the interesting stuff, the stuff that will have people hooked. We have SELinux, we have file-server compatibility, we have directory management software - we have all of this, but nobody cares. We need to show stuff that Windows can't do.

We need a five-machine network that can outperform the best Windows servers and individual desktops when both are running 20 simultaneous clients (as in four people ACTUALLY WORKING on each of the five machines, locally). We need filesystems that "heal" (and not like self-healing NTFS in Server 2008, which is basically thread-safe Scandisk), and network filesystems that can let Google do its job without worrying, and with which small companies no longer need to worry about tape backup (although, obviously, they still could) - which adds 50% to the price of any server.

We need perfect, logical, simple directory systems that can do stuff that Windows AD can't even dream of, in an easily editable/recoverable/backup-able format - it doesn't matter if it's Fedora Directory Server or Apache Directory Server; no-one cares. We need it all to run automatically but securely. We need automatic secure communications across a network to pick up new machines and integrate them directly into the Directory. We need systems that (with proper admin control over the process) auto-patch underneath systems that are still running. We need one-click Setup, Trusts, Backups and merges of entire Domains.

We need client systems that can repair themselves from known-good images (which, hell, should be stored in the cross-network filesystem) while they are still working - no, we don't acquire viruses, but you still need to Ghost stuff back sometimes. We need machines that detect faulty hardware and compensate automatically - memory just failed in the server? Fine. Isolate the memory areas responsible (BadRAM), alert the admin, allow them to work around the problem temporarily until they can get a replacement, restart and check all services, and then carry on like nothing happened. And all the while, you spotted it where Windows would have just crashed.

We need systems that can tolerate as much failure as possible. Primary hard drive or RAID array failed? Warn the admin, carry on as normal, read what you need off the network. Network failed? Route around it - over a USB connection if the admin only has a USB cable left, or FireWire, or wireless, or Bluetooth, or serial, or parallel, or... We need a real, "intelligent" help system. When it sees the admin hunting through menus looking at DNS settings, it tries to (unobtrusively) help. It brings up a checklist and works through things one at a time by itself until it says to the admin "The DNS server is fine. But you forgot to point that client machine at it." or "The DNS server doesn't have a reverse-DNS entry for that IP; that's why what you're trying isn't working".
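The DNS checklist described above can be sketched with nothing but the stdlib `socket` module (the function and its plain-language messages are my own illustration): try the forward lookup, then the reverse lookup, and report the first failure in words an admin can act on.

```python
import socket

def diagnose_dns(hostname):
    """Walk a small checklist and report the first problem in plain
    language, rather than leaving the admin to hunt through menus."""
    # Step 1: does the name resolve at all?
    try:
        ip = socket.gethostbyname(hostname)
    except socket.gaierror:
        return ("Forward lookup failed: no DNS record for %r. Check the "
                "client's resolver settings and the server's zone file."
                % hostname)
    # Step 2: is there a matching reverse-DNS entry?
    try:
        socket.gethostbyaddr(ip)
    except (socket.herror, socket.gaierror):
        return ("Forward lookup works (%s -> %s) but there is no "
                "reverse-DNS entry for %s; that may be why what you're "
                "trying isn't working." % (hostname, ip, ip))
    # Step 3: nothing wrong at this layer.
    return ("DNS looks fine for %s (%s). The problem is probably "
            "elsewhere, e.g. a client not pointed at this server."
            % (hostname, ip))
```

Each step either passes or produces exactly the kind of sentence the imagined help system would show: what was checked, what was found, and what to do next.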

We need systems that collectively monitor, detect and shutdown other rogue systems within their sight, a kind of distributed-IDS built into the system. We need systems that do all this 100% securely, with full certificate-chains and verification and let the admin control exactly what's going on if he wants. And when someone breaks that particular method of encryption? Ah, just choose one of the thousand-and-one encryption methods and do a one-time re-encryption to change every server, client and software over. Well, yes, do pick up local Windows systems and tie into them as much as you can but forget making that a priority. Set NEW standards. Make people say "I absolutely NEED a system that can do that." Let the other OS manufacturers play catch-up for a change.

Let's stop playing catch-up. We already won that one, there's no competition there any more, there's no more fun to be had. Let's start wiping the floor. Let's get JUST ONE feature in that people decide they absolutely NEED. And let's do it before Windows can even get a sniff. Let's do it so that, when the time comes for Microsoft to replicate it, they want to be able to read OUR code in order to get it done well enough. Let's stop playing about asking 90-year-old grannies why they don't like Linux when they know nothing BUT Windows... their answer will always be some variant of "It's not like Windows".... either that or "That penguin is scary". Let's make the people that are really scared of the Penguin be Microsoft and Apple. Because, at last and for once, they can't keep up with Tux.