It is human nature to assume that what is familiar to us is familiar to all. I can still clearly remember, when I was around 12, going to a friend's house and being astonished to find three pairs of knives and forks set around my dinner plate. In fact, petrified would probably be a better word - not only had I never personally seen anything like it, I had never imagined that anyone would, or indeed could, use more than one knife and fork in a single meal.
Of course, my life thus far has not just involved my surprise at other people's habits; on occasions (less rare than my ego might have preferred), other people have been surprised at mine. Recently a non-computing friend saw my main computer workspace - a Unix setup with 4 xterms displayed - and asked, jokingly, if I was plotting to take over the world (I blame the media for this particular image). I'm so used to my own setup that I no longer think of it as odd but, when suitably prompted, I can see why other people might think so. In comparison to a Windows or Mac machine, full of little visual goodies, and perhaps with only a web browser or word processor loaded, a number of tiny xterms filled with half-executed commands does look odd.
It's not really surprising that a non-computing person would find my setup odd - after all, I spend a lot of time on computers, so it's to be expected that some things I have come to find natural scare casual users. What has surprised me, and continues to surprise me, is how many computing people I come across find my setup odd - sufficiently odd that it attracts comment. Some people are baffled as to why my systems are as they are, some are curious as to how it all works, and some sneer at the way I do things. There is no good answer to the sneer, nor any great reason to answer that person (although, possibly due to a deep character flaw, I find the sneer rather amusing). However, the how and why are interesting questions, which raise interesting points, and are more closely intertwined than they may first appear.
Here's the broad setup I use. I have a desktop machine (because it's fast and comfortable), a laptop (which I use only when out and about, because laptops are slow and ergonomically disastrous), a main server (where you're probably reading this from), and a backup server (for the next time the water company cuts through the cable powering the main server; in an attempt to salve my environmental qualms, the backup server is a very low-power device that also serves some domestic purposes). Though various people have some sort of access to the servers, I personally administer all 4 machines. This raises two immediate problems: how to keep the administration overhead to a minimum; and how to keep files synchronised across the machines.
The answer to the administration overhead question is, for me, simple: use the same operating system for all machines. That way, the lessons learned on one machine apply trivially to the others (and, when things go belly up, machines can relatively easily stand in for one another). As someone who (through a quirk of geography and history) never passed through the DOS / Windows world, I eventually gravitated towards Unix operating systems and, after a brief flirtation with Linux, I've exclusively used OpenBSD for nearly 10 years. This immediately scares most people off, or confirms their worst suspicions of me - to give you an idea of the popularity of this OS, at the time of writing I've met precisely 1 OpenBSD user in real life. Why did I choose OpenBSD? Simple: it's simple. OpenBSD does very little by default, and what it does, it does well and consistently, with minimal configuration and good documentation, and it is easily administered remotely. I can have a blank box turned into a complete OpenBSD install with everything I want, set up how I need, in a couple of hours (most of which is automatic downloading of stuff, and doesn't require my presence). I keep an open mind about OS replacements, but so far none of them appears to be an improvement, or even a sideways step. Of course, using what is often dismissively called a server operating system does involve some compromises, although fewer than you might think - the increasing diversity of real-world OSes (thanks indirectly, I think, to OS X) has meant that running a minority platform involves fewer compromises than it did 5 or 6 years back.
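To give a flavour of what turning a blank box into a working machine involves once the base install has finished, most of the remaining work boils down to a handful of commands along the following lines. This is only a sketch: the package names and hostname are invented for illustration (and assume the packages are available in OpenBSD's package collection), not a record of my actual setup:

    #!/bin/sh
    # Hypothetical post-install steps on a fresh OpenBSD box.
    # Add the third-party packages relied upon (example names only,
    # assuming they are available as binary packages) ...
    pkg_add git unison mutt
    # ... then pull working files across from an existing machine
    # ('desktop' is a placeholder hostname, 'work' a placeholder repository).
    git clone ssh://desktop/home/me/work ~/work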
The file synchronisation problem is a little more subtle, but arguably more important. I outlined my mechanism for this a while back and, while it's changed in detail quite a bit, in spirit it's still the same: I use a version control system (git these days) for my important files and Unison for large files that I can recreate via other mechanisms.
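By way of illustration only - the paths and the Unison profile name below are placeholders, not my actual layout - a synchronisation pass with this kind of scheme can be as small as:

    #!/bin/sh
    # Sketch of a sync pass: version-controlled files via git,
    # large recreatable files via Unison.
    set -e

    # Important files live in a git repository shared between machines.
    cd "$HOME/work"
    git pull --rebase && git push

    # Large, recreatable files are mirrored with Unison; 'bulk' is an
    # assumed profile name defined in ~/.unison/bulk.prf.
    unison -batch bulk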
A corollary of using a decent Unix, and synchronising files automatically, is that virtually everything is configured by simple text files, so to a large extent my configuration also propagates across machines. I have also tried over the years to accept, whenever possible, the default configuration on a machine. The reason for this is simple: the less I feel the need to change, the easier it is to move between machines (and different OSes). Of course, there's a limit to how far I'm prepared to accept someone else's choices, and so I do change a reasonable number of settings; but, compared to most people I know, I change relatively little.
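One way to picture how this propagation works is the familiar idiom of keeping dotfiles in the same version-controlled area that is synchronised between machines; the directory and file names here are invented for illustration, not a prescription:

    #!/bin/sh
    # Sketch: configuration files live in the synchronised repository
    # and are symlinked into place on each machine.
    # ~/work/dotfiles and the file names below are assumptions.
    for f in .profile .Xdefaults .muttrc; do
        ln -sf "$HOME/work/dotfiles/$f" "$HOME/$f"
    done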
As well as trying to use the default configuration as far as is practicable, I also try to maximise my use of tools supplied with the OS and, failing that, to use the simplest tool that does the task I require. A decent Unix comes with a wide variety of little tools, most of them neglected by most users; it continues to amaze me how many tasks can be expressed in terms of these little tools. Using tools that are standard across many different machines and OSes again lowers the barriers to moving between machines. It also generally implies a greater consistency of user experience, since tools from the same provider tend to be consistent with one another; some providers (particularly commercial ones) seem to delight in perverse user interface choices, which means that installing and learning new tools can be an uncomfortable experience. I also try, whenever possible, to use command-line tools, not because - despite what some of my friends think - I like being obscure (my formative years were spent on RISC OS, where the GUI was King and the default assumption was that the command-line was for the mentally unsound) but because it's easier to control command-line tools and maintain consistency across platforms.
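To give a flavour of what expressing tasks in terms of little tools means, here is the sort of throwaway pipeline that standard Unix utilities make trivial (the file name is made up for the purpose of illustration):

    # List the most frequently used words in a document, using nothing
    # beyond tr, sort, uniq and head.
    tr -cs '[:alpha:]' '\n' < paper.txt | sort | uniq -c | sort -rn | head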
Most of my time on a computer is spent either doing e-mail (I use mutt because it's the least annoying mail client I've yet found, despite its obvious limitations), web browsing, programming, or writing. The latter two tasks are the most interesting. If for you, as for me, an average day is a wild trip of programming in several languages and working on several papers of dubious literary merit, then you'll know how much time one can spend editing text; I often have 20 or 30 files open for editing. The sad truth of the matter is that, as far as I can tell, the modern computing world does not contain a single decent text editor (whereas RISC OS, which I mentioned earlier, had at least 2 excellent text editors). Most text editors are either arcane (e.g. vi and emacs) or bloated (e.g. Eclipse). Since I am not clever enough for the former, and far too impatient to wait for the latter to load (I had a massive shock 4 or 5 years back when, on a powerful machine, I found to my horror that if I typed at full speed in a well-known IDE, there was a noticeable lag in text appearing on screen), I use a half-way option, NEdit. NEdit has many limitations and flaws, but it's simple, loads almost immediately, and its syntax highlighting is just about powerful enough to satisfy me.
Let's return to the 4 xterms I mentioned at the beginning of the article, which scared my non-computing friend - it's both worse and better than it seems. I set up KDE so that it has 12 virtual desktops (one of the main reasons I used KDE in the early days was that it binds sensible keys to virtual desktop selection by default), of which I typically use 7 or 8 at any given point. The first is my main work area: one of the xterms has mutt permanently loaded (I occasionally load other mutts in different xterms to simultaneously read multiple folders; an advantage of using a simple tool), one is mostly used for downloading e-mail, and the other two are for random commands and ssh sessions. Desktop 2 is for web browsing and desktop 3 for web page editing. Desktop 4 is my calendar. Desktops 5 and 6 are for programming. Desktops 7 and 8 are for paper writing, with 9 and 10 being used for secondary paper writing. The remaining desktops are spares. This may seem complex or unduly pernickety; the answer to both charges is essentially the same: I evolved this setup organically over time, so it seems natural, to me at least. Because of virtual desktops, I need only a single 19" monitor, although these days it's a struggle to find a sensibly sized monitor. Unfortunately, computer people are generally gadget people, and gadget people are easily fooled by bigger, faster, better, so monitors these days have largely useless resolutions. Wide-screen monitors, for example, are (I assume) good for watching films, but they're fairly useless for text editing, where screen height is more important than width. Furthermore, the pointless fixation on resolution means that many fixed-size things (some fonts, icons, etc.) appear tiny, so I increasingly see people having to put their nose virtually to their screen to read things. 1280x1024 works for me and, until someone doubles the resolution without increasing the screen size (at which point old apps could be run transparently with two physical pixels for every logical pixel they perceive, while new apps gain access to the genuine high resolution, making everything a bit smoother and sharper), I will try to resist the siren call of the resolution junkies.
In the above I've tried to give a brief outline of how I use computers - a modern reader may well detect a certain Luddite tendency in some of the choices, but hopefully I've also provided some small justification for each of them. Of course, none of the above is really a high-level why. Why go to all this effort? Why these particular choices? Although I didn't explicitly think of it this way when I first started down the path that led me to my current mode of operation, there is a solid reason behind it. When I look at the really good programmers I've come across (directly and indirectly), with only one exception I can really think of, they all have one thing in common: they're also good sysadmins. Their machines are in good order, simple, with the right hardware for the job, and ready for the task at hand; and they can whip a new machine into shape quickly. When the muse strikes, there is little to get in the way of good work: they know their machines inside out, their tools inside out, all their files are easily available, everything used frequently loads quickly, and they can flick rapidly between the sub-tasks that constitute a larger job. Similarly, the best sysadmins I've seen are also good programmers. I don't think it's possible to understand a modern OS without being a decent programmer - more importantly, it's certainly not possible to tame and control an OS in the desired way without programming being involved. There's a symbiosis between these two activities that seems to me undeniable; being really good at one requires being at least fairly good at the other.
So, in conclusion, my computer setup is an attempt to emulate, in my own small way, the best habits I’ve been able to pick up from those more able than myself. It’s a continual work in progress, but it does the trick for me.