Tuesday 29 January 2008

Backports, and why you shouldn't just follow my directions

I got hooked on Bazaar (or bzr - which is the package name) shortly after I started working in Ubuntu. It's a great version control system written in Python, and I'll show a nice example of its use here soon. But that's not what this post is about.

Looking for bzr in the Etch repository, I found that a plain aptitude install bzr would provide me with version 0.11 of the software. Now, that is a lot of versions away from the current stable version, 1.1. So I decided I couldn't do without a newer version, and since I trust the package to work well and it doesn't provide any network services (so security risks are limited), I thought I might as well get the newest version from the developer site.
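By the way, you can check which version an install would give you without actually installing anything:

apt-cache policy bzr

The "Candidate" line in the output is the version the package manager would pick.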

At that point I remembered backports. A quick look around the site told me bzr 1.0 for Etch was available from there. Now, there are a few advantages to using a back-ported deb package over a manual installation. One of them is the ease of installation (after adding backports.org to your trusted sources as explained here, the back-ported packages are available in your package manager), but in fact installing bzr by hand is also really simple: unpack the tarball in a suitable place and put a link to bzr in your path.
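For the curious, the manual route would look something like this (I'm assuming the 1.1 tarball here, and /usr/local as the "suitable place" - adjust to taste):

tar xzf bzr-1.1.tar.gz -C /usr/local/lib
ln -s /usr/local/lib/bzr-1.1/bzr /usr/local/bin/bzr

bzr happily runs straight from the unpacked tree, so a symlink on your path really is all it takes.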

The real reason I opted for backports is that installing through the package manager makes the package management system aware that a version of bzr is present. This gives you lots of benefits: you get warned when you try to install an incompatible package, when you install another version, or when you remove a package that bzr depends on.

Well then, I added backports to the trusted sources as described in the given link, and used aptitude to select the 1.0 version. You need to explicitly select the newer version; otherwise the system defaults to the version from the official repository (you can use pinning to alter this behaviour, but that's a topic I'll have to skip for now). I chose not to install the recommended packages bzrtools and python-paramiko (for sftp support); if you need them, you would have to manually set those to the right versions as well. That's all!
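For completeness, and in case that explanation ever moves: on my system it boils down to a line like

deb http://www.backports.org/debian etch-backports main contrib non-free

in /etc/apt/sources.list, followed by an aptitude update. And if you'd rather use the command line than select the version interactively, something like

aptitude -t etch-backports install bzr

tells aptitude to prefer the etch-backports version for this one install.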

Almost all. There's one essential remark I usually don't see in blog posts like this one: there is, of course, a reason why backports are not enabled by default on your Etch system. If, like many people, you've come to Debian for its reputation of good security, you need to realise that it is only as secure as you keep it. You are the only person who can assess the implications of adding a repository to your trusted sources, and it's not a decision to take lightly. I trust backports as a source, and I trust bzr 1.0 to be stable enough to belong on an Etch system, but you shouldn't just jump off that cliff with me...

Tuesday 22 January 2008

Lack of notes on installation procedure

Like I said in the previous post, I'm planning to provide a brief walk-through of setting up LVM on a LUKS-encrypted partition during a Debian "net install", if only to show how very comprehensive the installer is, and how usable - if you remain open to the information coming at you from that Spartan-looking screen.

There's one problem, however. It was such smooth sailing at the time that I didn't take any notes, and now I find I'm not sure I can reproduce the procedure off the top of my head. I should have written that blog entry the same day, really.

Here's the new plan: I'll set up the qemu emulator (yet another thing to blog about!), then simulate a minimal installation of Etch with that, and take notes (and perhaps some screenshots). This will take a bit longer to come through, though.
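For the record, the qemu part should be along these lines (a sketch - the ISO name here is made up, check the Debian site for the current netinst image):

qemu-img create -f qcow2 etch.img 8G
qemu -hda etch.img -cdrom debian-etch-netinst.iso -boot d -m 256

The first command creates a growable 8 GB disk image; the second boots the installer CD against it with 256 MB of memory.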

Sunday 20 January 2008

Where I want to take this blog...

Here's a list of things I plan to write about soon:
  • That installing Debian with the net-installer and setting it up to use a LUKS-encrypted partition with LVM volumes on top of that is very straightforward nowadays (thanks to those who made such a great installer!). And of course: why you might like LUKS, and LVM.
  • That you really don't need a blog like this because everything seems to be so well documented already - and what I think I can add by writing anyway. Or in other words: my preference for man pages and -doc packages over some obscure blogged how-to.
  • A discussion of useful options for keeping track of changes to your configuration files.
  • Some notes about my firewall configuration adventures.
  • A practical take on backing up your data (ideal takes on it are ubiquitous already :)).
And, of course, if I'm too slow to produce any of that, do prod me by leaving a comment.

The coolest feature?

In the last post, I sort of suggested I'd be just as happy to use MS Windows. That's not entirely true, though. The one thing I'd really, really miss is a good package manager. Ian Murdock put it very well, so I won't try rephrasing it here.

And, while we're on the topic: aptitude is my tool of choice. I might sing its praises more elaborately at some point.

Saturday 19 January 2008

Learning in Debian

It's kind of funny. I've played with GNU/Linux systems for ten years now, but only in the last three years did it go beyond play and towards work.

I think the first thing I had was some version of Red Hat (I can't remember which, but it came on a pile of floppies my housemate gave me), which I used to learn C++ on. Then there was SuSE, and at some point someone gave me his old Mac with a terrible version of MacOS on it (8 point something) and I installed Debian on that.

All this time, I never really "got into the system". I just figured out how to run gcc and get my network connection up, and that was about it. Every now and then I'd try to learn some more, break the system, and then mistrust it too much to just fix it and go on - instead leaving it for days or weeks, only to come back, not remember what I'd been fiddling with, and fresh-install some other flavour of OS (I had FreeBSD at some point too, but the most exciting thing I did with that was setting up partitions/slices...).

So that was all play, and everything else was done with tools I felt more confident using - all of them running on MS Windows. Actually, I think I would still be quite happy with that - both Windows 2000 and XP were excellent products in my personal experience.

Then came Ubuntu (5.10, I think), and for whatever reason, after checking it out I felt confident enough to switch to it as my main platform. Working in GNU/Linux every day, rather than every now and then on a free evening, you learn much faster. Several generations of Ubuntu later, I think I know my way around a lot better than I ever did in Windows.

Somehow, now was the time to find out about Debian Etch. So that's what I switched to two weeks ago, and of course, coming from Ubuntu, I feel right at home. Tonight I was wondering: how is it different, if at all? The answer, I think, is in the documentation. The amount of information, the style of writing, and the choice of topics in the how-tos are all geared towards a different audience.

Sure, there's also the difference that Ubuntu supports more cutting-edge hardware and sports newer versions of packages, but I'm running an older machine, and for my purposes everything in Etch is new enough - python 2.4 does all I need (and 2.5 is actually in the main repository, too), so to speak.

So really, the difference to me is just in the documentation - a change of level that I think I like. Debian Etch it is then, from here on. And this time there's no uncomfortable feeling using it.

Lots of people already blog on how they configure their Debian boxes. I don't think I'll add anything exciting, but at the very least this place will be a good reference for myself. I could do that in a log in my home directory of course, but let's hope some of this stuff will help one or two people - my very minuscule contribution to free software ;)

Wednesday 16 January 2008

Wget saves mouse clicks...

Today I was visiting a journal web page that listed supplements to an article they published. There were umpteen items to click away at (why don't they have a "download all" button that gives me a gzipped archive??) and I hate that sort of dumb clicking.

I remembered wget can automate this. A quick man wget provided just the right example to do what I wanted:

wget -r -l1 --no-parent -A.mov http://www.thejournal.com/thepage

...where this particular example downloads the page and the links on that page (-r), but only one link deep (-l1), without ascending in the site's directory tree (--no-parent), and, finally, fetches only .mov files (-A, short for --accept).
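One more thing worth knowing: -A takes a comma-separated list of suffixes, so had the supplements come in several formats, a single invocation would still have done the job:

wget -r -l1 --no-parent -A.mov,.pdf http://www.thejournal.com/thepage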

Yes, the next thing I found out was that I didn't have a codec for those Quicktime supplements, but that's another story...