

When Linux goes wrong

Our server's email wasn't going out for a couple of days, and it only clicked with me this morning when I wondered why I hadn't been notified about some new issues posted on the tracker.

I thought I'd be a bit technical about what happened, and then a bit philosophical about why in 2013 we're still all dealing with such arcane software.

So, what happened?

We installed some new packages on the server, to let us host a select few other sites on it (the server has spare capacity). Those packages came with dependencies that pulled in virus/spam scanners automatically (clamav / amavis). We run Ubuntu, so in theory these dependencies configure themselves and we don't need to know exactly how everything impacts everything else.

To try and make our lives easier, we actually use Google Mail (gmail) on the domains. There's no sense maintaining our own inboxes, our own spam checking, and backing it all up: email is just way too complex for all that fuss (we used to, and it seriously ate time, and things regularly broke).

Our server also sends outbound email via gmail over a secure TLS connection, because delivery is much more reliable that way: the world trusts gmail more than it trusts independent servers. We still run a proper mail server (postfix), but that server relays everything through gmail.
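
This relay setup is the standard postfix "smarthost" arrangement. A minimal sketch, with illustrative values rather than our literal configuration:

```
# /etc/postfix/main.cf -- relay all outbound mail through gmail over TLS
relayhost = [smtp.gmail.com]:587
smtp_tls_security_level = encrypt        # insist on TLS for outbound SMTP
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
smtp_sasl_security_options = noanonymous
```

Note that `smtp_tls_security_level = encrypt` is a global setting: it applies to every delivery made by postfix's smtp client, not just the hop to gmail. That detail matters later.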

Postfix handles scanning by passing each email through an intermediate server (the scanner), which then passes it back out. Our strict requirement of TLS for gmail also broke this hand-off to the scanner, because the default Linux setup didn't contain valid certificates. That's not normally a problem: we don't accept external connections to our mail server, so we never imagined the certificates being used. We got this error in the mail logs:
delivery temporarily suspended: TLS is required, but was not offered by host[]

This message led me down the garden path for a while, because I had no idea why emails were routing back through our own server. I thought perhaps postfix had decided it had to do local delivery for these messages.

Once I worked out that postfix's TLS requirement also applied to the bridge to the local virus scanner, it was easy enough to resolve: the mail server configuration was adjusted and all the queued emails went out.
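
For the record, the amavis hand-off and the kind of override that resolves this look roughly as follows. These are illustrative snippets based on a stock amavisd-new integration, not our literal configuration:

```
# /etc/postfix/main.cf -- route every message through amavis on port 10024
content_filter = smtp-amavis:[127.0.0.1]:10024

# /etc/postfix/master.cf
# The transport postfix uses to reach amavis. If main.cf sets
# smtp_tls_security_level = encrypt globally, that policy applies to this
# hop too, and amavis on localhost offers no trusted certificate -- hence
# the error. Overriding TLS on just this loopback transport fixes it:
smtp-amavis unix  -       -       n       -       2       smtp
    -o smtp_tls_security_level=none

# amavis re-injects scanned mail here; no further filtering on this path
127.0.0.1:10025 inet  n   -       n       -       -       smtpd
    -o content_filter=
```

After a `postfix reload`, running `postqueue -f` flushes the deferred queue and the backed-up mail goes out.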

It's 2013, dammit

I wouldn't normally bother writing this kind of thing up; I probably have to deal with a few random things like this each week in our line of work. However, it just struck me that we are using systems designed in the 1970s, and what a stark contrast they are to what people expect from computers nowadays. On one hand we're loving the perfectly laser-cut finishes of our latest smartphones, arguing about what DPI really counts as retina, and designing user interfaces where even the idea of copy & paste is considered too complex. Yet underneath, these systems are plumbed together from ageing software components, filled with unneeded complexity that exists only for legacy reasons, where nobody really agrees on how to do anything.

Consider these issues:
  • Errors went to the 'info' log, not the 'error' log
  • Postfix applies logic designed for inter-server certificate security to the pipeline used for internal virus scanning
  • "TLS is required, but was not offered by host" is not a correct error message. TLS was configured; the certificate just wasn't signed by a trusted authority. Why didn't the error say that instead?
  • There was no system-level alert telling me all emails were queuing. When I log in to the server it tells me about package upgrades; why is there no way to aggregate the error messages and detect new kinds of problems?
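
That last point isn't hard to imagine. Here is a hypothetical sketch of what such aggregation could look like: collapse the variable parts of syslog lines into coarse "signatures" and count them, so a brand-new error pattern stands out from routine noise. The collapsing rules and log format here are my assumptions, not a real tool:

```python
import re
from collections import Counter

# Hypothetical sketch: collapse variable parts of syslog lines (queue IDs,
# hosts, counts) into coarse "signatures", then count them, so that a brand
# new error pattern stands out from the routine noise.
SIGNATURE_RULES = [
    (re.compile(r"\[[0-9a-fA-F:.]*\]"), "[host]"),  # bracketed IPs/hosts
    (re.compile(r"\b\d+\b"), "N"),                  # PIDs, counts, ports
]

def signature(line):
    # Best-effort strip of the syslog "timestamp host process[pid]: " prefix.
    msg = line.split("]: ", 1)[-1]
    for pattern, replacement in SIGNATURE_RULES:
        msg = pattern.sub(replacement, msg)
    return msg.strip()

def summarise(lines):
    """Return (signature, count) pairs, most frequent first."""
    counts = Counter(signature(l) for l in lines if l.strip())
    return counts.most_common()

if __name__ == "__main__":
    sample = [
        "Mar  4 10:01:02 box postfix/smtp[1234]: delivery temporarily "
        "suspended: TLS is required, but was not offered by host[127.0.0.1]",
        "Mar  4 10:02:07 box postfix/smtp[1240]: delivery temporarily "
        "suspended: TLS is required, but was not offered by host[127.0.0.1]",
        "Mar  4 10:03:00 box postfix/qmgr[900]: 4F2A1: removed",
    ]
    for sig, count in summarise(sample):
        print("%4d  %s" % (count, sig))
```

A cron job comparing today's signatures against yesterday's would have flagged the "TLS is required" line as a new, repeating problem within hours instead of days.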

Linux: you can't live in the 70s and think POSIX is the be-all and end-all of architectures. How about being more forward-thinking, and creating some standard notification and monitoring systems that integrate neatly with what you already do? How about stopping the bickering, with every distribution writing its own front-ends to everything? How about stopping reinventing the wheel all the time, throwing out what came before and always starting from scratch? Every time a new bit of software or a new standard comes out, other software starts adding options for speaking the new language, and things just spiral and spiral: everything tries to talk to everything, because nobody can agree on a common language or on what basic frameworks we need. Even the standards people are always reinventing the wheel – they don't extend or refine existing standards, they create new ones.

The big problem is how any of the needed improvements to Linux can be achieved. I have no idea how to incentivise usability in system-level software that sits so far away from investment budgets, and whose maintainers are comfortable either living in sheds, pursuing system administration and programming with the devotion of a religion – or living on consultancy salaries, with the implicit expectation that only people like them will ever be able to get anything done.

Don't think I'm not pro open source – I'm extremely pro open source. Microsoft software stinks just as much, as does Apple's stuff under the hood (I recently spent an afternoon working out how to unlock some folders, and it turned out the locking was performed by setting the folder's date to a magic value, the release date of the first Apple Mac – on some level MacOS recognised this bizarre setting as a folder-lock status, even though it has proper mechanisms for this kind of thing). Commercial software sweeps problems under the carpet just as much!

To a large extent the modern 'cloud' philosophy has sidestepped our problems rather than solved them. Very few people use "Linux" directly now; we use some cloud system, or Android, or Google ChromeOS, or Ubuntu One, or whatever, and leave the Linux system-level stuff to an ever-narrowing group of people. The users don't even know it is Linux. In essence this is a kind of betrayal of the point of it all: Linux was meant to liberate us, yet the solution now seems to be remotely hosted systems or boxed hardware that we have no control over.

The package maintainers of tools like Postfix really need to wake up to the need to enter the 21st century of usability. It's not just Postfix; it's pretty much every software package there is. But we can't just point fingers, we need to incentivise people somehow. Unfortunately, I have no idea how! So until someone works it out, techie people will keep diagnosing problems the very old-fashioned way, and the masses will use increasingly closed but commoditised systems.
