Thursday, 29 September 2016

Still at university

Breaking with tradition here, I’m going to ramble about work a bit - or at least some of the weird effects of working at a university. I’ve been thinking more about them recently, partly because I’m getting old and senile and partly because after 15 years I’m finally going to leave university and work Somewhere Else.

It started a few weeks ago when I left a meeting in the university library and walked across level 4. When I was a student (some 12 years ago or more) level 4 of the library was our territory. It was where the maths textbooks laired, and called us to gather even if we never actually looked at the things other than to marvel at their number and the amount of dust that had accumulated since the last pilgrims had arrived. At that time the floor was filled with bulky computers, with CRT monitors perched atop them like the rock piles in Blair Witch, and laid out in uniform lines which both made economical use of the space and sapped any will to live from you. Now though? Well, now it’s…

Library level 4

Well, exactly the same really but with more modern tech. The weirdest sense of deja vu. I have to wonder if the Silver Fox is still haunting the place, seeking those students who dare to consume something other than blessed water in his hallowed halls.

This time-bubble warps perception everywhere and our relatively low staff turnover only encourages it. This week is Freshers’ Week, which means hundreds of school leavers are roaming the grounds in an attempt to find the fabled “north buildings”. They are aided in their quest by the returning students, and it is so very easy to look at these second and third years and relate to them, thinking “that was me not so long ago”. And yet it has been more than a decade since I was a third-year student, worrying about coursework and helping run a radio station. In that time many of my friends have had children - several children - who are now going to school and looking at us as the uncool adults we really are.

It is a frog-on-a-slow-boil problem, I feel (minus the brain removal those frogs actually suffered, although…). I haven’t left, so I haven’t aged. Despite doing adult things like getting a mortgage, life has failed to convince me that I have become, in theory at least, a responsible member of society and A Proper Grown Up. Maybe it is because I haven’t had that moment, a variant of which I assume everyone else goes through, where I suddenly become my Dad, understand what’s going on in the adult world, learn to appreciate sport and discover the joy of gardening.

Yesterday, I was sitting in the Plug (the student bar) drinking diet coke when MMMBop started playing over the speakers. Aside from the video being projected onto the wall (and the lack of a half-completed piece of maths tutorial work), this could have been a scene from 15 years ago - except back then I was a bit fitter (although I’m more flexible now - in your face, younger me) and had a bit more hair. Fortunately the modern world dragged me back from my time-travel experience, as the bar’s audio system has a feature which lets anyone add music to the playlist from their phone, anywhere on campus.

Back in the day nobody played anything but URB if we had anything to say about it.

Tuesday, 30 August 2016

Jekyll and the build scripts a few years on

A few years ago I moved my sites from a PHP templating system to static generation using Jekyll. How is it working out?

Pretty well. I’ve had no downtime (that I haven’t caused), which is to be expected on a low-traffic website serving static HTML files. Updating content and templates has been easy, with Jekyll remaining simple to use. While I’m sure my Jekyll install is in need of an update, that is less of a concern than if I were running code exposed to users. Overall, no problems with the technology or maintenance - indeed I find it much easier to work with than previous versions of the site, as I don’t have to re-learn my configuration each time I want to do anything more complicated than change some words.

The biggest win - and something I actually considered skipping when I initially made the move - has been the build scripts. In professional life I wouldn’t think twice about writing automated build scripts for a project, but we all know that this kind of thinking isn’t as rigorously followed for personal projects. I wrote a simple mina script for deploying (and updating) my sites, and several years on I am deeply thankful to my past self. I haven’t had to keep my build process in my brain at all - just the magic command, which is in a README somewhere. This has meant small updates have been easy, the most boring part of site maintenance has all but gone away and consequently those updates have actually happened.
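For anyone curious what such a script actually has to do, here is a rough sketch of the equivalent steps as plain shell rather than mina (the server name and paths are placeholders, not my real setup):

#!/bin/sh
# Build the site locally, then push the generated HTML to the web root.
set -e
jekyll build
rsync -az --delete _site/ deploy@myserver:/var/www/mysite/

The point is less the tool and more that the whole process collapses into a single command I never have to remember.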

The lesson to take away here is that doing the hard (and dull) work up front of defining a development process and writing deployment scripts was worth it. Not so much because of time saving, or the consistency inherent in an automated process - but because these benefits actually encouraged me to maintain my sites in a way I simply wouldn’t have done had I been required to remember how to deploy my work each time I did anything.

Saturday, 30 July 2016

Printing a photo book

TL;DR - I used CEWE Photoworld and it was good

I have been running a photography website since the start of 2015 and I thought it would be nice to turn the pictures from last year into some kind of coffee table photo book. That's 72 photographs (12 months x 6 photographers), each with title and caption. Each month also needed a cover page, and I wanted the first photo of each month to appear on the right-hand page, so I needed another page per month to shuffle the photos on. That's 8 pages per month, for a total of 96 pages to lay out - not a small amount of work, so I needed a site with a tool I could use without wanting to do Very Bad Things by the end of it.


After some searching I ended up using CEWE Photoworld for three important reasons:

  • They have a desktop tool
  • They have an online help which actually ... helps
  • They have a 100% satisfaction guarantee

The desktop tool is hardly the pinnacle of software design, but it does the job. It is quick, reassuringly responsive and allows text with background colour and limited bulk formatting. It let me put everything together the way I wanted, including guidelines on where I could push content to the edge of pages and where it wouldn't work. That put it a huge step ahead of Photobox, which I tried first because it is the famous one. Photobox offers a web-based application which didn't let me add text with a coloured background and generally had that cumbersome feeling of web applications from the last decade.

The Photoworld online help includes a real time chat, which was very helpful. I used it twice. The first agent was very responsive and helpful, answering my questions and generally being very reassuring. The second one was significantly less so - I got the impression he had far too many simultaneous conversations running, and was annoyingly vague when I was asking very precise questions (if you've got five different types of paper to choose from then you really can't use terminology which ambiguously covers three of them when recommending a choice). Still, we got there and the site helpfully emailed me a copy of the conversation which I kept just in case I needed to trigger the guarantee at a later date. Fortunately I've not had to test their guarantee in practice, but it was very reassuring to know that I had that safety net.

Reassuring is, I think, the key word for describing dealing with Photoworld. They know their service is expensive (as in book printing is expensive - I don't think they are expensive compared to their competitors) and is likely to be bought by people who haven't got a clue what they are doing, so they do their utmost to make you feel like you're in good hands and to minimise the chances of you making a mess of what you're doing. One example from the site text: each book passes through 15+ pairs of hands as it is produced, so it is thoroughly checked for imperfections. Regardless of how helpful this actually is in reality, it is an encouraging thought.

The only time I feared for my book while using it was when it came to finish and pay. At this point it uploads the pictures and sends you off to secure payment - or crashes horribly if you attempt to use the Paypal option. This is slightly frightening when you've spent tens of hours laying everything out and proofing the book and all of a sudden it looks like it might be stuck on your desktop for all eternity. Anyway, switching to a credit card bypassed that part of the application and it all worked fine.


The book arrived slightly quicker than promised and looks great.

Year in Pictures 2015

The presentation box was an extra, but looks really nice.

Year in Pictures 2015

And the photos printed well. There is a notable variance in the quality of the pictures between the different photographers, but that is to be expected, reflecting the different cameras in use.

Overall, I'm impressed. I'll be using Photoworld again.

And a big thank you to Kirsty Davey for proof reading it and correcting my mistakes. If she had a web presence I'd link to it.

Thursday, 30 June 2016

The Brexit post

So, like everyone else with a social media account, I have an opinion on Brexit and the chaotic aftermath in which we find ourselves, with both the government and the opposition collapsing in on themselves at exactly the time when some actual leadership is required. While I doubt I've much new to add, one day I will look back at this blog and I want to see a collection of my thoughts from this time.

Disclosure first. I believe in the European Union and the European vision. I believe that as a nation we are more than this small island, and that not only means we should engage with European politics, but that we have a responsibility to do so. So yes, I voted Remain.

Obviously I think the referendum result was a terrible decision and I'm appalled at the lack of conviction shown by the winners in the aftermath - be it Boris deciding that after leading Leave, he doesn't want to lead actually leaving or the calls from the Leave camp to put off invoking Article 50 for an unspecified amount of time. The indecision and lack of any coherent plan for this result is, frankly, terrifying.

Remain supporters are trying to process the situation. Some are calling for a second referendum, while others are looking to Scotland to find a magic veto and dig us out of this mess. Still others are looking to claim citizenship of other countries, or leave altogether. There is a hope the government will simply ignore the result, which seems a reasonable reaction, if wishful. It's not like they've listened when it comes to anything else recently.

Many have had enough of all this. They've sat through months of campaigning, of impenetrable rhetoric, half-truths, scaremongering and downright lies and, understandably, just want to get back to normal life. They want cats and babies on their Facebook feeds, not endless discussion of what is seen as a now-closed issue. This resignation hasn't gone down well and others are asserting their right to be angry, leading to a weird meta-argument.

Personally, I'm sympathetic to the weariness. I'm tired of all the debates and of all the fighting being about stopping things getting worse. The Remain campaign wasn't about fighting for a better future - it was a rearguard action to defend the current (far from ideal) state of affairs from the self-serving and deluded. The same as the battle to stop the NHS being taken to pieces and privatised. And the battle for the BBC. And the schools. And the Snooper's Charter. And so on.

The left does not seem to be fighting for improvements any more. We aren't campaigning for positive change, but opposing negative change, which rather plays to the whining liberal stereotype, and it is really hard to gain any kind of momentum when your message is "now, hang on". It is at this point we really need something big and positive we can get behind in the political arena. We should be able to look to the opposition for some kind of balance. Except the opposition has struggled to be credible for the last few years and has just imploded.

This is, of course, an emotional reaction to the current situation. There is a tremendous amount of work done by those who are campaigning for a genuinely better future, and I am doing a disservice to those fighting the rearguard action. But ultimately, major change will need to come through voting in what I am going to crudely call "better people" and that means increasing engagement in a process which for me (someone who is already engaged and interested) is currently a source of helplessness and fatigue. I doubt I am alone in feeling this.

I hope future-me reading back can say that I've played a part in improving this situation.

Wednesday, 25 May 2016

Exporting a postgres database from Heroku and importing to local install

Continuing with my efforts to learn some basic, useful postgres admin commands, it’s time to look at importing and exporting data. We are going to export a postgres database from Heroku and import it into a local postgres install for development.

I’m assuming the Heroku toolbelt and postgres are installed locally and myuser is already created. I’ve written some very basic pointers to (local dev) postgres installation and administration already.

We are going to export the database used in myapp and import it locally to mydatabase to be owned by myuser. Brace thyself.

Export from Heroku

This is the easy bit.
heroku pg:backups capture --app myapp
curl -o latest.dump `heroku pg:backups public-url --app myapp`

Import to local

We are going to use the pg_restore command, but that needs to import as a postgres superuser. It will also prompt for a password, even if the user is set up for peer authentication (as per my last post) so we’re going to create an importer user with superuser powers. There is probably a better way to do this, but life is short…

Logged in to postgres as a superuser:
CREATE USER importer WITH SUPERUSER PASSWORD 'mypassword';
We also need a target database, so while still logged in as the superuser:
CREATE DATABASE mydatabase;
Then to import the database (back on the command line):
pg_restore --verbose --clean --no-acl --no-owner -h localhost -U importer -d mydatabase latest.dump
This will throw some errors as the DROP commands (generated by the --clean flag) fail, because the objects they target don't exist yet in a fresh database. This seems to be ok, but check nothing else has gone wrong.
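If the noise bothers you, a variant which should work (untested here) is to restore into the freshly created, still-empty database without --clean, since there is then nothing to drop:
pg_restore --verbose --no-acl --no-owner -h localhost -U importer -d mydatabase latest.dump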

Back in postgres as a superuser, switch to the new database and assign the correct ownership:
\c mydatabase
REASSIGN OWNED BY importer TO myuser;
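To sanity-check the result, connect as the real user (assuming the authentication setup from the earlier local-dev post) and have a look around:
psql mydatabase -U myuser -W
\dt
The tables should now list myuser as owner. Note that REASSIGN OWNED only moves objects importer owned; the database itself was created by the superuser, so if \l still shows it owned by postgres, an ALTER DATABASE mydatabase OWNER TO myuser; finishes the job.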

Sunday, 24 April 2016

HTTPS for a small site

We all know it’s a good thing. Security, SEO and soon not being called out by Chrome and Firefox for being insecure. But for a small, personal site it’s a pain in the rear to set up and the certificate is prohibitively expensive, right? Right?

Maybe not. Let's try and change this:

The certificate

These days you can get a 90 day certificate for free from Let’s Encrypt, which is news to me and the reason I thought I’d give this a go.

Main stumbling block removed.

Apache config for SSL

Ok, I can write this config myself. However Let’s Encrypt has a magic tool which claims to do everything for me. Let’s find out.

git clone https://github.com/letsencrypt/letsencrypt
cd letsencrypt
./letsencrypt-auto --help
  • It downloaded a python environment for me.
  • It did a thing with root privileges courtesy of sudo. Probably shouldn't have used a window in which I’d previously sudo'd something. Oops.
./letsencrypt-auto --apache
The automated thing doesn't detect my domain. It detects a load of others, but I’m not ready to destroy those yet. Boo.

Also, I'm guessing with letsencrypt-auto. It seems to pass flags to the letsencrypt script which is buried somewhere. Turns out I am right. Great.

I have to agree with the T&Cs to register with the ACME server (Automated Certificate Management Environment, in this context). Aside from the obvious, ACME seems to mean the Advisory Committee on Mathematics Education, which I don’t think is relevant here, so clearly I am getting a cert from the same people who supply anvils to Wile E. Coyote.

Seems legit. Let’s do this.

./letsencrypt-auto --apache -d
Still not finding my domain. Is it … confused by the number of domains? Nope, it doesn’t like files containing multiple vhosts. Oh. Reconfiguration time.


Ok, updated. Now time to fire this baby up. The original command now finds all the domains. Go! What could go wrong?


Well, shit.

Minor problem - apparently I'm loading my fonts over an insecure connection.


For those of you not up to speed with the arcane art of reading browser URL bars, the shield is gone which means the browser isn't blocking assets trying to load into a secure page over an insecure connection.

These certs expire after 90 days, so it's time for a simple cron job.

00 03 * * * $location/letsencrypt/letsencrypt-auto renew >> $location/letsencrypt/logs/renew.log 2>&1
The docs recommend checking daily, so that should keep things up to date. And potentially fill the filesystem with logs. Meh.
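To check renewal is actually doing its job, this prints the validity dates of the cert being served (example.com standing in for the real domain):
echo | openssl s_client -connect example.com:443 -servername example.com 2>/dev/null | openssl x509 -noout -dates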

So, my site is available over a secure connection. Hurrah! The “ensure all connections” setting seems to have set up a basic redirect, which is good, although I'm going to have to add the HSTS headers myself and hope that doesn't get toasted when I next run one of these scripts. Renew seems to behave though.


HSTS removes a vulnerable step when redirecting from an insecure to a secure connection. Details on the magic can be seen on the OWASP site.

The important bit of Apache magic is:

Header always set Strict-Transport-Security "max-age=31536000; includeSubdomains; preload"
This goes into the https vhosts and requires mod_headers to be enabled.
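For reference, enabling the module and checking the header is actually being sent looks something like this on Ubuntu (example.com standing in for the real domain):
sudo a2enmod headers
sudo service apache2 restart
curl -sI https://example.com/ | grep -i strict-transport-security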

Testing this was a world of fun. I'd recommend disabling the cache (in the dev tools), using a plugin to inspect the headers (I like Live HTTP Headers) and making liberal use of the secret chrome://net-internals/#hsts page to check the status of the HSTS settings. This is all in Chrome.

Tidying up

It seems it's only the automagic script that doesn't like my old Apache config. Now it's all set up, I can put everything back in the same file.
So now I am handling four different connections in the same file:
1. The canonical HTTPS address
2. HTTPS on the alternate hostname
3. HTTP on the canonical hostname
4. HTTP on the alternate hostname
With 2. and 3. redirecting to 1. and 4. redirecting to 2. so as to pick up the extra HSTS headers.


Yeah, that can wait.

Overall though, this was not the trial I expected. Getting a cert is now really easy. The only parts that required any real thought were figuring out how to arrange my Apache config and checking the HSTS headers were being set correctly.

No excuses any more! Best do the others.

Sunday, 3 April 2016

Creating a database and user for local postgres development

So, yeah. I'm a postgres n00b. But I'm a n00b who wants to be able to create a non-superuser account and database, relate the two and also be able to remember how to do this again in two weeks time.


As superuser via UNIX user authentication:
sudo -u postgres psql postgres

As superuser directly (-W forces password prompt):
psql -U postgres -W

As a user to a specific database:
psql database -U username -W

Basic commands

Show tables:
\dt

List users:
\du

List databases:
\l


User management

CREATE USER username WITH PASSWORD 'password';
DROP USER username;

Database management

CREATE DATABASE database;
DROP DATABASE database;

Granting ownership and permissions

ALTER DATABASE database OWNER TO username;
GRANT ALL PRIVILEGES ON DATABASE database TO username;
Dumb settings for local dev.

Also, if you're getting problems connecting, try changing the authentication method from peer to md5 on the local line in /etc/postgresql/VERSION/main/pg_hba.conf.
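For reference, on a stock Ubuntu install the relevant line looks something like this (the exact defaults may vary between versions), and it's the method column at the end that changes:
local   all   all   peer
becomes
local   all   all   md5
Postgres then needs a reload to pick up the change:
sudo service postgresql reload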

Much of this came from this post. I'm planning on using pgAdmin3 as a database explorer when I want something quicker than the command line (on Ubuntu).