Sunday 6 October 2024

Migrating postgres databases from ElephantSQL to Neon

Continuing my series of "if I push enough buttons I can get postgres to work for me", I am going to record how I migrated from ElephantSQL to Neon. This is one of my personal documentation posts - I write these for my own reference, for when I need to do something similar in future after all the useful information has dropped out of my head, so I don't have to distil something simple from proper documentation again. They are sometimes useful to someone doing the same thing (I'm actually surprised how often I send these links to people) but since more folk are reading my blog from LinkedIn these days, this is fair warning.

The setup

I was migrating from ElephantSQL to Neon as the former was shutting down. I wish Neon all the best, but the way things are at the moment I guess it's only a matter of time before I have to do it again. Migrating a simple postgres database is straightforward, but if (like me) you don't do it often it is nice to have the process written out.

This is for my own experimental applications, so I'm dealing with small, single-node databases and I'm not worried about downtime.

Recover the data

Getting the data out of the source database is straightforward. Simply log into the control panel, copy the connection string and use pg_dump for a full download:

pg_dump -Fc -v -d <source_database_connection_string> -f <dump_file_name>

-Fc writes the dump in postgres's custom format, which is what pg_restore expects. -v is verbose mode, showing you all the things going wrong...

Upload to the new database

Initially, I struggled a bit with Neon. I created the database and user in the web interface, but could not find a way to associate the two, so pg_restore failed with permissions errors. The simple way around this was to create the database via the Neon CLI - recorded here as a bit of a gotcha.

neon roles create --name new_user
neon databases create --name new_database --owner-name new_user

And for completeness, these are the commands which list the databases and users.

neon roles list
neon databases list

Once the database is created properly, the data can be restored using the pg_restore command.

pg_restore -v -d <neon_database_connection_string> <dump_file_name>

Repointing the database

So far so simple. Now to reconfigure the application - this should just be a case of updating an environment variable. For a Rails app, that is likely DATABASE_URL. Simply set the variable to the new database connection string, restart the app and this is done.
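As a concrete sketch of that repointing step: if the app reads DATABASE_URL from a .env file, the swap is a one-line substitution. The file layout and connection string below are placeholder assumptions - hosts like Render let you edit the variable in a dashboard instead.

```shell
# Sketch: repoint DATABASE_URL in a .env file. The temp file stands in
# for a real .env, and the connection string is a made-up placeholder.
ENV_FILE="$(mktemp)"
printf 'DATABASE_URL=postgres://old-host/old_database\n' > "$ENV_FILE"

NEW_URL='postgres://new_user:secret@example-endpoint.neon.tech/new_database'
# -i.bak keeps a backup copy and works with both GNU and BSD sed
sed -i.bak "s|^DATABASE_URL=.*|DATABASE_URL=${NEW_URL}|" "$ENV_FILE"

cat "$ENV_FILE"
```

After that, it's just a restart of the app so it picks up the new value.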

Again, this is for a very simple application - one Rails node, small database, no particular need for zero downtime.

Hopefully this will be useful to someone out there even if it's just me in the future. Hello, future me. What are you up to these days?

Monday 23 September 2024

Why good software engineering matters

I've needed to make some changes to a few of my personal applications recently and running through the process made me reflect on some of the basic building blocks of my profession. As a deeply uncool individual, I am very interested in the long-term sustainability of our technical estates so I thought I'd capture those thoughts.

The story so far... I run a few small-scale applications which make my life easier in different ways. I used to host these on Heroku, then when they shut down their free tier I migrated them all to Render and Koyeb with databases hosted by ElephantSQL. About a year on, I started getting emails from ElephantSQL telling me they were shutting down their database hosting, so I needed to migrate again. I also needed to fix a few performance problems with one of the applications, and generally make some updates. Fairly simple changes, but on an application I haven't really changed in several years.

A variant of this scenario comes up regularly in the real world. Unless you're lucky enough to be working on a single product, at some point your organisation will need to pick up some code nobody has touched in ages and make some changes. The application won't be comprehensively documented - it never is - so the cost to make those updates will be disproportionately high. Chances are, this means you won't do them so the application sits around for longer and the costs rise again and again until the code is totally rotten and has to be rebuilt from the ground up, which is even more expensive.

In a world where applications are constantly being rolled out, keeping on top of maintenance - and keeping organisational knowledge - is vital, but also very hard to sustain. There are lots of service-level frameworks which promote best practice in keeping applications fresh, with ITIL being the obvious one, but this is only part of the picture. How do we reduce the cost of ongoing maintenance? Is there something we can do to help pick up and change code that has been forgotten?

This is where good software engineering makes a huge difference, and also where building your own in-house capability really has value. Writing good code is not just about making sure it works and is fast, nor just about making sure it's peer reviewed - although all of this is very important. There are many approaches which really help with sustainability.

Again, my applications are really quite simple but also the "institutional knowledge" problem is significant. I wrote these (mostly) alone so anything I've forgotten is gone. The infrastructure has been configured by me, and I'm not actively using much of this stuff day to day so I have to dredge everything out of my memory / the internet - I am quite rusty at doing anything clever. These problems make change harder, so I have to drive my own costs (time in my case) down else I won't bother.

Let's look at some basics.

First, the database move. My databases are separated from the applications which means migration is as simple as transferring the data from one host to another and repointing the application. This last step could be tricky, except my applications use environment variables to configure the database. All I need to do is modify one field in a web form and redeploy the application to read the new target and it's done with minimal downtime. Sometimes developers will abstract this kind of change in project team discussion ("instead of pointing at this database, we just point at this other one") but with the right initial setup it really can be that simple.

Oh, except we need to redeploy. That could be a pain except... my applications are all set up for automated testing and deployment. Once I've made a change, it automatically runs all the tests and, assuming they pass, one more click and the new version goes to the server without my having to remember how to do this. I use GitHub Actions for my stuff, but there are lots of ways to make this happen.

That automated testing is important. Since everything in tech is insufficiently documented (at best) this creates a safety net for when I return to my largely forgotten codebase. I can make my changes or upgrades and run the tests with a single command. A few minutes later, the test suite completes and if everything comes up green then I can be pretty confident I've not broken anything.
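The "only deploy on green" gate at the heart of that pipeline can be sketched in a few lines of shell. The two functions here are stand-ins, not real commands - in a Rails project they might be `bundle exec rspec` and a push to your hosting provider.

```shell
# Sketch of a deploy-only-on-green gate. run_tests and deploy are
# stand-ins for whatever your project actually uses.
run_tests() { true; }          # stand-in: pretend the suite passed
deploy()    { DEPLOYED=yes; }  # stand-in: record that we "deployed"

DEPLOYED=no
if run_tests; then
  deploy
  echo "tests green - deployed"
else
  echo "tests failed - not deploying" >&2
fi
```

A CI service like GitHub Actions is essentially running this same conditional for you on every push, with the real test suite and deployment hook plugged in.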

Finding my way around my old code is fairly easy too, because it conforms to good practice use of the framework and it is all checked by an automated linter. This makes sure that what I've written is not too esoteric or odd - that is, it looks like the kind of code other people would also produce. This makes it much easier to read in the future and helps if someone else wants to take a look.

So through this, I've changed infrastructure with a simple field change, run tests giving me significant confidence the application is working after I've made a change with a single command (which also checks the code quality) and deployed to the server with another single command. To do all this, I don't really have to remember anything much and can focus on the individual change I need to make.

Now, any developer reading this will tell you the above is really basic in the modern world - and they are right, and also can be taken MUCH further. However, it is very hard to get even this level of rigour into a large technical estate as all this practice takes time - especially if it was not the standard when the code was initially written. But this really basic hygiene can save enormous amounts of time and thus costs over the lifecycle of your service. At work we are going on this journey and, while there is a lot more to do, I'm immensely proud of the progress that the software engineering teams have made driving down our costs and increasing overall development pace.

Basics are important! Always worth revisiting the basics.

Saturday 31 August 2024

Moving office

I don't often directly talk about events at work, but for once I'm going to celebrate something rather cool that's happened. We've moved offices!

Despite the valiant efforts of our estates people, the Macmillan offices in Vauxhall were ... well, terrible. Vauxhall itself is a roundabout with delusions of grandeur and the building was slowly falling apart around us. I do not frequent the office too often, so I was rather surprised during a trustee meeting when the whole building started to shake like the apocalypse had come. Nobody else blinked - this was "normal" to the point of it happening several times a day. The rest of the day gave a glorious demonstration of this and I have no idea how anyone copes, frankly.

So for this and various other reasons it was time to Be Elsewhere. However for us in Tech this meant we had to face the (kinda literal) elephant in the closet - the server room on the premises. This was not a comms cupboard, but a proper server room, with ageing steam-powered servers bolted to the floor, powering the organisation. But this was not a time for panic and fear - instead, we had a fantastic opportunity to take a big step modernising our systems. A golden opportunity to spend a chunk of time significantly moving the dial on our legacy tech debt in the service of a hard deadline which the org needed to hit. We grasped this opportunity, with months of work spent virtualising, reconfiguring, and rebuilding to bring things into a much better state in preparation for the move.

To actually move, we initially had to plan for disabling everything. However, every week of work cleaning up dependencies and updating our overall configuration decoupled more systems, and by the time we came to do the move, the only services we actually disabled were the ones hosted on the machines we had to turn off. This in itself was a huge win, but the move weekend itself was exceptional. I've been involved in lots of tech projects over the years, and many releases, and something always goes wrong and needs correcting at the last minute. We had our share of challenges, but for the week before the move we were having daily meetings in which we were looking at each other wondering what we had missed - things were calm. Then the weekend was so well executed it was almost unsettling. The team didn't exactly stick to published timings, but only because they were so far ahead.

Overall, it was incredibly smooth and not only did this enable our office move, we have finished with our systems in a much better place, either in the cloud or in a proper data centre and better understood, and run by a team with a great deal of (very much earned) confidence. An exceptional result on the back of a lot of hard work - really knocked it out the park.

The second, and far more visible, part is the new office itself. This was clearly much wider than the Technology group, but we had a crucial role in making sure the new premises had an internet connection (which it didn't until quite literally the 11th hour ... worrying times!) and working AV, door controls, room bookings, etc etc. The wider team did an excellent job bringing everything together on time and it is lovely being in a modern office which doesn't shake when the trains go by. In particular (for my post!) I'm going to say the technology is working rather well. The new meeting equipment is very easy to use, with great sound quality and scary cameras which track motion to zoom in on speakers. I wonder if I can mount a nerf gun on one of them...

So yes. Some excellent work here and well worth recording. A great result for Macmillan. For the Technology group, not only did we play our part, we also managed to modernise, increase knowledge, improve resilience and do other great things across our server infrastructure, AND we managed to remove a load of problems with the office AV. As I said at the top, I don't often write about specifics here but I thought I'd make an exception.

And to close, here are some pictures of the new place including the most important part of the new office building - a button which gives hot chocolate milk...

The Forge, Macmillan

Congratulations everyone!

Sunday 28 July 2024

When to mentor

I've been thinking again about mentoring. When is the right point to consider the challenge of mentoring someone? When does one know enough? When should one offer oneself as a mentor, without it coming across as seriously arrogant?

The answer is, of course, never. A mentor is calm, wise, and has seen it all before. They can easily understand everything that could possibly come up, have a very clear plan in place immediately and take a mentee forward through any situation. Does this sound like me / you? Really? Plus, let's face it, if I / you know it then it's pretty obvious and can't possibly be worth offering to someone else.

Or at least that's what The Voices say to me every time I think about this. This is, of course, nonsense.

So what is the real answer? When is the right time to help those with less experience? Now. It doesn't matter what experience you have - it is more than some people. Sure, over time that number will increase and more folk will benefit from hearing from you, but you already know something that is unique and worth telling others. Mid-level developer? Plenty of people coming through the junior levels who need to learn from you. "Only" a junior? Well, there are plenty of people who are just starting out and have no experience at all.

This is before we get to the value of mentoring to you. Similarly to writing a blog, there is a discipline in structuring your thoughts and then talking through ideas and concepts clearly, in a way someone else can understand - and like any form of teaching, one needs an extra level of understanding to be able to talk about a concept in this way. It is essential for a leader to be able to articulate their thinking clearly in order to bring others along with them. It is also very important to be able to think clearly on the fly - such as when people drop awkward topics of discussion on you at any time.

As an aside, I really don't like the term "mentor" - or rather I don't like thinking of someone as "my mentee" because of the implied power dynamic there. I would say I don't have any mentees, but there are plenty of people who would disagree with that.

Ok, so how does one offer mentoring without sounding deeply arrogant? The easiest route, I think, is by offering to a group who are already in a place to be receptive, and maybe linked to individual topics where you know you can claim some expertise. I've recently seen someone I respect offering consultation around salary negotiations. This is a form of targeted mentoring, and in a field in which she is visibly knowledgeable.

As I said above, I already do some mentoring; however, my new year resolutions included giving more back to this industry. So I'm going to do two things.

Firstly, if anyone is reading this and wants a chat about the tech industry - in particular technical leadership, moving from a technical job to a leadership role, the role of technical knowledge in the strategic / leadership space or similar - then please do reach out on LinkedIn or Twitter. I am also open to speaking to groups (which is a whole different post).

Secondly, I'm going to make this same offer in an engineering leadership Slack which is filled with people I don't know. That idea scares me ... we'll see what happens.

An important caveat here. I know there are qualified coaches, mentors and so on. I am not that. I am simply someone who has been around a bit.

Anyway, I'm going to do something here and I challenge you, dear reader, to do so too. The important thing is that there is always something one can offer to others who are looking to learn. And there is always something one can learn from someone else. We all can find value by listening to and learning from each other.

Sunday 23 June 2024

Behold the art

Sometimes I just want to write a post on this blog to mark something that made me happy. This is one of those, so feel free to tune out if you're looking for something something data technology management leadership.

Anyone left? Cool.

For reasons that escape me, I was asked to show some of my photographs at a local exhibition of creativity and art. I have been a keen photographer for many years, but I've never really thought my pictures rise to any kind of display standard. However, others did not share that opinion so I was pushed into creating a display.

It went well!

My photos at the St Swithins art exhibition 2024

And my photos of the whole event are here.

I actually got some really good comments. People loved the theme coming through the writeup and apparently some people got quite emotional when I wasn't there. In addition to the disease itself, the COVID lockdown has left some deep wounds and it does seem weird to me that, unlike other national emergencies (eg the war) we don't really talk about it much. Some of life has changed, some of life has reverted to as it was before. But in the main we're just carrying on. Certainly in my head unless I properly think about it, lockdown was ages ago now and something that happened over a couple of weeks. Clearly, that is absolutely not true, and I find it weird how keen we are collectively to put it to the side. Perhaps this is our way of collectively dealing with the trauma? It seems we should have a national memorial day or something?

Anyway, before this becomes a post about lockdown or COVID, just a few notes on how this was pulled together.

Obviously, the photos were taken years ago. They were part of a wider set (which was on the projector above, and is in the embedded carousel below) depicting light in the darkness of those times. This set is also on Flickr and that creates a slideshow which could be put up on screen. These pics were made into a book years ago, so I got one of the owners of that book (my parents) to pick their favourites and, after a spot of measuring and checking the DPI I calculated they would work at A3 size. Sadly, the local print store has shut down so after a spot of Googling, I took my flash drive to Ryman and was very pleasantly surprised that their photo printing is (to my amateur eye at least) really very good. If you are in Bath and need something printed well, you can do a lot worse!

One Flickr link, six pictures and a short write up combined into what you see above. I was really quite pleased with the outcome and it makes me think I would like to do a bit more of this kind of thing. Of course that means I need to do some more creative work.

Since this blog is usually about the tech industry - I also met someone who is a developer looking at their future in the industry and gave them some sage advice (lol). Seriously, no idea if I said anything valuable or not, but I am always open to opportunities to help those coming through and give back. In fact, there is a post on this coming soon...

And to sign off - here is the full set of images from the display.

Light book - lockdown

Monday 27 May 2024

AI in the charity and healthcare sectors and not leaving people behind

A couple of weeks ago I attended the CIO Digital Enterprise forum and spoke on AI in healthcare and the charity sector. Everyone knows AI is absolutely everywhere and is the solution to every problem in the known universe, and while we are clearly in the upper parts of a crazy hype cycle, unlike recent tech revolutions this one might actually deliver some of its promise to change the game. In this world, it is very important we consider all of society and do not leave people behind, and this was the topic of my fireside chat with Timandra Harkness, who did a wonderful job interviewing me (I was rather nervous!). I thought I'd recap some of what I said here, although I'm not going to bother writing much about efficiency. Everyone knows that at this stage.

Charities and the public sector need to think about customers differently to a business. Where a company like Amazon can focus down to the most profitable users and decide, after analysis of the return on investment, to simply ignore anyone who doesn't own a modern smartphone or a high speed internet connection, this isn't really an option for us. Our mission is to reach everyone, so we need to avoid making decisions that cut out or degrade service for subsets of the population.

Fundamentally, charities exist on trust both for income and service delivery. Income is predominantly donations from people who want to support the cause, and fairly obviously people will not donate to an organisation they do not trust to be good stewards of their money. Similarly, people will only reach out for a service to an organisation they trust. This naturally leads to a more risk-averse approach to anything that can damage that trust.

At Macmillan, we are trying to reach people who are going through one of the worst experiences of their lives, when they are most vulnerable. This is a tremendous privilege and responsibility and we have to take this very seriously, understand where people are coming from and meet them at their place of need. We work with people from all manner of backgrounds. Some are highly educated in the medical field. Some are in cultures where speaking of any illness, let alone cancer, is taboo. Some will reach out to a doctor when feeling unwell. Some mistrust doctors and the wider establishment and will talk to a local community or spiritual leader instead. All these different groups and many more besides deserve access to the best healthcare available when they need it and for many of these people we'll have perhaps one chance to engage with them and build a connection before we're written off as "not for them".

Looking at technology, this means we have to be very very careful when putting in anything that can be a barrier to engagement, and this does not sit well with many of the end-user deployments of AI at the moment. Although the potential is far wider, the discussions around AI usually end up being about cost saving - doing more with less. When talking about user interaction, an obvious option is the chat bot, either web chat or an automated phone responder. These tend to communicate in a very particular way which works for simple information retrieval but lacks warmth and certainly isn't all things to all people. I know I've been turned off from services by being presented with chat bots (in fact, I wrote a post about this some years ago) - and I work in this field and wasn't looking for potentially terrifying medical advice. Chat bots are getting better all the time, but at the moment they certainly do not replace the personal connection one gets from a well trained call responder.

That said, call responders are expensive and their capacity scales linearly, so they need to be deployed carefully. Behind the scenes, there is lots of use for data (and therefore potentially AI) driven optimisation of their time, ensuring good stewardship of donations by making sure phone lines are staffed without being over-staffed. As real-time translation improves, this will also make a huge difference to us. There are a lot of languages spoken in the UK and we cannot possibly maintain a workforce which allows people to speak to us in whatever language they choose. However if and when we can have ongoing translation between our users and our call centre staff, we can communicate in their preferred language, again reaching them in their place of need.

In a similar way, use of AI in semantic site searching is an opportunity to allow people to communicate with us how they choose. In earlier days of the internet, everyone knew someone who was "good at finding things with Google" - this means they could phrase their searches in a way the search engine understood. Any good site tries to make finding content easier through good information architecture and a decent search function, and this can be significantly enhanced with AI. Again, closing the gap with users rather than expecting them to come to us.

Of course, AI-driven chat bots do have a place working out of hours. As long as it is very clear when speaking to a machine rather than a person, and there is clear signposting to when a human is available, it provides a "better than nothing" opportunity for when the phone lines are closed.

This theme also comes through when considering personalisation. In theory, personalisation lets us provide content suitable for you and your needs, which is a great way of helping you find what you want. However, promoting some content inherently means we're demoting other content. Is this the right tradeoff? Ideally, yes, and I'm sure we can tune the site to behave better for a high percentage of visitors. But we're trying to reach everyone and now we're doing maths that trades some people off. If we can provide good personalisation for 99% of our visitors, that means in a period where we see 100,000 visitors we're actively hiding the content 1,000 people need. In all likelihood, those people with "unusual" needs are going to correlate with the people about whom we have less data, and guess which of the above groups that represents...

This is the fundamental danger of data-driven organisations and service design. The underlying data must be understood, including the weaknesses. We know there are many MANY holes in research data across healthcare. You may well have equal access to medical care, but the medical care itself was almost certainly not developed equally and its effectiveness will vary accordingly. There is a lot of work going on to correct this problem (although not enough!) but in the meantime we need to be very alert to not compounding the problem.

This is a useful segue to the last thing I want to put down. We were talking about the future where AI takes us. I had a couple of things to say, but the one I want to replicate here is around the change I hope we will see across the sector. Currently, charities cooperate with other organisations, but each is fairly stand alone. Given the rich, but incomplete (see above) data we are collecting and our resources being tiny when compared with big tech firms, I hope we see "big data" collaboration across charity groups to help spread the costs and fill in data gaps. We need to deliberately find and occupy our places in a wider ecosystem, so we can work together, share and signpost to each other more as a single organism rather than overlapping entities. What that specifically looks like remains to be seen, but this has to be the future and I'm hoping to be a part of it.

And let's close with a picture of me pretending to be smart...

Photo credit to CIO Digital Enterprise forum


Sunday 21 April 2024

Hp 1010 printer on Windows 10

I have an ancient HP 1010 laserjet printer, bought back at university some 5 years ago (lol). Eventually I want to replace this with a wifi printer, but only when this one runs out of toner and so far it's refusing to die. Each time I reinstall my computer, I have to figure out how to make it work again so here is a quick post for future me, or for anyone else who is looking to make an HP 1010 LaserJet work on a Windows 10 machine.

I run Windows 10 64 bit edition and HP haven't produced a driver since Windows Vista. The printer is so old that Windows 10 doesn't automatically detect it as a printer. To keep it going, I have to jump through a few hoops.

First, download the Vista 64 bit driver.

Extract this somewhere.

Then in Device Manager, choose Action -> Add Legacy Hardware. Select "Printers" from the list, and in the port dropdown you'll find a USB printer entry. On the next screen, choose Have Disk and browse to the driver you've just extracted. Select the right printer and it should install.

And voila, the printer will now work from the local machine.

Or, if you've got a Linux machine sitting around, it should just work if plugged in (Ubuntu 22 for me).