Day 19: A few modules to ease working with databases in Raku applications

There’s no single big Raku application I work on regularly at the moment, but there are plenty of smaller ones that I need to do a bit of work on now and then. Nearly all of them involve using a database for persistence; I tend to reach for Postgres. This year I put together a few Raku modules to ease my work with databases on these projects. All of them are available for installation via zef; maybe some will make nice Christmas gifts for others using Raku and databases together.

Just let me develop this thing locally!

How should I run an application I’m developing on my local machine when it uses a database? I could use the system Postgres, creating a database, user, and so forth. Of course, this isn’t a task I do every day, and not even every month, so I have to go and look up the syntax every time. Then what if the Postgres version in the deployment environment is different from the one on my system? Not likely to be an issue often in such a mature product, but maybe some day it’d be a tripping hazard.

Thankfully, container technology has made it super easy to spin up most things at the version of one’s choice. Postgres containers are readily available. That still leaves a little scripting and plumbing to do to get the container up, create the database and user required, and have the environment variables injected before running the application. I’ve done it a bunch of times, in a bunch of ways. It’s easy, but boring.

What’s less boring? Writing a Raku module to take away the repetitive work, of course! Thus Dev::ContainerizedService. Now I need only write a devenv.raku like this:

#!/usr/bin/env raku
use Dev::ContainerizedService;

service 'postgres', :tag<13.0>, -> (:$conninfo, *%) {
    env 'DB_CONN_INFO', $conninfo;
}

This suffices to have a Postgres 13.0 Docker container pulled (if needed), then started up, with a database and user created. The connection details are then injected into the environment for the application; in this case, my application was expecting a Postgres connection string in %*ENV&lt;DB_CONN_INFO&gt;.

I can then run my application via this script:

./devenv.raku run raku -I. service.raku

The Postgres instance is run on the host network, and a free port is chosen, which is just great for when I’ve got a couple of different projects that I’m working on at once. If I’m using Cro and the development runner that restarts the service on changes, that’d be:

./devenv.raku run cro run

By default, it creates a throwaway database each time I run it. If I want to have the database persisted between runs, I need to add a project name and specify that it should store the data persistently:

#!/usr/bin/env raku
use Dev::ContainerizedService;

project 'my-app';
store;

service 'postgres', :tag<13.0>, -> (:$conninfo, *%) {
    env 'DB_CONN_INFO', $conninfo;
}

Sometimes, for debugging reasons, I’ll want a psql shell to poke at the database. That’s available with:

./devenv.raku tool postgres client

See the docs to learn about some of the more advanced features, and how to add support for further services (I’ve done Postgres and Redis, since those are what I had immediate use for).

I want integration tests that hit the database!

Unit tests that stub out the database, perhaps using something like Test::Mock, are all well and good, but do I really think my data access code is somehow going to be perfect? Of course not; it needs testing too.

Of course, that means some plumbing. Setting up a test database. Doing the same in the CI environment. I’ve done it a dozen times before. It’s easy. It’s boring. Why can’t I have a Raku module to take away the tedium?

Well, I could if I wrote it. Thus Test::ContainerizedService, the testy descendant of Dev::ContainerizedService. It’s actually a small wrapper around the core of Dev::ContainerizedService, meaning that one only has to add support for a database or queue in Dev::ContainerizedService, and then it’s available in Test::ContainerizedService too.

Using it looks like this:

use Test;
use Test::ContainerizedService;
use DB::Pg;

# Receive a formed connection string:
test-service 'postgres', :tag&lt;14.4&gt;, -> (:$conninfo, *%) {
    my $pg = DB::Pg.new(:$conninfo);
    # And now there's a connection to a throwaway test database
}

In a nutshell: wrap the tests up in a test-service block, which does whatever is needed to get a Postgres container up and running, and then passes in the connection information. If Docker is not available, the tests are skipped instead.
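Put together, a complete test file might look something like this (a sketch: the gifts table and the assertions are hypothetical, just to show the shape, and I’m assuming the query/execute API of DB::Pg):

use Test;
use Test::ContainerizedService;
use DB::Pg;

test-service 'postgres', :tag&lt;14.4&gt;, -> (:$conninfo, *%) {
    my $pg = DB::Pg.new(:$conninfo);

    # Set up whatever schema the code under test needs (hypothetical table).
    $pg.execute('CREATE TABLE gifts (id SERIAL PRIMARY KEY, name TEXT NOT NULL)');

    # Exercise the data access code and assert on the results.
    $pg.query('INSERT INTO gifts (name) VALUES ($1)', 'socks');
    is $pg.query('SELECT COUNT(*) FROM gifts').value, 1,
            'Inserted row is visible in the throwaway database';
}

done-testing;

Since the database is created fresh for the test run, the assertions never depend on leftover state from a previous run.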

What about migrations?

The two previous gifts are ready for unwrapping this Christmas. I’ve also been working on one that is probably only unwrappable for the adventurous at this point: DB::Migration::Declare.

It’s not the first Raku effort at database migrations – that is, the idea of maintaining an ordered, append-only list of database changes that together bring the database schema up to its current state. The Red ORM has some work in that direction, for those using Red. There’s also a module where one writes the SQL DDL up and down steps, and it applies them. I’ve used it, it works. But inspired by the benefits and shortcomings of Knex.js migrations, which I’ve been using quite a bit this year at a customer, I decided to set about building something sort of similar in Raku.

The idea is relatively simple: use a Raku DSL for specifying the migrations, and have the SQL to put the changes into effect generated. Supposing we want a database table to track the tallest skyscrapers, we could write this:

use DB::Migration::Declare;

migration 'Setup', {
    create-table 'skyscrapers', {
        add-column 'id', integer(), :increments, :primary;
        add-column 'name', text(), :!null, :unique;
        add-column 'height', integer(), :!null;
    }
}

Assuming it’s in a file migrations.raku alongside the application entrypoint script, we could add this code:

use DB::Migration::Declare::Applicator;
use DB::Migration::Declare::Database::Postgres;
use DB::Pg;

my $conn = DB::Pg.new(:conninfo(%*ENV&lt;DB_CONN_INFO&gt;));

my $applicator = DB::Migration::Declare::Applicator.new:
        schema-id => 'my-project',
        source => $*PROGRAM.parent.add('migrations.raku'),
        database => DB::Migration::Declare::Database::Postgres.new,
        connection => $conn;
my $status = $applicator.to-latest;
note "Applied $status.migrations.elems() migration(s)";

At application startup, it will check whether the migration we wrote has been applied to the database yet, and if not, translate it to SQL and apply it.

If a little later we realized that we also wanted to know what country each skyscraper was in, we could write a second migration after this first one:

migration 'Add countries', {
    create-table 'countries', {
        add-column 'id', integer(), :increments, :primary;
        add-column 'name', varchar(255), :!null, :unique;
    }

    alter-table 'skyscrapers', {
        add-column 'country', integer();
        foreign-key table => 'countries', from => 'country', to => 'id';
    }
}

On starting up our application, it would detect that the latest migration had not yet been applied and do so.

DB::Migration::Declare doesn’t just produce schema change SQL from Raku code, however. It also maintains a model of the current state of the database. Thus if my previous migration had a typo like this:

    alter-table 'skyskrapers', {
        add-column 'country', integer();
        foreign-key table => 'countries', from => 'country', to => 'id';
    }

It would detect it and let me know, before it even gets as far as trying to build the SQL:

Migration at migrations.raku:11 has problems:
  Cannot alter non-existent table 'skyskrapers'

It detects a range of such mistakes – not only typos, but also semantic issues such as trying to drop a table that was already dropped, or adding a duplicate primary key.
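For instance, a migration that drops the same table twice would be rejected when the migration list is checked, rather than blowing up against the database (a sketch, assuming a drop-table directive in the same style as the DSL words shown above):

migration 'Tidy up', {
    drop-table 'skyscrapers';
    drop-table 'skyscrapers';   # Caught: the model knows this table is already gone
}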

Development environment, test environment, and migrations

Not quite gold, frankincense, and myrrh, but my hope is that somebody else might find these useful too. Cheers!
