Thomas Pain :: blog

Offloading Work to GitHub Actions

and making my website faster in the process

2022-08-04 :: 498 words

GitHub Actions is a super powerful tool.1

For just over a year, I've been using it to maintain a list of all the articles I've read on the internet. Whenever I finished reading an article, I'd hit a little JavaScript bookmarklet in my toolbar that made a request to an endpoint on my website, which would in turn clone the Git repo for the list, make a commit, and push it back to the origin.
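The per-request Git work looked roughly like this. This is a sketch, simulated locally with a bare repository standing in for GitHub; the file names and commit message are illustrative, not the real repository layout:

```shell
# Rough shape of the Git work the endpoint did on every request, simulated
# locally with a bare repository standing in for GitHub. File names and
# paths here are hypothetical.
set -e
workdir="$(mktemp -d)"
cd "$workdir"

git init --bare origin.git >/dev/null        # stand-in for the GitHub remote
git clone origin.git worktree 2>/dev/null
cd worktree
git config user.email "demo@example.com"
git config user.name "demo"
git checkout -b main 2>/dev/null

echo "https://example.com/some-article" >> articles.txt
git add articles.txt
git commit -m "Add article to reading list" >/dev/null
git push origin main 2>/dev/null             # the slow network round-trip

cd "$workdir" && rm -rf worktree             # the clone was thrown away after each request
```

Cloning from scratch, committing, and pushing on every request is what made each addition take multiple seconds.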

GitHub Actions would then kick in, run some static-site-generating code and publish the output here using GitHub Pages.

This has worked pretty well for the last year, but recently I've noticed the bookmarklet-initiated request beginning to slow down quite considerably. It turns out that this is because of how long it takes to git clone the repository and perform all the required Git operations.

This got me thinking about ways I could speed up the process - perhaps running a background worker on my webserver, or always having a copy of the repository cloned (before, it would all be deleted once the request was done).

While I was thinking about this, I learnt that GitHub Actions workflows can be triggered using a repository_dispatch event, which allows an HTTP request to trigger a workflow and put some input data into the workflow context at the same time.

As an example, I'll create a basic workflow in a test repository called codemicro/test that just echoes our input data back to us. That looks like this:

```yaml
on:
  repository_dispatch:
    types: [test]

jobs:
  thing:
    runs-on: ubuntu-latest
    steps:
      - run: echo "${{ toJSON(github.event.client_payload) }}"
```
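Individual payload fields can also be referenced directly in expressions, rather than dumping the whole object. For instance, a step along these lines (extending the workflow above) would print just one field:

```yaml
      - run: echo "hello = ${{ github.event.client_payload.hello }}"
```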

If we trigger it with the following cURL command (see "Create a repository dispatch event" in the GitHub API documentation):

```shell
curl -X POST \
    -H "Accept: application/vnd.github+json" \
    -H "Authorization: token rhubarbrhubarb" \
    -d '{"event_type":"test","client_payload":{"unit":false,"integration":true,"hello":"world!"}}' \
    https://api.github.com/repos/codemicro/test/dispatches
```

we see the following workflow outputs:

[Image: Test workflow results]

Armed with this knowledge, instead of doing the smart thing and switching to using a local SQLite database, I opted to move all the logic that added to the reading list into a new GitHub Actions workflow that gets triggered remotely, hence hiding all the slow bits.2

Much of the code powering this in my website's codebase got copy-pasted into the reading list's site generator, and I added a little CLI over the top of it. I then wrote a new GitHub Actions workflow using the repository_dispatch trigger and removed the now-redundant code from my website's codebase, instead opting to just have it make a POST request to the GitHub API.
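The resulting workflow looks something like this. The event type, CLI name, and commit step are illustrative sketches rather than copies from the real repository:

```yaml
on:
  repository_dispatch:
    types: [newArticle]   # event_type sent by my website (name hypothetical)

jobs:
  add:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # hypothetical CLI built on top of the site generator
      - run: ./readinglist add '${{ toJSON(github.event.client_payload) }}'
      - run: |
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git commit -am "Add article" && git push
```

With this in place, my website only has to fire one fast HTTP request; the clone, commit, and push all happen on GitHub's runners instead.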

Once the changes were deployed, rudimentary testing showed that an addition to the reading list now takes a grand total of about 500 milliseconds. This is a drastic improvement on the old method, which took about 5-6 seconds per addition, making it roughly ten to twelve times slower.

Complete source code is available for viewing, should you desire.

  1. More generally, so is all CI/CD.
  2. Plus, if it fails for whatever reason, I get an email telling me about it. This is in contrast to an error being quietly logged somewhere that I never look at, which is what would happen if I implemented a background worker myself.