Tom Pain

Pyourls3 - a learning experience

Stuff I learnt and the challenges I faced publishing a package as a package-publishing newbie

My inspiration for Pyourls3 came from another Python package, python-yourls, an 11-year-old API wrapper library for the open-source URL shortener YOURLS. It’s listed on the official awesome list, but being written for Python 2 it’s not much use these days. I set out to create my own package, keeping function names similar to the old library but completely rewriting the internals for Python 3 using the fantastic Requests package that everyone and their dog uses. What resulted was a lightweight, efficient package that is simple and easy to use, and my first published Python package. Since the focus of the project became learning, this post documents the stuff I learnt and the challenges I faced along the way.

Writing the package

Of the entire project, this was the simplest part of all. I’d created packages before, but none of them were ever intended for use by anyone, including myself. I started by creating a list of conceivable exceptions that I might need, and my final list contained 5 individual exceptions, ranging from a base exception that all the others inherit from, to an exception raised when someone tries to shorten a URL that has already been shortened.
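A hierarchy like that can be sketched in a few lines. The class names below are illustrative guesses, not necessarily the ones Pyourls3 actually uses:

```python
# Sketch of the exception hierarchy described above. The names here are
# hypothetical; the real package's exception names may differ.

class Pyourls3Error(Exception):
    """Base exception that all package exceptions inherit from."""

class Pyourls3ParamError(Pyourls3Error):
    """Raised when a required argument is missing or invalid."""

class Pyourls3URLAlreadyExistsError(Pyourls3Error):
    """Raised when attempting to shorten a URL that was already shortened."""
```

Basing everything on one root exception means callers can catch `Pyourls3Error` to handle any package failure, or a specific subclass when they care about the exact cause.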

I went on to create the main package: one main class with 5 methods. Within these, I expanded upon the functionality of python-yourls, offering a method for retrieving overall statistics for the YOURLS installation, alongside the standard offering of shortening, expanding and per-URL statistics methods. These posed little challenge, with the __init__ function doing most of the input validation, since the other methods mostly just submit data to the API. If an invalid argument is passed to one of them, the YOURLS API returns an error, which the method raises as an exception, so there’s little need for further input validation.
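The overall shape looks roughly like this. This is a hedged sketch, not the package’s real API: the class and method names, and the error handling, are assumptions based on the description above and on how the YOURLS API works (one `yourls-api.php` endpoint taking an `action` parameter):

```python
import requests

class Yourls:
    """Illustrative sketch of the wrapper class; names are assumptions."""

    def __init__(self, addr, user, passwd):
        # __init__ does most of the validation; everything else just
        # forwards data to the API and surfaces its errors.
        if not addr or not user or not passwd:
            raise ValueError("addr, user and passwd are all required")
        self.api = addr.rstrip("/") + "/yourls-api.php"
        self.auth = {"username": user, "password": passwd, "format": "json"}

    def _call(self, **params):
        # Submit a request and raise if YOURLS reports a failure.
        r = requests.post(self.api, data={**self.auth, **params})
        data = r.json()
        if data.get("status") == "fail":
            raise RuntimeError(data.get("message", "API error"))
        return data

    def shorten(self, url, keyword=None):
        return self._call(action="shorturl", url=url, keyword=keyword)

    def expand(self, shorturl):
        return self._call(action="expand", shorturl=shorturl)

    def stats(self):
        return self._call(action="stats")
```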


Because this package is publicly available and published on PyPI, it has to be tested, to ensure its functionality and behaviour stay the same between versions. This proved to be more involved than I initially expected, having never done unit testing in Python before (or at all, really). I chose pytest to write my tests, the basics of which are super simple: you create a file whose name starts with “test”, and within it write functions whose names start with “test”, which pytest discovers and runs. If a test function raises an exception (including a failed assert), the test fails; otherwise it passes. This premise is scalable and flexible, which is why pytest is so good.
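A minimal illustration of those conventions (the file and function names are hypothetical):

```python
# test_example.py - pytest collects files and functions whose names start
# with "test". A plain assert is all you need; pytest reports failures
# with full introspection of the failing expression.

def add(a, b):
    return a + b

def test_add():
    assert add(2, 3) == 5

def test_add_strings():
    # + on strings is concatenation, not arithmetic
    assert add("py", "test") == "pytest"
```

Running `pytest` in the project directory finds and executes both functions with no further setup.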

But my package requires an HTTP server to work properly, which poses an interesting question: how do I test my package using pytest if I need a webserver running? I can’t use an actual YOURLS installation, as making it work would mean storing valid admin credentials in the GitHub repository, which I’d rather not do. Not to mention that the webserver would have to stay up forever so that tests could be run at any time. The solution to this fiasco is to create a Flask server. If I can set up a Flask server that runs in the background on localhost and emulates the responses of an actual YOURLS installation, it will be good enough to run tests against.
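One way to do this, sketched under the assumption that a daemon thread is acceptable (the endpoint path, port and canned response below are illustrative):

```python
import threading
import time
import urllib.request

from flask import Flask

app = Flask(__name__)

@app.route("/yourls-api.php", methods=["GET", "POST"])
def fake_api():
    # A canned response standing in for a real YOURLS installation.
    return {"status": "success", "shorturl": "https://sho.rt/abc"}

# Run the dev server in a daemon thread so the main thread (the tests)
# can make requests against it; the thread dies with the process.
server = threading.Thread(
    target=lambda: app.run(port=5005, debug=False, use_reloader=False),
    daemon=True,
)
server.start()
time.sleep(1)  # crude wait for the server to come up

with urllib.request.urlopen("http://127.0.0.1:5005/yourls-api.php") as resp:
    body = resp.read().decode()
```

`use_reloader=False` matters here: Flask’s reloader only works in the main thread, so it has to be off when the app runs in a background thread.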

The first order of business was to make sure it was possible to run a Flask server and at the same time make requests to it from a single Python process. It is. I began writing the webserver code, and soon realised I needed a way to share two pieces of data between two modules, which again took more work than I expected. There is no straightforward way to do it: I tried adding my variable to globals, and even to sys.argv, but nothing worked. This stumped me for quite some time, until I found a user on StackOverflow suggesting importing a common module into each script in use at one time, i.e. the webserver and the test script. By defining some variables in that third module, setting them to None, and then importing the module itself, I was able to modify a variable in one file and immediately read it from another. To leverage this, I created two variables for data interchange:

  • One variable to act as a modifier to alter the response from the webserver (useful for specific tests that relied on malformed JSON being returned, or an error being returned from the server for example)
  • Another variable to store all the post arguments specified in the previous request sent to the webserver. This was used to ensure that the program was sending all the arguments it was meant to be sending.
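The trick looks like this. In the real project the shared module is a separate file; to keep this example self-contained in one script, it is fabricated in-process with `types.ModuleType` (the module and variable names are assumptions):

```python
import sys
import types

# In the real project this would be a separate file, say shared_state.py,
# containing just two assignments; here we build it in-process so the
# example runs standalone.
shared = types.ModuleType("shared_state")
shared.modifier = None       # alters the fake server's next response
shared.last_request = None   # POST args from the previous request
sys.modules["shared_state"] = shared

# --- webserver side ---
import shared_state

def handle_request(form):
    # Record what the client sent so the test can inspect it.
    shared_state.last_request = dict(form)

# --- test side ---
import shared_state as state

handle_request({"action": "shorturl", "url": "https://example.com"})
```

The crucial detail is that both sides `import shared_state` and go through the module object’s attributes. Writing `from shared_state import last_request` instead would copy the binding at import time and never see later updates.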

It worked like a charm. The majority of my tests relied on this in some way, shape or form.

As for the content the test webserver sends back, it’s all predefined responses, with the correct HTTP response codes mixed in where needed. YOURLS uses status codes inconsistently in its API responses, so I used them equally inconsistently (that is, I copied what the real API does).

It’s all well and good running my tests on my local development environment (Windows 10, in a Python 3.7 venv), but I wanted cross-platform compatibility, and I figured I might as well officially support every version that Requests does, which is everything from Python 3.4 onwards. (I did not support Python 2: I started programming in Python well after Python 3 was released and never did anything with Python 2, so I’m staying away from it.)

There was no way I was going to set up three VMs (Windows, Linux and macOS - is it even possible to make an up-to-date macOS VM?) for testing my package, each with three different versions of Python installed. Instead, I turned to Travis CI, an automated testing tool that hooks onto the back of a GitHub repository and runs a predetermined set of tests on every commit. It takes a bit of configuring, but works flawlessly once it’s up and running. It’s also free for open-source projects.

Travis CI is configured using a YAML file in the root directory of the target GitHub repository. Once you add the repository to your Travis account, it begins running tests based on the parameters specified in said YAML file. It took a bit of work, but eventually I was able to get a decent setup working, covering all three supported Python versions on Windows and Linux, and Python 3.7.4 on macOS. If any of these tests ever fails, I get an email telling me about it.
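For a sense of the shape such a config takes, here is a rough sketch of a multi-OS `.travis.yml` from that era; the exact version matrix, package names and paths are assumptions, not the project’s actual file:

```yaml
# .travis.yml - illustrative only, not the project's real config
language: python
os: linux
python:
  - "3.5"
  - "3.6"
  - "3.7"
install:
  - pip install -r requirements.txt
  - pip install pytest
script:
  - pytest
jobs:
  include:
    # Travis had no native Python support on macOS or Windows at the
    # time, so those jobs install Python themselves.
    - os: osx
      language: generic
    - os: windows
      language: shell
      before_install:
        - choco install python --version 3.7.4
        - export PATH="/c/Python37:/c/Python37/Scripts:$PATH"
```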


At this point, I got to enjoy writing some documentation, which was absolutely my least favourite part of the entire project. I used MkDocs to generate my documentation from Markdown files (and more YAML), along with BOOTSTRA.386, a weird retro theme for Bootstrap that also works with MkDocs. The writing process is as simple as writing a load of Markdown, adding some basic entries to a YAML file, and running one command. This produces a folder of HTML and CSS that you just drag and drop into your (actual) webserver, and it just works, which is nice.
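That YAML file is `mkdocs.yml`; a minimal sketch might look like the following, where the page names and the theme identifier are assumptions rather than the project’s real config:

```yaml
# mkdocs.yml - illustrative layout
site_name: Pyourls3
nav:
  - Home: index.md
  - API Reference: api.md
theme: bootstrap386   # hypothetical id for the BOOTSTRA.386 MkDocs theme
```

With this in place, `mkdocs build` generates the static `site/` folder of HTML and CSS mentioned above.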

Packaging and publishing

If you’re going to bother making a Python package, you might as well put it on PyPI - it’s the standard for package distribution within Python, and pip makes installing PyPI packages super easy.

To begin with, you have to install the Wheel and Twine packages, and update the Setuptools package that’s included with Python (don’t uninstall it, because doing so completely breaks pip, and you can’t even reinstall it, since pip uses Setuptools in every package installation - I learnt this from experience). You’re then going to have to write a setup file, which details basic information about the package - its version, a short and long description, the maintainer’s name and email address, the package’s web page, its dependencies and a few other bits and bobs. It again takes some tweaking, but given time it can be perfected. I found it helpful to look at a few other packages’ files to see what was going on in theirs, and nick little snippets in the process.

You’re then ready to build wheels and source archives (python setup.py sdist bdist_wheel). This spits out two files into a “dist” directory (one a .whl, the other a .tar.gz), which can then be uploaded to PyPI using Twine (twine upload dist/*). Again, it’s as simple as running one command, sticking your PyPI credentials in, and it uploads your package for you, giving you a URL at the end where you can see your package on the PyPI website.

I created a simple batch script that automated deleting old and creating new wheels, tarballs and documentation, as well as uploading the new wheels to PyPI, because I can’t be bothered to remember three commands.

The very last step for me was submitting a pull request on awesome-yourls to get my package listed on the awesome list (whose readme.html is included with every YOURLS installation, as well as serving as their homepage).

You can view the project here: