Professional Goals for 2021

Yesterday I posted my personal goals for 2021. During the course of drafting that post, I realised I had some goals that were very specifically relevant to my career as a software developer, and therefore my ability to earn income, and some that were not.

Those that were specific to my career, and either directly or indirectly affect my ability to earn income, are probably the ones that I’m more likely to focus on and complete, so I thought it might be useful to separate the two.

If nothing else it provides me two “recap” posts to write towards the end of the year. ;-D

Test Driven Development

In 2019 and 2020 I made a conscious choice to learn how to write proper test suites for my code. I’m still not an expert, but I’m finally comfortable enough with PHPUnit, the Laravel and WordPress plugin testing environments, and the Codeception testing framework, that I have no more reason not to write tests for all of my code.

In 2021 I want to focus on improving my Test Driven Development (TDD) practice. My goal is twofold: I want to make sure that any new code I write first has tests in place for the expected outcomes, and I want to make sure that any existing code I touch either already has tests in place, or gets them added before I make any changes.

My developer mantra for 2021 is “testing, testing, testing”.

PHP Certification

I’ve had the Zend PHP Certification on my radar for years now. Last year I purchased the study guide, and so this year I’d like to try and complete the certification. It probably doesn’t mean too much in the general scheme of things, but I’ve already learned a few new bits of information from the first few chapters, and it would be nice to add to my professional credentials.

Learn Python

Python is a general-purpose programming language I’ve wanted to learn better for the past two years. There’s a lot of code out there written in Python, much of it in open source projects I’m interested in, but it also has direct relevance to my position at Castos. I’ve started a few Python courses, so this year I’d like to pick one, finish it, and actually be comfortable hacking on some Python code.

CS50’s Introduction to Computer Science

Back when I studied programming, the college where I completed my diploma advertised that the 2 year diploma course, once completed, would be equivalent to the first two years of a 3 year BSc degree in Computer Science. This meant we could take that final year at university, and complete the degree.

Sadly the college didn’t actually keep up to date with the requirements to allow this, so when I enrolled, I wasn’t able to take that option.

I’d still love to properly learn the fundamentals of a computer science degree. Harvard’s free CS50 Introduction to Computer Science course is a great way to do this, so I’d like to try and complete it during the course of this year.

This doesn’t have a direct impact on my earning potential, but I do feel the fundamentals I can learn from this will help with my daily problem solving.

Moar Writing

This is one of the few areas where my personal and professional goals are similar. I thoroughly enjoy writing, possibly sometimes more so than development. 2020 was not a good year for me for professional writing gigs. My freelance writing through Skyword essentially stopped at the end of 2019, and I’ve not had the time to actively write for other online publications. So in 2021, I will be looking for more opportunities for paid writing.


Personal Goals for 2021

I’ve never really been the type of person who writes “year in review” posts. Mostly this is because I don’t have any ventures that would be interesting to report yearly results on. What I have done in the past was to set some personal goals for the coming year, which I managed to do for 2016 and 2017. For some reason, I skipped setting goals for 2018 but picked it up again for 2019. I also didn’t do any goal setting for 2020, which turns out might not have been the worst idea, given how the year ended up turning out!

If I think about it, one of the possible reasons I skipped out on public goal setting for 2018 and 2020 was the fact that both 2017 and 2019 were quite busy, so I didn’t really have time to set any specific plans for the year to come.

This year I do have some very specific goals for things I want to achieve in 2021, so this post serves as a record for those goals.

It’s worth noting that these are very much personal goals, and fall outside of what I want to achieve professionally in my career as a developer. That’s a whole other blog post…

Write a book

I’ve had the idea to write a book for about 2 years now. My initial attempts to start a book on working with freelance developers didn’t make it past chapter 1. That was partly because, over the course of the last two years, I’ve moved away from freelancing. I think my best option here is to focus on a technical book, something which I’m very comfortable with, and something that I think people will actually pay money for.

I already have one idea, based on a talk I gave at a conference a few years ago. It won’t be the longest book in the world, so I’m not expecting it to make millions, but if it goes well it should inspire me to create more technical content. I’ll probably look at writing it “in public” so to speak, using this blog. What I may do is write the outline and introductory chapter(s) as a couple of publicly published blog posts, and then invite folks to pre-purchase access to the rest of the content, as it’s being written. I’m not entirely sure if this will work but I have a feeling that knowing folks have prepaid for the final book, meaning they want to read it, will help me focus my efforts towards finishing it.

Plugin Testing Course

Earlier this year I came up with the idea of creating a WordPress plugin testing course. I even got as far as creating the outline for the course, but that’s as far as I got, for two reasons. Firstly, I discovered a fellow WordPress developer has already released a similar course (he kindly invited me to take his course and give him my feedback, which I’ve embarrassingly not had the chance to do yet, sorry Fränk). Secondly, the global pandemic had an effect on the amount of time I could spend on non-income-generating activities.

This is still something I would like to look into, perhaps from a different perspective, in the new year.

Open Sourcery Podcast

I initially launched the WP Hacker Cast on a whim, with no real plan other than trying to connect with fellow WordPress folks. Last year I didn’t really have much success with it, but I did at least manage to interview someone I’ve been dying to talk to since it began.

In 2021 I plan to rebrand the podcast and move the focus away from just being about people in WordPress, to being about all folks involved in open source projects. There are some very interesting people I follow on social media whom I’ve never been able to invite to the podcast, because of its WordPress slant. By switching to a more general focus on open source, I hope that I’ll not only be able to interview interesting WordPress folks, but folks from all walks of life in our wider open source community.

WP Notify

Speaking of open source, this year I am looking forward to being able to put more time into WP Notify, the WordPress notifications feature project I’ve been working on for almost a year and a half. Over the course of last year, I struggled to find a way to put regular, scheduled time into it, so for 2021 I plan to dedicate at least one hour per week to make sure the project is moving forward.

Moar blogging

In 2020 my blogging efforts suffered. A large part of this is that I didn’t really have a lot of solid ideas to talk about. I tend to prefer to blog about things that might be interesting to my handful of readers, be it general tech content, development tutorials, or personal updates that might interest folks. My hope is that with the goals I’m setting this year, I’ll be doing more interesting things, and therefore have more to blog about.


Disable the Touch pad on an Ubuntu Laptop When an External Mouse Is Connected.


One of my pet peeves when working on a laptop is the position of the touch pad relative to my right hand. Probably due to my large hands and the way I rest my palms when typing, the part of my palm at the base of my right thumb almost always makes contact with the touch pad. So when I’m doing lots of typing, either long-form writing or coding, I inevitably brush the touch pad, and the mouse cursor shoots all over the place. If I’m very unlucky, it might even copy/paste some text in the process.

For this reason, I always have an external mouse attached when using my laptop for anything other than browsing. That means I need a way to ensure that the touch pad is disabled when the external mouse is attached.

For the longest time I’ve been using the open source touchpad-indicator app, which has always just worked. Unfortunately, since Ubuntu 20.04 the app no longer works 100%: it installs and automatically loads, but the icon doesn’t appear in the taskbar, and any attempts to configure it from the command line are unsuccessful.

While I can’t blame the developer of this app for not keeping things up to date, it does mean I have to find an alternative solution. Fortunately I stumbled across one, using the dconf-editor, on the AskUbuntu forums.

Installing dconf-editor is as easy as running:

sudo apt install dconf-editor

And then running dconf-editor from the command line.

The editor exposes MANY configuration settings, which, if you don’t know what you are doing, could quite easily bork your system. It does warn you about this, so you’ve been told!

Once opened, browse to the following settings:

/org/gnome/desktop/peripherals/touchpad/

Under the Custom value setting for the send-events item, change it from enabled to disabled-on-external-mouse.

Close the editor, restart, and you should be good to go. I tested this with a Bluetooth mouse, and it works perfectly first time.
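If you’d rather skip the GUI entirely, the same setting can be changed from the terminal with gsettings. This is a sketch assuming a GNOME-based desktop (which standard Ubuntu 20.04 is):

```shell
# Disable the touchpad whenever an external mouse is plugged in (GNOME desktops)
gsettings set org.gnome.desktop.peripherals.touchpad send-events disabled-on-external-mouse

# Verify the new value took effect
gsettings get org.gnome.desktop.peripherals.touchpad send-events
```

This flips the same key that dconf-editor exposes, just without the risk of wandering into other settings.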


Setting Up a Matrix Server on Ubuntu 20.04 – Part 2

In part 1 of this tutorial, I dived into my reasons for setting up a Matrix Synapse homeserver, and how to set up the basics of the server and its required software. In part 2, I’m going to register an admin account, log into the server using the online chat client to verify it’s all working as it should, and migrate the database engine from SQLite to Postgres.

A quick reminder, all server commands are run as the root user.

Register an admin user account

Installing the Matrix Synapse server software also installs a few executables that can be used for specific tasks. register_new_matrix_user is one such executable; it allows you to register a new user from the command line of the server.

register_new_matrix_user -c /etc/matrix-synapse/homeserver.yaml http://localhost:8008

The process will ask you for a user name, password, and whether to make this user an admin or not.
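If you’d rather script this step, the same executable accepts the details as flags instead of prompting for them (the username and password below are just examples):

```shell
# Non-interactive registration: -u sets the username, -p the password,
# and -a makes the new user a server admin.
register_new_matrix_user -c /etc/matrix-synapse/homeserver.yaml \
    -u admin -p 'a-strong-password' -a http://localhost:8008
```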

Once you’ve created the user, you can log in to the homeserver via the hosted Element web client, or download Element for your operating system and sign in from the desktop client.

Either way, to sign in to your server, click on the Change link, next to “Sign in to your Matrix account on”.

Enter the server domain as the homeserver URL and click Next.

Enter your user name and password and click Sign in.

Switch Database Engine

While Matrix Synapse ships with SQLite by default, the official documentation suggests this only for testing purposes or for servers with light workloads. Otherwise, it’s recommended to switch to Postgres, as it provides significant performance improvements thanks to its superior threading and caching model and smarter query optimiser, and allows the database to be run on separate hardware.

Install Postgres

First step is to install Postgres on the server.

apt install postgresql postgresql-contrib

We then need to create a user for Synapse to use to access the database. We do this by switching to the postgres system user and running the createuser command:

su - postgres
createuser --pwprompt synapse_user

The createuser command will ask you for a password, and create the synapse_user with that password.

Now we can create the database, by logging in to the Postgres database server while operating as the postgres system user.

psql

Once logged in, we can create the database and assign it to the synapse_user. Synapse requires a database with a C locale, created from template0:

CREATE DATABASE synapse
    ENCODING 'UTF8'
    LC_COLLATE='C'
    LC_CTYPE='C'
    template=template0
    OWNER synapse_user;

Then we need to allow the synapse_user to connect to the database. The Postgres docs for Synapse talk about possibly needing to enable password authentication, but I found that by default this was already enabled, so all I had to do was add the relevant line to the pg_hba.conf file. I wasn’t sure how to find the pg_hba.conf file on my system, but this Stack Overflow thread explained what commands I could use when logged into the Postgres database server.

show hba_file;

My pg_hba.conf file was located at /etc/postgresql/12/main/pg_hba.conf.

Towards the bottom of the file, I added the Synapse local connection section to allow the synapse user access to Postgres.

# Database administrative login by Unix domain socket
local   all             postgres                                peer

# TYPE  DATABASE        USER            ADDRESS                 METHOD

# "local" is for Unix domain socket connections only
local   all             all                                     peer
# IPv4 local connections:
host    all             all             127.0.0.1/32            md5
# Synapse local connection:
host    synapse         synapse_user    ::1/128                 md5
# IPv6 local connections:
host    all             all             ::1/128                 md5
# Allow replication connections from localhost, by a user with the
# replication privilege.
local   replication     all                                     peer
host    replication     all             127.0.0.1/32            md5
host    replication     all             ::1/128                 md5

Because line order matters in pg_hba.conf, the Synapse Postgres docs make a point of the fact that the Synapse local connection needs to be just above the IPv6 local connections.
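Since PostgreSQL uses the first matching line in pg_hba.conf, it’s worth sanity-checking the order after editing. Here’s a small (hypothetical) helper that confirms the Synapse rule appears before the generic IPv6 catch-all; the path is just the one from my system:

```shell
# Returns success if the Synapse rule precedes the generic ::1/128 rule in the
# given pg_hba.conf, since PostgreSQL applies the first matching line it finds.
hba_order_ok() {
    file=$1
    # Line number of the first rule mentioning synapse_user
    synapse_line=$(grep -n 'synapse_user' "$file" | head -n1 | cut -d: -f1)
    # Line number of the first ::1/128 rule that is NOT the Synapse one
    ipv6_line=$(grep -n '::1/128' "$file" | grep -v synapse_user | head -n1 | cut -d: -f1)
    [ -n "$synapse_line" ] && [ -n "$ipv6_line" ] && [ "$synapse_line" -lt "$ipv6_line" ]
}

# Usage (path as found via `show hba_file;` above):
# hba_order_ok /etc/postgresql/12/main/pg_hba.conf && echo "order looks good"
```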

Migrating the SQLite data to the Postgres database

In order for the data to be migrated from the SQLite database to the PostgreSQL database, we need to use the synapse_port_db executable, which requires that the homeserver.yaml file includes the correct server_name. So edit the homeserver.yaml, and set the server_name to your domain name from part 1.

nano /etc/matrix-synapse/homeserver.yaml
server_name: ""

The next step is to make a copy of the homeserver.yaml file, in preparation for the Postgres setup.

cp /etc/matrix-synapse/homeserver.yaml /etc/matrix-synapse/homeserver-postgres.yaml

Then, edit the new /etc/matrix-synapse/homeserver-postgres.yaml file, so that the database settings point to the newly created Postgres database.

database:
#  name: sqlite3
#  args:
#    database: /var/lib/matrix-synapse/homeserver.db
  name: psycopg2
  args:
    user: synapse_user
    password: dbpassword
    database: synapse
    host: localhost
    cp_min: 5
    cp_max: 10

Make sure that the newly created /etc/matrix-synapse/homeserver-postgres.yaml file is owned by the correct system user.

chown matrix-synapse:nogroup /etc/matrix-synapse/homeserver-postgres.yaml

The next step is to copy the SQLite database, so that we can import from the copy, and give that copy the correct permissions.

cp /var/lib/matrix-synapse/homeserver.db /var/lib/matrix-synapse/homeserver.db.snapshot
chown matrix-synapse:nogroup /var/lib/matrix-synapse/homeserver.db.snapshot

We should then stop the matrix-synapse server, before we run the import.

systemctl stop matrix-synapse

Now we can use the synapse_port_db command to import the data from SQLite to Postgres, using the SQLite snapshot, and the Postgres enabled homeserver-postgres.yaml.

synapse_port_db --curses --sqlite-database /var/lib/matrix-synapse/homeserver.db.snapshot --postgres-config /etc/matrix-synapse/homeserver-postgres.yaml

Once the import successfully completes, we can back up the current SQLite-enabled configuration.

mv /etc/matrix-synapse/homeserver.yaml /etc/matrix-synapse/homeserver-old-sqlite.yaml

Finally, we set the Postgres enabled yaml as the default.

mv /etc/matrix-synapse/homeserver-postgres.yaml /etc/matrix-synapse/homeserver.yaml

At this stage, it’s a good idea to make sure that the new /etc/matrix-synapse/homeserver.yaml file has the right permissions set, and if not, set them correctly.

chown matrix-synapse:nogroup /etc/matrix-synapse/homeserver.yaml
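If you end up doing this migration more than once (say, on a test server first), the steps above can be collected into a single script. This is just a sketch of the commands from this post, assuming the same file locations; review it before running, as it stops the Synapse service:

```shell
#!/bin/bash
# Sketch of the SQLite -> Postgres migration steps from this post.
# Assumes /etc/matrix-synapse/homeserver-postgres.yaml already points at Postgres.
set -euo pipefail

# Snapshot the SQLite database and fix its ownership
cp /var/lib/matrix-synapse/homeserver.db /var/lib/matrix-synapse/homeserver.db.snapshot
chown matrix-synapse:nogroup /var/lib/matrix-synapse/homeserver.db.snapshot

# Stop Synapse while the data is ported
systemctl stop matrix-synapse

# Port the data across from the snapshot
synapse_port_db --curses \
    --sqlite-database /var/lib/matrix-synapse/homeserver.db.snapshot \
    --postgres-config /etc/matrix-synapse/homeserver-postgres.yaml

# Swap the configs, fix permissions, and bring Synapse back up
mv /etc/matrix-synapse/homeserver.yaml /etc/matrix-synapse/homeserver-old-sqlite.yaml
mv /etc/matrix-synapse/homeserver-postgres.yaml /etc/matrix-synapse/homeserver.yaml
chown matrix-synapse:nogroup /etc/matrix-synapse/homeserver.yaml
systemctl start matrix-synapse
```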

And that’s it: log in to your Matrix server via the web client or desktop client, and if you’ve completed the import correctly, everything should be working as before.

That concludes the main set up requirements to have a working Matrix homeserver.

I do however plan to have two follow-up posts. In the first, I’ll dive into some of the quirks of signing into and verifying additional clients, once your initial sign-in has been successful.

For the second, I’ll be adding some testers to the mix, and I’ll document any configuration changes I need to make to get things usable as a community communication platform. This one might take a bit longer, as it relies on other folks testing the platform, giving me feedback, and me tweaking the server until it’s a workable solution.


Setting Up a Matrix Server on Ubuntu 20.04 – Part 1

For about the past six months or so, I’ve been interested in the open, decentralized communications standard called Matrix. In May of this year, it was announced on TechCrunch that Automattic had invested $4.6 million into the company behind the Matrix standard. The company in question, New Vector (now rebranded as Element), also develops an open-source chat client that runs on the Matrix standard and rivals Slack. This is the part that made me sit up and take notice. Matt Mullenweg is the CEO of Automattic and the co-founder and BDFL of WordPress. His decision to invest in this open-source communications standard, which offers an alternative to Slack, could very well mean that in the near future both the internal communication at Automattic and that of the WordPress project could end up running on Matrix.

For some time now, the other admins of the WP South Africa Slack community and I have been considering switching away from Slack. The main reason is that Slack doesn’t offer open source communities any premium plan features, so unless we’re prepared to pay $8 USD per person for our 1000+ members, we’re stuck with the free version. It also means we don’t “own” our open source community’s communication channels. If Slack ever dropped the free plan, which could happen, we’d be stuck up the proverbial creek.
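To put a rough number on that, taking the $8 per person figure as a flat monthly rate for every member (an assumption; real plans bill per active user):

```shell
# Back-of-the-envelope Slack cost for a 1000-member community at $8/user/month
members=1000
per_user=8
echo "$((members * per_user)) USD per month"
echo "$((members * per_user * 12)) USD per year"
```

That’s an $8,000-a-month bill for a volunteer-run open source community, which makes the case for self-hosting fairly quickly.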

I decided it was time I took a look at what it would take to get a Matrix homeserver set up, what the pros and cons would be, and if it would be possible to migrate all our users over to our own homeserver.


In order to set up a Matrix homeserver, I would need a web server instance to host it. Fortunately, I have a Digital Ocean account, so spinning up a new basic VPS droplet with Ubuntu 20.04, 1GB of RAM, 1 vCPU, and 25GB of storage at $5 a month was a click of a few buttons. The other requirement was to have a public domain pointing to the IP address of the server. I asked Hugh, who manages the domain, to set up an A record in the DNS to point the subdomain to the new server IP address. You can also use a regular top-level domain, but if you already have a domain (for example, for your community website), using a subdomain for the Matrix server means not needing to purchase a new one.

Initial set up

A note on commands: I’m running all the commands for this server as the root user. If you have access to your web server via another user with sudo privileges, I suggest switching to the root user; it will just make everything easier.

sudo su

The first step with a new server is to make sure the server software is up to date.

apt update && apt upgrade

Then, I needed to ensure that any package dependencies for the matrix-synapse server software are installed.

apt install lsb-release wget apt-transport-https

While there are official matrix-synapse packages in the Ubuntu repositories, the matrix-synapse docs recommend using their own packages. To do that I first had to enable those packages, by adding the GPG key and the matrix-synapse repository for a Debian/Ubuntu-based system.

wget -qO /usr/share/keyrings/matrix-org-archive-keyring.gpg https://packages.matrix.org/debian/matrix-org-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/matrix-org-archive-keyring.gpg] https://packages.matrix.org/debian/ $(lsb_release -cs) main" | tee /etc/apt/sources.list.d/matrix-org.list

Once the repository is set up, I can get the latest package updates, which will now include the Matrix ones, and install the matrix-synapse homeserver software.

apt update
apt install matrix-synapse-py3

During the install, I enter the chosen domain (in my case our subdomain) as the server name. I can also choose not to send Anonymous Data Statistics, but that’s entirely up to you.

Once it’s all installed, I start the matrix-synapse server, and enable it to auto-start on system boot.

systemctl start matrix-synapse
systemctl enable matrix-synapse

Configure Matrix Synapse

Firstly, I needed to generate a random string which is used as the Matrix Synapse registration secret. This I did using the following command.

cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1

The server outputs the random key, which I copied and saved somewhere safe for later.


No, this isn’t the key for our Matrix homeserver! 🙂

The next step is to edit the homeserver.yaml configuration file, which is in the /etc/matrix-synapse directory. For this tutorial, I’m using nano, but you can use whatever CLI text editor you prefer.

nano /etc/matrix-synapse/homeserver.yaml

I searched for and changed the registration_shared_secret entry, and used the randomly generated key created earlier. It’s important to note that the key should be enclosed by double-quotes.

registration_shared_secret: "YC9h8djIiCaCinWxkE2zt9cgvXYGX25P"

I then saved and closed the homeserver.yaml file.
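If you want to automate this, generating the secret and writing it into homeserver.yaml can be done in one step. This sed-based sketch assumes the stock registration_shared_secret line from the default config:

```shell
# Generate a 32-character alphanumeric secret (same pipeline as above) and
# write it, double-quoted as Synapse expects, into the given homeserver.yaml.
set_registration_secret() {
    config=$1
    secret=$(tr -dc 'a-zA-Z0-9' < /dev/urandom | fold -w 32 | head -n 1)
    # Replace the (possibly commented-out) registration_shared_secret line
    sed -i "s/^#\{0,1\}registration_shared_secret:.*/registration_shared_secret: \"$secret\"/" "$config"
}

# Usage:
# set_registration_secret /etc/matrix-synapse/homeserver.yaml
# grep registration_shared_secret /etc/matrix-synapse/homeserver.yaml
```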

Then, I restarted the matrix-synapse service, which will apply the new configuration.

systemctl restart matrix-synapse

Generate SSL

My next step was setting up the web server software Nginx as a reverse proxy for the Matrix Synapse server. Before I could do that, I needed to generate an SSL certificate, to make sure the traffic to and from the server is secure. This can be accomplished using a Let's Encrypt certificate, for which I needed to install certbot.

apt install certbot

Once certbot is installed, I generated the SSL certificate, using my email address, and the subdomain we have pointing to the IP address of the webserver.

certbot certonly --rsa-key-size 2048 --standalone --agree-tos --no-eff-email --email you@example.com -d matrix.example.com

Once this completed, the SSL certificate and chain were saved at /etc/letsencrypt/live/matrix.example.com/fullchain.pem, and the SSL key file was saved at /etc/letsencrypt/live/matrix.example.com/privkey.pem. This information will become useful when I set up Nginx in the next step.
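One thing to keep in mind with the --standalone method: certbot needs port 80 free when the certificate renews, and once Nginx is running it will be occupying that port. A hedged sketch of the renewal command (certbot normally installs a timer or cron entry that runs renewals for you, so these hooks can also be set in the renewal config):

```shell
# Renew standalone certificates, freeing port 80 for certbot's temporary
# web server by stopping Nginx around the renewal.
certbot renew --pre-hook "systemctl stop nginx" --post-hook "systemctl start nginx"
```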

Setup Nginx as a Reverse Proxy

Setting up a reverse proxy allows Matrix clients to connect to your server securely through the default HTTPS port (443) without needing to run Synapse with root privileges. So first I install Nginx.

apt install nginx

Once installed, I create a virtual host file, to manage the incoming connections.

nano /etc/nginx/sites-available/matrix

The virtual host file configuration redirects all traffic from port 80 (HTTP) to port 443 (HTTPS), configures the SSL port with the certificates created earlier, proxies any requests to the /_matrix endpoint on the domain through to the Matrix Synapse server, and configures port 8448, which is used by the Matrix Federation APIs to allow Matrix homeservers to communicate with each other.

server {
    listen 80;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;

    ssl_certificate /etc/letsencrypt/live/matrix.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/matrix.example.com/privkey.pem;

    location /_matrix {
        proxy_pass http://localhost:8008;
        proxy_set_header X-Forwarded-For $remote_addr;
        # Nginx by default only allows file uploads up to 1M in size
        # Increase client_max_body_size to match max_upload_size defined in homeserver.yaml
        client_max_body_size 10M;
    }
}

# This is used for Matrix Federation
# which is using default TCP port '8448'
server {
    listen 8448 ssl;

    ssl_certificate /etc/letsencrypt/live/matrix.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/matrix.example.com/privkey.pem;

    location / {
        proxy_pass http://localhost:8008;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}

Once the file is saved and closed, I can enable the Nginx virtual host, and test to make sure there were no issues.

ln -s /etc/nginx/sites-available/matrix /etc/nginx/sites-enabled/
nginx -t

Finally, I restart Nginx, and enable it to start at system boot.

systemctl restart nginx
systemctl enable nginx

UFW Firewall

With the basics set up, it would now be a good idea to add some firewall rules, to ensure the rest of the server ports are locked down. I allow the ssh, http, and https services, plus TCP port 8448, through the UFW firewall using the commands below.

for svc in ssh http https 8448
do
  ufw allow $svc
done

After the rules are added, I enable the firewall and check the firewall rules, using these two commands.

ufw enable
ufw status numbered

At this point, I usually log into the server via SSH in a new terminal just to be sure I’ve not locked myself out of SSH access.

Test the Matrix Synapse homeserver

If everything is set up correctly, at this point you should be able to browse to a federation tester such as federationtester.matrix.org, enter the server domain, and check whether a successful call can be made to your homeserver.

If you see a Success message, congratulations! You’ve successfully set up your Matrix Synapse server. However, there’s still more to be done, which we will dive into in part 2 of this tutorial.


OBS Studio 26.1.0 for Linux – Now with Virtual Camera Support.

Some time ago I discovered OBS Studio as a solution for recording internal screencasts for tutorials and workshop videos. As a Linux user, I was pleased to find that it is open source and available for Linux. It lags a bit when it comes to releasing new features vs the Windows version, but it’s perfect for what I need.

Recently I was trying to see if the Linux version supported the Virtual Camera option that the Windows version did, and at the time it sadly did not. Thanks to an in-depth tutorial by J. B. Rainsberger, I was able to configure a working virtual camera, but for some reason I could never get it to work reliably, and had to reinstall things every time I wanted to use it.

Fortunately, since the latest release, OBS Studio for Linux now includes support for a Virtual Camera.

All I needed to do to enable it, was to install v4l2loopback-dkms using this command:

sudo apt install -y v4l2loopback-dkms

I initially couldn’t find where to enable the Virtual Camera, until I saw the button under my Stream/Recording Controls.

After clicking “Start Virtual Camera” I fired up my Zoom application, and lo and behold, there it is as an option for my camera feed.

I’m pretty happy about this, I use OBS quite a bit for recording walk-through videos for the development and support teams at Castos, so this is going to save me a lot of time and stress.


Building A New AMD Powered Workstation

If there’s one hobby that I have that I don’t get to spend much time on, it’s building/upgrading PCs.

A few years ago I upgraded my 10-year-old workstation/gaming PC, to something a bit more modern. At the time I was working with a fairly limited budget, and so I had to make some concessions around what parts to purchase for the upgrade.

During the 10 years since the original build, I had added a 128 GB SSD boot drive and two additional 1 TB HDD storage drives. So when I upgraded, I opted to merely improve the PC internals: the motherboard, CPU, memory, and graphics card. The plan was to use this as both a workstation PC and a gaming PC, and keep my laptop for remote work/conferences. During the upgrade, I discovered that I had a spare 128GB M.2 SSD from my then laptop that I could use as a secondary boot drive. So I ended up with a dual-boot Windows (for gaming) and Ubuntu (for work) machine.

I’ve been using it this way successfully for the past two years, but over the course of the last year, a few things became clear to me.

Firstly, while the 128 GB M.2 SSD was nice and fast as a boot drive for Ubuntu, it wasn’t enough space to keep all my work-related files on, so I had to purchase an additional 1TB storage drive, move my work files there, and symlink them all up to my boot drive. This meant that indexing new projects in PHPStorm could often be painfully slow.

Secondly, Steam Play, and especially the work being done on the Proton tool, was getting REALLY good. It’s gotten to the point where most modern triple-A games run well under Proton, or require only a few tweaks here and there to run smoothly. Even Star Wars: Jedi Fallen Order (my favourite game of 2019) got to Gold level status on ProtonDB, even with the required EA Origin account sign-in nonsense.

Thirdly, and a little selfishly, I wasn’t actually getting much time to game. Call me stupid, but as it turns out, the plan of having a gaming PC that would double as my workstation, while sounding like an amazing idea (gaming in my breaks during work hours, woohoo!) didn’t quite work out. In the two years since I upgraded the PC, the only game I actually managed to play through was the aforementioned Jedi Fallen Order, and that was only because I took the PC home and played in the evenings during my year-end leave.

With these realizations, I spent the latter part of 2019 and the rest of 2020 putting some money aside for a new build. The new PC would remain at the office, and the older, upgraded one would come home, giving me the ability to work and game at both locations. Over time I will probably only need to upgrade specific parts of the new machine to stay up to date, and then the parts they replace could be moved to the older one. By the time November rolled around, I’d saved enough to buy the parts for a modest mid-range build, with a decent upgrade path for future changes. Given that 2020 turned out to be the year it was, I decided I would like to end the year on a happy note.

A note on naming. I used to always call my workstation/gaming PC “Psyrig”, a portmanteau of my then online nick (Psykro) and the word “rig” (from the term gaming rig). As I got older I’ve dropped that name, and simply called it my workstation/gaming PC. Now that I have two, with different sets of parts, I’m going to have to think up some new names.

The parts

After much online research, I finally settled on the following parts:

  • AMD Ryzen 5 3600 CPU
  • Asus TUF GAMING B550M-PLUS (WIFI) Motherboard
  • Samsung 970 EVO Plus 500GB NVMe SSD
  • Gigabyte GeForce GTX 1660 Ti OC 6GB Graphics Card
  • Cooler Master MWE GOLD 650W ATX PSU
  • Cooler Master Masterbox K500L ATX case

The motherboard was the most important part of the build. I wanted something that would be solid now, but have a decent upgrade path. The Asus board supports both the current-gen AMD Zen 2 CPUs, as well as the newly released Zen 3 chips, has PCIe 4.0 connectivity, supports the latest standards for external ports (USB 3.2 and USB Type-C) as well as having built-in WiFi and Bluetooth. So when the time comes to upgrade either the CPU or the Graphics Card (or both) this board should be able to handle it.

I really wanted to get an X570-based board, but the price was just too high for my budget, so the B550 would have to do.

The AMD CPU was the second most important part. I’d been eyeing the Zen 2 Ryzen 5 3600 for a while, and it was a great little upgrade from my previous 2600X.

The third most important part was a decent-sized NVMe SSD that I could use as my boot drive, as well as for storing my work-related files, instead of needing to offload them to a separate hard drive. This also meant I could keep one 1TB HDD with the old PC for general storage.

When it came to the graphics card, I didn’t quite have enough in my budget to afford an RTX 2060, so I opted for a GTX 1660 Ti instead. Once the current shortage of the new graphics cards is over I’ll probably want to upgrade this to either an RTX 3060 Ti or the equivalent AMD 6000 series card.

I wanted to get DDR4 3200 MHz RAM, but that was out of stock, so I settled on DDR4 3000 MHz RAM instead. To round out the build, I went with a Gold-rated 650W power supply that can handle any modest planned future upgrades, and the Masterbox case because it was the most understated option within my budget.

The goal of this build is to only ever need to upgrade the graphics card when the current one gets a bit out of date. The rest of the hardware should be pretty solid for at least a couple of years, and I can easily swap out anything that might cause bottlenecks down the line.

The build

I decided to stream the new build, instead of just taking before and after shots. I only ended up streaming the pre-build, where I made sure that all the parts were working, as trying to get a decent camera angle while I put the parts inside the case proved more difficult than I had anticipated.

Warning, content slightly NSFW

The completed build looks more or less how I wanted it to: simple, clean, fairly well cable managed, and with all the RGB on the motherboard turned off. I still need to turn off the front fan RGB, but that’s only still on (I think) because I didn’t connect the fan headers to the motherboard properly, so that’s a problem for another day.



I discovered Hardinfo when I wanted to benchmark my workstation against my laptop for my Zenbook laptop review. While the benchmarks mostly relate to the processing capabilities of the CPU, it was nice to see that nearly all of them improved in the new build. The only benchmarks that were worse were the FPU (floating-point unit) benchmarks, which was interesting, but I have no idea what that means in the grand scheme of things.

Benchmark                              Laptop      Old PC      New PC
CPU Blowfish (lower is better)         0.97        1.05        0.50
CPU CryptoHash (higher is better)      1284.82     1058.34     1613.40
CPU Fibonacci (lower is better)        0.55        0.39        0.28
CPU N-Queens (lower is better)         5.12        4.80        4.40
CPU Zlib (higher is better)            2.35        1.46        2.82
FPU FFT (lower is better)              0.80        0.65        0.70
FPU Raytracing (lower is better)       2.56        1.13        4.47
GPU Drawing (higher is better)         16462.80    8428.99     19499.27


In my Windows benchmarking days, 3DMark would have been my go-to graphics benchmark tool. However, I wanted to test something on Ubuntu. After a bit of searching, I found UNIGINE, and installed and ran the Superposition benchmark, at the “1080p medium settings” configuration on both PCs.

The old machine had a score of 7355, with an average framerate of 55, while the new machine had a score of 11153 and an average framerate of 83.
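That works out to just over a 50% improvement in score, which a quick bit of command-line arithmetic confirms (using the two scores above):

```shell
# Relative improvement of the new build's Superposition score over the old one
awk 'BEGIN { printf "%.1f%%\n", (11153 - 7355) / 7355 * 100 }'   # → 51.6%
```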

For completeness, I also ran the benchmark on “1080p high settings” on the new build, and recorded a score of 8111, with an average framerate of 60. While I can’t compare this to the old build, as its graphics card’s 3GB of VRAM can’t handle the high settings, it’s nice to know that I should be able to run most games at high settings going forward, or as a worst-case scenario, drop down to medium.

I’m very happy with this new build, and I hope I don’t have to upgrade anything major for at least a year. That being said, I am fully aware that the new AMD CPUs and GPUs, as well as the new NVIDIA GPUs, have just launched, so I have no idea how long things will last in their current state. I’m a bit of a sucker for new upgrades!


Taking on new (old) challenges

Life is a funny old thing. When I stopped working for an employer at the end of 2015, I never dreamed that, in the space of five years, I would come back to being employed full time again.

Back in June of 2019, I shared the news that I had accepted an increased role at Castos, that of lead developer. Accepting this position meant that I could dedicate more of my time to working with Craig and the rest of the Castos team, and would spend less time working with freelance clients at Codeable. The agreement between Craig and me was a 75/25 split: I would spend 75% of my time on Castos tasks, and the other 25% working with freelance clients through Codeable.

Now, just over a year (and 8,000-plus new sign-ups) later, the requirements for that role have grown to a point where it needs someone who can focus 100% of their time on it. So, from 1 October 2020, I am happy to announce that I will be dedicating all of my resources to Castos as Development Lead, overseeing the development efforts for our web application and our suite of WordPress podcasting plugins.

The title itself is arbitrary; it’s the journey ahead that excites me. In the past three years we’ve grown from just Craig and myself to an eight-person team working on the Castos Hosting and Analytics platform (with a new developer to be added shortly), as well as being joined by folks from Craig’s other business, PodcastMotor, who manage our new Castos Productions service (About page images pending).

So while I will still be working on building new features for our web application and the WordPress plugins, I’ll also be more involved in finding ways to improve our development processes, scaling our platform to meet our customers’ requirements, helping to make sure our software makes our customers more successful, and supporting each member of our team to do their best work.

In line with this change, I’m also looking forward to spending more time on our WordPress plugins, streamlining them for our WordPress users, and making it simpler to build and manage a successful, self-hosted WordPress podcast.

Going forward, the content of this blog is also going to undergo some changes. It will shift away from freelance-developer-focused content and be more in line with whatever I am working on, or learning, as I navigate this new path. I’ll probably also be overhauling my about page and a few of the general content pages.

Most importantly, I won’t be taking on any new freelance clients. This means that if you are looking for someone to work on your development project, I’d rather redirect you straight to the Codeable developers page, where you can search for your perfect WordPress expert and start a project with them.

I consider myself very fortunate to be working with such an amazing team of people, building a product that we all believe in, while supporting open source software at the same time. It’s been an amazing 3 years growing into this position, and I am looking forward to the next 5 years (and beyond).


ASUS Zenbook 15 UX533FD review – an Ubuntu friendly developer laptop

I’ve always been a fan of Dell laptops. While often a little more pricey than their counterparts, their laptops are usually well built, typically run Ubuntu without any hassles, and Dell has great after-sales service. My last two laptops were Dells.

I’d been eyeing the Dell XPS 15 for about a year, and I had planned to purchase one when I was due for an upgrade. However, sometime back in 2017, someone I follow on Twitter suggested something I had not thought of: the Asus Zenbook range.

Now, as a PC gamer, I know Asus as a well-known brand. They produce some of the best PC gaming hardware around. In recent years they’ve expanded into more integrated hardware, from tablets (my first Android tablet was an Asus) to, more recently, laptops. So I was curious to see how the Zenbook range compared to the Dell XPS range.

The first thing that struck me was the wide range of ZenBook Series laptops available. After some extensive research, I eventually settled on the Asus Zenbook 15 UX533FD.

General impressions

I’ve been using this laptop for almost a year now, and I’m very happy with it. It boots fast, runs all my applications without any problems, and is super light and easy to carry around. It never gets hot while I’m working on it, and I hardly ever hear the cooling fans spin up, so it’s super quiet as well. It averages about 10 hours of battery life when I’m using it for development all day, and as an added bonus it looks really good in the Royal Blue finish. You can read more about the tech specs here, but it has everything I need in terms of connections. Finally, the charging cable is also small and light, so when it’s in my laptop backpack I hardly even notice it’s there.

Cost Effective

I always prefer to limit myself to a specific budget; this time around, no more than ZAR25 000. I also tend to have specific minimum hardware requirements when it comes to a work laptop, preferring a decent Intel Core i7 CPU, at least 16GB of RAM, and a 512GB SSD. I’m not too worried about getting the latest and greatest of the Intel chips, nor do I need a 4K or touch-enabled screen. I’m also not concerned about super powerful graphics or the number of additional ports; I just need at least one or two USB ports and an HDMI port.

Based on these specifics, the Asus Zenbook was the clear winner, available within my budget at ZAR7 000 less than a Dell XPS configured with almost exactly the same hardware.

Ubuntu friendly

Whether it comes pre-installed with Windows Home or Pro does not really matter to me, as long as I can install Ubuntu on it without any problems. At first I had some issues getting Ubuntu installed, but after a bit of research online I discovered that updating the laptop firmware to the latest version would resolve them. I also decided to install the most recent Ubuntu version, which usually contains the most recent kernel updates and therefore fewer hardware compatibility issues. The base OS install was a breeze, and I didn’t need to jump through any hacky workarounds to get things working. I’ve since successfully upgraded the OS to the recent LTS release (20.04), again with very few issues.


I’ll be the first to admit that I know nothing about performance benchmarks, so all I did was find a benchmarking tool on Ubuntu that would give me some scores to compare. Hardinfo seemed to be a solid option, so I ran the benchmark suite on the laptop and compared it to my AMD Ryzen powered workstation.

I was pleased to discover that not only were many of the benchmarks pretty close, but a few of them were better on the laptop, mostly in CPU related tests. I honestly can’t say I can tell the difference when I’m working on my laptop vs the workstation, except when it comes to graphics intensive applications, like games.

Benchmark                              Laptop      Workstation
CPU Blowfish (lower is better)         0.97        1.05
CPU CryptoHash (higher is better)      1284.82     1058.34
CPU Fibonacci (lower is better)        0.55        0.39
CPU N-Queens (lower is better)         5.12        4.80
CPU Zlib (higher is better)            2.35        1.46
FPU FFT (lower is better)              0.80        0.65
FPU Raytracing (lower is better)       2.56        1.13
GPU Drawing (higher is better)         16462.80    8428.99

I’ve since had quite a few online interactions with other developers who’ve also discovered the joy of the Zenbook range.

So if you’re looking for a small, powerful, good-looking, well-priced, Ubuntu-friendly laptop, you won’t go wrong with an Asus Zenbook.


Submitting a patch to WordPress core, using Git

I initially encountered version control in my 4th year of programming, when the lead developer at the company I worked for implemented Subversion as a code backup solution on our local testing server. As we were all required to use Windows at the time, we mostly just installed TortoiseSVN, so my command-line Subversion knowledge is limited.

A few years later, at another company, I was introduced to Git. At the time GitHub was in its infancy, but we used it internally for revision control and production deployments. This was also around the time I switched to using Ubuntu as my main OS and learned the joys of the terminal. Since then, every team I’ve worked with has used Git alongside either Bitbucket, GitHub, or GitLab. This means that I can use Git on the command line with reasonable success, and can work with branching, code reviews, and submitting/merging pull requests using the web interfaces of these platforms, with GitHub probably being the one I am most used to.

Back to the point: this meant that when I got into developing for WordPress at the end of 2015, and found that Subversion was used to manage both the core and plugin/theme repositories, I was a little out of my depth. Managing plugins was easier; I could keep my own version control in Git, and then just memorise a couple of commands to push the latest updates to the plugin repository. Getting into core development was a little trickier, as it was all managed using Subversion. To be honest, it’s been one of the reasons I’ve struggled to get into WordPress core development.

All this changed in March of this year, when it became possible to associate GitHub accounts with WordPress.org profiles, and documentation was added to the WordPress core developer handbook on how to use GitHub pull requests for code reviews related to trac tickets in WordPress.

I had the code, and the comfort in using Git, and the documentation was very clear as to the steps to follow. All I had to do was make it happen.

Setting up the development environment

I’ve blogged about this before, but I use a very simplified LAMP stack for my local development. I know, I’m old school, but it’s easy to understand, fast, and it just works. It does help that my base OS is Ubuntu, and the actual LAMP setup is really easy, especially with the amazing tutorials over at Digital Ocean.

What did concern me was how much work I might have to do to get the wordpress-develop GitHub repository installed and set up on my local environment, to be able to test anything I added. To my surprise, it was a breeze. Using the same setup script I use for my client projects, I created a wordpress-develop local site and cloned the wordpress-develop repository into the local wordpress-develop site directory.
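For anyone replicating this without a setup script, the Apache side of such a local site is just a standard virtual host. A minimal sketch, assuming the clone lives in /var/www/wordpress-develop and leaving out the self-signed SSL configuration (the domain and paths here mirror my own setup, so adjust to taste):

```apache
<VirtualHost *:80>
    ServerName wordpress-develop.test
    DocumentRoot /var/www/wordpress-develop

    <Directory /var/www/wordpress-develop>
        # Let WordPress manage its own rewrites via .htaccess
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
```

You’ll also need an entry for wordpress-develop.test in /etc/hosts, which is one of the things my setup script handles for me.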

The WordPress source code resides in the /src directory, so I browsed to the local .test domain which my setup script creates (https://wordpress-develop.test/src) and was surprised to see a little ‘reminder’ screen, telling me I needed to install the npm dependencies.

Fortunately I already had npm installed locally (having recently needed it for the Seriously Simple Podcasting player block) so I just needed to install and build the required assets with npm install and npm run dev.

After that, it was the good old famous 5-minute install I’m used to, and within a few minutes I had the wordpress-develop code up and running on my local environment.

The actual Git process

This is all documented in the core handbook entry on Git, but once you have the GitHub repo cloned, creating a patch to submit to a ticket is pretty straightforward.

Firstly, it’s a good idea to create a working branch for your code. The handbook recommends using the ticket number as part of your branch name:

git checkout -b 30000-add-new-feature

Once you have your working branch, you can make your changes to the code, and use the git add and git commit commands you’re used to if you’re comfortable using Git. You can view the status of your changes using the git diff tool, which outputs all the changes between the master branch and your working branch:

git diff master 30000-add-new-feature

When you are ready to submit the patch, you can use the diff tool again, and simply pipe the changes to a .diff patch file:

git diff master 30000-add-new-feature > 30000.diff

If there are already patches attached to the trac ticket, you can use the same command to create updated, numbered patches:

git diff master 30000-add-new-feature > 30000.2.diff

To be honest, this is probably very similar to how it works in Subversion, but being able to use Git commands and Git branches, when you’re more comfortable with Git, makes it much easier to get started.
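If you want to get comfortable with the flow before touching the real repository, the whole process can be rehearsed in a throwaway repository. This is just a sketch; the file name and ticket number below are made up for illustration:

```shell
# Create a throwaway repository to practise the patch workflow in
git init demo
git -C demo config user.email "dev@example.com"   # commits need an identity
git -C demo config user.name "Dev"

echo "original" > demo/example.php
git -C demo add example.php
git -C demo commit -m "Initial commit"
git -C demo branch -M master    # make sure the base branch is named master

# Working branch, named after the (hypothetical) trac ticket number
git -C demo checkout -b 30000-add-new-feature
echo "changed" > demo/example.php
git -C demo add example.php
git -C demo commit -m "Change for ticket 30000"

# Pipe the diff between master and the working branch to a patch file
git -C demo diff master 30000-add-new-feature > demo/30000.diff
```

The resulting 30000.diff is exactly the kind of file you’d attach to the trac ticket.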

You can also use GitHub pull requests for code reviews on patches, but that’s a topic for another day.

This is all pretty exciting for me, because it lowers the barrier for me to contribute directly to WordPress. I’ve already got two new tickets in trac that I’ve submitted patches to, and I’m looking forward to being able to contribute back to WordPress core in the coming years.