The slides from my freelancing talk at WordCamp Johannesburg
I learned a valuable lesson yesterday: if your workshop prep requires a lot of involved setup, you might want to reconsider your workshop.
Anyway, for those who want to try to replicate the workshop process at home, here are my slides and the workshop notes I used. You should also be able to follow the numbered branches on each of the GitHub repositories.
My workshop notes
The GitHub Repositories
On Thursday the 25th of October 2018 I will be presenting a workshop titled “Turning your WordPress plugin into a SaaS”, where I will be sharing some insight into connecting a WordPress plugin to a Laravel App through HTTP requests.
As with most workshops, there is some preparation work required before attending. However, because we are dealing with two different platforms (Laravel and WordPress), setting it all up might take a bit longer than usual.
TL;DR – Requirements:
- WordPress environment – accessible on its own IP address
- Install the Simple Posts Backup plugin from GitHub
- Laravel environment – accessible on its own IP address
- Database access
- Code Editor
This article is aimed at folks who a) have never set up local development environments or b) have only ever used applications like WampServer or MAMP for their local development. If you already use something like a Vagrant or Docker environment, you probably don’t need this article 😉 It would, however, be good to skim through it to make sure you have a setup that matches my own.
1. WordPress environment.
To participate in the workshop, you will need a local WordPress environment. Make sure you have the latest version of WordPress installed (4.9.9 at the time of this article).
My recommendation would be to try something like Chassis.io, or my own BossBox. These are pre-configured, WordPress-ready Vagrant boxes that replicate a Linux server with any installed software you might need. If you already use VVV, that is also a good option.
To install any of these options you will need to install VirtualBox and Vagrant first. Each project has instructions on how to set up and run your WordPress install. Obviously I favour BossBox, because I built it and I know it works well out of the (ahem) box.
What I like about a Vagrant option is that you can configure each instance with its own IP address, which makes it easier to develop and test API endpoints on two different instances at the same time, and that is exactly what we’ll be doing in the workshop.
You will then need to download and install the base plugin we will be using, which you can download from this link.
For the purposes of the workshop, my WordPress environment will be using the default IP address which ships with BossBox, which is 192.168.33.10.
2. Laravel Environment.
Firstly you’ll need to follow the installation instructions to create a new Laravel project.
Then you’ll need to choose which local environment to use. There are a few options for setting up a local Laravel environment, the two most popular being Laravel Valet and Homestead.
Laravel Valet is a Mac-only option that requires very little in the way of setup. The only downside is that if you don’t own a Mac, you can’t use it. I develop on a Linux OS, so I can’t say for certain whether it will be sufficient for the workshop, but I don’t see why it wouldn’t be. As with the WampServer or MAMP options, I also don’t know if you can configure a Valet setup to use a local IP address.
The option I prefer is Homestead. It is also a pre-built Vagrant box, so if you go with one of the Vagrant options for WordPress you already have the required software installed. If you do go this route, I suggest using the Per Project installation method, which is the one I’ll be using for this workshop.
For the purposes of the workshop, my Laravel environment will be using the default IP address which ships with Homestead, which is 192.168.10.10.
A note on Vagrant options: because these are pre-built virtual Linux web servers, they can be quite large when you first download them. BossBox is around 750MB (the last time I checked) and the Homestead box is almost double that. So make sure you have a good internet speed and enough data to complete the downloads. If you only use mobile data, I recommend visiting an internet cafe, a WiFi-enabled coffee shop or a co-working space. It’s also the reason I suggest you do this before you come to the workshop, as downloading everything during the workshop is going to take too long on conference WiFi. The upside is that once they are installed, you use the same base box for every other instance you set up.
3. Database Access
You will need some way to access the database for both setups.
If you go the BossBox route for your WordPress setup, you can access the database via phpMyAdmin, which comes pre-installed on the box. You will find details on how to do this in the project readme file. If you go with the Homestead route for Laravel, you can use an application like MySQL Workbench. For the Laravel app the defaults are:
- Hostname – 192.168.10.10
- Port – 3306
- Username – homestead
- Password – secret
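If you prefer the command line to a GUI client, the same defaults work with the stock mysql client. This is only a sketch built from the default values listed above, so adjust them if you have changed your Homestead configuration:

```shell
# Homestead database defaults from this article.
DB_HOST="192.168.10.10"
DB_PORT="3306"
DB_USER="homestead"
DB_PASS="secret"

# The command you would run against a live Homestead box:
CMD="mysql -h $DB_HOST -P $DB_PORT -u $DB_USER -p$DB_PASS -e 'SHOW DATABASES;'"
echo "$CMD"
```

If the connection works you should see the list of databases, including the one your Laravel app is configured to use.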
Whichever setup you choose, make sure you can connect to and view the database for both environments.
4. API Testing – Postman.
Postman is an application for testing software APIs. We’ll be using it to test the API endpoints we create during the workshop. To make sure your WordPress and Laravel environments are accessible to each other, test them via Postman.
To test them, install Postman and create two new requests, one for WordPress and one for Laravel. Set the URL of each request to the respective IP address. You should see some HTML returned, which means Postman can send a request to each environment.
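Postman is what we’ll use in the workshop, but if you want a quick command-line sanity check first, a small curl helper does the same job. The two IP addresses below are just the BossBox and Homestead defaults mentioned in this article:

```shell
# check_site: print OK and succeed if the given URL responds at all,
# otherwise print FAIL and return 1.
check_site() {
  if curl --silent --output /dev/null --max-time 5 "$1"; then
    echo "OK: $1 responded"
  else
    echo "FAIL: no response from $1"
    return 1
  fi
}

# Defaults from this article; adjust if you changed your box IPs.
check_site "http://192.168.33.10" || true   # WordPress (BossBox)
check_site "http://192.168.10.10" || true   # Laravel (Homestead)
```

If both print OK, Postman will be able to reach them too.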
5. Code Editor/IDE.
Obviously we’ll be writing code, so we’ll need a code editor. If you are already writing PHP code you probably have an editor you prefer, and that’s great. If you don’t, I can recommend Visual Studio Code (free) or PHPStorm (paid). PHPStorm comes with a free 30-day trial, so if you’ve never tried it, now is a great time. I’ll be using PHPStorm in the workshop.
One of the things I love about using Ubuntu as my primary operating system is that I can quickly set up a ‘bare metal’ LAMP development environment. While I usually run my client websites inside one of my custom Vagrant boxes, for working on personal projects or plugin/theme customisation everything’s much faster when it’s able to use the full power of the machine I’m working on.
After I switched back to a desktop computer as my everyday workstation, I came across a weird little issue when installing MySQL. Usually during the install process it asks me to set a root password, but this time it did not. This meant I wouldn’t be able to access the database using the root user. Not a huge issue, as I could create another user to access any database, but as it’s a local install, connecting as the root MySQL user is just much easier.
As it turns out, on Ubuntu installs running MySQL 5.7 or later, the root user is set to authenticate using the auth_socket plugin rather than using a password. Thankfully the folks over at Digital Ocean have released an article which explains this information and provides the steps to switch the root user authentication from auth_socket to using a password.
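For reference, the switch that article describes comes down to a single ALTER USER statement, run through a socket-authenticated root session. This is only a sketch: ‘change-me’ is a placeholder password of my own, not a value from the article.

```shell
# SQL that switches the MySQL root user from auth_socket to password
# authentication. 'change-me' is a placeholder; choose your own password.
SWITCH_SQL="ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'change-me'; FLUSH PRIVILEGES;"

# On a live Ubuntu install you would run it via sudo, which satisfies
# the auth_socket plugin:
#   echo "$SWITCH_SQL" | sudo mysql
echo "$SWITCH_SQL"
```

After that, connecting as root with the new password works as it did on older MySQL versions.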
UPDATE: This was not the only issue I was having; it turns out the MySQL install was also corrupt somehow. Fortunately the internet is a wonderful place, and I found some instructions on how to completely remove all traces of MySQL server and start again.
sudo dpkg -P mysql-server mysql-server-5.7
sudo apt-get autoremove
sudo apt-get clean
dpkg -l | grep -i mysql
sudo rm -rvf /var/lib/mysql
sudo apt-get install mysql-server
I’ve been thinking a lot about my use of the internet and its associated tools and apps lately. I’ve never been the biggest fan of social media, but since roughly 2015 I’ve found myself getting drawn into the online conversation more and more, so much so that I’ve recently realised I may have a bit of an addiction.
A few days ago someone suggested Catherine Price’s book How to Break Up with Your Phone. (The irony of the fact that it was suggested because I follow that person on Twitter is not lost on me). It made me realise that I need to take a long hard look at my social media and mobile phone usage and make some changes.
As such, like others before me, I’ve decided to take a month-long sabbatical from social media for the month of October. This may prove difficult, as I am attending WordCamp Johannesburg at the end of the month, and the need to live tweet stuff will be strong, but it’s as good a time as any to see if I really can do it. Hopefully it will allow me to change my internet habits and spend more time on things that matter.
In the meantime, I hope to blog more here during the course of the month, which will automatically be shared to my social media accounts. So if you want to converse with me online, feel free to leave a comment on one of my blog posts, email me (my name at gmail) or find me in the various Slack channels I hang out in.
Until the 1st of November then…
Build it. Theme it. Watch it fail. Or not. If you know how to optimize.
Few things are as frustrating as a website slower than molasses going uphill. Even fewer people stick around to see whether the website was worth waiting for in the first place. There are plenty of other websites in the great online sea and if yours is slow visitors will just go somewhere else.
Website optimisation is the key to online success. It really is. So, to pin it down and get clarity around the issues that influence it, I spoke to Justin Frydman. Justin is a full-stack web developer, systems administrator and (in his words) ‘passionate problem solver’, and since I joined Codeable he’s become one of my go-to guys with regards to website optimisation. Justin and I had a chat about the how, the why and the what of website optimisation…
JB: Why is website optimisation so important?
JF: The short answer is that people hate slow websites. It doesn’t matter if you’re selling a product, trying to increase reader conversion or gain leads that result in sales. A slow website is going to impact your goals significantly.
There are a lot of studies that prove this point, but one of the most popular is how Google found that an extra 0.5 seconds of load time cost them a 20% dip in traffic. While your business may not match the scale of a giant like Google, removing roadblocks to potential sales makes sense. Right?
Considering how many people use their mobile devices to shop online it is even more important to focus on web optimisation now than ever before. Any slowness on your site is magnified on mobile. These devices have slower hardware and connectivity and will load far more slowly than the desktop.
JB: What are the top three worst and best WordPress themes in terms of performance (themes or plugins that affect site performance)?
JF: It’s the themes that try to ‘have it all’ that tend to suffer from performance degradation. The more that a theme does in each page load, the harder the server has to work to serve up your page. There are ways you can hide these shortcomings from your visitors such as using caching methods or other techniques but you did ask for a list. Here it is…
The worst three themes:
The best three themes:
- Custom built and tailored to your needs. This way you don’t have more or less than YOU need.
- Many of the themes built on the Genesis Framework
- The Beaver Builder theme for a visual builder
JB: What are the factors that influence website optimisation?
JF: There are numerous factors that can affect your website’s performance. It’s about ensuring that each of them is as good as possible so they add up to having a truly fast loading website.
You must have a good host with a good stack (server software) that is properly configured to serve WordPress as fast as possible. A great host doesn’t have to cost a fortune but if you’re spending less than a cup of coffee a month on your host, well, you’re probably getting what you pay for. It’s also important to ensure your server is as close as possible to your target market. The closer it is to your visitors, the faster your website will be.
Every plugin you add to your website has the potential to slow it down. Not all WordPress plugins are created equal. Many are developed without performance in mind. So, a single plugin could perform a taxing database query that adds seconds to your page loading.
Having too many plugins adds to the technical debt of your site and complicates debugging, which makes it harder to track down any problem plugins. Some can add bloat to your database as time goes by, so the slowdown won’t show until your database queries take seconds instead of milliseconds. Ensure you have a staging website and thoroughly test and vet every new plugin before you use it.
Themes that make heavy use of animations can look slow to the visitor even if they aren’t. It’s also worth noting that the heavier themes need a lot more time to process a page before it can even be rendered. The longer it takes to process a page, the longer things are loading before the website is visually drawn in the browser.
Using images that haven’t been optimised (and using many of them) increases loading times. Images can be compressed, saving up to 80% in file size without compromising on quality. The less the browser has to download, the faster your website. Sliders with 10 images or large background images can increase page size significantly. So, to optimise your website, consider each image’s purpose on the page and ensure the dimension and file sizes are appropriate.
Fully loading videos will destroy your website metrics and often cause a laggy experience. Use videos sparingly and make decisions based on the data you collect. If a video really is converting sales, then it is probably worth keeping.
While ads are often needed to help you make a living, they can really slow things down. Even once they are fully loaded they can slow scrolling and the functionality of the website, especially on slower devices.
JB: How can you test your website’s performance?
JF: There are a number of online tools to help you with website optimisation and performance. That said, the grades don’t always align with a faster website. If you are shooting for straight As that doesn’t actually mean you will end up with a visually faster loading website.
Here are some steps you can take to website optimisation:
Step 01: Your browser
Really use your website, logged in and logged out. Does it feel fast? Is the initial load pretty fast? What about clicking on additional pages? How about the WordPress dashboard?
Human testing goes a long way towards website optimisation as it is humans who use your site.
Step 02: Pingdom
Not all items in the Performance Grade can or should be fixed. It’s important to understand this report in detail but what should matter to you the most are: Load Time, Page Size and Requests. The ultimate goal is to get these as low as possible. Increasing the grades is a bonus and only IF they don’t affect the ultimate metric – Load Time – negatively.
I’ve seen websites taken from a C to an A but at the cost of 400ms. It’s just not worth adding that extra loading time to increase a score.
Step 03: Gtmetrix
The recommendations here must be properly analysed by an expert. GTmetrix tracks Fully Loaded Time, which takes background scripts into account: if they run after the page is visually drawn in the browser, that’s not a big deal; however, if your page is waiting for them to load before it shows your site, then you have a problem. Ideally, you want to get Fully Loaded Time, Total Page Size and Requests as low as possible. This may involve removing items you don’t need on your site.
Step 04: Google PageSpeed Insights
Until recently, this test didn’t even show your actual website speed and still doesn’t for many websites. What this does is make suggestions around how you can improve your speed. Keep in mind that these are suggestions and may not result in lowering the key speed metrics mentioned earlier. Some of these suggestions can be complex and expensive to solve and the payoff may not be worth the effort. Google has a FAQ that explains a lot of these elements in more detail.
Want to get your hands on Justin? Need a bit more website optimisation expertise?
Click here, hire Justin, and fix your problems…
(aka the longest blog title I’ve probably ever written).
I recently upgraded my laptop from Ubuntu 17.10 (Artful Aardvark) to Ubuntu 18.04 (Bionic Beaver). I prefer running on the latest stable LTS release, so as soon as 18.04 came out I upgraded. 17.10 was a solid release and I enjoyed working on the Gnome shell again, so upgrading was a no-brainer when the LTS came out.
As I have adopted PHPStorm as my development IDE this meant I had to set up the PHP Code Sniffer and WordPress Coding Standards integration again. I remember this being a bit of a pain when I did it on my 16.04 install.
The JetBrains documentation about WordPress Development using PhpStorm (specifically the section on coding standards) was written back in 2015, and the issues I had were mainly due to different paths for newer PHP/PEAR versions, so I thought I would document the updated process here.
Step 1) Install PHP Code Sniffer via the PEAR package
sudo pear install PHP_CodeSniffer
I could have done this using either the PEAR package or the Composer package, but the JetBrains document refers to the location of the Code Sniffer sniffs using the PEAR method, and I already had PEAR installed, so it was easier for me to use this method. At some later stage I might try the Composer method.
Step 2) Determine the location of the PHP Code Sniffer Standards
On 18.04 with a default PHP 7.2 install, the Standards are located in the following path:
/usr/share/php/PHP/CodeSniffer/src/Standards/
Step 3) Clone the WordPress Coding Standards GitHub repository
In this case I simply cloned this repository to the root of my home directory.
git clone -b master https://github.com/WordPress-Coding-Standards/WordPress-Coding-Standards.git
Step 4) Copy the relevant Coding Standard to the PHP Code Sniffer Standards location determined at Step 2
I copy all the available WordPress Coding Standards, and merely enable WordPress-Extra for my projects. I am thinking about enabling WordPress-VIP for some extra fun, but I’m not sure I’m brave enough to be shouted at by my IDE…
cd ~/WordPress-Coding-Standards
sudo cp -r WordPress /usr/share/php/PHP/CodeSniffer/src/Standards/
sudo cp -r WordPress-Core /usr/share/php/PHP/CodeSniffer/src/Standards/
sudo cp -r WordPress-Docs /usr/share/php/PHP/CodeSniffer/src/Standards/
sudo cp -r WordPress-Extra /usr/share/php/PHP/CodeSniffer/src/Standards/
sudo cp -r WordPress-VIP /usr/share/php/PHP/CodeSniffer/src/Standards/
I also realised recently that I could simply create symlinks instead, so that the Coding Standards are automatically updated every time I pull the latest release from GitHub:
cd /usr/share/php/PHP/CodeSniffer/src/Standards/
sudo ln -s /home/jonathan/WordPress-Coding-Standards/WordPress WordPress
sudo ln -s /home/jonathan/WordPress-Coding-Standards/WordPress-Core WordPress-Core
sudo ln -s /home/jonathan/WordPress-Coding-Standards/WordPress-Docs WordPress-Docs
sudo ln -s /home/jonathan/WordPress-Coding-Standards/WordPress-Extra WordPress-Extra
Step 5) Enable PHP Code Sniffer with WordPress Coding Standards in PHPStorm
First step is to enable PHP Code Sniffer in PHPStorm. To do so you’ll need to know where phpcs is installed. Run the following from your command line to find out:
which phpcs
Mine is located at /usr/bin/phpcs.
In PHPStorm, go to your Settings screen (File -> Settings or CTRL+ALT+S) and navigate to Languages and Frameworks -> PHP -> Code Sniffer. Next to Local Configuration, click the browse button and enter the path to phpcs in the ‘PHP Code Sniffer (phpcs) path’ field. Hit Validate to make sure PHPStorm can find it, then click OK.
Next you will need to enable the WordPress Coding Standards. Back on the Settings screen, navigate to Editor -> Inspections, scroll down to PHP and open the tree. Then scroll down to PHP Code Sniffer Validation and tick the box next to it. In the panel that appears on the right, hit the little refresh button, choose your WordPress Coding Standard of preference, and click OK.
Remember to do this last bit for each new (or existing) WordPress project, as you can have different Code Sniffer rules set per project type (for example if you’re working on any non-WordPress projects, like Laravel).
About 8 years ago I built my first “proper” gaming computer, which for various reasons I nicknamed Psyrig. It wasn’t the first gaming computer that I owned, but it was the first time I carefully selected all the components, including the case, and put it together myself from the ground up. It was built to serve two purposes, both as a development workstation and a gaming rig.
In the years that have followed I upgraded the cooling fan, added a 128GB SSD to improve boot times and eventually upgraded the graphics card to handle newer games. After 8 good years of service, it was time for a more drastic upgrade: new motherboard, processor, memory and graphics card!
Part 1 – the plan
I’ve been toying with the idea of upgrading the computer for a while now. Since purchasing my first development laptop about 5 years ago, the computer was relegated to powering our home media centre, plus the odd game when I had time. However, I find gaming in the lounge better suited to consoles and games that play well with controllers, so having a gaming computer not connected to a keyboard and mouse never seemed logical. What seemed even less logical was letting the power of a gaming computer go mostly towards running a media centre. So, after I purchased a second-hand workstation to take over media centre duties, the gaming computer was moved to my office and I started planning the upgrade.
The purpose of the upgrade was twofold.
First, I wanted to have a dedicated workstation at my office, on which I would maintain all my work. This would mean I would be less inclined to bring work home. Right now everything is on my laptop, which I bring home, and it’s too easy to just open it up and do some work. I also wanted to make sure that the upgrade would last me at least the next 5 years.
Second, I wanted to be able to play some of my newer game purchases on it. I recently purchased the latest Deus Ex game, and I’ve not had the time to get into it. So the idea was that whenever I took a break from work I could pop in a half hour of gaming instead of watching some YouTube video.
Part 2 – the parts
After much online shopping over probably the better part of the last year, I finally settled on another AMD-powered setup. I’ve been an AMD fanboy since I got into serious PC gaming, and my last two computers were AMD powered.
Part 3 – the OS
This was probably the trickiest part of this build.
I’ve been a Windows user (because gaming) for as long as I can remember. However, I’ve been what I like to consider an Ubuntu power user since I discovered it in 2008. My laptop, which came with a 128GB M.2 drive, also has a 500GB SSD installed, which runs Ubuntu, my main OS of choice. I have Windows installed on the M.2 for when I feel like a game, or if Ubuntu isn’t playing nicely with some projector, which hasn’t happened since Ubuntu 17.10. I really like Ubuntu and all the unixy goodness when it comes to development. I also like the fact that Valve recently started working on a project that will one day allow all my games to run smoothly on Ubuntu. That day, however, is not today.
Because I only have one SSD installed in the PC, it means either installing Windows now and then purchasing a new SSD later to dual boot, or installing Ubuntu and only playing the games that currently work on Linux. Granted the primary purpose of the computer is not gaming, so I doubt this will be a problem, but it’s also nice to have a Windows install for things that don’t work on Ubuntu (I’m looking at you Adobe Creative Suite).
I even went as far as asking folks on Twitter, and the resounding response was dual boot, which was going to be difficult at this time.
By the time I came to the actual build, I still didn’t know what to do.
Part 4 – the build
When I’m building or upgrading a computer, I like to open all the boxes and lay all the new parts out on the table around the case, along with any tools I might need. After 10 or more years of building, upgrading and fixing computers, I’ve learned to make sure everything I need is close at hand. The one thing that was missing was the cable ties, but I knew where to find them when I needed them.
During the upgrade I was reminded why I decided to not go into computer hardware for a living. Having big hands and fingers makes it tricky to get into all the nooks and crannies of a computer casing to ensure mounting screws and the like are properly installed.
About halfway through the process, things got a little interesting. While plugging in the various case cables (power switches, front side audio and the rest) I discovered that my new motherboard supported M.2 drives! This was quite a moment, because I realised that if I could take the M.2 drive out of my laptop, I could keep the laptop as an Ubuntu machine and effectively dual boot the computer.
Taking the M.2 drive out of the laptop proved to be easier than expected, once I realised that part of the bootloader was installed on the M.2 drive. I quickly created a bootable Ubuntu USB, booted live from the USB and installed the Boot Repair tool. This reinstalled the bootloader onto the Ubuntu installed SSD, and I could now safely remove and install the M.2 drive into the computer.
The next thing that happened absolutely shocked me. After plugging all the cables in, I booted the computer, expecting it to give me some ‘non-bootable disc’ error or similar. But no: it actually booted into the Windows 10 install on the M.2 drive and, after setting up some devices, I was actually able to use the computer. I was not aware that Windows 10 could do this, and even though I intend to reinstall everything from scratch anyway, I was pretty impressed. Maybe Microsoft has come as far as everyone says they have.
And that was it: I’ve installed Windows on the SSD and Ubuntu on the M.2, and I’m happily dual booting on the computer. I was hoping to be able to activate my Windows install using the Windows 10 key from the old system, but it turns out the free upgrade license I have doesn’t support a motherboard change. I found this a little annoying. I could go and buy a new Windows license, but seeing as I’m only going to use that drive for gaming, I may just leave it unactivated; it turns out that Microsoft doesn’t cripple unactivated installs like they used to.
Now all that’s left is installing all the software…
Since Microsoft announced its acquisition of GitHub, there has been a lot of discussion online about the pros and cons of the event. While I do feel that a lot of the complaints are probably coming from a vocal minority who are decidedly anti-Microsoft (no doubt angrily typing their responses on Apple products, the irony), I also agree that this is perhaps not the best news for the open source movement in general.
Personally I will be moving my private (and possibly later public) git repositories over to GitLab and I thought it might be interesting to share my reasons why, if for no other reason than they were referred to as ‘a far sight more measured than most of the opinions about this that I’ve seen so far!‘. I feel like the world could use more measured opinions about popular or unpopular topics in general, so here goes:
- GitLab is open source, nuff said!
- While I don’t disagree that Microsoft has, under the leadership of its current CEO Satya Nadella, done a lot to prove that it now supports open source (something the company was vehemently against in previous years), I feel it’s important to note that this change has only happened since Nadella took over as CEO in 2014. This means the company’s shift is a direct result of the mindset of its CEO, and that CEO will not be in place forever. The next leader of the company may not share Nadella’s love of open source.
- Personally, I don’t like the fact that any large company is in control of the world’s largest open source code hosting platform. Did Microsoft really have to acquire GitHub to help it grow and improve? I would have been more impressed if Nadella had simply joined the GitHub board of directors and helped raise capital to support and grow GitHub, instead of outright buying it. Interestingly, this is the path that Matt Mullenweg took with GitLab about a year ago.
- GitLab has some cool features built right in that GitHub doesn’t, from Continuous Integration & Deployment on the free plan right up to Epics, Roadmaps and License Management on the top pricing tiers. They really seem to be building a seamless, integrated product for modern software developers, and I want to support them in this cause.
I was already paying GitHub $7 a month to host my private repositories, so switching to GitLab’s $4 a month plan will be a nice little saving each month. Not that I have to pay at all, as GitLab, like Bitbucket, supports private repositories on the free tier.
And yes, I do know that if you were a GitLab user when a member of their staff managed to delete a database by accident you probably weren’t happy with them and moved on. I’m willing to give them a second chance.
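For anyone planning the same move, the mechanics are simple. Here is a sketch of how I intend to migrate each repository, assuming you have already created an empty project on GitLab (the remote name ‘gitlab’ and the URL are placeholders, not real project details):

```shell
# migrate_repo: add a GitLab remote to an existing local clone and
# push all branches and tags to it.
migrate_repo() {
  repo_dir="$1"    # path to the existing local clone
  gitlab_url="$2"  # e.g. git@gitlab.com:username/my-repo.git (placeholder)

  git -C "$repo_dir" remote add gitlab "$gitlab_url"
  git -C "$repo_dir" push gitlab --all
  git -C "$repo_dir" push gitlab --tags
}
```

Once everything is pushed, you can either keep GitHub as a second remote for a while or remove it with git remote remove origin.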
A lightning talk I gave about my experiences contributing to the WordPress community.