My PHP VSCode / Docker / Remote SSH Setup


Even though I mainly work on JS projects these days, I built a lot of Laravel apps in the past, and I still work on my project management app "flow" from time to time.

Everything I explain in this post is based on my experience over the past few years and might not be objective.

Remote SSH?

Without going too much into detail: developing on Linux just has many advantages. Sure, you can install PHP, MySQL and Node.js on Windows, but it just feels a little bit off. It gets worse when you want to use Docker, because while Docker for Windows is a thing, the performance and overhead are terrible in my opinion.

WSL x WSL 2 x Docker Desktop – Comparing Performance (Ubuntu 18.04) – Matheus Castello, MicroHobby:
https://microhobby.com.br/blog/2019/09/25/comparing-performance-ubuntu-18-04-wsl-wsl-2-docker-desktop/

Luckily, there is the Windows Subsystem for Linux (WSL), which allows you to run a Linux system within Windows. I was happy with this for a long time, until one day it simply stopped working: my Ubuntu WSL no longer had access to the internet. I never found out what exactly the problem was, but I guess a Windows update simply messed something up, even though I tried everything: updating WSL, updating Ubuntu, reinstalling WSL, messing with the network interfaces/bridges, the Hyper-V manager and more...

At this point I switched over to Remote SSH. I have a local homelab server running Proxmox, so I slapped a clean Ubuntu VM on there, connected to it via the VSCode Remote SSH feature, and it has been working ever since.

I have worked with all three Docker setups in the past (bare metal, WSL, Docker for Windows) and must say that bare metal / VM just feels more snappy.

What I might look into in the future is the Windows 11 Dev Drive feature.

Why Docker?

Docker brings the advantage that I can develop and ship the app on the exact same base and do not have to worry about, for example, slightly different PHP versions, extensions or binaries. It just feels right to have something that behaves the same locally as the production/live version. And if I ever decide to move my dev setup somewhere else, I just need Docker and I'm ready to go.

So now to the final question that explains the title:

Why not use PHPStorm?

When it comes to PHP, PHPStorm is the best IDE, hands down, but currently I only use it at work because, to be honest, it's a bit on the expensive side for my casual hobby projects. There are other reasons though:

  • High RAM / CPU usage
  • Feels a bit bulky and slow sometimes
  • Remote SSH is only possible via JetBrains Gateway, which seems to require running the full IDE backend on the remote system (I was not able to test this yet)
  • I/O Performance on WSL sucks (subjective impression)
  • Constant reindexing which temporarily freezes code completion
  • It has so many features that it feels annoying and overwhelming sometimes xD
  • PHPStorm runs on Windows and requires the Windows docker-compose executable to parse docker-compose.yaml files, thus forcing the use of Docker for Windows

But if it runs, boy does it help you get shit done and the framework integration is just awesome.

Making VSCode a bit more useful

highly professional meme I made

VSCode needs some love to make it useful for PHP development:

Laravel in Docker

My Docker base image is built from the PHP-FPM Alpine image. I then install Node.js and all necessary PHP extensions plus Composer. I also modify some PHP-FPM settings and configure the timezone.
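I'm not going to paste the whole Dockerfile, but a stripped-down sketch of such a base image looks roughly like this (the extension list and timezone are just examples, not my exact file):

FROM php:8.3-fpm-alpine

# node + npm for asset building, tzdata for the timezone setup
RUN apk add --no-cache nodejs npm tzdata

# the PHP extensions the app needs (adjust to your project)
RUN docker-php-ext-install pdo_mysql bcmath opcache

# grab composer from the official image
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer

# timezone (example value) – php-fpm tweaks are copied in separately
ENV TZ=Europe/Berlin
RUN cp /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone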

My Laravel app image adds some more things on top:

  • nginx, because no one should use apache in the 21st century
  • supervisor
  • crontab
  • an entrypoint script that helps bootstrap the prod app (a rough sketch follows after this list)
    • create storage, log and cache dirs if they do not exist yet
    • wait for the database to be available
    • run migrations, clear the cache, link the storage
    • optimize, cache config/route/views
    • run supervisor
      • php fpm
      • nginx
      • laravel queue worker
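The bootstrap part of that entrypoint script boils down to something like this. This is a simplified sketch, not the full script; it assumes the Laravel default DB_* env variables and that the working directory is the project root:

# wait until the database accepts connections
until php -r 'new PDO("mysql:host=".getenv("DB_HOST").";port=".(getenv("DB_PORT") ?: 3306), getenv("DB_USERNAME"), getenv("DB_PASSWORD"));' > /dev/null 2>&1; do
  echo "waiting for database..."
  sleep 2
done

# create the writable dirs if they do not exist yet
mkdir -p storage/logs storage/framework/cache storage/framework/sessions storage/framework/views bootstrap/cache

# migrate, clear + rebuild the caches, link the storage
php artisan migrate --force
php artisan cache:clear
php artisan storage:link
php artisan config:cache
php artisan route:cache
php artisan view:cache

# hand over to supervisor, which runs php-fpm, nginx and the queue worker
exec supervisord -c /etc/supervisord.conf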

But most importantly: it helps with the permissions. On my Linux VM, the project is owned by the user "kiwi" with the ID 1000. The Docker container runs as root, which means files created by the Laravel app or CLI scripts like artisan are owned by "root", so I cannot edit them as user "kiwi".

My first approach was to do what Laravel Sail used to do: create a user with the ID 1337 (lol) and run PHP as that user. This also means creating the matching group on the host and adding my "kiwi" user to it for write access. I then added a perms.sh to recursively chmod the project folder.
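Such a perms.sh is essentially just a recursive chgrp/chmod, something along these lines (a sketch, using the 1337 group from above):

#!/usr/bin/env sh
# give the shared group (1337) group ownership and write access to the whole project
sudo chgrp -R 1337 .
sudo chmod -R g+rwX .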

I was not entirely happy with that because it requires modifying the host and sometimes re-running the perms.sh.

Running the container as my own user instead was not possible in a nice way because I was not able to set up the permissions for nginx and php-fpm. So I added a part to the entrypoint script that makes PHP and nginx inside the container run as me.

By default, PHP runs as www-data, and nginx is configured to run as www-data in the nginx.conf. The entrypoint script creates a new user lrv-XXX (where XXX is the UID passed via env) and replaces www-data in all these configs:

# setup user
if [[ -n "$UID" && -n "$GID" ]]; then
  # Create group if it doesn't already exist
  if ! getent group lrv-$GID > /dev/null; then
    addgroup -g $GID lrv-$GID
  fi

  # Create user if it doesn't already exist
  if ! id -u lrv-$UID > /dev/null 2>&1; then
    adduser -D -H -u $UID -G lrv-$GID lrv-$UID
  fi

  sed -i \
    -e "s/user = www-data/user = lrv-$UID/g" \
    -e "s/group = www-data/group = lrv-$GID/g" \
    -e "s/;listen.owner = www-data/listen.owner = lrv-$UID/g" \
    -e "s/;listen.group = www-data/listen.group = lrv-$GID/g" \
    /usr/local/etc/php-fpm.d/www.conf

  sed -i \
    -e "s/user=www-data/user=lrv-$UID/g" \
    /etc/supervisord.conf

  sed -i \
    -e "s/user www-data www-data/user lrv-$UID lrv-$UID/g" \
    /etc/nginx/nginx.conf
else
  echo "UID and GID not set, not modifying perms"
fi

This way the project permissions can stay as they are and the host user can be any user. The only catch is that composer and artisan need to be told to run as that user.

This could be improved by a little shell script.
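A minimal sketch of such a wrapper (the service name "flow" matches the compose file below; the file name is up to you):

#!/usr/bin/env sh
# run artisan inside the app container as the host user
exec docker compose exec --user "$(id -u):$(id -g)" flow php artisan "$@"

Saved as something like ./artisan.sh, running a migration then becomes ./artisan.sh migrate, and the generated files belong to the host user. The same idea works for composer.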

Quick side note: newer versions of Laravel Sail run the dev server using artisan serve, which means modifying php-fpm and nginx is not necessary, but this would also mean that prod and dev environments are different again.

Xdebug

What was confusing to me at first is that Xdebug works "in the other direction": the PHP server is the client and my computer is the server. So we need a way for the PHP process inside the container to reach my VSCode, which plays the role of the Xdebug server.

VSCode already runs on the Docker host through the magic of Remote SSH, so now we only need to tell Xdebug to connect to the Docker host. Because I did not want to hardcode the host's IP in the Docker image, I map the common host.docker.internal alias to the host gateway, so the final docker-compose.yaml looks like this:

version: '3'
services:
    flow:
        image: codingkiwi/laravel-app:8.3-xdebug
        ports:
            - '80:80'
        volumes:
            - '.:/var/www/html'
        environment:
           ....
            UID: ${UID:-1000}
            GID: ${GID:-1000}
        networks:
            - flow
        depends_on:
            - mariadb
        extra_hosts:
            - "host.docker.internal:host-gateway"

    npm:
        image: codingkiwi/laravel-base:8.3
        entrypoint: npm
        command: run watch
        volumes:
            - '.:/var/www/html'
        networks:
            - flow

    mariadb:
        image: 'mariadb'
        environment:
            ....
        volumes:
            - 'flowmysql:/var/lib/mysql'
        networks:
            - flow

networks:
    flow:
        driver: bridge

volumes:
    flowmysql:
        driver: local
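One small gotcha with the ${UID}/${GID} interpolation: Docker Compose only substitutes variables it can actually see, i.e. exported shell variables or entries in the project's .env file, and depending on the shell UID/GID are not exported by default. The easiest fix is to put them into the .env that already exists in the Laravel project root:

# values should match `id -u` and `id -g` on the host
UID=1000
GID=1000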

Xdebug in my container is configured to require a trigger, so it only tries to connect to my VSCode when I want it to instead of on every request. I use the browser extension "Xdebug helper" to set the trigger cookie.
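For reference, the Xdebug part of the image config boils down to settings like these (standard Xdebug 3 options; the exact file layout in the image is an implementation detail):

; xdebug ini snippet inside the container
xdebug.mode=debug
xdebug.start_with_request=trigger
xdebug.client_host=host.docker.internal
xdebug.client_port=9003

On the VSCode side, the PHP Debug extension just needs a listen configuration with a path mapping from the container path to the workspace folder, for example:

{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Listen for Xdebug",
            "type": "php",
            "request": "launch",
            "port": 9003,
            "pathMappings": {
                "/var/www/html": "${workspaceFolder}"
            }
        }
    ]
}

With that in place, a request with the trigger cookie set should stop at breakpoints in VSCode.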