Well, after a while in the container world I've come to realise that keeping all these containers up to date is hard work and time-consuming with plain Docker Compose. I've recently learnt that Portainer may come in handy here. I believe that feeding the YAML file through Portainer lets it take control of updates. Correct?
I have a TrueNAS SCALE machine with a VM running my containers, as I find it's the easiest approach for secure backups: I replicate the VM to another small server just in case.
But I have several layers to maintain. I don't like the idea of apps on TrueNAS, as I'm worried I don't have full control of app backups. Is there a simpler way to keep my containers up to date?


You’ve done the hard work building the compose file. Push that file to a private GitHub repository and set up Renovate bot, and it’ll create PRs to update those containers on whatever cadence and rules you want (such as auto-updating bug-fix releases from certain registries).
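As a rough sketch, a `renovate.json` in the repo root along these lines would watch a compose file and auto-merge patch-level bumps (the exact preset and rules here are illustrative, not a drop-in config):

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "enabledManagers": ["docker-compose"],
  "packageRules": [
    {
      "matchUpdateTypes": ["patch"],
      "automerge": true
    }
  ]
}
```

Minor and major bumps still arrive as normal PRs for you to review and merge when it suits.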
Then you just need to set up SSH access to your VM running the containers and a simple GitHub Action to push the updated compose file and run docker compose up. That’s what I do, and it means updates are just a case of merging a PR when it suits me.
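A minimal workflow for that step could look something like this - the secret names and the `/opt/stack` path are placeholders for whatever your setup uses:

```yaml
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Copy the merged compose file to the VM and restart the stack.
      # SSH_KEY and DOCKER_HOST are assumed repository secrets.
      - name: Push compose file and redeploy
        env:
          SSH_KEY: ${{ secrets.SSH_KEY }}
          HOST: ${{ secrets.DOCKER_HOST }}
        run: |
          echo "$SSH_KEY" > key && chmod 600 key
          scp -i key -o StrictHostKeyChecking=accept-new \
            docker-compose.yml "$HOST:/opt/stack/"
          ssh -i key -o StrictHostKeyChecking=accept-new "$HOST" \
            "cd /opt/stack && docker compose pull && docker compose up -d"
```

So merging a Renovate PR into main is the only manual step; the workflow does the pull and restart for you.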
Also, I would suggest ditching the VM and just running the Docker commands directly on the TrueNAS host - far less overhead, one less OS to maintain, and it makes shared resources (like a GPU) easier to manage.
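For example, with the containers on the host a GPU can be handed to a service straight from the compose file (the `jellyfin` service here is just an illustration, and this assumes the NVIDIA container runtime is set up):

```yaml
services:
  jellyfin:
    image: jellyfin/jellyfin
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```

Doing the same through a VM means PCIe passthrough, which ties the GPU to that one VM.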
You should look at restic or Kopia for backups; they are super efficient and encrypted. All my Docker data is backed up hourly, and thanks to the way it handles snapshots, I have backups going back literally years that don’t actually take up much space.
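With restic that workflow boils down to a few commands - the repository path, data path, and retention numbers below are just examples, and you'd normally run the backup line from cron or a systemd timer:

```shell
# One-off: initialise an encrypted repository
restic -r /mnt/backups/docker init

# Hourly: snapshot the Docker data directory (deduplicated, so
# repeated runs only store what changed)
restic -r /mnt/backups/docker backup /opt/docker-data

# Periodically: thin out old snapshots while keeping long history
restic -r /mnt/backups/docker forget --prune \
  --keep-hourly 24 --keep-daily 7 --keep-monthly 12 --keep-yearly 10
```

The `forget --keep-*` policy is what makes years of history cheap: only the snapshots matching the policy are kept, and `--prune` reclaims the unreferenced data.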