Well, after a while in the container world I’ve come to realise that keeping all these containers up to date is hard work and time-consuming with plain docker compose. I’ve recently learnt that Portainer may come in handy here. I believe that feeding the YAML file through Portainer allows it to take control of updates. Correct?

I have a TrueNAS Scale machine with a VM running my containers, as I find it’s the easiest approach for secure backups: I replicate the VM to another small server just in case.

But I have several layers to maintain. I don’t like the idea of apps on TrueNAS, as I’m worried I wouldn’t have full control of app backups. Is there a simpler way to keep my containers up to date?

  • Benjy33@lemmy.world · 4 hours ago

    One thing that helps with the “should I update?” anxiety: knowing what changed and whether it actually matters for your setup before pulling the trigger.

    I built Maintenant (GitHub) and one of the features I added for exactly this reason is update detection — it checks OCI registries via HEAD requests (no image pulls, no bandwidth waste) and tells you which containers have newer images available. The Pro tier goes further with CVE detection and risk scoring that cross-references whether the container is exposed to the internet, has dependents, etc.
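    For anyone curious how that kind of registry-side check works, here is a rough sketch of the mechanics (this is not Maintenant’s actual code, just the underlying idea; the image name is a placeholder and the snippet assumes a public Docker Hub repo):

```shell
#!/bin/sh
# Compare the remote manifest digest with the locally pulled one.
digests_differ() {
  remote="$1"; local_d="$2"
  if [ -n "$remote" ] && [ "$remote" != "$local_d" ]; then
    echo "update available"
  else
    echo "up to date"
  fi
}

IMAGE="library/nginx"; TAG="latest"   # placeholder image

# Docker Hub hands out anonymous pull tokens for public repos.
TOKEN=$(curl -fsS "https://auth.docker.io/token?service=registry.docker.io&scope=repository:${IMAGE}:pull" \
          | sed -n 's/.*"token" *: *"\([^"]*\)".*/\1/p')

# A HEAD request returns the Docker-Content-Digest header without
# transferring any layers -- this is the "no bandwidth waste" part.
REMOTE=$(curl -fsSI -H "Authorization: Bearer ${TOKEN}" \
  -H "Accept: application/vnd.oci.image.index.v1+json, application/vnd.docker.distribution.manifest.list.v2+json" \
  "https://registry-1.docker.io/v2/${IMAGE}/manifests/${TAG}" \
  | tr -d '\r' | awk 'tolower($1)=="docker-content-digest:"{print $2}')

# Digest of the locally pulled copy (empty if the image is not present).
LOCAL=$(docker image inspect "nginx:${TAG}" --format '{{index .RepoDigests 0}}' 2>/dev/null | cut -d@ -f2)

digests_differ "$REMOTE" "$LOCAL"
```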

    It won’t auto-update anything — that’s a deliberate choice, the tool is observe-only and never touches your stack. But it gives you the information to decide when updating is worth the risk, instead of either blindly auto-updating with Watchtower or manually checking Docker Hub every week.

    Also does container monitoring, HTTP/TCP checks, heartbeats for cron jobs, and SSL cert tracking if you want to consolidate. Single container, zero config, ~17 MB RAM. AGPL-3.0.

  • irmadlad@lemmy.world · 7 hours ago

    I run about 70 containers. I really don’t find it that difficult. I do run a Watchtower fork, but I run it with --run-once --cleanup. I do that once a month after I feel confident that everyone else has done all the beta testing on the new updates for me. So hats off to all you guys who just yolo your updates. You are an invaluable resource to the selfhosting community. Thank you.
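    For reference, the monthly one-shot pass could be wired up as a crontab entry roughly like this (the commenter runs a fork, so the image name will differ; `containrrr/watchtower` is the common upstream):

```
# crontab fragment: single update pass at 04:00 on the 1st of each month.
# --run-once exits after one pass; --cleanup removes the superseded images.
0 4 1 * *  docker run --rm -v /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once --cleanup
```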

    As far as Linux updates, I’m running Ubuntu Jammy, so those updates don’t usually introduce breaking changes and I apply them as they become available. I use Portainer, but I am unaware of any auto-update feature for Docker containers. You can feed it a new YAML and it will replace or recreate the container based on that YAML, but it doesn’t do it automatically. Portainer is just a handy way to consolidate all your container administration in one place in lieu of using the terminal.

    There are other options for updating your containers, like WUD or similar. They will alert you that there is an update, but you have to manually initiate it. Anecdotally, I’ve only encountered one breaking change, and that was when Portainer updated but was incompatible at the time with the current version of Docker, or something like that. Memory is foggy this morning. It took about an hour to find and implement a fix, so it wasn’t an excruciating changeup.

    • lankydryness@lemmy.world · 5 hours ago

      I messed around in Portainer before, and I think OP may be referring to its feature where it can watch a git repo and, any time a change occurs, it’ll try to do a pull and recreate the container.

  • SkyNTP@lemmy.ml · edited · 8 hours ago

    Are you updating thousands of stacks every week? I update a couple of critical things maybe once a month, and the other stuff maybe twice a year.

    I don’t recommend auto updates, because updates break things and dealing with that is a lot of work.

  • Kushan@lemmy.world · 9 hours ago

    You’ve done the hard work building the compose file. Push that file to a private GitHub repository and set up Renovate bot; it’ll create PRs to update those containers on whatever cadence and rules you want (such as auto-updating bug fixes from certain registries).

    Then you just need to set up SSH access to your VM running the containers and a simple GitHub action to push the updated compose file and run docker compose up. That’s what I do and it means updates are just a case of merging in a PR when it suits me.
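    The deploy step such an Action could run over SSH might look like this sketch (the stack path is a placeholder, and the Action wiring itself — secrets, triggers — is left out):

```shell
#!/bin/sh
# Redeploy a compose stack after a Renovate PR is merged.
deploy_stack() {
  # $1 = directory on the VM containing the compose file
  cd "$1" || return 1
  git pull --ff-only            # bring in the merged compose changes
  docker compose pull           # fetch the images Renovate bumped
  docker compose up -d --remove-orphans
  docker image prune -f         # clear out superseded image layers
}

# Hypothetical invocation from the Action, e.g.:
#   ssh deploy@your-vm 'sh -s' < deploy.sh
# deploy_stack /opt/compose/mystack
```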

    Also, I would suggest ditching the VM and just running the Docker containers directly on the TrueNAS host: far less overhead, one less OS to maintain, and it makes shared resources (like a GPU) easier to manage.

    You should look at restic or Kopia for backups; they are super efficient and encrypted. All my Docker data is backed up hourly, and thanks to the way it handles snapshots, I have backups going back literally years that don’t actually take up much space.
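    An hourly restic routine along those lines could look like this sketch (repository path, password file, data directory, and retention numbers are all placeholders):

```shell
#!/bin/sh
# Hourly backup job sketch using restic.
run_backup() {
  export RESTIC_REPOSITORY=/mnt/backup/restic-repo
  export RESTIC_PASSWORD_FILE=/root/.restic-password

  # Encrypted, deduplicated snapshot -- repeated runs only store changes.
  restic backup /opt/docker-data --tag hourly

  # Retention: keep years of history while pruning data no snapshot needs.
  restic forget --keep-hourly 24 --keep-daily 7 \
    --keep-monthly 12 --keep-yearly 5 --prune
}

# run_backup   # invoke from cron or a systemd timer
```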

  • Caveman@lemmy.world · 9 hours ago

    Since there’s no lack of solutions here, I’m going to add one more. If you write a bash script to update the containers, you can have it run from a systemd service, which is easy to set up, and it’ll work the same as running the commands on your computer yourself.
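    The systemd side could look something like this sketch, pairing a oneshot service with a timer (unit names, the script path, and the schedule are all made up for illustration):

```ini
# /etc/systemd/system/container-update.service  (hypothetical name)
[Unit]
Description=Run container update script
Wants=docker.service
After=docker.service

[Service]
Type=oneshot
ExecStart=/usr/local/bin/update-containers.sh

# /etc/systemd/system/container-update.timer
[Unit]
Description=Scheduled container update

[Timer]
OnCalendar=monthly
Persistent=true

[Install]
WantedBy=timers.target
```

    Then `systemctl daemon-reload && systemctl enable --now container-update.timer` would arm the schedule.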

  • monkeyman512@lemmy.world · 13 hours ago

    I just started playing with Dockhand, and it looks like it has a built-in update scheduling mechanism. It fills a comparable role to Portainer, so maybe check that out.