[SOLVED] Docker ERROR: Preparation failed

Hi,

I’m getting this message when pipeline starts:
ERROR: Preparation failed: adding cache volume: set volume permissions: running permission container "44b3c9dcc096cda7aab18742e0016572764550d413a1097fba846e0afda8dcbd" for volume "runner-gup5trke-project-1041-concurrent-0-cache-3c3f060a0374fc8bc39395164f415a70": starting permission container: Error response from daemon: error evaluating symlinks from mount source "/var/lib/docker/volumes/runner-gup5trke-project-1041-concurrent-0-cache-3c3f060a0374fc8bc39395164f415a70/_data": lstat /var/lib/docker/volumes/runner-gup5trke-project-1041-concurrent-0-cache-3c3f060a0374fc8bc39395164f415a70: no such file or directory (linux_set.go:105:0s)

A few days back it worked just fine.
It started happening after I added variables to CI/CD, but it may be a coincidence.
Anyway, I removed them - still the same error.
I tried other images as well, but no luck.
Also, another of my repos works just fine with similar settings.
Please take a look at it.

Regards,
SeeLook

Looking into this now, thank you for letting us know.

I have restarted one of your jobs and it appears to now be working. If you have any further issues please don’t hesitate to let us know.

Edit: Job appimage-amd64 (#9977) · Jobs · SeeLook / nootka · GitLab

Thank you a lot for fixing this issue!
And above all, thank you for maintaining the opencode.net service - I enjoy using it very much.

I will extend the pipeline rules, so if anything goes wrong again I will write here.

@justinz we are experiencing something similar here, could you please take a look at it:

This error appeared again:

About a week ago it worked properly.
Could you fix it, please?

There’s a GitLab update happening tomorrow. I’ll restart the Docker system now, and maybe the fix will be in the new runner update.

It’s working now!
@justinz Thank you a lot.

Hello again,

The same situation but with another repo:

It worked a few days ago.

I’ve set up a script to delete old volumes that seem to be blocking new volumes from being created. It seems GitLab Runner isn’t very good at cleaning up after itself. Hopefully this will keep it from happening again until GitLab Runner gets smarter about cleanup.
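For reference, here is a minimal sketch of that kind of cleanup. The actual script running on the server isn’t posted in this thread, so the "runner-" name filter and the stale-mountpoint check are assumptions based on the volume name and the "lstat ... no such file or directory" message in the error above:

```python
#!/usr/bin/env python3
"""Remove GitLab Runner cache volumes whose backing directory has vanished."""
import os
import subprocess

# Cache volumes created by the runner are named like
# "runner-<token>-project-<id>-concurrent-<n>-cache-<hash>" (see the error
# above), so filter on the "runner-" prefix. The prefix is an assumption
# about this particular setup.
volumes = subprocess.run(
    ["docker", "volume", "ls", "-q", "--filter", "name=runner-"],
    capture_output=True, text=True, check=True,
).stdout.split()

for name in volumes:
    # Ask Docker where the volume's data directory lives on the host.
    mountpoint = subprocess.run(
        ["docker", "volume", "inspect", "-f", "{{ .Mountpoint }}", name],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

    # If that directory is gone, the volume record is stale and the next job
    # that tries to mount it fails with the error quoted above, so drop it.
    if not os.path.isdir(mountpoint):
        subprocess.run(["docker", "volume", "rm", name], check=True)
        print(f"removed stale volume {name}")
```

Something like this can run from cron on the runner host; a periodic `docker volume prune` would also work if losing the cache between jobs is acceptable.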

Thank you for that.
It’s working now.

Amazing, thank you for that.