I’m using DVC for two projects that share the same remote and the same cache.
I would like to clean both the remote and the cache so that only the data used in the two projects’ workspaces is kept.
But when I run “gc” with these options, “dvc gc -c -w -p PATH_TO_FIRST_PROJECT PATH_TO_SECOND_PROJECT”, I get this error:
ERROR: Unable to acquire lock. Most likely another DVC process is running or was terminated abruptly. Check the page https://dvc.org/doc/user-guide/troubleshooting#lock-issue for other possible reasons and to learn how to resolve this.
Following the indicated webpage, I tried removing the “.dvc/tmp/lock” file in both projects, but the error still occurs when I try dvc gc.
The same error also happens without the cloud option “-c”, i.e. when only trying to clean the shared cache.
However, I am able to clean the cache by running “dvc gc -w” inside either of the two projects. That works, but without the “-p” option it also removes the files tracked only by the other project’s workspace.
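To summarize, here is the sequence of commands I tried (the project paths are placeholders for my actual local paths):

```shell
# Clean cache and remote, keeping data used by both workspaces — fails with the lock error:
dvc gc -c -w -p PATH_TO_FIRST_PROJECT PATH_TO_SECOND_PROJECT

# Same error without -c (cache only):
dvc gc -w -p PATH_TO_FIRST_PROJECT PATH_TO_SECOND_PROJECT

# Removing the lock files, as suggested by the troubleshooting page, did not help:
rm PATH_TO_FIRST_PROJECT/.dvc/tmp/lock
rm PATH_TO_SECOND_PROJECT/.dvc/tmp/lock

# Works from inside either project, but discards the other project's workspace data:
dvc gc -w
```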
If someone has already run into this kind of issue and found a way to handle it, I would be glad to hear your feedback.
I don’t know whether this is due to a misconfiguration on my side or a bug in DVC…
PS: The output of the “dvc version” command on my system is:
Platform: Python 3.9.12 on Linux-5.13.0-40-generic-x86_64-with-glibc2.31
Supports:
webhdfs (fsspec = 2022.3.0),
http (aiohttp = 3.8.1, aiohttp-retry = 2.4.6),
https (aiohttp = 3.8.1, aiohttp-retry = 2.4.6),
ssh (sshfs = 2022.3.1)
Cache types: hardlink, symlink
Cache directory: ext4 on /dev/mapper/vgubuntu-root
Workspace directory: ext4 on /dev/mapper/vgubuntu-root
Repo: dvc, git