How to pull data from GCS without pipelines

Hi,

I am tracking some folders using DVC with Google Cloud Storage as the remote. I don’t have a dvc.yaml file because I’m not running any pipelines or experiments.

How can I pull the data from just some folders?

I tried doing dvc pull path_subfolder, but this checks for a stage (I can see this with -v). After adding a dummy stage with an echo as cmd and the path as deps, the command goes through, but no data shows up in the workspace (it only ends up in the cache).

Thanks for your help.

How are you tracking folders from a remote location if you don’t have an import-url stage? How did you get the data in the first place? If you imported it with import-url, you should be able to update it as a whole with dvc update.
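For reference, a minimal sketch of that import-url workflow (the bucket path and local folder name here are just placeholders):

```
# track a remote GCS path as an import; this records the source in data/raw.dvc
dvc import-url gs://my-bucket/datasets/raw data/raw

# re-fetch the whole import later if the source has changed
dvc update data/raw.dvc
```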

I just did a dvc add folder, which created folder.dvc, and then dvc push. So I can do a dvc pull and everything in the folder gets downloaded, but I want to select only some subfolders.
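In other words, roughly this (folder is just the placeholder name):

```
# track the local folder and upload its contents to the GCS remote
dvc add folder        # creates folder.dvc
dvc push

# on a fresh checkout, this downloads the whole folder
dvc pull folder.dvc
```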

Ah, it seems like this issue: pull: can't do partial checkouts · Issue #5577 · iterative/dvc (GitHub), which is now resolved. When the next bugfix release comes out (in a couple of days), you will be able to pull partial paths. Thanks for reporting!
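Once that release is out, something along these lines should work (the subfolder name is a placeholder):

```
# pull only a subfolder of a directory tracked with `dvc add folder`
dvc pull folder/some_subfolder
```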

Oh ok nice then! Thanks for your help and support :grinning_face_with_smiling_eyes: