I added my data with dvc add mydata/path, which produced mydata.dvc as expected.
The config file in the .dvc folder is:
[core]
remote = my_remote
[cache]
type = "reflink,hardlink"
['remote "my_remote"']
url = s3://…
When I ran dvc push, it raised this error:
unexpected error - __init__() got an unexpected keyword argument 'cache_regions'
Can anyone help me figure out what is going wrong? The full traceback follows:
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/main.py", line 55, in main
    ret = cmd.do_run()
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/command/base.py", line 45, in do_run
    return self.run()
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/command/data_sync.py", line 57, in run
    processed_files_count = self.repo.push(
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/repo/__init__.py", line 50, in wrapper
    return f(repo, *args, **kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/repo/push.py", line 48, in push
    pushed += self.cloud.push(obj_ids, jobs, remote=remote, odb=odb)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/data_cloud.py", line 85, in push
    return transfer(
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/objects/transfer.py", line 153, in transfer
    status = compare_status(src, dest, obj_ids, check_deleted=False, **kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/objects/status.py", line 160, in compare_status
    dest_exists, dest_missing = status(
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/objects/status.py", line 122, in status
    exists = hashes.intersection(
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/objects/status.py", line 48, in _indexed_dir_hashes
    dir_exists.update(odb.list_hashes_exists(dir_hashes - dir_exists))
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/objects/db/base.py", line 415, in list_hashes_exists
    ret = list(itertools.compress(hashes, in_remote))
  File "/home/ubuntu/anaconda3/lib/python3.8/concurrent/futures/_base.py", line 611, in result_iterator
    yield fs.pop().result()
  File "/home/ubuntu/anaconda3/lib/python3.8/concurrent/futures/_base.py", line 432, in result
    return self.__get_result()
  File "/home/ubuntu/anaconda3/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
    raise self._exception
  File "/home/ubuntu/anaconda3/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/objects/db/base.py", line 406, in exists_with_progress
    ret = self.fs.exists(path_info)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/fs/fsspec_wrapper.py", line 97, in exists
    return self.fs.exists(self._with_bucket(path_info))
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/funcy/objects.py", line 50, in __get__
    return prop.__get__(instance, type)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/funcy/objects.py", line 28, in __get__
    res = instance.__dict__[self.fget.__name__] = self.fget(instance)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/dvc/fs/s3.py", line 157, in fs
    return _S3FileSystem(**self.fs_args)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/fsspec/spec.py", line 75, in __call__
    obj = super().__call__(*args, **kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/s3fs/core.py", line 187, in __init__
    self.s3 = self.connect()
  File "/home/ubuntu/anaconda3/lib/python3.8/site-packages/s3fs/core.py", line 280, in connect
    self.session = botocore.session.Session(**self.kwargs)
TypeError: __init__() got an unexpected keyword argument 'cache_regions'
DEBUG: Version info for developers:
DVC version: 2.7.4 (pip)
Platform: Python 3.8.5 on Linux-5.4.0-1056-aws-x86_64-with-glibc2.10
Supports:
hdfs (pyarrow = 4.0.1),
http (aiohttp = 3.7.4.post0, aiohttp-retry = 2.4.5),
https (aiohttp = 3.7.4.post0, aiohttp-retry = 2.4.5),
s3 (s3fs = 0.4.2, boto3 = 1.17.72)
Cache types: hardlink, symlink
Cache directory: ext4 on /dev/nvme0n1p1
Caches: local
Remotes: s3
Workspace directory: ext4 on /dev/nvme0n1p1
Repo: dvc, git
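For context on what the error means: the TypeError at the bottom of the traceback is Python's generic complaint when a constructor receives a keyword it does not recognize. In the traceback, the installed s3fs (0.4.2, per the version info above) forwards its leftover kwargs to botocore.session.Session, which does not accept cache_regions, so the mismatch looks like an old s3fs paired with a newer DVC (2.7.4) that passes that argument. A minimal sketch of the same failure mode, where the Session class below is a stand-in and not botocore's:

```python
# Stand-in for a constructor that does not accept `cache_regions`
# (analogous to botocore.session.Session in the traceback above).
class Session:
    def __init__(self, profile=None):
        self.profile = profile


try:
    # An unrecognized keyword argument raises the same kind of
    # TypeError reported by `dvc push`.
    Session(cache_regions=True)
except TypeError as exc:
    # The message names the offending keyword, 'cache_regions'.
    print(exc)
```

Given that, reinstalling the S3 extra so pip resolves an s3fs version matching this DVC release (pip install "dvc[s3]") seems like a plausible fix, but I am not sure.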