DVC in an AWS SageMaker notebook worked perfectly until about a week ago. Every time I try to do a DVC push to S3, I receive the same error:
ERROR: unexpected error - __init__() got an unexpected keyword argument 'skip_instance_cache'
Having any troubles? Hit us up at https://dvc.org/support, we are always happy to help!
Here is the full output of dvc push -v (the only thing I changed is replacing the actual S3 path with [S3_PATH]):
sh-4.2$ dvc push -v
2022-04-28 07:28:50,999 DEBUG: Preparing to transfer data from '/home/ec2-user/SageMaker/.dvc/cache' to [S3_PATH]
2022-04-28 07:28:51,000 DEBUG: Preparing to collect status from [S3_PATH]
2022-04-28 07:28:51,000 DEBUG: Collecting status from [S3_PATH]
2022-04-28 07:28:51,001 ERROR: unexpected error - __init__() got an unexpected keyword argument 'skip_instance_cache'
------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/cli/__init__.py", line 89, in main
    ret = cmd.do_run()
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/cli/command.py", line 22, in do_run
    return self.run()
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/commands/data_sync.py", line 68, in run
    glob=self.args.glob,
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/repo/__init__.py", line 48, in wrapper
    return f(repo, *args, **kwargs)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/repo/push.py", line 69, in push
    obj_ids, jobs, remote=remote, odb=odb or dest_odb
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data_cloud.py", line 91, in push
    cache_odb=self.repo.odb.local,
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data/transfer.py", line 155, in transfer
    src, dest, obj_ids, check_deleted=False, jobs=jobs, **kwargs
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data/status.py", line 164, in compare_status
    dest, obj_ids, index=dest_index, jobs=jobs, **kwargs
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data/status.py", line 135, in status
    exists.update(odb.hashes_exist(hashes, name=odb.fs_path, jobs=jobs))
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 423, in hashes_exist
    remote_size, remote_hashes = self._estimate_remote_size(hashes, name)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 265, in _estimate_remote_size
    remote_hashes = set(iter_with_pbar(hashes))
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 254, in iter_with_pbar
    for hash_ in hashes:
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 217, in _hashes_with_limit
    for hash_ in self._list_hashes(prefix):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 207, in _list_hashes
    for path in self._list_paths(prefix):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 191, in _list_paths
    yield from self.fs.find(self.fs.path.join(*parts), prefix=bool(prefix))
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/fsspec_wrapper.py", line 189, in find
    files = self.fs.find(with_prefix, prefix=self.path.parts(path)[-1])
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/funcy/objects.py", line 50, in __get__
    return prop.__get__(instance, type)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/funcy/objects.py", line 28, in __get__
    res = instance.__dict__[self.fget.__name__] = self.fget(instance)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/s3.py", line 153, in fs
    return _S3FileSystem(**self.fs_args)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/s3fs/core.py", line 189, in __init__
    self.s3 = self.connect()
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/s3fs/core.py", line 256, in connect
    **self.kwargs)
TypeError: __init__() got an unexpected keyword argument 'skip_instance_cache'
------------------------------------------------------------
2022-04-28 07:28:51,150 DEBUG: [Errno 95] no more link types left to try out: [Errno 95] 'reflink' is not supported by <class 'dvc.fs.local.LocalFileSystem'>: [Errno 95] Operation not supported
------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/cli/__init__.py", line 89, in main
    ret = cmd.do_run()
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/cli/command.py", line 22, in do_run
    return self.run()
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/commands/data_sync.py", line 68, in run
    glob=self.args.glob,
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/repo/__init__.py", line 48, in wrapper
    return f(repo, *args, **kwargs)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/repo/push.py", line 69, in push
    obj_ids, jobs, remote=remote, odb=odb or dest_odb
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data_cloud.py", line 91, in push
    cache_odb=self.repo.odb.local,
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data/transfer.py", line 155, in transfer
    src, dest, obj_ids, check_deleted=False, jobs=jobs, **kwargs
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data/status.py", line 164, in compare_status
    dest, obj_ids, index=dest_index, jobs=jobs, **kwargs
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/data/status.py", line 135, in status
    exists.update(odb.hashes_exist(hashes, name=odb.fs_path, jobs=jobs))
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 423, in hashes_exist
    remote_size, remote_hashes = self._estimate_remote_size(hashes, name)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 265, in _estimate_remote_size
    remote_hashes = set(iter_with_pbar(hashes))
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 254, in iter_with_pbar
    for hash_ in hashes:
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 217, in _hashes_with_limit
    for hash_ in self._list_hashes(prefix):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 207, in _list_hashes
    for path in self._list_paths(prefix):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/objects/db.py", line 191, in _list_paths
    yield from self.fs.find(self.fs.path.join(*parts), prefix=bool(prefix))
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/fsspec_wrapper.py", line 189, in find
    files = self.fs.find(with_prefix, prefix=self.path.parts(path)[-1])
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/funcy/objects.py", line 50, in __get__
    return prop.__get__(instance, type)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/funcy/objects.py", line 28, in __get__
    res = instance.__dict__[self.fget.__name__] = self.fget(instance)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/s3.py", line 153, in fs
    return _S3FileSystem(**self.fs_args)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/s3fs/core.py", line 189, in __init__
    self.s3 = self.connect()
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/s3fs/core.py", line 256, in connect
    **self.kwargs)
TypeError: __init__() got an unexpected keyword argument 'skip_instance_cache'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/utils.py", line 28, in _link
    func(from_path, to_path)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/local.py", line 145, in reflink
    System.reflink(from_info, to_info)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/system.py", line 112, in reflink
    System._reflink_linux(source, link_name)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/system.py", line 96, in _reflink_linux
    fcntl.ioctl(d.fileno(), FICLONE, s.fileno())
OSError: [Errno 95] Operation not supported
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/utils.py", line 69, in _try_links
    return _link(link, from_fs, from_path, to_fs, to_path)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/utils.py", line 34, in _link
    ) from exc
OSError: [Errno 95] 'reflink' is not supported by <class 'dvc.fs.local.LocalFileSystem'>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/utils.py", line 124, in _test_link
    _try_links([link], from_fs, from_file, to_fs, to_file)
  File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.7/site-packages/dvc/fs/utils.py", line 79, in _try_links
    ) from error
OSError: [Errno 95] no more link types left to try out
------------------------------------------------------------
2022-04-28 07:28:51,151 DEBUG: Removing '/home/ec2-user/SageMaker/.CNzpVvSTZswKCiRWdm3JZN.tmp'
2022-04-28 07:28:51,151 DEBUG: Removing '/home/ec2-user/SageMaker/.CNzpVvSTZswKCiRWdm3JZN.tmp'
2022-04-28 07:28:51,151 DEBUG: Removing '/home/ec2-user/SageMaker/.CNzpVvSTZswKCiRWdm3JZN.tmp'
2022-04-28 07:28:51,151 DEBUG: Removing '/home/ec2-user/SageMaker/DS-16-MEX-NC/.dvc/cache/.mTyDWUx2EnLr7DGFySCmAh.tmp'
2022-04-28 07:28:51,154 DEBUG: Version info for developers:
DVC version: 2.10.1 (pip)
---------------------------------
Platform: Python 3.7.12 on Linux-4.14.252-131.483.amzn1.x86_64-x86_64-with-glibc2.10
Supports:
	webhdfs (fsspec = 2022.3.0),
	http (aiohttp = 3.8.1, aiohttp-retry = 2.4.6),
	https (aiohttp = 3.8.1, aiohttp-retry = 2.4.6),
	s3 (s3fs = 0.2.0, boto3 = 1.21.42)
Cache types: hardlink, symlink
Cache directory: ext4 on /dev/xvdf
Caches: local
Remotes: s3
Workspace directory: ext4 on /dev/xvdf
Repo: dvc, git
Having any troubles? Hit us up at https://dvc.org/support, we are always happy to help!
2022-04-28 07:28:51,156 DEBUG: Analytics is enabled.
2022-04-28 07:28:51,217 DEBUG: Trying to spawn '['daemon', '-q', 'analytics', '/tmp/tmp7v_heiau']'
2022-04-28 07:28:51,218 DEBUG: Spawned '['daemon', '-q', 'analytics', '/tmp/tmp7v_heiau']'
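In case it is useful: the version block above reports s3 (s3fs = 0.2.0) next to fsspec = 2022.3.0, and the traceback ends inside s3fs/core.py, so my guess is a package version mismatch in the notebook environment rather than a DVC bug. This is what I plan to run next in the notebook terminal to inspect the installed S3 stack and then let pip resolve a matching set again; the package list is just my assumption about what is relevant, and I have not yet confirmed that reinstalling fixes the error:

sh-4.2$ pip freeze | grep -E "dvc|s3fs|fsspec|aiobotocore|boto3"
sh-4.2$ pip install --upgrade "dvc[s3]"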
Could you please help me fix this?