The 443 error that often occurs when downloading Hugging Face models

2023-12-26 11:23:45

Notes: picking compatible library versions and handling the proxy

BUG:

requests.exceptions.ProxyError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by ProxyError('Unable to connect to proxy', SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF)

When downloading a model, for example

from sentence_transformers import SentenceTransformer
model = SentenceTransformer('all-MiniLM-L6-v2')

the download fails with ProxyError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443).
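Since the fix below hinges on which requests/urllib3 versions are installed, it helps to check them first. A minimal sketch (both packages expose a version string):

import requests, urllib3
print("requests:", requests.__version__)   # downgraded to 2.27.1 below
print("urllib3:", urllib3.__version__)     # downgraded to 1.25.11 below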

Attempted fixes:

1. Downgrade requests and urllib3

2. In addition, the proxy (yiyuanfeiji) needs to be turned on; see the sketch after the pip commands for making it visible to Python

pip install requests==2.27.1
pip install urllib3==1.25.11
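For the proxy step, here is a minimal sketch of routing the download through a local proxy, assuming the proxy client listens on 127.0.0.1:7890 (the address and port are assumptions; substitute whatever your client actually exposes). requests, and therefore huggingface_hub and sentence-transformers, honor the standard HTTP_PROXY/HTTPS_PROXY environment variables:

import os

# Assumed local proxy endpoint; replace with the port your proxy client uses.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:7890"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:7890"

from sentence_transformers import SentenceTransformer

# With requests==2.27.1 and urllib3==1.25.11 installed, the download goes through the proxy.
model = SentenceTransformer('all-MiniLM-L6-v2')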

Download succeeded!

config_sentence_transformers.json: 100%|██████████| 116/116 [00:00<00:00, 58.2kB/s]
README.md: 100%|██████████| 10.6k/10.6k [00:00<00:00, 3.55MB/s]
sentence_bert_config.json: 100%|██████████| 53.0/53.0 [00:00<00:00, 17.7kB/s]
config.json: 100%|██████████| 612/612 [00:00<00:00, 205kB/s]
pytorch_model.bin: 100%|██████████| 90.9M/90.9M [00:15<00:00, 5.69MB/s]
tokenizer_config.json: 100%|██████████| 350/350 [00:00<00:00, 117kB/s]
vocab.txt: 100%|██████████| 232k/232k [00:00<00:00, 1.05MB/s]
tokenizer.json: 100%|██████████| 466k/466k [00:00<00:00, 3.30MB/s]
special_tokens_map.json: 100%|██████████| 112/112 [00:00<00:00, 37.5kB/s]
1_Pooling/config.json: 100%|██████████| 190/190 [00:00<00:00, 63.6kB/s]
2023-12-25 18:08:57 - Use pytorch device_name: cpu
Batches: 100%|██████████| 1/1 [00:00<00:00, 2.68it/s]
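The "Batches" line above comes from running an embedding pass after the download finished. A minimal verification sketch (the test sentence is just an illustrative placeholder):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer('all-MiniLM-L6-v2')  # loads from the local cache once downloaded

# Encode one throwaway sentence end to end; show_progress_bar produces the "Batches" bar.
embeddings = model.encode(["hello huggingface"], show_progress_bar=True)
print(embeddings.shape)  # all-MiniLM-L6-v2 returns 384-dimensional vectors, so (1, 384)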

Source: https://blog.csdn.net/weixin_46141492/article/details/135206086