Build Your Own LLM Service with the Open-Source Qwen (Tongyi Qianwen) Model

2024-01-08 22:42:28

Goals

1. Stand up a model service of your own on top of an open-source large language model;

2. Tune the model for your own use cases;

Model Selection

Use the Qwen (Tongyi Qianwen) model: https://github.com/QwenLM/Qwen

Steps

1. Download the model files

Open-source model hub: https://www.modelscope.cn/models

mkdir -p /data/qwen
cd /data/qwen
git clone --depth 1 https://www.modelscope.cn/qwen/Qwen-14B-Chat.git
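Before going further, it is worth verifying that the clone actually pulled down the weights. A minimal sketch (check_weight_file is my own helper, not part of the Qwen repo): if git-lfs was missing during the clone, each multi-gigabyte .safetensors shard is replaced by a tiny text pointer beginning with "version https://git-lfs".

```shell
# check_weight_file: print "pointer" if the file is only a git-lfs
# pointer stub, "ok" if it looks like a real binary weight shard.
check_weight_file() {
  if head -c 40 "$1" | grep -q '^version https://git-lfs'; then
    echo "pointer"
  else
    echo "ok"
  fi
}

# Check every safetensors shard in the checkout (no-op if none exist).
for f in /data/qwen/Qwen-14B-Chat/*.safetensors; do
  if [ -e "$f" ]; then
    echo "$f: $(check_weight_file "$f")"
  fi
done
```

If any file reports "pointer", install git-lfs and re-clone before continuing.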

2. Pull the Docker image

docker pull qwenllm/qwen

3. Prepare the launch script

Download the launch script from: https://github.com/QwenLM/Qwen/blob/main/docker/docker_web_demo.sh

# Modify the following variables
IMAGE_NAME=qwenllm/qwen
QWEN_CHECKPOINT_PATH=/data/qwen/Qwen-14B-Chat
PORT=8000
CONTAINER_NAME=qwen-14B
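The four edits above can also be applied non-interactively. A sketch, assuming the variable names shown (configure_demo is my own wrapper, not part of the Qwen tooling; it uses GNU sed's in-place edit):

```shell
# configure_demo: rewrite the four variables in docker_web_demo.sh.
configure_demo() {
  sed -i \
    -e 's|^IMAGE_NAME=.*|IMAGE_NAME=qwenllm/qwen|' \
    -e 's|^QWEN_CHECKPOINT_PATH=.*|QWEN_CHECKPOINT_PATH=/data/qwen/Qwen-14B-Chat|' \
    -e 's|^PORT=.*|PORT=8000|' \
    -e 's|^CONTAINER_NAME=.*|CONTAINER_NAME=qwen-14B|' \
    "$1"
}

# Apply to the downloaded script if it is in the current directory.
if [ -f docker_web_demo.sh ]; then
  configure_demo docker_web_demo.sh
fi
```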

4. Run

sh docker_web_demo.sh

Then open http://localhost:8000 (the PORT configured in the script above).
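Loading a 14B checkpoint can take a while, so the web demo may not answer immediately. A small polling helper to confirm it is up (wait_for_http is my own, not part of the Qwen tooling; it assumes curl is installed):

```shell
# wait_for_http: poll URL once per second until it responds (exit 0)
# or TIMEOUT seconds pass (exit 1).
wait_for_http() {
  url=$1; timeout=${2:-60}; waited=0
  until curl -fsS -o /dev/null "$url" 2>/dev/null; do
    waited=$((waited + 1))
    if [ "$waited" -ge "$timeout" ]; then
      return 1
    fi
    sleep 1
  done
  return 0
}

# Usage after running the script (PORT=8000 as configured above):
#   wait_for_http http://localhost:8000 120 && echo "web demo is up"
#   docker logs qwen-14B    # inspect startup output if it never responds
```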

References

https://github.com/QwenLM/Qwen/blob/main/README_CN.md

FAQ

1. The container fails to start with a GPU device error?

docker: Error response from daemon: could not select device driver "" with capabilities: [[gpu]].

Fix: remove --gpus all from docker_web_demo.sh so the container runs on CPU (or install the NVIDIA Container Toolkit so Docker can pass the GPU through).
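The removal can be scripted. A sketch (strip_gpu_flag is my own helper name, not part of the Qwen tooling):

```shell
# strip_gpu_flag: delete the "--gpus all" option from a script so the
# container falls back to CPU.
strip_gpu_flag() {
  sed -i 's/--gpus all//g' "$1"
}

if [ -f docker_web_demo.sh ]; then
  strip_gpu_flag docker_web_demo.sh
fi
```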

2. Model loading fails with a HeaderTooLarge error?

This happens when git-lfs was not installed before cloning, so the checkout holds small LFS pointer files instead of the real weights. Install git-lfs first (yum install git-lfs), then re-download the model files. The error looks like this:

Traceback (most recent call last):
  File "web_demo.py", line 209, in <module>
    main()
  File "web_demo.py", line 203, in main
    model, tokenizer, config = _load_model_tokenizer(args)
  File "web_demo.py", line 50, in _load_model_tokenizer
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/auto/auto_factory.py", line 511, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/modeling_utils.py", line 3091, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.8/dist-packages/transformers/modeling_utils.py", line 3456, in _load_pretrained_model
    state_dict = load_state_dict(shard_file)
  File "/usr/local/lib/python3.8/dist-packages/transformers/modeling_utils.py", line 458, in load_state_dict
    with safe_open(checkpoint_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge

Article source: https://blog.csdn.net/HDF734839030/article/details/135462024