langchain-chatglm6b git project
References
Environment
- single machine, 2× RTX 3090 (24 GB)
- python 3.10.11
- CUDA 12.0
- torch 2.0.1
Environment setup
The langchain-chatglm project
git clone https://github.com/chatchat-space/langchain-ChatGLM # clone the repo
cd langchain-ChatGLM/
pip install -r requirements.txt # install dependencies (many packages; this can be slow)
Prepare the chatglm2 model weights in advance
git clone https://huggingface.co/THUDM/chatglm2-6b-int4
ls /data0/LLMs/chatglm2-6b
Prepare the embedding model weights in advance
git lfs install # alternatively, skip git-lfs and download pytorch_model.bin manually in a browser
git clone https://huggingface.co/GanymedeNil/text2vec-base-chinese
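Before wiring these local paths into the config, it can be worth verifying that the large weight files actually landed on disk: a failed git-lfs fetch leaves only a tiny pointer stub where pytorch_model.bin should be. A minimal standard-library sketch; the helper name, the example paths, and the 1 MB size threshold are my own illustrative assumptions:

```python
import os

def weights_look_complete(model_dir, filename="pytorch_model.bin", min_bytes=1_000_000):
    """Return True if the weight file exists and is larger than a git-lfs pointer stub."""
    path = os.path.join(model_dir, filename)
    return os.path.isfile(path) and os.path.getsize(path) >= min_bytes

# Example (paths are placeholders for your local checkouts):
# weights_look_complete("/data0/dig/text2vec-base-chinese")
# weights_look_complete("/data0/LLMs/chatglm2-6b")
```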
Start and test
Edit the configuration
vi configs/model_config.py
embedding_model_dict = {
"ernie-tiny": "nghuyong/ernie-3.0-nano-zh",
"ernie-base": "nghuyong/ernie-3.0-base-zh",
"text2vec-base": "/home/featurize/data/text2vec-base-chinese",
"text2vec": "/data0/dig/text2vec-base-chines", # 修改处
"m3e-small": "moka-ai/m3e-small",
"m3e-base": "moka-ai/m3e-base",
}
llm_model_dict = {
...
"chatglm2-6b": {
"name": "chatglm2-6b",
"pretrained_model_name": "/data0/LLMs/chatglm2-6", # 修改处
"local_model_path": None,
"provides": "ChatGLM"
},
...
}
# change the LLM name to chatglm2-6b
LLM_MODEL = "chatglm2-6b"
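The LLM_MODEL name is looked up in llm_model_dict to decide which checkpoint gets loaded, preferring an explicit local path over the pretrained name. A rough sketch of that resolution logic; the dict mirrors the entry edited above, but the helper function name is my own, not the project's actual API:

```python
llm_model_dict = {
    "chatglm2-6b": {
        "name": "chatglm2-6b",
        "pretrained_model_name": "/data0/LLMs/chatglm2-6b",
        "local_model_path": None,
        "provides": "ChatGLM",
    },
}

LLM_MODEL = "chatglm2-6b"

def resolve_model_path(model_key):
    """Prefer an explicit local_model_path; fall back to pretrained_model_name."""
    cfg = llm_model_dict[model_key]
    return cfg["local_model_path"] or cfg["pretrained_model_name"]

print(resolve_model_path(LLM_MODEL))  # /data0/LLMs/chatglm2-6b
```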
Launch the web UI
python webui.py
Upload a document:
Simple test:
CLI test
python cli_demo.py
Input your local knowledge file path 请输入本地知识文件路径:/data/lsy/langchain-ChatGLM/docs/FAQ.md
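Under the hood, the knowledge file supplied here is split into overlapping text chunks before being embedded and indexed. A toy sketch of that kind of fixed-size splitting; the chunk size and overlap values are illustrative, not the project's actual defaults:

```python
def split_text(text, chunk_size=200, overlap=20):
    """Split text into overlapping fixed-size chunks, ready for embedding."""
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks
```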
Q&A test:
Troubleshooting
vi webui.py
(demo
.queue(concurrency_count=3)
.launch(server_name='0.0.0.0',
server_port=17860, # changed: custom port
show_api=False,
share=False,
inbrowser=False))
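If the launch still fails with an "address already in use" error, the chosen port is taken by another process. A quick standard-library probe to check a port before editing webui.py; the function name is my own:

```python
import socket

def port_is_free(port, host="0.0.0.0"):
    """Try to bind the port; if the bind succeeds, the port is free."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

# port_is_free(17860)
```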