1. Check the Python environment
root@llmserver:~# python -V
Python 3.12.9
root@llmserver:~# pip3 --version
pip 25.0.1 from /usr/local/lib/python3.12/site-packages/pip (python 3.12)
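The same check can be done programmatically. A minimal sketch; the exact minimum version here is an assumption, so confirm the current floor in unsloth's documentation:

```python
import sys

# Assumed minimum Python version for unsloth (verify against the official docs).
MIN_PYTHON = (3, 9)

def python_version_ok(version_info=None):
    """Return True if the interpreter meets the assumed minimum version."""
    vi = sys.version_info if version_info is None else version_info
    return tuple(vi[:2]) >= MIN_PYTHON

print(python_version_ok())  # True on Python 3.12.9
```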
2. Check the GPU's CUDA version
root@llmserver:~# nvidia-smi
Tue Mar 11 09:46:59 2025
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.144.03 Driver Version: 550.144.03 CUDA Version: 12.4 |
|-----------------------------------------+------------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+========================+======================|
| 0 NVIDIA GeForce RTX 4090 Off | 00000000:01:00.0 Off | Off |
| 30% 26C P8 24W / 450W | 4MiB / 24564MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
+-----------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=========================================================================================|
| No running processes found |
+-----------------------------------------------------------------------------------------+
My GPU is an NVIDIA GeForce RTX 4090, with CUDA Version 12.4.
3. Install dependencies
- Install the machine learning framework
pip3 install torch torchvision torchaudio
Official PyTorch install-command generator: Start Locally | PyTorch
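After installing, it is worth confirming that PyTorch actually sees the GPU before going further. A minimal check, guarded so it degrades gracefully if torch is not installed yet:

```python
def cuda_status():
    """Return (torch_version, cuda_available), or (None, False) if torch is missing."""
    try:
        import torch
    except ImportError:
        return (None, False)
    return (torch.__version__, torch.cuda.is_available())

version, available = cuda_status()
print("torch:", version, "| CUDA available:", available)
```

On the RTX 4090 box above this should report CUDA as available; if it prints False, the installed wheel likely does not match the driver's CUDA version.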
- Install cuDNN
wget https://developer.download.nvidia.com/compute/cudnn/9.8.0/local_installers/cudnn-local-repo-rhel9-9.8.0-1.0-1.x86_64.rpm
sudo rpm -i cudnn-local-repo-rhel9-9.8.0-1.0-1.x86_64.rpm
sudo dnf clean all
sudo dnf -y install cudnn
sudo dnf -y install cudnn-cuda-12
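PyTorch can report the cuDNN version it is using, which is a quick way to sanity-check this step. Note that the pip wheels also bundle their own nvidia-cudnn-cu12, so the version torch reports may differ from the system RPM. A guarded sketch:

```python
def cudnn_version_or_none():
    """Return the cuDNN version torch reports, or None if torch is not installed."""
    try:
        import torch
    except ImportError:
        return None
    return torch.backends.cudnn.version()  # e.g. 90100 for cuDNN 9.1.0

print(cudnn_version_or_none())
```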
4. Install unsloth
- Turn off the HTTP/HTTPS proxies and the local firewall to avoid failures connecting to GitHub
root@llmserver:~# git config --global --unset http.proxy
root@llmserver:~# git config --global --unset https.proxy
root@llmserver:~# systemctl stop firewalld
- Check the installed torch versions
root@llmserver:~# pip3 list|grep torch
torch 2.6.0
torchaudio 2.6.0
torchvision 0.21.0
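The install command in the next step selects a pip extras tag that encodes both of these versions. A purely illustrative helper showing how the `cu124-torch260` tag is composed (the helper itself is hypothetical; the tag scheme follows unsloth's documented naming):

```python
def unsloth_extras_tag(cuda_version: str, torch_version: str) -> str:
    """Compose an unsloth pip extras tag like 'cu124-torch260' (illustrative only)."""
    cu = "cu" + cuda_version.replace(".", "")
    th = "torch" + torch_version.replace(".", "")
    return f"{cu}-{th}"

print(unsloth_extras_tag("12.4", "2.6.0"))  # cu124-torch260
```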
- Install unsloth
pip3 install "unsloth[cu124-torch260] @ git+https://github.com/unslothai/unsloth.git"
......
The installation output is omitted; the packages finally installed are as follows:
Successfully installed accelerate-1.4.0 aiohappyeyeballs-2.5.0 aiohttp-3.11.13 aiosignal-1.3.2 bitsandbytes-0.45.3 cut_cross_entropy-25.1.1 datasets-3.3.2 dill-0.3.8 docstring-parser-0.16 frozenlist-1.5.0 fsspec-2024.12.0 hf_transfer-0.1.9 huggingface_hub-0.29.2 markdown-it-py-3.0.0 mdurl-0.1.2 multidict-6.1.0 multiprocess-0.70.16 pandas-2.2.3 peft-0.14.0 propcache-0.3.0 protobuf-3.20.3 psutil-7.0.0 pyarrow-19.0.1 pygments-2.19.1 pytz-2025.1 regex-2024.11.6 rich-13.9.4 safetensors-0.5.3 sentencepiece-0.2.0 shtab-1.7.1 tokenizers-0.21.0 tqdm-4.67.1 transformers-4.49.0 trl-0.15.2 typeguard-4.4.2 tyro-0.9.16 tzdata-2025.1 unsloth-2025.3.9 unsloth_zoo-2025.3.8 wheel-0.45.1 xformers-0.0.29.post3 xxhash-3.5.0 yarl-1.18.3
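A quick way to confirm the install is usable is an import smoke test over the key packages (a sketch; run it on the server after installation):

```python
def missing_modules(modules=("torch", "transformers", "peft", "trl", "unsloth")):
    """Return the subset of the given modules that fail to import."""
    missing = []
    for name in modules:
        try:
            __import__(name)
        except ImportError:
            missing.append(name)
    return missing

print(missing_modules() or "all imports OK")
```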
- List the installed pip3 packages
root@llmserver:~# pip3 list
Package Version
------------------------- ------------
accelerate 1.4.0
aiohappyeyeballs 2.5.0
aiohttp 3.11.13
aiosignal 1.3.2
attrs 23.2.0
bitsandbytes 0.45.3
Brlapi 0.8.5
charset-normalizer 3.3.2
cockpit 333
cupshelpers 1.0
cut-cross-entropy 25.1.1
dasbus 1.7
datasets 3.3.2
dbus-python 1.3.2
dill 0.3.8
distro 1.9.0
dnf 4.20.0
docstring_parser 0.16
file-magic 0.4.0
filelock 3.17.0
frozenlist 1.5.0
fsspec 2024.12.0
hf_transfer 0.1.9
huggingface-hub 0.29.2
idna 3.7
Jinja2 3.1.6
jsonschema 4.19.1
jsonschema-specifications 2023.11.2
libcomps 0.1.21
libdnf 0.73.1
louis 3.28.0
lxml 5.2.1
markdown-it-py 3.0.0
MarkupSafe 3.0.2
mdurl 0.1.2
mpmath 1.3.0
multidict 6.1.0
multiprocess 0.70.16
networkx 3.4.2
nftables 0.1
numpy 2.2.3
nvidia-cublas-cu12 12.4.5.8
nvidia-cuda-cupti-cu12 12.4.127
nvidia-cuda-nvrtc-cu12 12.4.127
nvidia-cuda-runtime-cu12 12.4.127
nvidia-cudnn-cu12 9.1.0.70
nvidia-cufft-cu12 11.2.1.3
nvidia-curand-cu12 10.3.5.147
nvidia-cusolver-cu12 11.6.1.9
nvidia-cusparse-cu12 12.3.1.170
nvidia-cusparselt-cu12 0.6.2
nvidia-nccl-cu12 2.21.5
nvidia-nvjitlink-cu12 12.4.127
nvidia-nvtx-cu12 12.4.127
packaging 24.2
pandas 2.2.3
peft 0.14.0
perf 0.1
pexpect 4.9.0
pillow 11.1.0
pip 25.0.1
propcache 0.3.0
protobuf 3.20.3
psutil 7.0.0
ptyprocess 0.7.0
pyarrow 19.0.1
pycairo 1.25.1
pycups 2.0.1
Pygments 2.19.1
PyGObject 3.46.0
pyinotify 0.9.6
python-dateutil 2.8.2
python-linux-procfs 0.7.3
python-pam 2.0.2
pytz 2025.1
pyudev 0.24.1
PyYAML 6.0.1
pyynl 0.0.1
referencing 0.31.1
regex 2024.11.6
requests 2.32.3
rich 13.9.4
rpds-py 0.17.1
rpm 4.19.1.1
safetensors 0.5.3
selinux 3.8
sentencepiece 0.2.0
sepolicy 3.8
setools 4.5.1
setroubleshoot 3.3.33
setuptools 69.0.3
shtab 1.7.1
six 1.16.0
sos 4.8.2
sympy 1.13.1
systemd-python 235
tokenizers 0.21.0
torch 2.6.0
torchaudio 2.6.0
torchvision 0.21.0
tqdm 4.67.1
transformers 4.49.0
triton 3.2.0
trl 0.15.2
typeguard 4.4.2
typing_extensions 4.12.2
tyro 0.9.16
tzdata 2025.1
unsloth 2025.3.9
unsloth_zoo 2025.3.8
urllib3 1.26.19
wheel 0.45.1
xformers 0.0.29.post3
xxhash 3.5.0
yarl 1.18.3
- Re-enable the local firewall
root@llmserver:~# systemctl start firewalld
5. Conclusion
At this point the local deployment of the Python library unsloth is complete. It can be used from the local command line or from a locally installed IDE, but it does not support remote use.
Possible solutions:
1. Deploy a Python web development environment;
2. Deploy the LLaMA-Factory library;
unsloth documentation: Pip Install | Unsloth Documentation