Offline Docker deployment of ollama + open-webui + PatientSeek, running on CPU

一、Environment

Docker version: 19.03.9
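This is the version the walkthrough was written against; the steps are not version-specific. You can check what a given machine has installed with:

docker --version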

二、With internet access: download and install Docker

  1. Install Docker.
  2. Pull the ollama image:
docker pull ollama/ollama
  3. Pull the open-webui image:
docker pull ghcr.io/open-webui/open-webui:main
  4. Export the ollama and open-webui images:
docker save ollama/ollama > ollama.tar
docker save ghcr.io/open-webui/open-webui:main > open-webui.tar
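If the archives must cross a slow link, it can help to gzip them on the way out; docker load reads gzipped archives directly, so an optional variant is:

docker save ollama/ollama | gzip > ollama.tar.gz
docker save ghcr.io/open-webui/open-webui:main | gzip > open-webui.tar.gz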

三、Download the model (ModelScope is recommended since it needs no VPN from mainland China; Hugging Face also works)

1. Via git

git clone https://www.modelscope.cn/whyhow-ai/PatientSeek.git
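ModelScope repositories typically keep large weight files in Git LFS, so if the clone finishes suspiciously fast and the .gguf turns out to be a small pointer file, fetch the real payload (assuming git-lfs is installed):

git lfs install
git lfs pull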

2. Via the modelscope CLI

modelscope download --model whyhow-ai/PatientSeek
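The modelscope command ships with the ModelScope Python package; if it is missing, it can be installed with pip:

pip install modelscope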

Once the download completes, the directory layout looks like this:

F:.
|   .gitattributes
|   configuration.json
|   README.md
|
+---blobs
|       f8eba201522ab44b79bc54166126bfaf836111ff4cbf2d13c59c3b57da10573b
|
+---raw_model
|       configuration.json
|       pytorch_model.bin
|
+---refs
|       main
|
\---snapshots
    \---70661aa9b9e6c69734b394916ddbc540fd4731bf
            DeepSeek-R1-Distill-Llama-8B-Q4_K_M.gguf
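
Of this tree, only the .gguf file under snapshots needs to reach the offline server. On Linux you could locate it with, for example:

find . -name '*.gguf'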

四、Offline environment

  1. Install Docker on the offline server.
  2. Upload the ollama and open-webui image archives to /opt/app/images on the offline server.
  3. Upload the downloaded model file DeepSeek-R1-Distill-Llama-8B-Q4_K_M.gguf to /opt/ollama_data/models on the server.
  4. Load the ollama and open-webui images:
docker load < ollama.tar
docker load < open-webui.tar
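Before going further, it is worth confirming that both images actually loaded:

docker images | grep -E 'ollama|open-webui'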
  5. Start the containers.
  • Create a network:
docker network create ai-network
  • ollama (uses the CPU by default):
docker run \
  --name=ollama \
  --network=ai-network \
  --memory="24g" \
  --cpus="16" \
  --ulimit nofile=65535:65535 \
  --cap-add=SYS_PTRACE \
  --security-opt seccomp=unconfined \
  -d \
  -p 11434:11434 \
  -v /opt/ollama_data:/root/.ollama \
  --restart=unless-stopped \
  ollama/ollama
  • open-webui:
docker run -d \
  -p 3000:8080 \
  --network=ai-network \
  --ulimit nofile=65535:65535 \
  --ulimit nproc=65535:65535 \
  --memory="8g" \
  --cpus="8" \
  --add-host=host.docker.internal:172.17.0.1 \
  -v /opt/open-webui-data:/app/backend/data \
  --name open-webui \
  -e OLLAMA_API_BASE_URL=http://ollama:11434 \
  -e OPENBLAS_NUM_THREADS=1 \
  -e OMP_NUM_THREADS=1 \
  -e LD_PRELOAD= \
  --cap-add=SYS_ADMIN \
  --cap-add=SYS_RESOURCE \
  --security-opt seccomp=unconfined \
  --restart always \
  ghcr.io/open-webui/open-webui:main

Adjust the memory and CPU limits above to match your server's resources.
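As a quick sanity check that the containers came up, note that Ollama's root endpoint normally answers with a short "Ollama is running" banner:

docker ps
curl http://localhost:11434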

  6. Enter the ollama container:
docker exec -it ollama /bin/bash

Change into the /root/.ollama/models directory:

cd /root/.ollama/models
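Because the host directory /opt/ollama_data is mounted at /root/.ollama, the .gguf uploaded in step 3 should already be visible here:

ls -lh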
  7. Create a Modelfile:
vim Modelfile

Paste in the following content, then press Esc and type :wq to save and exit:

FROM ./DeepSeek-R1-Distill-Llama-8B-Q4_K_M.gguf
TEMPLATE """{{- if .System }}{{ .System }}{{ end }}
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1}}
{{- if eq .Role "user" }}<|User|>{{ .Content }}
{{- else if eq .Role "assistant" }}<|Assistant|>{{ .Content }}{{- if not $last }}<|end▁of▁sentence|>{{- end }}
{{- end }}
{{- if and $last (ne .Role "assistant") }}<|Assistant|>{{- end }}
{{- end }}"""
PARAMETER stop <|begin▁of▁sentence|>
PARAMETER stop <|end▁of▁sentence|>
PARAMETER stop <|User|>
PARAMETER stop <|Assistant|>
LICENSE """MIT License

Copyright (c) 2023 DeepSeek

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
  8. Import the model into ollama:
ollama create PatientSeek -f Modelfile

To verify, list the models; if PatientSeek shows up, the import succeeded:

ollama ls
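ollama can also echo back the Modelfile it stored, which confirms the template and stop tokens were picked up:

ollama show PatientSeek --modelfile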
  9. Run PatientSeek:
ollama run PatientSeek

If an interactive prompt comes up, the model is running.


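The model can also be exercised from outside the container through the published port; a minimal non-streaming request against Ollama's generate API looks like this:

curl http://localhost:11434/api/generate -d '{
  "model": "PatientSeek",
  "prompt": "Hello, who are you?",
  "stream": false
}'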
  10. In a browser, open [server ip]:3000; after a short wait, the open-webui interface loads.



五、Troubleshooting

  1. After starting the containers, it is best to check the logs with docker logs -f ollama and docker logs -f open-webui; if neither shows errors, things are generally fine.
  2. If the logs are clean but no PatientSeek model can be selected in the open-webui interface: click [your username in the lower-left corner], then [Settings], then [Admin Settings], then [Connections]; under [Manage Ollama API Connections], change the connection to http://ollama:11434 and save, then refresh the page.
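The hostname ollama resolves only because both containers were attached to the ai-network created earlier; if the connection still fails, check that both of them actually joined it:

docker network inspect ai-network

The Containers section of the output should list both ollama and open-webui.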