DB-GPT 0.7.0 and Ollama Integration Guide

2025-03-08

1. Installing Ollama

1.1 Mac

See https://ollama.com/download/mac for details.

1.2 Linux

curl -fsSL https://ollama.com/install.sh | sh

1.3 Windows

See https://ollama.com/download/windows for details.
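
Whichever platform you use, you can verify the installation with the Ollama CLI; the systemd check below applies only to Linux installs done via the script above (a minimal sketch):

# Confirm the CLI is installed and on the PATH
ollama --version
# Linux only: the install script sets up an "ollama" systemd service
systemctl status ollama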

2. Running the Models

Search the Ollama model library for models that fit your own setup; the two models below are used as examples, and a quick availability check is sketched after the commands.

  • LLM:

ollama run deepseek-r1:1.5b

  • Embedding:

ollama run bge-m3:latest
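
To confirm that both models were pulled and that Ollama's HTTP API (the endpoint DB-GPT will point at later, on the default port 11434) is reachable, a minimal check is:

# List locally available models; deepseek-r1:1.5b and bge-m3:latest should appear
ollama list
# The REST endpoint DB-GPT will call; it should return a JSON list of models
curl http://localhost:11434/api/tags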

3. Installing Dependencies

  • macOS and Linux:

uv sync --all-packages --frozen \
--extra "base" \
--extra "proxy_ollama" \
--extra "rag" \
--extra "storage_chromadb" \
--extra "dbgpts" \
--link-mode=copy

  • Windows:

uv sync --all-packages --frozen --extra "base" --extra "proxy_ollama" --extra "rag" --extra "storage_chromadb" --extra "dbgpts" --link-mode=copy
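
The uv sync commands above assume uv itself is already installed. If it is not, the standalone installer from the uv project can be used (a minimal sketch for macOS/Linux; on Windows follow the installer in the uv documentation):

# Install uv via the official standalone installer (macOS/Linux)
curl -LsSf https://astral.sh/uv/install.sh | sh
# Confirm uv is available
uv --version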

4. Configuration File

Edit the configs/dbgpt-proxy-ollama.toml configuration file.

Only api_base (api_url in the embeddings block) and api_key need attention; everything else can stay at its default.

[system]
# Load language from environment variable(It is set by the hook)
language = "${env:DBGPT_LANG:-en}"
api_keys = []
encrypt_key = "your_secret_key"

# Server Configurations
[service.web]
host = "0.0.0.0"
port = 5670

[service.web.database]
type = "sqlite"
path = "pilot/meta_data/dbgpt.db"

[rag.storage]
[rag.storage.vector]
type = "Chroma"
persist_path = "pilot/data"

# Model Configurations
[models]
[[models.llms]]
name = "deepseek-r1:1.5b"
provider = "proxy/ollama"
api_base = "http://localhost:11434"
api_key = ""

[[models.embeddings]]
name = "bge-m3:latest"
provider = "proxy/ollama"
api_url = "http://localhost:11434"
api_key = ""
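
Before starting DB-GPT, it can be worth confirming that the endpoint configured in api_base/api_url actually serves both models. The two curl calls below are a minimal sketch against Ollama's generate and embeddings REST APIs, using the model names from the config above:

# Ask the LLM configured under [[models.llms]] for a short completion
curl http://localhost:11434/api/generate -d '{"model": "deepseek-r1:1.5b", "prompt": "Hello", "stream": false}'

# Ask the embedding model configured under [[models.embeddings]] for a vector
curl http://localhost:11434/api/embeddings -d '{"model": "bge-m3:latest", "prompt": "Hello"}'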

5. Starting the Project

  • Option 1:

uv run dbgpt start webserver --config configs/dbgpt-proxy-ollama.toml

  • Option 2:

uv run python packages/dbgpt-app/src/dbgpt_app/dbgpt_server.py --config configs/dbgpt-proxy-ollama.toml

Then open http://localhost:5670/ in your browser.
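
If the page does not load, a quick check that the webserver is listening on the port from [service.web] is:

# Should return an HTTP status line once the server has finished starting
curl -I http://localhost:5670/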