Commit 3a0f6a8

chore(model modal): add a model tutorial link and update the default tutorial URL (#84)

Adds the addingModelTutorialURL property to the ModelModalProps interface, updates every provider's default tutorial link to the new GitHub docs address, and adds the AddinModelTutorial.md document.

1 parent 18ce2e9 commit 3a0f6a8

File tree

4 files changed: +160 -5 lines changed


docs/AddinModelTutorial.md

Lines changed: 153 additions & 0 deletions
@@ -0,0 +1,153 @@
# Tutorial: Connecting a Locally Deployed Large Model

# Installing a Deployment Platform (Ollama/Xinference/GPUStack)

Follow the installation guide for the platform you want to connect. Note that PandaWiki only supports deployment on Linux.

[Ollama installation guide](https://docs.ollama.com/quickstart)

[Xinference installation guide](https://inference.readthedocs.io/zh-cn/v1.2.0/getting_started/installation.html)

[GPUStack installation guide](https://docs.gpustack.ai/latest/quickstart/)

# Confirming the Platform Installed Successfully

## Set the platform's listen address to 0.0.0.0

### Ollama

1. Edit the systemd service file by running `systemctl edit ollama.service`.

2. Under the `[Service]` section, add an `Environment` line:

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

3. Save and exit the editor.

4. Reload systemd and restart Ollama:

```bash
systemctl daemon-reload
systemctl restart ollama
```

### Xinference

Start Xinference with the `-H 0.0.0.0` flag:

```bash
xinference-local -H 0.0.0.0
```
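
After restarting, you can confirm the service is bound to 0.0.0.0 rather than 127.0.0.1 by inspecting the listening sockets, e.g. with `ss -lnt`. A minimal sketch of the check — the `sample` variable below is a hypothetical `ss` output line for Ollama's port 11434:

```shell
# Check whether a listening socket is bound to all interfaces (0.0.0.0).
# The sample line stands in for real output; on your machine run:
#   ss -lnt | grep 11434
sample='LISTEN 0 4096 0.0.0.0:11434 0.0.0.0:*'
if echo "$sample" | grep -q '0\.0\.0\.0:11434'; then
  echo "listening on all interfaces"
else
  echo "still bound to localhost only" >&2
fi
# prints: listening on all interfaces
```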

## Get the deployment machine's IP

Run `ip addr` in a terminal; the IP is usually the `inet` value under the `eth0` or `wlan0` interface.

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/mxPOG5zpEKkaJnKa/img/483323b8-d6e7-4f30-ab82-e500cfb170c3.png)

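If you prefer not to read through the full `ip addr` output, a small awk filter can pull the address out; a sketch, assuming the interface is `eth0` (the `sample` variable mimics one line of real output):

```shell
# Strip the /24 prefix from the inet field of `ip addr` output.
# On a real machine: ip -4 addr show eth0 | awk '/inet /{sub(/\/.*/, "", $2); print $2}'
sample='    inet 192.168.1.23/24 brd 192.168.1.255 scope global eth0'
echo "$sample" | awk '/inet /{sub(/\/.*/, "", $2); print $2}'
# prints 192.168.1.23
```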
## Get the model list

Replace the port in the command below with your platform's default port: 11434 for Ollama, 9997 for Xinference, 80 for GPUStack.

```bash
curl -X GET \
  http://<deployment-machine-ip>:<port>/v1/models \
  -H "Content-Type: application/json"
```

Example response:

```json
{
  "object": "list",
  "data": [
    {
      "id": "gemma3:latest",
      "object": "model",
      "created": 1755516177,
      "owned_by": "library"
    }
  ]
}
```

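The `id` fields in that response are the values you will need in the configuration UI. If `jq` is not installed, grep and sed are enough to pull them out; a sketch run against the sample response above:

```shell
# Extract every model id from a /v1/models response using only grep and sed.
# $response mirrors the example reply; in practice pipe curl's output instead.
response='{"object":"list","data":[{"id":"gemma3:latest","object":"model","created":1755516177,"owned_by":"library"}]}'
echo "$response" | grep -o '"id":"[^"]*"' | sed 's/"id":"\(.*\)"/\1/'
# prints gemma3:latest
```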
## Check that the models are usable

Chat model:

```bash
curl -X POST \
  http://<deployment-machine-ip>:<port>/v1/chat/completions \
  -H "Authorization: Bearer <your API key; omit this header if you did not set one>" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "<id of the model you want to configure, from the model list>",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

Embedding model:

```bash
curl -X POST \
  http://<deployment-machine-ip>:<port>/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <your API key; omit this header if you did not set one>" \
  -d '{
    "model": "<id of the model you want to configure, from the model list>",
    "input": "hello, nice to meet you, and you?",
    "encoding_format": "float"
  }'
```

Rerank model:

```bash
curl -X POST \
  http://<deployment-machine-ip>:<port>/v1/rerank \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <your API key; omit this header if you did not set one>" \
  -d '{
    "model": "<id of the model you want to configure, from the model list>",
    "documents": ["hello"],
    "query": "test"
  }'
```

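A healthy chat endpoint returns JSON containing a `choices` array. If you want to script the check rather than eyeball it, something like the following works — the `reply` variable here is a trimmed, hypothetical completion response:

```shell
# A successful /v1/chat/completions reply contains a "choices" array;
# an error reply typically contains an "error" object instead.
reply='{"object":"chat.completion","choices":[{"message":{"role":"assistant","content":"Hi!"}}]}'
if echo "$reply" | grep -q '"choices"'; then
  echo "chat model responded OK"
else
  echo "unexpected response, check the model id and port" >&2
fi
# prints: chat model responded OK
```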
# Configuring the Model

## Choose a provider

**Chat model**

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/54Lq35ojy3gLXl7E/img/23a5b54d-be99-4cd3-81b7-b1e47ebca6c2.png)

**Embedding/rerank model**

Note: Ollama does not support rerank models!

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/54Lq35ojy3gLXl7E/img/410271ce-dfa0-4ec5-bc89-9a0c2328099d.png)

## Enter the API address and API Key

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/54Lq35ojy3gLXl7E/img/bb3327ac-deda-4b6d-bf6c-970f582b469a.png)

1. The API address is `http://<ip from the curl command>:<port from the curl command>`.

2. If you choose "Other" as the provider, you also need to enter the model name you used in the curl commands above.

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/54Lq35ojy3gLXl7E/img/9284fe77-8d1d-40d6-8689-06adb5c82767.png)
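
In other words, the API address is just the scheme, IP, and port from the earlier curl commands with no path; a sketch with hypothetical values (Ollama's default port on a Docker bridge address):

```shell
# Compose the API address the modal expects from the curl ip and port.
ip=172.17.0.1   # hypothetical deployment machine ip
port=11434      # Ollama's default port
echo "http://${ip}:${port}"
# prints http://172.17.0.1:11434
```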

## Select the model

Note: the embedding/rerank fields only accept models under the corresponding tag.

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/54Lq35ojy3gLXl7E/img/39e3c799-5617-4b6d-bd3b-089dd47590f1.png)

## Confirm and save

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/54Lq35ojy3gLXl7E/img/c3ae5021-b4a8-4228-b03f-bfe48f0e86cb.png)

After a successful configuration, a "Modified successfully" toast appears.

![image.png](https://alidocs.oss-cn-zhangjiakou.aliyuncs.com/res/54Lq35ojy3gLXl7E/img/a31a2600-bea0-478c-8a48-c473013035a4.png)

# Still failing after following the steps above?

Please open an issue with a screenshot of the error; the developers will respond promptly.

ui/ModelModal/src/ModelModal.tsx

Lines changed: 2 additions & 1 deletion

```diff
@@ -52,6 +52,7 @@ export const ModelModal: React.FC<ModelModalProps> = ({
   language = 'zh-CN',
   messageComponent,
   is_close_model_remark = false,
+  addingModelTutorialURL = 'https://github.com/chaitin/ModelKit/blob/main/docs/AddinModelTutorial.md',
 }: ModelModalProps) => {
   const theme = useTheme();

@@ -607,7 +608,7 @@ export const ModelModal: React.FC<ModelModalProps> = ({
         }}
         onClick={() =>
           window.open(
-            providers[providerBrand].addingModelTutorial,
+            addingModelTutorialURL,
             '_blank'
           )
         }
```

ui/ModelModal/src/constants/providers.ts

Lines changed: 4 additions & 4 deletions

```diff
@@ -128,7 +128,7 @@ export const DEFAULT_MODEL_PROVIDERS: ModelProviderMap = {
     analysis: true,
     analysis_vl: true,
     modelDocumentUrl: 'https://github.com/ollama/ollama/tree/main/docs',
-    addingModelTutorial: 'https://pandawiki.docs.baizhi.cloud/node/019a160d-0528-736a-b88e-32a2d1207f3e',
+    addingModelTutorial: 'https://github.com/chaitin/ModelKit/blob/main/docs/AddinModelTutorial.md',
     defaultBaseUrl: 'http://172.17.0.1:11434',
   },
   SiliconFlow: {
@@ -265,7 +265,7 @@ export const DEFAULT_MODEL_PROVIDERS: ModelProviderMap = {
     analysis: true,
     analysis_vl: true,
     modelDocumentUrl: 'https://inference.readthedocs.io/zh-cn/v1.2.0/getting_started/installation.html#installation',
-    addingModelTutorial: 'https://pandawiki.docs.baizhi.cloud/node/019a160d-0528-736a-b88e-32a2d1207f3e',
+    addingModelTutorial: 'https://github.com/chaitin/ModelKit/blob/main/docs/AddinModelTutorial.md',
     defaultBaseUrl: 'http://172.17.0.1:9997',
   },
   gpustack: {
@@ -282,7 +282,7 @@ export const DEFAULT_MODEL_PROVIDERS: ModelProviderMap = {
     analysis: true,
     analysis_vl: true,
     modelDocumentUrl: 'https://docs.gpustack.ai/latest/quickstart/',
-    addingModelTutorial: 'https://pandawiki.docs.baizhi.cloud/node/019a160d-0528-736a-b88e-32a2d1207f3e',
+    addingModelTutorial: 'https://github.com/chaitin/ModelKit/blob/main/docs/AddinModelTutorial.md',
     defaultBaseUrl: 'http://172.17.0.1',
   },
   Yi: {
@@ -758,7 +758,7 @@ export const DEFAULT_MODEL_PROVIDERS: ModelProviderMap = {
     analysis: true,
     analysis_vl: true,
     modelDocumentUrl: '',
-    addingModelTutorial: 'https://pandawiki.docs.baizhi.cloud/node/019a160d-0528-736a-b88e-32a2d1207f3e',
+    addingModelTutorial: 'https://github.com/chaitin/ModelKit/blob/main/docs/AddinModelTutorial.md',
     defaultBaseUrl: '',
   },
 };
```

ui/ModelModal/src/types/types.ts

Lines changed: 1 addition & 0 deletions

```diff
@@ -176,4 +176,5 @@ export interface ModelModalProps {
   language?: 'zh-CN' | 'en-US';
   messageComponent?: MessageComponent;
   is_close_model_remark?: boolean;
+  addingModelTutorialURL?: string;
 }
```
