
Commit

update supported models
irexyc committed Sep 23, 2024
1 parent c04866b commit 8174257
Showing 4 changed files with 4 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -149,6 +149,7 @@ For detailed inference benchmarks in more devices and more settings, please refe
<li>InternLM-XComposer2 (7B, 4khd-7B)</li>
<li>InternLM-XComposer2.5 (7B)</li>
<li>Qwen-VL (7B)</li>
<li>Qwen2-VL (2B, 7B)</li>
<li>DeepSeek-VL (7B)</li>
<li>InternVL-Chat (v1.1-v1.5)</li>
<li>InternVL2 (1B-76B)</li>
1 change: 1 addition & 0 deletions README_zh-CN.md
@@ -150,6 +150,7 @@ LMDeploy TurboMind 引擎拥有卓越的推理能力，在各种规模的模型
<li>InternLM-XComposer2 (7B, 4khd-7B)</li>
<li>InternLM-XComposer2.5 (7B)</li>
<li>Qwen-VL (7B)</li>
<li>Qwen2-VL (2B, 7B)</li>
<li>DeepSeek-VL (7B)</li>
<li>InternVL-Chat (v1.1-v1.5)</li>
<li>InternVL2 (1B-76B)</li>
1 change: 1 addition & 0 deletions docs/en/supported_models/supported_models.md
@@ -62,6 +62,7 @@ The TurboMind engine doesn't support window attention. Therefore, for models tha
| QWen1.5 | 0.5B - 110B | LLM | Yes | No | No | Yes |
| QWen1.5-MoE | A2.7B | LLM | Yes | No | No | No |
| QWen2 | 0.5B - 72B | LLM | Yes | No | No | Yes |
| QWen2-VL | 2B, 7B | MLLM | Yes | No | No | No |
| DeepSeek-MoE | 16B | LLM | Yes | No | No | No |
| DeepSeek-V2 | 16B, 236B | LLM | Yes | No | No | No |
| MiniCPM3 | 4B | LLM | Yes | No | No | No |
1 change: 1 addition & 0 deletions docs/zh_cn/supported_models/supported_models.md
@@ -62,6 +62,7 @@ turbomind 引擎不支持 window attention。所以，对于应用了 window att
| QWen1.5 | 0.5B - 110B | LLM | Yes | No | No | Yes |
| QWen1.5-MoE | A2.7B | LLM | Yes | No | No | No |
| QWen2 | 0.5B - 72B | LLM | Yes | No | No | Yes |
| QWen2-VL | 2B, 7B | MLLM | Yes | No | No | No |
| DeepSeek-MoE | 16B | LLM | Yes | No | No | No |
| DeepSeek-V2 | 16B, 236B | LLM | Yes | No | No | No |
| MiniCPM3 | 4B | LLM | Yes | No | No | No |
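
The rows added in this commit list Qwen2-VL (2B, 7B) as a supported multimodal (MLLM) model. As a minimal sketch of what that enables, the snippet below runs an image-description query through LMDeploy's VLM pipeline; the Hugging Face model id `Qwen/Qwen2-VL-7B-Instruct` and the image URL are illustrative assumptions for this example, not values taken from the commit.

```python
# Minimal, illustrative sketch of querying the newly listed Qwen2-VL model
# via LMDeploy's VLM pipeline. Model id and image URL are assumptions made
# for this example, not part of the commit itself.
from lmdeploy import pipeline
from lmdeploy.vl import load_image

# Assumed Hugging Face model id for the 7B variant.
pipe = pipeline('Qwen/Qwen2-VL-7B-Instruct')

# Load an image (placeholder URL) and ask the model to describe it.
image = load_image('https://example.com/sample.jpg')
response = pipe(('describe this image', image))
print(response.text)
```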
