diff --git a/README.md b/README.md
index bc0b101859..a8fcaf105f 100644
--- a/README.md
+++ b/README.md
@@ -149,6 +149,7 @@ For detailed inference benchmarks in more devices and more settings, please refe
 InternLM-XComposer2 (7B, 4khd-7B)
 InternLM-XComposer2.5 (7B)
 Qwen-VL (7B)
+Qwen2-VL (2B, 7B)
 DeepSeek-VL (7B)
 InternVL-Chat (v1.1-v1.5)
 InternVL2 (1B-76B)
diff --git a/README_zh-CN.md b/README_zh-CN.md
index b62cb22b05..871eba01b5 100644
--- a/README_zh-CN.md
+++ b/README_zh-CN.md
@@ -150,6 +150,7 @@ LMDeploy TurboMind 引擎拥有卓越的推理能力,在各种规模的模型
 InternLM-XComposer2 (7B, 4khd-7B)
 InternLM-XComposer2.5 (7B)
 Qwen-VL (7B)
+Qwen2-VL (2B, 7B)
 DeepSeek-VL (7B)
 InternVL-Chat (v1.1-v1.5)
 InternVL2 (1B-76B)
diff --git a/docs/en/supported_models/supported_models.md b/docs/en/supported_models/supported_models.md
index 1a04b0b5fb..52367e4471 100644
--- a/docs/en/supported_models/supported_models.md
+++ b/docs/en/supported_models/supported_models.md
@@ -62,6 +62,7 @@ The TurboMind engine doesn't support window attention. Therefore, for models tha
 | QWen1.5 | 0.5B - 110B | LLM | Yes | No | No | Yes |
 | QWen1.5-MoE | A2.7B | LLM | Yes | No | No | No |
 | QWen2 | 0.5B - 72B | LLM | Yes | No | No | Yes |
+| QWen2-VL | 2B, 7B | MLLM | Yes | No | No | No |
 | DeepSeek-MoE | 16B | LLM | Yes | No | No | No |
 | DeepSeek-V2 | 16B, 236B | LLM | Yes | No | No | No |
 | MiniCPM3 | 4B | LLM | Yes | No | No | No |
diff --git a/docs/zh_cn/supported_models/supported_models.md b/docs/zh_cn/supported_models/supported_models.md
index 2d7f2b6c2a..779fc6cd51 100644
--- a/docs/zh_cn/supported_models/supported_models.md
+++ b/docs/zh_cn/supported_models/supported_models.md
@@ -62,6 +62,7 @@ turbomind 引擎不支持 window attention。所以,对于应用了 window att
 | QWen1.5 | 0.5B - 110B | LLM | Yes | No | No | Yes |
 | QWen1.5-MoE | A2.7B | LLM | Yes | No | No | No |
 | QWen2 | 0.5B - 72B | LLM | Yes | No | No | Yes |
+| QWen2-VL | 2B, 7B | MLLM | Yes | No | No | No |
 | DeepSeek-MoE | 16B | LLM | Yes | No | No | No |
 | DeepSeek-V2 | 16B, 236B | LLM | Yes | No | No | No |
 | MiniCPM3 | 4B | LLM | Yes | No | No | No |