murasakii0118/llama.cpp-Qwen3vl-inference
Since I could not find any material on running Qwen3VL inference at the source-code level, and consulting AI did not yield useful results either, I pieced together fragments of knowledge and code from various sources into this example repository.

This is a code template for running inference on Qwen3VL models with llama.cpp at the source-code level.

You need to clone the latest llama.cpp source code, then build and export it correctly; the details are omitted here for brevity.
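For reference, a typical build might look like the following. This is only a sketch using the standard upstream repository URL and CMake flags; adjust the generator, toolchain, and flags for your own platform:

```shell
# Clone the latest llama.cpp sources
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp

# Configure a shared-library build so llama can be linked dynamically
cmake -B build -DBUILD_SHARED_LIBS=ON

# Compile in Release mode
cmake --build build --config Release
```

The built libraries and the headers under include are what you export for use in your own project.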

You need to export not only the usual include folder but also llama.cpp's common folder. For the sake of compilation speed, it is recommended to precompile the code in common into a dynamic link library ahead of time.
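To illustrate the kind of code this template contains, here is a minimal text-only skeleton against the current llama.cpp C API. This is a sketch, not the repository's actual code: function names follow the upstream simple example and may differ across llama.cpp versions, and the Qwen3VL vision encoder (handled by llama.cpp's mtmd code) is omitted entirely:

```cpp
#include "llama.h"

#include <cstdio>
#include <string>
#include <vector>

int main(int argc, char ** argv) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s model.gguf\n", argv[0]);
        return 1;
    }

    llama_backend_init();

    // load the GGUF model (the Qwen3VL language tower)
    llama_model_params mparams = llama_model_default_params();
    llama_model * model = llama_model_load_from_file(argv[1], mparams);
    if (model == nullptr) {
        fprintf(stderr, "failed to load model\n");
        return 1;
    }

    llama_context_params cparams = llama_context_default_params();
    cparams.n_ctx = 4096;
    llama_context * ctx = llama_init_from_model(model, cparams);

    // tokenize a prompt
    const llama_vocab * vocab = llama_model_get_vocab(model);
    const std::string prompt = "Hello";
    std::vector<llama_token> tokens(prompt.size() + 8);
    const int n = llama_tokenize(vocab, prompt.c_str(), (int) prompt.size(),
                                 tokens.data(), (int) tokens.size(),
                                 /*add_special*/ true, /*parse_special*/ true);
    tokens.resize(n);

    // greedy sampler chain
    llama_sampler * smpl = llama_sampler_chain_init(llama_sampler_chain_default_params());
    llama_sampler_chain_add(smpl, llama_sampler_init_greedy());

    // feed the prompt, then generate a few tokens one at a time
    llama_batch batch = llama_batch_get_one(tokens.data(), (int) tokens.size());
    for (int i = 0; i < 32; i++) {
        if (llama_decode(ctx, batch) != 0) {
            break;
        }
        llama_token tok = llama_sampler_sample(smpl, ctx, -1);
        if (llama_vocab_is_eog(vocab, tok)) {
            break;
        }
        char buf[256];
        const int len = llama_token_to_piece(vocab, tok, buf, sizeof(buf), 0, true);
        if (len > 0) {
            fwrite(buf, 1, len, stdout);
        }
        batch = llama_batch_get_one(&tok, 1);
    }
    printf("\n");

    llama_sampler_free(smpl);
    llama_free(ctx);
    llama_model_free(model);
    llama_backend_free();
    return 0;
}
```

Helpers from the common folder (argument parsing, chat templating, tokenization wrappers) sit on top of this C API, which is why common must be exported and linked alongside the core library.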

About

Example code for inferring Qwen3 using llama.cpp