Our project requires ONNX models and does not support TRT models. Can the method in this repository quantize an ONNX model directly to INT8, or convert an INT8 TRT model back to ONNX? Thanks!
No, that is not possible.
@dollarser Just convert onnx-fp32 directly to onnx-int8 (run it with ORT; the official docs have a demo). It is analogous to converting onnx-fp32 to trt-int8.