
RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: int #73

Open
darrenearl opened this issue Oct 28, 2022 · 3 comments

Comments

@darrenearl

(pcdet) exinova@exinova-B560M-AORUS-PRO-AX:/media/exinova/ssd/CUDA-PointPillars/tool$ python exporter.py --ckpt pointpillar_7728.pth --cfg_file cfgs/custom_models/pointpillar.yaml
2022-10-28 14:55:09,684 INFO ------ Convert OpenPCDet model for TensorRT ------
2022-10-28 14:55:11,488 INFO ==> Loading parameters from checkpoint pointpillar_7728.pth to CPU
2022-10-28 14:55:11,499 INFO Not updated weight vfe.pfn_layers.0.linear.weight: torch.Size([64, 9])
2022-10-28 14:55:11,499 INFO ==> Done (loaded 126/127)
/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/onnx/utils.py:355: UserWarning: Skipping _decide_input_format
-1
warnings.warn("Skipping _decide_input_format\n {}".format(e.args[0]))
Traceback (most recent call last):
File "exporter.py", line 155, in
main()
File "exporter.py", line 139, in main
output_names = ['cls_preds', 'box_preds', 'dir_cls_preds'], # the model's output names
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/onnx/init.py", line 276, in export
custom_opsets, enable_onnx_checker, use_external_data_format)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/onnx/utils.py", line 94, in export
use_external_data_format=use_external_data_format)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/onnx/utils.py", line 701, in _export
dynamic_axes=dynamic_axes)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/onnx/utils.py", line 459, in _model_to_graph
use_new_jit_passes)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/onnx/utils.py", line 420, in _create_jit_graph
graph, torch_out = _trace_and_get_graph_from_model(model, args)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/onnx/utils.py", line 380, in _trace_and_get_graph_from_model
torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/jit/_trace.py", line 1139, in _get_trace_graph
outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
File "/home/exinova/anaconda3/envs/pcdet/lib/python3.7/site-packages/torch/jit/_trace.py", line 93, in forward
in_vars, in_desc = _flatten(args)
RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: int
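For anyone hitting this: a minimal, self-contained sketch (not from this repository) that appears to reproduce the same failure path. `torch.onnx.export` flattens the example inputs through torch.jit before tracing, and on the PyTorch build shown in the traceback that flattening step only accepts tensors (and tuples/lists/dicts of tensors), so a bare Python int raises exactly this RuntimeError:

```python
import torch

class Toy(torch.nn.Module):
    def forward(self, x, batch_size):
        return x * batch_size

# A plain Python int among the example inputs is rejected by the JIT
# input flattening that torch.onnx.export performs before tracing.
try:
    torch.onnx.export(
        Toy().eval(),
        (torch.randn(2, 3), 1),        # <-- plain int, like batch_size = 1
        "toy.onnx",
        input_names=["x", "batch_size"],
        output_names=["out"],
    )
except RuntimeError as e:
    print(e)  # "... received an input of unsupported type: int"
```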

@sylivahf

sylivahf commented Nov 7, 2022

Have you solved this yet?

@zzningxp

If you meet the same problem, please check PR #52.

@porterpan

As of the latest commit (ce7e2bd694c90207435c8751d61cdb38d48a9f4c), in tool/export_onnx.py at line 126, dummy_input['batch_size'] = 1 needs to be changed to dummy_input['batch_size'] = torch.tensor(1). That line is what causes the JIT "int" error. It would be great if the maintainers could fix this when they have time.
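For reference, the fix described above amounts to something like the following in tool/export_onnx.py (a sketch: only the batch_size line comes from the comment above; any other keys of dummy_input are omitted here):

```python
import torch

# Sketch of the dummy input built in tool/export_onnx.py (around line 126
# at the commit mentioned above); other keys of the dict are omitted.
dummy_input = dict()

# Before -- a plain Python int, which torch.jit's input flattening rejects:
# dummy_input['batch_size'] = 1

# After -- wrap the scalar in a tensor so the exporter can trace it:
dummy_input['batch_size'] = torch.tensor(1)
```

With the scalar wrapped in a tensor, the flattening step sees a Variable instead of an int and the export proceeds past the point where the traceback above fails.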
