error #5

Open
WANGCHAO1996 opened this issue Nov 30, 2020 · 1 comment

WANGCHAO1996 commented Nov 30, 2020

ubuntu18.04
pytorch 1.5.1
libtorch 1.2.0
cuda 10.0
cudnn 7.6.5

WANGCHAO1996 (Author) commented:

./darknet data/ronaldo.jpg
ERROR

Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Short Cut Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Conv Forward Layer
Linear Conv Forward Layer
YOLO Forward Layer
Route Forward Layer
Conv Forward Layer
Upsample Forward Layer
terminate called after throwing an instance of 'c10::Error'
what(): It is expected output_size equals to 2, but got size 1 (upsample_bilinear2d_out_cpu_template at /pytorch/aten/src/ATen/native/UpSampleBilinear2d.cpp:165)
frame #0: c10::Error::Error(c10::SourceLocation, std::string const&) + 0x33 (0x7fb692e5c273 in /home/xx405/下载/libtorch-shared-with-deps-1.2.0/libtorch/lib/libc10.so)
frame #1: + 0x1c75a50 (0x7fb694cf8a50 in /home/xx405/下载/libtorch-shared-with-deps-1.2.0/libtorch/lib/libtorch.so)
frame #2: at::native::upsample_bilinear2d_cpu(at::Tensor const&, c10::ArrayRef, bool) + 0x14a (0x7fb694cfb73a in /home/xx405/下载/libtorch-shared-with-deps-1.2.0/libtorch/lib/libtorch.so)
frame #3: + 0x1dabf7d (0x7fb694e2ef7d in /home/xx405/下载/libtorch-shared-with-deps-1.2.0/libtorch/lib/libtorch.so)
frame #4: torch::autograd::VariableType::upsample_bilinear2d(at::Tensor const&, c10::ArrayRef, bool) + 0x60d (0x7fb696b30a1d in /home/xx405/下载/libtorch-shared-with-deps-1.2.0/libtorch/lib/libtorch.so)
frame #5: + 0xc13cd (0x55809c6b23cd in ./darknet)
frame #6: at::Tensor std::__invoke_impl<at::Tensor, at::Tensor (&)(at::Tensor const&, c10::ArrayRef, bool), at::Tensor, int&, bool&>(std::__invoke_other, at::Tensor (&)(at::Tensor const&, c10::ArrayRef, bool), at::Tensor&&, int&, bool&) + 0x9d (0x55809c6b3c02 in ./darknet)
frame #7: std::__invoke_result<at::Tensor (&)(at::Tensor const&, c10::ArrayRef, bool), at::Tensor, int&, bool&>::type std::__invoke<at::Tensor (&)(at::Tensor const&, c10::ArrayRef, bool), at::Tensor, int&, bool&>(at::Tensor (&)(at::Tensor const&, c10::ArrayRef, bool), at::Tensor&&, int&, bool&) + 0x89 (0x55809c6b3a3d in ./darknet)
frame #8: at::Tensor std::_Bind<at::Tensor (*(std::_Placeholder<1>, int, bool))(at::Tensor const&, c10::ArrayRef, bool)>::__call<at::Tensor, at::Tensor&&, 0ul, 1ul, 2ul>(std::tuple<at::Tensor&&>&&, std::_Index_tuple<0ul, 1ul, 2ul>) + 0xb7 (0x55809c6b3827 in ./darknet)
frame #9: at::Tensor std::_Bind<at::Tensor (*(std::_Placeholder<1>, int, bool))(at::Tensor const&, c10::ArrayRef, bool)>::operator()<at::Tensor, at::Tensor>(at::Tensor&&) + 0x5e (0x55809c6b3504 in ./darknet)
frame #10: std::_Function_handler<at::Tensor (at::Tensor), std::_Bind<at::Tensor (*(std::_Placeholder<1>, int, bool))(at::Tensor const&, c10::ArrayRef, bool)> >::_M_invoke(std::_Any_data const&, at::Tensor&&) + 0x51 (0x55809c6b3132 in ./darknet)
frame #11: torch::nn::FunctionalImpl::forward(at::Tensor) + 0x34 (0x7fb696f6a324 in /home/xx405/下载/libtorch-shared-with-deps-1.2.0/libtorch/lib/libtorch.so)
frame #12: upSampleLayer::forward(at::Tensor) + 0x87 (0x55809c6b2529 in ./darknet)
frame #13: torch::nn::AnyModule::Value torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::InvokeForward::operator()<at::Tensor>(at::Tensor&&) + 0x67 (0x55809c6aa60d in ./darknet)
frame #14: torch::nn::AnyModule::Value torch::unpack<torch::nn::AnyModule::Value, at::Tensor, torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::InvokeForward, torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::CheckedGetter, 0ul>(torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::InvokeForward, torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::CheckedGetter, torch::Indices<0ul>) + 0x4a (0x55809c6a886b in ./darknet)
frame #15: torch::nn::AnyModule::Value torch::unpack<torch::nn::AnyModule::Value, at::Tensor, torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::InvokeForward, torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::CheckedGetter>(torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::InvokeForward, torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::CheckedGetter) + 0x5c (0x55809c6a7965 in ./darknet)
frame #16: torch::nn::AnyModule::Holder<upSampleLayer, at::Tensor>::forward(std::vector<torch::nn::AnyModule::Value, std::allocator<torch::nn::AnyModule::Value> >&&) + 0x1b0 (0x55809c6a526a in ./darknet)
frame #17: torch::nn::AnyModule::Value torch::nn::AnyModule::any_forward<torch::nn::AnyModule::Value>(torch::nn::AnyModule::Value&&) + 0x1b2 (0x55809c6addd0 in ./darknet)
frame #18: at::Tensor torch::nn::SequentialImpl::forward<at::Tensor, at::Tensor&>(at::Tensor&) + 0x1dd (0x55809c6ad03b in ./darknet)
frame #19: main + 0x440 (0x55809c6ac714 in ./darknet)
frame #20: __libc_start_main + 0xe7 (0x7fb691a7bb97 in /lib/x86_64-linux-gnu/libc.so.6)
frame #21: _start + 0x2a (0x55809c673dba in ./darknet)

Aborted (core dumped)
(pytorch) xx405@xx405-Precision-5820-Tower-X-Series:~/下载/yolov3-master$
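For reference, the c10::Error above is raised by upsample_bilinear2d, which expects output_size to contain both spatial dimensions ({height, width}); a one-element size produces exactly this message. Below is a minimal standalone sketch, assuming the libtorch 1.2-era three-argument signature that appears in the trace (this is not the repository's upSampleLayer code, just an illustration of the constraint):

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
  // A dummy feature map shaped like a YOLOv3 intermediate tensor: NCHW.
  auto x = torch::rand({1, 256, 13, 13});

  // A single-element output_size reproduces the reported error:
  // "It is expected output_size equals to 2, but got size 1"
  // auto bad = torch::upsample_bilinear2d(x, {26}, /*align_corners=*/false);

  // Supplying both spatial dimensions (2x upsampling here) succeeds.
  auto ok = torch::upsample_bilinear2d(x, {26, 26}, /*align_corners=*/false);
  std::cout << ok.sizes() << std::endl;  // expected: [1, 256, 26, 26]
  return 0;
}
```

If the upsample layer builds its output_size from a single scale value, passing both the target height and width of the feature map should avoid the error.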
