Error when running RetinaNet inference on Kunlunxin P800 #4403

@Haixu-Liu

Description

FastDeploy version: release/1.1.0; RetinaNet model: retinanet_r50_fpn_1x_coco. Part of the test script is shown below:

import argparse
import os

import cv2
import fastdeploy as fd

parser = argparse.ArgumentParser()
parser.add_argument("--model_dir", default=None)
parser.add_argument("--image_file", default=None)
args = parser.parse_args()

# run on the Kunlunxin XPU backend
runtime_option = fd.RuntimeOption()
runtime_option.use_kunlunxin()

if args.model_dir is None:
    model_dir = fd.download_model(name='retinanet')
else:
    model_dir = args.model_dir

model_file = os.path.join(model_dir, "model.pdmodel")
params_file = os.path.join(model_dir, "model.pdiparams")
config_file = os.path.join(model_dir, "infer_cfg.yml")

# setting for runtime
print("RetinaNet model initializing......")
model = fd.vision.detection.RetinaNet(
    model_file, params_file, config_file, runtime_option=runtime_option)

# predict
if args.image_file is None:
    image_file = fd.utils.get_detection_test_image()
else:
    image_file = args.image_file
im = cv2.imread(image_file)

print("predicting......")
result = model.predict(im)
print(result)

The model loads and compiles without any problem, but inference fails with the following error:

[error screenshot] The error appears to occur inside Paddle-Lite when acquiring the XPU context.
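As a side note, when debugging failures in scripts like the one above, it can help to fail fast with a clear message if the model directory is incomplete, rather than hitting a backend error later. A minimal, stdlib-only sketch (check_model_dir is a hypothetical helper; the three file names are the ones the script expects):

```python
import os

# Files the repro script above expects inside the exported Paddle model dir.
EXPECTED_FILES = ("model.pdmodel", "model.pdiparams", "infer_cfg.yml")

def check_model_dir(model_dir):
    """Return the list of expected files missing from model_dir."""
    return [name for name in EXPECTED_FILES
            if not os.path.isfile(os.path.join(model_dir, name))]

# Usage sketch: call before constructing the model.
# missing = check_model_dir(model_dir)
# if missing:
#     raise FileNotFoundError(f"model dir incomplete, missing: {missing}")
```

This does not address the XPU context error itself (loading succeeded here, so the files are present); it only separates "missing file" problems from runtime/backend ones.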
