During QAT training of a J6E model (a multi-task model with multiple modules: backbone, FPN, head), we hit three issues.
(1) prepare fails while preparing the QAT model:
WARNING: check model failed. The common reason is that inputs required for multiple forward are inconsistent. solution:
1. call horizon_plugin_pytorch.utils.check_model.check_qat_model individually
2. rewrite model forward to ensure that inputs of multiple forward are consistent.
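Our guess at what "inputs of multiple forward are inconsistent" refers to, sketched in plain PyTorch (the `MultiTaskHead` module and the input-normalization trick are our own illustration, not the actual model code): if a submodule's forward is sometimes called with a single tensor and sometimes with a list of tensors, graph capture during prepare can see inconsistent signatures, so we normalize the input structure at the top of forward.

```python
import torch
from torch import nn


class MultiTaskHead(nn.Module):
    """Illustrative head whose forward accepts either a tensor or a list.

    Normalizing the input so every call sees the same structure is one way
    to satisfy "inputs of multiple forward are consistent".
    """

    def forward(self, feats):
        # Normalize: make every call look like a list of tensors.
        if torch.is_tensor(feats):
            feats = [feats]
        return [f.relu() for f in feats]


m = MultiTaskHead()
out1 = m(torch.ones(1, 4))                      # called with a tensor
out2 = m([torch.ones(1, 4), torch.ones(1, 4)])  # called with a list
```

Is this the kind of rewrite the warning is asking for, or should we instead run `horizon_plugin_pytorch.utils.check_model.check_qat_model` on each submodule individually as suggested?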
(2) The prepare step also prints the following message. Does it require any changes on our side?
INFO: cat_19 doesn't support high precision output.
(3) When training starts, it fails with:
TypeError: rand() received an invalid combination of arguments - got (list, device=torch.device, dtype=QuantDType), but expected one of:
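A minimal reproduction of the pattern we suspect is the cause (this is our assumption, not confirmed): somewhere in the forward, a random tensor is created with `dtype` inherited from an activation, and after prepare that activation's dtype is a QuantDType, which `torch.rand()` cannot accept. Pinning the dtype to a float type avoids the error.

```python
import torch


def make_noise(x: torch.Tensor) -> torch.Tensor:
    """Create noise matching x's shape and device.

    After QAT prepare, `x.dtype` may be a QuantDType that torch.rand()
    rejects, so we pin the dtype to float32 instead of inheriting it
    via `dtype=x.dtype`.
    """
    return torch.rand(list(x.shape), device=x.device, dtype=torch.float32)


x = torch.zeros(2, 3)
noise = make_noise(x)
```

Does this match the actual cause of the `dtype=QuantDType` argument in the traceback?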
The full error is in the attachment. Could you please take a look? Thanks!

