Hello, please describe the problem you are encountering in detail:
1. Hardware acquisition channel:
2. Current system image version:
3. Current 天工开物 (OpenExplorer) version:
4. Where the problem occurs:
5. Demo/example being developed:
6. Solution you need:


To export a float ONNX, add to the config:
Use the following command:
python3 tools/export_onnx.py --config /workspace/bev_config_path.py
Note: the exported BEV ONNX contains plugin operators.

I made the modification, but the following error occurred:
"is desired. ".format(mode)
Traceback (most recent call last):
File "tools/export_onnx.py", line 86, in
export_to_onnx(model, (example_input, {}), file_path, **kwargs)
File "/usr/local/lib64/python3.6/site-packages/horizon_plugin_pytorch/utils/onnx_helper.py", line 196, in export_to_onnx
custom_opsets=custom_opsets,
File "/root/.local/lib/python3.6/site-packages/torch/onnx/__init__.py", line 320, in export
custom_opsets, enable_onnx_checker, use_external_data_format)
File "/root/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 111, in export
custom_opsets=custom_opsets, use_external_data_format=use_external_data_format)
File "/root/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 729, in _export
dynamic_axes=dynamic_axes)
File "/root/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 493, in _model_to_graph
graph, params, torch_out, module = _create_jit_graph(model, args)
File "/root/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 437, in _create_jit_graph
graph, torch_out = _trace_and_get_graph_from_model(model, args)
File "/root/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 388, in _trace_and_get_graph_from_model
torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
File "/root/.local/lib/python3.6/site-packages/torch/jit/_trace.py", line 1166, in _get_trace_graph
outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/hat/utils/module_patch.py", line 46, in _wrap
return fn(self, *args, **kwargs)
File "/root/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
return forward_call(*input, **kwargs)
File "/root/.local/lib/python3.6/site-packages/torch/jit/_trace.py", line 95, in forward
in_vars, in_desc = _flatten(args)
RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: placeholder
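The RuntimeError is raised by torch.jit's input flattening step (`_flatten` in the last traceback frame): traced inputs may only contain tensors, tuples, and lists (dicts and strings are tolerated but discouraged), and the `(example_input, {})` passed to export apparently carries a `placeholder` object that is none of these. A rough pure-Python sketch of that type check, for illustration only (the real check lives in PyTorch's C++; `is_tensor_like` is a hypothetical stand-in for an actual `torch.Tensor` test):

```python
def is_tensor_like(obj):
    # Hypothetical stand-in for isinstance(obj, torch.Tensor).
    return hasattr(obj, "shape") and hasattr(obj, "dtype")

def flatten_jit_inputs(obj):
    """Flatten nested tuples/lists/dicts of tensors into a flat list,
    rejecting any leaf that is not tensor-like (simplified sketch of
    torch.jit's _flatten behavior)."""
    if is_tensor_like(obj) or isinstance(obj, str):
        return [obj]
    if isinstance(obj, (tuple, list)):
        flat = []
        for item in obj:
            flat.extend(flatten_jit_inputs(item))
        return flat
    if isinstance(obj, dict):  # accepted but discouraged
        flat = []
        for value in obj.values():
            flat.extend(flatten_jit_inputs(value))
        return flat
    raise RuntimeError(
        "Only tuples, lists and Variables are supported as JIT "
        "inputs/outputs. Dictionaries and strings are also accepted, "
        "but their usage is not recommended. Here, received an input "
        "of unsupported type: {}".format(type(obj).__name__)
    )
```

So the usual fix is to make sure every leaf of `example_input` is a real tensor (or drop/replace the placeholder entry) before calling the export script.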

The technical articles in the toolchain section will also include BEV-series reference documents.

1. The first input is an RGB image;
3. Run hrt_model_exec model_info --model_file XXX.hbm to check whether the outputs have a quanti type and scale values, and to see the output type; dequantize according to the scale value.
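For the last step above, once model_info reports a quantized output with a scale, the on-board integer output is converted back to float by multiplying by that scale. A minimal sketch, assuming symmetric (zero-point-free) quantization, which is what a scale-only report suggests; a per-channel model would apply each channel's own scale:

```python
def dequantize(quantized_values, scale):
    """Convert quantized integer outputs back to float: f = q * scale.

    Assumes symmetric quantization with no zero point, as suggested by
    a scale-only model_info report; for per-channel quantization, apply
    each channel's scale to that channel's values instead.
    """
    return [q * scale for q in quantized_values]
```

For example, an int32 output value of 2 with scale 0.5 dequantizes to 1.0.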
