Shape currently only support FLOAT, INT8 and INT32 but gives INT64

cherry 2023-08-14
1. Chip model: X3 Pi
2. OpenExplorer SDK version: horizon_xj3_open_explorer_v2.4.2_20221227
3. Problem stage: model conversion <--> on-board deployment
4. Detailed description of the problem

I am deploying a tracking algorithm named SiamGAT: https://github.com/ohhhyeahhh/SiamGAT

I have successfully converted the network to an ONNX model, which is available at:

Link: https://pan.baidu.com/s/1rcSpzW62soM3_SDxJXPDEg (extraction code: x4s9)

However, when I use "hb_mapper" to check the model, an error appears:

hb_mapper checker --model-type onnx --march bernoulli2 --model SiamGAT.onnx --input-shape input1 1x3x127x127 --input-shape input2 1x3x287x287

hb_mapper_checker.log is in the attached file.

I have located the problem in the ONNX model:

The problem appears after input3 (a bbox reshaped to 1x4x1x1) enters the network. The bbox is used as an index to extract the roi from a tensor:

The error about "int64" occurs because I use a "long" tensor:

roi = torch.round((bbox + 1 - offset + stride / 2) / stride - 1).long()

I have tried changing ".long()" to ".int()", but the ONNX model cannot be generated that way, because:

to serve as an index, roi must be "int64" in ONNX
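For what it's worth, the int64 restriction only bites inside the ONNX graph; in plain array code the same formula works fine with an int32 roi, since int32 scalars implement `__index__`. A minimal NumPy sketch of the same computation (the bbox/offset/stride values here are made up for illustration, not taken from SiamGAT):

```python
import numpy as np

# Hypothetical bbox/offset/stride values, just for illustration.
bbox = np.array([[10.0, 20.0, 90.0, 106.0]])
offset, stride = 63.0, 8.0

# Same formula as the PyTorch line, but cast to int32 instead of int64.
roi = np.round((bbox + 1 - offset + stride / 2) / stride - 1).astype(np.int32)

# int32 scalars are valid slice bounds, so the indexing works unchanged.
mask = np.zeros((1, 3, 12, 12), dtype=np.float32)
mask[0, :,
     max(0, roi[0][1]):min(roi[0][3], 12),
     max(0, roi[0][0]):min(roi[0][2], 12)] = 1
```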

The model in which the error appears is provided in the attached file.

Algorithm Toolchain

Comments (2)
  • 颜值即正义
    Lv.2

    Hi, have you tried converting roi.dtype to "Int32" or "float32" to see whether it works?

    Another point needs confirming: you said that changing ".long()" to ".int()" prevents the ONNX model from being generated, because roi must be "int64" to serve as an index. Does the data type of the variable "roi" actually affect the export of the network to an ONNX model?

    2023-08-17
    • cherry replied to 颜值即正义:

      Thanks for your response!

      1. I have tried converting roi.dtype to "Int32" or "float32", but an error appears:

      TypeError: only integer tensors of a single element can be converted to an index

      This is because our code uses elements of roi as indices to perform a "slice" operation on the tensor "mask":

      mask[0, :, max(0, roi[0][1]): (min(roi[0][3], 12)), max(0, roi[0][0]): (min(roi[0][2], 12))] = 1
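The TypeError above can be reproduced outside PyTorch as well: Python slicing rejects float scalars (they lack `__index__`) but accepts int32 ones, so the index really must stay an integer tensor, though not necessarily int64. A small NumPy sketch (the roi values are hypothetical):

```python
import numpy as np

mask = np.zeros((1, 3, 12, 12), dtype=np.float32)
roi_f = np.array([[1.0, 2.0, 5.0, 6.0]], dtype=np.float32)  # float roi

# Float scalars are not valid slice bounds; this mirrors
# "only integer tensors of a single element can be converted to an index".
float_index_failed = False
try:
    mask[0, :, roi_f[0][1]:roi_f[0][3]] = 1
except (TypeError, IndexError):
    float_index_failed = True

# Casting to an integer dtype (int32 is enough) restores valid indexing.
roi_i = roi_f.astype(np.int32)
mask[0, :, roi_i[0][1]:roi_i[0][3]] = 1
```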

      2. If I change ".long()" to ".int()", the export from the network to an ONNX model succeeds.

      However, when using hb_mapper to check this ONNX model (provided in [1]), the AI toolchain raises the type error mentioned above:

      ERROR [ONNXRuntimeError] : 1 : FAIL : Type Error: Type parameter (Tind) bound to different types (tensor(int64) and tensor(int32) in node (/backbone/Slice_2).

      ERROR *** ERROR-OCCUR-DURING {horizon_nn.check_onnx} ***

      [1] Link: https://pan.baidu.com/s/1ZL8fk8zF5D3CyhefaWQ-zg (extraction code: yxwy)
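One plausible reading of this Tind error, based on the ONNX Slice operator spec: a Slice node's starts/ends/axes/steps inputs must all bind to a single index type, either int32 or int64. If roi is cast to int32 while literal bounds such as 0 and 12 are traced as int64 constants, the node mixes the two and onnxruntime rejects it. The usual remedy is to keep every slice bound in one dtype; a NumPy sketch of the idea (variable names are hypothetical, not from the SiamGAT code):

```python
import numpy as np

DTYPE = np.int32  # the single index dtype every slice bound should share

roi = np.array([[-7, -6, 3, 6]], dtype=DTYPE)

# Express the fixed bounds in the same dtype as roi, so a traced Slice
# node would see only one index type instead of mixed int32/int64.
zero, cap = DTYPE(0), DTYPE(12)
y0, y1 = max(zero, roi[0][1]), min(roi[0][3], cap)
x0, x1 = max(zero, roi[0][0]), min(roi[0][2], cap)

mask = np.zeros((1, 3, 12, 12), dtype=np.float32)
mask[0, :, y0:y1, x0:x1] = 1
```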

      2023-08-20
    • 颜值即正义 replied to cherry:

      As shown in the attached figure, int64 is not supported.

      2023-08-21
  • cherry
    Lv.1

    If I change ".long()" to ".int()", i.e.

    roi = torch.round((bbox + 1 - offset + stride / 2) / stride - 1).int()

    the ONNX model cannot be generated this way, because of:

    "

    ERROR [ONNXRuntimeError] : 1 : FAIL : Type Error: Type parameter (Tind) bound to different types (tensor(int64) and tensor(int32) in node (/backbone/Slice_2).

    ERROR *** ERROR-OCCUR-DURING {horizon_nn.check_onnx} ***

    "

    2023-08-14