
How to export with an extra_opset when converting from TensorFlow or PyTorch to ONNX (MatrixInverse, torch.inverse)

PINTO
  • Workaround for exporting TensorFlow operations that are not implemented in standard ONNX
  • --extra_opset com.microsoft:1
python3 -m tf2onnx.convert \
--saved-model saved_model \
--opset 11 \
--output model_float32.onnx \
--extra_opset com.microsoft:1 #['com.microsoft:1','ai.onnx.contrib:1','ai.onnx.converters.tensorflow:1']
PINTO
  • Where the domains/versions that can be passed to tf2onnx's extra_opset are defined
  • Join the MICROSOFT_DOMAIN string and the second argument of helper.make_opsetid with a colon
tensorflow-onnx/tf2onnx/constants.py
MICROSOFT_DOMAIN = "com.microsoft"
CONTRIB_OPS_DOMAIN = "ai.onnx.contrib"

# Default opset version for onnx domain
PREFERRED_OPSET = 9

# Default opset for custom ops
TENSORFLOW_OPSET = helper.make_opsetid("ai.onnx.converters.tensorflow", 1)

https://github.com/onnx/tensorflow-onnx/blob/5cd3b5b87ca8fbe38e90d2ecce6f9bda891792bf/tf2onnx/constants.py#L15-L22
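The colon-joined form described above can be sketched in plain Python. Note that `parse_extra_opset` below is a hypothetical helper for illustration, not an actual tf2onnx function:

```python
# Hypothetical helper illustrating how an --extra_opset value such as
# "com.microsoft:1" splits into a (domain, version) pair.
def parse_extra_opset(value: str):
    domain, version = value.rsplit(":", 1)
    return domain, int(version)

for v in ["com.microsoft:1", "ai.onnx.contrib:1", "ai.onnx.converters.tensorflow:1"]:
    print(parse_extra_opset(v))
# ('com.microsoft', 1)
# ('ai.onnx.contrib', 1)
# ('ai.onnx.converters.tensorflow', 1)
```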

PINTO

As one example, here is how to deal with the following MatrixInverse conversion error:

Warning: Unsupported operator MatrixInverse. No schema registered for this operator.

Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.8/dist-packages/onnxsim/__main__.py", line 90, in <module>
    main()
  File "/usr/local/lib/python3.8/dist-packages/onnxsim/__main__.py", line 68, in main
    model_opt, check_ok = onnxsim.simplify(
  File "/usr/local/lib/python3.8/dist-packages/onnxsim/onnx_simplifier.py", line 444, in simplify
    onnx.checker.check_model(model)
  File "/usr/local/lib/python3.8/dist-packages/onnx/checker.py", line 106, in check_model
    C.check_model(protobuf_string)
onnx.onnx_cpp2py_export.checker.ValidationError: No Op registered for MatrixInverse with domain_version of 11

==> Context: Bad node spec for node. Name: Model_tower0/global_refine_visual_hull/get_visual_hull/transform_depth/MatrixInverse OpType: MatrixInverse
PINTO
import onnxruntime as ort
from onnxruntime_extensions import get_library_path

# Register the onnxruntime-extensions custom op library (provides Inverse, etc.)
so = ort.SessionOptions()
so.register_custom_ops_library(get_library_path())

sess = ort.InferenceSession("model1.onnx", so)
print("Inputs:", [inp.name for inp in sess.get_inputs()])
print("Outputs:", [out.name for out in sess.get_outputs()])
PINTO
  • torch.inverse
import torch
from torch.onnx import register_custom_op_symbolic

# Map torch.inverse to the contrib-domain Inverse op during export
def my_inverse(g, self):
    return g.op("ai.onnx.contrib::Inverse", self)

register_custom_op_symbolic('::inverse', my_inverse, 1)
torch.onnx.export(
    model.network,#.module,
    args=(left,right,img_ipm,cam_confs,ipm_m),
    f=onnx_file,
    opset_version=16,
    input_names = ['left','right','img_ipm', 'cam_confs', 'ipm_m'],
    output_names=['pred_seg'],
    custom_opsets={"ai.onnx.contrib": 1},  # <-- not strictly required when the version specified is 1
)
PINTO
  • inference
import numpy as np
import onnxruntime
from onnxruntime_extensions import get_library_path

left = np.ones([1,3,256,640], dtype=np.float32)
right = np.ones([1,3,256,640], dtype=np.float32)
img_ipm = np.ones([1, 3, 128, 128], dtype=np.float32)
ipm_m = np.ones([1, 11], dtype=np.float32)

session_option = onnxruntime.SessionOptions()
session_option.register_custom_ops_library(get_library_path())
providers = ['CPUExecutionProvider']
reloaded_model = onnxruntime.InferenceSession(
    path_or_bytes='sbevnet_opset16_256x640.onnx',
    sess_options=session_option,
    providers=providers,
)
onnx_pred = reloaded_model.run(
    None,
    {
        "left": left,
        "right": right,
        "img_ipm": img_ipm,
        "ipm_m": ipm_m,
    }
)

print(onnx_pred[0].shape)
PINTO
[E:onnxruntime:, sequential_executor.cc:368 Execute]
Non-zero status code returned while running Inverse node.
Name:'Inverse_875'
Status Message: Only 2-d matrix supported.
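The "Only 2-d matrix supported" message means this Inverse kernel rejects batched inputs. One workaround, sketched here in NumPy under the assumption that the batch dimensions can simply be unrolled, is to invert each trailing 2-D matrix individually:

```python
import numpy as np

def batched_inverse(x: np.ndarray) -> np.ndarray:
    # Unroll all leading (batch) dimensions and invert each 2-D matrix
    # individually, mirroring what a kernel limited to 2-D inputs can handle.
    flat = x.reshape(-1, x.shape[-2], x.shape[-1])
    out = np.stack([np.linalg.inv(m) for m in flat])
    return out.reshape(x.shape)

a = np.eye(3)[None].repeat(2, axis=0)  # shape (2, 3, 3)
print(batched_inverse(a).shape)  # (2, 3, 3)
```

The same unrolling idea can be expressed at export time by reshaping the tensor to 2-D slices before the inverse and restoring the batch shape afterwards.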