
ONNX Runtime ONNXRuntimeError: [ONNXRuntimeError] : 11 : INVALID_ARGUMENT : Unsupported data type

The model uses a data type that is not supported by ONNX Runtime.


What is ONNX Runtime ONNXRuntimeError: [ONNXRuntimeError] : 11 : INVALID_ARGUMENT : Unsupported data type

Understanding ONNX Runtime

ONNX Runtime is a high-performance inference engine for machine learning models in the Open Neural Network Exchange (ONNX) format. It is designed to accelerate the deployment of machine learning models across various platforms and hardware. ONNX Runtime supports a wide range of operators and data types, enabling developers to run models efficiently on different devices.

Identifying the Symptom

When working with ONNX Runtime, you might encounter the following error message: ONNXRuntimeError: [ONNXRuntimeError] : 11 : INVALID_ARGUMENT : Unsupported data type. This error indicates that the model you are trying to run includes a data type that ONNX Runtime does not support.

Common Scenarios

• Loading a model trained with an unsupported data type.
• Attempting to run inference on a model with custom or experimental data types.

Exploring the Issue

The error code INVALID_ARGUMENT indicates that something passed to ONNX Runtime is not valid; in this case, the message points to an unsupported data type in the model. ONNX Runtime supports the standard ONNX tensor types such as float32, int32, and bool, but not every build or execution provider implements every type for every operator. If your model uses a data type outside the supported set, you may encounter this error.
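
The exception typically surfaces when the InferenceSession is created or when run() is called. Below is a minimal sketch of that pattern, assuming a placeholder model file model.onnx with a single float32 input of shape (1, 4):

import numpy as np
import onnxruntime as ort

try:
    # Session creation loads and validates the graph; unsupported types can fail here.
    session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name
    # run() also rejects inputs whose NumPy dtype does not match the model.
    outputs = session.run(None, {input_name: np.zeros((1, 4), dtype=np.float32)})
except Exception as exc:
    # An unsupported data type is reported here with the INVALID_ARGUMENT code.
    print(exc)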

Why This Happens

This issue often arises when a model is exported from a framework that supports a broader range of data types than your ONNX Runtime build. For example, some frameworks can export tensors as float16, bfloat16, or complex64, and the ONNX Runtime version or execution provider you are using may not support those types for every operator.

Steps to Resolve the Issue

To fix this issue, you can take the following steps:

Step 1: Check Supported Data Types

First, verify the data types supported by your version of ONNX Runtime. You can find this information in the ONNX Runtime documentation. Ensure that your model only uses these supported types.
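
If you have the onnx Python package installed, a short script can list every tensor element type the model declares, which makes it easy to compare against the documented support matrix. A minimal sketch, assuming a placeholder model file model.onnx:

import onnx
from onnx import TensorProto

model = onnx.load("model.onnx")

used_types = set()
# Element types of the weights (initializers).
for init in model.graph.initializer:
    used_types.add(init.data_type)
# Element types declared on graph inputs, outputs, and intermediate values.
for value in list(model.graph.input) + list(model.graph.output) + list(model.graph.value_info):
    used_types.add(value.type.tensor_type.elem_type)

for elem_type in sorted(used_types):
    print(TensorProto.DataType.Name(elem_type))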

Step 2: Convert Unsupported Data Types

If your model uses unsupported data types, consider converting them to supported ones. For instance, if your model uses float16, you might convert these to float32. This can be done using model conversion tools or scripts.
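
For the common float16-to-float32 case, the onnx package alone is enough. The following is a minimal sketch, assuming placeholder file names model_fp16.onnx and model_fp32.onnx; real models may also contain Cast nodes or node attributes that need separate handling:

import numpy as np
import onnx
from onnx import numpy_helper, TensorProto

model = onnx.load("model_fp16.onnx")

# Cast float16 weights (initializers) to float32.
for init in model.graph.initializer:
    if init.data_type == TensorProto.FLOAT16:
        arr = numpy_helper.to_array(init).astype(np.float32)
        init.CopyFrom(numpy_helper.from_array(arr, init.name))

# Update the type annotations on inputs, outputs, and intermediate values.
for value in list(model.graph.input) + list(model.graph.output) + list(model.graph.value_info):
    tensor_type = value.type.tensor_type
    if tensor_type.elem_type == TensorProto.FLOAT16:
        tensor_type.elem_type = TensorProto.FLOAT

onnx.checker.check_model(model)
onnx.save(model, "model_fp32.onnx")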

Step 3: Update ONNX Runtime

Ensure you are using the latest version of ONNX Runtime, as newer versions may include support for additional data types. You can update ONNX Runtime using the following command:

pip install --upgrade onnxruntime
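
After upgrading, you can confirm the installed version and the execution providers available in your build:

import onnxruntime as ort

print(ort.__version__)
print(ort.get_available_providers())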

Step 4: Re-export the Model

If the issue persists, consider re-exporting the model from the original framework, ensuring that only supported data types are used. Refer to the ONNX documentation for guidance on exporting models.
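
As an illustration, the sketch below re-exports a model from PyTorch with float32 tensors only; the tiny example network, file name, and opset version are placeholders for your own model and settings:

import torch
import torch.nn as nn

# Placeholder network; substitute your own model cast to float32.
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2)).float()
dummy_input = torch.randn(1, 4, dtype=torch.float32)

torch.onnx.export(
    net,
    dummy_input,
    "model_fp32.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=17,
)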

Conclusion

By following these steps, you should be able to resolve the ONNXRuntimeError: INVALID_ARGUMENT : Unsupported data type error. Ensuring compatibility between your model's data types and ONNX Runtime's supported types is crucial for successful model deployment. For further assistance, consider reaching out to the ONNX Runtime community.
