Hello,
I downloaded the public yolov9-m-converted.pt model from WongKinYiu/yolov9 and converted it from .pt to ONNX with export.py. Then I deployed the model on Triton Inference Server. The model's output dimensions are:
{
name: "output0"
data_type: TYPE_FP16
dims: [
1,
84,
8400
]
}
When I run client.py, I get an error because the output layers do not include ["num_dets", "det_boxes", "det_scores", "det_classes"]. Why is my model different from yours, and how should I parse this output?
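For reference, a raw [1, 84, 8400] YOLO-style output can be parsed on the client side without the in-model NMS layers. This is a minimal sketch, assuming the usual layout of 4 box coordinates (cx, cy, w, h) plus 80 class scores per candidate; the array contents and thresholds below are synthetic and illustrative, not taken from this repository:

```python
# Hypothetical sketch: decode a raw (1, 84, 8400) YOLOv9 output
# (4 box coords + 80 class scores per candidate) and apply greedy NMS.
import numpy as np

def nms(boxes, scores, iou_thr=0.45):
    """Greedy NMS on (N, 4) xyxy boxes; returns indices of kept boxes."""
    x1, y1, x2, y2 = boxes.T
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]          # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        order = order[1:][iou <= iou_thr]   # drop heavily overlapping boxes
    return keep

def decode(output0, conf_thr=0.25):
    """output0: (1, 84, 8400) -> (boxes xyxy, scores, class ids) after NMS."""
    preds = output0[0].T                    # (8400, 84)
    boxes_cxcywh = preds[:, :4]
    cls_scores = preds[:, 4:]               # (8400, 80)
    classes = cls_scores.argmax(axis=1)
    scores = cls_scores.max(axis=1)
    mask = scores > conf_thr
    boxes_cxcywh, scores, classes = boxes_cxcywh[mask], scores[mask], classes[mask]
    boxes = np.empty_like(boxes_cxcywh)     # convert cxcywh -> xyxy
    boxes[:, 0] = boxes_cxcywh[:, 0] - boxes_cxcywh[:, 2] / 2
    boxes[:, 1] = boxes_cxcywh[:, 1] - boxes_cxcywh[:, 3] / 2
    boxes[:, 2] = boxes_cxcywh[:, 0] + boxes_cxcywh[:, 2] / 2
    boxes[:, 3] = boxes_cxcywh[:, 1] + boxes_cxcywh[:, 3] / 2
    keep = nms(boxes, scores)
    return boxes[keep], scores[keep], classes[keep]

# Synthetic demo: two overlapping candidates of the same class.
out = np.zeros((1, 84, 8400), dtype=np.float32)
out[0, :4, 0] = [100, 100, 50, 50]          # cx, cy, w, h
out[0, 4, 0] = 0.9                          # class 0 score
out[0, :4, 1] = [102, 102, 50, 50]
out[0, 4, 1] = 0.8
boxes, scores, classes = decode(out)
print(len(boxes))                           # NMS keeps one of the two boxes
```

Note that boxes are in the letterboxed input scale and still need to be rescaled to the original image size.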
The original YOLOv9c models do not include the NMS functionality, so the model's output is raw data containing all detections made by the neural network.
I implemented a NMS solution within the model's final stage to filter these raw detections before returning the output to the client, making it end-to-end (E2E).
You need to export the .pt model to ONNX with this functionality enabled. That way, you can use the model with TensorRT, which expects the output layers to include ["num_dets", "det_boxes", "det_scores", "det_classes"].
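Once the model is exported end-to-end, the client only reads the already-filtered detections. A minimal sketch of that parsing, with synthetic arrays standing in for the Triton response and an assumed cap of 100 detections per image (a common E2E export setting, not a value confirmed by this issue):

```python
# Hypothetical sketch: reading the four E2E output layers on the client side.
# The arrays below are synthetic stand-ins for a Triton inference response.
import numpy as np

num_dets    = np.array([[2]], dtype=np.int32)           # (1, 1)
det_boxes   = np.zeros((1, 100, 4), dtype=np.float32)   # xyxy, zero-padded
det_scores  = np.zeros((1, 100), dtype=np.float32)
det_classes = np.zeros((1, 100), dtype=np.int32)
det_boxes[0, :2]   = [[10, 10, 50, 50], [60, 20, 120, 90]]
det_scores[0, :2]  = [0.92, 0.81]
det_classes[0, :2] = [0, 2]

n = int(num_dets[0, 0])                 # only the first n rows are valid
detections = [
    {"box": det_boxes[0, i].tolist(),
     "score": float(det_scores[0, i]),
     "class_id": int(det_classes[0, i])}
    for i in range(n)
]
print(detections)
```

The key point is that num_dets tells you how many of the padded rows to trust; the rest of each array is filler.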