add yolov10 detection node #8753
Comments
@storrrrrrrrm
If the inference code can be reused from existing Autoware, it shouldn't matter. https://github.com/THU-MIG/yolov10/blob/main/docs/en/integrations/tensorrt.md We will only use the model that is exported to TensorRT format. But if we are going to use the TensorRT inference code from their repository, we cannot put it under Autoware.Universe. In that case we can create a sub-repo under autowarefoundation and add it to
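For context, the export path referenced above typically goes model → ONNX → TensorRT engine, with the last step done by NVIDIA's `trtexec` tool. A minimal sketch of assembling that conversion command (the helper name and defaults are illustrative, not from the linked docs; only `--onnx`, `--saveEngine`, and `--fp16` are real `trtexec` flags):

```python
# Illustrative helper: build the trtexec command line that converts an
# exported ONNX model into a serialized TensorRT engine.
def build_trtexec_cmd(onnx_path: str, engine_path: str, fp16: bool = True) -> list:
    cmd = ["trtexec", f"--onnx={onnx_path}", f"--saveEngine={engine_path}"]
    if fp16:
        cmd.append("--fp16")  # enable FP16 kernels where the GPU supports them
    return cmd

# Example: the command the node's build/setup step might invoke.
print(" ".join(build_trtexec_cmd("yolov10.onnx", "yolov10.engine")))
```

Running the resulting command requires a machine with TensorRT installed; the helper itself only builds the argument list.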
Don't worry, I will only refer to the code logic, and I will implement the new node based on the existing Autoware code.
@storrrrrrrrm (cc: @mitsudome-r @xmfcx)
Morning @kminoda san, thanks for your reminder. I checked the original YOLOv10 repo and found an open discussion about the license of the model (discussion link). It looks like the author has not responded yet. Since the tensorrt-yolov10 node code is completely refactored based on TIER IV's tensorrt-yolox, I believe it should follow TIER IV's license (Apache 2.0), as Shin san suggested. The discussion link is here. Looking forward to your further comments. Have a nice day! Xingang
Checklist
Description
Add a node which uses the latest YOLOv10 model for object detection.
Purpose
YOLOv10 is faster and has higher AP on COCO than previous YOLO-series models, so it may help improve detection performance.
Possible approaches
Referring to the Python code in https://github.com/THU-MIG/yolov10, we can implement a node that uses TensorRT for inference.
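One simplifying property of YOLOv10 is that it is NMS-free, so the node's postprocessing can reduce to a confidence filter over the model's detection output. A minimal NumPy sketch, assuming the common (num_dets, 6) layout of [x1, y1, x2, y2, score, class_id] per row (the function name is illustrative; the real node would do this on the TensorRT output buffer):

```python
import numpy as np

def decode_yolov10(preds: np.ndarray, score_thresh: float = 0.25):
    """Filter raw YOLOv10 detections by confidence.

    preds: (num_dets, 6) array, each row [x1, y1, x2, y2, score, class_id].
    YOLOv10 is NMS-free, so no non-maximum suppression step is needed.
    """
    keep = preds[:, 4] >= score_thresh
    dets = preds[keep]
    boxes = dets[:, :4]            # pixel-space corner coordinates
    scores = dets[:, 4]
    classes = dets[:, 5].astype(int)
    return boxes, scores, classes

# Usage with dummy detections: one confident box, one below threshold.
preds = np.array([[0.0, 0.0, 10.0, 10.0, 0.9, 1.0],
                  [0.0, 0.0,  5.0,  5.0, 0.1, 2.0]])
boxes, scores, classes = decode_yolov10(preds)
```

The same filter can feed directly into the existing Autoware detection message conversion, which is where most of the reusable tensorrt-yolox node logic would apply.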
Definition of done