Is v9-e model not available for inference? #128

Open
dsbyprateekg opened this issue Nov 25, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@dsbyprateekg

I am testing the inference command of YOLOv9_MIT with the command below:

```
!yolo task=inference name="testing" device="cuda" model="v9-e" task.nms.min_confidence=0.1 task.fast_inference="trt" task.data.source="horses.jpg" +quite=True
```

And I am getting the following error:

```
In 'config': Could not find 'model/v9-e'

Available options in 'model':
    v7
    v9-c
    v9-c-cls
    v9-c-seg
    v9-m
    v9-s

Config search path:
    provider=hydra, path=pkg://hydra.conf
    provider=main, path=pkg://yolo.config
    provider=schema, path=structured://
```
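
Since v9-e is not among the listed options, a possible workaround (just a sketch, not an official fix) is to point the same command at one of the models the error output does list, e.g. v9-c. This is the original command with only the model name changed; the `+quite=True` override is omitted, and the leading `!` is Jupyter notebook syntax, so it is dropped for a plain shell:

```bash
# Workaround sketch: the identical inference command, but with v9-c, one
# of the model configs the error output lists as available.
yolo task=inference name="testing" device="cuda" model="v9-c" \
    task.nms.min_confidence=0.1 task.fast_inference="trt" \
    task.data.source="horses.jpg"
```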

dsbyprateekg added the bug label on Nov 25, 2024
@henrytsui000 (Collaborator)

Hi,

Thank you for your question! Currently, we don't provide the v9-e weight as it is still under preparation. We plan to release this weight with the official v1 release.

Best regards,
Henry Tsui
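
In the meantime, one way to check which model configs a given install actually ships (a sketch, assuming Python >= 3.9 and that `yolo.config` is importable as a package, which the `pkg://yolo.config` search path in the error suggests) is to enumerate the YAML files in its `model` directory:

```bash
# Sketch: list the model configs Hydra can see under "pkg://yolo.config".
# The directory name "model" matches the Hydra config group in the error.
python - <<'EOF'
from importlib import resources

model_dir = resources.files("yolo.config").joinpath("model")
for entry in sorted(model_dir.iterdir(), key=lambda e: e.name):
    if entry.name.endswith(".yaml"):
        print(entry.name.removesuffix(".yaml"))
EOF
```

Each printed name should correspond to a value accepted by `model=...` on the command line.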

@dsbyprateekg (Author)


Thanks for the reply.
I hope we will get the official v1 release soon.
