
Model Conversion Colab Issue #489

Open
kuaashish opened this issue Jan 22, 2025 · 1 comment
kuaashish (Contributor) commented Jan 22, 2025

This issue pertains to the Model Conversion script documented here and originally raised in google-ai-edge/mediapipe#5811. When converting supported models in CPU mode, the conversion does not work as expected for all of them.

Specifically, the CPU conversion fails for the Falcon 1B, StableLM 3B, Phi-2, and Gemma 2B models, while GPU conversion works as expected for Falcon 1B, StableLM 3B, Phi-2, Gemma 2B, and Gemma 7B.
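
For reference, a minimal sketch of the kind of converter call the Colab makes, based on the `mediapipe.tasks.python.genai.converter` API. The paths, `model_type` string, and vocab file below are illustrative placeholders, not values taken from this issue; switching `backend` between `"cpu"` and `"gpu"` is where the behavior reportedly differs.

```python
# Sketch of the conversion step, assuming the mediapipe.tasks.python.genai
# converter API; checkpoint paths and model_type are placeholders.
from mediapipe.tasks.python.genai import converter

config = converter.ConversionConfig(
    input_ckpt="/content/gemma-2b-it/",        # placeholder checkpoint directory
    ckpt_format="safetensors",                 # or "pytorch", depending on the source checkpoint
    model_type="GEMMA_2B",                     # placeholder; other supported types cover Falcon 1B, StableLM 3B, Phi-2
    backend="cpu",                             # "cpu" is the mode reported as failing; "gpu" works
    output_dir="/content/intermediate/",
    combine_file_only=False,
    vocab_model_file="/content/gemma-2b-it/tokenizer.model",
    output_tflite_file="/content/gemma_2b_cpu.bin",
)

converter.convert_checkpoint(config)
```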

kuaashish (Contributor, Author) commented
Hi @woodyhoko, @PaulTR,

Could you please look into this issue?

Thank you!!
