
Llama 4 support #2015

Open
codestar12 opened this issue Apr 14, 2025 · 4 comments
Labels
enhancement New feature or request

Comments

@codestar12

With the new Llama 4 release, it would be nice to support the new models.

codestar12 added the enhancement label on Apr 14, 2025
@codestar12
Author

I know there is support for MoEs with Mixtral. I'm not sure how much of a lift it will take but I'm willing to help if people can point me in the right direction.
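
For reference, Mixtral-style MoE support mostly comes down to a top-k routed feed-forward block, and Llama 4's MoE layers follow the same general pattern (with extras such as a shared expert). Below is a minimal sketch of that routing in PyTorch; the module and parameter names are illustrative, not this repo's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SwiGLUExpert(nn.Module):
    """One SwiGLU feed-forward expert (same shape as a dense Llama MLP)."""

    def __init__(self, dim: int, hidden_dim: int) -> None:
        super().__init__()
        self.w1 = nn.Linear(dim, hidden_dim, bias=False)  # gate projection
        self.w3 = nn.Linear(dim, hidden_dim, bias=False)  # up projection
        self.w2 = nn.Linear(hidden_dim, dim, bias=False)  # down projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w2(F.silu(self.w1(x)) * self.w3(x))


class SparseMoE(nn.Module):
    """Top-k routed mixture of SwiGLU experts (Mixtral-style sketch)."""

    def __init__(self, dim: int, hidden_dim: int, n_experts: int = 8, top_k: int = 2) -> None:
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, n_experts, bias=False)  # router
        self.experts = nn.ModuleList(SwiGLUExpert(dim, hidden_dim) for _ in range(n_experts))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        tokens = x.view(-1, C)                   # flatten to (B*T, C)
        router_logits = self.gate(tokens)        # (B*T, n_experts)
        weights, selected = torch.topk(router_logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1, dtype=torch.float).to(x.dtype)
        out = torch.zeros_like(tokens)
        # Send each token only to its selected experts and blend the outputs.
        for i, expert in enumerate(self.experts):
            token_idx, slot = torch.where(selected == i)
            if token_idx.numel() == 0:
                continue
            out[token_idx] += weights[token_idx, slot, None] * expert(tokens[token_idx])
        return out.view(B, T, C)
```

In practice the bigger lift is usually checkpoint conversion and config plumbing rather than the layer itself.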

@bhimrazy
Contributor

@ysjprojects, any thoughts on this?

@codestar12
Author

bump

@ysjprojects
Contributor

ysjprojects commented Jun 7, 2025

Will be looking into Llama 4's architecture over the next few days and will share my thoughts.
