
Conversation

joao-alves97 commented Oct 28, 2022

Support the architecture proposed in Beyond English-Centric Multilingual Machine Translation.
I think this is particularly relevant considering that the NLLB200 implementation in the original transformers library is based on the M2M100 implementation. Once M2M100 is supported, it would be easy to support NLLB200 as well.

joao-alves97 (Author)

Since the M2M100 implementation is based on the mBART50 implementation, I also based the M2M100 adapter implementation on the one for mBART50.

joao-alves97 (Author)

Any news on this? I don't know who I can ping, maybe @calpt?

calpt (Member) commented Sep 9, 2023

Hey, thanks for your efforts in contributing new model architectures to adapter-transformers and sorry for the silence on our side.

In the last few weeks, we've been working on a large refactoring of our project, which will ultimately result in the release of Adapters, the next-generation adapters library. See #584.

As a consequence, we plan to merge any new model integrations directly into the new codebase, which can currently be found on this branch. Unfortunately, this necessitates some changes in the model integration code (detailed here; see already integrated models such as BERT and BART for reference).

If you'd be willing to update your model integration to target the new library yourself, we'd be super happy to help you on this. Otherwise, we might look into upgrading and merging some of the open model integration PRs ourselves in the future. For more details, again see #584.
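For context, here is a minimal usage sketch of what the M2M100 integration might look like once retargeted to the new Adapters library, assuming it exposes the same adapter API as already integrated seq2seq models such as mBART and BART; the adapter name and checkpoint below are just examples, not part of this PR:

```python
# Hypothetical sketch: using M2M100 with the new Adapters library, assuming the
# integration mirrors the existing mBART/BART ones (not part of this PR's diff).
import adapters
from transformers import M2M100ForConditionalGeneration

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")

# Attach adapter support to the plain transformers model (new-library style).
adapters.init(model)

# Add a bottleneck adapter, freeze the base model, and activate the adapter for training.
model.add_adapter("translation_adapter")
model.train_adapter("translation_adapter")
model.set_active_adapters("translation_adapter")
```

The concrete class and method names for M2M100 would depend on the final integration (see #769).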

calpt (Member) commented Apr 20, 2025

Closing in favor of #769.

calpt closed this on Apr 20, 2025