Conversation

@ankitade (Contributor) commented on Jun 21, 2022

Move the projections from the contrastive loss into the core model.
This lets users run zero-shot inference with the core model directly, instead of going through the pretraining model (see the sketch after the test plan below).
Also moved to using the translated checkpoint.

Test plan

  1. pytest
  2. python -m flava.train config=flava/configs/pretraining/debug.yaml
  3. python -m flava.finetune config=flava/configs/finetuning/qnli.yaml
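
A rough, self-contained sketch of the zero-shot flow this change enables is below. It does not use the torchmultimodal / FLAVA API: the class, method, and dimension names (`CoreModelWithProjections`, `encode_image`, `encode_text`, the 768/256 sizes) are illustrative stand-ins for a core model that owns its own projection heads.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CoreModelWithProjections(nn.Module):
    """Stand-in for a multimodal core model that owns its projection heads."""

    def __init__(self, image_dim: int = 768, text_dim: int = 768, proj_dim: int = 256):
        super().__init__()
        # Stand-ins for the image/text encoders (the real ones are transformers).
        self.image_encoder = nn.Linear(image_dim, image_dim)
        self.text_encoder = nn.Linear(text_dim, text_dim)
        # The projections live in the core model rather than in the
        # contrastive-loss module, so inference code can call them directly.
        self.image_projection = nn.Linear(image_dim, proj_dim)
        self.text_projection = nn.Linear(text_dim, proj_dim)

    def encode_image(self, image_features: torch.Tensor) -> torch.Tensor:
        return self.image_projection(self.image_encoder(image_features))

    def encode_text(self, text_features: torch.Tensor) -> torch.Tensor:
        return self.text_projection(self.text_encoder(text_features))


model = CoreModelWithProjections().eval()

# Fake batch: 2 "images" and 3 candidate class prompts (already featurized).
image_features = torch.randn(2, 768)
text_features = torch.randn(3, 768)

with torch.no_grad():
    img = F.normalize(model.encode_image(image_features), dim=-1)
    txt = F.normalize(model.encode_text(text_features), dim=-1)

logits = img @ txt.t()        # cosine similarity for each (image, prompt) pair
pred = logits.argmax(dim=-1)  # zero-shot prediction: best prompt per image
print(pred)
```

With the projections inside the core model, the contrastive pretraining wrapper is only needed to compute the loss during training; zero-shot scoring only needs the core model and its projected embeddings.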

Stack from ghstack (oldest at bottom):

Differential Revision: D37481127

@facebook-github-bot added the CLA Signed label on Jun 21, 2022. (This label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed.)
ankitade added a commit that referenced this pull request Jun 21, 2022
ghstack-source-id: 1b7477c
Pull Request resolved: #106
ankitade added a commit that referenced this pull request Jun 23, 2022
ghstack-source-id: 0bca6e6
Pull Request resolved: #106
@codecov-commenter commented on Jun 23, 2022

Codecov Report

❗ No coverage uploaded for pull request base (gh/ankitade/5/base@3f7009e).
The diff coverage is n/a.

@@                  Coverage Diff                  @@
##             gh/ankitade/5/base     #106   +/-   ##
=====================================================
  Coverage                      ?   93.04%           
=====================================================
  Files                         ?       47           
  Lines                         ?     2776           
  Branches                      ?        0           
=====================================================
  Hits                          ?     2583           
  Misses                        ?      193           
  Partials                      ?        0           

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 3f7009e...4bcee67. Read the comment docs.

ankitade added a commit that referenced this pull request Jun 26, 2022
ghstack-source-id: 4c0738f
Pull Request resolved: #106
ankitade added a commit that referenced this pull request Jun 26, 2022
ghstack-source-id: a97330d
Pull Request resolved: #106
@ankitade changed the title from "Temp CL" to "[FLAVA] Make projections part of the core model" on Jun 28, 2022
ankitade added a commit that referenced this pull request Jun 28, 2022
ghstack-source-id: f8b9173
Pull Request resolved: #106
ankitade added a commit that referenced this pull request Jun 28, 2022
ghstack-source-id: 4844b17
Pull Request resolved: #106
@ankitade has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

ankitade added a commit that referenced this pull request Jun 30, 2022
ghstack-source-id: e6b230c
Pull Request resolved: #106
@ankitade marked this pull request as ready for review on July 13, 2022 06:26

@facebook-github-bot deleted the gh/ankitade/5/head branch on July 29, 2022 14:17