ASK FOR HELP #1
Comments
Hi chenyang, thanks for your interest in our work!
Thank you for your reply! I will try adding/removing the FD loss later for experimentation and will give you feedback if it works.
Hello, could you tell me which versions of mmcv and the other dependency packages you used? I ran into some version issues while installing the dependencies. Thank you!
Hi TiSgrc, how about providing more details on the issues (e.g., the commands you run, the procedures you take, and the traceback of errors) so that we can have a better idea of where to look?
Hi KiwiXR, I am glad to receive your reply. The specific error is as follows. Actually, the versions of mmseg, mmcv, and pytorch do not match. May I ask which versions you use? Could you take a look at your pip list?
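For reference, a minimal sketch for reporting the versions in question, assuming mmcv, mmseg, and torch are importable in the current environment; it does not imply any particular recommended versions:

```python
# Hypothetical helper for reporting installed versions; assumes the packages
# discussed above are importable in the active environment.
import mmcv
import mmseg
import torch

print("mmcv :", mmcv.__version__)
print("mmseg:", mmseg.__version__)
print("torch:", torch.__version__, "| CUDA:", torch.version.cuda)
```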
Hi TiSgrc, the recommended combination is
Thank you for your reply, it is very important to me. I will try these versions. Thank you again, and I wish you smooth research!
Dear Doctor:
Your work is excellent! I have a few questions and would like to ask for your help.
I added LCL to my UDA model by taking the 2-norm of the logits before passing them into the cross-entropy loss function. Because I am using AMP, I changed the eps from 1e-7 to 1e-3, but after adding LCL my loss curve keeps going up. Did I do anything wrong? (A sketch of this setup follows after this message.)
Looking forward to your help! Thanks!
Best regards!
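For reference, below is a minimal, hypothetical sketch of the setup described in this question: L2-normalizing the logits along the class dimension, with an eps for numerical stability under AMP, before feeding them to cross-entropy. The function and variable names, the toy shapes, and the eps value are illustrative assumptions; this is not the repository's actual LCL implementation.

```python
# Sketch of the setup described above: divide each pixel's logit vector by its
# L2 norm, then apply standard cross-entropy. All names/shapes are illustrative.
import torch
import torch.nn.functional as F


def logit_norm_cross_entropy(logits, target, eps=1e-3, ignore_index=255):
    """L2-normalize logits over the class dimension, then apply cross-entropy.

    logits: (N, C, H, W) raw scores; target: (N, H, W) class indices.
    eps avoids division by zero; a larger value (e.g. 1e-3) is safer under
    AMP/float16 than 1e-7, as mentioned in the question.
    """
    norm = torch.norm(logits, p=2, dim=1, keepdim=True)
    normed_logits = logits / (norm + eps)
    return F.cross_entropy(normed_logits, target, ignore_index=ignore_index)


# Toy usage: 19 classes (e.g. Cityscapes-style segmentation), 2 images of 64x64.
logits = torch.randn(2, 19, 64, 64, requires_grad=True)
target = torch.randint(0, 19, (2, 64, 64))
loss = logit_norm_cross_entropy(logits, target)
loss.backward()
print(float(loss))
```

Note that once the logits are normalized their scale is bounded, so the resulting cross-entropy values are not directly comparable to an unnormalized loss curve; this sketch only reproduces the computation described in the question.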