Fix warnings in unit-tests tests #6159
base: main
Changes from 5 commits
- 93b3e84
- 5b2049b
- 5ca18fe
- d8d5b84
- 4813241
- 16ddd6c
- 10595d8
- ae06b1b
- f187760
```diff
@@ -74,6 +74,8 @@ def forward(self, x: Tensor) -> Tensor:
 class QuantizableGoogLeNet(GoogLeNet):
     # TODO https://github.com/pytorch/vision/pull/4232#pullrequestreview-730461659
     def __init__(self, *args: Any, **kwargs: Any) -> None:
+        if "init_weights" not in kwargs:
+            kwargs["init_weights"] = True
         super().__init__(  # type: ignore[misc]
             blocks=[QuantizableBasicConv2d, QuantizableInception, QuantizableInceptionAux], *args, **kwargs
         )
```

Review comment:
I'm not sure why, in order to fix the warnings in the unit tests, we modify the models themselves. My expectation, without having all the details in mind, is that the tests are the ones that need to be updated. What am I missing?

Reply:
At first I was thinking that QuantizableGoogLeNet does not properly set up its base class.

Reply:
Ok, let me check and send the PR again.
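For context, a minimal sketch of the mechanism the diff works around, assuming the base class behaves the way torchvision's GoogLeNet did at the time: an unset `init_weights` triggers a warning and then falls back to `True`. The classes, warning text, and category below are simplified stand-ins, not torchvision's actual code:

```python
import warnings
from typing import Any, Optional


class GoogLeNet:
    """Simplified stand-in for torchvision.models.GoogLeNet, not the real class."""

    def __init__(self, init_weights: Optional[bool] = None, **kwargs: Any) -> None:
        if init_weights is None:
            # The real base class warns when init_weights is left unset, then
            # falls back to True; this warning is what the unit tests trip over.
            warnings.warn(
                "The default weight initialization of GoogleNet will be changed "
                "in future releases; pass init_weights explicitly.",
                FutureWarning,
            )
            init_weights = True
        self.init_weights = init_weights


class QuantizableGoogLeNet(GoogLeNet):
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        # The PR's approach: pre-seed the kwarg so the base class never sees
        # None and therefore never emits the warning.
        if "init_weights" not in kwargs:
            kwargs["init_weights"] = True
        super().__init__(*args, **kwargs)
```

With this stand-in, `GoogLeNet()` warns while `QuantizableGoogLeNet()` constructs silently, which is why the warning disappears from the unit tests; the cost is that the subclass hard-codes a default the base class deliberately left open.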
Review comment:
This is not good. Check #2170 for why they switched from True to None.
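A hedged sketch of the test-side direction the reviewers seem to prefer, reusing the stand-in classes from the sketch above and assuming the model-side change is dropped (otherwise there is no warning left to handle). The test name is hypothetical, not the repository's actual test code:

```python
import warnings


def test_quantizable_googlenet_without_model_changes():
    # Option A: pass init_weights explicitly from the test, leaving the
    # model's None default (and its deliberate warning) untouched.
    model = QuantizableGoogLeNet(init_weights=True)
    assert model.init_weights is True

    # Option B: suppress the known warning only for the duration of the call.
    with warnings.catch_warnings():
        warnings.simplefilter("ignore", FutureWarning)
        model = QuantizableGoogLeNet()
    assert model.init_weights is True
```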
Reply:
Hmm, I don't have a good idea. Is there a better method than what I did here?
https://github.com/pytorch/vision/pull/6159/commits/16ddd6c0f23fe56e8029f336c5e3f3c69e9c534f
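One possible answer to "is there a better method" — again a sketch, not what commit 16ddd6c actually does: pytest can scope the suppression to the affected tests with its `filterwarnings` marker, so neither the model code nor the warning's semantics change. The message prefix below matches the hypothetical warning text from the earlier sketch:

```python
import pytest


@pytest.mark.filterwarnings("ignore:The default weight initialization of GoogleNet")
def test_quantizable_googlenet_smoke():
    # The marker silences only warnings whose message matches the given
    # prefix; any other warning still surfaces as configured.
    model = QuantizableGoogLeNet()
    assert model.init_weights is True
```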