How do I skip an argument of a callable? #669

Open
carmocca opened this issue Feb 4, 2025 · 3 comments

carmocca (Contributor) commented Feb 4, 2025

In this example

from jsonargparse import ArgumentParser
import torch
from typing import Callable
from dataclasses import dataclass

@dataclass
class Foo:
    opt: Callable[[torch.nn.Parameter], torch.optim.Optimizer]

parser = ArgumentParser()
parser.add_class_arguments(Foo)
args = parser.parse_args()
print(args)
# python script.py --opt torch.optim.Adam

How would I skip the lr?

I tried both:

parser.add_class_arguments(Foo, skip={"opt.lr"})
parser.add_class_arguments(Foo, skip={"opt.init_args.lr"})

but they do nothing.

I noticed that:

    opt: Callable[[torch.nn.Parameter, float], torch.optim.Optimizer]

will work, assuming that lr is the second positional argument of all optimizers, which might not always be true.
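
For illustration, a sketch of what that two-input form implies at the call site (the instantiation and model are assumptions, not part of the repro): lr stops being a config option and becomes a call-time argument, bound purely by position.

# Continuing the repro above, but with
#   opt: Callable[[torch.nn.Parameter, float], torch.optim.Optimizer]
init = parser.instantiate_classes(args)
model = torch.nn.Linear(4, 4)                   # assumed model, not in the repro
optimizer = init.opt(model.parameters(), 0.01)  # 0.01 is treated as lr only by position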

Thanks for the help!

mauvilsa (Member) commented Feb 5, 2025

You don't need to skip lr. The type would be Callable[[torch.nn.Parameter], torch.optim.Optimizer]. If a class path is given to an optimizer that requires lr, then lr must also be provided. The fact that lr is positional is not important. At least, this is how it is supposed to work. Did something not work correctly for you?
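
For illustration, with the repro above (the value syntax is an assumption, based on jsonargparse's class_path/init_args notation for class-returning callables):

# Class path alone, relying on Adam's default lr:
# python script.py --opt torch.optim.Adam
# Providing lr explicitly as an init arg of the chosen class:
# python script.py --opt '{"class_path": "torch.optim.Adam", "init_args": {"lr": 0.01}}'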

> I tried both:
>
> parser.add_class_arguments(Foo, skip={"opt.lr"})
> parser.add_class_arguments(Foo, skip={"opt.init_args.lr"})
>
> but they do nothing.

I think skips with dot notation are not supported. But still, there is no need to skip for the case of optimizers.

carmocca (Contributor, Author) commented Feb 5, 2025

Sorry, what I meant is that I want to skip the lr. I don't want to expose it to the command line or config. My code will set the value automatically (not represented in the repro snippet above).
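
For context, a sketch of the requested behaviour, continuing the repro above; the skip shown is exactly what this issue asks for and is not currently supported, and the call site is an assumption:

# Hypothetical: exclude lr from the CLI/config (the requested enhancement, not an existing feature)
parser = ArgumentParser()
parser.add_class_arguments(Foo, skip={"opt.init_args.lr"})
cfg = parser.parse_args(["--opt", "torch.optim.Adam"])
init = parser.instantiate_classes(cfg)

# The code then sets the value itself when building the optimizer
model = torch.nn.Linear(4, 4)                     # assumed model, not in the repro
optimizer = init.opt(model.parameters(), lr=0.1)  # lr chosen by the code, not by the user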

mauvilsa (Member) commented Feb 5, 2025

Okay, then this is not currently supported.

mauvilsa added the enhancement (New feature or request) label on Feb 5, 2025