🏷️ added additional overloads and improved existing stubs of block-related sparse construction functions #775
Conversation
Looks like I've got some errors.
Most errors are related to the same issue. There is still an error left that I have to investigate.
Okay, the last error is basically the same issue.
In my experience that's indeed often the case. But as we've recently seen, pyright also has its issues. I have high hopes for
That's probably a classic case of "join vs union", which the (based)pyright docs explain quite nicely: https://docs.basedpyright.com/v1.18.2/usage/mypy-comparison/#unions-vs-joins I believe that mypy intends to address this class of issues though: python/mypy#12056.
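For readers unfamiliar with "join vs union", here's a minimal illustration of the difference (the class names are made up for the example; the pre-#12056 mypy behavior is as described in the linked docs):

```python
class Base: ...
class A(Base): ...
class B(Base): ...

def pick(flag: bool) -> "A | B":
    # For the conditional expression below, mypy (before python/mypy#12056)
    # infers `Base` -- the *join*, i.e. the nearest common ancestor -- while
    # pyright infers `A | B` -- the *union* -- which preserves more information.
    return A() if flag else B()
```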
You could also leave it in with an ignore comment.
Ok, I like that approach. I'll restore the problematic tests.
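A quick sketch of what that approach looks like in practice, assuming the suppression is a `type: ignore` comment (the function and error code here are hypothetical):

```python
def to_float(x: int) -> float:
    return float(x)

# Hypothetical false positive: keep the test, suppress only this one line,
# and document why, so the ignore can be dropped once the checker is fixed.
y = to_float(42)  # type: ignore[misc]  # checker false positive; remove when fixed upstream
```

Scoping the ignore to a specific error code keeps the rest of the line checked, and strict modes will flag the comment as unused once the upstream fix lands.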
Does the `block_diag` return type (matrix vs array) still depend on the input, or did that maybe change in the meantime?
The documentation states the following:
So, I thought it might be most accurate to create an overload for all possible combinations of matrices, numpy arrays, and sequences, while arrays are treated in a separate overload. I haven't found a way to get the right behavior for "a list with at least one element of a certain type", though.
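A much-simplified sketch of that overload pattern (not the actual scipy-stubs signatures; the classes are stand-ins for scipy's `spmatrix`/`sparray` hierarchies, and the "array out when any input is an array" rule is my paraphrase of the behavior being discussed):

```python
from typing import overload

# Hypothetical stand-in classes for the sparse input and output types.
class SpMatrix: ...
class SpArray: ...
class CooMatrix: ...
class CooArray: ...

@overload
def block_diag(*mats: SpMatrix) -> CooMatrix: ...
@overload
def block_diag(*mats: "SpMatrix | SpArray") -> CooArray: ...
def block_diag(*mats):
    # Runtime stand-in: produce an "array" result whenever any input is one.
    if any(isinstance(m, SpArray) for m in mats):
        return CooArray()
    return CooMatrix()
```

Note that with `*args` the "at least one array element" condition can't be expressed precisely: the second overload also matches all-matrix calls, so overload ordering has to do the heavy lifting.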
Yea, I don't think that's something we can express in the Python type system. I suppose you could brute-force-engineer something like that for tuples, but that wouldn't be very pretty.
I did try by creating a Protocol that contains the required members.
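For the tuple case, the "brute force" idea is expressible because each tuple shape can get its own overload — a toy sketch with hypothetical stand-in types, limited to length-2 tuples:

```python
from typing import overload

# Hypothetical stand-ins: M = matrix-like input, A = array-like input,
# MatOut / ArrOut = the corresponding result types.
class M: ...
class A: ...
class MatOut: ...
class ArrOut: ...

# One overload per combination that can occur at length 2.  The number of
# combinations grows as 2**n per tuple length n -- hence "not very pretty".
@overload
def combine(mats: "tuple[M, M]") -> MatOut: ...
@overload
def combine(mats: "tuple[M, A] | tuple[A, M] | tuple[A, A]") -> ArrOut: ...
def combine(mats):
    return ArrOut() if any(isinstance(m, A) for m in mats) else MatOut()
```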
I think I did something wrong with git... Hopefully it didn't screw anything up.
Looks good, but something seems off with the commits.
Hmm yea, those commits don't look right. Did you by any chance rebase and then pull, instead of force-pushing?
I think you nailed it, but now I'm in too deep apparently.
Haha no worries, I'll squash merge them |
Hmm, I just compared the timing results before and after this change. I don't think that the test slowdown is any reason for concern, especially if you consider the tests that were added. But the increase in analysis time deserves a closer look.

For context, these are the top 10 slowest analysis times per file:
So even before this change, `_construct.pyi` was already on the slow side. I'm also worried that this will affect scipy-stubs users.

I'll see if I can figure out an easy way to improve the analysis time. Sometimes it's as easy as tweaking a single type alias, or swapping some overloads, so let's hope that's also the case here. In this case, I suspect that it has to do with the overload-overlap detection algorithm, which has an exponential time complexity in terms of the overload count and the number of protocol members of the input types. So it might help to replace a couple of the Protocol input types with concrete types.

Anyway, it's just something to keep in mind. It would be pretty cool to have a script that warns about performance regressions in CI. I guess I'll open an issue for it.
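Such a CI warning script could be as simple as timing the checker run and comparing it to a stored baseline — a rough sketch, where the command, baseline, and threshold are all placeholders:

```python
import subprocess
import sys
import time

def time_cmd(cmd: list[str]) -> float:
    """Wall-clock duration of a single subprocess run, in seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=False, capture_output=True)
    return time.perf_counter() - start

def is_regression(elapsed: float, baseline: float, tolerance: float = 0.25) -> bool:
    """True if `elapsed` exceeds `baseline` by more than `tolerance` (default 25%)."""
    return elapsed > baseline * (1.0 + tolerance)

if __name__ == "__main__":
    # Example: time a trivial interpreter start-up as a stand-in for a real
    # checker invocation such as `mypy scipy-stubs` (hypothetical target).
    print(f"{time_cmd([sys.executable, '-c', 'pass']):.3f}s")
```

In CI this would run the real checker, read the baseline from a committed file, and fail (or warn) when `is_regression` returns true.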
Ok, so there are two easy ways to bring down the analysis time a bit:
Since these changes aren't directly related to this PR, I first thought I'd create a separate PR for them. But fix 2 in particular would've caused nasty merge conflicts. And even though it's not very clean, it's fine by me to put these performance tweaks in this PR. Those two should be enough to counteract the +300 ms :)
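To illustrate the "replace Protocols with concrete types" idea mentioned above — a purely hypothetical before/after, not the actual scipy-stubs types:

```python
from typing import Protocol, runtime_checkable

# Before (hypothetical): a structural parameter type.  Overload-overlap
# checks must compare every protocol member across overload pairs, which
# is where the exponential blow-up comes from.
@runtime_checkable
class SupportsShape(Protocol):
    @property
    def shape(self) -> tuple[int, ...]: ...

# After (hypothetical): a concrete nominal type is far cheaper for the
# checker to compare, at the cost of accepting fewer input types.
class DenseLike:
    @property
    def shape(self) -> tuple[int, ...]:
        return (2, 2)
```

The trade-off is flexibility for speed: any object with a `shape` satisfies the Protocol, while the concrete type only matches itself and its subclasses.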
Thanks for the detailed analysis!
Indeed; I didn't notice this slowdown, so this warning would be quite nice!
Sounds good! I'll start implementing these suggestions soon :)
Ok great; thanks Jul!
I added all formats for the final functions in `_construct.pyi`. Let me know if you spot any errors!