Fix: move additional metrics from approximator to networks #500
base: dev
Conversation
Supplying the additional metrics for inference and summary networks via the approximator's `compile` method caused problems during deserialization (#497). This can be resolved nicely by moving the metrics directly to the networks' constructors, analogous to how Keras normally handles custom metrics in layers. As summary networks and inference networks inherit from the respective base classes, this change only requires minor adaptations.
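The pattern described above can be sketched in plain Python (the class and method names below are illustrative stand-ins, not the actual BayesFlow API): metrics become a constructor argument of the network and part of its config, instead of being supplied via the approximator's `compile` call.

```python
# Illustrative sketch (not the real BayesFlow classes): metrics are
# passed to the network constructor and included in the layer config,
# so they survive a save/load round trip without compile() arguments.

class InferenceNetwork:
    """Stand-in for the real base class; stores custom metrics."""

    def __init__(self, metrics=None, **kwargs):
        self.custom_metrics = list(metrics or [])

    def get_config(self):
        # Metrics are serialized as part of the network's own config,
        # analogous to how Keras layers handle custom sub-objects.
        return {"metrics": [type(m).__name__ for m in self.custom_metrics]}


class MyMetric:
    """Hypothetical custom metric placeholder."""


net = InferenceNetwork(metrics=[MyMetric()])
print(net.get_config())  # {'metrics': ['MyMetric']}
```

Because the metric lives in the network's config rather than in the approximator's compile state, deserializing the network restores it without any extra bookkeeping.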
Force-pushed from 41e2967 to 8d296e9
Force-pushed from 023db06 to 4c7a5a2
This change makes serialization more capable for our purposes by allowing any serializable value, not only the base types supported by the auto-config. We still have to check whether this introduces any footguns or downsides, or whether it is fine for our setting. It also replaces Keras' functions with our custom serialization functions.
Force-pushed from 4c7a5a2 to 51dff0d
@LarsKue @stefanradev93 @han-ol I encountered the problem that, to pass metrics to the networks in their constructors, we would have to specify all …
I would let @LarsKue chip in on this, as I am a bit concerned about having more and more "custom" variants of basic Keras containers (e.g., Sequential, Layer, ...).
Thanks for the comment, I can understand this; relying too heavily on hacking or modifying Keras internals has the downside that our code might become more fragile with respect to changes in Keras. Still, I think the ability to pass non-basic types to our inference and summary network base classes would be nice to have; the behavior in this PR is one example of this.
I would be on board if it was a lightweight wrapper, but it seems that there is a lot of copied code from Keras, which we should avoid in general, imo. @vpratz Can we solve this through monkey-patching somehow, like with the …
If this is not breaking any downstream task and only adds to the existing functionality, we may consider opening a PR to Keras, or asking them to implement it (or why they decided not to)?
@LarsKue Thanks for taking a look. I'll take a look at alternative ways to achieve this behavior.
@LarsKue Thanks a lot, this was a really good pointer. I have now implemented a wrapper around the constructor that is applied inside … There is still some copied code, but I think it is now limited to the part that is required for the feature to behave as similarly to the Keras implementation as possible, apart from the desired changes (we might not need the part regarding the …). With those changes, we could remove the … @stefanradev93 @LarsKue Please take another look and let me know what you think (the most relevant part is in …).
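The "wrapper around the constructor" idea can be sketched with Python's `__init_subclass__` hook (a minimal illustration under my own assumptions, not the code from this PR): the base class wraps each subclass's `__init__` so that an extra `metrics` keyword is captured automatically, without every subclass having to forward it by hand.

```python
# Sketch of constructor wrapping via __init_subclass__ (hypothetical,
# not the actual PR implementation): the base class intercepts each
# subclass's __init__ and peels off a `metrics` keyword argument.

import functools


class SummaryNetwork:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        original_init = cls.__init__

        @functools.wraps(original_init)
        def wrapped_init(self, *args, metrics=None, **kw):
            # Capture the metrics before delegating to the subclass's
            # own constructor, which never has to mention them.
            self.custom_metrics = list(metrics or [])
            original_init(self, *args, **kw)

        cls.__init__ = wrapped_init


class DeepSet(SummaryNetwork):
    """Hypothetical concrete network; knows nothing about metrics."""

    def __init__(self, depth=2):
        self.depth = depth


net = DeepSet(depth=3, metrics=["mae"])
print(net.depth, net.custom_metrics)  # 3 ['mae']
```

The appeal of this pattern is that it avoids copying Keras container code: only the constructor call is intercepted, and everything else is inherited unchanged.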
Thanks, this looks much better. Since this is a sensitive change, I think we should extensively test it before rolling it out. Otherwise, green light from my side! |
This is what most classifier metrics expect, and contains more detail for the metrics to work with.
I forgot to make the same changes in the …
Calls to `layer_kwargs` are now only used in classes that directly inherit from `keras.Layer`, and have been moved into `InferenceNetwork` and `SummaryNetwork`.
Fixes #497.