Hi, thank you so much for your work! I'm wondering about the generalization ability of such a PINN. I tried it with a simple sine function: when training and testing within the range [0, 2π], both the training loss and the validation loss look good. However, when I feed the network a new set of x values in the range [2π, 4π], the predictions look bad. Is this because the network never sees such numbers during training? It feels like it is memorizing the distribution it was trained on rather than generalizing to unseen inputs.
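For reference, here is a minimal sketch (plain PyTorch, not this repo's code) of the experiment I'm describing: fit sin(x) with a small MLP on [0, 2π], then evaluate on [2π, 4π]. The in-range error is small while the out-of-range error blows up.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small MLP regressor for y = sin(x).
model = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Training data only covers [0, 2*pi].
x_train = torch.linspace(0, 2 * torch.pi, 200).unsqueeze(1)
y_train = torch.sin(x_train)

for step in range(5000):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x_train), y_train)
    loss.backward()
    optimizer.step()

# Interpolation: inputs inside the training range.
x_in = torch.linspace(0, 2 * torch.pi, 100).unsqueeze(1)
# Extrapolation: inputs outside the training range.
x_out = torch.linspace(2 * torch.pi, 4 * torch.pi, 100).unsqueeze(1)

with torch.no_grad():
    err_in = nn.functional.mse_loss(model(x_in), torch.sin(x_in))
    err_out = nn.functional.mse_loss(model(x_out), torch.sin(x_out))

print(f"MSE inside training range:  {err_in.item():.2e}")
print(f"MSE outside training range: {err_out.item():.2e}")
```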