Could you comment on why the loss in example.py (1d) does not converge? Yet the final GP is a fairly good approximation.
Thank you!
Epoch 9988: 0 %. Loss: -20.101368667360404
Epoch 9989: 0 %. Loss: -38.94774396791739
Epoch 9990: 0 %. Loss: -39.56470825273037
Epoch 9991: 0 %. Loss: -12.318895364429565
Epoch 9992: 0 %. Loss: -19.47904221404525
Epoch 9993: 0 %. Loss: -36.77490064730489
Epoch 9994: 0 %. Loss: 43.20385940066373
Epoch 9995: 0 %. Loss: 20.232737011431396
Epoch 9996: 0 %. Loss: -34.98050499495804
Epoch 9997: 0 %. Loss: -32.932921142632466
Epoch 9998: 0 %. Loss: 739.1653327084057
Epoch 9999: 0 %. Loss: -39.727655479961854
Epoch 10000: 0 %. Loss: -35.62156909134845
Because it is using the Adam optimizer. Using a lower learning rate, or a different optimizer, should do the trick (but might cause overfitting).