[tests] tests for the parser #1
Conversation
- tests for utilsforecast
- …locally but not in actions)
I've temporarily skipped the tests for datasetsforecast for now; I suspect a Python version/dependency issue. I can reproduce the issue in a codespace, but not locally. Besides that, the overall PR is in good shape to get some feedback and comments! :)
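For context, temporarily skipping a test module in pytest can be done with a module-level marker along these lines (a minimal sketch for illustration only; the actual skip in this PR may look different):

```python
import pytest

# Skip the whole module until the suspected python version/dependency issue is resolved.
pytestmark = pytest.mark.skip(
    reason="datasetsforecast tests: suspected python version/dependency issue"
)
```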
JQGoh left a comment
The given tests look good to me, but I suggest we try to resolve the dependency issue and add validations for the datasets cases you mention.
This PR adds tests for the new parser that will be used when we create the new documentation. The idea is that the parser reads the .md-file based documentation and, when it sees a code block like this:

```
# some title
other imp info
....
::: coreforecast.differences.num_diffs
```

it renders the documentation for the fn/class in-place in the md file.
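To make the parsing idea concrete, here is a minimal sketch of how such a `:::` directive could be detected and replaced; the names `render_directives` and `render_obj` are hypothetical and not necessarily how the parser in this PR is implemented:

```python
import re

# Matches directive lines such as "::: coreforecast.differences.num_diffs"
DIRECTIVE_RE = re.compile(r"^::: (?P<obj>[\w\.]+)\s*$")

def render_directives(md_text: str, render_obj) -> str:
    """Replace every `::: module.path.obj` line with rendered documentation.

    `render_obj` is a callable that takes the dotted object path and returns
    the markdown to insert in its place (e.g. the object's signature and docstring).
    """
    out_lines = []
    for line in md_text.splitlines():
        match = DIRECTIVE_RE.match(line.strip())
        if match:
            out_lines.append(render_obj(match.group("obj")))
        else:
            out_lines.append(line)
    return "\n".join(out_lines)
```

A test for the parser could then feed it a small markdown snippet containing a `:::` directive and assert that the directive line is replaced with the renderer's output while the surrounding text is left untouched.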