Test coverage #5539
Comments
If somebody knows more exact values, they can use https://github.com/henning-gerhardt/kitodo-production/tree/codecoverage_jacoco after reading https://github.com/henning-gerhardt/kitodo-production/blob/codecoverage_jacoco/README.coverage . It is an experimental branch of mine to get some numbers like this:
Thanks a lot for this proposal. It would be really helpful if more tests were available.
May I ask: How do we negotiate which tests are necessary and useful, and which tests are not worth the effort? There is a huge range of how detailed tests can be implemented.
At the moment, I have no idea how to estimate the amount of time required for this issue, because of the huge range of how detailed tests can be implemented, and which tests could be considered relevant or not. Plus, I imagine every developer has a different opinion about what is important and what is not. I also have no idea how this could be specified more clearly without doing half of the work upfront, i.e. other than writing a list of all possible tests and discussing upfront whether each test is useful or not. Seems unlikely. Does anyone have any thoughts? Thank you and cheers!
The main goal is to improve the test coverage of Kitodo.Production. While I agree with @thomaslow that simple unit tests may be of limited value, trying to cover everything with more sophisticated tests will not be feasible given the size of the code base. Instead, we will have to define certain parts of the application where additional tests are absolutely required.

One of those parts has already been specified in the development fund description: the metadata editor. Here, Selenium tests are to be written to cover most if not all functions available in the GUI. The main reason for this is that the editor is a component where large parts of the functionality reside in client-side code (JavaScript etc.), and the tests should simulate the user experience as closely as possible. We do already have a class for Selenium tests of the editor.

Another part of the software where the tests should be more in-depth is the creation of new processes. Importing metadata from simulated SRU or OAI interfaces has to be covered by integration tests, including tests with data records that are not compatible with import mappings or ruleset files, to verify that the application handles those cases gracefully and does not crash. Similar to the metadata editor, all functions in the workflow editor and the calendar tool that are not yet covered by tests should be tested by Selenium or integration tests. Suggestions for additional components of the application where more advanced tests should be considered instead of simple unit tests are welcome.

The goal of the development fund is not to cover all branches or all lines of code, but every method should be tested by at least one test. Exempt from this are getters and setters that do not perform additional business logic, as well as non-public methods. Additional exceptions can be necessary, but have to be discussed case by case.
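To illustrate the "handle incompatible records gracefully, don't crash" requirement for the import, here is a minimal sketch (hypothetical class and method names, not actual Kitodo code): an import routine that reports a record incompatible with the mapping as an empty result instead of letting an exception propagate.

```java
import java.util.Optional;

// Hypothetical sketch, not Kitodo code: an importer that maps a raw
// record to a metadata object and returns Optional.empty() for records
// that are incompatible with the mapping, instead of crashing.
class RecordImporter {

    // Minimal stand-in for an imported metadata record.
    record Metadata(String title) {}

    public Optional<Metadata> importRecord(String rawRecord) {
        try {
            if (rawRecord == null || !rawRecord.startsWith("title=")) {
                // Incompatible with the (hypothetical) mapping: report, don't crash.
                return Optional.empty();
            }
            return Optional.of(new Metadata(rawRecord.substring("title=".length())));
        } catch (RuntimeException e) {
            // Any unexpected mapping error is also handled gracefully.
            return Optional.empty();
        }
    }

    public static void main(String[] args) {
        RecordImporter importer = new RecordImporter();
        System.out.println(importer.importRecord("title=Faust").isPresent());     // prints "true"
        System.out.println(importer.importRecord("<broken record>").isPresent()); // prints "false"
    }
}
```

An integration test against a simulated SRU or OAI interface would then assert exactly this contract: a well-formed record yields a metadata object, a malformed one yields an empty result rather than an exception.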
Since this worked for the improvement of the Kitodo.Presentation test coverage, I hope it is possible in this case, too.
I have been trying to get a better picture of the current state of the test coverage over the last days, since the low test coverage reported did not seem right. Apparently the coverage mentioned in the issue description, and also by @henning-gerhardt in the first comment, is indeed not accurate and way too low.

While experimenting with the coverage tool integrated with my IDE (IntelliJ), I discovered that running the coverage tool with a normal build & run configuration will not detect all tests and leads to a reported code coverage of around 11%, similar to that provided by @henning-gerhardt in the screenshot above. However, when starting the tool via the context menu of the project window of IntelliJ, for each module individually, it takes all tests into account and yields a much higher test coverage. Apparently, the actual numbers for the current master branch are around 71% of classes, 45.5% of methods and 42% of lines of code covered by tests.

These numbers incorporate private methods as well as getters and setters, though, so the share of covered public methods will be even higher. Additionally, code covered by Selenium tests is not included in these statistics, so the actual numbers for code coverage will be higher still. (I uploaded the HTML coverage report to a repository so you can take a more in-depth look at the state of individual packages and classes here: https://solth.github.io/kitodo-production-coverage/)

Even then, though, test coverage is still far from complete and many parts of the code are not yet tested at all. But I think the current situation might be a little less dire than initially anticipated, and it paints a more feasible picture for the current undertaking of improving the test coverage of Kitodo.Production, IMHO.
@solth
Basically, as I understand it, you reinforced the description of the official Kitodo announcement. Meaning, there will be no exhaustive list of expected tests or otherwise exhaustive and final description of what is considered necessary. The scope of this issue will be determined on the fly.
@thomaslow yes, I think you are correct. The official announcement was supposed to make it clear that many exceptions will have to be made, since it won't be feasible to cover all classes, methods, branches and lines with dedicated tests. Unfortunately, since it wasn't obvious which parts were already covered directly or indirectly, it was difficult to specify an exhaustive list of tests to write. It will probably come down to a prioritized list, where the parts specifically mentioned (like the metadata editor, import, calendar etc.) have the highest priority and should have the most complete test coverage, while other parts should be covered as thoroughly as possible given the available resources.
What about adding all open issues with the label development fund 2023 to the development fund 2024 as well? Regarding this issue, I saw that there is already ongoing work. What is the current status of the test coverage?
This is already being worked on in the context of last year's development fund, see https://github.com/orgs/kitodo/projects/8
Closing this issue, as the corresponding project of the development fund 2023 is completed. For future improvements of the test coverage, new issues should be created.
Description
Kitodo.Production currently has rather low test coverage (at the moment, less than 20% of the code is covered by unit tests, for example). This should be improved to harden the software against new bugs and make future development easier.
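As a sketch of the unit-level coverage meant here (hypothetical class and names, not actual Kitodo code): a method carrying business logic, such as a validating setter, warrants at least one test, while a trivial getter does not.

```java
// Hypothetical example, not actual Kitodo code: the setter contains
// validation logic and is the kind of method that deserves a unit test;
// the trivial getter below it would not need a dedicated one.
class WorkflowProcess {
    private String title;

    public void setTitle(String title) {
        // Business logic worth testing: reject blank titles.
        if (title == null || title.isBlank()) {
            throw new IllegalArgumentException("title must not be blank");
        }
        this.title = title;
    }

    // Trivial getter without business logic: no dedicated test needed.
    public String getTitle() {
        return title;
    }

    public static void main(String[] args) {
        WorkflowProcess process = new WorkflowProcess();
        process.setTitle("Monograph");
        System.out.println(process.getTitle()); // prints "Monograph"
        try {
            process.setTitle(" ");
        } catch (IllegalArgumentException expected) {
            System.out.println("blank title rejected");
        }
    }
}
```

A unit test for such a method would cover both the accepting and the rejecting path, which is exactly the kind of cheap check that raises line coverage where it matters.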
Before resolving this issue we should decide
Related Issues
#3330
#3939
Expected Benefits of this Development
Improved test coverage will make future development easier (and therefore probably cheaper!) and reduce the risk of introducing new bugs. Additionally, reviewing pull requests will benefit from better test coverage as well, because the significance of successful GitHub CI builds increases with the number of tests.
Estimated Costs and Complexity
The exact costs for resolving this issue depend on the number of tests to be implemented. Covering all functions in Kitodo.Production is probably not feasible within the scope of one issue, but the complexity will most likely be high in any case.