Add coverage options for hatch-test scripts #1477

Open
wants to merge 1 commit into master

Conversation

jborbely

It has been a very educational experience following the Hatch documentation to update my setup.py-style projects to the latest best practices. Thank you @ofek for all the time you spend developing helpful tools.

This PR concerns the `hatch test` command that was added in version 1.10.0.

While converting a project that used the `coverage html` command, I needed to redefine all scripts in the `[envs.hatch-test.scripts]` table even though I used the default `run`, `run-cov`, and `cov-combine` scripts, e.g.,

```toml
[envs.hatch-test.scripts]
run = "pytest{env:HATCH_TEST_ARGS:} {args}"                      # default
run-cov = "coverage run -m pytest{env:HATCH_TEST_ARGS:} {args}"  # default
cov-combine = "coverage combine"                                 # default
cov-report = "coverage html"                                     # replaced 'report' with 'html'
```

This PR proposes that the following options may be defined in the `[envs.hatch-test]` table:

* `combine-args`: array of strings, for the `cov-combine` script
* `reporting`: string, for the `cov-report` script
* `reporting-args`: array of strings, for the `cov-report` script

so that, for my particular use case, I could now define the `[envs.hatch-test]` table as

```toml
[envs.hatch-test]
reporting = "html"
```

Another example: suppose someone wants to silence all messages from `coverage combine` and show the lines that were missed by the tests when reporting

```toml
[envs.hatch-test]
combine-args = ["--quiet"]
reporting-args = ["--show-missing"]
```
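
Again as a sketch of the intended expansion (not of the implementation), the extra arguments would simply be appended to the default `cov-combine` and `cov-report` commands:

```toml
# Sketch of the expansion: the extra arguments are appended to the defaults.
[envs.hatch-test.scripts]
cov-combine = "coverage combine --quiet"        # combine-args = ["--quiet"]
cov-report = "coverage report --show-missing"   # reporting-args = ["--show-missing"]
```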

Thank you for any comments or suggestions you may have. No worries if this PR does not align with the way you want `hatch-test.scripts` to be used; feel free to close it.

The following new options may be defined in the `[tool.hatch.envs.hatch-test]` table:
* `combine-args`: array of strings, for the `cov-combine` script
* `reporting`: string, for the `cov-report` script
* `reporting-args`: array of strings, for the `cov-report` script
ofek (Collaborator) commented May 10, 2024

I will check this out in the coming weeks, thanks!!!

mihaimaruseac added a commit to mihaimaruseac/model-transparency that referenced this pull request Aug 15, 2024
By default, the `hatch test -c` coverage report lists only line counts, missing-line counts, and percentages, but there is no way to see which lines are missing. We don't have an option to generate an HTML report at the moment (pypa/hatch#1477). Added some options to also display branch coverage and missing lines/branches.

Also, by default, all files are included in the report, including tests (covering the test-only code). I removed the tests, but if we decide we should add them, that's easy to do.

More importantly, the report lists files that are 100% covered (not useful in CI) and empty files (not useful at all). So I removed those from the output.

There is another, bigger issue: only files that are imported by a test get reported, so code that is not tested at all will not show up here. We already have such code in `signature/` and `signing/sigstore.py`. Fixing this will be left for later.
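
For context, this kind of report filtering can be expressed with standard coverage.py settings, e.g. in `pyproject.toml`; the following is only a sketch and may differ from the exact configuration in this commit:

```toml
# Sketch only: standard coverage.py options for this kind of report filtering;
# the exact configuration used in this commit may differ.
[tool.coverage.run]
branch = true         # measure branch coverage in addition to line coverage
omit = ["tests/*"]    # hypothetical pattern: keep test-only code out of the report

[tool.coverage.report]
show_missing = true   # list missing line/branch numbers in the report
skip_covered = true   # hide files that are 100% covered
skip_empty = true     # hide empty files
```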

Current output is:

```
Name                                                         Stmts   Miss Branch BrPart  Cover   Missing
--------------------------------------------------------------------------------------------------------
src/model_signing/hashing/file.py                              102      0     68      1    99%   293->286
src/model_signing/hashing/hashing.py                            34      5     24      0    91%   53, 66, 72, 81, 86
src/model_signing/manifest/manifest.py                          78      1     28      0    99%   102
src/model_signing/serialization/serialization.py                 8      1      2      0    90%   49
src/model_signing/serialization/serialize_by_file.py           109      1     43      0    99%   198
src/model_signing/serialization/serialize_by_file_shard.py      80      1     29      0    99%   212
src/model_signing/signing/empty_signing.py                      39      2     24      2    94%   51, 97
src/model_signing/signing/in_toto.py                           169     68     98      0    66%   65-78, 181-190, 342-367, 485-512, 660-671, 793-806
src/model_signing/signing/signing.py                            25      5     18      0    88%   65, 78, 99, 126, 155
--------------------------------------------------------------------------------------------------------
TOTAL                                                          773     84    390      3    91%
```

Fixing the missing coverage is left for later. We should aim for 95%+ or so coverage, I think.

We should probably make it so that GitHub reports this table back on PRs, so reviewers can quickly ask for more testing without needing to check the GHA report. Punted for later; for now I'll just remember to keep checking.

Signed-off-by: Mihai Maruseac <[email protected]>
mihaimaruseac added a commit to mihaimaruseac/model-transparency that referenced this pull request Aug 15, 2024
By default, the `hatch test -c` coverage report lists only line counts, missing-line counts, and percentages, but there is no way to see which lines are missing. We don't have an option to generate an HTML report at the moment (pypa/hatch#1477). Added some options to display missing lines.

Also, by default, all files are included in the report, including tests (covering the test-only code). I removed the tests, but if we decide we should add them, that's easy to do.

More importantly, the report lists files that are 100% covered (not useful in CI) and empty files (not useful at all). So I removed those from the output.

There is another, bigger issue: only files that are imported by a test get reported, so code that is not tested at all will not show up here. We already have such code in `signature/` and `signing/sigstore.py`. Fixing this will be left for later.

Current output is:

```
Name                                                         Stmts   Miss  Cover   Missing
------------------------------------------------------------------------------------------
src/model_signing/hashing/hashing.py                            34      5    85%   53, 66, 72, 81, 86
src/model_signing/manifest/manifest.py                          78      1    99%   102
src/model_signing/serialization/serialization.py                 8      1    88%   49
src/model_signing/serialization/serialize_by_file.py           109      1    99%   198
src/model_signing/serialization/serialize_by_file_shard.py      80      1    99%   212
src/model_signing/signing/empty_signing.py                      39      2    95%   51, 97
src/model_signing/signing/in_toto.py                           169     68    60%   65-78, 181-190, 342-367, 485-512, 660-671, 793-806
src/model_signing/signing/signing.py                            25      5    80%   65, 78, 99, 126, 155
------------------------------------------------------------------------------------------
TOTAL                                                          773     84    89%
```

Fixing the missing coverage is left for later. We should aim for 95%+ or so coverage, I think.

We should probably make it so that GitHub reports this table back on PRs, so reviewers can quickly ask for more testing without needing to check the GHA report. Punted for later; for now I'll just remember to keep checking.

Signed-off-by: Mihai Maruseac <[email protected]>
mihaimaruseac added a commit to mihaimaruseac/model-transparency that referenced this pull request Aug 15, 2024
By default, the `hatch test -c` coverage report lists only line counts, missing-line counts, and percentages, but there is no way to see which lines are missing. We don't have an option to generate an HTML report at the moment (pypa/hatch#1477). Added some options to display missing lines.

Also, by default, all files are included in the report, including tests (covering the test-only code). I removed the tests, but if we decide we should add them, that's easy to do.

More importantly, the report lists files that are 100% covered (not useful in CI) and empty files (not useful at all). So I removed those from the output.

There is another, bigger issue: only files that are imported by a test get reported, so code that is not tested at all will not show up here. We already have such code in `signature/` and `signing/sigstore.py`. Fixing this will be left for later.

Current output is:

```
Name                                         Stmts   Miss  Cover   Missing
--------------------------------------------------------------------------
src/model_signing/signing/empty_signing.py      39      2    95%   51, 97
src/model_signing/signing/in_toto.py           168     68    60%   65-78, 181-190, 342-367, 485-512, 660-671, 793-806
--------------------------------------------------------------------------
TOTAL                                          745     70    91%
```

Fixing the missing coverage is left for later. We should aim for 95%+ or so coverage, I think.

We should probably make it so that GitHub reports this table back on PRs, so reviewers can quickly ask for more testing without needing to check the GHA report. Punted for later; for now I'll just remember to keep checking.

Signed-off-by: Mihai Maruseac <[email protected]>
mihaimaruseac added a commit to mihaimaruseac/model-transparency that referenced this pull request Aug 15, 2024
By default, the `hatch test -c` coverage report lists only line counts, missing-line counts, and percentages, but there is no way to see which lines are missing. We don't have an option to generate an HTML report at the moment (pypa/hatch#1477). Added some options to display missing lines.

Also, by default, all files are included in the report, including tests (covering the test-only code). I removed the tests, but if we decide we should add them, that's easy to do.

More importantly, the report lists files that are 100% covered (not useful in CI) and empty files (not useful at all). So I removed those from the output.

There is another, bigger issue: only files that are imported by a test get reported, so code that is not tested at all will not show up here. We already have such code in `signature/` and `signing/sigstore.py`. Fixing this will be left for later.

Current output is:

```
Name                                         Stmts   Miss  Cover   Missing
--------------------------------------------------------------------------
src/model_signing/signing/in_toto.py           168     68    60%   65-78, 181-190, 342-367, 485-512, 660-671, 793-806
--------------------------------------------------------------------------
TOTAL                                          745     70    91%
```

Fixing the missing coverage is left for later. We should aim for 95%+ or so coverage, I think.

We should probably make it so that GitHub reports this table back on PRs, so reviewers can quickly ask for more testing without needing to check the GHA report. Punted for later; for now I'll just remember to keep checking.

Signed-off-by: Mihai Maruseac <[email protected]>
mihaimaruseac added a commit to sigstore/model-transparency that referenced this pull request Aug 19, 2024
* Configure coverage reporting

By default, the `hatch test -c` coverage report lists only line counts, missing-line counts, and percentages, but there is no way to see which lines are missing. We don't have an option to generate an HTML report at the moment (pypa/hatch#1477). Added some options to display missing lines.

Also, by default, all files are included in the report, including tests (covering the test-only code). I removed the tests, but if we decide we should add them, that's easy to do.

More importantly, the report lists files that are 100% covered (not useful in CI) and empty files (not useful at all). So I removed those from the output.

There is another, bigger issue: only files that are imported by a test get reported, so code that is not tested at all will not show up here. We already have such code in `signature/` and `signing/sigstore.py`. Fixing this will be left for later.

Current output is:

```
Name                                         Stmts   Miss  Cover   Missing
--------------------------------------------------------------------------
src/model_signing/signing/in_toto.py           168     68    60%   65-78, 181-190, 342-367, 485-512, 660-671, 793-806
--------------------------------------------------------------------------
TOTAL                                          745     70    91%
```

Fixing the missing coverage is left for later. We should aim for 95%+ or so coverage, I think.

We should probably make it so that GitHub reports this table back on PRs, so reviewers can quickly ask for more testing without needing to check the GHA report. Punted for later; for now I'll just remember to keep checking.

Signed-off-by: Mihai Maruseac <[email protected]>

* Add unit tests for signing with sigstore.

We need to do quite a lot of mocking around Sigstore, but we are able to test all the logic in our library. What is left to do for testing is e2e integration tests (#5) and testing with signing on one OS and verifying on another (#25). Both of these are integration-style tests and we will only be able to run them in GHA. I'll send a PR for those soon.

While testing, I discovered some minor bugs with error reporting and one moderate bug. Fixed in this PR.

We now have achieved 100% test coverage! 🎉

```
Name    Stmts   Miss  Cover   Missing
-------------------------------------
TOTAL     835      0   100%
```

Well, almost. There are 2 files that are not imported by tests at all, so they don't get included in the report:

```
src/model_signing/signature/fake.py
src/model_signing/signature/pki.py
```
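
One standard way to make such files show up (as 0% covered) instead of being omitted is coverage.py's `source` option; the following is only a sketch, assuming the package lives under `src/model_signing` as in the paths above, and is not something done in this PR:

```toml
# Sketch only, not part of this PR: with `source` set, coverage.py also reports
# files in the package that are never imported by any test (they appear as 0%),
# instead of silently omitting them.
[tool.coverage.run]
source = ["src/model_signing"]
```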

This depends on #287, which configures the coverage reporting.

Signed-off-by: Mihai Maruseac <[email protected]>

---------

Signed-off-by: Mihai Maruseac <[email protected]>
susperius pushed a commit to susperius/model-transparency that referenced this pull request Aug 21, 2024