Downgrade pytest back to 7.4.3 #763

Merged
merged 7 commits into from
May 16, 2024
Conversation

@amishatishpatel (Contributor) commented May 16, 2024

As pytest-dev/pytest#12328 mentions, the recent upgrade to pytest 8.x.x
resulted in the qualification tests taking forever and a day to complete (well, ~12 hours).
This PR rolls the version back to the last known working release and reverts the usage
of pytest 8 features.

There is also a note in qualification/requirements.in explaining why its pytest version is kept
below 8, at least until a new pytest release ships the fix for the issue raised by @bmerry.
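For context, a minimal sketch of what such a constraint in qualification/requirements.in could look like (the exact comment wording and pin syntax here are an assumption, not the actual file contents):

```
# Keep below 8 until a pytest release fixes pytest-dev/pytest#12328
# (qualification runs balloon from ~5 hours to ~12 under pytest 8.x.x).
pytest<8
```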

Lastly, there is a corresponding tweak to the Jenkins Qualification pipeline to start the qualification run
at 11pm SAST. With the 8 hour timeout, that should take it through to a reasonable hour before the team
checks in.

Checklist (if not applicable, edit to add (N/A) and mark as done):

  • If dependencies are added/removed: update setup.cfg and .pre-commit-config.yaml
  • (n/a) If modules are added/removed: use sphinx-apidoc -efo doc/ src/ to update files in doc/
  • Ensure copyright notices are present and up-to-date
  • (n/a) If qualification tests are changed: attach a sample qualification report
  • (n/a) If design has changed: ensure documentation is up to date
  • (n/a) If ICD-defined sensors have been added: update fake_servers.py in katsdpcontroller to match

Closes NGC-1318.

amishatishpatel and others added 6 commits May 6, 2024 08:33
Specifically to see if it resolves the qualification run timeout issues.

Also commented out a pytest 8.0.x feature usage in xbgpu's unit test to
ensure Jenkins doesn't complain about this as well.

Contributes to: NGC-1318.
To remove dependency on pytest 8.x.x as we need to constrain the
qualification test requirement to version<8.x.x.

Contributes to: NGC-1318.
Citing pytest issue #12328 raised by @bmerry regarding fixture reuse.

Contributes to: NGC-1318.
Reduce the qualification pipeline timeout from the temporary 23 hours to 8
hours, still longer than the original 5 hours.

Contributes to: NGC-1318.
@amishatishpatel amishatishpatel requested a review from bmerry May 16, 2024 08:06
Comment on lines 1048 to 1051
assert list(filter(accum_warning_filter, caplog.record_tuples))[: len(corrprod_outputs)] == [
(
"katgpucbf.xbgpu.engine",
WARNING,
Contributor

You're repeating the value you're checking for in both the filter function and in this test.

How about replacing this with something like:

assert caplog.record_tuples.count(
    (
        "katgpucbf.xbgpu.engine",
        WARNING,
        "All Antennas had a break in data during this accumulation",
    )
) >= len(corrprod_outputs)
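For a reader following along: list.count compares whole (logger, level, message) tuples, so no separate filter function is needed. Below is a standalone sketch of the same check; the message text is taken from the suggestion above, while the record list and corrprod_outputs values are fabricated stand-ins:

```python
import logging

WARNING = logging.WARNING

# Fabricated stand-in for pytest's caplog.record_tuples, which is a list of
# (logger_name, level, message) tuples.
record_tuples = [
    ("katgpucbf.xbgpu.engine", WARNING,
     "All Antennas had a break in data during this accumulation"),
    ("katgpucbf.xbgpu.engine", WARNING,
     "All Antennas had a break in data during this accumulation"),
    ("other.module", logging.INFO, "unrelated message"),
]

# Stand-in for the real corrprod_outputs list.
corrprod_outputs = ["baseline0", "baseline1"]

# list.count matches the full tuple in one step, so the expected value is
# written exactly once instead of in both a filter and the assertion.
assert record_tuples.count(
    (
        "katgpucbf.xbgpu.engine",
        WARNING,
        "All Antennas had a break in data during this accumulation",
    )
) >= len(corrprod_outputs)
```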

Contributor Author

Yeah, fair point. I realised that when I explicitly compared tuples in the filter. Updating now.

As per @bmerry's suggestion on PR #763.

Contributes to: NGC-1318.
@amishatishpatel amishatishpatel requested a review from bmerry May 16, 2024 09:10
@bmerry bmerry merged commit 1290b23 into main May 16, 2024
2 checks passed
@bmerry bmerry deleted the NGC-1318-pytest-downgrade branch May 16, 2024 12:39