
RFE: add --disesct-extensions option #8993

Closed
kloczek opened this issue Aug 9, 2021 · 13 comments
Labels: invalid; status: needs information (reporter needs to provide more information; can be closed after 2 or more weeks of inactivity)

Comments

@kloczek (Contributor) commented Aug 9, 2021

pytest supports the `-p no:foo` option to disable loading the `foo` pytest plugin.

The idea behind a --dissect-extensions option is to automatically run additional passes with dissected subsets of the available pytest plugins whenever an error or warning occurs, and to report at the end which errors or warnings appear only when a particular plugin is loaded.

I think something like this would dramatically improve error reporting, because it would make it easier to point at the exact plugin that produces a given warning or error, and to file reports against specific pytest plugins.
I think the code handling such an option should be a core part of pytest (not an optional plugin).
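To make the proposal concrete, here is a minimal sketch of what "dissection" could mean: given the plugins present in an environment, generate one pytest invocation per plugin with that single plugin disabled via `-p no:<name>`, so a warning can be attributed by diffing the runs. The function name and the plugin list are illustrative, not part of pytest.

```python
def dissection_runs(plugins, base_args=("pytest",)):
    """Return one command line per plugin, each disabling that single plugin."""
    runs = []
    for name in plugins:
        # `-p no:<name>` is pytest's existing switch for disabling a plugin
        runs.append(list(base_args) + ["-p", f"no:{name}"])
    return runs

# Leave-one-out over a hypothetical environment with three plugins installed:
for cmd in dissection_runs(["relaxed", "regressions", "cov"]):
    print(" ".join(cmd))
```

Comparing which runs still emit the warning against which plugin each run disables would point at the culprit without any knowledge of the plugins' internals.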

@RonnyPfannschmidt added the "status: needs information" label Aug 9, 2021
@RonnyPfannschmidt (Member)

The desired workflow/use case is not clear; please provide some examples that help us understand what you mean.

@kloczek (Contributor, Author) commented Aug 9, 2021

Use case: a build environment with a large number of installed pytest plugins, in which it is hard to identify which plugin is reporting a given error/warning.

@RonnyPfannschmidt (Member)

Closing, as the OP refuses to describe the use case in sufficient detail.

@nicoddemus (Member)

> I think the code handling such an option should be a core part of pytest (not an optional plugin).

I disagree. If I understand the request correctly, this should probably be implemented by an external tool rather than a pytest plugin, as pytest itself might break in the presence of a bad plugin.

@nicoddemus (Member)

I understand the context of the request because I read ESSS/pytest-regressions#64, where running pytest with many plugins installed gives this error:

Direct construction of SpecModule has been deprecated, please use SpecModule.from_parent.
See https://docs.pytest.org/en/stable/deprecations.html#node-construction-changed-to-node-from-parent for more details.

And it is not clear which plugin is the culprit; I had to look at the source code of pytest-relaxed to figure it out.

We might consider at least showing the fully qualified name of the offending class in that message, so users would see pytest_relaxed.plugin.SpecModule, which is a nudge in the right direction.
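For illustration, the suggested improvement boils down to including the class's module path in the deprecation message. A tiny sketch, where `SpecModule` is a stand-in class defined locally, not the real one from pytest-relaxed:

```python
class SpecModule:
    """Stand-in for a plugin-defined Node subclass such as
    pytest_relaxed.plugin.SpecModule."""


def qualified_name(cls):
    # Combining __module__ and __qualname__ reveals which package
    # defines the offending class.
    return f"{cls.__module__}.{cls.__qualname__}"


print(qualified_name(SpecModule))  # e.g. "__main__.SpecModule" when run as a script
```

A deprecation warning built with `qualified_name(type(node))` instead of the bare class name would immediately show which plugin's module defines the class.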

@kloczek (Contributor, Author) commented Aug 9, 2021

> I think the code handling such an option should be a core part of pytest (not an optional plugin).

> I disagree. If I understand the request correctly, this should probably be implemented by an external tool rather than a pytest plugin, as pytest itself might break in the presence of a bad plugin.

My intention was to provide this level of diagnostics as an OOTB/core part of pytest, but available only on demand, when pytest is started with --dissect-extensions.

@kloczek (Contributor, Author) commented Aug 9, 2021

> I understand the context of the request because I read ESSS/pytest-regressions#64, where running pytest with many plugins installed gives this error (..)

Exactly.
A quick additional round of pytest --dissect-extensions runs would make it possible to quickly determine whether a given error/warning is generated by core pytest or only when a particular plugin is loaded.

@kloczek (Contributor, Author) commented Aug 9, 2021

> We might consider at least showing the fully qualified name of the offending class in that message, so users would see pytest_relaxed.plugin.SpecModule, which is a nudge in the right direction.

Generally, what IMO is missing in pytest output is per-plugin diagnostics.
IMO the -r option could be enriched to show errors/warnings broken down per plugin.

Simply put: when I run pytest in an environment with many plugins, it is really hard to figure out whether a given error/warning is generated by core pytest or by some particular plugin.
I have already opened a few tickets about pytest errors without being aware that those errors were not generated by core pytest but by some pytest plugin. Some of those tickets have since been closed by module maintainers with a request to isolate the exact plugin responsible for emitting the error/warning.
In such cases a blind/brute-force dissection could dramatically lower the minimum knowledge/experience a pytest user needs, allowing far better and more precise issue reports even from an average Python user like me.

In my rpm build infrastructure I can send a set of requests to perform scratch/control/test builds that preserve only the build logs, with a modified macro and an additional set of BuildRequires not specified in the rpm spec file (all without touching any config files, just parameters on the command line).
With such a series of test builds, where the %pytest macro passes an additional --dissect-extensions parameter, I would very quickly be able to report which modules' test suites are affected, and how, by a given pytest plugin. Accumulating the outcome of such a mass build would make it easy to correlate results with possible causes.

I've been thinking about such mass test builds for quite a long time (well before I started packaging Python modules). When I found pytest to be the most sophisticated Python testing framework, I started using pytest everywhere it is possible (even when module maintainers recommend another method of testing).

I think the outcome of such mass tests could be quite interesting for the Python community and could automatically expose many issues.

@nicoddemus (Member)

> Generally, what IMO is missing in pytest output is per-plugin diagnostics.

I understand and sympathize with the sentiment, but this is not something that is simple, or perhaps even possible, to implement reliably in pytest; that's why I suggest an external tool.

@kloczek (Contributor, Author) commented Aug 10, 2021

> Generally, what IMO is missing in pytest output is per-plugin diagnostics.

> I understand and sympathize with the sentiment, but this is not something that is simple, or perhaps even possible, to implement reliably in pytest; that's why I suggest an external tool.

That is OK. This is still only an idea :)
I've been looking for a way to generate at least a list of the available pytest plugins, so this kind of case could be handled externally. However, it looks like the pytest command does not offer parseable output with a list of plugins, from which it would be possible to generate sets of command-line parameters for disabling them.
Can someone point me at something which can print such a list?

@nicoddemus (Member)

> Can someone point me at something which can print such a list?

pytest --version --version will print a list of plugins to stdout, which can be parsed.
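As a sketch of the external-tool direction, parsing that output could look like the following. The sample text is illustrative only: the exact format varies between pytest versions, and the name `-p no:<name>` expects is the plugin's entry-point name, which does not always match the distribution name shown here.

```python
# Illustrative sample of `pytest --version --version` output; real output
# depends on the pytest version and installed plugins.
SAMPLE = """\
pytest 6.2.4
setuptools registered plugins:
  pytest-cov-2.12.1 at /usr/lib/python3.9/site-packages/pytest_cov/plugin.py
  pytest-relaxed-2.0.0 at /usr/lib/python3.9/site-packages/pytest_relaxed/plugin.py
"""


def parse_plugins(text):
    """Extract the distribution names listed after the plugins header."""
    plugins = []
    seen_header = False
    for line in text.splitlines():
        if line.endswith("registered plugins:"):
            seen_header = True
            continue
        if seen_header and line.startswith("  "):
            # "pytest-cov-2.12.1 at /path/..." -> "pytest-cov-2.12.1"
            plugins.append(line.strip().split(" at ")[0])
    return plugins


print(parse_plugins(SAMPLE))
```

An external dissection tool could feed this list into a loop of pytest subprocess runs, disabling one plugin at a time.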

@kloczek (Contributor, Author) commented Aug 10, 2021

> pytest --version --version will print a list of plugins to stdout, which can be parsed.

"Nobody expects the Spanish Inquisition" .. in --version output :P
However, even that output is not easy to use, because it contains a list of the actual Python modules and not the pytest fixtures which pytest is using internally :/

@nicoddemus (Member)

> and not the pytest fixtures which pytest is using internally

But you don't want the fixtures, you want the plugins, IIUC.
