pytest warnings due to deprecation of pytest.warns(None) #22572


Closed
avm19 opened this issue Feb 22, 2022 · 35 comments · Fixed by #23068 or #22949
Labels
good first issue Easy with clear instructions to resolve Meta-issue General issue associated to an identified list of tasks module:test-suite everything related to our tests

Comments

@avm19
Contributor

avm19 commented Feb 22, 2022

Pytest issues warnings about uses of the `pytest.warns(None)` context manager: passing `None` is now deprecated.

For example, running

    $ pytest ./sklearn/tests/test_naive_bayes.py

results in 79 passed, 4 warnings. After commenting out the `--disable-pytest-warnings` key in the `[tool:pytest]` section of setup.cfg, I could see that some of the warnings are:

    PytestRemovedIn8Warning: Passing None has been deprecated.
    See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.

Apparently, `pytest.warns(None)` is deprecated because it was widely misunderstood. I think that while sklearn's tests do not suffer from this misunderstanding, they may be affected in the future if and when pytest raises an error instead of a deprecation warning. (I couldn't figure out whether this is planned for a future pytest release, but in any case, avoiding the deprecation and reducing the warnings count won't hurt.)

I found this message, which is relevant to how `pytest.warns(None)` is used in scikit-learn tests and how this could be changed. I opened this issue to share this information and quote the message, as it may save someone time in the future.

The tests were of the form:

with pytest.warns(None) as record:
    <do some stuff>
    <check record for warnings, and possibly raise an error>

The tests were not assuming that the pytest context code would raise an error if a warning occurred. Instead, these tests were doing their own error checking based on record. So they were really using pytest.warns(None) exactly like warnings.catch_warnings(record=True). (There is a pull request to make the change to catch_warnings.)
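To make the equivalence concrete, here is a minimal, self-contained sketch; the warning message and the `do_some_stuff` function are hypothetical stand-ins for real test code:

```python
import warnings

def do_some_stuff():
    # hypothetical stand-in for the code under test
    warnings.warn("something happened", UserWarning)

# warnings.catch_warnings(record=True) collects warnings into a list,
# which is how the tests were using pytest.warns(None) as record.
with warnings.catch_warnings(record=True) as record:
    warnings.simplefilter("always")  # record every warning, even duplicates
    do_some_stuff()

# the test then does its own checking based on `record`
assert len(record) == 1
assert issubclass(record[0].category, UserWarning)
assert "something happened" in str(record[0].message)
```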

@github-actions github-actions bot added the Needs Triage Issue requires triage label Feb 22, 2022
@thomasjpfan thomasjpfan added module:test-suite everything related to our tests and removed Needs Triage Issue requires triage labels Feb 22, 2022
@thomasjpfan
Member

thomasjpfan commented Feb 22, 2022

Thanks for opening the issue! Since `pytest.warns(None)` is deprecated, it makes sense to update our tests to use `warnings.catch_warnings(record=True)`.

To be specific:

    with pytest.warns(None) as record:
        ...

can be directly replaced with:

    with warnings.catch_warnings(record=True) as record:
        ...

@cmarmo
Contributor

cmarmo commented Mar 11, 2022

Here is a list of files containing pytest.warns(None).
For those willing to address this issue, please fix only one file per pull request.

I guess solving this issue will also move forward #22396.


@cmarmo cmarmo added the good first issue Easy with clear instructions to resolve label Mar 11, 2022
@ShanDeng123
Contributor

Hello - I'd like to help tackle this problem if that'd be alright? If someone else is already working on this please just let me know!

@avm19
Contributor Author

avm19 commented Mar 13, 2022

Hello @ShanDeng123
As far as I understand, all the items on cmarmo's list are independent, and she asked for one file per pull request.
I suppose you can write here to reserve a batch of them (e.g., sklearn/cluster/tests/*.*), then take the next batch, and so on.
I am not working on this issue.

@avm19
Contributor Author

avm19 commented Mar 13, 2022

I am curious whether the issue is more complex than find-and-replace, and whether `pytest.warns(None)` in the existing code was always used and understood correctly.

@thomasjpfan
Member

thomasjpfan commented Mar 13, 2022

For this specific case, I am okay with one PR that does a find-and-replace, replacing all occurrences of `with pytest.warns(None) as record:` with `with warnings.catch_warnings(record=True) as record:`.

I think it's better to do one PR per file. See #22572 (comment) for details.

@ShanDeng123
Contributor

@avm19
Sounds good - I can look to do the first 10 in the list up until the last file under the ensemble folder, and can continue working on the others afterwards if nobody else hops on.

@thomasjpfan
I can look to merge updates for multiple files in a single PR as I go! I do want to make sure that I'm following all of the proper guidelines/precautions though as I am still a bit new to open source work. Would hate to cause any unnecessary issues! 😅

That being said - would any of you guys know if this would necessitate an update to the changelog document? And if so, would the note be a bullet under the main Changelog header?

@thomasjpfan
Member

It is not necessary to update the changelog, since it is not a public facing change.

@thomasjpfan
Member

thomasjpfan commented Mar 14, 2022

Coming from #22822 (comment), I think it's better to do this one PR per file.

When we write the following to test for no warnings:

with pytest.warns(None) as record:
    ...
assert not [w.message for w in record]

I think the better solution is to replace that with:

with warnings.catch_warnings():
    warnings.simplefilter("error", EXPECTED_WARNING)
    ...

where EXPECTED_WARNING is the warning we expect to not happen. Since this is a non-trivial change, I think it's better to do one PR per file.

If we do not know the expected warning then we can write:

with warnings.catch_warnings():
    warnings.simplefilter("error")
    ...

(But I prefer to have the expected warning)
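As a runnable illustration of the pattern above (`EXPECTED_WARNING`, `fit_quietly`, and `fit_noisily` are hypothetical stand-ins, not scikit-learn code):

```python
import warnings

EXPECTED_WARNING = FutureWarning  # the warning we expect NOT to happen

def fit_quietly():
    pass  # well-behaved code path: emits no warning

def fit_noisily():
    warnings.warn("behavior will change", FutureWarning)

# simplefilter("error", ...) turns the targeted warning into an exception,
# so the test fails loudly if that warning is ever emitted.
with warnings.catch_warnings():
    warnings.simplefilter("error", EXPECTED_WARNING)
    fit_quietly()  # passes silently

caught = None
try:
    with warnings.catch_warnings():
        warnings.simplefilter("error", EXPECTED_WARNING)
        fit_noisily()
except FutureWarning as exc:
    caught = str(exc)

assert caught == "behavior will change"
```

Restricting the filter to a specific warning class keeps unrelated warnings from failing the test, which is why the targeted form is preferred.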

@ShanDeng123
Contributor

Will do - I'll go back over the files with 1 PR each, and look to add the extra detail when applicable 👍

@thomasjpfan
Member

I opened #22824 as a demonstration of what the idea looks like. Let's wait to see what other maintainers think before opening more.

@thomasjpfan
Member

@ShanDeng123 #22824 has been merged. You may open a PR per file to address the other ones following the idea in #22572 (comment) .

@danifernandes-hub
Contributor

Working on sklearn/metrics/tests/test_classification.py

@ShanDeng123
Contributor

ShanDeng123 commented Mar 15, 2022

@thomasjpfan
Is there an intuitive way to know which warning not to expect? In some cases the warning is easy to see from the function name or surrounding statements, but I'm unsure about most others.

Would it be fair to say that pytest.warns(None) generally checks for UserWarning unless the surrounding statements suggest otherwise?

For example, in test_feature_agglomeration, I'd imagine that the expected warning is a UserWarning for the code below:

    with pytest.warns(None) as record:
        agglo_mean.fit(X)
    assert not [w.message for w in record]

but I'd be lying if I said I had any confidence in that answer.

@thomasjpfan
Member

Is there an intuitive way to know which warning not to expect? In some cases the warning is easy to see from the function name or surrounding statements, but I'm unsure about most others.

The only way to check is through git blame. For example, the warning in test_feature_agglomeration is a DeprecationWarning: #11537

For that specific case, there is no deprecation warning in the code anymore, so I would remove the "no warnings" check altogether. The PR should state the reason for removing it.

@nik1097
Contributor

nik1097 commented Apr 7, 2022

Hey @thomasjpfan I'm a first-time contributor, could you please review my changes? Thank you so much.

@MegaGonz
Contributor

MegaGonz commented Apr 7, 2022

I can work on sklearn/tests/test_naive_bayes.py

@nik1097
Contributor

nik1097 commented Apr 7, 2022

I can work on sklearn/tests/test_naive_bayes.py

I'm afraid #23066 already fixes this issue, @adamgonzo.

@cmarmo cmarmo reopened this Apr 7, 2022
@cmarmo
Contributor

cmarmo commented Apr 7, 2022

Reopening as some tests still need to be modified.

@nik1097
Contributor

nik1097 commented Apr 7, 2022

@jeremiedbb this should be the last file having the outdated syntax, I've opened another PR to fix this issue. Could you please review it?

@cmarmo
Contributor

cmarmo commented Apr 7, 2022

Also, #22949 is still open.

@jeremiedbb
Member

jeremiedbb commented Apr 8, 2022

sklearn/inspection/_plot/tests/test_plot_partial_dependence.py and sklearn/tests/test_pipeline.py still have some instances of the deprecated pattern.

orionlee added a commit to orionlee/lightkurve that referenced this issue Jan 29, 2024
orionlee added a commit to lightkurve/lightkurve that referenced this issue Jan 30, 2024

* Fixed tpf.interact() : lc plot ylabel, case normalized flux

* Fixed tests compatibility with pytest v8
- patterned after: scikit-learn/scikit-learn#22572 (comment)

* add to changelog [no ci]
ignacioct pushed a commit to argilla-io/argilla that referenced this issue Jan 31, 2024


# Description

This PR addresses errors produced by using `pytest.warns` with `None`
args

See [this failing
action](https://github.com/argilla-io/argilla/actions/runs/7715020702/job/21028986681#step:12:23277),
and [this
thread](scikit-learn/scikit-learn#22572 (comment))
for more info.

**Type of change**

(Please delete options that are not relevant. Remember to title the PR
according to the type of change)

- [ ] New feature (non-breaking change which adds functionality)
- [ ] Refactor (change restructuring the codebase without changing
functionality)
- [X] Improvement (change adding some improvement to an existing
functionality)

**How Has This Been Tested**

(Please describe the tests that you ran to verify your changes. And
ideally, reference `tests`)

- [ ] Test A
- [ ] Test B

**Checklist**

- [ ] I added relevant documentation
- [ ] I followed the style guidelines of this project
- [ ] I did a self-review of my code
- [ ] I made corresponding changes to the documentation
- [ ] My changes generate no new warnings
- [ ] I have added tests that prove my fix is effective or that my
feature works
- [ ] I filled out [the contributor form](https://tally.so/r/n9XrxK)
(see text above)
- [ ] I have added relevant notes to the `CHANGELOG.md` file (See
https://keepachangelog.com/)
frascuchon added a commit to argilla-io/argilla that referenced this issue Jan 31, 2024
