4.1.0: pytest is failing #339

Closed
kloczek opened this issue Sep 8, 2021 · 10 comments

kloczek commented Sep 8, 2021

I'm trying to package your module as an RPM package, so I'm using the typical build, install, and test cycle used when building packages from a non-root account:

  • "setup.py build"
  • "setup.py install --root </install/prefix>"
  • "pytest with PYTHONPATH pointing to sitearch and sitelib inside </install/prefix>

May I ask for help, because a few units are failing:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-billiard-3.6.4.0-4.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-billiard-3.6.4.0-4.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra --ignore t/unit/test_common.py --ignore t/unit/test_win32.py
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
Using --randomly-seed=3323748296
rootdir: /home/tkloczko/rpmbuild/BUILD/billiard-3.6.4.0, configfile: setup.cfg, testpaths: t/unit/
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, aspectlib-1.5.2, toolbox-0.5, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, xprocess-0.18.1, black-0.3.12, anyio-3.3.0, asyncio-0.15.1, subtests-0.5.0, isort-2.0.0, hypothesis-6.14.6, mock-3.6.1, profiling-1.7.0, randomly-3.8.0, Faker-8.12.1, nose2pytest-1.0.8, pyfakefs-4.5.1, tornado-0.8.1, twisted-1.13.3, aiohttp-0.3.0
collected 11 items

t/unit/test_spawn.py .F.                                                                                                                                             [ 27%]
t/unit/test_dummy.py .                                                                                                                                               [ 36%]
t/unit/test_values.py ....                                                                                                                                           [ 72%]
t/unit/test_package.py .                                                                                                                                             [ 81%]
t/unit/test_pool.py ..                                                                                                                                               [100%]

================================================================================= FAILURES =================================================================================
______________________________________________________________________ test_spawn.test_set_pdeathsig _______________________________________________________________________

self = <t.unit.test_spawn.test_spawn object at 0x7fd2ddb84040>

    @pytest.mark.skipif(not sys.platform.startswith('linux'),
                        reason='set_pdeathsig() is supported only in Linux')
    def test_set_pdeathsig(self):
        success = "done"
        q = Queue()
        p = Process(target=parent_task, args=(q, success))
        p.start()
        child_proc = psutil.Process(q.get(timeout=3))
        try:
            p.terminate()
>           assert q.get(timeout=3) == success

t/unit/test_spawn.py:31:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <billiard.queues.Queue object at 0x7fd2dda0f100>, block = True, timeout = 2.999995225982275

    def get(self, block=True, timeout=None):
        if block and timeout is None:
            with self._rlock:
                res = self._recv_bytes()
            self._sem.release()

        else:
            if block:
                deadline = monotonic() + timeout
            if not self._rlock.acquire(block, timeout):
                raise Empty
            try:
                if block:
                    timeout = deadline - monotonic()
                    if timeout < 0 or not self._poll(timeout):
>                       raise Empty
E                       _queue.Empty

billiard/queues.py:111: Empty
========================================================================= short test summary info ==========================================================================
FAILED t/unit/test_spawn.py::test_spawn::test_set_pdeathsig - _queue.Empty
======================================================================= 1 failed, 10 passed in 5.35s =======================================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
auvipy closed this as completed Nov 19, 2021

kloczek commented Nov 19, 2021

Any comment?

auvipy reopened this Nov 19, 2021

auvipy commented Nov 19, 2021

I thought it was fixed!


kloczek commented Dec 14, 2022

Just tested 4.1.0 and all looks ~good except one small detail: the test suite is still using nose.
Would you accept a PR with the patch below?

--- a/t/integration/setup.py
+++ b/t/integration/setup.py
@@ -41,9 +41,9 @@
     data_files=[],
     zip_safe=False,
     cmdclass={'install': no_install},
-    test_suite='nose.collector',
+    test_suite='pytest',
     build_requires=[
-        'nose',
+        'pytest',
         'coverage>=3.0',
     ],
     classifiers=[
--- a/t/integration/tests/test_multiprocessing.py
+++ b/t/integration/tests/test_multiprocessing.py
@@ -15,7 +15,7 @@
 import array
 import random
 import logging
-from nose import SkipTest
+import pytest
 from test import test_support
 from StringIO import StringIO
 try:

Feel free to commit that without waiting on a PR 😋
With the above patch:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-billiard-4.1.0-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-billiard-4.1.0-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.16, pytest-7.2.0, pluggy-1.0.0
rootdir: /home/tkloczko/rpmbuild/BUILD/billiard-4.1.0, configfile: setup.cfg, testpaths: t/unit/
collected 73 items

t/unit/test_common.py ........                                                                                                                                       [ 10%]
t/unit/test_dummy.py .                                                                                                                                               [ 12%]
t/unit/test_einfo.py ..                                                                                                                                              [ 15%]
t/unit/test_package.py .                                                                                                                                             [ 16%]
t/unit/test_pool.py ...                                                                                                                                              [ 20%]
t/unit/test_spawn.py ...                                                                                                                                             [ 24%]
t/unit/test_values.py ....                                                                                                                                           [ 30%]
t/unit/test_win32.py sssssssssssssssssssssssssssssssssssssssssssssssssss                                                                                             [100%]

============================================================================= warnings summary =============================================================================
t/unit/test_spawn.py::test_spawn::test_start
  /usr/lib/python3.8/site-packages/_pytest/python.py:204: PytestReturnNotNoneWarning: Expected None, but t/unit/test_spawn.py::test_spawn::test_start returned 0, which will be an error in a future version of pytest.  Did you mean to use `assert` instead of `return`?
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================================================================= short test summary info ==========================================================================
SKIPPED [29] t/unit/test_win32.py:13: Requires Windows to work
SKIPPED [20] t/unit/test_win32.py:47: Requires Windows to work
SKIPPED [1] t/unit/test_win32.py:72: Requires Windows to work
SKIPPED [1] t/unit/test_win32.py:76: Requires Windows to work
================================================================ 22 passed, 51 skipped, 1 warning in 3.93s =================================================================
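
The PytestReturnNotNoneWarning above just means a test returns a value instead of asserting it; a minimal sketch of the kind of change that silences it (hypothetical test body, not billiard's actual test_start):

import billiard


def _noop():
    pass


def test_start():
    p = billiard.Process(target=_noop)
    p.start()
    p.join()
    # Returning p.exitcode here is what triggers the warning;
    # assert on it instead of returning it.
    assert p.exitcode == 0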


auvipy commented Dec 15, 2022

OK, thanks, will do this week :p

auvipy changed the title from "3.6.4.0: pytest is failing" to "4.1.0: pytest is failing" Dec 15, 2022

auvipy commented Dec 15, 2022

#383


auvipy commented Apr 11, 2023

#383

Can you review it, please?


auvipy commented Apr 13, 2023

What is the pytest alternative of from nose import SkipTest?


kloczek commented Apr 13, 2023

#383

Can you review it, please?

One sec .. will try to test that 😋

What is the pytest alternative of from nose import SkipTest?

Generally speaking, one of the best compendiums on nose -> pytest migration, IMO, is https://github.com/schollii/nose2pytest/
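
For SkipTest specifically, the usual pytest replacement is pytest.skip() inside the test, or a pytest.mark.skipif marker on it; a minimal sketch with hypothetical test names, not the actual billiard tests:

import sys

import pytest


def test_only_on_linux():
    # nose style was: raise SkipTest('requires Linux')
    # pytest style is an in-test call:
    if not sys.platform.startswith('linux'):
        pytest.skip('requires Linux')
    assert True


# Or declaratively, with a marker instead of an in-test skip:
@pytest.mark.skipif(not sys.platform.startswith('linux'),
                    reason='requires Linux')
def test_only_on_linux_marker():
    assert True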


kloczek commented Apr 13, 2023

OK, just tested that PR.
Here is the pytest output:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-billiard-4.1.0-3.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-billiard-4.1.0-3.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra -m 'not network'
==================================================================================== test session starts ====================================================================================
platform linux -- Python 3.8.16, pytest-7.3.0, pluggy-1.0.0
rootdir: /home/tkloczko/rpmbuild/BUILD/billiard-4.1.0
configfile: setup.cfg
testpaths: t/unit/
collected 73 items

t/unit/test_common.py ........                                                                                                                                                        [ 10%]
t/unit/test_dummy.py .                                                                                                                                                                [ 12%]
t/unit/test_einfo.py ..                                                                                                                                                               [ 15%]
t/unit/test_package.py .                                                                                                                                                              [ 16%]
t/unit/test_pool.py ...                                                                                                                                                               [ 20%]
t/unit/test_spawn.py ...                                                                                                                                                              [ 24%]
t/unit/test_values.py ....                                                                                                                                                            [ 30%]
t/unit/test_win32.py sssssssssssssssssssssssssssssssssssssssssssssssssss                                                                                                              [100%]

===================================================================================== warnings summary ======================================================================================
t/unit/test_spawn.py::test_spawn::test_start
  /usr/lib/python3.8/site-packages/_pytest/python.py:203: PytestReturnNotNoneWarning: Expected None, but t/unit/test_spawn.py::test_spawn::test_start returned 0, which will be an error in a future version of pytest.  Did you mean to use `assert` instead of `return`?
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================================================================================== short test summary info ==================================================================================
SKIPPED [29] t/unit/test_win32.py:13: Requires Windows to work
SKIPPED [20] t/unit/test_win32.py:47: Requires Windows to work
SKIPPED [1] t/unit/test_win32.py:72: Requires Windows to work
SKIPPED [1] t/unit/test_win32.py:76: Requires Windows to work
========================================================================= 22 passed, 51 skipped, 1 warning in 3.67s =========================================================================


auvipy commented Nov 5, 2023

What is the pytest alternative of from nose import SkipTest?

Got it covered here: #397

auvipy closed this as completed Nov 5, 2023