
pytest output kind of unreadable in asyncio/logging context #13158

Closed
ayjayt opened this issue Jan 22, 2025 · 3 comments
Labels
status: needs information (reporter needs to provide more information; can be closed after 2 or more weeks of inactivity)
topic: reporting (related to terminal output and user-facing messages and errors)

Comments

@ayjayt
Contributor

ayjayt commented Jan 22, 2025

Here's an example of my output; it's for one test:

The stack traces are not divided well, there are dividers in odd places, and the output sections aren't labeled. It's my own program and I can figure things out from context, but this seems like strangely mixed output.

Poe => pytest -W error -vvx -rA --capture=tee-sys tests/test_process.py
============================================================ test session starts =============================================================
platform linux -- Python 3.13.0, pytest-8.3.4, pluggy-1.5.0 -- /home/ajp/projects/devtools_protocol/.venv/bin/python
cachedir: .pytest_cache
rootdir: /home/ajp/projects/devtools_protocol
configfile: pyproject.toml
plugins: xdist-3.6.1, asyncio-0.25.2
asyncio: mode=Mode.STRICT, asyncio_default_fixture_loop_scope=function
collected 18 items

tests/test_process.py::test_context[headless-enable_sandbox-enable_gpu] FAILED                                                         [  5%]

================================================================== FAILURES ==================================================================
______________________________________________ test_context[headless-enable_sandbox-enable_gpu] ______________________________________________

capteesys = <_pytest.capture.CaptureFixture object at 0x7acb96b152b0>, headless = True, sandbox = True, gpu = True

Exception ignored in: <finalize object at 0x7acb96b44380; dead>
Traceback (most recent call last):
  File "/home/ajp/.local/share/uv/python/cpython-3.13.0-linux-x86_64-gnu/lib/python3.13/weakref.py", line 590, in __call__
    return info.func(*info.args, **(info.kwargs or {}))
  File "/home/ajp/.local/share/uv/python/cpython-3.13.0-linux-x86_64-gnu/lib/python3.13/tempfile.py", line 935, in _cleanup
    cls._rmtree(name, ignore_errors=ignore_errors)
  File "/home/ajp/.local/share/uv/python/cpython-3.13.0-linux-x86_64-gnu/lib/python3.13/tempfile.py", line 930, in _rmtree
    _shutil.rmtree(name, onexc=onexc)
  File "/home/ajp/.local/share/uv/python/cpython-3.13.0-linux-x86_64-gnu/lib/python3.13/shutil.py", line 763, in rmtree
    _rmtree_safe_fd(stack, onexc)
  File "/home/ajp/.local/share/uv/python/cpython-3.13.0-linux-x86_64-gnu/lib/python3.13/shutil.py", line 707, in _rmtree_safe_fd
    onexc(func, path, err)
  File "/home/ajp/.local/share/uv/python/cpython-3.13.0-linux-x86_64-gnu/lib/python3.13/shutil.py", line 658, in _rmtree_safe_fd
    os.rmdir(name, dir_fd=dirfd)
OSError: [Errno 39] Directory not empty: '/tmp/tmp9g5ntzdi/Default'
    @pytest.mark.asyncio(loop_scope="function")
    async def test_context(capteesys, headless, sandbox, gpu):
        async with (
>           choreo.Browser(
                headless=headless,
                enable_sandbox=sandbox,
                enable_gpu=gpu,
            ) as browser,
            timeout(pytest.default_timeout),
        ):

tests/test_process.py:19:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
choreographer/browser_async.py:222: in __aexit__
    await self.close()
choreographer/browser_async.py:203: in close
    await self._close()
choreographer/browser_async.py:175: in _close
    await self.send_command("Browser.close")
choreographer/protocol/devtools_async.py:222: in send_command
    return await session.send_command(command, params)
choreographer/protocol/devtools_async.py:94: in send_command
    return await self._broker.write_json(json_command)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <choreographer._brokers._async.Broker object at 0x7acb96b156a0>, obj = {'id': 3, 'method': 'Browser.close'}

    async def write_json(
        self,
        obj: protocol.BrowserCommand,
    ) -> protocol.BrowserResponse:
        _logger.debug2(f"In broker.write_json for {obj}")
        protocol.verify_params(obj)
        key = protocol.calculate_message_key(obj)
        if not key:
            raise RuntimeError(
                "Message strangely formatted and "
                "choreographer couldn't figure it out why.",
            )
        loop = asyncio.get_running_loop()
        future: asyncio.Future[protocol.BrowserResponse] = loop.create_future()
        self.futures[key] = future
        try:
            await asyncio.to_thread(self._channel.write_json, obj)
        except BaseException as e:  # noqa: BLE001
            future.set_exception(e)
            del self.futures[key]
>       return await future
E       asyncio.exceptions.CancelledError

choreographer/_brokers/_async.py:218: CancelledError
========================================================== short test summary info ===========================================================
FAILED tests/test_process.py::test_context[headless-enable_sandbox-enable_gpu] - asyncio.exceptions.CancelledError
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================= 1 failed in 0.23s ==============================================================
Error: Sequence aborted after failed subtask '_debug-test_proc'
@Zac-HD
Member

Zac-HD commented Jan 27, 2025

Can you provide a minimal reproducer? What does it look like if you just run the test function without pytest? Does pytest --tb=native ... look better to you?

Zac-HD added the topic: reporting and status: needs information labels on Jan 27, 2025
@graingert
Member

It looks like an unraisable exception is getting logged midway through pytest printing its exception. This should be fixed in pytest 8.4.
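
For illustration, here is a minimal, hypothetical sketch of that interaction (not taken from this issue; the Leaky class and test name are made up, and the real unraisable exception in the log above comes from tempfile's weakref finalizer). A finalizer that raises during garbage collection produces an unraisable exception, and because the failing test's locals stay alive through the exception's traceback until pytest releases it, the "Exception ignored in: ..." output can land while the failure report is being written:

class Leaky:
    """Stand-in for an object whose cleanup fails, like the temp directory above."""

    def __del__(self):
        # Raising here makes CPython print "Exception ignored in: ..." to stderr
        # at whatever moment the object is finally collected.
        raise OSError(39, "Directory not empty")


def test_interleaved_output():
    obj = Leaky()  # noqa: F841 - kept alive by the failure's traceback frame,
    # so finalization may only happen once pytest drops the exception info,
    # i.e. during or after report rendering.
    assert False, "trigger pytest's failure report"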

@ayjayt
Contributor Author

ayjayt commented Jan 27, 2025

Hey everyone, I simplified my capture and reporting configuration and the error disappeared. Sorry for not being able to help more.

ayjayt closed this as completed Jan 27, 2025