Memory Leak/Memory Increase on Aiohttp Server #10528

Open
jmspereira opened this issue Mar 7, 2025 · 1 comment

Labels
bug needs-info Issue is lacking sufficient information and will be closed if not provided

Comments

@jmspereira

Describe the bug

Hey everyone,

I have a backend that provides multiple endpoints for thousands of connections, and I am noticing a memory increase/leak whose root cause I have not been able to fully pinpoint.

However, with the following minimal example, I can see that aiohttp is not fully releasing the memory it uses: the process (the backend) starts with almost negligible memory usage, and every time I run the client the memory used by the backend grows and never decreases. The minimal server:

from aiohttp import web


async def test_handler(_: web.Request) -> web.Response:
    return web.json_response({"message": "Hello, World!"})


def main():
    app = web.Application()
    app.add_routes([web.get("/test", test_handler)])
    web.run_app(app, port=9090, access_log=None, print=lambda _: None)


if __name__ == '__main__':
    main()
And the client used to generate load:

import asyncio
import logging

import aiohttp

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


async def make_request() -> None:
    try:
        async with aiohttp.ClientSession() as session:
            async with session.get("http://localhost:9090/test") as response:
                await response.read()
                logger.info(f"Request sucessed: {response.status}")
    except Exception as e:
        logger.error(f"Error making request: {str(e)}")


async def run_load_test(num_requests: int = 10000) -> None:    
    tasks: list[asyncio.Task] = []
    
    for _ in range(num_requests):
        task = asyncio.create_task(make_request())
        tasks.append(task)
    
    await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(run_load_test())

If you play with num_requests, for instance double it, you can see that the server's memory usage increases even further compared with the 10000-request run and, again, does not decrease.
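For comparison, it may also be worth running a hypothetical variant of the client that reuses a single ClientSession for every request (so the server handles far fewer concurrent connections) and checking whether the growth pattern changes:

# Hypothetical client variant: one shared ClientSession for all requests,
# to compare the server's memory behaviour against the script above.
import asyncio

import aiohttp


async def run_load_test(num_requests: int = 10000) -> None:
    async with aiohttp.ClientSession() as session:

        async def one_request() -> None:
            async with session.get("http://localhost:9090/test") as response:
                await response.read()

        await asyncio.gather(*(one_request() for _ in range(num_requests)))


if __name__ == "__main__":
    asyncio.run(run_load_test())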

To Reproduce

  1. Run the backend
  2. Run the client
  3. Play with the number of requests

All of this is while watching the memory usage (RSS) of the server process.
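A minimal sketch of one way to watch it, assuming the third-party psutil package is installed; the PID of the server is passed on the command line and the one-second polling interval is just illustrative:

# watch_rss.py: print the resident set size (RSS) of a process once a second.
import sys
import time

import psutil


def main() -> None:
    proc = psutil.Process(int(sys.argv[1]))
    while True:
        rss_mib = proc.memory_info().rss / (1024 * 1024)
        print(f"RSS: {rss_mib:.1f} MiB")
        time.sleep(1)


if __name__ == "__main__":
    main()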

Expected behavior

The memory should be released.

Logs/tracebacks

Not applicable.

Python Version

$ python --version
Python 3.12.3

aiohttp Version

$ python -m pip show aiohttp
Name: aiohttp
Version: 3.11.13
Summary: Async http client/server framework (asyncio)
Home-page: https://github.com/aio-libs/aiohttp
Author: 
Author-email: 
License: Apache-2.0
Requires: aiohappyeyeballs, aiosignal, attrs, frozenlist, multidict, propcache, yarl

multidict Version

$ python -m pip show multidict
Name: multidict
Version: 6.1.0
Summary: multidict implementation
Home-page: https://github.com/aio-libs/multidict
Author: Andrew Svetlov
Author-email: [email protected]
License: Apache 2
Requires:

propcache Version

$ python -m pip show propcache
Name: propcache
Version: 0.3.0
Summary: Accelerated property cache
Home-page: https://github.com/aio-libs/propcache
Author: Andrew Svetlov
Author-email: [email protected]
License: Apache-2.0
Requires:

yarl Version

$ python -m pip show yarl
Name: yarl
Version: 1.18.3
Summary: Yet another URL library
Home-page: https://github.com/aio-libs/yarl
Author: Andrew Svetlov
Author-email: [email protected]
License: Apache-2.0
Requires: idna, multidict, propcache

OS

Ubuntu 24.04.2 LTS

Related component

Server

Additional context

No response

Code of Conduct

  • I agree to follow the aio-libs Code of Conduct
@jmspereira jmspereira added the bug label Mar 7, 2025
@Dreamsorcerer
Member

See #4618 and verify trimming the memory as per the end of that discussion. We've spent many hours in the past looking at these issues, which have turned out to be nothing. I don't think anybody here is going to spend more time looking at this unless someone does the work first to verify there's a real memory issue and narrows down where the problem is. I'd note that there are caches (DNS etc.), so a small increase in memory is expected.
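For reference, a minimal sketch of the trimming check, assuming a glibc-based Linux system (the libc.so.6 name is glibc-specific): run this inside the server process after the load test and see whether RSS drops; if it does, the memory was being held by the allocator rather than leaked by aiohttp.

# Ask glibc's allocator to return freed heap pages to the OS.
import ctypes

ctypes.CDLL("libc.so.6").malloc_trim(0)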

@Dreamsorcerer Dreamsorcerer added the needs-info Issue is lacking sufficient information and will be closed if not provided label Mar 7, 2025