Rate Limit NOT working with multiple workers using memory #226

Open
abdksyed opened this issue Nov 26, 2024 · 1 comment

Comments

@abdksyed

Describe the bug
I implemented rate limiting with the default settings, so the storage backend is the local in-process memory, and I am running uvicorn with 8 workers.

For an endpoint with a rate limit of 5/minute, when I hit it in a for loop I can make more than 10 requests without ever being rate limited. Is this because of the multiple workers?

But isn't the memory shared between the workers, so that the rate limit should apply across all of them?

To Reproduce

from fastapi import FastAPI, Request
from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address

app = FastAPI()
limiter = Limiter(key_func=get_remote_address)
app.state.limiter = limiter
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)

@app.get("/home")
@limiter.limit("5/minute")
async def home(request: Request):
    return {"message": "hello"}

Expected behavior
N/A

Screenshots
N/A

Your app (please complete the following information):

  • fastapi or starlette? FastAPI
  • Version? 0.111.1
  • slowapi version (have you tried with the latest version)? 0.1.9

Additional context
N/A

@vladiliescu

Each worker runs as a separate process, so the memory isn't shared between them. You should probably use an external backend like Redis or memcached.
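
For reference, a minimal sketch of pointing slowapi at a shared Redis backend (this assumes a Redis server at localhost:6379 and the redis client package installed; storage_uri is passed through to the underlying limits storage):

from slowapi import Limiter
from slowapi.util import get_remote_address

# Counters are kept in Redis, so every uvicorn worker process sees the same hit counts.
limiter = Limiter(
    key_func=get_remote_address,
    storage_uri="redis://localhost:6379",
)

With the in-memory storage each worker keeps its own counters, so well over 5 requests can get through before any single worker starts rejecting them.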
