# Handle parallel requests and tasks #545
For high-performance task management and asynchronous processing in Python, you can use specialized frameworks and tools designed for concurrency and scalability. Here are a few high-performance options:

### 1. Dramatiq

Dramatiq is a fast and reliable distributed task processing library for Python. It's designed to handle a high volume of tasks with low latency.

Installation:

```bash
pip install dramatiq redis
```

Setup with FastAPI:

`worker.py`:

```python
import dramatiq
from dramatiq.brokers.redis import RedisBroker

redis_broker = RedisBroker(host="localhost", port=6379)
dramatiq.set_broker(redis_broker)

@dramatiq.actor
def long_running_task(param1, param2):
    # Your long-running task here
    ...
```

`main.py`:

```python
from fastapi import FastAPI
from worker import long_running_task

app = FastAPI()

@app.post("/start-task/")
async def start_task(param1: str, param2: str):
    long_running_task.send(param1, param2)
    return {"message": "Task started"}
```

Start the Dramatiq worker:

```bash
dramatiq worker
```
### 2. RQ (Redis Queue)

RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers.

Installation:

```bash
pip install rq
```

Setup with FastAPI:

`worker.py`:

```python
import time

from redis import Redis
from rq import Queue

redis_conn = Redis()
q = Queue(connection=redis_conn)

def long_running_task(param1, param2):
    # Your long-running task here
    time.sleep(10)  # Simulating a long task
    ...
```

`main.py`:

```python
from fastapi import FastAPI
from worker import q, long_running_task

app = FastAPI()

@app.post("/start-task/")
async def start_task(param1: str, param2: str):
    q.enqueue(long_running_task, param1, param2)
    return {"message": "Task started"}
```

Start the RQ worker:

```bash
rq worker
```
### 3. Dask

Dask is a flexible parallel computing library for analytics.

Installation:

```bash
pip install dask distributed
```

Setup with FastAPI:

`worker.py`:

```python
def long_running_task(param1, param2):
    # Your long-running task here
    ...
```

`main.py`:

```python
from fastapi import FastAPI
from dask.distributed import Client, fire_and_forget

from worker import long_running_task

client = Client()  # or Client("<scheduler-address>") to connect to an existing scheduler
app = FastAPI()

@app.post("/start-task/")
async def start_task(param1: str, param2: str):
    future = client.submit(long_running_task, param1, param2)
    fire_and_forget(future)
    return {"message": "Task started"}
```

Start the Dask scheduler and workers:

```bash
dask-scheduler
dask-worker <scheduler-address>
```

### Conclusion

All these frameworks are designed for high-performance, scalable task processing. Depending on your specific requirements and environment, you can choose the one that best fits your needs. Dramatiq and RQ are simpler and great for straightforward background tasks, while Dask is more powerful and suited to complex computational tasks.
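One caveat on the Dask variant: `fire_and_forget` deliberately discards the handle, so there is no way to ask about the task afterwards. If status checks matter, keeping the returned future around is enough. A minimal sketch (the in-memory `futures` dict and the `/task-status/` route are illustrative, and the handles are lost on restart):

```python
from fastapi import FastAPI
from dask.distributed import Client

from worker import long_running_task

client = Client()
app = FastAPI()
futures = {}  # maps task key -> Future; in-memory only

@app.post("/start-task/")
async def start_task(param1: str, param2: str):
    future = client.submit(long_running_task, param1, param2)
    futures[future.key] = future
    return {"task_key": future.key}

@app.get("/task-status/{task_key}")
async def task_status(task_key: str):
    future = futures.get(task_key)
    if future is None:
        return {"status": "unknown"}
    # Future.status is "pending", "finished", "error", etc.
    return {"status": future.status}
```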
@TheSnowGuru

This issue is stale because it has been open for 30 days with no activity.

We're actively working on this right now.

This issue is stale because it has been open for 30 days with no activity.
We've got a beta version on Skyvern Cloud that works with Temporal. We're not going to push it to the open-source side for now, as our small team doesn't have the capacity to do it right now.
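For anyone wondering what a Temporal-based setup looks like in broad strokes, here is a minimal sketch using the Temporal Python SDK (`temporalio`). It is not Skyvern's actual implementation; the workflow, activity, and task-queue names are made up for illustration.

```python
from datetime import timedelta

from temporalio import activity, workflow
from temporalio.client import Client
from temporalio.worker import Worker


@activity.defn
async def run_browser_task(payload: str) -> str:
    # Hypothetical activity: the long-running browser automation would go here.
    return f"done: {payload}"


@workflow.defn
class BrowserTaskWorkflow:
    @workflow.run
    async def run(self, payload: str) -> str:
        # Temporal handles timeouts, retries, and durability for the activity.
        return await workflow.execute_activity(
            run_browser_task,
            payload,
            start_to_close_timeout=timedelta(minutes=30),
        )


async def main() -> None:
    client = await Client.connect("localhost:7233")
    # The worker polls the task queue; the API process only starts workflows,
    # so request handling is never tied up by a running task.
    worker = Worker(
        client,
        task_queue="browser-tasks",
        workflows=[BrowserTaskWorkflow],
        activities=[run_browser_task],
    )
    await worker.run()
```

An API endpoint would then kick off work with something like `await client.start_workflow(BrowserTaskWorkflow.run, payload, id="task-123", task_queue="browser-tasks")` and return immediately.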
When a task is running, Skyvern's API service hangs and blocks incoming requests until the running task is completed.
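That kind of hang typically happens when a long-running synchronous call executes directly inside an async endpoint, which ties up the event loop. As a general illustration (not the actual Skyvern code; `run_blocking_task` is a hypothetical stand-in), offloading the blocking call to a thread pool keeps other requests responsive:

```python
import time

from fastapi import FastAPI
from fastapi.concurrency import run_in_threadpool

app = FastAPI()


def run_blocking_task(task_id: str) -> str:
    # Hypothetical stand-in for a long, synchronous piece of work.
    time.sleep(30)
    return f"{task_id} finished"


@app.post("/tasks/{task_id}")
async def run_task(task_id: str):
    # Awaiting the threadpool frees the event loop to serve other requests.
    result = await run_in_threadpool(run_blocking_task, task_id)
    return {"result": result}
```

Declaring the endpoint with plain `def` instead of `async def` has a similar effect, since FastAPI then runs it in a worker thread automatically.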