gui-crawl: Reduce memory required to end a test-run.
Earlier,
if URLs were not found for 100 iterations, the test-run was exited.
This took ~8 minutes on a local system; on a busy CI node, the time
taken could scale by a factor of 3.

Now,
if URLs are not found for 25 iterations, the test-run exits.

Removed `time.sleep(rate_of_tasks)`.
Processing the tasks as a TaskGroup ensures that URLs are found faster
than new tasks are created.
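
For illustration, here is a minimal sketch of that exit condition, assuming
Python 3.11's `asyncio.TaskGroup`; `crawl_url`, `todos` and the loop shape are
hypothetical stand-ins, not the actual code in `tests/testlib/crawler.py`:

import asyncio


async def crawl_url(url: str, todos: set[str]) -> None:
    # Hypothetical placeholder: crawl `url` and add discovered links to `todos`.
    await asyncio.sleep(0)


async def crawl(todos: set[str], max_tasks: int) -> None:
    memory_size_urls_exist = 25  # iterations without new URLs before exiting
    iterations_without_urls = 0

    while iterations_without_urls < memory_size_urls_exist:
        if not todos:
            iterations_without_urls += 1
            await asyncio.sleep(0)  # yield to the event loop; no pacing sleep needed
            continue
        iterations_without_urls = 0

        batch = [todos.pop() for _ in range(min(max_tasks, len(todos)))]
        # The TaskGroup waits for the whole batch, so URLs discovered by these
        # tasks are already in `todos` before the next batch is created.
        async with asyncio.TaskGroup() as tg:
            for url in batch:
                tg.create_task(crawl_url(url, todos))

Because each batch is awaited as a group before the next one is created, an
explicit pacing sleep is unnecessary, and an empty `todos` queue exhausts the
25-iteration budget almost immediately.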

CMK-20093

Change-Id: I0fd9e9b4023d29e7c1a1cf66853ba4ab086e4b1d
dhsh-checkmk committed Nov 22, 2024
1 parent 11b631c commit e7d007a
Showing 1 changed file with 2 additions and 7 deletions.
9 changes: 2 additions & 7 deletions tests/testlib/crawler.py
@@ -211,13 +211,11 @@ async def crawl(self, max_tasks: int, max_url_batch_size: int = 100) -> None:
         """Crawl through URLs using simultaneously using tasks / coroutines.
         A group of tasks / coroutines is added every `rate_create_crawl_task` seconds.
-        Crawling stop when URLs are not found for last `memory_size_urls_exist` iterations
-        (`rate_create_crawl_task` x `memory_size_urls_exist` seconds).
+        Crawling stop when URLs are not found for last `memory_size_urls_exist` iterations.
         debug-mode: Crawling URLs stop when at least `--max-urls` number of URLs have been crawled.
         """
-        rate_create_crawl_task = 0.1 # seconds
-        memory_size_urls_exist = 100 # iterations
+        memory_size_urls_exist = 25 # iterations
         search_limited_urls: bool = self._max_urls > 0
         # special-case
         if search_limited_urls and self._max_urls < max_tasks:
@@ -256,9 +254,6 @@ async def crawl(self, max_tasks: int, max_url_batch_size: int = 100) -> None:
                     logger.info("No more URLs to crawl. Stopping ...")
                 # ----
 
-            # ensure rate of URL collection in `self._todos` > rate of new tasks added
-            time.sleep(rate_create_crawl_task)
-
     async def setup_checkmk_context(
         self, browser: playwright.async_api.Browser
     ) -> playwright.async_api.BrowserContext:
