Add option to limit parallel crawling #28
wernerkrauss started this conversation in Ideas
On a slow server I can crawl the first page, but when the scanner tries to crawl the next 10 pages in parallel, I only get "server did not respond when crawling". Is there a way to limit the number of parallel requests so it can cope with slow servers?

Replies: 2 comments
- This is currently not possible. I'd welcome a PR that adds this to the package.
- Hi @freekmurze, thanks for your reply. Is this possible with the included crawler module, and if so, how can it be configured? If you can point me in the right direction, I might be able to add it to the CLI module. Thanks for the mixed content scanner!
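For reference, a minimal sketch of the direction this points at, assuming the scanner keeps using spatie/crawler under the hood: that package exposes a setConcurrency() method, so limiting parallel requests could look roughly like this (the URL and the limit of 1 are only placeholders):

```php
<?php

// Hedged sketch, not a confirmed part of the CLI: it assumes the scan is
// driven by spatie/crawler, whose setConcurrency() method caps how many
// requests are in flight at once.

require __DIR__ . '/vendor/autoload.php';

use Spatie\Crawler\Crawler;

Crawler::create()
    // Crawl one URL at a time instead of the default batch of concurrent
    // requests, so a slow server is not overwhelmed.
    ->setConcurrency(1)
    ->startCrawling('https://example.com');
```

If that works, the CLI module would mainly need a new option (for example a hypothetical --concurrency flag) whose value is passed to setConcurrency() before the scan starts.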