
Several Concurrent URL Calls

How can I make, say, N URL calls in parallel and process the responses as they come back? I want to read the responses and print them to the screen, possibly after some manipulation.

Solution 1:

You can use Twisted for this; see the HTTP client example here: https://twistedmatrix.com/documents/13.0.0/web/howto/client.html#auto3

Twisted is an asynchronous programming library for Python which lets you carry out multiple actions "at the same time," and it comes with an HTTP client (and server).
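A minimal sketch of that approach, using Twisted's Agent and readBody as in the linked howto; the URL list is a placeholder, and error handling is kept to a single errback:

```python
from twisted.internet import defer, reactor
from twisted.web.client import Agent, readBody

# Hypothetical list of URLs to fetch; replace with your own.
urls = ["http://example.com/", "http://example.org/"]

agent = Agent(reactor)

def handle_body(body, url):
    # Any per-response manipulation goes here before printing.
    print(f"{url}: {len(body)} bytes")

def handle_error(failure, url):
    print(f"{url}: failed ({failure.getErrorMessage()})")

def fetch(url):
    # Agent.request expects bytes for both the method and the URI.
    d = agent.request(b"GET", url.encode("ascii"))
    d.addCallback(readBody)            # read the full response body
    d.addCallback(handle_body, url)    # process each response as it arrives
    d.addErrback(handle_error, url)
    return d

# Fire all requests concurrently; stop the reactor once every one has finished.
dl = defer.DeferredList([fetch(u) for u in urls])
dl.addCallback(lambda _: reactor.stop())

reactor.run()
```

Because each Deferred gets its own callbacks, responses are processed in whatever order they complete, which matches the "handle them as they come back" requirement.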

Solution 2:

One basic solution that comes to mind is to use threading.

Depending on the number of URLs you retrieve in parallel, you could have one thread per URL. Or (this scales better) have a fixed number of "worker" threads that read URLs from a shared Queue, as in the sketch below.
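A minimal sketch of the worker-thread variant, assuming Python 3's queue, threading, and urllib.request; the URL list and worker count are placeholders:

```python
import queue
import threading
from urllib.request import urlopen

NUM_WORKERS = 4  # fixed pool size; tune to taste

# Hypothetical list of URLs; replace with your own.
urls = ["http://example.com/", "http://example.org/"]

def worker(url_queue):
    while True:
        url = url_queue.get()
        if url is None:          # sentinel: no more work for this thread
            break
        try:
            with urlopen(url, timeout=10) as resp:
                body = resp.read()
            # Any per-response manipulation goes here before printing.
            print(f"{url}: {len(body)} bytes")
        except Exception as exc:
            print(f"{url}: failed ({exc})")

url_queue = queue.Queue()
for u in urls:
    url_queue.put(u)
for _ in range(NUM_WORKERS):
    url_queue.put(None)          # one sentinel per worker

threads = [threading.Thread(target=worker, args=(url_queue,))
           for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Each worker pulls the next URL as soon as it finishes the previous one, so responses are printed in completion order rather than submission order.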
