Parallel GET requests for different domains with threading module


I expect to have something like 100k URLs from different domains. I wrote the code below, which takes a list of URLs in all_urls and runs them N threads at a time, in batches. Currently I'm using the threading module to make these requests in parallel.

import requests
import os
import threading
import time

all_urls = [] # a list of URLs to request, can have up to 100k

success = 0
fail = 0
counter_lock = threading.Lock()  # += on the shared counters is not atomic across threads

def func(url_to_request):
    global success, fail
    try:
        r = requests.get(url_to_request, timeout=5)
        c = r.content  # read the body so the response is fully consumed
        with counter_lock:
            success += 1
    except requests.RequestException:
        with counter_lock:
            fail += 1

N = 200 # number of threads per batch
all_threads_urls = []
time_start = time.time()

for item in all_urls:
    all_threads_urls.append(item)
    if all_urls.index(item) == len(all_urls)-1 or len(all_threads_urls) == N:
        # call it
        all_threads = []
        for link in all_threads_urls:
            current_thread = threading.Thread(target=func, args=(link,))
            all_threads.append(current_thread)
            current_thread.start()

        for thr in all_threads:
            thr.join()

        all_threads_urls = [] # for the next batch
        time_end = time.time()

        print("Request number", all_urls.index(item)+1, "Good:", success, "Bad:", fail, "Duration:", round(time_end - time_start, 2), "seconds.")
        time_start = time_end

The results are a bit strange: the script starts very fast but then slows down a lot. The printed durations are per batch.
Can someone explain what the bottleneck is here? Is there a better module for this, or is there no way around it?
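For reference, the same workload could also be expressed with concurrent.futures.ThreadPoolExecutor from the standard library (Python 3). This is only a minimal sketch of that alternative, not a drop-in replacement: the `fetch` helper is a hypothetical stand-in for `func` above, and the pool keeps all workers busy instead of waiting for the slowest URL in each fixed batch.

```python
import concurrent.futures
import time

import requests

all_urls = []  # a list of URLs to request, can have up to 100k

def fetch(url):
    # Return True on success, False on any request error.
    try:
        r = requests.get(url, timeout=5)
        r.content  # read the body so the response is fully consumed
        return True
    except requests.RequestException:
        return False

success = 0
fail = 0
time_start = time.time()
# map() yields results in input order, so enumerate() gives the
# same "request number" counter as the batch loop above.
with concurrent.futures.ThreadPoolExecutor(max_workers=200) as pool:
    for i, ok in enumerate(pool.map(fetch, all_urls), start=1):
        if ok:
            success += 1
        else:
            fail += 1
        if i % 200 == 0:  # report once per 200 requests
            print("Request number", i, "Good:", success, "Bad:", fail,
                  "Duration:", round(time.time() - time_start, 2), "seconds.")
            time_start = time.time()
```

Because each worker pulls the next URL as soon as it finishes, no thread sits idle waiting for a batch boundary, and the counters need no lock since they are only updated in the main thread.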
