Python web request slow through Proxy


Some background: I work for a corporation that uses a proxy. Ping/nslookup are blocked, and I think this may be contributing to the following problem. The operating system is Windows, and the version of Python I'm testing is 3.4.3.

I'm trying to create an application that communicates with a webservice, and this application will run inside our network. However, every request takes over 10 seconds to complete, while the same page loads in under a second in a web browser. Note that these requests succeed; they just take far too long to be usable.

I profiled the application using the cProfile module and found that it spends 11 seconds in gethostbyaddr and 4 seconds in gethostbyname.
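
For reference, a minimal sketch of how a single request can be profiled this way (the URL is just a placeholder, not our actual webservice):

import cProfile
import requests

# Profile one GET and sort the report by cumulative time, which is where
# gethostbyaddr/gethostbyname show up.
cProfile.run("requests.get('https://google.com')", sort='cumulative')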

I'm not very familiar with networking, but is this a timeout? If so, why does the request still go through despite it? How do I disable these lookups? And if I can't, is there a library that doesn't use them?
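
To illustrate, the two lookups can also be timed in isolation to confirm that they are what's slow (the hostname is just an example):

import socket
import time

host = 'google.com'  # example hostname, not our internal webservice

start = time.perf_counter()
addr = socket.gethostbyname(host)      # forward DNS lookup
forward = time.perf_counter() - start

start = time.perf_counter()
try:
    socket.gethostbyaddr(addr)         # reverse DNS lookup
except socket.herror:
    pass                               # lookup may fail, but the delay still shows
reverse = time.perf_counter() - start

print('gethostbyname: %.2fs, gethostbyaddr: %.2fs' % (forward, reverse))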

I tried both the requests and urllib modules. pip is also exceedingly slow, possibly for the same reason.
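
In case the proxy configuration itself matters, this is the kind of explicit setup I'd expect to pass to requests (the proxy address below is made up, not our real one):

import requests

# Hypothetical corporate proxy address.
proxies = {
    'http': 'http://proxy.example.com:8080',
    'https': 'http://proxy.example.com:8080',
}

r = requests.get('https://google.com', proxies=proxies, timeout=10)
print(r.status_code)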

Thanks in advance for any help or information on this subject.

Edit

I just tried monkey patching socket.gethostbyaddr and socket.gethostbyname, and the delay was gone. This doesn't feel like a proper solution, though.

import requests
import socket

# Stub that replaces the DNS lookup functions so they return immediately.
def do_nothing(*args):
    return None
socket.gethostbyaddr = do_nothing
socket.gethostbyname = do_nothing

r = requests.get('https://google.com')
print(r.status_code) # prints 200
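
If I end up keeping this workaround, a slightly safer variant would be to keep references to the real functions and restore them after the request, so other code in the process still gets working DNS lookups:

import requests
import socket

def do_nothing(*args):
    return None

# Keep the real functions around so they can be restored.
_orig_gethostbyaddr = socket.gethostbyaddr
_orig_gethostbyname = socket.gethostbyname

socket.gethostbyaddr = do_nothing
socket.gethostbyname = do_nothing
try:
    r = requests.get('https://google.com')
    print(r.status_code)
finally:
    socket.gethostbyaddr = _orig_gethostbyaddr
    socket.gethostbyname = _orig_gethostbyname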
