Getting cookies with requests


When I try to access Tor sites through the .cab web proxy in a browser, I first get a disclaimer page from the .cab proxy, and after clicking a button I get through to the actual .onion site. I think the proxy uses a cookie to record that the disclaimer has been accepted, because when I delete the browser's cookies, the disclaimer appears again the next time I access a site.

However, when I try to access the sites with requests, I don't get any cookies:

>>> r = requests.get(address)
>>> r.cookies
<RequestsCookieJar[]>

I've tried using sessions, but the same thing happens. How can I get the cookies using Python requests?
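For reference, the session attempt was roughly this (a sketch; the result is the same empty jar):

import requests

address = "https://qzbkwswfv5k2oj5d.onion.cab/"

with requests.Session() as s:
    r = s.get(address)
    print(r.cookies)   # <RequestsCookieJar[]> -- empty again
    print(s.cookies)   # the session's own jar stays empty too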

The URL I'm trying is "https://qzbkwswfv5k2oj5d.onion.cab/". I've tried both with no headers and with the headers Chrome sends (a Python version of that attempt follows the list):

Host: qzbkwswfv5k2oj5d.onion.cab
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.124 Safari/537.36
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-GB,en-US;q=0.8,en;q=0.6
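
Passed to requests as a dict, that attempt looks roughly like this (still an empty cookie jar):

import requests

address = "https://qzbkwswfv5k2oj5d.onion.cab/"

# The same headers Chrome sends, passed explicitly.
headers = {
    "Host": "qzbkwswfv5k2oj5d.onion.cab",
    "Connection": "keep-alive",
    "Cache-Control": "max-age=0",
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
    "User-Agent": "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.124 Safari/537.36",
    "Accept-Encoding": "gzip, deflate, sdch",
    "Accept-Language": "en-GB,en-US;q=0.8,en;q=0.6",
}

r = requests.get(address, headers=headers)
print(r.cookies)   # <RequestsCookieJar[]>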

1 Answer

Answered by James Mills:

I believe you'll have to fake the User-Agent:

Example:

from requests import get

# A browser-like User-Agent, so the proxy treats the request as a browser.
headers = {
    "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36"
}

url = "https://qzbkwswfv5k2oj5d.onion.cab/"

response = get(url, headers=headers)
response.raise_for_status()
print(response.cookies)

This is a typical Google Chrome User-Agent string.
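
If that works, a requests.Session will hold on to whatever cookie the proxy sets and send it back automatically on later requests. A minimal sketch, assuming (as the question suggests) that the proxy marks the disclaimer as accepted via a cookie:

import requests

url = "https://qzbkwswfv5k2oj5d.onion.cab/"
headers = {
    "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36"
}

with requests.Session() as session:
    session.headers.update(headers)   # browser-like UA on every request
    first = session.get(url)          # may return the disclaimer page
    first.raise_for_status()
    print(session.cookies)            # any Set-Cookie values land here
    second = session.get(url)         # stored cookies are sent back automatically
    second.raise_for_status()

If accepting the disclaimer actually requires the button click (i.e. a form submission), you would need to replicate that POST with the session as well.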