I work with an enormous system that uses authenticated SSL pervasively, and it's been working perfectly for five months. I have the entire ecosystem set up on my local system (Mac 10.9.5). When running locally, it still uses authenticated SSL connections, though SSL verification of the server certificates is turned off, for obvious reasons.
I use requests to make the connections to an Nginx web server, both locally and on the production servers.
On Friday, I started getting a 400 from requests in the client on my local system. I had made some changes, but reverting to an earlier version didn't make a difference, and to my recollection I made no changes to the environment.
To verify that the breakage wasn't caused by some automatic Apple update, I set up Ubuntu 12.04 (Python 2.7.3) and 14.04 (Python 2.7.6) environments using Vagrant. Although this eliminated some variables, things still didn't work.
The weird thing is that using cURL directly works, but urllib2/httplib still gives me the 400 (requests ultimately goes through httplib as well). It is broken under Python 2.7, but Python 3.4 still works fine.
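For reference, the SSL layer can be exercised without urllib2/httplib at all. The following is just a sketch (same host, port, and key paths as above; the hand-rolled GET is the bare minimum to get a response out of Nginx):

#!/usr/bin/env python2.7

import socket
import ssl

cert_filepath = '/var/lib/rt_data/ssl/rt.crt.pem'
key_filepath = '/var/lib/rt_data/ssl/rt.private_key.pem'

# Open a TCP connection and wrap it with the client certificate attached.
# cert_reqs=ssl.CERT_NONE mirrors the disabled server verification above.
sock = socket.create_connection(('deploy_api.local', 8443))
tls = ssl.wrap_socket(sock,
                      keyfile=key_filepath,
                      certfile=cert_filepath,
                      cert_reqs=ssl.CERT_NONE)

# A minimal GET, just to see whether Nginx answers 200 or 400.
tls.sendall('GET /auth/admin/1/hosts HTTP/1.1\r\n'
            'Host: deploy_api.local:8443\r\n'
            'Connection: close\r\n'
            '\r\n')

# Print the first chunk of the response.
print(tls.recv(4096))
tls.close()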
cURL (works):
curl -s -v -X GET -k --cert /var/lib/rt_data/ssl/rt.crt.pem --key /var/lib/rt_data/ssl/rt.private_key.pem https://deploy_api.local:8443/auth/admin/1/hosts
urllib2/httplib under Python 2.7 (broken):
#!/usr/bin/env python2.7

import urllib2
import httplib

cert_filepath = '/var/lib/rt_data/ssl/rt.crt.pem'
key_filepath = '/var/lib/rt_data/ssl/rt.private_key.pem'
url = 'https://deploy_api.local:8443/auth/admin/1/hosts'

class HTTPSClientAuthHandler(urllib2.HTTPSHandler):
    """Wrapper to allow for authenticated SSL connections."""

    def __init__(self, key, cert):
        urllib2.HTTPSHandler.__init__(self)
        self.key = key
        self.cert = cert

    def https_open(self, req):
        # Rather than pass in a reference to a connection class, we pass in
        # a reference to a function which, for all intents and purposes,
        # will behave as a constructor.
        return self.do_open(self.getConnection, req)

    def getConnection(self, host, timeout=300):
        # Forward the timeout so the keyword argument passed by do_open
        # isn't silently dropped.
        return httplib.HTTPSConnection(host,
                                       key_file=self.key,
                                       cert_file=self.cert,
                                       timeout=timeout)

opener = urllib2.build_opener(HTTPSClientAuthHandler(key_filepath, cert_filepath))
response = opener.open(url)
response_data = response.read()

print(response_data)
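As an aside, on Python 2.7.9+ the custom handler wouldn't be needed, since urllib2.HTTPSHandler accepts an SSL context directly. That doesn't apply to the 2.7.3/2.7.6 environments above; this is only a sketch for completeness:

import ssl
import urllib2

# Requires Python 2.7.9+. check_hostname must be disabled before
# verify_mode can be set to CERT_NONE (server verification off, as above).
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
ctx.load_cert_chain('/var/lib/rt_data/ssl/rt.crt.pem',
                    '/var/lib/rt_data/ssl/rt.private_key.pem')

opener = urllib2.build_opener(urllib2.HTTPSHandler(context=ctx))
print(opener.open('https://deploy_api.local:8443/auth/admin/1/hosts').read())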
It still works from Python 3.4. This version uses requests (which we prefer anyway), because urllib2 is no longer available under Python 3.4:
import requests
cert_filepath = '/var/lib/rt_data/ssl/rt.crt.pem'
key_filepath = '/var/lib/rt_data/ssl/rt.private_key.pem'
url = 'https://deploy_api.local:8443/auth/admin/1/hosts'
r = requests.get(url,
                 cert=(cert_filepath, key_filepath),
                 verify=False)
For reference, this is the Nginx config:
server {
    listen 8443 ssl;

    ssl on;
    ssl_certificate        /vagrant/ssl/deploy/deploy_api.crt.pem;
    ssl_certificate_key    /vagrant/ssl/deploy/deploy_api.key.pem;
    ssl_client_certificate /vagrant/ssl/ca_cert.pem;
    ssl_verify_client      on;
    ssl_verify_depth       1;

    location ^~ /auth/ {
        # ...
    }

    location / {
        return 403;
    }
}
If I comment out ssl_verify_client, the request passes, but the client certificate is not processed.
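A middle ground that helps when debugging this (a sketch of just the relevant directives, not my actual config): ssl_verify_client optional avoids the 400 while still recording whether a certificate arrived, via the $ssl_client_verify variable:

ssl_verify_client optional;

location ^~ /auth/ {
    # $ssl_client_verify is "SUCCESS", "FAILED:<reason>", or "NONE"
    add_header X-Client-Verify $ssl_client_verify;
    # ...
}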
The response when I get a 400 is:
vagrant@deploy:/deployment/deploy_scripts/dev$ ./curl_ds_host_shell.py
<html>
<head><title>400 No required SSL certificate was sent</title></head>
<body bgcolor="white">
<center><h1>400 Bad Request</h1></center>
<center>No required SSL certificate was sent</center>
<hr><center>nginx/1.4.6 (Ubuntu)</center>
</body>
</html>
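For what it's worth, the handshake itself can be inspected with openssl s_client (standard flags; output omitted). The "Acceptable client certificate CA names" section of its output shows which CAs Nginx advertises, which should match ca_cert.pem above:

openssl s_client -connect deploy_api.local:8443 -cert /var/lib/rt_data/ssl/rt.crt.pem -key /var/lib/rt_data/ssl/rt.private_key.pem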
Therefore, I think urllib2/httplib is omitting the certificate for some reason. However, the clients still work against the production server, from both my local machine and the Ubuntu instances under Vagrant. The failure only occurs against the local server, under both the Mac and the Vagrant instances.
Does anyone have any idea what's going on?