Cert already in hash table exception


I am using ChefDK version 12. I have done the basic setup and uploaded many cookbooks, and I am currently using remote_directory in my default.rb. What I have observed is that whenever there are too many files, or the hierarchy is too deep, in the directory, the upload fails with the exception below:
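For context, a remote_directory resource looks roughly like the sketch below (cookbook path and target directory are hypothetical). Every file under the cookbook's files/ tree that it references is part of the cookbook upload, which is why a large hierarchy inflates the upload:

```ruby
# Hypothetical recipe fragment: copies everything under
# files/default/app/ in this cookbook to /opt/app on the node.
remote_directory '/opt/app' do
  source 'app'            # maps to files/default/app/ in the cookbook
  owner  'root'
  group  'root'
  mode   '0755'
  files_mode '0644'       # mode applied to each copied file
  action :create
end
```

This fragment only runs inside a Chef Client converge; it is shown here to illustrate the resource the question refers to.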

ERROR: SSL Validation failure connecting to host: xyz.com - SSL_write: cert already in hash table
ERROR: Could not establish a secure connection to the server.
Use `knife ssl check` to troubleshoot your SSL configuration.
If your Chef Server uses a self-signed certificate, you can use
`knife ssl fetch` to make knife trust the server's certificates. 
Original Exception: OpenSSL::SSL::SSLError: SSL_write: cert already in hash table 

As mentioned earlier, connecting to the server isn't the problem; it happens only when there are too many files or the hierarchy is deep. Can you please suggest what I can do? I have searched online for solutions but have not found one.

I have checked the question here, but it doesn't solve my problem: Chef uses embedded Ruby and OpenSSL, for people not working with Chef.

Some updates based on Tensibai's suggestion: the exceptions have changed since adding the --concurrency 1 option. Initially I received: INFO: HTTP Request Returned 403 Forbidden: ERROR: Failed to upload filepath\file (7a81e65b51f0d514ec645da49de6417d) to example.com:443/bookshelf/… 3088476d373416dfbaf187590b5d5687210a75&Expires=1435139052&Signature=SP/70MZP4C2UdUd9%2B5Ct1jEV1EQ%3D : 403 "Forbidden" <?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code><Message>Access Denied</Message>

Then yesterday it changed to: INFO: HTTP Request Returned 413 Request Entity Too Large: error ERROR: Request Entity Too Large Response: JSON must be no more than 1000000 bytes.
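The 413 suggests the cookbook's file manifest, sent to the server as a single JSON document, has outgrown the 1,000,000-byte limit the error names. Each file in the cookbook adds an entry to that manifest, so enough files push it past the cap. A rough back-of-the-envelope sketch (the entry shape below is illustrative, not Chef's exact schema):

```ruby
require 'json'

# Assumed manifest shape: each cookbook file contributes an entry
# with a name, path, and checksum to the JSON sent at upload time.
def manifest_bytes(file_count)
  entry = {
    'name'        => 'files/default/app/config-000.properties',
    'path'        => 'files/default/app/config-000.properties',
    'checksum'    => 'd41d8cd98f00b204e9800998ecf8427e',
    'specificity' => 'default'
  }
  { 'cookbook_name' => 'demo',
    'files'         => Array.new(file_count) { entry } }.to_json.bytesize
end

[100, 1_000, 10_000].each do |n|
  puts "#{n} files -> ~#{manifest_bytes(n)} bytes of manifest JSON"
end
```

At roughly 150-200 bytes per entry, a cookbook with on the order of ten thousand files overshoots a 1,000,000-byte limit regardless of how small the files themselves are.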

Should I decrease the number of files, or is there any other option?

`knife --version` reports Chef: 12.3.0


There are 2 answers

Tensibai On

Should i decrease the number of files or is there any other option?

Usually the files inside a cookbook are not meant to be too large or too numerous; if you have a lot of files to distribute, it's a sign you should change the way you distribute those files.

One option could be to make a tarball, but this makes it harder to manage deleted files.
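As a sketch of the tarball approach (the URL, checksum, and paths below are hypothetical), the recipe fetches one archive from an external store and unpacks it, so the cookbook itself stays small:

```ruby
# Hypothetical: one archive replaces hundreds of cookbook files.
remote_file '/var/cache/app-payload.tar.gz' do
  source 'https://repo.example.com/app-payload.tar.gz'
  checksum 'aabbcc...' # placeholder; skips the download when unchanged
end

execute 'extract app payload' do
  command 'tar xzf /var/cache/app-payload.tar.gz -C /opt/app'
  action :nothing
  subscribes :run, 'remote_file[/var/cache/app-payload.tar.gz]', :immediately
end
```

Note the drawback mentioned above: files removed from the tarball are not removed from /opt/app by this recipe; you would need extra cleanup logic for that.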

Another option, if you're on an internal Chef server, is to follow the advice here and change the client_max_body_size 2M; value for nginx, but I can't guarantee it will work.
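For reference, this is the nginx directive the advice refers to, shown in a generic server block (on a Chef server the bundled nginx config is generated, so the change belongs in the server's own configuration and a reconfigure rather than a hand-edit of the rendered file):

```nginx
# Sketch only: raise client_max_body_size if uploads are rejected
# by nginx with 413 Request Entity Too Large. The default shown is 2M.
server {
    listen 443 ssl;
    client_max_body_size 2M;
}
```

Note that the "JSON must be no more than 1000000 bytes" message in the question comes from the Chef server application itself, so the nginx limit may not be the one being hit.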

Innocent Anigbo On

I had the same error. I ran chef-server-ctl reconfigure on the Chef server, then tried uploading the cookbook again, and everything started working fine.