I want to synchronise files from a location in Perforce to an S3 bucket, and I've set up a Jenkins job that uses s3cmd to do the sync.
The problem I'm having is that the automatic MIME-type detection isn't working the way I'd like. Is there a relatively straightforward way to override the detection with my own MIME-type mapping? Say I want all .xml.gz files to be typed as application/x-gzip.
How do I do this without rolling my own equivalent of s3cmd sync? Is there a way to do it with s3cmd, or is there another tool for syncing a folder to S3 that offers this kind of control?
EDIT:
This isn't what I was looking for, but if anyone else hits this problem, it at least gets past the issue. I modified S3.py, and after the snippet that looks like this:
if not content_type:
    content_type = self.config.default_mime_type
I added:
# JUN-DAI'S HACK to get .gz mimetypes correct.
# I couldn't find another way to do this, as the mimetypes library strips
# out the ".gz" suffix and determines the mimetype of the underlying file
# without it.
if filename.endswith(".gz"):
    content_type = "application/x-gzip"
    print "Setting content-type of {0} to {1}".format(filename, content_type)
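The behaviour the comment describes can be reproduced outside s3cmd: Python's `mimetypes` module treats ".gz" as a transfer *encoding* rather than a type, so it strips the suffix and guesses the type of the inner file. A minimal standalone sketch (modern Python 3; function name is my own, not part of s3cmd) of forcing the gzip type:

```python
import mimetypes

def guess_upload_type(filename, default="application/octet-stream"):
    """Guess a Content-Type, forcing gzip-compressed files to application/x-gzip.

    mimetypes reports ".gz" as an encoding, not a type: guess_type() strips
    the suffix and returns the type of the underlying file instead.
    """
    content_type, encoding = mimetypes.guess_type(filename)
    if encoding == "gzip":  # covers .gz, .tgz, .xml.gz, ...
        return "application/x-gzip"
    return content_type or default

print(mimetypes.guess_type("report.xml.gz"))  # type of the inner .xml file, encoding 'gzip'
print(guess_upload_type("report.xml.gz"))     # application/x-gzip
```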
Now you can use the options
--no-guess-mime-type --default-mime-type "application/x-gzip"
with s3cmd sync or put.
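If patching S3.py is undesirable, those two flags can also be combined with s3cmd's filter options in a two-pass sync. This is a sketch, assuming a reasonably recent s3cmd with `--exclude`/`--include` support; the local directory and bucket path are placeholders:

```shell
# Pass 1: sync everything except gzipped files, with normal MIME guessing.
s3cmd sync --exclude '*.gz' ./export/ s3://my-bucket/files/

# Pass 2: sync only the .gz files, forcing the Content-Type.
s3cmd sync --exclude '*' --include '*.gz' \
    --no-guess-mime-type --default-mime-type 'application/x-gzip' \
    ./export/ s3://my-bucket/files/
```

The order matters: `--exclude '*'` drops everything, then `--include '*.gz'` re-admits only the gzipped files, so the forced default MIME type never touches the rest of the tree.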