I'm trying to dockerize mlflow with PostgreSQL and nginx configurations for Google Cloud Run (GCR) on the Google Cloud Platform (GCP).
Before deploying anything to GCP, however, I wanted to get a local deployment working. I found this guide that details the process of setting up the environment. Having followed the guide (excluding the SQL part), I can see the mlflow UI on localhost:80
as nginx redirects traffic on port 80 to 5000. To add authentication, I found here that I can do it using sudo htpasswd -c .htpasswd <username>
in the /etc/nginx/
directory and then adding
location / {
    auth_basic           "Private Property";
    auth_basic_user_file .htpasswd;
}
to the nginx.conf
(or mlflow.conf
in this case) so that the authentication prompt appears. Trouble is, when I go to localhost:80
now and enter my username/password, I continue to see
[error] 6#6: *1 open() "/etc/nginx/.htpasswd" failed (2: No such file or directory)
in the docker-compose up
logs as they are printed to the terminal, and as such I'm not able to see the mlflow UI on localhost:80
(either a blank screen or an nginx 403 error).
Now, I've looked at several other posts (such as this one and this one), and it seems that either nginx doesn't have the right permissions to read the .htpasswd file
in the /etc/nginx/
directory, or that the path to the file isn't correct, i.e. the path has to be specified relative to the nginx.conf
file.
Even though I made these corrections to the files from the towards-data-science guide above, the problem still persists. I've been stuck on this for a while. Any particular reason why this might be happening?
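For reference, here is a minimal sketch of the kind of server block I mean in mlflow.conf, using the absolute path instead of the relative one (the upstream host name mlflow is just a placeholder for whatever the service is called in docker-compose):

server {
    listen 80;
    server_name localhost;

    location / {
        # basic auth; the absolute path matches the one nginx complains about above
        auth_basic           "Private Property";
        auth_basic_user_file /etc/nginx/.htpasswd;

        # forward traffic from port 80 to the mlflow server on port 5000
        proxy_pass http://mlflow:5000;
    }
}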
Edit: Here is my directory structure in case it may help:
mlflow-docker/
    mlflow/
        Dockerfile
    nginx/
        Dockerfile
        mlflow.conf
        nginx.conf
    docker-compose.yml
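And, roughly, what docker-compose.yml wires together (service names and the exact options here are approximations, not a verbatim copy of my file):

version: "3.7"
services:
  mlflow:
    build: ./mlflow
    expose:
      - "5000"
  nginx:
    build: ./nginx
    ports:
      - "80:80"
    depends_on:
      - mlflow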
You need to add the .htpasswd file into the nginx container's file system.
Generate the password file in your project's nginx folder.
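For example, reusing the htpasswd command from the question, run from inside the project's nginx folder (htpasswd is provided by the apache2-utils package on Debian/Ubuntu hosts):

cd nginx/
sudo htpasswd -c .htpasswd <username>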
Copy the password file into the nginx container by adding a COPY instruction to the nginx Dockerfile.
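For example, assuming the stock nginx image layout (so the destination matches the /etc/nginx/.htpasswd path in the error message):

# .htpasswd sits next to this Dockerfile in the build context
COPY .htpasswd /etc/nginx/.htpasswd

Then rebuild with docker-compose up --build so the file actually lands inside the container.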