I'm using Terraform to ensure a consistent configuration for my GitHub repos using the Terraform GitHub Provider. Basically, this works fine.
But now I'm experiencing some undesired behavior with my Terraform state. I develop from my local workstation and from a Gitpod, and I also have a GitHub Actions workflow which applies my Terraform config regularly. So I have three points from which I trigger Terraform.
My terraform.tfstate file is not part of my repository due to security concerns (secrets are stored in plain text inside this file).
Now my question is: how can I handle the terraform state most effectively? I would like to somehow store my terraform state in a remote location which is accessible from all 3 points. But I do not want to set up an S3 bucket or anything similar that would cost some money.
My first idea is to somehow store the terraform.tfstate file in my Google Drive and make it available when I apply my Terraform config, preferably having Terraform write directly to the file in Google Drive. But I don't know how to do this, and I don't know whether this is even a good idea.
The other idea is to store the terraform.tfstate file in a private Git repository: clone the repo and copy the terraform.tfstate file to the correct location before I apply any config, then push the terraform.tfstate file back into the private repo afterwards. But again, I don't know if this is a good way to go.
I assume both ideas are not really good practice? Does anyone have a better idea or a hint? Maybe there is some sort of free ($0) storage in some cloud? I'm thinking AWS or GCP, or rather Linode or DigitalOcean (because they are smaller and easier to handle). I would prefer a free cloud solution because I don't need the Terraform state 24/7, only on demand when I apply a configuration.
I'd appreciate any help, ideas, or discussion. Thanks everyone and best regards. Sebastian
The amount of data transfer, and the size of the state file, would almost certainly fall within the free-tier of AWS.
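For reference, pointing Terraform at an S3 bucket only takes a short backend block. This is a sketch; the bucket name, key, and region are placeholders, and it assumes you have created the bucket and that AWS credentials are available in all three environments (workstation, Gitpod, and GitHub Actions):

```hcl
terraform {
  backend "s3" {
    bucket  = "my-terraform-state-bucket"       # hypothetical bucket name
    key     = "github-repos/terraform.tfstate"  # path to the state object inside the bucket
    region  = "eu-central-1"                    # pick your region
    encrypt = true                              # encrypt state at rest, since it contains secrets
  }
}
```

After adding this block, running `terraform init -migrate-state` will move your existing local state into the bucket, and all three environments will then read and write the same state.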