2 terraform state files in two different S3 buckets, same code, possible?


Folks! Need some help here.

Problem statement - We have a project in which we create AWS infrastructure from Terraform code. We trigger it through Azure DevOps pipelines, the infra gets created, and the state file is stored in an S3 bucket. This works perfectly.

Now, we also run some Gradle tests locally against the AWS infra. These tests use the same Terraform code, so their state file goes to the same bucket. This is where the issue is.
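For context, the backend configuration looks roughly like this (bucket name, key, and region are placeholders):

```hcl
# Hypothetical backend block shared by the ADO pipeline and the local tests;
# both kinds of runs currently write state to the same object.
terraform {
  backend "s3" {
    bucket = "tf-state-bucket-a"          # placeholder bucket name
    key    = "project/terraform.tfstate"  # placeholder key
    region = "us-east-1"                  # placeholder region
  }
}
```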

Need - When I run the local tests, the Terraform state file MUST go to a different S3 bucket. For example: Azure DevOps pipeline - bucket A; local Gradle AWS infra tests - bucket B.

Questions -

  1. Is this even possible?
  2. How can Terraform decide where to store the state file based on whether the run is local or from ADO?

1 Answer

Answer from Maciej Rostański:

The state definition lives in the backend block. Suppose you have that definition in a backend.tf file, with bucket B as the state location. You could store another backend.tf file (with bucket A as the state location) in a subfolder, so it is not read automatically. Then, in the ADO pipeline, you can swap in that file (overwrite the main backend.tf) before running terraform init.
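A minimal sketch of that layout (file paths, bucket names, and the pipeline step are illustrative):

```hcl
# ado/backend.tf -- kept in a subfolder so plain "terraform init" ignores it;
# the root backend.tf stays pointed at bucket B for local Gradle runs.
terraform {
  backend "s3" {
    bucket = "tf-state-bucket-a"          # placeholder: ADO pipeline state bucket
    key    = "project/terraform.tfstate"  # placeholder key
    region = "us-east-1"                  # placeholder region
  }
}

# ADO pipeline step (e.g. a Bash task), run before plan/apply:
#   cp ado/backend.tf backend.tf
#   terraform init -reconfigure
```

A variation on the same idea that avoids overwriting files is Terraform's partial backend configuration: leave bucket out of backend.tf entirely and pass it at init time with terraform init -backend-config="bucket=..." per environment.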

I do not know the reason for the two different buckets (probably permissions), but you could also keep a single bucket and use workspaces (for example, default and ado), as sketched after this list, and:

  • give local users permissions to only the specific object (state file) that holds the default workspace state,
  • give the ADO runner permissions to only the specific object (state file) associated with the ado workspace.
