S3 state conflicts when using Terraspace in multiple projects

I am looking at using Terraspace in our CI/CD pipeline to automate our infrastructure setup. So far things look really good. There is one place I am running into issues though.

Terraspace stores the state file in S3 using this key:
"<%= expansion(':REGION/:ENV/:BUILD_DIR/terraform.tfstate') %>"
This works well for a single deployment pipeline, but not when I create 20 different deployment pipelines, each in its own git repository and each using the same AWS account. The problem is that stacks with the same name in different repositories end up at the same state key and conflict with one another.
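
For illustration, assuming the dev env in us-east-1 and that :BUILD_DIR expands to something like stacks/network for a stack named network, that stack in two different repositories resolves to the same key:

us-east-1/dev/stacks/network/terraform.tfstate

so whichever pipeline runs second reads and overwrites the state written by the first.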

I am looking for suggestions on how best to deal with stack names that may be repeated in completely separate repositories but conflict because they share the same S3 state bucket.

There are a few ways I can think of to get around this problem.

  1. Force each stack name to be unique across my organization. This would be painful to enforce.

  2. Include the GitHub repo name as part of TS_ENV, e.g. TS_ENV=dev-my-repo-name. I don't really like this option either because it feels like a hack and breaks the shared use of the same env across disparate stacks.

  3. Include the GitHub repo name in the S3 key via an environment variable:
<%= expansion(':REGION/:ENV/' + ENV['GITHUB_REPO_NAME'] + '/:BUILD_DIR/terraform.tfstate') %>
I like this better than the first two options, but it introduces a new problem: how do I fail the Terraform run if the environment variable is not set?

Anyway, I would appreciate any suggestions you might have.

I'd probably go with option 3, the env var, and maybe call it GITHUB_REPO. One possible way to ensure it gets set is to use a boot hook:

https://terraspace.cloud/docs/config/boot/

Maybe something like:

config/boot.rb

# Fail fast at boot, before any Terraform commands run, if GITHUB_REPO is not set
unless ENV['GITHUB_REPO']
  puts "ERROR: Please set the GITHUB_REPO env var"
  exit 1
end
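
With that in place, the backend key can reference the same variable. A minimal sketch, assuming a config/terraform/backend.tf along the lines of the generated default (only the key changes; the other backend settings stay as generated):

config/terraform/backend.tf

terraform {
  backend "s3" {
    # bucket, region, encrypt, dynamodb_table, etc. left as generated
    key = "<%= expansion(':REGION/:ENV/' + ENV['GITHUB_REPO'] + '/:BUILD_DIR/terraform.tfstate') %>"
  }
}

Since the boot hook runs before Terraspace compiles the backend config, a missing GITHUB_REPO fails fast with a clear message instead of a confusing error when the ERB tries to concatenate nil into the key.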