GCP Account feature is not suitable for terraform backend GCS configuration

Hello,
After adopting the new GCP accounts feature for Terraform, we have a problem with authorization to the GCS bucket that holds our state:

  1. What?
    After removing the credentials properties from the google provider and the Terraform backend configuration, there is a problem with authorization to GCS for the Terraform state: the Octopus Terraform step cannot read the remote state. Configuration:
terraform {
  backend "gcs" {
    bucket = "somethink"
    prefix = "somethink"
  }
}
  2. Why?
    After some research we figured out that Octopus injects the credentials as the GOOGLE_CLOUD_KEYFILE_JSON environment variable. This variable is suitable for the Google providers, but not for the GCS backend configuration. In the docs we can read that this environment variable goes by different names:
    Backend Type: gcs | Terraform | HashiCorp Developer
  • credentials / GOOGLE_BACKEND_CREDENTIALS / GOOGLE_CREDENTIALS - (Optional) Local path to Google Cloud Platform account credentials in JSON format. If unset, Google Application Default Credentials are used. The provided credentials must have Storage Object Admin role on the bucket. Warning: if using the Google Cloud Platform provider as well, it will also pick up the GOOGLE_CREDENTIALS environment variable.
  3. Workaround
    Our current workaround is to pass the credentials via a file (key.json) that contains the decoded JSON key; a rough sketch of the wiring follows the configuration below. The file content is:
#{GCP.Account.JsonKey | FromBase64}
terraform {
  backend "gcs" {
    bucket = "somethink"
    prefix = "somethink"
    credentials = "key.json"
  }
}
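
A rough sketch of how the workaround can be wired, assuming a bash script step that runs in the same working directory as the Terraform files and that Octopus applies #{...} variable substitution to the script body before it runs (the step layout may differ in other setups):

# Pre-init step: write the decoded service-account key to the file that the
# backend block above references via credentials = "key.json".
# The quoted heredoc marker prevents the shell from expanding anything
# inside the JSON content.
cat > key.json <<'EOF'
#{GCP.Account.JsonKey | FromBase64}
EOF

# With the file in place, init can read the remote state again.
terraform init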

This is probably a bug, or we are using the feature incorrectly. Could you help us?

Hi Piotr,

If you are able to get the rest of the GCP commands running correctly, then this may be a bug that you’ve identified.

I will replicate this over the next day or so and see if I can’t get to the bottom of the issue. This may mean it will end up with the developers to patch the bug in a later version.

Thanks for all the detail.

Regards,

Dane

Hi @dane.falvo

Could you provide any feedback about this issue?

Regards,
Piotr

Hi @Piotr!

Sadly Dane has wrapped up for the day, as a member of our EU-based team, so I’m not sure where he got to with his reproduction of this. I’ll circle around with him when he’s back online tomorrow and have him give you an update ASAP.

Hi Piotr,

Sorry for the delay in getting back to you. I spent some time attempting to reproduce this issue, but I was struggling to get the same result as you did until I followed your exact steps.

Can I get you to try the following?
Could you please create a new bucket (and state file) without specifying additional credentials in the Terraform script, using only the Google account provided in the “Terraform apply” step template in Octopus Deploy?

That way, when you attempt to get the state of the resources created by Terraform, it should return this without requiring additional credentials.

Please give that a shot and let me know how it goes. You may have to manually remove the current bucket, but you will definitely have to remove the current state file or use a different backend location.
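
Something along these lines, where the bucket name is purely illustrative and I'm assuming gsutil is available wherever you run this:

# Create a fresh bucket to hold the test state (name is just an example).
gsutil mb gs://my-octopus-tf-state-test

# In the Terraform script, keep only bucket and prefix in the gcs backend block
# (no credentials argument), then re-initialise against the new location.
terraform init -reconfigure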

Regards,

Dane

Hi @dane.falvo
I’m glad to see you back.

The steps you described seem to be exactly the same as in my first post. The GCP account is set up in the Octopus Terraform plan step and removed from all Terraform scripts. I don’t understand why I need a new GCS bucket or what problem that would solve; I am trying to connect to the existing one that currently holds the state. The GCP account works fine for the Google providers google and google-beta (credentials are set up automatically) when I leave out the GCS state bucket configuration.

Regards
Piotr

Hi Piotr,

Just stepping in for Dane, hopefully I can shed some light on what is happening here.

When you initially configured the Google Backend, using the credentials inside the backend configuration, Terraform stored the backend state with the GOOGLE_BACKEND_CREDENTIALS variable configured.

However, the default variable used when credentials haven’t been explicitly configured is GOOGLE_CREDENTIALS, which is the variable the step uses to authenticate to GCP. This means that subsequent deployments that don’t explicitly set credentials in the backend configuration fail, as the backend is expecting a value for GOOGLE_BACKEND_CREDENTIALS instead of the default variable used by the step, GOOGLE_CREDENTIALS.

terraform {
  backend "gcs" {
    bucket = "somethink"
    prefix = "somethink"
    credentials = "key.json"
  }
}

To resolve this, a backend configuration that just uses GOOGLE_CREDENTIALS is required. This means configuring a backend without setting credentials inside the backend configuration, which could be done by using a new bucket or by modifying the Terraform state file. Roughly, it would look like the sketch below.
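
As a minimal sketch (assuming a bash shell and an illustrative key path; inside Octopus the step would be the one supplying GOOGLE_CREDENTIALS for you):

# The gcs backend block contains only bucket and prefix - no credentials argument.
# With nothing set inside the block, init falls back to the environment variable:
export GOOGLE_CREDENTIALS="/path/to/key.json"   # illustrative path
terraform init -reconfigure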

I believe the safest option is Dane’s suggestion of using a different bucket for the backend configuration, without setting the credentials inside, so you can test it without modifying the existing Terraform state file.

Hopefully that helps clarify what’s going on, feel free to let me know if you’d like any part explained further or you have any questions!

Best Regards,

Hi @finnian.dempsey
Thanks for your deep dive details.

Unfortunately, even when I create a new bucket and configure the Terraform gcs backend without credentials, the error still occurs. The error says that Terraform cannot read data from the bucket with the state (the same as in my first post).

Error: Failed to get existing workspaces: querying Cloud Storage failed: Get “”: metadata: GCE metadata “instance/service-accounts/default/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.read_write” not defined

It seems that Terraform tries to get a token from the metadata server. This happens when neither the credentials property nor the environment variable is set.

In the Octopus Deploy logs I see that only GOOGLE_CLOUD_KEYFILE_JSON is injected as an environment variable.

I also checked the remote state saved from a previous deployment and I cannot find any information about GOOGLE_BACKEND_CREDENTIALS in it. The credentials property is only used to read and write the Terraform state.
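
For what it’s worth, a check along these lines in a script step just before terraform init shows which of the variables mentioned in this thread actually reach the process (a sketch only, printing variable names rather than their values):

# List only the names of the GOOGLE_* variables discussed here.
env | grep -oE '^GOOGLE_(CLOUD_KEYFILE_JSON|CREDENTIALS|BACKEND_CREDENTIALS)='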

Best
Piotr

Hi Piotr,

After some more testing I have to agree, it’s definitely not behaving like it should!

I found that I received the same error even when not using Octopus and I believe it might be this open Terraform issue that’s causing this. Could you please test whether the same issue occurs when running terraform locally, outside of Octopus?

I’ve reached out internally for confirmation and I’ll keep you posted with any updates.

Best Regards,

Hi @finnian.dempsey,

As you suggested, I tested this locally. I set GOOGLE_BACKEND_CREDENTIALS and GOOGLE_CREDENTIALS in my system environment as the Google docs describe. Once again, I could not read the remote state after running terraform init. However, when I run the command inline like this: GOOGLE_CREDENTIALS="/home/piotr/Downloads/key.json" terraform init, then Terraform can read the state from the remote bucket. This looks like Terraform cannot see the environment variables set at the system level. I will try to investigate this in my free time, and if you have any other suggestions, let me know.
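
One thing I plan to check, sketched below, is whether the system-level variables are actually visible in the shell session that runs terraform (the key path is the same one as above):

# If these print nothing, the variables never reached this shell session,
# which would explain why only the inline form works.
printenv GOOGLE_CREDENTIALS
printenv GOOGLE_BACKEND_CREDENTIALS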

Best
Piotr

This topic was automatically closed 31 days after the last reply. New replies are no longer allowed.