I have a Terraform template that needs to update a Kubernetes config map. To do this, I’m attempting to use the Apply a Terraform template step and a Kubernetes deployment target. However, the Apply a Terraform template step is unable to connect to the Kubernetes cluster.
Given that there are no configurable features for the Apply a Terraform template step, and thus I can’t add a pre-deployment script to set ~/.kube/config, how can I use Terraform to manage a Kubernetes cluster?
Many thanks,
David
More information
Step 1
First, a Kubernetes cluster (AWS EKS) and its associated resources are deployed using the Apply a Terraform template step. This step is working successfully.
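For context, this step provisions something along these lines (a minimal sketch — the resource and variable names are illustrative, not the actual template, which also creates the associated IAM and VPC resources):

```hcl
# Illustrative sketch only; names and variables are assumptions,
# not the real template.
resource "aws_eks_cluster" "main" {
  name     = var.cluster_name
  role_arn = aws_iam_role.eks.arn

  vpc_config {
    subnet_ids = var.subnet_ids
  }
}
```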
Step 2
Next, a deployment target is created dynamically: an AWS CLI script step describes the cluster that Terraform created, and then creates the target via the Create Kubernetes Target command. This step is also working.
Step 3
Finally, a Kubernetes config map needs to be updated through Terraform, and it’s this step that is proving problematic.
This step also uses the Apply a Terraform template step, but this time it runs on behalf of the deployment target that was created dynamically in Step 2. The hope was that the worker could use the kubectl CLI with ~/.kube/config as required (because I assume the Kubernetes deployment target sets it somewhere), but judging by the error message, that doesn’t seem to be the case:
Error: Post "http://localhost/api/v1/namespaces/kube-system/configmaps": dial tcp 127.0.0.1:80: connect: connection refused

on …/…/modules/user-management/user-management.tf line 3, in resource "kubernetes_config_map" "aws_auth":
3: resource "kubernetes_config_map" "aws_auth" {
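For what it’s worth, the error suggests the Terraform kubernetes provider is falling back to its default (localhost) because it has no connection details, i.e. it isn’t picking up anything the deployment target sets. The workaround I’d rather avoid would be configuring the provider explicitly from the EKS data sources, roughly like this (the cluster name variable is illustrative):

```hcl
# Workaround sketch: configure the kubernetes provider explicitly
# instead of relying on ~/.kube/config. var.cluster_name is illustrative.
data "aws_eks_cluster" "this" {
  name = var.cluster_name
}

data "aws_eks_cluster_auth" "this" {
  name = var.cluster_name
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.this.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.this.certificate_authority[0].data)
  token                  = data.aws_eks_cluster_auth.this.token
}
```

I’d prefer not to duplicate connection details like this, which is why I was hoping the step could use the deployment target’s credentials directly.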