
Troubleshooting

Common issues and their solutions.

Terraform Errors

"No value for required variable"

Error Message:

Error: No value for required variable
│ on variables.tf line 1:
│   1: variable "dbt_account_id" {
│ The root module input variable "dbt_account_id" is not set

Causes:

  • Environment variables not loaded
  • Incorrect variable name
  • Missing .env file

Solutions:

  1. Load environment variables:

    source .env
    echo $TF_VAR_dbt_account_id  # Verify it's set
    

  2. Check variable names match:

    # In .env, must be TF_VAR_<name>
    export TF_VAR_dbt_account_id=12345
    
    # In variables.tf
    variable "dbt_account_id" {  # Name matches after TF_VAR_
      type = number
    }
    

  3. Pass directly:

    terraform plan -var="dbt_account_id=12345"
    


"Invalid JSON for token_map"

Error Message:

Error: Invalid value for input variable
│ The given value is not suitable for var.token_map

Cause: token_map isn't valid JSON.

Solutions:

# ❌ Multi-line won't work in .env
export TF_VAR_token_map='{
  "key": "value"
}'

# ✅ Single line, use single quotes
export TF_VAR_token_map='{"key":"value","key2":"value2"}'

# ✅ Or use terraform.tfvars (HCL syntax)
token_map = {
  key  = "value"
  key2 = "value2"
}
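
If you're unsure whether your single-line value is valid JSON, a quick local check catches problems before Terraform does (a standalone sketch; the keys and values are placeholders):

```python
import json

# The single-line value you plan to put in .env (placeholder contents)
raw = '{"key":"value","key2":"value2"}'

try:
    token_map = json.loads(raw)
    print(f"valid JSON with {len(token_map)} entries")
except json.JSONDecodeError as err:
    print(f"invalid JSON: {err}")
```

After sourcing .env, the same check can be run against the live variable, e.g. `python -c 'import json, os; json.loads(os.environ["TF_VAR_token_map"])'`.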

"State Lock Timeout"

Error Message:

Error: Error acquiring the state lock
│ Lock Info:
│   ID:        abc123...
│   Operation: OperationTypePlan
│   Who:       user@hostname

Causes:

  • A previous Terraform run didn't finish
  • A crashed Terraform process
  • Concurrent runs

Solutions:

  1. Wait for the other operation to complete

  2. Verify no other runs are active:

    # Check for terraform processes
    ps aux | grep terraform
    

  3. Force unlock (use carefully):

    terraform force-unlock abc123
    

Force Unlock Warning

Only force unlock if you're CERTAIN no other process is running. Breaking a lock held by a live operation can corrupt your state.


"Module not found"

Error Message:

Error: Module not installed
│ This configuration requires module "dbt_cloud"

Cause: Terraform modules not downloaded.

Solution:

terraform init
# Or if already initialized
terraform get


dbt Cloud API Errors

"401 Unauthorized"

Error Message:

Error: dbt Cloud API error: 401 Unauthorized

Causes:

  • Invalid API token
  • Expired token
  • Wrong dbt Cloud account

Solutions:

  1. Regenerate the token:

     • Go to your dbt Cloud Profile
     • Create a new API token
     • Update .env or your CI secrets

  2. Verify the account ID:

    # Check URL: https://cloud.getdbt.com/accounts/{ACCOUNT_ID}/
    echo $TF_VAR_dbt_account_id

  3. Check the token format:

    # Should start with 'dbtc_'
    echo $TF_VAR_dbt_api_token
    

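The checks in steps 2 and 3 can be scripted; a hypothetical helper (not part of the module) that only mirrors the format rules above:

```python
import re

def check_dbt_env(account_id: str, token: str) -> list[str]:
    """Mirror the manual checks above: numeric account ID, 'dbtc_' token prefix."""
    problems = []
    if not re.fullmatch(r"\d+", account_id):
        problems.append("account id must be numeric")
    if not token.startswith("dbtc_"):
        problems.append("token should start with dbtc_")
    return problems

# Placeholder values; in practice pass os.environ["TF_VAR_dbt_account_id"]
# and os.environ["TF_VAR_dbt_api_token"] after sourcing .env
print(check_dbt_env("12345", "dbtc_example"))  # → []
```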

"403 Forbidden"

Error Message:

Error: dbt Cloud API error: 403 Forbidden

Cause: Token doesn't have required permissions.

Solutions:

  1. Use an account-level token (not a project-level one)
  2. Check the token's permissions in dbt Cloud
  3. Create a service account token with admin permissions

"Connection Not Found"

Error Message:

Error: Cannot find connection with key: "my_connection"

Cause: The connection_key in your environment doesn't match any global_connections[].key.

Solutions:

  1. Check your global_connections keys:

    global_connections:
      - name: Databricks Production
        key: databricks_prod     # ← this is the key
    

  2. Make sure your environment references it correctly:

    environments:
      - name: Production
        connection_key: databricks_prod   # ← must match exactly
    
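The same cross-check can be automated; a rough sketch that works on the already-parsed config (plain dicts standing in for the YAML above):

```python
# Plain dicts standing in for the parsed YAML above (illustrative only)
config = {
    "global_connections": [
        {"name": "Databricks Production", "key": "databricks_prod"},
    ],
    "environments": [
        {"name": "Production", "connection_key": "databricks_prod"},
    ],
}

def missing_connection_keys(cfg):
    """Return environment names whose connection_key matches no connection."""
    known = {c["key"] for c in cfg.get("global_connections", [])}
    return [
        env["name"]
        for env in cfg.get("environments", [])
        if env.get("connection_key") not in known
    ]

print(missing_connection_keys(config))  # an empty list means every key resolves
```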


YAML Configuration Errors

"YAML Parse Error"

Error Message:

Error: failed to parse YAML
│ yaml: line 10: did not find expected key

Causes:

  • Invalid YAML syntax
  • Incorrect indentation
  • Missing quotes

Solutions:

  1. Validate YAML syntax:

    # Use yamllint
    yamllint dbt-config.yml
    
    # Or online validator
    # https://www.yamllint.com/
    

  2. Use consistent indentation (2 spaces by convention; tabs are invalid in YAML):

    # ❌ Wrong
    project:
        name: "test"  # 4 spaces
    
    # ✅ Correct
    project:
      name: "test"    # 2 spaces
    

  3. Quote special characters:

    # ❌ Wrong
    name: My Project: Production
    
    # ✅ Correct
    name: "My Project: Production"
    


"Missing Required Field"

Error Message:

Error: Missing required field

Cause: Required field not provided in YAML.

Solutions:

Check the YAML Schema for required fields:

environments:
  - name: Production          # Required
    key: prod                 # Required
    type: deployment          # Required
    deployment_type: production  # Required for deployment envs
    connection_key: my_conn   # References global_connections[].key
    credential:
      credential_type: databricks
      schema: analytics
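
A sketch of the kind of check this amounts to (the required-field list is taken from the comments above; the helper itself is illustrative):

```python
# Fields marked "Required" in the schema snippet above
REQUIRED_ENV_FIELDS = ("name", "key", "type")

def find_missing_fields(environment: dict) -> list[str]:
    """Return required fields absent from an environment definition."""
    missing = [f for f in REQUIRED_ENV_FIELDS if f not in environment]
    # deployment_type is additionally required for deployment environments
    if environment.get("type") == "deployment" and "deployment_type" not in environment:
        missing.append("deployment_type")
    return missing

env = {"name": "Production", "key": "prod", "type": "deployment"}
print(find_missing_fields(env))  # → ['deployment_type']
```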

"Invalid Enum Value"

Error Message:

Error: Invalid value for type: must be 'development' or 'deployment'

Cause: Used invalid value for a restricted field.

Solutions:

# ❌ Wrong
type: "prod"  # Not a valid option

# ✅ Correct
type: "deployment"  # Must be 'development' or 'deployment'
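
The validation amounts to a membership test against the allowed values; a minimal sketch:

```python
# The two values the error message above accepts for `type`
VALID_ENV_TYPES = {"development", "deployment"}

def validate_env_type(value: str) -> str:
    """Raise if `value` is not one of the allowed environment types."""
    if value not in VALID_ENV_TYPES:
        raise ValueError(
            f"invalid type {value!r}: must be 'development' or 'deployment'"
        )
    return value

validate_env_type("deployment")  # passes; validate_env_type("prod") would raise
```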

Git Integration Errors

"GitHub Installation ID Not Found"

Error Message:

Error: GitHub App installation not found

Causes:

  • Invalid installation ID
  • GitHub App not installed on the repository
  • Wrong git_clone_strategy

Solutions:

  1. Find the installation ID:

     • Go to GitHub Settings > Applications > dbt Cloud
     • Check the URL: /settings/installations/{INSTALLATION_ID}

  2. Install the GitHub App:

     • In dbt Cloud: Admin > Integrations > GitHub
     • Click "Connect GitHub App"
     • Authorize it for your repositories

  3. Update the YAML:

    repository:
      remote_url: "https://github.com/org/repo.git"
      git_clone_strategy: "github_app"
      github_installation_id: 12345678  # Your installation ID
    


"GitLab Project ID Not Found"

Error Message:

Error: GitLab project not found

Cause: Invalid GitLab project ID.

Solutions:

  1. Find the project ID:

     • In GitLab: Project > Settings > General
     • Look for "Project ID" (a numeric value)

  2. Update the YAML:

    repository:
      remote_url: "https://gitlab.com/group/repo.git"
      git_clone_strategy: "deploy_token"
      gitlab_project_id: 9876543  # Numeric ID, not name
    


Credential Errors

"Token Not Found in token_map"

Error Message:

Error: Token key 'prod_databricks_token' not found in token_map

Cause: YAML references a token key that doesn't exist in token_map.

Solutions:

Option A — Using environment_credentials (recommended):

# In YAML: environment with key: prod in project with key: analytics
environments:
  - name: Production
    key: prod
    credential:
      credential_type: databricks
      catalog: main
      schema: analytics
# Key is "{project_key}_{env_key}"
export TF_VAR_environment_credentials='{"analytics_prod": {"credential_type": "databricks", "token": "dapi..."}}'
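
The lookup key is just the two YAML keys joined with an underscore; a tiny sketch of that rule:

```python
def credential_key(project_key: str, env_key: str) -> str:
    """Build the environment_credentials lookup key: {project_key}_{env_key}."""
    return f"{project_key}_{env_key}"

# The example above: project key "analytics", environment key "prod"
print(credential_key("analytics", "prod"))  # → analytics_prod
```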

Option B — Using token_map (legacy Databricks):

credential:
  token_name: "prod_databricks_token"  # This key...
export TF_VAR_token_map='{"prod_databricks_token": "dapi_abc123"}'
#                          ↑ Must match exactly


CI/CD Issues

"Terraform Init Failed"

Problem: CI/CD can't initialize Terraform.

Solutions:

  1. Check Terraform version:

    - name: Setup Terraform
      uses: hashicorp/setup-terraform@v2
      with:
        terraform_version: 1.6.0  # Pin specific version
    

  2. Verify backend access:

    # Ensure CI has permissions to state bucket
    - name: Configure AWS Credentials
      uses: aws-actions/configure-aws-credentials@v2
      with:
        role-to-assume: arn:aws:iam::123456789012:role/terraform
    


"Secrets Not Available"

Problem: Environment variables are undefined in CI/CD.

Solutions:

  1. Verify the secrets are defined:

     • GitHub: Settings > Secrets and variables > Actions
     • GitLab: Settings > CI/CD > Variables
     • Azure DevOps: Pipelines > Library

  2. Check that secret names match:

    env:
      TF_VAR_dbt_account_id: ${{ secrets.DBT_ACCOUNT_ID }}
      TF_VAR_dbt_token: ${{ secrets.DBT_TOKEN }}
      TF_VAR_environment_credentials: ${{ secrets.ENVIRONMENT_CREDENTIALS }}
      # Secret names must match exactly ↑

  3. Ensure the workflow has access to them:

    # For protected branches
    environment:
      name: production  # Must have access to this environment's secrets
    


Performance Issues

"Terraform Plan Takes Too Long"

Problem: terraform plan runs for minutes.

Solutions:

  1. Use targeted plans:

    terraform plan -target=module.dbt_cloud.module.jobs
    

  2. Increase parallelism (Terraform's default is 10):

    terraform plan -parallelism=30
    

  3. Upgrade Terraform:

    # Newer versions have performance improvements
    terraform version
    brew upgrade terraform  # or similar
    


"State Refresh Slow"

Problem: State refresh takes a long time.

Solutions:

  1. Skip refresh when safe:

    terraform plan -refresh=false
    

  2. Reduce backend overhead:

    terraform {
      backend "s3" {
        # Skips the EC2 metadata API check; speeds up runs outside AWS
        skip_metadata_api_check = true
      }
    }
    


Debug Mode

Enable detailed logging:

# Set debug level
export TF_LOG=DEBUG
export TF_LOG_PATH=./terraform-debug.log

# Run terraform
terraform plan

# Review logs
cat terraform-debug.log

Getting Help

Still stuck? Here's how to get support:

1. Check Existing Issues

Search GitHub Issues for similar problems.

2. Create Detailed Issue

Include:

  • The full error message output
  • Your Terraform version (terraform version)
  • The module version (check your source URL)
  • A minimal config that reproduces the issue
  • What you've already tried

Common Mistakes

❌ Hardcoding Credentials

# DON'T DO THIS!
variable "dbt_token" {
  default = "dbtc_xxxxx"
}

❌ Committing .env File

# Make sure it's in .gitignore
echo ".env" >> .gitignore

❌ Using Root Tokens

# Use service accounts, not personal tokens
# Generate at: dbt Cloud > Admin > Service Tokens

❌ Not Pinning Versions

# ❌ Bad
module "dbt" {
  source = "git::https://github.com/..."
}

# ✅ Good
module "dbt" {
  source = "git::https://github.com/...?ref=v1.0.0"
}

Prevention Checklist

Before deployment:

  • Run terraform fmt -check
  • Run terraform validate
  • Review terraform plan output
  • Test in non-production first
  • Have rollback plan ready
  • Monitor logs during apply
  • Verify resources in dbt Cloud UI
