# Troubleshooting

Common issues and their solutions.
## Terraform Errors

### "No value for required variable"

**Error Message:**

```
Error: No value for required variable
│ on variables.tf line 1:
│ 1: variable "dbt_account_id" {
│
│ The root module input variable "dbt_account_id" is not set
```

**Causes:**

- Environment variables not loaded
- Incorrect variable name
- Missing `.env` file

**Solutions:**

1. Load environment variables.
2. Check that variable names match.
3. Pass the variable directly on the command line.
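The three options above can be sketched as shell commands (the variable value is illustrative):

```shell
# 1. Load environment variables from .env (export everything it defines)
set -a
source .env
set +a

# 2. Terraform reads TF_VAR_<name>; the suffix must match the variable name exactly
export TF_VAR_dbt_account_id="12345"

# 3. Or pass the value directly on the command line
terraform plan -var="dbt_account_id=12345"
```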
### "Invalid JSON for token_map"

**Error Message:**

**Cause:** `token_map` isn't valid JSON.

**Solutions:**

```shell
# ❌ Multi-line won't work in .env
export TF_VAR_token_map='{
  "key": "value"
}'

# ✅ Single line, use single quotes
export TF_VAR_token_map='{"key":"value","key2":"value2"}'
```

```hcl
# ✅ Or use terraform.tfvars (HCL syntax)
token_map = {
  key  = "value"
  key2 = "value2"
}
```
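One way to catch malformed JSON early is to parse the variable before Terraform does; a minimal check, assuming `python3` is available:

```shell
export TF_VAR_token_map='{"key":"value","key2":"value2"}'

# json.load exits nonzero on invalid JSON, so this doubles as a quick gate
echo "$TF_VAR_token_map" | python3 -c "import json, sys; json.load(sys.stdin); print('valid JSON')"
```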
### "State Lock Timeout"

**Error Message:**

```
Error: Error acquiring the state lock
│ Lock Info:
│   ID:        abc123...
│   Operation: OperationTypePlan
│   Who:       user@hostname
```

**Causes:**

- Previous Terraform run didn't finish
- Crashed Terraform process
- Concurrent runs

**Solutions:**

1. Wait for the other operation to complete.
2. Verify no other runs are active.
3. Force unlock (use carefully).

> **Force Unlock Warning:** Only force unlock if you're CERTAIN no other process is running. Forcing the lock while another process holds it can corrupt state.
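In commands, the two safe-to-risky options look like this (the lock ID comes from the error message):

```shell
# Keep retrying while waiting for the lock to clear
terraform plan -lock-timeout=5m

# Last resort: release the lock using the ID shown in the error
terraform force-unlock abc123...
```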
### "Module not found"

**Error Message:**

**Cause:** Terraform modules have not been downloaded.

**Solution:**
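Re-downloading the modules fixes this in most cases:

```shell
# Download (or re-download) modules and providers
terraform init

# Or refresh modules only
terraform get -update
```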
## dbt Cloud API Errors

### "401 Unauthorized"

**Error Message:**

**Causes:**

- Invalid API token
- Expired token
- Wrong dbt Cloud account

**Solutions:**

1. Regenerate the token: go to your dbt Cloud Profile and create a new API token.
2. Update `.env` or your CI/CD secrets.
3. Verify the account ID.
4. Check the token format.
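One way to verify a token outside Terraform is to call the dbt Cloud API directly; this sketch assumes the multi-tenant US host and a token in `DBT_CLOUD_TOKEN`:

```shell
# A 200 response listing your account means the token and account are valid
curl -s -H "Authorization: Token ${DBT_CLOUD_TOKEN}" \
  "https://cloud.getdbt.com/api/v2/accounts/"
```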
### "403 Forbidden"

**Error Message:**

**Cause:** The token doesn't have the required permissions.

**Solutions:**

- Use an account-level token (not a project-level token)
- Check token permissions in dbt Cloud
- Create a service account token with admin permissions
### "Connection Not Found"

**Error Message:**

**Cause:** The `connection_key` in your environment doesn't match any `global_connections[].key`.

**Solutions:**

1. Check your `global_connections` keys.
2. Make sure your environment references one of them exactly.
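The relationship looks like this in YAML (names are illustrative):

```yaml
global_connections:
  - key: my_databricks_conn       # <- this key...
    # ...connection settings...

environments:
  - name: Production
    key: prod
    connection_key: my_databricks_conn  # ...must match exactly
```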
## YAML Configuration Errors

### "YAML Parse Error"

**Error Message:**

**Causes:**

- Invalid YAML syntax
- Incorrect indentation
- Missing quotes

**Solutions:**

1. Validate the YAML syntax.
2. Check indentation (2 spaces, no tabs).
3. Quote special characters.
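A quick syntax check before running Terraform, assuming `python3` with PyYAML installed (a tool like `yamllint` works too; the filename is illustrative):

```shell
python3 -c "import yaml; yaml.safe_load(open('dbt_config.yml')); print('YAML OK')"
```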
### "Missing Required Field"

**Error Message:**

**Cause:** Required field not provided in YAML.

**Solutions:**

Check the YAML Schema for required fields:

```yaml
environments:
  - name: Production            # Required
    key: prod                   # Required
    type: deployment            # Required
    deployment_type: production # Required for deployment envs
    connection_key: my_conn     # References global_connections[].key
    credential:
      credential_type: databricks
      schema: analytics
```
### "Invalid Enum Value"

**Error Message:**

**Cause:** Used an invalid value for a restricted field.

**Solutions:**

```yaml
# ❌ Wrong
type: "prod"        # Not a valid option

# ✅ Correct
type: "deployment"  # Must be 'development' or 'deployment'
```
## Git Integration Errors

### "GitHub Installation ID Not Found"

**Error Message:**

**Causes:**

- Invalid installation ID
- GitHub App not installed on the repository
- Using the wrong `git_clone_strategy`

**Solutions:**

1. Find the installation ID: go to GitHub Settings > Applications > dbt Cloud and check the URL: `/settings/installations/{INSTALLATION_ID}`
2. Install the GitHub App: in dbt Cloud, go to Admin > Integrations > GitHub, click "Connect GitHub App", and authorize it for your repositories.
3. Update the YAML.
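A sketch of the repository entry, assuming the module exposes fields like these (the field layout and values are illustrative; only `git_clone_strategy` is named in the error above):

```yaml
repositories:
  - name: analytics
    git_clone_strategy: github_app    # hypothetical field layout
    github_installation_id: 12345678  # numeric ID from the GitHub URL
```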
### "GitLab Project ID Not Found"

**Error Message:**

**Cause:** Invalid GitLab project ID.

**Solutions:**

1. Find the project ID: in GitLab, go to Project > Settings > General and look for "Project ID" (a numeric value).
2. Update the YAML.
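A sketch of the repository entry, with a hypothetical field name and an illustrative ID:

```yaml
repositories:
  - name: analytics
    gitlab_project_id: 43210  # numeric "Project ID" from GitLab settings
```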
## Credential Errors

### "Token Not Found in token_map"

**Error Message:**

**Cause:** The YAML references a token key that doesn't exist in `token_map`.

**Solutions:**

**Option A — Using `environment_credentials` (recommended):**

```yaml
# In YAML: environment with key: prod in project with key: analytics
environments:
  - name: Production
    key: prod
    credential:
      credential_type: databricks
      catalog: main
      schema: analytics
```

```shell
# Key is "{project_key}_{env_key}"
export TF_VAR_environment_credentials='{"analytics_prod": {"credential_type": "databricks", "token": "dapi..."}}'
```

**Option B — Using `token_map` (legacy Databricks):**
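A minimal sketch of the legacy `token_map` shape; the key name is illustrative and must match whatever your YAML references:

```shell
# Legacy shape: token_map maps a key your YAML references to a Databricks token
export TF_VAR_token_map='{"prod": "dapi..."}'

# Sanity-check that it parses as JSON and list the keys
echo "$TF_VAR_token_map" | python3 -c "import json, sys; print(sorted(json.load(sys.stdin)))"
```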
## CI/CD Issues

### "Terraform Init Failed"

**Problem:** CI/CD can't initialize Terraform.

**Solutions:**

1. Check the Terraform version.
2. Verify backend access.
### "Secrets Not Available"

**Problem:** Environment variables are undefined in CI/CD.

**Solutions:**

1. Verify the secrets are defined:
    - GitHub: Settings > Secrets and variables > Actions
    - GitLab: Settings > CI/CD > Variables
    - Azure DevOps: Pipelines > Library
2. Check that secret names match what the workflow references.
3. Ensure the workflow has access to the secrets.
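For GitHub Actions, for example, each secret has to be mapped into the job's environment under the exact `TF_VAR_` name (the secret names here are illustrative):

```yaml
# .github/workflows/terraform.yml (fragment)
env:
  TF_VAR_dbt_account_id: ${{ secrets.DBT_ACCOUNT_ID }}
  TF_VAR_token_map: ${{ secrets.TOKEN_MAP }}
```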
## Performance Issues

### "Terraform Plan Takes Too Long"

**Problem:** `terraform plan` runs for minutes.

**Solutions:**

1. Use targeted plans.
2. Increase parallelism.
3. Upgrade Terraform.
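The first two options as commands (the module address is illustrative):

```shell
# Plan only the resources under one module
terraform plan -target=module.dbt_environments

# Raise concurrent operations above the default of 10
terraform plan -parallelism=20
```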
### "State Refresh Slow"

**Problem:** State refresh takes a long time.

**Solutions:**

1. Skip the refresh when it's safe to do so.
2. Use remote state caching.
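Skipping the refresh looks like this:

```shell
# Skip the refresh step entirely (only when you know state matches reality)
terraform plan -refresh=false
```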
## Debug Mode

Enable detailed logging:

```shell
# Set debug level
export TF_LOG=DEBUG
export TF_LOG_PATH=./terraform-debug.log

# Run terraform
terraform plan

# Review logs
cat terraform-debug.log
```
## Getting Help

Still stuck? Here's how to get support:

### 1. Check Existing Issues

Search GitHub Issues for similar problems.

### 2. Create a Detailed Issue

Include:

- The full error message output
- Terraform version: `terraform version`
- Module version: check your `source` URL
- A minimal config that reproduces the issue
- What you've tried already

### Community Support
## Common Mistakes

### ❌ Hardcoding Credentials

### ❌ Committing the .env File

### ❌ Using Root Tokens

### ❌ Not Pinning Versions

```hcl
# ❌ Bad
module "dbt" {
  source = "git::https://github.com/..."
}

# ✅ Good
module "dbt" {
  source = "git::https://github.com/...?ref=v1.0.0"
}
```
## Prevention Checklist

Before deployment:

- Run `terraform fmt -check`
- Run `terraform validate`
- Review `terraform plan` output
- Test in non-production first
- Have a rollback plan ready
- Monitor logs during apply
- Verify resources in the dbt Cloud UI
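The first three checks can run as a single gate, locally or in CI:

```shell
terraform fmt -check -recursive
terraform validate
terraform plan -out=tfplan
```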
## Need More Help?

- **Documentation**: review the full documentation
- **GitHub Issues**: report bugs or request features
- **Discussions**: ask questions and share ideas