
🌍 Environment Management


Manage multiple environments (Dev/Test/Prod) with proper configuration, deployment strategies, and environment-specific settings.


🏗️ Environment Strategy

Environment Setup

Environments:
├── Development (DEV)
│   ├── Data Factory: adf-project-dev
│   ├── Resource Group: rg-adf-dev
│   └── Purpose: Active development
├── Testing (TEST)
│   ├── Data Factory: adf-project-test
│   ├── Resource Group: rg-adf-test
│   └── Purpose: QA and validation
└── Production (PROD)
    ├── Data Factory: adf-project-prod
    ├── Resource Group: rg-adf-prod
    └── Purpose: Live workloads
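
Under the naming convention above, provisioning the three environments can be scripted. A minimal sketch; the region and the `az datafactory` extension calls are assumptions, and the provisioning commands are commented out so the naming helpers can be shown side-effect free:

```shell
#!/usr/bin/env bash
# Helpers that compose per-environment names from the convention above.
adf_name() { echo "adf-project-$1"; }
rg_name()  { echo "rg-adf-$1"; }

for env in dev test prod; do
  # Provisioning calls (require the `datafactory` CLI extension; region is an example):
  # az group create --name "$(rg_name "$env")" --location eastus
  # az datafactory create --resource-group "$(rg_name "$env")" \
  #   --factory-name "$(adf_name "$env")" --location eastus
  echo "$(adf_name "$env") -> $(rg_name "$env")"
done
```

Keeping the name construction in one place avoids drift between the resource-group and factory naming schemes as environments are added.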

⚙️ Configuration Management

ARM Template Parameters

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": {
      "value": "adf-project-prod"
    },
    "AzureSqlDatabase_connectionString": {
      "value": "Server=tcp:sql-prod.database.windows.net,1433;Database=salesdb;"
    },
    "AzureBlobStorage_accountName": {
      "value": "staprod"
    }
  }
}
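
Secrets such as connection strings are better kept out of parameter files entirely. ARM parameter files support Key Vault references; a sketch of the same parameter rewritten that way (the vault resource ID and secret name below are placeholders, not values from this tutorial):

```json
{
  "AzureSqlDatabase_connectionString": {
    "reference": {
      "keyVault": {
        "id": "/subscriptions/<subscription-id>/resourceGroups/rg-adf-prod/providers/Microsoft.KeyVault/vaults/kv-prod"
      },
      "secretName": "sql-connection-string"
    }
  }
}
```

With this form, the deployment identity needs get access to the vault's secrets, and the plain-text value never lands in source control.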

Environment-Specific Parameters

{
  "dev": {
    "sqlServer": "sql-dev.database.windows.net",
    "storageAccount": "stadev",
    "keyVault": "kv-dev"
  },
  "test": {
    "sqlServer": "sql-test.database.windows.net",
    "storageAccount": "statest",
    "keyVault": "kv-test"
  },
  "prod": {
    "sqlServer": "sql-prod.database.windows.net",
    "storageAccount": "staprod",
    "keyVault": "kv-prod"
  }
}
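
A release pipeline typically resolves these values by environment name. One option is to read them from a checked-in JSON file with `jq`; the pure-shell sketch below inlines the same mapping so no extra tooling is needed:

```shell
# Resolve an environment-specific setting (same values as the map above).
resolve_sql_server() {
  case "$1" in
    dev)  echo "sql-dev.database.windows.net" ;;
    test) echo "sql-test.database.windows.net" ;;
    prod) echo "sql-prod.database.windows.net" ;;
    *)    echo "unknown environment: $1" >&2; return 1 ;;
  esac
}
```

Failing loudly on an unknown environment name is deliberate: a typo should break the release, not silently deploy with an empty server name.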

🚀 Deployment Process

Pre-Production Checklist

  • Code review completed
  • Unit tests passed
  • Integration tests passed
  • Performance testing completed
  • Security scan passed
  • Documentation updated
  • Rollback plan documented

Deployment Steps

  1. Stop Triggers (Production)

    # Stop all triggers
    $triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName "rg-adf-prod" -DataFactoryName "adf-project-prod"
    $triggers | ForEach-Object { Stop-AzDataFactoryV2Trigger -ResourceGroupName "rg-adf-prod" -DataFactoryName "adf-project-prod" -Name $_.Name -Force }
    

  2. Deploy ARM Template

    az deployment group create \
      --resource-group "rg-adf-prod" \
      --template-file "ARMTemplateForFactory.json" \
      --parameters @ARMTemplateParametersForFactory.prod.json
    

  3. Start Triggers (Production)

    # Start the triggers captured in step 1
    # Note: this restarts every trigger in $triggers, including any that were
    # intentionally disabled before the deployment; filter the list if needed
    $triggers | ForEach-Object { Start-AzDataFactoryV2Trigger -ResourceGroupName "rg-adf-prod" -DataFactoryName "adf-project-prod" -Name $_.Name -Force }
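
The deployment in step 2 can be parameterized by environment so the same script serves Dev, Test, and Prod. A sketch; the parameter-file naming follows the `.prod.json` pattern shown above, and the trigger stop/start around it is still done with the PowerShell cmdlets from steps 1 and 3:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Compose the per-environment parameter file name used in step 2,
# e.g. param_file prod -> ARMTemplateParametersForFactory.prod.json
param_file() { echo "ARMTemplateParametersForFactory.$1.json"; }

deploy() {
  local env="$1"
  az deployment group create \
    --resource-group "rg-adf-$env" \
    --template-file "ARMTemplateForFactory.json" \
    --parameters "@$(param_file "$env")"
}
```

Called as `deploy test` or `deploy prod`, this keeps the template identical across environments and isolates all differences in the parameter files.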
    

📊 Environment Variables

Use Global Parameters

Global parameters are defined per factory, so each environment's factory carries its own value; the map below summarizes the value that belongs in each environment.

{
  "environment": {
    "dev": {
      "type": "string",
      "value": "development"
    },
    "test": {
      "type": "string",
      "value": "testing"
    },
    "prod": {
      "type": "string",
      "value": "production"
    }
  }
}
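
Inside a pipeline, a global parameter is read with the `@pipeline().globalParameters.<name>` expression. A sketch of an activity using it (the activity and variable names here are illustrative):

```json
{
  "name": "LogEnvironment",
  "type": "SetVariable",
  "typeProperties": {
    "variableName": "currentEnvironment",
    "value": "@pipeline().globalParameters.environment"
  }
}
```

Because the expression resolves at run time in whichever factory hosts the pipeline, the same pipeline definition behaves correctly in Dev, Test, and Prod without modification.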

🎯 Best Practices

Environment Isolation

  • Separate subscriptions or resource groups
  • Different service principals per environment
  • Isolated networks (VNets)

Configuration Management

  • Store environment configs in Key Vault
  • Use ARM template parameters
  • Implement global parameters
  • Version control all configurations

Deployment Automation

  • Automated testing before deployment
  • Blue-green deployment strategy
  • Automated rollback capability
  • Deployment notifications

Security

  • Least privilege access per environment
  • Separate service principals
  • Environment-specific managed identities
  • Regular security audits

✅ Summary

Congratulations! You've completed the Azure Data Factory Tutorial Series.

What You've Learned

  • ✅ ADF fundamentals and architecture
  • ✅ Environment setup and configuration
  • ✅ Integration runtime setup
  • ✅ Multi-source data integration
  • ✅ Secure connectivity patterns
  • ✅ Pipeline development and orchestration
  • ✅ Data transformation techniques
  • ✅ Error handling and monitoring
  • ✅ CI/CD implementation
  • ✅ Environment management

Next Steps

  • Build Production Pipelines: Apply learnings to real projects
  • Explore Advanced Features: Azure Purview integration, Delta Lake
  • Join Community: Participate in forums and user groups
  • Continuous Learning: Stay updated with new features


Feedback

We'd love to hear about your experience with this tutorial series!

  • GitHub Issues: Report problems or suggest improvements
  • Discussions: Share your implementations and ask questions
  • LinkedIn: Connect with the Azure Data Factory community

🎉 Tutorial Complete! You're now ready to build enterprise-scale data integration solutions with Azure Data Factory.


Module Progress: 18 of 18 complete

Tutorial Version: 1.0 · Last Updated: January 2025