A console application for managing connection references in Power Platform solutions. This tool helps standardize connection references across Cloud Flows by creating shared connection references with consistent naming conventions.
- Analyze solutions to identify connection reference usage
- Create standardized connection references with configurable naming schemes
- Update Cloud Flows to use shared connection references
- Clean up old connection references
- Generate deployment settings JSON for ALM processes
- Dry-run mode for safe testing
- Configure your environment in `appsettings.json`
- Run commands to manage your solution's connection references
Analyze a solution to see what connection references are used and what would be created:
dotnet run -- analyze --solution "YourSolutionName" [--format table|vertical|csv|json] [--output "filename"]
Output Formats:
- `table` (default) - Tabular format in the terminal
- `vertical` - Tree-like format for better readability
- `csv` - Comma-separated values for Excel/analysis
- `json` - Structured JSON for automation/scripting
Examples:
# Default table format
dotnet run -- analyze --solution "MyFlows"
# Vertical format for readability
dotnet run -- analyze --solution "MyFlows" --format vertical
# Export to CSV for analysis
dotnet run -- analyze --solution "MyFlows" --format csv --output "analysis.csv"
# JSON for automation
dotnet run -- analyze --solution "MyFlows" --format json --output "analysis.json"
Analyze Output Details: The analyze command provides comprehensive information about each flow's connection references:
- Flow ID: Unique identifier of the Power Automate flow
- Flow Name: Display name of the flow
- Connection Reference ID: Unique ID of the connection reference
- Logical Name: The logical name used to reference the connection in the flow
- Provider: The connector type (e.g., `shared_commondataserviceforapps`, `shared_azuread`)
- Connection ID: The actual connection being used
This information helps identify which flows need to be updated and what standardized connection references should be created.
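When the analysis is exported as JSON, it can be post-processed with a few lines of PowerShell, for example to count flows and references per connector. A minimal sketch, assuming the export is an array of records whose property names mirror the fields above (the exact property names are an assumption; check your generated file):

```powershell
# Hypothetical post-processing of the JSON analysis export.
# Property names (Provider, FlowName) are assumptions - adjust to match the actual file.
$analysis = Get-Content "analysis.json" -Raw | ConvertFrom-Json

$analysis |
    Group-Object -Property Provider |
    ForEach-Object {
        [pscustomobject]@{
            Provider   = $_.Name
            Flows      = ($_.Group.FlowName | Sort-Object -Unique).Count
            References = $_.Count
        }
    } |
    Format-Table -AutoSize
```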
Create new shared connection references for all connectors found in the solution:
dotnet run -- create-refs --solution "YourSolutionName" [--dry-run]
Update Cloud Flows to use the new shared connection references:
dotnet run -- update-flows --solution "YourSolutionName" [--dry-run]
Run the complete process (create connection references + update flows):
dotnet run -- process --solution "YourSolutionName" [--dry-run]
Create a deployment settings JSON file for the solution's connection references:
dotnet run -- generate-deployment-settings --solution "YourSolutionName" --output "deploymentsettings.json"
The generated deployment settings file includes descriptive placeholders for easy find-and-replace operations in ALM processes:
{
  "EnvironmentVariables": [],
  "ConnectionReferences": [
    {
      "LogicalName": "prefix_connector_flowid",
      "ConnectionId": "{{REPLACE_WITH_CONNECTOR_CONNECTION_ID}}",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_connector"
    }
  ]
}
Placeholder Format: Connection ID placeholders use the format `{{REPLACE_WITH_[CONNECTOR]_CONNECTION_ID}}`, where `[CONNECTOR]` is the connector name in uppercase (e.g., `COMMONDATASERVICEFORAPPS`, `AZUREAD`, `SHAREPOINTONLINE`).
ALM Usage Examples:
- PowerShell: `$content -replace "{{REPLACE_WITH_DATAVERSE_CONNECTION_ID}}", $connectionId`
- Azure DevOps: Use File Transform task with variable replacement
- Find/Replace: Search for the `{{REPLACE_WITH_` pattern and replace with actual connection IDs
These placeholders make automated deployment processes much easier by providing clear, searchable tokens for connection ID replacement.
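For example, a release step can load the generated file and swap every placeholder for the target environment's connection IDs. A small PowerShell sketch (file path, placeholder names, and IDs are illustrative):

```powershell
# Illustrative sketch: fill deployment-settings placeholders with real connection IDs.
# The file path, placeholder names, and IDs are examples - substitute your own.
$settingsPath = "release/deploymentsettings.json"

$connectionIds = @{
    "{{REPLACE_WITH_COMMONDATASERVICEFORAPPS_CONNECTION_ID}}" = "00000000-0000-0000-0000-000000000001"
    "{{REPLACE_WITH_AZUREAD_CONNECTION_ID}}"                  = "00000000-0000-0000-0000-000000000002"
}

$content = Get-Content $settingsPath -Raw
foreach ($placeholder in $connectionIds.Keys) {
    $content = $content.Replace($placeholder, $connectionIds[$placeholder])
}
Set-Content -Path $settingsPath -Value $content
```

The same substitution can be handled by a token-replacement or File Transform task in Azure DevOps, as noted above.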
Add existing connection references that are used by flows in the solution to the solution itself:
dotnet run -- add-existing-refs --solution "YourSolutionName" [--dry-run]
Purpose: This command ensures that connection references used by flows in a solution are actually included in that solution. This is important for proper solution deployment and dependency management.
How it works:
- Analyzes all flows in the solution to identify which connection references they use
- Checks if those connection references already exist in the environment
- Adds existing connection references to the solution (does NOT create new ones)
- Skips connection references that are already part of the solution
- Provides detailed logging of what was added vs. skipped
Key differences from other commands:
- `analyze`: Only shows connection reference usage (read-only)
- `create-refs`: Creates new connection references and adds them to the solution
- `add-existing-refs`: Only adds existing connection references to the solution (no creation)
Example scenario: You have flows in a solution that reference connection references created outside of the solution. This command will add those existing connection references to the solution so they're included when you export/import the solution.
Remove old unused connection references (dependency-aware - only deletes connection references not used by any flows):
dotnet run -- cleanup --solution "YourSolutionName" [--dry-run]
Safety Features:
- Analyzes all flows in the solution to identify connection reference dependencies
- Only removes connection references that are not referenced by any flow
- Provides detailed logging of what will be kept vs. deleted
- Shows which flows are using each connection reference before deletion
Configure the tool by editing `appsettings.json`:
{
  "PowerPlatform": {
    "TenantId": "your-tenant-id",
    "ClientId": "your-client-id",
    "ClientSecret": "your-client-secret",
    "DataverseUrl": "https://yourorg.crm.dynamics.com"
  },
  "ConnectionReferences": {
    "Prefix": "new",
    "ProviderMappings": {
      "shared_azuread": {
        "connectionId": "your-connection-id",
        "connectorId": "/providers/Microsoft.PowerApps/apis/shared_azuread"
      },
      "shared_commondataservice": {
        "connectionId": "your-connection-id",
        "connectorId": "/providers/Microsoft.PowerApps/apis/shared_commondataservice"
      }
    }
  }
}
The `ProviderMappings` section controls which connectors the tool will process and maps them to their corresponding connection and connector IDs in your environment.
Important: The tool will only process providers that are explicitly listed in the ProviderMappings section. Any flows using connectors not listed here will be skipped with a warning message.
For each provider you want to manage, you'll need to specify:
- `connectionId`: The GUID of the existing connection you want to reference
- `connectorId`: The full connector API path (usually starts with `/providers/Microsoft.PowerApps/apis/`)
Behavior:
- ✅ Processes: Flows using connectors listed in ProviderMappings that don't already follow your naming pattern
- ⚠️ Skips: Flows using connectors listed in ProviderMappings that already follow your naming pattern
- ❌ Ignores: Flows using connectors NOT listed in ProviderMappings (with a warning message)
Example:
If your `ProviderMappings` only contains `shared_commondataserviceforapps`, the tool will:
- Process Dataverse flows that need standardized connection references
- Skip Office 365, SharePoint, and other connector flows entirely
- Show warning messages for skipped providers
To find connection and connector IDs, you can use the Power Platform admin center or query the Dataverse API directly.
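If the Power Platform CLI (pac) is installed, listing connections from a terminal can be quicker than clicking through the admin center. A hedged sketch; verify the exact options with `pac connection help` for your CLI version:

```powershell
# Sketch: list connections with the Power Platform CLI after signing in.
# The environment ID is a placeholder; verify options with 'pac connection help'.
pac auth create
pac connection list --environment "00000000-0000-0000-0000-000000000000"
```

The connector ID is then the connector's API name appended to `/providers/Microsoft.PowerApps/apis/`.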
- `--solution`: The unique name of the solution to process
- `--dry-run`: Preview changes without making modifications
- `--format`: Output format (table, vertical, csv, json) - analyze command only
- `--output`: Specify the output file path
# Safe analysis of a solution (shows all connectors, regardless of ProviderMappings)
dotnet run -- analyze --solution "MyCloudFlows"
# Analyze with vertical format for better readability
dotnet run -- analyze --solution "MyCloudFlows" --format vertical
# Export analysis to CSV
dotnet run -- analyze --solution "MyCloudFlows" --format csv --output "flow-analysis.csv"
# Test what would be created (only processes providers in ProviderMappings)
dotnet run -- process --solution "MyCloudFlows" --dry-run
# Actually create and update everything (only for configured providers)
dotnet run -- process --solution "MyCloudFlows"
# Generate deployment settings for ALM
dotnet run -- generate-deployment-settings --solution "MyCloudFlows" --output "release/deploymentsettings.json"
# Add existing connection references to solution
dotnet run -- add-existing-refs --solution "MyCloudFlows" --dry-run
The tool automatically handles different Dataverse environment versions by trying multiple component types when adding connection references to solutions:
- First attempts with component type 10132 (newer environments)
- Falls back to component type 10469 (older environments) if the first fails
- This ensures compatibility across different Power Platform environment versions
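For reference, this fallback corresponds roughly to calling the Dataverse `AddSolutionComponent` action and retrying with the alternate component type if the first call is rejected. A simplified PowerShell illustration of the pattern (not the tool's actual code; `$orgUrl`, `$token`, `$connectionReferenceId`, and `$solutionName` are placeholders):

```powershell
# Illustration of the component-type fallback pattern (not the tool's actual code).
# $orgUrl, $token, $connectionReferenceId, and $solutionName are placeholders.
$headers = @{ Authorization = "Bearer $token" }
$body = @{
    ComponentId           = $connectionReferenceId
    ComponentType         = 10132          # newer environments
    SolutionUniqueName    = $solutionName
    AddRequiredComponents = $false
}

try {
    Invoke-RestMethod -Method Post -Uri "$orgUrl/api/data/v9.2/AddSolutionComponent" `
        -Headers $headers -ContentType "application/json" -Body ($body | ConvertTo-Json)
}
catch {
    $body.ComponentType = 10469            # fall back for older environments
    Invoke-RestMethod -Method Post -Uri "$orgUrl/api/data/v9.2/AddSolutionComponent" `
        -Headers $headers -ContentType "application/json" -Body ($body | ConvertTo-Json)
}
```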
- "Invalid component type" warnings: These are handled automatically and don't prevent successful operation
- Authentication token expiry: The tool uses token caching and will prompt for re-authentication when needed
- Connection reference already exists: The tool detects existing connection references and reuses them instead of creating duplicates
- Service Principal 403/ConnectionAuthorizationFailed errors: Service principals need explicit "Can use" permission on connections - see AUTHENTICATION.md for detailed solution
The tool uses a selective processing model based on your ProviderMappings configuration:
Processed:
- Flows using configured providers whose connection references do NOT follow your naming pattern
- Only connection references that need to be standardized are created
- Existing connection references that already follow the pattern are left unchanged
Skipped:
- Flows using unconfigured providers (not in ProviderMappings) - completely ignored, with a warning
- Flows using configured providers that already have properly named connection references
With this configuration:
"ProviderMappings": {
"shared_commondataserviceforapps": {
"connectionId": "your-connection-id",
"connectorId": "/providers/Microsoft.PowerApps/apis/shared_commondataserviceforapps"
}
}
Tool behavior:
- ✅ Processes: Dataverse flows with connection references like `msdyn_sharedcommondataserviceforapps` → creates `new_shared_commondataserviceforapps_[flowid]`
- ⚠️ Skips: Dataverse flows already using the `new_shared_commondataserviceforapps_*` naming
- ❌ Ignores: Office 365, SharePoint, and Teams flows (shows warning: "No mapping for provider 'shared_office365', skipping")
This allows you to incrementally standardize your environment by focusing on specific connector types without affecting others.
Connection references create a layer of abstraction between your flows and actual connections:
Flow → Connection Reference → Connection
Key concepts:
- Flow: Your Power Automate workflow that needs to connect to external services
- Connection Reference: A logical pointer that can be redirected to different connections
- Connection: The actual authenticated connection to a service (SharePoint, SQL, etc.)
Benefits of this architecture:
- Environment portability: Change which connection a flow uses without modifying the flow
- Deployment flexibility: Point to different connections in Dev/Test/Prod environments
- Throttling management: Distribute load across multiple connections for the same service
Recommended environment strategy:
Development (Unmanaged) → Test (Managed) → Production (Managed)
↑ ↓ ↓
Use this tool here Deploy via pipeline Deploy via pipeline
Development Environment:
- Use unmanaged solutions for flexibility
- Run this tool to standardize connection references
- Develop and test all functionality
- Generate deployment settings for promotion
Test/Production Environments:
- Receive managed solutions only
- Use deployment settings to configure connection references
- Never run development tools directly
- Maintain environment isolation and traceability
- Service Principals: Grant only the minimum Dataverse permissions needed for your automation scenarios
- User Accounts: Avoid using high-privilege admin accounts for day-to-day connection management
- Connection Sharing: Explicitly grant "Can use" permission rather than relying on broad access
Recommendation: Use a dedicated, shared administrator account for connection management.
Why this matters:
- Connections are not automatically shared with team members
- Individual user connections become invisible and unusable by other team members
- Personal connections create deployment bottlenecks and knowledge silos
Implementation:
- Create a shared service account (e.g., [email protected])
- Use this account to create all shared connections
- Share this account's credentials securely with the team (using tools like Azure Key Vault; see the sketch after this list)
- Document which connections belong to shared vs. personal use
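As an illustration of the Key Vault approach, the shared account's secret can be pulled at deployment time with the Az PowerShell module (the vault and secret names below are placeholders):

```powershell
# Hedged example: fetch the shared account's password from Azure Key Vault with
# the Az PowerShell module. Vault and secret names are placeholders.
Connect-AzAccount
$password = Get-AzKeyVaultSecret -VaultName "contoso-platform-kv" `
    -Name "connections-account-password" -AsPlainText
```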
Recommendation: Create a unique connection reference for each flow, even if they use the same service.
Benefits:
- Granular throttling control: If one flow gets throttled, others continue working
- Independent scaling: Point different flows to different connections as load increases
- Easier troubleshooting: Isolate connection issues to specific flows
- Flexible deployment: Different flows can use different connections in different environments
Example scenario:
❌ Bad: Multiple flows sharing one connection reference
Flow A ──┐
Flow B ──┼── Shared Connection Reference ── Connection
Flow C ──┘
✅ Good: Each flow has its own connection reference
Flow A ── Connection Reference A ── Connection 1
Flow B ── Connection Reference B ── Connection 2
Flow C ── Connection Reference C ── Connection 3
- Monitor connection usage and prepare to distribute load across multiple connections
- Use connection references to easily redirect flows to less-utilized connections
- Consider peak usage times when planning connection capacity
- Document your connection topology for operational teams
Recommended ALM workflow:
- Development Environment: Use this tool to standardize connection references in your dev environment
- Export as Managed Solution: Export your solution as a managed solution for deployment
- Generate Deployment Settings: Use the `generate-deployment-settings` command to create configuration files
- Deploy via Pipelines: Use Power Platform pipelines or Azure DevOps with deployment settings to deploy to higher environments (a scripted sketch follows this list)
- Target Environment Setup: Ensure connections exist and are properly shared in target environments before deployment
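Under the assumption that the Power Platform CLI (pac) is used for the export/import steps, the workflow above can be scripted roughly as follows (solution name and paths are placeholders; authenticate to the right environment first and verify the exact flags with `pac solution help`):

```powershell
# Hedged sketch of the promotion steps with the Power Platform CLI (pac).
# Solution name and paths are placeholders; authenticate to the correct environment
# first (pac auth create / pac auth select) and verify flags with 'pac solution help'.
pac solution export --name "MyCloudFlows" --path "release/MyCloudFlows_managed.zip" --managed

dotnet run -- generate-deployment-settings --solution "MyCloudFlows" --output "release/deploymentsettings.json"
# ...fill in the {{REPLACE_WITH_*}} placeholders as shown earlier...

pac solution import --path "release/MyCloudFlows_managed.zip" --settings-file "release/deploymentsettings.json"
```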
❌ Do NOT run this tool directly in production environments
Why this matters:
- Breaks ALM traceability: Direct changes in production can't be tracked back to source environments
- Creates environment drift: Production becomes out of sync with your development environments
- Deployment conflicts: Future deployments may fail or overwrite manual production changes
- No rollback capability: Manual changes are harder to undo if issues arise
The only exception: If you're using unmanaged solutions (not recommended), but this creates significant ALM risks and environment synchronization challenges.
- Develop in Dev: Use this tool in development environment to create standardized connection references
- Test in Test: Deploy the managed solution to test environment using deployment settings
- Validate in Test: Ensure all flows work correctly with the new connection references
- Deploy to Production: Use the same managed solution and deployment settings for production
- Monitor: Verify all flows are functioning correctly in production
- Run with `--dry-run` to preview all changes before applying them
- Test in development environments before deploying to higher environments
- Verify authentication works before running bulk operations
- Validate deployment settings in test environments before production deployment
- Use consistent naming conventions for your connection reference prefix across environments
- Keep provider mappings up to date with your environment configuration
- Document your naming standards for the team
- Version control your configuration files (excluding secrets)
- Generate deployment settings files for solutions being promoted (`generate-deployment-settings` command)
- Use managed solutions for all deployments to test and production environments
- Leverage Power Platform pipelines or Azure DevOps for automated deployments
- Use Service Principal authentication for CI/CD pipelines
- Store deployment settings in source control alongside your solution
- Environment-specific configuration: Maintain separate deployment settings for each target environment
- Regular cleanup in development helps maintain a tidy environment (use the `cleanup` command)
Limitation: If a flow connects to multiple Dataverse environments, this tool will currently point both connection references to the same connection.
Example problematic scenario:
Flow connects to:
├── Environment A (Dev)
└── Environment B (Prod)
Current behavior: Both connection references → Same connection
Desired behavior: Each connection reference → Different connection
Workarounds:
- Manual adjustment: After running the tool, manually update connection references for multi-environment flows (see the sketch after this list)
- Separate solutions: Keep flows that span environments in separate solutions
- Flow redesign: Consider if cross-environment flows can be split into single-environment flows with alternative integration patterns
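For the manual-adjustment workaround, a connection reference can be repointed to a different connection by updating its `connectionid` column, for example through the Dataverse Web API. A hedged sketch (`$orgUrl`, `$token`, and both IDs are placeholders):

```powershell
# Hedged sketch: repoint a connection reference to a different connection by
# updating its connectionid via the Dataverse Web API.
# $orgUrl, $token, and both IDs are placeholders.
$connectionReferenceId = "00000000-0000-0000-0000-000000000000"
$newConnectionId       = "replace-with-the-target-connection-id"

Invoke-RestMethod -Method Patch `
    -Uri "$orgUrl/api/data/v9.2/connectionreferences($connectionReferenceId)" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body (@{ connectionid = $newConnectionId } | ConvertTo-Json)
```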
When this matters:
- Cross-environment data synchronization flows
- Flows that read from one environment and write to another
- Multi-tenant scenarios where flows span different Dataverse instances
- Custom connectors: May require manual provider mapping configuration
- Premium connectors: Ensure proper licensing before creating connection references
- Regional connectors: Some connectors have region-specific providers that may need manual mapping
- Verify the connection exists and is shared with the appropriate accounts
- Check that the connection is in the same environment as your solution
- Ensure service principals have "Can use" permission on connections
- Validate that target environment has the required connectors enabled
- Confirm that connection references in deployment settings point to existing connections (a query sketch follows below)
- Check that the deploying account has permission to create connection references
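To cross-check a deployment settings file against the target environment, the connection references that already exist there can be listed, for example with a Dataverse Web API query. A hedged sketch (`$orgUrl` and `$token` are placeholders):

```powershell
# Hedged sketch: list connection references in the target environment and the
# connections they point to, for cross-checking against deploymentsettings.json.
# $orgUrl and $token are placeholders.
$result = Invoke-RestMethod -Method Get `
    -Uri "$orgUrl/api/data/v9.2/connectionreferences?`$select=connectionreferencelogicalname,connectionid,connectorid" `
    -Headers @{ Authorization = "Bearer $token" }

$result.value | Format-Table connectionreferencelogicalname, connectionid, connectorid
```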
- Monitor for connection throttling in high-usage scenarios
- Consider distributing flows across multiple connections for the same service
- Review connection reference topology for optimization opportunities
- Power Platform ALM Overview - Comprehensive guide to ALM with Power Platform
- ALM Basics with Power Platform - Fundamental ALM concepts and best practices
- Environment Strategy for ALM - Planning development, test, and production environments
- Solution Concepts - Understanding managed vs unmanaged solutions
- Use DevOps for Automation - Implementing CI/CD pipelines
- Use Connection References in Solutions - Complete guide to connection references
- Share Connections with Service Principals - Service principal permission setup
- Connection Reference Naming - Naming best practices
- Update Flows to Use Connection References - Migration guidance
- Environment Variables Overview - Configuration management for environments
- Environment Variables in ALM - Using environment variables in deployment
- Deploy Solutions using Pipelines - Automated deployment with Power Platform pipelines
- Power Platform Build Tools for Azure DevOps - Azure DevOps integration
- GitHub Actions for Power Platform - GitHub-based CI/CD
- Work with Solutions - Solution fundamentals
- Import and Export Solutions - Solution lifecycle management
- Solution Layers - Understanding solution layering and customizations
- Power Platform Admin Guide - Security and governance
- Service Principal Authentication - Setting up automation accounts
- Power Platform Security - Security best practices
- Solution-Aware Cloud Flows - Working with flows in solutions
- Manage Connections - Connection management fundamentals
- Share Flow Resources - Sharing connections and flows
- Power Platform Community - Community forums and discussions
- Power Platform Blog - Latest announcements and best practices
- Power CAT (Customer Advisory Team) - Advanced guidance and patterns
- Power Platform Learning Paths - Microsoft Learn training modules
- Power Platform Fundamentals - Foundation concepts
- Power Platform CLI - Command-line tools for Power Platform
- Solution Packager - Solution packaging and source control
- Configuration Migration Tool - Data migration between environments
- Power Platform Center of Excellence (CoE) Kit - Governance and monitoring tools
- .NET 8.0 SDK or later
- Power Platform environment with appropriate permissions
- Azure AD tenant access for authentication setup
- Clone the repository:
  git clone https://github.com/yourusername/PowerPlatform-ConnectionReferences-BestPractices.git
  cd PowerPlatform-ConnectionReferences-BestPractices/src/PowerPlatform.Tools.ConnectionReferences
- Configure authentication in `appsettings.json` (see AUTHENTICATION.md)
- Build and run:
  dotnet build
  dotnet run -- analyze --solution "YourSolutionName"
This is a specialized tool focused on Power Platform connection reference best practices. While the code is open source for transparency and learning, this project is not actively seeking contributions.
If you encounter issues:
- Check the troubleshooting sections in this README
- Review the AUTHENTICATION.md guide
- Search existing GitHub Issues for similar problems
For feature requests or bugs:
- Open an issue with detailed information
- Include error messages, configuration (without secrets), and steps to reproduce
This project is licensed under the MIT License - see the LICENSE file for details.
This tool is provided as-is for educational and operational purposes. Always test in development environments before using in production scenarios. Follow your organization's change management and ALM processes.
Microsoft Trademarks: Power Platform, Power Automate, Power Apps, and Dataverse are trademarks of Microsoft Corporation.