As part of the Continuous Cloud Optimization Insights solution, a dashboard is included to track the contributions made to an Azure DevOps repository. The objective is to monitor not only the cloud environment, but also all the resources used for its design, deployment and maintenance. This dashboard allows you to monitor different metrics, such as:
- Number of Projects
- Number of open/closed pull requests
- Average pull requests per day
- Comparison between number of open vs closed pull requests over the last months
- Branches created over the last months
An important note about this dashboard is that it can be published to the PowerBI online service with auto refresh enabled. The difference from the current versions of the other CCO Insights dashboards is that, for this one, no dynamic queries are made directly from the PowerBI file, meaning that it can be published and consumed directly from the PowerBI online service.
The CCO Azure DevOps Contributions dashboard requires infrastructure to be deployed in Azure. The infrastructure consists of a PowerShell Function App, an Application Insights instance for monitoring, and a Storage Account where the results from the Azure DevOps REST API calls will be stored in different tables. The following diagram represents the infrastructure to be deployed.
As part of this solution, we already provide the required bicep template that will deploy and connect the architecture presented previously.
In order to successfully use the deploy.bicep file and the workflow provided, you will need to have:
- This repository forked in your own environment.
- An Azure subscription. If you don't have one, you can create one for free using this link. If you already have an Azure tenant but want to create a new subscription, you can follow the instructions here.
- A resource group already created.
- A service principal with Owner permissions in your subscription. You will need Owner permissions because, as part of the architecture, you will be creating a Managed Identity that requires a role assignment to save the retrieved data in the Storage Account. You can create your service principal by running the following command:

  ```
  az ad sp create-for-rbac --name "<<service-principal-name>>" --role "Owner" --scopes /subscriptions/<<subscriptionId>> --output "json"
  ```
- A secret in your GitHub repository with the name `AZURE_CREDENTIALS`. You can use the output from the previous command to generate this secret. The format of the secret should be:

  ```json
  {
    "clientId": "<client_id>",
    "clientSecret": "<client_secret>",
    "subscriptionId": "<subscription_id>",
    "tenantId": "<tenant_id>"
  }
  ```
- Another secret in your GitHub repository with the name `ADOPAT`. This will store the value of a PAT (Personal Access Token) you will need to generate with the following permissions:

  | Scope | Permission |
  | --- | --- |
  | Code | Read |
  | Graph | Read |
  | Identity | Read |
  | Project and Team | Read |
- In the local.settings.json file, update the values for the `organization`, `resourceGroup` and `storageAccount` settings with the names you want to configure in your environment. Also, make sure that these names match the values in the deploy.bicep file for the same resources. Note: the organization corresponds to the ADO organization from which the information needs to be retrieved.
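The `AZURE_CREDENTIALS` secret listed above can be derived from the service principal output. A minimal sketch (assuming the `jq` CLI is installed; note that `az ad sp create-for-rbac` does not print the subscription id, so you must supply it yourself -- the values below are placeholders):

```shell
# Convert the JSON printed by `az ad sp create-for-rbac` into the
# AZURE_CREDENTIALS secret format expected by the workflow.
SUBSCRIPTION_ID="<subscription_id>"
SP_JSON='{"appId": "<client_id>", "displayName": "cco-sp", "password": "<client_secret>", "tenant": "<tenant_id>"}'
echo "$SP_JSON" | jq --arg sub "$SUBSCRIPTION_ID" \
  '{clientId: .appId, clientSecret: .password, subscriptionId: $sub, tenantId: .tenant}'
```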
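To check that the PAT you generated for the `ADOPAT` secret works, you can call the Azure DevOps REST API directly. A hedged sketch (assuming `curl` and `base64` are available; Azure DevOps uses Basic auth with an empty username and the PAT as the password -- replace the placeholders before running):

```shell
# Build the Basic auth header from the PAT (empty username, PAT as password).
ADOPAT="<your-pat>"
ORGANIZATION="<organization>"
AUTH=$(printf ':%s' "$ADOPAT" | base64)

# List the projects the PAT can see -- a quick way to verify the token.
curl -s -H "Authorization: Basic $AUTH" \
  "https://dev.azure.com/$ORGANIZATION/_apis/projects?api-version=7.1"
```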
In the infrastructure folder you will find a `deploy.bicep` file, which is the template that will be used to deploy the infrastructure. Please go ahead and update the first two parameters (`name` and `staname`) with your unique values. `name` will be used to compose the name of all resources except for the storage account, which will leverage `staname`.
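As an illustration, the first two parameters might be set like this (the values below are hypothetical placeholders, not the repository's defaults):

```bicep
// Hypothetical example values -- replace with names unique to your environment.
param name string = 'ccocontrib'       // used to compose the names of most resources
param staname string = 'ccocontribsta' // storage account name: 3-24 lowercase letters and numbers
```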
In the src folder you can find the source code that will be deployed to the Function App once the infrastructure is ready. Basically, you will deploy two endpoints:
- InitializeTables: you will need to run this endpoint once manually to initialize the Storage Account with the required tables and collect all the data history available in the Azure DevOps API.
- ADODailySync: this endpoint will run automatically on a daily basis and will add more data to the already created Storage Account tables. If you don't want a daily cadence, you can update the cron expression in the `function.json` file under the ADODailySync folder.
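For reference, the timer binding in a `function.json` generally follows the sketch below (the binding name here is illustrative; the six-field NCRONTAB expression shown fires once a day at midnight UTC):

```json
{
  "bindings": [
    {
      "name": "Timer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 0 * * *"
    }
  ]
}
```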
Finally, if you go to the root folder of the repository you will find the workflows folder under the `.github` folder. There you can locate the workflow that you will have to run to deploy the back-end of the dashboard. The only parameter you will need to set manually when triggering the workflow is the `resourceGroupName` of the resource group that you created earlier.
Now you are ready to deploy the back-end solution in your environment:
After successfully deploying the back-end, go to the Azure portal and manually run the InitializeTables endpoint. Make sure you see the tables in your Storage Account before moving forward.
With the back-end deployed, you can now download the ADOContributions v1.0.pbit file and run it locally. You will be asked to enter:
After that you will be able to monitor your contributions!