Azure Pipelines and Dependabot

Keeping dependencies up to date using Dependabot and Azure Pipelines

Keeping a project's dependencies up to date is one of the easiest ways to help keep your software secure. New releases of a dependency often include

  • Patches for security vulnerabilities!
  • Performance improvements!
  • Awesome new features!
  • Bug fixes!

It can also be a boring, time-consuming activity for the team maintaining the project to run updates regularly through to production. Fortunately, tools like Dependabot exist!

Dependabot?

Dependabot creates pull requests on your repos for the dependencies you should update. You can read about how it works, but in a nutshell:

  • it checks for updates of your dependencies
  • it opens up a pull request on your repo
  • you review the PR and merge

This is an awesome time-saver, and I find it makes keeping your dependencies up to date really easy.
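On GitHub itself, this behaviour is driven by a `dependabot.yml` file in the repo. A minimal sketch (the values here are illustrative, not from this project) looks like:

```yaml
# .github/dependabot.yml -- minimal configuration sketch (illustrative values)
version: 2
updates:
  - package-ecosystem: "nuget"  # which package manager to check
    directory: "/"              # where the manifest files live
    schedule:
      interval: "weekly"        # how often to check for updates
```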

Integration with Azure Pipelines

Dependabot is baked into the GitHub ecosystem and really easy to use there. Recently I needed to solve this problem on a project in Azure DevOps using Azure Pipelines and thought I would share my solution.

If you search the Azure DevOps Extension Marketplace for Dependabot you will find this extension made by Tingle Software. It is always great to find extensions that speed up integration, so I gave this one a whirl.

The extension is feature rich and works really well with little configuration required. Simply add the below to an Azure Pipeline, run it, and you’ll end up with a host of PRs!

stages:
  - stage: CheckDependencies
    displayName: "Check Dependencies"
    jobs:
      - job: Dependabot
        displayName: "Run Dependabot"
        pool:
          vmImage: "ubuntu-latest"
        steps:
          - task: dependabot@1
            displayName: "Run Dependabot"
            inputs:
              packageManager: "nuget" # Examples: nuget, maven, gradle, npm, etc. Add multiple tasks if multiple package managers are used in your solution
              targetBranch: "main"
              openPullRequestsLimit: 10 # Limits the number of PR's you get
              setAutoComplete: true # Saves us one click, once our PR policies pass, the update will merge

Have a look at all the Task Parameters that the extension supports to fine-tune your implementation. The above checks for nuget dependencies on the main branch, limits the number of open PRs to 10, and sets each PR to auto-complete (letting our branch policies and PR validation pipelines perform all their checks before merging).
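As the comment in the snippet above hints, if your solution uses more than one package manager you simply add another task. For example, if the repo also contained an npm project (hypothetical here), it might look like:

```yaml
# A second dependabot task for a repo that also contains an npm project (illustrative)
- task: dependabot@1
  displayName: "Run Dependabot (npm)"
  inputs:
    packageManager: "npm"   # each task handles one package manager
    targetBranch: "main"
    openPullRequestsLimit: 10
    setAutoComplete: true
```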

Work Item Linking

In the project I’m working on we have a policy set on all PRs to ensure they are linked to a work item. When I ran the above pipeline, I ended up with 10 PRs, none of which were linked to a work item, so I couldn’t quickly review and merge each one. The extension handles this by allowing you to pass in a workItemId parameter.

Changes in release 0.5
In the upcoming 0.5 release of the extension, the workItemId parameter has been renamed to milestone, so please be on the lookout for this change!
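Once you move to the 0.5 release, the same input would look something like this (a sketch, assuming the rename ships as described):

```yaml
# After the 0.5 rename: pass the work item ID via `milestone` instead of `workItemId`
- task: dependabot@1
  inputs:
    packageManager: "nuget"
    targetBranch: "main"
    milestone: $(workItemId)  # was `workItemId` before release 0.5
```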

The Microsoft team has an extension for creating work items which is really configurable. For my team’s workflow, we wanted a User Story on our board with all the PRs linked to it. We also didn’t want the pipeline creating duplicate User Stories on every run if we already had one open that we were working on. After a bit of trial and error, adding the below to the pipeline gave us the desired result.

- task: CreateWorkItem@1
  displayName: "Create User Story for Dependabot"
  inputs:
    workItemType: "User Story" # We wanted a User Story created, but all work item types are available
    title: "Update Dependencies" # The title of the work item
    # Adding some tags to the work item
    fieldMappings: |
      Tags=dependabot; dependencies      
    areaPath: 'your\area'
    iterationPath: 'your\iteration'
    # Adding duplicate detection, using the area path, iteration path and title to match
    preventDuplicates: true
    keyFields: |
      System.AreaPath
      System.IterationPath
      System.Title      
    # Create output variables for the next task to be able use the work item ID
    createOutputs: true
    outputVariables: |
      workItemId=ID      

- task: dependabot@1
  displayName: "Run Dependabot"
  inputs:
    packageManager: "nuget"
    targetBranch: "main"
    openPullRequestsLimit: 10
    workItemId: $(workItemId) # This uses the output from the task above as the work item to link the PRs to
    setAutoComplete: true

Docker Caching

The Dependabot extension depends on a Docker image hosted on Docker Hub. The image is around 4.4 GB in size and can take some time to download in the pipeline.

In the docs it mentions:

Since this task makes use of a docker image, it may take time to install the docker image. The user can choose to speed this up by using Caching for Docker in Azure Pipelines.

The caching tasks can look a bit confusing, breaking it down you need 3 parts:

- task: Cache@2
  inputs:
    key: docker | "${{ variables.imageToCache }}" # image to look for in the cache, e.g. tingle/dependabot-azure-devops:0.4
    path: $(Pipeline.Workspace)/docker # path of the cache
    cacheHitVar: DOCKER_CACHE_HIT # variable to set to true if a cache hit is found
  displayName: Cache Docker images

This checks if there is a cached version of the image to be used. If yes, DOCKER_CACHE_HIT is set to true, otherwise false.

- script: |
    docker load -i $(Pipeline.Workspace)/docker/cache.tar    
  displayName: Restore Docker image
  condition: and(not(canceled()), eq(variables.DOCKER_CACHE_HIT, 'true'))

If we have a cached version of the image, we need the pipeline to restore it. This step only runs when DOCKER_CACHE_HIT is true: the Cache task has already restored the archive to the workspace, and docker load imports it into the local Docker image store.

- script: |
    mkdir -p $(Pipeline.Workspace)/docker
    # use the same imageToCache variable referenced by the Cache key above
    docker pull -q ${{ variables.imageToCache }}
    docker save -o $(Pipeline.Workspace)/docker/cache.tar ${{ variables.imageToCache }}    
  displayName: Save Docker image
  condition: and(not(canceled()), or(failed(), ne(variables.DOCKER_CACHE_HIT, 'true')))

If we do not have a cached version of the image, we pull it down from Docker Hub and save it as an archive; the Cache task then uploads that archive at the end of the job so the next run gets a cache hit.

Finishing up

We have a pipeline that runs on a schedule, combining all the above steps. A complete example can be found on GitHub here.

schedules:
  - cron: "0 4 * * 1"
    displayName: "Weekly Run"
    always: true
    branches:
      include:
        - main

trigger: none

variables:
  - name: imageToCache
    value: tingle/dependabot-azure-devops:0.4

stages:
  - stage: CheckDependencies
    displayName: "Check Dependencies"
    jobs:
      - job: Dependabot
        displayName: "Run Dependabot"
        pool:
          vmImage: "ubuntu-latest"
        steps:
          - task: CreateWorkItem@1
            displayName: "Create User Story for Dependabot"
            inputs:
              workItemType: "User Story"
              title: "Update Dependencies"
              fieldMappings: |
                Tags=dependabot; dependencies                
              areaPath: 'your\area'
              iterationPath: 'your\iteration'
              preventDuplicates: true
              keyFields: |
                System.AreaPath
                System.IterationPath
                System.Title                
              createOutputs: true
              outputVariables: |
                workItemId=ID                

          - task: Cache@2
            inputs:
              key: docker | "${{ variables.imageToCache }}"
              path: $(Pipeline.Workspace)/docker
              cacheHitVar: DOCKER_CACHE_HIT
            displayName: Cache Docker images
          - script: |
              docker load -i $(Pipeline.Workspace)/docker/cache.tar              
            displayName: Restore Docker image
            condition: and(not(canceled()), eq(variables.DOCKER_CACHE_HIT, 'true'))
          - script: |
              mkdir -p $(Pipeline.Workspace)/docker
              docker pull -q ${{ variables.imageToCache }}
              docker save -o $(Pipeline.Workspace)/docker/cache.tar ${{ variables.imageToCache }}              
            displayName: Save Docker image
            condition: and(not(canceled()), or(failed(), ne(variables.DOCKER_CACHE_HIT, 'true')))

          - task: dependabot@1
            displayName: "Run Dependabot"
            inputs:
              packageManager: "nuget"
              targetBranch: "main"
              openPullRequestsLimit: 10
              workItemId: $(workItemId)
              setAutoComplete: true

In summary, the pipeline does the following for you:

  • Runs on a weekly schedule (update the cron to suit your needs)
  • Creates a new work item with the tags we want (avoiding duplicates)
  • Tries to use a cached version of the Docker image for faster runs (creating the cache if not found)
  • Runs Dependabot, limiting the number of open PRs to 10, linking the PRs to the created work item, and completing each PR once all the policies pass
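For instance, if a weekly run is too infrequent, the schedule block could be adjusted to run every weekday morning instead. Azure Pipelines cron expressions use the standard five fields (minute, hour, day-of-month, month, day-of-week) and are evaluated in UTC; the values below are just an example:

```yaml
schedules:
  - cron: "0 4 * * 1-5"       # 04:00 UTC, Monday to Friday
    displayName: "Weekday Run"
    always: true              # run even if there are no code changes since the last run
    branches:
      include:
        - main
```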

Hopefully this helps you keep your projects in Azure DevOps nice and up to date!