
Azure Artifacts documentation


Code once and share packages across your organization. Host your private NuGet, npm,
Maven, Python, and Universal Packages with Azure Artifacts for more reliable and
scalable builds.

Get started

CONCEPT

Artifacts overview

QUICKSTART

Best practices

Artifacts storage consumption

Package sizes and count limits

Configure Feeds

CONCEPT

What are feeds?

What are feed views?

QUICKSTART

Configure permissions

Upstream sources overview

Configure upstream sources

Artifacts in Azure Pipelines

CONCEPT

Overview
QUICKSTART

Pipeline Artifacts

Release Artifacts and Artifact sources

Publish and download Artifacts

Pipeline caching

Symbols

CONCEPT

What are symbol files?

HOW-TO GUIDE

Debug with Visual Studio

Debug with WinDbg

Publish symbols with Azure Pipelines

NuGet

QUICKSTART

Get started

HOW-TO GUIDE

Publish NuGet packages (CLI)

Publish NuGet packages with Azure Pipelines (YAML/Classic)

Install NuGet packages with Visual Studio

NuGet.org upstream source

npm

QUICKSTART
Get started

HOW-TO GUIDE

Publish npm packages (CLI)

Publish npm packages with Azure Pipelines (YAML/Classic)

Set up your project and connect to Azure Artifacts

Use npm scopes

Npmjs.com upstream source

Maven

QUICKSTART

Get started

HOW-TO GUIDE

Project setup

Install Maven Artifacts

Use packages from Maven Central

Google Maven Repository upstream source

Gradle Plugins upstream source

JitPack upstream source

Python

QUICKSTART

Get started

HOW-TO GUIDE

Publish and download Python packages (CLI)


Publish and download Python packages with Azure Pipelines (YAML/Classic)

Universal Packages

QUICKSTART

Get started

HOW-TO GUIDE

Publish and download Universal Packages with Azure Pipelines (YAML/Classic)

Universal Packages upstream sources


Azure Artifacts overview
Article • 02/21/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts enables developers to share their code efficiently and manage all their
packages from one place. With Azure Artifacts, developers can publish packages to their
feeds and share them within the same team, across organizations, and even publicly.
Developers can also consume packages from different feeds and public registries such
as NuGet.org or npmjs.com. Azure Artifacts supports multiple package types such as
NuGet, npm, Python, Maven, and Universal Packages.

Prerequisites
An Azure DevOps organization and a project. Create an organization or a project if
you haven't already.

Allow domain URLs and IP addresses if your organization is using a firewall or a proxy server.

Note

If you anticipate using more than the 2 GiB of free storage provided, it is recommended to set up billing for your organization.

Get started with Azure Artifacts


With Azure Artifacts, you can publish and consume different types of packages. Select
your package type to get started:

NuGet

Get started with NuGet packages and Azure Artifacts

Feature availability
NuGet packages: Azure DevOps Services, Azure DevOps Server 2019 and newer, and TFS 2018.

npm packages: Azure DevOps Services, Azure DevOps Server 2019 and newer, and TFS 2018.

Maven packages: Azure DevOps Services, Azure DevOps Server 2019 and newer, and TFS 2018.

Python packages: Azure DevOps Services, and Azure DevOps Server 2019 Update 1 and newer (including Server 2020 and Server 2022).

Universal Packages: Azure DevOps Services only.

Artifacts free tier and upgrade


Azure Artifacts is free for every organization up to 2 GiB of storage. Once you reach the
maximum storage limit, you can no longer upload new artifacts and need to delete
some of your existing artifacts or set up billing to increase your storage limit. See the
Pricing Calculator to learn more about Azure DevOps billing.

Organization billing settings


Follow the steps outlined below to view your billing settings for your organization:

1. Sign in to your Azure DevOps organization.

2. Select Organization settings.


3. Select Billing.
4. View your Artifacts tier and usage limit.
View Artifacts storage consumption
Organization-level storage
Project-level storage

FAQs

Q: Which artifacts count toward my total billed storage?

A: You get billed for all package types (npm, NuGet, Python, Maven, and Universal Packages), including packages stored from upstream sources. However, you don't get billed for Pipeline Artifacts and Pipeline Caching.
Note

Packages in the recycle bin still count as part of your storage consumption. Those
packages get deleted permanently after 30 days. If you want to delete them sooner,
navigate to your recycle bin and delete them manually.

Q: I'm storing Artifacts but my storage consumption shows 0 GiB?

A: 1 GiB is currently our lowest granularity, so you most likely haven't reached 1 GiB yet.

Q: How can I control how many days Artifacts are kept?

A: You can set up retention policies to delete packages automatically. For more information, see How to use retention policies to delete old packages.

Q: How can I delete specific packages?

A: See Delete and recover packages for details.

Q: How long does it take for deleted Artifacts to reflect in the amount of billed storage?

A: Deletion of artifacts doesn't register immediately. Storage consumption should be updated within 24 hours, but in some cases it may take up to 48 hours. If you're blocked from uploading Artifacts, as a workaround you can temporarily increase your usage level, then reduce the level back once the storage metrics are updated.

The used column on the Billing page of your Organization gets updated once a day.
When you delete an Artifact, it may not reflect immediately on your billing page. The
Artifact Storage page however gets updated more frequently, so you may see a small
discrepancy between the two pages.
Q: What happens if I remove my Azure Subscription from
my Azure DevOps organization?
A: When you remove your Azure Subscription from your Azure DevOps organization,
you only have access to the free tier. If you used more than 2 GiB of storage, you can
only read packages. You can't publish new packages until you lower your storage below
2 GiB. Or, you can reconnect an Azure subscription to your organization and set up
billing to increase your storage tier.

Related articles
View storage usage
Feeds overview
Manage permissions
Feed views
Artifacts storage consumption
Article • 04/11/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts adopts a consumption-based billing model for all the package types it
supports, such as NuGet, npm, Python, Maven, and Universal packages. The free-tier
plan provides a storage capacity of two gibibytes (GiB) to store various package types. If
you exceed the storage limit, you can either upgrade to a paid subscription or remove
some of your existing artifacts.

The artifact storage UI available in your organization/project settings allows you to monitor your storage usage at the organization and project levels. Storage is also grouped by project and artifact type.

Organization-level storage
The organization-level view provides an overview of your total storage usage as well as
the storage consumption by artifact type and by project.

1. Sign in to your Azure DevOps organization.

2. From within your organization, select Organization settings.

3. Select Storage from the left pane.


4. You can view your total storage summary, storage by artifact type, and storage by
projects in your organization.
5. Select View storage breakdown from Storage by type to view the total storage for
packages in your organization-scoped feeds.

Note

The list of Storage by projects only includes projects with the largest storage
consumption and not the complete list of projects in your organization.

Project-level storage
The project-level view provides an overview of your total storage usage as well as the
storage consumption by artifact type.

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. From within your project, select Project settings.

3. Select Storage from the left pane.

4. You can view your total storage summary and storage consumption by artifact type
for your project.
5. Select View storage breakdown from Storage by type to view the total storage for
packages in project-scoped feeds.

Note

Azure Artifacts provides 2 GiB of free storage for each organization. After reaching the maximum storage limit, you need to set up billing for your organization.

Related articles
Get started with NuGet packages in Azure Artifacts
Publish NuGet packages with Azure Pipelines (YAML/classic)
Azure DevOps blog: Artifacts billing
Azure Artifacts: best practices
Article • 04/07/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Managing software packages can be a complex and time-consuming process, particularly when working with large-scale projects. Fortunately, Azure Artifacts provides
a robust platform for package management that can help streamline the process and
improve collaboration among development teams. However, to get the most out of
Azure Artifacts, it's essential to follow best practices that ensure the integrity and quality
of your packages. In this article, we'll cover some of the most important best practices
for producing, consuming, and managing packages in Azure Artifacts. Whether you're a
seasoned developer or just starting with Azure Artifacts, these tips will help you
optimize your workflow and ensure the success of your projects.

Create and publish packages


Creating and publishing packages is a critical step in any package management
workflow. In this section, we'll cover best practices for creating and publishing packages
in Azure Artifacts.

Each repository should only reference one feed:

A feed is a fundamental organizational structure for hosting packages. While you can have multiple feeds for a project, it's best to limit a project to referencing just one feed. If you want to use packages from multiple feeds, it's recommended to use upstream sources. This enables you to access packages from multiple feeds and public registries.

Automatically publish newly created packages to your feed:

This will update the @local view of your feed with the new packages. See Feed
views to learn more about feed views and upstream sources.

Enable retention policies to automatically clean up old package versions:

By deleting older package versions, you can optimize client performance and free
up storage space. When setting up your retention policies you have the flexibility
to select the number of versions of a package to keep. This allows you to easily
manage package versions and improve your package management workflow.
Promote your package to the correct view:

To make a package available to early adopters, you can select it from your feed
and promote it to the @prerelease view. Once you've deemed the package to be
of sufficient quality for a full release, you can promote it to the @release view. By
promoting package versions to a view, you can prevent them from being deleted
by retention policies. To learn more about feed views, check out the Feed views
article.

If external teams are consuming your packages, ensure that the @release and @prerelease views are visible across the organization:

If these views aren't visible, teams won't have access to your packages.

Consume packages
In this section, we'll cover best practices for consuming packages with Azure Artifacts,
including configuring package sources, managing package versions, and ensuring
secure and efficient package consumption.

Configure upstream sources for your feed:

Adding upstream sources to your feed is the recommended approach for consuming packages from public registries like NuGet.org or npmjs.com. See Understand upstream sources and how to configure upstream sources for more details.

Sources not in your organization but in the same Azure Active Directory (AAD) tenant should be added using the feed locator:

The syntax for the feed locator is as follows: azure-feed://<organization>/<projectName>/<feed>@<view>
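As a concrete illustration, the locator syntax can be assembled in the shell. All names below are hypothetical placeholders, not real feeds:

```shell
# Hypothetical organization, project, feed, and view names
ORG="fabrikam"
PROJECT="WebApp"
FEED="SharedPackages"
VIEW="Release"

# Assemble a feed locator following the azure-feed:// syntax above
echo "azure-feed://${ORG}/${PROJECT}/${FEED}@${VIEW}"
# prints: azure-feed://fabrikam/WebApp/SharedPackages@Release
```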

Ensure that the order of the sources matches your desired package resolution
order:

The feed checks each upstream source in order and returns the package from the first source that has it.

Place public upstreams FIRST in your resolution order:

This prevents other sources from overriding well-known packages with altered or
incompatible versions.
Related articles
Package sizes and count limits
Artifacts storage consumption
Upstream sources overview
Search packages across your feeds
Article • 06/06/2023

Azure DevOps Services

Package Search is available to all users of Azure DevOps. For information on main search
functions, see Get started with search.

Prerequisites
An Azure DevOps organization and a project. Create an organization or a project if
you haven't already.

An Azure Artifacts feed. Create a feed if you don't have one already.

Apply supported functions to package search


1. Select the filter icon to show the filter panel.

2. Select from the dropdown menus to search by feeds, views, or package types.

By default, you search within all feeds of the organization, no matter which project you're in. The Views filter only appears if a single feed is selected from the Feeds filter. Use this filter to show packages from a specific view. Using the Type filter, you can select the type of package you want to search for (such as NuGet packages).
Search using the REST API

You can use the Azure DevOps REST API to search for packages in a specific organization. See Fetch Package Search Results for more details.

Example

Command

POST https://almsearch.dev.azure.com/ORGANIZATION_NAME/_apis/search/packagesearchresults?api-version=7.0

Request

{
  "$orderBy": null,
  "$top": 100,
  "$skip": 0,
  "searchText": "react-calendar",
  "filters": {
    "ProtocolType": "Npm"
  }
}
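To try the endpoint, you can save the request body to a file and POST it with curl. The organization name and PAT below are placeholders; a real call also requires the search service to be enabled for your organization:

```shell
# Write the example request body to a file
cat > search-request.json <<'EOF'
{
  "$orderBy": null,
  "$top": 100,
  "$skip": 0,
  "searchText": "react-calendar",
  "filters": {
    "ProtocolType": "Npm"
  }
}
EOF

# Send the search request (uncomment and supply a real organization name
# and a PAT with packaging read scope)
# curl -u :"$PAT" -H "Content-Type: application/json" \
#   -d @search-request.json \
#   "https://almsearch.dev.azure.com/ORGANIZATION_NAME/_apis/search/packagesearchresults?api-version=7.0"

cat search-request.json
```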

Search in upstream sources


Using upstream sources, you can consume packages from public registries and Azure
Artifacts feeds. See Search upstreams to lean how to search for packages in upstream
sources and save them to your feed.

Note

Searching for packages in upstreams using the NuGet Package Explorer is not
supported. See Download NuGet packages for more details.

Next steps
What are feeds?
What are feed views?
Promote a package to a view
Related articles
Get started with Search
Search code
Search work items
Search FAQs
Share your Artifacts with package
badges
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts, you can share your packages anywhere you can share an image by
using package badges. You can embed package badges directly into your project's
home page or in any Markdown file for your customers to easily find and download
your packages.

Enable package sharing

Note

You must be a feed administrator to enable package sharing.

1. Select Artifacts, and then select your feed from the dropdown menu. Select the
gear icon to access the Feed settings.

2. Find the Package sharing section and select the checkbox to Enable package
badges.
Create a package badge
With package sharing enabled, you can create a badge for any package in your feed, but only for the latest version of each package.

1. From within your feed, select your package and then select Create badge.

2. Select a Feed view for your package badge. If you're using release views, select the
view that contains the packages you want to share.
3. You can now share your package by using the Markdown snippet or the direct
image link.
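The generated snippet follows the standard Markdown image-link form; this sketch assembles one from placeholder URLs (copy the real badge image and package URLs from the Create badge dialog):

```shell
# Placeholder URLs for illustration; real values come from the Create badge dialog
BADGE_IMAGE_URL="https://example.com/badge.svg"
PACKAGE_PAGE_URL="https://example.com/package"

# Markdown: a clickable badge image that links to the package page
echo "[![Package badge](${BADGE_IMAGE_URL})](${PACKAGE_PAGE_URL})"
```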

Related articles
Limits of package sizes and counts
Package notifications
Delete and recover packages
Package sizes and count limits
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts is a highly scalable package management solution that enables developers to create, host, and share different types of packages. In this article, we'll cover the size and count limits that developers should be aware of when using Azure Artifacts. Some of these limits are imposed by the client tools that Azure Artifacts integrates with (for example, nuget.exe).

Count limits
5000 versions per package ID.
Unlimited package IDs per feed.
20 upstreams per package type per feed.

Note

You can use retention policies to automatically delete older package versions.

Size limits
NuGet packages: limited to 500 MB per file.

npm packages: limited to 500 MB per file.

Maven packages: limited to 500 MB per file.

Python packages: limited to 500 MB per file.

Universal Packages: up to 4 TB per file (Recommended for large binary files).

Note

Universal Packages are only available in Azure DevOps Services.
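As a quick sanity check before publishing, you can compare a package file's size against the 500 MB per-file limit from the shell. The file name is hypothetical, and the limit is treated here as 500 × 1024 × 1024 bytes:

```shell
# Create a small stand-in package file for demonstration
printf 'placeholder' > MyPackage.1.0.0.nupkg

LIMIT=$((500 * 1024 * 1024))            # 500 MB per-file limit (NuGet/npm/Maven/Python)
SIZE=$(wc -c < MyPackage.1.0.0.nupkg)   # actual file size in bytes

if [ "$SIZE" -le "$LIMIT" ]; then
  echo "within limit"
else
  echo "exceeds limit"
fi
# prints: within limit
```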

Related articles
Delete and recover packages
Publish and download Universal Packages
Upstream sources
Package notifications
Article • 11/11/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Notifications are a great tool to stay informed about specific changes within your project. In Azure Artifacts, when you follow a package, you'll be notified every time a new version of that package is published. A notification is sent to your preferred email address with details about the newly published version. The preferred email address is usually the email address you signed in to Azure DevOps with, but you can change it from your profile page.

Follow a package
1. Select Artifacts, and then select your feed.

2. Select the package you want to follow, and then select Follow to start getting
notified whenever a new version is published.

Views notifications
Aside from getting notifications when a new package version is published, you can also set up alerts to be notified when a package is promoted to a view. This can be helpful for filtering alerts and receiving only specific notifications, especially in a busy development cycle when numerous packages are being published.

1. Select Artifacts, and then select your feed.

2. Filter to a specific view (for example, @Prerelease or @Release).


3. Select the package you want to follow, and then select Follow.

Follow a package with personal notifications


In addition to following a package from the Feeds page, you can also manage your
personal notifications and set up alerts to be notified about a specific package or set of
packages from the Notification Settings page.

1. Navigate to your Azure DevOps organization: https://dev.azure.com/<YOUR_ORGANIZATION>.

2. Select User settings, and then select Notifications.

3. Select New subscription, and then select Artifacts > A package is changed. Select
Next when you are done.
4. Add a Description and then select an email address for the notifications to be
delivered to. By default, your preferred email address is used. You can also add
filters to only receive notifications when a set of criteria are met.

5. Select Finish when you are done.

Note
You must be a member of the Project Administrators group or the Project Collection
Administrators group if you want to create project notifications.

Related articles
Share your Artifacts with package badges
Limits on package sizes and counts
Delete and recover packages
Get started with notifications in Azure DevOps
Manage team, group, and global notifications
Get started with NuGet packages in
Azure Artifacts
Article • 06/09/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts enables developers to publish and download NuGet packages from
different sources such as feeds and public registries. With Azure Artifacts, you can create
feeds that can be either private, allowing you to share packages with your team and
specific users, or public, enabling you to share them openly with anyone on the internet.

In this article, you'll learn how to:

- Create a new feed
- Set up your project and connect to your feed
- Publish NuGet packages
- Download packages from your feed

Prerequisites
An Azure DevOps organization and a project. Create an organization or a project if
you haven't already.

Install the latest NuGet version.

Install the Azure Artifacts Credential Provider.

Create a feed
Azure Artifacts offers two types of feeds: project-scoped feeds and organization-scoped feeds. If you want to create a public feed, begin by creating a project-scoped feed, and then adjust the visibility settings of the project hosting your feed to public. This will effectively make your project-scoped feed accessible to the public.

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select Create Feed.


3. Provide a descriptive Name for your feed and specify its Visibility (determining
who can view packages within the feed). Additionally, configure the Upstream
sources and specify the Scope of your feed (project-scoped or organization-
scoped).

4. Select Create when you're done.


Note

When creating a new feed, the default access level for the Project Collection Build Service (organization-scoped) and the project-level Build Service (project-scoped) is set to Collaborator.

Connect to feed
1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select your feed from the dropdown menu.

3. Select Connect to feed.

4. Select NuGet.exe. If this is the first time using Azure Artifacts with NuGet.exe, ensure that you have installed all the prerequisites.

5. Follow the instructions provided in the Project setup section to configure your
nuget.config file.
Download packages

1. Get the feed's source URL

1. Navigate to your project, and then select Artifacts then select your feed.

2. Select Connect to feed, and then select Visual Studio from the left navigation
panel.

3. Copy your Source URL.


2. Set up Visual Studio

Windows

1. In Visual Studio, select Tools, and then Options.

2. Expand the NuGet Package Manager section, and then select Package
Sources.

3. Enter the feed's Name and the Source URL, and then select the green (+) sign
to add a source.

4. If you enabled upstream sources in your feed, clear the nuget.org checkbox.

5. Select OK when you're done.

3. Download packages

1. In Visual Studio, right-click on your project, and then select Manage NuGet
Packages.

2. Select Browse, and then select your feed from the Package source dropdown
menu.
3. Use the search bar to search for packages from your feed.

Note

Using NuGet Package Explorer to search for packages in upstreams is not supported.

Publish packages
Run the following command to publish your package to your feed. You can use any
string for the ApiKey argument.

Command

nuget.exe push <PACKAGE_PATH> -Source <SOURCE_NAME> -ApiKey <ANY_STRING>
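For illustration, with a source named MyFeedSource registered in your nuget.config, the assembled command would look like the following. This is a dry-run sketch that only prints the command (all names are hypothetical); remove the echo to actually publish:

```shell
# Hypothetical source and package names
SOURCE_NAME="MyFeedSource"
PACKAGE_PATH="MyPackage.1.0.0.nupkg"

# Print the push command; drop the leading echo to run it for real
echo nuget.exe push "$PACKAGE_PATH" -Source "$SOURCE_NAME" -ApiKey AZ
# prints: nuget.exe push MyPackage.1.0.0.nupkg -Source MyFeedSource -ApiKey AZ
```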

Related articles
Publish NuGet packages with Azure Pipelines
Publish packages to NuGet.org
NuGet.org upstream source
Connect to Azure Artifacts feeds
(NuGet.exe)
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Project setup
1. Select Artifacts and then select your feed.

2. Select Connect to feed.

3. Select NuGet.exe from the left panel.

4. If this is your first time using Azure Artifacts with NuGet.exe, select Get the tools in
the top-right corner and follow the instructions to download and install NuGet and
Azure Artifacts Credential Provider.

5. Follow the instructions in the Project setup to set up your nuget.config file.
Note

Azure Artifacts Credential Provider is supported with NuGet 4.8.2 or later. See
Azure Artifacts Credential Provider for more information.

Legacy project setup (NuGet v2)


1. Select Artifacts and then select your feed.

2. Select Connect to feed.

3. Select NuGet.exe from the left panel.

4. Copy your source URL, and then replace /v3/index.json with /v2 .

5. Create a Personal Access Token. Scope your PAT to the organization(s) you want to
access and to one of the following scopes: Packaging (read), Packaging (read and
write), or Packaging (read, write, and manage).

6. Run the following command in an elevated command prompt window to add your
package source:

Command

nuget sources add -name <Feed_Name> -source <Feed_URL> -username <Any_String_But_Not_Null> -password <Personal_Access_Token>

7. If your organization is connected to Azure Active Directory, you must first authenticate with your AD credentials and then add your personal access token using the setapikey command:

Command

nuget sources add -name <Feed_Name> -source <Feed_URL> -username <Azure_Active_Directory_UserName> -password <Azure_Active_Directory_Password>

nuget setapikey <Personal_Access_Token> -source <Feed_URL>
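The URL substitution in step 4 can be sketched in the shell; the organization and feed names below are hypothetical:

```shell
# Hypothetical v3 source URL copied from the Connect to feed dialog
V3_URL="https://pkgs.dev.azure.com/fabrikam/_packaging/SharedPackages/nuget/v3/index.json"

# Replace the /v3/index.json suffix with /v2 for the legacy endpoint
V2_URL="${V3_URL%/v3/index.json}/v2"
echo "$V2_URL"
# prints: https://pkgs.dev.azure.com/fabrikam/_packaging/SharedPackages/nuget/v2
```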

Related articles
Publish NuGet packages with Azure Pipelines
Publish NuGet packages from the command line (NuGet.exe)
Publish NuGet packages from the command line (dotnet)
NuGet.org upstream source
Connect to Azure Artifacts feeds
(dotnet)
Article • 05/30/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Project setup
1. Select Artifacts and then select your feed from the dropdown menu.

2. Select Connect to feed.

3. Select dotnet from the NuGet section.

4. If this is your first time using Azure Artifacts with dotnet, select Get the tools in the
top-right corner and then:
a. Download and install the .NET Core SDK.
b. Download and install the Azure Artifacts Credential Provider.

5. Create a nuget.config file in the same folder as your .csproj or .sln file. Copy and paste the following snippet into your new file:

Project-scoped feed:

XML

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="<FEED_NAME>" value="https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json" />
  </packageSources>
</configuration>

Organization-scoped feed:

XML

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="<FEED_NAME>" value="https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json" />
  </packageSources>
</configuration>
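If you set up many projects, the steps above can be scripted. This sketch writes a project-scoped nuget.config next to the project file; the organization, project, and feed names are hypothetical placeholders:

```shell
# Hypothetical names for illustration
ORG="fabrikam"
PROJECT="WebApp"
FEED="SharedPackages"

# Write a project-scoped nuget.config in the current folder
cat > nuget.config <<EOF
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="${FEED}" value="https://pkgs.dev.azure.com/${ORG}/${PROJECT}/_packaging/${FEED}/nuget/v3/index.json" />
  </packageSources>
</configuration>
EOF

# Show the package source that dotnet will use
grep -o 'https://[^"]*' nuget.config
# prints: https://pkgs.dev.azure.com/fabrikam/WebApp/_packaging/SharedPackages/nuget/v3/index.json
```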

Related articles
Publish packages to NuGet.org
Publish NuGet packages with Azure Pipelines
NuGet.org upstream source
Publish and restore NuGet packages
from the command line (NuGet.exe)
Article • 07/13/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts, you can publish your NuGet packages to public or private feeds
and share them with others based on your feed's visibility settings. This article will guide
you through connecting to Azure Artifacts and publishing and restoring your NuGet
packages.

Prerequisites
An Azure DevOps organization and a project. Create an organization or a project if
you haven't already.

An Azure Artifacts feed. Create a new feed if you don't have one already.

The Azure Artifacts Credential Provider.

The latest NuGet version.

Connect to feed
1. Select Artifacts, and then select your feed from the dropdown menu.

2. Select Connect to feed.

3. Select NuGet.exe.
4. Follow the instructions in Project setup to set up your nuget.config file.

Publish packages
Run the following command to publish your packages to your feed. Replace the
placeholders with the appropriate information:

Command

nuget push <PACKAGE_PATH> -src https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json -ApiKey <ANY_STRING>

Note

The ApiKey is required, but you can use any arbitrary value when pushing to Azure
Artifacts feeds.

Example:

Command

nuget push MyPackage.5.0.2.nupkg -src https://pkgs.dev.azure.com/MyOrg/MyProject/_packaging/MyFeed/nuget/v3/index.json -ApiKey AZ

Publish packages from external sources


1. Create a personal access token (PAT) with packaging read and write scope.

2. Add your package source to your nuget.config file. This will add your PAT to your
nuget.config file. Store this file in a safe location, and make sure that you don't
check it into source control. See NuGet sources for more details.

Command

nuget sources Add -Name <SOURCE_NAME> -Source https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json -UserName <USER_NAME> -Password <PERSONAL_ACCESS_TOKEN> -config <PATH_TO_NUGET_CONFIG_FILE>

3. Publish your package. See NuGet push for more details.

Command

nuget push <PACKAGE_PATH> -src <SOURCE_NAME> -ApiKey <ANY_STRING>


Example:

Command

nuget sources Add -Name "MySource" -Source https://pkgs.dev.azure.com/MyOrg/MyProject/_packaging/MyFeed/nuget/v3/index.json -UserName MyUserName -Password YourPersonalAccessToken -config ./nuget.config

nuget push nupkgs/mypackage.1.1.8.nupkg -src MySource -ApiKey AZ

Note

If your organization is using a firewall or a proxy server, make sure you allow Azure
Artifacts Domain URLs and IP addresses.

Restore packages
Run the following command to restore your packages:

Command

nuget.exe restore

Related articles
Publish packages to NuGet.org
Set up upstream sources.
Publish NuGet packages with Azure Pipelines.
Publish and restore NuGet packages
from the command line (dotnet)
Article • 07/13/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts, you can publish and restore your NuGet packages to/from your
feed and share them with others based on your feed's visibility settings. This article will
guide you through setting up your project to publish and restore your packages using
the dotnet command-line interface.

Prerequisites
An Azure DevOps organization. Create an organization, if you don't have one
already.

An Azure Artifacts feed. Create a new feed if you don't have one already.

The Azure Artifacts Credential Provider.

Download and install the .NET SDK.

Connect to feed
1. Select Artifacts, and then select your feed from the dropdown menu.

2. Select Connect to feed.

3. Select dotnet from the NuGet section.


4. Create a nuget.config file in the same folder as your .csproj or .sln file. Copy the
following XML snippet and paste it into your new file:

Organization-scoped feed:

XML

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="<FEED_NAME>" value="https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json" />
  </packageSources>
</configuration>

Project-scoped feed:

XML

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="<FEED_NAME>" value="https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json" />
  </packageSources>
</configuration>

Publish packages
To publish a package to your feed, run the following command in an elevated command
prompt. Replace the placeholders with the appropriate information:

Command

dotnet nuget push <PACKAGE_PATH> --source https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json --api-key <ANY_STRING>

Note

The api-key is required, but you can use any arbitrary value when pushing to Azure Artifacts feeds.


Example:

Command

dotnet nuget push MyPackage.5.0.2.nupkg --source https://pkgs.dev.azure.com/MyOrg/MyProject/_packaging/MyFeed/nuget/v3/index.json --api-key AZ

Publish packages from external sources


1. Create a personal access token (PAT) with packaging read and write scope.

2. Add your package source to your nuget.config file. This will add your PAT to your
nuget.config file. Make sure to store this file in a safe place, and do not check this
file into source control.

Command

dotnet nuget add source https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json --name <SOURCE_NAME> --username <USER_NAME> --password <PERSONAL_ACCESS_TOKEN> --configfile <PATH_TO_NUGET_CONFIG_FILE>

3. Publish your package:

Command

dotnet nuget push <PACKAGE_PATH> --source <SOURCE_NAME> --api-key <ANY_STRING>

Example:

Command

dotnet nuget add source https://pkgs.dev.azure.com/MyOrg/MyProject/_packaging/MyFeed/nuget/v3/index.json --name MySource --username MyUserName --password MyPersonalAccessToken --configfile ./nuget.config
dotnet nuget push nupkgs/mypackage.1.1.0.nupkg --source MySource --api-key AZ

7 Note
If your organization is using a firewall or a proxy server, make sure you allow Azure
Artifacts Domain URLs and IP addresses.

Restore packages
To restore your packages, run the following command in an elevated command prompt.
The --interactive flag is used to prompt the user for credentials.

Command

dotnet restore --interactive

Related articles
Connect to Azure Artifacts feeds (NuGet.exe)
Publish packages with Azure Pipelines (YAML/Classic)
Publish packages to NuGet.org
Publish packages to NuGet.org
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

NuGet.org is a public package registry that enables developers to share reusable code. A NuGet package is a compressed file with the .nupkg extension that contains compiled code that can be consumed in other projects. Packages hosted on NuGet.org are available to all developers around the world.

Prerequisites
Any edition of Visual Studio 2019 with the .NET Core workload.
dotnet CLI. If you don't have it already, install the .NET Core SDK .
Register for a free account on nuget.org if you don't have one already.

Create a project
You can use your own .NET project to build and generate a NuGet package, or create a
new basic .NET class library as follows:

1. In Visual Studio, select File, New, then Project.

2. Select the Class Library (.NET Standard) template and select Next.

3. Name your project and your solution, then select a location to save the project locally. Select Create when you're done.

The template class library is sufficient to create a NuGet package, so for this tutorial we'll use the existing template and won't write any additional code.

Set up and generate a package


1. Select your project from the solution explorer, right-click and select properties
then Package.

2. Fill out the form and make sure that your package ID is unique; otherwise it may conflict with existing packages on NuGet.org. A common naming convention is something like Company.Product.Feature. If you want to generate your package every time you build your project, select the Generate NuGet package on build checkbox.
3. Select your project from the solution explorer, right-click then select Pack to
generate your .nupkg package.
4. Check the status of the pack command in the output window.

Generate an API key


Now that we created our nupkg package, we are almost ready to publish it, but first we
need to generate an API key to connect to the NuGet.org API.

1. Sign in to your NuGet.org account or create one if you haven't.

2. Select your user name icon then select API Keys.


3. Select Create then enter a name for your key. Give your key a Push new packages
and package version scope, and enter * in the glob pattern field to select all
packages. Select Create when you are done.

4. Select Copy and save your API key in a secure location. We will need this key to
publish our NuGet package.

Publish a package to NuGet.org


You can publish your package using the web UI, dotnet CLI, or nuget.exe CLI. We are
going to focus on publishing packages by using the command line in this section. You
will need the name of your package, an API key, and the source URL to do so.
dotnet CLI

1. In an elevated command prompt, navigate to the folder containing your .nupkg package.

2. Run the following command to publish your package to NuGet.org. Replace the placeholders with your package name and API key.

Command

dotnet nuget push <packageName> --api-key <APIKey> --source https://api.nuget.org/v3/index.json

3. Check the output of the command to verify that your package was pushed successfully.

Related articles
Consume NuGet packages in Visual Studio
Get started with NuGet packages and Azure Artifacts
Publish NuGet packages with Azure Pipelines
Publish NuGet packages with Azure
Pipelines (YAML/Classic)
Article • 05/24/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

In Azure Pipelines, you can use the classic editor or the YAML tasks to publish your
NuGet packages within your pipeline, to your Azure Artifacts feed, or to public registries
such as nuget.org.

Create a NuGet package


There are various ways to create your NuGet packages such as using Visual Studio to
pack your NuGet packages. If you're already using MSBuild or some other task to create
your packages, skip this section and jump to the publish NuGet packages section.

YAML

To create a NuGet package, add the following snippet to your pipeline YAML file.
See NuGet task for more details.

YAML

- task: NuGetCommand@2
  inputs:
    command: pack
    packagesToPack: '**/*.csproj'
    packDestination: '$(Build.ArtifactStagingDirectory)'

packagesToPack: the pattern to search for csproj files to pack.

packDestination: the directory where the packages are created.

Package versioning
NuGet packages are distinguished by their names and version numbers. Employing
Semantic Versioning is a recommended strategy for effectively managing package
versions. Semantic versions consist of three numeric components: Major, Minor, and
Patch.

The Patch is usually incremented after fixing a bug. When you release a new backward-
compatible feature, you increment the Minor version and reset the Patch version to 0,
and when you make a backward-incompatible change, you increment the Major version
and reset the Minor and Patch versions to 0.
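These increment rules can be sketched in plain shell; the starting version 1.4.2 below is just an illustration:

```shell
# Illustration of Semantic Versioning increments, starting from an example version
version="1.4.2"
major=${version%%.*}          # 1
rest=${version#*.}
minor=${rest%%.*}             # 4
patch=${rest#*.}              # 2

patch_release="$major.$minor.$((patch + 1))"    # bug fix          -> 1.4.3
minor_release="$major.$((minor + 1)).0"         # new feature      -> 1.5.0
major_release="$((major + 1)).0.0"              # breaking change  -> 2.0.0

echo "$patch_release $minor_release $major_release"   # prints: 1.4.3 1.5.0 2.0.0
```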

With Semantic Versioning, you can also use prerelease labels to tag your packages. To do so, enter a hyphen followed by your prerelease tag, for example: 1.0.0-beta. Semantic Versioning is supported in Azure Pipelines and can be configured in your NuGet task as follows:

Use the date and time (Classic): byPrereleaseNumber (YAML). Your package
version will be in the format: Major.Minor.Patch-ci-datetime where you have the
flexibility to choose the values of your Major, Minor, and Patch.

Use an environment variable (Classic): byEnvVar (YAML). Your package version will
be set to the value of the environment variable you specify.

Use the build number (Classic): byBuildNumber (YAML). Your package version will be set to the build number. Make sure you set your build number format under your pipeline Options to $(BuildDefinitionName)_$(Year:yyyy).$(Month).$(DayOfMonth)$(Rev:.r). To do this in YAML, add a name: property at the root of your pipeline and set it to your format.
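As a minimal sketch, a YAML pipeline that sets this run number format at the root and then packs by build number might look like the following:

```yaml
# Sets the run (build) number format; byBuildNumber then uses it as the package version
name: $(BuildDefinitionName)_$(Year:yyyy).$(Month).$(DayOfMonth)$(Rev:.r)

steps:
- task: NuGetCommand@2
  inputs:
    command: pack
    versioningScheme: byBuildNumber
```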

The following example shows how to use the date and time versioning option. This will
generate a SemVer compliant version formatted as: Major.Minor.Patch-ci-datetime .

YAML

YAML

variables:
  Major: '1'
  Minor: '0'
  Patch: '0'

steps:
- task: NuGetCommand@2
  inputs:
    command: pack
    versioningScheme: byPrereleaseNumber
    majorVersion: '$(Major)'
    minorVersion: '$(Minor)'
    patchVersion: '$(Patch)'

7 Note

DotNetCore and DotNetStandard packages should be packaged with the


DotNetCoreCLI@2 task to avoid System.InvalidCastExceptions. See the .NET Core CLI

task for more details.

YAML

- task: DotNetCoreCLI@2
  inputs:
    command: pack
    versioningScheme: byPrereleaseNumber
    majorVersion: '$(Major)'
    minorVersion: '$(Minor)'
    patchVersion: '$(Patch)'

Publish NuGet packages


To publish packages to an Azure Artifacts feed from your pipeline, you must set the
Project Collection Build Service identity to be a Contributor on your feed. See
Configure feed settings for more details.

YAML

YAML

steps:
- task: NuGetAuthenticate@0
  displayName: 'NuGet Authenticate'
- task: NuGetCommand@2
  displayName: 'NuGet push'
  inputs:
    command: push
    publishVstsFeed: '<projectName>/<feed>'
    allowPackageConflicts: true

To publish a package to an external NuGet feed, you must first create a service
connection to connect to that feed. You can do this by going to Project settings >
Service connections > New service connection. Select NuGet, and then select
Next. Fill out the form and then select Save when you're done. See Manage service
connections for more details.

To publish a package to an external NuGet feed, add the following snippet to your
YAML pipeline.

Using the Command line task (with NuGet.exe):

YAML

- task: NuGetAuthenticate@1
  inputs:
    nuGetServiceConnections: <NAME_OF_YOUR_SERVICE_CONNECTION>
- script: |
    nuget push <PACKAGE_PATH> -src https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json -ApiKey <ANY_STRING>
  displayName: "Push"

Using the Command line task (with dotnet):

YAML

- task: NuGetAuthenticate@1
  inputs:
    nuGetServiceConnections: <NAME_OF_YOUR_SERVICE_CONNECTION>
- script: |
    dotnet build <CSPROJ_PATH> --configuration <CONFIGURATION>
    dotnet pack <CSPROJ_PATH> -p:PackageVersion=<YOUR_PACKAGE_VERSION> --output <OUTPUT_DIRECTORY> --configuration <CONFIGURATION>
    dotnet nuget push <PACKAGE_PATH> --source https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json --api-key <ANY_STRING>
  displayName: "Build, pack and push"

7 Note

The ApiKey is required, but you can use any arbitrary value when pushing to
Azure Artifacts feeds.

Publish to NuGet.org
1. Generate an API key
2. Navigate to your Azure DevOps project and then select Project settings.

3. Select Service Connections, and then select New service connection.

4. Select NuGet, and then select Next.

5. Select ApiKey as your authentication method. Use the following URL for your Feed URL: https://api.nuget.org/v3/index.json .

6. Enter the ApiKey you generated earlier, and then enter a Service connection
name.

7. Select Grant access permission to all pipelines, and then select Save when you're
done. To select this option, you'll need the service connection Administrator role.

YAML

Add the following YAML snippet to your pipeline definition:

yml

steps:
- task: NuGetCommand@2
  displayName: 'NuGet push'
  inputs:
    command: push
    nuGetFeedType: external
    publishFeedCredentials: nuget.org

Related articles
Publish npm packages with Azure Pipelines
Publish and download Universal Packages in Azure Pipelines
Releases in Azure Pipelines
Release artifacts and artifact sources
Install NuGet packages with Visual
Studio
Article • 01/27/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Using Azure Artifacts and Visual Studio, you can set up your development machine to
access and install packages from your feeds and public registries such as NuGet.org.

Get source URL


To set up Visual Studio to access your feed as a package source, we must first get the
Source URL:

1. Navigate to your project, and then select Artifacts then select your feed.

2. Select Connect to feed, and then select Visual Studio from the left navigation
panel.

3. Copy your Source URL.

Set up Visual Studio


Windows
1. Open Visual Studio, and then select Tools > Options.

2. Select NuGet Package Manager, and then select Package Sources.

3. Enter your feed's Name and the Source URL you copied in the previous step,
and then select the green (+) sign to add a new package source.

4. If you enabled upstream sources in your feed, clear the nuget.org checkbox.

5. Select OK when you're done.

Install packages from your feed


Now that you've set up Visual Studio and added a new package source pointing to your feed, you can search for and install packages right from the Visual Studio package manager.

1. Open Visual Studio, and then right-click on your project in the Solution Explorer,
then select Manage NuGet Packages....

2. Select Browse, and then select your feed from the Package source dropdown
menu.
3. Use the search bar to look for packages in your feed.

7 Note

If you enabled upstream sources, any packages that haven't been saved to your feed (by using them at least once) won't appear in the search results.

Install packages from NuGet.org


1. Navigate to NuGet.org and search for the package you want to install.

2. Select Package Manager, and then copy the Install-Package command.

3. Open Visual Studio, and then select Tools > NuGet Package Manager > Package
Manager Console to open the package manager console.

4. Paste the install command into the Package Manager Console and then press
Enter.

Related articles
Publish NuGet packages with Azure Pipelines (YAML/Classic)
Publish and restore NuGet packages from the command line (NuGet.exe)
Publish packages to NuGet.org
Configure upstream sources
NuGet.org upstream source
Article • 04/17/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Enabling upstream sources on your feed allows developers to consume packages from public registries such as nuget.org and npmjs.com. In this article, you'll learn how to add the NuGet Gallery upstream source to consume NuGet packages from the nuget.org public registry.

Add NuGet Gallery upstream source


1. Select Artifacts, and then select your feed.

2. Select the gear icon button to navigate to Feed settings.

3. Select Upstream Sources, and then select Add Upstream.

4. Select Public source.


5. Select NuGet Gallery from the dropdown menu. Select Save when you are done.

7 Note

The service index location for nuget.org is https://api.nuget.org/v3/index.json .
6. Select Save at the top right corner to save your changes.

Update nuget.config
1. Select Artifacts, and then select your feed.

2. Select Connect to feed, and then select NuGet.exe.

3. Copy the XML snippet in the Project Setup section.

XML

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="<FEED_NAME>" value="https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json" />
  </packageSources>
</configuration>

4. Create a new nuget.config file in the root of your project.

5. Paste the XML snippet in your nuget.config file.

View saved packages


You can view the packages you saved from the NuGet Gallery by selecting your Source
from the dropdown menu.
Related articles
Publish NuGet packages with Azure Pipelines
Publish packages to NuGet.org
Upstream sources overview
Quickstart: Use GitHub Actions to push
to Azure Artifacts
Article • 06/06/2023

Azure DevOps Services

Get started using GitHub Actions and Azure Artifacts together. GitHub Actions help
you automate your software development workflows from within GitHub. You can use
GitHub Actions to deploy to an Azure Artifacts feed.

Prerequisites
A GitHub account with a repository. Join GitHub and create a repository .
An Azure Artifacts feed that you'll push your NuGet package to from a GitHub workflow. Get Started with NuGet Packages.
An Azure DevOps personal access token (PAT) to use with your GitHub action.
Create a PAT.
Your PAT needs to have read, write, and manage Packaging permissions.

Authenticate with Azure Pipelines


Use a personal access token (PAT) to connect your GitHub account to Azure DevOps.
You can generate a PAT from within Azure DevOps and then store it as a GitHub secret.
Within your GitHub workflow, reference the secret so that your GitHub action can
authenticate with your Azure DevOps project.

1. Open your GitHub repository and go to Settings.

2. Select Security > Secrets and variables > Actions.

3. Select New repository secret, paste in your PAT, and give it the name AZURE_DEVOPS_TOKEN .

4. Select Add secret.


Create a GitHub workflow that builds an
artifact
GitHub workflows are a series of actions (like tasks in Azure Pipelines). This workflow:

Sets up a .NET Core CLI environment with the setup-dotnet action .


Restores dependencies, builds the project and its dependencies into a set of
binaries, and runs all unit tests associated with the project.
Packs the code into a NuGet package with the GitHub Run ID environmental
variable included in the version number.
Publishes the NuGet package to Azure Artifacts.
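The version-suffix step in the workflow simply appends the GitHub run ID as a prerelease label. As a sketch (the run ID value below is made up; GitHub Actions sets GITHUB_RUN_ID automatically at run time):

```shell
# Made-up run ID for illustration; in a real workflow GitHub provides GITHUB_RUN_ID
GITHUB_RUN_ID=5071916352
base_version="1.0.0"

# dotnet pack --version-suffix produces <base_version>-<suffix>
package_version="${base_version}-${GITHUB_RUN_ID}"
echo "$package_version"   # prints: 1.0.0-5071916352
```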

1. In your repository on GitHub, create a new YAML file in the .github/workflows directory.

2. Copy the following contents into your YAML file. Customize the
AZURE_ARTIFACTS_FEED_URL , BUILD_CONFIGURATION , and DOTNET_VERSION values.

Set AZURE_ARTIFACTS_FEED_URL to the registry url for your Azure Artifacts Feed.
Set the BUILD_CONFIGURATION .
Set DOTNET_VERSION to the version of your project.

YAML

name: Push a NuGet package to Azure Artifacts or GitHub Package Registry

on:
  push:
    branches:
      - main

env:
  AZURE_ARTIFACTS_FEED_URL: https://pkgs.dev.azure.com/myorg/nuget-artifact/_packaging/Fabrikam_Feed/nuget/v3/index.json
  BUILD_CONFIGURATION: 'Release'    # set this to the appropriate build configuration
  DOTNET_VERSION: '6.x'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Checkout the repo
      - uses: actions/checkout@v2

      # Setup .NET Core SDK
      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}

      # Run dotnet build and package
      - name: dotnet build and test
        run: |
          dotnet restore
          dotnet build --configuration '${{ env.BUILD_CONFIGURATION }}'
          dotnet test --configuration '${{ env.BUILD_CONFIGURATION }}'

  az-artifacts-build-and-deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      # Checkout the repo
      - uses: actions/checkout@v2

      # Setup .NET Core SDK
      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          source-url: ${{ env.AZURE_ARTIFACTS_FEED_URL }}
        env:
          NUGET_AUTH_TOKEN: ${{ secrets.AZURE_DEVOPS_TOKEN }}

      # Run dotnet build and package
      - name: dotnet build and publish
        run: |
          dotnet restore
          dotnet build --configuration '${{ env.BUILD_CONFIGURATION }}'
          dotnet pack -c '${{ env.BUILD_CONFIGURATION }}' --version-suffix $GITHUB_RUN_ID

      # Publish the package to Azure Artifacts
      - name: 'dotnet publish'
        run: dotnet nuget push --api-key AzureArtifacts bin/Release/*.nupkg

3. Go to your Azure Artifacts feed to verify that you see the package you pushed.

Clean up resources
If you're not going to continue to use your GitHub workflow, disable the workflow .

Next steps
Deploy to Azure using GitHub Actions
Migrate your packages from MyGet to
Azure Artifacts
Article • 06/23/2023

Azure DevOps Services

Using the AzureArtifactsPackageMigration PowerShell module, you can easily migrate your NuGet packages to Azure Artifacts. This article will walk you through an example of migrating NuGet packages from MyGet to Azure Artifacts.

In this article, you'll learn how to:

Install the AzureArtifactsPackageMigration PowerShell module.
Connect to Azure Artifacts feeds.
Migrate to Azure Artifacts.

Prerequisites
An Azure DevOps organization and a project. Create an organization or a project if
you haven't already.

An Azure Artifacts feed. Create a new feed if you don't have one already.

Install Azure Artifacts Credential Provider .

Install NuGet CLI.

Install PowerShell.

A personal access token to authenticate with Azure DevOps.

Install PowerShell module


Using the command line interface, run the provided commands to install and import the
PowerShell module. You can also download the migration scripts directly from the
azure-artifacts-migration GitHub repository.

Windows

1. Open a PowerShell prompt window.


2. Run the following commands to install the AzureArtifactsPackageMigration
PowerShell module and import it into your current session.

PowerShell

Install-Module -Name AzureArtifactsPackageMigration -Scope CurrentUser -Force
Import-Module -Name AzureArtifactsPackageMigration

Migration setup
To migrate your packages, you'll need to get the source URLs for both the source feed
(MyGet) and destination feed (Azure Artifacts).

Azure Artifacts
1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, select your feed from the dropdown menu and then select
Connect to feed.

3. Select NuGet.exe and then copy your feed's source URL.


Project-scoped feed:

command

https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json

Organization-scoped feed:

command

https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json

MyGet

1. Sign in to your MyGet Account.

2. Navigate to the feed you wish to migrate.

3. Select Feed Details.

4. Select Packages and then copy your NuGet V3 feed URL.

https://www.myget.org/F/<FEED_NAME>/api/v3/index.json

Migrate packages
If your MyGet feed is private, you'll need to create a password to authenticate. You can
skip the first step if your MyGet feed is public.

1. Run the following command to convert your password to a secure string.


PowerShell

$password = ConvertTo-SecureString -String '<YOUR_PASSWORD>' -AsPlainText -Force

2. Run the following command to migrate your packages to Azure Artifacts.

Migrate from a private MyGet feed:

PowerShell

Move-MyGetNuGetPackages -SourceIndexUrl '<MYGET_SOURCE_URL>' -DestinationIndexUrl '<ARTIFACTS_FEED_SOURCE_URL>' -DestinationPAT '<AZURE_DEVOPS_PAT>' -DestinationFeedName '<ARTIFACTS_FEED_NAME>' -SourceUsername '<MYGET_USERNAME>' -SourcePassword $password -Verbose

Migrate from a public MyGet feed:

PowerShell

Move-MyGetNuGetPackages -SourceIndexUrl '<MYGET_SOURCE_URL>' -DestinationIndexUrl '<ARTIFACTS_FEED_SOURCE_URL>' -DestinationPAT '<AZURE_DEVOPS_PAT>' -DestinationFeedName '<ARTIFACTS_FEED_NAME>' -Verbose

Related articles
Publish and restore NuGet packages (NuGet.exe)

Publish and restore NuGet packages (dotnet)

Publish packages to NuGet.org


Get started with npm packages in Azure
Artifacts
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts, you can publish and download npm packages from feeds and
public registries such as npmjs.com. This quickstart will guide you through creating your
own feed, setting up your project, and publishing and downloading npm packages to
and from your Azure Artifacts feed.

Create a feed
A feed is an organizational construct that allows users to store their packages and control who can access them. Azure Artifacts supports storing several package types in a single feed, such as NuGet, npm, Maven, Python, and Universal packages.

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select Create Feed.

3. Provide a descriptive Name for your feed and specify its Visibility (determining who can view packages within the feed). Additionally, configure the Upstream sources and specify the Scope of your feed (project-scoped or organization-scoped).

4. Select Create when you're done.

7 Note

When creating a new feed, the default access level for the Project Collection Build Service (organization-scoped) and the project-level Build Service (project-scoped) is set to Collaborator.

Set up your .npmrc files

7 Note

vsts-npm-auth is not supported in TFS and Azure DevOps Server.


We recommend having two .npmrc files. The first one should be placed in the same directory as your package.json file. The second one should be placed in the $home directory (Linux/macOS) or $env.HOME (Windows) to store your credentials. The npm client will then be able to look up this file and fetch your credentials for authentication. This enables you to share your config file while keeping your credentials secure.
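As an illustration (with placeholder organization and feed names), the project-level .npmrc contains only the registry URL and no secrets:

```ini
; <project root>/.npmrc — safe to share with the team, no credentials
registry=https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/registry/
always-auth=true
```

The .npmrc in your home directory holds the matching credential entries (written by vsts-npm-auth on Windows, or added manually elsewhere) and should never be checked into source control.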

1. Select Artifacts, and then select Connect to feed.

2. Select npm. If this is your first time using Azure Artifacts, select Get the tools and
then follow the steps to download Node.js and set up the credential provider.

3. Follow the instructions in the Project setup to set up your project.

Set up authentication on your development machine

) Important

npm supports a single registry in your .npmrc file. Multiple registries are possible
with scopes and upstream sources.
Windows

If you're developing on Windows, we recommend using vsts-npm-auth to


authenticate with Azure Artifacts. Run npm install -g vsts-npm-auth to install the
package globally and then add a run script to your package.json.

JSON

"scripts": {
    "refreshVSToken": "vsts-npm-auth -config .npmrc"
}
Publish packages
To publish your npm package, run the following command in your project directory:

Command

npm publish

) Important

Using the publishConfig property to override the registry config param at publish time is not supported.

Restore packages
To restore an npm package, run the following command in your project directory:

Command

npm install --save <package>

Next steps
Publish npm packages (YAML/Classic)
Use packages from npmjs.com

Use npm scopes


Use npm audit
Publish npm packages (YAML/Classic)
Article • 01/06/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Using Azure Pipelines, you can publish your npm packages to Azure Artifacts feeds or to
public registries such as npmjs.com. In this article, you will learn how to publish your
npm packages using YAML and Classic pipelines.

Publish to Azure Artifacts feeds


YAML

7 Note

The Project Collection Build Service and your project's Build Service identity
must be set to Contributor to publish your packages to a feed using Azure
Pipelines. See Add new users/groups for more details.

YAML

- task: Npm@1
  inputs:
    command: publish
    publishRegistry: useFeed
    publishFeed: <FEED_NAME> ## For project-scoped feeds, use: <PROJECT_NAME>/<FEED_NAME>

publishRegistry: Options: useExternalRegistry, useFeed. Select useFeed to use a


feed within your organization.
publishFeed: Required when publishRegistry = useFeed. The feed you want to
publish to.

 Tip

Using the YAML editor to add the npm publish task will generate the project
and feed IDs for your publishFeed .
Publish to a public registry
To publish your packages to a public npm registry such as npmjs.com, you must first
create a service connection to connect to the desired external service.

1. Select Project settings, and then select Service connections.

2. Select Create service connection to create a new service connection.

3. Select npm and then select Next. Fill out the required fields, and then select Save
when you are done.

YAML

YAML

- task: Npm@1
  inputs:
    command: publish
    publishRegistry: useExternalRegistry
    publishEndpoint: '<NAME_OF_YOUR_SERVICE_CONNECTION>'

publishRegistry: Select useExternalRegistry to publish to a public registry.


Options: useExternalRegistry | useFeed.
publishEndpoint: required when publishRegistry == useExternalRegistry .
Replace the placeholder with the name of the service connection you created
earlier.
Related articles
Publish and download Artifacts in Azure Pipelines.
Publish npm packages from the command line.
Use packages from npmjs.com.
Publish and restore npm packages from
the command line
Article • 01/24/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Follow this quick tutorial to learn how to connect your npm client to your feed and
publish your packages using the command line. If you don't have a feed yet, you can
follow the steps in the quickstart to Create your own feed.

Project setup
1. Navigate to your project, select Artifacts, and then select Connect to feed.

2. Select npm from the left navigation panel, and then follow the instructions under
Project setup to configure your .npmrc file and connect to your feed. If this is your
first time using Azure Artifacts with npm on your machine, make sure you select
Get the tools to download and install the prerequisites.

Publish packages
1. Open a command prompt window and navigate to the directory that contains your
package.json. If you don't have a package.json file, run the following command:

Command
npm init

2. Run the following command in your project directory to publish your npm
packages:

Command

npm publish

Restore packages
1. Run the following command in your project directory to restore your npm
packages:

Command

npm install

Related articles
Use packages from npmjs.com
Publish npm packages with Azure Pipelines (YAML/Classic)
Use npm audit
Set up your project and connect to
Azure Artifacts
Article • 06/02/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts enables you to publish various package types to your feeds and install packages from both feeds and public registries like npmjs.com. Before we can authenticate with Azure Artifacts, we need to configure our .npmrc file, which stores the feed URLs and credentials that npm uses. This file can be used to customize the behavior of the npm client, such as setting up proxies, specifying default package locations, or configuring private package feeds. The .npmrc file is located in the user's home directory and can also be created at the project level to override the default settings. By editing the .npmrc file, users can customize their npm experience and make it more tailored to their needs.

Project setup
For best practice, we suggest using two separate configuration files. The first file is used to authenticate with Azure Artifacts, while the second file is stored locally and contains your credentials. To set up the second file, place it in your home directory on your development machine and include all of your registry credentials. With this approach, the npm client can easily retrieve your credentials for authentication, allowing you to share your configuration file while keeping your credentials secure. The following steps walk you through setting up the first configuration file:

Windows

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select Connect to feed.


3. Select npm from the left navigation pane.

4. If this is the first time using Azure Artifacts with npm, select Get the tools and
follow the instructions to install the prerequisites.

5. Follow the instructions in Project setup to set up your config file.

 Tip

Using multiple registries in .npmrc files is supported with scopes and upstream
sources.

7 Note

vsts-npm-auth is not supported in TFS and Azure DevOps Server.

Pipeline authentication
For authentication with your pipeline, Azure Artifacts recommends using the npm
authenticate task. When using a task runner like gulp or Grunt, it's important to add the
npm authenticate task to the beginning of your pipeline. By doing so, your credentials
are injected into your project's .npmrc file and persisted during the pipeline run,
enabling subsequent steps to use the credentials in the configuration file.
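For a YAML pipeline, an equivalent setup is sketched below (assuming your project's .npmrc sits at the repository root):

```yaml
steps:
# Inject feed credentials into the project .npmrc for the rest of the run
- task: npmAuthenticate@0
  inputs:
    workingFile: .npmrc

# Subsequent steps can now restore packages from the authenticated feed
- script: npm ci
  displayName: 'Install dependencies'
```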

Classic

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Azure Pipelines, and then select your pipeline definition.

3. Select Edit to modify your pipeline.

4. Select + to add a task to your pipeline.

5. Search for the npm Authenticate task, and then select Add to add it to your
pipeline.

6. Select your .npmrc file.


7. Select Save & queue when you're done.

7 Note

To grant your pipeline access to your feed, make sure you set the build service role
to Contributor in your feed settings.

7 Note

If your organization is using a firewall or a proxy server, make sure you allow the
appropriate domain URLs. For more information, please refer to the list of Allowed
IP addresses and domain URLs.
Troubleshoot

vsts-npm-auth is not recognized

If you encounter the following error while running your project:

Cmd: 'vsts-npm-auth' is not recognized as an internal or external command,


operable program or batch file.
PowerShell: vsts-npm-auth : The term 'vsts-npm-auth' is not recognized as the
name of a cmdlet, function, script file, or operable program.

Then it's likely that the npm modules folder hasn't been added to your PATH. To resolve this issue, rerun the Node.js setup and make sure you select the Add to PATH option.

Alternatively, you can add the npm modules folder to your PATH yourself: %APPDATA%\npm in Command Prompt or $env:APPDATA\npm in PowerShell.

Unable to authenticate

If you're running into an E401 error ( code E401 npm ERR! Unable to authenticate ), run the vsts-npm-auth command with the -F flag to force reauthentication:

Command
vsts-npm-auth -config .npmrc -F

Reset vsts-npm-auth

Follow the steps below to modify/reset your vsts-npm-auth credentials:

Uninstall vsts-npm-auth.

command

npm uninstall -g vsts-npm-auth

Clear your npm cache.

command

npm cache clean --force

Delete your .npmrc file.

Reinstall vsts-npm-auth.

command

npm install -g vsts-npm-auth --registry https://registry.npmjs.com --always-auth false

Related articles
Publish npm packages (YAML/Classic)
Use npm scopes
Use npm audit
Use packages from npmjs.com
Article • 04/17/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

The npm client is designed to work with a single primary registry (what Azure Artifacts
calls a feed). It also supports secondary scoped registries. Scoped registries can only be
used to install packages whose names begin with the scope prefix, so their usage is
more restrictive. If you want to use both private packages you've created and public
packages from npmjs.com, we recommend using upstream sources.

The npmjs.com upstream source allows you to merge the contents of npmjs.com into
your feed such that the npm client can install packages from both locations. Enabling
upstream sources also automatically enables saving of packages you use from the
upstream source. This is the recommended way to use Azure Artifacts with npm.
Upstreams give you the most flexibility to use a combination of scoped and non-scoped packages in your feed, as well as scoped and non-scoped packages from npmjs.com.

Enable npmjs.com as an upstream


You can use npmjs.com as an upstream source with new and existing feeds.

On a new feed
Create a new feed. Make sure you check the Include packages from common
public sources checkbox.
On an existing feed
1. Select Artifacts, and then select your feed.

2. Navigate to Feed settings by selecting the gear icon button.


3. Select Upstream sources, and then select Add Upstream.

4. Select Public source, and then select npmjs from the dropdown menu.

5. Select Save when you are done.

6. Select Save to save your changes.

Filter to saved packages


You can view the packages you saved from upstreams by selecting your Source from the
dropdown menu.

Scopes
Using scopes instead of upstream sources limits your private package consumption to packages with the @scope prefix (for example, @fabrikam/core ), but enables you to consume public packages directly from npmjs.com. See npm scopes for more details.
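As a sketch of the scoped approach (the @fabrikam scope name is illustrative, and the placeholders must be replaced with your own values), an .npmrc can route only scoped packages to your feed while everything else resolves from npmjs.com:

```npmrc
; private @fabrikam packages come from your Azure Artifacts feed
@fabrikam:registry=https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/registry/
; all other packages resolve from the public registry
registry=https://registry.npmjs.org/
always-auth=true
```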

Related articles
Publish npm packages (CLI)
Publish npm packages (YAML/Classic)
Use npm audit
Npm scopes
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Npm scopes are a way of grouping related packages together. A scope allows you to
create a package with the same name as other packages created by different users
without conflicts. Using scopes, you can separate public and private packages by adding
the scope prefix @SCOPE_NAME and configuring the .npmrc file to only use a feed for
that scope.
With Azure Artifacts, you can publish and download both scoped and non-scoped packages to and from feeds or public registries. Npm scopes are also useful with self-hosted on-premises servers that don't have internet access, because setting up upstream sources isn't possible in that case. Using scopes:

You don't have to worry about name collisions.

You don't need to change the npm registry in order to install or publish your packages.

Each npm organization/user has their own scope, and only the owner or the scope members can publish packages to their scope.

Project setup
1. Select Artifacts, and then select Connect to feed.

2. Select npm, and then select Other.

3. Add a .npmrc file in the same directory as your package.json, and paste the
following snippet into your file.

npmrc

registry=https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/registry/

always-auth=true

Set up credentials
1. Copy the following snippet into your .npmrc file.

Organization-scoped feed:

npmrc

; begin auth token
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/registry/:username=[ENTER_ANY_VALUE_BUT_NOT_AN_EMPTY_STRING]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/registry/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/registry/:email=npm requires email to be set but doesn't use the value
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/:username=[ANY_VALUE_BUT_NOT_AN_EMPTY_STRING]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/npm/:email=npm requires email to be set but doesn't use the value
; end auth token

Project-scoped feed:

npmrc

; begin auth token
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/npm/registry/:username=[ENTER_ANY_VALUE_BUT_NOT_AN_EMPTY_STRING]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/npm/registry/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/npm/registry/:email=npm requires email to be set but doesn't use the value
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/npm/:username=[ENTER_ANY_VALUE_BUT_NOT_AN_EMPTY_STRING]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/npm/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/npm/:email=npm requires email to be set but doesn't use the value
; end auth token

2. Generate a personal access token with Packaging > Read & write scopes.

3. Run the following command to encode your newly generated personal access
token. Paste your personal access token when prompted.

Command

node -e "require('readline').createInterface({input:process.stdin,output:process.stdout,historySize:0}).question('PAT> ',p => { b64=Buffer.from(p.trim()).toString('base64');console.log(b64);process.exit(); })"
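If you prefer not to use Node.js for this step, the same Base64 encoding can be produced with a short Python sketch (the token below is a placeholder, not a real personal access token):

```python
import base64

pat = "samplepat"  # placeholder; substitute your personal access token
encoded = base64.b64encode(pat.encode("utf-8")).decode("utf-8")
print(encoded)  # "samplepat" encodes to c2FtcGxlcGF0
```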

4. Open your .npmrc file and replace the placeholder [BASE64_ENCODED_PERSONAL_ACCESS_TOKEN] with the encoded personal access token you just created.

In your .npmrc file, replace registry=<YOUR_SOURCE_URL> with @SCOPE_NAME:registry=<YOUR_SOURCE_URL> . Make sure you add the scope and package names to your package.json file: { "name": "@SCOPE_NAME/PACKAGE_NAME" } .

npmrc

@[SCOPE_NAME]:registry=https://pkgs.dev.azure.com/[ORGANIZATION_NAME]/_packa
ging/[FEED_NAME]/npm/registry/

always-auth=true

package.json

"name": "[@SCOPE_NAME]/[PACKAGE_NAME]"

Upstream sources vs scopes


Upstream sources give you the most flexibility to use a combination of scoped and non-
scoped packages in your feed, as well as scoped and non-scoped packages from public
registries such as npmjs.com.

Scopes add another restriction when naming your packages: each package name must
start with @<scope> . If you want to publish your private packages to public registries,
you must do so with the scopes intact. If you remove package scopes when deploying
your packages, you'll need to update all the references in your package.json. With that in
mind, scopes can be a viable alternative to upstream sources.

Related articles
Use npm audit
Publish npm packages (YAML/Classic)
Use packages from npmjs.com
Use npm audit
Article • 10/04/2022

Azure DevOps Services

The npm audit command scans your project for security vulnerabilities and provides a detailed report of any identified anomalies. Performing security audits is an essential part of identifying and fixing vulnerabilities in the project's dependencies. Fixing these vulnerabilities could prevent things like data loss, service outages, and unauthorized access to sensitive information.

Azure DevOps does not support npm audit. If you try to run the default npm audit command from your pipeline, the task will fail with the following message: Unexpected end of JSON input while parsing....

As a workaround, you can run npm audit with the registry argument --registry=https://registry.npmjs.org/ . This will route the npm audit command directly to the public registry.

Warning

Running npm audit will forward all the packages' names from your package.json to
the public registry.

Run npm audit from your pipeline


Select the YAML or the Classic tab to learn how to run npm audit from your pipeline.

YAML

Add the following task to your yaml pipeline to only scan for security vulnerabilities.

YAML

steps:

- task: Npm@1

displayName: 'npm audit'

inputs:

command: custom

customCommand: 'audit --registry=https://registry.npmjs.org/'

To not only scan but also attempt to upgrade to non-vulnerable package versions, use audit fix instead:

YAML

steps:

- task: Npm@1

displayName: 'npm audit fix'

inputs:

command: custom

customCommand: 'audit fix --registry=https://registry.npmjs.org/ --package-lock-only'

command: the npm command to run.


customCommand: Required when command == custom.

Run npm audit on your development machine


To run npm audit locally, run the following command in a command prompt window:

Command

npm audit --registry=https://registry.npmjs.org/

To also attempt to upgrade to non-vulnerable package versions:

Command

npm audit fix --registry=https://registry.npmjs.org/ --package-lock-only

Related articles
npm quickstart.
Publish npm packages with Azure Pipelines.
Artifacts storage consumption
Delete and recover packages.
Get started with Maven packages and
Azure Artifacts
Article • 07/13/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

This quickstart will guide you through setting up your Maven project to connect to
Azure Artifacts feeds and publish and download your Maven packages.

Prerequisites
An Azure DevOps organization. Create an organization, if you don't have one
already.
Install Apache Maven .
An Azure Artifacts feed. Create a feed if you don't have one already.

Set up authentication
1. Select Artifacts, and then select Connect to Feed.

2. Select Maven.

3. If this is the first time using Azure Artifacts with Maven, select Get the tools to
download and install Maven.

4. Follow the instructions in the Project setup to set up your pom.xml and
settings.xml files. If your settings.xml file is shared within your team, you can use
Maven to encrypt your passwords .
 Tip

If you are using Maven task, set the mavenAuthenticateFeed argument to true to
automatically authenticate with your Maven feed.
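For example, a minimal YAML sketch of the Maven task with feed authentication enabled might look like this (the pom.xml path and the deploy goal are assumptions about your project):

```yaml
steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'deploy'
    # Authenticates against the Azure Artifacts feeds referenced in your pom.xml
    mavenAuthenticateFeed: true
```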

Publish artifacts
1. If you don't have a Maven package yet, you can create one by running the
following command:

Command

mvn -B archetype:generate -DarchetypeGroupId="org.apache.maven.archetypes" -DgroupId="MyGroup" -DartifactId="myFirstApp"

If you get the following error: You must specify a valid lifecycle phase or a goal (..),
follow the steps below to add a goal to your configuration:

Right-click your project, and select Run as > Maven Build. Enter package in the Goals text box, and then select Run.

2. Navigate to the path of your pom.xml file and run the following commands to build and deploy your Maven artifact. Note that build is not a valid Maven lifecycle phase; use package (or install ) to build before deploying:

Command

mvn package
mvn deploy

Alternatively, you can use the Eclipse IDE to build your Maven project as follows:

1. Right-click your project.

2. Select Run as, and then select Maven Build....

3. Enter package in the Goals text box.

4. Select Run.

If you want to publish a third-party artifact, you can use the deploy:deploy-file mojo.
This can be used with or without a POM file to deploy your packages.

Command

mvn deploy:deploy-file -Dpackaging="jar" -DrepositoryId="MyFeedName" -Durl="MyFeedURL" -DgroupId="MyGroup" -DartifactId="myFirstApp" -Dversion="jarFileVersion" -Dfile="jarFileLocalPath"
Note

If your organization is using a firewall or a proxy server, make sure you allow Azure
Artifacts Domain URLs and IP addresses.

Install artifacts
1. Navigate to Azure Artifacts, and then select the package you want to install and
copy the <dependency> snippet.

2. Open your pom.xml file and paste your code inside the <dependencies> tag.

3. Run mvn install from the same path as your pom.xml file.

Related articles
Configure permissions
Use feed views to share packages
Set up upstream sources
Set up your Maven project and connect
to feed
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

1. Select Artifacts, and then select Connect to Feed.

2. Select Maven.

3. If this is the first time using Azure Artifacts with Maven, select Get the tools to
download and install Maven.

4. Follow the instructions in the Project setup to set up your pom.xml and
settings.xml files. If your settings.xml file is shared within your team, you can use
Maven to encrypt your passwords .

Note
If your settings.xml file is shared within your team, you can use mvn to encrypt
your passwords .
Install Maven Artifacts
Article • 01/09/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts, you can publish and restore Maven packages from Azure Artifacts feeds and public registries. In this article, you'll learn how to connect to an Azure Artifacts feed and restore your Maven packages.

Prerequisites
An Azure DevOps organization. Create an organization, if you don't have one
already.

An Azure Artifacts feed. Create a new feed if you don't have one already.

Download and install Maven.

Connect to feed
1. From your Azure DevOps project, select Artifacts, and then select your feed from
the dropdown menu.

2. Select Connect to feed.

3. Select Maven from the left navigation panel.

4. Follow the instructions in the Project setup section to set up your config files and
generate a new personal access token.
 Tip

If your settings.xml file is shared within your team, you can use Maven to encrypt
your passwords .

Restore Maven packages


Run the following command in an elevated command prompt to restore your Maven packages. Note that build is not a valid Maven lifecycle phase; use install instead, and Maven automatically downloads all your dependencies to your local repository when the command is executed:

Command

mvn install

 Tip

The <id> tags in your settings.xml and pom.xml files must be the same.
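For example, the repository <id> in your pom.xml must match the server <id> in your settings.xml exactly (the feed name and URL below are illustrative placeholders):

```xml
<!-- pom.xml -->
<repository>
  <id>MyFeed</id>
  <url>https://pkgs.dev.azure.com/ORGANIZATION_NAME/_packaging/MyFeed/maven/v1</url>
</repository>

<!-- settings.xml -->
<server>
  <id>MyFeed</id>
  <username>ORGANIZATION_NAME</username>
  <password>PERSONAL_ACCESS_TOKEN</password>
</server>
```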

Related articles
Use packages from Maven Central
Use public feeds to share your packages publicly
Configure permissions
Use feed views to share packages
Publish Maven artifacts using Gradle
Article • 05/09/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

In this article, you will learn how to connect to an Azure Artifacts feed and publish
Maven artifacts using Gradle.

Prerequisites
An Azure DevOps organization. Create an organization, if you don't have one
already.

An Azure Artifacts feed. Create a feed if you don't have one already.

Download and install Gradle .

Install Java SE .

Project setup
Before setting up your project, ensure that you have installed Gradle and added the
Maven Settings plugin to your build.gradle file as follows:

groovy

plugins {
    id "net.linguica.maven-settings" version "0.5"
}

Create a personal access token


1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select User settings, and then select Personal access tokens.


3. Select New Token, and then fill out the required fields. Make sure you select the
Packaging > Read & write scope.

4. Select Create when you are done. Copy your token and save it in a secure location.
Configure build.gradle
1. If a build.gradle file does not exist in the root of your project, create a new file and
name it: build.gradle.

2. Add the following section to your build.gradle file in both the repositories and
publishing.repositories containers.

groovy

maven {
    url 'https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/maven/v1'
    name '<FEED_NAME>'
    authentication {
        basic(BasicAuthentication)
    }
}
3. Here's an example of what your build.gradle file should look like:

groovy

publishing {
    publications {
        myPublication(MavenPublication) {
            groupId '<GROUP_ID>'
            artifactId '<ARTIFACT_ID>'
            version '<VERSION_NUMBER>'
            artifact '<PATH_TO_YOUR_JAR_FILE>'
        }
    }

    // Repositories to publish artifacts
    repositories {
        maven {
            url 'https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/maven/v1'
            name '<FEED_NAME>'
            authentication {
                basic(BasicAuthentication)
            }
        }
    }
}

// Repositories to fetch dependencies
repositories {
    maven {
        url 'https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/maven/v1'
        name '<FEED_NAME>'
        authentication {
            basic(BasicAuthentication)
        }
    }
}

Configure settings.xml
1. Open your settings.xml file in your home directory and add the following snippet.
Replace the placeholders with your feed name, organization name, and the
personal access token you created earlier.
XML

<server>

<id>[FEED_NAME]</id>

<username>[ORGANIZATION_NAME]</username>

<password>[PERSONAL_ACCESS_TOKEN]</password>

</server>

Publish artifacts
Run the following command in an elevated command prompt to publish your package
to your feed. Your new package will be named: groupId:artifactId.

Command

gradle publish
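To consume the package you just published from another Gradle project, a dependency entry such as the following (the coordinates are placeholders) resolves against the feed configured in the repositories block earlier:

```groovy
dependencies {
    implementation '<GROUP_ID>:<ARTIFACT_ID>:<VERSION_NUMBER>'
}
```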

Related articles
Install Maven Artifacts
Use packages from Maven Central
Google Maven Repository upstream source
Use packages from Maven Central
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With upstream sources, you can use both private packages you've created and public
packages from Maven Central. When you enable upstream sources in your feed, Azure
Artifacts will save a copy of any packages you install from Maven Central. Azure Artifacts also supports other Maven upstream sources such as Google Maven Repository, Gradle Plugins, and JitPack.

Note

Maven snapshots are not supported with Maven upstream sources.

Enable upstream sources


Follow the instructions below to create a new feed and enable upstream sources:

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select Create Feed.

3. Provide a descriptive Name for your feed and specify its Visibility (determining
who can view packages within the feed). Additionally, configure the Upstream
sources and specify the Scope of your feed (project-scoped or organization-
scoped).
4. Select Create when you're done.

Note

When creating a new feed, the default access level for the Project Collection Build Service (organization-scoped) and the project-level Build Service (project-scoped) is set to Collaborator.

Add Maven Central upstream


1. Select the gear icon in the top right of the page to access your feed's settings.

2. Select Upstream sources.


3. Select Add Upstream.

4. Select Public source, and then select Maven Central


(https://repo.maven.apache.org/maven2/ ) from the dropdown menu.

5. Select Save when you are done.

6. Select Save to save your changes.

View saved packages


You can view the packages you saved from upstreams by selecting the Maven Central
source from the dropdown menu.
 Tip

If Maven is not downloading all your dependencies, run the following command
from the project directory to regenerate your project's files:
mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true

Related articles
Install Maven Artifacts
Configure permissions
Configure upstream behavior
Google Maven Repository upstream
source
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts, developers can enable upstream sources to store packages from
different sources such as Google Maven Repository. Once enabled, Azure Artifacts will
save a copy of all the packages installed from Google Maven Repository. Azure Artifacts also supports other Maven upstream sources such as Maven Central, Gradle Plugins, and JitPack.

Add Google Maven Repository


1. Select the gear icon at the top right corner to navigate to Feed Settings.

2. Select Upstream sources.

3. Select Add Upstream.

4. Select Public source, and then select Google Maven Repository


(https://maven.google.com/web/index.html ) from the dropdown menu.

5. Select Save when you are done.


6. Select Save at the top right corner to save your changes.

Note

Maven snapshots are not supported with Maven upstream sources.

View saved packages


To view the packages you installed from Google Maven Repository, select the
appropriate source from the dropdown menu.

 Tip

If Maven is not downloading all your dependencies, run the following command
from the project directory to regenerate your project's files:
mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true

Related articles
Maven Central upstream source
Configure permissions
Set up upstream sources
Upstream behavior
Gradle Plugins upstream source
Article • 04/17/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts feeds, you can enable upstream sources to include packages from
different public registries such as Gradle Plugins. Once upstream sources are enabled on
your feed, Azure Artifacts will save a copy of any package you install from upstream.
Azure Artifacts also supports other Maven upstream sources such as Maven Central,
Google Maven Repository, and JitPack.

Note

Organization-scoped feeds cannot be converted into project-scoped feeds.

Add Gradle Plugins


1. Select the gear icon at the top right corner to navigate to Feed Settings.

2. Select Upstream Sources, and then select Add Upstream.

3. Select Public source, and then select Gradle Plugins


(https://plugins.gradle.org/m2/ ) from the dropdown menu.
4. Select Save when you are done.

5. Select Save at the top right corner to save your changes.

View saved packages


To view the packages from Gradle Plugins, select Gradle Plugins from the Source
dropdown menu.

Note

Maven snapshots are not supported with Maven upstream sources.

Related articles
Maven Central upstream source
Google Maven Repository
Set up upstream sources
JitPack upstream source
Article • 04/17/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Artifacts, you can consume packages from different public registries such as
Maven Central, Google Maven Repository, and JitPack. Once you enable upstream
sources, Azure Artifacts will save a copy of any package you install from upstream.

Add JitPack upstream


1. Select the gear icon at the top right corner to navigate to Feed Settings.

2. Select Upstream Sources, and then select Add Upstream.

3. Select Public source, and then select JitPack (https://jitpack.io/ ) from the
dropdown menu.
4. Select Save when you are done.

5. Select Save at the top right corner to save your changes.

View saved packages


To view saved packages from JitPack, select JitPack from the Source dropdown menu.

Related articles
Google Maven Repository
Gradle Plugins
Maven Central upstream source
Set up upstream sources
Get started with Python packages in
Azure Artifacts
Article • 02/24/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

This guide will walk you through using Azure Artifacts to publish and consume Python
packages to and from your feed.

Create a feed
1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select Create Feed.

3. Provide a descriptive Name for your feed and specify its Visibility (determining
who can view packages within the feed). Additionally, configure the Upstream
sources and specify the Scope of your feed (project-scoped or organization-
scoped).
4. Select Create when you're done.

Note

When creating a new feed, the default access level for the Project Collection Build Service (organization-scoped) and the project-level Build Service (project-scoped) is set to Collaborator.

Connect to feed
There are two primary ways to connect to a feed to publish or consume your Python
packages:
1. Install and use the artifacts-keyring package, which will automatically set up
authentication for you.
2. Manually set up credentials for your .pypirc pushes, and your pip.ini/pip.conf for
pulls with a personal access token (PAT).

Note

artifacts-keyring is not supported on newer versions of Ubuntu.

Use artifacts-keyring to set up authentication


The artifacts-keyring package allows you to set up authentication to publish and
consume your Python packages to and from your feed. Both pip and twine use the
Python keyring library to find credentials.

Important

You must have pip 19.2 and twine 1.13.0 or higher to use artifacts-keyring. See
Usage requirements for more details.

1. In an elevated command prompt window, run the following command to install


the artifacts-keyring package:

Command

pip install artifacts-keyring

2. To install a package from your feed, run the following command:

Project scoped feed:

Command

pip install <PACKAGE_NAME> --index-url https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/pypi/simple

Organization scoped feed:

Command
pip install <PACKAGE_NAME> --index-url https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/pypi/simple

3. To publish a package to your feed, run the following command:

Project scoped feed:

Command

twine upload --repository-url https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/pypi/upload

Organization scoped feed:

Command

twine upload --repository-url https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/pypi/upload

Note

The artifacts-keyring package is layered on top of the Azure Artifacts Credential Provider. For more advanced configuration options, check out the artifacts-credprovider repository.

Manually configure authentication


1. Create a Personal access token with Packaging > Read scope to authenticate with
Azure DevOps.

2. Select Artifacts, select your feed, and then select Connect to feed.
3. Select pip under the Python section.
4. If this is your first time using Azure Artifacts with twine, select Get the tools to
download and install the prerequisites.

5. Create a virtualenv , if you don't already have one.

6. Add a pip.ini (Windows) or a pip.conf (Mac/Linux) file to your virtualenv. Make sure
you don't check your personal access token into a public repository.

Project scoped feed:

[global]
extra-index-url=https://<FEED_NAME>:<YOUR_PERSONAL_ACCESS_TOKEN>@pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/pypi/simple/

Organization scoped feed:

[global]
extra-index-url=https://<FEED_NAME>:<YOUR_PERSONAL_ACCESS_TOKEN>@pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/pypi/simple/

7. Run the following command in your project directory to install your package.

Command

pip install <PACKAGE_NAME>

When you connect to Azure DevOps for the first time, you'll be prompted for credentials. Enter your user name (any string) and your personal access token in the appropriate fields. The credentials will be cached locally and used to automatically sign you in the next time you use the service.

Note

If you want to publish or consume your packages using Azure Pipelines, use the
Python Pip Authenticate task to authenticate and install packages, or the Python
Twine Upload Authenticate task to publish your packages.
Related articles
Use feed views to share packages

Publish Python packages with Azure Pipelines.

Build Python apps.


Publish Python packages with Azure
Pipelines
Article • 07/03/2023

Azure DevOps Services

Using Azure Pipelines, you can publish your Python packages to Azure Artifacts feeds,
public registries, or as pipeline artifacts.

This article will show you how to:

" Install Twine
" Authenticate with your Azure Artifacts feeds
" Publish Python packages to an Azure Artifacts feed

Install twine
YAML

YAML

- script: 'pip install twine'

Authenticate with Azure Artifacts


To use twine to publish your Python packages, you must first set up authentication to
your Azure Artifacts feed. The TwineAuthenticate task stores your credentials in a
PYPIRC_PATH environment variable. twine will reference this variable to publish your

packages from your pipeline.

YAML

YAML

- task: TwineAuthenticate@1
inputs:
artifactFeed: <PROJECT_NAME/FEED_NAME>
#For an organization-scoped feed, artifactFeed: <FEED_NAME>
pythonUploadServiceConnection: <NAME_OF_YOUR_SERVICE_CONNECTION>
artifactFeed: The name of your feed.
pythonUploadServiceConnection: A service connection to authenticate with twine.

 Tip

The credentials stored in the PYPIRC_PATH environment variable supersede those in your .ini and .conf files.

If you add multiple TwineAuthenticate tasks at different stages in your pipeline, each additional task execution will extend (not override) the existing PYPIRC_PATH environment variable.

Publish Python packages to Azure Artifacts feeds
YAML

YAML

- script: |
    pip install wheel
    pip install twine

- script: |
    python setup.py bdist_wheel

- task: TwineAuthenticate@1
  displayName: Twine Authenticate
  inputs:
    artifactFeed: <PROJECT_NAME/FEED_NAME> # For an organization-scoped feed, artifactFeed: <FEED_NAME>

- script: |
    python -m twine upload -r <FEED_NAME> --config-file $(PYPIRC_PATH) dist/*.whl

Example using Python build and twine to publish a Python package to an Azure
Artifacts feed.

YAML
- script: |
    pip install twine
    pip install build

- script: |
    python -m build -w

- task: TwineAuthenticate@1
  inputs:
    artifactFeed: <PROJECT_NAME/FEED_NAME>

- script: |
    python -m twine upload -r <FEED_NAME> --config-file $(PYPIRC_PATH) dist/*.whl
  displayName: 'upload'

Related articles
Publish and download pipeline Artifacts
Artifacts in Azure Pipelines
Release artifacts and artifact sources
Publish and consume Python packages
using the command line
Article • 07/13/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts enables developers to publish and consume packages from Azure Artifacts feeds and public registries like pypi.org. This quickstart shows you how to use the command line to publish and consume Python packages.

Publish Python packages


1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts and then select your feed from the dropdown menu.

3. Select Connect to feed and then select twine from the left navigation panel.

4. If this is your first time using Azure Artifacts with twine, select Get the tools to
install the prerequisites.

5. Download and install Python, and then run the following command to install the
latest version of Azure Artifacts keyring.

Command

pip install twine keyring artifacts-keyring

6. Add a .pypirc configuration file to your home directory.

Command

touch ~/.pypirc

Note

If you already have a .pypirc file with credentials for the public PyPI index, it is
recommended to remove the [pypi] section from your file to prevent
unintended publication of private packages.
7. Paste the following snippet to your .pypirc file:

Command

[distutils]
index-servers =
    <FEED_NAME>

[<FEED_NAME>]
repository = https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/pypi/upload/

8. Create source and wheel distributions.

Command

python setup.py sdist bdist_wheel

9. Run the following command to publish your package

twine upload -r <FEED_NAME> dist/*
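The upload URL in the .pypirc file and the index URL that pip uses later both follow the same feed pattern. A minimal Python sketch of that relationship (the function name and the sample org/project/feed values are illustrative, not an official API):

```python
# Sketch: build the two Azure Artifacts Python endpoints for a
# project-scoped feed. The function name and the sample org/project/feed
# values are illustrative, not an official API.
def feed_urls(organization, project, feed):
    base = f"https://pkgs.dev.azure.com/{organization}/{project}/_packaging/{feed}/pypi"
    return {
        "upload": f"{base}/upload/",  # used by twine via .pypirc
        "index": f"{base}/simple/",   # used by pip via pip.ini / pip.conf
    }

urls = feed_urls("fabrikam", "FabrikamFiber", "my-feed")
print(urls["upload"])
print(urls["index"])
```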

Note

If your organization is using a firewall or a proxy server, make sure you allow Azure Artifacts Domain URLs and IP addresses.

Consume Python packages


1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts and then select your feed from the dropdown menu.

3. Select Connect to feed and then select pip from the left navigation panel.

4. If this is your first time using Azure Artifacts with pip, select Get the tools to install
the prerequisites.

5. Download and install Python, and then run the following command to update your
Python package installer.
Command

python -m pip install --upgrade pip

6. Install the latest Azure Artifacts keyring.

Command

pip install twine keyring artifacts-keyring

7. Create a virtual environment if you don't have one already.

8. Add a pip.ini (Windows) or pip.conf (Mac/Linux) configuration file to your virtualenv.

Command

[global]
extra-index-url=https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/pypi/simple/
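The configuration above can also be generated by a short script, which is handy when provisioning many environments. A sketch, assuming you point it at your virtualenv directory (the helper name is illustrative):

```python
# Sketch: write a pip.ini (Windows) or pip.conf (macOS/Linux) into a
# directory. The helper name is illustrative; point venv_dir at your
# virtual environment instead of the throwaway directory used here.
import os
import sys
import tempfile

def write_pip_config(venv_dir, extra_index_url):
    name = "pip.ini" if sys.platform.startswith("win") else "pip.conf"
    path = os.path.join(venv_dir, name)
    with open(path, "w") as f:
        f.write("[global]\n")
        f.write(f"extra-index-url={extra_index_url}\n")
    return path

demo_dir = tempfile.mkdtemp()
print(write_pip_config(
    demo_dir,
    "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/pypi/simple/",
))
```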

9. Run the following command in your project directory to install your package.

pip install <PACKAGE_NAME>

Related articles
Configure permissions
Understand upstream sources
Publish Python packages with Azure Pipelines
Publish and download universal packages in Azure Artifacts
Article • 05/22/2023

Universal Packages allow developers to store a wide array of package types beyond the conventional ones, such as NuGet, npm, Maven, or Python packages. Using Azure CLI, you can publish and download Universal Packages directly from the command line. Published packages can be up to 4 TB in size, and each must include a name and a version number. This article walks you through publishing and downloading Universal Packages to and from your Azure Artifacts feed.

Prerequisites
Install Azure CLI.
If you're using Linux, make sure you install the .NET on Linux version.
An Azure DevOps organization and a project. Create an organization or a project if
you haven't already.
An Azure Artifacts feed. Create a feed, if you don't have one already.

Project setup
Windows

1. Run the following command to install the Azure DevOps extension.

Azure CLI

az extension add --name azure-devops

2. If you already have the Azure DevOps extension installed and want to update it to the latest version, run the following command:

Azure CLI

az extension update --name azure-devops

3. Log in to Azure.
Azure CLI

az login

Tip

To access tenants without subscriptions, run az login --allow-no-subscription.

4. Set your project and organization as the CLI's default.

Azure CLI

az devops configure --defaults project=<YOUR_PROJECT_NAME> organization=https://dev.azure.com/<YOUR_ORGANIZATION_NAME>

Publish packages
To publish a universal package, run the following command in an elevated command
prompt. Package names must be lowercase, start and end with letters or numbers, and
contain only letters, numbers, and nonconsecutive dashes, underscores, and periods.
Package versions must be lowercase without build metadata (+ suffix). See SemVer to
learn more about semantic versioning.
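The naming rule above can be checked locally before publishing. A sketch of one way to encode it as a regular expression (this is an interpretation of the documented rule, not an official validator):

```python
# Sketch: validate Universal Package names against the documented rule:
# lowercase, start and end with letters or numbers, only letters, numbers,
# and nonconsecutive dashes, underscores, and periods. This regex is an
# interpretation of that rule, not an official validator.
import re

NAME_RE = re.compile(r"^[a-z0-9]+([-._][a-z0-9]+)*$")

def is_valid_package_name(name):
    return bool(NAME_RE.fullmatch(name))

print(is_valid_package_name("my-package.v2"))  # True
print(is_valid_package_name("My-Package"))     # False: uppercase
print(is_valid_package_name("my--package"))    # False: consecutive dashes
print(is_valid_package_name("-mypackage"))     # False: leading dash
```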

Organization-scoped feed:

Azure CLI

az artifacts universal publish --organization https://dev.azure.com/<YOUR_ORGANIZATION> --feed <FEED_NAME> --name <PACKAGE_NAME> --version <PACKAGE_VERSION> --path <PACKAGE_DIRECTORY> --description <PACKAGE_DESCRIPTION>

Project-scoped feed:

Azure CLI

az artifacts universal publish --organization https://dev.azure.com/<YOUR_ORGANIZATION> --project <PROJECT_NAME> --scope project --feed <FEED_NAME> --name <PACKAGE_NAME> --version <PACKAGE_VERSION> --path <PACKAGE_DIRECTORY> --description <PACKAGE_DESCRIPTION>

View published packages


1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select your feed from the drop-down menu. Once
publishing is completed successfully, your package should be available in your
feed.

Download packages
To download a universal package using Azure CLI, run the following command in an
elevated command prompt.

Organization-scoped feed:

Azure CLI

az artifacts universal download --organization https://dev.azure.com/<YOUR_ORGANIZATION> --feed <FEED_NAME> --name <PACKAGE_NAME> --version <PACKAGE_VERSION> --path <DOWNLOAD_PATH>

Project-scoped feed:

Azure CLI

az artifacts universal download --organization https://dev.azure.com/<YOUR_ORGANIZATION> --project <PROJECT_NAME> --scope project --feed <FEED_NAME> --name <PACKAGE_NAME> --version <PACKAGE_VERSION> --path <DOWNLOAD_PATH>

Download specific files


If you only want to download specific files, you can use the --file-filter parameter to download a subset of files. See the File matching patterns reference for more details. For example, --file-filter *logs*.log would match any file whose name contains logs and has the .log extension (for example: build123_logs.log).
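The filter behavior can be approximated locally with Python's fnmatch module. A sketch (the CLI's exact matching engine may differ):

```python
# Approximate --file-filter locally with glob-style patterns. The CLI's
# exact matching engine may differ; this only illustrates how a pattern
# like *logs*.log selects files.
from fnmatch import fnmatch

files = ["build123_logs.log", "readme.md", "app.log", "logs/trace.log"]
matches = [f for f in files if fnmatch(f, "*logs*.log")]
print(matches)
```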

Organization-scoped feed:

Azure CLI

az artifacts universal download --organization https://dev.azure.com/<YOUR_ORGANIZATION> --feed <FEED_NAME> --name <PACKAGE_NAME> --version <PACKAGE_VERSION> --path <DOWNLOAD_PATH> --file-filter <MATCH_PATTERN>

Project-scoped feed:

Azure CLI

az artifacts universal download --organization https://dev.azure.com/<YOUR_ORGANIZATION> --project <PROJECT_NAME> --scope project --feed <FEED_NAME> --name <PACKAGE_NAME> --version <PACKAGE_VERSION> --path <DOWNLOAD_PATH> --file-filter <MATCH_PATTERN>

Download the latest version


You can use wildcards * to download the latest version of your Universal Packages.

Examples:

--version '*' : download the latest version.

--version '1.*' : download the latest version with major 1.

--version '1.2.*' : download the latest patch release with major 1 and minor 2.

Note

Wildcard patterns are not supported with pre-release versions (packages with a dash in their version number).
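The wildcard resolution described above, including the pre-release exclusion, can be sketched in a few lines of Python (the helper name and version list are illustrative):

```python
# Sketch: resolve a wildcard version spec like '1.2.*' against available
# versions. Pre-release versions (with a dash) are excluded, matching the
# documented limitation. Helper name and version list are illustrative.
def resolve(spec, versions):
    prefix = spec.rstrip("*")  # '1.2.*' -> '1.2.', '*' -> ''
    candidates = [
        v for v in versions
        if "-" not in v and (v + ".").startswith(prefix)
    ]
    # Highest version by numeric components wins.
    return max(candidates, key=lambda v: [int(p) for p in v.split(".")], default=None)

versions = ["1.0.0", "1.2.3", "1.2.10", "2.0.0", "1.3.0-beta"]
print(resolve("*", versions))      # 2.0.0
print(resolve("1.2.*", versions))  # 1.2.10
```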

Related articles
Publish and download universal packages with Azure Pipelines.
Delete and recover packages.
Configure feed permissions
Universal Packages upstream sources
Article • 04/17/2023

With Azure Artifacts, you can enable upstream sources to start consuming packages
from public registries such as NuGet.org or npmjs.com. Once you enable upstream
sources, Azure Artifacts will save a copy of any packages you install from upstream.
Azure Artifacts also supports using other feeds as upstreams. In this article, you'll learn
how to add a new Universal Packages upstream source with a feed in your organization
or other organizations within the same Azure Active Directory. See Manage access with
Azure Active Directory to learn how to connect your organization to Azure Active
Directory.

Add a feed in your organization as an upstream source
1. Select the gear icon at the top right corner to navigate to Feed Settings.

2. Select Upstream sources, and then select Add Upstream.

3. Select Azure Artifacts feed in this organization to use packages from a feed in
your organization.
4. Select your Feed from the dropdown menu, select a View and give your upstream
source a name. Make sure you check the UPack package type.
5. Select Save when you're done.

6. Select Save at the top right corner to save your changes.

Add a feed in another organization as an upstream source
1. Select the gear icon at the top right corner to navigate to Feed Settings.

2. Select Upstream sources, and then select Add Upstream.

3. Select Azure Artifacts feed in another organization to use packages from a feed
in a different organization within the same Azure Active Directory.
4. Enter your Azure Artifacts feed locator, and give your upstream source a name.
Make sure you check the UPack package type.
5. Select Save when you're done.

6. Select Save at the top right corner to save your changes.

View saved packages from upstream


To view the packages saved from your Universal Packages upstream source, select your
UPack source from the dropdown menu.

Related articles
DevBlogs - Universal Packages upstream sources
Configure upstream sources
Publish packages to NuGet.org
Configure upstream behavior
Publish and download Universal Packages with Azure Pipelines
Article • 05/05/2023

Azure DevOps Services

Universal Packages allow you to package any number of files of any type and share
them with your team. Using the Universal Package task in Azure Pipelines, you can pack,
publish, and download packages of various sizes, up to 4 TB. Each package is uniquely
identified with a name and a version number. You can use Azure CLI or Azure Pipelines
to publish and consume packages from your Artifacts feeds.

Note

Universal Packages are only available in Azure DevOps Services.

Copy files
The Universal Packages task in Azure Pipelines is set to use
$(Build.ArtifactStagingDirectory) as the default publish directory. To ready your
Universal Package for publishing, move the files you wish to publish to that directory.
You can also use the Copy Files utility task to copy those files to the publish directory.

Publish a Universal Package


YAML

To publish a Universal Package to your Azure Artifacts feed, add the following task
to your pipeline's YAML file.

YAML

- task: UniversalPackages@0
  displayName: Publish a Universal Package
  inputs:
    command: publish
    publishDirectory: '$(Build.ArtifactStagingDirectory)'
    vstsFeedPublish: '<projectName>/<feedName>'
    vstsFeedPackagePublish: '<Package name>'
    packagePublishDescription: '<Package description>'

publishDirectory: Location of the files you wish to publish.
vstsFeedPublish: The project and feed name to publish to. If you're working with an organization-scoped feed, specify only the feed name.
vstsFeedPackagePublish: The package name. Must be lowercase. Use only letters, numbers, and dashes.
packagePublishDescription: Description of the package content.

To publish packages to an Azure Artifacts feed from your pipeline, you must add
the Project Collection Build Service identity as a Contributor from your feed's
settings. See Adding users/groups permissions to a feed for more details.

To publish to an external feed, you must first create a service connection to authenticate with your feed. See Manage service connections for more details.

Package versioning
Universal Packages follow the semantic versioning specification and can be identified by
their names and version numbers. Semantic version numbers are composed of three
numeric components, Major, Minor, and Patch, in the format: Major.Minor.Patch .

The minor version number is incremented when new features are added that are backward compatible with previous versions; in this case, you increment the minor version and reset the patch version to 0 (1.4.17 to 1.5.0). The major version number is incremented when there are significant changes that could break compatibility with previous versions; in this case, you increment the major version and reset the minor and patch versions to 0 (2.6.5 to 3.0.0). The patch version number is incremented when only bug fixes or other small changes are made that don't affect compatibility with previous versions (1.0.0 to 1.0.1).
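These bump rules can be expressed as a small helper. A sketch, using the examples from the paragraph above:

```python
# Sketch of the major/minor/patch bump rules described above.
def bump(version, part):
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part}")

print(bump("1.4.17", "minor"))  # 1.5.0
print(bump("2.6.5", "major"))   # 3.0.0
print(bump("1.0.0", "patch"))   # 1.0.1
```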

When publishing a new package, the Universal Packages task will automatically select
the next major, minor, or patch version for you.

YAML

To enable versioning for your package, add a versionOption input to your YAML
file. The options for publishing a new package version are: major , minor , patch , or
custom .
Selecting custom enables you to manually specify your package version. The other
options will get the latest package version from your feed and increment the
chosen version segment by 1. So if you have a testPackage 1.0.0, and select the
major option, your new package will be testPackage 2.0.0. If you select the minor
option, your package version will be 1.1.0, and if you select the patch option, your
package version will be 1.0.1.

Note that if you choose the custom option, you must also specify a versionPublish
value as follows:

YAML

- task: UniversalPackages@0
  displayName: Publish a Universal Package
  inputs:
    command: publish
    publishDirectory: '$(Build.ArtifactStagingDirectory)'
    vstsFeedPublish: '<projectName>/<feedName>'
    vstsFeedPackagePublish: '<Package name>'
    versionOption: custom
    versionPublish: '<Package version>'
    packagePublishDescription: '<Package description>'

publishDirectory: Location of the files you wish to publish.
vstsFeedPublish: The project and feed name to publish to. If you're working with an organization-scoped feed, specify only the feed name.
vstsFeedPackagePublish: The package name. Must be lowercase. Use only letters, numbers, and dashes.
versionOption: Select a versioning strategy. Options: major, minor, patch, custom.
versionPublish: The custom package version.
packagePublishDescription: Description of the package content.

Download a Universal Package


YAML
To download a Universal Package from a feed in your organization, use the
Universal Package task with the download command as follows:

YAML

steps:
- task: UniversalPackages@0
  displayName: Download a Universal Package
  inputs:
    command: download
    vstsFeed: '<projectName>/<feedName>'
    vstsFeedPackage: '<packageName>'
    vstsPackageVersion: '<packageVersion>'
    downloadDirectory: '$(Build.SourcesDirectory)\someFolder'

vstsFeed: The Artifacts feed hosting the package to be downloaded.
vstsFeedPackage: Name of the package to be downloaded.
vstsPackageVersion: Version of the package to be downloaded.
downloadDirectory: The package destination folder. Default value: $(System.DefaultWorkingDirectory).

To download a Universal Package from an external source, use the following snippet:

YAML

steps:
- task: UniversalPackages@0
  displayName: Download a Universal Package
  inputs:
    command: download
    feedsToUse: external
    externalFeedCredentials: 'MSENG2'
    feedDownloadExternal: 'fabrikamFeedExternal'
    packageDownloadExternal: 'fabrikam-package'
    versionDownloadExternal: 1.0.0

feedsToUse: Set the value to external when downloading from an external source.
externalFeedCredentials: Name of the service connection to the external feed. See manage service connections for more details.
feedDownloadExternal: Name of the external feed.
packageDownloadExternal: The package name you wish to download.
versionDownloadExternal: The version of the package you wish to download.

Tip

You can use wildcards to download the latest version of a Universal Package. See Download the latest version for more details.

Related articles
Universal Packages upstream sources
Search for packages in upstream sources
Feed permissions
What are feeds?
Article • 03/08/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS 2018

Artifacts feeds are organizational constructs that allow you to store, manage, and group your packages and control who you share them with. Feeds aren't package-type dependent; you can store all of the following package types in a single feed: npm, NuGet, Maven, Python, and Universal packages.

Project-scoped vs Organization-scoped feeds


Previously, all feeds were scoped to an organization; they could be viewed and accessed in the Azure Artifacts hub from any project within an organization. With the introduction of public feeds, we also introduced project-scoped feeds. This type of feed can only be accessed from within the hosting project.

Only project-scoped feeds can be made public. You can learn more about public feeds
later in this article. See Feeds visibility to understand the differences between project-
scoped and organization-scoped feeds.

Note

To access a feed in a different organization, a user must be given access to the project hosting that feed.

Public feeds
Public feeds are used to share your packages publicly with anyone on the internet. Users don't have to be a member of your organization or your enterprise; they can access the packages even if they don't have an Azure DevOps account.

Public feeds are project-scoped feeds, and they inherit the visibility settings of the hosting project.

There are some important things to note regarding public feeds:

Public feeds can only be created inside of public projects.
Public feeds aren't intended as a replacement for existing package management platforms (NuGet.org, npmjs.com, etc.).
Public users cannot currently download Universal Packages. All other package types are supported for public access.

Note

All feed views in a public project are accessible to everyone on the internet.

Create public feeds


Public feeds are project-scoped feeds in a public project.

1. Select Artifacts.

2. Select Create Feed.


3. Give your feed a Name, and then select Project for your feed's scope.

4. Select Create when you are done.

Delete a feed
1. Select Artifacts, and then select your feed from the dropdown menu.

2. Select the gear icon to navigate to your feed's settings.


3. Select Delete feed.

4. Select Delete when you are ready.

Restore deleted feeds


If you accidentally delete a feed, Azure Artifacts provides a 30-day window to recover your feed to its original state. After the 30 days, the feed is permanently deleted. During the recovery window, the name of the feed remains reserved, packages are unavailable for download, and write access is suspended for that feed.

You can view the feeds that are pending permanent deletion in the feed picker
dropdown list under the Deleted Feeds tab.

1. Select Artifacts.

2. Select the feed picker dropdown menu, and then select Deleted Feeds

3. Select the feed you want to restore, and then select Feed Settings.
4. Select Restore Feed.

Permanently deleting a feed


A feed pending deletion still uses storage space. If you want to permanently delete your feed before the 30-day period is up, follow these steps:

1. Select Artifacts.

2. Select the feed picker dropdown menu, and then select Deleted Feeds

3. Select the feed you want to permanently delete, and then select Feed Settings.
4. Select Permanently Delete Feed, and then select Delete.

Once the feed is permanently deleted, users won't be able to view or restore its
packages. The feed name will be available for reuse 15 minutes after the deletion.
Project-scoped feeds
Article • 06/21/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS 2018

When creating a new Azure Artifacts feed, you can choose to scope your feed to your
project or your organization depending on your needs. Feeds that are created through
the web interface are project-scoped by default.

Create a new feed


Follow the instructions below and select the appropriate scope to create a project-scoped or an organization-scoped feed.

1. Select Artifacts, and then select Create Feed.

2. Give your feed a Name and choose its visibility. Select upstream sources if you
want to include packages from public registries.

3. Select Project if you want to create a project-scoped feed, otherwise select


Organization.

4. Select Create when you're done.


Note

Organization-scoped feeds cannot be converted into project-scoped feeds.

Project-scoped vs organization-scoped feeds


A project-scoped feed is scoped to a project instead of an organization. Here are the
main differences between the two types of feeds:

1. Visibility:
Project-scoped feeds inherit the visibility of the project.
Organization-scoped feeds are always private by default.

2. Links:

The URL of a project-scoped feed includes the project.
Example: https://pkgs.dev.azure.com/<ORG_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json

The URL of an organization-scoped feed doesn't include a project.
Example: https://pkgs.dev.azure.com/<ORG_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json

3. User interface:

All organization-scoped feeds are available from the feeds' dropdown menu.
To see a project-scoped feed in the list of feeds, you have to navigate to the
project hosting that feed.

4. Connection:

When connecting to a private project-scoped feed from an Azure DevOps pipeline that is in the same organization but in a different project, the project that the feed is scoped to must allow access to the other project's build service. The build service must also be separately added to the feed permissions, regardless of the scope of the feed. See Package permissions for more details.
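The two URL shapes from the Links item can be sketched as a small helper (the function name is illustrative; the path is the NuGet v3 example shown above):

```python
# Sketch: the two URL shapes described in the Links item. The function
# name is illustrative; the path is the NuGet v3 example from the docs.
def nuget_index_url(org, feed, project=None):
    if project:  # project-scoped feed: URL includes the project
        return f"https://pkgs.dev.azure.com/{org}/{project}/_packaging/{feed}/nuget/v3/index.json"
    # organization-scoped feed: no project segment
    return f"https://pkgs.dev.azure.com/{org}/_packaging/{feed}/nuget/v3/index.json"

print(nuget_index_url("fabrikam", "my-feed", project="FabrikamFiber"))
print(nuget_index_url("fabrikam", "my-feed"))
```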

Security policies
If you want to add an extra layer of security to your project-scoped feed and protect your feed's visibility, you can disable the Allow public projects policy from the Organization Policy Settings.

Alternatively, you can use the Create Feed API to manually create a new organization-scoped feed. You'll then have to set the default permissions for the new feed manually, either by using the Feed Permission API or the Artifacts feed settings.

Important

If a user has permissions to access a specific view but doesn't have permissions to the feed, they will still be able to access and download packages through that view.

If you want to completely hide your packages, you must restrict access to both the feed and the view. See Feeds and views permissions for more details.

Q&A

Q: How can I share packages with all users in my organization?

A: If you want to make certain packages in your feed available to all users in your
organization, create or select a view that contains the packages you want to share and
ensure its visibility is set to People in my organization.

Q: How do I access a project-scoped feed in another project using Azure Pipelines?

A: For a pipeline to access a project-scoped feed in a different project, you must grant the pipeline access to both the project where the feed is scoped and the feed itself.

Project setup: navigate to the project hosting the feed, select Project settings >
Permissions and then add your pipeline's project build service to the Contributors
group or any other suitable group that provides contributor access to its users.

Feed setup: Navigate to the feed you want to access, select Settings > Feed permissions, and then add your project build service as a Collaborator. Your project build service identity is displayed in the following format: [Project name] Build Service ([Organization name]). Example: FabrikamFiber Build Service (codesharing-demo).

Related articles
Configure permissions
Delete and recover packages
Use feed views to share packages
Manage permissions
Article • 06/21/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS 2018

Azure Artifacts enables you to publish, consume, and store various types of packages in
your feed. By configuring permissions for your feed, you can manage access to your
packages and control who can interact with them.

Azure Artifacts settings


1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select your feed from the dropdown menu. Select the
Azure Artifacts settings icon on the right.

Note

By default, the Azure Artifacts settings icon is only visible to feed owners and project collection administrators.

3. Choose the users or groups who should have the ability to create and/or
administer feeds, and then select Save when you're done.
Feed settings
1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select your feed from the dropdown menu. Select the
gear icon to navigate to your feed's settings.

3. Select Permissions, and then select Add users/groups.


4. Add new user(s)/group(s) and choose the appropriate Role for them.

5. Select Save when you're done.

Note

By default, the Project Collection Build Service (org-scoped) and the project-level Build Service (project-scoped) are assigned the Collaborator role.
Permissions table

List/install/restore packages: Reader, Collaborator, Contributor, Owner, Administrator
Publish packages: Contributor, Owner, Administrator
Unlist packages (NuGet): Contributor, Owner, Administrator
Delete packages: Owner, Administrator
Deprecate packages (npm): Contributor, Owner, Administrator
Unpublish packages (npm): Owner, Administrator
Promote packages to a view: Contributor, Owner, Administrator
Add/remove upstream sources: Owner, Administrator
Allow external package versions: Owner, Administrator
Save packages from upstream sources: Collaborator, Contributor, Owner, Administrator
Edit feed settings: Owner, Administrator
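The permissions table above can be encoded as data so a script can answer "can this role do that?". A sketch covering a subset of the rows (role and permission names as listed above; this is illustrative, not an official API):

```python
# Sketch: encode a subset of the permissions table as data so a script can
# answer "can this role do that?". Role and permission names are taken
# from the table; this is illustrative, not an official API.
GRANTS = {
    "List/install/restore packages": {"Reader", "Collaborator", "Contributor", "Owner", "Administrator"},
    "Publish packages": {"Contributor", "Owner", "Administrator"},
    "Delete packages": {"Owner", "Administrator"},
    "Save packages from upstream sources": {"Collaborator", "Contributor", "Owner", "Administrator"},
    "Edit feed settings": {"Owner", "Administrator"},
}

def can(role, permission):
    return role in GRANTS.get(permission, set())

print(can("Reader", "Publish packages"))                           # False
print(can("Collaborator", "Save packages from upstream sources"))  # True
```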

Note

To access a project-scoped feed, a user must also have access to the project hosting that feed.

Feed views settings


Feed views enable users to share certain packages while keeping others private. A
common scenario for using a feed view is sharing a package version that has already
been tested and validated but keeping packages under development private.

By default, there are three views in a feed: @local, @prerelease, and @release. The latter two are suggested views that you can rename or delete as desired. The @local view is the default view, and it includes all the packages published to the feed as well as all the packages downloaded from upstream sources.
Important

Users who have access to a specific view are able to access and download packages from the feed through that view even if they don't have direct access to that feed. If you want to completely hide your packages, you must restrict access to both the feed and its views.

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select your feed from the dropdown menu. Select the
gear icon to navigate to your feed's settings.

3. Select Views, select the ellipsis button next to the view, and then select Edit to modify its permissions. To restrict access to your view, change its visibility to Specific people.

4. Select Save when you're done. The access permissions column should reflect your
changes.
Important

Views inherit permissions from the parent feed. If you set a view's visibility to Specific people without specifying any users or groups, the view's permissions will default back to the permissions of its parent feed.

Pipelines permissions
To access your feed from your pipeline, the corresponding build identity must have the
necessary permissions. By default, feeds have the Project Collection Build Service role set
to Collaborator. However, if you have configured your pipeline to run at project-scope,
you will need to add the project-level build identity as a Reader or Contributor. The
project-level build identity is named as follows: [Project name] Build Service
([Organization name]) . Example: FabrikamFiber Build Service (codesharing-demo).

1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select your feed from the dropdown menu. Select the
gear icon to navigate to Feed settings.

3. Select Permissions, and then select Add users/groups. Add your build identity and
set its role to a Contributor.
Note

If you want to access a feed in a different project from your pipeline, you must configure the other project to provide read/write access to the build service.

Related articles
Artifacts storage consumption.

Promote packages to a view.

Set up upstream sources.


Delete and recover packages
Article • 02/22/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS 2018

Azure Artifacts safely stores different types of packages in your feed, whether you
published them directly or saved them from upstream sources. As older package
versions fall out of use, you might want to clean them up either manually or
automatically by using retention policies.

In this article, you'll learn how to:

Delete packages from feeds.
Set up retention policies to automatically delete older packages.
Recover recently deleted packages from the Recycle Bin.

Note

You must be a feed Owner or Administrator to delete packages or set up retention policies.

Delete packages
In Azure Artifacts, packages are immutable. When you publish a package to your feed,
its version number will be reserved permanently. You can't upload a new package with
that same version number, even if you delete it from your feed.

NuGet

Two options are available to delete a NuGet package from your feed: Unlist and Delete.

Note

You must be a Contributor to unlist a package and an Owner to delete it.

1. Select Artifacts, and then select your feed.

2. Select the package that you want to unlist or delete, and then select Unlist or Delete latest.

Unlist a NuGet package by using NuGet.exe


1. Select Artifacts, and then go to your feed. Select Connect to feed.

2. Select NuGet.exe, and then find and copy your Package Source URL.

3. Run the following command:

Command

nuget.exe delete <PACKAGE_NAME> <PACKAGE_VERSION> -Source <PACKAGE_SOURCE_URL> -ApiKey <KEY>

Note

Azure DevOps and Visual Studio Team Foundation Server interpret the nuget.exe delete command as an unlist operation. To delete a package, you must use the REST API or the web interface.

Note

Packages sent to the Recycle Bin will be deleted permanently after 30 days. However, these packages still count as part of your storage bill. If you want to delete them sooner, go to the Recycle Bin and delete them manually.

Delete packages automatically with retention policies

The number of versions for each package hosted in your feed can grow quickly. To free up storage space, you can set up retention policies to automatically delete old packages.

If you want to retain a package indefinitely, you can promote it to a view. Packages
promoted to a view are exempt from retention policies and won't be deleted.

Note

Package demotion is not supported. If you want this feature to be added to future releases, feel free to use Suggest a feature on our Azure DevOps Developer Community page.

To configure retention policies:

1. Select Artifacts.
2. Select the gear icon to navigate to your feed's settings.

3. Select Feed details, and then select the Enable package retention checkbox. Then enter values for:

Maximum number of versions per package: How many versions of a package you want to keep.
Days to keep recently downloaded packages: Packages will be deleted only if they haven't been downloaded for the number of days set here.

4. Select Save when you're done.

Note

When you enable package retention, a version of a package will be deleted when both of the following criteria are met:

The number of published versions reaches the Maximum number of versions per package limit.
A version of that package hasn't been downloaded for the period defined in Days to keep recently downloaded packages.
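The two retention criteria can be expressed as a simple predicate. A sketch, assuming "reaches the limit" means the version count strictly exceeds the configured maximum (the service's exact comparison may differ):

```python
# Sketch of the two retention criteria: a version is deleted only when
# BOTH hold. 'Reaches the limit' is interpreted here as strictly exceeding
# the configured maximum; the service's exact comparison may differ.
def should_delete(published_versions, max_versions, days_since_download, keep_days):
    over_limit = published_versions > max_versions
    stale = days_since_download > keep_days
    return over_limit and stale

print(should_delete(12, 10, 45, 30))  # True: over the limit and stale
print(should_delete(12, 10, 5, 30))   # False: downloaded recently
```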

Recover deleted packages


Deleted packages will remain in the Recycle Bin for 30 days. After that, they'll be
permanently deleted. You must be a feed Owner to recover deleted packages.

1. Select Artifacts.

2. Select Recycle Bin.

3. Select your package, and then select Restore.


Q&A

Q: What is the difference between Deprecate, Unpublish, Unlist, and Delete a package version?

A: Unpublish and Deprecate apply to npm packages, while Unlist and Delete apply to NuGet packages. You can also Delete package versions for the rest of the package types (Maven, Python, and Universal Packages):

Deprecate (npm): When you deprecate a package version, a warning message is added to the package's metadata. Azure Artifacts and most npm clients will display the warning message whenever the package is viewed or installed.

Unpublish (npm): Unpublishing a package version makes it unavailable to install. Unpublished packages can be restored from the Recycle Bin within 30 days of deletion. After that, the packages will be permanently deleted.

Unlist (NuGet): Unlisting a package version hides it from the search results in Azure Artifacts feeds and on NuGet.org.

Delete: Deleting a package version makes it unavailable to install. Deleted packages can be restored from the Recycle Bin within 30 days of deletion. After that, the packages will be permanently deleted.

Q: What happens with old or existing packages when we
enable retention policies?
A: Old or existing packages will be soft-deleted and moved to the Recycle Bin. The
deletion job runs once a day, but there might be an initial delay after the policy is turned
on for the first time because of an influx of packages.

Packages remain in the Recycle Bin for 30 days before they're permanently deleted. To
remove the packages from your billable storage, you can choose to delete them
manually by using the UI or the REST API before the 30 days are up.

Related articles
Understand upstream sources
Feeds permissions
Configure upstream sources
Promote a package to a view
Use an Azure Artifacts feed as a private
PowerShell repository
Article • 11/28/2022

Azure DevOps Services

Azure Artifacts provides an easy way to share PowerShell scripts across teams to
promote collaboration and maximize effectiveness. By storing PowerShell modules in a
private repository, you can give members of your team the ability to download or
update those scripts quickly using the command line.

This article will guide you through setting up your Azure Artifacts feed as a private
PowerShell repository to store and share your PowerShell modules. You'll learn how to:

" Create a Personal Access Token
" Create a new feed to store PowerShell modules
" Create, package, and publish PowerShell modules
" Connect to a feed with PowerShell
" Use the private PowerShell repository with Azure Pipelines

Prerequisites
NuGet.exe
Azure Artifacts Credential Provider
An Azure DevOps organization. Create an organization, if you don't have one
already.
An Azure Artifacts feed. Create a new feed if you don't have one already.

Create a personal access token


Using a personal access token (PAT) is a great way to authenticate with Azure DevOps
without using your primary credentials. See Use personal access tokens for more details.

1. Navigate to your Azure DevOps organization:
https://dev.azure.com/<ORGANIZATION_NAME>/

2. Select the user settings icon, and then select Personal access tokens.

3. Select New Token.

4. Enter a name for your PAT and then choose an Expiration date.

5. Select Custom defined, and then select Packaging > Read, write & manage.

6. Select Create when you're done. Copy and store your PAT in a safe location.

Create a module

1. Create a new folder Get-Hello. Navigate inside your folder and create a new file
Get-Hello.psm1.
|--- Get-Hello // Parent folder

|--- Get-Hello.psm1 // This will become our PowerShell Module

|--- Get-Hello.psd1 // This will become our module manifest

2. Paste the following script into your Get-Hello.psm1 file:

PowerShell

Function Get-Hello {
    Write-Host "Hello from my Azure DevOps Services Package."
}

3. Create the module manifest by running the following command in your Get-Hello
directory path.

PowerShell

New-ModuleManifest -Path .\Get-Hello.psd1

4. Open your Get-Hello.psd1 file and find the RootModule variable. Replace the empty
string with the path to your Get-Hello.psm1 file as follows:

PowerShell

RootModule = 'Get-Hello.psm1'

5. The FunctionsToExport section is meant to define the list of functions that will be
exported from this module. Add your Get-Hello function as follows:

PowerShell

FunctionsToExport = @('Get-Hello')

6. Find the FileList section, and add the following list of files that should be
packaged with your module.

PowerShell

FileList = @('./Get-Hello.psm1')

Pack and publish module


1. Create a nuspec file for your module. This command will create a Get-Hello.nuspec
file that contains the metadata needed to pack the module.

PowerShell

nuget spec Get-Hello

2. Run the following command to pack your module.

PowerShell

nuget pack Get-Hello.nuspec

3. Run the following command to add your feed source URL. NuGet v3 isn't
supported; make sure you use v2 in your feed source URL.

Org-scoped feed:

PowerShell

nuget sources Add -Name "<FEED_NAME>" -Source "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v2" -username "<USER_NAME>" -password "<PERSONAL_ACCESS_TOKEN>"

Project-scoped feed:

PowerShell

nuget sources Add -Name "<FEED_NAME>" -Source "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v2" -username "<USER_NAME>" -password "<PERSONAL_ACCESS_TOKEN>"

4. Publish the package to your feed.

PowerShell

nuget push -Source "<FEED_NAME>" -ApiKey "<ANY_STRING>" "<PACKAGE_PATH>"

) Important

The version number in your Module Manifest (.psd1) and the .nuspec file must
match.
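Because a version mismatch between the manifest and the nuspec is an easy mistake to make, a quick pre-publish check can catch it. The following Python sketch is illustrative only (the helper names are invented, and ModuleVersion extraction uses a naive regex rather than a full .psd1 parser):

```python
import re
import xml.etree.ElementTree as ET

def manifest_version(psd1_text):
    """Naively extract ModuleVersion = '1.0.0' from .psd1 manifest text."""
    match = re.search(r"ModuleVersion\s*=\s*'([^']+)'", psd1_text)
    return match.group(1) if match else None

def nuspec_version(nuspec_text):
    """Read the <version> element from .nuspec text, with or without
    the default XML namespace."""
    root = ET.fromstring(nuspec_text)
    for elem in root.iter():
        if elem.tag.endswith("version"):
            return elem.text
    return None

def versions_match(psd1_text, nuspec_text):
    """True when both files declare the same version string."""
    return manifest_version(psd1_text) == nuspec_version(nuspec_text)
```

Running a check like this before `nuget push` avoids publishing a package whose installed module reports a different version.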

Connect to feed as a PowerShell repository

1. Open an elevated PowerShell prompt window.

2. Set up your credentials to authenticate with Azure Artifacts. Replace the
placeholders with the appropriate information.

PowerShell

$patToken = "<PERSONAL_ACCESS_TOKEN>" | ConvertTo-SecureString -AsPlainText -Force

PowerShell

$credsAzureDevopsServices = New-Object
System.Management.Automation.PSCredential("<USER_NAME>", $patToken)

3. Register your PowerShell repository. The SourceLocation link can be found by
navigating to Artifacts > Connect to Feed > NuGet.exe under Project setup
source URL.

Project-scoped feed:

PowerShell

Register-PSRepository -Name "PowershellAzureDevopsServices" `
  -SourceLocation "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -PublishLocation "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -InstallationPolicy Trusted -Credential $credsAzureDevopsServices

Org-scoped feed:

PowerShell

Register-PSRepository -Name "PowershellAzureDevopsServices" `
  -SourceLocation "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -PublishLocation "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -InstallationPolicy Trusted -Credential $credsAzureDevopsServices

If you're still using the older visualstudio.com URLs, use the following command
instead:

Project-scoped feed:

PowerShell

Register-PSRepository -Name "PowershellAzureDevopsServices" `
  -SourceLocation "https://<ORGANIZATION_NAME>.pkgs.visualstudio.com/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -PublishLocation "https://<ORGANIZATION_NAME>.pkgs.visualstudio.com/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -InstallationPolicy Trusted -Credential $credsAzureDevopsServices

Org-scoped feed:

PowerShell

Register-PSRepository -Name "PowershellAzureDevopsServices" `
  -SourceLocation "https://<ORGANIZATION_NAME>.pkgs.visualstudio.com/_packaging/<FEED_NAME>/nuget/v2" `
  -PublishLocation "https://<ORGANIZATION_NAME>.pkgs.visualstudio.com/_packaging/<FEED_NAME>/nuget/v2" `
  -InstallationPolicy Trusted -Credential $credsAzureDevopsServices

 Tip

Certain versions of PowerShell require starting a new session after executing
the Register-PSRepository cmdlet to avoid the Unable to resolve package
source warning.

4. Register your package source:

Project-scoped feed:

PowerShell

Register-PackageSource -Name "PowershellAzureDevopsServices" `
  -Location "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -ProviderName NuGet -Trusted -SkipValidate -Credential $credsAzureDevopsServices

Org-scoped feed:

PowerShell

Register-PackageSource -Name "PowershellAzureDevopsServices" `
  -Location "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -ProviderName NuGet -Trusted -SkipValidate -Credential $credsAzureDevopsServices

5. Run the following command to confirm that the repository was registered
successfully. This command gets all the registered repositories for the current user:

PowerShell

Get-PSRepository

6. Run the following command if you want to find all modules in the repository.

PowerShell

Find-Module -Repository PowershellAzureDevopsServices

7. Run the following command if you want to install the Get-Hello module.

PowerShell

Install-Module -Name Get-Hello -Repository PowershellAzureDevopsServices

If the Install-Module command returns the error Unable to resolve package
source, run the Register-PackageSource cmdlet again with the Trusted flag as
follows:

PowerShell

Register-PackageSource -Name "PowershellAzureDevopsServices" `
  -Location "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v2" `
  -ProviderName NuGet -Trusted -SkipValidate -Credential $credsAzureDevopsServices

Connect to feed with Azure Pipelines

The following example shows how to authenticate and install a PowerShell module with
a YAML pipeline.

YAML

variables:
  # For a project-scoped feed, use this endpoint URL instead:
  # https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v2
  PackageFeedEndpoint: https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v2

  # Construct a JSON object that contains the endpoint URL and the personal
  # access token to pass them to the Azure Artifacts credential provider.
  PackageFeedEndpointCredential: '{"endpointCredentials":[{"endpoint":"$(PackageFeedEndpoint)", "username":"OPTIONAL", "password":"ACCESS TOKEN"}]}'

steps:
  # To prevent possible 'Unable to resolve package source' errors when
  # installing modules from your feed, call Install-Module in a separate
  # PowerShell task.
  - powershell: |
      Register-PSRepository -Name "PowershellAzureDevopsServices" -SourceLocation "$(PackageFeedEndpoint)" -PublishLocation "$(PackageFeedEndpoint)" -InstallationPolicy Trusted
    displayName: 'Register Azure Artifacts Feed as PSRepository'
    env:
      # This environment variable passes the credentials to the credential provider.
      VSS_NUGET_EXTERNAL_FEED_ENDPOINTS: $(PackageFeedEndpointCredential)

  - powershell: |
      Install-Module -Name Get-Hello -Repository PowershellAzureDevopsServices
    displayName: 'Install Get-Hello PowerShell module'
    env:
      # The credentials must be set on every task that interacts with your
      # private PowerShell repository.
      VSS_NUGET_EXTERNAL_FEED_ENDPOINTS: $(PackageFeedEndpointCredential)

  - powershell: |
      Get-Hello
    displayName: Execute Get-Hello
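The PackageFeedEndpointCredential variable in the pipeline above is plain JSON. As an illustration of its shape (the field names endpointCredentials, endpoint, username, and password are what the Azure Artifacts credential provider reads; the helper function itself is hypothetical):

```python
import json

def feed_endpoint_credentials(endpoint, token, username="OPTIONAL"):
    """Build the JSON string expected in the
    VSS_NUGET_EXTERNAL_FEED_ENDPOINTS environment variable."""
    return json.dumps({
        "endpointCredentials": [
            {"endpoint": endpoint, "username": username, "password": token}
        ]
    })
```

In the pipeline, the same value is built inline; generating it in a script is only useful when you manage several feeds.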

Related articles
Upstream sources
Configure permissions
Delete and recover packages
Share packages publicly
Article • 07/13/2023

Azure DevOps Services

Azure Artifacts provides an easy way to share packages with users outside your
organization, including external customers, using public feeds. Packages that are stored
in public feeds can be restored and installed by anyone on the Internet.

Prerequisites
An Azure DevOps organization. Create an organization, if you don't have one
already.
A public project. Create a public project if you don't have one already.

Create a public feed


Public feeds are project-scoped feeds in a public project. Public feeds inherit the
visibility settings of the hosting project.

1. Sign in to your Azure DevOps organization, and then navigate to your public
project.

2. Select Artifacts, and then select Create Feed.

3. Give your feed a Name, select Project: PublicProject (Recommended) for its
scope, and then select Create when you're done.
Publish packages

7 Note

If you want to publish NuGet packages, make sure you have the latest Azure
Artifacts Credential Provider .

Publish packages (CLI)

NuGet

Publish NuGet packages - (NuGet.exe)
Publish NuGet packages - (dotnet)

Publish packages with Azure Pipelines

NuGet
Publish NuGet packages

7 Note

If your organization is using a firewall or a proxy server, make sure you allow Azure
Artifacts Domain URLs and IP addresses.

Share packages

To share your packages publicly, you can share your feed URL, for example
https://dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_artifacts/feed/<FEED_NAME>,
or share individual packages with package badges.

As long as your project is kept public, anyone can view and download packages from
your public feed. Anonymous users won't be able to create new feeds or access the
recycle bin.

Related articles
Package sizes and count limits
Follow a package for publish alerts
Delete and recover packages
Migrate from file shares to Azure
Artifacts
Article • 06/19/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Using Azure Artifacts, you can streamline package management to enhance
collaboration, ensure package integrity, and leverage various capabilities such as
versioning, access control, and feed management.

Key concepts
Azure Artifacts offers several advantages over file shares:

Indexing:

Azure Artifacts maintains an index of packages within each feed, allowing for quick
list operations. In contrast, when using file shares, the client needs to open each
nupkg file and inspect the nuspec metadata, unless the file share is configured
with an index that the NuGet client recognizes.

Immutability:

Each package version can only be pushed to a feed once in order to maintain the
integrity of dependencies. This guarantees that any references to that version will
always be accurate. However, if you have workflows that publish packages with
updated binaries but without changing the version number, those workflows will
encounter issues when transitioning to Azure Artifacts feeds. See Immutability for
more details.

Well-formedness:

Azure Artifacts performs thorough validation on all pushed packages to ensure
their integrity and correctness. This validation process prevents any invalid
packages from entering your development and build environments. However, it's
important to note that any workflow that publishes packages with malformed
structures will encounter issues when transitioning to Azure Artifacts feeds.

Authentication and authorization


If you're currently utilizing Active Directory-backed file shares, it's probable that you and
your on-premises build agents are automatically authenticated using Windows NTLM.
Migrating your packages to Azure Artifacts will require a few adjustments:

Authentication: You need to provide access to the NuGet client in order to push
and restore packages.
Visual Studio: Credential acquisition happens automatically.
nuget.exe: Credential acquisition happens automatically after you install the
Azure Artifacts Credential Provider.

Authorization: Make sure that any user, service, organization, or group requiring
access to your packages has the necessary permissions in place. See the
permissions section for more details.

Migrating your packages is a 4-step process:

1. Inventory your existing package sources
2. Plan your access control strategy
3. Set up your feeds
4. Migrate your packages

Inventory your existing package sources

Before making any configuration changes, it's important to inventory your existing
package sources. This involves identifying and listing all the package sources currently
used in your setup. By conducting this inventory, you'll have a comprehensive
understanding of the package sources that need to be migrated or reconfigured. Start by
checking:

Any nuget.config files in your codebase, likely in the same folder as your solution
(.sln) file.

The default NuGet configuration file at:


Windows: %APPDATA%\NuGet\NuGet.Config
macOS/Linux: ~/.config/NuGet/NuGet.Config or ~/.nuget/NuGet/NuGet.Config

Identify the server path associated with your package sources (e.g., <add
key="SMBNuGetServer" value="\\server\share\NuGet" /> ) and make a copy of it. This list

of server paths will be utilized in the subsequent sections for the migration process.
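The inventory step can be partly automated by scanning your nuget.config files for file-share sources. The following Python sketch is illustrative only — it assumes the standard nuget.config layout shown above and flags any package source whose value is a UNC path:

```python
import xml.etree.ElementTree as ET

def file_share_sources(nuget_config_text):
    """Return (key, path) pairs for package sources in a nuget.config
    that point at UNC file shares (values starting with \\\\)."""
    root = ET.fromstring(nuget_config_text)
    sources = []
    for add in root.findall("./packageSources/add"):
        key, value = add.get("key"), add.get("value", "")
        if value.startswith("\\\\"):  # UNC path such as \\server\share
            sources.append((key, value))
    return sources
```

Running this against every nuget.config you find yields exactly the list of server paths the following sections migrate.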

Plan your access control strategy


When configuring your new feeds, you have two options:
Configure the feed permissions to match the permissions of your existing file
shares.
Align the feed permissions with the teams and groups already set up in Azure
DevOps.

If you want to replicate your existing file share permissions, make a note of the
permissions on each share that contains packages. Specifically, take note of users or
groups with:

Full control

Read or List

Write or Modify

Set up your feeds


After completing the inventory of your current package sources, it's time to configure
your feeds. In this step, we'll assume a one-to-one mapping of feeds to SMB shares.

For each SMB share, follow the instructions to Create a feed:

Set the Feed name to match the name of your SMB share folder.

Choose your feed Visibility, Upstream sources, and Scope.

For each feed you've created, there's a set of feed permissions to review when
configuring access.

If you have opted to configure your new feed permissions to match your existing file
share permissions, refer to the table below to assign the appropriate permissions to
your users:

File share permissions Feed permissions

Full control Owners

Write or Modify Contributors

Read or List Readers
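If you script permission assignment (for example against the REST API), the table above reduces to a simple lookup. This Python sketch is illustrative; the role names are the feed roles from the table, and the helper is hypothetical:

```python
# Mapping from file-share permissions to Azure Artifacts feed roles,
# as described in the table above.
SHARE_TO_FEED_ROLE = {
    "Full control": "Owners",
    "Write": "Contributors",
    "Modify": "Contributors",
    "Read": "Readers",
    "List": "Readers",
}

def feed_role_for(share_permission):
    """Look up the feed role matching a file-share permission, or None
    for permissions with no direct equivalent."""
    return SHARE_TO_FEED_ROLE.get(share_permission)
```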

Migrate your packages


For each feed, navigate to Artifacts > Connect to feed and then choose NuGet.exe.
Copy the Source URL provided in the Project setup section. This source URL is required
to update your NuGet configuration and migrate your packages.

Once you've set up your feeds, you can now set up your project to authenticate with
your feed and publish your packages. Make sure you have installed the latest version of
the Azure Artifacts credential provider before proceeding to the next steps.

7 Note

We recommend using NuGet version 5.5.x or later, as it includes critical bug fixes
that address cancellations and timeouts.

1. Ensure that your nuget.config file is located in the same folder as your .csproj or
.sln file. Once you have verified the file's placement, add the following snippet to
your nuget.config file. Replace the placeholders with the appropriate values:

Organization scoped feed:

XML

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="<FEED_NAME>" value="https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json" />
  </packageSources>
</configuration>

Project scoped feed:

XML

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="<FEED_NAME>" value="https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_packaging/<FEED_NAME>/nuget/v3/index.json" />
  </packageSources>
</configuration>

2. Run the following push command to publish all your packages to your new feed.
You can provide any string as the value for the ApiKey argument.
Command

nuget.exe push -Source <FEED_NAME> -ApiKey Az <PACKAGE_PATH>\*.nupkg

 Tip

For larger teams, you should consider marking each share as read-only before
doing the nuget push operation to ensure no one adds or updates packages during
your migration.

If you're also incorporating Azure Pipelines into your workflow, make sure you update
your pipelines to ensure they have the right permissions to publish packages to your
feeds. See Publish NuGet packages with Azure Pipelines and Restore NuGet packages
with Azure Pipelines for more details.

Related articles
Install NuGet packages with Visual Studio
Publish packages to NuGet.org
Delete and recover packages
What are feed views?
Article • 12/02/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Feed views enable developers to share a subset of package versions with their
consumers. A common use of feed views is to share package versions that have been
tested and validated while holding back packages that are still under development or
didn't meet a certain quality bar.

Default view
All Artifacts feeds come with three views: @local, @prerelease, and @release. The latter
two are suggested views that you can rename or delete as desired. @local is the default
view that's commonly used in upstream sources.

The @local view contains all packages published directly to the feed and all packages
saved from upstream sources.

Feed views are read-only, which means that users connected to a view can only use
packages that are published to that view and/or packages previously saved from
upstream sources. See package graphs to learn how available packages are constructed.

7 Note

All feed views in a public project are accessible to everyone on the internet.

Feed views and upstream sources


Feed views and upstream sources are designed to work together to provide an
enterprise-level solution to share and consume packages.
In order for other Azure
Artifacts feeds to use your feed as an upstream source, you must set your feed's visibility
to members of your organization, or members of your Azure Active Directory,
depending on your scenario. If you choose the latter, all people in your organization will
be able to access your feed. In addition, all feeds in your organization and other
organizations associated with the same Azure Active Directory tenant will be able to
upstream to your feed.
Release packages with feed views
When creating release packages, it's important to convey three pieces of information:
the nature of the change, the risk of the change, and the quality of the change.

Nature and risk of the change


The nature and the risk of the change both pertain to the change itself, that is, what you
set out to do; both are known at the outset of the work.
features, making updates to existing features, or patching bugs; this is the nature of
your change. If you're still making changes to the API portion of your application; this is
one facet of the risk of your change. Many NuGet users use Semantic Versioning
(SemVer) notation to convey these two pieces of information. SemVer is a widely used
standard and does a good job of communicating this type of information.

Quality of the change


The quality of the change isn't generally known until the validation process is complete.
This comes after your change is built and packaged. Because of this detail, it's not
feasible to communicate the quality of the change in the numerical segment of the
version number (e.g., 1.2.3). There are workarounds to pre-validate (e.g., consume the
build's DLLs directly before they're packaged and publish the packages to a "debug" or
"CI" environment then validate and republish those packages to a "release"
environment) but none that we've seen can truly guarantee that the built package will
meet the correct quality standard.
You can use the @Release view as a means to convey the quality of your changes. Using
the @Release view, you can share packages that met your quality bar and allow your
consumers to only see the subset of package versions that were tested, validated, and
are ready to be consumed.

Related articles
Promote a package to a view
Set up upstream sources
Configure permissions
Use feed views to share packages
Article • 07/14/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Feed views are a way to enable users to share some packages while keeping other
packages private. Views filter the feed to a subset of packages that meet a set of criteria
defined by that view.

By default, Azure Artifacts comes with three views: @Local, @Prerelease, and @Release.
@Local is the default view that contains all the published packages and all the packages
saved from upstream sources. All views support NuGet, npm, Maven, Python, and
Universal packages.

7 Note

Publishing and restoring packages directly to/from a view is not supported in Azure
Artifacts.

Promote packages
1. Sign in to your Azure DevOps organization, and then navigate to your project.

2. Select Artifacts, and then select your feed from the dropdown menu.

3. Select the package you wish to promote, and then select Promote.

4. Select a view from the dropdown menu, and then select Promote.
7 Note

Package demotion is not supported. If you want this feature to be added to a
future release, please feel free to Suggest a feature on Azure DevOps Developer
Community.

Promote packages using the REST API


In addition to using the Azure Artifacts user interface, you can also promote packages
using the REST API. The URI varies based on the package type:

Use the actual user-facing name and version of the package for the {packageName} and
{packageVersion} fields, respectively. If your feed is organization-scoped, omit the
{project} field.

The body of the request is a JSON Patch document adding the view to the end of the
views array. See Get started with the REST API and the REST API samples for more
information on how to interact with the Azure DevOps REST API.

NuGet

Organization scoped feed:

HTTP

PATCH https://pkgs.dev.azure.com/{organization}/_apis/packaging/feeds/{feedId}/nuget/packages/{packageName}/versions/{packageVersion}?api-version=7.1-preview.1

Project scoped feed:

HTTP

PATCH https://pkgs.dev.azure.com/{organization}/{project}/_apis/packaging/feeds/{feedId}/nuget/packages/{packageName}/versions/{packageVersion}?api-version=7.1-preview.1

Use JsonPatchOperation to construct the body of your request. See NuGet -
update package version for more details.

Example:

HTTP

PATCH https://pkgs.dev.azure.com/fabrikam-fiber-inc/litware/_apis/packaging/feeds/litware-tools/nuget/packages/LitWare.Common/versions/1.0.0?api-version=5.1-preview.1 HTTP/1.1
Content-Type: application/json-patch+json

{
"views": {
"op": "add",
"path": "/views/-",
"value": "Release"
}
}
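The PATCH request above can also be constructed programmatically. This Python sketch is illustrative only — it builds the URL and the JSON Patch body but does not send the request (doing so would additionally require a PAT passed as a Basic auth header); the helper name and parameters are invented:

```python
def build_promote_request(organization, feed_id, package_name,
                          package_version, view, project=None,
                          api_version="7.1-preview.1"):
    """Build the URL and JSON Patch body for promoting a NuGet package
    version to a feed view. For an organization-scoped feed, omit the
    project segment."""
    scope = f"{organization}/{project}" if project else organization
    url = (f"https://pkgs.dev.azure.com/{scope}/_apis/packaging/feeds/"
           f"{feed_id}/nuget/packages/{package_name}/versions/"
           f"{package_version}?api-version={api_version}")
    body = {"views": {"op": "add", "path": "/views/-", "value": view}}
    return url, body
```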

Manage views
You can create your own views or rename and delete existing ones from your feed's
settings.

7 Note

All feed views in a public project are accessible to everyone on the internet.

1. Select Artifacts.
2. Select your feed from the dropdown menu.

3. Select the gear icon to access your feed's settings.

4. Select Views.

5. Select a view, and then select Edit to edit your view or select Add view if you want
to add a new view.

6. Select Save when you're done.

) Important

For public feeds, if you change the access permissions of a certain view to Specific
people, your view will not be available as an upstream source.

Related articles
Upstream sources overview
Configure permissions
Set up upstream sources
Delete and recover packages
Upstream sources
Article • 03/22/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Using upstream sources, you can conveniently store packages from various sources in a
single feed - including those that you publish, as well as those you consume from other
feeds and public registries such as NuGet.org, npmjs.com, Maven Central, and PyPI.
Once upstream sources are enabled, a copy of any package installed from upstream is
automatically saved to your feed.

7 Note

You must be a Collaborator or higher to install packages from upstream sources.

Advantages
Upstream sources enable you to manage all of your product's dependencies in a single
feed. Publishing all your packages to a single feed has a few benefits:

Simplicity: your config file such as NuGet.config, .npmrc, or settings.xml will
contain only one feed, so it's less prone to mistakes and bugs.
Determinism: your feed resolves package requests in order, so rebuilding your
code will be more consistent.
Provenance: your feed knows the provenance of the packages it saved from
upstream sources, so you can verify that you're using the original package and not
a copy or malicious package.
Peace of mind: a copy is saved to your feed for any package installed from
upstream sources. So if the upstream source is disabled, removed, or undergoing
maintenance, you can still continue to develop and build because you have a copy
of that package in your feed.

Best practices - package consumers


To take full advantage of the benefits of upstream sources as a package consumer,
follow these best practices:

Use a single feed in your config file


In order for your feed to provide deterministic restore, it's important to ensure that your
configuration file such as nuget.config or .npmrc references only one feed with the
upstream sources enabled.

Example:

nuget.config

<packageSources>
  <clear />
  <add key="FabrikamFiber" value="https://pkgs.dev.azure.com/fabrikam/_packaging/FabrikamFiber/nuget/v3/index.json" />
</packageSources>

7 Note

NuGet composes several config files to determine the full set of options to
use. Using <clear /> allows you to ignore all other package sources defined in
higher-level configuration files.

.npmrc:

registry=https://pkgs.dev.azure.com/fabrikam/_packaging/FabrikamFiber/npm/registry/
always-auth=true

Order your upstream sources intentionally


If you're only using public registries such as nuget.org or npmjs.com, the order of your
upstream sources is irrelevant. Requests to the feed follow the search order.

If you're using multiple sources such as a mixture of feeds and public registries, then
each upstream is searched in the order it's listed in the feed's configuration settings. In
this case, we recommend placing the public registries first in the list of upstream
sources.

In rare cases, some organizations choose to modify OSS packages to fix security issues,
to add functionality, or to satisfy requirements that the package is built from scratch
internally, rather than consumed directly from the public repository.
If your organization
follows this pattern, place the upstream source that contains these modified OSS
packages before the public package managers to ensure you use your organization's
modified versions.

Use the suggested default view

When you add a remote feed as an upstream source, you must select its feed's view.
This enables the upstream sources to construct a set of available packages. See
complete package graphs for more details.

Best practices: feed owners/package publishers


To make sure your feed is easily configured as an upstream source, consider applying
the following best practices:

Use the default view

The @local view is the default view for all newly created feeds. It contains all the
packages published to your feed or saved from upstream sources.

If you want to use views to release new package versions, you can promote your
package to a view such as @release and make it available to your consumers.

Construct a package graph

To construct a package graph, simply connect to the feed's default view and install the
package you wish to share. When the package is installed correctly in the default view,
users who want to consume it will be able to resolve the package graph and install the
desired package. Packages from upstream sources are displayed based on the
configured view for the corresponding upstream source.

Search order
For public package managers that support multiple feeds (NuGet and Maven), the order
in which feeds are queried is sometimes unclear or nondeterministic. For example, in
NuGet, parallel queries are made to all the feeds in the config file, and the responses are
processed first-in, first-out (FIFO).
Upstream sources prevent this nondeterministic behavior by searching the feed and its
upstream sources using the following order:

1. Packages pushed to the feed.

2. Packages saved from an upstream source.

3. Packages available from upstream sources: each upstream is searched in the order
it's listed in the feed's configuration.

To take full advantage of the fast lookup feature, we recommend that you only include
one feed in your config file.
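The deterministic search order above can be modeled as a small lookup. This is an illustrative sketch only — real resolution also involves views and package graphs, and the function names are invented:

```python
def resolve_source(package, pushed, saved, upstreams):
    """Model of the feed's search order: packages pushed to the feed
    first, then packages already saved from upstreams, then each
    configured upstream source in its listed order."""
    if package in pushed:
        return "feed: pushed"
    if package in saved:
        return "feed: saved from upstream"
    for name, available in upstreams:  # list order == configured order
        if package in available:
            return f"upstream: {name}"
    return None
```

Because the feed itself is always consulted first, a saved copy shadows the live upstream, which is what makes restores repeatable even when an upstream is unavailable.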

7 Note

Searching for packages in upstreams with NuGet Package Explorer is not
supported.

Save packages from upstream sources


When you enable upstream sources for your feed, packages installed from upstream
sources are automatically saved to your feed. These packages could be installed
directly from the upstream (for example, npm install express), or installed as
part of a dependency resolution (installing express would also save dependencies like
accepts).

Saving packages can improve download performance and save network bandwidth,
especially for TFS servers in internal networks.

7 Note

Custom upstream sources are only supported for npm packages.

Override packages from upstream sources


When you enable upstream sources, be aware that you can't publish a package
version that already exists in an upstream source. For instance, when you
enable the NuGet.org upstream, you won't be able to publish the Newtonsoft.Json
10.0.3 package because that same package version is already present in NuGet.org.
If you must publish a package version that already exists on one of your upstream
sources, you must disable that upstream source, publish your package, and then re-
enable the upstream source.

7 Note

Package versions are immutable. Saved packages remain in the feed even if the
upstream source is disabled or removed.

Health status
If a feed has a failing upstream source, the metadata can no longer be refreshed for
packages of the same protocol. To view your upstream source's health status, select the
gear icon to access your Feed settings, and then select Upstream sources.

If there are any failures, a warning message is displayed. Selecting the failed status
provides more details, such as the reason for the failure and instructions on how to
resolve it.

7 Note
For public registries such as NuGet.org, there is a 3-6 hour delay between when a
package is pushed to the public registry and when it is available for download. This
delay depends on job timing and data propagation. When the upstream source is
an Azure Artifacts feed, the latency is typically no more than a few minutes.

Offline upstream sources


Upstream sources are a great way to protect your consumers and infrastructure from
unplanned outages. When you install a package from an upstream source, a copy of
that package is saved to your feed. If the upstream source is down, undergoing
maintenance, or not available, you can still access the packages you need from your
feed.

FAQs

Q: I can't find my package even though I can see it in one of my feed's upstreams?

A: Packages belonging to an upstream are available downstream soon after they're
published. However, the package will only show up in your feed's UI once it's ingested,
which happens the first time that package version is installed in the downstream feed.

Q: What are feed views?

A: Views enable developers to share only a subset of package versions that have been
tested and validated, excluding any packages that are still under development or that
didn't meet the quality bar. See What are feed views for more details.

Q: I can't find the feed that I want to configure as an upstream source?

A: Make sure that the feed's owner is sharing a view as an upstream source.

Q: Can a user with the Reader role download packages from an upstream source?

A: No. A user with the Reader role in an Azure Artifacts feed can only download
packages that have been saved to the feed. Packages are saved to the feed when a
Collaborator, a Contributor, or an Owner installs them from upstream.
Q: What happens when a user deletes or unpublishes a package
saved from an upstream source?

A: The package will no longer be available for download from the feed, and the version
number is reserved permanently. The package also will no longer be saved from the
upstream source. Earlier and later versions of the package are not affected.

Q: What happens when a user deprecates a package saved from an upstream source?

A: A warning message is added to the package's metadata and displayed whenever
the package is viewed or installed from the feed.

Related articles
Set up upstream sources
Manage dependencies
Configure upstream behavior
Feed permissions
Universal Packages upstream sources
Configure upstream sources
Article • 03/08/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With upstream sources, you can use a single feed to store the packages you generate
and the packages you consume from public registries such as npmjs.com, NuGet.org,
Maven Central, and PyPI.org. Once you've enabled an upstream source, every time you
install a package from the public registry, Azure Artifacts will save a copy of that
package in your feed.

Create a new feed and enable upstream


sources
1. From within your project, select Artifacts.

2. Select Create Feed.


3. Give your feed a Name and choose its visibility and scope settings. Make sure you
check the Include packages from common public sources checkbox to enable
upstream sources.

4. Select Create when you are done.

) Important
Maven snapshots are not supported in upstream sources.

Enable upstream sources in an existing feed

7 Note

Custom public upstream sources are only supported with npm registries.

1. Select the button to access your feed's settings.

2. Select Upstream sources.

3. Select Add upstream source.

4. Select Public source, and then select the public source you want to add. (Example:
Maven Central, https://repo.maven.apache.org/maven2/ )

5. Select Add when you are done.

7 Note

Azure Artifacts supports Maven Central, Google Maven Repository, Gradle Plugins,
and JitPack as upstream sources for Maven.

Add a feed in your organization as an upstream


source
1. Select the button to access your feed's settings.

2. Select Upstream sources.

3. Select Add Upstream.

4. Select Azure Artifacts feed in this organization.

5. Select the feed you would like to add from the dropdown menu.

6. Select the package types you want to use, select the View, and name your
upstream source.

7. Select Save when you are done.


Add a feed in a different organization as an
upstream source

7 Note

Universal Packages are only supported in upstream sources within the same
organization.

1. Select the button to access your feed's settings.

2. Select Upstream sources.

3. Select Add Upstream.


4. Select Azure Artifacts feed in another organization.

5. Enter your Azure DevOps Services feed locator. Example:
azure-feed://myOrg/myProject/myFeed@local.

6. Select the Package type(s) you want to use and enter an Upstream source name.

7. Select Save when you are done.

Example: install NuGet packages from


upstream sources with Visual Studio
Using Visual Studio, we can now install packages from the upstream sources we
configured:
1. Navigate to NuGet.org, find the package you want to install, and then copy the
Install-Package command.
2. In Visual Studio, select Tools > NuGet Package Manager > Package Manager
Console.
3. Paste the install command into the Package Manager Console and press ENTER to
run it.
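As an illustration, the copied command looks like the following and runs in the Package Manager Console (the package name and version here are just an example):

```powershell
# Runs in the Visual Studio Package Manager Console. The package is resolved
# through the feed configured in your nuget.config, including its upstream sources.
Install-Package Newtonsoft.Json -Version 13.0.1
```

Because the feed's NuGet.org upstream is enabled, the package is pulled through your feed and a copy is saved to the feed on first install.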

Example: install npm packages from upstream


sources using the CLI
Run the following command in an elevated command prompt window to install your
npm package from upstream.

Command

npm install --save <package>

7 Note

You must be a Collaborator, a Contributor, or an Owner to install new packages
from upstream. A copy of each upstream package is saved to the feed on first use.
Packages already saved from upstream sources can be used by feed Readers.

Related articles
Manage dependencies with upstream sources
Universal Packages upstream sources
Configure upstream behavior
Use feed views to share packages
Configure permissions
Tutorial: How to use upstream sources
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Using upstream sources in your feed enables you to manage your application
dependencies from a single feed. Using upstream sources makes it easy to consume
packages from public registries while having protection against outages or
compromised packages. You can also publish your own packages to the same feed and
manage all your dependencies in one location.

This tutorial will walk you through how to enable upstream sources on your feed and
consume packages from public registries such as NuGet.org or npmjs.com.

In this tutorial, you will:

- Create a new feed and enable upstream sources.
- Set up your configuration file.
- Run an initial package restore to populate your feed.
- Check your feed to view the saved copy of the packages you consumed from the
  public registry.

Create a feed and enable upstream sources


1. Select Artifacts.
2. Select Create Feed to create a new feed.

3. Provide a name for your feed, and then select its visibility. Make sure you check
the Include packages from common public sources checkbox to enable upstream
sources. Select Create when you are done.
Set up the configuration file
Now that we created our feed, we need to update the config file to point to our feed. To
do this, we must:
1. Get the source's URL.
2. Update the configuration file.

npm

1. Select Artifacts, and then select Connect to feed.

2. On the left side of the page, select the npm tab.

3. Follow the instructions in the Project setup section to set up your config file.

If you don't have a .npmrc file already, create a new one in the root of your project
(in the same folder as your package.json). Open your new .npmrc file and paste the
snippet you just copied in the previous step.
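A minimal .npmrc sketch, assuming a project-scoped feed (the organization, project, and feed names are placeholders to replace with your own):

```ini
; Route all npm installs through the Azure Artifacts feed
registry=https://pkgs.dev.azure.com/MyOrg/MyProject/_packaging/MyFeed/npm/registry/
; Always send credentials; required for Azure Artifacts feeds
always-auth=true
```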

Restore packages
Now that you enabled upstream sources and set up your configuration file, we can run
the package restore command to query the upstream source and retrieve the upstream
packages.

We recommend clearing your local cache before running the restore command. Azure
Artifacts keeps a saved copy of any packages you installed from upstream.

npm
Remove the node_modules folder from your project and run the following
command in an elevated command prompt window:

Command

npm install --force

7 Note

The --force argument will force pull remotes even if a local copy exists.

Your feed now should contain any packages you saved from the upstream source.

Related articles
Set up upstream sources
Universal Packages upstream sources
Feed permissions
Publish packages to NuGet.org
Search for packages in upstream
sources
Article • 10/07/2022

Using upstream sources enables developers to consume packages from different feeds
and public registries. This tutorial will walk you through how to enable upstream sources
in your feed and search for packages in upstreams.

Enable upstream sources


1. Navigate to your project, and then select Artifacts.

2. Select the gear icon to navigate to your Feed Settings.

3. Select Upstream Sources.

4. Select Add Upstream.

5. Select Public source, and then select the public source. (Example: NuGet Gallery,
https://api.nuget.org/v3/index.json )

6. Select Save when you're done.

Search upstream sources


1. Navigate to your project, and then select Artifacts.

2. Select your feed from the dropdown menu.

3. Select Search Upstream Sources at the top right of your screen.

4. Select the Package type and type your Package Name. (The package name is case
sensitive and must be an exact match).

5. Select Search when you're done.

6. A list of matching package versions is displayed.


Save packages

7 Note

Saving packages to your feed is only supported for NuGet, npm, and Universal
Packages.

1. To save a package, select the ellipsis button and then select Save to feed.

2. Select Save to save the package to your feed.


3. The saved versions will have the In this feed tag.

Related articles
Set up upstream sources
Configure upstream behavior
Configure feed permissions
Configure upstream behavior
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With upstream sources, developers can use a single feed to publish and consume
packages from Artifact feeds and public registries such as NuGet.org or npmjs.com. To
set up upstream sources for your feed, check the box to include packages from
common public sources. This will allow your feed to use packages from the common
public registries.
Previously, Artifact feeds combined a list of available package versions from the feed
and all the upstream sources.

[Diagram: the feed (Fabrikam) and the public registry each contribute package versions,
and all versions are listed as available.]

Upstream behavior is a feature that enables developers to choose whether they want to
consume externally sourced package versions. Upstream behavior dictates which
packages are made available from the public registries for individual packages.

When upstream behavior is enabled and a package is published to your Azure Artifacts
feed, any version from the public registry is blocked and not made available for
download.

This approach provides another layer of security by blocking the exposure to malicious
packages that may infiltrate the public registries.

Users will still be able to toggle off the upstream behavior setting and consume
packages from the public registries if they choose to do so.

7 Note

The new behavior won't affect any package versions that are already in use. Those
are stored in the feed's @local view.
Applicable scenarios
The next section shows a few common scenarios where the upstream behavior is
triggered to block externally sourced package versions, along with a few other cases
where no blocking of the public packages is needed.

Public versions will be blocked


Private package version made public: in this scenario, a team has a private
package that was made public. The upstream behavior in this case will be triggered
to block any new public versions (untrusted packages).

[Diagram: the feed's private versions remain available; new versions from the public
registry are blocked.]

Having both private and public packages: in this scenario, if a team already has
both private and public packages, enabling the upstream behavior will result in
blocking any new package versions from the public registry.
[Diagram: the feed's existing private and public versions remain available; new versions
from the public registry are blocked.]

Public versions will not be blocked


All packages are private: if all existing packages are private and the team won't be
consuming any public packages, the new upstream behavior will have no effect on
the team's workflow in this scenario.
[Diagram: all versions are private and come from the feed; all versions remain available.]

All packages are public: if all the packages consumed are public, whether it's from
the public registry or any other open-source repositories, the new upstream
behavior will have no effect on the team's workflow in this scenario.
[Diagram: all versions are public; all versions remain available.]

Public package made private: if a public package is switched to a private package,
the new upstream behavior will have no effect on the team's workflow in this
scenario.

[Diagram: the existing public versions and the new private versions all remain available.]

Allow external versions

7 Note

You must be a feed Owner or a feed Administrator to allow externally sourced
versions. See Feed permissions for more details.

1. Select Artifacts, and then select your feed.

2. Select your package, and then select the ellipsis button for more options. Select
Allow externally-sourced versions.
3. Select the toggle button to allow external versions. Select Close when you're done.

Allow external versions using the REST API


Aside from using the feed's user interface, you can also configure the upstream behavior
using the Azure DevOps Services REST API. Select the appropriate tab and find the links
to the REST API docs.

NuGet

Get upstreaming behavior


Set upstreaming behavior

Allow external versions using PowerShell


1. Create a personal access token with Packaging > Read, write, & manage
permissions.

2. Create an environment variable for your personal access token.

PowerShell

$env:PATVAR = "YOUR_PERSONAL_ACCESS_TOKEN"

3. Convert your personal access token to a Base64-encoded string and construct the
HTTP request header.

PowerShell

$token =
[Convert]::ToBase64String(([Text.Encoding]::ASCII.GetBytes("username:$env:PATVAR")))

$headers = @{
    Authorization = "Basic $token"
}

4. Construct your endpoint URL. Example:

https://pkgs.dev.azure.com/MyOrg/MyProject/_apis/packaging/feeds/MyFeed/nuget/packages/pkg1.0.0.nupkg/upstreaming?api-version=6.1-preview.1

Project-scoped feed:

PowerShell

$url = "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/<PROJECT_NAME>/_apis/packaging/feeds/<FEED_NAME>/<PROTOCOL>/packages/<PACKAGE_NAME>/upstreaming?api-version=6.1-preview.1"

Organization-scoped feed:

PowerShell

$url = "https://pkgs.dev.azure.com/<ORGANIZATION_NAME>/_apis/packaging/feeds/<FEED_NAME>/<PROTOCOL>/packages/<PACKAGE_NAME>/upstreaming?api-version=6.1-preview.1"

Get upstreaming behavior

Run the following command to retrieve the upstream behavior state of your
package. $url and $headers are the same variables we used in the previous
section.

PowerShell

Invoke-RestMethod -Uri $url -Headers $headers

Related articles
Understand upstream sources
Set up upstream sources
Manage dependencies with upstream sources
Feeds permissions
Best practices
Artifacts in Azure Pipelines - overview
Article • 11/07/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts enables developers to consume and publish different types of packages
to Artifacts feeds and public registries such as NuGet.org and npmjs.com. You can use
Azure Artifacts in conjunction with Azure Pipelines to deploy packages, publish build
artifacts, or integrate files between your pipeline stages to build, test, or deploy your
application.

Supported artifact types


Build artifacts: The files generated by a build. Example: .dll, .exe, or .PDB files.

NuGet: Publish NuGet packages to Azure Artifacts feeds or public registries such as nuget.org.

npm: Publish npm packages to Azure Artifacts feeds or public registries such as npmjs.com.

Maven: Publish Maven packages to Azure Artifacts feeds.

Python: Publish Python packages to Azure Artifacts feeds or PyPI.org.

Universal Packages: Publish Universal Packages to Azure Artifacts feeds.

Symbols: Symbol files contain debugging information about the compiled executables. You can
publish symbols to the Azure Artifacts symbol server or to a file share. Symbol servers
enable debuggers to automatically retrieve the correct symbol files without knowing
the specific product, package, or build information.
Publish and consume artifacts


NuGet

Publish a NuGet package using the command line


Publish to NuGet feeds (YAML/Classic)
Consume NuGet packages
 Tip

If your organization is using a firewall or a proxy server, make sure you allow Azure
Artifacts Domain URLs and IP addresses.

Next steps
Publish and download Pipeline Artifacts
Build Artifacts

Release artifacts and artifact sources


Publish and download pipeline Artifacts
Article • 11/29/2022

Azure DevOps Services

Using Azure Pipelines, you can download artifacts from earlier stages in your pipeline or
from another pipeline. You can also publish your artifact to a file share or make it
available as a pipeline artifact.

Publish artifacts
You can publish your artifacts using YAML, the classic editor, or Azure CLI:

7 Note

Publishing pipeline artifacts is not supported in release pipelines.

YAML

YAML

steps:
- publish: $(System.DefaultWorkingDirectory)/bin/WebApp
  artifact: WebApp

7 Note

The publish keyword is a shortcut for the Publish Pipeline Artifact task .

Although the artifact's name is optional, it is a good practice to specify a name that
accurately reflects the contents of your artifact. If you plan to consume the artifact from
a job running on a different OS, you must ensure all the file paths are valid for the target
environment. For example, a file name containing the character \ or * will fail to
download on Windows.

The path of the file/folder that you want to publish is required. This can be an absolute
or a relative path to $(System.DefaultWorkingDirectory) .
Packages in Azure Artifacts are immutable. Once you publish a package, its version is
permanently reserved, so rerunning failed jobs will fail if the package has already been
published. If you want to be able to rerun failed jobs without hitting a package already
exists error, use Conditions to run the publish step only if the previous job succeeded.

yml

jobs:
- job: Job1
  steps:
  - script: echo Hello Job1!
- job: Job2
  dependsOn: Job1
  steps:
  - script: echo Hello Job2!
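If you prefer an explicit gate, a condition on the dependent job makes the intent visible. The job names, artifact name, and path below are illustrative, not part of the original example:

```yaml
jobs:
- job: Build
  steps:
  - script: echo Building...
- job: PublishArtifact
  dependsOn: Build
  condition: succeeded()  # run the publish job only when Build succeeded
  steps:
  - publish: $(System.DefaultWorkingDirectory)/bin/WebApp
    artifact: WebApp
```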

7 Note

You will not be billed for storing Pipeline Artifacts. Pipeline Caching is also exempt
from storage billing. See Which artifacts count toward my total billed storage.

U Caution

Deleting a pipeline run will result in the deletion of all Artifacts associated with that
run.

Use .artifactignore
.artifactignore uses a similar syntax to .gitignore (with a few limitations) to specify
which files should be ignored when publishing artifacts. Make sure that the
.artifactignore file is located within the directory specified by the targetPath argument of
your Publish Pipeline Artifacts task.

7 Note

The plus sign character + is not supported in URL paths and in some build metadata
for package types such as Maven.

Example: ignore all files except .exe files:


**/*

!*.exe

) Important

Azure Artifacts automatically ignores the .git folder path when you don't have a
.artifactignore file. You can bypass this by creating an empty .artifactignore file.
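A slightly larger .artifactignore sketch that trims common noise from a published artifact (the folder and file names here are assumptions for illustration):

```
# Exclude logs and test output from the published artifact
**/*.log
**/TestResults/**
# Exclude dependencies that can be restored later
node_modules
```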

Download artifacts
You can download artifacts using YAML, the classic editor, or Azure CLI.

YAML

YAML

steps:
- download: current
  artifact: WebApp

current: download artifacts produced by the current pipeline run. Options:
current, specific.

7 Note

The list of published artifacts is only available to subsequent, dependent jobs.
Therefore, use the current option only in separate jobs that depend on the jobs
containing the publish artifacts tasks.

 Tip

You can use Pipeline resources to define your source in one place and use it
anywhere in your pipeline.

7 Note

The download keyword is a shortcut for the Download Pipeline Artifact task.
By default, files are downloaded to $(Pipeline.Workspace). If an artifact name was not
specified, a subdirectory will be created for each downloaded artifact. You can use
matching patterns to limit which files get downloaded. See File matching patterns for
more details.

yml

steps:
- download: current
  artifact: WebApp
  patterns: |
    **/*.js
    **/*.zip

Artifacts selection
A single download step can download one or more artifacts. To download multiple
artifacts, leave the artifact name field empty and use file matching patterns to limit
which files will be downloaded. ** is the default file matching pattern (all files in all
artifacts).

Single artifact

When an artifact name is specified:

1. Only files for that specific artifact are downloaded. If the artifact does not exist, the
task will fail.

2. File matching patterns are evaluated relative to the root of the artifact. For
example, the pattern *.jar matches all files with a .jar extension at the root of
the artifact.

The following example illustrates how to download all *.js from an artifact WebApp :

YAML

YAML

steps:
- download: current
  artifact: WebApp
  patterns: '**/*.js'

Multiple artifacts
When no artifact name is specified:

1. Multiple artifacts can be downloaded and the task does not fail if no files are
found.

2. A subdirectory is created for each artifact.

3. File matching patterns should assume the first segment of the pattern is (or
matches) an artifact name. For example, WebApp/** matches all files from the
WebApp artifact. The pattern */*.dll matches all files with a .dll extension at the
root of each artifact.

The following example illustrates how to download all .zip files from all artifacts:

YAML

YAML

steps:
- download: current
  patterns: '**/*.zip'

Artifacts in release and deployment jobs


Artifacts are only downloaded automatically in deployment jobs. By default, artifacts are
downloaded to $(Pipeline.Workspace) . The download artifact task is auto-injected
only when you use the deploy lifecycle hook in your deployment. To stop artifacts from
being downloaded automatically, add a download step and set its value to none. In a
regular build job, you need to explicitly use the download step keyword or the Download
Pipeline Artifact task. See lifecycle hooks to learn more about the other types of hooks.

YAML

steps:
- download: none

Use Artifacts across stages


If you want to be able to access your artifact across different stages in your pipeline, you
can now publish your artifact in one stage and then download it in the next stage
leveraging dependencies. See Stage to stage dependencies for more details.

Example
In the following example, we will copy and publish a script folder from our repo to the
$(Build.ArtifactStagingDirectory) . In the second stage, we will download and run our

script.

YAML

trigger:
- main

stages:
- stage: build
  jobs:
  - job: run_build
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: VSBuild@1
      inputs:
        solution: '**/*.sln'
        msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:DesktopBuildPackageLocation="$(build.artifactStagingDirectory)\WebApp.zip" /p:DeployIisAppPath="Default Web Site"'
        platform: 'Any CPU'
        configuration: 'Release'
    - task: CopyFiles@2
      displayName: 'Copy scripts'
      inputs:
        contents: 'scripts/**'
        targetFolder: '$(Build.ArtifactStagingDirectory)'
    - publish: '$(Build.ArtifactStagingDirectory)/scripts'
      displayName: 'Publish script'
      artifact: drop

- stage: test
  dependsOn: build
  jobs:
  - job: run_test
    pool:
      vmImage: 'windows-latest'
    steps:
    - download: current
      artifact: drop
    - task: PowerShell@2
      inputs:
        filePath: '$(Pipeline.Workspace)\drop\test.ps1'

Migrate from build artifacts


Pipeline artifacts are the next generation of build artifacts and are the recommended
way to work with artifacts. Artifacts published using the Publish Build Artifacts task can
still be downloaded using Download Build Artifacts, but we recommend using the latest
Download Pipeline Artifact task instead.

When migrating from build artifacts to pipeline artifacts:

1. By default, the Download Pipeline Artifact task downloads files to
$(Pipeline.Workspace) . This is the default and recommended path for all types of
artifacts.

2. File matching patterns for the Download Build Artifacts task are expected to start
with (or match) the artifact name, regardless if a specific artifact was specified or
not. In the Download Pipeline Artifact task, patterns should not include the
artifact name when an artifact name has already been specified. For more
information, see single artifact selection.

Example
YAML

- task: PublishPipelineArtifact@1
  displayName: 'Publish'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)/**
    ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
      artifactName: 'prod'
    ${{ else }}:
      artifactName: 'dev'
    artifactType: 'pipeline'

targetPath: The path of the file or directory to publish. Can be absolute or relative
to the default working directory. Can include variables, but wildcards are not
supported.

artifactName: Name of the artifact to publish. If not set, defaults to a unique ID
scoped to the job.

artifactType: Choose whether to store the artifact in Azure Pipelines, or to copy it
to a file share that must be accessible from the pipeline agent. Options: pipeline ,
filepath .

FAQ

Q: What are build artifacts?

A: Build artifacts are the files generated by your build. See Build Artifacts to learn more
about how to publish and consume your build artifacts.

Q: Can I delete pipeline artifacts when re-running failed jobs?

A: Pipeline artifacts can't be deleted or overwritten. If you want to regenerate artifacts
when you rerun a failed job, you can include the job ID in the artifact name.
$(System.JobId) is the appropriate variable for this purpose. See System variables to
learn more about predefined variables.
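For example, a publish step whose artifact name embeds the job ID (the path and name prefix here are assumptions) produces a fresh artifact name on each rerun:

```yaml
steps:
- publish: $(Build.ArtifactStagingDirectory)
  artifact: 'drop-$(System.JobId)'  # unique per job, so reruns don't collide
```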

Q: How can I access Artifacts feeds behind a firewall?

A: If your organization is using a firewall or a proxy server, make sure you allow Azure
Artifacts Domain URLs and IP addresses.
Related articles
Build artifacts
Releases in Azure Pipelines
Release artifacts and artifact sources
How to mitigate risk when using private package feeds
Pipeline caching
Article • 03/22/2023

Azure DevOps Services

Pipeline caching can help reduce build time by allowing the outputs or downloaded
dependencies from one run to be reused in later runs, thereby reducing or avoiding the
cost to recreate or redownload the same files again. Caching is especially useful in
scenarios where the same dependencies are downloaded over and over at the start of
each run. This is often a time-consuming process involving hundreds or thousands of
network calls.

Caching can be effective at improving build time provided the time to restore and save
the cache is less than the time to produce the output again from scratch. Because of
this, caching may not be effective in all scenarios and may actually have a negative
impact on build time.

Caching is currently supported in CI and deployment jobs, but not classic release jobs.

When to use artifacts versus caching


Pipeline caching and pipeline artifacts perform similar functions but are designed for
different scenarios and shouldn't be used interchangeably.

Use pipeline artifacts when you need to take specific files produced in one job
and share them with other jobs (and these other jobs will likely fail without them).

Use pipeline caching when you want to improve build time by reusing files from
previous runs (and not having these files won't impact the job's ability to run).

7 Note

Pipeline caching and pipeline artifacts are free for all tiers (free and paid). See
Artifacts storage consumption for more details.

Cache task: how it works


Caching is added to a pipeline using the Cache task. This task works like any other task
and is added to the steps section of a job.
When a cache step is encountered during a run, the task restores the cache based on
the provided inputs. If no cache is found, the step completes and the next step in the
job is run.

After all steps in the job have run and assuming a successful job status, a special
"Post-job: Cache" step is automatically added and triggered for each "restore cache"
step that wasn't skipped. This step is responsible for saving the cache.

7 Note

Caches are immutable, meaning that once a cache is created, its contents cannot
be changed.

Configure the Cache task


The Cache task has two required arguments: key and path:

path: the path of the folder to cache. Can be an absolute or a relative path. Relative
paths are resolved against $(System.DefaultWorkingDirectory) .

7 Note

You can use predefined variables to store the path to the folder you want to cache,
however wildcards are not supported.

key: should be set to the identifier for the cache you want to restore or save. Keys
are composed of a combination of string values, file paths, or file patterns, where
each segment is separated by a | character.

Strings:

Fixed value (like the name of the cache or a tool name) or taken from an
environment variable (like the current OS or current job name)

File paths:

Path to a specific file whose contents will be hashed. This file must exist at the time
the task is run. Keep in mind that any key segment that "looks like a file path" will
be treated like a file path. In particular, this includes segments containing a . . This
could result in the task failing when this "file" doesn't exist.

 Tip
To avoid a path-like string segment from being treated like a file path, wrap it
with double quotes, for example: "my.key" | $(Agent.OS) | key.file

File patterns:

Comma-separated list of glob-style wildcard patterns that must match at least one
file. For example:
**/yarn.lock : all yarn.lock files under the sources directory

*/asset.json, !bin/** : all asset.json files located in a directory under the


sources directory, except under the bin directory

The contents of any file identified by a file path or file pattern are hashed to produce a
dynamic cache key. This is useful when your project has files that uniquely identify
what is being cached. For example, files like package-lock.json , yarn.lock ,
Gemfile.lock , or Pipfile.lock are commonly referenced in a cache key since they all
represent a unique set of dependencies.

Relative file paths or file patterns are resolved against
$(System.DefaultWorkingDirectory) .
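The dynamic-key behavior can be illustrated outside of Azure Pipelines. The snippet below is only a sketch of the idea (the Cache task's actual hashing is internal to the task): hashing a lockfile's contents produces a key segment that changes exactly when the dependencies change.

```shell
# Illustration only: not the Cache task's actual algorithm, just the idea that
# hashing a lockfile yields a key segment that changes when dependencies change.
workdir=$(mktemp -d)
printf 'lodash@4.17.21\n' > "$workdir/yarn.lock"
key1="yarn | Linux | $(sha256sum "$workdir/yarn.lock" | cut -d' ' -f1)"

# Adding a dependency changes the lockfile, so the hash (and the key) changes,
# which would produce a new cache entry on the next run.
printf 'lodash@4.17.21\nreact@18.2.0\n' > "$workdir/yarn.lock"
key2="yarn | Linux | $(sha256sum "$workdir/yarn.lock" | cut -d' ' -f1)"

echo "$key1"
echo "$key2"
```

Editing the lockfile between the two hashes produces two different keys, mirroring how a dependency change invalidates the old cache entry.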

Example:

Here's an example showing how to cache dependencies installed by Yarn:

YAML

variables:
  YARN_CACHE_FOLDER: $(Pipeline.Workspace)/.yarn

steps:
- task: Cache@2
  inputs:
    key: '"yarn" | "$(Agent.OS)" | yarn.lock'
    restoreKeys: |
      "yarn" | "$(Agent.OS)"
      "yarn"
    path: $(YARN_CACHE_FOLDER)
  displayName: Cache Yarn packages

- script: yarn --frozen-lockfile

In this example, the cache key contains three parts: a static string ("yarn"), the OS the job
is running on since this cache is unique per operating system, and the hash of the
yarn.lock file that uniquely identifies the set of dependencies in the cache.
On the first run after the task is added, the cache step will report a "cache miss" since
the cache identified by this key doesn't exist. After the last step, a cache will be created
from the files in $(Pipeline.Workspace)/.yarn and uploaded. On the next run, the cache
step will report a "cache hit" and the contents of the cache will be downloaded and
restored.

7 Note

Pipeline.Workspace is the local path on the agent running your pipeline where all
directories are created. This variable has the same value as Agent.BuildDirectory .

Restore keys
restoreKeys can be used if you want to query against multiple exact keys or key
prefixes. It's used to fall back to another key when the primary key doesn't yield a hit.
A restore key searches for a key by prefix and yields the latest created cache entry as a
result. This is useful if the pipeline can't find an exact match but wants to use a
partial cache hit instead. To insert multiple restore keys, delimit them with a
new line (see the example for more details). Restore keys are tried in order, from
top to bottom.

Required software on self-hosted agent

Archive software / Platform Windows Linux Mac

GNU Tar Required Required No

BSD Tar No No Required

7-Zip Recommended No No

These executables need to be in a folder listed in the PATH environment variable.
Hosted agents come with the software included; this requirement is only applicable
to self-hosted agents.
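On a self-hosted Linux agent, a quick way to confirm the requirement is met is to check that GNU tar resolves on the PATH. This check is a convenience sketch, not part of the Cache task itself:

```shell
# Sketch: verify GNU tar is available on a self-hosted Linux agent's PATH.
if command -v tar >/dev/null && tar --version | grep -q 'GNU tar'; then
  echo "GNU tar found: $(command -v tar)"
else
  echo "GNU tar not found on PATH" >&2
fi
```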

Example:

Here's an example of how to use restore keys with Yarn:

YAML

variables:
  YARN_CACHE_FOLDER: $(Pipeline.Workspace)/.yarn

steps:
- task: Cache@2
  inputs:
    key: '"yarn" | "$(Agent.OS)" | yarn.lock'
    restoreKeys: |
      yarn | "$(Agent.OS)"
      yarn
    path: $(YARN_CACHE_FOLDER)
  displayName: Cache Yarn packages

- script: yarn --frozen-lockfile

In this example, the cache task checks whether the key exists in the cache. If the key
doesn't exist in the cache, it tries to use the first restore key yarn | $(Agent.OS) . This
searches for all keys that either exactly match that key or have that key as a
prefix. A prefix hit can happen if there was a different yarn.lock hash segment. For
example, if the key yarn | $(Agent.OS) | old-yarn.lock was in the cache,
where the old yarn.lock yielded a different hash than yarn.lock , the restore key
yields a partial hit. If there's a miss on the first restore key, it then uses the next restore
key yarn , which tries to find any key that starts with yarn . For prefix hits, the most
recently created cache key is yielded as the result.

7 Note

A pipeline can have one or more caching task(s). There is no limit on the caching
storage capacity, and jobs and tasks from the same pipeline can access and share
the same cache.

Cache isolation and security


To ensure isolation between caches from different pipelines and different branches,
every cache belongs to a logical container called a scope. Scopes provide a security
boundary that ensures a job from one pipeline cannot access the caches from a different
pipeline, and a job building a PR has read access to the caches for the PR's target
branch (for the same pipeline), but cannot write (create) caches in the target branch's
scope.

When a cache step is encountered during a run, the cache identified by the key is
requested from the server. The server then looks for a cache with this key from the
scopes visible to the job, and returns the cache (if available). On cache save (at the end
of the job), a cache is written to the scope representing the pipeline and branch. See
below for more details.

CI, manual, and scheduled runs

Scope Read Write

Source branch Yes Yes

main branch (default branch) Yes No

Pull request runs

Scope Read Write

Source branch Yes No

Target branch Yes No

Intermediate branch (such as refs/pull/1/merge ) Yes Yes

main branch (default branch) Yes No

Pull request fork runs

Branch Read Write

Target branch Yes No

Intermediate branch (such as refs/pull/1/merge ) Yes Yes

main branch (default branch) Yes No

 Tip

Because caches are already scoped to a project, pipeline, and branch, there is no
need to include any project, pipeline, or branch identifiers in the cache key.

Conditioning on cache restoration


In some scenarios, the successful restoration of the cache should cause a different set of
steps to be run. For example, a step that installs dependencies can be skipped if the
cache was restored. This is possible using the cacheHitVar task input. Setting this input
to the name of an environment variable causes the variable to be set to true when
there's an exact cache hit, inexact on a restore key cache hit, and false otherwise.
This variable can then be referenced in a step condition or from within a script.

In the following example, the install-deps.sh step is skipped when the cache is
restored:

YAML

steps:
- task: Cache@2
  inputs:
    key: mykey | mylockfile
    restoreKeys: mykey
    path: $(Pipeline.Workspace)/mycache
    cacheHitVar: CACHE_RESTORED

- script: install-deps.sh
  condition: ne(variables.CACHE_RESTORED, 'true')

- script: build.sh

Bundler
For Ruby projects using Bundler, override the BUNDLE_PATH environment variable used by
Bundler to set the path Bundler will look for Gems in.

Example:

YAML

variables:
  BUNDLE_PATH: $(Pipeline.Workspace)/.bundle

steps:
- task: Cache@2
  displayName: Bundler caching
  inputs:
    key: 'gems | "$(Agent.OS)" | Gemfile.lock'
    restoreKeys: |
      gems | "$(Agent.OS)"
      gems
    path: $(BUNDLE_PATH)

Ccache (C/C++)
Ccache is a compiler cache for C/C++. To use Ccache in your pipeline make sure
Ccache is installed, and optionally added to your PATH (see Ccache run modes ). Set
the CCACHE_DIR environment variable to a path under $(Pipeline.Workspace) and cache
this directory.

Example:

YAML

variables:
  CCACHE_DIR: $(Pipeline.Workspace)/ccache

steps:
- bash: |
    sudo apt-get install ccache -y
    echo "##vso[task.prependpath]/usr/lib/ccache"
  displayName: Install ccache and update PATH to use linked versions of gcc, cc, etc

- task: Cache@2
  inputs:
    key: 'ccache | "$(Agent.OS)"'
    path: $(CCACHE_DIR)
    restoreKeys: |
      ccache | "$(Agent.OS)"
  displayName: ccache

See Ccache configuration settings for more details.

Docker images
Caching Docker images dramatically reduces the time it takes to run your pipeline.

YAML

variables:
  repository: 'myDockerImage'
  dockerfilePath: '$(Build.SourcesDirectory)/app/Dockerfile'
  tag: '$(Build.BuildId)'

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: Cache@2
  displayName: Cache task
  inputs:
    key: 'docker | "$(Agent.OS)" | cache'
    path: $(Pipeline.Workspace)/docker
    cacheHitVar: CACHE_RESTORED # Variable to set to 'true' when the cache is restored

- script: |
    docker load -i $(Pipeline.Workspace)/docker/cache.tar
  displayName: Docker restore
  condition: and(not(canceled()), eq(variables.CACHE_RESTORED, 'true'))

- task: Docker@2
  displayName: 'Build Docker'
  inputs:
    command: 'build'
    repository: '$(repository)'
    dockerfile: '$(dockerfilePath)'
    tags: |
      '$(tag)'

- script: |
    mkdir -p $(Pipeline.Workspace)/docker
    docker save -o $(Pipeline.Workspace)/docker/cache.tar $(repository):$(tag)
  displayName: Docker save
  condition: and(not(canceled()), not(failed()), ne(variables.CACHE_RESTORED, 'true'))

key: (required) - a unique identifier for the cache.


path: (required) - path of the folder or file that you want to cache.

Golang
For Golang projects, you can specify the packages to be downloaded in the go.mod file.
If your GOCACHE variable isn't already set, set it to where you want the cache to be
downloaded.

Example:

YAML

variables:
  GO_CACHE_DIR: $(Pipeline.Workspace)/.cache/go-build/

steps:
- task: Cache@2
  inputs:
    key: 'go | "$(Agent.OS)" | go.mod'
    restoreKeys: |
      go | "$(Agent.OS)"
    path: $(GO_CACHE_DIR)
  displayName: Cache GO packages

Gradle
Using Gradle's built-in caching support can have a significant impact on build time. To
enable the build cache, set the GRADLE_USER_HOME environment variable to a path under
$(Pipeline.Workspace) and either run your build with --build-cache or add
org.gradle.caching=true to your gradle.properties file.

Example:

YAML

variables:
  GRADLE_USER_HOME: $(Pipeline.Workspace)/.gradle

steps:
- task: Cache@2
  inputs:
    key: 'gradle | "$(Agent.OS)" | **/build.gradle.kts' # Swap build.gradle.kts for build.gradle when using Groovy
    restoreKeys: |
      gradle | "$(Agent.OS)"
      gradle
    path: $(GRADLE_USER_HOME)
  displayName: Configure gradle caching

- task: Gradle@2
  inputs:
    gradleWrapperFile: 'gradlew'
    tasks: 'build'
    options: '--build-cache'
  displayName: Build

- script: |
    # stop the Gradle daemon to ensure no files are left open (impacting the save cache operation later)
    ./gradlew --stop
  displayName: Gradlew stop

restoreKeys: The fallback keys if the primary key fails (Optional)

7 Note

Caches are immutable: once a cache with a particular key is created for a specific
scope (branch), the cache cannot be updated. This means that if the key is a fixed
value, all subsequent builds for the same branch won't be able to update the
cache even if the cache's contents have changed. If you want to use a fixed key
value, you must use the restoreKeys argument as a fallback option.
Maven
Maven has a local repository where it stores downloads and built artifacts. To enable
caching, set the maven.repo.local option to a path under $(Pipeline.Workspace) and
cache this folder.

Example:

YAML

variables:
  MAVEN_CACHE_FOLDER: $(Pipeline.Workspace)/.m2/repository
  MAVEN_OPTS: '-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)'

steps:
- task: Cache@2
  inputs:
    key: 'maven | "$(Agent.OS)" | **/pom.xml'
    restoreKeys: |
      maven | "$(Agent.OS)"
      maven
    path: $(MAVEN_CACHE_FOLDER)
  displayName: Cache Maven local repo

- script: mvn install -B -e

If you're using a Maven task, make sure to also pass the MAVEN_OPTS variable because it
gets overwritten otherwise:

YAML

- task: Maven@4
  inputs:
    mavenPomFile: 'pom.xml'
    mavenOptions: '-Xmx3072m $(MAVEN_OPTS)'

.NET/NuGet
If you use PackageReferences to manage NuGet dependencies directly within your
project file and have a packages.lock.json file, you can enable caching by setting the
NUGET_PACKAGES environment variable to a path under $(UserProfile) and caching this
directory. See Package reference in project files for more details on how to lock
dependencies. If you use multiple packages.lock.json files, you can still use the
following example without making any changes. The contents of all the
packages.lock.json files are hashed, and if one of the files changes, a new cache key
is generated.

Example:

YAML

variables:
  NUGET_PACKAGES: $(Pipeline.Workspace)/.nuget/packages

steps:
- task: Cache@2
  inputs:
    key: 'nuget | "$(Agent.OS)" | $(Build.SourcesDirectory)/**/packages.lock.json'
    restoreKeys: |
      nuget | "$(Agent.OS)"
      nuget
    path: $(NUGET_PACKAGES)
  displayName: Cache NuGet packages

Node.js/npm
There are different ways to enable caching in a Node.js project, but the recommended
way is to cache npm's shared cache directory . This directory is managed by npm and
contains a cached version of all downloaded modules. During install, npm checks this
directory first (by default) for modules that can reduce or eliminate network calls to the
public npm registry or to a private registry.

Because the default path to npm's shared cache directory is not the same across all
platforms , it's recommended to override the npm_config_cache environment variable
to a path under $(Pipeline.Workspace) . This also ensures the cache is accessible from
container and non-container jobs.

Example:

YAML

variables:
  npm_config_cache: $(Pipeline.Workspace)/.npm

steps:
- task: Cache@2
  inputs:
    key: 'npm | "$(Agent.OS)" | package-lock.json'
    restoreKeys: |
      npm | "$(Agent.OS)"
    path: $(npm_config_cache)
  displayName: Cache npm

- script: npm ci

If your project doesn't have a package-lock.json file, reference the package.json file in
the cache key input instead.

 Tip

Because npm ci deletes the node_modules folder to ensure that a consistent,


repeatable set of modules is used, you should avoid caching node_modules when
calling npm ci .

Node.js/Yarn
Like with npm, there are different ways to cache packages installed with Yarn. The
recommended way is to cache Yarn's shared cache folder . This directory is managed
by Yarn and contains a cached version of all downloaded packages. During install, Yarn
checks this directory first (by default) for modules, which can reduce or eliminate
network calls to public or private registries.

Example:

YAML

variables:
  YARN_CACHE_FOLDER: $(Pipeline.Workspace)/.yarn

steps:
- task: Cache@2
  inputs:
    key: 'yarn | "$(Agent.OS)" | yarn.lock'
    restoreKeys: |
      yarn | "$(Agent.OS)"
      yarn
    path: $(YARN_CACHE_FOLDER)
  displayName: Cache Yarn packages

- script: yarn --frozen-lockfile

Python/Anaconda
Set up your pipeline caching with Anaconda environments:

Example
YAML

variables:
  CONDA_CACHE_DIR: /usr/share/miniconda/envs

steps:
# Add conda to system path
- script: echo "##vso[task.prependpath]$CONDA/bin"
  displayName: Add conda to PATH

- bash: |
    sudo chown -R $(whoami):$(id -ng) $(CONDA_CACHE_DIR)
  displayName: Fix CONDA_CACHE_DIR directory permissions

- task: Cache@2
  displayName: Use cached Anaconda environment
  inputs:
    key: 'conda | "$(Agent.OS)" | environment.yml'
    restoreKeys: |
      python | "$(Agent.OS)"
      python
    path: $(CONDA_CACHE_DIR)
    cacheHitVar: CONDA_CACHE_RESTORED

- script: conda env create --quiet --file environment.yml
  displayName: Create Anaconda environment
  condition: eq(variables.CONDA_CACHE_RESTORED, 'false')

Windows

YAML

- task: Cache@2
  displayName: Cache Anaconda
  inputs:
    key: 'conda | "$(Agent.OS)" | environment.yml'
    restoreKeys: |
      python | "$(Agent.OS)"
      python
    path: $(CONDA)/envs
    cacheHitVar: CONDA_CACHE_RESTORED

- script: conda env create --quiet --file environment.yml
  displayName: Create environment
  condition: eq(variables.CONDA_CACHE_RESTORED, 'false')


PHP/Composer
For PHP projects using Composer, override the COMPOSER_CACHE_DIR environment
variable used by Composer.

Example:

YAML

variables:
  COMPOSER_CACHE_DIR: $(Pipeline.Workspace)/.composer

steps:
- task: Cache@2
  inputs:
    key: 'composer | "$(Agent.OS)" | composer.lock'
    restoreKeys: |
      composer | "$(Agent.OS)"
      composer
    path: $(COMPOSER_CACHE_DIR)
  displayName: Cache composer

- script: composer install

Known issues and feedback


If you're experiencing issues setting up caching for your pipeline, check the list of open
issues in the microsoft/azure-pipelines-tasks repo. If you don't see your issue listed,
create a new one and provide the necessary information about your scenario.

Q&A

Q: Can I clear a cache?


A: Clearing a cache is currently not supported. However, you can add a string literal (such
as version2 ) to your existing cache key to change the key in a way that avoids any hits
on existing caches. For example, change the following cache key from this:

YAML

key: 'yarn | "$(Agent.OS)" | yarn.lock'

to this:
YAML

key: 'version2 | yarn | "$(Agent.OS)" | yarn.lock'

Q: When does a cache expire?


A: Caches expire after seven days of no activity.

Q: When does the cache get uploaded?


A: After the last step of your pipeline, a cache is created from your cache path and
uploaded. See the example for more details.

Q: Is there a limit on the size of a cache?


A: There's no enforced limit on the size of individual caches or the total size of all caches
in an organization.
Set retention policies for builds,
releases, and tests
Article • 05/03/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Retention policies let you set how long to keep runs, releases, and tests stored in the
system. To save storage space, you want to delete older runs, tests, and releases.

The following retention policies are available in Azure DevOps in your Project settings:

1. Pipeline - Set how long to keep artifacts, symbols, attachments, runs, and pull
request runs.
2. Release (classic) - Set whether to save builds and view the default and maximum
retention settings.
3. Test - Set how long to keep automated and manual test runs, results, and
attachments.
7 Note

If you are using an on-premises server, you can also specify retention policy
defaults for a project and when releases are permanently destroyed. Learn more
about release retention later in this article.

Prerequisites
By default, members of the Contributors, Build Admins, Project Admins, and Release
Admins groups can manage retention policies.

To manage retention policies, you must have one of the following subscriptions:

Enterprise
Test Professional
MSDN Platforms

You can also buy monthly access to Azure Test Plans and assign the Basic + Test Plans
access level. See Testing access by user role.

Configure retention policies


1. Sign in to your project.

2. Go to the Settings tab of your project's settings.

3. Select Settings or Release retention under Pipelines or Retention under Test.

Select Settings to configure retention policies for runs, artifacts, symbols,


attachments, and pull request runs.
Select Release retention to set up your release retention policies and
configure when to delete or permanently destroy releases.
Select Retention to set up how long to keep manual and automated test
runs.
Set run retention policies
In most cases, you don't need to retain completed runs longer than a certain number of
days.
Using retention policies, you can control how many days you want to keep each
run before deleting it.

1. Go to the Settings tab of your project's settings.

2. Select Settings in the Pipelines section.


Set the number of days to keep artifacts, symbols, and attachments.
Set the number of days to keep runs
Set the number of days to keep pull request runs
Set the number of recent runs to keep for each pipeline

2 Warning

Azure DevOps no longer supports per-pipeline retention rules. The only way to
configure retention policies for YAML and classic pipelines is through the project
settings described above. You can no longer configure per-pipeline retention
policies.

The setting for number of recent runs to keep for each pipeline requires a little more
explanation. The interpretation of this setting varies based on the type of repository you
build in your pipeline.

Azure Repos: Azure Pipelines retains the configured number of latest runs for the
pipeline's default branch and for each protected branch of the repository. A branch
that has any branch policies configured is considered to be a protected branch.

As an example, consider a repository with two branches, main and release .


Imagine the pipeline's default branch is the main branch, and the release
branch has a branch policy, making it a protected branch. In this case, if you
configured the policy to retain three runs, then both the latest three runs of main
and the latest three runs of the release branch are retained. In addition, the latest
three runs of this pipeline (irrespective of the branch) are also retained.

To clarify this logic further, let us say the list of runs for this pipeline is as follows,
with the most recent run at the top. The table shows which runs will be retained if
you have configured to retain the latest three runs (ignoring the effect of the
number of days setting):

Run # Branch Retained / Not retained Why?

Run 10 main Retained Latest 3 for main and Latest 3 for pipeline

Run 9 branch1 Retained Latest 3 for pipeline

Run 8 branch2 Retained Latest 3 for pipeline

Run 7 main Retained Latest 3 for main

Run 6 main Retained Latest 3 for main



Run 5 main Not retained Neither latest 3 for main, nor for pipeline

Run 4 main Not retained Neither latest 3 for main, nor for pipeline

Run 3 branch1 Not retained Neither latest 3 for main, nor for pipeline

Run 2 release Retained Latest 3 for release

Run 1 main Not retained Neither latest 3 for main, nor for pipeline

All other Git repositories: Azure Pipelines retains the configured number of latest
runs for the whole pipeline.

TFVC: Azure Pipelines retains the configured number of latest runs for the whole
pipeline, irrespective of the branch.

What parts of the run get deleted


The following information is deleted when a run is deleted:

Logs
All pipeline and build artifacts
All symbols
Binaries
Test results
Run metadata
Source labels (TFVC) or tags (Git)

Universal Packages, NuGet, npm, and other packages are not tied to pipeline retention.

When are runs deleted


Your retention policies are processed once a day. The time that the policies get
processed varies because we spread the work throughout the day for load-balancing
purposes. There is no option to change this process.

A run is deleted if all of the following conditions are true:

It exceeds the number of days configured in the retention settings


It is not one of the recent runs as configured in the retention settings
It is not marked to be retained indefinitely
It is not retained by a release
Automatically set retention lease on pipeline runs
Retention leases are used to manage the lifetime of pipeline runs beyond the configured
retention periods. Retention leases can be added or deleted on a pipeline run by calling
the Lease API. This API can be invoked within the pipeline using a script and using
predefined variables for runId and definitionId.

A retention lease can be added to a pipeline run for a specific period. For example, a
pipeline run that deploys to a test environment can be retained for a shorter duration,
while a run deploying to a production environment can be retained longer.
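As a sketch of this, a short script step can call the Leases REST API from within the pipeline using the predefined System.AccessToken, System.DefinitionId, and Build.BuildId variables. The api-version, payload shape, and daysValid value below are assumptions to adapt to your organization and server version:

```yaml
steps:
- bash: |
    # Assumption: Leases - Add endpoint and payload; adjust api-version as needed.
    curl -s -X POST \
      -H "Authorization: Bearer $(System.AccessToken)" \
      -H "Content-Type: application/json" \
      -d '[{"daysValid": 30, "definitionId": $(System.DefinitionId), "ownerId": "User:$(Build.RequestedForId)", "runId": $(Build.BuildId)}]' \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/build/retention/leases?api-version=6.0-preview.1"
  displayName: Add a 30-day retention lease to this run
```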

Manually set retention lease on pipeline runs


You can manually set a pipeline run to be retained using the More actions menu on the
Pipeline run details page.

Delete a run
You can delete runs using the More actions menu on the Pipeline run details page.
7 Note

If any retention policies currently apply to the run, they must be removed before
the run can be deleted. For instructions, see Pipeline run details - delete a run.

Set release retention policies


The release retention policies for a classic release pipeline determine how long a release
and the run linked to it are retained. Using these policies, you can control how many
days you want to keep each release after it has been last modified or deployed and the
minimum number of releases that should be retained for each pipeline.

The retention timer on a release is reset every time a release is modified or deployed to
a stage. The minimum number of releases to retain setting takes precedence over the
number of days. For example, if you specify to retain a minimum of three releases, the
most recent three will be retained indefinitely - irrespective of the number of days
specified. However, you can manually delete these releases when you no longer require
them. See FAQ below for more details about how release retention works.
As an author of a release pipeline, you can customize retention policies for releases of
your pipeline on the Retention tab.

The retention policy for YAML and build pipelines is the same. You can see your
pipeline's retention settings in Project Settings for Pipelines in the Settings section.

Global release retention policy


If you are using an on-premises Team Foundation Server or Azure DevOps Server, you
can specify release retention policy defaults and maximums for a project. You can also
specify when releases are permanently destroyed (removed from the Deleted tab in the
build explorer).

If you are using Azure DevOps Services, you can view but not change these settings for
your project.

Global release retention policy settings can be reviewed from the Release retention
settings of your project:

Azure DevOps Services:


https://dev.azure.com/{organization}/{project}/_settings/release?app=ms.vss-

build-web.build-release-hub-group
On-premises:
https://{your_server}/tfs/{collection_name}/{project}/_admin/_apps/hub/ms.vss-
releaseManagement-web.release-project-admin-hub

The maximum retention policy sets the upper limit for how long releases can be
retained for all release pipelines. Authors of release pipelines cannot
configure settings
for their definitions beyond the values specified here.

The default retention policy sets the default retention values for all the release
pipelines. Authors of build pipelines can override these values.

The destruction policy helps you keep the releases for a certain period of time after
they are deleted. This policy cannot be overridden in individual release pipelines.

Set collection-level retention policies


For on-premises servers, you can also set the collection-level retention policies with
custom retention rules. These retention policies apply to Classic build pipelines. The
page at https://{your_server}/{collection_name}/_settings/buildqueue governs your
maximum values and default values.
Use the Copy Files task to save data longer
You can use the Copy Files task to save your build and artifact data for longer than what
is set in the retention policies. The Copy Files task is preferable to the Publish Build
Artifacts task because data saved with the Publish Build Artifacts task will get
periodically cleaned up and deleted.

YAML

- task: CopyFiles@2
  displayName: 'Copy Files to: \\mypath\storage\$(Build.BuildNumber)'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '_buildOutput/**'
    TargetFolder: '\\mypath\storage\$(Build.BuildNumber)'

FAQ

If I mark a run or a release to be retained indefinitely,


does the retention policy still apply?
No. Neither the pipeline's retention policy nor the maximum limits set by the
administrator are applied when you mark an individual run or release to be retained
indefinitely. It will remain until you stop retaining it indefinitely.

How do I specify that runs deployed to production will be


retained longer?
If you use classic releases to deploy to production, then customize the retention policy
on the release pipeline. Specify the number of days that releases deployed to
production must be retained. In addition, indicate that runs associated with that release
are to be retained. This will override the run retention policy.

If you use multi-stage YAML pipelines to deploy to production, the only retention policy
you can configure is in the project settings. You cannot customize retention based on
the environment to which the build is deployed.

I did not mark runs to be retained indefinitely. However, I


see a large number of runs being retained. How can I
prevent this?
This could be for one of the following reasons:

The runs are marked by someone in your project to be retained indefinitely.


The runs are consumed by a release, and the release holds a retention lock on
these runs. Customize the release retention policy as explained above.

If you believe that the runs are no longer needed or if the releases have already been
deleted, then you can manually delete the runs.
How does 'minimum releases to keep' setting work?
The minimum releases to keep setting is defined at the stage level. It means that Azure
DevOps always retains the given number of last deployed releases for a stage, even if the
releases are out of the retention period. A release is considered under minimum releases
to keep for a stage only when the deployment started on that stage. Both successful and
failed deployments are considered. Releases pending approval are not considered.

How is the retention period decided when a release is
deployed to multiple stages with different retention periods?
The final retention period is decided by taking the days to retain settings of all the
stages the release is deployed to and using the maximum among them. Minimum
releases to keep is governed at the stage level and does not change based on whether
the release is deployed to multiple stages. Retain associated artifacts applies when the
release is deployed to a stage for which it is set to true.

I deleted a stage for which I have some old releases.
What retention will be considered for this case?
Because the stage is deleted, the stage-level retention settings no longer apply. Azure
DevOps falls back to the project-level default retention in this case.

My organization requires us to retain builds and releases


longer than what is allowed in the settings. How can I
request a longer retention?
The only way to retain a run or a release longer than what is allowed through retention
settings is to manually mark it to be retained indefinitely. There is no way to configure a
longer retention setting manually. Please reach out to Azure DevOps Support for
assistance.

You can also explore the possibility of using the REST APIs in order to download
information and artifacts about the runs and upload them to your own storage or
artifact repository.

I lost some runs. Is there a way to get them back?


If you believe that you have lost runs due to a bug in the service, create a support ticket
immediately to recover the lost information. If a build definition was manually deleted
more than a week earlier, it will not be possible to recover it. If the runs were deleted as
expected due to a retention policy, it will not be possible to recover the lost runs.

How do I use the Build.Cleanup capability of agents?


Setting a Build.Cleanup capability on agents will cause the pool's cleanup jobs to be
directed to just those agents, leaving the rest free to do regular work. When a pipeline
run is deleted, artifacts stored outside of Azure DevOps are cleaned up through a job
run on the agents. When the agent pool gets saturated with cleanup jobs, this can cause
a problem. The solution to that is to designate a subset of agents in the pool that are
the cleanup agents. If any agents have Build.Cleanup set, only those agents will run the
cleanup jobs, leaving the rest of the agents free to continue running pipeline jobs. The
Cleanup functionality can be enabled by navigating to Agent > Capabilities and setting
Build.Cleanup equal to 1 .

What happens to file share Artifacts when the build is


deleted
When a build with file share Artifacts is deleted, a new build task is queued on a build
agent to clean up those files. An agent is picked to perform this task based on the
following criteria:

Is there an agent with Build.Cleanup capability available?
Is the agent that ran the build available?
Is an agent from the same pool available?
Is an agent from a similar pool available?
Is any agent available?

Are automated test results that are published as part of a


release retained until the release is deleted?
Test results published within a stage of a release are retained as specified by the
retention policy configured for the test results; they are not kept for as long as the
release is retained. If you need the test results for as long as the release, set the
retention settings for automated test runs in the Project settings to Never delete. This
ensures the test results are deleted only when the release is deleted.

Are manual test results deleted?


No. Manual test results are not deleted.

How do I preserve my version control labels or tags?


U Caution

Any version control labels or tags that are applied during a build pipeline but aren't
automatically created from the Sources task will be preserved, even if the build is
deleted. However, any version control labels or tags that are automatically created
from the Sources task during a build are considered part of the build artifacts and
will be deleted when the build is deleted.

If version control labels or tags need to be preserved even when the build is deleted,
they must either be applied as part of a task in the pipeline, be applied manually
outside of the pipeline, or the build must be retained indefinitely.

What happens to pipelines that are consumed in other pipelines?
Classic releases automatically retain the pipelines they consume.

Related articles
Control how long to keep test results
Delete test artifacts
Symbols overview
Article • 05/23/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

To debug compiled executables from native languages like C and C++, you need symbol
files that contain mapping information to the source code. These files are created from
source code during compilation and generally have the PDB (program database)
extension. Azure Artifacts offers a dedicated symbols server to publish your symbols.

What are symbol files


Symbol files are created by the compiler when you build your project. A typical symbol
file might contain source indexes, local and global variables, function names and
pointers to the addresses of their entry points, line numbers, and more. The debugger
uses this data to link to your source code so you can debug your application.

Publish symbol files


Using the Index Sources and Publish Symbols task, you can publish your symbols to
Azure Artifacts symbol server, file shares, or portable PDBs:

Publish symbols to Azure Artifacts symbol server.


Publish symbols to a file share.
Publish portable PDBs to Azure Artifacts symbol server.

If your application uses .NET Standard, another viable option to share your symbols
is to create a .snupkg symbol package and publish it to NuGet.org.

Consume symbol files


Once the symbol files are published, you can use Visual Studio or WinDbg to consume
the symbols and debug your application. The debugger will find the appropriate
symbols using a unique ID that identifies the symbols associated with the compiled
binary and link it to your source code.

Debug with symbols in Visual Studio.


Debug with symbols in WinDbg.
Related articles
Artifacts feeds overview.
Promote a package to a view.
Upstream sources overview.
Configure upstream behavior.
Debug with Visual Studio
Article • 05/23/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Symbol servers enable debuggers to automatically retrieve the correct symbol files
without knowing product names, build numbers or package names. These files contain
useful information for the debugger and generally have the PDB extension. You can use
Visual Studio to consume your symbols from Azure Artifacts symbol server or other
external sources to step into your code and debug your application.

Add Azure Artifacts symbol server


To debug with symbols from the Azure Artifacts symbol server, you must authenticate to
the server and add a new Azure DevOps Services symbol server to your Visual Studio
environment.

1. From Visual Studio, select Tools > Options > Debugging.

2. Select Symbols from the list, and then select the + sign to add a new Azure
DevOps symbol server location.

3. A new dialog box Connect to Azure DevOps Symbol Server will open, select your
account from the dropdown menu, and then select the organization that you wish
to connect to. Select Connect when you are done to connect to the symbol server.
4. Your symbol server is then added to the list of symbol file locations.

Debug optimized modules


If you're planning to debug an optimized module (for example, release binaries) or
third-party source code, we recommend that you uncheck the Enable Just My Code
checkbox in Visual Studio options.

To do so, select Tools > Options and then Debugging. Select General from the list and
then uncheck Enable Just My Code.
7 Note

To enable support for portable PDB files, check the Enable Source Link Support
checkbox, and to enable support for Windows PDB files on symbol servers, check
the Enable Source Server Support checkbox, both located under Tools > Options
> Debugging > General.

Start debugging
You can start debugging your application in a few different ways:

Press F5 to start the app with the debugger attached to the app process.
Select Debug > Start Debugging.
Select the Start Debugging button in the debug toolbar.

When you start the debugger, Visual Studio will attempt to load your symbols from the
cache folder first before downloading them from the Artifacts symbol server that we
added in the previous section.

Once Visual Studio finds and loads your symbols, you should be able to step through
your code and debug your application. See Navigate through code with the Visual
Studio debugger for more details.

Related articles
Symbols overview.
Debug with WinDbg.
Artifacts in Azure Pipelines
Debug with WinDbg
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Azure Artifacts offers a dedicated symbols server to publish your symbols. You can
connect a debugger to automatically retrieve the correct symbol files and debug your
application. Using WinDbg, you can load an executable or attach the debugger to a
running process, consume your symbols, set up breakpoints, and step through and
analyze your code.

Add the symbol server to WinDbg


To use the Azure Artifacts symbol server, you must add your organization to the symbol
search path. Before doing so, you must first create a personal access token.

1. Create a Personal Access Token with Symbols (read) scope and copy it to your
clipboard.

2. Open WinDbg, or install it if you haven't already.

3. Select File > Open Executable to load the executable you wish to debug.

4. Run the following command to set the symbols path. Replace the placeholder
<ORGANIZATION_NAME> with your organization name:

Command

.sympath+
https://artifacts.dev.azure.com/<ORGANIZATION_NAME>/_apis/symbol/symsrv

5. Set a breakpoint by running the bp command. This will trigger a symbols request.

6. In the authentication prompt, paste your personal access token that you created
earlier. You can leave the username field blank.

WinDbg should then acquire the symbols for your executable. To verify if your symbols
are loaded, run the lm command to list all loaded modules.

Start debugging
With WinDbg, you can debug both kernel-mode and user-mode components:

Getting started with WinDbg (user-mode).


Getting started with WinDbg (kernel-mode).
Using the WinDbg Graphical Interface.
Using the Debugger Command Window.

Related articles
Symbols overview.
Debug with Visual Studio.
Publish symbols for debugging
Article • 05/23/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

With Azure Pipelines, you can publish your symbols to the Azure Artifacts symbol server
using the Index sources and publish symbols task. A debugger can then connect and
automatically retrieve the correct symbol files without knowing product names,
build numbers, or package names. Using Azure Pipelines, you can also publish your
symbols to file shares and publish portable PDBs.

7 Note

The Index sources and publish symbols task is not supported in release pipelines.

Publish symbols to Azure Artifacts symbol server

To publish your symbols to the Azure Artifacts symbol server, you can use the Index Sources
& Publish Symbols task.

1. From your pipeline definition, select + to add a new task.

2. Search for the Index sources and publish symbols task. Select Add to add it to
your pipeline.

3. Fill out the required fields as follows:


Task version: 2.\*.

Display name: task display name.

Path to symbols folder: path to the folder hosting the symbol files.

Search pattern: the pattern used to find the .pdb files in the folder that you
specified in Path to symbols folder. Single-folder wildcards ( * ) and recursive
wildcards ( ** ) are supported. Example: **\bin\**\*.pdb searches for all .pdb files in all
the bin subdirectories.

Index sources: indicates whether to inject source server information into the PDB
files.

Publish symbols: indicates whether to publish the symbol files.


Symbol server type: select Symbol Server in this organization/collection
(requires Azure Artifacts) to publish your symbols to Azure Artifacts symbol
server.

Verbose logging: check to include more information in your logs.
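In a YAML pipeline, the same configuration can be expressed with the PublishSymbols@2 task. A minimal sketch; the folder and search pattern values are illustrative and should be adjusted for your project:

```yaml
steps:
- task: PublishSymbols@2
  displayName: 'Publish symbols to Azure Artifacts symbol server'
  inputs:
    SymbolsFolder: '$(Build.SourcesDirectory)'
    SearchPattern: '**/bin/**/*.pdb'
    IndexSources: true
    PublishSymbols: true
    # 'TeamServices' corresponds to "Symbol Server in this organization/collection".
    SymbolServerType: 'TeamServices'
    DetailedLog: true
```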

Publish symbols to a file share


Aside from Azure Artifacts symbol server, you can also publish your symbols to a file
share using the Index Sources and Publish Symbols task.

1. From your pipeline definition, select + to add a new task.


2. Search for the Index sources and publish symbols task. Select Add to add it to
your pipeline.

3. Fill out the required fields as follows:

Task version: 2.\*.

Display name: task display name.

Path to symbols folder: path to the folder hosting the symbol files.

Search pattern: the pattern used to find the pdb files in the folder that you
specified in Path to symbols folder.

Index sources: indicates whether to inject source server information into the PDB
files.

Publish symbols: indicates whether to publish the symbol files.


Symbol server type: select File share to publish your symbols to a file share.
Path to publish symbols: the file share that will host your symbols.
Verbose logging: check to include more information in your logs.
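In YAML, the file share variant differs only in the server type and the destination path. A sketch, assuming a hypothetical \\server\symbolShare share:

```yaml
steps:
- task: PublishSymbols@2
  displayName: 'Publish symbols to a file share'
  inputs:
    SymbolsFolder: '$(Build.SourcesDirectory)'
    SearchPattern: '**/bin/**/*.pdb'
    IndexSources: true
    PublishSymbols: true
    SymbolServerType: 'FileShare'
    # Hypothetical share path -- replace with your own file share.
    SymbolsPath: '\\server\symbolShare'
```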

Publish portable PDBs to Azure Artifacts symbol server

Portable PDBs are symbol files that can be created and used on all platforms, unlike
traditional PDBs, which are used on Windows only. For portable PDBs, the build does the
indexing, but you still need to use the Index Sources and Publish Symbols task to
publish your symbols.

Use Source Link in .NET projects


Source Link is a set of tools that allows developers to debug their source code by
mapping from the .NET assemblies back to the source code. Check out the
dotnet/sourcelink GitHub repository to learn about the different packages included.

For projects hosted on GitHub, add the Microsoft.SourceLink.GitHub package


reference to your project file.

XML

<ItemGroup>
  <PackageReference Include="Microsoft.SourceLink.GitHub" Version="1.1.1" PrivateAssets="All"/>
</ItemGroup>

For projects hosted on Azure Repos, add the Microsoft.SourceLink.AzureRepos.Git


package reference to your project file.

XML

<ItemGroup>
  <PackageReference Include="Microsoft.SourceLink.AzureRepos.Git" Version="1.1.1" PrivateAssets="All"/>
</ItemGroup>

For projects hosted on Azure DevOps Server, add the


Microsoft.SourceLink.AzureDevOpsServer.Git package reference to your project

file.

XML
<ItemGroup>
  <PackageReference Include="Microsoft.SourceLink.AzureDevOpsServer.Git" Version="1.1.1" PrivateAssets="All"/>
</ItemGroup>

Set up the publish task


The Index Sources & Publish Symbols task is used to index your source code and
publish your symbols to Azure Artifacts symbols server. Because we are using Source
Link, we will disable indexing in the publish task.

1. From your pipeline definition, select + to add a new task.

2. Search for the Index sources and publish symbols task. Select Add to add it to
your pipeline.

3. Fill out the required fields as follows:

Task version: 2.\*.


Index sources: Uncheck to disable indexing. Source indexing in the publish task is
not needed when using Source Link.

Publish symbols: indicates whether to publish the symbol files.


Symbol server type: select Symbol Server in this organization/collection
(requires Azure Artifacts) to publish your symbols to Azure Artifacts symbol
server.
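Expressed in YAML, the Source Link variant is the same PublishSymbols@2 task with indexing turned off. A sketch; paths are illustrative:

```yaml
steps:
- task: PublishSymbols@2
  displayName: 'Publish portable PDBs'
  inputs:
    SymbolsFolder: '$(Build.SourcesDirectory)'
    SearchPattern: '**/bin/**/*.pdb'
    # Source Link already embeds source information, so indexing is not needed.
    IndexSources: false
    PublishSymbols: true
    SymbolServerType: 'TeamServices'
```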

) Important

To delete symbols that were published using the Index Sources & Publish Symbols
task, you must first delete the build that generated those symbols. This can be
accomplished by using retention policies or by manually deleting the run.

Set up Visual Studio

7 Note

Visual Studio for Mac does not support debugging using symbol
servers.

Before starting to consume our symbols from Azure Artifacts symbol server, let's make
sure that Visual Studio is set up properly:

1. In Visual Studio, select Tools then Options.

2. Select Symbols from the Debugging menu.

3. Select the + sign to add a new symbol server location.


4. A new dialog box will open, select your account from the dropdown menu, and
then select the organization that you wish to connect to. Select Connect when you
are done.

5. Select General from the same Debugging section. Scroll down and check Enable
Source Link support to enable support for portable PDBs.

7 Note
Checking the Enable source server support option enables you to use Source
Server when there is no source code on the local machine or the symbol file does
not match the source code. If you want to enable third-party source code
debugging, uncheck the Enable Just My Code checkbox.

FAQs

Q: What is the duration for which symbols are retained?


A: A symbol file has the same retention period as the build that generated it. When you
delete a build either manually or using retention policies, the symbols that were
generated by that build will be deleted as well.

Q: Can I use source indexing on a portable PDB generated from a .NET Core assembly?
A: This is not possible at the moment. Source indexing is not currently supported for
portable PDBs. The recommended approach is to configure your build to do the
indexing.

Related articles
Debug with Visual Studio.
Debug with WinDbg.
Configure retention policies.
Key concepts for Azure Artifacts
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Immutability
Once you publish a particular version of a package to a feed, that version number is
permanently reserved. You cannot upload a newer revision package with that same
version number, or delete it and upload a new package with the same version number.

Many package clients, including NuGet and npm, keep a local cache of packages on
your machine. Once a client has cached a particular package version, it will return that
copy on future install/restore requests.

If, on the server side, you replace a package version v1 with a new version v2, the client
is unable to tell the difference. This can lead to indeterminate build results from
different machines. For example, a developer's machine and the build agent might have
cached different revisions of the package, leading to unexpected build results.

If a package is broken, buggy, or shares unintended content (like secrets), the best
approach is to prepare a fix and publish it as a new version. Then, depending on the
severity of the issue and how widely depended-on the package is, you can delete the
package to make it unavailable for consumption.

The only way to work around the immutability constraint is to create a new feed and
publish the desired package version to the new feed.

7 Note

Deleted feeds remain in the recycle bin for 30 days then are deleted permanently.
The feed name becomes available once the feed is permanently deleted.

Indexing
Azure Artifacts maintains an index of all the packages in each feed, which enables fast list
operations. List operations on a file share require the client to open every package
and examine its metadata, unless the file share has been configured to provide an index
that the client understands.
Well-formedness
Azure Artifacts validates all published packages to ensure they're well formed. This
prevents invalid packages from entering your development and build environments.
However, any workflow that publishes malformed packages will break when migrating
to Azure Artifacts.

Recycle Bin
Packages can be deleted manually or by setting up retention policies for your feed.
Deleted packages remain in the recycle bin for 30 days then get deleted permanently.
Feed owners can recover the deleted packages from the Recycle Bin.

Related articles
Package graphs
Use artifactignore
Packages componentization
Constructing a complete package graph
Article • 10/04/2022

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

7 Note

Check your package type to ensure compatibility with Azure DevOps Services or
on-premises.

When you release a package, it's important to ensure that all the package dependencies
are also available. Azure Artifacts recommends using upstream sources to publish and
consume package dependencies. When a package is consumed from an upstream
source for the first time, a copy of that package is saved in the feed, so even if the
upstream source goes down, your copy will remain available to you and your customers.

How upstream sources construct the set of available packages
Because Azure Artifacts feeds can have other feeds as upstream sources, it seems
possible on the surface to have a cycle of upstream sources, where feed A upstreams to
feed B, which upstreams to feed C, which upstreams back to feed A. Left unchecked,
such a cycle could break package requests by creating an infinite loop where a user asks
A for a package, then A asks B, then B asks C, then C asks A again, etc. Upstream sources
are designed to prevent this failure.

When a feed consults its upstream sources for a package, Azure Artifacts will return the
packages in the view configured for that upstream source. Thus, a query to feed A does
not actually result in a transitive query to feed C (A -> B -> C), because views are read-
only. A has access to any packages from C that a user of B has previously saved into B,
but not the full set of packages available in C.

Thus, the onus falls to B to ensure that its local packages represent a complete
dependency graph, so that users who consume B's package via an upstream source
from another feed are able to successfully resolve the graph and install their desired B
package.
Example: constructing the set of available packages
Assume three feeds, Fabrikam, Contoso, and AdventureWorks. In this example, we'll look
at the packages available to the Fabrikam feed as we add upstream sources.

At first, Fabrikam has no upstream sources, and users connected to Fabrikam can only
install versions 1.0.0 and 2.0.0 of the Widgets package. Likewise, Contoso has no
upstream sources, and users connected to Contoso can only install versions 1.0.0 and
3.0.0 of the Gizmos package. Ditto for the AdventureWorks feed, where connected users
can only install versions 1.0.0 and 2.0.0 of the Gadgets package or version 1.0.0 of the
Things package.

The three feeds and their local packages:

- Fabrikam: Widgets 1.0.0, 2.0.0
- Contoso: Gizmos 1.0.0, 3.0.0
- AdventureWorks: Gadgets 1.0.0, 2.0.0; Things 1.0.0

Next, consider what happens if Contoso adds AdventureWorks as an upstream source. A
user connected to Contoso can install any version of Gizmos, any version of Gadgets, or
any version of Things. If Gadgets@2.0.0 is installed, that package-version is saved to
Contoso (with a link back to AdventureWorks).
After Gadgets@2.0.0 is installed:

- Fabrikam: Widgets 1.0.0, 2.0.0
- Contoso: Gizmos 1.0.0, 3.0.0; Gadgets 2.0.0 (saved from the AdventureWorks upstream source)
- AdventureWorks: Gadgets 1.0.0, 2.0.0; Things 1.0.0

Now, let's have the Fabrikam feed add Contoso as an upstream source. A user
connected to Fabrikam can install any version of Widgets, any version of Gizmos, but
only saved versions (2.0.0) of Gadgets.
The user will not be able to install version 1.0.0 of Gadgets or any version of Things,
because those package versions haven't been saved to Contoso by a Contoso user.
After Fabrikam adds Contoso as an upstream source:

- Fabrikam: Widgets 1.0.0, 2.0.0; available through the upstream: Gizmos 1.0.0, 3.0.0 and Gadgets 2.0.0
- Contoso: Gizmos 1.0.0, 3.0.0; Gadgets 2.0.0 (saved from AdventureWorks)
- AdventureWorks: Gadgets 1.0.0, 2.0.0; Things 1.0.0
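The resolution rules above can be sketched as a toy model. This is illustrative code, not an Azure Artifacts API; the feed and package names mirror the example:

```python
# Toy model of feed resolution: a feed serves its own saved packages, plus the
# *saved* packages of its direct upstreams -- never the upstream's upstreams,
# because upstream views are read-only and resolution is not transitive.

class Feed:
    def __init__(self, name, local):
        self.name = name
        self.local = set(local)   # (package, version) pairs saved in this feed
        self.upstreams = []       # direct upstream sources

    def available(self):
        # Visible set: this feed's packages plus each upstream's saved packages.
        result = set(self.local)
        for up in self.upstreams:
            result |= up.local
        return result

    def install(self, package_version):
        # Installing a package from an upstream saves a copy into this feed.
        if package_version in self.available():
            self.local.add(package_version)
            return True
        return False

adventure = Feed("AdventureWorks", {("Gadgets", "1.0.0"), ("Gadgets", "2.0.0"),
                                    ("Things", "1.0.0")})
contoso = Feed("Contoso", {("Gizmos", "1.0.0"), ("Gizmos", "3.0.0")})
contoso.upstreams.append(adventure)
fabrikam = Feed("Fabrikam", {("Widgets", "1.0.0"), ("Widgets", "2.0.0")})
fabrikam.upstreams.append(contoso)

# A Contoso user installs Gadgets@2.0.0, saving a copy into Contoso.
contoso.install(("Gadgets", "2.0.0"))

# Fabrikam now sees the saved copy, but not AdventureWorks' other packages.
print(("Gadgets", "2.0.0") in fabrikam.available())  # True
print(("Things", "1.0.0") in fabrikam.available())   # False
```

Note how the model captures the "onus falls to B" point: Fabrikam can only resolve what Contoso has actually saved.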

Related articles
Key concepts
Use the .artifactignore file
Package componentization and composition
Use .artifactignore
Article • 05/31/2023

Azure DevOps Services

The .artifactignore is a text file that controls which files are uploaded when you publish a
Universal Package or a Pipeline Artifact.

.artifactignore is typically checked into your version control repository and the syntax is
similar to that of .gitignore.

Using the .artifactignore file can help reduce your pipeline execution time by avoiding
copying files into your staging directory before publishing your artifacts.

Example
In the following example, we will be ignoring all files except the ones in the
src/MyApp/bin/Release directory.

artifactignore

**/*

!src/MyApp/bin/Release/**.*

) Important

The .artifactignore file must be in the directory provided to the targetPath
argument in your Publish Pipeline Artifacts task.
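For context, a sketch of how the file relates to the task: the .artifactignore file sits in the folder passed as targetPath. The paths and artifact name here are illustrative:

```yaml
steps:
- task: PublishPipelineArtifact@1
  inputs:
    # The .artifactignore file must live in this folder.
    targetPath: '$(Build.SourcesDirectory)'
    artifact: 'MyApp'
    publishLocation: 'pipeline'
```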

Syntax
The .artifactignore file follows the same syntax as .gitignore, with some minor
limitations. The plus sign character + is not supported in URL paths or in the
semantic versioning metadata of some package types, such as Maven.

7 Note

The .gitignore file is ignored by default if you don't have an .artifactignore file. You
can re-include it by creating an empty .artifactignore file.
Related articles
Publish and download pipeline artifacts
Limits on package sizes and counts
Package componentization
Accelerate collaboration and Agile development with componentization
Article • 05/30/2023

Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS
2018

Your product is successful, your organization is growing, and it's time to scale up your
codebase to match this success. As you scale out past 2-3 teams working in a single
codebase on a single product, you may find yourself asking questions like:

How can my teams efficiently share reusable components?

How do I enable my feature teams to iterate rapidly without stepping on other


teams' work?

How do I give my teams autonomy to iterate at the pace that's right for them?

These questions aren't just applicable to newly growing teams. If you're an established
team with a legacy codebase, you may be asking these same questions as you're being
asked to deliver more value, faster than ever. Regardless of your situation,
componentization can help you build a codebase that scales to the size of your team
and the speed of today's development.

In this article, we'll explore how binary composition through Azure Artifacts can help you
manage and share your external dependencies, your open-source software, and your
isolated shared components.

Components and composition


Componentization is the process of dividing and organizing your product into distinct
components. Most .NET projects already have some notion of components in the form
of the projects within the solution. For instance, a basic website may consist of a front-
end component, a data access component, and a model/data storage component.

Source composition
As your product grows, the solution and the project model can become inefficient.
Changes take longer to integrate and are harder to merge, the build gets slower, and
components start to grow from a single project to multiple projects. Generally, this is
the point at which teams start breaking out these sets of related projects into separate
solutions.

Once you've outgrown a single solution, how you componentize becomes an interesting
question. We started with source composition, where each component is referenced via
a project reference in Visual Studio. Source composition is possible as long as your
source lives in a single composition boundary: a single solution within a single source
repository.

Unfortunately, these project references start to break down when multiple solutions are
involved. At this point, when solution A depends on solution B it must refer to the built
binaries (i.e. DLLs) produced by solution B - this is binary composition.

Accordingly, these binaries now need to be built and made available to solution A
before it can build successfully. There are a few ways to do that:

You can check them into source control. Depending on your source control system,
binaries can quickly balloon the size of your repo, slowing check-out times and
general repo performance. If you start to work in branches, multiple teams can end
up introducing the same binary at different versions, leading to challenging merge
conflicts.

Alternatively, you can host them on a file share, although this approach comes with
certain limitations. File shares lack an index for quick lookups, and they do not
provide protection against overwriting a version in the future.

Package composition
Packages address many of the challenges of referencing binaries. Instead of checking
them into source, you can have a solution B produce its binaries as NuGet packages that
another solution A can then consume. If solution A and solution B are maintained as
separate components, where simultaneous changes across A and B are rare, package
composition is a great way to manage the dependency of A on B. Package composition
allows B to iterate on its own cadence, while A is free to get updates from B when A's
schedule permits, and it allows multiple teams to iterate and update solution B without
affecting solution A (or other solutions C or D).

However, package composition does come with its own set of challenges. So far, we
have examined a straightforward example. Scaling package composition up to the size
of a large codebase (something like Windows or Bing) can cause a series of challenges:

Understanding the impact of breaking changes in a component low in the


dependency graph becomes very challenging.
Diamond dependencies can become a significant roadblock to agility. In a
diamond dependency, components B and C both depend on a shared component
A, while component D depends on both B and C. When component A introduces a
new version with breaking changes, if B updates to the new version but C does not,
D cannot take B's updates without introducing a dependency conflict. In this
simple example, a conversation with C may be all that's needed to resolve the
conflict. However, in a complex graph, diamonds can quickly become unresolvable.

When modifications need to be applied to two components that are composed


using packages, the developer's iteration cycle becomes considerably slower. If
Component A is updated, it necessitates rebuilding, repackaging, and republishing
it. Subsequently, component B must update to the recently published version to
validate the change made in component A. Employing source composition, which
allows for simultaneous building of Component A and B, will consistently deliver a
quicker iteration cycle for developers.

What should you use?


In general, we've seen large teams be most successful when they use a mixture of
composition strategies. To help determine what's right for your codebase, begin by
mapping out the dependency graph of your product, and start to group your
components into sets of related components.

For instance, you might have a collection of components constituting your framework,
and another set of components forming your user-facing service. Then, for each group of
related components, ask these questions:

Can I anticipate frequent check-ins across the sets I've established for my teams?

Is a single team responsible for the entire set?

For a single set, is there a shared release cadence?

In our experience, we have found that using source composition is most effective for
related projects handled by a single team or a group of related teams. Conversely,
binary composition proves advantageous for open-source software, external
dependencies (components from distant or isolated teams), and independent shared
components.

Next steps
Publish and restore NuGet packages
Configure feed permissions

Set up upstream sources


Developer resources documentation
Use command-line tools, REST APIs, and more to interface with Azure DevOps
programmatically.

Azure DevOps Command Line Interface (CLI)

b GET STARTED

Get Started with Azure DevOps CLI

Sign in with a Personal Access Token (PAT)

Configure policies

i REFERENCE

Azure DevOps CLI quick reference

Repos and Test CLI

i REFERENCE

Git commands

TFVC repository commands

Test case management commands

Node CLI for Azure DevOps

b GET STARTED

Cross-platform CLI for Azure DevOps (tfx-cli)

Azure DevOps REST API Reference

b GET STARTED
Get started

Build

Core

Git

Work item tracking

Azure DevOps Demo generator

b GET STARTED

About the Demo generator

Get started
Azure DevOps Services REST API Reference
Article • 03/31/2023

Welcome to the Azure DevOps Services/Azure DevOps Server REST API Reference.

Representational State Transfer (REST) APIs are service endpoints that support sets of
HTTP operations (methods), which provide create, retrieve, update, or delete access to
the service's resources. This article walks you through:

The basic components of a REST API request/response pair.


Overviews of creating and sending a REST request, and handling the response.

Most REST APIs are accessible through our client libraries, which can be used to
greatly simplify your client code.

Components of a REST API request/response pair
A REST API request/response pair can be separated into five components:

1. The request URI, in the following form: VERB https://{instance}[/{team-project}]/_apis[/{area}]/{resource}?api-version={version}

instance: The Azure DevOps Services organization or TFS server you're sending the request to. They are structured as follows:
Azure DevOps Services: dev.azure.com/{organization}
TFS: {server:port}/tfs/{collection} (the default port is 8080, and the
value for collection should be DefaultCollection but can be any
collection)
resource path: The resource path is as follows: _apis/{area}/{resource} . For
example _apis/wit/workitems .
api-version: Every API request should include an api-version to avoid having
your app or service break as APIs evolve. api-versions are in the following
format: {major}.{minor}[-{stage}[.{resource-version}]] , for example:
api-version=1.0

api-version=1.2-preview
api-version=2.0-preview.1
Note: area and team-project are optional, depending on the API request. Check
out the TFS to REST API version mapping matrix below to find which REST API
versions apply to your version of TFS.

2. HTTP request message header fields:

A required HTTP method (also known as an operation or verb), which tells
the service what type of operation you are requesting. Azure REST APIs
support GET, HEAD, PUT, POST, and PATCH methods.
Optional additional header fields, as required by the specified URI and HTTP
method. For example, an Authorization header that provides a bearer token
containing client authorization information for the request.

3. Optional HTTP request message body fields, to support the URI and HTTP
operation. For example, POST operations contain MIME-encoded objects that are
passed as complex parameters.

For POST or PUT operations, the MIME-encoding type for the body should be
specified in the Content-type request header as well. Some services require
you to use a specific MIME type, such as application/json .

4. HTTP response message header fields:

An HTTP status code , ranging from 2xx success codes to 4xx or 5xx error
codes. Alternatively, a service-defined status code may be returned, as
indicated in the API documentation.
Optional additional header fields, as required to support the request's
response, such as a Content-type response header.

5. Optional HTTP response message body fields:

MIME-encoded response objects may be returned in the HTTP response
body, such as a response from a GET method that is returning data. Typically,
these objects are returned in a structured format such as JSON or XML, as
indicated by the Content-type response header. For example, when you
request an access token from Azure AD, it will be returned in the response
body as the access_token element, one of several name/value paired objects
in a data collection. In this example, a response header of Content-Type:
application/json is also included.
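For instance, a name/value pair such as the access_token element mentioned above can be read out of a JSON response body like this (Python sketch; the payload is a fabricated example, not a real Azure AD response):

```python
import json

# Sketch: extracting one element from a JSON response body, as indicated by
# a Content-Type: application/json response header. Token value is fake.
response_body = '{"token_type": "Bearer", "expires_in": 3599, "access_token": "eyJ0eXAi..."}'
data = json.loads(response_body)
token = data["access_token"]  # one of several name/value pairs in the collection
```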

Create the request


Authenticate
There are many ways to authenticate your application or service with Azure DevOps
Services or TFS. The following table can help you decide which method best suits
your scenario:

| Type of application | Description | Example | Authentication mechanism | Code samples |
|---|---|---|---|---|
| Interactive client-side | Client application that allows user interaction, calling Azure DevOps Services REST APIs | Console application enumerating projects in an organization | Microsoft Authentication Library (MSAL) | sample |
| Interactive JavaScript | GUI-based JavaScript application | AngularJS single page app displaying project information for a user | MSAL | sample |
| Non-interactive client-side | Headless text-only client-side application | Console app displaying all bugs assigned to a user | Device Profile | sample |
| Interactive web | GUI-based web application | Custom web dashboard displaying build summaries | OAuth | sample |
| TFS application | TFS app using the Client OM library | TFS extension displaying team bug dashboards | Client Libraries | sample |
| Azure DevOps Services Extension | Azure DevOps Services extension | Azure DevOps extension samples | VSS Web Extension SDK | sample, walkthrough |

Note: You can find more information on authentication on our authentication
guidance page.

Assemble the request


Azure DevOps Services

For Azure DevOps Services, instance is dev.azure.com/{organization} , so the pattern
looks like this:
VERB https://dev.azure.com/{organization}/_apis[/{area}]/{resource}?api-
version={version}

For example, here's how to get a list of team projects in an Azure DevOps Services
organization.

dos

curl -u {username}[:{personalaccesstoken}]
https://dev.azure.com/{organization}/_apis/projects?api-version=2.0

If you wish to provide the personal access token through an HTTP header, you must first
convert it to a Base64 string (the following example shows how to convert to Base64
using C#). Certain tools, like Postman, apply Base64 encoding by default; if you're
calling the API via such a tool, you don't need to Base64-encode the PAT yourself. The
resulting string can then be provided as an HTTP header in the format:

Authorization: Basic BASE64PATSTRING
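As a quick sketch, the same header value can be computed like this in Python (the PAT below is a placeholder, and the username part is deliberately left empty):

```python
import base64

# Sketch: converting a PAT to the Basic auth header value described above.
# "PAT_FROM_WEBSITE" is a placeholder, not a real token.
pat = "PAT_FROM_WEBSITE"
b64 = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
auth_header = f"Basic {b64}"
```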

Here it is in C# using the [HttpClient class](/previous-versions/visualstudio/hh193681(v=vs.118)).

C#

public static async void GetProjects()
{
    try
    {
        var personalaccesstoken = "PAT_FROM_WEBSITE";

        using (HttpClient client = new HttpClient())
        {
            client.DefaultRequestHeaders.Accept.Add(
                new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));

            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic",
                Convert.ToBase64String(
                    System.Text.ASCIIEncoding.ASCII.GetBytes(
                        string.Format("{0}:{1}", "", personalaccesstoken))));

            using (HttpResponseMessage response = await client.GetAsync(
                "https://dev.azure.com/{organization}/_apis/projects"))
            {
                response.EnsureSuccessStatusCode();
                string responseBody = await response.Content.ReadAsStringAsync();
                Console.WriteLine(responseBody);
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
    }
}
Most samples on this site use Personal Access Tokens as they're a compact example for
authenticating with the service. However, there are a variety of authentication
mechanisms available for Azure DevOps Services including MSAL, OAuth and Session
Tokens. Refer to the Authentication section for guidance on which one is best suited for
your scenario.

TFS

For TFS, instance is {server:port}/tfs/{collection} and by default the port is 8080.
The default collection is DefaultCollection , but it can be any collection.

Here's how to get a list of team projects from TFS using the default port and collection.

dos

curl -u {username}[:{personalaccesstoken}]
https://{server}:8080/tfs/DefaultCollection/_apis/projects?api-version=2.0

The examples above use personal access tokens, which require that you create a
personal access token.

Process the response


You should get a response like this.

JSON

"value": [

"id": "eb6e4656-77fc-42a1-9181-4c6d8e9da5d1",

"name": "Fabrikam-Fiber-TFVC",

"url": "https://dev.azure.com/fabrikam-fiber-
inc/_apis/projects/eb6e4656-77fc-42a1-9181-4c6d8e9da5d1",

"description": "TeamFoundationVersionControlprojects",

"collection": {

"id": "d81542e4-cdfa-4333-b082-1ae2d6c3ad16",

"name": "DefaultCollection",

"url": "https: //dev.azure.com/fabrikam-fiber-


inc/_apis/projectCollections/d81542e4-cdfa-4333-b082-1ae2d6c3ad16",

"collectionUrl": "https: //dev.azure.com/fabrikam-fiber-


inc/DefaultCollection"

},

"defaultTeam": {

"id": "66df9be7-3586-467b-9c5f-425b29afedfd",

"name": "Fabrikam-Fiber-TFVCTeam",

"url": "https://dev.azure.com/fabrikam-fiber-
inc/_apis/projects/eb6e4656-77fc-42a1-9181-4c6d8e9da5d1/teams/66df9be7-3586-
467b-9c5f-425b29afedfd"

},

"id": "6ce954b1-ce1f-45d1-b94d-e6bf2464ba2c",

"name": "Fabrikam-Fiber-Git",

"url": "https://dev.azure.com/fabrikam-fiber-
inc/_apis/projects/6ce954b1-ce1f-45d1-b94d-e6bf2464ba2c",

"description": "Gitprojects",

"collection": {

"id": "d81542e4-cdfa-4333-b082-1ae2d6c3ad16",

"name": "DefaultCollection",

"url": "https://dev.azure.com/fabrikam-fiber-
inc/_apis/projectCollections/d81542e4-cdfa-4333-b082-1ae2d6c3ad16",

"collectionUrl": "https://dev.azure.com/fabrikam-fiber-
inc/DefaultCollection"

},

"defaultTeam": {

"id": "8bd35c5e-30bb-4834-a0c4-d576ce1b8df7",

"name": "Fabrikam-Fiber-GitTeam",

"url": "https://dev.azure.com/fabrikam-fiber-
inc/_apis/projects/6ce954b1-ce1f-45d1-b94d-e6bf2464ba2c/teams/8bd35c5e-30bb-
4834-a0c4-d576ce1b8df7"

],

"count": 2

The response is JSON. That's generally what you'll get back from the REST APIs,
although there are a few exceptions, like Git blobs.

Now you should be able to look around the specific API areas like work item tracking or
Git and get to the resources that you need. Keep reading to learn more about the
general patterns that are used in these APIs.
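As a sketch, the response above can be consumed like any other JSON document. Here a trimmed-down copy of the sample payload is parsed to list the project names (Python, illustrative only):

```python
import json

# Sketch: walking the projects response shown above.
# Only a trimmed copy of the sample payload is used here.
response_body = """
{
  "value": [
    {"id": "eb6e4656-77fc-42a1-9181-4c6d8e9da5d1", "name": "Fabrikam-Fiber-TFVC"},
    {"id": "6ce954b1-ce1f-45d1-b94d-e6bf2464ba2c", "name": "Fabrikam-Fiber-Git"}
  ],
  "count": 2
}
"""
projects = json.loads(response_body)
names = [p["name"] for p in projects["value"]]
```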

API and TFS version mapping


Below you'll find a quick mapping of REST API versions and their corresponding TFS
releases. All API versions will work on the server version mentioned as well as later
versions.

| TFS version | REST API version | Build version |
|---|---|---|
| Azure DevOps Server vNext | 7.1 | |
| Azure DevOps Server 2022 | 7.0 | versions >= 19.205.33122.1 |
| Azure DevOps Server 2020 | 6.0 | versions >= 18.170.30525.1 |
| Azure DevOps Server 2019 | 5.0 | versions >= 17.143.28621.4 |
| TFS 2018 Update 3 | 4.1 | versions >= 16.131.28106.2 |
| TFS 2018 Update 2 | 4.1 | versions >= 16.131.27701.1 |
| TFS 2018 Update 1 | 4.0 | versions >= 16.122.27409.2 |
| TFS 2018 RTW | 4.0 | versions >= 16.122.27102.1 |
| TFS 2017 Update 2 | 3.2 | versions >= 15.117.26714.0 |
| TFS 2017 Update 1 | 3.1 | versions >= 15.112.26301.0 |
| TFS 2017 RTW | 3.0 | versions >= 15.105.25910.0 |
| TFS 2015 Update 4 | 2.3 | versions >= 14.114.26403.0 |
| TFS 2015 Update 3 | 2.3 | versions >= 14.102.25423.0 |
| TFS 2015 Update 2 | 2.2 | versions >= 14.95.25122.0 |
| TFS 2015 Update 1 | 2.1 | versions >= 14.0.24712.0 |
| TFS 2015 RTW | 2.0 | versions >= 14.0.23128.0 |
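One way to use this mapping in code is a simple lookup from REST API version to the earliest server release that supports it, as a sketch (values copied from selected rows of the table above; the dictionary and function names are our own):

```python
# Sketch: earliest server release supporting each REST API version,
# taken from selected rows of the mapping table above.
MIN_SERVER = {
    "7.0": "Azure DevOps Server 2022",
    "6.0": "Azure DevOps Server 2020",
    "5.0": "Azure DevOps Server 2019",
    "4.1": "TFS 2018 Update 2",
    "4.0": "TFS 2018 RTW",
}

def earliest_server(api_version):
    """Return the earliest listed server release for an api-version."""
    return MIN_SERVER.get(api_version, "unknown")
```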

Related Content
Check out the Integrate documentation for REST API samples and use cases.

Authentication guidance
Samples
Client Libraries
Discover the client libraries for these REST APIs.

.NET conceptual documentation and .NET reference documentation


Go
Node.js
Python
Swagger 2.0
Web Extensions SDK

Where are the earlier versions of REST APIs?
(Before 4.1)

We recently changed our engineering system and documentation generation process
to provide clearer, more in-depth, and more accurate documentation for everyone
trying to use these REST APIs. Due to technical constraints, we can only
document API version 4.1 and newer using this method. We believe this change
makes the documentation for API version 4.1 and newer easier to use.

If you are working in TFS or are looking for the older versions of REST APIs, you can take
a look at the REST API Overview for TFS 2015, 2017, and 2018.
