
Azure DevOps YAML Pipelines - Part 7



Stages, Jobs, and Steps


There are a lot of topics to cover for YAML-based Pipelines in Azure DevOps. So, I will be breaking this up into a multi-part series. This section will be updated over time with links to the whole series of articles once they are written.



 

Azure DevOps YAML Pipelines are broken up into one or more Stages. Each Stage is composed of one or more Jobs. And, finally, each Job contains one or more Steps or Tasks.
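Here is a minimal sketch of how those three levels nest (all of the names and the script content are just placeholders):

```yaml
stages:
- stage: Build            # a Stage groups related Jobs
  jobs:
  - job: Compile          # a Job groups related Steps and runs on one Agent
    steps:
    - script: echo "Compiling..."   # a Step/Task does the actual work
```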


As stated in Part 2, there are certain cases where you can omit the 'stages' or 'jobs' sections from your YAML code. While they may be missing from the code, there is still a Stage and a Job running behind the scenes, using default settings.
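For example, a simple single-job Pipeline can be written with just a 'steps' section. Behind the scenes it still runs inside an implicit Stage and an implicit Job (the script content here is just a placeholder):

```yaml
# Equivalent to a full stages/jobs/steps layout with one Stage and one Job
steps:
- script: echo "Hello from the implicit Stage and Job"
```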



 

Stages


Stages are the major divisions in a Pipeline. They are a collection of related Jobs. They are logical boundaries that can be used to pause the pipeline for checks and approvals. One fantastic benefit of Stages is that when you manually run a Pipeline, you are given the choice of which Stages to run and which to skip.


Dependencies


By default, Stages will run sequentially, one after the other. This means that each Stage, by default, has an implicit dependency on the previous Stage. However, you can use the optional 'dependsOn' property of a Stage to build your own order of execution, including running Stages in parallel, if you wish. Your Pipeline must have at least one Stage that doesn't use a 'dependsOn' property; otherwise, nothing would run!
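As an illustrative sketch (the stage names are placeholders), here two deployment Stages both depend only on 'Build', so they run in parallel once 'Build' completes:

```yaml
stages:
- stage: Build                # no dependsOn, so this runs first
  jobs:
  - job: BuildJob
    steps:
    - script: echo "Build"

- stage: DeployEast           # DeployEast and DeployWest both depend
  dependsOn: Build            # only on Build, so they run in parallel
  jobs:
  - job: DeployEastJob
    steps:
    - script: echo "Deploy East"

- stage: DeployWest
  dependsOn: Build
  jobs:
  - job: DeployWestJob
    steps:
    - script: echo "Deploy West"
```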


Conditions


By default, if one Stage fails then the Pipeline is finished and all remaining Stages will be skipped. This means that each Stage, by default, has an implicit condition stating that the previous Stage must succeed. However, you can use the optional 'condition' property to specify your own custom check that determines whether a Stage will run. Note: adding a condition to a Stage removes the implicit condition that the previous Stage must succeed. Therefore, it is common to use a condition of 'and(succeeded(), yourCustomCondition)', which adds the implicit success check back alongside your own custom check. Without it, the Stage will run regardless of the outcome of the preceding Stage.
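For example, here is a sketch of a Stage that keeps the implicit success requirement and adds a custom branch check (it assumes a preceding 'Build' stage, and the branch filter is just an illustration):

```yaml
- stage: Deploy
  dependsOn: Build
  # Keep the implicit "previous Stage succeeded" check AND
  # only run when the Pipeline was triggered from the main branch
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  jobs:
  - job: DeployJob
    steps:
    - script: echo "Deploying"
```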


Defining Stages
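A minimal sketch of the common Stage properties (all names and values are placeholders; see the docs for the full schema):

```yaml
stages:
- stage: MyStage                        # identifier other Stages can reference in dependsOn
  displayName: My Friendly Stage Name   # shown in the UI
  dependsOn: []                         # empty list = no dependencies; omit for default sequential behavior
  condition: succeeded()                # shown explicitly here; this is the default
  variables:
    myVariable: 'myValue'
  jobs:
  - job: MyJob
    steps:
    - script: echo "Hello from MyStage"
```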



 

Jobs


A Job is a collection of related Steps/Tasks. Each Job is assigned to its own Agent. This means all of the Steps/Tasks from a given Job will run on the same Agent machine. Each Agent can run only one Job at a time. A Job is the smallest unit of work that can be scheduled to run.



There is one exception to mention: Agentless Jobs, which do not run on an Agent and support only a handful of Agentless Tasks.


Dependencies


By default, Jobs run in parallel / at the same time. However, for parallel Jobs to work you must meet a few requirements: you must have multiple Agents available, and you must have the required number of parallel-job licenses. For example, if your Stage has 5 Jobs, then by default they will all run at the same time (provided you have at least 5 available Agents and at least 5 parallel-job licenses).


You can use the optional 'dependsOn' property of a Job to build your own order of execution (sequential Jobs, fan-out Jobs, fan-in Jobs).
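Here is a hedged sketch of all three patterns in one place (job names are placeholders): Jobs B and C fan out from A, then Job D fans the results back in:

```yaml
jobs:
- job: A                  # runs first
  steps:
  - script: echo "A"

- job: B                  # fan-out: B and C both depend on A
  dependsOn: A            # and run in parallel after it
  steps:
  - script: echo "B"

- job: C
  dependsOn: A
  steps:
  - script: echo "C"

- job: D                  # fan-in: D waits for both B and C
  dependsOn:
  - B
  - C
  steps:
  - script: echo "D"
```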


Conditions


By default, Jobs don't have any implicit conditions. In other words, by default a Job will always run, and a Job doesn't care whether another Job succeeds or fails. Note, however, that once you give a Job a 'dependsOn' property, it gains an implicit condition that its dependencies must succeed. As with Stages, you can use the optional 'condition' property to specify your own custom check that determines whether a Job will run.
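For instance, here is a sketch (job names and script contents are placeholders) of a notification Job that runs only when the Job it depends on fails:

```yaml
jobs:
- job: Build
  steps:
  - script: ./build.sh

- job: NotifyOnFailure
  dependsOn: Build
  condition: failed()       # run only if a dependency (Build) failed
  steps:
  - script: echo "Build failed, sending notification"
```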


Defining Jobs


Jobs come in two different forms, with each form supporting different options. There are traditional (sometimes called 'build') Jobs, and then there are Deployment Jobs. You can also specify Job templates. This post would be much too long if I were to include the code for each of these options. Instead, I will point you to my Pipelines Guide repo in GitHub, which shows all of the available options in detail.
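As a brief, hedged illustration of the two forms (the environment name is a placeholder, and the repo linked above covers the full set of options):

```yaml
jobs:
# Traditional (build) Job
- job: Build
  steps:
  - script: echo "Traditional build Job"

# Deployment Job: records deployment history against an Environment
# and uses a deployment strategy instead of a bare steps list
- deployment: DeployWeb
  environment: my-environment     # placeholder Environment name
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "Deployment Job"
```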


 

Steps (aka Tasks)


The terms "Steps" and "Tasks" can be used almost interchangeably. "Steps" is used as the name of the YAML array, but each entry in the array uses the term "Task" (except for the Task shortcuts, but more on that later).


Tasks are what actually define the work to be done by your pipeline. A Task is simply a packaged script or procedure that has been abstracted with a set of inputs. In other words, a Task is designed to help you run a complicated process by simply defining a few inputs and letting the Task take care of the work behind the scenes.


There are many built-in Tasks provided by Microsoft (link 1) (link 2). To name a few examples, there are Tasks to run Android builds, run Docker builds, run an Azure CLI command, read a secret from Azure Key Vault, and many, many more. There are also many custom Tasks that you can install and use from the Visual Studio Marketplace.


Some of the built-in Tasks also have shortcut syntaxes. For example, the 'Cache' Task can be referenced by two different shortcuts: 'saveCache' and 'restoreCache'.
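As another example, the built-in PowerShell task can be written in full 'task' syntax or with its 'powershell' shortcut; both Steps in this sketch do the same thing:

```yaml
steps:
# Full task syntax
- task: PowerShell@2
  inputs:
    targetType: inline
    script: Write-Host "Hello"

# Equivalent shortcut syntax
- powershell: Write-Host "Hello"
```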


Dependencies


Inside each Job, the Steps run sequentially, and there is no way to change this.


Conditions


Each Step, by default, has an implicit condition that states "run if we're in a successful state". However, you can use the optional 'condition' property to specify your own custom check that determines whether a Step will run. One final note: there is also the 'continueOnError' property. When set to true, a failing Step is reported as "succeeded with issues" instead of failing the Job, so the following Steps still run.
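A short sketch putting both together (script contents are placeholders):

```yaml
steps:
- script: ./run-flaky-tests.sh
  continueOnError: true     # a failure here becomes "succeeded with issues"

- script: echo "Runs only while everything so far is healthy"
  condition: succeeded()    # the implicit default, shown explicitly

- script: echo "Runs even if an earlier Step failed or the run was canceled"
  condition: always()
```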


Defining Steps
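A minimal sketch of the common Step forms (all values are placeholders):

```yaml
steps:
- checkout: self                        # get the repo's source code
- script: echo "command line"           # cmd on Windows, bash elsewhere
- bash: echo "bash script"              # always runs in bash
- pwsh: Write-Host "PowerShell Core"    # PowerShell Core on any OS
- task: DotNetCoreCLI@2                 # full task syntax
  inputs:
    command: build
```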



 

To summarize, we discussed how to organize the work being done by the pipeline into Stages, Jobs, and Steps.


Next up, we will finally wrap up the series by going over some useful tips and tricks of Azure DevOps Pipelines.
