Saturday, June 5, 2021

Azure Pipelines, YAML Schema





Reference on build caching: https://faun.pub/reduce-your-build-time-using-caching-in-azure-pipelines-7a7bd0201cee
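The linked article covers build caching; a minimal sketch using the Cache@2 task (the npm cache path and key follow the common docs pattern and are illustrative):

variables:
  npm_config_cache: $(Pipeline.Workspace)/.npm

steps:
- task: Cache@2
  inputs:
    key: 'npm | "$(Agent.OS)" | package-lock.json'
    restoreKeys: |
      npm | "$(Agent.OS)"
    path: $(npm_config_cache)
  displayName: Cache npm packages
- script: npm ci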

Top-level schema keywords: resources, trigger
Hierarchy: Pipeline -> Stage (implicit when there is only one) -> Jobs -> Steps [Task, Script, Checkout]
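A minimal sketch of that hierarchy (stage/job names are illustrative):

trigger:
- main                        # trigger: run on pushes to main

pool:
  vmImage: ubuntu-latest

stages:                       # a single stage is implicit if this level is omitted
- stage: Build
  jobs:
  - job: Compile
    steps:
    - checkout: self                # step type: checkout
    - script: echo "compiling"      # step type: script
    - task: CopyFiles@2             # step type: task
      inputs:
        Contents: '**/*.zip'
        TargetFolder: $(Build.ArtifactStagingDirectory)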

jobs:
- job: Job_1
  displayName: Agent job 1   
  pool:
    vmImage: ubuntu-latest
  steps:
    - script: echo "Hello World"

# A pipeline (or job) can also target a self-hosted pool and demand a capability:
pool:
  name: Default
  demands: SpecialSoftware # check that the SpecialSoftware capability exists
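demands can also be given as a list, and can test a capability's value (capability names here are illustrative):

pool:
  name: Default
  demands:
  - SpecialSoftware          # capability must exist
  - Agent.OS -equals Linux   # capability must equal this value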
  
  
stages:
- stage: A
  jobs:
  - job: A1
  - job: A2
- stage: B
  jobs:
  - job: B1
  - job: B2  




stages:
- stage: A

# stage B runs if A fails
- stage: B
  condition: failed()

# stage C runs if B succeeds
- stage: C
  dependsOn:
  - A
  - B
  condition: succeeded('B')




If you choose to specify a pool at the stage level, then all jobs defined in that stage will use that pool unless otherwise specified at the job level.
stages:
- stage: A
  pool: StageAPool
  jobs:
  - job: A1 # will run on "StageAPool" pool based on the pool defined on the stage
  - job: A2 # will run on "JobPool" pool
    pool: JobPool





jobs:
- job: Foo

  steps:
  - script: echo Hello!
    condition: always() # this step will always run, even if the pipeline is canceled

- job: Bar
  dependsOn: Foo
  condition: failed() # this job will only run if Foo fails





jobs:
- job: Debug
  steps:
  - script: echo hello from the Debug build
- job: Release
  dependsOn: Debug
  steps:
  - script: echo hello from the Release build


You can organize pipeline jobs into stages. Stages are the major divisions in a pipeline: "build this app", "run these tests", and "deploy to pre-production" are good examples of stages. They are logical boundaries in your pipeline where you can pause the pipeline and perform various checks.

Pipeline > Stages > Stage > Jobs > Job > Steps > Step
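A sketch of those stage boundaries (stage/job names are illustrative):

stages:
- stage: Build
  jobs:
  - job: BuildApp
    steps:
    - script: echo "build this app"
- stage: Test
  dependsOn: Build
  jobs:
  - job: RunTests
    steps:
    - script: echo "run these tests"
- stage: DeployPreProd
  dependsOn: Test
  jobs:
  - job: Deploy
    steps:
    - script: echo "deploy to pre-production"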

jobs:
- job: A
  steps:
  - bash: echo "A"

- job: B
  steps:
  - bash: echo "B"
  
  
If you organize your pipeline into multiple stages, you use the stages keyword.

stages:
- stage: A
  jobs:
  - job: A1
  - job: A2

- stage: B
  jobs:
  - job: B1
  - job: B2
  
When you define multiple stages in a pipeline, by default they run sequentially in the order in which you define them in the YAML file. The exception is when you add dependencies: with dependencies, stages run in the order of their dependsOn requirements.

stages:
- stage: string
  dependsOn: string
  condition: string
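For example, a fan-out/fan-in layout (stage names are illustrative):

stages:
- stage: Test
- stage: FunctionalTest
  dependsOn: Test          # runs after Test
- stage: AcceptanceTest
  dependsOn: Test          # runs in parallel with FunctionalTest
- stage: Deploy
  dependsOn:               # waits for both test stages
  - FunctionalTest
  - AcceptanceTest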
  -----------------------------------------------------------
You can organize your pipeline into jobs. Every pipeline has at least one job. A job is a series of steps that run sequentially as a unit. In other words, a job is the smallest unit of work that can be scheduled to run.
  
In the simplest case, a pipeline has a single job. In that case, you do not have to explicitly use the job keyword.
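For example, steps can sit directly at the pipeline root (a minimal sketch):

pool:
  vmImage: ubuntu-latest
steps:
- script: echo "this runs in the single, implicit job"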
  
jobs:
- job: myJob
  timeoutInMinutes: 10
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - bash: echo "Hello world"
  
  
Example JDK path on a hosted Ubuntu agent:
/usr/lib/jvm/adoptopenjdk-11-hotspot-amd64

# update-alternatives --config java   # interactive: pick the default java
update-alternatives --list java       # list the installed java alternatives
ls /etc/alternatives                  # inspect all registered alternatives
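Inside a pipeline, the same checks can run as a script step (a sketch):

steps:
- script: |
    update-alternatives --list java
    ls /etc/alternatives
  displayName: Inspect Java installs on the agent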
  
  



Related: Azure Pipeline - Add Approver for Stage
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/approvals?view=azure-devops&tabs=check-pass
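Approvals are configured on an environment in the Azure DevOps UI; a stage whose deployment job targets that environment then pauses for approval. A minimal sketch (the environment name is illustrative):

stages:
- stage: DeployPreProd
  jobs:
  - deployment: DeployWeb
    environment: pre-production   # approvals/checks are attached to this environment
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploy to pre-production"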