GitLab CI Dynamic Child Pipelines: A New Hope
Intended audience: developers, CI/CD operators, Jenkins aficionados
By Max Gautier, Consultant Cloud & DevOps @ ObjectifLibre
The Dark Ages
GitLab CI pipelines are cool. They're easy to understand, well integrated with the GitLab UI, and can run on Kubernetes. One thing is missing, though, for them to be a killer feature: the ability to dynamically create jobs. Indeed, suppose we have to execute several similar jobs (a set of tests would be a good example). Well, we need one entry in our .gitlab-ci.yml for each job we want to execute!
That's a lot of copy-pasting. We can use YAML anchors and aliases, but we're still left with too much boilerplate.
test_template: &test
  script: echo "Testing $SOMETHING"
  stage: test

test_A:
  variables:
    SOMETHING: A
  <<: *test

test_B:
  variables:
    SOMETHING: B
  <<: *test

test_C:
  variables:
    SOMETHING: C
  <<: *test

test_D:
  variables:
    SOMETHING: D
  <<: *test
You can see where this leads, can't you?
Now, that is annoying, but you can work around it, either with the aforementioned copy-pasting, or with clever usage of the include feature, which allows you to include files from an external source.
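For instance, the template could live in its own file and be pulled in with include. A sketch (the file path and the hidden .test job name are made up for illustration):

```yaml
# .gitlab-ci.yml
include:
  - local: /templates/tests.yml   # hypothetical path in the same repository

# templates/tests.yml would contain a hidden job template:
# .test:
#   script: echo "Testing $SOMETHING"
#   stage: test

test_A:
  extends: .test
  variables:
    SOMETHING: A
```

extends plays the same role as the YAML merge key, but works across included files; the per-job boilerplate remains, though.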
What you cannot do, though, is define jobs dynamically [1]. For instance, I have a tests directory under which all my tests live (not original, but most projects have that kind of layout), as shell scripts.
tests
├── test_A
├── test_B
├── test_C
├── test_D
├── test_E
├── test_F
├── test_G
├── test_H
├── test_I
├── test_J
├── test_K
├── test_L
├── test_M
├── test_N
├── test_O
├── test_P
├── test_Q
├── test_R
├── test_S
├── test_T
├── test_U
├── test_V
├── test_W
├── test_X
├── test_Y
└── test_Z
Let's say we want to launch each of them in its own job [2]. We could reuse our earlier approach:
test_template: &test
  script: tests/test_$SOMETHING
  stage: test

test_A:
  variables:
    SOMETHING: A
  <<: *test

test_B:
  variables:
    SOMETHING: B
  <<: *test

# ... and so on
Besides the copy-pasting, there is one problem with that configuration: it duplicates information that already lives in the repository. What if we add a test_XY? Or remove test_A? Then we end up with either a test that is never executed, or a spurious error, unless we remember to edit the .gitlab-ci.yml file every time.
Not ideal.
A New Hope
In its 12.7 release, GitLab announced Parent-Child Pipelines.
They allow you to define jobs whose purpose is to trigger another pipeline in the same project.
job_1:
  trigger:
    include:
      - local: path/to/pipeline.yml
This job creates a new pipeline in the same project, defined in path/to/pipeline.yml. The include works the same way as a top-level include.
By default, that job is considered successful as soon as the child pipeline is created, and does not wait for it to finish. If we want to change that, we need to add strategy: depend to the job definition. The job then depends on the child pipeline and mirrors its status.
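With that option, the trigger job above becomes:

```yaml
job_1:
  trigger:
    include:
      - local: path/to/pipeline.yml
    strategy: depend    # wait for the child pipeline and mirror its status
```

job_1 now stays running until the child pipeline completes, and fails if it fails.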
How does that help us with dynamic jobs? Well, it does not, yet. But in the 12.9 release, GitLab enhanced the trigger syntax we just saw with one little thing: a new type of include, artifact, which allows including a YAML file from a previously generated artifact.
And that is exactly what we need.
Let’s apply that to our case.
First, we'll create a generator for our dynamic pipeline. We'll use a shell script for simplicity, but you could use anything, Jinja templating for instance:
#!/bin/sh
# Emit one job definition per script found under tests/
for test in tests/*
do
cat <<EOF
job_${test##*/}:
  stage: test
  script: ./$test
EOF
done
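You can try the generator loop locally before wiring it into CI. A sketch of such a dry run, using a throwaway temporary directory (hypothetical, standing in for the repository) with two fake test scripts:

```shell
#!/bin/sh
# Dry run of the generator loop in a temporary directory (hypothetical setup)
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/tests"
touch "$tmpdir/tests/test_A" "$tmpdir/tests/test_B"
cd "$tmpdir" || exit 1

# Same loop as generate-pipeline.sh, output captured for inspection
generated=$(
for test in tests/*
do
cat <<EOF
job_${test##*/}:
  stage: test
  script: ./$test
EOF
done
)
echo "$generated"
```

${test##*/} strips the leading directory from the path, so tests/test_A yields a job named job_test_A whose script runs ./tests/test_A.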
In our main .gitlab-ci.yml, we create the generator job:
generator:
  stage: generate
  image: ubuntu
  script:
    - ./generate-pipeline.sh > generated-pipeline.yml
  artifacts:
    paths:
      - generated-pipeline.yml
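Since generate and trigger are not among GitLab's default stage names (they are our choice here), the parent pipeline needs to declare them at the top of .gitlab-ci.yml:

```yaml
stages:
  - generate
  - trigger
# the test stage only exists in the generated child pipeline
```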
And finally, we include the generated YAML and trigger a child pipeline:
child-pipeline:
  stage: trigger
  trigger:
    include:
      - artifact: generated-pipeline.yml
        job: generator
    strategy: depend
And the result is a parent pipeline triggering a child pipeline with one job per test script.
So, as of now, we can dynamically generate pipeline configuration for GitLab CI, which means we are no longer restricted by the expressiveness of the .gitlab-ci.yml syntax and can enjoy the power of our favorite language.
Of course, this comes with more complexity: a simple pipeline definition is easier to understand than a pipeline generating others. But complexity does not go away, and it does not appear out of thin air either. What this new feature gives us is a tool to manage it.
[1] Well, you could generate an appropriate YAML file using the remote option of include and encode parameters in the URL. But if you're using GitLab CI, the point is to have integration, not more moving pieces.
[2] Because of course, you could just have a single job running for test in tests/*; do ./$test; done if you don't care about the separation.