By default, all jobs are interruptible, except the `dont-interrupt-me` job, which runs automatically on `main` and is manual otherwise. If you want a running pipeline to finish even if you push new commits to a merge request, be sure to start the `dont-interrupt-me` job before pushing.
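A minimal sketch of what such a job can look like in GitLab CI YAML (illustrative only; the job name comes from the text above, everything else is assumed):

```yaml
# Hypothetical sketch of a non-interruptible guard job. Once it has
# started, auto-cancel of redundant pipelines won't interrupt the
# pipeline that contains it.
dont-interrupt-me:
  interruptible: false
  script:
    - echo "Keeping this pipeline alive even if new commits are pushed"
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'   # runs automatically on main
    - when: manual                        # manual everywhere else
      allow_failure: true                 # don't block the pipeline when not started
```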
Because GitLab.com uses the pack-objects cache, concurrent Git fetches of the same pipeline ref are deduplicated on the Gitaly server (always) and served from cache (when available). This works well because the CI/CD Git strategy setting for `gitlab-org/gitlab` is Git clone, causing all jobs to fetch the same data, which maximizes the cache hit ratio.

Lately, we have seen errors from Gitaly that look like this (see the issue):

fatal: remote error: GitLab is currently unable to handle this request due to load.

Even though GitLab.com uses the pack-objects cache, sometimes the load is still too heavy for Gitaly to handle, and thundering herds are also a concern when many jobs clone the repository at around the same time.

To mitigate this and reduce the load on Gitaly, we changed some jobs to fetch the repository from artifacts produced by a single job instead of all cloning from Gitaly at once. For now, this applies to most of the RSpec jobs, which account for the most concurrent jobs in most pipelines. This also slightly improves speed, because fetching from artifacts is slightly faster than cloning, at the cost of saving more artifacts for each pipeline. Based on the numbers on 2023-12-20 at Fetch repo from artifacts for RSpec jobs, the extra storage cost was about 280 MB for each pipeline, and we save 15 seconds for each RSpec job. We do not apply this to jobs that have no other job dependencies, because we don't want to delay any jobs from starting.
This behavior can be controlled with the variable `CI_FETCH_REPO_GIT_STRATEGY`:

- `none` means jobs using `.repo-from-artifacts` fetch the repository from artifacts in the job `clone-gitlab-repo` rather than cloning.
- `clone` means jobs using `.repo-from-artifacts` clone the repository as usual. The job `clone-gitlab-repo` does not run in this case.

To disable it, set `CI_FETCH_REPO_GIT_STRATEGY` to `clone`. To enable it, set `CI_FETCH_REPO_GIT_STRATEGY` to `none`.
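A minimal sketch of how this can be wired up (the job, template, and variable names come from the description above; the exact definitions in the real configuration may differ):

```yaml
# Hypothetical sketch: one job clones once and exports the checkout as
# an artifact; dependent jobs skip their own clone and reuse it.
clone-gitlab-repo:
  stage: prepare
  script:
    - echo "Repository is checked out by the runner as usual"
  artifacts:
    paths:
      - ./                  # the whole checkout is saved for later jobs
    expire_in: 12 hours

.repo-from-artifacts:
  variables:
    # `none` skips the Git fetch and relies on the artifact;
    # `clone` falls back to the normal Git clone.
    GIT_STRATEGY: $CI_FETCH_REPO_GIT_STRATEGY
  needs:
    - clone-gitlab-repo     # downloads the artifact instead of cloning

rspec-example:
  extends: .repo-from-artifacts
  stage: test
  script:
    - bundle exec rspec
```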
We have several caches defined in `.gitlab/ci/global.gitlab-ci.yml`, with fixed keys:

- `.setup-test-env-cache`
- `.ruby-cache`
- `.static-analysis-cache`
- `.rubocop-cache`
- `.ruby-node-cache`
- `.qa-cache`
- `.yarn-cache`
- `.assets-compile-cache` (the key includes `${NODE_ENV}` so it's actually two different caches)

Only the following jobs, running in `maintenance` scheduled pipelines, push (that is, update) to the caches:
- `update-setup-test-env-cache`, defined in `.gitlab/ci/rails.gitlab-ci.yml`
- `update-gitaly-binaries-cache`, defined in `.gitlab/ci/rails.gitlab-ci.yml`
- `update-rubocop-cache`, defined in `.gitlab/ci/rails.gitlab-ci.yml`
- `update-qa-cache`, defined in `.gitlab/ci/qa.gitlab-ci.yml`
- `update-assets-compile-production-cache`, defined in `.gitlab/ci/frontend.gitlab-ci.yml`
- `update-assets-compile-test-cache`, defined in `.gitlab/ci/frontend.gitlab-ci.yml`
- `update-storybook-yarn-cache`, defined in `.gitlab/ci/frontend.gitlab-ci.yml`

These jobs can also be started in merge requests by applying the `pipeline:update-cache` label (this can be useful to warm the caches in an MR that updates the cache keys).

We limit the artifacts that are saved and retrieved by jobs to the minimum to reduce upload/download time and costs, as well as artifacts storage.
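The pull-by-default, push-from-scheduled-pipelines split can be sketched as follows (a sketch only, assuming the standard GitLab CI `cache:policy` keyword; the real definitions in `.gitlab/ci/global.gitlab-ci.yml` differ):

```yaml
# Hypothetical sketch of the caching strategy: regular jobs only pull
# from a fixed-key cache; a dedicated scheduled job pushes updates.
.ruby-cache:
  cache:
    key: ruby-cache-v1      # fixed key, shared by all pipelines
    paths:
      - vendor/ruby/
    policy: pull            # regular jobs never upload the cache

rspec-example:
  extends: .ruby-cache
  script:
    - bundle exec rspec

update-ruby-cache:
  extends: .ruby-cache
  cache:
    key: ruby-cache-v1
    paths:
      - vendor/ruby/
    policy: push            # only this job updates the cache
  script:
    - bundle install
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
```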
Components caching

Some external components of GitLab (GitLab Workhorse and frontend assets) need to be built from source as a preliminary step for running tests.

cache-workhorse

In this MR, and then this MR, we introduced a new `cache-workhorse` job that runs:

- automatically for all `gitlab-org/gitlab` scheduled pipelines
- automatically for any `master` commit that touches the `workhorse/` folder
- manually for `gitlab-org`'s MRs that touch caching-related files

This job tries to download a generic package that contains the GitLab Workhorse binaries needed in the GitLab test suite (under `tmp/tests/gitlab-workhorse`). If the package doesn't exist yet, the job runs `scripts/setup-test-env` so that the GitLab Workhorse binaries are built, and uploads them as a new package.

We also changed the `setup-test-env` job to:

- First download the package built by `cache-workhorse` and place its content in the right folder (`tmp/tests/gitlab-workhorse`), preventing the building of the binaries when `scripts/setup-test-env` is run later on.
- Then run `scripts/setup-test-env`.

The version of the package is the workhorse tree SHA (for example, `git rev-parse HEAD:workhorse`).
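The download-or-build flow can be sketched with the GitLab generic packages API (a sketch under assumptions: the package name, file name, and script are illustrative, not the real implementation):

```yaml
# Hypothetical sketch of cache-workhorse: version the package by the
# workhorse tree SHA, download it if present, otherwise build and upload.
cache-workhorse:
  script:
    - WORKHORSE_TREE=$(git rev-parse HEAD:workhorse)
    - PKG_URL="${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/workhorse/${WORKHORSE_TREE}/workhorse.tar.gz"
    - |
      if curl --fail --silent --header "JOB-TOKEN: ${CI_JOB_TOKEN}" \
              --output workhorse.tar.gz "$PKG_URL"; then
        echo "Package for $WORKHORSE_TREE already exists, nothing to do"
      else
        scripts/setup-test-env   # builds binaries under tmp/tests/gitlab-workhorse
        tar -czf workhorse.tar.gz -C tmp/tests gitlab-workhorse
        curl --fail --header "JOB-TOKEN: ${CI_JOB_TOKEN}" \
             --upload-file workhorse.tar.gz "$PKG_URL"
      fi
```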
cache-assets

In this MR, we introduced three new jobs, `cache-assets:test`, `cache-assets:test as-if-foss`, and `cache-assets:production`, that run:

- only if `$CACHE_ASSETS_AS_PACKAGE == "true"`
- automatically for all `gitlab-org/gitlab` scheduled pipelines
- automatically for any `master` commit that touches the assets-related folders
- manually for `gitlab-org`'s MRs that touch caching-related files

These jobs try to download a generic package that contains the compiled GitLab assets needed in the GitLab test suite (under `app/assets/javascripts/locale/**/app.js` and `public/assets`). If the package doesn't exist yet, the job runs `bin/rake gitlab:assets:compile` so that the GitLab assets are compiled.

compile-*-assets
We also changed the `compile-test-assets` and `compile-production-assets` jobs to:

1. First download the "native" cache assets, which contain:
   - The compiled assets.
   - A `cached-assets-hash.txt` file containing the `SHA256` hexdigest of all the source files the assets depend on. This list of files is pessimistic, and the assets might not depend on some of these files. At worst, we compile the assets more often than necessary, which is better than using outdated assets. The file is created after assets are compiled.
2. Compute the `SHA256` hexdigest of all the source files the assets depend on, for the currently checked-out branch, and store it in the `GITLAB_ASSETS_HASH` variable.
3. If `$CACHE_ASSETS_AS_PACKAGE == "true"`, download the generic package built and uploaded by `cache-assets:*`.
4. Run the `assets_compile_script` function, which itself runs the `assets:compile` Rake task. This task is responsible for deciding if assets need to be compiled or not. It compares the `HEAD` `SHA256` hexdigest from `$GITLAB_ASSETS_HASH` with the `master` hexdigest from `cached-assets-hash.txt`. If the hashes are the same, we don't compile anything; if they differ, we compile the assets.
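The decision step can be sketched as follows (a sketch only: `assets_source_files` is a hypothetical helper standing in for the real list of source files, and the real logic lives in the `assets:compile` Rake task):

```yaml
# Hypothetical sketch of the hash comparison, written as shell in a CI
# job for clarity. `assets_source_files` is an assumed helper.
compile-test-assets-sketch:
  script:
    # Hexdigest of the asset source files for the current checkout (HEAD).
    - GITLAB_ASSETS_HASH=$(cat $(assets_source_files) | sha256sum | cut -d' ' -f1)
    # Hexdigest recorded when the cached (master) assets were compiled.
    - CACHED_HASH=$(cat cached-assets-hash.txt 2>/dev/null || echo none)
    - |
      if [ "$GITLAB_ASSETS_HASH" = "$CACHED_HASH" ]; then
        echo "Assets unchanged, skipping compilation"
      else
        bin/rake gitlab:assets:compile
        echo "$GITLAB_ASSETS_HASH" > cached-assets-hash.txt
      fi
```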
By default, `setup-test-env` creates an artifact that contains stripped binaries, to save storage and speed up artifact downloads for subsequent CI jobs. To make debugging a crash from stripped binaries easier, comment out the line with `strip_executable_binaries` in the `setup-test-env` job and start a new pipeline.