Any future job runs that use the same Gemfile.lock use the new cache, instead of rebuilding the dependencies. Additional details: The cache key is a SHA computed from the most recent commits that changed each listed file.
If neither file is changed in any commits, the fallback key is default. Use cache:key:prefix to combine a prefix with the SHA computed for cache:key:files. Possible inputs: A string. A predefined CI/CD variable. A combination of both.
Example of cache:key:prefix:

```yaml
rspec:
  script:
    - echo "This rspec job uses a cache."
  cache:
    key:
      files:
        - Gemfile.lock
      prefix: $CI_JOB_NAME
    paths:
      - vendor/ruby
```

If a branch changes Gemfile.lock, a new cache key is generated, and a new cache is created for that key. If Gemfile.lock is not found, the prefix is added to default. Additional details: If no file in cache:key:files is changed in any commits, the prefix is added to the default key.
Possible inputs: true or false (default).

Example of cache:untracked:

```yaml
rspec:
  script: test
  cache:
    untracked: true
```

Additional details: You can combine cache:untracked with cache:paths to cache all untracked files as well as files in the configured paths. Use cache:when to define when to save the cache, based on the status of the job. Use cache:policy to change the upload and download behavior of a cache. By default, the job downloads the cache when the job starts, and uploads changes to the cache when the job ends.
This caching style is the pull-push policy (the default). To set a job to only download the cache when the job starts, but never upload changes when the job finishes, use cache:policy:pull.
To set a job to only upload a cache when the job finishes, but never download the cache when the job starts, use cache:policy:push. Use the pull policy when you have many jobs executing in parallel that use the same cache. This policy speeds up job execution and reduces load on the cache server.
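A minimal sketch combining both policies, assuming a Ruby project that caches bundler gems (the key, paths, and commands are illustrative): one job builds the cache with push, and test jobs consume it with pull.

```yaml
prepare-dependencies-job:
  stage: build
  cache:
    key: gems
    paths:
      - vendor/bundle
    policy: push  # upload the cache when the job ends, never download it
  script:
    - bundle install --path=vendor/bundle

faster-test-job:
  stage: test
  cache:
    key: gems
    paths:
      - vendor/bundle
    policy: pull  # download the cache when the job starts, never upload changes
  script:
    - rspec spec
```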
You can use a job with the push policy to build the cache. Use the dependencies keyword to define a list of specific jobs to fetch artifacts from. You can also set a job to download no artifacts at all. If you do not use dependencies, all artifacts from previous stages are passed to each job. Possible inputs: The names of jobs to fetch artifacts from. An empty array ([]), to configure the job to not download any artifacts.
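A sketch of the configuration the next paragraph walks through; the job names match the paragraph, while the make targets are illustrative.

```yaml
build osx:
  stage: build
  script: make build:osx
  artifacts:
    paths:
      - binaries/

build linux:
  stage: build
  script: make build:linux
  artifacts:
    paths:
      - binaries/

test osx:
  stage: test
  script: make test:osx
  dependencies:
    - build osx

test linux:
  stage: test
  script: make test:linux
  dependencies:
    - build linux

deploy:
  stage: deploy
  script: make deploy
```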
When test osx is executed, the artifacts from build osx are downloaded and extracted in the context of the build. The same thing happens for test linux and artifacts from build linux. The deploy job downloads artifacts from all previous jobs because of the stage precedence. Additional details: The job status does not matter. If the artifacts of a dependent job are expired or deleted, then the job fails. The artifacts are sent to GitLab after the job finishes. They are available for download in the GitLab UI if their size is not larger than the maximum artifact size.
By default, jobs in later stages automatically download all the artifacts created by jobs in earlier stages. You can control artifact download behavior in jobs with dependencies. When using the needs keyword, jobs can only download artifacts from the jobs defined in the needs configuration. Job artifacts are only collected for successful jobs by default, and artifacts are restored after caches. Read more about artifacts.
Similar to artifacts:paths, exclude paths are relative to the project directory. You can use wildcards that use glob or doublestar.PathMatch patterns. To exclude all of the contents of a directory, you can match them explicitly rather than matching the directory itself. Use artifacts:expire_in to specify how long job artifacts are stored before they expire and are deleted. This setting does not apply to pipeline artifacts; see When pipeline artifacts are deleted for more information. If the expiry time is not defined, it defaults to the instance-wide setting (30 days by default).
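A minimal sketch of setting an expiry; the duration is illustrative, and natural-language values like "30 days" or "2 hrs 20 min" are also accepted.

```yaml
job:
  artifacts:
    expire_in: 1 week
```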
To override the expiration date and protect artifacts from being automatically deleted, select Keep on the job page. After expiry, artifacts are deleted hourly by default (using a cron job) and are no longer accessible.
Use artifacts:expose_as to expose job artifacts in the merge request UI. To access the link, select View exposed artifact below the pipeline graph in the merge request overview. A maximum of 10 job artifacts per merge request can be exposed. Glob patterns are unsupported. If a directory is specified, the link is to the job artifacts browser if there is more than one file in the directory.
For exposed single-file artifacts with an .html or .htm extension, if GitLab Pages is enabled, the file is rendered directly; if not enabled, the file is displayed in the artifacts browser. Use artifacts:name to define the name of the created artifacts archive. You can specify a unique name for every archive.
The artifacts:name variable can make use of any of the predefined variables. The default name is artifacts, which becomes artifacts.zip when downloaded. To restrict which jobs a specific job fetches artifacts from, see dependencies. Send all files in binaries and .config, as sketched below.
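A minimal sketch of that artifacts:paths configuration, assuming both paths sit at the project root:

```yaml
artifacts:
  paths:
    - binaries/
    - .config
```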
Use artifacts:public to determine whether the job artifacts should be publicly available. The default for artifacts:public is true, which means that the artifacts in public pipelines are available for download by anonymous and guest users:

```yaml
artifacts:
  public: true
```

To deny read access for anonymous and guest users to artifacts in public pipelines, set artifacts:public to false:

```yaml
artifacts:
  public: false
```

artifacts:reports

Use artifacts:reports to:

- Collect test reports, code quality reports, and security reports from jobs.
- Expose these reports in merge requests, pipeline views, and security dashboards.

The test reports are collected regardless of the job result (success or failure). Some artifacts:reports types can be generated by multiple jobs in the same pipeline, and used by merge request or pipeline features from each job.
The reference includes a table listing, for each report keyword, whether multiple reports in the same pipeline are supported. For merge request diff annotations and the full report, the answer is no; track progress on adding support in this issue. The collected API Fuzzing report uploads to GitLab as an artifact and is summarized in merge requests and the pipeline view.
The collected Browser Performance report uploads to GitLab as an artifact and displays in merge requests. The cobertura report collects Cobertura coverage XML files. The collected Cobertura coverage reports upload to GitLab as an artifact and display in merge requests. Cobertura was originally developed for Java, but there are many third-party ports for other languages such as JavaScript, Python, and Ruby.
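A minimal sketch of collecting a Cobertura report; the test command and output path are illustrative.

```yaml
test:
  script:
    - run-tests --coverage  # illustrative command that writes coverage.xml
  artifacts:
    reports:
      cobertura: coverage.xml
```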
The codequality report collects Code Quality issues as artifacts. The collected Code Quality report uploads to GitLab as an artifact and is summarized in merge requests. The collected Container Scanning report uploads to GitLab as an artifact and is summarized in merge requests and the pipeline view.
The collected coverage fuzzing report uploads to GitLab as an artifact and is summarized in merge requests and the pipeline view. The collected Dependency Scanning report uploads to GitLab as an artifact and is summarized in merge requests and the pipeline view.
The dotenv report collects a set of environment variables as artifacts. The collected variables are registered as runtime-created variables of the job, which is useful to set dynamic environment URLs after a job finishes.
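A sketch of the dynamic-URL pattern the paragraph mentions; the variable name, URL, and job layout are illustrative.

```yaml
build:
  stage: build
  script:
    - echo "DYNAMIC_ENVIRONMENT_URL=https://example.com/review" >> deploy.env
  artifacts:
    reports:
      dotenv: deploy.env  # variables in this file become job variables downstream

deploy:
  stage: deploy
  script:
    - echo "Deploying to $DYNAMIC_ENVIRONMENT_URL"
  environment:
    name: review
    url: $DYNAMIC_ENVIRONMENT_URL
```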
There is a maximum size for the .env file, and variable substitution in the .env file is not supported. The junit report collects JUnit report format XML files as artifacts. Although JUnit was originally developed in Java, there are many third-party ports for other languages such as JavaScript, Python, and Ruby.
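A minimal sketch of collecting a JUnit report; the rspec invocation is illustrative.

```yaml
rspec:
  stage: test
  script:
    - bundle install
    - rspec --format RspecJunitFormatter --out rspec.xml
  artifacts:
    reports:
      junit: rspec.xml
```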
See Unit test reports for more details and examples. If the JUnit tool you use exports to multiple XML files, specify multiple test report paths in a single job to concatenate them into a single file. The collected License Compliance report uploads to GitLab as an artifact and displays automatically in merge requests and the pipeline view. The report provides data for security dashboards. The load_performance report collects Load Performance Testing metrics; the report is uploaded to GitLab as an artifact and is shown in merge requests automatically.
The collected Metrics report uploads to GitLab as an artifact and displays in merge requests. The requirements report collects requirements.json files. The collected Requirements report uploads to GitLab as an artifact and existing requirements are marked as Satisfied. The sast report collects SAST vulnerabilities as artifacts. The secret_detection report collects detected secrets as artifacts.
The collected Secret Detection report is uploaded to GitLab as an artifact and summarized in merge requests and the pipeline view. The terraform report obtains a Terraform tfplan.json; JQ processing is required to remove credentials. The collected Terraform plan report uploads to GitLab as an artifact and displays in merge requests.
For more information, see Output terraform plan information into a merge request. Use artifacts:when to upload artifacts on job failure or despite the failure: for example, when uploading artifacts required to troubleshoot failing tests.
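A minimal sketch of uploading artifacts only on failure (on_success is the default; always is also accepted):

```yaml
job:
  artifacts:
    when: on_failure
```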
Use coverage with a custom regular expression to configure how code coverage is extracted from the job output. The coverage is shown in the UI if at least one line in the job output matches the regular expression. Possible inputs: A regular expression that starts and ends with /, for example '/Code coverage: \d+\.\d+/'. A line like Code coverage: 67.89% of lines covered would match it, and that sample matching line gives a code coverage of 67.89. Additional details: If there is more than one matched line in the job output, the last line is used.
Leading zeros are removed. Coverage output from child pipelines is not recorded or displayed; check the related issue for more details. Use dast_configuration to specify a site profile and scanner profile for a job. Both profiles must first have been created in the project. This is a job keyword: you can use it only as part of a job. Additional details: Settings contained in either a site profile or scanner profile take precedence over those contained in the DAST template. Related topics: Site profile. Scanner profile.
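A sketch of selecting both profiles; the profile names are illustrative.

```yaml
stages:
  - build
  - dast

include:
  - template: DAST.gitlab-ci.yml

dast:
  dast_configuration:
    site_profile: "Example Co"
    scanner_profile: "Quick Passive Test"
```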
Use retry to configure how many times a job is retried if it fails. If not defined, retry defaults to 0 and jobs do not retry. When a job fails, the job is processed up to two more times, until it succeeds or reaches the maximum number of retries. By default, all failure types cause the job to be retried. Use retry:when to select which failures to retry on. Possible inputs: 0 (default), 1, or 2.

Example of retry:

```yaml
test:
  script: rspec
  retry: 2
```

retry:when

Use retry:when with retry:max to retry jobs for only specific failure cases.
Possible inputs: A single failure type, or an array of one or more failure types (for example, script_failure, stuck_or_timeout_failure, or runner_system_failure). Use timeout to configure a timeout for a specific job. If the job runs for longer than the timeout, the job fails. The job-level timeout can be longer than the project-level timeout. Possible inputs: A period of time written in natural language; for example, these are all equivalent: 3600 seconds, 60 minutes, one hour.

Example of timeout:

```yaml
build:
  script: build.sh
  timeout: 3 hours 30 minutes
```

Use parallel to run a job multiple times in parallel in a single pipeline. Multiple runners must exist, or a single runner must be configured to run multiple jobs concurrently.
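A minimal sketch of parallel; this creates five instances of the same job running concurrently.

```yaml
test:
  script: rspec
  parallel: 5
```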
Possible inputs: A numeric value from 2 to 50. Related topics: Parallelize large jobs. Use parallel:matrix to run a job multiple times in parallel in a single pipeline, but with different variable values for each instance of the job. Run a matrix of triggered parallel jobs. Use trigger to start a downstream pipeline that is either a multi-project pipeline or a child pipeline. Possible inputs: For multi-project pipelines, the path to the downstream project.
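A sketch of both trigger forms; the project path and include path are illustrative.

```yaml
# Multi-project pipeline: run the pipeline of another project
trigger-downstream:
  trigger:
    project: my-group/my-deployment-project

# Child pipeline: run a pipeline defined in a file in this project
trigger-child:
  trigger:
    include: path/to/child-pipeline.yml
```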
Related topics: Multi-project pipeline configuration examples. Child pipeline configuration examples. To force a rebuild of a specific branch, tag, or commit, you can use an API call with a trigger token; the trigger token is different from the trigger keyword. Use trigger:strategy to force the trigger job to wait for the downstream pipeline to complete before it is marked as success. This behavior is different than the default, which is for the trigger job to be marked as success as soon as the downstream pipeline is created.
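A minimal sketch of trigger:strategy; the include path is illustrative.

```yaml
trigger_job:
  trigger:
    include: path/to/child-pipeline.yml
    strategy: depend  # wait for the downstream pipeline and mirror its status
```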
This setting makes your pipeline execution linear rather than parallel. Use interruptible if a job should be canceled when a newer pipeline starts before the job completes. This keyword is used with the automatic cancellation of redundant pipelines feature. When enabled, a running job with interruptible: true can be canceled when a new pipeline starts on the same branch.

Example of interruptible:

```yaml
stages:
  - stage1
  - stage2
  - stage3

step-1:
  stage: stage1
  script:
    - echo "Can be canceled."
  interruptible: true

step-2:
  stage: stage2
  script:
    - echo "Can not be canceled."
```

In this example, a new pipeline causes the running pipeline to be canceled if only step-1 is running or pending,
but not canceled after step-2 starts. Additional details: Only set interruptible: true if the job can be safely canceled after it has started, like a build job. To completely cancel a running pipeline, all jobs must have interruptible: true, or interruptible: false jobs must not have started. Use resource_group to create a resource group that ensures a job is mutually exclusive across different pipelines for the same project. For example, if multiple jobs that belong to the same resource group are queued simultaneously, only one of the jobs starts.
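A minimal sketch of the deployment case; the job name and script are illustrative.

```yaml
deploy-to-production:
  script: deploy
  resource_group: production  # only one job in this group runs at a time
```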
Resource groups behave similarly to semaphores in other programming languages. You can define multiple resource groups per environment. For example, when deploying to physical devices, you might have multiple physical devices. Each device can be deployed to, but only one deployment can occur per device at any given time. As a result, you can ensure that concurrent deployments never happen to the production environment.
Use release to create a release. The release job must have access to the release-cli, which must be in the runner's $PATH. If you use the Docker executor, you can use this image from the GitLab container registry: registry.gitlab.com/gitlab-org/release-cli:latest. Additional details: All release jobs, except trigger jobs, must include the script keyword.
A release job can use the output from script commands. If the release already exists, it is not updated and the job with the release keyword fails. If you use the Shell executor or similar, install release-cli on the server where the runner is registered.
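A sketch of a tag-driven release job assembled from the keywords described here; the rules condition and description text are illustrative.

```yaml
release_job:
  stage: release
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  rules:
    - if: $CI_COMMIT_TAG  # run this job only when a tag is pushed
  script:
    - echo "Running the release job."
  release:
    tag_name: $CI_COMMIT_TAG
    name: 'Release $CI_COMMIT_TAG'
    description: 'Release created using the release-cli.'
```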
Create multiple releases in a single pipeline. release:tag_name is required: it is the Git tag for the release. If the tag does not exist in the project yet, it is created at the same time as the release. New tags use the SHA associated with the pipeline. Possible inputs: A tag name. There are many things you can configure in a Play dependency descriptor: see the Ivy version-matchers documentation.
When you create a new Play application, a dependencies.yml descriptor is created in the conf/ directory. The require section lists all dependencies needed by your application. Here the new application only depends on Play itself. To ask Play to resolve, download, and install the new dependencies, run play dependencies. Now Play has downloaded two JARs: guava-r07.jar and one more. Why two JARs, since we only declared one dependency?
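For reference, the descriptor at this point might look like the following sketch; the Guava coordinates and revision are assumptions.

```yaml
# conf/dependencies.yml (minimal sketch of a Play 1.x dependency descriptor)
require:
    - play
    - com.google.guava -> guava r07
```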
Because Google Guava has a transitive dependency. In fact this dependency is not really required, and we would like to exclude it. By default, any transitive dependencies are automatically retrieved, but there are several ways to exclude them if needed. You can alter the Ivy configuration mappings used during resolution; see Custom Configuration Mappings.
This means all transitive Ivy configurations or Maven scopes are pulled in as dependencies. You can customize this behavior by using the configurations block in dependencies. Do you know how to make it work with the play war command?
Tried with 1.