Compare commits

...

35 Commits

Author SHA1 Message Date
dependabot[bot]
e1ada08a9a Bump the github-actions group with 1 update (#1047)
Bumps the github-actions group with 1 update:
[gradle/gradle-build-action](https://github.com/gradle/gradle-build-action).

Updates `gradle/gradle-build-action` from 2.11.1 to 2.12.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gradle/gradle-build-action/releases">gradle/gradle-build-action's
releases</a>.</em></p>
<blockquote>
<h2>v2.12.0</h2>
<p>Adds a new option to clear a previously submitted
dependency-graph.</p>
<pre lang="yaml"><code>steps:
- uses: gradle/gradle-build-action@v2
  with:
    dependency-graph: clear
</code></pre>
<p>This may prove useful when migrating to a workflow using the upcoming
<code>gradle/actions/dependency-submission</code> action.</p>
<p><strong>Full-changelog</strong>: <a
href="https://github.com/gradle/gradle-build-action/compare/v2.11.1...v2.12.0">https://github.com/gradle/gradle-build-action/compare/v2.11.1...v2.12.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a8f75513ea"><code>a8f7551</code></a>
Build outputs</li>
<li><a
href="9283312acb"><code>9283312</code></a>
Add new option to clear dependency-graph</li>
<li><a
href="7c8a278ea0"><code>7c8a278</code></a>
Remove old clear-dependency-graph action</li>
<li><a
href="d8ca9b7d2e"><code>d8ca9b7</code></a>
Do full checks on release branches</li>
<li>See full diff in <a
href="https://github.com/gradle/gradle-build-action/compare/v2.11.1...v2.12.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=gradle/gradle-build-action&package-manager=github_actions&previous-version=2.11.1&new-version=2.12.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the specified ignore condition for the given dependency


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-25 10:21:22 -07:00
daz
a8e3e5e2b4 Apply dependency version updates
- NPM dependencies
- github-actions dependencies
2024-01-25 10:03:45 -07:00
daz
2be01ca1c6 Build outputs 2024-01-25 10:00:43 -07:00
dependabot[bot]
a00827eebb Bump the npm-dependencies group with 7 updates
Bumps the npm-dependencies group with 7 updates:

| Package | From | To |
| --- | --- | --- |
| [@actions/artifact](https://github.com/actions/toolkit/tree/HEAD/packages/artifact) | `2.0.0` | `2.1.0` |
| [@actions/cache](https://github.com/actions/toolkit/tree/HEAD/packages/cache) | `3.2.2` | `3.2.3` |
| [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) | `6.17.0` | `6.19.1` |
| [eslint-plugin-jest](https://github.com/jest-community/eslint-plugin-jest) | `27.6.1` | `27.6.3` |
| [eslint-plugin-prettier](https://github.com/prettier/eslint-plugin-prettier) | `5.1.2` | `5.1.3` |
| [prettier](https://github.com/prettier/prettier) | `3.1.1` | `3.2.4` |
| [ts-jest](https://github.com/kulshekhar/ts-jest) | `29.1.1` | `29.1.2` |

Updates `@actions/artifact` from 2.0.0 to 2.1.0
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/artifact/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/artifact)

Updates `@actions/cache` from 3.2.2 to 3.2.3
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/cache/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/cache)

Updates `@typescript-eslint/parser` from 6.17.0 to 6.19.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v6.19.1/packages/parser)

Updates `eslint-plugin-jest` from 27.6.1 to 27.6.3
- [Release notes](https://github.com/jest-community/eslint-plugin-jest/releases)
- [Changelog](https://github.com/jest-community/eslint-plugin-jest/blob/main/CHANGELOG.md)
- [Commits](https://github.com/jest-community/eslint-plugin-jest/compare/v27.6.1...v27.6.3)

Updates `eslint-plugin-prettier` from 5.1.2 to 5.1.3
- [Release notes](https://github.com/prettier/eslint-plugin-prettier/releases)
- [Changelog](https://github.com/prettier/eslint-plugin-prettier/blob/master/CHANGELOG.md)
- [Commits](https://github.com/prettier/eslint-plugin-prettier/compare/v5.1.2...v5.1.3)

Updates `prettier` from 3.1.1 to 3.2.4
- [Release notes](https://github.com/prettier/prettier/releases)
- [Changelog](https://github.com/prettier/prettier/blob/main/CHANGELOG.md)
- [Commits](https://github.com/prettier/prettier/compare/3.1.1...3.2.4)

Updates `ts-jest` from 29.1.1 to 29.1.2
- [Release notes](https://github.com/kulshekhar/ts-jest/releases)
- [Changelog](https://github.com/kulshekhar/ts-jest/blob/main/CHANGELOG.md)
- [Commits](https://github.com/kulshekhar/ts-jest/compare/v29.1.1...v29.1.2)

---
updated-dependencies:
- dependency-name: "@actions/artifact"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: npm-dependencies
- dependency-name: "@actions/cache"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: npm-dependencies
- dependency-name: "@typescript-eslint/parser"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm-dependencies
- dependency-name: eslint-plugin-jest
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-dependencies
- dependency-name: eslint-plugin-prettier
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-dependencies
- dependency-name: prettier
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm-dependencies
- dependency-name: ts-jest
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-25 09:59:45 -07:00
dependabot[bot]
ad80850e98 Bump the github-actions group with 2 updates
Bumps the github-actions group with 2 updates: [actions/dependency-review-action](https://github.com/actions/dependency-review-action) and [gradle/gradle-build-action](https://github.com/gradle/gradle-build-action).

Updates `actions/dependency-review-action` from 3 to 4
- [Release notes](https://github.com/actions/dependency-review-action/releases)
- [Commits](https://github.com/actions/dependency-review-action/compare/v3...v4)

Updates `gradle/gradle-build-action` from 2.11.0 to 2.11.1
- [Release notes](https://github.com/gradle/gradle-build-action/releases)
- [Commits](https://github.com/gradle/gradle-build-action/compare/v2.11.0...v2.11.1)

---
updated-dependencies:
- dependency-name: actions/dependency-review-action
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
- dependency-name: gradle/gradle-build-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-25 09:57:26 -07:00
daz
bd6d0a74d4 Configure explicit java version for config-cache test
The default JDK on some runners can have minor differences, resulting
in configuration-cache misses. Setting the Java version explicitly should
ensure consistency.
2024-01-25 09:21:52 -07:00
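For illustration, a minimal sketch of pinning the JDK ahead of the Gradle step; the Liberica 21 choice mirrors the workflow change in the diff below, and the build invocation is only an example:

```yaml
steps:
  - uses: actions/checkout@v4
  # Pin the JDK so configuration-cache inputs are consistent across runners
  - uses: actions/setup-java@v4
    with:
      distribution: 'liberica'
      java-version: '21'
  - name: Setup Gradle
    uses: gradle/gradle-build-action@v2
  - run: ./gradlew build --configuration-cache
```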
Daz DeBoer
1b6cac1f97 Make it easy to publish to scans.gradle.com (#1045) 2024-01-25 16:58:51 +01:00
daz
90d7c1a069 Apply TOS agreement even if plugin is already applied
Fixes #1044
2024-01-25 08:56:13 -07:00
daz
4062866f05 Document build scan publishing 2024-01-25 08:56:13 -07:00
daz
83a95864e5 Build outputs 2024-01-25 08:56:13 -07:00
daz
60c43cb563 Make it easy to publish to scans.gradle.com
- Allow init-script to publish to scans.gradle.com
- Add parameters to enable build scan publishing
- Test coverage for build scan publishing
2024-01-25 08:56:13 -07:00
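For illustration, a minimal sketch of the publishing parameters this change introduces (they are documented in the README and `action.yml` diffs below); the build step is only an example:

```yaml
steps:
  - uses: actions/checkout@v4
  - name: Setup Gradle to publish build scans
    uses: gradle/gradle-build-action@v2
    with:
      build-scan-publish: true
      build-scan-terms-of-service-url: "https://gradle.com/terms-of-service"
      build-scan-terms-of-service-agree: "yes"
  # Any subsequent Gradle invocation publishes a Build Scan automatically
  - run: ./gradlew build
```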
daz
75b3db10df Remove node warnings from workflows
- Use setup-node to control Node version used to build
- Use Node20 compatible actions in custom actions
2024-01-24 16:01:15 -07:00
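For illustration, a minimal sketch of the approach, assuming the standard `actions/setup-node` action as used in the workflow diffs below:

```yaml
steps:
  - uses: actions/checkout@v4
  # Pin the Node version instead of relying on the runner default
  - uses: actions/setup-node@v4
    with:
      node-version: 20
  - name: Build distribution
    run: |
      npm install
      npm run build
```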
daz
f1361c71c2 Build outputs 2024-01-23 16:19:26 -07:00
daz
49ade81b5d Add a new option to clear the dependency-graph
When changing workflow names or when changing to the new 'dependency-submission'
action, it can be useful to clear existing dependency graph snapshots from previous
submissions. While the old graphs will eventually "age out", the 'clear' option will
submit an empty dependency graph for an existing Job correlator, ensuring that old
dependency graphs don't linger.
2024-01-23 16:19:25 -07:00
daz
79fa674432 Remove old clear-dependency-graph action 2024-01-23 16:19:17 -07:00
daz
46878035be Do full checks on release branches 2024-01-23 15:26:51 -07:00
daz
42452daeb5 Add explicit process.exit() to avoid waiting for hanging promises
When using the `@actions/cache` library to save cache entries, it seems that one
or more Promises remain unresolved after the save completes.
With Node 20 this causes a delay when exiting the process: the default behaviour
now waits for these Promises to complete. Adding an explicit `process.exit()`
removes the delay, returning to the Node 16 behaviour.

Fixes #1038
2024-01-16 18:01:46 -07:00
daz
346645706f Don't overwrite dependency-graph env vars
This allows these vars to be explicitly set, which is required for
testing (and could prove useful for debugging).
2024-01-16 09:43:56 -07:00
daz
5516b39940 Fix docs for dependency-graph param 2024-01-13 12:55:55 -07:00
Daz DeBoer
7099569988 Improve dependency graph failure handling (#1036)
One goal for the original dependency-graph support was to minimize its
impact on existing workflows, by operating transparently and not
impacting the build outcome. This meant that any failures in
dependency-graph generation or submission were logged as warnings, but
did not cause the workflow to fail.

However, in some cases the primary purpose of a workflow is to generate
and submit a dependency graph: in these cases it is desirable to have
the workflow fail when this process breaks.

This PR introduces a new `dependency-graph-continue-on-failure`
parameter, which when `false` will enable the latter behaviour. It also
adds test coverage for different failures in dependency graph generation
and submission.

Fixes #1034 
Fixes #997
2024-01-13 15:32:28 +01:00
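For illustration, a minimal usage sketch of the new parameter; the default is `true`, so failing the Job requires setting it to `false` explicitly:

```yaml
- name: Setup Gradle to generate and submit dependency graphs
  uses: gradle/gradle-build-action@v2
  with:
    dependency-graph: generate-and-submit
    # Fail the Job if the graph cannot be generated or submitted
    dependency-graph-continue-on-failure: false
```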
daz
610728fa8c Build outputs 2024-01-13 07:21:40 -07:00
daz
a835cbb991 Use latest release of dependency-graph plugin 2024-01-13 07:21:40 -07:00
daz
ee4d92bb22 Document 'dependency-graph-continue-on-error' 2024-01-13 07:21:40 -07:00
daz
173b6ae553 Improve testing for dependency graph failures
- Update test to use input param
- Rename Job to indicate expected failure
2024-01-13 07:21:08 -07:00
daz
a01f794d92 Add dependency-graph-continue-on-failure input param
- Translate to env var for init-script support
- Use when deciding whether to log or rethrow errors
- Add a custom error type to trigger failure in post action
2024-01-13 07:20:45 -07:00
daz
369fcc54d8 Add tests for dependency graph failures 2024-01-13 07:20:45 -07:00
daz
6523a87c8f Update supported Gradle versions to match plugin 2024-01-12 13:33:42 -07:00
daz
11693a1169 Run npm tasks concurrently 2024-01-12 07:22:24 -07:00
daz
0e6b90783e Fix dependency-graph with configuration-cache
When state is reused from the configuration cache, no dependencies are resolved.
This fix prevents the action from submitting an empty dependency graph in this case.
2024-01-11 21:25:29 -07:00
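For illustration, a sketch of the scenario this fix addresses: the same build is invoked twice with the configuration cache enabled, so the second invocation resolves no dependencies:

```yaml
- name: Setup Gradle for dependency-graph generation
  uses: gradle/gradle-build-action@v2
  with:
    dependency-graph: generate-and-submit
# First run stores configuration-cache state and produces a dependency graph
- run: ./gradlew assemble --configuration-cache
# Second run reuses the cached configuration; no dependencies are resolved,
# so the action no longer submits an empty graph
- run: ./gradlew assemble --configuration-cache
```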
Iurii Ignatko
932abbbe13 Update GE injection script for Develocity rename
- Rename all env vars expected by init-script: `s/GRADLE_ENTERPRISE/DEVELOCITY/`
- Update README for changes
- Use `Develocity` consistently in tests
2024-01-10 20:30:40 -07:00
Daz DeBoer
1a18d0b2d3 Update description of cache key 2024-01-09 20:10:49 -07:00
Daz DeBoer
7af89832c5 Further improvements to Merge Queue docs 2024-01-08 13:00:48 -07:00
Daz DeBoer
b5ebb0cc96 Document caching with merge-queue 2024-01-08 12:58:38 -07:00
daz
3a75647ad4 Remove 'gradle-executable' input param 2024-01-04 13:53:16 -07:00
daz
4dda5928c7 Update CodeQL config to latest default 2024-01-04 13:03:33 -07:00
40 changed files with 2988 additions and 704 deletions

View File

@@ -3,14 +3,18 @@ name: 'Build and upload distribution'
runs:
using: "composite"
steps:
- uses: actions/setup-node@v4
with:
node-version: 20
- name: Build distribution
shell: bash
run: |
npm -v
node -v
npm install
npm run build
- name: Upload distribution
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: dist
path: dist/

View File

@@ -6,7 +6,7 @@ runs:
steps:
- name: Download dist
if: ${{ env.DOWNLOAD_DIST == 'true' }}
uses: actions/download-artifact@v3
uses: actions/download-artifact@v4
with:
name: dist
path: dist/

View File

@@ -1,22 +1,10 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: CI-codeql
on:
push:
branches: [ main ]
branches: [ "main" ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ main ]
branches: [ "main" ]
schedule:
- cron: '25 23 * * 2'
@@ -32,9 +20,7 @@ jobs:
strategy:
fail-fast: false
matrix:
language: [ 'javascript' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Learn more about CodeQL language support at https://git.io/codeql-language-support
language: [ 'javascript-typescript' ]
steps:
- name: Checkout repository
@@ -45,26 +31,9 @@ jobs:
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# queries: ./path/to/local/query, your-org/your-repo/queries@main
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 https://git.io/JvXDl
# ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
# and modify them (or add more) to build your code if your project
# uses a compiled language
#- run: |
# make bootstrap
# make release
config: |
paths:
- src
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3

View File

@@ -17,4 +17,4 @@ jobs:
- name: 'Checkout Repository'
uses: actions/checkout@v4
- name: 'Dependency Review'
uses: actions/dependency-review-action@v3
uses: actions/dependency-review-action@v4

View File

@@ -9,6 +9,7 @@ on:
push:
branches:
- main
- release/**
paths:
- '.github/**'
- 'dist/**'
@@ -36,6 +37,11 @@ jobs:
with:
cache-key-prefix: ${{github.run_number}}-
dependency-graph-failures:
uses: ./.github/workflows/integ-test-dependency-graph-failures.yml
with:
cache-key-prefix: ${{github.run_number}}-
execution-with-caching:
uses: ./.github/workflows/integ-test-execution-with-caching.yml
with:
@@ -46,12 +52,12 @@ jobs:
with:
cache-key-prefix: ${{github.run_number}}-
gradle-enterprise-injection:
uses: ./.github/workflows/integ-test-inject-gradle-enterprise.yml
develocity-injection:
uses: ./.github/workflows/integ-test-inject-develocity.yml
with:
cache-key-prefix: ${{github.run_number}}-
secrets:
GRADLE_ENTERPRISE_ACCESS_KEY: ${{ secrets.GE_SOLUTIONS_ACCESS_TOKEN }}
DEVELOCITY_ACCESS_KEY: ${{ secrets.GE_SOLUTIONS_ACCESS_TOKEN }}
provision-gradle-versions:
uses: ./.github/workflows/integ-test-provision-gradle-versions.yml

View File

@@ -20,7 +20,7 @@ jobs:
distribution: temurin
java-version: 8
- name: Setup Gradle
uses: gradle/gradle-build-action@v2.11.0 # Use a released version to avoid breakages
uses: gradle/gradle-build-action@v2.12.0 # Use a released version to avoid breakages
- name: Run integration tests
working-directory: test/init-scripts
run: ./gradlew check

View File

@@ -3,7 +3,9 @@ name: CI-quick-check
on:
workflow_dispatch:
push:
branches-ignore: main
branches-ignore:
- main
- release/**
jobs:
build-distribution:
@@ -59,6 +61,13 @@ jobs:
runner-os: '["ubuntu-latest"]'
download-dist: true
dependency-graph-failures:
needs: build-distribution
uses: ./.github/workflows/integ-test-dependency-graph-failures.yml
with:
runner-os: '["ubuntu-latest"]'
download-dist: true
execution-with-caching:
needs: build-distribution
uses: ./.github/workflows/integ-test-execution-with-caching.yml
@@ -73,14 +82,14 @@ jobs:
runner-os: '["ubuntu-latest"]'
download-dist: true
gradle-enterprise-injection:
develocity-injection:
needs: build-distribution
uses: ./.github/workflows/integ-test-inject-gradle-enterprise.yml
uses: ./.github/workflows/integ-test-inject-develocity.yml
with:
runner-os: '["ubuntu-latest"]'
download-dist: true
secrets:
GRADLE_ENTERPRISE_ACCESS_KEY: ${{ secrets.GE_SOLUTIONS_ACCESS_TOKEN }}
DEVELOCITY_ACCESS_KEY: ${{ secrets.GE_SOLUTIONS_ACCESS_TOKEN }}
provision-gradle-versions:
needs: build-distribution

View File

@@ -8,6 +8,7 @@ on:
push:
branches:
- main
- release/**
- dependabot/**
jobs:
@@ -16,6 +17,9 @@ jobs:
steps:
- name: Checkout sources
uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 20
- name: Build
run: |
npm -v

View File

@@ -0,0 +1,103 @@
name: Test dependency graph
on:
workflow_call:
inputs:
cache-key-prefix:
type: string
runner-os:
type: string
default: '["ubuntu-latest"]'
download-dist:
type: boolean
default: false
env:
DOWNLOAD_DIST: ${{ inputs.download-dist }}
GRADLE_BUILD_ACTION_CACHE_KEY_PREFIX: dependency-graph-${{ inputs.cache-key-prefix }}
GRADLE_BUILD_ACTION_CACHE_DEBUG_ENABLED: true
jobs:
unsupported-gradle-version-warning:
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Gradle for dependency-graph generate
uses: ./
with:
gradle-version: 7.0.1
dependency-graph: generate
dependency-graph-continue-on-failure: true
- name: Run with unsupported Gradle version
working-directory: .github/workflow-samples/groovy-dsl
run: |
if gradle help | grep -q 'warning::Dependency Graph is not supported for Gradle 7.0.1. No dependency snapshot will be generated.';
then
echo "Got the expected warning"
else
echo "Did not get the expected warning"
exit 1
fi
unsupported-gradle-version-failure:
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Gradle for dependency-graph generate
uses: ./
with:
gradle-version: 7.0.1
dependency-graph: generate
dependency-graph-continue-on-failure: false
- name: Run with unsupported Gradle version
working-directory: .github/workflow-samples/groovy-dsl
run: |
if gradle help; then
echo "Expected build to fail with Gradle 7.0.1"
exit 1
fi
insufficient-permissions-warning:
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Gradle for dependency-graph generate
uses: ./
with:
dependency-graph: generate-and-submit
dependency-graph-continue-on-failure: true
- name: Run with insufficient permissions
working-directory: .github/workflow-samples/groovy-dsl
run: ./gradlew help
# This test is primarily for demonstration: it's unclear how to check for warnings emitted in the post-action
SHOULD_FAIL-insufficient-permissions-failure:
runs-on: ubuntu-latest
permissions:
contents: read
continue-on-error: true
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Gradle for dependency-graph generate
uses: ./
with:
dependency-graph: generate-and-submit
dependency-graph-continue-on-failure: false
- name: Run with insufficient permissions
working-directory: .github/workflow-samples/groovy-dsl
run: ./gradlew help
# This test is primarily for demonstration: it's unclear how to check for a failure in the post-action

View File

@@ -90,11 +90,15 @@ jobs:
- id: gradle-build
run: ./gradlew build
working-directory: .github/workflow-samples/groovy-dsl
- id: gradle-build-again
run: ./gradlew build
working-directory: .github/workflow-samples/groovy-dsl
- name: Check generated dependency graphs
shell: bash
run: |
echo "gradle-assemble report file: ${{ steps.gradle-assemble.outputs.dependency-graph-file }}"
echo "gradle-build report file: ${{ steps.gradle-build.outputs.dependency-graph-file }}"
echo "gradle-build-again report file: ${{ steps.gradle-build-again.outputs.dependency-graph-file }}"
ls -l dependency-graph-reports
if [ ! -e "${{ steps.gradle-assemble.outputs.dependency-graph-file }}" ]; then
echo "Did not find gradle-assemble dependency graph file"
@@ -104,3 +108,41 @@ jobs:
echo "Did not find gradle-build dependency graph files"
exit 1
fi
if [ ! -e "${{ steps.gradle-build-again.outputs.dependency-graph-file }}" ]; then
echo "Did not find gradle-build-again dependency graph files"
exit 1
fi
config-cache:
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Gradle for dependency-graph generate
uses: ./
with:
dependency-graph: generate-and-submit
- id: config-cache-store
run: ./gradlew assemble --configuration-cache
working-directory: .github/workflow-samples/groovy-dsl
- name: Check and delete generated dependency graph
shell: bash
run: |
if [ ! -e "${{ steps.config-cache-store.outputs.dependency-graph-file }}" ]; then
echo "Did not find config-cache-store dependency graph files"
exit 1
fi
rm ${{ steps.config-cache-store.outputs.dependency-graph-file }}
- id: config-cache-reuse
run: ./gradlew assemble --configuration-cache
working-directory: .github/workflow-samples/groovy-dsl
- name: Check no dependency graph is generated
shell: bash
run: |
if [ ! -z "$(ls -A dependency-graph-reports)" ]; then
echo "Expected no dependency graph files to be generated"
ls -l dependency-graph-reports
exit 1
fi

View File

@@ -46,12 +46,6 @@ jobs:
gradle-version: release-candidate
build-root-directory: .github/workflow-samples/no-wrapper
arguments: help
- name: Test use defined Gradle executable
uses: ./
with:
gradle-executable: .github/workflow-samples/groovy-dsl/gradlew${{ matrix.script-suffix }}
build-root-directory: .github/workflow-samples/no-wrapper
arguments: help
gradle-versions:
strategy:

View File

@@ -0,0 +1,97 @@
name: Test develocity injection
on:
workflow_call:
inputs:
cache-key-prefix:
type: string
runner-os:
type: string
default: '["ubuntu-latest", "windows-latest", "macos-latest"]'
download-dist:
type: boolean
default: false
secrets:
DEVELOCITY_ACCESS_KEY:
required: true
env:
DOWNLOAD_DIST: ${{ inputs.download-dist }}
GRADLE_BUILD_ACTION_CACHE_KEY_PREFIX: provision-gradle-versions-${{ inputs.cache-key-prefix }}
GRADLE_BUILD_ACTION_CACHE_DEBUG_ENABLED: true
jobs:
inject-develocity:
env:
DEVELOCITY_INJECTION_ENABLED: true
DEVELOCITY_URL: https://ge.solutions-team.gradle.com
DEVELOCITY_PLUGIN_VERSION: 3.16.1
DEVELOCITY_CCUD_PLUGIN_VERSION: 1.12.1
GRADLE_ENTERPRISE_ACCESS_KEY: ${{ secrets.DEVELOCITY_ACCESS_KEY }} # This env var has not (yet) been renamed/aliased in GE plugin 3.16.1
strategy:
matrix:
gradle: [current, 7.6.2, 6.9.4, 5.6.4]
os: ${{fromJSON(inputs.runner-os)}}
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: temurin
java-version: 8
- name: Setup Gradle
id: setup-gradle
uses: ./
with:
cache-read-only: false # For testing, allow writing cache entries on non-default branches
gradle-version: ${{ matrix.gradle }}
- name: Run Gradle build
id: gradle
working-directory: .github/workflow-samples/no-ge
run: gradle help
- name: Check Build Scan url
if: ${{ !steps.gradle.outputs.build-scan-url }}
uses: actions/github-script@v7
with:
script: |
core.setFailed('No Build Scan detected')
build-scan-publish:
strategy:
matrix:
gradle: [current, 7.6.2, 6.9.4, 5.6.4]
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: temurin
java-version: 8
- name: Setup Gradle
id: setup-gradle
uses: ./
with:
cache-read-only: false # For testing, allow writing cache entries on non-default branches
gradle-version: ${{ matrix.gradle }}
build-scan-publish: true
build-scan-terms-of-service-url: "https://gradle.com/terms-of-service"
build-scan-terms-of-service-agree: "yes"
- name: Run Gradle build
id: gradle
working-directory: .github/workflow-samples/no-ge
run: gradle help
- name: Check Build Scan url
if: ${{ !steps.gradle.outputs.build-scan-url }}
uses: actions/github-script@v7
with:
script: |
core.setFailed('No Build Scan detected')

View File

@@ -1,60 +0,0 @@
name: Test gradle enterprise injection
on:
workflow_call:
inputs:
cache-key-prefix:
type: string
runner-os:
type: string
default: '["ubuntu-latest", "windows-latest", "macos-latest"]'
download-dist:
type: boolean
default: false
secrets:
GRADLE_ENTERPRISE_ACCESS_KEY:
required: true
env:
DOWNLOAD_DIST: ${{ inputs.download-dist }}
GRADLE_BUILD_ACTION_CACHE_KEY_PREFIX: provision-gradle-versions-${{ inputs.cache-key-prefix }}
GRADLE_BUILD_ACTION_CACHE_DEBUG_ENABLED: true
GRADLE_ENTERPRISE_INJECTION_ENABLED: true
GRADLE_ENTERPRISE_URL: https://ge.solutions-team.gradle.com
GRADLE_ENTERPRISE_PLUGIN_VERSION: 3.16.1
GRADLE_ENTERPRISE_CCUD_PLUGIN_VERSION: 1.12.1
GRADLE_ENTERPRISE_ACCESS_KEY: ${{ secrets.GRADLE_ENTERPRISE_ACCESS_KEY }}
jobs:
inject-gradle-enterprise:
strategy:
matrix:
gradle: [current, 7.6.2, 6.9.4, 5.6.4]
os: ${{fromJSON(inputs.runner-os)}}
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: temurin
java-version: 8
- name: Setup Gradle
id: setup-gradle
uses: ./
with:
cache-read-only: false # For testing, allow writing cache entries on non-default branches
gradle-version: ${{ matrix.gradle }}
- name: Run Gradle build
id: gradle
working-directory: .github/workflow-samples/no-ge
run: gradle help
- name: Check Build Scan url
if: ${{ !steps.gradle.outputs.build-scan-url }}
uses: actions/github-script@v7
with:
script: |
core.setFailed('No Build Scan detected')

View File

@@ -33,6 +33,11 @@ jobs:
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java to ensure consistency
uses: actions/setup-java@v4
with:
distribution: 'liberica'
java-version: '21'
- name: Setup Gradle
uses: ./
with:
@@ -56,6 +61,11 @@ jobs:
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java to ensure consistency
uses: actions/setup-java@v4
with:
distribution: 'liberica'
java-version: '21'
- name: Setup Gradle
uses: ./
with:
@@ -89,6 +99,11 @@ jobs:
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java to ensure consistency
uses: actions/setup-java@v4
with:
distribution: 'liberica'
java-version: '21'
- name: Setup Gradle with no extracted cache entries restored
uses: ./
env:
@@ -113,6 +128,11 @@ jobs:
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java to ensure consistency
uses: actions/setup-java@v4
with:
distribution: 'liberica'
java-version: '21'
- name: Setup Gradle
uses: ./
with:
@@ -136,6 +156,11 @@ jobs:
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java to ensure consistency
uses: actions/setup-java@v4
with:
distribution: 'liberica'
java-version: '21'
- name: Setup Gradle
uses: ./
with:
@@ -160,6 +185,11 @@ jobs:
uses: actions/checkout@v4
- name: Download distribution if required
uses: ./.github/actions/download-dist
- name: Setup Java to ensure consistency
uses: actions/setup-java@v4
with:
distribution: 'liberica'
java-version: '21'
- name: Setup Gradle
uses: ./
with:

README.md (140 changed lines)
View File

@@ -120,7 +120,7 @@ cache-disabled: true
By default, the `gradle-build-action` will only write to the cache from Jobs on the default (`main`/`master`) branch.
Jobs on other branches will read entries from the cache but will not write updated entries.
See [Optimizing cache effectiveness](#optimizing-cache-effectiveness) for a more detailed explanation.
See [Optimizing cache effectiveness](#select-which-branches-should-write-to-the-cache) for a more detailed explanation.
In some circumstances it makes sense to change this default, and to configure a workflow Job to read existing cache entries but not to write changes back.
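For illustration, a minimal sketch of configuring such a Job to use the cache read-only, via the `cache-read-only` input referenced elsewhere in this README:

```yaml
- name: Setup Gradle
  uses: gradle/gradle-build-action@v2
  with:
    # Read existing cache entries but do not write updated entries back
    cache-read-only: true
```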
@@ -247,11 +247,11 @@ For this reason, it's very difficult to create a cache key that will determinist
The Gradle User Home cache key is composed of:
- The current operating system (`RUNNER_OS`)
- The workflow name and Job ID
- A hash of the Job matrix parameters
- The Job id
- A hash of the Job matrix parameters and the workflow name
- The git SHA for the latest commit
Specifically, the cache key is: `${cache-protocol}-gradle|${runner-os}|${workflow-name}-${job-id}[${hash-of-job-matrix}]-${git-sha}`
Specifically, the cache key is: `${cache-protocol}-gradle|${runner-os}|${job-id}[${hash-of-job-matrix-and-workflow-name}]-${git-sha}`
As such, the cache key is likely to change on each subsequent run of GitHub actions.
This allows the most recent state to always be available in the GitHub actions cache.
@@ -259,8 +259,8 @@ This allows the most recent state to always be available in the GitHub actions c
### Finding a matching cache entry
In most cases, no exact match will exist for the cache key. Instead, the Gradle User Home will be restored for the closest matching cache entry, using a set of "restore keys". The entries will be matched with the following precedence:
- An exact match on OS, workflow name, job id, matrix and Git SHA
- The most recent entry saved for the same OS, workflow name, job id and matrix values
- An exact match on OS, job id, workflow name, matrix and Git SHA
- The most recent entry saved for the same OS, job id, workflow name and matrix values
- The most recent entry saved for the same OS and job id
- The most recent entry saved for the same OS
@@ -314,20 +314,23 @@ There are some techniques that can be used to avoid/mitigate this issue:
### Select which branches should write to the cache
GitHub cache entries are not shared between builds on different branches.
This means that each PR branch will have it's own Gradle User Home cache, and will not benefit from cache entries written by other PR branches.
An exception to this is that cache entries written in parent and upstream branches are visible to child branches, and cache entries for the default (`master`/`main`) branch can be read by actions invoked for any other branch.
Workflow runs can restore caches created in either the current branch or the default branch (usually main).
This means that each branch will have its own Gradle User Home cache scope, and will not benefit from cache entries written for other (non-default) branches.
By default, the `gradle-build-action` will only _write_ to the cache for builds run on the default (`master`/`main`) branch.
Jobs run on other branches will only read from the cache. In most cases, this is the desired behaviour,
because Jobs run against other branches will benefit from the cache Gradle User Home from `main`,
without writing private cache entries that could lead to evicting shared entries.
Jobs run on other branches will only read from the cache. In most cases, this is the desired behavior.
This is because Jobs run on other branches will benefit from the cached Gradle User Home from `main`,
without writing private cache entries which could lead to evicting these shared entries.
If you have other long-lived development branches that would benefit from writing to the cache,
you can configure these by overriding the `cache-read-only` action parameter.
you can configure this by disabling the `cache-read-only` action parameter for these branches.
See [Using the cache read-only](#using-the-cache-read-only) for more details.
Similarly, you could use `cache-read-only` for certain jobs in the workflow, and instead have these jobs reuse the cache content from upstream jobs.
Note there are some cases where writing cache entries is typically unhelpful (these are disabled by default):
- For `pull_request` triggered runs, the cache scope is limited to the merge ref (`refs/pull/.../merge`) and can only be restored by re-runs of the same pull request.
- For `merge_group` triggered runs, the cache scope is limited to a temporary branch with a special prefix created to validate pull request changes, and won't be available on subsequent Merge Queue executions.
### Exclude content from Gradle User Home cache
As well as any wrapper distributions, the action will attempt to save and restore the `caches` and `notifications` directories from Gradle User Home.
@@ -473,9 +476,10 @@ You enable GitHub Dependency Graph support by setting the `dependency-graph` act
| Option | Behaviour |
| --- | --- |
| `disabled` | Do not generate a dependency graph for any build invocations.<p>This is the default. |
| `generate` | Generate a dependency graph snapshot for each build invocation, saving as a workflow artifact. |
| `generate-and-submit` | As per `generate`, but any generated dependency graph snapshots will be submitted at the end of the job. |
| `download-and-submit` | Download any previously saved dependency graph snapshots, submitting them via the Dependency Submission API. This can be useful to collect all snapshots in a matrix of builds and submit them in one step. |
| `generate` | Generate a dependency graph snapshot for each build invocation. |
| `generate-and-submit` | Generate a dependency graph snapshot for each build invocation, and submit these via the Dependency Submission API on completion of the job. |
| `generate-and-upload` | Generate a dependency graph snapshot for each build invocation, saving as a workflow artifact. |
| `download-and-submit` | Download any previously saved dependency graph snapshots, and submit them via the Dependency Submission API. This can be useful to submit [dependency graphs for pull requests submitted from a repository forks](#dependency-graphs-for-pull-request-workflows). |
Example of a CI workflow that generates and submits a dependency graph:
```yaml
@@ -509,6 +513,22 @@ Depending on [repository settings](https://docs.github.com/en/actions/security-g
> for a PR submitted from a forked repository.
> For a configuration that supports this setup, see [Dependency Graphs for pull request workflows](#dependency-graphs-for-pull-request-workflows).
### Making dependency graph failures cause Job failures
By default, if a failure is encountered when generating or submitting the dependency graph, the action will log the failure as a warning and continue.
This allows your workflow to be resilient to dependency graph failures, in case dependency graph production is a side-effect rather than the primary purpose of a workflow.
If instead you have a workflow whose primary purpose is to generate and submit a dependency graph, then it makes sense for this workflow to fail if the dependency
graph cannot be generated or submitted. You can enable this behaviour by setting the `dependency-graph-continue-on-failure` parameter to `false` (it defaults to `true`).
```yaml
# Ensure that the workflow Job will fail if the dependency graph cannot be submitted
- uses: gradle/gradle-build-action@v3
with:
dependency-graph: generate-and-submit
dependency-graph-continue-on-failure: false
```
### Using a custom plugin repository
By default, the action downloads the `github-dependency-graph-gradle-plugin` from the Gradle Plugin Portal (https://plugins.gradle.org). If your GitHub Actions environment does not have access to this URL, you can specify a custom plugin repository to use.
@@ -568,7 +588,7 @@ jobs:
needs: build
runs-on: ubuntu-latest
- name: Perform dependency review
uses: actions/dependency-review-action@v3
uses: actions/dependency-review-action@v4
```
See [Dependency Graphs for pull request workflows](#dependency-graphs-for-pull-request-workflows) for a more complex
@@ -685,6 +705,9 @@ name: run-build-and-generate-dependency-snapshot
on:
pull_request:
permissions:
contents: read
jobs:
build:
runs-on: ubuntu-latest
@@ -693,7 +716,7 @@ jobs:
- name: Setup Gradle to generate and submit dependency graphs
uses: gradle/gradle-build-action@v2
with:
dependency-graph: generate # Only generate in this job
dependency-graph: generate-and-upload # Generate graphs and save as workflow artifacts
- name: Run a build, generating the dependency graph snapshot which will be submitted
run: ./gradlew build
```
@@ -707,6 +730,9 @@ on:
workflows: ['run-build-and-generate-dependency-snapshot']
types: [completed]
permissions:
contents: write
jobs:
submit-dependency-graph:
runs-on: ubuntu-latest
@@ -714,7 +740,7 @@ jobs:
- name: Retrieve dependency graph artifact and submit
uses: gradle/gradle-build-action@v2
with:
dependency-graph: download-and-submit
dependency-graph: download-and-submit # Download saved workflow artifacts and submit
```
### Integrating `dependency-review-action` for pull request workflows
@@ -742,7 +768,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: 'Dependency Review'
uses: actions/dependency-review-action@v3
uses: actions/dependency-review-action@v4
with:
retry-on-snapshot-warnings: true
retry-on-snapshot-warnings-timeout: 600
@@ -780,27 +806,26 @@ To reduce storage costs for these artifacts, you can set the `artifact-retention
# Gradle Enterprise plugin injection
# Develocity plugin injection
The `gradle-build-action` provides support for injecting and configuring the Gradle Enterprise Gradle plugin into any Gradle build, without any modification to the project sources.
The `gradle-build-action` provides support for injecting and configuring the Develocity Gradle plugin into any Gradle build, without any modification to the project sources.
This is achieved via an init-script installed into Gradle User Home, which is enabled and parameterized via environment variables.
The same auto-injection behavior is available for the Common Custom User Data Gradle plugin, which enriches any build scans published with additional useful information.
## Enabling Gradle Enterprise injection
## Enabling Develocity injection
In order to enable Gradle Enterprise for your build, you must provide the required configuration via environment variables.
In order to enable Develocity injection for your build, you must provide the required configuration via environment variables.
Here's a minimal example:
```yaml
name: Run build with Gradle Enterprise injection
name: Run build with Develocity injection
env:
GRADLE_ENTERPRISE_INJECTION_ENABLED: true
GRADLE_ENTERPRISE_URL: https://ge.gradle.org
GRADLE_ENTERPRISE_PLUGIN_VERSION: 3.16.1
GRADLE_ENTERPRISE_ACCESS_KEY: ${{ secrets.GE_ACCESS_KEY }} # Required to publish scans to ge.gradle.org
DEVELOCITY_INJECTION_ENABLED: true
DEVELOCITY_URL: https://develocity.your-server.com
DEVELOCITY_PLUGIN_VERSION: 3.16.1
jobs:
build:
@@ -809,36 +834,53 @@ jobs:
- uses: actions/checkout@v4
- name: Setup Gradle
uses: gradle/gradle-build-action@v2
- name: Run a Gradle build with Gradle Enterprise injection enabled
- name: Run a Gradle build with Develocity injection enabled
run: ./gradlew build
```
This configuration will automatically apply `v3.16.1` of the [Gradle Enterprise Gradle plugin](https://docs.gradle.com/enterprise/gradle-plugin/), and publish build scans to https://ge.gradle.org.
This configuration will automatically apply `v3.16.1` of the [Develocity Gradle plugin](https://docs.gradle.com/enterprise/gradle-plugin/), and publish build scans to https://develocity.your-server.com.
Note that the `ge.gradle.org` server requires authentication in order to publish scans. The provided `GRADLE_ENTERPRISE_ACCESS_KEY` isn't required by the Gradle Enterprise injection script,
but will be used by the GE plugin in order to authenticate with the server.
This example assumes that the `develocity.your-server.com` server allows anonymous publishing of build scans.
In the likely scenario that your Develocity server requires authentication, you will also need to configure an additional environment variable
with a valid [Develocity access key](https://docs.gradle.com/enterprise/gradle-plugin/#via_environment_variable).
## Configuring Gradle Enterprise injection
## Configuring Develocity injection
The `init-script` supports a number of additional configuration parameters that you may find useful. All configuration options (required and optional) are detailed below:
| Variable | Required | Description |
| --- | --- | --- |
| GRADLE_ENTERPRISE_INJECTION_ENABLED | :white_check_mark: | enables Gradle Enterprise injection |
| GRADLE_ENTERPRISE_URL | :white_check_mark: | the URL of the Gradle Enterprise server |
| GRADLE_ENTERPRISE_ALLOW_UNTRUSTED_SERVER | | allow communication with an untrusted server; set to _true_ if your Gradle Enterprise instance is using a self-signed certificate |
| GRADLE_ENTERPRISE_ENFORCE_URL | | enforce the configured Gradle Enterprise URL over a URL configured in the project's build; set to _true_ to enforce publication of build scans to the configured Gradle Enterprise URL |
| GRADLE_ENTERPRISE_PLUGIN_VERSION | :white_check_mark: | the version of the [Gradle Enterprise Gradle plugin](https://docs.gradle.com/enterprise/gradle-plugin/) to apply |
| GRADLE_ENTERPRISE_CCUD_PLUGIN_VERSION | | the version of the [Common Custom User Data Gradle plugin](https://github.com/gradle/common-custom-user-data-gradle-plugin) to apply, if any |
| GRADLE_ENTERPRISE_PLUGIN_REPOSITORY_URL | | the URL of the repository to use when resolving the GE and CCUD plugins; the Gradle Plugin Portal is used by default |
| Variable | Required | Description |
|-----------------------------------| --- | --- |
| DEVELOCITY_INJECTION_ENABLED | :white_check_mark: | enables Develocity injection |
| DEVELOCITY_URL | :white_check_mark: | the URL of the Develocity server |
| DEVELOCITY_ALLOW_UNTRUSTED_SERVER | | allow communication with an untrusted server; set to _true_ if your Develocity instance is using a self-signed certificate |
| DEVELOCITY_ENFORCE_URL | | enforce the configured Develocity URL over a URL configured in the project's build; set to _true_ to enforce publication of build scans to the configured Develocity URL |
| DEVELOCITY_PLUGIN_VERSION | :white_check_mark: | the version of the [Develocity Gradle plugin](https://docs.gradle.com/enterprise/gradle-plugin/) to apply |
| DEVELOCITY_CCUD_PLUGIN_VERSION | | the version of the [Common Custom User Data Gradle plugin](https://github.com/gradle/common-custom-user-data-gradle-plugin) to apply, if any |
| GRADLE_PLUGIN_REPOSITORY_URL | | the URL of the repository to use when resolving the Develocity and CCUD plugins; the Gradle Plugin Portal is used by default |
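For illustration, a minimal sketch combining the optional variables above; the server URL and plugin versions are placeholders, and the self-signed-certificate assumption is only for the example:

```yaml
env:
  DEVELOCITY_INJECTION_ENABLED: true
  DEVELOCITY_URL: https://develocity.example.com
  DEVELOCITY_ALLOW_UNTRUSTED_SERVER: true   # example assumes a self-signed certificate
  DEVELOCITY_ENFORCE_URL: true              # override any URL configured in the build itself
  DEVELOCITY_PLUGIN_VERSION: 3.16.1
  DEVELOCITY_CCUD_PLUGIN_VERSION: 1.12.1
```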
## Publishing to scans.gradle.com
Gradle Enterprise injection is designed to enable publishing of build scans to a Gradle Enterprise instance,
and is not suitable for publishing to the public Build Scans instance (https://scans.gradle.com).
Develocity injection is designed to enable publishing of build scans to a Develocity instance,
but is also useful for publishing to the public Build Scans instance (https://scans.gradle.com).
In order to publish Build Scans to scans.gradle.com, you need to:
- Apply the Gradle Enterprise plugin to your build configuration ([see docs](https://docs.gradle.com/enterprise/get-started/#applying_the_plugin))
- Programmatically accept the Terms of Service for scans.gradle.com ([see docs](https://docs.gradle.com/enterprise/gradle-plugin/#connecting_to_scans_gradle_com))
- Execute the build with `--scan` or configure your build with `publishAlways()` ([see docs](https://docs.gradle.com/enterprise/get-started/#always_publishing_a_build_scan))
To publish to https://scans.gradle.com, you must specify in your workflow that you accept the [Gradle Terms of Service](https://gradle.com/terms-of-service).
```yaml
name: Run build and publish Build Scan
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Gradle to publish build scans
uses: gradle/gradle-build-action@v2
with:
build-scan-publish: true
build-scan-terms-of-service-url: "https://gradle.com/terms-of-service"
build-scan-terms-of-service-agree: "yes"
- name: Run a Gradle build - a build scan will be published automatically
run: ./gradlew build
```

View File

@@ -69,14 +69,34 @@ inputs:
default: 'never'
dependency-graph:
description: Specifies if a GitHub dependency snapshot should be generated for each Gradle build, and if so, how. Valid values are 'disabled' (default), 'generate', 'generate-and-submit', 'generate-and-upload' and 'download-and-submit'.
description: Specifies if a GitHub dependency snapshot should be generated for each Gradle build, and if so, how. Valid values are 'disabled' (default), 'generate', 'generate-and-submit', 'generate-and-upload', 'download-and-submit' and 'clear'.
required: false
default: 'disabled'
dependency-graph-continue-on-failure:
description: When 'false' a failure to generate or submit a dependency graph will fail the Step or Job. When 'true' a warning will be emitted but no failure will result.
required: false
default: true
artifact-retention-days:
description: Specifies the number of days to retain any artifacts generated by the action. If not set, the default retention settings for the repository will apply.
required: false
build-scan-publish:
description: |
Set to 'true' to automatically publish build results as a Build Scan on scans.gradle.com.
For publication to succeed without user input, you must also provide values for `build-scan-terms-of-service-url` and `build-scan-terms-of-service-agree`.
required: false
default: false
build-scan-terms-of-service-url:
description: The URL to the Build Scan® terms of service. This input must be set to 'https://gradle.com/terms-of-service'.
required: false
build-scan-terms-of-service-agree:
description: Indicate that you agree to the Build Scan® terms of service. This input value must be "yes".
required: false
# DEPRECATED ACTION INPUTS
arguments:
description: Gradle command line arguments (supports multi-line input)
@@ -88,11 +108,6 @@ inputs:
required: false
deprecation-message: Using the action to execute Gradle directly is deprecated in favor of using the action to setup Gradle, and executing Gradle in a subsequent Step. See https://github.com/gradle/gradle-build-action?tab=readme-ov-file#use-the-action-to-setup-gradle.
gradle-executable:
description: Path to the Gradle executable. If specified, this executable will be added to the PATH and used for invoking Gradle.
required: false
deprecation-message: Using the action to execute Gradle directly is deprecated in favor of using the action to setup Gradle, and executing Gradle in a subsequent Step. See https://github.com/gradle/gradle-build-action?tab=readme-ov-file#use-the-action-to-setup-gradle.
generate-job-summary:
description: When 'false', no Job Summary will be generated for the Job.
required: false

View File

@@ -1,24 +0,0 @@
name: 'Clear dependency graph for a correlator'
inputs:
job-correlator:
required: true
runs:
using: "composite"
steps:
- name: Set current timestamp as env variable
shell: bash
run: echo "NOW=$(date -Iseconds)" >> $GITHUB_ENV
- name: Submit empty dependency graph
shell: bash
run: |
curl -L \
-X POST \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ github.token }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
https://api.github.com/repos/${{ github.repository }}/dependency-graph/snapshots \
-d '{ "version" : 0, "job" : { "id" : "${{ github.run_id }}", "correlator" : "${{ inputs.job-correlator }} " }, "sha" : "${{ github.sha }}", "ref" : "${{ github.ref }}", "detector" : { "name" : "GitHub Dependency Graph Gradle Plugin", "version" : "0.0.3", "url" : "https://github.com/gradle/github-dependency-graph-gradle-plugin" }, "manifests" : {}, "scanned" : "${{ env.NOW }}" }'
- run: echo "::notice ::Cleared dependency graph for job correlator '${{ inputs.job-correlator }}'"
shell: bash

dist/main/index.js (vendored, 583 changed lines)
View File

@@ -824,7 +824,7 @@ __exportStar(__nccwpck_require__(63077), exports);
"use strict";
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.ArtifactService = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "results/api/v1/artifact.proto" (package "github.actions.results.api.v1", syntax proto3)
// tslint:disable
@@ -1400,6 +1400,121 @@ class GetSignedArtifactURLResponse$Type extends runtime_5.MessageType {
* @generated MessageType for protobuf message github.actions.results.api.v1.GetSignedArtifactURLResponse
*/
exports.GetSignedArtifactURLResponse = new GetSignedArtifactURLResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class DeleteArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.DeleteArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "workflow_job_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value) {
const message = { workflowRunBackendId: "", workflowJobRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string workflow_job_run_backend_id */ 2:
message.workflowJobRunBackendId = reader.string();
break;
case /* string name */ 3:
message.name = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string workflow_job_run_backend_id = 2; */
if (message.workflowJobRunBackendId !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.workflowJobRunBackendId);
/* string name = 3; */
if (message.name !== "")
writer.tag(3, runtime_1.WireType.LengthDelimited).string(message.name);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.DeleteArtifactRequest
*/
exports.DeleteArtifactRequest = new DeleteArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class DeleteArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.DeleteArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, runtime_1.WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.DeleteArtifactResponse
*/
exports.DeleteArtifactResponse = new DeleteArtifactResponse$Type();
/**
* @generated ServiceType for protobuf service github.actions.results.api.v1.ArtifactService
*/
@@ -1407,7 +1522,8 @@ exports.ArtifactService = new runtime_rpc_1.ServiceType("github.actions.results.
{ name: "CreateArtifact", options: {}, I: exports.CreateArtifactRequest, O: exports.CreateArtifactResponse },
{ name: "FinalizeArtifact", options: {}, I: exports.FinalizeArtifactRequest, O: exports.FinalizeArtifactResponse },
{ name: "ListArtifacts", options: {}, I: exports.ListArtifactsRequest, O: exports.ListArtifactsResponse },
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse }
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse },
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse }
]);
//# sourceMappingURL=artifact.js.map
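
The generated DeleteArtifactRequest/DeleteArtifactResponse types above expose the usual protobuf-ts MessageType surface (create, toBinary/fromBinary, toJson/fromJson). A minimal, illustrative round-trip sketch follows; the require path and the literal values are assumptions for illustration, not part of this diff:

// Illustrative only: the require path below is an assumption.
const { DeleteArtifactRequest, DeleteArtifactResponse } = require('@actions/artifact/lib/generated');

const req = DeleteArtifactRequest.create({
    workflowRunBackendId: 'run-backend-id',      // placeholder values
    workflowJobRunBackendId: 'job-backend-id',
    name: 'my-artifact'
});
const bytes = DeleteArtifactRequest.toBinary(req);            // string fields 1-3 written length-delimited
const roundTripped = DeleteArtifactRequest.fromBinary(bytes);

// int64 artifact_id is surfaced as a string because the code was generated with long_type_string.
const res = DeleteArtifactResponse.fromJson({ ok: true, artifact_id: '123' }, { ignoreUnknownFields: true });
console.log(roundTripped.name, res.artifactId); // 'my-artifact' '123'
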
@@ -1438,6 +1554,7 @@ class ArtifactServiceClientJSON {
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toJson(request, {
@@ -1477,6 +1594,16 @@ class ArtifactServiceClientJSON {
ignoreUnknownFields: true,
}));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/json", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
}
exports.ArtifactServiceClientJSON = ArtifactServiceClientJSON;
class ArtifactServiceClientProtobuf {
@@ -1486,6 +1613,7 @@ class ArtifactServiceClientProtobuf {
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toBinary(request);
@@ -1507,6 +1635,11 @@ class ArtifactServiceClientProtobuf {
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/protobuf", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromBinary(data));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromBinary(data));
}
}
exports.ArtifactServiceClientProtobuf = ArtifactServiceClientProtobuf;
var ArtifactServiceMethod;
@@ -1515,12 +1648,14 @@ var ArtifactServiceMethod;
ArtifactServiceMethod["FinalizeArtifact"] = "FinalizeArtifact";
ArtifactServiceMethod["ListArtifacts"] = "ListArtifacts";
ArtifactServiceMethod["GetSignedArtifactURL"] = "GetSignedArtifactURL";
ArtifactServiceMethod["DeleteArtifact"] = "DeleteArtifact";
})(ArtifactServiceMethod || (exports.ArtifactServiceMethod = ArtifactServiceMethod = {}));
exports.ArtifactServiceMethodList = [
ArtifactServiceMethod.CreateArtifact,
ArtifactServiceMethod.FinalizeArtifact,
ArtifactServiceMethod.ListArtifacts,
ArtifactServiceMethod.GetSignedArtifactURL,
ArtifactServiceMethod.DeleteArtifact,
];
function createArtifactServiceServer(service) {
return new twirp_ts_1.TwirpServer({
@@ -1558,6 +1693,12 @@ function matchArtifactServiceRoute(method, events) {
yield events.onMatch(ctx);
return handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, interceptors);
});
case "DeleteArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "DeleteArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors);
});
default:
events.onNotFound();
const msg = `no handler found`;
@@ -1608,6 +1749,17 @@ function handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, in
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceCreateArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
@@ -1732,6 +1884,37 @@ function handleArtifactServiceGetSignedArtifactURLJSON(ctx, service, data, inter
}));
});
}
function handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.DeleteArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return JSON.stringify(artifact_1.DeleteArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceCreateArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
@@ -1832,6 +2015,31 @@ function handleArtifactServiceGetSignedArtifactURLProtobuf(ctx, service, data, i
return Buffer.from(artifact_1.GetSignedArtifactURLResponse.toBinary(response));
});
}
function handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.DeleteArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return Buffer.from(artifact_1.DeleteArtifactResponse.toBinary(response));
});
}
//# sourceMappingURL=artifact.twirp.js.map
/***/ }),
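
The JSON and protobuf clients above only require an rpc object exposing request(service, method, contentType, data). A rough sketch of such a transport, assuming the conventional Twirp route layout and a fetch-capable Node runtime; the URL scheme, auth header, and class name are illustrative assumptions, not code from this diff:

// Sketch of the Rpc shape ArtifactServiceClientJSON/Protobuf expect; transport details are assumptions.
class FetchRpc {
    constructor(baseUrl, token) {
        this.baseUrl = baseUrl; // e.g. the results-service endpoint carried in the runtime token (assumed)
        this.token = token;
    }
    async request(service, method, contentType, data) {
        const res = await fetch(`${this.baseUrl}/twirp/${service}/${method}`, {
            method: 'POST',
            headers: { 'Content-Type': contentType, Authorization: `Bearer ${this.token}` },
            body: contentType === 'application/json' ? JSON.stringify(data) : data
        });
        if (!res.ok) throw new Error(`${method} failed with HTTP ${res.status}`);
        // JSON clients feed this result into fromJson(); protobuf clients into fromBinary().
        return contentType === 'application/json'
            ? res.json()
            : new Uint8Array(await res.arrayBuffer());
    }
}

// const client = new ArtifactServiceClientJSON(new FetchRpc(baseUrl, token));
// await client.DeleteArtifact({ workflowRunBackendId, workflowJobRunBackendId, name });
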
@@ -1867,6 +2075,7 @@ const core_1 = __nccwpck_require__(42186);
const config_1 = __nccwpck_require__(74610);
const upload_artifact_1 = __nccwpck_require__(42578);
const download_artifact_1 = __nccwpck_require__(73555);
const delete_artifact_1 = __nccwpck_require__(70071);
const get_artifact_1 = __nccwpck_require__(29491);
const list_artifacts_1 = __nccwpck_require__(44141);
const errors_1 = __nccwpck_require__(38182);
@@ -1953,6 +2162,28 @@ If the error persists, please check whether Actions and API requests are operati
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`);
throw error;
}
});
}
deleteArtifact(artifactName, options) {
return __awaiter(this, void 0, void 0, function* () {
try {
if ((0, config_1.isGhes)()) {
throw new errors_1.GHESNotSupportedError();
}
if (options === null || options === void 0 ? void 0 : options.findBy) {
const { findBy: { repositoryOwner, repositoryName, workflowRunId, token } } = options;
return (0, delete_artifact_1.deleteArtifactPublic)(artifactName, workflowRunId, repositoryOwner, repositoryName, token);
}
return (0, delete_artifact_1.deleteArtifactInternal)(artifactName);
}
catch (error) {
(0, core_1.warning)(`Delete Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`);
throw error;
}
@@ -1964,6 +2195,96 @@ exports.DefaultArtifactClient = DefaultArtifactClient;
/***/ }),
/***/ 70071:
/***/ (function(__unused_webpack_module, exports, __nccwpck_require__) {
"use strict";
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
return new (P || (P = Promise))(function (resolve, reject) {
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.deleteArtifactInternal = exports.deleteArtifactPublic = void 0;
const core_1 = __nccwpck_require__(42186);
const github_1 = __nccwpck_require__(21260);
const user_agent_1 = __nccwpck_require__(85164);
const retry_options_1 = __nccwpck_require__(64597);
const utils_1 = __nccwpck_require__(58154);
const plugin_request_log_1 = __nccwpck_require__(68883);
const plugin_retry_1 = __nccwpck_require__(86298);
const artifact_twirp_client_1 = __nccwpck_require__(12312);
const util_1 = __nccwpck_require__(63062);
const generated_1 = __nccwpck_require__(49960);
const get_artifact_1 = __nccwpck_require__(29491);
const errors_1 = __nccwpck_require__(38182);
function deleteArtifactPublic(artifactName, workflowRunId, repositoryOwner, repositoryName, token) {
var _a;
return __awaiter(this, void 0, void 0, function* () {
const [retryOpts, requestOpts] = (0, retry_options_1.getRetryOptions)(utils_1.defaults);
const opts = {
log: undefined,
userAgent: (0, user_agent_1.getUserAgentString)(),
previews: undefined,
retry: retryOpts,
request: requestOpts
};
const github = (0, github_1.getOctokit)(token, opts, plugin_retry_1.retry, plugin_request_log_1.requestLog);
const getArtifactResp = yield (0, get_artifact_1.getArtifactPublic)(artifactName, workflowRunId, repositoryOwner, repositoryName, token);
const deleteArtifactResp = yield github.rest.actions.deleteArtifact({
owner: repositoryOwner,
repo: repositoryName,
artifact_id: getArtifactResp.artifact.id
});
if (deleteArtifactResp.status !== 204) {
throw new errors_1.InvalidResponseError(`Invalid response from GitHub API: ${deleteArtifactResp.status} (${(_a = deleteArtifactResp === null || deleteArtifactResp === void 0 ? void 0 : deleteArtifactResp.headers) === null || _a === void 0 ? void 0 : _a['x-github-request-id']})`);
}
return {
id: getArtifactResp.artifact.id
};
});
}
exports.deleteArtifactPublic = deleteArtifactPublic;
function deleteArtifactInternal(artifactName) {
return __awaiter(this, void 0, void 0, function* () {
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
const listReq = {
workflowRunBackendId,
workflowJobRunBackendId,
nameFilter: generated_1.StringValue.create({ value: artifactName })
};
const listRes = yield artifactClient.ListArtifacts(listReq);
if (listRes.artifacts.length === 0) {
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}`);
}
let artifact = listRes.artifacts[0];
if (listRes.artifacts.length > 1) {
artifact = listRes.artifacts.sort((a, b) => Number(b.databaseId) - Number(a.databaseId))[0];
(0, core_1.debug)(`More than one artifact found for a single name, returning newest (id: ${artifact.databaseId})`);
}
const req = {
workflowRunBackendId: artifact.workflowRunBackendId,
workflowJobRunBackendId: artifact.workflowJobRunBackendId,
name: artifact.name
};
const res = yield artifactClient.DeleteArtifact(req);
(0, core_1.info)(`Artifact '${artifactName}' (ID: ${res.artifactId}) deleted`);
return {
id: Number(res.artifactId)
};
});
}
exports.deleteArtifactInternal = deleteArtifactInternal;
//# sourceMappingURL=delete-artifact.js.map
/***/ }),
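
From the consumer side, the new deleteArtifact method on the client wraps the two paths above: the Twirp DeleteArtifact call for the current run, and the GitHub REST API when findBy targets another run. A usage sketch; the import specifier, token source, and run id are placeholders/assumptions:

const { DefaultArtifactClient } = require('@actions/artifact'); // import path assumed

async function cleanUp() {
    const client = new DefaultArtifactClient();

    // Current run: resolves the newest artifact with this name, then calls DeleteArtifact over Twirp.
    const { id } = await client.deleteArtifact('build-reports');
    console.log(`deleted artifact ${id}`);

    // Another run: findBy switches to the REST API (actions.deleteArtifact).
    await client.deleteArtifact('build-reports', {
        findBy: {
            token: process.env.GITHUB_TOKEN,
            workflowRunId: 1234567890,       // hypothetical run id
            repositoryOwner: 'octo-org',     // hypothetical owner/repo
            repositoryName: 'octo-repo'
        }
    });
}
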
/***/ 73555:
/***/ (function(__unused_webpack_module, exports, __nccwpck_require__) {
@@ -2005,7 +2326,7 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.downloadArtifactInternal = exports.downloadArtifactPublic = void 0;
exports.downloadArtifactInternal = exports.downloadArtifactPublic = exports.streamExtractExternal = void 0;
const promises_1 = __importDefault(__nccwpck_require__(73292));
const github = __importStar(__nccwpck_require__(21260));
const core = __importStar(__nccwpck_require__(42186));
@@ -2039,20 +2360,57 @@ function exists(path) {
});
}
function streamExtract(url, directory) {
return __awaiter(this, void 0, void 0, function* () {
let retryCount = 0;
while (retryCount < 5) {
try {
yield streamExtractExternal(url, directory);
return;
}
catch (error) {
retryCount++;
core.debug(`Failed to download artifact after ${retryCount} retries due to ${error.message}. Retrying in 5 seconds...`);
// wait 5 seconds before retrying
yield new Promise(resolve => setTimeout(resolve, 5000));
}
}
throw new Error(`Artifact download failed after ${retryCount} retries.`);
});
}
function streamExtractExternal(url, directory) {
return __awaiter(this, void 0, void 0, function* () {
const client = new httpClient.HttpClient((0, user_agent_1.getUserAgentString)());
const response = yield client.get(url);
if (response.message.statusCode !== 200) {
throw new Error(`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`);
}
const timeout = 30 * 1000; // 30 seconds
return new Promise((resolve, reject) => {
const timerFn = () => {
response.message.destroy(new Error(`Blob storage chunk did not respond in ${timeout}ms`));
};
const timer = setTimeout(timerFn, timeout);
response.message
.on('data', () => {
timer.refresh();
})
.on('error', (error) => {
core.debug(`response.message: Artifact download failed: ${error.message}`);
clearTimeout(timer);
reject(error);
})
.pipe(unzip_stream_1.default.Extract({ path: directory }))
.on('close', resolve)
.on('error', reject);
.on('close', () => {
clearTimeout(timer);
resolve();
})
.on('error', (error) => {
reject(error);
});
});
});
}
exports.streamExtractExternal = streamExtractExternal;
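
streamExtractExternal above guards the download with an idle timeout: a 30-second timer that every 'data' chunk refreshes, destroying the response stream if the blob store stalls. The same pattern in isolation; the helper name and error text are illustrative, not from the vendored code:

// Illustrative helper only; not part of the vendored code above.
function withIdleTimeout(stream, ms) {
    const timer = setTimeout(() => stream.destroy(new Error(`no data received for ${ms}ms`)), ms);
    stream
        .on('data', () => timer.refresh())        // any progress pushes the deadline out
        .on('close', () => clearTimeout(timer))
        .on('error', () => clearTimeout(timer));
    return stream;
}
// withIdleTimeout(response.message, 30 * 1000).pipe(unzipStream.Extract({ path: directory }));
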
function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, token, options) {
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
@@ -2211,7 +2569,9 @@ function getArtifactPublic(artifactName, workflowRunId, repositoryOwner, reposit
throw new errors_1.InvalidResponseError(`Invalid response from GitHub API: ${getArtifactResp.status} (${(_a = getArtifactResp === null || getArtifactResp === void 0 ? void 0 : getArtifactResp.headers) === null || _a === void 0 ? void 0 : _a['x-github-request-id']})`);
}
if (getArtifactResp.data.artifacts.length === 0) {
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}`);
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`);
}
let artifact = getArtifactResp.data.artifacts[0];
if (getArtifactResp.data.artifacts.length > 1) {
@@ -2240,7 +2600,9 @@ function getArtifactInternal(artifactName) {
};
const res = yield artifactClient.ListArtifacts(req);
if (res.artifacts.length === 0) {
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}`);
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`);
}
let artifact = res.artifacts[0];
if (res.artifacts.length > 1) {
@@ -6377,7 +6739,8 @@ function createHttpClient() {
return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions());
}
function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) {
const components = paths;
// don't pass changes upstream
const components = paths.slice();
// Add compression method to cache version to restore
// compressed cache as per compression method
if (compressionMethod) {
@@ -6666,26 +7029,21 @@ function resolvePaths(patterns) {
implicitDescendants: false
});
try {
for (var _e = true, _f = __asyncValues(globber.globGenerator()), _g; _g = yield _f.next(), _a = _g.done, !_a;) {
for (var _e = true, _f = __asyncValues(globber.globGenerator()), _g; _g = yield _f.next(), _a = _g.done, !_a; _e = true) {
_c = _g.value;
_e = false;
try {
const file = _c;
const relativeFile = path
.relative(workspace, file)
.replace(new RegExp(`\\${path.sep}`, 'g'), '/');
core.debug(`Matched: ${relativeFile}`);
// Paths are made relative so the tar entries are all relative to the root of the workspace.
if (relativeFile === '') {
// path.relative returns empty string if workspace and file are equal
paths.push('.');
}
else {
paths.push(`${relativeFile}`);
}
const file = _c;
const relativeFile = path
.relative(workspace, file)
.replace(new RegExp(`\\${path.sep}`, 'g'), '/');
core.debug(`Matched: ${relativeFile}`);
// Paths are made relative so the tar entries are all relative to the root of the workspace.
if (relativeFile === '') {
// path.relative returns empty string if workspace and file are equal
paths.push('.');
}
finally {
_e = true;
else {
paths.push(`${relativeFile}`);
}
}
}
@@ -6787,7 +7145,7 @@ var CacheFilename;
(function (CacheFilename) {
CacheFilename["Gzip"] = "cache.tgz";
CacheFilename["Zstd"] = "cache.tzst";
})(CacheFilename = exports.CacheFilename || (exports.CacheFilename = {}));
})(CacheFilename || (exports.CacheFilename = CacheFilename = {}));
var CompressionMethod;
(function (CompressionMethod) {
CompressionMethod["Gzip"] = "gzip";
@@ -6795,12 +7153,12 @@ var CompressionMethod;
// This enum is for earlier version of zstd that does not have --long support
CompressionMethod["ZstdWithoutLong"] = "zstd-without-long";
CompressionMethod["Zstd"] = "zstd";
})(CompressionMethod = exports.CompressionMethod || (exports.CompressionMethod = {}));
})(CompressionMethod || (exports.CompressionMethod = CompressionMethod = {}));
var ArchiveToolType;
(function (ArchiveToolType) {
ArchiveToolType["GNU"] = "gnu";
ArchiveToolType["BSD"] = "bsd";
})(ArchiveToolType = exports.ArchiveToolType || (exports.ArchiveToolType = {}));
})(ArchiveToolType || (exports.ArchiveToolType = ArchiveToolType = {}));
// The default number of retry attempts.
exports.DefaultRetryAttempts = 2;
// The default delay in milliseconds between retry attempts.
@@ -138544,6 +138902,65 @@ function loadBuildResults() {
exports.loadBuildResults = loadBuildResults;
/***/ }),
/***/ 85772:
/***/ (function(__unused_webpack_module, exports, __nccwpck_require__) {
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
__setModuleDefault(result, mod);
return result;
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.setup = void 0;
const core = __importStar(__nccwpck_require__(42186));
const input_params_1 = __nccwpck_require__(23885);
function setup() {
if ((0, input_params_1.getBuildScanPublishEnabled)() && verifyTermsOfServiceAgreement()) {
maybeExportVariable('DEVELOCITY_INJECTION_ENABLED', 'true');
maybeExportVariable('DEVELOCITY_PLUGIN_VERSION', '3.16.1');
maybeExportVariable('DEVELOCITY_CCUD_PLUGIN_VERSION', '1.12.1');
maybeExportVariable('BUILD_SCAN_TERMS_OF_SERVICE_URL', (0, input_params_1.getBuildScanTermsOfServiceUrl)());
maybeExportVariable('BUILD_SCAN_TERMS_OF_SERVICE_AGREE', (0, input_params_1.getBuildScanTermsOfServiceAgree)());
}
}
exports.setup = setup;
function verifyTermsOfServiceAgreement() {
if ((0, input_params_1.getBuildScanTermsOfServiceUrl)() !== 'https://gradle.com/terms-of-service' ||
(0, input_params_1.getBuildScanTermsOfServiceAgree)() !== 'yes') {
core.warning(`Terms of service must be agreed in order to publish build scans.`);
return false;
}
return true;
}
function maybeExportVariable(variableName, value) {
if (!process.env[variableName]) {
core.exportVariable(variableName, value);
}
}
/***/ }),
/***/ 47591:
@@ -138736,7 +139153,7 @@ class GradleStateCache {
'gradle-build-action.build-result-capture-service.plugin.groovy',
'gradle-build-action.github-dependency-graph.init.gradle',
'gradle-build-action.github-dependency-graph-gradle-plugin-apply.groovy',
'gradle-build-action.inject-gradle-enterprise.init.gradle'
'gradle-build-action.inject-develocity.init.gradle'
];
for (const initScriptFilename of initScriptFilenames) {
const initScriptContent = this.readResourceFileAsString('init-scripts', initScriptFilename);
@@ -139971,6 +140388,7 @@ const request_error_1 = __nccwpck_require__(10537);
const path = __importStar(__nccwpck_require__(71017));
const fs_1 = __importDefault(__nccwpck_require__(57147));
const layout = __importStar(__nccwpck_require__(28182));
const errors_1 = __nccwpck_require__(36976);
const input_params_1 = __nccwpck_require__(23885);
const DEPENDENCY_GRAPH_PREFIX = 'dependency-graph_';
function setup(option) {
@@ -139983,16 +140401,26 @@ function setup(option) {
return;
}
core.info('Enabling dependency graph generation');
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_ENABLED', 'true');
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_CORRELATOR', getJobCorrelator());
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_ID', github.context.runId);
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_REF', github.context.ref);
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_SHA', getShaFromContext());
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_WORKSPACE', layout.workspaceDirectory());
core.exportVariable('DEPENDENCY_GRAPH_REPORT_DIR', path.resolve(layout.workspaceDirectory(), 'dependency-graph-reports'));
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_ENABLED', 'true');
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_CONTINUE_ON_FAILURE', (0, input_params_1.getDependencyGraphContinueOnFailure)());
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_CORRELATOR', getJobCorrelator());
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_ID', github.context.runId);
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_REF', github.context.ref);
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_SHA', getShaFromContext());
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_WORKSPACE', layout.workspaceDirectory());
maybeExportVariable('DEPENDENCY_GRAPH_REPORT_DIR', path.resolve(layout.workspaceDirectory(), 'dependency-graph-reports'));
if (option === input_params_1.DependencyGraphOption.Clear) {
core.exportVariable('DEPENDENCY_GRAPH_INCLUDE_PROJECTS', '');
core.exportVariable('DEPENDENCY_GRAPH_INCLUDE_CONFIGURATIONS', '');
}
});
}
exports.setup = setup;
function maybeExportVariable(variableName, value) {
if (!process.env[variableName]) {
core.exportVariable(variableName, value);
}
}
function complete(option) {
return __awaiter(this, void 0, void 0, function* () {
try {
@@ -140002,6 +140430,7 @@ function complete(option) {
case input_params_1.DependencyGraphOption.DownloadAndSubmit:
return;
case input_params_1.DependencyGraphOption.GenerateAndSubmit:
case input_params_1.DependencyGraphOption.Clear:
yield submitDependencyGraphs(yield findGeneratedDependencyGraphFiles());
return;
case input_params_1.DependencyGraphOption.GenerateAndUpload:
@@ -140009,7 +140438,7 @@ function complete(option) {
}
}
catch (e) {
core.warning(`Failed to ${option} dependency graph. Will continue. ${String(e)}`);
warnOrFail(option, e);
}
});
}
@@ -140040,7 +140469,7 @@ function downloadAndSubmitDependencyGraphs() {
yield submitDependencyGraphs(yield downloadDependencyGraphs());
}
catch (e) {
core.warning(`Download and submit dependency graph failed. Will continue. ${String(e)}`);
warnOrFail(input_params_1.DependencyGraphOption.DownloadAndSubmit, e);
}
});
}
@@ -140052,7 +140481,7 @@ function submitDependencyGraphs(dependencyGraphFiles) {
}
catch (error) {
if (error instanceof request_error_1.RequestError) {
core.warning(buildWarningMessage(jsonFile, error));
throw new Error(translateErrorMessage(jsonFile, error));
}
else {
throw error;
@@ -140061,9 +140490,9 @@ function submitDependencyGraphs(dependencyGraphFiles) {
}
});
}
function buildWarningMessage(jsonFile, error) {
function translateErrorMessage(jsonFile, error) {
const relativeJsonFile = getRelativePathFromWorkspace(jsonFile);
const mainWarning = `Failed to submit dependency graph ${relativeJsonFile}.\n${String(error)}`;
const mainWarning = `Dependency submission failed for ${relativeJsonFile}.\n${String(error)}`;
if (error.message === 'Resource not accessible by integration') {
return `${mainWarning}
Please ensure that the 'contents: write' permission is available for the workflow job.
@@ -140118,6 +140547,12 @@ function findDependencyGraphFiles(dir) {
return graphFiles;
});
}
function warnOrFail(option, error) {
if (!(0, input_params_1.getDependencyGraphContinueOnFailure)()) {
throw new errors_1.PostActionJobFailure(error);
}
core.warning(`Failed to ${option} dependency graph. Will continue.\n${String(error)}`);
}
function getOctokit() {
return github.getOctokit(getGithubToken());
}
@@ -140169,6 +140604,30 @@ function sanitize(value) {
}
/***/ }),
/***/ 36976:
/***/ ((__unused_webpack_module, exports) => {
"use strict";
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.PostActionJobFailure = void 0;
class PostActionJobFailure extends Error {
constructor(error) {
if (error instanceof Error) {
super(error.message);
this.name = error.name;
this.stack = error.stack;
}
else {
super(String(error));
}
}
}
exports.PostActionJobFailure = PostActionJobFailure;
/***/ }),
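
PostActionJobFailure preserves the original error's name and stack so it can be rethrown across the warn-or-fail boundary in the dependency-graph code. A hedged sketch of how a post-action entry point might distinguish it; the catch site itself is outside this excerpt and the require path is an assumption:

const core = require('@actions/core');
const { PostActionJobFailure } = require('./errors'); // path assumed

async function runPostStep(step) {
    try {
        await step();
    } catch (e) {
        if (e instanceof PostActionJobFailure) {
            core.setFailed(e.message);   // continue-on-failure disabled: fail the job
        } else {
            core.warning(String(e));     // anything else remains a warning
        }
    }
}
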
/***/ 23584:
@@ -140288,7 +140747,7 @@ function validateGradleWrapper(buildRootDirectory) {
}
function verifyExists(file, description) {
if (!fs_1.default.existsSync(file)) {
throw new Error(`Cannot locate ${description} at '${file}'. Specify 'gradle-version' or 'gradle-executable' for projects without Gradle wrapper configured.`);
throw new Error(`Cannot locate ${description} at '${file}'. Specify 'gradle-version' for projects without Gradle wrapper configured.`);
}
}
function verifyIsExecutableScript(toExecute) {
@@ -140332,7 +140791,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result;
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.JobSummaryOption = exports.DependencyGraphOption = exports.parseNumericInput = exports.getArtifactRetentionDays = exports.getDependencyGraphOption = exports.getPRCommentOption = exports.getJobSummaryOption = exports.isJobSummaryEnabled = exports.getGithubToken = exports.getJobMatrix = exports.getArguments = exports.getGradleExecutable = exports.getGradleVersion = exports.getBuildRootDirectory = exports.getCacheExcludes = exports.getCacheIncludes = exports.getCacheEncryptionKey = exports.isCacheCleanupEnabled = exports.isCacheDebuggingEnabled = exports.isCacheStrictMatch = exports.isCacheOverwriteExisting = exports.isCacheWriteOnly = exports.isCacheReadOnly = exports.isCacheDisabled = void 0;
exports.JobSummaryOption = exports.DependencyGraphOption = exports.parseNumericInput = exports.getArtifactRetentionDays = exports.getDependencyGraphContinueOnFailure = exports.getDependencyGraphOption = exports.getBuildScanTermsOfServiceAgree = exports.getBuildScanTermsOfServiceUrl = exports.getBuildScanPublishEnabled = exports.getPRCommentOption = exports.getJobSummaryOption = exports.isJobSummaryEnabled = exports.getGithubToken = exports.getJobMatrix = exports.getArguments = exports.getGradleVersion = exports.getBuildRootDirectory = exports.getCacheExcludes = exports.getCacheIncludes = exports.getCacheEncryptionKey = exports.isCacheCleanupEnabled = exports.isCacheDebuggingEnabled = exports.isCacheStrictMatch = exports.isCacheOverwriteExisting = exports.isCacheWriteOnly = exports.isCacheReadOnly = exports.isCacheDisabled = void 0;
const core = __importStar(__nccwpck_require__(42186));
const string_argv_1 = __nccwpck_require__(19663);
function isCacheDisabled() {
@@ -140383,10 +140842,6 @@ function getGradleVersion() {
return core.getInput('gradle-version');
}
exports.getGradleVersion = getGradleVersion;
function getGradleExecutable() {
return core.getInput('gradle-executable');
}
exports.getGradleExecutable = getGradleExecutable;
function getArguments() {
const input = core.getInput('arguments');
return (0, string_argv_1.parseArgsStringToArgv)(input);
@@ -140412,6 +140867,18 @@ function getPRCommentOption() {
return parseJobSummaryOption('add-job-summary-as-pr-comment');
}
exports.getPRCommentOption = getPRCommentOption;
function getBuildScanPublishEnabled() {
return getBooleanInput('build-scan-publish');
}
exports.getBuildScanPublishEnabled = getBuildScanPublishEnabled;
function getBuildScanTermsOfServiceUrl() {
return core.getInput('build-scan-terms-of-service-url');
}
exports.getBuildScanTermsOfServiceUrl = getBuildScanTermsOfServiceUrl;
function getBuildScanTermsOfServiceAgree() {
return core.getInput('build-scan-terms-of-service-agree');
}
exports.getBuildScanTermsOfServiceAgree = getBuildScanTermsOfServiceAgree;
function parseJobSummaryOption(paramName) {
const val = core.getInput(paramName);
switch (val.toLowerCase().trim()) {
@@ -140437,10 +140904,16 @@ function getDependencyGraphOption() {
return DependencyGraphOption.GenerateAndUpload;
case 'download-and-submit':
return DependencyGraphOption.DownloadAndSubmit;
case 'clear':
return DependencyGraphOption.Clear;
}
throw TypeError(`The value '${val}' is not valid for 'dependency-graph'. Valid values are: [disabled, generate, generate-and-submit, generate-and-upload, download-and-submit]. The default value is 'disabled'.`);
throw TypeError(`The value '${val}' is not valid for 'dependency-graph'. Valid values are: [disabled, generate, generate-and-submit, generate-and-upload, download-and-submit, clear]. The default value is 'disabled'.`);
}
exports.getDependencyGraphOption = getDependencyGraphOption;
function getDependencyGraphContinueOnFailure() {
return getBooleanInput('dependency-graph-continue-on-failure', true);
}
exports.getDependencyGraphContinueOnFailure = getDependencyGraphContinueOnFailure;
function getArtifactRetentionDays() {
const val = core.getInput('artifact-retention-days');
return parseNumericInput('artifact-retention-days', val, 0);
@@ -140476,6 +140949,7 @@ var DependencyGraphOption;
DependencyGraphOption["GenerateAndSubmit"] = "generate-and-submit";
DependencyGraphOption["GenerateAndUpload"] = "generate-and-upload";
DependencyGraphOption["DownloadAndSubmit"] = "download-and-submit";
DependencyGraphOption["Clear"] = "clear";
})(DependencyGraphOption || (exports.DependencyGraphOption = DependencyGraphOption = {}));
var JobSummaryOption;
(function (JobSummaryOption) {
@@ -140725,6 +141199,7 @@ function run() {
core.info(error.stack);
}
}
process.exit();
});
}
exports.run = run;
@@ -140781,19 +141256,13 @@ const cache = __importStar(__nccwpck_require__(27799));
const toolCache = __importStar(__nccwpck_require__(27784));
const gradlew = __importStar(__nccwpck_require__(32335));
const params = __importStar(__nccwpck_require__(23885));
const layout = __importStar(__nccwpck_require__(28182));
const cache_utils_1 = __nccwpck_require__(41678);
const gradleVersionsBaseUrl = 'https://services.gradle.org/versions';
function provisionGradle() {
return __awaiter(this, void 0, void 0, function* () {
const gradleVersion = params.getGradleVersion();
if (gradleVersion !== '' && gradleVersion !== 'wrapper') {
return addToPath(path.resolve(yield installGradle(gradleVersion)));
}
const gradleExecutable = params.getGradleExecutable();
if (gradleExecutable !== '') {
const workspaceDirectory = layout.workspaceDirectory();
return addToPath(path.resolve(workspaceDirectory, gradleExecutable));
return addToPath(yield installGradle(gradleVersion));
}
return undefined;
});
@@ -141060,6 +141529,7 @@ const layout = __importStar(__nccwpck_require__(28182));
const params = __importStar(__nccwpck_require__(23885));
const dependencyGraph = __importStar(__nccwpck_require__(80));
const jobSummary = __importStar(__nccwpck_require__(87345));
const buildScan = __importStar(__nccwpck_require__(85772));
const build_results_1 = __nccwpck_require__(82107);
const cache_reporting_1 = __nccwpck_require__(66674);
const daemon_controller_1 = __nccwpck_require__(85146);
@@ -141083,6 +141553,7 @@ function setup() {
yield caches.restore(userHome, gradleUserHome, cacheListener);
core.saveState(CACHE_LISTENER, cacheListener.stringify());
yield dependencyGraph.setup(params.getDependencyGraphOption());
buildScan.setup();
});
}
exports.setup = setup;
@@ -141462,7 +141933,7 @@ function firstString() {
/***/ ((module) => {
"use strict";
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.0.0","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^5.3.1","crypto":"^1.0.1","jwt-decode":"^3.1.2","twirp-ts":"^2.5.0","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.1.0","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^5.3.1","crypto":"^1.0.1","jwt-decode":"^3.1.2","twirp-ts":"^2.5.0","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
/***/ }),


dist/post/index.js vendored

@@ -824,7 +824,7 @@ __exportStar(__nccwpck_require__(63077), exports);
"use strict";
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.ArtifactService = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "results/api/v1/artifact.proto" (package "github.actions.results.api.v1", syntax proto3)
// tslint:disable
@@ -1400,6 +1400,121 @@ class GetSignedArtifactURLResponse$Type extends runtime_5.MessageType {
* @generated MessageType for protobuf message github.actions.results.api.v1.GetSignedArtifactURLResponse
*/
exports.GetSignedArtifactURLResponse = new GetSignedArtifactURLResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class DeleteArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.DeleteArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "workflow_job_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value) {
const message = { workflowRunBackendId: "", workflowJobRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string workflow_job_run_backend_id */ 2:
message.workflowJobRunBackendId = reader.string();
break;
case /* string name */ 3:
message.name = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string workflow_job_run_backend_id = 2; */
if (message.workflowJobRunBackendId !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.workflowJobRunBackendId);
/* string name = 3; */
if (message.name !== "")
writer.tag(3, runtime_1.WireType.LengthDelimited).string(message.name);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.DeleteArtifactRequest
*/
exports.DeleteArtifactRequest = new DeleteArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class DeleteArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.DeleteArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, runtime_1.WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.DeleteArtifactResponse
*/
exports.DeleteArtifactResponse = new DeleteArtifactResponse$Type();
/**
* @generated ServiceType for protobuf service github.actions.results.api.v1.ArtifactService
*/
@@ -1407,7 +1522,8 @@ exports.ArtifactService = new runtime_rpc_1.ServiceType("github.actions.results.
{ name: "CreateArtifact", options: {}, I: exports.CreateArtifactRequest, O: exports.CreateArtifactResponse },
{ name: "FinalizeArtifact", options: {}, I: exports.FinalizeArtifactRequest, O: exports.FinalizeArtifactResponse },
{ name: "ListArtifacts", options: {}, I: exports.ListArtifactsRequest, O: exports.ListArtifactsResponse },
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse }
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse },
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse }
]);
//# sourceMappingURL=artifact.js.map
@@ -1438,6 +1554,7 @@ class ArtifactServiceClientJSON {
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toJson(request, {
@@ -1477,6 +1594,16 @@ class ArtifactServiceClientJSON {
ignoreUnknownFields: true,
}));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/json", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromJson(data, {
ignoreUnknownFields: true,
}));
}
}
exports.ArtifactServiceClientJSON = ArtifactServiceClientJSON;
class ArtifactServiceClientProtobuf {
@@ -1486,6 +1613,7 @@ class ArtifactServiceClientProtobuf {
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(request) {
const data = artifact_1.CreateArtifactRequest.toBinary(request);
@@ -1507,6 +1635,11 @@ class ArtifactServiceClientProtobuf {
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "GetSignedArtifactURL", "application/protobuf", data);
return promise.then((data) => artifact_1.GetSignedArtifactURLResponse.fromBinary(data));
}
DeleteArtifact(request) {
const data = artifact_1.DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request("github.actions.results.api.v1.ArtifactService", "DeleteArtifact", "application/protobuf", data);
return promise.then((data) => artifact_1.DeleteArtifactResponse.fromBinary(data));
}
}
exports.ArtifactServiceClientProtobuf = ArtifactServiceClientProtobuf;
var ArtifactServiceMethod;
@@ -1515,12 +1648,14 @@ var ArtifactServiceMethod;
ArtifactServiceMethod["FinalizeArtifact"] = "FinalizeArtifact";
ArtifactServiceMethod["ListArtifacts"] = "ListArtifacts";
ArtifactServiceMethod["GetSignedArtifactURL"] = "GetSignedArtifactURL";
ArtifactServiceMethod["DeleteArtifact"] = "DeleteArtifact";
})(ArtifactServiceMethod || (exports.ArtifactServiceMethod = ArtifactServiceMethod = {}));
exports.ArtifactServiceMethodList = [
ArtifactServiceMethod.CreateArtifact,
ArtifactServiceMethod.FinalizeArtifact,
ArtifactServiceMethod.ListArtifacts,
ArtifactServiceMethod.GetSignedArtifactURL,
ArtifactServiceMethod.DeleteArtifact,
];
function createArtifactServiceServer(service) {
return new twirp_ts_1.TwirpServer({
@@ -1558,6 +1693,12 @@ function matchArtifactServiceRoute(method, events) {
yield events.onMatch(ctx);
return handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, interceptors);
});
case "DeleteArtifact":
return (ctx, service, data, interceptors) => __awaiter(this, void 0, void 0, function* () {
ctx = Object.assign(Object.assign({}, ctx), { methodName: "DeleteArtifact" });
yield events.onMatch(ctx);
return handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors);
});
default:
events.onNotFound();
const msg = `no handler found`;
@@ -1608,6 +1749,17 @@ function handleArtifactServiceGetSignedArtifactURLRequest(ctx, service, data, in
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceDeleteArtifactRequest(ctx, service, data, interceptors) {
switch (ctx.contentType) {
case twirp_ts_1.TwirpContentType.JSON:
return handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors);
case twirp_ts_1.TwirpContentType.Protobuf:
return handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors);
default:
const msg = "unexpected Content-Type";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.BadRoute, msg);
}
}
function handleArtifactServiceCreateArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
@@ -1732,6 +1884,37 @@ function handleArtifactServiceGetSignedArtifactURLJSON(ctx, service, data, inter
}));
});
}
function handleArtifactServiceDeleteArtifactJSON(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
const body = JSON.parse(data.toString() || "{}");
request = artifact_1.DeleteArtifactRequest.fromJson(body, {
ignoreUnknownFields: true,
});
}
catch (e) {
if (e instanceof Error) {
const msg = "the json request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return JSON.stringify(artifact_1.DeleteArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false,
}));
});
}
function handleArtifactServiceCreateArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
@@ -1832,6 +2015,31 @@ function handleArtifactServiceGetSignedArtifactURLProtobuf(ctx, service, data, i
return Buffer.from(artifact_1.GetSignedArtifactURLResponse.toBinary(response));
});
}
function handleArtifactServiceDeleteArtifactProtobuf(ctx, service, data, interceptors) {
return __awaiter(this, void 0, void 0, function* () {
let request;
let response;
try {
request = artifact_1.DeleteArtifactRequest.fromBinary(data);
}
catch (e) {
if (e instanceof Error) {
const msg = "the protobuf request could not be decoded";
throw new twirp_ts_1.TwirpError(twirp_ts_1.TwirpErrorCode.Malformed, msg).withCause(e, true);
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = (0, twirp_ts_1.chainInterceptors)(...interceptors);
response = yield interceptor(ctx, request, (ctx, inputReq) => {
return service.DeleteArtifact(ctx, inputReq);
});
}
else {
response = yield service.DeleteArtifact(ctx, request);
}
return Buffer.from(artifact_1.DeleteArtifactResponse.toBinary(response));
});
}
//# sourceMappingURL=artifact.twirp.js.map
/***/ }),
@@ -1867,6 +2075,7 @@ const core_1 = __nccwpck_require__(42186);
const config_1 = __nccwpck_require__(74610);
const upload_artifact_1 = __nccwpck_require__(42578);
const download_artifact_1 = __nccwpck_require__(73555);
const delete_artifact_1 = __nccwpck_require__(70071);
const get_artifact_1 = __nccwpck_require__(29491);
const list_artifacts_1 = __nccwpck_require__(44141);
const errors_1 = __nccwpck_require__(38182);
@@ -1953,6 +2162,28 @@ If the error persists, please check whether Actions and API requests are operati
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`);
throw error;
}
});
}
deleteArtifact(artifactName, options) {
return __awaiter(this, void 0, void 0, function* () {
try {
if ((0, config_1.isGhes)()) {
throw new errors_1.GHESNotSupportedError();
}
if (options === null || options === void 0 ? void 0 : options.findBy) {
const { findBy: { repositoryOwner, repositoryName, workflowRunId, token } } = options;
return (0, delete_artifact_1.deleteArtifactPublic)(artifactName, workflowRunId, repositoryOwner, repositoryName, token);
}
return (0, delete_artifact_1.deleteArtifactInternal)(artifactName);
}
catch (error) {
(0, core_1.warning)(`Delete Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`);
throw error;
}
@@ -1964,6 +2195,96 @@ exports.DefaultArtifactClient = DefaultArtifactClient;
/***/ }),
/***/ 70071:
/***/ (function(__unused_webpack_module, exports, __nccwpck_require__) {
"use strict";
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
return new (P || (P = Promise))(function (resolve, reject) {
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.deleteArtifactInternal = exports.deleteArtifactPublic = void 0;
const core_1 = __nccwpck_require__(42186);
const github_1 = __nccwpck_require__(21260);
const user_agent_1 = __nccwpck_require__(85164);
const retry_options_1 = __nccwpck_require__(64597);
const utils_1 = __nccwpck_require__(58154);
const plugin_request_log_1 = __nccwpck_require__(68883);
const plugin_retry_1 = __nccwpck_require__(86298);
const artifact_twirp_client_1 = __nccwpck_require__(12312);
const util_1 = __nccwpck_require__(63062);
const generated_1 = __nccwpck_require__(49960);
const get_artifact_1 = __nccwpck_require__(29491);
const errors_1 = __nccwpck_require__(38182);
function deleteArtifactPublic(artifactName, workflowRunId, repositoryOwner, repositoryName, token) {
var _a;
return __awaiter(this, void 0, void 0, function* () {
const [retryOpts, requestOpts] = (0, retry_options_1.getRetryOptions)(utils_1.defaults);
const opts = {
log: undefined,
userAgent: (0, user_agent_1.getUserAgentString)(),
previews: undefined,
retry: retryOpts,
request: requestOpts
};
const github = (0, github_1.getOctokit)(token, opts, plugin_retry_1.retry, plugin_request_log_1.requestLog);
const getArtifactResp = yield (0, get_artifact_1.getArtifactPublic)(artifactName, workflowRunId, repositoryOwner, repositoryName, token);
const deleteArtifactResp = yield github.rest.actions.deleteArtifact({
owner: repositoryOwner,
repo: repositoryName,
artifact_id: getArtifactResp.artifact.id
});
if (deleteArtifactResp.status !== 204) {
throw new errors_1.InvalidResponseError(`Invalid response from GitHub API: ${deleteArtifactResp.status} (${(_a = deleteArtifactResp === null || deleteArtifactResp === void 0 ? void 0 : deleteArtifactResp.headers) === null || _a === void 0 ? void 0 : _a['x-github-request-id']})`);
}
return {
id: getArtifactResp.artifact.id
};
});
}
exports.deleteArtifactPublic = deleteArtifactPublic;
function deleteArtifactInternal(artifactName) {
return __awaiter(this, void 0, void 0, function* () {
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
const listReq = {
workflowRunBackendId,
workflowJobRunBackendId,
nameFilter: generated_1.StringValue.create({ value: artifactName })
};
const listRes = yield artifactClient.ListArtifacts(listReq);
if (listRes.artifacts.length === 0) {
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}`);
}
let artifact = listRes.artifacts[0];
if (listRes.artifacts.length > 1) {
artifact = listRes.artifacts.sort((a, b) => Number(b.databaseId) - Number(a.databaseId))[0];
(0, core_1.debug)(`More than one artifact found for a single name, returning newest (id: ${artifact.databaseId})`);
}
const req = {
workflowRunBackendId: artifact.workflowRunBackendId,
workflowJobRunBackendId: artifact.workflowJobRunBackendId,
name: artifact.name
};
const res = yield artifactClient.DeleteArtifact(req);
(0, core_1.info)(`Artifact '${artifactName}' (ID: ${res.artifactId}) deleted`);
return {
id: Number(res.artifactId)
};
});
}
exports.deleteArtifactInternal = deleteArtifactInternal;
//# sourceMappingURL=delete-artifact.js.map
/***/ }),
/***/ 73555:
/***/ (function(__unused_webpack_module, exports, __nccwpck_require__) {
@@ -2005,7 +2326,7 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.downloadArtifactInternal = exports.downloadArtifactPublic = void 0;
exports.downloadArtifactInternal = exports.downloadArtifactPublic = exports.streamExtractExternal = void 0;
const promises_1 = __importDefault(__nccwpck_require__(73292));
const github = __importStar(__nccwpck_require__(21260));
const core = __importStar(__nccwpck_require__(42186));
@@ -2039,20 +2360,57 @@ function exists(path) {
});
}
function streamExtract(url, directory) {
return __awaiter(this, void 0, void 0, function* () {
let retryCount = 0;
while (retryCount < 5) {
try {
yield streamExtractExternal(url, directory);
return;
}
catch (error) {
retryCount++;
core.debug(`Failed to download artifact after ${retryCount} retries due to ${error.message}. Retrying in 5 seconds...`);
// wait 5 seconds before retrying
yield new Promise(resolve => setTimeout(resolve, 5000));
}
}
throw new Error(`Artifact download failed after ${retryCount} retries.`);
});
}
function streamExtractExternal(url, directory) {
return __awaiter(this, void 0, void 0, function* () {
const client = new httpClient.HttpClient((0, user_agent_1.getUserAgentString)());
const response = yield client.get(url);
if (response.message.statusCode !== 200) {
throw new Error(`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`);
}
const timeout = 30 * 1000; // 30 seconds
return new Promise((resolve, reject) => {
const timerFn = () => {
response.message.destroy(new Error(`Blob storage chunk did not respond in ${timeout}ms`));
};
const timer = setTimeout(timerFn, timeout);
response.message
.on('data', () => {
timer.refresh();
})
.on('error', (error) => {
core.debug(`response.message: Artifact download failed: ${error.message}`);
clearTimeout(timer);
reject(error);
})
.pipe(unzip_stream_1.default.Extract({ path: directory }))
.on('close', resolve)
.on('error', reject);
.on('close', () => {
clearTimeout(timer);
resolve();
})
.on('error', (error) => {
reject(error);
});
});
});
}
exports.streamExtractExternal = streamExtractExternal;
function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, token, options) {
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
@@ -2211,7 +2569,9 @@ function getArtifactPublic(artifactName, workflowRunId, repositoryOwner, reposit
throw new errors_1.InvalidResponseError(`Invalid response from GitHub API: ${getArtifactResp.status} (${(_a = getArtifactResp === null || getArtifactResp === void 0 ? void 0 : getArtifactResp.headers) === null || _a === void 0 ? void 0 : _a['x-github-request-id']})`);
}
if (getArtifactResp.data.artifacts.length === 0) {
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}`);
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`);
}
let artifact = getArtifactResp.data.artifacts[0];
if (getArtifactResp.data.artifacts.length > 1) {
@@ -2240,7 +2600,9 @@ function getArtifactInternal(artifactName) {
};
const res = yield artifactClient.ListArtifacts(req);
if (res.artifacts.length === 0) {
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}`);
throw new errors_1.ArtifactNotFoundError(`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`);
}
let artifact = res.artifacts[0];
if (res.artifacts.length > 1) {
@@ -6377,7 +6739,8 @@ function createHttpClient() {
return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions());
}
function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) {
const components = paths;
// don't pass changes upstream
const components = paths.slice();
// Add compression method to cache version to restore
// compressed cache as per compression method
if (compressionMethod) {
@@ -6666,26 +7029,21 @@ function resolvePaths(patterns) {
implicitDescendants: false
});
try {
for (var _e = true, _f = __asyncValues(globber.globGenerator()), _g; _g = yield _f.next(), _a = _g.done, !_a;) {
for (var _e = true, _f = __asyncValues(globber.globGenerator()), _g; _g = yield _f.next(), _a = _g.done, !_a; _e = true) {
_c = _g.value;
_e = false;
try {
const file = _c;
const relativeFile = path
.relative(workspace, file)
.replace(new RegExp(`\\${path.sep}`, 'g'), '/');
core.debug(`Matched: ${relativeFile}`);
// Paths are made relative so the tar entries are all relative to the root of the workspace.
if (relativeFile === '') {
// path.relative returns empty string if workspace and file are equal
paths.push('.');
}
else {
paths.push(`${relativeFile}`);
}
const file = _c;
const relativeFile = path
.relative(workspace, file)
.replace(new RegExp(`\\${path.sep}`, 'g'), '/');
core.debug(`Matched: ${relativeFile}`);
// Paths are made relative so the tar entries are all relative to the root of the workspace.
if (relativeFile === '') {
// path.relative returns empty string if workspace and file are equal
paths.push('.');
}
finally {
_e = true;
else {
paths.push(`${relativeFile}`);
}
}
}
@@ -6787,7 +7145,7 @@ var CacheFilename;
(function (CacheFilename) {
CacheFilename["Gzip"] = "cache.tgz";
CacheFilename["Zstd"] = "cache.tzst";
})(CacheFilename = exports.CacheFilename || (exports.CacheFilename = {}));
})(CacheFilename || (exports.CacheFilename = CacheFilename = {}));
var CompressionMethod;
(function (CompressionMethod) {
CompressionMethod["Gzip"] = "gzip";
@@ -6795,12 +7153,12 @@ var CompressionMethod;
// This enum is for earlier version of zstd that does not have --long support
CompressionMethod["ZstdWithoutLong"] = "zstd-without-long";
CompressionMethod["Zstd"] = "zstd";
})(CompressionMethod = exports.CompressionMethod || (exports.CompressionMethod = {}));
})(CompressionMethod || (exports.CompressionMethod = CompressionMethod = {}));
var ArchiveToolType;
(function (ArchiveToolType) {
ArchiveToolType["GNU"] = "gnu";
ArchiveToolType["BSD"] = "bsd";
})(ArchiveToolType = exports.ArchiveToolType || (exports.ArchiveToolType = {}));
})(ArchiveToolType || (exports.ArchiveToolType = ArchiveToolType = {}));
// The default number of retry attempts.
exports.DefaultRetryAttempts = 2;
// The default delay in milliseconds between retry attempts.
@@ -135997,6 +136355,65 @@ function loadBuildResults() {
exports.loadBuildResults = loadBuildResults;
/***/ }),
/***/ 85772:
/***/ (function(__unused_webpack_module, exports, __nccwpck_require__) {
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
__setModuleDefault(result, mod);
return result;
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.setup = void 0;
const core = __importStar(__nccwpck_require__(42186));
const input_params_1 = __nccwpck_require__(23885);
function setup() {
if ((0, input_params_1.getBuildScanPublishEnabled)() && verifyTermsOfServiceAgreement()) {
maybeExportVariable('DEVELOCITY_INJECTION_ENABLED', 'true');
maybeExportVariable('DEVELOCITY_PLUGIN_VERSION', '3.16.1');
maybeExportVariable('DEVELOCITY_CCUD_PLUGIN_VERSION', '1.12.1');
maybeExportVariable('BUILD_SCAN_TERMS_OF_SERVICE_URL', (0, input_params_1.getBuildScanTermsOfServiceUrl)());
maybeExportVariable('BUILD_SCAN_TERMS_OF_SERVICE_AGREE', (0, input_params_1.getBuildScanTermsOfServiceAgree)());
}
}
exports.setup = setup;
function verifyTermsOfServiceAgreement() {
if ((0, input_params_1.getBuildScanTermsOfServiceUrl)() !== 'https://gradle.com/terms-of-service' ||
(0, input_params_1.getBuildScanTermsOfServiceAgree)() !== 'yes') {
core.warning(`Terms of service must be agreed in order to publish build scans.`);
return false;
}
return true;
}
function maybeExportVariable(variableName, value) {
if (!process.env[variableName]) {
core.exportVariable(variableName, value);
}
}
/***/ }),
/***/ 47591:
@@ -136189,7 +136606,7 @@ class GradleStateCache {
'gradle-build-action.build-result-capture-service.plugin.groovy',
'gradle-build-action.github-dependency-graph.init.gradle',
'gradle-build-action.github-dependency-graph-gradle-plugin-apply.groovy',
'gradle-build-action.inject-gradle-enterprise.init.gradle'
'gradle-build-action.inject-develocity.init.gradle'
];
for (const initScriptFilename of initScriptFilenames) {
const initScriptContent = this.readResourceFileAsString('init-scripts', initScriptFilename);
@@ -137424,6 +137841,7 @@ const request_error_1 = __nccwpck_require__(10537);
const path = __importStar(__nccwpck_require__(71017));
const fs_1 = __importDefault(__nccwpck_require__(57147));
const layout = __importStar(__nccwpck_require__(28182));
const errors_1 = __nccwpck_require__(36976);
const input_params_1 = __nccwpck_require__(23885);
const DEPENDENCY_GRAPH_PREFIX = 'dependency-graph_';
function setup(option) {
@@ -137436,16 +137854,26 @@ function setup(option) {
return;
}
core.info('Enabling dependency graph generation');
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_ENABLED', 'true');
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_CORRELATOR', getJobCorrelator());
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_ID', github.context.runId);
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_REF', github.context.ref);
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_SHA', getShaFromContext());
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_WORKSPACE', layout.workspaceDirectory());
core.exportVariable('DEPENDENCY_GRAPH_REPORT_DIR', path.resolve(layout.workspaceDirectory(), 'dependency-graph-reports'));
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_ENABLED', 'true');
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_CONTINUE_ON_FAILURE', (0, input_params_1.getDependencyGraphContinueOnFailure)());
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_CORRELATOR', getJobCorrelator());
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_ID', github.context.runId);
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_REF', github.context.ref);
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_SHA', getShaFromContext());
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_WORKSPACE', layout.workspaceDirectory());
maybeExportVariable('DEPENDENCY_GRAPH_REPORT_DIR', path.resolve(layout.workspaceDirectory(), 'dependency-graph-reports'));
if (option === input_params_1.DependencyGraphOption.Clear) {
core.exportVariable('DEPENDENCY_GRAPH_INCLUDE_PROJECTS', '');
core.exportVariable('DEPENDENCY_GRAPH_INCLUDE_CONFIGURATIONS', '');
}
});
}
exports.setup = setup;
function maybeExportVariable(variableName, value) {
if (!process.env[variableName]) {
core.exportVariable(variableName, value);
}
}
function complete(option) {
return __awaiter(this, void 0, void 0, function* () {
try {
@@ -137455,6 +137883,7 @@ function complete(option) {
case input_params_1.DependencyGraphOption.DownloadAndSubmit:
return;
case input_params_1.DependencyGraphOption.GenerateAndSubmit:
case input_params_1.DependencyGraphOption.Clear:
yield submitDependencyGraphs(yield findGeneratedDependencyGraphFiles());
return;
case input_params_1.DependencyGraphOption.GenerateAndUpload:
@@ -137462,7 +137891,7 @@ function complete(option) {
}
}
catch (e) {
core.warning(`Failed to ${option} dependency graph. Will continue. ${String(e)}`);
warnOrFail(option, e);
}
});
}
@@ -137493,7 +137922,7 @@ function downloadAndSubmitDependencyGraphs() {
yield submitDependencyGraphs(yield downloadDependencyGraphs());
}
catch (e) {
core.warning(`Download and submit dependency graph failed. Will continue. ${String(e)}`);
warnOrFail(input_params_1.DependencyGraphOption.DownloadAndSubmit, e);
}
});
}
@@ -137505,7 +137934,7 @@ function submitDependencyGraphs(dependencyGraphFiles) {
}
catch (error) {
if (error instanceof request_error_1.RequestError) {
core.warning(buildWarningMessage(jsonFile, error));
throw new Error(translateErrorMessage(jsonFile, error));
}
else {
throw error;
@@ -137514,9 +137943,9 @@ function submitDependencyGraphs(dependencyGraphFiles) {
}
});
}
function buildWarningMessage(jsonFile, error) {
function translateErrorMessage(jsonFile, error) {
const relativeJsonFile = getRelativePathFromWorkspace(jsonFile);
const mainWarning = `Failed to submit dependency graph ${relativeJsonFile}.\n${String(error)}`;
const mainWarning = `Dependency submission failed for ${relativeJsonFile}.\n${String(error)}`;
if (error.message === 'Resource not accessible by integration') {
return `${mainWarning}
Please ensure that the 'contents: write' permission is available for the workflow job.
@@ -137571,6 +138000,12 @@ function findDependencyGraphFiles(dir) {
return graphFiles;
});
}
function warnOrFail(option, error) {
if (!(0, input_params_1.getDependencyGraphContinueOnFailure)()) {
throw new errors_1.PostActionJobFailure(error);
}
core.warning(`Failed to ${option} dependency graph. Will continue.\n${String(error)}`);
}
function getOctokit() {
return github.getOctokit(getGithubToken());
}
@@ -137622,6 +138057,30 @@ function sanitize(value) {
}
/***/ }),
/***/ 36976:
/***/ ((__unused_webpack_module, exports) => {
"use strict";
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.PostActionJobFailure = void 0;
class PostActionJobFailure extends Error {
constructor(error) {
if (error instanceof Error) {
super(error.message);
this.name = error.name;
this.stack = error.stack;
}
else {
super(String(error));
}
}
}
exports.PostActionJobFailure = PostActionJobFailure;
/***/ }),
/***/ 23885:
@@ -137653,7 +138112,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result;
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.JobSummaryOption = exports.DependencyGraphOption = exports.parseNumericInput = exports.getArtifactRetentionDays = exports.getDependencyGraphOption = exports.getPRCommentOption = exports.getJobSummaryOption = exports.isJobSummaryEnabled = exports.getGithubToken = exports.getJobMatrix = exports.getArguments = exports.getGradleExecutable = exports.getGradleVersion = exports.getBuildRootDirectory = exports.getCacheExcludes = exports.getCacheIncludes = exports.getCacheEncryptionKey = exports.isCacheCleanupEnabled = exports.isCacheDebuggingEnabled = exports.isCacheStrictMatch = exports.isCacheOverwriteExisting = exports.isCacheWriteOnly = exports.isCacheReadOnly = exports.isCacheDisabled = void 0;
exports.JobSummaryOption = exports.DependencyGraphOption = exports.parseNumericInput = exports.getArtifactRetentionDays = exports.getDependencyGraphContinueOnFailure = exports.getDependencyGraphOption = exports.getBuildScanTermsOfServiceAgree = exports.getBuildScanTermsOfServiceUrl = exports.getBuildScanPublishEnabled = exports.getPRCommentOption = exports.getJobSummaryOption = exports.isJobSummaryEnabled = exports.getGithubToken = exports.getJobMatrix = exports.getArguments = exports.getGradleVersion = exports.getBuildRootDirectory = exports.getCacheExcludes = exports.getCacheIncludes = exports.getCacheEncryptionKey = exports.isCacheCleanupEnabled = exports.isCacheDebuggingEnabled = exports.isCacheStrictMatch = exports.isCacheOverwriteExisting = exports.isCacheWriteOnly = exports.isCacheReadOnly = exports.isCacheDisabled = void 0;
const core = __importStar(__nccwpck_require__(42186));
const string_argv_1 = __nccwpck_require__(19663);
function isCacheDisabled() {
@@ -137704,10 +138163,6 @@ function getGradleVersion() {
return core.getInput('gradle-version');
}
exports.getGradleVersion = getGradleVersion;
function getGradleExecutable() {
return core.getInput('gradle-executable');
}
exports.getGradleExecutable = getGradleExecutable;
function getArguments() {
const input = core.getInput('arguments');
return (0, string_argv_1.parseArgsStringToArgv)(input);
@@ -137733,6 +138188,18 @@ function getPRCommentOption() {
return parseJobSummaryOption('add-job-summary-as-pr-comment');
}
exports.getPRCommentOption = getPRCommentOption;
function getBuildScanPublishEnabled() {
return getBooleanInput('build-scan-publish');
}
exports.getBuildScanPublishEnabled = getBuildScanPublishEnabled;
function getBuildScanTermsOfServiceUrl() {
return core.getInput('build-scan-terms-of-service-url');
}
exports.getBuildScanTermsOfServiceUrl = getBuildScanTermsOfServiceUrl;
function getBuildScanTermsOfServiceAgree() {
return core.getInput('build-scan-terms-of-service-agree');
}
exports.getBuildScanTermsOfServiceAgree = getBuildScanTermsOfServiceAgree;
function parseJobSummaryOption(paramName) {
const val = core.getInput(paramName);
switch (val.toLowerCase().trim()) {
@@ -137758,10 +138225,16 @@ function getDependencyGraphOption() {
return DependencyGraphOption.GenerateAndUpload;
case 'download-and-submit':
return DependencyGraphOption.DownloadAndSubmit;
case 'clear':
return DependencyGraphOption.Clear;
}
throw TypeError(`The value '${val}' is not valid for 'dependency-graph'. Valid values are: [disabled, generate, generate-and-submit, generate-and-upload, download-and-submit]. The default value is 'disabled'.`);
throw TypeError(`The value '${val}' is not valid for 'dependency-graph'. Valid values are: [disabled, generate, generate-and-submit, generate-and-upload, download-and-submit, clear]. The default value is 'disabled'.`);
}
exports.getDependencyGraphOption = getDependencyGraphOption;
function getDependencyGraphContinueOnFailure() {
return getBooleanInput('dependency-graph-continue-on-failure', true);
}
exports.getDependencyGraphContinueOnFailure = getDependencyGraphContinueOnFailure;
function getArtifactRetentionDays() {
const val = core.getInput('artifact-retention-days');
return parseNumericInput('artifact-retention-days', val, 0);
@@ -137797,6 +138270,7 @@ var DependencyGraphOption;
DependencyGraphOption["GenerateAndSubmit"] = "generate-and-submit";
DependencyGraphOption["GenerateAndUpload"] = "generate-and-upload";
DependencyGraphOption["DownloadAndSubmit"] = "download-and-submit";
DependencyGraphOption["Clear"] = "clear";
})(DependencyGraphOption || (exports.DependencyGraphOption = DependencyGraphOption = {}));
var JobSummaryOption;
(function (JobSummaryOption) {
@@ -138025,6 +138499,7 @@ Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.run = void 0;
const core = __importStar(__nccwpck_require__(42186));
const setupGradle = __importStar(__nccwpck_require__(18652));
const errors_1 = __nccwpck_require__(36976);
process.on('uncaughtException', e => handleFailure(e));
function run() {
return __awaiter(this, void 0, void 0, function* () {
@@ -138032,8 +138507,14 @@ function run() {
yield setupGradle.complete();
}
catch (error) {
handleFailure(error);
if (error instanceof errors_1.PostActionJobFailure) {
core.setFailed(String(error));
}
else {
handleFailure(error);
}
}
process.exit();
});
}
exports.run = run;
@@ -138145,6 +138626,7 @@ const layout = __importStar(__nccwpck_require__(28182));
const params = __importStar(__nccwpck_require__(23885));
const dependencyGraph = __importStar(__nccwpck_require__(80));
const jobSummary = __importStar(__nccwpck_require__(87345));
const buildScan = __importStar(__nccwpck_require__(85772));
const build_results_1 = __nccwpck_require__(82107);
const cache_reporting_1 = __nccwpck_require__(66674);
const daemon_controller_1 = __nccwpck_require__(85146);
@@ -138168,6 +138650,7 @@ function setup() {
yield caches.restore(userHome, gradleUserHome, cacheListener);
core.saveState(CACHE_LISTENER, cacheListener.stringify());
yield dependencyGraph.setup(params.getDependencyGraphOption());
buildScan.setup();
});
}
exports.setup = setup;
@@ -138547,7 +139030,7 @@ function firstString() {
/***/ ((module) => {
"use strict";
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.0.0","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^5.3.1","crypto":"^1.0.1","jwt-decode":"^3.1.2","twirp-ts":"^2.5.0","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.1.0","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^5.3.1","crypto":"^1.0.1","jwt-decode":"^3.1.2","twirp-ts":"^2.5.0","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
/***/ }),

File diff suppressed because one or more lines are too long

package-lock.json (generated, 1486 lines)

File diff suppressed because it is too large

View File

@@ -10,9 +10,9 @@
"lint": "eslint src/**/*.ts",
"compile-main": "ncc build src/main.ts --out dist/main --source-map --no-source-map-register",
"compile-post": "ncc build src/post.ts --out dist/post --source-map --no-source-map-register",
"compile": "npm run compile-main && npm run compile-post",
"compile": "npm-run-all --parallel compile-*",
"check": "npm-run-all --parallel format lint",
"test": "jest",
"check": "npm run format && npm run lint",
"build": "npm run check && npm run compile",
"all": "npm run build && npm test"
},
@@ -28,8 +28,8 @@
],
"license": "MIT",
"dependencies": {
"@actions/artifact": "2.0.0",
"@actions/cache": "3.2.2",
"@actions/artifact": "2.1.0",
"@actions/cache": "3.2.3",
"@actions/core": "1.10.1",
"@actions/exec": "1.1.1",
"@actions/github": "6.0.0",
@@ -45,17 +45,18 @@
"@types/jest": "29.5.11",
"@types/node": "20.10.0",
"@types/unzipper": "0.10.9",
"@typescript-eslint/parser": "6.17.0",
"@typescript-eslint/parser": "6.19.1",
"@vercel/ncc": "0.38.1",
"eslint": "8.56.0",
"eslint-plugin-github": "4.10.1",
"eslint-plugin-jest": "27.6.1",
"eslint-plugin-prettier": "5.1.2",
"eslint-plugin-jest": "27.6.3",
"eslint-plugin-prettier": "5.1.3",
"jest": "29.7.0",
"js-yaml": "4.1.0",
"npm-run-all": "4.1.5",
"patch-package": "8.0.0",
"prettier": "3.1.1",
"ts-jest": "29.1.1",
"prettier": "3.2.4",
"ts-jest": "29.1.2",
"typescript": "5.3.3"
}
}

src/build-scan.ts (new file, 33 lines)
View File

@@ -0,0 +1,33 @@
import * as core from '@actions/core'
import {
getBuildScanPublishEnabled,
getBuildScanTermsOfServiceUrl,
getBuildScanTermsOfServiceAgree
} from './input-params'
export function setup(): void {
if (getBuildScanPublishEnabled() && verifyTermsOfServiceAgreement()) {
maybeExportVariable('DEVELOCITY_INJECTION_ENABLED', 'true')
maybeExportVariable('DEVELOCITY_PLUGIN_VERSION', '3.16.1')
maybeExportVariable('DEVELOCITY_CCUD_PLUGIN_VERSION', '1.12.1')
maybeExportVariable('BUILD_SCAN_TERMS_OF_SERVICE_URL', getBuildScanTermsOfServiceUrl())
maybeExportVariable('BUILD_SCAN_TERMS_OF_SERVICE_AGREE', getBuildScanTermsOfServiceAgree())
}
}
function verifyTermsOfServiceAgreement(): boolean {
if (
getBuildScanTermsOfServiceUrl() !== 'https://gradle.com/terms-of-service' ||
getBuildScanTermsOfServiceAgree() !== 'yes'
) {
core.warning(`Terms of service must be agreed in order to publish build scans.`)
return false
}
return true
}
function maybeExportVariable(variableName: string, value: unknown): void {
if (!process.env[variableName]) {
core.exportVariable(variableName, value)
}
}
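
A minimal workflow sketch of how this new build-scan publishing path would be enabled. Only the input names and the accepted terms-of-service values are taken from the code in this change set; the action reference, step layout and surrounding job are assumptions:

# Hypothetical step; input names match those read in src/input-params.ts below.
- uses: gradle/gradle-build-action@v2
  with:
    build-scan-publish: true
    build-scan-terms-of-service-url: "https://gradle.com/terms-of-service"
    build-scan-terms-of-service-agree: "yes"

With these inputs set, setup() exports the DEVELOCITY_* and BUILD_SCAN_TERMS_OF_SERVICE_* variables, skipping any that are already present in the environment.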

View File

@@ -203,7 +203,7 @@ export class GradleStateCache {
'gradle-build-action.build-result-capture-service.plugin.groovy',
'gradle-build-action.github-dependency-graph.init.gradle',
'gradle-build-action.github-dependency-graph-gradle-plugin-apply.groovy',
'gradle-build-action.inject-gradle-enterprise.init.gradle'
'gradle-build-action.inject-develocity.init.gradle'
]
for (const initScriptFilename of initScriptFilenames) {
const initScriptContent = this.readResourceFileAsString('init-scripts', initScriptFilename)

View File

@@ -10,7 +10,13 @@ import * as path from 'path'
import fs from 'fs'
import * as layout from './repository-layout'
import {DependencyGraphOption, getJobMatrix, getArtifactRetentionDays} from './input-params'
import {PostActionJobFailure} from './errors'
import {
DependencyGraphOption,
getDependencyGraphContinueOnFailure,
getJobMatrix,
getArtifactRetentionDays
} from './input-params'
const DEPENDENCY_GRAPH_PREFIX = 'dependency-graph_'
@@ -25,16 +31,29 @@ export async function setup(option: DependencyGraphOption): Promise<void> {
}
core.info('Enabling dependency graph generation')
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_ENABLED', 'true')
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_CORRELATOR', getJobCorrelator())
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_ID', github.context.runId)
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_REF', github.context.ref)
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_SHA', getShaFromContext())
core.exportVariable('GITHUB_DEPENDENCY_GRAPH_WORKSPACE', layout.workspaceDirectory())
core.exportVariable(
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_ENABLED', 'true')
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_CONTINUE_ON_FAILURE', getDependencyGraphContinueOnFailure())
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_CORRELATOR', getJobCorrelator())
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_JOB_ID', github.context.runId)
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_REF', github.context.ref)
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_SHA', getShaFromContext())
maybeExportVariable('GITHUB_DEPENDENCY_GRAPH_WORKSPACE', layout.workspaceDirectory())
maybeExportVariable(
'DEPENDENCY_GRAPH_REPORT_DIR',
path.resolve(layout.workspaceDirectory(), 'dependency-graph-reports')
)
// To clear the dependency graph, we generate an empty graph by excluding all projects and configurations
if (option === DependencyGraphOption.Clear) {
core.exportVariable('DEPENDENCY_GRAPH_INCLUDE_PROJECTS', '')
core.exportVariable('DEPENDENCY_GRAPH_INCLUDE_CONFIGURATIONS', '')
}
}
function maybeExportVariable(variableName: string, value: unknown): void {
if (!process.env[variableName]) {
core.exportVariable(variableName, value)
}
}
export async function complete(option: DependencyGraphOption): Promise<void> {
@@ -45,13 +64,14 @@ export async function complete(option: DependencyGraphOption): Promise<void> {
case DependencyGraphOption.DownloadAndSubmit: // Performed in setup
return
case DependencyGraphOption.GenerateAndSubmit:
case DependencyGraphOption.Clear: // Submit the empty dependency graph
await submitDependencyGraphs(await findGeneratedDependencyGraphFiles())
return
case DependencyGraphOption.GenerateAndUpload:
await uploadDependencyGraphs(await findGeneratedDependencyGraphFiles())
}
} catch (e) {
core.warning(`Failed to ${option} dependency graph. Will continue. ${String(e)}`)
warnOrFail(option, e)
}
}
@@ -78,7 +98,7 @@ async function downloadAndSubmitDependencyGraphs(): Promise<void> {
try {
await submitDependencyGraphs(await downloadDependencyGraphs())
} catch (e) {
core.warning(`Download and submit dependency graph failed. Will continue. ${String(e)}`)
warnOrFail(DependencyGraphOption.DownloadAndSubmit, e)
}
}
@@ -88,7 +108,7 @@ async function submitDependencyGraphs(dependencyGraphFiles: string[]): Promise<v
await submitDependencyGraphFile(jsonFile)
} catch (error) {
if (error instanceof RequestError) {
core.warning(buildWarningMessage(jsonFile, error))
throw new Error(translateErrorMessage(jsonFile, error))
} else {
throw error
}
@@ -96,9 +116,9 @@ async function submitDependencyGraphs(dependencyGraphFiles: string[]): Promise<v
}
}
function buildWarningMessage(jsonFile: string, error: RequestError): string {
function translateErrorMessage(jsonFile: string, error: RequestError): string {
const relativeJsonFile = getRelativePathFromWorkspace(jsonFile)
const mainWarning = `Failed to submit dependency graph ${relativeJsonFile}.\n${String(error)}`
const mainWarning = `Dependency submission failed for ${relativeJsonFile}.\n${String(error)}`
if (error.message === 'Resource not accessible by integration') {
return `${mainWarning}
Please ensure that the 'contents: write' permission is available for the workflow job.
@@ -160,6 +180,14 @@ async function findDependencyGraphFiles(dir: string): Promise<string[]> {
return graphFiles
}
function warnOrFail(option: String, error: unknown): void {
if (!getDependencyGraphContinueOnFailure()) {
throw new PostActionJobFailure(error)
}
core.warning(`Failed to ${option} dependency graph. Will continue.\n${String(error)}`)
}
function getOctokit(): InstanceType<typeof GitHub> {
return github.getOctokit(getGithubToken())
}
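
For reference, a sketch of opting into the stricter failure mode introduced by warnOrFail. The input names come from src/input-params.ts in this change set; the rest of the step is illustrative:

# Illustrative step: with continue-on-failure disabled, a failed submission is
# rethrown as PostActionJobFailure and fails the post-action job step.
- uses: gradle/gradle-build-action@v2
  with:
    dependency-graph: generate-and-submit
    dependency-graph-continue-on-failure: false

The input defaults to true, which preserves the previous warn-and-continue behaviour.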

src/errors.ts (new file, 11 lines)
View File

@@ -0,0 +1,11 @@
export class PostActionJobFailure extends Error {
constructor(error: unknown) {
if (error instanceof Error) {
super(error.message)
this.name = error.name
this.stack = error.stack
} else {
super(String(error))
}
}
}

View File

@@ -28,7 +28,7 @@ function validateGradleWrapper(buildRootDirectory: string): void {
function verifyExists(file: string, description: string): void {
if (!fs.existsSync(file)) {
throw new Error(
`Cannot locate ${description} at '${file}'. Specify 'gradle-version' or 'gradle-executable' for projects without Gradle wrapper configured.`
`Cannot locate ${description} at '${file}'. Specify 'gradle-version' for projects without Gradle wrapper configured.`
)
}
}

View File

@@ -49,10 +49,6 @@ export function getGradleVersion(): string {
return core.getInput('gradle-version')
}
export function getGradleExecutable(): string {
return core.getInput('gradle-executable')
}
export function getArguments(): string[] {
const input = core.getInput('arguments')
return parseArgsStringToArgv(input)
@@ -79,6 +75,18 @@ export function getPRCommentOption(): JobSummaryOption {
return parseJobSummaryOption('add-job-summary-as-pr-comment')
}
export function getBuildScanPublishEnabled(): boolean {
return getBooleanInput('build-scan-publish')
}
export function getBuildScanTermsOfServiceUrl(): string {
return core.getInput('build-scan-terms-of-service-url')
}
export function getBuildScanTermsOfServiceAgree(): string {
return core.getInput('build-scan-terms-of-service-agree')
}
function parseJobSummaryOption(paramName: string): JobSummaryOption {
const val = core.getInput(paramName)
switch (val.toLowerCase().trim()) {
@@ -105,12 +113,18 @@ export function getDependencyGraphOption(): DependencyGraphOption {
return DependencyGraphOption.GenerateAndUpload
case 'download-and-submit':
return DependencyGraphOption.DownloadAndSubmit
case 'clear':
return DependencyGraphOption.Clear
}
throw TypeError(
`The value '${val}' is not valid for 'dependency-graph'. Valid values are: [disabled, generate, generate-and-submit, generate-and-upload, download-and-submit]. The default value is 'disabled'.`
`The value '${val}' is not valid for 'dependency-graph'. Valid values are: [disabled, generate, generate-and-submit, generate-and-upload, download-and-submit, clear]. The default value is 'disabled'.`
)
}
export function getDependencyGraphContinueOnFailure(): boolean {
return getBooleanInput('dependency-graph-continue-on-failure', true)
}
export function getArtifactRetentionDays(): number {
const val = core.getInput('artifact-retention-days')
return parseNumericInput('artifact-retention-days', val, 0)
@@ -146,7 +160,8 @@ export enum DependencyGraphOption {
Generate = 'generate',
GenerateAndSubmit = 'generate-and-submit',
GenerateAndUpload = 'generate-and-upload',
DownloadAndSubmit = 'download-and-submit'
DownloadAndSubmit = 'download-and-submit',
Clear = 'clear'
}
export enum JobSummaryOption {

View File

@@ -29,6 +29,9 @@ export async function run(): Promise<void> {
core.info(error.stack)
}
}
// Explicit process.exit() to prevent waiting for hanging promises.
process.exit()
}
run()

View File

@@ -1,5 +1,6 @@
import * as core from '@actions/core'
import * as setupGradle from './setup-gradle'
import {PostActionJobFailure} from './errors'
// Catch and log any unhandled exceptions. These exceptions can leak out of the uploadChunk method in
// @actions/toolkit when a failed upload closes the file descriptor causing any in-process reads to
@@ -13,8 +14,15 @@ export async function run(): Promise<void> {
try {
await setupGradle.complete()
} catch (error) {
handleFailure(error)
if (error instanceof PostActionJobFailure) {
core.setFailed(String(error))
} else {
handleFailure(error)
}
}
// Explicit process.exit() to prevent waiting for promises left hanging by `@actions/cache` on save.
process.exit()
}
function handleFailure(error: unknown): void {

View File

@@ -8,7 +8,6 @@ import * as toolCache from '@actions/tool-cache'
import * as gradlew from './gradlew'
import * as params from './input-params'
import * as layout from './repository-layout'
import {handleCacheFailure, isCacheDisabled, isCacheReadOnly} from './cache-utils'
const gradleVersionsBaseUrl = 'https://services.gradle.org/versions'
@@ -20,13 +19,7 @@ const gradleVersionsBaseUrl = 'https://services.gradle.org/versions'
export async function provisionGradle(): Promise<string | undefined> {
const gradleVersion = params.getGradleVersion()
if (gradleVersion !== '' && gradleVersion !== 'wrapper') {
return addToPath(path.resolve(await installGradle(gradleVersion)))
}
const gradleExecutable = params.getGradleExecutable()
if (gradleExecutable !== '') {
const workspaceDirectory = layout.workspaceDirectory()
return addToPath(path.resolve(workspaceDirectory, gradleExecutable))
return addToPath(await installGradle(gradleVersion))
}
return undefined

View File

@@ -9,7 +9,7 @@ buildscript {
maven { url pluginRepositoryUrl }
}
dependencies {
classpath "org.gradle:github-dependency-graph-gradle-plugin:1.0.0"
classpath "org.gradle:github-dependency-graph-gradle-plugin:1.1.1"
}
}
apply plugin: org.gradle.github.GitHubDependencyGraphPlugin

View File

@@ -6,8 +6,13 @@ if (getVariable('GITHUB_DEPENDENCY_GRAPH_ENABLED') != "true") {
}
// Do not run for unsupported versions of Gradle
if (GradleVersion.current().baseVersion < GradleVersion.version("5.0")) {
println "::warning::Dependency Graph is not supported for Gradle versions < 5.0. No dependency snapshot will be generated."
def gradleVersion = GradleVersion.current().baseVersion
if (gradleVersion < GradleVersion.version("5.2") ||
(gradleVersion >= GradleVersion.version("7.0") && gradleVersion < GradleVersion.version("7.1"))) {
if (getVariable('GITHUB_DEPENDENCY_GRAPH_CONTINUE_ON_FAILURE') != "true") {
throw new GradleException("Dependency Graph is not supported for ${gradleVersion}. No dependency snapshot will be generated.")
}
println "::warning::Dependency Graph is not supported for ${gradleVersion}. No dependency snapshot will be generated."
return
}
@@ -22,11 +27,6 @@ if (isTopLevelBuild) {
return
}
def githubOutput = System.getenv("GITHUB_OUTPUT")
if (githubOutput) {
new File(githubOutput) << "dependency-graph-file=${reportFile.absolutePath}\n"
}
println "Generating dependency graph into '${reportFile}'"
}

View File

@@ -15,21 +15,21 @@ initscript {
}
// finish early if injection is disabled
def gradleInjectionEnabled = getInputParam("gradle-enterprise.injection-enabled")
def gradleInjectionEnabled = getInputParam("develocity.injection-enabled")
if (gradleInjectionEnabled != "true") {
return
}
def pluginRepositoryUrl = getInputParam('gradle-enterprise.plugin-repository.url')
def gePluginVersion = getInputParam('gradle-enterprise.plugin.version')
def ccudPluginVersion = getInputParam('gradle-enterprise.ccud-plugin.version')
def pluginRepositoryUrl = getInputParam('gradle.plugin-repository.url')
def gePluginVersion = getInputParam('develocity.plugin.version')
def ccudPluginVersion = getInputParam('develocity.ccud-plugin.version')
def atLeastGradle5 = GradleVersion.current() >= GradleVersion.version('5.0')
def atLeastGradle4 = GradleVersion.current() >= GradleVersion.version('4.0')
if (gePluginVersion || ccudPluginVersion && atLeastGradle4) {
pluginRepositoryUrl = pluginRepositoryUrl ?: 'https://plugins.gradle.org/m2'
logger.quiet("Gradle Enterprise plugins resolution: $pluginRepositoryUrl")
logger.quiet("Develocity plugins resolution: $pluginRepositoryUrl")
repositories {
maven { url pluginRepositoryUrl }
@@ -52,9 +52,9 @@ initscript {
def BUILD_SCAN_PLUGIN_ID = 'com.gradle.build-scan'
def BUILD_SCAN_PLUGIN_CLASS = 'com.gradle.scan.plugin.BuildScanPlugin'
def GRADLE_ENTERPRISE_PLUGIN_ID = 'com.gradle.enterprise'
def GRADLE_ENTERPRISE_PLUGIN_CLASS = 'com.gradle.enterprise.gradleplugin.GradleEnterprisePlugin'
def GRADLE_ENTERPRISE_EXTENSION_CLASS = 'com.gradle.enterprise.gradleplugin.GradleEnterpriseExtension'
def DEVELOCITY_PLUGIN_ID = 'com.gradle.enterprise'
def DEVELOCITY_PLUGIN_CLASS = 'com.gradle.enterprise.gradleplugin.GradleEnterprisePlugin'
def DEVELOCITY_EXTENSION_CLASS = 'com.gradle.enterprise.gradleplugin.GradleEnterpriseExtension'
def CI_AUTO_INJECTION_CUSTOM_VALUE_NAME = 'CI auto injection'
def CI_AUTO_INJECTION_CUSTOM_VALUE_VALUE = 'gradle-build-action'
def CCUD_PLUGIN_ID = 'com.gradle.common-custom-user-data-gradle-plugin'
@@ -71,17 +71,19 @@ def getInputParam = { String name ->
}
// finish early if injection is disabled
def gradleInjectionEnabled = getInputParam("gradle-enterprise.injection-enabled")
def gradleInjectionEnabled = getInputParam("develocity.injection-enabled")
if (gradleInjectionEnabled != "true") {
return
}
def geUrl = getInputParam('gradle-enterprise.url')
def geAllowUntrustedServer = Boolean.parseBoolean(getInputParam('gradle-enterprise.allow-untrusted-server'))
def geEnforceUrl = Boolean.parseBoolean(getInputParam('gradle-enterprise.enforce-url'))
def buildScanUploadInBackground = Boolean.parseBoolean(getInputParam('gradle-enterprise.build-scan.upload-in-background'))
def gePluginVersion = getInputParam('gradle-enterprise.plugin.version')
def ccudPluginVersion = getInputParam('gradle-enterprise.ccud-plugin.version')
def geUrl = getInputParam('develocity.url')
def geAllowUntrustedServer = Boolean.parseBoolean(getInputParam('develocity.allow-untrusted-server'))
def geEnforceUrl = Boolean.parseBoolean(getInputParam('develocity.enforce-url'))
def buildScanUploadInBackground = Boolean.parseBoolean(getInputParam('develocity.build-scan.upload-in-background'))
def gePluginVersion = getInputParam('develocity.plugin.version')
def ccudPluginVersion = getInputParam('develocity.ccud-plugin.version')
def buildScanTermsOfServiceUrl = getInputParam('build-scan.terms-of-service.url')
def buildScanTermsOfServiceAgree = getInputParam('build-scan.terms-of-service.agree')
def atLeastGradle4 = GradleVersion.current() >= GradleVersion.version('4.0')
@@ -103,10 +105,12 @@ if (GradleVersion.current() < GradleVersion.version('6.0')) {
}
if (!scanPluginComponent) {
logger.quiet("Applying $BUILD_SCAN_PLUGIN_CLASS via init script")
logger.quiet("Connection to Gradle Enterprise: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
applyPluginExternally(pluginManager, BUILD_SCAN_PLUGIN_CLASS)
buildScan.server = geUrl
buildScan.allowUntrustedServer = geAllowUntrustedServer
if (geUrl) {
logger.quiet("Connection to Develocity: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
buildScan.server = geUrl
buildScan.allowUntrustedServer = geAllowUntrustedServer
}
buildScan.publishAlways()
if (buildScan.metaClass.respondsTo(buildScan, 'setUploadInBackground', Boolean)) buildScan.uploadInBackground = buildScanUploadInBackground // uploadInBackground not available for build-scan-plugin 1.16
buildScan.value CI_AUTO_INJECTION_CUSTOM_VALUE_NAME, CI_AUTO_INJECTION_CUSTOM_VALUE_VALUE
@@ -115,12 +119,17 @@ if (GradleVersion.current() < GradleVersion.version('6.0')) {
if (geUrl && geEnforceUrl) {
pluginManager.withPlugin(BUILD_SCAN_PLUGIN_ID) {
afterEvaluate {
logger.quiet("Enforcing Gradle Enterprise: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
logger.quiet("Enforcing Develocity: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
buildScan.server = geUrl
buildScan.allowUntrustedServer = geAllowUntrustedServer
}
}
}
if (buildScanTermsOfServiceUrl && buildScanTermsOfServiceAgree) {
buildScan.termsOfServiceUrl = buildScanTermsOfServiceUrl
buildScan.termsOfServiceAgree = buildScanTermsOfServiceAgree
}
}
if (ccudPluginVersion && atLeastGradle4) {
@@ -137,13 +146,15 @@ if (GradleVersion.current() < GradleVersion.version('6.0')) {
} else {
gradle.settingsEvaluated { settings ->
if (gePluginVersion) {
if (!settings.pluginManager.hasPlugin(GRADLE_ENTERPRISE_PLUGIN_ID)) {
logger.quiet("Applying $GRADLE_ENTERPRISE_PLUGIN_CLASS via init script")
logger.quiet("Connection to Gradle Enterprise: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
applyPluginExternally(settings.pluginManager, GRADLE_ENTERPRISE_PLUGIN_CLASS)
extensionsWithPublicType(settings, GRADLE_ENTERPRISE_EXTENSION_CLASS).collect { settings[it.name] }.each { ext ->
ext.server = geUrl
ext.allowUntrustedServer = geAllowUntrustedServer
if (!settings.pluginManager.hasPlugin(DEVELOCITY_PLUGIN_ID)) {
logger.quiet("Applying $DEVELOCITY_PLUGIN_CLASS via init script")
applyPluginExternally(settings.pluginManager, DEVELOCITY_PLUGIN_CLASS)
eachDevelocityExtension(settings, DEVELOCITY_EXTENSION_CLASS) { ext ->
if (geUrl) {
logger.quiet("Connection to Develocity: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
ext.server = geUrl
ext.allowUntrustedServer = geAllowUntrustedServer
}
ext.buildScan.publishAlways()
ext.buildScan.uploadInBackground = buildScanUploadInBackground
ext.buildScan.value CI_AUTO_INJECTION_CUSTOM_VALUE_NAME, CI_AUTO_INJECTION_CUSTOM_VALUE_VALUE
@@ -151,12 +162,19 @@ if (GradleVersion.current() < GradleVersion.version('6.0')) {
}
if (geUrl && geEnforceUrl) {
extensionsWithPublicType(settings, GRADLE_ENTERPRISE_EXTENSION_CLASS).collect { settings[it.name] }.each { ext ->
logger.quiet("Enforcing Gradle Enterprise: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
eachDevelocityExtension(settings, DEVELOCITY_EXTENSION_CLASS) { ext ->
logger.quiet("Enforcing Develocity: $geUrl, allowUntrustedServer: $geAllowUntrustedServer")
ext.server = geUrl
ext.allowUntrustedServer = geAllowUntrustedServer
}
}
if (buildScanTermsOfServiceUrl && buildScanTermsOfServiceAgree) {
eachDevelocityExtension(settings, DEVELOCITY_EXTENSION_CLASS) { ext ->
ext.buildScan.termsOfServiceUrl = buildScanTermsOfServiceUrl
ext.buildScan.termsOfServiceAgree = buildScanTermsOfServiceAgree
}
}
}
if (ccudPluginVersion) {
@@ -169,7 +187,7 @@ if (GradleVersion.current() < GradleVersion.version('6.0')) {
}
void applyPluginExternally(def pluginManager, String pluginClassName) {
def externallyApplied = 'gradle.enterprise.externally-applied'
def externallyApplied = 'develocity.externally-applied'
def oldValue = System.getProperty(externallyApplied)
System.setProperty(externallyApplied, 'true')
try {
@@ -183,8 +201,9 @@ void applyPluginExternally(def pluginManager, String pluginClassName) {
}
}
static def extensionsWithPublicType(def container, String publicType) {
container.extensions.extensionsSchema.elements.findAll { it.publicType.concreteClass.name == publicType }
static def eachDevelocityExtension(def settings, def publicType, def action) {
settings.extensions.extensionsSchema.elements.findAll { it.publicType.concreteClass.name == publicType }
.collect { settings[it.name] }.each(action)
}
static boolean isNotAtLeast(String versionUnderTest, String referenceVersion) {

View File

@@ -7,6 +7,7 @@ import * as layout from './repository-layout'
import * as params from './input-params'
import * as dependencyGraph from './dependency-graph'
import * as jobSummary from './job-summary'
import * as buildScan from './build-scan'
import {loadBuildResults} from './build-results'
import {CacheListener} from './cache-reporting'
@@ -41,6 +42,8 @@ export async function setup(): Promise<void> {
core.saveState(CACHE_LISTENER, cacheListener.stringify())
await dependencyGraph.setup(params.getDependencyGraphOption())
buildScan.setup()
}
export async function complete(): Promise<void> {

View File

@@ -16,7 +16,7 @@ import java.nio.file.Files
import java.util.zip.GZIPOutputStream
class BaseInitScriptTest extends Specification {
static final String GE_PLUGIN_VERSION = '3.16.1'
static final String DEVELOCITY_PLUGIN_VERSION = '3.16.1'
static final String CCUD_PLUGIN_VERSION = '1.12.1'
static final TestGradleVersion GRADLE_3_X = new TestGradleVersion(GradleVersion.version('3.5.1'), 7, 9)
@@ -24,6 +24,7 @@ class BaseInitScriptTest extends Specification {
static final TestGradleVersion GRADLE_5_X = new TestGradleVersion(GradleVersion.version('5.6.4'), 8, 12)
static final TestGradleVersion GRADLE_6_NO_BUILD_SERVICE = new TestGradleVersion(GradleVersion.version('6.5.1'), 8, 14)
static final TestGradleVersion GRADLE_6_X = new TestGradleVersion(GradleVersion.version('6.9.4'), 8, 15)
static final TestGradleVersion GRADLE_7_1 = new TestGradleVersion(GradleVersion.version('7.6.2'), 8, 19)
static final TestGradleVersion GRADLE_7_X = new TestGradleVersion(GradleVersion.version('7.6.2'), 8, 19)
static final TestGradleVersion GRADLE_8_0 = new TestGradleVersion(GradleVersion.version('8.0.2'), 8, 19)
static final TestGradleVersion GRADLE_8_X = new TestGradleVersion(GradleVersion.version('8.5'), 8, 19)
@@ -146,7 +147,7 @@ class BaseInitScriptTest extends Specification {
} else {
"""
plugins {
id 'com.gradle.enterprise' version '${GE_PLUGIN_VERSION}'
id 'com.gradle.enterprise' version '${DEVELOCITY_PLUGIN_VERSION}'
${ccudPluginVersion ? "id 'com.gradle.common-custom-user-data-gradle-plugin' version '$ccudPluginVersion'" : ""}
}
gradleEnterprise {
@@ -174,7 +175,7 @@ class BaseInitScriptTest extends Specification {
} else if (gradleVersion < GradleVersion.version('6.0')) {
"""
plugins {
id 'com.gradle.build-scan' version '${GE_PLUGIN_VERSION}'
id 'com.gradle.build-scan' version '${DEVELOCITY_PLUGIN_VERSION}'
${ccudPluginVersion ? "id 'com.gradle.common-custom-user-data-gradle-plugin' version '$ccudPluginVersion'" : ""}
}
gradleEnterprise {

View File

@@ -1,11 +1,16 @@
package com.gradle.gradlebuildaction
import org.gradle.util.GradleVersion
import static org.junit.Assume.assumeTrue
class TestDependencyGraph extends BaseInitScriptTest {
def initScript = 'gradle-build-action.github-dependency-graph.init.gradle'
static final List<TestGradleVersion> NO_DEPENDENCY_GRAPH_VERSIONS = [GRADLE_3_X, GRADLE_4_X]
static final TestGradleVersion GRADLE_5_1 = new TestGradleVersion(GradleVersion.version('5.1.1'), 8, 12)
static final TestGradleVersion GRADLE_7_0 = new TestGradleVersion(GradleVersion.version('7.0.1'), 8, 12)
static final List<TestGradleVersion> NO_DEPENDENCY_GRAPH_VERSIONS = [GRADLE_3_X, GRADLE_4_X, GRADLE_5_1, GRADLE_7_0]
static final List<TestGradleVersion> DEPENDENCY_GRAPH_VERSIONS = ALL_VERSIONS - NO_DEPENDENCY_GRAPH_VERSIONS
def "does not produce dependency graph when not enabled"() {
@@ -29,7 +34,6 @@ class TestDependencyGraph extends BaseInitScriptTest {
then:
assert reportFile.exists()
assert gitHubOutputFile.text == "dependency-graph-file=${reportFile.absolutePath}\n"
where:
testGradleVersion << [GRADLE_8_X]
@@ -57,7 +61,23 @@ class TestDependencyGraph extends BaseInitScriptTest {
then:
assert !reportsDir.exists()
assert result.output.contains("::warning::Dependency Graph is not supported")
assert result.output.contains("::warning::Dependency Graph is not supported for ${testGradleVersion}")
where:
testGradleVersion << NO_DEPENDENCY_GRAPH_VERSIONS
}
def "fails build when enabled for older Gradle versions if continue-on-failure is false"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
when:
def vars = envVars
vars.put('GITHUB_DEPENDENCY_GRAPH_CONTINUE_ON_FAILURE', 'false')
def result = runAndFail(['help'], initScript, testGradleVersion.gradleVersion, [], vars)
then:
assert !reportsDir.exists()
assert result.output.contains("Dependency Graph is not supported for ${testGradleVersion}")
where:
testGradleVersion << NO_DEPENDENCY_GRAPH_VERSIONS
@@ -110,13 +130,13 @@ class TestDependencyGraph extends BaseInitScriptTest {
def getEnvVars() {
return [
GITHUB_DEPENDENCY_GRAPH_ENABLED: "true",
GITHUB_DEPENDENCY_GRAPH_CONTINUE_ON_FAILURE: "true",
GITHUB_DEPENDENCY_GRAPH_JOB_CORRELATOR: "CORRELATOR",
GITHUB_DEPENDENCY_GRAPH_JOB_ID: "1",
GITHUB_DEPENDENCY_GRAPH_REF: "main",
GITHUB_DEPENDENCY_GRAPH_SHA: "123456",
GITHUB_DEPENDENCY_GRAPH_WORKSPACE: testProjectDir.absolutePath,
DEPENDENCY_GRAPH_REPORT_DIR: reportsDir.absolutePath,
GITHUB_OUTPUT: gitHubOutputFile.absolutePath
]
}
@@ -127,8 +147,4 @@ class TestDependencyGraph extends BaseInitScriptTest {
def getReportFile() {
return new File(reportsDir, "CORRELATOR.json")
}
def getGitHubOutputFile() {
return new File(testProjectDir, "GITHUB_OUTPUT")
}
}

View File

@@ -5,28 +5,28 @@ import org.gradle.util.GradleVersion
import static org.junit.Assume.assumeTrue
class TestGradleEnterpriseInjection extends BaseInitScriptTest {
class TestDevelocityInjection extends BaseInitScriptTest {
static final List<TestGradleVersion> CCUD_COMPATIBLE_VERSIONS = ALL_VERSIONS - [GRADLE_3_X]
def initScript = 'gradle-build-action.inject-gradle-enterprise.init.gradle'
def initScript = 'gradle-build-action.inject-develocity.init.gradle'
private static final GradleVersion GRADLE_6 = GradleVersion.version('6.0')
def "does not apply GE plugins when not requested"() {
def "does not apply Develocity plugins when not requested"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
when:
def result = run([], initScript, testGradleVersion.gradleVersion)
then:
outputMissesGePluginApplicationViaInitScript(result)
outputMissesDevelocityPluginApplicationViaInitScript(result)
outputMissesCcudPluginApplicationViaInitScript(result)
where:
testGradleVersion << ALL_VERSIONS
}
def "does not override GE plugin when already defined in project"() {
def "does not override Develocity plugin when already defined in project"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
given:
@@ -36,7 +36,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def result = run(testGradleVersion, testConfig())
then:
outputMissesGePluginApplicationViaInitScript(result)
outputMissesDevelocityPluginApplicationViaInitScript(result)
outputMissesCcudPluginApplicationViaInitScript(result)
and:
@@ -46,14 +46,14 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << ALL_VERSIONS
}
def "applies GE plugin via init script when not defined in project"() {
def "applies Develocity plugin via init script when not defined in project"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
when:
def result = run(testGradleVersion, testConfig())
then:
outputContainsGePluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
outputContainsDevelocityPluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
outputMissesCcudPluginApplicationViaInitScript(result)
and:
@@ -63,14 +63,14 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << ALL_VERSIONS
}
def "applies GE and CCUD plugins via init script when not defined in project"() {
def "applies Develocity and CCUD plugins via init script when not defined in project"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
when:
def result = run(testGradleVersion, testConfig().withCCUDPlugin())
then:
outputContainsGePluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
outputContainsDevelocityPluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
outputContainsCcudPluginApplicationViaInitScript(result)
and:
@@ -80,7 +80,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << CCUD_COMPATIBLE_VERSIONS
}
def "applies CCUD plugin via init script where GE plugin already applied"() {
def "applies CCUD plugin via init script where Develocity plugin already applied"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
given:
@@ -90,7 +90,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def result = run(testGradleVersion, testConfig().withCCUDPlugin())
then:
outputMissesGePluginApplicationViaInitScript(result)
outputMissesDevelocityPluginApplicationViaInitScript(result)
outputContainsCcudPluginApplicationViaInitScript(result)
and:
@@ -110,7 +110,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def result = run(testGradleVersion, testConfig().withCCUDPlugin())
then:
outputMissesGePluginApplicationViaInitScript(result)
outputMissesDevelocityPluginApplicationViaInitScript(result)
outputMissesCcudPluginApplicationViaInitScript(result)
and:
@@ -120,18 +120,18 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << CCUD_COMPATIBLE_VERSIONS
}
def "ignores GE URL and allowUntrustedServer when GE plugin is not applied by the init script"() {
def "ignores Develocity URL and allowUntrustedServer when Develocity plugin is not applied by the init script"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
given:
declareGePluginApplication(testGradleVersion.gradleVersion)
when:
-def config = testConfig().withServer(URI.create('https://ge-server.invalid'))
+def config = testConfig().withServer(URI.create('https://develocity-server.invalid'))
def result = run(testGradleVersion, config)
then:
-outputMissesGePluginApplicationViaInitScript(result)
+outputMissesDevelocityPluginApplicationViaInitScript(result)
outputMissesCcudPluginApplicationViaInitScript(result)
and:
@@ -141,7 +141,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << ALL_VERSIONS
}
def "configures GE URL and allowUntrustedServer when GE plugin is applied by the init script"() {
def "configures Develocity URL and allowUntrustedServer when Develocity plugin is applied by the init script"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
when:
@@ -149,8 +149,8 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def result = run(testGradleVersion, config)
then:
-outputContainsGePluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
-outputContainsGeConnectionInfo(result, mockScansServer.address.toString(), true)
+outputContainsDevelocityPluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
+outputContainsDevelocityConnectionInfo(result, mockScansServer.address.toString(), true)
outputMissesCcudPluginApplicationViaInitScript(result)
outputContainsPluginRepositoryInfo(result, 'https://plugins.gradle.org/m2')
@@ -161,22 +161,22 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << ALL_VERSIONS
}
def "enforces GE URL and allowUntrustedServer in project if enforce url parameter is enabled"() {
def "enforces Develocity URL and allowUntrustedServer in project if enforce url parameter is enabled"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
given:
-declareGePluginApplication(testGradleVersion.gradleVersion, URI.create('https://ge-server.invalid'))
+declareGePluginApplication(testGradleVersion.gradleVersion, URI.create('https://develocity-server.invalid'))
when:
def config = testConfig().withServer(mockScansServer.address, true)
def result = run(testGradleVersion, config)
then:
-outputMissesGePluginApplicationViaInitScript(result)
+outputMissesDevelocityPluginApplicationViaInitScript(result)
outputMissesCcudPluginApplicationViaInitScript(result)
and:
-outputEnforcesGeUrl(result, mockScansServer.address.toString(), true)
+outputEnforcesDevelocityUrl(result, mockScansServer.address.toString(), true)
and:
outputContainsBuildScanUrl(result)
@@ -185,7 +185,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << ALL_VERSIONS
}
def "can configure alternative repository for plugins when GE plugin is applied by the init script"() {
def "can configure alternative repository for plugins when Develocity plugin is applied by the init script"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
when:
@@ -193,8 +193,8 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def result = run(testGradleVersion, config)
then:
-outputContainsGePluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
-outputContainsGeConnectionInfo(result, mockScansServer.address.toString(), true)
+outputContainsDevelocityPluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
+outputContainsDevelocityConnectionInfo(result, mockScansServer.address.toString(), true)
outputMissesCcudPluginApplicationViaInitScript(result)
outputContainsPluginRepositoryInfo(result, 'https://plugins.grdev.net/m2')
@@ -213,7 +213,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def result = run(testGradleVersion, config)
then:
-outputMissesGePluginApplicationViaInitScript(result)
+outputMissesDevelocityPluginApplicationViaInitScript(result)
outputMissesCcudPluginApplicationViaInitScript(result)
result.output.contains('Common Custom User Data Gradle plugin must be at least 1.7. Configured version is 1.6.6.')
@@ -221,15 +221,15 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
testGradleVersion << ALL_VERSIONS
}
def "can configure GE via CCUD system property overrides when CCUD plugin is inject via init script"() {
def "can configure Develocity via CCUD system property overrides when CCUD plugin is inject via init script"() {
assumeTrue testGradleVersion.compatibleWithCurrentJvm
when:
-def config = testConfig().withCCUDPlugin().withServer(URI.create('https://ge-server.invalid'))
+def config = testConfig().withCCUDPlugin().withServer(URI.create('https://develocity-server.invalid'))
def result = run(testGradleVersion, config, ["help", "-Dgradle.enterprise.url=${mockScansServer.address}".toString()])
then:
-outputContainsGePluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
+outputContainsDevelocityPluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
outputContainsCcudPluginApplicationViaInitScript(result)
and:
@@ -247,7 +247,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def result = run(testGradleVersion, config, ["help", "--configuration-cache"])
then:
-outputContainsGePluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
+outputContainsDevelocityPluginApplicationViaInitScript(result, testGradleVersion.gradleVersion)
outputContainsCcudPluginApplicationViaInitScript(result)
and:
@@ -257,7 +257,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
result = run(testGradleVersion, config, ["help", "--configuration-cache"])
then:
-outputMissesGePluginApplicationViaInitScript(result)
+outputMissesDevelocityPluginApplicationViaInitScript(result)
outputMissesCcudPluginApplicationViaInitScript(result)
and:
@@ -273,7 +273,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
assert 1 == result.output.count(message)
}
-void outputContainsGePluginApplicationViaInitScript(BuildResult result, GradleVersion gradleVersion) {
+void outputContainsDevelocityPluginApplicationViaInitScript(BuildResult result, GradleVersion gradleVersion) {
def pluginApplicationLogMsgGradle4And5 = "Applying com.gradle.scan.plugin.BuildScanPlugin via init script"
def pluginApplicationLogMsgGradle6AndHigher = "Applying com.gradle.enterprise.gradleplugin.GradleEnterprisePlugin via init script"
if (gradleVersion < GRADLE_6) {
@@ -287,7 +287,7 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
}
}
-void outputMissesGePluginApplicationViaInitScript(BuildResult result) {
+void outputMissesDevelocityPluginApplicationViaInitScript(BuildResult result) {
def pluginApplicationLogMsgGradle4And5 = "Applying com.gradle.scan.plugin.BuildScanPlugin via init script"
def pluginApplicationLogMsgGradle6AndHigher = "Applying com.gradle.enterprise.gradleplugin.GradleEnterprisePlugin via init script"
assert !result.output.contains(pluginApplicationLogMsgGradle4And5)
@@ -305,20 +305,20 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
assert !result.output.contains(pluginApplicationLogMsg)
}
-void outputContainsGeConnectionInfo(BuildResult result, String geUrl, boolean geAllowUntrustedServer) {
-def geConnectionInfo = "Connection to Gradle Enterprise: $geUrl, allowUntrustedServer: $geAllowUntrustedServer"
+void outputContainsDevelocityConnectionInfo(BuildResult result, String geUrl, boolean geAllowUntrustedServer) {
+def geConnectionInfo = "Connection to Develocity: $geUrl, allowUntrustedServer: $geAllowUntrustedServer"
assert result.output.contains(geConnectionInfo)
assert 1 == result.output.count(geConnectionInfo)
}
void outputContainsPluginRepositoryInfo(BuildResult result, String gradlePluginRepositoryUrl) {
def repositoryInfo = "Gradle Enterprise plugins resolution: ${gradlePluginRepositoryUrl}"
def repositoryInfo = "Develocity plugins resolution: ${gradlePluginRepositoryUrl}"
assert result.output.contains(repositoryInfo)
assert 1 == result.output.count(repositoryInfo)
}
-void outputEnforcesGeUrl(BuildResult result, String geUrl, boolean geAllowUntrustedServer) {
-def enforceUrl = "Enforcing Gradle Enterprise: $geUrl, allowUntrustedServer: $geAllowUntrustedServer"
+void outputEnforcesDevelocityUrl(BuildResult result, String geUrl, boolean geAllowUntrustedServer) {
+def enforceUrl = "Enforcing Develocity: $geUrl, allowUntrustedServer: $geAllowUntrustedServer"
assert result.output.contains(enforceUrl)
assert 1 == result.output.count(enforceUrl)
}
@@ -369,31 +369,31 @@ class TestGradleEnterpriseInjection extends BaseInitScriptTest {
def getEnvVars() {
Map<String, String> envVars = [
-GRADLE_ENTERPRISE_INJECTION_ENABLED: "true",
-GRADLE_ENTERPRISE_URL: serverUrl,
-GRADLE_ENTERPRISE_ALLOW_UNTRUSTED_SERVER: "true",
-GRADLE_ENTERPRISE_PLUGIN_VERSION: GE_PLUGIN_VERSION,
-GRADLE_ENTERPRISE_BUILD_SCAN_UPLOAD_IN_BACKGROUND: "true" // Need to upload in background since our Mock server doesn't cope with foreground upload
+DEVELOCITY_INJECTION_ENABLED: "true",
+DEVELOCITY_URL: serverUrl,
+DEVELOCITY_ALLOW_UNTRUSTED_SERVER: "true",
+DEVELOCITY_PLUGIN_VERSION: DEVELOCITY_PLUGIN_VERSION,
+DEVELOCITY_BUILD_SCAN_UPLOAD_IN_BACKGROUND: "true" // Need to upload in background since our Mock server doesn't cope with foreground upload
]
if (enforceUrl) envVars.put("GRADLE_ENTERPRISE_ENFORCE_URL", "true")
if (ccudPluginVersion != null) envVars.put("GRADLE_ENTERPRISE_CCUD_PLUGIN_VERSION", ccudPluginVersion)
if (pluginRepositoryUrl != null) envVars.put("GRADLE_ENTERPRISE_PLUGIN_REPOSITORY_URL", pluginRepositoryUrl)
if (enforceUrl) envVars.put("DEVELOCITY_ENFORCE_URL", "true")
if (ccudPluginVersion != null) envVars.put("DEVELOCITY_CCUD_PLUGIN_VERSION", ccudPluginVersion)
if (pluginRepositoryUrl != null) envVars.put("GRADLE_PLUGIN_REPOSITORY_URL", pluginRepositoryUrl)
return envVars
}
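// Illustrative sketch only (not part of this diff): a consumer of the renamed variables above
// could read them back with System.getenv(). The variable names match the envVars map; the
// fallback defaults used here are assumptions for the sketch, not behaviour defined by the action.
def injectionEnabled = Boolean.parseBoolean(System.getenv('DEVELOCITY_INJECTION_ENABLED') ?: 'false')
def develocityUrl = System.getenv('DEVELOCITY_URL')
def allowUntrustedServer = Boolean.parseBoolean(System.getenv('DEVELOCITY_ALLOW_UNTRUSTED_SERVER') ?: 'false')
def ccudPluginVersion = System.getenv('DEVELOCITY_CCUD_PLUGIN_VERSION') // null unless withCCUDPlugin() is used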
def getJvmArgs() {
List<String> jvmArgs = [
"-Dgradle-enterprise.injection-enabled=true",
"-Dgradle-enterprise.url=$serverUrl",
"-Dgradle-enterprise.allow-untrusted-server=true",
"-Dgradle-enterprise.plugin.version=$GE_PLUGIN_VERSION",
"-Dgradle-enterprise.build-scan.upload-in-background=true"
"-Ddevelocity.injection-enabled=true",
"-Ddevelocity.url=$serverUrl",
"-Ddevelocity.allow-untrusted-server=true",
"-Ddevelocity.plugin.version=$DEVELOCITY_PLUGIN_VERSION",
"-Ddevelocity.build-scan.upload-in-background=true"
]
if (enforceUrl) jvmArgs.add("-Dgradle-enterprise.enforce-url=true")
if (ccudPluginVersion != null) jvmArgs.add("-Dgradle-enterprise.ccud-plugin.version=$ccudPluginVersion")
if (pluginRepositoryUrl != null) jvmArgs.add("-Dgradle-enterprise.plugin-repository.url=$pluginRepositoryUrl")
if (enforceUrl) jvmArgs.add("-Ddevelocity.enforce-url=true")
if (ccudPluginVersion != null) jvmArgs.add("-Ddevelocity.ccud-plugin.version=$ccudPluginVersion")
if (pluginRepositoryUrl != null) jvmArgs.add("-Dgradle.plugin-repository.url=$pluginRepositoryUrl")
return jvmArgs.collect { it.toString() } // Convert from GStrings
}
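For context, the renamed develocity.* system properties set via the jvmArgs list above can be resolved with plain System.getProperty calls. The Groovy sketch below is illustrative only: the develocityProperty helper and its defaults are assumptions for this sketch, not part of the change; the message it prints is the format the outputContainsDevelocityConnectionInfo assertion checks for.

// Minimal, illustrative sketch: resolve the renamed develocity.* system properties.
// Helper name and default values are assumptions, not the action's actual init-script logic.
String develocityProperty(String name, String defaultValue = null) {
    return System.getProperty("develocity.${name}", defaultValue)
}

def enabled = Boolean.parseBoolean(develocityProperty('injection-enabled', 'false'))
def url = develocityProperty('url')
def allowUntrusted = develocityProperty('allow-untrusted-server', 'false')

if (enabled && url) {
    // Same message format that outputContainsDevelocityConnectionInfo asserts on.
    println "Connection to Develocity: ${url}, allowUntrustedServer: ${allowUntrusted}"
}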