As of version 0.3, the minimum version of Gradle that is supported by Grolifant is 2.8.


The library is available on JCenter. Add the following to your Gradle build script to use it.

Adding Grolifant as a compile dependency
repositories {
  jcenter()
}

dependencies {
  compile 'org.ysb33r.gradle:grolifant:0.16.2'
}

Distribution Installer

There are quite a number of occasions where it would be useful to download various versions SDK or distributions from a variety of sources and then install them locally without having to affect the environment of a user. The Gradle Wrapper is already a good example of this. Obviously it would be good if one could also utilise other solutions that manage distributions and SDKs on a per-user basis such as the excellent SDKMAN!.

The AbstractDistributionInstaller abstract class provides the base for plugin developers to add such functionality to their plugins without too much trouble.

Getting started

class TestInstaller extends AbstractDistributionInstaller {
        static final String DISTPATH = 'foo/bar'
        static final String DISTVER = '0.1'

        TestInstaller(Project project) {
            super('Test Distribution', DISTVER, DISTPATH, project) (1)
        }

        @Override
        URI uriFromVersion(String version) { (2)
            TESTDIST_DIR.toURI().resolve("testdist-${DISTVER}.zip") (3)
        }
}
1 The installer needs to be provided with a human-readable name, the version of the distribution, a relative path below the installation root for installing this type of distribution and a reference to an existing Gradle Project instance.
2 The uriFromVersion method is used to return an appropriate URI from which to download the specific version of the distribution. Supported protocols are all those supported by the Gradle Wrapper, which includes file, http(s) and ftp.
3 Use code appropriate to your specific distribution to calculate the URI.

The download is invoked by calling the getDistributionRoot method.
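A minimal usage sketch, assuming the TestInstaller from Getting Started and an existing Project instance:

```groovy
// Sketch only: TestInstaller is the example class from Getting Started.
TestInstaller installer = new TestInstaller(project)

// Downloads (if necessary), unpacks and caches the distribution,
// then returns the root directory of the unpacked distribution.
File distRoot = installer.getDistributionRoot()
```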

The above example uses Groovy to implement an installer class, but you can use Java, Kotlin or any other JVM-language that works for writing Gradle plugins.

How it works

When getDistributionRoot is called, it effectively uses the following logic:

File location = locateDistributionInCustomLocation(distributionVersion) (1)

if (location == null && this.sdkManCandidateName) { (2)
    location = distFromSdkMan
}

location ?: distFromCache (3)
1 If a custom location is specified, look there first for the specific version.
2 If SDKMAN! has been enabled, check whether it has the distribution available.
3 Try to get it from the cache. If it is not in the cache, try to download it.

Marking files executable

Some distributed archives are platform-agnostic and do not preserve file permissions, so it is necessary to mark specific files as executable after unpacking. The addExecPattern method can be used for this purpose.

TestInstaller installer = new TestInstaller(project)
installer.addExecPattern '**/*.sh' (1)
1 Assuming the TestInstaller from Getting Started, this example will mark all shell files in the distribution as executable once the archive has been unpacked.

Patterns are ANT-style patterns as is common in a number of Gradle APIs.

Search in custom locations

The locateDistributionInCustomLocation method can be used for setting up a search in specific locations.

For example, a person implementing a Ceylon language plugin might want to look in the ~/.ceylon folder for an existing installation of a specific version.

This optional implementation is completely left up to the plugin author as it will be very specific to a distribution. The method should return null if nothing was found.
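A hypothetical override for the Ceylon example above might look as follows. The directory layout under ~/.ceylon is an assumption for illustration only.

```groovy
@Override
File locateDistributionInCustomLocation(String version) {
    // Assumed layout: ~/.ceylon/dist/ceylon-<version>
    File candidate = new File(
        System.getProperty('user.home'),
        ".ceylon/dist/ceylon-${version}"
    )
    candidate.directory ? candidate : null  // null means "not found here"
}
```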

Changing the download and unpack root location

By default, downloaded distributions are placed in a subfolder below the Gradle user home directory, as specified at construction time. It is possible, especially for testing purposes, to use a root folder other than the Gradle user home by setting the downloadRoot property.
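As a sketch, assuming the TestInstaller from Getting Started, the download root could be redirected to the build directory during tests:

```groovy
TestInstaller installer = new TestInstaller(project)

// Download and unpack below the build directory instead of the
// Gradle user home. (Use setDownloadRoot from Java or Kotlin.)
installer.downloadRoot = new File(project.buildDir, 'test-distributions')
```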

Utilising SDKMAN!

SDKMAN! is a very useful local SDK installation and management tool and when specific SDKs or distributions are already supported it makes sense to re-use them in order to save on download time.

All that is required is to provide the SDKMAN! candidate name using the setSdkManCandidateName method.

Utilising SDKMAN!
installer.sdkManCandidateName = 'ceylon' (1)
1 Sets the candidate name for a distribution as it will be known to SDKMAN!. In this example the Ceylon language distribution is used.


By default the installer will not verify the downloaded archive, but calling setChecksum will force the installer to perform a check after downloading and before unpacking. This behaviour can be customised by overriding verification.

TestInstaller installer = new TestInstaller(project)
installer.checksum = 'b1741e3d2a3f7047d041c79d018cf55286d1168fd6f0533e7fae897478abcdef'  (1)
1 Provide SHA-256 checksum string

Only SHA-256 checksums are supported. If you need something else you will need to override verification and provide your own checksum test.

Advanced: Override unpacking

By default, AbstractDistributionInstaller already knows how to unpack ZIPs and TARs with a variety of compression formats. If something else is required, then the unpack method can be overridden.

This is the approach to follow if you need support for unpacking MSIs. There is a helper method called unpackMSI which will install the lessmsi utility and then call it with the correct parameters. To use this in a practical way it is better to override the unpack method and call unpackMSI from there. For example:

Overriding for adding MSI support.
protected void unpack(File srcArchive, File destDir) {
    if (srcArchive.name.endsWith('.msi')) {
        unpackMSI(srcArchive, destDir, [:])  (1)

        // Add additional file and directory manipulation here if needed

    } else {
        super.unpack(srcArchive, destDir)
    }
}
1 The third parameter can be used to set up a special environment for lessmsi if needed.

Advanced: Override verification

Verification of a downloaded distribution occurs in two parts:

  • If a checksum is supplied, the downloaded archive is validated against the checksum. The standard implementation will only check SHA-256 checksums.

  • The unpacked distribution is then checked for sanity. In the default implementation this is simply to check that only one directory was unpacked below the distribution directory. The latter is effectively just replicating the Gradle Wrapper behaviour.

Once again it is possible to customise this behaviour if your distribution has different needs. In this case there are two protected methods that can be overridden:

  • verifyDownloadChecksum - Override this method to take care of handling checksums. The method, when called, will be passed the URI where the distribution was downloaded from, the location of the archive on the filesystem and the expected checksum. It is possible to pass null for the latter which means that no checksum is available.

  • getAndVerifyDistributionRoot - This validates the distribution on disk. When called, it is passed the location where the distribution was unpacked into. The method should return the effective home directory of the distribution.

In the case of getAndVerifyDistributionRoot it can sometimes be confusing as to what the distDir is and what should be returned. This is easiest to explain by looking at how Gradle wrappers are stored. For instance, for Gradle 3.0 the distDir might be something like ~/.gradle/wrapper/dists/gradle-3.0-bin/2z3tfybitalx2py5dr8rf2mti/ whereas the returned directory would be ~/.gradle/wrapper/dists/gradle-3.0-bin/2z3tfybitalx2py5dr8rf2mti/gradle-3.0.
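As an illustration, a hypothetical verifyDownloadChecksum override could skip verification when no checksum is available and otherwise delegate to the standard SHA-256 check. The parameter order follows the description above; the parameter names are illustrative.

```groovy
@Override
protected void verifyDownloadChecksum(URI downloadedFrom, File localArchive, String expectedChecksum) {
    // expectedChecksum may be null, meaning no checksum is available.
    if (expectedChecksum != null) {
        // Replace this delegation with a custom algorithm if your
        // distribution publishes something other than SHA-256.
        super.verifyDownloadChecksum(downloadedFrom, localArchive, expectedChecksum)
    }
}
```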

Helper and other protected API methods

  • getProject provides access to the associated Gradle Project object.

  • listDirs provides a listing of directories directly below an unpacked distribution. It can also be used for any directory if the intent is to see which child directories are available.

  • getLogger provides access to a simple stdout logger.

Unpacking DMG files

Since 0.6 there is a utility that can be used to unpack DMG files, called UnpackUtils.unpackDmgOnMacOsX. On non-macOS platforms it is a no-op if called.

DMG files are not unpacked automatically by AbstractDistributionInstaller. The plugin implementor will need to override the unpack method in order to call the DMG unpacker and also add the appropriate logic.
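A sketch of such an override, assuming an installer class as in Getting Started. The exact parameter list of unpackDmgOnMacOsX shown here is an assumption; consult the UnpackUtils API documentation for the real signature.

```groovy
@Override
protected void unpack(File srcArchive, File destDir) {
    if (srcArchive.name.endsWith('.dmg')) {
        // Assumed parameters; the call is a no-op on non-macOS platforms.
        UnpackUtils.unpackDmgOnMacOsX(project, srcArchive.name, srcArchive, destDir)
    } else {
        super.unpack(srcArchive, destDir)
    }
}
```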

Creating Script Wrappers

Consider that you have a plugin that already wraps node or terraform and you want to try something on the command line with the tool directly. You do not want to install the tool again if you could simply use the already cached version, which in any case would be of the correct version as required by the project.

You are probably very familiar with the Gradle wrapper. Under certain circumstances it could be nice to create similar wrappers that call the executables from distributions that were installed using the [DistributionInstaller]. Since 0.14 Grolifant offers two abstract task types to help you add such functionality to your plugins.

These task types attempt to address the following:

  • Create wrappers for tools, be they executables or scripts, that will point to the correct version as required by the specific project.

  • Become out of date if the version or location of the distribution/tool changes.

  • Cache the distribution/tool if it is not yet cached.

Creating a wrapper task

Let’s assume you would like to create a plugin for Hashicorp’s Packer, and that you have already created an extension class called PackerExtension which extends AbstractToolExtension. Assume also that this class knows how to download packer for the appropriate platform, which you probably implemented using AbstractDistributionInstaller.

In 0.14 the only supported implementation is to place the template files in a directory path in resources and then substitute values by tokens. This implementation uses Ant ReplaceTokens under the hood.

Start by extending AbstractScriptWrapperTask

class PackerWrapper extends AbstractScriptWrapperTask {
    PackerWrapper() {
        useWrapperTemplatesInResources( (1)
            '/packer-wrappers', (2)
            [ '' : 'packerw', (3)
              'wrapper-template.bat': 'packerw.bat' ]
        )
    }
}
1 Although this is currently the only supported method, it has to be explicitly specified that wrapper templates are in resources.
2 Specify the resource path where the wrapper templates can be found. This resource path will be scanned for files as defined below.
3 Specify a map which maps the names of files in the resource path to final file names. The format is [ <WRAPPER TEMPLATE NAME> : <FINAL SCRIPT NAME> ]. Although the final script names can be specified using a relative path, the convention is to simply place the wrapper scripts in the project directory. See the example script wrappers for some inspiration.

The next step is to provide the tokens that can be substituted, by implementing the appropriate abstract methods.

@Override
protected String getBeginToken() { (1)
    '~~'
}

@Override
protected String getEndToken() { (2)
    '~~'
}

@Override
protected Map<String, String> getTokenValuesAsMap() { (3)
    [
        APP_BASE_NAME               : 'packer',
        APP_LOCATION_FILE           : '/path/to/packer'
    ]
}
1 Start token for substitution. This can be anything. This example uses ~~ because it matches the delimiter from the example script wrappers.
2 End token for substitution.
3 Return a map of the values for substituting into the template when creating the scripts.

At this point you can test the task and it should generate wrappers, however there are a number of shortcomings:

  • When somebody clones a project that contains the wrappers for the first time, there is a good chance that the wrapped binaries will not be cached either, and when they are cached they might end up at a different location depending on the environment of the user.

  • The classic place to cache something is in the project cache directory, but this can be overridden from the command-line, so special care has to be taken.

  • You might have pulled an updated version of the project and the version of the wrapped binary has been changed by the project maintainers.

Let’s start by creating a caching task first.

Creating a caching task

Create a task type that extends AbstractCacheBinaryTask.

class PackerCacheBinary extends AbstractCacheBinaryTask {
    PackerCacheBinary() {
        super('') (1)
    }
}
1 Define a properties file that will store appropriate information about the cached binary. This file will be local to the project on a user’s computer or in CI.

There are three minimum characteristics that need to be defined:

  • Version of the binary/script/distribution, if it was set via executable version: '1.2.3'.

  • The location of the binary/script.

  • A description of the wrapper.

This is done by implementing three abstract methods.

protected String getBinaryVersion() {
    PackerExtension packerExtension = project.extensions.getByType(PackerExtension) (1)
    switch (packerExtension.resolvableExecutableType.type) { (2)
        case 'version': (3)
            return packerExtension.resolvableExecutableType.value.get() (4)
        default:
            return '' (5)
    }
}

protected String getBinaryLocation() {
    PackerExtension packerExtension = project.extensions.getByType(PackerExtension)
    packerExtension.resolvableExecutable.getExecutable().canonicalPath (6)
}

protected String getPropertiesDescription() {
    "Describes the Packer usage for the ${} project" (7)
}
1 Working on the assumption you created PackerExtension as mentioned earlier.
2 Query how the Packer binary should be obtained
3 If it was defined via executable: version '1.2.3' it should be easy to retrieve the version.
4 Use resolvableExecutableType to obtain the version.
5 For other definitions, it might not be possible or even needed to know what the version is. For a specific tool there might be different cases, but for most tools, the standard cases of path and search will not make a version possible.
6 Resolve the executable path. This will also result in the binary/distribution being cached.
7 A simple one-liner describing what the properties file is about.

If you execute an instance of your new task type it will automatically cache the binary/distribution dependent on how it has been defined. It will also generate a properties file into the project cache directory. This latter file should be ignored by source control and the project cache directory should never be in source control.

The next step is to revisit the wrapper task and link it to the caching task.

Linking the caching and wrapper tasks

Return to the wrapper task and modify the constructor as follows:

class PackerWrapper extends AbstractScriptWrapperTask {
    private final PackerCacheBinary cacheTask

    PackerWrapper(PackerCacheBinary cacheTask) { (1)
        this.cacheTask = cacheTask
        inputs.file(cacheTask.locationPropertiesFile) (2)
        dependsOn(cacheTask) (3)

        def mapping = [
            '' : 'packerw',
            'wrapper-template.bat': 'packerw.bat'
        ]

        useWrapperTemplatesInResources(
            '/packer-wrappers', mapping
        )

        outputs.files(mapping.values().collect { (4)
            new File(project.projectDir, it)
        })
    }
}
1 Restrict the wrapper task type to only be instantiated if there is an associated caching task.
2 If the location of the properties file has changed, then the wrapper task should be out of date.
3 If the wrapper task is run, then the caching task should also be run if out of date.
4 If the wrapper scripts do not exist, the task should be executed. Use a property rather than an @OutputFiles annotation, as the scripts are not used by other tasks directly.

Now change your getTokenValuesAsMap method. (Once again we base these tokens on the ones used in [ExampleScriptWrapper]).

protected Map<String, String> getTokenValuesAsMap() {
    [
        APP_BASE_NAME               : 'packer',
        GRADLE_WRAPPER_RELATIVE_PATH: project.relativePath(project.rootDir), (1)
        DOT_GRADLE_RELATIVE_PATH    : project.relativePath(cacheTask.locationPropertiesFile.get().parentFile), (2)
        APP_LOCATION_FILE           : cacheTask.locationPropertiesFile.get().name, (3)
        CACHE_TASK_NAME             : (4)
    ]
}
1 If the project uses a Gradle wrapper it is important that the tool wrapper script also use the Gradle wrapper to invoke the caching task.
2 Get the location of the project cache directory. You can also use project.relativePath(FileUtils.projectCacheDirFor(project)).
3 The name of the wrapper properties file.
4 The name of the cache task to invoke if either the wrapper properties file does not exist or the distribution/binary has not been cached.

Putting everything in a plugin

It is recommended that the tasks created by convention are placed in a separate plugin, and that plugin users are advised to only apply this plugin in the root project of a multi-project build.

In your plugin add the following code to the apply method.

PackerCacheBinary packerCacheBinary = project.tasks.create('cachePackerBinary', PackerCacheBinary)
project.tasks.create('packerWrapper', PackerWrapper, packerCacheBinary)

Example script wrappers

These are provided as starting points for wrapping simple binary tools. They have been cobbled together from various other open-source examples.

For shell scripts
#!/usr/bin/env sh

##  ~~APP_BASE_NAME~~ wrapper script for UN*X

# Relative path from this script to the directory where the Gradle wrapper
# might be found.
GRADLE_WRAPPER_RELATIVE_PATH=~~GRADLE_WRAPPER_RELATIVE_PATH~~

# Relative path from this script to the project cache dir (usually .gradle).
DOT_GRADLE_RELATIVE_PATH=~~DOT_GRADLE_RELATIVE_PATH~~

# Need this for relative symlinks.
PRG="$0"
while [ -h "$PRG" ] ; do
    ls=`ls -ld "$PRG"`
    link=`expr "$ls" : '.*-> \(.*\)$'`
    if expr "$link" : '/.*' > /dev/null; then
        PRG="$link"
    else
        PRG=`dirname "$PRG"`"/$link"
    fi
done

SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null

# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
case "`uname`" in
  CYGWIN* )
    cygwin=true
    ;;
  Darwin* )
    darwin=true
    ;;
  MINGW* )
    msys=true
    ;;
esac

# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
    APP_HOME=`cygpath --path --mixed "$APP_HOME"`
    CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
    JAVACMD=`cygpath --unix "$JAVACMD"`

    # We build the pattern for arguments to be converted via cygpath
    ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
    SEP=""
    for dir in $ROOTDIRSRAW ; do
        ROOTDIRS="$ROOTDIRS$SEP$dir"
        SEP="|"
    done
    OURCYGPATTERN="(^($ROOTDIRS))"
    # Add a user-defined pattern to the cygpath arguments
    if [ "$GRADLE_CYGPATTERN" != "" ] ; then
        OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
    fi
    # Now convert the arguments - kludge to limit ourselves to /bin/sh
    i=0
    for arg in "$@" ; do
        CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
        CHECK2=`echo "$arg"|egrep -c "^-"`                                 ### Determine if an option

        if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then                    ### Added a condition
            eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
        else
            eval `echo args$i`="\"$arg\""
        fi
        i=$((i+1))
    done
    case $i in
        (0) set -- ;;
        (1) set -- "$args0" ;;
        (2) set -- "$args0" "$args1" ;;
        (3) set -- "$args0" "$args1" "$args2" ;;
        (4) set -- "$args0" "$args1" "$args2" "$args3" ;;
        (5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
        (6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
        (7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
        (8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
        (9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
    esac
fi


run_gradle ( ) {
  if [ -x "$GRADLE_WRAPPER_RELATIVE_PATH/gradlew" ] ; then
    "$GRADLE_WRAPPER_RELATIVE_PATH/gradlew" "$@"
  else
    gradle "$@"
  fi
}

app_property ( ) {
    echo `cat $APP_LOCATION_FILE | grep $1 | cut -f2 -d=`
}

# If the app location is not available, set it first via Gradle
if [ ! -f $APP_LOCATION_FILE ] ; then
  run_gradle -q ~~CACHE_TASK_NAME~~
fi

# Now read in the configuration values for later usage

# If the app is not available, download it first via Gradle
if [ ! -f $APP_LOCATION ] ; then
  run_gradle -q ~~CACHE_TASK_NAME~~
fi

# If global configuration is disabled, which is the default, then
# point the Terraform config to the generated configuration file
# if it exists.
if [ -z $TF_CLI_CONFIG_FILE ] ; then
    if [ $USE_GLOBAL_CONFIG == 'false' ] ; then
        CONFIG_LOCATION=`app_property configLocation`
        if [ ! -f $CONFIG_LOCATION ] ; then
          echo Config location specified as $CONFIG_LOCATION, but file does not exist. >&2
          echo Please run the terraformrc Gradle task before using $(basename $0) again >&2
          exit 1
        fi
        export TF_CLI_CONFIG_FILE=$CONFIG_LOCATION
    fi
fi

# If we are in a project containing a default Terraform source set
# then point the data directory to the default location.
if [ -z $TF_DATA_DIR ] ; then
    if [ -f $PWD/src/tf/main ] ; then
        export TF_DATA_DIR=$PWD/build/tf/main
        echo $TF_DATA_DIR will be used as data directory >&2
    fi
fi

exec $APP_LOCATION "$@"
For Windows batch files
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem  ~~APP_BASE_NAME~~ wrapper script for Windows
@rem ##########################################################################

@rem Relative path from this script to the directory where the Gradle wrapper
@rem might be found.
set GRADLE_WRAPPER_RELATIVE_PATH=~~GRADLE_WRAPPER_RELATIVE_PATH~~

@rem Relative path from this script to the project cache dir (usually .gradle).
set DOT_GRADLE_RELATIVE_PATH=~~DOT_GRADLE_RELATIVE_PATH~~

@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal

set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0

@rem Get command-line arguments, handling Windows variants

if not "%OS%" == "Windows_NT" goto win9xME_args

:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2

:win9xME_args_slurp
if "x%~1" == "x" goto execute

set CMD_LINE_ARGS=%*

:execute
@rem Setup the command line

@rem If the app location is not available, set it first via Gradle
if not exist %APP_LOCATION_FILE% call :run_gradle -q ~~CACHE_TASK_NAME~~

@rem Read settings in from app location properties

@rem If the app is not available, download it first via Gradle
if not exist %APP_LOCATION% call :run_gradle -q ~~CACHE_TASK_NAME~~

@rem If global configuration is disabled which is the default, then
@rem  point the Terraform config to the generated configuration file
@rem  if it exists.
if %TF_CLI_CONFIG_FILE% == "" (
    if %USE_GLOBAL_CONFIG%==true goto cliconfigset
    if exist %CONFIG_LOCATION% (
        set TF_CLI_CONFIG_FILE=%CONFIG_LOCATION%
    ) else (
        echo Config location specified as %CONFIG_LOCATION%, but file does not exist. 1>&2
        echo Please run the terraformrc Gradle task before using %APP_BASE_NAME% again 1>&2
        exit /b 1
    )
)
:cliconfigset

@rem  If we are in a project containing a default Terraform source set
@rem  then point the data directory to the default location.
if "%TF_DATA_DIR%" == "" (
    if exist %CD%\src\tf\main (
        set TF_DATA_DIR=%CD%\build\tf\main
        echo %TF_DATA_DIR% will be used as data directory 1>&2
    )
)

@rem Execute ~~APP_BASE_NAME~~

@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd

exit /b 1

:mainEnd
if "%OS%"=="Windows_NT" endlocal

exit /b 0

:run_gradle
if exist %GRADLE_WRAPPER_RELATIVE_PATH%\gradlew.bat (
    call %GRADLE_WRAPPER_RELATIVE_PATH%\gradlew.bat %*
) else (
    call gradle %*
)
exit /b 0

Git Cloud Provider Archives

In a slightly similar fashion to Distribution Installer it is possible to download archives of GitHub & GitLab repositories.

Firstly define a description of the specific Git repository.

GitHubArchive git = new GitHubArchive() (1)

git.organisation = 'ysb33rOrg'
git.repository = 'grolifant'
git.branch = 'master' (2)
1 Create a GitHub description. (For GitLab use GitLabArchive instead)
2 For setting a tag use the tag/setTag method and for a specific commit the commit/setCommit method.

Then create a downloader:

GitRepoArchiveDownloader downloader = new GitRepoArchiveDownloader(git, project) (1)
File root = downloader.archiveRoot (2)
1 The downloader only requires the description and a link to the current Project instance.
2 When the archiveRoot (or getArchiveRoot()) is accessed the archive will be downloaded if it has not already been downloaded.

Normally the downloader will cache the repository in the Gradle user cache so that it can be shared between different projects. It is possible to configure a different download area; for instance, you might prefer to cache it in the project’s build directory. To do this, simply set the download root location on the downloader.

downloader.downloadRoot = new File(project.buildDir, 'my-archive') (1)
1 Use downloadRoot in Groovy or setDownloadRoot in Java & Kotlin.

Tool Execution Tasks and Execution Specifications

Gradle script authors are quite aware of the Exec and JavaExec tasks, as well as the project extensions exec and javaexec. Implementing tasks or extensions to support specific tools can involve a lot of work. This is where this set of abstract classes comes in: it reduces the work to a minimum and allows plugin authors to think about what kind of tool functionality to wrap rather than implementing heaps of boilerplate code.

Wrapping an external tool within a Gradle plugin usually has three components:

  • Execution specification

  • Project extension

  • Task type

How to implement these components is described in the following sections.

Execution specifications

Execution specifications are used for configuring the necessary details for running an external process. They are then used by a task type or a project extension.

There are currently three abstract classes in the hierarchy and they all implement the BaseExecSpec interface.


These execution specifications make it easy to present configuration options to your plugin users, such as the following:

Common declarative settings
ignoreExitValue true  (1)
standardOutput System.out  (2)
standardInput System.in    (3)
errorOutput System.err     (4)
workingDir '.'     (5)
1 Whether the exit value can be ignored.
2 Where standard output should be sent to. (It is up to a plugin author to decide on behaviour if this value is null).
3 Where standard input is read from. (It is up to a plugin author to decide on behaviour if this value is null).
4 Where error output should be sent to. (It is up to a plugin author to decide on behaviour if this value is null).
5 The working directory during execution. This is a lazy-evaluated value and can be anything that project.file() will be able to process.
Setting process environment
environment = [foo: 'bar']               (1)
environment foo2: 'bar2', foo3: { 'bar3' } (2)
environment 'foo4', 'bar4'   (3)
1 Explicitly set the environment in an assignment style, removing any previous environment settings.
2 Add additional environment settings in the familiar, and gradlesque, map-style. Values of environmental variables have the ability to be lazily-evaluated by the consuming task or project extension. (As a plugin author you should consider using MapUtils.stringizeValues for your conversions. The tasks described further down do the same).
3 Add one environment setting as a pair of environment variable and its value.

If you are familiar with the options on the Exec task, then the above will come as no surprise. It will also present your plugin user with a familiar set of configuration options.

The executable can also be set in the normal way, but if you set the executable in an implementation-specific way you might want to prevent the user from setting it. These specifications also allow you to provide arguments that are specific to the executable and not to any associated command. For instance, if you were to run git -C /foo commit myfile.txt, then -C /foo would be the executable arguments.

Setting executables and executable arguments
executable { '/path/to/exe' }         (1)
exeArgs = ['first', 'second']     (2)
exeArgs 'third', { 'fourth' }  (3)
1 Set the executable. This is also a lazy-evaluated value and anything that StringUtils.stringize can deal with can be used. In addition ResolvedExecutable instantiations can also be used.
2 Explicitly set the execution arguments in an assignment style, removing any previous execution arguments.
3 Add additional execution arguments. All of these values are lazily evaluated.

The above distinction of using execution arguments might seem an unnecessary extra at first read, but in terms of a DSL it allows the user to customise certain behaviour of the executable without losing focus on the real work the executable is supposed to do. This is similar to running an additional JVM via JavaExec: in that case jvmArgs customises the JVM, not the arguments passed to the class to be executed.

In addition to those the AbstractCommandExecSpec will allow you to specify a command that is associated with the executable. For instance in git commit, the command will be commit.

Setting a command and command arguments (AbstractCommandExecSpec)
command 'install'         (1)
cmdArgs = ['aye', 'bee'] (2)
cmdArgs 'cee', { 'dee' }    (3)
1 Set the command. This can be lazy-evaluated.
2 Explicitly set the command arguments in an assignment style, removing any previous command arguments.
3 Add additional command arguments. All of these values are lazily evaluated.

In a similar fashion AbstractScriptExecSpec offers the ability to specify a script name and script arguments.

Setting a script and script arguments (AbstractScriptExecSpec)
script ''       (1)
scriptArgs = ['aye']      (2)
scriptArgs 'cee', { 'dee' }  (3)
1 Set the script. This can be lazy-evaluated.
2 Explicitly set the script arguments in an assignment style, removing any previous script arguments.
3 Add additional script arguments. All of these values are lazily evaluated.

In order to implement your own execution specification you need to derive from the appropriate specification.

Wrapping Git as a tool with commands
class GitExecSpec extends AbstractCommandExecSpec {
    GitExecSpec(Project project, Object exe) {
        super(project, new ResolverFactoryRegistry(project))
        setExecutable(exe ?: 'git')
    }
}

Wrapping Perl as a tool which executes scripts
class PerlScriptExecSpec extends AbstractScriptExecSpec {
    PerlScriptExecSpec(Project project, Object exe) {
        super(project, new ResolverFactoryRegistry(project))
        setExecutable(exe ?: 'perl')
    }
}

Creating a project extension

ExtensionUtils.addProjectExtension is the key method to use.

Assume for the moment that you have created an execution specification class for wrapping Git and that it looks like the following:

class GitExecSpec extends AbstractCommandExecSpec {
    GitExecSpec(Project project) {
        super(project, new ResolverFactoryRegistry(project))
    }
}

To add this to the plugin, do

void apply(Project project) {
  ExtensionUtils.addExecProjectExtension('gitexec', project, { Project project ->
      new GitExecSpec(project)
  } as ExecSpecInstantiator<GitExecSpec>) (1)
}
1 An instantiator is required to create instances of execution specifications on demand. One simple way is to create a closure and coerce it to an ExecSpecInstantiator.

Now it will be possible to do

task gitdiff {
  doLast {
    project.gitexec {
      command 'diff'
    }
  }
}

Creating a task

There are currently four abstract task classes in the hierarchy.


The minimum you will need to do is extend the appropriate class and provide a suitable constructor that can be called by Gradle.

Wrapping a tool with AbstractExecWrapperTask

The AbstractExecWrapperTask is a simplified way of abstracting tools into gradlesque tasks. Unlike the other abstract execution task types mentioned above, it does not expose the full command-line options to the build script author, but rather allows a plugin author to provide suitable functional abstractions. For instance, the Packer plugin provides a packerBuild task that wraps the packer executable. Instead of the command-line options it provides methods to configure the location of the packer.json file and to set a collection of variables.

This abstract task also relies on a suitable extension of AbstractToolExtension by the plugin author. The result is a very flexible DSL. This is illustrated by the following example, which is also a good starting point for any plugin author wanting to abstract a tool in this way.

Step 1 - Create an execution specification

The details for creating execution specifications have been described earlier. It is suggested to use AbstractCommandExecSpec as a base class.

class MyCmdExecSpec extends AbstractCommandExecSpec {
    MyCmdExecSpec(Project project, ExternalExecutable registry) {
        super(project, registry)
    }
}

Step 2 - Create an extension

The extension class will be attached to both the project and the task for maximum flexibility. This allows for global configuration, with customisation on a task level as needed.

class MyExtension extends AbstractToolExtension { (1)

    static final String NAME = 'toolConfig'

    MyExtension(Project project) { (2)
        super(project)
    }

    MyExtension(Task task) {
        super(task, NAME) (3)
    }
}
1 Derive from AbstractToolExtension. This will provide methods for setting the executable.
2 Create a constructor for attaching the extension to a project.
3 You will also need a constructor for attaching to a task. In this case you will also need to specify the name of the project extension. By convention the task and project extensions should go by the same name.

Step 3 - Create the task class

class MyWrapperTask extends AbstractExecWrapperTask<MyCmdExecSpec, MyExtension> { (1)

    MyWrapperTask() {
        myExtension = extensions.create(MyExtension.NAME, MyExtension, this) (2)
    }

    @Override
    protected MyCmdExecSpec createExecSpec() { (3)
        new MyCmdExecSpec(project, toolExtension.resolver)
    }

    @Override
    protected MyCmdExecSpec configureExecSpec(MyCmdExecSpec execSpec) { (4)
        execSpec.cmdArgs '--yellow', '--bright'
        execSpec (5)
    }

    @Override
    protected MyExtension getToolExtension() { (6)
        myExtension
    }

    private final MyExtension myExtension
}
1 Your task class must extend AbstractExecWrapperTask and specify the types of the associated execution specification and extension.
2 You need to create the task-level extension instance. It is best to also store a reference to the extension within the task so as to prevent further lookups.
3 You need to implement a method which will create an execution specification.
4 You will also need to implement a method to configure the execution specification according to all of the specifics of the tool that is being wrapped. This is the method that translates task properties into command-line options.
5 This method always needs to return the execution specification.
6 Finally you also need to implement a method which returns the associated task extension. This is simplified if you have already stored a reference to the extension.

Step 4 - Apply this via plugin

class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.extensions.create(MyExtension.NAME, MyExtension, project) (1)
        project.tasks.create('mycmd', MyWrapperTask) (2)
    }
}
1 Create the extension at project level
2 Create a default instance of your task.

Use it in the DSL

toolConfig {
    executable path: '/usr/local/bin/mycmd' (1)
}

mycmd {
    toolConfig {
        executable path: '/opt/local/bin/mycmd' (2)
    }
}
1 Set the default executable at project level.
2 You can also customise it at task level.

Task-first config

The best idiom is to retrieve task extension data first, before project extension data. Assume that you need to call getResolvableExecutable() (the same applies to any other method on your extension) from your task code.

The getResolvableExecutable() method on the extension is implemented along the lines of

ResolvableExecutable exe = some_internal_method()
if (exe == null && getTask() != null) { (1)
  exe = getProjectExtension().getResolvableExecutable() (2)
}
return exe (3)
1 getTask is a protected method which returns the associated task, or null if the extension is attached to a project.
2 getProjectExtension is another protected method which always returns the project extension irrespective of whether the current extension is attached to a project or a task.
3 At this point your implementation can decide to return null or throw an exception.
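Inside the task this idiom then reduces to a single call on the task's extension. A minimal sketch (the @TaskAction method name and the error handling are illustrative assumptions):

```groovy
@TaskAction
void exec() {
    // The task extension is consulted first; internally it falls back
    // to the project extension when nothing was set on the task.
    ResolvableExecutable exe = toolExtension.resolvableExecutable
    if (exe == null) {
        throw new GradleException('No executable has been configured')
    }
    // ... use exe to locate and run the tool
}
```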

You can simplify some of this work by using the getValue and getValueByMethod methods in AbstractToolExtension. Consider for a moment that you have a method String getMode() in your extension class; getValue can then perform the task-first, project-fallback lookup on your behalf.

This uses reflection internally to obtain the value and might not necessarily be the best solution for your situation.

Due to the fact that Grolifant maintains JDK7 compatibility, Java method references are not supported at present. You may of course decide to implement your own simplification to method references if your plugin does not need to care about JDK7.

Adding version-based resolving

Version-based resolving is handled through the ResolveExecutableByVersion class and requires an implementation of a distribution installer. Assuming that you have such an implementation called MyInstaller, you can proceed to add the logic to your extension:

class MyExtension extends AbstractToolExtension { (1)

    static final String NAME = 'toolConfig'

    MyExtension(Project project) { (2)
        super(project)
        addVersionResolver(project)
    }

    MyExtension(Task task) {
        super(task, NAME) (3)
    }

    private void addVersionResolver(Project project) {
        ResolveExecutableByVersion.DownloaderFactory downloaderFactory = { (4)
            Map<String, Object> options, String version, Project p -> (5)
                new MyInstaller(version, p)
        } as ResolveExecutableByVersion.DownloaderFactory

        ResolveExecutableByVersion.DownloadedExecutable resolver = { MyInstaller installer -> (6)
            new File(installer.distributionRoot, (7)
                OperatingSystem.current().windows ? 'test.bat' : '')
        } as ResolveExecutableByVersion.DownloadedExecutable

        new ResolveExecutableByVersion(project, downloaderFactory, resolver) (8)
    }
}
1 Derive from AbstractToolExtension. This will provide methods for setting the executable.
2 Create a constructor for attaching the extension to a project.
3 You will also need a constructor for attaching to a task. In this case you will also need to specify the name of the project extension. By convention you should always have the task and project extension by the same name.
4 The first step is to implement the functional interface ResolveExecutableByVersion.DownloaderFactory. Use it to construct an instance of your downloader (MyInstaller in this example).
5 You can use various means to create this including JDK8 lambdas, but for compatibility with JDK7, the example uses a coerced closure.
6 Second step is to implement the ResolveExecutableByVersion.DownloadedExecutable interface. Its purpose is to download the correct version and resolve the path to the executable.
7 Once again a coerced closure is used in the example, but if you use Kotlin or Java you can use appropriate means to implement the interface.
8 Register the factory and you’ll be able to use the version key when specifying the executable.

You can now use the version key above and beyond the default path and search keys.

toolConfig {
    executable version: '0.2'
}

Operating System

Many plugin developers are familiar with the OperatingSystem internal API in Gradle. Unfortunately this remains an internal API and is subject to change.

Grolifant offers a similar public API with a small number of API differences:

  • No getFamilyName and getNativePrefix methods. (A scan of the Gradle 3.2.1 codebase seems to yield no usage of either.)

  • No public static fields called WINDOWS, MACOSX etc. These are replaced by a static field called INSTANCE on each of the specific operating system implementations.

  • getSharedLibrarySuffix and getSharedLibraryName have been added.

  • Support for NetBSD.


OperatingSystem os = OperatingSystem.current() (1)
File findExe = os.findInPath('bash')
1 Use current() to obtain the operating system the code is being executed upon.

Operating system detection

The logic in 0.16.2 to determine an operating system is

static OperatingSystem current() {
    if (OS_NAME.contains('windows')) {
        return Windows.INSTANCE
    } else if (OS_NAME.contains('mac os x') || OS_NAME.contains('darwin') || OS_NAME.contains('osx')) {
        return MacOsX.INSTANCE
    } else if (OS_NAME.contains('linux')) {
        return Linux.INSTANCE
    } else if (OS_NAME.contains('freebsd')) {
        return FreeBSD.INSTANCE
    } else if (OS_NAME.contains('sunos') || OS_NAME.contains('solaris')) {
        return Solaris.INSTANCE
    } else if (OS_NAME.contains('netbsd')) {
        return NetBSD.INSTANCE
    }

    // Not strictly true, but a good guess
}

Contributing fixes

Found a bug or need a method? Please raise an issue and preferably provide a pull request with features implemented for all supported operating systems.

String Utilities

Converting objects to strings

Use the stringize method to convert nearly anything to a string or a collection of strings. Closures are evaluated and the results are then converted to strings. Anything that implements Callable<String> or Provider will also be converted. In the case of a Provider, the value is first retrieved and then converted to a string.

StringUtils.stringize('foo') == 'foo'
StringUtils.stringize(new File('foo')) == 'foo'
StringUtils.stringize { 'foo' } == 'foo'

StringUtils.stringize(['foo1', new File('foo2'), { 'foo3' }]) == ['foo1', 'foo2', 'foo3']

Updating Property<String> instances in-situ.

Gradle’s Property class is a double-edged sword. On the one side it makes lazy evaluation easier for both Groovy & Kotlin DSLs, but on the other side it really messes up the Groovy DSL for build script authors.

The correct way to use it is not as a field, but as the return type of a getter, as illustrated by the following skeleton code

class MyTask extends DefaultTask {
    Property<String> getSampleText() {
        sampleText
    }

    void setSampleText(Object txt) {
       // ...
    }

    private final Property<String> sampleText
}

The hard part for the plugin author is to deal with initialisation of the private field and then with further updates. This is where updatePropertyString becomes very useful, as the code can now be implemented as

class MyTask extends DefaultTask {

    MyTask() {
        sampleText = project.objects.property(String)
        sampleText.set('default value')
    }

    Property<String> getSampleText() {
        sampleText
    }

    void setSampleText(Object txt) {
       StringUtils.updatePropertyString(project, sampleText, txt) (1)
    }

    private final Property<String> sampleText
}
1 Updates the value of the Property instance, but keeps the txt object lazy-evaluated.
You will need at least Gradle 4.3 to call this method.

File Utilities

Creating safe filenames

Sometimes you might want to use entities such as task names in file names. Use the toSafeFilename method to obtain a string that can be used as part of a filename.
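As an illustration (a sketch; the static method location on org.ysb33r.grolifant.api.FileUtils is an assumption):

```groovy
import static org.ysb33r.grolifant.api.FileUtils.toSafeFilename

// Derive a report file name from a task name containing characters
// that are unsafe on some filesystems.
String safe = toSafeFilename('compile:debug c++')
File report = new File(buildDir, "reports/${safe}.txt")
```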

Listing child directories of a directory

listDirs provides a list of child directories of a directory.
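For example (a sketch, assuming listDirs also lives on org.ysb33r.grolifant.api.FileUtils and takes a File):

```groovy
import static org.ysb33r.grolifant.api.FileUtils.listDirs

// Collect the immediate child directories of the build directory.
List<File> children = listDirs(project.buildDir)
children.each { File dir -> println dir.name }
```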

Resolving the location of a class

For some cases it is handy to resolve the location of a class so that it can be added to a classpath. One use case is for javaexec processes and Gradle workers. Use resolveClassLocation to obtain a File object for the class. If the class is located in a JAR, the path to the JAR will be returned. If the class is directly on the filesystem, the top-level directory of the package hierarchy that the class belongs to will be returned.

Obtaining the project cache directory

If a plugin needs to cache information in the project cache directory, it is important that it determines this directory correctly. You can call projectCacheDirFor to achieve this.
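For instance (a sketch; the method is assumed to take the Project and return a File):

```groovy
import static org.ysb33r.grolifant.api.FileUtils.projectCacheDirFor

// Honours --project-cache-dir instead of assuming .gradle in the root project.
File cacheDir = projectCacheDirFor(project)
File myCache = new File(cacheDir, 'my-plugin-data')
```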

URI Utilities

Converting objects to URIs

Use the urize method to convert nearly anything to a URI. Objects that have a toURI() method will have that method called; objects that are convertible to strings will effectively have toString().toURI() called. Closures are evaluated and the results are then converted to URIs.

UriUtils.urize('ftp://foo/bar') == new URI('ftp://foo/bar')
UriUtils.urize(new File('/')) == new File('/').toURI()
UriUtils.urize { 'ftp://foo/bar' } == new URI('ftp://foo/bar')
UriUtils.urize { new File('/') } == new File('/').toURI()

Removing credentials from URIs

Use safeURI to remove credentials from URIs. This is especially useful for printing.
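For example (a sketch; the method name is written as in the text above, so check the API documentation for the exact class location and capitalisation):

```groovy
import org.ysb33r.grolifant.api.UriUtils

URI withCredentials = new URI('https://user:secret@repo.example/path')
// Print the URI without leaking the password.
println UriUtils.safeURI(withCredentials)
```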

Exclusive File Access

When creating a plugin that will potentially access shared state between different Gradle projects, such as downloaded files, co-operative exclusive file access is required. This can be achieved by using ExclusiveFileAccess.

File someFile

ExclusiveFileAccess accessManager = new ExclusiveFileAccess(120000, 200) (1)

accessManager.access( someFile ) {
  // ... do something, whilst someFile is being accessed
} (2)
1 Set the timeout waiting for file to become available and the poll frequency. Both are in milliseconds.
2 Run this closure whilst this file is locked. You can also use anything that implements Callable<T>.

The value returned from the closure or callable is the one returned from the access method.

Lazy create tasks

In Gradle 4.9, functionality was added to allow for lazy creation and configuration of tasks. Although this provides the ability for a Gradle build to be configured much quicker, it creates a dilemma for plugin authors wanting to be backwards compatible.

With the TaskProvider API in Grolifant it is now possible for plugin authors to lazy-create tasks in Gradle 4.9, but automatically fall back to straight creation of tasks on older versions.

// The following imports are assumed
import org.ysb33r.grolifant.api.TaskProvider
import static org.ysb33r.grolifant.api.TaskProvider.registerTask

TaskProvider tp = registerTask(project, 'foo', Copy) { (1)
    into 'foo'
}

tp.configure { (2)
    from 'bar'
}
1 Register a task and configure it. In case of Gradle 4.9+ the configuration will be queued until such time that the task is actually needed.
2 Add a configuration to the existing task. In case of Gradle 4.9+ the configuration is added as an action to be executed later. In earlier versions the task will be configured immediately.

Kotlin and Java implementations can use Action instances instead of closures.

Java Fork Options

There are a number of places in the Gradle API which utilise JavaForkOptions, but there is no easy way for a plugin provider to create a set of Java options for later usage. For this purpose Grolifant provides a version that looks the same and implements most of the methods on the interface.

Here is an example of using it with a Gradle worker configuration.

JavaForkOptions jfo = new JavaForkOptions()

jfo.systemProperties 'a.b.c' : 1

workerExecutor.submit(RunnableWorkImpl.class) { WorkerConfiguration conf ->

    forkOptions { org.gradle.process.JavaForkOptions options ->
        // Copy the stored options onto the worker's fork options
        jfo.copyTo(options)
    }
}

As from Grolifant 0.10, implementations of new repository types that optionally also need to support credentials are available. Two classes are currently available:

Extending Repository Handler

It is possible to add additional methods to project.repositories in a way that safely works with both Kotlin & Groovy.

Extending Dependency Handler

It is possible to add additional methods to project.dependencies in a way that safely works with both Kotlin & Groovy.

Simplified Property Resolving

Taking an idea from Spring Boot configuration, this class provides out-of-the-box functionality to resolve a property by looking at the Gradle project properties, Java system properties and the environment.

PropertyResolver resolver = new PropertyResolver(project)

resolver.get('a.b.c') (1)
resolver.get('a.b.c','123') (2)
resolver.get('a.b.c', resolver.SYSTEM_ENV_PROPERTY) (3)
resolver.get('a.b.c', '123', resolver.SYSTEM_ENV_PROPERTY) (4)
1 Search for property a.b.c in the order of project property & system property and then for environmental variable A_B_C.
2 Search for property, but return the default value if none was found.
3 Use a different search order. Anything that implements PropertyResolveOrder can be specified.
4 Combine both a default value and a different search order.

What’s in a name

Grolifant is a concatenation of Gr for Gradle and olifant, which is the Afrikaans word for elephant. The latter is of course the main part of the current Gradle logo.

Who uses Grolifant?

The following plugins are known consumers of Grolifant:

If you would like to register your plugin as a Grolifant user, please raise an issue (and preferably a merge request).