Support this project

If this project is useful to you, you may want to consider donating to the associated Patreon account.

Compatibility

As of version 0.17.0, the minimum version of Gradle that is supported by Grolifant is 4.0.

In 0.17.0 we moved all of the APIs to the org.ysb33r.grolifant.api.v4 package and deprecated most of the packages, classes and interfaces in org.ysb33r.grolifant.api. The reason for this is that in future the package name will be used to indicate a level of compatibility. For instance, if a class is in the v4 package it will be compatible with Gradle 4.0+. If it appears in the v5 package then it will only be compatible with Gradle 5.0+. The same applies for v6, v7, etc. Any class which still appears at the top level is not dependent on specific Gradle versions.

As usual, you should not rely on anything that appears in the org.ysb33r.grolifant.internal package. If you need anything from there, please raise an issue.

The deprecated packages are scheduled to be removed when we release 2.0 one day (or in a release soon thereafter).

This version has been built against Gradle 6.6.1 and checked for runtime compatibility against both older and newer Gradle versions.

Configuration Cache

The 1.0.x versions are the first series that is aware of the restrictions introduced by configuration caching as of Gradle 6.6. It introduces [GrolifantExtension] to help.

Bootstrapping

The library is available on JCenter. Add the following to your Gradle build script to use it.

repositories {
  jcenter()
}
Add one of these as your Grolifant compile dependency
dependencies {
  implementation 'org.ysb33r.gradle:grolifant40:1.3.3' (1)
  implementation 'org.ysb33r.gradle:grolifant50:1.3.3' (2)
  implementation 'org.ysb33r.gradle:grolifant60:1.3.3' (3)
  implementation 'org.ysb33r.gradle:grolifant70:1.3.3' (4)
}
1 Add this if you want support for Gradle 4.0+
2 Add this if you want additional functionality that is only compatible with Gradle 5.0+. Adding this does not make your existing code incompatible with Gradle 4.x, but it provides additional functionality in the org.ysb33r.grolifant.api.v5 namespace that requires Gradle 5.0+. Using that functionality will make your minimum supported Gradle version 5.0.
3 Add this if you want additional functionality that is only compatible with Gradle 6.0+. Adding this does not make your existing code incompatible with Gradle 4.x or 5.x, but it provides additional functionality in the org.ysb33r.grolifant.api.v6 namespace that requires Gradle 6.0+. Using that functionality will make your minimum supported Gradle version 6.0.
4 Add this if you want additional functionality that is only compatible with Gradle 7.0+. Adding this does not make your existing code incompatible with Gradle 4.x, 5.x or 6.x, but it provides additional functionality in the org.ysb33r.grolifant.api.v7 namespace that requires Gradle 7.0+. Using that functionality will make your minimum supported Gradle version 7.0.

If you have some older, pre-0.17.0 code which you have not had the chance to migrate, you can add the following additional dependency.

Legacy dependency
dependencies {
  implementation 'org.ysb33r.gradle:grolifant-compat3:1.3.3' (1)
}
1 This is not compatible with Gradle 7.0+ and there is a good chance that your plugin might fail when run with Gradle 7.0+. This will be removed when 2.0 of Grolifant is released.

Utilities for Common Types

String Utilities

Converting objects to strings

Use the stringize method to convert nearly anything to a string or a collection of strings. Closures are evaluated and the results are then converted to strings. Anything that implements Callable<String> or Provider will also be converted. In the case of Provider the object is first retrieved and then converted to a String.

StringUtils.stringize('foo') == 'foo'
StringUtils.stringize(new File('foo')) == 'foo'
StringUtils.stringize { 'foo' } == 'foo'

StringUtils.stringize(['foo1', new File('foo2'), { 'foo3' }]) == ['foo1', 'foo2', 'foo3']

Updating Property<String> instances in-situ.

Gradle’s Property class is a two-edged sword. On the one side it makes lazy-evaluation easier for both the Groovy & Kotlin DSLs, but on the other side it really messes up the Groovy DSL for build script authors.

The correct way to use it is not as a field, but as the return type of a getter, as illustrated by the following skeleton code

class MyTask extends DefaultTask  {
    @Input
    Property<String> getSampleText() {
        this.sampleText
    }

    void setSampleText(Object txt) {
       // ...
    }

    private final Property<String> sampleText
}

The hard part for the plugin author is to deal with initialisation of the private field and then with further updates. This is where updatePropertyString becomes very useful, as the code can now be implemented as follows.

class MyTask extends DefaultTask  {

    MyTask() {
        sampleText = project.objects.property(String)
        sampleText.set('default value')
    }

    @Input
    Property<String> getSampleText() {
        this.sampleText
    }

    void setSampleText(Object txt) {
       StringUtils.updatePropertyString(project, sampleText, txt) (1)
    }

    private final Property<String> sampleText
}
1 Updates the value of the Property instance, but keeps the txt object lazy-evaluated.
You will need at least Gradle 4.3 to call this method.

File Utilities

Creating safe filenames

Sometimes you might want to use entities such as task names in file names. Use the toSafeFilename method to obtain a string that can be used as part of a filename.
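
For example, a minimal sketch (assuming the method is available on the FileUtils class; check the API documentation for the exact package and signature):

// Hypothetical usage sketch; toSafeFilename is assumed to accept anything
// convertible to a string and to return a filesystem-safe string.
String safeName = FileUtils.toSafeFilename(task.name)
File reportDir = new File(project.buildDir, "reports/${safeName}")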

Listing child directories of a directory

listDirs provides a list of child directories of a directory.
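
A hedged sketch of its usage, assuming the method lives on FileUtils and returns a list of File objects (verify the exact class and signature against the API documentation):

// Hypothetical usage sketch.
List<File> children = FileUtils.listDirs(new File(project.buildDir, 'unpacked'))
children.each { File dir -> println dir.name }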

Resolving the location of a class

For some cases it is handy to resolve the location of a class so that it can be added to a classpath. One use case is for javaexec processes and Gradle workers. Use resolveClassLocation to obtain a File object for the class. If the class is located in a JAR, the path to the JAR will be returned. If the class is directly on the filesystem, the top-level directory of the package hierarchy that the class belongs to will be returned.

If you run into java.lang.NoClassDefFoundError: org/gradle/internal/classpath/Instrumented issues on Gradle 6.5+, especially when calling a javaexec, you can potentially work around this issue by doing some substitution. Use resolveClassLocation from the v6 package.

// import org.ysb33r.grolifant.api.v6.FileUtils

FileUtils.resolveClassLocation(
  myClass, (1)
  project.rootProject.buildscript.configurations.getByName('classpath'), (2)
  ~/myjar-.+\.jar/  (3)
)
1 The class you are looking for.
2 A file collection to search. The root project’s buildscript classpath is a common use case.
3 The JAR you would like to use instead.

Obtaining the project cache directory

If a plugin needs to cache information in the project cache directory, it is important that it determines this folder correctly. You can call projectCacheDirFor to achieve this.
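
A minimal sketch, assuming projectCacheDirFor is available on FileUtils and takes the project as parameter (signature assumed, see the API documentation):

// Hypothetical usage sketch.
File cacheDir = FileUtils.projectCacheDirFor(project)
File stateFile = new File(cacheDir, 'my-plugin/state.properties')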

Converting objects to files

You are probably familiar with project.file to convert a single object to a file. Grolifant offers similar methods in fileize (4.0+) & fileize (5.0+), but these will keep their behaviour even if that of project.file changes in a future version.

It also provides a version that takes a collection of objects and converts them to a list of files. This behaviour is different to that of project.files in that it returns a List<File> rather than a ConfigurableFileCollection. It also does not evaluate Task and TaskProvider instances.

In addition, there is a v4.fileizeOrNull / v5.fileizeOrNull available which will not throw an exception if it encounters a null or empty provider. For dealing with collections there is v4.fileizeDropNull / v5.fileizeDropNull, which will not throw an exception on nulls or empty providers, but just drop those items from the final collection.

You can also update any Property<File> using the v4.updateFileProperty / v5.updateFileProperty utility method.
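
A hedged sketch of the single-object and collection variants of fileize; the exact parameter list should be checked against the API documentation for the version you are using:

// import org.ysb33r.grolifant.api.v4.FileUtils

// Hypothetical usage sketch; signatures assumed.
File single = FileUtils.fileize(project, 'build/foo.txt')
List<File> many = FileUtils.fileize(project, ['a.txt', new File('b.txt'), { 'c.txt' }])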

URI Utilities

Converting objects to URIs

Use the urize method to convert nearly anything to a URI. If an object has a toURI() method, that method will be called; otherwise, objects that are convertible to strings will effectively have toString().toURI() called on them. Closures are evaluated and the results are then converted to URIs.

UriUtils.urize('ftp://foo/bar') == new URI('ftp://foo/bar')
UriUtils.urize(new File('/foo.bar')) == new File('/foo.bar').toURI()
UriUtils.urize { 'ftp://foo/bar' } == new URI('ftp://foo/bar')
UriUtils.urize { new File('/foo.bar') } == new File('/foo.bar').toURI()

Removing credentials from URIs

Use safeURI to remove credentials from URIs. This is especially useful for printing.
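
A minimal sketch, assuming safeURI is available on UriUtils:

// Hypothetical usage sketch; the credentials portion is stripped before logging.
URI remote = new URI('https://user:secret@example.com/repo/archive.zip')
logger.lifecycle("Downloading from ${UriUtils.safeURI(remote)}")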

Downloading Tools & Packages

Distribution Installer

There are quite a number of occasions where it would be useful to download various versions of SDKs or distributions from a variety of sources and then install them locally without affecting the environment of the user. The Gradle Wrapper is already a good example of this. Obviously it would be good if one could also utilise other solutions that manage distributions and SDKs on a per-user basis, such as the excellent SDKMAN!.

The AbstractDistributionInstaller abstract class provides the base for plugin developers to add such functionality to their plugins without too much trouble.

Getting started

TestInstaller.groovy
class TestInstaller extends AbstractDistributionInstaller {
        static final String DISTPATH = 'foo/bar'
        static final String DISTVER = '0.1'

        TestInstaller(Project project) {
            super('Test Distribution', DISTVER, DISTPATH, project) (1)
        }

        @Override
        URI uriFromVersion(String version) { (2)
            TESTDIST_DIR.toURI().resolve("testdist-${DISTVER}.zip") (3)
        }
}
1 The installer needs to be provided with a human-readable name, the version of the distribution, a relative path below which this type of distribution will be installed, and a reference to an existing Gradle Project instance.
2 The uriFromVersion method is used to return an appropriate URI from which to download the specific version of the distribution. Supported protocols are all those supported by the Gradle Wrapper and include file, http(s) and ftp.
3 Use code appropriate to your specific distribution to calculate the URI.

The download is invoked by calling the getDistributionRoot method.
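
For example, continuing with the TestInstaller above, a hedged usage sketch might look as follows (the exact signature of getDistributionRoot should be verified against the API documentation):

TestInstaller installer = new TestInstaller(project)
// Downloads and unpacks the distribution on first call, then returns the unpacked root.
File distRoot = installer.distributionRoot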

The above example uses Groovy to implement an installer class, but you can use Java, Kotlin or any other JVM-language that works for writing Gradle plugins.

How it works

When getDistributionRoot is called, it effectively uses the following logic

File location = locateDistributionInCustomLocation(distributionVersion) (1)

if (location == null && this.sdkManCandidateName) { (2)
    location = distFromSdkMan
}

location ?: distFromCache (3)
1 If a custom location is specified, look there first for the specific version.
2 If SDKMAN! has been enabled, check whether it has an available distribution.
3 Try to get it from cache. If not in cache try to download it.

Marking files executable

Some distribution archives are platform-agnostic, and it is necessary to mark specific files as executable after unpacking. The addExecPattern method can be used for this purpose.

TestInstaller installer = new TestInstaller(project)
installer.addExecPattern '**/*.sh' (1)
1 Assuming the TestInstaller from Getting Started, this example will mark all shell files in the distribution as executable once the archive has been unpacked.

Patterns are ANT-style patterns as is common in a number of Gradle APIs.

Search in custom locations

The locateDistributionInCustomLocation method can be used for setting up a search in specific locations.

For example a person implementing a Ceylon language plugin might want to look in the ~/.ceylon folder for an existing installation of a specific version.

This optional implementation is completely left up to the plugin author as it will be very specific to a distribution. The method should return null if nothing was found.
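
A hedged sketch of such an override for the Ceylon example; the method name and return type follow the logic shown earlier, but the on-disk layout and exact visibility are assumptions:

@Override
protected File locateDistributionInCustomLocation(String version) {
    // Hypothetical layout; adjust to however the distribution is actually stored.
    File candidate = new File(System.getProperty('user.home'), ".ceylon/dists/ceylon-${version}")
    candidate.directory ? candidate : null // return null if nothing was found
}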

Changing the download and unpack root location

By default, downloaded distributions will be placed in a subfolder below the Gradle user home directory, as specified at construction time. It is possible, especially for testing purposes, to use a root folder other than the Gradle user home by setting downloadRoot.
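
For instance, in a test fixture one could do something like the following (a hedged sketch; downloadRoot is assumed to accept a File):

TestInstaller installer = new TestInstaller(project)
// Keep downloads out of the real Gradle user home during tests.
installer.downloadRoot = new File(project.buildDir, 'test-download-root')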

Utilising SDKMAN!

SDKMAN! is a very useful local SDK installation and management tool and when specific SDKs or distributions are already supported it makes sense to re-use them in order to save on download time.

All that is required is to provide the SDKMAN! candidate name using the setSdkManCandidateName method.

Utilising SDKMAN!
installer.sdkManCandidateName = 'ceylon' (1)
1 Sets the candidate name for a distribution as it will be known to SDKMAN!. In this example the Ceylon language distribution is used.

Checksum

By default the installer will not verify any checksums, but calling setChecksum will force the installer to perform a check after downloading and before unpacking. It is possible to change this behaviour by overriding verification.

TestInstaller installer = new TestInstaller(project)
installer.checksum = 'b1741e3d2a3f7047d041c79d018cf55286d1168fd6f0533e7fae897478abcdef'  (1)
1 Provide a SHA-256 checksum string

Only SHA-256 checksums are supported. If you need something else you will need to override verification and provide your own checksum test.

Advanced: Override unpacking

By default, AbstractDistributionInstaller already knows how to unpack ZIPs and TARs of a variety of compressions. If something else is required, then the unpack method can be overridden.

This is the approach to follow if you need support for unpacking MSIs. There is a helper method called unpackMSI which will install and then call the lessmsi utility with the correct parameters. In order to use this in a practical way it is better to override the unpack method and call it from there. For example:

Overriding for adding MSI support.
@Override
protected void unpack(File srcArchive, File destDir) {
    if(srcArchive.name.endsWith('.msi')) {
        unpackMSI(srcArchive,destDir,[:])  (1)

        // Add additional file and directory manipulation here if needed

    } else {
        super.unpack(srcArchive, destDir)
    }
}
1 The third parameter can be used to set up a special environment for lessmsi if needed.

Advanced: Override verification

Verification of a downloaded distribution occurs in two parts:

  • If a checksum is supplied, the downloaded archive is validated against the checksum. The standard implementation will only check SHA-256 checksums.

  • The unpacked distribution is then checked for sanity. In the default implementation this is simply to check that only one directory was unpacked below the distribution directory. The latter is effectively just replicating the Gradle Wrapper behaviour.

Once again it is possible to customise this behaviour if your distribution has different needs. In this case there are two protected methods that can be overridden:

  • verifyDownloadChecksum - Override this method to take care of handling checksums. The method, when called, will be passed the URI where the distribution was downloaded from, the location of the archive on the filesystem and the expected checksum. It is possible to pass null for the latter which means that no checksum is available.

  • getAndVerifyDistributionRoot - This validates the distribution on disk. When called, it is passed the location where the distribution was unpacked into. The method should return the effective home directory of the distribution.

In the case of getAndVerifyDistributionRoot it can sometimes be very confusing as to what the distDir is and what should be returned. The easiest way to explain this is by looking at how Gradle wrappers are stored. For instance, for Gradle 3.0 the distDir might be something like ~/.gradle/wrapper/dists/gradle-3.0-bin/2z3tfybitalx2py5dr8rf2mti/ whereas the returned directory would be ~/.gradle/wrapper/dists/gradle-3.0-bin/2z3tfybitalx2py5dr8rf2mti/gradle-3.0.

Helper and other protected API methods

  • getProject provides access to the associated Gradle Project object.

  • listDirs provides a listing of directories directly below an unpacked distribution. It can also be used for any directory if the intent is to see which child directories are available.

  • getLogger provides access to a simple stdout logger.

Unpacking DMG files

Since 0.6 there is a utility that can be used to unpack DMG files, called UnpackUtils.unpackDmgOnMacOsX. On non-macOS platforms it is a NOOP if called.

DMG files are not unpacked automatically by AbstractDistributionInstaller. The plugin implementor will need to override the unpack method in order to call the DMG unpacker and also add the appropriate logic.
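
A hedged sketch of such an override, by analogy with the MSI example earlier; the exact parameter list of unpackDmgOnMacOsX is an assumption and should be checked against the UnpackUtils API documentation:

@Override
protected void unpack(File srcArchive, File destDir) {
    if (srcArchive.name.endsWith('.dmg')) {
        // Parameters assumed here; verify against the UnpackUtils API.
        UnpackUtils.unpackDmgOnMacOsX(project, srcArchive, destDir)
        // Add additional file and directory manipulation here if needed
    } else {
        super.unpack(srcArchive, destDir)
    }
}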

Migrating from older AbstractDistributionInstaller

Moving to new AbstractDistributionInstaller

import org.ysb33r.grolifant.api.v4.downloader.AbstractDistributionInstaller (1)

class TestInstaller extends AbstractDistributionInstaller {
    TestInstaller(final ProjectOperations po) {
        super('mytool', 'native-binaries/mytool', po) (2)
    }


}
1 Change package name to import from
2 The super class' constructor no longer requires the version.

Moving to AbstractSingleFileInstaller

If you previously customised a number of methods in order to download a single file, you can switch to AbstractSingleFileInstaller instead.

class TestInstaller extends AbstractSingleFileInstaller { (1)
    TestInstaller(final ProjectOperations po) {
        super('mytool', 'native-binaries/mytool', po) (2)
    }

    @Override
    protected String getSingleFileName() { (3)
        OperatingSystem.current().windows ? 'mytool.exe' : 'mytool'
    }
}
1 Extend AbstractSingleFileInstaller instead
2 The version is no longer used in the constructor of the super class.
3 Implement a method to obtain the name of the file.

You can access a downloaded single file by version. Simply call getSingleFile(version).
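
For example (a hedged sketch using the TestInstaller shown above):

TestInstaller installer = new TestInstaller(projectOperations)
// Downloads and caches the file for that version if it is not cached yet.
File tool = installer.getSingleFile('1.2.3')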

Exclusive File Access

When creating a plugin that will potentially access shared state between different Gradle projects, such as downloaded files, co-operative exclusive file access is required. This can be achieved by using ExclusiveFileAccess.

File someFile

ExclusiveFileAccess accessManager = new ExclusiveFileAccess(120000, 200) (1)

accessManager.access( someFile ) {
  // ... do something, whilst someFile is being accessed
} (2)
1 Set the timeout for waiting for the file to become available and the poll frequency. Both are in milliseconds.
2 Run this closure whilst this file is locked. You can also use anything that implements Callable<T>.

The value returned from the closure or callable is the one returned from the access method.

Utilities for Gradle Entities

Configuration Utilities

Adding configurations to tasks is not an uncommon situation. In order to give plugin authors a more flexible way of doing this, utilities have been added to lazily resolve items to project configurations. Grolifant offers asConfiguration to resolve a single item to a configuration. It can be provided with an existing configuration or anything resolvable to a string using stringize. If the configurations do not exist at the time they are evaluated, an exception will be thrown.

There is also a version that takes a collection of objects and converts them to a list of configurations.
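
A hedged sketch, assuming the methods live on a ConfigurationUtils class in the v4 package (the exact class and the name of the collection variant should be verified against the API documentation):

// import org.ysb33r.grolifant.api.v4.ConfigurationUtils

// Hypothetical usage sketch; signatures assumed.
Configuration single = ConfigurationUtils.asConfiguration(project, 'myToolRuntime')
List<Configuration> many = ConfigurationUtils.asConfigurations(project, ['myToolRuntime', 'myToolCompile'])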

Task Utilities

Lazy create tasks

In Gradle 4.9, functionality was added to allow for lazy creation and configuration of tasks. Although this provides the ability for a Gradle build to be configured much quicker, it creates a dilemma for plugin authors wanting to be backwards compatible.

Before Grolifant 1.3, TaskProvider was provided to auto-switch between create and register, but this has been superseded by TaskTools.register. An instance of TaskTools can be obtained via ProjectOperations.getTasks.

// The following imports are assumed
import org.ysb33r.grolifant.api.core.ProjectOperations

ProjectOperations projectOperations = ProjectOperations.find(project)
projectOperations.tasks.register('foo', Copy) {  (1) (2)
    it.into 'foo'
}
projectOperations.tasks.named('foo', Copy) { (3)
    it.from 'bar'
}
1 Register a task and configure it. In the case of Gradle 4.9+ the configuration will be queued until such time as the task is actually needed.
2 Add configuration to the existing task. In the case of Gradle 4.9+ the configuration is added as an action to be executed later. In earlier versions the task will be configured immediately.
3 Add more configuration to a task.

Kotlin and Java implementations can use Action instances instead of closures.

Configure a task upon optional configuration

Sometimes a task might never be registered, but if it does get registered and later created, adding configuration at that point is a good option. For this you can use TaskTools.whenNamed.

// The following imports are assumed
import org.ysb33r.grolifant.api.core.ProjectOperations

projectOperations.tasks.whenNamed('foo', Copy) {
    it.from 'bar'
}

Lazy evaluate objects to task instances

The Task.dependsOn method allows various ways for objects to be evaluated to tasks at a later point in time. Unfortunately this kind of functionality is not available via a public API, and plugin authors who want to create links between tasks for domain reasons have to find other ways of creating lazy-evaluated task dependencies. In 0.17.0 we added TaskUtils.taskize to help relieve this problem, but from 1.3 the better solution is to use TaskTools.taskize.

// The following imports are assumed
import org.ysb33r.grolifant.api.core.ProjectOperations

projectOperations.tasks.named('foo') { Task t ->
    t.dependsOn(projectOperations.tasks.taskize('nameOfTask1')) (1)
    t.dependsOn(projectOperations.tasks.taskize({ -> 'nameOfTask2' })) (2)
    t.dependsOn(projectOperations.tasks.taskize(['nameOfTask3', 'nameofTask4'])) (3)
}
1 Resolves a single item to a task within a project context.
2 Single items could be recursively evaluated if applicable, for example Closures returning strings.
3 Resolves an iterable sequence of items to a list of tasks within a project context. Items are recursively evaluated if necessary.

The full list of supported items is always described in the API documentation, but the following list can be used as a general guideline:

Supported single object types
  • Gradle task.

  • Gradle TaskProvider (Gradle 4.8+).

  • Grolifant TaskProvider. (Deprecated).

  • Any CharSequence including String and GString.

  • Any Gradle Provider to the above types

  • Any Callable implementation including Groovy Closures that return one of the above values.

Additional types for the iterable version
  • Any iterable sequence of the above types

  • A Map for which the values are evaluated. (The keys are ignored).

Java Fork Options

There are a number of places in the Gradle API which utilise JavaForkOptions, but there is no easy way for a plugin author to create a set of Java options for later usage. For this purpose we have created a version that looks the same and implements most of the methods on the interface.

Here is an example of using it with a Gradle worker configuration.

JavaForkOptions jfo = new JavaForkOptions()

jfo.systemProperties 'a.b.c' : 1

workerExecutor.submit(RunnableWorkImpl.class) { WorkerConfiguration conf ->

    forkOptions { org.gradle.process.JavaForkOptions options ->
        jfo.copyTo(options)
    }
}

Extending Repository Handler

It is possible to add additional methods to project.repositories in a way that safely works with both Kotlin & Groovy.

Extending Dependency Handler

It is possible to add additional methods to project.dependencies in a way that safely works with both Kotlin & Groovy.

Repositories

As of Grolifant 0.10, support for implementing new repository types that optionally also need to support credentials is available. Two classes are currently available:

Working with Gradle Property

The Property interface introduced in Gradle 4.3 is very useful for lazy-evaluation. For compatibility and usability purposes, Property instances should be used as private members and exposed as Provider instances via getters.

For plugin authors who need to remain compatible with Gradle 4.0 - 4.2, version 1.0.0 has introduced PropertySource. Its usage is primarily targeted to be a private final field with behaviour similar to the Property interface. On Gradle 4.3+ it will simply wrap a Property instance.

import org.ysb33r.grolifant.api.v4.PropertySource

class MyTaskWithProperty extends DefaultTask {
  private final PropertySource<String> myProp (1)

  Provider<String> getMyProp() { (2)
    this.myProp.asProvider
  }

  MyTaskWithProperty() {
    this.myProp = PropertySource.create(String, project) (3)
  }

  void setMyProp(Provider<String> provider) {
    this.myProp.set(provider) (4)
  }
}
1 Use this instead of Property<String>
2 Expose it as a Provider to consumers.
3 Initialise it in the constructor and pass the result of the Task.getProject() call. At this point it is still safe to call project as it is the configuration phase. This is important as this code will be compatible back to Gradle 4.0, yet deal with the configuration cache on Gradle 6.6+.
4 Various options are possible for updating the actual value. This shows simple usage of another Provider. Your own context will determine more useful setters.

Property Providers

In Gradle 6.1, the ProviderFactory introduced methods for obtaining Gradle properties, system properties and environment variables as providers. In order to use these facilities safely you can now use the appropriate methods on ProjectOperations. In addition, for the forUseAtConfigurationTime method that was introduced in Gradle 6.5, there are variants of these methods that use similar functionality when available. Using these methods can safeguard your plugin against usage on a Gradle version where the configuration cache is enabled.

ProjectOperations grolifant = ProjectOperations.find(project)

Provider<String> sysProp1 = grolifant.systemProperty( 'some-property' ) (1)
Provider<String> sysProp2 = grolifant.systemProperty( 'some-property', grolifant.atConfigurationsTime() ) (2)
Provider<String> gradleProp1 = grolifant.gradleProperty( 'some-property' ) (3)
Provider<String> gradleProp2 = grolifant.gradleProperty( 'some-property', grolifant.atConfigurationsTime() ) (4)
Provider<String> env1 = grolifant.environmentVariable( 'SOME_ENV' ) (5)
Provider<String> env2 = grolifant.environmentVariable( 'SOME_ENV', grolifant.atConfigurationsTime() ) (6)
1 System property provider.
2 System property provider that is safe to read at configuration time.
3 Gradle property provider.
4 Gradle property provider that is safe to read at configuration time.
5 Environment variable provider.
6 Environment variable provider that is safe to read at configuration time.

Working with Executables

Creating Script Wrappers

Consider that you have a plugin that already wraps node or terraform, and you want to try something on the command-line with the tool directly. You do not want to install the tool again if you could possibly just use the already cached version. It would be of the correct version as required by the project in any case.

You are probably very familiar with the Gradle wrapper. Now it could be nice, under certain circumstances, to create wrappers that will call the executables from distributions that were installed using the [DistributionInstaller]. Since 0.14 Grolifant offers two abstract task types to help you add such functionality to your plugins.

These task types attempt to address the following:

  • Create wrappers for tools, be they executables or scripts, that point to the correct version as required by the specific project.

  • Realise it is out of date if the version or location of the distribution/tool changes.

  • Cache the distribution/tool if it is not yet cached.

Creating a wrapper task

Let’s assume you would like to create a plugin for Hashicorp’s Packer. Let’s assume that you have already created an extension class which extends AbstractToolExtension and is called PackerExtension. Let’s also assume that this class knows how to download packer for the appropriate platform, which you probably implemented using AbstractDistributionInstaller.

In 0.14 the only supported implementation is to place the template files in a directory path in resources and then substitute values by tokens. This implementation uses Ant ReplaceTokens under the hood.

Start by extending AbstractScriptWrapperTask

@CompileStatic
class PackerWrapper extends AbstractScriptWrapperTask {
    PackerWrapper() {
        super()
        useWrapperTemplatesInResources( (1)
            '/packer-wrappers', (2)
            [ 'wrapper-template.sh' : 'packerw', (3)
              'wrapper-template.bat': 'packerw.bat'
            ]
        )
    }
}
1 Although this is currently the only supported method, it has to be explicitly specified that wrapper templates are in resources.
2 Specify the resource path where to find the wrapper templates. This resource path will be scanned for files as defined below.
3 Specify a map which maps the names of files in the resource path to final file names. The format is [ <WRAPPER TEMPLATE NAME> : <FINAL SCRIPT NAME> ]. Although the final script names can be specified using a relative path, the convention is to just place the wrapper scripts in the project directory. See example script wrappers for some inspiration.

The next step is to provide the tokens that can be substituted, by implementing the appropriate abstract methods.

@Override
protected String getBeginToken() { (1)
    '~~'
}

@Override
protected String getEndToken() { (2)
    '~~'
}

@Override
protected Map<String, String> getTokenValuesAsMap() { (3)
    [
        APP_BASE_NAME               : 'packer',
        APP_LOCATION_FILE           : '/path/to/packer'
    ]
}
1 Start token for substitution. This can be anything. This example uses ~~ because it matches the delimiter from the example script wrappers.
2 End token for substitution.
3 Return a map of the values for substituting into the template when creating the scripts.

At this point you can test the task and it should generate wrappers; however, there are a number of shortcomings:

  • When somebody clones a project that contains the wrappers for the first time, there is a good chance that none of the wrapped binaries will be cached either, and when they are cached they might end up at a different location due to the environment of the user.

  • The classic place to cache something is in the project cache directory, but this can be overridden from the command-line, so special care has to be taken.

  • You might have pulled an updated version of the project and the version of the wrapped binary has been changed by the project maintainers.

Let’s start by creating a caching task first.

Creating a caching task

Create a task type that extends AbstractCacheBinaryTask.

@CompileStatic
class PackerCacheBinary extends AbstractCacheBinaryTask {
    PackerCacheBinary() {
        super('packer-wrapper.properties') (1)
    }
}
1 Define a properties file that will store appropriate information about the cached binary that will be local to the project on a user’s computer or in CI.

There are three minimum characteristics that need to be defined:

  • Version of the binary/script/distribution if it is set via executable version : '1.2.3'

  • The location of the binary/script.

  • Description of wrapper.

This is done by implementing three abstract methods.

@Override
protected Provider<String> getBinaryVersionProvider() {
    PackerExtension packerExtension = project.extensions.getByType(PackerExtension) (1)
    packerExtension.resolvedExecutableVersion()
}

@Override
protected Provider<String> getBinaryLocationProvider() {
    PackerExtension packerExtension = project.extensions.getByType(PackerExtension)
    projectOperations.map(
            packerExtension.executable,
            { File it -> it.canonicalPath }
        )
}

@Override
protected String getPropertiesDescription() {
    "Describes the Packer usage for the ${projectOperations.projectName} project" (7)
}
1 Working on the assumption that you created PackerExtension as mentioned earlier.

If you execute an instance of your new task type it will automatically cache the binary/distribution, depending on how it has been defined. It will also generate a properties file in the project cache directory. The latter file should be ignored by source control, and the project cache directory should never be in source control.

The next step is to revisit the wrapper task and link it to the caching task.

Linking the caching and wrapper tasks

Return to the wrapper task and modify the constructor as follows:

class PackerWrapper extends AbstractScriptWrapperTask {
  private final PackerCacheBinary cacheTask

    @Inject
    PackerWrapper(PackerCacheBinary cacheTask) { (1)
        super()
        this.cacheTask = cacheTask
        inputs.file(cacheTask.locationPropertiesFile) (2)
        dependsOn(cacheTask) (3)

        def mapping =[
            'wrapper-template.sh' : 'packerw',
            'wrapper-template.bat': 'packerw.bat'
        ]

        useWrapperTemplatesInResources(
            '/packer-wrappers', mapping
        )

        outputs.files(mapping.values().collect { (4)
            new File(project.projectDir, it)
        })
    }
}
1 Restrict the wrapper task type to only be instantiated if there is an associated caching task.
2 If the location of the properties file has changed, then the wrapper task should be out of date.
3 If the wrapper task is run, then the caching task should also be run if out of date.
4 If the wrapper scripts do not exist, the task should be executed. Use a property rather than an @OutputFiles annotation as the scripts are not used by other tasks directly.

Now change your getTokenValuesAsMap method. (Once again we base these tokens on the ones used in [ExampleScriptWrapper]).

@Override
protected Map<String, String> getTokenValuesAsMap() {
    [
        APP_BASE_NAME               : 'packer',
        GRADLE_WRAPPER_RELATIVE_PATH: project.relativePath(project.rootDir), (1)
        DOT_GRADLE_RELATIVE_PATH    : project.relativePath(cacheTask.locationPropertiesFile.get().parentFile), (2)
        APP_LOCATION_FILE           : cacheTask.locationPropertiesFile.get().name, (3)
        CACHE_TASK_NAME             : cacheTask.name (4)
    ]
}
1 If the project uses a Gradle wrapper it is important that the tool wrapper script also use the Gradle wrapper to invoke the caching task.
2 Get the location of the project cache directory. You can also use project.relativePath(FileUtils.projectCacheDirFor(project)).
3 The name of the wrapper properties file.
4 The name of the cache task to invoke if either the wrapper properties file does not exist or the distribution/binary has not been cached.

Putting everything in a plugin

It is recommended that the tasks created by convention are placed in a separate plugin, and that plugin users are advised to only apply this plugin in the root project of a multi-project build.

In your plugin add the following code to the apply method.

PackerCacheBinary packerCacheBinary = project.tasks.create('cachePackerBinary', PackerCacheBinary)
project.tasks.create('packerWrapper', PackerWrapper, packerCacheBinary)

Example script wrappers

These are provided as starter points for wrapping simple binary tools. They have been hashed together from various other examples in open-source.

For shell scripts
#!/usr/bin/env sh

##############################################################################
##
##  ~~APP_BASE_NAME~~ wrapper script for UN*X
##
##############################################################################
# Relative path from this script to the directory where the Gradle wrapper
# might be found.
GRADLE_WRAPPER_RELATIVE_PATH=~~GRADLE_WRAPPER_RELATIVE_PATH~~

# Relative path from this script to the project cache dir (usually .gradle).
DOT_GRADLE_RELATIVE_PATH=~~DOT_GRADLE_RELATIVE_PATH~~

PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
    ls=`ls -ld "$PRG"`
    link=`expr "$ls" : '.*-> \(.*\)$'`
    if expr "$link" : '/.*' > /dev/null; then
        PRG="$link"
    else
        PRG=`dirname "$PRG"`"/$link"
    fi
done

SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null

# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
  CYGWIN* )
    cygwin=true
    ;;
  Darwin* )
    darwin=true
    ;;
  MINGW* )
    msys=true
    ;;
  NONSTOP* )
    nonstop=true
    ;;
esac

# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
    APP_HOME=`cygpath --path --mixed "$APP_HOME"`
    CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
    JAVACMD=`cygpath --unix "$JAVACMD"`

    # We build the pattern for arguments to be converted via cygpath
    ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
    SEP=""
    for dir in $ROOTDIRSRAW ; do
        ROOTDIRS="$ROOTDIRS$SEP$dir"
        SEP="|"
    done
    OURCYGPATTERN="(^($ROOTDIRS))"
    # Add a user-defined pattern to the cygpath arguments
    if [ "$GRADLE_CYGPATTERN" != "" ] ; then
        OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
    fi
    # Now convert the arguments - kludge to limit ourselves to /bin/sh
    i=0
    for arg in "$@" ; do
        CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
        CHECK2=`echo "$arg"|egrep -c "^-"`                                 ### Determine if an option

        if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then                    ### Added a condition
            eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
        else
            eval `echo args$i`="\"$arg\""
        fi
        i=$((i+1))
    done
    case $i in
        (0) set -- ;;
        (1) set -- "$args0" ;;
        (2) set -- "$args0" "$args1" ;;
        (3) set -- "$args0" "$args1" "$args2" ;;
        (4) set -- "$args0" "$args1" "$args2" "$args3" ;;
        (5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
        (6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
        (7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
        (8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
        (9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
    esac
fi

APP_LOCATION_FILE=$DOT_GRADLE_RELATIVE_PATH/~~APP_LOCATION_FILE~~

run_gradle ( ) {
  if  [ -x "$GRADLE_WRAPPER_RELATIVE_PATH/gradlew" ] ; then
    $GRADLE_WRAPPER_RELATIVE_PATH/gradlew "$@"
  else
    gradle "$@"
  fi
}

app_property ( ) {
    echo `cat $APP_LOCATION_FILE | grep $1 | cut -f2 -d=`
}

# If the app location is not available, set it first via Gradle
if [ ! -f $APP_LOCATION_FILE ] ; then
  run_gradle -q ~~CACHE_TASK_NAME~~
fi

# Now read in the configuration values for later usage
. $APP_LOCATION_FILE

# If the app is not available, download it first via Gradle
if [ ! -f $APP_LOCATION  ] ; then
  run_gradle -q ~~CACHE_TASK_NAME~~
fi

# If global configuration is disabled which is the default, then
# point the Terraform config to the generated configuration file
# if it exists.
if [ -z $TF_CLI_CONFIG_FILE ] ; then
    if [ $USE_GLOBAL_CONFIG == 'false' ] ; then
        CONFIG_LOCATION=`app_property configLocation`
        if [ -f $CONFIG_LOCATION ] ; then
            export TF_CLI_CONFIG_FILE=$CONFIG_LOCATION
        else
          echo Config location specified as $CONFIG_LOCATION, but file does not exist. >&2
          echo Please run the terraformrc Gradle task before using $(basename $0) again >&2
        fi
    fi
fi

# If we are in a project containing a default Terraform source set
# then point the data directory to the default location.
if [ -z $TF_DATA_DIR ] ; then
    if [ -f $PWD/src/tf/main ] ; then
        export TF_DATA_DIR=$PWD/build/tf/main
        echo $TF_DATA_DIR will be used as data directory >&2
    fi
fi

exec $APP_LOCATION "$@"
For Windows batch files
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem  ~~APP_BASE_NAME~~ wrapper script for Windows
@rem
@rem ##########################################################################

@rem Relative path from this script to the directory where the Gradle wrapper
@rem might be found.
set GRADLE_WRAPPER_RELATIVE_PATH=~~GRADLE_WRAPPER_RELATIVE_PATH~~

@rem  Relative path from this script to the project cache dir (usually .gradle).
set DOT_GRADLE_RELATIVE_PATH=~~DOT_GRADLE_RELATIVE_PATH~~

@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal

set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%

:init
@rem Get command-line arguments, handling Windows variants

if not "%OS%" == "Windows_NT" goto win9xME_args

:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2

:win9xME_args_slurp
if "x%~1" == "x" goto execute

set CMD_LINE_ARGS=%*

:execute
@rem Setup the command line

set APP_LOCATION_FILE=%DOT_GRADLE_RELATIVE_PATH%/~~APP_LOCATION_FILE~~

@rem If the app location is not available, set it first via Gradle
if not exist %APP_LOCATION_FILE% call :run_gradle -q ~~CACHE_TASK_NAME~~

@rem Read settings in from app location properties
@rem - APP_LOCATION
@rem - USE_GLOBAL_CONFIG
@rem - CONFIG_LOCATION
call %APP_LOCATION_FILE%

@rem If the app is not available, download it first via Gradle
if not exist %APP_LOCATION% call :run_gradle -q ~~CACHE_TASK_NAME~~


@rem If global configuration is disabled which is the default, then
@rem  point the Terraform config to the generated configuration file
@rem  if it exists.
if %TF_CLI_CONFIG_FILE% == "" (
    if %USE_GLOBAL_CONFIG%==true goto cliconfigset
    if exist %CONFIG_LOCATION% (
        set TF_CLI_CONFIG_FILE=%CONFIG_LOCATION%
    ) else (
        echo Config location specified as %CONFIG_LOCATION%, but file does not exist. 1>&2
        echo Please run the terraformrc Gradle task before using %APP_BASE_NAME% again 1>&2
    )
)
:cliconfigset

@rem  If we are in a project containing a default Terraform source set
@rem  then point the data directory to the default location.
if "%TF_DATA_DIR%" == "" (
    if exist %CD%\src\tf\main (
        set TF_DATA_DIR=%CD%\build\tf\main
        echo %TF_DATA_DIR% will be used as data directory 1>&2
    )
)


@rem Execute ~~APP_BASE_NAME~~
%APP_LOCATION% %CMD_LINE_ARGS%

:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd

:fail
exit /b 1

:mainEnd
if "%OS%"=="Windows_NT" endlocal

:omega
exit /b 0

:run_gradle
if  exist %GRADLE_WRAPPER_RELATIVE_PATH%\gradlew.bat (
    call %GRADLE_WRAPPER_RELATIVE_PATH%\gradlew.bat %*
) else (
    call gradle %*
)
exit /b 0

Tool Execution Tasks and Execution Specifications

Gradle script authors are quite aware of the Exec and JavaExec tasks, as well as the project extensions exec and javaexec. Implementing tasks or extensions to support specific tools can involve a lot of work. This is where this set of abstract classes comes in, simplifying the work to a minimum and allowing plugin authors to think about what kind of tool functionality to wrap rather than implementing heaps of boilerplate code.

Wrapping an external tool within a Gradle plugin usually has three components:

  • Execution specification

  • Project extension

  • Task type

How to implement these components is described in the following sections.

As from Grolifant 1.1, the original classes in org.ysb33r.grolifant.api.v4.exec have been deprecated and replaced with simplified ones in org.ysb33r.grolifant.api.v4.runnable. Please see the upgrade guide for converting to the new setup.

Execution specifications

Execution specifications are used for configuring the necessary details for running an external process. The latter will then be used by a task type or a project extension.

There are three main interfaces in the hierarchy and all extend the BaseExecSpec interface. In addition, there are three abstract base classes which implement these interfaces.

Figure 1. Execution Specifications

These execution specifications allow you to easily present configuration options to your plugin users, such as the following:

Common declarative settings
// When the execution specification extends
// org.ysb33r.grolifant.api.v4.runnable.AbstractExecSpec

ignoreExitValue = true  (1)
standardOutput = System.out  (2)
standardInput = System.in    (3)
errorOutput = System.err     (4)
workingDir = '.'     (5)
1 Whether the exit value can be ignored.
2 Where standard output should be sent to. (It is up to a plugin author to decide on behaviour if this value is null).
3 Where standard input is read from. (It is up to a plugin author to decide on behaviour if this value is null).
4 Where error output should be sent to. (It is up to a plugin author to decide on behaviour if this value is null).
5 The working directory during execution. This is a lazy-evaluated value and can be anything that ProjectOperations.file will be able to process.
Setting process environment
environment = [foo: 'bar']               (1)
environment foo2: 'bar2', foo3: { 'bar3' } (2)
environment 'foo4', 'bar4'   (3)
addEnvironmentProvider(externalEnvironmentProvider) (4)
1 Explicitly set the environment in an assignment style, removing any previous environment settings.
2 Add additional environment settings in the familiar, and gradlesque, map-style. Values of environment variables can be lazily evaluated by the consuming task.
3 Add one environment setting as a pair of environment variable and its value.
4 Add an external environment provider. These providers are evaluated before anything set in environment.

If you are familiar with the options on the Exec task, then the above will come as no surprise. It will also present your plugin user with a familiar set of configuration options.

The executable can also be set in the normal way, but if you set the executable in an implementation-specific way in your implementation you might want to prevent the user from setting executable. These specifications also allow you to provide arguments that are specific to the executable and not to any associated command. For instance, if you were to do git -C /foo commit myfile.txt, then -C /foo would be executable arguments.

Setting executables and executable arguments
executable { '/path/to/git' }         (1)
exeArgs = ['-C', '/path/to/project']     (2)
exeArgs '-p', { '--bare' }  (3)
1 Set the executable. This is also a lazy-evaluated value and anything that StringUtils.stringize can deal with can be used. In addition any Provider<String> can also be used.
2 Explicitly set the execution arguments in an assignment style, removing any previous execution arguments.
3 Add additional execution arguments. All of these values are lazily evaluated.

The above distinction of using execution arguments might seem to be an unnecessary extra at first read, but in terms of a DSL they allow the user to customise certain behaviour of the executable without losing focus on the real work the executable is supposed to do. This is similar to running an additional JVM via JavaExec. In that case the jvmArgs customise the JVM, and not the arguments passed to the class to be executed.

In addition to those, the AbstractExecCommandSpec will allow you to specify a command that is associated with the executable. For instance in git commit, the command will be commit.

Setting a command and command arguments (AbstractExecCommandSpec)
// When the execution specification extends
// org.ysb33r.grolifant.api.v4.runnable.AbstractExecCommandSpec

command = 'remote'         (1)
cmdArgs = ['prune', '--dry-run'] (2)
cmdArgs '-n', { 'origin' }    (3)
1 Set the command. This can be lazy-evaluated.
2 Explicitly set the command arguments in an assignment style, removing any previous command arguments.
3 Add additional command arguments. All of these values are lazily evaluated.

In a similar fashion AbstractExecScriptSpec offers the ability to specify a script name and script arguments.

Setting a script and script arguments (AbstractExecScriptSpec)
// When the execution specification extends
// org.ysb33r.grolifant.api.v4.runnable.AbstractExecScriptSpec

script = 'install.pl'       (1)
scriptArgs = ['aye']      (2)
scriptArgs 'cee', { 'dee' }  (3)
1 Set the script. This can be lazy-evaluated.
2 Explicitly set the script arguments in an assignment style, removing any previous script arguments.
3 Add additional script arguments. All of these values are lazily evaluated.

In order to implement your own execution specification you need to derive from the appropriate specification.

Wrapping Git as a tool with commands
class GitExecSpec extends AbstractExecCommandSpec<GitExecSpec> {
    GitExecSpec(ProjectOperations projectOperations) {
        super(projectOperations)
        setExecutable('git')
    }
}
Wrapping Perl as a tool which executes scripts
class PerlScriptExecSpec extends AbstractExecScriptSpec<PerlScriptExecSpec> {
    PerlScriptExecSpec(ProjectOperations projectOperations) {
        super(projectOperations)
        setExecutable('perl')
    }
}

Creating a task

There are currently three abstract task classes in the hierarchy.

Figure 2. Task Classes
Unlike the older exec classes it is no longer necessary to implement execution specifications just to implement a task class. These classes offer the same interfaces as the execution specifications and know internally how to copy parameters to an ExecSpec. The minimum you will need to do is extend the appropriate class and provide a suitable constructor that can be called by Gradle.

A minimalistic class for wrapping the Terraform executable could be as simple as

class TerraformExec extends AbstractExecCommandTask<TerraformExec> {
    TerraformExec() {
        super()
        setExecutable('terraform')
    }
}

In a similar fashion wrapping a script language executor could be as simple as

class PythonExec extends AbstractExecScriptTask<PythonExec> {
    PythonExec() {
        super()
        setExecutable('python')
    }

}

Wrapping a tool with AbstractExecWrapperTask

The AbstractExecWrapperTask is a simplified way of abstracting tools into gradlesque tasks. Unlike the other abstract execution task types mentioned above, it does not expose the full command-line options to the build script author, but rather allows a plugin author to provide suitable functional abstractions. For instance the Terraform plugin provides a hierarchy of tasks that wrap around the Terraform executable, deals with initialisation and simplifies integration of a very popular tool into a build pipeline in a very gradlesque way.

This abstract task also relies on the creation of a suitable extension derived from AbstractToolExtension. The result is a very flexible DSL. This can be illustrated by the following example, which is also a good starting point for any plugin author wanting to abstract a tool in such a way.

Step 1 - Create an execution specification

The details for creating execution specifications have been described earlier. You can use the one which is best suited to your application.

In this example we are using AbstractExecCommandSpec as a base class.

MyCmdExecSpec.groovy
@CompileStatic
class MyCmdExecSpec extends AbstractExecCommandSpec<MyCmdExecSpec> {
    MyCmdExecSpec(ProjectOperations po) {
        super(po)
    }
}

Step 2 - Create a downloader for your tool distribution

The simplest way is to extend AbstractDistributionInstaller or AbstractSingleFileInstaller. See the Distribution Installer section for more details.

Step 3 - Create an extension

We start with an extension class that will only be used as a project extension. See further down for a case where both task and project extensions will be used.

MyExtension.groovy
@CompileStatic
class MyExtension extends AbstractToolExtension<MyExtension> { (1)

    public static final String NAME = 'toolConfig'

    MyExtension(Project project) { (2)
        super(ProjectOperations.find(project))
    }

    MyExtension(Task task) {
        super( (3)
            task,
            ProjectOperations.find(task.project),
            task.project.extensions.getByName(NAME)
        )
    }

    @Override
    protected ExecutableDownloader getDownloader() { (4)
        new ExecutableDownloader() {
            File getByVersion(String version) {
                toolDownloader.getDistributionFile(version, FILENAME)
            }
        }
    }

    @Override
    protected String runExecutableAndReturnVersion() throws ConfigurationException {
        ExecUtils.parseVersionFromOutput(
            projectOperations,
            ['-v'],
            executable.get(),
            { String output ->
                output.readLines()[0].replaceFirst('version: ', '')
            }
        )
    }

    private static final String FILENAME = OperatingSystem.current().windows ? 'test.bat' : 'test.sh'
    private final DistributionInstaller toolDownloader (5)
}
1 Derive from AbstractToolExtension. This will provide methods for setting the executable.
2 Create a constructor for attaching the extension to a project.
3 You will also need a constructor for attaching to a task. In this case you will also need to specify the name of the project extension. By convention, always give the task and project extension the same name. For simplicity we’ll ignore this constructor and return to it a bit later.
4 Create a simple downloader which knows how to resolve a version to a distribution and then extract the correct executable’s location from the distribution.
5 The distribution downloader. See [DistributionInstaller] for more details on implementing your own.

Step 4 - Create the task class

MyWrapperTask.groovy
@CompileStatic
class MyWrapperTask extends AbstractExecWrapperTask<MyCmdExecSpec> { (1)

    MyWrapperTask() {
        super()
        myExtension = project.extensions.getByType(MyExtension) (2)
    }

    @Override
    protected File getExecutableLocation() {
        myExtension.executable.get() (3)
    }

    @Override
    protected MyCmdExecSpec createExecSpec() {
        new MyCmdExecSpec(projectOperations) (4)
    }

    @Override
    protected void configureExecSpec(MyCmdExecSpec execSpec) { (5)
        execSpec.executable = executableLocation (6)
        execSpec.command = 'show-colours'
        execSpec.cmdArgs '--yellow', '--bright'
    }

    private final MyExtension myExtension

}
1 Your task class must extend AbstractExecWrapperTask and specify the type of the associated execution specification.
2 Cache the reference to the extension within the task so as to prevent further lookups and to be configuration cache-safe.
3 Simply hook the location of the executable to the provider on the project extension.
4 You need to implement a method which will create an execution specification.
5 You will also need to implement a method to configure the execution specification according to all of the specifics of the tool that is being wrapped. This is the method that translates task properties into command-line options.
6 Remember to bind the executable location to the execution specification.

Step 5 - Apply this via plugin

MyPlugin.groovy
@CompileStatic
class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        ProjectOperations.maybeCreateExtension(project) (1)
        project.extensions.create(MyExtension.NAME, MyExtension, project) (2)
        project.tasks.create('mycmd', MyWrapperTask) (3)
    }
}
1 As this plugin uses Grolifant, add the project operations extension if it does not exist.
2 Create the extension at project level
3 Create a default instance of your task.

Use it in the DSL

build.gradle
toolConfig {
    executableByVersion('1.2.3') (1)
    executableBySearchPath('mycmd') (2)
    executableByPath('/usr/local/bin/mycmd')(3)
}
1 Resolve the tool by version
2 Resolve the tool by searching the system path for a command. On Windows this will search by trying various extensions.
3 Resolve the tool by a fixed path.

Add task extensions for more flexibility

The extension class can be attached to both the project and the task for maximum flexibility. This allows for global configuration, with customisation on a task level as needed. This is done by modifying the previously created task to extend AbstractExecWrapperWithExtensionTask instead.

MyWrapperTask.groovy
@CompileStatic
class MyWrapperExtTask extends AbstractExecWrapperWithExtensionTask<MyExtension, MyCmdExecSpec> { (1)

    MyWrapperExtTask() {
        super()
        myExtension = extensions.create( (2)
            MyExtension.NAME,
            MyExtension,
            this
        )
    }

    @Override
    protected File getExecutableLocation() {
        toolExtension.executable?.get()
    }

    @Override
    protected MyExtension getToolExtension() { (3)
        myExtension
    }

    @Override
    protected MyCmdExecSpec createExecSpec() {
        new MyCmdExecSpec(projectOperations)
    }

    @Override
    protected void configureExecSpec(MyCmdExecSpec execSpec) {
        execSpec.executable = executableLocation
        execSpec.command = 'show-colours'
        execSpec.cmdArgs '--yellow', '--bright'
    }

    private final MyExtension myExtension

}
1 You need to specify both the extension type and the execution specification type.
2 Create a task extension.
3 Implement a method that can return the task extension.

In addition to the DSL described earlier, you can now configure tool specifics on the task itself.

build.gradle
mycmd {
    toolConfig { (1)
        executableByVersion('1.2.3')
        executableBySearchPath('mycmd')
        executableByPath('/usr/local/bin/mycmd')
    }
}
1 Set the same configuration, but on the task itself. Other tasks of the same type will still use the project configuration, but this task will now use a different configuration.

Creating a project extension

The original way of adding an extra property into project.extensions.extraProperties has been deprecated. The recommended approach is to add a method to your implementation of AbstractToolExtension.

Assume for the moment that we have the same execution specification implementation from above called MyCmdExecSpec. Add ExecMethods and proceed to implement the exec methods.

class MyExtensionWithExec extends AbstractToolExtension<MyExtension>
    implements ExecMethods<MyCmdExecSpec> { (1)

    public static final String NAME = 'toolConfig'

    MyExtensionWithExec(Project project) {
        super(ProjectOperations.find(project))
    }

    MyExtensionWithExec(Task task) {
        super(
            task,
            ProjectOperations.find(task.project),
            task.project.extensions.getByName(NAME)
        )
    }

    @Override
    ExecResult exec(Action<MyCmdExecSpec> specConfigurator) { (2)
        MyCmdExecSpec spec = new MyCmdExecSpec(projectOperations) (3)
        spec.executable(getExecutable()) (4)
        specConfigurator.execute(spec) (5)
        exec(spec) (6)
    }

    @Override
    ExecResult exec(MyCmdExecSpec spec) { (7)
        projectOperations.exec(new Action<ExecSpec>() { (8)
            @Override
            void execute(ExecSpec execSpec) {
                spec.copyToExecSpec(execSpec)
            }
        })
    }

    @Override
    protected ExecutableDownloader getDownloader() {
        new ExecutableDownloader() {
            File getByVersion(String version) {
                toolDownloader.getDistributionFile(version, FILENAME)
            }
        }
    }

    @Override
    protected String runExecutableAndReturnVersion() throws ConfigurationException {
        /* ... */
    }

    private static final String FILENAME = OperatingSystem.current().windows ? 'test.bat' : 'test.sh'
    private final DistributionInstaller toolDownloader
}
1 Add ExecMethods and use the execution specification type as the parameter.
2 Implement a method that will take a configuring Action as a parameter.
3 Create the execution specification as appropriate for your implementation.
4 Link the executable to the provider of the extension.
5 Configure using the provided Action.
6 Call the other exec method using this specification.
7 The second method takes an instance of your execution specification.
8 Execute it using the exec methods on [ProjectOperations]. The provided action simply copies from your execution specification to an ExecSpec.

You can use this directly in your DSL.

build.gradle
tasks.create('runMe') {
    doLast {
        toolConfig.exec {
            command = 'show-colours'
            cmdArgs '--yellow', '--bright'
        }
    }
}

As an alternative you can implement the ProvisionedExecMethods interface instead.

class MyExtensionWithProvisionedExec extends AbstractToolExtension<MyExtension>
    implements ProvisionedExecMethods<MyCmdExecSpec> { (1)

    public static final String NAME = 'toolConfig'

    MyExtensionWithProvisionedExec(Project project) {
        super(ProjectOperations.find(project))
    }

    MyExtensionWithProvisionedExec(Task task) {
        super(
            task,
            ProjectOperations.find(task.project),
            task.project.extensions.getByName(NAME)
        )
    }

    @Override
    MyCmdExecSpec createExecSpec() { (2)
        MyCmdExecSpec spec = new MyCmdExecSpec(projectOperations)
        spec.executable(getExecutable()) (3)
        spec
    }

    @Override
    protected ExecutableDownloader getDownloader() {
        /* ... */
    }

    @Override
    protected String runExecutableAndReturnVersion() throws ConfigurationException {
        /* ... */
    }

    private static final String FILENAME = OperatingSystem.current().windows ? 'test.bat' : 'test.sh'
    private final DistributionInstaller toolDownloader
}
1 Add ProvisionedExecMethods and use the execution specification type as the parameter.
2 Implement a method on the extension to create the appropriate execution specification.
3 Link the executable.

Adding version-based resolving

Version-based resolving requires a downloader. The quickest way for most cases is to use AbstractDistributionInstaller.

class MyInstaller extends AbstractDistributionInstaller {

    MyInstaller(ProjectOperations projectOperations) {
        super(
            'Test Distribution',  (1)
            'native-binaries/testdist', (2)
            projectOperations
        )
        addExecPattern('**/*.sh', '**/*.bat') (3)
    }

    @Override
    URI uriFromVersion(String version) { (4)
        // Make a downloadable URL from a version string
    }

}
1 Distribution name. This is purely used in logging.
2 Where the downloaded packages will be cached. This directory is relative to the Gradle user home directory.
3 If any of the unpacked files need execution permissions, remember to add the patterns.
4 Supply an implementation which can convert a version of the package to a URL where the package can be downloaded from.
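
Callout 4 leaves the URI construction open. A minimal sketch, assuming a purely hypothetical download host and archive naming scheme, might look like this:

@Override
URI uriFromVersion(String version) {
    // Hypothetical URL pattern; substitute the real location of your tool's distributions.
    new URI("https://downloads.example.org/testdist/testdist-${version}.zip".toString())
}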

To hook this into one of the previous extensions, simply initialise the downloader in the constructor.

class MyExtensionWithDownloader extends AbstractToolExtension<MyExtension>
    implements ProvisionedExecMethods<MyCmdExecSpec> {

    public static final String NAME = 'toolConfig'

    MyExtensionWithDownloader(Project project) {
        super(ProjectOperations.find(project))
        this.toolDownloader = new MyInstaller(projectOperations)
    }

    MyExtensionWithDownloader(Task task) {
        super(
            task,
            ProjectOperations.find(task.project),
            task.project.extensions.getByName(NAME)
        )
        this.toolDownloader = new MyInstaller(projectOperations)
    }

    @Override
    MyCmdExecSpec createExecSpec() {
        MyCmdExecSpec spec = new MyCmdExecSpec(projectOperations)
        spec.executable(getExecutable())
        spec
    }

    @Override
    protected ExecutableDownloader getDownloader() {
        def dnl = toolDownloader
        new ExecutableDownloader() {
            File getByVersion(String version) {
                dnl.getDistributionFile(version, FILENAME).get()
            }
        }
    }

    @Override
    protected String runExecutableAndReturnVersion() throws ConfigurationException {
        /* ... */
    }

    private static final String FILENAME = OperatingSystem.current().windows ? 'test.bat' : 'test.sh'
    private final DistributionInstaller toolDownloader
}

Upgrading old Execution Tasks & Specifications

Table 1. Quick reference

Each entry lists the old class, the class to use instead, and any migration notes.

• AbstractCacheBinaryTask → AbstractCacheBinaryTask
  You will need to switch to getBinaryLocationProvider and getBinaryVersionProvider.

• AbstractCombinedProjectTaskExtension → CombinedProjectTaskExtensionBase
  Instead of passing a Project, only pass a `ProjectOperations` instance to the protected constructor. If you passed a Task and extension before, then you need to pass a Task, `ProjectOperations` and a reference to the project extension of the same type. You may pass null as the third parameter if your plugin does not use a project extension for the specific type.

• AbstractCommandExecSpec → AbstractExecCommandSpec
  Instead of passing a Project and an Object of the executable, only pass a `ProjectOperations` instance to the protected constructor.

• AbstractCommandExecTask → AbstractExecCommandTask

• AbstractDistributionInstaller → AbstractDistributionInstaller
  The new implementation does not require a version in the constructor.

• AbstractExecTask → AbstractExecTask

• AbstractScriptExecTask → AbstractExecScriptTask

• AbstractExecSpec → AbstractExecSpec
  Instead of passing a Project and an Object of the executable, only pass an instance of `ProjectOperations` to the protected constructor.

• AbstractExecWrapperTask → AbstractExecWrapperTask or AbstractExecWrapperWithExtensionTask

• AbstractScriptExecSpec → AbstractExecScriptSpec
  Instead of passing a Project and an Object of the executable, only pass a `ProjectOperations` instance to the protected constructor.

• AbstractToolExtension → AbstractToolExtension
  If you passed a Project before, now pass an instance of `ProjectOperations` to the protected constructor. If you passed a Task and extension before, then you need to pass a Task, `ProjectOperations` and a reference to the project extension of the same type. You may pass null as the third parameter if your plugin does not use a project extension for the specific type.

• NamedResolvedExecutableFactory → (no replacement)
  This is no longer required if AbstractToolExtension is used.
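
As a hedged illustration of the constructor change described in several of the notes above, a migrated execution specification (reusing the MyCmdExecSpec name from earlier) might look like this:

// Before (deprecated): the protected constructor took a Project and an Object for the executable.
// After: only a ProjectOperations instance is passed to the protected constructor.
class MyCmdExecSpec extends AbstractExecCommandSpec {
    MyCmdExecSpec(ProjectOperations projectOperations) {
        super(projectOperations)
    }
}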

If you have a method executable(Map<String, Object> opts) then you can mark it as deprecated and implement its routing as follows:

@Deprecated
void executable(Map<String, Object> opts) {
    if (opts.containsKey('version')) {
        log.warn("'${this.class.name}#executable version' is deprecated. Use executableByVersion()")
        executableByVersion(opts['version'])
    } else if (opts.containsKey('path')) {
        log.warn("'${this.class.name}#executable path' is deprecated. Use executableByPath()")
        executableByPath(opts['path'])
    } else if (opts.containsKey('search')) {
        log.warn("'${this.class.name}#executable searchPath()' is deprecated. Use executableBySearchPath()")
        executableBySearchPath('your-exec-name') (1)
    }
}
1 Replace with the basename of the executable.

Miscellaneous

Configuration Cache Safety

Because the Gradle configuration cache makes it impossible to use the Project instance after the configuration phase has ended, plugin authors can no longer rely on calling methods on that class. The only exceptions are within constructors of tasks and extensions. Plugin authors should take extra care not to call the Task.getProject() method anywhere except in the constructor.

In order to help with compatibility across Gradle versions that have configuration caching and those that do not, ProjectOperations has been introduced. It can be used either as an extension or as a standalone instance.

Usage as an extension
import org.ysb33r.grolifant.api.GrolifantExtension (1)

ProjectOperations grolifant = GrolifantExtension.maybeCreateExtension(project) (2)
1 Required import
2 Call this from the plugin’s apply method and pass the project instance. It will create a project extension which is called grolifant. This method can be called safely from many plugins, as the extension will only be created once.
Usage as a detached instance
ProjectOperations grolifant = ProjectOperations.create(project) (1)
1 Creates an instance of ProjectOperations, but does not attach it to the project. The instance will be aware of the project, but will cache a number of references to extensions which would traditionally be obtained via the Project instance. The specific instance will depend on which Grolifant JARs are on the classpath and which Gradle version is actually running.
Obtaining the project extension
ProjectOperations grolifant = ProjectOperations.find(project) (1)
1 Shortcut for finding the project extension. Will throw an exception if the extension was not attached to the project.

Replacing existing Project methods

The following methods can be called like-for-like on the grolifant extension.

  • copy

  • exec

  • file

  • getBuildDir

  • javaexec

  • provider

  • tarTree

  • zipTree

In addition, the following helper methods are also provided (a short usage sketch follows the list).

  • buildDirDescendant - Returns a provider to a path below the build directory.

  • [fileOrNull] - Similar to file, but returns null rather than throwing an exception when the object is null or an empty provider.

  • projectCacheDir - Returns the project’s cache directory.

  • updateFileProperty - Updates a file property on Gradle 4.3+, allowing behaviour to be the same as that of the new methods introduced in Gradle 5.0. On Gradle <4.3 it will update an instance of PropertyState.
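
A short, hedged usage sketch of the above; the paths and values are purely illustrative:

ProjectOperations grolifant = ProjectOperations.find(project)

// Like-for-like replacements for methods previously called on Project.
File settings = grolifant.file('src/config/settings.yml')
def toolVersion = grolifant.provider { -> '1.2.3' }

// Helper method: a provider to a path below the build directory.
def reportDir = grolifant.buildDirDescendant('reports/mycmd')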

Provider Tools

In order to help with certain provider functionality not being available in earlier versions of Gradle, providerTools can be utilised. The following methods are provided (a short usage sketch follows the list):

  • flatMap - FlatMaps one provider to another using a transformer. This allows Gradle 4.0 - 4.10.3 to have the same functionality as Gradle 5.0+.

  • getOrElse - Get value of provider or an alternative value. This allows Gradle 4.0 - 4.2 to have the same functionality as Gradle 4.3+.

  • getOrNull - Get the value of the provider, or null if it has no value. This allows Gradle 4.0 - 4.2 to have the same functionality as Gradle 4.3+.

  • map - Maps one provider to another using a transformer. This allows Gradle 4.0 - 4.2 to have the same functionality as Gradle 4.3+.
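
A hedged sketch of how these might be used; the exact signatures, and reaching providerTools via the ProjectOperations instance, are assumptions based on the descriptions above:

ProjectOperations grolifant = ProjectOperations.find(project)
def version = grolifant.provider { -> '1.2.3' }

// map: transform one provider into another, even on Gradle 4.0 - 4.2.
def major = grolifant.providerTools.map(version) { String v -> v.tokenize('.').first() }

// getOrElse: fall back to a default when the provider has no value.
String resolved = grolifant.providerTools.getOrElse(version, 'unknown')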

Operating System

Many plugin developers are familiar with the OperatingSystem internal API in Gradle. Unfortunately this remains an internal API and is subject to change.

Grolifant offers a similar public API with a small number of API differences:

  • No getFamilyName and getNativePrefix methods. (A scan of the Gradle 3.2.1 codebase seemed to yield no usage of either.)

  • No public static fields called WINDOWS, MACOSX etc. These are now a static field called INSTANCE on each of the specific operating system implementations.

  • getSharedLibrarySuffix and getSharedLibraryName have been added.

  • Support for NetBSD.

Example

OperatingSystem os = OperatingSystem.current() (1)
File findExe = os.findInPath('bash')
1 Use current() to obtain the operating system the code is being executed upon.
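
A hedged example of the added shared-library helpers; the exact signatures are assumptions based on the method names listed above:

OperatingSystem os = OperatingSystem.current()
String libName = os.getSharedLibraryName('mylib') // the platform-specific file name, along the lines of libmylib.so or mylib.dll
String suffix = os.sharedLibrarySuffix            // the platform-specific shared library suffix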

Operating system detection

The logic in 1.3.3 to determine an operating system is

static OperatingSystem current() {
    if (OS_NAME.contains('windows')) {
        return Windows.INSTANCE
    } else if (OS_NAME.contains('mac os x') || OS_NAME.contains('darwin') || OS_NAME.contains('osx')) {
        return MacOsX.INSTANCE
    } else if (OS_NAME.contains('linux')) {
        return Linux.INSTANCE
    } else if (OS_NAME.contains('freebsd')) {
        return FreeBSD.INSTANCE
    } else if (OS_NAME.contains('sunos') || OS_NAME.contains('solaris')) {
        return Solaris.INSTANCE
    } else if (OS_NAME.contains('netbsd')) {
        return NetBSD.INSTANCE
    }

    // Not strictly true, but a good guess
    GenericUnix.INSTANCE
}

Contributing fixes

Found a bug or need a method? Please raise an issue and preferably provide a pull request with features implemented for all supported operating systems.

Git Cloud Provider Archives

In a similar fashion to the Distribution Installer, it is possible to download archives of GitHub & GitLab repositories.

Firstly define a description of the specific Git repository.

GitHubArchive git = new GitHubArchive() (1)

git.organisation = 'ysb33rOrg'
git.repository = 'grolifant'
git.branch = 'master' (2)
1 Create a GitHub description. (For GitLab use GitLabArchive instead)
2 For setting a tag use the tag/setTag method and for a specific commit the commit/setCommit method.

Then create the downloader.

GitRepoArchiveDownloader downloader = new GitRepoArchiveDownloader(git, projectOperations) (1)
File root = downloader.archiveRoot (2)
1 The downloader only requires the description and a ProjectOperations instance.
2 When the archiveRoot (or getArchiveRoot()) is accessed the archive will be downloaded if it has not already been downloaded.
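
As a hedged illustration of consuming the downloaded archive (the sub-directory and destination are hypothetical, and projectOperations.copy is assumed to accept a configuration block in the same way as Project.copy):

File root = downloader.archiveRoot // accessing this triggers the download if necessary
projectOperations.copy { spec ->
    spec.from(new File(root, 'templates')) // hypothetical sub-directory inside the archive
    spec.into('build/git-templates')       // hypothetical destination
}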

Normally the downloader will cache the repository in the Gradle user cache so that it can be shared between different projects. It is possible to configure the download area to be somewhere else. For instance, you might prefer to cache it in the project’s build directory. To do this, simply set the download root location on the downloader.

downloader.downloadRoot = projectOperations.buildDirDescendant('my-archive').get() (1)
1 Use downloadRoot in Groovy or setDownloadRoot in Java & Kotlin.

Simplified Property Resolving

Taking an idea from Spring Boot configuration, this class provides out-of-the-box functionality to resolve a property by looking at the Gradle project properties, Java system properties and the environment.

PropertyResolver resolver = new PropertyResolver(project)

resolver.get('a.b.c') (1)
resolver.get('a.b.c','123') (2)
resolver.get('a.b.c', resolver.SYSTEM_ENV_PROPERTY) (3)
resolver.get('a.b.c', '123', resolver.SYSTEM_ENV_PROPERTY) (4)

resolver.provide('a.b.c') (5)
resolver.provide('a.b.c','123') (6)

resolver.provideAtConfiguration('a.b.c') (7)
resolver.provideAtConfiguration('a.b.c','123') (8)

resolver.provide('a.b.c', '123', resolver.SYSTEM_ENV_PROPERTY, false) (9)
resolver.provide('a.b.c', '123', resolver.SYSTEM_ENV_PROPERTY, true) (10)
1 Search for property a.b.c, in the order of project property, then system property, and then the environment variable A_B_C.
2 Search for the property, but return the default value if none was found.
3 Use a different search order. Anything that implements PropertyResolveOrder can be specified.
4 Combine a default value with a different search order.
5 A provider of a property.
6 A provider of a property with a default value.
7 A provider of a property that can be used at configuration time.
8 A provider of a property with a default value that can be used at configuration time.
9 A provider of a property with a default value, and a different order, which cannot be used at configuration time.
10 A provider of a property with a default value, and a different order, which can be used at configuration time.
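
A hedged example of wiring a resolved value into an extension property; myExtension and its version property are hypothetical:

PropertyResolver resolver = new PropertyResolver(project)

// Falls back to '1.2.3' if no project property, system property or environment variable is set.
myExtension.version.set(resolver.provide('mycmd.version', '1.2.3'))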

What’s in a name

Grolifant is a concatenation of Gr for Gradle and olifant, which is the Afrikaans word for elephant. The latter is of course the main part of the current Gradle logo.

Who uses Grolifant?

The following plugins are known consumers of Grolifant:

If you would like to register your plugin as a Grolifant user, please raise an issue (and preferably a merge request).