Usage Guide

This is a general usage guide for the Tackler reporting engine.

To use Tackler you need

  • Java Runtime Environment (Tackler requires at least Java 11 to run)

  • Tackler binaries

  • Correct configuration for your setup

And last but not least you need some transaction data to play with.

If your installation follows the recommended structure, then you can generate default reports immediately after putting some transaction data under the txns-directory.

CLI

See Installation Manual for minimal setup and recommendations for production setups.

After you have installed Tackler and you have some transaction data, you can run Tackler with the command:

java -jar tackler-cli.jar

(assuming you created a minimal setup based on the Installation Manual)

To skip the automatic configuration discovery mechanism, you can provide a path to the configuration file with the --cfg argument:

java -jar tackler-cli.jar --cfg=/path/to/my/tackler.conf

See tackler’s Configuration Manual, tackler.conf and accounts.conf for how to configure tackler.

Input and output

--basedir

Provides the base directory for all operations. All other relative paths are interpreted relative to the basedir.
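
For example, the basedir could be given directly on the command line (the path below is illustrative):

java -jar tackler-cli.jar --basedir=/srv/accounting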

Input

There are a few different ways to provide transaction data to Tackler.

Input can be based on:

  • Filesystem storage

    • Journal in a single file

    • Journal data in multiple files inside a directory structure

  • Git storage

    • (Journal in a single file)

    • Journal data in multiple files inside a directory structure

Storage selector

If there are valid configurations for both filesystem and git storage, then the storage system to use can be switched from the command line:

--input.storage

Valid options are fs or git. This is useful, for example, when there is a need to run test reports from the working copy before committing transactions into the repository, or vice versa: using fs normally, but occasionally running reports from git.

Below is a storage configuration example for such a dual setup.

tackler {
  core {
    input {

      storage = git

      # In this example, the "accounting" repository is a clone
      # and it is used as the working copy.
      fs {
        dir = "accounting/2019"
        glob = "*.txn"
      }

      git {
        repository = "accounting/.git"
        ref = "master"
        dir = "2019"
        suffix = ".txn"
      }
    }
  }
}
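
With a configuration like the one above, git storage is used by default; to run a quick check against the working copy before committing, the storage can be switched on the command line:

java -jar tackler-cli.jar --input.storage=fs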

Journal in a single file

Single file input can be defined only with the command line switch:

--input.file

This is the path to the journal file which will be used as input. The path can be absolute or relative to the basedir.
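
For example (the journal file name is illustrative):

java -jar tackler-cli.jar --input.file=journal/2019.txn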

If you would like to use single journal file mode and define it by configuration, this can be done by putting the journal file into a dedicated directory and defining that directory as the root of the storage directory (input.fs.dir) with an appropriate glob.
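
Below is a minimal sketch of such a configuration; the directory name journal is an assumption, not a required layout:

tackler {
  core {
    input {
      storage = fs
      fs {
        dir = "journal"
        glob = "*.txn"
      }
    }
  }
}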

Journal data in multiple files inside a directory structure

--input.fs.dir

Directory which holds the txn files (absolute or relative path to the basedir).

--input.fs.glob

Glob to define which files are used as journals. For example:

  • *.txn for a single folder or a shallow directory structure

  • **.txn for a deep directory structure with some sharding scheme

See Journal Sharding for discussion of possible sharding schemes.

Git storage

Tackler journal data can be read directly from git, even without a checked out working copy.

See git storage for full information about git based storage.

By default, the git storage backend uses directory structure based mode. If you would like to use single journal mode with git, put the journal in a specific directory and define that directory as the root for storage (input.git.dir).
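
A sketch of such a configuration is below; the directory name journal inside the repository is an assumption:

tackler {
  core {
    input {
      storage = git
      git {
        repository = "accounting/.git"
        ref = "master"
        dir = "journal"
        suffix = ".txn"
      }
    }
  }
}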

In addition to configuration, a Git reference or commit id can be given on the command line:

--input.git.ref=<ref-name>

Use ref-name as the git reference. This could be e.g. a tag or a branch name. When it is a branch name, the HEAD of that branch is used automatically. This can not be specified at the same time as --input.git.commit.

--input.git.commit=<commit-id>

Use a single commit-id and the tree defined by it. This can not be specified at the same time as --input.git.ref.

--input.git.dir=<dir-name>

The top level directory of the transaction data is defined by dir-name. All files with the configured suffix inside this directory tree will be used as journal files.

These options are mutually exclusive with filesystem storage arguments (input.fs.*).
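
For example, to run reports against a tagged state of the journal (the tag name is illustrative):

java -jar tackler-cli.jar --input.git.ref=releases/2019-Q1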

See Journal Sharding for discussion of possible sharding schemes.

Transaction Filters

Tackler has an option to filter transactions based on the attributes of a single transaction.

If a transaction is filtered away by a txn filter, it will disappear from all calculations and statistics. The effect is the same as if the transaction didn’t exist in the first place. Transactions can be filtered based on various attributes of a single transaction, and different filters can be logically combined together.

For the full list of available filters and their syntax, see the Transaction Filters document.

A transaction filter can be defined by providing the filter definition as JSON with the --api-filter-def option; it can be plain JSON or encoded as a base64 string. Base64 encoding makes it easy to use filters with shell scripts.

Below is an example of a filter which finds all transactions with "ice-cream" in the transaction’s description field.

--api-filter-def '{ "txnFilter": { "TxnFilterTxnDescription": { "regex": ".*ice-cream.*" } } }'

The same filter defined as a base64 string (encoded with base64 --wrap=0):

--api-filter-def base64:eyAidHhuRmlsdGVyIjogeyAiVHhuRmlsdGVyVHhuRGVzY3JpcHRpb24iOiB7ICJyZWdleCI6ICIuKmljZS1jcmVhbS4qIiB9IH0gfQo=

If the txn filter is defined as a base64 string, then the string must begin with base64: (see Using Transaction Filters with shell scripting).

The second example is a more complex combination of filters which finds all transactions whose code starts with "#" and whose description starts with "txn-".

--api-filter-def '{ "txnFilter": { "TxnFilterAND" : { "txnFilters" : [ { "TxnFilterTxnCode": { "regex": "#.*" } },  { "TxnFilterTxnDescription": { "regex": "txn-.*" } } ] } } }'

See Transaction Filters for list of all available filters and their syntax.

Using Transaction Filters with shell scripting

Transaction filters can easily be created and combined with shell scripts. In base64 ascii armor format, filter definitions are easy to handle as single shell arguments.

By combining these two features, it’s easy to extend Tackler’s functionality with simple and powerful constructs.

Filter for time span

Below is an example of a bash shell function which creates a transaction filter for a time span:

time_span_filter () {
    local begin=$1
    local end=$2

    flt=$(cat << EOF | base64 --wrap=0
{
    "txnFilter" : {
        "TxnFilterAND" : {
            "txnFilters" : [
                {
                    "TxnFilterTxnTSBegin" : {
                        "begin" : "$begin"
                    }
                },
                {
                    "TxnFilterTxnTSEnd" : {
                        "end" : "$end"
                    }
                }
            ]
        }
    }
}
EOF
)
    echo "base64:$flt"
}

Examples

Get reports for all transactions between 2019-01-15 10:00 and 15:30 in the timezone +02:00:

tackler-cli.jar --api-filter-def \
   $(time_span_filter 2019-01-15T10:00:00+02:00  2019-01-15T15:30:00+02:00)

Filter:
  AND
    Txn TS: begin 2019-01-15T10:00:00+02:00
    Txn TS: end   2019-01-15T15:30:00+02:00

Filter for time window

Below is the definition of a time based windowing filter which uses the time_span_filter defined above. It utilizes the natural language support of the date command.

time_window_filter () {
    local ts1=$(TZ=Z date --date=$1 --iso-8601=s)
    local ts2=$(TZ=Z date --date="$ts1 $2" --iso-8601=s)

    local begin=$(echo -e "$ts1\n$ts2" | sort -n | head -n1)
    local end=$(echo   -e "$ts1\n$ts2" | sort -n | tail -n1)

    time_span_filter "$begin" "$end"
}

Examples

Transaction data from last 5 years:

tackler-cli.jar --api-filter-def \
   $(time_window_filter "2019-01-01" "-5 years")

Filter:
  AND
    Txn TS: begin 2014-01-01T00:00:00Z
    Txn TS: end   2019-01-01T00:00:00Z
...

Transaction data from last 30 days:

tackler-cli.jar --api-filter-def \
   $(time_window_filter "2019-01-15" "-30 days")

Filter:
  AND
    Txn TS: begin 2018-12-16T00:00:00Z
    Txn TS: end   2019-01-15T00:00:00Z
...

Transaction data for Q1/2018:

tackler-cli.jar --api-filter-def \
   $(time_window_filter "2018-01-01" "+3 months")

Filter:
  AND
    Txn TS: begin 2018-01-01T00:00:00Z
    Txn TS: end   2018-04-01T00:00:00Z
...

Reporting

Ordering of transactions is done by comparing time, code, description and uuid, in that order. If a uuid is not provided and the ordering is not resolved by the other fields, then the ordering of those transactions is undefined.

If truly stable reporting output is needed (especially for the Register and Identity reports), then transactions must have either a uuid or a unique combination of time, code and description.

Selecting reports and exports

Produced reports can be selected either by configuration or CLI options:

--reporting.reports report1 report2 …

Valid options are: balance, balance-group, register

Produced exports can be selected either by configuration or CLI options:

--reporting.exports export1 export2

Valid options are: equity and identity

Selecting report formats

Report formats can be selected either by configuration or CLI options:

--reporting.formats frmt1 frmt2

Valid options are: txt and json
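
For example, to produce only the Balance and Register reports in both output formats from the command line:

java -jar tackler-cli.jar --reporting.reports balance register --reporting.formats txt json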

Configuring used output scale of reports

Report output scale (i.e. the number of decimals) can be set either globally or per report type. When values are limited by the max scale setting, the rounding mode used is HALF_UP.
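
For example, with max = 2 and HALF_UP rounding, the value 1.005 is reported as 1.01, while 1.0449 is reported as 1.04.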

Example of global scale settings:

reporting {
  scale {
    min = 2
    max = 7
  }
}

and report specific scale settings:

reports {
   balance {
     scale {
        min = 2
        max = 2
     }
   }
}

See tackler.conf for the full documentation.

Selecting accounts for reports

Accounts can be selected for reports either with the global reporting.accounts setting (configuration and command line) or with a report specific selector.

The default selection for reports is "all accounts", and it can be set explicitly with an empty setting.

See Balance and Balance Group for details on how account selectors affect reports.

Command line example:

--reporting.accounts "Assets(:.*)?" "Expenses(:.*)?"

All accounts:

--reporting.accounts

Configuration example:

reporting {
  accounts = [ "Assets(:.*)?", "Expenses(:.*)?" ]
}

All accounts:

reporting {
  accounts = [ ]
}

If no accounts match for a report, then that report’s sub-section is not printed at all (Balance Group and Register reports).

Balance Group Report and GroupBy

The Balance Group report is like the Balance report, but it produces several sub-reports, one for each group of transactions. Typical examples are a Balance report over a month combined with a Balance Group report by weeks, or a Balance report for a week combined with a Balance Group report grouped by iso-week-date or plain date.

The grouping criteria can be: year, month, date, iso-week or iso-week-date.

GroupBy is set by configuration (tackler.conf).
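
A sketch of a possible group-by setting is below; see the tackler.conf documentation for the exact report configuration keys:

reports {
  balance-group {
    group-by = "month"
  }
}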

Output

--reporting.console=true

will print reports on the console

--output <basename>

will print reports to separate files, which are named based on the basename.

The basename is the path and name prefix for the output reports; it can be either an absolute path or a path relative to the basedir.
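
For example (the basename is illustrative):

java -jar tackler-cli.jar --output reports/2019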

Actual file names will be:

For reports:

For exports:

  • <output>.equity.txn: Equity report

  • <output>.identity.txn: Identity report

Exports are special reports, which are valid input for Tackler.

Accounting Auditing and Assurance

See the Accounting Auditing and Assurance document for information on how Tackler reports can support accounting auditing and assurance activities.