SPOT-X

NAME

spot-x - Common fine-tuning options and environment variables.

SYNOPSIS

--extra-options STRING
-x STRING

DESCRIPTION

Common fine-tuning options for programs installed with Spot.

The argument of -x or --extra-options is a comma-separated list of KEY=INT assignments that are passed to the post-processing or translation routines (they may be passed to other algorithms in the future). These options are mostly used for benchmarking and debugging purposes. KEY (without any value) is a shorthand for KEY=1, while !KEY is a shorthand for KEY=0.
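For instance, the following command (the formula is only an illustration) uses all three forms, setting tls-impl to 2, wdba-minimize to 1, and simul to 0:

% ltl2tgba -x 'tls-impl=2,wdba-minimize,!simul' 'GFa -> GFb'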

Temporal logic simplification options:

tls-impl

Control usage of implication-based rewriting. (0) disables it, (1) enables rules based on syntactic implications, (2) additionally allows automata-based implication checks, (3) enables more rules based on automata-based implication checks. The default value depends on the --low, --medium, or --high settings.

tls-max-ops

Maximum number of operands in n-ary operators (or, and) on which implication-based simplifications are attempted. Defaults to 16.

tls-max-states

Maximum number of states of automata involved in automata-based implication checks for formula simplifications. Defaults to 64.
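For instance, the following command (the formula is only an illustration) raises both of the above limits beyond their defaults:

% ltl2tgba --high -x 'tls-max-ops=32,tls-max-states=128' 'GFa & GFb & GFc'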

Translation options:

branch-post

Set to 0 to disable both branching-postponement (done during translation, may create more states) and delayed-branching (a similar optimization done after translation, so it can only remove states). Set to 1 to force branching-postponement, and to 2 to force delayed-branching. By default, delayed-branching is used.

comp-susp

Set to 1 to enable compositional suspension, as described in our SPIN’13 paper (see the BIBLIOGRAPHY section below). Set to 2 to build only the skeleton TGBA without composing it. Set to 0 (the default) to disable. Nowadays, ltl-split already takes care of that for suspendable subformulas at the top level.

early-susp

When set to 1, start compositional suspension on the transitions that enter accepting SCCs, and not only on the transitions inside accepting SCCs. This option defaults to 0, and is only used when comp-susp=1.
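For instance, the following command (the formula is only an illustration) enables compositional suspension together with early suspension:

% ltl2tgba -x 'comp-susp,early-susp' '(a U b) & GFc'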

exprop

When set, this causes the core LTL translation to explicitly iterate over all possible valuations of atomic propositions when considering the successors of a BDD-encoded state, instead of discovering possible successors by rewriting the BDD as a sum of products. This is enabled by default for --high, and disabled by default otherwise. When unambiguous automata are required, this option is forced and cannot be disabled.

ltl-split

Set to 0 to disable the translation of automata as product or sum of subformulas.

skel-simul

Defaults to 1. Set to 0 to disable simulation on the skeleton automaton during compositional suspension. Only used when comp-susp=1.

skel-wdba

Set to 0 to disable WDBA minimization on the skeleton automaton during compositional suspension. Set to 1 to always WDBA-minimize the skeleton. Set to 2 to keep the WDBA-minimized skeleton only if it is smaller than the original. This option is only used when comp-susp=1 and defaults to 1 or 2 depending on whether --small or --deterministic is specified.

Postprocessing options:

acd

Set to 1 (the default) to paritize automata using the alternating cycle decomposition. Set to 0 to use paritization based on latest appearance record variants.

ba-simul

Set to 0 to disable simulation-based reductions on automata where state-based acceptance must be preserved (e.g., after degeneralization has been performed). The name suggests this applies only to Büchi automata for historical reasons; nowadays it really applies to any state-based acceptance. Set to 1 to use only direct simulation. Set to 2 to use only reverse simulation. Set to 3 to iterate both direct and reverse simulations. The default is the value of the "simul" parameter in --high mode, and 0 otherwise.

dba-simul

Set to 1 to enable simulation-based reduction after running the powerset determinization enabled by "tba-det". By default, this is disabled at low level or if parameter "simul" is set to 0.

degen-lcache

If non-zero (the default is 1), whenever the degeneralization algorithm enters an SCC on a state that has already been associated to a level elsewhere, it should reuse that level. Different values can be used to select which level to reuse: 1 always uses the first level created, 2 uses the minimum level seen so far, and 3 uses the maximum level seen so far. The "lcache" stands for "level cache".

degen-lowinit

Whenever the degeneralization algorithm enters a new SCC (or starts from the initial state), it starts on some level L that is compatible with all outgoing transitions. If degen-lowinit is zero (the default) and the corresponding state (in the generalized automaton) has an accepting self-loop, then level L is replaced by the accepting level, as this might favor finding accepting cycles earlier. If degen-lowinit is non-zero, then level L is always used, without looking for the presence of an accepting self-loop.

degen-lskip

If non-zero (the default), the degeneralization algorithm will skip as many levels as possible for each transition. This is enabled by default as it very often reduces the number of resulting states. A consequence of skipping levels is that the degeneralized automaton tends to have smaller cycles around the accepting states. Disabling skipping will produce automata with larger cycles, and often with more states.
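As an illustration, the effect of level skipping can be observed by comparing the state-based Büchi automata produced with and without it (the formula is arbitrary):

% ltl2tgba -B 'GFa & GFb'
% ltl2tgba -B -x '!degen-lskip' 'GFa & GFb'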

degen-order

If non-zero, the degeneralization algorithm will compute an independent degeneralization order for each SCC it processes. This is currently disabled by default.

degen-remscc

If non-zero (the default), make sure the output of the degeneralization has as many SCCs as the input, by removing superfluous ones.

degen-reset

If non-zero (the default), the degeneralization algorithm will reset its level any time it exits an SCC.

det-max-edges

When defined to a positive integer N, determinizations will be aborted whenever the number of generated edges would exceed N. In this case a non-deterministic automaton will be returned.

det-max-states

When defined to a positive integer N, determinizations will be aborted whenever the number of generated states would exceed N. In this case a non-deterministic automaton will be returned.

det-scc

Set to 0 to disable scc-based optimizations in the determinization algorithm.

det-simul

Set to 0 to disable simulation-based optimizations in the determinization algorithm. This is enabled by default, unless "simul" is set to 0. (Do not confuse this with option "dpa-simul", which runs a simulation-based reduction after determinization.)

det-stutter

Set to 0 to disable optimizations based on the stutter-invariance in the determinization algorithm.

dpa-simul

Set to 1 to enable simulation-based reduction after running a Safra-like determinization to obtain a DPA, or 0 to disable it. By default this is disabled at low level or if parameter "simul" is set to 0. (Do not confuse this with option det-simul, which uses simulation-based optimizations during the determinization.)

gen-reduce-parity

When the postprocessor routines are configured to output automata with any kind of acceptance condition, but they happen to process an automaton with parity acceptance, they call a function to minimize the number of colors needed. This option controls what happens when this reduction does not reduce the number of colors: when set (the default), the output of the reduction is returned; this means the colors in the automaton may have changed slightly, and in particular, no transition carries more than one color. When unset, the original automaton is returned.

gf-guarantee

Set to 0 to disable alternate constructions for GF(guarantee)->[D]BA and FG(safety)->DCA. Those constructions are from an LICS’18 paper by J. Esparza, J. Křetínský, and S. Sickert (see the BIBLIOGRAPHY section below). This is enabled by default for medium and high optimization levels. Unless we are building deterministic automata, the resulting automata are compared to the automata built using the more traditional pipeline, and only kept if they are better.

merge-states-min

Number of states above which states are merged using a cheap approximation of a bisimulation quotient before attempting simulation-based reductions. Defaults to 128. Set to 0 to never merge states.

relabel-bool

If set to a positive integer N, a formula with N atomic propositions or more will have its Boolean subformulas abstracted as atomic propositions during the translation to automaton. This relabeling can speed up the translation if a few Boolean subformulas use many atomic propositions. This relabeling makes sure the subexpressions that are replaced do not share atomic propositions. By default N=4. Setting this value to 0 will disable the rewriting.

relabel-overlap

If set to a positive integer N, a formula with N atomic propositions or more will have its Boolean subformulas abstracted as atomic propositions during the translation to automaton. This version does not care about overlapping atomic propositions, so it can cause the created temporary automata to have incompatible combinations of atomic propositions that will eventually be removed. This relabeling is attempted after relabel-bool. By default N=8. Setting this value to 0 will disable the rewriting.

sat-acc

When this is set to some positive integer, the SAT-based minimization will attempt to construct a TGBA with the given number of acceptance sets. It may however return an automaton with fewer acceptance sets if some of these are useless. Setting sat-acc automatically sets sat-minimize to 1 if it is not set differently.

sat-incr-steps

Set the value of the sat-incr-steps parameter used by two of the SAT-based minimization algorithms, (2) and (3), both described in the SAT-MINIMIZE VALUES section below.

sat-langmap

Compute a lower bound for the default sat-minimize procedure (1). This relies on the fact that the size of the minimal automaton is at least equal to the total number of different languages recognized by the automaton’s states.

sat-minimize

Set to a value between 1 and 4 to enable SAT-based minimization of deterministic ω-automata. If the input has n states, a SAT solver is used to find an equivalent automaton with 1≤m<n states. The value between 1 and 4 selects how the lowest possible m is searched; see the SAT-MINIMIZE VALUES section. SAT-based minimization uses PicoSAT (embedded in Spot), but another installed SAT-solver can be selected via the SPOT_SATSOLVER environment variable. Enabling SAT-based minimization will also enable tba-det.
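For instance, the following command (the formula is only an illustration) requests a deterministic state-based Büchi automaton and then attempts SAT-based minimization using the default binary search:

% ltl2tgba -BD -x sat-minimize 'GFa & GFb'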

sat-states

When this is set to some positive integer, the SAT-based minimization will attempt to construct an automaton with the given number of states. It may however return an automaton with fewer states if some of these are unreachable or useless. Setting sat-states automatically enables sat-minimize, but no iteration is performed. If no equivalent automaton could be constructed with the given number of states, the original automaton is returned.

scc-filter

Set to 1 (the default) to enable SCC-pruning and acceptance simplification at the beginning of post-processing. Transitions that are outside of accepting SCCs are removed from the accepting sets, except those that enter an accepting SCC. Set to 2 to remove even these entering transitions from the accepting sets. Set to 0 to disable this SCC-pruning and acceptance simplification pass.

simul

Set to 0 to disable simulation-based reductions. Set to 1 to use only direct simulation. Set to 2 to use only reverse simulation. Set to 3 to iterate both direct and reverse simulations. The default is 3, except when option --low is specified, in which case the default is 1.

simul-max

Number of states above which simulation-based reductions are skipped. Defaults to 4096. Set to 0 to disable. This applies to all simulation-based optimizations, including those of the determinization algorithm.

simul-method

Choose which simulation-based reduction to use: 1 forces the signature-based BDD implementation, 2 forces the matrix-based implementation, and 0 (the default) uses a heuristic to choose between the two.

simul-trans-pruning

Number of equivalence classes above which simulation-based transition-pruning for non-deterministic automata is disabled. Defaults to 512. Set to 0 to disable. This applies to all simulation-based reductions, as well as to the simulation-based optimization of the determinization algorithm. To implement transition pruning, simulation-based reductions perform a number of BDD implication checks that is quadratic in the number of classes. The number of equivalence classes is equal to the number of output states of the simulation-based reduction when transition-pruning is disabled; it is just an upper bound otherwise.

state-based

Set to 1 to instruct the SAT-minimization procedure to produce an automaton where all outgoing transitions of a state carry the same acceptance sets. By default, this is only enabled when options -B or -S are used.

tba-det

Set to 1 to attempt a powerset determinization if the TGBA is not already deterministic. Doing so will degeneralize the automaton. This is disabled by default, unless sat-minimize is set.

wdba-det-max

Maximum number of additional states allowed in intermediate steps of WDBA-minimization. If the number of additional states reached in the powerset construction or in the follow-up products exceeds this value, WDBA-minimization is aborted. Defaults to 4096. Set to 0 to disable. This limit is ignored when -D is used or when det-max-states is set.

wdba-minimize

Set to 0 to disable WDBA-minimization, to 1 to always try it, or to 2 to attempt it only on syntactic obligations or on automata that are weak and deterministic. The default is 1 in --high mode; otherwise 2 if --medium or --deterministic is used; otherwise 0 in --low mode.

SAT-MINIMIZE VALUES

When the sat-minimize=K option is used to enable SAT-based minimization of deterministic automata, a SAT solver is used to minimize an input automaton with N states into an output automaton with 1≤M≤N states. The parameter K specifies how the smallest possible M should be searched.

1

The default, 1, performs a binary search between 1 and N. The lower bound can sometimes be improved when the sat-langmap option is used.

2

Use PicoSAT assumptions. Each iteration encodes the search of an (N-1) state equivalent automaton, and additionally assumes that the last sat-incr-steps states are unnecessary. On failure, relax the assumptions to do a binary search between N-1 and N-1-sat-incr-steps. sat-incr-steps defaults to 6.

3

After an (N-1) state automaton has been found, use incremental solving for the next sat-incr-steps iterations by forbidding the usage of an additional state without reencoding the problem again. A full encoding will occur after sat-incr-steps iterations unless sat-incr-steps=-1 (see SPOT_XCNF environment variable). sat-incr-steps defaults to 2.

4

This naive method tries to reduce the size of the automaton one state at a time. Note that it redoes the complete encoding at each iteration.
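For instance, the following command (the formula is only an illustration) selects the assumption-based search (2) and relaxes sat-incr-steps to 10 instead of the default 6:

% ltl2tgba -BD -x 'sat-minimize=2,sat-incr-steps=10' 'GFa & GFb'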

ENVIRONMENT VARIABLES

SPOT_BDD_TRACE

If this variable is set to any value, statistics about BDD garbage collection and resizing will be output on standard error.

SPOT_CONTAINMENT_CHECK

Specifies which inclusion algorithm Spot should use. If the variable is unset, or set to "default", containment checks are done using a complementation-based procedure. If the variable is set to "forq", then the FORQ-based containment check is used for Büchi automata (the default procedure is still used for non-Büchi automata). See [6] in the bibliography below.
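For example, assuming autfilt's --included-in option and two Büchi automata stored in files a.hoa and b.hoa (both file names are placeholders), the FORQ-based procedure can be selected as follows:

% SPOT_CONTAINMENT_CHECK=forq autfilt --included-in=b.hoa a.hoa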

SPOT_DEFAULT_FORMAT

Set to a value of dot or hoa to override the default format used to output automata. Up to Spot 1.9.6 the default output format for automata used to be dot. Starting with Spot 1.9.7, the default output format switched to hoa as it is more convenient when chaining tools in a pipe. Set this variable to dot to get the old behavior. Additional options may be passed to the printer by suffixing the output format with = and the options. For instance running

% SPOT_DEFAULT_FORMAT=dot=bar autfilt ...

is the same as running

% autfilt --dot=bar ...

but the use of the environment variable makes more sense if you set it up once for many commands.

SPOT_DEBUG_PARSER

If this variable is set to any value, the automaton parser of Spot is executed in debug mode, showing how the input is processed.

SPOT_DOTDEFAULT

Whenever the --dot option is used without argument (even implicitly via SPOT_DEFAULT_FORMAT), the contents of this variable are used as its default argument. If you have some default settings in SPOT_DOTDEFAULT and want to append options xyz temporarily for one call, use --dot=.xyz: the dot character will be replaced by the contents of the SPOT_DOTDEFAULT environment variable.
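For example, reusing the placeholder option letters from the previous paragraph, running

% SPOT_DOTDEFAULT=bar autfilt --dot=.xyz ...

is the same as running

% autfilt --dot=barxyz ...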

SPOT_DOTEXTRA

The contents of this variable are added to any dot output, immediately before the first state is output. This makes it easy to override global attributes of the graph.

SPOT_EXCLUSIVE_WORD

Specifies which algorithm Spot should use for exclusive_word. This currently takes one of two values: 0 for the legacy implementation, and 1 for the FORQ-based implementation [6] (see the BIBLIOGRAPHY section below). The FORQ-based implementation requires Büchi automata in order to find an exclusive word, and falls back to the legacy version if the automata passed do not satisfy this constraint.

SPOT_HOA_TOLERANT

If this variable is set, a few sanity checks performed by the HOA parser are skipped. The tests in question correspond to issues in third-party tools that output incorrect HOA (e.g., declaring the automaton with property "univ-branch" when no universal branching is actually used).

SPOT_O_CHECK

Specifies the default algorithm that should be used by the is_obligation() function. The value should be one of the following:

1

Make sure that the formula and its negation are realizable by non-deterministic co-Büchi automata.

2

Make sure that the formula and its negation are realizable by deterministic Büchi automata.

3

Make sure that the formula is realizable by a weak and deterministic Büchi automaton.

SPOT_OOM_ABORT

If this variable is set, Out-Of-Memory errors will abort() the program (potentially generating a coredump) instead of raising an exception. This is useful to debug a program and to obtain a stack trace pointing to the function doing the allocation. When this variable is unset (the default), std::bad_alloc is thrown on memory allocation failures, and the stack is usually unwound up to the top level, losing the original context of the error. Note that at least ltlcross has some custom handling of std::bad_alloc to recover from products that are too large (by ignoring them), and setting this variable will interfere with that.

SPOT_PR_CHECK

Select the default algorithm used to check the persistence or recurrence property of a formula f. The values it can take are between 1 and 3. All methods work either on f or !f thanks to the duality of the persistence and recurrence classes. If it is set to:

1

It will try to check if f (or !f) is co-Büchi realizable in order to tell if f belongs to the persistence (or the recurrence) class.

2

It checks if f (or !f) is det-Büchi realizable via a reduction to deterministic Rabin in order to tell if f belongs to the recurrence (or the persistence) class.

3

It checks if f (or !f) is det-Büchi realizable via a reduction to deterministic parity in order to tell if f belongs to the recurrence (or the persistence) class.

SPOT_SATLOG

If set to a filename, the SAT-based minimization routines will append statistics about each iteration to the named file. Each line lists the following comma-separated values: input number of states, target number of states, number of reachable states in the output, number of edges in the output, number of transitions in the output, number of variables in the SAT problem, number of clauses in the SAT problem, user time for encoding the SAT problem, system time for encoding the SAT problem, user time for solving the SAT problem, system time for solving the SAT problem, automaton produced at this step in HOA format.

SPOT_SATSOLVER

If set, this variable should indicate how to call an external SAT-solver; by default, Spot uses PicoSAT, which is distributed with Spot. This is used by the sat-minimize option described above. The format to use is the following: "<sat_solver> [options] %I >%O". The escape sequences %I and %O respectively denote the names of the input and output files. These temporary files are created in the directory specified by SPOT_TMPDIR or TMPDIR (see below). The SAT-solver should follow the conventions of the SAT Competition for its input and output formats.
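For example, a hypothetical solver binary named mysolver (the name and its invocation are placeholders) could be selected as follows:

% SPOT_SATSOLVER='mysolver %I >%O' ltl2tgba -BD -x sat-minimize ...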

SPOT_STREETT_CONV_MIN

The number of Streett pairs above which conversion from Streett acceptance to generalized-Büchi acceptance should be made with a dedicated algorithm. By default this is 3, i.e., if a Streett automaton with 3 acceptance pairs or more has to be converted into generalized-Büchi, the dedicated algorithm is used. This algorithm is close to the classical conversion from Streett to Büchi, but with several tweaks. When this algorithm is not used, the standard "Fin-removal" approach is used instead: first the acceptance condition is converted into disjunctive normal form (DNF), then Fin acceptance is removed like for Rabin automata, yielding a disjunction of generalized Büchi acceptance, and the result is finally converted into conjunctive normal form (CNF) to obtain a generalized Büchi acceptance. Both algorithms have a worst-case size that is exponential in the number of Streett pairs, but in practice the dedicated algorithm works better for most Streett automata with 3 or more pairs (and many 2-pair Streett automata as well, but the difference here is less clear). Setting this variable to 0 will disable the dedicated algorithm. Setting it to 1 will enable it for all Streett automata; however we do not recommend setting it to less than 2, because the "Fin-removal" approach is better for single-pair Streett automata.

SPOT_STUTTER_CHECK

Select the default check used to decide stutter invariance. The variable should hold a value between 1 and 8, corresponding to the following tests described in our SPIN’15 paper (see the BIBLIOGRAPHY section). The default is 8.

1

sl(a) x sl(!a)

2

sl(cl(a)) x !a

3

cl(sl(a)) x !a

4

sl2(a) x sl2(!a)

5

sl2(cl(a)) x !a

6

cl(sl2(a)) x !a

7

sl(a) x sl(!a), performed on-the-fly

8

cl(a) x cl(!a)

This variable is used by the --check=stutter-invariance and --stutter-invariant options, but it is ignored by --check=stutter-sensitive-example.
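For instance, assuming ltl2tgba's --check option, the following command (with an arbitrary example formula) forces the on-the-fly variant of the first test:

% SPOT_STUTTER_CHECK=7 ltl2tgba --check=stutter-invariance 'GFa -> GFb'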

SPOT_SIMULATION_REDUCTION

Choose which simulation-based reduction to use: 1 forces the signature-based BDD implementation, 2 forces the matrix-based implementation, and 0 (the default) uses a heuristic to choose between the two.

SPOT_TMPDIR, TMPDIR

These variables control in which directory temporary files (e.g., those that contain the input and output when interfacing with translators) are created. TMPDIR is only read if SPOT_TMPDIR does not exist. If none of these environment variables exist, or if their value is empty, files are created in the current directory.

SPOT_TMPKEEP

When this variable is defined, temporary files are not removed. This is mostly useful for debugging.

SPOT_XCNF

Assign a directory path to this variable to generate XCNF files whenever SAT-based minimization is used; the file is output as "incr.xcnf" in the specified directory. This feature only works with an external SAT-solver; see SPOT_SATSOLVER to learn how to provide one. Also note that this requires an incremental approach without restarting the encoding, i.e., "sat-minimize=3,sat-incr-steps=-1" for ltl2tgba and ltl2tgta, or "incr,param=-1" for autfilt (see the sat-minimize options described above, or the autfilt man page). The XCNF format is the one used by the SAT incremental competition.

BIBLIOGRAPHY

The following papers are related to some of the options and environment variables.

1.

Christian Dax, Jochen Eisinger, Felix Klaedtke: Mechanizing the Powerset Construction for Restricted Classes of ω-Automata. Proceedings of ATVA’07. LNCS 4762.

Describes the WDBA-minimization algorithm implemented in Spot. The algorithm used for the tba-det option is also a generalization (to TBA instead of BA) of what they describe in sections 3.2 and 3.3.

2.

Tomáš Babiak, Thomas Badie, Alexandre Duret-Lutz, Mojmír Křetínský, Jan Strejček: Compositional Approach to Suspension and Other Improvements to LTL Translation. Proceedings of SPIN’13. LNCS 7976.

Describes the compositional suspension, the simulation-based reductions, and the SCC-based simplifications.

3.

Rüdiger Ehlers: Minimising Deterministic Büchi Automata Precisely using SAT Solving. Proceedings of SAT’10. LNCS 6175.

Our SAT-based minimization procedures are generalizations of this paper to deal with TBA or TGBA.

4.

Thibaud Michaud and Alexandre Duret-Lutz: Practical stutter-invariance checks for ω-regular languages, Proceedings of SPIN’15. LNCS 9232.

Describes the stutter-invariance checks that can be selected through SPOT_STUTTER_CHECK.

5.

Javier Esparza, Jan Křetínský and Salomon Sickert: One Theorem to Rule Them All: A Unified Translation of LTL into ω-Automata. Proceedings of LICS’18.

Describes (among other things) the constructions used for translating formulas of the form GF(guarantee) or FG(safety), that can be disabled with -x gf-guarantee=0.

6.

Kyveli Doveri and Pierre Ganty and Nicolas Mazzocchi: FORQ-Based Language Inclusion Formal Testing. Proceedings of CAV’22. LNCS 13372.

Describes the FORQ-based containment check implemented as spot::contains_forq(), and used for Büchi automata when SPOT_CONTAINMENT_CHECK=forq.

REPORTING BUGS

Report bugs to <spot@lrde.epita.fr>.

COPYRIGHT

Copyright © 2024 by the Spot authors, see the AUTHORS File for details. License GPLv3+: GNU GPL version 3 or later.
This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law.

SEE ALSO

ltl2tgba(1) ltl2tgta(1) dstar2tgba(1) autfilt(1)