Problem Submission Specification: Input, Output, Checking

Quick Directions

Package Submission Instructions

  • Preliminary submission (see Example) can be done by providing the problem name and problem specification only; we encourage you to also provide a problem encoding and sample instances.

  • Complete problem specifications (see Example) shall be sent by email to Benchmark Submission and must be packaged in a single compressed archive named benchmark_name-contributor_name.zip (or tar.gz) containing:

    1. a benchmark_name.spec.txt file containing the textual problem description (a template is available, see below);

    2. a benchmark_name.enc.asp file containing a proposed encoding in ASP for the problem (optional: this is strongly encouraged for problems to be submitted to the System Track);

    3. a folder named checker containing the sources of a correctness checker together with a README.txt file containing the instructions for (possibly building and) running it;

    4. a folder named samples containing some instances (one instance per text file).

benchmark_name and contributor_name should be replaced with their actual values.
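
For illustration only, assuming the hypothetical benchmark name GraphColouring and contributor name JaneDoe, the archive could be laid out as follows (the sample file names follow the instance naming convention described below):

    GraphColouring-JaneDoe.zip
        GraphColouring.spec.txt
        GraphColouring.enc.asp
        checker/
            README.txt
            (checker sources)
        samples/
            01-GraphColouring-0-0.asp
            02-GraphColouring-0-0.asp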

Package Example Download

Problem Input-Output predicates and Instance Specification

The following specifies the allowed format for the input and output of problems and instances. Samples are available on the competition web site.

Problem Input and Output.

Benchmark problem specifications have to clearly indicate the vocabulary of input and output predicates, i.e., a list of predicate names for the input and a list of predicate names for the output, together with their arities in the usual logic programming notation (e.g., p/2 denotes the binary predicate p).
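
For instance, the hypothetical graph colouring benchmark used in the example above (names and predicates here are purely illustrative) could declare its vocabulary as:

    input:  node/1, edge/2
    output: col/2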

Instance Specification.

Input instances are sequences of facts (atoms followed by the dot "." character) built only from predicates of the input vocabulary, possibly separated by spaces and line breaks, and saved entirely in a text file (only one instance per file is allowed). Example facts: node(a). edge(a,b). The terms a and b can be positive integers, double-quoted strings, or unquoted strings matching [a-zA-Z][a-zA-Z0-9]*.
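
As an illustration, a complete instance file for the hypothetical vocabulary above could contain the following facts (spaces and line breaks may be used freely as separators):

    node(a). node(b). node(c).
    edge(a,b). edge(b,c).
    edge(a,c).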

Maximum integer and maximum function nesting level.

These have to be provided on a per-instance basis; in particular, they are specified in the filename containing a given instance. We recall that instances must be named XX-benchmark_name-MAXINT-MAXLEVEL.asp, where XX is the instance number, MAXINT is the maximum integer (0 if not relevant/no arithmetic is required), and MAXLEVEL is the maximum function nesting level (0 if not relevant). These data should be provided for those solvers that need an explicit bound on the range of integers and/or function symbols. One or both can be set to 0 if your problem domain does not use numbers, function symbols, or both.
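
For example, under the hypothetical benchmark name used above, an instance that needs neither arithmetic nor function symbols could be named:

    01-GraphColouring-0-0.asp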

Checker format

The checker reads from standard input a witness (an answer to the problem instance at hand) and writes to standard output a single line of text containing the string OK (if the witness is correctly assessed as a solution) or the string FAIL otherwise. In the case of optimization problems, the string OK must be followed by a space and an integer representing the witness cost. The checker outputs WARN and exits with a non-zero exit code in all other cases (e.g., the witness is not syntactically correct, or the output INCONSISTENT has been incorrectly piped to the checker). Informative messages can be written to standard error.

Note that witnesses given as input to checkers are assumed to include both instance facts (input predicates) and solution facts (output predicates).

Remark: checkers provided by benchmark authors will be used only for checking existing witnesses. It is not up to the benchmark author to check the validity of other outputs, such as INCONSISTENT, or to check optimality.
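
As an illustration only, a minimal checker skeleton could look as follows in Python; the fact-parsing pattern and the function is_valid_solution are placeholders to be replaced by problem-specific logic, and nothing here is prescribed by the competition beyond the OK/FAIL/WARN protocol described above:

    #!/usr/bin/env python
    # Illustrative checker sketch: reads a witness from standard input and
    # prints OK, FAIL, or WARN following the protocol described above.
    import re
    import sys

    # Matches facts of the form predicate(arguments).
    FACT = re.compile(r'([a-zA-Z][a-zA-Z0-9]*)\(([^)]*)\)\s*\.')

    def is_valid_solution(facts):
        # Placeholder: replace with the real correctness test of the benchmark.
        return len(facts) > 0

    def main():
        text = sys.stdin.read()
        facts = FACT.findall(text)
        if "INCONSISTENT" in text or not facts:
            # Not a syntactically usable witness: warn and exit non-zero.
            print("WARN")
            sys.exit(1)
        if is_valid_solution(facts):
            print("OK")    # for optimization problems: print "OK <cost>"
        else:
            print("FAIL")

    if __name__ == "__main__":
        main()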

The source code of the correctness checker has to be included in the package; moreover, the provided software must be able to (build and) run on the Debian i686 GNU/Linux (kernel ver. 2.6.26) operating system. A checker can be built on top of a logic specification for a solver of choice: in that case, the binaries of the solver must be provided together with the checker encoding.

Checker Example Download
