Call for Benchmark Problems

The 4th Open Answer Set Programming Competition is open to ASP systems and any other system based on a declarative specification paradigm.

Participants will compete on a selected collection of declarative specifications of benchmark problems, taken from a variety of domains as well as real-world applications, together with instances thereof.

These include but are not limited to:

  • Deductive database tasks on large data sets
  • Sequential and temporal planning
  • Classic and application-oriented graph problems
  • Puzzles and combinatorics
  • Scheduling, timetabling, and other resource allocation problems
  • Combinatorial optimization problems
  • Ontology reasoning
  • Automated theorem proving and model checking
  • Reasoning tasks over large propositional instances
  • Constraint programming problems
  • Other AI problems

We encourage the community to help by proposing and/or devising new challenging benchmark problems. The submission of problems arising from applications with practical impact is especially encouraged; problems used in former ASP Competitions, or variants thereof, can be re-submitted (see Problem Submission).

Benchmark authors are asked to provide the committee with a problem specification and an instance set (or a generator thereof). Problems will be classified and selected so as to obtain a fair balance across several factors (application domain, modeling difficulty, complexity category, expressiveness, etc.).

What is a problem?

For the purposes of the competition, a problem consists of a specification of its input and output, together with a set of instances and a means of checking candidate solutions; the required ingredients are detailed under Package Description below.

Problem Submission

The problem submission procedure consists of two stages:

  1. Proposal and discussion;
  2. Validation and final submission.

At the first stage, problem descriptions are submitted and made publicly available on the competition website. Anyone can discuss and improve the proposals on our discussion mailing list. At the second stage, the organizing committee will validate a selection of benchmarks; each selected problem will then be completed under the responsibility of the original contributor(s).

Submission, discussion, and selection of benchmark problems are handled via email; details are reported below.

Schedule

  • Problem Proposal and Discussion

    • Aug. 31st, 2012 - Problem submission deadline

  • Problem Validation and Final Submission

    • Sep. 30th, 2012 - Final submission deadline

How To Submit a Problem

Problem specifications can be initially submitted in a preliminary form (abstract); authors will be asked to provide a complete specification later on.

To submit a new problem, prepare a benchmark package (see Package Description) and send it by email to the Benchmark Submission address (aspcomp_benchmarks_submission@yahoogroups.com) with the subject:

  • SUBMISSION: Benchmark Name
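
For instance, a hypothetical benchmark named Graph Colouring would be submitted with the subject line:

  • SUBMISSION: Graph Colouring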

Package Description

  • A preliminary submission (see the Examples) requires only the problem name and a description/specification, although the submission of sample instances is encouraged (sample instances will not be used for the competition). If you want to propose the problem for the System Competition, an ASP encoding is also strongly suggested.

  • A complete problem submission (see the Examples), to be provided if the proposal has been accepted, must be enclosed in a single compressed package (zip or tar.gz), containing:

    1. a textual problem description, with a clear specification of the input and output formats. Names and arguments of input and output predicates must be clearly specified (see Problem I/O);

    2. if the problem has been included in the System Track benchmark suite, a problem encoding in ASP (a sketch of such an encoding is given after this list);

    3. at least 50 instances or, alternatively, a script/program that generates problem instances (a sample instance appears in the sketch after this list). The instance format can be found on the Instance Specification page;

    4. a correctness checker, that is, a script or a program (preferably written in ASP whenever possible) that decides whether, given an instance of the problem at hand, the output of a system actually represents a solution to the instance (see the checker sketch below). If the problem has been included in the System Competition benchmark suite, the ASP program or the script must check whether the predicates occurring in some answer set provided by the system form a solution to the instance. In the case of an optimization problem, the program should also be able to compute the "quality" of the answer (see Checker Specification).

    5. a "demonstration", in which the author convincingly shows that the benchmark problem both can be effectively handled/solved and it is not trivial.

Note that: The problem statement must be unambiguous and include the description of input and output according to the problem description format described above. We would appreciate it if you could provide a declarative encoding and some sample instances right away: problem encodings often help disambiguate blurred specifications, so their provision is encouraged.
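
Continuing the hypothetical 3-Colouring example, a checker written in ASP (item 4 above) could combine the instance with the col/2 atoms extracted from the answer set under test and accept the output only if no violation is derived. This is a sketch under the same assumed conventions, not the prescribed checker interface (see Checker Specification):

    % Hypothetical checker for the 3-Colouring example.
    % Input: the instance (node/1, edge/2) plus the witness
    % atoms col/2 taken from the answer set being checked.
    coloured(X) :- col(X,_).

    fail :- edge(X,Y), col(X,C), col(Y,C).   % clash on an edge
    fail :- node(X), not coloured(X).        % a node was left uncoloured
    fail :- col(X,C1), col(X,C2), C1 != C2.  % a node got two colours

    ok :- not fail.
    % The tested output is a solution iff ok is derived.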

Problem Discussion and Validation

Benchmarks will be published in a dedicated section of the competition website.

All researchers interested in discussing/improving problem proposals can subscribe to the Benchmark Discussion mailing list (aspcomp_benchmarks_discussion@yahoogroups.com). All communities are encouraged to participate in this crucial stage, in which problems are submitted and evaluated.

Validation and final submission stage

Benchmark problems submitted and discussed in the first stage are evaluated by the competition organizing committee. As with paper submissions to a conference, authors will be notified of the final acceptance decision.

In this phase the contribution of benchmark authors is fundamental; submissions that turn out to be incomplete at the end of this stage are unsuitable for the competition and will be rejected (see Section Final acceptance of problems). The organizing committee reserves the right to exclude benchmark problems whose instance family turns out to be blatantly too easy or too difficult in terms of expected evaluation time.

Final acceptance of problems

The final inclusion of problems in the competition is subject to the conclusive approval of the competition organizing committee. The committee reserves the right to discard problems that meet one or more of the following conditions:

  1. The instance set (or the instances generator output) does not comply with the competition input/output format (see Problem I/O and Instance Specification);

  2. The set of provided/generated instances contains fewer than 50 instances, or the provided generator is missing or failing;
  3. The provided/generated instances are trivial, or they are not demonstrably solvable on the competition hardware.
