
Call for Benchmark Problems

The 4th Open Answer Set Programming Competition is open to ASP systems and any other system based on a declarative specification paradigm.

Participants will compete on a selected collection of declarative specifications of benchmark problems, taken from a variety of domains as well as real-world applications, and on instances thereof.

These include but are not limited to:

  • Deductive database tasks on large data-sets
  • Sequential and Temporal Planning
  • Classic and Applicative graph problems
  • Puzzles and combinatorics
  • Scheduling, timetabling and other resource allocation problems
  • Combinatorial Optimization problems
  • Ontology reasoning
  • Automated Theorem Proving and model checking
  • Reasoning tasks over large propositional instances
  • Constraint Programming problems
  • Other AI problems

We encourage the community to help by proposing and/or devising new challenging benchmark problems. The submission of problems arising from applications with practical impact is especially encouraged; problems used in former ASP Competitions, or variants thereof, can be resubmitted (see Problem Submission).

Benchmark authors are asked to provide the committee with a problem specification and an instance set (or a generator thereof). Problems will be classified and selected in order to obtain a fair balance among several factors, including application domain, modeling difficulty, complexity category, and expressiveness.
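The call does not prescribe any particular generator implementation. Purely as an illustration, a minimal instance generator for a hypothetical graph benchmark could emit each instance as a set of ASP facts; the predicate names node/1 and edge/2, the parameters, and the fact format below are assumptions for this sketch, not part of the call:

```python
import random

def generate_instance(num_nodes, edge_prob, seed=0):
    """Generate one random-graph instance as a set of ASP facts.

    The predicates node/1 and edge/2 are illustrative only; a real
    benchmark must follow the I/O format fixed in its problem spec.
    """
    rng = random.Random(seed)
    facts = [f"node({i})." for i in range(1, num_nodes + 1)]
    for i in range(1, num_nodes + 1):
        for j in range(i + 1, num_nodes + 1):
            if rng.random() < edge_prob:
                facts.append(f"edge({i},{j}).")
    return "\n".join(facts) + "\n"

# A complete submission needs an instance set (or a generator);
# varying size and seed yields a family of 50 distinct instances.
instances = [generate_instance(20 + k, 0.3, seed=k) for k in range(50)]
```

Varying both the problem size and the random seed, as above, is one simple way to obtain an instance family spanning a range of difficulties.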

What is a problem?

Problem Submission

The problem submission procedure consists of two stages:

  1. Proposal and discussion;
  2. Validation and final submission.

At the first stage, problem descriptions are submitted and made publicly available on the competition website. Anyone can discuss and improve the proposals on our discussion mailing list. At the second stage, the organizing committee will validate a selection of benchmarks; each selected problem will then be completed under the responsibility of the original contributor(s).

Submission, discussion and selection of benchmark problems are handled via email; details are given below.

Schedule

  • Problem Proposal and Discussion

    • Aug. 31st, 2012 - Problem submission deadline

  • Problem Validation and Final Submission

    • Sep. 30th, 2012 - Final submission deadline

How To Submit a Problem

Problem specifications can be initially submitted in a preliminary form (abstract); authors will be asked to provide a complete specification later on.

To submit a new problem, prepare a benchmark package (see Package Description) and send it by email to Benchmark Submission with the subject:

  • SUBMISSION: Benchmark Name

Package Description

  • A preliminary submission (see the Package Description) requires only the problem name and a description/specification, although the submission of sample instances is encouraged (sample instances will not be used for the competition). If you want to propose the problem for the System Competition, an ASP encoding is also strongly suggested.

  • A complete problem submission (see the Package Description), to be provided if the proposal has been accepted, must be enclosed in a single compressed package (zip).
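Only the container format (a single zip archive) is fixed by the call. As a sketch of how a contributor might assemble one, the helper below bundles a benchmark directory into a zip; the assumed layout (description.txt, an instances/ folder, and so on) is an illustrative example, not a mandated structure:

```python
import zipfile
from pathlib import Path

def build_package(benchmark_dir, out_zip):
    """Bundle a benchmark directory into the single zip archive
    required for a complete submission.

    Archive paths are stored relative to benchmark_dir, so the
    package unpacks into a clean, self-contained tree.
    """
    benchmark_dir = Path(benchmark_dir)
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(benchmark_dir.rglob("*")):
            if path.is_file():
                zf.write(path, path.relative_to(benchmark_dir))
```

For example, `build_package("my_benchmark/", "my_benchmark.zip")` would produce one compressed package ready to be attached to the submission email.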

Problem Discussion and Validation

Benchmarks will be published in a specific section of the competition website.

Join the Discussion Group

All researchers interested in discussing or improving problem proposals can join the Benchmark Discussion mailing list by visiting the group page or by sending a message to Benchmark Discussion Subscription Request. All communities are encouraged to participate in this crucial stage, in which problems are submitted and evaluated.

Validation and final submission stage

Benchmark problems submitted and discussed in the first stage are evaluated by the competition organizing committee. As with paper submissions to a conference, authors will be notified of the final acceptance decision.

In this phase the contribution of benchmark authors is fundamental; submissions that are still incomplete at the end of this stage are unsuitable for the competition and will be rejected (see Section Final acceptance of problems). The organizing committee reserves the right to exclude benchmark problems whose instance family turns out to be blatantly too easy or too difficult in terms of expected evaluation time.

Final acceptance of problems

The final inclusion of problems in the competition is subject to the conclusive approval of the competition organizing committee. The committee reserves the right to discard problems that do not comply with one or more of the following criteria:

  1. The instance set (or the output of the instance generator) does not comply with the competition input/output format (see Problem I/O and Instance Specification);
  2. The set of provided/generated instances contains fewer than 50 instances, or the provided generator is missing or failing;
  3. The provided/generated instances are trivial, or not provably solvable on the competition hardware.
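Criterion 2 is easy to pre-check locally before submitting. The sketch below assumes a flat directory of instance files with a common extension; both the layout and the extension are hypothetical, not part of the call:

```python
from pathlib import Path

MIN_INSTANCES = 50  # criterion 2: fewer than 50 instances is grounds for rejection

def check_instance_count(instances_dir, ext=".asp"):
    """Return (ok, count) for an instance directory.

    ok is False when the instance set is too small to satisfy
    criterion 2 of the final acceptance rules.
    """
    count = sum(1 for p in Path(instances_dir).glob(f"*{ext}") if p.is_file())
    return count >= MIN_INSTANCES, count
```

Criteria 1 and 3 (format compliance and non-triviality) cannot be checked by counting alone; they require running the checker and timing solvers on representative instances.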

ASP Competition 2013: BenchmarkSubmission (last edited 2012-10-05 06:55:24 by GiovambattistaIanni)