Fifth Answer Set Programming Competition (ASPCOMP 2014)
The Fifth Answer Set Programming Competition (ASPCOMP 2014) will take place as one of the events within the FLoC Olympic Games, held in Vienna in the context of the Vienna Summer of Logic 2014. The event is officially affiliated with the 30th International Conference on Logic Programming (ICLP 2014).
Contents
- Fifth Answer Set Programming Competition (ASPCOMP 2014)
Competition Results and Data
Raw data (.xlsx MS Excel file).
News and updates
- 08 August, 2014: In addition to the results, the full data from the competition is now available. Please find details here.
- 22 July, 2014: Winning teams of the modeling competition announced. Details here.
- 21 July, 2014: Results unveiled! Please have a look at the slides from the talk @ICLP2014 here.
- 21 July, 2014: Tasksheet, instances, and sample solutions for the modeling competition available. Details here.
- 11 June, 2014: Participants, encodings, and instance sets released. Details here.
- 03 June, 2014: Live events schedule unveiled. Details here.
- 28 May, 2014: The Design of the Fifth Answer Set Programming Competition: report paper available here.
- 23 April, 2014: The competition has started! Good luck to all participants...
- 15 April, 2014: Competition format and benchmark suite unveiled. Have a look here.
- 14 April, 2014: More details about rules.
- 08 April, 2014: Application procedure updated; submission procedure published.
- 31 March, 2014: Application procedure updated.
- 14 March, 2014: Important dates updated.
- 20 February, 2014: Application directions now available; registration opens March 1st, 2014.
- 11 February, 2014: Call for Participation updated. Please check it out!
- 03 February, 2014: Announcement - ASPCOMP 2014 is officially affiliated with the 30th International Conference on Logic Programming (ICLP 2014).
- 03 February, 2014: Call for Participation updated.
- 13 January, 2014: Preliminary Call for Participation out.
Live Events
Report Talk @ICLP 2014: Sunday, July 20th 2014 - Competition report (see http://users.ugent.be/~tschrijv/ICLP2014/program.html).
Awarding Ceremony @FLoC Olympic Games 2014: Monday, July 21st - FLoC Olympic Games Award Ceremony 2 (see http://vsl2014.at/olympics/).
Answer Set Programming Modeling Competition
The Answer Set Programming Modeling Competition 2014 @FLoC Olympic Games 2014 took place on Sunday, July 20th, 2014. The ASP Modeling Competition 2014 was held on-site: complex problems of various kinds were waiting to be modeled using the ASP-Core-2 standard input language. Dauntless programmers, novices and experts alike, were encouraged to team up and take the challenge, and the glory (see http://www.easychair.org/smart-program/VSL2014/VSL-2014-07-20.html#session:912).
Tasksheet, instances, and sample solutions for the modeling competition available here (.zip file)
| Place | Winner Team |
|-------|-------------|
| 1 | Mario Alviano (University of Calabria), Carmine Dodaro (University of Calabria), Wolfgang Faber (University of Huddersfield) |
| 2 | Marcello Balduccini (Drexel University), Agostino Dovier (University of Udine), Tran Cao Son (New Mexico State University) |
| 3 | Günther Charwat (Vienna University of Technology), Stefan Ellmauthaler (University of Leipzig) |
Report Papers and Slides
The Design of the Fifth Answer Set Programming Competition: report paper available here.
Slides from the talk @ICLP presenting the competition results are available here.
Call for Participation
Please find the official Call for Participation here.
Important Dates
NOTE: Due to the nature of the competition, the schedule may be subject to further modification. Please check this site often for news and updates.
- March 1st, 2014: Participant registration opens
- March 31st, 2014: Participant registration closes (new)
- April 16th, 2014: Competition server closes to participants
- April 16th, 2014: The competition starts (extended)
- July 2014: Awards are presented at FLoC (22nd) and at ICLP (19th-22nd)
Competition Format and Benchmark Suite
Competition Categories
The competition consists of two categories, each featuring four tracks. The categories are distinguished by the computational resources allowed:
- One processor allowed (SP)
- Multiple processors allowed (MP)
Of course, parallel systems can participate in tracks of the MP category, and similarly, non-parallel systems can participate in tracks of the SP category. The categorization concerns only the number of processors available to a system for each run.
Competition Tracks
Previous editions of the ASP Competition Series featured (sub)tracks conceived on the basis of the "complexity" of problems; in light of the work spent by the community on language standardization, this edition instead takes language features into consideration.
As stated in the Call for Participation, and according to the availability of benchmarks, the submitted systems, and participant feedback, both categories of the competition are structured into four tracks:
| Track # | Track Name | Track Features |
|---------|------------|----------------|
| 1 (SP/MP) | Basic | Basic Decision. Encodings: normal LP + simple arithmetic and comparison operators |
| 2 (SP/MP) | Advanced | Advanced Decision. Encodings: full language, excepting optimization statements and non-HCF disjunction |
| 3 (SP/MP) | Optimization | Optimization. Encodings: full language with optimization statements, excepting non-HCF disjunction |
| 4 (SP/MP) | Unrestricted | Unrestricted. Encodings: full language |
Instance Selection
Instances were randomly selected from the already available official ASPCOMP 2013 suite; the piece of software used for this task is available here. The seed used for randomization is 13152024460108, obtained as the concatenation of the winning numbers of the EuroMillions lottery draw of Tuesday, 22nd April 2014 (see the EuroMillions lottery website).
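As an illustrative sketch only (the actual selection tool is linked above), a reproducible seeded pick can be emulated in plain shell by deriving a deterministic sort key for each instance from the seed; the file name `instances.txt` and the selection size are made up for the example:

```shell
# Illustrative only: reproducibly pick 3 instances from a list. Each line is
# keyed on the POSIX cksum CRC of "seed-name", so the same seed always yields
# the same selection. File name and selection size are hypothetical.
SEED=13152024460108
printf 'inst-%02d\n' 1 2 3 4 5 6 7 8 9 10 > instances.txt
while read -r inst; do
  key=$(printf '%s-%s' "$SEED" "$inst" | cksum | cut -d' ' -f1)
  printf '%s\t%s\n' "$key" "$inst"
done < instances.txt | sort -n | head -n 3 | cut -f2
```

Re-running the pipeline with the same seed reproduces the same selection, which is the property the competition relies on for transparency.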
Benchmark Suite
Note: For descriptions and further details about the problems, please refer to the ASPCOMP 2013 Official Problem Suite, except for Partner Units, which comes from the ASPCOMP 2011 Official Problem Suite. For more details, please refer to the reports.
| Domain | P | 2013 Encoding (Track) | 2014 Encoding (Track) |
|--------|---|-----------------------|-----------------------|
| Labyrinth | D | basic, non-tight (1) | basic, non-tight (1) |
| Stable Marriage | D | basic (1) | basic (1) |
| Bottle Filling | D | aggr (2) | aggr, choice (2) |
| Graceful Graphs | D | choice (2) | choice (2) |
| Graph Colouring | D | disj (2) | basic (1) |
| Hanoi Tower | D | disj (2) | basic (1) |
| Incremental Scheduling | D | aggr, choice (2) | aggr, choice (2) |
| Knight Tour with Holes | D | disj, non-tight (2) | basic, non-tight (1) |
| Nomystery | D | aggr, choice (2) | choice (2) |
| Partner Units | D | aggr, disj, non-tight (2) | aggr, choice (2) |
| Permutation Pattern Matching | D | choice (2) | choice (2) |
| Qualitative Spatial Reasoning | D | choice, disj (2) | disj (2) |
| Reachability | Q | non-tight (2) | n/a |
| Ricochet Robots | D | choice (2) | aggr, choice (2) |
| Sokoban | D | aggr, choice (2) | choice (2) |
| Solitaire | D | choice (2) | aggr, choice (2) |
| Visit-all | D | aggr, choice (2) | basic (1) |
| Weighted-Sequence Problem | D | choice (2) | aggr, choice (2) |
| Connected Still Life | O | aggr, choice, non-tight (3) | aggr, choice, non-tight (3) |
| Crossing Minimization | O | disj (3) | aggr, choice (3) |
| Maximal Clique | O | disj (3) | basic (3) |
| Valves Location | O | aggr, choice, non-tight (3) | aggr, choice, non-tight (3) |
| Abstract Dialectical Frameworks | O | aggr, disj, level, non-tight (4) | aggr, disj, level, non-tight (4) |
| Complex Optimization | D | choice, disj, non-tight (4) | choice, disj, non-tight (4) |
| Minimal Diagnosis | D | disj, non-tight (4) | disj, non-tight (4) |
| Strategic Companies | Q | disj, non-tight (4) | n/a |

D: Decision - Q: Query Answering - O: Optimization
Participants, Encodings, Instance Sets
Participant Teams
The competition featured 16 systems coming from three teams.
Aalto Team from the Aalto University (9 solvers):
- LP2SAT3+GLUCOSE, LP2SAT3+LINGELING, LP2SAT3+PLINGELING-MT, LP2BV2+BOOLECTOR, LP2GRAPH, LP2MAXSAT+CLASP, LP2MIP2, LP2MIP2-MT, LP2NORMAL2+CLASP.
Potassco Team from the University of Potsdam (2 solvers, relying on GRINGO-4 for grounding and CLASP as back-end):
- CLASP, CLASP-MT.
Wasp Team from the University of Calabria (5 solvers, some relying on DLV for grounding, some on GRINGO-4):
- WASP-1, WASP-2, WASP-1.5, WASP-WMSU1-ONLYWEAK, WASP-WPM1-ONLY-WEAK.
NOTE: In order to assess the improvements in system implementation, we also ran a selection of systems that were submitted in the Fourth ASP Competition. In particular, we considered one system per team selected according to the following criteria:
- it features an updated version in this year’s edition;
- it is compliant with ASP-Core-2;
- it performed best in the Fourth ASP Competition among the systems submitted by the same team.
For further details and proper discussion, please refer to the competition reports available here.
Full Material
Please find here everything needed for the sake of reproducibility:
NOTE: full instance set for Reachability is available here.
Rules, Scoring and Details
Rules and Scoring
Please find details about rules and scoring here.
Input/Output format
Input and output formats do not change significantly with respect to ASPCOMP 2013.
Input format
The input format does not change w.r.t. the last edition. Please find further information via the official ASPCOMP 2013 website:
Output format
The expected output is almost the same as in the last edition (see the note below). Please find the latest specs via the official ASPCOMP 2013 website:
Note: Exit code 30 is now allowed also in the case of decision problems.
Please have a look at the ASPCOMP 2013 ASP standardization page.
Application and Submission Procedure
Application Procedure
Participants willing to apply for the competition can send an email (starting from March 1st, 2014) to this address specifying:
- the name of their team;
- names and affiliation of the team members (institution, research group);
- name and number of the systems the team will submit;
- whether each system is an updated version of a solution submitted to the 2013 edition;
- specific software requirements: e.g. linux packages/libraries needed to run your system.
- language support: e.g., "our submission supports normal logic programs and aggregates; it does not support weak constraints and function symbols".
Applicants will receive a private e-mail with instructions for submitting systems and login credentials for accessing our competition server via SSH.
System Package(s) submission
Once they have received the instructions via e-mail, participants can log in to the competition hardware and use their accounts for building, testing, and submitting system packages; access is available until the day before the competition starts.
To properly submit a system package, participants must create a sub-directory with the name of the system at hand in the "submission" directory of their home. When submitting several systems, participants must create one sub-directory per system.
Reminder: participants can log in and update their submissions until the competition starts. Please check the deadline in the section above.
Package Structure
The structure of each submission package must be the following:
    system-name
    └── bin
        ├── run
        ├── mysolver
        └── my_helper_tool
A package should contain at least a folder named bin. The package is expected to be launched with the same configuration for all benchmark domains, i.e., it must comprise all files required for running the submitted system on any problem encoding and any problem instance.
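For concreteness, the layout above can be created with a few shell commands; only the submission and bin directory names are prescribed by the rules, while mysolver stands in for a real system:

```shell
# Sketch with assumed names: only "submission" and "bin" are prescribed by the
# rules; "mysolver" is a placeholder for a real system's files.
mkdir -p submission/mysolver/bin
# A minimal "run" wrapper that would invoke the solver sitting next to it.
printf '#!/bin/sh\nexec "$(dirname "$0")/mysolver" "$@"\n' \
  > submission/mysolver/bin/run
chmod +x submission/mysolver/bin/run
ls submission/mysolver/bin
```

The final `ls` lists the package contents (here just the run wrapper); the actual solver binary and any helper tools would be copied into the same bin directory.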
Command Line
The main script "run" takes three arguments:
- The first argument ($1, argv[1], ...) is the maximum integer (0 if not meaningful for the problem/instance at hand).
- The second argument ($2, argv[2], ...) is the maximum nesting level of functional terms (0 if not meaningful for the problem/instance at hand).
- The third argument ($3, argv[3], ...) is a comma-separated list of output predicates p1/n1, p2/n2,... with predicate names pi and arities ni. The list coincides with the list of output predicates which are allowed in the output of the run script. For each domain, the list can be found in the related checker folder, for convenience.
Participants are free to design their submitted scripts to take the three arguments into account, or to ignore them.
A problem instance in the prescribed format is fed to run's standard input. The output of each run must be printed to standard output, and a proper exit code must be issued, according to the competition format.
More details on input and output are available in the proper section.
An example of run script is available here.
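Complementing the linked example, the following is a minimal sketch of what a conforming run script might look like; the solver name mysolver, its flags, and the exit code 10 for "answer found" are assumptions (only exit code 30 is explicitly discussed above):

```shell
# Hypothetical run script sketch; the real example is linked above.
# $1 = maximum integer, $2 = maximum nesting level of functional terms,
# $3 = comma-separated list of output predicates p1/n1,p2/n2,...
cat > run <<'EOF'
#!/bin/sh
MAXINT=$1; MAXNEST=$2; OUTPREDS=$3
# A real script would pipe the instance (standard input) to the solver, e.g.
#   exec ./mysolver --maxint="$MAXINT" --filter="$OUTPREDS"   # hypothetical flags
cat > /dev/null        # consume the instance
echo "ANSWER"          # placeholder output line
exit 10                # assumed exit code for "answer found"
EOF
chmod +x run
# Feed a toy instance; "|| true" keeps the demo's overall status at 0.
printf 'p(1).\n' | ./run 0 0 "p/1" || true
```

In this sketch the script prints ANSWER and terminates with status 10; a real submission must emit the model and exit code prescribed by the competition format.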
Contacts
Organizing Committee
The Fifth Answer Set Programming Competition is hosted and regulated jointly by Aalto University (Finland), University of Calabria (Italy) and University of Genova (Italy).
Co-Chairs
Email and Contacts
Drop us a note at aspcomp2014@mat.unical.it.