The CP 2020 presentation slides for the MiniZinc Challenge 2020 will be available here.
The entrants for this year (with their descriptions, when provided):
In addition, the challenge organisers entered the following FlatZinc and MiniZinc implementations:
As per the challenge rules, these entries are not eligible for prizes, but do modify the scoring results.
Furthermore, entries in the FD search category (Gecode, JaCoP, SICStus Prolog) were automatically included in the free search category, while entries in the free search category (Chuffed, OscaR/CBLS and the promoted FD entries except Gecode) were automatically included in the parallel search category. Lastly, all entries in the parallel search category, together with the entries promoted into it, were automatically included in the open search category.
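To make this promotion chain concrete, the sketch below (our own illustration, not part of the challenge tooling; it ignores special cases such as Gecode's FD entry not being promoted to the parallel category) computes the categories in which an entry is scored:

```python
# Illustrative sketch of the category promotion described above.
# Assumption: an entry is scored in the category it entered and in every
# later category of the chain fd -> free -> par -> open (exceptions such
# as Gecode's FD entry not being promoted to parallel are ignored here).
CHAIN = ["fd", "free", "par", "open"]

def scored_categories(entered: str) -> list[str]:
    """Return the categories an entry competes in, given the one it entered."""
    return CHAIN[CHAIN.index(entered):]

print(scored_categories("fd"))    # ['fd', 'free', 'par', 'open']
print(scored_categories("free"))  # ['free', 'par', 'open']
```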
The Choco entry submitted by the final deadline had a bug that prevented it from competing in the free and parallel categories. An updated version, provided after the deadline, fixed those problems; we entered it but treated it the same as the other organiser entries: Choco is therefore not eligible for prizes in the free and parallel categories, although its performance does affect the scoring results.
Category | Gold | Silver | Bronze |
---|---|---|---|
Fixed | SICStus Prolog | JaCoP | Choco 4 |
Free | OR-Tools | PicatSAT | Mistral 2.0 |
Parallel | OR-Tools | PicatSAT | Mistral 2.0 |
Open | OR-Tools | sunny-cp | PicatSAT |
Local Search | Yuck | OscaR/CBLS | |
All times are given in milliseconds.
A score of 0.0 indicates an answer of worse quality (a worse objective value, no proof of optimality, or no answer for satisfaction problems), and a score of 1.0 an answer of better quality. When both answers are of equal quality, the 1.0 purse is split between the two solvers with respect to the time used.
If a promoted entry does not recognize an option (or states that the option is ignored), the times and solutions from the previous category are used for scoring. The suffixes -fd, -free, -par or -open (for the parallel portfolio solver entered) at the end of the solver names indicate which configuration each solver was run with.
The time limit includes both MiniZinc compilation and solving.
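As an illustration of the scoring rule above, the following sketch (our own; the names and the quality encoding are assumptions, not the challenge's actual scoring scripts) computes the pairwise score of two answers to the same instance:

```python
# Sketch of the pairwise scoring rule described above (not the official
# challenge scripts).  `quality` is any comparable value where larger
# means a better answer (better objective, proof of optimality, or an
# answer at all versus none); times are in milliseconds.

def pairwise_score(quality_a: float, time_a: float,
                   quality_b: float, time_b: float) -> tuple[float, float]:
    """Return (score_a, score_b) for one instance and one pair of solvers."""
    if quality_a > quality_b:
        return 1.0, 0.0
    if quality_b > quality_a:
        return 0.0, 1.0
    # Equal quality: the 1.0 purse is split with respect to the time used,
    # the faster solver receiving the larger share.
    total = time_a + time_b
    if total == 0:
        return 0.5, 0.5
    return time_b / total, time_a / total

# Example: equal-quality answers found after 2000 ms and 6000 ms.
print(pairwise_score(1, 2000, 1, 6000))  # (0.75, 0.25)
```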
Detailed per-instance results are given in a table with the columns Problem, Instance, Solver, Status, Time, Objective, Score, Score Incomplete, and Score Area, together with summary rows and totals per problem; a second table provides a plot for each problem and instance.
The following table lists the global constraints used by each model in this year's challenge. In addition, the columns RC and SBC indicate whether the model contains redundant constraints and symmetry breaking constraints, respectively.
Model | RC | SBC | Global constraints used |
---|---|---|---|
bnn-planner | |||
cable_tree_wiring | X | | all_different |
code-generator | X | X | all_different, diffn_nonstrict, diffn, maximum, table, cumulative, member, value_precede_chain, minimum, decreasing |
collaborative-construction | X | X | |
gbac | | | global_cardinality_low_up_closed, bin_packing_load |
hoist-benchmark-for-minizinc | X | ||
is | X | | table, circuit |
lot-sizing | X | X | global_cardinality, at_least, at_most |
minimal-decision-sets | |||
p1f-pjs | X | | circuit, all_different, inverse, lex_less |
pentominoes | | | regular |
pillars-and-planks | X | | diffn |
racp | X | | cumulative |
radiation | |||
sdn-chain | |||
skill-allocation | |||
soccer-computational | |||
stable-goods | |||
tower_challenge | | | arg_max |
whirlpool |
The files on this page are for MiniZinc version 2.4.3.