MiniZinc Challenge 2018 -- Rules

These are the official rules for the MiniZinc Challenge 2018. Version 2.1.7.
These rules were last updated on 8 March 2018.

Entrants

The MiniZinc Challenge 2018 will test solvers on problems written in MiniZinc 2.1.7.
Let name be the name of the solver system in what follows.

An entrant in the challenge is a constraint solver that is installed in a Docker image that is an extension of the MiniZinc Challenge 2018 image provided by the organizers.
Constraint solvers that have several variants, for example solvers that can alternatively use copying or trailing, may submit one entry per variant, although the organizers reserve the right to reject variations that are not sufficiently interesting (e.g. multiple copies of the same solver with differing parameters).

Each entrant must provide a gzipped tarball containing the following:

  1. The filled-in form named EntryClasses.txt specifying which competition CLASS(es) the entry is to be entered in.

  2. A text file named DESCRIPTION, that contains a short (1-2 pages) description of the system. This should include a list of all authors of the system and their present institutional affiliations. It should also describe any algorithms or data structures that are not standardly used in such systems.
    System descriptions will be posted on the MiniZinc Challenge 2018 website.

  3. A filled-in form named Consent_3rdPartyUsage.txt.
  4. A description of how to retrieve the entry's Docker image with the installed solver. More information about Docker images is given below.

The gzipped tarball must be made accessible for download by the organizers, and the submitter must send an email to the organizers describing how to download the Docker image.

The organizers will make reasonable efforts to run each system, including communicating with the submitters of the system in case of difficulties. Nevertheless, the organizers reserve the right to reject an entrant if running it proves overly difficult.

The results will be announced at CP2018. Entrants are encouraged to physically attend CP2018, but are not required to in order to participate or win.

There will be at most five competition CLASSES depending on how many solvers are entered in each of them:

  • FD search: solvers in this class must follow the search strategy defined in the problem. They will be disqualified if there is evidence that they do not follow the search strategy.
  • Free search: solvers in this class are free to ignore the search strategy. All FD search solvers (and local search solvers running on a single thread) will be automatically entered in this class too.
  • Parallel search: solvers in this class are free to use multiple threads or cores to solve the problem. All entrants in the free search class (and the local search class) will be automatically entered in this class too, but they will be run in a single threaded mode.
  • Open class: This class allows the usage of portfolio solvers. Solvers in this class are free to use multiple threads or cores to solve the problem. All entrants in the parallel search class will be automatically entered in this class too.
  • Local search: This class is specific for the local search solvers. Solvers in this class are free to use multiple threads or cores to solve the problem.

    The EntryClasses.txt file included in the entry must specify which competition CLASS(es) the entry is to be entered in.

    Docker Images

    More details about Docker images and how to create a Docker image with your solver are available here.

    The Docker image with the installed solver can be run via the scripts provided inside the image as either a FlatZinc or a MiniZinc solver, i.e.,

      • FlatZinc Solver
        fzn-exec - an executable file in the image folder /entry_data that invokes a FlatZinc solver handling FlatZinc version 2.1.7. If your FlatZinc solver only handles XML-FlatZinc version 2.1.7, please contact the organizers.
        This executable will be invoked from the command line/scripts as follows:

        fzn-exec [<options>] file.fzn

        The argument file.fzn is the name of a FlatZinc 2.1.7 model instance to evaluate.
      • MiniZinc Solver
        An executable file needs to be provided in the image folder /entry_data that invokes a MiniZinc solver handling MiniZinc version 2.1.7. This executable will be invoked from the command line and/or scripts located at the image folder /minizinc as follows:

        exec [<options>] -G <mznlibdir> file.mzn [<data.dzn>]

        The arguments file.mzn and data.dzn are the names of a MiniZinc 2.1.7 model and data file, respectively. The argument -G <mznlibdir> points to the location of your solver's MiniZinc library directory containing the redefinition and global constraint files.
      The FlatZinc and MiniZinc executables should support the following command line options:
      1. -a
        • Satisfaction problems
          This causes the solver to search for, and output all solutions.
          When this option is not given the solver should search for, and output the first solution.
        • Optimisation problems
          This causes the solver to search for the first optimal solution, and output all found intermediate solutions and the first optimal solution.
          When this option is not given the solver should search for, and output the first optimal solution.
      2. -f
        When invoked with this option the solver is free to ignore any specified search strategy.
      3. -p <n>
        When invoked with this option the solver is free to use multiple threads and/or cores during search. The argument n specifies the number of cores that are available.

      Execution of solvers must not require root access.
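
      As an illustration of these options, here is a minimal Python sketch of how a test harness might assemble the command line for a FlatZinc entry (the function and the harness itself are hypothetical; only the executable path and the flags come from the rules above):

```python
# Illustrative only: how a harness might assemble the command line for a
# FlatZinc entry. The executable path /entry_data/fzn-exec and the flags
# -a, -f and -p come from the rules; the function itself is hypothetical.

def build_fzn_command(fzn_file, all_solutions=False, free_search=False, cores=1):
    """Build the argv list for invoking fzn-exec on a FlatZinc instance."""
    cmd = ["/entry_data/fzn-exec"]
    if all_solutions:
        cmd.append("-a")           # output all (intermediate) solutions
    if free_search:
        cmd.append("-f")           # solver may ignore the search strategy
    if cores > 1:
        cmd += ["-p", str(cores)]  # number of cores available
    cmd.append(fzn_file)
    return cmd

# Example: a Parallel-search-class run on 8 cores
print(build_fzn_command("model.fzn", all_solutions=True, free_search=True, cores=8))
# -> ['/entry_data/fzn-exec', '-a', '-f', '-p', '8', 'model.fzn']
```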

    1. Any solver-specific definitions of the global constraints in the MiniZinc library must be placed in the image directory /entry_data/mzn-lib.
      This directory may also contain a file named redefinitions.mzn that contains redefinitions of FlatZinc built-ins required by the solver.

    Problem Format

    The problem format will be MiniZinc 2.1.7.
    There will be some restrictions on the problems tested in MiniZinc challenge.

    1. No floats may be involved in any model. This is to avoid accuracy differences and simplify entry requirements.
    2. No variable sets of integers may be used in any model. This is to simplify entry requirements. Not even implicit var sets of int, e.g. this is forbidden:
      
          array[1..3] of set of 1..3: a = [{1,2}, {3}, {1,3}];
          var 1..3: i;
          constraint card(a[i]) > 1;
      
      
    3. In order to facilitate local search entrants, ideally a model should wrap symmetry breaking constraints in a predicate "symmetry_breaking_constraint" e.g.,
      
          var 0..100: x;
          var 0..100: y;
          constraint x + y < 144;
          constraint symmetry_breaking_constraint(x <= y);
      
      
      and redundant constraints in a predicate "redundant_constraint", e.g.,
      
          array[1..4] of var 0..20: start;
          array[1..4] of int: duration = [3, 4, 6, 7];
          array[1..4] of int: usage    = [6, 3, 5, 3];
          constraint cumulative(start, duration, usage, 10);
          constraint redundant_constraint(start[1] + duration[1] <= start[3] \/ start[3] + duration[3] <= start[1]);
      
      
    4. Each solve item must be annotated with a search strategy, such that fixing all the variables appearing in the search strategy would allow a value propagation solver to check a solution. For example,
      
               var 1..5: x;
               var 1..5: y;
               var 1..5: z;
               constraint x <= y /\ y <= z;
               solve :: int_search([x, y, z], input_order, indomain_min, complete)
                        satisfy;
      
      
      is correct but not
      
                solve :: int_search([x,z], input_order, indomain_min, complete)
                        satisfy;
      
      
      even though most FD solvers would know the second was satisfiable.
    5. Search annotations will be restricted to bool_search, int_search and seq_search.
      For bool_search and int_search only the following parameters (where applicable) will be used:
      • variable choice:
        • input_order
        • first_fail (variable with smallest domain size)
        • anti_first_fail (variable with largest domain size)
        • smallest (variable with smallest minimal value)
        • largest (variable with largest maximum value)
      • value choice: [examples for domain {1,3,4,18}]
        • indomain_min (x <= 1; x > 1)
        • indomain_max (x >= 18; x < 18)
        • indomain_median (x = 3 ; x != 3)
        • indomain_split (x <= (1+18)/2 ; x > (1+18)/2 )
        • indomain_reverse_split (x > (1+18)/2 ; x <= (1+18)/2 )
      • search style
        • complete
      Note that for variable choices ties are broken by taking the earliest variable in the given array. Also note that the decision is reassessed at each choice.
      
             var 1..5: x;
             var 1..10: objective;
             constraint x > 1 -> objective > 7;
             constraint x = 1 -> objective < 3;
             solve :: int_search([x, objective], first_fail, indomain_min, complete)
                      maximize objective;
      
      
      will first label x = 1 and eventually find the maximum value objective = 2. On backtracking to the choice x = 1 and setting x >= 2, most FD solvers will have the domains x :: 2..5 and objective :: 8..10, and this time objective will be selected as the next variable to label. A full specification of the available choices is given in the FlatZinc 1.6 specification.
    6. The objective variable must be called objective in optimisation problems, e.g. see previous example.
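
    The variable- and value-choice rules in item 5 can be sketched in Python as follows (an illustrative model of the choices, not part of MiniZinc or any solver; domains are represented as sorted lists of integers, and floor division is assumed to give the same partition as the (min+max)/2 split on integer domains):

```python
# Illustrative models of two of the search-annotation choices above; these
# helper functions are hypothetical and not part of MiniZinc or any solver.
# Domains are represented as sorted lists of integers.

def first_fail(domains):
    """first_fail variable choice: index of the unfixed variable with the
    smallest domain size; ties are broken by taking the earliest variable."""
    candidates = [i for i, d in enumerate(domains) if len(d) > 1]
    return min(candidates, key=lambda i: len(domains[i]))

def indomain_split(domain):
    """indomain_split value choice: branch x <= (min+max)/2 first, then
    x > (min+max)/2. Floor division gives the same partition on integers."""
    mid = (min(domain) + max(domain)) // 2
    return ([v for v in domain if v <= mid], [v for v in domain if v > mid])

# The example domain {1, 3, 4, 18} from the rules splits at (1+18)/2:
print(indomain_split([1, 3, 4, 18]))  # -> ([1, 3, 4], [18])
```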

    Output Requirements

    Output from entries must conform to the FlatZinc 1.6 specification. For optimization problems, if the time limit is exceeded before the final solution is printed, then the last complete approximate solution printed will be considered to be the solution for that entry. Note that it is important that entries flush the output stream after printing each approximate solution.

    Execution Environment

    During the MiniZinc Challenge 2018, all instances of the Docker images will run on Amazon EC2 M5 instances from Amazon Web Services with the following specification:

    • Image Operating System: Ubuntu 16.04.2 LTS (Xenial Xerus)
    • Processor(s): 2.5 GHz Intel Xeon® Platinum 8175
    • Model for FD, FREE: m5.xlarge (vCPU: 4; Memory: 16 GiB)
    • Model for PAR, OPEN: m5.2xlarge (vCPU: 8; Memory: 32 GiB)

    Except in the Parallel search and Open classes, and for local search solvers using multiple cores, only a single core of one processor will be used for each entrant.

    Benchmark Selection

    The benchmarks for MiniZinc Challenge 2018 (as well as for the qualification rounds) will be selected by the judges to try to examine some of the breadth of characteristics of FD solvers:

    • propagation speed
    • search speed
    • global constraints
    • satisfaction
    • optimization

    To obtain benchmarks of suitable difficulty, we will select only instances that can be solved by at least one of the participating solvers in a sensible time frame. For the qualification rounds no such restriction applies.

    In order to collect good benchmarks each entrant is strongly encouraged to submit one or two MiniZinc 2.1.7 models, making use of only the global constraints included in the MiniZinc 2.1.7 library, as well as some (preferably 20) instance files for each model. The instances should range from easy (about a minute) to hard (about 15 minutes) if possible.
    Note that the model must conform to the problem format restrictions above.

    Submitted benchmarks must be placed in the public domain.

    Initial Submission Round

    There will be an initial submission round, which will provide the organizers with an opportunity to provide feedback on entries' compatibility with the competition hardware, compliance with the FlatZinc specification and any other matters that may arise. Submission in the initial round is not required in order to qualify for the final round, but it is strongly encouraged.

    The Challenge

    The challenge will require solvers to process 100 MiniZinc models with a run-time limit of 20 minutes (process time) per problem.
    NOTE that the MiniZinc-to-FlatZinc/XML-FlatZinc translation time will be included in this limit.

    Assessment

    Each solver s is run on problem p and the following information is collected.

    • timeUsed(p,s) - the wall clock time used by the solver, or timeLimit(p) if it was still running at the timeLimit (quantized to seconds).
    • solved(p,s) - true if a correct solution is returned, or correct unsatisfiability is detected
    • quality(p,s) - the objective value of the best solution found by the solver (that is the last solution fully output before the time limit) assuming maximization
    • optimal(p,s) - whether the objective value is proved optimal by the solver.
    • timeSol(p,s,i) - the wall clock time used by the solver for finding the i-th solution on the problem
    • qualitySol(p,s,i) - the objective value of the i-th solution found by the solver on the problem

    There are three different scoring procedures: complete, incomplete, and area. For prizes, the solver ranking TBA is used.

    Complete Scoring Procedure

    The complete scoring procedure is based on the Borda count voting system. Each benchmark instance is treated like a voter who ranks the solvers, and each solver scores points roughly equal to the number of solvers it beats in the ranking. A solver s scores points on problem p by comparing its performance with each other solver s' on problem p:
    • If s gives a better answer than s' it scores 1 point.
    • If s and s' give indistinguishable answers then scoring is based on a comparison of execution times (see below).
    • If s gives a worse answer than s' it scores 0 points.
    In the case of indistinguishable answers between s and s', s scores timeUsed(p,s') / (timeUsed(p,s') + timeUsed(p,s)), or 0.5 if both finished in 0s. The exception is that if solved(p,s) is false, that is, s fails to find any solution or prove unsatisfiability for problem p, then it always scores 0 points (even if s' fails similarly).
    • Satisfaction Problem
      A solver s answer is better than solver s' answer on satisfaction problem p iff
      • solved(p,s) && not solved(p,s')
    • Optimization Problem
      A solver s is better than solver s' on optimization problem p iff
      • solved(p,s) && not solved(p,s'), or
      • optimal(p,s) && not optimal(p,s'), or
      • quality(p,s) > quality(p,s').
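
    The pairwise comparison above can be sketched in Python as follows (the result layout and the function names are illustrative assumptions; the rules text is authoritative, and quality assumes maximization as stated above):

```python
# Illustrative sketch of the pairwise comparison in the complete scoring
# procedure. The dict layout (solved/optimal/quality/timeUsed) is an
# assumption for this example; quality assumes maximization as above.

def better(a, b):
    """True iff a's answer beats b's on an optimization problem."""
    if a["solved"] != b["solved"]:
        return a["solved"]       # solving at all beats not solving
    if a["optimal"] != b["optimal"]:
        return a["optimal"]      # an optimality proof beats none
    return a["quality"] > b["quality"]

def pairwise_score(s, s_prime):
    """Points solver s earns against s_prime on one problem."""
    if not s["solved"]:
        return 0.0               # no solution or proof: always 0 points
    if better(s, s_prime):
        return 1.0
    if better(s_prime, s):
        return 0.0
    # Indistinguishable answers: split the point by relative speed.
    t, tp = s["timeUsed"], s_prime["timeUsed"]
    return 0.5 if t + tp == 0 else tp / (t + tp)

# Hypothetical results for one problem:
s1 = {"solved": True,  "optimal": True,  "quality": 10, "timeUsed": 30}
s2 = {"solved": True,  "optimal": False, "quality": 10, "timeUsed": 10}
s3 = {"solved": False, "optimal": False, "quality": 0,  "timeUsed": 1200}
print(pairwise_score(s1, s2), pairwise_score(s2, s1), pairwise_score(s3, s1))
# -> 1.0 0.0 0.0
```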

    Incomplete Scoring Procedure

    The incomplete scoring procedure is the same as the complete one using the Borda count, except that optimality proofs are not taken into account.
    • Satisfaction Problem
      A solver s answer is better than solver s' answer on satisfaction problem p iff
      • solved(p,s) && not solved(p,s')
    • Optimization Problem
      A solver s is better than solver s' on optimization problem p iff
      • solved(p,s) && not solved(p,s'), or
      • quality(p,s) > quality(p,s').

    Area Scoring Procedure

    The area scoring procedure computes the integral of a step function over the runtime horizon. Intuitively, a solver that quickly finds good solutions performs better than a solver that finds even better solutions, but much later in the solving stage. The step function f is defined as follows for a problem p and a solver s.
    • Satisfaction and Unsatisfiable Problems
      f(p,s) = timeUsed(p,s)
    • Satisfiable Minimization Problems
      f(p,s) = 0.25 * timeSol(p,s,1) + 0.5 * sum(i in 1..n)(qualitySol(p,s,i-1) * (timeSol(p,s,i) - timeSol(p,s,i-1)) ) / (UB - LB + 1) + 0.25 * timeUsed(p,s)
      where UB = max(s in Solvers)(qualitySol(p,s,1)) and LB = min(s in Solvers)(quality(p,s)).
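
    Read as a program, the minimization-case formula can be sketched in Python as follows (0-based lists hold the solutions, and the sum is taken over successive solution pairs, which is an assumed reading of the index range):

```python
# Illustrative reading of the minimization-case formula above. Solutions
# are held in 0-based lists, and the sum is taken over successive solution
# pairs; that reading of the index range is an assumption of this sketch.

def area_score(time_sol, quality_sol, time_used, ub, lb):
    """Step-function score f(p,s) for a satisfiable minimization problem.

    time_sol[i] / quality_sol[i]: time and objective of the (i+1)-th
    solution; ub and lb are the bounds over all solvers as in the rules.
    Lower scores are better: finding good solutions early keeps the
    integral of the quality step function small.
    """
    area = sum(quality_sol[i - 1] * (time_sol[i] - time_sol[i - 1])
               for i in range(1, len(time_sol)))
    return (0.25 * time_sol[0]
            + 0.5 * area / (ub - lb + 1)
            + 0.25 * time_used)

# Hypothetical run: objective 25 found at t=10s, improved to 20 at t=100s,
# time limit 1200s reached, with UB=25 and LB=20 over all solvers:
print(area_score([10, 100], [25, 20], 1200, 25, 20))  # -> 490.0
```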

    CLASSES

    The scoring calculations will be done once for each run class: FD search, Free search, Parallel search, Open class, and Local search. Note that if too few solvers are entered in a class then the challenge won't be run for that class.

    The organizers may well run entrants in the FD search class on a series of smaller problems to test that they indeed are following the given search strategy. These problems will not accrue points, but may disqualify an entry from the FD search class.

    Prizes

    The solvers will be ranked on total points awarded. There will be prizes for the solvers with the highest scores in each of the run classes: FD search, Free search, Parallel search, Open class, and Local search. The organizers may also award prizes to the best solvers on a particular category of problems. Note that if too few solvers are entered in a class then the challenge won't be run for that class and no prizes will be awarded for that class.

    Restrictions

    The organizers reserve the right to enter their own systems, or other systems of interest, in the competition; these will not be eligible for prizes but will still affect the scoring results. In addition, the organizers reserve the right not to run the challenge for classes with an insufficient number of solver entrants.


    Return to the MiniZinc Challenge 2018 home page.