OKlibrary  0.2.1.6
Evaluation.hpp File Reference

Tools for evaluating local search algorithms.



Detailed Description

Tools for evaluating local search algorithms.

Todo:
Evaluation tools for run_ubcsat
  • Functions are needed for standard evaluations.
    1. At the R-level, we go for convenience, and overloading functions like "summary" or "plot" seems to be the right way.
    2. One needs to find out how to do this, i.e., how to recognise our specific data frames (a sketch using S3 classes is given below, after this todo-list).
    3. "summary" should show data, while "plot" might run through a series of plots, showing the data from different angles.
  • Plotting an overview:
    1. Given the dataframe in E, an overview over the algorithms seems best obtained by
      plot(E$alg,E$min)

      which plots, per algorithm, a boxplot of the min-distribution.
    2. However this is of little use if not all algorithms are listed on the x-axis.
      1. So the algorithm names need to be printed vertically.
      2. See https://stat.ethz.ch/pipermail/r-help/2003-May/033400.html .
      3. And all names need to be shown.
    3. And an option cutoff_min is needed, which removes algorithms whose best min-value is greater than the cutoff-value (a sketch combining these plotting points is given below).
  • DONE (eval_ubcsat_dataframe now prints the sorted table-data) Sorting the algorithms:
    1. Considering single algorithms by e.g.
      > table(E$min[E$alg=="adaptnoveltyp"])
      (note that currently algorithm names are inappropriately handled).
    2. These tables can be put into a linear order by sorting first according to min-value reached (the lower the better), and second by count obtained (the higher the better).
    3. Also the time needed should at least be shown.
    4. A function should be written which prints out the sorted tables in a nice way.
    5. As a first attempt we have eval_ubcsat_dataframe, which just shows all results in table form (sorting needs to be added).
    6. Perhaps then the (first) evaluation tool just uses plot(E$alg,E$min), followed by printing those *sorted* tables (a sketch of such sorted printing is given below).
  • Also the average number of steps to the optimum is relevant: when it is close to the cutoff, then increasing the cutoff might well yield an improvement, while otherwise the algorithm just wanders around randomly without achieving anything (so increasing the cutoff seems of little value); a small sketch for measuring this is given below.
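
Sketches for the above (all function and class names introduced here are our own assumptions for illustration, not existing library conventions). First, the overloading of "summary" and "plot" could use R's S3 dispatch: our data frames get tagged with a class, and methods for the generics then recognise them. A minimal sketch, assuming columns alg and min, and the hypothetical class name "ubcsat_eval":

      # tag the data frame so that summary/plot dispatch on it:
      as_ubcsat_eval = function(E) {
        class(E) = c("ubcsat_eval", class(E))
        E
      }
      # "summary" shows the data (falls through to summary.data.frame):
      summary.ubcsat_eval = function(object, ...) NextMethod()
      # "plot" runs through a series of plots:
      plot.ubcsat_eval = function(x, ...) {
        plot(x$alg, x$min, ...)  # boxplots of the min-distribution per algorithm
        # further views (e.g. steps-distributions) would follow here
      }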
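
Second, a sketch for the overview-plot, with vertically printed algorithm names and the cutoff_min-option (plot_ubcsat_overview is a hypothetical name):

      plot_ubcsat_overview = function(E, cutoff_min = Inf) {
        best = tapply(E$min, E$alg, min)         # best min-value per algorithm
        keep = names(best)[best <= cutoff_min]   # remove algorithms above the cutoff
        Es = E[E$alg %in% keep, ]
        Es$alg = factor(Es$alg)                  # drop the removed levels
        op = par(las=2, mar=c(9,4,2,1))          # las=2 prints axis labels vertically
        plot(Es$alg, Es$min, xlab="", ylab="min", cex.axis=0.7)
        par(op)
      }

Shrinking cex.axis (and enlarging the bottom margin) helps to get all names shown; compare the r-help posting cited above.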
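
Third, a sketch for printing the tables in sorted order, first by the best min-value reached (the lower the better), then by its count (the higher the better); the function name is again hypothetical:

      sorted_ubcsat_tables = function(E) {
        tabs = lapply(split(E$min, E$alg), table)
        best_min = sapply(tabs, function(t) as.numeric(names(t))[1])  # lowest min reached
        best_count = sapply(tabs, function(t) as.vector(t)[1])        # count for that min
        for (a in names(tabs)[order(best_min, -best_count)]) {
          cat(a, ":\n", sep=""); print(tabs[[a]])
        }
      }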
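
Finally, for judging the cutoff, one could compare the average number of steps at which the optimum was found with the cutoff; a sketch, assuming (hypothetically) columns osteps and cutoff in the data frame:

      # ratios close to 1 suggest the algorithm was still improving near the
      # cutoff, so that increasing the cutoff might yield better min-values:
      steps_cutoff_ratio = function(E)
        sort(tapply(E$osteps / E$cutoff, E$alg, mean), decreasing=TRUE)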

Definition in file Evaluation.hpp.