Release/0.9.1 (#120)

* Sprk 261 (#61)

* refactoring gnuplot out of sparkle report generation

* Bugfixes for plot generation

* Deleting unused component

* Fixing bugs with environments (#62)

* Prepping new venv

* Fixing pcsparser installation

* Renaming file as it can now be simplified

* Environment changes

* Packaging SMAC fix for autofolio

* test fix

* Separating environments

* Simplifying user env

* Updating environments

* Environment packaging fixes

* Adding setuptools for devtools to dev env

* Sprk 288 (#63)

* Prepping new venv

* Fixing pcsparser installation

* Renaming file as it can now be simplified

* Environment changes

* Packaging SMAC fix for autofolio

* test fix

* Separating environments

* Simplifying user env

* Updating environments

* Environment packaging fixes

* Adding setuptools for devtools to dev env

* Updating the readme for new environment and installation setup

* Update

* typos

* typo

* redeleting file

* renaming file

* fixing remove temp files pathing

* minor refactoring of report generation

* Refactor selection (#65)

* Refactoring how sparkle constructs portfolio selectors

* fix for run-on arg

* fixes for slurm runs

* Reducing selector report generation run time by saving values

* refactoring report generation

* refactoring selection report generation

* moving plot generator to general latex file

* stringification of objects now happens in pdf generator

* reordering methods

* Test fixes

* test fix

* reorganising performance dataframe

* test dataframe fix

* Updating pytests

* Removing print statements

* Moving marginal contribution methods

* deleting unused file

* removing redundant print statement

* CLI test fixes

* Moving slurm id detection to relevant if

* Fixing marginal contribution test

* Updating readme (tip from hadar)

* Naming fix test data

* Renaming command

* Updating how construction works in combination with marginal contribution calculation

* bugfixes and simplifications for marginal contribution calculations

* renaming command

* Bug fixing run portfolio selector command

* Report fixes

* new test file

* Updating selection example for CLI test

* test update

* test file update

* updating test

* test data fixes

* settings fix

* Removing redundant test

* test renaming

* test fix

* Updating tests

* Fixing pytests

* replacing data for test

* Removing redundant lines from flake8 config

* top dir was included in snapshot, now removed

* removing dead test file

* removing dead test file

* test fixes

* No default selector anymore; it is now a variable

* Updating comments

* log fix

* Refactor selection (#66)

* Refactoring how sparkle constructs portfolio selectors

* fix for run-on arg

* fixes for slurm runs

* Reducing selector report generation run time by saving values

* refactoring report generation

* refactoring selection report generation

* moving plot generator to general latex file

* stringification of objects now happens in pdf generator

* reordering methods

* Test fixes

* test fix

* reorganising performance dataframe

* test dataframe fix

* Updating pytests

* Removing print statements

* Moving marginal contribution methods

* deleting unused file

* removing redundant print statement

* CLI test fixes

* Moving slurm id detection to relevant if

* Fixing marginal contribution test

* Updating readme (tip from hadar)

* Naming fix test data

* Renaming command

* Updating how construction works in combination with marginal contribution calculation

* bugfixes and simplifications for marginal contribution calculations

* renaming command

* Bug fixing run portfolio selector command

* Report fixes

* new test file

* Updating selection example for CLI test

* test update

* test file update

* updating test

* test data fixes

* settings fix

* Removing redundant test

* test renaming

* test fix

* Updating tests

* Fixing pytests

* replacing data for test

* Removing redundant lines from flake8 config

* top dir was included in snapshot, now removed

* removing dead test file

* removing dead test file

* test fixes

* No default selector anymore; it is now a variable

* Updating comments

* log fix

* removing print statement

* And two more prints

* simplifying log strings

* Name selector scenario after the solvers, not the instances

* PR

* merge fix

* remove print statement

* Refactoring sparkle_job_help out of sparkle

* Reorganising code into similar files

* Refactor run solver (#68)

* Refactoring Solver usage

* Fixing configuration example

* Fixing running a configured solver to the new setup

* Minor changes wrapping up

* Pathing fixes

* Applying new solver structure to parallel portfolio

* Improving parallel portfolio

* renaming method

* Refactor run selector (#69)

* Renaming logging

* Moving support methods to more logical files

* Deletion wrap up

* Updating selection example to work with new set up for run portfolio selector

* test fix

* test fixes

* test fix

* Updating global variables

* Bug fixes for parallel portfolio

* Bugfix settings

* Updating run parallel portfolio

* Updating to RunRunner 0.1.6

* RunRunner upgrade

* RunRunner update number 2

* Sprk 274 (#67)

* Move argument parsing from solver wrapper to help lib

* Remove debug print statement

* Missing docstring and formatting

* Update docstrings

* Update docstring v2

* Extend solver wrapper modifications to CCAG FastCA

* Extend solver wrapper changes to remaining solvers

* Changes to solver wrapper parsing

* remove dead code

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Sprk 330 (#70)

* Fixing runrunner logging

* remove dead file

* Updating github

* gitignore update

* log removal from git ignore is now in output

* Setting log dir for Commands output files from Tmp to CLI caller log

* Updating the logs for other commands and runrunner

* refactoring

* camelcase fix

* expanding clean up command

* fixing run solvers logging path

* Removing legacy code

* logging bugfixes selection

* test update

* test fix

* test fix

* Fixing print statements in validate configured vs default

* Sprk 79 (#64)

* Added machine readable configuration report

* ConfigurationOutput prototype

* Merge dev

* Output runs again and first fixes

* Fixed number_of_runs problem and added scenario to output

* Moved functions, removed ConfigurationPerformance class

* Modified output file structure

* Configuration Output includes all configurations

* Fixes to ConfigurationOutput class

* Added Selection Output

* Added settings to SelectionOutput

* Created ParallelPortfolioOutput file

* Changes to ConfigurationOutput

* Moved Output folder

* Added new Output to generate_report

* Final fixes

* Final push (?)

* Added only_json argument to generate_report

* bugfix

* replacing os.listdir with Path lib method

* fixing config scenario spaghetti

* fewer blank lines

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Sprk 325 (#72)

* reworking instance set to be a template class with specific subclasses

* Naming bugfix

* New example dataset

* template fix

* renaming data set

* First version of new example solver

* System status update

* Test update system status

* bugfix instances

* Solver adding bugfixes

* Bugfixes regarding objectives

* bugfixes

* bugfix smacoutput

* Bug fixes for solver wrapper dict parsing

* Updates solver wrapper

* Bugfix solver tools

* bugfixes for validation

* Solver wrapper bugfixes and validation update

* adding objectives to validate

* minor update for solver wrapper

* Minor validation fixes and example update

* Example update

* Minor update solver wrapper tools

* test update

* bugfix call params

* pytest fix, config scenario bugfix

* parallel portfolio bugfix

* InstanceSet bugfix

* test file update, report generation bugfixes

* Comments fixes

* ablation fix

* example update

* bugfixes report generation

* bugfix

* New version

* New version (#73)

* new distributable

* Release/0.8.5 (#76)

* New version

* new distributable

* adding pcs to sparkle to avoid setup install

* Removing pcsparser from setup to allow for pip installability

* minor bugfixes

* Update changelog

* distributable update

* Fixed comp

* More universal installation support for conda and runsolver

* Fix

* Objective refactor (#80)

* Prepping platform for objective update

* Updating objective structure

* reverse due to installation failure

* Replacing PerformanceMeasure everywhere with objective

* Updating tests and templates

* removing penalisation from report generator

* default settings update

* update for target algorithm to handle quality correctly

* update column resolution for validation

* bugfix solver parse

* Various bugfixes for new SparkleObjective

* test file update

* configurator test fix

* Test fixes and data

* test fixes

* Updating selection with new objectives

* CPU and Wall time are now default objectives that are always reported

* use CPU time for PARk

* Include memory usage measurement

* Fix type

* Fixing runsolver parsing

* Fix type

* memory bugfix

* Remove cutoff from solver output

* More generic resolving of objective string to SparkleObjectives

* objective fixes for Parallel Portfolio

* test file fixes

* test fix

* example fix

* examples update

* Documentation update

* More generic resolving of objective string to SparkleObjectives

* Update RandomForest Wrapper

* Remove outdated regex

* Fix PAR rename

* Check settings changes after CLI args parsing

* pytest fixes

---------

Co-authored-by: J. Rook <jrook94@gmail.com>

* remove

* Changelog update

* Fixes for SparkleObjective regarding parallel portfolio and run solvers

* Include cutoff to args_dict

* AutoFolio objectives fix

* Selection/Performance DataFrame objectives fix

* pytest updates

* test file update

* pytest

* data fixes for CLI tests

* Creating empty templates for missing test files

* test TODO expansion and flake8

* Test place holder expansion and bugfix

* Expanding pytests with FeatureDataFrame place holder, renaming FDF method to property

* Place holder tests for extractor and objective

* flake8

* remove redundant file

* tools test placeholders

* New test data for PDF and passthrough for print method to Pandas DF representation

* Expanding MO pytests for performance dataframe

* pytest fix

* bugfix feature dataframe property

* Data update

* bugfix documentation path

* documentation bugfixes

* Documentation bugfixes

* Bugfixes CLI docs building

* Updating changelog for release prep

* Release 0.8.6 (#81)

* Updating changelog for release prep

* marginal contribution fix

* Readme update

* Release/0.8.6 (#82)

* Updating changelog for release prep

* marginal contribution fix

* Readme update

* Versioning

* Release/0.8.6 (#84)

* Release/0.8.5 (#75)

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle; it must now be compiled by the user

* Fixing sparkle platform initialisation with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on pull request (#6)

* Update SWIG (#5)

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found a bug in the SMAC target algorithm: it was using the wrong runsolver output file to detect runtime.

* Updating solver wrapper to be able to read configurations from file

* Bugfixes SMAC wrappers

* Reverting sparkle settings to original

* Reverting slurm settings to original

* Moving directories to unified output directory

* Correcting file structure

* Refactoring, flake8

* Wrapping up file organisation

* Refactoring code to work with new implementation

* Bug fixes

* Fixing code for report generation

* flake8

* Fixing solver input for validate after config to match the number of configurations

* Attempted bug fix

* Bugfix for ablation?

* Bugfix ablation

* Refactoring configure_solver_help content

* Renaming to make a closer match to current contents

* description rename

* Fixing rename for imports

* flake8

* pytest fixes

* Fixing variables

* pytest fix

* Pytest fix

* Pytest fix

* pytest fixes

* Fix

* Pytest fix

* removing dead code from test

* Final fixes

* bugfix

* pytest fix

* Removing smac references from global variables

* Removing hard-coded smacv2 references from Sparkle altogether

* PR changes

* Minor fixes

* flake8

* Minor fixes

* Forgot to check in settings file

* Bugfixes for ablation

* SPRK-88 (#12)

* Creating unified file for CLI arguments

* Aggregate all CLI arguments into argparse_custom

* Made CLI programs use centralised ArgumentContainers

* Fixed SolverPathArgument

* Sprk 76 (#7)

* Added cpu_time & solver_calls to configuration budget (missing from run_ablation & get_smac_settings)

* Fixed some flake8 and pytest errors

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle; it must now be compiled by the user

* Fixing sparkle platform initialisation with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on pull request (#6)

* Update SWIG (#5)

* Small flake8 fix

* Small fixes

* Small fixes

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found a bug in the SMAC target algorithm: it was using the wrong runsolver output file to detect runtime.

* Updating solver wrapper to be able to read configurations from file

* Bugfixes SMAC wrappers

* Reverting sparkle settings to original

* Reverting slurm settings to original

* Moving directories to unified output directory

* Correcting file structure

* Refactoring, flake8

* Wrapping up file organisation

* Refactoring code to work with new implementation

* Bug fixes

* Fixing code for report generation

* flake8

* Fixing solver input for validate after config to match the number of configurations

* Attempted bug fix

* Bugfix for ablation?

* Bugfix ablation

* Refactoring configure_solver_help content

* Renaming to make a closer match to current contents

* description rename

* Fixing rename for imports

* flake8

* pytest fixes

* Fixing variables

* pytest fix

* Pytest fix

* Pytest fix

* pytest fixes

* Fix

* Pytest fix

* removing dead code from test

* Final fixes

* bugfix

* pytest fix

* Removing smac references from global variables

* Removing hard-coded smacv2 references from Sparkle altogether

* PR changes

* Minor fixes

* flake8

* Minor fixes

* Forgot to check in settings file

* Bugfixes for ablation

* Changed Default CPU value

* Added alias to userguide & settings

* small fix

* corrected method name

* Set solver_calls and cpu_time default to None

* flake8 hates me

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>
Co-authored-by: Thijs Snelleman <32924404+thijssnelleman@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Split PerformanceMeasureArgument in two and remove current value information for certain CLI arguments

* Wrap-up of restructuring argparse arguments

* Actual implementation of SPRK-88

* Small fix to generate_report

* Removing print statement, correcting my own mistakes for argsv

* Formatting string without + operator

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>
Co-authored-by: Brian Schiller <54008766+BrianSchiller@users.noreply.github.com>
Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>
Co-authored-by: Thijs Snelleman <32924404+thijssnelleman@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Fix typo (#10)

Minor change

* bugfix in dev

* SPRK-120 (#15)

* Implementation of SPRK-120

* Change requests

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Added .kwargs to argparse argument

* Replaced * with ** for kwargs

* Adding Aaron to contributors list

* SPRK-128: Rename clips_per_node (#16)

* Rename clis_per_node

* Requests

* Adding Aaron to contributors list

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>
Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Fixing permissions

* Fixing bug in remove solver, renaming global_variables symbol from sgh (sparkle_global_help) to gv (#14)

* SPRK-55 (#13)

* Fixes for removing instances from Sparkle through CLI

* Intermediate check

* ignoring runsolver file from compilation

* Fixing all nicknames in commands for configuration

* bugfix from different ticket

* Refactoring and print improvement

* Bugfix feature data frame

* Refactoring useless/dead code

* bug fix args branch

* nickname support run sparkle portfolio selector

* Nickname support for generate report

* flake8

* Simplified extractor nicknaming code

* Simplifying remove instances nickname

* Simplifying remove solver

* Refactoring

* flake8

* Bugfix

* sh test fix

* Preventing the nickname handler from crashing when args are None

* test fix

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Sprk 80 - [M] Add pcs validity and running check to wrappers (#17)

* Add check argument

* Merge duplicate check pcs functions

* Add pcsparser

* Start with running an instance to check if the wrapper runs

* Repo change

* Do checks before adding. Change executable check return type

* Merge and comment on wrong executable check

* fixes

* flake8

* environment fix

* Changelog

* PR comments fixes

* Move messages from reading the pcs files to the cli part

* Sprk 285 (#19)

* Fixed check_settings_changes KeyError

* Finishing SPRK-285

* Revert run_on option strings to lowercase again

* Resolved flake8 issues

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Sprk 220 (#9)

* Add option to set autofolio's timeout

* fixup! Add option to set autofolio's timeout

* Fix autofolio timeout

* Minor fixes selection example

* Updating docs for runsolver (#26)

* Docs bugfix

* Update about usage for docs

* typo

* trying to fix docs compilation

* Bugfixes for docs gen

* removing print statement

* Updating requirements

* Sprk 102 (#25)

* Remove redundant code

* Added default extractor wrapper

* Modified Wrapper

* Implemented new wrapper in add_feature_extractor

* rewrote extractor wrapper code

* Fixed flake8 mistakes

* Some fixes (mostly path related)

* Requested Changes

* Updates

* Few more changes

* Removing dead imports

* Implemented changes to extractor wrapper

* Extractor now works

* Removed print statements

* Minor code layout changes

* standardising

* Wrapping up wrapper changes for extractors

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Improve status monitoring with logfile status (#28)

* First implementation of monitoring with logfile status

* Formatting and variable renaming

* More formatting

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Sprk 269 (#18)

* refactor

* Removing construct parallel portfolio from Sparkle

* flake8

* Removing traces of construct parallel portfolio

* flake8

* Adding in solver wrappers for solvers with outdated wrappers

* Bugfixes

* Big changes

* wrapping up report refactoring

* flake8 adapt

* Clean up

* Moving code to more logical place

* flake8

* bugfix from refactor

* minor refactor

* flake8 fixes

* Updating parallel portfolio arguments/settings

* Renaming the command

* bugfix

* Moving arguments to argparse custom

* Bugfix hadar

* Removing dead variables

* Refactoring variables pt 2

* Refactoring gv method

* Bugfixes

* Load snapshot bugfix

* remove feature extractor bugfix

* Minor bug fixes

* flake8

* CLI pass through comments fix

* Bug fixes

* about test fix

* About fix

* Bugfix report generation

* Forgotten print statement

* Reverting refactoring

* Updating CLI test

* Fixing bug

* Final bug fix for test

* Removing dead files

* Bugfixes remove instances CLI test

* remove multi instances bugfix

* CLI test fixes

* Refactoring SMAC

* Refactoring test bugs fix

* Updating changelog

* Bugfix wrapper, updating changelog, minor refactoring

* Minor bug fix solver wrapper

* Bug fix validator

* Bug fixes in compute features / selection example

* Removing useless print statement

* flake8 fix

* Removing dead variable

* Adding file lock for selector writing to performance data csvs

* flake8

* Report bugfixing

* Bug fixes report

* Changelog + bugfix

* bugfix feature extractor removal (#29)

* Release/0.8.2 (#31)

* Sprk 76 (#7)

* Added cpu_time & solver_calls to configuration budget (missing from run_ablation & get_smac_settings)

* Fixed some flake8 and pytest errors

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle; it must now be compiled by the user

* Fixing sparkle platform initialisation with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on pull request (#6)

* Update SWIG (#5)

* Small flake8 fix

* Small fixes

* Small fixes

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found a bug in the SMAC target algorithm: it was using the wrong runsolver output file to detect runtime.

* Updating solver wrapper to be able to read configurations from file

* Bugfixes SMAC wrappers

* Reverting sparkle settings to original

* Reverting slurm settings to original

* Moving directories to unified output directory

* Correcting file structure

* Refactoring, flake8

* Wrapping up file organisation

* Refactoring code to work with new implementation

* Bug fixes

* Fixing code for report generation

* flake8

* Fixing solver input for validate after config to match the number of configurations

* Attempted bug fix

* Bugfix for ablation?

* Bugfix ablation

* Refactoring configure_solver_help content

* Renaming to make a closer match to current contents

* description rename

* Fixing rename for imports

* flake8

* pytest fixes

* Fixing variables

* pytest fix

* Pytest fix

* Pytest fix

* pytest fixes

* Fix

* Pytest fix

* removing dead code from test

* Final fixes

* bugfix

* pytest fix

* Removing smac references from global variables

* Removing hard-coded smacv2 references from Sparkle altogether

* PR changes

* Minor fixes

* flake8

* Minor fixes

* Forgot to check in settings file

* Bugfixes for ablation

* Changed Default CPU value

* Added alias to userguide & settings

* small fix

* corrected method name

* Set solver_calls and cpu_time default to None

* flake8 hates me

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>
Co-authored-by: Thijs Snelleman <32924404+thijssnelleman@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Fixing execute permission on example

* bugfix feature extractor removal

---------

Co-authored-by: Brian Schiller <54008766+BrianSchiller@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Quick fix for documentation generation

* Sprk 292 (#33)

* Change documentation template

* added theme to requirements

* Fixed Index page of documentation

* Included github action file

* Test GitHub Actions

* test v2

* test 3

* test 4

* test 5 (?)

* test 6

* test7

* test 8

* test 9

* test 10 ?

* another one

* New approach

* check

* spelling?

* this is getting sad

* lesgo

* wild test

* Polishing

* Deleted build files to try something out

* Changed the workflow to run on push on main

* small fixes

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* SPRK-275 (#34)

* refactoring the random date time stamp getter

* Forgot one file

* Moving ablation help to its correct location

* This file shouldn't exist anymore since last version, weird

* Redundant code removal?

* Refactoring useless code

* Removing dead code

* Moving file to correct spot

* Refactoring, moving files

* Reforming slurm options

* refactoring + test fixes

* Refactoring solver object usage in Sparkle

* test bug fixes

* Refactoring solver deterministic parameter

* Updating args

* Moving compute features help to correct file

* code layout

* Bug fixes

* Getting rid of CSV merge by using file locks instead

* Removing dead file

* flake8

* Removing unused command enums

* Refactoring run configured solver help

* Renaming file to be more representative

* Moving to more suitable folder

* Simplifying run solvers help by making more use of the Solver object

* Removing dead code

* Refactoring default wrapper out of sparkle

* Updating user guide

* updating the template

* bug fix in results parsing

* Setting example to run in parallel by default

* bugfix adding solver

* Updating printing info from add solver to be more compact and accurate

* Factoring settings usage out of performance data frame

* Final bug fixes for refactoring global variables out of performance data frame

* Updated comments

* Pytest fixes

* test fix

* Removing dead code

* flake8

* test fix

* Refactoring feature data csv, bug fixes

* Bug fixes for Performance dataframe

* bugfix performance dataframe

* Simplifying Solver imports

* Moving missing value into the data structures

* Removing dead import

* Removing sparkle version from global variables

* Removing dead import

* Test fix

* removing print statement

* Refactoring instance set NAME references to PATH for SMAC configurator class

* flake8

* Fixing pytests

* Removing extra spaces

* refactoring instances help

* reverting change due to complexity

* refactoring global variables out of instances_help

* Refactoring global variables out of configurator

* pytest fix

* Moving CLI related files

* Refactoring gv from file help, bug fixes

* restoring file

* Bugfixes

* flake8 + bugfix

* Refactoring variables

* Setting string variables to Path

* Unused import removal

* Setting path to performance data csv as path object instead of str

* Factoring out duplicate code

* Updated comment

* dead arguments, flake8

* fixing path variables to actually be paths

* Fixing tests, str to path objects

* bugfixes

* Factoring global variables out of report generation

* renaming file

* pytest fix

* flake8

* Moving check results to generate report command

* Removing last run test as it's already being resolved by the reportingscenario object

* Removing get num instances as it can be deduced from the validator data

* Removing methods only referenced once

* Refactoring configuration report generator with GV, moving class out of GV

* flake8

* pytest fixes

* flake8

* removing gv references from parallel portfolio report

* flake8

* minor fixes for selection reports

* Refactoring selection report

* selection example fix

* refactoring gv from selection

* Refactoring performance dataframe usage

* Removing global variable references from selection

* Flake8

* forgotten print

* pytest fix

* removing gv from settings help

* Bugfix

* Removing solver references as str

* Fixing bug, giving test a better name

* bugfix

* Mistake

* Bugfix

* flake8

* Removing --parallel option from compute features (This should be done using sparkle settings)

* Removing unused method

* Renaming method to be more clear

* Removing dead argument from add solver

* Moving add solver argument to argparse custom

* Refactoring --parallel argument out of Sparkle

* Flake8

* bugfix settings

* Bug fix settings

* Removing duplicate code and renaming

* flake8 fixes

* test fix

* Removing nearly unused variables from gv

* Bugfixes

* test fix

* Flake8

* Moving slurm settings to settings class

* flake8

* Refactoring slurm settings into sparkle settings

* Correcting method variable to Path

* Bugfix for settings

* bugfix config report generation

* test fixes

* Removing "in parallel" from sparkle prints

* Parallel portfolio fixes and improvements

* Bugfix settings writing

* Removing useless write statement

* double spaces in print statement

* Unnecessary path conversion removal

* Removing feature data tmp dir

* Bugfix portfolio selector

* Sprk 308 (#35)

* Setting up base shared class of extractor and solver

* First step towards an OO extractor

* flake8

* SPRK-306 (#36)

* Update to check_settings_changes to verify if sections in the settings file were added or removed

* Changed lists to sets and formatting

* More descriptive variable naming

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Add entry to CONTRIBUTING.md regarding the custom git clean command

* Sprk 271 (#37)

* Base setup for new instances object

* Preparing configuration to run with instances object instead of paths/str

* Fixing up Sparkle with new Instances object

* bugfix

* Reworking deterministic into flag

* Bug fix in pathing

* Using instance object for ablation

* Minor fixes for ablation report generation

* Fixing run configured solver with new object

* flake8

* Fixing test of dead code

* bug fixes instances

* flake8

* Enabling instance set to contain instance

* Renaming variable

* Test fix

* flake8

* Fixing run portfolio selector with more elegant code

* Fixing arguments for parallel portfolio

* Bugfix

* Removing unused code

* Prepping to test multi-file instance

* Updating extractor

* Renaming wrapper

* Fixing multi instance extractor

* Fixing bugs in adding multi instance extractor

* Fixing up add instances

* Bugfixes add instances

* Fixing up example, enabling solver wrapper to receive lists

* Adding sparkle solver wrapper to TCA

* Expanding compute features to work with multi-file instances

* Forgot to save data structures after modification

* Updating compute features to work with multi instance

* Bugfixes for compute features multi-file instances

* Fixing multi instance examples

* Cleaning up compute features core

* flake8

* Quickfix for multi-instance run solvers

* Removing print statement

* Removing old wrappers

* Updating remove instances with new object

* minor bug fixes

* flake8

* Removing unused file

* Removing system status file getter

* Refactoring file_help

* Refactoring argument

* Fixing run configured solver to use instance sets

* Removing dead arguments, unnecessary path conversions

* Making sure Sparkle uses solver object

* Fixing pytest files

* Bugfix solver wrapper

* Minor bug fixes

* Bugfixes for running solvers

* Cleaning up variable types for generate report config

* removing print statement

* Bugfixes data setting for performance dataframe

* bugfix reporting scenario

* Bugfix report generation parallel portfolio

* Bugfix parallel portfolio report generation

* flake8

* bugfix reports

* Sprk 202 (#39)

* Removing unused configurator files in test

* Adding pytests

* Fixing up configurator pytests

* flake8

* Bugfixes for getting optimal configuration

* Removing special string references from sparkle

* Add default value for the run-on argument (#38)

* Add standard value for the run-on argument which can be overridden via the command line

* Make get_run_on() return enum instead of string

* Write run-on argument as a string to the latest.ini file

* Last commit was missing one file

* Small fix

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Starting to refactor how feature data works in sparkle

* Restructuring feature data frame

* Renaming file and setting up package init

* renaming file

* Reworking feature extractor workings with feature data frame

* Prepping feature extractor wrappers

* Preparing feature extractors part 2

* bugfixes for adding/removing instances from fdf

* Bugfixes for adding/removing extractors

* Updating extractor to handle I/O better

* Fixing feature computations to work with new wrapper set up

* Fixing featuredataframe imports

* Removing dead variables

* Correcting settings

* Updating manual

* Bugfixes for how to compute remaining feature jobs

* Added comment

* wrapper fix

* Bug fixes running selector on single instance

* Deleting now unused file

* Test fix

* Unused code removal

* Flake8

* Updating remaining jobs format

* pytest fix

* Removing unused code

* Flake8

* bugfixes

* Use extractor object in compute features core

* Minor code fixes

* Bugfixes regarding runsolver

* recompute bugfix

* Fixing up extractor wrapper

* Fixing comments for featuredataframe

* Bugfix for multi file instances

* flake8

* Renaming file

* bugfix running extractor

* Test fix

* Removing old configuration example

* Replacing old config, deleting old validation test data

* New validation example data

* Test update

* Adding satzilla2024

* Prepping SATZilla2024 example

* Prepping Sparkle for new extractor

* Moving compute features into the CLI command file

* updating extractors

* Updating example execution rights

* adding description

* Updating 2024 wrapper

* missing executable

* Don't use the last feature

* chmod +x

* Feature group preparation

* Description update

* Setting execution permissions

* Adding feature mapping

* Compilation update SATZilla2024

* Updating example

* Execution rights

* Fixing wrapper

* Bugfixes feature imputation

* bugfix get feature groups

* Adding new instance set

* Updating selection example

* renaming instances new set

* Bugfix runsolver output parsing

* flake8

* minor fix

* Update description_SAT-features-competition2024.txt

* Adding main to avoid floating code

* Fixing settings options without SMAC section

* Removing dead code

* Settings fix

* minor fixes

* Fixing last Solver object omission

* Status info fix

* Sprk 252 (#40)

* Change documentation template

* added theme to requirements

* Fixed Index page of documentation

* Included github action file

* Test GitHub Actions

* test v2

* test 3

* test 4

* test 5 (?)

* test 6

* test7

* test 8

* test 9

* test 10 ?

* another one

* New approach

* check

* spelling?

* this is getting sad

* lesgo

* wild test

* Polishing

* Deleted build files to try something out

* Changed the workflow to run on push on main

* small fixes

* Sparkle wait now shows a table

* Added output-verbosity to settings

* sparkle wait now also shows partition

* Restructuring. Removed Extensive, renamed REDUCED to QUIET, removed partition

* Finished new wait command

* Flake is killing me

* Moved verbosity from output to general (settings)

* Added check interval to settings

* Jobs are sorted based on status

* Fixed flake8

* Added TEXT class and implemented in sparkle wait

* Minor changes

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Sprk 283 (#42)

* Print update

* Refactoring

* fixing comments

* Bug fixes dataframes

* first implementation design of selector OO

* refactoring file

* Refactoring sparkle to use selector OO solution

* flake8

* Removing multiple file creation from perfect selector computation

* refactoring copy operations out of marginal contribution

* performance dataframe fixes

* Changing references to Settings file

* Forgot a few

* Testing changing reference to gv.settings to settings

* Refactoring usage of marginal contribution

* flake8

* Updating test

* bugfix

* Fixing verbosity

* Flake8 + pytest fixes

* bugfix

* pytest mock fixes

* more useful error printing for selector

* test fix

* bugfixes

* bugfixes

* reordering tests

* flake8

* new test files

* Removing dead command

* Removing dead test

* New RunRunner version

* Test fixes for new runrunner version

* Showing new column based on runrunner update in sparkle wait

* fix for pytest

* Sprk 160 (#44)

* Prepping tests to all use a different scenario file

* Fixing tests to always use specific test set

* Removing dead variables from global variables

* Minor changes ablation

* reducing git ignore

* Removing logging from Platform (CLI functionality)

* Removing non-CLI references to global variables, moving sparkle log

* settings mistake

* Creating init for CLI packages

* Pre release test branch (#45)

* Trying to move global variables

* init flake8

* flake8

* Moving tools to be part of Sparkle lib

* Updating setup to not include the tests

* renaming latex dir

* Creating manifest, updating setup.py

* flake8

* Moving Components into sparkle dir to prepare workable pip install

* Update pull-request.yml (#49)

* Update pull-request.yml

* Update pull-request.yml

* SPRK 151 (#46)

* Flake8 pre-commit

* Pre commit test

* Disable test for pre commit

* Updating change log and contributing

* Reducing test file bloat

* Settings mistake

* Removing unused test files

* Unused test files

* Sprk 320 (#50)

* Fixing bugs in Ablation

* First class design, bug fixes, flake8

* Refactoring ablation object usage

* fixing pytests

* Moving ablation

* Bug fixes for report generation

* Bugfixes ablation, refactoring, fixing timezone on smac target algorithm

* Adding example download capability to initialise command

* Simplifying + comment for init

* Updating changelog

* Simplifying wait command

* Minor variable name and comments fixes

* Removing dead variables from global var

* Fixes regarding portfolio selector methods

* Renaming symbols

* Minor refactoring

* Minor refactoring

* Refactoring method definitions

* Bugfixes for initialise

* Bugfix init

* Removing excess test files

* Moving CLI tests to tests folder

* Removing unnecessary methods

* Refactoring function definitions

* Simplifying sat_help methods

* sat verify bug fix

* Old code reversal

* Refactoring snapshot

* snapshot bugfix

* Logger typing fixes

* Minor fixes

* Refactoring file help

* Redundant string removal

* Fixes

* Sprk 295 (#51)

* Creating parser for solver wrapper output

* Change solver status from string to enum

* Completed docstring

* Merge dev into SPRK-295

---------

Co-authored-by: ngcp1 <noahgcp@gmail.com>

* Remove the exclusive flag when running the portfolio selector (#52)

* Moving CLI to a better place

* Bug fix for seed being None for solver

* Bugfix run sparkle portfolio core

* Bugfixes portfolio selector

* Version 0.8.3 release (#53)

* Sprk 76 (#7)

* Added cpu_time & solver_calls to configuration budget (missing from run_ablation & get_smac_settings)

* Fixed some flake8 and pytest errors

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle; it must now be compiled by the user

* Fixing sparkle platform initialisation with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on pull request (#6)

* Update SWIG (#5)

* Small flake8 fix

* Small fixes

* Small fixes

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found a bug in the SMAC target algorithm: it was using the wrong runsolver output file to detect runtime.

* Updating solver wrapper to be able to read configurations from file

* Bugfixes SMAC wrappers

* Reverting sparkle settings to original

* Reverting slurm settings to original

* Moving directories to unified output directory

* Correcting file structure

* Refactoring, flake8

* Wrapping up file organisation

* Refactoring code to work with new implementation

* Bug fixes

* Fixing code for report generation

* flake8

* Fixing solver input for validate after config to match the number of configurations

* Attempted bug fix

* Bugfix for ablation?

* Bugfix ablation

* Refactoring configure_solver_help content

* Renaming to make a closer match to current contents

* description rename

* Fixing rename for imports

* flake8

* pytest fixes

* Fixing variables

* pytest fix

* Pytest fix

* Pytest fix

* pytest fixes

* Fix

* Pytest fix

* removing dead code from test

* Final fixes

* bugfix

* pytest fix

* Removing smac references from global variables

* Removing hard-coded smacv2 references from Sparkle altogether

* PR changes

* Minor fixes

* flake8

* Minor fixes

* Forgot to check in settings file

* Bugfixes for ablation

* Changed Default CPU value

* Added alias to userguide & settings

* small fix

* corrected method name

* Set solver_calls and cpu_time default to None

* flake8 hates me

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>
Co-authored-by: Thijs Snelleman <32924404+thijssnelleman@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Fixing execute permission on example

* Release/0.8.2 (#30)

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle; it must now be compiled by the user

* Fixing sparkle platform initialisation with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on pull request (#6)

* Update SWIG (#5)

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found a bug in the SMAC target algorithm: it was using the wrong runsolver output file to detect runtime.

* Updating solver wrapper to be able to read configurations from file

* Bugfixes SMAC wrappers

* Reverting sparkle settings to original

* Reverting slurm settings to original

* Moving directories to unified output directory

* Correcting file structure

* Refactoring, flake8

* Wrapping up file organisation

* Refactoring code to work with new implementation

* Bug fixes

* Fixing code for report generation

* flake8

* Fixing solver input for validate after config to match the number of configurations

* Attempted bug fix

* Bugfix for ablation?

* Bugfix ablation

* Refactoring configure_solver_help content

* Renaming to make a closer match to current contents

* description rename

* Fixing rename for imports

* flake8

* pytest fixes

* Fixing variables

* pytest fix

* Pytest fix

* Pytest fix

* pytest fixes

* Fix

* Pytest fix

* removing dead code from test

* Final fixes

* bugfix

* pytest fix

* Removing smac references from global variables

* Removing hard-coded smacv2 references from Sparkle altogether

* PR changes

* Minor fixes

* flake8

* Minor fixes

* Forgot to check in settings file

* Bugfixes for ablation

* SPRK-88 (#12)

* Creating unified file for CLI arguments

* Aggregate all CLI arguments into argparse_custom

* Made CLI programs use centralised ArgumentContainers

* Fixed SolverPathArgument

* Sprk 76 (#7)

* Added cpu_time & solver_calls to configuration budget (missing from run_ablation & get_smac_settings)

* Fixed some flake8 and pytest errors

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle; it must now be compiled by the user

* Fixing sparkle platform initialisation with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on pull request (#6)

* Update SWIG (#5)

* Small flake8 fix

* Small fixes

* Small fixes

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found a bug in the SMAC target algorithm: it was using the wrong runsolver output file to detect runtime.

* Updating solver wrapper to be able to read configurations from file

* Bugfixes SMAC wrappers

* Reverting sparkle settings to original

* Reverting slurm settings to original

* Moving directories to unified output directory

* Correcting file structure

* Refactoring, flake8

* Wrapping up file organisation

* Refactoring code to work with new implementation

* Bug fixes

* Fixing code for report generation

* flake8

* Fixing solver input for validate after config to match the number of configurations

* Attempted bug fix

* Bugfix for ablation?

* Bugfix ablation

* Refactoring configure_solver_help content

* Renaming to make a closer match to current contents

* description rename

* Fixing rename for imports

* flake8

* pytest fixes

* Fixing variables

* pytest fix

* Pytest fix

* Pytest fix

* pytest fixes

* Fix

* Pytest fix

* removing dead code from test

* Final fixes

* bugfix

* pytest fix

* Removing smac references from global variables

* Removing hard-coded smacv2 references from Sparkle altogether

* PR changes

* Minor fixes

* flake8

* Minor fixes

* Forgot to check in settings file

* Bugfixes for ablation

* Changed Default CPU value

* Added alias to userguide & settings

* small fix

* corrected method name

* Set solver_calls and cpu_time default to None

* flake8 hates me

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>
Co-authored-by: Thijs Snelleman <32924404+thijssnelleman@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Split PerformanceMeasureArgument in two and remove current value information for certain CLI arguments

* Wrap-up of restructuring argparse arguments

* Actual implementation of SPRK-88

* Small fix to generate_report

* Removing print statement, correcting my own mistakes for argsv

* Formatting string without + operator

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>
Co-authored-by: Brian Schiller <54008766+BrianSchiller@users.noreply.github.com>
Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>
Co-authored-by: Thijs Snelleman <32924404+thijssnelleman@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Fix typo (#10)

Minor change

* bugfix in dev

* SPRK-120 (#15)

* Implementation of SPRK-120

* Change requests

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Added .kwargs to argparse argument

* Replaced * with ** for kwargs

* Adding Aaron to contributors list

* SPRK-128: Rename clips_per_node (#16)

* Rename clis_per_node

* Requests

* Adding Aaron to contributors list

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>
Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Fixing permissions

* Fixing bug in remove solver, renaming global_variables symbol from sgh (sparkle_global_help) to gv (#14)

* SPRK-55 (#13)

* Fixes for removing instances from Sparkle through CLI

* Intermediate check

* ignoring runsolver file from compilation

* Fixing all nicknames in commands for configuration

* bugfix from different ticket

* Refactoring and print improvement

* Bugfix feature data frame

* Refactoring useless/dead code

* bug fix args branch

* nickname support run sparkle portfolio selector

* Nickname support for generate report

* flake8

* Simplified extractor nicknaming code

* Simplifying remove instances nickname

* Simplifying remove solver

* Refactoring

* flake8

* Bugfix

* sh test fix

* Preventing the nickname handler from crashing when args are None

* test fix

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Sprk 80 - [M] Add pcs validity and running check to wrappers (#17)

* Add check argument

* Merge duplicate check pcs functions

* Add pcsparser

* Start with running an instance to check if the wrapper runs

* Repo change

* Do checks before adding. Change executable check return type

* Merge and comment on wrong executable check

* fixes

* flake8

* environment fix

* Changelog

* PR comments fixes

* Move messages from reading the pcs files to the cli part

* Sprk 285 (#19)

* Fixed check_settings_changes KeyError

* Finishing SPRK-285

* Revert run_on option strings to lowercase again

* Resolved flake8 issues

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Sprk 220 (#9)

* Add option to set autofolio's timeout

* fixup! Add option to set autofolio's timeout

* Fix autofolio timeout

* Minor fixes selection example

* Updating docs for runsolver (#26)

* Docs bugfix

* Update about usage for docs

* typo

* trying to fix docs compilation

* Bugfixes for docs gen

* removing print statement

* Updating requirements

* Sprk 102 (#25)

* Remove redundant code

* Added default extractor wrapper

* Modified Wrapper

* Implemented new wrapper in add_feature_extractor

* rewrote extractor wrapper code

* Fixed flake8 mistakes

* Some fixes (mostly path related)

* Requested Changes

* Updates

* Few more changes

* Removing dead imports

* Implemented changes to extractor wrapper

* Extractor now works

* Removed print statements

* Minor code layout changes

* standardising

* Wrapping up wrapper changes for extractors

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Improve status monitoring with logfile status (#28)

* First implementation of monitoring with logfile status

* Formatting and variable renaming

* More formatting

---------

Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>

* Sprk 269 (#18)

* refactor

* Removing construct parallel portfolio from Sparkle

* flake8

* Removing traces of construct parallel portfolio

* flake8

* Adding in solver wrappers for solvers with outdated wrappers

* Bugfixes

* Big changes

* wrapping up report refactoring

* flake8 adapt

* Clean up

* Moving code to more logical place

* flake8

* bugfix from refactor

* minor refactor

* flake8 fixes

* Updating parallel portfolio arguments/settings

* Renaming the command

* bugfix

* Moving arguments to argparse custom

* Bugfix hadar

* Removing dead variables

* Refactoring variables pt 2

* Refactoring gv method

* Bugfixes

* Load snapshot bugfix

* remove feature extractor bugfix

* Minor bug fixes

* flake8

* CLI pass through comments fix

* Bug fixes

* about test fix

* About fix

* Bugfix report generation

* Forgotten print statement

* Reverting refactoring

* Updating CLI test

* Fixing bug

* Final bug fix for test

* Removing dead files

* Bugfixes remove instances CLI test

* remove multi instances bugfix

* CLI test fixes

* Refactoring SMAC

* Refactoring test bugs fix

* Updating changelog

* Bugfix wrapper, updating changelog, minor refactoring

* Minor bug fix solver wrapper

* Bug fix validator

* Bug fixes in compute features / selection example

* Removing useless print statement

* flake8 fix

* Removing dead variable

* Adding file lock for selector writing to performance data csvs

* flake8

* Report bugfixing

* Bug fixes report

* Changelog + bugfix

* bugfix feature extractor removal

---------

Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>
Co-authored-by: Noah Peil <151151647+ngcp1@users.noreply.github.com>
Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>
Co-authored-by: Brian Schiller <54008766+BrianSchiller@users.noreply.github.com>
Co-authored-by: Koen van der Blom <5031234+kvdblom@users.noreply.github.com>
Co-authored-by: Jeroen Rook <jrook94@gmail.com>

* Version change

* Small fixes

---------

Co-authored-by: Brian Schiller <54008766+BrianSchiller@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>
Co-authored-by: Noah Peil <151151647+ngcp1@users.noreply.github.com>
Co-authored-by: Noah Peil <noah.peil@rwth-aachen.de>
Co-authored-by: Koen van der Blom <5031234+kvdblom@users.noreply.github.com>
Co-authored-by: Jeroen Rook <jrook94@gmail.com>

* Release/0.8.3 (#55)

* Sprk 76 (#7)

* Added cpu_time & solver_calls to configuration budget (missing from run_ablation & get_smac_settings)

* Fixed some flake8 and pytest errors

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle, must now be compiled by user

* Fixing sparkle platform intialization with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on on pull request (#6)

* Update SWIG (#5)

* Small flake8 fix

* Small fixes

* Small fixes

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found bug in smac target algorithm, was using the wrong runsolver output file to detect runtime.

* Updating solver wrapper to be able to read configurations from file

* Bugfixes SMAC wrappers

* Reverting sparkle settings to original

* Reverting slurm settings to original

* Moving direcotries to unified output directory

* Correcting file structure

* Refactoring, flake8

* Wrapping up file organisation

* Refactoring code to work with new implementation

* Bug fixes

* Fixing code for report generation

* flake8

* Fixing solver input for validate after config to match amount of configurations

* Attempted bug fix

* Bugfix for ablation?

* Bugfix ablation

* Refactoring configure_solver_help content

* Renaming to make a closer match to current contents

* description rename

* Fixing rename for imports

* flake8

* pytest fixes

* Fixing variables

* pytest fix

* Pytest fix

* Pytest fix

* pytest fixes

* Fix

* Pytest fix

* removing dead code from test

* Final fixes

* bugfix

* pytest fix

* Removing smac references from global variables

* Removing hard coded smacv2 references from Sparkle all together

* PR changes

* Minor fixes

* flake8

* Minor fixes

* Forgot to check in settings file

* Bugfixes for ablation

* Changed Default CPU value

* Added alias to userguide & settings

* small fix

* corrected method name

* Set solver_calls and cpu_time default to None

* flake8 hates me

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>
Co-authored-by: Thijs Snelleman <32924404+thijssnelleman@users.noreply.github.com>
Co-authored-by: Hadar Shavit <shavit@aim.rwth-aachen.de>

* Fixing execute permission on example

* Release/0.8.2 (#30)

* Refactor

* Updating Runsolver and placing compilation responsibility in platform initialisation (#3)

* Updating runsolver

* Fixed runsolver make

* Updating runsolver and gitignore structure for local compilation

* Removing runsolver as part of Sparkle, must now be compiled by user

* Fixing sparkle platform intialization with Runsolver compilation

* Refactoring, flake8

* Fixes for instant compilation of runsolver (Conda install of required c packages and necessary path modding)

---------

Co-authored-by: Thijs Snelleman <snelleman@aim.rwth-aachen.de>

* Run workflow on on pull request (#6)

* Update SWIG (#5)

* Sprk 270 (#8)

* temp design check in

* Flake8 fixes

* Refactoring validator as preparation

* Preparing configurator to use validator

* Attempting fix for Github

* pytest fix for gen rep

* Pytest fix for validation scenario

* Re-enabling pytest

* Prepping for testing

* Intermediate check in

* Moving configurator implementations in settings

* Bugfixes moving configurator

* Setting up general configurator CLI

* Fixing configurator writing output configurations from each run to a unified CSV

* Validator and solver running fix

* flake8 fix

* Fixing command calls

* Fixing settings

* Fixing variable name in test

* Continuing work on configurators, cleaning up Components directory

* Removing unused files

* Intermediate check in to check issue with runsolver

* Found bug in smac target algorithm, was using the wrong runsolver output file to dete…
9 people authored Dec 8, 2024
1 parent 521e49e commit 83937ca
Showing 246 changed files with 7,866 additions and 3,326 deletions.
2 changes: 1 addition & 1 deletion .flake8
@@ -37,7 +37,7 @@ exclude =
sparkle/Components/AutoFolio/,
sparkle/Components/ablationAnalysis-0.9.4/,
sparkle/Components/runsolver/,
sparkle/Components/smac-v2.10.03-master-778/,
sparkle/Components/smac2-v2.10.03-master-778/
sparkle/Components/irace-v3.5/,
Examples/Resources/CVRP/,
Examples/Resources/CCAG/,
16 changes: 16 additions & 0 deletions .github/workflows/changelog.yml
@@ -0,0 +1,16 @@
name: Changelog Enforcer

on:
pull_request:
branches: ["development", "main"]

jobs:

changelog:
runs-on: ubuntu-latest

steps:
- name: Enforce changelog entry
uses: dangoslen/changelog-enforcer@v3
with:
skipLabels: skip-changelog, auto-skip-changelog
2 changes: 0 additions & 2 deletions .gitignore
@@ -20,7 +20,6 @@ sparkle/Components/runsolver/src/runsolver
Examples/Resources/CCAG/Solvers/TCA/src/
Examples/Resources/CCAG/Solvers/FastCA/src/
Examples/Resources/CVRP/Solvers/VRP_SISRs/src/
*.json

# Generated documentation
Documentation/source/_generated/
@@ -39,7 +38,6 @@ Documentation/build
## Core latex/pdflatex auxiliary files:
*.aux
*.lof
*.log
*.lot
*.fls
*.out
23 changes: 23 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,29 @@

Notable changes to Sparkle will be documented in this file.

## [0.9.1] - 2024/12/08

### Added
- Added the SMAC3 configurator to Sparkle [SPRK-335]
- Added no-copy argument to all CLI add commands so the user can create symbolic links to their files instead of copying [SPRK-356]
- Added no-save argument to initialise command [SPRK-358]
- Added SolutionFileVerifier to verify instance solutions from CSV file [SPRK-360]

### Changed

- Generate report command (Configuration report) now checks whether there are still jobs that should be run before allowing the user to start this command [SPRK-328]
- PerformanceDataFrame now directly subclasses from Pandas DataFrame instead of functioning as a container class [SPRK-278]
- Initialise command no longer removes the user's Settings directory if a platform already exists, but does still save it to the snapshot. [SPRK-355]
- Solver configuration now stores found configurations and their results in the PerformanceDataFrame [SPRK-358]
- run_solvers_core is now integrated into the Solver class [SPRK-358]
- The configure solver command now also runs the default configuration, schedules train set validation, and schedules test set validation if a test set is given [SPRK-358]
- Modified how SolutionVerifiers are added to solvers: they are now given as a CLI argument and saved in the Solver meta file instead of in the Settings file [SPRK-359]
- PAR objective now takes into account 'negative status' and penalises solvers for crashing or incorrect answers [SPRK-360]

### Removed
- Validate configured vs default command has been removed as it is now redundant [SPRK-358]
- Validator class has been removed as it is no longer relevant [SPRK-358]

## [0.9.0] - 2024/10/30

### Added
23 changes: 20 additions & 3 deletions Documentation/source/configurators.md
@@ -4,19 +4,31 @@ Sparkle offers several configurators to use for Algorithm Configuration. Althoug

## SMAC2

Sequential Model-Based Optimization for General Algorithm Configuration[[1]](#1), or [SMAC](https://www.cs.ubc.ca/labs/algorithms/Projects/SMAC) for short, is a Java-based algorithm configurator. *Note that this is the second version, not SMAC3, the Python version*. The original documentation of the configurator can be found [here](https://www.cs.ubc.ca/labs/algorithms/Projects/SMAC/v2.10.03/manual.pdf).
Sequential Model-Based Optimization for General Algorithm Configuration[[1]](#1), or [SMAC](https://www.cs.ubc.ca/labs/algorithms/Projects/SMAC) for short, is a Java-based algorithm configurator. *Note that this is the second version, not SMAC3, the Python version. For SMAC3 see below*. The original documentation of the configurator can be found [here](https://www.cs.ubc.ca/labs/algorithms/Projects/SMAC/v2.10.03/manual.pdf).

```{note}
SMAC2 is written in Java and therefore requires Java to be installed in your environment. The currently tested Java version in Sparkle is 1.8.0_402.
```

### Budget

SMAC2 receives its budget in terms of `solver_calls`, which specify the maximum number of times the target solver (e.g. your algorithm) may be run on a certain instance, or through `cpu_time` or `wallclock_time`. Note that when using time as a budget, not only the solver's time measurement counts towards the budget, but also that of SMAC itself. If you want only the execution time of the algorithm to be used for the budget, set `use_cpu_time_in_tunertime` to `False`.
SMAC2 receives its budget in terms of `solver_calls`, which specifies the maximum number of times the target solver (e.g. your algorithm) may be run on a certain instance, or through `cpu_time` or `wallclock_time`. Note that when using time as a budget, not only the solver's time measurement counts towards the budget, but also that of SMAC itself. If you want only the execution time of the algorithm to be used for the budget, set `use_cpu_time_in_tunertime` to `False`.
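
As a rough illustration only, such a budget could be written in the settings file along these lines; the section placement and the choice of values here are assumptions and should be checked against the default `sparkle_settings.ini`:

```
[configuration]
# Assumed placement of the budget options described above (illustrative values)
solver_calls = 100
wallclock_time = 600
use_cpu_time_in_tunertime = False
```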

## SMAC3

The Python version of Sequential Model-Based Optimization for General Algorithm Configuration[[2]](#2), or [SMAC3](https://github.com/automl/SMAC3) for short. The original documentation can be found [here](https://automl.github.io/SMAC3/).

### Budget

SMAC3 can be budgeted in terms of `solver_calls`, which specifies the maximum number of times the target solver (e.g. your algorithm) may be run on a certain instance, but also through `walltime_limit` or `cputime_limit`. Note that the time limits only count the time used by your solver, not the time used by the configurator. The time used by your solver is measured in Sparkle by RunSolver and communicated directly to SMAC3 to ensure comparability across platforms. SMAC3 also offers target solver limits on time and memory, but these are deliberately not available in Sparkle, as they would cause SMAC3 to wrap our target algorithm call with Pynisher; RunSolver is more accurate in its measurements, and to avoid any interference between the two resource management tools, this feature is disabled.

```{warning}
Although this may be misleading, SMAC3 does not currently support CPU time as a separate budget: both budgets are deducted from a single 'time' variable, and Sparkle currently communicates the measured CPU time for fairness. Separating these variables, so that the two budgets are actually distinct, is planned.
```

## IRACE

Iterated Racing for Automatic Algorithm Configuration[[2]](#2), or [IRACE](https://mlopez-ibanez.github.io/irace/) for short, is an R-based algorithm configurator. The full documentation of the configurator can be found [here](https://cran.r-project.org/web/packages/irace/vignettes/irace-package.pdf).
Iterated Racing for Automatic Algorithm Configuration[[3]](#3), or [IRACE](https://mlopez-ibanez.github.io/irace/) for short, is an R-based algorithm configurator. The full documentation of the configurator can be found [here](https://cran.r-project.org/web/packages/irace/vignettes/irace-package.pdf).

IRACE offers many parameters that can either be set by the user or computed automatically in accordance with the IRACE paper[[3]](#3), and we recommend not deviating from those formulae as doing so may result in unexpected behaviour.

@@ -52,6 +64,11 @@ F. Hutter and H. H. Hoos and K. Leyton-Brown (2011)
Proc. of LION-5, 2011, p507--523

<a id="2">[2]</a>
SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization
Marius Lindauer and Katharina Eggensperger and Matthias Feurer and André Biedenkapp and Difan Deng and Carolin Benjamins and Tim Ruhkopf and René Sass and Frank Hutter (2022)
Journal of Machine Learning Research, 2022, p1--9

<a id="3">[3]</a>
The irace package: Iterated Racing for Automatic Algorithm Configuration,
Manuel López-Ibáñez and Jérémie Dubois-Lacoste and Leslie Pérez Cáceres and Thomas Stützle and Mauro Birattari (2016)
Operations Research Perspectives, Volume 3, p43--58
18 changes: 13 additions & 5 deletions Documentation/source/packagegen.md
@@ -1,3 +1,11 @@
(mod-CLI)=
```{eval-rst}
CLI
===============
.. automodule:: sparkle.CLI
:members:
```

(mod-about)=
```{eval-rst}
about
@@ -11,31 +19,31 @@ about
configurator
===============
.. automodule:: sparkle.configurator
:members:
:members: AblationScenario,ConfigurationScenario,Configurator
```

(mod-instance)=
```{eval-rst}
instance
===============
.. automodule:: sparkle.instance
:members: instance_set,FileInstanceSet,InstanceSet,IterableFileInstanceSet,MultiFileInstanceSet
:members: Instance_Set,FileInstanceSet,InstanceSet,IterableFileInstanceSet,MultiFileInstanceSet
```

(mod-platform)=
```{eval-rst}
platform
===============
.. automodule:: sparkle.platform
:members: CommandName,SettingState,Settings
:members: SettingState,Settings
```

(mod-solver)=
```{eval-rst}
solver
===============
.. automodule:: sparkle.solver
:members: Extractor,SATVerifier,Selector,SolutionVerifier,Solver,Validator
:members: Extractor,SATVerifier,Selector,SolutionVerifier,Solver
```

(mod-structures)=
@@ -51,7 +59,7 @@
tools
===============
.. automodule:: sparkle.tools
:members: get_solver_args,get_solver_call_params,get_time_pid_random_string,PCSParser,SlurmBatch
:members: get_solver_call_params,get_time_pid_random_string,PCSParser,RunSolver,SlurmBatch
```

(mod-types)=
10 changes: 5 additions & 5 deletions Documentation/source/platform.md
@@ -116,7 +116,7 @@ This is a short example to show the format.

```
[general]
objective = RUNTIME
objective = PAR10
target_cutoff_time = 60
[configuration]
@@ -126,18 +126,18 @@ number_of_runs = 25
number_of_runs_in_parallel = 25
```

When initialising a new platform, the user is provided with a default settings.ini, which can be viewed [here](https://raw.githubusercontent.com/ADA-research/Sparkle/main/sparkle/Components/sparkle_settings.ini).
When initialising a new platform, the user is provided with a default settings file, which can be viewed [here](https://raw.githubusercontent.com/ADA-research/Sparkle/main/sparkle/Components/sparkle_settings.ini).

(sparkle-objective)=
### Sparkle Objectives
To define objectives for your algorithms, you can add them to the `general` section of your `Settings.ini` like the following:

```
[general]
objective = PAR10,loss,accuracy:max
objective = PAR10,loss,accuracy:max,train_loss:metric
```

In the above example we have defined three objectives: Penalised Average Runtime, the loss function value of our algorithm on the task, and the accuracy of our algorithm on the task. Note that objectives are by default assumed to be _minimised_ and we must therefore specify `accuracy`_`:max`_ to clarify this. The platform predefines three objectives for the user: CPU time, wallclock time and memory. These objectives will always be recorded next to whatever the user may choose.
In the above example we have defined three objectives: Penalised Average Runtime, the loss function value of our algorithm on the task, and the accuracy of our algorithm on the task. Note that objectives are by default assumed to be _minimised_ and we must therefore specify `accuracy`_`:max`_ to clarify this. Furthermore, you may have certain objectives that you wish to record, but not actually have configurators and algorithms use as an objective. For this we can specify `train_loss`_`:metric`_, letting the platform know this value will be present but must not be passed as an optimisable objective. The platform predefines three objectives for the user: CPU time, wallclock time and memory. These objectives will always be recorded next to whatever the user may choose.

```{note}
Although the platform supports registering multiple objectives for any Solver, not all of the components used, such as SMAC and Ablation Analysis, support multi-objective optimisation. In such cases, the first defined objective is considered the most important and is the one used.
@@ -159,7 +159,7 @@ It is possible to redefine these attributes for your specific objective. The pla

### Slurm

Slurm settings can be specified in the `Settings/settings.ini` file. Any setting in the Slurm section not internally recognised by Sparkle will be added to the `sbatch` or `srun` calls. It is advised to overwrite the defaults with settings specific to your cluster, such as a valid value for the "--partition" option on your cluster. You might also have to adapt the default "--mem-per-cpu" value to your system. For example, your Slurm section in the `settings.ini` could look like:
Slurm settings can be specified in the `Settings/sparkle_settings.ini` file. Any setting in the Slurm section not internally recognised by Sparkle will be added to the `sbatch` or `srun` calls. It is advised to overwrite the defaults with settings specific to your cluster, such as a valid value for the "--partition" option on your cluster. You might also have to adapt the default "--mem-per-cpu" value to your system. For example, your Slurm section in the `sparkle_settings.ini` could look like:

```
[slurm]
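# Illustrative sketch only: the option names come from the text above, but the
# values are placeholders (assumptions) to be replaced with options valid on your cluster
--partition=your_partition_name
--mem-per-cpu=3000
```
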
@@ -44,6 +44,7 @@

# Convert Solver output to dictionary for configurator target algorithm script
output_str = solver_call.stdout.decode()
print(output_str) # Print original output so it can be verified

solution_quality = sys.maxsize
status = SolverStatus.CRASHED
@@ -44,6 +44,7 @@

# Convert Solver output to dictionary for configurator target algorithm script
output_str = solver_call.stdout.decode()
print(output_str) # Print original output so it can be verified

solution_quality = sys.maxsize
status = SolverStatus.CRASHED
@@ -30,11 +30,12 @@
# Execute the solver call
try:
solver_call = subprocess.run(solver_cmd + params,
capture_output=True)
capture_output=True)
except Exception as ex:
print(f"Solver call failed with exception:\n{ex}")

output_str = solver_call.stdout.decode()
print(output_str) # Print original output so it can be verified
output_list = output_str.splitlines()

quality = 1000000000000
4 changes: 2 additions & 2 deletions Examples/Resources/Solvers/CSCCSat/sparkle_solver_wrapper.py
@@ -5,8 +5,7 @@
import subprocess
from pathlib import Path
from sparkle.types import SolverStatus
from sparkle.tools.solver_wrapper_parsing import parse_solver_wrapper_args, \
get_solver_call_params
from sparkle.tools.solver_wrapper_parsing import parse_solver_wrapper_args


# Parse the arguments of the solver wrapper
@@ -34,6 +33,7 @@

# Convert Solver output to dictionary for configurator target algorithm script
output_str = solver_call.stdout.decode()
print(output_str) # Print original output so it can be verified

# Try to parse the status from the output
status = SolverStatus.CRASHED
@@ -81,6 +81,7 @@

# Convert Solver output to dictionary for configurator target algorithm script
output_str = solver_call.stdout.decode()
print(output_str) # Print original output so it can be verified

status = SolverStatus.CRASHED
for line in output_str.splitlines():
@@ -36,6 +36,7 @@

# Convert Solver output to dictionary for configurator target algorithm script
output_str = solver_call.stdout.decode()
print(output_str) # Print original output so it can be verified

status = SolverStatus.CRASHED
for line in output_str.splitlines():
16 changes: 6 additions & 10 deletions Examples/Resources/Solvers/RandomForest/sparkle_solver_wrapper.py
@@ -183,14 +183,13 @@ def train(self: RandomForest,
y_test = y_train_base[test_id]

classifier.fit(x_train, y_train)

y_pred = classifier.predict(x_test)

model_size = 0
for decision_tree in classifier.estimators_:
model_size += decision_tree.tree_.node_count

averaging = "binary"
averaging = "binary" if len(np.unique(y_train)) == 2 else "micro"

performances = {
"accuracy": accuracy_score(y_test, y_pred),
@@ -222,7 +221,6 @@ def train(self: RandomForest,

dataset = DataSet().load_from_csv(dataset)
rf = RandomForest(dataset)
rf.objectives = [o.metric.lower() for o in objectives]

for k, v in config.items():
if v == "True":
@@ -293,12 +291,10 @@ def train(self: RandomForest,
status = "SUCCESS"
try:
result = rf.train(config, instance, seed)
except Exception:
except Exception as e:
print(e)
status = "CRASHED"
result = {k: 100 for k in objectives}

# quality = list(result.values())[0]
outdict = {"status": status,
"quality": result}
result = {k: 0 for k in objectives}

print(outdict)
result["status"] = status
print(result)
2 changes: 2 additions & 0 deletions Examples/Resources/Solvers/template/sparkle_solver_wrapper.py
@@ -43,6 +43,8 @@

# Convert Solver output to dictionary for configurator target algorithm script
output_str = solver_call.stdout.decode()
# Optional: Print original output so the solution can be verified by SATVerifier
print(output_str)

status = SolverStatus.CRASHED
for line in output_str.splitlines():
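
Each of the wrapper diffs above applies the same pattern: run the solver as a subprocess, echo its raw output so a verifier can inspect it, then parse a status and print a result dictionary for the configurator. A minimal sketch of that pattern is shown below; it is not the shipped template, and the command-line layout, solver binary name, status-line format and the `SolverStatus.SUCCESS` member are assumptions made for illustration only (see the template wrapper above for the real interface).

```python
#!/usr/bin/env python3
"""Illustrative sketch of the solver wrapper pattern shown in the diffs above."""
import sys
import subprocess
from pathlib import Path

from sparkle.types import SolverStatus  # SolverStatus.CRASHED is used in the diffs above

# Assumed argument layout: instance path, seed, then remaining parameters passed through.
instance = Path(sys.argv[1])
seed = sys.argv[2]
params = sys.argv[3:]

solver_cmd = ["./my_solver", str(instance), "--seed", seed]  # hypothetical solver binary
solver_call = subprocess.run(solver_cmd + params, capture_output=True)

output_str = solver_call.stdout.decode()
print(output_str)  # Print original output so it can be verified (the change made above)

# Fall back to CRASHED, as the example wrappers do, and upgrade on a recognised status line.
status = SolverStatus.CRASHED
for line in output_str.splitlines():
    if line.startswith("s "):  # hypothetical status line emitted by the solver
        status = SolverStatus.SUCCESS  # assumed member name
        break

# The wrapper reports its results as a printed dictionary, as in the examples above.
print({"status": status})
```
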
24 changes: 5 additions & 19 deletions Examples/configuration_quality.md
@@ -31,38 +31,24 @@ In this case the source directory also contains an executable, as the algorithm

### Configure the solver

Perform configuration on the solver to obtain a target configuration. For the VRP we measure the absolute quality performance by setting the `--objectives` option; to avoid needing this for every command, it can also be set in `Settings/sparkle_settings.ini`.
Perform configuration on the solver to obtain a target configuration. For the VRP we measure the absolute quality performance by setting the `--objectives` option; to avoid needing this for every command, it can also be set as the first objective in `Settings/sparkle_settings.ini` under the general section.

```bash
sparkle configure_solver --solver Solvers/VRP_SISRs/ --instance-set-train Instances/X-1-10/ --objectives quality
```

### Validate the configuration

To make sure configuration is completed before running validation, you can use the `sparkle wait` command

```bash
sparkle wait
```

Validate the performance of the best found parameter configuration. The test set is optional. We again set the performance measure to absolute quality.

```bash
sparkle validate_configured_vs_default --solver Solvers/VRP_SISRs/ --instance-set-train Instances/X-1-10/ --instance-set-test Instances/X-11-20/ --objective quality
sparkle configure_solver --solver Solvers/VRP_SISRs/ --instance-set-train Instances/X-1-10/ --instance-set-test Instances/X-11-20/ --objectives quality
```

### Generate a report

Wait for validation to be completed
Wait for the configuration to be completed:

```bash
sparkle wait
```

Generate a report detailing the results on the training (and optionally testing) set. This includes the experimental procedure and performance information; this will be located in a `Configuration_Reports/` subdirectory for the solver, training set, and optionally test set like `VRP_SISRs_X-1-10_X-11-20/Sparkle-latex-generator-for-configuration/`. We again set the performance measure to absolute quality.
Generate a report detailing the results on the training (and optionally testing) set. This includes the experimental procedure and performance information; this will be located in `Output/Configuration/Analysis`. The configuration scenario is saved by Sparkle, including the specified objective.

```bash
sparkle generate_report --objective quality
sparkle generate_report
```

By default the `generate_report` command will create a report for the most recent solver and instance set(s). To generate a report for older solver-instance set combinations, the desired solver can be specified with `--solver Solvers/VRP_SISRs/`, the training instance set with `--instance-set-train Instances/X-1-10/`, and the testing instance set with `--instance-set-test Instances/X-11-20/`.
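
For example, combining these options, the full call could look like:

```bash
sparkle generate_report --solver Solvers/VRP_SISRs/ --instance-set-train Instances/X-1-10/ --instance-set-test Instances/X-11-20/
```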