This repository contains a description of MatHH, a framework developed in Matlab for coding and testing hyper-heuristics. This framework has been published in Elsevier's SoftwareX journal and is freely available here.
In order to properly use MatHH and the examples shown in this document, the following packages are required:

- `Utils`: a set of diverse utility functions to better organize the code; available at: Github
- `JSSP-Matlab-OOP`: an object-oriented class for handling Job-Shop scheduling problems; available at: Github. Note that a stable version of this domain is bundled with MatHH for the sake of simplicity, so there is no need to clone the repository unless you wish to test experimental features.
- `BaseInstances`: a set of instances for different problem domains; available at: Github

It is also necessary to define the training instances that will be used. To this end, we suggest using the `BaseInstances` package (available at: Github), which contains some instances that can be used with this framework.
To facilitate the maintenance of the required packages, the root folder of each package should be located at the same level. Moreover, let us assume a folder named 'outsiderCode' in which we will include our work. Hence, the following structure is suggested:
```
\BaseInstances
\MatHH
    \src
        \extended
            \Domains
\Utils
    \distance
    ...
\outsiderCode   <--------- Use this folder to store your codes
```
Note: remember that you can use `addpath(genpath(pathString))` to temporarily add these packages to Matlab's search path, so that you can put your codes in different folders.
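As a minimal sketch, assuming the folder layout suggested above and that Matlab's current folder is `\outsiderCode`, all packages can be added in one go (the relative paths are illustrative and depend on where your code actually lives):

```matlab
% Temporarily add the required packages to Matlab's search path.
% genpath builds a path string that includes all subfolders.
addpath(genpath("..\MatHH"));         % Adds the framework itself
addpath(genpath("..\Utils"));         % Adds assorted utilities
addpath(genpath("..\BaseInstances")); % Adds the base instances
```

These additions last only for the current Matlab session unless the path is explicitly saved.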
Note for Mac users: we use the Windows path separator (`\`) throughout this document, which differs from the one used by Mac (`/`). So, Mac users must modify the separator in order to use MatHH.
The following kinds of hyper-heuristics (HHs) are currently supported:
| HH model   | Class name               | Description                                     |
|---|---|---|
| Selection  | `selectionHH.m`          | Parent class for selection hyper-heuristics     |
| Rule-based | `ruleBasedSelectionHH.m` | Class for rule-based selection hyper-heuristics |
The following example is also provided in the file `example.m`, so that you can run it directly in Matlab. It shows how to create a simple HH and associate it with the Job-Shop Scheduling problem. We also provide some examples of how to train the HH model and how to use it for solving a set of new instances, along with some details about the information that becomes available.
The first thing is to make sure that the workspace is pristine:
```matlab
clc
clear
close all
```
Then, the required packages must be added to the search path:
```matlab
addpath(genpath("..\..\JSSP-Matlab-OOP")); % Adds JSSP functionality
addpath(genpath("..\..\Utils"));           % Adds assorted utilities
```
Before creating the hyper-heuristic, we must first define some basic parameters:
```matlab
nbRules = 4;                            % Number of rules for the model
targetProblem = "job shop scheduling";  % String representing the problem domain
```
Now, we can create a basic HH:
```matlab
testHH = ruleBasedSelectionHH(nbRules, targetProblem); % Initializes to random model
```
We can also define our own model by providing the rule matrix:
```matlab
userModel = [0.2 0.4 0.6 1;...
             0.1 0.3 0.9 3;...
             0.8 0.7 0.2 1;...
             0.5 0.5 0.5 2]; % User-defined rule matrix
testHH.value = userModel;    % Sets the user-defined model
```
Similarly, we can create random models with subsets of features or solvers:
```matlab
useFeatures = [2 4 3]; % IDs of the features to use in the model
useSolvers = 2:4;      % IDs of the solvers that will be available
useRules = 10;         % New number of rules for the HH
testHH.assignFeatures(useFeatures);
testHH.assignSolvers(useSolvers);
testHH.initializeModel(useRules); % Generates a random model for the current subset of features and solvers
testHH % Displays the new HH model
```
In order to train the HH model we need to define a set of training instances. For the sake of simplicity, in this example we use already available instances from the `BaseInstances` package.
Load the first set of instances from the E02 folder:
```matlab
instanceDataset = '..\..\BaseInstances\JobShopScheduling\files\mat\Instances\E02\instanceDataset.mat';
load(instanceDataset);
trainInstances1 = num2cell(allInstances); % Stores instances as cell array
```
Load the second set of instances from the E01 folder:
```matlab
instanceDataset = '..\..\BaseInstances\JobShopScheduling\files\mat\Instances\E01\instanceDataset.mat';
load(instanceDataset);
trainInstances2 = num2cell(allInstances); % Stores as cell array
```
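Since each set is just a cell array of instances, the two sets can be combined with standard cell-array concatenation, should a larger training set be desired (the variable name `allTrainInstances` is our own, not part of the framework):

```matlab
% Merge both instance sets into a single cell array
allTrainInstances = [trainInstances1, trainInstances2];
```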
Now, we only need to point the HH towards the set of instances it should use for training:
```matlab
testHH.trainingInstances = trainInstances1; % Assigns first set to HH
```
The training process has a required parameter that represents the stop criterion. Currently, only `criterion = 1` is supported, which indicates that training should run for a fixed number of iterations. So, in order to train with the default parameters, simply use:
```matlab
criterion = 1;           % Train for a fixed number of iterations
testHH.train(criterion); % Trains using default parameters
```
Training can also be carried out with custom parameters, which must be passed in the following order (some feasible values are shown):
```matlab
maxIter = 100;       % Maximum number of iterations for training
populationSize = 20; % Number of search agents (particles)
selfConf = 2.1;      % Self-confidence constant (UPSO parameter)
globalConf = 2.1;    % Global confidence constant (UPSO parameter)
unifyFactor = 0.5;   % Unification factor (UPSO parameter)
visualMode = false;  % Flag indicating whether fitness evolution should be plotted
```
The training process returns three elements: `position`, a vector containing the HH model; `fitness`, the best fitness value found by UPSO; and `details`, a structure with extra information about the training process. So, we can create a new HH and train it with the custom parameters:
```matlab
testHH2 = ruleBasedSelectionHH(nbRules, targetProblem); % Creates the new HH
testHH2.trainingInstances = trainInstances1;            % Assigns the first set of instances to the HH
[position, fitness, details] = testHH2.train(criterion, maxIter, populationSize, selfConf, globalConf, unifyFactor, visualMode);
```
We can use the training information, e.g. to plot the fitness evolution across iterations:
```matlab
figure, plot(details.procedureEvolution.fitness.raw)
```
We can use the trained model for different actions. For example, we can use it for solving a new set of instances:
```matlab
solvedInstances = testHH2.solveInstanceSet(trainInstances2);
```
Bear in mind that `solveInstanceSet` clones the instances before solving them. In this way, `trainInstances2` is preserved and the solved instances are located within `solvedInstances`.
After solving a set of instances we can analyze performance data by accessing it directly from the HH. For example, we can display the initial feature values when solving the second instance:
```matlab
selectedStep = 1;
selectedInstance = 2;
testHH2.performanceData{selectedInstance}{selectedStep}.featureValues
```
Or we can plot the final solution of the third instance:
```matlab
selectedInstance = 3;
testHH2.performanceData{selectedInstance}{end}.solution.plot()
```
Similarly, we can plot the solution of the third instance before taking the fourth decision:
```matlab
selectedStep = 4;
selectedInstance = 3;
testHH2.performanceData{selectedInstance}{selectedStep}.solution.plot()
```
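Since `performanceData{selectedInstance}` is a cell array with one entry per decision step (as the indexing above suggests), the number of recorded steps for a given instance can be obtained with standard Matlab functions; `nbSteps` below is our own variable name:

```matlab
% Count how many decision steps were recorded for this instance
nbSteps = numel(testHH2.performanceData{selectedInstance});
```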
Should you require more information about MatHH, or if you want some feature to be developed, feel free to reach us at: iamaya2@tec.mx