A brief review of simulated Kalman Filter Algorithm – variants and applications [version 1; peer review: awaiting peer review]

Simulated Kalman Filter (SKF) solves optimization problems by estimating the optimum solution. As a multi-agent algorithm, every agent in the population acts as a Kalman filter within the standard Kalman filter framework, which includes a simulated measurement process and the best-so-far solution as a reference. This paper presents an overview of the research progress in SKF from the day it was introduced until the present day, discussing the progress, improvements, modifications, and applications of SKF. The fundamental, standard algorithm is first introduced. Then, the work on algorithm improvements is surveyed. Finally, the remaining unresolved problems and some directions for SKF research are discussed. We reviewed 57 SKF papers: 16 on fundamental improvements, 9 on extensions of the algorithm to discrete problems, and 25 on applications. Researchers have worked on ideas to improve exploration capability and prevent premature convergence by trying prediction operators, opposition-based learning, and different iteration strategies. There have also been attempts to hybridize SKF with other well-known algorithms such as Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Sine Cosine Algorithm (SCA) to improve its performance. Lastly, a single-agent variant of SKF and a multi-objective SKF were introduced. SKF and its variants have been implemented in at least nine areas of application: drill path optimization, the airport gate allocation problem (AGAP), assembly sequence planning (ASP), system identification, feature selection, image template matching, controller tuning, wireless sensor networks, and engineering design problems. The literature reviewed solely depended on keyword search.

F1000Research 2021, 10:1081. Last updated: 27 OCT 2021.


Introduction
Optimization of time and complexity is often needed in this technological era. Optimization helps in cost minimization and profit maximization. The complexity of traditional exact optimization methods leads to an enormous amount of computational work, which is often impractical. Therefore, approximate methods, especially metaheuristics, are becoming popular. These general-purpose algorithms have lower complexity and offer good solutions.
Studies of the well-known Kalman filter 1 have contributed to the development of many algorithms in estimation as well as in other fields, such as optimization. In optimization, two optimizers have been developed based on the Kalman filter: the Heuristic Kalman Algorithm (HKA) 2 and the Simulated Kalman Filter (SKF). 3 This paper emphasizes the fundamental advancements and applications of SKF, which was originally proposed by Ibrahim et al. in 2015. The intention of this paper is to gather and briefly review all the papers related to SKF for the benefit of students and researchers who would like to venture into this field.

Simulated Kalman Filter (SKF) algorithm
The SKF, which is also an estimation-based metaheuristic algorithm, 4 was first introduced as a solution to unimodal optimization problems. 3 A year later, it was tested on various optimization problems and found to be a promising optimizer. 5 In principle, SKF tries to solve an optimization problem by estimating the optimum solution. Taking its inspiration from the Kalman filter algorithm, each agent in the SKF goes through the same three-step process found in the Kalman filter, consisting of prediction, measurement, and estimation. The SKF uses the same prediction and measurement equations as the Kalman filter. However, since the measurement in Kalman filter estimation comes from sensors, the measurement is simulated in the optimization algorithm; thus the algorithm is named the Simulated Kalman Filter (SKF). Figure 1 shows the flowchart of the SKF algorithm. The optimization process starts with the initialization of solutions, followed by the fitness evaluation of each solution, and then the generation of new solutions for the next iteration. This process stops when the stopping criterion is met. To further understand how the SKF works, a tutorial published in 2019 can be consulted. 6 In addition, further studies of the SKF algorithm revealed that the P, Q and R parameters can be replaced with random numbers ∈ [0, 1] without affecting the performance of the algorithm. [7][8][9]

The SKF improvements
Exploration in the SKF algorithm depends solely on the measurement equation. The sine operator in the measurement equation gives a 50-50 chance for exploration to occur in the measurement phase. In the estimation phase, exploitation takes place: the predicted value is updated by the product of two small numbers (the Kalman gain and the distance between the predicted and measured values). This may cause the SKF algorithm to converge prematurely.
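The predict, measure, estimate loop described above can be sketched as follows. This is a minimal reading of the process, with the P, Q and R parameters drawn as uniform random numbers in [0, 1] as the cited studies permit; the population size, bounds and iteration budget are illustrative choices, not values from the SKF papers:

```python
import math
import random

def skf_minimize(fitness, dim, n_agents=20, lb=-10.0, ub=10.0, max_iter=100, seed=0):
    """Minimal SKF-style loop: predict, simulate a measurement around
    the best-so-far solution, then estimate via a Kalman gain."""
    rng = random.Random(seed)
    X = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_agents)]
    P = 1.0                                   # error covariance estimate
    x_best, f_best = None, float("inf")
    for _ in range(max_iter):
        for x in X:                           # fitness evaluation, best-so-far update
            f = fitness(x)
            if f < f_best:
                f_best, x_best = f, list(x)
        Q, R = rng.random(), rng.random()     # process/measurement noise in [0, 1]
        P_pred = P + Q                        # prediction step (covariance)
        K = P_pred / (P_pred + R)             # Kalman gain
        P = (1.0 - K) * P_pred                # updated error covariance
        for x in X:
            for d in range(dim):
                pred = x[d]                   # prediction: previous estimate
                # simulated measurement: the sine term gives a 50-50 chance
                # of overshooting or undershooting the best-so-far value
                z = pred + math.sin(2.0 * math.pi * rng.random()) * abs(pred - x_best[d])
                x[d] = pred + K * (z - pred)  # estimation step
    return x_best, f_best
```

Note how the sine term in the simulated measurement can push an agent either beyond or short of the best-so-far value; this is the 50-50 exploration chance discussed above.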

Opposition-based learning
In 2018, Ibrahim et al. introduced the Opposition-Based Learning (OBL) concept into SKF to overcome the problem of premature convergence. In SKF with an oppositional-learning prediction operator (SKF-OPO), the opposition population is generated around the best-so-far solution identified in the prediction phase. 10 If the opposite prediction has a better fitness value than the original prediction (which is the previous estimated solution), the opposite prediction is used as the predicted solution. This method shows that a proper prediction operator helps SKF escape premature convergence. Further experiments were conducted to observe the impact of opposition population generation in SKF-OPO with the introduction of a jumping rate. 11 The jumping rate is compared to a random number between 0 and 1; if the random number is smaller than the jumping rate, the opposition population is generated in the prediction phase. Results show that the higher the jumping rate, the better the performance of the algorithm. Another variation of OBL in SKF was introduced in 2019 by Mohd Azmi et al., known as Current Optimum Opposition-Based Learning SKF (COOB-SKF). In this algorithm, the formation of the opposite population uses the best-so-far solution as the center between the estimate population and the opposition estimate population. 12
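A hedged sketch of the OBL ideas above: the classic opposite point reflects a solution through the midpoint of the search range, while a COOB-style variant (as we read it) reflects through the best-so-far solution, and a jumping-rate gate decides whether opposition is attempted at all. The function names and the clipping rule are our own illustrations, not code from the cited papers:

```python
import random

def opposite(x, lb, ub):
    """Classic OBL: reflect a solution through the midpoint of the range."""
    return [lb + ub - v for v in x]

def opposite_about_best(x, best, lb, ub):
    """COOB-style reflection (our illustration): reflect through the
    best-so-far solution, clipping back into the search bounds."""
    return [min(ub, max(lb, 2.0 * b - v)) for v, b in zip(x, best)]

def maybe_oppose(pop, best, fitness, lb, ub, jumping_rate, rng=random):
    """Jumping-rate gate: with probability `jumping_rate`, try the
    opposite of each agent and keep it only if its fitness is better."""
    out = []
    for x in pop:
        if rng.random() < jumping_rate:
            x_opp = opposite_about_best(x, best, lb, ub)
            x = x_opp if fitness(x_opp) < fitness(x) else x
        out.append(x)
    return out
```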

Iteration strategy
The original population-based SKF algorithm uses a synchronous update mechanism, in which all agents must go through all optimization steps before the best-so-far solution is updated. However, in 2018, Ab. Aziz et al. found that when the update strategy of SKF is made individual-oriented (asynchronous), the results are better.
The researchers later explored the possibility of using both mechanisms in SKF. There are three variants of SKF adaptive switching algorithms: the fitness-based adaptive switching synchronous-asynchronous SKF (ASSA-SKF), the fitness-evaluated adaptive switching SKF with randomness (ASSKFR), and the diversity-based adaptive switching synchronous-asynchronous SKF (DASSA-SKF). In ASSA-SKF, the algorithm starts with synchronous update and changes its update mechanism when the fitness is found to be static for a number of fitness evaluations. 13 In ASSKFR, the switching happens when the switching counter is greater than a random number, which is chosen anew every time a switch occurs; 14 one may choose to start with either synchronous or asynchronous update. Lastly, in DASSA-SKF, instead of using fitness as the switching indicator, the decision to switch depends on the diversity of the population. 15 Every time the diversity of the population does not change for a certain number of iterations, the iteration strategy switches from synchronous to asynchronous update, and vice versa. All findings have shown that the algorithms benefit from a greater number of switches, which encourage more exploration.
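The difference between the two update mechanisms can be made concrete with a small sketch. The `evaluate` and `update` callables stand in for SKF's fitness evaluation and measure-estimate steps; the helper names are ours:

```python
def synchronous_step(pop, evaluate, update, best):
    """Synchronous update: evaluate ALL agents first, then update all
    of them against the same best-so-far solution."""
    fits = [evaluate(x) for x in pop]
    for f, x in zip(fits, pop):
        if f < best[1]:
            best[:] = [list(x), f]
    return [update(x, best[0]) for x in pop]

def asynchronous_step(pop, evaluate, update, best):
    """Asynchronous update: each agent is evaluated and updated in turn,
    so later agents already see improvements found earlier in the pass."""
    out = []
    for x in pop:
        f = evaluate(x)
        if f < best[1]:
            best[:] = [list(x), f]
        out.append(update(x, best[0]))
    return out
```

Under asynchronous update, an agent processed later in the pass can already exploit a better solution found earlier in the same pass, which is what makes the individual-oriented strategy behave differently.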

Hybrid
Hybridization between algorithms can be used to improve an algorithm's performance. SKF has been hybridized with three algorithms: Particle Swarm Optimization (PSO), 16 the Gravitational Search Algorithm (GSA), 17 and the Sine Cosine Algorithm (SCA). 18 Muhammad et al. proposed four ways in which GSA 19 and PSO 20 can be hybridized with SKF during its prediction step, exploiting the absence of a prediction operator in SKF. In a later paper, although the fourth model performed better than the original SKF algorithm, it was found not to be the best hybrid model for either SKF-GSA or SKF-PSO. 21 A fairly recent SKF hybrid is the 2018 hybridization of SKF with SCA. 18 The SCA algorithm is formulated using mathematical sine and cosine terms. In the hybrid version of SKF and SCA, namely the Kalman Filter-based Sine Cosine Algorithm (KFSCA), the prediction and estimation phases of SKF are implemented in SCA. Instead of using the simulated measurement equation of SKF during the measurement phase, SCA equations are used to update each agent's position. Five unimodal benchmark functions were used to compare the performance of the hybrid KFSCA algorithm with the original SKF and SCA algorithms. The statistical results showed that KFSCA performed significantly better than the original SKF and SCA algorithms, and had a higher convergence rate than the original SKF algorithm.
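For reference, the SCA position update that KFSCA substitutes for SKF's simulated measurement follows the standard sine cosine formulation; the sketch below shows that update in isolation (how KFSCA wires it into the SKF phases is simplified away here):

```python
import math
import random

def sca_update(x, best, t, max_iter, a=2.0, rng=random):
    """Sine Cosine Algorithm position update; in KFSCA this kind of
    move replaces SKF's simulated measurement for each agent."""
    r1 = a - t * (a / max_iter)  # shrinks linearly: exploration -> exploitation
    out = []
    for xd, bd in zip(x, best):
        r2 = 2.0 * math.pi * rng.random()
        r3 = 2.0 * rng.random()
        trig = math.sin(r2) if rng.random() < 0.5 else math.cos(r2)
        out.append(xd + r1 * trig * abs(r3 * bd - xd))
    return out
```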

Other improvement methods
A high convergence rate is a signature of all Kalman filter-based algorithms. While most researchers tried many ways to introduce more exploration to prevent the SKF algorithm from converging prematurely due to its fast convergence speed, Mat Jusof et al. opined that a faster convergence speed is favorable, especially in solving unimodal problems. 22 In their paper, Simulated Kalman Filter with Improved Accuracy, a new exponential-based SKF named SKFIA was introduced. SKFIA uses modified equations in the estimation phase, where an exponential term is introduced in the calculation of the Kalman gain and the corresponding error covariance. Instead of using the suggested constant for measurement noise, an exponential term is added in the calculation of the measurement noise and is made dependent on the predicted error covariance. This enables a large step size at the beginning of the search and a smaller step size towards the end. A performance evaluation comparing SKFIA with the original SKF on the first four benchmark functions of CEC2014 (three unimodal and one multimodal function) shows that SKFIA is able to find better results than the original SKF.
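The large-step-to-small-step behavior can be illustrated as follows. This is NOT the published SKFIA equation set (see reference 22 for that); it is one hypothetical way to make the measurement noise an exponential function of the predicted error covariance so that the gain shrinks as the search settles:

```python
import math

def skfia_like_gain(p_pred):
    """Hypothetical, NOT the published SKFIA equations: measurement
    noise as an exponential function of the predicted error covariance,
    so the gain is near 1 early (large steps) and small late."""
    r = math.exp(-p_pred)           # large P_pred -> tiny R; small P_pred -> R near 1
    return p_pred / (p_pred + r)    # Kalman gain
```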

SKF extensions
Besides being subjected to various modifications to improve its performance, SKF has also been modified to extend its usage. The three main extensions of SKF are SKF algorithms for discrete problems, a single-solution version of the SKF algorithm, and SKF for multi-objective problems.
SKF for discrete problems
The first extension of the SKF algorithm makes it available for solving discrete problems. Three approaches were proposed by Md. Yusof et al. for solving combinatorial optimization problems using SKF, giving birth to three new modified algorithms: the binary SKF (BSKF), the angle-modulated SKF (AMSKF), and the distance-evaluated SKF (DESKF). 23 Right after the introduction of SKF, Md. Yusof et al. published the Binary SKF (BSKF) algorithm to enable SKF to operate in a binary search space. 24 This was followed by the publication of DESKF and AMSKF a year later. 25,26 Other variants of DESKF are the local optimum DESKF algorithm 27 and the DESKF with state encoding. 28 All these discrete SKF algorithms are meant to solve binary problems. The state-encoded DESKF (SEDESKF) was introduced in 2018 to solve combinatorial optimization problems that are not in a binary search space.
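BSKF and its siblings all need some mapping from SKF's real-valued estimates to bit strings, and the exact mappings differ per variant (sigmoid probability, angle modulation, distance evaluation). The sketch below uses the sigmoid-probability mapping familiar from binary PSO as an illustrative stand-in, not the exact BSKF equations:

```python
import math
import random

def binarise(x, rng=random):
    """Map real-valued estimates to bits: a sigmoid turns each value
    into the probability of the corresponding bit being 1
    (binary-PSO-style mapping, illustrative stand-in for BSKF)."""
    return [1 if rng.random() < 1.0 / (1.0 + math.exp(-v)) else 0 for v in x]
```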

Single-solution Simulated Kalman Filter (ssSKF)
In an analysis of the effect of the number of agents on the performance of the SKF optimizer, results showed that the SKF algorithm performed best at a surprisingly large population of 700 agents. 29 For this reason, another variant of the SKF algorithm, a single-agent version, was introduced in 2018. 30 In order to use the same equations as the SKF algorithm, a prediction operator needs to be introduced in the prediction phase. Thus, a simple local adaptive neighborhood method centered around the best-so-far solution is used to predict the location of the optimum solution. However, this new adaptive prediction operator introduces another parameter. The ssSKF algorithm with an adaptive coefficient value of 5 was shown to be significantly better than the original SKF with 100 agents. Another tutorial, published in 2019, can be consulted to understand the difference between the population-based and single-solution algorithms. 31 A parameter tuning analysis published last year revealed that ssSKF performed best when the adaptive coefficient was set to 10. 32
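One hypothetical way such an adaptive neighborhood can behave is sketched below: the single agent predicts by sampling around the best-so-far solution inside a radius that decays exponentially with the iteration count, with `alpha` playing the role of the adaptive coefficient. The exact ssSKF operator is given in reference 30; this only illustrates the shrinking-neighborhood idea:

```python
import math
import random

def adaptive_prediction(best, t, max_iter, lb, ub, alpha=5.0, rng=random):
    """Hypothetical shrinking-neighborhood predictor for a single-agent
    SKF: sample around the best-so-far solution within a radius that
    decays exponentially with iteration; `alpha` mimics the adaptive
    coefficient discussed in the text."""
    radius = math.exp(-alpha * t / max_iter) * (ub - lb)
    return [min(ub, max(lb, b + rng.uniform(-radius, radius))) for b in best]
```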

Multiobjective SKF
The last but quite significant extension of SKF is the multi-objective SKF (MOSKF). 33 Research on multi-objective algorithms is gaining attention worldwide because they deal with problems having two or more objectives that may conflict with one another. The SKF is transformed into MOSKF by applying the non-dominated sorting approach. Each agent is associated with a cost function value and a diversity spacing parameter. In addition, mutation and crossover operators are adopted in MOSKF. Only a limited number of agents undergo mutation and crossover, to reduce the complexity of the algorithm while at the same time promoting randomness. Experimental results on three multi-objective benchmark functions showed that MOSKF performed better than the Non-dominated Sorting Genetic Algorithm II. However, MOSKF has not yet been applied to real-world problems or subjected to more complex problems. Table 1 shows the summary of applications of the SKF algorithm and its variants in the nine fields mentioned above.
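The non-dominated sorting at the heart of MOSKF rests on Pareto dominance, which is standard and can be sketched directly (minimization assumed; helper names are ours):

```python
def dominates(a, b):
    """a Pareto-dominates b (minimization): no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(costs):
    """Indices of the first non-dominated front, the core ranking step
    of non-dominated sorting as used to order agents in MOSKF."""
    return [i for i, c in enumerate(costs)
            if not any(dominates(o, c) for j, o in enumerate(costs) if j != i)]
```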

Conclusions
In this paper, we provided a brief review of SKF, its improvements, extensions, and applications. It does not include any comparison study between SKF and other algorithms. Since the introduction of SKF at the end of 2015, more than 57 papers have been published. The majority of them focus on methods to improve SKF performance and on applications in various fields, owing to the algorithm's simple implementation and fast convergence behavior. However, additional work is needed to test the algorithm's performance on more complex applications and large-scale problems, especially for multi-objective optimization. Another interesting area to explore is improving the exploration of the single-agent version of SKF, since no such work has been carried out yet for the ssSKF algorithm.

Data availability
No data associated with this paper.
Author contributions
NHAA & NAAA: wrote most of the manuscript.
MRK: compiled all SKF related papers.