Slime Mould Algorithm

Different phases of the slime mould algorithm (SMA)

Slime mould algorithm (SMA) is a population-based optimization technique [1] proposed based on the oscillation style of slime mould in nature [2]. SMA has a unique mathematical model that simulates the positive and negative feedback of the propagation wave of slime mould. It has a dynamic structure that maintains a stable balance between global and local search.


Despite lacking a brain or neurons, slime moulds exhibit remarkable intelligence and can solve difficult computational problems with high efficiency [3]. This single-celled amoeba is able to memorize, make movement decisions, and respond to changes, abilities that have influenced our thinking about intelligent behaviour [4]. The organism can also optimize the shape of its network over time as it takes in information [5].


Mathematical model

Approach food

To model the approaching behaviour of slime mould, the following rule is formulated in SMA to describe the contraction mode:

X(t+1) =
\begin{cases}
X_b(t) + vb \cdot \left( W \cdot X_A(t) - X_B(t) \right), & r < p \\
vc \cdot X(t), & r \geq p
\end{cases}
\quad (1)

where vb is a parameter with the interval [-a, a], vc decreases linearly from one to zero, t denotes the current iteration, X_b is the position of the individual with the highest odor concentration found so far, X is the location vector of the slime mould, X_A and X_B are two individuals randomly selected from the current population, and W is the weight of the slime mould. The equation of p is as follows:

p = \tanh \left| S(i) - DF \right| \quad (2)

where i \in 1, 2, \ldots, n, S(i) is the fitness of X_i, and DF is the best fitness attained over all iterations.

The formula of vb can be expressed as follows:

vb = [-a, a] \quad (3)

a = \operatorname{arctanh}\left( -\frac{t}{\max_t} + 1 \right) \quad (4)

The formula of W can be expressed as follows:

W(\mathrm{SmellIndex}(i)) =
\begin{cases}
1 + r \cdot \log\left( \dfrac{bF - S(i)}{bF - wF} + 1 \right), & \text{condition} \\
1 - r \cdot \log\left( \dfrac{bF - S(i)}{bF - wF} + 1 \right), & \text{others}
\end{cases}
\quad (5)

\mathrm{SmellIndex} = \mathrm{sort}(S) \quad (6)

where condition indicates that S(i) ranks in the first half of the population, r is a random value in the interval [0, 1], bF is the best fitness attained in the current iteration, wF is the worst fitness value attained in the iterative process, and SmellIndex is the sequence of fitness values sorted in ascending order (for a minimization problem).
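The quantities in Eqs. (2)-(6) can be computed directly from the population's fitness values. The sketch below (Python/NumPy) is a minimal illustration for a minimization problem; the function name smell_weights, the use of the natural logarithm, and the small epsilon guarding the division are assumptions made for illustration, not part of the original formulation.

import numpy as np

def smell_weights(fitness, best_so_far, t, max_t, rng):
    """Illustrative computation of p (Eq. 2), a (Eq. 4) and W (Eqs. 5-6)
    for a minimization problem. `fitness` is a 1-D array of S(i) values."""
    n = fitness.shape[0]
    smell_index = np.argsort(fitness)                 # Eq. (6): ascending sort for minimization
    bF, wF = fitness[smell_index[0]], fitness[smell_index[-1]]

    # Eq. (5): the first half of the ranked swarm gets 1 + r*log(...), the rest 1 - r*log(...)
    W = np.ones(n)
    denom = (bF - wF) + np.finfo(float).eps           # guard against bF == wF
    term = np.log((bF - fitness[smell_index]) / denom + 1.0)
    r = rng.random(n)
    half = n // 2
    W[smell_index[:half]] = 1.0 + r[:half] * term[:half]
    W[smell_index[half:]] = 1.0 - r[half:] * term[half:]

    p = np.tanh(np.abs(fitness - best_so_far))        # Eq. (2), with DF = best fitness so far
    a = np.arctanh(1.0 - t / max_t)                   # Eq. (4), assuming 1 <= t <= max_t
    return W, p, a

# Example use on a random fitness vector (all values illustrative)
rng = np.random.default_rng(0)
W, p, a = smell_weights(rng.random(20), best_so_far=0.01, t=10, max_t=100, rng=rng)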

Wrap food

The mathematical rule for updating the location of the slime mould is as follows:

X^{*} =
\begin{cases}
\mathrm{rand} \cdot (UB - LB) + LB, & \mathrm{rand} < z \\
X_b(t) + vb \cdot \left( W \cdot X_A(t) - X_B(t) \right), & r < p \\
vc \cdot X(t), & r \geq p
\end{cases}
\quad (7)

where LB and UB denote the lower and upper bounds of the search range, rand and r are random values in [0, 1], and z is a small probability that determines whether an individual is re-initialized at a random position.
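A per-individual form of the update in Eq. (7) might look like the following sketch; the argument names (vc_bound for the current amplitude of vc, z for the re-initialization probability), the uniform sampling of vb and vc, and the final clipping to the bounds are illustrative assumptions.

import numpy as np

def update_position(X_i, X_b, X_A, X_B, W_i, p_i, a, vc_bound, z, lb, ub, rng):
    """Illustrative per-individual update following Eq. (7).
    X_i: current position; X_b: best position found so far;
    X_A, X_B: two randomly chosen individuals; W_i: weight of this individual."""
    dim = X_i.shape[0]
    if rng.random() < z:
        new = rng.random(dim) * (ub - lb) + lb        # random re-initialization (rand < z)
    elif rng.random() < p_i:
        vb = rng.uniform(-a, a, dim)                  # Eq. (3): vb drawn from [-a, a]
        new = X_b + vb * (W_i * X_A - X_B)            # search guided by the best position
    else:
        vc = rng.uniform(-vc_bound, vc_bound, dim)    # vc oscillates and shrinks toward zero
        new = vc * X_i
    return np.clip(new, lb, ub)                       # keep the candidate inside [LB, UB]

# Example: update one 5-dimensional individual (all inputs illustrative)
rng = np.random.default_rng(1)
x_new = update_position(
    X_i=rng.random(5), X_b=rng.random(5), X_A=rng.random(5), X_B=rng.random(5),
    W_i=np.ones(5), p_i=0.6, a=1.5, vc_bound=0.8, z=0.03, lb=-10.0, ub=10.0, rng=rng,
)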

Oscillation

The value of vb oscillates randomly within [-a, a] and gradually approaches zero as the number of iterations increases. The value of vc oscillates within [-1, 1] and eventually converges to zero.
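A few iterations of the two schedules make the shrinking ranges concrete; the linear bound used for vc below is an assumed form that merely mirrors the description above.

import numpy as np

max_t = 100
for t in (1, 25, 50, 75, 100):
    a = np.arctanh(1.0 - t / max_t)   # amplitude of vb, Eq. (4); reaches 0 at t = max_t
    b = 1.0 - t / max_t               # assumed linear amplitude of vc, shrinking from 1 to 0
    print(f"t = {t:3d}:  vb in [-{a:.3f}, {a:.3f}],  vc in [-{b:.3f}, {b:.3f}]")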

The SMA algorithm

  • Inputs: the population size N and the maximum number of iterations max_t
  • Outputs: the best fitness and its position X_b
  • Initialize the positions of the slime mould X_i (i = 1, 2, …, N)
  • While t ≤ max_t:
    • Calculate the fitness of all slime mould and update bestFitness and X_b
    • Calculate W by Eq. (5)
    • Update p, vb, vc
    • Update positions by Eq. (7)
    • t = t + 1
  • Return bestFitness and X_b
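The steps above can be translated into a compact, self-contained sketch. The defaults below (N = 30, max_t = 500, z = 0.03), the natural logarithm in Eq. (5), the linear schedule for the vc bound, and the boundary clipping are illustrative choices rather than part of the original specification; the example minimizes the sphere function.

import numpy as np

def sma(objective, lb, ub, dim, n=30, max_t=500, z=0.03, seed=0):
    """Compact SMA sketch for minimization, following Eqs. (1)-(7)."""
    rng = np.random.default_rng(seed)
    X = rng.random((n, dim)) * (ub - lb) + lb            # initialize positions
    fitness = np.array([objective(x) for x in X])
    best = np.argmin(fitness)
    X_b, DF = X[best].copy(), fitness[best]              # best position / best fitness so far

    for t in range(1, max_t + 1):
        order = np.argsort(fitness)                      # SmellIndex, Eq. (6)
        bF, wF = fitness[order[0]], fitness[order[-1]]

        # Eq. (5): adaptive weights, split at the median rank
        W = np.ones((n, dim))
        denom = (bF - wF) + np.finfo(float).eps
        for rank, i in enumerate(order):
            term = np.log((bF - fitness[i]) / denom + 1.0)
            sign = 1.0 if rank < n // 2 else -1.0
            W[i] = 1.0 + sign * rng.random(dim) * term

        a = np.arctanh(1.0 - t / max_t)                  # Eq. (4)
        b = 1.0 - t / max_t                              # assumed linear bound of vc
        p = np.tanh(np.abs(fitness - DF))                # Eq. (2)

        for i in range(n):                               # Eq. (7)
            if rng.random() < z:
                X[i] = rng.random(dim) * (ub - lb) + lb
            elif rng.random() < p[i]:
                vb = rng.uniform(-a, a, dim)
                A, B = rng.integers(0, n, size=2)
                X[i] = X_b + vb * (W[i] * X[A] - X[B])
            else:
                X[i] = rng.uniform(-b, b, dim) * X[i]
            X[i] = np.clip(X[i], lb, ub)

        fitness = np.array([objective(x) for x in X])
        if fitness.min() < DF:                           # track the global best
            best = np.argmin(fitness)
            X_b, DF = X[best].copy(), fitness[best]

    return DF, X_b

# Example: minimize the 10-dimensional sphere function
best_f, best_x = sma(lambda x: float(np.sum(x**2)), lb=-10.0, ub=10.0, dim=10)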

Applications of the Slime Mould Algorithm

The Slime Mould Algorithm (SMA) has demonstrated remarkable versatility across various application domains, showcasing its adaptability and effectiveness in complex optimization tasks. Below are several notable applications:

1. Path Planning for Autonomous Robots: Zheng and Tian (2023) applied an improved SMA for path planning in autonomous mobile robots, enhancing navigation efficiency.[6]

2. Signal Detection in Instrumentation: He and Liu (2023) developed a novel SMA-based unresolved peaks analysis algorithm for signal detection in measurement systems.[7]

3. Distribution Network Optimization: Pan and Wang (2022) utilized a dynamic optimal period division and multi-group flight SMA for reconfiguring distribution networks, improving power system reliability.[8]

4. Gene Data Mining and Feature Selection: Qiu and Guo (2022) applied an enhanced SMA for high-dimensional gene data mining and feature selection, demonstrating its efficacy in biological data analysis.[9]

5. Numerical and Engineering Optimization: Jui et al. (2022) explored the use of a Lévy SMA for solving various numerical and engineering optimization problems, showcasing its robustness.[10]

6. Hybrid Optimization Techniques: Kundu and Garg (2022) combined SMA with Teaching-Learning-Based Optimization (TLBO) and Lévy flight mutation for enhanced numerical and engineering design solutions.[11]

7. Structural Optimization: Kaveh and Hamedani (2022) applied an improved SMA with an elitist strategy for structural optimization, focusing on natural frequency constraints.[12]

8. Engineering Design: Liu and Fu (2023) presented a novel improved SMA tailored for engineering design problems, highlighting its effectiveness in complex design scenarios.[13]

9. Structural Health Monitoring: Wu and Heidari (2023) applied the Gaussian bare-bone SMA to optimize performance in truss structures, demonstrating its potential in structural health monitoring.[14]

10. Feature Selection: Zhou and Chen (2023) enhanced feature selection processes through a boosted local dimensional mutation SMA, providing significant improvements in data analysis.[15]

11. Maximum Power Point Tracking: Houssein and Helmy (2022) implemented an orthogonal opposition-based SMA for maximum power point tracking in photovoltaic systems, improving energy efficiency.[16]

12. Image Segmentation: Ren and Heidari (2022) used a Gaussian kernel probability-driven SMA for multi-level image segmentation, demonstrating its effectiveness in complex image processing tasks.[17]

13. Feature Selection in Chemical Data: Ewees and Al-Qaness (2023) applied SMA for feature selection in chemical data, enhancing the classification process.[18]

14. Sonar Image Recognition: Yutong and Khishe (2021) utilized a fuzzy SMA for real-time sonar image recognition, combining deep convolutional neural networks with SMA.[19]

15. Job Shop Scheduling: Wei and Othman (2022) applied an equilibrium optimizer and SMA with variable neighborhood search for solving job shop scheduling problems, enhancing scheduling efficiency.[20]

16. Wireless Sensor Networks: Prabhu et al. (2023) used SMA for fuzzy linear CFO estimation in wireless sensor networks, improving network data accuracy.[21]

17. Optimal Power Flow Problems: Al-Kaabi and Dumbrava (2022) applied SMA for solving single and multi-objective optimal power flow problems with a Pareto front approach, focusing on high voltage grids.[22]

18. Review and Comparative Analysis: Chen and Li (2023) provided a comprehensive review of recent SMA variants and their applications, offering insights into the algorithm's evolution and use.[23]

19. Ancient Glass Classification: Guo and Zhan (2023) integrated SMA with Support Vector Machine algorithms for classifying ancient glass, enhancing classification accuracy.[24]

20. Employment Stability Prediction: Gao and Liang (2022) applied a multi-population enhanced SMA to predict postgraduate employment stability, providing valuable insights into job market trends.[25]

21. Real-World Optimization Problems: Örnek and Aydemir (2022) introduced an enhanced SMA for global optimization and real-world engineering problems, demonstrating its practical applicability.[26]

22. Fuzzy Systems for Real-Time Recognition: Yutong and Khishe (2021) explored a fuzzy SMA for real-time sonar image recognition, integrating extreme learning machines for improved performance.[27]

These applications highlight the broad scope of SMA and its ability to address various complex optimization challenges across different domains.


References

Template:Reflist