Particle Swarm Optimisation

Author

Enrico Schumann

Keywords

Particle Swarm, population-based methods

Review Status

Unreviewed

The Algorithm

Particle Swarm Optimisation (PS) is a population-based method, originally proposed for the optimisation of continuous objective functions [1]. While the metaphor behind other population-based methods (such as Differential Evolution or Genetic Algorithms) is evolution, the narrative for PS is based on flocks of birds searching for food.

The initial population consists of $n_P$ solutions, represented by real-valued vectors. In each iteration, each solution is updated by adding another vector called the velocity $v_i$. This velocity changes over the course of the optimisation. More specifically, at the start of each iteration, the directions towards the best solution found so far by the particular solution, $\mathit{Pbest}_i$, and towards the best overall solution, $\mathit{Pbest}_{\mathit{gbest}}$, are determined. These two directions (the differences between the respective vectors; see Line 7 in the pseudocode below) are each perturbed by multiplication with a uniform random number $u$ and a constant ($c_1$ and $c_2$, respectively). Their sum is added to the previous $v_i$, and the resulting updated velocity is added to the respective solution. In some implementations, the velocities are damped in every generation by setting the inertia parameter $\delta$ to a value smaller than unity.

(1)
\begin{eqnarray}
{\scriptscriptstyle \phantom{0}1:\quad}&& \text{Initialise parameters } n_{\mathrm{P}}, n_{\mathrm{G}}, \delta, c_{1} \text{ and } c_{2}\\
{\scriptscriptstyle \phantom{0}2:\quad}&& \text{Initialise particles } P^{(0)}_i \text{ and velocities } v^{(0)}_i,\ i=1,\ldots,n_{\mathrm{P}}\\
{\scriptscriptstyle \phantom{0}3:\quad}&& \text{Evaluate objective function } F_i = f(P^{(0)}_i),\ i=1,\ldots,n_{\mathrm{P}}\\
{\scriptscriptstyle \phantom{0}4:\quad}&& \mathit{Pbest} = P^{(0)},\ \mathit{Fbest} = F,\ \mathit{Gbest} = {\textstyle \min_i(F_i)},\ \mathit{gbest} = {\textstyle \operatornamewithlimits{argmin}_{i}(F_i)}\\
{\scriptscriptstyle \phantom{0}5:\quad}&& \text{\textbf{for }} k = 1 \text{ to } n_{\mathrm{G}}\\
{\scriptscriptstyle \phantom{0}6:\quad}&& \quad \text{\textbf{for }} i = 1 \text{ to } n_{\mathrm{P}}\\
{\scriptscriptstyle \phantom{0}7:\quad}&& \quad\quad \vartriangle\! v_i = c_{1}\times u \times(\mathit{Pbest}_i - P^{(k-1)}_i) + c_{2}\times u \times(\mathit{Pbest}_{\mathit{gbest}} - P^{(k-1)}_i)\\
{\scriptscriptstyle \phantom{0}8:\quad}&& \quad\quad v^{(k)}_i = \delta v^{(k-1)}_i + \vartriangle\! v_i\\
{\scriptscriptstyle \phantom{0}9:\quad}&& \quad\quad P^{(k)}_i = P^{(k-1)}_i + v^{(k)}_i\\
{\scriptscriptstyle 10:\quad}&& \quad \text{\textbf{end for}}\\
{\scriptscriptstyle 11:\quad}&& \quad \text{Evaluate objective function } F_i = f(P^{(k)}_i),\ i=1,\ldots,n_{\mathrm{P}}\\
{\scriptscriptstyle 12:\quad}&& \quad \text{\textbf{for }} i = 1 \text{ to } n_{\mathrm{P}}\\
{\scriptscriptstyle 13:\quad}&& \quad\quad \textbf{if } F_i < \mathit{Fbest}_i \textbf{ then } \mathit{Pbest}_i = P^{(k)}_i \text{ and } \mathit{Fbest}_i = F_i\\
{\scriptscriptstyle 14:\quad}&& \quad\quad \textbf{if } F_i < \mathit{Gbest} \textbf{ then } \mathit{Gbest} = F_i \text{ and } \mathit{gbest} = i\\
{\scriptscriptstyle 15:\quad}&& \quad \text{\textbf{end for}}\\
{\scriptscriptstyle 16:\quad}&& \text{\textbf{end for}}
\end{eqnarray}


References
1. Eberhart, R. C. and J. Kennedy (1995). A New Optimizer Using Particle Swarm Theory. In: Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, pp. 39–43.

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License