This repository contains an academic implementation of Particle Swarm Optimization (PSO) for solving the Maximum Covering Problem (MCP), where the goal is to select exactly k subsets to maximize coverage of elements. The PSO algorithm is adapted to this discrete binary setting, and supports extensive experimentation with multiple PSO variants, selection types, mutation strategies, and performance analysis.
```
├── readme.md                 # Project overview and documentation
├── requirements.txt          # Python dependencies
├── analysis/                 # Jupyter notebooks for performance analysis
│   └── metah-result-analysis-PSO-MCP.ipynb
├── data/                     # MCP benchmark datasets (Set Cover Problem variants) from https://people.brunel.ac.uk/~mastjjb/jeb/orlib/scpinfo.html
│   ├── scp41.txt ... scp410.txt
│   ├── scpa1.txt ... scpa5.txt
│   ├── scpb1.txt ... scpb5.txt
│   └── scpc1.txt ... scpc5.txt
├── src/                      # Source code for algorithms and experiments
│   ├── demo.py               # Script to demo algorithms
│   ├── DFS.py                # Depth-First Search implementation
│   ├── DFSTest.py            # DFS testing script
│   ├── GreedyTest.py         # Greedy method test runner
│   ├── GridSearchPSO.py      # Grid search for tuning PSO parameters
│   ├── MPGridSearchPSO.py    # Grid search with multiprocessing support
│   ├── MaxCoveringProblem.py # MCP problem definition
│   ├── PSO.py                # Main PSO algorithm
│   ├── PSOTest.py            # Test script for PSO
│   └── Particle.py           # Particle class definition for PSO
├── stats/                    # Output metrics, logs, and experiment results
│   ├── dfs_1h.csv
│   ├── greedy.csv
│   ├── mutation.csv
│   ├── pso_results.csv
│   ├── standard_bpso_hdbpso.csv
│   ├── standard_whdbpso.csv
│   ├── stochastic_bpso_hdbpso.csv
│   └── stochastic_whdbpso.csv
```
- `MaxCoveringProblem.py`: Defines the MCP problem, parses the input files, and computes `k = floor(m / 20)`, the number of subsets to select (see the sketch below).
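
As an illustration of the objective being maximized, here is a minimal sketch of evaluating how many elements a choice of subsets covers. It is not the repository's API; `coverage` and the variable names are made up for this example.

```python
# Illustrative sketch (not the repository's API): count how many distinct
# elements are covered by a chosen set of subsets.
from typing import List, Set

def coverage(selected: List[int], subsets: List[Set[int]]) -> int:
    """Number of distinct elements covered by the selected subsets."""
    covered: Set[int] = set()
    for idx in selected:
        covered |= subsets[idx]
    return len(covered)

# Toy instance with m = 4 subsets over the elements {1, ..., 6};
# picking subsets 0 and 2 covers all six elements.
subsets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(coverage([0, 2], subsets))  # -> 6
```
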
- `PSO.py`: Implements Particle Swarm Optimization (PSO) for solving the MCP:
  - Supports multiple inertia strategies: `fixed`, `linear`, `nonlinear` (see the sketch below).
  - Checkpointing: saves progress to avoid re-running iterations.
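
A minimal sketch of what such inertia schedules commonly look like; the exact formulas, parameter names, and defaults in `PSO.py` may differ.

```python
# Illustrative inertia-weight schedules (assumed forms, not PSO.py's exact code).
def inertia(strategy: str, t: int, t_max: int,
            w_start: float = 0.9, w_end: float = 0.4, alpha: float = 2.0) -> float:
    if strategy == "fixed":
        return w_start
    if strategy == "linear":
        # Decrease linearly from w_start to w_end over the run.
        return w_start - (w_start - w_end) * t / t_max
    if strategy == "nonlinear":
        # Nonlinear decay; alpha controls how quickly the weight drops.
        return w_end + (w_start - w_end) * (1 - t / t_max) ** alpha
    raise ValueError(f"unknown inertia strategy: {strategy}")

print([round(inertia("linear", t, 100), 3) for t in (0, 50, 100)])     # [0.9, 0.65, 0.4]
print([round(inertia("nonlinear", t, 100), 3) for t in (0, 50, 100)])  # [0.9, 0.525, 0.4]
```
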
- `Particle.py`: Defines the particles used in PSO:
  - Initialization strategies: `random`, `greedy` (selects larger subsets first), ...
  - Distance metrics: `bitwise`, `Hamming`, `Weighted Hamming` (see the sketch below).
  - Selection types: `standard`, `stochastic`, `deterministic`.
  - Mutation: Bit Change Mutation (based on Lee et al., 2007; see References).
  - Particle variants: `ParticleRestructured` implements BRPSO (based on Zhu et al., 2023; see References), but it was neither discussed nor included in the experiments.
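
A minimal sketch of the distance metrics and a per-bit flip in the spirit of bit change mutation; the exact operators, signatures, and probabilities used in `Particle.py` (and in the cited paper) may differ.

```python
import random
from typing import List, Sequence

# Illustrative sketch, not Particle.py's actual methods.
def hamming(a: Sequence[int], b: Sequence[int]) -> int:
    """Plain Hamming distance: number of positions where the bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def weighted_hamming(a: Sequence[int], b: Sequence[int], weights: Sequence[float]) -> float:
    """Hamming distance where each differing position contributes its weight
    (for MCP, e.g. the size of the corresponding subset)."""
    return sum(w for x, y, w in zip(a, b, weights) if x != y)

def bit_change_mutation(position: Sequence[int], p_mut: float = 0.01) -> List[int]:
    """Flip each bit independently with probability p_mut."""
    return [1 - bit if random.random() < p_mut else bit for bit in position]
```
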
- `demo.py`: Demonstrates PSO, DFS (with a 30-second timeout), and Greedy on `scp41.txt`.
- `DFS.py`: Implements Depth-First Search (DFS).
- `GreedyTest.py`: Runs the Greedy algorithm.
- `GridSearchPSO.py`: Performs grid search for PSO tuning.
- `<x>Test.py`: Test scripts for the individual algorithms.
To set up the environment and install required dependencies, run:
```
pip install -r requirements.txt
```

To run the demo script, run:

```
python src/demo.py
```

The repository includes several benchmark datasets for the Maximum Covering Problem (MCP). These datasets are based on the Set Cover Problem (SCP) and are available in the `data/` directory. You can find the following datasets:
- `scp41.txt` to `scp410.txt`
- `scpa1.txt` to `scpa5.txt`
- `scpb1.txt` to `scpb5.txt`
- `scpc1.txt` to `scpc5.txt`
These datasets are derived from the OR-Library (Set Cover Problem instances); more details are available at https://people.brunel.ac.uk/~mastjjb/jeb/orlib/scpinfo.html.
The format of each file is as follows:
- The first number represents the total number of elements (`n`).
- The second number represents the total number of subsets (`m`).
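
A minimal parsing sketch, assuming the header layout described above and the `k = floor(m / 20)` rule from `MaxCoveringProblem.py`; `read_header` is an illustrative helper, not the repository's parser.

```python
# Illustrative header parser (assumed layout: "n m" on the first line).
from math import floor

def read_header(path: str) -> tuple:
    with open(path) as f:
        first = f.readline().split()
    n, m = int(first[0]), int(first[1])  # n elements, m subsets
    k = floor(m / 20)                    # number of subsets to select
    return n, m, k

# Example usage:
# n, m, k = read_header("data/scp41.txt")
```
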
The following references were used or inspired the development of the algorithms and methodologies in this project:
- J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of ICNN’95 - International Conference on Neural Networks, vol. 4, pp. 1942–1948, IEEE, 1995.
- W. A. L. W. M. Hatta, C. S. Lim, A. F. Z. Abidin, M. H. Azizan, and S. S. Teoh, “Solving maximal covering location with particle swarm optimization,” International Journal of Innovation, Management and Technology, vol. 4, no. 2, pp. 211–215, 2013.
- J. Kennedy and R. Eberhart, “A discrete binary version of the particle swarm algorithm,” in 1997 IEEE International Conference on Systems, Man, and Cybernetics. Computational Cybernetics and Simulation, vol. 5, pp. 4104–4108, 1997.
- J. Zhu, J. Liu, Y. Chen, X. Xue, and S. Sun, “Binary restructuring particle swarm optimization and its application,” Biomimetics, vol. 8, p. 266, June 2023.
- H. Banka and S. Dara, “A Hamming distance based binary particle swarm optimization (HDBPSO) algorithm for high dimensional feature selection, classification and validation,” Pattern Recognition Letters, vol. 52, pp. 94–100, Jan. 2015.
- S. Lee, H. Park, and M. Jeon, “Binary particle swarm optimization with bit change mutation,” IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E90-A, Oct. 2007.
- F. van den Bergh and A. P. Engelbrecht, “A study of particle swarm optimization particle trajectories,” Information Sciences, vol. 176, pp. 937–971, Apr. 2006.
- https://web2.qatar.cmu.edu/~gdicaro/15382-Spring18/hw/hw3-files/pso-book-extract.pdf. Accessed: 2025-04-02.
- S. J. Russell and P. Norvig, Artificial Intelligence: A Modern Approach, 3rd ed., Pearson, 2010. (For foundational AI concepts used in the problem-solving approaches).
The above references have significantly contributed to the development and understanding of the methods used in this project.