Navigate to the `Data/` directory to generate the Erdős–Rényi (ER) graphs required for the experiments.
For demonstration purposes, a pre-generated dataset is already included with:
- n = 200 nodes
- edge probability p = 0.8
- 1,000 instances

To test other configurations, generate your own dataset using the provided scripts. The demonstration below runs on this 200-node, p = 0.8 dataset.
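The exact interface of the generation scripts is not shown here; as a rough illustration (the function name and signature are ours, not the repository's), one Erdős–Rényi G(n, p) instance can be sampled like this:

```python
import random

def sample_er_graph(n, p, seed=None):
    """Sample one Erdős–Rényi G(n, p) graph as an edge list:
    each of the n*(n-1)/2 node pairs is kept independently with probability p."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]

# The demo dataset corresponds to n=200, p=0.8, repeated for 1,000 instances.
edges = sample_er_graph(n=200, p=0.8, seed=0)
```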
Execute the following command to generate structural information:

```bash
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 1 --n_iter 10 --hidden 256 --noise_scale 0.02 --p_edge 0.8 --cheb 0.06
```

Next, navigate to the `SearchWithMaxCliqueDyn/SearchCMDS` directory and run the following scripts:

```bash
./S200_RUNRandom.sh
./S200_RUNDegree.sh
./S200_RUNReorder.sh
```

These scripts evaluate three node orderings: Random, Degree-based, and Learned Reorder.
Results are saved in `Results/Logs/`. To compare the three ordering strategies, run:

```bash
python compareall_S200.py
```

This produces a summary comparing performance across all methods.
The commands below generate structural information for the full n = 200 sweep over edge probabilities 0.1–0.9:

```bash
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 3 --n_iter 10 --hidden 256 --noise_scale 0.01 --p_edge 0.1 --cheb 0.06
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 4 --n_iter 10 --hidden 256 --noise_scale 0.03 --p_edge 0.2 --cheb 0.06
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 4 --n_iter 10 --hidden 256 --noise_scale 0.05 --p_edge 0.3 --cheb 0.08
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 3 --n_iter 10 --hidden 256 --noise_scale 0.05 --p_edge 0.4 --cheb 0.08
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 3 --n_iter 10 --hidden 256 --noise_scale 0.01 --p_edge 0.5 --cheb 0.08
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 5 --n_iter 10 --hidden 256 --noise_scale 0.04 --p_edge 0.6 --cheb 0.08
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 5 --n_iter 10 --hidden 256 --noise_scale 0.04 --p_edge 0.7 --cheb 0.08
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 1 --n_iter 10 --hidden 256 --noise_scale 0.02 --p_edge 0.8 --cheb 0.06
python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 4 --n_iter 10 --hidden 256 --noise_scale 0.05 --p_edge 0.9 --cheb 0.08
```
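The nine n = 200 runs above differ only in `tau`, `noise_scale`, `p_edge`, and `cheb`, so they can be driven by parallel arrays. This wrapper script is our own convenience sketch, not part of the repository; it collects and prints the commands (a dry run) — pipe the output to `bash` to actually execute them:

```bash
# Per-p_edge hyperparameters, mirroring the command list above.
P_EDGE=(0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9)
TAU=(3 4 4 3 3 5 5 1 4)
NOISE=(0.01 0.03 0.05 0.05 0.01 0.04 0.04 0.02 0.05)
CHEB=(0.06 0.06 0.08 0.08 0.08 0.08 0.08 0.06 0.08)

CMDS=()
for i in "${!P_EDGE[@]}"; do
  CMDS+=("python InferenceBatched.py --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau ${TAU[$i]} --n_iter 10 --hidden 256 --noise_scale ${NOISE[$i]} --p_edge ${P_EDGE[$i]} --cheb ${CHEB[$i]}")
done

# Dry run: print each command instead of executing it.
printf '%s\n' "${CMDS[@]}"
```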
The corresponding commands for n = 100 graphs:

```bash
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 1 --n_iter 20 --noise_scale 0.05 --p_edge 0.1 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 2 --n_iter 20 --noise_scale 0.02 --p_edge 0.2 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 4 --n_iter 20 --noise_scale 0.03 --p_edge 0.3 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 5 --n_iter 20 --noise_scale 0.05 --p_edge 0.4 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 5 --n_iter 20 --noise_scale 0.04 --p_edge 0.5 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 5 --n_iter 20 --noise_scale 0.05 --p_edge 0.6 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 5 --n_iter 20 --noise_scale 0.03 --p_edge 0.7 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 3 --n_iter 20 --noise_scale 0.01 --p_edge 0.8 --cheb 0.2
python InferenceBatched.py --num_of_nodes 100 --sctorder 6 --gcnorder 3 --tau 3 --n_iter 20 --noise_scale 0.04 --p_edge 0.9 --cheb 0.2
```
Finally, a variant using `InferenceBatchedDummy.py`:

```bash
python InferenceBatchedDummy.py --p_edge_loaded 0.81 --num_of_nodes 200 --sctorder 6 --gcnorder 3 --tau 2 --n_iter 10 --hidden 256 --noise_scale 0.01 --p_edge 0.9 --cheb 0.06
```
Parts of the code are adapted from the following sources: