Dear authors,
Thank you for your inspiring work! I have a few questions regarding the reproducibility of your CIFAR-10 results:
- I followed the pipeline using the commands below to train the model, select the forget set, and perform unlearning. However, I was unable to reproduce the Retrain (worst) result with 0% unlearning accuracy (UA) for 10% forgetting, as shown in Figure 2 of your paper. The best result I obtained was 94.755% accuracy on the forget set, corresponding to approximately 5% UA. Could you kindly share the hyperparameters used to achieve the reported results?
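For clarity, this is the arithmetic behind the ~5% figure (a minimal sketch; the helper name is mine, and I am assuming UA is defined as 100% minus accuracy on the forget set, both in percent):

```python
def unlearning_accuracy(forget_acc_pct: float) -> float:
    """Unlearning accuracy (UA) as the error rate on the forget set,
    assuming UA = 100 - forget-set accuracy (percent)."""
    return 100.0 - forget_acc_pct

# 94.755% forget-set accuracy -> about 5.245% UA, i.e. roughly 5%
print(unlearning_accuracy(94.755))
```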
The commands that I used:
- Training:
```shell
python main_train.py --dataset cifar10 --train_seed 42 --seed 42 --save_dir "results" --epochs 182
```
- Select the worst-case forget set (20 epochs for the upper-level optimization, 10 epochs for the lower-level optimization):
```shell
python main_selmu.py --dataset cifar10 --train_seed 42 --cp_path "results/0model_SA_best.pth.tar" --num_indexes_to_replace 4500 --seed 42 --unlearn w_FT --save_dir "results/select_10pct" --w_lr 1e-3 --theta_lr 1e-3 --gamma 1e-4 --unlearn_steps 10 --select_epochs 20
```
- Unlearn the worst-case forget set via retraining:
```shell
python main_evalmu.py --dataset cifar10 --train_seed 42 --cp_path "results/0model_SA_best.pth.tar" --num_indexes_to_replace 4500 --seed 42 --unlearn retrain --unlearn_steps 182 --theta_lr 0.1 --save_dir "results/select_10pct/worst_case_eval" --w_path "results/select_10pct/select_weight.pth.tar"
```
- I also tested the forget-set indices mentioned here and was able to achieve 0% UA for 10% forgetting. This leads me to suspect that the issue lies in my forget-set selection configuration rather than in the retraining step.
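One sanity check along these lines is to measure how much my selected forget set overlaps with the reference indices (a hypothetical sketch: the helper and the toy index lists are mine, and I am assuming both forget sets can be extracted as plain integer index lists; the repo's checkpoint format may differ):

```python
def forget_set_overlap(mine, reference):
    """Jaccard overlap between two forget-set index collections:
    |intersection| / |union|, in [0, 1]."""
    a, b = set(mine), set(reference)
    return len(a & b) / len(a | b)

# Toy example: 2 shared indices out of 6 distinct ones -> 1/3 overlap.
print(forget_set_overlap([1, 2, 3, 4], [3, 4, 5, 6]))
```

A low overlap would confirm that my selection run converged to a different (and evidently easier-to-retrain) forget set than the reported one.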
I would greatly appreciate any guidance or clarification you could provide!