This project implements advanced optimization algorithms for regularized logistic regression, including Gradient Descent, Conjugate Gradient, BFGS, ARC, and Cubic Newton, in a modular, production-grade Python package.
- Modular, extensible codebase
- Robust error handling and logging
- Statistical analysis and plotting utilities
- Ready for research and enterprise use
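All of the optimizers target the same regularized logistic regression objective. For reference, a standard ℓ2-regularized formulation is sketched below, together with the cubic-regularized local model that ARC and Cubic Newton minimize at each iteration; the exact regularizer and scaling used in data.py and experiment.py may differ:

$$
f(w) = \frac{1}{n}\sum_{i=1}^{n}\log\bigl(1 + e^{-y_i x_i^{\top} w}\bigr) + \frac{\lambda}{2}\lVert w\rVert_2^2,
\qquad
m_k(s) = f(w_k) + \nabla f(w_k)^{\top}s + \tfrac{1}{2}\, s^{\top}\nabla^2 f(w_k)\, s + \frac{\sigma_k}{3}\lVert s\rVert^3.
$$

The repository is laid out as follows: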
cubic_regularization/
│
├── src/
│   ├── cubic_regularization/
│   │   ├── __init__.py
│   │   ├── data.py
│   │   ├── optimizers/
│   │   │   ├── __init__.py
│   │   │   ├── base.py
│   │   │   ├── gradient_descent.py
│   │   │   ├── conjugate_gradient.py
│   │   │   ├── bfgs.py
│   │   │   ├── arc.py
│   │   │   └── cubic_newton.py
│   │   ├── experiment.py
│   │   ├── utils.py
│   │   └── logging_config.py
│   └── main.py
│
├── tests/
│   ├── __init__.py
│   ├── test_data.py
│   ├── test_optimizers.py
│   └── test_experiment.py
│
├── requirements.txt
├── .gitignore
├── README.md
└── setup.py
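A minimal usage sketch is shown below, assuming the package is importable from src/ (see the PYTHONPATH note at the end of this README). The module paths follow the layout above, but load_dataset, CubicNewton, reg_lambda, and minimize are illustrative guesses rather than the package's confirmed API:

```python
# Hypothetical usage sketch: module paths follow the project layout, but
# load_dataset, CubicNewton, reg_lambda and minimize are assumed names,
# not the package's confirmed API.
from cubic_regularization.data import load_dataset                    # assumed helper in data.py
from cubic_regularization.optimizers.cubic_newton import CubicNewton  # assumed class in cubic_newton.py

X, y = load_dataset("a9a")           # LIBSVM-format training file in the project root
opt = CubicNewton(reg_lambda=1e-3)   # assumed constructor argument: regularization strength
w, history = opt.minimize(X, y)      # assumed return values: solution and per-iteration stats
```

To set up and run the experiments: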
- Create and activate a virtual environment:
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install dependencies:
pip install -r requirements.txt
- Run experiments:
python -m src.main
- Add your data files (e.g., a9a, a9a.t) to the project root; a loading sketch follows this list.
- Run tests with pytest or unittest (an illustrative example of such a test is sketched below):
pytest tests/
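The a9a and a9a.t files are in LIBSVM (svmlight) format. Below is a minimal loading sketch using scikit-learn, assuming it is among the dependencies; data.py may implement its own loader:

```python
# Sketch: load the LIBSVM-format a9a files with scikit-learn (assumed to be
# installed; data.py may load the data differently).
from sklearn.datasets import load_svmlight_file

X_train, y_train = load_svmlight_file("a9a")  # sparse CSR features, labels in {-1, +1}
# Files loaded separately can report different feature counts, so pin n_features.
X_test, y_test = load_svmlight_file("a9a.t", n_features=X_train.shape[1])
```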
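For a sense of what the test suite might check, here is a self-contained example in the spirit of tests/test_optimizers.py; it is not taken from the project, just a finite-difference gradient check for an ℓ2-regularized logistic loss that pytest would collect:

```python
# Illustrative standalone test (not from the project's suite): verify an analytic
# gradient of the l2-regularized logistic loss against central finite differences.
import numpy as np


def loss_and_grad(w, X, y, lam):
    """l2-regularized logistic loss and its gradient; labels y are +/-1."""
    z = y * (X @ w)
    loss = np.mean(np.log1p(np.exp(-z))) + 0.5 * lam * (w @ w)
    grad = -(X.T @ (y / (1.0 + np.exp(z)))) / len(y) + lam * w
    return loss, grad


def test_gradient_matches_finite_differences():
    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 5))
    w = rng.standard_normal(5)
    y = np.where(rng.standard_normal(20) > 0, 1.0, -1.0)
    lam, eps = 0.1, 1e-6
    _, grad = loss_and_grad(w, X, y, lam)
    for j in range(5):
        e = np.zeros(5)
        e[j] = eps
        f_plus, _ = loss_and_grad(w + e, X, y, lam)
        f_minus, _ = loss_and_grad(w - e, X, y, lam)
        assert abs((f_plus - f_minus) / (2 * eps) - grad[j]) < 1e-4
```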
If you want to run the experiment from the project root (not inside src/), set the PYTHONPATH so Python can find the package:
On Windows PowerShell:
$env:PYTHONPATH = ".\src"; python -m src.main

On Linux/macOS:
PYTHONPATH=./src python -m src.main

License: MIT