- Introduction to Python : Writing a Python file and using the Python interpreter; syntax for variables, functions, data types, control flow, and comprehensions.
- The Standard Library : Built-in functions, namespaces, `import` syntax, and a few standard library modules.
- Object-oriented Programming : Python classes, including attributes, methods, and inheritance.
- Introduction to NumPy : n-dimensional `numpy` arrays, data access, array broadcasting, and universal functions (a short broadcasting sketch follows this list).
- Introduction to Matplotlib : Creating line plots, histograms, scatter plots, heat maps, and contour maps with `matplotlib`; basic plot customization, including subplots, using the object-oriented `matplotlib` API.
- Exceptions and File I/O : Syntax for raising and handling exceptions, reading from and writing to files, and string formatting.
- Unit Testing : Introduction to the PyTest framework, including coverage tests; outline of the philosophy of test-driven development.
- Profiling : Why fast algorithms matter; the `%prun` profiler; common ways to speed up Python code, including `numba`.
- Introduction to SymPy : Using symbolic variables and expressions to solve equations, do linear algebra, and perform calculus.
- Data Visualization : Best practices for visualizing data, including tips for line plots, bar charts, histograms, scatter plots, and heat maps.
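As a quick illustration of the array broadcasting mentioned in the NumPy item above, here is a minimal sketch; the array shapes and values are arbitrary examples, not taken from any lab.

```python
import numpy as np

# A (3, 1) column broadcast against a (1, 4) row yields a (3, 4) grid.
col = np.arange(3).reshape(3, 1)      # shape (3, 1)
row = np.arange(4).reshape(1, 4)      # shape (1, 4)
grid = col + row                       # shape (3, 4), no explicit loops

# Universal functions apply elementwise and also broadcast.
centered = grid - grid.mean(axis=0)    # subtract each column's mean
print(grid)
print(centered)
```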
- Introduction to the Unix Shell — Learning core Unix commands for file manipulation, process management, and text processing; using pipes, redirection, and wildcards for automation.
- Unix Shell 2 — Advanced shell scripting with conditionals, loops, and environment variables; creating reusable and parameterized Bash scripts.
- SQL 1: Introduction — Fundamentals of Structured Query Language (SQL); creating, reading, updating, and deleting data in relational databases.
- SQL 2 (The Sequel) — Advanced SQL concepts including joins, subqueries, aggregate functions, views, and index optimization for performance.
- Regular Expressions — Pattern matching for text data cleaning and extraction; syntax for tokens, groups, quantifiers, and assertions across Python and shell tools.
- Web Scraping — Extracting structured information from the web using libraries such as BeautifulSoup, Selenium, or Scrapy; handling dynamic and paginated data.
- Pandas 1: Introduction — Data structures and indexing in Pandas (`Series`, `DataFrame`); data loading, cleaning, and basic transformations.
- Pandas 2: Plotting — Visualizing datasets using Pandas built-in plotting and Matplotlib; creating line, bar, histogram, and box plots.
- Pandas 3: Grouping — Data aggregation, filtering, and summarization using `groupby()`, `pivot_table()`, and multi-indexed data operations (a short grouping sketch follows this list).
- GeoPandas — Extending Pandas for geospatial data; working with shapefiles, coordinate reference systems (CRS), and performing spatial joins and overlays.
- Data Cleaning — Handling missing, inconsistent, and duplicate data; data type conversion, normalization, and schema validation using Pandas, NumPy, and regex.
- Intro to Parallel Computing — Understanding concepts of parallelism, concurrency, and distributed computing; using Python’s `multiprocessing` and `concurrent.futures` modules.
- Parallel Programming with MPI — Implementing distributed computing with MPI (Message Passing Interface); process communication, synchronization, and load balancing using `mpi4py`.
- Apache Spark — Working with big data using PySpark; RDDs, DataFrames, lazy evaluation, and transformations/actions for large-scale data analytics.
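A minimal sketch of the `groupby()` and `pivot_table()` workflow referenced in the Pandas 3 item above; the column names and records are invented for illustration.

```python
import pandas as pd

# Small made-up dataset: sales records by region and product.
df = pd.DataFrame({
    "region": ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "A", "B"],
    "sales": [100, 150, 200, 130, 90],
})

# Aggregate total and mean sales per region.
summary = df.groupby("region")["sales"].agg(["sum", "mean"])
print(summary)

# Pivot table: average sales by region and product.
pivot = df.pivot_table(values="sales", index="region", columns="product", aggfunc="mean")
print(pivot)
```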
- Linear Transformations : Matrix representations of linear and affine transformations; the computational cost of applying linear transformations via matrix multiplication.
- Linear Systems : Gaussian elimination, the LU decomposition, and an introduction to `scipy.linalg` and `scipy.sparse`.
- The QR Decomposition : QR via modified Gram-Schmidt and Householder reflections; using the QR decomposition to solve systems and compute determinants.
- Least Squares and Computing Eigenvalues : Solving the normal equations via QR; using least squares to fit lines, polynomials, circles, and ellipses; computing eigenvalues via the power method and the QR algorithm.
- Image Segmentation : Representing graphs with adjacency matrices; the Laplacian matrix and algebraic connectivity; working with images in Python; a graph-based image segmentation algorithm.
- The SVD and Image Compression : Computing the compact SVD (given an eigenvalue solver); using the truncated SVD for image compression (a short compression sketch follows this list).
- Facial Recognition : Representing a database of images as a single matrix; using the SVD to efficiently match images to the database.
- Differentiation : Comparison of symbolic differentiation (`sympy`), numerical differentiation (difference quotients), and differentiation packages (`autograd`).
- Newton's Method : Implementation of Newton's method in n dimensions, including backtracking; basins of attraction.
- Conditioning and Stability : Matrix condition numbers; conditioning of root finding and eigenvalue solvers; stability of least squares; catastrophic cancellation.
- Monte Carlo Integration : Integration in n dimensions with Monte Carlo sampling; convergence estimates using the Gaussian distribution as an example.
- Importance Sampling : Adjustments to Monte Carlo integration for small-volume or long-tail integrals.
- Visualizing Complex-valued Functions : Working with complex numbers in Python; counting zeros and poles of complex-valued functions.
- The PageRank Algorithm : The PageRank model; solving for PageRanks with various methods, including `networkx`; applications to NCAA basketball team rankings and Hollywood popularity rankings.
- The Drazin Inverse : Computing the Drazin inverse; applications to effective resistance and link prediction.
- Iterative Solvers : The Jacobi, Gauss-Seidel, and SOR methods for solving linear systems; application to Laplace's equation in 2 dimensions.
- The Arnoldi Iteration : Implementing the Arnoldi iteration; using Arnoldi to compute eigenvalues of arbitrary linear transformations; Ritz values and convergence.
- GMRES : Implementing GMRES with restarts; convergence properties of GMRES.
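A minimal sketch of the truncated-SVD compression idea from the SVD lab above, using `numpy.linalg.svd` on a random matrix as a stand-in for an image.

```python
import numpy as np

def truncated_svd_approx(A, k):
    """Best rank-k approximation of A via the SVD (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Stand-in for a grayscale image; in practice this would be loaded from a file.
rng = np.random.default_rng(0)
A = rng.random((200, 300))

A_k = truncated_svd_approx(A, k=20)
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"relative Frobenius error at rank 20: {rel_err:.3f}")
```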
- Linked Lists : Implementation of a `LinkedList` class with `append()`, `remove()`, and `insert()` methods; stacks, queues, and deques.
- Binary Search Trees : Recursion; implementation of a `BST` class with `insert()` and `remove()` methods; AVL trees.
- Nearest Neighbor Search : The nearest neighbor problem; using k-D trees to solve the nearest neighbor problem efficiently; constructing a k-nearest-neighbors classifier.
- Breadth-first Search : Adjacency dictionaries; breadth-first search for finding shortest paths; introduction to `networkx`; application to the "Six degrees of Kevin Bacon" problem (a minimal BFS sketch follows this list).
- Markov Chains : Transition matrices and simulating transitions; steady state distributions; application to random sentence generation.
- The Discrete Fourier Transform : Working with sound clips in Python; generating tones and chords; implementation of the fast discrete Fourier transform.
- Convolution and Filtering : Efficient circular and linear convolution via the FFT; using the DFT to identify and filter unwanted signal frequencies in sounds and images.
- Introduction to Wavelets : Haar wavelets; the discrete wavelet transform in one and two dimensions; applications to image cleaning and data compression.
- Polynomial Interpolation : Lagrange, Barycentric Lagrange, and Chebyshev interpolation; application to air quality data.
- Gaussian Quadrature : Integration with Gauss-Legendre and Gauss-Chebyshev quadrature, in one and two dimensions.
- One-dimensional Optimization : Golden section search; Newton's method for optimization in 1 dimension; the secant method.
- Newton and Quasi-Newton Methods : Newton's method of optimization in n dimensions; BFGS; the Gauss-Newton method.
- Gradient Descent Methods : The method of steepest descent; the conjugate gradient method; application to linear and logistic regression, including a binary logistic regression classifier.
- CVXOPT : Linear and quadratic programming using `cvxopt`, including l1- and l2-norm minimization, transportation models, and allocation models.
- Interior Point 1: Linear Programs : An interior point solver for linear programs with equality constraints; application to least absolute deviations.
- Interior Point 2: Quadratic Programs : An interior point solver for quadratic programs with inequality constraints; applications to elastic membrane theory and Markowitz portfolio optimization.
- Dynamic Programming : The marriage problem; cake eating problems.
- Policy Function Iteration : Value iteration and policy iteration to solve a deterministic Markov decision process.
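A minimal breadth-first-search sketch for the adjacency-dictionary item flagged above; the small graph is invented for illustration.

```python
from collections import deque

def shortest_path(adjacency, start, target):
    """Return one shortest path from start to target using BFS."""
    visited = {start}
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            return path
        for neighbor in adjacency.get(node, ()):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no path exists

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"], "E": []}
print(shortest_path(graph, "A", "E"))  # e.g. ['A', 'B', 'D', 'E']
```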
- Information Theory — Core concepts of information, entropy, and mutual information; relationship between coding and probability distributions.
- LSI and Scikit-learn — Implementing Latent Semantic Indexing for text data using `scikit-learn`; understanding vector space models and dimensionality reduction.
- K-Means Clustering — Unsupervised learning via centroid-based clustering; initialization, convergence, and evaluation with inertia and silhouette scores.
- Random Forests — Ensemble learning with decision trees; bagging, feature randomness, out-of-bag estimates, and model interpretability.
- Linear Regression — Fitting continuous target models using least squares; residual analysis and evaluation metrics like R² and RMSE.
- Logistic Regression — Modeling binary outcomes; sigmoid function, log-likelihood optimization, and decision boundary visualization.
- Naive Bayes — Probabilistic classification using conditional independence; Gaussian, Multinomial, and Bernoulli variants.
- Metropolis Algorithm — Monte Carlo sampling from complex probability distributions using acceptance–rejection criteria (a minimal sampler sketch follows this list).
- Gibbs Sampling and LDA — Implementing Gibbs sampling for Bayesian inference; applying Latent Dirichlet Allocation for topic modeling.
- Gaussian Mixture Models — Soft clustering via EM algorithm; learning parameters of multiple Gaussian components and visualizing latent clusters.
- Discrete Hidden Markov Models — Sequential modeling with hidden states; forward–backward algorithm and Viterbi decoding.
- Speech Recognition using CDHMMs — Applying context-dependent HMMs for phoneme-based speech modeling and decoding.
- Kalman Filter — Recursive estimation for linear dynamical systems; prediction and correction steps in state-space models.
- ARMA Models — Time-series modeling using Auto-Regressive and Moving Average components; parameter estimation and model diagnostics.
- Non-negative Matrix Factorization Recommender — Matrix factorization for collaborative filtering; minimizing reconstruction error for recommendation systems.
- Intro to Deep Learning and PyTorch — Neural network basics using `PyTorch`; tensors, automatic differentiation, and simple feed-forward models.
- Recurrent Neural Networks (RNNs) — Modeling sequential data with RNNs, GRUs, and LSTMs; training for text or time-series prediction tasks.
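A minimal random-walk Metropolis sketch for the sampler item flagged above, targeting a standard normal density; the proposal width and sample count are arbitrary choices.

```python
import numpy as np

def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1-D target density."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, p(proposal)/p(x)).
        if np.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples[i] = x
    return samples

# Target: standard normal density, up to a constant.
samples = metropolis(lambda x: -0.5 * x**2, x0=0.0, n_samples=5000)
print(samples.mean(), samples.std())   # roughly 0 and 1
```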
- Animations and 3D Plotting in Matplotlib — Creating dynamic visualizations and 3D surface plots using `matplotlib.animation` and `Axes3D`; visualizing time-dependent simulations and data.
- Intro to IVP and BVP — Introduction to solving Initial Value Problems (IVPs) and Boundary Value Problems (BVPs); fundamental concepts of ODEs and boundary conditions (an SIR-based `solve_ivp` sketch follows this list).
- Modeling the Spread of an Epidemic: SIR Models — Constructing and simulating Susceptible–Infected–Recovered (SIR) models to analyze epidemic dynamics and control measures.
- Numerical Methods for Initial Value Problems — Implementing Euler , Runge–Kutta , and Adams–Bashforth methods; analyzing stability and convergence.
- Predator–Prey Models — Simulating ecological interactions using the Lotka–Volterra equations; exploring equilibrium points and limit cycles.
- Lorenz Equations — Studying chaos and sensitivity to initial conditions using the Lorenz system ; visualizing strange attractors in 3D.
- Bifurcations and Hysteresis — Investigating nonlinear dynamics and transitions between equilibrium states; exploring saddle-node and Hopf bifurcations.
- The Finite Difference Method — Discretizing PDEs using grid-based approximations; solving Laplace, Poisson, and diffusion equations numerically.
- Wave Phenomena — Modeling and simulating 1D and 2D wave equations; understanding reflection, interference, and standing waves.
- Heat Flow — Solving the heat equation on spatial domains; exploring transient and steady-state solutions using finite differences.
- Anisotropic Diffusion — Implementing Perona–Malik diffusion for image smoothing while preserving edges; solving diffusion equations with direction-dependent coefficients.
- The Finite Element Method (FEM) — Introduction to weak formulations and element assembly; solving PDEs on arbitrary meshes.
- Poisson’s Equation — Numerical solution of the Poisson equation using FEM and FDM approaches; applying Dirichlet and Neumann boundary conditions.
- Spectral 1: Method of Mean Weighted Residuals — Deriving spectral approximations using orthogonal basis functions; implementing Galerkin and collocation methods.
- Spectral 2: A Pseudospectral Method for Periodic Functions — Applying Fourier-based pseudospectral methods for smooth periodic problems; exploring convergence behavior.
- Inverse Problems — Recovering hidden parameters from observed data; introducing regularization methods like Tikhonov and total variation.
- The Shooting Method for Boundary Value Problems — Reformulating BVPs as IVPs; applying iterative techniques like the secant method for boundary satisfaction.
- Total Variation and Image Processing — Implementing Total Variation (TV) minimization for image denoising and restoration; studying the ROF model.
- Transit Time Crossing a River — Modeling optimal trajectories under flow dynamics; solving variational problems using control-based methods.
- HIV Treatment Using Optimal Control — Applying Pontryagin’s Maximum Principle to determine optimal drug dosage strategies for HIV models.
- Solitons — Simulating Korteweg–de Vries (KdV) and Nonlinear Schrödinger equations; studying solitary wave propagation and stability.
- Obstacle Avoidance — Designing trajectory planning algorithms for dynamical systems with obstacles; optimization under constraints.
- The Inverted Pendulum — Modeling and stabilizing an inverted pendulum on a cart; implementing feedback control for balance maintenance.
- LQG (Linear–Quadratic–Gaussian) Control — Integrating optimal control with state estimation using Kalman filters and LQR design.
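A minimal sketch tying the IVP and SIR items above to `scipy.integrate.solve_ivp`; the rates and initial conditions are illustrative, not taken from the labs.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    """Classic SIR right-hand side: y = (S, I, R) as population fractions."""
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

beta, gamma = 0.3, 0.1          # illustrative infection and recovery rates
y0 = [0.99, 0.01, 0.0]          # initial susceptible, infected, recovered fractions
sol = solve_ivp(sir, (0, 160), y0, args=(beta, gamma), dense_output=True)

t = np.linspace(0, 160, 400)
S, I, R = sol.sol(t)
print("peak infected fraction:", I.max())
```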
A comprehensive course book combining differential equations, chaotic systems, bifurcation theory, nonlinear dynamics, and AI/neural models with Python.
Chapter 1 — Tutorial Introduction to Python
- 🐍 Using Python as a scientific computing tool: IDLE, Anaconda, and Spyder.
- Tutorials:
- Python as a calculator
- Basic scripting and control flow
- Plotting with Turtle
- Introduction to `SymPy`, `NumPy`, and `Matplotlib`
- Solving simple ODEs numerically
- Skills: Scientific scripting, plotting, symbolic algebra, numerical computing.
Chapter 2 — Differential Equations
- Linear, separable, exact, and homogeneous ODEs.
- Applications to:
- Chemical kinetics
- Electric circuits
- Existence and uniqueness theorem.
- Python Labs: Numerical ODE solvers (`scipy.integrate.solve_ivp`), visual phase plots.
- Mini-Project: Simulate RC and RL circuits; visualize transient dynamics.
Chapter 3 — Planar Systems
- Linear systems, canonical forms, and eigenanalysis.
- Phase portraits and linearization (Hartman–Grobman theorem).
- Python Labs: Phase-plane diagrams, stability classification (node, focus, saddle).
- Mini-Project: Visualize trajectories for coupled predator-prey systems.
Chapter 4 — Interacting Species
- Lotka–Volterra predator-prey and competition models.
- Stability, equilibrium points, and bifurcations.
- Python Labs: Interactive SIR and predator-prey simulations.
- Project: Modeling ecosystems and population competition.
Chapter 5 — Limit Cycles
- Existence and nonexistence of limit cycles, perturbation methods.
- Python Labs: Van der Pol oscillator simulation.
- Project: Analyze periodic biological rhythms or oscillatory circuits.
Chapter 6 — Hamiltonian Systems and Lyapunov Stability
- Hamiltonian systems in the plane.
- Lyapunov functions for nonlinear stability.
- Python Labs: Energy-conserving simulations and stability analysis.
- Project: Simulate the pendulum and analyze stability energy contours.
Chapter 7 — Bifurcation Theory
- Saddle-node, transcritical, pitchfork, and Hopf bifurcations.
- Normal forms, multistability, bistability.
- Python Labs: Parameter continuation and bifurcation plots.
- Project: Explore hysteresis and pattern formation.
Chapter 8 — 3D Autonomous Systems and Chaos
- Linear vs nonlinear systems, Lorenz and Rössler attractors.
- Chaotic dynamics and strange attractors.
- Python Labs: 3D trajectory plotting and Lyapunov exponent estimation.
- Project: Visualize Lorenz attractor and detect chaos transition.
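A minimal sketch of the Lorenz lab above, using the standard parameter values σ = 10, ρ = 28, β = 8/3.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Integrate with a small maximum step so the trajectory is resolved.
sol = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0], max_step=0.01)

ax = plt.figure().add_subplot(projection="3d")
ax.plot(*sol.y, lw=0.5)
ax.set_title("Lorenz attractor")
plt.show()
```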
Chapter 9 — Poincaré Maps and Nonautonomous Systems
- Poincaré section construction, 2-DOF Hamiltonian systems.
- Nonautonomous ODEs in the plane.
- Python Labs: Discrete map generation and orbit visualization.
- Project: Compute Poincaré maps for driven oscillators.
Chapter 10 — Local and Global Bifurcations
- Small-amplitude limit cycle bifurcations, Gröbner bases, Melnikov integrals.
- Homoclinic loop analysis.
- Python Labs: Symbolic algebraic bifurcation using SymPy.
- Project: Numerical study of homoclinic orbits.
Chapter 11 — Hilbert’s 16th Problem
- Liénard systems, global and local results.
- Poincaré compactification methods.
- Python Labs: Phase compactification visualization.
- Project: Analyze the Van der Pol–Liénard oscillator.
Chapter 12 — Delay Differential Equations
- Method of steps, biological and optical applications.
- Python Labs: Implement DDE solvers and plot delayed responses.
- Project: Gene regulation or optical feedback models.
Chapter 13 — Linear Discrete Dynamical Systems
- Recurrence relations, Leslie model, harvesting policies.
- Python Labs: Eigen decomposition for population dynamics.
- Project: Age-structured population forecasting.
Chapter 14 — Nonlinear Discrete Dynamical Systems
- Tent and logistic maps, Feigenbaum number, Hénon map.
- Python Labs: Bifurcation diagrams, chaos detection.
- Project: Map-based chaos visualization.
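A minimal bifurcation-diagram sketch for the logistic-map lab above; the number of transient and recorded iterates is arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

r_values = np.linspace(2.5, 4.0, 1500)
x = 0.5 * np.ones_like(r_values)

# Discard transients, then record the attractor for each parameter value.
for _ in range(500):
    x = r_values * x * (1 - x)

r_plot, x_plot = [], []
for _ in range(200):
    x = r_values * x * (1 - x)
    r_plot.append(r_values)
    x_plot.append(x.copy())

plt.plot(np.concatenate(r_plot), np.concatenate(x_plot), ",k", alpha=0.25)
plt.xlabel("r")
plt.ylabel("x")
plt.title("Logistic map bifurcation diagram")
plt.show()
```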
Chapter 15 — Complex Iterative Maps
- Julia sets, Mandelbrot set, Newton fractals.
- Python Labs: Generate fractals using complex iteration.
- Project: GPU-based fractal rendering.
Chapter 16 — Electromagnetic Waves and Optical Resonators
- Maxwell’s equations, optical chaos, nonlinear resonators.
- Python Labs: Wave propagation and bistability simulations.
- Project: Simulate optical feedback and pattern bistability.
Chapter 17 — Fractals and Multifractals
- Fractal construction, dimension calculation, multifractal spectra.
- Python Labs: Box-counting dimension computation.
- Project: Analyze fractal properties in real data (e.g., coastlines, turbulence).
Chapter 18 — Image Processing with Python
- Image representation as matrices, FFT, frequency filtering.
- Python Labs: Fourier transforms and denoising.
- Project: Edge detection and image compression using FFT.
Chapter 19 — Chaos Control and Synchronization
- Controlling chaos in logistic and Hénon maps.
- Synchronization of chaotic systems.
- Python Labs: Implement feedback control on chaotic maps.
- Project: Chaos synchronization between coupled Lorenz systems.
Chapter 20 — Neural Networks
- Introduction to neural modeling and learning.
- Delta rule, backpropagation, Hopfield networks, neurodynamics.
- Python Labs: Build simple ANN and Hopfield memory models.
- Project: Compare biological vs. artificial learning dynamics.
Chapter 21 — Binary Oscillator Computing
- Brain-inspired computation and oscillatory logic.
- Applications to neuromorphic systems.
- Python Labs: Binary oscillator network simulation.
- Project: Implement oscillatory threshold logic gates.
Project Ideas :
- Couple chaos + control + neural dynamics → chaotic neural oscillator control
- Connect image processing + fractals → multifractal texture analysis for materials
- Link bifurcation + epidemiology → SIR with bifurcations in infection rate
- Use DDEs in neural models → time-delayed Hopfield networks
🧩 Chapter 1 — Singular Value Decomposition (SVD)
- Python Labs:
- Implement matrix factorization using `numpy.linalg.svd` and visualize singular values.
- Low-rank matrix approximations and image compression using truncated SVD.
- Compute pseudo-inverse and least-squares regression using SVD decomposition.
- PCA implementation via SVD; compare variance explained across components.
- Eigenfaces mini-project — reconstruct face images using top singular modes.
- Explore randomized SVD algorithms for large datasets (`sklearn.utils.extmath.randomized_svd`).
- Tensor decomposition on multi-dimensional data using CP and Tucker models.
- Projects:
- Build a face recognition pipeline using PCA/SVD.
- Implement real-time dimensionality reduction for video frames.
- Compare truncated vs randomized SVD in high-dimensional datasets.
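A minimal sketch of the PCA-via-SVD lab above on made-up data; the dataset shape and values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))   # made-up correlated data

# Center the data; the right singular vectors are then the principal directions.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained_variance = s**2 / (len(X) - 1)
explained_ratio = explained_variance / explained_variance.sum()
scores = Xc @ Vt.T          # data projected onto the principal components
print("variance explained per component:", np.round(explained_ratio, 3))
```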
🌊 Chapter 2 — Fourier and Wavelet Transforms
- Python Labs:
- Implement discrete Fourier transform (DFT) and FFT using `numpy.fft`.
- Visualize frequency spectra of signals; reconstruct from selected harmonics.
- Apply Fourier methods to solve PDEs (e.g., heat or wave equation).
- Build a spectrogram using the Gabor transform for time-frequency analysis.
- Compute Laplace transforms symbolically using `sympy`.
- Implement 1D and 2D wavelet decomposition using `pywt`.
- Apply 2D DFT and wavelet transforms for image filtering and denoising.
- Projects:
- Develop a mini sound analyzer with spectrogram visualization.
- Image compression using discrete wavelet transform.
- Solving diffusion equations in Fourier space — visualize frequency-domain dynamics.
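A minimal `numpy.fft` sketch in the spirit of the frequency-spectrum lab above; the signal frequencies and sampling rate are invented.

```python
import numpy as np
import matplotlib.pyplot as plt

fs = 500                                  # sampling rate in Hz (illustrative)
t = np.arange(0, 2, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

plt.plot(freqs, np.abs(spectrum) / len(signal))
plt.xlabel("frequency (Hz)")
plt.ylabel("amplitude")
plt.title("Spectrum with peaks near 50 Hz and 120 Hz")
plt.show()
```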
⚙️ Chapter 3 — Sparsity and Compressed Sensing
- Python Labs:
- Experiment with sparse signals and compression ratios using random projections.
- Reconstruct signals using compressed sensing algorithms (`L1` minimization via `cvxpy`).
- Demonstrate sparse regression with LASSO and ElasticNet.
- Implement robust PCA for background subtraction in videos.
- Sparse sensor placement lab using greedy or convex optimization.
- Projects:
- Design a compressed sensing pipeline for image recovery.
- Sparse representation and denoising in natural images.
- Build a regression model for sparse feature selection in high-dimensional data.
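A minimal l1-minimization (basis pursuit) sketch in the spirit of the compressed-sensing lab above, using `cvxpy`; the problem sizes and sparsity level are arbitrary.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k = 200, 60, 8                        # signal length, measurements, nonzeros

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)    # random measurement matrix
y = A @ x_true

# Basis pursuit: minimize ||x||_1 subject to A x = y.
x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])
prob.solve()

print("recovery error:", np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))
```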
🧩 Chapter 4 — Regression and Model Selection
- Python Labs:
- Implement linear and polynomial regression using `numpy` and `scikit-learn`.
- Solve over- and under-determined systems (`Ax=b`) via least-squares and pseudo-inverse.
- Apply gradient descent for nonlinear regression; visualize convergence.
- Cross-validation experiments for model selection (`KFold`, `train_test_split`).
- Compare model selection metrics: AIC, BIC, R² score.
- Pareto front visualization for multi-objective regression optimization.
- Projects:
- Fit real-world datasets (housing prices, stock prices) and evaluate models.
- Automated regression pipeline with cross-validation and metric reporting.
- Experiment with feature scaling and regularization to improve model robustness.
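A minimal `scikit-learn` sketch of the cross-validation lab above; the synthetic data and polynomial degrees are illustrative.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 120).reshape(-1, 1)
y = 0.5 * x[:, 0] ** 3 - x[:, 0] + rng.normal(scale=1.0, size=len(x))

# Compare under-, well-, and over-parameterized fits by cross-validated R^2.
for degree in (1, 3, 9):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    scores = cross_val_score(model, x, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))
    print(f"degree {degree}: mean CV R^2 = {scores.mean():.3f}")
```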
🌊 Chapter 5 — Clustering and Classification
- Python Labs:
- k-Means clustering on synthetic datasets; visualize clusters in 2D/3D.
- Hierarchical clustering with dendrograms; experiment with linkage methods.
- Gaussian mixture models and expectation-maximization algorithm.
- Implement linear discriminant analysis (LDA) and visualize decision boundaries.
- Support vector machines (SVM) with linear and RBF kernels; tune hyperparameters.
- Decision trees and random forest classification; feature importance analysis.
- Projects:
- Unsupervised customer segmentation using clustering on e-commerce data.
- Classification of handwritten digits (MNIST) using SVM and random forests.
- Compare clustering metrics (silhouette score, Davies-Bouldin index) across algorithms.
⚙️ Chapter 6 — Neural Networks and Deep Learning
- Python Labs:
- Build single-layer and multi-layer neural networks with `TensorFlow` or `PyTorch`.
- Implement backpropagation manually and with built-in frameworks.
- Train networks using stochastic gradient descent; visualize loss curves.
- Convolutional neural networks (CNNs) for image classification.
- Recurrent neural networks (RNNs) for sequential data (e.g., time series).
- Autoencoders for dimensionality reduction and denoising.
- Generative adversarial networks (GANs) for image generation experiments.
- Projects:
- Build an image classifier for CIFAR-10 or MNIST datasets.
- Time-series prediction using RNN or LSTM networks.
- Experiment with autoencoders for anomaly detection in sensor data.
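A minimal `PyTorch` training-loop sketch for the feed-forward lab above; the toy regression task and layer sizes are arbitrary.

```python
import torch
from torch import nn

torch.manual_seed(0)
X = torch.linspace(-1, 1, 200).unsqueeze(1)
y = X.pow(2) + 0.05 * torch.randn_like(X)      # noisy quadratic toy target

model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for epoch in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()                    # autograd computes all gradients
    optimizer.step()

print("final training loss:", loss.item())
```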
🧩 Chapter 7 — Data-Driven Dynamical Systems
- Python Labs:
- Implement Dynamic Mode Decomposition (DMD) on time-series or fluid flow data.
- Sparse Identification of Nonlinear Dynamics (SINDy) using `PySINDy`.
- Koopman operator approximation for nonlinear systems; visualize eigenfunctions.
- Data-driven reconstruction of trajectories from partial measurements.
- Projects:
- Analyze video or sensor datasets to extract dominant dynamic modes.
- Build a predictive model for nonlinear dynamics using SINDy.
- Compare linear and Koopman-based models for forecasting nonlinear systems.
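A minimal exact-DMD sketch for the lab above, applied to synthetic snapshots whose oscillation frequencies are known so the recovered eigenvalues can be checked.

```python
import numpy as np

def dmd(X, Xprime, r):
    """Exact DMD: eigenvalues/modes of the best-fit linear map Xprime ≈ A X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    U, s, Vt = U[:, :r], s[:r], Vt[:r, :]
    Atilde = U.conj().T @ Xprime @ Vt.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(Atilde)
    modes = Xprime @ Vt.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes

# Synthetic snapshots: two spatial modes oscillating at frequencies 2.3 and 2.8.
x = np.linspace(-5, 5, 128)
t = np.linspace(0, 4 * np.pi, 200)
dt = t[1] - t[0]
data = (np.outer(1 / np.cosh(x + 3), np.exp(2.3j * t))
        + np.outer(2 * np.tanh(x) / np.cosh(x), np.exp(2.8j * t)))

eigvals, modes = dmd(data[:, :-1], data[:, 1:], r=2)
print("recovered frequencies:", np.sort(np.angle(eigvals) / dt))  # ≈ [2.3, 2.8]
```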
🌊 Chapter 8 — Linear Control Theory
- Python Labs:
- Simulate closed-loop feedback control with proportional, integral, derivative (PID) controllers.
- Implement linear time-invariant (LTI) system simulations with state-space models.
- Controllability and observability experiments on example systems.
- Design Linear–Quadratic Regulator (LQR) controllers; evaluate performance.
- Kalman filter implementation for state estimation.
- Linear–Quadratic Gaussian (LQG) control simulations.
- Case study: inverted pendulum on a cart; implement stabilization.
- Projects:
- Build interactive LTI system simulator with feedback control tuning.
- Design LQR/LQG controllers for robotics or mechanical systems.
- Compare open-loop vs closed-loop performance and robustness under noise.
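A minimal LQR sketch for the control labs above, using a double integrator as a stand-in plant; the weights Q and R are arbitrary.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: x1' = x2, x2' = u.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.diag([1.0, 1.0])      # state penalty (illustrative)
R = np.array([[0.1]])        # control penalty (illustrative)

# Solve the continuous-time algebraic Riccati equation and form the gain K = R^{-1} B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

closed_loop = A - B @ K
print("LQR gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(closed_loop))
```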
⚙️ Chapter 9 — Balanced Models for Control
- Python Labs:
- Model reduction for large-scale LTI systems using balanced truncation.
- Compute controllability and observability Gramians; visualize singular values.
- Implement system identification from input-output datasets.
- Projects:
- Reduce a high-dimensional mechanical or electrical system to a lower-order model.
- Compare full vs reduced model performance in control simulations.
- Build a library of ROM-based controllers for fast simulation and optimization.
🧩 Chapter 10 — Data-Driven Control
- Python Labs:
- Implement Model Predictive Control (MPC) for linear and nonlinear systems.
- Nonlinear system identification from experimental or simulated data.
- Machine learning-based controllers using regression or neural networks.
- Adaptive extremum-seeking control simulations for optimizing unknown systems.
- Projects:
- Design a data-driven controller for an inverted pendulum or cart-pole system.
- Compare MPC vs classical PID and LQR controllers on dynamic systems.
- Adaptive optimization of system performance using extremum-seeking algorithms.
🌊 Chapter 11 — Reinforcement Learning
- Python Labs:
- Implement Q-learning for discrete environments (e.g., Gridworld).
- Model-based optimization for control tasks using simple simulators.
- Deep reinforcement learning (DQN) for high-dimensional state spaces.
- Visualize reward landscapes and policy evolution over episodes.
- Projects:
- Train an RL agent to balance a pole or navigate a maze.
- Compare model-free vs model-based RL strategies on continuous systems.
- Apply deep RL for robotic arm control or path planning.
⚙️ Chapter 12 — Reduced-Order Models (ROMs)
- Python Labs:
- Implement Proper Orthogonal Decomposition (POD) on PDE solutions.
- Compute POD expansion and optimal basis elements.
- Use POD with symmetries (rotations, translations) in simulation data.
- Neural network-based time-stepping for POD-Galerkin reduced models.
- Combine DMD or SINDy with POD for efficient simulation.
- Projects:
- Build a reduced-order model for fluid flow or heat diffusion.
- Compare full-order vs ROM performance in simulation speed and accuracy.
- Implement neural-network enhanced ROM for parametric PDEs.
🧩 Chapter 13 — Interpolation for Parametric ROMs
- Python Labs:
- Implement Gappy POD for partial data reconstruction.
- Error analysis and convergence experiments for gappy measurements.
- Discrete Empirical Interpolation Method (DEIM) for nonlinear terms.
- Neural network decoders for interpolating parametric ROMs.
- Randomized compression techniques for high-dimensional ROM datasets.
- Projects:
- Reconstruct missing simulation data using gappy POD and DEIM.
- Build ML-based parametric ROM for varying system parameters.
- Test ROM interpolation accuracy across multiple scenarios.
🌊 Chapter 14 — Physics-Informed Machine Learning
- Python Labs:
- Implement SINDy autoencoders for learning coordinates and dynamics.
- Koopman operator-based forecasting from time-series data.
- Train physics-informed neural networks (PINNs) for PDE solutions.
- Deep learning for boundary value and coarse-grained PDE problems.
- Learn nonlinear operators using neural networks constrained by physics.
- Projects:
- Build a PINN to solve the heat or wave equation with boundary conditions.
- Forecast nonlinear dynamical systems using Koopman or SINDy models.
- Develop ML models that respect physical constraints in simulations.
🧩 Chapter 1 — Python Introduction
- Python Labs:
- Implement vector and matrix operations with `numpy`.
- Logic, loops, and iteration exercises; control flow and boolean indexing.
- Newton–Raphson method implementation for root finding.
- Function definitions, I/O interactions, and debugging Python scripts.
- Plotting with `matplotlib`; import/export CSV and text data.
- Mini-Projects:
- Solve nonlinear equations using Newton–Raphson with visualization.
- Build a small data analysis pipeline: read, process, plot, export results.
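A minimal Newton–Raphson sketch for the root-finding lab above; the example function and starting point are illustrative.

```python
def newton_raphson(f, fprime, x0, tol=1e-10, max_iter=50):
    """Find a root of f using Newton-Raphson iteration."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Example: root of x^3 - 2x - 5 near x = 2.
root = newton_raphson(lambda x: x**3 - 2 * x - 5, lambda x: 3 * x**2 - 2, x0=2.0)
print(root)   # ≈ 2.0945514815
```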
🌊 Chapter 2 — Linear Systems
- Python Labs:
- Direct solution of `Ax=b` using `numpy.linalg.solve`.
- Iterative methods (Jacobi, Gauss-Seidel) for linear systems.
- Gradient (steepest) descent for linear equations.
- Compute eigenvalues/eigenvectors with `numpy.linalg.eig`.
- Apply eigenmethods to face recognition (PCA-based).
- Solve simple nonlinear systems numerically.
- Mini-Projects:
- Implement face recognition using eigenfaces.
- Compare direct vs iterative solutions for large sparse matrices.
⚙️ Chapter 3 — Numerical Differentiation and Integration
- Python Labs:
- Numerical differentiation: finite difference schemes.
- Numerical integration: trapezoidal and Simpson’s rules.
- Apply differentiation/integration on discrete datasets.
- Differentiate noisy data and analyze error propagation.
- Mini-Projects:
- Compute derivatives and integrals of experimental datasets.
- Visualize accuracy of different numerical schemes.
🧩 Chapter 4 — Curve Fitting
- Python Labs:
- Least-squares fitting for linear and nonlinear curves.
- Polynomial fits and spline interpolation using `numpy` and `scipy`.
- Fit experimental data and compute residuals.
- Sparse methods for curve learning (LASSO, regularization).
- Mini-Projects:
- Fit noisy datasets and compare polynomial vs spline approximations.
- Build automated curve-fitting routines with plotting.
🌊 Chapter 5 — Basic Optimization
- Python Labs:
- Unconstrained optimization (derivative-free) using `scipy.optimize`.
- Gradient-based optimization and line search methods.
- Linear programming and the simplex method.
- Implement genetic algorithms for function optimization.
- Mini-Projects:
- Solve constrained and unconstrained optimization problems.
- Compare performance of derivative-free vs derivative-based methods.
⚙️ Chapter 6 — Advanced Curve Fitting and Machine Learning
- Python Labs:
- Use machine learning models (regression, trees) as curve fitters.
- Neural networks for nonlinear curve fitting (`PyTorch` or `TensorFlow`).
- Evaluate generalization, interpolation, and extrapolation performance.
- Mini-Projects:
- Fit complex datasets with neural networks; compare with classical methods.
- Build ML pipelines to predict and visualize unknown functions.
🧩 Chapter 7 — Visualization
- Python Labs:
- Customize 2D plots: colors, markers, legends, labels.
- Advanced 2D and 3D plotting with `matplotlib` and `mpl_toolkits.mplot3d`.
- Generate movies and animations for time-dependent data.
- Mini-Projects:
- Create animated simulations of dynamical systems.
- Visualize multi-dimensional datasets in 2D/3D for presentations.
🌊 Chapter 8 — Initial and Boundary Value Problems of Differential Equations
- Python Labs:
- Solve initial value problems using Euler, Runge–Kutta, and Adams methods.
- Perform error analysis for time-stepping routines.
- Implement advanced time-stepping algorithms for stiff ODEs.
- Solve boundary value problems using shooting and direct relaxation methods.
- Compute spectra using linear operators.
- Time-stepping with neural networks for ODEs and PDEs.
- Mini-Projects:
- Solve and visualize solutions for classical ODEs and boundary value problems.
- Implement shooting method for a second-order BVP and analyze convergence.
⚙️ Chapter 9 — Finite Difference Methods
- Python Labs:
- Implement finite difference discretization for PDEs.
- Apply iterative methods for solving `Ax=b` in discretized systems.
- Fast Poisson solvers using Fourier transforms.
- Compare direct, iterative, and spectral solution methods.
- Address computational difficulties: stability, convergence, and performance.
- Mini-Projects:
- Simulate heat and wave equations on 1D/2D grids using finite differences.
- Benchmark solver efficiency for large-scale linear systems.
🧩 Chapter 10 — Time and Space Stepping Schemes: Method of Lines
- Python Labs:
- Implement basic time-stepping schemes: explicit and implicit methods.
- Stability analysis of time-stepping schemes.
- Operator splitting techniques for multidimensional PDEs.
- Optimize computational performance for large-scale simulations.
- Mini-Projects:
- Solve advection-diffusion problems using method of lines.
- Compare stability and accuracy across explicit, implicit, and operator-splitting schemes.
🌊 Chapter 11 — Spectral Methods
- Python Labs:
- Fast Fourier Transform (FFT) and cosine/sine transforms.
- Implement Chebyshev polynomials and transforms.
- Pseudo-spectral methods with filtering techniques.
- Handle boundary conditions in spectral methods.
- Compute spectra using the Floquet–Fourier–Hill method.
- Mini-Projects:
- Simulate PDEs using spectral and pseudo-spectral methods.
- Compare spectral vs finite difference solutions for accuracy and efficiency.
⚙️ Chapter 12 — Finite Element Methods
- Python Labs:
- Construct finite element basis functions.
- Discretize PDEs with finite elements and boundary conditions.
- Implement linear system assembly and solve using FEM.
- Explore open-source FEM software (`FEniCS`, `SfePy`).
- Mini-Projects:
- Simulate 1D/2D PDEs (e.g., Poisson or heat equation) using FEM.
- Compare FEM results with finite difference and spectral solutions.
🧩 Chapter 13 — Statistical Methods and Their Applications
- Python Labs:
- Implement basic probability concepts and distributions in Python (`numpy`, `scipy.stats`).
- Work with random variables: mean, variance, covariance, correlation.
- Perform hypothesis testing and compute statistical significance.
- Mini-Projects:
- Analyze experimental or simulated datasets with statistical tests.
- Visualize probability distributions and confidence intervals.
🌊 Chapter 14 — Time–Frequency Analysis: Fourier Transforms and Wavelets
- Python Labs:
- Compute Fourier series and Fourier transforms of signals.
- Apply FFT for radar detection, filtering, and averaging.
- Perform windowed Fourier transforms for time-frequency analysis.
- Implement wavelet transforms and multi-resolution analysis with `pywt`.
- Generate spectrograms using the Gabor transform.
- Image processing and denoising using wavelets and diffusion.
- Explore compressive sensing techniques to circumvent Nyquist limits.
- Mini-Projects:
- Analyze audio or vibration signals in time-frequency domain.
- Denoise images and signals using wavelet-based filters.
- Implement compressive sensing to reconstruct undersampled signals.
⚙️ Chapter 15 — Matrix Decompositions
- Python Labs:
- Implement Singular Value Decomposition (SVD) on matrices and images.
- Apply SVD to PCA and proper orthogonal decomposition (POD).
- Robust PCA for separating low-rank and sparse components.
- Dynamic Mode Decomposition (DMD) for time-series and PDE data.
- Explore Koopman operators and randomized linear algebra for scalable SVD/DMD.
- Implement autoencoders and shallow recurrent decoders (SHRED) for nonlinear SVD.
- Mini-Projects:
- Image compression and reconstruction using SVD and robust PCA.
- Perform DMD on fluid flow or time-series data for modal analysis.
- Build autoencoder-based dimensionality reduction pipelines.
🧩 Chapter 16 — Independent Component Analysis (ICA)
- Python Labs:
- Explore the concept of independent components and statistical independence.
- Solve the image separation problem using ICA.
- Implement FastICA algorithm using `sklearn.decomposition.FastICA`.
- Mini-Projects:
- Separate mixed audio or image signals using ICA.
- Visualize independent components for real-world datasets.
🌊 Chapter 17 — Unsupervised Machine Learning
- Python Labs:
- Explore feature spaces and perform data mining.
- Implement clustering algorithms: k-means, hierarchical, DBSCAN.
- Evaluate clustering quality using silhouette score and other metrics.
- Mini-Projects:
- Cluster real-world datasets (e.g., images, sensor data) and visualize clusters.
- Compare performance of different unsupervised algorithms.
⚙️ Chapter 18 — Supervised Machine Learning
- Python Labs:
- Train classifiers for image recognition (e.g., dogs vs. cats).
- Apply SVD and Linear Discriminant Analysis (LDA) for dimensionality reduction and classification.
- Implement decision trees, random forests, SVMs, and neural networks.
- Mini-Projects:
- Build and compare multiple supervised models on a labeled dataset.
- Visualize decision boundaries and evaluate model performance metrics.
🧩 Chapter 19 — Reinforcement Learning
- Python Labs:
- Implement the mathematical architecture of reinforcement learning.
- Model Markov Decision Processes (MDP) for discrete environments.
- Apply policy optimization and value iteration techniques.
- Mini-Projects:
- Train an RL agent to navigate a gridworld or balance a pole.
- Compare model-free vs model-based approaches on simple tasks.
🌊 Chapter 20 — Spatio-Temporal Data and Dynamics
- Python Labs:
- Modal expansion techniques for PDEs.
- POD and robust PCA for extracting dominant spatio-temporal modes.
- Implement shallow recurrent decoders (SHRED) for sensing and PDEs.
- Sparse Identification of Nonlinear Dynamics (SINDy) for system discovery.
- Deep learning methods for time-space stepping and forecasting.
- Mini-Projects:
- Simulate PDEs and extract dominant modes using POD and SINDy.
- Predict spatio-temporal dynamics with recurrent neural networks.
⚙️ Chapter 21 — Data Assimilation Methods
- Python Labs:
- Implement basic data assimilation methods and Kalman filtering.
- Sampling-based data assimilation for dynamical systems.
- Apply assimilation techniques to the Lorenz system.
- Mini-Projects:
- Combine simulation and measurement data to improve forecasts.
- Test Kalman filter performance on noisy time-series datasets.
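A minimal linear Kalman-filter sketch matching the labs above, tracking a constant-velocity state from noisy position measurements; the model and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0
F = np.array([[1, dt], [0, 1]])      # constant-velocity state transition
H = np.array([[1.0, 0.0]])           # only the position is measured
Q = 1e-3 * np.eye(2)                 # process noise covariance (illustrative)
R = np.array([[0.5]])                # measurement noise covariance (illustrative)

true_x = np.array([0.0, 1.0])        # true position and velocity
x_est, P = np.zeros(2), np.eye(2)
for _ in range(50):
    # Simulate the system and a noisy measurement.
    true_x = F @ true_x
    z = H @ true_x + rng.normal(scale=np.sqrt(R[0, 0]), size=1)

    # Predict step.
    x_est = F @ x_est
    P = F @ P @ F.T + Q
    # Correct step.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + K @ (z - H @ x_est)
    P = (np.eye(2) - K @ H) @ P

print("true state:", true_x, " estimate:", x_est)
```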
Part V — Equation-Free Modeling, Complex Dynamical Systems, and Scientific Applications (Chapters 22–26)
🧩 Chapter 22 — Equation-Free Modeling
- Python Labs:
- Implement multi-scale physics simulations without explicit equations.
- Apply lifting and restricting operations in equation-free computing.
- Explore space–time dynamics using coarse-grained simulations.
- Mini-Projects:
- Simulate emergent behavior in multi-scale systems.
- Analyze reduced dynamics from fine-scale simulation data.
🌊 Chapter 23 — Complex Dynamical Systems
- Python Labs:
- Combine dimensionality reduction, compressed sensing, and machine learning for complex systems.
- Implement a basic dynamical systems library in Python.
- Flow around a cylinder: simulate and analyze as a prototypical example.
- Mini-Projects:
- Build a Python pipeline to simulate and visualize fluid flow.
- Extract reduced-order models and analyze system dynamics.
⚙️ Chapter 24 — Applications of Differential Equations and Boundary Value Problems
- Python Labs:
- Hodgkin–Huxley model simulation for neuronal dynamics.
- Celestial mechanics and three-body problem simulations.
- Solve Lorenz system for atmospheric motion analysis.
- Quantum mechanics and wavefunction evolution.
- Electromagnetic waveguide simulations.
- Mini-Projects:
- Visualize neuronal action potentials and parameter sensitivity.
- Simulate planetary motion and Lorenz attractor trajectories.
🧩 Chapter 25 — Applications of Partial Differential Equations
- Python Labs:
- Solve the wave equation in 1D/2D domains.
- Model mode-locked lasers and Bose–Einstein condensates.
- Advection–diffusion and atmospheric dynamics simulations.
- Reaction–diffusion systems and pattern formation.
- Steady-state flow over an airfoil using PDE solvers.
- Mini-Projects:
- Simulate wave propagation and visualize mode shapes.
- Model diffusion–reaction phenomena and analyze emergent patterns.
🌊 Chapter 26 — Applications of Data Analysis
- Python Labs:
- Analyze music scores using Gabor transforms.
- Image denoising through filtering and diffusion.
- Oscillating mass and dimensionality reduction analysis.
- Music genre identification using statistical and machine learning methods.
- Mini-Projects:
- Build pipelines to extract features from audio and image data.
- Implement data-driven classification of music genres or image patterns.