Iterative method to minimize error by adjusting model parameters. Moves in the direction that reduces the error the most (downhill).
Steady descent path - consistent but can be slow
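
A minimal sketch of this update, assuming a gradient function grad, a starting point theta, and a learning rate alpha (all names here are illustrative placeholders, not the course's code):

import numpy as np

def gradient_descent(grad, theta, alpha=0.1, steps=100):
    # Repeatedly step opposite the gradient, scaled by the learning rate.
    for _ in range(steps):
        theta = theta - alpha * grad(theta)  # move downhill
    return theta

# Example: minimize f(x) = x^2, whose gradient is 2x; iterates shrink toward 0.
theta_min = gradient_descent(lambda x: 2 * x, np.array([5.0]))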
Adds inertia to Gradient Descent for faster convergence. Helps avoid getting stuck in small dips by carrying momentum from previous steps.
Accelerated path - builds speed and can overshoot shallow valleys
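
One common way to write the momentum update, sketched with the same illustrative names (beta is the momentum coefficient, often around 0.9):

import numpy as np

def momentum(grad, theta, alpha=0.1, beta=0.9, steps=100):
    v = np.zeros_like(theta)        # velocity, carried between steps
    for _ in range(steps):
        v = beta * v + grad(theta)  # inertia from past steps plus the new gradient
        theta = theta - alpha * v
    return theta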
Adjusts the learning rate based on the steepness of the error surface. Takes smaller steps on steep slopes and larger steps on shallow slopes.
Adaptive path - adjusts step size based on terrain
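
A sketch of the RMSprop-style update under the same illustrative setup: a running average of squared gradients rescales each step, so consistently steep directions get smaller steps:

import numpy as np

def rmsprop(grad, theta, alpha=0.01, beta=0.9, eps=1e-8, steps=100):
    s = np.zeros_like(theta)  # running mean of squared gradients
    for _ in range(steps):
        g = grad(theta)
        s = beta * s + (1 - beta) * g**2
        theta = theta - alpha * g / (np.sqrt(s) + eps)  # smaller steps on steep slopes
    return theta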
Combines Momentum and RMSprop for efficient and stable learning. Adjusts step sizes and remembers past movements for smarter updates.
Optimal path - combines speed and adaptivity
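
A sketch of the standard Adam update, again with placeholder names: a momentum-style first moment plus an RMSprop-style second moment, each bias-corrected for the early steps:

import numpy as np

def adam(grad, theta, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8, steps=100):
    m = np.zeros_like(theta)  # first moment (momentum term)
    v = np.zeros_like(theta)  # second moment (RMSprop term)
    for t in range(1, steps + 1):
        g = grad(theta)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)  # bias correction: moments start at zero
        v_hat = v / (1 - beta2**t)
        theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta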
How each algorithm performs across different criteria
Simple parameter update with learning rate α
Adds velocity term with momentum β
Adapts learning rate based on gradient magnitude
Combines momentum and adaptive learning rates
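
Written out, the update rules these four captions describe look like this (standard formulations; the slide's own notation may differ slightly):

\begin{align*}
\text{Gradient Descent:}\quad & \theta \leftarrow \theta - \alpha\,\nabla_\theta J(\theta) \\
\text{Momentum:}\quad & v \leftarrow \beta v + \nabla_\theta J(\theta),\qquad \theta \leftarrow \theta - \alpha v \\
\text{RMSprop:}\quad & s \leftarrow \beta s + (1-\beta)\big(\nabla_\theta J(\theta)\big)^2,\qquad \theta \leftarrow \theta - \frac{\alpha}{\sqrt{s}+\epsilon}\,\nabla_\theta J(\theta) \\
\text{Adam:}\quad & m \leftarrow \beta_1 m + (1-\beta_1)\,\nabla_\theta J(\theta),\quad v \leftarrow \beta_2 v + (1-\beta_2)\big(\nabla_\theta J(\theta)\big)^2, \\
& \hat{m} = \frac{m}{1-\beta_1^t},\quad \hat{v} = \frac{v}{1-\beta_2^t},\qquad \theta \leftarrow \theta - \frac{\alpha\,\hat{m}}{\sqrt{\hat{v}}+\epsilon}
\end{align*}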
© 2025 Machine Learning for Health Research Course | Prof. Gennady Roshchupkin