{% extends "layout.html" %} {% block content %}
Optimization (Gradient Descent): Gradient descent is an optimization algorithm that minimizes a model's loss function (its error) during training by repeatedly stepping in the direction of the negative gradient, the direction of steepest decrease. It is the mechanism behind the "gradient" in gradient boosting: each new weak learner is fit to the negative gradient of the loss (the pseudo-residuals), so the ensemble improves its predictions iteratively.
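As a rough illustration (not tied to any particular library), the short Python sketch below minimizes the toy loss f(w) = (w - 3)^2 by repeatedly stepping against its gradient. The starting point, learning rate, and iteration count are arbitrary example values chosen for clarity.
<pre>
# Minimal gradient descent sketch on the toy loss f(w) = (w - 3)^2.
# learning_rate and the iteration count are illustrative, not tuned values.

def gradient(w):
    # Derivative of (w - 3)^2 with respect to w.
    return 2 * (w - 3)

w = 0.0                # starting guess
learning_rate = 0.1    # step size
for step in range(100):
    w -= learning_rate * gradient(w)   # move against the gradient

print(w)  # converges toward the minimizer w = 3
</pre>
In gradient boosting, the same idea is applied in "function space": instead of updating a parameter w, each boosting round adds a new tree that approximates the negative gradient of the loss with respect to the current predictions.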