Mini-Batch Gradient Descent is an optimization algorithm that combines the stable gradient estimates of full-batch gradient descent with the computational speed of stochastic gradient descent. It updates model parameters by computing the gradient on small, randomly selected subsets (mini-batches) of the training data. Because each update is cheap yet less noisy than a single-sample update, the method balances convergence stability against cost per step, making it the standard choice for training neural networks and other large-scale machine learning models.
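
The sketch below illustrates the idea on a simple linear-regression problem using NumPy. The synthetic data, learning rate, batch size, and epoch count are illustrative assumptions, not values taken from the text; the point is the per-epoch shuffle and the gradient computed on each mini-batch rather than on the full dataset.

```python
# Minimal sketch of mini-batch gradient descent for linear regression.
# All hyperparameters and data below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus noise (assumed).
X = rng.uniform(-1, 1, size=(1000, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.1, size=1000)

# Append a bias column so the model is y_hat = X_b @ w.
X_b = np.hstack([X, np.ones((X.shape[0], 1))])
w = np.zeros(X_b.shape[1])

learning_rate = 0.1   # assumed step size
batch_size = 32       # assumed mini-batch size
n_epochs = 50

for epoch in range(n_epochs):
    # Shuffle once per epoch so each mini-batch is a random subset.
    perm = rng.permutation(len(y))
    for start in range(0, len(y), batch_size):
        idx = perm[start:start + batch_size]
        X_batch, y_batch = X_b[idx], y[idx]

        # Gradient of the mean squared error on this mini-batch only.
        error = X_batch @ w - y_batch
        grad = 2.0 * X_batch.T @ error / len(idx)

        # Parameter update uses the small-batch gradient estimate.
        w -= learning_rate * grad

print(w)  # should approach [3.0, 2.0] on the synthetic data above
```

Choosing the batch size trades off the two extremes: a batch of 1 recovers stochastic gradient descent (fast but noisy updates), while a batch equal to the dataset size recovers full-batch gradient descent (smooth but expensive updates).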