Bagging, short for Bootstrap Aggregating, is an ensemble method in machine learning that improves model stability and accuracy. It involves creating multiple training datasets through bootstrap sampling (random sampling with replacement), training a separate model on each, and then aggregating their predictions, typically by majority voting for classification or averaging for regression. Because the base models are trained on different resampled datasets, their individual errors tend to cancel out, which reduces variance and helps prevent overfitting, enhancing predictive performance.
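The following is a minimal sketch of the idea, not a production implementation: it assumes a synthetic binary-classification dataset from scikit-learn's `make_classification` and uses `DecisionTreeClassifier` as the base learner, with bootstrap indices drawn via NumPy.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic binary-classification data, purely for illustration
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

n_estimators = 25
models = []

# Train each base model on its own bootstrap sample
# (indices drawn uniformly at random, with replacement)
for _ in range(n_estimators):
    idx = rng.integers(0, len(X), size=len(X))
    models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Aggregate predictions by majority vote
# (for regression, one would average the predictions instead)
all_preds = np.stack([m.predict(X) for m in models])    # shape: (n_estimators, n_samples)
majority = (all_preds.mean(axis=0) >= 0.5).astype(int)  # works for binary 0/1 labels

print("Ensemble training accuracy:", (majority == y).mean())
```

In practice, a ready-made implementation such as scikit-learn's `BaggingClassifier` (or a random forest, which additionally randomizes the features considered at each split) would typically be used instead of hand-rolling the loop above.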