Gaussian Mixture Models (GMMs) are probabilistic models used in machine learning to describe a dataset as a combination of multiple Gaussian (normal) distributions. Each component captures a subpopulation within the data, enabling tasks like clustering and density estimation. GMMs assume that each data point is drawn from one of several overlapping Gaussian components, each with its own mean, covariance, and mixing weight, providing a flexible way to model complex, multimodal data structures.
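
As a minimal sketch of these two uses, the snippet below fits a two-component mixture with scikit-learn's `GaussianMixture` on synthetic data (both the dataset and the parameter choices are illustrative assumptions, not part of the original text), then uses the fitted model for clustering and for density estimation.

```python
# A minimal sketch, assuming NumPy and scikit-learn are available.
# The dataset is synthetic and purely illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Two synthetic subpopulations drawn from different Gaussians.
data = np.vstack([
    rng.normal(loc=-2.0, scale=0.5, size=(200, 2)),
    rng.normal(loc=3.0, scale=1.0, size=(200, 2)),
])

# Fit a two-component GMM; each component learns a mean, covariance,
# and mixing weight (fitted internally via expectation-maximization).
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(data)

# Clustering: assign each point to its most likely component.
labels = gmm.predict(data)

# Density estimation: log-likelihood of new points under the mixture.
log_density = gmm.score_samples(np.array([[0.0, 0.0], [3.0, 3.0]]))

print("Mixing weights:", gmm.weights_)
print("Component means:\n", gmm.means_)
print("Log-density at test points:", log_density)
```

Because each point receives a probability under every component rather than a hard assignment, the same fitted model supports soft clustering (`predict_proba`) as well as evaluating how likely new observations are under the learned multimodal density.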