Expected Calibration Error (ECE) quantifies how well a model's predicted probabilities align with actual outcomes. Predictions are grouped into confidence bins, and ECE is the average, weighted by bin size, of the absolute difference between each bin's mean predicted confidence and its observed accuracy. Lower ECE indicates better calibration, meaning the predicted probabilities accurately reflect true likelihoods: among predictions made with, say, 70% confidence, roughly 70% should be correct.
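
Concretely, with M bins B_1, …, B_M over n predictions, ECE = Σ_{m=1}^{M} (|B_m| / n) · |acc(B_m) − conf(B_m)|, where acc(B_m) is the accuracy and conf(B_m) the mean predicted confidence within bin B_m. Below is a minimal Python sketch of that computation using ten equal-width bins; the function name, argument layout, and binning scheme are illustrative assumptions, not taken from the source:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE with equal-width bins: the bin-size-weighted average of
    |observed accuracy - mean predicted confidence| over the bins."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=bool)
    n = len(confidences)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for i in range(n_bins):
        lo, hi = edges[i], edges[i + 1]
        # Right-inclusive bins; the first bin also includes its left edge (0.0).
        in_bin = (confidences > lo) & (confidences <= hi)
        if i == 0:
            in_bin |= confidences == lo
        if in_bin.any():
            acc = correct[in_bin].mean()           # observed frequency of correct predictions
            avg_conf = confidences[in_bin].mean()  # mean predicted confidence in the bin
            ece += (in_bin.sum() / n) * abs(acc - avg_conf)
    return ece

# Example: four predictions from an overconfident model.
print(expected_calibration_error([0.95, 0.9, 0.85, 0.8],
                                 [True, False, True, False]))  # ≈ 0.4
```

In the example, the model predicts with an average confidence around 0.875 but is right only half the time, so the ECE is large; a well-calibrated model on the same inputs would score near zero.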