A reward function in reinforcement learning specifies the immediate scalar feedback an agent receives after taking an action in a given state, commonly written R(s, a) or R(s, a, s'). It guides learning by quantifying the desirability of outcomes, steering the agent toward actions that maximize cumulative (often discounted) reward over time. The reward function is therefore central to shaping agent behavior and decision-making strategies.
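As a minimal sketch of these ideas, the Python below defines a hypothetical reward function for a small grid world and a helper that sums discounted rewards over a trajectory. All names (`grid_reward`, `GOAL`, `PIT`, `discounted_return`) and the specific reward values are illustrative assumptions, not taken from any particular library.

```python
# Illustrative grid-world reward function (all names and values are assumptions).
GOAL = (2, 2)  # reaching this cell yields a large positive reward
PIT = (1, 1)   # falling into the pit is penalized

def grid_reward(state, action, next_state):
    """Immediate reward R(s, a, s') for one transition in the grid world."""
    if next_state == GOAL:
        return 1.0    # desirable outcome: positive feedback
    if next_state == PIT:
        return -1.0   # undesirable outcome: negative feedback
    return -0.01      # small per-step cost encourages shorter paths

def discounted_return(rewards, gamma=0.9):
    """Cumulative discounted reward: sum of gamma**t * r_t over a trajectory."""
    return sum(gamma**t * r for t, r in enumerate(rewards))
```

Because the agent maximizes `discounted_return`, the small negative step cost makes longer routes to `GOAL` strictly less valuable, which is how the reward function encodes a preference for efficient behavior.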