We propose the concept of the rate entropy function to define a generalized rate distortion function from the decoder side, replacing the traditional distortion constraint with a constraint on decoder uncertainty. Although the rate entropy function is defined as a constrained variational problem of mutual information, a closed-form solution can be found by constructing a special solution. We propose four methods for constructing such special solutions, namely the entropy-invariance criterion, the independent-error criterion, the regeneration criterion, and the weak-regeneration criterion. Accordingly, closed-form expressions of the rate entropy function are derived for common probability distributions, including the uniform distribution, the vector Gaussian distribution, and distributions with the regeneration and weak-regeneration properties. Entropy distortion and entropy power distortion generalize mean-square distortion (a second-order statistic) and absolute-value distortion (a first-order statistic), respectively, and are therefore more broadly applicable. The concept of the rate entropy function resolves the rate distortion function problem for the currently known common sources, enriches and develops Shannon's theory of the rate distortion function, and has both theoretical significance and practical value in source coding.
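To make the constraint replacement concrete, the following is a minimal sketch (not necessarily the paper's exact formulation) contrasting the classical rate distortion function with one plausible formalization of the rate entropy function, under the assumption that decoder uncertainty is measured by the conditional entropy $H(X\mid\hat{X})$:
\[
R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\;\mathbb{E}\,[d(X,\hat{X})]\,\le\, D} I(X;\hat{X}),
\qquad
R_E(H) \;=\; \min_{p(\hat{x}\mid x)\,:\; H(X\mid\hat{X})\,\le\, H} I(X;\hat{X}).
\]
Under this assumed constraint, the identity $I(X;\hat{X}) = H(X) - H(X\mid\hat{X})$ for a discrete source immediately yields the lower bound $R_E(H) \ge H(X) - H$, illustrating how a closed-form expression can follow once a special conditional distribution is constructed that meets the uncertainty constraint with equality.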