The year 1948 witnessed the historic moment of the birth of classic information theory (CIT). Guided by CIT, modern communication techniques have approached the theoretical limits, such as the entropy function $H(U)$, the channel capacity $C = \max_{p(x)} I(X;Y)$, and the rate-distortion function $R(D) = \min_{p(\hat{x}|x):\,\mathbb{E}d(x,\hat{x})\le D} I(X;\hat{X})$. Semantic communication paves a new direction for future communication techniques, whereas a guiding theory is still missing. In this paper, we attempt to establish a systematic framework of semantic information theory (SIT). We investigate the behavior of semantic communication and find that synonymy is its basic feature, so we define the synonymous mapping between semantic information and syntactic information. Stemming from this core concept of synonymous mapping, we introduce the measures of semantic information, such as the semantic entropy $H_s(\tilde{U})$, the up/down semantic mutual information $I^s(\tilde{X};\tilde{Y})$ ($I_s(\tilde{X};\tilde{Y})$), the semantic capacity $C_s = \max_{p(x)} I^s(\tilde{X};\tilde{Y})$, and the semantic rate-distortion function $R_s(D) = \min_{p(\hat{x}|x):\,\mathbb{E}d_s(\tilde{x},\hat{\tilde{x}})\le D} I_s(\tilde{X};\hat{\tilde{X}})$. Furthermore, we prove three coding theorems of SIT by using random coding and (jointly) typical decoding/encoding, namely, the semantic source coding theorem, the semantic channel coding theorem, and the semantic rate-distortion coding theorem. We find that the limits of SIT are extended by synonymous mapping, that is, $H_s(\tilde{U}) \le H(U)$, $C_s \ge C$, and $R_s(D) \le R(D)$. Together, these results constitute the basis of semantic information theory. In addition, we discuss the semantic information measures in the continuous case. In particular, for the band-limited Gaussian channel, we obtain a new channel capacity formula, $C_s = B\log\!\left[S^4\left(1+\frac{P}{N_0 B}\right)\right]$, where $S$ is the synonymous length. In summary, the theoretical framework of SIT proposed in this paper is a natural extension of CIT and may reveal great performance potential for future communication.
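To make the gain of the semantic capacity formula concrete, the following worked example compares $C_s$ with the classical Shannon capacity $C = B\log\!\left(1+\frac{P}{N_0 B}\right)$; the numerical values and the base-2 logarithm (capacities in bit/s) are illustrative assumptions, not taken from the paper:
\[
B = 1\,\mathrm{MHz},\quad \frac{P}{N_0 B} = 15,\quad S = 2:
\qquad C = B\log_2(1+15) = 4\,\mathrm{Mbit/s},
\]
\[
C_s = B\log_2\!\left[S^4(1+15)\right] = B\log_2 256 = 8\,\mathrm{Mbit/s}.
\]
In general $C_s = C + 4B\log_2 S$, i.e., under these assumptions the synonymous length contributes an additive spectral-efficiency gain of $4\log_2 S$ bit/s/Hz over the classical capacity.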