Generalized zero-shot learning (GZSL) is one of the most realistic but also one of the most challenging zero-shot settings, owing to the partiality of the classifier toward supervised classes. Instance-borrowing methods and synthesizing methods alleviate this problem to some extent by exploiting the semantics of test classes; as a consequence, neither can be applied under the class-inductive instance-inductive (CIII) training setting, in which no test data are available, and synthesizing methods additionally require training a classifier after example generation. In contrast, this paper proposes Semantic Borrowing, a novel method for improving GZSL approaches based on compatibility metric learning under the CIII setting. It borrows similar semantics within the training set, so that the classifier can model the relationship between the semantics of zero-shot and supervised classes more accurately during training. In practice, semantic information about unseen or unknown classes is unavailable at training time, and this approach requires no such information whatsoever. Experimental results on representative GZSL benchmark datasets show that Semantic Borrowing reduces the classifier's partiality toward supervised classes and improves generalized zero-shot classification performance.
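As a rough illustration of the borrowing idea only (not the paper's actual algorithm), the following minimal PyTorch sketch assumes a bilinear compatibility score f(x, s) = xᵀWs, selects "similar semantics" by cosine similarity among training-class attribute vectors, and adds the borrowed semantic of each ground-truth class to a hinge ranking loss with a down-weighting factor. All function names, shapes, and the weighting scheme are hypothetical; nothing about unseen classes is used.

```python
import torch
import torch.nn.functional as F


def most_similar_semantics(S):
    """For each training class, return the semantics of its most similar
    *other* training class (cosine similarity over attribute vectors)."""
    S_norm = F.normalize(S, dim=1)          # (C, d_s)
    sim = S_norm @ S_norm.t()               # (C, C)
    sim.fill_diagonal_(-float("inf"))       # never borrow from the class itself
    return S[sim.argmax(dim=1)]             # (C, d_s)


def compatibility(X, S, W):
    """Bilinear compatibility scores between features X (N, d_x) and
    class semantics S (C, d_s); returns (N, C)."""
    return X @ W @ S.t()


def ranking_loss(X, y, targets, S, W, margin=1.0):
    """Hinge ranking loss: the score of each example's target semantic
    should exceed the score of every other training class by a margin."""
    scores = compatibility(X, S, W)                     # (N, C)
    pos = ((X @ W) * targets).sum(dim=1, keepdim=True)  # (N, 1)
    mask = torch.ones_like(scores)
    mask.scatter_(1, y.unsqueeze(1), 0.0)               # ignore the true-class column
    return (F.relu(margin + scores - pos) * mask).sum(dim=1).mean()


def semantic_borrowing_loss(X, y, S, W, margin=1.0, alpha=0.5):
    """Compatibility loss on the true semantics plus a down-weighted term
    on the borrowed (most similar) semantics of the ground-truth classes."""
    S_borrowed = most_similar_semantics(S)
    loss_true = ranking_loss(X, y, S[y], S, W, margin)
    loss_borrowed = ranking_loss(X, y, S_borrowed[y], S, W, margin)
    return loss_true + alpha * loss_borrowed


if __name__ == "__main__":
    # Toy dimensions: e.g. CNN features with attribute-style class semantics.
    N, C, d_x, d_s = 32, 10, 2048, 85
    X = torch.randn(N, d_x)                 # visual features of training instances
    y = torch.randint(0, C, (N,))           # training labels
    S = torch.randn(C, d_s)                 # training-class semantics only
    W = torch.randn(d_x, d_s, requires_grad=True)
    loss = semantic_borrowing_loss(X, y, S, W)
    loss.backward()
    print(float(loss))
```

The only design point this sketch tries to capture is that borrowing happens entirely within the training-class semantics, so the objective never touches unseen-class information, consistent with the CIII setting described above.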