Sequential Model-Based Optimization (SMBO) is a highly effective strategy for hyperparameter search in machine learning. It fits a surrogate model to previous trials to approximate the response surface, i.e., the mapping from hyperparameters to performance, and this surrogate guides the selection of the next hyperparameter configuration to evaluate. Classic surrogates, such as Gaussian processes and random forests, model only the current task of interest and cannot incorporate trials from historical tasks; this limitation hinders their efficiency when trials on the new task are scarce. Inspired by the state-of-the-art convolutional neural process, this paper proposes a novel meta-learning-based surrogate model for efficient and effective hyperparameter optimization. Our surrogate is trained on meta-knowledge drawn from a range of historical tasks, enabling it to accurately predict the hyperparameter response surface even with a limited number of trials on a new task. We evaluate our approach on hyperparameter selection for the well-known support vector machine (SVM) and residual neural network (ResNet) across hundreds of real-world classification datasets. The empirical results demonstrate its superiority over existing surrogate models, highlighting the effectiveness of meta-learning in hyperparameter optimization.
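To make the SMBO loop concrete, the sketch below shows a minimal implementation with a Gaussian-process surrogate and an expected-improvement acquisition function. This is an illustration under standard assumptions, not the paper's method: the function names (`smbo`, `expected_improvement`) are made up for this example, and the Gaussian process stands in for the meta-learned surrogate proposed here.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor


def expected_improvement(mu, sigma, best):
    # EI for minimization: expected amount by which a candidate
    # improves on the best observed value.
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)


def smbo(objective, bounds, n_init=5, n_iter=20, seed=0):
    """Minimize `objective` over box `bounds` with a GP surrogate."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    # Initial trials: a few random hyperparameter configurations.
    X = rng.uniform(lo, hi, size=(n_init, len(bounds)))
    y = np.array([objective(x) for x in X])
    for _ in range(n_iter):
        # Fit the surrogate to all trials observed so far.
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        # Score a random candidate pool and pick the best acquisition value.
        cand = rng.uniform(lo, hi, size=(1024, len(bounds)))
        mu, sigma = gp.predict(cand, return_std=True)
        x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
        # Evaluate the chosen configuration and add it to the trial set.
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    return X[np.argmin(y)], y.min()
```

In this loop, the surrogate is the only component that changes under the proposed approach: replacing the per-task Gaussian process with a model meta-trained on historical tasks is what allows accurate response-surface predictions after only a handful of trials on a new task.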