Facial expression recognition (FER) plays an important role in human-computer interaction, psychology, and many other applications. It requires large-scale annotated data with practical variations. Existing in-the-wild FER datasets are small and cover a limited number of expressions; moreover, they lack facial ambiguity and cross-cultural diversity. To address this need, we propose a more comprehensive FER dataset, called FER20E, together with an emotion theory that expands the label space to 20 discrete facial expressions. FER20E contains more than 100,000 facial images covering the eight existing and twelve previously unexplored emotions. It accounts for cultural, environmental, and demographic variations such as illumination, pose, rotation, scale, age, gender, and occlusion. To reduce manual effort in data annotation, we have additionally designed an automatic annotation tool based on a lightweight deep convolutional neural network (DCNN). To evaluate the dataset's efficacy, we benchmark several state-of-the-art models on it. This research not only advances FER but also has broader implications, including enhancing security measures, improving health monitoring, and potentially aiding in the early diagnosis of mental health conditions, thereby contributing to overall well-being and mental health awareness. The code and dataset will be available at https://github.com/akstheme/FER20E.