Intensive rehabilitation is essential for stroke survivors, as it facilitates motor recovery and improves performance in activities of daily living (ADL); accessible, effective, and tailored therapy options can help overcome the limitations of traditional practice. The integration of wearable technologies and machine learning is advancing home-based rehabilitation. However, a significant research gap remains in simulating realistic home environments to evaluate ADL measurement techniques effectively. In this paper, we present a methodology for identifying three isolated arm movements (reaching, lifting, and pronation/supination) using data from 12 healthy participants and two different sensor configurations. The first configuration involved four Inertial Measurement Units (IMUs) on the dominant arm, while the second used a single IMU on the wrist. In addition to comparing sensor configurations, we evaluated the generalization of two arm movement identification models by training them on structured trials and testing them on a semi-structured trial involving fourteen kitchen-related activities, simulating a home-based rehabilitation scenario. We employed a Random Forest (RF) classifier and a new variant of a previously proposed hybrid deep learning model that combines convolutional neural network and recurrent neural network architectures. The RF classifier generalized to the activities in the semi-structured trial with a balanced accuracy of 86.54%, while the hybrid model reached 87.96% in identifying those activities. Performance declined when using a single wrist IMU, with the RF classifier showing a smaller decrease in balanced accuracy (10.57%) than the hybrid model (26.22%). Our findings demonstrate that the identification of key arm movements generalizes accurately and robustly, indicating its potential for future application in home-based stroke rehabilitation.
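
To make the evaluation protocol concrete, the sketch below illustrates one way to train both model families on structured-trial windows and measure generalization to semi-structured-trial windows with balanced accuracy. It is a minimal illustration, not the authors' exact pipeline: the window length, channel count, layer sizes, training hyperparameters, and variable names are all assumptions, and the Random Forest here is fed flattened raw windows where hand-crafted features would typically be used.

```python
# Minimal sketch of the cross-trial evaluation: fit on structured trials,
# test on the semi-structured (kitchen) trial, report balanced accuracy.
# All sizes and hyperparameters below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from tensorflow.keras import layers, models

N_CLASSES = 3      # reaching, lifting, pronation/supination
WINDOW_LEN = 128   # assumed samples per window
N_CHANNELS = 24    # e.g. 4 IMUs x 6 axes (accel + gyro); 6 for a single wrist IMU


def build_hybrid_model(window_len=WINDOW_LEN, n_channels=N_CHANNELS, n_classes=N_CLASSES):
    """Hybrid CNN + RNN classifier: convolutional feature extraction
    followed by a recurrent layer over the temporal dimension."""
    inputs = layers.Input(shape=(window_len, n_channels))
    x = layers.Conv1D(64, kernel_size=5, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.Conv1D(64, kernel_size=5, activation="relu", padding="same")(x)
    x = layers.LSTM(64)(x)  # recurrent layer over the CNN feature sequence
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


def evaluate_generalization(X_structured, y_structured,
                            X_semistructured, y_semistructured):
    """X_*: arrays of shape (n_windows, WINDOW_LEN, N_CHANNELS) of IMU windows;
    y_*: integer movement labels per window."""
    # Random Forest baseline on flattened windows.
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    rf.fit(X_structured.reshape(len(X_structured), -1), y_structured)
    rf_pred = rf.predict(X_semistructured.reshape(len(X_semistructured), -1))
    rf_bacc = balanced_accuracy_score(y_semistructured, rf_pred)

    # Hybrid CNN + RNN model trained on the same structured-trial windows.
    hybrid = build_hybrid_model()
    hybrid.fit(X_structured, y_structured, epochs=30, batch_size=64, verbose=0)
    hybrid_pred = np.argmax(hybrid.predict(X_semistructured, verbose=0), axis=1)
    hybrid_bacc = balanced_accuracy_score(y_semistructured, hybrid_pred)
    return rf_bacc, hybrid_bacc
```

Comparing the two sensor configurations amounts to rerunning the same evaluation with N_CHANNELS reduced to the channels of the single wrist-worn IMU; balanced accuracy is used so that performance is not inflated by any imbalance among the three movement classes.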