Physically handicapped patients often cannot take care of themselves. Helping them easily control the objects around them can reduce their psychological burden and social pressure. In this article, a semi-autonomous grasping system based on eye movement and EEG is presented to achieve this goal. The patient only needs to gaze at the target object and stay focused; the manipulator then automatically moves to the object's position and grasps it. Experimental results verify the reliability of the system. This system promotes the development of human-computer interaction systems based on multi-sensor fusion.
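To make the interaction flow concrete, the sketch below shows one plausible way such a semi-autonomous trigger could be structured: a sustained gaze selects the target and a high EEG attention score confirms the intent, after which the manipulator takes over. The sensor and actuator interfaces (`read_gaze_target`, `read_eeg_attention`, `move_and_grasp`), the dwell time, and the attention threshold are all illustrative assumptions, not details from the paper.

```python
import time
import random

# Hypothetical interfaces; the eye tracker, EEG headset, and manipulator APIs
# used in the actual system are not specified here, so stubs stand in for them.

ATTENTION_THRESHOLD = 0.7   # assumed EEG attention level needed to confirm intent
DWELL_TIME_S = 1.5          # assumed gaze dwell time needed to select a target

def read_gaze_target():
    """Stub: return the label and 3D position of the currently fixated object."""
    return "cup", (0.42, -0.10, 0.05)  # placeholder detection result

def read_eeg_attention():
    """Stub: return a normalized attention score in [0, 1] from the EEG headset."""
    return random.uniform(0.0, 1.0)

def move_and_grasp(position):
    """Stub: command the manipulator to move to `position` and close the gripper."""
    print(f"Grasping object at {position}")

def control_loop():
    """Semi-autonomous loop: sustained gaze plus high EEG attention triggers a grasp."""
    dwell_start = None
    last_target = None
    while True:
        label, position = read_gaze_target()
        if label != last_target:            # gaze moved to a new object: restart dwell timer
            last_target, dwell_start = label, time.time()
            continue
        dwelled = time.time() - dwell_start >= DWELL_TIME_S
        focused = read_eeg_attention() >= ATTENTION_THRESHOLD
        if dwelled and focused:             # both conditions met: hand control to the robot
            move_and_grasp(position)
            break
        time.sleep(0.1)

if __name__ == "__main__":
    control_loop()
```

The key design point this illustrates is the division of labor: the human supplies only target selection (gaze) and confirmation (attention), while trajectory planning and grasp execution are delegated to the manipulator.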