In this work, an EEG-based control paradigm assisted by micro-facial expressions (microFE-BCI) was developed, targeting common shortcomings of mainstream noninvasive BCIs: insufficient real-time capability, lack of asynchronous operation, and limited robustness. The core algorithm of microFE-BCI comprised two stages (asynchronous ‘ON’ detection and microFE-based real-time control) organized into four steps: exclusion of obviously non-microFE EEG, interface ‘ON’ detection, real-time decoding of microFE EEG, and validity judgment. It supported asynchronous operation, decoded 8 instructions from the latest 100 ms of EEG, and substantially reduced misoperations. In the offline assessment, microFE-BCI achieved 96.46 ± 1.07% accuracy for interface ‘ON’ detection and 92.68 ± 1.21% for real-time microFE-EEG decoding, with a theoretical output latency below 200 ms. The microFE-BCI was implemented in software and applied to two online manipulation tasks to evaluate its stability and agility. In object moving with a robotic arm, the average intersection over union (IoU) was 60.03 ± 11.53%; in water pouring with a prosthetic hand, the average poured volume was 202.5 ± 7.0 ml. In these online tasks, microFE-BCI showed no significant difference from commercial control methods (FlexPendant and joystick; P = 0.6521 and P = 0.7931), indicating a comparable level of controllability and agility. This study demonstrated the capability of microFE-BCI and offers a practical solution for noninvasive BCIs facing real-world challenges.
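
As a rough illustration of the two-stage, four-step pipeline summarized above, the minimal Python sketch below implements an asynchronous control loop over 100 ms EEG windows with 8 output instructions, as stated in the abstract. The sampling rate, channel count, the stub functions is_obvious_non_microfe, detect_interface_on, and decode_instruction, and the consecutive-window consistency rule used as the validity judgment are assumptions for illustration only, not the paper's actual detectors or decoders.

```python
import numpy as np

# --- Assumed constants; only the 100 ms window and 8 instructions come from the abstract ---
WINDOW_MS = 100                      # latest 100 ms of EEG used per decision
FS = 1000                            # assumed sampling rate (Hz)
N_CHANNELS = 8                       # assumed channel count for this sketch
WINDOW_SAMPLES = FS * WINDOW_MS // 1000

def is_obvious_non_microfe(window: np.ndarray) -> bool:
    """Step 1: cheap exclusion of windows that clearly contain no microFE activity
    (sketched here as a simple variance threshold; placeholder only)."""
    return window.var() < 1e-12

def detect_interface_on(window: np.ndarray) -> bool:
    """Step 2: asynchronous interface 'ON' detection (placeholder classifier)."""
    return bool(window.mean() > 0)

def decode_instruction(window: np.ndarray) -> int:
    """Step 3: real-time decoding of one of 8 instructions from the latest 100 ms window
    (placeholder classifier)."""
    return int(abs(window.sum())) % 8

def control_loop(stream, consistency: int = 2):
    """Two-stage asynchronous loop: detect 'ON', then decode and emit only instructions
    that pass a validity judgment (here: the same label on `consistency` consecutive windows)."""
    interface_on = False
    recent = []
    for window in stream:                       # each item: (N_CHANNELS, WINDOW_SAMPLES) array
        if is_obvious_non_microfe(window):      # Step 1: exclusion of obvious non-microFE EEG
            recent.clear()
            continue
        if not interface_on:                    # Stage 1: asynchronous 'ON' detection
            interface_on = detect_interface_on(window)
            continue
        label = decode_instruction(window)      # Stage 2, Step 3: real-time decoding
        recent.append(label)
        if len(recent) >= consistency and len(set(recent[-consistency:])) == 1:
            yield label                         # Step 4: validity judgment passed -> output
            recent.clear()

if __name__ == "__main__":
    # Synthetic windows standing in for streamed EEG, purely to exercise the loop.
    fake_stream = (np.random.randn(N_CHANNELS, WINDOW_SAMPLES) for _ in range(50))
    for cmd in control_loop(fake_stream):
        print("instruction:", cmd)
```

Under these assumptions, two consecutive 100 ms windows must agree before a command is issued, which is one simple way a validity judgment could keep the theoretical output latency within the 200 ms bound mentioned above.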