The goal of the metaverse is to provide synchronized interoperability between real and virtual worlds across multiple domains and applications. The main challenge is to make the metaverse interface both immersive and responsive to user actions. In this paper, we propose a new concept called the distributed wearable smartphone, which synergistically integrates head-mounted displays, wearable sensors, and other smart devices to translate user context and actions across real and virtual metaverse domains. The proposed architecture is flexible and can be composed from any set of devices, depending on the target application. The key benefits of the proposed approach are a realistic 3D representation and a semi-virtual user experience, which can be extended to a wide range of applications in industry, education, healthcare, and entertainment.