Exoskeleton devices can reduce metabolic cost, increase walking speed, and augment load-carrying capacity. However, little is known about how powered assistance affects the sensory information required to achieve these tasks. To learn how to use an assistive device, humans must integrate novel sensory information into their internal model, but under altered sensory conditions this integration may present a challenge. We investigated exoskeleton-induced changes in balance performance and sensory system adaptations during quiet standing. We asked 11 unimpaired adults to perform a virtual reality-based test of sensory integration in balance (VRSIB) on two days, standing either with the exoskeleton unpowered, with the exoskeleton powered under proportional myoelectric control, or in regular shoes. We measured postural sway, equilibrium scores, and sensory ratios. In addition, we measured postural control strategy, joint kinematics and kinetics, and muscle activity. Results showed improved balance performance when wearing the exoskeleton on firm ground. The opposite occurred when standing on an unstable platform with eyes closed or when the visual information was non-veridical. Balance performance was equivalent whether the exoskeleton was powered or unpowered in all conditions except when both the support surface and the visual information were altered. We argue that on stable ground the passive stiffness of the device dominates the postural task. In contrast, when the ground becomes unstable, the passive stiffness negatively affects balance performance. Furthermore, when the visual input to the user is non-veridical, exoskeleton assistance can magnify erroneous muscle inputs and negatively impact the user's postural control.