Si Cheng et al.

Decisions about a current stimulus are influenced by previously encountered stimuli, leading to sequential bias. However, the specific processing levels at which serial dependence emerges remain unclear. Despite considerable evidence pointing to contributions from perceptual and post-perceptual processes, as well as response carryover effects impacting subsequent judgments, research into how different task measurements affect sequential dependencies is limited. To address this gap, the present study investigated the role of task type in shaping sequential effects in time perception, employing a random-dot kinematogram (RDK) in a post-cue paradigm. Participants had to remember both the duration and the direction of the RDK motion and perform the task indicated by a post-cue, which was equally likely to probe direction or duration. To delineate the task type, we employed a temporal bisection task in Experiment 1 and a duration reproduction task in Experiment 2. Both experiments revealed a significant sequential bias: durations were perceived as longer following longer preceding durations, and vice versa. Intriguingly, the sequential effect was enhanced when a reproduction task followed the same reproduction task (Experiment 2), but did not vary significantly with task type in the bisection task (Experiment 1). Moreover, comparable response carryover effects were observed across the two experiments. We argue that the differential impact of task type on sequential dependence lies in the involvement of memory reactivation during the decision stage, whereas the post-decisional response carryover effect may reflect assimilation by subjective, rather than objective, durations, potentially linked to a sticky pacemaker rate and/or decisional inertia.

Siyi Chen et al.

When memorizing an integrated object such as a Kanizsa figure, the completion of parts into a coherent whole is achieved by grouping processes that render a whole-object representation in visual working memory (VWM). The present study measured event-related potentials (ERPs) and oscillatory amplitudes to track these processes of encoding and representing multiple features of an object in VWM. To this end, observers performed a change detection task that required them to memorize both the orientations and colors of six ‘pacman’ items, presented in configurations that varied systematically in grouping strength. The results revealed an effect of object configuration in VWM despite physically constant visual input: change detection for both orientation and color features was more accurate with increased grouping strength. At the electrophysiological level, lateralized ERPs and alpha activity mirrored this behavioral pattern. Perception of the orientation features gave rise to the encoding of a grouped object, as reflected by the amplitude of the PPC. The grouped object structure, in turn, modulated attention to both orientation and color features, as indicated by enhanced N1pc and N2pc components. Finally, during item retention, the representation of individual objects and the concurrent allocation of attention to these memorized objects were modulated by grouping, as reflected by variations in CDA amplitude and a concurrent lateralized alpha suppression, respectively. These results indicate that memorizing multiple features of grouped, to-be-integrated objects involves multiple, sequential stages of processing, providing support for a hierarchical model of object representations in VWM.