Event-based warping: A relative distortion of time within events.

J Exp Psychol Gen

Johns Hopkins University, Department of Psychological and Brain Sciences.

Published: September 2025


Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Objects and events are fundamental units of perception: Objects structure our experience of space, and events structure our experience of time. A striking and counterintuitive finding about object representation is that it can warp perceived space, such that stimuli within an object appear farther apart than stimuli in empty space. Might events influence perceived time in the same way objects influence perceived space? Here, five experiments (N = 500 adults) show that they do: Just as stimuli within an object are perceived as farther apart in space, stimuli within an event are perceived as further apart in time. Such "event-based warping" is elicited both by events characterized by sound (Experiment 1) and by events characterized by silence (Experiment 2). Moreover, these effects cannot be explained by surprise, distraction, or attentional cueing (Experiments 3 and 4) and also arise cross-modally (from audition to vision; Experiment 5). We suggest that object-based warping and event-based warping are both instances of a more general phenomenon in which representations of structure, whether in space or in time, generate powerful and analogous relative perceptual distortions. (PsycInfo Database Record (c) 2025 APA, all rights reserved).


Source: http://dx.doi.org/10.1037/xge0001798

Publication Analysis

Top Keywords

- event-based warping (8)
- structure experience (8)
- space events (8)
- space stimuli (8)
- stimuli object (8)
- farther apart (8)
- influence perceived (8)
- events characterized (8)
- events (6)
- space (5)

Similar Publications


Accounting for electron-beam-induced warping of molecular nanocrystals in MicroED structure determination.

IUCrJ

March 2025

Department of Chemistry and Biochemistry, UCLA-DOE Institute for Genomics and Proteomics; STROBE, NSF Science and Technology Center, University of California, Los Angeles, 611 Charles E. Young Dr East, Los Angeles, CA 90095, USA.

High-energy electrons induce sample damage and motion at the nanoscale to fundamentally limit the determination of molecular structures by electron diffraction. Using a fast event-based electron counting (EBEC) detector, we characterize beam-induced, dynamic, molecular crystal lattice reorientations (BIRs). These changes are sufficiently large to bring reciprocal lattice points entirely in or out of intersection with the sphere of reflection, occur as early events in the decay of diffracted signal due to radiolytic damage, and coincide with beam-induced migrations of crystal bend contours within the same fluence regime and at the same illuminated location on a crystal.


Discrete myoelectric control-based gesture recognition has recently gained interest as a possible input modality for many emerging ubiquitous computing applications. Unlike the continuous control commonly employed in powered prostheses, discrete systems seek to recognize the dynamic sequences associated with gestures to generate event-based inputs. More akin to those used in general-purpose human-computer interaction, these could include, for example, a flick of the wrist to dismiss a phone call or a double tap of the index finger and thumb to silence an alarm.


Event camera shows great potential in 3D hand pose estimation, especially addressing the challenges of fast motion and high dynamic range in a low-power way. However, due to the asynchronous differential imaging mechanism, it is challenging to design event representation to encode hand motion information especially when the hands are not moving (causing motion ambiguity), and it is infeasible to fully annotate the temporally dense event stream. In this paper, we propose EvHandPose with novel hand flow representations in Event-to-Pose module for accurate hand pose estimation and alleviating the motion ambiguity issue.


Event-based sampled ECG morphology reconstruction through self-similarity.

Comput Methods Programs Biomed

October 2023

Embedded Systems Laboratory (ESL), École Polytechnique Fédérale de Lausanne (EPFL), Lausanne 1015.

Background and Objective: Event-based analog-to-digital converters allow for sparse bio-signal acquisition, enabling local sub-Nyquist sampling frequencies. However, aggressive event selection can cause the loss of important bio-markers that are not recoverable with standard interpolation techniques. In this work, we leverage the self-similarity of the electrocardiogram (ECG) signal to recover missing features in event-based sampled ECG signals, dynamically selecting patient-representative templates and using a novel dynamic time warping algorithm to infer the morphology of event-based sampled heartbeats.
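For readers unfamiliar with the underlying technique, the sketch below shows textbook dynamic time warping (the classic cumulative-cost formulation), as might be used to align an event-based sampled heartbeat against a template; it is an illustrative sketch only, not the paper's novel variant, and the function name and sequences are hypothetical.

```python
# Illustrative classic dynamic time warping (DTW), e.g. for aligning a
# sparsely sampled heartbeat against a patient-representative template.
# Textbook algorithm for illustration only -- not the paper's variant.

def dtw_distance(a, b):
    """Return the minimal cumulative alignment cost between 1-D sequences a and b."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j]: best cost of aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local distance
            cost[i][j] = d + min(cost[i - 1][j],      # step in a only
                                 cost[i][j - 1],      # step in b only
                                 cost[i - 1][j - 1])  # step in both
    return cost[n][m]
```

Because DTW allows one sample to align with several samples in the other sequence, a time-stretched copy of a waveform still aligns at low cost, which is what makes it suitable for matching irregularly (event-based) sampled signals against templates.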
