Publication

C3I-SynMicrosaccade: A pipeline and dataset for microsaccade recognition using neuromorphic event camera streams

Shariff, Waseem
Hanley, Timothy
Stec, Maciej
Javidnia, Hossein
Corcoran, Peter
Citation
Shariff, Waseem, Hanley, Timothy, Stec, Maciej, Javidnia, Hossein, & Corcoran, Peter. (2026). C3I-SynMicrosaccade: A pipeline and dataset for microsaccade recognition using neuromorphic event camera streams. Data in Brief, 65, 112491. https://doi.org/10.1016/j.dib.2026.112491
Abstract
This article presents the C3I-SynMicrosaccade dataset: a synthetic microsaccade dataset designed to enable event-based modelling and classification of microsaccadic eye movements. Using Blender, we generated high-resolution RGB sequences of microsaccades, characterized by small, transient eye rotations around a fixed head pose. Each microsaccade follows a horizontal, boomerang-like trajectory, simulating the natural back-and-forth displacement of the eye during visual fixation. Seven distinct angular classes, ranging from 0.5° to 2.0°, capture varying motion amplitudes while maintaining consistent scene, lighting, and texture conditions. The rendered RGB frames were converted into event-based data streams using the v2e simulator, which replicates the asynchronous behaviour of neuromorphic vision sensors. Temporal durations and event counts were carefully controlled and resampled to ensure class balance and eliminate bias toward motion magnitude. The resulting dataset comprises 175,000 event sequences (87,500 per eye), providing a large-scale, balanced foundation for microsaccade recognition, neuromorphic vision research, and synthetic-to-real transfer learning. This work offers a controlled, reproducible framework for studying fixational eye movements and evaluating event-based algorithms under fine motion dynamics.
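The abstract gives the class endpoints (0.5° to 2.0°), the class count (seven), and the sequence totals. A minimal sketch of the implied numbers, assuming a uniform angular spacing and a uniform split across classes (the abstract states only that the dataset is balanced):

```python
import numpy as np

# Seven amplitude classes spanning 0.5 deg to 2.0 deg.
# The abstract gives only the endpoints and the class count;
# a uniform 0.25 deg spacing between classes is assumed here.
angles_deg = np.linspace(0.5, 2.0, num=7)
print(angles_deg.tolist())  # → [0.5, 0.75, 1.0, 1.25, 1.5, 1.75, 2.0]

# Sequence counts, assuming the balance is a uniform split.
total_sequences = 175_000
per_eye = total_sequences // 2   # 87,500 per eye, as stated in the abstract
per_class = per_eye // 7         # 12,500 per class per eye (assumed uniform)
print(per_eye, per_class)        # → 87500 12500
```

These figures are consistent with the stated totals; the per-class count itself is not quoted in the abstract and follows only under the uniform-split assumption.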
Publisher
Elsevier
Rights
CC BY-NC