Video-evoked responses are reliably mapped across occipital, temporal, and parietal cortices.

Data and pre-trained models (such as the TSM ResNet50 used in the study) are available on GitHub.

The study provides a benchmark for probing the neural mechanisms of visual event understanding, bridging the gap between static image perception and long-form movie analysis.

The dataset contains 1,102 three-second naturalistic videos sampled from the Moments in Time (MiT) and Memento10k datasets.

The study identifies specific brain regions in the parietal and high-level visual cortex that correlate with how memorable a video clip is.

Human-written sentence descriptions of the videos correlate more strongly with brain activity than simple labels like "object" or "action".
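This kind of comparison is commonly run as a representational-similarity-style analysis: build a pairwise-similarity matrix across videos from each annotation type, build one from the brain responses, and see which annotation's structure correlates better with the neural structure. The sketch below is a minimal illustration with synthetic data, not the study's actual pipeline; all array shapes, noise levels, and the helper names `rdm` and `rsa_score` are assumptions for demonstration only.

```python
# Hypothetical sketch (synthetic data, NOT the study's pipeline): compare how
# well two annotation embeddings predict brain responses via a simplified
# representational similarity analysis (RSA).
import numpy as np

rng = np.random.default_rng(0)
n_videos, n_voxels, n_dims = 50, 200, 20

# Synthetic stand-ins: a shared latent structure drives the "brain" data,
# sentence embeddings track that structure noisily, label embeddings do not.
latent = rng.normal(size=(n_videos, n_dims))
brain = latent @ rng.normal(size=(n_dims, n_voxels)) + rng.normal(scale=0.5, size=(n_videos, n_voxels))
sentence_emb = latent + rng.normal(scale=0.5, size=(n_videos, n_dims))
label_emb = rng.normal(size=(n_videos, n_dims))  # uninformative labels

def rdm(x):
    """Vectorize the upper triangle of a (1 - correlation) dissimilarity matrix."""
    c = np.corrcoef(x)
    iu = np.triu_indices_from(c, k=1)
    return 1.0 - c[iu]

def rsa_score(embedding, neural):
    """Pearson correlation between the embedding RDM and the neural RDM."""
    return np.corrcoef(rdm(embedding), rdm(neural))[0, 1]

print("sentence RSA:", rsa_score(sentence_emb, brain))
print("label RSA:   ", rsa_score(label_emb, brain))
```

With the synthetic construction above, the sentence embeddings share latent structure with the neural data, so their RSA score comes out higher than the random labels' score, mirroring the qualitative finding in the study.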
