The mDOT Center

Transforming health and wellness via temporally-precise mHealth interventions
mDOT@MD2K.org
901.678.1526
 

CP 11: Capturing Autobiographical Memory Formation in People Moving Through Real-world Spaces Using Synchronized Wearables and Intracranial Recordings of EEG


Collaborating Investigator:

Luis Garcia, The University of Utah

 

Funding Status: 

1R61MH135109

NIH/NIMH

03/15/24 – 02/28/27

 

Associated with:

TR&D3

CP11 is developing a smartphone-based recording application, the CAPTURE app, that synchronizes invasive neural recordings with continuous audio-visual, accelerometry, GPS, subjective-report, autonomic-physiology, and wearable eye-tracking recordings during real-world behaviors such as autobiographical memory encoding.

 

This project aims to integrate wearable devices—smartphones capturing audio-visual, accelerometry, GPS, physiological, and eye-tracking data—with synchronized intracranial neural recordings to study autobiographical memory (AM) formation in real-world settings. AM, which is uniquely personal and complex, has been challenging to analyze because of the limitations of traditional neuroimaging. This study will investigate how the brain encodes, processes, and retrieves AM by tracking real-world behaviors, offering insights into cognitive and neural mechanisms that are compromised in disorders such as Alzheimer’s disease. By developing the CAPTURE app to record multimodal data synchronized with invasive neural data during daily experiences, the research will establish a foundation for neuromodulation approaches that enhance memory in real-world contexts. The project will leverage the NeuroPace Responsive Neurostimulation System, implanted in over 2,000 epilepsy patients, as a unique opportunity to collect direct neural data associated with AM formation and potentially develop real-world memory restoration tools.

TR&D3 is developing new methods for efficiently learning robust embedding representations from multimodal sensor data streams produced by wearables and intracranial recordings.

 

CP11 challenges TR&D3 to develop neural foundation models to detect spatiotemporal events.

 

CP11 will receive deep-learning-based neural foundation models that can be used to build analytics pipelines for a variety of downstream tasks involving the detection of spatiotemporal events as people move through real-world spaces, with the goal of assisting memory formation.
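The reuse pattern described here—a pretrained encoder shared across many downstream detection tasks—can be sketched as follows. This is a minimal, hypothetical illustration, not the project's actual pipeline: the frozen "foundation model" is stood in for by a fixed random projection, and the downstream head is a simple nearest-centroid classifier over toy sensor windows.

```python
import math
import random
from typing import Dict, List

random.seed(0)

DIM_IN, DIM_EMB = 6, 3
# A frozen random projection stands in for a pretrained multimodal encoder.
W = [[random.gauss(0, 1) for _ in range(DIM_IN)] for _ in range(DIM_EMB)]


def embed(window: List[float]) -> List[float]:
    """Map one raw sensor window to an embedding with the frozen encoder."""
    return [sum(w * x for w, x in zip(row, window)) for row in W]


def centroid(vectors: List[List[float]]) -> List[float]:
    """Average a list of embeddings component-wise."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]


def detect(window: List[float], centroids: Dict[str, List[float]]) -> str:
    """Downstream head: label a window by its nearest class centroid."""
    e = embed(window)
    return min(centroids, key=lambda lbl: math.dist(e, centroids[lbl]))


# Toy training windows for two illustrative spatiotemporal "events".
moving = [[1.0] * DIM_IN, [0.9, 1.1, 1.0, 0.8, 1.2, 1.0]]
still = [[0.0] * DIM_IN, [0.1, -0.1, 0.0, 0.0, 0.1, -0.1]]
cents = {
    "moving": centroid([embed(w) for w in moving]),
    "still": centroid([embed(w) for w in still]),
}

print(detect([1.0, 0.9, 1.1, 1.0, 1.0, 1.1], cents))
```

The point of the pattern is that only the small centroid table is task-specific; the encoder is trained once and shared, so new event-detection tasks need only a handful of labeled windows.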

 

This project aims to unlock the potential of combining wearable mobile recording devices—such as smartphones capturing continuous audio-visual, accelerometry, GPS, subjective-report, autonomic-physiology, and wearable eye-tracking recordings—with precisely synchronized intracranial neural recordings during real-world behaviors. Autobiographical memory (AM) formation is a critical human behavior that has been difficult to study with traditional neuroimaging methods. Thus, the proposed project aims to develop a smartphone-based recording application (CAPTURE app; R61 phase) synchronized with wearables and invasive neural recordings during real-world behaviors such as autobiographical memory encoding (R33 phase).
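Precise synchronization across streams sampled on different clocks is the core technical requirement above. A minimal sketch of one common approach—nearest-timestamp matching of a jittery sensor stream onto a reference clock—is shown below; the stream names, rates, and function are illustrative assumptions, not the CAPTURE app's actual synchronization method.

```python
from bisect import bisect_left
from typing import List, Tuple

def nearest_timestamp_align(
    reference: List[float],            # reference-clock timestamps (seconds)
    other: List[Tuple[float, float]],  # (timestamp, value) pairs of another stream
) -> List[Tuple[float, float]]:
    """For each reference timestamp, pick the other-stream sample whose
    timestamp is closest. Both streams must be sorted by time."""
    times = [t for t, _ in other]
    aligned = []
    for t_ref in reference:
        i = bisect_left(times, t_ref)
        # Compare the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j_best = min(candidates, key=lambda j: abs(times[j] - t_ref))
        aligned.append((t_ref, other[j_best][1]))
    return aligned

# Example: a 10 Hz reference clock and a sensor stream with timing jitter.
ref = [0.0, 0.1, 0.2, 0.3]
sensor = [(0.02, 1.0), (0.13, 2.0), (0.24, 3.0), (0.31, 4.0)]
print(nearest_timestamp_align(ref, sensor))
```

In practice, each matched pair would also carry its residual offset so that downstream analyses can reject samples whose alignment error exceeds a tolerance.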

Category

CP, TR&D3
