Tundra Space

Status: RECRUITING
NCT ID: NCT07560631
Phase: NA

SeeMe: Using Automated Facial Tracking to Detect Voluntary Behavior in Brain Injury

Sponsor: Stony Brook University

View on ClinicalTrials.gov

Summary

Objective: This prospective interventional study introduces "SeeMe," an automated, high-resolution computer vision platform designed to objectively quantify microscopic, auditory command-evoked movements in patients with Traumatic Brain Injury (TBI). Current clinical assessments, such as the Glasgow Coma Scale (GCS) and the Coma Recovery Scale-Revised (CRS-R), rely on subjective human observation and often fail to detect low-amplitude motor responses, potentially misclassifying up to 25% of patients as unresponsive.

Methodology: SeeMe uses vector analysis, cross-correlation, and deep neural networks (DNNs) to track individual facial pores and hand movements with sub-millimeter spatial precision (0.5 mm) and high temporal resolution (0.03 s). The study will enroll 60-80 TBI patients, alongside healthy controls and pharmacologically paralyzed subjects, to validate SeeMe's sensitivity and specificity.

Primary Goals:
1. Validation: Compare SeeMe's detection of voluntary motor recovery against gold-standard clinical examination (CRS-R).
2. Synchronization: Simultaneously record and time-lock electroencephalography (EEG) and electrocorticography (ECoG) with SeeMe-detected movements.
3. Biomarker Identification: Characterize neural signatures (specifically beta-band oscillations) associated with the return of voluntary behavior.

Impact: By providing a real-time, objective measure of motor intention and execution, SeeMe aims to identify Cognitive-Motor Dissociation (CMD) earlier than current methods, enabling more accurate prognostication and laying the groundwork for future closed-loop neuromodulation (e.g., vagus nerve stimulation) to accelerate TBI recovery.
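To make the tracking idea concrete, the sketch below estimates the frame-to-frame displacement of one small image patch (e.g., around a facial pore) by mean-subtracted cross-correlation over a local search window. This is an illustrative sketch only: the function name, patch sizes, and search range are hypothetical and are not taken from the SeeMe software, which the summary describes only at a high level.

```python
import numpy as np

def patch_displacement(prev_frame, next_frame, center, patch=8, search=4):
    """Estimate the (dy, dx) shift of a small patch between two grayscale
    frames by mean-subtracted cross-correlation over a local search window.
    Illustrative sketch only; not the study's actual tracking code."""
    cy, cx = center
    tmpl = prev_frame[cy - patch:cy + patch, cx - patch:cx + patch]
    tmpl = tmpl - tmpl.mean()
    best_score, best_offset = -np.inf, (0, 0)
    # Exhaustively score every candidate shift in the search window.
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = next_frame[cy + dy - patch:cy + dy + patch,
                              cx + dx - patch:cx + dx + patch]
            cand = cand - cand.mean()
            score = float((tmpl * cand).sum())
            if score > best_score:
                best_score, best_offset = score, (dy, dx)
    return best_offset

# Demo: a synthetic frame shifted by 2 rows and 1 column should be
# recovered as a (2, 1) displacement at the patch center.
rng = np.random.default_rng(0)
prev = rng.normal(size=(64, 64))
nxt = np.roll(prev, (2, 1), axis=(0, 1))
print(patch_displacement(prev, nxt, (32, 32)))
```

In a full pipeline, applying this to a dense grid of patches would yield the displacement field from which heatmaps like those the study describes could be built; sub-pixel refinement and a learned classifier (the study names a bidirectional LSTM) would sit downstream of this step.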

Official title: SeeMe: A Multimodal Behavioral-Electrophysiological Tool for Real-Time Detection of Motor Behavior in Brain Injury Patients

Key Details

Gender

All

Age Range

22 Years - 85 Years

Study Type

INTERVENTIONAL

Enrollment

80

Start Date

2026-03-30

Completion Date

2029-12

Last Updated

2026-05-05

Healthy Volunteers

Yes

Interventions

DIAGNOSTIC_TEST

SeeMe Multimodal Auditory Command Protocol

A standardized, computer-controlled auditory stimulation (AS) protocol designed to elicit and quantify microscopic motor responses.

Stimuli: Participants are presented with five distinct auditory commands: 1) "Stick out your tongue," 2) "Open your eyes," 3) "Show me a smile," 4) "Close your hands," and 5) a neutral control command ("Today is a sunny day").

Timing: Each command is presented 10 times via single-use headphones, with a randomized 30-45 second jittered interval between trials to distinguish stimulus-evoked responses from spontaneous arousal.

Data Capture: Responses are recorded with high-resolution video (Panasonic HC-2000X) at 0.03 s temporal resolution, synchronized to millisecond-level EEG/ECoG.

Analysis: Displacement heatmaps are generated via facial pore vector analysis and classified with a bidirectional long short-term memory (LSTM) neural network to determine the statistical significance of motor initiation relative to a 15-minute resting baseline.
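The timing rules above (5 commands × 10 presentations, randomized 30-45 s jittered inter-trial intervals) can be sketched as a trial-schedule generator. This is a minimal illustration of the stated parameters, not the study's stimulation software; the function name and seed handling are assumptions.

```python
import random

# The five commands listed in the protocol (fifth is the neutral control).
COMMANDS = [
    "Stick out your tongue",
    "Open your eyes",
    "Show me a smile",
    "Close your hands",
    "Today is a sunny day",  # neutral control
]

def build_schedule(repeats=10, jitter=(30.0, 45.0), seed=None):
    """Return a shuffled list of (onset_seconds, command) pairs:
    each command appears `repeats` times, and consecutive onsets are
    separated by a uniformly random 30-45 s jittered interval.
    Illustrative sketch only; not the study's software."""
    rng = random.Random(seed)
    trials = COMMANDS * repeats
    rng.shuffle(trials)          # randomize presentation order
    schedule, t = [], 0.0
    for cmd in trials:
        schedule.append((t, cmd))
        t += rng.uniform(*jitter)  # jittered inter-trial interval
    return schedule

# Demo: 50 trials total, 10 per command, gaps all within 30-45 s.
schedule = build_schedule(seed=42)
print(len(schedule), schedule[0])
```

The jitter serves the purpose the protocol states: because a patient cannot anticipate the next command onset, movements time-locked to onsets can be separated from spontaneous arousal.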

Locations (1)

Stony Brook University Hospital

Stony Brook, New York, United States