[Image: a research participant completing the laboratory-based visual-somatosensory assessment]

Source:

Mahoney, J.R. & Verghese, J. Using the Race Model Inequality to Quantify Behavioral Multisensory Integration Effects. J Vis Exp., 147, e59575, 2019. PMID: 31132070

The picture above shows one of our research participants completing a laboratory-based visual-somatosensory assessment.
 
With his eyes fixated on the central bull's-eye and his hands cradling fixed bilateral stimulators (vibration motors and LEDs), he is asked to respond via foot pedal (not pictured) as quickly as possible whenever he feels, sees, or both feels and sees (the multisensory condition) any stimulation.
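The cited paper quantifies multisensory integration with the race model inequality: if the two senses merely "race" each other, the cumulative probability of a multisensory response by time t cannot exceed the sum of the unisensory cumulative probabilities. A minimal sketch of that check, using hypothetical reaction-time data (all distributions and parameters below are illustrative, not from the study):

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative RT distribution evaluated on a time grid (ms)."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

# Hypothetical reaction times (ms) for the three stimulus conditions.
rng = np.random.default_rng(0)
rt_visual = rng.normal(420, 60, 200)   # visual-only trials
rt_somato = rng.normal(440, 60, 200)   # somatosensory-only trials
rt_multi  = rng.normal(360, 50, 200)   # multisensory trials (typically fastest)

t_grid = np.arange(200, 700, 10)
cdf_v  = ecdf(rt_visual, t_grid)
cdf_s  = ecdf(rt_somato, t_grid)
cdf_vs = ecdf(rt_multi, t_grid)

# Race model bound: P(RT_VS <= t) <= P(RT_V <= t) + P(RT_S <= t).
# Positive values violate the bound, i.e. the multisensory condition is
# faster than any race of independent unisensory channels could produce.
violation = cdf_vs - np.minimum(cdf_v + cdf_s, 1.0)
print(f"Max race-model violation: {violation.max():.3f}")
```

A positive peak in the violation curve is the behavioral signature of multisensory integration that the assessment is designed to capture.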
 
Headphones provide a constant stream of white noise to mask any sound emanating from the vibration motors.

This lab equipment is costly and not designed to be mobile. We therefore created CatchU™, a digital healthcare app that presents the same experimental paradigm on an iPhone, giving clinicians a less invasive, more flexible way to gauge fall risk in older adults.

The assessment is conducted remotely through the app, and the results are transmitted electronically. Preventing falls is key to helping older adults maintain their independence as they age in their communities.