Environmental complexity study
ECS
This is a project I am very fortunate to work on as a Research Assistant for the Johns Hopkins Medical Institute. Under the supervision of Dr. Yingzi Xiong, we have been developing a program and curating a set of scenes for working with dual-sensory impaired patients, in order to test their mobility and rate their perceived complexity of real-world environments.

The work started with going through thousands of field recordings, courtesy of the TAU Urban Audio-Visual Database (2021). We began by parsing ten scene categories, with up to 30 locations each, to create a wide range of environmental complexity. The second task was building a program that randomizes the playback order of the scenes across three modalities: auditory, visual, and audiovisual. During each scene the program asks the patient four questions, and a prompt is cued before each field recording plays in order to set up the test for the patient.

We are currently testing control participants and will begin testing dual-sensory impaired patients in January 2025. This project is also a collaboration with Dr. Joe Nemargut of the University of Montreal. The program is designed in Max/MSP.
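The randomized-playback logic can be sketched as follows. This is a minimal illustration only, with hypothetical scene names and counts; the actual program is built in Max/MSP, not Python:

```python
import random

# Hypothetical placeholders for the curated scene recordings.
SCENES = [f"scene_{i:02d}" for i in range(1, 11)]          # 10 scene categories
MODALITIES = ["auditory", "visual", "audiovisual"]         # 3 presentation modes

def build_playlist(seed=None):
    """Return every scene/modality pairing in a randomized trial order."""
    rng = random.Random(seed)
    trials = [(scene, mode) for scene in SCENES for mode in MODALITIES]
    rng.shuffle(trials)
    return trials

# For each trial: cue the prompt, play the recording in the given
# modality, then ask the patient the four questions.
playlist = build_playlist(seed=42)
```

Seeding the shuffle makes each participant's trial order reproducible while still varying between participants.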