The EXPERIENCE project
The EXPERIENCE project aims to enable the creation and sharing of extended-personal realities. Extended-personal realities are custom-made virtual environments generated from personal neurophysiological data and enriched with psychological, cognitive, and behavioral information. The extended-personal reality of one user (the provider) can be shared with others, who re-live the same experience via visual, auditory, and tactile stimuli, space-time manipulation, and multiple biofeedback channels. This will enable a new kind of social interaction and provide access to new, previously inaccessible layers of consciousness. Practically, the project will result in a simple, ready-to-use system of hardware and software technologies that allows anyone to generate their own VR environments.
The research consortium responsible for the project is continuously working on the required hardware and software infrastructure, including, but not limited to, a wearable system for recording physiological states and advanced artificial intelligence for the quick and easy reconstruction of physical environments in virtual reality (currently limited to static, indoor environments). At the same time, scientific investigation is ongoing to better understand subjective personal experiences, such as time-space perception, and to use the EXPERIENCE system and extended-personal realities for the diagnosis and treatment of affective disorders.
Current diagnostic protocols for affective disorders are often based on self-reports and verbal communication between the patient and the psychiatrist. They therefore require the patient to accurately reflect on and describe their symptoms, and the mental health professional to accurately understand the patient's personal experience.
Mental health assessment – relevance of virtual reality?
The EXPERIENCE system aims to make the diagnostic process more direct by removing this communication barrier. This is made possible by virtual reality, which enables the direct observation and measurement of behavior. By first identifying the patterns of behavior that distinguish people with affective disorders from healthy individuals, the system could later recognize those same patterns and support the diagnostic process. Currently, the consortium is working on assessing the severity of depressive symptoms, but will later also focus on anxiety and anorexia nervosa.
The pilot study has started
Data collection has started in the EXPERIENCE trial in Padua, Italy, to test the first version of a VR environment specifically designed for the assessment of depressive symptoms. Participants are free to explore the virtual environment, which resembles a family home; while they do so, numerous measures are recorded, including specific behaviors, eye gaze, cognitive performance, and physiological signals. The collected data will be used to train a machine learning algorithm that should assess depressive symptoms based exclusively on how participants engage with the virtual environment.
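To give a rough idea of what such a model could look like in principle, the sketch below trains a regressor on per-session features and cross-validates its error against a symptom-severity score. It is only an illustration under assumed inputs: the feature names (rooms visited, gaze dwell time, task accuracy, heart rate), the synthetic data, and the severity scale are not the consortium's actual variables or pipeline.

```python
# Minimal sketch: predicting depressive-symptom severity from VR session
# features. All feature names and the target scale are illustrative
# assumptions, not the EXPERIENCE project's actual measures or method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_sessions = 120  # hypothetical number of recorded VR sessions

# Synthetic stand-ins for measures recorded during free exploration:
# rooms visited, mean gaze dwell time (s), cognitive-task accuracy,
# mean heart rate (bpm).
X = np.column_stack([
    rng.integers(1, 10, n_sessions),    # rooms_visited
    rng.normal(2.5, 0.8, n_sessions),   # gaze_dwell_mean_s
    rng.uniform(0.4, 1.0, n_sessions),  # task_accuracy
    rng.normal(72, 9, n_sessions),      # heart_rate_mean_bpm
])
# Hypothetical symptom-severity target (e.g. a questionnaire total score).
y = rng.uniform(0, 40, n_sessions)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5,
                         scoring="neg_mean_absolute_error")
print(f"Cross-validated MAE: {-scores.mean():.1f} points")
```

With real session data, the random draws would simply be replaced by the recorded behavioral, cognitive, and physiological features, and the target by the clinical assessment collected alongside each session.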
We will then see which of the measures are informative; these can be carried over into future versions of the system. Future versions will also be more personal, for example by using personal recordings of virtual environments instead of a predesigned one. The more the hardware and software technologies of the project evolve, the more data will be available to inform the algorithms.
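One common way to check which measures carry predictive information, continuing the illustrative sketch above (same assumed features, data, and model), is permutation importance on a held-out split: each feature is shuffled in turn and the drop in predictive accuracy is measured.

```python
# Continuing the sketch above: rank the hypothetical features by how much
# shuffling each one degrades held-out predictions (permutation importance).
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

feature_names = ["rooms_visited", "gaze_dwell_mean_s",
                 "task_accuracy", "heart_rate_mean_bpm"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
model.fit(X_train, y_train)

result = permutation_importance(model, X_test, y_test,
                                n_repeats=20, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean),
                          key=lambda pair: -pair[1]):
    print(f"{name}: {score:.3f}")
```

Measures that rank consistently low across such analyses would be candidates for dropping from future versions of the system, while high-ranking ones would be kept and refined.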
Possible impact
We hope that the EXPERIENCE system will be able to support the diagnosis of affective disorders by incorporating novel and objective measures into the diagnostic process. This might lead to earlier detection or enable the identification of different patient subgroups.
In the future, the treatment possibilities of the EXPERIENCE system will also be investigated, for example exposure therapy in virtual reality.
More information
Please visit the EXPERIENCE website for more information or follow the project on social media (Facebook, Twitter).
Funding
The EXPERIENCE project and the study are funded by the European Commission H2020 Framework Program, Grant No. 101017727.