David Williams, PhD, dean for Research at the UR’s College of Arts, Sciences and Engineering, says it won’t be long before smartphones go to our heads, literally.
The director of the UR’s Center for Visual Sciences, who has pioneered technologies to improve the eyesight of people around the globe, says about 1,000 companies worldwide are investing millions of dollars to develop glasses that respond to voice commands with hands-free visual and auditory feedback, connect wearers with others instantly, and fully engage the internet, including social media, without getting in the way of everyday functioning.
“Everything you can see and hear on your smartphone you’ll be able to do in a much more enriched and interactive way, to navigate the world,” he says.
Williams cites this example because the commercial pursuit of such technologies has opened the floodgates for academic institutions like the UR to develop multi-sensory technologies that diagnose and treat people with neurological diseases.
“The challenges of providing seamless access to online information, without disrupting the perceptual experience of the world, have birthed a renaissance in basic research on multi-sensory processing,” he says. “And our international expertise in neuroscience, especially multi-sensory processing, and the powerful clinical applications we are discovering for these patients, make The Del Monte Institute a natural home for this initiative.”
The Center for Augmented and Virtual Reality is the brainchild of Williams and Mark Bocko, PhD, chair of Electrical and Computer Engineering. It brings together more than 50 optical and electrical engineers, computer scientists, neuroscientists and augmented-virtual reality (AVR) users from across the UR—including professors of music, the humanities and medicine. With the support of federal, corporate and private funding, the goal is to develop a new wave of AVR technologies that will allow scientists to study, at an unprecedented level, how patients integrate sensory information with balance, motor and muscular activity.
“In patients where those abilities are compromised, what better way to study them than to have complete control of a sensory environment through virtual reality?” says Williams. “We can look at how patients respond in different settings and what their deficits are in integrating varied senses. Maybe there’s a mismatch between what their hearing is telling them and what their vision is telling them…this is a way to isolate that.”
The work of Krystel Huxlin, PhD, professor of Neuroscience and Visual Science, offers another example of how AVR technology might be applied. Huxlin currently uses a one-of-a-kind visual training regimen, delivered through a personalized software system, to restore sight in patients who have lost it following a stroke. To maximize the regimen’s potential, she is now working to develop a virtual reality device that would allow patients to do the training at home.
“We want to develop technologies that feed our scientists’ visions, while also being a resource for industries that can benefit from our research,” says Williams. “There are very few universities capable of doing this kind of work, and there are no limits to how it can be applied in neuroscience and beyond...only our imaginations.”