Sex and Age Shape Progression of Batten Disease, Brainwave Study Finds
Thursday, November 6, 2025
Batten disease is a rare inherited condition that affects brain development and function. CLN3 disease is its most common form. The symptoms are life changing and usually begin between the ages of four and seven: children experience vision loss, cognitive and movement problems, seizures, and difficulties with speech. These symptoms make the disease difficult to study, and how it progresses in males versus females is not well understood. It is common, however, for female patients with CLN3 to experience later symptom onset than males, followed by more rapid disease progression.
Researchers from the Del Monte Institute for Neuroscience at the University of Rochester have found that male and female brains respond differently as the disease progresses, and have established a mouse model of the disease that could transform how future treatments are studied, as explained in a paper published today in the Journal of Neurodevelopmental Disorders.
“Because vision and cognition decline early, it is hard for scientists to track how the disease progresses and develop reliable treatments using standard tests,” said Yanya Ding, PhD, an alumna of the Neuroscience Graduate Program in the Wang Lab and first author of the study. “Being able to successfully track brain functions in mice gives us a model that could transform how we study possible treatments and therapeutics for this devastating disease.”
Using a non-invasive measure of the brain’s electrical activity known as electroencephalography (EEG), together with an auditory test, the researchers detected how the brain responds to changes in sound patterns in male and female mouse models of CLN3. Surprisingly, they discovered that male mice showed early auditory processing problems that improved with age, while female mice had persistent difficulties: evidence that both age and sex play important roles in the progression of Batten disease.
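For readers unfamiliar with this kind of measurement, the sketch below shows, on simulated data, how a sound-pattern-change response is typically quantified in an oddball-style auditory test: as the difference between averaged EEG responses to rare "deviant" sounds and frequent "standard" sounds. This is purely illustrative and is not the study's code; the array shapes, deviant proportion, and effect size are all assumptions.

```python
import numpy as np

# Minimal illustration (not the study's analysis code): estimate a mismatch
# response as the averaged response to rare deviant tones minus the averaged
# response to frequent standard tones. All data are simulated placeholders.

rng = np.random.default_rng(0)
n_trials, n_samples = 400, 300                # epochs x time points (assumed)
is_deviant = rng.random(n_trials) < 0.15      # ~15% deviants, typical of oddball designs

epochs = rng.normal(0.0, 1.0, (n_trials, n_samples))
epochs[is_deviant] += 0.5                     # inject a larger deviant response (assumed)

standard_erp = epochs[~is_deviant].mean(axis=0)  # average response to standards
deviant_erp = epochs[is_deviant].mean(axis=0)    # average response to deviants
mismatch = deviant_erp - standard_erp            # difference wave tracked across age and sex

print(f"peak mismatch amplitude: {mismatch.max():.2f}")
```

Tracking how the amplitude of such a difference wave changes with age, separately in males and females, is the flavor of longitudinal comparison the study describes.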
Previous research led by the co-senior author of this study, John Foxe, PhD, principal investigator of the Fredrick J. and Marion A. Schindler Cognitive Neurophysiology Lab at the University of Rochester, identified the easy-to-measure brain response, or biomarker, in human CLN3 patients that this mouse study was built on.
“These findings highlight the importance of tracking brain function over time and support the use of this EEG-based method as a valuable tool for monitoring disease progression and testing new treatments,” said Kuan Hong Wang, PhD, professor of Neuroscience and co-senior author of the new study. “By showing how Batten disease progresses differently in males and females, this research could help guide more personalized therapies and improve the timing of interventions for better outcomes.”
Read More: Sex and Age Shape Progression of Batten Disease, Brainwave Study Finds

Scientists Reveal How Senses Work Together in the Brain
Friday, August 15, 2025
It has long been understood that experiencing two senses simultaneously, like seeing and hearing, can produce faster, more effective responses than either sense alone. For example, a potential prey animal that gets both visual and auditory cues that it is about to be attacked by a snake in the grass has a better chance of survival. Precisely how multiple senses are integrated, or work together, in the brain has fascinated neuroscientists for decades. New research by an international collaboration between scientists at the University of Rochester and a research team in Dublin, Ireland, has revealed key new insights.
“Just like sensory integration, sometimes you need human integration,” said John Foxe, PhD, director of the Del Monte Institute for Neuroscience at the University of Rochester and co-author of the study that shows how multisensory integration happens in the brain. These findings were published in Nature Human Behaviour today. “This research was built on decades of study and friendship. Sometimes ideas need time to percolate. There is a pace to science, and this research is the perfect example of that.”
Simon Kelly, PhD, professor at University College Dublin, led the study. In 2012, his lab discovered a way to use an electroencephalographic (EEG) signal to measure the brain gathering information for a decision over time, a step that followed years of research setting the stage for this work. “We were uniquely positioned to tackle this,” Kelly said. “The more we know about the fundamental brain architecture underlying such elementary behaviors, the better we can interpret differences in the behaviors and signals associated with such tasks in clinical groups and design mechanistically informed diagnostics and treatments.”
Research participants were asked to watch a simple dot animation while listening to a series of tones and press a button when they noticed a change in the dots, the tones, or both. Using EEG, the scientists were able to infer that when changes happened in both the dots and tones, auditory and visual decision processes unfolded in parallel but came together in the motor system. This allowed participants to speed up their reaction times. “We found that the EEG accumulation signal reached very different amplitudes when auditory versus visual targets were detected, indicating that there are distinct auditory and visual accumulators,” Kelly said.
Using computational models, the researchers then tried to explain both the decision signal patterns and the reaction times. In one model, the auditory and visual accumulators race each other to trigger a motor reaction; in the other, the auditory and visual accumulators are integrated before the information is sent to the motor system. Both models worked until the researchers added a slight delay to either the auditory or the visual signal. The integration model then did a much better job of explaining all the data, suggesting that during a multisensory (audiovisual) experience, the decision signals may start on their own sensory-specific tracks but integrate when the information is sent to the areas of the brain that generate movement, as the sketch below illustrates.
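To make the distinction between the two architectures concrete, here is a minimal simulation sketch. It is not the authors' model code: the drift, noise, and threshold values are illustrative assumptions, and the real models were fit to the measured EEG signals and reaction times.

```python
import numpy as np

# Minimal sketch (assumed parameters, not the published model): two noisy
# evidence accumulators, one auditory and one visual, compared under a "race"
# architecture and an "integration" architecture.

rng = np.random.default_rng(0)
DT, DRIFT, NOISE, THRESH = 0.001, 1.2, 0.8, 1.0  # time step (s) and arbitrary units

def accumulate(duration, onset=0.0):
    """Simulate one sensory accumulator; its evidence starts at `onset` seconds."""
    t = np.arange(0, duration, DT)
    drift = np.where(t >= onset, DRIFT, 0.0)
    dx = drift * DT + NOISE * np.sqrt(DT) * rng.standard_normal(t.size)
    return np.cumsum(dx)

def rt_race(aud_onset=0.0, vis_onset=0.0, duration=3.0):
    """Race model: whichever accumulator crosses threshold first triggers the response."""
    a, v = accumulate(duration, aud_onset), accumulate(duration, vis_onset)
    crossings = np.flatnonzero((a >= THRESH) | (v >= THRESH))
    return crossings[0] * DT if crossings.size else np.nan

def rt_integrate(aud_onset=0.0, vis_onset=0.0, duration=3.0):
    """Integration model: the summed outputs feed a single motor criterion."""
    a, v = accumulate(duration, aud_onset), accumulate(duration, vis_onset)
    crossings = np.flatnonzero((a + v) >= THRESH)
    return crossings[0] * DT if crossings.size else np.nan

# Mean simulated reaction times with the visual stream delayed by 100 ms:
race = np.nanmean([rt_race(vis_onset=0.1) for _ in range(2000)])
integ = np.nanmean([rt_integrate(vis_onset=0.1) for _ in range(2000)])
print(f"race: {race:.3f} s, integration: {integ:.3f} s")
```

Comparing how each architecture's predicted reaction times shift when one sensory stream is delayed is the kind of model comparison that, in the study, favored the integration account.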
“The research provides a concrete model of the neural architecture through which multisensory decisions are made,” Kelly said. “It clarifies that distinct decision processes gather information from different modalities, but their outputs converge onto a single motor process where they combine to meet a single criterion for action.”
Read More: Scientists Reveal How Senses Work Together in the Brain

What New Research Reveals about Autism, Stimming, and Touch
Monday, April 14, 2025
Tapping a pen, shaking a leg, twirling hair: we have all been in a classroom, meeting, or public place where we find ourselves or someone else engaging in repetitive behavior, a type of self-stimulatory movement also known as stimming. For people with autism, stimming can include movements like flicking fingers or rocking back and forth. These actions are believed to help people cope with overwhelming sensory environments, regulate emotions, or express joy, but stimming is not well understood. And while the behaviors are mostly harmless, and in some instances beneficial, stimming can also escalate and cause serious injuries. That makes it a difficult behavior to study, especially when self-harm is involved.
“The more we learn about how benign active tactile sensations like stimming are processed, the closer we will be to understanding self-injurious behavior,” said Emily Isenstein, PhD (’24), Medical Scientist Training Program trainee at the University of Rochester School of Medicine and Dentistry, and first author of the study in NeuroImage that provides new clues about how people with autism process touch. “By better understanding how the brain processes different types of touch, we hope to someday work toward more healthy outlets of expression to avoid self-injury.”
Researchers used several technologies to create a more realistic sensory experience for active touch (reaching out and touching) and passive touch (being touched). A virtual reality headset simulated visual movement, while a vibrating finger clip, or vibrotactile disc, replicated touch. Using EEG, researchers measured the brain responses of 30 neurotypical adults and 29 adults with autism as they performed active and passive touch tasks. To measure active touch, participants reached out to touch a virtual hand, giving them control over when they would feel the vibrations. To measure passive touch, a virtual hand reached out to touch them; participants felt vibrations when the two hands “touched,” simulating physical contact. As expected, the researchers found that the neurotypical group showed a smaller brain response to active touch than to passive touch, evidence that the brain devotes fewer resources to touch it controls and can predict (see the sketch below).
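As a rough illustration of the contrast being measured, the sketch below compares average response amplitudes in a post-touch window for simulated active and passive trials. The sampling rate, epoch window, and attenuation effect are assumptions for demonstration, not the study's data or analysis pipeline.

```python
import numpy as np

# Minimal sketch (assumed shapes and windows, not the study's pipeline):
# contrast the average EEG amplitude to self-initiated (active) touch with
# the amplitude to externally delivered (passive) touch. The attenuated
# active response seen in neurotypical adults is simulated here.

rng = np.random.default_rng(1)
fs = 500                                  # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1 / fs)      # epoch: -100 ms to +500 ms around touch

def window_mean(epochs, t_lo=0.1, t_hi=0.2):
    """Mean amplitude per trial in a post-touch window (trials x samples)."""
    mask = (times >= t_lo) & (times < t_hi)
    return epochs[:, mask].mean(axis=1)

# Simulated single-sensor epochs: active responses attenuated relative to passive
active = rng.normal(0.5, 1.0, (100, times.size))
passive = rng.normal(1.0, 1.0, (100, times.size))

print(f"active:  {window_mean(active).mean():.2f}")
print(f"passive: {window_mean(passive).mean():.2f}")
```

A group showing roughly equal "active" and "passive" values in this kind of contrast would resemble the pattern the study reports in the autism group.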
However, the group with autism showed little variation in brain response between the two types of touch; both responses were more in line with the neurotypical group’s response to passive touch, suggesting that in autism the brain may have trouble distinguishing between active and passive inputs. “This could be a clue that people with autism may have difficulty predicting the consequences of their actions, which could be what leads to repetitive behavior or stimming,” said Isenstein.
It was a surprising finding, particularly in adults. John Foxe, PhD, director of the Golisano Intellectual and Developmental Disabilities Institute at the University of Rochester and co-senior author of the study, remarked that this difference may be even greater in children with autism than in their neurotypical counterparts. “Many adults with autism have learned how to interact effectively with their environment, so the fact that we’re still finding differences in brain processing for active touch leads me to think this response may be more severe in kids, and that’s what we also need to understand.”
Read More: What New Research Reveals about Autism, Stimming, and Touch