Brain and AI: What fascinating connections does the new study reveal?
A new study from the University of Osnabrück, in which Prof. Dr. Tim C. Kietzmann is involved, investigates how the brain processes visual information and links it with language. Published in "Nature Machine Intelligence", it sheds light on connections between brain activity and AI models.

A new study published in "Nature Machine Intelligence" examines in depth how the human brain processes visual information and connects it with language. Under the direction of Prof. Dr. Tim C. Kietzmann from the University of Osnabrück, the research team developed an innovative approach to explore the interaction between visual perception and language models.
As part of the research, subjects were shown images in a magnetic resonance imaging (MRI) scanner while their brain activity was recorded. The study hypothesized that the visual system processes information in a way that is compatible with linguistic structures. Prof. Dr. Adrien Doerig, who now works at the Free University of Berlin, described this possible connection as a universal “lingua franca” between different brain regions.
Artificial intelligence and human brain activity
A striking result of the study is that today's language models, especially large language models, show remarkable similarities to the human visual system in their activity. The artificial neural networks trained in the study were able to accurately predict linguistic representations from images, and they surpass many current AI technologies in their ability to model brain activity.
A central focus is the functioning of the frontal lobe, which is particularly active during the processing of visual information. Recent findings underline that the frontal lobe is responsible not only for visual perception, but also for cognitive processes such as thinking and decision-making. During the research, 13 participants viewed a total of 28 images, including faces and familiar places, while their brain activity was measured repeatedly over time.
The results showed that activations in the frontal lobe were more strongly linked to text-based networks than to visual networks. This suggests that the brain closely associates visual information with linguistic processing, a finding that challenges the traditional view that the frontal lobe is dedicated exclusively to motor and decision-making tasks. The study further suggests that, while the images were being presented, frontal-lobe activity correlated with text-related responses over a longer period of time.
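Comparisons like the one described above are often made with representational similarity analysis (RSA): instead of matching individual measurements, one compares the *geometry* of responses, i.e., how similarly two systems treat the same set of images. The sketch below uses entirely synthetic "brain" and "text-model" data with a shared latent structure; the sizes and the analysis details are assumptions for illustration, not taken from the study.

```python
# Illustrative sketch of representational similarity analysis (RSA):
# comparing the response geometry of two systems over the same images.
# All data are synthetic; this does not reproduce the study's pipeline.
import numpy as np

rng = np.random.default_rng(0)

n_images = 28     # number of stimulus images (as reported above)
n_voxels = 500    # hypothetical number of recorded brain features
embed_dim = 768   # hypothetical text-embedding dimensionality

# Synthetic stand-ins: a shared latent factor partially aligns both spaces.
latent = rng.standard_normal((n_images, 10))
brain = latent @ rng.standard_normal((10, n_voxels)) \
        + 0.5 * rng.standard_normal((n_images, n_voxels))
text = latent @ rng.standard_normal((10, embed_dim)) \
       + 0.5 * rng.standard_normal((n_images, embed_dim))

def rdm(responses):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the response patterns of every pair of images."""
    return 1.0 - np.corrcoef(responses)

def rsa_score(a, b):
    """Correlate the upper triangles of two RDMs (ignoring the diagonal)."""
    iu = np.triu_indices(a.shape[0], k=1)
    return np.corrcoef(a[iu], b[iu])[0, 1]

score = rsa_score(rdm(brain), rdm(text))
print(f"brain-text representational alignment: {score:.2f}")
```

A high score means the two systems group the same images as similar or dissimilar, even though their raw feature spaces are completely different, which is the sense in which brain activity can be said to "align" with a language model's representations.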
Visual perception and new research approaches
Another important aspect of the research is the function of the different parts of the visual field. The foveal field, which accounts for only 1% of the total visual field, plays a crucial role in activities such as reading, while peripheral vision is essential for orientation and navigation. Peripheral vision has received comparatively little attention; its role is being investigated in the PERFORM project, which also highlights the complexity of transsaccadic perception, a process that is still poorly understood.
In addition, children performed worse than adults on tests of peripheral position perception, but they showed faster corrections through eye movements. This suggests that the brain seamlessly fills in gaps in sensory information, underscoring the relevance of multidimensional perception. These findings are now feeding into a new research project called SENCES, which deals with how the brain completes sensory information.
Continued research in these areas could not only deepen our understanding of the complex interplay between visual perception and language, but also have practical applications, from improving brain-computer interfaces to developing visual prostheses for people with visual impairments. Given the many open questions that remain, it will be exciting to see what future research uncovers from these findings.