Neuro-LLM
Toward understanding brain and language

ICONIP 2025 Workshop: Nov. 24, OIST


Accelerating Large-Scale Computational Neuroscience Using Transformers and Large Language Models (LLMs)


Under the AMED Brain/MINDS 2.0 Domain 4 project, our research aims to transform computational neuroscience through four key objectives. First, we seek to develop a standard platform for analyzing large-scale brain measurement and simulation data. While a fully detailed digital brain capturing every nuance from molecules to behavior is theoretically conceivable, the enormous number of undetermined parameters makes it impractical. Instead, we propose building a functional digital brain model by applying Transformer architectures to both experimental and simulation datasets, linking the molecular, cellular, circuit, and behavioral scales. Second, we aim to unravel the inner workings of large language models (LLMs) and large multimodal models (LMMs). Although these Transformers excel at next-word prediction and implicitly acquire linguistic structure, how their high-level representations emerge remains unclear. We plan to create innovative, large-scale analytical methods that extend beyond current techniques such as attention flow analysis. Third, we will apply these Transformer-based techniques to neural data, optimizing models to predict brain activity, capture the essential dynamics and spatiotemporal interactions of neural populations, and refine our digital brain model. Lastly, we intend to develop a specialized LLM that automates script generation and evaluation, streamlining complex simulations and data analyses. Join us as we bridge advanced computational methods with neuroscience, paving the way for a new era of data-driven brain research!

Program (under consideration)


• Introduction

     o Riichiro Hira, Institute of Science Tokyo (5 minutes)



• Regular Talks

     o Kentaro Inui, MBZUAI (25 minutes presentation / 5 minutes Q&A)

     o Jun Igarashi, RIKEN RCCS (25 minutes presentation / 5 minutes Q&A)

     o Keisuke Sakaguchi, Tohoku University (25 minutes presentation / 5 minutes Q&A)

     o Benjamin Heinzerling, RIKEN AIP (25 minutes presentation / 5 minutes Q&A)

     o Yu Takagi, Nagoya Institute of Technology (25 minutes presentation / 5 minutes Q&A)

     o Riichiro Hira (20 minutes presentation / 5 minutes Q&A)

Abstract


Kentaro Inui

Mohamed bin Zayed University of Artificial Intelligence





Jun Igarashi

RIKEN Center for Computational Science





Keisuke Sakaguchi

Department of System Information Sciences, Graduate School of Information Sciences, Tohoku University





Benjamin Heinzerling

RIKEN Center for Advanced Intelligence Project (AIP)





Yu Takagi

Nagoya Institute of Technology





Riichiro Hira

Department of Physiology and Cell Biology, Institute of Science Tokyo





Contact

Riichiro Hira    (Institute of Science Tokyo)
riichirohira@gmail.com