
Date: August 29, 2025
Speaker: Michal Irani, Weizmann Institute of Science
Title: Reading Minds & Machines
(This talk will also be given in Bloomberg 081 at Cornell Tech.)
Abstract:
1. Can we reconstruct images that a person saw, directly from their fMRI brain recordings?
2. Can we reconstruct the training data that a deep network was trained on, directly from the parameters of the network?
The answer to both of these intriguing questions is “Yes!”
In this talk I will present some of our work in both of these domains. I will then show how combining the power of Brains & Machines can lead to significant breakthroughs in both domains, potentially bridging the gap between Minds and Machines. Finally, I will show how combining the power of multiple brains (with NO shared data) may lead to new discoveries in brain science and allow mapping of information between different brains.
Bio: Michal Irani is a Professor at the Weizmann Institute of Science, where she is currently the Dean of the Faculty of Mathematics and Computer Science. Michal's research interests center around computer vision, artificial intelligence, and decoding information from brain activity. She received her PhD from the Hebrew University of Jerusalem (1994). During 1993-1996 she was a member of the Sarnoff Research Center (Princeton), and she joined the Weizmann Institute in 1997.
Michal's honors and awards include the Sarnoff Technical Achievement Award (1994), the Alon Fellowship for Outstanding Young Scientists (1998), the Levinson Prize in Mathematics (2003), the Maria Petrou Prize (awarded by the IAPR) for outstanding contributions to the fields of Computer Vision and Pattern Recognition (2016), the Helmholtz "Test of Time Award" for her paper "Actions as space-time shapes" (2017), the Landau Prize in Artificial Intelligence (2019), the Rothschild Prize in Mathematics and Computer Science (2020), and several Best Paper Awards at leading computer vision conferences. In 2023, Michal was elected a member of the Israel Academy of Sciences and Humanities.