Abstract: In this talk, we present a geometric framework for learning and processing information with deep neural networks. We introduce feature geometry, which unifies statistical dependence and feature representations in a function space. We formulate each learning problem as solving for the optimal feature representation of the associated dependence component. We will illustrate how this perspective provides a unified solution to distinct learning problems, including estimating different targets, ranking, and prediction/classification. In particular, we will discuss its extension to learning structured feature representations that achieve optimal performance-cost tradeoffs. We will also showcase its applications in various complicated learning scenarios, including dealing with constraints and incomplete modalities, incorporating side information for inference, learning the dependence structures of sequential data, and learning the spectral decomposition of general linear operators.
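To give a flavor of the spectral-decomposition viewpoint mentioned above, here is a minimal sketch of a classical finite-dimensional analogue: the SVD of the canonical dependence matrix of a joint distribution, whose singular vectors correspond to maximally correlated feature pairs. This is a generic textbook illustration, not the speaker's method; the function name and example distribution are our own.

```python
import numpy as np

def canonical_dependence_matrix(P):
    """B[x, y] = P(x, y) / sqrt(Px(x) * Py(y)) for a joint pmf P.

    The singular vectors of B define feature pairs of X and Y with
    maximal correlation; the top singular value (from the constant
    features) is always 1.
    """
    Px = P.sum(axis=1)  # marginal of X
    Py = P.sum(axis=0)  # marginal of Y
    return P / np.sqrt(np.outer(Px, Py))

# Illustrative joint pmf over two binary variables with positive dependence.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

B = canonical_dependence_matrix(P)
U, s, Vt = np.linalg.svd(B)

print(round(s[0], 6))  # 1.0 (constant features)
print(round(s[1], 6))  # 0.6 (maximal correlation of X and Y)
```

The second singular value is the Hirschfeld-Gebelein-Rényi maximal correlation of X and Y, which for this symmetric binary distribution coincides with the ordinary correlation coefficient.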
Bio: Xiangxiang Xu received the B.Eng. and Ph.D. degrees in electronic engineering from Tsinghua University, Beijing, China, in 2014 and 2020, respectively. He is a postdoctoral associate in the Department of EECS at MIT. His research focuses on information theory and statistical learning, with applications in understanding and developing learning algorithms. He is a recipient of the 2016 IEEE PES Student Prize Paper Award in Honor of T. Burke Hayes and the 2024 ITA (Information Theory and Applications) Workshop Sand Award.
Events are free and open to the public unless otherwise noted.