Yuening Cai, a graduate student in computer science, is among this year's cohort of Arts Integration Initiative Grant recipients. The grants, awarded by The Hopkins Center for the Arts and the Office of the Vice Provost for Research, aim to support arts-centric research, incubate interdisciplinary projects, and advance faculty-student mentorship.
Cai's proposed project, Spectral, is a multichannel immersive sound installation that transforms solar radiation data from different wavelength bands into a layered sonic structure, allowing audiences to experience the invisible energy and dynamics of the Sun within a spatial environment.
By integrating astronomical data, sound design, and spatial audio techniques, the work seeks to translate scientific measurements into an experiential sonic field, one that occupies a contemplative space between nature, technology, and artistic experience.
David Kotz, the Pat and John Rosenwald Professor of Computer Science, has been elected to the Board of Directors of the Computing Research Association, whose mission is to enhance innovation by joining with industry, government and academia to strengthen research and advanced education in computing.
Computer Science PhD students Shenyang Deng and Tianyu Pang won the Best Student Paper Award at the 37th International Conference on Algorithmic Learning Theory in February. Their co-authors include Zhuoli Ouyang, a former summer intern, and Assistant Professor Yaoqing Yang.
Their research reveals a shift in how scientists understand Stochastic Gradient Descent (SGD)—the optimization algorithm widely used to train modern deep-learning models.
Early empirical studies observed that SGD gradient directions align within a tiny subspace of the model’s parameter space in the later phase of training. This suggested that restricting SGD updates to this subspace might achieve performance comparable to full-space training. However, later research uncovered a counterintuitive phenomenon: when updates are restricted to this tiny subspace, training loss often fails to decrease.
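The phenomenon can be sketched on a toy problem. The example below is illustrative only, not taken from the paper: it runs gradient descent on an ill-conditioned two-dimensional quadratic with a step size near the stability threshold of the top-curvature direction. Late in training the gradient concentrates almost entirely in that direction, yet restricting updates to it leaves the loss stuck at the residual in the low-curvature direction. The curvatures, step size, and iteration count are arbitrary choices for the demonstration.

```python
# Toy sketch (illustrative assumptions, not the paper's setup):
# minimize f(x) = 0.5 * (L1*x1^2 + L2*x2^2) with ill-conditioned
# curvatures, using a step size just under the 2/L1 stability threshold.
import numpy as np

L = np.array([100.0, 1.0])   # curvatures; condition number 100
eta = 0.0199                 # step size, close to 2/L1 = 0.02

def loss(x):
    return 0.5 * np.sum(L * x**2)

def grad(x):
    return L * x

# Full-space gradient descent: both coordinates shrink
# (the top one slowly and oscillating, since 1 - eta*L1 = -0.99).
x = np.array([1.0, 1.0])
for _ in range(2000):
    x = x - eta * grad(x)
full_loss = loss(x)

# Same run with updates restricted to the top-curvature direction e1,
# the subspace the late-phase gradients align with.
xr = np.array([1.0, 1.0])
proj = np.array([1.0, 0.0])  # zero out the e2 component of each update
for _ in range(2000):
    xr = xr - eta * proj * grad(xr)
restricted_loss = loss(xr)   # stalls near the frozen e2 residual (~0.5)

# Alignment: fraction of the squared gradient norm in the top direction
g = grad(x)
align = g[0]**2 / np.sum(g**2)

print(f"full-space loss:   {full_loss:.2e}")
print(f"restricted loss:   {restricted_loss:.3f}")
print(f"gradient fraction in top direction: {align:.3f}")
```

The full-space run drives the loss to essentially zero, while the restricted run cannot make progress along the low-curvature coordinate at all, so its loss plateaus even though the gradient it follows accounts for nearly all of the gradient norm.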
The paper, titled "Suspicious Alignment of SGD: A Fine-Grained Step Size Condition Analysis," explains why this surprising behavior arises: its fine-grained theoretical analysis identifies critical step-size thresholds and regimes of gradient alignment in ill-conditioned optimization problems.