Hsien-Chih Chang Presents Two Papers at Top-Tier Theory Conference

Assistant professor Hsien-Chih Chang's research focuses on the shapes behind computing: not just explicit, visible shapes like 2D meshes, trajectories, and networks, but also abstract shapes that can emerge from seemingly unrelated problems, such as string alignments and wireless transmissions. His goal is to uncover hidden connections and use tools from abstract mathematics to establish provable guarantees on the efficiency of computation.

One common theme of his field is to identify distance-like structures (called metrics), study their structural properties, and find applications in algorithm design. With the growth of big data, it is often too expensive to scan the entire input again and again to answer a stream of distance queries. Instead, researchers construct a sketch of the input network in such a way that all the distance information is roughly preserved; the sketch can be much smaller than the original network and can therefore be kept in memory for future requests.
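
To make the idea concrete, here is a minimal, self-contained illustration of the general technique: a classic landmark-based sketch, not the construction from Chang's paper, with function names that are ours for illustration only. Exact distances are precomputed from a handful of landmark vertices, and every later query is answered from that much smaller table via the triangle inequality.

    from collections import deque

    def bfs_distances(graph, source):
        # Exact hop distances from `source` in an unweighted network,
        # given as an adjacency dictionary.
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def build_sketch(graph, landmarks):
        # Store only len(landmarks) * len(graph) distances
        # instead of one entry per pair of vertices.
        return {mark: bfs_distances(graph, mark) for mark in landmarks}

    def query(sketch, u, v):
        # Upper bound on dist(u, v) through the best landmark,
        # valid by the triangle inequality.
        return min(d[u] + d[v] for d in sketch.values())

    network = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}  # a small path
    sketch = build_sketch(network, landmarks=[2])
    print(query(sketch, 0, 4))  # 4, which happens to be exact here

With few landmarks the answers are only approximate in general; the research challenge, and the subject of the work below, is to guarantee small error while keeping the sketch small.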

In recent work to be presented at the 54th ACM Symposium on Theory of Computing (STOC 2022), Chang found a way to construct a sketch whose size shrinks drastically when the input network is road-like, meaning it can be drawn with few crossings between different parts of the network. Furthermore, the sketch remains road-like and can be constructed efficiently.

Chang's second paper, also to be presented at STOC 2022, gives an extremely fast algorithm for a classical problem called bipartite matching. The problem considers a graph with two sets of vertices in which every edge goes between the two sets. A matching is a set of edges such that each vertex has at most one edge incident to it.
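
For readers who want to see the problem in code, the following is a minimal implementation of the textbook augmenting-path method for maximum bipartite matching. This is the classical exact algorithm, not the new algorithm from the paper, and the function names are ours for illustration.

    def maximum_bipartite_matching(left, adj):
        # adj[u] lists the right-side vertices adjacent to left-side vertex u.
        match_right = {}  # right-side vertex -> its matched left-side partner

        def try_augment(u, visited):
            for v in adj[u]:
                if v in visited:
                    continue
                visited.add(v)
                # Use v if it is free, or if its current partner
                # can be rerouted to some other right-side vertex.
                if v not in match_right or try_augment(match_right[v], visited):
                    match_right[v] = u
                    return True
            return False

        return sum(try_augment(u, set()) for u in left)

    # Workers {0, 1, 2} and jobs {'a', 'b'}; each worker and job is used at most once.
    adj = {0: ['a'], 1: ['a', 'b'], 2: ['b']}
    print(maximum_bipartite_matching([0, 1, 2], adj))  # 2: worker 0 gets 'a', worker 1 gets 'b'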

Traditionally, the problem has been solved using a technique called network flows. Unfortunately, flow-based algorithms, while running in polynomial time, are too slow for the size of modern datasets. However, if the solution is only required to be close to optimal, there are several methods that yield faster algorithms. All of these methods require some source of randomness; in other words, the best guarantee from previously proposed algorithms is that they run efficiently and return a good approximation to the optimum most of the time.

Chang and his co-authors have developed a method that returns a good approximation 100% of the time, with no loss in efficiency, when the input is low-dimensional (only a constant number of values is needed to record each input point). This is the first algorithm of its kind, and they expect the method to apply to many other optimization problems in low-dimensional spaces.