Welcome to the Q4C (QForce) Series

Queens For Computing
Queens College CUNY Computer Science Colloquium


This colloquium is intended to bring together Computer Science and Data Science researchers in the tri-state area (especially in NYC) and to foster collaboration. We welcome talks on any topic of interest to the CS community, including theory, algorithms, machine learning, and data science. If you are interested in attending in person or online, or would like to give a talk, please contact the organizers.


  1. Wednesday, 09/11/2024, 12:15PM - 1:30PM
    Room: Science Building, C203
    Speaker: Tim Mitchell, Department of Computer Science, Queens College of CUNY

    Title: Computing the Kreiss Constant of a Matrix

    Abstract: As famously introduced by H.-O. Kreiss about six decades ago, what is now called the Kreiss constant of a square matrix bounds the magnitude of the transient behavior of the associated system of ordinary differential (or difference) equations. Although asymptotically stable systems must eventually decay to a stable state, they may nevertheless incur surprisingly large transients before settling down, and such behavior can result in consequences ranging from undesirable to catastrophic. For example, large transients are innately power inefficient and may pose comfort or safety concerns, e.g., in transportation applications. Moreover, if the underlying physical system is nonlinear, large transient behavior in a stable linear model of that system may end up feeding nonlinear instabilities, which in turn could destabilize the actual physical process!

    Although Kreiss constants theoretically allow us to robustly predict how severely a linear dynamical system will undergo transient behavior, until recently no methods existed to actually compute them. Instead, Kreiss constants have historically only been approximated using crude and ad hoc techniques that may not provide even a single digit of accuracy. A sticking point for better methods has been that the value of a Kreiss constant is formulated via certain global continuous optimization problems that are often nonconvex with multiple local optimizers, and general nonconvex optimization is NP-hard. However, by taking advantage of special structure specific to our problem of interest, we present the very first algorithms to compute Kreiss constants to arbitrary accuracy. In fact, using two very different techniques, we give two different classes of algorithms to reliably compute Kreiss constants.
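    For context, the continuous-time Kreiss constant can be written as K(A) = sup over Re(z) > 0 of Re(z) * ||(zI - A)^{-1}||, and K(A) >= 1 always, with K(A) much larger than 1 signaling large transients. The sketch below is purely illustrative (it is not one of the speaker's algorithms; the function name and grid are our own choices): it computes a lower bound on K(A) by naive grid sampling, precisely the kind of crude approximation the abstract says cannot guarantee accuracy.

```python
import numpy as np

# Illustrative only: a crude lower bound on the continuous-time Kreiss
# constant K(A) = sup_{Re(z)>0} Re(z) * ||(zI - A)^{-1}||_2, obtained by
# sampling the right half-plane on a finite grid. This is the kind of
# ad hoc estimate the abstract contrasts with the new algorithms.
def kreiss_lower_bound(A, re_vals, im_vals):
    n = A.shape[0]
    I = np.eye(n)
    best = 1.0  # K(A) >= 1 holds for every matrix
    for x in re_vals:
        for y in im_vals:
            z = complex(x, y)
            norm = np.linalg.norm(np.linalg.inv(z * I - A), 2)
            best = max(best, x * norm)
    return best

# A stable but highly nonnormal matrix: both eigenvalues are -1, yet
# ||e^{tA}|| grows large before eventually decaying.
A = np.array([[-1.0, 100.0], [0.0, -1.0]])
K = kreiss_lower_bound(A, np.linspace(0.01, 5, 60), np.linspace(-5, 5, 60))
print(K)  # substantially greater than 1, signaling large transients
```

    Because the grid is finite, this only ever gives a lower bound, and refining the grid can be arbitrarily expensive; the talk's algorithms instead compute K(A) to arbitrary accuracy.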

  2. Monday, 09/16/2024, 12:15PM - 1:30PM
    Room: Science Building, C205
    Speaker: Jun Li, Department of Computer Science, Queens College of CUNY

    Title: Coding Techniques to Mitigate Stragglers in Large-Scale Distributed Computation

    Abstract: Large-scale computation distributed across servers is often slowed down by stragglers—servers that delay or fail to complete tasks on time. In this talk, I will present two recent advancements in coding schemes to mitigate the impact of stragglers in distributed computing, focusing on matrix multiplication and gradient-based optimization. First, I will introduce a coding scheme for matrix multiplication that leverages subtask execution order to improve efficiency and recoverability. By probabilistically recovering incomplete tasks through coded ones, our method reduces encoding and decoding complexity while maintaining high recoverability. Next, I will present ignore-straggler gradient coding (IS-GC), which tolerates an arbitrary number of stragglers in distributed machine learning. Using a graph-based decoding model, IS-GC maximizes gradient recovery and reduces training time.
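    As background on coded computation generally (a minimal sketch only, not the schemes in the talk; the (3, 2) parity construction and worker names are illustrative), the following shows how a single extra coded task lets a matrix-vector product survive one straggler:

```python
import numpy as np

# Minimal sketch of coded matrix-vector multiplication: split A into two
# row blocks, add one parity block, and recover the full product from
# any two of the three workers' results.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

A1, A2 = A[:2], A[2:]
tasks = {"w1": A1, "w2": A2, "w3": A1 + A2}  # three workers, one coded task

# Suppose worker 2 straggles: the other two results still suffice.
r1 = tasks["w1"] @ x
r3 = tasks["w3"] @ x
r2_recovered = r3 - r1           # A2 @ x = (A1 + A2) @ x - A1 @ x

full = np.concatenate([r1, r2_recovered])
assert np.allclose(full, A @ x)  # matches the uncoded product
```

    The cost of this robustness is redundant work (three tasks instead of two); schemes like those in the talk aim to keep that overhead, and the encoding/decoding complexity, low.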

  3. Monday, 10/07/2024, 12:15PM - 1:30PM
    Room: Science Building, C201
    Speaker: Teal Witter, NYU

    Title:

    Abstract:

  4. Monday, 10/21/2024, 12:15PM - 1:30PM
    Room: Science Building, C201
    Speaker: Benjamin Eysenbach, Princeton University

    Title:

    Abstract:

The seminar is organized by Jun Li.
Email Contact: jun.li@qc.cuny.edu