I attended the 8th IFAC Workshop on Distributed Estimation and Control in Networked Systems (NecSys) in Chicago on September 16-17, 2019. NecSys is a small international single-track conference that tends to attract optimization/controls/distributed systems researchers. The last time I attended NecSys was in 2012 (in Santa Barbara) while I was still a postdoc, and I have fond memories of it. It was nice for NecSys to be local again this year, as it meant that I could bring most of my research group along!
At the conference, Bryan presented a poster on his recent distributed optimization work (arxiv link). In this work, Bryan showed that near-optimal gradient-based distributed optimization algorithms can be designed by alternating between simple local gradient steps and gossip (averaging) steps. He also showed how to tune the ratio of gradient to gossip steps based on the spectral gap of the adjacency matrix of the underlying communication graph (a measure of how "connected" the graph is) in order to achieve the fastest possible convergence rate.
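To give a flavor of the gradient/gossip idea, here is a minimal sketch (my own toy illustration, not Bryan's actual algorithm or tuning rule). Each node on a small ring graph holds a local quadratic objective, takes a local gradient step, then averages with its neighbors a few times; the graph, weights, step size, and gossip ratio are all chosen for illustration.

```python
import numpy as np

# Illustrative sketch only: nodes on a 5-node ring each hold a local
# quadratic f_i(x) = (x - a_i)^2 / 2, so the minimizer of sum_i f_i
# is the mean of the a_i.
np.random.seed(0)
n = 5
a = np.random.randn(n)        # local data; global optimum is a.mean()

# Doubly stochastic gossip matrix for the ring (uniform 1/3 weights).
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i, i + 1):
        W[i, j % n] = 1 / 3

x = np.zeros(n)               # each node's local estimate
alpha = 0.1                   # step size (illustrative)
gossip_per_grad = 2           # illustrative gossip-to-gradient ratio

for _ in range(200):
    x = x - alpha * (x - a)   # local gradient step on f_i
    for _ in range(gossip_per_grad):
        x = W @ x             # gossip step: average with neighbors

# With a fixed step size, the nodes reach a neighborhood of the
# optimum; their average converges to the true minimizer a.mean().
print(x, a.mean())
```

The better connected the graph (the larger its spectral gap), the fewer gossip steps are needed per gradient step to keep the nodes in near-consensus; tuning that ratio optimally is the subject of the paper.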
On a personal note, I had the opportunity at NecSys to reconnect with my former PhD advisor Sanjay Lall and my former postdoc advisor Ben Recht (first photo). The fact that NecSys was local also led to a little road trip back to Madison with my students Mruganka, Saman, and Akhil! We stopped for dinner in "Little India", a neighborhood in a suburb north of Chicago filled with Indian and Pakistani restaurants and shops. Akhil guided us to some traditional chaat (second photo), and it was delicious!