Xue Chen

December 3, 2019

Thesis Information
Title: Using and Saving Randomness
Adviser: David Zuckerman
Institution: University of Texas at Austin
Graduation Date: April 2018

Contact Information
Email Candidate
Candidate Website
SIGACT Membership No.: 7912706

Candidate Bio:

Xue is a postdoc at Northwestern University. Before that, he obtained his PhD at the University of Texas at Austin, advised by Prof. David Zuckerman. He is broadly interested in randomized algorithms and derandomization. Specific areas include algorithms for big data (sparse recovery and the fast Fourier transform), foundations of machine learning, and pseudorandomness.

Paper 1:

Testing noisy linear functions for sparsity. Xue Chen, Anindya De, and Rocco A. Servedio. In submission, 2019

Link to PDF

Paper 2:

Fourier-sparse interpolation without a frequency gap. Xue Chen, Daniel M. Kane, Eric Price, and Zhao Song. FOCS 2016

Link to PDF

Paper 3:

Active Regression via Linear-Sample Sparsification. Xue Chen and Eric Price. COLT 2019

Link to PDF

Keywords: Algorithms for big data, foundations of machine learning, pseudorandomness, computational complexity

Or Zamir

December 3, 2019

Thesis Information
Title: Faster k-SAT Algorithms
Adviser: Uri Zwick and Haim Kaplan
Institution: Tel Aviv University
Graduation Date: August 2020

Contact Information
Email Candidate
Candidate Website
SIGACT Membership No.: 2142011

Candidate Bio:

“I completed my bachelor’s and master’s degrees in Mathematics and Computer Science at Tel Aviv University while I was a high school student.
After high school, I enlisted in a research unit in the army for three years (as military service is mandatory in Israel).
Simultaneously, I worked on my PhD under the supervision of Uri Zwick and Haim Kaplan.
I am mainly interested in classical graph algorithms, data structures, and graph theory.

I am currently 23 years old and expect to finish my PhD by the end of the current academic year.”

Paper 1:

T. Dueholm Hansen, H. Kaplan, O. Zamir, U. Zwick, Faster k-SAT algorithms using biased-PPSZ, STOC ’19.

Link to PDF

Paper 2:

J. Holm, V. King, M. Thorup, O. Zamir, U. Zwick, Random k-out subgraph leaves only O(n/k) inter-component edges, FOCS ’19.

Link to PDF

Paper 3:

H. Kaplan, O. Zamir, U. Zwick, The amortized cost of finding the minimum, SODA ’15.

Link to PDF

Keywords: algorithms, data structures, graph theory, graph algorithms, combinatorics

Sumegha Garg

December 3, 2019

Thesis Information
Title: Understanding the Limits of Computational Models and Learning Algorithms
Adviser: Mark Braverman
Institution: Princeton University
Graduation Date: June 2020

Contact Information
Email Candidate
Candidate Website
SIGACT Membership No.: 6721778

Candidate Bio:

Sumegha Garg is a graduate student in Computer Science at Princeton University, where she is advised by Professor Mark Braverman. Before coming to Princeton, she received her Bachelor’s degree in Computer Science and Engineering from the Indian Institute of Technology, Delhi. She is broadly interested in theoretical computer science, with a focus on complexity theory and algorithmic fairness. The primary aim of her research has been to study the computational limits of, and the role of randomness in, space-bounded computational models. Another aim has been to understand the sources of unfairness in machine learning algorithms.

Paper 1:

Extractor-based time-space lower bounds for learning, Sumegha Garg, Ran Raz, and Avishay Tal, STOC, 2018

Link to PDF

Paper 2:

Hitting Sets with Near-Optimal Error for Read-Once Branching Programs, Mark Braverman, Gil Cohen, and Sumegha Garg, STOC, 2018

Link to PDF

Paper 3:

Tracking and Improving Information in the Service of Fairness, Sumegha Garg, Michael Kim, and Omer Reingold, EC, 2019

Link to PDF

Keywords: Computational complexity, algorithmic fairness, space-bounded computation, randomness in computing

Sivakanth Gopi

December 3, 2019

Thesis Information
Title: Locality in Coding Theory
Adviser: Zeev Dvir
Institution: Princeton University
Graduation Date: June 2018

Contact Information
Email Candidate
Candidate Website
SIGACT Membership No.: 9185199

Candidate Bio:

Sivakanth Gopi (called “Gopi” by friends and colleagues) is a postdoctoral researcher in the Algorithms group at Microsoft Research Redmond. He received a PhD in Computer Science from Princeton University in 2018, advised by Zeev Dvir; the title of his dissertation was “Locality in Coding Theory”. His main research interests are in coding theory and its connections to pseudorandomness, complexity theory, and cryptography. He is also involved with the DNA Storage project, erasure coding in Azure storage, and Project Laplace (differential privacy) at Microsoft. He enjoys spending his leisure time learning new things, staying fit, exploring nature, and talking to his family.

Paper 1:

2-Server PIR with sub-polynomial communication; Sivakanth Gopi, Zeev Dvir; STOC 2015.

Link to PDF

Paper 2:

Maximally Recoverable LRCs: A field size lower bound and constructions for few heavy parities; Sivakanth Gopi, Venkatesan Guruswami, Sergey Yekhanin; SODA 2019

Link to PDF

Paper 3:

CSPs with Global Modular Constraints: Algorithms and Hardness via Polynomial Representations; Joshua Brakensiek, Sivakanth Gopi, Venkatesan Guruswami; STOC 2019

Link to PDF

Keywords: Coding theory, locally decodable codes, complexity theory, pseudorandomness

Ariel Schvartzman Cohenca

December 3, 2019

Thesis Information
Title: Circumventing Lower Bounds in Mechanism Design
Adviser: S. Matthew Weinberg
Institution: Princeton University
Graduation Date: July 2020

Contact Information
Email Candidate
Candidate Website
SIGACT Membership No.: 0644007

Candidate Bio:

Ariel Schvartzman Cohenca is a PhD candidate at Princeton University advised by S. Matthew Weinberg. Ariel’s work focuses on understanding the trade-off between optimality and simplicity in the design of multi-dimensional auctions. He was awarded the Department of Computer Science’s Graduate Student Teaching Award in 2017 and the School of Engineering and Applied Science’s Award for Excellence in 2018. During the summer of 2018, Ariel was a research intern at Google in Mountain View. He obtained his B.S. in Mathematics with Computer Science from MIT in 2015.

Paper 1:

Approximation Schemes for a Buyer with Independent Items via Symmetries. Pravesh Kothari, Divyarthi Mohan, Ariel Schvartzman, Sahil Singla, S. Matthew Weinberg. FOCS 2019.

Link to PDF

Paper 2:

Smoothed Analysis of Multi-Item Auctions with Correlated Values. Christos-Alexandros Psomas, Ariel Schvartzman, S. Matthew Weinberg. EC 2019.

Link to PDF

Paper 3:

The menu complexity of “one-and-a-half-dimensional” mechanism design. Raghuvansh R. Saxena, Ariel Schvartzman, S. Matthew Weinberg. SODA 2018.

Link to PDF

Keywords: economics and computation, algorithmic game theory

Di Wang

December 3, 2019

Thesis Information
Title: Fundamental Machine Learning Problems in the Differential Privacy Model
Adviser: Jinhui Xu
Institution: State University of New York at Buffalo
Graduation Date: June 2020

Contact Information
Email Candidate
Candidate Website

Candidate Bio:

I am a fifth-year (final-year) PhD student in the Department of Computer Science and Engineering at the State University of New York (SUNY) at Buffalo, under the supervision of Dr. Jinhui Xu. Before that, I received my Master’s degree in Mathematics from the University of Western Ontario in 2015 and my Bachelor’s degree in Mathematics and Applied Mathematics from Shandong University in 2014.

Paper 1:

Facility Location Problem in Differential Privacy Model Revisited. Yunus Esencayi, Marco Gaboardi, Shi Li, and Di Wang. Conference on Neural Information Processing Systems (NIPS/NeurIPS), 2019.

Link to PDF

Paper 2:

Differentially Private Empirical Risk Minimization with Non-convex Loss Functions. Di Wang, Changyou Chen, and Jinhui Xu. 36th International Conference on Machine Learning (ICML 2019).

Link to PDF

Paper 3:

On Sparse Linear Regression in the Local Differential Privacy Model. Di Wang and Jinhui Xu. 36th International Conference on Machine Learning (ICML 2019).

Link to PDF

Keywords: Differential Privacy, Learning Theory

Tal Wagner

December 3, 2019

Thesis Information
Title: Algorithms for Large and High-Dimensional Data: Compression, Sketching and Quantization
Adviser: Piotr Indyk
Institution: MIT
Graduation Date: June 2020

Contact Information
Email Candidate
Candidate Website
SIGACT Membership No.: 8157902

Candidate Bio:

I am a PhD candidate at MIT CSAIL, advised by Piotr Indyk. During my PhD I completed summer internships at Microsoft Research Redmond (2019), the Amazon Core Machine Learning Group (2017), and the VMware Research Group (2015). I received my MSc from the Weizmann Institute, and my BSc in Computer Science and Mathematics from the Technion, Israel.

Paper 1:

Practical Data-Dependent Metric Compression with Provable Guarantees, Piotr Indyk, Ilya Razenshteyn and Tal Wagner, NeurIPS, 2017

Link to PDF

Paper 2:

Approximate Nearest Neighbors in Limited Space, Piotr Indyk and Tal Wagner, COLT, 2018

Link to PDF

Paper 3:

Semi-Supervised Learning on Data Streams via Temporal Label Propagation, Tal Wagner, Sudipto Guha, Shiva Kasiviswanathan and Nina Mishra, ICML, 2018

Link to PDF

Keywords: metric spaces, sketching, quantization, nearest neighbor search, dimension reduction

Aditya Potukuchi

December 3, 2019

Thesis Information
Title: Combinatorial methods in Theoretical Computer Science
Adviser: Swastik Kopparty
Institution: Rutgers University
Graduation Date: May 2020

Contact Information
Email Candidate
Candidate Website

Candidate Bio:

I am a 5th-year graduate student at Rutgers University, advised by Swastik Kopparty. My interests lie broadly in discrete mathematics, probability, combinatorics, and coding theory. Before Rutgers, I received my master’s degree in Computer Science from the Chennai Mathematical Institute.

Paper 1:

A spectral bound on hypergraph discrepancy, Aditya Potukuchi, preprint

Link to PDF

Paper 2:

Improved Inapproximability of rainbow coloring, Per Austrin, Amey Bhangale, and Aditya Potukuchi, SODA, 2020

Link to PDF

Paper 3:

Syndrome decoding of Reed-Muller codes and tensor decomposition over finite fields, Swastik Kopparty and Aditya Potukuchi, SODA, 2018

Link to PDF

Keywords: Discrete mathematics, Probability, Combinatorics, Coding Theory

Tongyang Li

December 3, 2019

Thesis Information
Title: Quantum algorithms for machine learning and optimization
Adviser: Andrew M. Childs
Institution: University of Maryland
Graduation Date: May 2020

Contact Information
Email Candidate
Candidate Website

Candidate Bio:

Tongyang Li is a Ph.D. candidate in the Department of Computer Science, University of Maryland. He received his B.E. from the Institute for Interdisciplinary Information Sciences, Tsinghua University, and his B.S. from the Department of Mathematical Sciences, Tsinghua University, both in 2015; he also received a Master’s degree from the Department of Computer Science, University of Maryland, in 2018. He is a recipient of the IBM Ph.D. Fellowship, the NSF QISE-NET Triplet Award, and the Lanczos Fellowship. His research focuses on designing quantum algorithms for machine learning and optimization.

Paper 1:

Tongyang Li, Shouvanik Chakrabarti, and Xiaodi Wu, Sublinear quantum algorithms for training linear and kernel-based classifiers. Proceedings of the 36th International Conference on Machine Learning (ICML 2019), 3815–3824, 2019.

Link to PDF

Paper 2:

Shouvanik Chakrabarti, Andrew M. Childs, Tongyang Li, and Xiaodi Wu, Quantum algorithms and lower bounds for convex optimization. Contributed talk at the 22nd Annual Conference on Quantum Information Processing (QIP 2019).

Link to PDF

Paper 3:

Fernando G.S.L. Brandão, Amir Kalev, Tongyang Li, Cedric Y.-Y. Lin, Krysta M. Svore, and Xiaodi Wu, Quantum SDP Solvers: Large Speed-ups, Optimality, and Applications to Quantum Learning. Proceedings of the 46th International Colloquium on Automata, Languages and Programming (ICALP 2019), Vol. 132, 27:1–27:14, Leibniz International Proceedings in Informatics; also a contributed talk at the 22nd Annual Conference on Quantum Information Processing (QIP 2019).

Link to PDF

Keywords: Quantum computing, theoretical machine learning, optimization theory

Tanmay Inamdar

December 3, 2019

Thesis Information
Title: Covering and Clustering Problems with Outliers
Adviser: Kasturi Varadarajan
Institution: The University of Iowa
Graduation Date: May 2020

Contact Information
Email Candidate
Candidate Website

Candidate Bio:

Tanmay Inamdar is a fifth year PhD candidate at the University of Iowa, advised by Prof. Kasturi Varadarajan. His research is mainly focused on approximation algorithms. He is also interested in computational geometry and distributed algorithms.

Paper 1:

Inamdar, T. and Varadarajan, K., 2018. On Partial Covering for Geometric Set Systems. In 34th International Symposium on Computational Geometry (SoCG 2018) (Vol. 99, p. 47). Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik.

Link to PDF

Paper 2:

Bandyapadhyay, S., Inamdar, T., Pai, S. and Varadarajan, K., 2019. A Constant Approximation for Colorful k-Center. In 27th Annual European Symposium on Algorithms (ESA 2019). Schloss Dagstuhl-Leibniz-Zentrum fuer Informatik.

Link to PDF

Paper 3:

Inamdar, T., Pai, S. and Pemmaraju, S.V., 2018. Large-Scale Distributed Algorithms for Facility Location with Outliers. In 22nd International Conference on Principles of Distributed Systems (OPODIS 2018). Schloss Dagstuhl-Leibniz-Zentrum fuer Informatik.

Link to PDF

Keywords: Approximation Algorithms, Computational Geometry, Parallel and Distributed Computing
