Akshay Agrawal


Ph.D. Candidate in Electrical Engineering, Stanford University
M.S., B.S. in Computer Science, Stanford University
[email protected]

GitHub / Google Scholar / Twitter / LinkedIn / Blog


I'm Akshay Agrawal, a third-year Ph.D. student at Stanford University, advised by Professor Stephen Boyd.

My research is in optimization and machine learning, and I'm passionate about building tools for both. I developed PyMDE, a library for computing embeddings of large datasets, and I am a core developer of CVXPY, a domain-specific language for convex optimization used by many universities and companies. Previously, I was a full-time software engineer on the Google Brain team, where I worked on TensorFlow 2 and the core TensorFlow runtime.

Computer science is my first passion; writing is my second. I worked as a writer and investigative news editor for The Stanford Daily, and I blog at debugmind.com.

I graduated from Stanford in 2017 with a B.S. and M.S. in computer science and a minor in mathematics. I am currently supported by a Stanford Graduate Fellowship.


*denotes alphabetical ordering of authors


Constant Function Market Makers: Multi-Asset Trades via Convex Optimization. [bibtex]
G. Angeris, A. Agrawal, A. Evans, T. Chitra, and S. Boyd. Pre-print.
Allocation of Fungible Resources via a Fast, Scalable Price Discovery Method. [bibtex] [code]
A. Agrawal, S. Boyd, D. Narayanan, F. Kazhamiaka, and M. Zaharia. Pre-print.
Minimum-Distortion Embedding. [bibtex] [code]
A. Agrawal, A. Ali, and S. Boyd. Foundations and Trends in Machine Learning (to appear).


Learning convex optimization models. [bibtex] [code]
A. Agrawal, S. Barratt, and S. Boyd.* IEEE/CAA Journal of Automatica Sinica.
Differentiating through log-log convex programs. [bibtex] [poster] [code]
A. Agrawal and S. Boyd. Pre-print.
Learning convex optimization control policies. [bibtex] [code]
A. Agrawal, S. Barratt, S. Boyd, and B. Stellato.* Learning for Dynamics and Control (L4DC), oral presentation.
Disciplined quasiconvex programming. [bibtex] [code]
A. Agrawal and S. Boyd. Optimization Letters.


Differentiable convex optimization layers. [bibtex] [code] [blog post]
A. Agrawal, B. Amos, S. Barratt, S. Boyd, S. Diamond, and J. Z. Kolter.* In Advances in Neural Information Processing Systems (NeurIPS).
Presented at the TensorFlow Developer Summit 2020, Sunnyvale [slides] [video]
Differentiating through a cone program. [bibtex] [code]
A. Agrawal, S. Barratt, S. Boyd, E. Busseti, and W. Moursi.* Journal of Applied and Numerical Optimization.
TensorFlow Eager: A multi-stage, Python-embedded DSL for machine learning. [bibtex] [slides] [blog post] [code]
A. Agrawal, A. N. Modi, A. Passos, A. Lavoie, A. Agarwal, A. Shankar, I. Ganichev, J. Levenberg, M. Hong, R. Monga, and S. Cai.* Systems for Machine Learning (SysML).
Disciplined geometric programming. [bibtex] [tutorial] [poster] [code]
A. Agrawal, S. Diamond, and S. Boyd. Optimization Letters.
Presented at ICCOPT 2019, Berlin [slides]


A rewriting system for convex optimization problems. [bibtex] [slides] [code]
A. Agrawal, R. Verschueren, S. Diamond, and S. Boyd. Journal of Control and Decision.


YouEDU: Addressing confusion in MOOC discussion forums by recommending instructional video clips. [bibtex] [dataset] [code]
A. Agrawal, J. Venkatraman, S. Leonard, and A. Paepcke. Educational Data Mining.
Presented at EDM 2015, Madrid [slides]


I enjoy speaking with people working on real problems. If you'd like to chat, don't hesitate to reach out over email.

I have industry experience in designing and building software for machine learning (TensorFlow 2.0), optimizing the scheduling of containers in shared datacenters, motion planning and control for autonomous vehicles, and performance analysis of Google-scale software systems.

From 2017 to 2018, I worked on TensorFlow as an engineer on the Google Brain team. Specifically, I developed a multi-stage programming model that lets users enjoy eager (imperative) execution while providing them the option to optimize blocks of TensorFlow operations via just-in-time compilation.
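A minimal sketch of that multi-stage model, using TensorFlow's public tf.function API (the computation staged here is illustrative):

```python
import tensorflow as tf

# Eager execution: operations run immediately, like ordinary Python.
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
print(tf.reduce_sum(x))  # runs as soon as it is called

# Staging: tf.function traces the Python function into a dataflow
# graph, which TensorFlow can then optimize and compile just-in-time.
@tf.function
def matmul_twice(a):
    return tf.matmul(a, tf.matmul(a, a))

print(matmul_twice(x))  # first call traces; later calls reuse the graph
```

Code inside the decorated function still reads like imperative Python, but executes as an optimized graph.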

I honed my technical infrastructure skills over the course of four summer internships at Google, where I:


I spent seven quarters as a teaching assistant for the following Stanford courses:

