Akshay Agrawal

PhD in Electrical Engineering, Stanford University
MS, BS in Computer Science, Stanford University
[email protected]

Github / Google Scholar / Twitter / LinkedIn / Blog

I'm Akshay Agrawal, a researcher and engineer interested in optimization and machine learning. I have a PhD from Stanford University, where I was advised by Stephen Boyd, as well as a BS and MS in computer science from Stanford.

I am the developer of PyMDE, a library for computing embeddings of large datasets with thousands of monthly downloads, and I am a core developer and maintainer of CVXPY, a domain-specific language for convex optimization used by many universities and companies, with over half a million monthly downloads. Previously, I was a software engineer on the Google Brain team, where I worked on TensorFlow 2.

Computer science is my first passion; writing is my second. I worked as a writer and investigative news editor for The Stanford Daily, and I blog at debugmind.com.


I build open source software designed to make data science actionable and accessible. Below are some of my projects.
CVXPY is a parser-compiler for convex optimization that extends the reach of low-level numerical solvers. CVXPY is used by dozens of universities and companies, for problems in energy management, finance, resource allocation, and more.
PyMDE is a GPU-accelerated library for embedding large datasets and laying out graphs, using a single, simple-to-understand framework that unifies many embedding methods developed over the past century. PyMDE has been used to embed real-world data such as single-cell transcriptomes, news documents, graphs, and more.
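The core idea of the framework can be sketched in a few lines of NumPy (this is an illustrative toy, not the PyMDE API): attach a distortion function to each pair of related items, then minimize the total distortion subject to a standardization constraint that keeps the embedding centered and spread out.

```python
import numpy as np

# Toy minimum-distortion embedding: 6 items, similar pairs get a
# quadratic attractive penalty; the standardization constraint
# (centered, identity covariance) prevents the embedding from collapsing.
rng = np.random.default_rng(0)
n, d = 6, 2
edges = [(0, 1), (1, 2), (3, 4), (4, 5)]  # pairs of similar items

def project(X):
    """Project onto centered embeddings with (1/n) X.T @ X = I."""
    X = X - X.mean(axis=0)
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return np.sqrt(n) * U @ Vt

X = project(rng.standard_normal((n, d)))
for _ in range(200):
    grad = np.zeros_like(X)
    for i, j in edges:          # gradient of sum of ||x_i - x_j||^2
        diff = X[i] - X[j]
        grad[i] += 2 * diff
        grad[j] -= 2 * diff
    X = project(X - 0.05 * grad)  # projected gradient step

# After a few hundred steps, similar items sit close together.
```

Swapping in other distortion functions (penalties for dissimilar pairs, robust losses, and so on) recovers many classical embedding methods in this one framework; PyMDE does this at scale on GPUs.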


*denotes alphabetical ordering of authors


Allocation of fungible resources via a fast, scalable price discovery method. [bibtex] [code]
A. Agrawal, S. Boyd, D. Narayanan, F. Kazhamiaka, M. Zaharia. Mathematical Programming Computation.
Embedded code generation with CVXPY. [bibtex] [code]
M. Schaller, G. Banjac, S. Diamond, A. Agrawal, B. Stellato, S. Boyd. Pre-print.


Computing tighter bounds on the n-queens constant via Newton's method. [bibtex] [code]
P. Nobel, A. Agrawal, and S. Boyd. Pre-print.
Minimum-distortion embedding. [bibtex] [slides] [code]
A. Agrawal, A. Ali, and S. Boyd. Foundations and Trends in Machine Learning.
Constant function market makers: Multi-asset trades via convex optimization. [bibtex]
G. Angeris, A. Agrawal, A. Evans, T. Chitra, and S. Boyd. Pre-print.


Learning convex optimization models. [bibtex] [code]
A. Agrawal, S. Barratt, and S. Boyd.* IEEE/CAA Journal of Automatica Sinica.
Differentiating through log-log convex programs. [bibtex] [poster] [code]
A. Agrawal and S. Boyd. Pre-print.
Learning convex optimization control policies. [bibtex] [code]
A. Agrawal, S. Barratt, S. Boyd, and B. Stellato.* Learning for Dynamics and Control (L4DC), oral presentation.
Disciplined quasiconvex programming. [bibtex] [code]
A. Agrawal and S. Boyd. Optimization Letters.


Differentiable convex optimization layers. [bibtex] [code] [blog post]
A. Agrawal, B. Amos, S. Barratt, S. Boyd, S. Diamond, and J. Z. Kolter.* In Advances in Neural Information Processing Systems (NeurIPS).
Presented at the TensorFlow Developer Summit 2020, Sunnyvale [slides] [video]
Differentiating through a cone program. [bibtex] [code]
A. Agrawal, S. Barratt, S. Boyd, E. Busseti, W. Moursi.* Journal of Applied and Numerical Optimization.
TensorFlow Eager: A multi-stage, Python-embedded DSL for machine learning. [bibtex] [slides] [blog post] [code]
A. Agrawal, A. N. Modi, A. Passos, A. Lavoie, A. Agarwal, A. Shankar, I. Ganichev, J. Levenberg, M. Hong, R. Monga, and S. Cai.* Systems for Machine Learning (SysML).
Disciplined geometric programming. [bibtex] [tutorial] [poster] [code]
A. Agrawal, S. Diamond, and S. Boyd. Optimization Letters.
Presented at ICCOPT 2019, Berlin [slides]


A rewriting system for convex optimization problems. [bibtex] [slides] [code]
A. Agrawal, R. Verschueren, S. Diamond, and S. Boyd. Journal of Control and Decision.


YouEDU: Addressing confusion in MOOC discussion forums by recommending instructional video clips. [bibtex] [dataset] [code]
A. Agrawal, J. Venkatraman, S. Leonard, and A. Paepcke. Educational Data Mining.
Presented at EDM 2015, Madrid [slides]


I enjoy speaking with people working on real problems. If you'd like to chat, don't hesitate to reach out over email.

I have industry experience in designing and building software for machine learning (TensorFlow 2.0), optimizing the scheduling of containers in shared datacenters, motion planning and control for autonomous vehicles, and performance analysis of Google-scale software systems.

From 2017 to 2018, I worked on TensorFlow as an engineer on the Google Brain team. Specifically, I developed a multi-stage programming model that lets users enjoy eager (imperative) execution while providing them the option to optimize blocks of TensorFlow operations via just-in-time compilation.
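The staging idea can be illustrated with a toy, pure-Python sketch (this is not TensorFlow's implementation, just the underlying pattern): a decorated function runs once on a symbolic value, its operations are recorded into a small graph via operator overloading, and later calls replay the graph without re-executing the Python body.

```python
# Toy multi-stage sketch: trace a function into a graph of ops, then
# execute the graph on concrete inputs.

class Node:
    """A symbolic value; arithmetic on it records graph nodes."""
    def __init__(self, op, inputs=()):
        self.op, self.inputs = op, inputs
    def __add__(self, other):
        return Node("add", (self, _wrap(other)))
    def __mul__(self, other):
        return Node("mul", (self, _wrap(other)))
    __radd__ = __add__
    __rmul__ = __mul__

def _wrap(value):
    return value if isinstance(value, Node) else Node(("const", value))

def evaluate(node, x):
    """Interpret the recorded graph on a concrete input x."""
    if node.op == "input":
        return x
    if isinstance(node.op, tuple):          # ("const", value)
        return node.op[1]
    a, b = (evaluate(n, x) for n in node.inputs)
    return a + b if node.op == "add" else a * b

def stage(fn):
    graph = fn(Node("input"))               # trace: Python body runs once
    return lambda x: evaluate(graph, x)     # replay: graph only

@stage
def poly(x):
    return 3 * x * x + 2 * x + 1

# poly(2.0) replays the graph: 3*4 + 4 + 1 = 17.0
```

In TensorFlow 2, the same pattern lets the traced graph be optimized and compiled, so users keep the ergonomics of eager execution while recovering graph-level performance where it matters.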

I honed my technical infrastructure skills over the course of four summer internships at Google, where I:


I spent seven quarters as a teaching assistant for the following Stanford courses:


Technical Reports