PhD in Electrical Engineering, Stanford University
MS, BS in Computer Science, Stanford University
[email protected]
GitHub / Google Scholar / Twitter / LinkedIn / Blog
I'm Akshay Agrawal, a researcher and engineer interested in optimization and machine learning. I have a PhD from Stanford University, where I was advised by Stephen Boyd, as well as a BS and MS in computer science from Stanford.
I am the developer of PyMDE, a library for computing embeddings of large datasets that is downloaded thousands of times each month, and I am a core developer and maintainer of CVXPY, a domain-specific language for convex optimization that is used at many universities and companies and downloaded over half a million times each month. Previously, I was a software engineer on the Google Brain team, where I worked on TensorFlow 2.
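For readers unfamiliar with CVXPY, the sketch below shows what a CVXPY program looks like: a small nonnegative least-squares problem. The data and dimensions are made up for illustration, not taken from any real application.

    import cvxpy as cp
    import numpy as np

    # Made-up data for a small nonnegative least-squares problem.
    np.random.seed(0)
    A = np.random.randn(20, 10)
    b = np.random.randn(20)

    # Declare a variable, form the problem, and solve it. CVXPY compiles
    # the high-level description into a form a numerical solver accepts.
    x = cp.Variable(10)
    problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
    problem.solve()
    print(problem.value, x.value)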
Computer science is my first passion; writing is my second. I worked as a writer and investigative news editor for The Stanford Daily, and I blog at debugmind.com.
I enjoy speaking with people working on real problems. If you'd like to chat, don't hesitate to reach out over email.
I have industry experience in designing and building software for machine learning (TensorFlow 2.0), optimizing the scheduling of containers in shared datacenters, motion planning and control for autonomous vehicles, and performance analysis of Google-scale software systems.
From 2017 to 2018, I worked on TensorFlow as an engineer on the Google Brain team. Specifically, I developed a multi-stage programming model that lets users enjoy eager (imperative) execution while giving them the option to optimize blocks of TensorFlow operations via just-in-time compilation.
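In TensorFlow 2, this staging is exposed through tf.function. A minimal sketch of the contrast between eager execution and a staged block:

    import tensorflow as tf

    # Eager execution: this matmul runs immediately, like ordinary Python.
    x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    print(tf.matmul(x, x))

    # tf.function stages the decorated function into a TensorFlow graph;
    # the first call traces it, and later calls reuse the optimized graph.
    @tf.function
    def matmul_twice(a):
        return tf.matmul(tf.matmul(a, a), a)

    print(matmul_twice(x))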
I honed my technical infrastructure skills over the course of four summer internships at Google, where I:
conducted fleet-wide performance analyses of programs running on shared servers and datacenters;
analyzed Dapper traces for the distributed storage stack and uncovered major performance bugs;
built a simulator for solid-state drives and investigated garbage reduction policies;
wrote test suites and tools for the Linux production kernel.
I spent seven quarters as a teaching assistant for the following Stanford courses:
EE 364a: Convex Optimization I. Professor Stephen Boyd. Spring 2016-17, Summer 2018, Summer 2019.
CS 221: Artificial Intelligence, Principles and Techniques. Professor Percy Liang. Autumn 2016-17.
CS 109: Probability for Computer Scientists. Professor Mehran Sahami and Lecturer Chris Piech. Winter 2015-16, Spring 2015-16, Winter 2016-17.
CS 106A: Programming Methodology. Section Leader. Lecturer Keith Schwarz. Winter 2013-14.
Paths to the Future: A Year at Google Brain. January 2020.
A Primer on TensorFlow 2.0. April 2019.
Learning about Learning: Machine Learning and MOOCs. June 2015.
Machines that Learn: Making Distributed Storage Smarter. September 2014.
Separation Theorems. Lecture notes on separation theorems in convex analysis. A. Agrawal. 2019.
A Cutting-Plane, Alternating Projections Algorithm for Conic Optimization Problems. A. Agrawal. 2017. [code]
Cosine Siamese Models for Stance Detection. A. Agrawal, D. Chin, K. Chen. 2017. [code]
Xavier: A Reinforcement-Learning Approach to TCP Congestion Control. A. Agrawal. 2016. [code]
B-CRAM: A Byzantine-Fault-Tolerant Challenge-Response Authentication Mechanism. A. Agrawal, R. Gasparyan, J. Shin. 2015. [code]