Google, Mountain View CA December 2016 — April 2022
Gboard — Federated Analytics and Machine Learning July 2019 — April 2022
I worked on a modeling team for Gboard and partnered with international teams on private federated learning. We identified "private heavy hitters," the most frequent items in a dataset, without centralized logging of what users type on their keyboards. With those heavy hitters, we trained models to power new experiences and improved existing models to adapt to changing user needs.
- Designed, implemented and shipped our team's first on-device personalized models.
- Shipped two iterations of sticker pack recommendations using collaborative filtering, one using item-item collaborative filtering based on privately aggregated item co-share data, and another using user-item embeddings.
- Impact: Significantly increased engagement with recommendations and established several team processes (e.g., diagnosing data corruption in federated analytics tasks, federated model evaluation).
- Developed new federated tasks to securely aggregate heavy-hitter data in anonymized batches, using Invertible Bloom Lookup Tables (IBLT), secure aggregation protocols, and differential privacy.
- Used heavy-hitter data to improve the quality of typing language models, emoji search, and emoji suggestions.
- Optimized federated models by creating custom TensorFlow ops.
- Identified, diagnosed, and resolved barriers to private heavy-hitter tasks outside my direct team.
- Interviewed multiple software engineering candidates, created documentation to unblock team members, and presented internal tech talks on collaborative filtering.
- Skill highlights: Python, TensorFlow, TensorFlow Federated, Java, C++, IBLTs, Flume, Apache Beam, data pipelines, differential privacy, secure aggregation, collaborative filtering
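The item-item collaborative filtering approach above can be sketched as follows. Everything here is illustrative: the pack names, co-share counts, and cosine scoring are assumptions for the sketch, not the shipped recommendation pipeline.

```python
from collections import defaultdict
from math import sqrt

# Hypothetical privately aggregated co-share counts:
# coshare[(a, b)] = times sticker packs a and b were shared by the
# same user (symmetric; pair keys stored sorted).
coshare = {
    ("cats", "dogs"): 40,
    ("cats", "memes"): 10,
    ("dogs", "memes"): 30,
}
# Per-item share totals (the diagonal of the co-occurrence matrix).
shares = {"cats": 100, "dogs": 80, "memes": 50}

def cosine(a, b):
    """Item-item cosine similarity from co-occurrence counts."""
    key = tuple(sorted((a, b)))
    return coshare.get(key, 0) / sqrt(shares[a] * shares[b])

def recommend(owned, k=2):
    """Score unowned packs by summed similarity to owned packs."""
    scores = defaultdict(float)
    for candidate in shares:
        if candidate in owned:
            continue
        for item in owned:
            scores[candidate] += cosine(item, candidate)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

With these toy counts, `recommend({"cats"})` ranks "dogs" above "memes" because the cats/dogs co-share count is higher relative to each pack's total shares.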
Google Search — Search Frontend December 2016 — July 2019
Google.com Search consists of multiple teams with different search verticals. On a horizontal infrastructure team, I optimized a slice of the stack both for developer velocity and for the billions of search result pages served to end users every day.
- Analyzed metrics and deployed experiments to identify and implement optimizations in the JavaScript and rendering for mobile web search. These optimizations improved the speed of the search experience for billions of users.
- Wrote efficient JavaScript libraries and polyfills for the Google search results page used by virtually every mobile query. Supported multiple accessibility modalities and browsers as old as iOS 8 Safari.
- Increased search developer velocity by developing easy-to-use APIs around async requests, without sacrificing performance for the end user. These libraries power the async experience of several search verticals, such as Jobs Search.
- Presented and coordinated the frontend portions of Search Features Bootcamp, a semi-annual course on developing for Google Search, three times. I developed and tested course materials in dry runs with trial participants, ran the presentations and labs, and served as a point of contact for feature developers with follow-up questions.
- Supported a cross-platform component library for Google Search, which served mobile web surfaces and native mobile views in the Google Assistant on both Android and iOS.
- Planned, pitched, and managed two STEP internship projects around client-side rendering on the search results page, leading to full-time hires for both interns.
- Skill highlights: JavaScript, TypeScript, HTML/CSS, Java, Python, Objective-C, Protobuf, Blaze/Bazel, Mentorship
Apple, Cupertino CA May 2013 — December 2016
Fullstack engineering for the Apple Instructional Design department. This ranged from internal tooling for authors and localizers, to creating new user-facing instructional experiences ahead of product launches.
- Created a custom CMS solution to manage authoring, localization and review of content for the Tips app.
- Translated designs into localizable HTML/CSS/JavaScript for "Quick Tours," interactive introductions to macOS and first-party apps with over 20 supported languages (e.g., help.apple.com/osx-mavericks/whats-new).
- Led an exploratory project for a help chatbot that recommended relevant support content.
- Skill highlights: JavaScript, CSS, Ember.js, Node.js, Objective-C, Postgres, Web Localization
Education
Rochester Institute of Technology September 2009 — May 2013
Earned a Bachelor of Science in Computer Science, with minors in Mathematics and Economics.
Stanford Online, Coursera April 2016
Earned a Certificate from Andrew Ng's Machine Learning course on Coursera.
Other Projects
Blaze/Bazel Build Notifying Widget 2017
When I joined Google Search, I found that I often missed when the build was done, especially if it had failed early. A cold build could take 30-45 minutes even on my workstation with 12 cores and 128 GB of RAM, which meant context switching whenever Blaze's cache was too old. So I built this LED notification widget to signal with attention-grabbing lights when the build had finished. I provided parts to my teammates and taught a quick class on assembling and programming the microcontroller as a Blaze status indicator.
The widget connects to the host machine over USB and was mounted somewhere noticeable in my office. I wrote a small Python daemon for the build machine that polls my Blaze server for status updates. When it detects a change, it pushes the updated status over USB to a Teensy microcontroller, which controls a ring of ws2812b LEDs, pulsing all red for failure or all green for success. The walls and back of the enclosure are laser-cut wood, and the front is white acrylic to diffuse the LEDs inside. The back wood panel is held on by a single screw that, when loosened, lets the panel swivel open for servicing.
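The poll-and-push daemon can be sketched as below. The frame layout, color values, and helper names are assumptions for illustration; the real daemon's Blaze query and serial protocol are not shown here.

```python
import time

# Status -> LED color for the ws2812b ring (illustrative values).
STATUS_COLORS = {
    "running": (0, 0, 255),   # blue while building
    "success": (0, 255, 0),   # solid green
    "failure": (255, 0, 0),   # solid red
}

def encode_frame(status):
    """Pack a status into a 4-byte frame for the Teensy:
    a start byte followed by R, G, B."""
    r, g, b = STATUS_COLORS.get(status, (255, 255, 0))  # yellow = unknown
    return bytes([0x7E, r, g, b])

def poll_loop(get_status, send, interval=5.0, iterations=None):
    """Poll a status source and push a frame only on change.
    `get_status` and `send` stand in for the Blaze status query and
    the USB-serial write in the real daemon."""
    last = None
    n = 0
    while iterations is None or n < iterations:
        status = get_status()
        if status != last:
            send(encode_frame(status))
            last = status
        n += 1
        if iterations is None or n < iterations:
            time.sleep(interval)
```

In real use, `send` could be the `write` method of a pyserial `Serial` port opened on the Teensy's USB device; injecting it as a callable also makes the loop testable without hardware.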
Kinect the Dots 2015
Spearheaded an electronic art installation called 'Kinect the Dots', a large-scale grid of approximately 1,900 LEDs that displays RGB silhouettes of nearby people in real time. It has been showcased at the Bay Area Maker Faire (two Editor's Choice awards), Burning Man, and Santa Cruz Glow.
C++ code on a MacBook interprets Kinect point-cloud data, finds people in the scene, and pushes display data to a C program on a Teensy microcontroller. The Teensy drives the RGB LEDs as fast as it receives frames from the MacBook. The LEDs sit behind holes in mirrored acrylic, and in front of that mirror is a two-way-mirrored acrylic sheet, which bounces some of the light back into the mirrored acrylic, creating an "infinity mirror" effect.
I worked on every layer of the stack: fleshing out the artistic vision, architecting the components, sourcing materials, calculating power requirements, machining, assembly, wiring, writing the microcontroller code, developing a wire format over USB, and writing the C++ code that used PrimeSense's pose-detection models to find people in the scene.
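Driving a grid of LEDs as one strip requires mapping grid coordinates to strip indices. The sketch below assumes a serpentine wiring layout (even rows left-to-right, odd rows right-to-left), which is a common LED-grid convention; the installation's actual wiring and frame format are not documented here.

```python
def led_index(x, y, width):
    """Map grid (x, y) to a serpentine strip index: even rows run
    left-to-right, odd rows right-to-left (assumed layout)."""
    return y * width + (x if y % 2 == 0 else width - 1 - x)

def frame_bytes(pixels, width, height):
    """Flatten a sparse RGB frame {(x, y): (r, g, b)} into the byte
    order the strip expects, leaving unset pixels black."""
    out = bytearray(width * height * 3)
    for (x, y), (r, g, b) in pixels.items():
        i = led_index(x, y, width) * 3
        out[i:i + 3] = bytes((r, g, b))
    return bytes(out)
```

Precomputing this mapping once (rather than per frame) is the usual optimization when the host must keep up with a real-time silhouette feed.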