C++ Neural Networks from Scratch – Pt. 2: Building an MLP
Building a trainable multilayer perceptron in pure C++.