For the past year at Harvard Medical School, I have been developing a system whose primary goal is to give patients an intuitive understanding of their own anatomy and of their upcoming surgical procedures.
The system automatically generates 3D models of organs or tumors from CT or MRI scans, then renders them in augmented reality using the Apple Vision Pro.
I lead a team of programmers developing a soft-body physics simulation of gastrulation in the Drosophila fruit fly.
The JavaScript program simulates a 2D cross-section of the embryo, modeling each cell as a rectangle. After tuning parameters such as stiffness and elasticity and applying the appropriate forces, we obtain results that closely match the real-world process.
You can read the source code on GitHub.
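The repository has the full simulation; as a rough sketch of the underlying mass-spring idea, one rectangular cell held together by edge and diagonal springs might look something like the code below. The stiffness and damping values are placeholders, not the tuned parameters from the project.

```js
// Minimal 2D mass-spring sketch: one rectangular "cell" made of four point
// masses connected by edge and diagonal springs. Stiffness/damping values
// here are assumed, not the tuned parameters from the actual project.

const STIFFNESS = 50;   // spring constant (placeholder value)
const DAMPING = 0.98;   // velocity damping per step (placeholder value)
const DT = 0.016;       // time step in seconds

// Four corners of a unit rectangle, each with position and velocity.
const nodes = [
  { x: 0, y: 0, vx: 0, vy: 0 },
  { x: 1, y: 0, vx: 0, vy: 0 },
  { x: 1, y: 1, vx: 0, vy: 0 },
  { x: 0, y: 1, vx: 0, vy: 0 },
];

// Each spring stores the indices of the two nodes it connects and its rest length.
const springs = [
  [0, 1], [1, 2], [2, 3], [3, 0], // edges
  [0, 2], [1, 3],                 // diagonals keep the rectangle from collapsing
].map(([a, b]) => ({
  a, b,
  rest: Math.hypot(nodes[a].x - nodes[b].x, nodes[a].y - nodes[b].y),
}));

function step() {
  // Accumulate spring forces (Hooke's law along each spring's direction).
  const forces = nodes.map(() => ({ fx: 0, fy: 0 }));
  for (const { a, b, rest } of springs) {
    const dx = nodes[b].x - nodes[a].x;
    const dy = nodes[b].y - nodes[a].y;
    const len = Math.hypot(dx, dy);
    const f = STIFFNESS * (len - rest); // positive when stretched: pulls the ends together
    forces[a].fx += (f * dx) / len; forces[a].fy += (f * dy) / len;
    forces[b].fx -= (f * dx) / len; forces[b].fy -= (f * dy) / len;
  }

  // Semi-implicit Euler integration with simple velocity damping.
  for (let i = 0; i < nodes.length; i++) {
    nodes[i].vx = (nodes[i].vx + forces[i].fx * DT) * DAMPING;
    nodes[i].vy = (nodes[i].vy + forces[i].fy * DT) * DAMPING;
    nodes[i].x += nodes[i].vx * DT;
    nodes[i].y += nodes[i].vy * DT;
  }
}

// Nudge one corner and let the springs pull the cell back toward its rest shape.
nodes[2].x += 0.3;
for (let i = 0; i < 200; i++) step();
console.log(nodes.map(n => [n.x.toFixed(2), n.y.toFixed(2)]));
```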
A quick project I coded from scratch in JavaScript (including the 3D objects and vector math) to learn more about how ray tracing works.
The program supports materials with different roughness, color, and emission properties.
See it in action here.
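As an illustration of one building block of such a ray tracer, here is a minimal ray-sphere intersection test paired with a simple material record. The vector helpers and material fields are illustrative; they are not the actual structures used in the project.

```js
// Minimal ray-sphere intersection sketch. The material fields (color,
// roughness, emission) mirror the kinds of properties described above but
// are placeholders, not the project's real data structures.

const sub = (a, b) => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a, b) => a.x * b.x + a.y * b.y + a.z * b.z;

// A sphere with a simple material description.
const sphere = {
  center: { x: 0, y: 0, z: -5 },
  radius: 1,
  material: { color: [1, 0.2, 0.2], roughness: 0.4, emission: 0 },
};

// Returns the distance along the ray to the nearest hit, or null on a miss.
// Solves |origin + t*dir - center|^2 = radius^2, a quadratic in t.
function intersectSphere(origin, dir, { center, radius }) {
  const oc = sub(origin, center);
  const a = dot(dir, dir);
  const b = 2 * dot(oc, dir);
  const c = dot(oc, oc) - radius * radius;
  const disc = b * b - 4 * a * c;
  if (disc < 0) return null;                  // ray misses the sphere
  const t = (-b - Math.sqrt(disc)) / (2 * a); // nearer of the two roots
  return t > 0 ? t : null;                    // ignore hits behind the ray origin
}

// Shoot one ray straight down the -z axis at the sphere.
const t = intersectSphere({ x: 0, y: 0, z: 0 }, { x: 0, y: 0, z: -1 }, sphere);
if (t !== null) {
  // A full renderer would now shade the hit point using the material's
  // color, roughness, and emission; here we just report the hit.
  console.log(`hit at distance ${t}`, sphere.material);
}
```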
Character Recognition with Machine Learning

A custom neural network that recognizes uppercase and lowercase letters as well as digits.
It was a project I worked on with a friend during our freshman year of high school, before LLMs and ChatGPT were a thing.
We both wanted to better understand how neural networks work mathematically, so we decided to build one from scratch using only arrays and floats.
The training was done in Python, and the trained weights were then loaded in JavaScript to power an interactive website.
Try it out here.
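For a sense of what inference from plain arrays looks like on the JavaScript side, here is a minimal forward pass through a fully connected network. The layer sizes and weights below are placeholders, not the values trained in Python for the site.

```js
// Minimal forward pass for a fully connected network built from plain arrays.
// The layer sizes and weight values are made up; they stand in for the real
// (pixel-input -> hidden -> letter/digit classes) network exported from Python.

const sigmoid = (x) => 1 / (1 + Math.exp(-x));

// One layer: weights[i][j] connects input j to output neuron i.
function forwardLayer(input, weights, biases) {
  return weights.map((row, i) =>
    sigmoid(row.reduce((sum, w, j) => sum + w * input[j], biases[i]))
  );
}

// Run the input through every layer in order.
function predict(input, layers) {
  return layers.reduce((acts, layer) => forwardLayer(acts, layer.weights, layer.biases), input);
}

// Tiny 3-2-2 network with placeholder weights.
const layers = [
  { weights: [[0.2, -0.5, 0.1], [0.7, 0.3, -0.4]], biases: [0.0, 0.1] },
  { weights: [[0.5, -0.6], [-0.2, 0.8]], biases: [0.05, -0.05] },
];

console.log(predict([1, 0, 1], layers)); // two output activations between 0 and 1
```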