A project completed in partnership with the City of Madison to investigate the fairness of tax-assessed home values with respect to racial demographics.
- Collaborated remotely with a team of students, managed deadlines, met with stakeholders, and used version control software for datasets and data-analysis code.
- Gained the domain-specific knowledge needed to interpret a large dataset of high-dimensional features.
- Combined publicly available and web-scraped datasets to examine demographic factors.
- Performed feature engineering to create new, useful features for analysis.
- Used a multi-variable regression model to estimate the effect of many factors on the accuracy of assessed values (see the sketch after this list).
- Analyzed and interpreted geospatial data.
- Generated a wide array of intermediary and final visualizations using matplotlib.
- Presented the work to stakeholders and created a written report.
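For illustration, a minimal sketch of the kind of multi-variable regression described above, using statsmodels; the file name and the column names (assessment_ratio, median_income, pct_minority, home_age, sqft) are hypothetical placeholders, not the project's actual features.

```python
# Minimal sketch of a multi-variable regression on combined assessment data.
# All file and column names below are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("assessments.csv")  # hypothetical combined dataset

features = ["median_income", "pct_minority", "home_age", "sqft"]
X = sm.add_constant(df[features])    # add an intercept term
y = df["assessment_ratio"]           # e.g. assessed value / sale price

model = sm.OLS(y, X, missing="drop").fit()
print(model.summary())               # per-feature coefficients and significance
```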
Tools & Technologies Used:
A new image compression format that uses both unsupervised and supervised machine learning techniques to (hopefully) outperform existing compression methods
- Utilized principal component analysis to optimize color information in sub-blocks of the image (see the sketch after this list).
- Developed a novel unsupervised learning technique for learning multiple orthogonal subspaces of high-dimensional data using Gaussian mixture models and expectation-maximization.
- Used cross-validation to tune the hyper-parameters of models used for different stages of image compression.
- Benchmarked compression performance against existing image compression standards such as JPEG and PNG.
- Leveraged NumPy and vectorization techniques to improve the runtime performance of training the compression models.
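For illustration, a minimal sketch of block-wise PCA on a grayscale image using NumPy; the block size and number of retained components are illustrative choices, not the project's actual parameters.

```python
# Minimal sketch of block-wise PCA for image compression.
# Block size and component count are illustrative, not the project's settings.
import numpy as np

def blockwise_pca(img: np.ndarray, block: int = 8, k: int = 8):
    """Split a grayscale image into block x block tiles, project each flattened
    tile onto the top-k principal components, and reconstruct for comparison."""
    h, w = img.shape
    h, w = h - h % block, w - w % block                # crop to whole blocks
    tiles = (img[:h, :w]
             .reshape(h // block, block, w // block, block)
             .swapaxes(1, 2)
             .reshape(-1, block * block)
             .astype(np.float64))
    mean = tiles.mean(axis=0)
    centered = tiles - mean
    # Principal components are the right singular vectors of the centered tiles.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:k]                                # (k, block*block)
    codes = centered @ components.T                    # compressed representation
    recon = codes @ components + mean                  # reconstructed tiles
    recon_img = (recon.reshape(h // block, w // block, block, block)
                      .swapaxes(1, 2)
                      .reshape(h, w))
    return codes, recon_img

if __name__ == "__main__":
    demo = np.random.rand(64, 64)                      # stand-in for a real image
    codes, recon = blockwise_pca(demo)
    print(codes.shape, float(np.mean((demo - recon) ** 2)))
```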
Tools & Technologies Used:
A showcase of metrics tracking Milwaukee's status as a Tech Hub
- Worked directly with stakeholders from multiple organizations to gather, analyze, and display data on a custom-made public portal, managed through a proprietary internal website.
- Organized meetings, managed ongoing and future tasks, and mentored members of a six-person interdisciplinary team.
- Conducted principal development work on the aforementioned web software, including creating a Vue-TS front end, a TS-Node-Express back end, and a SQL Server database.
- Set up continuous deployment using AWS CodePipeline, AWS Elastic Beanstalk, and AWS S3.
- Created over 10 interactive data visualizations using Microsoft Power BI.
Tools & Technologies Used:
My final project for an introductory data science class: an analysis of horror movie data
- Gathered data with a Selenium-built web scraper (see the sketch after this list).
- Used popular Python data analysis and visualization tools.
- Created ten visualizations on different aspects of the data, with a focus on analyzing the relationship between movie ratings, popularity, and profitability.
- Published to a static website.
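For illustration, a minimal sketch of a Selenium-built scraper feeding pandas; the URL and CSS selectors are hypothetical placeholders for whichever movie-listing site was actually scraped.

```python
# Minimal sketch of a Selenium scraper; the URL and selectors are placeholders.
import pandas as pd
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/horror-movies")        # placeholder URL

rows = []
for card in driver.find_elements(By.CSS_SELECTOR, ".movie-card"):
    rows.append({
        "title":  card.find_element(By.CSS_SELECTOR, ".title").text,
        "rating": card.find_element(By.CSS_SELECTOR, ".rating").text,
        "gross":  card.find_element(By.CSS_SELECTOR, ".gross").text,
    })
driver.quit()

pd.DataFrame(rows).to_csv("horror_movies.csv", index=False)
```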
Tools & Technologies Used:
A full-stack NodeJS web app with basic social media functionality
- Implemented a front end with VueJS.
- Implemented a back end with sign-in, sign-up, and content creation in NodeJS with TypeScript and Express.
- Built and deployed the app to Heroku.
- Hosted a SQL Server database in Microsoft Azure.
- Collected user feedback.
Tools & Technologies Used:
A prototype of an advanced study tool for the 2019 innovation competition Transcend Madison
- Won Best Pitch (Early Stage).
- Developed a new system for memorization that enables more granular recall.
- Created a desktop app as a minimum viable product.
- Validated the software in a partnership with a UW professor.
- Pitched the idea to industry veterans.
Tools & Technologies Used:
My final project for AP Calculus and AP CS Principles: a tool for computing and visualizing derivatives and slope fields
- Used a custom expression tree for computation, with no math libraries (see the sketch after this list).
- Rendered slope fields as heatmaps.
- Solved for the derivatives of 1-variable functions.
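For illustration, a Python sketch of the expression-tree idea (the project's original language isn't listed here): each node evaluates itself and builds its own derivative tree, with no math libraries. Only a small, illustrative subset of node types is shown.

```python
# Sketch of symbolic differentiation with a hand-built expression tree.
# Node types here (constant, variable, +, *) are an illustrative subset.
class Const:
    def __init__(self, v): self.v = v
    def eval(self, x): return self.v
    def deriv(self): return Const(0)

class Var:
    def eval(self, x): return x
    def deriv(self): return Const(1)

class Add:
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, x): return self.a.eval(x) + self.b.eval(x)
    def deriv(self): return Add(self.a.deriv(), self.b.deriv())

class Mul:
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, x): return self.a.eval(x) * self.b.eval(x)
    def deriv(self):  # product rule: (ab)' = a'b + ab'
        return Add(Mul(self.a.deriv(), self.b), Mul(self.a, self.b.deriv()))

# f(x) = 3x^2 + 2 as a tree; its derivative 6x evaluated at x = 4 prints 24.0.
f = Add(Mul(Const(3), Mul(Var(), Var())), Const(2))
print(f.deriv().eval(4.0))
```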