Implement linear regression + 2 solving methods + regularization. Includes hyperparameter tuning and graphs of speed and accuracy for the various combinations.
implementation
- Matrix inversion (closed form)
- GD (gradient descent); a quick sketch of both is below
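For orientation, a minimal sketch of what the two scalar OLS solvers could look like, assuming NumPy; the function names here are hypothetical, not the repo's actual API:

```python
import numpy as np

def ols_closed_form(X, y):
    # Normal equations: w = (X^T X)^{-1} X^T y, solved without an explicit inverse.
    return np.linalg.solve(X.T @ X, X.T @ y)

def ols_gradient_descent(X, y, lr=0.01, n_iters=1000):
    # Plain gradient descent on the mean squared error 1/(2n) * ||Xw - y||^2.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n  # gradient of the MSE loss
        w -= lr * grad
    return w
```

(`np.linalg.lstsq` would be the more numerically robust route for the analytic solution, but `solve` mirrors the normal-equation math in the list below.)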
math break
- render LaTeX to SVG locally, embed the SVGs in the markdown, and check that it displays (i.e. no rawgit needed)
- section: naive regression to scalar values
- brief intro of variables
- least squares
- loss
- naive least squares loss derivative w/ some algebra, show gradient
- gradient = 0 -> closed form
- ridge
- loss
- derivative
- gradient = 0 -> closed form
- lasso
- loss
- derivative (a subgradient; all three loss/gradient sets are sketched below)
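As a reference for this math section, the standard scalar-target formulas, using the usual notation (X is the n×d design matrix, w the d-vector of weights, y the n-vector of targets); the writeup's own notation may differ:

```latex
% OLS: loss, gradient, closed form
L_{\mathrm{OLS}}(w) = \tfrac{1}{2}\lVert Xw - y \rVert_2^2
\qquad
\nabla L_{\mathrm{OLS}}(w) = X^\top (Xw - y)
\qquad
\nabla L = 0 \;\Rightarrow\; \hat{w} = (X^\top X)^{-1} X^\top y

% Ridge: loss, gradient, closed form
L_{\mathrm{ridge}}(w) = \tfrac{1}{2}\lVert Xw - y \rVert_2^2 + \tfrac{\lambda}{2}\lVert w \rVert_2^2
\qquad
\nabla L_{\mathrm{ridge}}(w) = X^\top (Xw - y) + \lambda w
\qquad
\hat{w} = (X^\top X + \lambda I)^{-1} X^\top y

% Lasso: loss and subgradient (no closed form; the l1 term is non-differentiable at 0)
L_{\mathrm{lasso}}(w) = \tfrac{1}{2}\lVert Xw - y \rVert_2^2 + \lambda \lVert w \rVert_1
\qquad
\partial L_{\mathrm{lasso}}(w) = X^\top (Xw - y) + \lambda\, \partial \lVert w \rVert_1
```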
more implementation
- ridge
- lasso (both sketched below)
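A minimal sketch of the regularized variants, again assuming NumPy and hypothetical names; lasso has no closed form, so it is shown here with subgradient descent:

```python
import numpy as np

def ridge_closed_form(X, y, lam):
    # w = (X^T X + lam*I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def lasso_subgradient_descent(X, y, lam, lr=0.01, n_iters=1000):
    # Subgradient descent on 1/(2n)*||Xw - y||^2 + lam*||w||_1
    # (np.sign(0) == 0 is a valid subgradient of |.| at zero).
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        w -= lr * (X.T @ (X @ w - y) / n + lam * np.sign(w))
    return w
```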
coordinate descent
- math for ols
- implementation for ols
- math for lasso
- implementation for lasso (soft-thresholding sketch below)
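A sketch of coordinate descent with soft-thresholding, assuming NumPy; setting `lam=0` reduces it to the plain OLS coordinate update, and the names are hypothetical:

```python
import numpy as np

def soft_threshold(rho, lam):
    # Soft-thresholding operator: the exact solution of the 1-D lasso problem.
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    # Cyclic coordinate descent on 0.5*||Xw - y||^2 + lam*||w||_1.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_sweeps):
        for j in range(d):
            # residual with feature j's current contribution removed
            r_j = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r_j
            w[j] = soft_threshold(rho, lam) / (X[:, j] @ X[:, j])
    return w
```

For the multi-class variants further down the list, the same update can simply run independently on each column of the one-hot target matrix.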
OK then make actually useful regression to 10 classes
- multi-class OLS analytic (one-hot sketch after this list)
- multi-class OLS GD
- clean up eval
- refactor coordinate descent
- multi-class OLS CD
- multi-class ridge analytic
- multi-class ridge GD
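One possible shape for the multi-class analytic step, assuming one-hot targets and NumPy; the names are hypothetical, not the repo's API:

```python
import numpy as np

def one_hot(labels, n_classes=10):
    # Integer labels -> one-hot rows.
    Y = np.zeros((len(labels), n_classes))
    Y[np.arange(len(labels)), labels] = 1.0
    return Y

def multiclass_ols_closed_form(X, labels, n_classes=10):
    # Each column of W is an independent scalar OLS problem against one column of Y.
    Y = one_hot(labels, n_classes)
    return np.linalg.solve(X.T @ X, X.T @ Y)  # W has shape (d, n_classes)

def predict_classes(X, W):
    # Predicted class = argmax of the regressed scores.
    return np.argmax(X @ W, axis=1)
```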
oh wait, I figured out ridge CD:
- scalar ridge CD
- math for scalar ridge CD (update rule below)
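For reference, the coordinate update that falls out of setting the ridge loss's partial derivative in w_j to zero (same notation as the earlier formulas); it is the OLS coordinate update with lambda added to the denominator:

```latex
% Ridge coordinate update for w_j (set dL/dw_j = 0 and solve):
w_j \;\leftarrow\; \frac{x_j^\top \bigl( y - \sum_{k \ne j} x_k w_k \bigr)}{x_j^\top x_j + \lambda}
```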
ok back to multiclass:
- multi-class ridge CD
- multi-class lasso GD
- multi-class lasso CD
then some tooling and graphs!
- SGD
- Hyperparam tuning framework and graphs (rough SGD + grid-sweep sketch below)
- Graphs of accuracy and speed of the above methods
- math of multiclass (can be super brief)
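One possible shape for the SGD and tuning pieces: minibatch SGD on the multi-class ridge objective plus a brute-force grid sweep that records validation accuracy and wall-clock time. Everything here is a hypothetical sketch, including the function names and which hyperparameters get swept:

```python
import itertools
import time
import numpy as np

def multiclass_ridge_sgd(X, Y, lam, lr=0.01, batch_size=64, n_epochs=20, rng=None):
    # Minibatch SGD on mean squared error + (lam/2)*||W||_F^2, with one-hot targets Y.
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    W = np.zeros((d, Y.shape[1]))
    for _ in range(n_epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, Yb = X[idx], Y[idx]
            grad = Xb.T @ (Xb @ W - Yb) / len(idx) + lam * W
            W -= lr * grad
    return W

def grid_search(X_train, Y_train, X_val, y_val, lams, lrs):
    # Try every (lam, lr) pair; y_val holds integer labels for the accuracy check.
    results = []
    for lam, lr in itertools.product(lams, lrs):
        t0 = time.perf_counter()
        W = multiclass_ridge_sgd(X_train, Y_train, lam, lr=lr)
        elapsed = time.perf_counter() - t0
        acc = np.mean(np.argmax(X_val @ W, axis=1) == y_val)
        results.append({"lam": lam, "lr": lr, "accuracy": acc, "seconds": elapsed})
    return results
```

The `results` list is then the raw material for the accuracy and speed graphs.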