- You teach the model using labeled data (inputs + correct outputs)
- Goal : Predict outputs for new data
- Example:
  - Input = [size of house, number of rooms, location]
  - Output = price of house
- Linear Regression : Prediction
- Logistic Regression : Classification
- Decision Trees : Decision-Making
- Random Forests : Ensemble of decision trees
- SVM : Classification with maximum-margin boundaries
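The supervised idea above can be sketched with a hand-rolled 1-D linear regression; the house data below is hypothetical, and real projects would use a library such as scikit-learn.

```python
# Minimal sketch: least-squares fit of y = a*x + b for 1-D labeled data.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Toy "house size -> price" data (made-up numbers; here price = 3 * size)
sizes = [50, 70, 100, 120]
prices = [150, 210, 300, 360]
a, b = fit_line(sizes, prices)
# the fitted line now predicts prices for unseen sizes: a * size + b
```

Once fitted, predicting for new data is just evaluating the line, which is the "predict outputs for new data" goal stated above.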
- The model finds patterns without labels
- Goal : Understand structure in data
- Example : Customer segmentation = Group similar customers for marketing
- K-Means : Clustering
- PCA : Dimensionality reduction
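The clustering idea can be sketched with a tiny k-means on 1-D points; the data and the naive initialization below are illustrative assumptions.

```python
# Minimal sketch of k-means: alternate assignment and update steps (k = 2).

def kmeans_1d(points, k=2, iters=10):
    centers = points[:k]                      # naive init: first k points
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[i].append(p)
        # update step: move each center to its cluster's mean
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Two obvious groups, around 1 and around 10 (no labels needed)
centers = sorted(kmeans_1d([1.0, 1.2, 0.8, 10.0, 10.5, 9.5]))
```

No labels are used anywhere: the structure (two groups) is discovered from the data alone.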
- A model (agent) learns by trial-and-error, receiving rewards.
- Goal : Maximize cumulative reward over time.
- Example : A robot learns to walk
- Agent : learner/decision maker
- Environment : World the Agent interacts with
- Reward : Feedback for actions
- Policy : Strategy followed by the Agent
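The four components above can be sketched as a tiny trial-and-error loop; the two-action environment (action 1 always pays 1, action 0 pays 0) is a deliberately trivial assumption.

```python
# Minimal sketch of the agent/environment/reward/policy loop.

def reward(action):            # Environment: feedback for each action
    return 1.0 if action == 1 else 0.0

estimates = [0.0, 0.0]         # Agent's value estimate per action
counts = [0, 0]

for step in range(20):
    # Policy: try each action once, then act greedily on the estimates
    if step < 2:
        action = step
    else:
        action = estimates.index(max(estimates))
    r = reward(action)
    counts[action] += 1
    # incremental mean update of the chosen action's estimate
    estimates[action] += (r - estimates[action]) / counts[action]
```

After a couple of exploratory steps the agent keeps picking the rewarding action, which is exactly "maximize cumulative reward over time".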
- Layers of neurons: input -> hidden -> output
- Each neuron computes a weighted sum of its inputs, then applies an activation function
- Example : Predict if an image is of a cat or dog
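The input -> hidden -> output flow can be sketched as one forward pass; the weights below are arbitrary illustrative values, not a trained cat/dog model.

```python
import math

# Minimal sketch of a forward pass with sigmoid activations.

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def forward(inputs, w_hidden, w_out):
    # each hidden neuron: weighted sum of inputs, then activation
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs))) for ws in w_hidden]
    # output neuron: weighted sum of hidden activations, then activation
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

p = forward(inputs=[0.5, 1.0],
            w_hidden=[[0.4, -0.2], [0.3, 0.8]],   # 2 hidden neurons
            w_out=[1.0, -1.0])
# p is a probability-like score, e.g. P(image is a cat)
```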
- Specialized for images
- Detect features like edges, textures, shapes using convolution layers
- Typical Structure : Convolution -> Pooling -> Fully Connected -> Output
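The convolution step itself can be sketched by hand: slide a small kernel over the image and sum the element-wise products. The 3x3 "image" and the crude edge kernel below are made-up examples.

```python
# Minimal sketch of a 2-D convolution (no padding, stride 1).

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
edge_kernel = [[1, -1],
               [1, -1]]        # crude vertical-edge detector
feature_map = conv2d(image, edge_kernel)
```

Each entry of the feature map says how strongly the kernel's pattern (here, a left/right intensity change) appears at that position, which is how convolution layers detect edges and textures.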
- Specifically for sequential data like text, time series
- Maintains memory of previous inputs
- Variants: LSTM, GRU (solve long-term memory problems)
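The "memory of previous inputs" can be sketched with a single scalar RNN cell; the weight values are illustrative assumptions.

```python
import math

# Minimal sketch of an RNN cell: the hidden state carries information
# from earlier time steps forward.

def rnn_step(x, h_prev, w_x=0.5, w_h=0.9, b=0.0):
    # new hidden state mixes the current input with the previous state
    return math.tanh(w_x * x + w_h * h_prev + b)

h = 0.0
for x in [1.0, 0.0, 0.0]:      # input arrives only at the first step
    h = rnn_step(x, h)
# h is still nonzero: the network "remembers" the early input
```

With many steps this memory fades (the long-term dependency problem), which is what LSTM and GRU gates are designed to fix.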
- Modern architecture for NLP
- Uses an attention mechanism -> understands context across the whole sentence
- Example : GPT, BERT
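The attention mechanism can be sketched as scaled dot-product attention over tiny hand-made query/key/value vectors; all numbers are illustrative, not from any real model.

```python
import math

# Minimal sketch of scaled dot-product attention for one query.

def softmax(xs):
    exps = [math.exp(x - max(xs)) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    d = len(query)
    # similarity of the query to every key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # output: weighted average of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

out = attention(query=[1.0, 0.0],
                keys=[[1.0, 0.0], [0.0, 1.0]],
                values=[[10.0, 0.0], [0.0, 10.0]])
# out leans toward the first value, whose key best matches the query
```

Because every position attends to every other, the model sees the whole sentence at once instead of reading it one token at a time like an RNN.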
- Vectors, matrices, dot products
- Used for representing data and weights in neural networks
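The "data and weights" point can be made concrete: a neural-network layer (before activation) is just a matrix-vector product. The toy weights below are purely illustrative.

```python
# Minimal sketch: dot product and matrix-vector product.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    # each output component is one row of weights dotted with the input
    return [dot(row, v) for row in M]

weights = [[1, 2],
           [3, 4]]             # 2x2 weight matrix
x = [1, 1]                     # input vector
y = matvec(weights, x)         # layer output before activation
```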
- Mean, variance, probability distributions
- Used in making predictions and understanding uncertainty
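Mean and variance can be computed directly with the standard library; the sample below is made up.

```python
import statistics

# Minimal sketch: mean locates the data, variance measures its spread
# (larger variance = more uncertainty about a typical value).

data = [2, 4, 4, 4, 5, 5, 7, 9]
mu = statistics.mean(data)          # 5.0
var = statistics.pvariance(data)    # population variance, 4.0
```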
- Derivative -> used in gradient descent to minimize errors
- Partial derivatives -> used to update weights in NNs
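Gradient descent can be sketched on a one-variable function: minimize f(x) = (x - 3)^2, whose derivative is f'(x) = 2(x - 3). The same update rule, applied per weight via partial derivatives, trains neural networks.

```python
# Minimal sketch of gradient descent on f(x) = (x - 3)**2.

def grad(x):
    return 2 * (x - 3)        # derivative of f

x = 0.0
lr = 0.1                      # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)         # step against the gradient
# x has converged close to the minimizer at 3
```

Too large a learning rate overshoots the minimum; too small a one converges slowly, which is why the step size is a key tuning knob.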
- Scale/normalize features
- Encode categorical variables (e.g. Red = 1, Blue = 2)
- Remove duplicates, correct errors
- Handle missing data (fill with mean/median or drop rows)
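The preprocessing steps above can be sketched on one made-up feature column: fill missing values with the mean, min-max scale to [0, 1], and map categories to integers.

```python
import statistics

raw = [10.0, None, 30.0, 20.0, None]

# handle missing data: fill with the mean of the observed values
observed = [v for v in raw if v is not None]
mean = statistics.mean(observed)               # 20.0
filled = [v if v is not None else mean for v in raw]

# scale/normalize: map the smallest value to 0 and the largest to 1
lo, hi = min(filled), max(filled)
scaled = [(v - lo) / (hi - lo) for v in filled]

# encode a categorical variable (e.g. Red = 1, Blue = 2)
encoding = {"Red": 1, "Blue": 2}
codes = [encoding[c] for c in ["Red", "Blue", "Red"]]
```

Note that integer encoding implies an ordering (Blue > Red here); one-hot encoding avoids that when no order exists.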
- Create new meaningful features from existing data
- Example -> Extract day/month/year from a date, compute BMI from height & weight.
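Both feature-engineering examples above can be sketched directly; the date and body measurements are made-up values.

```python
from datetime import date

# Split a date into day/month/year features
d = date(2024, 7, 15)
day, month, year = d.day, d.month, d.year      # 15, 7, 2024

# Derive BMI from height and weight: weight_kg / height_m**2
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

b = round(bmi(weight_kg=70, height_m=1.75), 1)  # 22.9
```

Each new column (day, month, year, BMI) gives a model a pattern it could not easily learn from the raw date string or from height and weight separately.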