Classification vs Regression
Classification predicts categories (spam/not spam). Regression predicts numbers (house price = $450,000). Same supervised framework — learn from labeled examples — but the output is continuous rather than discrete.
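To make the contrast concrete, here is a minimal sketch with invented toy values: two labeled datasets in the same supervised shape (features paired with a label), differing only in the type of the target.

```python
# Hypothetical toy data, invented for illustration.
# Classification: the label is a category (a string).
classification_data = [
    ({"contains_link": 1, "all_caps": 1}, "spam"),
    ({"contains_link": 0, "all_caps": 0}, "not spam"),
]

# Regression: the label is a number (a float).
regression_data = [
    ({"sqft": 1500, "bedrooms": 3}, 450_000.0),
    ({"sqft": 900, "bedrooms": 2}, 280_000.0),
]

# Same framework: every example pairs features with a label.
print(type(classification_data[0][1]).__name__)  # str   (discrete)
print(type(regression_data[0][1]).__name__)      # float (continuous)
```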
Real-World Examples
House price prediction: Features = sq ft, bedrooms, location. Output = price.
Stock forecasting: Features = historical prices, volume. Output = future price.
Weather prediction: Features = temperature, pressure, humidity. Output = tomorrow’s temperature.
Ad revenue estimation: Features = clicks, impressions. Output = revenue.
# Linear regression — the simplest model
y = w⋅x + b
# Example: predict house price
price = 200 × sqft + 50000 × bedrooms + 30000 × garage - 10000
# The model learns weights (w) and bias (b)
# by minimizing prediction error on
# training data (labeled examples).
# Loss function: Mean Squared Error
MSE = (1/n) × ∑(y_pred - y_true)²
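The loop below is a minimal sketch of that learning process in plain Python: gradient descent adjusts w and b to minimize MSE on a tiny invented dataset (the sqft/price numbers are made up, and the inputs are rescaled so a simple fixed step size converges).

```python
# Hypothetical training data: (sqft, price) pairs, invented for illustration.
data = [(1000, 210_000), (1500, 310_000), (2000, 410_000)]

# Rescale so gradient descent behaves well with a fixed learning rate.
xs = [sqft / 1000 for sqft, _ in data]       # sqft in thousands
ys = [price / 100_000 for _, price in data]  # price in $100k units

w, b = 0.0, 0.0
lr = 0.1
n = len(data)
for _ in range(5000):
    # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b
    grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
    w -= lr * grad_w
    b -= lr * grad_b

mse = (1 / n) * sum((w * x + b - y) ** 2 for x, y in zip(xs, ys))
print(round(w, 2), round(b, 2))  # ≈ 2.0 and 0.1 in scaled units,
                                 # i.e. price ≈ 200·sqft + 10,000
```

The toy data was generated from an exactly linear relationship, so the loss drives MSE to essentially zero; on real data the minimum MSE stays positive.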
Beyond linear: Real relationships are rarely linear. Polynomial regression, decision tree regressors, and neural networks can model complex non-linear relationships. Deep learning excels when the mapping from input to output is highly complex (e.g., image → age estimation).
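A quick sketch of why this matters, using invented data generated from y = x²: a straight line cannot fit it, but adding a squared feature (simple polynomial regression) can. The `fit` helper below is hypothetical, reusing the same gradient-descent-on-MSE recipe for any list of feature functions.

```python
# Invented non-linear data: y = x^2 on a few sample points.
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
ys = [x * x for x in xs]
n = len(xs)

def fit(features):
    """Gradient descent on MSE for y = sum(w_i * f_i(x)) + b; returns final MSE."""
    ws = [0.0] * len(features)
    b, lr = 0.0, 0.1
    for _ in range(5000):
        preds = [sum(w * f(x) for w, f in zip(ws, features)) + b for x in xs]
        errs = [p - y for p, y in zip(preds, ys)]
        for i, f in enumerate(features):
            ws[i] -= lr * (2 / n) * sum(e * f(x) for e, x in zip(errs, xs))
        b -= lr * (2 / n) * sum(errs)
    return (1 / n) * sum(
        (sum(w * f(x) for w, f in zip(ws, features)) + b - y) ** 2
        for x, y in zip(xs, ys)
    )

linear_mse = fit([lambda x: x])                 # y = w·x + b
poly_mse = fit([lambda x: x, lambda x: x * x])  # y = w1·x + w2·x² + b
print(linear_mse > 0.01, poly_mse < 1e-6)       # the quadratic fit wins
```

The best straight line through this symmetric data is flat (y ≈ 0.5) and leaves substantial error, while the quadratic model recovers the true relationship almost exactly. Neural networks push the same idea further, learning the non-linear features themselves instead of requiring them to be hand-specified.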