Commit 6c714c2

Add hw1, q7 explanation of weights (#6)
1 parent 325cf5f commit 6c714c2

1 file changed: +11 −0 lines
---
id: y97mg42O7d
question: 'Homework Q7: What do the weights represent?'
sort_order: 8
---

The weight vector, `w`, contains the coefficients of a linear model fit between the target variable, `y`, and the input features in `X`, with the model estimate of `y`, `y_est`, defined as follows:

$$ y_{est} = w[0]*X[0] + w[1]*X[1] $$

where the values in brackets refer to each column of the feature matrix, `X`, and the corresponding row of the weight vector, `w`. Each value in `w` is the slope of the trend line that best fits `y` for that feature. As we'll learn in Module 2, least squares yields a "best" fit that minimizes the squared difference between `y` and `y_est`. To check whether the weights in `w` are reasonable, multiply `X` by the weight vector:

$$ y_{est} = X.dot(w) $$

This should produce a vector, `y_est`, that matches the original target variable, `y`, up to some error.
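As a quick sketch of that check, here is a small NumPy example with made-up data (the arrays below are illustrative, not the homework's): we build `y` from known weights, fit `w` by least squares, and confirm that `X.dot(w)` reproduces `y` up to the noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-feature data: y is a linear combination of the
# columns of X (weights 2 and -1) plus a small amount of noise.
true_w = np.array([2.0, -1.0])
X = rng.normal(size=(100, 2))
y = X.dot(true_w) + 0.1 * rng.normal(size=100)

# Least-squares fit: w minimizes the squared difference between y and y_est.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Sanity check: y_est = X.dot(w) should be close to the original y.
y_est = X.dot(w)
print(w)                          # close to [2.0, -1.0]
print(np.abs(y - y_est).mean())   # small residual error
```

Here the fitted `w` lands near the weights used to generate the data, and the residual `y - y_est` stays on the order of the injected noise, which is exactly the "similar, plus or minus some error" behavior described above.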
