lectures/eigen_I.md (7 additions, 7 deletions)
@@ -190,7 +190,7 @@ One way to understand this transformation is that $A$
 
 Let's examine some standard transformations we can perform with matrices.
 
-Below we visualise transformations by thinking of vectors as points
+Below we visualize transformations by thinking of vectors as points
 instead of arrows.
 
 We consider how a given matrix transforms
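As a rough illustration of the point-based view described in this hunk (an editor's sketch, not part of the diff or the lecture; the matrix `A` below is an arbitrary example), a grid of points can be plotted before and after applying a matrix:

```python
import numpy as np
import matplotlib.pyplot as plt

A = np.array([[2, 1],
              [1, 1]])          # arbitrary example matrix (not from the lecture)

# A small grid of points in the plane, one point per row
xs, ys = np.meshgrid(np.linspace(-1, 1, 11), np.linspace(-1, 1, 11))
points = np.column_stack([xs.ravel(), ys.ravel()])

# Apply the transformation: each point p is mapped to A @ p
transformed = points @ A.T

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
axes[0].scatter(points[:, 0], points[:, 1], s=10)
axes[0].set_title("original points")
axes[1].scatter(transformed[:, 0], transformed[:, 1], s=10, color="tab:red")
axes[1].set_title("points after applying A")
plt.show()
```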
@@ -511,7 +511,7 @@ Let $A$ be the $90^{\circ}$ clockwise rotation matrix given by
 $\begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}$ and let $B$ be a shear matrix
 along the x-axis given by $\begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}$.
 
-We will visualise how a grid of points changes when we apply the
+We will visualize how a grid of points changes when we apply the
 transformation $AB$ and then compare it with the transformation $BA$.
 
 ```{code-cell} ipython3
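As a quick numeric aside (not part of the changed lecture cell), the two products can be compared directly using the rotation and shear matrices quoted in the context lines above, confirming that $AB \neq BA$:

```python
import numpy as np

A = np.array([[0, 1],
              [-1, 0]])   # 90-degree clockwise rotation, as quoted above
B = np.array([[1, 2],
              [0, 1]])    # shear along the x-axis, as quoted above

print("AB =\n", A @ B)
print("BA =\n", B @ A)
print("AB equals BA?", np.allclose(A @ B, B @ A))   # False: the order of transformations matters
```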
@@ -685,7 +685,7 @@ In this case, repeatedly multiplying a vector by $A$ makes the vector "spiral ou
 
 We thus observe that the sequence $(A^kv)_{k \geq 0}$ behaves differently depending on the map $A$ itself.
 
-We now discuss the property of A that determines this behaviour.
+We now discuss the property of A that determines this behavior.
 
 (la_eigenvalues)=
 ## Eigenvalues
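A minimal sketch of the contrast being described (the matrices below are stand-ins, not the lecture's examples): the norm of $A^k v$ grows when the eigenvalues of $A$ exceed one in modulus and shrinks when they are below one.

```python
import numpy as np

# Stand-in matrices (not the lecture's): one with eigenvalue moduli above 1, one below 1
A_grow  = np.array([[1.0,  0.6],
                    [-0.6, 1.0]])   # eigenvalues 1 +/- 0.6i, modulus about 1.17
A_decay = np.array([[0.6,  0.4],
                    [-0.4, 0.6]])   # eigenvalues 0.6 +/- 0.4i, modulus about 0.72

v = np.array([1.0, 0.0])
for name, M in [("grow", A_grow), ("decay", A_decay)]:
    w = v.copy()
    norms = []
    for _ in range(10):
        w = M @ w                    # apply the map repeatedly: w = M^k v
        norms.append(np.linalg.norm(w))
    print(name, "| spectral radius:", round(max(abs(np.linalg.eigvals(M))), 3))
    print(name, "| norms of M^k v:", np.round(norms, 3))
```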
@@ -846,7 +846,7 @@ This is discussed further later.
 ```{exercise}
 :label: eig1_ex1
 
-Power iteration is a method for finding the largest absolute eigenvalue of a diagnalizable matrix.
+Power iteration is a method for finding the largest absolute eigenvalue of a diagonalizable matrix.
 
 The method starts with a random vector $b_0$ and repeatedly applies the matrix $A$ to it
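For reference, a bare-bones power iteration along the lines described in the exercise might look like the following (an editor's sketch, not the lecture's solution cell; the test matrix and iteration count are arbitrary):

```python
import numpy as np

def power_iteration(A, num_iters=100, seed=0):
    """Approximate the eigenvalue of A with the largest absolute value."""
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(A.shape[0])      # random starting vector b_0
    for _ in range(num_iters):
        b = A @ b                            # repeatedly apply the matrix
        b = b / np.linalg.norm(b)            # renormalize to avoid overflow/underflow
    return b @ A @ b, b                      # Rayleigh quotient and eigenvector estimate

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, vec = power_iteration(A)
print(lam, max(abs(np.linalg.eigvals(A))))   # the two numbers should nearly coincide
```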
@@ -1087,17 +1087,17 @@ for i, example in enumerate(examples):
 plt.show()
 ```
 
-The vector fields explains why we observed the trajectories of the vector $v$ multiplied by $A$ iteratively before.
+The vector fields explain why we observed the trajectories of the vector $v$ multiplied by $A$ iteratively before.
 
 The pattern demonstrated here is because we have complex eigenvalues and eigenvectors.
 
 It is important to acknowledge that there is a complex plane.
 
-If we add the complex axis for the plot, the plot will be more complicated.
+If we add the complex axis to the plot, the plot will be more complicated.
 
 Here we used the real part of the eigenvalues and eigenvectors.
 
-We can try to plot the complex plane for one of the matrix using `Arrow3D` class retrieved from [stackoverflow](https://stackoverflow.com/questions/22867620/putting-arrowheads-on-vectors-in-matplotlibs-3d-plot).
+We can try to plot the complex plane for one of the matrices using `Arrow3D` class retrieved from [stackoverflow](https://stackoverflow.com/questions/22867620/putting-arrowheads-on-vectors-in-matplotlibs-3d-plot).
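To make the role of complex eigenvalues concrete (the lecture's `examples` list is not visible in this diff, so the matrix below is a stand-in), one can inspect the spectrum directly and see which part of the information a purely real plot keeps:

```python
import numpy as np

# Stand-in matrix with a complex-conjugate pair of eigenvalues
A = np.array([[1.0, -0.7],
              [0.7,  1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print("eigenvalues:", eigvals)                 # 1 + 0.7j and 1 - 0.7j
print("moduli:", np.abs(eigvals))              # above 1, so iterates of A spiral outward
print("eigenvector:", eigvecs[:, 0])
print("real part:", eigvecs[:, 0].real)        # what a purely real 2D plot can show
print("imaginary part:", eigvecs[:, 0].imag)   # the extra information a complex axis would carry
```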
lectures/eigen_II.md (7 additions, 7 deletions)
@@ -73,15 +73,15 @@ Here are some examples to illustrate this further.
 
 Let $A$ be a square nonnegative matrix and let $A^k$ be the $k^{th}$ power of $A$.
 
-A matrix is consisdered**primitive** if there exists a $k \in \mathbb{N}$ such that $A^k$ is everywhere positive.
+A matrix is considered**primitive** if there exists a $k \in \mathbb{N}$ such that $A^k$ is everywhere positive.
 
 It means that $A$ is called primitive if there is an integer $k \geq 0$ such that $a^{k}_{ij} > 0$ for *all* $(i,j)$.
 
 We can see that if a matrix is primitive, then it implies the matrix is irreducible.
 
-This is becuase if there exists an $A^k$ such that $a^{k}_{ij} > 0$ for all $(i,j)$, then it guarantees the same property for ${k+1}^th, {k+2}^th ... {k+n}^th$ iterations.
+This is because if there exists an $A^k$ such that $a^{k}_{ij} > 0$ for all $(i,j)$, then it guarantees the same property for ${k+1}^th, {k+2}^th ... {k+n}^th$ iterations.
 
-In other words, a primitive matrix is both irreducible and aperiodical as aperiodicity requires the a state to be visited with a guarantee of returning to itself after certain amount of iterations.
+In other words, a primitive matrix is both irreducible and aperiodical as aperiodicity requires a state to be visited with a guarantee of returning to itself after a certain amount of iterations.
 
 ### Left Eigenvectors
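A small numeric check of the definition of primitivity discussed in this hunk (the matrix is a made-up example, not taken from the lecture): it contains a zero entry, yet from some power onward every entry of $A^k$ is positive, which also illustrates why the property persists for higher powers.

```python
import numpy as np

A = np.array([[0, 1],
              [1, 1]])   # nonnegative matrix with a zero entry (made-up example)

# Check whether some power of A is everywhere positive
for k in range(1, 6):
    Ak = np.linalg.matrix_power(A, k)
    print(f"A^{k} > 0 everywhere? {(Ak > 0).all()}")
# The answer flips to True at k = 2 and stays True, so A is primitive
```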
@@ -97,7 +97,7 @@ A vector $\varepsilon$ is called a left eigenvector of $A$ if $\varepsilon$ is a
 
 In other words, if $\varepsilon$ is a left eigenvector of matrix A, then $A^T \varepsilon = \lambda \varepsilon$, where $\lambda$ is the eigenvalue associated with the left eigenvector $v$.
 
-This hints on how to compute left eigenvectors
+This hints at how to compute left eigenvectors
 
 ```{code-cell} ipython3
 # Define a sample matrix
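The lecture's code cell is only partially visible in this diff; a self-contained way to compute left eigenvectors in NumPy, following the relation $A^T \varepsilon = \lambda \varepsilon$ quoted above, could be the following (the sample matrix is a placeholder):

```python
import numpy as np

A = np.array([[3, 2],
              [1, 4]])                    # placeholder sample matrix

# Left eigenvectors of A are the (right) eigenvectors of A transpose
eigvals, left_vecs = np.linalg.eig(A.T)

# Verify the defining relation eps^T A = lambda * eps^T for the first pair
eps, lam = left_vecs[:, 0], eigvals[0]
print(eps @ A)        # should match ...
print(lam * eps)      # ... this row
```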
@@ -134,10 +134,10 @@ This is a more common expression and where the name left eigenvectors originates
 
 (perron-frobe)=
 ### The Perron-Frobenius Theorem
 
-For a nonnegative matrix $A$ the behaviour of $A^k$ as $k \to \infty$ is controlled by the eigenvalue with the largest
+For a nonnegative matrix $A$ the behavior of $A^k$ as $k \to \infty$ is controlled by the eigenvalue with the largest
 absolute value, often called the **dominant eigenvalue**.
 
-For a matrix $A$, the Perron-Frobenius theorem characterises certain
+For a matrix $A$, the Perron-Frobenius theorem characterizes certain
 properties of the dominant eigenvalue and its corresponding eigenvector when
 $A$ is a nonnegative square matrix.
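As a rough illustration of the claim about the dominant eigenvalue (the nonnegative matrix below is hypothetical, not from the lecture): its largest-modulus eigenvalue is real, the associated eigenvector can be chosen entrywise positive, and $A^k / r^k$ settles down as $k$ grows.

```python
import numpy as np

A = np.array([[0.5, 0.4],
              [0.3, 0.6]])                 # hypothetical positive matrix

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(np.abs(eigvals))             # index of the dominant eigenvalue
r = eigvals[i].real                        # real for this nonnegative matrix
v = eigvecs[:, i].real
v = v / v.sum()                            # scale so the entries are positive and sum to one

print("dominant eigenvalue r:", r)
print("dominant eigenvector:", v)          # entrywise positive
print("A^20 / r^20:\n", np.linalg.matrix_power(A, 20) / r**20)   # approaches a rank-one matrix
```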
@@ -214,7 +214,7 @@ Using matrix algebra we can conclude that the solution to this system of equatio
 
 What guarantees the existence of a unique vector $x^{*}$ that satisfies
 {eq}`neumann_eqn` ?
 
-The following is a fundamental result in functional analysis that generalises
+The following is a fundamental result in functional analysis that generalizes
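The surrounding text is truncated in this hunk, but assuming the referenced equation has the usual form $x = Ax + d$ with solution $x^* = (I - A)^{-1} d$, the point of the result being introduced can be checked numerically: when the spectral radius of $A$ is below one, the inverse exists and agrees with the truncated Neumann series $\sum_k A^k d$ (the coefficients below are made up):

```python
import numpy as np

A = np.array([[0.3, 0.2],
              [0.1, 0.4]])                 # made-up coefficients with spectral radius < 1
d = np.array([1.0, 2.0])

print("spectral radius:", max(abs(np.linalg.eigvals(A))))   # 0.5 here

# Direct solution of x = A x + d
x_star = np.linalg.solve(np.eye(2) - A, d)

# Truncated Neumann series: sum_{k=0}^{49} A^k d
x_series = sum(np.linalg.matrix_power(A, k) @ d for k in range(50))

print(x_star)      # the two vectors should agree
print(x_series)    # up to a tiny truncation error
```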