
Commit fe7d71d

Merge branch 'dev' into 2025linda
2 parents e645951 + e3d8b8a
File tree

5 files changed: +130, -163 lines


src/algorithms/controllers/AVLTreeSearch.js

Lines changed: 37 additions & 23 deletions
@@ -2,6 +2,9 @@
  * This file contains the AVL Tree Search algorithm,
  * alongside the visualisation code.
  *
+ * XXX needs a bunch more fixes. Best ignore height and balance, highlight
+ * nodes at the right points (done), improve display of t if possible,...
+ *
  * The AVL Tree Search algorithm is used to find a node.
  *
  * The search algorithm is based on the tree created by the insertion algorithm.
@@ -41,51 +44,62 @@ export default {
     let current = root;
     let parent = null;

-    chunker.add('AVL_Search(t, k)', (vis) => {
-      vis.graph.setFunctionInsertText(" (" + target + ") ");
+    chunker.add('AVL_Search(t, k)', (vis, c, p) => {
+      vis.graph.setZoom(0.55);
+      vis.graph.setFunctionInsertText("(t, " + target + ")");
       vis.graph.setFunctionName("AVL_Search");
-    });
-    chunker.add('while t not Empty');
+      vis.graph.visit(c, p);
+    }, [current, parent]);
+    if (!tree)
+      chunker.add('while t not Empty');

     let ptr = tree;
     parent = current;

-    while (ptr) {
+    /* eslint-disable no-constant-condition */
+    while (true) {
+      chunker.add('while t not Empty');
+
+      if (current === undefined || !ptr) // should use null
+        break;

-      chunker.add('n = root(t)', (vis, c, p) => vis.graph.visit(c, p), [current, parent]);
       let node = current;
       chunker.add('if n.key = k');
       if (node === target) {
-        chunker.add('if n.key = k', (vis, c, p) => vis.graph.leave(c, p), [node, parent]);
-        chunker.add('return t', (vis, c, p) => vis.graph.select(c, p), [node, parent]);
+        chunker.add('return t', (vis, c, p) => {
+          vis.graph.leave(c, p);
+          vis.graph.select(c, p);
+          vis.graph.setText('Key found');
+        }, [node, parent]);
         return 'success';
       }

       chunker.add('if n.key > k');
       if (target < node) {
-        if (tree[node].left !== undefined) {
-          // if current node has left child
-          parent = node;
-          current = tree[node].left;
-          ptr = tree[node];
+        parent = node;
+        current = tree[node].left;
+        ptr = tree[node];
+        if (current !== undefined) {

-          chunker.add('t <- n.left');
+          chunker.add('t <- n.left', (vis, c, p) => vis.graph.visit(c, p), [current, parent]);
         } else {
-          break;
+          chunker.add('t <- n.left', (vis) => vis.graph.setText('t = Empty'));
         }
-      } else if (tree[node].right !== undefined) {
-        // if current node has right child
+      } else {
         parent = node;
         current = tree[node].right;
         ptr = tree[node];
-
-        chunker.add('t <- n.right');
-      } else {
-        break;
+        // if current node has right child
+        if (current !== undefined) {
+          chunker.add('t <- n.right', (vis, c, p) => vis.graph.visit(c, p), [current, parent]);
+        } else {
+          chunker.add('t <- n.right', (vis) => vis.graph.setText('t = Empty'));
+        }
       }
     }

-    chunker.add('return NotFound', (vis) => vis.graph.setText('RESULT NOT FOUND'));
+    chunker.add('return NotFound', (vis) => vis.graph.setText('Key not found'));
     return 'fail';
   },
-};
+};
+

src/algorithms/explanations/QSExp.md

Lines changed: 35 additions & 17 deletions
@@ -6,35 +6,53 @@ Quicksort is a divide and conquer algorithm. It first rearranges the input
 array into two smaller sub-arrays: the (relatively) low elements and the
 (relatively) high elements. It then recursively sorts each of the sub-arrays.

-### Sorting using Quicksort
+## Algorithm overview

-The steps for Quicksort are:
+The steps for basic Quicksort are:

-* Pick the rightmost element of the array, called a pivot.
+* Pick the *pivot* element of the sub-array; here it is the rightmost
+  element.

-* Partitioning: reorder the array so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after it. After this partitioning, the pivot is in its final position.
+* Partitioning: reorder the sub-array so that only elements with values less than or equal to the pivot come before the pivot, while only elements with values greater than or equal to the pivot come after it. After this partitioning, the pivot is in its final position.

-* Recursively apply the above steps to the sub-array of elements with smaller values and separately to the sub-array of elements with greater values.
+* Recursively apply the above steps to the sub-array of elements before the pivot and separately to the sub-array of elements after the pivot.

-The base case of the recursion is arrays of size one or zero, which are in order by definition, so they never need to be sorted.
+The base case of the recursion is sub-arrays of size one or zero, which are in order by definition, so they never need to be sorted.

+## Partitioning

-### Complexity
+The way partitioning is done here is to use two pointers/indices to
+scan through the sub-array. One starts at the left and scans right
+in search for "large" elements (greater than or equal to the pivot).
+The other starts at the right and scans left in search for "small"
+elements (less than or equal to the pivot). Whenever a large and a small
+element are found they are swapped. When the two indices meet, the pivot
+is swapped into that position and partitioning is complete.

-Time complexity:
-<code>
-Average case <i>O(n log n)</i>
-Worst case <i>O(n<sup>2</sup>)</i>
-Best case <i>O(n log n)</i>

-Note: Worst case in quicksort occurs when a file is already sorted, since the partition is highly asymmetrical. Improvements such as median-of-three quicksort make a significant improvement, although worst case behaviour is still possible.
-</code>
+## Time complexity

-Space complexity is O(1), that is, no extra space is required.
+In the best case, partition divides the sub-array in half at each step,
+resulting in <i>O(log n)</i> levels of recursion and <i>O(n log n)</i>
+complexity overall. In the worst case, partition divides the sub-array
+very unevenly at each step. The pivot element is either the largest or
+smallest element in the sub-array and one of the resulting partitions
+is always empty, resulting in <i>O(n<sup>2</sup>)</i> complexity.
+This occurs if the input is sorted or reverse-sorted. Refinements such
+as median of three partitioning (shown elsewhere) make the worst case
+less likely. On average, partitioning is reasonably well balanced and
+<i>O(n log n)</i> complexity results.

+## Space complexity

+Although there is no explicit additional space required, quicksort is
+recursive, so it uses implicit stack space proportional to the depth of
+recursion. The best and average cases are <i>O(log n)</i> but the worst
+case is <i>O(n)</i>.

-### Development of Quicksort
+
+
+## Development of Quicksort

 The first version of quicksort was published by Tony Hoare in 1961 and
 quicksort remains the *fastest* sorting algorithm on average (subject to
@@ -43,4 +61,4 @@ done in *many* different ways and the choice of specific implementation
 details and computer hardware can significantly affect the algorithm's
 performance. In 1975, Robert Sedgewick completed a Ph.D. thesis on this
 single algorithm. Our presentation here is influenced by the original
-Hoare version and some of Sedgewick's adaptations.
+Hoare version and some of Sedgewick's adaptations.
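The partitioning scheme the revised QSExp.md describes (rightmost-element pivot, two indices scanning inward, swapping each large/small pair, then swapping the pivot into the meeting point) can be sketched as follows. This is an illustrative sketch, not the visualiser's actual implementation; the function names `partition` and `quicksort` are chosen here for illustration:

```javascript
// Sketch of quicksort with the described two-index partition.
// Pivot is the rightmost element of the current sub-array.
function partition(a, lo, hi) {
  const pivot = a[hi];
  let i = lo - 1; // left scan: looks for elements >= pivot
  let j = hi;     // right scan: looks for elements <= pivot
  for (;;) {
    do { i += 1; } while (a[i] < pivot);            // stops at a "large" element
    do { j -= 1; } while (j > lo && a[j] > pivot);  // stops at a "small" element
    if (i >= j) break;          // indices have met or crossed
    [a[i], a[j]] = [a[j], a[i]]; // swap the out-of-place pair
  }
  [a[i], a[hi]] = [a[hi], a[i]]; // pivot into its final position
  return i;
}

function quicksort(a, lo = 0, hi = a.length - 1) {
  if (lo < hi) { // base case: sub-arrays of size <= 1 are already sorted
    const p = partition(a, lo, hi);
    quicksort(a, lo, p - 1); // elements before the pivot
    quicksort(a, p + 1, hi); // elements after the pivot
  }
  return a;
}
```

Because the left scan stops at the pivot itself (`a[i] < pivot` fails when `i` reaches `hi`), `i` cannot run past the sub-array; the `j > lo` guard bounds the right scan the same way.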
