
Commit aad6949

Merge pull request #9 from mutable-learning/dev
Update to algorithmics notes
2 parents b1b2ce0 + 3eb9440 commit aad6949

10 files changed (+49, -7 lines)

pages/Breadth-first Search Algorithm.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ algo:: Unit 3 Outcome 2
 - Traversing a tree or graph
 - Start at any node, or the root node (in a tree)
 - Explore all neighbours of a node before moving to their neighbours
-- Only difference with [[Depth-first Search]] is a [[Queue]] instead of a [[Stack]]
+- Only difference with [[Depth-first Search]] is a [[Queue]] instead of a [[Stack ADT]]
 - Steps to follow
 - Start from root node (random for graph)
 - Set node as seen
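The steps in the note above can be sketched as a minimal traversal, using a FIFO queue and a seen set (the graph below is an illustrative example, not from the notes):

```python
from collections import deque

def bfs(graph, root):
    """Breadth-first traversal: explore all neighbours of a node
    before moving on to their neighbours, using a Queue (FIFO)."""
    seen = {root}          # set the start node as seen
    order = []
    queue = deque([root])  # swapping this queue for a stack gives DFS
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return order

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(graph, "A"))  # ['A', 'B', 'C', 'D']
```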

pages/Church-Turing Thesis.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ algo:: Unit 4 Outcome 3
 - there is no algorithm that can solve them
 - it also implies that any computer program that can be written can be simulated by a Turing machine
 - Church stated that *"No computational procedure will be considered as an algorithm unless it can be represented as a Turing Machine"*
+- it does not state whether an algorithm is *feasible* or not, and this is where [[Cobham's Thesis]] is different
+- only that an algorithm is computable ([[Tractability]])
 -
 - Further Research
 background-color:: purple

pages/Cobham's Thesis.md

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
+tags:: Algorithmics
+alias:: Cobham-Edmonds Thesis
+topic:: [[Computer Science]]
+algo:: Unit 4 Outcome 3
+
+-
+- named after Alan Cobham and Jack Edmonds, who first proposed it in the early 1960s
+- is a statement about the feasibility of computational problems
+- asserts that computational problems can be feasibly computed on some computational device only if they can be computed in polynomial time
+- if a problem can be solved by an algorithm that runs in polynomial time, then it is considered to be a feasible problem
+- Cobham's Thesis strengths
+- it provides a clear and concise definition of what it means for a problem to be *feasible*
+- it suggests that there is a fundamental limit to what can be computed in practice
+- it has helped to guide the development of new algorithms and to identify problems that are likely to be intractable
+- Cobham's Thesis limitations
+- it does not take into account the constant factors in the running time of algorithms and lower order terms
+- it ignores the size of the exponent
+- an algorithm with a polynomial running time may still be infeasible if the constant factor is large
+- does not apply to problems that are defined over infinite inputs
+- under Cobham's thesis, a problem for which the best algorithm takes $n^{200}$ instructions is considered feasible, and a problem with an algorithm that takes $2^{0.00001 n}$ instructions infeasible, even though one could never solve an instance of size $n = 2$ with the former algorithm, whereas an instance of the latter problem of size $n = 10^6$ could be solved without difficulty
+- relationship between the [[Church-Turing Thesis]] and Cobham's thesis
+- the [[Church-Turing Thesis]] establishes a limit on what can be **computed**
+- any [[Algorithm]] that can be executed by a human or another computing device can also be executed by a [[Turing Machine]]
+- it does not say anything about the *feasibility* of computing particular problems
+- Cobham's thesis refines the [[Church-Turing Thesis]]
+- only problems that can be computed in polynomial time are **feasible**
+- even if a problem can be solved by a [[Turing Machine]], it may still be *infeasible* if it takes too long to compute the solution
+-
+- Further Research
+background-color:: purple
+- Read
+- [Cobham's thesis - Wikipedia](https://en.wikipedia.org/wiki/Cobham's_thesis)
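The $n^{200}$ versus $2^{0.00001 n}$ example in the new note can be checked with a quick back-of-the-envelope calculation:

```python
# Instruction counts for the two algorithms compared in the note.
n_small = 2
poly = n_small ** 200            # "feasible" under Cobham's thesis
# 2**200 is about 1.6e60 instructions: far beyond any real machine

n_large = 10 ** 6
exp = 2 ** (0.00001 * n_large)   # "infeasible" under Cobham's thesis
# 2**(0.00001 * 1e6) = 2**10 = 1024 instructions: trivial in practice
print(poly > 10 ** 60, exp)
```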

pages/Heuristic.md

Lines changed: 8 additions & 2 deletions
@@ -4,10 +4,16 @@ topic:: [[Advanced Algorithm Design]]
 algo:: Unit 4 Outcome 2

 -
-- a technique designed for problem solving more quickly when classic methods are too slow for finding an exact or approximate solution, or when classic methods fail to find any exact solution in a search space.
-- a [[Design Pattern]] utilizing a heuristic will produce a solution in a reasonable time frame that is good enough for solving the problem
+- a technique designed for problem solving more quickly when classic methods are too slow for finding an exact or approximate solution, or when classic methods fail to find any exact solution in a search space
+- examples include
+- [[Greedy]]
+- [[Hill Climbing]]
+- [[Simulated Annealing]]
+- [[Graph Coloring]]
+- a [[Design Pattern]] utilising a heuristic will produce a solution in a reasonable time frame that is good enough for solving the problem
 - it may not be the best solution
 - it may be an approximation of the best solution
+- [[A* Algorithm]] being a popular example
 - heuristic functions may be the only viable option when solving complex optimization problems and are routinely used in real-world applications
 - the [[Travelling Salesman Problem]] is an example of a problem that can be solved with heuristic functions that find good solutions
 -
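One such heuristic for the Travelling Salesman Problem mentioned above is the greedy nearest-neighbour tour: fast and good enough, but with no optimality guarantee (the city coordinates below are an illustrative assumption):

```python
import math

def nearest_neighbour_tour(points):
    """Greedy TSP heuristic: always visit the closest unvisited city.
    Runs quickly, but the tour is only 'good enough', not optimal."""
    unvisited = set(range(1, len(points)))
    tour = [0]  # start at city 0
    while unvisited:
        last = points[tour[-1]]
        # greedy choice: nearest remaining city by Euclidean distance
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

cities = [(0, 0), (0, 1), (5, 5), (0, 2)]
print(nearest_neighbour_tour(cities))  # [0, 1, 3, 2]
```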

pages/Hilbert's Formalism.md

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ algo:: Unit 4 Outcome 3
 - he created the [[Halting Problem]] to prove his work
 - there is no algorithm that can correctly determine, for any arbitrary mathematical statement, whether the statement is true or false
 - Hilbert was disappointed by Turing's proof, but it ultimately led to a deeper understanding of the limits of what computers can do through the concept of [[Decidability]]
-- brief timeline of the events leading up to the [[halting problem]]
+- brief timeline of the events leading up to the [[Halting Problem]]
 - 1928: David Hilbert poses the Entscheidungsproblem
 - 1931: Kurt Gödel proves that there are statements that are true but cannot be proven
 - 1936: Alan Turing proves that the Entscheidungsproblem is undecidable

pages/Minimal Spanning Tree.md

Lines changed: 2 additions & 1 deletion
@@ -11,4 +11,5 @@ algo:: Unit 3 Outcome 2
 - There can be multiple MSTs in a [[Graph]]
 - https://i.ytimg.com/vi/z1L3rMzG1_A/maxresdefault.jpg
 id:: 6515425f-7d1d-45c6-ac71-01f54c4d26e8
-*Refer to [[Set Operations]] for help with any of the notation in the above algorithm*
+*Refer to [[Set Operations]] for help with any of the notation in the above algorithm*
+- use [[Prim's Algorithm]] or [[Kruskal's Algorithm]] to find an MST for a graph
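A minimal sketch of Prim's algorithm, one of the two MST approaches named in the added line (the weighted graph below is an illustrative assumption):

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm: grow the MST one cheapest crossing edge
    at a time from the set of nodes already in the tree."""
    visited = {start}
    edges = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(edges)
    mst = []
    while edges and len(visited) < len(graph):
        w, u, v = heapq.heappop(edges)   # cheapest edge leaving the tree
        if v in visited:
            continue
        visited.add(v)
        mst.append((u, v, w))
        for nxt, nw in graph[v]:
            if nxt not in visited:
                heapq.heappush(edges, (nw, v, nxt))
    return mst

# undirected weighted graph as adjacency lists of (neighbour, weight)
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2)],
    "C": [("A", 4), ("B", 2)],
}
print(prim_mst(graph, "A"))  # [('A', 'B', 1), ('B', 'C', 2)]
```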

pages/Quicksort Algorithm.md

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ topic:: [[Advanced Algorithm Design]]
 algo:: Unit 4 Outcome 2

 -
-- uses divide and conquer [[Recursion]] for sorting arrays.
+- uses [[Divide and Conquer]] [[Recursion]] for sorting arrays.
 - works by recursively partitioning the array around a Pivot element.
 - the Pivot element is then placed in its correct position in the array, and the two halves of the array are recursively sorted.
 - key aspects of the quicksort:
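The partition-and-recurse idea described above can be sketched as follows (this simple version builds new lists rather than partitioning in place, trading memory for clarity):

```python
def quicksort(arr):
    """Divide and conquer: partition around a pivot, then
    recursively sort the two halves and join them."""
    if len(arr) <= 1:
        return arr  # base case: already sorted
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]  # pivot lands in its final place
    right = [x for x in arr if x > pivot]
    return quicksort(left) + middle + quicksort(right)

print(quicksort([3, 6, 1, 8, 2, 9]))  # [1, 2, 3, 6, 8, 9]
```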

pages/Support Vector Machines.md

Lines changed: 0 additions & 1 deletion
This file was deleted.

pages/Time Complexity.md

Lines changed: 1 addition & 0 deletions
@@ -4,6 +4,7 @@ algo:: Unit 4 Outcome 1

 -
 - is a measure of how long it takes an algorithm to run as a function of the size of the input
+- you can find examples of common time complexities on [[Time Complexity/Examples]]
 - Big O, Big $\Theta$, and Big $\Omega$ are three notations used in computer science to describe the asymptotic behaviour of algorithms.
 - used to classify algorithms based on their time complexity
 - Big O notation is used to describe the upper bound of an algorithm's time complexity

pages/Time Complexity___Examples.md

Lines changed: 1 addition & 0 deletions
@@ -3,6 +3,7 @@ topic:: [[Time Complexity]]
 algo:: Unit 4 Outcome 1

 -
+- common [[time complexity]] examples are shown below
 - O(1)
 - Constant time algorithms: These algorithms take the same amount of time to execute, regardless of the size of the input. Some examples of constant time algorithms include:
 - Accessing an element in an array
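The array-access example above in a couple of lines: indexing takes the same time however long the list is, which is what O(1) means:

```python
def get_first(items):
    """O(1): indexing a Python list costs the same
    regardless of the list's length."""
    return items[0]

print(get_first([10, 20, 30]))            # 10
print(get_first(list(range(1_000_000))))  # 0, no slower on a huge list
```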
