Merged
2 changes: 1 addition & 1 deletion pages/Breadth-first Search Algorithm.md
Original file line number Diff line number Diff line change
@@ -7,7 +7,7 @@ algo:: Unit 3 Outcome 2
- Traversing a tree or graph
- Start at any node, or the root node (in a tree)
- Explore all neighbours of a node before moving to their neighbours
- Only difference with [[Depth-first Search]] is a [[Queue]] instead of a [[Stack]]
- Only difference with [[Depth-first Search]] is a [[Queue]] instead of a [[Stack ADT]]
- Steps to follow
- Start from root node (random for graph)
- Set node as seen
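The steps above can be sketched in Python; a minimal illustration, assuming an adjacency-dict graph representation (the `graph` shape and names are not from the note):

```python
from collections import deque

def bfs(graph, root):
    """Breadth-first traversal: explore all neighbours of a node
    before moving on to their neighbours (a queue, not a stack)."""
    seen = {root}          # set the start node as seen
    queue = deque([root])  # the queue is what makes this BFS; a stack would give DFS
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return order

# hypothetical adjacency dict: node -> list of neighbours
g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(g, "A"))  # → ['A', 'B', 'C', 'D']
```

Swapping `queue.popleft()` for `queue.pop()` would turn this into a depth-first search, which is the single difference noted above.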
2 changes: 2 additions & 0 deletions pages/Church-Turing Thesis.md
@@ -14,6 +14,8 @@ algo:: Unit 4 Outcome 3
- there is no algorithm that can solve them
- it also implies that any computer program that can be written can be simulated by a Turing machine
- Church stated that *"No computational procedure will be considered as an algorithm unless it can be represented as a Turing Machine"*
- it does not state whether an algorithm is *feasible* or not, and this is where [[Cobham's Thesis]] is different
- only that an algorithm is computable ([[Tractability]])
-
- Further Research
background-color:: purple
32 changes: 32 additions & 0 deletions pages/Cobham's Thesis.md
@@ -0,0 +1,32 @@
tags:: Algorithmics
alias:: Cobham-Edmonds Thesis
topic:: [[Computer Science]]
algo:: Unit 4 Outcome 3

-
- named after Alan Cobham and Jack Edmonds, who proposed it in the mid-1960s
- is a statement about the feasibility of computational problems
- asserts that computational problems can be feasibly computed on some computational device only if they can be computed in polynomial time
- if a problem can be solved by an algorithm that runs in polynomial time, then it is considered to be a feasible problem
- Cobham's Thesis strengths
- it provides a clear and concise definition of what it means for a problem to be *feasible*
- it suggests that there is a fundamental limit to what can be computed in practice.
- it has helped to guide the development of new algorithms and to identify problems that are likely to be intractable
- Cobham's Thesis limitations
- it does not take into account the constant factors in the running time of algorithms and lower order terms
- it ignores the size of the exponent
- an algorithm with a polynomial running time may still be infeasible if the constant factor is large
- does not apply to problems that are defined over infinite inputs
- under Cobham's thesis, a problem whose best algorithm takes $n^{200}$ instructions is considered feasible, while a problem whose best algorithm takes $2^{0.00001 n}$ instructions is considered infeasible, even though one could never solve an instance of size $n = 2$ with the former algorithm, whereas an instance of the latter problem of size $n = 10^6$ could be solved without difficulty
- relationship between the [[Church-Turing Thesis]] and Cobham's thesis
- the [[Church-Turing Thesis]] establishes a limit on what can be **computed**
- any [[Algorithm]] that can be executed by a human or another computing device can also be executed by a [[Turing Machine]]
- it does not say anything about the *feasibility* of computing particular problems
- Cobham's thesis refines the [[Church-Turing thesis]]
- only problems that can be computed in polynomial time are **feasible**
- even if a problem can be solved by a [[Turing Machine]] , it may still be *infeasible* if it takes too long to compute the solution
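The $n^{200}$ versus $2^{0.00001 n}$ contrast described above can be checked numerically; a small sketch (the function names are illustrative, not from the note):

```python
# Instruction counts for the two hypothetical algorithms discussed above.
def poly_instructions(n):
    return n ** 200            # polynomial: "feasible" under Cobham's thesis

def exp_instructions(n):
    return 2 ** (0.00001 * n)  # exponential: "infeasible" under Cobham's thesis

# Even n = 2 is hopeless for the polynomial algorithm ...
print(poly_instructions(2))     # 2**200, roughly 1.6e60 instructions
# ... while n = 10**6 is easy for the exponential one.
print(exp_instructions(10**6))  # about 2**10 = 1024 instructions
```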
-
- Further Research
background-color:: purple
- Read
- [Cobham's thesis - Wikipedia](https://en.wikipedia.org/wiki/Cobham's_thesis)
10 changes: 8 additions & 2 deletions pages/Heuristic.md
@@ -4,10 +4,16 @@ topic:: [[Advanced Algorithm Design]]
algo:: Unit 4 Outcome 2

-
- a technique designed for problem solving more quickly when classic methods are too slow for finding an exact or approximate solution, or when classic methods fail to find any exact solution in a search space.
- a [[Design Pattern]] utilizing a heuristic will produce a solution in a reasonable time frame that is good enough for solving the problem
- a technique designed for problem solving more quickly when classic methods are too slow for finding an exact or approximate solution, or when classic methods fail to find any exact solution in a search space
- examples include
- [[Greedy]]
- [[Hill Climbing]]
- [[Simulated Annealing]]
- [[Graph Coloring]]
- a [[Design Pattern]] utilising a heuristic will produce a solution in a reasonable time frame that is good enough for solving the problem
- it may not be the best solution
- it may be an approximation of the best solution
- [[A* Algorithm]] being a popular example
- heuristic functions may be the only viable option when solving complex optimization problems and are routinely used in real-world applications
- the [[Travelling Salesman Problem]] is an example of a problem that can be solved with heuristic functions that find good solutions
-
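The [[Travelling Salesman Problem]] mention above can be made concrete with a nearest-neighbour heuristic; a hedged sketch (the city coordinates and function name are invented for illustration):

```python
import math

def nearest_neighbour_tour(points, start=0):
    """Greedy TSP heuristic: always visit the closest unvisited city.
    Fast and usually good enough, but not guaranteed optimal."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        # pick the nearest remaining city (greedy choice)
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (1, 0), (5, 0), (1, 1)]  # hypothetical coordinates
print(nearest_neighbour_tour(cities))  # → [0, 1, 3, 2]
```

This runs in polynomial time, unlike an exact search over all tours, which is the trade-off heuristics make.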
2 changes: 1 addition & 1 deletion pages/Hilbert's Formalism.md
@@ -25,7 +25,7 @@ algo:: Unit 4 Outcome 3
- he formulated the [[Halting Problem]] to prove his result
- there is no algorithm that can correctly determine, for any arbitrary mathematical statement, whether the statement is true or false
- Hilbert was disappointed by Turing's proof, but it ultimately led to a deeper understanding of the limits of what computers can do through the concept of [[Decidability]]
- brief timeline of the events leading up to the [[halting problem]]
- brief timeline of the events leading up to the [[Halting Problem]]
- 1928: David Hilbert poses the Entscheidungsproblem
- 1931: Kurt Gödel proves that there are statements that are true but cannot be proven
- 1936: Alan Turing proves that the Entscheidungsproblem is undecidable
3 changes: 2 additions & 1 deletion pages/Minimal Spanning Tree.md
@@ -11,4 +11,5 @@ algo:: Unit 3 Outcome 2
- There can be multiple MSTs in a [[Graph]]
- https://i.ytimg.com/vi/z1L3rMzG1_A/maxresdefault.jpg
id:: 6515425f-7d1d-45c6-ac71-01f54c4d26e8
*Refer to [[Set Operations]] for help with any of the notation in the above algorithm*
*Refer to [[Set Operations]] for help with any of the notation in the above algorithm*
- use [[Prim's Algorithm]] or [[Kruskal's Algorithm]] to find an MST for a graph
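A minimal sketch of [[Prim's Algorithm]] for finding an MST, assuming a weighted adjacency-dict representation (the `graph` shape is an assumption, not from the note):

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm: grow the tree one cheapest crossing edge at a time.
    `graph` maps node -> list of (neighbour, weight) pairs (assumed shape)."""
    in_tree = {start}
    edges = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(edges)  # min-heap keyed on edge weight
    mst = []
    while edges and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(edges)
        if v in in_tree:
            continue          # would create a cycle, skip
        in_tree.add(v)
        mst.append((u, v, w))
        for nxt, nw in graph[v]:
            if nxt not in in_tree:
                heapq.heappush(edges, (nw, v, nxt))
    return mst

g = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2)],
    "C": [("A", 4), ("B", 2)],
}
print(prim_mst(g, "A"))  # → [('A', 'B', 1), ('B', 'C', 2)]
```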
2 changes: 1 addition & 1 deletion pages/Quicksort Algorithm.md
@@ -4,7 +4,7 @@ topic:: [[Advanced Algorithm Design]]
algo:: Unit 4 Outcome 2

-
- uses divide and conquer [[Recursion]] for sorting arrays.
- uses [[Divide and Conquer]] [[Recursion]] for sorting arrays.
- works by recursively partitioning the array around a pivot element.
- the pivot element is then placed in its correct position in the array, and the two halves of the array are recursively sorted.
- key aspects of the quicksort:
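The partition-around-a-pivot idea above can be sketched minimally (a non-in-place version, chosen for clarity rather than performance):

```python
def quicksort(arr):
    """Divide and conquer: partition around a pivot, then
    recursively sort the two sides."""
    if len(arr) <= 1:                         # base case: nothing to divide
        return arr
    pivot, rest = arr[0], arr[1:]
    left = [x for x in rest if x <= pivot]    # elements <= pivot go left
    right = [x for x in rest if x > pivot]    # elements > pivot go right
    # the pivot lands in its final position between the sorted halves
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

An in-place variant (Lomuto or Hoare partitioning) avoids the extra lists but follows the same recursive structure.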
1 change: 0 additions & 1 deletion pages/Support Vector Machines.md

This file was deleted.

1 change: 1 addition & 0 deletions pages/Time Complexity.md
@@ -4,6 +4,7 @@ algo:: Unit 4 Outcome 1

-
- is a measure of how long it takes an algorithm to run as a function of the size of the input
- you can find examples of common time complexities on [[Time Complexity/Examples]]
- Big O, Big $\Theta$, and Big $\Omega$ are three notations used in computer science to describe the asymptotic behaviour of algorithms.
- used to classify algorithms based on their time complexity
- Big O notation is used to describe the upper bound of an algorithm's time complexity
1 change: 1 addition & 0 deletions pages/Time Complexity___Examples.md
@@ -3,6 +3,7 @@ topic:: [[Time Complexity]]
algo:: Unit 4 Outcome 1

-
- common [[Time Complexity]] examples are shown below
- O(1)
- Constant time algorithms: These algorithms take the same amount of time to execute, regardless of the size of the input. Some examples of constant time algorithms include:
- Accessing an element in an array
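The constant-time array access example above can be illustrated directly (the list sizes are arbitrary):

```python
# O(1): accessing an element costs the same regardless of input size.
def first_element(arr):
    return arr[0]   # a single indexing operation

small = list(range(10))
large = list(range(1_000_000))
# Both calls perform one indexing step, independent of list length.
print(first_element(small), first_element(large))  # → 0 0
```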