Comparing Algorithms - Big O
Latest revision as of 08:47, 23 August 2023
Classification of Algorithms
https://www.youtube.com/watch?v=G77RVWK_1wo&list=PLCiOXwirraUDwnyalzSD1aCvtUnw_UHkd&index=1
Big O Notation
Big O Notation is a measure of how much time or memory is needed to execute an algorithm. It describes the worst-case scenario, so it gives the maximum time and memory usage, and it uses n for the number of items processed.
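As a rough illustration of why the classification matters, the sketch below (my own example, not part of the original page) prints how some common growth functions compare as n increases; in Big O, constants and units are ignored and only the shape of the growth counts.

```python
import math

# Compare the growth of common complexity classes for increasing n.
# Big O ignores constant factors; this just shows the relative shapes.
for n in (1, 10, 100, 1000):
    log_n = math.log2(n) if n > 1 else 0
    print(f"n={n:>5}  log n={log_n:>6.1f}  "
          f"n log n={n * log_n:>10.0f}  n^2={n * n:>8}")
```

Even at n = 1000, log n is only about 10, while n² has already reached a million.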
https://www.youtube.com/watch?v=sp9d7LlBJEM&list=PLCiOXwirraUDwnyalzSD1aCvtUnw_UHkd&index=2
https://www.youtube.com/watch?v=VTkqNgW_rm0&list=PLCiOXwirraUDwnyalzSD1aCvtUnw_UHkd&index=4
Time complexities:
Constant complexity - O(1)
An algorithm with constant complexity takes the same amount of time regardless of the number of items it processes: a items take the same amount of time to compute as b items.
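A minimal Python sketch of a constant-time operation (the function name is my own): indexing into a list costs the same no matter how long the list is.

```python
def get_first(items):
    # Indexing a Python list is O(1): the cost does not depend on len(items).
    return items[0]

print(get_first([7, 8, 9]))               # 3 items
print(get_first(list(range(1_000_000))))  # a million items: same cost
```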
Linear complexity - O(n)
An algorithm with linear complexity takes more time, at a constant rate, as you give it more items to process: the time for n items is n times the time for one item. For example, planting 2n seeds takes roughly twice as long as planting n seeds.
Logarithmic complexity - O(log n)
An algorithm with logarithmic complexity scales better than linear complexity for searching, because the growth in time slows down as n increases. For example, a binary search of a sorted list compares the target with the middle item; if it is not there, it discards half of the list and searches only the remaining half, so each step halves the amount of work left.
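The halving process described above can be sketched as a binary search in Python (the function name is my own; the list must already be sorted):

```python
def binary_search(sorted_items, target):
    # Each comparison halves the remaining range, so at most about
    # log2(n) comparisons are needed -> O(log n).
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            low = mid + 1   # target is in the upper half
        else:
            high = mid - 1  # target is in the lower half
    return -1               # not found

print(binary_search([1, 3, 5, 7, 9], 7))  # index 3
```

A list of a million sorted items needs at most about 20 comparisons.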
Linearithmic complexity - O(nlog n)
As the input n grows, the time taken grows slightly faster than n itself, by a logarithmic factor. Efficient comparison sorts such as merge sort run in O(n log n).
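As a concrete O(n log n) example, here is a minimal merge sort sketch in Python (my own illustration): the list is halved about log2(n) times, and each level of halving does O(n) work merging.

```python
def merge_sort(items):
    # Recursively halve the list (log2(n) levels), then merge each
    # pair of sorted halves in O(n) per level -> O(n log n) overall.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 6]))  # [1, 2, 5, 6, 9]
```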
Polynomial complexity - O(nk)
The time taken to process the input increases faster than the size of the input n. An example is shaking hands: between two people only one handshake is needed, but as the number of people increases, many more handshakes are required. For n people, 1/2(n2 - n) handshakes are needed, which is O(n2).
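The handshake example can be checked directly in Python (my own sketch): a nested loop over every pair of people does n(n-1)/2 iterations, which is O(n²).

```python
def count_handshakes(n):
    # Every pair of people shakes hands exactly once: the nested loop
    # runs n * (n - 1) / 2 times -> O(n^2), a polynomial complexity.
    handshakes = 0
    for i in range(n):
        for j in range(i + 1, n):
            handshakes += 1
    return handshakes

print(count_handshakes(2))   # 1
print(count_handshakes(10))  # 45
```

Doubling the number of people roughly quadruples the number of handshakes.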
Exponential complexity - O(kn)
Problems with O(kn) complexity can only be solved for small values of the input n; for large inputs the running time quickly exceeds any practical timescale, and faster hardware only shifts the practical limit slightly.
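A standard example of exponential growth (my own sketch, using the standard-library itertools module): enumerating every subset of n items produces 2^n subsets, so any algorithm that must inspect them all is at least O(2^n).

```python
from itertools import combinations

def all_subsets(items):
    # There are 2^n subsets of an n-item collection, so building (or
    # even just counting) them all takes O(2^n) time.
    subsets = []
    for size in range(len(items) + 1):
        subsets.extend(combinations(items, size))
    return subsets

print(len(all_subsets([1, 2, 3])))  # 2^3 = 8 subsets
```

Going from 10 items to 20 items multiplies the work by about a thousand.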
Factorial complexity - O(n!)
O(n!) problems grow even faster than O(kn) problems. Some of them, such as the travelling salesman problem, can instead be solved in exponential time using dynamic programming, though exponential time is still an unreasonable amount of time; once again, exact solutions are only feasible for small values of n.
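A brute-force travelling-salesman sketch in Python (my own illustration; the function name and distance table are hypothetical example data): trying every ordering of n cities checks n! routes.

```python
from itertools import permutations

def shortest_route_brute_force(dist):
    # dist[i][j] is the distance between cities i and j.
    # Trying every ordering of the n cities checks n! routes -> O(n!).
    cities = range(len(dist))
    best = None
    for route in permutations(cities):
        length = sum(dist[route[i]][route[i + 1]]
                     for i in range(len(route) - 1))
        if best is None or length < best:
            best = length
    return best

# Hypothetical 3-city distance table.
dist = [[0, 2, 9],
        [2, 0, 6],
        [9, 6, 0]]
print(shortest_route_brute_force(dist))  # 8 (route 0 -> 1 -> 2)
```

Three cities means only 6 routes, but 15 cities already means over a trillion.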