author     mo khan <mo@mokhan.ca>  2021-11-07 15:20:04 -0700
committer  mo khan <mo@mokhan.ca>  2021-11-07 15:20:04 -0700
commit     9c2288f84e1a0811e26ee6f954a4d16d8c03ba3c (patch)
tree       b1575dcc13cf9904682e954ba9be5f13a8d4ca63
parent     f82103800d1250f9699d8c97d07adbd488a7936f (diff)
work on assignment 1
-rw-r--r--  0x01/README.md  62
-rw-r--r--  notes.md        36
2 files changed, 90 insertions, 8 deletions
diff --git a/0x01/README.md b/0x01/README.md
index ff523b3..c0daa8a 100644
--- a/0x01/README.md
+++ b/0x01/README.md
@@ -224,9 +224,9 @@ Asymptotic behaviour of polynomials
Let
```
- d
-p(n) = Σ ai n^i
- i=0
+ d
+p(n) = Σ ai n^i
+ i=0
```
where `ad > 0`, be a degree-d polynomial in `n`, and let `k` be a constant.
@@ -234,9 +234,65 @@ Use the definitions of the asymptotic notations to prove the following
properties.
a. If `k >= d`, then `p(n) = O(n^k)`.
+
+ To prove this we use the definition of big-oh:
+ `f(n) = O(g(n))` if there exist positive constants c and n0
+ such that `0 <= f(n) <= c * g(n)` for all `n >= n0`.
+
+ For `n >= 1`, every term of the polynomial satisfies `ai * n^i <= |ai| * n^d`,
+ so `p(n) <= (|a0| + |a1| + ... + |ad|) * n^d`. Since `k >= d`, we also have
+ `n^d <= n^k` for all `n >= 1`, which gives
+ `0 <= p(n) <= (|a0| + |a1| + ... + |ad|) * n^k`.
+ Choosing `c = |a0| + |a1| + ... + |ad|` and `n0 = 1` satisfies the
+ definition, so `p(n) = O(n^k)`.
+
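As a quick numeric sanity check (not part of the formal proof), we can try the witnesses on a concrete polynomial; `p(n) = 3n^2 + 5n + 7` and the constants below are illustrative choices, not values from the assignment.

```python
# Sanity check for part (a): p(n) = 3n^2 + 5n + 7, so d = 2 and the
# coefficient sum is 3 + 5 + 7 = 15. With k = 3 >= d, the witnesses
# c = 15 and n0 = 1 should satisfy p(n) <= c * n^k on a sampled range.
def p(n):
    return 3 * n**2 + 5 * n + 7

c, k, n0 = 15, 3, 1
assert all(p(n) <= c * n**k for n in range(n0, 10_000))
```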
b. If `k <= d`, then `p(n) = omega(n^k)`.
+
+ To prove `omega(n^k)` we refer to the definition of omega:
+ `f(n) = omega(g(n))` if there exist positive constants c and n0
+ such that `0 <= c * g(n) <= f(n)` for all `n >= n0`.
+
+ Because `ad > 0`, the leading term eventually dominates: for large enough
+ `n` the lower-order terms change `p(n)` by less than half of `ad * n^d`,
+ so there is an `n0 >= 1` with `p(n) >= (ad / 2) * n^d` for all `n >= n0`.
+ Since `k <= d`, we also have `n^k <= n^d` for all `n >= 1`, so
+ `0 <= (ad / 2) * n^k <= (ad / 2) * n^d <= p(n)` for all `n >= n0`.
+ Choosing `c = ad / 2` together with that `n0` satisfies the definition,
+ so the function's lower bound is `n^k`, i.e. `p(n) = omega(n^k)`.
+
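The same illustrative polynomial can be used to check the omega witnesses numerically; `c = ad / 2` and `n0 = 1` are the choices from the proof above, applied to an example not taken from the assignment.

```python
# Sanity check for part (b): p(n) = 3n^2 + 5n + 7 (ad = 3) with
# k = 1 <= d = 2. The witnesses c = ad / 2 = 1.5 and n0 = 1 should
# give 0 <= c * n^k <= p(n) on the sampled range.
def p(n):
    return 3 * n**2 + 5 * n + 7

c, k, n0 = 1.5, 1, 1
assert all(0 <= c * n**k <= p(n) for n in range(n0, 10_000))
```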
c. If `k = d`, then `p(n) = theta(n^k)`.
+
+ To prove `theta(n^k)` we use the definition of theta:
+ `f(n) = theta(g(n))` if there exist positive constants c1, c2, and n0
+ such that `0 <= c1 * g(n) <= f(n) <= c2 * g(n)` for all `n >= n0`.
+
+ If `k = d` then both `k >= d` and `k <= d` hold, so by part (a) we have
+ `p(n) = O(n^k)` and by part (b) we have `p(n) = omega(n^k)`. Taking `c1`
+ from the omega proof, `c2` from the big-oh proof, and the larger of the
+ two `n0` values satisfies the definition, so `p(n)` is tightly bound to
+ `n^k`, i.e. `p(n) = theta(n^k)`, which is equivalent to `p(n) = theta(n^d)`.
+
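Combining the witnesses from the two previous checks gives a numeric sketch of the theta bound; again the polynomial and constants are illustrative assumptions, not assignment values.

```python
# Sanity check for part (c): p(n) = 3n^2 + 5n + 7 with k = d = 2.
# Reusing c1 = ad / 2 = 1.5 from part (b) and c2 = 15 from part (a)
# should sandwich p(n) between c1 * n^k and c2 * n^k.
def p(n):
    return 3 * n**2 + 5 * n + 7

c1, c2, k, n0 = 1.5, 15, 2, 1
assert all(c1 * n**k <= p(n) <= c2 * n**k for n in range(n0, 10_000))
```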
d. If `k > d`, then `p(n) = o(n^k)`.
+
+ To prove this we need the little-oh definition:
+ `f(n) = o(g(n))` if, for any positive constant `c > 0`, there exists a
+ constant `n0 > 0` such that `0 <= f(n) < c * g(n)` for all `n >= n0`.
+
+ This is stricter than part (a): the bound must hold for every `c`, not
+ just for one. From part (a), `p(n) <= A * n^d` for all `n >= 1`, where
+ `A = |a0| + |a1| + ... + |ad|`. Since `k > d`,
+ `p(n) / n^k <= A * n^(d - k) = A / n^(k - d)`, which goes to 0 as `n`
+ grows. So for any `c > 0` we can pick `n0` large enough that
+ `A / n0^(k - d) < c`, and then `0 <= p(n) < c * n^k` for all `n >= n0`.
+ This allows us to claim that `p(n) = o(n^k)`.
+
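The "works for every c" requirement can be probed numerically by picking a deliberately small `c` and computing the matching `n0`; the polynomial and the value `c = 0.001` are illustrative assumptions.

```python
# Sanity check for part (d): p(n) = 3n^2 + 5n + 7 with k = 3 > d = 2.
# Since p(n) / n^3 <= 15 / n, even for c = 0.001 the choice
# n0 = 15_001 (which makes 15 / n0 < c) should give p(n) < c * n^k.
def p(n):
    return 3 * n**2 + 5 * n + 7

c, k, n0 = 0.001, 3, 15_001
assert all(p(n) < c * n**k for n in range(n0, n0 + 5_000))
```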
e. If `k < d`, then `p(n) = w(n^k)`.
1. Exercise 4.1-2 from the textbook (10 marks)
diff --git a/notes.md b/notes.md
index 4cd2d1b..de455b5 100644
--- a/notes.md
+++ b/notes.md
@@ -942,12 +942,19 @@ Example:
instead of guessing the value you can make it as 2n+3
+Little-oh
+---------
+
+The function `f(n) = o(g(n))` if, for any positive constant `c > 0`,
+there exists a constant `n0 > 0` such that `0 <= f(n) < c * g(n)`
+for all `n >= n0`.
+
Omega
----
-The function `f(n) = omega(g(n)) if positive contstants c and n0`
-such that `f(n) >= c * g(n) V n>=n0`
+The function `f(n) = omega(g(n))` if there exist positive constants c and n0
+such that `0 <= c * g(n) <= f(n)` for all `n >= n0`.
e.g.
@@ -963,13 +970,32 @@ f(n) c * g(n)
true f(n) = omega(logn)
```
+little-omega w-notation
+----------
+Denotes a lower bound that is not asymptotically tight.
+
+The function `f(n) = w(g(n))` if, for any positive constant `c > 0`,
+there exists a constant `n0 > 0`
+such that `0 <= c * g(n) < f(n)` for all `n >= n0`.
+
+the relation `f(n) = w(g(n))` implies that:
+
+```plaintext
+     lim      f(n)
+ n->infinity  ---- = infinity
+              g(n)
+```
+if the limit exists. That is `f(n)` becomes arbitrarily large relative to `g(n)`
+as `n` approaches infinity.
+
+
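The limit can be illustrated with a small numeric example; `f(n) = n^2` and `g(n) = n` are assumed choices, not functions from the notes.

```python
# Illustration of the little-omega limit: with f(n) = n^2 and g(n) = n,
# the ratio f(n) / g(n) = n grows without bound, so f(n) = w(g(n)).
f = lambda n: n**2
g = lambda n: n

ratios = [f(n) / g(n) for n in (10, 100, 1000, 10_000)]
assert ratios == [10.0, 100.0, 1000.0, 10_000.0]
assert all(a < b for a, b in zip(ratios, ratios[1:]))  # strictly increasing
```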
Theta
-----
-The funciton `f(n)=theta(g(n)) if 3 +ve constants c1,c2 and n0`
-such that `c1*g(n) <= f(n) <= c2*g(n)`
+The function `f(n) = theta(g(n))` if there exist positive constants c1, c2, and n0
+such that `0 <= c1 * g(n) <= f(n) <= c2 * g(n)` for all `n >= n0`. (n0 is pronounced "en-naught")
-f(n) = theta(n)
+f(n) = theta(g(n))
```
eg.