Algorithms

Section 001: MWF 11:00-11:50 AM, CAS 134
Section 002: MWF 2:15-3:05 PM, CAS 143

Course website:
http://www.cs.uakron.edu/~duan/class/435

Instructor: Dr. Zhong-Hui Duan
Office: 238 CAS
Phone: 330-972-6773
E-mail: [email protected]
Office Hours: MWF 12:00-1:00 PM or by appointment

Teaching assistant:
Mr. Avinash Chandra Vootkuri, [email protected]
Office hour: TBA (CAS 254)

Books

Textbook:
Algorithms by Dasgupta, et al.

An excellent reference:
Introduction to Algorithms, 3rd Edition, by T.H. Cormen, C.E. Leiserson, R.L. Rivest, and C. Stein, MIT Press, 2009.

Labs and Tutors
https://www.uakron.edu/computer-science/current-students/lab-hours.dot

Course Goals:
To learn well-known algorithms as well as the design and analysis of these algorithms.

Grading:

Two exams (15%, 30%).
4-5 programs (30%).
Weekly homework problems (10%): neat, clear, precise, formal.
Quizzes & class participation (15%).

Grading Scale:
A 90-100, B 80-89, C 70-79, D 60-69, F 0-59
Dates and Final Exam Time

Last day to add or drop (Spring 2017): January 31. Last day to withdraw: March 6.

Final Exam Date: given according to the University Final Exam schedule.
Sec 001: Monday, May 8, 12:15-2:15 PM
Sec 002: Monday, May 8, 2:30-4:30 PM

Program Submissions

All programs will be submitted using Springboard; a drop box will be created for each program. You will also need to submit a printed copy of your source code and a readme file. DO NOT submit programs that are not reasonably correct! To be considered reasonably correct, a program must be completely documented and must work correctly for the sample data provided with the assignment. Programs failing to meet these minimum standards will be returned ungraded and a 30% penalty will be assessed. Late penalties will be added on top of this penalty.
Ethics: All programming assignments must be your own work.
Duplicate or similar programs/reports, plagiarism, cheating, undue
collaboration, or other forms of academic dishonesty will be
reported to the Student Disciplinary Office as a violation of the
Student Honor Code.
Algorithm

A procedure for solving a computational problem (e.g., sorting a set of integers) in a finite number of steps.

More specifically: a step-by-step procedure for solving a problem or accomplishing something (especially using a computer).
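As a concrete illustration of this definition, here is a minimal Python sketch (my example, not part of the slides) of one such step-by-step procedure, insertion sort, applied to the sample problem of sorting a set of integers:

def insertion_sort(a):
    """Sort a list of integers in place in a finite number of steps."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one position to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]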
Solving a Computational Problem

Problem definition & specification
    specify input, output, and constraints
Algorithm analysis & design
    devise a correct & efficient algorithm
Implementation planning
Coding, testing, and verification

Input → Algorithm → Output

Examples

RSA: cryptography.
Quicksort: databases.
FFT: signal processing.
Huffman codes: data compression.
Network flow: routing Internet packets.
Linear programming: planning, decision-making.
What is CS435/535?

Learn well-known algorithms and the design and analysis of algorithms.
Examine interesting problems.
Devise algorithms for solving them (Project 1 and final project).
Data structures and core algorithms.
Analyze the running time of programs.
Critical thinking.
Asymptotic Complexity (Ch 0: Big-O notation)

Running time of an algorithm as a function of the input size n, for large n.
Expressed using only the highest-order term in the expression for the exact running time.
Instead of the exact running time, we say Θ(n²).
Describes the behavior of the function in the limit.
Written using asymptotic notation.
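For example (a worked illustration added here, not taken from the slides), an exact running time of
\[
T(n) = 3n^2 + 5n + 2
\]
keeps only its highest-order term, with the constant factor dropped, giving T(n) = Θ(n²).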
Asymptotic Notation

Θ, O, Ω, o, ω
Defined for functions over the natural numbers.
    Ex: f(n) = Θ(n²).
    Describes how f(n) grows in comparison to n².
Each notation defines a set of functions; in practice it is used to compare the sizes of two functions.
The notations Θ, O, Ω, o, ω describe different rate-of-growth relations between the defining function and the defined set of functions.

O-notation

For a function g(n), we define O(g(n)), big-O of g(n), as the set:
O(g(n)) = { f(n) : ∃ positive constants c and n₀ such that ∀ n ≥ n₀, we have 0 ≤ f(n) ≤ c·g(n) }
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
Technically, f(n) ∈ O(g(n)).
Older usage: f(n) = O(g(n)).
g(n) is an asymptotic upper bound for f(n).
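As a worked illustration of this definition (constants chosen here for concreteness, not taken from the slides): to show that 2n² + 3n + 1 ∈ O(n²), pick c = 6 and n₀ = 1; then for all n ≥ 1,
\[
0 \le 2n^2 + 3n + 1 \le 2n^2 + 3n^2 + n^2 = 6n^2 = c \cdot n^2 .
\]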
Examples

O(g(n)) = { f(n) : ∃ positive constants c and n₀ such that ∀ n ≥ n₀, we have 0 ≤ f(n) ≤ c·g(n) }

Members of O(n²):
    n² + 1
    n² + n
    1000n² + 1000n
    n^1.999
    n² / log n
    n² / log log log n

Ω-notation

For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = { f(n) : ∃ positive constants c and n₀ such that ∀ n ≥ n₀, we have 0 ≤ c·g(n) ≤ f(n) }
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).

Example

Ω(g(n)) = { f(n) : ∃ positive constants c and n₀ such that ∀ n ≥ n₀, we have 0 ≤ c·g(n) ≤ f(n) }

n = Ω(lg n). Choose c and n₀.

Members of Ω(n²):
    1000n² + 1000n
    1000n² - 1000n
    n³
    n^2.00001
    n² log log log n
    2^(2n)
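One possible answer to the "choose c and n₀" prompt (my constants, not the slide's): take c = 1 and n₀ = 1; then for all n ≥ 1,
\[
0 \le 1 \cdot \lg n \le n,
\]
since lg n ≤ n for every n ≥ 1, so n = Ω(lg n).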
Θ-notation

For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) : ∃ positive constants c₁, c₂, and n₀ such that ∀ n ≥ n₀, we have 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) }
Intuitively: the set of all functions that have the same rate of growth as g(n).
g(n) is an asymptotically tight bound for f(n).
Example

Θ(g(n)) = { f(n) : ∃ positive constants c₁, c₂, and n₀ such that ∀ n ≥ n₀, 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) }

10n² - 3n = Θ(n²)
To compare orders of growth, look at the leading term.
Exercise: Prove that n²/2 - 3n = Θ(n²).
Exercise: Show that n - 10⁶ = Ω(n).
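As a worked check of the first claim (constants chosen here for illustration): take c₁ = 9, c₂ = 10, and n₀ = 3; then for all n ≥ 3,
\[
0 \le 9n^2 \le 10n^2 - 3n \le 10n^2,
\]
since 10n² - 3n - 9n² = n(n - 3) ≥ 0 for n ≥ 3, so 10n² - 3n = Θ(n²).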
Relations Between Θ, Ω, O

Theorem: For any two functions g(n) and f(n),
    f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)),
i.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).

f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)); that is, Θ(g(n)) ⊆ O(g(n)).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)); that is, Θ(g(n)) ⊆ Ω(g(n)).
Example

Is 3n³ = Θ(n⁴)?
How about 2^(2n) = Θ(2ⁿ)?
How about log₂ n = Θ(log₁₀ n)?
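Hedged worked answers (my reasoning, not spelled out on the slide), using the ratios of the functions:
\[
\frac{3n^3}{n^4} = \frac{3}{n} \to 0, \qquad
\frac{2^{2n}}{2^n} = 2^n \to \infty, \qquad
\frac{\log_2 n}{\log_{10} n} = \log_2 10 \approx 3.32 .
\]
So 3n³ is O(n⁴) but not Θ(n⁴); 2^(2n) is not Θ(2ⁿ); and log₂ n = Θ(log₁₀ n), since the two differ only by a constant factor.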
Comparison of Functions

Comparing the growth of f and g is like comparing two numbers a and b:
f(n) = O(g(n))  ~  a ≤ b
f(n) = Ω(g(n))  ~  a ≥ b
f(n) = Θ(g(n))  ~  a = b

In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.
Properties of a relation R = {(a, b)}

Transitive
Reflexive, irreflexive
Symmetric, antisymmetric, asymmetric

Relations

Equivalence relation: reflexive, symmetric, and transitive.
Partial order: reflexive, antisymmetric, and transitive.

R = {(f(n), g(n)) | f(n) = O(g(n))}
R = {(f(n), g(n)) | f(n) = Ω(g(n))}
R = {(f(n), g(n)) | f(n) = Θ(g(n))}
R = {(f(n), g(n)) | f(n) = o(g(n))}
R = {(f(n), g(n)) | f(n) = ω(g(n))}
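A hedged aside (my observation, not stated explicitly on the slides): the Θ relation is reflexive, symmetric, and transitive (see the properties below), so it is an equivalence relation; the O relation is not, since it fails symmetry:
\[
n = O(n^2) \quad\text{but}\quad n^2 \neq O(n).
\]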
Properties

Transitivity
f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n))
f(n) = Ω(g(n)) and g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
f(n) = o(g(n)) and g(n) = o(h(n)) ⇒ f(n) = o(h(n))
f(n) = ω(g(n)) and g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))

Reflexivity
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))
Limits

lim_{n→∞} f(n)/g(n) < ∞  ⇒  f(n) ∈ O(g(n))
0 < lim_{n→∞} f(n)/g(n) < ∞  ⇒  f(n) ∈ Θ(g(n))
0 < lim_{n→∞} f(n)/g(n)  ⇒  f(n) ∈ Ω(g(n))
lim_{n→∞} f(n)/g(n) undefined  ⇒  can't say; needs special attention
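A worked illustration of the limit test (my example, not from the slides):
\[
\lim_{n\to\infty} \frac{n\log n}{n^2} = \lim_{n\to\infty} \frac{\log n}{n} = 0 < \infty,
\]
so n log n ∈ O(n²); since the limit is 0 rather than strictly positive, the test does not place n log n in Θ(n²) or Ω(n²).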
Properties

Symmetry
f(n) = Θ(g(n)) iff g(n) = Θ(f(n)) ???   YES

Antisymmetry
f(n) = O(g(n)) and g(n) = O(f(n)) ⇒ f(n) = g(n) ???   NO

Asymmetry
f(n) = o(g(n)) ⇒ g(n) ≠ o(f(n)) ???   YES

Complementarity
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))
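A hedged worked counterexample for the antisymmetry question (my example, not from the slides): take f(n) = n and g(n) = 2n; then
\[
n = O(2n) \quad\text{and}\quad 2n = O(n), \quad\text{but}\quad n \neq 2n,
\]
so mutual O-bounds only give f(n) = Θ(g(n)), not f(n) = g(n).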
Complexity Classes

adjective        Θ-notation
constant         Θ(1)
logarithmic      Θ(log n)
linear           Θ(n)
n log n          Θ(n log n)
quadratic        Θ(n²)
cubic            Θ(n³)
exponential      Θ(2ⁿ)
exponential      Θ(10ⁿ)
etc.
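To make these classes concrete, here is a small illustrative Python sketch (mine, not part of the course materials) that tabulates representative functions from several of them for growing n:

import math

def growth_table(ns=(10, 100, 1000)):
    # Illustrative values only; the point is the relative growth, not exact timings.
    print(f"{'n':>6} {'log n':>8} {'n log n':>10} {'n^2':>10} {'n^3':>12} {'2^n':>12}")
    for n in ns:
        print(f"{n:>6} {math.log2(n):>8.1f} {n * math.log2(n):>10.1f} "
              f"{n**2:>10} {n**3:>12} {2.0**n:>12.3e}")

growth_table()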
Fibonacci Numbers
Which one is better?
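The question presumably contrasts the two standard ways of computing Fibonacci numbers (as in Dasgupta et al., Ch 0); here is a minimal Python sketch of both, my code rather than the slides':

def fib_recursive(n):
    # Naive recursion: the number of calls grows exponentially (roughly 1.618^n).
    if n <= 1:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    # Iteration: about n additions, so the running time is polynomial in n.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_recursive(20), fib_iterative(20))  # both print 6765

The iterative version is better: its running time grows polynomially rather than exponentially in n.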
Exercise

Express each function in column A in asymptotic notation using the function in column B.

A               B               Answer
5n² + 100n      3n² + 2         A ∈ Θ(B)
    A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).
log₃(n²)        log₂(n³)        A ∈ Θ(B)
    log_b a = log_c a / log_c b, so A = 2 log n / log 3 and B = 3 log n; A/B = 2/(3 log 3), a constant.
n^(log₂ 4)      3^(log₂ n)      A ∈ ω(B)
    a^(log b) = b^(log a), so B = 3^(log n) = n^(log 3); A/B = n^(log(4/3)) → ∞ as n → ∞.
log² n          n^(1/2)         A ∈ o(B)
    lim_{n→∞} (logᵃ n / nᵇ) = 0 (here a = 2, b = 1/2), so A ∈ o(B).
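As an optional sanity check of these answers (my sketch, not part of the exercise), the ratios A(n)/B(n) can be evaluated numerically: a roughly constant ratio suggests Θ, a ratio growing without bound suggests ω, and a ratio tending to 0 suggests o.

import math

# The (A, B) pairs from the exercise above.
pairs = {
    "5n^2+100n vs 3n^2+2": (lambda n: 5*n**2 + 100*n, lambda n: 3*n**2 + 2),
    "log3(n^2) vs log2(n^3)": (lambda n: math.log(n**2, 3), lambda n: math.log2(n**3)),
    "n^(log2 4) vs 3^(log2 n)": (lambda n: n**math.log2(4), lambda n: 3**math.log2(n)),
    "log^2 n vs n^(1/2)": (lambda n: math.log2(n)**2, lambda n: math.sqrt(n)),
}

for name, (A, B) in pairs.items():
    ratios = [A(n) / B(n) for n in (10**2, 10**4, 10**6)]
    print(f"{name:28s}", [f"{r:.3g}" for r in ratios])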