Lecture Notes and Further Reading
The notes posted here are somewhat sparse. They are (hopefully) a good summary for later reference but are by no
means a substitute for the actual lectures and/or the textbook.
- Introduction to the design and analysis of algorithms ( handout and overhead presentation). These
notes offer an overview of the basics of algorithm design and analysis; most of this material will be
reconsidered in more detail as we move through the course.
The notes roughly correspond to Chapter 0 of the textbook. This chapter is definitely worth reading
as it discusses a few things not covered here (such as what is not an algorithm).
- Asymptotic order notations ( handout and overhead presentation). This matter is considered already
known in many algorithms textbooks including our primary textbook. The other two recommended
textbooks do contain a summary of the order notations (Cormen addresses this matter in Chapter 3
and Neapolitan in Chapter 1). Check out Moodle for more reading.
- Counting Steps and Recurrence Relations ( handout and overhead presentation). Here is a detailed
and rigorous step counting for the iterative binary search algorithm. The presentation is mostly dedicated to
solving recurrence relations, which is in turn necessary when analyzing the performance of recursive
algorithms. The second part of this presentation is probably overkill for the purpose of the rest of the
course (and also real life). A more practical replacement would be Section 1.7 from the textbook.
Most of the presentation is according to Neapolitan’s Appendix B (which also offers proofs for most
of the theorems used).
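As a quick illustration (mine, not from the handout), here is a minimal Python sketch of the iterative binary search whose steps we count; the halving of the search range is exactly what produces the recurrence T(n) = T(n/2) + c and hence the O(log n) bound.

```python
def binary_search(a, target):
    """Iterative binary search on a sorted list.

    Each iteration halves the search range, so the number of
    iterations satisfies T(n) = T(n // 2) + c, i.e., O(log n).
    Returns the index of target, or -1 if target is absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```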
- Correctness ( handout and overhead presentation). This section provides a brief introduction to
proofs of correctness. We will discuss the correctness of our algorithms, so keep all this in mind as we
move along.
I do not have a concise reference for this part, but note that most algorithms that we will see later
come in your textbook with arguments of correctness. Cormen is even more strict; in that textbook
everything is formally proven correct.
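To give a taste of the loop-invariant style of argument (this toy example is mine, not from any of the textbooks), here is a Python sketch that computes the maximum of a list and checks its invariant at run time:

```python
def list_max(a):
    """Largest element of a non-empty list.

    Loop invariant (checked here with assert): before inspecting a[i],
    m is the maximum of a[:i]. The invariant at loop exit (i == len(a))
    is exactly the correctness claim."""
    m = a[0]
    for i in range(1, len(a)):
        assert m == max(a[:i])  # loop invariant
        if a[i] > m:
            m = a[i]
    assert m == max(a)  # invariant at exit = correctness
    return m
```

In a written proof the assertions become the statements one establishes by induction on the number of iterations.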
- Data Structures ( handout and overhead presentation). You are supposed to be familiar with many
data structures, the most important ones being summarized (though not defined or analyzed) here. In
addition, we also talk about disjoint sets and graphs, which will become useful later on.
Disjoint sets are defined in Cormen’s Chapter 19. Notice the use of a linked list instead of an array, but
also notice that the basic algorithms and their properties are the same. Graphs are nicely introduced
in Cormen’s Chapter 20. Check out Moodle for details.
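For concreteness, here is a minimal array-based disjoint sets sketch in Python (my own illustration; Cormen's presentation uses linked lists, but as noted the basic operations and their properties are the same):

```python
class DisjointSets:
    """Array-based disjoint sets with path compression and union by size."""

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own set
        self.size = [1] * n

    def find(self, x):
        """Return the representative of the set containing x."""
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path compression
            x = self.parent[x]
        return x

    def union(self, x, y):
        """Merge the sets containing x and y (smaller tree under larger)."""
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.size[rx] < self.size[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        self.size[rx] += self.size[ry]
```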
- Divide and conquer ( handout and overhead presentation). This is one of the most spectacular (i.e.,
simple and effective) techniques of algorithm design; just keep in mind that it is not going to work
everywhere.
Divide and conquer is covered (though strangely enough not explicitly named) in Chapter 1 of the
textbook. The lower bound for comparison sorting is explained in many places, see e.g. Cormen.
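As a representative example (a standard one, sketched here in Python by me), merge sort is divide and conquer in its purest form, and its O(n log n) running time matches the comparison-sorting lower bound mentioned above:

```python
def merge_sort(a):
    """Divide and conquer sorting: T(n) = 2T(n/2) + O(n) = O(n log n)."""
    if len(a) <= 1:  # base case: already sorted
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])  # divide
    # combine: merge the two sorted halves in linear time
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```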
- The greedy technique ( handout and overhead presentation). Yet another simple and effective
technique when it works (namely, when the problem has the greedy choice property).
Minimum-cost spanning trees are covered in Chapter 7 of the textbook, including some entertaining
historical information. Dijkstra’s algorithm is covered in Chapter 8 of the textbook, which also contains
additional information well worth reading. More greedy algorithms (including Huffman codes) are
the subject of Chapter 4. The greedy approach to the knapsack problems is covered very briefly in
Cormen (Section 15.2); we will meet these problems again.
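The fractional variant of the knapsack problem illustrates the greedy choice property nicely, so here is a short Python sketch of it (my own; items are hypothetical (value, weight) pairs):

```python
def fractional_knapsack(items, capacity):
    """Greedy solution to the fractional knapsack problem.

    Take items in decreasing value/weight order; fractions are allowed,
    which is what gives this variant the greedy choice property
    (the 0/1 variant does not have it). Returns the best total value."""
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)  # whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total
```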
- Dynamic programming ( handout and overhead presentation). This technique is more general than
greedy algorithms and applies to about the same kinds of problems (mostly optimization problems).
Dynamic programming is covered in Chapter 3 of the textbook, and as it is readily apparent early
in that chapter this is one of the author’s favourite techniques. Floyd’s algorithm is described in
Section 9.8, with the whole (pretty short) Chapter 9 well worth reading. The dynamic programming
solution for the traveling salesman problem is addressed in Neapolitan (Section 3.6). The dynamic
programming solution to the 0/1 knapsack problem is pretty straightforward but I could not find a
source so I am just going to mention that it is presented as an exercise in Cormen (Exercise 15.2-2).
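Since the 0/1 knapsack solution is straightforward but apparently hard to find written down, here is a Python sketch of it (my own, following the usual table formulation):

```python
def knapsack_01(items, capacity):
    """Dynamic programming for the 0/1 knapsack problem.

    best[w] = best value achievable with capacity w using the items
    considered so far; O(n * capacity) time, O(capacity) space."""
    best = [0] * (capacity + 1)
    for value, weight in items:
        # iterate weights downwards so each item is used at most once
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]
```

Note the contrast with the greedy approach: on the same instance the greedy choice (best value/weight ratio first) can miss the optimum, which is why dynamic programming is needed here.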
- Backtracking and branch & bound ( handout and overhead presentation). Backtracking is a good first
attempt technique. It can be expensive, but a backtracking solution can often be converted
into a dynamic programming solution. For many problems backtracking is the best known algorithm.
Branch & bound is an improvement over backtracking, but only works on optimization problems and
consumes substantially more space.
Both backtracking and branch & bound are covered by virtually all AI textbooks. Backtracking is
covered in Chapter 2 of the textbook. Branch & bound is covered in Chapter 6 of Neapolitan (where
my example comes from), but in far more detail than necessary for the course; here is an alternative
reference: Branch and Bound Algorithms - Principles and Examples by J. Clausen (which is still a
more in-depth coverage than necessary for this course).
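As a small backtracking illustration (mine, not the example from Neapolitan), here is a Python sketch that counts n-queens placements; the essence of the technique is extending a partial solution and abandoning it as soon as a constraint fails:

```python
def n_queens(n):
    """Count placements of n non-attacking queens by backtracking."""

    def place(row, cols, diag1, diag2):
        # cols, diag1, diag2: columns and diagonals already under attack
        if row == n:
            return 1  # a complete, consistent placement
        count = 0
        for c in range(n):
            if c in cols or (row - c) in diag1 or (row + c) in diag2:
                continue  # prune: this partial solution cannot be extended
            count += place(row + 1, cols | {c},
                           diag1 | {row - c}, diag2 | {row + c})
        return count

    return place(0, set(), set(), set())
```

Branch & bound refines this idea for optimization problems: instead of pruning only on hard constraint violations, a branch is also abandoned when a bound shows it cannot beat the best solution found so far.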
- Introduction to complexity theory ( handout and overhead presentation). As opposed to
algorithms, where we consider problems one by one, complexity theory deals with classes of
problems (basically classifying problems as easy to solve or hard to solve, under various
definitions of “easy” and “hard”). Complexity is a vast domain and we will only get to see a very light
introduction to it. Because we have to start somewhere this introduction will be based on the question
of P vs. NP; you will have to wait for the lectures to see what those initials mean.
This is all covered in Chapter 12 of the textbook. Note however that our coverage is lighter and in
particular we will not be discussing any actual reduction, so you can probably stop reading somewhere
in the middle of Section 12.5. Nonetheless the whole chapter is entertaining reading, worth at
least a quick look. The concepts related to problems and encodings are better explained in Cormen
(Chapter 34), but I am going to argue that these concepts are not hard to understand (and are also not
central to the matter covered in this course) and so there is no need for any additional reading on this
matter.
Note that most algorithms people prefer to define NP in terms of polynomial-time verifiable solutions
rather than nondeterminism. I personally believe that nondeterminism is easier to understand (and
certainly easier to explain ;-) ), but perhaps that is because my background is in formal languages and
complexity rather than algorithms. Dr. Erickson (the author of your textbook) is clearly an algorithms
person and so we disagree. You decide who is right.
Lecture Recording
Lectures are recorded as follows:
- Introduction: Lectures 1 and 2, not captured (equipment failure)
- Asymptotic order notations: Lecture 3
- Counting steps and recurrence relations: Lecture 4, Lecture 5
- Correctness: First about 50 minutes of Lecture 6
- Data structures: Quick recap of basic data structures in the last about 15 minutes of Lecture 6, then
Lecture 7, first about 30 minutes of Lecture 8
- Divide and conquer: last about 40 minutes of Lecture 8, Lecture 9 (with a short discussion to justify
ignoring floor and ceiling), Lecture 10, Lecture 11
- Greedy algorithms: Lecture 13, Lecture 14, Lecture 16
- Dynamic programming: Lecture 17, Lecture 18, Lecture 19
- Backtracking: Lecture 20, Lecture 21 (including a 10-minute discussion on greedy algorithms at the
beginning, and a 10-minute start into complexity theory at the end)
- Complexity: Lecture 22
Note that Lecture 12 was a recap and Q&A session for the first examination and so was not recorded. Lecture 15
was the actual first examination.