Algorithms and Data Structures I

In the summer semester of 2016/2017, I am teaching the English version of the basic course on Algorithms and Data Structures I [NTIN060]. It is intended primarily for students of the English study programs, but everybody is welcome. It will be similar to my ADS1 course from the previous year.

The lectures are held every Wednesday from 15:40 in room S5.

Classes (exercises) take place on Wednesdays from 14:00 in room S11 and are taught by Radek Hušek. Czech-speaking students can also attend any of the classes of the Czech version of the course; we will try to keep both versions synchronized.

Syllabus

date topic
21. 2. Introductory examples: finding a longest increasing subsequence in several different ways (sketch below). An attempt at defining an algorithm.
28. 2. The Random Access Machine as a model of computation. Execution time and space, possible choices of instruction cost (unit cost, limited arguments, logarithmic and relative logarithmic cost). Time and space complexity of an algorithm. Programming on RAM and estimating complexity: an example with sorting by selection. Asymptotic notation: O, Ω, Θ.
7. 3. Graph algorithms: Breadth-first search (BFS) and its properties (sketch below). Representation of graphs (matrices vs. lists) and its effect on time complexity. Applications of BFS: connected components, shortest paths.
14. 3. Graph algorithms: Depth-first search (DFS) and its properties. The DFS tree and classification of edges (tree, back, forward, cross). Identifying bridges in undirected graphs. DFS on directed graphs: detection of cycles. Topological order and its existence.
21. 3. Computing a topological order using DFS; induction along the topological order. Strongly connected components. Shortest paths in graphs with general edge lengths: the triangle inequality for distances, problems with negative cycles, shortest path versus shortest walk. Dijkstra's algorithm and its derivation from BFS with subdivided edges (sketch below). Implementation with heaps (binary and k-regular).
28. 3. Shortest paths: the relaxation principle and a proof of Dijkstra's algorithm in the general case. The Bellman-Ford algorithm.
4. 4. Minimum spanning trees: Jarník's (a.k.a. Prim's) and Borůvka's algorithms. Proofs via the cut lemma. For distinct edge weights, the MST is unique and determined solely by the ordering of the edges. Kruskal's algorithm, maintenance of connected components and the Union-Find problem (sketch below).
11. 4. Interfaces of common data structures: queue, priority queue, set, dictionary. Binary search trees and operations on them. Perfectly balanced trees and a lower bound on their complexity.
18. 4. Height-balanced (AVL) trees: definition, proof of logarithmic height. Balancing AVL trees using rotations.
25. 4. (a,b)-trees: definition, bounds on height. Insertion and deletion based on vertex splits and merges. Left-leaning red-black (LLRB) trees: definition, isomorphism with (2,4)-trees, insertion.
2. 5. Divide and conquer: Mergesort, multiplication of n-digit numbers in O(n^(log₂ 3)) ≈ O(n^1.585) time (the Karatsuba-Ofman algorithm; sketch below). The Master theorem on the complexity of recursive algorithms.
9. 5. Strassen's algorithm for matrix multiplication (the formulae). Selection of the k-th smallest element and different ways of choosing the pivot (sketch below). Randomization and expected time complexity. QuickSort: basic principle, analysis of expected complexity.
16. 5. No lecture.
23. 5. Linear-time selection. Lower bounds on the complexity of searching and sorting in the comparison model. Bucket sorting: integer keys, lexicographic sorting of k-tuples, radix sorting. Hashing: buckets with chains, open addressing. Dynamic reallocation of hash tables and its amortization (sketch below).
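
Code sketches

To make the syllabus above a bit more concrete, here are a few small Python sketches of selected algorithms. They are my own simplified illustrations under the stated assumptions, not the reference code from the lectures, and all function and class names are invented for this page.

The first sketch tackles the introductory problem from 21. 2., the longest increasing subsequence: once by straightforward O(n^2) dynamic programming and once in O(n log n) time using binary search over the smallest possible tails.

    from bisect import bisect_left

    def lis_quadratic(a):
        """Length of a longest increasing subsequence, O(n^2) dynamic programming."""
        n = len(a)
        # best[i] = length of the longest increasing subsequence ending at a[i]
        best = [1] * n
        for i in range(n):
            for j in range(i):
                if a[j] < a[i]:
                    best[i] = max(best[i], best[j] + 1)
        return max(best, default=0)

    def lis_nlogn(a):
        """The same quantity in O(n log n): tails[k] is the smallest possible
        last element of an increasing subsequence of length k+1 found so far."""
        tails = []
        for x in a:
            k = bisect_left(tails, x)     # first position with tails[k] >= x
            if k == len(tails):
                tails.append(x)           # x extends the longest subsequence so far
            else:
                tails[k] = x              # x is a smaller tail for length k+1
        return len(tails)

    if __name__ == "__main__":
        seq = [3, 1, 4, 1, 5, 9, 2, 6]
        print(lis_quadratic(seq), lis_nlogn(seq))   # both print 4 (e.g. 1, 4, 5, 9)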
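
Breadth-first search from the lecture of 7. 3., on a graph represented by adjacency lists (a dictionary mapping each vertex to the list of its neighbours). This minimal sketch only computes distances measured in edges and leaves out the BFS tree.

    from collections import deque

    def bfs(adj, source):
        """Breadth-first search over adjacency lists; returns the distance
        (in edges) from the source to every reachable vertex."""
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:            # v is seen for the first time
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    if __name__ == "__main__":
        graph = {1: [2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4], 6: []}
        print(bfs(graph, 1))   # {1: 0, 2: 1, 3: 1, 4: 2, 5: 3}; vertex 6 is unreachable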
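
A sketch of Dijkstra's algorithm (21. 3.) with a binary heap provided by Python's heapq. Instead of decreasing keys inside the heap, it pushes a new entry on every improvement and skips stale entries when they are popped; this is a common simplification and not necessarily the variant shown in the lecture.

    import heapq

    def dijkstra(adj, source):
        """Shortest distances from the source; adj maps a vertex to a list of
        (neighbour, edge length) pairs, all edge lengths non-negative."""
        dist = {source: 0}
        done = set()
        heap = [(0, source)]                 # (tentative distance, vertex)
        while heap:
            d, u = heapq.heappop(heap)
            if u in done:                    # stale entry, u is already finished
                continue
            done.add(u)
            for v, length in adj[u]:
                if v not in dist or d + length < dist[v]:
                    dist[v] = d + length     # relax the edge (u, v)
                    heapq.heappush(heap, (dist[v], v))
        return dist

    if __name__ == "__main__":
        graph = {
            'a': [('b', 7), ('c', 1)],
            'b': [('d', 2)],
            'c': [('b', 3), ('d', 8)],
            'd': [],
        }
        print(dijkstra(graph, 'a'))   # distances: a=0, b=4, c=1, d=6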
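
Kruskal's algorithm and the Union-Find structure from 4. 4. The Union-Find here uses path compression together with union by rank; the lecture may treat a different combination of heuristics.

    class UnionFind:
        """Union-Find with path compression and union by rank."""
        def __init__(self, vertices):
            self.parent = {v: v for v in vertices}
            self.rank = {v: 0 for v in vertices}

        def find(self, v):
            # Walk to the root of v's tree, compressing the path on the way back.
            if self.parent[v] != v:
                self.parent[v] = self.find(self.parent[v])
            return self.parent[v]

        def union(self, u, v):
            # Merge the two components; return False if they were already one.
            ru, rv = self.find(u), self.find(v)
            if ru == rv:
                return False
            if self.rank[ru] < self.rank[rv]:
                ru, rv = rv, ru
            self.parent[rv] = ru
            if self.rank[ru] == self.rank[rv]:
                self.rank[ru] += 1
            return True

    def kruskal(vertices, edges):
        """Kruskal's algorithm: scan edges by increasing weight and keep those
        that connect two different components."""
        uf = UnionFind(vertices)
        mst = []
        for w, u, v in sorted(edges):
            if uf.union(u, v):
                mst.append((w, u, v))
        return mst

    if __name__ == "__main__":
        V = [1, 2, 3, 4]
        E = [(1, 1, 2), (2, 2, 3), (3, 1, 3), (4, 3, 4)]
        print(kruskal(V, E))   # [(1, 1, 2), (2, 2, 3), (4, 3, 4)]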
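
The Karatsuba-Ofman multiplication from 2. 5., written on Python integers and splitting by bits rather than decimal digits, which does not change the O(n^(log₂ 3)) analysis.

    def karatsuba(x, y):
        """Multiply two non-negative integers with three recursive multiplications
        per level, giving O(n^(log2 3)) digit operations instead of the school O(n^2)."""
        if x < 10 or y < 10:                       # small numbers: multiply directly
            return x * y
        m = max(x.bit_length(), y.bit_length()) // 2
        hi_x, lo_x = x >> m, x & ((1 << m) - 1)    # x = hi_x * 2^m + lo_x
        hi_y, lo_y = y >> m, y & ((1 << m) - 1)    # y = hi_y * 2^m + lo_y
        a = karatsuba(hi_x, hi_y)                  # product of the high parts
        b = karatsuba(lo_x, lo_y)                  # product of the low parts
        c = karatsuba(hi_x + lo_x, hi_y + lo_y)    # (hi+lo) * (hi+lo)
        # The middle term hi_x*lo_y + lo_x*hi_y equals c - a - b.
        return (a << (2 * m)) + ((c - a - b) << m) + b

    if __name__ == "__main__":
        import random
        x, y = random.getrandbits(256), random.getrandbits(256)
        assert karatsuba(x, y) == x * y
        print("ok")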
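
Selection of the k-th smallest element with a randomly chosen pivot (9. 5.), running in expected linear time. The sketch partitions into new lists for clarity, so it uses linear extra space; in-place partitioning as in QuickSort works as well.

    import random

    def quickselect(items, k):
        """Return the k-th smallest element (k = 1 for the minimum)."""
        assert 1 <= k <= len(items)
        pivot = random.choice(items)
        smaller = [x for x in items if x < pivot]
        equal   = [x for x in items if x == pivot]
        larger  = [x for x in items if x > pivot]
        if k <= len(smaller):                      # answer lies among the smaller items
            return quickselect(smaller, k)
        if k <= len(smaller) + len(equal):         # answer is the pivot itself
            return pivot
        return quickselect(larger, k - len(smaller) - len(equal))

    if __name__ == "__main__":
        data = [5, 1, 9, 7, 3, 8, 2]
        print(quickselect(data, 4))   # 5, the median of the list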
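
Finally, hashing with chaining and dynamic reallocation (23. 5.): a toy hash table that doubles its array once the load factor exceeds 1, giving amortized expected O(1) insertions. It relies on Python's built-in hash function and handles neither deletion nor open addressing.

    class HashTable:
        """Hash table with chaining; the table doubles when the number of items
        exceeds the number of buckets."""

        def __init__(self):
            self.buckets = [[] for _ in range(8)]
            self.items = 0

        def _bucket(self, key):
            return self.buckets[hash(key) % len(self.buckets)]

        def insert(self, key, value):
            chain = self._bucket(key)
            for i, (k, _) in enumerate(chain):
                if k == key:                       # overwrite an existing key
                    chain[i] = (key, value)
                    return
            chain.append((key, value))
            self.items += 1
            if self.items > len(self.buckets):     # load factor above 1: reallocate
                self._resize(2 * len(self.buckets))

        def find(self, key):
            for k, v in self._bucket(key):
                if k == key:
                    return v
            return None

        def _resize(self, new_size):
            old = self.buckets
            self.buckets = [[] for _ in range(new_size)]
            for chain in old:                      # re-insert everything into the new array
                for k, v in chain:
                    self.buckets[hash(k) % new_size].append((k, v))

    if __name__ == "__main__":
        t = HashTable()
        for i in range(100):
            t.insert(i, i * i)
        print(t.find(9), t.find(1000))   # 81 and None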

Recommended reading

This page is maintained by Martin Mareš