## Topic outline

• ### General

Overview:

This is an advanced-level (optional) course in Fall 2022 that dives deeper into the theory of algorithms. The focus of this iteration is on randomized algorithms, with applications to parallel and distributed algorithms. Randomization often allows for relatively simple algorithm design, outsourcing the hard part to the analysis. If deterministic algorithms are desired, one can often derandomize the randomized design with a small overhead. In the context of parallel algorithms, randomness offers tools for symmetry breaking, which is crucial for efficient algorithms.

Prerequisites: Fundamentals of algorithm design and analysis (e.g., CS-E3190) and mathematical maturity.

First lecture: 6.9.2022 at 12:15 - 14:00 in T5 - A133
First exercise: 15.9.2022 at 12:15 - 14:00 in Undergraduate Centre, ALMA MEDIA - U356

Learning objectives

• An introduction to central concepts in randomized algorithms, such as concentration bounds.
• Analysis of randomized processes with limited dependence.
• Derandomization through the method of conditional expectations.

Schedule

Week 1: Introduction. Basic probability concepts such as random variables and independence, Las Vegas and Monte Carlo algorithms, and probability boosting.
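As a taste of probability boosting, here is a minimal sketch of amplifying a one-sided-error Monte Carlo algorithm by repetition. The function `unreliable_test` is a hypothetical stand-in for any test with no false positives and bounded false-negative probability; the names and parameters are illustrative, not from the course material.

```python
import random

def unreliable_test(instance_is_yes, rng):
    """Hypothetical one-sided-error Monte Carlo test: on a 'yes' instance
    it answers True with probability only 2/3; on a 'no' instance it
    never answers True (no false positives)."""
    if instance_is_yes:
        return rng.random() < 2 / 3
    return False

def boosted_test(instance_is_yes, rng, repetitions=20):
    """Probability boosting: repeat the test and take the OR.
    The failure probability drops to (1/3)**repetitions."""
    return any(unreliable_test(instance_is_yes, rng) for _ in range(repetitions))

rng = random.Random(0)
# A 'yes' instance is now missed only with probability (1/3)**20.
print(all(boosted_test(True, rng) for _ in range(1000)))   # overwhelmingly likely True
print(any(boosted_test(False, rng) for _ in range(1000)))  # always False
```

Because the error is one-sided, taking the OR of independent repetitions can only help; for two-sided error one would instead take a majority vote.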

Week 2: The coupon collector's problem. Your favorite supermarket has announced a special offer for anyone who collects all 100 of their campaign stickers. After each purchase, you obtain one sticker uniformly at random. How many purchases do you need to make, in expectation and with high probability, until you have obtained all the stickers?
Notes: 02-notes
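A quick simulation illustrates the answer: the expected number of draws is n·H_n, where H_n is the n-th harmonic number (about 519 for n = 100). This is a minimal sketch, not course code; the trial count and seed are arbitrary choices.

```python
import random

def coupon_collector(n, rng):
    """Number of uniform draws until all n coupon types have been seen."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(42)
n = 100
trials = [coupon_collector(n, rng) for _ in range(2000)]
empirical = sum(trials) / len(trials)
harmonic = sum(1 / i for i in range(1, n + 1))
# Theory: E[draws] = n * H_n, roughly n ln n.
print(f"empirical mean: {empirical:.1f}, theory n*H_n: {n * harmonic:.1f}")
```

The empirical mean should land close to n·H_n ≈ 518.7; the concentration around this value is what the "with high probability" part of the question makes precise.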

Week 3: Suppose there is a randomized process that creates a combinatorial object, such as a graph cut, with strictly positive probability. Then clearly, such an object must exist! We introduce the probabilistic method and use the expectation argument to find large cuts and satisfying assignments for the SAT problem.
Notes: 03-notes
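The expectation argument for cuts can be sketched in a few lines: assigning each vertex to a uniformly random side cuts each edge with probability 1/2, so the expected cut size is m/2, and hence a cut of size at least m/2 must exist. The graph below (the complete graph K_5) is just an illustrative choice.

```python
import random

def random_cut_size(edges, n, rng):
    """Place each vertex on a uniformly random side; each edge is cut
    with probability 1/2, so the expected cut size is m/2."""
    side = [rng.randrange(2) for _ in range(n)]
    return sum(side[u] != side[v] for u, v in edges)

# Complete graph K_5: n = 5, m = 10, so some cut of size >= m/2 = 5 exists.
n = 5
edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
rng = random.Random(0)
best = max(random_cut_size(edges, n, rng) for _ in range(50))
print(best)  # some trial achieves at least m/2
```

Note that the simulation only witnesses the existence claim; the probabilistic method itself needs no sampling at all, only the expectation computation.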

Week 4: More on the probabilistic method. There exist graphs with both high chromatic number and high girth. Furthermore, we introduce the method of conditional probabilities to derandomize the max-cut approximation algorithm. Interestingly, this algorithm turns out to be equivalent to the greedy algorithm.
Notes: 04-notes
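The equivalence with the greedy algorithm can be made concrete in a small sketch (assuming, as an illustration, the standard formulation: fix vertices one at a time, always choosing the side that keeps the conditional expected cut size at least m/2; this is exactly the rule "join the minority side among your already-placed neighbours").

```python
def greedy_max_cut(n, edges):
    """Method of conditional probabilities for max cut: fix vertices one
    by one, each time choosing the side that keeps the conditional
    expected cut size at least E[cut] = m/2. Choosing the minority side
    among the already-placed neighbours cuts at least half of the edges
    to them, so the final cut has size at least m/2."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = [None] * n
    for v in range(n):
        placed = [side[u] for u in adj[v] if side[u] is not None]
        side[v] = 0 if placed.count(0) <= placed.count(1) else 1
    return side

n = 5
edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
side = greedy_max_cut(n, edges)
cut = sum(side[u] != side[v] for u, v in edges)
print(cut, len(edges) // 2)  # cut size is at least m/2
```

The derandomization is deterministic: no random bits are used, yet the m/2 guarantee of the randomized analysis is preserved.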

Week 5: Even more on the probabilistic method. Often, when searching for a point in a probability space, the random variables have dependencies, which make the probabilistic method harder to apply. The Lovász Local Lemma is a tool that provides existential guarantees when the dependencies are sparse, in the sense that each random variable depends on only a few other random variables.
Notes: 05-notes
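To make the sparsity condition tangible, here is a small sketch of the symmetric form of the Lovász Local Lemma applied to k-SAT; the helper name and the numeric exploration are illustrative, not part of the course material.

```python
from math import e

def lll_condition_holds(p, d):
    """Symmetric Lovász Local Lemma: if every bad event has probability
    at most p and depends on at most d other bad events, and
    e * p * (d + 1) <= 1, then with positive probability no bad event
    occurs."""
    return e * p * (d + 1) <= 1

# k-SAT application: under a uniformly random assignment, a clause with
# k literals is violated with probability p = 2**-k. If each clause
# shares a variable with at most d other clauses, a satisfying
# assignment exists whenever e * 2**-k * (d + 1) <= 1.
for k in (3, 5, 8):
    p = 2 ** -k
    d_max = max(d for d in range(2 ** k) if lll_condition_holds(p, d))
    print(f"k={k}: LLL guarantees satisfiability up to d = {d_max}")
```

The point is that the bound on d grows exponentially in k, so even heavily overlapping formulas are satisfiable as long as each clause overlaps only a bounded number of others.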

Week 6: k-wise independent randomness. In many applications, mutual independence of events or random variables is too much to ask. A weaker notion is pairwise (or k-wise) independence, meaning that every pair (or every set of k) of the random variables or events is mutually independent. We will elaborate on this perhaps unintuitive concept and show how to create b pairwise independent random variables from log b mutually independent random variables. We apply this to obtain an algorithm that finds a cut of size m/2 and give a straightforward way to derandomize it.
Notes: 06-notes
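A minimal sketch of this construction, under the standard parity formulation: from t mutually independent fair bits one derives 2^t − 1 pairwise independent fair bits, one per nonempty subset of the seed. Enumerating the small seed space then derandomizes the random-cut algorithm; the graph below is an illustrative choice.

```python
from itertools import product

def pairwise_independent_bits(seed_bits):
    """From t mutually independent fair bits, derive 2**t - 1 pairwise
    independent fair bits: one per nonempty subset S of the seed bits,
    namely the parity (XOR) of the bits indexed by S."""
    t = len(seed_bits)
    derived = []
    for mask in range(1, 2 ** t):
        parity = 0
        for i in range(t):
            if (mask >> i) & 1:
                parity ^= seed_bits[i]
        derived.append(parity)
    return derived

# Derandomization by enumeration: a cut drawn from pairwise independent
# bits still cuts each edge with probability 1/2, so averaging over all
# 2**t seeds the expected cut size is m/2 -- hence some seed achieves
# a cut of size at least m/2.
n = 5
edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
t = n.bit_length()  # ensures 2**t - 1 >= n derived bits
best = 0
for seed in product((0, 1), repeat=t):
    side = pairwise_independent_bits(list(seed))[:n]
    best = max(best, sum(side[u] != side[v] for u, v in edges))
print(best, len(edges) // 2)  # best is at least m/2
```

The derandomization is cheap precisely because the sample space has size 2^t = O(n) rather than 2^n: pairwise independence suffices for the m/2 analysis, so full mutual independence can be traded away.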

Week 7: Bonus problems
