Topic outline

  • General

    Overview:

    This is an advanced-level (optional) course in Fall 2023 that dives deeper into the theory of algorithms. The focus of this iteration is on randomized algorithms, with applications to parallel and distributed algorithms. Randomization often allows for relatively simple algorithm design, outsourcing the hard part to the analysis. If deterministic algorithms are desired, one can often derandomize the randomized design with a small overhead. In the context of parallel algorithms, randomness offers tools for symmetry breaking, a crucial ingredient of efficient algorithms.

    Book: Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis (Second Edition) by Michael Mitzenmacher and Eli Upfal

    Prerequisites: Fundamentals of algorithm design and analysis (e.g. CS-E3190). Mathematical maturity.

    First lecture: 5.9.2023 at 12:15 - 14:00 in T5 - A133
    First exercise: 18.9.2023 at 16:15 - 18:00 in T5 - A133

    Practicalities

    Learning objectives

    • An introduction to central concepts in randomized algorithms such as concentration bounds.
    • Showing the existence of combinatorial objects using the probabilistic method.
    • Analyzing the power of random processes in task allocation. 


    Schedule (to be updated)

    Week 1: Introduction. Basic concepts such as random variables, probability, and independence; Las Vegas and Monte Carlo algorithms; and probability boosting. A small quicksort sketch follows the reading list below.

    • The initial content of the class is in Section 1.2 of the MU textbook.
    • The expected running time analysis of quicksort is in Section 2.5 of the MU textbook.
    • Notes on paranoid quicksort
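
    As a concrete reference point for the Las Vegas notion, here is a minimal Python sketch of randomized quicksort (my own illustration, not taken from the course notes or the MU textbook): the output is always correct, and only the running time is random.

    ```python
    import random

    def randomized_quicksort(a):
        """Las Vegas algorithm: the output is always sorted; only the running
        time is random. The expected number of comparisons is O(n log n)."""
        if len(a) <= 1:
            return a
        pivot = random.choice(a)                     # uniformly random pivot
        smaller = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        larger = [x for x in a if x > pivot]
        return randomized_quicksort(smaller) + equal + randomized_quicksort(larger)

    if __name__ == "__main__":
        data = [random.randint(0, 100) for _ in range(20)]
        print(randomized_quicksort(data))
    ```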

    Week 2: The coupon collector's problem. Your favorite supermarket has announced a special offer for anyone who collects all of their 100 different campaign stickers. After each purchase, you obtain one sticker uniformly at random. How many purchases do you need to make, in expectation and with high probability, until you have obtained all the stickers? (A short worked calculation follows the reading list below.)

    • Coupon Collector's Problem: Sections 2.4.1 and 3.3.1 of the MU textbook.
    • Median Finding Algorithm: Section 3.5 of the MU textbook.
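
    A short worked calculation of the expected value (the argument is the standard one from Section 2.4.1 of the MU textbook; the numeric value for 100 stickers is my own arithmetic):

    ```latex
    % With n sticker types, collecting the i-th new sticker once i-1 are already
    % held takes a geometric number of purchases with success probability
    % (n-i+1)/n. By linearity of expectation,
    \[
      E[T] \;=\; \sum_{i=1}^{n} \frac{n}{n-i+1}
           \;=\; n H_n \;=\; n \ln n + O(n).
    \]
    % For the n = 100 stickers above, E[T] = 100 * H_100, which is roughly 519 purchases.
    ```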

    Week 3: The probabilistic method. Suppose there is a randomized process that creates a combinatorial object, such as a graph cut, with strictly positive probability. Then clearly, such an object must exist! We introduce the probabilistic method and use the expectation argument to find large cuts and satisfying assignments for the SAT problem.
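
    A minimal Python sketch of the expectation argument for large cuts (my own illustration): a uniformly random cut crosses each edge with probability 1/2, so its expected size is |E|/2, and hence a cut of at least that size must exist.

    ```python
    import random

    def random_cut(n, edges):
        """Assign each vertex to side 0 or 1 uniformly at random and return the
        number of crossing edges. Each edge crosses with probability 1/2, so the
        expected cut size is |E|/2; by the expectation argument, some cut of
        size at least |E|/2 exists."""
        side = [random.randint(0, 1) for _ in range(n)]
        return sum(1 for u, v in edges if side[u] != side[v])

    if __name__ == "__main__":
        # A 5-cycle has 5 edges, so some cut has at least ceil(5/2) = 3 crossing edges.
        edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
        print("largest random cut found:", max(random_cut(5, edges) for _ in range(100)))
    ```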

    Week 4: The Probabilistic Method and the Lovász Local Lemma. Often, when searching for a point in a probability space, the random variables have dependencies. These dependencies make it harder to apply the probabilistic method. The Lovász Local Lemma (LLL) is a tool that provides existential guarantees when the dependencies are sparse, in the sense that each random variable depends on only a few other random variables.
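
    For reference, a commonly stated symmetric form of the lemma (the MU textbook's version may use a slightly different constant):

    ```latex
    % Let E_1, ..., E_n be "bad" events with Pr[E_i] <= p for all i, and suppose
    % each E_i is mutually independent of all but at most d of the other events.
    % If
    \[
      e \, p \, (d + 1) \;\le\; 1,
    \]
    % then
    \[
      \Pr\Bigl[\, \bigcap_{i=1}^{n} \overline{E_i} \,\Bigr] \;>\; 0,
    \]
    % i.e. with positive probability none of the bad events occurs, so a "good"
    % point of the probability space exists.
    ```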


    Week 5: Constructive Proof of the Lovász Local Lemma. In many cases, we can even efficiently construct the structures that the LLL shows to exist. We will look at an algorithm that finds a satisfying assignment for a SAT formula, which can also be interpreted as a constructive proof of the LLL. A small resampling sketch follows the reading list below.

    • k-SAT application: Section 6.7.2 of the MU textbook.
    • Shattering Algorithm: Section 6.8 of the MU textbook.
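
    A minimal Python sketch of the resampling idea behind the constructive proof (Moser-Tardos style; the clause encoding and parameter names here are my own, not from the course notes):

    ```python
    import random

    def resample_sat(clauses, num_vars, max_rounds=10**6):
        """Resampling algorithm for k-SAT. A clause is a list of literals:
        +i means variable i appears positively, -i means negated (variables are
        numbered 1..num_vars). While some clause is unsatisfied, resample its
        variables uniformly at random. Under an LLL-type condition (each clause
        shares variables with only a few others), this terminates quickly in
        expectation."""
        assign = {v: random.random() < 0.5 for v in range(1, num_vars + 1)}

        def satisfied(clause):
            return any(assign[abs(lit)] == (lit > 0) for lit in clause)

        for _ in range(max_rounds):
            unsatisfied = [c for c in clauses if not satisfied(c)]
            if not unsatisfied:
                return assign                    # satisfying assignment found
            clause = random.choice(unsatisfied)  # pick an unsatisfied clause
            for lit in clause:                   # resample its variables
                assign[abs(lit)] = random.random() < 0.5
        return None                              # give up after max_rounds

    if __name__ == "__main__":
        # (x1 or x2 or x3) and (not x1 or x2 or not x3) and (x1 or not x2 or x3)
        clauses = [[1, 2, 3], [-1, 2, -3], [1, -2, 3]]
        print(resample_sat(clauses, num_vars=3))
    ```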


    Week 6: Concentration inequalities, and using martingales to handle certain kinds of dependencies.
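
    A representative martingale concentration bound of the kind covered here (stated in its standard form; the course follows the MU textbook):

    ```latex
    % Azuma-Hoeffding inequality. If X_0, X_1, ..., X_n is a martingale with
    % bounded differences |X_i - X_{i-1}| <= c_i for all i, then for every t > 0,
    \[
      \Pr\bigl[\, |X_n - X_0| \ge t \,\bigr]
      \;\le\; 2 \exp\!\Bigl( -\frac{t^2}{2 \sum_{i=1}^{n} c_i^2} \Bigr).
    \]
    % This yields concentration even when the underlying random variables are not
    % independent, as long as each step changes the value by a bounded amount.
    ```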