**All readings referenced in the lecture slides are chapters and sections of the main textbook for this course:** M. Mitzenmacher and E. Upfal, Probability and Computing: Randomized Algorithms and Probabilistic Analysis, 2nd edition, Cambridge University Press, 2017.

For background material on basic counting and discrete probability, you can visit Kousha Etessami's lecture notes (lectures 16-18, and lectures 25-29) for the old Discrete Mathematics course.

There are many excellent introductory textbooks on probability theory. We recommend the following:

- Sheldon Ross, A First Course in Probability, 8th Edition, Pearson, 2010. (A gentle but rigorous introduction, including some background on counting.)
- G. Grimmett and D. Stirzaker, Probability and Random Processes, 4th edition, Oxford University Press, 2020. (A more advanced text that covers much more material, but is also much denser.)

Other reference texts on randomized algorithms, besides the main textbook for this course by Mitzenmacher and Upfal, include:

- R. Motwani and P. Raghavan, Randomized Algorithms, Cambridge University Press, 1995.

Some reference books specifically on Markov chains include:

- J. R. Norris, Markov Chains, Cambridge University Press, 1998.
- D. A. Levin and Y. Peres, Markov Chains and Mixing Times, 2nd Edition, American Mathematical Society, 2017.

An excellent classic reference book on the Probabilistic Method is:

- N. Alon and J. H. Spencer, The Probabilistic Method, 4th Edition, Wiley, 2016.