**Lecturer:** Benjamin Paassen

**Fields:** Artificial Intelligence / Machine Learning

### Content

This course is intended for learners with little to no prior knowledge of machine learning. It provides many examples as well as accompanying exercises, keeps the number of formulae to a bare minimum, and instead maximizes the number of meaningful images. In more detail, the course covers the following topics.

- Session 1: Basics of optimization (What is a mathematical optimization problem? How do we model the world in optimization? How do we solve optimization problems?), basics of probability theory (What are distributions, joint and conditional probabilities, and Bayes’ rule? How do we maximize probabilities?), and linear regression from a geometric and a risk minimization perspective
- Session 2: Machine learning from the perspective of *distances* for classification (how do I put things into known categories?), clustering (how do I discover new categories of things?), regression (how do I infer an unknown variable from a known one, based on examples?), and dimensionality reduction (how do I simplify data that is too big to process?)
- Session 3: Neural-network-based learning (What is an artificial neural network? What are popular components? What kind of models can I build? How do I learn such models?) and the problems of generalization (When can learning fail? How do I prevent that? How can hackers attack my model?)
- Session 4: Reinforcement learning (What is it and how do I do it?) and algorithmic fairness (What does it mean to be fair, and what role do risk, reward, and curiosity play?)
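As a small taste of the risk-minimization perspective on linear regression from Session 1, the sketch below fits a line to noisy data by minimizing the mean squared error (the empirical risk). This is an illustrative example, not official course material; the toy data and parameter values are invented for demonstration.

```python
import numpy as np

# Toy data: y depends linearly on x, plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=50)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=50)

# Risk minimization view: choose slope w and intercept b that
# minimize the mean squared error over the training data.
X = np.column_stack([x, np.ones_like(x)])  # design matrix with a bias column
w, b = np.linalg.lstsq(X, y, rcond=None)[0]

print(w, b)  # estimates should lie close to the true parameters 2.0 and 0.5
```

Geometrically, the same solution is the orthogonal projection of the target vector onto the column space of the design matrix, which is the second perspective Session 1 discusses.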

### Objectives

- becoming familiar with key concepts from machine learning (e.g. risk minimization, exploration versus exploitation, priors and posteriors, generalization)
- achieving a high-level understanding of how the most popular machine learning methods work and which method can be used for which application (e.g. when *not* to use deep learning methods)
- de-mystifying machine learning (it is just a collection of methods with certain assumptions)
- optionally, becoming able to apply some machine learning methods in Python on your own data (exercises)
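To illustrate the distance-based view of classification from Session 2, here is a minimal nearest-neighbor classifier in Python. It is a hedged sketch, not the course's exercise code; the data points and labels are made up for the example.

```python
import numpy as np

def nearest_neighbor_predict(X_train, y_train, x):
    """Classify x with the label of its closest training point
    under the Euclidean distance."""
    distances = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(distances)]

# Two tiny, well-separated clusters with known labels.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y_train = np.array([0, 0, 1, 1])

print(nearest_neighbor_predict(X_train, y_train, np.array([0.2, 0.1])))  # 0
print(nearest_neighbor_predict(X_train, y_train, np.array([4.8, 5.1])))  # 1
```

Swapping the Euclidean distance for a learned one is exactly the idea behind metric learning, one of the lecturer's research topics.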

### Literature

- Barber, D. (2012). Bayesian Reasoning and Machine Learning. Cambridge University Press. Cambridge, UK. http://www.cs.ucl.ac.uk/staff/d.barber/brml/
- Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep Learning. MIT Press. Cambridge, MA, USA. https://www.deeplearningbook.org/
- Paaßen, B. (2019). Lecture Notes on Optimization. Bielefeld University. https://pub.uni-bielefeld.de/record/2935200

### Lecturer

Benjamin Paassen received their doctoral degree in 2019 from Bielefeld University, Germany, on the topic of ‘Metric Learning for Structured Data’. Their prior work has focused on machine learning algorithms to support applications in computer science education and hand prosthesis research, but has also included research on discrimination in video game culture and in machine learning. Their research interests include interpretable machine learning, metric learning, transfer learning, and fairness.

**Affiliation:** Bielefeld University and University of Sydney

**Website:** https://bpaassen.gitlab.io/