Related Fields
Please note: "Related fields" cannot be chosen as one of the major topics. Modules from this topic can only be chosen within the 15 "free" ECTS (see overview).
Regular courses
These modules are offered for "Related Fields" on a regular basis. "Related fields" comprises modules that are not strictly from the area of optics or optical technologies but help in understanding the more specific modules belonging to the other seven topics.
Please note: Each module usually corresponds to a single course with the same title. In a few cases, a module is linked to two courses, which will then have different titles.
Summer term
Prof. Dr. Maier, 5 ECTS
(this module is also offered in winter term)
Deep Learning (DL) has attracted much interest from both academia and industry in a wide range of applications such as image recognition, speech recognition and artificial intelligence. This module introduces the core elements of neural networks and deep learning. It comprises:
 (multilayer) perceptron, backpropagation, fully connected neural networks
 loss functions and optimization strategies
 convolutional neural networks (CNNs)
 activation functions
 regularization strategies
 common practices for training and evaluating neural networks
 visualization of networks and results
 common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
 recurrent neural networks (RNN, TBPTT, LSTM, GRU)
 deep reinforcement learning
 unsupervised learning (autoencoder, RBM, DBM, VAE)
 generative adversarial networks (GANs)
 weakly supervised learning
 applications of deep learning (segmentation, object detection, speech recognition, …)
The accompanying exercises will provide a deeper understanding of the workings and architecture of neural networks.
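To illustrate the perceptron and backpropagation topics listed above, here is a minimal sketch, not course material: a two-layer network trained on XOR with NumPy. All layer sizes and hyperparameters are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, the classic problem a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with sigmoid activations (sizes chosen arbitrarily).
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: the chain rule applied layer by layer
    d_out = (out - y) * out * (1 - out)       # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)        # error signal at the hidden layer
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)
```

The gradient expressions are exactly the chain-rule terms covered in the lectures, written out by hand instead of relying on a framework's automatic differentiation.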
Dr. Christlein, Prof. Dr. Maier, 5 ECTS
A pattern recognition system consists of the following steps: sensor data acquisition, preprocessing, feature extraction, and classification/machine learning. This module focuses mainly on the first three steps and is the basis of our master courses (Pattern Recognition and Pattern Analysis).
The goal of this module is to familiarize the students with the overall pipeline of a pattern recognition system. The various steps involved from data capture to pattern classification are presented. The lectures start with a short introduction, where the nomenclature is defined. Analog-to-digital conversion is discussed with a focus on how it impacts further signal analysis. Commonly used preprocessing methods are then described. A key component of pattern recognition is feature extraction. Thus, several techniques for feature computation will be presented, including the Walsh transform, the Haar transform, linear predictive coding (LPC), wavelets, moments, principal component analysis (PCA) and linear discriminant analysis (LDA). The lectures conclude with a basic introduction to classification. The principles of statistical, distribution-free and non-parametric classification approaches will be presented. Within this context we will cover Bayesian and Gaussian classifiers.
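As a flavour of the feature-extraction step, here is a small sketch of PCA via eigendecomposition of the covariance matrix. The data are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated 2-D toy data: most of the variance lies along one direction.
x = rng.normal(0, 1, 200)
data = np.column_stack([x, 0.5 * x + rng.normal(0, 0.1, 200)])

# PCA: centre the data, then eigendecompose the covariance matrix.
centred = data - data.mean(axis=0)
cov = np.cov(centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

# Project onto the leading principal component (largest eigenvalue).
pc1 = eigvecs[:, -1]
projected = centred @ pc1
```

The variance of the projected data equals the leading eigenvalue, which is exactly what makes PCA useful for dimensionality reduction before classification.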
Prof. Dr. Eskofier, 5 ECTS
(this module is also offered in winter term)
This module offers an overview of some of the most widely used machine learning (ML) methods that are required for solving data science problems. We present the necessary fundamentals for each topic and provide programming exercises.
The course includes:
1) Common practices for data pre-processing.
2) Different tasks in regression, classification and dimensionality reduction, using methods including but not limited to linear regression and classification, support vector machines and deep neural networks.
3) Introduction to Python programming for data science.
4) Applying machine learning models to real-world engineering applications.
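As a taste of item 2, a minimal least-squares linear regression on synthetic data; the coefficients and noise level below are made up and not taken from the course.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: y = 2x + 1 plus Gaussian noise (hypothetical coefficients).
x = rng.uniform(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 100)

# Design matrix with a bias column; solve the least-squares problem directly.
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
```

The fitted slope and intercept recover the generating coefficients up to the noise level, which is the basic sanity check applied throughout the exercises.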
Prof. Dr. Eskofier, 5 ECTS
(this module is also offered in winter term)
This module focuses on various aspects of Deep Learning. Theoretical foundations and general concepts are introduced in the first part, while the second part focuses on specific networks used in image analysis as well as time-series analysis, two common tasks in engineering applications. The list of topics covered includes:
 Network optimization
 Regularization
 Convolutional neural networks
 Recurrent neural networks
In the integrated lab sessions, the students will tackle an image classification problem as well as a time-series regression problem using industrial datasets.
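A moving average is the simplest 1-D convolution; the sketch below (synthetic signal, illustrative only) hints at why convolutional filters suit time-series data. Learned CNN kernels generalise this fixed kernel.

```python
import numpy as np

# A noisy sine wave stands in for an industrial sensor signal.
t = np.linspace(0, 2 * np.pi, 200)
signal = np.sin(t) + np.random.default_rng(3).normal(0, 0.3, 200)

# A moving-average kernel is a fixed 1-D convolution filter.
kernel = np.ones(9) / 9
smoothed = np.convolve(signal, kernel, mode="same")
```

Sliding the same small kernel over the whole sequence is the weight-sharing idea behind convolutional networks.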
Prof. Dr. Eskofier, 5 ECTS
(this module is also offered in winter term)
Neuroscience has played a key role in the history of artificial intelligence (AI) and has been an inspiration for building human-like AI, i.e. AI systems that emulate human intelligence.
Neuroscience provides a vast number of methods to decipher the representational and computational principles of biological neural networks, which can in turn be used to understand artificial neural networks and help solve the so-called black-box problem. This endeavour is called neuroscience 2.0 or machine behaviour. In addition, transferring design and processing principles from biology to computer science promises novel solutions for contemporary challenges in the field of machine learning. This research direction is called neuroscience-inspired artificial intelligence.
The course will cover the most important works that provide the cornerstone knowledge for understanding the biological foundations of cognition and AI, as well as applications in the areas of AI-based modelling of brain function, neuroscience-inspired AI and reverse-engineering of artificial neural networks.
N.N., 10 ECTS
 Crystal structures
 Structure determination
 Vibrational properties
 Electronic structure
 Electronic transport
 Dielectric and optical properties
 Magnetism
 Superconductivity
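As a pointer to the level of the "structure determination" topic: diffraction-based methods rest on Bragg's condition for constructive interference from lattice planes, here as a reminder.

```latex
% Bragg condition: constructive interference from lattice planes
% with spacing d at wavelength \lambda and glancing angle \theta
2 d \sin\theta = n \lambda, \qquad n = 1, 2, 3, \dots
```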
N.N., 10 ECTS
 Structure of solids: Bravais lattices, reciprocal lattice, Brillouin zone
 The solid as a many-body problem: Hamiltonian of a solid, electron-electron interaction, electron-ion interaction, separation of electronic and ionic motion (Born-Oppenheimer approximation), types of bonding
 Lattice dynamics and phonons: harmonic approximation, classical solution, dispersion relation, acoustic and optical modes, Debye and Einstein models, quantum theory of lattice vibrations, density of states, van Hove singularities, thermal properties, anharmonic effects
 Electrons in a periodic potential: Bloch theorem, band structure, nearly free electrons, tight-binding method, Wannier functions, metals, insulators, semiconductors, density of states, Fermi surface, quantum statistics, thermal properties, Fermi distribution
 Electron-electron interaction: Hartree-Fock method, density functional theory, homogeneous electron gas
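The Bloch theorem listed above is central to the whole electronic-structure part; as a reminder, it states that eigenstates in a lattice-periodic potential factor into a plane wave and a lattice-periodic function:

```latex
% Bloch theorem: for V(\mathbf{r} + \mathbf{R}) = V(\mathbf{r}),
% with \mathbf{R} any lattice vector,
\psi_{n\mathbf{k}}(\mathbf{r})
  = e^{i\mathbf{k}\cdot\mathbf{r}}\, u_{n\mathbf{k}}(\mathbf{r}),
\qquad
u_{n\mathbf{k}}(\mathbf{r} + \mathbf{R}) = u_{n\mathbf{k}}(\mathbf{r}).
```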
N.N., 10 ECTS
The course covers an introduction to quantum field theory. The following main topics will be discussed in the lecture:
 Motivation for Quantum Field Theory
 Classical Field Theory (Hamiltonian and Lagrange formalisms for classical field theories)
 Relativistic Quantum Mechanics (Klein-Gordon and Dirac equations)
 Representation Theory of the Lorentz and Poincaré Groups (finite-dimensional scalar, vector, tensor and spinor representations of the Lorentz group; infinite-dimensional representations: field representations; finite- and infinite-dimensional representations of the Poincaré group)
 Quantisation of Free Fields (multi-particle states, Fock space, canonical quantisation of scalar, vector and spinor fields)
 Quantisation of Interacting Field Theories (interaction picture, Dyson series, perturbation theory, S-matrix, Feynman rules, Higgs mechanism)
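The Klein-Gordon equation named above is the starting point for the free scalar field; as a reminder, in natural units (ħ = c = 1):

```latex
% Klein-Gordon equation for a free scalar field of mass m
% (metric signature (+,-,-,-))
\left(\Box + m^2\right)\phi(x) = 0,
\qquad
\Box = \partial_\mu \partial^\mu = \partial_t^2 - \nabla^2 .
```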
N.N., 5 ECTS
 Introduction: particle zoo, interactions and exchange particles, relativistic kinematics, Feynman diagrams
 Covariant description of relativistic particles: Klein-Gordon equation, crossing symmetry, invariant amplitude and cross section, Fermi's Golden Rule
 Quantum electrodynamics of spinless particles: covariant electrodynamics, photon propagator, Feynman rules, scattering cross section
 Quantum electrodynamics of spin-1/2 particles: Dirac equation, electron-muon scattering cross section, helicity conservation, electron-positron scattering
 Weak Interactions: charged-current interactions, V–A structure, parity violation, quark couplings and CP violation
 Physics of massive neutrinos: neutrino oscillations, mass hierarchy, double beta decay
 Towards the Standard Model of Particle Physics: neutral current interactions, weak isospin and hypercharge, electroweak unification
 The Higgs mechanism: gauge invariance, spontaneous symmetry breaking, Higgs couplings, Higgs production and decay
 Beyond the Standard Model: introduction to supersymmetry, Dark Matter
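For orientation, the two-flavour vacuum oscillation probability underlying the neutrino-oscillation topic above reads:

```latex
% Two-flavour neutrino oscillation probability in vacuum
% (mixing angle \theta, mass-squared splitting \Delta m^2,
%  baseline L, neutrino energy E; natural units)
P(\nu_\alpha \to \nu_\beta)
  = \sin^2(2\theta)\,
    \sin^2\!\left(\frac{\Delta m^2 L}{4E}\right).
```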
Prof. Pflaum, 5 ECTS
 Vector spaces, norms, principal axis theorem
 Banach spaces, Hilbert spaces
 Sobolev spaces
 Elliptic partial differential equations
 Fourier transform
 Distributions
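The Fourier transform listed above can also be explored numerically; a small illustrative sketch using NumPy's FFT, where a pure tone produces a single spectral peak:

```python
import numpy as np

# Discrete Fourier transform of a pure tone sampled over its full period.
n = 256
t = np.arange(n) / n
signal = np.sin(2 * np.pi * 5 * t)   # exactly 5 cycles over the window

# Real-input FFT and the matching frequency axis (cycles per unit time).
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(n, d=1 / n)
peak = freqs[np.argmax(np.abs(spectrum))]
```

Because the tone completes an integer number of cycles, all its energy lands in a single frequency bin, a discrete analogue of the delta-function spectrum of a continuous sine.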
Further courses
These modules were offered irregularly during previous semesters and might be offered again, but there is no guarantee.
Prof. Dr. F. Marquardt, 5 ECTS
This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform diverse challenging tasks, including image recognition and natural language processing, just by training them on many examples. Neural networks have recently achieved spectacular successes, with their performance often surpassing humans. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions. We will cover the basics of neural networks, convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Prerequisites: almost none, except for matrix multiplication and the chain rule.
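The "matrix multiplication and the chain rule" prerequisite is already enough to follow gradient descent, the workhorse behind neural-network training. A toy sketch with made-up data and learning rate, fitting a straight line:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(-1, 1, 50)
y = 3.0 * x - 0.5   # target line (hypothetical coefficients)

# Fit y = w*x + b by gradient descent on the mean squared error.
w, b = 0.0, 0.0
lr = 0.3
for _ in range(2000):
    pred = w * x + b
    # Chain rule: d(MSE)/dw and d(MSE)/db, averaged over the data.
    grad_w = 2 * ((pred - y) * x).mean()
    grad_b = 2 * (pred - y).mean()
    w -= lr * grad_w
    b -= lr * grad_b
```

Training a deep network applies exactly this update rule, with the chain rule extended through many layers of matrix multiplications.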
Prof. Dr. Nelles, 5 ECTS
The lectures provide an overview of the most important methods for the statistical evaluation of measured data. They lay the foundation for Bachelor's and Master's theses in experimental physics. In the first part of the lectures we will deal with the basics of statistics and probability theory. The second part provides an introduction to measurement errors and error calculation, parameter estimation and confidence intervals. For some of the exercises we will use computers (Python), which will be useful for data analysis in the context of a Bachelor's or Master's thesis.
The topics will include:
Part I. Probability and statistics
 Introduction to statistics and probability theory
 Special distributions: Gaussian, Poisson, multinomial
 Parameter estimators of distributions (mean, variance, bias, etc.)
 Multidimensional distributions
 Random sampling
Part II. Statistical interpretation of measurements
 Least-squares method
 Chi-square fitting
 Maximum likelihood
 Bayesian statistics
 Estimation of confidence intervals
 Binned and unbinned analysis
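The least-squares and chi-square topics above combine naturally; a sketch with simulated data and illustrative values: a straight-line fit followed by a chi-square goodness-of-fit check.

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulated measurement: a straight line with known Gaussian errors.
x = np.linspace(0, 10, 20)
sigma = 0.4
y = 1.5 * x + 2.0 + rng.normal(0, sigma, x.size)

# With equal uncertainties, minimising chi-square reduces to ordinary
# least squares; solve it via the design matrix.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# Goodness of fit: chi-square per degree of freedom should be near 1
# when the model and the assumed errors are both correct.
chi2 = np.sum(((y - (a * x + b)) / sigma) ** 2)
ndof = x.size - 2   # 20 points minus 2 fitted parameters
```

A chi-square per degree of freedom far above 1 signals a poor model or underestimated errors; far below 1 suggests overestimated errors, both standard diagnostics in thesis-level data analysis.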