PhD Topics

1. Lensless X-Ray Microscopy at the Photon Limit

Supervisors: Prof. Christian Schroer (DESY, UHH, Fachbereich Physik), Jun.-Prof. Mathias Trabs (UHH, Fachbereich Mathematik)

The images acquired with any kind of optical instrument, from visible light to the hard X-ray regime, are spoiled by noise and possibly by blurring stemming from a non-ideal point spread function of the imaging instrument. In principle, the signal-to-noise ratio could be improved by putting more radiation on the object being imaged. This is not always possible, because radiation damage in the object (especially in biological specimens) causes structural changes. Thus, the true signal, i.e., the image of the specimen, has to be recovered from perturbed observations. To this end, deconvolution and denoising techniques have been developed. The goals of this project are: (i) the investigation of different regularization and deconvolution methods, as well as their mathematical understanding, for diffraction and full-field data, and (ii) the development of a parallelized implementation for the processing of large datasets.
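
To illustrate the kind of methods in scope, here is a minimal sketch of Wiener-regularized deconvolution, assuming a known point spread function and additive noise. The test image, PSF width, and regularization weight `alpha` are illustrative assumptions, not project specifications.

```python
# Minimal sketch of Wiener-regularized deconvolution (NumPy only).
import numpy as np

def wiener_deconvolve(observed, psf, alpha=1e-2):
    """Recover an estimate of the true image from a blurred, noisy one."""
    H = np.fft.fft2(np.fft.ifftshift(psf))   # transfer function of the optic
    G = np.fft.fft2(observed)                # spectrum of the measurement
    # Damped pseudo-inverse of H: frequencies where |H| is small are
    # regularized so that measurement noise is not blown up.
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + alpha)
    return np.real(np.fft.ifft2(F_hat))

# Synthetic example: blur a test image with a Gaussian PSF and add noise.
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
psf = np.exp(-(x**2 + y**2) / 0.005)
psf /= psf.sum()
truth = np.zeros((128, 128)); truth[48:80, 48:80] = 1.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) *
                               np.fft.fft2(np.fft.ifftshift(psf))))
observed = blurred + 0.02 * rng.standard_normal(blurred.shape)
restored = wiener_deconvolve(observed, psf, alpha=1e-2)
```

The choice of `alpha` is exactly the kind of regularization question the project asks: too small and noise dominates, too large and the blur is not removed.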

2. New Computational Methods for Serial Crystallography at X-ray Free Electron Lasers

Supervisors: Prof. Henry Chapman (CFEL, DESY, UHH, Fachbereich Physik), Dr. Anton Barty (CFEL, DESY), Prof. Matthias Rarey (ZBH, UHH, Fachbereich Informatik)

Serial crystallography experiments run X-ray cameras continuously at their maximum frame rate, resulting in the collection of large volumes of data. Current experiments generate over 20,000,000 frames per experiment, with raw data sets over 50 TB in size, all of which need to be analyzed. Experiments in the next few years at the European XFEL in Hamburg will be able to collect over 3,500 diffraction patterns per second, or over 12 million measurements per hour. Efficient and automated data analysis is essential to obtaining results. The aim of this project is to develop novel computational approaches for the most pressing data-processing tasks along the serial crystallography pipeline. The focus will be on two problems. The first major challenge is the development of efficient methods for data reduction: poor-quality frames have to be filtered out efficiently, in the ideal case online during data generation. The second challenge lies in how the collected frames can, in total, be optimally exploited for atomistic model generation. Novel strategies taking all data into account and making use of modern numerical optimization schemes will be combined with interactive visualization components to generate tailored model-building solutions for serial crystallography and the generation of time-resolved macromolecular movies.
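
As a toy illustration of the online data-reduction task, the sketch below vetoes frames with too few pixels standing out above the background. The thresholds and the robust background estimate are illustrative assumptions, not an existing facility API.

```python
# Hypothetical online frame filter ("hit finder") sketch: frames with too
# few candidate peak pixels are discarded before they reach disk.
import numpy as np

def is_hit(frame, sigma=5.0, min_peak_pixels=20):
    """Keep a frame only if enough pixels stand out above the background."""
    background = np.median(frame)
    noise = 1.4826 * np.median(np.abs(frame - background))  # robust sigma (MAD)
    candidates = np.count_nonzero(frame > background + sigma * noise)
    return candidates >= min_peak_pixels

def reduce_stream(frames):
    """Yield only frames worth keeping; meant to run inline during acquisition."""
    for frame in frames:
        if is_hit(frame):
            yield frame
```

A production hit finder would additionally require connected peak regions and run on accelerator hardware, but the filtering structure is the same.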

3. A multi-purpose framework for efficient parallelized execution of charged particle tracking

Supervisors: Dr. Krisztian Peters (DESY), Prof. Thomas Ludwig (UHH, Fachbereich Informatik)

Charged particle track reconstruction in an environment with extremely high particle multiplicities and measurement densities is a significant computing challenge for future high-energy physics experiments, such as the planned upgrade of the ATLAS experiment at CERN for the High-Luminosity Large Hadron Collider (HL-LHC). In order to make the best use of modern developments in processor technologies (such as wide registers, GPUs, and other accelerators), the ability to run efficiently in a parallelized, multi-threaded framework is key. The application of such paradigms to track reconstruction is currently limited by the highly serial procedures used to mitigate the intrinsic combinatorial challenges of the application, and by the necessity to run on distributed, heterogeneous computing resources. This project will address these issues by studying methods to apply intra-algorithm parallelism to common track reconstruction tasks, and by developing a framework to ensure that such tasks can be efficiently assigned to resources with appropriate architectures.
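
The sketch below illustrates the intra-algorithm parallelism idea on the combinatorial core of seeding: candidate hit triplets are scored independently, so they can be farmed out to a worker pool. The collinearity score is a toy stand-in for a real seed filter; all names here are hypothetical.

```python
# Toy illustration of parallel track seeding with a process pool.
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def seed_score(triplet):
    """Toy quality measure: how collinear three hits are in the x-y plane."""
    a, b, c = (np.asarray(h, dtype=float) for h in triplet)
    v1, v2 = b - a, c - b
    cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12)
    return cos  # close to 1.0 for nearly straight (high-momentum) candidates

def find_seeds(hits, threshold=0.99, workers=4):
    """Score all hit triplets in parallel and keep the promising ones."""
    triplets = list(combinations(hits, 3))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        scores = pool.map(seed_score, triplets, chunksize=256)
    return [t for t, s in zip(triplets, scores) if s > threshold]
```

The hard research questions, pruning the combinatorics before scoring and mapping such pools onto heterogeneous accelerators, start where this sketch ends.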

We are looking for a computational scientist with an interest in physics and mathematics, or a physicist with a background in computer science. Candidates should have experience with high-performance computing, code optimization/parallelization, and linear algebra. Knowledge of low-level programming languages such as C or C++ is highly desirable. An interest in interdisciplinary research and a high degree of independence are required due to the nature of the project.

4. Distributed Self-Healing Infrastructure for High-Speed Scientific Data Processing

Supervisors: Dr. Holger Schlarb (DESY), Prof. Görschwin Fey (TUHH, Department of Computer Engineering)

The most advanced physics meets highly dependable, high-performance embedded computing at the infrastructure of large-scale research facilities like the European XFEL or PETRA III/IV. The physics of the experiment defines the requirements and functionality for high-speed, high-performance real-time processing. Reliable operation requires a dependable distributed infrastructure composed of thousands of custom computing nodes for data taking, processing, storage, and transfer. Thus, the key question arises: how to facilitate self-diagnosis and self-healing under tight real-time and high-performance processing demands? Guided by the physicists and engineers at DESY and supervised at TUHH, the prospective PhD student will devise new concepts for self-aware distributed computing to identify and heal faults autonomously at run time. The scientific challenges lie in the automated localization of potential sources of failures, be they due to hardware defects, radiation, or even software bugs, and in their mitigation. A deep understanding of the computing infrastructure as well as of the experiment physics is mandatory to identify feasible solutions. Model-based approaches joined with online formal reasoning will be the method of choice for advanced self-awareness. Empirical studies will implement and verify the developed concepts on the most recent devices deployed at DESY. This exactly matches future needs in wider application areas relying on a myriad of devices combined into virtually autonomous distributed computing infrastructures.
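
As a conceptual sketch of one self-healing ingredient, the watchdog below marks nodes as failed when their heartbeats stop arriving and triggers a recovery action (restart, failover, reconfiguration). The timeout value and the recovery hook are assumptions for illustration only, not the project's actual design.

```python
# Heartbeat watchdog sketch: diagnose silent nodes, attempt healing.
import time

class Watchdog:
    def __init__(self, timeout=0.5, recover=lambda node: None):
        self.timeout = timeout      # max silence before a node is suspect
        self.recover = recover      # mitigation callback (e.g. restart a node)
        self.last_seen = {}

    def heartbeat(self, node):
        """Called whenever a status message from `node` arrives."""
        self.last_seen[node] = time.monotonic()

    def check(self):
        """Run periodically; diagnose silent nodes and attempt healing."""
        now = time.monotonic()
        for node, seen in list(self.last_seen.items()):
            if now - seen > self.timeout:
                self.recover(node)            # attempt autonomous mitigation
                self.last_seen[node] = now    # back off until next timeout
```

The project's model-based and formal-reasoning approaches would replace the naive timeout rule with diagnosis that distinguishes hardware defects, radiation effects, and software bugs.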

5. Search for Dark Matter with the CMS Experiment via Neural Networks with Multi-Task Learning

Supervisors: Prof. Christian Schwanenberger (DESY, UHH, Fachbereich Physik), Dr. Christian Seifert (TUHH, Institute of Mathematics)

The big unknown in particle physics is Dark Matter. In order to detect the presence of Dark Matter, we use data coming from the CMS experiment at the Large Hadron Collider at CERN. To classify the data into potential Dark Matter candidates and known elementary particles, a deep neural network will be implemented, trained, and fine-tuned. To obtain further characteristic properties of Dark Matter particles, such as the nature of their interaction with known particles, an attribute learning model combined with multi-task learning techniques will be exploited. A stable classification with characteristic meta-information of Dark Matter would be a huge step forward for new physics searches, since, to our knowledge, such an approach has never been applied in particle physics.
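
A hedged sketch of the multi-task idea follows: one shared trunk learns a common event representation, one head classifies signal versus background, and a second head predicts attributes such as the interaction type. Layer sizes, feature count, and the attribute list are illustrative assumptions (PyTorch).

```python
# Multi-task network sketch: shared trunk, two task-specific heads.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, n_features=30, n_attributes=4):
        super().__init__()
        self.trunk = nn.Sequential(           # shared event representation
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.classifier = nn.Linear(64, 1)    # DM candidate vs. background
        self.attributes = nn.Linear(64, n_attributes)  # e.g. interaction type

    def forward(self, x):
        z = self.trunk(x)
        return self.classifier(z), self.attributes(z)

def multitask_loss(cls_logit, attr_logit, y_cls, y_attr, w=0.3):
    """Joint objective: classification plus weighted attribute task."""
    bce = nn.functional.binary_cross_entropy_with_logits
    return bce(cls_logit, y_cls) + w * bce(attr_logit, y_attr)
```

The attribute head acts as an auxiliary task: sharing the trunk forces the representation to encode the physics properties, which is the stabilizing effect multi-task learning is expected to bring.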

6. Workflows for reproducible computational science and data science

Supervisors: Prof. Hans Fangohr (European XFEL), Dr. Sandor Brockhauser (European XFEL), Dr. Adrian Mancuso (European XFEL), Prof. Volker Guelzow (DESY), Prof. Thomas Ludwig (UHH, Fachbereich Informatik)

Carrying out data analysis of scientific data obtained during experiments is a main activity in photon science, and is essential to convert the obtained data into understanding, and eventually publications. A topic that receives growing attention is that of reproducibility and re-usability: given a publication, it should be possible for readers to reproduce the results published in the paper, particularly so if the results are based on computational processes. This forms the basis for re-use of the work, for example to extend the analysis software to carry out a related but new study. In practice, this is often impossible. In this project, we will investigate the process of data analysis towards publication and then work to improve this workflow. Typically, data analysis involves processing huge amounts of data (GB to PB) using a range of specialist software tools. Challenges include preserving all these processing steps, the specialist software, and its computational environment so that the computation can be reproduced and re-used in the future. The objectives are to make the process reproducible, convenient, and effective. Important tools for the technical part of this work are likely to include the Jupyter Notebook and an ecosystem of tools, including Python, package managers such as Spack, and containers. We are looking for a computational scientist with a background in physics, chemistry, biology, mathematics, engineering or similar with a strong interest in programming and computational science, or for a computer scientist with an interest in supporting computational science.
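
As a minimal sketch of one building block of reproducibility, the snippet below captures the software environment and input-data checksums next to an analysis result, so that a reader can later reconstruct the computation. The file names are illustrative assumptions; real pipelines would pair this with Spack environment specs or container manifests.

```python
# Provenance-capture sketch: record interpreter, platform, installed
# packages, and input checksums alongside an analysis output.
import hashlib
import json
import platform
import sys
from importlib import metadata

def provenance(input_files):
    def sha256(path):
        with open(path, "rb") as fh:
            return hashlib.sha256(fh.read()).hexdigest()
    return {
        "python": sys.version,
        "platform": platform.platform(),
        "packages": {d.metadata["Name"]: d.version
                     for d in metadata.distributions()},
        "inputs": {f: sha256(f) for f in input_files},
    }

# Example (hypothetical file names): write the manifest next to the results.
# with open("provenance.json", "w") as fh:
#     json.dump(provenance(["run_0042.h5"]), fh, indent=2)
```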

7. Cost-Effective High-Performance Simulation of Next Generation Light Sources

Supervisors: Prof. Markus Bause (HSU, Numerical Mathematics), Prof. Michael Breuer (HSU, Fluid Mechanics), Prof. Franz Kärtner (DESY, UHH, Fachbereich Physik), Dr. Jens Osterhoff (DESY)

X-ray Free-Electron Lasers (XFELs) driven by compact electron accelerators and using short-period undulators show complex behavior and are extremely challenging to simulate. Some 10 million electrons are involved in highly nonlinear motion with collective effects driven by electromagnetic fields ranging from the mm-wavelength range down to the hard-X-ray range at a fraction of an Angstrom. Understanding and optimizing X-ray production with these sources requires an exact simultaneous solution of the emission and motion of electrons and electromagnetic fields on multiple time and spatial scales. Even though significant progress has been made in the numerical simulation of particle motion under field effects, substantial progress is still required for XFELs. Data mining, in particular machine learning, offers the potential to accelerate computations by inferring knowledge from databases instead of computing all physical quantities of the system on sufficiently fine spatial and temporal scales, such that a real breakthrough becomes feasible. By combining the expertise of the involved research groups, the long-term goal is to run such simulations by providing only the physics and geometry of the problem, reducing the mesh and solver parameter setup time to almost zero and eliminating the need for expert tuning. We are looking for one PhD student to work on such cost-effective simulations of X-ray sources. Knowledge of numerical approaches for partial differential equations and/or machine learning techniques is desired. Advanced programming skills are mandatory. The student will be jointly supervised and fully integrated into the research groups, which offer a stimulating atmosphere and excellent equipment.
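
The surrogate idea mentioned above can be sketched as follows: a regressor is trained on previously simulated (parameters, output) pairs and then replaces the expensive computation inside an optimization loop. The toy target below stands in for a real FEL simulation output; everything here is an illustrative assumption (scikit-learn).

```python
# Machine-learned surrogate sketch: learn parameters -> quantity of interest.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(200, 3))      # e.g. undulator/beam parameters
y = np.sin(3 * X[:, 0]) * np.exp(-X[:, 1]) + 0.1 * X[:, 2]  # toy "simulation"

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(X, y)

# Cheap prediction (with uncertainty) replaces a full simulation run,
# e.g. inside an optimization loop over machine settings.
mean, std = surrogate.predict(rng.uniform(size=(5, 3)), return_std=True)
```

The predictive uncertainty is useful in practice: where it is large, the surrogate can request new full-physics simulations, keeping the training database small.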

8. High Performance Simulations of Next Generation Light Sources

Supervisors: Prof. Franz Kärtner (DESY, UHH, Fachbereich Physik), Dr. Jens Osterhoff (DESY), Prof. Sabine Le Borne (TUHH, Mathematics), Dr. Jens-Peter Zemke (TUHH, Mathematics)

Next generation light sources, X-ray Free-Electron Lasers (XFELs) driven by compact electron accelerators and using short-period undulators, show highly complex behavior and are extremely challenging to simulate. Some 10 million electrons are involved in highly nonlinear motion with collective effects driven by electromagnetic fields of a wide range of wavelengths. Understanding and optimization of X-ray production with these sources requires the simultaneous computation of the emission and motion of electrons and electromagnetic fields on multiple time and spatial scales.
In this project, modern data science techniques and algorithms will be developed to process the enormous amounts of data in these challenging simulations. The focus will lie on PDE modeling and the development of efficient solvers based on data-driven, data-sparse preconditioners for the linear systems arising in space-time finite-element discretisations. The advanced techniques will be implemented to enable the analysis and design of novel FEL schemes via manipulation of the electron phase space, starting from the photocathode in the electron gun up to novel undulator configurations and implementations such as optical undulators.
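
The solver layer can be illustrated with a Krylov method (GMRES) and a sparse preconditioner; here an off-the-shelf incomplete-LU factorization stands in for the data-driven, data-sparse preconditioners the project targets, and the test matrix is a 1D Laplacian chosen purely for brevity.

```python
# Preconditioned Krylov solve sketch with SciPy: GMRES + ILU.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")  # toy PDE matrix
b = np.ones(n)

ilu = spla.spilu(A, drop_tol=1e-4)                 # cheap approximate factorization
M = spla.LinearOperator((n, n), matvec=ilu.solve)  # preconditioner M ~ A^{-1}

x, info = spla.gmres(A, b, M=M)
assert info == 0  # converged
```

In the project setting, `M` would be constructed from hierarchical (data-sparse) approximations learned from the structure of the space-time discretisation, rather than from an ILU factorization.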
Depending on the background of the PhD candidate, the focus of the project may shift from more applied aspects such as the design of novel FEL schemes to more theoretical/mathematical aspects such as design, convergence or complexity analyses of algorithms.
Requirements: Master's degree in physics, mathematics, or computer science (or a closely related field), background in scientific computing, including coding skills (C or C++).

9. Next Generation Integrative Modeling for Cryo-Electron Microscopy

Supervisors: Prof. Michael Kolbe (HZI, CSSB, UHH, Fachbereich Chemie), Prof. Matthias Rarey (ZBH, UHH, Fachbereich Informatik)

Cryo-electron microscopy (cryo-EM) is an emerging technology in the field of structural biology. Especially for larger biomolecular assemblies (hundreds of kDa to MDa) it is becoming the method of choice and might play a crucial role in the understanding of cellular processes at the molecular level. Although cryo-EM structures with near-atomic resolution exist, the most frequent scenario yields models at slightly lower resolutions of 4-8 Å. Moreover, the resolution can vary considerably across different volumes of a given cryo-EM map, which can make structural interpretation challenging. To overcome this information gap, various data sources are usually combined in an integrative modelling approach. Highly sophisticated computational techniques combining spatial pattern recognition with precise molecular modelling are required to create overall models with atomic resolution. The project focuses on two challenging problems that researchers often face when working with EM. First, we want to develop better tools for the interpretation of mid-resolution electron density maps by integrating atomic structures from other biophysical methods while taking conformational flexibility into account. Second, we want to improve the detection of self-similarity and local symmetry in density maps and consider this information during model building.
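
One integrative-modelling primitive can be sketched as follows: scoring translational placements of a model's simulated density inside an EM map via FFT cross-correlation. Rotational search, flexibility, and symmetry detection would sit on top of this; the arrays here are toy volumes, not real maps.

```python
# FFT cross-correlation sketch for rigid placement of a model in an EM map.
import numpy as np

def best_translation(em_map, model_density):
    """Return the voxel offset maximizing the cross-correlation score."""
    corr = np.real(np.fft.ifftn(np.fft.fftn(em_map) *
                                np.conj(np.fft.fftn(model_density))))
    return np.unravel_index(np.argmax(corr), corr.shape)

# Toy test: a cube hidden in a noisy 32^3 map is recovered by its offset.
rng = np.random.default_rng(2)
em_map = 0.1 * rng.standard_normal((32, 32, 32))
em_map[10:16, 12:18, 5:11] += 1.0
model = np.zeros_like(em_map); model[0:6, 0:6, 0:6] = 1.0
print(best_translation(em_map, model))   # ~ (10, 12, 5)
```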

10. Scalable in situ visual analytics of multivariate volume data time series for materials science applications

Supervisors: Prof. Martin Müller (HZG), Prof. Stephan Olbrich (UHH, RRZ, Fachbereich Informatik)

To support materials science applications, we will develop and integrate application-specific data stream and file interfaces as well as processing and visualization algorithms, and optimize them for scalability and usability, based on an existing in-situ parallel data streaming and extraction framework. It is planned to combine and tightly couple 3D image-based data extraction methods with 3D geometry-based analytics, in order to develop and integrate innovative hybrid methods, e.g. for segmentation purposes. These should reduce or avoid intermediate data and take advantage of complexity reduction (O(N³) volume/voxels → O(N²) surface/polygons), in order to efficiently and automatically generate abstractions of the characteristics of materials and their changes over time. To enable and support interdisciplinary usage and openness for scientific and public scenarios, efficient parallel compression and decompression will be integrated as part of the extraction of the resulting 3D geometric and/or symbolic representations, and a thin multi-platform client software will be provided for open access and remote 3D viewing.
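
The voxel-to-geometry reduction can be sketched with a marching-cubes isosurface, which turns an O(N³) volume into an O(N²) polygon mesh that is far cheaper to stream, compress, and render remotely. The spherical test volume is an illustrative assumption (scikit-image).

```python
# Marching-cubes sketch: extract a surface mesh from a scalar volume.
import numpy as np
from skimage.measure import marching_cubes

# Toy time step of a volume data set: a spherical "material phase".
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = np.sqrt(x**2 + y**2 + z**2)

verts, faces, normals, values = marching_cubes(volume, level=0.5)
print(volume.size, "voxels ->", len(faces), "triangles")  # complexity reduction
```

In the in-situ setting, this extraction would run in parallel alongside the simulation or measurement, so that only the compact geometric representation leaves the compute nodes.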

11. Phase Retrieval in Imaging and Speech Enhancement

Supervisors: Prof. Henry Chapman (DESY, CFEL, UHH, Fachbereich Physik), Prof. Timo Gerkmann (UHH, Fachbereich Informatik)

Advanced sources such as free-electron lasers produce intense and coherent beams of X-rays that are opening up new possibilities to image biological materials, such as single molecules, at atomic resolution. Since atomic-resolution lenses do not exist, such methods usually rely upon retrieving the structural information encoded in the far-field coherent diffraction pattern [Shechtman et al., 2015]. This intensity pattern corresponds to the Fourier magnitude of the object and is thus an incomplete measurement, since the spectral phase cannot be measured. To reconstruct the original structure, the missing phase information needs to be retrieved. A very similar phase retrieval problem is encountered in speech source separation and enhancement, which has received increasing attention recently [Gerkmann et al., 2015]. In just the last few years, modern machine learning methods have been successfully applied to this problem, aiming at faster and more accurate results.
The goal of this project is to explore common concepts in X-ray imaging and speech enhancement. Recent advances in the application of machine learning in speech analysis will be transferred and tailored to X-ray imaging, obtaining deeper insights into this challenging inverse problem and providing new opportunities for high-resolution structure determination of biological systems.
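
A minimal error-reduction (Gerchberg-Saxton/Fienup-type) sketch of phase retrieval illustrates the common core of both fields: alternate between enforcing the measured Fourier magnitude and known real-space constraints. The support mask, positivity constraint, and iteration count are illustrative assumptions.

```python
# Error-reduction phase retrieval sketch (NumPy only).
import numpy as np

def error_reduction(magnitude, support, n_iter=200, seed=0):
    """Iterate between Fourier-magnitude and real-space constraints.

    `magnitude`: measured Fourier modulus; `support`: boolean mask of the
    region where the object is allowed to be nonzero.
    """
    rng = np.random.default_rng(seed)
    phase = np.exp(2j * np.pi * rng.random(magnitude.shape))
    g = np.real(np.fft.ifft2(magnitude * phase))        # random initial guess
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = magnitude * np.exp(1j * np.angle(G))        # magnitude constraint
        g = np.real(np.fft.ifft2(G))
        g = np.where(support & (g > 0), g, 0.0)         # support + positivity
    return g
```

In speech enhancement, the same alternating structure appears with the short-time Fourier transform in place of the far-field diffraction geometry, which is precisely the bridge this project wants to exploit.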

12. Multi-messenger X-ray Science – electron densities from a combined analysis of elastic x-ray scattering and x-ray emission data

Supervisors: Prof. Nina Rohringer (DESY, UHH, Fachbereich Physik), Jun.-Prof. Christina Brandt (UHH, Fachbereich Mathematik)

We propose an interdisciplinary project between the natural and mathematical sciences to develop novel x-ray data analysis methods for combined x-ray diffraction and x-ray emission spectroscopy measurements at high-brilliance x-ray sources. While coherent x-ray diffraction gives access to the structure (positions of atoms) of a chemical compound or reaction unit, x-ray emission spectroscopy is a complementary and sensitive probe of changes in the chemical bonds. The interpretation of the latter strongly relies on comparison with theoretically calculated spectra (electronic structure calculations). Pioneering experiments at storage-ring-based x-ray sources and x-ray free-electron lasers (XFELs) have demonstrated the feasibility of combining x-ray diffraction, x-ray scattering, and x-ray emission spectroscopy in one high-quality experiment. Currently, the data analysis and the reconstruction of the underlying physical quantities of interest (electron density, oxidation state, electron affinity, etc.) are undertaken independently for the two techniques. In the spirit of quantum crystallography, we propose to refine electronic structure (quantum chemistry) calculations by constraining the minimization of the total energy with measured x-ray structure factors and x-ray emission spectra. The goal is thus to predict the valence electron wave function by a combined theoretical and experimental method.
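
Schematically, the proposed constrained refinement can be written as minimizing a model energy E(p) while penalizing disagreement with the measured structure factors. In the sketch below, both the energy and the forward model are placeholders; only the penalty-based optimization structure is the point.

```python
# Constrained-refinement sketch: energy minimization with a data penalty.
import numpy as np
from scipy.optimize import minimize

def energy(p):                 # stand-in for a quantum-chemistry total energy
    return np.sum(p**2)

def predicted_F(p):            # stand-in forward model: parameters -> |F|
    return np.abs(np.fft.rfft(p))

F_measured = predicted_F(np.array([1.0, 0.5, -0.3, 0.2]))  # synthetic "data"

def objective(p, lam=10.0):
    misfit = np.sum((predicted_F(p) - F_measured) ** 2)
    return energy(p) + lam * misfit   # penalty form of the constrained problem

result = minimize(objective, x0=np.zeros(4), method="BFGS")
```

In the actual project, a second penalty term for the x-ray emission spectra would enter the objective, coupling the two measurement channels in one reconstruction.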

13. Dynamic Protein Pattern Recognition in Free-Electron Laser Experiments

Supervisors: Dr. Sadia Bari (DESY), Prof. Simone Techert (DESY), Jun.-Prof. Robert Meißner (TUHH, HZG)

Utilizing the unique pulsed features of x-ray free-electron lasers (FELs), it will become possible not only to study the structure of radiation-sensitive and fragile proteins and complexes, but also to gain a deeper understanding of their dynamics. Going beyond well-ordered systems, it is expected that the combination of x-ray scattering and x-ray spectroscopy methods with advanced molecular dynamics (MD) simulations will yield real-time information about molecular migration and the overall structural dynamics of protein entities. The mechanisms investigated include folding-unfolding processes, protein aggregation or segregation, as well as the dynamics of intrinsically disordered proteins. Machine learning approaches, e.g. pattern recognition algorithms, will be used to identify energetically stable patterns in the conformational space, allowing, in combination with x-ray experiments, deeper insights into the conformational space of proteins. Mathematical tools based on a probabilistic analysis of molecular motifs and kernel ridge regression techniques will be used to determine the fingerprint of biomolecular conformations, combining data from advanced MD simulations, online databases, and x-ray spectroscopy data. This graduate school provides a unique opportunity to develop and optimize theoretical predictions and simulation algorithms utilizing the provided FEL- and synchrotron-based spectroscopic data, from small peptides up to hierarchically built-up proteins.
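
The kernel-ridge ingredient can be sketched as learning a map from MD-derived conformational descriptors to a spectroscopic fingerprint, so that new conformations can be matched against measured spectra. The descriptors and spectra below are synthetic stand-ins (scikit-learn).

```python
# Kernel ridge regression sketch: conformational descriptors -> spectrum.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(3)
descriptors = rng.standard_normal((500, 10))   # e.g. dihedral/contact features per MD frame
spectra = descriptors @ rng.standard_normal((10, 64))  # toy spectra (64 energy bins)

model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1).fit(descriptors, spectra)
predicted = model.predict(rng.standard_normal((1, 10)))  # spectrum of a new conformation
```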

Requirements for position 1 (theoretical work): Master's degree, in particular in physics, chemistry, or materials science, or in computer science (possibly with a specialization in physics informatics or chemoinformatics).
Special knowledge in the fields of molecular dynamics simulations (e.g. with the open-source software LAMMPS), density functional theory simulations (e.g. with VASP), machine learning techniques, or advanced sampling methods (e.g. Umbrella Sampling or Metadynamics) is advantageous.
Requirements for position 2 (experimental work): Master's degree in experimental physics, physical chemistry, or related fields. Experience in laser spectroscopy, x-ray spectroscopy, or mass spectrometry would be beneficial. Experience at synchrotrons or free-electron lasers would be helpful.

14. Using Prior Knowledge for the Solution of Ill-Posed Inverse Problems in X-Ray Microscopy

Supervisors: Prof. Christian Schroer (DESY, UHH, Fachbereich Physik), Prof. Tobias Knopp (TUHH, Institut für Biomedizinische Bildgebung)

The interpretation of most experimental data in X-ray microscopy requires solving an inverse problem, i.e., finding the physical properties of a sample from a more or less well-known model of the image formation. Solving such inverse problems is usually a challenging task, since the underlying mathematical problem is ill-posed and, in turn, small perturbations in the measurement due to noise lead to large errors in the calculated solution of the inverse problem. To handle inverse problems, the underlying linear reconstruction problem has to be solved by applying prior knowledge. For instance, it can be assumed that the solution is smooth or that it has minimal total variation. In recent years, the classical solvers for inverse problems have been complemented by approaches based on machine learning: given large databases of typical images, the inverse imaging operator can be learned and efficiently evaluated. The goal within this project is to implement, improve, and compare state-of-the-art algorithms for the solution of the inverse problem in X-ray microscopy using real-world data. The algorithms will be benchmarked with respect to noise reduction and reconstruction speed. A major focus will be put on the investigation of artifacts that appear when the algorithms are applied to unusual but physically plausible data.
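
As a classical baseline for comparison with learned approaches, the sketch below solves a Tikhonov-regularized linear inverse problem with a smoothness prior, via the augmented least-squares system [A; sqrt(lam) L] x = [b; 0]. The forward operator A is a toy blur, and all sizes are illustrative assumptions.

```python
# Tikhonov regularization sketch: smoothness prior via finite differences.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
A = sp.diags([0.25, 0.5, 0.25], [-1, 0, 1], shape=(n, n))   # toy blur operator
L = sp.diags([-1, 1], [0, 1], shape=(n - 1, n))             # finite-difference prior

rng = np.random.default_rng(4)
truth = np.zeros(n); truth[80:120] = 1.0
b = A @ truth + 0.01 * rng.standard_normal(n)               # noisy measurement

lam = 0.1
A_aug = sp.vstack([A, np.sqrt(lam) * L]).tocsr()
b_aug = np.concatenate([b, np.zeros(n - 1)])
x = spla.lsqr(A_aug, b_aug)[0]   # minimizes ||Ax - b||^2 + lam * ||Lx||^2
```

Swapping L for a total-variation penalty, or replacing the whole solve by a learned reconstruction operator, gives exactly the family of algorithms the project intends to benchmark.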