Research Interests: Machine Learning, Bio-inspired Artificial Intelligence, Methods from Applied Mathematics and Physics.
Professional Memberships: German Physical Society (DPG e.V.), student memberships in AAAI and SIGEVO.
Hobbies / Trivia: I occasionally blog, write sci-fi shorts and do hackathons. Among my favourite movies are The Matrix, Gattaca and Memento. My favourite old-school early 20th century scientist is the fascinating but also controversial John von Neumann.
We are living in crazy times. I wrote about studying Machine Learning at the start of the year, updated the article last month with news about a spare-time project, and now I already have a position as a PhD student on very promising research in sight.
I have accepted the offer to join the ITN ECOLE project funded under the EU Horizon 2020 Marie Skłodowska-Curie Actions. With a budget of around 80 billion euros, Horizon 2020 is currently the largest research initiative funded by the European Commission. I will be working together with an international tech company and the University of Birmingham on the intersection between evolutionary algorithms and currently popular machine learning algorithms.
It is not just crazy research drawing inspiration from many interesting fields, but also promising and practical in terms of applications. Therefore, I am very excited about beginning my work at the University of Birmingham in September. I also plan on keeping this blog about spare-time projects.
Feel free to take a look at the slides I prepared for the talk I gave for the application.
Machine Learning first caught my interest while working for a Data Science start-up after my studies. I found the way a single mathematical framework can be used for innumerable applications, ranging from predictive maintenance over language processing to visual recognition tasks, deeply fascinating. At a seminar of the German Physical Society in July 2017, I further met a PhD graduate who, after finishing his thesis about ‘cold quantum gases’, became an AI developer. He particularly suggested that everyone with an interest in the field take the online courses offered by Stanford University professor Andrew Ng.
It did not take long for me to register on Coursera and work through my first machine learning course. Despite having a full-time position at my new job, I was hooked on the lectures and finished them quickly. The new courses offered by Ng’s DeepLearning.ai immediately followed.
But ML research and neuroscience are like apples and oranges. And we barely know anything about how the actual brain works. At a talk I attended, researcher Jürgen Schmidhuber compared neuroscientists to 20th century electricians trying to figure out how a 21st century smartphone works. And if you read some literature on neuroscience, you will come to the very same conclusion. So the recent achievements and progress in ML theory are decoupled from any neuroscientific basis. And the most important ingredient of machine learning, the backpropagation algorithm, is also still searching for a biological analogy. Although it has to be mentioned that some first ideas are being investigated.
I hope in the future I further find the time to work on a machine learning project of my own. If anyone has any cool and interesting suggestions, feel free to contact me.
Update (2018-06-05): I recently started working on generative models (VAEs & GANs). I will keep this blog updated about any progress.
Update (2018-07-01): I got a PhD position in Computational Intelligence.
While taking the course on Human Computer Systems at my university, I learned about various image filtering techniques used in medical computing. An exercise I found quite interesting for testing some of my knowledge in the field was determining vessel diameters in medical images.
I solved the problem by implementing different filtering techniques in a Java-based tool. Given an input image, the tool applies, step by step, a grayscale conversion, a noise filter, a Laplace filter (edge detection) and finally a binary filter.
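As a rough illustration of what such a pipeline does, here is a minimal Python sketch (the actual tool is written in Java; the kernel sizes and the threshold below are illustrative assumptions, not the tool’s parameters):

```python
import numpy as np

def grayscale(rgb):
    """Luminance-weighted grayscale conversion."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def convolve2d(img, kernel):
    """Naive 'valid' 2D convolution, sufficient for small kernels."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

MEAN_3x3 = np.full((3, 3), 1 / 9)                        # simple noise filter
LAPLACE = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]])   # edge detection

def pipeline(rgb, threshold=0.05):
    """Grayscale -> noise filter -> Laplace filter -> binary filter."""
    smoothed = convolve2d(grayscale(rgb), MEAN_3x3)
    edges = convolve2d(smoothed, LAPLACE)
    return (np.abs(edges) > threshold).astype(np.uint8)
```

Running this on a synthetic image with a bright stripe yields a binary mask that lights up at the stripe’s edges, which is exactly the behaviour the real tool relies on.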
The 2D intensity map derived in the last step can be used to extract vessel diameters. Provided some measurement points inside the vessels, the tool does this automatically by measuring the 1D intensity profile along a line and minimizing the distance between two peaks (i.e. the wall distance) in the intensity profile by varying the inclination angle.
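The peak-distance idea can be sketched in Python like this (again not the original Java code; the strict-local-maximum peak criterion and the way angles are sampled here are my own simplifying assumptions):

```python
import numpy as np

def wall_distance(profile):
    """Index distance between the two strongest peaks of a 1D profile."""
    peaks = [i for i in range(1, len(profile) - 1)
             if profile[i] > profile[i - 1] and profile[i] > profile[i + 1]]
    if len(peaks) < 2:
        return None
    strongest = sorted(sorted(peaks, key=lambda i: profile[i], reverse=True)[:2])
    return strongest[1] - strongest[0]

def vessel_diameter(edge_map, x, y, angles, length=20):
    """Sample intensity profiles along lines through (x, y) and return the
    minimal peak-to-peak (wall) distance over all inclination angles."""
    best = None
    ts = np.arange(-length, length + 1)
    for a in angles:
        xs = np.clip(np.round(x + ts * np.cos(a)).astype(int), 0, edge_map.shape[1] - 1)
        ys = np.clip(np.round(y + ts * np.sin(a)).astype(int), 0, edge_map.shape[0] - 1)
        d = wall_distance(edge_map[ys, xs])
        if d is not None and (best is None or d < best):
            best = d
    return best
```

For a vertical vessel whose walls show up as two bright columns in the edge map, the horizontal profile gives the smallest wall distance, so the minimization over angles recovers the true diameter.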
Overall it was a fun little project. Especially seeing how the combination of filters worked out as intended was great.
Fascinated by the cellular automata engine by Mirek Wojtowicz (MCell 4.20) and the multi-agent evolution simulator by Björn Seekatz (Bugs Evolution), I always wanted to create my own rule-based ‘biophysical’ simulation, particularly to investigate the dynamics of adaptation through evolutionary processes. The final inspiration struck me as I watched a video of a microbial evolution study done in a joint venture of Harvard Medical School and the Technion Israel Institute of Technology.
The video shows a growing microbial film successively evolving resistance against antibiotic stripes arranged in rising concentration towards the center.
In a similar way I recreated the behaviour of the biofilm within a cellular automaton, using the following rules: A living cell has a color type, a maximum age, a birth rate, a mutation rate as well as attack and defence values. The maximum age, birth rate, mutation rate as well as the attack and defence values can further change through spontaneous mutations (thus the cellular automaton is stochastic, not deterministic). The defence value is a number which protects a cell: if a cell of e.g. a red species tries to grow into a cell of a green species, the attack value of the red cell has to match the defence value of the green cell. The results of a ‘simulation’ run can be seen above at three different points in time. Below you find a still frame of a simulation in two further visualization modes.
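A toy version of such an update rule might look as follows in Python (a hedged sketch, not the original code; in particular I interpret ‘match’ here as the attack value having to exceed the defence value, and all concrete trait values are illustrative):

```python
import random
from dataclasses import dataclass, replace

@dataclass
class Cell:
    species: int          # color type, e.g. 0 = red, 1 = green
    age: int = 0
    max_age: int = 50
    birth_rate: float = 0.3
    mutation_rate: float = 0.05
    attack: int = 1
    defence: int = 1
    generation: int = 0   # incremented on each spontaneous mutation

def mutated(cell, rng):
    """Spontaneously perturb heritable traits (hence: stochastic automaton)."""
    if rng.random() >= cell.mutation_rate:
        return cell
    return replace(cell,
                   max_age=max(1, cell.max_age + rng.choice([-5, 5])),
                   birth_rate=min(1.0, max(0.0, cell.birth_rate + rng.choice([-0.05, 0.05]))),
                   attack=max(0, cell.attack + rng.choice([-1, 1])),
                   defence=max(0, cell.defence + rng.choice([-1, 1])),
                   generation=cell.generation + 1)

def step(grid, rng):
    """One update pass over a sparse grid {(x, y): Cell}."""
    for (x, y), cell in list(grid.items()):
        if grid.get((x, y)) is not cell:
            continue                      # already overwritten this pass
        cell.age += 1
        if cell.age > cell.max_age:
            del grid[(x, y)]              # death by old age
            continue
        if rng.random() < cell.birth_rate:
            nx, ny = x + rng.choice([-1, 0, 1]), y + rng.choice([-1, 0, 1])
            if (nx, ny) == (x, y):
                continue
            target = grid.get((nx, ny))
            # growing into an occupied site requires winning the fight
            if target is None or cell.attack > target.defence:
                grid[(nx, ny)] = mutated(replace(cell, age=0), rng)
    return grid
```

Seeding a single cell and calling `step` in a loop grows a colony whose traits drift over time, which is the core mechanism behind the ‘cones’ described below.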
The first picture shows two species of ‘molds’ colored red and green. The brightness indicates the generation number of a subspecies formed by spontaneous mutation: older generations of a subspecies have lower generation numbers and are brighter, while newer generations have higher generation numbers and are darker. The green species developed mutations with a higher birth rate early on and thus managed to expand further than the red species. The second picture shows a heatmap of the generations, the third picture the rate of mutations per birth.
I find it very fascinating that, similar to the video, you can see the formation of ‘cones’ whenever a new subspecies is spawned by spontaneous mutation. Further, by playing around with some of the settings, the automaton allows one to test a variety of different scenarios for evolutionary processes.
You can find the Eclipse project of the Stochastic Cellular Automaton at my GitHub account (names of the included files may differ): StochasticCellularAutomaton .
During my Master’s studies I attended a seminar on theoretical problems of nuclear physics, where I particularly worked on a term paper about the nuclear physics of neutron stars. Because doing research on astrophysical problems felt very refreshing to me, I decided to switch to theoretical nuclear astrophysics for my Master’s thesis.
Working in the TNA group was different from my prior experiences. I really enjoyed working in such a diverse team. I became known for my coffee drinking habits and particularly recall one weary morning on which a worn-out PhD student complained to me about having had too much. In his defence: the coffee machine was standing right on his desk and was always making loud noises.
I finished my thesis in March 2016 and defended it with great success! The 80-page work studies the mathematical framework used to calculate neutrino reactions in hot dense matter (by this we mean the proto-neutron star), investigates the behaviour of known basic reaction processes for a data set stemming from a core-collapse supernova simulation, criticizes heuristic methods used to extrapolate the framework to more complicated modes of interaction, and contains analytical calculations of those more complicated interactions.
The study of neutrino reactions in nuclear astrophysics is particularly interesting for better understanding the dynamics of core-collapse supernovae and the resulting observable neutrino signal.
During the Serious Games seminar I got to know a fellow computer science student named Alvar, who invited me to participate with him in the hackathons hosted at the multimedia communications lab of the faculty of electrical engineering and information technology. Because I always wanted to work on a video game from scratch, I did not hesitate to accept the invitation. The hackathons were coupled to the globally organized Ludum Dare game jams, with thousands of people from around the world participating simultaneously for three days.
Alvar and I immediately formed a very effective duo. We specialized in making small but humorous games. While Alvar mostly worked on building the game logic in Unity, I specialized in game design: researching ideas for the theme, writing dialogs and juvenile jokes, making graphics, finding the right music and sounds, and even recording dialog. And of course coming up with entertaining easter eggs. In the end everything has to be put together so that the pacing and presentation are pleasurable. At times we got into conflicts over creative decisions, but in the end both of our games for Ludum Dare 33 and Ludum Dare 34 turned out very well.
Our Ludum Dare 33 contribution ‘Y.A.T.M – You Are The Monster’ was ranked #16 in the humor category. For Ludum Dare 34, ‘The Authentic Roman History Simulator – 44 B.C. Edition’ even ranked #8 in humor and #30 in theme out of more than 2000 (!!) contributions in total.
You can try out our games for yourself at the following links:
Question: “So, what’s the deal with open-world and sandbox games?” While this may sound like a conundrum, it was an actual question I worked on for the Serious Games seminar at my university.
The faculty of electrical engineering and information technology at the Technical University of Darmstadt has a work group and a lab dedicated to the study of games and their applications for training and health care. I actually found out about them while looking for interesting courses for my non-physics curriculum.
A particular question that arose for my supervisor, who worked in the field of cooperative games, was what the difference between sandbox and open-world games is (if there is any). For this reason I read numerous blog posts by game designers, consulted the literature and even bought some games. I came to the conclusion that while the terms ‘sandbox’ and ‘open world’ are often used as synonyms, they describe completely different aspects of a game.
While ‘open world’ implies a certain degree of scale or complexity of a level, ‘sandbox’ describes a world that is literally manipulable like sand in a box. In contrast, many open-world games like GTA often rather resemble ‘theme parks’.
I found the excursion into the world of Game Studies quite refreshing and overall it was a great success. If you are interested, you can find the term paper and the presentation attached below. Sadly, due to the requirements of the seminar I had to write them in German.
Understanding quantum field theory was one of my main motivations for going into theoretical physics. At the Technical University of Darmstadt we particularly had two work groups interested in the quantum field theory of the strong interaction (i.e. quantum chromodynamics) as a means to describe states of QCD matter.
QCD matter is a state of matter thought to exist in dense neutron stars and in the early universe. Under these conditions matter breaks up into a soup of interacting quarks and gluons, the so-called quark-gluon plasma. If the thermodynamic conditions change, the quark-gluon plasma can condense into bound composite particles, the so-called hadrons. In a way it is similar to how droplets form in moist air. The boundaries between these different states of matter are the so-called phase transitions (indicated by blue lines in the plot). Of course, many more phase transitions are conjectured for the quark-gluon plasma; whether they exist, however, is a different question.
Calculating the phase transitions and properties of QCD matter with numerical methods (i.e. lattice QCD) using the theory of quantum chromodynamics raises mathematical problems which cannot yet be resolved. Therefore various heuristic methods are used, as well as effective field theories which are mathematically easier to handle but still replicate the most important properties of the ‘full’ theory.
I particularly tested the performance of an NJL model extended with vector interactions in comparison to lattice QCD. My conclusion was that including the vector interaction worsens the performance of the NJL model, and thus the interaction terms should be neglected.
One of the most interesting projects I worked on during my studies was an epidemic simulation based on an SIR model, implemented in Wolfram Mathematica. The project particularly dealt with simulating the spread of an infection in Europe. For this, individual European cities were put into the simulation and connected by land (green lines) and air (dark blue lines).
The SIR model is basically a set of three first-order differential equations, accompanied by a conservation condition N = S + I + R, which states that the total population number is constant and does not decline. The total population (N) of every city is split up into individuals who are susceptible (S) to a disease, infected by it (I), or recovered from it (R).
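As a minimal sketch, one explicit-Euler step of these three equations could look like this in Python (the rate parameters below are illustrative, not the values used in the project):

```python
def sir_step(S, I, R, beta, gamma, dt=0.1):
    """One explicit-Euler step of
       dS/dt = -beta*S*I/N,  dI/dt = beta*S*I/N - gamma*I,  dR/dt = gamma*I,
       with N = S + I + R conserved."""
    N = S + I + R
    new_infections = beta * S * I / N
    recoveries = gamma * I
    return (S - dt * new_infections,
            I + dt * (new_infections - recoveries),
            R + dt * recoveries)

def simulate(S0, I0, R0, beta=0.4, gamma=0.1, steps=1000, dt=0.1):
    S, I, R = S0, I0, R0
    for _ in range(steps):
        S, I, R = sir_step(S, I, R, beta, gamma, dt)
    return S, I, R
```

Since every individual leaving S enters I, and every individual leaving I enters R, the step conserves N by construction; that is exactly the conservation condition above.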
Because the project dealt with the spread of an infection across Europe, traffic between cities was explicitly modelled. Further, it was ensured that migration occurs in such a way that the total population of every city is conserved; otherwise, the validity of the simulation would be put into question. The results of the simulation are quite interesting to observe: starting an epidemic with about 50 infected people in London, it rapidly spreads to continental Europe before halting in Central Europe, with limited spread to the eastern regions.
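The conservation constraint on migration can be illustrated with a simple exchange scheme (a Python toy, not the original Mathematica code; the connection list and exchange rate are assumptions). Sending the same head-count in both directions, with the travellers’ S/I/R mix proportional to each city’s current composition, keeps every city’s total N fixed:

```python
def migrate(cities, connections, rate=0.01):
    """cities: {name: [S, I, R]}; connections: list of (city_a, city_b)."""
    deltas = {name: [0.0, 0.0, 0.0] for name in cities}
    for a, b in connections:
        Na, Nb = sum(cities[a]), sum(cities[b])
        n = rate * min(Na, Nb)              # same head-count in both directions
        for k in range(3):                  # k = 0: S, 1: I, 2: R
            out_a = n * cities[a][k] / Na   # composition-proportional travellers
            out_b = n * cities[b][k] / Nb
            deltas[a][k] += out_b - out_a
            deltas[b][k] += out_a - out_b
    for name in cities:
        for k in range(3):
            cities[name][k] += deltas[name][k]
    return cities
```

Because each connection moves exactly `n` people in each direction, the net change of any city’s total is zero, while infected travellers still seed the disease in connected cities.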
Of course the outcome of the simulation depends strongly upon the chosen initial values. There are also some quirks in the modelling which can be further improved on, e.g. the connections between cities are based solely upon their population numbers, thus the UK is over-connected to continental Europe. A further extension of the simulation considered transatlantic connections to North America.
Overall I really enjoyed working on the project. I personally wish my curriculum would have allowed me to work on more problems of interdisciplinary nature.