“… the supercomputer is a new window for learning about the world” – interview with Kornél Kapás, HPC scholarship PhD student


How would you introduce yourself, what is your research area?

I work in theoretical particle physics, a field I already specialized in during my university studies. I started out with experimental work as an undergraduate, for example on detector development, but during my master's I switched to the theoretical side.

Why did you switch to the theoretical field?

When I worked on detector development, I was already very interested in programming and software development, which is in fact needed in practically all physics research. After my BSc I looked for a topic, worked here and there, and then found my current research group, where coding and parallel programming play a particularly large role. It is not a purely theoretical field in the sense that we create new models; I work on the part of particle physics where an existing model is taken and quantities are calculated from it that could not be computed before without the appropriate IT tools.

How did you encounter supercomputing?

My current supervisor, Sándor Katz, has a research group and a supercomputer. The first time I saw a live supercomputer was six years ago.

Did HPC play a role in your studies before?

I had a course on parallel programming, but not at the HPC level. It was more about how to write programs for GPUs, and even that was only an optional subject in my major. I was introduced to supercomputing when I joined my research group.

Why did you get connected to supercomputers so "late"?

There are branches of physics where HPC is definitely needed, and others where it is not necessarily, although this is constantly changing. Where Monte Carlo simulations are used, as in our case, it is needed. I have astrophysicist friends who joined a collaboration after their master's degree that uses HPC, and they only got to know the technology there. HPC is also used intensively in other areas, such as network research or neural networks, but even there students typically only encounter it when they join a research group. I started ten years ago; maybe the situation is different today.

How can HPC be used in your field?

We have particle physics models that need to be supported from both sides. The masses of the proton and the neutron have been measured for a long time, but it was not clear how to calculate, for example, the mass of the proton from the Standard Model of particle physics. We had the formulas, but before supercomputers this task was impossible to solve because the equations were extremely complicated. There is a formalism with which we can reduce the otherwise infinite-dimensional integral to a finite, multidimensional one. This is lattice field theory. In short, continuous space-time is interpreted at discrete grid points, the grid is then gradually made denser, and finally the result is extrapolated back to continuous space-time. This is a technically difficult procedure: tens of millions of integrals have to be evaluated, which is impossible on a traditional computer, not to mention on paper. Monte Carlo simulations are needed for this, and they also have to be well optimized so that the calculations finish within a finite time. This is clearly impossible without a supercomputer.
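To illustrate why Monte Carlo is the tool of choice for such high-dimensional integrals, here is a minimal sketch (my own illustrative example, not the group's lattice code): a regular grid with only 10 points per axis in 20 dimensions would already require 10^20 function evaluations, while the Monte Carlo error shrinks as 1/√N regardless of the dimension.

```python
# Minimal Monte Carlo sketch for a high-dimensional integral (illustrative only).
import math
import numpy as np

rng = np.random.default_rng(0)
d = 20            # dimension of the integral (a stand-in for lattice degrees of freedom)
N = 200_000       # number of random sample points

x = rng.random((N, d))                      # uniform samples on the hypercube [0, 1]^d
values = np.exp(-np.sum(x**2, axis=1))      # integrand exp(-|x|^2)

estimate = values.mean()                    # Monte Carlo estimate of the integral
error = values.std(ddof=1) / math.sqrt(N)   # statistical error, scales as 1/sqrt(N)
exact = (math.sqrt(math.pi) / 2 * math.erf(1.0)) ** d   # closed form for comparison

print(f"Monte Carlo: {estimate:.6e} +/- {error:.1e}")
print(f"Exact value: {exact:.6e}")
```

The same 200,000 samples would give a comparable error in 200 dimensions, which is why sampling beats any grid-based quadrature once the dimension is large.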

Did the supercomputer make a new procedure possible?

The formalism I use dates from 1974, so it is basically an old theory. However, truly efficient and exciting calculations only became possible in the early 2000s, with the appearance of more serious HPC systems. The Standard Model of particle physics, which describes the behaviour of elementary particles, was born in the second half of the previous century. But there were only conjectures about what properties a proton, a neutron, or an atomic nucleus built from these particles would have; it was impossible to calculate them. In simpler models it was of course possible to calculate this and that, but not always the real particle-physics quantities. In general, science works by creating a model, calculating things from it, and then testing them experimentally. My job is to examine certain parts of the Standard Model to see whether they theoretically reproduce the quantities that we have already measured experimentally. If we validate a model in this way, then of course it must also be able to make predictions, which we can then verify experimentally.

What is the purpose of the model?

To put it simply: three quarks make up a proton or a neutron, and quarks always like to stay in twos or threes. But at very high temperature or extremely high pressure, this matter can dissolve into a quark-gluon plasma. We know from experiments that this happens. The temperature at which it happens, and the order and speed of the phase transition, can in principle be calculated from the Standard Model. This is a very complicated task; for real physics it was impossible to calculate until about 20 years ago.

Are you testing specific items using this method?

We have six kinds of quarks, or so we think. Of these, the three lightest occur by far the most often. The model I am investigating focuses on the behaviour of matter containing these three quarks. The question is when and how quickly hadrons, i.e. particles consisting of two or three quarks, transform into quark-gluon plasma. So I study the properties of such phase transitions in systems containing many particles.

Has the supercomputer opened up a new dimension in your field?

In any case, it gave the field a big push. Lattice field theory has been around since 1974, but at first only very simple models were used. There were ideas about what the results would be in real physics, but it was only when supercomputers appeared, especially graphics card clusters, that the mass and other properties of a real particle, such as the proton, could be calculated for the first time. The result agreed with the experimental value, which validated the Standard Model, and many publications followed. I arrived in the more mature period of the method, but there are still quite a few exciting things to investigate.

What can we find out about the project included in your application?

I'll start a little further back. When I started my PhD, my task was to calculate a specific physical quantity related to the phase diagram. It had been calculated before, but there were no results on a finer grid, and especially not in the continuum limit. However, Monte Carlo simulations of this kind suffer from two sources of error, the sign problem and the overlap problem. If I increase the volume of the system, the severity of both problems grows exponentially. The sign problem is easy to measure, but the overlap problem may not be recognized immediately. It turned out that in certain cases these are so serious that they cannot be overcome even by multiplying the statistics. We had already found a method earlier after which only the sign problem remained as a limiting factor, but it cannot be bypassed. Looking for a solution, last year we tried a method on a toy model which, although not perfect, alleviates the sign problem. Within the framework of the project in my application, I investigated a far more complicated version of this toy model.
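For readers unfamiliar with the sign problem, the following toy sketch (my own illustration, not the toy model from the project) shows the typical symptom: when an oscillating factor is reweighted into the measurement, its average falls off exponentially with the system volume, so exponentially more statistics would be needed to resolve it from the noise.

```python
# Toy illustration of the sign problem in reweighted Monte Carlo (not the project's method).
# Configurations are drawn from a positive weight; the oscillating part of the action
# is carried into the measurement as a phase factor whose average decays with volume.
import numpy as np

rng = np.random.default_rng(1)
mu = 0.5          # strength of the oscillating term (hypothetical toy parameter)
N = 100_000       # Monte Carlo configurations per volume

for V in (2, 8, 32, 128):                       # growing system "volume"
    x = rng.standard_normal((N, V))             # samples from the positive (phase-quenched) weight
    phase = np.cos(mu * x.sum(axis=1))          # reweighting phase folded into the observable
    avg_sign = phase.mean()
    err = phase.std(ddof=1) / np.sqrt(N)
    exact = np.exp(-mu**2 * V / 2)              # exact expectation in this Gaussian toy
    print(f"V={V:4d}  <sign> = {avg_sign:+.4f} +/- {err:.4f}   exact = {exact:.2e}")
```

Already at V = 128 the exact average sign is of order 10^-7 while the statistical error stays around 10^-3, so the signal drowns in noise unless the statistics are increased by many orders of magnitude.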

Have you managed to achieve your goals?

Yes. I managed to plan my project so that I knew exactly when I would do what and what kind of result to expect, and everything I expected actually came out. Some results arrived even sooner than expected, so I calculated one or two extra things. The results are very promising, so I hope to be able to publish them in a journal with a high impact factor.

Has the KIFÜ supercomputer caused any surprises?

Yes. Our job management is different: few people use our machines, so we distribute access verbally, often per machine. This works perfectly well in a small group. Many more people use KIFÜ's machines, so some kind of job scheduler is needed. I submit, say, 300 jobs, and from then on everything happens automatically. I liked this very much; it is much more convenient and leaves much less room for error.

Quantum chromodynamics on a lattice

How challenging do you find using a supercomputer?

It depends on the level you start from, but those who enjoy it will learn quickly.

What is needed for an effective application?

Curiosity and determination. Just for fun, I sometimes visit NVIDIA's website to see what new architectures have come out, how they work, and how to write a program, following a public tutorial, that makes the code as efficient as possible for the given card. Obviously, it helps a lot if we find it entertaining.

Who would you recommend the supercomputer to?

Anyone who deals with complex systems where the algorithms can be parallelized very well. Monte Carlo simulations could usually be run on a single machine anyway, but it would take years; on several machines, the required Monte Carlo statistics are collected much sooner. As time goes on, more and more people will say they need HPC. Although it has been used in our field for 20 years, it will soon be dominant everywhere. What could be calculated on paper or on an ordinary computer has already been calculated; the supercomputer is a new window for learning about the world. Just as the computer did, the supercomputer will spread everywhere; we are moving one level up.
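The reason Monte Carlo statistics can be collected sooner on many machines is that independent chunks of samples can be generated on separate cores or nodes and simply combined at the end. A minimal sketch of this idea (illustrative only; the worker count and the toy π estimate are my own choices, and on a real cluster the workers would be separate nodes or MPI ranks):

```python
# Embarrassingly parallel Monte Carlo: each worker produces an independent estimate,
# and the results are averaged at the end.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def chunk_estimate(args):
    """Estimate pi from one independent chunk of random samples."""
    seed, n = args
    rng = np.random.default_rng(seed)
    pts = rng.random((n, 2))                             # points in the unit square
    inside = np.sum(pts[:, 0]**2 + pts[:, 1]**2 < 1.0)   # how many fall inside the quarter circle
    return 4.0 * inside / n

if __name__ == "__main__":
    workers = 8                     # stand-in for the nodes of a cluster
    n_per_worker = 1_000_000
    tasks = [(seed, n_per_worker) for seed in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(chunk_estimate, tasks))
    print(f"combined estimate of pi: {np.mean(results):.5f}")
```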

How do you see the future of supercomputing?

Moore's law is well known, but the feature size of today's chips is only a few tens of atoms wide, so we cannot really shrink them much further. CPU speed is also starting to saturate, because if we increase the performance further, the chip gets so hot that it becomes impossible to cool. Shrinking the chips and increasing their capacity is therefore most likely not possible for much longer, so the supercomputer is the solution for drastically increasing capacity.