Dr. Milán Szőri, Associate Professor at the Institute of Chemistry of the Faculty of Materials Science and Engineering at the University of Miskolc. Supervisor at the Antal Kerpely Doctoral School of Materials Science and Technology, and HPC expert of the Government Agency for IT Development (KIFU). Cumulative impact factor of his scientific publications: 464. Fields of research: computational physical chemistry, reaction mechanism studies using theoretical chemistry methods, the interpretation of interfacial phenomena based on computer simulations, and the modelling of chemical evolution processes.

“... the combination of a supercomputer and the software installed on it is rather like a suspect that doesn’t lie” – interview with Dr. Milán Szőri, Associate Professor at the University of Miskolc.

When and how did you get to know supercomputing?

In the fall of 2000, as a third-year chemistry student at the University of Szeged, under the mentorship of Gyula Tasi, who had just come back from Japan at the time. He talked a lot about having used a supercomputer to determine all possible molecular geometries (conformers) of melatonin, the hormone controlling the sleep cycle, and about the fact that it was no longer feasible to process the huge amounts of data from quantum chemistry calculations by human effort alone. Back then, my goal was to have a PC suitable for quantum chemistry calculations. This is less of a problem today – first-year chemical engineering students at the University of Miskolc also perform such calculations on their own laptops or machine-room PCs. Some students even become daily users of HPC by their third year – in other words, they are “HPC natives”.
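A toy version of such a conformer search can nowadays be sketched even on a laptop. Below is a minimal illustration in Python using the open-source RDKit toolkit – my own choice for this example; the melatonin study mentioned above relied on quantum chemistry methods on a supercomputer, not on a cheap force field like this one:

```python
# Minimal sketch of a conformer search for melatonin using RDKit.
# The MMFF force field is a cheap stand-in for illustration only;
# the actual study used quantum chemistry methods on a supercomputer.
from rdkit import Chem
from rdkit.Chem import AllChem

# Melatonin (N-acetyl-5-methoxytryptamine) built from its SMILES string
mol = Chem.AddHs(Chem.MolFromSmiles("CC(=O)NCCc1c[nH]c2ccc(OC)cc12"))

# Generate candidate 3D geometries, then relax each with the MMFF force field
conf_ids = AllChem.EmbedMultipleConfs(mol, numConfs=50, randomSeed=42)
results = AllChem.MMFFOptimizeMoleculeConfs(mol)

# Each entry is (convergence flag, energy in kcal/mol); flag 0 means converged
energies = sorted(energy for flag, energy in results if flag == 0)
print(f"{len(conf_ids)} conformers; lowest MMFF energy: {energies[0]:.2f} kcal/mol")
```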

What was your first supercomputer work and experience?

It wasn’t a single concrete event for me but a multi-step process: from PCs, through a cluster of 18 machines and “mini” supercomputers with 5 blade units, to real supercomputers. The first real supercomputer infrastructure I worked with was a machine in Szeged operated by NIIF, the predecessor of KIFU – I started to run calculations on this computer on 22 July 2011. In addition, I was allowed to play around with the clusters of the Czech Academy of Sciences and the supercomputers of Cineca, Compute Canada, and the Mésocentre de calcul de Franche-Comté. I became a researcher in an environment where supercomputers were used by default – I used them wherever I could, and also taught others to use them wherever possible.

Can supercomputers be regarded as a sort of “philosopher’s stone” useful for all fields of science, or are they just a practical tool like chalk?

It is not a “philosopher’s stone” at all; it will not pose and solve problems on its own. To cite the novel “The Hitchhiker’s Guide to the Galaxy”, the supercomputer’s answer will at best be 42. The question is what we do with this result. If we want an analogy, the combination of a supercomputer and the software installed on it is rather like a suspect that doesn’t lie. For the answer to be meaningful, that is, true for the physically existing system we wish to model, we need to ask good questions. This is not easy.

What do you use the supercomputers for?

Mostly for interpreting experimental results and for designing experiments. With the help of predictive chemical models, we conduct virtual measurements on a computer; for example, we map the possible chemical reactions. Next, experiments are used to corroborate the results. In addition, my colleagues and I regularly study adsorption processes in interstellar space which result in rendezvous between prebiotic molecules on ice surfaces formed of water, which is ubiquitous in outer space too. I am trying to figure out how these molecules meet and why precisely those amino acids that are indispensable for life are formed.
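To make the idea of a “virtual measurement” concrete, here is a minimal sketch of how an interaction (adsorption-like) energy is assembled from separate total-energy calculations. The choice of the ASE toolkit and its crude EMT calculator is my own, made only so the script runs without external codes; real studies of ice surfaces would use DFT or wavefunction methods on an HPC system:

```python
# Toy "virtual measurement": an interaction energy from total-energy
# calculations, E_int = E(AB) - E(A) - E(B). ASE's EMT calculator is far
# too crude for water but lets the script run anywhere; production work
# on ice surfaces would use DFT or wavefunction methods on HPC.
from ase.build import molecule
from ase.calculators.emt import EMT

def total_energy(atoms):
    atoms.calc = EMT()
    return atoms.get_potential_energy()  # in eV

water_a = molecule("H2O")
water_b = molecule("H2O")
water_b.translate((0.0, 0.0, 3.0))  # place the partner ~3 Å away

e_pair = total_energy(water_a + water_b)  # combined system AB
e_mono = total_energy(molecule("H2O"))    # isolated molecule

print(f"Interaction energy: {e_pair - 2 * e_mono:.3f} eV")
```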

Isn’t it way too challenging to understand the formation of life?

Of course it is! But it’s exciting at the same time! Naturally, we are not alone in this research topic; other researchers make their own contributions too. The essence of science is to understand the processes surrounding us as much as possible, and to find rational answers. There are no miracles, everything has a reason.

Do you have a project or result that would have been unfeasible without a supercomputer?

Our research team tries to implement projects in such a way that everything is calculated at the most reliable theoretical level available to us. Reliability goes hand in hand with how computation-intensive the chosen method is. Thus, I can safely say that this holds for all of our computational chemistry research projects.

This does not seem like an easy task.

But it can be solved! Supercomputers are generic measurement systems that are completely restructuring chemical knowledge. The resolution a supercomputer can provide is inaccessible, or barely accessible, via experiments. A much wider range of problems can be studied, but the human processing of the results has become unfeasible in many cases. Hence, we need to think through in advance the results we can expect; that is, we need to organise knowledge we don’t even have yet in the form of programs.

Isn’t science becoming dehumanised?

Supercomputers only automate certain steps. In the meantime, the partial results indicate directions, and new ideas may emerge as soon as the next day.

How can your results be utilised?

Primarily, supercomputing-based chemistry is shaping our concepts – it helps in the molecular-level interpretation of processes that are visible to the naked eye too. For example, the elementary reactions leading to by-products during industrial synthesis processes become easier to interpret with the results of our calculations. With this knowledge, we can modify the synthesis parameters. My experience is that offering turn-key services is the reasonable way to work with industrial partners, because they wish to solve a given problem. Molecular design is in a similar situation. For industry, the product is the knowledge of the professional supercomputer user, not the supercomputer itself or access to it. Trust is additional capital, because companies will not share all their information with researchers. This is a typical problem, but it can be resolved via in-person meetings, intensive cooperation, and confidentiality agreements.

To what extent do you think the use of supercomputers is challenging?

High-performance computing also means large data quantities. Managing them requires systemic thinking and careful project planning. HPC use helps us think in workflows and prepares us to filter out complications in advance. In reality, one should not be worried about HPC use, because humans can only solve tasks of ever-increasing complexity in line with their own capabilities. Actually, HPC provides a platform on which humans can extend their capabilities. It probably won’t be HPC but our own intellectual limitations that set the true limits.
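As an illustration of that workflow mindset, here is a minimal sketch of scripted job submission on a cluster managed by the SLURM scheduler. sbatch is the standard SLURM command, but the partition name, the quantum chemistry binary, and the input-file layout are hypothetical placeholders of my own:

```python
# Minimal sketch of workflow-style job management on a SLURM cluster.
# "sbatch" is the standard SLURM submission command; the partition name,
# the "my_qc_code" binary, and the inputs/ directory are hypothetical.
import subprocess
from pathlib import Path

JOB_TEMPLATE = """#!/bin/bash
#SBATCH --job-name={name}
#SBATCH --partition=compute   # hypothetical partition name
#SBATCH --ntasks=32
#SBATCH --time=24:00:00
srun my_qc_code {inp}         # hypothetical quantum chemistry binary
"""

def submit(inp: Path) -> None:
    """Write a batch script for one input file and hand it to SLURM."""
    script = Path(f"{inp.stem}.sh")
    script.write_text(JOB_TEMPLATE.format(name=inp.stem, inp=inp))
    subprocess.run(["sbatch", str(script)], check=True)

# One job per prepared input file: hundreds of calculations can be
# launched and tracked systematically instead of by hand.
for inp in sorted(Path("inputs").glob("*.inp")):
    submit(inp)
```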

How does one acquire the confident knowledge required to become an expert at the Competence Centre?

You shouldn’t be afraid of supercomputers – use them and experience will come. Once you are confident with the standard uses, it’s worth stepping away from the daily routine. Things that can be measured can be calculated too. We should try to make use of these high-performance virtual measurement tools, in which the measurement principle is specified by the software, in an increasing number of novel fields. And since measurement principles are rather general in many cases, the boundaries between scientific disciplines seem less like demarcation lines to us computational chemists. The general character of natural laws “pays less respect” to disciplinary boundaries. That is how I can acquire new knowledge during my research, from rational drug design through astrochemistry to chemical evolution.

Who helps you when you need it?

With HPC problems – Dr. Attila Fekete, one of the system administrators of KIFU. He gives me ideas and helps me when I feel stuck.

Have you ever been inadvertently inspired by a user representing a completely different field of science?

I get to know another colleague better at every consultation. Since I’m at home in more than one scientific field, I use the knowledge gained in one field in another even without intending to.

Do these consultations have a community building effect? Do you also keep in touch after the joint work?

Definitely yes. I believe in the community-building power of research, and in these communities creating unique cultures. Naturally, people don’t only talk about technical matters in these communities.

What do you think is the most important task of the Competence Centre?

In general, ensuring knowledge and information transfer, and training future HPC professionals. Services should be made more easily accessible in everyday life, and beginners should be provided with practical guidance. The centre has to coordinate the testing of pre-installed software, and to support users by showing them optimal software settings.

In-person meetings are highly necessary; we could learn a lot from each other. Those who understand the technology and are absolutely up to date can also give a lot to such research communities. It would be important to know the directions of development, to share experiences gathered abroad, and to adapt good practices. The Competence Centre needs to identify and focus on flagship projects that popularise supercomputers and present their benefits, thereby catalysing the emergence of additional users.

Ideally, the Competence Centre will make recommendations on the development and extension of the services after synthesising the considerations of the user communities. The most effective way is to let users define which software products give access to the novelties of a given field of science, and the order in which these should be purchased and installed.

In what fields of science would you think that integrating basic supercomputing skills into the curriculum is important?

HPC is a measurement tool deployed and specified through software, and it can be used in all fields of science that I know of. Higher education curricula must reflect this. An introduction to HPC is already present in the training of first-year chemical engineering students at the University of Miskolc. As early as their second IT class, I show them the KIFU supercomputer located in the supercomputer centre of our university, and the HPC capacities are also available to students showing interest. I can only encourage my colleagues to do the same.

What do you think of the future of supercomputers?

In classical fields of science such as chemistry and biology, the future and supercomputers are one and the same. As has already happened in physics, these disciplines will also become predictively modellable, and prediction will increasingly precede experiment. At the same time, HPC simulations are steadily infiltrating other fields of science too. The task isn’t to stop this infiltration but to widen it into a river as soon as possible.